# [Official] NVIDIA GTX 1080 Ti Owner's Club



## Jsunn

Pre-order submitted...
Does anyone have any experience with the delivery times on nvidia pre-orders?

Thanks,
-Jsunn


----------



## al210

Quote:


> Originally Posted by *Jsunn*
> 
> Pre-order submitted...
> Does anyone have any experience with the delivery times on nvidia pre-orders?
> 
> Thanks,
> -Jsunn


I don't; this is my first pre-order with them. I did see several shipping options available. I chose the standard one since I'm not in a hurry.


----------



## Jsunn

Quote:


> Originally Posted by *al210*
> 
> I don't; this is my first pre-order with them. I did see several shipping options available. I chose the standard one since I'm not in a hurry.


I think I read that pre-orders will start shipping on the 10th, but I can't seem to confirm that. This is my first pre-order as well.
-J


----------



## bfedorov11

How long ago did the first batch sell out?

Ordered one and an EK block.

They ship from Digital River.


----------



## al210

Sad to report that pre-orders are now sold out at NVIDIA. Did not last an hour!


----------



## Kdubbs

Woo, I'm glad I got into the pre-order when I did. I contemplated getting two but decided to hold off to see if AMD releases something more powerful so I can hop over to that. Coming from 290Xs, I can't wait for the performance increase. Ordered standard shipping as well.


----------



## Jsunn

I almost got 2 as well, but I figured that it was way overkill... plus that is expensive!


----------



## Kdubbs

Quote:


> Originally Posted by *Jsunn*
> 
> I almost got 2 as well, but I figured that it was way overkill... plus that is expensive!


Plus we are both nice guys and gave 1-2 other people a chance


----------



## Jbravo33

Ordered one as well with next-day shipping. Should have gotten two. I'll be selling it anyway once EVGA comes out with theirs.


----------



## al210

Quote:


> Originally Posted by *Jbravo33*
> 
> Ordered one as well with next-day shipping. Should have gotten two. I'll be selling it anyway once EVGA comes out with theirs.


What will be special about the EVGA version?


----------



## Jbravo33

Quote:


> Originally Posted by *al210*
> 
> What will be special about the EVGA version?


Nothing. I just really liked the Classified 1080s I had, and I'm hoping for either that or their Hydro Copper so I don't have to worry about adding a waterblock myself. I just want to get it in, play my games, and be done with this thing for a while.


----------



## bfedorov11

Whoever just listed that pre-order on eBay for $850 is going to be in for a surprise if it sells lol


----------



## al210

Quote:


> Originally Posted by *bfedorov11*
> 
> Whoever just listed that pre-order on eBay for $850 is going to be in for a surprise if it sells lol


Ha ha! I wondered when the first one would show up there. Not surprised.


----------



## Jbravo33

There was one two days ago for $1599. Just checked; they're at $999 lol


----------



## al210

There are people on eBay bidding more on a used Titan than what a new Ti goes for. They're either misinformed or want Titans for SLI.


----------



## Jbravo33

Probably misinformed. Those prices will continue to drop; no way they stay where they're at.


----------



## Alexbo1101

Just got one on the EU preorder


----------



## al210

Quote:


> Originally Posted by *Alexbo1101*
> 
> Just got one on the EU preorder


Nice! Welcome to the pre-order club.


----------



## hebrewbacon

Managed to get a pre-order in as well. I really wanted to get two, but I held back as I couldn't bring myself to pay that kind of dough.

Need to bring my WC 1080s back to stock to sell them off.


----------



## Jbravo33

Wow. Just got an email regarding my order number, and now they're telling me my order was canceled. By who? What? Where? How? Hahaha, what a joke of a company. This is what happens when there's no competition. On the phone for an hour and got nowhere. So sad. But I can look up the item number and it states pending. What's going on?!


----------



## al210

Quote:


> Originally Posted by *hebrewbacon*
> 
> Managed to get a pre-order in as well. I really wanted to get two, but I held back as I couldn't bring myself to pay that kind of dough.
> 
> Need to bring my WC 1080s back to stock to sell them off.


Sweet! This couldn't have come at a better time for me since I'm just about ready to start a new Kaby Lake water build. Just need my Ti and Parvum case.


----------



## hebrewbacon

Quote:


> Originally Posted by *al210*
> 
> Sweet! This couldn't have come at a better time for me since I'm just about ready to start a new Kaby Lake water build. Just need my Ti and Parvum case.


Nice. I just ordered a Titan XP EK waterblock as well to integrate this beast into my loop. I'll definitely miss the power of SLI in games where it scaled well, but it will be nice not to worry about SLI issues or whether a game supports it before buying.


----------



## bfedorov11

Quote:


> Originally Posted by *Jbravo33*
> 
> Wow. Just got an email regarding my order number, and now they're telling me my order was canceled. By who? What? Where? How? Hahaha, what a joke of a company. This is what happens when there's no competition. On the phone for an hour and got nowhere. So sad. But I can look up the item number and it states pending. What's going on?!


Digital River had a lot of credit card issues when they sold the Vive. Call your bank and see if the charge was denied.


----------



## hebrewbacon

Quote:


> Originally Posted by *Jbravo33*
> 
> Wow. Just got an email regarding my order number, and now they're telling me my order was canceled. By who? What? Where? How? Hahaha, what a joke of a company. This is what happens when there's no competition. On the phone for an hour and got nowhere. So sad. But I can look up the item number and it states pending. What's going on?!


That really sucks man. Were you charged as well? My credit card was charged within 5-10 mins of me placing the order. I'm hoping mine goes smoothly as I've already ordered a waterblock as well. Maybe I should wait to take out my 1080s until I get a shipping confirmation.


----------



## Joshwaa

Count me in. Got in on the pre-order and also ordered the FC block from EK. Guess my 1080 FTW with the EK block will be my PhysX card, lol.

Thinking about maybe building a Z270 build just for the Ti. Maybe an open case with a 540 radiator stand; I've been wanting to do that.


----------



## Jbravo33

Quote:


> Originally Posted by *hebrewbacon*
> 
> That really sucks man. Were you charged as well? My credit card was charged within 5-10 mins of me placing the order. I'm hoping mine goes smoothly as I've already ordered a waterblock as well. Maybe I should wait to take out my 1080s until I get a shipping confirmation.


As mentioned above, it might be a CC issue. If you got charged, you should be fine. The crazy thing is I used PayPal, which is direct out of my account. Ugh. This sucks. Just called PayPal and they said a transaction was never put through. *** NVIDIA.


----------



## kckyle

Was gonna pre-order it, but I saw NVIDIA charging me 70 bucks in tax. I'm gonna wait till Newegg gets some in stock.


----------



## al210

Quote:


> Originally Posted by *Jbravo33*
> 
> As mentioned above, it might be a CC issue. If you got charged, you should be fine. The crazy thing is I used PayPal, which is direct out of my account. Ugh. This sucks. Just called PayPal and they said a transaction was never put through. *** NVIDIA.


That blows! I used PayPal as well. I get an email within seconds from PayPal when I use them.


----------



## RobotDevil666

Quote:


> Originally Posted by *hebrewbacon*
> 
> Nice. I just ordered a Titan XP EK waterblock as well to integrate this beast into my loop. I'll definitely miss the power of SLI in games where it scaled well, but it will be nice not to worry about SLI issues or whether a game supports it before buying.


I'm in the same boat, mate. Gonna sell off my 1080s and get a 1080 Ti; the lack of SLI support and the bugs are pissing me off. That said, in games like The Division or GTA V I'll be worse off. Oh well, I'll probably buy a second one later on anyway.

Any word on when they are actually coming in stock?


----------



## LukkyStrike

I was able to secure 2 of them this morning. The charge went through right after ordering as well.

Should be an interesting couple of days to see when we will actually get the cards.


----------



## KraxKill

Anybody have any idea if this thing comes with a backplate like the 1080 Founders? Or is it a naked board?


----------



## Jsunn

Quote:


> Originally Posted by *KraxKill*
> 
> Anybody have any idea if this thing comes with a backplate like the 1080 Founders? Or is it a naked board?


All of the pics I've seen show a back plate.


----------



## jleslie246

Sold out. Which version will be best for a water cooling block? I'm guessing an EVGA FTW version. I wonder how long before those come out?


----------



## al210

Quote:


> Originally Posted by *jleslie246*
> 
> Sold out. Which version will be best for a water cooling block? I'm guessing an EVGA FTW version. I wonder how long before those come out?


The Founders Edition. EK has already stated that their Titan block fits the Ti Founders Edition.


----------



## hebrewbacon

Quote:


> Originally Posted by *RobotDevil666*
> 
> I'm in the same boat, mate. Gonna sell off my 1080s and get a 1080 Ti; the lack of SLI support and the bugs are pissing me off. That said, in games like The Division or GTA V I'll be worse off. Oh well, I'll probably buy a second one later on anyway.
> 
> Any word on when they are actually coming in stock?


Not sure on Founders, but other companies might be putting some in stock on the 10th.
As far as SLI goes, I'm probably not going to venture into that again, since fewer games seem to support it these days. It's really nice to get 100+ fps at 4K when it works, but it sucks when it doesn't and your second card is a paperweight in the game. I know I'm going to lose FPS in Dishonored 2, but oh well. I think the 30-40% increase in single-card performance at 4K makes this worth it.


----------



## y2kcamaross

I'm in on this pre-order train, placed mine at 10:03am central


----------



## LucaZPF

Waiting for a custom one. I'm going to replace my old 780 Ti SLI. I think it's time


----------



## MURDoctrine

Yeah, I'm waiting to see what happens with the board partner announcements. But I will be buying one of these bad boys to replace my 980.


----------



## thyfartismurder

Joined the pre-order club last night. My first brand-new GPU!


----------



## wanako

I'll be getting one to replace my 980 Ti.


----------



## dboythagr8

Tried to get my pre-order in yesterday, but had bank issues at the worst time. By the time it was resolved and I refreshed, they were gone.

So now I'm waiting until the 10th to see what the AIBs have. I've always gone with EVGA when possible, up until the 1070 where I went with ASUS Strix, and that was because of the VRM issue. I sold my 1070 right after the announcement of the Ti, and picked up an ASUS 1060 to hold me over to the 1080Ti. Have had a great time with the ASUS, but I know from history their customer service is garbage. EVGA fixed their issue with the new ICX cooler, so I'll probably go with them again. That customer support has always been great to me.


----------



## al210

Quote:


> Originally Posted by *dboythagr8*
> 
> Tried to get my pre-order in yesterday, but had bank issues at the worst time. By the time it was resolved and I refreshed, they were gone.
> 
> So now I'm waiting until the 10th to see what the AIBs have. I've always gone with EVGA when possible, up until the 1070 where I went with ASUS Strix, and that was because of the VRM issue. I sold my 1070 right after the announcement of the Ti, and picked up an ASUS 1060 to hold me over to the 1080Ti. Have had a great time with the ASUS, but I know from history their customer service is garbage. EVGA fixed their issue with the new ICX cooler, so I'll probably go with them again. That customer support has always been great to me.


Ummm? Why didn't you just keep your 1070 until you got the Ti?


----------



## KraxKill

Quote:


> Originally Posted by *al210*
> 
> Ummm? Why didn't you just keep your 1070 until you got the Ti?


Probably because he can return his 1060 within 14-30 days for a full refund. And in doing so get the maximum possible for his 1070 before the price drop is priced in. All this while having a fully functional rig while he waits for his Ti to arrive.


----------



## stocksux

If water cooling these, is there any reason not to just get a Founders Edition one now? It seems block options are better for FE cards, and since voltage seems locked for Pascal and we already know the TDP is 250W just like on the Titan, the only difference between the AIB cards and the FE cards will be the coolers. And if you're ripping the cooler off for a waterblock, then who cares? Thoughts?


----------



## velocityx

gotta say looking good!


----------



## Douglaster

Waiting for EVGA myself.

Hopefully 8+8 power connectors will let it hit 2GHz on the core!


----------



## LukkyStrike

Quote:


> Originally Posted by *stocksux*
> 
> If water cooling these, is there any reason not to just get a Founders Edition one now? It seems block options are better for FE cards, and since voltage seems locked for Pascal and we already know the TDP is 250W just like on the Titan, the only difference between the AIB cards and the FE cards will be the coolers. And if you're ripping the cooler off for a waterblock, then who cares? Thoughts?


I ordered the reference ones because I have no reason for "better cooling" since mine are water cooled. EK had the WBs ready, so it was a no-brainer. I think NVIDIA would be paying close attention to the chips put into their direct-sales cards anyway.

No reason to wait, in my opinion.

Now the question is when these cards will get here. My EK order will be here Monday.


----------



## kl6mk6

Glad to find this thread. Pre-ordered mine and went with the FE since I'll be watercooling it too. So excited.


----------



## Caos

Quote:


> Originally Posted by *velocityx*
> 
> gotta say looking good!


When does this model go on sale?

Beautiful.


----------



## al210

I'll be water cooling as well. You guys that ordered blocks did you just get the EK Titan block?


----------



## RyzenChrist

Pre-order? It says "Notify Me"... Did I miss it?


----------



## MURDoctrine

Quote:


> Originally Posted by *RyzenChrist*
> 
> Pre-order? It says "Notify Me"... Did I miss it?


Yes preorders opened yesterday and are already sold out.


----------



## RyzenChrist

Quote:


> Originally Posted by *MURDoctrine*
> 
> Yes preorders opened yesterday and are already sold out.


Ugh... Why do I have to work and miss these things?


----------



## al210

Quote:


> Originally Posted by *RyzenChrist*
> 
> Ugh... Why do I have to work and miss these things?


Keep an eye on their site. Some have reported the pre orders going live again at different times.


----------



## LukkyStrike

Quote:


> Originally Posted by *al210*
> 
> I'll be water cooling as well. You guys that ordered blocks did you just get the EK Titan block?


Yes I got the Titan X block.


----------



## CoreyL4

Anyone know when AIB cards go up for pre order?


----------



## Jbravo33

Quote:


> Originally Posted by *velocityx*
> 
> gotta say looking good!


Guess I don't feel bad about that BS I went through yesterday. I'll try again for two of these bad boys! Funny, NVIDIA emailed me again today telling me to check out since I still have the item (1080 Ti) in my shopping cart. For a company with all that money, their store and support sure do suck.


----------



## al210

Quote:


> Originally Posted by *Jbravo33*
> 
> Guess I don't feel bad about that BS I went through yesterday. I'll try again for two of these bad boys! Funny, NVIDIA emailed me again today telling me to check out since I still have the item (1080 Ti) in my shopping cart. For a company with all that money, their store and support sure do suck.


That is one sweet looking card.


----------



## stocksux

I read March 10


----------



## al210

Quote:


> Originally Posted by *stocksux*
> 
> I read March 10


That's what I heard as well!


----------



## hebrewbacon

Quote:


> Originally Posted by *al210*
> 
> I'll be water cooling as well. You guys that ordered blocks did you just get the EK Titan block?


I ordered the Swiftech Titan XP block as it has a built-in LED system. Any Titan XP block should work with this, as the Ti uses the same PCB.


----------



## Mephistobr

Does anyone know how many days or weeks it usually takes for third-party GPUs to launch?


----------



## Mongo

I have hit F5 till my fingers bleed and all I have seen is *NOTIFY ME!* Why would you take a pre-order down? Just leave it up and post a warning that orders will be filled in the order they were received.


----------



## kckyle

Save yourself 70 bucks and wait for Newegg to have it in stock.
Quote:


> Originally Posted by *Mephistobr*
> 
> Does anyone know how many days or weeks it usually takes for third-party GPUs to launch?


Newegg told me they should have it up by next week.


----------



## Mongo

Quote:


> Originally Posted by *kckyle*
> 
> Save yourself 70 bucks and wait for Newegg to have it in stock.


You will not save anything if Newegg price gouges like they have been on new card releases.


----------



## Jsunn

How do you overclock a reference card? I've only had non-reference cards that come with their own overclocking utilities. Is there one people recommend?

Sent from my Pixel XL using Tapatalk


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Jsunn*
> 
> How do you overclock a reference card? I've only had non-reference cards that come with their own overclocking utilities. Is there one people recommend?


MSI afterburner or EVGA Precision X.

I stick with AB myself.


----------



## Clockster

You can add me to the list. I always fall into this trap lol xD


----------



## Neokolzia

Wonder what the date will be for the EVGA Step-Up program.

I'll be part of the owners' group for sure; since I bought my 1080 in early Feb, I should easily qualify.


----------



## rolldog

Any thoughts on who will release the pre-binned non-reference cards first? I plan to watercool these cards anyway, but after owning a reference 780 Ti and a non-reference Gigabyte 980 Ti Gaming G1 GPU, I think I'll wait for someone to release their pre-binned non-reference card. The better air cooling on the Gigabyte GTX 980 Ti Gaming G1 was wasted since I removed it, but I am willing to wait for non-reference with a high ASIC score and the ability to use NVFLASH to unlock the voltage controls and overclocking ability. I don't have a lot of patience though. It looks like Asus and EVGA will have the first non-reference cards available, but I'm sure it's only a matter of time before all the other manufacturers release their souped up version. Luckily, EK made a waterblock for the Gigabyte's non-reference 980 Ti Gaming G1 GPU, but I don't think all non-reference cards had a waterblock available for it. From what I remember, I think I ran my Gigabyte 980 Tis on air for 6 months or so until the waterblock was available.

This card, no matter which version, is going to be a beast. I just need to try not reading anyone's reviews or else I'll make an impulse purchase.


----------



## stocksux

Quote:


> Originally Posted by *rolldog*
> 
> Any thoughts on who will release the pre-binned non-reference cards first? I plan to watercool these cards anyway, but after owning a reference 780 Ti and a non-reference Gigabyte 980 Ti Gaming G1 GPU, I think I'll wait for someone to release their pre-binned non-reference card. The better air cooling on the Gigabyte GTX 980 Ti Gaming G1 was wasted since I removed it, but I am willing to wait for non-reference with a high ASIC score and the ability to use NVFLASH to unlock the voltage controls and overclocking ability. I don't have a lot of patience though. It looks like Asus and EVGA will have the first non-reference cards available, but I'm sure it's only a matter of time before all the other manufacturers release their souped up version. Luckily, EK made a waterblock for the Gigabyte's non-reference 980 Ti Gaming G1 GPU, but I don't think all non-reference cards had a waterblock available for it. From what I remember, I think I ran my Gigabyte 980 Tis on air for 6 months or so until the waterblock was available.
> 
> This card, no matter which version, is going to be a beast. I just need to try not reading anyone's reviews or else I'll make an impulse purchase.


So far I've read that ASUS and EVGA will have their respective 1080 Tis ready on launch day alongside the FE.


----------



## trawetSluaP

So, a little bit of advice is needed.

I plan to upgrade to a 1080 Ti and as I've just installed a custom water cooling loop to my build I'd like to add it to that.

Is it better to buy a reference card and buy a block separately or should I wait for the likes of EVGA's Hydrocopper and MSI's Seahawk EK and buy one of those?


----------



## becks

Quote:


> Originally Posted by *trawetSluaP*
> 
> So, a little bit of advice is needed.
> 
> I plan to upgrade to a 1080 Ti and as I've just installed a custom water cooling loop to my build I'd like to add it to that.
> 
> Is it better to buy a reference card and buy a block separately or should I wait for the likes of EVGA's Hydrocopper and MSI's Seahawk EK and buy one of those?


I have the same thing on my mind, and from what I've read so far it's better to wait.
Even if one of those factory water-cooled cards won't show up anytime soon, or ever... it's still better to buy an aftermarket "pre-binned non-reference" card than a reference one.
But those are just my thoughts...


----------



## startekee

Will a 980 Ti Classified waterblock work with a reference card, or should I wait for the Classified version? I can only wait until October.


----------



## barsh90

Quote:


> Originally Posted by *startekee*
> 
> Will a 980 Ti Classified waterblock work with a reference card, or should I wait for the Classified version? I can only wait until October.


A 980 Ti block will be incompatible. As of right now, the Titan X Pascal block is the only compatible waterblock.


----------



## al210

Quote:


> Originally Posted by *trawetSluaP*
> 
> So, a little bit of advice is needed.
> 
> I plan to upgrade to a 1080 Ti and as I've just installed a custom water cooling loop to my build I'd like to add it to that.
> 
> Is it better to buy a reference card and buy a block separately or should I wait for the likes of EVGA's Hydrocopper and MSI's Seahawk EK and buy one of those?


If you want to water cool and are the kind of person who wants to choose your own block, then reference is the way to go, since it gives you the most options across different manufacturers' waterblocks. If you don't want to mount your own block and are happy with a pre-installed solution, then you'll have to wait for those cards to be released.

I myself like the option of choosing the block I want.


----------



## bfedorov11

None of the custom-PCB 1080s were binned. Just look at reviews for cards like the HOF that can't hit 2100MHz while almost every 1080 FE can go over it. Maxwell wasn't much better with ASIC-binned Kingpins; any card had a chance to hit 1500MHz with proper cooling.

Cooling is the only real difference between all the cards. IMO, the 1080 FE cards clocked higher in general. That's why I'm going with another FE and a waterblock.


----------



## wstanci3

I miss the days of Kepler, when you knew that the more voltage you added, the higher the overclock would go.


----------



## EvilPieMoo

Got two Tis pre-ordered with NVIDIA; now it's just a matter of playing the waiting game. It's a shame NVIDIA is being so vague with the shipping/release dates, though.


----------



## PasK1234Xw

Quote:


> Originally Posted by *EvilPieMoo*
> 
> Got two Tis pre-ordered with NVIDIA; now it's just a matter of playing the waiting game. It's a shame NVIDIA is being so vague with the shipping/release dates, though.


I just sold my 1080 SLI setup for one Ti. I've been running SLI for years and at this point it's a joke; even BF1 in SLI is a disappointment. Regardless of the high frame rates, it's still not smooth even with G-Sync.


----------



## al210

Quote:


> Originally Posted by *EvilPieMoo*
> 
> Got two Tis pre-ordered with NVIDIA; now it's just a matter of playing the waiting game. It's a shame NVIDIA is being so vague with the shipping/release dates, though.


Two of those, wow!

Maybe NVIDIA will surprise us.


----------



## hebrewbacon

Quote:


> Originally Posted by *PasK1234Xw*
> 
> I just sold my 1080 SLI setup for one Ti. I've been running SLI for years and at this point it's a joke; even BF1 in SLI is a disappointment. Regardless of the high frame rates, it's still not smooth even with G-Sync.


I'm waiting for my Ti to arrive before I sell my 1080s. I have a backup 780 that I use for emergency purposes but don't want to have a crippled PC for a week while I wait for the Ti. SLI seems like a dying technology and I think it's better to just stick with 1 powerful GPU. I am in the middle of Dishonored 2 and SLI seems to work great in this game, so it will suck not having the ~100fps at 4k ultra.


----------



## EvilPieMoo

Anybody who pre-ordered the Ti not had the money taken yet? I'm assuming NVIDIA takes payment for pre-orders when the card ships; that, or they screwed up. Hopefully not the latter.


----------



## al210

Just ordered the EK nickel-plexi Titan X waterblock for the Ti.


----------



## al210

Quote:


> Originally Posted by *EvilPieMoo*
> 
> Anybody who pre-ordered the Ti not had the money taken yet? I'm assuming NVIDIA takes payment for pre-orders when the card ships; that, or they screwed up. Hopefully not the latter.


Did you pay with CC or PayPal?


----------



## EvilPieMoo

Quote:


> Originally Posted by *al210*
> 
> Did you pay with CC or PayPal?


CC (well debit card)


----------



## bfedorov11

The CC probably has a hold, which will end up posting since the ship date is close. See if it's reflected in your available credit.

PayPal posts instantly, which is why I use it for pre-orders; no worries about a false fraud flag. The money is gone the second you pay.


----------



## Biggu

Man, I thought today was the day for pre-orders. I guess I screwed the pooch on this one. Here's hoping I can get one at Micro Center on the 10th.


----------



## Jsunn

Here is a teardown video of the 1080Ti. Nothing really new, but some good video of the PCB.


----------



## KraxKill

Quote:


> Originally Posted by *Jsunn*
> 
> Here is a teardown video of the 1080Ti. Nothing really new, but some good video of the PCB.


Nice find. So it's basically a Titan with a RAM chip hacked off.


----------



## Baasha

Releasing it on a Friday is a bit meh... are they planning to ship on the 10th, or do we get the GPUs then? I did next-morning delivery, so hopefully I should have these soon.

1080 Ti 4-Way SLI FTW!


----------



## bfedorov11

Yeah, I hope they arrive before Friday. If you sign up and create an account at FedEx and UPS, you'll get a notification when someone sends you a package.

Ready to go!


----------



## Joshwaa

Got my EK block today also.


----------



## Snabeltorsk

Quote:


> Originally Posted by *jleslie246*
> 
> Sold out. Which version will be best for a water cooling block? I'm guessing an EVGA FTW version. I wonder how long before those come out?


Founders will be best for water.


----------



## al210

Quote:


> Originally Posted by *Joshwaa*
> 
> Got my EK block today also.


You're set.

Mine is on order.


----------



## kevindd992002

Quote:


> Originally Posted by *Snabeltorsk*
> 
> Founders will be best for water.


Why is that?


----------



## al210

Quote:


> Originally Posted by *kevindd992002*
> 
> Why is that?


Founders edition will have the most choices from different manufacturers for blocks.


----------



## LukkyStrike

Anyone here ever pre-order direct from NVIDIA? I only ask because whenever I have pre-ordered phones, I get them on launch day or the day before...

I am just curious how they handle this.


----------



## nyk20z3

I couldn't have a block that says Titan X on a non-Titan X card though; I'm too anal for that. EK should revise the blocks to also say 1080 Ti for whoever orders them.

I'm currently on a 980 Ti Extreme Gaming and it's still a monster card, but as always, as enthusiasts we get sucked into upgrading when we don't really need to.


----------



## kevindd992002

Quote:


> Originally Posted by *al210*
> 
> Founders edition will have the most choices from different manufacturers for blocks.


Yup, but when we talk about performance and the risk of being limited by the power limit during overclocking, won't AIB boards with a third-party waterblock always be better?


----------



## al210

Quote:


> Originally Posted by *kevindd992002*
> 
> Yup, but when we talk about performance and the risk of being limited by the power limit during overclocking, won't AIB boards with a third-party waterblock always be better?


History says yes, but the verdict is not in for the Ti, which may already be pushed hard at stock since the clocks are higher than the Titan XP's.


----------



## nyk20z3

Quote:


> Originally Posted by *al210*
> 
> History says yes, but the verdict is not in for the Ti, which may already be pushed hard at stock since the clocks are higher than the Titan XP's.


Is this something with Pascal? The custom 980 Tis had plenty of headroom over the reference board.


----------



## bfedorov11

Quote:


> Originally Posted by *kevindd992002*
> 
> Yup, but when we talk about performance and the risk of being limited by the power limit during overclocking, won't AIB boards with a third-party waterblock always be better?


It didn't really matter with the 1080 cards. They all hit the same PT wall.
Quote:


> Originally Posted by *nyk20z3*
> 
> Is this something with Pascal? The custom 980 Tis had plenty of headroom over the reference board.


Reference cards could hit 1500+


----------



## Roadrunners

Quote:


> Originally Posted by *bfedorov11*
> 
> It didn't really matter with the 1080 cards. They all hit the same PT wall.


Will be interesting to see if this also applies to the GP102 chip.


----------



## hebrewbacon

My waterblock just arrived today. Sucks that I'll need to wait until possibly the middle of next week (hopefully) before I can use it.


----------



## Trino

Hope this is wrong; gonna be using my iGPU as of tomorrow :'(

https://pcpartpicker.com/forums/topic/213395-1080-ti-confusion


----------



## thyfartismurder

Quote:


> Originally Posted by *Trino*
> 
> Hope this is wrong; gonna be using my iGPU as of tomorrow :'(
> 
> https://pcpartpicker.com/forums/topic/213395-1080-ti-confusion


Hmm, I've been through every possible page in the order options and all I get is "order received,"
but if they haven't shipped by the time AIB cards are out, I'm pretty sure everyone will just get a refund.


----------



## Trino

Yeah, I get the same "Order Received." Well, here's hoping; it's gonna be a boring month otherwise!


----------



## EvilPieMoo

Quote:


> Originally Posted by *Trino*
> 
> Hope this is wrong; gonna be using my iGPU as of tomorrow :'(
> 
> https://pcpartpicker.com/forums/topic/213395-1080-ti-confusion


Nobody else seems to see this. I've looked everywhere on the order confirmation and there's no sign of what that guy is reading.


----------



## RedM00N

Thinking of going with reference, or at least a third-party board that a reference waterblock will fit. In the event no BIOS modding comes along and the great Pascal wall holds true for the 1080 Ti, it might be better to go this route.


----------



## navjack27

Hey guys. I'm tempted to sell my 980 Ti, which I'm just in love with, after I grab one of these to compare. I'll just get whatever is cheapest/Founders. How the card clocks up during iray or folding matters to me; gaming is secondary.


----------



## Neokolzia

Quote:


> Originally Posted by *thyfartismurder*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Trino*
> 
> Hope this is wrong gonna be using my igpu as of tomorrow :'(
> 
> https://pcpartpicker.com/forums/topic/213395-1080-ti-confusion
> 
> 
> 
> Hmm, I've been through every possible page in the order options and all I get is "order received",
> but if they haven't shipped by the time AIB cards are out, I'm pretty sure everyone will just get a refund
Click to expand...

Considering the unboxing embargo lifted just a few days ago, I highly doubt they'll push the release back to the end of the month.


----------



## y2kcamaross

Anyone actually had their credit card charged? Mine isn't even pending yet..


----------



## KraxKill

Quote:


> Originally Posted by *y2kcamaross*
> 
> Anyone actually had their credit card charged? Mine isn't even pending yet..


If you didn't pay with PayPal, your payment will only be processed upon shipment.

May be worth it to call your bank and make sure the payment will go through on or around the 10th. That way you don't lose your spot if it doesn't.


----------



## y2kcamaross

Quote:


> Originally Posted by *KraxKill*
> 
> If you didn't pay with PayPal, your payment will only be processed upon shipment.
> 
> May be worth it to call your bank and make sure the payment will go through on or around the 10th. That way you don't lose your spot if it doesn't.


No PayPal. Not worried about my bank; it'll go through. I've just seen a few people on Nvidia's forums say their cards were charged today, so I was just curious


----------



## tconroy135

Quote:


> Originally Posted by *y2kcamaross*
> 
> No Paypal, not worried about my bank, it'll go through, I've just seen a few people on Nvidia's forums say their cards were charged today so I was just curious


I think PayPal is the way to go because of the immediate charge with pre-orders. It puts you at the front of the line in my experience.


----------



## EvilPieMoo

Quote:


> Originally Posted by *tconroy135*
> 
> I think PayPal is the way to go because of the immediate charge with pre-orders. It puts you at the front of the line in my experience.


I hope this isn't the case. I ordered my 2 Tis within a minute of pre-orders going live, and will be peeved if people who paid with CC get pushed to the back for not using PayPal.


----------



## y2kcamaross

Hmmm...just checked my order status, and now instead of saying boxed shipment : received, it says no orders found


----------



## EvilPieMoo

Quote:


> Originally Posted by *y2kcamaross*
> 
> Hmmm...just checked my order status, and now instead of saying boxed shipment : received, it says no orders found


Just checked, same issue.


----------



## Alexbo1101

Yup, same here, order doesn't exist.


----------



## stocksux

I'm just gonna go to Microcenter on Friday and grab a couple


----------



## al210

Quote:


> Originally Posted by *stocksux*
> 
> I'm just gonna go to Microcenter on Friday and grab a couple


They will have them for sure? You called?


----------



## alucardis666

So when do the AIBs hit the shops? After going FE with the 1080, I'm thinking of doing a Strix or FTW for the Ti... just need a place to take my money.


----------



## DishTheRock

UPDATE: scratch that, Nvidia customer support just sent me some login info. My order appears to still say boxed shipment.

Did anyone pre-order their cards from Nvidia as a guest when checking out? When I go to check my order status, it says I should have received a password in my confirmation email, but there is no password there. They have already pulled the funds from my PayPal. I just want to see if it says my order does not exist like others are saying.


----------



## y2kcamaross

Quote:


> Originally Posted by *DishTheRock*
> 
> UPDATE: scratch that, Nvidia customer support just sent me some login info. My order appears to still say boxed shipment.
> 
> Did anyone pre-order their cards from Nvidia as a guest when checking out? When I go to check my order status it says I should have received a password in my confirmation email but there is no password there. they have already pulled the funds from my PayPal. I just want to see if it says my order does not exist like others are saying.


No guest checkout here, but the order does not exist problem has been fixed, they all show up correctly again


----------



## foolycooly

Quote:


> Originally Posted by *DishTheRock*
> 
> UPDATE: scratch that, Nvidia customer support just sent me some login info. My order appears to still say boxed shipment.
> 
> Did anyone pre-order their cards from Nvidia as a guest when checking out? When I go to check my order status it says I should have received a password in my confirmation email but there is no password there. they have already pulled the funds from my PayPal. I just want to see if it says my order does not exist like others are saying.


The order number/password are in the black header of the e-mail.


----------



## DishTheRock

Quote:


> Originally Posted by *foolycooly*
> 
> the order number/password are in the black header of the e-mail.


I must be missing something then. Unless they want me to use my order number as my password as well.


----------



## foolycooly

Weird--my password appears under "order number" in the black bar at the top.


----------



## thyfartismurder

Quote:


> Originally Posted by *DishTheRock*
> 
> I Must be missing something then. Unless they want me to use my order number as my password as well.


It will be in the email titled "Order confirmation for order #****************";
that email looks like the order received email


----------



## DishTheRock

Quote:


> Originally Posted by *thyfartismurder*
> 
> It will be in the email titled "Order confirmation for order #****************";
> that email looks like the order received email


Neither my "order confirmation" nor "order received" email had a password anywhere.

It's no big deal as I was able to get some login info from customer support.

I just thought it was weird that it's just not there.


----------



## tconroy135

Quote:


> Originally Posted by *DishTheRock*
> 
> Neither my "order confirmation" or "order received" email had a password anywhere.
> 
> It's no big deal as I was able to get some login info from customer support.
> 
> I just thought it was weird that it's just not there.


It is weird. My order number and password were in the confirmation email, I paid with PayPal, not sure if that makes any difference.


----------



## al210

Mine says "Boxed Shipment" as well. Paid with PayPal


----------



## tconroy135

Quote:


> Originally Posted by *al210*
> 
> Mine says "Boxed Shipment" as well. Paid with PayPal


Same, but what the hell does Boxed Shipment mean, that's my question.


----------



## DrkHaredNFiry

I think that it means that it comes in a box.

Sorry to be a dic, but that is what it means.

I hate waiting


----------



## al210

I'm hoping it means it's already boxed and ready to ship for the March 10th launch?


----------



## Jsunn

Yeah, I just checked mine, it says, boxed shipment as well.

I guess there were some folks that were having issues with their orders. I paid with PayPal and the funds have already left my account.

It looked like on the Nvidia forums, that people were having issues if they used their credit cards, but I can't be certain.

I'm hoping to get a tracking number on Friday! woo-hoo! Fingers crossed.
-J


----------



## EvilPieMoo

Still no money taken from my account, nor any update on the order; it still says order received - boxed shipment. Not going to hold out hope for Friday delivery at this rate.


----------



## tconroy135

Quote:


> Originally Posted by *Jsunn*
> 
> Yeah, I just checked mine, it says, boxed shipment as well.
> 
> I guess there were some folks that were having issues with their orders. I paid with PayPal and the funds have already left my account.
> 
> It looked like on the Nvidia forums, that people were having issues if they used their credit cards, but I can't be certain.
> 
> I'm hoping to get a tracking number on Friday! woo-hoo! Fingers crossed.
> -J


Quote:


> Originally Posted by *EvilPieMoo*
> 
> Still no money taken from my account, nor any update on the order; it still says order received - boxed shipment. Not going to hold out hope for Friday delivery at this rate.


What happened when the 1080 or Titan XP was released?


----------



## romanlegion13th

When will EVGA and others bring cards out?


----------



## Neokolzia

Quote:


> Originally Posted by *romanlegion13th*
> 
> When will EVGA and others bring cards out?


Assuming some time after the 10th. EK or EVGA (can't remember which) said definitely before the 31st. So, that's a thing. I'll be waiting to step up to a 1080 Ti myself from my FTW 3 1080.


----------



## hebrewbacon

Quote:


> Originally Posted by *stocksux*
> 
> I'm just gonna go to Microcenter on Friday and grab a couple


Oh boy, did you confirm this with Microcenter? I'm hoping I receive a tracking number by Friday or I may have to visit the one near my house to pick one up


----------



## thyfartismurder

Order status now says "Processing"; it's getting closer, boiz


----------



## kevinjohn

Quote:


> Originally Posted by *Jsunn*
> 
> Pre-order submitted...
> Does anyone have any experience with the delivery times on nvidia pre-orders?
> 
> Thanks,
> -Jsunn


Well, I personally have never experienced it, but my friend ordered an Nvidia graphics card and it took around 5 business days.


----------



## Renairy

I'll be in this group, but I'll be waiting for the Strix.

Any idea when reviews will be up? NDA is lifted on the 9th, isn't it?


----------



## MURDoctrine

Quote:


> Originally Posted by *Renairy*
> 
> I'll be in this group, but i'll be waiting for the Strix.
> 
> Any idea when reviews will be up? NDA is lifted on the 9th isnt it?


Launch is the 10th, but we get reviews on the 9th.


----------



## G woodlogger

Just something to laugh at while we wait:

https://www.techpowerup.com/231327/msi-unveils-camo-squad-limited-edition-products-with-ghost-recon-wildlands

The graphics cards are apparently designed for the desert or mountains? They should have motherboard images on them so they can't be seen in the computer!


----------



## MURDoctrine

Quote:


> Originally Posted by *G woodlogger*
> 
> Just something to laugh at while we wait:
> 
> https://www.techpowerup.com/231327/msi-unveils-camo-squad-limited-edition-products-with-ghost-recon-wildlands
> 
> The graphics cards are apparently designed for the desert or mountains? They should have motherboard images on them so they can't be seen in the computer!


It might seem stupid, but it's not. We had a camouflage consumer PC come through our Wal-Marts; they sold like crazy here in the South. We also had a camouflage television. There is a market for it, or they wouldn't do it.


----------



## G woodlogger

So where are the cards for my environment? I live in social housing next to a golf course... so green grass?


----------



## thyfartismurder

Quote:


> Originally Posted by *G woodlogger*
> 
> Just something to laugh at while we wait:
> 
> https://www.techpowerup.com/231327/msi-unveils-camo-squad-limited-edition-products-with-ghost-recon-wildlands
> 
> The graphics cards are apparently designed for the desert or mountains? They should have motherboard images on them so they can't be seen in the computer!


here is my full msi camo build in a corsair 570x
https://www.technic3d.com/article/pics/1953/Corsair_570X__1_von_1_-36.jpg


----------



## LukkyStrike

CC was charged this morning. And it now states processing!

Wonder when it will ship?


----------



## thyfartismurder

Quote:


> Originally Posted by *LukkyStrike*
> 
> CC was charged this morning. And it now states processing!
> 
> Wonder when it will ship?


I would imagine it will ship tonight or tomorrow morning; "Processing" tends to be fast


----------



## Jsunn

Woo-hoo! Mine says processing as well.

So even if it ships tomorrow, I won't get it till the 13th at the earliest. I should have gone with overnight shipping.


----------



## bfedorov11

Quote:


> Originally Posted by *G woodlogger*
> 
> Just something to laugh at while we wait:
> 
> https://www.techpowerup.com/231327/msi-unveils-camo-squad-limited-edition-products-with-ghost-recon-wildlands
> 
> The graphic card are apparently designed for the dessert or mountains? It should have motherboard images on it so it cant be seen in the computer!


I dunno, I'd rock that... hits on two of my favorite things in this world, GPUs and firearms. Black digital camo would look better IMO.


----------



## KraxKill

So... are there two different versions of the Nvidia Ti? How did Nvidia decide which RAM chip to remove?

https://www.extremetech.com/gaming/245604-review-gtx-1080-ti-first-real-4k-gpu-drives-better-amd-intel?source=flipboard

VS.

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_Ti/3.html


----------



## Jsunn

Looks like some more info is up! Woo-hoo!


----------



## kl6mk6

My CC declined the NVIDIA transaction this morning. Got on real fast and updated the order with a different CC. Hopefully it didn't mess up my pre-order status.


----------



## MightEMatt

Quote:


> Originally Posted by *KraxKill*
> 
> So... Are there two different versions of the Nvidia Ti? How did Nvidia decide which ram chip to remove?
> 
> 
> 
> 
> 
> https://www.extremetech.com/gaming/245604-review-gtx-1080-ti-first-real-4k-gpu-drives-better-amd-intel?source=flipboard
> 
> VS.
> 
> https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_Ti/3.html


The PCBs are all the same, but which ROP partition they disable is random, so the missing RAM chip probably depends on which one gets disabled.


----------



## alucardis666

Can't wait to see the 1080ti go up on newegg and sell out before I can get one ordered. lol


----------



## al210

I have to laugh.

Someone over on the Titan XP forum said, "I'll be sticking with the Titan X because it just sounds better." If sounding better is worth $500, then have at it.


----------



## bfedorov11

It is nice to be king! I'd keep it too. The name does hold value. Just look at the TXM cards that were selling for $700+ just before the 1080ti announcement.


----------



## KraxKill

Quote:


> Originally Posted by *MightEMatt*
> 
> PCBs are all the same but which ROP they disable is random, so probably depends on which one gets disabled.


Let's hope they disable the weakest set, right...


----------



## y2kcamaross

Anyone's shipped yet? A few people over on the Nvidia forums are reporting shipment


----------



## EvilPieMoo

Still stuck on processing for me, the ones claiming to have theirs shipped seem to be a very small minority. Either they're lucky or talking outta their a$$ haha


----------



## Joshwaa

I ordered right at 8AM PST and mine still does not show shipped. Still processing. Did the people that said theirs shipped give their order number?


----------



## y2kcamaross

Quote:


> Originally Posted by *Joshwaa*
> 
> I ordered right at 8AM PST and mine still does not show shipped. Still processing. Did the people that said theirs shipped give their order number?


No, not entirely sure I believe them


----------



## Joshwaa

Seems it is the outside-the-US orders that are shipping, if the accounts are true.


----------



## EvilPieMoo

Mine has been stuck on processing all day, and I'm in the UK; it's 9:30pm here, so it doesn't look like they'll arrive until Monday. Disappointing.


----------



## MightEMatt

Mine was shipped a couple of hours ago. I ordered from the first batch, the minute they went up, with priority shipping.


----------



## Joshwaa

Blame Canada, lol.


----------



## bfedorov11

I got an email from FedEx saying it was picked up at 1pm. Monday delivery, and I paid for 2-day.

I am going to try to change it for pickup... might be able to grab it on Saturday. My order status on the nvidia site still says processing though. Probably takes time to update.

Strange: I have two shipments with different tracking numbers from them in my FedEx manager... all the information is correct in both, so I don't know why they would need to void the first..


----------



## KedarWolf

Midnight tonight I'll scour Newegg and Amazon for my 1080 Ti!!


----------



## KedarWolf

What are the odds you can do a hard mod and enable the ROPs?


----------



## RedM00N

Quote:


> Originally Posted by *bfedorov11*
> 
> It is nice to be king! I'd keep it too. The name does hold value. Just look at the TXM cards that were selling for $700+ just before the 1080ti announcement.


Wow, I could've sold my 3 TXMs for 700 a pop before the announcement? Welp..

Could've used my 680s to hold me over till I got my Tis


----------



## Neokolzia

Quote:


> Originally Posted by *KedarWolf*
> 
> Midnight tonight I'll scour Newegg and Amazon for my 1080 Ti!!

I got a program running already xD, parsing the EVGA site for the 1080 Ti drop on the Step-Up page; it will alert all my devices the instant it shows

Could do something similar with sites like Newegg with a bit of tweaking
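That kind of stock watcher can be sketched in a few lines. This is a minimal, hypothetical example (the URL, product string, and sold-out markers below are placeholders; a real retailer page needs site-specific parsing and a polite polling interval):

```python
import time
import urllib.request

# Assumed marker phrases; real sites word this differently.
SOLD_OUT_MARKERS = ("out of stock", "sold out")

def in_stock(html, product):
    """Naive check: the product name appears and no sold-out marker does."""
    text = html.lower()
    return product.lower() in text and not any(m in text for m in SOLD_OUT_MARKERS)

def watch(url, product, interval=60.0, max_polls=None):
    """Poll `url` every `interval` seconds; return True once `product` looks in stock."""
    polls = 0
    while max_polls is None or polls < max_polls:
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
            if in_stock(html, product):
                print("ALERT: possible stock at", url)
                return True
        except OSError as exc:
            print("poll failed:", exc)  # transient network error: keep polling
        polls += 1
        time.sleep(interval)
    return False

# Example (hypothetical URL):
# watch("https://www.example.com/gtx-1080-ti", "1080 Ti", interval=30)
```

The alert here is just a print; wiring it to push notifications (as the poster describes) is left to whatever notification service you already use.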


----------



## Jsunn

Mine has shipped, scheduled delivery is the 14th.

Woohoo!


----------



## LukkyStrike

Mine has shipped. From MN, scheduled FedEx priority before 10:30am tomorrow.


----------



## EvilPieMoo

Doesn't look like UK orders will ship today (Thursday); it just hit midnight and mine is still processing. Disappointed, tbh; the only reason I pre-ordered was because I assumed we'd get it on release day, and now it'll be Monday at the earliest.


----------



## bfedorov11

Called FedEx... you cannot change it for pickup in the system, but you can go pick it up early if it arrives at your retail store/shipping center.


----------



## stocksux

Verified today with my local Microcenter that they have stock of 1080 Tis and will have them for sale when the doors open tomorrow morning. I'll have a pair of these bad boys in the morning


----------



## hebrewbacon

Quote:


> Originally Posted by *stocksux*
> 
> Verified today with my local microcenter that they have stock of 1080ti's and will have them for sale when doors open tomorrow morning. I'll have a pair of these bad boys in the morning


That is awesome news. Hopefully mine ships tomorrow or I'm cancelling and picking mine up at microcenter.


----------



## al210

Got the shipped-confirmation email. Should be here mid next week. That's OK, because I just started a new build and it will be some time before I need it. My Parvum case is supposed to ship this week, and now I need that case bad.


----------



## al210

Quote:


> Originally Posted by *stocksux*
> 
> Verified today with my local microcenter that they have stock of 1080ti's and will have them for sale when doors open tomorrow morning. I'll have a pair of these bad boys in the morning


Wonder how many they got?


----------



## KedarWolf

For anyone flush with dollars who has a three- or four-way motherboard, here's how to get three/four-way SLI to work with our cards.

Says 1070, but it will work with any Pascal card.


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Anyone flush with dollars and have a three or four way motherboard here's how to get three/four way SLI to work with our cards.
> 
> 
> 
> 
> 
> 
> Says 1070 but will work any Pascal card.


You need two tri- or quad-SLI bridges that fit, one on each SLI connector, to get it to work. I think there is a way to team short and long flexible SLI bridges, though.


----------



## BoredErica

Are we going to have an overclocking spreadsheet w/ verification?


----------



## c0nsistent

Dang, I hope there's still stock after work tomorrow; can't order 'til I get my paycheck


----------



## straha20

Quote:


> Originally Posted by *stocksux*
> 
> Verified today with my local microcenter that they have stock of 1080ti's and will have them for sale when doors open tomorrow morning. I'll have a pair of these bad boys in the morning


I am actually going to my nearest Microcenter tomorrow morning to pick up my X370 board, so this is good news indeed. Did they give any indication of whether or not they have any non-reference variants, such as hybrid-cooled models?


----------



## al210

Quote:


> Originally Posted by *straha20*
> 
> I am actually going to my nearest MicroCenter tomorrow morning to pick up my x370 board, so this is good news indeed. Did they give any indication of whether or not they have any non reference variants such as hybrid cooling and such?


I doubt there will be any variants till later this month.


----------



## Domler

Sold off my two 1080s and just ordered EK Titan X blocks and backplates. Hopefully I can score two Tis tomorrow. They will be under water by Tuesday!!!


----------



## stocksux

They did not give any details on anything other than that they will have stock of the 1080 Ti when the doors open. From what I hear on YouTube, the partner cards will debut in April.


----------



## BoredErica

Quote:


> Originally Posted by *stocksux*
> 
> They did not give any details on anything other than they will have stock of the 1080 ti when the doors open. From what I hear on YouTube, the partner cards will debut in April.


That would suck a whole lot if true.

I have no problem waiting until a card launches. But once it launches I get impatient.


----------



## OutlawXGP

Hey guys, anyone here from the UK got an update on their order status? It just says processing for me...


----------



## MURDoctrine

So I really didn't keep up with Pascal too much after the launch hype. If I plan on throwing my 1080 Ti under water pretty much as soon as I get it, should I go for an FE, or wait for an AIB custom card with a reference PCB?


----------



## KedarWolf

Quote:


> Originally Posted by *MURDoctrine*
> 
> So I really didn't keep up with pascal too much after the launch hype. If I plan on throwing my 1080ti under water pretty much once I get it should I go for one of the FE or wait for AIB custom reference pcb?


The FE has EK water block support already. AIB cards might not be out for a few weeks, with water blocks soon after, depending on the model.

But, for example, the FE has 7 power MOSFET phases, where the Asus Strix will have a 10+2 layout (12 in total) and two eight-pin power connectors instead of the FE's eight-pin plus six-pin.

The Strix will have an EK waterblock, as the 1080s do. They will likely OC somewhat better.


----------



## MURDoctrine

Quote:


> Originally Posted by *KedarWolf*
> 
> The FE has EK water block support already. AIB cards might not be out for a few weeks, with water blocks soon after, depending on the model.
> 
> But, for example, the FE has 7 power MOSFET phases, where the Asus Strix will have a 10+2 layout (12 in total) and two eight-pin power connectors instead of the FE's eight-pin plus six-pin.
> 
> The Strix will have an EK waterblock, as the 1080s do. They will likely OC somewhat better.


I thought I read, though, that the Titan XPs were heavily limited by the power target and voltage, and that the extra power won't do anything to aid their OCs?


----------



## KCDC

I'm in SoCal and mine still says processing! Aren't these coming from CA?? I know I paid the 120 bucks state sales tax for two cards, that's for dang sure...


----------



## Alexbo1101

Shipped, but with UPS Standard, so they won't arrive until Monday. So jealous of the Americans who get FedEx same-day shipping.


----------



## alucardis666

Dying to get my hands on a Ti, and also thinking I should really hold off for an AIB card from ASUS, MSI or EVGA this time... Ugh...


----------



## dkaardal

UK here - just looked at my preorder, and it's marked as shipped now. It wasn't as of midnight last night, but Parcelforce tracking has it already out for delivery.

Going from 3 cards (290x's) down to one - should be an interesting experiment!


----------



## alucardis666

Quote:


> Originally Posted by *dkaardal*
> 
> UK here - just looked at my preorder, and it's marked as shipped now. It wasn't as of midnight last night, but Parcelforce tracking has it already out for delivery.
> 
> Going from 3 cards (290x's) down to one - should be an interesting experiment!


Nice man. Congratz!


----------



## Alexbo1101

Quote:


> Originally Posted by *dkaardal*
> 
> UK here - just looked at my preorder, and it's marked as shipped now. It wasn't as of midnight last night, but Parcelforce tracking has it already out for delivery.
> 
> Going from 3 cards (290x's) down to one - should be an interesting experiment!


Now I'm a little jealous of you too; shipping from Germany is apparently very slow with UPS.


----------



## Emissary of Pain

I can get my card on Monday if I order in the next 2 hours.

Is there going to be any difference going with the partner cards? ... I am looking at watercooling the card, so the FE will be easier to get blocks for, but I am worried that it will fall behind other cards with higher power limits.

Any advice?


----------



## thyfartismurder

UK - went from processing to out for delivery at 8am this morning. Not back from work for another 2-3 hours, and no one is home


----------



## thyfartismurder

Quote:


> Originally Posted by *Emissary of Pain*
> 
> I can get my card on Monday if I order in the next 2 hours.
> 
> Is there going to be any difference going with the partner cards ? ... I am looking at watercooling the card so the FE will be easier to get blocks for but I am worried that it will fall behind other cards with higher power limits
> 
> Any advice ?


FE will have more options for blocks, but EK will have blocks for all the major cards. If you're worried about power limits, there is no harm in waiting a few weeks.


----------



## Emissary of Pain

I am officially the proud owner of a 1080 Ti...

Impulse-bought the card when I ran out of time... haha... Let's hope the card is stable and never needs to be RMA'd


----------



## hebrewbacon

Dang woke up this morning and my preorder is still showing "processing". Might have to take a visit to my microcenter


----------



## stocksux

Quote:


> Originally Posted by *KedarWolf*
> 
> The FE has EK water block support already. AIB cards might not be out for a few weeks, with water blocks soon after, depending on the model.
> 
> But, for example, the FE has 7 power MOSFET phases, where the Asus Strix will have a 10+2 layout (12 in total) and two eight-pin power connectors instead of the FE's eight-pin plus six-pin.
> 
> The Strix will have an EK waterblock, as the 1080s do. They will likely OC somewhat better.


Power connectors and MOSFETs won't matter, just as they didn't for the 1080. The card will hit power constraints (assuming it's on water and the thermal cap isn't an issue) before there's need for all that. The 1080 FE clocked just as high as, or even higher than, AIB cards.


----------



## keikei

I took the day off today. I'm going to get some duncans and a coffee roll. Hopefully, by 10am I will see that order button and snatch one up. Gonna be gud.


----------



## Cool Mike

Would love a surprise and see a custom 1080ti pop up on Newegg.


----------



## Emissary of Pain

Quote:


> Originally Posted by *Cool Mike*
> 
> Would love a surprise and see a custom 1080ti pop up on Newegg.


Please no ... I'd cry


----------



## thyfartismurder

ITS HERE!!! (posted on system with 1080ti)


----------



## vmanuelgm

Quote:


> Originally Posted by *thyfartismurder*
> 
> ITS HERE!!! (posted on system with 1080ti)


Congrats!!!

Show us how the beast behaves!!!


----------



## thyfartismurder

Quote:


> Originally Posted by *vmanuelgm*
> 
> Congrats!!!
> 
> Show us how the beast behaves!!!


Installing drivers now


----------



## bfedorov11

Quote:


> Originally Posted by *thyfartismurder*
> 
> ITS HERE!!! (posted on system with 1080ti)


Congrats! Let's see some OC numbers!

Wonder why no retailers have the Ti for sale or even pre-order... EVGA, Amazon, Newegg...


----------



## RamGuy

Quote:


> Originally Posted by *bfedorov11*
> 
> Congrats! Let's see some OC numbers!
> 
> Wonder why no retailers have the ti for sale or even preorder..... evga, amazon, newegg..


Yeah, is it severely understocked?


----------



## Mongo

Quote:


> Originally Posted by *thyfartismurder*
> 
> ITS HERE!!! (posted on system with 1080ti)


Pics or it didn't happen!


----------



## thyfartismurder

Quote:


> Originally Posted by *Mongo*
> 
> Pics or it didnt happen!


----------



## Cool Mike

Hoping Newegg will have stock within a few hours.


----------



## Mongo

YOU SUCK

My hand hurts from the F5 button and you already got one. By the way, I got an email from Nvidia that said they go on sale at 10am PST.


----------



## failwheeldrive

Heads up, the MSI Founders is in stock at bhphotovideo right now. Just bought it.

edit: out of stock now. That was quick


----------



## PurdueBoy

I remember there was talk of a 980 Ti step-up plan a while back, not the EVGA program but an actual Nvidia one. Any word on this, or one for the 1080?


----------



## thyfartismurder

MY GOD are blower cards loud!


----------



## failwheeldrive

Quote:


> Originally Posted by *thyfartismurder*
> 
> MY GOd are blower cards loud!


Stock coolers are garbage. EK's Titan XP water block fits your card. That'll solve your noise problem


----------



## bfedorov11

bhphotovideo says the Asus FE card comes with For Honor or Wildlands.. anyone know if nvidia will be including it?


----------



## thyfartismurder

Quote:


> Originally Posted by *failwheeldrive*
> 
> Stock coolers are garbage. EK's Titan XP water block fits your card. That'll solve your noise problem


Yeah, ik, I'm watercooling next month


----------



## failwheeldrive

Quote:


> Originally Posted by *thyfartismurder*
> 
> yeah ik Im watercooling next month


Good call, you won't be disappointed


----------



## thyfartismurder

How do I get the overlay that shows the clocks going up and down? The one I see just shows the max boost.


----------



## bfedorov11

Quote:


> Originally Posted by *thyfartismurder*
> 
> How do i get the overlay where it shows the Clocks going up and down? the one i see just shows the max boost.


Install Afterburner; it will install RivaTuner with it. You can change what is shown in the options in Afterburner. It should have some stuff enabled and shown by default.


----------



## failwheeldrive

What utility are you using? Afterburner?


----------



## KraxKill

Quote:


> Originally Posted by *thyfartismurder*
> 
> How do i get the overlay where it shows the Clocks going up and down? the one i see just shows the max boost.


The simple way is to use GPU-Z: set the sampling interval to 0.1 sec and use its logging option. Then, after your bench runs, review the log to see if your clocks are solid, and graph them with your favorite software like Excel or similar.
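The log review can be scripted, too. A small sketch that summarizes one clock column of a GPU-Z style sensor log (the exact column header varies between GPU-Z versions, so the `column_hint` default here is an assumption you may need to adjust):

```python
import csv
import io
import statistics

def clock_stats(log_text, column_hint="GPU Clock"):
    """Summarize one clock column of a comma-separated GPU-Z sensor log.

    Picks the first column whose header contains `column_hint`; header names
    vary between GPU-Z versions, so adjust the hint to match your log.
    """
    reader = csv.reader(io.StringIO(log_text))
    header = [name.strip() for name in next(reader)]
    idx = next(i for i, name in enumerate(header) if column_hint in name)
    clocks = []
    for row in reader:
        try:
            clocks.append(float(row[idx].strip()))
        except (IndexError, ValueError):
            continue  # skip malformed or partial rows
    return {"min": min(clocks), "max": max(clocks), "mean": statistics.mean(clocks)}
```

A steady overclock shows a tight min-max spread; big dips in the logged clocks point to power or thermal throttling during the run.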


----------



## tconroy135

Has anyone who ordered on the 2nd from Nvidia already gotten delivery? And if so, which shipping option did you choose? Thanks


----------



## alucardis666

Quote:


> Originally Posted by *Cool Mike*
> 
> Hoping Newegg will have stock within a few hours.


Same


----------



## stocksux

Well, I'm an owner! Kinda... stood in line at Microcenter and they had six total. I was third in line. They limited it to one per customer! I'm pretty pissed! Tough to SLI when you can only buy one card. BS!!!!!!! To top it off, they won't sell them till 1pm EST (I'm in Ohio) per Nvidia guidelines. Now I've got a three-hour wait before I can even buy it. They handed out vouchers. I'd drive home and come back, but I drove an hour to get here! Not the kind of day I was hoping for.


----------



## alucardis666

At least you're getting one.

Seems Newegg hasn't put 'em up yet, and they're not on the NVIDIA store or Amazon either.

B&H had them and they sold out in minutes...

https://www.bhphotovideo.com/c/product/1324400-REG/asus_gtx1080ti_fe_geforce_gtx_1080_ti.html

https://www.bhphotovideo.com/c/product/1324403-REG/msi_gtx_1080_ti_founders.html


----------



## xBlitzerx

I'm not sure if I should get a FE card or wait. I hate blower coolers. I haven't had one since the 580.


----------



## hokk

I can't seem to use any res over 1080p with the new drivers. Anyone else having this problem?


----------



## alucardis666

Quote:


> Originally Posted by *kylzer*
> 
> I can't seem to use any res over 1080p with new drivers anyone else having this problem?


3840x2160 no issues here.


----------



## failwheeldrive

Quote:


> Originally Posted by *xBlitzerx*
> 
> I'm not sure if I should get a FE card or wait. I hate blower coolers. I haven't had one since the 580.


I'd only get one if you're using an SFF or a case with restricted airflow where a blower card is needed, if you plan on getting an EVGA FE and using Step-Up to switch to a non-reference-cooled model, or if you don't mind the hassle of buying another card once they're available and selling the FE. And obviously, if you plan on water cooling the card, the FE is a great choice. That's the only reason I'm getting one now.


----------



## Cool Mike

No reference for me.

Waiting for Asus Strix, Gigabyte or EVGA Custom cooler versions.


----------



## thyfartismurder

Is the voltage slider locked in Afterburner?


----------



## KedarWolf

Quote:


> Originally Posted by *xBlitzerx*
> 
> I'm not sure if I should get a FE card or wait. I hate blower coolers. I haven't had one since the 580.


I'll be throwing a prefilled EK water block on mine in a few weeks to add to my EK Predator 360.


----------



## alucardis666

Quote:


> Originally Posted by *failwheeldrive*
> 
> I'd only get one if you're using a SFF or case with restricted airflow where a blower card is needed, you plan on getting an EVGA FE and using step up to switch to a non reference cooled model, or if you don't mind the hassle of buying another card once they're available and selling the FE. And obviously if you plan on water cooling the card the FE is a great choice. It's the only reason I'm getting one now.


So given that I have a 330R, the FE sounds like a good choice, right?


----------



## Biggu

Quote:


> Originally Posted by *stocksux*
> 
> Well I'm an owner! Kinda...stood in line at microcenter and they had six total. I was third in line. They limited it to one per customer! I'm pretty pissed ?! Tough to sli when you can only buy one card. BS!!!!!!! To top it off they won't sell them till 1pm est (I'm in Ohio) per Nvidia guidelines. Now I've got a three hour wait before I can even buy it. They handed out vouchers. I'd drive home and come back but I drove an hour to get here! Not the kind of day I was hoping for.


What store are you at? I'm in Columbus.


----------



## seven7thirty30

I pre-ordered through NVIDIA on day one via PayPal (funds were taken instantly) and got a verification email from NVIDIA, but I've been unable to see anything on the status. It just shows up blank. Trying to contact them. Frustrating...


----------



## sotorious

Wasn't the EVGA 1080 Ti supposed to come out for pre-orders today? Or was I mistaken? I missed the first boat.


----------



## stocksux

Quote:


> Originally Posted by *Biggu*
> 
> what store are you at? im in columbus.


I'm at Sharonville (Cincinnati).


----------



## stocksux

Quote:


> Originally Posted by *Biggu*
> 
> what store are you at? im in columbus.


How many did they get there? Did you have to wait till 1pm to buy it? Did they limit it to one per customer? I was told yesterday that they weren't going to limit it. Then they did. Had I known, I would have brought a friend with me...


----------



## dkaardal

Just unpacked it and about to reboot and run DDU.

Man, my case is going to look empty with only 1 card in it. I have all 3 of the 290Xs running with Kraken G10 / H90 combos, so even a Switch 810 gets a little crowded.

... yeah, I know. I could have put in a custom water loop for not much more, and it would be a lot nicer looking. I always meant to, but I never quite got around to it.

Cooling this 1080 Ti might just convince me to take the time, though. I'm not all that keen on air cooling anymore.

Hmm, I wonder if a G10 would fit this new card?


----------



## keikei

Quote:


> Originally Posted by *sotorious*
> 
> Wasnt the evga 1080 ti suppose to come out for pre orders today? or was i mistaken. I missed the first boat.


Launch date is today. Some sites are quicker to post the card for sale.


----------



## failwheeldrive

Quote:


> Originally Posted by *alucardis666*
> 
> So being that I have a 330R FE sounds like a good choice, right?


TBH I think the 330R would handle an open-cooler card without any problems, assuming you've got the front and rear fan slots populated with decent fans. But if you're happy with the noise output of your current card and don't want to wait several weeks for new cards to release, then the FE is a solid choice. The main problem with it is noise, especially when cranking fan speeds up to handle an overclock. If that's not a concern for you, I say go for it. The FE cooler actually does a good job of cooling the power delivery and a fair job of cooling the core; just don't expect near silence from it.


----------



## Biggu

Quote:


> Originally Posted by *stocksux*
> 
> how many did they get there? Do you have to wait till 1pm to buy it? Did they limit to one per customer? I was told yesterday that they weren't going to limit it. Then they did. Had I known, I would have brought a friend with me...


They got 4 and apparently sold them all at opening.


----------



## al210

Didn't take long to show up on eBay for over $1,000.

That is for a card in hand, not a pre-order one.


----------



## stocksux

Are all the Founders Edition card variants (from each manufacturer: ASUS, EVGA, MSI, Gigabyte, etc.) the same?


----------



## Epona

Quote:


> Originally Posted by *stocksux*
> 
> Are all the founders card variants (from each manufactures; asus, evga, msi, gigabyte, etc) the same?


Yep, it's just rebranded for the specific company.


----------



## Mongo

Was in stock at Nvidia store for 2 mins. lol


----------



## failwheeldrive

I'm surprised it lasted that long lol


----------



## keikei

^Yep. I saw the tax total and declined. Welp, going to the gym; we'll see what happens later. Maybe the other sites will be up.


----------



## DishTheRock

Quote:


> Originally Posted by *Mongo*
> 
> Was in stock at Nvidia store for 2 mins. lol


I don't get how this works. If they had more in stock, then why is my order from last week still processing? I paid for next-day shipping too lol.

Do the few people that got an order in during that 2-minute window have to wait for the next batch?


----------



## lilchronic

newegg has stock


----------



## alucardis666

Quote:


> Originally Posted by *Mongo*
> 
> Was in stock at Nvidia store for 2 mins. lol


Yup! By the time it popped up on

http://www.nowinstock.net/computers/videocards/nvidia/gtx1080ti/

and I clicked to order, they were gone :-(


----------



## netjack

newegg link?


----------



## lilchronic

https://www.newegg.com/videocards/EventSaleStore/ID-2041834


----------



## egandt

https://www.newegg.com/videocards/EventSaleStore/ID-2041834

Don't mind posting; mine's on its way.

ERIC


----------



## lilchronic

evga sold out quick


----------



## alucardis666

Quote:


> Originally Posted by *lilchronic*
> 
> newegg has stock


Only PNY & Zotac left. Dunno how I feel about that.

Edit:

Nvm... all gone.


----------



## netjack

love the markup for "shipping"


----------



## BigMack70

I'm in. Got mine off EVGA's site before it crashed. Good-bye SLI


----------



## bfedorov11

NVIDIA probably didn't even sell any today... people on Reddit are still waiting for shipment from the first pre-order.

Newegg is all sold out.


----------



## fisher6

Ordered EVGA FE. Just need a block now


----------



## dboythagr8

Got in on an EVGA from Newegg, paid for next-day delivery + Saturday delivery, and it says order verification... estimated to ship on 3/13.


----------



## lilchronic

Quote:


> Originally Posted by *dboythagr8*
> 
> Got in on a EVGA from Newegg, paid for next day delivery+ sat delivery, and it says order verification....estimated to ship on 3/13.


Yeah i got the same


----------



## alucardis666

Looks like I'm waiting for a non FE card now


----------



## RedM00N

Might just see if a store has them tomorrow. I don't have time during the week to do the F5/notify game.


----------



## lilchronic

The EVGA site still seems to have stock; it just takes a while to load.
http://www.evga.com/products/product.aspx?pn=11G-P4-6390-KR


----------



## Maintenance Bot

I got one from NVIDIA. I wasn't planning on getting one until my Titan decided to kill itself last night.


----------



## hotrod717

Just picked mine up from Microcenter, 1 of 4 that were available. A bit more expensive than from Newegg, but without shipping and that wait. EVGA.


----------



## alucardis666

Quote:


> Originally Posted by *hotrod717*
> 
> Just picked mine up from Microcenter. A bit more expensive than from Newegg but without shipping and that wait. Evga.


Lucky


----------



## alucardis666

The EVGA site keeps crashing; I can't even get to the add-to-cart page.


----------



## Clockster

I'll post some better pics and info once the machine is built.


----------



## McAlberts

I was able to purchase one on EVGA's site, or so I thought. I got redirected to PayPal to complete the purchase, but once I hit Complete Order it just sent me back to EVGA's main page. I talked to chat and they have no record of any order, so I'm out of luck.


----------



## dboythagr8

Quote:


> Originally Posted by *lilchronic*
> 
> Yeah i got the same


Moved to packaging


----------



## hokk

Well, after 2 hours I still can't get any res higher than 1080p on my monitor.


----------



## lilchronic

Quote:


> Originally Posted by *dboythagr8*
> 
> Moved to packaging


Me too








Quote:


> Originally Posted by *kylzer*
> 
> Well after 2 hours i still can get any res higher than 1080p on my monitor.


What monitor, and are you using a DisplayPort-to-DVI adapter?


----------



## hokk

Quote:


> Originally Posted by *lilchronic*
> 
> Me too
> 
> 
> 
> 
> 
> 
> 
> 
> What monitor and are you using the display port to dvi adapter?


BenQ BL2420PT, and yes.

If I just use HDMI to HDMI it says no signal.


----------



## failwheeldrive

Just got the tracking number for my bhphoto order from this morning. Should be here by Monday, same day as the Titan XP waterblock


----------



## Epona

Quote:


> Originally Posted by *McAlberts*
> 
> was able to purchase one on evga's site or so i thought. got redirected to paypal to complete the purchase, once i hit complete order it just sent me back to evga's main page. talked to chat and they have no record of any order and i'm out of luck


Check out Amazon right now: https://www.amazon.com/gp/product/B06XH2P8DD/


----------



## lilchronic

Quote:


> Originally Posted by *kylzer*
> 
> BenQ BL2420PT and yes


You need an active adapter, one that is powered by USB:
http://www.cablestogo.com/product/41379/dvi-to-displayport-adapter-converter

I have that one but haven't used it yet.


----------



## Alexbo1101

The question now is whether I should go to my local hardware store and buy it tomorrow, then refuse the UPS package on Monday/Tuesday, or just wait the two extra days.


----------



## dboythagr8

Quote:


> Originally Posted by *Epona*
> 
> Check out Amazon right now: https://www.amazon.com/gp/product/B06XH2P8DD/


Has it actually been available, or is this just pre-order?


----------



## lilchronic

Quote:


> Originally Posted by *dboythagr8*
> 
> Has it actually been available or is this just pre order.


It just went out of stock when I refreshed.


----------



## RedM00N

Quote:


> Originally Posted by *dboythagr8*
> 
> Has it actually been available or is this just pre order.


Probably was available at one point. Says the listing was added today.


----------



## hokk

Quote:


> Originally Posted by *kylzer*
> 
> BenQ BL2420PT and yes


Quote:


> Originally Posted by *lilchronic*
> 
> You need an active adapter. One the is powered by usb.
> http://www.cablestogo.com/product/41379/dvi-to-displayport-adapter-converter
> 
> I have that one but have not used it yet


Wew, I'll just use DisplayPort then lol


----------



## alucardis666

They're back up on Newegg:

http://www.nowinstock.net/computers/videocards/nvidia/gtx1080ti/

Just ordered an EVGA one!

https://www.newegg.com/Product/Product.aspx?Item=N82E16814487335


----------



## lilchronic

Quote:


> Originally Posted by *kylzer*
> 
> wew ill just use display port then lol


lol that should work. My monitor only has DVI.


----------



## LukkyStrike

And they have arrived!!!!


----------



## hokk

Quote:


> Originally Posted by *lilchronic*
> 
> lol that should work. My monitor only has DVI.


Well, my main problem is that my HDMI cable to my 4K TV works fine, but HDMI to my monitor gives no signal, so I had to use the adapter.

I'll go to the store and buy a DP cable tomorrow; hopefully that will fix it.


----------



## y2kcamaross

Did anyone who ordered from NVIDIA get the For Honor / Ghost Recon Wildlands code? A lot of retailers are offering it, but it was never offered when I pre-ordered...


----------



## hokk

Quote:


> Originally Posted by *y2kcamaross*
> 
> Did anyone who ordered from Nvidia get the for honor/ghost recon wildlands code? A lot of retailers are offering it but it was never offered when I pre ordered...


Nope nothing here in my email or anything.


----------



## sWaY20

Ordered an EVGA FE at noon on Newegg with next-day and Saturday delivery, so crossing fingers for getting it tomorrow.


----------



## stocksux

Card in hand!


----------



## alucardis666

Quote:


> Originally Posted by *kylzer*
> 
> Nope nothing here in my email or anything.


I haven't received my code yet :-(


----------



## tconroy135

Quote:


> Originally Posted by *stocksux*
> 
> card in hand!


I like the box from NVIDIA a little bit more.


----------



## y2kcamaross

Quote:


> Originally Posted by *kylzer*
> 
> Nope nothing here in my email or anything.


Which is ridiculous, because the few that were sold on Nvidia.com today DID come with the code


----------



## Epona

Quote:


> Originally Posted by *y2kcamaross*
> 
> Which is ridiculous, because the few that were sold on Nvidia.com today DID come with the code


Just to confirm, is this something that's delivered digitally? Last time I bought a graphics card it came in the box (granted that was a while ago).


----------



## y2kcamaross

I'm chatting with NVIDIA now; apparently they will be rolling the codes out soon.


----------



## KCDC

Anyone that preordered on the 2nd get their cards yet? Or are all of these in-store purchases?


----------



## y2kcamaross

chat excerpt

Manojit: and as a result, we we are rolling out promo codes to all pre-orders once the cards are shipped
Manojit: Preorders will start shipping 3/9.
Nathan Busch: yes my card has already shipped and will be here tomorrow
Manojit: We are expecting the promo codes to emailed to you by early next week.
Manojit: Please accept our apologies for the unforeseen circumstances
Nathan Busch: But - to confirm - those that did pre order, will get the code via email at some point early next week?
Manojit: Yes...Its already Friday
Nathan Busch: Alright then, thanks!


----------



## keikei

I placed an order for the ASUS one on Newegg. Everything else is sold out. It's on backorder, though. Now I just gotta see that confirmation email.


----------



## kl6mk6

Quote:


> Originally Posted by *KCDC*
> 
> Anyone that preordered on the 2nd get their cards yet? Or are all of these in-store purchases?


Pre-ordered on the 2nd here. Still says "boxed shipment" on my order status page.


----------



## KedarWolf

Have a hold on a Gigabyte FE at a local store.

I can't go before they close, so I'm paying a co-worker to pick it up and bring it back before I finish work tonight!! I mean, $50 to travel to the other end of the city for a $1139 CAD (with tax) card is worth it to me to have it tonight!!

Edit: RIP income tax refund.


----------



## AMDATI

Ordered the MSI version from Newegg the moment it became available.

As for my experience with game codes, they're usually emailed to you a little while after ordering; then you go to a website and redeem the code. The last game I redeemed was Far Cry 4, which didn't really matter much to me. Neither of these games matters much to me either, but I'll probably choose For Honor.

But I did read that they were trying to stop people from selling game codes, so they might be in the box... and might be tied to the video card hardware initially. So you might have to go through some kind of verification process where they hardware-ID your GPU before you're allowed to redeem the code, which would tie that code to a game account, depending on the platform the game is on (such as Origin).


----------



## KCDC

Quote:


> Originally Posted by *kl6mk6*
> 
> Prordered on the 2nd here. Still says boxed shipment on my order status page.


Same.. Oh the humanity!
Quote:


> Originally Posted by *y2kcamaross*
> 
> chat excerpt
> 
> Manojit: and as a result, we we are rolling out promo codes to all pre-orders once the cards are shipped
> Manojit: Preorders will start shipping 3/9.
> Nathan Busch: yes my card has already shipped and will be here tomorrow
> Manojit: We are expecting the promo codes to emailed to you by early next week.
> Manojit: Please accept our apologies for the unforeseen circumstances
> Nathan Busch: But - to confirm - those that did pre order, will get the code via email at some point early next week?
> Manojit: Yes...Its already Friday
> Nathan Busch: Alright then, thanks!


Thanks for checking!


----------



## y2kcamaross

Quote:


> Originally Posted by *AMDATI*
> 
> Ordered MSI version from Newegg the moment it became available.
> 
> As for my experience with game codes, they're usually emailed to you a little while after ordering, then you go to a website and redeem the code. Last game I redeemed was Far Cry 4, which didn't really matter much to me. Neither of these games matter much to me either, but I'll probably choose For Honor.
> 
> But I did read that they were trying to stop people from selling game codes, so they might be in the box....and might be tied to the videocard hardware initially. So you might have to do some kind of verification process where they hardware ID your GPU before you're allowed to redeem the code, which would tie that code to a game account, depending on the platform the game is on (such as origin)


They are actually tied to the GPU type, I believe they only work on the 1070/1080/1080ti


----------



## AMDATI

Quote:


> Originally Posted by *y2kcamaross*
> 
> They are actually tied to the GPU type, I believe they only work on the 1070/1080/1080ti


That would be an issue, though, because what happens when people replace their GPUs in the future?


----------



## y2kcamaross

Quote:


> Originally Posted by *AMDATI*
> 
> That would be an issue though, because what happens when people replace their GPU's in the future?


They will be fine; it's just redeemed through GeForce Experience. The game itself is actually via Uplay.


----------



## keikei

What is everyone getting for their free game code?


----------



## alucardis666

Quote:


> Originally Posted by *keikei*
> 
> What is everyone getting for their free game code?


Wildlands for me. For Honor looks like it'd get old REAL quick


----------



## AMDATI

Neither is a spectacular game. For Honor seems the most worthwhile, but I hear the multiplayer is peer-to-peer rather than central-server, so that can be an issue. Ghost Recon has nothing at all appealing to me; it looks like a cookie cutter of a dozen other games like Far Cry / Just Cause.


----------



## Nineball

Can't wait for my block.


----------



## SynchroSCP

Just did a step up from a 1080 to the 1080 Ti founders edition. Wonder how long that will take and when EK will have Ti blocks instead of Titan X.


----------



## KCDC

I know EVGA doesn't care if you remove the stock cooler for a waterblock, but what about NVIDIA? Would they void a warranty claim just because the stock cooler was removed and put back on for the RMA?


----------



## bfmv2k5

So jelly of those who got theirs already, but I was able to get my order in. Can't wait!


----------



## AMDATI

I hate that Newegg charged for shipping. I upgraded to 2-day for 50 cents more, though.


----------



## foolycooly

I was able to find my FedEx shipping label directly from FedEx. It shows an estimated delivery date of 3/14 (free shipping to VA from the NVIDIA pre-order). Then again, it has an estimated ship date of yesterday, and it doesn't look like it has actually moved.


----------



## KCDC

Update, just got my shipping confirmation. Coming from Minnesota, won't have them until the 16th! Oh well, at least now I know.


----------



## y2kcamaross

My card will be here tomorrow. Unfortunately it's being delivered to work, which will be closed since it's a weekend. I tried contacting FedEx to hold it at their facility, but they said they can't because they have restrictions from the seller (NVIDIA) not to place holds on this item :sad:


----------



## Code-Red

Got my shipping confirmation a half hour ago. Just waiting on motherboards to be available and my new build is complete


----------



## lilchronic

Quote:


> Originally Posted by *KCDC*
> 
> Update, just got my shipping confirmation. Coming from Minnesota, won't have them until the 16th! Oh well, at least now I know.


Quote:


> Originally Posted by *Code-Red*
> 
> Got my shipping confirmation a half hour ago. Just waiting on motherboards to be available and my new build is complete


who did you order from?


----------



## Jaysend

I was able to get an EVGA card on Newegg. Should be here next Monday or Tuesday. Now I need to find it a sibling!


----------



## alucardis666

Quote:


> Originally Posted by *Jaysend*
> 
> I was able to get an EVGA card on Newegg. Should be here next Monday or Tuesday. Now I need to find it a sibling!


SLI just seems like a waste IMO, what with the optimization being so poor and profiles being broken or non-existent for many titles, unless you're doing high-refresh gaming at 120, 144, 165, or 240 Hz. Get one, OC it, and save the $$$?


----------



## tconroy135

Mine from NVIDIA is weird. FedEx says it's in Circle Pines, MN and that only the label has been created. Except it also says that it shipped yesterday, and the anticipated delivery is Tuesday next week.


----------



## KCDC

Quote:


> Originally Posted by *lilchronic*
> 
> who did you order from?


This was the Nvidia Preorder from march 2nd


----------



## tconroy135

Quote:


> Originally Posted by *kl6mk6*
> 
> Prordered on the 2nd here. Still says boxed shipment on my order status page.


Have they all been shipped if ordered on the 2nd, even if the order status hasn't updated? I'm hoping one is waiting for me at home.
Quote:


> Originally Posted by *tconroy135*
> 
> Mine from NVIDIA is weird. The Fed-Ex says it's in Circle Pines, MN and that only the label has been created. Except it also says that it shipped yesterday. And anticipated delivery is Tuesday next Week.


Ordered the second it was possible on the 2nd with PayPal


----------






## kl6mk6

Just got my shipping confirmation from NVIDIA!!!


----------



## Curseair

Did an EVGA Step-Up from a 1070 FTW2 to a 1080 Ti Founders. Anyone think they will make a Hybrid kit for it?


----------



## tconroy135

Quote:


> Originally Posted by *kl6mk6*
> 
> Just got my shipping confirmation from NVIDIA!!!


What's your anticipated delivery date?


----------



## kl6mk6

Quote:


> Originally Posted by *tconroy135*
> 
> What's your anticipated delivery date?


Wednesday. So much for paying extra for next day.

Edit: Correction! I misread "weekday delivery" as Wednesday; it will be here Monday!


----------



## BigMack70

Quote:


> Originally Posted by *alucardis666*
> 
> SLi just seems like a waste imo, what with the optimization being so poor and broken or non-existent profiles for many titles. Unless you're doing high refresh gaming like 120,144,165, or 240hz. Get one, OC it and save the $$$?


Even in high-refresh gaming, the limit is typically the CPU, not the graphics cards, so SLI doesn't always help. The only reasons I can see for SLI at this point are that you're above 4K resolution or you just like the tech / benchmarking. The practical uses for SLI in gaming are becoming fewer and fewer by the year.


----------



## motivman

After waiting in line for three and a half hours at Microcenter, and some major hustling / quick thinking at my local Fry's Electronics: ladies and gentlemen, we have my new twins, replacing my EVGA 980 Ti SLI. Can't wait to see what these babies can do under water... who is jealous?


----------



## fisher6

Anyone have OC results under water yet?


----------



## McAlberts

I ordered earlier from Newegg. My order status shows "order verification" for the card and "packaging" for Ghost Recon / For Honor. I checked on PayPal, and $59.99 cleared yet $655.56 is pending. Seems weird.
I was really hoping it would ship today.


----------



## sotorious

Dammit, now I'm going to have to wait months before these become AVAILABLE again.


----------



## KCDC

Quote:


> Originally Posted by *alucardis666*
> 
> SLi just seems like a waste imo, what with the optimization being so poor and broken or non-existent profiles for many titles. Unless you're doing high refresh gaming like 120,144,165, or 240hz. Get one, OC it and save the $$$?


Unless you plan on GPU rendering or fluid dynamics; then it's well worth it. A few freelance gigs and they've already paid for themselves. Not everyone is using these just for gaming.

Can't wait to see what I can do vs my dual 980 Tis.


----------






## Jaysend

I had mostly positive experiences with my 980 Ti SLI. When it works, it's fantastic. When it doesn't, I run on one card, which was like 2 games out of dozens. I don't see all the doom and gloom.
Looking at Firestrike scores, my old 980 Ti SLI scored quite a bit better than a single 1080 Ti, so I'm anticipating a small drop in performance with the single 1080 Ti. If I'm mistaken, I may consider just running it as a single.


----------



## BigMack70

Quote:


> Originally Posted by *Jaysend*
> 
> I had mostly positive experiences with my 980ti SLI. When it works it is fantastic. If it doesn't work, I run on one card.


This is the problem. Subjectively, the severity of the problem depends entirely on what games you play. For example, Gears 4 / Forza Horizon 3 / Path of Exile / Arkham Knight / Wolfenstein / Unreal Tournament are all games I spent a decent amount of time with which cannot use SLI - and the experience suffers significantly in all of them except PoE as a result. And it appears that games supporting SLI under DX12 may be the exception rather than the rule.

There's also a lot of games where the scaling is just sort of OK (~65% scaling), like Crysis 3 or The Witcher 3.

In the games where you get 80%+ scaling, it's a fantastic experience, but it's just not consistent enough to be worth it IMO.
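For anyone unfamiliar with how those scaling percentages are computed, it's just the extra frame rate relative to a single card. A trivial sketch (the FPS figures in the example are made up, not benchmark results):

```python
def sli_scaling(fps_single, fps_sli):
    """SLI scaling as a percentage: how much of a second card's
    theoretical 100% uplift you actually realize."""
    return (fps_sli / fps_single - 1.0) * 100.0

# Hypothetical numbers: 60 fps on one card and 99 fps in SLI is
# the "just sort of OK" ~65% scaling; 108 fps would be 80% scaling.
print(round(sli_scaling(60, 99), 1))   # 65.0
print(round(sli_scaling(60, 108), 1))  # 80.0
```

So even "good" 80% scaling buys you 1.8x, not 2x, the frame rate for 2x the card cost.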


----------



## RedM00N

Quote:


> Originally Posted by *Jaysend*
> 
> I had mostly positive experiences with my 980ti SLI. When it works it is fantastic. If it doesn't work, I run on one card. Which was like 2 games out of dozens. I don't see all the doom and gloom.
> Looking at firestrike scores, my old 980ti sli scored quite a bit better than a single 1080ti. I Anticipating a small drop in performance with the single 1080ti. If I am mistaken, I may consider running it as a single.


My experience as well. Though for some games I also went out of my way to find SLI bits that would give good support in a game that didn't have any. I've seen enough good scaling (150%+) to keep it rolling all these years.

Going with one Ti at the start, but I still see a reason for SLI, as I'm running 1440p at 165 Hz and will eventually be rolling 4K at 144.


----------



## tconroy135

Quote:


> Originally Posted by *kl6mk6*
> 
> Wednsday. So much for paying extra for next day.
> 
> Edit: Correction! I mis read weekday delivery as Wednesday, it will be here Monday!


If you ordered next-day and don't get it today or tomorrow, you should demand NVIDIA refund the shipping fee.


----------



## ManuelG_at_NVIDIA

Users who pre-ordered their 1080 Ti and received shipping notification today will receive their promotional game code sometime next week via email.


----------



## Jaysend

Quote:


> Originally Posted by *BigMack70*
> 
> This is the problem. Subjectively, the severity of the problem depends entirely on what games you play. For example, Gears 4 / Forza Horizon 3 / Path of Exile / Arkham Knight / Wolfenstein / Unreal Tournament are all games I spent a decent amount of time with which cannot use SLI - and the experience suffers significantly in all of them except PoE as a result. And it appears that games supporting SLI under DX12 may be the exception rather than the rule.
> 
> There's also a lot of games where the scaling is just sort of OK (~65% scaling), like Crysis 3 or The Witcher 3.
> 
> In the games where you get 80%+ scaling, it's a fantastic experience, but it's just not consistent enough to be worth it IMO.


A lot of the titles you mentioned are console ports, which I just assume are going to have issues on PC. I had enough scaling in The Witcher 3 to push me over 100fps at 3880x1440, which was worth it for me. I forget what I averaged with a single card, but I had some really annoying dips that went away with SLI.
Bottom line, I don't see any downside to SLI other than cost. If it doesn't work for a game, disable it. (And complain to the developers.)


----------



## Slackaveli

What if we pre-ordered last week? No game?


----------



## BigMack70

Quote:


> Originally Posted by *Jaysend*
> 
> A lot of titles you mentioned are console ports which I just always assume are going to have issues on PC.


The vast majority of AAA PC games are console ports in the sense that PC development takes a back seat to console


----------



## Slackaveli

btw, lemme in the club.

i7-5775C, GTX 1080 Ti


----------



## Slackaveli

Quote:


> Originally Posted by *Jaysend*
> 
> A lot of titles you mentioned are console ports which I just always assume are going to have issues on PC. I had enough scaling in witcher 3 to push me over 100fps on 3880x1440. Which was worth it for me. I forget what I got with a single for average but I had some really annoying dips that went away with SLI.
> Bottom line, I don't see any downside other than cost to SLI. Id it doesn't work for a game, disable it. (And complain to the developers)


What if we pre-ordered last Friday during the 20-minute window lol?


----------



## Slackaveli

nm


----------



## alucardis666

Quote:


> Originally Posted by *ManuelG_at_NVIDIA*
> 
> Users who pre-ordered their 1080 Ti and received shipping notification today will receive their promotional game code sometime next week via email.


Does this also include those who purchased through newegg?


----------



## Slackaveli

Quote:


> Originally Posted by *tconroy135*
> 
> Mine from NVIDIA is weird. The Fed-Ex says it's in Circle Pines, MN and that only the label has been created. Except it also says that it shipped yesterday. And anticipated delivery is Tuesday next Week.


mine from same place (ordered last friday) shipped yesterday around noon.


----------



## Jaysend

Quote:


> Originally Posted by *BigMack70*
> 
> The vast majority of AAA PC games are console ports in the sense that PC development takes a back seat to console


Indeed. And I hate it!


----------



## Slackaveli

Quote:


> Originally Posted by *BigMack70*
> 
> The vast majority of AAA PC games are console ports in the sense that PC development takes a back seat to console


true, and 90% of the time they run way better than on consoles.


----------



## MURDoctrine

Haha, sold out everywhere I'm looking. Guess that's a good thing, since now I can wait to see what happens with the AIB cards instead of just pulling the trigger.


----------



## BigMack70

Quote:


> Originally Posted by *Slackaveli*
> 
> true, and 90% of the time they run way better than on consoles.


Of course. However, my comment was in reference to discussing the relative merits of SLI... essentially, because so many big PC games are handicapped by console development, SLI support is a crapshoot, and not worth doing if it can be at all avoided.


----------



## alucardis666

Quote:


> Originally Posted by *MURDoctrine*
> 
> Haha, sold out everywhere I'm looking. Guess that's a good thing, since now I can wait to see what happens with the AIB cards instead of just pulling the trigger.


Do what I did and grab an EVGA card then step up if it's worth it.


----------



## Chalupa

Units on Newegg keep going in and out of stock. I just picked up an EVGA 1080 Ti FE about twenty minutes ago. I recommend using nowinstock.net to try and catch them before they sell out again.


----------



## PasK1234Xw

Quote:


> Originally Posted by *motivman*
> 
> 
> 
> After waiting in line for three and half hours at microcenter, and some major hustling/ Quick thinking at my local frys electronics, Ladies and Gentlemen, we have my new twins, replacing my EVGA 980TI SLI. Can't wait to see what these babies can do under water... who is jealous?


Good luck. I just sold my 1080 SLI and am going with one Ti. I've been running SLI for years with no real problems, but now it's a joke. Between DX12 and the lack of support in a lot of the game engines being used, it's not worth it. I find myself playing older games just to enjoy SLI.
Quote:


> Originally Posted by *BigMack70*
> 
> *Even in high refresh gaming, the limit is typically CPU not graphics cards, so SLI doesn't even always help.* Only reason I can see for SLI at this point is you are above 4k resolution, or you just like the tech / benchmarking. The practical uses for SLI in gaming are becoming fewer and fewer by the year.


Exactly. In BF1 the CPU was often the limit in the middle or toward the end of a 64-player match with lots of destruction, so annoying. I play at 1440p/165Hz, and even at a high 140-165fps it just didn't feel right when the CPU load was high, even with G-Sync.


----------



## Chalupa

Quote:


> Originally Posted by *alucardis666*
> 
> Do what I did and grab an EVGA card then step up if it's worth it.


That's my plan as well. I'm going to upgrade if I feel like it's worth it. I'm EVGA or bust when it comes to buying graphics cards.


----------



## enyceedanny

Waiting for an EVGA variant for their warranty.


----------



## foolycooly

Quote:


> Originally Posted by *Chalupa*
> 
> That's my plan as well. I'm going to upgrade if I feel like it's worth it. I'm EVGA or bust when it comes to buying graphics cards.


I didn't even know EVGA offered step-up for the same series of card. I will probably be scoping out an EVGA FE on newegg then and possibly returning or selling my Nvidia FE if/when it finally gets here.


----------



## keikei

Worst case scenario for me, the card never comes out of backorder. I'll cancel it when the non-FE cards come out in a few weeks. Damn, I hate waiting...


----------



## stocksux

I was able to get the 1080 Ti installed to check for DOA before ripping the cooler off. Ran a quick Heaven and here are the results. Notables: 7700K @ 5.1GHz, Asus Formula IX, 3466 DDR4, and of course the 1080 Ti. GPU was run completely stock.


----------



## undercoverb0ss

Is there a way to currently test ASIC quality? I'm curious what mine is.

Tried GPU-Z 1.17.0 and it's not supported on this card.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *undercoverb0ss*
> 
> Is there a way to currently test ASIC quality? I'm curious what mine is.


no way at all.


----------



## undercoverb0ss

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> no way at all.


Sorry, coming from a 980ti and haven't been involved in the scene enough recently to judge if you are sarcastic or not. Has ASIC quality gone the way of the dodo?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *undercoverb0ss*
> 
> Sorry, coming from a 980ti and haven't been involved in the scene enough recently to judge if you are sarcastic or not. Has ASIC quality gone the way of the dodo?


The ASIC quality can't be read by any tool.


----------



## MURDoctrine

Quote:


> Originally Posted by *undercoverb0ss*
> 
> Sorry, coming from a 980ti and haven't been involved in the scene enough recently to judge if you are sarcastic or not. Has ASIC quality gone the way of the dodo?


It hasn't meant anything from the 900 series on. My 980 had high ASIC quality and little OC headroom on air and water.

Dangit. Had a Gigabyte FE in my cart at Newegg and it sold before I could get it.


----------



## undercoverb0ss

Quote:


> Originally Posted by *MURDoctrine*
> 
> It hasn't meant anything from the 900 series on. My 980 had high ASIC quality and little OC headroom on air and water.
> 
> Dangit. Had a Gigabyte FE in my cart at Newegg and it sold before I could get it.


Interesting, my 980 Ti Hybrid had high ASIC quality and OC'ed pretty well. I definitely remember the numbers being pretty contentious back then.


----------



## KedarWolf

Quote:


> Originally Posted by *MURDoctrine*
> 
> Quote:
> 
> 
> 
> Originally Posted by *undercoverb0ss*
> 
> Sorry, coming from a 980ti and haven't been involved in the scene enough recently to judge if you are sarcastic or not. Has ASIC quality gone the way of the dodo?
> 
> 
> 
> It hasn't meant anything from the 900 series on. My 980 had high ASIC quality and little OC headroom on air and water.
> 
> Dangit. Had a Gigabyte FE in my cart at Newegg and it sold before I could get it.
Click to expand...

I'll have my Gigabyte FE from a local store delivered within an hour, then take it home from here at work in 4.5 hours, play, see how it OCs and stuff.


----------



## hotrod717

This is where universal waterblocks come in extremely handy.
Curious at what clocks a stock 1080 Ti will squash an overclocked 1080. Can a 1080 even be clocked high enough to beat a stock Ti?
Do all the cards come with a backplate?


----------



## y2kcamaross

Quote:


> Originally Posted by *hotrod717*
> 
> This is where Universal waterblocks come in extremely handy.
> Curious what a stock 1080 TI will squash a 1080 clocked at? *Can a 1080 even be clocked high enough to beat a stock TI*?
> Do all the cards come with a backplate?


maybe on LN2


----------



## KedarWolf

Quote:


> Originally Posted by *hotrod717*
> 
> This is where Universal waterblocks come in extremely handy.
> Curious what a stock 1080 TI will squash a 1080 clocked at? Can a 1080 even be clocked high enough to beat a stock TI?
> Do all the cards come with a backplate?


http://www.kitguru.net/components/graphic-cards/luke-hill/nvidia-gtx-1080-ti-founders-edition-11gb-review/2/

Yes, they have a two-piece backplate.


----------



## KedarWolf

Eleven GDDR5X memory chips comprise the 11GB of VRAM. Nvidia has enhanced the power delivery system compared to the GTX 1080 Founder's Edition. A 7-phase control system drives a total of 14 dual-FETs with a rated power capacity of 250 Amps for the GPU.

This is an upgrade from the GTX 1080 both in terms of control channels and FET count, as the GTX 1080 Ti now uses two dual FETs per control channel, as opposed to one. It is also an upgrade from the Titan X Pascal's power delivery solution which is likely to translate into slightly higher electrical efficiency or better overclocking.
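If anyone wants to sanity-check what those eleven chips add up to, here's the back-of-envelope math (this uses the published FE spec figures of 11 Gbps per pin and a 32-bit interface per chip, which come from NVIDIA's spec sheet, not from anything measured in this thread):

```python
# Rough GDDR5X bandwidth estimate for the GTX 1080 Ti Founders Edition.
# Assumed figures (published specs, not measured here):
#   11 memory chips x 32-bit each = 352-bit bus, 11 Gbps effective per pin.
chips = 11
bus_width_bits = chips * 32           # 352-bit memory interface
effective_rate_gbps = 11              # Gbps per pin (GDDR5X, effective)
bandwidth_gbs = bus_width_bits * effective_rate_gbps / 8
print(bandwidth_gbs)                  # 484.0 GB/s, matching the quoted spec
```

Which lands right on the 484 GB/s figure NVIDIA advertises for the card.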


----------



## undercoverb0ss

Overclocking: so far I am maxing out at 2012MHz sustained core clock, GPU temp 50-59C. Over 60C seems to bring instability.


----------



## KedarWolf

My Gigabyte 1080 Ti Founders Edition is under my desk at work; I'll be home with it in about 3 hours and 45 minutes!

Can't beg off early tonight; I'm filling in on the evening shift for some vacation peeps. Normally I could if we had more coverage here.


----------



## GRABibus

I don't want a FE.
Waiting for customised ones with improved air and water cooling.


----------



## Slackaveli

Quote:


> Originally Posted by *BigMack70*
> 
> Of course. However, my comment was in reference to discussing the relative merits of SLI... essentially, because so many big PC games are handicapped by console development, SLI support is a crapshoot, and not worth doing if it can be at all avoided.


totally concur. why i ordered 1080ti (get away from sli)


----------



## ManuelG_at_NVIDIA

If you pre-ordered last week and your GTX 1080 Ti shipped today then yes you will get a separate email sometime next week with your game code.


----------



## KedarWolf

Quote:


> Originally Posted by *GRABibus*
> 
> I don't want a FE.
> Waiting for customised ones with improved air and water cooling.


I'll have an EK waterblock for my FE within two weeks.


----------



## MURDoctrine

Gigabyte, EVGA, and PNY back in stock on Newegg. Just snagged an EVGA FE.


----------



## Slackaveli

Quote:


> Originally Posted by *ManuelG_at_NVIDIA*
> 
> If you pre-ordered last week and your GTX 1080 Ti shipped today then yes you will get a separate email sometime next week with your game code.


thank you, sir.


----------



## alucardis666

For those that ordered for NewEgg today, I just got my game redemption code.


----------



## tconroy135

Quote:


> Originally Posted by *GRABibus*
> 
> I don't want a FE.
> Waiting for customised ones with improved air and water cooling.


If you're getting a water block, the FE is your best bet. It's the only card guaranteed to have water block support. And a good loop with reasonable ambient temperatures should keep temps in the 30s while eliminating the power draw of the blower.


----------



## Asus11

Quote:


> Originally Posted by *Slackaveli*
> 
> btw, lemme in the club.;
> 
> i7-5775c , gtx 1080ti


very nice CPU dude









I used to have one.. very niche haha


----------



## Asus11

got an MSI one.. can't wait to look back on this thread and laugh at how stupid we are to spend so much on GPUs

feels like déjà vu all over again


----------



## Nineball

Can't wait for a block and the ability to adjust the voltage. Best I can do for now is a 160MHz OC; not messing with memory till it's under water.


----------



## keikei

Newegg is slowly restocking the Ti's. Play the F5 game. I *just* snagged a Zotac.


----------



## BoredErica

Wait so like...

The adapter that comes with the FE is single-link on the inside. I can't really find a cable for my 1440p 60Hz Catleap that isn't active.









Help please!


----------



## Slackaveli

hell yeah, man. in testing vs a 4690K OC'd to 4.5GHz, i am getting 10-20 fps more @ 1440p with this CPU.


----------



## Slackaveli

Quote:


> Originally Posted by *Asus11*
> 
> very nice CPU dude
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I used to have one.. very niche haha


^^


----------



## Slackaveli

Quote:


> Originally Posted by *Nineball*
> 
> 
> 
> Can't wait for a block and the ability to adjust the voltage. Best I can do for now is a 160MHz OC; not messing with memory till it's under water.


the EVGA Titan XP ones work, but you have to cut the shroud over the 6-pin hole.


----------



## lilchronic

Quote:


> Originally Posted by *Darkwizzie*
> 
> Wait so like...
> The adapter that comes with FE is single link on the inside. I can't really find a cable for my 1440p 60hz Catleap that isn't active.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Help please!


They're expensive:

https://www.amazon.com/StarTech-com-DisplayPort-Active-Adapter-Converter/dp/B00A493CNY
https://www.amazon.com/Accell-B087B-002B-UltraAV-DisplayPort-Dual-Link/dp/B002ISVI3U/ref=pd_sim_23_1?_encoding=UTF8&pd_rd_i=B002ISVI3U&pd_rd_r=NM4EDMHE69XMDY9TT8DE&pd_rd_w=NnZXT&pd_rd_wg=X3TwC&psc=1&refRID=NM4EDMHE69XMDY9TT8DE
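For anyone wondering why a passive adapter won't cut it for that Catleap, here's the rough pixel-clock arithmetic (the blanking figures below are approximate CVT reduced-blanking values, my assumption, not anything from this thread):

```python
# Rough pixel-clock estimate for 2560x1440 @ 60 Hz, to show why a
# single-link (passive) DVI adapter can't drive the monitor.
h_active, v_active, refresh_hz = 2560, 1440, 60
h_blank, v_blank = 160, 41          # approximate CVT-RB blanking (assumed)
pixel_clock_hz = (h_active + h_blank) * (v_active + v_blank) * refresh_hz
print(pixel_clock_hz / 1e6)          # roughly 241.7 MHz
# Single-link DVI tops out at a 165 MHz pixel clock, so 1440p60 needs
# dual-link, hence the active DisplayPort-to-DL-DVI adapters linked above.
```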


----------



## MURDoctrine

Sigh, guess it's time to start running benchmarks on my 980 to compare to my 1080 Ti when it arrives. Does DSR put enough stress on the GPU to counter any CPU bottlenecks I might see, since my monitors are 1080p?


----------



## rubicsphere

Snagged EVGA at the Egg:

https://www.newegg.com/Product/Product.aspx?Item=N82E16814487335


----------



## hotrod717

Where are all the benches and screenies? No one is posting anything performance related other than a Heaven bench?
Guess I'll have to change that.
Out of the box at stock settings I'm seeing a top boost of 1866 on the core. A couple of nudges of the conventional core slider caps out at 1936, with a drop in bench score above that. 1.05v is the top of what I've seen on ambient air.
Time for some a/c action and 3D11 Performance runs.

3d11 -5960x - 1080ti -



Agree with earlier post 2012 seems to be sweet spot on curve.

--A little more - 2050



I believe a shunt mod is in order once i get some water on it.

Not sure when "the pursuit of performance" turned into the pursuit of ordering/having ordered a card.


----------



## djfunz

An extra $66 with shipping and tax from Newegg. One of the reasons I stopped buying expensive items from Amazon and Newegg since I live in California.


----------



## AMDATI

Newegg is emailing out the game bundle codes now, but that doesn't count for much since you need to have the card installed and detected to redeem it. Or at least, any 'qualifying card' that the bundle is eligible for.


----------



## KedarWolf

Quote:


> Originally Posted by *AMDATI*
> 
> Newegg is emailing out the game bundle codes now, but that doesn't count for much since you need to have the card installed and detected to redeem it. Or at least, any 'qualifying card' that the bundle is eligible for.


I read you install the card, open GeForce Experience (it comes with the drivers), it detects the card, and you redeem there.

Edit: I'll try it when I'm home in ten minutes!


----------



## AMDATI

It'll only allow you to get the code if it detects qualifying cards. So even a 1080 would likely work in activating it.


----------



## alucardis666

Quote:


> Originally Posted by *AMDATI*
> 
> It'll only allow you to get the code if it detects qualifying cards. So even a 1080 would likely work in activating it.


Try redeeming the code in uplay instead of geforce experience


----------



## Agueybana_II

MSI and Zotac available in Newegg...


----------



## Testing12

Quote:


> Originally Posted by *rubicsphere*
> 
> Snagged EVGA at the Egg:
> 
> https://www.newegg.com/Product/Product.aspx?Item=N82E16814487335


Backordered as I write this.


----------



## rubicsphere

Quote:


> Originally Posted by *Testing12*
> 
> Backordered as I write this.


Someone had posted this earlier in the thread:

https://www.nowinstock.net/computers/videocards/nvidia/gtx1080ti/

That's how I was able to snag one. They keep going in and out of stock.


----------



## Slackaveli

Quote:


> Originally Posted by *MURDoctrine*
> 
> Sigh, guess it's time to start running benchmarks on my 980 to compare to my 1080 Ti when it arrives. Does DSR put enough stress on the GPU to counter any CPU bottlenecks I might see, since my monitors are 1080p?


more or less. games with an in-game resolution slider work better (i.e. BF1)
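For the DSR question above, a quick sketch of how the factors map to render resolutions may help (DSR factors in the NVIDIA control panel scale total pixel count, so each axis scales by the square root of the factor; the helper name here is mine, not an NVIDIA API):

```python
import math

# Map a DSR factor to the internal render resolution for a given display.
# DSR factors multiply the total pixel count, so each axis scales by
# sqrt(factor).
def dsr_resolution(width, height, factor):
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

print(dsr_resolution(1920, 1080, 4.0))   # (3840, 2160): 4.00x on 1080p is 4K
print(dsr_resolution(1920, 1080, 1.78))  # close to 2560x1440
```

So 4.00x DSR on a 1080p monitor puts roughly the same load on the GPU as native 4K, which is why it works as a stand-in for CPU-bottleneck testing.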


----------



## Slackaveli

Quote:


> Originally Posted by *hotrod717*
> 
> Where are all the benches and screenies? No one is posting anything performance related other than a heaven bench?
> Guess I'll have to change that.
> Out of the box stock settings seeing a top boost of 1866 core. A couple of nudges of conventional core caps out at 1936 with drop in bench score above that. 1.05v is top of what I've seen on ambient air.
> Time for some a/c action and 3D11 Performance runs.
> 
> 3d11 -5960x - 1080ti -
> 
> 
> 
> Agree with earlier post 2012 seems to be sweet spot on curve.



so, what were your Afterburner settings? Core +150 and memory +350? Mine is still on a FedEx truck delayed by weather, even though it was supposed to be here by noon.


----------



## Slackaveli

Anybody opened her up and applied better thermal paste? On my 980 Ti that gave me -10C under load.

If anybody tries it, this is the MUCH easier way. Some people take the whole PCB off. (8 screws or 28, you choose.)


----------



## AMDATI

Quote:


> Originally Posted by *alucardis666*
> 
> Try redeeming the code in uplay instead of geforce experience


Are you saying that from experience or guessing? From what I can see... you choose the game in GeForce Experience, then they give you a code to redeem on Uplay/Origin or whatever the game uses.


----------



## Slackaveli

you're right. they changed it last promo to this new method because of people reselling all the codes i think. or piracy (g2a and greenman gaming, etc). at any rate, you are correct.


----------



## Pandora's Box

So wait, you can't give the code away to a friend that doesn't have a 1080 Ti?


----------



## alucardis666

Wow... Wildlands performance at 4K is a crapshoot. Even on the low preset I can't maintain a steady 60fps; at 1440p I need to set it to medium in order to get a playable 60fps. *Minimum dips down to the high 40's*

I smell some shady things going on with driver *optimizations* and Nvidia paying Ubisoft to gimp performance on everything but a 1080 Ti... Will be interesting to see how it runs at 4K on ultra with the Ti when I get it.


----------



## BigMack70

Quote:


> Originally Posted by *alucardis666*
> 
> Wow... Wildlands performance at 4K is a crapshoot. Even on the low preset I can't maintain a steady 60fps; at 1440p I need to set it to medium in order to get a playable 60fps. *Minimum dips down to the high 40's*
> 
> I smell some shady things going on with driver *optimizations* and Nvidia paying Ubisoft to gimp performance on everything but a 1080 Ti... Will be interesting to see how it runs at 4K on ultra with the Ti when I get it.


It's Ubisoft. What did you expect?

I stopped buying Ubisoft games years ago, and performance was a significant contributing factor.


----------



## hotrod717

Quote:


> Originally Posted by *Slackaveli*
> 
> so, what was your afterburner settings? core +150 and Memory +350? Mine is still on a fedex truck delayed by weather even though it was supposed to be here by noon


Conventional settings aren't as efficient and don't provide the best results. On the 1080, and now the 1080 Ti, a custom voltage curve is going to net the best results. Click on the icon beside the core clock slider.

Catzilla 720p - 2062 core / stock 1377 mem on custom voltage curve- Crushing Titan XP - Remember this is on chilled air!



Firestrike -



Firestrike Extreme -


----------



## alucardis666

Quote:


> Originally Posted by *BigMack70*
> 
> It's Ubisoft. What did you expect?
> 
> I stopped buying Ubisoft games years ago, and performance was a significant contributing factor.


Still... This game needs some serious patching.


----------



## keikei

Speaking of new games, any big ones coming soon or out? Would love to test them with the Ti!


----------



## Slackaveli

nice. so you added like +80 and max voltage? or do you mean a custom BIOS?

your temps are amazing. what's your cooling set-up?



----------



## AMDATI

I think both games aren't very good. I wish they had instead included a code for Mass Effect Andromeda for when it comes out; that would have been amazing. In any case, Wildlands is pretty much a Just Cause or Far Cry clone... and even though For Honor isn't that great, it is focused around multiplayer, which is what I prefer in games. I honestly never really play the free game bundles. I got Far Cry 4 with my 970 and didn't really enjoy it at all. But I've never been a fan of Far Cry anyways.


----------



## Slackaveli

Quote:


> Originally Posted by *Pandora's Box*
> 
> So wait, you cant give the code away to a friend that doesn't have a 1080Ti?


yes, IF he has a 1070 or a 1080 (i think it'll work); no if not.


----------



## Slackaveli

Quote:


> Originally Posted by *AMDATI*
> 
> I think both games aren't very good. I wish they had instead included a code for Mass Effect Andromeda for when it comes out; that would have been amazing. In any case, Wildlands is pretty much a Just Cause or Far Cry clone... and even though For Honor isn't that great, it is focused around multiplayer, which is what I prefer in games. I honestly never really play the free game bundles. I got Far Cry 4 with my 970 and didn't really enjoy it at all. But I've never been a fan of Far Cry anyways.


exactly what I'd hoped for.


----------



## lukeluke

I picked up the last one from Rockville Md Microcenter today. Virginia microcenter was sold out so I went to the next one and made it just in time.

They were selling them at $750 instead of $699 and I haven't seen anything about a free game. Not sure if that was only for people pre-ordering online or what.

Got home, opened the TI, took my 780 SLI out of the case to sell, took the water blocks off and realized I no longer have the screws to reattach the cooler. I have already promised these cards to some people, so it's really annoying. Would anyone who's removed the cooler from a GTX card to put on a water block be willing to sell me ($20 or whatever) the unused screws?

Really in a bind, with two whole cards rendered unusable by a stupid $1 part. Was hoping to get cash from the 780s to offset the Ti's cost. It's 16 5mm screws plus four with springs on them.


----------



## KCDC

Quote:


> Originally Posted by *keikei*
> 
> Speaking of new games, any big ones coming soon or out? Would love to test them with the Ti!


I think Mass Effect Andromeda will be a great one to test this card out. Especially if nvidia surround is supported. 7680x1440 on max settings with two Ti's, pending scaling is supported. Sounds glorious.


----------



## muhd86

Guys, I wanted to know: can we do tri-SLI or quad-SLI with the 1080 Ti, or is it the same as the 1080, only 2-way SLI?


----------



## Mongo

Quote:


> Originally Posted by *kl6mk6*
> 
> Wednsday. So much for paying extra for next day.
> 
> Edit: Correction! I mis read weekday delivery as Wednesday, it will be here Monday!


I also PAID for OVERNIGHT delivery and show a delivery date of Monday. I'm in contact with Nvidia right now.

I'm not about to pay for next day and not get next day. And yes, FedEx delivers on Saturday.


----------



## AMDATI

FedEx delivers on Saturday, but it costs extra on top of next-day delivery.

quoted from fedex's page....

"When you need your shipment to arrive on Saturday, we've got you covered. FedEx Priority Overnight offers customers next business-day delivery and, for an additional fee, delivery on Saturday. "


----------



## KingEngineRevUp

I am having issues with redeeming my game code with Newegg. I have chatted with both Nvidia and Newegg now. No luck. Trying to get to the bottom of this.

I am getting "You do not meet the hardware requirements to redeem this coupon."


----------



## Slackaveli

Quote:


> Originally Posted by *KCDC*
> 
> I think Mass Effect Andromeda will be a great one to test this card out. Especially if nvidia surround is supported. 7680x1440 on max settings with two Ti's, pending scaling is supported. Sounds glorious.


agreed. can't wait to try it on my 4k samsung HDR.


----------



## outofmyheadyo

Does anyone know how Nvidia's customer service is? I am trying to decide between the nvidia.de site and EVGA; the price of the 1080 Ti is the same, Nvidia just has a nicer box and shipping is free. Planning to watercool it as well, so no idea how Nvidia feels about that in case of problems, if I need to RMA. I know EVGA customer service is great, if not the best.


----------



## Mongo

Quote:


> Originally Posted by *SlimJ87D*
> 
> I am having issues with redeeming my game code with Newegg. I have chatted with both Nvidia and Newegg now. No luck. Trying to get to the bottom of this.
> 
> I am getting "You do not meet the hardware requirements to redeem this coupon."


what part is the problem? I was told you now have to have a qualifying card in the system to redeem the code, or something like that.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Mongo*
> 
> what part is the problem? I was told you had to now have a qualifying card in the system to redeem the code or something like that.


Well GeForce Experience doesn't know I have a 1080 Ti for some reason, even though it clearly shows I'm running one in the system specs of GeForce Experience.

Either

1. Newegg didn't give me the right code.
2. NVIDIA didn't give Newegg the right code.
3. NVIDIA didn't update their servers to read the cards and the codes together correctly.


----------



## Slackaveli

Quote:


> Originally Posted by *Mongo*
> 
> I also PAID for OVERNIGHT delivery and show a delivery date of Monday. I'm in contact with Nvidia right now.
> 
> I'm not about to pay for next day and not get next day, And yes Fedex delivers on Saturday.


yeah, we're screwed. same boat as you. it rained Thursday night, so apparently that excuses them from delivering an "overnight morning" delivery all the way until the 4th day. No refunds. Screw FedEx, I won't be using them anymore. Don't even get my Ti for the weekend.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Well GeForce Experience doesn't know I have a 1080 Ti for some reason, even though it clearly shows I'm running one in the system specs of GeForce Experience.
> 
> Either
> 
> 1. Newegg didn't give me the right code.
> 2. NVIDIA didn't give Newegg the right code.
> 3. NVIDIA didn't update their servers to read the cards and the codes together correctly.


sounds like classic launch day problem. i suspect the 3rd one.


----------



## Mongo

Well I got the code when I ordered today but I'm not going to try it until I get the card in the system.

So I hope its all fixed by then.

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, we screwed. same boat as you. it rained thursday night so apparently that excuses them from delivering an "overnight morning" delivery all the way until the 4th day. No refunds. Screw fed ex. i wont be using them anymore. Don't even get my Ti for the weekend.


I have already told Nvidia I'm not going to pay the Overnight shipping fee if I don't get it tomorrow.

They told me they want to try to contact Fedex first to see if they can move up delivery before they refund shipping.


----------



## Slackaveli

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Does someone know how is nvidias customer service? I am trying to decide between nvidia.de site and evga, price of the 1080ti card is the same nvidia just has a nicer box snd shipping is free. Planning to watercool it aswell, so no idea how does nvidia feel about it incase of problems, if I need to rma. I know EVGA customer service is great, if not the best.


if you are planning that, then go with EVGA. they won't void your warranty for opening her up, as long as it's returned in "original" config in an RMA situation. Both carry a 3-year warranty.


----------



## KedarWolf

Gigabyte 1080 Ti Founders Edition.

Quick OC test, +144 core, +512 memory, passes Firestrike Ultra stress test and TimeSpy.

This is just the second set of settings I tried; I started with +114 core, +496 memory.

I have a decent card maybe? Need to compare results with peeps.


----------



## alucardis666

Quote:


> Originally Posted by *KedarWolf*
> 
> Gigabyte 1080 Ti Founders Edition.
> 
> Quick OC test, +144 core, +512 memory, passes Firestrike Ultra stress test and TimeSpy.
> 
> This is just the second settings I tried, started with +114 core +496 memory.
> 
> I have a decent card maybe? Need to compare results with peeps.


Keep pushing!


----------



## KraxKill

Liquid Metal / Shunt / NZXT x41 AIO + G10 bracket hack. Zero throttle at 2062 @ 1.062v Temps 30C-40C

Anybody have more voltage?


----------



## lukeluke

Quote:


> Originally Posted by *Slackaveli*
> 
> if you are planning that then go with EVGA.. they have 3 yr warranty and wont void you for opening her up as long as it's received in "original" config in an RMA situation. Both are 3 year warranty.


Yeah, I picked up an MSI (only one I could get) and it has a sticker over a screw you'd have to remove to put on a waterblock, and it says warranty void if removed. Lame. Hopefully I won't have a situation where I'd have any reason to RMA, but I did plan to put on a waterblock and never saw that on my old cards.


----------



## outofmyheadyo

That's why I stay away from MSI: terrible customer service, and rules made up to screw customers.
I am not sure if Nvidia's own cards have the sticker or not, and do they come with 2 or 3 years of warranty from Nvidia?
If they come with a sticker and 2 years, I'm not going to order one from them just for the nicer box and free shipping.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KraxKill*
> 
> 
> 
> Liquid Metal / Shunt / NZXT x41 AIO + G10 bracket hack. Zero throttle at 2062 @ 1.062v Temps 30C
> 
> Anybody have more voltage?


Wow, how did you do that mod? Just used the Kraken hardware without the bracket? Would my H55 be able to do that?


----------



## hotrod717

Quote:


> Originally Posted by *KraxKill*
> 
> 
> 
> Liquid Metal / Shunt / NZXT x41 AIO + G10 bracket hack. Zero throttle at 2062 @ 1.062v Temps 30C
> 
> Anybody have more voltage?


I'm at 1.05v with no shunt mod, yet. 2062 is good in all the benches I've run save 3D11. Maybe I need to tweak a bit more.
How are you reading clocks? GPU-Z render and sensor?


----------



## KingEngineRevUp

Did you guys just take a Kraken G10 bracket and cut off the fan area?


----------



## KraxKill

Exactly....

I wire-snipped off the fan portion of the G10 bracket, leaving just the portion used to hold down the AIO, and then used a bench grinder + file to clean it up some so that the cut bracket would clear any protrusions within the 1080 Ti cooler housing. I then had to use an appropriately sized copper shim between the NZXT pump block and the die to bridge the gap to the core. This was in order to keep the stock shroud and fan cooling the VRM and the rest of the board. Liquid metal on the die + shim, with liquid tape surrounding it to be safe (it peels right off like you'd expect tape to). Same for the shunt mod, to kill off any possible TDP throttle prior to the voltage limit.


----------



## KraxKill

Quote:


> Originally Posted by *hotrod717*
> 
> How are you reading Clocks. GPUz render and sensor?


Yes, GPU-Z, logging at 0.1 s intervals.


----------



## stocksux

So I've picked up and installed an EVGA 1080 Ti today and got an EK block on it. My question is... should I grab another? I'm currently on an Asus PG279Q, which is 165Hz 1440p. 80% of the games I'm either playing or would like to start soon are officially "SLI supported" per the NVIDIA website. I've never run SLI before. When Asus launches the PG27UQ (4K 144Hz) I'll buy that; it's supposed to be Q3. Should I stay with one 1080 Ti until the 4K monitor, and maybe have more info on Volta (currently slated for sometime in 2018), or grab a second 1080 Ti, play at super-fast 1440p now, and be prepared for the 4K monitor when it launches?


----------



## KCDC

Those just getting started with real GPU overclocking on this card (like me), start here:

http://www.guru3d.com/articles-pages/nvidia-titan-x-(pascal)-overclock-guide,1.html

Mine will go into my loop later in the year when I'm ready to go with rigid tubing. Getting rid of that thermal throttle is key!

Now have two 420 rads, currently only cooling my vrms and cpu, should have enough headroom.


----------



## KCDC

Quote:


> Originally Posted by *stocksux*
> 
> So I've picked up and installed an evga 1080ti today and got an EK block on it. My question is...should I grab another? I'm currently on an asus PG279Q which is 165hz 1440p. 80% of the games I'm either playing or would like to start soon are officially "sli supported" per the Nvidia website. I've never been sli before. When asus launches the PG27UQ (4K 144hz) I'll buy that. That's supposed to be Q3. Should I stay one 1080ti till the 4K monitor and maybe have more info on Volta (currently slated for sometime in 2018) or grab a second 1080ti and play at super fast 1440p now and be prepared for the 4K monitor when it launches??


I'm going dual GPU for my work: 3D animation, particle effects, GPU rendering... the more, the better.

I'm also doing it because I have a triple-monitor (7680x1440) setup. With my dual 980 Tis I've had to take it down to 5760x1080 in many games.

I know that, with the games that support either SLI or Surround, the second card will help me tons at native resolution, since the triple setup is about 3 million pixels more than a 4K screen. If one card can make 4K scream, two should handle beyond-4K rendering.

I will report my results once they finally show up which won't be till next week.

For you, I would only get a second one if you really think you're seeing performance degradation. Live with the first one for a few months and play lots of games; by then there will be better stock of the card, and you can finally decide if you need two. Since it's on a custom water loop, have some fun overclocking it! Mess with the voltages! See what it can do!

EDIT: I am putting both on water with EK blocks once my wallet has a second to breathe.

maybe next month


----------



## KedarWolf

How do I unlock the voltage slider in Afterburner? I'm using the latest version, right?

With just the power limit at 120% on my Gigabyte FE, I'm getting +144 core / +512 memory stable in the Firestrike Ultra stress test, Time Spy, and the Metro: Last Light Redux benchmark maxed out.


----------



## DrFreeman35

Quote:


> Originally Posted by *KedarWolf*
> 
> How do I unlock the voltage slider in Afterburner and I use the latest version, right?
> 
> With just power limit at 120% on my Gigabyte FE I'm getting +144 core, +512 memory stable in Firestrike Ultra stress test, TimeSpy and Metro Last Light Redux benchmark maxed out.


To unlock the voltage, you have to go into settings and enable it. It should be under the cog-wheel icon: check the box to unlock voltage control.


----------



## GRABibus

Quote:


> Originally Posted by *KedarWolf*
> 
> How do I unlock the voltage slider in Afterburner and I use the latest version, right?
> 
> With just power limit at 120% on my Gigabyte FE I'm getting +144 core, +512 memory stable in Firestrike Ultra stress test, TimeSpy and Metro Last Light Redux benchmark maxed out.


Please post some results in the Benchmark sections for Time Spy and Firestrike.


----------



## Clockster

Just did a quick dirty run on my old setup.

http://www.3dmark.com/spy/1352391

Gigabyte GTX1080 Ti
+100Mhz Core
+450 Mem

7700K
Stock CPU.


----------



## undercoverb0ss

If I don't care about looks, can I get away with putting my 980 Ti's hybrid cooler on my 1080 Ti without cutting or permanently modding any of it?


----------



## Clockster

http://www.3dmark.com/3dm/18505876?

Gigabyte GTX1080 Ti
+100Mhz Core
+450 Mem

7700K
Stock CPU.


----------



## undercoverb0ss

6700k @ 4.7
1080ti
+160core
+425mem



http://i.imgur.com/pl00dJm.jpg


----------



## undercoverb0ss

Firestrike v1.1 22572, Graphics score: 31079



https://image.ibb.co/fs6qGF/Untitled_2.png


----------



## Asus11

Has anyone done the shunt mod? Thinking of doing it straight off the bat.


----------



## derfa1975

...does anyone have coil whine issues, e.g. in the high and low FPS ranges?

greets


----------



## kevindd992002

Quote:


> Originally Posted by *lilchronic*
> 
> lol that should work. My monitor only has DVI.


Quote:


> Originally Posted by *lilchronic*
> 
> They're expensive
> 
> https://www.amazon.com/StarTech-com-DisplayPort-Active-Adapter-Converter/dp/B00A493CNY
> https://www.amazon.com/Accell-B087B-002B-UltraAV-DisplayPort-Dual-Link/dp/B002ISVI3U/ref=pd_sim_23_1?_encoding=UTF8&pd_rd_i=B002ISVI3U&pd_rd_r=NM4EDMHE69XMDY9TT8DE&pd_rd_w=NnZXT&pd_rd_wg=X3TwC&psc=1&refRID=NM4EDMHE69XMDY9TT8DE


So the included passive DP-to-DVI converter is really of no use?


----------



## stocksux

Quote:


> Originally Posted by *kevindd992002*
> 
> So the included dp to dvi passive converter is really of no use?


It is if you have a monitor with DVI and no display port


----------



## kevindd992002

Quote:


> Originally Posted by *stocksux*
> 
> It is if you have a monitor with DVI and no display port


And it's different if the monitor has both DP and DVI?


----------



## Nineball

Quote:


> Originally Posted by *DrFreeman35*
> 
> To unlock the voltage, you have to go into settings and unlock it. Should be with the cog wheel, and check the box to unlock voltage.


As of now, MSI AB has not been updated, so voltage is currently locked. I don't think it would matter anyway without watercooling; these cards are hitting thermal throttling with decent overclocks as it is.


----------



## stocksux

Quote:


> Originally Posted by *kevindd992002*
> 
> And it's different if the monitor has both DP and DVI?


If your monitor has both, then use a display port cable and disregard the included DVI to DP dongle


----------



## muhd86

So can we do tri-SLI or quad-SLI with the GTX 1080 Ti?


----------



## keikei

Quote:


> Originally Posted by *muhd86*
> 
> so can we do tri sli / or quad sli
> 
> with gtx 1080ti


Doesn't look like it: NVIDIA GeForce "Pascal" 3-way and 4-way SLI Restricted to Select Non-Gaming Apps

From the latest driver release notes. Page 25: http://us.download.nvidia.com/Windows/378.78/378.78-win10-win8-win7-notebook-release-notes.pdf

Quote:


> Increasing 4-way SLI/Multi-GPU Performance
> Issue
> With some games and applications, you may experience little to no performance gain or
> even a performance drop with 4-way SLI or multi-GPU configurations.


----------



## al210

Waterblock here!


----------



## Pandora's Box

Quote:


> Originally Posted by *Mongo*
> 
> I also PAID for OVERNIGHT delivery and show a delivery date of Monday. I'm in contact with Nvidia right now.
> 
> I'm not about to pay for next day and not get next day, And yes Fedex delivers on Saturday.


I ordered from Newegg yesterday; I had to pay extra for overnight Saturday delivery. Their regular overnight delivers on business days only. My order is currently out for delivery, due by noon today.

Also, those posting overclock results: can you please post actual core clocks and not what you added in MSI Afterburner? Each card is different, so saying "+160 core" means nothing.


----------



## vmanuelgm

Quote:


> Originally Posted by *Pandora's Box*
> 
> I ordered from newegg yesterday, i had to pay extra for overnight saturday delivery. Their regular overnight delivers on business days. My order is currently out for delivery today by noon.
> 
> Also those posting overclock results - can you please post actual core clocks and not what you added in msi afterburner? Each card will be different so saying "+160 core" means nothing.


Not so different.

+160 ≈ 2050 MHz
+260 ≈ 2150 MHz


----------



## c0nsistent

Well, I ordered a Gigabyte 1080 Ti since the others were out of stock... paid for next-day delivery, which means I'll get it Tuesday, it says.

Unfortunately it only comes with a passive DP to DVI adapter, so I won't be able to use my DVI-DL QNIX 1440p monitor and will have to use a 1080p monitor until I get an adapter, and even then it will only be 60Hz instead of 120Hz. With that said, I think I'll have no choice but to buy a monitor at this point, unless I want to flip the card and buy an AIB model next month.


----------



## BigMack70

Paid $60 to EVGA for next day delivery which apparently means Monday. Guess this is what happens when they launch a card on a Friday afternoon.


----------



## hotrod717

Quote:


> Originally Posted by *vmanuelgm*
> 
> Not so different.
> 
> +160=2050+-
> +260=2150+-


Ultimately, this is why the GPU-Z sensor tab plus render test is preferable. Across different OC tools and card efficiencies, it provides a clear boost clock; "+this" and "+that" can get convoluted and doesn't indicate the true clock. A screenshot of the completed bench also helps show stability.
It's just BS without proof.
Hence why I post screens of benchmarks and the GPU-Z sensor tab with render. Everything else is fiction.
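For what it's worth, once you have a GPU-Z sensor log from a bench run, summarizing the sustained clock is easy to script. A minimal sketch below; GPU-Z writes a comma-separated text log, but the exact column header varies by version, so `"GPU Clock [MHz]"` here is an assumption you may need to adjust:

```python
import csv
import io


def summarize_core_clock(log_text, clock_col="GPU Clock [MHz]"):
    """Return (min, avg, max) of the core-clock column of a GPU-Z sensor log.

    GPU-Z pads its headers/values with spaces, so everything is stripped
    before parsing. `clock_col` must match your log's header.
    """
    reader = csv.DictReader(io.StringIO(log_text))
    # Normalize headers: GPU-Z surrounds them with spaces.
    reader.fieldnames = [name.strip() for name in reader.fieldnames]
    clocks = []
    for row in reader:
        value = (row.get(clock_col) or "").strip()
        if value:
            clocks.append(float(value))
    return min(clocks), sum(clocks) / len(clocks), max(clocks)
```

Point it at the contents of the sensor log captured during the render test and the min value tells you instantly whether the card dipped (throttled) below your target clock.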


----------



## alucardis666

Should have my card before or by Wednesday. Can't wait!


----------



## c0nsistent

So no one else is upset they didn't include a DVI connector on the card? I know some of you guys have Korean 1440p displays like myself.


----------



## BigMack70

Quote:


> Originally Posted by *c0nsistent*
> 
> So noone else is upset they didn't include a DVI connector on the card? I know some of you guys have Korean 1440p displays like myself.


I get that it sucks if you have one of the DVI-only korean panels, but DVI is an ancient fossil of an interface in 2017 and it's really time for everyone to move on...


----------



## alucardis666

Quote:


> Originally Posted by *c0nsistent*
> 
> So noone else is upset they didn't include a DVI connector on the card? I know some of you guys have Korean 1440p displays like myself.


Not really... Display port is superior, and those of us with UHDTV's will be using HDMI.


----------



## kevindd992002

Quote:


> Originally Posted by *stocksux*
> 
> If your monitor has both, then use a display port cable and disregard the included DVI to DP dongle


Yes, I get that, no questions asked on that as I know DP is superior.

But what I really want to know is whether the included adapter will work if a monitor has both DVI and DP? Just for curiosity's sake.


----------



## lilchronic

Quote:


> Originally Posted by *Pandora's Box*
> 
> I ordered from newegg yesterday, i had to pay extra for overnight saturday delivery. Their regular overnight delivers on business days. My order is currently out for delivery today by noon.
> 
> Also those posting overclock results - can you please post actual core clocks and not what you added in msi afterburner? Each card will be different so saying "+160 core" means nothing.


I did the same but no update on tracking yet.


----------



## Pandora's Box

Quote:


> Originally Posted by *lilchronic*
> 
> I did the same but no update on tracking yet.


I also did the $2.99 rush-processing option. You also have to select Saturday delivery (a $12.99 charge).


----------



## vmanuelgm

Quote:


> Originally Posted by *hotrod717*
> 
> Ultimately this why gpuz sensor tab and render is preferable. Across different oc tools and cards efficiency this provides a clear boost clock. + this and + that can become convuluted and does not indicate true clock. Also a screen of completed bench helps show stability.
> It's just bs without proof.
> Hence why I post screens of benchmarks and gpuz sensor tab with render. Everything else is fiction.


I can do 2200 on water, more or less...


----------



## lilchronic

Quote:


> Originally Posted by *Pandora's Box*
> 
> I also did the $2.99 rush processing option. You also have to select saturday delivery ($12.99 charge)


Me too


----------



## vmanuelgm

Quote:


> Originally Posted by *lilchronic*
> 
> Me too


Hey lil, waiting for a good oc from this beasty in your hands!!!


----------



## lilchronic

Quote:


> Originally Posted by *vmanuelgm*
> 
> Hey lil, waiting for a good oc from this beasty in your hands!!!


It's not in my hands yet.


----------



## vmanuelgm

Quote:


> Originally Posted by *lilchronic*
> 
> It's not in my hands yet.


I know, but soon will be, maybe one month??? xD


----------



## lilchronic

Quote:


> Originally Posted by *vmanuelgm*
> 
> I know, but soon will be, maybe one month??? xD


hopefully a couple of hours.


----------



## vmanuelgm

Quote:


> Originally Posted by *lilchronic*
> 
> hopefully a couple of hours.


Ohhh, I thought the carrier would make a mistake and bring it to Spain... Pity!!!

Enjoy yourself!!!

Are you watercooling it??? Gonna push it hard??? Hope so, wanna see good numbers!!!


----------



## KingEngineRevUp

Has anyone found a sweet spot for memory? I don't think maxing out memory on Pascal is the way to go.

People on Reddit have found that +500 on memory doesn't get you better results than +0.


----------



## lilchronic

Quote:


> Originally Posted by *vmanuelgm*
> 
> Ohhh, i thought the carrier would make a mistake and bring it to Spain... Pity!!!
> 
> Enjoy yourself!!!
> 
> Are you watercooling it??? Gonna push it hard??? Hope so, wanna see good numbers!!!


I know you would love that.

Yeah, I will water cool it right away. I only have a uni block right now, and the full-cover block should be here sometime next week.

I also need to go pick up some distilled water, but I don't want to miss the FedEx truck.


----------



## DooRules

Quote:


> Originally Posted by *SlimJ87D*
> 
> Has anyone found a sweet spot for memory? I don't think maxing out memory on Pascal is the way to go.
> 
> People on Reddit have found that +500 on memory doesn't get you better results than +0.


I would not pay too much attention to what anyone on Reddit has to say. Find your highest stable core clock and then work on memory.


----------



## lilchronic

I got no updates from fedex


----------



## KingEngineRevUp

Quote:


> Originally Posted by *DooRules*
> 
> I would not pay too much attention to what anyone on Reddit has to say. Find your highest stable core clock and then work on memory.


Are you taking from your own test?


----------



## BigMack70

Quote:


> Originally Posted by *SlimJ87D*
> 
> Has anyone found a sweet spot for memory? I don't think maxing out memory on Pascal is the way to go.
> 
> People on Reddit have found that +500 on memory doesn't get you better results than +0.


Reddit is not exactly the most credible source.


----------



## DooRules

Just from my Titan XP's, and they are for sure similar enough.


----------



## KedarWolf

Quote:


> Originally Posted by *GRABibus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> How do I unlock the voltage slider in Afterburner and I use the latest version, right?
> 
> With just power limit at 120% on my Gigabyte FE I'm getting +144 core, +512 memory stable in Firestrike Ultra stress test, TimeSpy and Metro Last Light Redux benchmark maxed out.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please, post some results at Benchmark section of Time Spy and Firestrike

http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30/940_20#post_25913900

http://www.overclock.net/t/1406832/single-gpu-fire-strike-top-30/1980_20#post_25913932


----------



## stocksux

Quote:


> Originally Posted by *c0nsistent*
> 
> So noone else is upset they didn't include a DVI connector on the card? I know some of you guys have Korean 1440p displays like myself.


So glad it's gone. Now we need a single-slot I/O cover for us water-cooled guys.


----------



## drkCrix

Quote:


> Originally Posted by *Slackaveli*
> 
> Anybody opened her up and applied better thermal paste? On my 980ti that gave me -10c under load.
> 
> If anybody tries it, this is the MUCH easier way. some idiots take the whole pcb off. (8 screws or 28, you choose
> 
> 
> 
> 
> 
> 
> 
> )


I am interested in this as well; I was watching the Gamers Nexus teardown, and the GPU core they had was coated in thermal paste.

For those that have removed the stock cooler, did they over-apply the thermal paste on yours as well?

Cheers


----------



## stocksux

Quote:


> Originally Posted by *kevindd992002*
> 
> Yes, I get that, no questions asked on that as I know DP is superior.
> 
> But what I really want to know if the included adapter will work if a monitor has both DVI and DP? Just for curiosity's sake.


I'm not sure why it matters. You would use the DP if it's an option. That's like asking whether you can use a fork to eat pudding: sure you can, but if a spoon is available, you're going to use the spoon. But I don't see why the adapter wouldn't work if both are on a monitor. If it's the only connection, it should default to it; if there are multiple connections, you should have an input button to select which one you want.


----------



## stocksux

Quote:


> Originally Posted by *drkCrix*
> 
> I am interested in this as well, I was watching the Gamers Nexus tear down and the GPU core they had was coated in thermal paste.
> 
> For those that have removed the stock cooler, did they over apply the thermal paste as well?
> 
> Cheers



This one in particular is the EVGA variety for what it's worth.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *BigMack70*
> 
> Reddit is not exactly the most credible source.


Here's the post I was talking about for a 1080 (not Ti)

"I opened a windowed version of Unigine Valley, checked the FPS at a certain point, and paused it. At stock memory it was 106 FPS. Raising it up to +600 made it drop to 102 FPS, so I slowly decreased it by -10 each time, and then by -1 once I noticed a leap in performance. Ended up with +457, which gave me 113 FPS; the SECOND I raised it to +458 it dropped to 102 again.

Recommend you all do it. You could be getting less FPS than stock just by being +1 memory over your "limit", so to say. I chose Valley because it is a memory-sensitive benchmark.

Tested with an EVGA FE 1080 w/ EK block"

It doesn't sound too far-fetched to me.
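The coarse-then-fine search that post describes can be sketched as a small loop. To be clear, `fps_at` below is a hypothetical stand-in, not a real API (Afterburner has no public scripting interface): it represents however you apply a memory offset and read back the paused-scene FPS, which in practice you'd do by hand:

```python
def find_memory_sweet_spot(fps_at, start=600, coarse_step=10):
    """Find the highest memory offset before the error-correction 'cliff'.

    fps_at(offset) is a hypothetical callable: apply the offset, read FPS.
    Strategy from the post: start high, step down coarsely until FPS
    clearly beats the stock baseline, then creep back up one tick at a
    time until performance would fall off again.
    """
    baseline = fps_at(0)  # stock-memory FPS
    offset = start
    # Coarse pass: past the cliff, FPS is no better than stock,
    # so keep backing the offset down in big steps.
    while offset > 0 and fps_at(offset) <= baseline:
        offset -= coarse_step
    # Fine pass: raise the offset +1 at a time while FPS holds up.
    good = fps_at(offset)
    while fps_at(offset + 1) >= good:
        offset += 1
    return offset
```

This is just the search logic from the quoted post written down; the +600 start and 10-tick step are their numbers, and each `fps_at` call costs you a manual benchmark run.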


----------



## kevindd992002

Quote:


> Originally Posted by *stocksux*
> 
> I'm not sure why it matters. You would use the DP if it's an option. That's like asking whether you can use a fork to eat pudding: sure you can, but if a spoon is available, you're going to use the spoon. But I don't see why the adapter wouldn't work if both are on a monitor. If it's the only connection, it should default to it; if there are multiple connections, you should have an input button to select which one you want.


I guess what I want to understand is why the passive adapter would work with a monitor that has both interfaces but not with a monitor that only has DVI. I just want the reason/rationale behind it.


----------



## DooRules

Quote:


> Originally Posted by *SlimJ87D*
> 
> Here's the post I was talking about for a 1080 (not Ti)
> 
> "I opened a window version of Unigine Valley and checked the FPS at a certain point and paused it. At stock memory it was 106 FPS. Raising it up to +600 made it drop to 102 FPS so i slowly decreased it by -10 each time and then -1 when i noticed a leap in performance. Ended up with +457 which gave me 113 FPS, the SECOND i raised it to 458 it dropped to 102 again.
> 
> Recommend you all do it. Could be getting less FPS than stock just by being +1 memory over your "limit" so to say. I chose Valley because it is a memory sensitive benchmark.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Tested with an EVGA FE 1080 w/ EK block"
> 
> It doesn't sound too far-fetched to me.


All he is saying is test your own gpu to see where you fall. They are all slightly different in what they can do. The only way to find out is to test your own gpu.


----------



## drkCrix

Quote:


> Originally Posted by *stocksux*
> 
> 
> This one in particular is the EVGA variety for what it's worth.


Thanks, that's what I have on order. Probably won't mess with the TIM.

Cheers


----------



## RedM00N

Quote:


> Originally Posted by *SlimJ87D*
> 
> Here's the post I was talking about for a 1080 (not Ti)
> 
> "I opened a window version of Unigine Valley and checked the FPS at a certain point and paused it. At stock memory it was 106 FPS. Raising it up to +600 made it drop to 102 FPS so i slowly decreased it by -10 each time and then -1 when i noticed a leap in performance. Ended up with +457 which gave me 113 FPS, the SECOND i raised it to 458 it dropped to 102 again.
> 
> Recommend you all do it. Could be getting less FPS than stock just by being +1 memory over your "limit" so to say. I chose Valley because it is a memory sensitive benchmark.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Tested with an EVGA FE 1080 w/ EK block"
> 
> It doesn't sound too far-fetched to me.


Sounds like error correction, or whatever it is they call it, on NV GPUs. I know that after a certain frequency on my Maxwell cards (above ~8200) performance starts to drop even though things stay stable, because the memory is generating errors (going off some CUDA memory test or whatever it was).


----------



## vmanuelgm

Quote:


> Originally Posted by *lilchronic*
> 
> I got no updates from fedex


Coming to Spain then???


----------



## PewnFlavorTang

I bought 3 (2 EVGA's and 1 ASUS). I can't wait to get these. Right now I just have a poor EVGA 1080ftw.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BigMack70*
> 
> Reddit is not exactly the most credible source.
> 
> 
> 
> Here's the post I was talking about for a 1080 (not Ti)
> 
> "I opened a window version of Unigine Valley and checked the FPS at a certain point and paused it. At stock memory it was 106 FPS. Raising it up to +600 made it drop to 102 FPS so i slowly decreased it by -10 each time and then -1 when i noticed a leap in performance. Ended up with +457 which gave me 113 FPS, the SECOND i raised it to 458 it dropped to 102 again.
> 
> Recommend you all do it. Could be getting less FPS than stock just by being +1 memory over your "limit" so to say. I chose Valley because it is a memory sensitive benchmark.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Tested with an EVGA FE 1080 w/ EK block"
> 
> It doesn't sound too far-fetched to me.

Did a test with Time Spy: dropped memory from +562 to +456, and the score went UP from 10553 to 10585.


----------



## KingEngineRevUp

What happens when you drop memory to 400? Did performance start going down?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KraxKill*
> 
> Exactly....
> 
> I wire snipped off the fan portion of the g10 bracket leaving just the portion used to hold down the AIO and then used a bench grinder + file to clean it up some so that the cut bracket would clear any protrusions within the 1080ti cooler housing. I then had to use an appropriately sized copper shim between the NZXT pump block and the die to bridge the gap to the core. This in order to keep the stock shroud and fan cooling the VRM and the rest of the board. Liquid metal on the die + shim with liquid tape surrounding it to be safe (peels right off like you'd expect tape to). Same for the shunt mod to kill off any possible TDP throttle prior to voltage.


How thick is your shim? I might attach my H55 using zip ties... lol.


----------



## bowman

Question for owners and ponderers: Would you get a 1440p G-Sync panel for futureproofing, or a 1080p G-Sync panel so that the card can actually drive current games up to the panel's capabilities?


----------



## bfedorov11

Quote:


> Originally Posted by *bowman*
> 
> Question for owners and ponderers: Would you get a 1440p G-Sync panel for futureproofing, or a 1080p G-Sync panel so that the card can actually drive current games up to the panel's capabilities?


It will perform better at higher resolutions (1440p+). Digital Foundry did a great comparison: a 1080 Ti at 4K vs. a 960, 480, and 970 at 1080p. Their overclocked 6700K bottlenecked at 1080p.


----------



## SpartanJet

Quote:


> Originally Posted by *lilchronic*
> 
> I got no updates from fedex


Same here 7:47 pm label created from newegg with overnight+Saturday shipping. I'm thinking they didn't make the cut off time for Saturday and will be delivered on Monday.


----------



## vmanuelgm

Quote:


> Originally Posted by *SpartanJet*
> 
> Same here 7:47 pm label created from newegg with overnight+Saturday shipping. I'm thinking they didn't make the cut off time for Saturday and will be delivered on Monday.


Bad news...


----------



## sWaY20

Quote:


> Originally Posted by *SpartanJet*
> 
> Same here 7:47 pm label created from newegg with overnight+Saturday shipping. I'm thinking they didn't make the cut off time for Saturday and will be delivered on Monday.


Mine says the same thing and hasn't changed; I'm hoping their system just hasn't caught up or something. If that's not the case, though, I'm getting my overnight and Saturday delivery money back.


----------



## BigMack70

Quote:


> Originally Posted by *bfedorov11*
> 
> Their overclocked 6700k bottlenecked at 1080p.


The crazier thing is that they were even seeing occasional - and slight - CPU bottlenecking at _1440p_ on the 1080 Ti.

CPUs really have just not kept up at all with the demands of games and GPUs. Really hope AMD lights a fire under Intel's ass so we can get real CPU improvements to keep pace with GPUs again.


----------



## AMDATI

Here's the plus side: as games get more GPU-demanding, the bottleneck will go away, because there will be fewer frames to draw.

My system is 1080 Ti ready... bought a new power supply to drive it... also picked up a Rival 700 for the heck of it.







But man, it's a pain changing stuff out on a mini-ITX system, that's for sure; I basically had to take everything out but the motherboard and RAM. But it did need a bit of dusting anyway.

Pro tip, guys: I just bought an ED500 DataVac electric duster, and it's a great replacement for canned air. It cost $60, but it'll pay for itself before long. I've seen people with 10-year-old ones, so they're sturdy too.


----------



## BigMack70

Quote:


> Originally Posted by *AMDATI*
> 
> Here's the plus side: as games get more GPU-demanding, the bottleneck will go away, because there will be fewer frames to draw.


I'm not so sure. Graphics cards continue to march forward in performance, with _[top GPU]+30%_ coming to market each year. At the same time, and as a general rule, graphical progress in games is limited by static low-end console hardware that changes at most every 3-5 years. This leaves the opportunity on PC to push either resolution or framerates up.

I could be wrong, but I don't see resolution ever being pushed past 8K in the mainstream, even for enthusiast PC gaming. It may not even go past 4K. At some point resolution becomes "good enough".

This means that framerates eventually become the area where things must advance (and VR pushes significantly in this direction at the high end), which means it may become far more common to be CPU-limited than GPU-limited, regardless of what you're doing.


----------



## dboythagr8

Got screwed by Newegg or FedEx; most likely both.

Newegg Premier member with auto rush processing, next-day delivery, Saturday delivery... and the package hasn't left CA, despite my order getting in multiple hours before the rush-processing cutoff time.

Great service.

Did anybody else who ordered from Newegg have their card shipped?

Highly annoyed right now. I called, and they confirmed it's most likely still with Newegg. She put in a request to refund me all of the delivery charges and to call them back on Monday if tracking hasn't updated. Most likely won't renew the Premier service because of this.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> What happens when you drop memory to 400? Did performance start going down?


I tried +417: well below 10500.

Sweet spot seems to be +520; got 10564 in Time Spy.


----------



## BigMack70

Quote:


> Originally Posted by *dboythagr8*
> 
> Most likely won't renew the Premiere service because of this..


I tried Premier a while back and thought it was a supremely awful version of Amazon Prime; I'll never do it again. Found it to be a huge waste of money.


----------



## KedarWolf

Quote:


> Originally Posted by *c0nsistent*
> 
> Well I ordered a Gigabyte 1080 Ti since the others were out of stock... paid for next day delivery which means I'll get it Tuesday it says.
> 
> Unfortunately it only comes with a passive DP to DVI adapter so I wont be able to use my DVI-DL QNIX 1440p monitor and will have to use a 1080p monitor until I get an adapter, and even then it will only be at 60hz instead of 120hz. With that said, I think I'll have no choice but to buy a monitor at this point unless I want to flip the card and buy an AIB model next month.


Yeah, my QNIX isn't working with the adapter, but I have my old card in as a dedicated PhysX card, and the QNIX works on its DVI.


----------



## dboythagr8

Quote:


> Originally Posted by *BigMack70*
> 
> I tried Premiere a while back and thought it was a supremely awful version of Amazon Prime and will never do it again. Found it to be a huge waste of money.


They definitely messed this up.


----------



## stocksux

Well, my second waterblock and backplate just showed up (I returned my Titan X, so I already had one waterblock). Since Micro Center only allowed me to buy one 1080 Ti, I have a second block/backplate and no second card... decisions, decisions. Buy another 1080 Ti for SLI, or return the waterblock (that I paid rush shipping for in anticipation of having two cards) and pay for more shipping to send it back?


----------



## lilchronic

Quote:


> Originally Posted by *dboythagr8*
> 
> Got screwed by Newegg or FedEx, most likely both.
> 
> Newegg Premiere member w/ auto rush processing, next day delivery, Saturday delivery....And package hasn't left CA despite my order getting in multiple hours before the rush processing cutoff time.
> 
> Great service
> 
> Did anybody else who ordered from Newegg have their card shipped?
> 
> High annoyed right now. I called and they confirmed that it's most likely still with newegg. She put in request to refund me all of the delivery charges and to call them back on Monday if tracking hasn't updated or whatever. Most likely won't renew the Premiere service because of this.


No card, no update on shipping no nothing!


----------



## undercoverb0ss

Took the plunge and installed my 980ti hybrid cooler on the 1080ti.

Max core clock so far 2063Mhz, memory at 6000Mhz, and temp never goes above 38C


----------



## BigMack70

Quote:


> Originally Posted by *undercoverb0ss*
> 
> Took the plunge and installed my 980ti hybrid cooler on the 1080ti.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Max core clock so far 2063Mhz, memory at 6000Mhz, and temp never goes above 38C


Could you please post some more detail/photos about how you managed to do this? I've got two hybrid 980 Ti / Titan XM coolers and would love to just add one myself.


----------



## c0nsistent

Quote:


> Originally Posted by *KedarWolf*
> 
> Yeah, my QNIX not working with adapter, but I have my old card in as dedicated PhysX and QNIX works on it's DVI.


Is that seriously possible? So I can use my GTX 1070 as a Dedicated PhysX GPU and use the DVI-DL on that while the 1080 Ti does the rendering? Or is it only for a secondary monitor?


----------



## hokk

Great card: a solid 76-80 FPS in GTA5 on max settings @ 4K.


----------



## vmanuelgm

Quote:


> Originally Posted by *lilchronic*
> 
> No card, no update on shipping no nothing!


**** happens!!!


----------



## lilchronic

Quote:


> Originally Posted by *vmanuelgm*
> 
> **** happens!!!


Just got done talking to Newegg and they refunded me $15 for the failed Saturday delivery.


----------



## vmanuelgm

Quote:


> Originally Posted by *lilchronic*
> 
> Just got done talking to newegg and they refunded me 15$ for the failed satuday delivery.


I bet those $15 won't compensate you at all...


----------



## dboythagr8

What a weekend killer


----------



## undercoverb0ss

Quote:


> Originally Posted by *BigMack70*
> 
> Could you please post some more detail/photos about how you managed to do this? I've got two hybrid 980 Ti / Titan XM coolers and would love to just add one myself.


I don't really have the time to go back and detail it with photos, but it's roughly the same process detailed here: http://www.gamersnexus.net/guides/2565-titan-x-pascal-hybrid-build-part-2-assembling-liquid

Also, mine looks just like theirs with no shroud, because I don't care about looks; if you want the shroud to go back on, it's going to take cutting/modding.


----------



## gunit2004

Which 1080 Ti card would be the best to get when considering cooling and noise level?

Would aftermarket cards (triple-fan custom variants) be quieter than the regular Founders Edition cards with blower fans?


----------



## undercoverb0ss

Quote:


> Originally Posted by *gunit2004*
> 
> Which 1080ti card would be the best to get when considering cooling and noise level?
> 
> Would aftermarket cards (triple fan custom variants) be quieter than the regular Founder edition cards with blower fans?


I think anything is quieter than the stock blower, to be honest. The best cooling and noise would be a hybrid, but I'm not sure when those will be available.


----------



## Baasha

4x 1080 Ti FE beat my 4x Titan XP in 3DMark FSU: o_0


----------



## pharcycle

Just ordered my pair from Nvidia yesterday; they're due sometime next week, hopefully. Do these cards benefit much from water cooling? I'm thinking of re-plumbing my loop if they would.


----------



## BigMack70

Quote:


> Originally Posted by *undercoverb0ss*
> 
> I don't really have the time to go back and detail anything with photos but it's roughly the same process detailed here: http://www.gamersnexus.net/guides/2565-titan-x-pascal-hybrid-build-part-2-assembling-liquid
> 
> Also, mine looks just like they show with no shroud because I don't care about looks, if you want the shroud to go back on it's going to take cutting/modding.


Thanks. I'll be doing this ASAP on my card. Just saved me $100-130 on a new AIO









Now I just need to wait for a custom BIOS to bypass the power limit, so hopefully the overclocking headroom improves.


----------



## KedarWolf

Quote:


> Originally Posted by *c0nsistent*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Yeah, my QNIX not working with adapter, but I have my old card in as dedicated PhysX and QNIX works on it's DVI.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is that seriously possible? So I can use my GTX 1070 as a Dedicated PhysX GPU and use the DVI-DL on that while the 1080 Ti does the rendering? Or is it only for a secondary monitor?
Click to expand...

Yes, my old card handles dedicated PhysX AND drives a second DVI screen.









Edit: You still need a main screen for the 1080 Ti though, I use a 4K GSync screen.


----------



## stocksux

Anyone know anything about this SLI bridge from Asus? I can't find it anywhere to buy. Has it not been released yet?


----------



## KedarWolf

Quote:


> Originally Posted by *Baasha*
> 
> 4x 1080 Ti FE beat my 4x Titan XP in 3DMark FSU: o_0


What do you use for a 4-way SLI high-bandwidth bridge?









Just curious; I posted the link earlier to the driver hack that enables 3-way and 4-way SLI, though I'm not doing either.


----------



## kevindd992002

Quote:


> Originally Posted by *bfedorov11*
> 
> It will perform better at higher resolutions.. 1440p+. DigitalFoundry did a great comparison, 1080ti at 4k vs 960, 480, and 970 at 1080p. Their overclocked 6700k bottlenecked at 1080p.


Quote:


> Originally Posted by *BigMack70*
> 
> The crazier thing is that they were even seeing occasional - and slight - CPU bottlenecking at _1440p_ on the 1080 Ti.
> 
> CPUs really have just not kept up at all with the demands of games and GPUs. Really hope AMD lights a fire under Intel's ass so we can get real CPU improvements to keep pace with GPUs again.


Would a 4.5GHz-overclocked i7-2600K be bottlenecked when gaming with a 1080 Ti at 1080p?


----------



## BigMack70

Quote:


> Originally Posted by *kevindd992002*
> 
> Would a 4.5GHz-overclocked i7-2600K be bottlenecked whrn gaming with a 1080Ti at 1080p?


Any CPU will bottleneck a 1080 Ti at 1080p, so yes. This really isn't a card that makes sense for 1080p gaming.


----------



## lilchronic

Quote:


> Originally Posted by *vmanuelgm*
> 
> I bet those 15 wont compensate u at all...


nope









Quote:


> Originally Posted by *dboythagr8*
> 
> What a weekend killer


Yep


----------



## sWaY20

These asshats at Newegg won't refund any of my shipping cost; I chose next-day and Saturday delivery. They're claiming that if it doesn't come by Monday I can submit a delay claim... Do these idiots not understand next-day and Saturday delivery? I'm done with Newegg after this.


----------



## sWaY20

Quote:


> Originally Posted by *vmanuelgm*
> 
> I bet those 15 wont compensate u at all...


At least they gave you something


----------



## lilchronic

Quote:


> Originally Posted by *sWaY20*
> 
> At least they gave you something


They're probably dealing with hundreds of these cases; I'm sure they will refund you that Saturday delivery fee. They said it will go through on Monday and take 3-5 days to get refunded.


----------



## sWaY20

Quote:


> Originally Posted by *lilchronic*
> 
> There probably dealing with hundreds of these cases, I'm sure they will refund you that satuday delivery fee. They said it will go through on monday and take 3-5 day's to get refunded.


Yeah, I'm just irritated; I'll chat with them again on Monday or Tuesday.


----------



## lilchronic

Quote:


> Originally Posted by *sWaY20*
> 
> yeah I'm just irritated, I'll chat with them again on Monday or Tues.


Yeah me too man, but a little mary jane helps cure the pain.


----------



## Cuenda

Hello, I'm planning on getting a 1080 Ti in the next few days, but I have a problem.....

My PC:

CPU: i5 2500K at 4.2GHz (I can try to push it a bit more, but I'm a bit bad at overclocking)
MB: ASRock Z68 Extreme4 Gen 3
RAM: G.Skill RipJaws X 1600, 12GB
PSU: EVGA 650W 80 Plus Gold
GPU: AMD HD 6850 (I previously had a 970 that I sold, planning to get a 1080 Ti or a 1080)

Monitor: Dell S2716DG, G-Sync, 2560x1440 144Hz (here is the biggest problem)

When I bought this monitor recently (I still had the 970 then), I saw a devastating performance drop in my games, as you'd expect. Moving to 1440p was nice, but my PC couldn't handle it. I didn't care too much, because Ryzen was coming soon, and the 1080 Ti as well, so I was planning to upgrade my PC anyway; that's why I sold my 970 last week.

I wanted to upgrade to Ryzen, but with all its launch problems and the super low stock of motherboards, I decided not to move to Ryzen yet; maybe I'll wait for the R5 chips, or reconsider an i7 6700K or 7700K.

What I do know is that I'll end up buying a 1080 Ti, either alone or with a full new PC. But now I'm also worried: is there a big performance difference between the Founders Edition and the custom cards? And will my PC bottleneck it? (I'm planning to play at 1440p.) Of course I'm not expecting the same FPS as a top Ryzen or i7, but with the GTX 970 it was a bit sad playing at 1440p on Ultra in games like BF1. I just don't want to upgrade the card if the improvement over the 970 ends up small because my PC bottlenecks it. Yes, I'm planning to replace the rest of my PC, so this would be temporary, but maybe something happens and I can't upgrade the rest for a while; that's why I care.

So I need an honest opinion.


----------



## hokk

Quote:


> Originally Posted by *Cuenda*
> 
> Hello, im planning on getting a 1080 Ti the next days, but i have a problem with it.....
> 
> My PC:
> 
> CPU: i5 2500k at 4.2 (can try to push a bit more but im a bit bad with OC)
> MB: AsRock Z68 Extreme4 Gen 3
> Ram: G.Skill RipJaws X 1600 x12 GB
> PSU: EVGA 650w plus Gold
> GPU: AMD HD 6850 (i had previously a 970 that i sold planning to get a 1080Ti or a 1080)
> 
> Monitor: Dell S2716DG, G-Sync 2560x1440p 144Hz (here is the biggest problem)
> 
> As i bough this monitor recently (i was having the 970 yet) i saw devastating performance drop on my games, as would be normal, changing to 1440p was good but hell, my PC wouldnt handle it, i didnt care so much cause Ryzen was coming soon, and 1080 Ti as well so i was planning to change my PC soon, thats why i sold my 970 (it was the last week).
> 
> I wanted to upgrade to Ryzen, but with all the problems on it, super low Stock on Mobos, i decided to not move to Ryzen, at least not yet, maybe w8 for R5 or reconsider to get an i7 6700K or 7700K.
> 
> What i know is that i will end buying an 1080Ti alone or with a full PC but now im worried as well about... is there a big differ on perfomance between the Founder Edition than the Customs one? and will my PC Bottleneck it?(im planning to play at 1440p) of course im not expecting to get the same FPS as a top Ryzen or i7 but with the GTX 970 was a bit sad to play at 1440p on Ultra in games like Bf1, i just dont want to upgrade the card if the improvement gonna be just a little more comparead to the 970 cause my PC bottleneck it, and yeah im planning to change my PC so will be temporal, but maybe something happens and cant change the rest of my PC in a while thats why i care.
> 
> So need an honest opinion.


Well, I'm running the FE, but I would recommend pushing your CPU as much as you can; this card really loves high clocks.

Your CPU will bottleneck it a fair amount, and you will suffer low minimums,

but you will see a nice performance gain on average over your current card.


----------



## LukkyStrike

Quote:


> Originally Posted by *Cuenda*
> 
> Hello, im planning on getting a 1080 Ti the next days, but i have a problem with it.....
> 
> My PC:
> 
> CPU: i5 2500k at 4.2 (can try to push a bit more but im a bit bad with OC)
> MB: AsRock Z68 Extreme4 Gen 3
> Ram: G.Skill RipJaws X 1600 x12 GB
> PSU: EVGA 650w plus Gold
> GPU: AMD HD 6850 (i had previously a 970 that i sold planning to get a 1080Ti or a 1080)
> 
> Monitor: Dell S2716DG, G-Sync 2560x1440p 144Hz (here is the biggest problem)
> 
> As i bough this monitor recently (i was having the 970 yet) i saw devastating performance drop on my games, as would be normal, changing to 1440p was good but hell, my PC wouldnt handle it, i didnt care so much cause Ryzen was coming soon, and 1080 Ti as well so i was planning to change my PC soon, thats why i sold my 970 (it was the last week).
> 
> I wanted to upgrade to Ryzen, but with all the problems on it, super low Stock on Mobos, i decided to not move to Ryzen, at least not yet, maybe w8 for R5 or reconsider to get an i7 6700K or 7700K.
> 
> What i know is that i will end buying an 1080Ti alone or with a full PC but now im worried as well about... is there a big differ on perfomance between the Founder Edition than the Customs one? and will my PC Bottleneck it?(im planning to play at 1440p) of course im not expecting to get the same FPS as a top Ryzen or i7 but with the GTX 970 was a bit sad to play at 1440p on Ultra in games like Bf1, i just dont want to upgrade the card if the improvement gonna be just a little more comparead to the 970 cause my PC bottleneck it, and yeah im planning to change my PC so will be temporal, but maybe something happens and cant change the rest of my PC in a while thats why i care.
> 
> So need an honest opinion.


Well, I have an i3-6300 that I use in my media server. I used it to test the cards before putting their water blocks on yesterday. It ran Doom at the highest preset ("Ultra") at about 120FPS. It was impressive. It scored lower in 3DMark, but not by a huge margin.


----------



## Cuenda

Quote:


> Originally Posted by *kylzer*
> 
> Well i'm running the FE but i would recommend pushing your cpu as much as you can this card really loves high clocks.
> 
> Your cpu will bottleneck it a fair amount, you will suffer low minimums.
> 
> but you will see nice performance on average over your current card.


And what about the performance difference between the FE and the custom ones? Would it mainly be temps?

Quote:


> Originally Posted by *LukkyStrike*
> 
> Well i have a i3-6300 that i use in my media server. I used it to test the cards prior to putting their WB's on yesterday. It ran doom at highest pre-set "ultra" at about 120FPS. It was impressive. It scored low in 3d mark, but not by a huge margin.


At what resolution?


----------



## hokk

Quote:


> Originally Posted by *Cuenda*
> 
> And why about the differ performance between the FE and Customs one? would be mainly temps?
> At what resolution?


Just temps, and maybe a factory overclock.


----------



## lexR0608

Just had an email from Nvidia saying my 1080 Ti is on back order. Does anyone here have a clue how long that will be? The sale went fine, the order was confirmed, and I was given an estimated delivery date of next week... Quite annoying.


----------



## kevindd992002

Quote:


> Originally Posted by *BigMack70*
> 
> Any CPU will bottleneck a 1080 Ti at 1080p, so yes. This really isn't a card that makes sense for 1080p gaming.


But the thing is that I will upgrade to a higher-resolution monitor in the future anyway. So if I still go with the 1080 Ti (from a 1070), would I be making the right choice or not? Is a CPU bottleneck bad if the GPU can still render the game at its max fps?


----------



## BigMack70

Quote:


> Originally Posted by *kevindd992002*
> 
> But the thing is that I will upgrade to a higher-resolution monitor in the future anyway. So if I still go with the 1080Ti (from a 1070), will I be making the right choice or not? Is a CPU bottleneck bad if the GPU can render the game to its max fps?


It's your PC and your money so you can do whatever you want, but I would not buy a 1080 Ti until I had the higher resolution screen already in hand. Particularly if I already had a 1070... it's not like there are games the 1070 struggles in at 1080p.


----------



## stocksux

Quote:


> Originally Posted by *kevindd992002*
> 
> But the thing is that I will upgrade to a higher-resolution monitor in the future anyway. So if I still go with the 1080Ti (from a 1070), will I be making the right choice or not? Is a CPU bottleneck bad if the GPU can render the game to its max fps?


If you go down the road of the 1080 Ti, just know that the money you shell out for the card won't get you its peak performance, as your CPU will hold it back. So let's say your "future" monitor upgrade gets held up and the next generation of GPUs comes out before you can upgrade your monitor. You've essentially thrown away money on the Ti. Assuming the 2080 or maybe Volta comes out, you'll want to go with one of those, but then you're spending money on a GPU again instead of a monitor. IMO, try to save up so you can do both at once and get the most time out of current tech before bigger and better comes along. FWIW, a monitor will likely last you three to four builds if not more, but the GPU will likely be replaced every build or every other.


----------



## KraxKill

Quote:


> Originally Posted by *stocksux*
> 
> If you go down the road of the 1080ti just know that the money you shell out for the card won't get you its peak performance as your CPU swill hold it back. So let's say your "future" monitor upgrade gets held up and the next generation of GPU comes out before you can upgrade your monitor. You've essentially thrown away money on the ti. Assuming the 2080 or maybe Volta comes out you'll want to go with one of those. But then you're spending money on a GPU again instead of a monitor. Imo try and save up to be able to do it at once to get the most time out of current tech before bigger and better comes along. FTIW a monitor will likely last you three to four builds if not more but the gpu will likely be replaced every build or every other.


People, please!!! If your card is bottlenecked by your CPU, RUN SUPER SAMPLING!!!!!!! What bottleneck?
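The point above is that supersampling multiplies the rendered pixel count, shifting the load back onto the GPU until the CPU is no longer the limit. A quick sketch of the arithmetic (the 1.78x/2.25x/4x factors are NVIDIA's standard DSR multipliers; the 1080p base resolution is just an example, not something from the post):

```python
# DSR-style supersampling factors scale the *total* pixel count, so even a
# 1080p monitor can load the GPU like a higher-resolution one.
base_w, base_h = 1920, 1080  # example native resolution

for factor in (1.78, 2.25, 4.0):  # standard DSR multipliers
    pixels = int(base_w * base_h * factor)
    print(f"{factor}x DSR renders {pixels:,} pixels per frame")

# 4.0x at 1080p matches native 4K: 1920*1080*4 == 3840*2160
```

So a CPU-bound card at 1080p can be pushed fully GPU-bound just by raising the DSR factor, at the cost of frame rate.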


----------



## Pandora's Box

Got my 1080 Ti installed this afternoon. With a fan curve that maxes out at 65%, I am hitting a 2,000MHz core clock. My max GPU temp is 76C; fan speed at that temp is 56%. This was recorded over 2 loops of the Firestrike Extreme benchmark. The core clock bounced around from 1960-2000. I did briefly mess around with memory overclocking; I took the card to 11,716MHz memory and gained a whole 8 points in Firestrike Extreme - not worth it. My ambient temp is 68F or 20C.

Firestrike Extreme

GPU overclock settings:

Core Voltage - Not changed
Power Limit - 120%
Temp Limit - 90C
Core Clock - +152 (2,000Mhz in firestrike)
Mem Clock - 0
Fan Speed - Curve which maxes out at 65% at 84C

Mandatory pics:
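For anyone curious what those slider numbers work out to, here's a quick back-of-the-envelope sketch (the 250W stock board power is the published 1080 Ti FE spec, assumed here rather than taken from the post):

```python
# Sanity-check the numbers from the overclock settings above: the ambient
# temperature conversion, and what a 120% power-limit slider means against
# the card's reference board power.
def f_to_c(fahrenheit):
    """Convert degrees Fahrenheit to Celsius."""
    return (fahrenheit - 32) * 5 / 9

STOCK_POWER_W = 250      # GTX 1080 Ti FE reference board power (published spec)
POWER_LIMIT_PCT = 120    # slider setting from the post

print(f_to_c(68))                              # 20.0, matching "68F or 20C"
print(STOCK_POWER_W * POWER_LIMIT_PCT / 100)   # 300.0 W power ceiling
```

In other words, the 120% slider lets the board draw up to roughly 300W before the power limiter starts pulling clocks down.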


----------



## EvilPieMoo

My 2 Founders 1080 Ti's are the worst overclockers; they can't go over 2020MHz.


----------



## BigMack70

Quote:


> Originally Posted by *EvilPieMoo*
> 
> My 2 Founders 1080Ti's are the worst overclockers, can't go over 2020Mhz.


Seems decent/average based on reviews


----------



## EvilPieMoo

Quote:


> Originally Posted by *BigMack70*
> 
> Seems decent/average based on reviews


I'm surprised to see them all clock so low. Not really a complaint, because they still pack plenty of performance, but the 1080 FE was able to do over 2100.


----------






## lilchronic

Quote:


> Originally Posted by *Pandora's Box*
> 
> Got my 1080 Ti installed this afternoon. With a fan curve that maxes out at 65% I am hitting 2,000MHz core clock. My max GPU temp is 76C, fan speed at that temp is 56%. This was recorded doing 2 loops of the firestrike extreme benchmark. Core clock bounced around from 1960 - 2000. I did mess around briefly with memory overclocking, I took the card to 11,716Mhz memory and I gained a whole 8 points in firestrike extreme - not worth it. My ambient temp is 68F or 20C.
> 
> Firestrike Extreme
> 
> GPU overclock settings:
> 
> Core Voltage - Not changed
> Power Limit - 120%
> Temp Limit - 90C
> Core Clock - +152 (2,000Mhz in firestrike)
> Mem Clock - 0
> Fan Speed - Curve which maxes out at 65% at 84C
> 
> Mandatory pics:
> 
> 
> Spoiler: Warning: Spoiler!


Didn't you order from Newegg?


----------



## Renairy

GPU launches aren't fun anymore... We need a new Crysis.


----------



## BigMack70

Quote:


> Originally Posted by *EvilPieMoo*
> 
> surprised to see them all clock so low, not really a complaint because they still pack plenty of performance but the 1080 FE was able to do over 2100.


The 1080 is a different and smaller chip. The 980 typically could reach slightly higher clocks than Titan XM / 980 Ti also.


----------



## Baasha

This thread needs more pics!


----------



## AMDATI

You might think people who buy 1080ti's to overclock are wealthy....but I'm soo wealthy I buy 1080ti's and don't bother to overclock. Bam! #MetaIsReal


----------



## Pandora's Box

Quote:


> Originally Posted by *lilchronic*
> 
> Didn't you order from newegg?


Yes, with the Saturday and rush delivery options. They dropped it off at 9AM this morning (March 11th).


----------



## RedM00N

Quote:


> Originally Posted by *Renairy*
> 
> GPU launches arent fun anymore... We need a new Crysis.


It gets less and less likely that will ever happen again, with the focus on consoles and PC hardware advancing so fast. Though Arma 3 maxed out gets like 20FPS on the menu for me on one Titan XM.











Maxed out: some of the test is CPU-bound, most is GPU-bound. That delicious fps.


----------



## Renairy

Quote:


> Originally Posted by *Baasha*
> 
> This thread needs more pics!


Which small country are you trying to hack and take over?









Quote:


> Originally Posted by *RedM00N*
> 
> Gets less and less likely of having that happen again. Focus on consoles / pc hardware advancing to fast. Though Arma 3 maxed out gets like 20FPS on the menu for me on 1 Titan XM


This has to be the least exciting launch for me, and I think it has to do with the games that are currently available versus the games that should be.


----------



## AMDATI

That's cool and all that you got four 1080 Ti's, but can your mouse do this? Check and mate.


----------



## Renairy

Quote:


> Originally Posted by *AMDATI*
> 
> that's cool and all that you got four 1080ti's, but can your mouse do


Checkmate is chess...
This mouse will whip you in Connect 4.


Spoiler: Warning: Spoiler!


----------



## Renairy

Just spoke to my local supplier, and they said the Strix cards are already shipping.


----------



## Cuenda

Quote:


> Originally Posted by *Renairy*
> 
> Just spoke to my local supplier and they said the Strix are already shipping.


You mean the Asus Strix? O_O

I really want to know how long the custom cards will take to reach shops... I can't wait until early April; otherwise I'd go for the FE tomorrow :|


----------



## sWaY20

Quote:


> Originally Posted by *Pandora's Box*
> 
> Yes with saturday and rush delivery options. They dropped it off at 9AM this morning (March 11th).


So there's no reason why others didn't receive theirs from Newegg with overnight and Saturday delivery... hmmm.


----------



## tconroy135

Quote:


> Originally Posted by *dboythagr8*
> 
> Got screwed by Newegg or FedEx, most likely both.
> 
> Newegg Premiere member w/ auto rush processing, next day delivery, Saturday delivery....And package hasn't left CA despite my order getting in multiple hours before the rush processing cutoff time.
> 
> Great service
> 
> Did anybody else who ordered from Newegg have their card shipped?
> 
> High annoyed right now. I called and they confirmed that it's most likely still with newegg. She put in request to refund me all of the delivery charges and to call them back on Monday if tracking hasn't updated or whatever. Most likely won't renew the Premiere service because of this.


TBH, whether it's NVIDIA pre-orders or Newegg orders, the way they handle these releases is ****. Take pre-orders, prepare the shipments, and mail the cards efficiently so that pre-orders arrive on the doorstep of early adopters between 8 AM and 8 PM on release day, free of shipping charges.


----------



## Renairy

Quote:


> Originally Posted by *Cuenda*
> 
> U mean the Asus Strix? O_O
> 
> I really want to know how long will take the customs to be on shops.... cant wait early april so far... else i would go for FE tomorrow :|


Yes, the Asus Strix; apparently they are already shipping from ASUS to suppliers, and the shops will then need to have them shipped on to them.

We should be able to order from shops in another 5-7 days if that's the case.


----------



## Cuenda

Quote:


> Originally Posted by *Renairy*
> 
> Yes the Asus Strix, apparently they are already shipping from ASUS to suppliers, shops will then need to have them shipped to them.
> 
> We will be able to order from shops in another 5-7 days if thats the case


Does he only know about the Asus one? No info about EVGA?


----------



## Alvarado

So did I screw myself over by getting one of these if I don't plan on watercooling? By that I mean the FE version. I read through a handful of reviews that talked about the 84C temp target (that sounds scary, by the way) but didn't think much about it until today. My current 980 Ti stays just under 70C under heavy load.


----------



## Renairy

Quote:


> Originally Posted by *Cuenda*
> 
> He can only knows about Asus one? No info about EVGA?


I only inquired about ASUS.

Quote:


> Originally Posted by *Alvarado*
> 
> So did I screw myself over by getting one of these if I don't plan on watercooling? By that I mean the FE version. Read through a handful of reviews that talked about the 84c (that sounds scary by the way) but didn't think much about it till today. My current 980 ti gets to just under 70c under heavy load.


I never buy reference designs; I always wait for the superior components and cooling of an AIB board.


----------



## keikei

Quote:


> Originally Posted by *Pandora's Box*
> 
> Got my 1080 Ti installed this afternoon. With a fan curve that maxes out at 65% I am hitting 2,000MHz core clock. My max GPU temp is 76C, fan speed at that temp is 56%. This was recorded doing 2 loops of the firestrike extreme benchmark. Core clock bounced around from 1960 - 2000. I did mess around briefly with memory overclocking, I took the card to 11,716Mhz memory and I gained a whole 8 points in firestrike extreme - not worth it. My ambient temp is 68F or 20C.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Firestrike Extreme
> 
> GPU overclock settings:
> 
> Core Voltage - Not changed
> Power Limit - 120%
> Temp Limit - 90C
> Core Clock - +152 (2,000Mhz in firestrike)
> Mem Clock - 0
> Fan Speed - Curve which maxes out at 65% at 84C
> 
> Mandatory pics:


The damn FE is so sexy. I may just get a new case with glass sides so I can ogle the card whenever I want.


----------



## Pandora's Box

Quote:


> Originally Posted by *Alvarado*
> 
> So did I screw myself over by getting one of these if I don't plan on watercooling? By that I mean the FE version. Read through a handful of reviews that talked about the 84c (that sounds scary by the way) but didn't think much about it till today. My current 980 ti gets to just under 70c under heavy load.


Meh. The GPU is designed to run at 84C. The aftermarket cards will net you 3-5% fps boost at best.


----------



## drkCrix

Hopefully I get a shipping notice from NCIX on Monday (EVGA version). Can't wait for it and my 4K monitor to show up.


----------



## keikei

Quote:


> Originally Posted by *drkCrix*
> 
> Hopefully I get a shipping notice from NCIX on Monday (EVGA version), Can't wait for it and my *4K monitor* to show up


What model?


----------



## drkCrix

Acer XB280HK (G-Sync). It's an older model, but I got it for 50% off ($450 CDN) as it was open box.


----------



## dboythagr8

Quote:


> Originally Posted by *sWaY20*
> 
> yeah I'm just irritated, I'll chat with them again on Monday or Tues.


I spoke with someone earlier in the day and they initiated a refund. I'd ask to speak to a supervisor or call back and get a different agent.
Quote:


> Originally Posted by *sWaY20*
> 
> so there's no reason why others didn't receive theirs from newegg with overnight and sat delivery...hmmm


Did you order the EVGA version as well? I wonder if that has something to do with it...

I complained to Newegg about this situation on Twitter. Of course that prompted a request for a DM with my order number. They looked into it and said the following in a message:
Quote:


> Thanks for reaching out to us and providing your order number! What I find interesting is that this was scanned out of our warehouse and shipped early enough to be delivered on Saturday. I am happy to see that we have issued a refund for your shipping charges already, but I also would like to touch base with our warehouse on Monday to see what happened. Again, thanks for bringing this to our attention. If you have any other questions or concerns in the meantime, please let us know


I responded that others were having the same issue, and they got back saying
Quote:


> Thank you for that information. Personally, this is the first time I'm hearing of this, but I'll keep an eye out and escalate this in our warehouse come Monday


So at the very least I brought some attention to it by blasting them on Twitter about it, and we'll see what they say on Mon.


----------



## al210

Quote:


> Originally Posted by *Renairy*
> 
> I only inquired about ASUS.
> I never buy reference design... always wait for the superior components and cooling of a AIB board.


You make it sound like reference cards are built with inferior components, which is ridiculous. Reference gives water-coolers the best selection of water block options, while non-reference is usually best for better air cooling. Both are built with high-end components.


----------



## Pandora's Box

I couldn't even get the EVGA card through to checkout; it kept saying out of stock as soon as I clicked add to cart, lol. Settled on PNY.


----------



## Alvarado

I guess I could use this as a reason to try watercooling, now that I'm kinda stuck with a reference card.


----------



## dboythagr8

Quote:


> Originally Posted by *Alvarado*
> 
> I guess I could always use this reason to try watercooling now that I'm kinda stuck with a reference card.


Why are you feeling some kind of way about the Founders Edition? Did you not know the advantages and disadvantages of the blower-style card going in?


----------



## Alvarado

Quote:


> Originally Posted by *dboythagr8*
> 
> Why are you feeling some kind of way about the Founder's Edition? Did you not know going in the advantages and disadvantages of the blower style card?


Sort of. I think this feeling is more of an "Uh oh, what did I do" type of thing.


----------



## Domler

@Alvarado You'll be hooked. I just finished watercooling my Define Nano S LAN rig, which I'll use maybe once a month, because it needed it? Lol. It's a lot of fun.
A 4790K overclocked to 4.7GHz and a GTX 1080 overclocked to 2.05GHz. It's my rental until my two 1080 Ti's come in for the big rig.


----------



## al210

Quote:


> Originally Posted by *Alvarado*
> 
> Sort of. Think this feeling is more "Uh oh, what did I do" type of thing.


You're good. Sit back, enjoy your top-end card, and don't look back.


----------



## blackforce

Quote:


> Originally Posted by *Renairy*
> 
> GPU launches arent fun anymore... We need a new Crysis.


Not this dumb "we need a Crysis benchmark" thing again on every forum. Stop living in the past.


----------



## dboythagr8

Quote:


> Originally Posted by *Alvarado*
> 
> Sort of. Think this feeling is more "Uh oh, what did I do" type of thing.


You have the fastest GPU. Having a different cooler won't change that. Plus if you decide to go the custom WC route, you've already got the best option.


----------



## Mongo

Quote:


> Originally Posted by *Slackaveli*
> 
> High annoyed right now. I called and they confirmed that it's most likely still with newegg. She put in request to refund me all of the delivery charges and to call them back on Monday if tracking hasn't updated or whatever. Most likely won't renew the Premiere service because of this.


I ordered from Nvidia with overnight shipping and didn't get it today either. But I contacted them, and I'm getting all shipping charges refunded.


----------



## Renairy

Quote:


> Originally Posted by *al210*
> 
> You make it sound like reference cards are built with inferior components which is ridiculous. RF gives water coolers the best selection for different water block options while non-reference is usually best for better air cooling options. Both cards are built with high end components.


No, all I said was the truth. If you're comparing to some AIB boards, then yes, it is inferior.

Quote:


> Originally Posted by *blackforce*
> 
> not this dumb we need a crysis benchmark again and again on every forum, stop living in the past.


----------



## Slackaveli

Nvidia refunded you? Because FedEx shut me down and I am very persistent.


----------



## muhd86

Quote:


> Originally Posted by *keikei*
> 
> Doesnt look like it. NVIDIA GeForce "Pascal" 3-way and 4-way SLI Restricted to Select Non-Gaming Apps
> 
> From the latest driver release notes. Page 25: http://us.download.nvidia.com/Windows/378.78/378.78-win10-win8-win7-notebook-release-notes.pdf


Well, I've seen some benchmarks on YouTube of people running 3-way GTX 1080 even though it's not supported. How did they manage it?

I was talking about the 1080 Ti; since they're new GPUs, is it possible Nvidia will unlock 3-way and 4-way SLI?


----------



## dboythagr8

Quote:


> Originally Posted by *Slackaveli*
> 
> Nvidia refunded you? because fedex shut me down and i am very persistent.


I don't know how FedEx could refund you. You didn't directly pay them, you'd have to go to the vendor.


----------



## Mongo

Quote:


> Originally Posted by *Slackaveli*
> 
> Nvidia refunded you? because fedex shut me down and i am very persistent.


Yes, Nvidia. I called them last night and told them the tracking number said I was not getting it until Mon 3/13, even though I ordered it on Fri 3/10 with overnight shipping. They told me they were going to pass it on to Logistics to see if they could get it delivered before then, and to call back if it wasn't. So when I didn't get it today I called them back; the person read the notes and said sorry about that, I will issue a refund for your shipping.


----------



## alucardis666

Quote:


> Originally Posted by *Mongo*
> 
> Yes Nvidia, I called them last night told them the tracking number told me I was not getting it until Mon 3/13 and that I ordered it on Fri 3/10 with overnight shipping. They told me they were going to pass it on to Logistics and see if they could get it delivered before then, And if it wasn't call back. So when I didn't get it today I called them back and the person read the notes said sorry about that I will issue refund for your shipping.


Now that's good customer service!


----------



## Agueybana_II

Decided to reach out to Gigabyte, ASUS, PNY, MSI, EVGA, and ZOTAC to ask the following question regarding WARRANTY:

My email is in regards to [Company]'s warranty for the 1080 Ti FE. Does [Company] void the warranty of the GPU after an aftermarket water block is installed?

****Responses*****

EVGA stated:
Hello,

As long as you are able to put the original cooler back on the card when sending it in for replacement there shouldn't be any issues with warranty. Simply removing the cooler does not void your warranty. If you have any other questions, please let us know.

Regards,
EVGA

Will update as manufacturers provide responses.


----------



## BigMack70

Pretty sure if you can get your card back to its vendor with original cooler and original BIOS in place, they will honor warranty (barring obvious user-induced physical damage). Always good to get confirmation though.


----------



## Pandora's Box

Most board partners have that info on their website:

PNY: http://www4.pny.com/rma/index2.aspx

(No modifications allowed)


----------



## AMDATI

I actually wish they made souped-up custom blower-style coolers. I'd love to see a dual-fan blower design with a beefier heatsink, or maybe even a triple-blower design running at lower RPMs for more airflow but quieter operation.

But even in my mini itx system heat isn't a problem, my 970 gets to 80c and my CPU never goes over 54c.


----------



## sWaY20

Quote:


> Originally Posted by *dboythagr8*
> 
> I spoke with someone earlier in the day and they initiated a refund. I'd ask to speak to a supervisor or call back and get a different agent.
> Did you order EVGA version as well? I wonder if that has something to do with it...
> 
> I complained to Newegg about this situation on Twitter. Of course that prompted a request for a DM with my order number. They looked into and said the following in a message:
> I responded that others were having the same issue, and they got back saying
> So at the very least I brought some attention to it by blasting them on Twitter about it, and we'll see what they say on Mon.


yeah got in on the evga just in time, which is what i wanted. Did rush processing, overnight and Saturday delivery. Said shipped at 7:47pm Fri night.


----------



## lilchronic

Quote:


> Originally Posted by *sWaY20*
> 
> yeah got in on the evga just in time, which is what i wanted. Did rush processing, overnight and Saturday delivery. Said shipped at 7:47pm Fri night.


Yeah i ordered the EVGA 1080Ti.


----------



## KingEngineRevUp

This is a stupid question... But what screw sizes do we need to take apart the 1080 Ti? Thank you.


----------



## dboythagr8

Quote:


> Originally Posted by *sWaY20*
> 
> yeah got in on the evga just in time, which is what i wanted. Did rush processing, overnight and Saturday delivery. Said shipped at 7:47pm Fri night.


Quote:


> Originally Posted by *lilchronic*
> 
> Yeah i ordered the EVGA 1080Ti.


Hmm...well, all 3 of us ordered EVGA versions and were screwed over. The user with the PNY card did the same shipping and got his as expected.


----------



## Slackaveli

Quote:


> Originally Posted by *Mongo*
> 
> Yes Nvidia, I called them last night told them the tracking number told me I was not getting it until Mon 3/13 and that I ordered it on Fri 3/10 with overnight shipping. They told me they were going to pass it on to Logistics and see if they could get it delivered before then, And if it wasn't call back. So when I didn't get it today I called them back and the person read the notes said sorry about that I will issue refund for your shipping.


Worked. Thanks, man. She blamed FedEx, told me there wasn't anything she could do but hang in there, then relented on the third try. So, looks like a three-"no"s-then-a-"yes" guideline at Nvidia. Good service, yes. Got my $32 back. Nvidia paying for FedEx's mistake.


----------



## Neokolzia

Threw my 1080 FTW in for Step-Up; it was approved and is shipping off. Like $120 CAD for shipping there and back, and a $140 CAD difference for the GPU. Not the worst upgrade ever, but it certainly wasn't cheap. Brings the total cost with waterblock to like $1300-1400 Q_Q....

Will arrive at EVGA on Friday; I suspect a ship-out on Monday, and I'll get it mid-week after.


----------



## KickAssCop

Backordered on newegg for an ASUS 1080 Ti. Hopefully it ships out on Monday or earlier.


----------



## Cuenda

And EVGA didn't say anything about the custom ones at PAX? Like price or any release date? :|


----------



## Slackaveli

Quote:


> Originally Posted by *Agueybana_II*
> 
> Decided to reach out to Gigabyte, ASUS, PNY, MSI, EVGA, ZOTAC to asked the following question regarding WARRANTY
> 
> My email is in regards to [Company] warranty for the 1080TI FE. Does [Company] voids the warranty of the GPU after an aftermarket water block is installed?
> 
> ****Responses*****
> 
> EVGA stated:
> Hello,
> 
> As long as you are able to put the original cooler back on the card when sending it in for replacement there shouldn't be any issues with warranty. Simply removing the cooler does not void your warranty. If you have any other questions, please let us know.
> 
> Regards,
> EVGA
> 
> Will update as manufacture provide response.


What about Nvidia reference?


----------



## Slackaveli

Just asked Nvidia. No coolers; they even void the warranty for overclocking. Will be selling this one and buying an EVGA, it seems.


----------



## BigMack70

You guys are way too paranoid. Nobody can tell what you did with your card as long as you restore it to its original condition (factory default BIOS + original cooler). Nobody can tell if you overclocked, put a water block on it, changed the BIOS (provided you changed it back before return), etc...


----------



## Pandora's Box

Checked the back of my PNY 1080 Ti. No stickers over the screws on the backplate. No idea about underneath the backplate though


----------



## Slackaveli

Quote:


> Originally Posted by *BigMack70*
> 
> You guys are way too paranoid. Nobody can tell what you did with your card as long as you restore it to its original condition (factory default BIOS + original cooler). Nobody can tell if you overclocked, put a water block on it, changed the BIOS (provided you changed it back before return), etc...


Well, I'm definitely sneaky, but how hard is it to get that sticker off? Because screwdriver marks in it are a dead giveaway.


----------



## Slackaveli

Quote:


> Originally Posted by *Pandora's Box*
> 
> Checked the back of my PNY 1080 Ti. No stickers over the screws on the backplate. No idea about underneath the backplate though


best way to do a repaste if you're interested.


----------



## Pandora's Box

Gotta love buying new GPUs straight away - your "old" one sells instantly. This is easily the best way to reduce the cost of buying GPUs and staying on the bleeding edge. Sold my GTX 1080 (Founders) for $400 just now.


----------



## AstinGC90

I have used a pair of fine-pointed pliers in the past to pinch the edges of the screw and turn it carefully without touching the stickers. Also be mindful of any board components as you spin it out using this method.


----------



## MURDoctrine

Quote:


> Originally Posted by *KickAssCop*
> 
> Backordered on newegg for an ASUS 1080 Ti. Hopefully it ships out on Monday or earlier.


If you back ordered you are probably going to be waiting longer than Monday.


----------



## kevindd992002

Quote:


> Originally Posted by *BigMack70*
> 
> It's your PC and your money so you can do whatever you want, but I would not buy a 1080 Ti until I had the higher resolution screen already in hand. Particularly if I already had a 1070... it's not like there are games the 1070 struggles in at 1080p.


Quote:


> Originally Posted by *stocksux*
> 
> If you go down the road of the 1080 Ti, just know that the money you shell out for the card won't get you its peak performance, as your CPU will hold it back. So let's say your "future" monitor upgrade gets held up and the next generation of GPU comes out before you can upgrade your monitor. You've essentially thrown away money on the Ti. Assuming the 2080 or maybe Volta comes out, you'll want to go with one of those. But then you're spending money on a GPU again instead of a monitor. IMO, try and save up to be able to do it all at once to get the most time out of current tech before bigger and better comes along. FWIW, a monitor will likely last you three to four builds if not more, but the GPU will likely be replaced every build or every other.


Since I'm using a 1080p 144Hz monitor, my aim is always at least 144fps. There are still other AAA games that would benefit from a 1080 and my original plan was really to upgrade to a 1080 until 1080Ti came along. I was told that my overclocked CPU will not bottleneck a 1080, is that accurate? Is the performance difference between a 1080 and a 1080Ti enough to bottleneck my CPU?

I can probably find a way to upgrade my monitor to at least a 1440p 144Hz G-Sync one but then my GPU (assuming I already go with the 1080Ti) would still be bottlenecked by my CPU and I also need to upgrade my board and RAM. So I guess I'm not really sure if it's wiser to upgrade to a 1080 or a 1080Ti considering these circumstances. Thoughts?


----------



## KraxKill

Quote:


> Originally Posted by *kevindd992002*
> 
> Since I'm using a 1080p 144Hz monitor, my aim is always at least 144fps. There are still other AAA games that would benefit from a 1080 and my original plan was really to upgrade to a 1080 until 1080Ti came along. I was told that my overclocked CPU will not bottleneck a 1080, is that accurate? Is the performance difference between a 1080 and a 1080Ti enough to bottleneck my CPU?
> 
> I can probably find a way to upgrade my monitor to at least a 1440p 144Hz G-Sync one but then my GPU (assuming I already go with the 1080Ti) would still be bottlenecked by my CPU and I also need to upgrade my board and RAM. So I guess I'm not really sure if it's wiser to upgrade to a 1080 or a 1080Ti considering these circumstances. Thoughts?


Since Nvidia introduced supersampling, bottlenecking is not really a thing. You can always force additional work onto the card this way. If you are at max fps or above 144fps and your GPU is underutilized, you can simply ask the card to render at 133% or higher and downsample to your display from there. This looks better than native and will add additional load onto your GPU if you see it underutilized.
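To put numbers on that (a rough sketch; whether a render-scale percentage applies per axis, as most in-game sliders do, or to total pixel area, as NVIDIA's DSR factors do, is an assumption captured in the `per_axis` flag):

```python
# Estimate relative GPU pixel load when supersampling above native.
def pixel_load(width, height, scale_pct, per_axis=True):
    """Pixels rendered per frame at a given render scale.

    per_axis=True treats scale_pct as a per-axis percentage
    (common for in-game sliders); False treats it as an area
    multiplier (how DSR factors are usually expressed).
    """
    s = scale_pct / 100.0
    factor = s * s if per_axis else s
    return int(width * height * factor)

native = pixel_load(1920, 1080, 100)   # 2,073,600 px
scaled = pixel_load(1920, 1080, 133)   # ~3.67M px
print(scaled / native)                 # ~1.77x the shading work
```

So a "mere" 133% render scale is already asking the GPU to shade roughly three-quarters more pixels per frame, which is why it soaks up spare headroom so effectively.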


----------



## MURDoctrine

Quote:


> Originally Posted by *Renairy*
> 
> This has to be the least exciting launch for me and i think it has to do with the games that are currently available and the games that should be.


Well I'm looking forward to Mass Effect Andromeda in its buttery smooth glory that will come from the card. At least I hope so haha.


----------



## kevindd992002

Quote:


> Originally Posted by *KraxKill*
> 
> Since Nivida introduced Super Sampling bottlenecking is not really a thing. You can always force additional work on the card this way. If you are max fps or above 144fps and your GPU is underutilized I can assure you that you can simply ask the card to render at 133% or higher and downsample to your display from there. This looks better than native and will add additional load onto your GPU if by some chance you see underutilized.


I've tried DSR a few months ago and there were numerous disadvantages with it. It's just simply not better than native. I've created a thread about it before, I'll post it when I have the chance to find it.


----------



## TwinParadox

Hi guys. As I'm a bios modder, is there anyone who can send me a bios dump of his GTX 1080 Ti please ? I'd like to try to mod it.


----------



## becks

Quote:


> Originally Posted by *Agueybana_II*
> 
> Decided to reach out to Gigabyte, ASUS, PNY, MSI, EVGA, ZOTAC to asked the following question regarding WARRANTY
> 
> My email is in regards to [Company] warranty for the 1080TI FE. Does [Company] voids the warranty of the GPU after an aftermarket water block is installed?
> 
> ****Responses*****
> 
> EVGA stated:
> Hello,
> 
> As long as you are able to put the original cooler back on the card when sending it in for replacement there shouldn't be any issues with warranty. Simply removing the cooler does not void your warranty. If you have any other questions, please let us know.
> 
> Regards,
> EVGA
> 
> Will update as manufacture provide response.





Spoiler: MSI Response



Dear CV

Thank you for your question
You will not void the warranty by doing so
Please note should you need to return the card to service, it should be complete and without any mechanical damage

Thank you and Best regards

MSI Mainboard and VGA Customer Service


----------



## BURGER4life

Quote:


> Originally Posted by *stocksux*
> 
> Anyone know anything about this sli bridge from asus? I can't find it anywhere to buy. Has it not been released yet?


https://www.caseking.de/en/asus-rog-sli-bridge-hb-2-way-60-mm-gcas-200.html


----------



## KedarWolf

Quote:


> Originally Posted by *TwinParadox*
> 
> Hi guys. As I'm a bios modder, is there anyone who can send me a bios dump of his GTX 1080 Ti please ? I'd like to try to mod it.


Done with latest version of nvflash, BIOS date 1/18/2017 Gigabyte 1080 Ti FE.

1080Ti.zip 156k .zip file


```
Identifying EEPROM...
EEPROM ID (EF,6013) : WBond W25Q40EW 1.65-1.95V 4096Kx1S, page
Reading adapter firmware image...
IFR Data Size : 2184 bytes
IFR CRC32 : A73AE6C5
IFR Image Size : 2560 bytes
IFR Image CRC32 : 432F79F3
IFR Subsystem ID : 10DE-120F
Image Size : 260096 bytes
Version : 86.02.39.00.01
~CRC32 : E303990C
Image Hash : 901B1B9D77096B6AEA31138C207A1F2E
Subsystem ID : 10DE-120F
Hierarchy ID : Normal Board
Chip SKU : 350-0
Project : G611-0050
CDP : N/A
Build Date : 01/18/17
Modification Date : 01/18/17
Build GUID : 402D5B72FFFFFFB8FFFFFFA34DFFFFFFFBFFFFFFBEFFFFFFB8FFFFFFA9FFFFFFC5FFFFFFCDFFFFFF99FFFFFFEE31
UEFI Support : Yes
UEFI Version : 0x30006 (Oct 26 2016 @ 21305424 )
UEFI Variant Id : 0x0000000000000007 ( GP1xx )
UEFI Signer(s) : Microsoft Corporation UEFI CA 2011
InfoROM Version : G001.0000.01.04
InfoROM Backup Exist : NO
License Placeholder : Absent
GPU Mode : N/A
Saving of image completed.
```


----------



## TwinParadox

Quote:


> Originally Posted by *KedarWolf*
> 
> Done with latest version of nvflash, BIOS date 1/18/2017 Gigabyte 1080 Ti FE.


Thanks mate.


----------



## undercoverb0ss

So because I removed the 980ti's hybrid cooler and put it onto my 1080ti, what should I do with the 980ti?

I see two options, wait for EVGA to release a 1080ti specific one, then return the 980ti to its original state and then try to sell it.

OR try to sell it without the cooler for super cheap?

I really don't need a spare GPU.


----------



## pharcycle

According to EK their Titan X water blocks should fit the 1080 Ti (https://www.ekwb.com/news/existing-ek-fc-titan-x-pascal-compatible-nvidia-geforce-gtx-1080-ti/) or do you really want EVGA's hydro solution?

I would sell it as original as possible


----------



## undercoverb0ss

Quote:


> Originally Posted by *pharcycle*
> 
> According to EK their Titan X water blocks should fit the 1080 Ti (https://www.ekwb.com/news/existing-ek-fc-titan-x-pascal-compatible-nvidia-geforce-gtx-1080-ti/) or do you really want EVGA's hydro solution?
> 
> I would sell it as original as possible


Full waterblock cost + everything else I will need to complete the cooling is ~200-300$, right? And it's not going to give me better performance than I'm seeing. I can't get past 2063Mhz right now and my max temp is 38C.


----------



## KickAssCop

Sell the 980 Ti without the cooler. I am sure it will be scooped up fast if you price it right.


----------



## undercoverb0ss

Quote:


> Originally Posted by *KickAssCop*
> 
> Sell the 980 Ti without the cooler. I am sure it will be scooped up fast if you price it right.


Yea that's what I'm leaning towards since EVGA hasn't even talked about a 1080ti kit and I'd rather get rid of it sooner than later.

You think $200 shipped makes sense or should I go lower?


----------



## KickAssCop

200 is a fantastic price for this card.


----------



## pharcycle

Quote:


> Originally Posted by *undercoverb0ss*
> 
> Full waterblock cost + everything else I will need to complete the cooling is ~200-300$, right? And it's not going to give me better performance than I'm seeing. I can't get past 2063Mhz right now and my max temp is 38C.


Ahh, sorry hadn't really read everything properly and didn't see the 'hybrid' part! I thought you had taken a full water block from the 980 ti and put it on your 1080 ti because you couldn't buy 1080 ti blocks just yet (hence the info about the titan x blocks fitting).

I'll shut up now!


----------



## BigMack70

Quote:


> Originally Posted by *undercoverb0ss*
> 
> Full waterblock cost + everything else I will need to complete the cooling is ~200-300$, right? And it's not going to give me better performance than I'm seeing. I can't get past 2063Mhz right now and my max temp is 38C.


Yup. The performance increases you seek might come from switching to a custom BIOS as soon as someone makes one with a higher power target / voltage. But the slight temperature difference you would see between AIO and custom loop will not yield performance gains.

Normal (i.e. not subzero) overclocking is not temperature limited on water - either AIO or custom loop.


----------



## khemist

Just arrived!.


----------



## hotrod717

Quote:


> Originally Posted by *vmanuelgm*
> 
> I can do 2200 on water, more or less...


My point exactly - I can do...


----------



## pantsoftime

Quote:


> Originally Posted by *Slackaveli*
> 
> well, im definitely sneaky, but, how hard is it to get that sticker off because screwdriver marks in that is a dead giveaway.


If it's anything like the 1080FE, there are zero warranty stickers that get in the way of removing the cooler. I have an Nvidia branded one and when I put a waterblock on it there were no stickers to deal with at all. Still waiting on Ti to arrive to confirm how that one's setup.


----------



## undercoverb0ss

Quote:


> Originally Posted by *pantsoftime*
> 
> If it's anything like the 1080FE, there are zero warranty stickers that get in the way of removing the cooler. I have an Nvidia branded one and when I put a waterblock on it there were no stickers to deal with at all. Still waiting on Ti to arrive to confirm how that one's setup.


Can confirm - encountered none with my 1080ti from nvidia.


----------



## Jason33w

So, I ordered mine from Newegg last night. It's backordered which is fine, but it says it usually ships in 3 to 5 days. How accurate do you think that is?


----------



## bastian

The MSI Gaming X 1080 Ti will be mine.


----------



## stocksux

Quote:


> Originally Posted by *Alvarado*
> 
> So did I screw myself over by getting one of these if I don't plan on watercooling? By that I mean the FE version. Read through a handful of reviews that talked about the 84c (that sounds scary by the way) but didn't think much about it till today. My current 980 ti gets to just under 70c under heavy load.


You sure did. Return it once the partner boards come out?


----------



## stocksux

Quote:


> Originally Posted by *SlimJ87D*
> 
> This is a stupid question... But what screw sizes do we need to take apart the 1080 Ti? Thank you.


You'll need a #00 Phillips head screwdriver and a 4mm socket/nut driver


----------



## becks

Quote:


> Originally Posted by *stocksux*
> 
> You'll need a #00 Phillips head screwdriver and a 4mm socket/nut driver


You need iFixit in your life


----------



## stocksux

Quote:


> Originally Posted by *kevindd992002*
> 
> Since I'm using a 1080p 144Hz monitor, my aim is always at least 144fps. There are still other AAA games that would benefit from a 1080 and my original plan was really to upgrade to a 1080 until 1080Ti came along. I was told that my overclocked CPU will not bottleneck a 1080, is that accurate? Is the performance difference between a 1080 and a 1080Ti enough to bottleneck my CPU?
> 
> I can probably find a way to upgrade my monitor to at least a 1440p 144Hz G-Sync one but then my GPU (assuming I already go with the 1080Ti) would still be bottlenecked by my CPU and I also need to upgrade my board and RAM. So I guess I'm not really sure if it's wiser to upgrade to a 1080 or a 1080Ti considering these circumstances. Thoughts?


If you want to push 144fps on said 144Hz panel at 1440p, the 1080 Ti is your better bet. The 1080 will certainly handle 1440p, but maybe not at a constant 144fps all the time. The bottleneck from the CPU won't be ridiculous. If you're on the upgrade path and cash isn't growing on trees, you may benefit more from saving the $200, getting a 1080, and putting the extra toward a CPU, motherboard, RAM, or monitor. Sounds like you're in complete rebuild mode. Try out PCPartPicker and put together multiple builds, then step back and look at them. Try some around the 1080 Ti and some around the 1080. You can even save them and link to them if you want others to see.


----------



## stocksux

Quote:


> Originally Posted by *BURGER4life*
> 
> https://www.caseking.de/en/asus-rog-sli-bridge-hb-2-way-60-mm-gcas-200.html


Looks like it's not available until March 17, 2017. If it drops soon I guess I'll have to get another 1080ti. Nice looking piece.


----------



## KraxKill

What's everyone's assessment of their memory?

Are the 1080 Ti mem chips truly binned faster than those on the Titan XP? Or are they simply the same Micron chips clocked higher?

I'm at 2062.5 on the core and +600 (12.2GHz effective) on the memory. Unlike with the vanilla 1080, my scores continue to increase at higher mem clocks, but I get artifacts at +625 and +650.

I'm at 28-40C depending on the benchmark, load and ambients.

X41 AIO on the die, noctuas in push/pull with the stock FE mid plate + fan retained over the mem + vrm.


----------



## Pandora's Box

I gained a whole 8 points in firestrike extreme by adding +350 to mem. Meh not worth it.


----------



## stocksux

I returned my Titan XP for this 1080 Ti and was able to get +800 mem on it. The 1080 Ti's memory comes out of the box a gig (Gbps) faster than the Titan's, so there's not as much room to OC on the 1080 Ti. Both cores clocked the same. There seems to be no difference in performance despite Nvidia's claims of newer, better Micron memory. When overclocked it's the same as the Titan XP, but with one less gig of total memory (11GB vs 12GB). Which also means the total bandwidth of the two cards favors the Titan XP when both are overclocked to their limits.
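The bandwidth claim is easy to sanity-check with back-of-envelope arithmetic (a sketch; the stock figures of 352-bit/11Gbps for the 1080 Ti and 384-bit/10Gbps for the Titan X Pascal are the published specs, while the matched 12Gbps overclock is illustrative, since how an offset maps to effective rate varies by tool):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective data rate).
def bandwidth_gbs(bus_bits, effective_gbps):
    return bus_bits / 8 * effective_gbps

print(bandwidth_gbs(352, 11))   # 1080 Ti stock: 484.0 GB/s
print(bandwidth_gbs(384, 10))   # Titan X Pascal stock: 480.0 GB/s
# With both pushed to the same ~12 Gbps effective, the wider bus wins:
print(bandwidth_gbs(352, 12))   # 1080 Ti OC: 528.0 GB/s
print(bandwidth_gbs(384, 12))   # Titan XP OC: 576.0 GB/s
```

In other words, the two cards are within ~1% at stock, but at matched memory clocks the Titan XP's 384-bit bus pulls ahead, which is the point being made above.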


----------



## KedarWolf

Quote:


> Originally Posted by *stocksux*
> 
> I returned my Titan xp for this1080ti and was able to get +800 mem on it. This 1080ti comes out of the box a gig faster than the Titan so not as much room to OC on the 1080ti. Both cores clocked the same. There is no difference it seems in performance despite nvidias claims of newer better micron memory. When overclocked it's the same as Titan XP but at one less gig overall of total memory (11GB vs 12GB). Which also means the total bandwidth of the two cards favors the Titan XP when both are overclocked to their limits


Run TimeSpy and experiment; I found better benchmark results running at +520 than +650. Or loop Heaven and adjust the memory on the fly.
At some point your framerate should jump by 15 FPS or so, and then you know your memory is optimized.

Apparently, the higher you go, it can actually slow things down.
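A trivial sketch of the bookkeeping this suggests: keep the offset with the best measured score, not the highest offset that survived. The (offset, score) pairs below are hypothetical benchmark results, not real data:

```python
# Pick the memory offset that produced the best benchmark score.
# Throughput can fall off past a point (e.g. GDDR5X error retries),
# so "largest stable offset" is not the same as "fastest offset".
def best_offset(results):
    """results: list of (offset_mhz, score) pairs from repeated runs."""
    return max(results, key=lambda pair: pair[1])[0]

runs = [(0, 100.0), (260, 104.5), (520, 107.8), (650, 105.1)]
print(best_offset(runs))  # 520 - the higher offset scored worse
```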


----------



## LukkyStrike

Quote:


> Originally Posted by *Cuenda*
> 
> And why about the differ performance between the FE and Customs one? would be mainly temps?
> At what resolution?


1440p


----------



## fisher6

I will get the card and block tomorrow. Would a 4790k at 4.8GHz be a bottleneck for the 1080 Ti at 3440x1440?


----------



## fat4l

Sooo.
I'm hoping the situation where the FE clocks better than any AIB card won't happen again.
I'm thinking of getting a custom-PCB card, but I'm kind of scared it won't clock as well as an FE card.

The card would be under water.
Any thoughts?


----------



## Baasha

Looks like I'll need to replace my Antec HCP-1300 Platinum with the EVGA 1600 T2 PSU since the 1300W is not enough to power 4x 1080 Ti.

Hmm.. but I love me some GPU sammich!
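A rough back-of-envelope check on why 1300W gets tight with four cards (the per-component wattages and the 20% margin below are assumptions for illustration, not measurements):

```python
# Estimated sustained system draw vs PSU capacity, with safety headroom.
def psu_check(gpu_count, gpu_watts=250, cpu_watts=150,
              platform_watts=100, psu_watts=1300, headroom=0.20):
    """Returns (estimated_draw_watts, fits_within_budget)."""
    draw = gpu_count * gpu_watts + cpu_watts + platform_watts
    budget = psu_watts * (1 - headroom)   # keep ~20% margin for spikes
    return draw, draw <= budget

print(psu_check(4))  # (1250, False) - over a 20%-margin budget on 1300 W
```

Four ~250W cards alone eat 1000W before the CPU and platform, and overclocking or transient spikes only make it worse, hence the jump to a 1600W unit.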


----------



## Slackaveli

Quote:


> Originally Posted by *pantsoftime*
> 
> If it's anything like the 1080FE, there are zero warranty stickers that get in the way of removing the cooler. I have an Nvidia branded one and when I put a waterblock on it there were no stickers to deal with at all. Still waiting on Ti to arrive to confirm how that one's setup.


great news then. ty


----------



## Slackaveli

Quote:


> Originally Posted by *Baasha*
> 
> Looks like I'll need to replace my Antec HCP-1300 Platinum with the EVGA 1600 T2 PSU since the 1300W is not enough to power 4x 1080 Ti.
> 
> Hmm.. but I love me some GPU sammich!


3x more vram than system memory ;p


----------



## GRABibus

Quote:


> Originally Posted by *Baasha*
> 
> Looks like I'll need to replace my Antec HCP-1300 Platinum with the EVGA 1600 T2 PSU since the 1300W is not enough to power 4x 1080 Ti.
> 
> Hmm.. but I love me some GPU sammich!


----------



## hokk

OC'd memory,
card died.

Well guys, back to the 980 Ti till I can RMA.


----------



## KraxKill

Quote:


> Originally Posted by *kylzer*
> 
> Oc memory
> card dies
> 
> well guys back to the 980ti till i can rma


Never knew a card to die from a memory OC. I guess that's possible. You sure it's dead? What did you set?


----------



## D13mass

Guys, one question: if I'm not mistaken, I can buy any of the Founders Edition cards (ASUS/Zotac/MSI/EVGA, etc...) and, for example, this Watercool HEATKILLER® IV for TITAN X (Pascal) - ACRYL water block? Is it compatible, and can I use any water block that was created for the Titan (Pascal)?


----------



## MrTOOSHORT

You never know if all the thermal pads are installed from the factory, and the paste can be applied way too thick or too thin. You almost have to take the cooler off when buying a new card just to see.

Quote:


> Originally Posted by *D13mass*
> 
> Guys, one question, if I`m not mistaken I can buy any of Founders edition cards (Asus/Zotac/MSI, Evga, etc ...) and for example this water block Watercool HEATKILLER® IV for TITAN X (Pascal) - ACRYL ? Is it compatible and can I use any water blocks which was created for Titan Pascal ?
> 
> Am I right?
> 
> Do you have any suggestions related full cover blocks ?


Yes, same PCB as the TXP, so you can use any TXP block on the 1080 Ti.


----------



## tconroy135

Quote:


> Originally Posted by *D13mass*
> 
> Guys, one question, if I`m not mistaken I can buy any of Founders edition cards (Asus/Zotac/MSI, Evga, etc ...) and for example this water block Watercool HEATKILLER® IV for TITAN X (Pascal) - ACRYL ? Is it compatible and can I use any water blocks which was created for Titan Pascal ?
> 
> Am I right?
> 
> Do you have any suggestions related full cover blocks ?


I would assume that water block would work, but EKWB has confirmed their Titan XP block is compatible.


----------



## hokk

Quote:


> Originally Posted by *KraxKill*
> 
> Never knew a card to die from a memory OC. I guess that's possible. You sure it's dead? What do you set?


+250MHz,

and it blue-screened.

Now it won't display anything from the DP ports.

Only the HDMI port works, which I tried, and it outputs a corrupt picture with a weird purple colour.

I've tried it in a different PC with the same issue.


----------



## Outcasst

Quote:


> Originally Posted by *kylzer*
> 
> 250mhz
> 
> and it blue screened
> 
> now it won't display anything from the DP ports
> 
> only the hdmi port works which i tried and outputs a corruput picture with a weird purple colour
> 
> i've tried it on a different pc with same issue.


I just tried to push mine to +200 memory and my PC completely locked up.

Thankfully the card is still working...


----------



## D13mass

Removed


----------



## Cybertox

The Founders Edition has no DVI, only HDMI and DisplayPort. My monitor supports HDMI and DVI; however, its HDMI only goes up to 1920x1080, so I need DVI to run the monitor at its native resolution as far as I am aware. So how the hell am I supposed to connect my card if I were to get one? I read there is a DisplayPort-to-DVI adapter included; can someone comment on that?


----------



## Slackaveli

Quote:


> Originally Posted by *Outcasst*
> 
> I just tried to push mine to +200 memory and my PC completely locked up.
> 
> Thankfully the card is still working...


Wow, this is crazy. You and the guy who killed his. I had heard that bumping vram was pointless on this card, but never that it was harmful. DON'T TELL NVIDIA that you OC'd it if you guys end up having to RMA.


----------



## Slackaveli

Quote:


> Originally Posted by *Cybertox*
> 
> The Founder's Edition has no DVI, only HDMI and DisplayPort. My monitor supports HDMI and DVI, however HDMI only goes up to 1920x1080, so I need DVI to run the monitor at its native resolutions as far as I am aware. So how the hell am I supposed to connect my card if I were to get one? I read there is an DisplayPort to DVI adapter included, can someone comment on that?


HDMI goes well past 1080p. I am on [email protected] right now via HDMI 2.0.


----------



## tconroy135

Quote:


> Originally Posted by *Cybertox*
> 
> The Founder's Edition has no DVI, only HDMI and DisplayPort. My monitor supports HDMI and DVI; however, HDMI only goes up to 1920x1080, so I need DVI to run the monitor at its native resolution as far as I am aware. So how the hell am I supposed to connect my card if I were to get one? I read there is a DisplayPort-to-DVI adapter included; can someone comment on that?


The monitor you list in your sig rig has a Display Port http://www.samsung.com/uk/monitors/led-s27b970d/


----------



## Pandora's Box

Quote:


> Originally Posted by *Outcasst*
> 
> I just tried to push mine to +200 memory and my PC completely locked up.
> 
> Thankfully the card is still working...


Mine locked up at +400. Artifacts all over the screen too. Yeah, leaving memory well enough alone.


----------



## Slackaveli

Quote:


> Originally Posted by *undercoverb0ss*
> 
> Yea that's what I'm leaning towards since EVGA hasn't even talked about a 1080ti kit and I'd rather get rid of it sooner than later.
> 
> You think $200 shipped makes sense or should I go lower?


got $350 out of my 980ti hybrid on ebay in a matter of 3 hrs priced buy now. 3 days ago.


----------



## Cybertox

Quote:


> Originally Posted by *tconroy135*
> 
> The monitor you list in your sig rig has a Display Port http://www.samsung.com/uk/monitors/led-s27b970d/


You are right, I totally overlooked that. Thanks for the heads up. I assume DisplayPort goes up to 2560x1440, correct?


----------



## Slackaveli

Quote:


> Originally Posted by *Pandora's Box*
> 
> Mine locked up at +400. Artifacts all over the screen too. Yeah, leaving memory well enough alone.


So weird. Further up the thread we've got guys saying they get +520, +600, and +800, and that they see performance bumps all the way up (some say up until +500). And then we have a bricked card at +250. Seeing artifacts is fine; that is what should happen when you go too high. I usually go up until I see artifacts, then go back down 50-100.
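
That step-up-then-back-off procedure can be sketched in a few lines. This is only the search logic; `artifacts_at` is a hypothetical stand-in for whatever stress test you actually run, not a real API:

```python
def find_stable_offset(artifacts_at, step=50, limit=800, backoff=100):
    """Raise the memory offset in small steps until artifacts appear,
    then back off by a safety margin (the 50-100 rule above)."""
    offset = 0
    while offset + step <= limit:
        offset += step
        if artifacts_at(offset):        # first offset that shows artifacts
            return max(0, offset - backoff)
    return offset                       # never artifacted within the limit
```

For example, if a card first artifacts at +450, this settles on +350.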


----------



## Slackaveli

Quote:


> Originally Posted by *Cybertox*
> 
> You are right, I totally overlooked that. Thanks for the heads up. I assume Display Port goes up to 2560x1440, correct?


HDMI and DisplayPort both go to 5K.


----------



## tconroy135

Quote:


> Originally Posted by *Cybertox*
> 
> You are right, I totally overlooked that. Thanks for the heads up. I assume Display Port goes up to 2560x1440, correct?


Yes.


----------



## Cybertox

Quote:


> Originally Posted by *Slackaveli*
> 
> HDMI and DisplayPort both go to 5K.


While I am not entirely sure, as it was quite some time ago, when I was setting up my monitor with the computer I had difficulties getting 2560x1440 via HDMI and could only do 1920x1080 for some reason. So due to that I used DVI; I never had the chance to try out DisplayPort, though. I might have to test it again with the HDMI cable that is included. I do not think it's HDMI 2.0, though.

Is there any difference between using HDMI or DisplayPort?


----------



## Slackaveli

Quote:


> Originally Posted by *Cybertox*
> 
> While I am not entirely sure, as it was quite some time ago, when I was setting up my monitor with the computer I had difficulties getting 2560x1440 via HDMI and could only do 1920x1080 for some reason. So due to that I used DVI; I never had the chance to try out DisplayPort, though. I might have to test it again with the HDMI cable that is included. I do not think it's HDMI 2.0, though.
> 
> Is there any difference between using HDMI or DisplayPort?


Well, DP is better unless you have an HDMI 2.0 port and a quality cable, so I'd recommend DP. It's pretty great and will easily do 1440p/144Hz.


----------



## pantsoftime

Quote:


> Originally Posted by *Cybertox*
> 
> While I am not entirely sure, as it was quite some time ago, when I was setting up my monitor with the computer I had difficulties getting 2560x1440 via HDMI and could only do 1920x1080 for some reason. So due to that I used DVI; I never had the chance to try out DisplayPort, though. I might have to test it again with the HDMI cable that is included. I do not think it's HDMI 2.0, though.
> 
> Is there any difference between using HDMI or DisplayPort?


Some older graphics cards can't do >1080p over HDMI. The 1080Ti definitely can so you won't run into that issue.

HDMI and DisplayPort are different standards but are comparable in most ways. DisplayPort currently supports additional features like adaptive sync and some higher resolutions (above 4K).
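
To put rough numbers on that, here's a back-of-the-envelope sketch. The maximum data rates are the commonly quoted post-encoding figures for each link, and the flat 20% blanking overhead is an approximation rather than an exact CVT timing, so treat the output as a sanity check only:

```python
# Commonly quoted max *data* rates (after encoding overhead), in Gbit/s.
LINK_GBPS = {"HDMI 1.4": 8.16, "HDMI 2.0": 14.4, "DP 1.2": 17.28}

def mode_gbps(width, height, hz, bpp=24, blanking=1.2):
    """Approximate bit rate of a video mode, with ~20% blanking overhead."""
    return width * height * hz * bpp * blanking / 1e9

def links_that_fit(width, height, hz):
    """Names of the links above that can carry the mode, alphabetically."""
    need = mode_gbps(width, height, hz)
    return [name for name, cap in sorted(LINK_GBPS.items()) if cap >= need]

print(round(mode_gbps(2560, 1440, 60), 1))  # ~6.4 Gbit/s: fits every link here
print(links_that_fit(2560, 1440, 144))      # only DP 1.2 under these assumptions
```

With these numbers, 1440p60 fits even the older links, which matches the point that the older card, not the cable standard, was usually the 1080p-over-HDMI bottleneck.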


----------



## pantsoftime

Can anyone comment on power limiting on these cards (without shunt mod)? On a regular 1080FE I hit the power limit continuously. I'm wondering how much of an issue that is with the 1080Ti FE - in particular under water.


----------



## al210

Quote:


> Originally Posted by *kylzer*
> 
> 250mhz
> 
> and it blue screened
> 
> now it won't display anything from the DP ports
> 
> Only the HDMI port works, which I tried, and it outputs a corrupt picture with a weird purple colour.
> 
> I've tried it on a different PC with the same issue.


I highly doubt that it died from the OC. I suspect something was weak and just failed. It happens; sorry to hear it. No product has a 0% failure rate.


----------



## pantsoftime

For those of you looking to unlock voltage control in Afterburner, it is possible by editing a file.
Quote:


> Originally Posted by *guru3d*
> 
> Add the card to third party database yourself (edit MSIAfterburner.oem2 included in the last official build) then select third party voltage control mode in properties and adding the following will do the trick.
> 
> [VEN_10DE&DEV_1B06&SUBSYS_120F10DE&REV_??]
> VDDC_Generic_Detection = 1


Source: http://www.guru3d.com/articles-pages/geforce-gtx-1080-ti-review,32.html


----------



## alucardis666

Damn, a dead card from a minor OC, huh?

Makes me worry about pushing mine once I get it. :-(


----------



## tconroy135

Quote:


> Originally Posted by *alucardis666*
> 
> Damn. Dead card from minor OC huh?
> 
> Makes me worry about pushing mine once I get it :-(


I would imagine there was a good chance, if it is the memory, that it would have failed at some point regardless. We know they pushed the memory speed pretty far on these cards so let's just hope they don't fail at stock.


----------



## sWaY20

Haven't seen anyone mention what brand of memory is coming with their cards. Would be nice to know whether it's Hynix or Samsung, and which brand of card it came on.


----------



## Cybertox

Thanks for the clarification guys!

Now it's time to decide whether to get the Founder's Edition or wait for custom cards. Out of the currently available ones I do not see one that I particularly like. The EVGA 1080 Ti FTW3 Gaming has a very slick design, but I am not sure whether I really need the additional features it provides. Any suggestions on which custom card to get or wait for? The rest look pretty hideous in terms of design.

I am definitely upgrading to a 1080 Ti though, time to retire my 290X and transition to Nvidia.


----------



## alucardis666

Quote:


> Originally Posted by *tconroy135*
> 
> I would imagine there was a good chance, if it is the memory, that it would have failed at some point regardless. We know they pushed the memory speed pretty far on these cards so let's just hope they don't fail at stock.


I'd have imagined they would've done some extensive testing on the memory used and chosen the best for the job, rather than just overclocking it and creating the stock BIOS. I'm still gonna OC my card to get the most out of it that I can; just gonna take my time and go slow to avoid killing it if possible. I've had my 1080 overclocked since day 1 with no issues.


----------



## stocksux

Quote:


> Originally Posted by *fat4l*
> 
> Sooo.
> I'm hoping that the same situation, where the FE clocks better than any AIB card, won't happen again.
> I'm thinking of getting a custom-PCB card, but I'm kind of scared that it will not clock as well as the FE card.
> 
> The card would be under water.
> Any thoughts?


Honestly, it's a roll of the dice either way you go. You'll have more options for blocks with the FE.


----------



## becks


MSI GeForce GTX 1080Ti Sea Hawk EK X 11264MB GDDR5X

Zotac Geforce GTX 1080Ti ArcticStorm 11264MB GDDR5X


----------



## stocksux

Quote:


> Originally Posted by *D13mass*
> 
> Guys, one question: if I'm not mistaken, I can buy any of the Founders Edition cards (Asus/Zotac/MSI, EVGA, etc.) and, for example, this water block, the Watercool HEATKILLER® IV for TITAN X (Pascal) - ACRYL? Is it compatible, and can I use any water block that was created for the Titan (Pascal)?


That's correct


----------



## stocksux

Quote:


> Originally Posted by *Cybertox*
> 
> The Founder's Edition has no DVI, only HDMI and DisplayPort. My monitor supports HDMI and DVI, however HDMI only goes up to 1920x1080, so I need DVI to run the monitor at its native resolutions as far as I am aware. So how the hell am I supposed to connect my card if I were to get one? I read there is an DisplayPort to DVI adapter included, can someone comment on that?


There is one included in the box. Plug the DP side into the card and then connect your DVI cable into the other end of the adapter. Easy peasy


----------



## stocksux

Quote:


> Originally Posted by *pantsoftime*
> 
> Can anyone comment on power limiting on these cards (without shunt mod)? On a regular 1080FE I hit the power limit continuously. I'm wondering how much of an issue that is with the 1080Ti FE - in particular under water.


It's still there. You'll hit power limits well before thermal limit. Weeeeellllllllllllll before.


----------



## pantsoftime

Quote:


> Originally Posted by *sWaY20*
> 
> Haven't seen anyone mention what brand memory is coming with their cards. Would be nice to know hynix or Sammy and what brand card it came on.


Micron is still the only supplier of GDDR5X memory. All 1080's, 1080Ti's, and TitanXPs have Micron memory.


----------



## trawetSluaP

Quote:


> Originally Posted by *stocksux*
> 
> It's still there. You'll hit power limits well before thermal limit. Weeeeellllllllllllll before.


Unless you put it at 120%, then you hit both, haha.


----------



## AMDATI

PCper's interview with Tom Petersen from Nvidia explains the reasoning behind the power and voltage limits really well.

Long story short: Nvidia, even being conservative, considers max voltage limits to reduce the life of a card all the way down to a year, regardless of cooling. Why regardless of cooling? Because electrical current and voltage, independent of heat, degrade and almost literally evaporate components and pathways.


----------



## hokk

Quote:


> Originally Posted by *al210*
> 
> I highly doubt that it died from OC. I suspect something was weak and just failed. It happens, sorry to hear. No product has 100% no failure rate.


Yes, the memory was weak.


----------



## dboythagr8

Quote:


> Originally Posted by *Baasha*
> 
> Looks like I'll need to replace my Antec HCP-1300 Platinum with the EVGA 1600 T2 PSU since the 1300W is not enough to power 4x 1080 Ti.
> 
> Hmm.. but I love me some GPU sammich!


this is mainly for benchmarking purposes right? what's the gaming experience like?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *kylzer*
> 
> 250mhz
> 
> and it blue screened
> 
> now it won't display anything from the DP ports
> 
> Only the HDMI port works, which I tried, and it outputs a corrupt picture with a weird purple colour.
> 
> I've tried it on a different PC with the same issue.


Honestly, it sounds like your card was doomed to begin with. It had to be faulty from the factory.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *pantsoftime*
> 
> Can anyone comment on power limiting on these cards (without shunt mod)? On a regular 1080FE I hit the power limit continuously. I'm wondering how much of an issue that is with the 1080Ti FE - in particular under water.


The card can do 300 watts with the slider at 120%.

A lot of websites that have tested the card have seen it draw around 270 watts, so you have an extra ~30 watts to OC and play with.
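
For reference, the slider math is just TDP times the percentage; 250 W is the 1080 Ti FE's stock board power, and the function name is mine:

```python
def board_power_cap(tdp_watts, slider_percent):
    """Effective board power limit after moving the power slider."""
    return tdp_watts * slider_percent / 100

print(board_power_cap(250, 120))  # 300.0 W, the 120% figure above
print(board_power_cap(250, 100))  # 250.0 W at stock
```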


----------



## sWaY20

Quote:


> Originally Posted by *pantsoftime*
> 
> Micron is still the only supplier of GDDR5X memory. All 1080's, 1080Ti's, and TitanXPs have Micron memory.


Oh OK, I had no idea, thanks. I'm stuck in the old days.


----------



## OutlawXGP

Hey guys, I ordered my GTX 1080 Ti direct from Nvidia on the 2nd of March and it arrived on Friday. However, I'd like to know when we are going to get our free games to redeem from Nvidia. I already contacted them via live chat support twice, and the CS guy told me they should send out the codes soon after shipping. Has anyone here received their code for Ghost Recon Wildlands or For Honor yet?


----------



## KraxKill

Quote:


> Originally Posted by *Slackaveli*
> 
> So weird. Further up the thread we've got guys saying they get +520, +600, and +800, and that they see performance bumps all the way up (some say up until +500). And then we have a bricked card at +250. Seeing artifacts is fine; that is what should happen when you go too high. I usually go up until I see artifacts, then go back down 50-100.


I can run +600 with no artifacts, but I'm hunting for the optimal clock now, which appears to be somewhere between +450 and +550.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *OutlawXGP*
> 
> Hey guys, I ordered my GTX 1080 Ti direct from Nvidia on the 2nd of March and it arrived on Friday. However, I'd like to know when we are going to get our free games to redeem from Nvidia. I already contacted them via live chat support twice, and the CS guy told me they should send out the codes soon after shipping. Has anyone here received their code for Ghost Recon Wildlands or For Honor yet?


Quote:


> Originally Posted by *ManuelG_at_NVIDIA*
> 
> Users who pre-ordered their 1080 Ti and received shipping notification today will receive their promotional game code sometime next week via email.


----------



## OutlawXGP

Quote:


> Originally Posted by *MrTOOSHORT*


Quote:


> Originally Posted by *MrTOOSHORT*


Thanks a lot for clearing that up. Just one question: I am planning on reformatting my PC soon and I do not wish to install GeForce Experience on my main computer, so could I possibly use the code to redeem the game on my other PC, which already has the GeForce Experience software installed?

Thanks again.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *OutlawXGP*
> 
> Thanks a lot for clearing that up. Just one question: I am planning on reformatting my PC soon and I do not wish to install GeForce Experience on my main computer, so could I possibly use the code to redeem the game on my other PC, which already has the GeForce Experience software installed?
> 
> Thanks again.


I have no idea, I don't use Geforce Experience. I just install PhysX and the GPU driver.


----------



## dboythagr8

Quote:


> Originally Posted by *OutlawXGP*
> 
> Thanks a lot for clearing that up. Just one question: I am planning on reformatting my PC soon and I do not wish to install GeForce Experience on my main computer, so could I possibly use the code to redeem the game on my other PC, which already has the GeForce Experience software installed?
> 
> Thanks again.


Nvidia recently changed their code redemption protocol. The codes are now tied to your GPU. For instance I currently have a 1060 installed while I wait for the 1080Ti this week. Newegg emailed me a redemption code. I knew of Nvidia's redemption change, but I wanted to try anyway. So I used the code in GFE, and got hit with an error. The only way I can redeem the code is with a 1080ti in my machine.


----------



## Antsu

Has anyone tried max OC with 100% fan speed? Looking to get a taste of aftermarket OC performance. Please list the core speed and temperature.


----------



## stocksux

Quote:


> Originally Posted by *trawetSluaP*
> 
> Unless you put it at 120% then you hit both haha.


Nope, still just power. I don't break 40°C while overclocked.


----------



## Antsu

Quote:


> Originally Posted by *KraxKill*
> 
> Never knew a card to die from a memory OC. I guess that's possible. You sure it's dead? What do you set?


I actually quite recently killed my old 980 Ti (sold to a friend) with a mild bump in memory OC that used to be well within its max stable memory OC. The card was BIOS-modded to max voltage without a physical mod, but ran at a cool 55°C, so I dunno if that had an effect.


----------



## stocksux

Quote:


> Originally Posted by *OutlawXGP*
> 
> Hey guys, I ordered my GTX 1080 Ti direct from Nvidia on the 2nd of March and it arrived on Friday. However, I'd like to know when we are going to get our free games to redeem from Nvidia. I already contacted them via live chat support twice, and the CS guy told me they should send out the codes soon after shipping. Has anyone here received their code for Ghost Recon Wildlands or For Honor yet?


I picked one up at my local Micro Center on launch day and I received my code yesterday


----------



## MunneY

I'm a lil pissed.

I ordered my 2 with overnight shipping and FedEx just decided not to deliver mine on Friday. The delivery date still says "pending".

I live in the FedEx World Hub city, and they said they couldn't deliver it, nor could I pick it up.


----------



## pharcycle

Quote:


> Originally Posted by *dboythagr8*
> 
> Nvidia recently changed their code redemption protocol. The codes are now tied to your GPU. For instance I currently have a 1060 installed while I wait for the 1080Ti this week. Newegg emailed me a redemption code. I knew of Nvidia's redemption change, but I wanted to try anyway. So I used the code in GFE, and got hit with an error. The only way I can redeem the code is with a 1080ti in my machine.


I was able to redeem my Ghost Recon code yesterday and I've not received my 1080 Tis yet... I'm running a 1080, so maybe it works on any 10-series graphics card? It actually says that on the Nvidia UK website, but maybe it's different where you are?

I had trouble redeeming it most of yesterday, though; it wasn't until later that evening that I was able to get it to go through. They also emailed me the codes with my order (ordered the cards on Friday night direct from Nvidia).


----------



## trawetSluaP

Quote:


> Originally Posted by *stocksux*
> 
> Nope still just power. I don't break 40c while overclocked


Under water yeah, but not on air!


----------



## stocksux

Quote:


> Originally Posted by *trawetSluaP*
> 
> Under water yeah, but not on air!


Yeah under water. Is there any other way??


----------



## trawetSluaP

Quote:


> Originally Posted by *stocksux*
> 
> Yeah under water. Is there any other way??


Touche.

Water will be the way I go when EK release an actual 1080 Ti block!


----------



## stocksux

Quote:


> Originally Posted by *trawetSluaP*
> 
> Touche.
> 
> Water will be the way I go when EK release an actual 1080 Ti block!


They already did. The Titan XP block is exactly the same. I know this because I returned my Titan XP, and I removed the block and set it right down on top of my 1080 Ti. Now the backplate, on the other hand... says Titan X Pascal on the back of my 1080 Ti ¯\_(ツ)_/¯


----------



## y2kcamaross

Quote:


> Originally Posted by *MunneY*
> 
> I'm a lil pissed.
> 
> I ordered my 2 with overnight shipping and FedEx just decided not to deliver mine on Friday. The delivery date still says "pending".
> 
> I live in the FedEx World Hub city, and they said they couldn't deliver it, nor could I pick it up.


Yup, mine showed up yesterday, but my place of business wasn't open. For some reason FedEx Home Delivery doesn't DELIVER on MONDAYS, and for another completely ridiculous reason we weren't allowed to pick it up in person? What sense does that make?


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> I can run +600 with no artifacts, but I'm hunting for the optimal clock now, which appears to be somewhere between +450 and +550.


I'll let you know tomorrow when mine finally arrives. Keep us posted. For me, I'll probably take her up to 12 Gbps and stay there, something right around +500 if I'm able. That's about 528 GB/s of bandwidth, I guess, lol.
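
The bandwidth math, for anyone curious: effective data rate times bus width. Names are mine, and this assumes the usual reading that a +500 Afterburner offset lands the GDDR5X at an even 12 Gbps effective:

```python
def mem_bandwidth_gbs(effective_gbps, bus_width_bits=352):
    """Memory bandwidth in GB/s from effective data rate and bus width."""
    return effective_gbps * bus_width_bits / 8

print(mem_bandwidth_gbs(11))  # 484.0 GB/s at the stock 11 Gbps
print(mem_bandwidth_gbs(12))  # 528.0 GB/s at an even 12 Gbps
```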


----------



## Slackaveli

Quote:


> Originally Posted by *MunneY*
> 
> Im a lil pissed.
> 
> I ordered my 2 with overnight shipping and Fedex just decided not to deliver mine on Friday. Delivery date still says "pending"
> 
> I live in the Fedex World hub, and they said they couldn't deliver it nor could I pick it up.


Same. I'm furious with FedEx. Mine was set for Friday before noon. On Saturday morning it finally left the hub in Memphis and proceeded to drive right past my city in Arkansas on to Tulsa, Oklahoma, where it has just been sitting since Saturday night. And they refuse a refund.


----------



## bfedorov11

Well, if you're going to brick a card, at least it was done while you can still RMA through retail. One of the changes on the Ti was the power delivery to the memory; it has the ability to fluctuate more depending on state.

der8auer's video:

~5:00 for the shunt resistor locations, ~7:15 about the vmem voltage.


----------



## Slackaveli

Good unboxing vid right there.

On the dude that bricked his: man, I'm thinking it was just faulty, something with that top VRM from this video.


----------



## joeh4384

Quote:


> Originally Posted by *bastian*
> 
> The MSI Gaming X 1080 Ti will be mine.


Me too. I am going with an ITX Corsair 250D build. I will have everything but the GPU on Tuesday, and then it is going to be a lot of F5'ing when they come out.


----------



## MunneY

Quote:


> Originally Posted by *Slackaveli*
> 
> Same. I'm furious with FedEx. Mine was set for Friday before noon. On Saturday morning it finally left the hub in Memphis and proceeded to drive right past my city in Arkansas on to Tulsa, Oklahoma, where it has just been sitting since Saturday night. And they refuse a refund.


Yeah, they said it was from the storms Thursday night... it rained for 30 minutes. I'm getting my money back or I'm doing a chargeback on my CC. I don't care.


----------



## alucardis666

Quote:


> Originally Posted by *dboythagr8*
> 
> Nvidia recently changed their code redemption protocol. The codes are now tied to your GPU. For instance I currently have a 1060 installed while I wait for the 1080Ti this week. Newegg emailed me a redemption code. I knew of Nvidia's redemption change, but I wanted to try anyway. So I used the code in GFE, and got hit with an error. The only way I can redeem the code is with a 1080ti in my machine.


False, I redeemed mine with my 1080 no issue.


----------



## lilchronic

Quote:


> Originally Posted by *MunneY*
> 
> yeah they said it was from the storms thursday night... it rained for 30 minutes. I'm getting my money back or I'm charging back on my CC. I don't care.


I got refunded my Saturday shipping fees from Newegg, so if they don't get it delivered tomorrow they will have to refund my next-day shipping fees as well.


----------



## keikei

Quote:


> Originally Posted by *OutlawXGP*
> 
> Hey guys, I ordered my GTX 1080Ti Direct from Nvidia on 2nd of march and it arrived on Friday, however I like to know when are we going to get our free games to redeem from nvidia? I already contacted them using the live chat support twice and the CS guy told me they should send out the codes soon after shipping. Has any one here received their code for Ghost Recon Wild Lands or For Honor Yet?


I believe the Nvidia rep here mentioned those codes would be sent out the upcoming week.


----------



## alucardis666

Quote:


> Originally Posted by *keikei*
> 
> I believe the Nvidia rep here mentioned those codes would be sent out the upcoming week.


Which is lame, Newegg sent me my code about 5 hours after my purchase.


----------



## lilchronic

Quote:


> Originally Posted by *alucardis666*
> 
> Which is lame, Newegg sent me my code about 5 hours after my purchase.


Same, but you can't use it till you get the card.


----------



## Renairy

Anyone else think Nvidia is playing dirty towards consumers in the market? To be quite honest, users are getting rorted.
Here's why I think that.

They're bringing out GPUs in a fashion that is not only milking consumers but also charging us a lot more than we should be paying.

For example, the GTX Titan X is only on the market to give us the illusion that the 1080 Ti is cheap at $699.
On the same note, the GTX 1080 was sold at the price the 1080 Ti/Titan XP should have been sold for.

The reason they held out on the 1080 Ti for this long was that there was no competition; it was a wild trump card.
Also to compete with whatever AMD's offering is.
The Titan X brand should be boycotted entirely, and consumers need to put their foot down, hardcore.
How do you feel knowing your $$$ premium card is now obsolete because of their sly and shifty marketing tactics?

What they should be doing is releasing all versions of the same GPU at the same time. This trump-card bull is only there to benefit them whilst milking us.

1050, 1060, 1070, 1080, 1080 Ti, all released together from now on.
Notice that when I put it like that, and in order, you can see how the Titan X's release is nothing but a scam. It belongs nowhere, and it is only there to gouge and deceive.

From this day on, if you care about your hard-earned money, don't ever invest in a Titan X, and give these leeches the message that's never been sent.

A company that hires random puppets to shout out positive pricing remarks at launch events is not a company of the people.

If there is another Titan card next generation, I am switching to AMD monitors and GPUs, 100%.


----------



## BigMack70

Quote:


> Originally Posted by *Renairy*
> 
> Anyone else think Nvidia is playing dirty towards consumers in the market? To be quite honest, users are getting rorted.
> Here's why I think that.
> 
> They're bringing out GPUs in a fashion that is not only milking consumers but also charging us a lot more than we should be paying.
> 
> For example, the GTX Titan X is only on the market to give us the illusion that the 1080 Ti is cheap at $699.
> On the same note, the GTX 1080 was sold at the price the 1080 Ti/Titan XP should have been sold for.
> 
> The reason they held out on the 1080 Ti for this long was that there was no competition; it was a wild trump card.
> Also to compete with whatever AMD's offering is.
> The Titan X brand should be boycotted entirely, and consumers need to put their foot down, hardcore.
> How do you feel knowing your $$$ premium card is now obsolete because of their sly and shifty marketing tactics?
> 
> What they should be doing is releasing all versions of the same GPU at the same time. This trump-card bull is only there to benefit them whilst milking us.
> 
> 1050, 1060, 1070, 1080, 1080 Ti, all released together from now on.
> Notice that when I put it like that, and in order, you can see how the Titan X's release is nothing but a scam. It belongs nowhere, and it is only there to gouge and deceive.
> 
> From this day on, if you care about your hard-earned money, don't ever invest in a Titan X, and give these leeches the message that's never been sent.
> 
> A company that hires random puppets to shout out positive pricing remarks at launch events is not a company of the people.
> 
> If there is another Titan card next generation, I am switching to AMD monitors and GPUs, 100%.


They have been doing this ever since they achieved a midrange chip (GK104) which competed with AMD's big chip (Tahiti XT aka HD 7970). Ever since then, they've jacked up the prices on their big chips to $700-1200 and jacked up their pricing on midrange cards to $400-700.

Until AMD can actually dethrone them in the performance department, just get the lube and bend over.


----------



## Somasonic

Quote:


> Originally Posted by *Renairy*
> 
> Anyone else think Nvidia is playing dirty towards consumers in the market? To be quite honest, users are getting rorted.
> Here's why I think that.
> 
> They're bringing out GPUs in a fashion that is not only milking consumers but also charging us a lot more than we should be paying.
> 
> For example, the GTX Titan X is only on the market to give us the illusion that the 1080 Ti is cheap at $699.
> On the same note, the GTX 1080 was sold at the price the 1080 Ti/Titan XP should have been sold for.
> 
> The reason they held out on the 1080 Ti for this long was that there was no competition; it was a wild trump card.
> Also to compete with whatever AMD's offering is.
> The Titan X brand should be boycotted entirely, and consumers need to put their foot down, hardcore.
> How do you feel knowing your $$$ premium card is now obsolete because of their sly and shifty marketing tactics?
> 
> What they should be doing is releasing all versions of the same GPU at the same time. This trump-card bull is only there to benefit them whilst milking us.
> 
> 1050, 1060, 1070, 1080, 1080 Ti, all released together from now on.
> Notice that when I put it like that, and in order, you can see how the Titan X's release is nothing but a scam. It belongs nowhere, and it is only there to gouge and deceive.
> 
> From this day on, if you care about your hard-earned money, don't ever invest in a Titan X, and give these leeches the message that's never been sent.
> 
> A company that hires random puppets to shout out positive pricing remarks at launch events is not a company of the people.
> 
> If there is another Titan card next generation, I am switching to AMD monitors and GPUs, 100%.


The TXP was never meant to be a consumer gaming card so there's no point getting bent out of shape about it. And the Ti really is reasonably priced however you slice it.


----------



## smushroomed

Anyone place an Arctic Accelero hybrid cooler on their 1080ti FE?

I ordered the 140mm off amazon


----------



## dboythagr8

Quote:


> Originally Posted by *alucardis666*
> 
> False, I redeemed mine with my 1080 no issue.


I don't get what you're disputing. You have to have the same family of card in your machine that the code comes with, i.e. you can no longer just buy a code from someone and use it on your machine unless you have the same GPU family.

https://arstechnica.com/gaming/2017/02/nvidia-game-codes-gfe-hardware/


----------



## sWaY20

Quote:


> Originally Posted by *lilchronic*
> 
> I got refunded my Saturday shipping fees from Newegg, so if they don't get it delivered tomorrow they will have to refund my next-day shipping fees as well.


Since we both haven't gotten an update and had the same tracking info and timing, something tells me it's going out Monday and we're getting it Tuesday. In that case I'll be really pissed and will expect a full refund of my entire shipping. Or I'll dispute it with my bank.


----------



## Testing12

Quote:


> Originally Posted by *Renairy*
> 
> Anyone else think Nvidia is playing dirty towards consumers in the market? To be quite honest, users are getting rorted.
> Here's why I think that.
> 
> They're bringing out GPUs in a fashion that is not only milking consumers but also charging us a lot more than we should be paying.
> 
> For example, the GTX Titan X is only on the market to give us the illusion that the 1080 Ti is cheap at $699.




What the market will bear.


----------



## Nervoize

Quote:


> Originally Posted by *dboythagr8*
> 
> I don't get what you're disputing? You have to have the same family card in your machine that the card comes with i.e. you can no longer just buy a code from someone and use it on your machine unless you have the same GPU family.
> 
> https://arstechnica.com/gaming/2017/02/nvidia-game-codes-gfe-hardware/


----------



## AMDATI

We're almost there guys....


----------



## dboythagr8

Quote:


> Originally Posted by *sWaY20*
> 
> since we both haven't gotten an update, and had the same tracking info and time, something tells me it's going out Monday and we're getting it Tuesday. In that case I'll be really pissed and expect full refund on my entire shipping. Or I'll dispute it with my bank.


I'm expecting the card on Tuesday. For me, next-day delivery + Saturday delivery = $25 extra, since I'm a Newegg Premier member. I received an invoice in my email today with the credit being initiated back to my PayPal account.
Quote:


> Originally Posted by *Nervoize*


*shrug*


----------



## Renairy

Quote:


> Originally Posted by *Somasonic*
> 
> The TXP was never meant to be a consumer gaming card so there's no point getting bent out of shape about it. And the Ti really is reasonably priced however you slice it.


What was the Titan XP supposed to be then?
Of course it was supposed to be a consumer gaming card. What else was it supposed to be, other than a scam of course?

It's quite unfortunate that you missed my point, isn't it?

The 1080 Ti is reasonably priced NOW, yes. But the 1080 shouldn't have shared the same price tag.
And the "Titan" should never have been sold at a $1,000 premium, since the Titan is really a 1080 Ti and was always intended to be one.

Do you get it now?

Nvidia makes great hardware, yes, but I have zero respect for them as a company.


----------



## Alvarado

Quote:


> Originally Posted by *Testing12*
> 
> What the market will bear.


Dude makes a good point. Our little hardware niche is nothing compared to fashion.


----------



## dboythagr8

The $1000 price tag for the OG Titan was sort of justified in that it was definitely a prosumer card. They've gone away from that as the Titan line has progressed.


----------



## KickAssCop

Man, still backordered. This sucks.


----------



## KraxKill

Quote:


> Originally Posted by *Renairy*
> 
> What was the TitanXP supposed to be then?
> Ofcourse it was supposed to be a consumer gaming card. What else was it supposed to be other than a scam ofcourse?
> 
> Its quite unfortuante that you missed my point isnt it?
> 
> The 1080ti is reasonably priced NOW yes. But the 1080 shouldnt have shared the same price tag.
> And the "Titan" should have never been sold for the $1000 premium. Since the Titan is really a 1080Ti and was always intended to be a 1080Ti.
> 
> Do you get it now ?
> 
> Nvidia make great hardware yes, but i have zero respect for them as a company.


Yeah what a shady company.

How dare they recoup their R&D costs by first bringing emerging technology to willing early adopters, spendy labs, and prosumers. Then, as yields improve, the process is refined, and margins increase, they dare to make the same technology available to even more consumers at nearly half the price, for profit. All while working on the next latest-and-greatest.

Haters gonna hate...


----------



## Somasonic

Quote:


> Originally Posted by *Renairy*
> 
> What was the TitanXP supposed to be then?
> Ofcourse it was supposed to be a consumer gaming card. What else was it supposed to be other than a scam ofcourse?
> 
> Its quite unfortuante that you missed my point isnt it?
> 
> The 1080ti is reasonably priced NOW yes. But the 1080 shouldnt have shared the same price tag.
> And the "Titan" should have never been sold for the $1000 premium. Since the Titan is really a 1080Ti and was always intended to be a 1080Ti.
> 
> Do you get it now ?
> 
> Nvidia make great hardware yes, but i have zero respect for them as a company.


The TXP wasn't part of the GeForce line. FWIW I agree with most of what you said. But the TXP wasn't marketed as a gaming card.


----------



## AMDATI

The companies that buy Titans for whatever their workload entails have long since recouped the extra cost of the Titans. That's the advantage: the highest level of performance. These are workhorses designed to make money, not just to consume content.

Let's put it this way: if you have a machine that is just 5% faster and runs 10 hours a day, 5 days a week, you gain about 10 hours of extra productivity a month, or 120 hours a year. And that's from just a 5% performance increase.

It's the same with the 6900K from Intel: the workstations these chips go in are seeing tons of gains, and that performance was available long before AMD could provide it. So when you look at the price difference of Ryzen vs. Intel, that $500 difference, while meaningful, is chump change in the grand scheme of things, and it will obviously change now that there's competition.

Even high-end gaming cards have their place in professional sectors, from driving displays to render farms, AI, etc.
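For anyone checking the back-of-the-envelope math above, it can be sketched in a few lines (using the same round numbers as the post: a 4-week month and a 12-month year):

```python
# Rough productivity gain from a 5% faster machine, per the post's assumptions.
speedup = 0.05          # machine is 5% faster
hours_per_day = 10
days_per_week = 5
weeks_per_month = 4     # round-number approximation

saved_per_week = speedup * hours_per_day * days_per_week   # 2.5 hours
saved_per_month = saved_per_week * weeks_per_month         # ~10 hours
saved_per_year = saved_per_month * 12                      # ~120 hours

print(saved_per_week, saved_per_month, saved_per_year)
```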


----------



## D13mass

Quote:


> Originally Posted by *KraxKill*
> 
> I can run 600 with no artifacts but I'm hunting for the optimal clock now which appear to be somewhere between +450 and +550.


I'm sorry guys, but could you attach screenshots of GPU-Z's sensors tab? Because who knows what you had before you overclocked your card by 450-550 MHz.


----------



## Slackaveli

Quote:


> Originally Posted by *smushroomed*
> 
> Anyone place an Arctic Accelero hybrid cooler on their 1080ti FE?
> 
> I ordered the 140mm off amazon


I haven't seen that one, but I'm sure it'll work fine. You don't like the EVGA ones?


----------



## Slackaveli

Quote:


> Originally Posted by *Renairy*
> 
> What was the TitanXP supposed to be then?
> Ofcourse it was supposed to be a consumer gaming card. What else was it supposed to be other than a scam ofcourse?
> 
> Its quite unfortuante that you missed my point isnt it?
> 
> The 1080ti is reasonably priced NOW yes. But the 1080 shouldnt have shared the same price tag.
> And the "Titan" should have never been sold for the $1000 premium. Since the Titan is really a 1080Ti and was always intended to be a 1080Ti.
> 
> Do you get it now ?
> 
> Nvidia make great hardware yes, but i have zero respect for them as a company.


Dude, why do you think it's not called a "GeForce GTX" (aka gaming division) Titan X? Because it's for AI.


----------



## KraxKill

Quote:


> Originally Posted by *D13mass*
> 
> I`m sorry guys, but could you attach screenshots from GPU-Z on sensors tab ? Because who knows how much you had before you overclocked your card for 450-550 mhz.


I could, but I'm talking about the memory clock above, which, unlike the GPU, is clocked the same for everyone.


----------



## D13mass

Quote:


> Originally Posted by *KraxKill*
> 
> I could, but I'm talking about the memory clock above which unlike the GPU is clocked the same for everyone.


Ah, now it's clear.

Anyway, it's not the first time somebody has just written "I OC'ed my card by +400 MHz," and I always assume it's about the core.


----------



## Slackaveli

Quote:


> Originally Posted by *D13mass*
> 
> I`m sorry guys, but could you attach screenshots from GPU-Z on sensors tab ? Because who knows how much you had before you overclocked your card for 450-550 mhz.


The memory is 1376 MHz at stock (×8 effective = 11008). When he says he added +500, he means +125 on the base clock, so he's running it at 1500 MHz (×8 = 12,000), or 12 Gbps, which is hard to follow, yes. The +500 is what you enter in Afterburner when overclocking.
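For anyone following along, the effective-rate arithmetic in the post above works out like this (assuming, as the post states, that the quoted GDDR5X rate is 8x the base clock):

```python
# GDDR5X effective transfer rate as described in the post: base clock x 8.
def effective_mhz(base_mhz: int) -> int:
    return base_mhz * 8

stock = effective_mhz(1376)        # 11008 -> the stock "11 Gbps"
overclocked = effective_mhz(1500)  # 12000 -> the "12 Gbps" people report

print(stock, overclocked)
```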


----------



## AMDATI

Wow, Newegg isn't playing around when it comes to the 2-day shipping option...

They shipped it just now (basically first thing in the morning).

ESTIMATED DELIVERY: Tuesday, March 14, 2017

That's essentially 1-day shipping. Of course, I did order Friday, but that's understandable since they don't even process orders on weekends, let alone ship them.

Still, pretty fast even by Newegg's standards. Sometimes I do get packages really quickly if they ship from a supply outlet close to me; this one is shipping from across the country, which makes the speed even more impressive.

But I did pay $15 for shipping (when I'm used to free shipping), so the sting is a little less! $15 isn't much to ship a $700 item, but still, I paused for like 2 ms when I saw no free shipping, lol.


----------



## MURDoctrine

Quote:


> Originally Posted by *AMDATI*
> 
> Wow newegg isn't playing when it comes to the 2 day shipping option....
> 
> They shipped it just now (basically first thing in the morning)......
> 
> ESTIMATED DELIVERY: Tuesday, March 14, 2017
> 
> That's essentially 1 day shipping. Of course, I did order Friday, but it's understandable since they don't even process orders on weekends, let alone ship them.
> 
> Still, pretty fast even by newegg's standards. Sometimes I do get packages really quick if they ship from a supply outlet close to me, this one is shipping from across the country which makes the speed even more impressive.
> 
> But I did pay $15 for shipping (when I'm used to free shipping)....so I do feel a bit less sting for it! $15 isn't much to ship a $700 item, but still, I paused for like 2ms when I saw no free shipping lol.


Dang, I still haven't had an update on mine, but I ordered in the second batch, around 7 PM EST. I've been hoping to get a message saying it's been packaged and is leaving their warehouse. That does give me hope that I might get mine Wednesday, since I did 3-day shipping.


----------



## AMDATI

I ordered mine within the first 10 minutes it went live... actually before it went live; it was a page that wasn't even officially listed yet. But I didn't do any of that rush-processing stuff.


----------



## RaleighStClair

Do we have any release-date info for any of the AIB cards? My Google-fu isn't picking up anything.


----------



## trawetSluaP

Overclockers UK have said some time in April.

Might be different for other regions.


----------



## AMDATI

I always heard about a week after release, but I've also heard the April figures. I don't think anyone really has any definitive answers. Keep in mind that this is a supply-limited card that is selling really well for its price range, so I would expect it to be difficult for them to have enough chips to build enough custom cards for a while. You definitely don't want to overshoot your market demand either. And with the FEs selling so well, there's not as much incentive.

But I think Nvidia is also doing this so people can actually get cards at MSRP, since custom coolers will add $50-$100 to the base price. They really took advantage of that with the 1080 release, which was supposed to be $600, but you couldn't touch one for under $700+ because everyone sold for more than FE prices, even though the FE was supposed to be $100 more than MSRP.

I'm actually glad not as many people want FE cards... because it meant less competition when I ordered mine.

Very soon it's going to get pretty difficult to find any stock. They produced these 1080 Tis for months before release; it'll take months for them to produce enough to ship in quantity for custom cooling solutions.


----------



## kevinjohn

It means their shipping is good; delivering a product within 2 or 3 days is excellent.


----------



## Renairy

Quote:


> Originally Posted by *KraxKill*
> 
> Yeah what a shady company.
> 
> How dare they recoup their R&D costs by first bringing emerging technology to willing early adopters, spendy labs and prosumers. Then, as yields improve, the process is refined and the margins increase they dare to make available the same technology to even more consumers at nearly half the price for profit. All while working on the next latest and greatest.
> 
> Haters gonna hate...


Amazing how you have the whole shebang of info on their finances. You must be employed by them to know what targets they need to hit to recoup.
You speak as if they don't have money in the bank and every bit of R&D is borrowed money.
Sigh.

Quote:


> Originally Posted by *Somasonic*
> 
> The TXP wasn't part of the GeForce line. FWIW I agree with most of what you said. But the TXP wasn't marketed as a gaming card.


This is where you're wrong, and purposely misled, I think.
The first Titan had double precision; it was a GPU/workstation card.

The new Titans have done away with that. For the final time, do you now understand?
You're saying it wasn't marketed as a gaming card... what was it marketed as, bro? A trophy?
Spec-wise, it's identical to a 1080 Ti.

Quote:


> Originally Posted by *AMDATI*
> 
> The companies that buy Titans for whatever their workload entails, have long since recouped the extra cost for Titans. That's the advantage......the highest level of performance. These are work horses designed to make money, not just be used to consume content.
> 
> Let's put it this way, if you have a machine that is just 5% faster, runs 10 hours a day for 5 days a week, you will have gained 10 hours of extra productivity a month. 120 hours in a year. And that's just 5% increased performance.
> 
> Just like with the 6900k from Intel......the workstations these chips are going in are seeing tons of gains.....and this performance was available long before AMD could provide it. So when you look at the price difference of Ryzen vs Intel, that $500 price difference, while meaningful, is chump change in the grand scheme of things, and will obviously also change now that there's competition.
> 
> Even high end gaming cards have their place in professional sectors, from using it to drive displays, render farms, AI, etc.


The Titan is NOT a workstation chip; read my response above.

Quote:


> Originally Posted by *Slackaveli*
> 
> dude, why do you think it's not called "geforce gtx" (aka gaming division) Tiat X? Because it's for AI.


Again, this seems to be the misleading effect and spell Nvidia has put on you all, lol.
For the last time, the Titan X (Maxwell) and the Titan XP do not have the double-precision capability of the OG Titan.
It is not a workstation card.


----------



## sWaY20

Quote:


> Originally Posted by *AMDATI*
> 
> Wow newegg isn't playing when it comes to the 2 day shipping option....
> 
> They shipped it just now (basically first thing in the morning)......
> 
> ESTIMATED DELIVERY: Tuesday, March 14, 2017
> 
> That's essentially 1 day shipping. Of course, I did order Friday, but it's understandable since they don't even process orders on weekends, let alone ship them.
> 
> Still, pretty fast even by newegg's standards. Sometimes I do get packages really quick if they ship from a supply outlet close to me, this one is shipping from across the country which makes the speed even more impressive.
> 
> But I did pay $15 for shipping (when I'm used to free shipping)....so I do feel a bit less sting for it! $15 isn't much to ship a $700 item, but still, I paused for like 2ms when I saw no free shipping lol.


Wonderful. I ordered the second it dropped at Newegg and paid extra for rush processing, overnight, and Saturday delivery, and we're going to get it the same day, Tuesday... That doesn't make me happy one bit.


----------



## BigMack70

Quote:


> Originally Posted by *Slackaveli*
> 
> dude, why do you think it's not called "geforce gtx" (aka gaming division) Tiat X? Because it's for AI.


You do realize that "GeForce GTX" is literally printed on the titan xp card and backplate, right?









The argument "Nvidia didn't make the Titan for gaming, but rather for professional workloads" is a load of crap and always has been. Even the OG Titan was marketed as a "by gamers, for gamers" card.

Nvidia has been screwing everyone on price for five years. There is no immorality in this per se; they have had zero high-end competition from AMD, and so they have behaved exactly as a company without competition behaves: by increasing their pricing as much as they can get away with.

The only thing I think is unethical is that they've misled consumers by labeling midrange chips as the x80 part (680, 980, 1080), a nomenclature which used to denote their high-end chip.

But the blame for pricing is squarely on AMD for underperforming and being unable to compete.


----------



## xTesla1856

Any word on when nV gets new stock? I'm supposed to ship the Titan this week, and I'd like to get my new babies in a timely manner.


----------



## cutty1998

Quote:


> Originally Posted by *Kdubbs*
> 
> Woo I am glad I got into the preorder when I did. Contemplated getting two but decided to hold off to see if amd releases something more powerful to pop over to that. Having 290x's I can't wait for the performance increases. Ordered standard shipping as well.


Well, your room will probably be a lot cooler going from those two heaters to a single Ti, I'll tell you that!


----------



## Besty

Hi all.

Has anyone apart from the Guru3D reviewer confirmed that the Afterburner voltage-control tweak works? I'm running an EVGA 1080 Ti, and the Afterburner tweak correctly identifies the 1080 Ti when the .oem2 line is added; however, the voltage slider does not work. Replacing part of the oem2 line with ???????? gives me the voltage slider, but it has no effect on the voltage readout when used.

Also:

GPU Tweak has a voltage control slider: no effect when used.
Precision XOC has a voltage control slider: no effect when used.

Did something happen between review and release, or have others had success with the voltage controls?


----------



## Biggu

Well, I picked up my 1080 Ti on Saturday from Micro Center. Holy crap, what an upgrade from the Maxwell Titan X.

Waterblock and backplate are on order for the sim setup.


----------



## OutlawXGP

I went from 2x EVGA GTX 980 Ti SC+ ACX 2.0 to an Nvidia reference GTX 1080 Ti, hoping to ditch SLI, because within the last year SLI support has really gone down the toilet, and the games that do support DX11 SLI don't really make good use of the power of two GPUs. Having said that, do you guys think one 1080 Ti is good to stick with for 1440p, or should I get another once I've sold my two GTX 980 Tis?


----------



## D13mass

Answer from Aquatuning GmbH
Quote:


> Hello,
> sorry but we have no graficcardcooler for this items. I hope for understanding. Many thanks.


That was their answer to my question about the compatibility of their Titan Pascal water blocks with 1080 Ti cards.


----------



## outofmyheadyo

So what are the average clocks one can hope for from a watercooled card?


----------



## Transmaniacon

How do the aftermarket blower cards like the Asus Turbo compare to the Founder's Edition? I have an ITX build in the Nano S and am leaning towards a blower cooler, but not sure if it's worth waiting on Asus/Zotac/etc. They also seem to not have a backplate unlike the FE.


----------



## stocksux

Quote:


> Originally Posted by *outofmyheadyo*
> 
> So whats the average clocks one can hope from a watercooled card?


I'm at about 2075 MHz core and can go over 12 Gbps on the memory, with temps in the 30s at load. Still working on tweaking.


----------



## eXistencelies

Quote:


> Originally Posted by *AMDATI*
> 
> Newegg is emailing out the game bundle codes now, but that doesn't count for much since you need to have the card installed and detected to redeem it. Or at least, any 'qualifying card' that the bundle is eligible for.


I redeemed mine without the Ti. I do have a 1080 FE, though.


----------



## DimmyK

Cards are back in stock at the Nvidia online store as I type this.


----------



## Emissary of Pain

May I join?


----------



## Mrip541

Any comments on noise? Is the new dual-width blower as quiet as previous dual-fan designs?


----------



## Testing12

Quote:


> Originally Posted by *DimmyK*
> 
> Cards are back in stock at nvidia online store as I type it.


Out of stock as I write this, 27 minutes after your post.


----------



## Mato87

Only a few more weeks before I join the club.







I'll be exchanging my 980 Ti for the new GTX 1080 Ti. I'll be going for the Gigabyte Aorus version that was teased a few days ago.


----------



## dboythagr8

Those of us who ordered the EVGA version from Newegg and paid for Saturday delivery when the card didn't even ship out (I think there are three of us), please keep this thread updated with your shipping status. I just checked mine, and it still says it's in Rowland Heights, CA, with only the label created...

If it doesn't ship today, I am going to be heated.


----------



## trawetSluaP

Just got confirmation from EK that they are releasing a single slot Waterblock and a backplate specifically for the 1080 Ti. Should be released first or second week of April!


----------



## D13mass

Guys, has anybody here bought the 1080 Ti from Zotac? https://www.amazon.com/gp/product/B06XDN4TXK


----------



## eXistencelies

Quote:


> Originally Posted by *dboythagr8*
> 
> Those of us that ordered EVGA version from Newegg and paid for Saturday delivery when the card didn't even ship out (I think there is 3 of us), please keep this thread updated with shipping status. I just checked mine and it still says it's in Rowland Heights, CA and only the label has been created...
> 
> If it doesn't ship today I am going to be heated.


I ordered mine Friday but used standard shipping; they shipped it out Friday. I also used the $2.99 fast-shipping option they have.


----------



## eXistencelies

Quote:


> Originally Posted by *trawetSluaP*
> 
> Just got confirmation from EK that they are releasing a single slot Waterblock and a backplate specifically for the 1080 Ti. Should be released first or second week of April!


I already went ahead and ordered the Titan XP block and backplate from EK; I don't feel like waiting till April. I wish my current 1080 EKWB block would work, though.


----------



## sWaY20

Quote:


> Originally Posted by *dboythagr8*
> 
> Those of us that ordered EVGA version from Newegg and paid for Saturday delivery when the card didn't even ship out (I think there is 3 of us), please keep this thread updated with shipping status. I just checked mine and it still says it's in Rowland Heights, CA and only the label has been created...
> 
> If it doesn't ship today I am going to be heated.


Same here; not a damn thing has changed on the tracking since I got it mid-afternoon Friday.


----------



## tconroy135

Quote:


> Originally Posted by *sWaY20*
> 
> same here, not a damn thing has changed on tracking since i got it mid afternoon Fri.


I don't understand why releases are handled in this manner. It's clear they've had the cards lying around for weeks.


----------



## lilchronic

Quote:


> Originally Posted by *dboythagr8*
> 
> Those of us that ordered EVGA version from Newegg and paid for Saturday delivery when the card didn't even ship out (I think there is 3 of us), please keep this thread updated with shipping status. I just checked mine and it still says it's in Rowland Heights, CA and only the label has been created...
> 
> If it doesn't ship today I am going to be heated.


Same.


----------



## SpartanJet

Quote:


> Originally Posted by *dboythagr8*
> 
> Those of us that ordered EVGA version from Newegg and paid for Saturday delivery when the card didn't even ship out (I think there is 3 of us), please keep this thread updated with shipping status. I just checked mine and it still says it's in Rowland Heights, CA and only the label has been created...
> 
> If it doesn't ship today I am going to be heated.


I am one of those Newegg customers who ordered Friday with the rush + overnight + Saturday delivery options and had just a label created on Friday. I got an email from Newegg this morning with a shipping notification stating it was shipped on 3/13. FedEx still only shows the label created, but that is normal until it actually goes somewhere. Of course, with the snowstorm hitting my area tomorrow, it probably means the rush shipment, Saturday delivery, AND overnight will all have been a huge waste of money.


----------



## KraxKill

Quote:


> Originally Posted by *Besty*
> 
> hi all.
> 
> has anyone apart from the Guru3d reviewer has confirmed that the afterburner voltage control tweak works? I am running a EVGA 1080ti and the Afterburner tweak correctly identifies the 1080ti when the .oem2 line is added however the voltage slider does not work. Replacing part of the oem2 line with ???????? gives me the voltage slider however it has no affect on the voltage readout when used.
> 
> Also>
> 
> GPU Tweak has voltage control slider - no affect when used
> Precision XOC has voltage control slider - no affect when used.
> 
> Did something happen between review and release or have others had success with the voltage controls ?


I did use the Afterburner profile config trick, and I did see the voltage increase by two bins. Sadly, to no avail.

Without touching the voltage slider, I'm at 1.0620 V. With the slider at half I got 1.07x, and at full, 1.08x.

However, this did not offer any extra OC room for me on the core. I have not seen 1.09-1.1 V, which may be what is required to hit higher core bins.

Since it had no effect for me, I left the extra voltage alone.

I'm sitting at 2062.5 MHz core and 12 GHz memory at 1.050-1.062 V, with 28-30C core temps at 19-21C room ambient. My voltage runs at 1.062 V at anything over 29C; below 29C the card will run 1.050 V. I'm running liquid metal TIM on the die and an X41 AIO with Noctuas in push-pull, with the stock mid-plate and blower fan retained to cool the VRM along with the rest of the PCB. I'm also shunt modded.

Hoping for a custom BIOS to explore higher boost bins, but unless the card is cool, I doubt it will ever pull those voltages.

I'd like to see if people with custom-loop chillers can get to higher voltage states.

2075 MHz core is almost stable for me at 1.08 V, but not quite. It would be nice to test 1.09-1.1 V, but it's just not there. At least not at ~30C.


----------



## dboythagr8

I just got off the phone with Newegg. No update other than putting me on hold and saying they've sent an email to the warehouse. If the tracking information isn't updated within two business days, they will file a missing-package claim for a refund.

This is terrible.

@SpartanJet, you need to call for a refund.


----------



## lilchronic

Quote:


> Originally Posted by *dboythagr8*
> 
> I just got off the phone with Newegg. No updates to tell me other than putting me on hold and saying they've sent an email to the warehouse. If tracking information isn't updated within two business days, they will file a missing package claim for a refund.
> 
> This is terrible
> 
> @SpartanJet you need to call for refund


I just did the same, and this time they refunded me $20.


----------



## chibi

I ordered an EVGA 1080 Ti from Newegg within the first 5 minutes. I got the game code and tracking number within hours. However, my order is still marked as "shipping label created" with Purolator. No signs of it being shipped out yet.


----------



## AMDATI

Quote:


> Originally Posted by *BigMack70*
> 
> You do realize that "GeForce GTX" is literally printed on the titan xp card and backplate, right?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The argument "Nvidia didn't make the Titan for gaming, rather for professional workloads" is a load of crap and always has been. Even the OG Titan was marketed as a "by gamers for gamers" card.
> 
> Nvidia is screwing everyone in price and they've been doing so for 5 years. There is no immorality in this per-se, as they have had zero high end competition from AMD and so they have behaved exactly as a company without competition behaves... By increasing their pricing as much as they can get away with.
> 
> The only thing I think is unethical that they've done is mislead consumers by labeling midrange chips as the x80 part (680, 980,1080) - a nomenclature which used to denote their high end chip.
> 
> But the blame for pricing is squarely on AMD for underperforming and being unable to compete.


Gaming/graphical workloads have their place in professional/business settings.


----------



## McAlberts

I got an EVGA from Newegg and did the 2-day shipping. It's been sitting at my local sorting facility since Sunday night and shouldn't be delivered until Tuesday. I'm about to call up there and see if I can pick it up and save them a delivery, lol. I doubt it will work, but it's worth a shot. So weird that you guys are having issues when you even paid for expedited delivery. I'd be heated.


----------



## Radox-0

Got my trio of EVGA 1080 Tis in and running. I just finished benchmarking about 20 games and tests with the 1080 FE at stock and overclocked @ 3440 x 1440, so hopefully I'll be able to see how it compares.











From some quick messing around, all of them seem to do 12 GHz on the memory and overclock to 2063 MHz (with the fan ramped up to a 1:1 profile, of course). They do not seem too happy above that, though!


----------



## dboythagr8

Making a list of those affected by this EVGA Newegg 1080 Ti debacle:

dboythagr8
chibi
lilchronic
SpartanJet
sway20


----------



## sWaY20

Texting with Newegg.


----------



## dboythagr8

Quote:


> Originally Posted by *sWaY20*
> 
> 
> 
> Texting with newegg


"It looks like"

What are they basing this off of? When you check your Order History, where does it say the package is? Rowland Heights, CA for me on the FedEx site...


----------



## BigMack70

Quote:


> Originally Posted by *AMDATI*
> 
> Gaming/graphical workloads have their place in professional/business settings.


Which is irrelevant to the point at issue. The cards were marketed as gaming cards. They are gaming cards. I used my 7970s to mine bitcoins a few years ago, but that didn't make them bitcoin-mining cards. They were gaming cards. You can argue the OG Titan was a hybrid prosumer/gaming card, but not the Titan X (Maxwell) or XP.


----------



## sWaY20

Quote:


> Originally Posted by *dboythagr8*
> 
> "It looks like"
> 
> What are they basing this off of? When you check your Order History, where does it say the package is? Rowland Heights, CA for me on FedEx site...


Hell if I know; my package says what the others say: somewhere in Cali, and only the label created.


----------



## Baasha

In my tests so far, the 1080 Ti outperforms the Titan X Pascal in every department up to and including 4K resolution. The moment I switch to 5K (my default res), the Titan XP pulls ahead, and at 8K the difference is significant.


----------



## Baasha

Quote:


> Originally Posted by *Radox-0*
> 
> Got my Trio of EVGA 1080Ti's in and running. Just finished Benchmarking about 20 games and tests with the 1080 FE at stock and overclocked @ 3440 x 1440, so hopefully will be able to see how it compares
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quick messing around all seem to do 12Ghz on memory and overclock to 2063 Mhz (of course fan ramped up to 1:1 profile, do not seem to happy above that though!


Nice cable-management.









That HB bridge you're using only connects two cards... unless you modified yours?


----------



## fg2chase

I really want to see if they make a hybrid water-cooled version.


----------



## Slackaveli

Quote:


> Originally Posted by *BigMack70*
> 
> You do realize that "GeForce GTX" is literally printed on the titan xp card and backplate, right?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The argument "Nvidia didn't make the Titan for gaming, rather for professional workloads" is a load of crap and always has been. Even the OG Titan was marketed as a "by gamers for gamers" card.
> 
> Nvidia is screwing everyone in price and they've been doing so for 5 years. There is no immorality in this per-se, as they have had zero high end competition from AMD and so they have behaved exactly as a company without competition behaves... By increasing their pricing as much as they can get away with.
> 
> The only thing I think is unethical that they've done is mislead consumers by labeling midrange chips as the x80 part (680, 980,1080) - a nomenclature which used to denote their high end chip.
> 
> But the blame for pricing is squarely on AMD for underperforming and being unable to compete.


Of course I realize that, but it's from old stock, I guess. Huang came on stage last year and said, "Titan X is NOT in the GeForce gaming line." You can't get any more clear than that.


----------



## Radox-0

Quote:


> Originally Posted by *Baasha*
> 
> Nice cable-management.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That HB bridge you're using just connects two cards... unless you modified yours?


Thanks.

No, it's only 2-way. Annoyingly, I moved home recently, so the 3-way one has gone walkies! I need to dig through the remaining unpacked boxes, but these cards take priority.


----------



## smushroomed

Quote:


> Originally Posted by *Slackaveli*
> 
> have not seen that one but im sure it'll work fine. you don't like the EVGA ones?


From what I've seen it requires a slight modification (cutting).

I'd rather just get something that fits from the start if possible.


----------



## AMDATI

Quote:


> Originally Posted by *BigMack70*
> 
> Which is irrelevant to the point at issue. The cards were marketed as gaming cards. They are gaming cards. I used my 7970s to mine bitcoins a few years ago, but that didn't mean that they were bitcoin mining cards. They were gaming cards. You can argue the OG Titan was a hybrid prosumer/gaming card, but not the Titan XM or XP.


Saying the Titan X has no place in a workstation environment is like saying a consumer utility pickup truck has no place in a business's fleet of vehicles. Maybe I'm a business that wants to drive an advertising display. Maybe I want to create and record a cutscene in a game. Maybe I'm a game designer or work in the movie industry, or whatever else. It can be a 'gaming' card while still being meant for business operations.


----------



## lilchronic

Quote:


> Originally Posted by *sWaY20*
> 
> 
> 
> Texting with newegg


When I was chatting with them they said it should be here today, but I don't expect it till tomorrow.


----------



## sWaY20

Damn, there was an open-box EVGA 1080 Ti at my local MC. Should've jumped on it and cancelled my Newegg order.


----------



## sWaY20

Quote:


> Originally Posted by *lilchronic*
> 
> When i was chatting with them they said it should be here today. But i dont expect it till tomorrow.


If it's here today, I'll take my $15 loss and be upset but get over it. If it comes tomorrow, I'll be mad for a while and probably never use Newegg again.


----------



## BigMack70

Quote:


> Originally Posted by *AMDATI*
> 
> Saying the Titan X has no place in a workstation environment


Stop this fallacious nonsense now. I did not say that, and you are simply putting words in my mouth in order to argue against a straw man. If you cannot read carefully and comprehend, don't respond.


----------



## BigMack70

Quote:


> Originally Posted by *Slackaveli*
> 
> titan x is NOT in the geforce gaming line


It doesn't matter who said this, it's obviously false. If you don't believe me, go check out Nvidia's website.


----------



## Slackaveli

Quote:


> Originally Posted by *BigMack70*
> 
> Which is irrelevant to the point at issue. The cards were marketed as gaming cards. They are gaming cards. I used my 7970s to mine bitcoins a few years ago, but that didn't mean that they were bitcoin mining cards. They were gaming cards. You can argue the OG Titan was a hybrid prosumer/gaming card, but not the Titan XM or XP.


No, they really weren't. They are professional cards. You are just a blind sheep.


----------



## Slackaveli

Quote:


> Originally Posted by *AMDATI*
> 
> Saying the Titan X has no place in a workstation environment is like saying a consumer utility pickup truck has no place in a businesses fleet of vehicles. Maybe I'm a business that wants to drive an advertising display. Maybe I want to create and record a cutscene in a game. Maybe I'm a game designer or work in the movie industry, or whatever else. It can be a 'gaming' card while still being meant for business operations.


my cousin's engineering firm works with a law firm and together they use Titan X to recreate crash scenes, etc, for multi-million dollar court cases.


----------



## BigMack70

Quote:


> Originally Posted by *Slackaveli*
> 
> no, they really werent. they are proffessional cards. you are just a blind sheep.


Nvidia lists the Titan X under its Geforce line. Not Quadro. Not Tegra. Not GRID. Not Tesla. Geforce.

This discussion is over. If you cannot read nvidia.com and their own product locations, there's no point in posting here.

Go to Nvidia.com and look at their product line and find the Titan X. It's under Geforce. The fact that it can be used for non-gaming workloads does not mean it is a professional or workstation card. It's a gaming card and clearly marked as such.

Fun fact: You can play games on some of the professional cards too but that does not make them gaming cards.

You guys need some lessons in logic and reading comprehension.


----------



## Slackaveli

Quote:


> Originally Posted by *BigMack70*
> 
> It doesn't matter who said this, it's obviously false. If you don't believe me, go check out Nvidia's website.


YOU JUST QUOTED IT. It does NOT say "GeForce"; it says "Titan X". Thanks for proving the point.

JEN-HSUN HUANG said it. Last year. What you are arguing is old news. Google it.


----------



## dboythagr8

Quote:


> Originally Posted by *lilchronic*
> 
> When i was chatting with them they said it should be here today. But i dont expect it till tomorrow.


But are they telling you this despite the tracking information saying 'Label Created'? If so, they don't have a clue and are most likely saying what the customer wants to hear. The same thing was relayed to me on Fri btw, when I called and repeatedly asked about this.

I also got an email with a generated tracking number that's the same as Fri, and it says shipping time 3/13.


----------



## Slackaveli




----------



## chibi

You guys going on about the Titan X and 1080Ti need to take a few chill pills.


----------



## Slackaveli

Quote:


> Originally Posted by *Slackaveli*
> 
> It's the Nvidia Titan X, not the geforce titan x. dang, you guys are hard headed AF.


Sure, you can game on it, but it is MADE for deep-learning AI. I said nothing about the old Titans used as compute cards. These Titan Xs are made for AI.


----------



## BigMack70

Titan X is listed under the following category:



I don't think you guys understand basic logic and categorization. Your argument is analogous to saying "_because humans are primates, they are clearly not mammals_". Nvidia lists the Titan X not under any of their professional lines, but under their Geforce GTX 10 series line. The fact that the card itself is simply the "Titan X" does not magically make it a _tertium quid_ which defies Geforce gaming categorization _when Nvidia themselves list it under the Geforce GTX 10 series_.

Anyway this is way off topic and I feel like I'm trying to explain calculus to kindergartners with their heads in the sand.


----------



## lilchronic

How about both of you guys are right. So stop bickering about it.


----------



## becks

They are both on the Nvidia website...
On the internet...


----------



## fisher6

Officially joining the club:


----------



## KraxKill

Quote:


> Originally Posted by *BigMack70*
> 
> Nvidia lists the Titan X under its Geforce line. Not Quadro. Not Tegra. Not GRID. Not Tesla. Geforce.
> 
> This discussion is over. If you cannot read nvidia.com and their own product locations, there's no point in posting here.
> 
> Go to Nvidia.com and look at their product line and find the Titan X. It's under Geforce. The fact that it can be used for non-gaming workloads does not mean it is a professional or workstation card. It's a gaming card and clearly marked as such.
> 
> Fun fact: You can play games on some of the professional cards too but that does not make them gaming cards.
> 
> You guys need some lessons in logic and reading comprehension.


I would say that how it's marketed has little effect on its capability.


----------



## Alexbo1101

Card finally arrived









Going under water when EK makes a dedicated TI block.


----------



## al210

Quote:


> Originally Posted by *fisher6*
> 
> Officially joining the club:


----------



## dboythagr8

Hello, we received a response from our warehouse and this was sent out via FedEx on March 10th with the tracking# [removed]. What we can do at this point is open a shipping claim for a lost package. Would you like us to proceed with this? If the original package cannot be found by FedEx would you like us to send out a replacement or issue a refund? - Mak

So now 5+ 1080tis are "lost"? I'm getting beyond pissed.


----------



## doogk

Been delivered, now the long wait for the work day to be over.


----------



## keikei

Quote:


> Originally Posted by *dboythagr8*
> 
> Hello, we received a response from our warehouse and this was sent out via FedEx on March 10th with the tracking# [removed]. What we can do at this point is open a shipping claim for a lost package. Would you like us to proceed with this? If the original package cannot be found by FedEx would you like us to send out a replacement or issue a refund? - Mak
> 
> So now 5+ 1080tis are "lost"? I'm getting beyond pissed.


Thank god for insurance. I'd be livid. The 'lost' part bugs me as well. How do you lose something that is tracked?


----------



## Motley01

Quote:


> Originally Posted by *doogk*
> 
> Been delivered, now the long wait for the work day to be over.


Haha, doesn't that suck. And time seems to go by so slow. But when you get home it's like Christmas!


----------



## dboythagr8

Quote:


> Originally Posted by *keikei*
> 
> Thank god for insurance. I'd be livid. The 'lost' part bugs me as well. How do you lose something that is tracked?


Curious, isn't it?

After the last exchange via DM Twitter, I'm irritated again. I called the Premiere help line and immediately asked to speak to a supervisor. I explained to him what is going on. He is opening an investigation etc etc and asked that I give him until Wednesday.

I mentioned that other folks (people from here) are experiencing the exact same issue.

chibi
lilchronic
SpartanJet
sway20

You guys need to call into the support line and immediately ask the rep to speak to a supervisor/manager.


----------



## lilchronic

Quote:


> Originally Posted by *dboythagr8*
> 
> Hello, we received a response from our warehouse and this was sent out via FedEx on March 10th with the tracking# [removed]. What we can do at this point is open a shipping claim for a lost package. Would you like us to proceed with this? If the original package cannot be found by FedEx would you like us to send out a replacement or issue a refund? - Mak
> 
> So now 5+ 1080tis are "lost"? I'm getting beyond pissed.


That does not sound good.


----------



## wstanci3

Quote:


> Originally Posted by *dboythagr8*
> 
> Hello, we received a response from our warehouse and this was sent out via FedEx on March 10th with the tracking# [removed]. What we can do at this point is open a shipping claim for a lost package. Would you like us to proceed with this? If the original package cannot be found by FedEx would you like us to send out a replacement or issue a refund? - Mak
> 
> So now 5+ 1080tis are "lost"? I'm getting beyond pissed.


Quote:


> Originally Posted by *keikei*
> 
> Thank god for insurance. I'd be livid. The 'lost' part bugs me as well. How do you lose something that is tracked?


Quote:


> Originally Posted by *lilchronic*
> 
> That does not sound good.


Things like this are fairly common, unfortunately. If anyone has ever worked in or been to a sorting facility, you would understand how complex the system is. It probably went down the wrong "lane" and didn't get scanned, etc. Not saying it's acceptable; I'd be livid.
Anyway, just checked my order and it's still on route to be sent out today. Not "lost", so there's that.


----------



## sWaY20

Quote:


> Originally Posted by *dboythagr8*
> 
> Hello, we received a response from our warehouse and this was sent out via FedEx on March 10th with the tracking# [removed]. What we can do at this point is open a shipping claim for a lost package. Would you like us to proceed with this? If the original package cannot be found by FedEx would you like us to send out a replacement or issue a refund? - Mak
> 
> So now 5+ 1080tis are "lost"? I'm getting beyond pissed.


man I'm at a loss for words, the ones i wanna say will be blocked here.


----------



## djfunz

Has there been any word as to when the EK backplates for the 1080 Ti will be released?


----------



## Pandora's Box

Man, that sucks about the lost 1080 Tis. Glad Newegg/FedEx didn't mess up my shipment. Does your tracking info say which facility they were sent from? Mine was New Jersey, delivered to Pittsburgh.


----------



## lilchronic

Quote:


> Originally Posted by *Pandora's Box*
> 
> Man that sucks about the lost 1080 Ti's. Glad newegg/fedex didnt mess up my shipment. Does your tracking info say which facility they were sent from? Mine was New Jersey, delivered to Pittsburgh.


ROWLAND HEIGHTS, CA US


----------



## trawetSluaP

Quote:


> Originally Posted by *djfunz*
> 
> Has there been any word as to when the release of the EK backplates for the 1080Ti will be released?


The single-slot I/O bracket, block, and backplates for the 1080 Ti will be releasing the first or second week of April.

That's what EK told me!


----------



## dboythagr8

Quote:


> Originally Posted by *Pandora's Box*
> 
> Man that sucks about the lost 1080 Ti's. Glad newegg/fedex didnt mess up my shipment. Does your tracking info say which facility they were sent from? Mine was New Jersey, delivered to Pittsburgh.


Quote:


> Originally Posted by *lilchronic*
> 
> ROWLAND HEIGHTS, CA US


This

They're probably all hanging out at some dude's house


----------



## lilchronic

What's wrong with the Titan XP block? I ordered mine this morning from Performance PCs and it's already on the way; it's closer than my 1080Ti lol








Not even an hour later the shipping label was created and it shipped.


----------



## stocksux

Quote:


> Originally Posted by *trawetSluaP*
> 
> Just got confirmation from EK that they are releasing a single slot Waterblock and a backplate specifically for the 1080 Ti. Should be released first or second week of April!



What do you mean single slot? That's pretty flat. Perhaps they will offer a single-slot I/O cover, as the dual one isn't necessary with a water block on it.

I would like to replace the Titan X Pascal backplate with a 1080 Ti plate, I suppose for aesthetic purposes. Or most likely save the money and not care lol.


----------



## tconroy135

Quote:


> Originally Posted by *lilchronic*
> 
> What's wrong with the titan xp block? I ordered mine this morning from performance pc's and it's already on the way closer than my 1080Ti lol
> 
> 
> 
> 
> 
> 
> 
> 
> not even a hour later the shipping label was created and then shipped.


The XP block is fine; it'll probably just be re-branded for the Ti anyway. I'm waiting for the EK-MLC to be released, and then I'm gonna get a prefilled all-in-one and a prefilled water block.


----------



## chibi

Quote:


> Originally Posted by *dboythagr8*
> 
> Curious, isn't it
> 
> After the last exchange via DM Twitter, I'm irritated again. I called the Premiere help line and immediately asked to speak to a supervisor. I explained to him what is going on. He is opening an investigation etc etc and asked that I give him until Wednesday.
> 
> I mentioned that other folks (people from here) are experiencing the exact same issue.
> 
> chibi
> lilchronic
> SpartanJet
> sway20
> 
> You guys need to call into the support line and just immediately asked the rep to speak to a supervisor/manager.


I didn't go for expedited delivery since I'm still waiting on some parts, so I'll give them a day or two to sort things out.


----------



## Domler

Yeah, I'm way too impatient. Look what was on my doorstep when I got home just now. Lol


----------



## Somasonic

Quote:


> Originally Posted by *BigMack70*
> 
> *You do realize that "GeForce GTX" is literally printed on the titan xp card and backplate, right?*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The argument "Nvidia didn't make the Titan for gaming, rather for professional workloads" is a load of crap and always has been. Even the OG Titan was marketed as a "by gamers for gamers" card.
> 
> Nvidia is screwing everyone in price and they've been doing so for 5 years. There is no immorality in this per-se, as they have had zero high end competition from AMD and so they have behaved exactly as a company without competition behaves... By increasing their pricing as much as they can get away with.
> 
> The only thing I think is unethical that they've done is mislead consumers by labeling midrange chips as the x80 part (680, 980,1080) - a nomenclature which used to denote their high end chip.
> 
> But the blame for pricing is squarely on AMD for underperforming and being unable to compete.


OMG, that's hilarious. The original images I saw didn't have it, and since I've had no interest in the card I never realised the production model actually had GeForce on it.


----------



## sWaY20

Well, I talked to a nice lady at Newegg and she hasn't gotten any notice of any type of loss or whatever. She did say that if they picked up a pallet of GPUs, they wouldn't have scanned them separately, and they might not even be scanned until they arrive at the destination. So she said it should come tomorrow; if not, call Wednesday and submit a claim. I mentioned there are more people with my same problem, same brand card, out of the same warehouse in Cali. I'm taking this with a grain of salt, but at this point I just want my stuff, especially with the new BF1 DLC dropping tomorrow...


----------



## trawetSluaP

Quote:


> Originally Posted by *stocksux*
> 
> What do you mean single slot?? That's pretty flat. Perhaps they will offer a single slot I/O cover as the dula one isn't necessary with a water block on it.


That's what I meant: a single-slot I/O bracket.


----------



## MunneY

Well,

To interrupt the arguments... My cards showed up... and apparently the BIOS on my X99 Micro crapped out, sooooooooooooo... I still can't use the dang things.


----------



## dboythagr8

Got an email about a claim being opened, so I guess they went ahead and did that for me.


----------



## Slackaveli

Quote:


> Originally Posted by *dboythagr8*
> 
> Hello, we received a response from our warehouse and this was sent out via FedEx on March 10th with the tracking# [removed]. What we can do at this point is open a shipping claim for a lost package. Would you like us to proceed with this? If the original package cannot be found by FedEx would you like us to send out a replacement or issue a refund? - Mak
> 
> So now 5+ 1080tis are "lost"? I'm getting beyond pissed.


wow. some fools jacked them.


----------



## MURDoctrine

Sorry to hear your cards are "missing/lost". Some worker probably saw them and stole them, to be honest.

Well, I got my tracking number just now, so hopefully mine doesn't vanish as well. I'm worried the weather is going to somehow delay my delivery though, coming from Cali to NC with all that snow hitting the Midwest at the moment.


----------



## Slackaveli

Quote:


> Originally Posted by *MunneY*
> 
> Well
> 
> To interrupt the arguements... My cards showed up... and apparently my Bios in my X99 Micro crapped out sooooooooooooo... I still cant use the dang things.


Dang, unlucky! Mine arrived a couple of hours ago. It runs 1873 MHz out of the box under load.


----------



## Besty

Quote:


> Originally Posted by *pantsoftime*
> 
> For those of you looking to unlock voltage control in Afterburner, it is possible by editing a file.
> Source: http://www.guru3d.com/articles-pages/geforce-gtx-1080-ti-review,32.html


If you could not get this tweak working, the review has a minor update with clearer instructions, so give it another try. Props to the Guru3d guys.

The update was needed because the code can differ from machine to machine.


----------



## BoredErica

Quote:


> Originally Posted by *Emissary of Pain*
> 
> May I join ?


There's nothing to join. All the OP did was write two sentences and call it a thread.


----------



## AMDATI

Yeah, a lot of stuff gets stolen in the shipping line. I've had boxes arrive opened, as if someone had seen Newegg or Amazon on the box and opened it to check whether what's inside was any good or small enough to sneak out with. Thankfully they couldn't exactly hide a whole keyboard under a jacket on the way out, so they just decided not to steal it, but the box was still opened almost enough for the contents to fall out; they didn't even bother taping it back up. A 1080 Ti, however, would be easy to slip into a jacket. They might pull it off the conveyor belt and hide it somewhere, then come pick it up later, potentially one card at a time until they get all the cards.

I really wish Newegg/Amazon would stop labeling their boxes with the Newegg/Amazon logos; that would cut down hugely on stolen packages.


----------



## yukkerz

I ordered the MSI variant on newegg. Looks like I might be one of the people not getting it.


----------



## sWaY20

Quote:


> Originally Posted by *yukkerz*
> 
> I ordered the MSI variant on newegg. Looks like I might be one of the people not getting it.


I doubt it; tracking on ours didn't even make it that far. It's still on "shipping info sent to FedEx," and it's not out of that city either. Hope you're good, though.


----------



## sWaY20

Quote:


> Originally Posted by *AMDATI*
> 
> Yeah a lot of stuff gets stolen in the shipping line. I've had boxes arrived opened, as if someone had saw newegg or amazon on the box and opened them to check what's inside to see if it's any good or small enough to sneak out with. Thankfully they couldn't exactly hide a whole keyboard under their jacket on the way out, so they just decided to not steal it, but the box was still essentially opened almost enough for the contents to fall out, they didn't even bother taping it back up. A 1080ti however, would be easy to slip into a jacket. They might pull it out of the conveyor belt and hide it somewhere then come pick it up later, potentially one card at a time until they get all cards.
> 
> I really wish newegg/amazon would stop labeling their boxes with newegg/amazon logo's, that would cut down hugely on stolen packages.


I worked at an Amazon sorting center part-time for the holidays a few years ago. There's an insane amount of security; there's just no way to do that, at least at Amazon. I mean, I couldn't even take my phone past the lockers, and they took down my phone's serial number for their records. They even had metal detectors.


----------



## AMDATI

This is how mine shipped from Newegg...



Looks like they don't just ship in big batches, but rather every half hour or hour or so throughout the day. I think they wanted to get those 1080 Tis out ASAP so they could make room for more stock coming in. After all, with as many as they were selling on day 1, there's no reason not to ship out as many as possible. Pretty much all of the 1080 Tis come from City of Industry, CA, because it's near the coast where stock ships in from manufacturers in Asia. So it seems like 1080 Tis in general got extra priority when it came to processing orders and getting them shipped.


----------



## yukkerz

Weird that I didn't see your comment at the bottom until I quoted you. What type of shipping? Given that I did 2-day and it still hasn't hit a hub and it's just about 7 p.m., I doubt I'll get it tomorrow. Let's not forget about the snow the east coast (where I'm at) is about to get. :-( Sadness.
Quote:


> Originally Posted by *AMDATI*
> 
> This is how mine shipped from Newegg...
> 
> 
> 
> Looks like they don't just ship in big batches, but rather just every half hour or hour or so throughout the day. I think they wanted to get those 1080ti's out ASAP so they could make room for more stock coming in. After all, with as many as they were selling day 1, no reason to not ship as many as possible out. Pretty much all of the 1080ti's come from City of Industry, CA.....because it's off the coast where it ships in from manufacturers in asia.


----------



## AMDATI

I did the 2-day since it was 50 cents more than the standard $15 3-day. But I did order in the very first batch, within 10 minutes after it went live, and my billing cleared almost immediately too, so they had their money.

I'd say the east coast is 3 days minimum anyway; it is shipping from the west coast, after all. You don't really count the first day they ship, as that's not a full day.


----------



## Slushpup

Got mine in for my second rig today!


----------



## Radox-0

Tad long post, but I thought I would share some benching I did for most of today between the 1080 and the 1080 Ti, both FE, to see how they compare at stock and overclocked, with a few Titan X Maxwell scores sprinkled in.

System Specs used: 5960x @ 4.6 Ghz, Cache @ 4.4 Ghz, 32GB RAM @ 14-17-17-34 1T

Panel used was the ASUS PG348Q, so 3440 x 1440 with V-Sync and G-Sync off in everything.

All cards used a 1:1 fan profile

GTX 1080Ti

Stock settings average boost 1860 Mhz Core / 11,016 Memory
Overclocked settings average boost 2002 Mhz / 12,000 MHz Memory

GTX 1080

Stock settings average boost 1848 Mhz Core / 10,012 Memory
Overclocked settings average boost 2075 Mhz / 11,000 MHz Memory

Titan X (Maxwell, EVGA SC BIOS)

Stock settings average boost 1316 Mhz Core / 7,012 Memory
Overclocked settings average boost 1474 Mhz / 8,020 MHz Memory

% increase and decrease is based on my stock 1080 scores. FPS are Averages

*OVERALL AVERAGES ACROSS THE BENCHMARKS*

GTX 1080: BASELINE
GTX 1080 OC'd: 9.98 % increase
GTX 1080Ti: 29.5% increase
GTX 1080Ti OC'd: 39.3% increase
Titan X: 12.2% Decrease
Titan X OC'd: 3.1% Decrease
Overall the stock 1080Ti is 30% or so ahead of the stock 1080, and similar for the overclocked 1080Ti vs the overclocked 1080. The gap is smaller when we strip out synthetics, which will mostly scale better than games. Now to see how SLI 1080 vs SLI 1080Ti compares.









*Far Cry Primal - Ultra Preset + HD Pack*

GTX 1080: 60 FPS
GTX 1080 OC'd: 67 FPS (11.6% Increase)
GTX 1080Ti: 80 FPS (33% increase)
GTX 1080Ti OC'd: 87 FPS (45% increase)
Titan X: Average: 53 (11.7% Decrease)
Titan X OC'd: Average: 59 (2% Decrease)

*Ghost Recon Wildland - Very High Preset*

GTX 1080: 57.5 FPS
GTX 1080 OC'd: 61.21 FPS (6.5% increase)
GTX 1080Ti: 71.25 FPS (23.9% increase)
GTX 1080Ti OC'd: 76.59 FPS (33.2% increase)

*Ashes of the Singularity GPU Test - DX11 - Extreme Preset*

GTX 1080: 57.5 FPS
GTX 1080 OC'd: 61.21 FPS (6.5% increase)
GTX 1080Ti: 71.25 FPS (23.9% increase)
GTX 1080Ti OC'd: 76.59 FPS (33.2% increase)
*GTA 5 - All settings Maxed but no MSAA on anything*

GTX 1080: 71 FPS
GTX 1080 OC'd: 83.1 FPS (15.7% increase)
GTX 1080Ti: 88 FPS (22.6% increase)
GTX 1080Ti OC'd: 94.8 FPS (32% increase)
Titan X: 64.34 FPS (10.4% Decrease)
Titan X OC'd: 72.5 FPS (1% increase)

*Arkham Knight - All settings maxed, No gamework settings on*

GTX 1080: 84 FPS
GTX 1080 OC'd: 91 FPS (8.3% increase)
GTX 1080Ti: 103 FPS (22.6% increase)
GTX 1080Ti OC'd: 116 FPS (38.1% increase)

*Tomb Raider - Ultimate Preset*

GTX 1080: 98.2 FPS
GTX 1080 OC'd: 108 FPS (10% increase)
GTX 1080Ti: 131.9 FPS (34% increase)
GTX 1080Ti OC'd: 141 FPS (44.5% increase)
Titan X: 88 FPS (10.4% Decrease)
Titan X OC'd: 96 FPS (2.3% decrease)
*Deus Ex Mankind Divided - Ultra Preset, no MSAA*

GTX 1080: 41.3 FPS
GTX 1080 OC'd: 45.6 FPS (10.4% increase)
GTX 1080Ti: 55.3 FPS (33.9% increase)
GTX 1080Ti OC'd: 62.1 FPS (50.4% increase)

*Dragon Age inquisition - Ultra Preset*

GTX 1080: 72.4 FPS
GTX 1080 OC'd: 79.6 FPS (9.9% increase)
GTX 1080Ti: 89.1 FPS (23.1% increase)
GTX 1080Ti OC'd: 102 FPS (40.1% increase)
Titan X: 60.5 FPS (16.4% Decrease)
Titan X OC'd: 68 FPS (6.1% Decrease)

*Witcher 3 - High Post processing Preset and Ultra Graphical quality Preset - Novigrad loop*

GTX 1080: 54 FPS
GTX 1080 OC'd: 60 FPS (11% increase)
GTX 1080Ti: 67 FPS (24.1% increase)
GTX 1080Ti OC'd: 72 FPS (33.3% increase)
Titan X: 49 FPS (9.3% Decrease)
Titan X OC'd: 52 FPS (3.1% Decrease)
*Witcher 3 - High Post processing Preset and Ultra Graphical quality Preset - Skellige crossroads trees / forests loop*

GTX 1080: 49 FPS
GTX 1080 OC'd: 51 FPS (4.1% increase)
GTX 1080Ti: 59 FPS (20.4% increase)
GTX 1080Ti OC'd: 63 FPS (28.6% increase)
Titan X: 43 FPS (12% Decrease)
Titan X OC'd: 47 FPS (4.1% Decrease)

*Witcher 3 - High Post processing Preset and Ultra Graphical quality Preset - Beauclair castle balcony loop (Long Vista) *

GTX 1080: 67 FPS
GTX 1080 OC'd: 72 FPS (7.5% increase)
GTX 1080Ti: 82 FPS (22.4% increase)
GTX 1080Ti OC'd: 87 FPS (29.9% increase)
Titan X: 60 FPS (10.4% decrease)
Titan X OC'd: 65 FPS (3% Decrease)

*Hitman Absolution - Ultra - 2 x MSAA*

GTX 1080: 81.1 FPS
GTX 1080 OC'd: 92.6 FPS (14.1% increase)
GTX 1080Ti: 103.2 FPS (27% increase)
GTX 1080Ti OC'd: 113 FPS (39% increase)
Titan X: 68.4 FPS (15.7% Decrease)
Titan X OC'd: 78.1 FPS (3.7% Decrease)

*Shadow of Mordor - Ultra Preset*

GTX 1080: 94.95 FPS
GTX 1080 OC'd: 107.19 FPS (12.9% increase)
GTX 1080Ti: 116.06 FPS (22.2% increase)
GTX 1080Ti OC'd: 127.06 FPS (33.8% increase)
Titan X: 81.2 FPS (14.5% Decrease)
Titan X OC'd: 92.6 FPS (2.5% Decrease)

*Total War Warhammer - Ultra Preset*

GTX 1080: 68 FPS
GTX 1080 OC'd: 78.2 FPS (15% increase)
GTX 1080Ti: 96.71 FPS (42.2% increase)
GTX 1080Ti OC'd: 103 FPS (51.5% increase)

*Metro Last Light Redux - All settings Maxed No Motion Blur*

GTX 1080: 38.1 FPS
GTX 1080 OC'd: 41.14 FPS (8% increase)
GTX 1080Ti: 49.91 FPS (31% increase)
GTX 1080Ti OC'd: 56 FPS (47% increase)

*Valley Benchmark - Extreme HD Preset, resolution at 1440p*

GTX 1080: 66.1 FPS
GTX 1080 OC'd: 70.5 FPS (6.7% increase)
GTX 1080Ti: 89.9 FPS (36% increase)
GTX 1080Ti OC'd: 97.3 FPS (47% increase)
*Firestrike*

GTX 1080: Graphics Score: 20800
GTX 1080 OC'd: Graphics Score: 23700 (13.9% increase)
GTX 1080Ti: Graphics Score: 27417 (31.8% increase)
GTX 1080Ti OC'd: Graphics Score: 29050 (39.7% increase)
*Firestrike Extreme*

GTX 1080: Graphics Score: 9721
GTX 1080 OC'd: Graphics Score: 11451 (17.8% increase)
GTX 1080Ti: Graphics Score: 13465 (38.4% increase)
GTX 1080Ti OC'd: Graphics Score: 14525 (49.4% increase)

*Firestrike Ultra*

GTX 1080: Graphics Score: 4917
GTX 1080 OC'd: Graphics Score: 5624 (14.4% increase)
GTX 1080Ti: Graphics Score: 6808 (38.5% increase)
GTX 1080Ti OC'd: Graphics Score: 7109 (44.6% increase)
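For anyone wanting to sanity-check the percentages, they are just each result's FPS relative to the stock GTX 1080 baseline. A minimal sketch (numbers taken from the Far Cry Primal results in this post; the dict labels are just shorthand):

```python
# Recompute the relative gains/losses against the stock GTX 1080 baseline.

def pct_change(fps: float, baseline: float) -> float:
    """Percent increase (positive) or decrease (negative) vs the baseline card."""
    return (fps / baseline - 1.0) * 100.0

# Far Cry Primal - Ultra Preset + HD Pack; stock GTX 1080 = 60 FPS baseline.
results = {"1080 OC'd": 67, "1080Ti": 80, "1080Ti OC'd": 87, "Titan X": 53}

for card, fps in results.items():
    print(f"{card}: {pct_change(fps, 60):+.1f}%")  # e.g. 1080Ti -> +33.3%
```

(The post rounds slightly differently in places, e.g. 11.6% vs 11.7% for the overclocked 1080, but the method is the same.)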


----------



## pompss

Got mine today; 1950 MHz stable on air for right now.

Will do more tests later on.

Newegg sucks; they never shipped mine on Friday even though they marked it as shipped.

The tracking number never showed a pickup scan. Never again.

Nvidia's website shipped mine Friday and I got it today.


----------



## xTesla1856

The guy I sold the Titan to bailed out of the deal. Oh well, I guess I'm stuck with this lowly TXP


----------



## KedarWolf

Quote:


> Originally Posted by *pompss*
> 
> Got mine today 1950 mhz stable on air for right now
> 
> Will do more test later on.
> 
> Newegg sucks they never shipped mine on friday even they marked as shipped
> 
> Tracking number never showed a pick up scan. Never again
> 
> Nvidia web site shipped mine friday and got it today


Watch it in MSI Afterburner on air. When I set a voltage curve to 2000-2080 MHz it doesn't matter; the temperature will hit 60°C right away and it'll downclock to 1981 MHz no matter how I adjust the curve.









Well, until I get paid and order my EK waterblock that is.
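That behaviour is Pascal's GPU Boost stepping the clock down in small (~13 MHz) bins as the core crosses temperature thresholds, regardless of the user curve. A toy model of the effect described above; the bin size is the commonly observed Pascal granularity, but the threshold temperatures here are purely illustrative, not the card's actual BIOS values:

```python
# Toy model of Pascal GPU Boost thermal binning (illustrative numbers only).
# Assumes a ~13 MHz bin size and one downclock step per temperature threshold
# crossed; the real thresholds live in the card's BIOS and are not public.

BIN_MHZ = 13  # approximate Pascal boost-bin granularity

def boosted_clock(target_mhz, temp_c, thresholds=(37, 48, 54, 60, 63)):
    """Drop one boost bin for each temperature threshold crossed."""
    bins_dropped = sum(1 for t in thresholds if temp_c >= t)
    return target_mhz - bins_dropped * BIN_MHZ

# A card targeting ~2033 MHz that heats to 60C on air loses 4 bins:
print(boosted_clock(2033, 60))  # -> 1981
```

Which is why, short of a water block, the curve point above ~60°C effectively gets ignored.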


----------



## dboythagr8

Quote:


> Originally Posted by *Radox-0*
> 
> Tad long post, but thought I would share some benching I did most of today between 1080's and 1080Ti's both FE to see how they compare at stock and overclocked with a few Titan X Maxwell Scores sprinkled in.
> 
> System Specs used: 5960x @ 4.6 Ghz, Cache @ 4.4 Ghz, 32GB RAM @ 14-17-17-34 1T
> 
> Panel used was the ASUS PG348Q, so 3440 x 1440 with V-Sync and G-Sync off in everything.
> 
> All cards used a 1:1 fan profile
> 
> GTX 1080Ti
> 
> Stock settings average boost 1860 Mhz Core / 11,016 Memory
> Overclocked settings average boost 2002 Mhz / 12,000 MHz Memory
> 
> GTX 1080
> 
> Stock settings average boost 1848 Mhz Core / 10,012 Memory
> Overclocked settings average boost 2075 Mhz / 11,000 MHz Memory
> 
> Titan X (Maxwell, EVGA SC BIOS)
> 
> Stock settings average boost 1316 Mhz Core / 7,012 Memory
> Overclocked settings average boost 1474 Mhz / 8,020 MHz Memory
> 
> Percentage increases and decreases are based on my stock 1080 scores. FPS figures are averages.
> 
> *OVERALL AVERAGES ACROSS THE BENCHMARKS*
> 
> GTX 1080: BASELINE
> GTX 1080 OC'd: 9.98 % increase
> GTX 1080Ti: 29.5% increase
> GTX 1080 OC'd: 39.3% increase
> Titan X: 12.2% Decrease
> Titan X OC'd: 3.1% Decrease
> Overall the stock 1080Ti is 30% or so ahead of the stock 1080, and similar for overclocked 1080Ti vs 1080. The gap is smaller when we strip out synthetics, which mostly scale better than games. Now to see how SLI 1080 vs SLI 1080Ti compares.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Far Cry Primal - Ultra Preset + HD Pack*
> 
> GTX 1080: 60 FPS
> GTX 1080 OC'd: 67 FPS (11.6% Increase)
> GTX 1080Ti: 80 FPS (33% increase)
> GTX 1080 OC'd: 87 FPS (45% increase)
> Titan X: Average: 53 (11.7% Decrease)
> Titan X OC'd: Average: 59 (2% Decrease)
> 
> *Ghost Recon Wildland - Very High Preset*
> 
> GTX 1080: 57.5 FPS
> GTX 1080 OC'd: 61.21 FPS (6.5% increase)
> GTX 1080Ti: 71.25 FPS (23.9% increase)
> GTX 1080 OC'd: 76.59 FPS (33.2% increase)
> 
> *Ashes of the Singularity GPU Test - DX11 - Extreme Preset*
> 
> GTX 1080: 57.5 FPS
> GTX 1080 OC'd: 61.21 FPS (6.5% increase)
> GTX 1080Ti: 71.25 FPS (23.9% increase)
> GTX 1080 OC'd: 76.59 FPS (33.2% increase)
> 
> *GTA 5 - All settings Maxed but no MSAA on anything*
> 
> GTX 1080: 71 FPS
> GTX 1080 OC'd: 83.1 FPS (15.7% increase)
> GTX 1080Ti: 88 FPS (22.6% increase)
> GTX 1080 OC'd: 94.8 FPS (32% increase)
> Titan X: 64.34 FPS (10.4% Decrease)
> Titan X OC'd: 72.5 FPS (1% increase)
> 
> *Arkham Knight - All settings maxed, No gamework settings on*
> 
> GTX 1080:84 FPS
> GTX 1080 OC'd: 91 FPS (8.3% increase)
> GTX 1080Ti: 103 FPS (22.6% increase)
> GTX 1080 OC'd: 116 FPS (38.1% increase)
> 
> *Tomb Raider - Ultimate Preset*
> 
> GTX 1080: 98.2 FPS
> GTX 1080 OC'd: 108 FPS (10% increase)
> GTX 1080Ti: 131.9 FPS (34% increase)
> GTX 1080 OC'd: 141 FPS (44.5% increase)
> Titan X: 88 FPS (10.4% Decrease)
> Titan X OC'd: 96 FPS (2.3% decrease)
> 
> *Deus Ex Mankind Divided - Ultra Preset no MSAA*
> 
> GTX 1080: 41.3 FPS
> GTX 1080 OC'd: 45.6 FPS (10.4% increase)
> GTX 1080Ti: 55.3 FPS (33.9% increase)
> GTX 1080 OC'd: 62.1 FPS (50.4% increase)
> 
> *Dragon Age inquisition - Ultra Preset*
> 
> GTX 1080: 72.4 FPS
> GTX 1080 OC'd: 79.6 FPS (9.9% increase)
> GTX 1080Ti: 89.1 FPS (23.1% increase)
> GTX 1080 OC'd: 102 FPS (40.1% increase)
> Titan X: 60.5 FPS (16.4% Decrease)
> Titan X OC'd: 68 FPS (6.1% Decrease)
> 
> *Witcher 3 - High Post processing Preset and Ultra Graphical quality Preset - Novigrad loop*
> 
> GTX 1080: 54 FPS
> GTX 1080 OC'd: 60 FPS (11% increase)
> GTX 1080Ti: 67 FPS (24.1% increase)
> GTX 1080 OC'd: 72 FPS (33.3% increase)
> Titan X: 49 FPS (9.3% Decrease)
> Titan X OC'd: 52 FPS (3.1% Decrease)
> 
> *Witcher 3 - High Post processing Preset and Ultra Graphical quality Preset - Skellige crossroads trees / forests loop*
> 
> GTX 1080: 49 FPS
> GTX 1080 OC'd: 51 FPS (4.1% increase)
> GTX 1080Ti: 59 FPS (20.4% increase)
> GTX 1080 OC'd: 63 FPS (28.6% increase)
> Titan X: 43 FPS (12% Decrease)
> Titan X OC'd: 47 FPS (4.1% Decrease)
> 
> *Witcher 3 - High Post processing Preset and Ultra Graphical quality Preset - Beauclair castle balcony loop (Long Vista) *
> 
> GTX 1080: 67 FPS
> GTX 1080 OC'd: 72 FPS (7.5% increase)
> GTX 1080Ti: 82 FPS (22.4% increase)
> GTX 1080 OC'd: 87 FPS (29.9% increase)
> Titan X: 60 FPS (10.4% decrease)
> Titan X OC'd: 65 FPS (3% Decrease)
> 
> *Hitman Absolution - Ultra - 2x MSAA*
> 
> GTX 1080: 81.1 FPS
> GTX 1080 OC'd: 92.6 FPS (14.1% increase)
> GTX 1080Ti: 103.2 FPS (27% increase)
> GTX 1080 OC'd: 113 FPS (39% increase)
> Titan X: 68.4 FPS (15.7% Decrease)
> Titan X OC'd: 78.1 FPS (3.7% Decrease)
> 
> *Shadow of Mordor - Ultra Preset*
> 
> GTX 1080: 94.95 FPS
> GTX 1080 OC'd: 107.19 FPS (12.9% increase)
> GTX 1080Ti: 116.06 FPS (22.2% increase)
> GTX 1080 OC'd: 127.06 (33.8% increase)
> Titan X: 81.2 FPS (14.5% Decrease)
> Titan X OC'd: 92.6 FPS (2.5% Decrease)
> 
> *Total War Warhammer - Ultra Preset*
> 
> GTX 1080: 68 FPS
> GTX 1080 OC'd: 78.2 FPS (15% increase)
> GTX 1080Ti: 96.71 FPS (42.2%)
> GTX 1080 OC'd: 103 FPS (51.5% increase)
> 
> *Metro Last Light Redux - All settings Maxed No Motion Blur*
> 
> GTX 1080: 38.1 FPS
> GTX 1080 OC'd: 41.14 FPS (8% increase)
> GTX 1080Ti: 49.91 FPS (31% increase)
> GTX 1080 OC'd: 56 FPS (47% increase)
> 
> *Valley Benchmark - Extreme HD Preset, resolution at 1440p*
> 
> GTX 1080: 66.1 FPS
> GTX 1080 OC'd: 70.5 FPS (6.7% increase)
> GTX 1080Ti: 89.9 FPS (36% increase)
> GTX 1080 OC'd: 97.3 FPS (47% increase)
> 
> *Firestrike*
> 
> GTX 1080: Graphics Score: 20800
> GTX 1080 OC'd: Graphics Score: 23700 (13.9% increase)
> GTX 1080Ti: Graphics Score: 27417 (31.8% increase)
> GTX 1080 OC'd: Graphics score 29050 (39.7% increase)
> 
> *Firestrike Extreme*
> 
> GTX 1080: Graphics Score: 9721
> GTX 1080 OC'd: Graphics Score: 11451 (17.8% increase)
> GTX 1080Ti: Graphics Score: 13465 (38.4% increase)
> GTX 1080 OC'd: Graphics Score: 14525 (49.4% increase)
> 
> *Firestrike Ultra*
> 
> GTX 1080: Graphics Score: 4917
> GTX 1080 OC'd: Graphics Score: 5624 (14.4% increase)
> GTX 1080Ti: Graphics Score: 6808 (38.5% increase)
> GTX 1080 OC'd: Graphics Score: 7109 (44.6% increase)


Good data and thanks. FYI I think you left off the Ti name on your OC benches. Deus Ex is so unoptimized.
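For anyone double-checking the tables, the deltas are plain percentage change against the stock 1080 baseline. A quick sketch using the Far Cry Primal averages quoted above:

```python
# Percentage change of a benchmark score against the stock GTX 1080 baseline.
def pct_change(score, baseline):
    """Return percentage change of `score` relative to `baseline`."""
    return (score / baseline - 1) * 100

# Far Cry Primal averages from the post: stock 1080 = 60 FPS baseline.
print(round(pct_change(80, 60), 1))  # 1080 Ti stock -> 33.3
print(round(pct_change(87, 60), 1))  # 1080 Ti OC'd  -> 45.0
```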


----------



## Radox-0

Quote:


> Originally Posted by *dboythagr8*
> 
> Good data and thanks. FYI I think you left off the Ti name on your OC benches. Deus Ex is so unoptimized.


Yeah, it all sounds the same after a while when typing it up







Hopefully the order makes it fairly obvious.

Yeah, Deus Ex is quite bad, as were a few other games I tried, but they were giving such varied results that I ended up leaving them out.


----------



## dboythagr8

Looks like 1080ti will be able to run Mass Effect Andromeda on Ultra @ 4k60
Quote:


> Earlier today, we got our first report of Mass Effect Andromeda's PC performance. After EA worried us with recommended requirements targeting just 4K@30fps on High settings, it seems like they were just being very conservative and the game should actually run much smoother.
> 
> Bioware's Michael Gamble, Producer on Mass Effect Andromeda (as well as the original trilogy), commented on Twitter that the PC version is not a port and gamers can rest easy on its quality.
> 
> We then asked him if the fastest new single GPU produced by NVIDIA, the GTX 1080Ti, will be able to run Mass Effect Andromeda smoothly at 4K and Ultra settings. Apparently, it may even hit the 60+FPS sweet spot.


http://wccftech.com/mass-effect-andromeda-not-port-pc/


----------



## Motley01

Ohh nice! Looking forward to running ME: A on 4k on my brand new 4k monitor.


----------



## fisher6

Finally got the block installed and the 1080 Ti under water. My 1440p Qnix from South Korea wouldn't even show an image, just a black screen, with the included adapter. I have a ROG SWIFT PG348Q coming tomorrow which should solve that problem.

Initial random OC results in Valley: core is at 2075MHz stable with 12k on the memory. Temps are 35-40C in Valley. Will report more results tomorrow, hopefully.


----------



## Radox-0

Quote:


> Originally Posted by *dboythagr8*
> 
> Looks like 1080ti will be able to run Mass Effect Andromeda on Ultra @ 4k60
> http://wccftech.com/mass-effect-andromeda-not-port-pc/


That's good to hear, looking forward to this title for ages!

Quote:


> Originally Posted by *fisher6*
> 
> Finally got the block installed and the 1080 Ti under water. My 1440p Qnix from South Korea woudln't even show an image, just a black screen with the included adapter. I have a ROG SWIFT PG348Q coming tomorrow which should solve that problem.
> 
> Initial random OC results in valley: Core is on 2075Mhz stable with 12k on the memory. Temps are 35-40C in valley. Will report more results tomorrow hopefully.


The panel I use with the cards is pretty amazing once up and running. Will be curious how much further you get the memory. I've taken it to 12 GHz, but still haven't found the point at which the score starts dropping.


----------



## Agueybana_II

Quote:


> Originally Posted by *Agueybana_II*
> 
> Decided to reach out to Gigabyte, ASUS, PNY, MSI, EVGA, and ZOTAC to ask the following question regarding WARRANTY:
> 
> My email is in regards to [Company]'s warranty for the 1080TI FE. Does [Company] void the warranty of the GPU after an aftermarket water block is installed?
> 
> ****Responses*****
> 
> EVGA stated:
> Hello,
> 
> As long as you are able to put the original cooler back on the card when sending it in for replacement there shouldn't be any issues with warranty. Simply removing the cooler does not void your warranty. If you have any other questions, please let us know.
> 
> Regards,
> EVGA
> 
> Will update as manufacturers provide responses.


****ZOTAC****
Hi,

Thank you for contacting ZOTAC Technical Support.

Yes sir. It will void your warranty.

Cheers,
Paul
Zotac Technical Support

****ASUS****
Dear Valued Customer,

Thanks for making your inquiry. Your warranty is voided should you cause any customer-induced damages. We recommend any modification being done to the product should be done by a certified technician. Please be advised that you can consult us for technical support as well.

Best Regards,

Anthony Williams
ASUS eShop Representative

****PNY****
Hello,

Thank you for contacting PNY Technical Support.

Any modifications or dismantling of the card will void the cards warranty.

Jan

PNY Technical Support

______________________________________________
Still awaiting MSI and Gigabyte; will update once received.


----------



## Slushpup

Quote:


> Originally Posted by *Agueybana_II*
> 
> ****ZOTAC****
> Hi,
> 
> Thank you for contacting ZOTAC Technical Support.
> 
> Yes sir. It will void your warranty.
> 
> Cheers,
> Paul
> Zotac Technical Support
> 
> ****ASUS****
> Dear Valued Customer,
> 
> Thanks for making your inquiry. Your warranty is voided should you cause any customer-induced damages. We recommend any modification being done to the product should be done by a certified technician. Please be advised that you can consult us for technical support as well.
> 
> Best Regards,
> 
> Anthony Williams
> ASUS eShop Representative
> 
> ****PNY****
> Hello,
> 
> Thank you for contacting PNY Technical Support.
> 
> Any modifications or dismantling of the card will void the cards warranty.
> 
> Jan
> 
> PNY Technical Support
> 
> ______________________________________________
> Still awaiting MSI and Gigabyte; will update once received.


How would they even know if you put it back on the same way you took it off?


----------



## DerComissar

@Radox-0:
Thanks for your hard work, and posting all those benches.
Great comparisons, good to see how well the 1080TI is doing in a real-world scenario.
Rep+


----------



## blackforce

Some people have too much time on their hands. Who cares?
Don't buy it, or just go get an EVGA where they don't care.


----------



## blackforce

Quote:


> Originally Posted by *Radox-0*
> 
> *Full 1080 / 1080 Ti / Titan X benchmark comparison snipped.*


Yeah, thanks, this has helped me out a ton.
Waiting on a hybrid, any brand.


----------



## Radox-0

Quote:


> Originally Posted by *DerComissar*
> 
> @Radox-O:
> Thanks for your hard work, and posting all those benches.
> Great comparisons, good to see how well the 1080TI is doing in a real-world scenario.
> Rep+


No problem








Quote:


> Originally Posted by *blackforce*
> 
> Yeah thanks this has helped me out a ton.
> waiting on hybird any brand.


No probs. Great stuff. Will be interested to see how the hybrids fare. No doubt it will make these cards even better, being able to maintain a higher consistent boost.


----------



## dboythagr8

Quote:


> Originally Posted by *blackforce*
> 
> Some people have to much time on ther hands, who cares?
> don't buy it or just go get a evga where they don't care.


Some people care? Why would you not want to at least be aware of a company's policy on modification? It's called being a smart consumer.


----------



## Outcasst

Got my card running with a Corsair H55, 2025Mhz solid no throttling.


----------



## BigMack70

Doing some final testing but looks like my card is a dud that tops out between 1900 and 1950 MHz average clock in-game. Peak clocks around 2040 MHz. This is a +130 offset. Increasing voltage does not appear to allow higher clocks for me.

Temps are great though with the EVGA hybrid cooler... 51C max so far when overclocked.

Not sure how some of you guys are getting no throttling - I get power throttled constantly at 120% power and stock volts.

Performance is ok-ish... Definitely not as fast as Titan XM SLI in SLI supported titles, but it feels good to be done with SLI

It was surprisingly easy to put on the hybrid cooler... No modifications needed and just took fifteen minutes


----------



## dboythagr8

Quote:


> Originally Posted by *BigMack70*
> 
> Doing some final testing but looks like my card is a dud that tops out between 1900 and 1950 MHz average clock in-game. Peak clocks around 2040 MHz. This is a +130 offset. Increasing voltage does not appear to allow higher clocks for me.
> 
> Temps are great though with the EVGA hybrid cooler... 51C max so far when overclocked.
> 
> Not sure how some of you guys are getting no throttling - I get power throttled constantly at 120% power and stock volts.
> 
> Performance is ok-ish... Definitely not as fast as Titan XM SLI in SLI supported titles, but it feels good to be done with SLI
> 
> It was surprisingly easy to put on the hybrid cooler... No modifications needed and just took fifteen minutes


I thought some people reported having to use a tool to cut metal or something.

Is this what you are using?


----------



## KedarWolf

Quote:


> Originally Posted by *BigMack70*
> 
> Doing some final testing but looks like my card is a dud that tops out between 1900 and 1950 MHz average clock in-game. Peak clocks around 2040 MHz. This is a +130 offset. Increasing voltage does not appear to allow higher clocks for me.
> 
> Temps are great though with the EVGA hybrid cooler... 51C max so far when overclocked.
> 
> Not sure how some of you guys are getting no throttling - I get power throttled constantly at 120% power and stock volts.
> 
> Performance is ok-ish... Definitely not as fast as Titan XM SLI in SLI supported titles, but it feels good to be done with SLI
> 
> It was surprisingly easy to put on the hybrid cooler... No modifications needed and just took fifteen minutes


With 2000 or even 2080 on a 1.093V voltage curve, it'll throttle at 60C to 1981 consistently, right away. I'll have a waterblock for the card in a week or so and will post my results then.
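The behavior described above matches how Pascal's GPU Boost steps the clock down in roughly 13 MHz bins as the GPU crosses temperature thresholds. A toy model of that effect (the bin size, 60C threshold, and degrees-per-bin are assumptions for illustration, not NVIDIA's actual algorithm):

```python
# Illustrative model only -- NOT NVIDIA's actual GPU Boost 3.0 algorithm.
# Pascal drops the boost clock in roughly 13 MHz bins as the GPU crosses
# internal temperature thresholds; threshold and step here are assumptions.
BIN_MHZ = 13       # approximate size of one boost bin
THRESHOLD_C = 60   # temperature where bins start dropping (per reports above)
STEP_C = 5         # assumed degrees per additional dropped bin

def boosted_clock(target_mhz, temp_c):
    """Clock after temperature-based bin dropping, in this toy model."""
    if temp_c < THRESHOLD_C:
        return target_mhz
    bins_dropped = 1 + (temp_c - THRESHOLD_C) // STEP_C
    return target_mhz - bins_dropped * BIN_MHZ

print(boosted_clock(2000, 55))  # below threshold -> 2000
print(boosted_clock(2000, 60))  # one bin down    -> 1987
```

Real cards quantize to fixed clock states (hence the 1987/1974-style numbers people report), which is why a 2000 MHz curve point rarely survives 60C on air.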


----------



## BigMack70

Quote:


> Originally Posted by *dboythagr8*
> 
> I thought some people reported having to use a tool to cut metal or something.
> 
> Is this what you are using?


I am using the 980 Ti AIO from EVGA that I had on my Titan XM. Didn't have to cut anything because I just ignored the half of the shroud near the I/O panel and left it off... it wouldn't really serve a purpose anyway. The part that's important is the shroud around the blower fan, which more or less fit fine.

I assume the same thing could be done with the 1080 AIO and you'd also get the option of cutting the shroud to fit, but I didn't want to spend the money when I didn't need to.

-edit- On the bright side, looks like my memory overclocks like a champ... +600 MHz looking like the sweet spot


----------



## pompss

Will mount the EK waterblock as soon as I finish some air tests.

Newegg still hasn't shipped my order. It's been marked as shipped, with a tracking number added, since Friday without the card actually shipping.

Also, customer support told me they will file a claim with FedEx.

There is no pickup scan; you can't file a claim if FedEx never received the card.

They just lost my Premier membership and a customer. I couldn't believe what other customers said about Newegg, but I do now.


----------



## dboythagr8

Quote:


> Originally Posted by *pompss*
> 
> Will mount the EK waterblock as soon as I finish some air tests.
> 
> Newegg still hasn't shipped my order; it's been marked as shipped with a tracking number added since Friday without the card actually shipping.
> 
> They just lost my Premier membership and a customer.


Hope you guys are letting them know via social media or via their customer reps. Unacceptable for such a high priced item.


----------



## Slushpup

Not bad for just installing an hour ago. Thought +175 was a good start and it killed it. Might try +200 in a bit.


----------



## pompss

Guys, is it normal that my Intel i7-6850K overclocked to 4GHz at 1.249V has a 41C idle temperature with a 480 radiator, the pump at full speed, and three fans at 50% speed?

Seems pretty high to me.


----------



## dboythagr8

Quote:


> Originally Posted by *pompss*
> 
> Guys, is it normal that my Intel i7-6850K overclocked to 4GHz at 1.249V has a 41C idle temperature with a 480 radiator, the pump at full speed, and three fans at 50% speed?
> 
> Seems pretty high to me.


It does seem a bit on the high side. I have a 4930k (6-core) @ 4.5ghz hooked to a Kraken x62. It's in silent mode with the fans moving between 480 - 530 RPM. My current temp just browsing the web is ~35c.


----------



## Slackaveli

+175, eh. I haven't tried past +150 because I locked up in the FFXIV benchmark test at +160. Wonder if it was a fluke. Damn, I hate the process of discovering the "ceiling" on core clocks. At +150 I get exactly 2000 boost that settles in to 1987 in game.

In Battlefield 1 I was locked at 60 FPS, never dipped, all ultra everything, TAA and all, in 4K. I slid the resolution slider up and it took up to 115% res before it dropped to 57 fps minimums. That's pretty filthy.
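Worth noting how much extra load that slider adds: assuming the resolution-scale percentage applies to both axes (an assumption about how BF1 implements it), 115% is about 32% more pixels than native 4K, not 15%. A quick sketch:

```python
# Assumes the resolution-scale slider scales BOTH axes, so pixel count
# grows with the square of the percentage (an assumption about BF1).
def scaled_pixels(width, height, scale_pct):
    """Pixels actually rendered at a given resolution-scale percentage."""
    s = scale_pct / 100
    return round(width * s) * round(height * s)

native = scaled_pixels(3840, 2160, 100)  # 8,294,400 pixels at native 4K
scaled = scaled_pixels(3840, 2160, 115)
print(round(scaled / native, 4))  # -> 1.3225, i.e. ~32% more pixels
```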


----------



## pompss

Quote:


> Originally Posted by *dboythagr8*
> 
> It does seem a bit on the high side. I have a 4930k (6-core) @ 4.5ghz hooked to a Kraken x62. It's in silent mode with the fans moving between 480 - 530 RPM. My current temp just browsing the web is ~35c.


I can't believe I applied the thermal paste wrong twice.

What a jerk.


----------



## Theros

One more into the thread. Here is my bench: http://www.3dmark.com/fs/11988933


----------



## drkCrix

How many people are using the stock cooler with this card or is everyone water-cooling?


----------



## AMDATI

I'm using stock cooler and not even gonna overclock. That's how I roll.


----------



## alucardis666

Quote:


> Originally Posted by *AMDATI*
> 
> I'm using stock cooler and not even gonna overclock. That's how I roll.


I see you like to live dangerously


----------



## Pandora's Box

Quote:


> Originally Posted by *drkCrix*
> 
> How many people are using the stock cooler with this card or is everyone water-cooling?


Currently running stock. More than likely going to stay stock too. I really don't want to give up my ITX case to put a water block on this card.


----------



## Slackaveli

I am air cooling with a 1:1 curve and never get over the mid-70s even with +150 on the core. Thing is so beastly it can just run stock and still crush 4K.


----------



## ajresendez

Has anybody asked Newegg how long their back-ordered cards will take to come in?


----------



## AMDATI

You could always put an external waterblock on it, or put a radiator on a free fan slot


----------



## AMDATI

Quote:


> Originally Posted by *ajresendez*
> 
> Has anybody asked Newegg how long their back-ordered cards will take to come in?


They wouldn't know, but I'd say it could be anywhere from tomorrow to a month or more. They've kind of stayed sold out after the first 24-48 hours which means there might not be any stock until Nvidia's next production batch is done and shipped. You have to keep in mind that all these units that just sold out, were produced over the last 3 months or so.

I know from the GTX 1080 launch, people couldn't get them really at all after the first day of launch sold out. Many people were complaining of back order for weeks.

That's exactly why I was prepared to order as soon as it dropped. I made sure I was logged in, had my payment/shipping information already set, etc. All I had to do was a couple quick clicks. It was literally sold out right after I ordered when I refreshed the page, so any slower by even a few milliseconds and I could have missed it.


----------



## drkCrix

Quote:


> Originally Posted by *Slackaveli*
> 
> I am air cooling with a 1:1 curve and never get over the mid-70s even with +150 on the core. Thing is so beastly it can just run stock and still crush 4K.


By 1:1 you mean at 40c the fan is 40% and at 80c it's at 80% ?


----------



## BigMack70

Quote:


> Originally Posted by *drkCrix*
> 
> By 1:1 you mean at 40c the fan is 40% and at 80c it's at 80% ?


AKA Vacuum Cleaner mode
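For reference, the "1:1" curve being joked about is literally fan % = temperature in C, clamped to the fan's usable range. A minimal sketch (the 30% floor is an assumption about the FE blower's minimum duty cycle):

```python
# A "1:1" fan curve: fan duty cycle (%) equals GPU temperature (C),
# clamped to a usable range. The 30% floor is an assumed blower minimum.
MIN_FAN, MAX_FAN = 30, 100

def fan_speed(temp_c):
    """Map temperature directly to fan percentage, with clamping."""
    return max(MIN_FAN, min(MAX_FAN, temp_c))

for t in (25, 40, 80, 105):
    print(t, "->", fan_speed(t))  # 25->30, 40->40, 80->80, 105->100
```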


----------



## jsutter71

Quote:


> Originally Posted by *Theros*
> 
> One more into the thread. Here is my bench: http://www.3dmark.com/fs/11988933


Interesting. Are you water cooling?

My TXP scores.
http://www.3dmark.com/fs/11512998


----------



## Juggalo23451

Quote:


> Originally Posted by *drkCrix*
> 
> How many people are using the stock cooler with this card or is everyone water-cooling?


I'll be water cooling in a Case labs S3. Teaser pic below


----------



## PewnFlavorTang

Well, I hope to join this club soon. Newegg has me frustrated as all get out. This is likely my last purchase from Newegg; their shipping blows.


----------



## lilchronic

Quote:


> Originally Posted by *PewnFlavorTang*
> 
> Well I hope to join this club soon. Newegg has me frustrated as all get out. This is likely my last purchase from newegg. Their shipping blows.


Did you order next day delivery?


----------



## tconroy135

Quote:


> Originally Posted by *BigMack70*
> 
> AKA Vacuum Cleaner mode


It's all about ambient temperatures if you use the blower. I keep my ambient temp feeding into the box at about 20c. It allows for pretty low temps. A lot of people are in hot rooms and that creates problems.


----------



## AMDATI

Newegg isn't as good as it was several years ago. I only order from them because of no taxes, and even then Newegg's prices aren't always the best.

Amazon does have a better return policy; I even had a package stolen and they replaced it, no questions asked. But those taxes kill me, since I pay 10% tax on everything.


----------



## PewnFlavorTang

Quote:


> Originally Posted by *lilchronic*
> 
> Did you order next day delivery?


Yes sir. Apparently it really means "next week delivery"


----------



## Theros

Quote:


> Originally Posted by *jsutter71*
> 
> Interesting. Are you water cooling?
> 
> My TXP scores.
> http://www.3dmark.com/fs/11512998


Yep, my 6950X has a Thermaltake Water 3.0 360mm (I had custom watercooling before but got tired of doing maintenance), and the 1080 Tis are on stock cooling (both Gigabyte Founders).


----------



## ajresendez

Quote:


> Originally Posted by *AMDATI*
> 
> They wouldn't know, but I'd say it could be anywhere from tomorrow to a month or more. They've kind of stayed sold out after the first 24-48 hours which means there might not be any stock until Nvidia's next production batch is done and shipped. You have to keep in mind that all these units that just sold out, were produced over the last 3 months or so.
> 
> I know from the GTX 1080 launch, people couldn't get them really at all after the first day of launch sold out. Many people were complaining of back order for weeks.
> 
> That's exactly why I was prepared to order as soon as it dropped. I made sure I was logged in, had my payment/shipping information already set, etc. All I had to do was a couple quick clicks. It was literally sold out right after I ordered when I refreshed the page, so any slower by even a few milliseconds and I could have missed it.


Just got off chat and they seem pretty sure that they will restock by the end of the week, and that I should be getting a shipping email by mid next week at the latest, if not by this Friday, so I guess we'll see. I'm just wanting it to get here before Andromeda launches.


----------



## Slackaveli

Yessir, works great. I will tweak the lower end to slightly less fan than temp once I find my normal idle temps, to cut noise during browsing if need be.


----------



## Slackaveli

Quote:


> Originally Posted by *drkCrix*
> 
> By 1:1 you mean at 40c the fan is 40% and at 80c it's at 80% ?


yessir.


----------



## Slackaveli

Quote:


> Originally Posted by *tconroy135*
> 
> It's all about ambient temperatures if you use the blower. I keep my ambient temp feeding into the box at about 20c. It allows for pretty low temps. A lot of people are in hot rooms and that creates problems.


so the ole cpu next to the window unit ac trick ftw.


----------



## Swolern

What has the highest core OC been here so far? Air vs water?

Also, are you guys seeing more temp or power limit throttling?


----------



## KedarWolf

Quote:


> Originally Posted by *Swolern*
> 
> What has the highest core OC been here so far? Air vs water?
> 
> Also, are you guys seeing more temp or power limit throttling?


I got as high as 2080, but when I checked in Afterburner it immediately went above 60C and throttled to 1981 at both 2080 and 2000, so I run it at 2000 core on a voltage curve, +634 memory.









Will have my waterblock in a week or so and will update then.


----------



## tconroy135

Quote:


> Originally Posted by *Slackaveli*
> 
> so the ole cpu next to the window unit ac trick ftw.


That's my literal setup, haha.


----------



## lilchronic

Quote:


> Originally Posted by *PewnFlavorTang*
> 
> Yes sir. Apparently it really means "next week delivery"


I just got an update on the tracking # from FedEx. My package is now in Memphis, Tennessee.









@dboythagr8
@chibi
@SpartanJet
@sway20


----------



## AMDATI

fedex usually delivers early in the morning for me, I'm too excited to go to bed


----------



## tconroy135

Quote:


> Originally Posted by *lilchronic*
> 
> I just got an update on the tracking # from FedEx. My package is now in Memphis, Tennessee.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @dboythagr8
> @chibi
> @SpartanJet
> @sway20


I just got an update as well. Mine is 45 minutes away and the delivery should be today, I hope it's early, but not holding my breath.


----------



## zipeldiablo

So my card will arrive tomorrow.
For those who have a hybrid WC, is it worth it?
I don't have the room for a custom loop in my case (I mean, it is doable but definitely difficult), and from what I read it should lower temps enough for a good OC without too much noise?

Hope I didn't make a mistake ordering the FE, but I just sold one of my 980 Tis so I needed to order quickly, and we still don't have any word on when the custom cards will be released.


----------



## DrFreeman35

I am looking to upgrade to the Ti, and have two 1080 FTWs, both without the fix for the thermal problems. It took me about a year to purchase and complete my whole PC, and I was not keeping up with the problems these cards have. This is my first build, and I am not watercooling my cards. One of my cards is eligible for the Step-Up program through EVGA, and I can upgrade to the Ti for $279.99. Would you guys upgrade? Or wait for the AIBs to come out and hopefully upgrade then? I am not sure when they will be out, but my eligible card has 71 days left in the program. I eventually want two Tis, but I will most likely not watercool them anytime soon. Just curious about opinions on what I should do. My current cards hit 2075 and don't seem to get too hot while gaming, although I cannot see the VRM temps. Any input would be greatly appreciated, because I have contacted EVGA and tried to ask about sending in my cards for the thermal issue, or upgrading to the FTW2s. TIA


----------



## lilchronic

Quote:


> Originally Posted by *AMDATI*
> 
> fedex usually delivers early in the morning for me, I'm too excited to go to bed


I hope mine departs from Memphis soon, because then I might just be able to get it today before 8pm.


----------



## ozlay

Quote:


> Originally Posted by *Baasha*
> 
> Looks like I'll need to replace my Antec HCP-1300 Platinum with the EVGA 1600 T2 PSU since the 1300W is not enough to power 4x 1080 Ti.
> 
> Hmm.. but I love me some GPU sammich!


It's rated for 1300W, but it can do more. I've seen that model pushed up to 1700W or so. Interestingly, the older HCP-1200 model could be pushed over 2000W.


----------



## dade_kash_xD

Newegg is sneaky! Saturday night it showed available for purchase. So, purchase I did! My order has been on backorder since! How long does NE usually take to fill backorders? I wanted to wait for Vega before upgrading, but my 290s are showing their age @ 3440x1440, with massive driver issues as always!


----------



## MURDoctrine

Quote:


> Originally Posted by *dade_kash_xD*
> 
> Newegg is sneaky! Saturday night it showed available for purchase. So, purchase i did! My order has been in backorder since! How long does NE usually fill backorders? Wanted to wait for Vega b4 uprade but my 290's are showing their age @ 3440x1440 and massive driver issues always!


Someone probably snagged it before you could get it and it switched to backorder during your checkout process. I had a gigabyte one sell as I was putting it in my cart launch day.

Quote:


> Originally Posted by *lilchronic*
> 
> I just got an update on the tracking # from FedEx. My package is now in Memphis, Tennessee.


Mines there too. Hoping it comes a day early but I'm not holding my breath with Fedex. I've had items sit at the local distribution for a full 24 hours before haha.


----------



## Mato87

Right fellas, I have a question for you all.

Should I wait for a non-reference design card like the Gigabyte GeForce GTX 1080 Ti AORUS Xtreme edition, or snag the Founders Edition at a lower price now? I can get it for 799 euro right now. Will there be a lot of disadvantages with the reference design card when compared to the non-reference? I really want the Gigabyte GeForce GTX 1080 Ti AORUS Xtreme edition, but it seems it won't come out soon, as it was only announced like a week ago.


----------



## MURDoctrine

Quote:


> Originally Posted by *Mato87*
> 
> Right fellas, I have a question for you all.
> 
> Should I wait for a non-reference design card like the Gigabyte GeForce GTX 1080 Ti AORUS Xtreme edition, or snag the Founders Edition at a lower price now? I can get it for 799 euro right now. Will there be a lot of disadvantages with the reference design card when compared to the non-reference? I really want the Gigabyte GeForce GTX 1080 Ti AORUS Xtreme edition, but it seems it won't come out soon, as it was only announced like a week ago.


If you don't plan on watercooling, then wait for a cooler that can keep the card below its throttling points. One user has reported throttling starting at 60C, but others haven't had this issue, it seems. That being said, they run hot with the reference blower and thermals are the limiting factor, so unless you need it right now, wait.


----------



## Mato87

Quote:


> Originally Posted by *MURDoctrine*
> 
> If you don't plan on watercooling, then wait for a cooler that can keep the card below its throttling points. One user has reported throttling starting at 60C, but others haven't had this issue, it seems. That being said, they run hot with the reference blower and thermals are the limiting factor, so unless you need it right now, wait.


Yes, that's what I was afraid of; the reference cooling solution is kind of crappy. I was hoping NVIDIA had improved it, but it seems they haven't. There is nothing they can really improve with this type of cooling solution anyway, so whatever. I'll just wait until some non-reference design cards are out, but I worry that it will take too long. I heard some of the new cards take almost 3 months to make... we'll see.

So I just got a message from the shop rep saying the Gigabyte GeForce GTX 1080 Ti AORUS Xtreme edition should be added to the store for purchase this week, so I am excited. Hope he wasn't bull****ing me


----------



## Radox-0

cross.
Quote:


> Originally Posted by *Mato87*
> 
> Yes, that's what I was afraid of, the reference cooling solution is kind of ****ty. I was hoping nvidia improved it, but it seems they haven't. There is nothing they can improve with this type of cooling solution anyway so whatever. I'll just wait until some non reference design cards are out, but I worry that it will take too long. I heard some of the new cards take almost 3 months to make...we'll see.


Reference designs have their use cases, and I for one am thankful they produce them. Of course, it does suck for the 90% of users who do not need such a card







+ they are hands down the best-looking GPUs I have seen, though the 980 Ti Kingpin does get close







(yes, I am a sucker for that look)








Quote:


> Originally Posted by *DrFreeman35*
> 
> I am looking to update to the Ti, and have 2 1080 FTW's both without the update to the thermal problems. It took me about a year to purchase and complete my whole PC, and I was not keeping up with the problems of these cards. This is my first build, and I am not WC my cards. One of my cards is available for Step-up program through EVGA, and I can upgrade to the Ti for $279.99. Would you guys upgrade? Or wait for the AIB's to come out and hopefully upgrade then? I am not sure when they will be out, but my card that is available has 71 days left for the program. I want to eventually upgrade to 2 of the Ti's, but I will most likely not WC them anytime soon. Just curious as to some opinions on what i should do? My current cards hit 2075 and seem to not get too hot while gaming, although I cannot see the VRM temps...... Any input would be greatly appreciated, because I have contacted EVGA and tried to ask about sending in my cards for the thermal issue, or upgrading the FTW2's. TIA


I posted about 20 or so benchmarks I ran with the 1080, 1080 Ti and Titan X (Maxwell) a few pages back, at stock and overclocked settings. At stock, a 1080 Ti handily outperformed an overclocked 1080 at similar speeds to your FTW. Now, the 1080 Ti reference cooler can attain similar speeds (they will fluctuate), but noise will be a factor, so really you will limit your clocks on a daily basis. Unless you do not mind a lot of noise, I would possibly wait it out for third-party coolers, as you do not seem to have an absolute need for a blower-style cooler.


----------



## DrFreeman35

Quote:


> Originally Posted by *Mato87*
> 
> Yes, that's what I was afraid of, the reference cooling solution is kind of crappy. I was hoping nvidia improved it, but it seems they haven't. There is nothing they can improve with this type of cooling solution anyway so whatever. I'll just wait until some non reference design cards are out, but I worry that it will take too long. I heard some of the new cards take almost 3 months to make...we'll see.
> 
> So I just got a message from the shop rep saying the Gigabyte GeForce GTX 1080 Ti AORUS Xtreme edition should be added to the store for purchase this week, so Iam excited. Hope he wasn't bull****ing me


I hope EVGA is as quick on their end as Gigabyte then. I want to see If I can upgrade myself, considering I am not going to WC either.


----------



## Mato87

Quote:


> Originally Posted by *Radox-0*
> 
> cross.
> Reference design has their use cases and I am for one thankful they produce these. Of course it does suck for 90% of users who do not need such as card
> 
> 
> 
> 
> 
> 
> 
> + they are hands down the best looking GPU's I have seen, though the 980Ti Kingpin does get close
> 
> 
> 
> 
> 
> 
> 
> (yes I am a sucker for that look


Of course, but I won't ever go for a water cooling solution. They look sleek and tidy, but the cooling performance is very crappy in most cases, and what's more important, I heard about the thermal throttling, which is usually not a good sign. I really freaking hope the non-reference design cards will be available to purchase sometime this week.
Quote:


> Originally Posted by *DrFreeman35*
> 
> I hope EVGA is as quick on their end as Gigabyte then. I want to see If I can upgrade myself, considering I am not going to WC either.


I already saw both the EVGA GeForce GTX 1080 Ti FTW3 GAMING ICX and the MSI GTX 1080 Ti GAMING X 8G added to the store. Not yet available for purchase, mind you, but they are already in the store listing, which means they should be purchasable later this week or next. The question is whether I'll wait for the Gigabyte AORUS version or go with the EVGA, but considering the problems EVGA had with their 1080 VRMs, I'll pass on that brand.


----------



## BoredErica

Without a way of doing standardized testing and verification, people's clocks are going to be all over the place. If the OP would bother to update us on all the relevant 1080 Ti details that have emerged since he started the thread, that would be nice.



> Originally Posted by *Mato87*
> 
> Of course, but I won't ever go for water cooling solution. They look sleek and tidy, but the cooling performance is very crappy in most cases and whats more important I heard about the thermal throttling, which is usually not a good sign.


On air, GPU temps typically run above CPU temps. On water, GPU temps often go below CPU temps. If we're just comparing degrees, then a 280mm AIO like the Kraken X61 will kick the pants off AIB air cooling solutions. It's actually kind of ridiculous. It's not quite as effective with many CPUs, however.


----------



## Radox-0

Quote:


> Originally Posted by *Mato87*
> 
> Of course, but I won't ever go for water cooling solution. They look sleek and tidy, but the cooling performance is very crappy in most cases and whats more important I heard about the thermal throttling, which is usually not a good sign. I really freaking hope the non reference design cards will be available to purchase sometime this week.


Indeed. One thing to clear up, though: there is a difference between thermal throttling and not being able to boost as high, at least in my opinion.

For a test in Heaven at 1440p on an hour loop, with all stock settings, the card targets 83 degrees, and at those speeds I see my cards hover mostly between 1750-1850 MHz, with slight fluctuation on either side of that, on a stock fan profile which settles at 50%. I have not once seen the clocks fall to the quoted 1582 MHz boost clock of the GPU; if it dropped under that, I would consider it thermal throttling.

I do agree, however, that due to the thermal constraint these cards will not boost as high as aftermarket AIB cards will be able to, though nothing new in that regard really; the same thing applies to every generation of FE/blower-style coolers. It's just more apparent with Boost technology, which is why for 90% of people an aftermarket card is the preferred way to go.


----------



## fisher6

Quote:


> Originally Posted by *Swolern*
> 
> What has the highest core OC been here so far? Air vs water?
> 
> Also, are you guys seeing more temp or power limit throttling?


I got mine installed yesterday with an EK waterblock and was stable at 2075 while running valley in a loop. Will try more later today, haven't touched voltage yet.


----------



## sWaY20

Quote:


> Originally Posted by *tconroy135*
> 
> I just got an update as well. Mine is 45 minutes away and the delivery should be today, I hope it's early, but not holding my breath.


haha just woke up to this, came here to post a screenshot with the good news.


----------



## KraxKill

I can get my card to boost as high as 2088 at 1.08v and 2075 at 1.07v using the Afterburner voltage unlock trick described earlier in this thread.

Ultimately however that doesn't improve performance much for me. My scores are actually nearly identical at 2062.5 at 1.06v where the lower voltage allows for reduced temperatures, lower overall TDP and doesn't put me in between boost 3.0 steps.

I'm running 12k (+500) on the memory. Anything from +450 to +600 scores the same for me, so I'm keeping my VRAM at +500, well below artifacting, which for me occurs at +650.

I like to explore and confirm stability using my 4 favorite benches.

Valley, Time Spy, Fire Strike and Heaven

They don't all behave the same with memory clocks. Valley scales really well, even past +550. But Time Spy doesn't give me a notable improvement above +450.

I found Heaven to be the most stressful of the four on the clock speed. I can usually run a bin higher on the other three vs Heaven which becomes my go to for core stability testing as I can usually reach the lowest common denominator here.

I also use GPUZ's logging function at the highest polling rate to review the clocks for any throttling.
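If you want to automate that review, here's a rough Python sketch that scans a GPU-Z sensor log for the lowest and highest core clock recorded. The column name and comma-separated layout are assumptions based on my log; check your own file's header, as GPU-Z headers vary by version:

```python
import csv
import io

def core_clock_stats(log_text, column_hint="Core Clock"):
    """Scan GPU-Z sensor-log text (CSV) and return (min, max) core clock.
    The column is matched loosely since header wording varies by version."""
    reader = csv.reader(io.StringIO(log_text))
    header = [h.strip() for h in next(reader)]
    idx = next(i for i, h in enumerate(header) if column_hint in h)
    clocks = []
    for row in reader:
        try:
            clocks.append(float(row[idx]))
        except (ValueError, IndexError):
            continue  # skip blank or malformed lines
    return min(clocks), max(clocks)

# Hypothetical two-sample log: the card dipped from 2062.5 to 2050.0 MHz
sample = ("Date , GPU Core Clock [MHz] , GPU Temperature [C]\n"
          "2017-03-14 10:00:00 , 2062.5 , 44\n"
          "2017-03-14 10:00:01 , 2050.0 , 45\n")
print(core_clock_stats(sample))  # (2050.0, 2062.5)
```

If the minimum comes back well under your claimed clock, that's the throttling I'm talking about.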

It's frankly a peeve of mine when people quote unsustainable (throttling) boost clocks as their overclocks.

Momentary bursts of speed perhaps, but not what I would call a true OC. These kinds of results should quote the lowest noted core clock not the momentary jolt in frequency.

My setup is nothing fancy: I'm running this 1080 Ti with an NZXT X41 AIO, Noctua fans in push/pull, a shunt-modded TDP, and liquid metal TIM.

For those who've asked for screenshots.

This is what that looks like on screen with valley looping for 15-30min. As you can see no throttle whatsoever.



I think this is where I'll be keeping her until Volta.


----------



## pez

So for the adopters of this card, are you guys noticing any differences in temps and noise vs earlier FE coolers? Obviously I'm targeting 1070 or 1080 FE users. I don't expect the minimal changes to the FE cooler to really have a measurable effect, but I figure it's worth an ask.


----------



## Radox-0

Quote:


> Originally Posted by *pez*
> 
> So for the adopters of this card, are you guys noticing any differences in temps and noise vs earlier FE coolers? Obviously I'm targeting 1070 or 1080 FE users. I don't expect the minimal changes to the FE cooler to really have a measurable effect, but I figure it's worth an ask.


Compared to all my 1080 FEs, not massively. Temps are hard to compare because, like the 1080 FE, they all target 83 degrees and will hit that and maintain it by managing the boost. With the stock fan profile, however, I actually find it pretty quiet and not at all intrusive. The stock fan profile tops out at 50% or so on my cards, which maintains 83 degrees and is good for a boost in the 1750-1850 MHz region with memory clocked to 12.2 GHz. I find that a nice spot to be in.


----------



## pez

Quote:


> Originally Posted by *Radox-0*
> 
> Compared to all my 1080's FE's not massively. Temps is hard to say because as the 1080FE, they all target 83 degrees and will hit that and maintain it with a mix managing the boost. With stock fan profile's however I actually find it pretty quiet and not at all intrusive. Fan profile at stock tops out at 50% or so on my cards which maintains 83 degrees and is good for maintaining a boost in the 1750-1850 mhz region with memory clocked to 12.2 Ghz. Find that a nice spot to be in.


Yeah, I found with the FE cooler on the TXP that 70% fan needed to be utilized to maintain a ~2000 MHz clock speed on it. Thanks for the input.


----------



## undercoverb0ss

Quote:


> Originally Posted by *KraxKill*
> 
> I can get my card to boost as high as 2088 at 1.08v and 2075 at 1.07v using the Afterburner voltage unlock trick described earlier in this thread.
> 
> Ultimately however that doesn't improve performance much for me. My scores are actually nearly identical at 2062.5 at 1.06v where the lower voltage allows for reduced temperatures, lower overall TDP and doesn't put me in between boost 3.0 steps.
> 
> I'm running 12k (+500) on the memory. Anything from +450 to +600 scores the same for me, so I'm keeping my VRAM at +500, well below artifacting, which for me occurs at +650.
> 
> I like to explore and confirm stability using my 4 favorite benches.
> 
> Valley, Time Spy, Fire Strike and Heaven
> 
> They don't all behave the same with memory clocks. Valley scales really well, even past +550. But Time Spy doesn't give me a notable improvement above +450.
> 
> I found Heaven to be the most stressful of the four on the clock speed. I can usually run a bin higher on the other three vs Heaven which becomes my go to for core stability testing as I can usually reach the lowest common denominator here.
> 
> I also use GPUZ's logging function at the highest polling rate to review the clocks for any throttling.
> 
> It's frankly a peeve of mine when people quote unsustainable (throttling) boost clocks as their overclocks.
> 
> Momentary bursts of speed perhaps, but not what I would call a true OC. These kinds of results should quote the lowest noted core clock not the momentary jolt in frequency.
> 
> My setup is nothing fancy, I'm running this 1080ti with an NZXT X41 AIO, Noctua fans in push pull, Shunt modded TDP, Liquid Metal Tim.
> 
> For those who've asked for screenshots.
> 
> This is what that looks like on screen with valley looping for 15-30min. As you can see no throttle whatsoever.
> 
> 
> 
> I think this is where I'll be keeping her until Volta.


This is basically identical to how my card behaves: 2063 sustained core with 6003 on the memory. I have the 980 Ti hybrid cooler with push/pull fans, no shunt mod, Noctua TIM. Max temp ~36-38C.


----------



## dboythagr8

Quote:


> Originally Posted by *lilchronic*
> 
> I just got an update on the tracking # from FedEx. My package is now in Memphis, Tennessee.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @dboythagr8
> @chibi
> @SpartanJet
> @sway20


Well well

Mine is in town at local FedEx facility. Hopefully this situation comes to an end today!


----------



## keikei

Quote:


> Originally Posted by *dboythagr8*
> 
> Well well
> 
> Mine is in town at local FedEx facility. Hopefully this situation comes to an end today!


Great news. How many rigs are you building to need a handful of Tis?! My first used car in high school didn't cost that much! I'm expecting mine sometime tomorrow. The delivery trucks will have to brave the snow storm today.


----------



## AMDATI

I officially dub today gtx 1080ti day *waits patiently for package to arrive*


----------



## eXistencelies

A bit of an odd question, but I had some cables made by Ice Modz a while back and had an extra 8-pin GPU cable made just in case I went SLI. Well, now I see the Ti is 6-pin + 8-pin. Would I be able to put two 8-pins in there, with two pins hanging off to the side? These cables aren't cheap, either.


----------



## dboythagr8

Quote:


> Originally Posted by *keikei*
> 
> Great news. How many rigs are you building to need a handful of Ti's?! My first used car in highchool didnt cost that much! I'm expecting mine sometime tomorrow. The delivery trucks will have to brave the snow storm today.


I'm only getting one. When I mentioned "handful" I was talking about myself and the 5 or so other folks here in this thread who were going through the same shipping issues.

I'm not Baasha.


----------



## Cuenda

Most of the shops are saying they're gonna sell their customs in April :|


----------



## deejaykristoff

My Ti is already swimming... 40/42° at full load, and 36/38° in some games. I installed an EVGA hybrid kit modded with an H105 rad and short tubes. Push only; I'd get better results with push/pull, but it's not necessary. Excellent results at 1440p on the ROG Swift. When playing on a 1080p LCD TV, temps never reach 33° even in some demanding games. Very happy with the results.


----------



## lilchronic

Just got my card


----------



## Benny89

Quote:


> Originally Posted by *deejaykristoff*
> 
> my TI already swimming... 40/42° at full load... some games 36/38°. have installed a hybrid evga kit modded with h105 rad with short tubes. only push, have better result if push pull but not necessary. excellent result at 1440p on rog swift. when playing on 1080p lcd tv temps never reach 33° on some demanding game. very happy with results.


Nice. Evga Kit from 1080?

Good to know the temps. What is your OC that you have achieved with that temps?


----------



## keikei

Quote:


> Originally Posted by *lilchronic*
> 
> Just got my card


Where dem pics @?!


----------



## y2kcamaross

After sitting 10 miles from me in a warehouse since Saturday - mine is finally out for delivery


----------



## deejaykristoff

Quote:


> Originally Posted by *Benny89*
> 
> Nice. Evga Kit from 1080?
> 
> Good to know the temps. What is your OC that you have achieved with that temps?


The EVGA kit is from the 980. I've always used this kit: on the 980, 980 Ti, 1080, and now the 1080 Ti.
No OC applied yet; the card boosts to 1900 MHz by itself since there's no temp limitation.


----------



## lilchronic

Quote:


> Originally Posted by *keikei*
> 
> Where dem pics @?!


I'm a bad photographer.


----------



## alucardis666

Quote:


> Originally Posted by *y2kcamaross*
> 
> After sitting 10 miles from me in a warehouse since Saturday - mine is finally out for delivery


I hear you, mine sat all day in a warehouse about an hour away from me, now it's at another sorting facility 30 minutes away, but my card is scheduled for delivery tomorrow. Here's hoping I get it early today instead of tomorrow evening


----------



## tconroy135

Quote:


> Originally Posted by *y2kcamaross*
> 
> After sitting 10 miles from me in a warehouse since Saturday - mine is finally out for delivery


Mine is scheduled for delivery today, but if it gets delayed because of last night's dusting, prepare for a wall-o-text rant/rage tonight


----------



## sWaY20

Quote:


> Originally Posted by *lilchronic*
> 
> Just got my card


Nice, mine's out for delivery, should get it anytime now. I need to order a block; I've been putting it off.


----------



## Joshwaa

Well, my card was scheduled for delivery today, but it seems it wanted to stay in Ocala, FL for a couple of days first. Lovely...
Got a question for the 1440p guys: is G-Sync worth it? I am looking to get a new monitor and wanted to know if it's worth the extra cost.


----------



## mouacyk

NVIDIA needs to get rid of the power throttling already. You buy a $700 card and a $140 water block to eliminate the thermal throttling, only to hit the power throttle.


----------



## dboythagr8

Good to see you guys getting your cards today. I'm not counting on it for me, but we'll see! My $25 refund hit my PayPal and is being transferred back to my card, so that's good


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Darkwizzie*
> 
> Without a way of standardized testing and verification people's clocks are going to be all over the place. If the OP would bother to update us on all the relevant 1080ti details since he started the thread, that would be nice.
> On air GPU temps typically go above CPU temps. On water GPU temps often go below CPU temps. If we're just comparing degrees then a 280mm AIO like Kraken x61 will kick the pants off AIB air cooling solutions. It's actually kind of ridiculous. It's not quite as awesome with many CPUs however.


I think we need an "official" thread where the OP updates it regularly with OC advice and news.

Also, don't CPUs do worse with AIOs because of the heat spreader? I've seen a delidded CPU run bare on an AIO and get like a 30C difference. The heat spreader is most of the time aluminum, and aluminum has about half the thermal conductivity of copper.


----------



## sWaY20

Quote:


> Originally Posted by *mouacyk*
> 
> NVIDIA needs to get rid of the power throttling already. You buy a $700 card and a $140 water block to eliminate the thermal throttling, only to hit the power throttle.


Once we have a custom BIOS, that'll be eliminated, won't it?


----------



## yukkerz

Quote:


> Originally Posted by *Joshwaa*
> 
> Well my card was scheduled for delivery today but seems it wanted to stay in Ocala, FL for a couple days first. Lovely...
> Got a question for the 1440p guys is G-Sync worth it? I am looking to get a new monitor and wanted to know if it is worth the extra cost?


I just recently went from 4k to a 1440p gsync monitor, I couldn't be happier. Not only just for gaming, scrolling on forums and whatnot is so much smoother. Gsync is just icing on the cake when it comes to the very demanding games.


----------



## Joshwaa

Quote:


> Originally Posted by *yukkerz*
> 
> I just recently went from 4k to a 1440p gsync monitor, I couldn't be happier. Not only just for gaming, scrolling on forums and whatnot is so much smoother. Gsync is just icing on the cake when it comes to the very demanding games.


I am guessing you have the 1080ti on this monitor? What monitor did you go with?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *sWaY20*
> 
> once we have a custom bios that'll be eliminated won't it?


Will it? At 120% power draw we're already pushing 300 watts through the 8+6-pin setup. Any more would be dangerous, no?
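For reference, the numbers behind that: the PCIe slot is specced for 75 W, a 6-pin connector for 75 W, and an 8-pin for 150 W, so the connector budget lands exactly on the Ti's 250 W reference TDP at a 120% power target. A quick sanity check:

```python
# Spec power limits (watts) for each source feeding the card
PCIE_SLOT = 75
SIX_PIN = 75
EIGHT_PIN = 150

connector_budget = PCIE_SLOT + SIX_PIN + EIGHT_PIN  # 300 W total by spec

REFERENCE_TDP = 250   # 1080 Ti Founders Edition board power
max_slider = 1.20     # 120% power target in Afterburner

print(connector_budget)            # 300
print(REFERENCE_TDP * max_slider)  # 300.0, already at the spec ceiling
```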


----------



## BigMack70

Quote:


> Originally Posted by *SlimJ87D*
> 
> Will it? At 120% power draw we're already pushing 300 watts through the 8+6-pin setup. Any more would be dangerous, no?


Danger is sort of inherent in a custom BIOS, since if you screw something up and brick your card without the stock BIOS on it, your warranty is void. I ran my Titan X Maxwell cards on a custom BIOS with a 400W power limit (rarely hit, but often well above 300W) and they were fine for nearly 2 years of use.

I probably won't go custom BIOS on my 1080 Ti, because my card looks like such a dud overclocker that I imagine it would be pointless, but people with good samples might see a significant benefit.


----------



## navjack27

You can't custom BIOS Pascal.


----------



## BigMack70

Quote:


> Originally Posted by *navjack27*
> 
> You can't custom bios pascal


Yet.









Someone will likely figure it out once we get custom cards in the wild. Nvidia just had it locked down on the Titan XP as they were the only vendor.


----------



## fisher6

Quote:


> Originally Posted by *BigMack70*
> 
> Yet.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Someone will likely figure it out once we get custom cards in the wild. Nvidia just had it locked down on the Titan XP as they were the only vendor.


We have custom cards for the other Pascal GPUs, but nothing has happened in terms of a custom BIOS.


----------



## GraphicsWhore

FYI, on the EVGA forums I'm hearing that EK is planning to release a dedicated 1080 Ti block within a month.

Was about to purchase the TXP block and backplate but I guess I can wait. Still waiting for my EVGA Step-Up to be processed anyway.


----------



## keikei

Quote:


> Originally Posted by *lilchronic*
> 
> Im a bad photographer.


Open bench due to humidity from where you live or...laziness. Lul.


----------



## mouacyk

Quote:


> Originally Posted by *fisher6*
> 
> We have custom cards for the other Pascal GPUs, but nothing has happened in terms of a custom BIOS.


1080, 1070, 1060, 1050 Ti, 1050... none, zip. Pretty sure NVIDIA is done giving away free performance. They released the TXP with the worst possible VRM to discourage any attempts to crack the BIOS. And now, knowing their BIOS is safe, they release the Ti with a better VRM but a measly 300W limit. I doubt vendors will do any better, because they couldn't with the 1070 and 1080.


----------



## chibi

Quote:


> Originally Posted by *lilchronic*
> 
> I just got an update on the tracking # from FedEx. My package is now in Memphis, Tennessee.


Thanks for the Tag, I was asleep when you posted that. Checked my order this morning and it's shipped as well. Hopefully will get the GPU by Thursday.









Still need to wait for Aquacomputer waterblocks to get back in stock though, their website shows 3 weeks.


----------



## pantsoftime

Quote:


> Originally Posted by *fisher6*
> 
> We have custom cards for the other Pascal GPUs, but nothing has happened in terms of a custom BIOS.


There is only one useful custom Pascal BIOS in the wild: the "T4" BIOS for the 1080. It disables the power limit and raises the voltage limit to 1.2V. It was originally designed for Asus Strix cards, so it disables a DisplayPort on other cards (since the Strix has 2x HDMI instead). The reason this BIOS exists is that it was properly signed by Asus for LN2 overclockers of the Strix; it just happens to work on many other 1080 cards (though some FE users reported it melting their 8-pin jack due to excessive current draw and likely cheap PSU cables).

It's possible the same team at Asus will release a similar BIOS for the Strix version of the Ti if they're feeling generous. It would be nicer if we could convince other vendors to do the same and release some signed overclocking BIOSes for their cards.


----------



## lilchronic

Quote:


> Originally Posted by *keikei*
> 
> Open bench due to humidity from where you live or...laziness. Lul.


Too lazy to put it in my main rig. lol


----------



## KedarWolf

Quote:


> Originally Posted by *lilchronic*
> 
> Just got my card


I got lucky, lil. Friday, a local store had two Gigabytes in stock. I called, they held one for me, and I paid a coworker to pick it up before the store closed that night, since they would only hold it until the end of the night and I was working until 9 p.m. and couldn't beg off early.

Got it though!!









Now to order a waterblock from EK tomorrow and have it in four or five days.


----------



## fisher6

Since this is my first Pascal card, I have a question about voltage control in Afterburner: does pushing the voltage to the maximum after unlocking the slider risk damaging the card in any way?


----------



## lilchronic

Quote:


> Originally Posted by *KedarWolf*
> 
> I got lucky, lil, Friday local store had two Gigabytes in stock, I called, they held one for me, paid a coworker to pick it up for me before store closed that night as they would only hold until the end of the night and I was working until 9 p.m., couldn't beg off early myself.
> 
> Got it though!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now, order waterblock tomorrow from EK, have it in four or five days..


Very lucky they held it for you.
I ordered mine yesterday morning and it will be here tomorrow.


----------



## chibi

Quote:


> Originally Posted by *lilchronic*
> 
> Too lazy to put it in my main rig. lol


The Dimastech benches are very handy. Sometimes I think about getting rid of the big ole case and looking for one of those XL benches.


----------



## pantsoftime

Quote:


> Originally Posted by *fisher6*
> 
> Since this is my first Pascal card, I have a question about voltage control in After Burner, does pushing the voltage to the maximum after unlocking the slider risk damaging the card in anyway?


Not at all. Voltage control is extremely limited on these cards. Bumping it up gives you 2 extra performance bins that are still well below the max voltage for the GPU.


----------



## Cybertox

Just pre-ordered mine today, should be here towards the end of March. Here in Switzerland we get things last, but I don't really mind waiting.


----------



## al210

Man, guys! Why are we never satisfied? These cards are already pushed hard at stock and performance is phenomenal. Unlocking and tweaking is locked down so we won't risk a fire for the sake of a few percent increase.


----------



## alucardis666

Quote:


> Originally Posted by *alucardis666*
> 
> I hear you, mine sat all day in a warehouse about an hour away from me, now it's at another sorting facility 30 minutes away, but my card is scheduled for delivery tomorrow. Here's hoping I get it early today instead of tomorrow evening


They updated my tracking! I'll have my card today!










Quote:


> Originally Posted by *al210*
> 
> Man guys! Why are we never satisfied? These cards are already pushed hard stock and performance is phenomenal. Unlocking and tweaking is locked down so we won't have the chance of a fire for the sake of a few % increase.


You must be new here.


----------



## dboythagr8

We
Have
Deliveryyyyyyyyyyyyyyyyy!


----------



## pantsoftime

Quote:


> Originally Posted by *dboythagr8*
> 
> We
> Have
> Deliveryyyyyyyyyyyyyyyyy!


Congrats! Keep us updated on scores.


----------



## AMDATI

Most people should be getting their cards today if they ordered on Friday. Mine's out for delivery, and I've got a porch cam so I'm not missing it.


----------



## drkCrix

Mine still hasn't shipped (NCIX Canada).

If it hasn't shipped by tomorrow, I'm wondering whether to cancel and wait for the EVGA SC card?


----------



## MunneY

Here is my run with a gimped CPU until I get a new motherboard.

http://www.3dmark.com/fs/11987583 Card 1 or 2

@2050/12000


----------



## Slackaveli

Quote:


> Originally Posted by *tconroy135*
> 
> That's my literal setup, haha.


it's unbeatable lol. the ole take off the front window and box fan right next to open PC works solidly too. lol. Learned this dealing with a 290x.


----------



## DerComissar

Quote:


> Originally Posted by *drkCrix*
> 
> Mine still hasn't shipped (NCIX Canada)
> 
> Wondering if it hasn't shipped by tomorrow to cancel and wait for the EVGA SC card?


Same here.
Have a look at your order info from NCIX, if they have provided any yet.
My order from the 10th is still stuck on "Order Processing".
They took my payment right away, of course, lol.

The mock invoice they provided has no serial number for the card, as there is no card yet.
Numerous requests all week for any shipping info or updates have gone ignored.

It's your decision whether or not to bail.
I'm going to give them another day to provide some kind of explanation, but just looking at the NCIX Facebook page, or googling order issues with NCIX, turns up many horror stories.


----------



## al210

Mine is scheduled for today but I don't think it will happen.







18" of snow here!


----------



## krutoydiesel

Ordered from Nvidia,

Unfortunately, Performance-PCs just notified me that my pre-order EKWB block is on standby pending stock from EKWB without an ETA.

Oh well, gives me a chance to cancel the TITAN XP backplate part of the order and wait for the 1080Ti branded backplate.


----------



## dseg

Does anyone know where to find a step-by-step on how to install a waterblock on the 1080 Ti?
Don't no body have time to wait for a HC.


----------



## krutoydiesel

Quote:


> Originally Posted by *dseg*
> 
> Does anyone know where to find a step-by-step on how install a waterblock on the 1080 TI?
> Don't no body have time to wait for a HC.


Look up the Titan X pascal waterblock on EKWB's website, they should have a PDF step-by-step.


----------



## stocksux

Quote:


> Originally Posted by *pompss*
> 
> guys, is it normal that my Intel i7-6850K overclocked to 4GHz at 1.249V idles at 41C with a 480 radiator, the pump at full speed, and three fans at 50% speed?
> 
> Seems pretty high to me


Try adding a fourth fan to the 120.4 radiator


----------



## sWaY20

Quote:


> Originally Posted by *al210*
> 
> Man guys! Why are we never satisfied? These cards are already pushed hard stock and performance is phenomenal. Unlocking and tweaking is locked down so we won't have the chance of a fire for the sake of a few % increase.


You know this is overclock.net, right? Not stock.net


----------



## KraxKill

http://www.3dmark.com/fs/11992139


----------



## lilchronic

Quote:


> Originally Posted by *MunneY*
> 
> Here is my run with a gimped CPU until I get a new motherboard.
> 
> http://www.3dmark.com/fs/11987583 Card 1 or 2
> 
> @2050/12000


Yeah, my card maxed out at the same 2050/12000, but it throttles down to 1900MHz in Time Spy.
Here is a Firestrike run:
http://www.3dmark.com/3dm/18587182


----------



## Slackaveli

Quote:


> Originally Posted by *fisher6*
> 
> I got mine installed yesterday with an EK waterblock and was stable at 2075 while running valley in a loop. Will try more later today, haven't touched voltage yet.


So that is the range then. A 1:1 fan curve with the FE settles in at 77-78C on a 30-minute Heaven run, and boost settles in at 1974MHz, down from 2012 (@ +150 core). With water cooling you can get it to run below the 60C throttle mark, thus keeping your max boost.

How did you get to 2075, though?


----------



## lilchronic

Quote:


> Originally Posted by *Slackaveli*
> 
> so that is the range then. 1:1 fan curve with the FE settles in at 77-78c on a 30 minute Heaven run and boost settles in at 1974, down from 2012 (@ +150core). with the water cooling you can get it to run below the 60c throttle mark thus keeping your max boost.
> 
> How did you get to 2075, though?


@ 100% fan I get 60°C, with ambient temps of 23°C.


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> I can get my card to boost as high as 2088 at 1.08v and 2075 at 1.07v using the Afterburner voltage unlock trick described earlier in this thread.
> 
> Ultimately however that doesn't improve performance much for me. My scores are actually nearly identical at 2062.5 at 1.06v where the lower voltage allows for reduced temperatures, lower overall TDP and doesn't put me in between boost 3.0 steps.
> 
> I'm running 12k (+500) on the memory. Anything from +450 to +600 scores the same for me. So I'm keeping my vram at +500 well below artifacting, which for me, occures with the vram at +650.
> 
> I like to explore and confirm stability using my 4 favorite benches.
> 
> Valley, Time Spy, Fire Strike and Heaven
> 
> They don't all behave the same with memory clocks. Valley scales really well, even past +550. But Time Spy doesn't give me a notable improvement above +450.
> 
> I found Heaven to be the most stressful of the four on the clock speed. I can usually run a bin higher on the other three vs Heaven which becomes my go to for core stability testing as I can usually reach the lowest common denominator here.
> 
> I also use GPUZ's logging function at the highest polling rate to review the clocks for any throttling.
> 
> It's frankly a peeve of mine when people quote unsustainable (throttling) boost clocks as their overclocks.
> 
> Momentary bursts of speed perhaps, but not what I would call a true OC. These kinds of results should quote the lowest noted core clock not the momentary jolt in frequency.
> 
> My setup is nothing fancy, I'm running this 1080ti with an NZXT X41 AIO, Noctua fans in push pull, Shunt modded TDP, Liquid Metal Tim.
> 
> For those who've asked for screenshots.
> 
> This is what that looks like on screen with valley looping for 15-30min. As you can see no throttle whatsoever.
> 
> 
> 
> I think this is where I'll be keeping her until Volta.


You are a great poster. Very informative.

BUT

Yeah, that rig really is "something special", running with the shunt mod, voltage mod, and the dirtiest AIO of them all on a 1080 Ti. Don't sell yourself short! I'm jello, mane! I want the shunt mod and water cooling but I scare.
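The GPU-Z logging tip in the quote above (quote the lowest sustained clock, not the momentary boost) is easy to automate. A rough sketch, assuming a CSV sensor export; the column header is an assumption, since GPU-Z headers vary by version, so match it to your actual log:

```python
import csv
import io

def min_core_clock(log_text, column="GPU Clock [MHz]"):
    """Return the lowest core clock found in a GPU-Z-style CSV sensor log.

    Per the advice above: report this minimum as your OC, not the peak boost.
    The column name is an assumption -- adjust it to your log's real header.
    """
    reader = csv.DictReader(io.StringIO(log_text), skipinitialspace=True)
    clocks = [float(row[column]) for row in reader
              if row.get(column, "").strip()]
    return min(clocks) if clocks else None

# Hypothetical three-sample log: the honest OC here is 1999.5, not 2050.
sample = ("Date, GPU Clock [MHz]\n"
          "10:00:00, 2050.0\n"
          "10:00:01, 1999.5\n"
          "10:00:02, 2037.5\n")
print(min_core_clock(sample))  # 1999.5
```

Logging at the highest polling rate, as suggested, makes it much harder for a brief throttle dip to slip between samples.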


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> I can get my card to boost as high as 2088 at 1.08v and 2075 at 1.07v using the Afterburner voltage unlock trick described earlier in this thread.
> 
> Ultimately however that doesn't improve performance much for me. My scores are actually nearly identical at 2062.5 at 1.06v where the lower voltage allows for reduced temperatures, lower overall TDP and doesn't put me in between boost 3.0 steps.
> 
> I'm running 12k (+500) on the memory. Anything from +450 to +600 scores the same for me. So I'm keeping my vram at +500 well below artifacting, which for me, occures with the vram at +650.
> 
> I like to explore and confirm stability using my 4 favorite benches.
> 
> Valley, Time Spy, Fire Strike and Heaven
> 
> They don't all behave the same with memory clocks. Valley scales really well, even past +550. But Time Spy doesn't give me a notable improvement above +450.
> 
> I found Heaven to be the most stressful of the four on the clock speed. I can usually run a bin higher on the other three vs Heaven which becomes my go to for core stability testing as I can usually reach the lowest common denominator here.
> 
> I also use GPUZ's logging function at the highest polling rate to review the clocks for any throttling.
> 
> It's frankly a peeve of mine when people quote unsustainable (throttling) boost clocks as their overclocks.
> 
> Momentary bursts of speed perhaps, but not what I would call a true OC. These kinds of results should quote the lowest noted core clock not the momentary jolt in frequency.
> 
> My setup is nothing fancy, I'm running this 1080ti with an NZXT X41 AIO, Noctua fans in push pull, Shunt modded TDP, Liquid Metal Tim.
> 
> For those who've asked for screenshots.
> 
> This is what that looks like on screen with valley looping for 15-30min. As you can see no throttle whatsoever.
> 
> 
> 
> I think this is where I'll be keeping her until Volta.


Post a pic of what she looks like with the NZXT on her for me?


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Will it? At 120% power draw were already pushing 300 watts of the 8+6-pin setup. Anymore would be dangerous no?


The 8-pin gives 150W, the 6-pin gives 75W more, plus PCI-E gives 75W more. That's 300W. I don't think you can go past that with an 8+6 pin setup; you need 8+8 for that. That Zotac AMP Extreme will end up being the top clocker, I bet (maybe the Classy? I doubt it, but I'd still grab a Classy). With its 8+8 it will be able to draw 375W. Its power slider will go up to 150%, I think.
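The connector arithmetic above can be sketched as a quick calculation, using spec figures only (as others note later in the thread, real cards can momentarily pull past these ratings):

```python
# PCI-E spec power budgets in watts: slot plus auxiliary PEG connectors.
PCIE_SLOT_W = 75
AUX_W = {"6pin": 75, "8pin": 150}

def spec_power_budget(connectors):
    """Total spec power budget for a card with the given aux connectors."""
    return PCIE_SLOT_W + sum(AUX_W[c] for c in connectors)

print(spec_power_budget(["8pin", "6pin"]))  # 300 (reference 1080 Ti)
print(spec_power_budget(["8pin", "8pin"]))  # 375 (e.g. an 8+8 AIB card)
```

A 150% power slider on a 250W-rated 8+8 card would land at exactly that 375W ceiling, which fits the guess above.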


----------



## Slackaveli

Quote:


> Originally Posted by *BigMack70*
> 
> Danger is sort of inherent in custom BIOS since if you screw something up and brick your card without the stock BIOS on it your warranty is void. I ran my Titan X Maxwell cards on a custom BIOS with a 400W power limit (rarely hit but often well above 300W) and they were OK for nearly 2 years of use.
> 
> I probably am not going to go custom BIOS on my 1080 Ti because my card looks like such a dud overclocker that I imagine it will be pointless, but people with good samples might see a significant benefit.


Man, I really like this card, it's gorgeous and a solid card, but I will be tempted to flip her on eBay and grab a FTW Hybrid or a Classy, I ain't gonna lie.


----------



## Slackaveli

Quote:


> Originally Posted by *mouacyk*
> 
> 1080 1070 1060 1050 ti 1050... none zip pretty sure nvidia is done with giving away free performance. They released the TXP with the worst possible VRM to discourage any attempts to crack the bios. And now knowing their bios is safe they release the ti with a better VRM but with a measly 300w limit. Doubt vendors will be any better because they couldn't with 1070 and 1080.


Running it at high volts like that turns a card with a 5-year lifespan into one with a one-year lifespan anyway. NOT WORTH IT. It had some value in the past because we couldn't hit 1440p/144 ultra or 4K/60 ultra, so basically we were all "chasing the dragon". With this card that isn't really relevant, because we can do that at stock, more or less. Just +150 core / +494 memory gives us plenty of perf. I'm running this setup geared toward a little more longevity and less power consumption than on previous builds. My i7-5775C sips juice like an LED light. It idles at 12W! Even with an almost-max overclock it games at 55C with 82W consumption. May as well run this Ti on something reasonable. My electric bill will surely drop coming from a max-clocked 4690K and a pair of maxed-out 980 Tis lmao.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> 8-pin gives 150w, 6-pin give 75w more plus PCI-E gives 75w more. that's 300. I don't think you can go past that with 8+6 pin. Need the 8+8 setup for that. That Zotac Amp Extreme will end up being the top clocker I bet (maybe classy? i doubt it but i'd still grab a classy
> 
> 
> 
> 
> 
> 
> 
> ) with it's 8+8, it will be able to draw 375w. It's power slider will go up to 150% i think.


So the spec says, but there were a few times my card did 122%. A custom BIOS can make cards run past spec at an added risk, I think.

Someone here got their 300W Titan Maxwell raised to 400W.


----------



## al210

Quote:


> Originally Posted by *sWaY20*
> 
> you know this is overclock.net right? Not stock.net










Yes I do, but my point is that this is sort of a first for Nvidia. From the reports it seems they have already overclocked this Pascal chip pretty much to the max to have it on par with, or slightly higher than, the Titan. They left very little room for anything more! That's what I'm seeing so far, at least on these FE boards.


----------



## KraxKill

Quote:


> Originally Posted by *Slackaveli*
> 
> post a pic of what she looks like with
> the NXT on her for me?


Some pics earlier in the thread....here's a couple more.

She's basic....


----------



## Slackaveli

Quote:


> Originally Posted by *fisher6*
> 
> Since this is my first Pascal card, I have a question about voltage control in After Burner, does pushing the voltage to the maximum after unlocking the slider risk damaging the card in anyway?


It lowers its lifespan considerably. I'd go for +23 or +46 on the slider at most, as that will give a boost but not a max one. Nvidia says regarding that slider (not sure how truthful this is) that the cards are designed to last 5 years, that maxing the slider takes the lifespan down to ~1 year, and that half slider would be 3 years. That's what they say. But +23 on the slider will give a bump and theoretically only shave the lifespan down to 4 years.
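To be clear, those are Nvidia's claimed figures, not measurements. But treating them as a straight line in slider position (a pure back-of-envelope assumption), the numbers quoted are at least self-consistent:

```python
def est_lifespan_years(slider_frac):
    """Estimated lifespan if Nvidia's claimed figures are linear in slider
    position: 5 years at stock (0.0), 3 at half (0.5), 1 at max (1.0).
    Back-of-envelope only -- nobody has actually measured this."""
    return 5.0 - 4.0 * slider_frac

print(est_lifespan_years(0.0))   # 5.0 years at stock
print(est_lifespan_years(0.5))   # 3.0 years at half slider
print(est_lifespan_years(0.23))  # ~4.1 years, roughly the "+23" setting
```

On that assumed line, a small bump like +23 costs about a year of the claimed five, which matches the estimate above.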


----------



## Slackaveli

Quote:


> Originally Posted by *AMDATI*
> 
> Most people should be getting their cards today if they ordered on Friday. Mines out for delivery, and I've got a porch cam so I'm not missing it


I have one of those. Best thing ever!


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> 
> 
> http://www.3dmark.com/fs/11992139


Nice, bro. You got a nice 4790K there, too. I could never get mine to hold a 5.0 clock; that's a golden chip right there. The Silicon Lottery store could never even keep the 5.0 ones in stock, and even their 4.9 ones were in and out of stock, selling for $420.


----------



## Slackaveli

Quote:


> Originally Posted by *lilchronic*
> 
> @100% fan i get 60°c ambient temps 23°c


Wow! I think I may need to at least do the thermal paste mod.


----------



## Radox-0

Quote:


> Originally Posted by *KraxKill*
> 
> 
> 
> http://www.3dmark.com/fs/11992139


Nice. How are you finding overclocking on the memory?


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> Some pics earlier in the thread....here's a couple more.
> 
> She's basic....


looks great, man!


----------



## KraxKill

Quote:


> Originally Posted by *Slackaveli*
> 
> nice, bro. you got a nice 4790k there, too. I could never get mine to hold a 5.0 clock. that's a golden chip right there. they never even could keep those in stock on Silicon Lottery store (5.0) and even their 4.9 ones were in/out of stock and they sell for $420.


Thanks for the props, man.
I did get very lucky with the 4790K. It's been an amazing chip at 5.0 with just 1.324V / 1.712V input, cache at 44. RealBench and Prime95 stable for 8+ hours.

But it's spoiled me. The next build's chip is coming from the lottery for sure.


----------



## mouacyk

Quote:


> Originally Posted by *Slackaveli*
> 
> lowers it's lifespan considerably. i'd go for +23 or +46v at the most (slider settings) as that will give a boost but not a max one. Nvidia says in re: to that slider (not sure how truthful this is though) that the cards are designed to last 5 years and max on that slider takes the lifespan down to ~1 year. half slider would be 3 yr. That's what they say. But +23v on that slider will give a bump and theoretically only shave the lifespan down to 4 years.


Gonna need some source or evidence for those claims. With the power targets and voltages locked as they are, the most important things to keep in mind are stay below the max operating temperatures and keep the VRM cool enough, so it doesn't blow (but there's no sensor for it). This being a 300W card, there's really no reason not to put it under water to reap as much of that boost clock as possible. Unless you're going open-bench, air cooling this beast will just cook the rest of your components in your case.


----------



## lilchronic

Quote:


> Originally Posted by *Slackaveli*
> 
> 8-pin gives 150w, 6-pin give 75w more plus PCI-E gives 75w more. that's 300. I don't think you can go past that with 8+6 pin. Need the 8+8 setup for that. That Zotac Amp Extreme will end up being the top clocker I bet (maybe classy? i doubt it but i'd still grab a classy
> 
> 
> 
> 
> 
> 
> 
> ) with it's 8+8, it will be able to draw 375w. It's power slider will go up to 150% i think.


They can draw more power than what they're rated for.


----------



## AMDATI

Quote:


> Originally Posted by *mouacyk*
> 
> Gonna need some source or evidence for those claims. With the power targets and voltages locked as they are, the most important things to keep in mind are stay below the max operating temperatures and keep the VRM cool enough, so it doesn't blow (but there's no sensor for it). This being a 300W card, there's really no reason not to put it under water to reap as much of that boost clock as possible. Unless you're going open-bench, air cooling this beast will just cook the rest of your components in your case.


It's actually true. Nvidia might be a bit conservative, but they consider the max voltage slider to reduce the card's life to just 1 year of normal usage.

Here's proof from Nvidia themselves: 




They even explain that voltages are locked to a limit because going beyond it reduces the life not just to a year, but to a month.

Voltage/current, regardless of temperature, causes degradation. As chips get smaller and components consist of less mass, these issues become much more prevalent. This is typically called electromigration.


----------



## Slackaveli

Quote:


> Originally Posted by *mouacyk*
> 
> Gonna need some source or evidence for those claims. With the power targets and voltages locked as they are, the most important things to keep in mind are stay below the max operating temperatures and keep the VRM cool enough, so it doesn't blow (but there's no sensor for it). This being a 300W card, there's really no reason not to put it under water to reap as much of that boost clock as possible. Unless you're going open-bench, air cooling this beast will just cook the rest of your components in your case.







the man explains it here.


----------



## Slackaveli

Quote:


> Originally Posted by *Slackaveli*
> 
> 
> 
> 
> 
> 
> the man explains it here.


start at 15:55.


----------



## dboythagr8

Trying to decide between the *ASUS PG348Q* or *ACER Predator XB321HK* to go with my new 1080ti. Yes I know the HDR Asus monitor is on the way, but who really knows when they're going to ship it.

I'm currently on a Swift PG278Q .


----------



## eXistencelies

Could someone please grab an 8-pin PCI-E cable and see if it will fit into the 6-pin slot on the Ti? I have a feeling it doesn't and I will need to get another cable made.


----------



## eXistencelies

Quote:


> Originally Posted by *dboythagr8*
> 
> Trying to decide between the *ASUS PG348Q* or *ACER Predator XB321HK* to go with my new 1080ti. Yes I know the HDR Asus monitor is on the way, but who really knows when they're going to ship it.
> 
> I'm currently on a Swift PG278Q .


I would wait for the new Samsungs being released soon.


----------



## dboythagr8

Quote:


> Originally Posted by *eXistencelies*
> 
> I would wait for the new Samsungs being released soon.


Not familiar with those. What's so special about them? Although tbh Samsung electronics are very hit or miss.


----------



## lilchronic

Quote:


> Originally Posted by *AMDATI*
> 
> It's actually true. Nvidia might be a bit conservative, but they consider max voltage slider to reduce the cards life down to just 1 year of normal usage.
> 
> Here's proof from Nvidia themselves:
> 
> 
> 
> 
> They even explain that voltages are locked to a limit because if you go beyond that, you reduce the life to not just a year, but 1 month.
> 
> Voltage/current, regardless of temp, causes degradation. As chips get smaller and components consist of less mass, these issues become much more prevalent. This is typically called electromigration.


Quote:


> Originally Posted by *Slackaveli*
> 
> 
> 
> 
> 
> 
> the man explains it here.


Quote:


> Originally Posted by *Slackaveli*
> 
> start at 15:55.


That's BS, he's just saying it could happen.

People have been running 1080s for almost a year now, and I don't see any of their cards dead.

I have a GTX 480 that's still alive and a 780 Ti that has been overvolted beyond what you can even do with the voltage slider, and they're both still alive. By the time your card dies there will probably be six more generations of graphics cards.









And if it does die in a year or 2 or 3 it's under warranty.


----------



## KraxKill

Quote:


> Originally Posted by *Slackaveli*
> 
> wow! i think i may need to at least do the thermal paste mod.


My favorite new thing is Liquid Electrical Tape!!!

For those who want significantly better thermals on air and don't want to bother with water:

Consider running liquid metal. It's not nearly as scary as people make it out to be, unless you have shaky hands from Parkinson's or similar, of course.

Just flood the area around the GPU die with higher-temp liquid tape and go to town. This will drastically lower your temps, as liquid metal is nearly as good as solder in terms of thermal transfer.

When you're done with the card, just wipe it off carefully and peel off the tape.

Here's how my old 1080, and the Titan X Maxwell before it, cleaned up after I was done with them. You can literally flood the area surrounding the socket with liquid electrical tape to ward off shorts and peel it off when done. Same with the shunt mod, frankly.

One can always practice on old useless hardware first if scurrd....


----------



## eXistencelies

Quote:


> Originally Posted by *dboythagr8*
> 
> Not familiar with those. What's so special about them? Although tbh Samsung electronics are very hit or miss.


https://news.samsung.com/global/samsung-to-introduce-new-quantum-dot-curved-monitor-at-ces-2017


----------



## lilchronic

Quote:


> Originally Posted by *KraxKill*
> 
> My favorite new thing is Liquid electrical Tape!!!
> 
> For those that want significantly better thermals on air and don't want to bother with water.
> 
> Consider running liquid metal. It's not nearly as scary as people make it out to be. Unless you have shaky hands from Parkinson's or similar ofcorse.
> 
> Just flood the area around the GPU die with higher temp liquid tape and go to town. This will drastically lower your temps as it's nearly as good as solder in terms of thermal transfer and will drop your temps significantly.
> 
> When you're done with the card, just wipe it off carefully and peel off the tape.
> 
> He're how my old 1080 and Titan X Maxwell before it cleaned up after I was done with em. You can literally flood the area surrounding the socket with liquid electrical tape to ward off shorts and peel it off when done. Same with the Shunt mod frankly.
> 
> One can always practice on old useless hardware first if scurrd....
> 
> 
> Spoiler: Warning: Spoiler!


Show some results.

I have used liquid metal in the past on GPUs, and the temp difference from, say, a high-quality TIM compared to CLU is not that much. Maybe a 1-3°C difference.

Liquid metal seems to be best on the die of Intel CPUs.


----------



## alucardis666

Quote:


> Originally Posted by *lilchronic*
> 
> Show some results.
> 
> I have used liquid metal in the past on gpu's and the temp difference for say a high quality tim compared to clu is not that much better. Maybe 1°c - 3°c difference.
> 
> liguid metal seems to be best on the die of intel cpu's


Agreed. I delidded a 4770K back in the day and put some liquid metal on it. Made a HUGE difference in temps.

Dunno about re-TIMing a GPU though, diminishing returns? Is the Ti using paste or pads? And what sort of temp delta are we looking at between stock and replaced TIM?


----------



## zipeldiablo

Quote:


> Originally Posted by *KraxKill*
> 
> Some pics earlier in the thread....here's a couple more.
> 
> She's basic....


Looks neat, was it difficult to install? I'm debating whether to go the AIO way or not.

Quote:


> Originally Posted by *KraxKill*
> 
> My favorite new thing is Liquid electrical Tape!!!


Any bench with temps?


----------



## BoredErica

I am worried about reports of liquid metal eating away at other metals. I think that's one reason why it has remained under the IHS.


----------



## KraxKill

Quote:


> Originally Posted by *zipeldiablo*
> 
> looks neat, was it difficult to install? I'm questionning myself whether to go the aio way or not
> Any bench with temps?


Check my posts earlier in the thread for more info.

Here it is again..


----------



## MunneY

I eked out just a bit more. I've got the door open and the card is running at 35C under load :-D

http://www.3dmark.com/3dm/18586743?

@lilchronic


----------



## GunnzAkimbo

Quote:


> Originally Posted by *dboythagr8*
> 
> Not familiar with those. What's so special about them? Although tbh Samsung electronics are very hit or miss.


I'm using a KS8000 55". It's good, but it uses PWM for its display, which can make your eyes tired.

If you're getting a TV for PC use, make sure it's a non-flicker display (e.g. Sony X850D), but those have picture flaws like "grey blacks" and low nits.

The KS8000 would be perfect if it wasn't a PWM screen.

I have run Shadow Warrior in HDR mode and couldn't see much difference on or off; for movies, though, HUGE difference, like next level.

Games like Civ 6 are awesome on the large display, and it's just as sharp as a monitor, which was one of the main worries I had.

I think 60" is THE size for big-screen gaming, no bigger though, and lower = get a monitor.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *zipeldiablo*
> 
> looks neat, was it difficult to install? I'm questionning myself whether to go the aio way or not
> Any bench with temps?




https://www.reddit.com/r/5z91t7/mod_the_poor_mans_hybrid_h55_gtx_10xx_fe/


----------



## zipeldiablo

Quote:


> Originally Posted by *SlimJ87D*
> 
> 
> __
> https://www.reddit.com/r/5z91t7/mod_the_poor_mans_hybrid_h5_gtx_10xx_fe/


Thanks mate


----------



## AMDATI

Ladies and gentlemen, we have liftoff! And yes, that is a bamboo Uplift motorized desk.


----------



## chibi

Hey guys, anyone able to help with an Aquacomputer vendor? I've googled for the last hour and couldn't find anyone with a GPU block in stock.
I tried the following vendors and it's out of stock at all of them:

Aquacomputer
Aquatuning
Performance PC's
Modmymods
OC CO UK
Amazon.com

I'm looking for an Aquacomputer kryographics Pascal for Nvidia TITAN X/1080 Ti - Black Edition, nickel-plated version.


----------



## KraxKill

Quote:


> Originally Posted by *lilchronic*
> 
> Show some results.
> 
> I have used liquid metal in the past on gpu's and the temp difference for say a high quality tim compared to clu is not that much better. Maybe 1°c - 3°c difference.
> 
> liguid metal seems to be best on the die of intel cpu's


Hmmm... maybe I should ask you to post some proof of only a 1-3C difference, since you're the one questioning my claim with no supporting results.

Along with my posts earlier in this thread, there are plenty of user posts above also running AIOs and more elaborate custom loops in the 45C range which you could use to do some deductive reasoning.

I'm sitting at 28C to 35C with 20-24C ambients on a hack-jobbed NZXT AIO setup. Those low temps are worth 2-3 extra Boost 3.0 bins for free. With LM I'm seeing only an 8-10C delta between ambient and the core temp on my pedestrian AIO. Not to mention that, all things being equal, as with any GPU or CPU it will be more stable at whichever clock you run since it's running cooler.

If that's not enough for you... there are posts in this thread, right here on Overclock, reporting 10-12C drops, which is about what I'm getting out of it.

Is it for everyone? No. It does take some dexterity and precaution. Is it 100% safe? Yes, if done correctly. I've had it (have it) on my CPU and 3 GPUs: my 4790K, my Titan X Maxwell, my 1080, and now the Ti, without any issues.

Liquid metals:
Thermal Grizzly Conductonaut - 73 W/mK
Coollaboratory Liquid Ultra - 38.4 W/mK
Coollaboratory Liquid Pro - 32.6 W/mK

Traditional pastes:
Thermal Grizzly Kryonaut - 12.5 W/mK
Gelid GC Extreme - 8.5 W/mK
ICD - 4.5 W/mK

LM's thermal conductivity is simply unbeatable without resorting to solder.
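A rough way to see what those conductivity numbers mean in actual degrees: the drop across the TIM layer is approximately dT = Q*t/(k*A). Every input below is an assumption picked for illustration (the bond-line thickness especially is a guess, and it dominates the result):

```python
# Assumed, for illustration only: ~250 W through a GP102-sized die
# (roughly 15 mm x 23 mm) with a 50-micron TIM bond line.
Q_WATTS = 250.0
AREA_M2 = 0.015 * 0.023
BOND_M = 50e-6

def tim_delta_c(k_w_per_mk):
    """Approximate temperature drop (C) across the TIM layer: dT = Q*t/(k*A)."""
    return Q_WATTS * BOND_M / (k_w_per_mk * AREA_M2)

for name, k in [("Conductonaut", 73.0), ("Kryonaut", 12.5), ("ICD", 4.5)]:
    print(f"{name}: {tim_delta_c(k):.1f} C")
```

With these assumed numbers the spread between LM and a good paste works out to only a few degrees, so bond-line thickness and mount pressure matter roughly as much as the paste itself, which may explain why some people see big drops and others see 1-3C.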


----------



## Radox-0

Quote:


> Originally Posted by *dboythagr8*
> 
> Trying to decide between the *ASUS PG348Q* or *ACER Predator XB321HK* to go with my new 1080ti. Yes I know the HDR Asus monitor is on the way, but who really knows when they're going to ship it.
> 
> I'm currently on a Swift PG278Q .


Get the ASUS. Maybe I'm biased, having had it for over a year, but once you go 3440x1440 you will not go back.







A single 1080 Ti does a very nice job at that resolution.


----------



## D13mass

I wrote to the B&H Photo shop about 1080 Ti availability, but the answer was "we don't know, everything is sold out," blah blah blah.


----------



## lilchronic

Quote:


> Originally Posted by *KraxKill*
> 
> Hmmm...maybe I should ask you to post some proof of only 1C-3C difference since you're the one questioning my claim with no proof.
> 
> Along with my posts earlier in this thread, there are plenty of user posts above also running AIOs and more elaborate custom loops in the 45C range which you could use to do some deductive reasoning.
> 
> I'm sitting at 28C to 35C with 20-24C ambients on a hack-jobbed NZXT AIO setup. Those low temps are worth 2-3 extra Boost 3.0 bins for free. With LM I'm seeing only an 8-10C delta between ambient and the core temps on my pedestrian AIO. Not to mention that, all things equal, as with any GPU or CPU, it will be more stable at whichever clock you run, since it's running cooler.
> 
> If that's not enough for you, there are posts in this thread, right here on Overclock, reporting 10-12C drops, which is about what I'm getting out of it.
> 
> Is it for everyone? No; it does take some dexterity and precaution. Is it 100% safe? Yes, if done correctly. I've had it (have it) on my CPU and 3 GPUs: my 4790K, my Titan X Maxwell, my 1080, and now the Ti, without any issues.
> 
> Liquid Metals:
> Thermal Grizzly Conductonaut - 73 W/mK
> Coollaboratory Liquid Ultra - 38.4 W/mK
> Coollaboratory Liquid Pro - 32.6 W/mK
> 
> Traditional Paste:
> Gelid GC Extreme: 8.5 W/mK
> Thermal Grizzly Kryonaut: 12.5 W/mK
> ICD: 4.5 W/mK
> 
> LM thermal conductivity properties are simply unbeatable without resorting to solder.


No one in that thread had any proof, just "oh yeah, it worked, 10C drop." Where is the proof? They always say it's better, but with no proof, and when I try it I only see marginal gains.

Anyway, that thread is 4 years old, from when CLU first became popular. I had two GTX 670s back then, went from Gelid GC Extreme to CLU, and saw a 1 to 2 degree difference.

My results are 4 years old, somewhere in the GTX 670 owners club.


----------



## AMDATI

I was going to order from BH, but they never actually had it in stock online, and they probably sold out from in store anyways.


----------



## D13mass

Quote:


> Originally Posted by *AMDATI*
> 
> I was going to order from BH, but they never actually had it in stock online, and they probably sold out from in store anyways.


Unfortunately I live on another continent, so my only real option is online shopping with worldwide delivery.


----------



## KraxKill

Quote:


> Originally Posted by *lilchronic*
> 
> No one in that thread had any proof, just "oh yeah, it worked, 10C drop." Where is the proof? They always say it's better, but with no proof, and when I try it I only see marginal gains.
> 
> Anyway, that thread is 4 years old, from when CLU first became popular. I had two GTX 670s back then, went from Gelid GC Extreme to CLU, and saw a 1 to 2 degree difference.
> 
> My results are 4 years old, somewhere in the GTX 670 owners club.


I've certainly posted more reasoning for why it works than you have for why it doesn't. Are we supposed to just take your word for it? Back at you.


----------



## D13mass

Quote:


> Originally Posted by *chibi*
> 
> Hey guys, anyone able to help with an Aquacomputer vendor? I've googled for the last hour and couldn't find anyone with stock for a GPU block.
> I tried the following vendors and it's out of stock:
> 
> Aquacomputer
> Aquatuning
> Performance PC's
> Modmymods
> OC CO UK
> Amazon.com
> Looking for an Aquacomputer Kryographics Pascal for Nvidia TITAN X/1080 Ti - Black Edition, Nickel Plated Version.


I'm going to buy this one: the Watercool HEATKILLER® IV for TITAN X (Pascal) - ACRYL.


----------



## lilchronic

Quote:


> Originally Posted by *KraxKill*
> 
> I've certainly posted more reasoning for why it works than you have on why it doesn't. So back at you.


First of all, I never said it does not work, only that you will see a small difference going from a high-quality TIM to liquid metal on GPUs.


----------



## KedarWolf

Quote:


> Originally Posted by *D13mass*
> 
> Quote:
> 
> 
> 
> Originally Posted by *chibi*
> 
> Hey guys, anyone able to help with an Aquacomputer vendor? I've googled for the last hour and couldn't find anyone with stock for a GPU block.
> 
> Looking for an Aquacomputer Kryographics Pascal for Nvidia TITAN X/1080 Ti - Black Edition, Nickel Plated Version.
> 
> 
> 
> I'm going to buy this one: the Watercool HEATKILLER® IV for TITAN X (Pascal) - ACRYL

Should have these within a week, purchasing tomorrow.

https://www.ekwb.com/shop/ek-fc-titan-x-pascal-acetal-nickel



Prefilled for my EK Predator 360, just put block on GPU, clip on the EK AIO, done!









I thought about the nickel one where you can see if the coolant is a bit low, but I top it off every few months or so anyway with EK EVO coolant, so it's all good.









And,

https://www.ekwb.com/shop/ek-fc-titan-x-pascal-backplate-black.


----------



## chibi

Quote:


> Originally Posted by *D13mass*
> 
> I'm going to buy this one: the Watercool HEATKILLER® IV for TITAN X (Pascal) - ACRYL


Nice looking block!








I like the Active XCS Backplate on the Aquacomputer model, so I'll be going with that - whenever it comes back in stock, or if I can source one soon.


----------



## yukkerz

Quote:


> Originally Posted by *Joshwaa*
> 
> I am guessing you have the 1080ti on this monitor? What monitor did you go with?


I should be getting my 1080 ti tomorrow because of the snow I got today. I went with the Dell S2716DG. For the price it really couldn't be beat. $450.


----------



## chibi

Quote:


> Originally Posted by *yukkerz*
> 
> I should be getting my 1080 ti tomorrow because of the snow I got today. I went with the Dell S2716DG. For the price it really couldn't be beat. $450.


Great choice! I have the same monitor and calibrated it with an ICC profile by one of our users here on OCN. There's a big thread in the Monitor sub-forum that has a lot of good discussion.


----------



## Addsome

Hey guys, I got a Gigabyte reference 1080 Ti, and when the temps on the card get into the 80C+ range I start smelling a slight burning smell. If I put the fans at 100% and the temps stay below 70C, the smell doesn't happen. I plan to watercool the card. Should I return it?


----------



## AMDATI

Quote:


> Originally Posted by *lilchronic*
> 
> That's BS, he's just saying it could happen.
> 
> People have been running 1080s for almost a year now; I don't see any of their cards dead.
> 
> I have a GTX 480 that's still alive and a 780 Ti that has been overvolted beyond what you can even do with the voltage slider, and they're still alive. By the time your card dies there will probably be 6 more generations of graphics cards.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And if it does die in a year or 2 or 3 it's under warranty.


If I had a dime for every time someone asked for proof, got it, then tried to refute it......I'd be a very very rich man.


----------



## dboythagr8

For those who are using the Hybrid kits -- Does the fan extension plug into a standard CHA_FAN header on the motherboard or does it need to be a specific header?


----------



## PewnFlavorTang

Got it. Doesn't overclock as well as my 1080, for obvious reasons.


----------



## PasK1234Xw

^ Running benchmarks at 720p is not a good way of testing stability; it's barely using that card. A GTX 1080 scores about the same at 1080p.

edit

Also, if it's stable, that is a really good clock for a Ti.


----------



## BoredErica

> Originally Posted by *KraxKill*
> 
> Hmmm...maybe I should ask you to post some proof of only 1C-3C difference since you're the one questioning my claim with no supporting results.


I agree with you that it would be best that everybody posts evidence. But it would've only made sense if you went first.



> Along with my posts earlier in this thread, there are plenty of user posts above also running AIOs and more elaborate custom loops in the 45C range which you could use to do some deductive reasoning.


What?



> I'm sitting at 28C to 35C with 20-24C ambients on a hack-jobbed NZXT AIO setup. Those low temps are worth 2-3 extra Boost 3.0 bins for free. With LM I'm seeing only an 8-10C delta between ambient and the core temps on my pedestrian AIO. Not to mention that, all things equal, as with any GPU or CPU, it will be more stable at whichever clock you run, since it's running cooler.





> If that's not enough for you, there are posts in this thread, right here on Overclock, reporting 10-12C drops, which is about what I'm getting out of it.


Specifically, there were three posts in that thread talking about specific temp drops. If I keep seeing 10C, I'm going to get turned off, because it seems like a rough figure everyone is tossing around for everything. One of the guys claimed a 10C reduction ON IDLE WITH HIS CPU. That's nonsense, and it just shows I cannot simply trust what people say. That's why I mandate verification and a standardized testing methodology for my threads. Because when I request that, everybody's results miraculously go down.

A 10C reduction at idle on a CPU is leaps and bounds different from a 10C reduction on SLI GPUs under load. Perhaps the guy was talking about delidding, or maybe somebody did not test correctly. I can go to Thermal Grizzly's non-metal paste reviews on Amazon and see people claiming a 10C decrease with non-metal solutions. I was skeptical then, and I am skeptical here.

Even just saying 10C lower doesn't tell us the GPU or the load it was run on. And we have to be careful about what we're even comparing. Are we comparing stock thermal paste to liquid metal or an aftermarket one with liquid metal?
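One way to make such comparisons apples-to-apples is to report the core-minus-ambient delta under the same load rather than raw temperatures; a minimal sketch (all the numbers below are made up for illustration, not real test data):

```python
# Compare TIM results across systems by normalizing out room ambient.
# All temperatures here are made-up example values, not measurements.

def delta_over_ambient(core_c, ambient_c):
    """Load temperature expressed as degrees over room ambient."""
    return core_c - ambient_c

before = delta_over_ambient(52.0, 24.0)  # stock paste: 28C over ambient
after = delta_over_ambient(45.0, 21.0)   # liquid metal: 24C over ambient
# The raw drop looks like 7C, but 3C of that was just a cooler room.
print(f"real improvement: {before - after:.1f} C at equal load")
```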



> Is it 100% safe? Yes, if done correctly. I've had it (have it) on my CPU and 3 GPUs: my 4790K, my Titan X Maxwell, my 1080, and now the Ti, without any issues.


My post got totally ignored and you're in your own way trying to answer it in a totally different conversation.







Just saying it's '100% safe if done correctly' doesn't tell us that much though. For example, are we sure that those types of liquid metal solutions only eat up aluminum?



> Liquid Metals:
> Thermal Grizzly Conductonaut - 73 W/mK
> Coollaboratory Liquid Ultra - 38.4 W/mK
> Coollaboratory Liquid Pro - 32.6 W/mK
> 
> Traditional Paste:
> Gelid GC Extreme: 8.5 W/mK
> Thermal Grizzly Kryonaut: 12.5 W/mK
> ICD: 4.5 W/mK


That's theory, not testing. Did you just pull the specs from the product pages? At that rate we might as well figure out what the best fan is by looking at airflow and static pressure specs.



> LM thermal conductivity properties are simply unbeatable without resorting to solder.


Question isn't whether liquid metal is better, the question is how much better in practice.

To me the most important question is safety. If it is I can test in a more detailed manner.


----------



## hokk

Ok, got my 2nd 1080 Ti while my other one RMAs.

OCing the memory again; it seems to be stable @ +350MHz now, with +300 on the core.

That will do for now.


----------



## MunneY

I've run 780 Ti Classys at 1.35v for 18 months with 0 degradation at the clocks they were running. I've not used high volts since, but I did it with 680s/780s/780 Tis and had 0 issues with any cards. Some of them were even run as high as 1.5v during that time.


----------



## wstanci3

Quote:


> Originally Posted by *Addsome*
> 
> Hey guys I got a Gigabyte reference 1080ti and when the temps on the card get in the 80c+ range I start smelling a slight burning smell. If i put fans on 100% and temps stay below 70c this smell doesn't happen. I plan to watercool the card. Should I return the card?


That would alarm me.
Does the card also have a whining sound once the fans go above 70%?
Honestly, I'd return it while the return window is open. Better safe than sorry, friend.


----------



## mouacyk

Yeah, some guy saying degradation might happen doesn't really hold up against customers actually not experiencing that "expected" degradation. This isn't exactly the place to scare people away from eking out every bit of performance from their hard-earned investments. Anybody that carelessly burns up a $700 or $1200 GPU deserves it.


----------



## stocksux

Quote:


> Originally Posted by *Slackaveli*
> 
> An 8-pin gives 150W, a 6-pin gives 75W more, plus PCI-E gives 75W more; that's 300. I don't think you can go past that with 8+6 pin. You need the 8+8 setup for that. That Zotac AMP Extreme will end up being the top clocker, I bet (maybe the Classy? I doubt it, but I'd still grab a Classy
> 
> 
> 
> 
> 
> 
> 
> ) with its 8+8, it will be able to draw 375W. Its power slider will go up to 150%, I think.


It won't. Just because you add more power plugs doesn't mean anything with Pascal; the die itself is limited. You've already seen this with the 1080, and it won't be different here. Just learn to like it.
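The connector arithmetic quoted above (75W from the slot, 75W per 6-pin, 150W per 8-pin) can be sketched as:

```python
# PCIe spec power budget per source: slot 75W, 6-pin 75W, 8-pin 150W.
CONNECTOR_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

def board_power_budget(connectors):
    """Total spec power for a card: the PCIe slot plus its aux connectors."""
    return CONNECTOR_W["slot"] + sum(CONNECTOR_W[c] for c in connectors)

print(board_power_budget(["8-pin", "6-pin"]))  # reference 1080 Ti: 300
print(board_power_budget(["8-pin", "8-pin"]))  # 8+8 boards: 375
```

As the reply notes, this spec budget is only a ceiling; on Pascal the effective limit comes from the BIOS power target, not the connectors.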


----------



## stocksux

Quote:


> Originally Posted by *kylzer*
> 
> Ok, got my 2nd 1080 Ti while my other one RMAs.
> 
> OCing the memory again; it seems to be stable @ +350MHz now, with +300 on the core.
> 
> That will do for now.


+300 on core??? Pics please


----------



## alucardis666

Card arrive 30 minutes ago. Clean installed Nvidia Drivers and cleared all my Afterburner settings. Time to overclock!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *stocksux*
> 
> +300 on core??? Pics please


Well, saying +300 on the core can really just mean 2000 MHz with thermal throttling.

I'd like to see what his core is actually at, and at what temperatures.


----------



## jsutter71

Quote:


> Originally Posted by *SlimJ87D*
> 
> Well saying +300 on the core can really just be 2000 mhz with thermal throttling.
> 
> I'd like to see what his core is actually at and what temperatures.


Interesting. I have never been able to go past +225 on my TXPs, which are also under water.


----------



## BoredErica

Is it like the 980 Ti, where past about 60C the clock takes a hit, but serious throttling doesn't really set in until those 80C+ temps?

Personally I don't think I could stand the FE cooler on air. I had a 980 Ti MSI AIB card and even that was loud.


----------



## PasK1234Xw

The Pascal blower fan is actually quieter than the one used on Maxwell.

Regardless, you can throw a hybrid cooler on it and it will be better than any fan-cooled 1080.


----------



## KraxKill

Quote:


> Originally Posted by *lilchronic*
> 
> First of all i never said it does not work. Only that you will see a small difference from a high quality tim to liquid metal when using it on GPU's.


Quote:


> Originally Posted by *Darkwizzie*
> 
> That's theory, not testing. Did you just pull specs from product page for this? At that rate we might as well figure out what the best fan is by looking at their airflow and static pressure specs.
> Question isn't whether liquid metal is better, the question is how much better in practice.


Saying that it doesn't work better than high-end TIM requires just as much, if not more, evidence, since you're effectively calling out everybody making these claims.

Post some facts showing that it does NOT work significantly better than high-end TIM, or somebody claiming lackluster results from it. Something, please... I'm still waiting.

I made the claim, from my experience with multiple GPUs, that it does work: better than high-end TIM and certainly better than the default TIM from Nvidia! You guys are saying it doesn't make a difference over quality TIM, so prove YOUR point! Otherwise you're the classic "hater". You say "it doesn't make much difference"; prove to me that it doesn't, otherwise it's just borderline nerd logic. Clearly you'll have to take somebody's word for it at some point, or better yet try it yourself.

Moreover, you're assuming that the people who have made claims that it works are all placebo, noobs, and/or bull****ting? That's a bit egotistical, don't you think?

Here are a bunch more examples of LM on mobile 760-780m GPUs. Are the 10C claims in this thread also not valid?

You can scour the net yourself for countless others if you need further data. Unless you can tell me how all these people, including myself, are wrong, prove them incompetent, or show that they are just buying into the hype or whatever it is you're trying to suggest, please stop making refuting claims.

If nothing else, I made my suggestion trying to help others. What's your assessment of my intention?

I'm willing to say that it will help to the tune of ~10C at 100% TDP for those who want to tackle it. If you don't see a good improvement, you're doing something wrong, plain and simple.

YOU are questioning others' claims and the published thermal conductivity ratings of paste vs LM (which has upwards of 7 times greater thermal conductivity), all while suggesting that the difference is only 1-3C. How is the burden of proof on me?

Frankly, it looks more and more like you are trying to justify to yourself why you're leaving 10C on the table by discouraging others who may want to try it. If it's for the inherent risk, personal ability, and electrical conductivity concerns associated with LM, I get it, but otherwise please stop with the unsubstantiated naysaying.


----------



## lilchronic

Quote:


> Originally Posted by *KraxKill*
> 
> Saying that it doesn't work better than high-end TIM requires just as much, if not more, evidence, since you're effectively calling out everybody making these claims.
> 
> Post some facts showing that it does NOT work significantly better than high-end TIM, or somebody claiming lackluster results from it. Something, please... I'm still waiting.
> 
> I made the claim, from my experience with multiple GPUs, that it does work: better than high-end TIM and certainly better than the default TIM from Nvidia! You guys are saying it doesn't make a difference over quality TIM, so prove YOUR point! Otherwise you're the classic "hater". You say "it doesn't make much difference"; prove to me that it doesn't, otherwise it's just borderline nerd logic. Clearly you'll have to take somebody's word for it at some point, or better yet try it yourself.
> 
> Moreover, you're assuming that the people who have made claims that it works are all placebo, noobs, and/or bull****ting? That's a bit egotistical, don't you think?
> 
> Here are a bunch more examples of LM on mobile 760-780m GPUs. Are the 10C claims in this thread also not valid?
> 
> You can scour the net yourself for countless others if you need further data. Unless you can tell me how all these people, including myself, are wrong, prove them incompetent, or show that they are just buying into the hype or whatever it is you're trying to suggest, please stop making refuting claims.
> 
> If nothing else, I made my suggestion trying to help others. What's your assessment of my intention?
> 
> I'm willing to say that it will help to the tune of ~10C at 100% TDP for those who want to tackle it. If you don't see a good improvement, you're doing something wrong, plain and simple.
> 
> YOU are questioning others' claims and the published thermal conductivity ratings of paste vs LM (which has upwards of 7 times greater thermal conductivity), all while suggesting that the difference is only 1-3C. How is the burden of proof on me?
> 
> Frankly, it looks more and more like you are trying to justify to yourself why you're leaving 10C on the table by discouraging others who may want to try it. If it's for the inherent risk, personal ability, and electrical conductivity concerns associated with LM, I get it, but otherwise please stop with the unsubstantiated naysaying.


Dude this has been discussed 4 years ago.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Darkwizzie*
> 
> Is it like 980ti where past like 60C the clock takes a hit, but serious throttling really doesn't set in until those 80c+ temps?
> 
> Personally I don't think I could stand FE cooler on air. I had 980ti MSI AIB and even then it was loud.


I have mine on an AIO and I max out at 62C. I notice the core drops about 25MHz once I go past 60C. Kind of annoying... But I have to use a copper shim between my H55 and the GPU, so that's why it doesn't stay at 55C.


----------



## vmanuelgm

Quote:


> Originally Posted by *SlimJ87D*
> 
> I have mine on an AIO and I get max temps at 62C. I notice the core drops like 25 Mhz once I go past 60C. Kind of annoying... But I have to use a copper shim inbetween my H55 and GPU, so that's why it doesn't get kept at 55C.


Which is your max core clock???

Quote:


> Originally Posted by *lilchronic*
> 
> Dude this has been discussed 4 years ago.


What about your Ti???


----------



## richiec77

Got the card in earlier and benched my GTX 980 Ti Hybrids, OC'd in SLI, against the 1080 Ti reference. It's nice! Except for the jet engine in the room. Thank god the EK waterblock is coming soon... I hope. Backordered from Performance PCs at this time.

Had the EVGA Hybrid GTX 980 Tis OC'd to 1486MHz GPU with +420 on the memory, and benched them prior to installing the 1080 Ti. Tested the 1080 Ti stock, and then with a +110MHz core OC, just bumping the temp/power target and the fan to Jet Engine Simulator Mode!! (Damn, this stock reference cooler fan is loud at 100%... 70-80% is more tolerable, but it still reaches 84C in a Heaven loop.) This card begs for better cooling.
_________________________________________
GTX 980 Ti SINGLE CARD OC
_________________________________________

Score: 2508 Heaven 1080p Single

Score: 1568 Heaven 1440p Single

Score: 4230 Valley 1080p Single

Score: 2628 Valley 1440p Single

Firestrike: 16400
19,348 Graphics
15,655 Physics
7,919 Combined

FireStrike_Extreme: 7,981
8,254 Graphics
15,749 Physics
3,799 Combined

_________________________________________
GTX 980 Ti SLI OC
_________________________________________

Score: 4117 Heaven 1080p SLI

Score: 2895 Heaven 1440p SLI

Score: 5466 Valley 1080p SLI

Score: 4187 Valley 1440p SLI

Firestrike: 22,150
31,393 Graphics
15,595 Physics
8,594 Combined

FireStrike_Extreme: 13,261
15,304 Graphics
15,492 Physics
5,981 Combined

_________________________________________
GTX 1080 Ti STOCK
_________________________________________

Score: 3544 Heaven 1080p

Score: 2207 Heaven 1440p

Score: 5658 Valley 1080p

Score: 3807 Valley 1440p

Firestrike: 21,269
27,409 Graphics
17,315 Physics
9,099 Combined

FireStrike_Extreme: 12,286
13,245 Graphics
17,291 Physics
6,214 Combined

_____________________________________________
GTX 1080 Ti Power Limit/Temp/Fan Max
_____________________________________________

Firestrike: 21,580
28,151 Graphics
17,257 Physics
9,087 Combined

Noticed clock change points while the +110MHz OC was applied:

2025MHz observed from 42C to 58C
2012MHz observed from 59C to 63C
2000MHz observed from 63C to 68C
1987MHz observed at 68C+
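Those observed change points amount to a temperature-to-clock-bin lookup; a minimal sketch using the thresholds above (the step behavior is approximate, and the thresholds are one card's observations, not a spec):

```python
# Pascal GPU Boost drops the clock in small bins as temperature rises.
# Thresholds taken from the observations above; this is a simplification.
BINS = [(58, 2025), (63, 2012), (68, 2000)]  # (max temp C, clock MHz)
FLOOR = 1987  # observed clock at 68C+

def boost_clock_mhz(temp_c):
    """Approximate boost clock for a given core temperature."""
    for max_temp, clock in BINS:
        if temp_c <= max_temp:
            return clock
    return FLOOR

print(boost_clock_mhz(45))  # 2025
print(boost_clock_mhz(70))  # 1987
```

This also shows why water cooling is worth a couple of bins: holding the core under 58C keeps the card in the top bin permanently.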


----------



## krutoydiesel

Quote:


> Originally Posted by *richiec77*
> 
> Got the card in earlier and benched my GTX 980 Ti Hybrids OC in SLI against the 1080 Ti Reference. It's nice! Except for the Jet engine in the room. Thank god the ek waterblock is coming soon....I Hope. Backorder from Performance PCS at this time.
> 
> Had EVGA Hybrid GTX 980 Ti's OC to 1486MHz GPU with +420 on the Memory. Benched them prior to installing the 1080 Ti. Tested Stock and then with a 110+MHz Core OC and just bumping Temp/Power target and fan to Jet Engine Simulator Mode!! (Damn this stock Ref cooler fan is loud at 100%....70-80% is more tolerable but still reaches 84C in a Heaven Loop) This card begs for better cooling.
> _________________________________________
> GTX 980 Ti SINGLE CARD OC
> _________________________________________
> 
> Score: 2508 Heaven 1080p Single
> 
> Score: 1568 Heaven 1440p Single
> 
> Score: 4230 Valley 1080p Single
> 
> Score: 2628 Valley 1440p Single
> 
> Firestrike: 16400
> 19,348 Graphics
> 15,655 Physics
> 7,919 Combined
> 
> FireStrike_Extreme: 7,981
> 8,254 Graphics
> 15,749 Physics
> 3,799 Combined
> 
> _________________________________________
> GTX 980 Ti SLI OC
> _________________________________________
> 
> Score: 4117 Heaven 1080p SLI
> 
> Score: 2895 Heaven 1440p SLI
> 
> Score: 5466 Valley 1080p SLI
> 
> Score: 4187 Valley 1440p SLI
> 
> Firestrike: 22,150
> 31,393 Graphics
> 15,595 Physics
> 8,594 Combined
> 
> FireStrike_Extreme: 13,261
> 15,304 Graphics
> 15,492 Physics
> 5,981 Combined
> 
> _________________________________________
> GTX 1080 Ti STOCK
> _________________________________________
> 
> Score: 3544 Heaven 1080p
> 
> Score: 2207 Heaven 1440p
> 
> Score: 5658 Valley 1080p
> 
> Score: 3807 Valley 1440p
> 
> Firestrike: 21,269
> 27,409 Graphics
> 17,315 Physics
> 9,099 Combined
> 
> FireStrike_Extreme: 12,286
> 13,245 Graphics
> 17,291 Physics
> 6,214 Combined
> 
> _____________________________________________
> GTX 1080 Ti Power Limit/Temp/Fan Max
> _____________________________________________
> 
> Firestrike: 21,580
> 28,151 Graphics
> 17,257 Physics
> 9,087 Combined
> 
> Noticed clock change points while the +110MHz OC was applied:
> 
> 2025MHz observed from 42C to 58C
> 2012MHz observed from 59C to 63C
> 2000MHz observed from 63C to 68C
> 1987MHz observed at 68C+


Now this is a proper post, thank you good sir.


----------



## lilchronic

Quote:


> Originally Posted by *vmanuelgm*
> 
> Which is your max core clock???
> What about your Ti???


My card boosts to 1911MHz with the power limit maxed out, so a +151 core offset gives me 2062MHz.


----------



## richiec77

Quote:


> Originally Posted by *krutoydiesel*
> 
> Now this is a proper post, thank you good sir.


Thanks. Not going to mess with OC much as the card is limited by thermals with the stock cooler. Once it's under water, the temps will be much better balanced and a more stable OC can be held.


----------



## richiec77

Quote:


> Originally Posted by *lilchronic*
> 
> My card boost's to 1911Mhz with power limit maxed out So +151 core offset gives me 2062Mhz


Saw basically the same thing. Stock, as it is out of the box, the card would boost to 1886 and start clocking up and down due to temps. Once the limits were raised to 120%/90C, the card hit 1911.

It also comes close to the max power limit as is, hitting around 111-114%.

Does removing the fan and LED crap from the card increase the amount of power available to the GPU? Anyone who went full waterblock notice this?


----------



## lilchronic

Quote:


> Originally Posted by *richiec77*
> 
> Saw basically the same thing. STOCK as is out of the box: card would boost to 1886 and start clocking up and down due to temps. Once Limit's raised to 120%/90C: card hit 1911.
> 
> Also comes close to max power limit as is. Hitting around 111-114%.
> 
> Does removing the fan and LED crap off the card increase the amount of power available for the GPU? Anyone who went full waterblock notice this?


Yeah, I have noticed this with pretty much all of my past GPUs, so I don't see why it would be any different with Pascal.

Also, just running cooler under a waterblock will make the card more efficient.


----------



## BoredErica

Quote:


> Originally Posted by *KraxKill*
> 
> Saying that it doesn't work better than high-end TIM requires just as much, if not more, evidence, since you're effectively calling out everybody making these claims.


Bro. You need to calm down. I'm totally open to testing this for myself. In fact I might just go buy some as a side project. Stop making this a liquid metal vs anti liquid metal thing. It's a 'is your evidence good' thing. If you bring up the same level of defense for normal TIM I'd be talking to you all the same.

As I've mentioned in my last post: Nobody is saying it doesn't work better. At least I'm not saying that. Let me quote you the relevant sentence from my last post.



> Question isn't whether liquid metal is better, the question is how much better in practice.





> Post some facts showing that it does NOT work significantly better than high-end TIM, or somebody claiming lackluster results from it. Something, please... I'm still waiting.


The burden of proof is not on me; the burden of proof is on you to demonstrate that something works. If I swapped out TIM for homeopathy or crystal healing, the example would be clearer to you. I was examining your evidence. Had you brought up better evidence, there would have been no problem.



> I made the claim that it does work from my experience with multiple GPUs and works better than high end TIM and certainly better than the default TIM from Nvidia!


If that is all you meant to say, then why did you try to quote thermal conductivity stats for the leading aftermarket thermal pastes? If you are not making the claim that liquid metal TIMs are better than aftermarket pastes then why advocate for liquid metal TIMs? That there are drawbacks is well known.



> You guys are saying it doesn't make a difference over quality TIM, so prove YOUR point! Otherwise you're the classic "hater". You say "it doesn't make much difference"; prove to me that it doesn't, otherwise it's just borderline nerd logic. Clearly you'll have to take somebody's word for it at some point, or better yet try it yourself.


Stop making this emotional. We're discussing reason and evidence. Going down your path is not constructive. Now: I didn't say liquid metal TIMs are not better than non-metal TIMs. Refer to my statements last time:



> Question isn't whether liquid metal is better, the question is how much better in practice.


The above quote shows that I am open to metal TIMs being better not just in theory but also in practice.



> To me the most important question is safety. If it is I can test in a more detailed manner.


The above quote demonstrates that I think more testing is needed to get a good conclusion.

You lumped me in with lilchronic. If you mean to say that I said 'it doesn't make much difference' then first show me where I said that. Otherwise do not accuse me of saying things I did not say.



> Moreover, you're assuming that the people who have made claims that it works are all placebo, noobs, and/or bull****ting? That's a bit egotistical, don't you think?


Actually, it's not. Arguably that's what the scientific method is for. Placebo is not something better men can overcome; that's a fool's errand. Placebo and expectation bias are something we overcome by careful documentation of methodology and results, peer review, and replication. When I accuse other people of inflating their overclocks, I know how it feels; I was tempted to do the exact same thing. This is why I set up a standardized testing methodology and verification process.



> Here are a bunch more examples of LM on mobile 760-780m GPUs. Are the 10C claims in this thread also not valid?


I'm not sure where the 10C comes from in this case. That looks more like 21C to me. It is near 100C, so we are dealing with very high temps, and cutting many degrees down is relatively easier there. However, that is still a big jump and is worth investigating. This is better than some random guy talking about a 10C decrease on their CPU with an H100i (with that literally being the long and short of it).



> You can scour the net yourself of countless others if you need further data.


I was merely looking at the evidence you brought forth. If I had said 'You're wrong, liquid TIMs barely make any difference' then maybe I should post some stats of my own. But I didn't do that, did I?



> Unless you can tell me how all these people, including myself, are wrong, prove them incompetent, or show that they are just buying into the hype or whatever it is you're trying to suggest, please stop making refuting claims.


I can easily do that: Two quotes down you tried to make a claim that liquid TIM will improve "10C at 100% TDP".



> If nothing else, I made my suggestion trying to help others. What's your assessment of my intention?


I don't think my assessment of your intention matters. If I think you intended to lie and deceive others I would've come out and said it. It's actually possible for people to be wrong and have good intentions. It's also possible for somebody to be right but go about proving it in a bad way. It's also possible for egos to get in the way of a good discussion.



> I'm willing to say that it will help to the tune of ~10C at 100% TDP for those that want to tackle it. If you don't see a good improvement you're doing something wrong plain and simple.


Again: 10C on what? Doing what? We may not be professional scientists but let's act like we are. I think we'll all be better for it. A no context claim isn't helpful and it only hurts your credibility.



> YOU are questioning others' claims, the claimed thermal conductivity ratings of the paste vs LM (which has upwards of 7 times greater thermal conductivity rating) all while suggesting that the difference is only 1-3C. How is the burden of proof on me?


Because a claim was made, and good evidence needs to be brought forth for it. If you don't understand how burden of proof works, try QualiaSoup's YouTube video on it.



> It frankly looks more and more like you are trying to justify to yourself why you're leaving 10C on the table by falsely discouraging others that may want to try it.


I didn't try to assume your motives. Maybe you shouldn't do it to me.



> If it's for the inherent risk factor, personal ability and electrical conductivity reasons associated with LM I get it


But that's like... very important. As I said: First and foremost I want to be sure it's safe. Then I can test it.


----------



## alucardis666

Ok. So here's my OC.














Honestly I'd hoped for a score of 1500 or better in Heaven.


----------



## vmanuelgm

Quote:


> Originally Posted by *lilchronic*
> 
> Dude this has been discussed 4 years ago.


Quote:


> Originally Posted by *lilchronic*
> 
> My card boost's to 1911Mhz with power limit maxed out So +151 core offset gives me 2062Mhz


Thanks Lil.

Are you on air or watercooled??? Shunt mod???

Sorry but i didnt read your past posts...


----------



## KingEngineRevUp

Quote:


> Originally Posted by *vmanuelgm*
> 
> Which is your max core clock???
> What about your Ti???


I'm at 2038 MHz on the core and +450 MHz on memory. No extra voltage.

I might try maxing out my voltage and OC some more later.


----------



## lilchronic

Quote:


> Originally Posted by *vmanuelgm*
> 
> Thanks Lil.
> 
> Are you on air or watercooled??? Shunt mod???
> 
> Sorry but i didnt read your past posts...


Air cooled with fan @ 100%; max temp I have seen was 61C with 23C ambient.
No shunt mod yet, waiting to see how much a waterblock will lessen the power throttling on this card.


----------



## Slackaveli

Quote:


> Originally Posted by *lilchronic*
> 
> That's bs he's just saying it could happen.
> 
> People have been running the 1080 for almost a year now. Don't see any of their cards dead.
> 
> I have a GTX 480 that's still alive and a 780 Ti that has been overvolted beyond what you can even do with the voltage slider, and they're still alive. By the time your card dies there will probably be 6 more generations of graphics cards.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And if it does die in a year or 2 or 3 it's under warranty.


Well, I killed an MSI 980 and an EVGA 980 Ti in the last two years running max volts, so I am inclined to believe him. Doesn't matter if you have a 3-year warranty anyway, truthfully.


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> My favorite new thing is Liquid electrical Tape!!!
> 
> For those that want significantly better thermals on air and don't want to bother with water.
> 
> Consider running liquid metal. It's not nearly as scary as people make it out to be. Unless you have shaky hands from Parkinson's or similar, of course.
> 
> Just flood the area around the GPU die with higher temp liquid tape and go to town. This will drastically lower your temps as it's nearly as good as solder in terms of thermal transfer and will drop your temps significantly.
> 
> When you're done with the card, just wipe it off carefully and peel off the tape.
> 
> Here's how my old 1080, and Titan X Maxwell before it, cleaned up after I was done with 'em. You can literally flood the area surrounding the socket with liquid electrical tape to ward off shorts and peel it off when done. Same with the shunt mod frankly.
> 
> One can always practice on old useless hardware first if scurrd....


it's pretty great stuff.


----------



## y2kcamaross

I can't get the voltage to show. I did what Guru3d said and it isn't working, even though my card is the same config as the one Guru3d has. Can someone with the same settings upload their MSIAfterburner.oem2 file?


----------



## lilchronic

Quote:


> Originally Posted by *Slackaveli*
> 
> well, i killed a msi 980 and a evga 980ti in the last two years running max volts so I am inclined to believe him. doesnt matter if you have a 3 year warranty anyway, truthfully,.


Must have been burning them cards up.

And why not? Does it say overvolting voids your warranty?

http://www.overclock.net/t/1464842/evga-over-voltage-question-and-warranty/0_50#post_21727448


----------



## AMDATI

Feels soo good to get rid of that GTX 970 coil whine


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> Ok. So here's my OC.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Honestly I'd hoped for a score of 1500 or better in Heaven.


That's a little too optimistic.


----------



## vmanuelgm

Quote:


> Originally Posted by *lilchronic*
> 
> Air cooled with fan @ 100%, max temp i have seen was 61c with 23c ambient
> Not shunt mod yet, waiting to see how much a waterblock will lessen the power throttling for this card.


Thanks. Waiting until you're watercooled.


----------



## vmanuelgm

Quote:


> Originally Posted by *SlimJ87D*
> 
> I'm at 2038 MHz on the core and +450 MHz on memory. No extra voltage.
> 
> I might try maxing out my voltage and OC some more later.












Sorry double post.


----------



## richiec77

Quote:


> Originally Posted by *y2kcamaross*
> 
> I can't get the voltage to show, did what Guru3d said, isn't working, my card is the same config as the one Guru3d has, can someone with the same settings upload their msiafterburner.oem2 file?


Having issues as well with AB 4.3.0. It doesn't unlock the slider.

I do have the ability to pull up the volt/freq curve. Use CTRL+F for that.

Best OC as of now using the curve and fan curve is 2038... but it quickly drops to 1978 due to temps getting into the 78C range.


----------



## lilchronic

Quote:


> Originally Posted by *y2kcamaross*
> 
> I can't get the voltage to show, did what Guru3d said, isn't working, my card is the same config as the one Guru3d has, can someone with the same settings upload their msiafterburner.oem2 file?


got a link to the guide?


----------



## pompss

Too lazy and high to install it


----------



## richiec77

Quote:


> Originally Posted by *lilchronic*
> 
> got a link to the guide?


https://www.guru3d.com/articles-pages/geforce-gtx-1080-ti-review,32.html

Post #692 linked the article and code string.

Added the code to the .oem2 file. Saved. Checked in Afterburner for the voltage adjustment drop-down. That doesn't appear in the settings.

But the voltage curve still seems to work.

Also linked this screenshot
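For anyone trying to reproduce this: the `.oem2` profile edit is card-specific and comes from the Guru3d article linked above, but the general-purpose voltage switches (the ones the checkboxes in Afterburner's settings toggle) live in `MSIAfterburner.cfg`. A minimal sketch of the relevant keys as found in a stock 4.3.0 install; back up the file first, and note that setting them doesn't guarantee the slider actually appears on every board:

```ini
; MSIAfterburner.cfg -- sketch of the voltage-related switches.
; These are the same flags the "Unlock voltage control/monitoring"
; checkboxes write; treat values here as an example, not a recipe.
[Settings]
UnlockVoltageControl    = 1
UnlockVoltageMonitoring = 1
ForceConstantVoltage    = 0
```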


----------



## KedarWolf

Quote:


> Originally Posted by *richiec77*
> 
> Quote:
> 
> 
> 
> Originally Posted by *y2kcamaross*
> 
> I can't get the voltage to show, did what Guru3d said, isn't working, my card is the same config as the one Guru3d has, can someone with the same settings upload their msiafterburner.oem2 file?
> 
> 
> 
> Having issues as well with 4.3.0 AB. Doesn't unlock the slider.
> 
> Do have the ability to pull up the Volt/Freq curve. Use CTRL+F for that.
> 
> Best OC as of now using the curve and fan curve is 2038....but it quickly drops to 1978 due to temps getting into the 78C range.
Click to expand...

I use a headset on my PC, so I have the fan starting at 50C, as low as it'll go, but at 100% by 60C, and my temps even at 2084 core / 1.093V on a voltage curve don't go over 70C. My ambient temps are low though, probably around 20C.

I'm getting a waterblock soon, so I'm not worried about the wear and tear on the fan, and with the headset I don't hear it.

At 2084, as soon as I hit 60C it downclocks to 1981, so right now I run 2000 / 1.093V on a voltage curve. It seems to stay at 1991 this way when running Heaven etc.









When i get home from work later tonight I'm going to loop a Firestrike scene and play with core and memory on the fly, see what the optimum core and memory settings are for frame rates.









With really low ambient temps I've gotten to 650 on memory, and with decent ambient temps I can run it at 634 24/7.









Has anyone unlocked the voltage slider yet? Might get a bin or two out of that.


----------



## alucardis666

Quote:


> Originally Posted by *richiec77*
> 
> Having issues as well with 4.3.0 AB. Doesn't unlock the slider.
> 
> Do have the ability to pull up the Volt/Freq curve. Use CTRL+F for that.
> 
> Best OC as of now using the curve and fan curve is 2038....but it quickly drops to 1978 due to temps getting into the 78C range.


Same boat as you, can others adjust the voltage slider? Mine is grayed out.


----------



## pompss

Quote:


> Originally Posted by *alucardis666*
> 
> Same boat as you, can others adjust the voltage slider? Mine is grayed out.


Follow the tutorial above this post. Worked for me.


----------



## pompss

Quote:


> Originally Posted by *richiec77*
> 
> https://www.guru3d.com/articles-pages/geforce-gtx-1080-ti-review,32.html
> 
> Post #692 linked article and code string.
> 
> Added code to the .oem2 file. Saved. Checked in AfterBurner for the voltage adjustment drop-down. That doesn't appear in the settings.
> 
> But voltage curve still seems to works.
> 
> also linked this Screenshot


Thanks for sharing. It works!


----------



## alucardis666

Anyone run the bench within Wildlands? How're your frames?

@4k Ultra this still isn't playable with 1 card... :-(

@1440p however, that's a different story.


----------



## hokk

Quote:


> Originally Posted by *stocksux*
> 
> +300 on core??? Pics please


Quote:


> Originally Posted by *SlimJ87D*
> 
> Well saying +300 on the core can really just be 2000 mhz with thermal throttling.
> 
> I'd like to see what his core is actually at and what temperatures.


Quote:


> Originally Posted by *jsutter71*
> 
> Interesting. I have never been able to go past 225 on my TXP's which are also under water.


Sorry dudes,

I mistyped, not used to this keyboard.

What I meant: +150MHz on the core and +300 on the memory.

I really don't know how it came out like that lol


----------



## Addsome

Quote:


> Originally Posted by *wstanci3*
> 
> That would make me alarmed.
> Does the card also have a whining sound once the fans >70%?
> Honestly, I'd return it while my return window was open. Safe than sorry, friend.


No, there's no whining noise, and I just ran Heaven and the card goes up to 86C, but I'm not smelling anything anymore. Confused as to what to do. Card's been in my system 2 days.


----------



## alucardis666

Quote:


> Originally Posted by *Addsome*
> 
> No there's no whining noise and also I just ran heaven and the card goes up to 86c but I'm not smelling anything anymore. Confused as to what to do. Cards been in my system 2 days.


My card had a strange hot smell too that I can't say I noticed before with any other card, seems to have gone away after a few hrs, that or I just got used to it and my brain is tuning it out.
















I think maybe it's just the card breaking in. But so far so good, no issues as of yet.


----------



## y2kcamaross

Quote:


> Originally Posted by *pompss*
> 
> Thanks for sharing. It works!


can you upload your modified oem2 file?


----------



## alucardis666

Quote:


> Originally Posted by *y2kcamaross*
> 
> can you upload your modified oem2 file?


Yes please. I tried and it corrupted AB... Had to reinstall


----------



## Radox-0

Quote:


> Originally Posted by *alucardis666*
> 
> Anyone run the bench within Wildlands? How're your frames?
> 
> @4k Ultra this still isn't playable with 1 card... :-(
> 
> @1440p however, that's a different story.


Posted my benches back on page 89; for Wildlands I get the following:

Ghost Recon Wildlands - Very High Preset

GTX 1080: 57.5 FPS
GTX 1080 OC'd: 61.21 FPS (6.5% increase)
GTX 1080 Ti: 71.25 FPS (23.9% increase)
GTX 1080 Ti OC'd: 76.59 FPS (33.2% increase)

A pair of 1080 Tis at stock keeps the game running near 100 fps for most intents and purposes at 3440x1440 for me on the Very High preset.
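Those uplift figures check out against the quoted averages; recomputing them from the stock GTX 1080 baseline:

```python
# Recompute the uplift percentages from the average FPS numbers above
# (baseline: stock GTX 1080 at 57.5 FPS in the Wildlands built-in bench).
baseline = 57.5
results = [
    ("GTX 1080 OC'd", 61.21),
    ("GTX 1080 Ti", 71.25),
    ("GTX 1080 Ti OC'd", 76.59),
]

for label, fps in results:
    uplift = (fps / baseline - 1) * 100
    print(f"{label}: {fps} FPS (+{uplift:.1f}%)")
# -> +6.5%, +23.9%, +33.2%
```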


----------



## alucardis666

Quote:


> Originally Posted by *Radox-0*
> 
> Posted my benchs back on page 89, for wildland I get the following:
> 
> Ghost Recon Wildland - Very High Preset
> 
> GTX 1080: 57.5 FPS
> GTX 1080 OC'd: 61.21 FPS (6.5% increase)
> GTX 1080Ti: 71.25 FPS (23.9% increase)
> GTX 1080 Ti OC'd: 76.59 FPS (33.2% increase)
> A pair of 1080Ti's at stock keep the game running near 100hz for most intents and purposes at 3440 x 1440 for me on very high preset


Thanks for sharing! How about @ Ultra preset?

You'd think this card would be up to snuff; the game doesn't even look that pretty.


----------



## Slackaveli

Quote:


> Originally Posted by *richiec77*
> 
> Having issues as well with 4.3.0 AB. Doesn't unlock the slider.
> 
> Do have the ability to pull up the Volt/Freq curve. Use CRTL+F for that.
> 
> Best OC as of now using the curve and fan curve is 2038....but it quickly drops to 1978 due to temps getting into the 78C range.


I'm about the same. We have "middling" cards, not terrible, not great. I'm OK with it. My boost is 2025, with +155 being my top without any voltage bumps. It downclocks to 2012 or 2000 until over 70C, then down to 1987 and 1974. Haven't seen lower than that. If you do the cooling mods and gain 10-20C, you will stay locked at your top end.


----------



## dboythagr8

Got my 1080 Ti installed... and no sooner did I install drivers than I got an email notification that cards were available directly from EVGA.

Couldn't resist.

Ordered a 2nd card from EVGA, UPS Next Day Air :x


----------



## KraxKill

Quote:


> Originally Posted by *Darkwizzie*
> 
> I'm totally open to testing this for myself. In fact I might just go buy some as a side project. Stop making this a liquid metal vs anti liquid metal thing. It's a 'is your evidence good' thing. If you bring up the same level of defense for normal TIM I'd be talking to you all the same.
> 
> As I've mentioned in my last post: Nobody is saying it doesn't work better. At least I'm not saying that.
> 
> When I accuse other people of inflating their overclocks, I know how it feels. I was tempted to do the exact same thing. This is why I set up a standardized testing methodology and verification process.
> I don't think my assessment of your intention matters. If I think you intended to lie and deceive others I would've come out and said it. It's actually possible for people to be wrong and have good intentions. It's also possible for somebody to be right but go about proving it in a bad way. It's also possible for egos to get in the way of a good discussion.
> Again: 10C on what? Doing what? We may not be professional scientists but let's act like we are. I think we'll all be better for it.
> 
> As I said: First and foremost I want to be sure it's safe. Then I can test it.


Point taken, and apologies for lumping your post in. It made for a lot of lost context. I should have addressed your posts individually.

I agree. I would love for you or somebody else to conduct, as you say, "a standardized testing methodology and verification process". I'm not about to conduct one on LM. I have no qualms about the fact that I present anecdotal evidence on the merits of LM in GPU-specific applications, based on the reports of a limited number of others and myself. If you can point me to some evidence, I am, as you are, all ears. But to be fair, as with threads prior, this one is about to get filled with unstandardized overclock claims, misunderstood sustained-boost vs. momentary-clock context and more. These will all be online claims and opinions. I took online claims and reports of LM effectiveness on this very forum, among others, as reason enough for me to try it. Not being priced outrageously out of reach, I did. First with great results on my 4790K, then on my GPU and consequent GPUs from there on out.

Report back if you do... I sure don't want to waste any more time applying the stuff if it's no more than 1-3C better than standard TIM (which has frankly all felt the same to me for as long as I can remember). One thing I do know for a fact is this stuff is not exactly a breeze to work with. It's liquid, a trick to spread, is electrically conductive and all that goes with that.


----------



## Radox-0

Quote:


> Originally Posted by *alucardis666*
> 
> Thanks for sharing! How about @ Ultra preset?
> 
> You'd think this card would be up to snuff, the game doesn't even look at pretty.


Okay, just ran Ultra, and now I know why I did Very High only: so little gain for such a tank in FPS!

Both are run at stock:

Single card: 50.41 (Min 42.67, Max 54.89)
SLI: 58.21 (Min 40.12, Max 63.63)


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> Thanks for the props man...
> I did get very lucky with the 4790k. It's been an amazing chip at 5.0 and just 1.324v / 1.712v input. Cache at 44. RealBench and Prime95 stable at 8+ hours.
> 
> But it's spoiled me. Next build chip coming from the lottery for sure.


A golden chip will do that to you. I went the 5775C route myself and it is a beast. I love it so much, and my 128MB eDRAM L4 cache is NASTY for gaming. Made me stick with her for a while, because I won't give up that cache. When it's brought back out I'll upgrade.


----------



## trawetSluaP

Quote:


> Originally Posted by *alucardis666*
> 
> Anyone run the bench within Wildlands? How're your frames?
> 
> @4k Ultra this still isn't playable with 1 card... :-(
> 
> @1440p however, that's a different story.


I had someone make fun of me for buying a 1080 Ti to play at 1440p yesterday. Apparently I should be playing at 4K and I'm dumb for not doing so! Some people can be such tools!!!


----------



## alucardis666

Quote:


> Originally Posted by *Radox-0*
> 
> Okay, just ran Ultra and now I know why I did very high only, so little gains for a tank in the fps!
> 
> Both are run at stock:
> 
> Single card: 50.41 (Min 42.67, Max 54.89)
> SLI: 58.21 (Min 40.12, Max 63.63)


Thanks for sharing!


----------



## Slackaveli

Quote:


> Originally Posted by *eXistencelies*
> 
> https://news.samsung.com/global/samsung-to-introduce-new-quantum-dot-curved-monitor-at-ces-2017


damn those are freaking sexy


----------



## Radox-0

Quote:


> Originally Posted by *alucardis666*
> 
> Thanks for sharing!


No Probs









What sort of frames is 4K putting you at? Seems to be rock solid at 1440p at Ultra, which seems about right if it's 50 or so FPS at 3440x1440.


----------



## alucardis666

Quote:


> Originally Posted by *alucardis666*
> 
> Thanks for sharing!


Quote:


> Originally Posted by *trawetSluaP*
> 
> I had someone make fun of me for buying a 1080 Ti to play at 1440p yesterday. Apparently I should be playing at 4K and I'm dumb for not doing so! Some people can be such tools!!!


Not at all. It all depends on optimization. I'd say conservatively, 80% of AAA games out now and in the next 1-2 years should run maxed out @ 4K just fine with an OC'd 1080 Ti. For the other 20% you'll either need to drop to 1440p or go SLI.


----------



## wstanci3

Quote:


> Originally Posted by *Addsome*
> 
> No there's no whining noise and also I just ran heaven and the card goes up to 86c but I'm not smelling anything anymore. Confused as to what to do. Cards been in my system 2 days.


Quote:


> Originally Posted by *alucardis666*
> 
> My card had a strange hot smell too that I can't say I noticed before with any other card, seems to have gone away after a few hrs, that or I just got used to it and my brain is tuning it out.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think maybe it's just the card breaking in. But so far so good, no issues as of yet.


This could be the case. If the smell went away, the performance is what's expected, and nothing else amiss happens, I'd say keep it. Otherwise, if the smell were to continue, I'd RMA.


----------



## Addsome

Quote:


> Originally Posted by *alucardis666*
> 
> My card had a strange hot smell too that I can't say I noticed before with any other card, seems to have gone away after a few hrs, that or I just got used to it and my brain is tuning it out.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think maybe it's just the card breaking in. But so far so good, no issues as of yet.


So you think we're good?


----------



## alucardis666

Quote:


> Originally Posted by *Radox-0*
> 
> No Probs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What sort of frames is 4k putting you at? Seems to be rock solid at 1440p at ultra which seems about right if its 50 or so fps at 3440 x 1440


Here's the story at 4k.

*ULTRA*

*VERY HIGH*

*HIGH*


Quote:


> Originally Posted by *Addsome*
> 
> So you think we're good?


Well, my card hasn't died yet, so so far so good, at least. We'll see how it goes for the rest of the week.


----------



## wstanci3

If the smell goes away, then I'd say so. Just keep an eye on it. If you're not comfortable with it, then return/exchange.


----------



## Addsome

Quote:


> Originally Posted by *alucardis666*
> 
> My card had a strange hot smell too that I can't say I noticed before with any other card, seems to have gone away after a few hrs, that or I just got used to it and my brain is tuning it out.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think maybe it's just the card breaking in. But so far so good, no issues as of yet.


My score in Heaven is lower than people with similar setups, but my card drops to 2025 at 100% fan speed, so idk what's wrong. The smell for now seems to be gone; I'll test it for a couple days. Really wanted to install my hybrid kit







aha


----------



## KraxKill

Quote:


> Originally Posted by *alucardis666*
> 
> Ok. So here's my OC.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Honestly I'd hopped for 1500 score or better in heaven.


Did you test anything else but Heaven? For me it's been the hardest of the benches between it, Valley, TimeSpy, and Firestrike. I can actually run a bin or two higher in the other 3 benches. Wonder if this is the case for others as well.


----------



## Slackaveli

Quote:


> Originally Posted by *Addsome*
> 
> My score in heaven is lower than people with similar setups but my card drops to 2025 at 100% fan speed so idk whats wrong. The smell for now seems to be gone, ill test it for a couple days, really wanted to install my hybrid kit
> 
> 
> 
> 
> 
> 
> 
> aha


I drop to 2000 or less under heavy load with a 1:1.2 fan curve. Don't fret. You are covered for 3 years if it dies, so if you don't smell it again I think you are fine. Maybe don't run that fan at 5,000 RPM full-time.


----------



## alucardis666

Quote:


> Originally Posted by *KraxKill*
> 
> Did you test anything else but Heaven? For me it's been the hardest of the benches between it, Valley, TimeSpy, and Firestrike. I can actually run a bin or two higher in the other 3 benches. Wonder if this is the case for others as well.


Haven't tested anything besides Heaven and Wildlands bench so far.


----------



## PewnFlavorTang

Here's 1440p Ultra single 1080ti


----------



## Radox-0

Quote:


> Originally Posted by *alucardis666*
> 
> Here's the story at 4k.


Wow, pretty brutal on Ultra and Very High at 4K. There will always be those few titles which just absolutely suck up all the GPU power you can throw at them and the frames still suck.


----------



## alucardis666

Quote:


> Originally Posted by *PewnFlavorTang*
> 
> Here's 1440p Ultra single 1080ti


What's your CPU clocked @?

Here are my results


Quote:


> Originally Posted by *Radox-0*
> 
> Wow, pretty brutal on Ultra and Very High at 4k. Always be those few titles which just absolutely suck up all the GPU power you can throw at it and frames still such.


Yea, I'm settling for 1440p @ Very High with some things on Ultra. Still dips at times. Kinda wondering if this game is more CPU limited than GPU limited and my 4.3GHz is too low.


----------



## Addsome

Quote:


> Originally Posted by *Slackaveli*
> 
> i drop to 2000 or less under heavy load with a 1:1.2 fan curve. don't fret. you are covered for 3 years if it dies so if you dont smell it again i think you are fine. maybe dont run that fan at 5,000 rpms fulltime.


Im gonna put the 1080 hybrid kit on it soon so no more 100% fan lol


----------



## alucardis666

::DELETED::


----------



## PewnFlavorTang

Quote:


> Originally Posted by *alucardis666*
> 
> What's your CPU clocked @?


4.1ghz right now. What about you?

nevermind saw 4.3ghz.


----------



## dboythagr8

Quote:


> Originally Posted by *PewnFlavorTang*
> 
> Here's 1440p Ultra single 1080ti


What game / bench is this?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KraxKill*
> 
> Did you test anything else but Heaven? For me it's been the hardest of the benches between it, Valley, TimeSpy, and Firestrike. I can actually run a bin or two higher in the other 3 benches. Wonder if this is the case for others as well.


No joke, Witcher 3 with HairWorks has always been the best stress test for me.


----------



## Slackaveli

Quote:


> Originally Posted by *lilchronic*
> 
> My card boost's to 1911Mhz with power limit maxed out So +151 core offset gives me 2062Mhz


That's a winner! Mine boosts to 1876 with power limit maxed, and +151 gives me 2025 :/ Damn!


----------



## Slackaveli

Quote:


> Originally Posted by *lilchronic*
> 
> Must have been burning them cards up.
> 
> And why not? Does it say overvolting voids your warranty?
> 
> http://www.overclock.net/t/1464842/evga-over-voltage-question-and-warranty/0_50#post_21727448


EVGA doesn't care. They are the OC brand of choice.


----------



## Radox-0

Quote:


> Originally Posted by *alucardis666*
> 
> What's your CPU clocked @?
> 
> Here are my results
> 
> 
> Yea, I'm settling for 1440p @ Very High with some things on ultra. Still dips at times. Kinda wondering if this game is more cpu limited than GPU and my 4.3ghz is too low.


Ran it at 1440p out of curiosity with a stock GPU. For me it's GPU bound on the run. My boost clocks are lower, I suspect due to the higher temps I hit (the cards are in a 3-way sandwich; the other two aren't actually running, they just show up). But the CPU does not seem to be getting hit too hard; rather, the GPU is pegged at near 100% for my runs.


----------



## PewnFlavorTang

Interesting how the Ryzen seems to be faring a little better.


----------



## alucardis666

Quote:


> Originally Posted by *Radox-0*
> 
> Ran it at 1440p out of curiosity with stock GPU. For me am GPU bound on the run. My clocks are lower on the GPU's in terms of boost I suspect due to higher temps I hit (in 3 way sandwich, not actually running other two cards, they show up) But the CPU does not seem to be getting hit too hard, rather the GPU is pegged at near 100% for my runs


Poor optimization I guess. :-/
Quote:


> Originally Posted by *PewnFlavorTang*
> 
> interesting how the Ryzen seems to be fairing a little better.


Maybe Ubisoft is making cloak-and-dagger "made for AMD" titles








Quote:


> Originally Posted by *dboythagr8*
> 
> What game / bench is this?


Ghost Recon Wildlands bench under graphics settings


----------



## sWaY20

Well, I wanted to wait a few weeks before ordering a water block; that didn't last 10 min. Heard that jet in my PC, went to the EK site and ordered a block.


----------



## alucardis666

Quote:


> Originally Posted by *sWaY20*
> 
> Well i wanted to wait a few weeks before ordering a water block, that didn't last 10 min. Heard that jet in my pc, went to ek site and ordered a block.


The Ti cooler does seem to be noticeably louder than the 1080 FE cooler above ~65%. Guess it's all that extra air coming out of the I/O. The card also seems to run hotter than the 1080 too... Don't know how I feel about it just yet. Might need to step up, if EVGA will let me, in April when they release the ICX version.


----------



## dseg

Quote:


> Originally Posted by *D13mass*
> 
> I`m going to buy this one Watercool HEATKILLER® IV for TITAN X (Pascal) - ACRYL


Do you know if you need a Heatkiller backplate or if it uses the stock backplate?


----------



## KingEngineRevUp

So how do we get this darn voltage unlocked in MSI afterburner. When I open that file, everything is in Asian characters!


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> So how do we get this darn voltage unlocked in MSI afterburner. When I open that file, everything is in Asian characters!


Same.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> Same.


Hopefully someone generous can upload their file for us


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> Hopefully someone generous can upload their file for us


I'll make the popcorn.









*Edit:*

Can you guys post some GPU-Z logs during gaming/Heaven? Despite my curve in AB maxing @ 2037, my card never seems to go past 2012, and does so rarely, even with the fan @ 100% and the card loading in the mid 60s. Not sure if this is the thermal throttling being extra aggressive or what.
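For anyone comparing: a throwaway script along these lines can pull the max core clock and temperature out of a GPU-Z sensor log. The column names are taken from a typical "GPU-Z Sensor Log.txt" header and may differ between GPU-Z versions, so treat this as a sketch and adjust them to match your file:

```python
# Sketch: summarize a GPU-Z sensor log (CSV-like, comma separated).
# Column names below are assumptions based on a typical log header.
import csv

def summarize(path):
    clocks, temps = [], []
    with open(path, newline="", encoding="utf-8", errors="replace") as f:
        reader = csv.DictReader(f, skipinitialspace=True)
        for row in reader:
            try:
                clocks.append(float(row["GPU Core Clock [MHz]"]))
                temps.append(float(row["GPU Temperature [°C]"]))
            except (KeyError, TypeError, ValueError):
                continue  # skip repeated headers / malformed lines
    return max(clocks), max(temps)
```

Run it over a log captured during a Heaven loop and the max clock will show whether the card ever actually holds the top of the curve.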


----------



## BroHamBone

Without a voltage change I was hitting 1873. Reset the stats, increased voltage, and hit 1898 on Firestrike Ultra. Before resetting the stats, though, I'd seen 1928MHz on the max. Maybe when I ran Extreme...


----------



## sWaY20

Quote:


> Originally Posted by *alucardis666*
> 
> The Ti cooler does seem to be noticeably louder than the 1080 FE cooler above ~65%. Guess it's all that extra air coming out of the i/o. Card also seems to run hotter than the 1080 too... Don't know how I feel about it just yet. Might need to step-up if EVGA will let me in April when they release the ICX version.


I really wanted a 1080 Ti Classy, but I was impatient this time. I think I'll be perfectly content once it's under water and quiet.


----------



## BigMack70

Quote:


> Originally Posted by *SlimJ87D*
> 
> So how do we get this darn voltage unlocked in MSI afterburner. When I open that file, everything is in Asian characters!


Are you using a beta version of Afterburner 4.3.0? I was, and had that problem. I uninstalled and then downloaded and installed 4.3.0 final and could edit the files in a text editor properly.
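On the "Asian characters" symptom: that's the classic sign of a text file being opened with the wrong encoding (for example, bytes interpreted as UTF-16 map pairs of characters onto CJK code points). If reinstalling doesn't fix it, a quick hedged way to check what encoding the file was actually saved in (the candidate list here is an assumption, not documented Afterburner behavior):

```python
# Sketch: try a few common encodings and report the first one that
# decodes the file without errors.
def probe_encoding(path, candidates=("utf-8-sig", "utf-8", "utf-16", "cp1252")):
    raw = open(path, "rb").read()
    for enc in candidates:
        try:
            raw.decode(enc)
            return enc
        except UnicodeDecodeError:
            continue
    return None  # none of the candidates decoded cleanly
```

Re-opening the file in an editor with the reported encoding (or re-saving it as plain UTF-8) should make the text readable again.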


----------



## Code-Red

Looks like Newegg created a shipping label on Friday, but didn't bother actually shipping it until today. Not getting it until the 22nd now, not impressed at all.


----------



## KedarWolf

OEM file for peeps: you might need to right-click, Properties, Unblock; then unzip and replace your .oem2 with it. Working on my Gigabyte 1080 Ti FE.

MSIAfterburner.zip 1k .zip file


----------



## alucardis666

Quote:


> Originally Posted by *KedarWolf*
> 
> OEM File For peeps, might need to right click, Properties, Unblock, replace .oem2 with it, working on my Gigabyte 1080 Ti FE.
> 
> MSIAfterburner.zip 1k .zip file


Thanks so much!


----------



## dboythagr8

Straight out the box, no OC. Only raised power to 120% and set temp limit to 84c. With my custom fan curve, card topped out at 74c. CPU was at 4.5ghz.

1080Ti


For reference here are other results I have

1070 Strix


1060


And for giggles, the Titan Black Tri-SLI setup


I have Titan X Maxwell results, but it looks like I only ran those at 4k.


----------



## alucardis666

Quote:


> Originally Posted by *KedarWolf*
> 
> OEM File For peeps, might need to right click, Properties, Unblock, replace .oem2 with it, working on my Gigabyte 1080 Ti FE.
> 
> MSIAfterburner.zip 1k .zip file


Ok. Can't get it to work...

https://www.guru3d.com/articles-pages/geforce-gtx-1080-ti-review,32.html

Not sure what I'm doing wrong here...


----------



## Belmire

1080Tis SLI checking in. Beast! Firestrike scores are watercooled. Game pics are on blower fans at 60%.

Firestrike score still holding on!!!
http://www.3dmark.com/fs/11978782

Witcher 3 4k:


Crysis 3 4k:


Ryse 4k:


----------



## MURDoctrine

Quote:


> Originally Posted by *trawetSluaP*
> 
> I had someone make fun of me for buying a 1080 Ti to play at 1440p yesterday. Apparently I should be playing at 4K and I'm dumb for not doing so! Someone people can be such tools!!!


I bought one and play at 1080p. Haters gonna hate.


----------



## KedarWolf

Quote:


> Originally Posted by *alucardis666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> OEM File For peeps, might need to right click, Properties, Unblock, replace .oem2 with it, working on my Gigabyte 1080 Ti FE.
> 
> MSIAfterburner.zip 1k .zip file
> 
> 
> 
> 
> Ok. Can't get it to work...
> 
> https://www.guru3d.com/articles-pages/geforce-gtx-1080-ti-review,32.html
> 
> Not sure what I'm doing wrong here...
Click to expand...

Made an .oem2 file for you.

Right click, Properties, 'Unblock' unzip, replace your .oem2 file with it.









MSIAfterburner.zip 1k .zip file


----------



## pompss

Quote:


> Originally Posted by *y2kcamaross*
> 
> can you upload your modified oem2 file?


Quote:


> Originally Posted by *alucardis666*
> 
> Yes please. I tried and it corrupted AB... Had to reinstall


Pretty useless

ITs like the titan x there is no benefit adding more voltage


----------



## pompss

My idle temperature is 54C, which is pretty high; it should be 30C, 35C max.

Anyone getting the same temps?

I'll try changing the thermal paste and see if it gets better.


----------



## AMDATI

Quote:


> Originally Posted by *MURDoctrine*
> 
> I bought one and play at 1080p. Haters gonna hate.


I'm hating hard over here! 1440p is where it's at!


----------



## Somasonic

Quote:


> Originally Posted by *AMDATI*
> 
> I'm hating hard over here! 1440p is where it's at!


Maybe they're downsampling 4k...


----------



## TacticalDev91

Quote:


> Originally Posted by *Belmire*


Are you using hairworks for Witcher3?


----------



## alucardis666

Quote:


> Originally Posted by *KedarWolf*
> 
> Made an .oem2 file for you.
> 
> Right click, Properties, 'Unblock' unzip, replace your .oem2 file with it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MSIAfterburner.zip 1k .zip file


Thanks! I'll give it a shot.

*EDIT:*

Success! +Rep given!



*EDIT 2:*

Seems even maxing out the voltage slider to 100 doesn't increase my stability beyond +135 core and +500 mem.


----------



## Belmire

Quote:


> Originally Posted by *TacticalDev91*
> 
> Are you using hairworks for Witcher3?


Hairworks off on that pic. I just turned Hairwork on in this one and with watercooling:


----------



## TacticalDev91

Quote:


> Originally Posted by *Belmire*
> 
> Hairworks off on that pic. I just turned Hairwork on in this one


Quote:


> Originally Posted by *Belmire*
> 
> Hairworks off on that pic. I just turned Hairwork on in this one and with watercooling:


Thanks man! Pretty decent sli scaling. My 2 cards are stuck in the middle of the "blizzard" by me. Hopefully they will get here by Friday


----------



## MURDoctrine

Quote:


> Originally Posted by *Somasonic*
> 
> Maybe they're downsampling 4k...


I'm going to upgrade my monitor setup eventually to either 1440p or 4k. Probably 4k now that I should have the horsepower to handle it. But for the time being I'm going to be using DSR or Surround to alleviate the CPU bottlenecks I'll face with my 4770k.


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> Made an .oem2 file for you.
> 
> Right click, Properties, 'Unblock' unzip, replace your .oem2 file with it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MSIAfterburner.zip 1k .zip file


so this works on the nvidia ref models?


----------



## Slackaveli

Quote:


> Originally Posted by *pompss*
> 
> My idle temperature are 54c and its pretty high should be 30c max 35c
> 
> Anyone get the same temps?
> 
> I will try to check and change the thermal paste see if gets better


my idle temps are locked at 45C with the fan at 45%


----------



## Slackaveli

Quote:


> Originally Posted by *Belmire*
> 
> Hairworks off on that pic. I just turned Hairwork on in this one and with watercooling:


nice, huhuhu


----------



## KedarWolf

Quote:


> Originally Posted by *alucardis666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> OEM File For peeps, might need to right click, Properties, Unblock, replace .oem2 with it, working on my Gigabyte 1080 Ti FE.
> 
> MSIAfterburner.zip 1k .zip file
> 
> 
> 
> 
> Ok. Can't get it to work...
> 
> https://www.guru3d.com/articles-pages/geforce-gtx-1080-ti-review,32.html
> 
> Not sure what I'm doing wrong here...
Click to expand...

This one for oem models.


----------



## Slackaveli

Quote:


> Originally Posted by *MURDoctrine*
> 
> I'm going to upgrade my monitor setup eventually to either 1440p or 4k. Probably 4k now that I should have the horsepower to handle it. But for the time being I'm going to be using DSR or Surround to alleviate the CPU bottlenecks I'll face with my 4770k.


you got that 4770k on a z97? if so, get a Broadwell i7-5775C and end CPU bottlenecks (huge 128MB L4 cache)


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> This one for oem models.


ty sir


----------



## MURDoctrine

Quote:


> Originally Posted by *Slackaveli*
> 
> you got that 4770k on a z97? if so, get a broadwell i7-5775c and end cpu bottlenecks ( 128mb HUGE L4 cache)


Nah still on z87 with it. Planning on doing a whole refresh later this year. I do have a z97 mini-itx board but this 4770k will be going into that rig with my 980 to be a portable VR machine.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> so this works on the nvidia ref models?


Worked for me.


----------



## alucardis666

With how loud this card is and how poorly it seems to cool, I'm debating either a re-TIM or the Kraken mod.

Thoughts?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> With how loud this card is and how poorly it seems to cool I'm debating doing either a re-tim or doing the kraken mod.
> 
> Thoughts?


See if you can score an EVGA Hybrid AIO on its own. Just remove the heatsink and install it onto the reference design.

If you want to just use a H55:

https://www.reddit.com/r/5z91t7/mod_the_poor_mans_hybrid_h55_gtx_10xx_fe/


----------



## AMDATI

FE cooling FTW


----------



## Slackaveli

Quote:


> Originally Posted by *MURDoctrine*
> 
> Nah still on z87 with it. Planning on doing a whole refresh later this year. I do have a z97 mini-itx board but this 4770k will be going into that rig with my 980 to be a portable VR machine.


Nice, good use. I gave my son my 4790k and was deciding whether to replace it or buy a new mobo, RAM, and chip. Ended up getting the Broadwell and, man, I love it. Noticeably better in several games, and it sips juice like an LED bulb.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> See if you can score a EVGA hybrid AIO on its own. Just remove the heatsink and install it onto the reference design.
> 
> If you want to just use a H55:
> 
> https://www.reddit.com/r/5z91t7/mod_the_poor_mans_hybrid_h55_gtx_10xx_fe/


Cool write up. But I guess since I don't have a single rad aio lying around it's cheaper to go with this...

https://www.newegg.com/Product/Product.aspx?Item=N82E16814998133&ignorebbr=1

And it'll work with the 1080ti? Also what fan should I replace the included one with?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> Cool write up. But I guess since I don't have a single rad aio lying around it's cheaper to go with this...
> 
> https://www.newegg.com/Product/Product.aspx?Item=N82E16814998133&ignorebbr=1
> 
> And it'll work with the 1080ti? Also what fan should I replace the included one with?


Personally, if you're going to buy a new hybrid kit then I'd just wait for one made for the 1080 Ti to come out.

Or buy one from this guy.


https://www.reddit.com/r/5y9sh2/usamd_h_evga_980_hybrid_cooling_kit_w_paypal/

You can fit it right onto your GPU through the reference shroud. But please double check and see if it's the same pump as all the other hybrid kits. From my knowledge it is.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> Personally, if you're going to buy a new hybrid kit then id just wait for one made for the 1080 Ti to come out.
> 
> Or buy one from this guy.
> 
> 
> https://www.reddit.com/r/5y9sh2/usamd_h_evga_980_hybrid_cooling_kit_w_paypal/
> 
> You can fit it right onto your GPU through the reference shroud. But please double check and see if it's the same pump as all the other hybrid kits. From my knowledge it is.


Appreciate all the help/info. I'm just super impatient.







And this isn't helping anything...



http://imgur.com/XNAPj







I'd imagine that all the third-party cards will be out by May or so, so perhaps I'll sell my card for a hybrid / step up through EVGA when they're out, or I'll go with a 1080 Ti cooler when that's out, whichever comes first.


----------



## KickAssCop

How is the sound on these FE suckers?


----------



## AMDATI

Quote:


> Originally Posted by *KickAssCop*
> 
> How is the sound on these FE suckers?


An air-whooshing sound, not like AMD's vacuum-cleaner blowers. My Noctua industrial fans make it sound quiet in comparison.


----------



## alucardis666

Quote:


> Originally Posted by *KickAssCop*
> 
> How is the sound on these FE suckers?


@40% it's barely audible, @50% you'll notice it if you have a pretty quiet system, @60% it's noticeable, @65% it's kinda loud, @70% it's very loud, 75%+ **** *** IS THAT?!?
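For anyone building a curve around that scale: a custom fan curve is just a temperature-to-duty mapping that tools like Afterburner interpolate between set points. A minimal sketch with made-up set points (not anyone's actual curve) that stays at or below the 60% "noticeable" mark until the card really heats up:

```python
# Set points as (temp °C, fan duty %). Purely illustrative values.
CURVE = [(30, 30), (50, 40), (65, 55), (75, 60), (84, 100)]

def fan_duty(temp_c: float) -> float:
    """Linearly interpolate fan duty % between the curve's set points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # past the last point, pin the fan at 100%
```

With a curve like this the fan only crosses the loud thresholds above ~75C, at which point you probably want the airflow more than the quiet.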


----------



## AMDATI

My fan doesn't need to go over 60% for max load at mid-70s temps. These things need proper intake air, and not just some fan blowing across the case.


----------



## KickAssCop

Thanks for replies.


----------



## AMDATI

This thing fits my case like a glove. It's as long as this side mesh.



2x90mm fans in the front feeding intake, one basically in front of the GPU. Then a nice big 140mm exhaust. Keeps pretty cool.


----------



## alucardis666

Quote:


> Originally Posted by *AMDATI*
> 
> This thing fits my case like a glove. It's as long as this side mesh.
> 
> 
> 
> 2x90mm fans in the front feeding intake, one basically in front of the GPU. Then a nice big 140mm exhaust. Keeps pretty cool.


Def gonna try to go for a smaller form factor when I overhaul my rig next time. Seems so efficient, and it'd be nice to see my rig atop my desk rather than next to it.









Also...

Decided to open my window *(it's 24 degrees F outside...)*, lowered my ambient room temp to ~20C, maxed the fan to 100%, and ran Heaven 3x to test stability. Seems this card is very aggressive with thermal throttling. Without the fan @100%, load temps were in the mid 70s and it wouldn't ask for more than 1.06V; now I'm able to feed it 1.08V and get a higher boost clock.









Really wanna throw some better cooling at this beast and see if 2100 or more is doable.


----------



## BroHamBone

Quote:


> Originally Posted by *alucardis666*
> 
> @40% it's barely audible, @50% you'll notice it if you have a pretty quiet system, @60% it's noticeable, @65% it's kinda loud, @70% it's very loud, 75%+ **** *** IS THAT?!?


http://forums.evga.com/My-1080-Ti-Hybrid-m2629639.aspx

I actually grabbed a 1070/1080 kit today to do this. I had enough EVGA Bucks, so I pretty much paid $1 plus ground shipping.


----------



## alucardis666

Quote:


> Originally Posted by *BroHamBone*
> 
> http://forums.evga.com/My-1080-Ti-Hybrid-m2629639.aspx
> 
> I actually grabbed a 1070/1080 kit today to do this. I had enough evga bucks, i pretty much paid $1 plus ground shipping


Looks great man, Really tempted to try this, did you change out the thermal pads or need any additional ones? Anything I should be aware of before attempting?

Thanks!


----------



## DennyCorsa86

My SLI cards will arrive today!

Sent from my HUAWEI VNS-L31 using Tapatalk


----------



## BroHamBone

Quote:


> Originally Posted by *alucardis666*
> 
> Looks great man, Really tempted to try this, did you change out the thermal pads or need any additional ones? Anything I should be aware of before attempting?
> 
> Thanks!


I can ask... hahaha. I'm going to be doing mine once it arrives. I will probably leave the same thermal pads. For the video posted earlier, there is a part 2 for installation of the AIO... I'm sure he uses the same thermal pads, since he said something along the lines of "I'm going to try not to touch the thermal pads with my fingers so dirt/oil doesn't get on them"... I think.

Here is the link,






edit:

The guy from the EVGA thread also linked me this, same dood doing a Titan XP



 , since I asked about the pump housing cap.


----------



## LucaZPF

My EVGA 1080 Ti is on its way! Will be delivered tomorrow I hope:thumb:


----------



## alucardis666

Quote:


> Originally Posted by *BroHamBone*
> 
> I can ask...hhahaha. Im going to be doing mine once it arrives. I will probably leave the same thermal pads. The video posted earlier, there is a part 2 for installation of the AIO...im sure he uses the same thermal pads since he said something along the lines of "im going to try not to touch the thermal pads w/ my fingers so dirt/oil doesnt get on them"....i think.
> 
> Here is the link,
> 
> 
> 
> 
> 
> 
> edit:
> 
> The guy from the evga thread also linked me this,same dood doing a titan xP
> 
> 
> 
> , since I asked about the pump housing cap.


My bad, I thought you were that guy on the EVGA forums lol. It's 3AM here... I gotta get to bed.









*EDIT:* Just bought a hybrid cooler off ebay for $60. We'll see how it goes.


----------



## KickAssCop

What tool is required to open up the FE cooler on the 1080 Ti? Pics and links would go a long way.


----------



## pez

Quote:


> Originally Posted by *alucardis666*
> 
> Here's the story at 4k.
> 
> *ULTRA
> *
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> **
> 
> 
> *VERY HIGH
> *
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> **
> 
> 
> *HIGH*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Well, my card hasn't died yet. So so far so good, at least. We'll see how it goes for the rest of the week.


Do you guys mind testing this with long distance shadows on low/off as well as with God rays on low/off? I have a feeling these may be taxing CPU more than the GPU as well as making the hit to performance pretty large. Usually shadows and lighting-based stuff is so over-the-top and hogs performance.


----------



## xTesla1856

Has anyone broken the 2100 wall yet?


----------



## fisher6

Quote:


> Originally Posted by *xTesla1856*
> 
> Has anyone broken the 2100 wall yet?


The highest mine has gone is 2085 but I haven't tried different combinations of memory and core OC yet. Had to install my new screen yesterday.


----------



## mechwarrior

Guys, would a universal GPU block fit, and could I still keep the original VRAM heatsinks?


----------



## D13mass

Quote:


> Originally Posted by *dseg*
> 
> Do you know if you need a Heatkiller backplate or if it uses the stock backplate?


No, I don't know, sorry, but I'm currently using my 980 Ti with an EK water block and no backplate. It's really not needed when the water block is that big.


----------



## Quaddamage1080

Has anyone on here put one of these in a mini itx build yet? How about a rvz02? Really wanting to put one of those in there. How are temperatures?


----------



## KickAssCop

Possible to put an EK 980 Ti Classified block on this FE pup?


----------



## Radox-0

Quote:


> Originally Posted by *KickAssCop*
> 
> Possible to put an EK 980 Ti Classified block on this FE pup?


Nope, I very much doubt it. The Classy uses a much wider PCB than the 1080 Ti FE, plus plenty of other differences.


----------



## keikei

Welp, the card comes in today! Hopefully I can play Dark Souls 3 maxed out with no frame drops!


----------



## Biggu

Quote:


> Originally Posted by *D13mass*
> 
> No, I don`t know, sorry, but I use now my 980ti + water block from EK without any backplate. It`s really not needed if water block has so big size.


I will know tomorrow for sure. I ordered Watercool HEATKILLER® IV for TITAN X (Pascal) - ACETAL Nickel-BL and Aquacomputer backplate for kryographics Pascal NVIDIA TITAN X, passive for mine


----------



## D13mass

Quote:


> Originally Posted by *Biggu*
> 
> I will know tomorrow for sure. I ordered Watercool HEATKILLER® IV for TITAN X (Pascal) - ACETAL Nickel-BL and Aquacomputer backplate for kryographics Pascal NVIDIA TITAN X, passive for mine


I would be grateful for your feedback, because I'm still without a card


----------



## keikei

For those with the free game codes, which game did you choose? Both being Ubisoft games, I'm still up in the air about a choice. They look decent enough.


----------



## zipeldiablo

I tried to raise the power limit to 120%, but it also raises the temp limit to 95 degrees in MSI Afterburner.
Any way to prevent that?
Also the vcore seems locked


----------



## pez

Quote:


> Originally Posted by *Quaddamage1080*
> 
> Has anyone on here put one of these in a mini itx build yet? How about a rvz02? Really wanting to put one of those in there. How are temperatures?


I have my TXP in an Ncase, though I have an intake fan directly underneath it. You will probably hit the thermal limit if you let it, and you will get a lower boost in the long run, but not by much. I capped my fan at 60% on the TXP and my boost will settle around 1911MHz, I believe. This is after 3 hours of OW and about 1 hour of BF1 yesterday. Ambient temps will play a bigger role in your situation, however.

That being said, the slight adjustments to the cooler and the rear bracket may make a tangible difference for you, but I wouldn't count on it.


----------



## SuprUsrStan

Quote:


> Originally Posted by *Belmire*
> 
> 1080Tis SLI checking in. Beast! Firestrike scores are watercooled. Game pics are on blower fans at 60%.
> 
> Firestrike score still holding on!!!
> http://www.3dmark.com/fs/11978782
> 
> Witcher 3 4k:
> 
> 
> Crysis 3 4k:
> 
> 
> Ryse 4k:


And people are claiming SLI is dead.









Nice.


----------



## xTesla1856

Quote:


> Originally Posted by *Syan48306*
> 
> And people are claiming SLI is dead.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nice.


It is, _kinda sorta_. I'm hoping the Volta Titan will be the first card to do 4K60+ consistently


----------



## SuprUsrStan

Quote:


> Originally Posted by *AMDATI*
> 
> FE cooling FTW


Skimming through this thread, it's hard to tell who's under water and who's not

Do we have any overclocking results under custom water?

EDIT: I was mistaking the 4790K temps for the 1080 TI.


----------



## SuprUsrStan

Quote:


> Originally Posted by *xTesla1856*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Syan48306*
> 
> And people are claiming SLI is dead.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nice.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It is _kinda sorta_ I'm hoping the Volta Titan will be the first card to do 4K60+ consistently
Click to expand...

144hz 4K panels are right on the horizon. We'll have a reason to push frames past 4K60 now.


----------



## stocksux

Quote:


> Originally Posted by *alucardis666*
> 
> Anyone run the bench within Wildlands? How're your frames?
> 
> @4k Ultra this still isn't playable with 1 card... :-(
> 
> @1440p however, that's a different story.


I can't comment on 4K as I don't have a 4K monitor, but I'm playing at 1440p with everything on ultra. I'm on water and overclocked a good bit. I'm seeing right at 70fps on the benchmark test in Wildlands with those settings.


----------



## dboythagr8

Quote:


> Originally Posted by *pompss*
> 
> My idle temperature are 54c and its pretty high should be 30c max 35c
> 
> Anyone get the same temps?
> 
> I will try to check and change the thermal paste see if gets better


Way too high. I'm idling at 34C with my custom fan curve turned off.

---

Anybody prefer Precision XOC to Afterburner? Now that I'm back on EVGA, I'm curious whether it's worth going back or staying with Afterburner.


----------



## stocksux

Quote:


> Originally Posted by *Syan48306*
> 
> Skimming through this thread, it's hard to tell who's under water and who's not
> 
> Do we have any overclocking results under custom water?
> 
> EDIT: I was mistaking the 4790K temps for the 1080 TI.


I'm under water. Dual loops. The 1080 Ti gets its own loop with a Hardware Labs GTX Nemesis 560mm radiator with 8 Be Quiet! 140mm fans in push/pull @ 700rpm, an EK pump/res combo, and an EK Titan block/backplate. Pump running @ 1200rpm. At load I see temps hit about 37C. If I raise fan speeds to full (1600rpm) I keep temps more towards the low 30s.


----------



## SuprUsrStan

Quote:


> Originally Posted by *stocksux*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Syan48306*
> 
> Skimming through this thread, it's hard to tell who's under water and who's not
> 
> Do we have any overclocking results under custom water?
> 
> EDIT: I was mistaking the 4790K temps for the 1080 TI.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm underwater. Dual loops. 1080ti gets its own loop with a Hardware Labds GTX Nemesis 560mm radiator with 8 Be Quiet! 140mm fans in push/pull @ 700rpm. An EK pump/res combo and EK Titan block/backplate. Pump running @ 1200rpm. At load I see temps hit about 37c. If I raise fan speeds to full (1600rpm) I keep temps more towards the low 30s.
Click to expand...

So what's your max clocks?


----------



## stocksux

Quote:


> Originally Posted by *Syan48306*
> 
> 144hz 4K panels are right on the horizon. We'll have a reason to push frames past 4K60 now.


And they can't get here fast enough! I picked up a second block/backplate for SLI 1080 Tis, but Micro Center wouldn't sell me two on launch day. Now my brain has got in the way and said hold off and wait for the next gen of cards. Buuuut I'm ready for that Asus PG27UQ! By the time it comes out though (Q3 I believe), it seems like Volta will probably be right around the corner.


----------



## MunneY

Quote:


> Originally Posted by *xTesla1856*
> 
> Has anyone broken the 2100 wall yet?


Yeah... K|ngP|n has, at 2500MHz LOL

It seems the wall is pretty much 2050 stable. That's disappointing because my TXP did 2100


----------



## pez

Quote:


> Originally Posted by *MunneY*
> 
> Yeah.. K|ngP|n has at 2500mhz LOL
> 
> It seems the wall is pretty much 2050 stable. Thats disappointing because my TXP did 2100


50MHz for $500 more. I'd say you're doing ok.


----------



## PasK1234Xw

Quote:


> Originally Posted by *Syan48306*
> 
> And people are claiming SLI is dead.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nice.


WOW SLI working in older games.


----------



## MunneY

Quote:


> Originally Posted by *pez*
> 
> 50MHz for $500 more. I'd say you're doing ok.


I'm not mad, I just hoped they would actually clock higher considering they have slightly less going on. It's also too bad that we won't be getting BIOS mods, more than likely. I should have hung out and waited for the K|ngP|n cards or Classy/Lightnings, but I didn't want to take a chance with waterblocks.


----------



## pez

Quote:


> Originally Posted by *MunneY*
> 
> I'm not mad, just hoped they would actually clock higher considering they had slightly LESS going on. Its also too bad that we wont be getting bios mods more than likely. I should have hung out and waited for the K|ngP|n cards or Classy/Lightnings, but I didn't want to take a chance with waterblocks.


Oh I was just making light of the situation







.


----------



## SuprUsrStan

Quote:


> Originally Posted by *MunneY*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pez*
> 
> 50MHz for $500 more. I'd say you're doing ok.
> 
> 
> 
> I'm not mad, just hoped they would actually clock higher considering they had slightly LESS going on. Its also too bad that we wont be getting bios mods more than likely. I should have hung out and waited for the K|ngP|n cards or Classy/Lightnings, but I didn't want to take a chance with waterblocks.
Click to expand...

Look at it this way: sure, the max overclocks aren't there, but it's guaranteed not to downclock, while most 1080 Tis will throttle if not on water.


----------



## KedarWolf

Quote:


> Originally Posted by *zipeldiablo*
> 
> I tried to update the power limite to 120% but it increases the temp limit to 95 degrees on msi afterburner.
> Any way to prevent that?
> Also the vcore seems look


See my earlier posts on the .oem2 file to unlock the voltage slider. You only want to do that under water though, and you want the reference version unless you are using an EVGA, in which case you want the .zip file for that version. I posted attachments for both.


----------



## richiec77

Who's going to be the 1st one to power unlock via Voltage Shunt tricks?


----------



## KedarWolf

Quote:


> Originally Posted by *Syan48306*
> 
> Quote:
> 
> 
> 
> Originally Posted by *AMDATI*
> 
> FE cooling FTW
> 
> 
> 
> 
> 
> Skimming through this thread, it's hard to tell who's under water and who's not
> 
> Do we have any overclocking results under custom water?
> 
> EDIT: I was mistaking the 4790K temps for the 1080 TI.
Click to expand...

My EK waterblock and backplate should be here by next Wednesday, will post results then.


----------



## PasK1234Xw

Quote:


> Originally Posted by *richiec77*
> 
> Who's going to be the 1st one to power unlock via Voltage Shunt tricks?


People are crazy, literally voiding their cards just to get slightly higher benchmark scores; the shunt mod does very little for actual frame rates.


----------



## zipeldiablo

Strix review :
https://www.techpowerup.com/reviews/ASUS/GTX_1080_Ti_Strix_OC/

NOT WORTH IT


----------



## zipeldiablo

Quote:


> Originally Posted by *KedarWolf*
> 
> See my earlier posts on the .oem2 file to unlock the voltage slider. You only want to do that under water though, and you want the reference version unless you are using an EVGA, then you want the .zip file for that version, I posted attachments to both.


Thanks mate


----------



## MunneY

Quote:


> Originally Posted by *zipeldiablo*
> 
> Strix review :
> https://www.techpowerup.com/reviews/ASUS/GTX_1080_Ti_Strix_OC/
> 
> NOT WORTH IT


LOL at the AIB card being slower than the reference. This just proves that unless someone gets us voltage and power limit controls, we are stuck at 2050MHz!


----------



## PasK1234Xw

Ti is not stuck at 2050


----------



## vmanuelgm

Quote:


> Originally Posted by *MunneY*
> 
> LOL at the AIB being slower than the reference. This just proves that unless someone gets us voltage and power limit controls, we are stuck at 2050mhz!


With water and a shunt mod, the FEs should be around 2100-2150, more or less.

FE cards could be getting the best chips (binning)???

Just saw the TXP review on TechPowerUp; they got the same 2063MHz max when they reviewed it...


----------



## PasK1234Xw

Quote:


> Originally Posted by *vmanuelgm*
> 
> With water and shunt, the FE's should be around 2100-2150 more or less.
> 
> FE cards could bring the best chips (binning)???


Honestly, I think if you want the best-OCing FE, go for the Nvidia-branded ones, as they don't release pre-OC'd cards and bin for those.

Places like EVGA take the better FE PCBs and use them for their SC models until they start producing their own reference PCB; then they make their own FE and Nvidia is completely out of the loop with them.

I had 6 EVGA FEs and none ever went over 2050, even on water. I went and grabbed an Nvidia-branded card from Best Buy and it did over 2100 no problem on stock voltage and air.

I also watched a bunch of videos on YouTube and read the comments, and that seemed to be the case with others too. All the while, 2100 seemed to be no problem on cards from Nvidia.

Heard a rumor that Nvidia takes chips from the center of the wafer and uses them on their own cards. I guess those yield the best-quality chips.

This time around I went with Nvidia, but my card is delayed due to the snow storm so I can't test yet.


----------



## Bull56

Hey guys, is there any way to mod the BIOS yet?
I would like to have more Power Limit without hardware modding...


----------



## vmanuelgm

Quote:


> Originally Posted by *Bull56*
> 
> Hey guys, is there any way to mod the BIOS yet?
> I would like to have more Power Limit without hardware modding...


Nope, not until the custom cards arrive, which could bring some powerful special BIOSes.

I flashed custom BIOSes on my previous 1080 FE.


----------



## Slackaveli

Quote:


> Originally Posted by *stocksux*
> 
> And they can't get here fast enough! I picked up a second block/backplate for sli 1080ti but Micro Center wouldn't sell me two on launch day ?. Now my brain has got in the way and said hold off and wait for next gen of card. Buuuut I'm ready for that Asus PG27UQ! By the time it comes out though (Q3 I believe) seems like Volta will probably be right around the corner.


Probably true, that. SLI 1080 Tis would still give a great experience on a 4K/144 HDR panel, but SLI 2080s will be at least +20% (maybe more) per card, so 40-50% muh frames. Then 2080 Ti SLI should come close to filling out that monitor. We are living in great times, men!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *keikei*
> 
> For those with the free game codes, which game did you choose? Both being ubisoft games, im still up in the air about a choice. They look decent enough.


I got For Honor and I'm starting to regret it. The servers have been down or ****ty since I got it. And the gameplay is pretty boring to be honest, Dark Souls 3 is way more exciting.

Quote:


> Originally Posted by *KickAssCop*
> 
> What tool is required to open up the FE cooler on 1080 Ti. Pics and links would go along way.


Depending on how far you're going to take it apart, get this baby. It's good to have for computers anyway, and cheap.

http://www.homedepot.com/p/Husky-8-in-1-Precision-Slotted-and-Philips-Screwdriver-Set-71281H/204664388

And you'll need metric allen keys, I think 1.5mm or 2mm was needed for all the socket screws.


----------



## dboythagr8

Quote:


> Originally Posted by *zipeldiablo*
> 
> Strix review :
> https://www.techpowerup.com/reviews/ASUS/GTX_1080_Ti_Strix_OC/
> 
> NOT WORTH IT


My biggest takeaway from this is how fast a single 1080Ti is at 1440p (my resolution), and I have another card coming tomorrow. I really need the PG27UQ to get some kind of updated release information. I wanted to move to the PG348Q, but for that kind of money, I can be nearly there with the PG27UQ.

At least when I inevitably have to disable 1 card for SLI issues, I know I'll still have an incredibly fast single card to play on.
Quote:


> Originally Posted by *Slackaveli*
> 
> probably true , that. sli 1080ti would still give a great experience on a 4k/144 hdr, but, sli 2080 will be at least +20% (maybe more) per card so 40-50% muh frames. then 2080ti sli should come close to filling out that monitor. We are living in great times, men!


And then the 3080ti and so on. It's a never ending cycle.


----------



## MunneY

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Ti is not stuck at 2050


Show me 1 card hitting 2100mhz... I'll wait
Quote:


> Originally Posted by *vmanuelgm*
> 
> With water and shunt, the FE's should be around 2100-2150 more or less.
> 
> FE cards could bring the best chips (binning)???
> 
> Just seen the TXP review in Techpowerup, and got the same 2063 MHz max when they reviewed it...


Yup. My Nvidia TXP did 2100MHz with pretty much no problems. They would do 2081 in SLI when I loaned it to a buddy.

I think pretty much every chip will do 2000MHz, and great ones will nudge under 2100 (mine boosts to 2083 but won't stay there)


----------



## PasK1234Xw

Quote:


> Originally Posted by *MunneY*
> 
> Show me 1 card hitting 2100mhz... I'll wait


Wipe that smug look off your face

http://www.tomshardware.com/reviews/geforce-gtx-1080-ti-water-cooling,4975-2.html

edit

also going from 2080 to 2100 will yield maybe 2 fps if you're lucky
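To put rough numbers on that: assuming fps scales at best linearly with core clock (in practice it scales worse, since memory clocks don't move), a quick napkin-math sketch:

```python
# Rough estimate of fps gained from a core clock bump, assuming
# performance scales linearly with clock -- a best-case assumption,
# since real games are partly memory- and CPU-bound.
def fps_gain(fps_before, clock_before_mhz, clock_after_mhz):
    return fps_before * (clock_after_mhz / clock_before_mhz - 1)

# e.g. 100 fps at 2080 MHz, pushed to 2100 MHz:
print(round(fps_gain(100, 2080, 2100), 2))  # 0.96 -- about 1 fps
```

So even at 100 fps, a 2080-to-2100 bump is worth roughly one frame at best.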


----------



## vmanuelgm

Quote:


> Originally Posted by *MunneY*
> 
> Show me 1 card hitting 2100mhz... I'll wait
> Yup. My Nvidia TXP did 2100mhz pretty much no problems. They would do 2081 in SLI when I loaned it to a buddy.
> 
> I think pretty much every chip will do 2000mhz, and great ones will nudge under 2100 (mine boost to 2083 but wont stay there)


Yep, that is the average, more or less.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Wipe that smug look off your face
> 
> http://www.tomshardware.com/reviews/geforce-gtx-1080-ti-water-cooling,4975-2.html
> 
> edit
> 
> also going from 2080 to 2100 will yield maybe 2 fps if you're lucky


Lol, he just had to wait 2 minutes to get owned.


----------



## BigMack70

OC performance seems a bit lackluster... I was comparing my card to some of the benches posted in this thread and even though my max average boost with +130 offset on water is just ~1950 MHz, I'm only seeing scores about 2% lower than most of what's been posted here from users with ~2050 MHz boost clocks.

Overclocking seems uninteresting on Pascal... just keep temps in check, and the card gets much closer to its maximum performance on its own than I've ever seen from another architecture. Disappointing, to be honest. But I guess that's Nvidia's end goal for its GPU Boost technology.


----------



## pez

Quote:


> Originally Posted by *zipeldiablo*
> 
> Strix review :
> https://www.techpowerup.com/reviews/ASUS/GTX_1080_Ti_Strix_OC/
> 
> NOT WORTH IT


If you're not going water, the noise and temperature benefit alone would make that an easy $40 for me to give up. Chances are the 39dB rating for the FE cooler is based on the stock fan curve (read: a max of 50% fan speed).


----------



## MunneY

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Wipe that smug look off your face
> 
> http://www.tomshardware.com/reviews/geforce-gtx-1080-ti-water-cooling,4975-2.html
> 
> edit
> 
> also going from 2080 to 2100 will yield maybe 2 fps if you're lucky


Watercooled, chief... I'm talking about air cards.

Quote:


> Originally Posted by *SlimJ87D*
> 
> Lol, he just had to wait 2 minutes to get owned.


If you look at this here graph, you'll see it was bumping 2100, not maintaining it, either.


----------



## SuprUsrStan

Generally, what is the turnaround time from order to shipping from the Nvidia store? Same day if ordered early in the morning?


----------



## BigMack70

Quote:


> Originally Posted by *MunneY*
> 
> Show me 1 card *hitting* 2100mhz


...
Quote:


> Originally Posted by *MunneY*
> 
> it was *bumping* 2100


Soooooooooooo.... to avoid admitting you were wrong, you are going to enter into a semantics debate between "_hitting_ 2100 MHz" and "_bumping_ 2100 MHz".... ohhhhhhhhhh dis gonna be good......


----------



## Maintenance Bot

Quote:


> Originally Posted by *Syan48306*
> 
> Generally what is the turn around time from order to shipping from the nvidia store? Same day if ordered early in the morning?


Yes, usually get tracking info in the evening.


----------



## MunneY

Quote:


> Originally Posted by *BigMack70*
> 
> ...
> Soooooooooooo.... to avoid admitting you were wrong, you are going to enter into a semantics debate between "_hitting_ 2100 MHz" and "_bumping_ 2100 MHz".... ohhhhhhhhhh dis gonna be good......


You are right, it's semantics, but when I say a card is running 2100mhz, I mean staying there...

My card now does 2083mhz, but I consider it running 2050 because that is where it stays.


----------



## sew333

Can somebody tell me if there's any sense in buying a reference GTX 1080 Ti? Any happy owners here?
I can ignore the noise, but how about high temps and clock throttling?

I already sold my GTX 1080 Xtreme.
My monitor is an iiyama Black Hawk, 1080p 60Hz 1ms.
I know it's weak, but in the future I'll buy a 1440p 144Hz monitor.


----------



## BigMack70

Quote:


> Originally Posted by *sew333*
> 
> Can somebody tell me is any sense to buy Gtx 1080 ti reference? Any happy owners here?
> I will ignore noise. But how about high temps and clock throttling?
> 
> I already sold my Gtx 1080 Xtreme.
> My monitor is Yijama Black Hawk 1080p 60hz 1 MS.
> I know weak,but in future i buy 1440p 144hz monitor.


If you are serious about being able to ignore noise then the reference card is OK. It's just going to be really loud because you need 70%+ fan speed to avoid serious temperature throttling. I would just wait for the STRIX or another custom card unless you are either putting the card under water or into an ITX rig.


----------



## Luda

The Heatkiller backplate does have thermal pads for 3 components on the board. I doubt this would make or break anything, but it looks good, and I'm a huge fan of the sleek look, so I grabbed one for my Heatkiller TXP block. Great block: 44C with the card deciding to self-boost to 1900mhz, it seems.










Quote:


> Originally Posted by *Biggu*
> 
> I will know tomorrow for sure. I ordered Watercool HEATKILLER® IV for TITAN X (Pascal) - ACETAL Nickel-BL and Aquacomputer backplate for kryographics Pascal NVIDIA TITAN X, passive for mine


----------



## yukkerz

My head is going to pop!!! I just want my card!!!


I also ordered an EVGA hybrid cooler. So simple: no need to strip the whole card. Just the cover comes off and the EVGA block bolts right up.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *MunneY*
> 
> You are right its semantics, but when I say a card running 2100mhz, i mean staying there...
> 
> my card now does 2083mhz but i consider it running 2050 because that is where it stays.


I get what you're saying. Sometimes I wonder what core speeds people are reporting here: what they hit in the first 15 seconds of benchmarking, or what they actually sustain.


----------



## MunneY

Quote:


> Originally Posted by *Luda*
> 
> The heatkiller backplate does have thermal pads for 3 components on the board. I doubt this would make or break anything but it looks good and im a huge fan of the sleek look so i grabbed one for my heatkiller TXP block. Great block, 44C with the card deciding to self boost to 1900mhz it seems.


Dem air bubbles might as well be my personal nemesis... I hate refilling loops :-D

Quote:


> Originally Posted by *SlimJ87D*
> 
> I get what you're saying. Sometimes I wonder what core speeds people are reporting here. What they hit in the first 15 seconds of benchmarking or what they hit and stay at.


Yeah. I get it's a wiener-measuring competition to some, but I try to be as honest about my clocks as I can. I had screenshots of my TXP doing nearly 2200mhz, quickly followed by a driver crash :-D


----------



## chibi

Quote:


> Originally Posted by *Biggu*
> 
> I will know tomorrow for sure. I ordered Watercool HEATKILLER® IV for TITAN X (Pascal) - ACETAL Nickel-BL and Aquacomputer backplate for kryographics Pascal NVIDIA TITAN X, passive for mine
> 
> 
> Spoiler: Warning: Spoiler!


Do you know if the Aquacomputer Active XCS Backplate can be used with the Heatkiller block?
Quote:


> Originally Posted by *Luda*
> 
> The heatkiller backplate does have thermal pads for 3 components on the board. I doubt this would make or break anything but it looks good and im a huge fan of the sleek look so i grabbed one for my heatkiller TXP block. Great block, 44C with the card deciding to self boost to 1900mhz it seems.
> 
> 
> Spoiler: Warning: Spoiler!


Hmm, 44C seems a bit high... perhaps those air bubbles in your loop can be bled out further.


----------



## Luda

Quote:


> Originally Posted by *MunneY*
> 
> Dem air bubbles might as well be my personal nemesis... I hate refilling loops :-D


Yea, I tried to fill it with as little air as possible, but we all know how that goes. I didn't bleed **** out of the system; I plugged it up and fired up Wildlands. There is a TON of air in the system, but it's not like it's overheating or anything. This weekend I'll do the bubble-killer dance and try to move the card to 2050mhz. It did show 1895mhz while I was playing Wildlands last night and keeping an eye on it, though, so I might just leave it alone and enjoy it for once.

Quote:


> Originally Posted by *chibi*
> 
> Do you know if the Aquacomputer Active XCS Backplate can be used with the Heatkiller block?
> Hmm, 44C seems a bit high... perhaps those air bubbles in your loop can be bleed further.


If you are referencing THIS one, then yes, it should match up. I can 100% confirm their Titan block fits like a glove on the reference 1080 Ti.

And as I said in my response above, yea, those bubbles need to go. My top 970 was running 38C peak, so my estimate is 40-42C should be achievable worst case.


----------



## KingEngineRevUp

Questions regarding radiator fans.

I have Noctua NF-F12 fans, but I feel like they're not pushing enough hot air through my H55. If I got the 2000 or 3000 RPM versions (I know they're loud at 3000, but I'd control them), would it make a major difference?

https://www.amazon.com/gp/offer-listing/B00KFCRIQM/ref=sr_1_1_olp?s=pc&ie=UTF8&qid=1489593320&sr=1-1&keywords=noctua+nf-f12+2000+pwm


----------



## Luda

Quote:


> Originally Posted by *SlimJ87D*
> 
> Questions regarding radiator fans.
> 
> I have Noctua NF-F12 fans, but I feel like they're not pushing enough hot air through my H55. If I got the 2000 or 3000 RPM fans ( I know they're loud at 3000 but I'd control them), would it make a major difference?
> 
> https://www.amazon.com/gp/offer-listing/B00KFCRIQM/ref=sr_1_1_olp?s=pc&ie=UTF8&qid=1489593320&sr=1-1&keywords=noctua+nf-f12+2000+pwm


I run the NF-F12 iPPC 2000rpm on my thin 360 and the 3000rpm on my 55mm-thick front radiator, and they get the job done. I doubt changing fans from those would make much of a difference without getting into the realm of insane fans.


----------



## sew333

There's a review of the Strix 1080 Ti up:

https://www.techpowerup.com/reviews/ASUS/GTX_1080_Ti_Strix_OC/34.html

Why is there only a 1-2 fps difference between the reference 1080 Ti and the Strix 1080 Ti? Then it's not worth paying more for the Strix. Thanks!


----------



## chibi

Quote:


> Originally Posted by *Luda*
> 
> Yea i tried to do it will as little air as possible, but we all know how that goes. I didnt bleed **** out of the system, i plugged it up and fired up wildlands. There is a TON of air in the system but its not like its overheating or anything, this weekend ill do the bubble killer dance and try add move the card to 2050mhz. IT did show 1895mhz when i was playing wildlands last night and keeping an eye on this though so i might just leave it alone and enjoy for once.
> If you are referencing THIS one then yes it should match up. I can 100% confirm their titan block fits like a glove on the reference 1080ti.
> 
> And as i said in my response ^ yea those bubbles need to go. My top 970 was running 38C peak so my estimate is 40-42c should be achievable worst case.


Dance on brother, I know I will be doing that once the loop is complete.









Regarding the Heatkiller block, so you're saying technically I can mix and match the following on my EVGA 1080 Ti and it will still work?

Link -> Watercool HEATKILLER® IV for TITAN X (Pascal)
Link -> Aquacomputer Backplate for Kryographics Pascal NVIDIA TITAN X, Active XCS


----------



## sew333

Hey. Please, no answers about changing my monitor, OK? Just answer the question.

How much fps will I gain going from a GTX 1080 Xtreme to a reference GTX 1080 Ti at 1080p?

CPU: 6700K at stock.


----------



## Pandora's Box

Quote:


> Originally Posted by *Quaddamage1080*
> 
> Has anyone on here put one of these in a mini itx build yet? How about a rvz02? Really wanting to put one of those in there. How are temperatures?


I have mine installed in a RVZ-02.

http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-thread/550_50#post_25914883


----------



## Baasha

Two of these cards are *NOT* enough to run 4K @ 144Hz.

Proof:






On that note, even three of these cards are not enough to run 4K @ 144Hz since scaling at 4K is not that great (!).

Proof:


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Luda*
> 
> I run the NF-F12 iPPC 2000rpm on my thin 360 and the 3000rpm on my 55mm-thick front radiator, and they get the job done. I doubt changing fans from those would make much of a difference without getting into the realm of insane fans.


I have the 1500 RPM ones; I wonder if that extra 500 RPM would help.


----------



## sew333

Does the reference 1080 Ti throttle too much at stock due to high temps? How much fps does that cost?


----------



## Baasha

Talking about cooling - I have these cards (4x of them) installed in a Corsair Carbide Air 540 case.

What are the absolute best *intake* fans to use for airflow? These 4 cards are throwing a LOT of heat (of course, they get pushed out the back of the case which is nice) and so my Kraken X41 which has only one 140mm fan as a 'pull' config (on top) is struggling to keep my CPU cool - I was seeing 90C on the CPU last night! o_0

I was thinking of picking up the Noctua NF-A14 iPPC 3000 PWM fans for the Kraken X41 but I need fans for intake in the front of the case - right now I'm just using 2x Corsair LED 120mm fans - they don't push/pull a lot of air.

I was also considering ditching the Kraken X41 for the Kraken X62 - anyone have experience with that?


----------



## AMDATI

Here's my custom fan profile, seems to work well for stock.



My reasoning is this:

Stock minimum fan speed was 40%, but that's higher than necessary.

Since the GPU will get above 70c under load anyway, there's no point in more steps in the curve at lower temperatures. The 60c starting point for 70% fan speed gives the fan a little buffer room to push some air in before the inevitable 70c+ temps. For me, this is where the curve stops, since I max out at 76c.
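To put rough numbers on a curve like that, here's a minimal piecewise-linear sketch; the breakpoints are illustrative only (low speed at idle, ramping to 70% by 60C, flat from there), not a tuned recommendation:

```python
# Piecewise-linear fan curve: temperature (C) -> fan speed (%).
# Breakpoints are illustrative, loosely following the post above.
CURVE = [(40, 30), (60, 70), (85, 70)]

def fan_speed(temp_c):
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # linear interpolation between the two surrounding breakpoints
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # clamp above the last breakpoint

print(fan_speed(50))  # halfway up the ramp -> 50.0
```

The flat segment past 60C is the whole point of the reasoning above: once the card is at load temps, the curve has already done its ramping.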


----------



## alucardis666

Quote:


> Originally Posted by *pez*
> 
> Do you guys mind testing this with long distance shadows on low/off as well as with God rays on low/off? I have a feeling these may be taxing CPU more than the GPU as well as making the hit to performance pretty large. Usually shadows and lighting-based stuff is so over-the-top and hogs performance.


Tried it, no major changes.







1-2 frames higher at minimum. 1440p is still the best way to play


----------



## lilchronic

Quote:


> Originally Posted by *Baasha*
> 
> Talking about cooling - I have these cards (4x of them) installed in a Corsair Carbide Air 540 case.
> 
> What are the absolute best *intake* fans to use for airflow? These 4 cards are throwing a LOT of heat (of course, they get pushed out the back of the case which is nice) and so my Kraken X41 which has only one 140mm fan as a 'pull' config (on top) is struggling to keep my CPU cool - I was seeing 90C on the CPU last night! o_0
> 
> I was thinking of picking up the Noctua NF-A14 iPPC 3000 PWM fans for the Kraken X41 but I need fans for intake in the front of the case - right now I'm just using 2x Corsair LED 120mm fans - they don't push/pull a lot of air.
> 
> I was also considering ditching the Kraken X41 for the Kraken X62 - anyone have experience with that?


The hotter these cards run the more performance you lose.


----------



## Luda

Quote:


> Originally Posted by *Baasha*
> 
> Talking about cooling - I have these cards (4x of them) installed in a Corsair Carbide Air 540 case.
> 
> What are the absolute best *intake* fans to use for airflow? These 4 cards are throwing a LOT of heat (of course, they get pushed out the back of the case which is nice) and so my Kraken X41 which has only one 140mm fan as a 'pull' config (on top) is struggling to keep my CPU cool - I was seeing 90C on the CPU last night! o_0
> 
> I was thinking of picking up the Noctua NF-A14 iPPC 3000 PWM fans for the Kraken X41 but I need fans for intake in the front of the case - right now I'm just using 2x Corsair LED 120mm fans - they don't push/pull a lot of air.
> 
> I was also considering ditching the Kraken X41 for the Kraken X62 - anyone have experience with that?


Save the money on upgrading fans and put it towards the X62. The additional radiator space will net you MUCH lower temps; a single 140 on that proc is WAY under-spec'd.

1) Pics of the 4x cards installed?
2) How much do you care about noise?


----------



## Pandora's Box

Quote:


> Originally Posted by *AMDATI*
> 
> Here's my custom fan profile, seems to work well for stock.
> 
> 
> 
> My reasoning is this:
> 
> Stock min fan speed was 40%, but that's higher than necessary.
> 
> Since the GPU will get above 60c at load anyways, there's no point for more steps in the curve at lower temperatures.


You must wear headphones lol. I have my curve set like this:


----------



## AMDATI

These cards don't just need good case airflow; they do best with direct airflow. If it's possible to put a fan facing right at the intakes from an inch or two away, it will really boost the cooling because you'll get focused CFM flow.

I think the best-case scenario is a custom fan shroud that connects an intake fan directly to the front of the card's heatsink.


----------



## Pandora's Box

Quote:


> Originally Posted by *AMDATI*
> 
> these cards don't just need good case airflow, they do best with direct airflow. If it's possible to put a fan facing right at the intakes from an inch or two away, it will really boost the cooling.
> 
> I think best case scenario is a custom fan shroud that connects an intake fan directly to the front of the cards heatsink.


You do know that the front of the card doesn't actually do anything, right? Those fins and that opening are just aesthetics; they are blocked by the fan shroud.


----------



## AMDATI

Quote:


> Originally Posted by *Pandora's Box*
> 
> You must wear headphones lol. I have my curve set like this:


With that fan curve I'd say you're throttling though.


----------



## keikei

Quote:


> Originally Posted by *sew333*
> 
> Hey. Please stop and no answer about change monitor ok? Just answer to question.
> 
> How much fps i will gain from changing from GTX 1080 Xtreme to Gtx 1080 Ti ref? ON res 1080p
> 
> cpu 6700k stock


Scroll down to 1080p benches: http://www.eurogamer.net/articles/digitalfoundry-2017-nvidia-geforce-gtx-1080-ti-review


----------



## Pandora's Box

Quote:


> Originally Posted by *AMDATI*
> 
> With that fan curve I'd say you're throttling though.


I hit a max of 82C after 30 minutes of Heaven. 1950Mhz core clock is the lowest I see.


----------



## Gunslinger.

Quote:


> Originally Posted by *sew333*
> 
> Hey. Please stop and no answer about change monitor ok? Just answer to question.
> 
> How much fps i will gain from changing from GTX 1080 Xtreme to Gtx 1080 Ti ref? ON res 1080p
> 
> cpu 6700k stock




You'd probably get a bigger boost by pushing your CPU OC up to 4.8GHz


----------



## Luda

Quote:


> Originally Posted by *chibi*
> 
> Dance on brother, I know I will be doing that once the loop is complete.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Regarding the Heatkiller block, so you're saying technically I can mix and match the following on my EVGA 1080 Ti and it will still work?
> 
> Link -> Watercool HEATKILLER® IV for TITAN X (Pascal)
> Link -> Aquacomputer Backplate for Kryographics Pascal NVIDIA TITAN X, Active XCS


I can't confirm 100%, but both will bolt up; I've seen them both run on 1080 Tis. The issue I foresee is that the G1/4 connectors could potentially be in each other's way. That being said, I don't think the hassle of running liquid through the backplate would be worth it after seeing what's there and how cool it stays; it's barely warm to the touch on the passive Heatkiller backplate.

Quote:


> Originally Posted by *SlimJ87D*
> 
> I have the 1500, I wonder if that extra 500 RPM help.


It won't; you need more radiator space.


----------



## stocksux

Quote:


> Originally Posted by *sew333*
> 
> Hey. Please stop and no answer about change monitor ok? Just answer to question.
> 
> How much fps i will gain from changing from GTX 1080 Xtreme to Gtx 1080 Ti ref? ON res 1080p
> 
> cpu 6700k stock


25%-35%


----------



## dboythagr8

Quote:


> Originally Posted by *Baasha*
> 
> Talking about cooling - I have these cards (4x of them) installed in a Corsair Carbide Air 540 case.
> 
> What are the absolute best *intake* fans to use for airflow? These 4 cards are throwing a LOT of heat (of course, they get pushed out the back of the case which is nice) and so my Kraken X41 which has only one 140mm fan as a 'pull' config (on top) is struggling to keep my CPU cool - I was seeing 90C on the CPU last night! o_0
> 
> I was thinking of picking up the Noctua NF-A14 iPPC 3000 PWM fans for the Kraken X41 but I need fans for intake in the front of the case - right now I'm just using 2x Corsair LED 120mm fans - they don't push/pull a lot of air.
> 
> I was also considering ditching the Kraken X41 for the Kraken X62 - anyone have experience with that?


Dude. You only have a single 140mm as cooling for your CPU with that setup? You gotta address that.

I have the same case. I've had up to 3 cards. I'm currently running a total of 5 Corsair ML Pro 140s, and I have a Kraken x62. 2x 140mm up front, 2x on Kraken x62, and 1x 140mm exhaust.










I know I only have 1 at the moment, but my CPU doesn't break 65c.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Luda*
> 
> I cant confirm 100% yes, but both will bolt up, ive seen them both run on 1080ti's. The issue i foresee is the g1/4 connectors could potentially be in the way on each other. That being said i don't think the hassle to run liquid through the back plate would be worth it after see whats there and how cool it stays, its barely warm to the touch on the passive heatkiller back plate.
> It wont, you need more radiator space.


My H55 is also acting as an exhaust, so I was thinking I also need to exhaust the heat coming off the mobo.

I'll try putting it as an intake and see how much of a difference that makes.


----------



## Gunslinger.




----------



## chibi

Quote:


> Originally Posted by *Luda*
> 
> I cant confirm 100% yes, but both will bolt up, ive seen them both run on 1080ti's. The issue i foresee is the g1/4 connectors could potentially be in the way on each other. That being said i don't think the hassle to run liquid through the back plate would be worth it after see whats there and how cool it stays, its barely warm to the touch on the passive heatkiller back plate.
> It wont, you need more radiator space.


I'll sit this one out then, thanks for your input. In other news, Newegg just delivered my EVGA 1080 Ti! I ordered it within the first 5 minutes on Friday and it didn't ship out until Monday of this week.


----------



## ruggercb

I swapped out 970s SLI (+175/+300 mem) for 1080ti w/ txp ek block and my firestrike score went from 18500 to 18900. Yay. CPU is at 3.9 ghz. X58 might be holding me back.


----------



## SuprUsrStan

Quote:


> Originally Posted by *Baasha*
> 
> Two of these cards is *NOT* enough to run 4K @ 144Hz.
> 
> Proof:
> 
> 
> 
> 
> 
> 
> On that note, even three of these cards are not enough to run 4K @ 144Hz since scaling at 4K is not that great (!).
> 
> Proof:


Fluctuating between 120 and 144Hz at 4K on a G-Sync monitor is plenty fine.


----------



## Baasha

Quote:


> Originally Posted by *dboythagr8*
> 
> Dude. You only have a single 140mm as cooling for your CPU with that setup? You gotta address that.
> 
> I have the same case. I've had up to 3 cards. I'm currently running a total of 5 Corsair ML Pro 140s, and I have a Kraken x62. 2x 140mm up front, 2x on Kraken x62, and 1x 140mm exhaust.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I know I only have 1 at the moment, but my CPU doesn't break 65c.


Very nice!

Can you please post some pics of the X62 showing the radiator and fan setup? It looks like you installed it in the 'front' of the case instead of on the top(?). If you install it in the front, is there enough room for a push/pull config, or is that going to be too thick with the GPUs sitting right there?

How do the Corsair ML Pro 140mm fans compare to the Noctua 140mm 3000RPM fans? Are all your Corsair ML Pro fans PWM (4-pin, meaning you connect only 3 pins for the ones that are not on the radiator/X62)?


----------



## Baasha

Quote:


> Originally Posted by *lilchronic*
> 
> The hotter these cards run the more performance you lose.


Yea, I understand. I wish I could just have someone install waterblocks on these cards; I don't have the patience to do water-cooling on my GPUs. I change GPUs so often it doesn't seem worth the effort to me. However, if someone is willing to do it for me, I'll go for it. Is there a professional (maybe semi-professional) in the Bay Area who is willing to do a custom water-cooling setup on my rigs? Both my rigs have 4-Way SLI and are air-cooled.
Quote:


> Originally Posted by *Luda*
> 
> Save the money on upgrading fans and put it towards the x62. The additional radiator space will net you MUCH lower temps, a single 140 on that proc is WAY under spec'd.
> 
> 1)pics of the 4x cards installed?
> 2)how much do you care about noise?


Yea, I'm picking up the X62. I bought the X41 thinking I'd save space, but that was not a good idea. Which fans should I get, the Corsair ML Pro 140mm PWM or the Noctua 140mm PWM 3000RPM? I want the best performance; I don't care about noise at all.

1.) 
2.) I don't care too much about noise - my rigs are in another room so I enjoy quiet and cool gaming!


----------



## Luda

Quote:


> Originally Posted by *ruggercb*
> 
> I swapped out 970s SLI (+175/+300 mem) for 1080ti w/ txp ek block and my firestrike score went from 18500 to 18900. Yay. CPU is at 3.9 ghz. X58 might be holding me back.


I'm coming from 3x 970s @ 1480mhz. I haven't gotten a chance to bench 3DMark yet, but I'm crossing my fingers it beats out my 25000 Firestrike run. Gaming has been sublime on it, though.


----------



## KedarWolf

Here are the .oem2 files to unlock the voltage slider. Most cards use the first one; EVGA might need the second. Right-click the file, Properties, Unblock, then unzip. Replace the .oem2 file in the MSI Afterburner program folder.

MSIAfterburner.zip 1k .zip file


EVGA.zip 1k .zip file


If these do not work for you, post a screenshot of your Afterburner 'Info' tab and I'll make one if you want.









Edit: Only use the voltage slider if you are under water, as the card will downclock anyway once you hit 60C and higher. The extra voltage will make it run a bit hotter, I'm sure.
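If you end up swapping these files more than once, the replace-and-back-up step can be scripted. A minimal sketch: the function name is my own, and the Afterburner install path is an assumption you'd adjust; it backs up the stock file before overwriting.

```python
import shutil
from pathlib import Path

def install_oem2(src, afterburner_dir):
    """Back up any existing profile with the same name, then copy the
    new .oem2 in. `afterburner_dir` is wherever Afterburner is installed
    (commonly C:/Program Files (x86)/MSI Afterburner) -- adjust to your
    install."""
    src = Path(src)
    dest = Path(afterburner_dir) / src.name
    if dest.exists():
        # keep the stock file around so the swap is reversible
        shutil.copy2(dest, dest.with_name(dest.name + ".bak"))
    shutil.copy2(src, dest)
    return dest
```

Run it as admin if Afterburner lives under Program Files, since that directory is write-protected.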


----------



## richiec77

Quote:


> Originally Posted by *ruggercb*
> 
> I swapped out 970s SLI (+175/+300 mem) for 1080ti w/ txp ek block and my firestrike score went from 18500 to 18900. Yay. CPU is at 3.9 ghz. X58 might be holding me back.


X58 must be. I went from GTX 980 Ti OC Hybrids in SLI to a single GTX 1080 Ti. Stock out of the box, Firestrike for me is 21,269; the 980 Tis in SLI scored 22,150. Running X99, 5930K @ 4.2GHz. I did an auto OC tune (yeah, I know...) and with AVX unchecked it ramped the CPU up to a 4.5GHz OC. So the 1080 Ti looks closer overall, but compare the graphics scores to see where you're at.

_________________________________________
GTX 980 Ti SLI OC
_________________________________________
Firestrike: 22,150
31,393 Graphics
15,595 Physics
8,594 Combined
_________________________________________
GTX 1080 Ti STOCK
_________________________________________
Firestrike: 21,269
27,409 Graphics
17,315 Physics
9,099 Combined
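Putting those sub-scores in percentage terms (same numbers as above) shows why the graphics score is the one to compare:

```python
# Percent change in each Firestrike sub-score going from
# 980 Ti SLI (OC) to a single stock 1080 Ti, per the figures above.
sli_980ti = {"graphics": 31393, "physics": 15595, "combined": 8594}
stock_1080ti = {"graphics": 27409, "physics": 17315, "combined": 9099}

deltas = {
    k: round((stock_1080ti[k] - sli_980ti[k]) / sli_980ti[k] * 100, 1)
    for k in sli_980ti
}
print(deltas)  # {'graphics': -12.7, 'physics': 11.0, 'combined': 5.9}
```

Graphics is down ~13% versus the SLI pair, while the physics and combined gains mostly reflect the CPU overclock, not the GPU.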


----------



## dboythagr8

Quote:


> Originally Posted by *Baasha*
> 
> Very nice!
> 
> Can you please post some pics of the X62 showing the radiator and fan setup? looks like you installed it in the 'front' of the case instead on the top(?). If you install it in the front, is there enough room to install a push/pull config or is that going to be too thick with the GPUs sitting right there?
> 
> How do the Corsair ML Pro 140mm fans compare to the Noctua 140mm 3000RPM fans? Are all your Corsair ML Pro fans PWM (4-pin - which means you install only 3 pins for the ones that are not on the radiator/X62)?


The x62 is installed at the top of the case. Plenty of room with the 140mm fans. Before this setup I had a H100i in push/pull running 4x 120mm SP120s and it fit fine.

The ML 140 Pros are all 4 pin PWM. On the Kraken there is a single 4 pin fan extension, the rest are 3 pin. So I just plugged one of the fans on the radiator into the 4 pin, and the other into an available extension (there are 4 total on the x62).

The other ML Pros are all plugged into my motherboard and controlled via Fan Xpert.

I've never used Noctuas so I can't speak on that, but these fans keep my temps cool. And they're black


----------



## Luda

Quote:


> Originally Posted by *Baasha*
> 
> Yea I understand - I wish I could just have someone install waterblocks on these cards - I don't have the patience to do water-cooling on my GPUs. I change GPUs so often it doesn't seem worth the effort to me. However, if someone is willing to do it for me, I'll do it. Is there a professional (may be semi-professional) in the bay area who is willing to do a custom water-cooling setup on my rigs? Both my rigs have 4-Way SLI and are air-cooled.
> yea, I'm picking up the X62 - I bought the X41 thinking I'll save space but that was not a good idea. Which fans should I get? The Corsair ML Pro 140mm PWM or the Noctua 140mm PWM 3000RPM? I want the best performance - I don't care about noise at all.
> 
> 1.)
> 2.) I don't care too much about noise - my rigs are in another room so I enjoy quiet and cool gaming!


Man, I wish I weren't on the other coast; that would be a sick hardline loop. I change GPUs once every year to 18 months or so, and it's not too bad: I drain the loop with the rig on its back panel so the card empties at the connectors, then just pull the card out and hook the new one up. I love my water loop. 5930K @ 4.8GHz hits about 50C gaming, and my 1080 Ti, without even bleeding the system, is barely scratching 44C.

You could go with the 24V Noctuas, or something from Delta if you really don't care about noise. Either option should give you plenty of CFM, but both would be insanely loud.


----------



## dboythagr8

@Baasha

Whatever fans you go with I recommend moving to 3x 120mm intakes in the front (my previous setup) or go to 2x 140mm fans. You need all the air you can get.

I imagine you're leaving tons of performance on the table as is, plus those cards are throttling. I saw it quick with my tri-sli setup, so i know you're running into it. Your setup is really made for water especially with the CPU.


----------



## ruggercb

Quote:


> Originally Posted by *richiec77*
> 
> X58 must be. I went from GTX 980 Ti OC Hybrids in SLI to a GTX 1080 Ti. Stock out of the box Firestrike for me is 21,269. 980's Ti in SLI is 22,150. Running x99 5930K @4.2GHz. I did an Auto OC tune (yeah I know...) and with the AVX un-checked, it ramped up a 4.5GHz OC on the CPU. So the 1080Ti looks closer in overall, but compare the graphics scores to see where you're at.
> 
> _________________________________________
> GTX 980 Ti SLI OC
> _________________________________________
> Firestrike: 22,150
> 31,393 Graphics
> 15,595 Physics
> 8,594 Combined
> _________________________________________
> GTX 1080 Ti STOCK
> _________________________________________
> Firestrike: 21,269
> 27,409 Graphics
> 17,315 Physics
> 9,099 Combined


I think my physics is around 9500 (edit: I think it's actually around 12000). Graphics are similar to yours. I'll do some double-checking tonight.


----------



## Baasha

Quote:


> Originally Posted by *Luda*
> 
> Man i wish i weren't on the other coast, that would be a sick hardline loop. I change Gpus once a year - 18 months or so, its not too bad, i drain the loop with the rig on the back panel so the card empties at the connectors then just pull the card out, and hook the new one up. I love my waterloop, 5930k @ 4.8ghz hits about 50c gaming, and my 1080ti without bleeding the system is barely scratching 44c.
> 
> You could got with the 24v noctuas, or something from Delta if you really dont care about noise. Both options should give you plenty of CFM but would be insanely loud.


I have a few Delta 5000RPM fans - that's too much even for me. LOL.. I think I'll try the 3000RPM Noctua 140mm fans and see.

Quote:


> Originally Posted by *dboythagr8*
> 
> @Baasha
> 
> Whatever fans you go with I recommend moving to 3x 120mm intakes in the front (my previous setup) or go to 2x 140mm fans. You need all the air you can get.


Yea, I'm definitely going with 2x 140mm for the front - just a question of whether to use the 3000RPM Noctuas or something else.

In fact, I have 5 or 6 3000RPM fans from way back when I used the Mountain Mods case (was rocking 4x 580 Classifieds in 4-Way SLI!!):



Those fans are THICK though - so will have to see if they'll fit properly. Anyway, for the Kraken X62, I'm probably going to go with the 3000RPM Noctua 140mm fans and see how they perform.


----------



## Biggu

Quote:


> Originally Posted by *chibi*
> 
> Dance on brother, I know I will be doing that once the loop is complete.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Regarding the Heatkiller block, so you're saying technically I can mix and match the following on my EVGA 1080 Ti and it will still work?
> 
> Link -> Watercool HEATKILLER® IV for TITAN X (Pascal)
> Link -> Aquacomputer Backplate for Kryographics Pascal NVIDIA TITAN X, Active XCS


I don't know about the active one, since you will need to replace the outlet of the block with the one that comes with it; however, the passive one (which I got) should work with no problems.


----------



## zipeldiablo

Quote:


> Originally Posted by *richiec77*
> 
> X58 must be. I went from GTX 980 Ti OC Hybrids in SLI to a GTX 1080 Ti. Stock out of the box Firestrike for me is 21,269. 980's Ti in SLI is 22,150. Running x99 5930K @4.2GHz. I did an Auto OC tune (yeah I know...) and with the AVX un-checked, it ramped up a 4.5GHz OC on the CPU. So the 1080Ti looks closer in overall, but compare the graphics scores to see where you're at.
> 
> _________________________________________
> GTX 980 Ti SLI OC
> _________________________________________
> Firestrike: 22,150
> 31,393 Graphics
> 15,595 Physics
> 8,594 Combined
> _________________________________________
> GTX 1080 Ti STOCK
> _________________________________________
> Firestrike: 21,269
> 27,409 Graphics
> 17,315 Physics
> 9,099 Combined


I have an OC'd 5960X and I have the same issue: my SLI of GTX 980 Ti HOFs had a better score in Firestrike, including Firestrike Ultra.
I don't get how, though, since I get better performance in games.


----------



## alucardis666

Quote:


> Originally Posted by *zipeldiablo*
> 
> I have an oc 5960x and i have the same issue, my sli of gtx 980ti hof had a better score on firestrike included firestrike ultra.
> I don't get how though since i have better performances in games.


Probably has to do with the memory being faster and the larger memory bus, vs. your 980 Tis having higher boosts and the benchmarks caring more about that than the card RAM. At least that's what I'd think, anyway.
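For what it's worth, the flip in totals falls straight out of how 3DMark weights the sub-scores: per Futuremark's technical guide, the Fire Strike overall score is a weighted harmonic mean (0.75 graphics, 0.15 physics, 0.10 combined), so weaker physics/combined results can drag the total down even when graphics is well ahead. A quick sanity check against richiec77's posted numbers:

```python
def fire_strike_score(graphics, physics, combined):
    """Weighted harmonic mean used for the Fire Strike overall score
    (weights per Futuremark's technical guide)."""
    return round(1 / (0.75 / graphics + 0.15 / physics + 0.10 / combined))

# richiec77's posted sub-scores:
sli_980ti = fire_strike_score(31393, 15595, 8594)      # ~22150 (posted total: 22,150)
single_1080ti = fire_strike_score(27409, 17315, 9099)  # ~21269 (posted total: 21,269)
print(sli_980ti, single_1080ti)
```

So the SLI rig's much higher graphics score is mostly offset by the single 1080 Ti's better physics and combined scores in the weighting.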


----------



## alucardis666

In other news, my 1080 Ti should become a Hybrid this Friday! Super excited!


----------



## Slackaveli

Quote:


> Originally Posted by *Syan48306*
> 
> Fluctuations between 120 and 144 hz at 4K on a g-sync monitor is plenty fine


Why do these fools act like you have to have a locked 144 or you might as well just stay at 60 fps? Idiots.


----------



## Addsome

Quote:


> Originally Posted by *alucardis666*
> 
> In other news my 1080ti should become an Hybrid this friday! Super excited!


Will you be modifying the EVGA shroud or only using the cooler on the chip? I'm gonna be doing it the GamersNexus way tonight after work. They just posted part 2 of their 1080 Ti hybrid series.


----------



## alucardis666

Quote:


> Originally Posted by *Addsome*
> 
> Will you be modifying the evga shroud or only using the cooler on the chip? Im gonna be doing the gamernexus way tonight after work. They just posted up part 2 of their 1080ti hybrid series.


Yup, just saw that. I'm planning to do it like this

http://forums.evga.com/My-1080-Ti-Hybrid-m2629639.aspx


----------



## Addsome

Quote:


> Originally Posted by *alucardis666*
> 
> Yup, just saw that. I'm planning to do it like this
> 
> http://forums.evga.com/My-1080-Ti-Hybrid-m2629639.aspx


Do you know if we have to take the whole card apart to do it that way?


----------



## richiec77

Quote:


> Originally Posted by *ruggercb*
> 
> I think my physics is around 9500(edit I think it's actually around 12000). Graphics are similar to yours. I'll do some double checking tonight.


That's stock out of the box. Nothing changed from the factory. IIRC... power limits, temp and a +110 on the core got me to about 29,500 on Graphics. I remember it being close. I've stopped tinkering until the waterblock comes in, as it thermally downclocks below 2,000 all the time and I'm not living with a jet engine.


----------



## alucardis666

Quote:


> Originally Posted by *Addsome*
> 
> Do you know if we have to take the whole card apart to do it that way?


Read the post; he said you do not need to take apart the backplate at all, only remove the vapor chamber/fins and the acrylic window, and that's it.


----------



## Addsome

Quote:


> Originally Posted by *alucardis666*
> 
> Read the post, he said you do not need to take apart the back plate at all, only remove the vaporchamber/finds and the acrylic window and that's it.


Will you be using the thermal paste that comes on the hybrid kit or applying your own?


----------



## alucardis666

Quote:


> Originally Posted by *Addsome*
> 
> Will you be using the thermal paste that comes on the hybrid kit or putting your own?


I bought my hybrid kit used off ebay so I will be applying my own TIM.


----------



## KedarWolf

Quote:


> Originally Posted by *richiec77*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ruggercb*
> 
> I think my physics is around 9500(edit I think it's actually around 12000). Graphics are similar to yours. I'll do some double checking tonight.
> 
> 
> 
> That's stock out of the box. Nothing changed from the factory. Iirc...power limits, temp and a 110+ on the care got me to about 29,500 on GRAPHIC. I remember it being close. I've stopped tinkering until the waterblock comes in as it thermally downclocks below 2,000 all the time and I'm not,living with a jet engine.
Click to expand...

I've stopped tinkering too; at 2000 core it hits 60-70C and a bit higher, and downclocks in Firestrike etc.









Ek waterblock and backplate coming by Wednesday of next week though!!


----------



## Radox-0

Quote:


> Originally Posted by *Baasha*
> 
> Talking about cooling - I have these cards (4x of them) installed in a Corsair Carbide Air 540 case.
> 
> What are the absolute best *intake* fans to use for airflow? These 4 cards are throwing a LOT of heat (of course, they get pushed out the back of the case which is nice) and so my Kraken X41 which has only one 140mm fan as a 'pull' config (on top) is struggling to keep my CPU cool - I was seeing 90C on the CPU last night! o_0
> 
> I was thinking of picking up the Noctua NF-A14 iPPC 3000 PWM fans for the Kraken X41 but I need fans for intake in the front of the case - right now I'm just using 2x Corsair LED 120mm fans - they don't push/pull a lot of air.
> 
> I was also considering ditching the Kraken X41 for the Kraken X62 - anyone have experience with that?


I only use a trio of them, not a fourth, but there's not too much you can do really on these stock coolers, as they are tightly packed. I assume you have pulled off the backplates on the other cards in the stack; I found that gave some relief. I have a pair of fans (Noctua NF-A14 iPPC 2000 RPM PWM) roughly in line with the GPUs pushing air down the opening, which also helps, though simply put there's only so much you can do with a stack of reference cards. By the sounds of it you don't live in the same room as the PC, so the 3000 RPM fans would be the way I'd go: blast air into that case.









Would definitely get something better for your CPU, though. Most of the air should be exhausted out the rear, so I expect it's just that your CPU cooler is not up to snuff, or you have a serious OC on that CPU.


----------



## Slackaveli

Anybody tried just thermal paste mod with stock vapor chamber? Like with Kryonaut?


----------



## PasK1234Xw

Quote:


> Originally Posted by *MunneY*
> 
> Watercooled chief.. Im talking about air cards.
> ]


First, you never specified air, and you said to show you one card.

Also, you originally said these were limited to 2050, then said to show you one that can hit 2100; so regardless of what you think, even with clocks fluctuating it's still over your 2050 claim.
Quote:


> Originally Posted by *MunneY*
> 
> LOL at the AIB being slower than the reference. This just proves that unless someone gets us voltage and power limit controls, we are stuck at 2050mhz!


now move along


----------



## nrpeyton

Opinions ----> Any benefit to going with a beefier PCB on the 1080 Ti (i.e. the EVGA Classified)? _I realise it's not been released/announced yet._

What's overclocking like? Are you guys finding the FE power limit awfully restrictive?


----------



## lilchronic

Quote:


> Originally Posted by *Baasha*
> 
> Yea I understand - I wish I could just have someone install waterblocks on these cards - I don't have the patience to do water-cooling on my GPUs. I change GPUs so often it doesn't seem worth the effort to me. However, if someone is willing to do it for me, I'll do it. Is there a professional (may be semi-professional) in the bay area who is willing to do a custom water-cooling setup on my rigs? Both my rigs have 4-Way SLI and are air-cooled.
> yea, I'm picking up the X62 - I bought the X41 thinking I'll save space but that was not a good idea. Which fans should I get? The Corsair ML Pro 140mm PWM or the Noctua 140mm PWM 3000RPM? I want the best performance - I don't care about noise at all.
> 
> 1.)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 2.) I don't care too much about noise - my rigs are in another room so I enjoy quiet and cool gaming!


You would have to find someone that lives near you to do it for you. I'm sure someone on OCN is close to you and would do it for a little cash. It's really not that hard, just time consuming.
I just got done putting a water block on, and it took about an hour: taking the card apart, putting all the thermal pads on for the water block, then putting it all back together. Now I just need an hour or two to drain my loop and install the new card.


----------



## richiec77

Quote:


> Originally Posted by *KedarWolf*
> 
> I've stopper tinkering too, at 2000 core it hits 60-70C and a bit higher and downclocks in Firestrike etc.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ek waterblock and backplate coming by Wednesday of next week though!!


Right on! Also...nice golden CPU!

Still waiting on my watercooling parts to ship from Performance PCS. Ordered March 5th and it's been held up due to the Titan X waterblock and backplate being out of stock. Maybe today? I'm hoping it gets out the door soon for this weekend.


----------



## vmanuelgm

Quote:


> Originally Posted by *lilchronic*
> 
> You would have to find someone that live's near you to do it for you. I'm sure someone on ocn is close to you and would do it for a little cash. It's really not that hard just time consuming.
> just got done putting a water block on and it took about a hour taking the card apart putting all the thermal pads on for the water block then putting it all back together. Now i just need a hour or two to drain my loop and install the new card.


No quick disconnects I guess...

Only 15 minutes to setup the block and put it in...


----------



## Slackaveli

Quote:


> Originally Posted by *Addsome*
> 
> Do you know if we have to take the whole card apart to do it that way?







Do it this way except slap the cooler on instead of just the paste.


----------



## KedarWolf

Made a thread on how to enable the voltage slider in Afterburner. I'd only use this on water cooling, though. Your card will downclock when it hits 60C and higher anyways, and the extra voltage may make it run a bit hotter.

http://www.overclock.net/t/1625653/how-to-get-voltage-slider-in-afterburner-working-on-a-1080-ti/0_20


----------



## Slackaveli

Quote:


> Originally Posted by *Addsome*
> 
> Do you know if we have to take the whole card apart to do it that way?


Quote:


> Originally Posted by *Slackaveli*
> 
> Anybody tried just thermal paste mod with stock vapor chamber? Like with Kryonaut?


anyone?


----------



## Addsome

Quote:


> Originally Posted by *Slackaveli*
> 
> 
> 
> 
> 
> 
> Do it this way except slap the cooler on instead of just the paste.


Don't I have to connect the fan headers on the hybrid kit to the card as well?


----------



## zipeldiablo

Quote:


> Originally Posted by *alucardis666*
> 
> Probably has to do with the memory being faster and the larger memory bus, vs with your 980Ti having higher boosts and the benchmarks caring more for that than the card RAM. At least that's what I'd think anyway.


I see. Kinda weird though
Quote:


> Originally Posted by *alucardis666*
> 
> In other news my 1080ti should become an Hybrid this friday! Super excited!


Same here







and then she will get a nice oc.


----------



## Slackaveli

Quote:


> Originally Posted by *Addsome*
> 
> Dont I have to connect the fan headers on the hybrid kit to the card aswell?


Yes, sorry if I was confusing. This is just to show you the easy way to open it up; after that, do the cooler mod. If you have big fingers it may be hard to plug in, and that means you may then need to remove the cover over the fan, like in the GamersNexus video.


----------



## Addsome

Quote:


> Originally Posted by *Slackaveli*
> 
> yes. sorry if i was confusing. this is just to show you how to open it up easy way. after that do the cooler mod. if you have big fingers it may be hard to plug it in and that means you may then need to remover the cover over the fan like in the gamersnexus vieo.


Alright, thanks for your help.


----------



## Joshwaa

Quote:


> Originally Posted by *Slackaveli*
> 
> Anybody tried just thermal paste mod with stock vapor chamber? Like with Kryonaut?


Well, since FedEx decided not to deliver my new mobo but did deliver the 1080 Ti, I might just take the 1080 Ti apart and put Kryonaut on it while I am waiting. I don't really feel like shoving the 1080 Ti into one of my other rigs for just a day. However, I would not have a test with the stock paste that way. Still might just do it, as I am pretty bored and the 1080 Ti is just staring at me.


----------



## Slackaveli

word


----------



## Slackaveli

Quote:


> Originally Posted by *Joshwaa*
> 
> Well since FedEx decided to not deliver my New Mobo but did deliver the 1080ti I might just take the 1080ti apart and put Kryonat on it while I am waiting. Don't really feel like just shoving the 1080ti into one of my other rigs for just a day. However I would not have a test with the stock paste that way. Still might just do it as I am pretty bored and the 1080ti is just staring at me.


Well, it won't be apples to apples, but I know I idle at 35C (in adaptive/optimal) and 45C (in Prefer Maximum Performance, aka staying at full clock speed). On the latter, I end up at 77-78C with a 1.2/1 fan/temp ratio after a half-hour 4K Heaven loop. So I'm interested in your results if you do it. I don't have paste on hand, and stores around here sell crap, so I gotta wait for Amazon.

Very curious, because I gained 10C under load and 11C idle on my 980 Ti Hybrid when I opened her up and repasted, and that wasn't even with Kryo.


----------



## keikei

Quote:


> Originally Posted by *Joshwaa*
> 
> Well since FedEx decided to not deliver my New Mobo but did deliver the 1080ti I might just take the 1080ti apart and put Kryonat on it while I am waiting. Don't really feel like just shoving the 1080ti into one of my other rigs for just a day. However I would not have a test with the stock paste that way. Still might just do it as I am pretty bored and the 1080ti is just staring at me.




Still waiting on mine...any time now...


----------



## TheFallenDeity

Getting my Ti tomorrow. I returned my Titan XP that did 2126 on water. I got back $500 and a free game for the same performance.







Not too bad I'd say.

Now I can't decide which game to get, Wildlands or For Honor?


----------



## GRABibus

I hope Gigabyte will release a 1080Ti Xtreme Gaming WATERFORCE, as they did for the 1080.


----------



## Joshwaa

Yea so I was bored enough...


----------



## keikei

Quote:


> Originally Posted by *TheFallenDeity*
> 
> Getting my Ti tomorrow. I returned my Titan XP that did 2126 on water. I got back $500 dollars and a free game for the same performance.
> 
> 
> 
> 
> 
> 
> 
> Not too bad I'd say.
> 
> Now I can't decide which game to get, Wildlands or For Honor?


Open-world third-person shooter vs. fighting game. Both buggy, as it's Ubi. I'm getting For Honor myself.


----------



## Slackaveli

Quote:


> Originally Posted by *Joshwaa*
> 
> Yea so I was bored enough...


oh, sweet. cant wait to hear the results.


----------



## Joshwaa

I am starting to wonder if the EVGA FTW cooler from my 1080, which is now under water, would fit on my FE 1080 Ti... Hmmmm. The rest of my stuff from EK for the 1080 Ti loop won't be here till next week. Not sure if I will get that bored, though.


----------



## Slackaveli

Quote:


> Originally Posted by *Joshwaa*
> 
> I am starting to wonder if my EVGA FTW cooler from my 1080 that is now under water would fit on my FE 1080ti... Hmmmm..


sure it will.


----------



## Slackaveli

Quote:


> Originally Posted by *Slackaveli*
> 
> sure it will. There are a couple of thermal pads that may not be aligned perfectly, but if you just do the rad itself and leave the Ti's fan/cover on, it'll work fine.


----------



## eXistencelies

Quote:


> Originally Posted by *Joshwaa*
> 
> I am starting to wonder if my EVGA FTW cooler from my 1080 that is now under water would fit on my FE 1080ti... Hmmmm.. The rest of my stuff from EK for the loop for the 1080ti wont be here till next week. Not sure if I will get that bored though.


It will not. The only block that fits it is the Titan XP block.


----------



## Joshwaa

The stock one from the FTW is huge and does not have a rad. Might test fit it tonight. Kinda want to make sure the card works first before I go thru all that though.


----------



## alucardis666

Quote:


> Originally Posted by *Joshwaa*
> 
> Yea so I was bored enough...


Results?


----------



## vmanuelgm

Quote:


> Originally Posted by *Joshwaa*
> 
> Yea so I was bored enough...


To be continued... Or not...


----------



## Joshwaa

Quote:


> Originally Posted by *eXistencelies*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Slackaveli*
> 
> sure it will.
> 
> 
> 
> It will not. The only block that fits it in the Titan XP block.
Click to expand...

No I have the EK block for the Titan that is for the 1080ti. I was talking about seeing if the stock cooler from my 1080 FTW would fit on my 1080ti FE.


----------



## richiec77

Quote:


> Originally Posted by *Joshwaa*
> 
> No I have the EK block for the Titan that is for the 1080ti. I was talking about seeing if the stock cooler from my 1080 FTW would fit on my 1080ti FE.


Oh crap! I have 2 Hybrid 980 Tis sitting here now... wonder if it'll fit.

Watercooling stuff shipped today finally. Doesn't get here until Monday. Might see how well this works.

Or I'll be lazy and keep playing Killing Floor 2


----------



## eXistencelies

Quote:


> Originally Posted by *Joshwaa*
> 
> No I have the EK block for the Titan that is for the 1080ti. I was talking about seeing if the stock cooler from my 1080 FTW would fit on my 1080ti FE.


Ohhhh.....Maybe...


----------



## lilchronic

Quote:


> Originally Posted by *vmanuelgm*
> 
> No quick disconnects I guess...
> 
> Only 15 minutes to setup the block and put it in...


Unfortunately not.


----------



## vmanuelgm

Quote:


> Originally Posted by *lilchronic*
> 
> Unfortunately not.


Would be a wise purchase!!! Very useful!!!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *richiec77*
> 
> Oh crap! I have 2 Hybrid 980 ti sitting here now..........wonder if it'll fit.
> 
> Watercooling stuff shipped today finally. Doesn't get here until Monday. Might see how well this works.
> 
> Or I'll be lazy and keep playing Killing Floor 2


It'll fit. They have the same hole pattern as the 1000 series.


----------



## AMDATI

I can get away with 80-84C at 60% fan speed, but one has to keep in mind that when the GPU is at 80C, the memory is at 90C. I still prefer my 70% fan speed and 74C max temps.


----------



## BigMack70

Quote:


> Originally Posted by *richiec77*
> 
> Oh crap! I have 2 Hybrid 980 ti sitting here now..........wonder if it'll fit.
> 
> Watercooling stuff shipped today finally. Doesn't get here until Monday. Might see how well this works.
> 
> Or I'll be lazy and keep playing Killing Floor 2


980 Ti hybrid fits perfectly and is very easy to fit. You sacrifice being able to use the half of the shroud near the I/O panel but that doesn't really matter for anything except aesthetics.


----------



## Slackaveli

Quote:


> Originally Posted by *BigMack70*
> 
> 980 Ti hybrid fits perfectly and is very easy to fit. You sacrifice being able to use the half of the shroud near the I/O panel but that doesn't really matter for anything except aesthetics.


It is funny how picky we are about our builds, with the colors and schemes and matching components and wiring mods and all that, but then we'll hack our 1080 Ti in half on day one, not even to overclock mind you, just to get 20C lower temps. What have we become? LMBO!


----------



## BigMack70

Quote:


> Originally Posted by *Slackaveli*
> 
> it is funny how picky we are about our builds with the colors and schemes and matching components and wiring mods and all that, but, then we'll hack our 1080ti in half on day one to get an extra 20c, not to overclock mind you, to get 20c lower temps. What have we become? LMBO!


I don't care about temps, but I do care greatly about noise. My PC is in the living room, so it needs to be silent. And you cannot get silence + halfway decent clock speeds on a stock blower.


----------



## alucardis666

Just re-TIM'd my Ti with MX4 for kicks and to see how hard it's gonna be to convert to a hybrid when my cooler arrives. Temps dropped 4C at idle and 3C at load in Heaven.


----------



## vmanuelgm

Quote:


> Originally Posted by *alucardis666*
> 
> Just Re-timed my Ti with MX4 for kicks and to see how hard it's gonna be to convert to a hybrid when my cooler arrives. Temps dropped 4c at idle and 3c @ Load in Heaven


Best non-conductive TIM is Kryonaut.

Best conductive is Conductonaut; both are Thermal Grizzly.


----------



## ajresendez

Yay just got my ship email from newegg!


----------



## alucardis666

Quote:


> Originally Posted by *vmanuelgm*
> 
> Best non conductive tim is kryonaut.
> 
> Best conductive, conductonaut, both Thermal Grizzly.


So I've heard. I used MX4 'cuz it's what I had on hand; honestly though, how much difference can it really make over MX4? 1-2C?


----------



## KedarWolf

Quote:


> Originally Posted by *alucardis666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *vmanuelgm*
> 
> Best non conductive tim is kryonaut.
> 
> Best conductive, conductonaut, both Thermal Grizzly.
> 
> 
> 
> So I've heard, I used MX4 cuz it's what I had on-hand, honestly though how much difference can it really make over MX4? 1-2c?
Click to expand...

Yeah, you might get 2C difference.


----------



## alucardis666

Quote:


> Originally Posted by *KedarWolf*
> 
> Yeah, you might get 2C difference.


http://www.eteknix.com/thermal-grizzly-thermal-paste-vs-6-major-brands-review/3/

Can't say it's worth $20


----------



## KraxKill

2100 does seem to be hard to attain on ambient air.

I'm at *2075MHz @ 1.09v* sustained and this is on an *X41 AIO, Shunt mod, liquid metal on the die, with Noctua NF-A14 fans in push pull. 1080 midplate fan at 60%*

I run at *29-38C at 20-24C* ambients.

I could imagine my card doing 2100 with lower load temps in the 17-25C range. In fact I'm "Valley" stable at 2100 right now so could claim it if I was dishonest. I test my OC's religiously with multiple benches, games, etc before I can claim to be remotely solid. I'm talking looping the bench for 15min-30min to get an idea and then some more to confirm stability. I do not allow crashes, driver recoveries, dips or even hiccups. For me it has to work 100% or it's a failed OC in my book. Besides unsustained boost clocks, I bet many let a lot of the above slide when stating clocks.

I'm stable through Valley at 2100 but in the Heaven benchmark both my 2088 and 2100 bins crash or fail.

This in the past, for me has usually signified that I will crash in regular use, in a game, sometime in the near future. In multiplayer, when it happens this SUCKS! So I cannot claim 2100 or even 2088 as stable. Not at 30C anyway. 5-10C Lower, I'm willing to bet I'd fly right on through.

I don't think for a minute that 2100 is a clock wall so much as a wall of diminishing returns, If somebody has a water chiller or wants to drop their AIO in an ice bucket, I encourage you on!!! I don't and this rig has to travel ever so often but I would love to see some sub ambient numbers from somebody.

I'm willing to call this more stable than the Hardware 3D card bouncing off the rev limiter, but despite being close, just as with theirs, it's not a stable 2100 in my book. Would have loved to be at 2100 on a single-fan AIO with room-temp ambients, but sadly it's unlikely to happen. At least not with my card.

On a side note, Heaven for me has more often than not been the Achilles heel. It's been my go to for OC'ing cards. If I can loop heaven, I know I'm more than likely good to go explore elsewhere. Valley seems to be the easiest of the big 4 to loop for me. Opinions and results may vary.









http://www.3dmark.com/fs/12007457


----------



## dboythagr8

I asked earlier but don't think I got a response:

Afterburner or Precision XOC?


----------



## vmanuelgm

Quote:


> Originally Posted by *KedarWolf*
> 
> Yeah, you might get 2C difference.


Using Conductonaut, the difference should be higher than 2 degrees.

Quote:


> Originally Posted by *dboythagr8*
> 
> I asked earlier but don't think I got a response:
> 
> Afterburner or Precision XOC?


Afterburner


----------



## alucardis666

Any advice for fitting the EVGA hybrid cooler in my 330R? I wanted to put it at the rear exhaust, but I won't have any clearance with the Cryorig. I'm thinking of maybe top-mounting it, or mounting it in the front where the HDD cages are; unfortunately, neither is a great alternative due to restricted airflow.


----------



## dboythagr8

Quote:


> Originally Posted by *vmanuelgm*
> 
> Using conductonaut the difference should be higher than 2 degrees.
> Afterburner


Any reason for it over Precision?


----------



## KingEngineRevUp

*Those using a 120mm radiator, can I ask you what type of fans you are using and what RPMs you're running them at?*

I have a H55 with Noctua NF-F12 running at 1500 RPM which translates to 55 CFM.

I'm going to try and switch to the 2000 RPM version fans which will push around 78 CFM.

The reason is that the air that blows out of the radiator is still very, very hot. My temps are at 62C to 65C. I want to try and get them down below 59C. I'm hoping that a higher CFM will make the air coming out a bit cooler; I will test this when I get the fans.

I know that a larger radiator would net much better results and even a more quiet system but I don't have room for a 140mm radiator right now.



Depending on radiator size and mesh density, I know you hit a point where more CFM gives diminishing returns. Where that point is, well, we have to test for it.
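Back-of-the-envelope, the exhaust-temperature question is just an energy balance: the air's temperature rise across the rad is ΔT = P / (ṁ·cp), so more CFM means cooler exhaust for the same wattage. A rough sketch with assumed numbers (a ~250 W card heat load and sea-level air properties are my assumptions; the CFM figures are the ones quoted above):

```python
# Energy balance: temperature rise of air passing through a radiator.
# Assumptions: ~250 W dumped into the loop, air density 1.2 kg/m^3,
# specific heat of air 1005 J/(kg*K).
CFM_TO_M3S = 0.000471947  # 1 cubic foot per minute in m^3/s

def exhaust_rise_c(watts, cfm, rho=1.2, cp=1005.0):
    """Temperature rise (C) of the air leaving the radiator."""
    mass_flow = cfm * CFM_TO_M3S * rho  # kg/s of air through the rad
    return watts / (mass_flow * cp)

print(round(exhaust_rise_c(250, 55), 1))  # ~8.0 C rise at the 1500 RPM fan's 55 CFM
print(round(exhaust_rise_c(250, 78), 1))  # ~5.6 C rise at the 2000 RPM fan's 78 CFM
```

So going from 55 to 78 CFM should knock roughly 2-2.5C off the exhaust air for the same heat load; the GPU temperature drop will be somewhat smaller, since the radiator's effectiveness also changes with airflow.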


----------



## dboythagr8

About what offset are you guys starting off with to OC (those on air), and what appears to be the limit?

I started with +85 GPU / +56 MEM and am working up from there...


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Just Re-timed my Ti with MX4 for kicks and to see how hard it's gonna be to convert to a hybrid when my cooler arrives. Temps dropped 4c at idle and 3c @ Load in Heaven


nice, man.


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> 2100 does seem to be hard to attain on ambient air.
> 
> I'm at 2075MHz @ 1.09v sustained and this is on an *X41 AIO, Shunt mod, liquid metal on the die, with Noctua 1500rpm fans in push pull. 1080 midplate fan at 60%*
> 
> I run at 29-38C at 20-24C ambients.
> 
> I could imagine my card doing 2100 with lower load temps in the 17-25C range. In fact I'm "Valley" stable at 2100 right now so could claim it if I was dishonest. I test my OC's religiously with multiple benches, games, etc before I can claim to be remotely solid. I'm talking looping the bench for 15min-30min to get an idea and then some more to confirm stability. I do not allow crashes, driver recoveries, dips or even hiccups. For me it has to work 100% or it's a failed OC in my book. Besides unsustained boost clocks, I bet many let a lot of the above slide when stating clocks.
> 
> I'm stable through Valley at 2100 but in the Heaven benchmark both my 2088 and 2100 bins crash or fail.
> 
> This in the past, for me has usually signified that I will crash in regular use, in a game, sometime in the near future. In multiplayer, when it happens this SUCKS! So I cannot claim 2100 or even 2088 as stable. Not at 30C anyway. 5-10C Lower, I'm willing to bet I'd fly right on through.
> 
> I don't think for a minute that 2100 is a clock wall so much as a wall of diminishing returns, If somebody has a water chiller or wants to drop their AIO in an ice bucket, I encourage you on!!! I don't and this rig has to travel ever so often but I would love to see some sub ambient numbers from somebody.
> 
> I'm willing to call this more stable than the Hardware 3D card bouncing off the rev limiter but despite being close, just as with theirs, not a stable 2100 in my book. Would have loved to be at 2100 on a single din AIO with room temp ambients, but sadly it's unlikely to happen. At least not with my card.
> 
> On a side note, Heaven for me has more often than not been the Achilles heel. It's been my go to for OC'ing cards. If I can loop heaven, I know I'm more than likely good to go explore elsewhere. Valley seems to be the easiest of the big 4 to loop for me. Opinions and results may vary.
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/12007457


Yeah, if I freeze or crash or bog down, I fix it; it isn't stable. Cinebench loops for CPUs and Heaven loops for GPUs will let you know the real deal.


----------



## Slackaveli

Quote:


> Originally Posted by *dboythagr8*
> 
> About what offset are you guys starting off with to OC, those on air and about what appears to be the limit?
> 
> I stated with +85 GPU / +56 MEM and working up from there...


Looks like most top out at +150; some can get a few more, some top out at +135. I'd start around +100. And, hell, just shove +400 on the mem to start. I will say this: I settled in at +250 mem, +155 core, and haven't crashed at that +155 since I lowered the memory from +500. I did freeze in Heaven before that. So it was either +145 core with +500 mem, or +155 core with +250 mem; for now I'm on the latter.


----------



## fisher6

The most stable I got without touching the voltage while playing games is +175 on the core and +600 on the memory. Not sure if touching the voltage is worth it.

EDIT: For those wondering about ultrawide performance, this card is absolutely amazing


----------



## dboythagr8

Ok I'll step it up on MEM


----------



## lilchronic

I so sad









So my monitor is an X-Star 2560x1440 with only a DVI port. This monitor is capable of up to 130Hz without skipping frames. I used to run it @ 120Hz with no problems, and now with an active DisplayPort to DVI adapter I can't get above 75Hz.









I can't game at less than 120Hz, it hurts me eyes man.

Probably going to just sell it with the water block. idk

I need a card with a DVI port, so I'll probably wait for the Strix or some other custom PCB card that will get a waterblock. I have been eyeballing that Strix though; hope someone makes a block for it.


----------



## RedM00N

Thinking of just forgetting the AIBs and grabbing an FE. Too bad it's out of stock everywhere.

Might just get all the VRM heatsinks and AIO stuff for now, to be ready for whenever I can snag one.


----------



## dboythagr8

Ok so can someone explain to me the "throttling" that's happening?

I put a +100 offset on the core clock, power at 120%.

Heaven starts out at 1974MHz, then drops to 1962, 1949, 1936, before settling at 1924. Temps maxed out at 73C. So even with a +100 offset, the card was running at 1924MHz. Why?


----------



## KedarWolf

Quote:


> Originally Posted by *dboythagr8*
> 
> Ok so can someone explain to me the "throttling" that's happening?
> 
> I put a +100 offset on the core clock, power at 120%.
> 
> Heaven starts out at 1974mhz. Then drops to 1962,1949,1936, before settling at 1924. Temps maxed out at 73c. So despite that even with a +100, the card was running at 1924mhz. Why?


Cards start throttling at 60C, and the higher the temps go, the more they throttle, hence the clocks dropping more and more.









That's why I'm going water cooling: EK block and backplate ordered, to be here by Wednesday of next week. With water you should always be under 60C.
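For anyone who wants a feel for what that temperature throttling does to clocks, here's a rough back-of-the-napkin model in Python. The ~13MHz bin size is the commonly reported Pascal boost step; the throttle start point and how quickly it steps down are guesses for illustration, not NVIDIA's actual boost algorithm.

```python
import math

BIN_MHZ = 13  # Pascal adjusts boost clocks in ~13MHz steps

def throttled_clock(target_mhz, temp_c, throttle_start=60.0, degrees_per_step=5.0):
    """Estimate the sustained clock after temperature throttling.

    Purely illustrative: drop one ~13MHz bin for every few degrees
    past the throttle point (the rate here is a made-up parameter).
    """
    if temp_c <= throttle_start:
        return target_mhz
    steps = math.floor((temp_c - throttle_start) / degrees_per_step)
    return target_mhz - steps * BIN_MHZ

# e.g. a card targeting 1974MHz but running at 73C:
print(throttled_clock(1974, 73))  # 1948 under these made-up parameters
```

The real boost algorithm also weighs power and voltage limits, so treat this as a mental model only.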


----------



## dboythagr8

Quote:


> Originally Posted by *KedarWolf*
> 
> Cards start throttling at 60C, and the higher the temps go, the more they throttle, hence the clocks dropping more and more.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Why I'm going water cooling, EK block and backplate ordered, to be here by Wednesday of next week. With water you should always be under 60C.


Ohh

I thought cards started throttling at 70c?

That clears up some things


----------



## alucardis666

Quote:


> Originally Posted by *dboythagr8*
> 
> Ohh
> 
> I thought cards started throttling at 70c?
> 
> That clears up some things


Yea, honestly I wish they started to throttle at 70-75C. 60 is VERY aggressive and makes overclocking really tough unless you have great cooling


----------



## RedM00N

Quote:


> Originally Posted by *KedarWolf*
> 
> Cards start throttling at 60C, and the higher the temps go, the more they throttle, hence the clocks dropping more and more.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Why I'm going water cooling, EK block and backplate ordered, to be here by Wednesday of next week. With water you should always be under 60C.


Wish it was like my Maxwell cards where they start to throttle in the 80's. 60 seems so conservative.


----------



## KingEngineRevUp

That's why we need a custom bios ASAP. To adjust that damn thermal throttling.


----------



## KedarWolf

Quote:


> Originally Posted by *RedM00N*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Cards start throttling at 60C, and the higher the temps go, the more they throttle, hence the clocks dropping more and more.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Why I'm going water cooling, EK block and backplate ordered, to be here by Wednesday of next week. With water you should always be under 60C.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Wish it was like my Maxwell cards where they start to throttle in the 80's. 60 seems so conservative.
Click to expand...

Until someone can make custom Pascal BIOSes that can be flashed, we're SOL. They have built-in protection: if a BIOS is modified in any way, it can't be flashed. The only reason I can think the designers would force that on us is fewer warranty claims from bad flashes and/or cards dying from modified BIOSes. It really bites that they do that now.


----------



## lilchronic

My card throttles once it hits 30°C-32°C.

http://www.3dmark.com/3dm/18616587


----------



## mouacyk

Quote:


> Originally Posted by *lilchronic*
> 
> My card throttle's once it hit's 30°c- 32°c.
> 
> http://www.3dmark.com/3dm/18616587


Looking at your "Power" graph, it might be throttling due to maxing out power rather than temperature. I've alluded to this before, but this is the part that sucks: once you beat the temps, you still can't beat the power throttle.


----------



## alucardis666

My card will do 2000 boost up to 56C, then it downclocks hard to 1354?!


----------



## dboythagr8

Up to +180/+325 so far in Heaven

Settles in at 2012mhz. 98.2 fps @ Heaven @ 1440p. I need those last 2 fps!!!


----------



## alucardis666

Here's my GPU-Z log from running the fan @ 100% and playing Wildlands for about 15 minutes on the ultra preset @ 1440p.

Thoughts?

GPU-ZSensorLog.txt 492k .txt file



Really hoping to get 2050-2100 sustained once that hybrid cooler arrives. Even @ 100% fan in a 20C ambient room, it's not enough to prevent throttling.


----------



## AMDATI

Quote:


> Originally Posted by *RedM00N*
> 
> Wish it was like my Maxwell cards where they start to throttle in the 80's. 60 seems so conservative.


They do that because when the GPU is at 60c, the memory is at 80c. When the GPU is at 80c the memory is at 90c.


----------



## alucardis666

Quote:


> Originally Posted by *AMDATI*
> 
> They do that because when the GPU is at 60c, the memory is at 80c. When the GPU is at 80c the memory is at 90c.


So these GDDR5X chips are basically lava... lol.

Hoping that when they switch to HBM the thermal throttling story is VERY different.


----------



## Slackaveli

Quote:


> Originally Posted by *fisher6*
> 
> Most stable i got without touching the voltage while playing games is +175 on the core and +600 on the memory. Not sure if touching the voltage is worh it.
> 
> EDIT: For those wondering about ultrawide performance, this card is absolutely amazing


That's actually REALLY good. I lock up over +150 if I OC the mem much at all. I can hold +150 with a +500 mem clock but no higher, and even with no mem OC I can only keep +155 through an hour-long Heaven loop. YOU LUCKY DOG YOU


----------



## Slackaveli

Quote:


> Originally Posted by *lilchronic*
> 
> I so sad
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So my monitor is a X-Star 2560x1440 with only a DVI port. This monitor is capable of up to 130Hz with out it skipping frames. I used to run it @ 120hz with no problems and now with a active Display Port to DVI adapter i cant get above 75Hz.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I cant game at less then 120Hz, it hurts me eye's man.
> 
> Probably going to just sell it with a the water block. idk
> 
> I need a card with A DVI port so i'll probably wait for the strix or some other custom pcb card that will get a waterblock. I have been eyeballing that strix though, hope someone make's a block for it.


the obvious move is to wait for the new 4k HDR high framerate g-sync monitors.


----------



## Slackaveli

Quote:


> Originally Posted by *dboythagr8*
> 
> Ok so can someone explain to me the "throttling" that's happening?
> 
> I put a +100 offset on the core clock, power at 120%.
> 
> Heaven starts out at 1974mhz. Then drops to 1962,1949,1936, before settling at 1924. Temps maxed out at 73c. So despite that even with a +100, the card was running at 1924mhz. Why?


That's why we love hybrid coolers and waterblocks.
You need to go set up a fan curve, though; it will help. Since we know she throttles at 60 and then more at 70, I set the fans to kick up to 70% at 58C to fight the throttle, 80% at 70C, and 100% at 85C just in case it ever comes up (it won't). And the thermal paste mod seems to be worth about ~4C as well, maybe as much as 6-7C with Thermal Grizzly Kryonaut.
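The curve above is easy to sanity-check in code. This little sketch linearly interpolates between the fan points from this post; the 30C/35% idle point is an assumption, since the post doesn't say what the fans do below 58C. Afterburner/Precision do roughly the same interpolation between the points you drag on the curve editor.

```python
# Fan points from the post: 70% at 58C, 80% at 70C, 100% at 85C.
# The 30C/35% low end is an assumed idle point, not from the post.
CURVE = [(30, 35), (58, 70), (70, 80), (85, 100)]  # (temp C, fan %)

def fan_percent(temp_c):
    """Linearly interpolate fan speed between curve points, clamped at the ends."""
    if temp_c <= CURVE[0][0]:
        return float(CURVE[0][1])
    for (t1, f1), (t2, f2) in zip(CURVE, CURVE[1:]):
        if temp_c <= t2:
            return f1 + (f2 - f1) * (temp_c - t1) / (t2 - t1)
    return float(CURVE[-1][1])

print(fan_percent(58))  # 70.0, right where the throttling starts
print(fan_percent(64))  # 75.0, halfway between the 58C and 70C points
```

The idea is simply to have the fan ramp land just below each throttle threshold rather than waiting for the default curve to catch up.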


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> My card will do 2000 boost up to 56c then it downclocks hard to 1354?!


woooooooooooooo. that seems odd.


----------



## Slackaveli

Quote:


> Originally Posted by *dboythagr8*
> 
> Up to +180/+325 so far in Heaven
> 
> Settles in at 2012mhz. 98.2 fps @ Heaven @ 1440p. I need those last 2 fps!!!


+180??!! How the hell are y'all getting that much?! Mine sucks!


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Here's my GPU-Z log running fan @100% and playing Wildlands for about 15 minutes on ultra preset @ 1440p
> 
> Thoughts?
> 
> GPU-ZSensorLog.txt 492k .txt file
> 
> 
> 
> Really hopping to get 2050-2100 sustainable once that hybrid cooler arrives. Even @ 100% Fan in a 20C Ambient room it's not enough to prevent throttling.


20C room? What, are you playing in a fridge?


----------



## Slackaveli

Quote:


> Originally Posted by *AMDATI*
> 
> They do that because when the GPU is at 60c, the memory is at 80c. When the GPU is at 80c the memory is at 90c.


eeeee. Sounds like we're crazy to be adding +550 and whatnot to these Micron mems.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> 20c room? what are you playing in a fridge?


Kinda! Midwest winter. It's 1.7C outside, my window is open and I have all the fans in my case @100% and the GPU at fan 100% too while I'm gaming. Trying to eliminate the thermal wall as much as I can.


----------



## RedM00N

Quote:


> Originally Posted by *Slackaveli*
> 
> eeeee. sounds like we crazy to be adding +550 and whatnot to these micron mems.


This was from a TXM review. Their memory broke over 100C lol.


I've clocked the memory modules on my TXMs up to 8500MHz, which was enough to get close to 100C via a thermal gun on the backside VRAM chips when I was stressing back then. Course this is GDDR5 on Maxwell, but GPU memory seems to get toasty at times.


----------



## dboythagr8

Wait...how can you unlock voltage in Afterburner?

Even after I check Unlock Voltage, it's still greyed out? I assume this is normal.


----------



## KraxKill

Quote:


> Originally Posted by *alucardis666*
> 
> Kinda! Midwest winter. It's 1.7C outside, my window is open and I have all the fans in my case @100% and the GPU at fan 100% too while I'm gaming. Trying to eliminate the thermal wall as much as I can.


Dang....I'd set up a little chill box in my window hahaha


----------



## AMDATI

Quote:


> Originally Posted by *alucardis666*
> 
> Kinda! Midwest winter. It's 1.7C outside, my window is open and I have all the fans in my case @100% and the GPU at fan 100% too while I'm gaming. Trying to eliminate the thermal wall as much as I can.


You could set the temp limit lower, you'll get slightly less FPS, but you can run lower fans.


----------



## KedarWolf

Quote:


> Originally Posted by *dboythagr8*
> 
> Wait...how can you unlock voltage in Afterburner?
> 
> Even after I check Unlock Voltage, it's still greyed out? I assume this is normal.


http://www.overclock.net/t/1625653/how-to-get-voltage-slider-in-afterburner-working-on-a-1080-ti/0_20


----------



## alucardis666

Quote:


> Originally Posted by *AMDATI*
> 
> You could set the temp limit lower, you'll get slightly less FPS, but you can run lower fans.


Just trying to see what kind of clocks are possible for benching, and to get more FPS in Wildlands. The game runs like crap and has terrible frame drops due to the card throttling down.


----------



## lilchronic

Quote:


> Originally Posted by *mouacyk*
> 
> Looking at your "Power" graph, it might be throttling due to maxing out power and not a temperature throttle. I've alluded to this before, but this is the part that sucks. Once you beat the temps, you can't beat the power throttle.


Yeah, I think you're right. Time Spy was really throttling it.
Just ran Heaven at 2050MHz and it dropped twice to 2038MHz, but only for a split second each time.
Checked it out in Afterburner: my power limit hit 117% and it throttled to 2038MHz @ 113%.
Other than those 2 drops it stayed at 2050MHz throughout the bench, max temp 34C.
2560x1440


----------



## alucardis666

Quote:


> Originally Posted by *lilchronic*
> 
> Yeah, I think you're right. Time Spy was really throttling it.
> Just ran Heaven at 2050MHz and it dropped twice to 2038MHz, but only for a split second each time.
> Checked it out in Afterburner: my power limit hit 117% and it throttled to 2038MHz @ 113%.
> Other than those 2 drops it stayed at 2050MHz throughout the bench, max temp 34C.
> 2560x1440


Very impressive that it stayed boosted so consistently. Makes me VERY excited to get this under water.


----------



## dboythagr8

Now that I know what I know, I think I'm going to look into a WC setup. Any good guides or videos anybody recommends?

I think, for benchmarking purposes, my OC is going to top out at or close to *+195 Core / +360 Mem*. I got 100.3 fps in Heaven with this, up from 92 fps on stock clocks.


----------



## alucardis666

Quote:


> Originally Posted by *alucardis666*
> 
> Any advice for fitting the EVGA hybrid cooler in my 330r? Wanted to put it at the rear exhaust but I won't have any clearance with the Cryorig, I'm thinking maybe of trying to top mount it or mount it in the front where the HDD cages are, unfortunately neither is a great alternative due to restricted airflow.


Anyone?


----------



## Baasha

has anyone gotten their game codes / emails from Nvidia yet?







I pre-ordered on Mar. 2nd and it's been almost two weeks and nothing yet. I really want to try For Honor and Ghost Recon Wildlands.


----------



## wirefox

sorry for being lazy, I haven't read the whole string.

I have a Zotac ArcticStorm GTX 1080.

This card doesn't come with a factory fan. So...

...I am curious if anyone has added a GTX 1080 EK or other aftermarket waterblock to a 1080 "Ti"?

I want to buy the 1080 Ti, move the Zotac block onto it (it's a great block), put the 1080 Ti's cooler onto the Zotac 1080, and resell that card, obviously at a discount.









massive thanks .wirefox


----------



## AMDATI

Anyone else getting a bug? Not sure what's causing it, maybe sleep. At some point games start underutilizing the GPU, and it applies to all games. The clocks seem fine, but GPU usage is less than 10%. When I reboot it's not a problem anymore, but I'm trying to figure out the cause so I don't have to reboot.


----------



## Maintenance Bot

Quote:


> Originally Posted by *Baasha*
> 
> has anyone gotten their game codes / emails from Nvidia yet?
> 
> 
> 
> 
> 
> 
> 
> I pre-ordered on Mar. 2nd and it's been almost two weeks and nothing yet. I really want to try For Honor and Ghost Recon Wildlands.


Log in to their website and look on the order details page for your order. That's where I found mine.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Luda*
> 
> I run the NF-F12 iPPC 2000 RPM on my thin 360 and the 3000 on my 55mm-thick front radiator, and they get the job done. I doubt changing fans from those would make much of a difference without getting into the realm of insane fans.


I just did some research. Eventually you reach diminishing returns, but before you hit a radiator's maximum heat-dissipation limit, doubling CFM can get you about 180% of the cooling (1.8x).

https://martinsliquidlab.wordpress.com/2012/04/15/alphacool-nexxxos-xt45-360-radiator/4/



The air coming out of my H55 is still pretty hot with my NF-F12 1500, so I know more heat can be pulled out of it; more airflow means more heat carried away. So I ordered the 2000 RPM fans, which will net me an extra 30% in CFM.
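Quick sanity check on those numbers: if doubling CFM buys ~1.8x the dissipation (per the Martin's Liquid Lab data linked above), cooling scales roughly as CFM^0.85, so a 30% CFM bump should be worth about 25% more heat moved. This is a rough power-law fit to one data point; real radiators deviate from it.

```python
import math

# Fit an exponent to the single claim above: 2x CFM -> 1.8x dissipation.
EXP = math.log(1.8) / math.log(2)  # ~0.85

def dissipation_gain(cfm_ratio):
    """Estimated relative cooling gain for a given increase in airflow."""
    return cfm_ratio ** EXP

# A 30% CFM increase (1500 -> 2000 RPM fans, per the post):
print(f"{(dissipation_gain(1.3) - 1) * 100:.0f}% more heat dissipated")
```

So the 2000 RPM upgrade should land noticeably short of its +30% airflow figure in actual cooling, which matches the "diminishing returns" point.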


----------



## blurp

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Login into there website and look for it on your order details page for the order. That's where I found mine.


Nothing for me yet. I'll check daily and report. Thanks for the heads-up.


----------



## Swolern

Got my EVGA FE in today. A quick and dirty OC before work gave me a stable 2050/6000 on stock voltage, throttling to 1950MHz over time. Max temps are 70C with a custom fan curve.

Anyone hitting over 2100 or 2200MHz on the core?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Swolern*
> 
> Got my EVGA FE in today. Quick dirty OC before work gave me a stable 2050/6000 on stock voltage. Throttling to 1950mhz over time. Max temps are 70c with custom fan curve.
> 
> Anyone hitting over 2100 or 2200mhz to the core?


To your last question: not many, if any. Even with extra voltage.


----------



## lilchronic

Quote:


> Originally Posted by *Swolern*
> 
> Got my EVGA FE in today. Quick dirty OC before work gave me a stable 2050/6000 on stock voltage. Throttling to 1950mhz over time. Max temps are 70c with custom fan curve.
> 
> Anyone hitting over 2100 or 2200mhz to the core?


Not really. I'm at 2050MHz as well.


----------



## alucardis666

Quote:


> Originally Posted by *Swolern*
> 
> Got my EVGA FE in today. Quick dirty OC before work gave me a stable 2050/6000 on stock voltage. Throttling to 1950mhz over time. Max temps are 70c with custom fan curve.


Same as me, but an 80C max temp. I don't wanna hear the fan any more than I need to.


----------



## drfouad

I got two GTX 1080 Tis that I need to get rid of


----------



## alucardis666

Quote:


> Originally Posted by *drfouad*
> 
> I got two gtx1080ti that I need to get rid of


Why are you getting rid of them?


----------



## drfouad

well I ended up getting a titan x pascal.
I was going to replace them with these
But.......no


----------



## KraxKill

Quote:


> Originally Posted by *drfouad*
> 
> I got two gtx1080ti that I need to get rid of


Are they radioactive? Maybe try selling them on eBay?


----------



## drfouad

Will do, thanks.
I think having two of them in SLI is asking for trouble.
My wife will kill me!!!
Spending too much money and time on overkill. Last night she asked if I miss her.
Gasp!


----------



## alucardis666

Quote:


> Originally Posted by *drfouad*
> 
> will do thanks
> I think having two of them in sli is asking for trouble
> My wife will kill me!!!\
> Spending too money and time at overkill. Last night she asked if I miss her
> Gasp!


Not to be disrespectful, but if that's your wife in your avatar pic I would do whatever she asked to keep her "happy"
















Also, why not sell one Ti and make some $$$ since they're in short supply and very high demand, then sell your Titan X and OC the remaining Ti? With the money from all that selling you can take Wifey on a vacation or buy a nice gift. Win-win all around, I think.


----------



## chibi

Quote:


> Originally Posted by *alucardis666*
> 
> Also, why not sell one Ti and make some $$$ since they're in short supply and very high demand, and sell your Titan X and OC the one Ti. The money you get for all your selling you can take Wifey on a vacation or buy a nice gift. Win-Win all around I think.


You sir, know how to keep a loved one happy.


----------



## jsutter71

Quote:


> Originally Posted by *drfouad*
> 
> well I ended up getting a titan x pascal.
> I was going to replace them with these
> But.......no


You're the first person I've heard say they prefer the TXP. Not that I disagree with you, because I still feel the TXP is a superior product, but what made you prefer it?


----------



## alucardis666

Quote:


> Originally Posted by *chibi*
> 
> You sir, know how to keep a loved one happy.











Quote:


> Originally Posted by *jsutter71*
> 
> Your the first person I've heard say they prefer the TXP. Not that I disagree with you because I still feel that the TXP is a superior product, but what caused you to prefer the TXP?


Superior how, exactly? It's nearly double the price for 1GB more VRAM, and that VRAM is slower.
Yes, the TXP has more ROPs, but a stock Ti beats it 8 times out of 10, and will beat it 10/10 with the modest overclock any Ti should manage.


----------



## drfouad

1 card no Sli woes
The force is strong with this TXP


----------



## DerComissar

Price aside, I still have a lot of lust for the TXP.
And the 1080Ti, of course, lol.

But if money was no object, I'd love to score an M6000 Pascal Quadro.
One can only dream...........









Edit:
I guess that would be the P6000 Pascal Quadro, lol.
http://hothardware.com/reviews/nvidia-quadro-p6000-and-p5000-workstation-gpu-reviews


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Swolern*
> 
> Got my EVGA FE in today. Quick dirty OC before work gave me a stable 2050/6000 on stock voltage. Throttling to 1950mhz over time. Max temps are 70c with custom fan curve.
> 
> Anyone hitting over 2100 or 2200mhz to the core?


Yeah just don't use Craigslist, lol. It's full of weird cheap people.


----------



## alucardis666

Quote:


> Originally Posted by *drfouad*
> 
> 1 card no Sli woes
> The force is strong with this TXP


Again, you'd also have no SLi woes, and an overall faster card if you stuck with one Ti and overclocked it...


----------



## Quaddamage1080

Got my 1080 Ti today from Fry's; they had two left! Very happy with this card. It runs great, and with a good fan curve it runs pretty cool too! Good upgrade from my 1080.


----------



## keikei

Interesting to see if this card can do 5k.....


----------



## pompss

Right now I'm at +230 on the core, mem +0, on air.

Valley goes from 2088MHz min to 2100.5MHz max. Wall broken, guyzzzz.

I think I got another good one, like my old golden Titan X chip.
















I will keep pushing!!!


----------



## pompss

OK, the max I touched was 2113MHz; max temp was 75C. I think with the waterblock I can reach 2125MHz to 2150MHz.

With a mod BIOS and some voltage mod, maybe 2200 in the future.









Great Card







Will not push it further without putting the card under water this weekend


----------



## KraxKill

Quote:


> Originally Posted by *pompss*
> 
> Right now im at 230+ on the core mem+0 on air.
> 
> Valley goes from 2088mhz min to max 2100.5 mhz. Wall broken Guyzzzz
> 
> I think i got another good one like my old golden Titan X chip
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i will keep pushing !!!


I can spin Valley pegged at 2100, but can you do Heaven? Firestrike? Heaven is the easiest of all the benches in my book. Please post your GPU-Z graph over time.

Otherwise we're all gonna be suspect. You may have touched 2100, but I seriously doubt the card stayed there.

This is what a valid OC should look like..others may disagree.



http://www.3dmark.com/fs/12007457


----------



## Swolern

Quote:


> Originally Posted by *pompss*
> 
> Ok the max i touched was 2113mhz max temp was 75c. I think with the waterblock i can reach 2125mhz to 2150mhz.
> 
> With mod bios and some voltage mod maybe 2200 in the future
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Great Card
> 
> 
> 
> 
> 
> 
> 
> Will not push it further without putting the card under water this weekend


Nice clocks!







Will be awaiting WC results.

Was a memory OC with that as well?


----------



## rcasey02

Got my EVGA Founders Edition in today, and the highest OC I was able to get was 2050 on the core and 6000 on the memory, hitting a max of 75C with a custom fan curve. So far I'm loving this upgrade over my 980 Ti. Got an EK block on back order through Performance PCs, so hopefully that'll come in soon; maybe it'll let the card push 2100, but idk. I'm happy so far. Seems like most of the EVGA cards are maxing out at 2050, kinda interesting.


----------



## AMDATI

Personally, I think everything is a bit overzealous on the boost clocks. If I set a 75C max temp with 50% fan, the boost clock is still a steady ~1800MHz. I can live with a few less FPS in exchange for lower fan speeds and temps.


----------



## pompss

Quote:


> Originally Posted by *KraxKill*
> 
> I can spin Valley pegged at 2100, but can you do Heaven? Please post GPU-Z graph over time. Otherwise we're all gonna be suspect.
> 
> Like this...
> 
> 
> 
> http://www.3dmark.com/fs/12007457


Lol, I'm not that kind of guy you think I am, but yes, here's the proof you're looking for.









Graphics score is low because of the light mem overclock, but I'm working on it.

http://www.3dmark.com/3dm/18620560? 2100MHz core in 3DMark


----------



## KraxKill

Quote:


> Originally Posted by *AMDATI*
> 
> Personally, I think everything is a bit overzealous on the boost clocks. If I set everything to 75c max temp, with 50% fan, the boost clock is still about 1800mhz steady. I can live with a few less FPS in exchange for lower fan speeds and temps.


I would rather have a card at a solid 1800 and not throttling like a nerd with a weed whacker. It's a ton better than the momentary boost wonder most people claim as an "overclock". Not to mention the erratic throttling and frame judder that go with excessive bin hopping.


----------



## pompss

Quote:


> Originally Posted by *Swolern*
> 
> Nice clocks!
> 
> 
> 
> 
> 
> 
> 
> Will be awaiting WC results.
> 
> Was a memory OC with that as well?


Thanks.

Mem was not; working on it right now.


----------



## KraxKill

Quote:


> Originally Posted by *pompss*
> 
> Lol.. im not that kind of guy you think but yes here are the proof you looking
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Graphic score are low bc of mem light overclock but working on it.
> 
> http://www.3dmark.com/3dm/18620560? 2100mhz core on 3d mark


That GPU graph is impossible to see, but from what you posted you did not sustain 2100+. Grab just the GPU-Z graph for your screenshot. It appears your clock was at 2100 for a moment but then downclocked? Unless the top of the graph is 2200+.



Check your log file. Run Heaven. Good luck.


----------



## jsutter71

Quote:


> Originally Posted by *alucardis666*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Superior how exactly? It's nearly double the price for 1gb more of VRAM. the VRAM is also slower.
> Yes TXP has more ROPs but the stock Ti beats it 8/10 and will beat it 10/10 with a modest overclock that any Ti should be able to do.


Before you jump to conclusions, I am not trashing the 1080 Ti. But don't underestimate the TXP. Before I answer your question, I'd like to know if you have any experience overclocking the TXP, and if so, does that include an SLI system under water?

If you have experience overclocking the TXP, then you know it is very easy to overclock its VRAM past the point where it improves frame rates and performance. The 1080 Ti's memory speed advantage over the TXP melts away the more you overclock.

The advantage of the 1080 Ti, in a direct quote from pcgamesn.com:
"The new 7-phase dual-FET design spreads the power load more than the Titan X and allows for less power leakage (realised in GPU terms as heat) and a cleaner supply of juice. This is partly how Nvidia have been able to ship the standard GTX 1080 Ti cards with a higher out-of-the-box boost clock than the Titan X."

So Nvidia essentially improved the card's power distribution and memory speeds in order to match the TXP's performance, while at the same time cutting ROPs from 96 to 88 and the L2 cache from 3072 to 2816.

Also, the thermal threshold of the chip has actually been lowered from its Titan X days. The Titan X only starts throttling when the silicon tops 94°C, while the GTX 1080 Ti starts reining in its GPU at 91°C. For people who like to push their cards to the limit, that means less headroom to overclock.

So back to your question: why is the TXP superior? If you're under water and pushing very high resolutions, 5K or greater, the TXP is better equipped.

And for people who do not overclock or water cool, or who already own a TXP, the answer is simple.


----------



## pompss

Quote:


> Originally Posted by *KraxKill*
> 
> That GPU graph is impossible to see. But from what you posted you did not sustain 2100+ Grab just the GPU-Z graph for your screen shot. It appears that your clock was at 2100 for a moment but downclocked? Unless the top of the graph is 2200+.


Dude, just copy the link of the image and open it in another window; you'll see it big.
In 3DMark the core was 2075MHz min to 2113MHz max.

Follow this link and see if you can reach 2100MHz in 3DMark with your card.

Good luck to you









http://www.3dmark.com/3dm/18620771?


----------



## KraxKill

Quote:


> Originally Posted by *KraxKill*
> 
> That GPU graph is impossible to see. But from what you posted you did not sustain 2100+ Grab just the GPU-Z graph for your screen shot. It appears that your clock was at 2100 for a moment but downclocked? Unless the top of the graph is 2200+.
> 
> 
> 
> This is what a solid core clock looks like..
> 
> Check you log file. Run heaven. Good luck.


Quote:


> Originally Posted by *pompss*
> 
> Dude just copy the link of the image and add it to another windows you will see it big.
> In 3d mark the core was min 2075mhz to 2113 mhz max.
> 
> Follow this link see if you can reach 2100mhz in 3d mark with your card.
> 
> good luck to you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/18620771?


You gotta sustain 2100 to say you got 2100, in my book. But if you're calling that good, more power to you. Loop Heaven now. Good luck lol.

This is what a sustained clock graph looks like. If you're going to claim to be the first 2100 OC, let's see some proof. And more power to you if you got there.


----------



## pompss

Quote:


> Originally Posted by *KraxKill*
> 
> You gotta sustain 2100 to say you got 2100 in my book. But if you're calling that good more power to you. Loop heaven now. Good luck lol


Lets see if you can get 3d mark score with the link that show 2100mhz.

Then come back and we can talk

Good night !!


----------



## KickAssCop

Picked one card up for now. I don't think I will SLI FE cards. One card sound starts to bother me a bit once it goes past 55% fan speed. Normally stays at 48% fan w/ 84 C max load temps. Pisses the crap out of me since I see it throttle down to like 1800 clocks. Will test some more.


----------



## KraxKill

Quote:


> Originally Posted by *pompss*
> 
> Lets see if you can get 3d mark score with the link that show 2100mhz. 3d mark link its my proof i dont need to show nothing
> 
> 2100 mhz on the core wall broken thats it. All to the end. Bum!!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Have good night !
> 
> http://www.3dmark.com/3dm/18620771?


You gotta sustain 2100 to say you got 2100, in my book. But if you're calling that good, more power to you. Loop Heaven now. Good luck lol.

If you're going to claim to be *the first 2100 OC*, "wall broken" and such, let's see some proof. And more power to you if you got there.

But for now it's a momentary boost clock you're displaying, until you can show a sustained boost graph. I have serious doubts it's possible at the temps you're seeing, but I've been proven wrong before.

Congrats on an awesome OC either way.

This is what a sustained clock graph should look like looping Heaven.


----------



## MURDoctrine

Well, Newegg/FedEx got my card here a day early, so that was a pleasant surprise to come home from work to. Anyway, this thing is a beast compared to my 980.

*980 EVGA SC Stock*



*EVGA 1080Ti FE stock*


----------



## pompss

Quote:


> Originally Posted by *KraxKill*
> 
> You gotta sustain 2100 to say you got 2100 in my book. But if you're calling that good more power to you. Loop heaven now. Good luck lol
> 
> If you're going to claim to be *the first 2100 OC*, "wall broken" and such lets see some proof.[/B] And more power to you if you got there. But for now it's a momentary boost clock until you can show a sustained boost graph. I have serious doubts it's possible at those temps you're seeing. I've been proven wrong before.
> 
> Congrats on an awesome OC either way.
> 
> This is what a sustained clock graph should look like looping heaven.


Dude, are you serious??

The 3DMark run went all the way to the end and recorded 2100 MHz on the core.

Did you see the link?? Sustained or not, I reached 2100.5 MHz and broke the wall, period.

That's it!!

Now let's see who scores more than 2100 MHz on the core, on air, in 3DMark and posts the link to the score.

Do you really want me to make a video??


----------



## pompss

Running heaven right now


----------



## KickAssCop

3DMark has clock variations due to loading and isn't the right measure for sustained clocks. To call 2100 MHz sustained, you need to run something like Witcher 3 for about 15 minutes, or loop Heaven/Valley for 15 minutes, after which most cards stabilize on clocks/temperatures.
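If you want more than an eyeball check of a clock graph, you can log the clock once a second while a benchmark loops and judge the run by its minimum, not its peak. This is only a sketch: on a real system you'd feed it samples from something like `nvidia-smi --query-gpu=clocks.sm --format=csv,noheader,nounits -l 1`, and the 60-second warm-up and the sample log below are made-up examples.

```python
def sustained_clock(samples_mhz, warmup=60):
    """Return (min, avg) clock over the samples taken after the warm-up
    period, i.e. once temperatures have settled. One sample per second."""
    settled = samples_mhz[warmup:] if len(samples_mhz) > warmup else samples_mhz
    return min(settled), sum(settled) / len(settled)

# Example: a card that flashes 2100 MHz early but settles at 2050 MHz
# over a ~15 minute loop.
log = [2100] * 30 + [2088] * 30 + [2050] * 840
low, avg = sustained_clock(log)
print(low, round(avg))  # the sustained figure is 2050, whatever the peak said
```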


----------



## Gunslinger.

2100 constant right here


----------



## pompss

Quote:


> Originally Posted by *KickAssCop*
> 
> 3DMark has clock variations due to loading and isn't the right measure for sustained clocks. To call 2100 MHz sustained, you need to run something like Witcher 3 for about 15 minutes, or loop Heaven/Valley for 15 minutes, after which most cards stabilize on clocks/temperatures.


The wall is broken anyway, and of course the card will drop to 2050 MHz min / 2100 MHz max.

To stabilize and hold a constant 2100 MHz you need water cooling, and maybe a modded BIOS with the power limit unlocked.

My point is that this card reaches 2100 MHz on air without crashing, and that's it. Wall broken. The only thing to do now is put a waterblock on it and wait for a modded BIOS.

If anyone else can do 2100 MHz, just post a link with the 3DMark score and we're all set.







Jeez


----------



## pompss

Valley: 2088 MHz min to 2100 max.

It drops to 2088 MHz after hitting 70 C on air. For me it's stable enough on air.

Once I put a waterblock on it, I'm sure I'll see 2100 MHz stable all the time, or maybe more, in gaming.









Have good night folks:thumb:


----------



## AstinGC90

GTX 1080 Ti Founders and TXP block/backplates shipped. Getting keen :-D


----------



## AMDATI

Quote:


> Originally Posted by *KraxKill*
> 
> I would rather have a card at a solid 1800 and not throttling like a nerd with a weed whacker. It's a ton better than some momentary boost wonder most people claim as an "overclock". Not to mention the executive throttling and frame judder that goes with excessive bin hopping.


Yeah, when you suddenly go from 1500 to 1900 boost back and forth, the sudden FPS spikes/drops are noticeable even at high FPS. 50% fan speed is pretty good, sound-wise.

The problem with boost is that if you don't limit temps, I think it can actually throttle below stock clocks: it runs hot at boost clocks, drops clocks to get temps back in check, the temps fall, the clocks rise again and quickly heat things back up, and then it all drops down again.

These settings seem really stable. Since the temp target is well below the 80 C / 50% fan-speed cutoff, the card throttles well before that point, which keeps temps and fan speed steady.
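The flapping described above is what a feedback loop does without hysteresis. Here's a toy sketch of a temp-target controller with a dead band; the 13 MHz step matches Pascal's clock-bin size, but the band and the control logic are invented for illustration, not NVIDIA's actual algorithm.

```python
PASCAL_BIN_MHZ = 13  # Pascal adjusts clocks in ~13 MHz bins

def boost_step(clock_mhz, temp_c, target_c, band_c=3):
    """One control tick: drop a bin above the temp target, climb a bin when
    comfortably below it, and hold inside the dead band so the clock does
    not bounce between bins every tick."""
    if temp_c > target_c:
        return clock_mhz - PASCAL_BIN_MHZ   # too hot: back off one bin
    if temp_c < target_c - band_c:
        return clock_mhz + PASCAL_BIN_MHZ   # cool: claw back one bin
    return clock_mhz                        # inside the band: hold steady

print(boost_step(1900, 75, 70))  # over target: 1887
print(boost_step(1900, 60, 70))  # well under: 1913
print(boost_step(1900, 68, 70))  # in the band: 1900
```

Keeping the temp target well below the 80 C fan-curve knee, as in the post above, means the card settles inside the band instead of bouncing off the limit.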


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Kinda! Midwest winter. It's 1.7C outside, my window is open and I have all the fans in my case @100% and the GPU at fan 100% too while I'm gaming. Trying to eliminate the thermal wall as much as I can.


Sittin' there in your robe with snow blowin' in, overclocking. A man after my own heart!!


----------



## Slackaveli

Quote:


> Originally Posted by *Baasha*
> 
> has anyone gotten their game codes / emails from Nvidia yet?
> 
> 
> 
> 
> 
> 
> 
> I pre-ordered on Mar. 2nd and it's been almost two weeks and nothing yet. I really want to try For Honor and Ghost Recon Wildlands.


No, but I called today and the rep said, "I assure you that you will receive them; they haven't gone out yet, but we don't have an ETA."

An NVIDIA mod on the NVIDIA forum said it would be 'sometime this week'.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Gunslinger.*
> 
> 2100 constant right here


Awesome! congrats.


----------



## Slackaveli

Quote:


> Originally Posted by *Swolern*
> 
> Got my EVGA FE in today. Quick dirty OC before work gave me a stable 2050/6000 on stock voltage. Throttling to 1950mhz over time. Max temps are 70c with custom fan curve.
> 
> Anyone hitting over 2100 or 2200mhz to the core?


Your "quick and dirty OC" is the OC. That's all there is to it.


----------



## KraxKill

Quote:


> Originally Posted by *pompss*
> 
> The wall is broken anyway, and of course the card will drop to 2050 MHz min / 2100 MHz max.
> 
> To stabilize and hold a constant 2100 MHz you need water cooling, and maybe a modded BIOS with the power limit unlocked.
> 
> My point is that this card reaches 2100 MHz on air without crashing, and that's it. Wall broken. The only thing to do now is put a waterblock on it and wait for a modded BIOS.
> 
> If anyone else can do 2100 MHz, just post a link with the 3DMark score and we're all set.
> 
> 
> 
> 
> 
> 
> 
> Jeez


Point proven. Wall NOT broken.


----------



## KraxKill

Quote:


> Originally Posted by *Gunslinger.*
> 
> 2100 constant right here


That's more like it. Those are some low temps. Nice OC!


----------



## Slackaveli

Quote:


> Originally Posted by *pompss*
> 
> Valley: 2088 MHz min to 2100 max.
> 
> It drops to 2088 MHz after hitting 70 C on air. For me it's stable enough on air.
> 
> Once I put a waterblock on it, I'm sure I'll see 2100 MHz stable all the time, or maybe more, in gaming
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Have good night folks:thumb:


+220?!?! Wow, dude


----------



## Slackaveli

Quote:


> Originally Posted by *AMDATI*
> 
> Yeah, when you suddenly go from 1500 to 1900 boost back and forth, the sudden FPS spikes/drops are noticeable even at high FPS. 50% fan speed is pretty good, sound-wise.
> 
> The problem with boost is that if you don't limit temps, I think it can actually throttle below stock clocks: it runs hot at boost clocks, drops clocks to get temps back in check, the temps fall, the clocks rise again and quickly heat things back up, and then it all drops down again.
> 
> These settings seem really stable. Since the temp target is well below the 80 C / 50% fan-speed cutoff, the card throttles well before that point, which keeps temps and fan speed steady.


Best way is to just set the frame limiter: 4K, 60 FPS cap, V-sync off. Done. The card never throttles; 60 locked.


----------



## Drakeskull

Got it today! Haven't touched the OC yet, but it was boosting to 1810 on its own, at around 65 C. I'll do more testing tomorrow to see.


----------



## AMDATI

I picked For Honor as my free game, and I have to say, it's a really good-looking game.
Quote:


> Originally Posted by *Slackaveli*
> 
> Best way is to just set the frame limiter: 4K, 60 FPS cap, V-sync off. Done. The card never throttles; 60 locked.


Well that would be fine....if I wasn't running a 144hz monitor







But I do frame limit to 144. In demanding games the FPS still doesn't always reach the full 144, but I'm happy with 90+ FPS.


----------



## pez

Quote:


> Originally Posted by *SlimJ87D*
> 
> Questions regarding radiator fans.
> 
> I have Noctua NF-F12 fans, but I feel like they're not pushing enough hot air through my H55. If I got the 2000 or 3000 RPM fans ( I know they're loud at 3000 but I'd control them), would it make a major difference?
> 
> https://www.amazon.com/gp/offer-listing/B00KFCRIQM/ref=sr_1_1_olp?s=pc&ie=UTF8&qid=1489593320&sr=1-1&keywords=noctua+nf-f12+2000+pwm


Quote:


> Originally Posted by *SlimJ87D*
> 
> I have the 1500, I wonder if that extra 500 RPM help.


Quote:


> Originally Posted by *SlimJ87D*
> 
> My H55 is also acting as an exhaust, so I was thinking I need to also exhaust out the power from the Mobo.
> 
> I'll try putting it as an intake and see how much of a difference that makes.


Setting your H55 to intake will be best for temps, but in the end, you're going to be limited by the radiator/CLC itself. It's a rather small/slim radiator with a moderate fin density. You'll hit a point where you don't see any improvement since you'll cool the system more than the water/liquid can transfer the heat to the radiator.
Quote:


> Originally Posted by *Baasha*
> 
> Talking about cooling - I have these cards (4x of them) installed in a Corsair Carbide Air 540 case.
> 
> What are the absolute best *intake* fans to use for airflow? These 4 cards are throwing a LOT of heat (of course, they get pushed out the back of the case which is nice) and so my Kraken X41 which has only one 140mm fan as a 'pull' config (on top) is struggling to keep my CPU cool - I was seeing 90C on the CPU last night! o_0
> 
> I was thinking of picking up the Noctua NF-A14 iPPC 3000 PWM fans for the Kraken X41 but I need fans for intake in the front of the case - right now I'm just using 2x Corsair LED 120mm fans - they don't push/pull a lot of air.
> 
> I was also considering ditching the Kraken X41 for the Kraken X62 - anyone have experience with that?


Quote:


> Originally Posted by *Luda*
> 
> Save the money on upgrading fans and put it towards the x62. The additional radiator space will net you MUCH lower temps, a single 140 on that proc is WAY under spec'd.
> 
> 1)pics of the 4x cards installed?
> 2)how much do you care about noise?


Going to a new radiator in his situation will help, but those LED Corsair fans should be replaced stat.

Personally, I'd recommend Deltas, or even the Noctua Industrials, if noise and cost aren't factors.
Quote:


> Originally Posted by *alucardis666*
> 
> Tried it, no major changes.
> 
> 
> 
> 
> 
> 
> 
> 1-2 frames higher at minimum. 1440p is still the best way to play


Ah, thank you for trying anyhow







. +rep


----------



## alucardis666

Quote:


> Originally Posted by *jsutter71*
> 
> Before you jump to conclusions I am not trashing the 1080Ti. But don't underestimate the TXP.
> 
> If you have experience overclocking the TXP then you would know it's very easy to overclock the VRAM past the point where it improves frame rates and performance. The improved memory speeds compared to the TXP melt away the more you overclock.
> 
> Why is the TXP superior? If you're under water and pushing very high resolutions, 5K or greater, the TXP is better equipped.
> 
> Now, for people who do not overclock, water cool, or already own a TXP, the answer is simple.


Sorry if I stepped on your toes or insulted your logic at all. I was just curious as to your thoughts.
Quote:


> Originally Posted by *Slackaveli*
> 
> Sittin there in your robe with snow blowin' in, overclocking. A man after my own heart!!



Quote:


> Originally Posted by *Slackaveli*
> 
> Best way is to just set the frame limiter: 4K, 60 FPS cap, V-sync off. Done. The card never throttles; 60 locked.


Can you elaborate? I don't know what you mean by "frame limiter". I have V-sync on always; is this wrong?


----------



## alucardis666

Quote:


> Originally Posted by *pez*
> 
> Ah, thank you for trying anyhow
> 
> 
> 
> 
> 
> 
> 
> . +rep


Thanks man!


----------



## dboythagr8

If anyone is curious, Mass Effect Andromeda 10 hr trial is up on Origin Access. I played for about a hour.

Setup

Driver Set - 378.78

2560x1440 w/ GSYNC
4930k @ 4.5ghz
16GB RAM
1080Ti @ 1924/5805
Windows 10
Frame Buffer HALF16

All settings Ultra

Avg - 94fps

I had some initial trouble getting the game to launch, kept going to a black screen. But after that was resolved, performance has been smooth.


----------



## alucardis666

Quote:


> Originally Posted by *dboythagr8*
> 
> If anyone is curious, Mass Effect Andromeda 10 hr trial is up on Origin Access. I played for about a hour.
> 
> Setup
> 
> Driver Set - 378.78
> 
> 2560x1440 w/ GSYNC
> 4930k @ 4.5ghz
> 16GB RAM
> 1080Ti @ 1924/5805
> Windows 10
> Frame Buffer HALF16
> 
> All settings Ultra
> 
> Avg - 94fps
> 
> I had some initial trouble getting the game to launch, kept going to a black screen. But after that was resolved, performance has been smooth.


Awesome! Downloading now!


----------



## Swolern

For Honor or Ghost Recon: Wildlands? I don't have either one.


----------



## Radox-0

Quote:


> Originally Posted by *jsutter71*
> 
> Before you jump to conclusions I am not trashing the 1080Ti. But don't underestimate the TXP. Before I answer your question I would like to know if you have any experience with overclocking the TXP, and if so does that include a SLI system under water?
> 
> If you have experience overclocking the TXP then you would know that it is very easy to overclock the VRAM past the point where it improves frame rates and performance. The improved memory speeds compared to the TXP melt away the more you overclock.
> 
> The advantage of the 1080 Ti, in a direct quote from pcgamesn.com:
> "The new 7-phase dual-FET design spreads the power load more than the Titan X and allows for less power leakage (realised in GPU terms as heat) and a cleaner supply of juice. This is partly how Nvidia have been able to ship the standard GTX 1080 Ti cards with a higher out-of-the-box boost clock than the Titan X."
> 
> So Nvidia essentially improved the card's power distribution and memory speeds in order to match the TXP's performance, while at the same time cutting ROPs (96 to 88) and cutting the L2 cache (3072 KB to 2816 KB).
> 
> Also, the thermal threshold of the chip has been lowered from its Titan X days. The Titan X only starts throttling its chip when the silicon tops 94°C, while the GTX 1080 Ti starts reining in its GPU at 91°C. For people who like to push their cards to the limit, that means less headroom to overclock.
> 
> So back to your question. Why is the TXP superior? If you're under water and pushing very high resolutions, 5K or greater, the TXP is better equipped.
> 
> Now for people who do not overclock, water cool, or already own a TXP, the answer is simple.


I agree with a fair bit of your post, but not other parts. I definitely agree that if you already have a TXP you should keep it, unless you can sell it for a stellar price. Also, the hype and marketing painted the 1080 Ti as killing the TXP, which is really a load of crap; for the most part, at stock they are similar, or slightly in favour of the 1080 Ti due to its higher clocks, as you pointed out.

The 1080 Ti can make up the memory-overclock gap too: push it to 12 GHz or so and it recovers the delta you get from overclocking the Titan XP's memory. 91 vs 94 degrees will not make a difference, since both cards throttle hard well before then; those are the maximum GPU temp limits, and boost drops far earlier on both. And if you're hitting those temps, you're on air, which raises the question of why you wouldn't just get a 1080 Ti AIB card, which can sustain higher clocks and once again outperform the TXP for less money.

Likewise, even for people into overclocking and watercooling, the 1080 Ti mostly makes more sense. It may not absolutely match the TXP at the top end, but a percent or so less performance (depending on the chip) for nearly half the price is considerable IMO, and more than makes up for it for 99% of people. The last 1% want the best at any price, and no doubt that 1% is well represented on this site; it is indeed OCN, with overclock in the name


----------



## AMDATI

ME: Andromeda bugs in a nutshell...


----------



## vmanuelgm

Quote:


> Originally Posted by *Gunslinger.*
> 
> 2100 constant right here


LN2 session??? xDDD

Now seriously, post some benches at max with that beast. I want to see good numbers, and I'd guess you're one of the best at that!!!


----------



## KraxKill

Quote:


> Originally Posted by *vmanuelgm*
> 
> LN2 session??? xDDD
> 
> Now seriously, post some benches at max with that beast. I want to see good numbers, and I'd guess you're one of the best at that!!!


Does that say Titan X Pascal in GPUZ?


----------



## GreedyMuffin

Will be joining you guys soon!

Had a 980 I got for free; it died, but I got a full refund! That's 600 USD. Also sold my 1080 today.

Just wondering if I should put the 1080 Ti under water. I have an XTX360 + XE240 with an AMD Ryzen 1700. That should be plenty of rad space, right?


----------



## vmanuelgm

Quote:


> Originally Posted by *Gunslinger.*
> 
> 2100 constant right here


Quote:


> Originally Posted by *KraxKill*
> 
> Does that say Titan X Pascal in GPUZ?












Yep, I thought it was a Ti...


----------



## alucardis666

Quote:


> Originally Posted by *vmanuelgm*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yep, I thought it was a Ti...


He cheated...









*Side note:*

WOW! Mass Effect is DEMANDING!!!! It makes my CPU run up to 78 C!?! And my 1080 Ti is artifacting with memory over +400, where it hasn't done that anywhere else. The card is also getting up to the mid 70s with the fan set to 100%.

This is all at 4K with the Ultra preset, btw. I really need this card under water to kill the noise, be able to hear my game, and get the most performance out of it. On the upside, even with some pretty nasty thermal throttling, frames stay above 50 FPS.


----------



## Quaddamage1080

Ghost Recon is one of the most fun games I've played in a while.


----------



## Mato87

I have a question for those of you who bought the Founders Edition 1080 Tis: did they come with a DVI adapter or not? I also read that the non-reference models will have a dedicated DVI output; is that true? Like the Zotac GTX 1080 Ti that was announced a few days ago. Now I'm looking forward to my Gigabyte GTX 1080 Ti AORUS Xtreme Edition even more









I'm asking because I have a 3D monitor without a DP input, only dual-link DVI-D, and I'm not going to change this beauty anytime soon. Recently I played through Rise of the Tomb Raider in 3D and it was freaking cool.


----------



## doogk

It comes with an adapter, but mine didn't work; I think the adapter doesn't work for a lot of people. You need an active adapter, and those cost a lot. I ended up having to buy a new monitor. Sucks, but I needed to upgrade anyways.


----------



## MURDoctrine

Quote:


> Originally Posted by *Mato87*
> 
> I have a question for those of you who bought the Founders Edition 1080 Tis: did they come with a DVI adapter or not? I also read that the non-reference models will have a dedicated DVI output; is that true? Like the Zotac GTX 1080 Ti that was announced a few days ago. Now I'm looking forward to my Gigabyte GTX 1080 Ti AORUS Xtreme Edition even more
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm asking because I have a 3D monitor without a DP input, only dual-link DVI-D, and I'm not going to change this beauty anytime soon. Recently I played through Rise of the Tomb Raider in 3D and it was freaking cool.


They come with a DP-to-DVI adapter, but it isn't active. According to what others have said, it will only work up to 1080p 60 Hz. If your display runs higher than that, you'll need one of the active adapters that draw power from a USB/DC source; the ones people have linked are around $90-$100 USD. Mine is currently attached to my Dell U2312HM, which is 1080p 60 Hz, and it seems to be working fine.
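The 1080p 60 Hz ceiling is a bandwidth limit: a passive adapter just re-routes one TMDS link, which tops out at a 165 MHz pixel clock, while dual-link DVI allows 330 MHz; anything past single-link needs an active adapter that actually converts the signal. A rough back-of-the-envelope check (raw active pixel rates, ignoring the blanking overhead that real timings add on top):

```python
SINGLE_LINK_MHZ = 165.0  # DVI single-link TMDS pixel-clock limit
DUAL_LINK_MHZ = 330.0    # dual-link DVI limit

def raw_pixel_rate_mhz(width, height, refresh_hz):
    """Raw active-pixel rate; real modes add blanking intervals on top."""
    return width * height * refresh_hz / 1e6

print(raw_pixel_rate_mhz(1920, 1080, 60))   # ~124 MHz: fits a single link
print(raw_pixel_rate_mhz(1920, 1080, 144))  # ~299 MHz: dual-link territory
```

That's why the bundled passive adapter handles a 1080p 60 Hz panel fine, but a 144 Hz dual-link monitor needs the expensive active adapter.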


----------



## Mato87

Quote:


> Originally Posted by *doogk*
> 
> It comes with an adapter but mine didnt work, I think the adapter doesnt work for a lot of people. You need an active adapter or whatever and they cost alot. I ended up having to buy a new monitor, sucks but needed to upgrade anyways.


Oh damn, that's really bad news. So you buy a new graphics card worth 830 euros and then have to buy an active DP-to-DVI adapter for another 50 euros? That's mad; they should've included an active adapter, imo... Now it's even more reasonable to just wait for the non-reference models.
Quote:


> Originally Posted by *MURDoctrine*
> 
> They come with a DP-to-DVI adapter, but it isn't active. According to what others have said, it will only work up to 1080p 60 Hz. If your display runs higher than that, you'll need one of the active adapters that draw power from a USB/DC source; the ones people have linked are around $90-$100 USD. Mine is currently attached to my Dell U2312HM, which is 1080p 60 Hz, and it seems to be working fine.


Damn, that surely sucks. Not everyone has a new monitor with a DP input. Mine is an Asus VG278HR, a 144 Hz 3D monitor; I'm glad I didn't buy the Founders Edition when I had the chance. I'll gladly wait another month or two until the non-reference models with the necessary DVI output built in are available to purchase.


----------



## Joshwaa

Quote:


> Originally Posted by *KraxKill*
> 
> That's more like it. Those are some low temps. Nice OC!


Looks like that is a Titan, not a Ti.


----------



## Drakeskull

Quote:


> Originally Posted by *Drakeskull*
> 
> Got it today! Haven't touched the OC yet, but it was boosting to 1810 on its own, at around 65 C. I'll do more testing tomorrow to see.


So with no offsets and a higher fan curve, it stays around 1822-1860 boost and 65 C through most of the testing I've done. The fan might be a little loud for some, but I can't hear it where my PC sits.

Boostspeed.JPG 237k .JPG file


Capture.JPG 263k .JPG file


----------



## pangallosr

Quote:


> Originally Posted by *Mato87*
> 
> I have a question for those of you who bought the Founders Edition 1080 Tis: did they come with a DVI adapter or not? I also read that the non-reference models will have a dedicated DVI output; is that true? Like the Zotac GTX 1080 Ti that was announced a few days ago. Now I'm looking forward to my Gigabyte GTX 1080 Ti AORUS Xtreme Edition even more
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm asking because I have a 3D monitor without a DP input, only dual-link DVI-D, and I'm not going to change this beauty anytime soon. Recently I played through Rise of the Tomb Raider in 3D and it was freaking cool.


Every unboxing on YT that I have viewed has the adapter.


----------



## Mato87

Quote:


> Originally Posted by *pangallosr*
> 
> Every unboxing on YT that I have viewed has the adapter.


Good to know, but it's very disappointing to discover it can't do more than full HD at 60 Hz, which is a fairly disgraceful move by Nvidia.


----------



## pompss

This is the second time I've bought a Zotac reference card and it's come up really good in overclocking.







Coincidence?


----------



## pompss

Quote:


> Originally Posted by *KraxKill*
> 
> Point proven. Wall NOT broken.


You have no point.

The card throttles down on air because the temperature reaches 70 C; that's normal. With a waterblock it will hold 2100 stable, or maybe 2150 MHz.

Passing 3DMark and Valley at +230 on the core, on air, without crashing is the point you've missed all this time.

2100 MHz on the core, on air, no crash!! WALL broken.

Now you can go hide.


----------



## Joshwaa

Quote:


> Originally Posted by *pompss*
> 
> You have no point.
> 
> The card throttles down on air because the temperature reaches 70 C; that's normal. With a waterblock it will hold 2100 stable, or maybe 2150 MHz.
> 
> Passing 3DMark and Valley at +230 on the core without crashing is the point you've missed all this time.
> 
> 2100 MHz on the core, no crash!! WALL broken.
> 
> Now you can go hide.


It didn't hold it, but that's the closest I've seen thus far. Good job. I should have my mobo today, so hopefully I can beat that on air before I put the EK block on it.


----------



## pompss

Quote:


> Originally Posted by *Joshwaa*
> 
> It did not hold it but that is the closest I have seen thus far. Good job. I should have my MoBo today so hopefully I can beat that on air before I put the EK block on it.


It held 2100 MHz for half the Valley bench.

It can't hold it the whole way because the temp reaches 70 C halfway through, but the most important thing is that it didn't crash and ran all the way to the end. With a waterblock it will hold 2100 MHz, 100%.

Some people just don't get it, or they're just jealous and looking for excuses or demanding proof. Pathetic, in my opinion.


----------



## Joshwaa

Quote:


> Originally Posted by *pompss*
> 
> It held 2100 MHz for half the Valley bench.
> 
> It can't hold it the whole way because the temp reaches 70 C halfway through, but the most important thing is that it didn't crash and ran all the way to the end. With a waterblock it will hold 2100 MHz, 100%.
> 
> Some people just don't get it, or they're just jealous and looking for excuses or demanding proof. Pathetic, in my opinion.


With how well it was doing there, I would think you'd be able to hold 2126 MHz or better under water. Have you tried seeing if you can get it to hold using the curve OC and dropping the volts down a little? That might give you the extra 1 C of headroom it needs not to downclock during the run.
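The curve trick being suggested here (Afterburner's voltage/frequency curve editor) amounts to pinning every point above a chosen voltage to one clock, so the card never asks for more volts than that. A toy model of the idea, with invented voltage/clock pairs rather than anything from a real card:

```python
def flatten_curve(vf_points, max_mv):
    """Clamp a sorted (millivolt, MHz) curve: every point above max_mv is
    pinned to the clock at max_mv, so the GPU never requests more voltage."""
    cap_clk = max(clk for mv, clk in vf_points if mv <= max_mv)
    return [(mv, clk if mv <= max_mv else cap_clk) for mv, clk in vf_points]

curve = [(900, 1911), (950, 1974), (1000, 2025), (1050, 2088), (1093, 2126)]
print(flatten_curve(curve, 1000))  # everything above 1000 mV pinned to 2025
```

Less voltage at load means a degree or two lower temps, which is often exactly the margin that keeps the top bin from dropping mid-run.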


----------



## pompss

Quote:


> Originally Posted by *Joshwaa*
> 
> With how well it was doing there, I would think you'd be able to hold 2126 MHz or better under water. Have you tried seeing if you can get it to hold using the curve OC and dropping the volts down a little? That might give you the extra 1 C of headroom it needs not to downclock during the run.


Thats a good idea









thanks will try it later today.


----------



## keikei

So what games are ppl playing with their fancy super duper fast card?


----------



## Joshwaa

Quote:


> Originally Posted by *keikei*
> 
> So what games are ppl playing with their fancy super duper fast card?


World of Tanks, lol. I think it uses about 45% of my 1080 GPU at fully maxed out settings. Also playing a little bit of Metro Redux.


----------



## Pandora's Box

Quote:


> Originally Posted by *keikei*
> 
> So what games are ppl playing with their fancy super duper fast card?


Civilization 6 and finally decided to play Witcher 3 now that I can max every setting at 1440P. Witcher looks absolutely stunning.


----------



## fisher6

Is it just me, or does OC'ing the memory make the GPU run hotter? I'm talking a few degrees, like 5 C.


----------



## shalafi

Quote:


> Originally Posted by *keikei*
> 
> So what games are ppl playing with their fancy super duper fast card?


Stardew Valley, I suppose? Now seriously, I'm still on a 980Ti, waiting until the prices/stock stabilise a bit .. probably getting a FE and slapping an EVGA hybrid kit on that. And then I will be sitting in my virtual truck and hoping that the fossil of a game engine has usable FPS in ProMods parts of the map (if you didn't get any of that - that's Euro Truck Simulator 2).


----------



## Addsome

So last night I finished installing the 1080 Hybrid kit on my 1080 Ti using the half-shroud-off method. I get 45 C load in Heaven with 120% power and a 90 C temp limit, and my clocks stabilize around 1900 MHz without any manual overclock. Would it be better to replace the stock fan the Hybrid kit came with? If so, which one? People are saying I should connect the fan to the mobo so I can control fan speeds. Also, if I read correctly, you can't control the speed of the radiator fan, but you can control the speed of the card's blower fan by setting the value in Afterburner? What speed should I leave that at to keep the VRAM cool, even during overclocks?


----------



## failwheeldrive

Got my 1080 ti on Tuesday, put it under water the next day. It's a massive upgrade over my 1080.


----------



## pez

Quote:


> Originally Posted by *Addsome*
> 
> So last night I finished installing the 1080 Hybrid kit on my 1080 Ti using the half-shroud-off method. I get 45 C load in Heaven with 120% power and a 90 C temp limit, and my clocks stabilize around 1900 MHz without any manual overclock. Would it be better to replace the stock fan the Hybrid kit came with? If so, which one? Also, if I read correctly, you can't control the speed of the radiator fan, but you can control the speed of the card's blower fan by setting the value in Afterburner? What speed should I leave that at to keep the VRAM cool, even during overclocks?


Most likely you will be better off using something like Phanteks or Noctua if you care about noise. You could absolutely control the radiator fan if you plug it into a mobo header or a fan controller.
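Once the radiator fan is on a mobo header, the BIOS (or a tool like SpeedFan) just maps temperature to duty cycle by interpolating between the points of your fan curve. A minimal sketch of that mapping; the curve points below are invented examples, not a recommendation:

```python
def fan_duty(temp_c, curve):
    """Linearly interpolate fan duty (%) from sorted (temp, duty) points,
    clamping below the first point and above the last."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

points = [(30, 20), (50, 40), (70, 80), (80, 100)]
print(fan_duty(60, points))  # halfway between 40% and 80%: 60.0
```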


----------



## GreedyMuffin

Quote:


> Originally Posted by *failwheeldrive*
> 
> Got my 1080 ti on Tuesday, put it under water the next day. It's a massive upgrade over my 1080.


Did you use the EK backplate or the FE one? Looking sexy!


----------



## Addsome

Quote:


> Originally Posted by *pez*
> 
> Most likely you will be better off using something like Phanteks or Noctua if you care about noise. You could absolutely control the radiator fan if you plug it into a mobo header or a fan controller.


What would be the best fan for noise/performance? I don't care about looks.


----------



## pez

Quote:


> Originally Posted by *Addsome*
> 
> What would be the best fan for noise/performance? I don't care about looks.


I am personally a huge fan of the F120MPs from Phanteks. Depending on where you are and where you can purchase from would help me tailor a suggestion for you. But yes, I use 3 of those in my case and am extremely pleased. For reference I have one on a H75 on my 4770K @ 4.5Ghz w/ 1.28 vcore and after an hour+ of BF1, the cores hit somewhere in the ballpark of 65-70 max. Even at the max 1800RPM the fans are tolerable. They *might* match the noise profile of a 50% FE cooler at that point.


----------



## Addsome

Quote:


> Originally Posted by *pez*
> 
> I am personally a huge fan of the F120MPs from Phanteks. Depending on where you are and where you can purchase from would help me tailor a suggestion for you. But yes, I use 3 of those in my case and am extremely pleased. For reference I have one on a H75 on my 4770K @ 4.5Ghz w/ 1.28 vcore and after an hour+ of BF1, the cores hit somewhere in the ballpark of 65-70 max. Even at the max 1800RPM the fans are tolerable. They *might* match the noise profile of a 50% FE cooler at that point.


I'm located in Canada


----------



## pez

Quote:


> Originally Posted by *Addsome*
> 
> I'm located in Canada


Yeah, something from Phanteks, Noctua, etc (I know I'm probably forgetting a couple good ones) in the 1800-2400RPM range would be a really good solution I think. 2200RPM 120mm fans start to get noisy regardless of who makes them, so you can be a bit more nitpicky on that. Just get whichever is cheapest of those and have at it







.

Also, which AIO/CLC did you use and what 'half-shroud' method did you go by?


----------



## celk00dx

Hello, I'm trying to decide whether to get a 1080 Ti or a TXP for gaming at 4K and higher under water. I hear the TXP is better for 5K and up, but the 1080 Ti just set a new 3DMark record:

http://www.pcgamer.com/geforce-gtx-1080-ti-overclocked-to-25ghz-on-sets-world-record/

So what do you guys think?


----------



## Addsome

Quote:


> Originally Posted by *pez*
> 
> Yeah, something from Phanteks, Noctua, etc (I know I'm probably forgetting a couple good ones) in the 1800-2400RPM range would be a really good solution I think. 2200RPM 120mm fans start to get noisy regardless of who makes them, so you can be a bit more nitpicky on that. Just get whichever is cheapest of those and have at it
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Also, which AIO/CLC did you use and what 'half-shroud' method did you go by?


I used the 1080 Hybrid kit from EVGA. I basically did the same thing that GamersNexus did with their 1080 Ti.


----------



## Joshwaa

Noctua Industrial 140mm 3000 RPM PWM FTW! They're the only fans I buy anymore. Under 40% they're pretty quiet in my case, plus they push really well through rads.


----------



## Addsome

Quote:


> Originally Posted by *Joshwaa*
> 
> Noctua Industrial 140mm 3000 RPM PWM FTW! They're the only fans I buy anymore. Under 40% they're pretty quiet in my case, plus they push really well through rads.


Would a 140mm fan fit on the evga hybrid kit radiator?


----------



## pez

Quote:


> Originally Posted by *Addsome*
> 
> I used the 1080 hybrid kit from evga. I basically did the same thing that gamersnexus did with their 1080ti


That looks much better than I anticipated it would. Thanks for the heads up on those videos.


----------



## dboythagr8

Quote:


> Originally Posted by *alucardis666*
> 
> He cheated...
> 
> *Side note:*
> 
> WOW! Mass Effect is DEMANDING!!!! Makes my CPU Run up to 78c!?! and my 1080Ti is artifacting with mem over 400 where it hasn't done that anywhere else card is also getting up to mid 70's with fan set to 100%.
> 
> This is all at 4K with the Ultra Preset btw. Really need this card under water to kill the noise and be able to hear my game/get the most performance out of it. On the upside even with some pretty nasty thermal throttling frames stay above 50fps.


Hmmm -- I didn't encounter any of what you mentioned, artifacts or high CPU temps. But maybe the Broadwell-E chip runs a bit warmer than Ivy-E.

I think I've settled on my gaming OC at +175/300. I've run Heaven with it, played 1.5hr of Mass Effect, and played BF1. No crashing or artifacts of any kind. Will do 3Dmark in a few, but I think I'm gonna hold it there.

I do need to try some non Frostbite 3 games though. BF1 gave me an avg of 131 on Ultra @ 1440p, Mass Effect 94.


----------



## dade_kash_xD

Newegg shipped my EVGA 1080 Ti Founders Edition yesterday! Should have it installed and running by Sunday morning. It's amazing how dated my 290s in CrossFire have become; they can't keep up in newer titles at 3440x1440.


----------



## Joshwaa

They make a 120mm also. I think that's what the hybrid uses.


----------



## failwheeldrive

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Did you use the EK backplate or the FE one? Looking sexy!


Thanks! Still using the stock FE backplate, wanted to see how it looked before deciding on an aftermarket option. Might still pick up an EK backplate.


----------



## richiec77

Quote:


> Originally Posted by *keikei*
> 
> So what games are ppl playing with their fancy super duper fast card?


Killing Floor 2 maxed out at high refresh, Deus Ex: Mankind Divided, Andromeda once it comes out... and doesn't suck. Dragon Age: Inquisition. Witcher 3.


----------



## mouacyk

So, it looks like LN2 BIOSes were publicly available for the GTX 1080, with the power limits removed completely. However, the voltages were locked at a rather dangerous level, 1.2V+. When custom models of the Ti come out, this is probably the best option (besides a fully custom BIOS) to eliminate power throttling, if an LN2 BIOS is made available. I would think 1.2V on water is doable?


----------



## Maintenance Bot

Quote:


> Originally Posted by *celk00dx*
> 
> Hello, I'm trying to decide if I should get 1080 Ti or TXP for gaming 4k and higher with WC. I hear TXP is better for 5k and higher but 1080 Ti just set a new 3dmark record
> 
> http://www.pcgamer.com/geforce-gtx-1080-ti-overclocked-to-25ghz-on-sets-world-record/
> 
> So what do you guys think?


Get the 1080 Ti, no reason to buy Titan now.


----------



## Norlig

Is the GPU on this card at a lower level than the black part that cools the VRAM?

I've got an Arctic Cooling Hybrid unit and am thinking about getting this, but I'm worried the cooler might rest on the VRAM cooler before it touches the GPU.

If I need a copper shim, I'd probably wait for a different version.


----------



## celk00dx

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Get the 1080 Ti, no reason to buy Titan now.


But what about the claim that the TXP is better for 5K and higher? I also read in this thread that the TXP is better for OC. But if that's true, how did the 1080 Ti beat the TXP's record? I'm confused.


----------



## dboythagr8

Quote:


> Originally Posted by *keikei*
> 
> So what games are ppl playing with their fancy super duper fast card?


Battlefield 1
Mass Effect Andromeda trial

Heaven

Got Witcher 3, Rise of TR, and every other major AAA release on my HDD. Not enough time in the day.


----------



## BigMack70

Quote:


> Originally Posted by *celk00dx*
> 
> But what about the claim that TXP is better for 5k and higher? I also read in this thread that TXP is better for OC? But if that's true, how did 1080 Ti beat the TXP record, I'm confused.


Are _you_ gaming at 5k or higher? If so, you probably have the money where it doesn't matter. If you are not, the claims don't matter.

As for the claims themselves...


----------



## Radox-0

Quote:


> Originally Posted by *celk00dx*
> 
> But what about the claim that TXP is better for 5k and higher? I also read in this thread that TXP is better for OC? But if that's true, how did 1080 Ti beat the TXP record, I'm confused.


With LN2, all normal rules go out the window.

Right now it does seem some TXPs can get a notch over 2100MHz under water, while most 1080 Tis are hitting a brick wall around there. The 1080 Ti, on the other hand, can clock higher on the memory to make up the deficit somewhat. Overall performance under water may go in favor of the TXP by a whisker, but it's all very much silicon lottery.

Also, I don't see why two cards which for all intents and purposes have the same performance would all of a sudden make the TXP amazing at 5K and the 1080 Ti a worse choice. Real-world performance will be within a few % of one another depending on clocks. But I would be happy to be proven wrong.


----------



## Maintenance Bot

Quote:


> Originally Posted by *celk00dx*
> 
> But what about the claim that TXP is better for 5k and higher? I also read in this thread that TXP is better for OC? But if that's true, how did 1080 Ti beat the TXP record, I'm confused.


I own both of these GPUs, and I cannot distinguish a difference between them in a gaming environment.


----------



## vmanuelgm

Quote:


> Originally Posted by *mouacyk*
> 
> So, it looks like LN2 BIOses were publicly available for the GTX 1080 and the power limits were removed completely. However, the voltages were locked at a rather dangerous level, 1.2v+. When custom models of the Ti come out, this is probably the best option (besides a fully custom BIOS) to eliminate power throttling if an LN2 BIOS is made available. I would think 1.2v on water is doable?


https://xdevs.com/guide/pascal_oc/

This also answers the question of whether overvolting can help OC on air or water cooling. It does not, due to thermals, which only get worse: higher temperatures reduce stability and performance. The GPU literally overheats and cannot run high frequencies anymore, even though the temperature is below the specified maximum of +94°C. Think of it as a temperature-to-frequency dependency, all the way down from +94°C to -196°C, with a slope of around 100MHz per 50°C. So just as with the 980/980Ti/Titan X, overvolting on air or water cooling is not recommended, as it gains little if any performance.
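For the curious, that roughly linear temperature-to-frequency relationship (~100MHz per 50°C, from +94°C down to -196°C) is easy to play with as a back-of-the-envelope model. Note the 1900MHz baseline below is a made-up illustration, not a measured number:

```python
# Rough linear temp-to-frequency model from the xdevs Pascal OC guide:
# ~100 MHz of attainable core clock per 50 C of cooling, valid from
# +94 C (throttle point) down to -196 C (LN2). Baseline is hypothetical.

def attainable_clock_mhz(temp_c, base_clock_at_94c=1900.0):
    """Estimate attainable core clock (MHz) at a given GPU temperature."""
    slope_mhz_per_c = 100.0 / 50.0          # ~2 MHz gained per 1 C cooler
    return base_clock_at_94c + (94.0 - temp_c) * slope_mhz_per_c

for t in (94, 44, -6, -196):
    print(f"{t:>5} C -> ~{attainable_clock_mhz(t):.0f} MHz")
```

It lines up with what people see in practice: going from ~70°C on air to ~40°C under water is only worth a handful of 13MHz bins, while LN2 buys hundreds of MHz.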


----------



## KraxKill

Quote:


> Originally Posted by *pompss*
> 
> Hold 2100 mhz half of valley bench.
> 
> Cant hold it bc of the temp reach 70c half way but the most important its that didn't crash and goes all the way to the end. With a waterblock will hold 2100mhz 100%.
> 
> Some people just don't get it or they just jealous and find excuse or seeking proof. Pathetic in my opinion.


You are simply regurgitating your own delusion hoping that people buy in?

You're the one being pathetic, trying to claim an overclock that isn't stable. You can claim 2100 when you can actually hold 2100. Until then you appear to be pandering to your own delusion.

*Having your card touch 2100 and having it maintain 2100 are worlds apart.*

This card,



http://www.tomshardware.com/reviews/geforce-gtx-1080-ti-water-cooling,4975.html

Like yours, it cannot claim a 2100 OC. That's why they make a point of noting that the card throttled, to clear up that assumption. All the area under the curve missing at each of those throttle stops is the card not actually running 2100. I bet your card would not throttle at 1800-2000, but you're not even showing that. What else is there to talk about?

Fix your 2100 before you claim it. That's all you're being asked. Otherwise stop pretending.

There is no telling whether, had your card stayed at 2100, it would have crashed trying to sustain that clock, throttled due to TDP, or something else.

A working OC should look like this.



Until you show something remotely similar, vs. just a momentary boost to 2100, stop.

When you get there you can enjoy all the props you want but until then you're just pretending.


----------



## Slackaveli

Quote:


> Originally Posted by *AMDATI*
> 
> I picked For Honor for me free game, and I have to say, it's a really good looking game.
> Well that would be fine....if I wasn't running a 144hz monitor
> 
> But I do frame limit to 144hz. In demanding games, the FPS still doesn't always get the full 144hz, but I'm happy with 90fps+.


Yeah, it's a little different beast for sure. A lot easier to run slightly subdued @ 4K; always pushing for those 140s is definitely trickier.


----------



## stocksux

Quote:


> Originally Posted by *dboythagr8*
> 
> Ok so can someone explain to me the "throttling" that's happening?
> 
> I put a +100 offset on the core clock, power at 120%.
> 
> Heaven starts out at 1974mhz. Then drops to 1962,1949,1936, before settling at 1924. Temps maxed out at 73c. So despite that even with a +100, the card was running at 1924mhz. Why?


Why? Sounds like you'll end up spending more money. Finding these things in stock is pretty tough as is; by the time you find one, the partner boards will likely be out, and then you won't need all the extras. If you want lower temps, invest in a custom loop. Hybrid coolers are just a gateway drug to custom loops anyway.
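On the throttling question itself: the step pattern in the quoted clocks (1974 → 1962 → 1949 → 1936 → 1924) is GPU Boost 3.0 dropping one ~13MHz voltage/frequency bin at a time as the GPU warms up. A quick sketch of that bin behavior (the temperature thresholds here are illustrative guesses; NVIDIA's actual tables are undocumented):

```python
# GPU Boost drops the core clock in ~13 MHz bins as temperature rises.
# Thresholds below are illustrative, not NVIDIA's real (undocumented)
# tables -- they just reproduce the stepped pattern people observe.

BIN_MHZ = 13

def boosted_clock(start_clock, temp_c, thresholds=(50, 57, 63, 68, 73)):
    """Drop one boost bin for every temperature threshold crossed."""
    bins_dropped = sum(1 for t in thresholds if temp_c >= t)
    return start_clock - bins_dropped * BIN_MHZ

print(boosted_clock(1974, 45))   # cool card: full boost, 1974
print(boosted_clock(1974, 73))   # warm card: five bins lower, 1909
```

So the +100 offset is being applied, it just shifts the whole boost curve; the card still steps down bins with temperature, which is why better cooling (or a raised temp target) matters more than a bigger offset.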


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Sorry if I stepped on your toes or insulted your logic at all. I was just curious as to your thoughts.
> 
> Can you elaborate? Idk what you mean by "frame limiter" I have V-sync on always, is this wrong?


OK, sorry, I was asleep.

Go into RivaTuner (it comes bundled with Afterburner). In there is a frame limiter which does NOT add input latency, so you can cap frames without using v-sync. In conjunction with Fast Sync you won't get screen tearing either, but NONE of the input lag.
Bonus: you aren't burning out your CPU and GPU at times when you're laying down unused framerates well past your monitor's spec.
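For anyone curious what a frame limiter actually does under the hood: it just paces frame submission so the GPU never renders frames the display can't show, which is why it adds no buffering latency the way v-sync does. A minimal sketch of the pacing loop (simulated, not a real game loop):

```python
import time

def run_frames(n_frames, fps_cap):
    """Pace a loop to at most fps_cap iterations per second."""
    frame_time = 1.0 / fps_cap
    next_deadline = time.perf_counter()
    rendered = 0
    for _ in range(n_frames):
        # ... render the frame here ...
        rendered += 1
        next_deadline += frame_time
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)   # idle instead of rendering unused frames
    return rendered

start = time.perf_counter()
run_frames(30, fps_cap=144)
elapsed = time.perf_counter() - start
print(f"30 frames in {elapsed:.3f}s (~{30 / elapsed:.0f} fps)")
```

The GPU sits idle during the sleep instead of burning power on frames past the monitor's refresh, which is exactly the "not burning out your CPU and GPU" bonus above.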


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Sorry if I stepped on your toes or insulted your logic at all. I was just curious as to your thoughts.
> 
> Can you elaborate? Idk what you mean by "frame limiter" I have V-sync on always, is this wrong?


Some people do use it; I've seen it recommended, but the input lag gets me (and I really hate screen tearing, too). So this is the win/win solution.


----------



## pompss

Quote:


> Originally Posted by *KraxKill*
> 
> You are simply regurgitating your own delusion hoping that people buy in?
> 
> You're the one pathetic trying to claim an overclock that is not stable. You can claim 2100 when you can actually hold 2100. Until then you appear to be pandering to your on delusion.
> 
> *Having your card touch 2100 and maintain 2100 are worlds appart.*
> 
> This card,
> 
> 
> 
> http://www.tomshardware.com/reviews/geforce-gtx-1080-ti-water-cooling,4975.html
> 
> Like yours, cannot claim a 2100 OC. Thats why they make it clear to point out that the card throttled to clear that assumption. All that area under the curve missing at each one of those throttle stops is from the card not actually running 2100. I bet your card would not throttle at 1800-2000, but you're not even showing that. What else is there to talk about.
> 
> Fix your 2100 before you claim it. That's all you're being asked of. Otherwise stop pretending.
> 
> There is no telling that had your card stayed at 2100 wether it would have crashed trying to sustain that clock, throttled due to TDP or else.
> 
> A working OC should like this.
> 
> 
> 
> Until you show something remotely similar vs just momentary boost to 2100 stop.
> 
> When you get there you can enjoy all the props you want but until then you're just pretending.


What are you talking about?

I showed the video of Valley running all the way to the end.

2100MHz for 50% of Valley, after which it goes down to 2088MHz because of the temperature.

The pathetic one here is you.

Get lost, and please don't quote me again. It's a waste of my time talking with people like you.


----------



## Slackaveli

Quote:


> Originally Posted by *pompss*
> 
> You have no point.
> 
> Card on air throttles down bc of the temprature reaching 70c its normal. With a waterblock will keep 2100 stable or maybe 2150mhz
> 
> Passing the 3d mark and valley at 230+ on the core on air without crashing its the point you missed all this time.
> 
> 2100mhz on the core on air no crash !! WALL Broken .
> 
> Now you can go hide .


It hurts me, though, but I'm glad somebody can do it. I can only hold +155.


----------



## Slackaveli

Quote:


> Originally Posted by *Mato87*
> 
> Good to know, but very disapointing to discover it's not able to do more than full hd 60 hz, which is a fairly disgraceful move by nvidia.


you, sir, have a way with words. should be (or are) in management/relations.


----------



## pez

This is an argument about semantics that's eventually going to become true when he's under water. I don't understand why you're so adamant about arguing with him over 12MHz. It's not like he touches it and is instantly dropped to sub-2000; he's sitting so close to the threshold that I can't imagine why this is even an argument.

Keep in mind that when you break through a wall, it doesn't mean the wall seals itself once you come back to the other side of it.


----------



## pompss

Quote:


> Originally Posted by *Slackaveli*
> 
> it hurts me, though, but im glad somebody can do it. i can only hold +155.


I feel you. My first GTX 1080 Ti only goes +165.

Got lucky to get a good one, and somebody here feels really butthurt about that.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> ok, sorry i was asleep.
> 
> Go into Riva Tuner (that comes equipt with afterburner). in there is a "frame limiter" which DOES NOT add input latency. So, you can cap frames without using v-sync. In conjunction with "fast sync" you wont get screen tear either, but NONE of the input lag.
> Bonus is you arent burning out your cpu and gpu at those times when you are laying down unused framerates well past your monitor's spec.


Thanks for clarifying! I'll give it a shot.


----------



## Slackaveli

Quote:


> Originally Posted by *keikei*
> 
> So what games are ppl playing with their fancy super duper fast card?


I've played a few, but I'll throw this one out for the giggles.

League of Legends plays at 4K/400; or 4K/60 locked at 22% power with 6% utilization.
[Cleveland Show voice] "Now THAT's nasty."


----------



## Slackaveli

Quote:


> Originally Posted by *Pandora's Box*
> 
> Civilization 6 and finally decided to play Witcher 3 now that I can max every setting at 1440P. Witcher looks absolutely stunning.


oh, thanks for reminding me. i need to go grab that game.


----------



## stocksux

Sorry double posted from an earlier post


----------



## pompss

Quote:


> Originally Posted by *pez*
> 
> This is an argument about semantics that's eventually going to become true when he's under water. I don't understand why you're so adamant about arguing with him over 12Mhz. It's not like he touches and is instantly dropped to sub-2000. He's sitting so close to the threshold that i can't imagine why this is even an argument.
> 
> Keep in mind when you break through a wall, it doesn't mean that the wall seals itself once you come back to the other side of it.


Sometimes you can't argue with these people; they just don't listen, the so-called haters.

The card doesn't crash at 2100MHz, and once it reaches high temps (70C) it throttles down, which is normal. If it crashed, I would agree with him, but it doesn't.

Waterblock and bam, stable 2100MHz. Will install it this week.


----------



## Slackaveli

Quote:


> Originally Posted by *fisher6*
> 
> Is it just me or does OC'ing the memory make the GPU run hotter? Talking about a few degrees like 5C.


I'm at the point where I'm trying to talk my Asperger's/OCD into not overclocking the memory (or, gasp, underclocking it) because of volts/temps being robbed from my core. I don't think it's bandwidth hungry anyway; how could it be if it's "effectively 1.2TB/s blah blah blah", right?
The problem my OCD has with that is it drops us below the 500GB/s line, and that's like a mountain peak I've been looking up at longingly for YEARS.
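The "500GB/s line" is just arithmetic: bandwidth = per-pin data rate × bus width / 8. The 1080 Ti's stock 11Gbps GDDR5X on a 352-bit bus works out to 484GB/s; the 12Gbps figure below is an example overclock, not a claim about any particular card:

```python
# Memory bandwidth from per-pin data rate and bus width.
# GTX 1080 Ti: 11 Gbps effective GDDR5X on a 352-bit bus.

def bandwidth_gbs(effective_gbps_per_pin, bus_width_bits):
    """Bandwidth in GB/s = per-pin data rate (Gbps) x bus width / 8."""
    return effective_gbps_per_pin * bus_width_bits / 8

stock = bandwidth_gbs(11.0, 352)   # stock: 484 GB/s
oc = bandwidth_gbs(12.0, 352)      # example +1 Gbps OC: 528 GB/s
print(f"stock: {stock:.0f} GB/s, OC: {oc:.0f} GB/s")
```

So crossing 500GB/s takes roughly an extra 0.4Gbps effective on the memory; stock sits just under the line.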


----------



## dboythagr8

So I ran the 3DMark Fire Strike test.

GPU usage is fine in the individual graphics tests, but in the combined test it's only around 60%? Am I being CPU bottlenecked or something because the test is at 1080p? Makes no sense to me.


----------



## SuprUsrStan

Alright I'm officially part of this thread.




Who's jelly.


----------



## SuprUsrStan

This is quite possibly the first time I've had SLI with stock coolers. Can't wait until EK releases actual 1080 Ti blocks.


----------



## KraxKill

Quote:


> Originally Posted by *pompss*
> 
> what are you talking about ?
> 
> I show the video of valley all the way to the end.
> 
> 2100 mhz 50% of valley after goes down to 2088mhz bc of the temperature
> 
> the one pathetic here its you
> 
> Get lost please dont even quote me again. Waste of me time taking with people like you.


If I used your logic of what a 2100 OC is, I would have claimed it a long time ago. This OC, though it sails through Valley at 2100+, does not do so in Firestrike or Heaven, so I, unlike YOU, cannot claim it as such. But for people with "realistic expectations", I'm glad I'm able to shine some light on the "CONTEXT" of your claim. Your snappy, reactionary responses should be evidence enough not to take you seriously.

I will say it again: your OC is a momentary boost to 2100. It's just that, a momentary boost, not sustained clocks or a valid 2100 OC in my book. Get it under water, get it to sustain 2100, and then you can spew all day. Until then...

FWIW, Here is mine and I can't claim it. At least mine doesn't throttle lol....


----------



## keikei

Spoiler: Warning: Spoiler!


----------



## Maintenance Bot

Quote:


> Originally Posted by *Syan48306*
> 
> This quite possibly the first time I've had SLI with stock coolers. Can't wait until EK releases actual 1080ti blocks.


When you got time, can you report temps? I may grab another also.


----------



## stocksux

Quote:


> Originally Posted by *Syan48306*
> 
> This quite possibly the first time I've had SLI with stock coolers. Can't wait until EK releases actual 1080ti blocks.


Why do people keep saying this?!?!? Buy the Titan block. It's the same


----------



## SuprUsrStan

Quote:


> Originally Posted by *stocksux*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Syan48306*
> 
> This quite possibly the first time I've had SLI with stock coolers. Can't wait until EK releases actual 1080ti blocks.
> 
> Why do people keep saying this?!?!? Buy the Titan block. It's the same

It is most definitely not the same block. Just because it may be compatible does not mean it is the SAME block.

The Titan X Pascal block has a cutout for a DVI port and both the backplate and block have Titan X Pascal embossed on it.









I can wait until the first week of April for a proper 1080Ti block.


----------



## stocksux

Quote:


> Originally Posted by *keikei*
> 
> Spoiler: Warning: Spoiler!


Could you move that card out of the way. Can't see the potato chips


----------



## Slackaveli

Quote:


> Originally Posted by *celk00dx*
> 
> But what about the claim that TXP is better for 5k and higher? I also read in this thread that TXP is better for OC? But if that's true, how did 1080 Ti beat the TXP record, I'm confused.


How could it be, with half of the power bits missing? Listen to your wallet, dude; you are trippin'. The Titan XP is worse than the Ti where it matters; it's "better" where it doesn't, and the cost doesn't justify it. Put it this way: take the name out of it. Spec for spec, you'd be dumb to pay more for less. If they were the exact same price I might still take the Ti, since it has better power delivery. But maybe I'd be a "name whore" at that point.

Anyway, you don't see people scrambling to buy used Titans. They're scrambling for the Ti.


----------



## Mato87

Quote:


> Originally Posted by *failwheeldrive*
> 
> Got my 1080 ti on Tuesday, put it under water the next day. It's a massive upgrade over my 1080.


Nice one. How are you guys doing all the custom water cooling for the graphics cards? I'm interested; is it hard to do? What about maintenance?


----------



## pompss

Quote:


> Originally Posted by *KraxKill*
> 
> If I used your logic of what a 2100OC is, I would have claimed it a long time ago but this OC, though it sails through Valley at 2100+ does not do so in Firestrike or Heaven so I, unlike YOU cannot claim it as such but for people that have "realistic expectations" I'm glad I'm able to shine some light on the "CONTEXT"of your claim. Your snappy reactionary responses should be evidence enough to not take you seriously.
> 
> I will say it again your OC is a monetary boost to 2100, it's just that a momentary boost but not sustained clocks or a valid 2100 OC in my book. Get it under water, get it to sustain 2100 and then you can spew all day. Until then...
> 
> FWIW, Here is mine and I can't claim it. At least mine doesn't throttle lol....


Are you serious?

Claim it with a video running Valley to the end without a crash, or post a 3DMark score link showing that you reached that clock without crashing.

What you show means nothing; it's no proof.

Here's the proof right here: 2100MHz 3DMark

http://www.3dmark.com/3dm/18620560

Talk to me when you get the link. Until then, please just shut up.


----------



## Mato87

Quote:


> Originally Posted by *Slackaveli*
> 
> you, sir, have a way with words. should be (or are) in management/relations.


Thanks mate, if it wasn't sarcastic, then I appreciate it.


----------



## Slackaveli

Quote:


> Originally Posted by *Radox-0*
> 
> LN2 all normal rules go out the window.
> 
> Right now does seem some TXP's can get a notch over 2100 mhz under water whole most 1080Ti's are hitting a brick wall around there. 1080Ti on the other hand can clock higher on the memory to make up the deficit somewhat. Overall performance under water may go in favor of the TXP by a whisker but its all very much silicon lottery.
> 
> Also don't see why such two cards which for all intents and purposes have the same performance would all of a sudden make the TXP amazing at 5k and and the 1080Ti a worse choice, Real world performance will be a few % away from one another depending on clock. But would be happy to be proven wrong.


That 'claim' is nothing more than whiny rich guys trying to justify their $10,000 PCs (probably because their wives would kill them if they knew how much money was just pissed away).


----------



## pompss

double post sorry


----------



## outofmyheadyo

So do you guys think the card is worth the price NVIDIA asks for it? I have a Heatkiller full-cover block ready for it, as is my custom loop. Sitting here wondering whether I should pull the trigger; I'm currently using a 980 Ti, which is showing its age, and gaming on a 1440p 144Hz monitor. I have a feeling I'll order one today, but I'm not entirely sure.


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> You are simply regurgitating your own delusion hoping that people buy in?
> 
> You're the one pathetic trying to claim an overclock that is not stable. You can claim 2100 when you can actually hold 2100. Until then you appear to be pandering to your on delusion.
> 
> *Having your card touch 2100 and maintain 2100 are worlds appart.*
> 
> This card,
> 
> 
> 
> http://www.tomshardware.com/reviews/geforce-gtx-1080-ti-water-cooling,4975.html
> 
> Like yours, cannot claim a 2100 OC. Thats why they make it clear to point out that the card throttled to clear that assumption. All that area under the curve missing at each one of those throttle stops is from the card not actually running 2100. I bet your card would not throttle at 1800-2000, but you're not even showing that. What else is there to talk about.
> 
> Fix your 2100 before you claim it. That's all you're being asked of. Otherwise stop pretending.
> 
> There is no telling that had your card stayed at 2100 wether it would have crashed trying to sustain that clock, throttled due to TDP or else.
> 
> A working OC should like this.
> 
> 
> 
> Until you show something remotely similar vs just momentary boost to 2100 stop.
> 
> When you get there you can enjoy all the props you want but until then you're just pretending.


If I were moderating you two, I'd point out that of course you are correct; you did come off a bit harsh to begin with (his initial cheering faces pissed you off), but otherwise you are correct. HOWEVER, as soon as pompss's card hits water, I bet he can hold it. He's right that it's a golden card, as is yours; that 2075 locked is a sight to behold. Neither of you has anything to be mad about, really. Y'all got the best cards in the thread. BE HAPPY. SMILE!!

SMILE!!


----------



## jarble

Quote:


> Originally Posted by *outofmyheadyo*
> 
> So do you guys think the card is worth the price nvidia asks for it ? I have a heatkiller fullcover block ready for it, so is my custom loop, sitting here thinking if I should pull the trigger or not, currently using the 980ti wich is showing its age, and gaming on a 1440p 144hz monitor, I have a feeling I will order one today, but not entirely sure


Give into the darkside









On that note, FedEx says my pair should be here by the end of the week.


----------



## BigMack70

Quote:


> Originally Posted by *outofmyheadyo*
> 
> So do you guys think the card is worth the price nvidia asks for it ? I have a heatkiller fullcover block ready for it, so is my custom loop, sitting here thinking if I should pull the trigger or not, currently using the 980ti wich is showing its age, and gaming on a 1440p 144hz monitor, I have a feeling I will order one today, but not entirely sure


Technically, everyone in here who owns the card has decided that yes, the card is worth the asking price.


----------



## pompss

Quote:


> Originally Posted by *Slackaveli*
> 
> if i were moderating you too id point out that of course you are correct, you did come off a bit harsh to begin with (his initial cheering faces pissed you off), but otherwise you are correct. HOWEVER, as soon as pompss card hits water, i bet he can hold it. He's right that that is a golden card. as is yours. that 2075 locked is a sight to behold. Neither of you have anything to be mad about , really. Ya'll got the best best cards in the thread. BE HAPPY. SMILE!! SMILE!!
> 
> SMILE!!


He cannot run 3DMark at 2100MHz to the end, and he gets pissed off that someone else can.

Classic haters.

Here's my score and 2100MHz 3DMark run.

Facts. Proof. Not words.

http://www.3dmark.com/3dm/18620560


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Thanks for clarifying! I'll give it a shot.


Lemme know what you think, because to me it should be called the "poor man's G-Sync mod". It really does play that well.


----------



## keikei

Quote:


> Originally Posted by *stocksux*
> 
> Could you move that card out of the way. Can't see the potato chips


LMAO. I need to wait a bit before I install the card. Thing is like a brick of solid ice.


----------



## Slackaveli

Quote:


> Originally Posted by *dboythagr8*
> 
> So I ran the 3DMark Firestrike Test
> 
> GPU usage is fine on the individual graphic test, but on the combined test, it's only around 60%? Am I being CPU bottlenecked or something because the test is at 1080p? Makes no sense to me.


Yes, of course. Always test at the highest resolution when your GPU is stronger relative to your CPU.


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> If I used your logic of what a 2100OC is, I would have claimed it a long time ago but this OC, though it sails through Valley at 2100+ does not do so in Firestrike or Heaven so I, unlike YOU cannot claim it as such but for people that have "realistic expectations" I'm glad I'm able to shine some light on the "CONTEXT"of your claim. Your snappy reactionary responses should be evidence enough to not take you seriously.
> 
> I will say it again your OC is a monetary boost to 2100, it's just that a momentary boost but not sustained clocks or a valid 2100 OC in my book. Get it under water, get it to sustain 2100 and then you can spew all day. Until then...
> 
> FWIW, Here is mine and I can't claim it. At least mine doesn't throttle lol....


Honestly, I'd claim the frick out of that.


----------



## pompss

Quote:


> Originally Posted by *Slackaveli*
> 
> honestly, id claim the frick out of that.


yeah 2113 mhz at 22c









Really ?


----------



## Kimir

Quote:


> Originally Posted by *pompss*
> 
> He cannot run 3d mark at 2100 mhz to the end and he get pissed off if someone else can
> 
> Classic haters
> 
> Here its my score and 2100mhz 3d mark
> 
> Facts. Proof. Not words
> 
> http://www.3dmark.com/3dm/18620560


Now, you and everyone else can go and share your score in those threads:
Fire Strike Ultra Top 30
Fire Strike Extreme Top 30
3DMark Time Spy Benchmark Top 30
[OFFICIAL] Top 30 Heaven Benchmark 4.0 Scores
[OFFICIAL]--- Top 30 --- Unigine 'Valley' Benchmark 1.0

Only to list a few.


----------



## pompss




----------



## I-Siamak-I

Quote:


> Originally Posted by *failwheeldrive*
> 
> Got my 1080 ti on Tuesday, put it under water the next day. It's a massive upgrade over my 1080.


Hey man, can you please tell me if your card has any coil whine while playing games? Can u leave the fans on low and listen to any buzzing while game is running muted?


----------



## Joshwaa

Quote:


> Originally Posted by *jarble*
> 
> Give into the darkside
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> One that note fedex says my pair should be here by the end of the week


FedEx is a Lie.


----------



## KraxKill

I'm keeping mine at 2075.

FWIW, the power and TDP limits on these cards introduce the MHz wall, not the cores themselves. My Firestrike scores at 2088 are actually lower than at 2075.

Here's a full Valley screen cap for fun @ 2100+; as you can see, power limits signify the end of the line for me.

This,



vs

this
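That power-limit wall follows from how dynamic power scales: roughly P ∝ f × V², so the last few MHz (which also demand more voltage) cost disproportionate power and slam you into the TDP limiter. A rough illustration; the voltage/frequency points below are made up, not measured from any real card:

```python
# Dynamic power scales roughly with frequency times voltage squared.
# Points below are hypothetical V/F pairs, purely for illustration.

def relative_power(freq_mhz, volts, base_freq=2000.0, base_volts=1.000):
    """Power relative to a baseline operating point, ~ f * V^2."""
    return (freq_mhz / base_freq) * (volts / base_volts) ** 2

for f, v in ((2000, 1.000), (2075, 1.043), (2100, 1.093)):
    print(f"{f} MHz @ {v:.3f} V -> {relative_power(f, v):.2f}x power")
```

With numbers like these, the last ~5% of clock costs ~25% more power, which is why a card can be rock solid at 2075 yet bounce off the power limit (and score worse) at 2088+.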


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> lemme know what you think because to me it should be called "Poor Man's g-sync mod". It really does play that good.


Probably a stupid question, but after setting Fast sync in the NVIDIA control panel and setting the limiter to 60, do I still need to enable V-Sync in the game options, or no?


----------



## Slackaveli

Quote:


> Originally Posted by *Mato87*
> 
> Nice one, how are you guys doing all the custom water cooling for the graphic cards? Iam interested, is it hard to do ? What about the maintenance?


It's crack: not too expensive at first, it's a rush, it becomes an addiction, and it will always make every build more expensive, lol. Oh, and your whole life, err, rig, could come crashing down in an instant. LOL!

But, yeah, it's really great, though!


----------



## Baasha

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Login into there website and look for it on your order details page for the order. That's where I found mine.


I must be blind or something - I don't see anything about game codes on the Order Details page - just my address, order number, product I ordered, price, and a button that says "VIEW INVOICE."

Please post a screenshot or something - I already called them on Monday and they said I should receive an email within 24 hours - nothing yet and it's been 3 days!


----------



## Slackaveli

Quote:


> Originally Posted by *outofmyheadyo*
> 
> So do you guys think the card is worth the price nvidia asks for it? I have a heatkiller fullcover block ready for it, so is my custom loop, sitting here thinking if I should pull the trigger or not, currently using the 980ti which is showing its age, and gaming on a 1440p 144hz monitor. I have a feeling I will order one today, but not entirely sure


bro, if you have those things and aren't screwing your family for the month out of food or rent money, DO IT. You will not be mad you did. especially with water on deck like that.


----------



## SuprUsrStan

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Syan48306*
> 
> This is quite possibly the first time I've had SLI with stock coolers. Can't wait until EK releases actual 1080ti blocks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> When you got time, can you report temps? I may grab another also.

In SLI on the default fan profile, it looks like it bumps to 2025MHz core and 6000MHz on the memory, then starts throttling.
With fans at 100% and both backplates on, the bottom card sits at 59 degrees and the top one at 75 degrees.


----------



## Slackaveli

Quote:


> Originally Posted by *keikei*
> 
> LMAO. I need to wait abit until i install the card. Thing is like a brick of solid ice.


no, test it quickly!


----------



## Mato87

Quote:


> Originally Posted by *Slackaveli*
> 
> it's crack. not too expensive at first, is a rush, becomes an addiction, and will always make every build more expensive lol. Oh, and your whole life, err rig, could come crashing down in one instant. LOL!
> 
> But, yeah, it's really great, though!


HAH lol, I shall keep my hands off water cooling then


----------



## Slackaveli

Quote:


> Originally Posted by *pompss*
> 
> yeah 2113 mhz at 22c
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Really ?


it's absurd. i picture him with a swamp fan in front of an open door in a snowstorm, overclocking so hard his hair is standing up like Doc from Back to the Future.

It really is an insane clock, but those temps are the best i've ever seen, ever, without LN2.


----------



## PewnFlavorTang

Not hating but would like to see 2100 with a memory overclock. Keeping the memory stock or sub-stock will increase the ability to have a higher core clock. Just saying. Either way, zfg on what other people can clock to. I'm happy with my ti's.


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> I'm keeping mine at 2075.
> 
> FWIW, the power and TDP limits on these cards introduce the MHz wall, not the cores themselves. My Firestrike scores at 2088 are actually lower than at 2075.
> 
> Here's a full Valley screen cap for fun @ 2100+, as you can see....Pwr limits signify the end of the line for me.
> 
> This,
> 
> 
> 
> vs
> 
> this


i mean, i guess im just a dirty dude b/c ive never had a run where there wasn't at least a speck in perf cap. ever.


----------



## Slackaveli

Quote:


> Originally Posted by *Joshwaa*
> 
> FedEx is a Lie.


NO DOUBT!!!


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Probably stupid question, but... After setting Fast in Nvidia CP and turning the Limiter to 60 do I still need to apply V-Sync in game options or no?


i leave 'off' in game.

these settings make my bigscreen samsung suhd play very nice. the same settings are also perfect on a g-sync monitor as well. IF you are getting any hitching at all, it could be HPET (High Precision Event Timer) in your cpu bios settings. I turned that off last year and man, it has been a much smoother gaming experience, so you may wanna uncheck that box, too. it's in the main bios settings, not the OC settings.

NOTE: others claim the exact opposite, that ENABLING HPET made it smooth. it's a balancing situation and Im just saying if you guys get stuttering, just "unsmoothness", try it.


----------



## vmanuelgm

Quote:


> Originally Posted by *pompss*
> 
> He cannot run 3d mark at 2100 mhz to the end and he get pissed off if someone else can
> 
> Classic haters
> 
> Here its my score and 2100mhz 3d mark
> 
> Facts. Proof. Not words
> 
> http://www.3dmark.com/3dm/18620560


Could u please pass time spy???


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> i leave 'off' in game.
> 
> these settings make my bigscreen samsung suhd play very nice. the same settings are also perfect on g-sync monitor as well. IF you are getting any hitching at all it could be your (HEPT) in cpu bios settings. I turned that off last year and man it has been a much smoother gaming experience so you may wanna uncheck that box, too. it's in the main bios settings, not OC settings.
> 
> NOTE: others claim the exact opposite, ENABLING hept made it smooth. it's like a balancing situation and Im just saying if you guys get stuttering, just "unsmoothness", try it.


Thanks, I'll test now


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Probably stupid question, but... After setting Fast in Nvidia CP and turning the Limiter to 60 do I still need to apply V-Sync in game options or no?


OH. I forgot to mention. don't set it at 60, set it at 59 (or at 142 on a 144Hz monitor). that was the last "input lag"-cutting setting; it eliminates input lag for a different reason related to timings. just trust me, it helps. My bad telling you wrong to begin with; was tired when i said it last night, too weary to properly explain in just a random comment. since you are trying it, that's the full story.


----------



## pompss

dude, that's not proof, anybody can do that

i can add 2200 mhz and post the same screenshot as you do.

Proof:

3d mark link of what you claim and a video of valley running to the end, please.

Screenshots are not proof, you might crash after 2 min.


----------



## KraxKill

Quote:


> Originally Posted by *pompss*
> 
> yeah 2113 mhz at 22c
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Really ?


That 22C is not real, it's from while i was dragging the window and the bench was momentarily paused. I'm at 29-35C during the run, as in the followup screen I posted. Good luck with your water project.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> OH. I forgot to mention. don't set it at 60. set it at 59 (or at 142 on a 144 monitor). that was the last "input lag" cutting setting. it eliminated input lag because of a different reason related to timings. just trust me, it helps. My bad telling you wrong to begin with. was tired when i said it last night. too weary to properly explain in just a random comment. since you are trying it, that's the full story.


Appreciate all the help man, Rep given. I learned something new today









Application detection mode and stealth mode?

Does this look right to you?


----------



## pompss

Quote:


> Originally Posted by *KraxKill*
> 
> I'm keeping mine at 2075.
> 
> FWIW, the power and TDP limits on these cards introduce the MHz wall, not the cores themselves. My Firestrike scores at 2088 are actually lower than at 2075.
> 
> Here's a full Valley screen cap for fun @ 2100+, as you can see....Pwr limits signify the end of the line for me.
> 
> This,
> 
> 
> 
> vs
> 
> this


dude, that's not proof, anybody can do that

i can add 2200 mhz, run pc express render, and post the same screenshot as you do.

Proof:

3d mark link of what you claim and a video of valley running to the end, please.

Screenshots are not proof, you might crash after 2 min.

but dont worry you got the best card









Good luck to you


----------



## Slackaveli

Quote:


> Originally Posted by *Mato87*
> 
> HAH lol, I shall keep my hands off water cooling then:thumb:


just start with an AIO hybrid, and if those amazing temps are good enough to not throttle, then you are golden. Only adds $100 cost, but it keeps the card cool enough that you can justify it b/c of the extra longevity, plus it won't throttle. AND it's safe.

But nowhere near as sexy as the full custom loop. I miss it, but not the headaches.


----------



## Mato87

Quote:


> Originally Posted by *Slackaveli*
> 
> just start with an AIO hybrid and if those amazing temps are good enough to not throttle than you are golden. Only adds $100 cost but keep the card cool enough you can justify that b/c extra longevity, plus it wont throttle. AND it's safe.
> 
> But no where near as sexy as the closed loop. I miss it, but not the headaches.


Oh yeah, I know about the headaches, well, similar ones. I had 4 or 5 SLI rigs throughout my lifetime.

I've seen this EVGA HYBRID water cooler (all in one), but I am not so sure if it's worth it. Where would I put the fan? No space in the case...


----------



## al210

Quote:


> Originally Posted by *Baasha*
> 
> I must be blind or something - I don't see anything about game codes on the Order Details page - just my address, order number, product I ordered, price, and a button that says "VIEW INVOICE."
> 
> Please post a screenshot or something - I already called them on Monday and they said I should receive an email within 24 hours - nothing yet and it's been 3 days!


Excerpt from my chat conversation with Nvidia representative:

Sanjib: Due to unforeseen circumstances, the GeForce GTX 1080 Ti was added to our qualified bundle list after the pre-orders had been processed. Our team is working swiftly to ensure that you receive your bundled game codes no later than March 22.
Sanjib: No need to worry,
Sanjib: You will receive a separate email with the code.


----------



## dboythagr8

Quote:


> Originally Posted by *Slackaveli*
> 
> yes, of course. always test at highest resolution if you have a stronger gpu than cpu setup


I haven't yet accepted that my 4930k @ 4.5ghz is weak

I refuse ;__;

Warrior in my eyes


----------



## alucardis666

The results from Gamers Nexus Hybrid mod for those interested.






Tomorrow will be VERY interesting and fun when I get my Hybrid in!


----------



## Baasha

Quote:


> Originally Posted by *al210*
> 
> Excerpt from my chat conversation with Nvidia representative:
> 
> Sanjib: Due to unforeseen circumstances, the GeForce GTX 1080 Ti was added to our qualified bundle list after the pre-orders had been processed. Our team is working swiftly to ensure that you receive your bundled game codes no later than March 22.
> Sanjib: No need to worry,
> Sanjib: You will receive a separate email with the code.


March 22?









Well, thanks for the info though. I was going crazy thinking I wasn't getting the codes. Ugh...


----------



## alucardis666

Quote:


> Originally Posted by *Baasha*
> 
> March 22?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well, thanks for the info though. I was going crazy thinking I wasn't getting the codes. Ugh...


Funny how NVIDIA can't issue codes for those who pre-ordered but Newegg had 0 issues.


----------



## Baasha

Quote:


> Originally Posted by *dboythagr8*
> 
> I haven't yet accepted that my 4930k @ 4.5ghz is weak
> 
> I refuse ;__;
> 
> Warrior in my eyes


hey @dboythagr8 I ordered the Kraken X62 and the Noctua 3000 140mm PWM fans. Won't be getting them till Tuesday though. This wait is killing me. Grrrr...


----------



## KedarWolf

Quote:


> Originally Posted by *alucardis666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Baasha*
> 
> March 22?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well, thanks for the info though. I was going crazy thinking I wasn't getting the codes. Ugh...
> 
> 
> 
> Funny how NVIDIA can't issue codes for those who pre-ordered but Newegg had 0 issues.

Bought in store in Toronto at Canada Computers last Friday, the day of the card's release. Emailed the order number from the receipt to the store, had the code for the games the next day.


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *alucardis666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Baasha*
> 
> March 22?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well, thanks for the info though. I was going crazy thinking I wasn't getting the codes. Ugh...
> 
> 
> 
> Funny how NVIDIA can't issue codes for those who pre-ordered but Newegg had 0 issues.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Bought in store in Toronto at Canada Computers last Friday, day of release of card, Emailed order number from receipt to store, had code next day for games.

Edit: I really shouldn't have redeemed the code though. I won't play either game, and I realized right after I redeemed it that I could have given it to someone here.

My bad.









Second edit: I meant to edit my original post, not quote myself. I do this way too often.


----------



## Kdubbs

For anyone interested in doing an AIO mod to their reference card, B&H is selling the 1080 Hybrid cooler for $100 with the EVGA promotional Ghost Recon Wildlands or For Honor code. And at least for me, I got free expedited shipping arriving tomorrow! Just placed the order.


----------



## dboythagr8

Quote:


> Originally Posted by *Baasha*
> 
> hey @dboythagr8 I ordered the Kraken X62 and the Noctua 3000 140mm PWM fans. Won't be getting them till Tuesday though. This wait is killing me. Grrrr...


Good stuff!

You'll definitely notice a difference. You should be able to put the rad in the front of the case as well if you want. I have mine up top and it fits fine.

Air 540 is such a great versatile case


----------



## KraxKill

Quote:


> Originally Posted by *pompss*
> 
> dude, that's not proof, anybody can do that
> 
> i can add 2200 mhz, run pc express render, and post the same screenshot as you do.
> 
> Proof:
> 
> 3d mark link of what you claim and a video of valley running to the end, please.
> 
> Screenshots are not proof, you might crash after 2 min.
> 
> Until then good luck to you too


I've already told you, I get higher scores in 3DMark Firestrike at 2075 than I do at 2088.

Here's a link to one at 2075...
http://www.3dmark.com/fs/12007457



I haven't even attempted to stabilize 2100, since I'm hitting minor power limits around 2088 trying to sustain the clock vs letting it bounce. These throttle events lower the score and make chasing 2100 a pointless proposition for me. I predict you will encounter this as well once you get under water and unlock your card's potential, which frankly could be amazing since you've got one of the best examples so far.

If I set my card like yours to hit the TDP, power, and/or temp throttles @ 2100, I'm sure I can run Firestrike just fine. It wouldn't be hard for me to set the boost at 2100 and use the TDP and/or temp limiter to allow my card to downclock to 2050 during the run and pass. Again, that would be misleading, and is essentially what you're doing without knowing it.

Since your card currently throttles away from 2100, you're not actually loading the core nearly as hard as you would under water, where the thermal throttle will no longer save you and the card will spend longer at your top bin, increasing core usage and thus TDP.

When you get under water, and you're not bouncing off the TDP and temp limit like you are now, post your results. I'll be waiting.

You've yet to post a video of your card sustaining anything other than momentary boost clocks. I couldn't care less about debating the semantics you're presenting about sustained boost, except to give context to people reading and evaluating your claimed 2100.

From what you have said, you throttle down to 2050, do you not? When you get under water and your card is stressed even harder due to lower thermals, no longer hitting at least the temp limits, and better yet the TDP and power limits, I suspect you'll be in for a surprise.

In practice, you're also likely to see higher 3DMark scores at lower but sustained boost levels, where the card is not hitting the power limits and spends longer in your top clock bin. This increases the area under the curve and improves your effective performance.

Here's the video you requested. 2100+ through Valley. (2113 if we use your logic).
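To illustrate the "area under the curve" point with a toy calculation (the clock samples below are made up for illustration, not taken from my runs), a run that targets a higher clock but dips under the power limiter can average out lower than a steady clock:

```python
# Toy illustration (made-up numbers): average effective core clock over a run.
# A steady 2075 MHz run vs. a 2088 MHz run that bounces off the power limiter
# and drops to 1949 MHz for several seconds.

def average_clock(samples_mhz):
    """Mean core clock across equally spaced samples (one per second here)."""
    return sum(samples_mhz) / len(samples_mhz)

steady = [2075] * 60                      # 60 s, no throttling
throttled = [2088] * 50 + [1949] * 10     # 10 s spent throttled by the limiter

print(round(average_clock(steady), 1))     # 2075.0
print(round(average_clock(throttled), 1))  # 2064.8 -- lower despite the higher target
```

So the nominally higher clock can deliver less total work, which is why the scores stay flat or drop.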


----------



## alucardis666

Quote:


> Originally Posted by *Kdubbs*
> 
> For anyone interested, B&H is selling the 1080 Hybrid cooler for $100 with the EVGA promotional Ghost Recon Wildlands or For Honor code. And at least for me, I got free expedited shipping arriving tomorrow! Just placed the order.


Good info!


----------



## x-apoc

Quote:


> Originally Posted by *Kdubbs*
> 
> For anyone interested, B&H is selling the 1080 Hybrid cooler for $100 with the EVGA promotional Ghost Recon Wildlands or For Honor code. And at least for me, I got free expedited shipping arriving tomorrow! Just placed the order.


Can't wait to see ME: Andromeda bundles, if ever...


----------



## alucardis666

Quote:


> Originally Posted by *x-apoc*
> 
> Cant wait to see ME: Andromeda bundles, if ever..


If ever is right. I doubt we'll see it bundled with anything. Nvidia seems to have a pretty tight partnership with Ubisoft atm.


----------



## jsutter71

Quote:


> Originally Posted by *Radox-0*
> 
> Agree with a fair bit of your post, but not other parts. Definitely agree if you have a TXP, would keep it, unless you can sell them for a steller price. Also all the hype and marketing show it as killing the TXP, when really load of crap and for most the part at stock they are similar or slightly in favour of 1080Ti due to higher clocks as you pointed out.
> 
> The 1080Ti, can make up the memory overclock gain once again and hit 12 Ghz or so to make up the delta when you OC the Titan XP Memory so increase that delta again. 91 vs 94 degrees will not be making a difference as you will be throttling pretty hard on both cards far before then. That limit is the maximum GPU temp limits and you will not be able to get that high, boost drops far before then on both. Also if your hitting those temps, you would be on air which would then ask the question, why would you not get a 1080Ti AIB which will be able to sustain higher clocks and once again outperform the TXP for cheaper.
> 
> Likewise even these days if you are into overclocking and watercooling etc, then then 1080Ti mostly makes more sense. It may not be able to absolutely match the TXP on the top end. But for a % or so for less performance (depending on the chip) for nearly half the price, that's considerable IMO and more then makes up for that for 99% of people. Last 1% or so of course want the best at any price and no doubt that 1% is considerable on this site, it is indeed OCN and has overclock in the name


For what it's worth, I would be jumping onto the 1080Ti train if I had not already owned my TXPs. Releasing a card that is practically identical to their flagship consumer product, going toe to toe with said flagship 8 months later at a $500 price cut, makes great marketing sense. It was great for Nvidia and their bottom line. *Great for Nvidia* It's unfortunate that there was not a competitive product to keep their prices in check.


----------



## KraxKill

GeForce GTX 1080 Ti overclocked to 2.5GHz sets world record


----------



## alucardis666

Quote:


> Originally Posted by *KraxKill*
> 
> GeForce GTX 1080 Ti overclocked to 2.5GHz sets world record


Impressive. If only it could do that with water and not LN2 we'd be set!


----------



## KCDC

My two have finally arrived from the Nvidia preorder.... And here I sit at work.... While they sit at my house.....

must feign emergency....


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Appreciate all the help man, Rep given. I learned something new today
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Application detection mode and stealth mode?
> 
> Does this look right to you?


perfect. i know you'll like that. added bonus is in most games you'll lock to the cap and never stress the card.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> perfect. i know you like that. added bonus is most games you'll lock and never stress the card.


TBH, I can't tell much difference.
















Mass Effect Andromeda feels a little smoother, but that might just be in my head? Still makes the card run up to ~84c and the cpu hit ~78c


----------



## Slackaveli

Quote:


> Originally Posted by *dboythagr8*
> 
> I haven't yet accepted that my 4930k @ 4.5ghz is weak
> 
> I refuse ;__;
> 
> Warrior in my eyes


it certainly isnt weak, but it isn't always getting utilized properly. should be a beautiful match at 1440p. it really is great that it's this way. our cpus age gracefully by just upping the resolution.


----------



## Radox-0

Quote:


> Originally Posted by *jsutter71*
> 
> For what it's worth, I would be jumping onto the 1080Ti train if I had not already owned my TXPs. Releasing a card that is practically identical to their flagship consumer product while at the same time going toe to toe with said flagship product 8 months later at a $500 price cut makes great marketing sense. It was great for Nvidia and their bottom line. *Great for Nvidia* It's unfortunate that there was not a competitive product to keep their prices in check.


Yup spot on bud. Part of the premium with the Titan cards is of course you get to enjoy that power unmatched for a considerable amount of time, that in itself is worth something and nvidia know it. Fingers crossed Vega does something special


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> TBH, I can't tell much difference.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Mass Effect Andromeda feels a little smoother, but that might just be in my head? Still makes the card run up to ~84c and the cpu hit ~78c


lol, i cant help you there. if you wanted you could lock the frames at 50 and that might help. that particular game is brutal.
Quote:


> Originally Posted by *Radox-0*
> 
> Yup spot on bud. Part of the premium with the Titan cards is of course you get to enjoy that power unmatched for a considerable amount of time, that in itself is worth something and nvidia know it. Fingers crossed Vega does something special


and they are still equals, so you didnt get screwed, either. you still get that awesome black card with Titan on it for epeen.

it's a win/win. even amd fans can afford to grab a 1080 now.
Quote:


> Originally Posted by *alucardis666*
> 
> TBH, I can't tell much difference.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Mass Effect Andromeda feels a little smoother, but that might just be in my head? Still makes the card run up to ~84c and the cpu hit ~78c


at least no screen tearing and no input delay either.


----------



## jarble

Quote:


> Originally Posted by *Joshwaa*
> 
> FedEx is a Lie.


Yeah, I am not holding my breath on a true delivery date, but it would kick butt if I got them before the foldathon.

Quote:


> Originally Posted by *KCDC*
> 
> My two have finally arrived from the Nvidia preorder.... And here I sit at work.... While they sit at my house.....
> 
> must feign emergency....


No need to feign, new gpus are an emergency


----------



## joeh4384

I probably would be rocking a Titan X if they allowed non reference designs.


----------



## dboythagr8

Quote:


> Originally Posted by *x-apoc*
> 
> Cant wait to see ME: Andromeda bundles, if ever..


Quote:


> Originally Posted by *alucardis666*
> 
> If ever is right. I doubt we'll see it bundled with anything. Nvidia seems to have a pretty tight partnership with Ubisoft atm.


http://www.geforce.com/whats-new/articles/mass-effect-andromeda-best-buy-evga-geforce-gtx-1070-bundle


----------



## alucardis666

Quote:


> Originally Posted by *dboythagr8*
> 
> http://www.geforce.com/whats-new/articles/mass-effect-andromeda-best-buy-evga-geforce-gtx-1070-bundle


Seems that's more an EVGA bundle than it is Nvidia or their partners. And it's also exclusive to Best Buy.


----------



## dboythagr8

Quote:


> Originally Posted by *alucardis666*
> 
> Seems that's more an EVGA bundle than it is Nvidia or their partners. And it's also exclusive to Best Buy.


Just showing there is a ME:A bundle out there at some level.


----------



## x-apoc

Quote:


> Originally Posted by *dboythagr8*
> 
> http://www.geforce.com/whats-new/articles/mass-effect-andromeda-best-buy-evga-geforce-gtx-1070-bundle


haha nice.. way to push sales nvidia

hybrid cooler+ Andromeda or no buy!


----------



## pompss

Quote:


> Originally Posted by *KraxKill*
> 
> I've already told you, I get higher scores in 3d mark Firestrike at 2075, than I do at 2088.
> 
> Here's a link to one at 2075...
> http://www.3dmark.com/fs/12007457
> 
> 
> 
> I haven't even attempted to try stabilize 2100 since I'm hitting minor power limits at around 2088 trying to sustain the clock vs letting it bounce. These throttle events lower the score and make seeking 2100 a pointless proposition for me. I predict you will encounter this as well once you get under water and unlock your cards potential, which frankly could be amazing since you've got one of the best examples so far.
> 
> If I set my card like yours to hit the TDP and PWR and or temp throttles @ 2100 I'm sure I can run Firestirke just fine. It wouldn't be hard for me to set the boost at 2100 and use the TDP and or temp limiter to allow my card to down clock to 2050 during the run and pass. Again that would be misleading and is essentially what you're doing without knowing it.
> 
> Since your card currently throttles away form 2100, you're not actually loading the core nearly as hard as you would under water where the thermal throttle will no longer save you and will actually increase your core usage and thus TDP as the card will now be spending longer at your top bin.
> 
> When you get under water, and you're not bouncing off the TDP and temp limit like you are now, post your results. I'll be waiting.
> 
> You've yet to post a video of your card sustaining anything other than a momentary boost clocks. I really could care less about debating the semantics you're presenting about sustained boost except to give context to people reading and evaluating your claimed 2100.
> 
> From what you have said you throttle down to 2050 do you not? When you get under water and your card is stressed even harder due to lower thermals, is no longer hitting at lest the temp limits and better yet TDP and Pwr, I suspect you'll be in for a surprise.
> 
> In practicality, you're also likely to see higher 3d Mark scores at lower but sustained boost levels where the card is not hitting the pwr limits and spends longer in your top clock bin. This will increase the area under the curve and essentially improve your performance.
> 
> Here's the video you requested. 2100+ through Valley. (2113 if we use your logic).


Yep, that's a nice clock in valley, dude. is it under water? 30c, i guess it is.

Do you get a stable 2075 mhz in heaven? 3d mark bounces me between 2000 mhz and 2100 mhz, but im not sure why that is, since i dont hit high temps.


----------



## Baasha

Have you guys who are air-cooling removed the back-plates and seen any perceptible difference in temps? My 4 Way GPU sammich is obviously keeping things...hot.


----------



## KraxKill

Quote:


> Originally Posted by *pompss*
> 
> Yep thats a nice clock in valley dude. its under water? 30c i guess it is.
> 
> You get stable 2075 mhz on heaven? 3d mark throttle from 2000mhz to 2100 mhz but im not sure why its that since i dont hit high temps.


The card is on water, on an x41 AIO mod. Very basic, but it has a slightly larger radiator than a standard AIO, about 30% more volume, 140mm vs the 120mm EVGA unit for example. It's running two Noctuas in push-pull, and I'm using liquid metal TIM on the die to help further with the temps. People debate if the trouble is worth it, but I've had good results with it.

I am stable benching and looping through everything at 2075. As I'm sure you've noticed, throttling is a peeve of mine. Perhaps too much so, I'll admit it. Heaven, Valley, Firestrike, TimeSpy, games, you name it, all bench and loop at a solid 2075 for me. Since this is where my highest scores are as well, it will be where I leave it unless there is some magic bios to be had that unlocks more potential. Unlikely.

I also found 2088 to be solid in everything as well, but my scores do not improve, as I begin to bump the power limit 3-4 times during the bench, which despite running a higher clock actually keeps my scores flat or lower depending on the bench. Valley is about the same. Firestrike drops a few points. Heaven is about the same but lower as well.

2100-2113, and the quick go I had at 2135, is stable in Valley and in ARMA3 only. But performance, even in Valley, begins to go down, since at this clock I'm hitting the power limit even more frequently than before and the temps start to creep past 35 as well.

I feel you'll likely be somewhere in the 2050-2100 range with your card once you get it on water, assuming 35-40C. Higher will be tough without undervolting (lower stability in exchange for lower draw) to stay off the limiter, or hard modding the power delivery.

I've had the card since the 10th, with plenty of time to explore it, but you've got me thinking...I may try running my top bins at lower voltages to see if I can stay off the little bumps of the power limiter and still maintain stability at 2088. If I can, this would improve my scores at 2088 by a tiny bit and make it worthwhile running the card there. We'll see.

Enjoy the project and good luck! Feel free to PM if you wish. I'm good with the voltage curve and am always willing to bounce ideas.


----------



## KedarWolf

Quote:


> Originally Posted by *Baasha*
> 
> Have you guys who are air-cooling removed the back-plates and seen any perceptible difference in temps? My 4 Way GPU sammich is obviously keeping things...hot.


Question, how do you do the four-way SLI bridge setup? I know a high-bandwidth bridge setup is a must.

And you might want to have the Power Limit at 0% and clocks limited to 1800-1900 on all four cards on a lower voltage curve. I'd hit CTRL+F in Afterburner to bring up the voltage curve, CTRL+D to set it to defaults, see what the voltage is at for say 1800-1850 with a 0% Power Limit, then set a custom voltage curve with it maxing at 1850 or so but topping out at a much lower voltage than 1.093V to keep temps down.

And a custom fan curve is a must. You by no means need 1.093V and max clocks on all four cards unless you put them under water and are benching. They downclock at 60C+ anyways, so lower volts and stable lower clocks are the best bet running four cards on air.

Edit: Thinking about it, you don't want the Power Limit at 0%, but lower might help.
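To illustrate the idea of flattening the voltage/frequency curve (Afterburner itself has no scripting API, and the point values below are placeholders, not real curve data), the edit amounts to clamping every point above a chosen voltage to the frequency at that voltage:

```python
# Model of 'flattening' a GPU voltage/frequency curve: every point above the
# chosen voltage is clamped to the frequency reached at that voltage, so the
# card never requests more than target_mv for its top clock.
# (Illustrative values only; real curves come from MSI Afterburner's editor.)

def flatten_curve(curve, target_mv):
    """curve: list of (millivolts, mhz) points sorted by voltage."""
    cap_mhz = max(mhz for mv, mhz in curve if mv <= target_mv)
    return [(mv, min(mhz, cap_mhz)) for mv, mhz in curve]

stock = [(900, 1700), (950, 1800), (1000, 1860), (1050, 1911), (1093, 1961)]
flat = flatten_curve(stock, 1000)
print(flat)  # points above 1000 mV are clamped to 1860 MHz
```

The result is a lower top voltage (and thus lower heat) in exchange for giving up the last few clock bins.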


----------



## Maintenance Bot

Quote:


> Originally Posted by *Baasha*
> 
> I must be blind or something - I don't see anything about game codes on the Order Details page - just my address, order number, product I ordered, price, and a button that says "VIEW INVOICE."
> 
> Please post a screenshot or something - I already called them on Monday and they said I should receive an email within 24 hours - nothing yet and it's been 3 days!











I bet you get yours soon.
Quote:


> Originally Posted by *Syan48306*
> 
> In SLI on the default fan profile, it looks like it bumps to 2025mhz core and 6000mhz on the memory, then starts throttling.
> With fans at 100% and both backplates on, the bottom card sits at 59 degrees and the top one at 75 degrees.


Thanks, I will keep this in mind.


----------



## SuprUsrStan

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Syan48306*
> 
> Looks like in SLI on a default fan profile, it looks like it bumps 2025mhz core and 6000mhz on the memory then starts throttling.
> With fans at 100% and both backplates on, the bottom card sits at 59 degrees and the top one at 75 degrees.
> 
> 
> 
> Thanks, I will keep this in mind.
Click to expand...

To be perfectly honest, don't SLI if you have a zero-slot gap between the cards; the reference coolers just can't handle the heat. If you have a 1- or 2-slot gap then you should be okay. That, or if you're going water cooling like me.


----------



## mouacyk

Quote:


> Originally Posted by *KraxKill*
> 
> The card is on water, on an x41 AIO mod. Very basic but it has a slightly larger radiator 140mm vs the 120mm EVGA unit for example. It's running two noctuas in push pull and I'm using liquid metal TIM on the die to help further with the temps.
> 
> I am stable benching and looping through everything at 2075. Heaven, Valley, Firestrike, TimeSpy games you name it. This will be where I leave it unless there is some magic bios to be had.
> 
> I also found 2088 to be solid in everything, but my scores do not improve as I begin to bump the power limit, which despite running a higher clock actually keeps my scores flat or lower depending on the bench. Valley is about the same. Firestrike drops a few points. Heaven is about the same but lower as well.
> 
> 2100-2113 and the quick go I had at 2135 are stable in Valley and in ARMA3 only. But performance, even in Valley, begins to go down, since at this clock I'm regularly hitting the Pwr limit.


I wouldn't be surprised if throttling incurs a slight performance penalty, due to the additional logic involved. Take the following examples:

1. Steady 2075MHz: 16.6ms constant frame time, no throttling
2. 2075-2088MHz: 15ms frame time + 2ms throttle penalty = 17ms total frame time

Not sure about others, but I have seen judder as a result of throttling on my 980 Ti before I went water and I did not like it.
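The two cases above can be worked through numerically; the 2ms penalty is the hypothetical figure from the example, not a measurement:

```python
# Frame-time arithmetic for the two throttling cases above (illustrative only).

def fps(frame_time_ms: float) -> float:
    """Convert a frame time in milliseconds to frames per second."""
    return 1000.0 / frame_time_ms

steady = 16.6            # case 1: constant 2075 MHz, no throttling
throttled = 15.0 + 2.0   # case 2: 2075-2088 MHz plus a 2 ms throttle penalty

print(f"steady:    {fps(steady):.1f} fps")    # ~60.2 fps
print(f"throttled: {fps(throttled):.1f} fps") # ~58.8 fps
```

So even though case 2 runs a nominally higher clock, the throttle penalty leaves it slightly slower overall, which matches the flat-or-lower scores reported above.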


----------



## lilchronic

Quote:


> Originally Posted by *dboythagr8*
> 
> Just showing there is a ME:A bundle out there at some level.


I saw a link somewhere that said if you buy the card from Best Buy you get that game code. Can't find the page anymore.


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> The card is on water, on an x41 AIO mod. Very basic but it has a slightly larger radiator than a standard AIO. 30% more volume. 140mm vs the 120mm EVGA unit for example. It's running two noctuas in push pull and I'm using liquid metal TIM on the die to help further with the temps. People debate if the trouble is worth it, but I've had good results with it.
> 
> I am stable benching and looping through everything at 2075. As I'm sure you've noticed throttling is a peeve of mine. Perhaps too much so. I'll admit it. Heaven, Valley, Firestrike, TimeSpy games you name it all bench and loop at a solid 2075 for me. Since this is where my highest scores are as well it will be where I leave it unless there is some magic bios to be had that unlocks more potential. Unlikely.
> 
> I also found 2088 to be solid in everything as well, but my scores do not improve as I begin to bump the power limit 3-4 times during the bench, which despite running a higher clock actually keeps my scores flat or lower depending on the bench. Valley is about the same. Firestrike drops a few points. Heaven is about the same but lower as well.
> 
> 2100-2113 and the quick go I had at 2135 are stable in Valley and in ARMA3 only. But performance, even in Valley, begins to go down, since at this clock I'm hitting the Pwr limit even more frequently than before and the temps start to creep past 35 as well.
> 
> I feel you'll likely be somewhere in the 2050-2100 range with your card once you get it on water. Assuming 35-40C. Higher will be tough without undervolting (lower stability in exchange for lower draw) to stay off the limiter or hard modding the power delivery.
> 
> I've had the card since the 10th, with plenty of time to explore it, but you've got me thinking...I may try running my top bins at lower voltages to see if I can stay off the little bumps of the power limiter and still maintain stability at 2088. If I can, this would improve my scores at 2088 by a tiny bit and make it worthwhile running the card there. Will see.
> 
> Enjoy the project and good luck! Feel free to PM if you wish. I'm good with the voltage curve and am always willing to bounce ideas.


first time i've ever heard Arma3 and stable used in the same sentence.
Quote:


> Originally Posted by *KedarWolf*
> 
> Question, how do you do the four way SLI bridge setup? I mean is a MUST a high bandwidth bridge setup I know.
> 
> And you might want to have the Power Limit at 0% and clocks limited to 1800-1900 on all four cards on a lower voltage curve. I'd CTRL F in Afterburner, bring up the voltage curve, CTRL D to set it to defaults, see what the voltage is at for say 1800-1850 with a 0% Power Limit, then set a custom voltage curve with it at 1850 or so at max but topping out at a much lower voltage than 1093v to keep temps down.
> 
> And a custom fan curve is a must. You don't need by any means 1093v and max clocks on all four cards unless you put them under water and are benching. They downclock at 60C+ anyways so lower volts and stable lower clocks running four cards best bet on air.
> 
> Edit: Thinking about it you don't want the Power Limit at 0% but lower might help.


ctrl f doesn't seem to work for me. do you have to do the voltage hack first?


----------



## stocksux

Gigabyte 1080 ti in stock on Newegg right now...for $899







Sadly, it'll still sell to some poor suckers.


----------



## AMDATI

Yep, that's why I ordered mine within 10 minutes of release lol. I knew they would be out of stock for months, and the price of any stock they do get will skyrocket. You won't be seeing sub-$800 1080 Tis for 6 months at least, especially not the custom cards.

I run Gsync, so AMD had nothing to offer me, but even then, it's already pretty much set in stone that Vega won't even touch the 1080ti in performance. 1080ti will be king for at least a year, and will be great at gaming for at least 3+ years. Vega will also only be 8GB. That 11GB VRAM will really extend the longevity of the card.


----------



## alucardis666

Quote:


> Originally Posted by *AMDATI*
> 
> Yep that's why I ordered mine within 10 minutes of release lol. I knew they would be out of stock for months, and any stock they do get will sky rocket. You won't be seeing sub-$800 1080ti's for 6 months at least, especially not the custom cards.


I'd pay $800 for a hybrid if that was out, especially since I've sunk nearly that into trying to rig my own. But if what you say is accurate then the hybrid they do release will probably cost $900-1000


----------



## KraxKill

Quote:


> Originally Posted by *Slackaveli*
> 
> first time i've ever heard Arma3 and stable used in the same sentence.


Touché


----------



## stocksux

Has anyone seen this ROG SLI HB Bridge for sale state side? Seems the only ones I find are in Europe.


----------



## AMDATI

I find Arma 3 to be a very boring game. I refunded it, even at sale price.


----------



## alucardis666

So, strange issue... In ME Andromeda, if I try to run non-native resolutions (below 4K), the image is centered on the screen and doesn't scale to fill it. Not sure if this is a bug or what; this is the only game I've seen it in so far.

As for performance, since I can't run 1440p without sacrificing half my screen real estate, I've set the ultra profile @ 4K with the resolution scale at 75%. Seems to run pretty decently without too many severe drops. I don't really understand what it is with the optimization of EA/Ubisoft games lately; they don't look good enough to justify their poor performance returns.
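For a sense of scale, a 75% resolution scale at 4K still renders noticeably more pixels than native 1440p. Quick back-of-the-envelope arithmetic (illustrative only):

```python
# Pixels rendered at 4K with a 75% resolution scale vs. native 1440p.

def scaled_pixels(width: int, height: int, scale: float) -> int:
    # The resolution-scale slider applies per axis, so total pixel
    # count grows with scale squared.
    return int(width * scale) * int(height * scale)

uhd_75 = scaled_pixels(3840, 2160, 0.75)  # renders at 2880 x 1620
qhd = scaled_pixels(2560, 1440, 1.0)      # native 2560 x 1440

print(f"{uhd_75} vs {qhd} pixels, {uhd_75 / qhd:.2f}x")
```

So a 75% scale at 4K is still pushing roughly 27% more pixels than a native 1440p output, which squares with it running better than full 4K but worse than 1440p.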


----------



## I-Siamak-I

Quote:


> Originally Posted by *alucardis666*
> 
> I'd pay $800 for a hybrid if that was out, especially since I've sunk nearly that into trying to rig my own. But if what you say is accurate then the hybrid they do release will probably cost $900-1000


Hey man just off topic question, are u using a 55inch 4k tv as your primary monitor? If so how is it for desktop use? I know for gaming its awesome. Also do u have it mounted to wall in front of your desk? How far are u sitting to the TV?


----------



## RedM00N

Quote:


> Originally Posted by *AMDATI*
> 
> I find Arma 3 to be a very boring game. I refunded it, even as sale price.


Arma 3 is my go-to bench game, as it seems to scale well with everything cpu/mem/gpu (especially when overclocked). It's kinda like the best modern-day Crysis we have. Maxed out, you're gonna get like 20 fps in a user-made bench, either due to a GPU or CPU bottleneck.









Also, it is a bit of a slow/boring game, as it's supposed to be a military sim, but the mods are where it's at.


----------



## Slackaveli

Quote:


> Originally Posted by *AMDATI*
> 
> I find Arma 3 to be a very boring game. I refunded it, even as sale price.


it can be great but you gotta get with a fun team. The game has amazing graphics but is rough to get good framerates in, though. My son loves that game. Of course he ended up getting me to upgrade his cpu/mobo/ram because of it lol. His 4690k OC'd and 980 do the job, at least at 1080p/60 where he is.


----------



## alucardis666

Quote:


> Originally Posted by *I-Siamak-I*
> 
> Hey man just off topic question, are u using a 55inch 4k tv as your primary monitor? If so how is it for desktop use? I know for gaming its awesome. Also do u have it mounted to wall in front of your desk? How far are u sitting to the TV?


Hey, yes I am! It's great! My vision isn't perfect even with glasses, so having a big screen is always nice. I switched to TVs back in 2004 with a 32" and have gone up from there (32>40>46>55), mainly due to falling prices in recent years. HDR is great for the pre-installed apps on the TV, as well as games with an HDR mode (Shadow Warrior 2, Hitman 2016, Mass Effect Andromeda, Resident Evil 7, etc.). Desktop looks great with 175% display scaling set in Windows; some things are a little too big, but 150% looks off, and a custom percentage in between 150-175 also looks off.

As for wall mounting, no, I don't have it mounted; I keep putting that off. I have it sitting atop my desk, pushed back as far as it will go, and I'm sitting ~3ft away from it.

I'd say the only complaint I have with this setup is that this TV will not run 1080p/1440p @ 75/100/120Hz like some Vizios and Sonys will. But I chose this display because it has 22ms input lag in game mode, which I leave it in exclusively during PC use/gaming, and comparatively I think its image quality is superior to everything out there besides LG's OLEDs. If I had to do it again I'd have sprung the extra cash for one of LG's OLEDs; those blacks are just next level! And the haloing/flashlight effects of this edge-lit panel can look pretty ugly watching dark movies/scenes or playing horror games. It's not a deal-breaker, but it's far from perfect. However, nothing is as bad as the TN/IPS/AH/AV monitor panels I've seen out there: blacks are grey, the backlight bleed is FAR worse, and everything just looks washed out and lacks clarity.

If you're planning to use a TV as a monitor, I'd say get one of the 4K HDR OLEDs from LG. The downsides compared to my set are increased input lag/slower response, and it won't get as bright or look as good with objects in motion, but the infinite blacks/contrast ratio more than make up for it, IMO.









And if you can hold out, I'd say wait to see what sort of 4K HDR 144Hz gaming monitors are coming out by the end of this year and see how they compare. Personally, I would never go back to anything smaller than 40", as I also use this TV to watch movies and TV lying in bed behind me (~12ft from the TV) and to play my PS4 Pro.
*
TL;DR*

Yes. It's great!
Desktop is great with scaling.
Gaming is a treat for the eyes, but no high refresh rates (60fps).
Not wall-mounted, yet.
Sitting ~3ft from display.

If you want a 4k HDR TV consider one of LGs OLEDs instead.
*
*** Sorry for the long post ****


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> So, strange issue... In ME Andromeda, if I try to run non-native resolutions (>4K) then the image is centered in the screen and doesn't scale to fill my screen, not sure if this is a bug or what. This is the only game I've seen this in so far.
> 
> As for performance, since I can't run 1440p without sacrificing half my screen real estate, I've set the ultra profile @ 4k and resolution scale to 75%. Seems to run pretty decent without too many severe drops. Don't really understand what it is with the optimization of EA/Ubisoft games lately. They don't look good enough to justify their poor performance returns.


what do you have as your "scaling" setting in the NVIDIA Control Panel, under 'Adjust desktop size and position'? If you are using no scaling or aspect ratio, that could be why. Or make sure your custom rez is a properly scaled aspect ratio (i.e. 4000x2250 is a proper 16:9), then you can use the "aspect ratio" setting; I've found that works best.
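Whether a custom rez is "a properly scaled aspect ratio" can be checked mechanically by reducing it to lowest terms; a tiny sketch (the helper name is made up):

```python
# Reduce a resolution to its simplest aspect ratio via the GCD.
from math import gcd

def aspect(width: int, height: int) -> tuple[int, int]:
    """Return the resolution's aspect ratio in lowest terms."""
    g = gcd(width, height)
    return width // g, height // g

print(aspect(4000, 2250))  # (16, 9) -- a proper 16:9 custom rez
print(aspect(3840, 2160))  # (16, 9) -- native UHD
print(aspect(5040, 2160))  # (7, 3), i.e. 21:9
```

Any custom resolution that reduces to (16, 9) should scale cleanly with the "aspect ratio" setting on a 16:9 panel; something like 5040x2160 reduces to 7:3, which is the 21:9 ultrawide shape mentioned later in the thread.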
Quote:


> Originally Posted by *alucardis666*
> 
> Hey, Yes I am! It's great! My vision isn't perfect even with glasses so having a big screen is always nice! I switched to TV's back in 2004 with a 32" and have gone up from there (32>40>46>55), mainly due to the failing prices in recent years. HDR is great for the pre-installed apps on the TV, as well as games with an HDR mode (Shadow Warrior 2, Hitman 2016, Mass Effect Andromeda, Resident Evil 7, Etc...) Desktop looks great with 175 Resolution scale set in windows, some things are a little too big but 150 looks off and if I use a custom % inbetween 150-175 it also looks off.
> 
> As for wall mounting no, I do not have it mounted, I keep putting that off, I have it sitting atop my desk pushed back as far as it will go on the desk and I'm sitting ~3ft away from it.
> 
> I'd say the only complaint I have with this setup is that this TV will not run 1080p/1440p @ 75/100/120hz like some vizio's and sony's will. But I chose this display as it has a 22ms response time in game mode which I leave it in exclusively during pc use/gaming and comparatively I think the image quality of this is superior to everything out there besides LGs OLEDs. If I had to do it again I'd have sprung the extra cash for one of LG's OLEDs, those blacks are just next level! And the halo-ing/flashlight effects of this edge-lit panel can look pretty ulgy watching dark movies/scenes or playing horror games. It's not a deal-breaker but it's far from perfect. However nothing is as bad as the TN/IPS/AH/AV monitor panels I've seen out there, blacks are grey, and the backlight bleed is FAR worse, not to mention that everything just looks washed out and lack's clarity.
> 
> If you're planning to use a TV for a monitor I'd say get one of the 4k HDR OLEDs from LG, downside compared to my monitor is increase input lag/slower response time and it won't get as bright or looks as good with objects in motion. But the infinite deep blacks/contrast ratio more than make up for it. IMO.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And if you can hold out, I'd say wait to see what sort of 4k HDR 144hz gaming monitors are coming out by end of this year and see how they compare. Personally, I would never go back to anything smaller than a 40" as I also use this TV to watch movies and TV laying in my bed behind me (~12ft from TV.) and to play my PS4 Pro.
> *
> TL;DR*
> 
> Yes. It's great!
> Desktop is great with scaling.
> Gaming is a treat for the eyes. But not high refresh rates (60fps)
> Not wall-mounted, yet.
> Sitting ~3ft from display.
> 
> If you want a 4k HDR TV consider one of LGs OLEDs instead.
> *
> *** Sorry for the long post ****


i love gaming on my 55" suhdtv sammy. lovely. desktop use is super good, too. so much acreage!


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> what do you have as your "scaling" setting in NCP settings? Under 'Adjust desktop size and position'. If you are using no scaling or aspect ratio, that could be why. or make sure your custom rez is a properly scaled aspect ratio (ie 4000x2250 is a proper 16:9), then you can use "aspect ratio" setting. ive found that works best.


Yea, thought that might be why as well; tried every combo I can think of, didn't seem to help :-( Also, why would this not be an issue in other games if these settings were the cause? Just seems to be ME Andromeda that has the issue, as far as I can tell.


Quote:


> Originally Posted by *Slackaveli*
> 
> i love gaming on my 55" suhdtv sammy. lovely. desktop use is super good, too. so much acreage!


YESSSSSS!


----------



## Slackaveli

Quote:


> Originally Posted by *I-Siamak-I*
> 
> Hey man just off topic question, are u using a 55inch 4k tv as your primary monitor? If so how is it for desktop use? I know for gaming its awesome. Also do u have it mounted to wall in front of your desk? How far are u sitting to the TV?


2 feet. it's great. customize desktop zoom a bit and you are golden.
Quote:


> Originally Posted by *alucardis666*
> 
> Yea, Thought that might be why as well, tried every combo I can think of, didn't seem to help :-( Also, why would this not be an issue in other games if these settings were the cause? Just seems to be ME Andromeda that has the issue, far as I can tell.


now that i know you are gaming on a 4k sammy like me, i guess it's probably a ME:A issue. weird. You did try "full screen"?
Quote:


> Originally Posted by *Slackaveli*
> 
> now that i know you are gaming on a 4k sammy like me, i guess it's probably a ME:A issue. weird.


I've played a lot with custom rez. This TV will do 5040x2160, a 21:9 aspect ratio, on 55"... Set it as "aspect ratio" and play FIFA 17 like that. Awesome.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> now that i know you are gaming on a 4k sammy like me, i guess it's probably a ME:A issue. weird.


Yup. We're non-identical twins! I have the 9 series, but tbh not sure how it differs from your set.

If it is actually an issue within the game, I hope it gets patched soon.


----------



## Slackaveli

Quote:


> Originally Posted by *RedM00N*
> 
> Arma 3 is my goto bench game as it seems to scale well with everything cpu/mem/gpu (specially when overclocked). It's kinda like the best modern day Crysis we have. Maxed out your gonna get like 20 fps in a user made bench either due to a gpu or cpu bottleneck
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also it is a bit of a slow/boring game as its supposed to be a military sim, but the mods are where it's at.


no doubt. that is the game that taught me how crucial RAM speed and timings are in gaming.
Quote:


> Originally Posted by *stocksux*
> 
> Has anyone seen this ROG SLI HB Bridge for sale state side? Seems the only ones I find are in Europe.


i think it is "coming soon" in the US
Quote:


> Originally Posted by *alucardis666*
> 
> Yup. We're non-identical twins! I have the 9 series, but tbh not sure how it differs from your set.
> 
> If it is actually an issue withing the game, I hope it get's patched soon.


we both got what matters - hdr & 22ms. i also have 3d







. I do wish it would let us play 1440/120, but that is what I use my G-Sync monitor for - and as a second screen, of course. I'm actually on the One Connect upgrade running as a JS9500 (on the octa-core). I only get about ~600 nits peak brightness in HDR, but it's still great. Colors are crazy.


----------



## PewnFlavorTang




----------



## Judah R

New 1080ti owner checking in


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> we both got what matters - hdr & 22ms. i also have 3d
> 
> 
> 
> 
> 
> 
> 
> . i do wish it would let us play 1440/120, but, that is what i use my g-sync monitor for- and as a second screen of course. Im actuall on the One Connect upgrade running as a js9500 (on the octacore). i only get about 600~ nit peak brightness in hdr, but it's still great. colors are crazy.


Yea, I don't think mine does 3D; strange omission. My HDR goes up to 1000 nits. And what do you mean by the One Connect upgrade?


----------



## Radox-0

Quote:


> Originally Posted by *Baasha*
> 
> Have you guys who are air-cooling removed the back-plates and seen any perceptible difference in temps? My 4 Way GPU sammich is obviously keeping things...hot.


I find it helps in tight configs. You will still throttle, but the card gets a bit more of a gap from the one below to pull in more air and maintain a very, very marginally higher boost. Nothing significant, but every little bit helps.


----------



## dboythagr8

Second card installed.



Don't even know if I'll go through and do more OCing and benching, did a ton of that with the first one


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Yea don't think mine does 3d, strange omission, my HDR goes up to 1000 nits, and what do you mean one connect upgrade?


The 2015s were the same as the 2016s but need a One Connect box to give them the Tizen OS and octa-core. Yours came with that already.
Quote:


> Originally Posted by *dboythagr8*
> 
> Second card installed.
> 
> 
> 
> Don't even know if I'll go through and do more OCing and benching, did a ton of that with the first one


looks beautiful.
Quote:


> Originally Posted by *Judah R*
> 
> New 1080ti owner checking in


Sexy! and smart to buy from Evga if you are doing all that. very nice.


----------



## AstinGC90

Quote:


> Originally Posted by *Judah R*
> 
> New 1080ti owner checking in


Droooooooolllllllllll, Literally double my order haha. Pictures and benchmarks expected sir...


----------



## jarble

@Slackaveli ease up on the triple posts, man, you're killing me here

@Judah R nice can't wait to see more pics


----------



## lilchronic

@Slackaveli you can quote multiple people with the multi tab. It's right next to the quote tab.


----------



## dboythagr8

Quote:


> Originally Posted by *Slackaveli*
> 
> looks beautiful.


Appreciate it! Need to tidy up the cables, but with the side panel on you don't really notice it too much.

Did a small and quick OC on the SLI setup. +115/+150 to start off.



E: So apparently that bug with SLI and 144hz monitors is still around where the cards won't downclock? ugh


----------



## PewnFlavorTang

Quote:


> Originally Posted by *dboythagr8*
> 
> Appreciate it! Need to tidy up the cables, but with the side panel on you don't really notice it to much.
> 
> Did a small and quick OC on the SLI setup. +115/+150 to start off.
> 
> 
> 
> E: So apparently that bug with SLI and 144hz monitors is still around where the cards won't downclock? ugh


Interesting how that compares with Ryzen running 2x PCIe x8.


----------



## egandt

So I got my 1080 Ti and went to run 3DMark on it (I got 405!). Why? Because it will only run on the Intel Decelerated monitor (primary).

I have two 4K monitors running off the GTX 1080 Ti (used to be a GTX 1070); I upgraded for CUDA as much as anything. But I can't run anything on the 1080 Ti, because every stupid program runs on the Intel chip (which I have active on a secondary display since I also need QuickSync, so I have to have at least one monitor attached to it). How do I choose to run on the Nvidia GTX 1080 Ti and not on the Intel POS? Everything wants to use the Intel and nothing lets me choose. With the GTX 1070 I never had this issue. I want to stress test, but don't want to mess with settings since this HTPC is finally working; all I want is to choose which monitor and card is used for testing.

I want to mention that CUDA-Z works, as do GPU-Z and madVR, but not a single benchmark uses the 1080 Ti, or even lets me choose it. Any suggestion how to make Windows and software suck less and let me run on the secondary screen, which is where I normally do anything that uses a GPU?

The problem with both monitors on the 1080 (or even the 1070) is of course that they are both HDMI, and active DP->HDMI adapters all fail to work at 4K, so I'd end up with a 4K monitor used only for text at 1080p (talk about a waste). By using the Nvidia card for the monitor used for games/video work, and the Intel for non-stressful things like browsing, everything is 4K/4K without issues. Don't even get me started on how much I've wasted on "active" 4K DP->HDMI adapters; I have 4 or 5 different ones, over $120 spent, and not one is stable at 4K 60p. 4K 30p 4:2:2 mostly, but that is not acceptable for work either.

Decided to remove the Intel monitor (so only the Nvidia 4K is connected) and rebooted. I now get a blistering 1542 in 3DMark, so that's 3x more but still a tenth of what I should get; unimpressed all the same. I have 378.78 installed, reinstalled (rebooted), same thing. GPU-Z (1.18) shows that even 3DMark does not change the 721MHz clock speed, so that could explain something; memory was at 202.5MHz during a run.

ERIC


----------



## dboythagr8

Quote:


> Originally Posted by *PewnFlavorTang*
> 
> interesting how that compares with ryzen and running 2 x 8x pcie.


That's a nice score. If that's yours, are you getting 98-99% usage on both cards when you run the bench?


----------



## RedM00N

Quote:


> Originally Posted by *dboythagr8*
> 
> Appreciate it! Need to tidy up the cables, but with the side panel on you don't really notice it to much.
> 
> Did a small and quick OC on the SLI setup. +115/+150 to start off.
> 
> 
> 
> E: So apparently that bug with SLI and 144hz monitors is still around where the cards won't downclock? ugh


I just sit at 120Hz (and go to 165 in games) 'cause this bug never really got fixed. The issue went away when I clean-installed the new driver; then I disabled SLI and it's back to my TXM being stuck at 810MHz at 165Hz. At least it's only the desktop at 120Hz, so it's not hurting much, but I wish it was fully solved.


----------



## rubicsphere

Finally!


----------



## MightEMatt

Quote:


> Originally Posted by *pompss*
> 
> Yep thats a nice clock in valley dude. its under water? 30c i guess it is.
> 
> You get stable 2075 mhz on heaven ? 3d mark throttle from 2000mhz to 2100 mhz but im not sure why its that since i dont hit high temps.


That's not proof. It might crash after 17 hours; better run Folding@home for 2 weeks to make sure it's stable, and we want it on video with an atomic clock or no proof. Anybody can do that.


----------



## yukkerz

I had just ordered mine from Amazon and was going to get it tomorrow (got the card today). Had to cancel so I could get this deal. Good thing they are east coast; I should get it Monday. Thank you!
Quote:


> Originally Posted by *Kdubbs*
> 
> For anyone interested in doing an AIO mod to their reference card, B&H is selling the 1080 hybrid cooler for 100 with the EVGA promotional Ghost Recon Wildlands or For Honor bundle. And at least for me, I got free expedited shipping for tomorrow! Just placed the order.


----------



## uniwarking

I'm planning to wait until EVGA or MSI drops a hybrid card... worth the wait? I'd really like it now though; awesome card! Blower-style cards are noisy, especially OC'd and folding, and I feel like the aftermarket open-air designs really spread heat through the case. I don't know how long it will be...

Sent from my iPhone using Tapatalk


----------



## Vizkos

Quote:


> Originally Posted by *uniwarking*
> 
> I'm planning to wait until EVGA or MSI drops a hybrid card... worth the wait? I'd really like it now though, awesome card! Blower style cards are noisy, especially OC'd and folding. I feel like the Aftermarket designs really spread heat through the case. I don't know how long it will be...
> 
> Sent from my iPhone using Tapatalk


Worth waiting, I'd say. No use screwing yourself with something inferior to what you want because you're impatient. The only really demanding game coming out in the near future is Middle-earth: Shadow of War in August. Until then it's only ME Andromeda. I am planning on waiting until June or so, then dumping my 980 Ti SLI.


----------



## Slackaveli

Quote:


> Originally Posted by *lilchronic*
> 
> @Slackaveli you can quote multiple people with the multi tab. It's right next to the quote tab.


well aware. but i was reading through the thread; it didn't happen at the same time. Thanks for the tip all the same.


----------



## stocksux

1080 ti so old hat now. Had it a week. When's Volta coming out? I need more


----------



## gopanthersgo1

In before 10.5GB fast vram and 0.5GB slow stuff.







I entered EVGA's step-up queue Saturday. This will be my first top-end GPU since my GTX 480; hopefully I can get 4 years out of it at 2K. (Going to be in uni so imma be broke.)


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> 2015s were the same as 2016s but need a one connect box to give then the Tizen OS and octa-core. yours came with that already.


Gotcha, thanks for clarifying








Quote:


> Originally Posted by *Slackaveli*
> 
> what do you have as your "scaling" setting in NCP settings? Under 'Adjust desktop size and position'. If you are using no scaling or aspect ratio, that could be why. or make sure your custom rez is a properly scaled aspect ratio (ie 4000x2250 is a proper 16:9), then you can use "aspect ratio" setting. ive found that works best.


Figured it out! It was the HDR mode in-game that caused the problem. Turned that off and now have 1440p working. Will test to see how it runs.


----------



## kckyle

got my 1080 ti


----------



## kl6mk6

Put my EK block on the other day using a Liquid Metal Pad. My temps were in the high 50s.











Took it all apart again today and found that the pad had fused properly to the die but was not making any contact with the EK block. Fortunately I had some leftover Conductonaut. Slathered some of that right over the thermal pad and dropped my temps almost 20C.











Was able to get my Time Spy overall above 10K (too bad it wasn't valid).

http://www.3dmark.com/3dm/18643714?


----------



## keikei

Just going through some games now. Satisfied with DS3; it stays pretty consistently at 60fps. I can actually play SFV @ 5K. Not necessarily a demanding game, but awesome nevertheless. Odd situation with The Division: frames lock at 18fps for some reason. I'll go through the settings, but it seems to be a bug...it's Ubi, so who knows. Testing For Honor right now.


----------



## alucardis666

Quote:


> Originally Posted by *keikei*
> 
> Just going through some games now. Satisfied with DS3, stays relatively at 60fps. I can actually play SFV @5k. Not necessarily a demanding game, but nevertheless awesome. Odd situation with The Division. Frames lock at 18 fps for some reason. I'll go through the settings, but it seems to be a bug...its Ubi so who knows. Testing For Honor right now.


I hear you!

I have to play Wildlands and ME Andromeda @ 1440p with ultra settings to get decent frame rates; both run like total shi!t @ 4K, with frames dropping into the 30s.


----------



## dboythagr8

Battlefield 1 was a nice welcome back to SLI. Flickering problems all over the place.

Loaded up Rise of the Tomb Raider. Haven't played it in months. Threw everything on Ultra including AA (SSAA 4x) and was still able to pull avg of 66 fps. Image Quality was insane. Certain games like that which I have a good SLI profile I'll activate the second card for extra eye candy, but for the majority of the time, SLI will probably be disabled. A single card is stupid fast for just about any game at 1440p.


----------



## wirefox

Quote:


> Originally Posted by *wirefox*
> 
> sorry for being lazy, I haven't read the whole string.
> 
> I have a Zotac Artic Storm 1080 GTX
> 
> This card doesn't come with a factory fan. So...
> 
> ...I am curious if anyone has added a 1080 GTX EK or aftermarket waterblock to a 1080 "ti" ?
> 
> I want to buy the 1080ti, move the zotac block onto the 1080ti (it's a great block) and put the 1080ti fan onto the zotac 1080 card and resell -- obviously at a discount
> 
> 
> 
> 
> 
> 
> 
> 
> 
> massive thanks .wirefox


Anyone have thoughts? Maybe I'll start a new thread.

Ben


----------



## AMDATI

I think the FE is adequate. I keep her at a 50% fan speed maximum, and to me that's pretty quiet; 79C maximum temp limit. Still not even using half of my 4790k, so no CPU bottleneck here. Maybe not clocking to 2GHz, but I get some high 1800s and low 1900s here and there. Average is probably around 1700.


----------



## alucardis666

Quote:


> Originally Posted by *dboythagr8*
> 
> Battlefield 1 was a nice welcome back to SLI. Flickering problems all over the place.
> 
> Loaded up Rise of the Tomb Raider. Haven't played it in months. Threw everything on Ultra including AA (SSAA 4x) and was still able to pull avg of 66 fps. Image Quality was insane. Certain games like that which I have a good SLI profile I'll activate the second card for extra eye candy, but for the majority of the time, SLI will probably be disabled. A single card is stupid fast for just about any game at 1440p.


No offense, but congrats on your $700 space heater/paperweight. I've tried SLI with 2 mid-range cards, 2 low-end cards, and 2 high-end cards. Every single time it's been diminishing returns, headaches, issues, and just a plain waste of $$$. Always get the flagship card, or the step below, and just enjoy that it works: no issues, no extra heat, no extra noise, and no wasted money.


----------



## barsh90

Quote:


> Originally Posted by *kckyle*
> 
> 
> 
> got my 1080 ti


You really think you are special, don't you?

Pathetic...


----------



## AMDATI

Looks like someone's jelly


----------



## lilchronic

Quote:


> Originally Posted by *AMDATI*
> 
> I think the FE is adequate. I keep her at 50% fan speed maximum, and to me that's pretty quiet, 79c maximum temp limit. Still not even using half of my 4790k. No CPU bottleneck here. Maybe not clocking to 2Ghz, but I get some high 1800's and low 1900's here and there. But average is probably around 1700.


Check individual cores. 50% overall can be 2 cores / 2 threads maxed out, if that's all the game engine uses.
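The point here — that an overall CPU percentage can hide two saturated cores — can be sketched numerically. This is a toy illustration with made-up sample data, not output from any specific monitoring tool:

```python
def is_core_bound(per_core_util, saturated=95.0):
    """Return True if some core is pegged even though the average looks low.

    per_core_util: one utilization sample (%) per logical core.
    """
    avg = sum(per_core_util) / len(per_core_util)
    pegged = [u for u in per_core_util if u >= saturated]
    return len(pegged) > 0 and avg < 60.0

# A 4c/8t CPU where a game hammers only two threads:
sample = [98, 97, 20, 15, 10, 12, 8, 30]
# The overall average is only ~36%, yet two cores are saturated.
print(is_core_bound(sample))  # True
```

So "50% CPU usage" on its own says nothing; per-core graphs (Task Manager, HWiNFO, etc.) tell the real story.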


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> No offense but congrats on your $700 space heater/paperweight I've tried SLi with 2 mid-range cards, 2 low-end cards, and 2 high-end cards. Every single time it's been diminishing returns, headaches, issues and just a plain waste of $$$. Get the flagship card, or the step below always and just enjoy that it works, no issues, no extra heat, no extra noise, and no wasted money


That's me, and that's one reason why I skipped the 1080. Went from two 980 Tis: I keep the same (more or less) fps as I did with two 980 Tis in games that actually use SLI, and in the ones that didn't I made nice gains. No more Nvidia Inspector SLI bits!!


----------



## AMDATI

Quote:


> Originally Posted by *lilchronic*
> 
> Check individual core's. 50% can be 2 core / 2 threads maxed out if that's all the game engine uses.


That's from while playing For Honor. Plus, if it was only maxing out 2 threads and CPU bottlenecking, the GPU wouldn't be at 100%. Also at 1440p at 90-160fps. And I'm running stock CPU clocks too, just because I like to keep my temps ultra cool. This system should last as long as this Ti.


----------



## keikei

Derp. I had Afterburner locked to 18 fps... anywho, still tweaking The Division, but I can crank a lot of the effects higher with the card. We may need a few more gens to max out the game, unfortunately. Game still looks phenomenal. I forgot how addictive the gameplay can be. Probably will be playing all weekend.


----------



## MURDoctrine

So, this just happened playing Overwatch with +100/+500. I've never seen artifacting like this, if that's even what it is. The game was normal, then that green filter effect traveled from top to bottom. When the kill cams came up the colors were normal, but it looked as if it had scanlines in it. When I left the game it was back to normal on the main menu. Neither of my other two displays was affected at all. The monitor the game is on is using the DP to DVI adapter; I'm wondering if that was the cause.

*Warning: I say a dirty word.*


----------



## alucardis666

Quote:


> Originally Posted by *MURDoctrine*
> 
> SO. This just happened playing Overwatch with +100/+500. I've never seen artifacting like this if that's even what it is. The game was normal then that green filter effect traveled from top to bottom. When the kill cams cam up it was normal colors but looked as if it had scanlines in it. When I left the game it was back to normal on the main menu. Neither of my other two displays were affected at all. This monitor that the game is on is using the DP to DVI adapter. I'm wondering if that was the cause.
> 
> *
> Warning I say a dirty word*.


I must be blind cuz I don't see any artifacting in the vid lol


----------



## MURDoctrine

Quote:


> Originally Posted by *alucardis666*
> 
> I must be blind cuz I don't see any artifacting in the vid lol


That isn't supposed to be green. It's supposed to look identical to the kill cam footage. That's why I was like, what the hell, I've never seen anything like that in all my years. I thought it might be the adapter, but then it should be affecting everything. Maybe Blizzard is trolling us like they did with the Sombra release and are applying this at random with the Orisa launch coming up in a few days.


----------



## alucardis666

Quote:


> Originally Posted by *MURDoctrine*
> 
> That isn't suppose to be green. Its supposed to look identical to the kill cam footage.


I see, sorry, I don't play much overwatch.









What're your load temps like? I've noticed that different max OCs are possible with better cooling, resulting in higher sustained clocks and increased stability. I'd try cranking the fan to 100% and playing to see if it happens again; if it doesn't, you know it was the heat that caused it. If it happens again with the fan going 100%, I'd lower that mem offset to say 400-475 or so to see if it fixes it.
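While reproducing the artifacts, it helps to log temperature and core clock, e.g. with `nvidia-smi --query-gpu=temperature.gpu,clocks.sm --format=csv,noheader -l 1 > log.csv`, and then check the peaks. Below is a minimal parser sketch for that CSV shape; the sample lines are invented for illustration:

```python
def peak_temp_and_clock(csv_lines):
    """Parse 'temperature.gpu, clocks.sm' rows that look like '74, 1987 MHz'."""
    temps, clocks = [], []
    for line in csv_lines:
        t, c = line.split(",")
        temps.append(int(t.strip()))
        clocks.append(int(c.strip().split()[0]))  # drop the 'MHz' unit
    return max(temps), max(clocks)

# Made-up log lines in the nvidia-smi CSV format described above:
log = ["68, 1987 MHz", "74, 1949 MHz", "71, 1974 MHz"]
print(peak_temp_and_clock(log))  # (74, 1987)
```

If peak temperature correlates with the artifacts, it's heat; if they appear even when the card is cool, the memory offset is the more likely culprit.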


----------



## MURDoctrine

Quote:


> Originally Posted by *alucardis666*
> 
> I see, sorry, I don't play much overwatch.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What're your load temps like? I've noticed that different max oc's are possible wither better cooling thus resulting in higher sustained clocks and increased stability. I'd try cracking the fan to 100% and playing to see if it happens again, if it doesn't you know that it was the heat that caused it. If it happens again with the fam going 100% I'd lower that mem offset to say 400-475 or so to see if it fixes it.


I hit 74C max during that session as I have a pretty aggressive fan curve setup. Ambient is probably ~24C. I'm just trying to figure it out as I've never seen anything like that before. No worries as the card will be under water soon enough and thermals won't be an issue anymore.


----------



## alucardis666

Quote:


> Originally Posted by *MURDoctrine*
> 
> I hit 74C max during that session as I have a pretty aggressive fan curve setup. Ambient is probably ~24C. I'm just trying to figure it out as I've never seen anything like that before. No worries as the card will be under water soon enough and thermals won't be an issue anymore.


I noticed some green artifacts in Mass Effect once I hit ~70c; lowered offsets, pumped up the fan, and it was gone. Hoping you have a similar experience when your card is under water.


----------



## MURDoctrine

Quote:


> Originally Posted by *alucardis666*
> 
> I noticed some green artifacts in Mass Effect once I hit ~70c, lowered off sets, pumped up the fan and it's gone. Hoping you have a similar experience when your card is under water.


Yeah just did some more testing. It was the mem that was causing it. Did it again after a few minutes then tried it at +475 and it seems fine. Thanks.


----------



## fluidzoverclock

I'm using a 4770k @ 4.2ghz [ht enabled] alongside my newly acquired palit 1080ti fe.

Running Forza 3, I see massive frame dips down to the 40s when I'm passing shops in the city.
Before I upgraded to this 1080ti I was running an overclocked 980ti, and she had no issues running this game at high settings with occasional dips to 55fps, but this 1080ti, with the same preset, is dropping to the 40s.

980ti, high settings, 720p - 4k, city areas, drops to 50s
1080ti, high settings, 720p - 4k, city areas, drops to 40s

Same CPU speed for both cards. The 1080ti has an aggressive fan curve, so it doesn't run past 70c in gameplay.

I messed around with the graphics settings and resolutions in Forza, running at stock (1440p), pushing up to 4k and down to 720p; there is minimal difference in fps in the affected areas. I also cranked Forza to ultra and the fps doesn't move much. What's worrying is that 1080ti usage sits below 70 percent, and it downclocks itself as it's not being fully utilized.

Even running Forza 3 on the low preset I get fps dips under 60fps with the 1080ti @ 1440p.

I've tried a few things people have mentioned, such as disabling cores, setting priority, etc., to no avail.

I've reinstalled the latest driver countless times using DDU [clean install in safe mode], so I don't think it's a driver issue.

Is the 4770k @ 4.2ghz to blame? Is it not playing nicely with the 1080ti?


----------



## fisher6

Quote:


> Originally Posted by *fluidzoverclock*
> 
> I'm using a 4770k @ 4.2ghz [ht enabled] alongside my newly acquired palit 1080ti fe.
> 
> Running Forza 3 I see massive frame dips down to 40s when I'm passing shops in the city.
> Before I upgraded to this 1080ti, I was running a 980ti, overclocked, and she had no issues running this game at high settings with occasional dips to 55fps, but this 1080ti, with the same preset, is dropping to 40s.
> 
> 980ti, high settings, 720p - 4k, city areas, drops to 50's
> 1080ti, high settings, 720p - 4k, city areas, drops to 40's
> 
> Same cpu speed for both cards. 1080ti has an aggressive fan curve so doesn't run past 70c in gameplay.
> 
> I messed around with the graphics in Forza and resolutions, running at stock (1440p) and pushing it up to 4k and down to 720p, there is minimal difference in fps for those areas affected in gameplay. I also cranked Forza to ultra and the fps doesn't move much. What's worrying is, 1080ti usage sits below 70 per cent, and downclocks itself as it's not being fully utilized.
> 
> Even running Forza 3 on low preset i get fps dips under 60fps with the 1080ti @ 1440p.
> 
> I've tried a few things people have mentioned, such as, disabling cores, priority etc to no avail.
> 
> I've reinstalled the latest driver countless times using ddu [setup as a clean card in safe mode] so I'm not seeing a driver issue I don't think.
> 
> Is the 4770k @ 4.2ghz to blame? Is it not playing nicely with the 1080ti?


I have no issues running Forza 3. I used to get dips in the town on the north east of the map I think but that is a known issue. I don't get them anymore with the 1080 Ti. Have you tried any other games to see if you get poor performance?


----------



## MURDoctrine

Quote:


> Originally Posted by *fluidzoverclock*
> 
> I'm using a 4770k @ 4.2ghz [ht enabled] alongside my newly acquired palit 1080ti fe.
> 
> Running Forza 3 I see massive frame dips down to 40s when I'm passing shops in the city.
> Before I upgraded to this 1080ti, I was running a 980ti, overclocked, and she had no issues running this game at high settings with occasional dips to 55fps, but this 1080ti, with the same preset, is dropping to 40s.
> 
> 980ti, high settings, 720p - 4k, city areas, drops to 50's
> 1080ti, high settings, 720p - 4k, city areas, drops to 40's
> 
> Same cpu speed for both cards. 1080ti has an aggressive fan curve so doesn't run past 70c in gameplay.
> 
> I messed around with the graphics in Forza and resolutions, running at stock (1440p) and pushing it up to 4k and down to 720p, there is minimal difference in fps for those areas affected in gameplay. I also cranked Forza to ultra and the fps doesn't move much. What's worrying is, 1080ti usage sits below 70 per cent, and downclocks itself as it's not being fully utilized.
> 
> Even running Forza 3 on low preset i get fps dips under 60fps with the 1080ti @ 1440p.
> 
> I've tried a few things people have mentioned, such as, disabling cores, priority etc to no avail.
> 
> I've reinstalled the latest driver countless times using ddu [setup as a clean card in safe mode] so I'm not seeing a driver issue I don't think.
> 
> Is the 4770k @ 4.2ghz to blame? Is it not playing nicely with the 1080ti?


This is definitely an issue with the game. I have a 4770k @4.5ghz and have similar issues. My CPU never really went above 60% utilization, even when I went to the city and had dips down to the mid 40s at times. Micro$oft thought, "Hey, let's combat piracy by encoding everything!" so I think that's part of what we're experiencing. I also have my game on a 1TB HDD, so some of the poor performance may come down to that.

*EDIT*
*Forgot to mention I'm running it at 1080p.*


----------



## RobotDevil666

So what sorts of overclocks are you guys getting?
So far I'm @ 2039, which is +150 on the core and +200 on memory, but I'm hoping to push it further over the weekend.
Also, I see a few people decided to get 1080Ti SLI. I did the same thing, but after testing them I decided to return one: most games have no SLI support, bad or broken SLI, or a negative performance impact.
I think the only game that worked well was GTA V, and that's not enough of a reason to keep a £700 card.
Things like that made me ditch SLI for a single 1080Ti. SLI support is so bad nowadays it's pointless to use it; a complete waste of money.
And that comes from someone who used SLI for a decade, with every Nvidia generation starting with the 9800GX2.
SLI support has been up and down throughout the years, but I was always able to justify it, until now.
The influx of DX12 games that mostly do not support SLI, and generally crap support, killed it for me.
I hope this changes and SLI makes a comeback, or gets replaced with some new multi-GPU tech, but for now it's just a waste of money.


----------



## zipeldiablo

Quote:


> Originally Posted by *dboythagr8*
> 
> Battlefield 1 was a nice welcome back to SLI. Flickering problems all over the place.
> 
> Loaded up Rise of the Tomb Raider. Haven't played it in months. Threw everything on Ultra including AA (SSAA 4x) and was still able to pull avg of 66 fps. Image Quality was insane. Certain games like that which I have a good SLI profile I'll activate the second card for extra eye candy, but for the majority of the time, SLI will probably be disabled. A single card is stupid fast for just about any game at 1440p.


You're running Rise of the Tomb Raider in 4K, right? Kinda low fps for 1440p.


----------



## Tonza

This thing is a monster, dat GPU score (31594). Maximum for my card seems to be +200 core (2076mhz boost) and +350mhz mem. Getting a waterblock next week so I don't need to play with my cheeks flapping; this stock cooler is so loud I can feel the air flowing through my body.

http://www.3dmark.com/fs/12025698


----------



## Norlig

Can anyone let me know if I will be able to mount an AIO Cooler onto the GPU of Founders edition, with the VRAM cooler/shroud still installed?

i.e.

Is the GPU of the 1080ti founders edition: Higher(A) , Same(B) or lower(C) than the RAM cooler next to it?










Worried it will not touch the GPU if the answer is C










thanks.


----------



## MrKenzie

I'm surprised you guys aren't talking more about the power limit. That has been the bane of the Titan X Pascal since launch, and the 1080Ti has the same power limit. I think most of you will find that once you water cool, you will still see a lot of downclocking due to the power limit. Only the custom 1080Tis with a higher power limit will be able to sustain clocks between 2100-2200MHz, unless a BIOS mod can solve it?


----------



## Tonza

Quote:


> Originally Posted by *Norlig*
> 
> Can anyone let me know if I will be able to mount an AIO Cooler onto the GPU of Founders edition, with the VRAM cooler/shroud still installed?
> 
> i.e.
> 
> Is the GPU of the 1080ti founders edition: Higher(A) , Same(B) or lower(C) than the RAM cooler next to it?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Worried it will not touch the GPU if the answer is C
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> thanks.


Gamers Nexus has a 1080Ti FE AIO build log; they put an EVGA hybrid cooler on the stock shroud and it makes contact.


----------



## SperVxo

Quote:


> Originally Posted by *Swolern*
> 
> Got my EVGA FE in today. Quick dirty OC before work gave me a stable 2050/6000 on stock voltage. Throttling to 1950mhz over time. Max temps are 70c with custom fan curve.
> 
> Anyone hitting over 2100 or 2200mhz to the core?


What kind of fan % is it running to keep that temp? Loud?


----------



## GreedyMuffin

The 1080Ti should yield lower temps than my 1080, right? Because the die is bigger, the heat spreads faster? My 980Ti ran cooler than my 1080, even though its TDP was 70W lower.


----------



## SperVxo

Quote:


> Originally Posted by *Norlig*
> 
> Can anyone let me know if I will be able to mount an AIO Cooler onto the GPU of Founders edition, with the VRAM cooler/shroud still installed?
> 
> i.e.
> 
> Is the GPU of the 1080ti founders edition: Higher(A) , Same(B) or lower(C) than the RAM cooler next to it?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Worried it will not touch the GPU if the answer is C
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> thanks.


If you have the EVGA AIO it will work fine. But it seems like if you have Corsair or NZXT or whatever, you will need a copper shim.


----------



## Silent Scone

Quote:


> Originally Posted by *GreedyMuffin*
> 
> The 1080Ti should yields me lower temps than my 1080, right? Because the DIE is bigger so the heat moves quicker? My 980Ti was colder than my 1080, even though the TDP was 70 lower.


Depending on the cooling you'll likely notice a small improvement. The process size on your 980 was also larger. These 16nm cards can get hot quickly.


----------



## THE_WITCHER(TM)

Quote:


> Originally Posted by *Norlig*
> 
> Can anyone let me know if I will be able to mount an AIO Cooler onto the GPU of Founders edition, with the VRAM cooler/shroud still installed?
> 
> i.e.
> 
> Is the GPU of the 1080ti founders edition: Higher(A) , Same(B) or lower(C) than the RAM cooler next to it?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Worried it will not touch the GPU if the answer is C
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> thanks.


Yes you can; I just saw a video on YouTube by Gamers Nexus, look it up. I did my card with an EVGA Hybrid 1080 kit, and all I had to do was cut a square piece for the power plugs and a little piece for a power capacitor to make the EVGA shroud fit. It's perfect: max temp in 3DMark, Batman, or Rise is 43c, it auto-boosts to 1890mhz, and there's no noise.


----------



## Blackclad

MSI 1080ti

Motherboard: ASUS Maximus IX Code
RAM: 32G Dominator Pro 3200mhz
Processor: i7-7700k (stock) have not messed much with overclocking yet and recently cleared CMOS for good measure (latest BIOS)
PSU: Corsair RM850i
Cooler: Kraken 62
Case: Phanteks Eclipse P400s
Monitor: ROG Swift

This is a new build; all I have done is set my RAM to XMP. No overclock (I'm at 4.2).

I ran 3DMark FS and got a score of around 13000; also, my card has god-awful coil whine. BF1 Ultra gives me ~65 FPS, Wildlands ~50FPS, and WoW even dips down to 40FPS at times. I can't seem to find the problem.

I have:
Reinstalled Chipset
DDU->Clean install 3 times now
Disconnected the card, checked all connections, reinstalled card
Switched to other PCI Slot
Switched monitor connections

BF1 for example, using CAM, GPU is ~100% with temps hovering at 84C

Newegg has sent me a label for RMA, but before I send it on its way I wanted to run this all by you guys, see if I'm missing something.


----------



## dboythagr8

Quote:


> Originally Posted by *alucardis666*
> 
> No offense but congrats on your $700 space heater/paperweight I've tried SLi with 2 mid-range cards, 2 low-end cards, and 2 high-end cards. Every single time it's been diminishing returns, headaches, issues and just a plain waste of $$$. Get the flagship card, or the step below always and just enjoy that it works, no issues, no extra heat, no extra noise, and no wasted money


Well aware of how it works. I've SLI'd every generation since the 580, and I've gone to single cards in between.
Quote:


> Originally Posted by *zipeldiablo*
> 
> You're running rise of the tomb raider in 4k right? kinda low fps for 1440p


1440p with *SSAA 4X* which is supersampling


----------



## Silent Scone

Super sampling is a massive waste of performance. Buy more pixels or go home


----------



## pantsoftime

Quote:


> Originally Posted by *mouacyk*
> 
> So, it looks like LN2 BIOses were publicly available for the GTX 1080 and the power limits were removed completely. However, the voltages were locked at a rather dangerous level, 1.2v+. When custom models of the Ti come out, this is probably the best option (besides a fully custom BIOS) to eliminate power throttling if an LN2 BIOS is made available. I would think 1.2v on water is doable?


The voltages were not locked at 1.2V. The BIOS allows the card to run up to 1.2V, but only if you crank the voltage slider and setup a very particular style of VF curve. The BIOS runs normal voltages unless you go out of your way to make it run above that. Most people found that running higher voltages didn't help their overclocks whatsoever and ran this BIOS at regular voltage levels.


----------



## BrainSplatter

Quote:


> Originally Posted by *Silent Scone*
> 
> Super sampling is a massive waste of performance.


Depends completely on your personal preferences. I personally hate the slightest sign of aliasing. Super sampling (besides others like DSR) is one of the great features to effectively get rid of aliasing. More potential for anti-aliasing is by far my main reason to invest so much in GPUs.

Other people don't care about jaggies but need 200fps. To each his own


----------



## Joshwaa

Just thought I would check in with Slack. On my 1080ti with the Kryonaut, at 40% fan with the card idle, the ambient and GPU temps are exactly the same. Put a good load on the GPU (118% power) and the temp goes to 70C with 75% fan. So a little better than stock paste, I'm guessing.


----------



## mshagg

Quote:


> Originally Posted by *MrKenzie*
> 
> I'm surprised you guys aren't talking more about power limit. That has been the bain of the Titan X Pascal since launch, and the 1080Ti has the same power limit. I think most of you will find that once you water cool, you will still see a lot of downclocking due to power limit. It will only be the custom 1080Ti's with a higher power limit that can sustain clocks between 2100-2200MHz unless a bios mod can solve it?


Hi. This is certainly my experience. Just fitted the EKWB block and ran through the first set of benches. At +175 core, +500 mem, the highest it wants to sustain over a given run is 2063MHz on the core. I don't really see power exceed 112%; I assume there's a backstop within Pascal to operate within its own limits despite what the setting is in Afterburner.

That being said, it pegs at 2063 and peaks at 41C on the diode, all in relative silence.

On the plus side, I have this thing for VR, which is notorious for GPU utilization < 100%, so the core speeds should hit target without running afoul of Pascal's power limit.


----------



## dboythagr8

Quote:


> Originally Posted by *Blackclad*
> 
> MSI 1080ti
> 
> Motherboard: ASUS Maximus IX Code
> RAM: 32G Dominator Pro 3200mhz
> Processor: i7-7700k (stock) have not messed much with overclocking yet and recently cleared CMOS for good measure (latest BIOS)
> PSU: Corsair RM850i
> Cooler: Kraken 62
> Case: Phanteks Eclipse P400s
> Monitor: ROG Swift
> 
> This is a new build, all I havve done is set my RAM to XMP. No overclock (I'm at 4.2)
> 
> I ran 3D Mark FS and got a score of around 13000, also my card has god awful coil whine. BF1 Ultra gives me ~65 FPS, Wildlands ~50FPS, WoW even dips down to 40FPS at times. I can't seem to find the problem.
> 
> I have:
> Reinstalled Chipset
> DDU->Clean install 3 times now
> Disconnected the card, checked all connections, reinstalled card
> Switched to other PCI Slot
> Switched monitor connections
> 
> BF1 for example, using CAM, GPU is ~100% with temps hovering at 84C
> 
> Newegg has sent me a label for RMA, but before I send it on it's way I wanted to run this all by you guys, see if I'm missing something.


What is your CPU usage like in those games? That is way too low for BF1 and I'm assuming you're at 1440p based on the Swift. I'm at the same resolution and average about 130fps on Ultra. I do have a 6 core processor at 4.5ghz, but overall your machine is much newer than mine.

I'd monitor your CPU and GPU usage and see what it's like during gaming. Do you have VSYNC on?
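The "monitor CPU and GPU usage" advice can be turned into a crude classifier over logged samples. The thresholds below are arbitrary illustrations, not any tool's rules:

```python
def guess_bottleneck(gpu_util, cpu_util):
    """Crude classification from matched utilization samples (%)."""
    gpu_avg = sum(gpu_util) / len(gpu_util)
    cpu_avg = sum(cpu_util) / len(cpu_util)
    if gpu_avg >= 95:
        return "gpu-bound"
    if cpu_avg >= 90:
        return "cpu-bound"
    # Neither pegged: look at vsync / frame caps, engine limits, drivers.
    return "something else (vsync/frame cap, engine limit, driver)"

# GPU pegged near 100%, like the BF1 case described above:
print(guess_bottleneck([99, 100, 98], [55, 60, 58]))  # gpu-bound
```

A GPU pegged at ~100% with low fps usually points at settings or the card itself (hence the RMA question), whereas low GPU usage plus one pegged CPU core points the other way.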


----------



## Silent Scone

Quote:


> Originally Posted by *BrainSplatter*
> 
> Depends completely on your personal preferences. I personally hate the slightest sign of aliasing. Super sampling (besides others like DSR) is one of the great features to effectively get rid of aliasing. More potential for anti-aliasing is by far my main reason to invest so much in GPUs.
> 
> Other people don't care about jaggies but need 200fps. To each his own


There are plenty of anti aliasing / post processing methods that don't obliterate performance, and do an acceptable job. If you want to use SSAA, then arguably you'd be better off with a higher resolution panel in the first place, which would give you better results.


----------



## dboythagr8

Quote:


> Originally Posted by *Silent Scone*
> 
> There are plenty of anti aliasing / post processing methods that don't obliterate performance, and do an acceptable job. If you want to use SSAA, then arguably you'd be better off with a higher resolution panel in the first place, which would give you better results.


Yes, this is known.

I literally just went and decided to check every box to see where performance would be on SLI 1080tis. Overkill was the point.


----------



## Biggu

Quote:


> Originally Posted by *Biggu*
> 
> I dont know about the active one since you will need to replace the outlet of the block with the one that comes with it however the passive one (which I got) should work no problems.


Quote:


> Originally Posted by *Biggu*
> 
> I will know tomorrow for sure. I ordered Watercool HEATKILLER® IV for TITAN X (Pascal) - ACETAL Nickel-BL and Aquacomputer backplate for kryographics Pascal NVIDIA TITAN X, passive for mine


So for those still looking: this backplate DOES NOT work with the Heatkiller IV block. The holes line up, but the screws are the wrong size and not long enough.


----------



## Norlig

Quote:


> Originally Posted by *Tonza*
> 
> Gamernexus has 1080Ti FE AIO build log, they put EVGA hybrid cooler to the stock shroud and it makes contact.


Quote:


> Originally Posted by *SperVxo*
> 
> If you have the Evga AIO it will work fine. But it seems like if you have Corsair or NZXT or whatever you will nee a copper shim?


Quote:


> Originally Posted by *THE_WITCHER(TM)*
> 
> B
> Yes you can just saw a video on YouTube by Gamers Nexus look it up however I just did my card with an evga hybrid 1080 kit and all I have to do is cut a square piece for the power plugs and a little piece for a power capacitor to make the evga Shroud fit and it's perfect max temp on 3D mark batman or rise 43c it auto boost to 1890mhz also no noise.


The EVGA Hybrid has a circle that sticks out in the middle, so it seems like it would go down into the square where the GPU sits.

The Arctic Cooling Accelero Hybrid that I have, is completely flat.


----------



## THE_WITCHER(TM)

Got it yesterday; FedEx sucks, it was supposed to be delivered Wednesday lol. Installed the cooler without any issues, only had to cut 2 square pieces of aluminum.


----------



## THE_WITCHER(TM)

Quote:


> Originally Posted by *Norlig*
> 
> The EVGA Hybrid has a circle that sticks out in the middle, so it seems like it would go down into the square where the GPU sits.
> 
> The Arctic Cooling Accelero Hybrid that I have, is completely flat.


I see, yeah, you have to get the EVGA Hybrid to keep the stock shroud then.


----------



## RobotDevil666

Quote:


> Originally Posted by *Tonza*
> 
> This thing is a monster, dat GPU score (31594). Maximum for my card seems to be +200 core (2076mhz boost) and +350mhz mem. Getting waterblock next week, so i dont need to play my cheeks flapping, this stock cooler is so loud, can feel the air flowing through my body.
> 
> http://www.3dmark.com/fs/12025698


That's pretty good for air, but getting a waterblock is definitely the way to go.
I tested 2 1080Tis on air when I was deciding whether to keep SLI or not, and the two of them sound like a hurricane lol
Quote:


> Originally Posted by *mshagg*
> 
> Hi. This is certainly my experience. Just fitted the EKWB and ran through the first set of benches. At +175 core, +500 mem the highest it wants to sustain over a given run is 2063Mhz on the core. Dont really see power exceed 112%, I assume there's a backstop within pascal to operate within its own thermal limits despite what the setting is in afterburner.
> 
> That being said, it pegs at 2063 and peaks at 41% on the diode, all in relative silence.
> 
> On the plus side, I have this thing for VR, which is notorious for GPU utilization < 100%, so the core speeds should hit target without running afoul of pascal's power limit.


Is it stable at those clocks?
Those cards seem to be hitting 2GHz+ fairly easily; there's a Tom's Hardware review of a water-cooled 1080Ti that talks about the power limit issue.
In that review they hit 2.1GHz but were hampered by the power limit.

http://www.tomshardware.com/reviews/geforce-gtx-1080-ti-water-cooling,4975.html


----------



## Norlig

Quote:


> Originally Posted by *THE_WITCHER(TM)*
> 
> B
> I see yeah he has to get the evga hybrid to keep the stock shroud then


On the Arctic Cooling website: https://www.arctic.ac/eu_en/products/cooling/vga.html

The Hybrid II and Hybrid III list compatibility with the 1080ti, even though they are flat as well.

I will send them an email to confirm.


----------



## THE_WITCHER(TM)

Quote:


> Originally Posted by *Norlig*
> 
> On the Arctic Cooling website: https://www.arctic.ac/eu_en/products/cooling/vga.html
> 
> The Hybrid II and Hybrid III lists compatibility with 1080ti, even though they are flat aswell.
> 
> I will send them an email to confirm.


That's probably without the shroud but I could be wrong


----------



## mshagg

Quote:


> Originally Posted by *RobotDevil666*
> 
> Is it stable at those clocks ?
> Those cards seem to be hitting 2ghz + fairly easily, there's tomshardware review of WC 1080Ti that talks about the power limit issue.
> In this review they hit 2.1 but it's being hampered by the power limit.
> 
> http://www.tomshardware.com/reviews/geforce-gtx-1080-ti-water-cooling,4975.html


So far so good; I haven't had any signs of instability yet, but I've only been tinkering for an hour. In something like 3DMark 11 it has no dramas at 2075 (so obviously I'll push for 2100), but in Firestrike Extreme it's pegging back to 2062 and even 2038.

Don't get me wrong, 3,584 CUDA cores at over 2000MHz is suitable for my needs, but the power management seems to present a fairly hard limit with even a modest overclock. I think with the stock blower it was topping out at about 1900MHz when left to its own devices. I guess I could flex the fan curves on the radiators to try and bring the core temps down, but my observations are consistent with Tom's Hardware: power likes to sit around 112% regardless.

My Firestrike results are similar to those posted above, albeit with a slightly lower CPU ratio.

http://www.3dmark.com/fs/12027049


----------



## criminal

Quote:


> Originally Posted by *Norlig*
> 
> The EVGA Hybrid has a circle that sticks out in the middle, so it seems like it would go down into the square where the GPU sits.
> 
> The Arctic Cooling Accelero Hybrid that I have, is completely flat.


Shim


----------



## SperVxo

What is the measurement on the GPU, so I can get the right copper shim size? Is it 25mm x 25mm?


----------



## Slackaveli

Quote:


> Originally Posted by *MURDoctrine*
> 
> I hit 74C max during that session as I have a pretty aggressive fan curve setup. Ambient is probably ~24C. I'm just trying to figure it out as I've never seen anything like that before. No worries as the card will be under water soon enough and thermals won't be an issue anymore.


im pretty sure it was the adapter; the scan lines make me think so because you said it was an active adapter.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> I noticed some green artifacts in Mass Effect once I hit ~70c, lowered off sets, pumped up the fan and it's gone. Hoping you have a similar experience when your card is under water.


Quote:


> Originally Posted by *Joshwaa*
> 
> Just thought I would check in with Slack. On my 1080ti with the Kryonaut at 40% fan and the card on idle the ambient temp and gpu temps are the exact same. Put a good load on the gpu (118% power) temp goes to 70C with 75% fan. So a little better than stock paste I am guessing.


yeah, that's something like 5-8c it appears. VERY NICE!!


----------



## Norlig

Quote:


> Originally Posted by *SperVxo*
> 
> What is the measurment on the gpu? So i can get the right copper shim size? is it 25mmx25mm ?


According to this:
https://www.reddit.com/r/5z91t7/mod_the_poor_mans_hybrid_h55_gtx_10xx_fe/

Yes


----------



## Joshwaa

Quote:


> Originally Posted by *Norlig*
> 
> According to this:
> https://www.reddit.com/r/5z91t7/mod_the_poor_mans_hybrid_h55_gtx_10xx_fe/
> 
> Yes


I can tell you, as I have taken off my cooler: the GPU die is not an exact square.


----------



## Slackaveli

Quote:


> Originally Posted by *dboythagr8*
> 
> What is your CPU usage like in those games? That is way too low for BF1 and I'm assuming you're at 1440p based on the Swift. I'm at the same resolution and average about 130fps on Ultra. I do have a 6 core processor at 4.5ghz, but overall your machine is much newer than mine.
> 
> I'd monitor your CPU and GPU usage and see what it's like during gaming. Do you have VSYNC on?


Quote:


> Originally Posted by *Joshwaa*
> 
> I can tell you as I have taken off my cooler. The gpu die is not an exact square.


somewhere way back in this thread a guy had all the info on this. Not sure where, but he even links where you can order these from the good ole USA.


----------



## Norlig

Quote:


> Originally Posted by *Joshwaa*
> 
> I can tell you as I have taken off my cooler. The gpu die is not an exact square.


The die? No.

The square around it is, which is part of the weight distribution so the die won't crack.


----------



## SperVxo

Quote:


> Originally Posted by *Joshwaa*
> 
> I can tell you as I have taken off my cooler. The gpu die is not an exact square.


Yeah, I saw that on Nexus. Do you know what size it is? If I get a copper shim it needs to cover the whole area.


----------



## jarble

As expected, FedEx pushed my package arrival date out.

So it looks like it will be next week before I get to play with my new toys.


----------



## pantsoftime

Has anyone compared the Ti FE PCB to a TXP PCB to see what's missing around the DVI port? Is it just the DVI port itself or are some additional components also missing? I'm looking at some photos and on the top of the board they look the same. I don't know about the bottom though.

If it was just a matter of soldering on a DVI connector it might be a worthwhile mod for some folks. The BIOS would have to support it also, but I would imagine you could use an aftermarket card BIOS and many already have the DVI ports. Most aftermarket cards have the same voltage controller so BIOSes should be compatible.

Speaking of BIOSes I'm curious what would happen if someone flashed a TitanXP BIOS to a Ti or vice versa. It wouldn't provide any performance value but might teach us a bit more about how these cards work.


----------



## tconroy135

Quote:


> Originally Posted by *jarble*
> 
> As expected fedex pushed my package arrival date out
> 
> So it looks like it will be next week before I get to play with my new toys


It feels to me like FedEx has much lower standards than UPS.


----------



## CptSpig

Quote:


> Originally Posted by *alucardis666*
> 
> Impressive. If only it could do that with water and not LN2 we'd be set!


Don't get too excited; he is using highly modified EVGA cards on LN2. He was not able to do these modifications on the Titan XP because the board is much more complicated. So far, without these modifications, the 1080 Ti has been able to get a graphics score of 11548 on LN2. The Titan XP has a graphics score of 12316 on LN2. These scores are in Time Spy, single card. Not bad for a $700.00 card.

http://s1164.photobucket.com/user/C...1ce700c_1080Tivoltmodded_zpsw2szzqo0.jpg.html


----------



## D13mass

Quote:


> Originally Posted by *Biggu*
> 
> I will know tomorrow for sure. I ordered Watercool HEATKILLER® IV for TITAN X (Pascal) - ACETAL Nickel-BL and Aquacomputer backplate for kryographics Pascal NVIDIA TITAN X, passive for mine


Quote:


> Originally Posted by *Biggu*
> 
> So for those still looking, this backplate DOES NOT workwith the heatkiller IV block. the holes line up however the screws are the wrong size and not long enough.


Thank you so much! It's really important for us!

How about the water block? Is it fully compatible?
Maybe you can take a few photos?
And one more question - is it OK to run the 1080 Ti with only the water block and no backplate?
Thanks.


----------



## Slackaveli

Quote:


> Originally Posted by *SperVxo*
> 
> Yeah I saw that on nexus. You know what size it is? If I get a copper shim it need to cover the whole area.


i think it was 25mm, but there was something about the width that's important. it's in this thread somewhere between, say, pages 14-75. pictures of the shims are there, so start scanning. or search this thread for "shim".

To the guy with fedex probz - i just got my nvidia store shipping refund. mine was set for friday morning, i got it monday at 11am. still got my $32 back.


----------



## alucardis666

Quote:


> Originally Posted by *MURDoctrine*
> 
> Yeah just did some more testing. It was the mem that was causing it. Did it again after a few minutes then tried it at +475 and it seems fine. Thanks.


Glad I could help


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Glad I could help


my ocd is perfectly fine with +178 memory as it gives 500.2 GB/s bandwidth. that allows me another +10 core to stick without crashing. how much bandwidth are we even really using? we are only using maybe half our vram, and with tiled caching we are at an effective ~1.3 TB/s when at 500 GB/s raw bandwidth. sounds good to me.
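For the curious, the arithmetic behind that bandwidth figure can be sketched in a few lines. The base clock and offset behavior below are assumptions based on typical 1080 Ti FE numbers (352-bit bus, ~5505 MHz DDR-domain memory clock), so treat it as an estimate - it lands at ~500.1 rather than 500.2:

```python
# Back-of-the-envelope GDDR5X bandwidth math for the GTX 1080 Ti.
# Assumptions (not measured from this card): 352-bit bus, base memory clock
# of 5505 MHz in the DDR domain (11 Gbps effective), and an Afterburner-style
# offset that adds directly to that clock.

BUS_WIDTH_BITS = 352   # GTX 1080 Ti memory bus
BASE_MEM_MHZ = 5505    # DDR-domain clock; x2 gives the effective data rate

def bandwidth_gbps(offset_mhz):
    """Theoretical memory bandwidth in GB/s for a given memory offset."""
    effective_mhz = (BASE_MEM_MHZ + offset_mhz) * 2     # two transfers/clock
    return effective_mhz * BUS_WIDTH_BITS / 8 / 1000    # bits->bytes, MB->GB

print(round(bandwidth_gbps(0), 1))    # stock: 484.4 GB/s
print(round(bandwidth_gbps(178), 1))  # +178:  500.1 GB/s
```

The same formula puts alucardis666's +313 at roughly 512 GB/s, which matches the 512.2 he reports to within rounding.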


----------



## duppex

I was going to wait for a GTX 1080 Ti FTW Hybrid release, but I just got my Dell 24" 1440p G-SYNC monitor, and it needs a workout.

Could anyone provide links to the correct EK full block + backplate needed to put an EVGA 1080 Ti FE under water?

Also not sure on fittings and tube size.

This is my first time putting a GPU under water.

I have a small NZXT Manta case, so I will get a small res/pump combo + 120mm radiator.

Any tips would be appreciated


----------



## Slackaveli

Quote:


> Originally Posted by *duppex*
> 
> I was going to wait on a GTX 1080ti FTW HYBRID release but just got my Dell 24" 1440p G-SYNC Monitor. And it needs a workout.
> 
> Could anyone provide links to the correct EK blocks and parts needed to put an EVGA 1080ti FE underwater.
> 
> First time putting a GPU underwater ?
> 
> Any tips would be appreciated


i have that monitor. it's pretty solid. i recommend a +67-75 "digital vibrance" setting in Nvidia Control Panel. Makes it look like an IPS panel.

you're welcome

oh, what type of cpu are you using? may need to get it oc'd and stable while waiting on your gpu. Needs a lot more cpu power to keep those high frames, i've found.


----------



## braindamage

Quote:


> Originally Posted by *Biggu*
> 
> So for those still looking, this backplate DOES NOT workwith the heatkiller IV block. the holes line up however the screws are the wrong size and not long enough.


Does the heatkiller IV block work with the stock backplate? I'm thinking of using the stock backplate for now, then grabbing the EK nickel backplate in the future (assuming the screws are correct).


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> my ocd is perfectly fine with +178 memory as it gives 500.2 bandwidth. that allows me another +10 core to stick without crashing. how much bandwidth are we even really using? we are only using maybe half our vram and with the tiled caching we are at 1.3tbps effective when @ 500gbps bandwidth. sounds goot to me.


Good point. I'm backing down to +313 for 512.2, gonna see if I can eke out a little more on the core.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Good point, I'm backing down to 313 for 512.2, gonna see if I can eek out a little more on the core.


yessir. got me another bin of boost. 2025 now.

I have ALWAYS been a high mem OC'er. i believe in it. But this time we are at the very cutting edge with these 11 Gbps G5X chips. I keep seeing that slide in my head of how 10 Gbps memory looks clocked to 11 vs these well-optimized native 11 Gbps chips. Makes me wonder what these would look like in slide form at 12 Gbps. I bet not nearly as optimized, just like the Titan's memory is when at 11. So, a minor mem bump to clear the 500 GB/s theoretical 'mountain' but with clean, well-optimized bandwidth, i hope. plus the baby bump to core. also less worry about these mem chips running hot, because they are some baby nukes.


----------



## Blackclad

Quote:


> Originally Posted by *Blackclad*
> 
> MSI 1080ti
> 
> Motherboard: ASUS Maximus IX Code
> RAM: 32G Dominator Pro 3200mhz
> Processor: i7-7700k (stock) have not messed much with overclocking yet and recently cleared CMOS for good measure (latest BIOS)
> PSU: Corsair RM850i
> Cooler: Kraken 62
> Case: Phanteks Eclipse P400s
> Monitor: ROG Swift
> 
> This is a new build, all I have done is set my RAM to XMP. No overclock (I'm at 4.2)
> 
> I ran 3D Mark FS and got a score of around 13000, also my card has god awful coil whine. BF1 Ultra gives me ~65 FPS, Wildlands ~50FPS, WoW even dips down to 40FPS at times. I can't seem to find the problem.
> 
> I have:
> Reinstalled Chipset
> DDU->Clean install 3 times now
> Disconnected the card, checked all connections, reinstalled card
> Switched to other PCI Slot
> Switched monitor connections
> 
> BF1 for example, using CAM, GPU is ~100% with temps hovering at 84C
> 
> Newegg has sent me a label for RMA, but before I send it on it's way I wanted to run this all by you guys, see if I'm missing something.


Quote:


> Originally Posted by *dboythagr8*
> 
> What is your CPU usage like in those games? That is way too low for BF1 and I'm assuming you're at 1440p based on the Swift. I'm at the same resolution and average about 130fps on Ultra. I do have a 6 core processor at 4.5ghz, but overall your machine is much newer than mine.
> 
> I'd monitor your CPU and GPU usage and see what it's like during gaming. Do you have VSYNC on?


Affirm on G-Sync. V-Sync disabled.

Running BF1 right now I'm looking at:

CPU - 46c 52% @ 4500mhz
GPU - 84c 99% @ 1794mhz

Getting an AVG of ~70FPS @ Ultra


----------



## Biggu

Quote:


> Originally Posted by *braindamage*
> 
> Does the heatkiller IV block work with the stock backplate? I'm thinking of using the stock backplate for now, then grabbing the EK nickel backplate in the future (assuming the screws are correct).


Sadly no, the stock back plate will not work. You could probably drill out the holes a bit and make it work, but id rather just go aftermarket.

Quote:


> Originally Posted by *D13mass*
> 
> Thank you so much! It`s really important for us !
> 
> 
> 
> 
> 
> 
> 
> 
> How about water block ? Is it fully compatible?
> Maybe you can take a few photos ?
> And one more question - without backplate is it ok for using 1080ti only with water block?
> Thanks.


The waterblock works like a charm. I am running with no back plate now. Id take a picture but im kinda embarrassed by the way i did my tubes, since it's only temporary in my setup. Once Microcenter gets more Z270i boards in, my 1080 Ti will move to my sim pc and ill be putting my Titan X back in my main rig.


----------



## Slackaveli

Quote:


> Originally Posted by *Blackclad*
> 
> Affirm on G-Sync. V-Sync disabled.
> 
> Running BF1 right now I'm looking at:
> 
> CPU - 46c 52% @ 4500mhz
> GPU - 84c 99% @ 1794mhz
> 
> Getting an AVG of ~70FPS @ Ultra


that's what i get at 4k. im getting ~135 at 1440p. i do have that massive 128MB Crystalwell L4 cache but im not sure if that's where the extra comes from or not. some games i get a huge bump, usually in cpu-heavy games. this qualifies, as it ran 100% cpu usage on my old [email protected]


----------



## duppex

Quote:


> Originally Posted by *Slackaveli*
> 
> i have that monitor. it's gpretty solid. i recommend a +67-75 "digital vibrance" setting on Nvidia Control Panel. Makes it look like an IPS panel
> 
> 
> 
> 
> 
> 
> 
> 
> 
> yo welcome
> 
> oh, what type of cpu you using? may need to get it oc'd and stable while waiting on your gpu. Needs a lot more cpu power to keep those high frames i've found.


Thanks for the heads up on the monitor settings. Only had it for 1 day and very impressed. Got it direct from Dell, £445 including delivery.

I would totally recommend this monitor for anyone who is looking for 165Hz and a 1ms response time and does not want to spend a fortune on an IPS screen.

Just installed my 7700k, keeping it cool with an EVGA CLC 280 hydro cooler.

I am so confused about which 1080 Ti EK GPU parts to go for. Think I will jump over to the water-cooling forum and do some research.


----------



## jarble

Quote:


> Originally Posted by *braindamage*
> 
> Does the heatkiller IV block work with the stock backplate? I'm thinking of using the stock backplate for now, then grabbing the EK nickel backplate in the future (assuming the screws are correct).


I would ask that here http://www.overclock.net/t/528648/official-heatkiller-club/1050_50 water @Watercool-Jakob may know


----------



## Slackaveli

Quote:


> Originally Posted by *duppex*
> 
> Thanks for the heads up on the Monitor settings. Only had it for 1 day and very impressed. Got it direct from Dell £445 including delivery.
> 
> I would totally recommend this monitor for anyone who looking for 165hz 1ms response time. And does not want to spend a fortune on a IPS screen.
> 
> Just installed my 7700k, and keeping cool with EVGA CLC 280 Hydro Cooler.
> 
> I am so confused on which 1080ti EK GPU parts to go for. Think I will jump over to water-cooling forum and do some research.


yeah you will be good2go with Kaby.

That's a lot of folding, @jarble. what do you guess your lifetime earnings from folding are? like evga bucks and any other bonuses? I dont know much about it, admittedly.

***Edit: To the guy recommending the best rad fans, which models were they again? i want high-flow SP, best cooling please.


----------



## Blackclad

Quote:


> Originally Posted by *Slackaveli*
> 
> that's what i get at 4k. im getting 135~ at 1440p. i do have that crystalwelll massive 128mb L4 cache but im not sure if that's where the extra comes from or not. some games i get a huge bump. usually in heavy cpu games. this qualifies as it ran 100% cpu usage on my old [email protected]


Yeah this is just so frustrating. I'm trying to figure out if this is an error on my part, being a new system, or just a bad card.


----------



## Norlig

The first AIB 1080 Ti came out for sale on a Norwegian site today; it's 17.3% more expensive than the Founders Edition.

I thought the AIBs were gonna be cheaper than the FEs?


----------



## dboythagr8

Quote:


> Originally Posted by *Blackclad*
> 
> Affirm on G-Sync. V-Sync disabled.
> 
> Running BF1 right now I'm looking at:
> 
> CPU - 46c 52% @ 4500mhz
> GPU - 84c 99% @ 1794mhz
> 
> Getting an AVG of ~70FPS @ Ultra


That's definitely off.

Hmm not sure what could be going on here, maybe others can fill in.


----------



## Joshwaa

Just got my game code from nVidia, so they are starting to send them out. Bummer that you have to use the GeForce Experience thing to redeem them though.


----------



## Slackaveli

Quote:


> Originally Posted by *Blackclad*
> 
> Yeah this is just so frustrating. I'm trying to figure out if this is an error on my part, being a new system, or just a bad card.


i feel for you, man. when the whole system is new it's hard. do you have another system to try the Ti in? Or another gpu to try in your system?

Cool, Joshwaa! word. Im pretty sure we can just redeem it, then install, and then use the new "remove geforce experience only" option in ddu afterwards. my plan anyway.


----------



## dboythagr8

@Blackclad

-Check that the frame rate limiter is not on in the BF1 Graphics menu

-Turn GSYNC off and see what you get

-Double check VSYNC is not on in BF1 AND NVCP


----------



## alucardis666

Quote:


> Originally Posted by *Norlig*
> 
> First AIB 1080ti came out for sale on a Norwegian Site today, its 17.3% more expensive than the Founders edition.
> 
> I thought the AIB's were gonna be cheaper than the FE's?


Got a link? I don't see anything new on the egg. As for price, it was the 1080 FE that had the upcharge; one of Nvidia's selling points was that this FE would be priced at the same MSRP as AIB cards, so $700. The AIBs will go up from there based on their greed / how worthy they feel they are vs the stock Founders. I expect hybrid cards to be around $849.99 and watercooled ones approaching $949.99. I just hope things like the CLASSIFIED, HOF, and AMP will be less than $1000.


----------



## Slackaveli

Quote:


> Originally Posted by *Joshwaa*
> 
> Just got my game code from nVidia. So they are starting to send them out. Bummer you have to use the Experience thing to redeem them though.


Quote:


> Originally Posted by *Norlig*
> 
> First AIB 1080ti came out for sale on a Norwegian Site today, its 17.3% more expensive than the Founders edition.
> 
> I thought the AIB's were gonna be cheaper than the FE's?


ouch. with the same thermal and/or power limits and voltage caps? meh. FE is the way to go. Ideally maybe EVGA FEs, which is what i should have purchased, just to eliminate the need to hide your thermal mods if an RMA is needed.

Hey, Black, make sure your frame limiter in RivaTuner isn't on, either.


----------



## Ayahuasca

What kind of clocks are people seeing on the stock cooler? I've been running +130 core / +500 mem the past 2 days, but I've just had 2 crashes in The Witcher 3 after it played fine for a couple of days. The core only sits around 1950-1980 MHz. Is there a way to tell if it's the core or the memory crashing the game?


----------



## dboythagr8

Quote:


> Originally Posted by *Ayahuasca*
> 
> What kind of clocks are people seeing on the stock cooler, I've been running +130 core +500 mem the past 2 days but I've just had 2 crashes in The Witcher 3 after it playing fine for a couple of days. The core only sits around 1950-1980mhz , is there a way to tell if it's the core crashing the game or the memory?


+175/+300 has been my stable gaming OC.

Throttling usually settles the clocks at 2000-2012 MHz.


----------



## chibi

Quote:


> Originally Posted by *Ayahuasca*
> 
> What kind of clocks are people seeing on the stock cooler, I've been running +130 core +500 mem the past 2 days but I've just had 2 crashes in The Witcher 3 after it playing fine for a couple of days. The core only sits around 1950-1980mhz , is there a way to tell if it's the core crashing the game or the memory?


Best to OC the core and memory one at a time and test for stability. That way you know which variable is causing the problem.
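That one-variable-at-a-time process can be sketched as a simple search loop. The `stability_test` callback here is a hypothetical stand-in for an actual benchmark pass (Heaven, Firestrike, a Witcher 3 session) at each offset; the ~13 MHz step reflects the boost-bin size commonly reported for Pascal cards:

```python
# Sketch of the "one variable at a time" method: hold memory at stock, walk
# the core offset up in ~13 MHz steps until a run fails, then lock in the
# core and repeat the same loop for memory.

def find_max_stable(stability_test, lo=0, hi=200, step=13):
    """Return the highest offset in [lo, hi] that passes, or None."""
    best = None
    for offset in range(lo, hi + 1, step):
        if stability_test(offset):
            best = offset   # this bin passed; try the next one up
        else:
            break           # first failure: higher offsets won't be stable
    return best

# Mock run: pretend the card is stable up to +130 core, as reported above.
best_core = find_max_stable(lambda off: off <= 130)
print(best_core)  # -> 130
```

With the core result fixed, you would call the same function again with the memory stress test to isolate the second variable.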


----------



## dboythagr8

I'm really thinking of going ahead and putting my rig underwater. Never done it before. I'm thinking a front 360 rad in my Air 540, and a 240 rad up top. 4930k and 2x 1080Tis in the loop. Is EK considered the best to go with here? Anybody have any pointers or information I can watch/read?


----------



## Slackaveli

Quote:


> Originally Posted by *Ayahuasca*
> 
> What kind of clocks are people seeing on the stock cooler, I've been running +130 core +500 mem the past 2 days but I've just had 2 crashes in The Witcher 3 after it playing fine for a couple of days. The core only sits around 1950-1980mhz , is there a way to tell if it's the core crashing the game or the memory?


Im @ +178 memory (500 GB/s bandwidth) and +145 core. gets me 2025, settles in at 1974, and holds @ max 75c.

do a full day of youtube watching, daboy. learn before you order.


----------



## D13mass

Quote:


> Originally Posted by *Biggu*
> 
> Sadly no the stock back plate will not work. You could probably drill out the holes a bit and make it work but id rather just go aftermarket.
> Waterblock works like charm. I am running with no back plate now. Id take a picture but im kinda embarrassed with the way i did my tubes since its only temporary in my setup. Once Microcenter gets more z270i in my 1080 ti will move to my sim pc and ill be putting my titan X back in my main rig.


Do not be shy, take the photo


----------



## Joshwaa

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah you will be good2go with Kaby.
> 
> That's a lot of folding @jarble. what do you guess your lifetime earning from folding are? like evga bucks and any other bonuses? I dont know much about it, admittedly.
> 
> ***Edit: To the guy recomending the best rad fans, which models were they again? i want high flow sp , best cooling please.


Noctua Industrial PWM 3000rpm, 140mm or 120mm. I would not run them at 100% though, kinda loud. But they move plenty of air at 35% to 40% and are relatively quiet there.


----------



## keikei

Quote:


> Originally Posted by *dboythagr8*
> 
> +175/+300 has been my stable gaming OC.
> 
> Throttling usually settles the clocks in 2000mhz - 2012mhz.


Any significant difference in game frame rates?


----------



## braindamage

Quote:


> Originally Posted by *Biggu*
> 
> Sadly no the stock back plate will not work. You could probably drill out the holes a bit and make it work but id rather just go aftermarket.
> Waterblock works like charm. I am running with no back plate now. Id take a picture but im kinda embarrassed with the way i did my tubes since its only temporary in my setup. Once Microcenter gets more z270i in my 1080 ti will move to my sim pc and ill be putting my titan X back in my main rig.


Quote:


> Originally Posted by *jarble*
> 
> I would ask that here http://www.overclock.net/t/528648/official-heatkiller-club/1050_50 water @Watercool-Jakob may know


Thanks!
Quote:


> Originally Posted by *Ayahuasca*
> 
> What kind of clocks are people seeing on the stock cooler, I've been running +130 core +500 mem the past 2 days but I've just had 2 crashes in The Witcher 3 after it playing fine for a couple of days. The core only sits around 1950-1980mhz , is there a way to tell if it's the core crashing the game or the memory?


I can only get around +120 on the core at 100% fan speed in Unigine Heaven. +135 and up immediately locks up as soon as the program starts.


----------



## Blackclad

Quote:


> Originally Posted by *dboythagr8*
> 
> @Blackclad
> 
> -Check that the frame rate limiter is not on in the BF1 Graphics menu
> 
> -Turn GSYNC off and see what you get
> 
> -Double check VSYNC is not on in BF1 AND NVCP


No joy. 3DMark also gave me a whopping score of 13000, for what it's worth.


----------



## Slackaveli

Quote:


> Originally Posted by *Joshwaa*
> 
> Noctua Industrial PWM 3000rpm 140mm or 120mm. I would not run them at 100% though kinda loud. But they move plenty of air at 35% to 40% and a relatively quiet.


word. ty

@Blackclad , something's not right. try uninstalling afterburner. what score then?

and what is your ram running at?
Quote:


> Originally Posted by *Blackclad*
> 
> Affirm on G-Sync. V-Sync disabled.
> 
> Running BF1 right now I'm looking at:
> 
> CPU - 46c 52% @ 4500mhz
> GPU - 84c 99% @ 1794mhz
> 
> Getting an AVG of ~70FPS @ Ultra


gotta be your ram, man. it's not gettin fed.

my bad on the double post. got like 20 windows open right now and multi tasking and watching ncaa


----------



## blurp

Got my game code too!


----------



## Norlig

Quote:


> Originally Posted by *alucardis666*
> 
> Got a link? I don't see anything new on the egg. As for price, it was the 1080 FE that had the upcharge, one of Nvidia's selling points was that the FE would be priced at the same MSRP as AIB cards. so $700, the AIB's will go up from there based on their greed/how worthy they fell they are vs stock founders. I expect hybrid cards to be around $849.99 and watercooled ones approaching $949.99, I just hope things like the CLASSIFIED, HOF, and AMP will be less than $1000


Quote:


> Originally Posted by *Slackaveli*
> 
> ouch. with the same thermal and/our powr limits and voltage caps. meh. FE is the way to go. Maybe ideally evga FEs, what i should have purchased, just to eliminate the need to hide your thermal mods if an rma is needed.
> 
> Hey, Black, make sure your frame limiter in Rivatuner isnt on, either.


FE:
https://www.komplett.no/product/915924

AIB:
https://www.komplett.no/product/917194


----------



## Slackaveli

Quote:


> Originally Posted by *Norlig*
> 
> FE:
> https://www.komplett.no/product/915924
> 
> AIB:
> https://www.komplett.no/product/917194


im curious if that +80 factory clock doesnt just limit your oc. will it hold +150 on top of that? if not, it sure isnt worth it for some rgbs.

Got my code, too. it wasnt in my main gmail inbox, it was in the "promotions" folder.

i kinda want them both, for honor and wildlands. cant decide. if one of ya'll just hates them both, id be happy with a code kick-down. not beggin; volunteerin!


----------



## SperVxo

Quote:


> Originally Posted by *Norlig*
> 
> FE:
> https://www.komplett.no/product/915924
> 
> AIB:
> https://www.komplett.no/product/917194


Well that's expensive


----------



## jarble

me all day today


----------



## SperVxo

There have been 2 cards now with aftermarket coolers, the Asus and the Inno3D, and both of those had the same OC as the FE. So there doesn't seem to be a huge difference when it comes to power limit throttling?


----------



## sWaY20

Block showed up, shipped from Slovenia (EU) faster than my gpu did from Cali... sad!!! Ordered it Tues.


----------



## GMcDougal

The beast is finally in!


----------



## keikei

Quote:


> Originally Posted by *jarble*
> 
> 
> 
> me all day today


Yeah, the craziness doesn't really end until your baby is in the rig. Am i right?


----------



## GMcDougal

Quote:


> Originally Posted by *keikei*
> 
> Yeah, the craziness doesnt really end until your baby is in the rig. Am i right?


Im the same way. I track it like 100 times a day. I know i got problems, dont judge.


----------



## Slackaveli

Quote:


> Originally Posted by *jarble*
> 
> 
> 
> me all day today


they wont update when it leaves; it'll update at its next receiving spot, though.


----------



## OldNerd

I cracked and ended up ordering an MSI GTX 1080 Ti Founders Edition, which arrives tomorrow. Really excited to see how it will handle 4K, despite the fact I rarely game anymore LOL


----------



## PasK1234Xw

Quote:


> Originally Posted by *tconroy135*
> 
> It feels to me the FedEx has much lower standards than UPS.


Blame the storms on Tuesday in the midwest and east coast; everyone is playing catch-up, even UPS.

Was supposed to get my card Wednesday, didn't show up till today.


----------



## OldNerd

Quote:


> Originally Posted by *jarble*
> 
> 
> 
> me all day today


That will be me all day tomorrow and likely chasing the UPS truck since they cant get down my street because of the snow lol


----------



## Slackaveli

Quote:


> Originally Posted by *OldNerd*
> 
> I cracked and ended up ordering a MSI GTX 1080 Ti Founders Edition which arrives tomorrow. Really excited to see how it will handle 4K despite the fact I rarely game anymore LOL


but when you DO game, it needs to be RIGHT! Amiright?


----------



## AMDATI

I like to pretend that my 1080ti is still in the mail so that when I go to use my PC I can feel the excitement of just getting it again


----------



## PasK1234Xw

My card from nvidia is garbage: +150, and I'm not even sure that's stable. I went with them thinking they kept the better chips, like it seemed with the 1080s, but nope.
It also has legit issues: every time i reboot or turn on the computer it goes into the BIOS.

Going to get a refund and get EVGA for the transferable warranty and no BS when removing the cooler.
Luckily i already have an RMA number, since I tried to cancel the order but it had already shipped.


----------



## Slackaveli

sweet.

http://www.userbenchmark.com/UserRun/3138340

that's a z97 platform, fully maxed (w/ one gpu albeit) imo. that cpu baseline of 100% is stock 7700k. Beat that.
Quote:


> Originally Posted by *Joshwaa*
> 
> Noctua Industrial PWM 3000rpm 140mm or 120mm. I would not run them at 100% though kinda loud. But they move plenty of air at 35% to 40% and a relatively quiet.


https://www.amazon.com/NF-F12-iPPC-3000-PWM-Cooling/dp/B00KESS6O0 this guy?


----------



## Joshwaa

Quote:


> Originally Posted by *Slackaveli*
> 
> https://www.amazon.com/NF-F12-iPPC-3000-PWM-Cooling/dp/B00KESS6O0 this guy?


Yes. Pricey but IMO worth it.


----------



## Hulio225

Got mine today. Sadly I have to wait until Friday next week for my waterblock; when it arrives I'll power-target mod the card as well (shorting the shunt resistors).

http://www.userbenchmark.com/UserRun/3138733



Cheers ;-)


----------



## Blackclad

Quote:


> Originally Posted by *Slackaveli*
> 
> gotta be your ram, man. it's not gettin fed.
> 
> my bad on the double post. got like 20 windows open right now and multi tasking and watching ncaa


Are you referring to my system ram, or the 1080ti's vram?

Where would I go from here? Not sure what this all means


----------



## Slackaveli

Quote:


> Originally Posted by *Joshwaa*
> 
> Yes. Pricey but IMO worth it.


word . may have to grab a couple of these.
Quote:


> Originally Posted by *Blackclad*
> 
> Are you referring to my system ram? or the 1080ti ram
> 
> Where would I go from here? Not sure what this all means


im just wondering if maybe your system ram is running at 1300 or something. is your XMP profile on in your Bios? Set-up with your ram's best speed/timings?

jeez. ya'll type more so i dont keep double posting when i catch up on the thread lol.

Mod edit: Please use the edit function instead of double posting.


----------



## Blackclad

Quote:


> Originally Posted by *Slackaveli*
> 
> im just wondering if maybe your system ram is running at 1300 or something. is your XMP profile on in your Bios? Set-up with your ram's best speed/timings?
> 
> jeez. ya'll type more so i dont keep double posting when i catch up on the thread lol.


Yeah, XMP is set up for 3200mhz w/ correct timings and such. Still haven't overclocked. Would reinstalling the OS be a good idea?


----------



## Slackaveli

Quote:


> Originally Posted by *Blackclad*
> 
> Yeah XMP is set up for 3200mhz w/ correct timings and such. Still haven't overclocked. Would reinstalling OS be a good idea>?


Maybe? I'm kind of at a loss. No access to another GPU to try, or another computer to plug the GPU into?


----------



## degenn

w00p w00p just picked up an EVGA Founders Edition, sold off my 2 980Ti's.

Time to see what this thing can do, should be _close_ to a complete replacement for 2 980Ti's based on the reviews I've read. Good stuff.


----------



## x-apoc

I'm going to stick with my SLI 1080s for now. I've invested too much; selling them would get me a Ti, but I'd lose about 350 bucks on it, and I still outperform a single Ti.


----------



## pantsoftime

EVGA founders and EK waterblock just showed up. FedEx left me hanging all day. Oh well at least tomorrow should be fun!


----------



## RGSPro

Has anyone installed an EK Titan X(P) waterblock on their new Ti and gotten the Founders backplate installed as well? I know you could use that block with the founders backplate on the Titan X(P) but just wanted to make sure you can do the same thing with the 1080 Ti Founders.


----------



## dboythagr8

I forgot about DSR on Geforce cards and enabled it after folks in BF1 thread suggested I try it. Resolution was 5120 x 2880. Everything stayed on max settings. Played on the new Soissons DLC map, and still averaged 85fps in SLI. GPU usage was much much better vs lower resolution ~1440p.

Really need a new monitor lol
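For anyone wondering how a 1440p panel ends up rendering at 5120 x 2880: DSR factors are pixel-count multiples, so each axis scales by the square root of the factor. A quick illustrative sketch (the function name is mine, not an NVIDIA API):

```python
def dsr_resolution(native_w, native_h, factor):
    """Render resolution for a given DSR factor.
    The factor multiplies total pixel count, so each axis
    scales by sqrt(factor)."""
    scale = factor ** 0.5
    return round(native_w * scale), round(native_h * scale)

# 4.00x DSR on a 2560x1440 panel gives the 5120x2880 used above.
print(dsr_resolution(2560, 1440, 4.0))  # -> (5120, 2880)
```

That is a 4x pixel load, which explains why GPU usage shot up compared to native ~1440p.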


----------



## Slackaveli

Quote:


> Originally Posted by *x-apoc*
> 
> I'm going to stick with my sli 1080 for now. Invested too much, selling them back would got me Ti but I would lose on it - 350 bucks, I still outperform single Ti


Of course. If you buy the 80, wait for the next 80; if you buy Titans, wait for the next Titan; if you buy the Ti, well, I waited for the Ti. I do understand why some always get the 80, then the Ti, then the 80: constant 20%-30% upgrades that way. I'm just never willing to go backwards on cores and bus size/bandwidth (like going from a 980 Ti to a 1080), and Tis are always a beast, so I do the Ti-to-Ti cycle. Although I reserve the right to get Volta, because that new arch might be a little more than just the ~25% upgrade over the 1080 Ti.



----------



## Slackaveli

nm


----------



## dracotonisamond

Picked one of these up the other day.

got ekwb parts for the whole system on the way too.


----------



## lilchronic

Quote:


> Originally Posted by *RGSPro*
> 
> Has anyone installed an EK Titan X(P) waterblock on their new Ti and gotten the Founders backplate installed as well? I know you could use that block with the founders backplate on the Titan X(P) but just wanted to make sure you can do the same thing with the 1080 Ti Founders.


Yes, I have an EVGA 1080 Ti with the stock backplate on the EK-FC Titan X Pascal block.








It will work.


----------



## jarble

Quote:


> Originally Posted by *keikei*
> 
> Yeah, the craziness doesnt really end until your baby is in the rig. Am i right?


I'm pretty sure the crazy is permanent over here.


----------



## Slackaveli

Quote:


> Originally Posted by *dboythagr8*
> 
> I forgot about DSR on Geforce cards and enabled it after folks in BF1 thread suggested I try it. Resolution was 5120 x 2880. Everything stayed on max settings. Played on the new Soissons DLC map, and still averaged 85fps in SLI. GPU usage was much much better vs lower resolution ~1440p.
> 
> Really need a new monitor lol


Stay patient though. Yes, you really do, but you need it to be a 4K, >100 Hz, HDR one. I could see the argument for a 5K one, but not (to me) if it's stuck at 60 Hz.

Quote:


> Originally Posted by *Blackclad*
> 
> No joy, 3Dmark also gave me a whopping score of 13000 for what it's worth.


OK, one last thing: in the GPU-Z readout, what does it say for PCIe? Does it say x16 3.0 v1.1?


----------



## eth3rton

I ended up ordering a Watercool HEATKILLER IV for my Nvidia 1080Ti FE. Right now Watercool is having supply issues due to outsourcing the plating/finish work (per their rep here on OCnet). Decided to play around with the factory backplate that came with the 1080Ti and came up with the following. Looks great, saved myself about $40 (with shipping) and the trouble of removing the GPU when stock is available.

Drilled out the factory holes, at very slow speeds, and cleaned up any edges with a razor.


__
https://flic.kr/p/SsjeRq



The screws that came with the Heatkiller waterblock are ALMOST too short. Almost. They had just enough thread to ensure a tight fit between the block and GPU. I did go down to Fastenal and order some slightly longer ones (matched to the factory screws), but I don't think I need them. I might swap them in the next time I have it apart. No real need, though...


__
https://flic.kr/p/SsjeGh



The screws around the main GPU chip did not require drilling out. Don't drill out any of them...


__
https://flic.kr/p/RKDvNh



Complete view:


__
https://flic.kr/p/SNoyyE




__
https://flic.kr/p/T3cDct




__
https://flic.kr/p/Ssjfc5



Great (idle) temps:


__
https://flic.kr/p/SNpDRw



Temps rose 10 degrees after running Time Spy:


__
https://flic.kr/p/SssPVb



Running the leak check and it's game time!


----------



## Blackclad

Quote:


> Originally Posted by *Slackaveli*
> 
> stay patient tho. yes, you really do, but you need it to be a 4k >100Hz hdr one. i could see the argument for a 5k one, but, not (to me) if it's stuck at 60.
> ok. one last thing. in gpu-z readout, what does it say for PCIe? does it say x16 3.0 v1.1?


No, it says PCIe x16 3.0 @ x16 1.1


----------



## egandt

What am I doing wrong? 3DMark gets about 11 frames per second, for a grand total score of 1500, on my 1080 Ti. That is less than 1/10 of what I'm expecting, which is frankly a joke. The GPU also does not exceed 800 MHz while running, and memory is under 250 MHz during the run; it's as if the card is simply doing nothing at all. Heaven Benchmark seems fine at 1080p Extreme, but with 8x AA it also has issues: sometimes it's 200 frames and running smoothly with the GPU at 1500 MHz, other times it's at 34 frames and the GPU is at 800 MHz or less. Simply put, it's also a mess. CUDA has no issues; only games seem to be a problem, but a major one.

Any ideas? My 1070 was rock solid on the same 7700K at 4.9 GHz with 32 GB of 3200 MHz RAM and a 960 EVO drive, so it's not like I have a CPU/memory/disk issue; I simply cannot get the card to work for 3D. CUDA has no issues, only 3D.

ERIC


----------



## stocksux

Quote:


> Originally Posted by *SperVxo*
> 
> There have been 2 cards now with aftermarket cooler. Asus and Inno3d and both of those had the same OC as the FE. So there doesnt seem to make a huge diffrence when it come to Power limit throttle?


I've been screaming this since the FE launched. No one wanted to listen.


----------



## trawetSluaP

Quote:


> Originally Posted by *egandt*
> 
> What am I doing wrong? 3DMark gets about 11 frames per second, for a grand total score of 1500, on my 1080 Ti. That is less than 1/10 of what I'm expecting, which is frankly a joke. The GPU also does not exceed 800 MHz while running, and memory is under 250 MHz during the run; it's as if the card is simply doing nothing at all. Heaven Benchmark seems fine at 1080p Extreme, but with 8x AA it also has issues: sometimes it's 200 frames and running smoothly with the GPU at 1500 MHz, other times it's at 34 frames and the GPU is at 800 MHz or less. Simply put, it's also a mess. CUDA has no issues; only games seem to be a problem, but a major one.
> 
> Any ideas? My 1070 was rock solid on the same 7700K at 4.9 GHz with 32 GB of 3200 MHz RAM and a 960 EVO drive, so it's not like I have a CPU/memory/disk issue; I simply cannot get the card to work for 3D. CUDA has no issues, only 3D.
> 
> ERIC


Have you tried a fresh install of the drivers?


----------



## eXistencelies

I just benchmarked my 1080 Ti at stock clocks against my old 1080 FE. Did your Ti run hotter? Both were on EK water blocks and EK backplates, and my 1080 would hit around the mid-40s during benchmarks. My 1080 Ti just hit 52C during the Time Spy benchmark, and I even have my side panel off. What temps were you guys getting on water? I have a full custom 720mm loop. Even my idle temps are higher: idle was 29-30C, when it usually matches my CPU at around 26-28C.


----------



## mshagg

Damn, man. I'm hitting 41 degrees max with a +175 OC, only a few degrees higher than at idle. Using the Titan XP EK block and backplate as well.

Might be worth reseating the block, checking for contact, etc.? ***** of a job, I know, but it doesn't sound right at face value.


----------



## eXistencelies

Quote:


> Originally Posted by *mshagg*
> 
> Damn man. Im hitting 41 degrees max with a +175 OC, only a few degrees higher than at idle. Using the Titan XP EK block and backplate as well.
> 
> Might be worth reseating the block, checking for contact etc? ***** of a job, i know, but doesnt sound right at face value.


Yeah, looks like I will have to. I used the same GELID thermal paste that I used on the IHS/copper block when I delidded my 7700K. I did the pea-size method, though, not the corner-to-corner and side-to-side pattern called out in the instructions. I did get all the thermal pads placed correctly.

Damn......


----------



## dboythagr8

Changed the SLI bridge



I think it looks better than the EVGA one I had on:



The NVIDIA bridge is actually their "4-slot" bridge, while the EVGA is a "2-slot".


----------



## sWaY20

So much quieter, and temps are great. I couldn't stand that damn jet engine in my pc.


----------



## dboythagr8

Quote:


> Originally Posted by *sWaY20*
> 
> 
> 
> So much quieter, and temps are great. I couldn't stand that damn jet engine in my pc.


Looks great. Probably headed that way too. These fans on idle are bothering me. Not so much when gaming because of everything else going on, but sitting here doing nothing, they're running because of the dumb 120/144hz SLI bug.


----------



## Slackaveli

Quote:


> Originally Posted by *Blackclad*
> 
> No it says PCIe x 16 3.0 @x16 1.1


Yeah, that's what it should say, so it's not that. When you installed drivers, did you remove the remnants of the older ones? Did you use DDU and then do a clean installation of the newest drivers?

@eXistencelies: can't do the "pea method" on GPUs, man!! Spread it over every mm!


----------



## eXistencelies

Quote:


> Originally Posted by *mshagg*
> 
> Damn man. Im hitting 41 degrees max with a +175 OC, only a few degrees higher than at idle. Using the Titan XP EK block and backplate as well.
> 
> Might be worth reseating the block, checking for contact etc? ***** of a job, i know, but doesnt sound right at face value.


Well, that was interesting... Come to find out, when I took the block off I noticed a perfect square outline. Somehow the clear peel-off backing from the thermal padding had landed right in the center of the water block. Removed that sucker. Just finished benchmarking and my max temp was 38C.

Double check your stuff lol.


----------



## hotrod717

Looks like I will have to call MC. Never received my game code. I saw an earlier post about someone getting theirs via email a few days later; I did not.


----------



## lilchronic

Using the EK-FC Titan X Pascal block, I put thermal pads on the areas marked in red. I noticed when I took the stock air cooler off that it had pads there, but the waterblock installation guide didn't say to put them there.
Just wondering if anyone else did the same.


----------



## alucardis666

Hey all! So I got my EVGA Hybrid AIO cooler in today! And it's awesome! Paired it with this Scythe fan: https://www.newegg.com/Product/Product.aspx?Item=9SIA2W04KZ5536

And my temps and noise are wayyyyyy down!









*Idle*

*Load*


Might've gotten even better temps if I used thermal grizzly instead of MX4












Had just enough clearance behind my CryoRig R1 to squeeze it in!









Downside to this: I'm probably leaving another 2-5C on the table with the rad so close to the R1. Upside: the Scythe fan is pulling more hot air away from the CPU, so that's a few degrees cooler at load and idle too.









Overall, very happy!


----------



## KedarWolf

Quote:


> Originally Posted by *alucardis666*
> 
> Hey all! So I got my EVGA Hybrid AIO cooler in today! And it's awesome! Paired it with a this Scythe fan https://www.newegg.com/Product/Product.aspx?Item=9SIA2W04KZ5536
> 
> And my temps and noise are wayyyyyy down!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Idle*
> 
> *Load*
> 
> 
> Might've gotten even better temps if I used thermal grizzly instead of MX4
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Had just enough clearance behind my CryoRig R1 to squeeze it in!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Downside to this, probably leaving another 2-5c on the table having the rad so close to the R1, upside is that the scythe fan is pulling more hot air away from the cpu so that's a few c cooler at load and idle too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Overall, very happy!


EK waterblock and backplate will be here Tuesday!!


----------



## bfedorov11

I've had my card since Monday and only just got around to testing it tonight. Hope to have it all back under water tonight.


----------



## hotrod717

Quote:


> Originally Posted by *lilchronic*
> 
> Using the EK-FC titan x Pascal block i put thermal pads on the areas marked red. I noticed when i took the stock air cooler off they had pads there but it did not say to put them on there in the waterblock installation guide.
> Just wondering if anyone else did the same.


Normally these don't require thermal pads or active cooling.
Quote:


> Originally Posted by *bfedorov11*
> 
> Unfortunately, I had my card since Monday and just got around to testing it out tonight. Hope to have it all back under water tonight


Yeah, unfortunately for most of us it's called life.








Anticipation is half the fun.


----------



## sWaY20

Quote:


> Originally Posted by *lilchronic*
> 
> Using the EK-FC titan x Pascal block i put thermal pads on the areas marked red. I noticed when i took the stock air cooler off they had pads there but it did not say to put them on there in the waterblock installation guide.
> Just wondering if anyone else did the same.


I noticed this also, so I did the same and put some thermal pads there.


----------



## PewnFlavorTang

I have 2 and one card runs nearly 20c cooler than the other.


----------



## lilchronic

Quote:


> Originally Posted by *hotrod717*
> 
> Normally these dont require thermal pads or active cooling.
> Yeah, unfortunately for most of us it's called life.
> 
> 
> 
> 
> 
> 
> 
> 
> Anticipation is half the fun.


Well, I put them on there anyway.









I'm pretty sure the two R22s in red are for the memory, so I think they can get pretty hot when pushing the memory clocks.

Quote:


> Originally Posted by *sWaY20*
> 
> i noticed this also, so i did the same and put some thermal pads there.


Yeah, I think the R22s for the memory are the most important of the ones I highlighted.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *lilchronic*
> 
> Well i put them on there anyway.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm pretty sure the two r22's in red or for the memory so i think they can get pretty hot when pushing them mem clocks.
> 
> Yeah i think the r22's for memory are the most important out of the one's i highlighted.


I put pads on my EK'd TXP where the stock cooler had them, even though EK didn't include them in the instructions. Probably helps something.


----------



## Blackclad

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, that's what it should say. it's not that. when you installed drivers, did you remove the remnants from the older ones? Did you use DDU and then a clean installation of the newest drivers?
> 
> @Existenc- cant do "pea method" on gpus, man!! smear every mm of it!


I ended up reinstalling Windows. After getting the essentials taken care of, I ran 3DMark and got 21000-ish. Then I ran BF1 and was averaging ~130 FPS with DX12. Looks like that did the trick.

Now, after about an hour and a half of playing, my GPU temp is hovering around 58C and not going down. It hasn't gone down for about 20 minutes. Is this normal?


----------



## mshagg

Quote:


> Originally Posted by *lilchronic*
> 
> Using the EK-FC titan x Pascal block i put thermal pads on the areas marked red. I noticed when i took the stock air cooler off they had pads there but it did not say to put them on there in the waterblock installation guide.
> Just wondering if anyone else did the same.


Not on the far right, but the R22s were covered by the thermal pad I applied to the row of (I assume) VRMs below them. There were some thermal pads in odd spaces; see the one on the back, lol.


----------



## Slackaveli

Quote:


> Originally Posted by *Blackclad*
> 
> I ended up reinstalling Windows. after getting the essentials taken care of I ran 3dmark and got 21000ish. Then I ran BF1 and was getting ~130FPS w/DX12 on AVG. Looks like that did the trick.
> 
> Now after about an hour and half of playing my GPU temp is hovering around 58C and not going down. Hasn't gone down for about 20 minutes. Is this normal?


Glad you got that figured out; I was feeling bad for you.
Try the NVIDIA Control Panel setting Optimal Power instead of Prefer Maximum Performance. It will let the card downclock and get down to the 30s.


----------



## bfedorov11

Quote:


> Originally Posted by *Blackclad*
> 
> Now after about an hour and half of playing my GPU temp is hovering around 58C and not going down. Hasn't gone down for about 20 minutes. Is this normal?


You have a program open that is preventing the card from down clocking. Is it dropping down to ~139mhz?


----------



## Blackclad

Quote:


> Originally Posted by *bfedorov11*
> 
> You have a program open that is preventing the card from down clocking. Is it dropping down to ~139mhz?


Hmm, according to GPU-Z I am at:
GPU Core Clock 1480.5 MHz
GPU Mem Clock 1377 MHz
GPU Temp 59C
Fan Speed% 34%
RPM 1645
Mem used 455MB
GPU Load 0%
PWR Consumption 27.3%

I don't really have anything running right now though. And I just changed power management to 'Optimal Power'


----------



## degenn

Quote:


> Originally Posted by *PewnFlavorTang*
> 
> I have 2 and one card runs nearly 20c cooler than the other.


Are you running SLI on air? If so, it's normal for the top card to be a good solid 10+ degrees hotter than the bottom card, depending on airflow in your particular chassis.

If one card runs 20c hotter when just using each on their own, then yeah there's something wonky going on... maybe ill-applied thermal paste from the factory or just something wrong w/ heatsink/gpu contact in general. I'd pop off the cooler and investigate if that were the case.


----------



## Swolern

20c difference in GPUs? My guess is a poor SLI profile where the 2nd card is hardly doing anything.


----------



## alucardis666

So now, with the hybrid cooler, I figured I'd be able to push the card a little further. Not really the case at all, I'm afraid!












Any further on either slider causes instability, and adding voltage just makes things run hotter or unstable.

On the upside, my boosts are higher and more consistent. I managed to go from a +130 offset to +160, hitting some higher bins, and my card never gets any hotter than ~53C.


----------



## alucardis666

Quote:


> Originally Posted by *Blackclad*
> 
> I ended up reinstalling Windows. after getting the essentials taken care of I ran 3dmark and got 21000ish. Then I ran BF1 and was getting ~130FPS w/DX12 on AVG. Looks like that did the trick.
> 
> Now after about an hour and half of playing my GPU temp is hovering around 58C and not going down. Hasn't gone down for about 20 minutes. Is this normal?


No, that's not normal. The GPU temp should go down pretty quickly once the card enters low-power mode after you exit the game and it sits idle. Check your fan profile.


----------



## dboythagr8

Quote:


> Originally Posted by *Blackclad*
> 
> Hmm, according to GPU-Z I am at:
> GPU Core Clock 1480.5Mhz
> GPU Mem Clock 1377MHz
> GPU Temp 59C
> Fan Speed% 34%
> RPM 1645
> Mem used 455MB
> GPU Load 0%
> PWR Consumption 27.3%
> 
> I don't really have anything running right now though. And I just changed power management to 'Optimal Power'


Make sure Afterburner is actually seeing the second card, otherwise any profiles or other changes will only impact card 1.

When I put my second card in, AB didn't recognize it. It just said GPU2 or something like that. Wouldn't even show up under the OSD options. I restarted my machine and it showed both cards as expected. There shouldn't be a 20 degree difference. Right now my top card is 49c and my bottom card is 45c for example.


----------



## bfedorov11

Quote:


> Originally Posted by *Blackclad*
> 
> GPU Core Clock 1480.5Mhz


What speed does it show when you reboot and it is running cool? If it shows the lower idle speed, something is keeping the card in a higher state. Close steam, origins, uplay, etc. Lots of programs can do it.


----------



## dboythagr8

Quote:


> Originally Posted by *bfedorov11*
> 
> What speed does it show when you reboot and it is running cool? If it shows the lower idle speed, something is keeping the card in a higher state. Close steam, origins, uplay, etc. Lots of programs can do it.


There is a bug/glitch with SLI where the cards don't downclock at 120 Hz or 144 Hz, hence the high idle temps of 49C and 45C I posted about. In my case they won't even downclock at 60 Hz.
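If you want to watch for this without a monitoring GUI, `nvidia-smi --query-gpu=index,clocks.current.graphics --format=csv,noheader` prints one CSV row per card, which you can check against an idle threshold. A minimal sketch of that check, fed with sample text here since I can't query your cards (the 300 MHz threshold is an illustrative guess, not an NVIDIA-defined value):

```python
def stuck_above_idle(csv_lines, idle_mhz=300):
    """Given 'index, clocks.current.graphics [MHz]' CSV rows as
    produced by nvidia-smi, return the indices of GPUs whose core
    clock has not dropped to idle."""
    stuck = []
    for line in csv_lines:
        idx, clock = line.split(",")
        # clock looks like ' 1480 MHz'; keep just the number
        if int(clock.strip().split()[0]) > idle_mhz:
            stuck.append(int(idx))
    return stuck

# Sample rows resembling the bug described above: neither card downclocks.
sample = ["0, 1480 MHz", "1, 1480 MHz"]
print(stuck_above_idle(sample))  # -> [0, 1]
```

On a healthy idle desktop, a Pascal card drops to very low core clocks, so an empty result is what you'd want to see.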


----------



## alucardis666

Quote:


> Originally Posted by *dboythagr8*
> 
> There is a bug/glitch with SLI where the cards don't downclock at 120hz or 144hz, hence the high idle temps of 49c and 45c I posted about. In my case they won't even DC at 60hz.


Ouch. That's not good.


----------



## dboythagr8

Quote:


> Originally Posted by *alucardis666*
> 
> Ouch. That's not good.


It's been around for years. No clue why they haven't addressed it; most likely, they don't care to.


----------



## alucardis666

Quote:


> Originally Posted by *dboythagr8*
> 
> It's been around for years. No clue how they haven't addressed it, or most likely, don't care to address it.


No offense... but that's another reason why I personally avoid SLI like the plague: just so many little issues, annoyances, and inconsistencies.


----------



## KraxKill

Quote:


> Originally Posted by *alucardis666*
> 
> No offense... but, that's another reason why I personally avoid SLi like the plague, just so many little issues, annoyances and inconsistencies


Here's why I avoid it:


----------



## Silent Scone

Quote:


> Originally Posted by *dboythagr8*
> 
> There is a bug/glitch with SLI where the cards don't downclock at 120hz or 144hz, hence the high idle temps of 49c and 45c I posted about. In my case they won't even DC at 60hz.


Are you using prefer max performance on your global profile? I'm on the latest beta build of W10, and have this issue.

Switching to the other more conservative options resolves it for me.


----------



## outofmyheadyo

What is the max power draw you guys are seeing with your overclocked/uncapped (no vsync) cards, per HWiNFO64?
I'm just curious whether my temps would drastically increase going from a 980 Ti to a 1080 Ti; according to HWiNFO64 my 980 Ti pulls up to 325 W.


----------



## Radox-0

Quote:


> Originally Posted by *KraxKill*
> 
> Here's why I avoid it,


Fire Strike actually scales extremely well with SLI, much more than any game will. The normal version of Fire Strike, however, places a large emphasis on CPU performance in the overall score compared to Extreme and Ultra, which do so to a lesser extent (in part why normal FS is dominated by 8- and 10-core chips). Without seeing the actual breakdown of the scores between the runs you compared, it's hard to say what is going on. Here is a Fire Strike run on my rig with the same CPU, so physics scores are nearly identical, in 1-way and 2-way configs at +150 MHz on the core and +500 on the memory (core clock averages 2002 MHz on both runs).

1 Way


2 Way


Nearly 82% scaling.
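For anyone who wants to compute their own scaling figure, it's just the percentage gain of the 2-way score over the 1-way score. A quick sketch with made-up scores for illustration (the real numbers are in the screenshots, not quoted here):

```python
def sli_scaling(single_score, sli_score):
    """Percent improvement of the multi-GPU score over the single-GPU score."""
    return (sli_score / single_score - 1.0) * 100.0

# Hypothetical graphics scores chosen only to illustrate an 82% gain:
print(round(sli_scaling(28000, 50960), 1))  # -> 82.0
```

Comparing graphics sub-scores rather than the combined score avoids the CPU-weighted physics component skewing the result.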

Quote:


> Originally Posted by *Silent Scone*
> 
> Are you using prefer max performance on your global profile? I'm on the latest beta build of W10, and have this issue.
> 
> Switching to the other more conservative options resolves it for me.


Yup, I see the same issue myself on the non-beta version of Windows 10 as well.


----------



## aylan1196

Installed the Swiftech Titan X Pascal waterblock.
So far so good!


----------



## D13mass

Quote:


> Originally Posted by *aylan1196*
> 
> 
> 
> Installed swiftech titan x pascal waterblock
> So far so good ?


And why are you showing a screenshot without an OC or any load? I could show a photo of my mouse; it would be more useful.


----------



## aylan1196

Quote:


> Originally Posted by *D13mass*
> 
> And why are you showing stupid screenshot without OC and any load ? I can show photo of my mouse - it will be more usefull


I will!


----------



## DrFreeman35

Is this SLI problem only for Ti's?


----------



## RedM00N

Quote:


> Originally Posted by *dboythagr8*
> 
> There is a bug/glitch with SLI where the cards don't downclock at 120hz or 144hz, hence the high idle temps of 49c and 45c I posted about. In my case they won't even DC at 60hz.


Have you checked whether the issue is still around after a clean driver install (DDU, then a fresh install)?
SLI on/off shouldn't make a difference either, as only the display GPU would see usage (unless W10 uses SLI on the desktop).


----------



## GreedyMuffin

Asus Strix or FE + WB?

The FE option is a good 75 USD more expensive (975 vs. 1050 USD).


----------



## eXistencelies

Stock benchmark in Time Spy:

http://www.3dmark.com/spy/1396578

Overclocked for now at +150 core and +500 memory. I want to play for a few hours to see if it's stable. As of now it's at 2050 MHz.

http://www.3dmark.com/spy/1396655


----------



## eXistencelies

Quote:


> Originally Posted by *bfedorov11*
> 
> What speed does it show when you reboot and it is running cool? If it shows the lower idle speed, something is keeping the card in a higher state. Close steam, origins, uplay, etc. Lots of programs can do it.


Mine is always clocked high at idle. I have 3 monitors; the main is 165 Hz and the other two are 60 Hz. It's been a problem for a long time from what I can see, and I'm not the only one.


----------



## eXistencelies

Quote:


> Originally Posted by *dboythagr8*
> 
> There is a bug/glitch with SLI where the cards don't downclock at 120hz or 144hz, hence the high idle temps of 49c and 45c I posted about. In my case they won't even DC at 60hz.


It's not caused by SLI. I'm on a single card, and both my 1080 and my 1080 Ti clocked high due to the multiple monitors I have plugged in.


----------



## PasK1234Xw

Just a heads up: an unstable OC can mess up the driver such that even a stable OC won't run right until you reinstall the drivers. It's a pain, but I've had to do it twice now; after a reinstall, my last unstable OC no longer locks up.


----------



## dboythagr8

Quote:


> Originally Posted by *Silent Scone*
> 
> Are you using prefer max performance on your global profile? I'm on the latest beta build of W10, and have this issue.
> 
> Switching to the other more conservative options resolves it for me.


No, I'm on Optimal.
Quote:


> Originally Posted by *RedM00N*
> 
> Have you seen whether the issue is still around after a clean install? (DDU then fresh install)
> SLI on/off shouldn't make a difference either as only the display gpu would see usage (unless W10 uses sli on the desktop)


Yep, doesn't matter on my end. I can remember this issue going all the way back to my SLI 580s when I had a 120hz monitor.
Quote:


> Originally Posted by *eXistencelies*
> 
> It is not cause of SLI. I am on a single card both my 1080 and my 1080ti clock high due to the multiple monitors I have plugged in.


Quote:


> Originally Posted by *alucardis666*
> 
> No offense... but, that's another reason why I personally avoid SLi like the plague, just so many little issues, annoyances and inconsistencies


It's only for mGPU that I see it.


----------



## aylan1196

3dmark
http://www.3dmark.com/3dm/18677415?


----------



## Radox-0

Quote:


> Originally Posted by *DrFreeman35*
> 
> Is this SLI problem only for Ti's?


I've got no issues with dual and tri-SLI 1080 Ti configs. They behave exactly the same for me as my prior Titan X (Maxwell and Pascal) and 1080 dual and tri-SLI configs. I currently have two panels off of it. The only time the clocks seem to remain locked high is when I set the NVIDIA Control Panel to maximum performance, and then only until I restart.


----------



## DrFreeman35

Quote:


> Originally Posted by *Radox-0*
> 
> I got no issues with a dual and tri sli 1080ti config. Behaves exactly the same as prior my prior titan x (maxwell and pascal) and 1080 dual and tri sli config. for me. Currently have two panels off of it. Only time clocks seem to remain locked at high is when i set nvidia control panel to maximum performace until i restart.


Ok, I was wondering about this. Thanks for the input, I am looking to get 2 of these myself. I will be making my decision soon, depends how much longer I have to wait for the ICX upgrade Program. I am not going to WC these cards, so FE is out of the question for me. I might just do the ICX upgrade, then wait for Volta.......who knows how long that will be.


----------



## DooRules

Quote:


> Originally Posted by *Radox-0*
> 
> I got no issues with a dual and tri sli 1080ti config. Behaves exactly the same as prior my prior titan x (maxwell and pascal) and 1080 dual and tri sli config. for me. Currently have two panels off of it. Only time clocks seem to remain locked at high is when i set nvidia control panel to maximum performace until i restart.


It depends on the resolution of the two monitors as well. My primary card will not downclock at idle if I run both screens at full resolution. I am using a Samsung 4K monitor and a Dell 34" WS @ 3440 x 1440; I have to drop the res on the 4K monitor, and then the primary GPU will downclock.

This is with SLI Titans, but I think they're close enough to the Tis in this regard.


----------



## Radox-0

Quote:


> Originally Posted by *DooRules*
> 
> Depends upon the resolution of the two monitors as well. My primary card will not downclock at idle if I run both screens at full resolution. I am using a Samsung 4k monitor and a Dell 34" WS @ 3440 x 1440. I have to drop the rez on the 4k monitor, then it will downclock primary gpu.
> 
> This is with sli Titans but they are close enough to the Ti's I think in this regard.


I'll look again, but this was with my 3440 x 1440 ROG 100 Hz panel and a 4K TV (not a monitor), and it behaved as expected. That was with the 1080s, though, not the Titan XP or the 1080 Tis now, so my curiosity is piqued and I'll give it a try when possible. What resolution did you need to drop down to?


----------



## Jason33w

It seems I may have gotten lucky and found a GTX 1080 ti available directly from PNY's website. I've never purchased a PNY product before. What are everyone's thoughts on the brand?


----------



## Hulio225

Quote:


> Originally Posted by *Jason33w*
> 
> It seems I may have gotten lucky and found a GTX 1080 ti available directly from PNY's website. I've never purchased a PNY product before. What are everyone's thoughts on the brand?


If it's a Founders Edition, which I strongly assume it is, it doesn't matter who sells it to you. It will always be the same ;-P


----------



## PewnFlavorTang

One thing I'm noticing on these cards that I did not notice on my GTX 1080 is that once the power limit is hit a few times, it triggers a TDP throttle where the card won't go above roughly 50-60 percent TDP. It'll maintain the clock speeds I set, but frames will be low and TDP is cut in half.


----------



## Jason33w

I've heard claims about lower build quality with PNY, but most of those claims were from around 2005. I hope it's better now.


----------



## PewnFlavorTang

Quote:


> Originally Posted by *degenn*
> 
> Are you running SLI on air? If so, it's normal for the top card to be a good solid 10+ degrees hotter than the bottom card, depending on airflow in your particular chassis.
> 
> If one card runs 20c hotter when just using each on their own, then yeah there's something wonky going on... maybe ill-applied thermal paste from the factory or just something wrong w/ heatsink/gpu contact in general. I'd pop off the cooler and investigate if that were the case.


Yes and no. I also tested the card in single-GPU use, and it's literally 15-20c cooler at the same fan speeds and clocks. I also put it back into SLI as the primary card, and it still runs 15c cooler than the secondary bottom card.


----------



## Radox-0

Quote:


> Originally Posted by *Jason33w*
> 
> I've heard claims about lower build quality with PNY. But most most of those claims were from around 2005. I hope it's better now.


If it is a Founders card then it's all the same anyway. They all come from the same factories; just the warranty and the company that manages it are different.


----------



## Jason33w

Helps to set my mind at ease. Now I hope they actually had it in stock. The PNY website didn't say that it wasn't, and my card has been charged. Keeping my fingers crossed!


----------



## NoDoz

Got my 1080ti from Nvidia yesterday...love it so far!


----------



## dboythagr8

While we're still talking about idle clocks with high refresh monitors...

I did flip from Optimal Power to Maximum and back to Optimal. Turned computer off/on. Second card is now down clocked at idle, sitting at 29c. My primary card isn't though. 44c and running at 1392mhz.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> Hey all! So I got my EVGA Hybrid AIO cooler in today! And it's awesome! Paired it with a this Scythe fan https://www.newegg.com/Product/Product.aspx?Item=9SIA2W04KZ5536
> 
> And my temps and noise are wayyyyyy down!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Idle*
> http://www.overclock.net/content/type/61/id/2987228/width/350/height/700
> *Load*
> http://www.overclock.net/content/type/61/id/2987227/width/350/height/700
> 
> Might've gotten even better temps if I used thermal grizzly instead of MX4
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/content/type/61/id/2987229/width/350/height/700
> http://www.overclock.net/content/type/61/id/2987230/width/350/height/700
> 
> Had just enough clearance behind my CryoRig R1 to squeeze it in!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Downside to this, probably leaving another 2-5c on the table having the rad so close to the R1, upside is that the scythe fan is pulling more hot air away from the cpu so that's a few c cooler at load and idle too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Overall, very happy!


What fan profile are you running for the fans and what speed are they running at?


----------



## hotrod717

Quote:


> Originally Posted by *lilchronic*
> 
> Well i put them on there anyway.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm pretty sure the two R22's in red are for the memory, so I think they can get pretty hot when pushing the mem clocks.
> 
> Yeah, I think the R22's for the memory are the most important out of the ones I highlighted.


certainly doesn't hurt.


----------



## Asus11

Does anyone have any info on the MSI Sea Hawk EK X?


----------



## sWaY20

Is Time Spy OC sensitive? I ran Heaven and Fire Strike overclocked several times, but Time Spy crashed every time. I even lowered the OC to where it should've passed and it still crashed. I even gamed with it OC'd at the highest.


----------



## PewnFlavorTang

I can't get time spy to work at factory settings. It's a fickle beyatch.


----------



## OldNerd

Just got my 1080 Ti installed and holy hell it is a BEAST !! 4K has never been so good !


----------



## hotrod717

Quote:


> Originally Posted by *sWaY20*
> 
> Is time spy oc sensitive? I ran several times oc on heaven, fire strike, and crashed time spy every time. I even lowered it to where it should've passed and it still crashed. Even gamed when i had it oc the highest.


Probably has more to do with the rest of your system and oc. Never had it run better than with 1080ti. Some benches are more sensitive to memory. Both good and bad.


----------



## SperVxo

If I use headphones that are somewhat noise isolating, will I still hear the fan a lot at 70%+ speed?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *SperVxo*
> 
> If i use Headphones that are somewhat noise isolated. Will i still hear the fan alot if its 70%+ speed?


Not really.


----------



## keikei

Quote:


> Originally Posted by *SperVxo*
> 
> If i use Headphones that are somewhat noise isolated. Will i still hear the fan alot if its 70%+ speed?


No. I sit 3 feet from my card.


----------



## sWaY20

Quote:


> Originally Posted by *hotrod717*
> 
> Probably has more to do with the rest of your system and oc. Never had it run better than with 1080ti. Some benches are more sensitive to memory. Both good and bad.


def not my system, ran it no prob a bunch with my 980ti.


----------



## Slackaveli

Quote:


> Originally Posted by *outofmyheadyo*
> 
> What is the max power draw you guys are seeing with your overclocked/uncapped ( no vsync ) cards, with hwinfo64?
> I am just curious if my temps would drastically increase going from a 980ti to a 1080ti, according to hwinfo64 my 980ti pulls upto 325w


im seeing 296~w


----------



## moonbogg

Hits 2100/6000 stable at stock voltage. Replaced two 980ti's with this beast and couldn't be happier.

Also, regarding timespy, I can't get it to run the second graphics benchmark no matter what I do. I seem to remember having that problem with the 980ti's as well. I think I said screw it and just ignored that benchmark. All the other ones run perfect.


----------



## Buddaking

SOOOOOOOO glad I bought this card.

This thing is a monster.


----------



## Slackaveli

Lemme in that 9k single card time Spy Club.

Also, is this saying I have a totally unique system? 1 like system. I totally dig that. Beast
z97 ftw


----------



## SperVxo

Easy to install those ek waterblocks?


----------



## GreedyMuffin

Quote:


> Originally Posted by *SperVxo*
> 
> Easy to install those ek waterblocks?


Yep!

The harder part is the FE cooler, but it is not hard if you got the right tools!


----------



## jleslie246

omg im still trying to find one in stock! Everyone is sold out. Since im still waiting, what version is the best for a waterblock?


----------



## OldNerd

Quote:


> Originally Posted by *Slackaveli*
> 
> 
> Lemme in that 9k single card time Spy Club.
> 
> Also, is this saying I have a totally unique system? 1 like system. I totally dig that. Beast
> z97 ftw


Quote:


> Originally Posted by *jleslie246*
> 
> omg im still trying to find one in stock! Everyone is sold out. Since im still waiting, what version is the best for a waterblock?


PCConnection had them in stock yesterday and I pounced and had it overnighted for today


----------



## Jbravo33

Too impatient for AIB cards. Just snagged an MSI FE off Fry's along with an EK Titan waterblock. Gonna run SLI. Anyone here run a waterblock on the top card with air on the bottom? Does it look goofy? When I was running SLI Classifieds the bottom card never really heated up. Wondering how this looks/works. Hopefully I can grab an FTW3 in a few weeks.


----------



## moonbogg

Quote:


> Originally Posted by *SperVxo*
> 
> Easy to install those ek waterblocks?


Yeah but you need a 5/32" hex socket to remove the screws. If you use pliers you'll wreck the card.


----------



## PewnFlavorTang

2126mhz on stock cooler with my ASUS card. This card for some reason runs super cool compared to my evga.


----------



## KraxKill

Quote:


> Originally Posted by *PewnFlavorTang*
> 
> 2126mhz on stock cooler with my ASUS card. This card for some reason runs super cool compared to my evga.


Nice... that's a very impressive OC.

How are you testing your OC? Do you mind grabbing some GPU-Z sensor tab screenshots after a Heaven loop at 2126? The screens you posted don't say much other than your idle state clocks.


----------



## KedarWolf

Quote:


> Originally Posted by *moonbogg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SperVxo*
> 
> Easy to install those ek waterblocks?
> 
> 
> 
> Yeah but you need a 5/32" hex socket to remove the screws. If you use pliers you'll wreck the card.
Click to expand...

5/32 hex socket? Would a 66 piece precision screwdriver set have one and what do I need the hex socket for?

Here is the set.

http://www.canadiantire.ca/en/pdp/mastercraft-specialty-precision-electronics-bit-set-66-pc-0573624p.html#srp


----------



## KraxKill

Quote:


> Originally Posted by *PewnFlavorTang*
> 
> I can't get time spy to work at factory settings. It's a fickle beyatch.


Where exactly (which screen) does it crash in? If it's during the physics test, or combined, it may be your CPU. If you have the paid version, you can try to loop that one particular scene only to find your issue.


----------



## keikei

So, I can play Dark Souls 3 @ 5K. Still tweaking the settings, but damn.


----------



## Transmaniacon

Joined the club today! Just recently finished the new build and was waiting to add the final piece. Also picked up a Dell S2716DG monitor.


----------



## OldNerd

Quote:


> Originally Posted by *jleslie246*
> 
> omg im still trying to find one in stock! Everyone is sold out. Since im still waiting, what version is the best for a waterblock?


I just ran a bench of Fire Strike Ultra and scored about 1K less than a similarly spec'd machine, and landed in only the 80% bracket?

http://www.3dmark.com/compare/fs/12044857/fs/12035062#

Any ideas?


----------



## duppex

Quote:


> Originally Posted by *moonbogg*
> 
> 
> 
> Hits 2100/6000 stable at stock voltage. Replaced two 980ti's with this beast and couldn't be happier.
> 
> Also, regarding timespy, I can't get it to run the second graphics benchmark no matter what I do. I seem to remember having that problem with the 980ti's as well. I think I said screw it and just ignored that benchmark. All the other ones run perfect.


Nice GPU Block.

First time trying to put a GPU under water. Some advice on the following would be appreciated.

I am looking to get the same EK block as the one in the above post for my GTX 1080 Ti FE.

Can I use the stock backplate with the EK full block?
Will a 120mm EK CoolStream radiator be enough to keep this GPU cool?
Also, will the below EK reservoir/pump combo be OK?

EK-DCP 2.2 X-RES Combo Reservoir for EK-DCP 2.2
https://www.scan.co.uk/products/ek-dcp-22-x-res-combo-reservoir-for-ek-dcp-22

Does anyone know of a cheap but decent toolkit that will do the job?

I only have a small NZXT Manta case so not a lot of space to play about with.

Many Thanks in advance


----------



## KraxKill

Quote:


> Originally Posted by *OldNerd*
> 
> I just ran a bench of Fire Strike Ultra and scored about 1K less than a similarly spec'd machine, and landed in only the 80% bracket?
> 
> http://www.3dmark.com/compare/fs/12044857/fs/12035062#
> 
> Any ideas?


His card is unlikely to be running at 1646 like 3DMark thinks. Probably in the 2000-2100 range like the rest of them.

Here's my Fire Strike score, but this one was run at 2063 on just a pedestrian 4790K at 5.0GHz, despite 3DMark claiming otherwise.



http://www.3dmark.com/fs/12024073


----------



## fisher6

Is it safe to have the memory oc'ed +700? My card is under water.


----------



## KraxKill

Quote:


> Originally Posted by *fisher6*
> 
> Is it safe to have the memory oc'ed +700? My card is under water.


You're likely to see artifacts there and lose actual performance due to higher error rates.


----------



## nrpeyton

How ironic.

The worst-performing, crappiest cooler/heatsink (FE) is the most difficult to dismantle and reassemble.

How much extra money would it really cost them to simply use a semi-decent cooler? 10 bucks?

Thank god we have EVGA and the others.


----------



## khemist

Quote:


> Originally Posted by *Transmaniacon*
> 
> Joined the club today! Just recently finished the new build and was waiting to add the final piece. Also picked up a Dell S2716DG monitor.


Enjoy the monitor, I've been using one since release and love it!


----------



## OldNerd

Quote:


> Originally Posted by *KraxKill*
> 
> His card is unlikely to be running at 1646 like 3DMark thinks. Probably in the 2000-2100 range like the rest of them.
> 
> Here's my Fire Strike score, but this one was run at 2063 on just a pedestrian 4790K at 5.0GHz, despite 3DMark claiming otherwise.
> 
> 
> 
> http://www.3dmark.com/fs/12024073


Thanks for the info. Any consensus on what the average OC at stock voltage is?


----------



## bfedorov11

Quote:


> Originally Posted by *nrpeyton*
> 
> How ironic.
> 
> The worst performing crappiest cooler/heatsink (FE), is the most difficult to dismantle & reassemble.


Take it off from the back plate, not the front. Remove all screws/nuts from back and two on vent. Takes all of 2 minutes if you have the right tools.


----------



## moonbogg

Quote:


> Originally Posted by *KedarWolf*
> 
> 5/32 hex socket? Would a 66 piece precision screwdriver set have one and what do I need the hex socket for?
> 
> Here is the set.
> 
> http://www.canadiantire.ca/en/pdp/mastercraft-specialty-precision-electronics-bit-set-66-pc-0573624p.html#srp


Watch this video. It's for a Titan XP, but it's the same story for the 1080 Ti. It explains exactly why you need a socket; I clipped it at the right timestamp. Basically, without the hex socket you will slip if you try to use pliers, and as they slip under the force of your hand squeezing them, they will invariably break the tiny, fragile components on the back of the PCB. Those screws have Loctite, so you need leverage to get them moving. Pliers will break the card.

A 5/32" imperial socket is very close in size to 4mm, so a 4mm would likely also work if that's what you have.


----------



## nrpeyton

Quote:


> Originally Posted by *bfedorov11*
> 
> Take it off from the back plate, not the front. Remove all screws/nuts from back and two on vent. Takes all of 2 minutes if you have the right tools.


Well thanks.

I haven't actually received it yet.

It's in the post.

But obviously I've been reading reviews & YouTube, incl. that video someone posted where Steve Burke does it.

I assume he over-complicates it then? It's easier than the video makes out?

If that's true, good to know ;-)

P.S.

You guys finding the *nvidia power limit* too restrictive?

What about with the slider at 120%? Still?

Reason I ask: most partner edition cards have higher power targets, and I don't really fancy doing a physical hardware power mod on a £700 card.

der8auer says it doesn't do any harm, but I've heard the LM eats the solder. Buildzoid's answer to that is to use Conductonaut instead of the traditional liquid metal, but I remain skeptical.
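For context on what that slider actually permits: GPU Boost treats it as a percentage of the board's default power target. A minimal sketch of the arithmetic, assuming the commonly reported 250 W default limit for the 1080 Ti FE (check your own card's BIOS, this figure is an assumption here):

```python
# Sketch of how the power-limit slider maps to a wattage ceiling.
# The 250 W default for the 1080 Ti FE is an assumed, commonly-reported
# figure, not verified against any specific BIOS.
DEFAULT_TDP_W = 250.0

def power_ceiling(slider_pct, tdp_w=DEFAULT_TDP_W):
    """Return the wattage the card may draw before TDP throttling kicks in."""
    return tdp_w * slider_pct / 100.0

print(power_ceiling(100))  # 250.0 W at the default slider position
print(power_ceiling(120))  # 300.0 W with the slider maxed at 120%
```

So even maxed, the FE slider only buys about 50 W of headroom, which is why people chase modded BIOSes or hardware shunt mods instead.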


----------



## ajresendez

Quote:


> Originally Posted by *Transmaniacon*
> 
> Joined the club today! Just recently finished the new build and was waiting to add the final piece. Also picked up a Dell S2716DG monitor.


Thinking about getting that monitor too. How are you liking it?


----------



## illypso

It is done, I could not resist anymore, to hell with my budget, I bought one. That's what credit is for.









After missing the one on newegg.ca because they got the stock in during the 15 minutes that I decided to shut down my computer to clean it (to prepare for the eventual new card)... I can't believe the timing and irony in that, lol.
I'd been waiting for hours just to miss it by a hair...









I did manage to get one from the Nvidia website at around 9:15 AM EST (8 hours later), but it's like $100 more because of the tax (I'm from Quebec).
(How come Newegg charges only 5% tax and Nvidia 15%?)

But I don't care, I needed it ;-)

Those 2 Gigabyte GTX 970s served me well at 1080p, but not at 1440p (Predator); I need more power.

Anyone from Canada who bought from Nvidia: it's my first time, how long should shipping take?

Also, I'm going to eventually water-cool it.

I would really like a EK-KIT P280 and a EK-FC Titan X Pascal

But I already have a Corsair H110, so I will probably go with the EVGA Hybrid water cooler; much cheaper.

Is a custom loop that much cooler to justify the price difference?
The hybrid will probably be enough for now.

I will post the results of my overclock when I receive it. I hope I get a lucky card like my 2x 970s.

Edit: the EVGA kit that fits with a little cutting is 400-HY-5188-B1?
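On the 5% vs 15% question above: the likely explanation is that Newegg collected only the federal GST (5%), while Nvidia's store also collected Quebec's QST (9.975%), for roughly 14.975% total. A minimal sketch of the difference, using an illustrative $699 price (not the actual charge):

```python
# Sketch: Canada's federal GST is 5%; Quebec adds QST of 9.975% on top.
# A seller collecting only GST charges 5%; one registered for both
# charges ~14.975% -- which matches the "5% vs 15%" difference seen here.
GST = 0.05
QST = 0.09975

def checkout_total(price, collects_qst):
    """Total a Quebec buyer pays, given whether the seller collects QST."""
    rate = GST + (QST if collects_qst else 0.0)
    return price * (1 + rate)

price = 699.00  # illustrative list price, not the actual invoice amount
print(f"GST only:  {checkout_total(price, False):.2f}")  # 733.95
print(f"GST + QST: {checkout_total(price, True):.2f}")
```

On ~$700, the QST alone is about $70, which lines up with the "like $100 more" once exchange and shipping are in the mix.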


----------



## richiec77

Quote:


> Originally Posted by *KedarWolf*
> 
> 5/32 hex socket? Would a 66 piece precision screwdriver set have one and what do I need the hex socket for?
> 
> Here is the set.
> 
> http://www.canadiantire.ca/en/pdp/mastercraft-specialty-precision-electronics-bit-set-66-pc-0573624p.html#srp


Instructions call for a 4mm, which IIRC is about 5/32"? I see a 4mm in that pic you posted, so that should work.

EDIT: 5/32" works out to 3.97mm... so yeah, either 5/32" or 4mm.

https://www.ekwb.com/shop/EK-IM/EK-IM-3831109831687.pdf
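The conversion is easy to check: a fractional-inch size times 25.4 gives millimetres, which is how 5/32" works out to just under 4 mm:

```python
from fractions import Fraction

# Exactly 25.4 mm per inch, kept as a Fraction so the arithmetic is exact.
MM_PER_INCH = Fraction(254, 10)

def frac_inch_to_mm(size):
    """Convert a fractional-inch size string (e.g. "5/32") to mm, exactly."""
    return Fraction(size) * MM_PER_INCH

print(float(frac_inch_to_mm("5/32")))  # 3.96875 -- close enough to a 4 mm hex
```

The 0.03 mm difference is well within the slop of a hex socket, so either tool grips the standoffs fine.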


----------



## GRABibus

Do you know if any modded BIOSes are ready to unlock power limits? (Like the ASUS Strix OC T4 BIOS for the GTX 1080?)


----------



## pantsoftime

Quote:


> Originally Posted by *GRABibus*
> 
> Do you now if any modded BIOS are ready to unlock power limits ? (As ASUS Strix OC BIOS t4 for GTX 1080) ?


Nothing yet... Maybe once the STRIX OC is widely available then the same guys will be willing to release a similar BIOS.


----------



## GRABibus

Quote:


> Originally Posted by *pantsoftime*
> 
> Nothing yet... Maybe once the STRIX OC is widely available then the same guys will be willing to release a similar BIOS.


I hope.
And then I will buy a GTX 1080 Ti.


----------



## nrpeyton

I thought ASUS had already released a new T4 for the TI?


----------



## pompss

Waterblock installed.

First test: Heaven.

Dipped to 2100 MHz for maybe a second, then back to 2126 MHz all the way to the end.
Mem at +0.

Valley: 2138.5 MHz stable to the end, no throttling. Mem at zero.

Max temp 44c.

Installed Fujipoly 1mm thermal pads.

Testing 3DMark now.


----------



## GRABibus

Quote:


> Originally Posted by *nrpeyton*
> 
> I thought ASUS had already released a new T4 for the TI?


Didn't hear about that.
By the way, if someone's got it, please, post it here for download !


----------



## OldNerd

Quote:


> Originally Posted by *KraxKill*
> 
> His card is unlikely to be running at 1646 like 3DMark thinks. Probably in the 2000-2100 range like the rest of them.
> 
> Here's my Fire Strike score, but this one was run at 2063 on just a pedestrian 4790K at 5.0GHz, despite 3DMark claiming otherwise.
> 
> 
> 
> http://www.3dmark.com/fs/12024073


Turned up the clocks a little and it started to wake up


----------



## Transmaniacon

Quote:


> Originally Posted by *ajresendez*
> 
> Thinking about getting that monitor too. How are you liking it?


It's been great so far! I was coming from a HP Omen 32 with a R9 390X and having the 144Hz is so much better than 75Hz. The colors are surprisingly accurate for a TN panel, the Omen is better but it's not a huge difference. Panel uniformity is great and there's no IPS glow or backlight bleed to worry about. Very happy with this monitor so far and will likely be keeping it.


----------



## pantsoftime

Quote:


> Originally Posted by *nrpeyton*
> 
> I thought ASUS had already released a new T4 for the TI?


That would be excellent but I'm not sure where you saw that. Could you let us know if you find it?


----------



## pantsoftime

Quote:


> Originally Posted by *pompss*
> 
> Waterblock installed
> 
> First test Heaven
> 
> 2100 mhz min like for one second after goes back 2126mhz all the time to the end
> 0+ men
> 
> Valley 2138,5 mhz stable to the end no throttling. Mem Zero
> 
> Temp max 44c
> 
> Installed fujipoly 1 mm thermal pads
> 
> Testing now 3d mark


Are you bypassing power limit on your card?


----------



## richiec77

Quote:


> Originally Posted by *pantsoftime*
> 
> Are you bypassing power limit on your card?


Seems like he's getting more power available to the core by not OCing the memory at all.


----------



## blurp

Quote:


> Originally Posted by *illypso*
> 
> Anyone from canada who bought from NVIDIA, its my firs time, how long should be the shipping.


Ordered from Nvidia like you did for the same reason. Shipped Friday the 10th. Arrived on Monday 13th. Pretty fast. I live in Québec City. ;-) Now i'm waiting for the waterblock from NCIX ... no ETA yet.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> What fan profile are you running for the fans and what speed are they running at?


Here's my curve, though I'm still playing with it for max efficiency. My card never goes above 53c during Heaven or gaming.


----------



## Baasha

Going to stream on the G-Sync rig w/ 4x 1080 Ti here in a few: https://goo.gl/XNDh26 Join the fun!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> Here's my curve, though I'm still playing with it for max efficiency, my card never goes above 53c during heaven or gaming.


Oh wait, are you using an EVGA hybrid kit? If you are, the radiator fan will be running at 100% (roughly 2000 RPM). The fan curve you have will be controlling the blower, so you can just keep that constant at 25%.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> Oh wait, are you using an EVGA hybrid kit? If you are, the radiator fan will be running at 100% (roughly 2000 RPM). The fan curve you have will be controlling the blower, so you can just keep that constant at 25%.


Yea, I have the fan on the rad hooked up to my cases fan controller, I crank it up to high when gaming, and leave it at low otherwise.









I was just kinda worried about the VRMs overheating.


----------



## pompss

Quote:


> Originally Posted by *pantsoftime*
> 
> Are you bypassing power limit on your card?


Not yet but i wish i could









For right now the best score in 3DMark Fire Strike Ultra was 7684.

+225 core (2126 MHz), +620 mem (1531 MHz).

http://www.3dmark.com/3dm/18693493?

No crash at +240 core and +650 mem, but the scores are lower than the one above because I'm hitting the power limit. I hope a modded BIOS comes soon to remove the power limit so I can push it further.

I will keep pushing to find the best mem/core offsets for the card.

Wish me luck.


----------



## alucardis666

Quote:


> Originally Posted by *pompss*
> 
> I wish.
> 
> For right now best score in 3d mark firestrike ultra was 7684
> 
> 225+ core 2,126mhz 620+ men 1531 mhz
> 
> http://www.3dmark.com/3dm/18693493?
> 
> No crash at 240+ core and 650+ mem but score are less then the one above bc im hitting power limit. i hope for mod bios soon to remove power limit so i can push it further.
> 
> I will keep pushing to find the best mem+ core+ for the card
> 
> wish me luck


Meanwhile my card can't even do 2075. lol. You lucky dog you.


----------



## illypso

Quote:


> Originally Posted by *blurp*
> 
> Ordered from Nvidia like you did for the same reason. Shipped Friday the 10th. Arrived on Monday 13th. Pretty fast. I live in Québec City. ;-) Now i'm waiting for the waterblock from NCIX ... no ETA yet.


Nice, that is fast shipping.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> Yea, I have the fan on the rad hooked up to my cases fan controller, I crank it up to high when gaming, and leave it at low otherwise.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I was just kinda worried about the the vrms over heating.


I think the VRAM and VRMs will be fine since the entire midplate is acting as a heatsink.

Turning the fan up high won't make too much of a difference without fins, unless you want to place some heatsinks on the midplate.









Thanks, I'm concerned my fans are holding me back. The Noctua NF-F12 only moves 55 CFM; the EVGA fans move 70 CFM. I just bought some new Noctuas that move 72 CFM, so I'm hoping I can get my temperature below 60C, because I'm at 62C right now.

Are you running the scythe fans at 100%?
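For a rough sense of that fan swap, the relative gain in rated airflow is simple arithmetic (bearing in mind rad temps don't scale linearly with CFM, and static pressure matters more than raw CFM on a radiator):

```python
def airflow_gain_pct(old_cfm, new_cfm):
    """Percent increase in rated airflow between two fans."""
    return (new_cfm - old_cfm) / old_cfm * 100.0

# NF-F12 (55 CFM) vs the new 72 CFM fans mentioned above
print(f"{airflow_gain_pct(55, 72):.1f}%")  # 30.9% more rated airflow
```

A ~31% airflow bump is real but usually translates to only a few degrees at the core, so expecting 62C to drop just under 60C is plausible rather than dramatic.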


----------



## pompss

Quote:


> Originally Posted by *alucardis666*
> 
> Meanwhile my card can't even do 2075. lol. You lucky dog you.


http://www.3dmark.com/3dm/18693977?

7733 score a nice jump


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> I think VRAM and VRM will be fine since the entire midplate is acting as a heat sync.
> 
> Turning the fan up high won't make too much of a difference without the use of fins. Unless if you want to place some heatsinks on the midplate
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks, I'm concerned my fans are holding me back. The Noctua NF-F12 only moves 55 CFM. The EVGA fans move 70 CFM. I just bought some new noctu that move 72 CFM so hoping I can lower my temperature below 60C because I'm at 62C right now.
> 
> Are you running the scythe fans at 100%?


Just using one scythe fan pushing hot air through the rad and out the case, not enough room for push-pull, and when gaming I have the fan at 100%, when not gaming it runs at ~40%.


----------



## nrpeyton

Quote:


> Originally Posted by *alucardis666*
> 
> Meanwhile my card can't even do 2075. lol. You lucky dog you.


Lol, for a card with a base clock of 1480. (Boost 1582).

Most people would be happy to hit 2000 on air, which I assume you are on.


----------



## BigMack70

Quote:


> Originally Posted by *nrpeyton*
> 
> Lol, for a card with a base clock of 1480. (Boost 1582).
> 
> Most people would be happy to hit 2000 on air. Which I assume you are


Let's be honest here - Nvidia's stated base and boost numbers are 100% meaningless.


----------



## nrpeyton

What you boosting to 'out of the box' with _no_ overclock, on air?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> Just using one scythe fan pushing hot air through the rad and out the case, not enough room for push-pull, and when gaming I have the fan at 100%, when not gaming it runs at ~40%.


If you want, try using SpeedFan to put a fan profile on the Scythe so it shuts off when the GPU is at 40C and ramps up to 100% at 58C. It'll eliminate noise and extend the life of your fans, etc ;D
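The profile described above is just a linear ramp between two temperature points. A sketch of that mapping (the 40C/58C endpoints come from the post; the function itself is illustrative, not SpeedFan's actual interpolation):

```python
def fan_duty(temp_c, off_below=40.0, full_at=58.0):
    """Linear fan curve: 0% at/below off_below, 100% at/above full_at."""
    if temp_c <= off_below:
        return 0.0
    if temp_c >= full_at:
        return 100.0
    return (temp_c - off_below) / (full_at - off_below) * 100.0

print(fan_duty(35))  # 0.0   -- fan off at idle
print(fan_duty(49))  # 50.0  -- halfway up the ramp
print(fan_duty(60))  # 100.0 -- pinned under load
```

The narrow 18C ramp means the fan goes from silent to full blast quickly once the card loads up, which is the behaviour the post is aiming for.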


----------



## nrpeyton

The bottom 1% or 2% of cards actually _won't_ properly boost to advertised speeds. Hence down-tuned variants on partner cards. Such cards would probably only _marginally_ be within Nvidia's spec.


----------



## DerComissar

Quote:


> Originally Posted by *illypso*
> 
> *It is done, I could not resist anymore, go to hell budget, I bought one. that what credit is for
> 
> 
> 
> 
> 
> 
> 
> *
> 
> After missing the one on newegg.ca because they got the stock in the 15 minute that I decided to close my computer to clean it... (to prepare for the eventual new card) I can't believe the timing and irony in that lol...
> I've been waiting for hour just to miss it by a dust...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I did manage to get one from Nvidia website at arround 9:15 AM EST (8 hours later) but Its like 100$ more because of the taxe (I'm from quebec)
> (how come newegg charge only 5% and Nvidia 15% taxe?)
> 
> But I don't care, I needed it ;-)
> 
> Those 2 gigabyte gtx970 served me well at 1080p but not at 1440p (predator) I need more power.
> 
> Anyone from canada who bought from NVIDIA, its my firs time, how long should be the shipping.
> 
> Also I'm going to eventually water-cool it,
> 
> I would really like a EK-KIT P280 and a EK-FC Titan X Pascal
> 
> But I already have a h110 corsair so I will probably go with EVGA HYBRID Water Cooler. much cheaper.
> 
> Is it that much cooler to justifies the price difference.
> It will probably be enough for now.
> 
> I will post result of my overclock when I received it, I hope I will get a lucky card like my 2x970.
> 
> Edit: the evga kit that fit with little cutting is 400-HY-5188-B1 ?


Love that statement, and so true, lol!








Rep+









Do consider the first option for cooling if you aren't in a hurry; it's great to have a proper loop.
An AIO is certainly cheaper and easier initially, but imo a real loop is still worth it in the long run.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *illypso*
> 
> It is done, I could not resist anymore, go to hell budget, I bought one. that what credit is for
> 
> 
> 
> 
> 
> 
> 
> 
> 
> After missing the one on newegg.ca because they got the stock in the 15 minute that I decided to close my computer to clean it... (to prepare for the eventual new card) I can't believe the timing and irony in that lol...
> I've been waiting for hour just to miss it by a dust...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I did manage to get one from Nvidia website at arround 9:15 AM EST (8 hours later) but Its like 100$ more because of the taxe (I'm from quebec)
> (how come newegg charge only 5% and Nvidia 15% taxe?)
> 
> But I don't care, I needed it ;-)
> 
> Those 2 gigabyte gtx970 served me well at 1080p but not at 1440p (predator) I need more power.
> 
> Anyone from canada who bought from NVIDIA, its my firs time, how long should be the shipping.
> 
> Also I'm going to eventually water-cool it,
> 
> I would really like a EK-KIT P280 and a EK-FC Titan X Pascal
> 
> But I already have a h110 corsair so I will probably go with EVGA HYBRID Water Cooler. much cheaper.
> 
> Is it that much cooler to justifies the price difference.
> It will probably be enough for now.
> 
> I will post result of my overclock when I received it, I hope I will get a lucky card like my 2x970.
> 
> Edit: the evga kit that fit with little cutting is 400-HY-5188-B1 ?


I bought my Titan XP from the Nvidia store Aug 2nd and received my card Aug 3rd to Edmonton, Alberta.


----------



## khemist

Quote:


> Originally Posted by *ajresendez*
> 
> Thinking about getting that monitor too. How are you liking it?


I have the same setup, it's fantastic, perfect match i would say.


----------



## illypso

Quote:


> Originally Posted by *DerComissar*
> 
> Love that statement, and so true, lol!
> 
> 
> 
> 
> 
> 
> 
> 
> Rep+
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Do consider the first option for cooling if you aren't in a hurry, it's great to have a proper loop.
> An aio certainly is cheaper and easier initially, imo a real loop is still worth it in the long run.


I decided to go for the AIO; Newegg had it in stock, five mouse clicks and it's on the way (way too easy to make money disappear). I will start with that and see what kind of numbers I can get out of it. I will probably switch to a custom loop this summer when outside temps get high and the Mastercard gets low.









It was a hard choice, but my CPU temp is really nice so I see no need to cool it further for now. (4790K at 4.7GHz on stock volts, but 4.8 needs almost max volts, so no real gain to be made on the CPU.)
Quote:


> I bought my Titan XP from the Nvidia store Aug 2nd and received my card Aug 3rd to Edmonton, Alberta.


Damn, I can almost expect it this Monday, that's fast. Too bad there's no shipping on the weekend.


----------



## Slackaveli

Quote:


> Originally Posted by *OldNerd*
> 
> PCConnection had them in stock yesterday and I pounced and had it overnighted for today


nice. us old dudes need dat power!


----------



## nrpeyton

Plenty on eBay, but you're paying a premium for "in hand, ready for dispatch".

People buying just to sell at ridiculous prices. I hope they get left with a few, lol.

_First_ 3 that popped up cost:

-999
-950
-845

Normal price = 699

Chancers, like.


----------



## JedixJarf

Just keep an eye on nowinstock and zooalert


----------



## nrpeyton

I truly wonder if those Ebay guys actually _do_ manage to sell those cards at those prices this week.


----------



## OldNerd

Quote:


> Originally Posted by *nrpeyton*
> 
> Plenty on Ebay, but you're paying premium for "in hand ready for dispatch".
> 
> Ppl buying, just to sell @ ridiculous prices. I hope they get left with a few, lol.
> 
> _First_ 3 that popped up cost:
> 
> -999
> -950
> -845
> 
> Normal price = 699
> 
> Chancers, like.


Amazon private stores/sellers are just as bad, $850+! I paid $775 I think, with overnight Saturday early-AM delivery, lol.


----------



## OldNerd

Quote:


> Originally Posted by *nrpeyton*
> 
> I truly wonder if those Ebay guys actually _do_ manage to sell those cards at those prices this week.


Many are full-on BS pre-order sales. They don't even have the card, but are flipping it based on credit card purchases plus a premium...


----------



## nrpeyton

Quote:


> Originally Posted by *OldNerd*
> 
> Amazon private stores / sellers are just as bad $850 + ! I paid 775 I think with overnight saturdays early am delivery lol


Is there any reason for Nvidia rushing the Ti out, was it ahead of schedule?

Seems odd how they're so short. I mean, they're only manufacturing one product just now... the factories won't be doing anything else.

Should catch up soon.

Partner companies all seem a bit behind too.

Or maybe it's all part of a strategy to keep suspense up. "_The harder something is to get, the more you want it._"


----------



## illypso

Quote:


> Originally Posted by *JedixJarf*
> 
> Just keep an eye on nowinstock and zooalert


Nowinstock, that's how I got mine. They even have an alarm module; I put the volume to the max and jumped out of bed (almost killed myself), then got back to sleep. I wasn't sure if it was a dream when I woke up, but I had the confirmation email.


----------



## nrpeyton

Quote:


> Originally Posted by *illypso*
> 
> nowinstock, thats how I got mine, They even have an alarm module, I put the volume to the max and jump out of the bed. (almost killed myself) then got back to sleep. Wasn't sure if it was a dream when I woke up but I had the confirmation email.


lol


----------



## OldNerd

Quote:


> Originally Posted by *nrpeyton*
> 
> Is there any reason for Nvidia rushing the TI out, was it ahead of schedule?
> 
> Seems odd how they're so short. I mean they are only manufacturing 1 product jost now.. factories won't be doing anything else.
> 
> Should catch up soon.
> 
> Partner companies all seem a bit behind too.
> 
> Or maybe its all part of a strategy to keep suspense up. "_the harder something is to get, the more you want it_".


It's the Apple-pioneered "choked" supply chain: it ensures hype and demand, keeps it in the news, and artificially primes supply/demand.


----------



## moonbogg

Quote:


> Originally Posted by *duppex*
> 
> Nice GPU Block.
> 
> First time trying to put a GPU underwater. Some advise on the following would be appreciated.
> 
> I am looking to get the same EK block as the one above post for the my GTX 1080ti FE.
> 
> Can I use the stock back plate with the EK full block?
> Will a 120mm EK cool stream radiator be enough to keep this GPU cool?
> Also will the below EK Reservoir/Pump combo be ok?
> 
> EK-DCP 2.2 X-RES Combo Reservoir for EK-DCP 2.2
> https://www.scan.co.uk/products/ek-dcp-22-x-res-combo-reservoir-for-ek-dcp-22
> 
> Does anyone know of a cheap but OK toolkit that will do the job.
> 
> I only have a small NZXT Manta case so not a lot of space to play about with.
> 
> Many Thanks in advance


The stock backplate won't fit without modding it, based on what I read in EK's instructions for that waterblock.

I think that pump/res should work fine, but it seems kind of cheap; I wouldn't be comfortable with it. Make sure you do your research and read reviews. As for a single 120mm fan/radiator, that's all people use with the hybrid kits and they work fine, so I think it would work well. You may consider an EVGA hybrid kit and just do it that way, but a full block is better I think. Looks better at least, that's for sure. Those hybrid kits are ugly as all hell.

Also, I feel very lucky to already have one of these in my rig, under water and working great. So many people want one but just can't buy it. I barely made it on release day; I almost missed the initial stock.


----------



## craxx

What is everyone seeing for temps on water? I have dual D5s set on speed 2, an RS360 and a Monsta 240mm, both push/pull with SP120 HP fans at 5V, an EK Supremacy EVO, and an EK nickel/plexi Titan XP block. The 4770K is at 4.4 at 1.325V; the 1080 Ti is at stock voltage, +160 core, +400 mem. Before I added the GPU I would hit about 72°C on the CPU; now with the GPU under water, I hit 83°C on the CPU and 56°C on the GPU after one Conquest round in BF1. Should be plenty of rad I would think, but it seems way too hot.


----------



## JedixJarf

Quote:


> Originally Posted by *craxx*
> 
> What is everyone seeing for temps with water? I have dual d5s set on speed 2, an rs360, monsta 240mm both with push pull sp120 hp on 5v, ek supremacy evo, and ek nickel/plexi titan xp block. 4770k is at 4.4 at 1.325v. 1080ti is stock voltage +160 core, +400mem. Before I added the gpu I would hit about 72c on the cpu, now with the gpu under water, I hit 83 on the cpu and 56 on gpu after 1 conquest round in bf1. Should be plenty of rad I would think, but it seems way too hot.


Uh... something seems way wrong on those cpu temps. Did you use mustard for thermal paste? lol. Right now with 360mm of rad space I'm seeing around 40-45c on my 1080 ti with +150 core and a 5820k @ 4.2ghz.


----------



## moonbogg

Quote:


> Originally Posted by *craxx*
> 
> What is everyone seeing for temps with water? I have dual d5s set on speed 2, an rs360, monsta 240mm both with push pull sp120 hp on 5v, ek supremacy evo, and ek nickel/plexi titan xp block. 4770k is at 4.4 at 1.325v. 1080ti is stock voltage +160 core, +400mem. Before I added the gpu I would hit about 72c on the cpu, now with the gpu under water, I hit 83 on the cpu and 56 on gpu after 1 conquest round in bf1. Should be plenty of rad I would think, but it seems way too hot.


The GPU is simply heating the water up and affecting the CPU temps. Try turning the D5 up to a higher setting and turn your fans up. I have three UT60 360 rads with just a 6800K and a 1080 Ti, and if I leave my fans all the way down, temps will creep up. The GPU simply dumps a fat load of heat. The CPU hits the upper 50s to lower 60s and the GPU hits the low to mid 40s.
If you hit 56°C on your GPU, then your loop is getting loaded and can't dump the heat fast enough, IMO.
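The loop-sharing effect described above can be put into rough numbers. A back-of-the-envelope sketch, where the wattages and the W/°C radiator figures are illustrative assumptions, not measurements of anyone's actual loop:

```python
# Rough steady-state sketch: in a shared loop, the GPU's heat load raises
# the common water temp, which raises CPU temps too. All numbers below
# are illustrative assumptions, not measurements.

def water_delta_t(heat_watts: float, rad_w_per_c: float) -> float:
    """Steady-state water temp rise over ambient, assuming the radiators
    dissipate rad_w_per_c watts per degree C of water-to-air delta."""
    return heat_watts / rad_w_per_c

# Assumed: ~250 W GPU + ~130 W CPU into the loop; radiators + fans move
# ~30 W/°C at low fan speed vs ~60 W/°C with fans turned up.
total_heat = 250 + 130
print(f"low fans:  +{water_delta_t(total_heat, 30.0):.1f} C over ambient")
print(f"high fans: +{water_delta_t(total_heat, 60.0):.1f} C over ambient")
```

Roughly halving the water-to-air delta by turning the fans and pump up is why the advice above helps both the CPU and GPU numbers at once.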


----------



## Slackaveli

Quote:


> Originally Posted by *craxx*
> 
> What is everyone seeing for temps with water? I have dual d5s set on speed 2, an rs360, monsta 240mm both with push pull sp120 hp on 5v, ek supremacy evo, and ek nickel/plexi titan xp block. 4770k is at 4.4 at 1.325v. 1080ti is stock voltage +160 core, +400mem. Before I added the gpu I would hit about 72c on the cpu, now with the gpu under water, I hit 83 on the cpu and 56 on gpu after 1 conquest round in bf1. Should be plenty of rad I would think, but it seems way too hot.


BF1 is the hardest game on a CPU; it's worse than a run of Cinebench, almost burn-test level. I'd recommend a fresh thermal paste job on the CPU and maybe a better static-pressure fan for the rad. Or turn that dude down to 4.3 @ 1.275V (or whatever yours needs at that frequency).


----------



## OldNerd

Quote:


> Originally Posted by *JedixJarf*
> 
> Did you use mustard for thermal paste?

Choked on my beer laughing at that. The visual of mustard and a plastic knife spreading it on the GPU, lmaooooo, pretty funny lol


----------



## doogk

With voltage unlocked, what is Afterburner reading for y'all at 100% voltage?


----------



## djriful

Quote:


> Originally Posted by *craxx*
> 
> What is everyone seeing for temps with water? I have dual d5s set on speed 2, an rs360, monsta 240mm both with push pull sp120 hp on 5v, ek supremacy evo, and ek nickel/plexi titan xp block. 4770k is at 4.4 at 1.325v. 1080ti is stock voltage +160 core, +400mem. Before I added the gpu I would hit about 72c on the cpu, now with the gpu under water, I hit 83 on the cpu and 56 on gpu after 1 conquest round in bf1. Should be plenty of rad I would think, but it seems way too hot.


Something is wrong with your loop.

Compare to my old socket 2011 system, with a CPU and GPU not as efficient as the newer ones.

I ran Prime95 + a GPU benchmark; the CPU maxed around 65°C and the GPU sat at 40°C with a room ambient of 21°C. Mine is a 130W CPU and a 250W GPU, and both are overclocked, pulling much more wattage.

D5 pump on speed 4 with an EK reservoir, EK CPU block, EK MOSFET block, EK GPU block, distilled water + Dazmode Protector glycol. 360mm Dazmode rad and 240mm XSPC rad.


----------



## craxx

Quote:


> Originally Posted by *JedixJarf*
> 
> Uh... something seems way wrong on those cpu temps. Did you use mustard for thermal paste? lol. Right now with 360mm of rad space I'm seeing around 40-45c on my 1080 ti with +150 core and a 5820k @ 4.2ghz.


lol, no, I have Coollaboratory Liquid Pro under the IHS and Gelid GC-Extreme on top of the IHS. It has always seemed way hot to me. I think I may return it to stock and see what it is. Honestly I'm not really sure what I'm doing when OCing the CPU; I've read the Haswell guide several times and attempted to redo my OC, but I still end up about the same.


----------



## Baasha

Wondering if someone can help me diagnose the following.

Just installed my 4x 1080 Ti GPUs on my 2nd rig and put the EVGA 1600W T2 PSU in there as well. Ran 3DMark Fire Strike Ultra - runs fine - all GPUs scale (score is much lower since I'm using the 3970X @ 4.50Ghz).

However, Heaven 4.0 doesn't scale well even at 4K w/ 8x AA!

Games also are not scaling anywhere near as well as on the first rig w/ the 6950X @ 4.30Ghz.

Can the CPU alone cause this? I'm also running "only" DDR3 @ 2133Mhz (32GB) on the rig with the 4x 1080 Tis whereas on the other rig, I have DDR4 @ 3200Mhz (64GB).

Further, if I use DSR to play at 5K (this rig has the RoG Swift @ 1440P @ 144Hz), the scaling is pretty bad, whereas 5K on the other rig scales >90% across all 4 GPUs.

If I bump up resolution scale to 8K (150% from 5K or 200% from 4K UHD), the game (BF1) crashes instantly with a DirectX error.

The other rig can play at 8K all day (4x Titan XP).

Why is the scaling so bad when all other components (other than CPU & RAM) are the same? Could the X79 platform itself be limiting scaling somehow?

Would like some advice on this.
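For reference, the 8K figures quoted work out if the in-game resolution scale multiplies each axis. A quick check (illustrative only; per-axis scaling is an assumption consistent with the percentages above):

```python
# Per-axis resolution-scale arithmetic behind the 5K/8K figures above.
# Assumes the scale percentage multiplies width and height independently,
# which matches the numbers quoted in the post.

def scaled(width: int, height: int, percent: int) -> tuple:
    f = percent / 100
    return int(width * f), int(height * f)

print(scaled(5120, 2880, 150))  # 150% of 5K     -> (7680, 4320), i.e. 8K
print(scaled(3840, 2160, 200))  # 200% of 4K UHD -> (7680, 4320), i.e. 8K
```

Either path lands on the same 7680x4320 render target, which is why both settings crash the same way when the cards or driver can't handle it.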


----------



## craxx

Quote:


> Originally Posted by *Slackaveli*
> 
> bf1 is the hardest game on a cpu. it's worse than a run of cinebench. it's almost burntest level. would recommend a new thermal pasting on cpu and maybe a better sp fan for the rad. Or, turn that dude down to 4.3 @ 1.275 (or whatever yours is at at that freq).


Thermal paste was new as of today. Not sure what is needed for 4.3 but I will try to lower my voltage if I can.
Quote:


> Originally Posted by *djriful*
> 
> Something is wrong with your loop.
> 
> Compare to my old 2011 system with not so much efficient as newer CPU and GPU.
> 
> I ran Prime95 + GPU Benchmark, CPU max around 65'c and GPU sit 40'c with room ambient 21'c. My system is a 130w CPU and 250w GPU, and both are overclocked pulling much more wattage.
> 
> D5 Pump speed 4 with EK Reservoir, CPU EK Block, MOSFET EK Block, GPU EK Block, distill water + dazmode protector glycol. 360mm Dazmode rad and 240mm XSPC rad.


I think my RS360 might have some buildup inside it; it's at least 5 years old now. I disassemble and clean everything at least once a year. The CPU block is only like 3 months old; I added the Monsta 240 at the same time.


----------



## JedixJarf

Quote:


> Originally Posted by *doogk*
> 
> With voltage unlocked, what is afterburner reading for yall with %100 voltage?


1093mV here.


----------



## jarble

I am a happy camper







Ignore the potato loop pic; HDR and I are currently not on speaking terms. Simple GPU bypass until I decide on some blocks.


----------



## DerComissar

Quote:


> Originally Posted by *illypso*
> 
> I decided to go for the AIO, newegg had it in stock, 5 mouse click and its on the way, (way too easy to make money disappear) I will start with that and see what kind of number i can get of it, I will probably switch to custom loop this summer when outside temps get high and mastercard get low
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It was a hard choice but my cpu temp is really nice so I see no need to cool it further for now. (4790k at 4.7ghz stock volt, but 4.8 need almost max volt so no real gain to make on cpu)
> 
> Quote:
> 
> 
> 
> I bought my Titan XP from the Nvidia store Aug 2nd and received my card Aug 3rd to Edmonton, Alberta.
> 
> 
> 
> Dam I can almost expect it this monday, thats fast. too bad there is no shipping in the weekend.
Click to expand...

Well that's good. From what I've seen posted, those with AIOs are getting quite good temps with their cards.
And certainly a lot more silence!

You may not get the card quite as fast; keep in mind, those of us who have been following MrTOOSHORT all know that he is the "Titan Meister", lol.


----------



## djriful

Out of stock everywhere in Canada...

Also waiting for my tax return... and my new credit card (last one got compromised by some thief when I was in down south USA)... Grrrrrr


----------



## sWaY20

Quote:


> Originally Posted by *djriful*
> 
> Out of stock everywhere in Canada...
> 
> Also waiting for my tax return... and my new credit card (last one got compromised by some thief when I was in down south USA)... Grrrrrr


I'll sell mine with wb for 1100...jk, unless someone really wants it ?


----------



## DerComissar

Quote:


> Originally Posted by *djriful*
> 
> Out of stock everywhere in Canada...
> 
> Also waiting for my tax return... and my new credit card (last one got compromised by some thief when I was in down south USA)... Grrrrrr


Keep an eye on the tracking sites.
I lucked out on Thursday night, and found a site called Mike's Computer Shop, in B.C., that had about 6 EVGA 1080Ti's in stock for a short time.
I jumped on the order button and managed to score one before they ran out.


----------



## Slackaveli

Quote:


> Originally Posted by *Baasha*
> 
> Wondering if someone can help me diagnose the following.
> 
> Just installed my 4x 1080 Ti GPUs on my 2nd rig and put the EVGA 1600W T2 PSU in there as well. Ran 3DMark Fire Strike Ultra - runs fine - all GPUs scale (score is much lower since I'm using the 3970X @ 4.50Ghz).
> 
> However, Heaven 4.0 doesn't scale well even at 4K w/ 8x AA!
> 
> Games also are not scaling anywhere as nearly well as the first rig w/ the 6950X @ 4.30Ghz.
> 
> Can the CPU alone cause this? I'm also running "only" DDR3 @ 2133Mhz (32GB) on the rig with the 4x 1080 Tis whereas on the other rig, I have DDR4 @ 3200Mhz (64GB).
> 
> Further, if I use DSR to play at 5K (this rig has the RoG Swift @ 1440P @ 144Hz), the scaling is pretty bad where as 5K on the other rig scales >90% across all 4 GPUs.
> 
> If I bump up resolution scale to 8K (150% from 5K or 200% from 4K UHD), the game (BF1) crashes instantly with a DirectX error.
> 
> The other rig can play at 8K all day (4x Titan XP).
> 
> Why is the scaling so bad when all other components (other than CPU & RAM) are the same? Could the X79 platform itself be limiting scaling somehow?
> 
> Would like some advice on this.


That millionaire struggle is real, huh?
I didn't think more than 2 were supported, but yeah, I imagine 56 TFLOPS on a 3-series Intel might not be the way to go.


----------



## craxx

Turned the pumps up to 5 now. The CPU still hit 82°C and the GPU seemed to hold about 53°C. Gonna remove the Corsair fan reducers and just use the BitFenix 5V adapters and see if that changes anything. Just added the Corsair adapters today and didn't notice a drop in noise at all, so I'll remove them and see if the little bump in fan speed helps any. If not, I'll remove the 5V adapters, just run the Corsair adapters, and deal with the loud-ass fans.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> that millionaire struggle is real, huh?
> I didnt think more than 2 were supported, but, yeah, i imagine 56 TFLOPS on a 3 series Intel might not be the way to go.


Lol. Wonder what he does for a living









I'd say sell the Titans and put the 1080 Tis in the main rig, then buy some regular vanilla 1080s or 1070s for the secondary and use the money made to beef up that secondary PC's CPU, or go on a vacation.


----------



## KedarWolf

Quote:


> Originally Posted by *djriful*
> 
> Out of stock everywhere in Canada...
> 
> Also waiting for my tax return... and my new credit card (last one got compromised by some thief when I was in down south USA)... Grrrrrr


https://www.nvidia.com/en-us/geforce/products/10series/geforce-gtx-1080-ti/ In stock I think.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Baasha*
> 
> Wondering if someone can help me diagnose the following.
> 
> Just installed my 4x 1080 Ti GPUs on my 2nd rig and put the EVGA 1600W T2 PSU in there as well. Ran 3DMark Fire Strike Ultra - runs fine - all GPUs scale (score is much lower since I'm using the 3970X @ 4.50Ghz).
> 
> However, Heaven 4.0 doesn't scale well even at 4K w/ 8x AA!
> 
> Games also are not scaling anywhere as nearly well as the first rig w/ the 6950X @ 4.30Ghz.
> 
> Can the CPU alone cause this? I'm also running "only" DDR3 @ 2133Mhz (32GB) on the rig with the 4x 1080 Tis whereas on the other rig, I have DDR4 @ 3200Mhz (64GB).
> 
> Further, if I use DSR to play at 5K (this rig has the RoG Swift @ 1440P @ 144Hz), the scaling is pretty bad where as 5K on the other rig scales >90% across all 4 GPUs.
> 
> If I bump up resolution scale to 8K (150% from 5K or 200% from 4K UHD), the game (BF1) crashes instantly with a DirectX error.
> 
> The other rig can play at 8K all day (4x Titan XP).
> 
> Why is the scaling so bad when all other components (other than CPU & RAM) are the same? Could the X79 platform itself be limiting scaling somehow?
> 
> Would like some advice on this.


The 3970X is getting kind of slow for the new gen of cards, especially so with quad 1080 Tis. Also make sure you are running PCI-E 3.0 with the regedit hack, and make sure to use the extra PCI-E Molex plug on the motherboard for extra juice if you haven't already.


----------



## alucardis666

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> But also make sure you are running pci-E 3.0 with the regedit hack.


Can you elaborate on this? I just set my slots to 3.0 in my bios.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *alucardis666*
> 
> Can you elaborate on this? I just set my slots to 3.0 in my bios.


SB-E couldn't run pci-E 3.0 without a reg hack:

*http://nvidia.custhelp.com/app/answers/detail/a_id/3135/session/L3RpbWUvMTM0MDIyMzU2OC9zaWQvaDEzbE45X2s=*

Some still have issues with the hack; I never did, though, across a couple of 3930Ks, a 3960X, and two 3970X CPUs.
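If you want to confirm the hack actually took, nvidia-smi can report the negotiated link generation (these query field names come from NVIDIA's nvidia-smi query interface; note the link often downtrains to Gen1 at idle, so check under load):

```shell
# Report the PCIe link generation currently negotiated vs. the maximum
# the card/slot supports; "3" for current under load means Gen3 is active.
nvidia-smi --query-gpu=name,pcie.link.gen.current,pcie.link.gen.max --format=csv
```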


----------



## djriful

Quote:


> Originally Posted by *alucardis666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MrTOOSHORT*
> 
> But also make sure you are running pci-E 3.0 with the regedit hack.
> 
> 
> 
> Can you elaborate on this? I just set my slots to 3.0 in my bios.
Click to expand...

I believe he meant for X79 boards. They need it manually turned on via regedit; right now mine is stuck at 2.0 in Windows.


----------



## MightEMatt

Quote:


> Originally Posted by *djriful*
> 
> I believe he meant for X79 boards. They need to manually turn on via Regedit, like right now mine is stuck at 2.0 in Windows.


Or download the Kepler patch.


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> https://www.nvidia.com/en-us/geforce/products/10series/geforce-gtx-1080-ti/ In stock I think.


was for a lil minute.


----------



## KickAssCop

People w/ FE SLI. Top and bottom card temps and fan speeds. Max overclocks?


----------



## alucardis666

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> SB-E couldn't run pci-E 3.0 without a reg hack:
> 
> *http://nvidia.custhelp.com/app/answers/detail/a_id/3135/session/L3RpbWUvMTM0MDIyMzU2OC9zaWQvaDEzbE45X2s=*
> 
> Some still have issues with the hack, I never did though with a couple 3930ks, a 3960x and two 3970x cpus.


Quote:


> Originally Posted by *djriful*
> 
> I believe he meant for X79 boards. They need to manually turn on via Regedit, like right now mine is stuck at 2.0 in Windows.


Quote:


> Originally Posted by *MightEMatt*
> 
> Or download the Kepler patch.


Thanks for clarifying


----------



## DStealth

Quote:


> Originally Posted by *pompss*
> 
> http://www.3dmark.com/3dm/18693977?
> 
> 7733 score a nice jump


Not bad...
http://www.3dmark.com/compare/fs/12047371/fs/10950994
A quick comparison to my 1080 at the same clocks shows a 28% difference, or a GT1 9.5fps and GT2 6fps overall gain :thumb:


----------



## alucardis666

Well, after some tedious trial and error I've discovered that my card tops out at a 475 mem offset and 2037MHz boost @ 1.0620V. Kinda disappointing, really. I'd hoped to push higher with the hybrid cooler, but I guess the silicon lottery wasn't too kind to me this round.


----------



## nrpeyton

Anyone tried under-volting yet?

On my 1080 (air) I'd game at 2100MHz @ 1.0V & 50°C.

Fan curve (3-point _only_ curve):
15% @ 40°C
100% @ 60°C

On water, GPU temp was never more than 8-10°C above water temp.
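The two listed points read as a simple linear ramp; a quick sketch (the straight-line interpolation between points is my assumption about how Afterburner-style curve editors behave, not something stated above):

```python
# Sketch of the fan curve above: 15% at 40C ramping to 100% at 60C.
# The curve points are the poster's; linear interpolation between them
# is an assumption about the curve editor's behavior.

def fan_percent(temp_c, points=((40, 15), (60, 100))):
    (t0, p0), (t1, p1) = points
    if temp_c <= t0:
        return p0          # below the ramp: hold the floor speed
    if temp_c >= t1:
        return p1          # above the ramp: pin to 100%
    return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(50))  # 57.5, halfway up the ramp
```

A steep 20°C ramp like this keeps the card near-silent at idle and only gets aggressive as the water warms up, which matches the temps reported above.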


----------



## alucardis666

Quote:


> Originally Posted by *nrpeyton*
> 
> Anyone tried under-volting yet?
> 
> On my 1080 (air) I'd game with, 2100 MHZ @ 1.0v & 50c.
> 
> Fan Curve (3 point _only_ curve):
> 15% @ 40c
> 100% @ 60c
> 
> On water, GPU temp was never more than 8c-10c above water temp.


Hmmm. I'll try it!


----------



## nrpeyton

Quote:


> Originally Posted by *alucardis666*
> 
> Hmmm. I'll try it!


I'll watch this space then ;-)


----------



## alucardis666

Quote:


> Originally Posted by *nrpeyton*
> 
> I'll watch this space then ;-)


*Both power and temp were maxed out before.*


*BEFORE Adjusting POWER & TEMP slider*


*AFTER Adjusting POWER & TEMP slider*


----------



## Baasha

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> The 3970x is getting kind of slow for the new gen of cards especially so with quad 1080 TIs. But also make sure you are running pci-E 3.0 with the regedit hack. Make sure to use the extra pci-E molex plug on the motherboard for extra juice if you hadn't already.


Does my mobo need that RegEdit hack though? I have the ASRock Extreme11 X79 which has 2x PLX chips so I'm running x16/x16/x16/x16 @ PCI-E 3.0 (at least that's what it says in GPU-Z).

Will try it out and see if it helps.

Hmm.. I upgraded the PSU to run 4x 1080 Ti. I bought a CPU cooler (Kraken X62) and replaced all the fans in my case (Scythe Ultra Kaze @ 3000RPM)... the question is should I go the full distance and replace mobo/CPU as well? I wasn't planning on building a 'new' rig tbh... just wanted to run 4 Way SLI in the 2nd rig as well and see how well it can do 4K 144Hz or 5K 144Hz... - none of which is possible with the current setup due to the crappy scaling.

The funny thing is when I first bought the 3970X I was running 4x OG Titans in 4 Way SLI and the scaling was pretty good back in the day. I guess with the new tech, this CPU is showing its age.


----------



## KraxKill

Quote:


> Originally Posted by *alucardis666*
> 
> *Both power and temp were maxed out before.*
> 
> 
> *BEFORE Adjusting POWER & TEMP slider*
> 
> 
> *AFTER Adjusting POWER & TEMP slider*


I would guess that the card in your "before" screen is actually performing better and likely to score higher in 3DMark despite the lower top bin clock. Look at the PerfCap reason: the "after" screen is hitting the limiter more, while the "before" one seems to only be hitting voltage and an occasional Pwr.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Baasha*
> 
> Does my mobo need that RegEdit hack though? I have the ASRock Extreme11 X79 which has 2x PLX chips so I'm running x16/x16/x16/x16 @ PCI-E 3.0 (at least that's what it says in GPU-Z).
> 
> Will try it out and see if it helps.
> 
> Hmm.. I upgraded the PSU to run 4x 1080 Ti. I bought a CPU cooler (Kraken X62) and replaced all the fans in my case (Scythe Ultra Kaze @ 3000RPM)... the question is should I go the full distance and replace mobo/CPU as well? I wasn't planning on building a 'new' rig tbh... just wanted to run 4 Way SLI in the 2nd rig as well and see how well it can do 4K 144Hz or 5K 144Hz... - none of which is possible with the current setup due to the crappy scaling.
> 
> The funny thing is when I first bought the 3970X I was running 4x OG Titans in 4 Way SLI and the scaling was pretty good back in the day. I guess with the new tech, this CPU is showing its age.


I saw your 1080 Ti bench just now in the benchmarks area and it does say PCI-E 3.0, so it shouldn't be that. I'd have to say it's the CPU showing its age. You could get an $800 1680 v2 from eBay that should work for your board. It's an unlocked IB-E 8-core/16-thread CPU.

Easy swap, instead of a motherboard/RAM overhaul. Then wait for Skylake-E; the 6950X will be the backup system then, lol.


----------



## nrpeyton

Quote:


> Originally Posted by *alucardis666*
> 
> *Both power and temp were maxed out before.*
> 
> 
> *BEFORE Adjusting POWER & TEMP slider*
> 
> 
> *AFTER Adjusting POWER & TEMP slider*


Could try playing around with the voltage/frequency curve?


----------



## alucardis666

Quote:


> Originally Posted by *nrpeyton*
> 
> Could try playing around with the voltage/frequency curve?


I can do 1999mhz @ 1.000v


----------



## nrpeyton

Quote:


> Originally Posted by *alucardis666*
> 
> I can do 1999mhz @ 1.000v


Nice, that's exactly what I was hoping for ;-)

Not sure what to make of your pics yet though (earlier post): a faster core clock despite lower targets. Unless something else moved too.


----------



## MURDoctrine

Well, for anyone having downclocking issues, I may have found the root cause, for myself at least. I turned on desktop recording in ShadowPlay and my card sticks at first-bin load clocks (1506MHz for me) until I turn that feature off. So if you have that enabled and your card is not downclocking, give that a look.


----------



## wizardbro

Anyone on a full water block and a quiet/silent setup have coil whine with this card at 100+ fps and 100%gpu usage?

My 980ti coil whine is the loudest thing in my PC BY FAR at Max loads, thinking of upgrading.


----------



## pantsoftime

Quote:


> Originally Posted by *wizardbro*
> 
> Anyone on a full water block and a quiet/silent setup have coil whine with this card at 100+ fps and 100%gpu usage?
> 
> My 980ti coil whine is the loudest thing in my PC BY FAR at Max loads, thinking of upgrading.


No discernible coil whine under water. My previous 1080FE had noticeable coil whine in the same scenarios.


----------



## Nitemare3219

Quote:


> Originally Posted by *KickAssCop*
> 
> People w/ FE SLI. Top and bottom card temps and fan speeds. Max overclocks?


You're not going to be able to overclock on FE SLI. My TXP SLI is terrible with the FE coolers... top card is 10+ degrees hotter. I have my fan curve set to GPU temp - 10%, and it still throttles slightly at stock speeds sometimes, even with a 65 FPS cap, and power/temp raised up.


----------



## Blackclad

delete


----------



## Blackclad

Quote:


> Originally Posted by *wizardbro*
> 
> Anyone on a full water block and a quiet/silent setup have coil whine with this card at 100+ fps and 100%gpu usage?
> 
> My 980ti coil whine is the loudest thing in my PC BY FAR at Max loads, thinking of upgrading.


Mine has massive coil whine, yes, with the AIO.

I am still trying to figure out why my card has decided to idle at 57°C.


----------



## craxx

Mine has bad coil whine also. EVGA, if it makes a difference.


----------



## NoDoz

I've been reading that the PCB is exactly the same as the Titan X Pascal's and that the coolers/blocks will work on the 1080 Ti. I was looking at this one; would it work to cool the chip and the other components, unlike the other AIOs that just cool the chip?

http://www.aquatuning.us/water-cooling/kits-und-systems/internal-kits/alphacool/eiswolf/21866/alphacool-eiswolf-120-gpx-pro-nvidia-geforce-gtx-titan-x-pascal-m02-black?c=21301


----------



## Radox-0

Quote:


> Originally Posted by *KickAssCop*
> 
> People w/ FE SLI. Top and bottom card temps and fan speeds. Max overclocks?


Through a few-hour-long session of Wildlands, with my cards (bearing in mind there's a 3rd card sandwiched between them, limiting airflow somewhat) and a case that sucks for airflow, they settle in at 53% fan speed on the top card and 50% on the bottom, which maintains 83°C on top and 70°C on the bottom card, which has access to more air. In terms of clocks they stay mostly in the 1750-1810MHz region, normally at 1810MHz.

Max overclocks are around 2050-2100MHz, but that's the short-term boost; as they warm up they settle in around 2026MHz or so. That needs a pretty high fan profile though, so I don't bother running those extra 200MHz given the increase in noise.


----------



## OldNerd

Quote:


> Originally Posted by *alucardis666*
> 
> Well, after some tedious trial and error I've discovered that my card tops out at 475 mem offset and 2037mhz boost @ 1.0620v kinda disappointing really. I'd hopped to push higher with the hybrid cooler but I guess the silicone lottery wasn't too kind to me this round.


Yep, I've been burned in the silicon lotto a few times (especially trying to sync clocks in SLI).

My card gets similar clocks on stock voltage and the stock FE cooler, so this round I got lucky.


----------



## OldNerd

I have pretty loud coil whine in PassMark GPU tests on the stock air cooler. I've had this issue before when I used a crappy PSU without enough juice on the rails under load, and I'm on a Rosewill 1kW junk PSU right now while the Strider is powering another temp machine, so that's where I think the reason is.


----------



## Baasha

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> I seen your 1080ti bench just now in the benchmarks area and it does say pci-E 3.0, so it shouldn't be that. I'd have to say it's the cpu showing it's age. You could get an $800 1680 V2 from ebay that should work for your board. It's and unlocked IB-E 8 core/ 16thread cpu.
> 
> Easy swap, instead of a motherboard/ram overhaul. Then wait for Skylake-E, the 6950x will be the back up system then LoL.


hmm. that's a thought. When is Skylake-E supposed to be released? This year or next?

If this year, I might just part out my 2nd rig now and wait for Skylake-E to hit and make my X99 system the backup like you said - with 4x Titan XPs!


----------



## sWaY20

Quote:


> Originally Posted by *Baasha*
> 
> hmm. that's a thought. When is Skylake-E supposed to be released? This year or next?
> 
> If this year, I might just part out my 2nd rig now and wait for Skylake-E to hit and make my X99 system the backup like you said - with 4x Titan XPs!


Should be this year; it's been almost 2 years since the last one. I'm waiting for Skylake-E also.


----------



## Jbravo33

Quote:


> Originally Posted by *Nitemare3219*
> 
> You're not going to be able to overclock on FE SLI. My TXP SLI is terrible with the FE coolers... top card is 10+ degrees hotter. I have my fan curve set to GPU temp - 10%, and it still throttles slightly at stock speeds sometimes, even with a 65 FPS cap, and power/temp raised up.


Just got my order in for an MSI FE with an EK waterblock, but I plan on running water on the top card and a custom AIB card on the bottom, preferably an FTW3 when they drop. I was too impatient to just wait, so do you think I'd have the same issues you had? I had SLI Classifieds and the top card was always much hotter.

I know it's a weird combo, but I've been trying to get two FEs with no luck, except that on Friday Fry's had one in stock. So I'm thinking one can hold me over for a while and I'll just wait it out for an FTW3 or something similar. I will most likely put that under water too, but I know I'll be too excited and want to pop it in. That's why I asked.


----------



## pantsoftime

Some thoughts for anyone lamenting the loss of the DVI port: I picked up an EVGA DisplayPort hub and it works well with a couple of DVI monitors using DP-to-DVI adapters. On my 1080 I had all of the ports used up, so losing an entire port was a big deal. The hub takes a single DisplayPort and turns it into 3 more ports.


----------



## Draxx

You lads think it's worth waiting for the non-reference cards, or to hell with it and grab a FE? I want to throw one of these into my Raven ITX; a blower might benefit a small case like that, I would imagine.


----------



## alucardis666

Quote:


> Originally Posted by *Draxx*
> 
> You lads think its worth waiting for the non-reference cards or to hell with it and grab a FE? Want to throw one of these into my Raven ITX, blower might benefit in a small case like that I would imagine.


I'd grab a FE and just watercool it with either the EVGA Hybrid cooler, or a closed-loop AIO cooler and a Kraken G10. You eliminate the crazy fan noise from the blower fan and get better sustained boosts.


----------



## chibi

EVGA 1080 Ti reporting in, see sig rig for full system specs.

With the system at full default settings, including BIOS set to load optimized defaults, I let Heaven loop for 30 minutes, then proceeded with a benchmark run. GPU core sustained a boost of 1911 MHz and memory ran at 5505 MHz.

I tested for coil whine with the following:

Standard games list - None
TimeSpy - None
Heaven - Nothing during benchmark loop, only at loading screen
Valley - Coil whine like the cry of a dying monster from the abyss
I even swapped in a known-good PSU with no coil whine and got the same results as above. Something about Valley causes the coil whine. I'm glad my standard games do not exhibit coil whine, phew. This GPU will be a keeper.


----------



## alucardis666

Quote:


> Originally Posted by *chibi*
> 
> EVGA 1080 Ti reporting in, see sig rig for full system specs.
> 
> With the system at full default settings, including bios set to load optimzed. I let Heaven loop for 30 minutes, then proceeded with a benchmark run. GPU Core sustained boost at 1911 MHz and Memory at 5505 MHz.
> 
> I tested for coil whine with the following:
> 
> Standard games list - None
> TimeSpy - None
> Heaven - Nothing during benchmark loop, only at loading screen
> Valley - Coil whine like the cry of a dying monster from the abyss
> I even swapped out a known good psu with no coil whine and same results as above. Something about Valley that causes the coil whine. I'm glad my standard games do not exhibit coil whine, phew. This GPU will be a keeper.


I also get whine at the Heaven load screen, but nowhere else.

Push your card higher! You should be able to sustain 1999-2025 MHz easily!


----------



## chibi

Quote:


> Originally Posted by *alucardis666*
> 
> I also get whine at the heaven load screen. But no wher else
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Push your card higher! should be able to sustain 1999-2025 easy!


I'm just testing components right now for DOA faults. I will be transitioning to water blocks as my build log progresses; that's where the overclocking will begin.
Link to build log in my sig, come check it out if you have some time.


----------



## Draxx

Quote:


> Originally Posted by *alucardis666*
> 
> I'd grab a FE and just watercool it with either the EVGA Hybrid cooler, or a closed loop aio cooler and a Kraken G10. You eliminate the crazy fan noise form the blower fan and get better sustained boosts


Only problem I foresee is space. Don't know without doing some measurements, but I don't think I'll have the space for a rad in there.


----------



## OldNerd

Quote:


> Originally Posted by *Draxx*
> 
> You lads think its worth waiting for the non-reference cards or to hell with it and grab a FE? Want to throw one of these into my Raven ITX, blower might benefit in a small case like that I would imagine.


Grab a FE now, and down the road, if you think you need more OC headroom or less fan noise, grab an AIO or waterblock.

At UHD 4K with a single 1080 Ti and a small OC, I haven't seen any lag in any game at Ultra High settings with max AA, etc.


----------



## Chaert

Proud owner of the Ti here too. Quick question tho.
My EVGA FE did not come with a case sticker or driver CD. Not that I need them, but it made me wonder whether the box, while sealed, may have been opened before. Can an EVGA owner enlighten me on this? I know the GTX 1080 FE did come with these. The card runs like a dream tho! (Tad noisier than my GTX 980 reference, gotta say.)


----------



## craxx

Quote:


> Originally Posted by *Chaert*
> 
> Proud owner of the TI here to. Quick question tho.
> My EVGA FE did not come with a case sticker or driver CD. Not that I need them, but I was thinking that the box while it was sealed may have been opened before. Can a EVGA owner enlighten me on this? I know the GTX 1080 FE did come with these. Card runs like a dream tho! (Tad noisier than my GTX 980 reference gotta say.)


I have an evga also and mine didn't come with either.

Sent from my SM-N910V using Tapatalk


----------



## Slackaveli

Quote:


> Originally Posted by *Chaert*
> 
> Proud owner of the TI here to. Quick question tho.
> My EVGA FE did not come with a case sticker or driver CD. Not that I need them, but I was thinking that the box while it was sealed may have been opened before. Can a EVGA owner enlighten me on this? I know the GTX 1080 FE did come with these. Card runs like a dream tho! (Tad noisier than my GTX 980 reference gotta say.)


I'll sell you mine for $20 on eBay


----------



## GreedyMuffin

Okay, honest question.

How long would a GTX 1080 Ti, watercooled, last for 1440p gaming with a G-Sync monitor? Two-three years, tops?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Okey, honest question.
> 
> How long would a GTX 1080Ti watercooled last for 1440P gaming with a G-synch monitor? two-three years, tops?


I think so. What do you have planned?


----------



## BigMack70

Quote:


> Originally Posted by *GreedyMuffin*
> 
> How long would a GTX 1080Ti watercooled last for 1440P gaming with a G-synch monitor?


Nobody can answer this because:
1) Acceptable performance for a graphics card is subjective (you might be happy with performance whereas someone else would upgrade, or vice-versa)

2) Nobody knows how demanding future titles will be

Buy what you need for your desired games/performance _now_, not for hypothetical and unknown future performance.


----------



## GreedyMuffin

Thanks!

I sold my 7700K system and bought a 1700. Sold my watercooled 1080 (2100 MHz) and I am now using a stock 980 Ti (1300 MHz, but it can do 1500 on stock voltage) in the meantime until my 1080 Ti arrives. I ordered it plus a waterblock from EK (Clear/Nickel), so I was wondering. I've been upgrading each gen since the 780 because I got some awesome deals and low prices, so the cost in between is low; $200 on a $1000 card after owning it for a year is not bad.

EDIT: My point was, the 980 Ti, even at stock speed, gave me 80-90 FPS on the same settings where the 7700K 4900/1080 2100 gave me 125-130. I will test the 980 Ti with another 200 MHz on the core and 450+ on the memory; that should kick it pretty good.

So I do not need a lot of performance and such. The main reason for upgrading is [email protected] and a new toy.


----------



## pantsoftime

Spent some time tinkering with memory again today. Over the last couple of days I was thinking that +465 was my optimal speed, but that turned out to be incorrect. After some additional hunting I settled on +398, which AB reports as exactly 5900 MHz/11800 MT/s. The +465 was giving me the best performance between +400 and +500, but I didn't realize at first that I had already been too high all along.
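For anyone mapping Afterburner offsets to actual clocks, here's a minimal sketch of the arithmetic. It assumes the FE's ~5505 MHz reference memory clock (AB's on-screen readout rounds, so it shows roughly 5900/11800 for +398):

```python
# Sketch: converting an Afterburner memory offset to absolute clocks.
# The 5505 MHz stock clock is an assumption (1080 Ti FE reference spec);
# AB's displayed value rounds, so exact figures differ slightly.
STOCK_MEM_MHZ = 5505

def mem_clock(offset_mhz: int) -> tuple:
    """Return (clock in MHz as AB reports it, effective MT/s)."""
    clock = STOCK_MEM_MHZ + offset_mhz
    return clock, clock * 2  # AB doubles the reported clock for MT/s

print(mem_clock(398))  # -> (5903, 11806), shown in AB as about 5900/11800
print(mem_clock(465))  # -> (5970, 11940)
```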


----------



## pantsoftime

Quote:


> Originally Posted by *Chaert*
> 
> Proud owner of the TI here to. Quick question tho.
> My EVGA FE did not come with a case sticker or driver CD. Not that I need them, but I was thinking that the box while it was sealed may have been opened before. Can a EVGA owner enlighten me on this? I know the GTX 1080 FE did come with these. Card runs like a dream tho! (Tad noisier than my GTX 980 reference gotta say.)


Mine didn't come with a small case sticker but it came with two large stickers and a giant movie poster-styled thing with a robot on it.


----------



## KingEngineRevUp

So earlier I was investigating the following.

1080 Ti GPU Temperatures with respect to my AIO and fan speeds.

I had the following:
1080 Ti
Corsair H55
Noctua NF-F12 1500 RPM fans push-Pull

I was getting load temperatures of 62C in For Honor, and eventually it went up to 65C because my fans are exhausts and pick up added heat from the mobo.

I upgraded to:
1080 Ti
Corsair H55
*Noctua NF-F12 2000 RPM fans* Push-Pull

Temperatures have dropped to 55C, going up to 58C maximum at times.

This graph is a good estimate. You will eventually hit diminishing returns, since your radiator cannot shed its heat any quicker.
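Those diminishing returns can be sketched with a simple first-order thermal model. All constants below are illustrative assumptions, not measurements of the H55 setup; the ~flow^0.8 exponent is a common approximation for turbulent forced convection:

```python
# Rough first-order model of radiator cooling vs. fan speed. The reference
# resistance and heat load are illustrative assumptions, not measured values.
def gpu_temp_c(ambient_c: float, heat_w: float, fan_rpm: float,
               r_at_1500_c_per_w: float = 0.062) -> float:
    """Convective resistance falls sub-linearly (~flow^-0.8) with airflow."""
    resistance = r_at_1500_c_per_w * (1500.0 / fan_rpm) ** 0.8
    return ambient_c + heat_w * resistance

for rpm in (1000, 1500, 2000, 3000):
    print(rpm, round(gpu_temp_c(25.0, 250.0, rpm), 1))
```

The point is that each step up in RPM buys less: the 1500 to 2000 RPM jump drops temperature more than any equal-size jump further up.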


----------



## Baasha

Guys I finally got the email from Nvidia about the game code redemption. I got 4 codes since I got 4 cards. I used one code to redeem For Honor and that's fine. But I want to also redeem Ghost Recon Wildlands and it keeps saying "you've already redeemed this" or something like that!









I tried the other three codes and none of them work. I also tried on the main rig w/ the Titan XP and it says "you do not meet the hardware requirements" etc.

Can anyone help? I just want to redeem Wildlands in addition to For Honor which I've already redeemed.


----------



## utparatrooper

Quote:


> Originally Posted by *DerComissar*
> 
> Love that statement, and so true, lol!
> 
> 
> 
> 
> 
> 
> 
> 
> Rep+
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Do consider the first option for cooling if you aren't in a hurry, it's great to have a proper loop.
> An aio certainly is cheaper and easier initially, imo a real loop is still worth it in the long run.


It hurts... oh, it hurts... After wearing out the F5 button on my keyboard, I justified ordering an EVGA FE off eBay at a slight premium over what I otherwise would have paid in sales tax plus overnight shipping. Not too bad really, better than I thought. Also ordered the EK block and backplate. I haven't watercooled in a while, but I'm getting set up to do a full loop for the CPU and the card. Perfect setup, since all that horsepower is needed now that I am going back to play Fallout NV and Fallout 3.









I always tell myself this will last at least a couple of generations. And then the next Ti comes out and all that discipline goes out the window.

I'd agree that a full, albeit somewhat expensive, loop is worth it in the long run. Maintenance on a full loop isn't really bad and the difficulty is fairly trivial. Heck, like most people I learned from YT. It's a great teacher.


----------



## utparatrooper

Quote:


> Originally Posted by *Baasha*
> 
> Guys I finally got the email from Nvidia about the game code redemption. I got 4 codes since I got 4 cards. I used one code to redeem For Honor and that's fine. But I want to also redeem Ghost Recon Wildlands and it keeps saying "you've already redeemed this" or something like that!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I tried the other three codes and none of them work. I also tried on the main rig w/ the Titan XP and it says "you do not meet the hardware requirements" etc.
> 
> Can anyone help? I just want to redeem Wildlands in addition to For Honor which I've already redeemed.


I could be wrong in saying this, as I haven't received my card or code yet (see previous post), but I thought I read somewhere that it's redeemed through GeForce Experience?

If so, maybe put one of your Tis in another available rig and create a different GFE account?

Just a guess.


----------



## Radox-0

Quote:


> Originally Posted by *Baasha*
> 
> Guys I finally got the email from Nvidia about the game code redemption. I got 4 codes since I got 4 cards. I used one code to redeem For Honor and that's fine. But I want to also redeem Ghost Recon Wildlands and it keeps saying "you've already redeemed this" or something like that!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I tried the other three codes and none of them work. I also tried on the main rig w/ the Titan XP and it says "you do not meet the hardware requirements" etc.
> 
> Can anyone help? I just want to redeem Wildlands in addition to For Honor which I've already redeemed.


I just rotated one of my other 1080 Tis into place and it worked fine. It will not work with the Titan XP, as GeForce Experience detects your card and will only allow redemption with a 1070/1080 or 1080 Ti.


----------



## djriful

I am suffering.


----------



## Slackaveli

Quote:


> Originally Posted by *Baasha*
> 
> Guys I finally got the email from Nvidia about the game code redemption. I got 4 codes since I got 4 cards. I used one code to redeem For Honor and that's fine. But I want to also redeem Ghost Recon Wildlands and it keeps saying "you've already redeemed this" or something like that!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I tried the other three codes and none of them work. I also tried on the main rig w/ the Titan XP and it says "you do not meet the hardware requirements" etc.
> 
> Can anyone help? I just want to redeem Wildlands in addition to For Honor which I've already redeemed.


I think it's "max one per account". That's crappy!!

I'm willing to take one if you don't want them, as my son took mine and my account still hasn't redeemed one. PM me if you're feeling generous.


----------



## Slackaveli

Quote:


> Originally Posted by *djriful*
> 
> _I am suffering._


why?


----------



## eXistencelies

Quote:


> Originally Posted by *craxx*
> 
> What is everyone seeing for temps with water? I have dual d5s set on speed 2, an rs360, monsta 240mm both with push pull sp120 hp on 5v, ek supremacy evo, and ek nickel/plexi titan xp block. 4770k is at 4.4 at 1.325v. 1080ti is stock voltage +160 core, +400mem. Before I added the gpu I would hit about 72c on the cpu, now with the gpu under water, I hit 83 on the cpu and 56 on gpu after 1 conquest round in bf1. Should be plenty of rad I would think, but it seems way too hot.


I was getting hot temps too on my 1080 Ti. When I removed the block, I found the clear backing from one of the thermal pads still stuck to the part of the block that sat right on the GPU chip. It must have fallen onto the block and I installed it without noticing. My GPU was in the 50s under load; now, with it removed, it gets no higher than 45.


----------



## djriful

Quote:


> Originally Posted by *Slackaveli*
> 
> Quote:
> 
> 
> 
> Originally Posted by *djriful*
> 
> I am suffering.
> 
> 
> 
> why?
Click to expand...

1080 Ti no stock... TITAN XP waterblock no stock.


----------



## jleslie246

i feel your pain.


----------



## LuisG7

Hello, my 1080 Ti FE only uses 0.900-0.930V in games/benchmarks (ROTTR, Witcher 3, Heaven, Firestrike, TimeSpy, etc.) with the fan at 70-80%; it rarely goes above 1.000V, and my clocks are unstable (1780/1850/1950/1800/1900 every second). Is this a problem? This happens with and without an OC, at both 120% and 100% power limit. I've reinstalled the drivers and nothing changed. Thanks.

Sorry for my English.
My hardware: 6700K @ 4.6 GHz, 32GB RAM, EVGA 850W G2 80+ Gold.


----------



## illypso

Quote:


> Originally Posted by *djriful*
> 
> 1080 Ti no stock... TITAN XP waterblock no stock.


There is stock on the NVIDIA website from time to time.

Open this page, create an account, use the browser alert and set the alarm. That's how I got mine.

https://www.nowinstock.net/computers/videocards/nvidia/gtx1080ti/


----------



## Slackaveli

Quote:


> Originally Posted by *djriful*
> 
> 1080 Ti no stock... TITAN XP waterblock no stock.


that'll do it
Quote:


> Originally Posted by *LuisG7*
> 
> Hello, my 1080Ti FE only use 0.900v/0.930 in game/bench(ROTTR, WITCHER 3, HEAVEN, FIRESTRIKE, TIMESPY etc.) with fan at 70-80%, few times use more than 1.000v, my clocks are unstable (1780/1850/1950/1800/1900 every second). Is this a problem? This happen in OC and non OC, 120% power limit and 100% power limit I've reinstall drivers and nothing Thanks
> 
> 
> 
> 
> 
> 
> 
> Sorry for my english
> My hardware is: 6700K 4.6ghz 32GB ram EVGA 850W G2 80+ Gold


Seems odd; it should reach 1.06V.


----------



## lanofsong

Hey GTX 1080 Ti owners,

We are having our monthly Foldathon from Monday the 20th to Wednesday the 22nd, 12 noon EST. From what I gather, these 1080 Tis are monster GPUs, each capable of pumping out in excess of 1 million points per day.

So with that said, would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

March 2017 Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - needs a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## PasK1234Xw

Quote:


> Originally Posted by *djriful*
> 
> 1080 Ti no stock... TITAN XP waterblock no stock.


If you don't mind a non-transferable warranty, and the chance of voiding the warranty by removing the cooler, Nvidia has stock every day; it goes live around 9-9:30am EST.


----------



## djriful

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Quote:
> 
> 
> 
> Originally Posted by *djriful*
> 
> 1080 Ti no stock... TITAN XP waterblock no stock.
> 
> 
> 
> if you dont mind non tranferable warrenty and chance of voiding warrenty removing cooler nvidia has stock every day it goes live around 9-9:30am EST
Click to expand...

Been doing that since 2011.


----------



## alucardis666

Quote:


> Originally Posted by *OldNerd*
> 
> Grab a FE now and down the road if you think you need more OC or less fan noise grab an AIO or Waterblock
> 
> 
> 
> 
> 
> 
> 
> At UHD 4K with a single 1080 TI with a small OC I havent seen any lag on any game at Ultra High and MAX AA etc


Not playing Mass Effect Andromeda or Ghost Recon Wildlands, I see.


----------



## AMDATI

Quote:


> Originally Posted by *lanofsong*
> 
> Hey GTX 1080 Ti owners,
> 
> We are having our monthly Foldathon from Monday 20th - Wednesday 22nd - 12noon EST. From what i gather, these 1080 Ti are monster GPU's each capable of pumping out in excess of 1 Million Points per day
> 
> 
> 
> 
> 
> 
> 
> .
> So with that said, would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.
> 
> March 2017 Foldathon
> 
> To get started:
> 
> 1.Get a passkey (allows for speed bonus) - need a valid email address
> http://fah-web.stanford.edu/cgi-bin/getpasskey.py
> 
> 2.Download the folding program:
> http://folding.stanford.edu/
> 
> Enter your folding name (mine is the same as my OCN name)
> Enter your passkey
> Enter Team OCN number - 37726
> 
> later
> lanofsong


You picked a terrible time, considering it overlaps the ME: Andromeda release day.


----------



## KickAssCop

Thanks to those who posted about FE SLi temps. I don't think I will be using SLi for these cards.


----------



## kl6mk6

Quote:


> Originally Posted by *craxx*
> 
> What is everyone seeing for temps with water? I have dual d5s set on speed 2, an rs360, monsta 240mm both with push pull sp120 hp on 5v, ek supremacy evo, and ek nickel/plexi titan xp block. 4770k is at 4.4 at 1.325v. 1080ti is stock voltage +160 core, +400mem. Before I added the gpu I would hit about 72c on the cpu, now with the gpu under water, I hit 83 on the cpu and 56 on gpu after 1 conquest round in bf1. Should be plenty of rad I would think, but it seems way too hot.


I was hitting 60C with my EK TXP block. I took my block back apart to find I had no contact using a Liquid Metal Pad. I added some Conductonaut and now I am below 40C for everything I have tested, at 21C ambient. If your CPU temps increased 11C, something else has changed drastically. Your flow may be restricted by something...


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Baasha*
> 
> hmm. that's a thought. When is Skylake-E supposed to be released? This year or next?
> 
> If this year, I might just part out my 2nd rig now and wait for Skylake-E to hit and make my X99 system the backup like you said - with 4x Titan XPs!


Look at this:
Quote:


> http://www.overclock.net/t/1400061/asus-p9x79-e-ws-owners/250#post_25939729


Just saw what this poster wrote; interesting.


----------



## DerComissar

Quote:


> Originally Posted by *djriful*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Slackaveli*
> 
> Quote:
> 
> 
> 
> Originally Posted by *djriful*
> 
> _I am suffering._
> 
> 
> 
> why?
> 
> Click to expand...
> 
> 1080 Ti no stock... TITAN XP waterblock no stock.
Click to expand...

As suggested, you may be able to snag one directly from Nvidia as they go in and out of stock there constantly.
I did mention earlier that I got one from a dealer in B.C., just by watching the reports of 1080Ti stock at various sites, I think PCPartpicker showed that particular listing when they got some in.

As for Titan X Pascal blocks, I just ordered an EK block and backplate directly from EK last night, they had plenty in stock then.


----------



## GOLDDUBBY

Quote:


> Originally Posted by *lilchronic*
> 
> Using the EK-FC titan x Pascal block i put thermal pads on the areas marked red. I noticed when i took the stock air cooler off they had pads there but it did not say to put them on there in the waterblock installation guide.
> Just wondering if anyone else did the same.


That's the problem with EK water blocks: they are (weirdly) not built with overclockers in mind.
I had this issue in the past; on the 780 L the block didn't even touch the VRMs. I wrote a message to an EKWB rep about it, but he just responded that the VRMs were rated to handle 80C and it shouldn't be an issue .. *****

We solved it by putting double pads on there so it had some contact with the block. It usually sat around 60C after that, and I hadn't given it a thought until I saw your post.

Seemingly on these cards (1080 Ti) it's the memory that needs cooling the most, which makes me think those hybrid solutions with the closed-loop attachment won't do a very good job, since they have very minimal effect on the memory temps, other than moving the heat away from the core and thus lowering the ambient around the mems.


----------



## nrpeyton

Quote:


> Originally Posted by *GOLDDUBBY*
> 
> That's the problem with ek water blocks. They are not built with overclockers in mind (weirdly).
> Had this issue in the past, for the 780 L the block didn't even touch the vrm's. I wrote a message to a ekwb rep about it but he just responded that the vrm's were rated to handle 80C and shouldn't be an issue .. *****
> 
> We solved it by putting double pads on there so it had some contact with the block. Usually sat around 60C after that, and I haven't given it a thought until I saw your post.
> 
> Seemingly on these cards (1080 ti) it's the memory that needs cooling the most. Which makes me think those hybrid solutions with the closed loop attachment won't do a very good job, since the have very minimal effect on the memory temps. Other than moving the heat away from the core, thus lowering the ambient around the mems.


The topmost red squares are the chokes on the memory VRM _(I think)._
They would probably run @ 65C under load with no help (totally bare).

Not sure about the other bits you marked red. It would be interesting to see how hot they get without any extra pads, but I will probably do the same as you once everything arrives.

How high did you need to go to make contact with the block on those extra areas? _(i.e. what sizes of pads, and did you have to stack them?)_

The reference design is quite small, so any extra pads at all (beyond what the EK manual states) will help. I just imagine it would be quite tricky (painstaking) working out heights for everything. (Too much and it can affect the pressure/compression of pads on nearby areas which are more important.)

I found that getting good compression was a lot more important than the W/mK rating with pads.

I used an EK 780 Ti block on my 1080 Classy. Perfect fit. Everything was directly cooled except the memory VRM.
So the core, memory chips & core VRM were all directly cooled.


----------



## Mongo

Well, I have had a 1080 Ti for almost a week.

I removed a 1060 6GB and everything was working great.

I installed the GTX 1080 Ti from Nvidia. I have already ordered the EK waterblock and backplate but have not gotten them yet.

But after I fire up a game, within a few minutes I get what's in the video below. I have not overclocked yet, and I have even set the fan to 100% and it still does it.

I have seen the temp go as high as 84C, but most of the time when I pop out it's in the 70C-ish range.

I have reinstalled the drivers, and I have an EVGA SuperNOVA 1300 G2 power supply. I have reseated the card and made sure the cable was seated well.

Guess it's RMA time for me.

Video Link:


----------



## D13mass

Quote:


> Originally Posted by *Mongo*
> 
> Well I have had a 1080 Ti for Almost a week.
> 
> I removed a 1060 6gb and everything was working great.
> 
> I Installed the GTX 1080 Ti from Nvidia, I have already ordered the EK waterblock and backplate but have not gotten yet.
> 
> But after I fire a game up within a few mins I get whats in the video below. I have not overclocked yet and I have even set fan to 100% and it still does it.
> 
> I have seen the temp go as high as 84c but most of the time I pop out its in the 70C ish range.
> 
> I have reinstalled the drivers and I have a EVGA SuperNOVA 1300 G2 Power Supply.I have reseated the card and made sure the cable was in good.
> 
> Guess its RMA time for me.
> 
> Video Link:


Maybe it's only in one game? I'm not sure, but it could be a bug in the game. Have you tried running any other game?


----------



## Kimir

Yeah, try some other games. Ghost Recon Wildlands is far from free of glitches; I have some weird issues on 780 Ti SLI myself. It runs fine on a 980 Ti except for those dips with a huge frametime increase.
The temperature seems high though; could be bad TIM from the factory.


----------



## jarble

Quote:


> Originally Posted by *lanofsong*
> 
> Hey GTX 1080 Ti owners,
> 
> We are having our monthly Foldathon from Monday 20th - Wednesday 22nd - 12noon EST. From what i gather, these 1080 Ti are monster GPU's each capable of pumping out in excess of 1 Million Points per day
> 
> 
> 
> 
> 
> 
> 
> .
> So with that said, would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.
> 
> March 2017 Foldathon
> 
> To get started:
> 
> 1.Get a passkey (allows for speed bonus) - need a valid email address
> http://fah-web.stanford.edu/cgi-bin/getpasskey.py
> 
> 2.Download the folding program:
> http://folding.stanford.edu/
> 
> Enter your folding name (mine is the same as my OCN name)
> Enter your passkey
> Enter Team OCN number - 37726
> 
> later
> lanofsong


Bump for awareness. OCN could really use a few of the 1080 Ti users to fold some units and help us retake 3rd.








Quote:


> Originally Posted by *AMDATI*
> 
> You picked a terrible time, considering it's in the ME: Andromeda release day.


You don't have to run full time; just a few hours here and there on these big cards puts up a lot of work units.


----------



## PewnFlavorTang

No thanks. Mom and dad don't pay my electric bill.


----------



## Joshwaa

Wonder what kind of numbers I could put down for folding. Have not done it in a long time. 12 RX480's 2 RX470's 1 1080 and 1 1080ti. Hmmmm.....


----------



## PasK1234Xw

Quote:


> Originally Posted by *djriful*
> 
> Been doing that since 2011.


----------



## straha20

Three Gigabyte 1080 Tis left here... $745 shipped, so not too bad of a markup over Newegg...

https://www.walmart.com/ip/710149939


----------



## keikei

Quote:


> Originally Posted by *straha20*
> 
> Three Gigabyte 1080ti's left here...$745 shipped, so not too bad of a markup over Newegg...
> 
> https://www.walmart.com/ip/710149939


I lulled when we were speculating the price of the Ti a while back. I'm eating crow, but for Nvidia to release it at such a price, and just before Vega, shows just how much of a threat Vega is to them. I wonder how much the Tis with aftermarket coolers will go for?


----------



## straha20

Quote:


> Originally Posted by *keikei*
> 
> I lulled when we were speculating the price of the Ti awhile back. I'm eating crow, but for nvidia to release it at such a price and just before vega shows just how much of a threat vega is to them. I wonder how much the Ti's with aftermarket coolers wiil go for?


Well, for me, I'm not too worried about the AIB cards... I have my water block sitting and waiting patiently to go on a card, so the FE card is exactly what I need. I'm looking at the roughly $30 over retail, including shipping, as a premium... I ordered mine today and should have it by the end of the week... especially considering what I was planning on spending before we knew the actual MSRP.


----------



## RedM00N

Quote:


> Originally Posted by *PewnFlavorTang*
> 
> No thanks. Mom and dad doesn't pay my electric bill.


If this is in response to folding for a couple of days: my two Titan XMs, which are overvolted to 1.2V and running 1405/4000, would cost under $10 for the two days my system would be folding. Money is money though.

I do know rates vary by place, but it should still be low, unless you have some ridiculous kWh rate.

I remember that using three cards playing Overwatch for a month, doing 3-12 hour sessions, added like $40 onto my bill.
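As a sanity check on numbers like that, here's a hedged back-of-envelope version of the math. The 250 W per-card draw and $0.13/kWh rate are assumed figures for illustration, not measurements:

```python
# Back-of-envelope folding electricity cost. The per-card wattage and
# the $/kWh rate are illustrative assumptions, not measured values.
def folding_cost_usd(cards: int, watts_per_card: float,
                     hours: float, usd_per_kwh: float) -> float:
    kwh = cards * watts_per_card * hours / 1000.0  # energy used in kWh
    return kwh * usd_per_kwh

# Two overclocked cards folding for 48 hours at $0.13/kWh:
print(round(folding_cost_usd(2, 250, 48, 0.13), 2))  # -> 3.12
```

Even doubling both assumed figures stays well under the $10 estimate above.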


----------



## keikei

Quote:


> Originally Posted by *lanofsong*
> 
> Hey GTX 1080 Ti owners,
> 
> We are having our monthly Foldathon from Monday 20th - Wednesday 22nd - 12noon EST. From what i gather, these 1080 Ti are monster GPU's each capable of pumping out in excess of 1 Million Points per day
> 
> 
> 
> 
> 
> 
> 
> .
> So with that said, would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.
> 
> March 2017 Foldathon
> 
> To get started:
> 
> 1.Get a passkey (allows for speed bonus) - need a valid email address
> http://fah-web.stanford.edu/cgi-bin/getpasskey.py
> 
> 2.Download the folding program:
> http://folding.stanford.edu/
> 
> Enter your folding name (mine is the same as my OCN name)
> Enter your passkey
> Enter Team OCN number - 37726
> 
> later
> lanofsong


Quote:


> Originally Posted by *RedM00N*
> 
> If this is in response to folding for a couple days, my two Titan XM's which are overvolted to 1.2 and running 1405/4000 would cost under $10 for the two days my system would be folding. Money is money though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I do know rates vary by place though, but should still be low, unless you have some rediculous kW/h rate.
> 
> I remember using three cards playing overwatch for a month added like $40 onto my bill doing 3-12 hour sessions


I really have no excuse then. I'll see what I can do. I just dropped $700 on a card; what's a few more dollars on the electric bill for a good cause?


----------



## SperVxo

Can only do +120 on the GPU. Over that and it crashes. But the card hits the power limit all the time.


----------



## RedM00N

Quote:


> Originally Posted by *keikei*
> 
> I really have no excuse then. I'll see what I can do. I just dropped $700 on a card. What's a few more $$ on the electric bill for a good cause.


It's also another good way to stability test your card if it's overclocked. I can run 1450 in games/benches fine, but folding will fail at anything over 1425 with the voltage I use (Titan XM). Since it's one giant, sensitive math simulation, any instability will cause folding to fail or revert to a previous checkpoint.


----------



## coolharris93

Guys, I need your help. I currently have an MSI 980 Ti Gaming card and I'm planning to upgrade to a 1080 Ti. The problem is that aftermarket cards here in my country will arrive late April/early May, so I was thinking about buying an FE card. Do you think it's worth buying an FE card, or should I just wait 2 months for the aftermarket ones? The good thing is I can buy an AIO if I go with the FE after 2-3 months.


----------



## sWaY20

Quote:


> Originally Posted by *coolharris93*
> 
> Guys, I need your help. I currently have an MSI 980 Ti Gaming card and I'm planning to upgrade to a 1080 Ti. The problem is that aftermarket cards here in my country will arrive late April/early May, so I was thinking about buying an FE card. Do you think it's worth buying an FE card, or should I just wait 2 months for the aftermarket ones? The good thing is I can buy an AIO if I go with the FE after 2-3 months.


This question has been asked repeatedly, just get one!!!


----------



## s1rrah

EVGA Hybrid please.


----------



## coolharris93

I'm just curious, does the 1080 Ti FE thermal throttle a lot?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *coolharris93*
> 
> Guys, I need your help. I currently have an MSI 980 Ti Gaming card and I'm planning to upgrade to a 1080 Ti. The problem is that aftermarket cards here in my country will arrive late April/early May, so I was thinking about buying an FE card. Do you think it's worth buying an FE card, or should I just wait 2 months for the aftermarket ones? The good thing is I can buy an AIO if I go with the FE after 2-3 months.


Is there any difference between your question and the 100s of times it has been asked here or on Reddit?

Please just Google or read.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *coolharris93*
> 
> I'm just curious, does the 1080 Ti FE thermal throttle a lot?


This has been discussed already as well. Gamers Nexus even wrote an entire article on it. Google.


----------



## jarble

Quote:


> Originally Posted by *keikei*
> 
> I really have no excuse then. I'll see what I can do. I just dropped $700 on a card. What's a few more $$ on the electric bill for a good cause.


That's the spirit


----------



## PasK1234Xw

Well, round 2: managed to grab the only 1080 Ti at Microcenter, and I was an hour after they opened. It was surprisingly still there.

My NVIDIA-branded card was OK, but I was expecting a great OC from them directly and that wasn't the case at all. Default boost 1886 and max 2025 stable OC.

The MSI 1080 Ti FE boosts to 1911 out of the box and is stable at 2063. Not 2100, but I'll take it, plus the better warranty. I also got a 3-year replacement plan from Microcenter. If they can't fix it in store *and they don't repair GPUs*, I get a brand-new replacement, not a refurbished one like from MSI.


----------



## Alwrath

Looks like I'm joining the club. No added voltage, at +140 MHz I'm getting fluctuations from 1900 MHz to 2000 in Crysis 3 at 4K. Very nice card so far. The fan isn't THAT loud at 100% and the temp stays around 50C gaming. Not too shabby.


----------



## zswickliffe

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Well, round 2: managed to grab the only 1080 Ti at Microcenter, and I was an hour after they opened. It was surprisingly still there.
> 
> My NVIDIA-branded card was OK, but I was expecting a great OC from them directly and that wasn't the case at all. Default boost 1886 and max 2025 stable OC.
> 
> The MSI 1080 Ti FE boosts to 1911 out of the box and is stable at 2063. Not 2100, but I'll take it, plus the better warranty. I also got a 3-year replacement plan from Microcenter. If they can't fix it in store *and they don't repair GPUs*, I get a brand-new replacement, not a refurbished one like from MSI.


Yeah, I can only get mine to sit stable at 2050. Temps look perfect, I suppose I just didn't get lucky with the silicon lottery. It also throttles down to about 2000MHz once temps are above 55°C or so (but that might be more related to voltage than temps since the voltage limit is pegged).


----------



## jarble

Quote:


> Originally Posted by *Alwrath*
> 
> 
> 
> Looks like I'm joining the club. No added voltage, at +140 MHz I'm getting fluctuations from 1898 MHz to 2000 in Crysis 3 at 4K. Very nice card so far. The fan isn't THAT loud at 100% and the temp stays around 50C gaming. Not too shabby.


Maybe it's because I have two cards, but the wail these make is worse than some server fans (some 40mm fans have it beat).

Edit: also, grats on the card, man.


----------



## ratzofftoya

I have two reference cards (one EVGA and one Gigabyte). Have not opened yet. Is there any reason to wait for aftermarket if I am watercooling anyway?


----------



## KedarWolf

My EK waterblock and backplate are arriving today!!





I +1'd a few peeps who suggested some extra thermal pads on the spots the stock air cooler has them on. Good advice for sure, I love overclock.net!!

When I was on the Novice overclock.net overclocking comp team, the first HWBot competition we entered we came in second, followed by four straight wins.


----------



## Mato87

https://ocaholic.ch/modules/news/article.php?storyid=16287 New renders of the anticipated non-reference GTX 1080 Ti from Gigabyte just surfaced. The source says release is due mid-April. I am getting that beauty for sure. Kind of sad it's a bit of a downgrade from the 980 Ti Xtreme Windforce in terms of aesthetics, but the performance is freaking worth it! Can't wait. It's going to set me back at least 900 euros I am sure, but I have my wallet ready


----------



## keikei

Nvidia GeForce 378.92 WHQL

release notes

Quote:


> Originally Posted by *ratzofftoya*
> 
> I have two reference cards (one EVGA and one Gigabyte). Have not opened yet. Is there any reason to wait for aftermarket if I am watercooling anyway?


Still a little early, but EVGA has some info on their aftermarket Tis. There is a comparison chart. http://www.evga.com/products/product.aspx?pn=11G-P4-6393-KR


----------



## zswickliffe

Quote:


> Originally Posted by *ratzofftoya*
> 
> I have two reference cards (one EVGA and one Gigabyte). Have not opened yet. Is there any reason to wait for aftermarket if I am watercooling anyway?


If it were me I'd test my GPUs for a few days running various stress tests before I put them under water. Imagine finding out you have an issue and having to re-do the whole loop.


----------



## Alwrath

Quote:


> Originally Posted by *jarble*
> 
> Maybe its because I have two cards but the wail these make is worse than some server fans (some 40mm fans have it beat)
> 
> edit also gratz on the card man


It def makes some noise but nothing too extreme imo. People don't realize how loud some of the older ATI Radeons used to be... man, you want to talk about loud... they sounded like a jet plane taking off. Also, thanks ✌


----------



## Slackaveli

Quote:


> Originally Posted by *straha20*
> 
> Well for me, I'm not too worried about the AIB cards...I have my water block sitting and waiting patiently to go on a card, so the FE card is exactly what I need. Looking at the $30 over retail including shipping as a premium...I ordered mine today, and should have it by the end of the week...especially considering what I was planning on spending before we knew the actual MSRP.


I TRIED to tell everybody that it would be $649 if it was a 3300-CUDA-core cut-down or $699 if it was the full Titan, but EVERYBODY just called me all kinds of stupid. I got literally nobody to agree with me. It wasn't that hard to predict. Same as EVERY OTHER Ti, EVER.







Quote:


> Originally Posted by *RedM00N*
> 
> If this is in response to folding for a couple days, my two Titan XM's which are overvolted to 1.2 and running 1405/4000 would cost under $10 for the two days my system would be folding. Money is money though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I do know rates vary by place though, but it should still be low, unless you have some ridiculous kWh rate.
> 
> I remember using three cards playing overwatch for a month added like $40 onto my bill doing 3-12 hour sessions


THAT is $1500/mo! Man, screw folding. That's crazy. What do you get for it? I'd rather flip dollars to homeless folks at streetlights, tbh.


----------



## Slackaveli

Quote:


> Originally Posted by *zswickliffe*
> 
> Yeah I can only get mine to sit stable at 2050. Temps look perfect, I suppose I just didn't get lucky with the silicon lottery. Also throttles down to about 2000Mhz once temps are above 55*C or so (but that might be more related to voltage than temps since the voltage limit is pegged).


Man, you did get lucky. You are in here crying when many of us can only get 2012-2025 max stable and downclock into the 1900s. Enjoy your solid middling card. It's above average for sure. Average is about 2025, downclocking to 1974.


----------



## zswickliffe

Quote:


> Originally Posted by *Slackaveli*
> 
> Man, you did get lucky. You are in here crying when many of us can only get 2012-2025 max stable and downclock into the 1900s. Enjoy your solid middling card. It's above average for sure. Average is about 2025, downclocking to 1974.


Definitely not crying... Just stating that I didn't win the silicon lottery. Not sure if it's BS, but I've seen people posting 2100+ clocks with no issue. I definitely didn't get screwed; it's performing about on par.


----------



## tconroy135

Quote:


> Originally Posted by *zswickliffe*
> 
> Definitely not crying... Just stating that I didn't win the silicon lottery. Not sure if it's BS but I've seen people posting 2100+ clocks with no issue. I definitely didn't get screwed, it's performing about par.


Aren't most, if not all, of the 2100+ clocks under water?


----------



## zswickliffe

Quote:


> Originally Posted by *tconroy135*
> 
> Aren't most, if not all, of the 2100+ clocks under water?


Not sure honestly. Mine seems like more of a voltage issue than a temp issue though. I guess you can mod the BIOS and raise voltages, not sure what they're doing.


----------



## lilchronic

My EVGA card does 2050MHz / 6100MHz stable in everything I throw at it, but as soon as I go to 2063MHz it will crash.

My brother, on the other hand, also got a 1080 Ti (Gigabyte) and he can get up to 2126MHz before it crashes instantly. At 2114MHz he can run maybe half of the Valley bench, and at 2101MHz he can pass it multiple times. Once he starts other benches like Firestrike or Heaven, or the game he plays (ARK: Survival Evolved), he can only get 2088MHz.

I find it weird that his card can get that high but mine just crashes one bin higher.









both cards under water
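A side note on those clock numbers: Pascal's GPU Boost only moves the core clock in discrete steps of roughly 12.5 MHz ("bins"), so monitoring tools report the gaps as alternating 12 and 13 MHz. That's why stable clocks land on values like 2126, 2114, 2101, and 2088 rather than arbitrary frequencies. A quick check of the clocks in the post above:

```python
# The clocks reported above are each one GPU Boost "bin" apart.
# Pascal adjusts the core clock only in ~12.5 MHz steps, so tools
# report the gaps as alternating 12 and 13 MHz values.
clocks = [2126, 2114, 2101, 2088]  # MHz, from the post above
steps = [hi - lo for hi, lo in zip(clocks, clocks[1:])]
print(steps)  # [12, 13, 13] -- one bin between each stable clock
```

Which is also why a card "crashes one bin higher": stability is found per 12-13 MHz step, not per arbitrary MHz of offset.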


----------



## KedarWolf

Quote:


> Originally Posted by *Slackaveli*
> 
> Quote:
> 
> 
> 
> Originally Posted by *zswickliffe*
> 
> Yeah I can only get mine to sit stable at 2050. Temps look perfect, I suppose I just didn't get lucky with the silicon lottery. Also throttles down to about 2000Mhz once temps are above 55*C or so (but that might be more related to voltage than temps since the voltage limit is pegged).
> 
> 
> 
> man, you did get lucky. you are in here crying when many of us can only get 2012-2025 max stable and downclock in the 1900s. enjoy your solid midling card. it's above average for sure. average is about 2025/downclocks to 1974
Click to expand...

When I get my EK waterblock and backplate on in the P.M. after work I'll post my max core, memory, Heaven stable and temps etc.

I do have my CPU, a 5960X at 4.6GHz with decent voltages, one older overvolted Maxwell Titan X as a dedicated PhysX card, and my 1080 Ti on a 360 rad, but the Titan X is always at low volts and clocks, being a PhysX card, so I don't think it'll affect my temps much.

As long as my 1080 Ti is under 60C I'm good!! Probably be well under 50C while benching with Heaven.









Was rushed doing this post at work, fixed the terrible typos.









Oh, and may as well flash the lowest voltage BIOS on my old Titan X.

I'm thinking of selling it though. It has a prefilled Predator 360 waterblock and backplate on it so after I put it in the For Sale section here I'll post it on the Predator forum here.


----------



## tconroy135

Quote:


> Originally Posted by *zswickliffe*
> 
> Not sure honestly. Mine seems like more of a voltage issue than a temp issue though. I guess you can mod the BIOS and raise voltages, not sure what they're doing.


I thought with Pascal you can't add voltage except with a physical modification to the card.


----------



## zswickliffe

Quote:


> Originally Posted by *tconroy135*
> 
> I thought with Pascal you can't add voltage except with a physical modification to the card.


I'm still new to the Pascal game but I thought EVGA had a modded LN2 BIOS that unlocked things like that. Could be totally wrong.


----------



## eXistencelies

Quote:


> Originally Posted by *tconroy135*
> 
> Aren't most, if not all, of the 2100+ clocks under water?


I have mine right now at 2050MHz and +550 on memory, on water. Haven't tested it in long gaming sessions, but playing Ghost Recon for an hour I had no problems. I will do some more tests over the weekend.


----------



## Hulio225

Finally got into that 10k+ club; took me some time tweaking the system to achieve that.











http://www.3dmark.com/3dm/18733517


----------



## Whach

OK. So would everyone here say that a pair of OC'd GTX 980 Tis in SLI would approximate a single OC'd 1080 Ti? I've been looking around comparing Titan XP benches against 980 Ti SLI, as there isn't much of a direct comparison. If so, I may just make the swap to help minimum frames @ 1600p/4K DSR.

Anything would be insightful =)


----------



## Hulio225

Quote:


> Originally Posted by *Whach*
> 
> OK. So would everyone here say that a pair OC'd GTX 980Ti's in SLI would approximate a single OC'd 1080 TI? I've been looking around comparing TitanXP benches against 980Ti SLI as there isn't much about a direct comparison. If so, I may just make the swap just to help minimum frames @1600p/4K DSR.
> 
> Anything would be insightful =)


I checked some HWBOT 980 Ti 2-way SLI Fire Strike submissions, and it almost seems like what you said... those two 980 Tis can be a bit more powerful in synthetics, but I guess in the long run in games the 1080 Ti will win, because not all games scale like synthetics... but who am I explaining that to... if you use SLI you know what I mean.

Here's a link to HWBOT 2-way 980 Ti SLI on air cooling; you can make some comparisons for yourself:
http://hwbot.org/benchmark/3dmark_-_fire_strike/rankings?hardwareTypeId=videocard_2459&start=0#interval=20#cores=2#start=0#undefined=#coolingType=2

And here's my score with one 1080 Ti:
http://hwbot.org/submission/3495799_hulio_3dmark___fire_strike_geforce_gtx_1080_ti_24138_marks


----------



## y2kcamaross

I moved from 980 Tis in SLI to a single 1080 Ti. I gotta say, the 1080 Ti is superior; it may make a few frames per second less in certain titles that had good scaling, but it always feels smoother regardless, at least with a 1440p/144Hz G-Sync display.


----------



## tconroy135

Quote:


> Originally Posted by *y2kcamaross*
> 
> I moved from 980ti SLI'd to a single 1080ti, I gotta say, the 1080ti is superior, it may make a few frames per second less in certain titles that had good scaling, but it always feels smoother, regardless, at least with a 1440p144hz gsync display


I went from 1080 SLI to a single 1080 Ti and I agree with you that this is much better. I think I am going to get the EK-MLC when it's released next month with a pre-filled block, and just stick with maximum performance from a single card.


----------



## DillnutMcGee

I've joined the club!! I've had my Nvidia 1080Ti FE for about a week and so far I am loving it. I have not overclocked it yet, but did have to turn up the fan to about 70% to keep it below 70C during gaming. I can't hear the fan with my headset on, but without it's definitely audible.

I've been toying with the idea of building a fully watercooled loop for years and finally decided to pull the trigger. My EK block and back plate are on the way along with the other components, and I hope to be up and running this weekend.


----------



## RedM00N

Quote:


> Originally Posted by *Whach*
> 
> OK. So would everyone here say that a pair OC'd GTX 980Ti's in SLI would approximate a single OC'd 1080 TI? I've been looking around comparing TitanXP benches against 980Ti SLI as there isn't much about a direct comparison. If so, I may just make the swap just to help minimum frames @1600p/4K DSR.
> 
> Anything would be insightful =)


My two Titan XMs at 1405/8000 were about 16% faster in Time Spy than a 2050MHz 1080 Ti, and I'd imagine that's a best-case scenario for the performance difference when it comes to SLI-enabled games. Reviews show the card doing upwards of around 2x in some games vs a 980 Ti, so it's pretty much right near that 2x, depending on how each card setup is overclocked.


----------



## Baasha

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Look at this:
> Just seen what this poster wrote, interesting.


Very interesting indeed...

So it looks like regardless of what GPU-Z etc. state, I'm not _really_ running the 4x cards in PCI-E 3.0 because the 3970X's micro-code (or whatever) does not support it? It also seems like a Win 10 issue - I'm not sure I want to go back to Win 7 tbh - I play BF and GTA mostly and I'm not sure how well they run on Win 7.

I'm not really looking to upgrade the CPU atm so I'll probably just wait for Skylake-E...

In the meantime, really would like to get this setup to work properly - if that's possible. FWIW, even 3-Way scales rather poorly at 5K which is quite surprising. The main rig scales phenomenally well @ 5K in almost all games with 4-Way. Definitely related to the CPU/platform methinks...

For S&G, ever seen <60FPS with 4-Way GTX 1080 Ti?









I did, for the first time, last night:


----------



## PasK1234Xw

His usage is a joke, prob same results with 2 cards. No CPU is going to drive quad Tis properly in a game like that; even my 1080 SLI would be held back at times with a 5960X.
Quote:


> Originally Posted by *zswickliffe*
> 
> Yeah I can only get mine to sit stable at 2050. Temps look perfect, I suppose I just didn't get lucky with the silicon lottery. Also throttles down to about 2000Mhz once temps are above 55*C or so (but that might be more related to voltage than temps since the voltage limit is pegged).


Yeah, I've been playing BF1; that seems much better for stability testing than Heaven and Firestrike. I eventually locked up in BF1 at 2063.
I'm with you now at 2050, played like 4 hours and all seems good *knock on wood*. I'm still on air, max 75C, with my core stable at 2025MHz.

Also, the throttle points are not the same as NVIDIA's. The MSI is keeping a higher boost at the same core increase/temps than I could get with the NVIDIA model.

I'll be putting on a Hybrid when EVGA makes an official Ti version. I had them on my 1080s and they ran so much better when kept cool.


----------



## alucardis666

New Driver!









http://www.guru3d.com/files-details/geforce-378-92-whql-driver-download.html


----------



## Chaert

Quote:


> Originally Posted by *craxx*
> 
> I have an evga also and mine didn't come with either.
> 
> Quote:
> 
> 
> 
> Originally Posted by *Slackaveli*
> 
> Ill sell you mine for $20 on ebay
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *pantsoftime*
> 
> Mine didn't come with a small case sticker but it came with two large stickers and a giant movie poster-styled thing with a robot on it.
> 
> Click to expand...
> 
> Thanks for the replies guys. Looks like I got everything I should have.
> 
> Reason I'm asking is because I got the feeling my card was used before. The foam on the heatsink plastic looked like it was put there for a second time, the box was damaged, and the serial number sticker had its corner lifted up a bit, as can be seen in the pictures. I'm probably nitpicking but I hate these things when it comes to an 800-euro graphics card.
Click to expand...


----------



## PasK1234Xw

Seriously, how hard is it for EVGA to stick their serial number sticker in the empty space right under NVIDIA's, and not across the backplate?

edit

Also, you can see the serial number on the PCIe connector


----------



## Somasonic

Just ordered the Asus ROG Strix from a local online store. It's not expected to ship until April 3rd but I'm happy to wait. Pre-ordered Mass Effect Andromeda to go along with it; new beast card and the latest instalment from one of my favourite series =


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> When I get my EK waterblock and backplate on in the P.M. after work I'll post my max core, memory, Heaven stable and temps etc.
> 
> I do have my CPU, a 5960X at 4.6GHz with decent voltages, one older overvolted Maxwell Titan X as a dedicated PhysX card, and my 1080 Ti on a 360 rad, but the Titan X is always at low volts and clocks, being a PhysX card, so I don't think it'll affect my temps much.
> 
> As long as my 1080 Ti is under 60C I'm good!! Probably be well under 50C while benching with Heaven.


ah, the ole Titan physX card.







:thumb:
Quote:


> Originally Posted by *Somasonic*
> 
> Just ordered the Asus ROG Strix from a local online store. It's not expected to ship until April 3rd but I'm happy to wait. Pre-ordered Mass Effect Andromeda to go along with it; new beast card and the latest instalment from one of my favourite series =


Just played about 5 hours of it in 4K HDR. Man, the HDR is awesome. Really, really nice. Not sure how hooked on the gameplay I am, but the gunplay is a lot better (after an initial breaking-in period). The Ti keeps frames in the 50s+ with everything on ultra+ (custom to get full ultra).


----------



## Jbravo33

Anyone else order from Fry's? I ordered Saturday morning, caught one MSI in stock, and my delivery date is tomorrow but I still haven't received tracking. This is getting out of hand. NVIDIA is just holding them back to keep the buzz on. Like when bars and clubs have a line form outside, then when you get in there's no one. I hope Vega steps up, just for consumers' sake.


----------



## Slackaveli

Quote:


> Originally Posted by *Jbravo33*
> 
> Anyone else order from Fry's? I ordered Saturday morning, caught one MSI in stock, and my delivery date is tomorrow but I still haven't received tracking. This is getting out of hand. NVIDIA is just holding them back to keep the buzz on. Like when bars and clubs have a line form outside, then when you get in there's no one. I hope Vega steps up, just for consumers' sake.


Don't blame NVIDIA b/c Fry's sells things they don't really have in stock.


----------



## Jbravo33

Quote:


> Originally Posted by *Slackaveli*
> 
> don't blame nvidia b/c Fry's sells things they dont really have in stock.


I am blaming NVIDIA. They're the ones holding back. You think their initial order was for 500? Doubt it. Wouldn't look good releasing 5k the first month and then having sales drop. They're holding out to make the numbers look like they're increasing every month. Definitely blaming them. Guess I'm just not one of the lucky ones.


----------



## moonbogg

Quote:


> Originally Posted by *Whach*
> 
> OK. So would everyone here say that a pair OC'd GTX 980Ti's in SLI would approximate a single OC'd 1080 TI? I've been looking around comparing TitanXP benches against 980Ti SLI as there isn't much about a direct comparison. If so, I may just make the swap just to help minimum frames @1600p/4K DSR.
> 
> Anything would be insightful =)


I switched from two watercooled 980 Tis. I lost some FPS in Rise of the Tomb Raider and of course in the synthetics. Oddly enough, I gained FPS in BF1. Always pegged at 100fps G-Sync'd, but it did dip down to 96fps last night, so I'll have to investigate this "issue" hehe.
Everything else, my FPS is almost double what it was, because SLI sucks these days. I'm so damn happy with the 1080 Ti that I can hardly contain myself. Totally worth ditching the 980 Tis, all day long.
Quote:


> Originally Posted by *zswickliffe*
> 
> Definitely not crying... Just stating that I didn't win the silicon lottery. Not sure if it's BS but I've seen people posting 2100+ clocks with no issue. I definitely didn't get screwed, it's performing about par.


I now feel very fortunate to have gotten a card and water block during the first wave; this card is as stable as the laws of physics at 2114/6000 under the EK block with stock voltage and 120% power limit. It sometimes decides to stick to 2101, so it's either 2101 or 2114, depending on what PrecisionX is showing. Don't know why it does that, but it doesn't appear to be a throttle, as it's always one of those two clocks, most often 2114. Once it picks a clock, it's just stuck there and doesn't budge after hours of BF1. So it looks like the golden clock to hit for these cards is 2.1GHz, and there are cards out there that will do it. A little extra voltage and power headroom might help the cards that can't do it get there.

Also, I just installed GPU-Z to check the ASIC quality and it says that feature isn't supported on this card. WHY? I want to know my ASIC.


----------



## Somasonic

Quote:


> Originally Posted by *Slackaveli*
> 
> ah, the ole Titan physX card.
> 
> 
> 
> 
> 
> 
> 
> :thumb:
> Just played about 5 hours of it in 4K HDR. Man, the HDR is awesome. Really, really nice. Not sure how hooked on the gameplay I am, but the gunplay is a lot better (after an initial breaking-in period). *The Ti keeps frames in the 50s+ with everything on ultra+ (custom to get full ultra)*.


Good to know. The 1080 Ti's going to provide more than enough frames for me at [email protected]; I wish I had the money to go [email protected] at the same time for this game but that's going to have to wait a little bit


----------



## Slackaveli

Quote:


> Originally Posted by *Jbravo33*
> 
> I am blaming nvidia. There the ones holding back. You think there initial order was for 500? Doubt it. Wouldn't look good releasing 5k first month then sales drop. They holding out to make the numbers look like they increasing every month. Definitely blaming them. Guess I'm just not one of the lucky ones.


500? Where do you get this crap from? They've sold many thousands of units. YOU bought 'the last one' but somebody else cleared checkout before you did. Always a possibility on a poor online store, which Fry's is, and you got hosed by it. I feel bad for you, but you sound like you need to just go ahead and wait for Vega; fanboyism is an ugly look.
Quote:


> Originally Posted by *moonbogg*
> 
> I switched from two watercooled 980ti's. I lost some FPS in Rise of the Tomb Raider and of course in the synthetics. Oddly enough I gained FPS in BF1. Always pegged at 100fps Gsync'd, but it did dip down to 96fps last night, so I'll have to investigate this "issue" hehe.
> Everything else my FPS is almost double what it was because SLI sucks these days. I'm so damn happy with the 1080ti that I can hardly contain myself. Totally worth ditching the 980ti's all day long.
> I now feel very fortunate to have gotten a card and water block during the first wave and this card is as stable as the laws of physics at 2114/6000 under the EK block with stock voltage and 120% power limit. It sometimes decides to stick to 2101, so its either 2101 or 2114, depending on what PrecisionX is showing. Don't know why it does that, but it doesn't appear to be a throttle as its always one of those two clocks, most often 2114. Once it picks a clock, its just stuck there and doesn't budge after hours of BF1. So it looks like the golden clock to hit for these cards is 2.1ghz and there are cards out there that will do it. A little extra voltage and power headroom might help the cards that can't do it get there.
> 
> Also, I just installed GPU-Z to check the ASIC quality and it says that feature isn't supported on this card. WHY? I want to know my ASIC.


No ASIC reading on any Pascal cards. I hate that.
But yeah, you have a golden one. I've read this whole thread and you are one of 3-4 who have hit that mark. Mine's stuck @ 2025. Pisses me off.


----------



## mr2cam

Hurry up, there are some EVGA FE's in stock over at newegg, I just got mine ordered!


----------



## mr2cam

Quote:


> Originally Posted by *mr2cam*
> 
> Hurry up, there are some EVGA FE's in stock over at newegg, I just got mine ordered!


----------



## ajresendez

Finally got my card in tonight!


----------



## straha20

Quote:


> Originally Posted by *mr2cam*


I thought I had one earlier in the day from Walmart.com of all places, but they oversold, so I got the "Sorry, we had to cancel your order" email. I kept hitting refresh on Newegg, and lo and behold, the MSI FE came in stock, and I just clicked buttons as fast as I could, and ended up snagging one for real this time.

After I placed the order, I tried to go back and set it to overnight shipping, but couldn't edit shipping. I got in touch with customer support and asked if they could upgrade it to overnight, and their suggestion... cancel the original order and reorder with faster shipping... Yeah right! Not happening, and it took all of about five minutes for them to go out of stock again.

The good news is, my card has shipped, I have a tracking number, and it is showing up in FedEx's system, so now all I have to do is wait until Thursday...


----------



## drfouad

Don't hate me
I got two unopened and sending them back to nvidia.
eBay was not working to make $$


----------



## djfunz

Quote:


> Originally Posted by *drfouad*
> 
> Don't hate me
> I got two unopened and sending them back to nvidia.
> eBay was not working to make $$


Glad to hear that.


----------



## djriful

So I heard the 1080 Ti is bottlenecked by a 7700K CPU @ 5GHz... at 1080p and 1440p res. in some games.

I guess this is the last GPU I ever needed while still running 3930k.


----------



## zswickliffe

Quote:


> Originally Posted by *drfouad*
> 
> Don't hate me
> I got two unopened and sending them back to nvidia.
> eBay was not working to make $$


Super happy to hear it didn't work for you. Hate when people do that.


----------



## Jbravo33

Quote:


> Originally Posted by *Slackaveli*
> 
> 500? Where do you get this crap from? They've sold many thousands of units. YOU bought 'the last one' but somebody else cleared checkout before you did. Always a possibility on a poor online store, which Fry's is, and you got hosed by it. I feel bad for you, but you sound like you need to just go ahead and wait for Vega; fanboyism is an ugly look.


Haha. They've sold many thousands? Source? Let me know where you read that info, then get back to me. You clearly don't get it. I was asking a question for someone who's ordered from Fry's. Your reply helped zero; not sure why I'm still entertaining this. You'll be alright tho.


----------



## alucardis666

Quote:


> Originally Posted by *djriful*
> 
> So I heard the 1080 Ti is bottlenecked by a 7700K CPU @ 5GHz... at 1080p and 1440p res. in some games.
> 
> I guess this is the last GPU I ever needed while still running 3930k.


Well damn, if 5GHz is bottlenecking this card then I'm just stupid for owning one with my measly 4.3GHz.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Well damn, if 5Ghz is bottle-necking this card then I'm just stupid for owning one with my measly 4.3ghz.


I get a kick out of people who say that. It's like they just can't fathom the concept of moving up in resolution to give their poor, tired CPUs a break. All these outdated CPUs would be just fine at 4K.

got a couple of these badboys on the way
https://www.amazon.com/gp/product/B00KESS6O0/ref=od_aui_detailpages00?ie=UTF8&psc=1


----------



## jrcbandit

What kind of numbers can you expect on air? I think I got a dud overclocker; it only does +125 MHz core and +550 memory. The boost is around 1900 or so for the core, but I haven't tried running the fans at a constant 70-100% or anything else to see if it goes higher. If my card is that anemic, I dunno if it's worth bothering with a hybrid kit once they are finally released. Well, it depends on what I do with my 1080 - I could sell it with the hybrid kit installed for more $, or try adapting my existing 1080 hybrid kit to my 1080 Ti, although the memory layout is different on the 1080 vs the 1080 Ti.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> Got a couple of these badboys on the way
> https://www.amazon.com/gp/product/B00KESS6O0/ref=od_aui_detailpages00?ie=UTF8&psc=1


Gonna use a fan controller? Those are loud


----------



## BigMack70

Quote:


> Originally Posted by *jrcbandit*
> 
> What kind of numbers can you expect on air? I think I got a dud overclocker; it only does +125 MHz core and +550 memory. The boost is around 1900 or so for the core, but I haven't tried running the fans at a constant 70-100% or anything else to see if it goes higher. If my card is that anemic, I dunno if it's worth bothering with a hybrid kit once they are finally released. Well, it depends on what I do with my 1080 - I could sell it with the hybrid kit installed for more $, or try adapting my existing 1080 hybrid kit to my 1080 Ti, although the memory layout is different on the 1080 vs the 1080 Ti.


My card is a similar kind of dud and I still put a hybrid kit on it (though I didn't spend money... just re-used my Titan X Maxwell kit) just so I could have silence and zero temperature limitation. Glad I'm not the only one with a +120-130 max core clock.


----------



## alucardis666

Quote:


> Originally Posted by *BigMack70*
> 
> My card is a similar kind of dud and I still put a hybrid kit on it (though I didn't spend money... just re-used my Titan X Maxwell kit) just so I could have silence and zero temperature limitation. Glad I'm not the only one with a +120-130 max core clock.


550 mem is good though! My card doesn't like going above 475. I have it set to 464 to have a little buffer. Core I'm @ 135.


----------



## KraxKill

Quote:


> Originally Posted by *alucardis666*
> 
> New Driver!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.guru3d.com/files-details/geforce-378-92-whql-driver-download.html


I've reverted to the previous driver. This driver won't let my card down clock to 135MHz at desktop; it just sits idling at ~1500MHz and 60W TDP. I'm at 144Hz on a PG278Q.

The old driver at least allowed the card to down clock at 144Hz with G-Sync enabled, but this driver keeps it from down clocking. At least for me....

You've been warned...


----------



## KingEngineRevUp

So once we get custom BIOS going, we'll be able to adjust the thermal throttling points? Because it's annoying as hell to get thermal throttling at 59C backing me off from 2050 to 2038 MHz. I have my fan speed at 100% at 59C.
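For reference, Pascal's GPU Boost 3.0 drops the clock in small fixed bins as temperature thresholds are crossed, which is why 2050 becomes ~2038 at 59C no matter how hard the fan runs. Here's a rough toy model of that behavior; the ~13 MHz bin size and the threshold list are illustrative assumptions for the sketch, not NVIDIA's actual tables:

```python
# Toy model of GPU Boost 3.0 temperature-based clock stepping.
# BIN_MHZ and THRESHOLDS_C are illustrative guesses, not official values.
BIN_MHZ = 13
THRESHOLDS_C = [59, 63, 68, 73, 78]  # crossing each one drops one clock bin

def boost_clock(max_clock_mhz: int, temp_c: int) -> int:
    """Return the effective core clock after temperature throttling."""
    bins_dropped = sum(1 for t in THRESHOLDS_C if temp_c >= t)
    return max_clock_mhz - bins_dropped * BIN_MHZ

print(boost_clock(2050, 58))  # below the first threshold -> 2050
print(boost_clock(2050, 59))  # one bin dropped -> 2037
```

With the stock BIOS those thresholds are baked in, which is why only colder cooling (hybrid/full water), not fan speed, keeps the top bin.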


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KraxKill*
> 
> I've reverted to the previous driver. This driver won't let my card down clock to 135mhz at desktop. Just makes it sit idling at ~1500MHz 60w TDP. I'm at 144hz on a PG278Q.
> 
> The old driver at least allowed the card to down clock at 144hz with gsync enabled but this driver keeps it from down clocking. At least for me....
> 
> You've been warned...


That's not right... I have the same monitor and I'm downclocked to 135 Mhz.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> So once we get custom BIOS going, we'll be able to adjust the thermal throttling points? Because it's annoying as hell to get thermal throttling at 59C backing me off from 2050 to 2038 MHz. I have my fan speed at 100% at 59C.


You actually think we will get a custom bios?!







It hasn't happened with any 10 series card yet, what makes you think the Ti will be any different?
Quote:


> Originally Posted by *KraxKill*
> 
> I've reverted to the previous driver. This driver won't let my card down clock to 135mhz at desktop. Just makes it sit idling at ~1500MHz 60w TDP. I'm at 144hz on a PG278Q.
> 
> The old driver at least allowed the card to down clock at 144hz with gsync enabled but this driver keeps it from down clocking. At least for me....
> 
> You've been warned...


Yea... no issues downclocking to 135 here.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> You actually think we will get a custom bios?!
> 
> 
> 
> 
> 
> 
> 
> It hasn't happened with any 10 series card yet, what makes you think the Ti will be any different?
> Yea... no issues downclocking to 135 here.


Hmmm, I thought everyone was hopeful once partner cards started coming out. I didn't realize that none of the 10 series cards have a custom BIOS.

Maybe we all need to start a donation fund for whoever can figure out how to flash the 10 series cards then.


----------



## KingEngineRevUp

Well hopefully once DX12 goes mainstream we won't have CPU bottlenecks.


----------



## MURDoctrine

Quote:


> Originally Posted by *KraxKill*
> 
> I've reverted to the previous driver. This driver won't let my card down clock to 135mhz at desktop. Just makes it sit idling at ~1500MHz 60w TDP. I'm at 144hz on a PG278Q.
> 
> The old driver at least allowed the card to down clock at 144hz with gsync enabled but this driver keeps it from down clocking. At least for me....
> 
> You've been warned...


Make sure shadowplay isn't allowed to record the desktop. Mine wouldn't downclock past 1500-1600 with that enabled.


----------



## KraxKill

Quote:


> Originally Posted by *MURDoctrine*
> 
> Make sure shadowplay isn't allowed to record the desktop. Mine wouldn't downclock past 1500-1600 with that enabled.


That could be it, will check, but as soon as I install the old driver it down clocks. Will try it again.


----------



## KingEngineRevUp

Man... Is shadowplay even worth the 5% to 10% hit for you guys?


----------



## djriful

Quote:


> Originally Posted by *Slackaveli*
> 
> Quote:
> 
> 
> 
> Originally Posted by *alucardis666*
> 
> Well damn, if 5Ghz is bottle-necking this card then I'm just stupid for owning one with my measly 4.3ghz.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i get a kick out of people who say that. it's like they just can't fathom the concept of just moving up in resolution to give their poor, tired cpu's a break.. All these outdated cpu's would be just fine in 4k.
> 
> got a couple of these badboys on the way
> https://www.amazon.com/gp/product/B00KESS6O0/ref=od_aui_detailpages00?ie=UTF8&psc=1

I'd rather go low power and fanless, or high power and full watercool...


----------



## alucardis666

Quote:


> Originally Posted by *djriful*
> 
> I'd rather go low power and fanless, or high power and full watercool...


Agreed








Quote:


> Originally Posted by *SlimJ87D*
> 
> Hmmm, I thought everyone was hopeful once partner cards started coming out. I didn't realize that none of the 10 series cards have a custom BIOS.
> 
> Maybe we all need to start a donation fund for whoever can figure out how to flash the 10 series cards then.


Best you can do is flash other bioses, but a full blown custom bios is not possible on 10-series cards, dunno why no one has begun a bounty yet.


----------



## MURDoctrine

Quote:


> Originally Posted by *SlimJ87D*
> 
> Man... Is shadowplay even worth the 5% to 10% hit for you guys?


What 5-10% hit? I have never felt Shadowplay's impact on performance. Now fraps on the other hand haha.


----------



## Slackaveli

Quote:


> Originally Posted by *djriful*
> 
> I'd rather go low power and fanless, or high power and full watercool...


They're for the rads; for the CPU and for the hybrid kit that will go on my Ti.

Never used them before. Am I missing something? Can't I just stick them in push and have them on the fan-speed curve on my mobo like any other SP fans? I'm not opposed to returning them if need be; it's just that everyone seems to think these dudes are the truth.


----------



## Mongo

Quote:


> Originally Posted by *Kimir*
> 
> Yeah, try some other games, Ghost recon wildlands is far from perfect of glitches, I have some weird issues on 780Ti SLI myself. Run fine on 980Ti except those dip with huge frametime increase.
> The temperature seems high tho, could be bad TIM from factory.


Quote:


> Originally Posted by *D13mass*
> 
> Maybe it`s only in 1 game ? I`m not sure but it can be bug in game, have you tried run any other game ?


Videos are crap, shot with my phone.
















All of these have color-pop artifacts. Hope you can see it in the video.


----------



## bfedorov11

Finished up getting my rig back under water. Added some gskill 3600 cl15 and a 7700k too









Card holds 2100mhz through valley. Max temp is 35 with an ek block.


----------



## alucardis666

Quote:


> Originally Posted by *bfedorov11*
> 
> Finished up getting my rig back under water. Added some gskill 3600 cl15 and a 7700k too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Card holds 2100mhz through valley. Max temp is 35 with an ek block.


Nice man! My card tops out at 2050 with the Hybrid cooler and maxes at 52c.


----------



## sWaY20

Quote:


> Originally Posted by *bfedorov11*
> 
> Finished up getting my rig back under water. Added some gskill 3600 cl15 and a 7700k too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Card holds 2100mhz through valley. Max temp is 35 with an ek block.


just bc it held that in valley doesn't mean a thing. Valley isn't a good gpu benchmark.


----------



## bfedorov11

Quote:


> Originally Posted by *sWaY20*
> 
> just bc it held that in valley doesn't mean a thing. Valley isn't a good gpu benchmark.


13,269 posts say otherwise....

http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0/0_50


----------



## KingEngineRevUp

Quote:


> Originally Posted by *MURDoctrine*
> 
> What 5-10% hit? I have never felt Shadowplay's impact on performance. Now fraps on the other hand haha.


Gamers Nexus shows it can take 5% to 10% of your resources. But I'm not sure what's going on under the hood and how that can be.

http://www.gamersnexus.net/guides/2710-amd-relive-vs-nvidia-shadowplay-benchmarks


----------



## bfedorov11

Still not too bad in Firestrike. Only pinged the power target a couple times in scene 1... max was 2076MHz.


----------



## Slackaveli

Quote:


> Originally Posted by *bfedorov11*
> 
> Finished up getting my rig back under water. Added some gskill 3600 cl15 and a 7700k too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Card holds 2100mhz through valley. Max temp is 35 with an ek block.


damn she's sexy


----------



## sWaY20

Quote:


> Originally Posted by *bfedorov11*
> 
> 13,269 posts say otherwise....
> 
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0/0_50


yeah thanks for providing that link, i used to be top 30 back in the 780 classy days. Still not a good gpu bench though.


----------



## Slackaveli

Quote:


> Originally Posted by *BigMack70*
> 
> My card is a similar kind of dud and I still put a hybrid kit on it (though I didn't spend money... just re-used my Titan X Maxwell kit) just so I could have silence and zero temperature limitation. Glad I'm not the only one with a +120-130 max core clock.


im close. i run +140, +396


----------



## dboythagr8

The SLI profile for Mass Effect : Andromeda is great.

I downsampled from 4k to my 1440p monitor. Ultra everything. Pulled in avg of 76fps. Gameplay was smooth with no stutters, crashes, or any performance issues. The IQ was amazing.
















1440p is too low of a resolution for my setup; the GPUs get bottlenecked and utilization is terrible. As soon as I use DSR to get to 4K or beyond, they sing.
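Downsampling 4K onto a 1440p panel corresponds to a 2.25x DSR factor, since DSR factors are ratios of total rendered pixels to native pixels. A quick sanity check:

```python
# DSR factor = rendered pixel count / native panel pixel count.
def dsr_factor(render_res, native_res):
    rw, rh = render_res
    nw, nh = native_res
    return (rw * rh) / (nw * nh)

print(dsr_factor((3840, 2160), (2560, 1440)))  # 4K on a 1440p panel -> 2.25
```

That 2.25x pixel load is why GPUs that loaf at native 1440p suddenly hit full utilization under DSR.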


----------



## pez

Quote:


> Originally Posted by *dboythagr8*
> 
> The SLI profile for Mass Effect : Andromeda is great.
> 
> I downsampled from 4k to my 1440p monitor. Ultra everything. Pulled in avg of 76fps. Gameplay was smooth with no stutters, crashes, or any performance issues. The IQ was amazing.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1440p is too low of a resolution for my setup. The GPUs get bottlenecked and utilization is terrible. Soon as I use DSR to get to 4k or beyond, they sing.


You've got high refresh and G-Sync, so DSR is a great option if you're happy with your screen size, honestly. Ultrawide is a nice step... even 4K is, but there are no reasonably priced displays available that will give you 144Hz where it counts. I mean, I love my X34, but I played some OW on my GF's rig this past weekend and 144Hz vs 100Hz is night and day even.


----------



## GreedyMuffin

Getting mine today!

On my 1080 I re-used my backplate from the FE cooler. Have people tried the same with the Ti?


----------



## SperVxo

People that get 2000+ or even 2100+: have they done the shunt mod? I'm hitting the power limit easily. Only +130 on core.


----------



## Pandora's Box

Quote:


> Originally Posted by *SperVxo*
> 
> People that get 2000+ or even 2100+: have they done the shunt mod? I'm hitting the power limit easily. Only +130 on core.


Think you'll find they aren't pushing the card hard enough. Running Witcher 3 at 4K at 1950MHz core, the 120% power limit gets hit fairly easily.


----------



## D13mass

Quote:


> Originally Posted by *Mongo*
> 
> Videos are crap shot with my phone.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> All these have color pops artifacts. Hope you can see it in the video.


Sorry, I can take a look only in the evening (it's daytime in my country now), but you can use Nvidia Shadowplay to record it.


----------



## D13mass

Quote:


> Originally Posted by *bfedorov11*
> 
> Finished up getting my rig back under water. Added some gskill 3600 cl15 and a 7700k too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Card holds 2100mhz through valley. Max temp is 35 with an ek block.


"Max temp is 35 with an ek block" - how many radiators do you have and what speed are the fans at? That's a very low temp; your fans are probably working like crazy

and making a lot of noise...


----------



## Kimir

Quote:


> Originally Posted by *Mongo*
> 
> Videos are crap shot with my phone.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> All these have color pops artifacts. Hope you can see it in the video.


Yeah, something is wrong with your card, RMA time.


----------



## PCBUILDER1980

My 1080 Ti FE from Nvidia will only do +140 on core and +300 on memory stable, with the fan at 80 percent! I'm getting the hybrid cooler! Card is amazingly fast over my 980 Ti on my X34 monitor. Handles everything at 1440p at 100Hz. Going to try 4K this weekend on a 79-inch LG and see what all this DSR fuss is about with RE7. Just wish the card would overclock better, and I can't seem to get the slider in Afterburner to add more volts?


----------



## zswickliffe

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Getting mine today!
> 
> On my 1080 I re-used my backplate from the FE cooler. Have people tried the same with the Ti?


I've seen someone on the EVGA forums who did this with their Titan X Pascal with no issues. EK lists it as incompatible so be careful.


----------



## pez

Quote:


> Originally Posted by *PCBUILDER1980*
> 
> My 1080 Ti FE from Nvidia will only do +140 on core and +300 on memory stable, with the fan at 80 percent! I'm getting the hybrid cooler! Card is amazingly fast over my 980 Ti on my X34 monitor. Handles everything at 1440p at 100Hz. Going to try 4K this weekend on a 79-inch LG and see what all this DSR fuss is about with RE7. Just wish the card would overclock better, and I can't seem to get the slider in Afterburner to add more volts?


I think this is what you're looking for: http://www.overclock.net/t/1625653/how-to-get-voltage-slider-in-afterburner-working-on-a-1080-ti


----------



## GreedyMuffin

Quote:


> Originally Posted by *zswickliffe*
> 
> I've seen someone on the EVGA forums who did this with their Titan X Pascal with no issues. EK lists it as incompatible so be careful.


Thank you! If the threads and the lengths are the same, it will work perfectly. I will test it, and if others are interested I can post some pictures during and after.


----------



## MURDoctrine

Quote:


> Originally Posted by *Mongo*
> 
> Videos are crap shot with my phone.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> All these have color pops artifacts. Hope you can see it in the video.


That would be your memory. If it is at stock settings then you more than likely have a defective card.


----------



## zswickliffe

Quote:


> Originally Posted by *Mongo*
> 
> Videos are crap shot with my phone.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> All these have color pops artifacts. Hope you can see it in the video.


I've seen a similar thing, but it was when overclocking the processor (memory at stock clocks). I'll do some testing with a few games at stock clocks and report back.


----------



## yukkerz

Finally got my card and the hybrid cooler; it was very easy to install. Found a mess-up while installing it: the wrong screw was used in one spot, it wasn't fully screwed down, and it was stripped. Other than that I am happy with the outcome.


----------



## MunneY

Vince (k|ngp|n) just dropped this!


----------



## jarble

Quote:


> Originally Posted by *MunneY*
> 
> 
> 
> Vince (k|ngp|n) just dropped this!


Vince is just nuts! 3GHz core clock


----------



## TWiST2k

Quote:


> Originally Posted by *MunneY*
> 
> 
> 
> Vince (k|ngp|n) just dropped this!


How stable is that lol.


----------



## eXistencelies

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Getting mine today!
> 
> On my 1080 I re-used my backplate from the FE cooler. Have people tried the same with the Ti?


They are actually different. I didn't try it as I talked with EK and they said it would not work. From the looks of it though the screw pattern is the same. Underneath is where it is different. I still have my 1080 FE EKWB and EKBP I am trying to sell.


----------



## eXistencelies

Quote:


> Originally Posted by *D13mass*
> 
> Max temp is 35 with an ek block. - how many radiators you have and what is the speed of fans? It`s very low temp, probably your fans are working like a crazy
> 
> 
> 
> 
> 
> 
> 
> and many noise...


Yea, that does seem on the low side, or from a very NON-taxing game like CSGO. My gaming temp gets no higher than 45C; tested that last night with Arma 3 on the new 64-bit platform. I am also on 720mm of rad space.


----------



## eXistencelies

Quote:


> Originally Posted by *MunneY*
> 
> 
> 
> Vince (k|ngp|n) just dropped this!


Is this another one of those liquid nitrogen tests that 0.000001% of people would actually use as their home setup?


----------



## GreedyMuffin

Quote:


> Originally Posted by *eXistencelies*
> 
> They are actually different. I didn't try it as I talked with EK and they said it would not work. From the looks of it though the screw pattern is the same. Underneath is where it is different. I still have my 1080 FE EKWB and EKBP I am trying to sell.


I found that somebody did that with an EK block and FE backplate on Pascal Titan X.


----------



## RedM00N

Quote:


> Originally Posted by *MunneY*
> 
> 
> 
> Vince (k|ngp|n) just dropped this!












Won't be long now till subzero runs can push GPU clocks past average CPU overclocks


----------



## eXistencelies

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I found that somebody did that with an EK block and FE backplate on Pascal Titan X.


Hmm... they must have done some modding, because the Titan XP and 1080 Ti have 8+6-pin power connectors whereas the 1080 FE has only an 8-pin. Not sure how they would get the 1080 FE block to fit on the 1080 Ti FE card without cutting the plexi.





Look at the cut for the power phase.


----------



## GreedyMuffin

Quote:


> Originally Posted by *eXistencelies*
> 
> Hmm... they must have done some modding, because the Titan XP and 1080 Ti have 8+6-pin power connectors whereas the 1080 FE has only an 8-pin. Not sure how they would get the 1080 FE block to fit on the 1080 Ti FE card without cutting the plexi.
> 
> 
> 
> 
> 
> Look at the cut for the power phase.


I am sorry, you misunderstood.

Someone installed an EK Titan XP block on their TXP and used the stock backplate that came with the TXP.

I just said FE because it's basically the same cooler across the board. ^^


----------



## eXistencelies

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I am sorry, you misunderstood.
> 
> Someone installed an EK Titan XP block on their TXP and used the stock backplate that came with the TXP.
> 
> I just said FE because it's basically the same cooler across the board. ^^


Ahh yes, you can do that. I did that with my 1080 FE when I first got it to test it. Then I removed it; just didn't like the faded black plastic back plate. I used EK's back plate.


----------



## GreedyMuffin

I did that with my old 1080 as well. I am curious to see if it is doable with the Ti too.


----------



## Dasboogieman

Quote:


> Originally Posted by *yukkerz*
> 
> Finally got my card and the hybrid cooler; it was very easy to install. Found a mess-up while installing it: the wrong screw was used in one spot, it wasn't fully screwed down, and it was stripped. Other than that I am happy with the outcome.


What AIO did you use?


----------



## SperVxo

Has anyone seen if the third party cards with other coolers have a higher power limit?

From the Hexus site:

The above numbers are system-wide and depict peak power consumption when gaming. It's interesting to note that the 1080 Ti Founders Edition starts off at around 300 watts but soon drops to 260 watts as the card begins to throttle. No such quirk with the Inno3D, as the system continues to chew through 350 watts for the duration of the test.

Is that from thermal or the power limit =P
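For context on those Hexus numbers: the commonly cited FE spec is a 250 W board power limit with a slider that tops out at 120%, so the card itself is capped near 300 W; the rest of a system-wide wall reading is CPU, platform, and PSU losses. The arithmetic, as a quick sketch (the 250 W / 120% figures are the usual FE spec as I understand it; partner-card limits vary by BIOS):

```python
# Back-of-the-envelope GPU board power cap from the power-limit slider.
def gpu_power_cap(tdp_watts: float, limit_percent: float) -> float:
    """Maximum board power the driver will allow at a given slider setting."""
    return tdp_watts * limit_percent / 100.0

print(gpu_power_cap(250, 100))  # stock limit -> 250.0 W
print(gpu_power_cap(250, 120))  # FE slider maxed -> 300.0 W
```

Which is why a partner card with, say, a higher BIOS limit can keep chewing watts for a whole run while the FE throttles back once boost eats the headroom.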


----------



## keikei

Any one running a Ryzen chip with their Ti?
Quote:


> Originally Posted by *SperVxo*
> 
> Has anyone seen if the third party cards with other coolers have higher power limit?


I noticed evga has different power pins on some of their aftermarket versions.


----------



## SperVxo

How does it work with watercooling, or hybrid? Will the card boost higher with the same amount of voltage? I'm not getting over 1950; hitting the power limit and around 74C temp. Adding voltage does nothing.

My 1080 EVGA Hybrid was solid at 2114 and never hit any power limit. It never touched 130%, and I don't think it got close to 120% in games.


----------



## CptSpig

Quote:


> Originally Posted by *zswickliffe*
> 
> I've seen someone on the EVGA forums who did this with their Titan X Pascal with no issues. EK lists it as incompatible so be careful.


Quote:


> Originally Posted by *eXistencelies*
> 
> They are actually different. I didn't try it as I talked with EK and they said it would not work. From the looks of it though the screw pattern is the same. Underneath is where it is different. I still have my 1080 FE EKWB and EKBP I am trying to sell.


You are correct my friend. I have the original back plate on my Titan XP. You can only use two screws on the front plate but it's totally secure. Need to check to see if the original screws are identical in thread count and length. I like the original back plate better than the plain one from EKWB.









http://s1164.photobucket.com/user/CptSpig/media/Mobile Uploads/20161029_221954_zpszpkrc7de.jpg.html


----------



## CptSpig

Quote:


> Originally Posted by *MunneY*
> 
> 
> 
> Vince (k|ngp|n) just dropped this!


Vince used highly modified cards for the 1080 Ti run. He could not do this with the Titan XP; those boards are much more complicated. Don't think you will see this on someone's home machine.









http://s1164.photobucket.com/user/C...1ce700c_1080Tivoltmodded_zpsw2szzqo0.jpg.html


----------



## Kimir

We know that, that's part of the development of the 1080Ti KPE, probably.


----------



## deejaykristoff

my baby is under water too
great card, premium temps, always under 42 degrees




at 1440p with the ROG Swift; when playing on a 50" LCD at 1080p the temp never passes 32 degrees
have modded my hybrid kit with an h105 rad and short tubes

[IMG]http://www.overclock.net/content/type/61/id/2984206/width/500/height/1000[/IMG]


----------



## BroHamBone

Quote:


> Originally Posted by *Dasboogieman*
> 
> What AIO did you use?


http://www.evga.com/products/product.aspx?pn=400-HY-5188-B1

That one.


----------



## CptSpig

Quote:


> Originally Posted by *Kimir*
> 
> We know that, that's part of the development of the 1080Ti KPE, probably.


You are probably right. I had two 980 KingPins before my Titan XP and they were not good overclockers; they needed to be on LN2. I will not buy those cards again unless he makes changes to make them more friendly on water.


----------



## yukkerz

Quote:


> Originally Posted by *Dasboogieman*
> 
> What AIO did you use?


EVGA Hybrid for the 1070/1080 fits without any mods. Don't even need to take backplate off either, very simple.


----------



## PewnFlavorTang

Quote:


> Originally Posted by *CptSpig*
> 
> You are correct my friend. I have the original back plate on my Titan XP. You can only use two screws on the front plate but it's totally secure. Need to check to see if the original screws are identical in thread count and length. I like the original back plate better than the plain one from EKWB.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://s1164.photobucket.com/user/CptSpig/media/Mobile Uploads/20161029_221954_zpszpkrc7de.jpg.html


where do you get those QDC's at?

Quote:


> Originally Posted by *keikei*
> 
> Any one running a Ryzen chip with their Ti?
> 
> I noticed evga has different powerpins on some of their aftermarket versions.


I am.


----------



## KedarWolf

Quote:


> Originally Posted by *CptSpig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *zswickliffe*
> 
> I've seen someone on the EVGA forums who did this with their Titan X Pascal with no issues. EK lists it as incompatible so be careful.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *eXistencelies*
> 
> They are actually different. I didn't try it as I talked with EK and they said it would not work. From the looks of it though the screw pattern is the same. Underneath is where it is different. I still have my 1080 FE EKWB and EKBP I am trying to sell.
> 
> 
> You are correct my friend. I have the original back plate on my Titan XP. You can only use two screws on the front plate but it's totally secure. Need to check to see if the original screws are identical in thread count and length. I like the original back plate better than the plain one from EKWB.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://s1164.photobucket.com/user/CptSpig/media/Mobile Uploads/20161029_221954_zpszpkrc7de.jpg.html

I prefer the look of the EK backplate myself.

Leak testing my QDC prefilled waterblock for 24 hours, then slapping it on my 1080 Ti with the backplate.

Oh, and check some posts here about peeps using a few extra thermal pads on their EK blocks in spots the stock cooler pads by default. Seems like a good idea, I will be using them for sure.









Edit: Predator 280 or 360 I bet.


----------



## mr2cam

Quote:


> Originally Posted by *yukkerz*
> 
> EVGA Hybrid for the 1070/1080 fits without any mods. Don't even need to take backplate off either, very simple.


You mean like gamers nexus did? Meaning you can't use the EVGA Hybrid shroud?


----------



## CptSpig

Quote:


> Originally Posted by *PewnFlavorTang*
> 
> where do you get those QDC's at?


EKWB they come on the Predator AIO's soon to be EK-MLC.


----------



## CptSpig

Quote:


> Originally Posted by *KedarWolf*
> 
> I prefer the look of the EK backplate myself.
> 
> Leak testing my QDC prefilled waterblock 24 hours,then slapping it on with the backplate on my 1080 Ti.
> 
> Oh, and check some posts here about peeps using a few extra thermal pads on their EK blocks that the stock cooler has by default. Seems like a good idea, I will be using them for sure.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Predator 280 or 360 I bet.


I did the extra two pads. The pads that come with the block cover the rest.


----------



## bfedorov11

Quote:


> Originally Posted by *D13mass*
> 
> Max temp is 35 with an ek block. - how many radiators you have and what is the speed of fans? It`s very low temp, probably your fans are working like a crazy
> 
> 
> 
> 
> 
> 
> 
> and many noise...


Just the two 240mm rads. Fans are AP15s (1850 RPM), but they're running low; I have them all running off the mobo. That temp was just from a bench run. Gaming it goes to around 40.


----------



## Kimir

Quote:


> Originally Posted by *CptSpig*
> 
> EKWB they come on the Predator AIO's soon to be EK-MLC.


They also exist as a standalone product in some shops (not EK's, which makes no sense to me); the brand is CPC.
CPC QDC 12.7 F
CPC QDC 12.7 M


----------



## qazplm5089

Do you guys know if the EVGA 980 ti hybrid kit would fit? It seems like it would be easier to fit the shroud since there are no cuts needed for the power connectors.


----------



## Joshwaa

Quote:


> Originally Posted by *CptSpig*
> 
> I did the extra two pads. The pads that come with the block cover the rest.


What size pads did you use 1.0mm?


----------



## Code-Red

Got my 1080 Ti. Question: is it only the reference cards that come in those sleek vertical boxes, or do only reviewers receive those? Was kind of surprised to see such a premium card in the old 2006-era thin cardboard box and plastic clamshell.


----------



## Jbravo33

Finally!!! Got my ek waterblock but back plate not in stock. Is it ok to run stock backplate in meantime?


----------



## alucardis666

Quote:


> Originally Posted by *Code-Red*
> 
> 
> 
> Got my 1080Ti. Question: Is it only the reference cards that come in those sleek vertical boxes, or do reviewers only receive those. Was kind of surprised to see such a premium card in the old 2006-era thin cardboard box and plastic clam-shell.


Yea, only Nvidia reference cards come in the vertical box


----------



## GreedyMuffin

Folding at 2062 at 1.000V, 100% fan, 58C.

Will test higher later. I think it will easily do 2100 under water.


----------



## jonny30bass

Quote:


> Originally Posted by *KraxKill*
> 
> I've reverted to the previous driver. This driver won't let my card down clock to 135mhz at desktop. Just makes it sit idling at ~1500MHz 60w TDP. I'm at 144hz on a PG278Q.
> 
> The old driver at least allowed the card to down clock at 144hz with gsync enabled but this driver keeps it from down clocking. At least for me....
> 
> You've been warned...


I had the same problem. Turns out that NZXT CAM was the culprit. I reinstalled CAM and then reinstalled the driver, and it would downclock again.

UPDATE: I restarted my computer and it was back to idling at 3D clocks, so I think it is just a driver problem. I restored default driver settings and then it went back to normal idle clocks. After restoring the defaults, I can change my settings back to what they were before and it still sits at normal idle clocks.


----------



## KCDC

The latest Nvidia driver fixed my SLI issues with ME:A... now I'm getting constant 60 fps at 7680x1440! I was pissed there for a second...


----------



## eXistencelies

Quote:


> Originally Posted by *Jbravo33*
> 
> Finally!!! Got my ek waterblock but back plate not in stock. Is it ok to run stock backplate in meantime?


Yea it is. The only crappy thing is you'll more than likely have to drain your loop to put the back plate on, so best to hold out till it comes in. Unless you have very tiny hands and a very small Phillips.


----------



## Bilie

Quote:


> Originally Posted by *Jbravo33*
> 
> Finally!!! Got my ek waterblock but back plate not in stock. Is it ok to run stock backplate in meantime?


No problem at all, since the backplate is all about visuals; a backplate doesn't provide cooling.

http://www.overclock.net/t/1359650/gpu-backplate


----------



## jonny30bass

Got my 1080 Ti last Thursday. Been doing some testing. My max stable OC seems to be +135 on the core and +500 on the mem. It runs rather hot though. Max temp at around 74C with 84% fan speed. Here is my best Firestrike bench with my max stable OC: http://www.3dmark.com/fs/12071359


----------



## JedixJarf

Quote:


> Originally Posted by *Jbravo33*
> 
> Finally!!! Got my ek waterblock but back plate not in stock. Is it ok to run stock backplate in meantime?


I run the stock backplate, the EK backplate is such a waste of money unless you absolutely have to have the aesthetics lol.


----------



## JedixJarf

Quote:


> Originally Posted by *KCDC*
> 
> The latest Nvidia driver fixed my SLI issues with ME:A... now I'm getting constant 60 fps at 7680x1440! I was pissed there for a second...


That res....


----------



## Jbravo33

Anyone else have problems after install? It keeps booting into the BIOS. I'm gonna rip my hair out. Even loaded factory defaults and it won't load. Any suggestions?

Disregard. Tried the retry button. Worked.


----------



## eXistencelies

Quote:


> Originally Posted by *JedixJarf*
> 
> I run the stock backplate, the EK backplate is such a waste of money unless you absolutely have to have the aesthetics lol.


I def favor aesthetics when you have a custom loop.


----------



## CptSpig

Quote:


> Originally Posted by *Kimir*
> 
> They also exist as a standalone product on some shop (not EK, which make no sense to me), the brand is CPC.
> CPC QDC 12.7 F
> CPC QDC 12.7 M


These QDC connectors are used in the medical field. Top notch build quality.
https://www.cpcworldwide.com/Industries/Medical


----------



## CptSpig

Quote:


> Originally Posted by *Joshwaa*
> 
> What size pads did you use 1.0mm?


I used 0.5mm pads.


----------



## KedarWolf

Quote:


> Originally Posted by *CptSpig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Joshwaa*
> 
> What size pads did you use 1.0mm?
> 
> 
> 
> I used 0.5mm pads.

I don't have any extra pads, so I'm hoping taking a few of the pads off the stock cooler will do.


----------



## JedixJarf

Quote:


> Originally Posted by *eXistencelies*
> 
> I def favor aesthetics when you have a custom loop.


Stock backplate is sexy too


----------



## JedixJarf

Quote:


> Originally Posted by *CptSpig*
> 
> These QDC connectors are used in the medical field. Top notch build quality.
> https://www.cpcworldwide.com/Industries/Medical


They are also fugly, koolance is the way to go


----------



## spyui

Has anyone tried running this card in SLI on a Ryzen motherboard yet? Is the scaling at 1440p good compared to Z170/Z270?


----------



## PasK1234Xw

EVGA Titan XP/Ti hybrid kit possibly next week.


https://twitter.com/i/web/status/844245181150851072


----------



## KCDC

I'm a little peeved.

There appears to be only one SLI HB bridge that's 3-slot spacing for two cards: the one made by Nvidia. Unfortunately, this specific bridge doesn't fit with EK waterblocks...

Does anyone know of another company that makes a 2-card, 3-slot HB bridge? EVGA doesn't, nor Zotac, nor MSI, nor Gigabyte... I haven't found one yet. Quite baffling, honestly. It's a fairly common configuration, isn't it? My mobo has only two x16 slots, so I can't move the cards either.

Since I am running 7680 x 1440, I'm pretty sure I need the extra bandwidth.

Other options... use an older sli bridge and take a possible performance hit

Maybe another brand waterblock allows fitment of the Nvidia HB bridge (can anyone confirm this, possibly?)

Or, my least favored option, take the bridge apart and hack off the protruding nibs that prevent fitment...

I haven't pulled the trigger on blocks yet, though my monoblock is EK.. wouldn't mind staying in the same brand.. But it's not a dealbreaker if not.


----------



## Tonza

Does the stock backplate need any modifications with EK block?


----------



## PasK1234Xw

Quote:


> Originally Posted by *KCDC*
> 
> I'm a little peeved.
> 
> There appears to be only one SLI HB Bridge thats 3 slot spacing for two cards; the one made by Nvidia. Unfortunately this specific bridge doesn't fit with EK waterblocks...


This is known and they mention it on their site. You can take the plastic shroud off the SLI bridge and it should fit.


----------



## KedarWolf

Quote:


> Originally Posted by *PasK1234Xw*
> 
> EVGA Titan XP/Ti hybrid kit possibly next week.
> 
> 
> https://twitter.com/i/web/status/844245181150851072


I've always been leery about hybrid kits, because it's just the GPU chip itself that is water cooled; the memory chips etc. are air cooled, and on an aggressive overclock they will get very hot. Might lessen the life of the card.

With a full waterblock and backplate, everything under the waterblock is actively cooled, and anything under the backplate still gets some passive cooling from it.

Seems a much better option.


----------



## GreedyMuffin

Quote:


> Originally Posted by *Tonza*
> 
> Does the stock backplate need any modifications with EK block?


No, you just need to re-use some of the FE cooler's screws. Will do this tomorrow and report back.


----------



## KCDC

Quote:


> Originally Posted by *PasK1234Xw*
> 
> This is known and they mention on their site. You can take off the plastic shroud and it will fit


Yeah, I didn't do enough research before buying the bridge, hence me asking what I asked.


----------



## PasK1234Xw

Quote:


> Originally Posted by *KedarWolf*
> 
> I've always been leery about hybrid kits, because it's just the GPU chip itself that is water cooled; the memory chips etc. are air cooled, and on an aggressive overclock they will get very hot. Might lessen the life of the card.
> 
> With a full waterblock and backplate, everything under the waterblock is actively cooled, and anything under the backplate still gets some passive cooling from it.
> 
> Seems a much better option.


Putting a block on also requires running a loop, which costs a lot more than a $119 kit.
The memory is perfectly fine with the mid plate, thermal pads, and a blower fan.


----------



## KedarWolf

Quote:


> Originally Posted by *KCDC*
> 
> I'm a little peeved.
> 
> There appears to be only one SLI HB Bridge thats 3 slot spacing for two cards; the one made by Nvidia. Unfortunately this specific bridge doesn't fit with EK waterblocks...
> 
> Does anyone know of another company that makes a 2-card 3-slot HB Bridge? Evga doesn't, nor zotac, nor MSI, nor Gigabyte...I haven't found one yet. Quite baffling, honestly. It's a fairly common configuration, isn't it? My mobo has only two x16 slots, so I can't move the cards either.
> 
> Since I am running 7680 x 1440, I'm pretty sure I need the extra bandwidth.
> 
> Other options... use an older sli bridge and take a possible performance hit
> 
> Maybe another brand waterblock allows fitment of the Nvidia HB bridge (can anyone confirm this, possibly?)
> 
> Or, my least favored option, take the bridge apart and hack off the protruding nibs that prevent fitment...
> 
> I haven't pulled the trigger on blocks yet, though my monoblock is EK.. wouldn't mind staying in the same brand.. But it's not a dealbreaker if not.


I've read that you can attach two ribbon SLI connectors in two-way SLI and it'll have the same effect, but I can't confirm this.

Google is your friend.

https://www.reddit.com/r/4m534q/psa_you_can_use_two_ribbon_sli_bridges_instead_of/


----------



## KraxKill

Quote:


> Originally Posted by *jonny30bass*
> 
> I had the same problem. Turns out that NZXT CAM was the culprit. I reinstalled CAM and then reinstalled the driver and it would downclock again.
> 
> UPDATE: I restarted my computer and it was back at idling at 3D clocks. I think it is just a driver problem. I restored default driver settings and then it went back to normal idle clocks. After restoring the default driver settings, I can then change my settings back to what they were before and it will still be at normal idle clocks.


Yeah, with CAM running it certainly does that, but usually I've been able to get it to downclock by turning CAM off. I only use it to adjust the pump RPM when I'm benching or in extended gaming sessions; normally it's operating in set-it-and-forget-it mode for me. But on the newest driver, even if I turn CAM off (along with the background services it runs), it still keeps my clocks higher.

I didn't try uninstalling CAM, installing the driver, and then re-installing CAM after. May be worth a shot. Are people actually getting higher 3DMark scores and/or performance with the latest driver? Because if not, I'm good to leave well enough alone.


----------



## jrcbandit

Quote:


> Originally Posted by *PasK1234Xw*
> 
> EVGA Titan XP/Ti hybrid kit possibly next week.
> 
> 
> __ https://twitter.com/i/web/status/844245181150851072


Is there any reason to get this when I have a hybrid kit for the 1080 I can adapt?

Any suggestions for installing the 1080 hybrid kit for the Ti? Should I be cutting the 1080 hybrid kit backplate to make it work with the Ti or do you use the Ti backplate and just install the waterblock portion? I guess it depends on how the Nvidia cooler disassembles with its fan location. Can the 1080 hybrid shroud be put back on with the Hybrid LED in white?


----------



## D13mass

Quote:


> Originally Posted by *Mongo*
> 
> Videos are crap shot with my phone.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> All these have color pops artifacts. Hope you can see it in the video.


I watched your videos; those are artifacts. You need to replace the video card. Go to the shop immediately.


----------



## JedixJarf

Quote:


> Originally Posted by *GreedyMuffin*
> 
> No, you just need to re-use some of the FE coolers screws. Will do this tomorrow and report back.


Yeah, did the same with the koolance block.


----------



## Tonza

Quote:


> Originally Posted by *GreedyMuffin*
> 
> No, you just need to re-use some of the FE coolers screws. Will do this tomorrow and report back.


Okay, sounds good. Odd that EK says it won't fit.


----------



## mfdoom7

How's the 1080 Ti DPC latency? Is the issue fixed?


----------



## GreedyMuffin

Quote:


> Originally Posted by *Tonza*
> 
> Okay, sounds good. Odd that EK says it won't fit.


Not really.

1. They want to make money; buy their backplate.
2. It might only work on the FE, not other reference-based cards.
3. Since you need to mess with different screws and such, EK doesn't recommend it and says it isn't doable.


----------



## KedarWolf

Quote:


> Originally Posted by *Tonza*
> 
> Quote:
> 
> 
> 
> Originally Posted by *GreedyMuffin*
> 
> No, you just need to re-use some of the FE coolers screws. Will do this tomorrow and report back.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Okay, sounds good, odd that EK says it wont fit
> 
> 
> 
> 
> 
> 
> 
> .

Someone here did it. You can't use all the old screws; only a few fit the stock backplate with an EK waterblock, but they said it doesn't affect functionality at all.


----------



## Baasha

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Look at this:
> Just seen what this poster wrote, interesting.


Just ordered a 4960X - hopefully this will be enough for the cards to scale. Let's see.


----------



## GreedyMuffin

Quote:


> Originally Posted by *KedarWolf*
> 
> Someone here did it and you can't use all the old screws, only a few fit it on the stock backplate with an EK waterblock but they said it doesn't affect functionality at all.


Yeah. Where you can't fit the FE screws, just use the EK screws so everything is correct.


----------



## JedixJarf

Quote:


> Originally Posted by *KedarWolf*
> 
> Someone here did it and you can't use all the old screws, only a few fit it on the stock backplate with an EK waterblock but they said it doesn't affect functionality at all.


There isn't anything functional about the backplate anyway; it's just for cosmetics. You could slap some double-sided tape on there if you wanted to hold it in place lol.


----------



## KedarWolf

Quote:


> Originally Posted by *JedixJarf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Someone here did it and you can't use all the old screws, only a few fit it on the stock backplate with an EK waterblock but they said it doesn't affect functionality at all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> There isn't anything functional about the backplate anyways, it's just for cosmetics, you could slap some double sided tape on there if you wanted to hold it in place lol.

I'm pretty sure there are some thermal pads on the backplate and it provides some passive cooling for some components.

At least my Maxwell Titan X has that with EK, but I'd have to look at the uninstalled plate at home.

Doing a 24 hour leak test on my prefilled waterblock with quick disconnects NOT attached to my 1080 Ti.


----------



## KCDC

Quote:


> Originally Posted by *KedarWolf*
> 
> You can attach two ribbon SLI connectors in two way SLI and it'll have the same effect I've read but can't confirm this.
> 
> Google is your friend.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> __
> https://www.reddit.com/r/4m534q/psa_you_can_use_two_ribbon_sli_bridges_instead_of/%5B/URL


I know this, I did mention it in my post.


----------



## KedarWolf

Please don't make me read. I don't wanna read.


----------



## CptSpig

Quote:


> Originally Posted by *Tonza*
> 
> Okay, sounds good, odd that EK says it wont fit
> 
> 
> 
> 
> 
> 
> 
> .


They want to sell you a backplate; more money in their pocket. It worked fine on my Titan XP.

http://s1164.photobucket.com/user/CptSpig/media/Mobile Uploads/20161029_221954_zpszpkrc7de.jpg.html


----------



## sWaY20

Quote:


> Originally Posted by *JedixJarf*
> 
> They are also fugly, koolance is the way to go


I'll take my medical QDCs that work; I had Koolance ones and they leaked.


----------



## JedixJarf

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm pretty sure there are some thermal pads on the backplate and it provides some passive cooling on some components.
> 
> At least my Maxwell Titan X has that with EK but i'd have to look at the not installed plate at home.
> 
> Doing a 24 hour leak test on my prefilled waterblock with quick disconnects NOT attached to my 1080 Ti.


Neither my 1080 (AIB) nor my 1080 Ti (FE) had pads, but if they did, it would just be so metal isn't making contact with the components on the board.


----------



## JedixJarf

Quote:


> Originally Posted by *sWaY20*
> 
> I'll take my medical qdc that work, i had koolance and they leaked.


Did they leak immediately? Just put another set in my rig so I'm genuinely curious.


----------



## dboythagr8

Quote:


> Originally Posted by *Baasha*
> 
> Just ordered a 4960X - hopefully this will be enough for the cards to scale. Let's see.


You're running 4-way SLI in an era where, beyond 2 cards, it's no longer "officially" supported. So I'm curious what you are expecting here. 4 cards have always been the extreme of the extreme. Are you expecting 90% utilization on all 4 cards? Because I don't think you're necessarily going to see that.


----------



## sWaY20

Quote:


> Originally Posted by *JedixJarf*
> 
> Did they leak immediately? Just put another set in my rig so I'm genuinely curious.


After a few months I noticed a slow leak; it got worse, so I ditched them. These EK QDCs are ugly, I agree, but man, they work well. I can Plasti Dip them to blend in easily.


----------



## dboythagr8

I asked before but I will try again: is EK who I should stick with for watercooling my system? It would be my first time doing it, and I don't want to go cheap on components for fear of leaks or malfunctions as mentioned by Sway. I know it can happen with any product, I'm just saying.


----------



## KCDC

This sweet looking Phanteks waterblock comes out next month:

http://www.phanteks.com/PH-GB1080TI.html


----------



## bfedorov11

Quote:


> Originally Posted by *dboythagr8*
> 
> I asked before but I will try again: Is EK who I should stick with for WC my system? Would be first time doing it and I don't want to go cheap on components for fear of leaks or malfunctions as mentioned by Sway. I know it can happen with any product, im just saying.


EK is all I use now. I've had a few small issues with them from time to time, but nothing I couldn't correct myself.

Leaks come down to the builder, not the manufacturer. I always do a quick air pressure test for an hour or so before filling. I also use the clear EK coolant, which is supposed to have lower electrical conductivity than water... although I doubt it would save anything if there was a leak.


----------



## sotorious

Just received the 1080 Ti. A little disappointed it only comes with 4 ports. If I end up getting an HTC Vive, I will have to constantly disconnect a monitor to be able to connect the VR headset. My 980 had 5 ports, and I just assumed the 1080 Ti would too.


----------



## Slackaveli

Quote:


> Originally Posted by *sotorious*
> 
> Just received the 1080ti a little disappointed it only comes with 4 ports. If i end up getting a HTC VIVE i will have to constantly disconnect a monitor to be able to connect the VR. my 980 had 5 ports and i just assumed so would the 1080ti


You want the Asus Strix model that comes out in a week and a half. It has two HDMI 2.0 and two DP 1.4.


----------



## Asus11

hey guys..

Considering the 1080 Ti is "meant" to have better power delivery than previous Pascal models, is the power throttling limit still prevalent? Would it be okay to go Founders / waterblock?

I'm sure I read somewhere that some people were not hitting any limits compared to previous FE cards, i.e. the Titan X Pascal and 1080 FE.

I'm asking this because I'm about to buy an FE / waterblock... or should I wait for AIB cards?

your opinions are much appreciated!

will be gaming @ 3440 x 1440


----------



## jonny30bass

I think they removed a port for better exhaust on the Founders Edition.


----------



## KCDC

Quote:


> Originally Posted by *bfedorov11*
> 
> Ek is all I use now. I've had a few small issues with them from time to time, but nothing I couldn't correct myself.
> 
> Leaks come down to the builder, not manufacture. I always do a quick air pressure test for an hour or so before filling. I also use the clear ek koolant which is supposed to have lower electrical conductivity compared to water.. although I doubt it would save anything if there was a leak.


They come from the builder, yes, but they also come from faulty components. It happened to me. I built everything to spec, but the reservoir I purchased had a small leak around the bottom seal, almost invisible to the eye... It was one of those FrozenQ helix reservoirs with top and bottom caps that don't unscrew. I think a fault like this is probably more likely to happen than, say, a fitting, but faulty components do happen.


----------



## sotorious

Quote:


> Originally Posted by *Slackaveli*
> 
> you want the asus strix model that comes out in a week and a half. It has two hdmi2.0 and 2 dp1.4


That's still only 4 ports.


----------



## KCDC

Looks like someone hit 3GHz on LN2, sorry if it's a repost:

https://segmentnext.com/2017/03/21/evga-nvidia-gtx-1080-ti-overclocked/

edit: now that I've read it again, it seems fake to me.


----------



## p33k

Well, I definitely lost the silicon 1080 Ti lottery... anything more than +100 on the core crashes my system (1974 MHz). Got +500 on the memory and quit playing.


----------



## bfedorov11

Quote:


> Originally Posted by *Asus11*
> 
> hey guys..
> 
> considering the 1080 ti is ''meant'' to have a better power delivery than previous pascal models
> 
> is the power throttling limit still prevalent ? would it be okay to go founders / waterblock..
> 
> im sure I read somewhere some people was not hitting any limits compared to previous FE cards i.e Titan X pascal and 1080 FE?
> 
> im asking this because im about to buy an FE / waterblock.. or should I wait for AIB cards?


I haven't used my card all that much, but I would say the 1080 Ti throttles a little less than the 1080 FE. I had a 1080 FE under water; it would ping the power target often. I've played a little Wildlands and TW3... if I set it to 2050, it will drop down to 2038 every now and then, while the 1080 seemed like it dropped lower and more often.


----------



## JedixJarf

Quote:


> Originally Posted by *bfedorov11*
> 
> I haven't used my card all that much, but I would say the 1080ti throttles a little less than the 1080 FE. I had a 1080 FE under water. It would ping the PT often. I've played a little of Wildlands and TW3.. if I set it to 2050, it will drop down to 2038 every now and then.. while the 1080 seemed like it dropped lower and more often.


Same here, seeing pretty much the same results, doesn't clock down as aggressively as the 1080 did.


----------



## Asus11

Quote:


> Originally Posted by *bfedorov11*
> 
> I haven't used my card all that much, but I would say the 1080ti throttles a little less than the 1080 FE. I had a 1080 FE under water. It would ping the PT often. I've played a little of Wildlands and TW3.. if I set it to 2050, it will drop down to 2038 every now and then.. while the 1080 seemed like it dropped lower and more often.


Quote:


> Originally Posted by *JedixJarf*
> 
> Same here, seeing pretty much the same results, doesn't clock down as aggressively as the 1080 did.


ty guys.. might go for it then


----------



## zswickliffe

Quote:


> Originally Posted by *KCDC*
> 
> This sweet looking Phanteks waterblock comes out next month:
> 
> http://www.phanteks.com/PH-GB1080TI.html


I think this might be my favorite. I don't dig the LEDs so much but I'm sure you can keep all that off.


----------



## mshagg

Quote:


> Originally Posted by *sotorious*
> 
> Just received the 1080ti a little disappointed it only comes with 4 ports. If i end up getting a HTC VIVE i will have to constantly disconnect a monitor to be able to connect the VR. my 980 had 5 ports and i just assumed so would the 1080ti


You must have a fairly specific scenario. I'm running a Vive on the thing and also three monitors with DP-DVI cables. The output configuration is fairly flexible...


----------



## mr2cam

NM, it was a GTX 1080 that hadn't received a price drop yet lol


----------



## Slackaveli

Quote:


> Originally Posted by *Asus11*
> 
> ty guys.. might go for it then


Do it, brother. This thing is a big leap: two 980 Tis with no SLI issues.
Quote:


> Originally Posted by *zswickliffe*
> 
> I think this might be my favorite. I don't dig the LEDs so much but I'm sure you can keep all that off.


Me too. That thing looks amazing.


----------



## pantsoftime

Quote:


> Originally Posted by *sotorious*
> 
> Just received the 1080ti a little disappointed it only comes with 4 ports. If i end up getting a HTC VIVE i will have to constantly disconnect a monitor to be able to connect the VR. my 980 had 5 ports and i just assumed so would the 1080ti


I have the same complaint. Solved my problem by using an evga displayport hub for my side monitors. Vive gets a direct connection though as I wouldn't want to impact latency.

The strix has 2 HDMI, 2 DP, and 1 DVI though. Some other vendors are going to add the DVI back in as well on their aftermarket cards.


----------



## Jbravo33

Quote:


> Originally Posted by *eXistencelies*
> 
> Yea it is. Only crappy thing is you'll more than likely have to drain your loop to put the back plate on. Best to hold out till it gets in. Unless if you very tiny hands and a very small phillips.


I figured that. I'm gonna wait till I can get my hands on another card. I have all PETG tubing and fittings, so I'm only trying to do it ONCE! So until then I'll air it out. Gonna start OCing in a few and see what I can come up with. Literally just went back to page one to see what everyone's been getting, to have a reference.


----------



## KCDC

Quote:


> Originally Posted by *zswickliffe*
> 
> I think this might be my favorite. I don't dig the LEDs so much but I'm sure you can keep all that off.


Nah, you never have to plug those in. It is nice that the LEDs support Asus Aura, though.

Main reason why I'm going for it is that it can support Nvidia's HB SLI bridge.

Also supports the OG backplate, which I'm guessing means all the screws will work.

Says so here:

https://www.overclockers.co.uk/phanteks-glacier-gtx-1080-ti-gpu-full-water-block-with-rgb-lighting-satin-black-wc-012-pt.html


----------



## zswickliffe

Quote:


> Originally Posted by *KCDC*
> 
> Nah, you never have to plug those in. It is nice that the LEDs support asus aura, though.
> 
> Main reason why I'm going for it is that it can support Nvidia's HB SLI bridge.
> 
> Also supports the OG backplate, which I'm guessing means all the screws will work.
> 
> Says so here:
> 
> https://www.overclockers.co.uk/phanteks-glacier-gtx-1080-ti-gpu-full-water-block-with-rgb-lighting-satin-black-wc-012-pt.html


It keeps sounding better and better


----------



## fat4l

What is the best card to go for if putting it under water?
The NVIDIA FE?
Has EK released their block for the Ti already?


----------



## JedixJarf

Quote:


> Originally Posted by *fat4l*
> 
> What is the best to go for if putting it under water?
> NVIDIA FE?
> has EK released their block for Ti alteady?


The Ti FE block is just the Titan X Pascal block.


----------



## SonnyTubbs

Quote:


> Originally Posted by *dboythagr8*
> 
> I asked before but I will try again: Is EK who I should stick with for WC my system? Would be first time doing it and I don't want to go cheap on components for fear of leaks or malfunctions as mentioned by Sway. I know it can happen with any product, im just saying.


The best way to avoid leaks is to avoid rotary fittings. I've had Bitspower, EKWB, and Monsoon rotaries leak, but never a regular fitting. That said, I still use rotaries because I like the look and how they simplify tubing runs.

Other than that you're on the right track. Don't go cheap, especially with the pump.

Sent from my Pixel XL using Tapatalk


----------



## sWaY20

Quote:


> Originally Posted by *dboythagr8*
> 
> I asked before but I will try again: Is EK who I should stick with for WC my system? Would be first time doing it and I don't want to go cheap on components for fear of leaks or malfunctions as mentioned by Sway. I know it can happen with any product, im just saying.


I'd go with ek, one of the best products imo, and out of this world customer service.


----------



## KedarWolf

I'm pretty happy under water. Temps under 50°C at 50% pump speed, with another GPU dedicated to PhysX and the CPU in the loop; 2062 core (+152), 6147 memory (+641).

Firestrike.


----------



## Bishop07764

I'll be joining the ranks here too.



I just got this thing today and have barely used it. I already hate the cooler. I thought I might wait a little while to order my water block, but my gosh, this cooler sucks. I've already ordered an EK block. This thing is louder than all 23 case fans and watercooling pumps combined, by a fair margin. I'm also starting to see that my CPU might be my weak spot now: benching Gears of War 4 showed only 75% GPU bound at 1440p. My i7 3770K at 4.8 GHz might finally be starting to show its age.


----------



## xNutella

Sold out, sold out everywhere. Might throw my papers in on an FE card. Bored of waiting for the Vega card.


----------



## pez

Quote:


> Originally Posted by *PasK1234Xw*
> 
> EVGA Titan XP/Ti hybrid kit possibly next week.
> 
> 
> https://twitter.com/i/web/status/844245181150851072
> Is there any reason to get this when I have a hybrid kit for the 1080 I can adapt?
> 
> Any suggestions for installing the 1080 hybrid kit for the Ti? Should I be cutting the 1080 hybrid kit backplate to make it work with the Ti or do you use the Ti backplate and just install the waterblock portion? I guess it depends on how the Nvidia cooler disassembles with its fan location. Can the 1080 hybrid shroud be put back on with the Hybrid LED in white?


They *might* account for the higher TDP of the TXP and Ti, but not sure. I wouldn't necessarily count on it. At the end of the day, even if that's the case, it's probably going to be a margin-of-error difference.
Quote:


> Originally Posted by *dboythagr8*
> 
> You're running 4 way SLI in an era where beyond 2, it's no longer "officially" supported. So I'm curious what you are expecting here? 4 cards have always been the extreme of the extreme. Are you expecting 90% util on all 4 cards, because I don't think you're necessarily going to see that?


You must not be familiar with him. Baasha has been able to get 90+% usage on a few titles already, playing at 4K and 8K. Pretty impressive, actually.


----------



## KCDC

GPU rendering will take every card you have, and you need SLI off for it. Octane is running great! Gonna try Redshift this weekend; I've heard it's faster than Octane with the quality of V-Ray... though I think it will only allow two GPUs, I could be wrong.


----------



## pez

I know for sure I've seen Baasha post a 4-way SLI screenshot with TXPs on GTA V at 4K or 8K with all GPUs sitting at 95-99%. I mean, it's probably not worth it for most people... but this is OCN.


----------



## Jbravo33

So... after about 3 hours and 50+ variations of GPU and memory OC, it seems my best combo is +155 core / +375 mem, fan on max, power limit 120, temp limit 90. Highest temp 70°C, average 47.4°C.
Time Spy: 10,048
Fire Strike: 23,558 (which is weird, cuz earlier I got 23,723 with 150/350)
Fire Strike Ultra: 7,519
Fire Strike Extreme: 14,133
Heaven 2560x1440, 2062 core / 5880 mem: 176.2 FPS, score 4438, min 26.3 FPS, max 360.3 FPS
Valley 2560x1440, 2062 core / 5880 mem: 126.9 FPS, score 5311, min 21.7 FPS, max 256 FPS

Not sure if I should be frustrated or content. Putting the block on in the next two days. How much realistic gain would I see? 2050/2063 core and 5880 mem is pretty much the max I can get on air. I can get the mem to 6000, but it doesn't make the results any better. If my temps were in the 50s on water, would it make that much of a difference? I'm using Afterburner and didn't mess with core voltage, as I cannot adjust it. Would adjusting core voltage make a difference as well? Hope this helps some of you OCing your cards. Only tried one overclock for Valley and Heaven, which was the 155/375; I think I can get more out of those with tweaking.
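For anyone translating these offsets into actual clocks: Afterburner sliders are relative, so the clock you see under load is the card's stock boost bin plus the offset. A minimal sketch of that arithmetic follows; the stock baselines below are illustrative assumptions, not measured from this particular card:

```python
# Sketch of how Afterburner-style offsets map to the clocks people report.
# The stock baselines used in the example are illustrative assumptions.

def effective_core(stock_boost_mhz, offset_mhz):
    """Observed boost clock = stock boost bin + core slider offset."""
    return stock_boost_mhz + offset_mhz

def effective_memory(stock_mem_mhz, offset_mhz):
    """Reported memory clock = stock memory clock + memory slider offset."""
    return stock_mem_mhz + offset_mhz

# A card that boosts to 1907 MHz stock lands at 2062 MHz with a +155 offset,
# and 5505 MHz stock memory lands at 5880 MHz with +375 -- consistent with
# the 2062 core / 5880 mem readings quoted above.
print(effective_core(1907, 155))    # 2062
print(effective_memory(5505, 375))  # 5880
```

This is also why two cards with the same offset can report different clocks: the stock boost bin varies from sample to sample.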


----------



## JunkaDK

Quote:


> Originally Posted by *Jbravo33*
> 
> So.... after about 3 hours and 50+ variations of gpu oc and mem oc. Seems my best combo is 155core 375 mem fan on max. Power limit 120. temp on 90. Highest temp 70 avg 47.4
> Timespy 10,048
> *Fire strike 23,558 which is weird cuz earlier I got 23,723 with 150/350*
> Fire strike ultra 7519
> Fire strike extreme 14,133
> Heaven 2560x1440 2062core/5880mem FPS176.2 SCORE 4438 minfps26.3 max fps 360.3
> Valley 2560x1440 2062core/5880 mem FPS 126.9 SCORE 5311 min fps 21.7 max fps 256
> 
> Not sure if I should be frustrated or content. Putting block on in next two days. How much realistic gains would I see? 2050/2063core and 5880 pretty much max I can get on air. I can get mem to 6000 but doesn't make results any better. If my temps were in 50's on water will it make that much a difference? I'm using afterburner and didn't mess with core voltage as I cannot adjust it. Will adjust core voltage make a difference as well? Hope this helps some of you ocing your card. Only tried one overclock for valley and heaven which was the 155/375 think I can get more out of those with tweaking.


Did you try dialing back to 150/350 to see if you got the higher score again? When pushed to the limit, more MHz does not always equal a higher score.
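If you're sweeping a lot of combos, it helps to just log every run instead of trusting memory. A minimal bookkeeping sketch (the numbers are made-up example scores, not anyone's actual results):

```python
# Minimal bookkeeping for an OC sweep: record each (core_offset, mem_offset,
# score) run, then report the best combo. Numbers are illustrative only.
runs = [
    (150, 350, 23723),
    (155, 375, 23558),
    (150, 375, 23610),
]

def best_run(results):
    """Return the (core, mem, score) tuple with the highest score."""
    return max(results, key=lambda r: r[2])

core, mem, score = best_run(runs)
print(f"best: +{core}/+{mem} -> {score}")
```

Sorting the log this way makes it obvious when a higher offset actually scored lower, which is exactly the +150/+350 vs +155/+375 situation above.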


----------



## Silent Scone

Quote:


> Originally Posted by *sWaY20*
> 
> after a few months i noticed a slow leak, got worse so i ditched them. These ek qdc are ugly i agree, but man they work good. I can Plasti dip them to blend easily.


My rig has a ton of QDC3s and every single one of them still works; I've had them over two years, possibly slightly longer.


----------



## Neokolzia

Quote:


> Originally Posted by *pez*
> 
> I know for sure I've seen Baasha post a 4-way SLI screenshot with TXPs on GTA V at 4K or 8K with all GPUs sitting at 95-99%. I mean...it's probably not worth it for most people....but this is OCN
> 
> 
> 
> 
> 
> 
> 
> .


inb4 Vega Quad Fire, with much more appropriate scaling and supported scaling xD.


----------



## Kimir

Quote:


> Originally Posted by *Silent Scone*
> 
> My rig has a ton of QDC3s and every single one of them works, had them over 2 years - possibly slightly longer.


As long as they are not the black ones (even the newest batch), the Koolance ones are indeed fine.
The CPC QDCs used by EK are not the prettiest, that's for sure; there is a metal version that doesn't look that bad imo (and it's compression, not clamp).


----------



## reset1101

Hi. Have any of you tried an Arctic cooler on an FE 1080 Ti? According to Arctic, a lot of their coolers are compatible:

https://www.arctic.ac/de_en/products/cooling/vga.html

But I'd rather hear from someone who has actually tried one: temps, noise, VRM temps, etc.

Thanks!


----------



## M4c4br3

$1080 for Asus 1080Ti Strix here in Sweden


----------



## GreedyMuffin

Quote:


> Originally Posted by *M4c4br3*
> 
> $1080 for Asus 1080Ti Strix here in Sweden


About the same here in Norway as well, so I def. feel you. Luckily we get a five-year "warranty" by law. Do you have that sort of thing in Sweden?


----------



## KickAssCop

I got an ASUS FE 1080 Ti for 866 locally.


----------



## KedarWolf

Quote:


> Originally Posted by *KickAssCop*
> 
> I got an ASUS FE 1080 Ti for 866 locally.


Got a Gigabyte FE for $1250 Canadian locally on launch day, which is about $933 USD, but that includes 13% tax plus the $50 I paid someone to go to the other side of the city and pick it up for me, as I couldn't get there myself before the store closed.

The store held on to it until my friend got there that evening; it would have been sold out by the next day.


----------



## Joshwaa

Just going to throw this out there for the crazy people, me being one of them. I saw this on a car at the drag strip last time I was there and thought it would be awesome for my loop. Who doesn't want to say they run nitrous on their computer?

http://www.jegs.com/i/DEI/186/080125/10002/-1


----------



## zswickliffe

Quote:


> Originally Posted by *Joshwaa*
> 
> Just going to throw this out there for the crazy people me being one of them. I saw this on a car at the drag strip last time I was there and thought that would be awesome for my loop. Who doesn't want to say they run Nitrous on their computer.
> 
> http://www.jegs.com/i/DEI/186/080125/10002/-1


Unfortunately N2O is a gas at atmospheric pressures, so you'd have a tough time running a loop with gas in it.

An alternative is to pressurize the system until the N2O liquefies (around 750 PSI at room temperature). Not sure how good it is at cooling, though.


----------



## M4c4br3

Quote:


> Originally Posted by *GreedyMuffin*
> 
> About the same here In Norway as well.. So I def. feel you. Luckily we get a five years "warranty" by law. Do you got that sorta thing in Sweden?


No idea honestly, but I'm not buying this. I'll wait until Volta Titan and buy that instead.


----------



## Joshwaa

Quote:


> Originally Posted by *zswickliffe*
> 
> Unfortunately N2O is a gas at atmospheric pressures, so you'd have a tough time running a loop with gas in it.
> 
> An alternative is to pressurize the system to a couple of thousand PSI and it'll be a liquid. Not sure how good it is at cooling though.


It's a chambered system: you run the N2O through a heat exchanger to drop the temp of the water in your loop. You could also use CO2, which would be a lot cheaper.


----------



## Jbravo33

Quote:


> Originally Posted by *JunkaDK*
> 
> Did you try to dial back to 150/350 and see if you got a higher score again? when pushed to the limit, more Mhz does not always equal a higher score


I did, unfortunately. Started low, went high, then went low again. But I only tried one setting for Valley and Heaven. If I get more out of it in Valley or Heaven, does that mean anything, considering it would crash in 3DMark? Is one bench better than the other?


----------



## gstarr

Ordered an EVGA GTX 1080 Ti FE in Germany for $633 excl. VAT and incl. shipping. Great price, but a one-week shipping time. Now two ways of cooling it: wait for the hybrid kit from EVGA (next week?) or a new water-cooling setup with a 280 radiator.


----------



## Jbravo33

They're available now on nvidia. Just snagged my second one.


----------



## zswickliffe

Quote:


> Originally Posted by *Joshwaa*
> 
> It is a chambered system you run the N2O through the heat exchanger to drop the temps of the water in your loop. You can also use CO2 which would be a lot cheaper.


Ah, I understand what you're saying. Yeah, that would get a bit costly haha.


----------



## Anzial

Quote:


> Originally Posted by *gstarr*
> 
> Ordered EVGA GTX 1080 Ti FE in germany for 633 $ excl vat and incl shipping. Great price, but 1 week shipping time.


That's an awesome deal, below $699 MSRP already! That was fast.


----------



## shalafi

aaaand there I go - just ordered a Gigabyte FE and the EVGA 1080 Hybrid kit


----------



## GreedyMuffin

Quote:


> Originally Posted by *Anzial*
> 
> That's an awesome deal, below $699 MSRP already! That was fast.


That was excluding VAT. In Norway I paid about 860 USD including 25% VAT and shipping. Then I bought a block etc. on top.


----------



## Anzial

Quote:


> Originally Posted by *GreedyMuffin*
> 
> That was excluding VAT. In Norway I payed about 860 USD including 25% wat and including shipping. Then I bought a block and etc.


MSRP doesn't include VAT either. So yeah, if he paid $633 for his 1080 ti, that's a sale price on a product that has barely been released!


----------



## GreedyMuffin

Quote:


> Originally Posted by *Anzial*
> 
> MSRP doesn't include VAT either. So yeah, if he paid $633 for his 1080 ti, that's a sale price on a product that has barely been released!


Oh, I did not know. Thanks!

EDIT: Then I paid $644 USD. I got it with a little discount though (6%).

I installed my EKWB on my 1080TI. Used the stock FE backplate. Made a little album for those who are interested.








https://photos.google.com/share/AF1QipPJ1dJatJMyaV2VdB4JC6oB75pxXm0x9CpZXwjaytIB5TfssSIJtc7szWDyXa1f9Q?hl=no&key=SjFOTlNoTk1aalNGTkVmdURxcm16MDNRWjM2Sm13

Leak-testing now. I had to use the leftover pads (2x 0.5mm + 2x 1.0mm) since the height was not the same, as I used washers on the FE screws. But it works well. No wobbling; everything fits snug and smooth.


----------



## M4c4br3

Quote:


> Originally Posted by *Anzial*
> 
> MSRP doesn't include VAT either. So yeah, if he paid $633 for his 1080 ti, that's a sale price on a product that has barely been released!


Yeah, because in the US everything is listed pre-tax and the tax is only added once you order the item. That's why we in Europe look like we pay insane prices.


----------



## Anzial

Quote:


> Originally Posted by *M4c4br3*
> 
> Yeah because in the stupid ass USA everything is sold as is and vat is added once you order the item. That's why we in Europe have to pay insane prices.


I'm not sure you've got much to complain about; with below-MSRP prices on 1080 Tis, you won't find that sort of deal in the good old US of A, lol.







As for VAT, just like sales tax in the US, it is decided by each individual state, no?


----------



## smushroomed

What seems to be the average OC on the 1080ti?


----------



## GreedyMuffin

Quote:


> Originally Posted by *smushroomed*
> 
> What seems to be the average OC on the 1080ti?


2000 to 2100?


----------



## KedarWolf

Quote:


> Originally Posted by *smushroomed*
> 
> What seems to be the average OC on the 1080ti?


Seems to be 2050-2060 core, +450-+550 memory, I get 2062 core, +642 memory under water.


----------



## SperVxo

Well, I have seen a lot of 1950-1999 results; Tech of Tomorrow is one of them.
Among the more enthusiast users there are more at 2050, and some with the shunt mod seem to be over 2050.

Mine was 1970ish, but it hits the power limit in some games, lowering it to 1950. Voltage doesn't seem to do much.

So I would say an average of 2000, maybe =D


----------



## Luda

I've seen it settle in around 2050 in most games, full block, +500MHz memory. But I also saw it pushing 1890MHz without Afterburner running when I first put it in; Pascal boosting is weird.


----------



## sew333

Ok. My stock reference 1080 Ti is not passing the 3DMark Fire Strike stress test. Time to RMA? Temps are 83-84C.


----------



## chibi

Ouch, that's some bad luck - hopefully they have stock available wherever you bought from to fulfill the RMA.


----------



## bfedorov11

Quote:


> Originally Posted by *sew333*
> 
> Ok. My stock 1080 Ti reference not passing 3dmark fire strike stress test. Time to RMA? 83-84 temps


Seems that's how the test is supposed to work; I guess it's the throttling. What speed is it running at? Manually set the fan to 65% or higher.

https://www.futuremark.com/pressreleases/check-your-pcs-stability-with-new-3dmark-stress-tests
Quote:


> After running a 3DMark Stress Test, you will see your system's Frame Rate Stability score. A high score means your PC's performance under load is stable and consistent. To pass the test, your system must complete all loops with a Frame Rate Stability of at least 97%. For more details, please read our 3DMark Technical Guide.
> 
> In the example below, you can see that the system failed the test because its average frame rate dropped from 59 FPS to 51 FPS after the GPU reached its peak temperature. 3DMark shows that the system was not able to maintain a consistent level of performance under load. It had throttled down its performance when hot, suggesting that better cooling may be needed.
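Going by that description, the Frame Rate Stability number is just the worst loop's average FPS as a fraction of the best loop's. A rough sketch of the check (my reading of the quoted page, not Futuremark's actual code):

```python
# Rough sketch of 3DMark's Frame Rate Stability check as described above:
# slowest loop's average FPS divided by the fastest loop's, pass at 97%.
def frame_rate_stability(loop_fps):
    """Slowest loop's average FPS as a percentage of the fastest loop's."""
    return 100.0 * min(loop_fps) / max(loop_fps)

def passes(loop_fps, threshold=97.0):
    return frame_rate_stability(loop_fps) >= threshold

# Futuremark's example: average FPS sagging from 59 to 51 as the GPU heats up.
fps_per_loop = [59, 58, 57, 54, 51]
print(f"{frame_rate_stability(fps_per_loop):.1f}%, pass={passes(fps_per_loop)}")
```

An 8 FPS sag from heat soak lands well under the 97% bar, which is why a thermally-throttling but otherwise healthy card "fails" this test.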


----------



## sew333

Clocks are 1700-1850 in test.


----------



## OldNerd

Quote:


> Originally Posted by *Joshwaa*
> 
> It is a chambered system you run the N2O through the heat exchanger to drop the temps of the water in your loop. You can also use CO2 which would be a lot cheaper.


Hold a can of compressed air upside down and spray it into your rad; cheaper, and fun to watch the frost, lmao. Just don't breathe too much of it.


----------



## Joshwaa

Quote:


> Originally Posted by *OldNerd*
> 
> Hold a can of compressed air upside down and spray into your rad ; Cheaper and fun to watch the frost lmao Just dont breathe too much


I think we have all done that. lol


----------



## Sem

So how do you guys apply your overclocks at start-up?

In the past, if I didn't want to mod the BIOS, I would just create an NVIDIA Inspector shortcut and put it in the start-up folder.

That worked with Windows 7, but it doesn't seem to work with Windows 10.


----------



## fisher6

Quote:


> Originally Posted by *Sem*
> 
> So how do you guys apply your overclocks at start-up?
> 
> in the past if I didn't want to mod the bios I would just create NVI shortcut and put that in the start-up folder
> 
> that was with Windows 7 doesn't seem to work with Windows 10


Use MSI Afterburner.


----------



## sew333

Guys, listen. When I start the Fire Strike stress test I am at 1850MHz core. But when temps hit 83C I am at 1700-1750MHz, the result is 95%, and it says FIRE STRIKE STRESS TEST FAILED. Should I RMA the card or not?


----------



## MURDoctrine

Quote:


> Originally Posted by *sew333*
> 
> Guys listen. When i am starting fire strike stress test i am on 1850 mhz core. But when temps hits 83C i am on 1700-1750mhz and result is 95% and said is FIRE STRIKE STRESS TEST FAILER. Should i rma card or not?


Are you not using a custom fan profile? I never go above 70-75C with mine, and the Fire Strike stress test determines that percentage based on how far the frame rate drops from the first loop. Since you are hitting the 70C+ thermal throttle points of GPU Boost 3.0, you are going to get worse frames the higher the temps go. That is more than likely why your score is that bad. Try locking the fan at ~80% speed and see what it does.


----------



## Radox-0

Quote:


> Originally Posted by *sew333*
> 
> Guys listen. When i am starting fire strike stress test i am on 1850 mhz core. But when temps hits 83C i am on 1700-1750mhz and result is 95% and said is FIRE STRIKE STRESS TEST FAILER. Should i rma card or not?


Someone has already answered you. 83/84 degrees is a normal temp for an FE GTX 1080 Ti (and most NVIDIA cards); they will all climb to that temp and then hold a boost clock that keeps them there. Your initial boost clock of 1850MHz is what you see when the card is cool. By the time you hit 83/84 degrees, the card drops the boost, which corresponds with you seeing 1700-1750MHz or so on the stock fan profile. So the card is behaving as expected.

The frame-rate-stability result is likely just highlighting that the drop in MHz is affecting frames. In reality this is a non-issue: after the initial high clock speed, the card drops down and maintains a particular range. If you really want to stabilise things, raise the fan curve so it stays around 1850MHz.


----------



## sew333

Yes, I am using stock fan settings and no OC.
When temps hit 83C, the core clock is 1700-1750MHz.
And then a 95% fail. Is the stress test not indicating a faulty card?


----------



## bfedorov11

Quote:


> Originally Posted by *sew333*
> 
> Yes i am using stock fan settings and no oc.
> When temps hits 83C core clock is 1700-1750mhz.
> And then 95% fail Stress test is not indicating of faulty card?


No. It isn't a test that checks for that kind of stability. It just means you're leaving a little performance on the table due to heat.

The stock fan profile for the reference card is terrible; it sacrifices performance for noise. If you want higher clocks, you need to manually set the fan with Afterburner. Set the fan to 100% and you'll get a higher score.
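For what it's worth, a custom fan curve is just a temperature-to-duty mapping with interpolation between the points you drag around. A generic sketch (the points are an example aggressive profile, not Afterburner's internals or a recommendation):

```python
# Generic fan-curve sketch: linear interpolation between (temp C, duty %)
# points. The points below are an example aggressive profile for a blower.
POINTS = [(30, 30), (50, 50), (65, 80), (75, 100)]

def fan_duty(temp_c, points=POINTS):
    """Fan duty (%) at a given temperature, interpolated between points."""
    if temp_c <= points[0][0]:
        return points[0][1]
    if temp_c >= points[-1][0]:
        return points[-1][1]
    for (t0, d0), (t1, d1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(40), fan_duty(70))
```

The stock profile effectively uses much flatter points than these, which is why the card sits at 83-84C before the fan really ramps.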


----------



## chibi

Quote:


> Originally Posted by *sew333*
> 
> Yes i am using stock fan settings and no oc.
> When temps hits 83C core clock is 1700-1750mhz.
> And then 95% fail Stress test is not indicating of faulty card?


Not necessarily. The large fluctuation in core speed may look like instability to Fire Strike, when in actuality the GPU is operating as intended with Boost 3.0. Try opening the side panel of your case and pointing a big fan at the GPU to rule out airflow issues. Then run Fire Strike again while monitoring the boost clocks and temps, and post the results.


----------



## GreedyMuffin

1911MHz at stock on my watercooled 1080 Ti. I am testing now at 2025 at 1.000V. There's no point in running 2075 at 1.050V when I am hitting the power limit; I'd rather max out at 95% TDP than spike up to 125%. The difference in power usage is noticeable.


----------



## drkCrix

Very happy to announce that I can finally join the club (after waiting 2 weeks).


----------



## zswickliffe

Just replaced my radiator fans with the BeQuiet! Silent Wings 3 (140mm High-Speed PWM x 2) and they're so quiet!! Not even kidding, it's amazing.

However, now I can hear my 1080 Ti's coil whine... I'll monitor it over the next few days and if it bugs me I'll be talking with EVGA. Luckily they cross-ship


----------



## TWiST2k

When are we gonna get some word from EVGA about the FTW3? Seems like they are lagging a bit. Maybe trying to avoid the 1080 FTW debacle this time, haha.


----------



## CptSpig

Quote:


> Originally Posted by *TWiST2k*
> 
> When are we gonna get some word from Evga about the FTW3, seems like they are lagging a bit. Maybe trying to avoid the 1080 FTW debacle this time haha.


http://www.evga.com/products/product.aspx?pn=11G-P4-6696-KR


----------



## zswickliffe

Quote:


> Originally Posted by *TWiST2k*
> 
> When are we gonna get some word from Evga about the FTW3, seems like they are lagging a bit. Maybe trying to avoid the 1080 FTW debacle this time haha.


*Ahem*

Specs are up...

SC2 - http://www.evga.com/products/product.aspx?pn=11G-P4-6593-KR
SC Black - http://www.evga.com/products/product.aspx?pn=11G-P4-6393-KR
FTW3 - http://www.evga.com/products/product.aspx?pn=11G-P4-6696-KR


----------



## JedixJarf

90 mhz clock boost omg its so amazing. lul.


----------



## TWiST2k

Quote:


> Originally Posted by *zswickliffe*
> 
> *Ahem*
> 
> Specs are up...
> 
> SC2 - http://www.evga.com/products/product.aspx?pn=11G-P4-6593-KR
> SC Black - http://www.evga.com/products/product.aspx?pn=11G-P4-6393-KR
> FTW3 - http://www.evga.com/products/product.aspx?pn=11G-P4-6696-KR


Awesome!! I just checked last night and there were no specs yet!! Thanks for the links guys!!


----------



## dboythagr8

Quote:


> Originally Posted by *zswickliffe*
> 
> *Ahem*
> 
> Specs are up...
> 
> SC2 - http://www.evga.com/products/product.aspx?pn=11G-P4-6593-KR
> SC Black - http://www.evga.com/products/product.aspx?pn=11G-P4-6393-KR
> FTW3 - http://www.evga.com/products/product.aspx?pn=11G-P4-6696-KR


Slightly...disappointed in the FTW3 specs?


----------



## zswickliffe

Quote:


> Originally Posted by *JedixJarf*
> 
> 90 mhz clock boost omg its so amazing. lul.


Agreed it's not amazing, but I'm hoping the cooler temps lead to much more stable clocks without high noise levels.

Disclaimer: I'm not buying it either way lol so it doesn't matter to me, it just seems like my Ti is super sensitive to temps.


----------



## TWiST2k

Quote:


> Originally Posted by *dboythagr8*
> 
> Slightly...disappointed in the FTW3 specs?


Ehh, whatever. I am waiting on the big price reveal; it is going to be the determining factor for me for sure!


----------



## JollyGreenJoint

I see a lot of users blowing wind up each other's skirts, but where's the beef? i.e. a post with a list of downloadable BIOSes? Pics of proper thermal pad placement? Or is this just a fanboy thread?


----------



## pompss

For everyone who is not an expert, looking for answers, or thinking of going under water: a water block will reduce throttling and keep the core clock more stable.

You could gain 20 to 30MHz when benching, and in gaming the max core clock holds more steadily.

Personally, when gaming I'm stable at 2100MHz in some games over a 30-minute test, and pushing further will not gain any substantial FPS at 1440p that would make a big difference.

So if you are thinking of going under water to gain 50MHz or more, you should wait until a BIOS or a mod unlocks the power limit.

Vince already unlocked the power limit, but I would not suggest doing it, since it's a hard mod that is very difficult to pull off.

For all the other reasons, like silence, temps, and reduced throttling, I would say go for it.


----------



## jleslie246

Any tips on finding a 1080ti in stock? I keep missing out. How do i find out when the next batch is being released?

Also, I plan on water cooling for sure. Is the FE the best card for this? Or should I relax and wait for say, EVGA SC model?


----------



## JedixJarf

Quote:


> Originally Posted by *jleslie246*
> 
> Any tips on finding a 1080ti in stock? I keep missing out. How do i find out when the next batch is being released?
> 
> Also, I plan on water cooling for sure. Is the FE the best card for this? Or should I relax and wait for say, EVGA SC model?


this is in stock right now

https://www.newegg.com/Product/Product.aspx?Item=N82E16814487335&nm_mc=AFC-C8Junction&cm_mmc=AFC-C8Junction-Veeralava%20LLC-_-na-_-na-_-na&cm_sp=&AID=10446076&PID=7057735&SID=


----------



## JedixJarf

Quote:


> Originally Posted by *jleslie246*
> 
> Any tips on finding a 1080ti in stock? I keep missing out. How do i find out when the next batch is being released?
> 
> Also, I plan on water cooling for sure. Is the FE the best card for this? Or should I relax and wait for say, EVGA SC model?


and this

https://www.newegg.com/Product/Product.aspx?Item=N82E16814500415&nm_mc=AFC-C8Junction&cm_mmc=AFC-C8Junction-Veeralava%20LLC-_-na-_-na-_-na&cm_sp=&AID=10446076&PID=7057735&SID=


----------



## JedixJarf

Quote:


> Originally Posted by *jleslie246*
> 
> Any tips on finding a 1080ti in stock? I keep missing out. How do i find out when the next batch is being released?
> 
> Also, I plan on water cooling for sure. Is the FE the best card for this? Or should I relax and wait for say, EVGA SC model?


Stock trackers

https://www.nowinstock.net/computers/videocards/nvidia/gtx1080ti/

https://www.zoolert.com/computers/videocards/nvidia/gtx1080ti/


----------



## jleslie246

Quote:


> Originally Posted by *JedixJarf*
> 
> Stock trackers
> 
> https://www.nowinstock.net/computers/videocards/nvidia/gtx1080ti/
> 
> https://www.zoolert.com/computers/videocards/nvidia/gtx1080ti/


Thank you for the quick response! The EVGA one is out of stock even though it says it is in stock; when you try to put it in your cart, it says not in stock. I don't want to order the Zotac card. I prefer EVGA, and EVGA has the trade-up option I may consider with later models.


----------



## JedixJarf

Quote:


> Originally Posted by *jleslie246*
> 
> Thank you for the quick response! The EVGA one is out of stock even though it says it is in stock. When you try to put it in your cart it says not in stock. I dont want to order the Zotac card. I prefer evga, and evga has the trade up option I may consider with later models.


Yeah, EVGA just sold out. Sign up for those alerts and you'll get notified as soon as they are in stock.


----------



## keikei

Quote:


> Originally Posted by *jleslie246*
> 
> Any tips on finding a 1080ti in stock? I keep missing out. How do i find out when the next batch is being released?
> 
> Also, I plan on water cooling for sure. Is the FE the best card for this? Or should I relax and wait for say, EVGA SC model?


In stock right now. https://www.newegg.com/Product/Product.aspx?Item=N82E16814500415&cm_re=1080ti-_-14-500-415-_-Product


----------



## Outcasst

Do you think I'd be able to connect a regular PWM 120mm 2000RPM fan to the onboard connector using an adaptor? Would it be able to power it through that header and control fan speeds?


----------



## DerComissar

Quote:


> Originally Posted by *drkCrix*
> 
> Very happy to annouce that I can finally join the club (after waiting 2 weeks)


Congrats for making it through that waiting list, and finally getting the card!


----------



## duganator

God I'm so excited about this card, can't wait for the custom versions to be released. I'm coming from a 1070 @ 1440p 144hz


----------



## JedixJarf

Quote:


> Originally Posted by *Outcasst*
> 
> Do you think I'd be able to connect a regular PWM 120mm 2000RPM fan to the onboard connector using an adaptor? Would it be able to power it through that header and control fan speeds?


Thats exactly how the hybrid cards work. This is what I used.

https://www.amazon.com/gp/product/B005ZKZEQA/ref=oh_aui_search_detailpage?ie=UTF8&psc=1


----------



## Outcasst

Quote:


> Originally Posted by *JedixJarf*
> 
> Thats exactly how the hybrid cards work. This is what I used.
> 
> https://www.amazon.com/gp/product/B005ZKZEQA/ref=oh_aui_search_detailpage?ie=UTF8&psc=1


Yeah, I was looking at that exact model. I already have an H55 mounted to the card; however, I'd like to be able to spin the fan on the radiator up depending on GPU temperature.

The fan I'd be using would draw 0.3A at 2400RPM maximum. I assume that is easy enough for the header on the card to handle?
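A quick back-of-envelope check suggests it should be fine; fan headers supply 12V, and the ~1A rating below is an assumption for illustration, not a published spec for this card:

```python
# Back-of-envelope power check for driving a 120mm PWM fan from the card's
# onboard fan header. HEADER_MAX_CURRENT is an assumed rating, not a spec.
HEADER_VOLTAGE = 12.0      # V, standard PC fan supply
HEADER_MAX_CURRENT = 1.0   # A, assumption -- check the card's documentation

def fan_power_watts(current_a, voltage=HEADER_VOLTAGE):
    """Power the fan draws at a given current."""
    return voltage * current_a

FAN_MAX_CURRENT = 0.3      # A at 2400 RPM, per the fan's spec above
print(f"{fan_power_watts(FAN_MAX_CURRENT):.1f} W, "
      f"within assumed rating: {FAN_MAX_CURRENT <= HEADER_MAX_CURRENT}")
```

3.6W at full speed is in the ballpark of a typical stock blower, so the header probably copes, but it's still worth confirming the card's actual header rating.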


----------



## JedixJarf

Quote:


> Originally Posted by *Outcasst*
> 
> Yeah I was looking at that exact model. I already have a H55 mounted to the card, however I'd like to be able to spin the fan on the radiator up depending on GPU temperature.
> 
> The fan i'd be using would draw 0.3A at 2400RPM maximum. I assume this is easy enough for the header on the card to handle?


Just use one of these?

https://www.amazon.com/NZXT-Kraken-RL-KRG10-B1-Bracket-Black/dp/B00ITTFO8M


----------



## pez

Quote:


> Originally Posted by *jleslie246*
> 
> Any tips on finding a 1080ti in stock? I keep missing out. How do i find out when the next batch is being released?
> 
> Also, I plan on water cooling for sure. Is the FE the best card for this? Or should I relax and wait for say, EVGA SC model?


Quote:


> Originally Posted by *JedixJarf*
> 
> Stock trackers
> 
> https://www.nowinstock.net/computers/videocards/nvidia/gtx1080ti/
> 
> https://www.zoolert.com/computers/videocards/nvidia/gtx1080ti/


Was going to suggest this. That's how I got lucky enough to get 2 of my 1080s before some people even got their first







.

Still patiently waiting until EVGA announces or puts a standard iCX 1080 Ti on step-up







.


----------



## GreedyMuffin

2050 at 1.000V and +500 on mem; that is completely decent, right?

With that sort of undervolt, max TDP under gaming/folding etc. does not go over 90%. So my card is using about 225 watts under full OC. :-D


----------



## Ayahuasca

Quote:


> Originally Posted by *GreedyMuffin*
> 
> 2050 at 1.000V and 500+ on mem, that is completely decent?
> 
> With tht sort of undervotage, max TDP under gaming/folding etc. does not go over 90%. So my card are using 225 watts under full OC. :-D


How does one undervolt like this?


----------



## gkolarov

Hello Guys,

May I share with you my experience with the new 1080ti FE







I have set a custom fan profile in MSI Afterburner, raised the power and temp limits to the max (120/90), and started to overclock. I got to +120 on the core. I haven't touched the memory yet; I just want to dial in the core first. It is stable so far in benchmarks and real games. The temp goes to 80 degrees and the fan speed to 70%. It is a little loud indeed, but at least it doesn't throttle much and keeps the GPU speed at ~1975MHz. Actually, I have noticed that:

1. the GPU clock starts under load at 2000MHz
2. when the core temp goes to ~60 the core clock changes to ~1984
3. when the temp goes to ~70 the core clock changes once again to ~1976

What do you think, is that normal? I think the card doesn't clock well. Even when I set the fan to 90% I cannot clock the core more than +120; it is not stable. Because the card brand is Asus, is it possible that MSI AB cannot clock it well? I can still return it to the seller within the 14-day period.


----------



## SperVxo

My max is 120 also.


----------



## Ayahuasca

Quote:


> Originally Posted by *gkolarov*
> 
> Hello Guys,
> 
> May I share with you my experience with the new 1080ti FE
> 
> 
> 
> 
> 
> 
> 
> I have set a custom fan profile in MSI afterburner rised the power limit and the temp limits to the max (120/90) and started to overclock. I succed to +120 on the core. Haven't touch the memory yet, just want to see the core first. It is stable so far in benchmarks and real games. The temp goes to 80 degrees and the fan speed to 70%. It is a little loud indeed, but at least doesn't trottles much. Keeps GPU speed at ~ 1975Mhz. Actually I have noticed that:
> 
> 1. the GPU clock starts under load at 2000Mhz
> 2. when the core temp goes to ~60 the core clock changes to ~1984
> 3. when the temp goes to ~ 70 the core clocks changes once again to ~ 1976
> 
> What do you think, is that normal? I think thet the card doesn't clocks well. Even when I speed the fan to 90% cannot clock the core more than +120. It is not stable. Because the card brand is Asus is it possible that MSI AB cannot clock it well ? I stil lcan return it in the 14th day period to the seller.


On air, +120 is my max too. With Pascal, every 10C of temperature costs around 12-13MHz of clock speed.

You need to be under water to hold 2000+MHz stable.
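That step-down behaviour can be modelled roughly as temperature bins. A sketch using observed ballpark figures (the bin start and step size are guesses from observation, not NVIDIA specs):

```python
def boosted_clock(base_boost_mhz, temp_c, bin_start=37, bin_size=10, step_mhz=13):
    """Approximate Pascal GPU Boost clock after thermal step-downs.

    Each full `bin_size` degrees above `bin_start` drops the clock by
    `step_mhz`. All constants are observed ballpark figures, not specs.
    """
    if temp_c <= bin_start:
        return base_boost_mhz
    bins = (temp_c - bin_start) // bin_size
    return base_boost_mhz - bins * step_mhz

# gkolarov's readings for comparison: ~2000 cool, ~1984 at 60C, ~1976 at 70C
for t in (40, 60, 70, 84):
    print(t, boosted_clock(2000, t))
```

With these constants the model lands within ~15MHz of gkolarov's readings; the bin start and step would need per-card tuning, but it shows why cold water holds the top bin.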


----------



## GreedyMuffin

Quote:


> Originally Posted by *Ayahuasca*
> 
> How does one undervolt like this?


Hi!

The curve function in MSI Afterburner, for example.

The downside is that the voltage needs to be locked with "L", making it run at a constant 1.000V. But it is not an issue at all. Definitely worth the 75-watt decrease under load for a 30-watt increase at idle. I do not let my machine idle anyway; it is either folding or gaming, so for me it makes no difference.
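In effect the "L" lock flattens the voltage/frequency curve above the chosen point, so the card never requests more voltage than that. A minimal sketch of the idea (the curve points are made up, not real Afterburner data):

```python
# Sketch of what locking the Afterburner V/F curve does: every voltage step
# above the locked point gets clamped to the locked frequency, so the card
# never boosts past that voltage. Curve values here are invented examples.
vf_curve = {0.900: 1900, 0.950: 1975, 1.000: 2050, 1.050: 2075, 1.093: 2100}

def lock_curve(curve, lock_v):
    """Clamp all points above lock_v to the frequency at lock_v."""
    f_lock = curve[lock_v]
    return {v: (f if v <= lock_v else f_lock) for v, f in curve.items()}

locked = lock_curve(vf_curve, 1.000)
print(locked)  # everything above 1.000V now sits at the 2050MHz point
```

Since power scales roughly with voltage squared times frequency, capping the top of the curve is where most of that ~75W saving comes from.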


----------



## KickAssCop

Where are the AIB cards? Any updates? It is already the 23rd. This whole FE nonsense that NVidia pulls is pissing me off.


----------



## TWiST2k

Quote:


> Originally Posted by *KickAssCop*
> 
> Where are the AIB cards? Any updates? It is already the 23rd. This whole FE nonsense that NVidia pulls is pissing me off.


Agreed! This is what happens when you are unchallenged by competition for too long.


----------



## Swolern

Quote:


> Originally Posted by *keikei*
> 
> In stock right now. https://www.newegg.com/Product/Product.aspx?Item=N82E16814500415&cm_re=1080ti-_-14-500-415-_-Product


WTH?? A $200 tax? Damn!
Quote:


> Originally Posted by *KickAssCop*
> 
> Where are the AIB cards? Any updates? It is already the 23rd. This whole FE nonsense that NVidia pulls is pissing me off.


FE + WC = Win with no wait.


----------



## DennyCorsa86

My SLI of 1080Ti

http://www.3dmark.com/3dm/18787879?


----------



## Kylar182

Hi, I snagged a couple of NVIDIA FEs and I can't seem to find any software that lets me increase the voltage, or the power above 120%. They're under water, so it kind of blows. Anyone have any luck?


----------



## GreedyMuffin

Quote:


> Originally Posted by *Kylar182*
> 
> Hi, I snagged a couple NVIDIA FE's and I can't seem to get any software that lets me increase voltage or Power above 120. They're under water so it kind of blows. Anyone have any luck?


Nope.

Try an undervolt? Then they won't hit the power limit. My card is at 2050 / +500 on mem at 1.000V. The max TDP I've hit is 90% under full usage.


----------



## Anzial

You can't; it's hardware locked. You can bypass it with a shunt mod:
http://www.overclock.net/t/1608437/tutorial-power-target-limit-hardware-mod-shunt-mod-for-titan-x-and-many-other-nvidia-gpus/0_100


----------



## Anzial

Quote:


> Originally Posted by *Swolern*
> 
> WTH?? A $200 tax? Damn!


Third-party reseller (read: scalper) on newegg.


----------



## SperVxo

So I was changing the thermal paste, just because I was bored. It made a slight difference, 2-4C, but what I really wanted to check was the copper shim size for an AIO cooler. The GPU die is 25x20mm, and the shim thickness should be 2mm. That is the measurement from the square metal frame around the GPU up to the top of the stock cooler plate. And the die and that metal frame should end up the same height so you don't overtighten the screws, right? I've seen some say 1.5mm, but that would leave a gap of 0.5mm?


----------



## fisher6

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Nope.
> 
> Try an undervolt? Then they won't hit the power limit. My card is at 2050 / +500 on mem at 1.000V. The max TDP I've hit is 90% under full usage.


Does this give you a more stable overclock? My card is watercooled if that's relevant.


----------



## GreedyMuffin

Mine is as well, EK WB. I have not tested much, but my clocks are at least stable.


----------



## Kylar182

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Nope.
> 
> Try an undervolt? Then they won't hit the power limit. My card is at 2050 / +500 on mem at 1.000V. The max TDP I've hit is 90% under full usage.


Mine hit 115-120% power at 2025 after 5 hours. No, not undervolting; I'm trying to overclock higher. It only hit 42C after 5 hours, so I should have a boatload of headroom. Is there no BIOS unlock on Pascal, or even a voltage/power unlock via software?


----------



## GreedyMuffin

Quote:


> Originally Posted by *Kylar182*
> 
> Mine hit 115-120% power at 2025 after 5 hours. I'm not undervolting, just trying to overclock higher. It only hit 42C after 5 hours, so I should have a boatload of headroom. Is there no BIOS unlock on Pascal, or even a voltage/power unlock via software?


Sadly not.. This is not the 780 Classified era :/


----------



## Kylar182

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Sadly not.. This is not the 780 Classified era :/


Kingpin somehow unlocked his and got 3GHz on his card. So there's a way, just need to find it.


----------



## fisher6

Quote:


> Originally Posted by *Kylar182*
> 
> Kingpin somehow unlocked his and got 3GHz on his card. So there's a way, just need to find it.


That was a hardmod and not something the typical user does.


----------



## GreedyMuffin

Will EK release a single-slot bracket for the 1080 Ti?


----------



## Luck100

Quote:


> Originally Posted by *gstarr*
> 
> Ordered an EVGA GTX 1080 Ti FE in Germany for $633 excl. VAT and incl. shipping. Great price, but one-week shipping time. Now two ways of cooling: wait for the hybrid kit from EVGA (next week?) or a new water-cooling loop with a 280 radiator.


Wow! Who sells for that price in Germany?


----------



## Anzial

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Will EK release a single-slot bracket for the 1080 Ti?


I heard that yes, they are going to do one at some point.


----------



## eXistencelies

Quote:


> Originally Posted by *DennyCorsa86*
> 
> My SLI of 1080Ti
> 
> http://www.3dmark.com/3dm/18787879?


Strong 5.3GHz on that 7700K. What are your voltage and temps like? I'm at 5GHz @ 1.285V, and gaming temps hover in the high 40s to mid 50s.


----------



## SandroX

Quote:


> Originally Posted by *eXistencelies*
> 
> I'm at 5GHz @ 1.285V, and gaming temps hover in the high 40s to mid 50s.


You are a very lucky man.

What cooling do you use?


----------



## CptSpig

Quote:


> Originally Posted by *Kylar182*
> 
> Kingpin somehow unlocked his and got 3GHz on his card. So there's a way, just need to find it.


This is how Kingpin did it, and I don't think anyone else will be doing this modification.

http://s1164.photobucket.com/user/C...1ce700c_1080Tivoltmodded_zpsw2szzqo0.jpg.html


----------



## eXistencelies

Quote:


> Originally Posted by *SandroX*
> 
> You are very lucky man
> 
> 
> 
> 
> 
> 
> 
> What cooling do you use?


Thanks. Full custom loop with 720mm of rad space in a Phanteks Evolv TG, TT Riing fans, 6 intake and 1 exhaust.


----------



## Kylar182

Quote:


> Originally Posted by *CptSpig*
> 
> This is how Kingpin did it, and I don't think anyone else will be doing this modification.
> 
> http://s1164.photobucket.com/user/C...1ce700c_1080Tivoltmodded_zpsw2szzqo0.jpg.html


I understand, but if the BIOS is still reporting the original max voltage and power, then he had to have edited it somehow.


----------



## Kimir

No, you don't need to touch the BIOS when you do it that way at the hardware level.


----------



## Anzial

Quote:


> Originally Posted by *Kylar182*
> 
> I understand, but if the BIOS is still reporting the original max voltage and power, then he had to have edited it somehow.


Kingpin has direct access to Nvidia and EVGA engineers, so sure, he can get his BIOS modded, but the rest of us mere mortals will have to make do with what we're given. The encryption pretty much shuts down the average Joe's BIOS modding.


----------



## Kylar182

Quote:


> Originally Posted by *Kimir*
> 
> No, you don't need to touch the bios when you do it that way on the hardware level.


Yes you do, you always have. It's why you don't have to PCB-mod the 7- and 9-series cards to increase the voltage up to a point. If the BIOS says to send X voltage, then no amount of hard-modding will change it. Some people have decrypted the BIOS, and I wish they'd leak it.


----------



## Kimir

Believe what you want.


----------



## Kylar182

Quote:


> Originally Posted by *Kimir*
> 
> Believe what you want.


It's not belief; I literally just showed my example and proved it, at which point it becomes fact. I understand the confusion, considering modern culture does place a much higher value on feelings vs. facts.


----------



## jarble

Quote:


> Originally Posted by *Kylar182*
> 
> It's not belief, I literally just showed my example and proved it. At which point it becomes fact. I understand the confusion considering modern culture does place a much higher value on feelings vs facts.


Sorry friend, you are very wrong here. These mods bypass the control chips and apply voltage directly.

http://forum.kingpincooling.com/showthread.php?t=3879
"First modification is one to adjust GPU core voltage. This is most popular one by far, and often (but not always, just like in case of Maxwell!) extra voltage can help to get better stability and higher clocks, given no other limitations are hit. Maximum available voltage thru software only control on reference cards are very limited, usually around 1.2V, so that's why hardware mod is required to get over this limit.

GPU power controller, which is now new UPI uP9511P, instead of ON NCP81174 which was used on previous reference GTX 980 and GTX 980 Ti cards. UPI's site is way obsolete and lacks of any useful information, so I could only guess from layout and probing around that this controller supports at least 6 phases, has no digital interface or parallel VID code setting (no array of 6-8 resistors found around to set VID voltage code) and unlikely to have digital interface like I2C. "


----------



## DennyCorsa86

Quote:


> Originally Posted by *eXistencelies*
> 
> Strong 5.3GHz on that 7700k. What is your voltage and temps like? I am 5ghz @ 1.285 and gaming temps hover high 40s to mid 50s.


5300/5000 @ 1.56V under water; temps in the Firestrike CPU test are 76/77 degrees.


----------



## eXistencelies

Quote:


> Originally Posted by *DennyCorsa86*
> 
> 5300/5000 @ 1.56V under water; temps in the Firestrike CPU test are 76/77 degrees.


Damn...are you ok with 1.5v?


----------



## Jbravo33

So are Heaven, Valley, and Firestrike the best tests for the GPU? I have been doing custom runs in Firestrike with just graphics tests 1 and 2 enabled, trying to get the highest score. Should that be the focus? I can clock the core and memory higher, but the score isn't as high.


----------



## Rhadamanthys

So will the custom cards at least have the option to unlock power limit and voltage via modded BIOS?


----------



## GreedyMuffin

Quote:


> Originally Posted by *Rhadamanthys*
> 
> So will the custom cards at least have the option to unlock power limit and voltage via modded BIOS?


Probably not.


----------



## keikei

https://videocardz.com/67542/evga-confirms-gtx-1080-ti-sc-sc2-and-ftw3-specifications


----------



## Rhadamanthys

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Probably not.


Why not? It wasn't an issue with the 980 Ti, was it?


----------



## navjack27

This is Pascal... so no.

It should just be put on the first page of this thread that custom BIOSes, outside of a vendor-specific implementation, won't ever happen.


----------



## hebrewbacon

Quote:


> Originally Posted by *wizardbro*
> 
> Anyone on a full waterblock and a quiet/silent setup have coil whine with this card at 100+ fps and 100% GPU usage?
> 
> My 980ti coil whine is the loudest thing in my PC BY FAR at Max loads, thinking of upgrading.


Yes, I have coil whine and it's under water. It's not very loud, but it is definitely audible and extremely annoying. I initially thought it was my PSU, so I went and got a new one, but the whine persisted, so I'm just ignoring it now. The coil whine only becomes noticeable on load screens when the FPS are over 100; gameplay is generally silent since I don't hit 100fps at 4K.

My 1080s had no coil whine, so I'm pretty irritated.


----------



## JedixJarf

Quote:


> Originally Posted by *hebrewbacon*
> 
> Yes I have coil whine and it's under water. It's not very loud but it is definitely audible and extremely annoying. I initially thought it was my PSU so I went and got a new one but it persisted so I'm just ignoring it now. The coil whine only becomes noticeable on load screens when the FPS are over 100. In game play is generally silent since I don't hit 100fps at 4k
> 
> 
> 
> 
> 
> 
> 
> 
> My 1080s had no coil whine so I'm pretty irritated


I'm the opposite lol. My 1080 had a terrible whine, now no whine.


----------



## Alwrath

Hey guys, I have an EVGA 1080 Ti FE and I can increase voltage in EVGA Precision X. I am registered through EVGA, though.


----------



## CptSpig

Quote:


> Originally Posted by *Kylar182*
> 
> I understand, but if the BIOS is still reporting the original max voltage and power, then he had to have edited it somehow.


Just look at the cables soldered to the board. They go directly to the PSU; that's how he is getting voltage. No BIOS needed.


----------



## richiec77

Finished up the new watercooling loop: monoblock and GPU block. Currently at 2101MHz core @ 1.043V, +250 on the memory; going above that memory speed decreases scores. Still testing, but temps are not going above 41C (23C ambient) after a 1-hour loop of Heaven and then some KF2 gaming for an hour. It seems very stable at 2101 currently, using a +190 core offset.

Going to do some more testing and see where I end up and then do some screen shots and validations.


----------



## Joshwaa

Quote:


> Originally Posted by *richiec77*
> 
> Finished up the new watercooling loop: monoblock and GPU block. Currently at 2101MHz core @ 1.043V, +250 on the memory; going above that memory speed decreases scores. Still testing, but temps are not going above 41C (23C ambient) after a 1-hour loop of Heaven and then some KF2 gaming for an hour. It seems very stable at 2101 currently, using a +190 core offset.
> 
> Going to do some more testing and see where I end up and then do some screen shots and validations.


Lucky you. I had to go with the regular EVO CPU block; EK has not made a monoblock for the MSI Gaming M7 as of yet.


----------



## Slackaveli

Quote:


> Originally Posted by *zswickliffe*
> 
> I think this might be my favorite. I don't dig the LEDs so much but I'm sure you can keep all that off.


Yeah, that's quite a bit more in depth than a shunt mod.
Quote:


> Originally Posted by *CptSpig*
> 
> This is how Kingpin did it, and I don't think anyone else will be doing this modification.
> 
> http://s1164.photobucket.com/user/C...1ce700c_1080Tivoltmodded_zpsw2szzqo0.jpg.html


Quote:


> Originally Posted by *DennyCorsa86*
> 
> 5300/5000 @ 1.56V under water; temps in the Firestrike CPU test are 76/77 degrees.


Damn, son. That chip won't last a year with those volts!


----------



## BucketInABucket

Here's my contribution, got it on Tuesday!


----------



## richiec77

Quote:


> Originally Posted by *Joshwaa*
> 
> Lucky you. I had to go with the regular Evo cpu block. EK has not made a mono block for the MSI Gaming M7 as of yet.


Yeah. The Asus X99 deluxe monoblock is really nice.


----------



## KedarWolf

So, what I've settled on under water is +137 core, +602 memory. If I go higher on the core I get some driver crashes while running the Firestrike stress test; if I go higher on memory I get the occasional artifact.

With these settings: zero driver crashes, no artifacts at all.









Not the best card, but still, I'm happy.


----------



## chibi

Quote:


> Originally Posted by *KedarWolf*
> 
> So, what I've settled on under water is +137 core, +602 memory. If I go higher on the core I get some driver crashes while running the Firestrike stress test; if I go higher on memory I get the occasional artifact.
> 
> With these settings: zero driver crashes, no artifacts at all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not the best card, but still, I'm happy.


What is your final core clock at +137?
I'm just sitting idle right now... waiting on waterblock


----------



## DennyCorsa86

Quote:


> Originally Posted by *Slackaveli*
> 
> Yeah, that's quite a bit more in depth than a shunt mod.
> 
> damn, son. that chip wont last a year with those volts!


I use it daily at 5.1/4.5 @ 1.280V.


----------



## KedarWolf

Quote:


> Originally Posted by *chibi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> So, what I've settled on under water is +137 core, +602 memory. If I go higher on the core I get some driver crashes while running the Firestrike stress test; if I go higher on memory I get the occasional artifact.
> 
> With these settings: zero driver crashes, no artifacts at all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not the best card, but still, I'm happy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What is your final Core count at +137?
> I'm just sitting idle right now... waiting on waterblock
Click to expand...

2037 core, 6106 memory.


----------



## richiec77

Quote:


> Originally Posted by *KedarWolf*


Ah! I lucked out in the lottery, it seems. Full waterblock, getting +190 core, +250 memory: 2101MHz @ 1.043V. But that's the wall. I have to ramp all the way to 1.08V for 2113 and 1.093V for 2126, and at that point I start really hitting the power limit hard. The lower voltage point frees up about 5% power target. You seem to have much better memory than me.


----------



## KedarWolf

Quote:


> Originally Posted by *richiec77*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> 
> 
> 
> 
> Ah! I lucked out in the lottery it seems. Full waterblock getting +190 core +250 memory. [email protected] But that's the wall. Have to ramp all the way to 1.08v for 2113 and 1.093v for 2126. At that point I start really hitting the power limit hard. At the lower voltage point frees up about 5% power target. You seem to have much better memory than me.
Click to expand...

How do you get 1.093V? I tried a custom voltage/frequency curve, but it wouldn't top out the clocks; if I set, say, 2100 at 1.093V, it would stay around 1983 or so and never ramp up. It seems like it doesn't like going higher than the 1.062V it's getting.


----------



## Rhadamanthys

So if I wanna go WC, would I be better off waiting for custom cards with higher core/boost or will these not overclock higher than the FEs under water?


----------



## GreedyMuffin

+500 on mem is not an issue, and neither is +600... it's only past 600 that performance decreases.


----------



## richiec77

Ctrl+F for the curve editor, plus commanding more voltage. It was tricky to do. I would set a 13MHz jump about every other voltage point in the curve, then command more voltage. I had Heaven looping windowed and would keep adding voltage until I saw the clocks jump up. I usually had crashes, and the curve map was a little wonky to use this way, with tons of lag. My first attempt at OC was to just max temp/power and apply more and more offset to the core, letting boost do its thing. That's how I got to 2100. After that I tried the steps above for like 2 hours... lots of work for no gain at all.
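As a toy model of what that curve fiddling amounts to (made-up numbers; Afterburner exposes no scripting API, this just illustrates the offset-plus-flatten idea):

```python
# Toy voltage/frequency curve: (volts, MHz) points like the curve editor shows.
# These values are illustrative only, not read from any real card.
base_curve = [(0.80, 1550), (0.90, 1750), (1.00, 1900),
              (1.05, 1975), (1.093, 2000)]

def apply_offset(curve, offset_mhz, lock_above_v=None):
    """Shift every point up by offset_mhz; optionally flatten the curve
    past lock_above_v so the card stops requesting more voltage."""
    shifted = [(v, mhz + offset_mhz) for v, mhz in curve]
    if lock_above_v is None:
        return shifted
    cap = max(mhz for v, mhz in shifted if v <= lock_above_v)
    return [(v, min(mhz, cap)) for v, mhz in shifted]

for v, mhz in apply_offset(base_curve, 126, lock_above_v=1.05):
    print(f"{v:.3f} V -> {mhz} MHz")
```

Flattening the curve past a chosen voltage point is the usual way to pin a Pascal card at one clock/voltage pair instead of letting boost wander up the table.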


----------



## krutoydiesel

My turn to add my results:
*Nvidia GTX 1080 Ti FE
EKWB Titan X (Pascal) Nickel/Plexi FCWB*

1911mhz stock boost
*2050 mhz 1hr Unigine Heaven stable* - only power limit set to 120%, without upping voltage.

*I am using the STOCK FE backplate that came with the FE cooler as well, with the FE stock screws; the only screws I used from EK were the four that immediately surround the GPU. On the backplate side, I have a couple strips of 1.0mm thermal pads double-stacked (so 2mm thickness) to make sure the backplate sits even towards the middle, where it doesn't screw down.*


----------



## Slackaveli

Quote:


> Originally Posted by *DennyCorsa86*
> 
> I use it daily at 5.1/4.5 @ 1.280V.


ah, ok. i was actually feeling anxiety for you lol.


----------



## Somasonic

Quote:


> Originally Posted by *hebrewbacon*
> 
> Yes I have coil whine and it's under water. It's not very loud but it is definitely audible and extremely annoying. I initially thought it was my PSU so I went and got a new one but it persisted so I'm just ignoring it now. The coil whine only becomes noticeable on load screens when the FPS are over 100. In game play is generally silent since I don't hit 100fps at 4k
> 
> 
> 
> 
> 
> 
> 
> 
> My 1080s had no coil whine so I'm pretty irritated


Set a frame cap with Nvidia Inspector or something and it should solve your problem. I'm guessing you don't need 100 fps in your menus.


----------



## JedixJarf

Quote:


> Originally Posted by *Rhadamanthys*
> 
> So if I wanna go WC, would I be better off waiting for custom cards with higher core/boost or will these not overclock higher than the FEs under water?


If these OC just like the 1080's, then FE will clock the same as aftermarket. Pascal is just silicon lottery.


----------



## chibi

Quote:


> Originally Posted by *krutoydiesel*
> 
> My turn to add my results:
> *Nvidia GTX 1080 Ti FE
> EKWB Titan X (Pascal) Nickel/Plexi FCWB*
> 
> 1911mhz stock boost
> *2050 mhz 1hr Unigine Heaven stable* - only power limit set to 120%, without upping voltage.
> 
> 
> 
> Spoiler: Warning: Spoiler!


Nice results, love your gpu support post.


----------



## krutoydiesel

Quote:


> Originally Posted by *chibi*
> 
> Nice results, love your gpu support post.


Thank you. Without it, nothing is properly level, and my OCD doesn't allow that. The support isn't visible behind the GPU cables once those are installed.


----------



## richiec77

Quote:


> Originally Posted by *chibi*
> 
> Nice results, love your gpu support post.


Quote:


> Originally Posted by *krutoydiesel*
> 
> Thank you, without it, nothing is properly level, my OCD doesn't allow it. The support isn't visible behind the GPU cables once those are installed.


I didn't notice that... so it's a damn good idea! Going to do that myself once I get some PETG and hardline fittings for the next iteration of the loop.


----------



## KedarWolf

Quote:


> Originally Posted by *GreedyMuffin*
> 
> 500+ on mem is not an issue. 600 neither.. only at 600 it decreases performance.


This is +602 memory.



+553



+504


----------



## Somasonic

Quote:


> Originally Posted by *krutoydiesel*
> 
> Thank you, without it, nothing is properly level, my OCD doesn't allow it. The support isn't visible behind the GPU cables once those are installed.


lol, I was going to say it blends in so well with the rest of the build that I didn't even notice it until it was pointed out.

Nice job


----------



## krutoydiesel

Quote:


> Originally Posted by *Somasonic*
> 
> lol I was going to say it blends in so well with the rest of the build I didn't even notice it until it was pointed out
> 
> 
> 
> 
> 
> 
> 
> Nice job


Thank you good sir. Seems no one is willing to create a proper GPU support bracket that doesn't look too gaudy.


----------



## Slackaveli

Quote:


> Originally Posted by *krutoydiesel*
> 
> Thank you good sir. Seems no one is willing to create a proper GPU support bracket that doesn't look too gaudy.


it really is a clean build. very very sexy.


----------



## nrpeyton

I just watched a *"VRM Build Quality"* video analysis of the *1080 Ti Founders Edition.*

Experts are saying that the stock 1080 Ti Founders Edition has a significantly upgraded VRM compared to the Titan X Pascal.

Nvidia says the VRM is rated for 250W, but if you could adequately cool it, it could theoretically take up to 800W before dying.

Overall they were _very_ impressed with the VRM quality and say Nvidia has done a fantastic job on it.

This is definitely a breath of fresh air: _"Nvidia could have gone with fewer MOSFETs or lower quality, but they didn't."_
Here's the *video*; it's extremely interesting. Anyone who owns an Nvidia 1080 Ti FE, or is thinking of getting one, should watch it:


----------



## dunbheagan

I got my Asus 1080 Ti FE for 820€ ($915) incl. tax and shipping a few days ago here in Germany.

I am testing it on the stock cooler right now, but I am planning to put it on an EK waterblock within the next few days. So far I'm satisfied. My aim is 2000/6000, and I got a stable OC at +150/+550 on air. +175/+550 seemed stable at first, but then it crashed after some longer benches. Normally the card boosts to 2000-2050 (cooler on manual 80%); my only concern is the short drops in clock speed, which at demanding scenes go down to 1900 because of the power limit. I hope to smooth that out with the watercooling, but I am not sure how much the cooling affects the power limit. I really would like to see the clocks above 2000 in every situation.

Question to all the guys who already have it on watercooling: can you hold the card above 2000MHz under all circumstances?


----------



## KedarWolf

My MSI log file under water, pump at 65%, Heaven benchmark Extreme preset, fullscreen disabled.

Pretty much the entire time my card keeps clocked at 2063 core except a few glitches near the end. Probably if I put my pump speed higher I wouldn't even get the minor glitches.

Might need to right click, Properties, Unblock the zip file.

HardwareMonitoring.zip 2k .zip file


Edit: This is with an EK waterblock and backplate, plus the CPU and a dedicated PhysX card in the loop on a 360 rad with six fans. I should change the fan curve, though; I think they are only running at 30% or so.

Might do that now.


----------



## Jbravo33

Second card just got in! Waiting for waterblocks. I QUIT! Lol, no more mods!!! I have to say, with the one card installed I played Mass Effect at 3440x1440 on ultra and it looks amazing. My monitor only refreshes at 60Hz; I can only imagine 100 and above.


----------



## dunbheagan

@KedarWolf:
Thanks a lot, very interesting and nice results. If I get similar results on water I will be satisfied.


----------



## KedarWolf

This Afterburner log is Heaven maxed out at system resolution of 4K.

2063 core, once again, just a few minor clock drops at the end which may be due to Heaven's engine and the scenes being rendered.

Right click zip, Properties, Unblock if you need to.

I upped my rad fans to 60%, pump at 65%.

HardwareMonitoring2.zip 3k .zip file


----------



## KedarWolf

Now that I've lowered my memory to +553, because it benches better than +604, I have the core sustained at 2063 in the Heaven benchmark at 4K.









It's Firestrike stress test and Heaven stable.









See above post for MSI Afterburner log file.


----------



## dunbheagan

thanks for sharing, even better!

The highest OC I've got for now on the stock cooler @ 100% fan, for a complete run of the Metro Last Light Redux benchmark, is +175/+560, which results in 2075-1911MHz core and 6066MHz memory (nice number).

Check screenshot for details.


----------



## nrpeyton

Quote:


> Originally Posted by *dunbheagan*
> 
> thanks for sharing, even better!
> 
> The highest OC i got for now on stock [email protected]% for a complete run of Metro Last Light Redux Benchmark ist +175/+560 which results in 2075-1911MHz core and 6066MHz Memory(nice number).
> 
> Check screenshot for details.


Are the drops into the 1900s due to the power limit or the temp limit?


----------



## dunbheagan

It's not temp-limited; temps are below 65°C with the cooler at manual 100%. In the screenshot you can see the power limit flag regularly switching to "1", so I guess power is the main reason for the downclock.

In DOOM the card goes as low as 1936MHz too. So although my card's max overclock is not too bad, it is not possible to keep it above 2000MHz at all times on the stock cooler. But I think that is the case for (almost) all FEs. I hope the watercooler will raise the min clock above 2000 and the low power target will not be too much of a bottleneck. I don't want to do the shunt mod; I think I couldn't sleep well with liquid metal on my Ti...

@KedarWolf:
Have you done the shunt mod on your 1080 Ti or have you just changed the cooler? What's your average TDP at full load?

Here's a screen of a couple minutes of Doom:


----------



## MURDoctrine

Everyone is complaining about overclocks, and I can only get +120/+350 on mine at completely stock voltage, game stable. It has been higher for benchmarks, but I don't care about synthetics. I could probably push the memory more, as I've had it up to +475-500. I'm hoping I can push further once I'm under water, but I normally lose the silicon lottery.


----------



## nrpeyton

Quote:


> Originally Posted by *dunbheagan*
> 
> its not temp limited, temps are below 65°C, cooler at manual 100%. in the screenshot you can see the power limit regulary switching to "1", so i guess power is the main reason for the downclock.
> 
> in DOOM the card goes as low as 1936MHz too. So although my card's max overclock is not too bad, it is not possible to keep it above 2000MHz at all times on stock cooler. But i think that is the case for (almost) all FEs. Hope the watercooler will raise the min clock above 2000 and the low power target will not be to much of a bottleneck. I dont want to do the shunt mod, i think i couldnt sleep well with liquid metal on my Ti...
> 
> @KedarWolf:
> Have you done the shunt mod on your 1080ti or have you just changed the cooler? whats your average tdp at full load?
> 
> Here the screen of a couple minutes Doom:


If temps are below 65C and it's the power limit, then I'd guess going water wouldn't help?

As soon as power draw nears 300W it's downclocking, regardless of whether the temp is 20C or 80C.


----------



## KedarWolf

Quote:


> Originally Posted by *dunbheagan*
> 
> its not temp limited, temps are below 65°C, cooler at manual 100%. in the screenshot you can see the power limit regulary switching to "1", so i guess power is the main reason for the downclock.
> 
> in DOOM the card goes as low as 1936MHz too. So although my card's max overclock is not too bad, it is not possible to keep it above 2000MHz at all times on stock cooler. But i think that is the case for (almost) all FEs. Hope the watercooler will raise the min clock above 2000 and the low power target will not be to much of a bottleneck. I dont want to do the shunt mod, i think i couldnt sleep well with liquid metal on my Ti...
> 
> @KedarWolf:
> Have you done the shunt mod on your 1080ti or have you just changed the cooler? whats your average tdp at full load?
> 
> Here the screen of a couple minutes Doom:


No shunt mod, just the EK waterblock and backplate.

I logged GPU-Z, but for some reason the new version isn't logging Power Consumption even though it's enabled.

Heaven at max settings, 4K system resolution, windowed, seems to average around 108% TDP; watching GPU-Z, it goes as low as 93% and as high as 118%, but hovers mostly around 100-114% depending on the Heaven scene being rendered.

Added the log; it might make more sense to someone else.









GPU-ZSensorLog.txt 134k .txt file
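Since these sensor logs are just comma-separated text, a few lines of Python can summarize them. The "TDP %" column header here is a guess; GPU-Z's exact column names vary between versions, so check the first line of your own log:

```python
import csv
import statistics

def column_stats(path, column="TDP %"):
    """Return (min, mean, max) of a numeric column in a GPU-Z style CSV log.

    The default column name is only an assumption about the log format;
    rows that don't parse as numbers are skipped.
    """
    values = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f, skipinitialspace=True):
            cell = (row.get(column) or "").strip()
            try:
                values.append(float(cell))
            except ValueError:
                continue
    return min(values), statistics.mean(values), max(values)
```

Something like this gives you the min/avg/max TDP at a glance instead of eyeballing the sensor window during a Heaven loop.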


----------



## Baasha

Quote:


> Originally Posted by *KedarWolf*
> 
> 2037 core, 6106 memory.


How did you adjust voltage on the 1080 Ti?

I'd like to try that to get a higher OC. Although I don't have much room temp-wise...


----------



## Swolern

What's up Baasha!
Quote:


> Originally Posted by *dunbheagan*
> 
> its not temp limited, temps are below 65°C, cooler at manual 100%. in the screenshot you can see the power limit regulary switching to "1", so i guess power is the main reason for the downclock.
> 
> in DOOM the card goes as low as 1936MHz too. So although my card's max overclock is not too bad, it is not possible to keep it above 2000MHz at all times on stock cooler. But i think that is the case for (almost) all FEs. Hope the watercooler will raise the min clock above 2000 and the low power target will not be to much of a bottleneck. I dont want to do the shunt mod, i think i couldnt sleep well with liquid metal on my Ti...
> 
> @KedarWolf:
> Have you done the shunt mod on your 1080ti or have you just changed the cooler? whats your average tdp at full load?
> 
> Here the screen of a couple minutes Doom:


Temps will start to throttle on the 1080 Ti @ 60C, no matter what you have the temp slider set at. It's the way Nvidia designed it. So yes, you are temp-limited.


----------



## KedarWolf

Quote:


> Originally Posted by *Baasha*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> 2037 core, 6106 memory.
> 
> 
> 
> 
> 
> How did you adjust voltage on the 1080 Ti?
> 
> I'd like to try that to get a higher OC. Although I don't have much room temp-wise...
Click to expand...

http://www.overclock.net/t/1625653/how-to-get-voltage-slider-in-afterburner-working-on-a-1080-ti/0_20


----------



## Slackaveli

Quote:


> Originally Posted by *Jbravo33*
> 
> Second card just got in! Waiting for waterblocks. I QUIT! Lol no more mods!!! Have to say with the one card installed I played mass effect 3440x1440 on ultra and it looks amazing. My Monitor only refreshes 60Hz can only imagine 100 and above.


Well, then you aren't done just yet. You'll be needing that high-refresh-rate 4K screen now.


----------



## powerincarnate

Just wondering: would you prefer 4K 60Hz or 1440p 100Hz? And lastly, with the option of 3840x1600 (though so far FreeSync only) at 75Hz, would that be the best of both worlds?


----------



## imLagging

I need some advice because I'm going crazy.
I did a few quick runs at +150/+500, 120% on the 1080 Ti, and I am still confused by my results, to say the least.


----------



## gavros777

I have an i7 3770K at 4.6GHz with Titan X Maxwell in SLI and a 4K OLED TV.
Is it worth upgrading both my CPU and GPUs, or is upgrading my CPU and motherboard a waste of money, since at 4K the CPU matters little?
I'm thinking of going 1080 Ti SLI.


----------



## imLagging

Here is another comparison from the Extreme benchmark

I've only tested a few games but I'm so confused. Battlefield 1 with the SLI setup would run a constant 120fps and the 1080 Ti is at 130-145fps with the same settings. Even vram used is similar, before the SLI was at 2.8-3.2gb and the 1080 Ti is at 3.8gb.

Am I expecting too much? The 10-24% increase on graphics test feels like it would of been appropriate for the Maxwell Titan. If I can remember correct, the 980Ti was ~10% over the 780Ti, so why the low numbers a generation after that?

I really want to test this card to it's best potential so that means I need a waterblock and shunt mod to bypass power limits. I was going to wait for AIB boards with better components and power delivery, the more I read I see that Pascal just does not keep scaling with extra voltage and power draw. Also did not want to wait on custom waterblocks so I bought the FE. It's been a while since I upgrade because the SLI setup has practically destroyed anything with SLI support. Plenty of times I read forums and review sites. Vram has never been an issue for me on 2k and 4k and FPS was high enough to drive high refresh rate monitors. I'm not sure what to do besides ask other owners on how they feel because I'm 50% ready to box this card up and return it
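For what it's worth, the uplift being described is easy to put a number on. A quick check (hypothetical helper, just restating the BF1 figures above):

```python
def uplift_pct(old_fps, new_fps):
    """Percentage FPS gain going from old_fps to new_fps."""
    return (new_fps - old_fps) / old_fps * 100

# Constant 120 fps on Titan X SLI vs 130-145 fps on the single 1080 Ti:
low = uplift_pct(120, 130)   # ~8.3 %
high = uplift_pct(120, 145)  # ~20.8 %
print(f"{low:.1f}% to {high:.1f}% faster than the SLI pair")
```

Note the baseline here is the SLI pair, not a single card; against one Maxwell Titan X the same formula would show a much larger jump, which is why single-card reviews look so different from these numbers.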


----------



## Jbravo33

Quote:


> Originally Posted by *Slackaveli*
> 
> well, then you aren't done just yet. You'll be needing that high refresh rate 4k screen now.


Lol, you're right. I'm gonna hold out as long as I can. I'd really like to see a curved 3440x1440 144Hz at some point, and I'd probably take that over 4K. Very taxing hobby.


----------



## c0nsistent

Quote:


> Originally Posted by *imLagging*
> 
> Here is another comparison from the Extreme benchmark
> 
> I've only tested a few games but I'm so confused. Battlefield 1 with the SLI setup would run a constant 120fps and the 1080 Ti is at 130-145fps with the same settings. Even vram used is similar, before the SLI was at 2.8-3.2gb and the 1080 Ti is at 3.8gb.
> 
> Am I expecting too much? The 10-24% increase on graphics test feels like it would of been appropriate for the Maxwell Titan. If I can remember correct, the 980Ti was ~10% over the 780Ti, so why the low numbers a generation after that?
> 
> I really want to test this card to it's best potential so that means I need a waterblock and shunt mod to bypass power limits. I was going to wait for AIB boards with better components and power delivery, the more I read I see that Pascal just does not keep scaling with extra voltage and power draw. Also did not want to wait on custom waterblocks so I bought the FE. It's been a while since I upgrade because the SLI setup has practically destroyed anything with SLI support. Plenty of times I read forums and review sites. Vram has never been an issue for me on 2k and 4k and FPS was high enough to drive high refresh rate monitors. I'm not sure what to do besides ask other owners on how they feel because I'm 50% ready to box this card up and return it


Buy a 2nd 1080 Ti... problem solved.

At least you aren't like me... I have a 1070 I'm using to drive a QNIX 1440p 120hz display. Ordered a 1080 Ti thinking I could drive it with the adapter I have... nope. Now I'm waiting for a DL-DVI to DP adapter, a new monitor, or an aftermarket 1080 Ti...


----------



## Slackaveli

Quote:


> Originally Posted by *Jbravo33*
> 
> lol you're right. Im gonna hold out as long as i can. Would really like to see a curved 3440x1440 144Hz at some point. Id probably take that over a 4k, very taxing hobby.


indeed it is. It would be wise to wait a bit, as some serious upgrades are coming this year in monitor tech.


----------



## keikei

Quote:


> Originally Posted by *gavros777*
> 
> i have an i7 3770k at 4.6ghz with titan x maxwell in sli and a 4k oled tv.
> does it worth upgrading both my cpu and gpus or it's a waste of money to upgrade my cpu and motherboard etc as at 4k the cpu matters little.
> I'm thinking to go 1080ti sli.


The 3770k is chugging along just fine, especially at 4k. Unless you want new tech, I don't see the need to upgrade your cpu/board at the present time.


----------



## Somasonic

Quote:


> Originally Posted by *imLagging*
> 
> Here is another comparison from the Extreme benchmark
> 
> I've only tested a few games but I'm so confused. Battlefield 1 with the SLI setup would run a constant 120fps and the 1080 Ti is at 130-145fps with the same settings. Even vram used is similar, before the SLI was at 2.8-3.2gb and the 1080 Ti is at 3.8gb.
> 
> Am I expecting too much? The 10-24% increase on graphics test feels like it would of been appropriate for the Maxwell Titan. If I can remember correct, the 980Ti was ~10% over the 780Ti, so why the low numbers a generation after that?
> 
> I really want to test this card to it's best potential so that means I need a waterblock and shunt mod to bypass power limits. I was going to wait for AIB boards with better components and power delivery, the more I read I see that Pascal just does not keep scaling with extra voltage and power draw. Also did not want to wait on custom waterblocks so I bought the FE. It's been a while since I upgrade because the SLI setup has practically destroyed anything with SLI support. Plenty of times I read forums and review sites. Vram has never been an issue for me on 2k and 4k and FPS was high enough to drive high refresh rate monitors. I'm not sure what to do besides ask other owners on how they feel because I'm 50% ready to box this card up and return it


I don't know what the comparative performance of these cards should be, but from what you've said we could conclude that your results are about as expected.

If 980 Ti = 780 Ti + 10%

we could suppose that 980 Ti SLI = 780 Ti SLI + 10%

We know that the 1080 Ti is roughly equivalent to 980 Ti SLI, so we can then say

1080 Ti = 780 Ti SLI + 10%

which would be about in line with what you're seeing.

This could all be a steaming pile in the real world, so I'm happy to be corrected
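The chain of suppositions above can be written out as simple arithmetic. A sketch only: the 10% gen-on-gen figure and perfect SLI scaling are assumptions from the post, not measurements:

```python
# Rough relative-performance arithmetic; baseline is one 780 Ti = 1.0.
gtx_780ti = 1.0
gtx_980ti = gtx_780ti * 1.10           # supposed +10% gen-on-gen
sli_scaling = 2.0                      # assumes perfect SLI scaling

gtx_780ti_sli = gtx_780ti * sli_scaling
gtx_1080ti = gtx_980ti * sli_scaling   # "1080 Ti is roughly 980 Ti SLI"

uplift = gtx_1080ti / gtx_780ti_sli - 1
print(f"1080 Ti vs 780 Ti SLI: +{uplift:.0%}")  # -> +10%
```

So under those assumptions a single 1080 Ti ends up only ~10% ahead of 780 Ti SLI, which matches what imLagging is seeing.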


----------



## powerincarnate

Quote:


> Originally Posted by *Slackaveli*
> 
> indeed it is. it would be wise to wait a bit as some serious upgrades coming this year in monitor tech.


Quote:


> Originally Posted by *Jbravo33*
> 
> lol you're right. Im gonna hold out as long as i can. Would really like to see a curved 3440x1440 144Hz at some point. Id probably take that over a 4k, very taxing hobby.


It's all about available bandwidth. HDMI 2.0 and DisplayPort 1.2 have roughly the same bandwidth: 4K at 60Hz is the limit; with 3440x1440, 100Hz is the limit; with 3840x1600, 75Hz is the limit. From there you may add HDR and such, but you can't squeeze much more than that.

3440x1440 at 144Hz would likely require DisplayPort 1.4 or HDMI 2.1, with HDMI 2.1 being a massive increase in bandwidth. Its adoption will mean a fast rise in monitor tech. Keep in mind the current GeForce 1080 Ti doesn't support it, and Vega likely won't either. So, for the foreseeable future, the choices will be one of those three resolutions, OR do what is already done and reduce the resolution in order to increase the refresh rate on some of these gaming monitors. Another option is to do what 5K monitors do: stitch images into one screen by letting you plug two HDMI or DisplayPort cables into one monitor to increase the bandwidth.
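The bandwidth argument can be checked with back-of-envelope math. A sketch under stated assumptions: 24-bit color, a flat ~8% blanking overhead standing in for real CVT-RB timings, and the commonly quoted effective data rates (after 8b/10b line coding) for each link:

```python
# Back-of-envelope link-bandwidth check for the limits quoted above.
LINKS_GBPS = {
    "HDMI 2.0": 14.4,   # 18 Gbps raw, ~14.4 Gbps effective after 8b/10b
    "DP 1.2":   17.28,  # HBR2 x4 lanes, after 8b/10b
    "DP 1.4":   25.92,  # HBR3 x4 lanes, after 8b/10b
}

def required_gbps(width, height, hz, bpp=24, blanking=1.08):
    # blanking=1.08 approximates CVT-RB timing overhead (an assumption)
    return width * height * hz * bpp * blanking / 1e9

for w, h, hz in [(3840, 2160, 60), (3440, 1440, 100), (3440, 1440, 144)]:
    need = required_gbps(w, h, hz)
    fits = [name for name, cap in LINKS_GBPS.items() if cap >= need]
    print(f"{w}x{h}@{hz}Hz needs ~{need:.1f} Gbps -> fits on {fits}")
```

With these numbers, 4K60 and 3440x1440@100 squeeze into HDMI 2.0 / DP 1.2, while 3440x1440@144 needs roughly 18.5 Gbps and only fits on DP 1.4, matching the post.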


----------



## havoc315

Sign me up







Got these bad boys today, replacing two Gigabyte 1070 G1's with EKWB blocks, so now I have to sell those and do the same upgrade to these new ones... I'm getting my EK blocks tomorrow and already have the Coollaboratory Liquid Metal plus Fujipoly 17 W/mK pads to go with them. Just have to mod the HB bridge for the EK blocks. Plus I'm going to eventually do some hardware mods on them too, so they should go great with the rest of the system I'm trying to finish up. I'll try to keep a log if I can.


----------



## KeRo77

Hi all,

Just got my two cards, wondering if this score is normal?

http://www.3dmark.com/fs/12092742


----------



## jonny30bass

Quote:


> Originally Posted by *c0nsistent*
> 
> At least you aren't like me... I have a 1070 I'm using to drive a QNIX 1440p 120hz display. Ordered a 1080 Ti thinking I could drive it with the adapter I have... nope. Now I'm waiting for a DL-DVI to DP adapter, a new monitor, or an aftermarket 1080 Ti...


The DisplayPort to DVI adapter that came with the card didn't work?


----------



## KingEngineRevUp

I need some real help here. I have a Corsair H55 mounted on my 1080 Ti FE. It maxes out at 58C and I'm at +175 on the core (2050 MHz). But sometimes temperatures go to 60C and it throttles to 2038 MHz. This is with fans at 2000 RPM.

I'm debating getting a Kraken X41. Besides lowering fan noise, I might be able to get my temperatures down to 45-50C. But is it worth it? *At +175, what will my core boost to if I can get temperatures to, let's say, 52C? I don't get the math with Boost 3.0.*

Any help and advice would be appreciated.


----------



## lilchronic

Active displayport to dual link dvi adapter in the market place.
http://www.overclock.net/t/1626333/accell-b087b-002b-ultraav-displayport-to-dvi-d-dual-link-adapter-black/0_50


----------



## Jbravo33

Quote:


> Originally Posted by *powerincarnate*
> 
> It's all about available bandwidth. HDMI 2.0 and Displayport 1.2 have roughly the same bandwidth. 4K 60hz is the limit, with 3440 x 1440, 100hz is the limit, with 3840 x 1600, 75 hz is the limit. From there you may add HDR and stuff, but you can't squeeze much more than that.
> 
> 3440 x 1440 at 144 hz would likely require displayport 1.4 or HDMI 2.1, with HDMI 2.1 being a massive increase in bandwidth. Adoption of it will mean fast rise is monitor tech. Keep in mind the current geforce 1080ti doesn't support it, and Vega will likely not as well. So, for the forseable future, the choices will be one of those three resolutions, OR do what is already done and that is reduce resolution in order to increase the refresh rate on some of these other gaming monitors. The 2nd option is to do like they do in 5K monitors and that is allow the stitching of imagings into one screen by allowing you to plug in two HDMI or displayport to one screen in order to increase the bandwidth.


Interesting. So can a 4K 60Hz monitor produce a higher refresh rate at a lower resolution? Not familiar with cable bandwidth.


----------



## imLagging

Quote:


> Originally Posted by *Somasonic*
> 
> I don't know what the comparative performance of these cards should be but from what you've said we could possibly conclude that your results are about as expected.
> 
> If 980 Ti = 780 Ti +10%
> 
> we could suppose that 980 Ti SLI = 780 Ti SLI + 10%
> 
> We know that that the 1080 Ti is roughly equivalent to 980 Ti SLI so we can then say
> 
> 1080 Ti = 780 Ti SLI +10%
> 
> Which would be about in line with what you're seeing.
> 
> This could all be a steaming pile in the real world so I'm happy to be corrected


I've seen so many benchmarks over the past years that I might have hyped my expectations.

Example: http://www.overclock.net/t/1601896/overclockersclub-overclock-showdown-gtx-980ti-vs-gtx-1070-vs-gtx-1080
1440p overclocked:
GTX 1070 is 3% faster than GTX 980 Ti
GTX 1080 is 23% faster than GTX 980 Ti

I remember not picking up the 980 Ti because in many scenarios it was that 5-20% performance increase: worst case a few fps, best case a noticeable difference, and of course the DX12 marketing.

Examples: Fire Strike http://www.overclock.net/t/1406832/single-gpu-fire-strike-top-30 and Fire Strike Extreme http://www.overclock.net/t/1443196/fire-strike-extreme-top-30
Looking at Fire Strike scores in the benchmark sections, I see a 20-25%+ improvement for the 980 Ti,
then another 20-25% from the 980 Ti to the 1080,
and a last 20-25% from the 1080 to the Titan XP.
The Extreme scores are very similar.

I'm trying to understand how, across this large gap of architectural improvements over two generations, SLI 780 Tis are putting up scores close to this 1080 Ti. Could another 50MHz get 5-10% more out of this card? Something just doesn't seem right with my numbers; maybe the custom 780 Tis are very strong and still capable, or the reference 1080 Ti design is really held back by its power limit?


----------



## Slackaveli

Quote:


> Originally Posted by *powerincarnate*
> 
> It's all about available bandwidth. HDMI 2.0 and Displayport 1.2 have roughly the same bandwidth. 4K 60hz is the limit, with 3440 x 1440, 100hz is the limit, with 3840 x 1600, 75 hz is the limit. From there you may add HDR and stuff, but you can't squeeze much more than that.
> 
> 3440 x 1440 at 144 hz would likely require displayport 1.4 or HDMI 2.1, with HDMI 2.1 being a massive increase in bandwidth. Adoption of it will mean fast rise is monitor tech. Keep in mind the current geforce 1080ti doesn't support it, and Vega will likely not as well. So, for the forseable future, the choices will be one of those three resolutions, OR do what is already done and that is reduce resolution in order to increase the refresh rate on some of these other gaming monitors. The 2nd option is to do like they do in 5K monitors and that is allow the stitching of imagings into one screen by allowing you to plug in two HDMI or displayport to one screen in order to increase the bandwidth.


All true, except you don't seem to realize we have DP 1.4 right now on these 1080 Tis.


----------



## tahsin95

Guys, like others have stated, I had a weird hot smell coming from the card. It's not very noticeable; sometimes you just forget about it. I was only able to notice it after a friend standing next to the PC smelled something. It doesn't exactly smell like burning; it's more of a hot hair dryer smell. This only happens when the card hits 84C+. The smell was present initially and I haven't smelled it since, especially after setting up a custom fan profile. I contacted NVIDIA about it, and they told me to see how it goes for the next few days, and if the smell gets worse they'll RMA it.

I honestly don't want to take the risk of RMAing this card, especially since it's out of stock virtually everywhere on the planet. If I send them the card, it'll be a good 1-2 months before I get it back. Anyway, like others have said, I don't notice the smell anymore; it was a first-few-days kind of smell. The closest thing I can compare it to would be a hot hair dryer or a radiator. This is especially true for those with cool ambient temperatures, since a spike in temperature difference (i.e. hot air) is much easier to smell in a cold room than a warm one. If more of you have this issue, contact me personally; I want to discuss it a little more.


----------



## tahsin95

Quote:


> Originally Posted by *SlimJ87D*
> 
> I need some real help here. I have a Corsair H55 mounted to my 1080 Ti FE. It gets max temperatures of 58 and I'm at +175 on the core (2050 Mhz). But sometimes temperatures will go to 60C and I will throttle to 2038 Mhz. This is with fans at 2000 RPMs.
> 
> I'm debating about getting a Kraken x41. Besides lowering fan noise, I might be able to get my temperatures down to 45C to 50C. But is it worth it? *At +175, what will my core boost to if I can get temperatures at 52C lets so. I don't get the math with Boost 3.0.*
> 
> Any help and advice would be appreciated.


Your card throttles back not only when temperatures are high, but also when your GPU power usage is high. Is it going above 100% (or above the 120% limit if you set one)? If so, it will throttle back by lowering clock speeds.


----------



## Dasboogieman

Quote:


> Originally Posted by *imLagging*
> 
> I've seen so many benchmarks in the past years that I might have hyped my expectations.
> 
> example http://www.overclock.net/t/1601896/overclockersclub-overclock-showdown-gtx-980ti-vs-gtx-1070-vs-gtx-1080
> 1440P overclocked:
> GTX 1070 is 3% faster than GTX 980Ti
> GTX 1080 is 23% faster than GTX 980Ti
> 
> I remember not picking up 980 Ti because it was in many scenarios that 5-20% performance increase. Worst case a few fps, best case a noticeable difference, and of course the DX12 marketing.
> 
> examples firestrike http://www.overclock.net/t/1406832/single-gpu-fire-strike-top-30 and firestrike extreme http://www.overclock.net/t/1443196/fire-strike-extreme-top-30
> Looking at firestrike scores in the benchmark sections I see to a 20-25%+ improvement on the 980Ti.
> Then another 20-25% for 1080 to 980Ti
> Last 20-25% for Titan XP to 1080
> The extreme scores are is very similar
> 
> I'm trying to understand how in this large gap of architectural improvements over 2 generations that a SLI 780 Ti is putting up the scores close to this 1080 Ti. Could it be another 50mhz will make 5-10% out of this card? Something just doesn't seem right with my numbers, maybe the custom 780Ti's are very strong and still capable or the reference 1080 Ti design is really held back with it's power draw?


My best guess is two reasons:

1. VRAM bandwidth: the GTX 780 Ti had a much more favourable bandwidth-to-FLOPS ratio, so it was less bottlenecked. Granted, it's not a hard advantage because Pascal has fantastic caching and memory compression, but I suspect it's still a net loss in ratio for Pascal. Most OC tests I've seen show measurable to modest performance improvements from VRAM OC even with GDDR5X, so I suspect there's still performance on tap if they ever do an HBM2 GP102 card.

2. 2x 780 Ti actually has more cores at fairly competitive clocks. While the tile-based rendering of Pascal compensates for the discrepancy via sheer IPC, the combination of IPC + clock speed might, theoretically, not fully compensate.

However, these benchmarks are highly optimized for the specific architectures. In real-world tests you tend to find the GTX 1080 Ti pulling ahead, because its cores are much better utilized in moderately-to-poorly optimized scenarios, even assuming perfect SLI scaling.
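The core-count point can be made concrete with a rough FP32 throughput comparison. The clock figures below are assumptions for typical overclocked cards, and raw TFLOPS ignores IPC, bandwidth, and SLI scaling, so this is only a paper comparison:

```python
# Rough FP32 throughput: cores x clock x 2 (one FMA = 2 FLOPs).
def tflops(cuda_cores, clock_ghz):
    return cuda_cores * clock_ghz * 2 / 1000

gtx_780ti = tflops(2880, 1.10)   # 2880 cores, assumed ~1.1 GHz OC
gtx_1080ti = tflops(3584, 2.00)  # 3584 cores, assumed ~2.0 GHz OC

print(f"2x 780 Ti (perfect SLI): {2 * gtx_780ti:.1f} TFLOPS")
print(f"1x 1080 Ti:              {gtx_1080ti:.1f} TFLOPS")
```

On paper the single 1080 Ti only lands ~13% ahead of two well-clocked 780 Tis, so a small lead in architecture-friendly synthetic benchmarks is not surprising.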


----------



## becks

How much power is this baby juicing out of the system?
I think I might be limited by my PSU.

CPU: Intel Core i7-7700K 4.2GHz Quad-Core Processor @5.2 GHz
Motherboard: Asus MAXIMUS VIII IMPACT Mini ITX LGA1151 Motherboard
Memory: G.Skill TridentZ Series 32GB (2 x 16GB) DDR4-3200 Memory
Storage: Intel 750 Series 400GB 2.5" Solid State Drive
Power Supply: XFX XTR 650W 80+ Gold Certified Fully-Modular ATX Power Supply
+ Aquaero + 6 Noctua Ippc Fans + Water Pump.. + Led's and what not ...


----------



## t1337dude

Does anyone have that chart that shows games that max out 8Gb Vram, 11Gb, 12Gb etc? I remember it just showed examples of a game or two that used a lot of VRAM.


----------



## Anzial

Quote:


> Originally Posted by *becks*
> 
> How much power is this baby juicing out of the system ?
> I think I might be limited by my PSU.
> 
> CPU: Intel Core i7-7700K 4.2GHz Quad-Core Processor @5.2 GHz
> Motherboard: Asus MAXIMUS VIII IMPACT Mini ITX LGA1151 Motherboard
> Memory: G.Skill TridentZ Series 32GB (2 x 16GB) DDR4-3200 Memory
> Storage: Intel 750 Series 400GB 2.5" Solid State Drive
> Power Supply: XFX XTR 650W 80+ Gold Certified Fully-Modular ATX Power Supply
> + Aquaero + 6 Noctua Ippc Fans + Water Pump.. + Led's and what not ...


You should be just fine. You may be hitting thermal/power limits but that's because of the card itself, not the psu.
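A rough headroom estimate backs this up. All wattages below are ballpark assumptions, using the FE's 250 W TDP pushed to ~300 W at the 120% power limit:

```python
# Ballpark system power draw vs PSU capacity (all wattages are estimates).
parts_w = {
    "GTX 1080 Ti @ 120% power limit": 300,
    "i7-7700K @ 5.2 GHz (heavy OC)": 150,
    "motherboard / RAM / SSD": 40,
    "fans, pump, Aquaero, LEDs": 30,
}
total = sum(parts_w.values())
psu = 650
print(f"estimated load: {total} W of {psu} W ({total / psu:.0%})")  # -> 520 W of 650 W (80%)
```

Even with everything pegged, a quality 650 W unit keeps ~20% headroom, so the card's own power limit will bite long before the PSU does.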


----------



## KedarWolf

Quote:


> Originally Posted by *t1337dude*
> 
> Does anyone have that chart that shows games that max out 8Gb Vram, 11Gb, 12Gb etc? I remember it just showed examples of a game or two that used a lot of VRAM.


I don't recall that chart, but if I remember right, Call of Duty: Advanced Warfare and maybe Black Ops 3, when I tested them on my old Maxwell Titan X, used 10-11GB of VRAM, though I think that was mostly just caching textures and such.


----------



## aylan1196

Just done the shunt mod on the two top shunts; left the bottom shunt untouched.
The card holds 2088 on the core and 6000 on the memory with not a single drop in clocks, max TDP 80-86%.
On the EVGA AIO, max temp is 50C until the EK MLC gets released.


----------



## gstarr

Quote:


> Originally Posted by *Luck100*
> 
> Wow! who sells for that price in Germany?


This was on Conrad.de, with membercard + shoop discount.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *tahsin95*
> 
> Your card throttles back not only when the temperatures are high, but also your gpu power usage. Is it going above 100%? (or if you set 120% limit) - it will throttle back by lowering the clock speeds.


My power sits around 115% to 118%, but I've seen it hit 120% every now and then.

What boost are these cards getting if you can get the temperature below 50C?

What is the math for thermal throttling exactly? I know I lose 12 MHz when my temps go from 59C to 60C; it goes from 2050 to 2038.


----------



## Hulio225

Waterblock came yesterday! Installed it and Shunt Modded the card:







2088/12312 (+184/+650)

Temps are in the low 30s °C; custom loop with 2x 480 radiators, fans at the lowest possible speeds, and an ambient of 18°C.


----------



## MrTOOSHORT

^^ good gpu score for a 1080TI

Nice job, card looks good.


----------



## gkolarov

+180 on the core = very good


----------



## mr2cam

I get my card today, now just patiently waiting for EVGA to release the hybrid kit for the FE cards


----------



## gkolarov

It's already released








http://www.evga.com/products/product.aspx?pn=400-HY-5388-B1


----------



## Hulio225

I experimented a bit and checked the boost clocks...

+190MHz to +xxxMHz on the core results in a 2100.5MHz boost clock
+177MHz to +189MHz on the core results in a 2088MHz boost clock
+165MHz to +176MHz on the core results in a 2075.5MHz boost clock
+1xxMHz to +164MHz on the core results in a 2062.5MHz boost clock

So it seems there is no point in going past the plateau that enables a certain boost clock...

Strange how that stuff works, somehow counterintuitive









Edit: added the values for 2100.5MHz and the last value for 2088MHz


----------



## bfedorov11

Did a few late-night winter runs in the garage a couple nights ago. Even with 6°C ambient and 19°C under load, I couldn't go past 2100MHz due to voltage and the power target (mostly FS scene 1). Tested with the 7700K at 5.1GHz. 12GHz memory was best; any more and I lost frames. Forgot to grab screenshots... it was cold and late.

Ultra - http://www.3dmark.com/fs/12091656

7791 - overall
7868 - graphics
44.64 fps - graphics test 1
27.73 fps - graphics test 2

Extreme - http://www.3dmark.com/fs/12091760

14619 - overall
16085 - graphics
85.97 fps - graphics test 1
58.94 fps - graphics test 2

Firestrike - http://www.3dmark.com/fs/12091781

24199 - overall
32828 - graphics
160.06 fps - graphics test 1
128.79 fps - graphics test 2


----------



## Necrocis85

EVGA Founders Edition in stock at Newegg.


----------



## lilchronic

Quote:


> Originally Posted by *Hulio225*
> 
> I just tested a bit around and checked for the boostclocks...
> 
> +190MHz to +xxxMHz on the Core results in 2100.5MHz Boostclock
> +177MHz to +189MHz on the Core results in 2088MHz Boostclock
> +165MHz to +176MHz on the Core results in 2075.5MHz Boostclock
> +1xXMHz to +164MHz on the Core results in 2062.5MHz Boostclock
> 
> So it seems there is no Point in going over the Plateau which enables a certain Boostclock...
> 
> Strange how that stuff works, somehow counterintuitiv
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: added values for 2100.5 MHz and the last Value for 2088MHz


There is an easier way to figure that out: find what your card boosts to with just the power limit and voltage raised.
My card does 1911MHz, so with a +152 core offset I get 2063MHz. Each frequency bin is 13MHz.

Usually my card will boost to 1911MHz but drops to 1898MHz if I don't move the voltage slider up to 100%.

1911MHz + 152 = 2063MHz
1898MHz + 152 = 2050MHz

Just take your max boost clock and add your offset to it, and that should give you the resulting max boost clock.
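That rule of thumb (final clock ≈ stock max boost + offset, snapped to the Boost 3.0 bin grid) can be sketched in a few lines. This is a simplified model: the uniform 12.5 MHz spacing and the 1911 MHz baseline are assumptions, since real Pascal bin tables alternate ~12/13 MHz steps and vary per card, which is also why Hulio225's plateau boundaries won't line up exactly with it:

```python
import math

BIN_MHZ = 12.5      # simplified uniform Boost 3.0 bin spacing (assumption)
STOCK_BOOST = 1911  # max stock boost with power/voltage limits raised

def resulting_boost(offset_mhz):
    """Final boost clock for a given core offset, snapped down to a bin."""
    bins_gained = math.floor(offset_mhz / BIN_MHZ)
    return STOCK_BOOST + bins_gained * BIN_MHZ

# Offsets inside the same bin-wide window land on the same final clock,
# which produces the plateaus Hulio225 measured:
for off in (150, 152, 162, 163, 175):
    print(f"+{off} -> {resulting_boost(off)} MHz")
```

In this model +150 through +162 all land on the same clock, and only at +163 does the card step up a bin, so pushing the offset within a plateau buys nothing.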


----------



## Hulio225

Quote:


> Originally Posted by *lilchronic*
> 
> There is an easier way to figure that out. Figure out what your card boosts too with just power limit and voltage raised.
> My card does 1911Mhz so with a +152 core offset i will get 2063Mhz. Each frequency bin is 13Mhz
> 
> usually my card will boost to 1911Mz but drops to 1898Mhz if i dont move the voltage slider up to 100%
> 
> 1911Mhz + 152 =2063Mhz
> 1898Mhz + 152 = 2050Mhz
> 
> just take your max boost clock and add your offset to that and you should get the max boost clock


Isn't the 1911MHz boost the same for every card if no thermal throttling happens? Because mine seems to be the same^^
And are you sure about the voltage argument in your case? I'm just asking because I tested Afterburner at +100% voltage and it doesn't do anything for me.
Your drop seems to be temperature-related, or am I assuming wrong and you're sure it's the voltage?


----------



## lilchronic

Quote:


> Originally Posted by *Hulio225*
> 
> Isn't the 1911 MHz boost for every card the same if no thermal throtteling happens? Cuz mine seems to have the same^^
> And are you sure with that voltage argument in your case? im just asking because i tested afterburner +100% voltage and it doesn't do anything for me.
> Your drop seems to be temperature related or am i assuming wrong and you are sure its the voltage thing?


My temps are below 40C; I'm water cooled.
If I just leave the voltage at stock, the card will drop from 1.062V to 1.050V and from 1911MHz to 1898MHz; depending on the load on the GPU, it fluctuates back and forth. With the voltage slider at 100% it stays pegged at 1911MHz and 1.075V until I start adding overclocks, and then it starts to throttle from the power limits.


----------



## y2kcamaross

Quote:


> Originally Posted by *gkolarov*
> 
> It's already released
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.evga.com/products/product.aspx?pn=400-HY-5388-B1


Not yet


----------



## JedixJarf

Quote:


> Originally Posted by *Hulio225*
> 
> Waterblock came yesterday! Installed it and Shunt Modded the card:
> 
> 
> 
> 2088/12312 (+184/+650)
> 
> Temps are in the low 30's °C, Custom Loop with 2*480 Radiators, with fans at lowest possible Speeds and Ambient of 18 ° C.


Awesome man! What did you end up using on the shunt resistors?


----------



## Hulio225

Quote:


> Originally Posted by *lilchronic*
> 
> My temps are below 40c im water cooled.
> If i just leave the voltage @ stock the card will drop from 1.062 to 1.050v and 1911Mhz to 1898Mhz depending on the load it puts on the gpu it fluctuates back and forth. With voltage slider @ 100% it stay's pegged @ 1911Mhz 1.075v untill i start to add overclocks and then it will star to throttle from the power limits.


Ok, thanks for the clarification. I'll give that slider another chance; maybe I'll be able to hit the next bin, the magic 2100MHz









Just to be sure: you added the third-party [settings] blablabla command to your Afterburner config, not another method or something I don't know about, right?
Quote:


> Originally Posted by *JedixJarf*
> 
> Awesome man! What did you end up using on the shunt resistors?


thanks







I used CLU.


----------



## KedarWolf

Quote:


> Originally Posted by *Hulio225*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lilchronic*
> 
> My temps are below 40c im water cooled.
> If i just leave the voltage @ stock the card will drop from 1.062 to 1.050v and 1911Mhz to 1898Mhz depending on the load it puts on the gpu it fluctuates back and forth. With voltage slider @ 100% it stay's pegged @ 1911Mhz 1.075v untill i start to add overclocks and then it will star to throttle from the power limits.
> 
> 
> 
> ok thanks for the clarification, ill give that slider another chance, maybe ill be able to hit the next bin of the magic 2100MHz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> just to be sure you added the third party [settings] blablabla command in your afterburner config, not another method or something i dont know about, right?
> Quote:
> 
> 
> 
> Originally Posted by *JedixJarf*
> 
> Awesome man! What did you end up using on the shunt resistors?
> 
> 
> thanks
> 
> 
> 
> 
> 
> 
> 
> i used CLU

http://www.overclock.net/t/1625653/how-to-get-voltage-slider-in-afterburner-working-on-a-1080-ti/0_20


----------



## lilchronic

Quote:


> Originally Posted by *Hulio225*
> 
> ok thanks for the clarification, ill give that slider another chance, maybe ill be able to hit the next bin of the magic 2100MHz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> just to be sure you added the third party [settings] blablabla command in your afterburner config, not another method or something i dont know about, right?


Yeah i did this http://www.guru3d.com/articles-pages/geforce-gtx-1080-ti-review,32.html


----------



## Anzial

The first custom 1080 Ti is on sale: the MSI Armor. Nothing fancy, but it does have a 2-fan cooler and a DL-DVI out.
https://m.newegg.com/products/N82E16814137111


----------



## mr2cam

Quote:


> Originally Posted by *y2kcamaross*
> 
> Not yet


Ya hopefully soon they will have it for sale


----------



## Hulio225

Quote:


> Originally Posted by *lilchronic*
> 
> Yeah i did this http://www.guru3d.com/articles-pages/geforce-gtx-1080-ti-review,32.html


Okay, I did the same. I have the slider, I put in 100 and ran some benchmarks while monitoring VDDC in GPU-Z, and the maximum is 1.0620V, the same as when the slider is at 0.
***? Shouldn't that go up?


----------



## bfedorov11

Quote:


> Originally Posted by *Hulio225*
> 
> okay i did the same, i have the slider, i put in 100 and ran some benchmark while monitoringVDDC on gpu-z and the maximum is 1.0620V the same like if the slider is at 0.
> ***? shouldnt that go up?


You need lower temps. When I was running with the cold outside air, it was maxing out. Even with max voltage, the difference is within the margin of error. My card was failing 50% of the time at 2100 without it, and then it would pass 90% of runs with it; it got me maybe another 10MHz. It wouldn't matter if all Pascal cards were 100% unlocked... you need subzero for more.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *lilchronic*
> 
> My temps are below 40c im water cooled.
> If i just leave the voltage @ stock the card will drop from 1.062 to 1.050v and 1911Mhz to 1898Mhz depending on the load it puts on the gpu it fluctuates back and forth. With voltage slider @ 100% it stay's pegged @ 1911Mhz 1.075v untill i start to add overclocks and then it will star to throttle from the power limits.


Doesn't adding more voltage make you reach the power limit quicker?


----------



## Hulio225

Quote:


> Originally Posted by *bfedorov11*
> 
> Need lower temps. When I was running with the cold outside air, it was maxing out.


Atm I'm at max 33°C.

I can't imagine my voltage isn't going up because of that.

That was after several Time Spy test runs etc.


----------



## zswickliffe

FYI... stolen from EVGA forums.

http://www.evga.com/products/product.aspx?pn=400-HY-5388-B1


----------



## lilchronic

Quote:


> Originally Posted by *Hulio225*
> 
> okay i did the same, i have the slider, i put in 100 and ran some benchmark while monitoringVDDC on gpu-z and the maximum is 1.0620V the same like if the slider is at 0.
> ***? shouldnt that go up?


Yeah, it's weird; depending on the bench I run it will be 1.063V to 1.075V, but it stays at 1911MHz.

With the voltage slider at 0 I see the voltage go from 1.043 to 1.050 to 1.063, and the clocks drop from 1911MHz to 1898MHz.
Quote:


> Originally Posted by *SlimJ87D*
> 
> Doesn't adding more voltage make you reach power limiting quicker?


Yes...

Here is a better explanation of what the voltage slider actually does. From unwinder
http://forums.guru3d.com/showpost.php?p=5297047&postcount=2


----------



## Jbravo33

Quote:


> Originally Posted by *DennyCorsa86*
> 
> My SLI of 1080Ti
> 
> http://www.3dmark.com/3dm/18787879?


Your graphics score of 59k is insane. The highest I can get is 55k. Are you on EK blocks? Also, what are your OC numbers, core and mem?


----------



## DennyCorsa86

Quote:


> Originally Posted by *Jbravo33*
> 
> Your graphics score of 59k is insane. Highest I can get is 55k you on ek blocks? Also what are you oc numbers core and mem?


Yes, I use EK blocks with backplates.
My SLI runs at 2050MHz on the core and 1539MHz on the VRAM; 7700K @ 5300/5000MHz.


----------



## KedarWolf

Quote:


> Originally Posted by *Hulio225*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lilchronic*
> 
> Yeah i did this http://www.guru3d.com/articles-pages/geforce-gtx-1080-ti-review,32.html
> 
> 
> 
> okay i did the same, i have the slider, i put in 100 and ran some benchmark while monitoringVDDC on gpu-z and the maximum is 1.0620V the same like if the slider is at 0.
> ***? shouldnt that go up?

It doesn't go up for me, but it DOES stay at a consistent 1.062 instead of dropping at times. If it drops, your core clocks also drop on the stock voltage curve.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *lilchronic*
> 
> Yeah it's weird depending on the bench i run it will be 1.063v to 1.075v but it stays at 1911Mhz.
> 
> With voltage slider at 0 i see the voltage go from 1.043 to 1.050 to 1.063 and clocks drop from 1911Mhz to 1898Mhz.
> Yes...
> 
> Here is a better explanation of what the voltage slider actually does. From unwinder
> http://forums.guru3d.com/showpost.php?p=5297047&postcount=2


So putting my voltage slider to max is going to give me a higher boost clock? Or does it help avoid throttling? Or both?

Isn't it best to find our maximum OC without overvolting first, like on Maxwell?


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lilchronic*
> 
> Yeah it's weird depending on the bench i run it will be 1.063v to 1.075v but it stays at 1911Mhz.
> 
> With voltage slider at 0 i see the voltage go from 1.043 to 1.050 to 1.063 and clocks drop from 1911Mhz to 1898Mhz.
> Yes...
> 
> Here is a better explanation of what the voltage slider actually does. From unwinder
> http://forums.guru3d.com/showpost.php?p=5297047&postcount=2
> 
> 
> 
> So putting my voltage slider to max is going to put me at a higher boost clock? Or does it help you avoid throttling? Or both?
> 
> Is it not best to try and see our maximum OC without overvolting first like on Maxwell?

The voltage slider will give you a more consistent top voltage, which is 1.062V for me. I have seen it go to 1.075 briefly, but the clocks on the stock voltage curve are the same at both.









On air you're going to throttle from too much heat, but on water you hardly will at all. See my earlier posts with the MSI Afterburner logs: 99% of the time with the voltage slider at 100% on water I'm at 1.062V, 2064 core.









It only throttles a tiny bit because I've never done the shunt mod, and in Heaven a scene or two puts me just a bit over the 120% power limit, I think.


----------



## Hulio225

Just as another example: I've done the shunt mod, and the slider does nothing for me. Whether it's at 0 or 100, I can run 2088 MHz and the voltage never goes over 1.062 V.

Edit: And either way I can't run the next boost level, which would be 2100.5 MHz... so for me the voltage slider story ends here.


----------



## SperVxo

Got my ghetto AIO installation completed today. 48-52°C depending on the game, with rather hot ambients atm.


----------



## sWaY20

Finally broke 10k in time spy


----------



## KedarWolf

Quote:


> Originally Posted by *sWaY20*
> 
> 
> 
> Finally broke 10k in time spy




http://www.3dmark.com/3dm/18827678?

Different CPU though, I'm not sure my graphics score is better, would have to compare.


----------



## KedarWolf

Quote:


> Originally Posted by *sWaY20*
> 
> 
> 
> Finally broke 10k in time spy


Welp, +642 memory, +152 core is a better result. With bad ambient temps I know not to push it more, even on water. Nice result, though.







10856



http://www.3dmark.com/3dm/18828861?


----------



## sWaY20

Quote:


> Originally Posted by *KedarWolf*
> 
> Welp, +642 memory, +152 core is better result. I know with bad ambient temps even on water not to push it more, nice result, though.
> 
> 
> 
> 
> 
> 
> 
> 10856
> 
> 
> 
> http://www.3dmark.com/3dm/18828861?


I'd say it's the CPU getting it that high; I was at +160 core, +600 mem.


----------



## KedarWolf

Quote:


> Originally Posted by *sWaY20*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Welp, +642 memory, +152 core is better result. I know with bad ambient temps even on water not to push it more, nice result, though.
> 
> 
> 
> 
> 
> 
> 
> 10856
> 
> 
> 
> http://www.3dmark.com/3dm/18828861?
> 
> 
> 
> Id say its the cpu getting it that high, i was 160 core 600 mem
Click to expand...

Got 10808 just graphics, not including CPU though.









Edit: But yeah, CPU will push graphics up a bit too.


----------



## Slackaveli

ATTENTION FE owners on air. Today I repasted with Thermal Grizzly Kryonaut. (eBay has 5.5-gram tubes for $21, btw.) It took all of 10 minutes. All you need to remove are two screws from each side (one black, one silver, right by each other), two on top by the fan (silver) to take off the window piece, and then the 4 spring-loaded screws on the bottom. Clean, paste, reassemble.

Results, long story short, are 4-5°C less at idle and 7-8°C less at full load. Hell, I only got 3-4 on my CPU, but the 1080 Ti loves Kryonaut. So in Battlefield 1 online, Ultra everything with unlocked framerate, in 4K, I'm getting ~80-90 fps @ 59-60°C. I have the fan profile at 35% until 57°C, then it goes to 75% (90% at 79°C, but it has never gotten close to that).

So, Kryonaut paste and an aggressive (but still not max) fan profile keeps it where I lose at most 1 turbo bin. Not bad for about $4 worth of thermal paste and 10 minutes of time. ZERO reason to get a 3rd-party 1080 Ti, too. None of them will be under 59°C on air.
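The stepped fan profile described above can be sketched as a tiny lookup function. This is a minimal illustration with the temperatures and duty percentages from the post treated as hypothetical breakpoints; a real Afterburner curve interpolates between points rather than stepping.

```python
# Sketch of the stepped fan profile described above (hypothetical breakpoints,
# not an Afterburner export): 35% below 57C, 75% from 57C, 90% from 79C.
FAN_CURVE = [(57, 35), (79, 75), (999, 90)]  # (upper temp bound in C, fan duty %)

def fan_speed(temp_c):
    """Return the fan duty % for a given GPU temperature in C."""
    for upper, duty in FAN_CURVE:
        if temp_c < upper:
            return duty
    return 100  # failsafe: max fan beyond all defined bounds

print(fan_speed(50))  # 35 (quiet idle/light load)
print(fan_speed(60))  # 75 (aggressive gaming step)
print(fan_speed(85))  # 90 (never reached per the post)
```

The step at 57°C is what keeps the card from ever settling above the low 60s: the fan jumps hard right where the card would otherwise start shedding boost bins.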


----------



## Slackaveli

Quote:


> Originally Posted by *powerincarnate*
> 
> Just wondering: would you prefer 4K 60 Hz or 1440p 100 Hz? And lastly, with the option of 3840x1600 (though so far FreeSync only) at 75 Hz, would that be the best of both worlds?


Short answer: both. I play Battlefield, Day of Infamy, and Rainbow Six Siege on my 1440p/144. Everything else on the giant 4K HDR/60.

*****
I can't remember who recommended the Noctua industrial 3000 RPM PWM static-pressure fan, but man, oh man, that thing is a beast. It alone gave me 3 degrees less than a two-fan Thermaltake push/pull setup did. Then the Thermal Grizzly went on, and both my GPU and CPU are gaming in the 50s under max 4K loads. And my CPU is at 4.3 from 3.3, at 1.38 V vcore. Still running icy now.


----------



## Joshwaa

That was me with the fans, I did the Kryonaut also. EK block goes on tomorrow.


----------



## Slackaveli

Quote:


> Originally Posted by *Joshwaa*
> 
> That was me with the fans, I did the Kryonaut also. EK block goes on tomorrow.


Ah, yes, there you are. I love that thing. I even like it when it ramps up; sounds awesome.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> ATTENTION FE owner's on air. Today I did a repasting with Thermal Grizzly Kryonaut. (Ebay has 5.5gram tubes for $21, btw) Took all of 10 minutes. All you need are two screws off from each side (one black, one silver right by each other), two off right on top by the fan (silver) to take off the window piece, and then remove the 4 spring loaded screws on the bottom. Clean, paste, re-assemble.
> 
> Results, long story short, are 4-5c less on idle and 7-8c less on full load. Hell, I only got 3-4 on my cpu, but, the 1080ti loves Kyronaut. So, on Battlefiled One online, Ultra everything with unlocked framerate , in 4k, im getting ~80-90fps @ 59-60c. I have fan profile at 35% until 57c then it goes to 75% (90% at 79c, but it has never gotten close to that).
> 
> So, Kryonaut paste and an aggressive (but still not max) fan keeps it where at most I lose 1 turbo bin. Not bad for about $4 worth of thermal paste and 10 minutes time. ZERO reason to get a 3rd party 1080ti, too. None of them will be under 59c on air.


I wonder if thermal grizzly would be worth it for me if I used Noctua... Maybe like 1C better lol. Maybe not worth the time to repaste again.


----------



## Radox-0

Quote:


> Originally Posted by *Slackaveli*
> 
> ATTENTION FE owner's on air. Today I did a repasting with Thermal Grizzly Kryonaut. (Ebay has 5.5gram tubes for $21, btw) Took all of 10 minutes. All you need are two screws off from each side (one black, one silver right by each other), two off right on top by the fan (silver) to take off the window piece, and then remove the 4 spring loaded screws on the bottom. Clean, paste, re-assemble.
> 
> Results, long story short, are 4-5c less on idle and 7-8c less on full load. Hell, I only got 3-4 on my cpu, but, the 1080ti loves Kyronaut. So, on Battlefiled One online, Ultra everything with unlocked framerate , in 4k, im getting ~80-90fps @ 59-60c. I have fan profile at 35% until 57c then it goes to 75% (90% at 79c, but it has never gotten close to that).
> 
> So, Kryonaut paste and an aggressive (but still not max) fan keeps it where at most I lose 1 turbo bin. Not bad for about $4 worth of thermal paste and 10 minutes time. ZERO reason to get a 3rd party 1080ti, too. None of them will be under 59c on air.


Yeah, but aftermarket cards won't need to sit at a 75% fan profile to maintain those sorts of temps.

Impressive results, though. I never got anything so significant applying Kryonaut to my Titan Xs and 1080s; it shaved 2 or so degrees off, but nothing major. Will give it a try, I think.


----------



## Slackaveli

Yeah, I don't know what NVIDIA used for paste, but it was a nice bump for sure. I really love working with the Thermal Grizzly, too. It spreads so easily and smoothly.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah I dont know what nvidia used for paste, but it was a nice bump for sure. I really love to work with the thermal grizzly, too. It spreads so easily and smoothly.


Really?! You're the first to say that; from what I hear it spreads like half-dried clay. lol


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Really?! You're the first to say that, from what I hear it spreads like half-dried clay. lol


Wow. You sure it was Kryonaut? Because it was super easy: it spread very thin to all corners in very few strokes, and didn't resist or fall over the edge. Easy peasy.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> wow. you sure it was kyronaut? b/c it was super easy. spread very thin to all corners in very few strokes, didnt resist or fall over the edge. easy peasy.


Yeah, it was. But I've personally never used the stuff.









Though I'm wondering if it's worthwhile to get some and redo the TIM on my card and my CPU. I have MX-4 on both.


----------



## bfedorov11

I would say Kryonaut is in the middle as far as spreading goes. Unless a paste goes on like butter, it is always best to heat the tube up with a hair dryer or run it under hot water, though Kryonaut is workable without heat. That's my spread before the block went on... I didn't use heat, though.


----------



## iamjanco

Just got this from EVGA in my mailbox:



Edit: mods, I hope it was okay to post that.


----------



## alucardis666

Quote:


> Originally Posted by *iamjanco*
> 
> Just got this from EVGA in my mailbox:
> 
> 
> 
> Edit: mods, I hope it was okay to post that.


Good stuff!









Had I been patient and not gone FE, that's the card i'd have gone with!


----------



## krutoydiesel

Quote:


> Originally Posted by *chibi*
> 
> Nice results, love your gpu support post.


Did some more overclocking, this time on the memory. I went up to +500 MHz but didn't see any gains in Firestrike, so I brought it back to +300 and ran Heaven for a little over 2 hours, extreme preset with 1440p settings.

*Final results
OVERCLOCKING*
+150 on GPU core clock = 2050 MHz stable
+300 on mem clock = 5805 MHz stable
43°C max temp during Unigine Heaven for 2 hours at max preset (EKWB full block)

*FIRESTRIKE RESULTS*
21584 in Firestrike overall
29980 Graphics score
14185 Physics score
9312 Combined Score

*STABLE AF*
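The offset arithmetic behind numbers like "+300 = 5805 MHz" is simple to sketch. A minimal illustration, assuming (as the post implies) Afterburner's reported stock memory clock of 5505 MHz on the FE, with the effective GDDR5X data rate being double the reported clock; the function names are hypothetical, not any tool's API.

```python
# Sketch of the memory-offset arithmetic, assuming the 1080 Ti FE's
# Afterburner-reported stock GDDR5X clock of 5505 MHz (11,010 MT/s effective).
STOCK_MHZ = 5505

def mem_clock(offset_mhz):
    """Reported memory clock for a given Afterburner offset (MHz)."""
    return STOCK_MHZ + offset_mhz

def effective_rate(offset_mhz):
    """Effective GDDR5X data rate in MT/s (double the reported clock)."""
    return 2 * mem_clock(offset_mhz)

print(mem_clock(300))       # 5805 -- matches the "+300 = 5805 MHz" result above
print(effective_rate(300))  # 11610
```

So a +300 offset is roughly a 5.5% memory bandwidth bump, which is why gains flatten out quickly once other limits (power, error correction) come into play.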


----------



## Exilon

How's the default 120% power limit without the shunt mod? Something to worry about or a non-issue? (On water.)


----------



## Mato87

I have a question for you folks. When the non-reference 1080 Ti cards arrive, I plan on playing Quantum Break after I get one. Can someone do a test or a little benchmark with Quantum Break? I prefer to play at 1080p with everything on ultra and upscaling turned off (obviously), and with my GTX 980 Ti I get around 40-45 fps max, in combat scenes even below 40. Can someone test for me how it plays and performs on a 1080 Ti, please? Thanks. The important thing is to leave the ugly, muddy, blurry upscaling option turned OFF.

Oh, and please test Mafia 3 as well, thanks.


----------



## KedarWolf

Quote:


> Originally Posted by *sWaY20*
> 
> 
> 
> Finally broke 10k in time spy


All rad fans, system fans and pumps at 100% for the bench run; +175 core, +659 memory; my 32GB CL14 Ripjaws V kit in at 13-14-13-29 1T 3228 MHz; 5960X CPU at 4.742 GHz, cache at 4.440 GHz.

The original run had my 128GB kit in at 2690 MHz, same CPU clocks.

Time Spy 10880, Graphics Test 10853.

http://www.3dmark.com/3dm/18838458?


----------



## GRABibus

Quote:


> Originally Posted by *KedarWolf*
> 
> All rad fans, system fans and pumps at 100% for bench run, +175 core, +659 memory, my 32GB CL14 Ripjaws 5 kit in at 13-14-13-29 1T 3228 MHZ, 5960x CPU at 4.742GHZ cache at 4.440 GHZ.
> 
> Original run had my 128GB kit in at 2690 MHZ, same CPU clocks.
> 
> Time Spy 10880, Graphics Test 10853.
> 
> http://www.3dmark.com/3dm/18838458?


Why is 3Dmark detecting your card as a TITAN X ?


----------



## Cybertox

Mine arrived today


----------



## pantsoftime

Quote:


> Originally Posted by *Exilon*
> 
> How's the default 120% power limit without shunt mod? Something to worry about or an non-issue? (On water)


On my card it's pretty miserable. It drops me from 2050 all the way down to 1900 sometimes. I'm on water and temps are at or below 43°C most of the time.


----------



## sWaY20

Quote:


> Originally Posted by *GRABibus*
> 
> Why is 3Dmark detecting your card as a TITAN X ?


Because that was with his Titan X.


----------



## PasK1234Xw

What is with all the crazy VRAM speeds? Are you guys actually seeing improvements, or just assuming it's stable because it's not crashing or artifacting? Unlike Maxwell, Pascal's GDDR5X uses ECC, and even if it doesn't crash or artifact, performance will be held back by error correction on unstable clocks.

Start low and work your way up until you see no further performance improvements.

Quote:


> Originally Posted by *pantsoftime*
> 
> On my card it's pretty miserable. Drops me from 2050 all the way down to 1900 sometimes. I'm on water and temps at or below 43 most of the time.


Set voltage +100


----------



## GRABibus

Quote:


> Originally Posted by *sWaY20*
> 
> bc that was with his Titan x


?

The screenshot shows a 1080 Ti in GPU-Z...


----------



## Joshwaa

For those who went with a water block. Are the white pads on a couple parts insulators or thermal pads?


----------



## KickAssCop

Finally got around to doing some benches. CPU at 4.4, though; not sure why, as maybe I forgot to set it back to 4.5 the last time I changed out some stuff.

Timespy: http://www.3dmark.com/spy/1442248 - 9678

Firestrike Ultra: http://www.3dmark.com/fs/12114327 - 7485

Firestrike: http://www.3dmark.com/fs/12114384 - 20268 (looks low)

Card at +150/+500. Ran without a hitch.


----------



## KedarWolf

Quote:


> Originally Posted by *GRABibus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> All rad fans, system fans and pumps at 100% for bench run, +175 core, +659 memory, my 32GB CL14 Ripjaws 5 kit in at 13-14-13-29 1T 3228 MHZ, 5960x CPU at 4.742GHZ cache at 4.440 GHZ.
> 
> Original run had my 128GB kit in at 2690 MHZ, same CPU clocks.
> 
> Time Spy 10880, Graphics Test 10853.
> 
> http://www.3dmark.com/3dm/18838458?
> 
> 
> 
> 
> 
> Why is 3Dmark detecting your card as a TITAN X ?
Click to expand...

I have my 1080 Ti driving my primary screen, a 4K G-Sync monitor, and an older Maxwell Titan X as dedicated PhysX driving my second screen, a QNIX 1440p. A DP-to-DVI adaptor doesn't work on my second QNIX screen.

So 3DMark detects both cards, but only the 1080 Ti is rendering. The old Titan X doesn't even go above minimum clocks during Time Spy, etc.









Edit: If you expand the secondary card you'll see my 1080 Ti, but it IS my primary card. Not sure why 3DMark detects it as secondary. I render 3DMark on the 4K 1080 Ti screen with G-Sync turned off.


----------



## PasK1234Xw

Quote:


> Originally Posted by *KickAssCop*
> 
> Finally got around to doing some benches. CPU at 4.4 though, not sure why as maybe I forgot to set it to 4.5 last time I changed out some stuff.
> 
> Timespy: http://www.3dmark.com/spy/1442248 - 9678
> 
> Firestrike Ultra: http://www.3dmark.com/fs/12114327 - 7485
> 
> Firestrike: http://www.3dmark.com/fs/12114384 - 20268 (looks low)
> 
> Card at +150/+500. Ran without a hitch.


Yeah, your Firestrike score is low; even at stock speeds the graphics score should be around 28k, and around 30-31k with +150 on the core.


----------



## KickAssCop

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Yea firestrike score low even for stock speeds should be around 28k on graphics score at stock and around 30-31k with +150 on core.


Firestrike http://www.3dmark.com/3dm/18846138? - 22517.


----------



## powerincarnate

My 1080 Ti is coming Monday. I also took the time to buy the various water-cooling parts for both the GPU and CPU; it was actually more expensive than I thought. I already had a Corsair H110 (sold it to my nephew), and starting a new loop costs more than if I had just tried the stock cooler to see how the noise and temps were, or gone with one of the non-reference designs, or something like the EVGA hybrid cooler.

At least this should be future-proof, since I shouldn't have to buy radiators or CPU blocks for a long time, and the 1080 Ti should serve me well for a few years.

Any tips on how I should mount the reservoir/pump combo? I have a CaseLabs case.


----------



## lilchronic

Quote:


> Originally Posted by *PasK1234Xw*
> 
> What is with all the crazy VRAM speeds are you guys actually seeing improvements or just assuming its stable because its not crashing or artifact?
> Unlike Maxwell pascal's *DDR5x uses ECC* and even if it don't crash or artifact it will held back performance due to error correcting due to unstable clocks.
> .
> Start low and work your way up till you see no performance improvements.
> Set voltage +100


Got a source that says that?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *lilchronic*
> 
> Got a source that says that?


I don't know if there's any article written about it, but I kind of believe him, since I get better performance at +400 than at +450 or +500.


----------



## jleslie246

Any news on when the other EVGA cards will release? I might wait for the others since they'll have a DVI output; I run 4 monitors, so I'd prefer the extra DVI output.


----------



## pompss

Is it safe to disable ASUS Anti-Surge when overclocking?

I'm getting random resets when gaming with dual GTX 1080 Ti SLI at 2000 MHz and the CPU overclocked to 4.4 GHz; the PSU is a Seasonic Snow Silent 1050W.

Anyone have the same experience?


----------



## PharmingInStyle

Would like to see more head-to-head OC tests of the 1080 Ti vs the Titan XP. I don't OC, but if I saw the OCed XP with even slightly higher fps in games overall, it would make me feel better; an emotional thing, considering I bought an XP a few months ago. Also, I read in this thread or elsewhere that we still need to wait for the right custom BIOS for the Ti to work best? Do the custom Tis coming out already have the BIOS well in hand? Assuming I'm understanding this correctly.

i7 6700 3.4ghz || Titan X Pascal || Win 10 Home || AsRock H110M-DGS || DDR4 Crucial 16gb
Toshiba HD 1tb || Onboard Sound || Cooler Master 750W || 32" Samsung 4k Lcd 3840 x 2160


----------



## Silent Scone

Quote:


> Originally Posted by *PharmingInStyle*
> 
> Would like to see more head to head OC tests of the 1080 Ti vs Titan XP. I don't OC but if I saw the XP OCed with even slightly higher fps in games overall it would make me feel better. An emotional thing considering I bought an XP a few months ago. Also I read in the thread or on OC that we still need to wait for the right custom bios for the Ti to work best? Do the custom Tis coming out already have the bios well in hand? Assuming I'm understanding this correctly.
> 
> i7 6700 3.4ghz || Titan X Pascal || Win 10 Home || AsRock H110M-DGS || DDR4 Crucial 16gb
> Toshiba HD 1tb || Onboard Sound || Cooler Master 750W || 32" Samsung 4k Lcd 3840 x 2160


I'll hopefully be doing some comparisons against the Titan with a Strix OC soon.


----------



## dboythagr8

GTA 5 maxed everything is still a killa. CPU and GPU(s) both getting worked hard.

Literally put everything at its max: shadows, TXAA, pop density, etc. At 1440p I was still able to hold 60 fps (barely) on a single card. At 4K, though? The Ti got ate up; it averaged 45 fps. Granted, that's still playable, but it was working hard. SLI performance is good at 4K though, averaging in the mid-80s on those same settings.

Ended up playing for about an hour @ 5120x2880 DSR to 1440p. Still kept everything maxed, and SLI gave me an average of 77 fps with no performance issues. Great performance and crazy IQ.


----------



## lilchronic

Quote:


> Originally Posted by *SlimJ87D*
> 
> I don't know if there's any article written about it but I kind of believe him since I get better performance at +400 than +450 or +500.


Who?


----------



## keikei

This card can handle The Division @ 5K. Granted, with some video tweaking, but the jump in res really benefits the game visually due to the city architecture. Amazing. I'll be playing a lot more of the game now.


----------



## bfedorov11

Quote:


> Originally Posted by *lilchronic*
> 
> Got a source that says that?


There has never been anything official about it. It isn't specifically ECC, but it is similar; search the 1080 owners' thread for "error correction" or "ecc." No doubt though, every single card (at least with GDDR5X) has a number, and if you go over it, you will lose frames.


----------



## dboythagr8

Quote:


> Originally Posted by *keikei*
> 
> This card can handle The Division @5k. Granted with some vid tweakings, but the jump in res really benefits the game visually due to the city architecture. Amazing. I'll be playing a lot more of the game now.


What's your performance look like at 5K?


----------



## lilchronic

Quote:


> Originally Posted by *bfedorov11*
> 
> There has never been anything official about it.. it isn't specifically ECC, but it is similar. Search the 1080 owners' thread for "error correction" or "ecc." No doubt though, every single card (at least with gddr5x) has number if you go over, you will lose frames.


This is what I think, but I could be wrong:
when you overclock the memory it draws a lot more power, so you start hitting the power limit harder, making the card throttle core clocks even more and thus lowering your score.

I ran some tests from +300 to +600; all showed improvement when not hitting the power limit, though @ +600 it sure did come close.

+300 mem

+400

+500

+600

+700
Blackscreen

+650


So that's what I thought before I ran the tests, and now it just confirms it for me...


----------



## keikei

Quote:


> Originally Posted by *dboythagr8*
> 
> What's your performance look like at 5K?


It varies from 45 to 60 fps, with most things on low to medium settings. If I set everything to low, I'd bet it would be closer to 60 most of the time. I don't mind the frame hit because I like the AO and shadows. During firefights it does drop to around 45 fps. Very much playable. Still tweaking.


----------



## Hulio225

Quote:


> Originally Posted by *lilchronic*
> 
> This is what i think but i could be wrong..
> When you are overclocking the memory it draws a lot more power so you start hitting the power limit harder making the card throttle core clocks even more thus lowering your score.
> 
> ....
> 
> So that's what i thought before i ran the test's and now it just confirms it for me.......


I did RAM testing as well; I used Time Spy Graphics Test 1 and compared different VRAM speeds.
I can go up to +680, probably more, but tbh I don't care, because:



What I am curious about is why I sometimes get outbursts that give me the underlined higher FPS numbers, while the others are always pretty close in line with each other.


----------



## lilchronic

Quote:


> Originally Posted by *Hulio225*
> 
> I did ram testing aswell, i used Time Spy Graphics Test 1 and compared with different vram speeds....
> i can go up to +680 probably more but i don't care tbh cuz:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> what i am curious about is, why i have sometimes outbursts which are giving me the underlinded higher FPS numbers than the others always pretty close in line to each other.


Probably hitting the power limit even with a shunt mod; Time Spy draws a ton of power.


----------



## Hulio225

Now I've run into a problem for the second time.
I have a separate Windows 10 installation basically just for Time Spy; it's a stripped-down Windows 10 with services, features and all of that stuff disabled.
I made one a week ago and benched the **** out of it, getting my scores; a couple of days later I was constantly getting ~300 points less.
So I made another fresh install, stripped Windows down, etc., and disabled Windows Update because I thought maybe that was the issue...
For the last 3 days I benched the **** out of my card and everything was fine; my scores were never below ~10040 and went up to 10119.
But a couple of minutes ago I started benching again on that system and I'm once again getting ~300 points less than normal, every time.
I have no idea what that crap is... I haven't changed a thing, tbh. I just load my BIOS profile, boot into that Windows 10 Time Spy system, and run my tests.

Has anyone an idea what this could be?


----------



## pantsoftime

Quote:


> Originally Posted by *lilchronic*
> 
> This is what i think but i could be wrong..
> When you are overclocking the memory it draws a lot more power so you start hitting the power limit harder making the card throttle core clocks even more thus lowering your score.


It's quite well documented in the 1080 thread that memory speeds are impacted by both error correction and specific frequency. You will find that adjusting memory speeds even by 5 MHz in either direction will have an impact on your scores. There's a bit of a hill and valley effect over the course of a ~25MHz window.

There is also error correction that starts to kick in as your memory starts to hit its limits. It will cause a dip in scores when it starts to engage. This is a feature of GDDR5X that is not present in traditional GDDR5.

You're not wrong about memory power eating into your power limit, but that's secondary to other factors.

Open up heaven and pause it in a location where there's minimal movement going on in the background (like when it's staring at the cobblestone path). Watch your FPS indicator and adjust your memory speed by 5MHz. You'll see the FPS counter move around by a couple of percent each time. Use this as a tool to dial in your best overclock.
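That dial-in procedure amounts to a tiny search over offsets. A minimal sketch of the bookkeeping side, with made-up FPS readings standing in for what you would actually note down from the paused Heaven counter; the hill-and-valley shape in the placeholder data mirrors the effect described above.

```python
# Sketch of the dial-in method above: record the paused-Heaven FPS you read
# off at each 5 MHz memory offset, then pick the peak. The FPS values here
# are made-up placeholders, not measurements.
fps_by_offset = {
    480: 101.2,
    485: 102.0,
    490: 102.6,  # top of the "hill"
    495: 101.9,  # starting back down
    500: 99.8,   # error correction visibly eating frames
}

# The best offset is simply the one with the highest observed FPS.
best_offset = max(fps_by_offset, key=fps_by_offset.get)
print(best_offset, fps_by_offset[best_offset])  # 490 102.6
```

The point of the paused scene is that it removes run-to-run variance, so differences of a couple of percent between adjacent 5 MHz steps are actually trustworthy.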


----------



## jonny30bass

Quote:


> Originally Posted by *PasK1234Xw*
> 
> What is with all the crazy VRAM speeds are you guys actually seeing improvements or just assuming its stable because its not crashing or artifact?
> Unlike Maxwell pascal's DDR5x uses ECC and even if it don't crash or artifact it will held back performance due to error correcting due to unstable clocks.
> .
> Start low and work your way up till you see no performance improvements.
> Set voltage +100


Adding more voltage is the last thing he should do. More voltage equals more power draw and higher temps. So he will throttle even more.


----------



## PasK1234Xw

Quote:


> Originally Posted by *jonny30bass*
> 
> Adding more voltage is the last thing he should do. More voltage equals more power draw and higher temps. So he will throttle even more.


I run +100 just fine for 2060 MHz; I see maybe a 10-20 MHz drop when the power target is hit, and only really in benchmarks. Nothing to cry over, as there's no loss in frames, and it jumps right back up to 2060.
If I run at default voltage, my clock drops down to 2010/2020 over time and never goes back up until I set +100 again.

Heat is another story, but it's not news that Pascal runs better when kept cool.
You have to set a custom fan profile; the FE default profile is crap.
My Ti never goes over 75°C with +100. And yes, it's not the quietest, but it also isn't earth-shatteringly loud.
I'll be installing the EVGA hybrid kit like I did with my 1080 SLI; then my card will run even better.

Not adjusting your voltage can cause downclocking, as do heat and hitting the power target.
You have to find the sweet spot for YOUR GPU; not all of us are lucky enough to receive a golden chip that requires no extra voltage at higher clocks.

Here is my Ti @ +160 with default, +50 and +100 voltage in GTA V @ 1440p/165 Hz:

https://s28.postimg.org/wi55phguz/Untitled.png

The max default 1.06 V is not enough to sustain 2060 MHz on my GPU, so GPU Boost downclocks my core.

Edit:

And I do agree with you, it's the last thing you should do.
But with my luck in the lottery, I always find it necessary.


----------



## richiec77

Quote:


> Originally Posted by *pantsoftime*
> 
> It's quite well documented in the 1080 thread that memory speeds are impacted by both error correction and specific frequency. You will find that adjusting memory speeds even by 5 MHz in either direction will have an impact on your scores. There's a bit of a hill and valley effect over the course of a ~25MHz window.
> 
> There is also error correction that starts to kick in as your memory starts to hit its limits. It will cause a dip in scores when it starts to engage. This is a feature of GDDR5X that is not present in traditional GDDR5.
> 
> You're not wrong about memory power eating into your power limit, but that's secondary to other factors.
> 
> Open up heaven and pause it in a location where there's minimal movement going on in the background (like when it's staring at the cobblestone path). Watch your FPS indicator and adjust your memory speed by 5MHz. You'll see the FPS counter move around by a couple of percent each time. Use this as a tool to dial in your best overclock.


Was about to post something long about finding this out myself using the Valley benchmark, testing memory from +250 to +600. I THOUGHT I had ****ty memory, as a few data points chosen at 25 MHz steps didn't do anything for me. Guess my first stab in the dark was a good peak.

About 2 minutes into Valley, after the rain effects, there's the fly-through of the mountains, trees and whatnot; that's the most stable FPS counter I found to use. The rain scenes used the most VRAM... but also jumped around like crazy.


----------



## illypso

Had my Ti for a week; did some air testing, and it throttled a lot on temps, so I installed the EVGA hybrid water cooler.

Temps never went higher than 45°C, but I was hitting the power limit a lot.

2000 MHz is stable in games, but in 3DMark graphics test one I hit the power limit and drop to 1900, sometimes even 1850.

So today I decided to remove this limit.

It was a scary day.

I decided to do the 3 x 10 ohm resistor mod on the board. I only had big resistors, but I have a good soldering kit, so I decided to give it a try.

It's at this moment that you realize how much your hands shake.

But I managed to do it.



I verified that they weren't touching anything else or each other and that the connections were good, and each one was insulated with shrink tubing afterwards to be safe.

Then, at the last moment of putting everything back together, a screw fell off the screwdriver and ended up in the fan... stuck to the magnet at its center.

After a long 5 minutes of despair (I did not want to take everything apart again) fighting the magnet inside, I decided to take a bigger magnet and won against the fan, lol.



Back to business.

The computer booted to Windows, then 3DMark started, but after 5 seconds of 3D rendering the driver failed; it was OK when downclocked to 1400.

The reported power consumption did drop: GPU-Z showed 25% TDP, but Afterburner was showing 80% power for the 5 seconds the test could run. (Aren't they supposed to be the same?)

Then Windows crashed in a rainbow of colors...

I thought I had killed it, but it booted back up OK.

So I decided to remove the resistors to see if I had broken something while soldering, but just removing them brought the card back to normal. It even seems to hit the power limit less than before, maybe because the heat altered the capacitor a little.

They were three 10 ohm resistors, checked with a multimeter.

My card did not like this mod, or I did it wrong.
I'll maybe try the shunt mod instead, but another time; too much stress for today ;-) It's time to play games with it.
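For anyone weighing the shunt mod mentioned above, the underlying arithmetic is just parallel resistance: whatever you put across a current-sense shunt lowers the resistance the controller sees, so it under-reports power draw by the same ratio. A back-of-envelope sketch, assuming a hypothetical 5 mΩ stock shunt (not a measured 1080 Ti value):

```python
# Sketch of shunt-mod arithmetic. R_SHUNT is an assumed 0.005 ohm stock
# sense resistor, purely for illustration.
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

R_SHUNT = 0.005  # ohms (assumed stock shunt value)

# Stacking an identical 5 mOhm shunt on top halves the sensed resistance,
# so the controller reads roughly half the real power draw:
print(parallel(R_SHUNT, 0.005) / R_SHUNT)  # 0.5

# A 10 ohm resistor in parallel barely changes it, so the power reading
# would be essentially unmodified:
print(round(parallel(R_SHUNT, 10.0) / R_SHUNT, 4))  # 0.9995
```

Notably, the math says a 10 Ω part straight across a milliohm shunt would barely move the reading, so the 3 x 10 ohm mod described above presumably targets a different spot in the sense circuit; treat this purely as illustrative arithmetic, not a guide to where to solder.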


----------



## PasK1234Xw

I should try the shunt mod. I got the Micro Center replacement plan to avoid MSI and to get a new GPU if mine needs replacing.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Yea it was. But I've personally never used the stuff.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Though I'm wondering if it's worth while to get some and redo the tim on my card and my cpu. I have mx-4 on both.


I came from that on my CPU. Gained a few degrees, but I had put too much paste on the time before. It (TG Kryonaut) is better, no doubt. I mean, MX-4 is a $6 Amazon add-on, though, so it's the best value IMO. But at the same time, I was attempting to get sub-60°C on air, so I needed the absolute best paste, and I actually achieved it in most of the games I play; a few hit 63°C. All in all, on air, that's nasty. An added bonus is that my drives/mobo/etc. are all 30-32°C because my case runs super cool. My house's AC/heat is kind of loud, and my PC still isn't louder, so it bothers me none. Probably going to skip the hybrid mod this time and save a hundred on that one.


----------



## Mato87

Since it was ignored, I am posting this again.

I have a question for you folks. When the non-reference 1080 Ti cards arrive, I plan on playing Quantum Break after I get one. Can someone do a test or a little benchmark with Quantum Break? I prefer to play at 1080p with everything on ultra and upscaling turned off (obviously), and with my GTX 980 Ti I get around 40-45 fps max, in combat scenes even below 40. Can someone test for me how it plays and performs on a 1080 Ti, please? Thanks. The important thing is to leave the ugly, muddy, blurry upscaling option turned OFF.

Oh, and please test Mafia 3 as well, thanks.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *illypso*
> 
> Got my Ti for a week; did some air tests, and it throttled a lot on temps, so I installed the EVGA Hybrid water cooler.
> 
> Temps never went higher than 45, but I was hitting the power limit a lot.
> 
> 2000 MHz is game stable, but in 3DMark Graphics Test 1 I hit the power limit and drop to 1900, even 1850 sometimes.
> 
> So today I decided to remove this limit.
> 
> It was a scary day.
> 
> I decided to do the 3 x 10 ohm resistor mod on the board, but I only had big resistors. I did have a good soldering kit, so I decided to give it a try.
> 
> It's at this moment you realize how much your hands shake.
> 
> But I managed to do it.
> 
> I verified that they weren't touching anything else or each other and that the connections were good, and each one was isolated with shrink tubing afterwards to be safe.
> 
> Then, at the last moment of putting everything back together, a screw fell off the screwdriver and ended up in the fan... stuck to the magnet at its center.
> 
> After a long 5 minutes of despair (I did not want to take everything apart again) fighting the magnet inside, I decided to take a bigger magnet and win against the fan lol
> 
> Back to business.
> 
> The computer booted to Windows, then 3DMark started, but after 5 seconds of 3D rendering the driver failed; it was OK when downclocked to 1400...
> 
> The power consumption did drop: GPU-Z showed 25% TDP, but Afterburner was showing 80% power for the 5 seconds the test could run. (Aren't they supposed to be the same?)
> 
> Then Windows crashed in a rainbow of colors...
> 
> I thought I had killed it, but it booted back OK.
> 
> So I decided to remove the resistors to see if I had broken something while soldering, but just removing the resistors brought the card back to normal.
> It even seems to hit the power limit less than before, maybe because I altered the capacitors a little with the heat.
> 
> They were three 10 ohm resistors, checked with a multimeter.
> 
> My card did not like this mod, or I did it wrong.
> I'll maybe try the shunt mod instead, but another time; too much stress for today ;-) it's time to play games with it.


Man this is too hardcore for like 2% extra performance.


----------



## Hulio225

Quote:


> Originally Posted by *illypso*
> 
> Got my Ti for a week; did some air tests, and it throttled a lot on temps, so I installed the EVGA Hybrid water cooler.
> 
> Temps never went higher than 45, but I was hitting the power limit a lot.
> 
> 2000 MHz is game stable, but in 3DMark Graphics Test 1 I hit the power limit and drop to 1900, even 1850 sometimes.
> 
> So today I decided to remove this limit.
> 
> It was a scary day.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I decided to do the 3 x 10 ohm resistor mod on the board, but I only had big resistors. I did have a good soldering kit, so I decided to give it a try.
> 
> It's at this moment you realize how much your hands shake.
> 
> But I managed to do it.
> 
> I verified that they weren't touching anything else or each other and that the connections were good, and each one was isolated with shrink tubing afterwards to be safe.
> 
> Then, at the last moment of putting everything back together, a screw fell off the screwdriver and ended up in the fan... stuck to the magnet at its center.
> 
> After a long 5 minutes of despair (I did not want to take everything apart again) fighting the magnet inside, I decided to take a bigger magnet and win against the fan lol
> 
> Back to business.
> 
> The computer booted to Windows, then 3DMark started, but after 5 seconds of 3D rendering the driver failed; it was OK when downclocked to 1400...
> 
> The power consumption did drop: GPU-Z showed 25% TDP, but Afterburner was showing 80% power for the 5 seconds the test could run. (Aren't they supposed to be the same?)
> 
> Then Windows crashed in a rainbow of colors...
> 
> I thought I had killed it, but it booted back OK.
> 
> So I decided to remove the resistors to see if I had broken something while soldering, but just removing the resistors brought the card back to normal.
> It even seems to hit the power limit less than before, maybe because I altered the capacitors a little with the heat.
> 
> They were three 10 ohm resistors, checked with a multimeter.
> 
> My card did not like this mod, or I did it wrong.
> I'll maybe try the shunt mod instead, but another time; too much stress for today ;-) it's time to play games with it.


Why not do the less risky, less fail-prone shunt method in the first place?
Don't get me wrong, I don't want to offend you, but I don't understand why.


----------



## illypso

Quote:


> Originally Posted by *SlimJ87D*
> 
> Man this is too hardcore for like 2% extra performance.


I was bored







and I like to mod all I have for fun.

Quote:


> Originally Posted by *Hulio225*
> 
> Why not do the less risky, less fail-prone shunt method in the first place?
> Don't get me wrong, I don't want to offend you, but I don't understand why.


It's simple: I did not have liquid metal, and where I am I could not find any in my computer shop, but I had some resistors lying around and thought it would be the same thing, just done differently. (Reading https://xdevs.com/guide/pascal_oc/ made it look like it was better this way too.)

I am going to try the shunt mod with a rear defroster window repair kit. I tested it a little with the multimeter and its resistance seems low (one cm long was 30 ohm), but not 0, so maybe it will help a little more.

And it's weird, but I really gained something just by installing and removing the mod: I can now hold 2000 MHz in 3DMark instead of 1850, but I still hit the 120% power limit.


----------



## dboythagr8

Quote:


> Originally Posted by *keikei*
> 
> It varies from 45 to 60fps, most things on low to med settings. If I set everything to low, I'd bet it would be closer to 60 most of the time. I don't mind the frame hit because I like the AO and shadows. During firefights, it does drop to around 45fps. Very much playable. Still tweaking.


Pretty great for a single card!


----------



## Hulio225

Quote:


> Originally Posted by *illypso*
> 
> I was bored
> 
> 
> 
> 
> 
> 
> 
> and I like to mod all I have for fun.
> It's simple: I did not have liquid metal, and where I am I could not find any in my computer shop, but I had some resistors lying around and thought it would be the same thing, just done differently. (Reading https://xdevs.com/guide/pascal_oc/ made it look like it was better this way too.)
> 
> I am going to try the shunt mod with a rear defroster window repair kit. I tested it a little with the multimeter and its resistance seems low (one cm long was 30 ohm), but not 0, so maybe it will help a little more.
> 
> And it's weird, but I really gained something just by installing and removing the mod: I can now hold 2000 MHz in 3DMark instead of 1850, but I still hit the 120% power limit.


Okay, 30 ohm is too much over 10mm; over 1mm it would be 3 ohm if the width stays the same. Those shunts are 5 mOhm, which is a lot smaller than 3 ohm, so at best you lower the 5 mOhm to about 4.99 mOhm.


----------



## illypso

Quote:


> Originally Posted by *Hulio225*
> 
> Okay, 30 ohm is too much over 10mm; over 1mm it would be 3 ohm if the width stays the same. Those shunts are 5 mOhm, which is a lot smaller than 3 ohm, so at best you lower the 5 mOhm to about 4.99 mOhm.


I know, but it was a very thin layer, so I did a second, thicker layer and it went to 0.6 ohm for one cm, so I decided to try anyway. Too many ohms is better than too few; I don't want safe mode on the card. Worst scenario, I gain minimal power. And it's safer than liquid metal once dry. I'm waiting for it to dry at the moment.


----------



## Hulio225

Quote:


> Originally Posted by *illypso*
> 
> I know, but it was a very thin layer, so I did a second, thicker layer and it went to 0.6 ohm for one cm, so I decided to try anyway. Too many ohms is better than too few; I don't want safe mode on the card. Worst scenario, I gain minimal power. And it's safer than liquid metal once dry. I'm waiting for it to dry at the moment.


Still, even with 0.6 ohm: let's assume the path you have to bridge is 1mm (which is an underestimate), so that's 0.06 ohm in parallel with the shunt: 1/(1/0.06 Ohm + 1/0.005 Ohm) = 4.615 mOhm.
-> Not worth the hassle imho.

Now please explain the "liquid metal dries" argument ;-)
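For anyone who wants to sanity-check those numbers, the parallel-resistance arithmetic is easy to script. A minimal sketch, assuming the 5 mOhm stock shunt value and the 1 mm bridge length discussed above:

```python
# Effective shunt resistance when a conductive trace is painted across it.
# Assumed values from the discussion: stock shunt ~5 mOhm, defroster-kit
# paint measured at 0.6 ohm per cm, bridge length ~1 mm.

def parallel(r1, r2):
    """Resistance of two resistors in parallel."""
    return 1.0 / (1.0 / r1 + 1.0 / r2)

r_shunt = 0.005        # ohms (5 mOhm stock shunt, assumed)
r_paint = 0.6 / 10.0   # ohms: 0.6 ohm/cm scaled to a 1 mm path

r_eff = parallel(r_shunt, r_paint)
print(f"effective shunt: {r_eff * 1000:.3f} mOhm")     # 4.615 mOhm

# The card infers power from the voltage drop across the shunt, so the
# reported power scales with the resistance ratio:
print(f"reported power scale: {r_eff / r_shunt:.3f}")  # 0.923, ~8% lower reading
```

Which is why the gain is so small: the paint's resistance is orders of magnitude above the shunt's, so the parallel combination barely moves.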


----------



## MrKenzie

Look at the results from Titan Pascal owners that have done the shunt mod: 1% gain at most. Totally not worth risking a $700 card for a small gain; wait for the 1080 Ti BIOS to be unlocked or for the partner cards to come out with higher power limits.

I'm excited to see what extra performance is available with a higher power limit, because 99.9% of all current Titan XP and 1080 Ti cards will hit a wall at 2150MHz.


----------



## illypso

Quote:


> Originally Posted by *Hulio225*
> 
> Still, even with 0.6 ohm: let's assume the path you have to bridge is 1mm (which is an underestimate), so that's 0.06 ohm in parallel with the shunt: 1/(1/0.06 Ohm + 1/0.005 Ohm) = 4.615 mOhm.
> -> Not worth the hassle imho.
> 
> Now please explain the "liquid metal dries" argument ;-)


It's just that the stuff I used dries in seconds, and it's more like a primer paint, so it is less prone to dripping than liquid metal.
It was also easier to apply. (And it's the only conductive thing I have which is liquid and not 0 ohm.)
So if it had worked the same, it would have been a better option.

But it did not lower the resistance by a lot, maybe by 5%, and I can't get my clock past 1950 MHz without a driver crash now. A little like the resistor mod did, but less drastic.

So my card doesn't want more physical power mods. I will remove the "paint" and test back at 2000 MHz.


----------



## Hulio225

Quote:


> Originally Posted by *MrKenzie*
> 
> Look at the results from Titan Pascal owners that have done the shunt mod, 1% gain at most. Totally not worth risking a $700 card for a small gain, wait for the 1080Ti bios to be unlocked or for the partner cards to come out with higher power limits.
> 
> I'm excited to see what extra performance is available with a higher power limit, because 99.9% of all current Titan XP and 1080Ti will hit a wall at 2150MHz.


I have done the shunt mod, for example on a 1080 Ti; the risk is zero if you are not a person with comprehension issues... but I'll give you the point that the gains aren't that high.

But what a lot of people don't understand is this: maybe the fps you gain aren't that high, yes, 1% maybe, but frame time differences are significantly lower than without the shunt mod.
In a synthetic benchmark you won't notice much of a difference, but while playing you can, because the borderline fluctuation is gone!!
EDIT:
Quote:


> Originally Posted by *illypso*
> 
> It's just that the stuff I used dries in seconds, and it's more like a primer paint, so it is less prone to dripping than liquid metal.


There are things called "laws of physics" or something like that.

Surface tension is something which can't be overcome, even by liquid metal, unless it is applied way too thick!

No risk of dripping, seriously


----------



## MrKenzie

Quote:


> Originally Posted by *Hulio225*
> 
> I have done the shunt mod, for example on a 1080 Ti; the risk is zero if you are not a person with comprehension issues... but I'll give you the point that the gains aren't that high.
> 
> But what a lot of people don't understand is this: maybe the fps you gain aren't that high, yes, 1% maybe, but frame time differences are significantly lower than without the shunt mod.
> In a synthetic benchmark you won't notice much of a difference, but while playing you can, because the borderline fluctuation is gone!!
> EDIT:
> There are things called "laws of physics" or something like that.
> 
> Surface tension is something which can't be overcome, even by liquid metal, unless it is applied way too thick!
> 
> No risk of dripping, seriously


That's a good point, I didn't take frame time into consideration. Do you know if anyone has done a review showing the difference it made?


----------



## Hulio225

Quote:


> Originally Posted by *MrKenzie*
> 
> That's a good point, I didn't take frame time into consideration. Do you know if anyone has done a review showing the difference it made?


I can link a video from Gamers Nexus where they do the ghetto hybrid mod, removing the thermal throttling issue and running into power limit issues, and they measure worse frame times with the overclock than without.

Which proves my point.

EDIT:

Somewhere in the comments, Steve even states that the worse frame times are a result of borderline fluctuations caused by the power limit.


----------



## MrKenzie

Quote:


> Originally Posted by *Hulio225*
> 
> I can link a video from Gamers Nexus where they do the ghetto hybrid mod, removing the thermal throttling issue and running into power limit issues, and they measure worse frame times with the overclock than without.
> 
> Which proves my point.
> 
> EDIT:
> 
> Somewhere in the comments, Steve even states that the worse frame times are a result of borderline fluctuations caused by the power limit.


I might do some frametime tests now on my Titan as I haven't looked at them before. I hold a steady 2114MHz on most games, with no power limit mods.


----------



## Hulio225

Quote:


> Originally Posted by *MrKenzie*
> 
> I might do some frametime tests now on my Titan as I haven't looked at them before. I hold a steady 2114MHz on most games, with no power limit mods.


2114, that's sick


----------



## TWiST2k

Quote:


> Originally Posted by *Mato87*
> 
> Since it was ignored, I am posting this again.
> 
> I have a question for you folks. When the non-reference 1080 Ti cards arrive, I plan on playing Quantum Break after I get one. Can someone do a test or a little benchmark with Quantum Break? I prefer to play at 1080p with everything ultra and upscaling turned off, obviously, and with my GTX 980 Ti I get around 40-45 fps max, in combat scenes even below 40. Can someone test how it plays and performs on a 1080 Ti please? Thanks. The important thing is to leave the ugly, muddy, blurry upscaling option turned OFF.
> 
> Oh, and please test Mafia 3 as well, thanks.


Dude, I typed "1080 ti quantum break" into Google and OMG look at my first result!

http://www.techspot.com/review/1352-nvidia-geforce-gtx-1080-ti/page6.html

And oh wait, what's this? Google has results for Mafia 3 with the 1080 Ti AND it's the SAME ARTICLE!

http://www.techspot.com/review/1352-nvidia-geforce-gtx-1080-ti/page4.html

Wow, mind blowing isn't it? The power of the Google.


----------



## MrKenzie

Quote:


> Originally Posted by *Hulio225*
> 
> 2114, that's sick


It is good. Some games cause it to hit the power limit too hard, so it drops to 2050-2060MHz, but that's very rare and can be solved by setting a frame cap 5-10 fps above my monitor refresh rate (G-Sync).

I don't know how to test frametime properly. Testing with G-Sync on and V-Sync off like I normally game, there is huge frametime variance, 16ms to 31ms, so I assume you should test using a manual frame limit? I'm happy with my gaming performance as it is, so I don't feel it's necessary.
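For reading those numbers: frametime and fps are reciprocals, so a 16-31 ms swing is roughly 62 down to 32 fps. A quick sketch of the conversion:

```python
# Convert between frametime (milliseconds) and fps: they are reciprocals.

def frametime_to_fps(ms):
    return 1000.0 / ms

def fps_to_frametime(fps):
    return 1000.0 / fps

for ms in (16, 31):
    print(f"{ms} ms -> {frametime_to_fps(ms):.1f} fps")
# 16 ms -> 62.5 fps
# 31 ms -> 32.3 fps

# A 144 Hz panel wants every frame delivered in under:
print(f"{fps_to_frametime(144):.2f} ms")  # 6.94 ms
```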


----------



## Hulio225

Quote:


> Originally Posted by *MrKenzie*
> 
> It is good, some games cause it to hit the power limit too hard so it drops to 2050-2060MHz but that's very rare and can be solved by setting a frame cap 5-10 fps above my monitor refresh rate (G-Sync).
> 
> I don't know how to test frametime properly, testing it using G-Sync on and V-Sync off like I normally game with there is huge frametime variance 16ms to 31ms so I assume you should test it using a manual frame limit? I'm happy with my gaming performance as it is so I don't feel it's necessary.


I guess it always depends on the circumstances. If you play at 4K and your monitor refresh rate is 60 Hz, but you are mostly below 60 fps, frame times can make a difference to a good gaming experience... in other circumstances maybe not that much.

I have a 4K monitor, and in a lot of games I obviously don't get 60+ fps... so it can make a difference.
In addition, I had to install my waterblock anyway, so for me (I have an engineering degree in electronics) a shunt mod is like, yeah, whatever; I'll do it because I have stripped the card down anyway...


----------



## MrKenzie

Quote:


> Originally Posted by *Hulio225*
> 
> I guess it always depends on the circumstances. If you play at 4K and your monitor refresh rate is 60 Hz, but you are mostly below 60 fps, frame times can make a difference to a good gaming experience... in other circumstances maybe not that much.
> 
> I have a 4K monitor, and in a lot of games I obviously don't get 60+ fps... so it can make a difference.
> In addition, I had to install my waterblock anyway, so for me (I have an engineering degree in electronics) a shunt mod is like, yeah, whatever; I'll do it because I have stripped the card down anyway...


I have a 4K monitor also. With AA off, most of my games run above 60fps; only Crysis 3 and some poorly optimised games don't. Having a degree in electronics would help; I'm an electrician, but we don't work on such small components!


----------



## Hulio225

Quote:


> Originally Posted by *MrKenzie*
> 
> I do have a 4K monitor also. With AA off most of my games run above 60fps, only Crysis 3 and some poorly optimised games don't. Having a degree in electronics would help, I'm an electrician but we don't work on such small components!


I didn't want to sound arrogant by stating that I have a degree in electronics; it doesn't help on stuff others have engineered when you can't follow the thoughts those people had. My point was just that it's sometimes hard to determine what to expect and what should be done.
The thing is, if the card has to regulate clocks because of power limitations, there is a drawback, because the card constantly has to adjust. The Gamers Nexus charts show that pretty well.
And there will always be a difference between what is measurable and what is perceivable.

I simply think that if you strip your card down for water cooling, you should do the shunt mod anyway, because it's kindergarten stuff anyway


----------



## illypso

I had thermal paste left for one last try and one last idea.

I took a wire and used one strand of the multi-strand wire inside (the smallest I could find).

I soldered it across 2 of the 3 shunts (the 2 close to the power connector; I presume the other one is for the PCI Express power).

In 3DMark I had from 112 to 119%; now I have 85 to 92%. So a drop of about 27%, although I can't reach 2000 MHz anymore.

So you can use a wire for the shunt mod if it's small enough.

About my driver crash:
I think my card was stable because of the power limit, and removing it is actually making the card go to a higher boost table entry (so higher voltage), causing it to fail because I don't have a well-binned card and higher voltage makes it unstable. Is that possible?

Or something happened in the 5 times I opened the card... but I can't think of anything that would make me lose 100 MHz.

Can too much paste getting into the small resistors/caps around the GPU cause anomalies? After reapplying paste 5 times, I did not take the time to clean it and there was a lot around the GPU.
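If you assume the controller's reported power scales linearly with the voltage across the shunt, the before/after 3DMark readings in this post give a rough estimate of how much the wire lowered the effective shunt resistance. A throwaway sketch (the percentages are the ones reported above; the linear-scaling assumption and the 5 mOhm stock value are mine):

```python
# Estimate the effective shunt resistance implied by the drop in the
# reported power percentage, assuming the reading scales linearly with
# shunt resistance. The before/after figures are from the post above.

before = (112 + 119) / 2   # midpoint of the pre-mod power reading, %
after = (85 + 92) / 2      # midpoint after soldering the wire, %

ratio = after / before
print(f"reported power scaled by {ratio:.2f}")             # 0.77
print(f"implied shunt: {ratio * 5:.2f} mOhm (if stock is 5 mOhm)")  # 3.83 mOhm
```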


----------



## Hulio225

Quote:


> Originally Posted by *illypso*
> 
> I had thermal paste for one last try and one last idea.
> 
> I took a wire and use one wire of the multi wire inside. (the smallest I could find)
> 
> solder it on 2 of the 3 shunt. (the 2 close to the power connector, i presume the other one would be for the PCIexpress power)
> 
> In 3dmark I had from 112 to 119% now I have 85 to 92. so a drop of 27% although I cant reach 2000mhz anymore
> 
> So you can use a wire for shunt mod if its small enough.
> 
> 
> 
> About my driver crash,
> I think that my card was stable because of the power limit and removing that is actually making it go at a higher boost table number (so higher voltage) and causing it to fail because I don't have a good bin card and higher voltage make it unstable. is it possible.
> 
> Or something happen in the 5 time I opened the card... but I cant think of something that would make me lose 100hz
> 
> Can too much paste going into the small resistance/cap around the GPU cause anomaly, because after 5 time reaplying paste, i did not take the time to clean it and there was a lots around the GPU.


Stolen from der8auer


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> I came from that on my CPU. Gained a few degrees, but I had put too much paste on the time before. It (TG:K) is better, no doubt. I mean, MX-4 is a $6 Amazon add-on, though, so it's the best value imo. But at the same time, I was attempting to get sub-60c on air, so I needed the absolute best paste. I actually achieved it in most games I play; a few games it sits at 63c. All in all, on air, that's nasty. Added bonus is my drives/mobo/etc are all 30-32c because my case is super cool. My house's AC/heat is kinda loud and my PC still isn't louder, so it bothers me none. Probably going to skip the hybrid mod this time and save a hundred on that one.


Hybrid is totally worth it! 10/10 would recommend.


----------



## Jbravo33

Waiting on one more ek block to start the Petg project. Digging the ftw3 tho


----------



## alucardis666

Quote:


> Originally Posted by *Jbravo33*


Thermaltake Riing fans?


----------



## Jbravo33

Quote:


> Originally Posted by *alucardis666*
> 
> Thermaltake Riing fans?


Yes. The new Riing Premium ones are pretty sick, but at $100 for 3 I'll pass lol. Got 11 fans as it stands.


----------



## alucardis666

Quote:


> Originally Posted by *Jbravo33*
> 
> Yes. New ones are pretty sick riing premium but for $100 for 3 I'll pass lol got 11 fans as it stands


Looks really sharp. I'm a little envious.







But man... 11 fans?! Geez... How loud does that get?


----------



## Jbravo33

Quote:


> Originally Posted by *alucardis666*
> 
> Looks really sharp. I'm a little envious.
> 
> 
> 
> 
> 
> 
> 
> But man... 11 fans?! Geez... How loud does that get?


Super quiet, believe it or not. The 1080 Ti fans are much louder. CPU temps stay really low with all the rads, so usually it's my pump that steps it up a notch, not so much the fans. That's why I wanna put these cards under water. Was going for a silent gaming monster lol


----------



## muhd86

Can we do 3-way or 4-way SLI with the Founders Edition 1080 Ti, or is it locked like before?


----------



## alucardis666

Quote:


> Originally Posted by *Jbravo33*
> 
> Super quiet believe it or not. The 1080ti fans are much louder. CPU temps stay really low with all rads. So usually my pump will step it up a notch not so much fans. Why I wanna put these cards under water. Was going for a silent gaming monster lol


Build looks gorgeous for sure. Almost makes me wanna re-do my setup with an In-Win 805 and some Riings


----------



## Mato87

Quote:


> Originally Posted by *TWiST2k*
> 
> Dude, I typed "1080 ti quantum break" into Google and OMG look at my first result!
> 
> http://www.techspot.com/review/1352-nvidia-geforce-gtx-1080-ti/page6.html
> 
> And oh wait, what's this? Google has results for Mafia 3 with the 1080 Ti AND it's the SAME ARTICLE!
> 
> http://www.techspot.com/review/1352-nvidia-geforce-gtx-1080-ti/page4.html
> 
> Wow, mind blowing isn't it? The power of the Google.


You see, here is where we disagree: have you even read what I wrote in my comment or not?
You can also take your condescending tone somewhere else.

So I am going to ask for the third time, since it was ignored twice now, but this time I am going to highlight the important stuff so huffy and self-important blokes like the one I am replying to will take notice.

I have a question for you folks. When the non-reference 1080 Ti cards arrive, I plan on playing Quantum Break after I get one. Can someone do a test or a little benchmark with Quantum Break? I prefer to play at 1080p, everything ultra, *with upscaling turned off*, obviously. With my GTX 980 Ti I get around 40-45 fps max, in combat scenes even below 40. Can someone test how it plays and performs on a 1080 Ti please? Thanks. *!!!! The important thing is to leave the ugly, muddy, blurry upscaling option turned OFF !!!!*

Also do a Mafia 3 benchmark. I would prefer a real-world gameplay scenario of you driving around the city and the swamps to see what the fps is, because the FPS tanks quite a lot during fast driving through these areas, especially during the day in the swamps and the city.


----------



## slickric21

Is there a consensus as to what would be the ideal copper shim size for mounting an AIO with a flat base (such as an H55 with a Kraken G10)?

I have searched the thread and saw someone mention the GPU die is 25x20mm, so a shim of 25x20x2mm would be needed. I haven't seen such sized shims anywhere.

I have a 25x25x1.5mm shim that I had on my 980 Ti.

I'm looking at getting a Founders Edition 1080 Ti and mounting my H75 with a Kraken G10 on it, the same way Gamers Nexus did their DIY hybrid. However, I see that they used an EVGA GPU AIO that has a stepped base, negating the need for a shim.


----------



## Bishop07764

Quote:


> Originally Posted by *Jbravo33*
> 
> Super quiet believe it or not. The 1080ti fans are much louder. CPU temps stay really low with all rads. So usually my pump will step it up a notch not so much fans. Why I wanna put these cards under water. Was going for a silent gaming monster lol


I know what you mean. My case has 23 fans, and the 1080 Ti was louder than all of them combined by a huge margin. These things are sweet under water. My idle temps went from 50c to 25c. My highest temp last night testing and gaming was 35c, even clocked to 2063.

The clocks are solid now too, except when it slams into the power limit. I don't know how you guys get 2100 core without it power throttling like crazy. Mine is average I suppose, but no coil whine.


----------



## Mato87

Quote:


> Originally Posted by *Bishop07764*
> 
> I know what you mean. My case has 23 fans, and the 1080 ti was louder than all of them combined by a huge margin. These things are sweet under water. My idle temps went from 50c to 25c. My highest temp last night testing and gaming was 35c even with it clocked to 2063.
> 
> The clocks are solid now too except when it slams into the power limit. I don't know how you guys can get 2100 core without it power throttling like crazy. Mine is average I suppose but no coil whine.


23 fans? Jesus Christ, and I thought having 11 fans was too much: 5 case fans, 3 GPU fans, 2 CPU fans, and one PSU fan







Apparently it isn't


----------



## Bishop07764

Quote:


> Originally Posted by *Mato87*
> 
> 23 fans? Jesus Christ and I thought having 11 fans was too much, 5 case fans, 3 gk fans, 2 cpu fans and one psu fan
> 
> 
> 
> 
> 
> 
> 
> Apparently it isn't


Lol. And that wasn't counting my psu fan. My fan controller gets a workout for sure. It allows me to run my fans at about 600 rpms or so. Quiet and they still keep my CPU and graphics card cool. I've never even tried benching with the fans any higher. Most of them are Yate Loons. Cheap but awesome.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Doesn't adding more voltage make you reach power limiting quicker?


Sure does for me. I turned it back to default (which I've never done on a card). Once you beat the voltage and thermals, you indeed hit the power limit. In fact, I hit power as soon as I add volts. The slider gives extra volts at +22 (1.06), +44 (1.72), +66 (1.83), and +100 (1.93); that's it. It basically does nothing on this card, and all of those increases suck because it power throttles very hard. I wish I could undervolt it with the slider tbh.


----------



## Slackaveli

Quote:


> Originally Posted by *pompss*
> 
> Is it safe to disable ASUS anti-surge when overclocking?
> 
> I'm getting random resets when gaming with dual SLI GTX 1080 Tis at 2000 MHz and the CPU overclocked to 4.4 GHz; the PSU is a Seasonic white snow 1050W.
> 
> Anyone have the same experience?


Do you have your phase switching at at least 670? And make sure you have droop prevention at 50%+ in the Digi+ BIOS settings.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> Sure does for me. I turned it back to default (which I've never done on a card). Once you beat the voltage and thermals, you indeed hit the power limit. In fact, I hit power as soon as I add volts. The slider gives extra volts at +22 (1.06), +44 (1.72), +66 (1.83), and +100 (1.93); that's it. It basically does nothing on this card, and all of those increases suck because it power throttles very hard. I wish I could undervolt it with the slider tbh.


Yeah, overvolting did nothing for me as it just made me approach power limiting quicker.

Someone here did undervolt their card but I'm not sure how they did it.


----------



## Slackaveli

Quote:


> Originally Posted by *TWiST2k*
> 
> Dude, I typed "1080 ti quantum break" into Google and OMG look at my first result!
> 
> http://www.techspot.com/review/1352-nvidia-geforce-gtx-1080-ti/page6.html
> 
> And oh wait, what's this? Google has results for Mafia 3 with 1080 Ti AND its the SAME ARTICLE!
> 
> http://www.techspot.com/review/1352-nvidia-geforce-gtx-1080-ti/page4.html
> 
> Wow, mind blowing isn't it? The power of the Google.


Yes it is. And I saw your question, but I don't have those games, lol. I will say that the googles say you will get 60+ fps at 4K ultra.
Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, overvolting did nothing for me as it just made me approach power limiting quicker.
> 
> Someone here did undervolt their card but I'm not sure how they did it.


Me either. I tried using the slider to manually enter "-25". Nope. I tried playing with the curve but couldn't get good results with it either; it keeps crashing on me. Any "curve" experts want to give us the heads-up on it?


----------



## Slackaveli

Quote:


> Originally Posted by *MrKenzie*
> 
> It is good, some games cause it to hit the power limit too hard so it drops to 2050-2060MHz but that's very rare and can be solved by setting a frame cap 5-10 fps above my monitor refresh rate (G-Sync).
> 
> I don't know how to test frametime properly, testing it using G-Sync on and V-Sync off like I normally game with there is huge frametime variance 16ms to 31ms so I assume you should test it using a manual frame limit? I'm happy with my gaming performance as it is so I don't feel it's necessary.


Why would you set your frame limit OVER your refresh rate? All the input lag testing I've seen says to set it 1 below your refresh rate at 60Hz, and 2 below at 144Hz. Aka, set the limiter at 142 on a 144Hz monitor (59 on a 60Hz one). No input lag at all, no tearing, and slightly less workload for your GPU, keeping power target throttling to a minimum.
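The arithmetic behind that rule of thumb is just frametime headroom: capping slightly below the refresh rate makes each frame take a hair longer than one refresh interval, so the GPU never outruns the display and frames never queue up. A small sketch (the 1-below and 2-below offsets are the ones from the post, not an official spec):

```python
# Frame cap slightly below refresh: each capped frame takes a little
# longer than one refresh interval, so the render queue never fills
# (no V-Sync style back-pressure, hence no added input lag).

def interval_ms(rate):
    return 1000.0 / rate

for refresh, cap in ((60, 59), (144, 142)):
    slack = interval_ms(cap) - interval_ms(refresh)
    print(f"{refresh} Hz capped at {cap} fps -> "
          f"{slack:.3f} ms slack per frame")
# 60 Hz capped at 59 fps -> 0.282 ms slack per frame
# 144 Hz capped at 142 fps -> 0.098 ms slack per frame
```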


----------



## Jbravo33

Quote:


> Originally Posted by *Bishop07764*
> 
> I know what you mean. My case has 23 fans, and the 1080 ti was louder than all of them combined by a huge margin. These things are sweet under water. My idle temps went from 50c to 25c. My highest temp last night testing and gaming was 35c even with it clocked to 2063.
> 
> The clocks are solid now too except when it slams into the power limit. I don't know how you guys can get 2100 core without it power throttling like crazy. Mine is average I suppose but no coil whine.


Same here, I can't get past 2050 with fans at 100 percent. Also tried playing games (Squad and Mass Effect) last night for the first time with SLI; got a few crashes when the cards were overclocked at the highest preset I had for benching. The cards heated up and the fans were super loud. I think these cards definitely need to be swimming. Do you have a separate loop for the GPU? How much of an effect would these two cards have on my loop with the CPU, do you think?


----------



## PewnFlavorTang

After messing with sli I'm likely getting rid of my second ti. Not worth it to me at 1440p.


----------



## mcg75

I'm using a 980 Ti EVGA hybrid cooler on my 1080 Ti.

Temps are usually about 40C gaming.

My normal gaming clocks are 2025 MHZ core and +250 on memory. The core on mine doesn't want to go above 2050 without crashing.

Haven't pushed memory that far yet but it seems like a lot of you are getting good memory OC.


----------



## lilchronic

Quote:


> Originally Posted by *mcg75*
> 
> I'm using a 980 Ti EVGA hybrid cooler on my 1080 Ti.
> 
> Temps are usually about 40C gaming.
> 
> My normal gaming clocks are 2025 MHZ core and +250 on memory. The core on mine doesn't want to go above 2050 without crashing.
> 
> Haven't pushed memory that far yet but it seems like a lot of you are getting good memory OC.


I can't get past 2050MHz either. Kinda weird, as I can run 2050MHz all day with a +151 core offset, but as soon as I go to +152 core offset (2063MHz) it will crash.


----------



## Bishop07764

Quote:


> Originally Posted by *Jbravo33*
> 
> Same here, I can't get past 2050 with fans at 100 percent. Also tried playing games (Squad and Mass Effect) last night for the first time with SLI and got a few crashes when the cards were overclocked at the highest preset I had for benching. The cards heated up and the fans were super loud. I think these cards definitely need to be swimming. Do you have a separate loop for the GPU? How much of an effect would these two cards have on my loop with the CPU, do you think?


Yeah. I have a completely separate loop for my GPU. It's on a ridiculous amount of rad space. 480 + 280. My plan was back in the day to get 2 780 lightning cards and run then both at 1.3 volts. I wanted to be able to push it with fans at slow speeds.

With your loop, it would depend on rad space, pump, and fans. It wouldn't have much effect on mine because it's a dedicated loop. How much rad space do you have? My last shared loop was an old i7 860 and crossfired 7850s. Maybe someone else with SLI on their CPU loop could chime in?


----------



## Jbravo33

Quote:


> Originally Posted by *Bishop07764*
> 
> Yeah. I have a completely separate loop for my GPU. It's on a ridiculous amount of rad space. 480 + 280. My plan was back in the day to get 2 780 lightning cards and run then both at 1.3 volts. I wanted to be able to push it with fans at slow speeds.
> 
> With your loop, it would depend on rad space, pump, and fans. It wouldn't have much effect on mine because it's a dedicated loop. How much rad space do you have? My last shared loop was an old i7 860 and crossfired 7850s. Maybe someone else with SLI on their CPU loop could chime in?


Currently have a 300c res pumping into a front 280x45mm rad in pull > bottom 240 slim in pull > CPU > out to rear 360x60mm push/pull > back into the res. I'd like to throw in the GPUs right after the 240, in parallel.


----------



## Bishop07764

Quote:


> Originally Posted by *Jbravo33*
> 
> Currently have a 300c res pumping into a front 280x45mm rad in pull > bottom 240 slim in pull > CPU > out to rear 360x60mm push/pull > back into the res. I'd like to throw in the GPUs right after the 240, in parallel.


That sounds like enough rad space to me, provided your pump is up to it. It sounds like you would be good to go. I'm loving the silence myself. The Founders cooler was one of the worst I've ever used in terms of noise when trying to push it. But maybe the brief time I used my old 780 Lightning air cooler is biasing my memory a bit on the noise.









Now if someone could do a modified BIOS allowing a 287% power limit like my old Lightning, that would be awesome.


----------



## Mato87

Quote:


> Originally Posted by *Bishop07764*
> 
> Lol. And that wasn't counting my psu fan. My fan controller gets a workout for sure. It allows me to run my fans at about 600 rpms or so. Quiet and they still keep my CPU and graphics card cool. I've never even tried benching with the fans any higher. Most of them are Yate Loons. Cheap but awesome.


Please tell me, where exactly are your fans? In my PC most of the fans are turned off, mainly the PSU fan (it's switched to hybrid mode) and the 3 GPU fans...


----------



## Bishop07764

Quote:


> Originally Posted by *Mato87*
> 
> Please tell me, where exactly are your fans? In my PC most of the fans are turned off, mainly the PSU fan (it's switched to hybrid mode) and the 3 GPU fans...


20 of them are on my 4 water radiators. I have 2 alphacool ut 60 480 mm rads that account for 16 of them. They are in push/pull configuration. Then 2 in very front and one for exhaust.


----------



## Mato87

Quote:


> Originally Posted by *Bishop07764*
> 
> 20 of them are on my 4 water radiators. I have 2 alphacool ut 60 480 mm rads that account for 16 of them. They are in push/pull configuration. Then 2 in very front and one for exhaust.


Ahhh, I understand now, you have the radiators for the water cooling. But why 4, man? Where do you have them? Outside the case? Or do you have a gigantic case?


----------



## jrcbandit

Quote:


> Originally Posted by *mcg75*
> 
> I'm using a 980 Ti EVGA hybrid cooler on my 1080 Ti.
> 
> Temps are usually about 40C gaming.
> 
> My normal gaming clocks are 2025 MHZ core and +250 on memory. The core on mine doesn't want to go above 2050 without crashing.
> 
> Haven't pushed memory that far yet but it seems like a lot of you are getting good memory OC.


How are your temps so low? Are you playing less demanding games, or do you have a cold ambient temperature? Watch Dogs 2 and Mass Effect Andromeda go up to 52-53C for me, but I am also overclocking the memory to +500, if that causes more heat. I also have a very poor core clock overclocker. I can only go to 2025MHz core (+120); any higher than that and it crashes in games, even after adding the hybrid cooler ;p. In fact, I dunno if I can maintain the +120, I have to play ME: Andromeda some more to see, lol. I know at +100 on air, the game never crashed.


----------



## mshagg

Quote:


> Originally Posted by *Bishop07764*
> 
> Yeah. I have a completely separate loop for my GPU. It's on a ridiculous amount of rad space. 480 + 280. My plan was back in the day to get 2 780 lightning cards and run then both at 1.3 volts. I wanted to be able to push it with fans at slow speeds.


Haha, yeah, my loop was built to contain crossfire 290Xs, about 600W of heat to dump from the GPUs alone. 50mm-thick 420 + 480 with 11 fans between them, now pretty overkill for a CPU and the 1080 Ti, but it's quiet and well behaved. Damn thing weighs like 30kg though.


----------



## richiec77

I was one person who messed with undervolting. IIRC there was another poster who did the same? I learned how to do this working on an Alienware 17 R4 with a GTX 1080 that was, until very recently, hard-limited to 165W. So you had to find the best OC at voltages around 0.913-0.943v to avoid the power limit. You MUST learn to use the voltage/freq curve to achieve this. I'll post up something later once I'm home on the rig I did this on, but here are a couple of videos I found that show/explain how. (Russian and German speakers rejoice; English speakers... you can see what's happening.)











In a nutshell: you apply a slider-based OC with the chart open (Ctrl+F). You'll see the slider applies the + across all voltage/freq points. Try going for a 2000-2050 OC at the 1.000V point and then drag all points past that down. Or, if the driver crashes, choose a slightly higher voltage point to try, like 1.013v. This forces Boost 3.0 to choose the highest freq point at the lowest voltage point, temperature based, which we can't see in a chart unfortunately. There does appear to be a temp/voltage correlation that I've seen but have not quantified.

This will take way more time to adjust vs. just a little slider adjustment and apply, but it's worth it to find the best compromise between voltage, frequency, and power target use.
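The curve edit described above can be sketched in a few lines. A minimal illustration only: the points are made up, not readings from any real card, and Afterburner itself isn't scripted this way; this just shows the shape of the "pin one point, drag everything past it down" edit.

```python
def undervolt_curve(curve, target_mv, target_mhz):
    """Flatten a Boost 3.0 voltage/frequency curve: pin target_mhz at
    target_mv and drag every higher-voltage point down to the same
    frequency, so the boost algorithm never selects a point above
    target_mv."""
    return [(mv, mhz if mv < target_mv else target_mhz) for mv, mhz in curve]

# Illustrative curve points (mV, MHz), loosely shaped like a stock curve
stock = [(800, 1700), (900, 1900), (1000, 1975), (1050, 2025), (1093, 2063)]
flat = undervolt_curve(stock, 1000, 2050)
print(flat)  # every point at or above 1000 mV now sits at 2050 MHz
```

Points below the chosen voltage are left alone; only the upper part of the curve gets flattened, which is exactly why the card stops reaching for the high-voltage, power-hungry points.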


----------



## mcg75

Quote:


> Originally Posted by *jrcbandit*
> 
> How are you temps so low, are you playing less demanding games or have cold ambient temperature? Watch Dogs 2 and Mass Effect Andromeda go up to 52-53ish C for me, but I am also overclocking the memory to +500 if that causes more heat. I also have a very poor core clock overclocker. I can only go to 2025 Mhz core/+120, any higher than that and it crashes in games even after adding the hybrid cooler ;p. In fact, I dunno if I can maintain the +120, I have to play ME: Andromeda some more to see, lol. I know at +100 on air, the game never crashed.


I've actually never really looked into seeing if my temperatures were that good. The fact that the hybrid is so much quieter than the card's fans sold me on it.

I just let Heaven run for 20 minutes. GPU usage 99%. Leveled off at 45C. Ambient room temp is 22C.


----------



## Bishop07764

Quote:


> Originally Posted by *Mato87*
> 
> Ahhh, I understand now, you have the radiators for the water cooling, but why 4 man? where do you have them? Outside the case? or do you have a gigantic case?


I got into water-cooling, and that can become a bit of an obsession, as others here could testify to, I'm guessing. It's all in my gigantic Corsair 900D. One 480 is hung on the ceiling, a 480 and a 280 are in the bottom, and another 240 is hung on the hard drive cage. I wanted to go crazy and have something to last me a long time. Any upgrades will always fit in this thing.









I've even water-cooled my wife's HTPC in the living room now. Maxes out at like 32C. Overkill but awesome.
Quote:


> Originally Posted by *mshagg*
> 
> Haha, yeah my loop was built to contain crossfire 290x's - about 600W of heat to dump from the GPUs alone. 50mm thick 420 + 480 with 11 fans between them now pretty overkill for a CPU and the 1080Ti, but it's quiet and well behaved. Damn thing weighs like 30KG though.


That's awesome. I love not having to worry about whatever upgrade I might want to do. It will cool it. But you're right, it weighs a ton.


----------



## Mato87

Quote:


> Originally Posted by *Bishop07764*
> 
> I got into water-cooling, and that can become a bit of an obsession, as others here could testify to, I'm guessing. It's all in my gigantic Corsair 900D. One 480 is hung on the ceiling, a 480 and a 280 are in the bottom, and another 240 is hung on the hard drive cage. I wanted to go crazy and have something to last me a long time. Any upgrades will always fit in this thing.
> 
> 
> 
> 
> 
> 
> 
> .


Damn, that's a huge case. I reckon it weighs around 25 kg with all the components.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *slickric21*
> 
> Is there a concensus as to what would be the ideal copper shim size for mounting an AIO with a flat base (such as H55 with a Kraken G10) ?
> 
> I have searched the thread and saw someone mentioned the GPU die is 25x20mm so a shim of 25x20x2mm would be needed. I haven't seen such sized shims anywhere.
> 
> I have a 25x25x1.5mm shim that I had on my 980ti.
> 
> Looking at getting a founders edition 1080ti and mounting my H75 with KrakenG10 on it, in the same way that GamerNexus did their DIY hybrid. However I see that that use an eVGA GPU AIO that has the stepped base - so negating the need for a shim.


That shim size should still work, as I have one with an H55. This card runs hotter though; I get 58 to 60C with a 2000 rpm fan, which is too loud. Some people have successfully used this copper disc, which will give you more surface area and should get you better performance.

Get the 1.5" one.

https://www.etsy.com/listing/294213319/copper-discs-14-gauge-stamping-blanks?ref=pla_similar_listing_top-1


----------



## geriatricpollywog

When watercooled, do the AIB cards with additional power phases clock higher than Founder's?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *0451*
> 
> When watercooled, do the AIB cards with additional power phases clock higher than Founder's?


I have this gun for you, if you point it at your head and pull the trigger it'll send you 4 weeks into the future so you can find out. JK.

The answer is that it's a gamble. First you have to win the silicon lottery and get a card that can OC past 2050 without crashing, and having the card at 2050 will put you around 115%-118% power draw. Look at all the reviews and people here; no one is really pushing past 2100 on "average."

Now if your card is a lottery winner, then yes, that extra power might help you. The card at 2100+ MHz will probably draw like 125% power max sometimes, but sorry, I highly doubt you'll get to 2150 MHz. But you won't throttle and lose that 1-2% of extra frames. Yes, that's right, that 1-2% of extra frames.

So, partner cards: you have to win the lottery, but first you should decide if it's worth paying extra money and risking an OC dud, like the latest Asus 1080 Ti review where they couldn't even OC the card and boost it to 2000 MHz without crashing, and this is a review site that should have gotten a binned card.
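For reference, the power-limit percentages being thrown around here are relative to the Founders Edition's 250W rated board power, so they convert to watts like this (simple arithmetic, not measured draw):

```python
BASE_TDP_W = 250  # GTX 1080 Ti Founders Edition rated board power

def pct_to_watts(pct):
    """Convert a power-target percentage to watts for the FE card."""
    return BASE_TDP_W * pct / 100.0

for pct in (100, 115, 120, 125):
    print(f"{pct}% -> {pct_to_watts(pct):.1f} W")
```

So the 115%-118% figures quoted for 2050 MHz work out to roughly 288-295W, and a 125% spike is about 312W.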


----------



## barsh90

Quote:


> Originally Posted by *0451*
> 
> When watercooled, do the AIB cards with additional power phases clock higher than Founder's?



Not in this generation, with Pascal (1080, Titan Xp, and 1080 Ti), because they're bottlenecked by their locked voltage.


----------



## Slackaveli

Quote:


> Originally Posted by *0451*
> 
> When watercooled, do the AIB cards with additional power phases clock higher than Founder's?


The Founders Edition has better power phases than some of the aftermarket cards. Unless one of them (likely Zotac) releases one where the power limit is raised to 130% or 150%, then no, none will beat the FE. And even if the power limit is raised, you'll need a lucky chip.


----------



## MrKenzie

Quote:


> Originally Posted by *Slackaveli*
> 
> Why would you set your frame limit OVER your refresh rate? All the input lag testing I've seen says to set it 1 below your refresh rate at 60Hz, and 2 below at 144. Aka, set the limiter at 142 on a 144Hz monitor (59 on a 60Hz). NO input lag at all, no tearing, and slightly less workload for your GPU, keeping power target throttling to a minimum.


I have always run G-Sync on and V-Sync on FAST. I was under the impression that this had all the benefits of what you mentioned, but with a higher refresh rate. I only recently set a cap at 70fps and I didn't notice any difference in lag. It's all very confusing and I'm happy to be proven wrong.


----------



## pez

Excited to see that the hybrid for this and the TXP is officially announced. IIRC, the preorder for the FTW3 and SC2 should be up tomorrow. Hopefully that means a nice ICX non-SC version for step-up should follow soon after.


----------



## Zirc60

On my watercooled 1080 Ti, power % maxed out and voltage @ 100: the core is stable @ +170, but I am also able to overclock the memory over +1200 MHz. Is that normal? I get some artifacts if I go over 1200MHz. It seems weird that everyone is doing like 400-500 and getting freezes if unstable, but I have no problem at 1200+ :S


----------



## fisher6

Quote:


> Originally Posted by *Zirc60*
> 
> On my watercooled 1080 Ti, power % maxed out and voltage @ 100: the core is stable @ +170, but I am also able to overclock the memory over +1200 MHz. Is that normal? I get some artifacts if I go over 1200MHz. It seems weird that everyone is doing like 400-500 and getting freezes if unstable, but I have no problem at 1200+ :S


As I understood it, the new memory on the 1080 Ti can be pushed quite a bit. The highest I tried was 700, and I could probably push even more, but I noticed that beyond 550 I was losing points in Firestrike, so I just dialed it back and saved the extra heat/power.
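That "stable but slower" behavior is why a memory OC sweep should watch the benchmark score, not just stability: error correction can silently eat performance before artifacts ever appear. A hedged sketch of that sweep, where `bench` is a hypothetical callable that runs your benchmark at a given offset and returns the score:

```python
def best_mem_offset(bench, offsets):
    """Walk memory offsets upward and keep the last one that still
    improves the benchmark score, stopping at the first regression.
    `bench(offset) -> score` is a stand-in for an actual benchmark run."""
    best_off, best_score = 0, bench(0)
    for off in offsets:
        score = bench(off)
        if score < best_score:
            break  # past the sweet spot: the gain is being eaten
        best_off, best_score = off, score
    return best_off

# Toy scores shaped like the behavior described above (made-up numbers)
fake_scores = {0: 20000, 300: 20150, 550: 20300, 700: 20100}
print(best_mem_offset(fake_scores.get, [300, 550, 700]))  # 550
```

With the toy numbers, +700 still runs fine but scores lower than +550, so the sweep settles on +550, matching the "dial it back" approach above.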


----------



## mshagg

Little memory/CPU OCing session playing around with a 125 strap on this board and *finally* pushed over 10K in Time Spy with the 1080Ti.

http://www.3dmark.com/3dm/18887285?

Phew.

EDIT: Strix review, sounds like it has a higher power limit. 20W more power consumption than the FE and with their OC the clocks look pegged at 2050.

http://www.kitguru.net/components/graphic-cards/luke-hill/asus-rog-strix-gtx-1080-ti-oc-o11g-11gb-review/


----------



## Gripen90

Got these a bit over a week ago. Replaced my 3x GTX 980Ti's
http://s95.photobucket.com/user/Gri...i/20170315_195250_edit_1489619728423.jpg.html
http://s95.photobucket.com/user/Gripen90/media/GTX 1080Ti/IMG_8928.jpg.html


----------



## MightEMatt

In stock on Nvidia store right now.

Edit: Out of stock again.


----------



## NYU87

Waiting for 1080ti AIBs...


----------



## alucardis666

Quote:


> Originally Posted by *NYU87*
> 
> Waiting for 1080ti AIBs...


Why?


----------



## NYU87

Quote:


> Originally Posted by *alucardis666*
> 
> Why?


Noise and aesthetic reasons. The FE won't fit my current build.


----------



## Blotto80

So for those of you who have slapped an AIO on their Ti, what are you running the on-card blower fan at to cool the VRMs? If I leave it at the default curve it never really spins up as the core doesn't get hot enough to trigger it. If I manually adjust it, I can get it to about 36% before I can hear it. Does that seem enough to cool the VRMs?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Blotto80*
> 
> So for those of you who have slapped an AIO on their Ti, what are you running the on-card blower fan at to cool the VRMs? If I leave it at the default curve it never really spins up as the core doesn't get hot enough to trigger it. If I manually adjust it, I can get it to about 36% before I can hear it. Does that seem enough to cool the VRMs?


I put it to the minimum. I believe GamersNexus did some thermal imaging on the Pascal Titan and it didn't matter what speed you put it on.

But that makes sense to me. The midplate is just that, it's a plate. Without fins, it won't be releasing heat that fast since it's just a slab of metal.


----------



## frunction

Can you change the LED color on the 1080 Ti Founders Edition card? Or is it still stuck on green?

You'd think they'd have put RGB on by now.


----------



## NoDoz

Hey guys, got my 1080 Ti cooler in today. Ran one Heaven benchmark with a max temp of 42. Seems to be working really well so far. More testing going on now.


----------



## JedixJarf

Quote:


> Originally Posted by *NoDoz*
> 
> Hey guys, got my 1080ti cooler in today. Ran one heaven benchmark and max temp of 42. Seems to be working really good so far. More testing going on now.


Looks rad


----------



## mouacyk

How can you guarantee no liquid loss on those QDCs? If you lose any liquid, it would create a bubble in the closed loop and affect cooling.


----------



## NoDoz

Beating my old Heaven benchmark by 170 points with no overclock, because temps stay so low.


----------



## orion933

Hello !

I plan to buy one of these bad boys to replace my SLI 780s!

I would like to know if my CPU will bottleneck it.

specs:
cpu: i7 4790k
ram: 16GB DDR3 2400MHz G.Skill
monitor: PG278Q 1440p 144Hz G-Sync
PS: I stream on Twitch, if it is relevant.

thank you !


----------



## Slackaveli

Quote:


> Originally Posted by *MrKenzie*
> 
> I have always ran G-Sync on and V-Sync on FAST. I was under the impression that this had all the benefits of what you mentioned, but with a higher refresh rate. I only recently set a cap at 70fps and I didn't notice any difference to lag. It's all very confusing and I'm happy to be proven wrong


No, you are very close to perfect. I also run on FAST. IT'S GREAT. It may even be negligible, but supposedly the very best settings would be FAST, G-Sync on, and the frame limiter at 59 (60Hz monitor), 99 (100Hz monitor), 142 (144Hz monitor), or 163 (165Hz monitor). It's an input lag tweak. Very nice settings imo.
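The caps listed above are a rule of thumb from this thread (roughly 1-2 fps under the refresh rate), not an official NVIDIA figure; tabulated, with a fallback for rates not mentioned:

```python
# Frame caps quoted in the thread for G-Sync + FAST sync
# (poster's rule of thumb, not an official recommendation)
SUGGESTED_CAP = {60: 59, 100: 99, 144: 142, 165: 163}

def frame_cap(refresh_hz):
    """Return the quoted cap for a refresh rate, falling back to
    refresh - 2 for rates not listed."""
    return SUGGESTED_CAP.get(refresh_hz, refresh_hz - 2)

print(frame_cap(144))  # 142
print(frame_cap(120))  # 118 (fallback guess, not from the thread)
```

The point of capping just under the refresh rate is to keep G-Sync engaged instead of hitting V-Sync behavior at the ceiling, which is where the input lag comes from.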


----------



## looniam

Quote:


> Originally Posted by *JedixJarf*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *NoDoz*
> 
> Hey guys, got my 1080ti cooler in today. Ran one heaven benchmark and max temp of 42. Seems to be working really good so far. More testing going on now.
> 
> 
> 
> 
> 
> 
> 
> Looks rad


----------



## wanako

Quote:


> Originally Posted by *JedixJarf*
> 
> Looks rad


----------



## Alwrath

Quote:


> Originally Posted by *barsh90*
> 
> Unless you got an EVGA card, then your voltage is unlocked with Precision X ✌
> 
> Not in this generation with pascal 1080, Titan xp and 1080 ti. Due to the fact that they are bottle-necked by their locked voltage.


----------



## Asus11

Quote:


> Originally Posted by *orion933*
> 
> Hello !
> 
> I plan to buy one of these bad boy to replace my sli 780!
> 
> I would like to know if my cpu will bottleneck it ?
> 
> specs:
> cpu: i7 4790k
> ram 16g DDR3 2400mhz gskill
> monitor: pg278q 1440p 144hz g sync
> ps: I stream on twitch if it is revelant.
> 
> thank you !


Should be perfect; make sure you try to overclock the i7 4790K though.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> I put it to the minimum. I believe Nexus did some thermal imaging on the Titan Pascal and it didn't matter what speed you put it on.
> 
> But that makes sense to me. The midplate is just that, it's a plate. Without fins, it won't be releasing heat that fast since it's just a slab of metal.


A couple of finned heatsinks on that backplate could help.


----------



## KedarWolf

Quote:


> Originally Posted by *Zirc60*
> 
> On my watercooled 1080ti, maxed out power % and voltage @ 100. The core is stable @ +170. But i am also able to overclock memory over +1200 mhz. Is that normal? i get some artifacts if i go over 1200mhz. Seems weird everyone is doing like 400-500 and getting freeze if unstable. But i have no problem at 1200 + :S


You running Heaven when checking? Kinda hard to believe you're getting +1200 memory, to be honest. I was lucky to get +650 for benching.









Edit: Would love to see Heaven running windowed mode with Afterburner open in a screenshot.


----------



## Kaltenbrunner

What are they like vs a 980 Ti at 1440p? I never checked, and I'm tired and hungry.


----------



## havoc315

Just got my EKWB Titan waterblocks and backplates in today. Also received my CLU delidding kit and Grizzly Kryonaut, along with Fujipoly Ultra pads. I think I will be doing the shunt mod along with the Skylake 6700K delid at the same time. Needless to say, this is going to be interesting, lol.


----------



## alucardis666

Quote:


> Originally Posted by *NoDoz*
> 
> Hey guys, got my 1080ti cooler in today. Ran one heaven benchmark and max temp of 42. Seems to be working really good so far. More testing going on now.


What cooler is that and where do I get one?! lol. Looks so much nicer than the EVGA Hybrid cooler.


----------



## hoevito

Had mine for a couple weeks now, under water for the last week. Max stable overclock achieved was +160 core, +550 memory. Core will top out at 2063 MHz, memory at 6055 MHz. Holding steady at 50 degrees after looping Heaven for around 30 minutes. Substantial upgrade coming from SLI 780 Lightnings.









http://s69.photobucket.com/user/hoevito/media/card_zpsotqmd8mk.png.html


----------



## Slackaveli

Quote:


> Originally Posted by *havoc315*
> 
> Just got my EKWB Titan Waterblocks and Backplates in today, also received my CLU deliding kit and grizzly kryonaut along with Fuji poly ultra pads and also I think i will be doing the shunt mod along with the skylake 6700k delid at the same time, needless to say this is going to be interesting lol


you went full on darkside


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> You running Heaven when checking? Kinda hard to believe you're getting +1200 memory to be honest. I was lucky to get +650 for benching.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Would love to see Heaven running windowed mode with Afterburner open in a screenshot.


Or at least a GPU-Z screenshot. I'm not setting it that high even just to look at the stats, but I am curious what his bandwidth is. Like 575?


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> you went full on darkside


That sounds pretty beastly!


----------



## KCDC

Quote:


> Originally Posted by *havoc315*
> 
> Just got my EKWB Titan Waterblocks and Backplates in today, also received my CLU deliding kit and grizzly kryonaut along with Fuji poly ultra pads and also I think i will be doing the shunt mod along with the skylake 6700k delid at the same time, needless to say this is going to be interesting lol


Hey there, were you able to find the right sized Fujipoly pads? I can't find them on Amazon. It's for the waterblock, right? Mind linking what you purchased? Those pads do wonders. Amazon only has like two sizes that I could find...


----------



## Zirc60

I used ASUS GPU Tweak here, putting 1250 on the memory. I guess it's not accurate, because I tried on MSI Afterburner and could not even do 1000 before everything froze.


----------



## havoc315

Yeah, I decided to go ahead and go all the way, lol. I just bought 2 Gigabyte 1070 G1s with EK blocks too and didn't even get to use one; one is only 3 months old and the other like 2 weeks, so now I've got to try and move them, lol. I didn't know the Ti's were coming that soon... Anyway, I was able to get the right sizes:
https://www.amazon.com/gp/aw/d/B00MQ65K0G/ref=mp_s_a_1_3?ie=UTF8&qid=1490675017&sr=8-3&pi=AC_SX236_SY340_FMwebp_QL65&keywords=fujipoly+17.0+w+mk
https://www.amazon.com/gp/aw/d/B00ZSELP3O/ref=psd_mlt_nbc_B00MQ65K0G_ri

And here is one more, just for good measure, in case you need it:
https://www.amazon.com/gp/aw/d/B00ZSJR1ZK/ref=mp_s_a_1_6?ie=UTF8&qid=1490675017&sr=8-6&pi=AC_SX236_SY340_FMwebp_QL65&keywords=fujipoly+17.0+w+mk
And I can't find the 100x20 for the VRMs, so I just cut mine to size on top of each one using the 100x15.


----------



## Zirc60

Tried with MSI Afterburner now; only got +650 on memory. I was able to run Valley, but I saw frames stop for a second or two. So I guess ASUS GPU Tweak is wrong?


----------



## KedarWolf

Quote:


> Originally Posted by *Zirc60*
> 
> 
> 
> Tried with msi afterburner now, only got 650 + on memory, was able to run valley, but i saw a frame stop for a second or two. So i guess ASUS gpu tweak is wrong?


Yes, quite wrong.

Edit: And if the screen freezes for a few seconds, then restarts, you will very likely have much lower framerates from the driver crash until you restart the PC. And if you run Time Spy or something and get much lower benches even after a reboot, it's driver reinstall time as well.


----------



## Zirc60

Quote:


> Originally Posted by *KedarWolf*
> 
> Yes, quite wrong.
> 
> Edit: And if screen freezes a few seconds, then restarts, you very likely will have much lower framerates from driver crash until you restart PC. And if you run TimeSpy or something and get much lower benches even on a reboot it's driver reinstall time as well.


Yeah, I saw that my GPU memory clock was a little higher in GPU-Z with MSI Afterburner at 650 than with ASUS GPU Tweak at 1250.


----------



## lilchronic

+750 on the memory. Had a couple of artifacts.
2560x1440 x8AA


----------



## Mato87

Hi, can someone finally answer my question?

When the non-reference 1080 Ti cards arrive, I plan on playing Quantum Break after I get one. Can someone do a test or a little benchmark with Quantum Break? I prefer to play at 1080p with everything ultra and upscaling turned off, obviously, and with my GTX 980 Ti I get around 40-45 fps max, in combat scenes even below 40. Can someone test how it plays and performs on a 1080 Ti, please? Thanks! The important thing is to leave the ugly, muddy, blurry upscaling option turned OFF!

Also, a Mafia 3 benchmark: I would prefer a real-world gameplay scenario of you driving around the city and the swamps to see what the fps is, because the FPS seems to tank quite a lot during fast driving through these areas, especially during the day in the swamps and the city.


----------



## lilchronic

Quote:


> Originally Posted by *Mato87*
> 
> Hi, can someone finally answer my question ?
> 
> When the non reference 1080 ti cards will arrive, I plan on playing quantum break after I get one, but can someone do a test or a little benchmark with quantum break ? I prefer to play on 1080p and everything ultra with upscaling turned off obviously and with my gtx 980 ti I get around 40 - 45 fps max, in combat scenes even below 40. Can someone do a test for me on how it plays and performs on a 1080 ti please? Thanks.!!!! Important thing is to leave the ugly muddy, blurry upscaling option turned OFF !!!!
> 
> Also do a Mafia 3 benchmark, I would prefer a real world gameplay scenario of you driving around the city and the swamps and see what the fps was, because it seems the FPS tanks quite a lot during fast driving through these areas, especially during day in swamps and the city.


Buy both games for me and I'll do the tests.


----------



## Mato87

Quote:


> Originally Posted by *lilchronic*
> 
> Buy both games for me and ill do the test's.


C'mon, now, both games were recently on sale for a mere 20 bucks; I believe Quantum Break was even cheaper than that... If you can afford an $800 GPU, you can definitely spare some change for a game.


----------



## shalafi

Quote:


> Originally Posted by *mcg75*
> 
> I've actually never really looked into seeing if my temperatures were that good. The fact that the hybrid is so much quieter than the card's fans sold me on it.
> 
> I just let Heaven run for 20 minutes. GPU usage 99%. Leveled off at 45C. Ambient room temp is 22C.


Interesting... are you using any frame limiter or V-Sync? I just did the 1080 EVGA Hybrid kit swap yesterday and my temps are nowhere near that. Did you replace the default thermal paste? Also, did you use the stock screws (with springs), or the ones from the Hybrid? I assumed they are the same...


----------



## lilchronic

Quote:


> Originally Posted by *Mato87*
> 
> c'mon naw, both games were recently on sale for mere 20 bucks, I believe quantum break was even cheaper than that...If you can afford a 800 dollars gpu, you can definitely spare some change for a game


----------



## Mato87

Quote:


> Originally Posted by *lilchronic*


That's indoors at 1080p, right? Go outside and do some testing while driving around the city during the day, and in the swamps too, please. And please, if you can, test Quantum Break with upscaling turned off at 1080p as well, both during regular gameplay and in combat. Thank you!









*EDIT

Ahh, I just checked and it's at 1440p; can you possibly do a full HD benchmark please?


----------



## Zirc60

Kinda off topic: is there any disadvantage to having an SLI bridge connected to only one card? I have only one GPU, but the ROG SLI bridge is so cool with its light and all.


----------



## shalafi

By the way, really happy with the upgrade from my 980 Ti (which was in no way a slowpoke). Comparison OC to OC:


http://www.3dmark.com/compare/fs/12086773/fs/12143587


----------



## pez

Quote:


> Originally Posted by *Mato87*
> 
> Hi, can someone finally answer my question ?
> 
> When the non reference 1080 ti cards will arrive, I plan on playing quantum break after I get one, but can someone do a test or a little benchmark with quantum break ? I prefer to play on 1080p and everything ultra with upscaling turned off obviously and with my gtx 980 ti I get around 40 - 45 fps max, in combat scenes even below 40. Can someone do a test for me on how it plays and performs on a 1080 ti please? Thanks.!!!! Important thing is to leave the ugly muddy, blurry upscaling option turned OFF !!!!
> 
> Also do a Mafia 3 benchmark, I would prefer a real world gameplay scenario of you driving around the city and the swamps and see what the fps was, because it seems the FPS tanks quite a lot during fast driving through these areas, especially during day in swamps and the city.


http://www.techspot.com/review/1352-nvidia-geforce-gtx-1080-ti/page6.html

Check them out. Not sure their methodology on benchmarking, but they tested both QB and Mafia III. When all else fails, just search for TXP results and you can at least expect that out of it.


----------



## Mato87

Quote:


> Originally Posted by *pez*
> 
> http://www.techspot.com/review/1352-nvidia-geforce-gtx-1080-ti/page6.html
> 
> Check them out. Not sure their methodology on benchmarking, but they tested both QB and Mafia III. When all else fails, just search for TXP results and you can at least expect that out of it.


That's the thing, man: I can't find whether it was tested with upscaling turned off or not, and I have a strong hunch it was tested with upscaling on. The Mafia 3 results are great, but there's no comparison with the GTX 980 Ti in any of the benchmarks, and we don't know how they tested it. For all we know it could be all indoor areas, which have significantly higher framerates.


----------



## pez

Quote:


> Originally Posted by *Mato87*
> 
> That's the thing man, I can't find whether it was tested with the upscalling turned off or not, I have a strong hunch it might have been tested with upscalling turned on. As for the Mafia 3 results, that's great, but there is no comparison with gtx 980 ti in any of the benchmarks, also we don't know how they tested it, for all we know it could be all indoor areas, which have significantly higher framerates.


Well IIRC, the 980Ti and the 1070 perform fairly closely. If anything the 1070 may be about 10% faster on average at this point, but that's me throwing a number out there. I recall the difference at launch being ~5%.


----------



## lilchronic

Quote:


> Originally Posted by *Mato87*
> 
> That's indoors in 1080p right? Go outside and do some testing while driving around the city during day and in the swamps too please. And please, if you can, test quantum break with upscalling turned off at 1080p aswell, both during the regular gameplay and in combat. Thank You !
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *EDIT
> 
> ahh I just checked and it's at 1440p , can you possibly do a ful hd benchmark please?


Here 1080p


----------



## mcg75

Quote:


> Originally Posted by *shalafi*
> 
> Interesting .. are you using any frame limiter or vsync? I just did the 1080 EVGA Hybrid kit swap yesterday and my temps are nowhere near that. Did you replace the defualt thermal paste? Also, did you use the stock screws (with springs), or the ones from the Hybrid? I assumed they are the same ..


In gaming, yes. 59 fps limiter with PrecisionX for Fallout 4. 59 fps is very smooth, 60 is jittery.

In benching, no.

The hybrid kit was on the 1080 before this with similar temps. So the thermal paste on the 1080 Ti is now Arctic Silver MX2.

All screws used are from the card itself.

What are your temps?


----------



## shalafi

Quote:


> Originally Posted by *mcg75*
> 
> What are your temps?


I'm peaking at about 63°C after 10-15 minutes of benching - unacceptable. Just ordered some Cooler Master MasterGel Maker (jeez, what a name), as it was the best that was available immediately. I'll pick it up in an hour and redo the Hybrid kit in the evening.


----------



## Mato87

Quote:


> Originally Posted by *lilchronic*
> 
> Here 1080p


You're a life saver! That's basically the beginning of the game. Can I ask you for one more favour? Do a drive through the bayou, you know, the swamps, during daylight too. I'm liking the results though: it's basically above 100 fps all the time and sometimes jumps to 125 fps, which is just fantastic. Can't wait to get my GTX 1080 Ti.

Also, how about Quantum Break? With the upscaling turned off it's a killer even for the Titan X Pascal.


----------



## dboythagr8

Quote:


> Originally Posted by *Slackaveli*
> 
> no, you are very close to perfect. i also run on fast. IT'S GREAT. it may even be negligible but supposedly the very best settings would be fast, g-sync on, and frame limiter at 59 (60Hz monitor) or 99(100Hz monitor) or 142(144Hz monitor) or 163 (165 Hz monitor). It's an input lag tweak. very nice settings imo.


Wait what? Gsync and Vsync fast should cancel each other out right? How can they work together to decrease lag? If anything I'd think it would add a touch of it?


----------



## Slackaveli

Quote:


> Originally Posted by *dboythagr8*
> 
> Wait what? Gsync and Vsync fast should cancel each other out right? How can they work together to decrease lag? If anything I'd think it would add a touch of it?


Nope, they actually won't conflict at all. You could leave Fast Sync off if you want, but testing shows they work better together than independently.


----------



## Radox-0

Quote:


> Originally Posted by *dboythagr8*
> 
> Wait what? Gsync and Vsync fast should cancel each other out right? How can they work together to decrease lag? If anything I'd think it would add a touch of it?


G-Sync works within the monitor's refresh rate range. Go over that limit without some form of limiter (V-Sync, an FPS cap, etc.) and you get all the issues you'd have with a normal panel. So you want V-Sync, Fast Sync, or even an FPS limiter there, basically to plug the gap at the top end where G-Sync stops doing its magic.
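The limiter values quoted earlier in the thread (59/99/142/163) all sit a frame or two under the panel's refresh. A minimal sketch of that rule of thumb, using only the thread's own numbers; the function name and the fallback for unlisted refresh rates are my own:

```python
# Frame-rate caps quoted in this thread, keyed by panel refresh rate (Hz).
# Idea: stay 1-3 fps under the refresh ceiling so G-Sync never hands off
# to V-Sync/Fast Sync at the top end, which is where the input lag lives.
THREAD_CAPS = {60: 59, 100: 99, 144: 142, 165: 163}

def fps_cap(refresh_hz):
    """Return the limiter value suggested in the thread for a refresh rate."""
    # Fall back to "2 under refresh" for rates the thread didn't mention.
    return THREAD_CAPS.get(refresh_hz, refresh_hz - 2)

print(fps_cap(144))  # 142
print(fps_cap(120))  # 118 (fallback guess, not from the thread)
```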


----------



## Slackaveli

Quote:


> Originally Posted by *Mato87*
> 
> That's the thing man, I can't find whether it was tested with the upscalling turned off or not, I have a strong hunch it might have been tested with upscalling turned on. As for the Mafia 3 results, that's great, but there is no comparison with gtx 980 ti in any of the benchmarks, also we don't know how they tested it, for all we know it could be all indoor areas, which have significantly higher framerates.


just never use upscaling, man.


----------



## JedixJarf

Quote:


> Originally Posted by *looniam*


Quote:


> Originally Posted by *wanako*


My dad jokes are as cool as a waterblock yo.


----------



## Norlig

Any chance in increasing the Power limit on the Founders editions?









ie. Anyone know if anyone is working on it or not?


----------



## warbucks

My EK waterblock and backplate arrived yesterday. What a difference it makes to ensure you don't hit the temperature limit. I'm sitting at +160 on the core and +350 on memory. Memory has lots of room, and I've had the core up to +175. The GPU core sits between 2062MHz and 2075MHz when gaming and benching. Played BF1 for a couple hours last night and it was steady at 2062MHz. Temperatures hover around 38-40C. I'm running dual 360 rads and also have my 7700K in the loop @ 5.0GHz with no AVX offset.

This card is a beast. I'm running an ultrawide 3440x1440 (Asus MG348Q) and gaming has never been better with this pairing.


----------



## JedixJarf

Quote:


> Originally Posted by *warbucks*
> 
> My EK waterblock and backplate arrived yesterday. What a difference it makes to ensure you don't hit the temperature limit. I'm sitting at +160 on the core and +350 for memory. Memory has lots of room and I've had the core up to +175. GPU Core sits between 2062Mhz and 2075Mhz when gaming and benching. Played BF1 for a couple hours last night and it was steady at 2062Mhz. Temperatures hover around 38-40C. I'm running dual 360 rads and also have my 7700K in the loop @ 5.0Ghz with no AVX offset.
> 
> This card is a beast. I'm running an ultrawide 3440x1440 (Asus MG348Q) and gaming has never been better with this pairing.


Awesome clocks my man.


----------



## Jbravo33

Question for you guys. Playing Mass Effect on ultra settings on my monitor, a Samsung S34E790C (60Hz), I'm getting 90+ fps at 3440x1440. But when I enable vsync it locks at 60 and the game plays smoother. Since I'm limited to 60Hz, does rendering more than that actually make the game less smooth?


----------



## lilchronic

Quote:


> Originally Posted by *Mato87*
> 
> You're a life saver ! Thats the beginning of the game basically, can I ask you for one more favour? To do a drivethrough through the bayou, you know the swamps during daylight too? Iam liking the results though, it's basically above 100 fps all the time and sometimes it jumps to 125 fps which is just fantastic, can't wait to get my gtx 1080 ti.
> 
> Also how about quantum break, with the upscalling turned off it's a killer even for the titanx pascal.











Sorry, I don't have Quantum Break


----------



## Mato87

Quote:


> Originally Posted by *lilchronic*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sorry i don't have quantum break


Nevermind then, thanks for the benchmarks provided though, much appreciated







The wait for my gtx 1080 ti is even longer now


----------



## JedixJarf

Quote:


> Originally Posted by *Jbravo33*
> 
> Question for you guys. So playing mass effect on ultra settings on my monitor which is a samsung s34e790c 60Hz I'm getting 90+ fps on 3440x1440. But then I select vsync it locks it at 60 and the game plays smoother. I'm limited to 60 so does anything more than that make the game not as smooth?


If the monitor is 60Hz, anything over 60 fps won't look smoother; without vsync you're probably noticing screen tearing.


----------



## GTRtank

Quote:


> Originally Posted by *Jbravo33*
> 
> Ordered 1 as well with next day shipping. Should have gotten 2. I'll be selling it anyway once evga comes out with theirs.


Just got my pre order in with EVGA. Check their website.


----------



## PewnFlavorTang

Was the FTW3 pre-order already sold out? The only things I see are the SC BE and the SC2.


----------



## RadActiveLobstr

Got my FTW3 Pre-Order placed.

It sold out rather quickly.


----------



## GTRtank

It was up, yes, at $739.99. The website was getting hit pretty hard. I only got an SC2, but I'm not watercooling since it's going into my mITX build. All air-cooled cards seem to perform the same, so yeah.


----------



## Slackaveli

Quote:


> Originally Posted by *Zirc60*
> 
> Yeah, saw that my gpu memory clock was a little higher on gpuz with msi afterburner at 650 instead of 1250 with asus gpu tweak.


Doesn't it just count differently in Tweak? I think +1300 in Tweak = +650 in Afterburner.

The EVGA site still has the SC cards up for pre-order as we speak. Ship date is 4-17-17, just in time for 420 lol.
Thought about it, but I actually like this FE card, and even though I could switch to the FTW3 and get $30 back (tax was $60 on the FE), I'm not even trippin'. I'm gonna stand pat with the FE.


----------



## trawetSluaP

I have caved and ordered a Block and Backplate, as well as some new coloured fittings! First time I've watercooled a GPU so I'm pretty excited.


----------



## JedixJarf

Quote:


> Originally Posted by *trawetSluaP*
> 
> I have caved and ordered a Block and Backplate, as well as some new coloured fittings! First time I've watercooled a GPU so I'm pretty excited.


Watercooling a gpu is an incredible difference. Welcome to the club, buddy


----------



## havoc315

Quote:


> Originally Posted by *Norlig*
> 
> Any chance in increasing the Power limit on the Founders editions?
> 
> 
> 
> 
> 
> 
> 
> 
> ie. Anyone know if anyone is working on it or not?


You could do a shunt mod if you have the know-how and a little darkness in you; it's not for the faint of heart, I promise. I'll probably be trying it out myself, though not immediately, because I want to see how it runs stock, then with the upgraded stock cooler, then watercooled, and so on.


----------



## JedixJarf

Quote:


> Originally Posted by *havoc315*
> 
> You could do a shunt mod if you have the know how and have a little darkness in you, it's not for the faint of heart I promise....ill be trying it out myself probably, not immediately though because want to see how it runs stock then with upgraded stock then Water cooled and so on.


All you got to do is lay down a layer of liquid metal on the resistor. ez pz. Just don't get any on the rest of the components lol.


----------



## KedarWolf

Quote:


> Originally Posted by *JedixJarf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *havoc315*
> 
> You could do a shunt mod if you have the know how and have a little darkness in you, it's not for the faint of heart I promise....ill be trying it out myself probably, not immediately though because want to see how it runs stock then with upgraded stock then Water cooled and so on.
> 
> 
> 
> All you got to do is lay down a layer of liquid metal on the resistor. ez pz. Just don't get any on the rest of the components lol.
Click to expand...

EK water block and backplate, no shunt mod. Running the Heaven bench at 2052 core / +604 memory, and according to the GPU-Z and Afterburner logs the card only throttles for a couple of seconds of the entire bench from going over the power limit. With the block already on, I'm not taking that risk for almost no gain. It stays at 2052 except for the few seconds you see.









Edit: Anyone tried this mod with a conductive pen? It comes off really easily. I know it's nowhere near the conductivity of liquid metal, but wouldn't it lower the sensed power enough that I wouldn't get that few-seconds dip?


----------



## JedixJarf

Quote:


> Originally Posted by *KedarWolf*
> 
> EK water block and backplate, no shunt mod, running Heaven bench at 2052 core, +604 memory, according to GPU-Z and Afterburner logs card throttles a couple seconds from going over power limit entire bench. Having block already on etc. not taking that risk for almost no gain. Stays at 2052 all except a few seconds you see.


Gain would be minimal for sure, but the tagline of overclock.net is, "The pursuit of performance"


----------



## BigPharma

Hey guys

I picked up an EVGA 1080ti FE last week. I wanted to run some issues by yall to see if there is possibly a problem with the card.

I played BF1 yesterday and was averaging around 110FPS @ 1440p all settings maxed. I tried to run some Playerunknown's Battlegrounds and after about 4 rounds I got a crash with an error that said "insufficient video memory to load texture". I tried to reload the game and promptly crashed the next round with no error message given.

I thought this sounded like an overheating issue since it happened after a prolonged session, so I downloaded the Heaven benchmark and ran it several times, as well as keeping it on loop for about 15 minutes. I did not encounter any crashes or issues with the benchmark, and my score seemed to be in line with others @1440p and 8xAA. I also ran some Overwatch for about an hour without issues.

Now today, I have tried to run BF1 and I cannot even get in game because it crashed and I get the generic "BF1 has stopped working" error message. I ran the Heaven benchmark again and it ran correctly.

My question is, do these symptoms sound like a possible issue with this video card? I know PUBG is prone to crashing, but the error message I received about video memory threw me off. I also haven't had issues with BF1 in the past.

I am most likely looking at doing a reformat as my next step.

My specs are as follows

CPU: i7 6600k - stock
GPU: 1080ti FE - stock
Motherboard: ASUS Maximus Hero
RAM: 16gb Gskill

I appreciate any help you can offer.


----------



## havoc315

Quote:


> Originally Posted by *JedixJarf*
> 
> All you got to do is lay down a layer of liquid metal on the resistor. ez pz. Just don't get any on the rest of the components lol.


Lol, no doubt. The only reason I'm probably going to do it too is that I'm delidding my 6700K as well, so seeing as I have the CLU in hand and will be doing the waterblocks for the GPUs around the same time as the delid, I figured I might as well try. I thought about taking my two Gigabyte 1070 G1s with EK blocks on them and trying it on them first, then selling them as the GPUs Superman would use if he was on a slight budget lol.


----------



## havoc315

Quote:


> Originally Posted by *JedixJarf*
> 
> Gain would be minimal for sure, but the tagline of overclock.net is, "The pursuit of performance"


Exactly it's all about the love


----------



## havoc315

I mod in the pursuit of performance and for the love of it and the ability to say I went to the edge of insanity and made it back with the knowledge and power to change it into something that I can truly call mine.


----------



## KedarWolf

I'm thinking, if I only go over the power limit for a few seconds on a Heaven run, wouldn't a conductive pen have just enough conductivity to lower the resistance enough to never go over, without the risk of liquid metal?


----------



## JedixJarf

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm thinking if I only go over the power limit a few seconds on a Heaven run wouldn't a conductive pen have just enough conductivity to lower the resists enough to never go over without the risk of liquid metal?


Yeah for sure, or silver paint, or soldering on resistors, w/e works for you : )


----------



## havoc315

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm thinking if I only go over the power limit a few seconds on a Heaven run wouldn't a conductive pen have just enough conductivity to lower the resists enough to never go over without the risk of liquid metal?


From what I understand, a conductive pen would lower it too much or not enough. You want it to be about half the ohms and no more (so from 5 ohms, down to roughly 2.5), but be careful, because that will give the card a lot of power, enough to do some damage. There's a thread about this if you want to check it out; they know more about it than me.
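For anyone checking the arithmetic: two resistances in parallel combine as R1*R2/(R1+R2), so adding a conductive path equal to the stock shunt halves the total, while a much lower-resistance path drops it far further. A quick sketch with illustrative values, not the card's actual milliohm shunts:

```python
# Sketch of the arithmetic behind the shunt mod (illustrative values only).
# Painting a conductive layer over a shunt adds a resistance r_added in
# parallel with the stock shunt r_stock:
def parallel(r_stock, r_added):
    return (r_stock * r_added) / (r_stock + r_added)

# Equal resistances halve the total, which halves the voltage the card
# senses and therefore the power it *thinks* it is drawing:
print(parallel(5.0, 5.0))  # 2.5 (half of 5)
print(parallel(5.0, 1.0))  # ~0.83, far too low; the card under-reports wildly
```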


----------



## mcg75

Quote:


> Originally Posted by *shalafi*
> 
> I'm peaking at about 63 after 10-15 minutes benching - unacceptable. Just ordered some Cooler Master Master Gel Maker (jeez, what a name), as it was the best that was available immediately, will pick it up in an hour and redo the Hybrid kit in the evening.


Yeah, that's way too high. There may be an issue with the pump if it stays that way.


----------



## GRABibus

Still no ASUS modded Bios to unlock power limits ?


----------



## PewnFlavorTang

Anyone done the shunt mod with a adjustable pot?


----------



## alucardis666

Quote:


> Originally Posted by *mcg75*
> 
> Yeah, that's way too high. There may be an issue with the pump if it stays that way.


This has been my experience too. My temps now are peaking @ 62-64c, about 2 weeks ago they were maxing at 43c... Maybe my pump has died? Either way this is still much better than the 85c I was getting with the FE cooler.


----------



## pantsoftime

Quote:


> Originally Posted by *PewnFlavorTang*
> 
> Anyone done the shunt mod with a adjustable pot?


You're not going to find a pot with a range of 5 to 10 milliohms that can handle 20A of current. The shunts are current-sense resistors, and all of the current coming in through the power connectors passes through them.
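To make the mechanism concrete: the controller converts the voltage drop across the shunt into a current figure using the shunt value it was designed around, so lowering the real resistance makes the card under-report its draw. A hedged sketch with assumed milliohm values; the function and numbers are illustrative, not the card's actual firmware math:

```python
# Why lowering shunt resistance lowers the *reported* power (sketch).
R_ASSUMED_MOHM = 5.0  # shunt value the controller assumes (illustrative)

def reported_amps(actual_amps, r_effective_mohm):
    # Controller measures V = I * R_real, then infers I = V / R_assumed.
    v_sense_mv = actual_amps * r_effective_mohm
    return v_sense_mv / R_ASSUMED_MOHM

print(reported_amps(20.0, 5.0))  # 20.0: stock shunt reads true
print(reported_amps(20.0, 2.5))  # 10.0: halved shunt halves the reading
```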


----------



## jleslie246

EVGA is letting you preorder the 1080ti SC. I just preordered mine! Shipping will start April 17th! I hope EK will have a waterblock for this soon.


----------



## Essenbe

I pre Ordered a 1080 TI FTW3 today. Supposed to ship May 1, darn it.


----------



## al210

Quote:


> Originally Posted by *Essenbe*
> 
> I pre Ordered a 1080 TI FTW3 today. Supposed to ship May 1, darn it.


----------



## BoredErica

Anyone know if the MSI Armor will work with an EK waterblock for FE? I believe it has 8+8 pin power connectors so now I'm not sure.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *shalafi*
> 
> I'm peaking at about 63 after 10-15 minutes benching - unacceptable. Just ordered some Cooler Master Master Gel Maker (jeez, what a name), as it was the best that was available immediately, will pick it up in an hour and redo the Hybrid kit in the evening.


It depends on what fans you are using. I'm running a H55 in push-pull on mine and it was at around 62-65C with Noctua 1500 RPM fans until I got new Noctua 2000 RPM fans and my temperatures are now at 58-60C... Running these damn fans at max is annoying though. I'm going to change over to a x41 Kraken and see if I get an improvement.


----------



## HeyImJesse

Quote:


> Originally Posted by *Essenbe*
> 
> I pre Ordered a 1080 TI FTW3 today. Supposed to ship May 1, darn it.


I'm with you there, 2 x 1080 Ti FTW3 on order. Oh well, I don't mind the month wait, and at least getting them wasn't as much of a pain as previous-generation preorders: preorders opened at 10am PST, and I got the email confirmation around 10:25am PST. Just hoping EK or Heatkiller will have some blocks ready by then!


----------



## bfedorov11

Quote:


> Originally Posted by *Darkwizzie*
> 
> Anyone know if the MSI Armor will work with an EK waterblock for FE? I believe it has 8+8 pin power connectors so now I'm not sure.


No. It is a custom pcb. Newegg apparently now has nude pics.


----------



## BoredErica

Quote:


> Originally Posted by *bfedorov11*
> 
> No. It is a custom pcb. Newegg apparently now has nude pics.


Okay, thanks.
I guess I can't buy a 1080ti yet then. There are just too many small things nagging me at every 1080ti I can get today.


----------



## illypso

Quote:


> Originally Posted by *Norlig*
> 
> Any chance in increasing the Power limit on the Founders editions?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ie. Anyone know if anyone is working on it or not?


I did some tests with a rear window repair kit; it wasn't low enough to see a difference.

I've heard a lot of good things about CLU, but I didn't have any to try.

If you're good at soldering, you can bridge two of the shunts with a small wire from a multi-wire audio cable; it reduced my power reading from maxing out at 120 (hitting the limit) to 100 (never hitting the limit).

It didn't get me a better clock though. But I'm also getting power supply shutdowns from time to time, so the PSU is having a hard time and may be affecting my tests; I'm waiting for my 1000W unit to see if it was causing the problem.

There's also this post:
http://www.overclock.net/t/1608437/tutorial-power-target-limit-hardware-mod-shunt-mod-for-titan-x-and-many-other-nvidia-gpus


----------



## aylan1196




----------



## c0ld

Quote:


> Originally Posted by *aylan1196*


Jelly. Is the Strix out?

Where can we pre-order the Aorus Xtreme?


----------



## DerComissar

Quote:


> Originally Posted by *aylan1196*


You should always make sure your 1080Ti is wearing a seat belt when riding in the car.


----------



## shalafi

Quote:


> Originally Posted by *mcg75*
> 
> Yeah, that's way too high. There may be an issue with the pump if it stays that way.


Hmm, I can try returning this kit and getting another one. Or I have an unopened NZXT X62 for my CPU (waiting for the bracket), but I suppose that wouldn't make good contact with the GPU out of the box; maybe a shim is needed?
Quote:


> Originally Posted by *alucardis666*
> 
> This has been my experience too. My temps now are peaking @ 62-64c, about 2 weeks ago they were maxing at 43c... Maybe my pump has died? Either way this is still much better than the 85c I was getting with the FE cooler.


Seriously? You were at 43C and now you're over 60C? If this pump theory is true (my pump was a dud and yours died in a week) then that would lower my confidence about the hybrid kit by quite a lot








Quote:


> Originally Posted by *SlimJ87D*
> 
> It depends on what fans you are using. I'm running a H55 in push-pull on mine and it was at around 62-65C with Noctua 1500 RPM fans until I got new Noctua 2000 RPM fans and my temperatures are now at 58-60C... Running these damn fans at max is annoying though. I'm going to change over to a x41 Kraken and see if I get an improvement.


Stock EVGA fan at 100% would top out at around 59C, but it's loud as hell. Noctua NF-F12 PWM at 100% is roughly 60C after I changed the TIM and about half as loud.


----------



## shalafi

I'm really confused now .. found these on reddit:


__
https://www.reddit.com/r/61ymwt/1080_ti_fe_with_hybrid_kit_or_1080_ti_aib/
.

one of them says 60C,the other one <50C with push/pull fans at 65%.









I don't think there are two versions of the hybrid kit; it looks like EVGA has used the same 120mm rad, fan, and pump combo since the 980 Ti.


----------



## Jbravo33

Really wanted to wait for the FTW3, actually the Hydro Copper, but there's no word on that, so I did my own version. Finally done!!! Ti's swimming. Idle temp 26, max temp in Heaven 37!!!! Holy crap. Held 2050 the entire time, not one drop. Haven't even tried to push the overclock; it was still on my last settings from air. Wow!


----------



## Slackaveli

Quote:


> Originally Posted by *shalafi*
> 
> I'm really confused now .. found these on reddit:
> 
> 
> __
> https://www.reddit.com/r/61ymwt/1080_ti_fe_with_hybrid_kit_or_1080_ti_aib/
> .
> 
> one of them says 60C,the other one <50C with push/pull fans at 65%.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't think there are two versions of the hybrid kit, It looks like EVGA use the same 120mm rad, fan and pump combo since the 980Ti.


Makes no sense; my highest temp after a 4-hour BF1 session tonight was 65°C on the Founders Edition stock cooler.


----------



## palote99

Quote:


> Originally Posted by *Slackaveli*
> 
> makes no sense. my highest temps after a 4 hour BF1 session tonight were 65c on Founder's edition on stock cooler.


65?

At what fan speed?

Thanks


----------



## aylan1196

BEAST of a card stock boosts to 1936 on core idle 26c load 62c


----------



## Radox-0

Quote:


> Originally Posted by *aylan1196*
> 
> BEAST of a card stock boosts to 1936 on core idle 26c load 62c


Noice







You lot in Dubai got these really early it seems. I thought the Strix cooler on 1080 looked beastly, looks more so with the thicker heatsink. what sort of overclocks are you getting out of that card?


----------



## aylan1196

Quote:


> Originally Posted by *Radox-0*
> 
> Noice
> 
> 
> 
> 
> 
> 
> 
> You lot in Dubai got these really early it seems. I thought the Strix cooler on 1080 looked beastly, looks more so with the thicker heatsink. what sort of overclocks are you getting out of that card?


The card holds 2050 with ease at only +110 core, and maintains +500 on memory. Didn't mess with it too much.
Edit:
Volt: +50
PL: 120
TL: 90
Max 2088 core
Max memory 6106
Fan 55


----------



## fisher6

Quote:


> Originally Posted by *aylan1196*
> 
> The card holds 2050 with ease only 110 + core and maintain 500 + on memory didn't miss with it too much
> Edit :
> Volt 50+
> Pl 120
> Tl 90
> Max 2088 core
> Max memory 6106
> Fan 55


Are you able to push the card further? To like 2100?


----------



## aylan1196

can reach 2100 on air lol crazy


----------



## GreedyMuffin

I won't ever buy ASUS again due to the stupid "warranty void" sticker.


----------



## mcg75

Quote:


> Originally Posted by *shalafi*
> 
> I'm really confused now .. found these on reddit:
> 
> 
> __
> https://www.reddit.com/r/61ymwt/1080_ti_fe_with_hybrid_kit_or_1080_ti_aib/
> .
> 
> one of them says 60C,the other one <50C with push/pull fans at 65%.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't think there are two versions of the hybrid kit, It looks like EVGA use the same 120mm rad, fan and pump combo since the 980Ti.


My kit is from the 980 Ti. I don't think there is a difference.

It's in a Corsair 540 high airflow case. Rad is mounted as the main rear exhaust fan.

My cpu is cooled with a H100I Corsair cooler at the top of the case.

My tubes are located at the bottom of the rad as that's how my instruction manual for the kit says to install for best acoustics.


----------



## PasK1234Xw

Quote:


> Originally Posted by *palote99*
> 
> 65?
> 
> At what fan speed?


Prob 100% and prob not fully utilizing the card. Anything less FOS


----------



## Benny89

Anyone here with 1080 Ti SLI and 144/165Hz 1440p monitor? What are your FPS in some new games with SLI? Thanks in advance for info!


----------



## Bishop07764

Quote:


> Originally Posted by *Jbravo33*
> 
> Really wanted to wait for ftw3 actually hydro copper but there is no word on that so I did my own version. Finally done!!! Ti's Swimming. Idle temps 26 max temp on heaven 37!!!! Holy crap. Held 2050 the entire time not one drop. Haven't even tried to push overclock was on my last setting when on air. Wow!


Awesome. These things rock under water. You should have the thermal element removed. ?


----------



## 428cobra

Quote:


> Originally Posted by *Bishop07764*
> 
> Awesome. These things rock under water. You should have the thermal element removed. ?


man thats a CLEAN LOOKING RIG CONGRATS


----------



## aylan1196

Loving the strix so far great beast


----------



## Benny89

Quote:


> Originally Posted by *aylan1196*
> 
> Loving the strix so far great beast


Nice! STRIX cooler this time seems to do the job. What are your temps under load with 2100 Mhz? Is it stable on air in games?


----------



## aylan1196

cant wait for ek waterblock best I can push so far


http://imgur.com/pBLHg


----------



## shalafi

Quote:


> Originally Posted by *aylan1196*
> 
> cant wait for ek waterblock best I can push so far
> 
> 
> http://imgur.com/pBLHg


huh? how are you at 50-70% GPU load during a benchmark?


----------



## Benny89

Quote:


> Originally Posted by *aylan1196*
> 
> cant wait for ek waterblock best I can push so far
> 
> 
> http://imgur.com/pBLHg


Why are you at 70% load and only 43°C? This is the beginning of the benchmark, right? Can you please show a screen from the benchmark when it reaches 99% load, and the temps during full load?


----------



## aylan1196

Quote:


> Originally Posted by *shalafi*
> 
> huh? how are you at 50-70% GPU load during a benchmark?


No idea


----------



## bfedorov11

Quote:


> Originally Posted by *aylan1196*
> 
> No idea


What does it do in 3dmark?


----------



## mouacyk

Quote:


> Originally Posted by *shalafi*
> 
> huh? how are you at 50-70% GPU load during a benchmark?


Probably i3 + 1333MHz RAM


----------



## guttheslayer

Hi guys can I join?



Anyway from a GTX 670 to a GTX 1080 Ti.


----------



## keikei

^Not gonna lie, sometimes I wish I had waited for the aftermarket cards... Congrats.


----------



## guttheslayer

Quote:


> Originally Posted by *keikei*
> 
> ^Not gonna lie, sometimes i wish i had waited for the aftermarket cards....Congrats.


It's the only aftermarket card in my country as of now.

There are no other cards except the FE. 1080 Ti stock is really trickling in slowly.


----------



## Benny89

Quote:


> Originally Posted by *guttheslayer*
> 
> Hi guys can I join?
> 
> 
> 
> Anyway from a GTX 670 to a GTX 1080 Ti.


Nice, let us know how the OC results turn out!


----------



## guttheslayer

Quote:


> Originally Posted by *Benny89*
> 
> Nice, let us know the how are OC results!


I tried OCing and couldn't really get past 2038MHz.

My main stable clock is 2025MHz. Anything higher will crash 3DMark. I'm not sure why, but it's definitely not limited by power or temps.

I'm not sure if Gaming or OC mode makes a difference; I haven't downloaded the Aorus software yet.


----------



## Benny89

Quote:


> Originally Posted by *guttheslayer*
> 
> I try OCed and I couldnt really get past 2038MHz
> 
> Mainly stable clock is 2025 MHz. Anything higher will auto crash 3Dmark. I am not sure why but definitely not limited by power / temps factor.
> 
> I am not sure if Gaming or OC mode will affect. I haven't download the Aorus software.


That's ok; the average OC for the STRIX is also around 2050 on the core. Better chips will maybe hit 2100.

I think, similar to the whole Pascal lineup, these chips are not "OC friendly". They boost so much on their own (in reviews the STRIX boosted to 1900 by itself) that there's not much OC headroom left.

From what I've seen in the 1080 Ti Owners Club thread, rarely has any card gone above 2100 even on water loops, so there it is.


----------



## Mato87

Quote:


> Originally Posted by *guttheslayer*
> 
> Hi guys can I join?
> 
> 
> 
> Anyway from a GTX 670 to a GTX 1080 Ti.


Damn, how are you able to get the custom-PCB non-reference ones so early? I thought they wouldn't be available until mid-May...


----------



## aylan1196

3dmark







http://www.3dmark.com/3dm/18931212?


----------



## Slackaveli

Quote:


> Originally Posted by *palote99*
> 
> 65?
> 
> At what fan speed?
> 
> Thanks


it'll get as high as 69c if i leave it uncapped, but that gives me input lag and an occasional screen tear with no benefit, so i cap it at 59 in 4k and 142 in QHD. That way the card gets a bit of rest here and there (that's funny to me and very awesome: resting at 4k/60 ultra). but, yeah, great temps. I have Thermal Grizzly Kryonaut on it and run a 1:1 fan curve up to 57c, then add +10 to that, so at 60c it's 70% and at 70c it's 80%, but it never gets that high. my case is great, and at 70% it is pretty quiet, about the same as my house's ac/heat vents and my case fans, etc. tomorrow Im slapping a pair of 100mm x 30mm x 18mm heatsinks on the backplate over the mems/gpu, and I already have a Noctua 3000 pointed at the backplate. My buddy did that and dropped 5c on heavy load.
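The fan curve described above (1:1 with temperature up to 57c, then temp + 10 above that) can be sketched as a simple mapping. This is just an illustration of the post's numbers; the `fan_speed` name and the clamp to 100% are assumptions, not anything Afterburner exposes.

```python
def fan_speed(temp_c: float) -> float:
    """Fan duty (%) for a GPU temp: 1:1 up to 57c, then temp + 10 above that.

    The 100% ceiling is an assumption, not something stated in the post.
    """
    duty = temp_c if temp_c <= 57 else temp_c + 10
    return min(duty, 100.0)

# Matches the post: 60c -> 70%, 70c -> 80%
```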


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Anyone here with 1080 Ti SLI and 144/165Hz 1440p monitor? What are your FPS in some new games with SLI? Thanks in advance for info!


140+ on a single Ti


----------



## Slackaveli

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Prob 100% and prob not fully utilizing the card. Anything less FOS


never would i run a fan at 100%, and yeah, the card was resting here and there because it was maxed out at 142 fps, so why run it harder? Ultra everything, btw. I have reached 69c playing FIFA at 5400x2900 DSR'd on a 4k screen using 10gb of vram. Still maxed at 69c.


----------



## bfedorov11

I was just looking at some comparisons from old setups.. all custom loop..

1080ti vs 2x 780ti DCII 1455mhz 1.47v
Graphics score 7868 vs 6299 (single 1080 Ti ahead by 24.9%)
http://www.3dmark.com/compare/fs/12091656/fs/4440923

1080ti vs 2x Titan X Maxwell 1534mhz
Graphics score 7868 vs 9930 (Titan X SLI ahead by 26.2%)
http://www.3dmark.com/compare/fs/12091656/fs/6905062

Probably would be a lot closer in game, since SLI scaling is nowhere near as good as in 3DMark.
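A quick way to check the percentage deltas quoted above from the raw graphics scores (the helper name `pct_diff` is mine, not from any tool mentioned in the thread):

```python
def pct_diff(a: float, b: float) -> float:
    """Percentage by which score a exceeds score b, rounded to one decimal."""
    return round((a / b - 1) * 100, 1)

# Scores from the 3DMark comparisons linked above
print(pct_diff(7868, 6299))  # single 1080 Ti over 780 Ti SLI: 24.9
print(pct_diff(9930, 7868))  # Titan X SLI over single 1080 Ti: 26.2
```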


----------



## shalafi

bought and installed a second hybrid kit, same 60C max - I rest my case


----------



## kaiqi07

New loot, sadly this is the last piece... Wanted to get two to go SLI... Have to wait for restock liao...

*Gigabyte Aorus 1080TI GTX Xtreme with Free Gifts.*


http://imgur.com/sMO5O0V




http://imgur.com/ZGm12GS




http://imgur.com/LRjpd9V


----------



## JedixJarf

Quote:


> Originally Posted by *mouacyk*
> 
> Probably i3 + 1333MHz RAM


Lol does seem odd for sure.


----------



## Slackaveli

Damn, that Aorus is friggin sexy! What store in the USA is selling those?


----------



## Slackaveli

Quote:


> Originally Posted by *shalafi*
> 
> bought and installed a second hybrid kit, same 60C max - I rest my case


need better thermal paste


----------



## KedarWolf

Quote:


> Originally Posted by *Slackaveli*
> 
> Quote:
> 
> 
> 
> Originally Posted by *shalafi*
> 
> bought and installed a second hybrid kit, same 60C max - I rest my case
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> need better thermal paste
Click to expand...

i get about 50C on my EK black with backplate, but that's with two video cards and a CPU on a single 360 rad with two sets of QDCs, which slows coolant flow rate some; it's to be expected.


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> i get about 50C on my EK black with backplate but that's with two video cards and a CPU on a single 360 rad with two sets of QDCs which slows coolant flow rate some, it's to be expected.


yeah, those are great temps. the real key is <58c


----------



## Slackaveli

Quote:


> Originally Posted by *shalafi*
> 
> bought and installed a second hybrid kit, same 60C max - I rest my case


all you need is to drop it by 1 or 2c and you'll be golden. slap a heatsink on the backplate with a strip of Fujipoly and you'll gain a couple degrees. Or repaste with Thermal Grizzly. Even if you used Gelid or another top-notch paste you'd gain a degree or 2; if it's what comes with the hybrid kit, well, there's your problem right there. i dropped 8c on my 980ti evga hybrid when i repasted, and that was with Gelid.


----------



## Benny89

Quote:


> Originally Posted by *aylan1196*
> 
> 3dmark
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/18931212?


Can you post your final Afterburner settings? Nice score!


----------



## KedarWolf

Quote:


> Originally Posted by *Benny89*
> 
> Quote:
> 
> 
> 
> Originally Posted by *aylan1196*
> 
> 3dmark
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/18931212?
> 
> 
> 
> Can you post your final Afterburn settings? Nice score!
Click to expand...

My best score.

+642 memory, +152 core



http://www.3dmark.com/3dm/18828861?

Afterburner with voltage slider mod.


----------



## Jbravo33

Not sure what's going on, but when I first installed the cards with blocks (last night) I ran Heaven and got 180fps. Then I turned on Precision X, started playing with core and mem, and couldn't get past 140fps. Restarted the system and once again I'm back to 180fps. Is Precision X not working correctly with the 1080 Ti? Anyone else notice better numbers after restarting? I have Precision X checked to run at startup, but it doesn't pop up on the desktop. 43c is my highest temp with fans on the lowest speed. Is Precision X working in the background? I want to leave it alone, seeing how I'm getting my best score.


----------



## Bychus

Hi, here I leave you my best score with the 6700k in graphics

http://www.3dmark.com/spy/1467708


----------



## JedixJarf

Quote:


> Originally Posted by *KedarWolf*
> 
> i get about 50C on my EK black with backplate but that's with two video cards and a CPU on a single 360 rad with two sets of QDCs which slows coolant flow rate some, it's to be expected.


That is not bad at all amigo


----------



## KedarWolf

Quote:


> Originally Posted by *Jbravo33*
> 
> Not sure what's going on but when I first installed cards with blocks (last night) I ran heaven got 180fps. Then turned on precision x and started playing with core and mem and couldn't get passed 140fps. Restarted system and once again back to 180fps. Precision not working correctly with 1080ti?? Anyone else notice better number after restarting? I have precision x checked off to be on at startup but it's not like it pops up on desktop. 43c my highest temp when fans on lowest speed. Is precision xworking in background? I want to leave it alone seeing how I'm getting best score.


Here's what to do.

Uninstall Precision X, install Afterburner.

http://download.msi.com/uti_exe/vga/MSIAfterburnerSetup.zip Direct link.

http://www.overclock.net/t/1625653/how-to-get-voltage-slider-in-afterburner-working-on-a-1080-ti/0_20 do this if you want a working voltage slider.

Open Heaven and pause it by hitting the space bar on a scene with little going on in the background, like while staring at the cobblestones. It'll show the FPS you're getting.

Raise the core one notch at a time and hit Apply until the driver crashes or the screen freezes then recovers, then drop the core down, say 8 points. You'll likely have to reboot, or you'll see lower FPS, after a driver crash.

Then adjust memory until you get the most FPS or start getting artifacts, whichever comes first. Often your highest FPS will come at less than your max memory overclock; it may be only a few FPS higher at the optimum settings.

If for any reason you're not getting as high FPS after a driver crash and reboot, you may need to reinstall the driver. Usually if it only crashes once, it's fine though.

Hope that helps.
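The step-up-then-back-off procedure above can be sketched as a loop. Everything here is a toy stand-in for the manual Afterburner/Heaven steps: `run_heaven` simulates "did the driver survive this offset", `MAX_STABLE` is an invented card limit, and the 13MHz notch is only an approximation of Pascal's clock step.

```python
NOTCH = 13        # Pascal core clocks move in roughly 13 MHz steps (approximation)
BACKOFF = 8       # after the first crash, drop the offset by ~8, per the post

MAX_STABLE = 152  # simulated: highest core offset (MHz) this imaginary card holds

def run_heaven(offset_mhz: int) -> bool:
    """Hypothetical stand-in for a Heaven run: True if stable, False if it crashes."""
    return offset_mhz <= MAX_STABLE

def find_core_offset() -> int:
    """Raise the core offset one notch at a time until Heaven crashes, then back off."""
    offset = 0
    while run_heaven(offset + NOTCH):
        offset += NOTCH
    # the step (offset + NOTCH) crashed, so settle a bit below it
    return (offset + NOTCH) - BACKOFF

print(find_core_offset())
```

The memory pass would be the same loop with an artifact check instead of a crash check, stopping at whichever offset actually gives the highest FPS.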


----------



## JedixJarf

Quote:


> Originally Posted by *KedarWolf*
> 
> Here's what to do.
> 
> Uninstall Precision X, install Afterburner.
> 
> http://download.msi.com/uti_exe/vga/MSIAfterburnerSetup.zip Direct link.
> 
> http://www.overclock.net/t/1625653/how-to-get-voltage-slider-in-afterburner-working-on-a-1080-ti/0_20 do this if you want a working voltage slider.
> 
> Open Heaven, pause it by hitting space bar on scene with little going on in background like while staring at the cobblestone. It'll show FPS you're getting.
> 
> Raise core one notch at a time and hit Apply until driver crashes or screen freezes then restarts, then drop core down say 8 points, you'll likely have to reboot or have lower FPS after a driver crash.
> 
> Then adjust memory until you get the most FPS or start getting artefacts, whichever comes first. often your highest FPS will be less that your max memory overclock. It may be only a few FPS higher at optimum settings.
> 
> If for any reason you're not getting as high FPS after a driver crash and reboot, you may need to reinstall the driver. Usually if it only crashes once is fine though.
> 
> Hope that helps.


Agreed!

Get rid of that terrible software, and use afterburner : )


----------



## lilchronic

Quote:


> Originally Posted by *Bychus*
> 
> Hi, here I leave you my best score with the 6700k in graphics
> 
> http://www.3dmark.com/spy/1467708


Have you done the shunt mod?
My best graphics score.
http://www.3dmark.com/3dm/18934258


----------



## Bychus

Quote:


> Originally Posted by *lilchronic*
> 
> Have you done the shunt mod?
> My best graphics score.
> http://www.3dmark.com/3dm/18934258


Yeah


----------



## mr2cam

Quote:


> Originally Posted by *lilchronic*
> 
> Have you done the shunt mod?
> My best graphics score.
> http://www.3dmark.com/3dm/18934258


What is the shunt mod?


----------



## lilchronic

Quote:


> Originally Posted by *Bychus*
> 
> Yeah


Yeah, this card throttles like crazy in Time Spy.
Quote:


> Originally Posted by *mr2cam*
> 
> What is the shunt mod?


http://www.overclock.net/t/1608437/tutorial-power-target-limit-hardware-mod-shunt-mod-for-titan-x-and-many-other-nvidia-gpus/0_50

I have not done it.


----------



## Bychus

Quote:


> Originally Posted by *lilchronic*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bychus*
> 
> Yeah
> 
> 
> 
> Yeah this card throttle's like crazy in time spy.
Click to expand...

Now it stands at 2114 with +500/600 mem, or 2126 with +350.


----------



## Mato87

Quote:


> Originally Posted by *kaiqi07*
> 
> New hoots, sadly this is the last piece... Wanted to get two to go SLI... Have to wait for restock liao...
> 
> *Gigabyte Aorus 1080TI GTX Xtreme with Free Gifts.*
> 
> 
> http://imgur.com/sMO5O0V
> 
> 
> 
> 
> http://imgur.com/ZGm12GS
> 
> 
> 
> 
> http://imgur.com/LRjpd9V


You lucky duck, how much was it? And how the heck did you get it so early? I thought they weren't supposed to be available until mid-May or the end of next month...


----------



## Jbravo33

Quote:


> Originally Posted by *JedixJarf*
> 
> Agreed!
> 
> Get rid of that terrible software, and use afterburner : )


Thank you. Will try tonight


----------



## fisher6

Finally broke the 9k mark in TS
http://www.3dmark.com/3dm/18935211


----------



## Boost240

Ok. The FE is back in stock on Amazon.ca. I've been waiting for this card since last June when I built my new PC. I want to add it to my custom loop. I'm just scared that EK will come out with a block for one of the aftermarket EVGA or Asus cards lol. And I'm also afraid that someone will find a way around the locked bios and I won't be able to take advantage of the partner cards' better voltage regulation.

If the chances of that happening are slim though, I'm ready to push buy!


----------



## DerComissar

Quote:


> Originally Posted by *Boost240*
> 
> Ok. The FE is back in stock on Amazon.ca. I've been waiting for this card since last June when I built my new PC. I want to add it to my custom loop. I'm just scared that EK will come out with a block for maybe one of the aftermarket EVGA or Asus cards lol. And I'm also afraid that someone will find away around the locked bios and I won't be able to take advantage of the Partner cards better voltage regulation.
> 
> If they chances are slim of that happening though I'm ready to push buy!


The waiting game never ends, may as well push the button.

Considering you're putting a block on the card, the FE will be just fine.
I bought one a couple weeks ago, just received my block from EK today for it.

Even at stock, on air as I'm running it for now, this sucker flies!


----------



## mcg75

Here's a screenshot from PrecisionX with the info from my 1080 Ti FE with EVGA hybrid cooler.

Running at 2025 MHz with temp and power sliders maxed.

4K resolution Heaven. Pic was taken after 30 mins of running. Max temp 42c.


----------



## Slackaveli

Quote:


> Originally Posted by *Boost240*
> 
> Ok. The FE is back in stock on Amazon.ca. I've been waiting for this card since last June when I built my new PC. I want to add it to my custom loop. I'm just scared that EK will come out with a block for maybe one of the aftermarket EVGA or Asus cards lol. And I'm also afraid that someone will find away around the locked bios and I won't be able to take advantage of the Partner cards better voltage regulation.
> 
> If they chances are slim of that happening though I'm ready to push buy!


bro, 3rd party boards won't have better voltage regulation.


----------



## KaRLiToS

Hi guys, I will join this club, just bought 2 x GTX 1080ti with EK blocks.

So far I have only received one card and it was this morning.

Bought one at NCIX.ca on the 13th, which I still don't have, and one at Newegg.ca on the 23rd, which arrived today.

The one I bought at NCIX was listed IN STOCK, but there's still no sign of it being shipped.

I will post pictures in a few


----------



## Boost240

Quote:


> Originally Posted by *Slackaveli*
> 
> bro, 3rd party boards won't have better voltage regulation.


You right, you right. I'll buy the FE tonight.


----------



## Slackaveli

Quote:


> Originally Posted by *Boost240*
> 
> You right, you right. I'll buy the FE tonight.


do it, man! They really did a good job on them: quality components and dual FETs on all 7 phases. good voltage regulation for a Founders card, much better than on the Titan Xp.


----------



## Boost240

Ordered! Friday delivery!


----------



## Slackaveli

feels great, eh?


----------



## JedixJarf

Quote:


> Originally Posted by *Boost240*
> 
> You right, you right. I'll buy the FE tonight.


Plus, if you want to go under water, it will be much easier getting a block


----------



## PewnFlavorTang

FTW3 pre-ordered just now. Ship date is in May lol ***.


----------



## KaRLiToS

Have you ever seen a watercooled R9 290X with an air-cooled GTX 1080 Ti?

I will receive another GTX 1080 Ti soon, along with 2 x EK blocks and a triple bridge.

Also got the Asus PG348Q and ditched my old Eyefinity setup.

Awesome monitor, honestly.


----------



## zswickliffe

Quote:


> Originally Posted by *Slackaveli*
> 
> 140+ on a single Ti


Uhhh, no.
Quote:


> Originally Posted by *Benny89*
> 
> Anyone here with 1080 Ti SLI and 144/165Hz 1440p monitor? What are your FPS in some new games with SLI? Thanks in advance for info!


You'll be looking at about 120fps in modern AAA games with all graphics turned up to ultra. I see some dips to 100fps in Battlefield 1 but it's normally above 115 consistently.


----------



## MightEMatt

Quote:


> Originally Posted by *zswickliffe*
> 
> Uhhh, no.
> You'll be looking at about 120fps in modern AAA games with all graphics turned up to ultra. I see some dips to 100fps in Battlefield 1 but it's normally above 115 consistently.


I can second that. I mostly play GTAV and The Division for modern titles but in both cases I pretty much have a locked 120fps. Any drops below that are usually a CPU bottleneck at this point. With the framerate unlocked I've seen it settle closer to 200 in lightweight areas so I imagine with a more modern CPU you could comfortably push for a steady 144 or 165 with a setting or two off.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *KaRLiToS*
> 
> Hi guys, I will join this club, just bought 2 x GTX 1080ti with EK blocks.
> 
> So far I have only received one card and it was this morning.
> 
> Bought one at Ncix.ca on the 13th which I still don't have and one bought at newegg.ca bought on the 23rd, which arrive today.
> 
> The one I bought at NCIX was listed IN STOCK but still no sign of it being shipped
> 
> 
> 
> 
> 
> 
> 
> .
> 
> I will post pictures in a few












Nice to see you, buddy!


----------



## zswickliffe

Quote:


> Originally Posted by *MightEMatt*
> 
> I can second that. I mostly play GTAV and The Division for modern titles but in both cases I pretty much have a locked 120fps. Any drops below that are usually a CPU bottleneck at this point. With the framerate unlocked I've seen it settle closer to 200 in lightweight areas so I imagine with a more modern CPU you could comfortably push for a steady 144 or 165 with a setting or two off.


What CPU are you running? I'm pretty sure mine isn't the bottleneck for my system, I'll re-review some logs when I get a chance.


----------



## jleslie246

Quote:


> Originally Posted by *zswickliffe*
> 
> Uhhh, no.
> You'll be looking at about 120fps in modern AAA games with all graphics turned up to ultra. I see some dips to 100fps in Battlefield 1 but it's normally above 115 consistently.


Is that with one or two 1080ti's? I hope with one


----------



## zswickliffe

Quote:


> Originally Posted by *jleslie246*
> 
> Is that with one or two 1080ti's? I hope with one


Haha yes, just one.


----------



## rcfc89

Quote:


> Originally Posted by *zswickliffe*
> 
> Uhhh, no.
> You'll be looking at about 120fps in modern AAA games with all graphics turned up to ultra. I see some dips to 100fps in Battlefield 1 but it's normally above 115 consistently.


Should be higher than that. I never drop below 100fps in BF1 Ultra at 3440x1440 with a pair of 980 Tis. I'd expect a pair of 1080 Tis at 2560x1440 to be in the 160s+.


----------



## zswickliffe

Quote:


> Originally Posted by *rcfc89*
> 
> Should be higher then that. I never drop below 100fps with Bf1 Ultra at 3440x1440 with a pair of 980Ti's. I'd expect a pair of 1080ti's on a 2560x1440 to be in the 160's+


Sorry for the confusion guys.

I responded to someone saying a single 1080 Ti would bring 140+ on modern games. I was saying that with a single card I see around 120 fps.

I see how confusing that was since the original request was for SLI. Ignore me...


----------



## KaRLiToS

Really nice to see you too. I was about to PM you hello.


----------



## Slackaveli

Quote:


> Originally Posted by *zswickliffe*
> 
> Uhhh, no.
> You'll be looking at about 120fps in modern AAA games with all graphics turned up to ultra. I see some dips to 100fps in Battlefield 1 but it's normally above 115 consistently.


uh, yes.

Maybe YOUR cpu can't produce it, but i am getting over 135 in almost everything: Battlefield 1, Rainbow Six Siege, and I can keep 60 fps at 4k in them all, too. if you are getting dips to 100 in BF1, your cpu is the issue. i never dip below 135 (well, in Amiens i'm sure i did, but not into the teens even), even in 64-man Operations multiplayer maps. Ultra everything.
But, I do have the best gaming cpu. I would be surprised if it is that much better, though. what cpu do you have?

edit: i see you are saying you average 115. that is more believable, because in normal gpu-bound testing i usually get an extra 20-25 fps in 1440p over a 4790k/6700k, so that matches up here as well: 115+20=135.

You boys need to find a 5775c like I did. best decision I've made in the pc world in a minute.


----------



## zswickliffe

Quote:


> Originally Posted by *Slackaveli*
> 
> uh, yes. maybe you need a better rig. I dont lie, so yeah. lucky there are forum rules here.
> 
> YOUR cpu cant produce it perhaps but i am getting over 135 in almost everything. Battlefield one, rainbow six siege , and can keep 60 fps at 4k in them all, too.


Maybe one day I'll be as great as you. For now, I'll just post benchmarks.

http://www.pcgamer.com/geforce-gtx-1080-ti-review/
http://www.tomsguide.com/us/nvidia-gtx-1080-ti-benchmarks,review-4241.html
https://www.google.com/amp/www.techspot.com/amp/review/1352-nvidia-geforce-gtx-1080-ti/

Edit: looks like you added to your post. CPU is my trusty dusty 4790k OC'd to 4.6GHz. Max usage around 80% in BF1. OCing to 4.8GHz is stable but yields no FPS gain.

It's unlikely that it's my CPU.


----------



## Stealth3si

I am looking for a custom 1080 Ti < 11" that will fit in my case. My 980 Ti barely fits with a quarter of an inch to spare.

My priority is:
1. Size
2. Performance
3. Noise

Anyone know which ones I can choose from?

EDIT: looks like I found one, EVGA GeForce GTX 1080 Ti FTW3 GAMING...


----------



## havoc315

In the middle of this right now lol. Was going to do the shunt mod too, but going to wait and do the delid tomorrow... hopefully have the system done this week. Had to send my 2 VPP755s back (a washer fell off one), so now I have to figure out which pump to put inside my two Monsoon res combos.


----------



## Slackaveli

Quote:


> Originally Posted by *zswickliffe*
> 
> Maybe one day I'll be as great as you. For now, I'll just post benchmarks.
> 
> http://www.pcgamer.com/geforce-gtx-1080-ti-review/
> http://www.tomsguide.com/us/nvidia-gtx-1080-ti-benchmarks,review-4241.html
> https://www.google.com/amp/www.techspot.com/amp/review/1352-nvidia-geforce-gtx-1080-ti/
> 
> Edit: looks like you added to your post. CPU is my trusty dusty 4790k OC'd to 4.6GHz. Max usage around 80% in BF1. OCing to 4.8GHz is stable but yields no FPS gain.
> 
> It's unlikely that it's my CPU.


then that makes perfect sense. i gave my 4790k to my son and got the king of gaming cpus, the 5775c, with that massive 128MB Crystalwell L4 cache. it adds ~10-12 fps in 4k, ~20 in QHD, and almost 30 in 1080p over my old 4790k @ 4.8; tested it myself, same ram (2400MHz C10 Dominator Platinum) and same gpu. That L4 cache is a beast. It's DEFINITELY my cpu that is giving me the extra frames, so for the other guy who asked, you are probably right. I was answering for my rig. But I'd be willing to bet you don't even know what a 5775c is, so, good day.

Maybe you shouldn't be so condescending if you don't want people to be salty in return.


----------



## Slackaveli

Quote:


> Originally Posted by *havoc315*
> 
> 
> In the middle of this right now lol,was going to do shunt to but going to wait, and do delid 2m ...hopefully have system done this week....had to send my 2 vpp755 back one washer fell off so now have to figure out which pump to put inside my to Monsoon Rez combos


dang, son.


----------



## zswickliffe

Quote:


> Originally Posted by *Slackaveli*
> 
> than that makes perfect sense. i gave my 4790k to my son and got the King of gaming cpus, the 5775c with that massive 128mb Crystalwell L4 cache. it adds 10-12~fps in 4k, 20~ in qhd, and almost 30 in 1080p over my old 4790k @ 4.8; tested myself. same ram-2400Mhz C-10 dominator Plat and same gpu. It's (L4 cache)beast. It's DEFINITELY my cpu that is giving me the extra frames so for the other guy who asked you are probably right. I was answering for my rig. But I'd be willing to bet you dont even know what a 5775c is, so, good day.
> 
> Maybe you shouldn't be so condescending if you dont want people to be salty in return.


Lol, you have tons of room to talk on being condescending. Yeah, I know what the 5775c is. Yeah, I know about its cache. Every benchmark shows dips into 115-120 range for BF1. Maybe they should pick up some 5775c's too and get 20fps more.


----------



## Slackaveli

they should. my minimums are WAY better than what you posted. It's just sad to handcuff such a good gpu like that.

i'll trust MY OWN EYES over a reviewer, who is usually on an X99 system anyway, which is even worse for gaming.
You trashed me first; my salt is in retaliation, that and I AM RIGHT and you are basically calling me a liar. Show me a benchmark with a 5775c if you want to "prove" something.
I'm sorry your lil cpu isn't meeting your needs. maybe you should just upgrade. It'd cost you about $100 if you sell your 4790k ($250 on ebay, it will sell that day).

$358
https://www.bhphotovideo.com/bnh/controller/home?O=&sku=1181928&gclid=Cj0KEQjwtu3GBRDY6ZLY1erL44EBEiQAAKIcviIpzJ_rxqPS2GV1qeQVpDxiENOkqZorrzNKZlvagssaAoSU8P8HAQ&Q=&ap=y&m=Y&c3api=1876%2C%7Bcreative%7D%2C%7Bkeyword%7D&is=REG&A=details Then you too could actually utilize your monitor's Hz at full frame rates lol. Take that as good advice, or take it as a dig. i don't care, but you really should do yourself a favor and slap the best gaming cpu on earth in your board. You even have a z97, for christ's sake!

educate yoself: http://www.overclock.net/t/1583537/intel-broadwell-c-ownership-club


----------



## Slushpup

Quote:


> Originally Posted by *Slackaveli*
> 
> they should. my minimums are WAY better than what you posted. That's just sad to handcuff such a good gpu like that.
> 
> i'll trust MY OWN EYES over a reviewer, who is usually on an x-99 system anyway, which is even worse for gaming.
> You trashed me first, my salt is in retaliation, that and I AM RIGHT and you are basically calling me a liar. Show me a benchmark with a 5775c if you want to "prove" something.
> Im sorry your lil cpu isnt meeting your needs. maybe you should just upgrade. It'd cost you a $100 if you sell your $4790k ($250 on ebay will sell that day).
> 
> $358
> https://www.bhphotovideo.com/bnh/controller/home?O=&sku=1181928&gclid=Cj0KEQjwtu3GBRDY6ZLY1erL44EBEiQAAKIcviIpzJ_rxqPS2GV1qeQVpDxiENOkqZorrzNKZlvagssaAoSU8P8HAQ&Q=&ap=y&m=Y&c3api=1876%2C%7Bcreative%7D%2C%7Bkeyword%7D&is=REG&A=details Then you too could actually utilize your monitor's Hz at full frame rates lol. Take that as good advice, or take it as a dig. i dont care, but, you really should just do yourself a favor and slap the best gaming cpu on earth in your board. You even have a z97 for christ's sake!
> 
> educate yoself http://www.overclock.net/t/1583537/intel-broadwell-c-ownership-club


Lol who is this guy?


----------



## JedixJarf

Quote:


> Originally Posted by *Slushpup*
> 
> Lol who is this guy?


Yeah idk, that's weird.


----------



## rcfc89

Quote:


> Originally Posted by *Slushpup*
> 
> Lol who is this guy?


Today's entertainment. I hope he's trolling.


----------



## illypso

I have a 1080ti on a 4790k on a z97a.

Now I feel bad...

I think I will be obligated to change it...

seriously I thought my 4790K was more than I need.


----------



## Stealth3si

What is the noise level on the EVGA GeForce GTX 1080 Ti FTW3 GAMING when overclocking?

EDIT: NVM, just realized it is not out yet.


----------



## GraphicsWhore

Got my EVGA FE via step-up today. Will be putting it on the EK block and backplate soon as part of a new build. For now looks good inside my current case


----------



## Slackaveli

Quote:


> Originally Posted by *illypso*
> 
> I have a 1080ti on a 4790k on a z97a.
> 
> Now I feel bad...
> 
> I think I will be obligated to change it...
> 
> seriously I thought my 4790K was more than I need.


depends: are you getting your monitor's full frame rate? if not, it's the best upgrade you could make. don't feel bad, Intel buries 5775c news. they killed it off b/c it was too good and was going to hurt their HEDT and Skylake/Z170 sales... but you can still get one, i linked it above. Why would you NOT take 20fps for $100 and a super easy upgrade? you don't have to change anything but the cpu and paste.


----------



## Slackaveli

Quote:


> Originally Posted by *MightEMatt*
> 
> I can second that. I mostly play GTAV and The Division for modern titles but in both cases I pretty much have a locked 120fps. Any drops below that are usually a CPU bottleneck at this point. With the framerate unlocked I've seen it settle closer to 200 in lightweight areas so I imagine with a more modern CPU you could comfortably push for a steady 144 or 165 with a setting or two off.


ding ding ding. on Battlefield 1 my gpu was actually resting at times in Domination maps because I had the frame limiter at 142 and it was on cruise control, all Ultra with the 105% resolution slider, locked at 142. Awesome times. Oh, and 66c on air consistently (with Thermal Grizzly paste and fan at 70%).


----------



## jrcbandit

Quote:


> Originally Posted by *Stealth3si*
> 
> What is the noise level on the EVGA GeForce GTX 1080 Ti FTW3 GAMING when overclocking?


Lol, come back in May when it is released... The cooler is a complete unknown, because the prior iCX coolers were 2-fan, not 3.


----------



## Slackaveli

Quote:


> Originally Posted by *Stealth3si*
> 
> What is the noise level on the EVGA GeForce GTX 1080 Ti FTW3 GAMING when overclocking?


we'll tell you in may when they come out


----------



## Slackaveli

Quote:


> Originally Posted by *jrcbandit*
> 
> Lol, come back in May when it is released.... The cooler is a complete unknown because the prior iCX coolers were 2 fan, not 3.


should be better though.


----------



## Slackaveli

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Got my EVGA FE via step-up today. Will be putting it on the EK block and backplate soon as part of a new build. For now looks good inside my current case


nice, dude, you are the first one I've seen who actually got their step-up 1080 Ti. congrats. Do the voltage unlock hack: I could only hold 2012 before the voltage mod, but now with +70mv I am holding 2050.


----------



## Exilon

Shunt mod time



GPU Boost 3 hardly gives me 1V once it starts bouncing on the 120% power limit.


----------



## PewnFlavorTang

Quote:


> Originally Posted by *Exilon*
> 
> Shunt mod time
> 
> 
> 
> GPU Boost 3 hardly gives me 1V once it starts bouncing on the 120% power limit.


I have an Asus card that, once it hits the power limit too many times, triggers some fault and only gets to use 50 percent TDP until I restart.


----------



## Slackaveli

Have you guys been running into the power limit in games? b/c personally it only hits me in 3DMark and Heaven benches; those are power hogs bigtime. But in games i rarely see it over 100%.


----------



## havoc315

Quote:


> Originally Posted by *Exilon*
> 
> Shunt mod time
> 
> 
> 
> GPU Boost 3 hardly gives me 1V once it starts bouncing on the 120% power limit.




I'm having the same issues, and that's why I'm putting it under water and modding. Got a quick question for anyone: is there a big difference between the Fujipoly 11 W/mK and 17 W/mK pads on a gpu when it is water cooled, and if so, where should I put the 17 vs the 11? Also, I noticed the EKWB Titan X block doesn't have all the same places marked for pads as on the stock FE cooler.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> uh, yes.
> 
> Maybe YOUR cpu cant produce it perhaps but i am getting over 135 in almost everything. Battlefield one, rainbow six siege , and can keep 60 fps at 4k in them all, too. if you are getting dips to 100 in BF1, your cpu is the issue. i never dip below 135 (well, in Ameins im sure i did but not into the teens even), even in 64 man Operations multiplayer maps. Ultra everything.
> But, I do have the best gaming cpu. I would be surprised it is that much better, though. what cpu do you have?
> 
> edit: i see you are saying you average 115. that is more believable because in normal gpu bound testing i usually get an extra 20-25 fps in 1440p over a 4790k/6700k so that matches up here as well. 115+20=135.
> 
> You boys need to find a 5775c like I did. best decision I've made in pc world in a minute.


4K 60fps maxed out?

Allow me my friend to introduce you to Mass Effect Andromeda and Ghost Recon Wildlands.


----------



## Exilon

Quote:


> Originally Posted by *havoc315*
> 
> 
> 
> I'm having same issues, and that's why I'm outing it in water and nodding...git a quick question for anyone is there a bug difference between the fujipoly 11mk and 17mk on a gpu when it is water cooled and if so where should I put the 17 vs 11 also noticed on ekwb titanx block it doesn't have all the same places marked for pads like what are on the stock FE cooler.


VRMs can run hotter than the VRAM so put the better ones on the RAM.

Really doubt it makes that much of a difference though. RAM doesn't get that hot regardless and VRMs are fine being toasty.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> 4K 60fps maxed out?
> 
> Allow me my friend to introduce you to Mass Effect Andromeda and Ghost Recon Wildlands.


Those games don't count! lol. I did get 55-60 in Mass Effect, all ultra. I even got 60 in GTA V with EVERYTHING maxed, even the distance sliders and shadows, etc. Haven't tried Wildlands. Mafia 3, 60 maxed. It won't last forever, but for now we are straight.


----------



## Slackaveli

Quote:


> Originally Posted by *Exilon*
> 
> VRMs can run hotter than the VRAM so put the better ones on the RAM.
> 
> Really doubt it makes that much of a difference though. RAM doesn't get that hot regardless and VRMs are fine being toasty.


plus the vrms are only at 600 so they will be well under their specs heatwise


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> Those games don't count! lol. I did get 55-60 in Mass Effect, all ultra. I even got 60 in GTA V with EVERYTHING maxed, even the distance sliders and shadows, etc. Haven't tried Wildlands. Mafia 3, 60 maxed. It won't last forever, but for now we are straight.


They do count! Especially cuz they're all I'm playing atm, and it doesn't feel like I "upgraded" my GPU from the 1080 lol


----------



## pez

Quote:


> Originally Posted by *alucardis666*
> 
> They do count! Especially cuz they're all I'm playing atm, and it doesn't feel like I "upgraded" my GPU from the 1080 lol


You're on X99...get another one









----------



## DerComissar

Quote:


> Originally Posted by *KaRLiToS*
> 
> Hi guys, I will join this club, just bought 2 x GTX 1080ti with EK blocks.
> 
> So far I have only received one card and it was this morning.
> 
> Bought one at NCIX.ca on the 13th, which I still don't have, and one at Newegg.ca on the 23rd, which arrived today.
> 
> The one I bought at NCIX was listed IN STOCK but still no sign of it being shipped
> 
> 
> 
> 
> 
> 
> 
> 
> I will post pictures in a few


Good to hear that, looking forward to hearing how you like the Ti twins.

I hope NCIX comes through for you, they have been taking payment for items that they don't actually have, for quite a while now.
At least Newegg is a reputable company, and sent your card.


----------



## alucardis666

Quote:


> Originally Posted by *pez*
> 
> You're on X99...get another one
> 
> 
> 
> 
> 
> 
> 


Another 1080Ti?

No way. SLI is a crapshoot


----------



## pez

Quote:


> Originally Posted by *alucardis666*
> 
> Another 1080Ti?
> 
> No way. SLI is a crapshoot


Both games you're playing support it with great results and scaling. Sometimes you have to pay to play the way you want









----------



## Dasboogieman

Quote:


> Originally Posted by *illypso*
> 
> I have a 1080ti on a 4790k on a z97a.
> 
> Now I feel bad...
> 
> I think I will be obligated to change it...
> 
> seriously I thought my 4790K was more than I need.


Lol, don't be. Don't just blindly trust a guy whose best evidence is "my eyes reckon it's better" and a link to a thread full of potential confirmation bias lol.
Do some deeper digging. I personally am very skeptical of gains in the 10 FPS region from the CPU alone, but hey, skepticism is what improves the robustness of the evidence.
I mean, even when I see OC results from people around OCN, I take them with a grain of salt and assume everything is 5% lower, simply because it is human nature to exaggerate.

Enquire further and ask directed questions of as many people as possible, preferably ones with no vested interest in defending their purchase decisions.

For the record, I'm not saying the 5775c dude is wrong, but I'm saying to broaden your scope before you go out and spend money.

As a followup: I found some reasonably reliable data.

http://techreport.com/review/28751/intel-core-i7-6700k-skylake-processor-reviewed

From what I can tell trawling through the Crystalwell owners' thread, the 5775c derives its performance from the low latency of the Crystalwell eDRAM coupled with HEDT-class bandwidth.
In your usage scenario, the 5775c may achieve an increase in FPS up to 120 FPS, but you may also get similar results from faster RAM; it's a really tough call to make.


----------



## pez

Yeah, Haswell and up is not bottlenecking a Ti or TXP. If it is, the performance difference isn't enough to warrant a side grade to the tune of $350-ish.


----------



## Alex24buc

Sorry if this has been discussed before, but I want to buy a waterblock from EK for my 1080 Ti FE, and I don't know if a backplate is necessary too, or if I can just leave the card without one. Thanks for your help; I'm very limited by budget right now and was hoping I could use the card without a backplate.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Alex24buc*
> 
> Sorry if this has been discussed before, but I want to buy a waterblock from EK for my 1080 Ti FE, and I don't know if a backplate is necessary too, or if I can just leave the card without one. Thanks for your help; I'm very limited by budget right now and was hoping I could use the card without a backplate.


No need for a backplate; it's just nice to have for bling and to protect the PCB and components.


----------



## BoredErica

Quote:


> Originally Posted by *Dasboogieman*
> 
> I mean, even when I see OC results from people around OCN. I take it with a grain of salt and assume everything is 5% lower simply because it is in human nature to exaggerate.


Sounds like my threads in a nutshell.


----------



## Alex24buc

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> No need for a backplate; it's just nice to have for bling and to protect the PCB and components.


Thanks for the answer.


----------



## EvilPieMoo

Quote:


> Originally Posted by *Slackaveli*
> 
> Those games don't count! lol. I did get 55-60 in Mass Effect, all ultra. I even got 60 in GTA V with EVERYTHING maxed, even the distance sliders and shadows, etc. Haven't tried Wildlands. Mafia 3, 60 maxed. It won't last forever, but for now we are straight.


You're not running those games maxed out if you're holding 60 with a single Ti. GTA fully maxed out at 4K with 8xAA brings 1080 Ti SLI way below 60 fps, and Mass Effect Andromeda can't hold 60 with a single card at 4K fully maxed out either. I've got my own benchmark videos to confirm it.


----------



## BrainSplatter

Quote:


> Originally Posted by *EvilPieMoo*
> 
> You're not running those games maxed out if you're holding 60 with a single Ti. GTA fully maxed out at 4K with 8xAA brings 1080 Ti SLI way below 60 fps, and Mass Effect Andromeda can't hold 60 with a single card at 4K fully maxed out either. I've got my own benchmark videos to confirm it.


8xAA seems a bit overkill @ 4K imho. The higher the base resolution, the less need for costly AA methods. Also, higher DSR might be an alternative, since it definitely anti-aliases everything.

Btw, am I the only one who doesn't want 140 fps, but would rather have 30 fps @ 5K-8K resolution? That's why I just got 2.


----------



## guttheslayer

My best attempt so far for my Aorus 1080 Ti.



Managed to break that 14K/15K barrier


----------



## fisher6

Quote:


> Originally Posted by *guttheslayer*
> 
> My best attempt so far for my Aorus 1080 Ti.
> 
> 
> 
> Managed to break that 14K/15K barrier


Nice, what's your stable OC like?


----------



## guttheslayer

2025 MHz, peaking at 2050 occasionally.


----------



## Joshwaa

Well, after 6 months of use my EVGA GTX 1080 FTW died. Sad day. Any time it tried to go into 3D mode it would stop outputting video and crash the system. RMA time. Glad I have my 1080 Ti system. On another note, I got an EK EVO CPU block last week and it came with Thermal Grizzly paste; when did they start doing that?

EDIT: Anyone here using CLLU or Conductonaut on the GPU with an EK block? Think it is worth it over Kryonaut or GC Extreme? Just asking, as I have never used it.


----------



## KedarWolf

Quote:


> Originally Posted by *Joshwaa*
> 
> Well after 6 months of use my EVGA GTX 1080 FTW died. Sad day. Any time it tried to go into 3D mode it would stop outputting video and crash the system. RMA time. Glad I have my 1080 ti system. On another note I got a EK EVO cpu block last week and it came with Thermal Grizzly paste, when did they start doing that?
> 
> EDIT: Anyone here using CLLU or Conductonaut on the gpu with EK block? Think it is worth it over Kryonaut or GC Extreme? Just asking as I have never used it.


My 1080 Ti EK block came with EK Ectotherm thermal paste, but I had Grizzly on hand.


----------



## havoc315

Quote:


> Originally Posted by *Alex24buc*
> 
> Sorry if this has been discussed before, but I want to buy a waterblock from EK for my 1080 Ti FE, and I don't know if a backplate is necessary too, or if I can just leave the card without one. Thanks for your help; I'm very limited by budget right now and was hoping I could use the card without a backplate.


You can actually remount the stock backplate using the original screws and the hex-head standoffs that came with your FE.
Once the card is seated on the waterblock with the back of the card facing up, instead of using the screws EKWB supplies to hold the block on, screw in the hex-head standoffs you removed earlier. Then place the original stock backplate on top and secure it with the small screws you undid to remove the stock plate in the first place; for the four around the GPU, use the spring-loaded screws provided by EKWB. Then bam, you have the original backplate back.
There is a video on it if you look it up








And here is a picture showing the hex screws fit into ekwbs standoff


----------



## Alwrath

Quote:


> Originally Posted by *BrainSplatter*
> 
> 8xAA seems a bit overkill @ 4K imho. The higher the base resolution, the less need for costly AA methods. Also, higher DSR might be an alternative, since it definitely anti-aliases everything.
> 
> Btw, am I the only one who doesn't want 140 fps, but would rather have 30 fps @ 5K-8K resolution? That's why I just got 2.


Yeah, I only run low AA on my 4K panel; sometimes I even go without. Once you get past the 2K mark, AA is not really needed all that much imo. If you want fps for 5K or 8K, you should just wait for Volta.


----------



## Silent Scone

A bit overkill is cutting it fine, it's pointless.

*Cue senseless argument


----------



## BrainSplatter

Quote:


> Originally Posted by *Alwrath*
> 
> If you want fps for 5K or 8K, you should just wait for Volta.


So far it looks like the first Volta products will get 'only' 8GB of VRAM, so I am not too enthusiastic about it yet. 16GB of HBM2 with even greater bandwidth would be nice though.

A single 1080 Ti can run 8K TW: Warhammer at about 18 fps in the benchmark (which isn't very demanding). Now I am waiting for my 2nd 1080 Ti to see whether two of them will break 30 fps. TW:WH scales pretty well in SLI (as do almost all my other favorite games atm).

In WH @ 4K, a single 1080 Ti slightly overclocked was about 10% slower than 2x 980 Tis @ 1450 MHz. Interestingly, @ 5K that gap widened to >15% (I had to reduce texture quality to medium for the 980 Tis).


----------



## Alwrath

Quote:


> Originally Posted by *BrainSplatter*
> 
> So far it looks like the first Volta products will get 'only' 8GB of VRAM, so I am not too enthusiastic about it yet.


HBM2? If it is, 8GB will be plenty. Besides, Volta is gonna blow the doors off Pascal. It's gonna be sick performance. It's gonna be like Pascal on crack.


----------



## davidmoffitt

Count me among the owners.



3DMark Time Spy


----------



## BoredErica

Quote:


> Originally Posted by *Alwrath*
> 
> HBM2? If it is, 8GB will be plenty. Besides, Volta is gonna blow the doors off Pascal. It's gonna be sick performance. It's gonna be like Pascal on crack.


You don't know this. Don't pretend to know things you can't know.

Quote:



> Originally Posted by *Alwrath*
> 
> Yeah I only run low AA on my 4K panel, sometimes I even go without. Once you get past the 2k mark AA is not really needed all that much imo.


I still want x4 AA in the games I play. On some games not having AA is almost unplayable... there would be moving jaggies everywhere (prime example is Fallout 4). This is on 1440p, so 2.5k.


----------



## davidmoffitt

Quote:


> Originally Posted by *guttheslayer*
> 
> My best attempt so far for my Aorus 1080 Ti.


Jesus, why is my PhysX (CPU?) score so gimped compared to yours?? I'm on a 7700K @ 4.9 - could it be memory? I'm "only" at 3200 MHz.

(edit: quoted the wrong post)


----------



## BrainSplatter

Quote:


> Originally Posted by *Darkwizzie*
> 
> I still want x4 AA in the games I play. On some games not having AA is almost unplayable... there would be moving jaggies everywhere (prime example is Fallout 4). This is on 1440p, so 2.5k.


Exactly. But sensitivity to jaggies seems to vary a lot from person to person (similar to sensitivity to high refresh rates). That said, even in the crappiest computer-animated flick you will not see any jaggies, because they are one of the most distracting visual artifacts in computer-generated movies.


----------



## Archdregs

Quote:


> Originally Posted by *davidmoffitt*
> 
> Jesus why is my PhysX (CPU?) so gimped compared to yours?? I'm on a 7700K @ 4.9 - could it be memory? I'm "only" at 3200Mhz.
> 
> (edit: quoted the wrong post)


He has more cores.


----------



## lilchronic

Quote:


> Originally Posted by *BrainSplatter*
> 
> Exactly. But sensitivity to jaggies seems to vary a lot from person to person (similar to sensitivity to high refresh rates). That said, even in the crappiest computer-animated flick you will not see any jaggies, because they are one of the most distracting visual artifacts in computer-generated movies.


screen tearing is worse


----------



## Menta

Quote:


> Originally Posted by *kaiqi07*
> 
> New hoots, sadly this is the last piece... Wanted to get two to go SLI... Have to wait for restock liao...


So early! Where did you buy the card, and in which country?


----------



## Vamrick

Got my reference 1080 Ti from EVGA... under water I hit +160 core (2050 rock solid) and +725 on memory... highest I have seen yet. I'm scared to push any further!! lol. Anyone have higher memory? And is it worth it to keep pushing to see where it tops out?


----------



## EvilPieMoo

Quote:


> Originally Posted by *guttheslayer*
> 
> My best attempt so far for my Aorus 1080 Ti.
> 
> 
> 
> Managed to break that 14K/15K barrier


I hope you're able to get more out of it, being that it's a premium aftermarket card. This was my run with a Founders Edition.


----------



## davidmoffitt

Quote:


> Originally Posted by *Archdregs*
> 
> He has more cores.


Oh I'm dumb, I read his as "6700K" thinking it was Skylake not Broadwell-E, duh, that makes sense


----------



## mcg75

Quote:


> Originally Posted by *davidmoffitt*
> 
> Jesus why is my PhysX (CPU?) so gimped compared to yours?? I'm on a 7700K @ 4.9 - could it be memory? I'm "only" at 3200Mhz.
> 
> (edit: quoted the wrong post)


You'll never catch a processor with more cores in that bench.

The good news is your 7700k is the best gaming processor there is today so I wouldn't worry about that bench.


----------



## davidmoffitt

Quote:


> Originally Posted by *mcg75*
> 
> You'll never catch a processor with more cores in that bench.
> 
> The good news is your 7700k is the best gaming processor there is today so I wouldn't worry about that bench.


Yeah I mis-read the screenshot. I used to have a 6850K and a 5930K prior, I know how that gig goes







I actually sold it to a buddy who is doing a ton of Solidworks flow-sim etc as I mostly game on this machine at home now (my old setup was a R5E with the EK mono block and was AWESOME for video editing, but 64GB of RAM was utterly silly / pointless for gaming LOL).


----------



## Alex24buc

Quote:


> Originally Posted by *havoc315*
> 
> You can actually remount the stock backplate using the original screws and the hex-head standoffs that came with your FE.
> Once the card is seated on the waterblock with the back of the card facing up, instead of using the screws EKWB supplies to hold the block on, screw in the hex-head standoffs you removed earlier. Then place the original stock backplate on top and secure it with the small screws you undid to remove the stock plate in the first place; for the four around the GPU, use the spring-loaded screws provided by EKWB. Then bam, you have the original backplate back.
> There is a video on it if you look it up
> 
> 
> 
> 
> 
> 
> 
> 
> And here is a picture showing the hex screws fit into ekwbs standoff


I'll give it a try once my EK waterblock gets delivered. It would be awesome for the original backplate to fit with my waterblock; can hardly wait to see. Thanks!


----------



## KedarWolf

Quote:


> Originally Posted by *davidmoffitt*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mcg75*
> 
> You'll never catch a processor with more cores in that bench.
> 
> The good news is your 7700k is the best gaming processor there is today so I wouldn't worry about that bench.
> 
> 
> 
> Yeah I mis-read the screenshot. I used to have a 6850K and a 5930K prior, I know how that gig goes
> 
> 
> 
> 
> 
> 
> 
> I actually sold it to a buddy who is doing a ton of Solidworks flow-sim etc as I mostly game on this machine at home now (my old setup was a R5E with the EK mono block and was AWESOME for video editing, but 64GB of RAM was utterly silly / pointless for gaming LOL).

I run 128GB on my gaming rig, but 100GB of that is a RAM disk that I keep my regular games on.


----------



## fisher6

Anybody getting low GPU usage in GTA 5? Just did a clean install of it and it drops all the way to 46% sometimes for no reason.


----------



## PewnFlavorTang

I need to do the shunt mod on my ASUS FE card. I can do 2100-2139 on the stock blower, but then a power limitation fault gets triggered and I can only use up to half of my total 120% TDP. Anyone have any alternative ways of doing the shunt mod without the liquid metal approach? My motherboard sits on a desk, so gravity isn't my friend in this case.
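For anyone weighing the shunt mod, it helps to sanity-check the arithmetic first. The card's controller infers current from the voltage drop across a tiny shunt resistor, so lowering the effective shunt resistance makes the same real draw report as less power. A minimal Python sketch, where the 5 mOhm shunt value and the 300 W figure are illustrative assumptions, not measurements from any particular card:

```python
# Hedged sketch of the shunt-mod arithmetic (illustrative values only).
# The power controller computes current as V_shunt / R_shunt, so any
# extra conductive path across the shunt lowers R and under-reports power.

def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def reported_power(actual_watts, r_stock, r_modded):
    """Power the controller reports once the shunt resistance drops."""
    return actual_watts * (r_modded / r_stock)

r_stock = 0.005                    # assumed 5 mOhm stock shunt
r_mod = parallel(r_stock, 0.005)   # a second 5 mOhm path on top -> 2.5 mOhm
print(reported_power(300.0, r_stock, r_mod))  # 300 W real draw reads as 150 W
```

This is also why a modded card stops tripping the limit: the real draw doesn't change, only what gets measured.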


----------



## Slackaveli

Quote:


> Originally Posted by *EvilPieMoo*
> 
> You're not running those games maxed out if you're holding 60 with a single Ti. GTA fully maxed out at 4K with 8xAA brings 1080 Ti SLI way below 60 fps, and Mass Effect Andromeda can't hold 60 with a single card at 4K fully maxed out either. I've got my own benchmark videos to confirm it.


Well, of course not, I'm not a 4K noob. Been playing in 4K for 3 years. I would never try to run that much AA; that is just pissing away performance for ZERO benefit, hell, it actually looks worse imo. I use supersampling if I need AA. Otherwise I run with MFAA on (which doubles whatever in-game AA you use) and 2x AA (so, 4x AA effectively, but better) plus FXAA/TAA post-process. Honestly, I have it on just to say I did, b/c 4K without AA is fine.


----------



## Slackaveli

Quote:


> Originally Posted by *mcg75*
> 
> You'll never catch a processor with more cores in that bench.
> 
> The good news is your 7700k is the best gaming processor there is today so I wouldn't worry about that bench.


Not quite, but almost the best. Broadwell is still better if it's just for gaming.


----------



## Slackaveli

Quote:


> Originally Posted by *Dasboogieman*
> 
> Lol don't be. Don't just blindly trust a guy whose best evidence is "my eyes reckon its better" and a link to a thread full of potential confirmation bias lol.
> Do some deeper digging, I personally am very skeptical of gains in the 10FPS region on CPU alone but hey, skepticism is what improves the robustness of the evidence to prove something.
> I mean, even when I see OC results from people around OCN. I take it with a grain of salt and assume everything is 5% lower simply because it is in human nature to exaggerate.
> 
> Enquire further and do directed questions at as many people as possible, preferably ones with no vested interest in defending their purchase decisions.
> 
> For the record, I'm not saying the 5775c dude is wrong, but I'm saying to broaden your scope before you go out to spend money.
> 
> As a followup: I found some reasonably reliable data.
> 
> http://techreport.com/review/28751/intel-core-i7-6700k-skylake-processor-reviewed
> 
> From what I can tell trawling through the Crystallwell owners thread, the 5775c derives its performance from the low latency of the Crystalwell coupled with HEDT class Bandwidth.
> In your usage scenario, the 5775c may achieve an increase in FPS to reach 120 FPS. But you may also get similar results from faster RAM, this is a really tough call to make.


You're going to want the faster RAM too. I hope some others actually investigate before just dismissing my claim b/c they were newbies and hadn't heard of an L4 cache or the magic it provides in games. Honestly, I like being one of the only people on Earth to have this gem of a gaming CPU paired with the best GPU on the planet. You guys just do you. Nothing to see here.


----------



## Slackaveli

Quote:


> Originally Posted by *pez*
> 
> Yeah, Haswell and up is not bottlenecking a Ti or TXP. If it is, the performance difference isn't enough to warrant a side grade to the tune of $350-ish.


Of course not, but it's definitely worth $80 ($350 minus the ~$270 from selling the lesser 4790k). You guys just don't know about that cache. I have been going through this online every time I mention it. People are just uninformed, but they THINK they know everything. DO A LITTLE FREAKING RESEARCH.


----------



## kevindd992002

So is it still the general consensus for the 1080 Ti that the custom boards can overclock better than the FEs when fitted with a waterblock? I ask because this was mostly the case with the 1080, since the 1080 FEs were power-limited and we don't have a Pascal BIOS editor. Is it still worth going with a custom 1080 Ti board when you know you'll just take the cooler off and put on your own waterblock?

How long after a custom board's release does EK generally make custom waterblocks for it, say for the EVGA FTW3 1080 Ti?


----------



## mcg75

Quote:


> Originally Posted by *Slackaveli*
> 
> 
> not quite but almost the best. Broadwell is still better if just for gaming.


Can't agree with that part, sorry. I've probably read 100 reviews of Ryzen over the past month or so where all the CPUs were compared, and the 7700k was coming out on top more often than anything else.

It's just the way most current games are coded: a quad core with higher IPC does better, because the games don't take advantage of more cores.


----------



## Slackaveli

They are beasts, and the obvious best gaming CPU except for the 5775c. The magic cache still wins in most gaming scenarios, but the 7700k is still the bench winner and top dog at single core. You can see I beat both Ryzen and the 7700k above, though. How many of those 100 reviews included a 5775c? A few of them did for sure, because I have also been following Ryzen news daily and I've seen the 5775c shoved down Ryzen's throat over at WCCFtech for the last 2 months. A lot of people know the truth.

Remember, the 5775c was buried by Intel; most benches don't even include it. But whenever it's broken out, it wins.

In this pic you see the 5775c beating the 7700k by 10+ fps with way better minimums, resulting in far smoother gameplay.


----------



## alucardis666

Quote:


> Originally Posted by *pez*
> 
> Both games you're playing support it with great results and scaling. Sometimes you have to pay to play the way you want
> 
> 
> 
> 
> 
> 
> 


Here's hoping that Volta will finally get 4K 60 FPS right.


----------



## Slackaveli

Man, tweak those settings. I was over 50 fps all the way through that game at 2x AA, ultra everything else, and it seems to utilize all 8 threads, so I'd think your beast CPU could handle it. idk, you may need a 7700k @ 5GHz or a Broadwell magic cache to get there in this game. Sometimes all those cores work against you in gaming. But yeah, I'm sure a Volta Ti will power through anything, but it'll be 18 months or so away.


----------



## havoc315

Quote:


> Originally Posted by *Alex24buc*
> 
> I`ll give it a try once my ek waterblock gets delivered. It would be awesome for the original backplate to fit with my waterblock, can hardly wait to see. Thanks!


No problem and good luck. Just know that when you remove the stock FE cover, you will notice there are more places with pads than what EKWB shows in the instructions for the Titan X waterblock, so you might have to do a little measuring if you want to cool everything NVIDIA was cooling on the air cooler; I'm doing that just for peace of mind.
The area between your VRM and VRAM is not shown, but I found that a 1mm thick pad should suffice. Where the MOSFETs and drivers are, the provided pad covers them correctly, but the ones to the right are not shown, so you might want to throw a 1mm pad on each of them, same as on your stock cooler. I just used mine as a reference guide instead of EK's, seeing as it's not the same card; compatible does not mean identical.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> Man, tweak those settings. I was over 50 fps all the way through that game at 2x AA, ultra everything else.


Well, res scaling helps a lot. If I drop to 0.70 or 0.80, both games are infinitely more playable, but I'm obviously sacrificing some clarity. I kinda am debating picking up a 144Hz or better monitor to game on instead.









Dunno whether to do ultra wide or not. Just gonna be hard to game on a sub 40" screen as I've been using TV's over 40" since '05.
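On the res-scale point a few lines up: the slider's cost is quadratic, since both axes shrink. A quick sketch, with native 4K assumed purely as an example:

```python
# Pixel count scales with the square of the resolution-scale slider,
# which is why dropping to 0.70 feels like such a big performance win.

def scaled_pixels(width, height, scale):
    """Pixels rendered at a given resolution-scale setting."""
    return round(width * scale) * round(height * scale)

native = scaled_pixels(3840, 2160, 1.0)
for s in (0.80, 0.70):
    px = scaled_pixels(3840, 2160, s)
    print(f"scale {s:.2f} -> {px / native:.0%} of native 4K pixels")
```

So 0.80 renders only 64% of the native pixels and 0.70 only 49%, which matches how much more playable the games get.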


----------



## Slackaveli

Yeah, I gotta have both. I play on a 55" 4K HDR Samsung and a Dell 1440p 144Hz G-Sync 27". It really depends on the game which one I prefer. ME:A looks way better on the 4K HDR, but it plays much smoother at 144Hz.


----------



## Jbravo33

Quote:


> Originally Posted by *Benny89*
> 
> Anyone here with 1080 Ti SLI and 144/165Hz 1440p monitor? What are your FPS in some new games with SLI? Thanks in advance for info!


Mass Effect and Ghost Recon at 3440x1440, getting around 90 fps with everything on ultra.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah i gotta have both. i play on the 55" 4k HDR samsung and a Dell 1440p 144Hz g-sync 27". really depends on the game which one i prefer. ME:A looks way better on the 4k HDR but it plays much smoother on the 144Hz.


This one?

http://www.bestbuy.com/site/dell-27-led-gsync-monitor-black/5293502.p?skuId=5293502
Quote:


> Originally Posted by *Jbravo33*
> 
> Mass Effect and Ghost Recon at 3440x1440, getting around 90 fps with everything on ultra.


I hate you...
















Would you mind running without SLI and letting me know what kind of results you get in both? I'm curious to know how both play at that res.


----------



## gstarr

Finally, after 2 weeks.


----------



## mtbiker033

Amazon has evga FE's in stock on Saturday:

https://www.amazon.com/gp/product/B06XH2P8DD/ref=od_aui_detailpages00?ie=UTF8&psc=1

$699 prime

just snagged one, should have it April 5th!


----------



## mcg75

Quote:


> Originally Posted by *Slackaveli*
> 
> 
> they are beast and are the obvious best gaming cpu except for the 5775c. Magic cache still wins in most gaming scenarios. But, 7700k is still bench winner and top dog at single core. You can see i beat both ryzen and 7700k above, though. How many of those 100 reviews included a 5775c? a few of them did for sure because i have also been following ryzen news daily and i've seen 5775c shoved down Ryzen's throat over at WCCFtech for the last 2 months. A lot of people know the truth.
> 
> remember, the 5775c was buried by intel. most benches dont even include it. But whenever it's broken out, it wins.
> 
> In this pic you see 5775c beating 7700k by 10+ fps and way better minimums, resulting in far smoother gameplay.


http://www.pcgameshardware.de/Ryzen-7-1800X-CPU-265804/Tests/Test-Review-1222033/

These guys had the 5775c in their test. 7700k won 5 of 7 vs it including minimums.


----------



## KedarWolf

Quote:


> Originally Posted by *mtbiker033*
> 
> Amazon has evga FE's in stock on Saturday:
> 
> https://www.amazon.com/gp/product/B06XH2P8DD/ref=od_aui_detailpages00?ie=UTF8&psc=1
> 
> $699 prime
> 
> just snagged one, should have it April 5th!


I was soooooo lucky.

On launch day a local store had a few Gigabyte cards in stock. I called, they actually held one until the end of the business day for me, and I paid a coworker to pick it up as I was on evening shift. Got it!!

Got EK block and backplate about ten days later.

Below is a benching run. 10880 in TimeSpy.

All rad fans, system fans and pumps at 100% for the bench run; +175 core, +659 memory; my 32GB CL14 Ripjaws V kit at 13-14-13-29 1T 3228 MHz; 5960X CPU at 4.742 GHz, cache at 4.440 GHz.

Time Spy 10880, Graphics Test 10853.

http://www.3dmark.com/3dm/18838458?


----------



## jelome1989

Quote:


> Originally Posted by *mcg75*
> 
> http://www.pcgameshardware.de/Ryzen-7-1800X-CPU-265804/Tests/Test-Review-1222033/
> 
> These guys had the 5775c in their test. 7700k won 5 of 7 vs it including minimums.


That's why I don't trust benchmarks these days...
Here, the 5775c handily beats the 7700k in most of the tests, both at stock and OC speeds, with the 7700k having a 500-800 MHz clock lead over the 5775c:
https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,10

EDIT:
Another one:
https://www.purepc.pl/pamieci_ram/test_ddr3_vs_ddr4_jakie_pamieci_ram_wybrac_do_intel_skylake?page=0,18
The only game where the 7700k beats the 5775c is Crysis 3.


----------



## dboythagr8

Quote:


> Originally Posted by *Benny89*
> 
> Anyone here with 1080 Ti SLI and 144/165Hz 1440p monitor? What are your FPS in some new games with SLI? Thanks in advance for info!


Me.

You will run into a CPU bottleneck. Two cards are too fast, at least for my 4930k 6-core @ 4.5GHz at 1440p. It's not until I use DSR to downsample from 4K or higher that I get a full 98%-99% usage rate on both cards. This happens in 3DMark Fire Strike as well, btw (the standard test).

It's happened on Mass Effect Andromeda, Battlefield 1, GTA V, Rise of the Tomb Raider, 3DMARK. So it's not just a one off thing. Single GPU this isn't an issue.
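For reference, a DSR factor multiplies the pixel count of the display, so each axis scales by the square root of the factor. A small sketch (the helper name is mine for illustration, not any NVIDIA API):

```python
# DSR renders internally at factor-times the display's pixel count,
# then downsamples. Each axis therefore scales by sqrt(factor).

def dsr_resolution(width, height, factor):
    """Internal render resolution for a given DSR factor."""
    axis_scale = factor ** 0.5
    return round(width * axis_scale), round(height * axis_scale)

print(dsr_resolution(2560, 1440, 4.00))  # 1440p at 4.00x -> (5120, 2880), i.e. "5K"
print(dsr_resolution(2560, 1440, 2.25))  # 1440p at 2.25x -> (3840, 2160), i.e. 4K
```

Which is why downsampling from "4K or higher" at 1440p is enough extra work to finally load both cards fully.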


----------



## Benny89

Quote:


> Originally Posted by *dboythagr8*
> 
> Me.
> 
> You will run into a CPU bottleneck. 2 cards are too fast, at least for my 4930k 6 core @ 4.5ghz at 1440p. It's not until I use DSR to downsample from 4k or higher do I get full 98%-99% usage rate on both cards. This happens in 3Dmark Firestrike as well btw (the standard test).
> 
> It's happened on Mass Effect Andromeda, Battlefield 1, GTA V, Rise of the Tomb Raider, 3DMARK. So it's not just a one off thing. Single GPU this isn't an issue.


Ok, thanks. I will stick to a single card on my XB271HU then, till 32" 4K 144Hz monitors hit.


----------



## Rhadamanthys

Can someone explain something to me, please? I still don't understand how Pascal's power limit affects overclocking ability. The Zotac AMP Extreme is said to have a power limit of 320W. Can you tell me how much, if at all, it will benefit from overclocking compared to, say, the FEs? Also, from your experience, will there be a waterblock for the Extreme?


----------



## dboythagr8

Quote:


> Originally Posted by *Benny89*
> 
> Ok, thanks. I will stick to single card on my XB271HU then till 32' 4K 144Hz monitors hit.


No problem. One Ti at 1440p is perfect imo. I can't lie though, the DSR image from 4k and higher with SLI is absolutely amazing. And still able to maintain well over 60 fps.


----------



## bfedorov11

Quote:


> Originally Posted by *Rhadamanthys*
> 
> Can someone explain something to me please, I still don't understand how Pascal's power limit affects overclocking ability. The Zotac Amp Extreme is said to have a powerlimit of 320W. Can you guys tell me how much if at all it will benefit from overclocking compared to, say, the FEs? Also, from your experience, will there be a waterblock for the Extreme?


They will have a lower power target compared to the FE's 120%. Aside from the different cooling, they will perform the same.
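To put the power-target talk above into numbers, here's a quick sketch of the arithmetic. The 250W reference TDP and the FE's 120% slider cap are the commonly cited figures; treat them (and Zotac's quoted 320W) as assumptions, not measured values.

```python
# Rough board-power math for the power-limit discussion above.
# Assumed figures: 1080 Ti reference TDP of 250 W, FE slider cap of 120%,
# Zotac AMP Extreme limit quoted at 320 W.
REFERENCE_TDP_W = 250

def max_board_power(power_target_pct: float, tdp_w: float = REFERENCE_TDP_W) -> float:
    """Maximum sustained board power for a given power-target percentage."""
    return tdp_w * power_target_pct / 100.0

fe_max = max_board_power(120)          # FE at its 120% cap
amp_extreme_w = 320.0                  # Zotac's quoted limit
print(fe_max, amp_extreme_w - fe_max)  # 300.0 20.0 -> roughly 20 W of extra headroom
```

So on these assumed numbers the AMP Extreme's extra headroom over a maxed-out FE is modest, which is consistent with the "they perform about the same" reply.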


----------



## Benny89

Quote:


> Originally Posted by *dboythagr8*
> 
> No problem. One Ti at 1440p is perfect imo. I can't lie though, the DSR image is absolutely amazing from 4k and higher with SLI. And still able to maintain well over 60 fps


I believe you, but it's hard to go back to 60 fps once you've started playing at 90+ fps in games









Besides, I want to finally let my monitor use its potential with all those refresh-rate Hz


----------



## rcfc89

Quote:


> Originally Posted by *dboythagr8*
> 
> Me.
> 
> You will run into a CPU bottleneck. 2 cards are too fast, at least for my 4930k 6 core @ 4.5ghz at 1440p. It's not until I use DSR to downsample from 4k or higher do I get full 98%-99% usage rate on both cards. This happens in 3Dmark Firestrike as well btw (the standard test).
> 
> It's happened on Mass Effect Andromeda, Battlefield 1, GTA V, Rise of the Tomb Raider, 3DMARK. So it's not just a one off thing. Single GPU this isn't an issue.


Damn, looks like this single 1080Ti might be all she wrote for my current setup. The next GPU will be an entirely different build.


----------



## lilchronic

Quote:


> Originally Posted by *Benny89*
> 
> I belive but it's hard to go back to 60 fps once you started to play at 90+ fps in games
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Besides I want to finally let my monitor use his potential with all those refresh rate Hz


It's not possible.


----------



## Alex24buc

Quote:


> Originally Posted by *havoc315*
> 
> No problem and good luck, just know that when you remove the stock FE cover is that you will notice there are more places that have pads on them than what ekwb shows on the instructions of the titan x waterblock...So you might have to do a little measuring if you want to cool all the things nvidia was on the air cooler, I'm doing that just for security to make sure.
> The place between your vrm and vram is not shown but I found that a 1mm thick pad should suffice and then where the mosfets and drivers are the pad provided covers all them correctly but to the right are the ones not shown so might want to throw a 1mm pad on each of them same as on your stock cooler...i just used mine as a reference guide instead of ek's seeing as it's not the same card, compatible does not mean exact.


I'll keep your advice in mind, appreciate your help!


----------



## kevindd992002

Quote:


> Originally Posted by *kevindd992002*
> 
> So is it still general consensus for the 1080Ti that the custom boards can overclock better than the FE's when fitted with a waterblock? I say this because this was mostly the case with the 1080 because the 1080 FE's were power-limited and we don't have a Pascal BIOS Editor. Is it still worth it to go with a custom 1080Ti board when you know you'll just take the cooler off and slap in your own waterblock?
> 
> How long after a custom board release does EK generally make custom WB's for them, say the EVGA FTW3 1080Ti?


Anybody?


----------



## trawetSluaP

All set up:



Few points: I can't add more than 100MHz to the core, otherwise I crash, which is a bit of a disappointment. I get a load of coil whine when rendering high frame rates, such as in Heaven etc., but not in real applications such as gaming.

Lastly, I'm a little bit disappointed with EK (although I ordered through OcUK) as my backplate didn't come with a logo...


----------



## mcg75

Quote:


> Originally Posted by *jelome1989*
> 
> That's why I don't trust benchmarks these days...
> Here, the 5775c handily beats the 7700k in most of the tests both at stock and OC speeds, with the 7700k having a 500-800mhz clock lead over the 5775c
> https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,10
> 
> EDIT:
> Another one:
> https://www.purepc.pl/pamieci_ram/test_ddr3_vs_ddr4_jakie_pamieci_ram_wybrac_do_intel_skylake?page=0,18
> The only game where the 7700k beats the 5775c is Crysis 3.


In that first link, the 7700k beats the 5775c in 7 of the 10 tests.

The second link doesn't have a 7700k in it. That test was done in 2015.


----------



## rcfc89

Quote:


> Originally Posted by *kevindd992002*
> 
> Anybody?


The popular opinion today is that they are no longer binning chips. So if you're throwing a block on it, I'd just go with an FE. Although there could be more available power on the higher-end AIBs, considering they're 2x 8-pin vs. the FE's 6+8. That could simply be there to feed the beefier cooling and fans.


----------



## davidmoffitt

Quote:


> Originally Posted by *trawetSluaP*
> 
> ... can't add more than a 100MHz to the core otherwise I crash, bit of a disappointment. I get a load of coilwhine ...


Dang, sounds like you got a "Friday afternoon" card.







Are you upping the memory speed and/or adding any voltage?


----------



## trawetSluaP

Quote:


> Originally Posted by *davidmoffitt*
> 
> Dang, sounds like you got a "Friday afternoon" card.
> 
> 
> 
> 
> 
> 
> 
> Are you upping the memory speed and/or adding any voltage?


No, not touched the memory or voltage.


----------



## kevindd992002

Quote:


> Originally Posted by *rcfc89*
> 
> The popular opinion today is that they are no longer binning chips. So if you're throwing a block on it then I'd just go with a FE. Although there could be more available power on the higher-end AIB's considering they're 2-8 pin vs the FE 6+8. This could simply be due to better cooling fans.


Right but are 1080Ti FE's known to be power-limited as well? I mean if they're not then the extra available power with the custom boards will not matter. What then would the advantage of custom boards be aside from the good air cooler?

Also, how could it be simply due to better cooling fans? I didn't understand that part.


----------



## jelome1989

Quote:


> Originally Posted by *mcg75*
> 
> In that first link, the 7700k beats the 5775c 7 of 10 times.
> 
> The second link doesn't have a 7700k in it. Test was done in 2015.


Sorry, I was talking about the OC results of the 5775c in the first link. At stock, the 5775c is at a huge disadvantage with about 800MHz lower clock speed, but the 7700k only beats the 5775c by 1 or 2 fps on average. At OC speeds the 5775c beats the 7700k handily, even with 800MHz less. Clock for clock, the 5775c clearly beats the 7700k in almost every game except Crysis 3.
The second link has a 6700k; we can extrapolate for the 7700k too, since both CPUs perform equally clock for clock.

Anyway, what I really want to say is that benchmarks are just hard to trust these days. Looking at pcgameshardware's test, the 7700k beats the 5775c at StarCraft 2 by a HUGE margin, but in purepc's test, the 5775c beats the 6700k/7700k at StarCraft 2.


----------



## rcfc89

Quote:


> Originally Posted by *kevindd992002*
> 
> Right but are 1080Ti FE's known to be power-limited as well? I mean if they're not then the extra available power with the custom boards will not matter. What then would the advantage of custom boards be aside from the good air cooler?
> 
> Also, how could it be simply due to better cooling fans? I didn't understand that part.


Rumor has it that these GPUs will be voltage-locked. If that's the case, I'm just assuming the additional power from 2x 8-pin on select AIBs is for the additional fans/RGB etc. over what the FE has.


----------



## kevindd992002

Quote:


> Originally Posted by *rcfc89*
> 
> Rumor has it that these gpu's will be voltage locked. If that's the case I'm just assuming the additional power added from 2x8pin on select AIB's is for the additional fans/ RGB's etc over what the FE has.


Voltage-locked as in even the Afterburner voltage slider won't work? I was assuming all Pascal cards are voltage-locked.


----------



## EvilPieMoo

Quote:


> Originally Posted by *dboythagr8*
> 
> Me.
> 
> You will run into a CPU bottleneck. 2 cards are too fast, at least for my 4930k 6 core @ 4.5ghz at 1440p. It's not until I use DSR to downsample from 4k or higher do I get full 98%-99% usage rate on both cards. This happens in 3Dmark Firestrike as well btw (the standard test).
> 
> It's happened on Mass Effect Andromeda, Battlefield 1, GTA V, Rise of the Tomb Raider, 3DMARK. So it's not just a one off thing. Single GPU this isn't an issue.


I'm curious as to how you're running into a CPU bottleneck: is one of the cores running at 100%, or is the CPU utilisation completely maxing out? I've not run into any such issues with 1080 Ti SLI on a 6800K 6-core, pushing 98-99% usage handily in most games at 4K. Crysis 3 even pushed both cards to 100% each.


----------



## jelome1989

Quote:


> Originally Posted by *EvilPieMoo*
> 
> I'm curious as to how you're running into a CPU bottleneck, is one of the cores running at 100% or is the CPU utilisation completely maxing out? I've not run into any such issues with 1080 Ti SLI with a 6800K 6 core, pushing 98-99% usage handily in most games at 4K. Crisis 3 even pushed both cards to 100% each.


Have you tried running at 1440p? Because at 1440p those Tis are a lot more likely to bottleneck CPUs...


----------



## KedarWolf

Quote:


> Originally Posted by *kevindd992002*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rcfc89*
> 
> Rumor has it that these gpu's will be voltage locked. If that's the case I'm just assuming the additional power added from 2x8pin on select AIB's is for the additional fans/ RGB's etc over what the FE has.
> 
> 
> 
> Voltage-locked as in even the Afterburner voltage slider won't work? I was assuming all Pascal cards are voltage-locked.

Myself and others have found the unlocked voltage slider does not increase the max voltage, but on water it DOES keep the card at the maximum 1.062v more consistently instead of letting it drop lower.

With it enabled (see my earlier post), on a Heaven run without the shunt mod my voltage stayed at 1.062v for 99% of the run; I think the few seconds of dipping was a power-limit issue.

Without it, voltages and clocks dipped on and off much more.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> This one?
> 
> http://www.bestbuy.com/site/dell-27-led-gsync-monitor-black/5293502.p?skuId=5293502
> I hate you...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Would you mind running without SLI and letting my know what kinda results you get in both? I'm curious to know how both play at that res.


Yep, that's the best buy in 1440p 144Hz G-Sync right there.


----------



## Slackaveli

Quote:


> Originally Posted by *mcg75*
> 
> http://www.pcgameshardware.de/Ryzen-7-1800X-CPU-265804/Tests/Test-Review-1222033/
> 
> These guys had the 5775c in their test. 7700k won 5 of 7 vs it including minimums.


At 3.3GHz. The cache is magic, but it can't overcome a 1.7GHz deficit. Try it again at 4.3.


----------



## Slackaveli

Quote:


> Originally Posted by *jelome1989*
> 
> That's why I don't trust benchmarks these days...
> Here, the 5775c handily beats the 7700k in most of the tests both at stock and OC speeds, with the 7700k having a 500-800mhz clock lead over the 5775c
> https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,10
> 
> EDIT:
> Another one:
> https://www.purepc.pl/pamieci_ram/test_ddr3_vs_ddr4_jakie_pamieci_ram_wybrac_do_intel_skylake?page=0,18
> The only game where the 7700k beats the 5775c is Crysis 3.


THANK YOU. I finally got some backup!! Appreciate it, man.


----------



## Slackaveli

Quote:


> Originally Posted by *jelome1989*
> 
> Sorry I was talking about the OC results of the 5775c in the first link. At stock, the 5775c is at a huge disadvantage with about 800mhz less clock speeds, but the 7700k only beats the 5775c by 1 or 2 fps on average. At OC speeds the 5775c beats the 7700k handily, even with 800mhz less speeds. Clock for clock, the 5775c clearly beats the 7700k on almost every game, except Crysis 3.
> The second link has a 6700k. We can draw estimates for the 7700k too since both CPUs perform equally clock for clock.
> 
> Anyway, what I really want to say is just that benchmarks are just hard to trust these days. Looking at pcgameshardware's test, the 7700k beats the 5775c at StarCraft 2 by a HUGE margin, but in purepc's test, the 5775c beats the 6700k/7700k at StarCraft 2.


Every time I bring it up I get abused by the forum. So many IGNORANT posters on here, it's pathetic. I thought on Overclock.net we'd have smarter posters....


----------



## palote99

Quote:


> Originally Posted by *mcg75*
> 
> In that first link, the 7700k beats the 5775c 7 of 10 times.
> 
> The second link doesn't have a 7700k in it. Test was done in 2015.


That's not true... And when both are OC'd, the 5775c rules all the benches...


----------



## Slackaveli

Quote:


> Originally Posted by *mcg75*
> 
> http://www.pcgameshardware.de/Ryzen-7-1800X-CPU-265804/Tests/Test-Review-1222033/
> 
> These guys had the 5775c in their test. 7700k won 5 of 7 vs it including minimums.


Quote:


> Originally Posted by *palote99*
> 
> Thats not true.... And when both are OC..... 5775c rules al benches......


I've been trying to tell them. They act like it's some crappy little old chip when really it's not even a year and a half old, and it beats Ryzen and Kaby in gaming. Some people are very easily fooled.


----------



## Mato87

I am pleased to announce I will be joining this club tomorrow







It's going to be at least 1 or 2 months before the Gigabyte Aorus Xtreme will even be available, so I decided not to wait any longer and went for the ASUS ROG Strix Gaming OC version. It's going to set me back quite a lot of money, but considering all the benchmarks, tests and reviews I read over the past two weeks, it's going to be a huge improvement over the 980 Ti I have now: almost 2 times the performance. I'm glad I sold the 980 Ti last month for quite a good price too, so the setback wasn't as big


----------



## dboythagr8

Quote:


> Originally Posted by *EvilPieMoo*
> 
> I'm curious as to how you're running into a CPU bottleneck, is one of the cores running at 100% or is the CPU utilisation completely maxing out? I've not run into any such issues with 1080 Ti SLI with a 6800K 6 core, pushing 98-99% usage handily in most games at 4K. Crisis 3 even pushed both cards to 100% each.


Read my post again ?


----------



## manolith

My 1080ti gets here monday. Evga founders edition.


----------



## Slackaveli

Quote:


> Originally Posted by *kevindd992002*
> 
> Voltage-locked as in even the Afterburner voltage slider won't work? I was assuming all Pascal cards are voltage-locked.


Quick, easy fix for that: it just needs to be added to the third-party card database manually by copying/pasting one string


----------



## EvilPieMoo

Quote:


> Originally Posted by *dboythagr8*
> 
> Read my post again ?


Ah, when you said downsample from 4K I read that as you're downscaling from 4K to 1440p, which led me to believe you were having the issue running 4K. My mistake, I should read more carefully next time.


----------



## Slackaveli

Still, though, on a 6800k you won't bottleneck.


----------



## KedarWolf

Quote:


> Originally Posted by *kevindd992002*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rcfc89*
> 
> Rumor has it that these gpu's will be voltage locked. If that's the case I'm just assuming the additional power added from 2x8pin on select AIB's is for the additional fans/ RGB's etc over what the FE has.
> 
> 
> 
> Voltage-locked as in even the Afterburner voltage slider won't work? I was assuming all Pascal cards are voltage-locked.

http://www.overclock.net/t/1625653/how-to-get-voltage-slider-in-afterburner-working-on-a-1080-ti/0_20 use the WildCards.zip file.

And read the last few posts in thread.


----------



## dboythagr8

Quote:


> Originally Posted by *Slackaveli*
> 
> still, though, on a 6800k you wont bottleneck.


While I do not own a 6800k, I feel fairly confident in saying the CPU will be holding a pair of Tis back at *2560x1440*. They are ridiculously fast at that resolution and that's not even taking into account overclocking.


----------



## EXVAS3221

soon to be one?


----------



## rcfc89

Quote:


> Originally Posted by *Mato87*
> 
> Iam pleased to announce I will be joing this club tomorrow
> 
> 
> 
> 
> 
> 
> 
> It's going to be at least 1 or 2 months before the gigabyte aorus xtreme will even be available, so I decided not to wait any longer and went for the ASUS ROG STRIX GAMING OC version. It's going to set me back for quite a lot of money, but considering all the benchmarks, tests and reviews I read over the past two weeks, it's going to be a huge improvement over 980 ti I have now, almost 2 times the performance. Iam glad I sold the 980 ti last month for quite a good price too, so the set back wasn't as big


That's odd, Aorus stated this on Twitter: https://twitter.com/search?q=%23aorus

It was originally tomorrow.


----------



## jelome1989

Quote:


> Originally Posted by *Slackaveli*
> 
> everytime i bring it up i get abused by the forum. So many IGNORANT posters on here it's pathetic. I thought on Ooverclock.net we'd have smarter posters....


Oh sorry to hear about that. I don't know the details or context so I'd rather keep silent about the matter.

The benchmarks just indicate that the 5775c is the better gaming CPU overall.


----------



## jelome1989

Quote:


> Originally Posted by *EvilPieMoo*
> 
> Ah, when you said downsample from 4K I read that as you're downscaling from 4K to 1440p, which lead to believe you were having the issue running 4K. My mistake, I should read more carefully next time.


Makes sense now...
Quote:


> Originally Posted by *dboythagr8*
> 
> While I do not own a 6800k, I feel fairly confident in saying the CPU will be holding a pair of Tis back at *2560x1440*. They are ridiculously fast at that resolution and that's not even taking into account overclocking.


Yeah, those Tis are so fast they can still bottleneck at 1440p. It's crazy.


----------



## Mato87

Quote:


> Originally Posted by *rcfc89*
> 
> That's odd Aorus stated this on Twitter : https://twitter.com/search?q=%23aorus
> 
> It was originally tomorrow.


Yeah, that's the standard Aorus version; I wanted the Xtreme







Doesn't matter anyway; there's only a slight difference, so I just went for the one non-reference card that was available at the local electronics store. I was lucky, because they sold out a few minutes after I ordered it LOL. And where I live there isn't really a huge market for this kind of hardware...


----------



## Slackaveli

Quote:


> Originally Posted by *dboythagr8*
> 
> While I do not own a 6800k, I feel fairly confident in saying the CPU will be holding a pair of Tis back at *2560x1440*. They are ridiculously fast at that resolution and that's not even taking into account overclocking.


Maybe, because of the single-core score and poor optimization in utilizing its cores. But a 7700k/5775c or any of the other great gaming CPUs don't have the PCIe lanes to not be "holding them back" a bit, and SLI scaling itself is already "holding them back" tbh, so it's impossible to fully utilize 2 cards. Just part of the deal.


----------



## Slackaveli

Quote:


> Originally Posted by *jelome1989*
> 
> Oh sorry to hear about that. I don't know the details or context so I'd rather keep silent about the matter.
> 
> The benchmarks just dictate that the 5775c is the better gaming CPU overall.


People just don't know the facts, and they get very defensive about their CPU choices. They really, really hate to hear that they could have just plopped a 5775c into their z97 boards and had better gaming performance, instead of buying a new mobo, new RAM and a new CPU like most did. Which is on Intel, because instead of telling the world about the awesome L4 cache, they buried the product lol.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Mato87*
> 
> Iam pleased to announce I will be joing this club tomorrow
> 
> 
> 
> 
> 
> 
> 
> It's going to be at least 1 or 2 months before the gigabyte aorus xtreme will even be available, so I decided not to wait any longer and went for the ASUS ROG STRIX GAMING OC version. It's going to set me back for quite a lot of money, but considering all the benchmarks, tests and reviews I read over the past two weeks, it's going to be a huge improvement over 980 ti I have now, almost 2 times the performance. Iam glad I sold the 980 ti last month for quite a good price too, so the set back wasn't as big


Good, now you can stop bugging people about Quantum Break.


----------



## WarbossChoppa

I'm gonna be buying one as soon as models with a DVI-D connector are released. I'm leaning towards the ASUS Strix version, and by the end of the year I'll be putting it into a custom water loop. I don't know which model I should grab, because in the end the one I'll want to buy is the one that overclocks the best. Anyone know which one they'll be grabbing for something similar?


----------



## Jbravo33

Quote:


> Originally Posted by *dboythagr8*
> 
> No problem. One Ti at 1440p is perfect imo. I can't lie though, the DSR image is absolutely amazing from 4k and higher with SLI. And still able to maintain well over 60 fps


When people say 1440 they are mostly talking about 2560x1440, correct? The fps some people are stating I can't get unless I adjust to 2560x1440 and make it fullscreen. I want to try DSR tonight. Do you adjust it in the Nvidia CP, say at 1.5x, then do so within the game? Are you on 3440 or 2560? Hate that my monitor is only 60Hz









Also, how do you tell if your CPU is being bottlenecked?


----------



## PewnFlavorTang

I'm not sure how you guys can say you get X frames with SLI. SLI has been piss poor for me. I uninstalled the 2nd card b/c it was pretty much dung imho.


----------



## duppex

OC3D TV's Asus GTX 1080 Ti Strix OC review is now up on YouTube.

GPU looks very nice. Shame about the price.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> 
> they are beast and are the obvious best gaming cpu except for the 5775c. Magic cache still wins in most gaming scenarios. But, 7700k is still bench winner and top dog at single core. You can see i beat both ryzen and 7700k above, though. How many of those 100 reviews included a 5775c? a few of them did for sure because i have also been following ryzen news daily and i've seen 5775c shoved down Ryzen's throat over at WCCFtech for the last 2 months. A lot of people know the truth.
> 
> remember, the 5775c was buried by intel. most benches dont even include it. But whenever it's broken out, it wins.
> 
> In this pic you see 5775c beating 7700k by 10+ fps and way better minimums, resulting in far smoother gameplay.


Why did I not hear about this more?? Looks like an OCed (not stock 3.3GHz) 5775c beats every other top gaming CPU (4790k, 6700k, 7700k) by quite a big margin (10 fps vs the 7700k).

Honestly, I never heard of this CPU. Why was it dumped by Intel?


----------



## dboythagr8

Quote:


> Originally Posted by *Jbravo33*
> 
> When people say 1440 they are mostly talking about 2560 correct? the fps some people are stating i cant get unless i adjust to 2560x1440 and make it fullscreen. i want to try the dsr tonight. do you adjust it in the nvidia cp lets say at 1.5 then do so within the game? are u on 3440 or 2560. hate that my monitor is only 60hz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also how do tell if your cpu is being bottlenecked?


Yes, 2560x1440. You enable DSR in NVCP and you can select how much you want to use; I think it goes up to 4x of the resolution you're currently using. Then you go into the game and select that resolution, or you can do it through GeForce Experience. As for CPU bottlenecking, I could tell because my GPU usage in SLI was at about 60% for each card @ 1440p. Something's not right. Looked at the logs and, just on a hunch, I figured 1440p was too low of a resolution for TWO Tis. Turned to DSR and everything evened out on my GPUs.
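For anyone wondering what resolution a given DSR factor actually renders at: the factors in NVCP scale the total pixel count, so each linear dimension scales by the square root of the factor. A quick sketch (the driver may round the final numbers slightly differently):

```python
import math

def dsr_resolution(width: int, height: int, factor: float) -> tuple[int, int]:
    """Render resolution for a DSR factor. Factors apply to total pixel
    count, so each linear dimension scales by sqrt(factor)."""
    s = math.sqrt(factor)
    return round(width * s), round(height * s)

print(dsr_resolution(2560, 1440, 4.0))  # (5120, 2880) - the 4x case
print(dsr_resolution(2560, 1440, 1.5))  # roughly (3135, 1764)
```

So "1.5x" on a 1440p panel is nowhere near 4K; it takes the full 4.00x factor to downsample from 5120x2880.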

Think of it as having a Lamborghini Aventador. Driving it on a road where the speed limit is 40mph, the car probably isn't all that comfortable, although it will certainly get you where you want in style. All that power in the V12 is going to waste. Take it to a freeway where the limit is higher, or better yet a track, and the car can actually use that power under the hood. Same thing with resolutions and this amount of GPU power.
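The diagnosis described above (per-card GPU usage sitting around 60% at 1440p, evening out to 98-99% under DSR) can be reduced to a simple heuristic: if average GPU utilization stays well below ~95% with an uncapped frame rate, the CPU likely can't feed the cards. A minimal sketch; the sample numbers mirror the figures reported above, and in practice the utilization samples would come from an Afterburner/GPU-Z log or `nvidia-smi` polling:

```python
def looks_cpu_bound(gpu_util_samples: list[float], threshold: float = 90.0) -> bool:
    """True if average GPU utilization stays under `threshold` percent,
    suggesting the CPU is the limiting factor (assuming no fps cap/vsync)."""
    avg = sum(gpu_util_samples) / len(gpu_util_samples)
    return avg < threshold

sli_at_1440p = [58, 62, 61, 60, 59]   # ~60% per card, as reported above
sli_with_dsr = [98, 99, 99, 98, 99]   # evens out once DSR raises the GPU load
print(looks_cpu_bound(sli_at_1440p))  # True  -> likely CPU-limited
print(looks_cpu_bound(sli_with_dsr))  # False -> GPU-limited
```

Worth noting the caveat baked into the comment: a frame cap, vsync, or a poor SLI profile can also hold GPU usage down, so low utilization alone isn't proof of a CPU bottleneck.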

Quote:


> Originally Posted by *PewnFlavorTang*
> 
> I'm not sure how you guys can say you get X frames with SLI. SLI has been piss poor for me. I uninstalled the 2nd card b/c it was pretty **** imho.


I've had great SLI performance outside of blown out contrast and flickering in BF1.


----------



## rcfc89

Quote:


> Originally Posted by *Mato87*
> 
> Yeah that's the standard aorus version, I wanted the xtreme
> 
> 
> 
> 
> 
> 
> 
> Doesn't matter anyway, there is only a slight difference so I just went for the one non reference card that was available at the local electronics store. I was lucky, because they sold out a few minutes after I ordered it LOL. And where I live there isn't really a huge market for this kind of hardware...


Who confirmed it's only the non-OC being released?


----------



## rcfc89

Quote:


> Originally Posted by *dboythagr8*
> 
> Yes 2560x1440. You enable DSR in NVCP and you can select the how much you want to use. I think it goes up to 4x of the resolution you're currently using. Then you go into the game and select that resolution or you can do it through GeForce Experience. As for CPU bottlenecking I could tell because my GPU performance was in SLI was at about 60% for each card @ 1440p. Something's not right. Looked at logs and just my hunch, I figured 1440p was too low of a resolution for TWO Tis. Turned to DSR and everything evened out on my GPUs.
> 
> Think of it as having Lamborghini Aventador. Driving it on a road where the speed limit is 40mph. The car probably isn't all that comfortable although it will certainly get you to where you want in style. All that power in the V12 is going to waste. Take it to a free way where the limit is upped or better yet a track and the car can actually use that power under the hood. Same thing with resolutions with this amount of GPU power ?
> I've had great SLI performance outside of blown out contrast and flickering in BF1.


The blown-out contrast in BF1 is brutal at times. One card for me from here on out. I think we have finally hit the point where one is enough.


----------



## kevindd992002

Quote:


> Originally Posted by *KedarWolf*
> 
> Myself and others have found the unlocked voltage slider does not increase max voltages but DOES on water keep voltages at the maximum 1.062v without it dropping lower more constantly.
> 
> With it in an earlier post without shunt mod doing a Heaven run my voltages stayed at 1.062 99% of the run, I think the few seconds dip was a Power Limit issue.
> 
> Without it voltages and clocks dipped much more off and on.


Is your card a 1080Ti FE? Sorry, I cannot see your sig as I'm on mobile. If it is, would you say a custom card is still worth it for watercooling, as it removes the power limit that one can potentially hit at some point?
Quote:


> Originally Posted by *Slackaveli*
> 
> quick easy fix to that. it just needs to be added to the 3rd party card database manually by copying/pasting one string


Yeah, and as far as I remember the same behavior (voltage lock) happens on the 1080/1070, right? So I don't follow how the 1080Ti being voltage-locked is a rumor that warrants the conclusion that the additional power of partner boards won't affect OC results. @rcfc89 care to elaborate?


----------



## rcfc89

Quote:


> Originally Posted by *kevindd992002*
> 
> Is your card a 1080Ti FE? Sorry, I cannot see your sig as I'm mobile. If it is, then can you say a custom card is still worth it for watercooling as it removes the potential power limit that one can reach at one point or another?
> Yeah and as far as I remember the same behavior (voltage locked) happens on the 1080/1070, right? So I don't follow how the 1080Ti being voltage-locked is a rumor that can warrant that the additional power of partner boards will not affect OC results. @rcfc89 care to elaborate?


Time will tell, I guess. From what I've gathered from the 1080ti owners thread, which has a few Strix owners and one Aorus owner (both 8+8pin), they are getting similar clocks to those with reference boards on water with only a 6+8pin. Seems that for now there's a ceiling with the 1080ti with voltage being locked.


----------



## Jbravo33

Regardless of AIB or not, I don't see this card getting more than 2100; only a couple of people have. I'm on water SLI, one FE, one MSI FE, and 2075 is my max. What's an AIB on water gonna get, maybe 2113 or so? Is that worth the wait and extra money? I'd say just get an FE and soak it!


----------



## KedarWolf

Quote:


> Originally Posted by *kevindd992002*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Myself and others have found the unlocked voltage slider does not increase max voltages but DOES on water keep voltages at the maximum 1.062v without it dropping lower more constantly.
> 
> With it in an earlier post without shunt mod doing a Heaven run my voltages stayed at 1.062 99% of the run, I think the few seconds dip was a Power Limit issue.
> 
> Without it voltages and clocks dipped much more off and on.
> 
> 
> 
> Is your card a 1080Ti FE? Sorry, I cannot see your sig as I'm mobile. If it is, then can you say a custom card is still worth it for watercooling as it removes the potential power limit that one can reach at one point or another?
> Quote:
> 
> 
> 
> Originally Posted by *Slackaveli*
> 
> quick easy fix to that. it just needs to be added to the 3rd party card database manually by copying/pasting one string
> 
> 
> Yeah and as far as I remember the same behavior (voltage locked) happens on the 1080/1070, right? So I don't follow how the 1080Ti being voltage-locked is a rumor that can warrant that the additional power of partner boards will not affect OC results. @rcfc89 care to elaborate?

I have a Gigabyte 1080 FE. Yes, go with an aftermarket card, or wait to flash a different BIOS onto an FE.









Or an FE with the shunt mod. Even without it, my card doesn't power-limit throttle 99% of the time while running a full Heaven run.


----------



## alucardis666

Quote:


> Originally Posted by *Jbravo33*
> 
> regardless of AIB or not i dont see this card getting more than 2100 only a couple people have. Im on water sli, one fe, one msi fe and 2075 is max whats a AIB on water gonna get 21? maybe 2113 or so? Is that worth the wait and extra money? Id say just get FE and soak it!


Agreed!


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Why I did not hear about it more?? Looks like OCed (not stock 3,3 Ghz) 5775c is beating every other top CPU in gaming (4790k, 6700k, 7700k) by quite a big margin (10 fps vs 7700k).
> 
> Honestly I never head about this CPU. Why it was dumped by Intel?


Because it was an unintended consequence (the L4 cache, and how much it helps when the iGPU is off and a dGPU is used) that would have made Skylake, Kaby Lake, and Z170/Z270 unneeded upgrades for gaming, worse than side-grades in fact. They buried it because the tech is too good. The info is out there, but people get very defensive when it's pointed out to them. In certain games it's well known; Project CARS players, for instance. There are a ton of PC players on the 5775C because it is WAY better than a Skylake/Haswell in that game. There are huge gains in RTS games, too. And on top of all that, it overclocks from 3.3 to 4.3 GHz easily, and does it just sipping juice compared to the other Z97 CPUs: its TDP is only 65 W, and at idle it uses 11 W. Crazy.


----------



## KedarWolf

Quote:


> Originally Posted by *PewnFlavorTang*
> 
> I'm not sure how you guys can say you get X frames with SLI. SLI has been piss poor for me. I uninstalled the 2nd card b/c it was pretty much dung imho.


You can enable SLI in most any game with Nvidia Inspector, even enable three and four way SLI with it. I just dunno how peeps are setting up high bandwidth bridges three and four way though.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> because it was an unintended consequence (the L4 cache and how it helps when the igpu is off and a dgpu is used) that was going to make Skylake, Kabylake, z170/270 all un-needed upgrades, worse than side grades in fact, in gaming. They buried it because the tech is too good. Also, the info is out there but people get very defensive when it's pointed out to them. Certain games it's well known. Project Cars players, for instance. There are a ton of PC players on 5775c, because it is WAY better than a Skylake/haswell in that game. Also, huge gains in rts games. And on top of all that, it overclocks from 3.3 to 4.3 easily , and does it just sipping juice compared to the other z97 cpu's. it's tdp is only 65w. Idle it uses 11w. Crazy.


So I should sell my 6950x and score one of these.


----------



## KedarWolf

Quote:


> Originally Posted by *Slackaveli*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Benny89*
> 
> Why I did not hear about it more?? Looks like OCed (not stock 3,3 Ghz) 5775c is beating every other top CPU in gaming (4790k, 6700k, 7700k) by quite a big margin (10 fps vs 7700k).
> 
> Honestly I never head about this CPU. Why it was dumped by Intel?
> 
> 
> 
> because it was an unintended consequence (the L4 cache and how it helps when the igpu is off and a dgpu is used) that was going to make Skylake, Kabylake, z170/270 all un-needed upgrades, worse than side grades in fact, in gaming. They buried it because the tech is too good. Also, the info is out there but people get very defensive when it's pointed out to them. Certain games it's well known. Project Cars players, for instance. There are a ton of PC players on 5775c, because it is WAY better than a Skylake/haswell in that game. Also, huge gains in rts games. And on top of all that, it overclocks from 3.3 to 4.3 easily , and does it just sipping juice compared to the other z97 cpu's. it's tdp is only 65w. Idle it uses 11w. Crazy.
Click to expand...

Sounds good, but I think I'll stick with my 5960x at 4.742GHZ, getting nice results with it.


----------



## alucardis666

Quote:


> Originally Posted by *KedarWolf*
> 
> Sounds good, but I think I'll stick with my 5960x at 4.742GHZ, getting nice results with it.


4.7 at under 1.3v, you won the silicon lottery for sure!


----------



## Bishop07764

Quote:


> Originally Posted by *Jbravo33*
> 
> regardless of AIB or not i dont see this card getting more than 2100 only a couple people have. Im on water sli, one fe, one msi fe and 2075 is max whats a AIB on water gonna get 21? maybe 2113 or so? Is that worth the wait and extra money? Id say just get FE and soak it!


I'd second that. The FE makes a lot of sense if you are going to water-cool. If I were on air, though, I'd want a quieter AIB card. Overclocking still seems to be a complete silicon lottery.

Anyone else get something like a 20 fps boost in The Division with DX12? The 1080 Ti is pulling over 100 fps average at 1440p with everything maxed, including NVIDIA HFTS shadows, according to the benchmark. Or maybe my CPU is just that weak now?


----------



## KedarWolf

Quote:


> Originally Posted by *alucardis666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Sounds good, but I think I'll stick with my 5960x at 4.742GHZ, getting nice results with it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 4.7 at under 1.3v, you won the silicon lottery for sure!
Click to expand...

Strange thing is this is an Intel warranty replacement for a CPU I thought was great that did 4.6GHZ at 1.312v. I fried that one.


----------



## alucardis666

Quote:


> Originally Posted by *KedarWolf*
> 
> Strange thing is this is an Intel warranty replacement for a CPU I thought was great that did 4.6GHZ at 1.312v. I fried that one.


LMAO!

Guess you struck gold twice.


----------



## egandt

I have 4 displays on my GTX 1080 Ti: Windows 10, 378.92 drivers (I've tried 3 driver versions with the same issue occurring)

1
2 3 4

The NVIDIA driver works and I can drive them all, 1 over HDMI and 3 using DP. However, when I resume from sleep, the IDs on the monitors change, so if the above config is what I have before sleep, on resume I might see:

2
1 4 3

Next time:

3
4 2 1

The primary monitor is maintained (it was 3, then 4, and is now 2), but, for instance, a video set to start on 4 now starts on the wrong screen, or a program running on 1 is now on the wrong monitor. This gets annoying very quickly, and I do not see why it happens or what I can do to resolve it. Any suggestions? It is driving me plain batty.

Thanks,
ERIC


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> So I should sell my 6950x and score one of these.


Of course not; it's a $350 4-core. But if all you care about is gaming, then why not? You could get a 5775C plus a Z97 mobo and another 1080 Ti off the money made from that CPU, AND have change. And more fps...


----------



## BoredErica

Quote:


> Originally Posted by *mtbiker033*
> 
> Amazon has evga FE's in stock on Saturday:
> 
> https://www.amazon.com/gp/product/B06XH2P8DD/ref=od_aui_detailpages00?ie=UTF8&psc=1
> 
> $699 prime
> 
> just snagged one, should have it April 5th!


Wow, only took them ten years, lol.
Another 10 for MSI Armor to show on Amazon, and another 50 until EVGA SC does.









Quote:


> Originally Posted by *mcg75*
> 
> http://www.pcgameshardware.de/Ryzen-7-1800X-CPU-265804/Tests/Test-Review-1222033/
> 
> These guys had the 5775c in their test. 7700k won 5 of 7 vs it including minimums.


The feels when my 7600k goes to 5.2ghz instead of 4.5 :thinking:


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> You could get a 5775c/z97mobo and another 1080ti off the money made from that cpu, AND have change. And more fps...


Don't tempt me...








Quote:


> Originally Posted by *Darkwizzie*
> 
> The feels when my 7600k goes to 5.2ghz instead of 4.5


5.2Ghz?!

#Jealous...


----------



## Baasha

Quote:


> Originally Posted by *dboythagr8*
> 
> I've had great SLI performance outside of *blown out contrast* and flickering in BF1.


First time I'm hearing this. Have played the game using single GPU in DX12 and SLI in DX11 and have never seen any blown-out contrast.

Got any pics/videos of this?

Here's 4-Way SLI madness at 5K maxed out, and that too with an old-school platform + CPU:


----------



## Slackaveli

old school plat / cpu in 5k is still very OP


----------



## alucardis666

Quote:


> Originally Posted by *Baasha*
> 
> First time I'm hearing this. Have played the game using single GPU in DX12 and SLI in DX11 and have never seen any blown-out contrast.
> 
> Got any pics/videos of this?
> 
> Here's 4-Way SLI madness at 5K maxed out - and that too with a old-school platform + CPU:


You have 4 Titan XP's and 4 1080Ti's... What the heck do you do for a living?!


----------



## dboythagr8

Quote:


> Originally Posted by *Baasha*
> 
> First time I'm hearing this. Have played the game using single GPU in DX12 and SLI in DX11 and have never seen any blown-out contrast.
> 
> Got any pics/videos of this?
> 
> Here's 4-Way SLI madness at 5K maxed out - and that too with a old-school platform + CPU:


It's awful. I'll see if I can get some pics. It's blown out contrast and flickering. Immediately going back to a single card removes this and it works as expected.

Edit - Are you using a specific profile in Inspector? You don't have the flickering but it kind of looks like the overblown contrast is there...


----------



## mcg75

Quote:


> Originally Posted by *jelome1989*
> 
> Sorry I was talking about the OC results of the 5775c in the first link. At stock, the 5775c is at a huge disadvantage with about 800mhz less clock speeds, but the 7700k only beats the 5775c by 1 or 2 fps on average. At OC speeds the 5775c beats the 7700k handily, even with 800mhz less speeds. Clock for clock, the 5775c clearly beats the 7700k on almost every game, except Crysis 3.
> The second link has a 6700k. We can draw estimates for the 7700k too since both CPUs perform equally clock for clock.


Sorry, I didn't even see the OC results there.

But even after having read them, I can't agree with some of your conclusions above. Here's the two reasons why.

6 of 10 games in the OC test end up within less than 2 fps difference. That's well within the margin of error for the test.

The overall average of all 10 games sees the 5775c at 86.5 fps and 7700k at 85.0 fps.

So while it in no way beats the 7700k handily or clearly, it does indeed beat it.

And quite frankly, that impresses the heck out of me. The 5775c is a beast worthy of respect.
Quote:


> Originally Posted by *jelome1989*
> 
> Anyway, what I really want to say is just that benchmarks are just hard to trust these days. Looking at pcgameshardware's test, the 7700k beats the 5775c at StarCraft 2 by a HUGE margin, but in purepc's test, the 5775c beats the 6700k/7700k at StarCraft 2.


I agree. There can be some real differences in results sometimes that boggle my mind.
Quote:


> Originally Posted by *Slackaveli*
> 
> everytime i bring it up i get abused by the forum. So many IGNORANT posters on here it's pathetic. I thought on Ooverclock.net we'd have smarter posters....


Please just stop now. Discussion is not abuse. And you will never gain any respect by dissing other people either.

My conclusion is fair. 5775c is a beast that deserves respect. And a 1080 Ti with that respect too.


----------



## KingEngineRevUp

Anyone on the fence about mounting an X41 can rest assured: I'm getting 42C at 100% fan speed and 45C at 50% fan speed.

Hitting 2088 easily now. I can do half a run of Heaven at 2101 MHz, but once temperatures reach the 40s, I go down to 2088.


----------



## Slackaveli

Quote:


> Originally Posted by *mcg75*
> 
> Sorry, I didn't even see the OC results there.
> 
> But even after having read them, I can't agree with some of your conclusions above. Here's the two reasons why.
> 
> 6 of 10 games in the OC test end up within less than 2 fps difference. That's well within the margin of error for the test.
> 
> The overall average of all 10 games sees the 5775c at 86.5 fps and 7700k at 85.0 fps.
> 
> So while it in no way beats the 7700k handily or clearly, it does indeed beat it.
> 
> And quite frankly, that impresses the heck out of me. The 5775c is a beast worthy of respect.
> I agree. There can be some real differences in results sometimes that boggle my mind.
> Please just stop now. Discussion is not abuse. And you will never gain any respect by dissing other people either.
> 
> My conclusion is fair. 5775c is a beast that deserves respect. And a 1080 Ti with that respect too.


It's just tiresome. I was called a liar from the start. It's all good; I know what I have and was only trying to help other people know. Some are on Z97 mobos right now and have no idea what they could be running. Glad you researched it. It is a very impressive thing for a Z97 chip on DDR3 to tie a 7700K in gaming. They are indeed the two champs, and both are beasts.

I am glad that some see it, though. I'm proud to have discovered and to own a 5775C, and a 1080 Ti for that matter. It's a nasty combo.


----------



## kevindd992002

Quote:


> Originally Posted by *Jbravo33*
> 
> regardless of AIB or not i dont see this card getting more than 2100 only a couple people have. Im on water sli, one fe, one msi fe and 2075 is max whats a AIB on water gonna get 21? maybe 2113 or so? Is that worth the wait and extra money? Id say just get FE and soak it!


And are your FE's being power limited?


----------



## Jbravo33

To be honest, I don't even know. I'm using MSI Afterburner; voltage is unlocked and set to 100, power target also maxed. The highest temp after hours of benching and then gaming hit 46. I can only get a max of +170 core / +575 mem, but that doesn't get me the best score in benches. I'm just going by my situation and this thread; I've literally been through every page and wrote down the values everyone posted, and 2100 is scarce lol. We haven't seen AIB cards under water, so I'm just speculating, but if they end up much higher than that I'd be shocked and pissed for not waiting.







How can you tell if it's power limited? Here I go, more tests. Haha


----------



## havoc315

Got them on finally; they are some beautiful cards, and oh so powerful. I absolutely love these babies lol... can't wait to put my system back together and play with them after installing the water blocks. I'll probably do the shunt mod after a while, once I have a baseline to work from, and delidding my 6700K should help boost all the OCs.
When I first got them and installed them, after just playing around I got them to this, and I didn't push further because my pump was going out and I had to rip both pumps out and send them back


----------



## Slackaveli

Quote:


> Originally Posted by *Jbravo33*
> 
> To be honest I don't even know. I'm using msi afterburner voltage is unlocked and set to 100. Power target also max. Highest temp after hours of benching then gaming hit 46. I can only get max 170 core 575 mem but that doesn't get me best score in benches. I'm just going by my situation and from this thread literally been thru every page and wrote down values everyone had posted 2100 is scarce lol. We haven't seen aib cards under water but I'm just speculating but to be much higher than that I'd be shocked and pissed for not waiting
> 
> 
> 
> 
> 
> 
> 
> how can u tell if it's power limited? Here I go more tests. Haha


Turn on monitoring in the settings menu in Afterburner, then tick the box saying 'show in OSD'. Then you can see your power limit, clocks, etc. while gaming/benching.
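If you'd rather log it than watch the OSD, `nvidia-smi` exposes the same readings. A small sketch that parses its CSV output and flags power limiting; the sample line and the 98% threshold are made up for illustration:

```python
import csv
import io

def parse_power(csv_line, power_limit_watts):
    """Parse one line of
    `nvidia-smi --query-gpu=power.draw,clocks.gr --format=csv,noheader,nounits`
    and flag readings sitting at (or near) the power limit."""
    draw, clock = next(csv.reader(io.StringIO(csv_line)))
    draw = float(draw)
    return {
        "power_draw_w": draw,
        "core_clock_mhz": int(clock),
        # within 2% of the limit: almost certainly power-limit throttling
        "power_limited": draw >= 0.98 * power_limit_watts,
    }

# Hypothetical sample line from an FE with the slider at 120% (300 W):
sample = "298.40, 1911"
print(parse_power(sample, 300.0))
```

Polling that in a loop while benching gives you a log instead of having to eyeball the OSD mid-run.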


----------



## illypso

Hi,

I can confirm that too much thermal paste, to the point where it touches the resistors/capacitors around the GPU die, can impact stability.

Before touching anything (with the stock card), I was able to hold 2000, but not for long before the power limit pulled me below it. It never crashed, though.

I decided to remove the power limit.

At that point the card had Arctic Silver 5 on it, but while searching for a way to shunt mod with what I had, I just re-applied thermal paste without cleaning around the GPU die. There was so much that I could not see the small caps around the GPU.

My card was not stable in a long run or while gaming at 1900, and was an instant crash at 2000.

At first I thought the shunt mod could have caused that, or that my power supply could not hold the new power draw (it did shut itself down a couple of times in some games, but I changed it for a 1000 W unit and it never shut down again).

But I still could not touch the 2000 MHz clock...

So I decided to clean everything very well around the GPU and reapply thermal paste (MX-4, because I bought a big tube of it).

I used the small rice-grain X on the GPU instead of spreading it everywhere.

And now I am stable at 2037.

I read that Arctic Silver is slightly capacitive, so maybe it affected the value of the caps/resistors around the GPU and messed with stability.

I was a little scared I had broken something when soldering the shunt, but it was the paste, and now everything is fine.


----------



## evassion

Quote:


> Originally Posted by *Hulio225*
> 
> i guess it always depends somehow on circumstances. if you play 4k and your monitor refresh rate is 60 hz but you are mostlikly below 60 hz frametimes can make a difference in good gaming experience... in other circumstances maybe not that much...
> 
> i have a 4k monitor and in a lot of games i havent 60+ fps obviously... so it can make a difference
> in addition i had to install my waterblock anyway, so for me (i have an engineering degree in electronics) a shunt mod is like yeah what ever, ill do it becaue i have stripped the card down anyway...


Hello, just a little question, because you are using an X41 with the 1080 Ti FE.
I have a new X62 and a Water Extreme 3.0. Do you think I could attach one of them to my EVGA 1080 Ti FE? Did you use a copper shim or something to adapt it? Thanks


----------



## Mato87

Quote:


> Originally Posted by *SlimJ87D*
> 
> Good, now you can stop bugging people about Quantum Break.


hehe , will finally find out if it can run at 1080p above 60 fps with upscaling turned off








Quote:


> Originally Posted by *rcfc89*
> 
> Who confirmed its only the non-OC being released?


At my store, they don't even have the standard Aorus version of the card, and they won't have it until the end of next month, so I am guessing the Xtreme version comes after that.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *illypso*
> 
> Hi,
> 
> I can confirm that to much thermal paste to the point where it touch the resisance/capacitor around the GPU DIE
> 
> Can impact stability.
> 
> Before touching anything,(with the stock card) I was able to hold 2000 but not for long before power limit was pulling me bellow.but it never crash.
> 
> I decided to remove power limit.
> 
> in the end it had artic silver 5 on it, (but while searching for a way to shunt mod with what I had, I just re-apply thermal pas without cleaning arround the GPU die, There was so much that I could not see the small "cap" arround the GPU.
> 
> My card was not stable with a long run or gaming at 1900 and was instant crash at 2000.
> 
> At first I tough the shunt could have done that or my power supply could not old the new power (it did shut down itself a couple of time in some game. but I change it for a 1000W and it never shut down again)
> 
> But I still could not touch the 2000mhz clock....
> 
> So I decided to clean everything very well around the GPU and reapply thermal paste (MX4 because I bought a big tube of it)
> 
> I use the small rice X on GPU instead of spreading it everywhere.
> 
> And now I am stable at 2037
> 
> I know I read artic silver was a little capacitive so maybe it hurt the value of the cap/res around the GPU and mess with the stability.
> 
> I was a little scare I broke something when soldering the shunt but it was the paste and now everything is fine.


I don't know. If more people see similar results I'll believe it, but there are too many variables for me to believe it right now.


----------



## shalafi

Quote:


> Originally Posted by *SlimJ87D*
> 
> Anyone on the fence about mounting a x41 can rest assured, I'm getting 42C with 100% fan load and 45C with 50% fan speed.
> 
> Hitting 2088 easily now. Can do half a run on heaven at 2101Mhz but once temperatures go to 40s, I go down to 2088.


How did you mount it, G10 bracket?
Quote:


> Originally Posted by *evassion*
> 
> Hello, only a little question because you are using a x41 with the 1080 ti fe.
> I have a new x62 and a water extreme 3.0. Do you think I could attach one of them to my evga 1080 ti fe?? Did you use a copper shim or something to addapt it? Thanks


This... I have an unopened X62 (waiting for the AM4 bracket). Frankly, the EVGA Hybrid kit temps are underwhelming at best, more so when looking at the Guru3D review of the MSI 1080 Ti X 11G, which manages 69C with the fans at 60%, overclocked.


----------



## D13mass

Guys, who knows which waterblock would be better? And why do I see different lists of products for the Titan and the 1080 Ti if they should be compatible?
For the 1080 Ti only: http://www.aquatuning.co.uk/water-cooling/gpu-water-blocks/gpu-full-cover/?p=1&o=3&n=12&f=2255
For the Titan (Pascal) only: http://www.aquatuning.co.uk/water-cooling/gpu-water-blocks/gpu-full-cover/?p=1&o=3&n=12&f=2202


----------



## BrainSplatter

Just for comparison, FE on stock air cooler 100% fan, [email protected], 32GB 3200CL15 RAM, 378.78 driver, Win10 64bit:

Time Spy: 9624, Graphics 10584, CPU 6359
Firestrike Ultra 1.1: 7574, Graphics 7622, Physics 16415, Combined 4084
Firestrike Extreme 1.1: 14216, Graphics 15576, Physics 16397, Combined 7667

Regarding the somewhat off-topic discussion about the 5775C vs the 7700K: in my experience, with both overclocked, you get higher *max* framerates with the 7700K, and often higher average framerates too, but the *minimum* framerates (and the average framerates in RAM-intensive games like Arma 3 and all the Total War games) are better with the 5775C.

So it's essentially the 7700K for max framerates in shooters, and the 5775C for higher minimum framerates in simulation and strategy games.


----------



## Jbravo33

Quote:


> Originally Posted by *kevindd992002*
> 
> And are your FE's being power limited?


Quote:


> Originally Posted by *Slackaveli*
> 
> turn on monitoring in your settings menu in afterburner. then tick the box saying 'show in osd' then you can see your power limit, clocks, etc while gaming/benching.


Good looking out. I guess my voltage was at 50; I changed that to 100 and can now get to 2088 in Heaven, stable. If I push to 2100 I crash, but the power percentage maxes at 114% and 113% on each card. Does that mean there's more room to work with?

Edit: at 2100 I'm reaching the limit in 3DMark and crashing, so I guess 2088 is where it stands for these. One hits the power limit; the other hovers around 114%.
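For reference, those OSD percentages are relative to the FE's 250 W default board power (with the slider capped at 120% on the stock BIOS, as far as I know), so converting to watts is straightforward:

```python
FE_BOARD_POWER_W = 250.0  # GTX 1080 Ti FE default power target
SLIDER_MAX_PCT = 120      # stock FE BIOS power-limit cap (assumed)

def pct_to_watts(pct, board_power=FE_BOARD_POWER_W):
    """Convert an Afterburner/OSD power-percent reading to watts."""
    return board_power * pct / 100.0

print(pct_to_watts(114))             # 285.0 W at the observed 114% reading
print(pct_to_watts(SLIDER_MAX_PCT))  # 300.0 W hard cap at the 120% slider max
```

So a card pinned at 114% still has a few watts of headroom on paper, but in practice transient spikes hit the 120% cap well before the average does.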


----------



## illypso

Quote:


> Originally Posted by *SlimJ87D*
> 
> I don't know, if more people see similar results I'll believe it, but too many variables for me to believe.


Well, the only variable in the last disassembly was the paste.

I wasn't able to play stably at 1900 for long (which was really not good). The only thing I did was open the card, clean it well around those resistors/caps, and put on new paste, but not too much, and now I can game at 2000 with no problem in the same game, and even bench higher, which was impossible before.
Temps were the same; actually, I was not going over 49 when it crashed, and now I reach 53 because I can game long enough to build heat in the system.

I do find that weird, but I don't see another variable in that last dismantling of the card, and it was a gain of 150 MHz plus stability.


----------



## fisher6

So I was trying GTA 5 yesterday to figure out the issue with low GPU usage. It jumps around anywhere between 46% and 99%; vsync is off and FPS is between 46 and 120 at 3440x1440.

The only way for me to get consistent GPU usage is by using the frame-scaling option and setting it to 0.87 or something. Is the 1080 Ti being bottlenecked? My 4790K is running at 4.8.


----------



## BrainSplatter

Quote:


> Originally Posted by *fisher6*
> 
> Is the 1080 Ti be bottlenecked? My 4790k is running at 4.8.


You can also be bottlenecked by RAM /speed in GTA 5, especially when there is a lot of texture streaming going on.


----------



## fisher6

Quote:


> Originally Posted by *BrainSplatter*
> 
> You can also be bottlenecked by RAM /speed in GTA 5, especially when there is a lot of texture streaming going on.


I do have ****ty DDR3-1600 RAM but it was never an issue before with my 980 Ti and Ghost recon runs great.


----------



## BrainSplatter

Well, the 1080Ti will be about 60-80% faster I guess. It's not really surprising that the 1600Mhz RAM can't keep up all the time.


----------



## fisher6

Quote:


> Originally Posted by *BrainSplatter*
> 
> Well, the 1080Ti will be about 60-80% faster I guess. It's not really surprising that the 1600Mhz RAM can't keep up all the time.


If I can somehow confirm it's the RAM I will go get a faster stick.


----------



## Radox-0

Quote:


> Originally Posted by *KedarWolf*
> 
> You can enable SLI in most any game with Nvidia Inspector, even enable three and four way SLI with it. I just dunno how peeps are setting up high bandwidth bridges three and four way though.


I just use the normal Nvidia LED SLI bridge for 3 way. Only get a message of insufficient bandwidth etc if I use one of the flexible or hard type that are packaged with motherboards.


----------



## Jbravo33

Quote:


> Originally Posted by *DennyCorsa86*
> 
> Yes i use ek blocks with backplates
> my sli works at 2050 mhz on core and 1539 mhz on vram , 7700K @ 5300/5000 mhz


I'm so Close!!! I see you! lol


----------



## richiec77

Quote:


> Originally Posted by *illypso*
> 
> Hi,
> 
> I can confirm that to much thermal paste to the point where it touch the resisance/capacitor around the GPU DIE
> 
> Can impact stability.
> 
> Before touching anything,(with the stock card) I was able to hold 2000 but not for long before power limit was pulling me bellow.but it never crash.
> 
> I decided to remove power limit.
> 
> in the end it had artic silver 5 on it, (but while searching for a way to shunt mod with what I had, I just re-apply thermal pas without cleaning arround the GPU die, There was so much that I could not see the small "cap" arround the GPU.
> 
> My card was not stable with a long run or gaming at 1900 and was instant crash at 2000.
> 
> At first I tough the shunt could have done that or my power supply could not old the new power (it did shut down itself a couple of time in some game. but I change it for a 1000W and it never shut down again)
> 
> But I still could not touch the 2000mhz clock....
> 
> So I decided to clean everything very well around the GPU and reapply thermal paste (MX4 because I bought a big tube of it)
> 
> I use the small rice X on GPU instead of spreading it everywhere.
> 
> And now I am stable at 2037
> 
> I know I read artic silver was a little capacitive so maybe it hurt the value of the cap/res around the GPU and mess with the stability.
> 
> I was a little scare I broke something when soldering the shunt but it was the paste and now everything is fine.


Woooo... hold on. With an exposed die you did the grain-and-squish?

If so, that is very, very risky. Being an exposed die, the entire, and I mean entire, die needs coverage (not the sides, though). Squishing may not cover the corners, and that can lead to the die burning up, or parts of it like the ROPs or memory controllers.

CPUs with an IHS can use this method without ill effects, but you cannot rely on it for an exposed die.

My advice is to take it apart and reapply. Use a small plastic scraper, the edge of a card, something to act as a spreader, or a clean fingertip. You may be stable and OK now, but that is not a risk I would take.

The prior results could be from too much paste, or bad paste, acting more like a heat insulator than a conductor of heat.


----------



## DennyCorsa86

Quote:


> Originally Posted by *Jbravo33*
> 
> I'm so Close!!! I see you! lol


My GS is crazy: 59,900 points. I will retry it with my 6900K.


----------



## Silent Scone

Quote:


> Originally Posted by *Jbravo33*
> 
> Mass effect and ghost recon 3440x1440 getting around 90 fps ultra everything


Getting that at 3440 with one card here in Andromeda. Lows of 65


----------



## AMDATI

I am too high and prestigious on the pc master race pyramid to bother to look at what FPS my games are running at. It is beneath me.

Also, Andromeda is such a long game; I'm like 50 hours into it and there's still tons of stuff to do. Granted, I am doing basically every side mission and exploring every nook, and I haven't even touched the multiplayer yet.


----------



## Jbravo33

Quote:


> Originally Posted by *alucardis666*
> 
> Hybrid is totally worth it! 10/10 would recommend.


Quote:


> Originally Posted by *DennyCorsa86*
> 
> My gs is crazy , 59900 points , i will retry it with my 6900k


Yeah it is. I've gotten to 59,100; physics is good at 20k, but for some reason I can't seem to patch it all together with a good combined. I've gotten 9,900 combined on prior runs, but I can never get 9,200-plus with all three. That would def throw me in the club.









Quote:


> Originally Posted by *Silent Scone*
> 
> Getting that at 3440 with one card here in Andromeda. Lows of 65


I ditched Precision XOC; Afterburner is working much better. I'll try again and see what I get. Precision was holding me back 40-plus fps in Heaven.


----------



## EvilPieMoo

Trying to go for a 33K graphics score; I don't think it'll push that far, though


----------



## mcg75

Quote:


> Originally Posted by *shalafi*
> 
> This .. I have an unopened X62 (waiting for AM4 bracket). Frankly, the EVGA Hybrid kit temps are underwhelming at best, moreso when looking at the Guru3d review of the MSI 1080Ti X 11G, which manages 69C with the fans at 60% .. overclocked.


I'm lost as to why the kits you buy aren't working very well. Running Heaven maxed out for 30 minutes results in 45c for me.

I honestly thought something might be reporting temps wrong given your problems. Then I went back to Gamers Nexus 1080 Ti hybrid test and found his test results are the same as mine.

http://www.gamersnexus.net/guides/2841-gtx-1080-ti-hybrid-benchmarks-remove-thermal-limit/page-2
Quote:


> Looking at the GN Hybrid variant, we're hovering around 45C. This gives us a ~40C reduction in hard temperature vs. FE, from which stems additional thermal headroom for boosting.


----------



## kevindd992002

Quote:


> Originally Posted by *Jbravo33*
> 
> Good looking out guess my voltage was at 50. changed that to 100 I can now get to 2088 on heaven stable. if I push to 2100 I crash but power percentage max 114% and 113% on each card. that mean theres more room to work with?
> 
> edit: at 2100 I'm reaching limit on 3d mark and crashes so I guess 2088 is where it stands for these. one hits power limit the other hovers around 114%


So yeah, that proves that the 1080Ti FE's are still power-limited, sigh







That doesn't mean that a custom card that is not power-limited will outperform a 1080Ti that is power-limited but at least you get the peace of mind from the custom card that you won't be considering the power limit because there's always extra room, right? I guess what I'm trying to say is:

FE's OC potential main factors = power limit and silicone lottery
Custom boards OC potential main factor = silicone lottery

Makes sense?


----------



## TWiST2k

Quote:


> Originally Posted by *kevindd992002*
> 
> So yeah, that proves that the 1080Ti FE's are still power-limited, sigh
> 
> 
> 
> 
> 
> 
> 
> That doesn't mean that a custom card that is not power-limited will outperform a 1080Ti that is power-limited but at least you get the peace of mind from the custom card that you won't be considering the power limit because there's always extra room, right? I guess what I'm trying to say is:
> 
> FE's OC potential main factors = power limit and silicone lottery
> Custom boards OC potential main factor = silicone lottery
> 
> Makes sense?


I am pickin' up what you're layin' down. I still personally think the lottery is key; time will tell.

I got a good month to wait till May 1st and that is if things remain on point.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *shalafi*
> 
> How did you mount it, G10 bracket?
> This .. I have an unopened X62 (waiting for AM4 bracket). Frankly, the EVGA Hybrid kit temps are underwhelming at best, moreso when looking at the Guru3d review of the MSI 1080Ti X 11G, which manages 69C with the fans at 60% .. overclocked.


No, I just cut the Intel bracket, drilled four holes, and mounted it in the shroud.


----------



## PasK1234Xw

Quote:


> Originally Posted by *Jbravo33*
> 
> I'm so Close!!! I see you! lol


If you have G-Sync, turn it off before benchmarking if you haven't already; your score will increase.


----------



## BoredErica

Quote:


> Originally Posted by *TWiST2k*
> 
> I am pickin' up what you're layin' down. I still personally think the lottery is key; time will tell.
> 
> I got a good month to wait till May 1st and that is if things remain on point.


Friendly reminder that silicon =/= silicone


----------



## rt123

It's an important distinction.


----------



## shalafi

Quote:


> Originally Posted by *Darkwizzie*
> 
> Friendly reminder that silicon =/= silicone


I'll just leave this here ..


----------



## shalafi

Quote:


> Originally Posted by *mcg75*
> 
> I'm lost as to why the kits you buy aren't working very well. Running Heaven maxed out for 30 minutes results in 45c for me.
> 
> I honestly thought something might be reporting temps wrong given your problems. Then I went back to Gamers Nexus 1080 Ti hybrid test and found his test results are the same as mine.
> 
> http://www.gamersnexus.net/guides/2841-gtx-1080-ti-hybrid-benchmarks-remove-thermal-limit/page-2


I've been poking around the Titan Xp thread as it is TDP-wise the same card. Stumbled upon this:
Quote:


> Originally Posted by *bwana*
> 
> remounted my EVGA 1080 hybrid cooler on the TXP using TG Kryonaut. Temps dropped 10 degrees! Max on Heaven is now 51 deg, down from 63 when I was using the stock paste on the EVGA block. [..]


So two more things I can try:
1) I've already ordered the Kryonaut
2) mount the pump block with the pipes facing sideways (now they go towards the back of the card - GN mounted it the same in their video, part 2 I think)


----------



## Joshwaa

Can I join the "holds 2100.5 in Heaven" club?


----------



## Benny89

How many people have unlocked the voltage slider in Afterburner and can confirm that they could push their clock higher with +100 on voltage?

Thanks for all info in advance!


----------



## Addsome

Quote:


> Originally Posted by *PasK1234Xw*
> 
> If you have G-Sync, turn it off before benchmarking if you haven't already; your score will increase.


Why does turning off G-Sync increase frames and affect the benchmark score? I gained 20 frames from turning off G-Sync and running the Heaven benchmark @ 1080p, from 140fps to 160fps. Does this mean having G-Sync on in games is lowering my frames?


----------



## Joshwaa

Quote:


> Originally Posted by *Benny89*
> 
> How many people have unlocked the voltage slider in Afterburner and can confirm that they could push their clock higher with +100 on voltage?
> 
> Thanks for all info in advance!


I unlocked it but found it useless. I can hold 2100.5 without it. Use the curve graph thingy.


----------



## pez

Quote:


> Originally Posted by *Slackaveli*
> 
> Every time I bring it up I get abused by the forum. So many IGNORANT posters on here, it's pathetic. I thought on Overclock.net we'd have smarter posters....


Hope I'm not being lumped in here as ignorant.

I'm happy to start looking into the CPU, though. I'd love to see more benchmarks at >1080p.


----------



## pantsoftime

Quote:


> Originally Posted by *KedarWolf*
> 
> Myself and others have found the unlocked voltage slider does not increase max voltages, but on water it DOES keep voltages at the maximum 1.062v more consistently, without dropping lower.
> 
> With it, in an earlier post doing a Heaven run without the shunt mod, my voltages stayed at 1.062 for 99% of the run; I think the few seconds of dip were a Power Limit issue.
> 
> Without it, voltages and clocks dipped on and off much more.


The unlocked voltage slider absolutely does increase max voltage. It allows you to go to 1.093 volts. You have to use the curve OC to get it to go above 1.062. This is well documented in the 1080 thread.


----------



## mouacyk

Quote:


> Originally Posted by *Addsome*
> 
> Why does turning off G-Sync increase frames and affect the benchmark score? I gained 20 frames from turning off G-Sync and running the Heaven benchmark @ 1080p, from 140fps to 160fps. Does this mean having G-Sync on in games is lowering my frames?


G-Sync discards partial frames, so those don't get reported back to the benchmark program. In the ideal world, we all should bench counting only full frames, be it with G-Sync or Free-Sync, because that creates the best visuals... but winners gotta cheat, you know?


----------



## Addsome

Quote:


> Originally Posted by *mouacyk*
> 
> G-Sync discards partial frames, so those don't get reported back to the benchmark program. In the ideal world, we all should bench counting only full frames, be it with G-Sync or Free-Sync, because that creates the best visuals... but winners gotta cheat, you know?


So is G-Sync only affecting benchmark scores, or is it lowering my in-game fps as well?


----------



## pez

Quote:


> Originally Posted by *Addsome*
> 
> So is G-Sync only affecting benchmark scores, or is it lowering my in-game fps as well?


Try it and see. Different games will probably react differently.


----------



## mouacyk

Quote:


> Originally Posted by *Addsome*
> 
> So is G-Sync only affecting benchmark scores, or is it lowering my in-game fps as well?


Generally, both. In very old games and benchmarks that aren't G-Sync compatible, there won't be a difference because G-Sync just will not work. With compatible games, it definitely will lower your ingame fps, because it is discarding partial frames which result in tearing, which G-Sync is meant to eliminate.


----------



## Addsome

Quote:


> Originally Posted by *mouacyk*
> 
> Generally, both. In very old games and benchmarks that aren't G-Sync compatible, there won't be a difference because G-Sync just will not work. With compatible games, it definitely will lower your ingame fps, because it is discarding partial frames which result in tearing, which G-Sync is meant to eliminate.


Any idea on % drop? I read 3-5% online but according to my 20fps loss from 160 to 140 that was about a 15% loss in frames. Would that also translate to 15% loss in compatible games?
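For what it's worth, a 160-to-140 fps change works out to a 12.5% drop relative to the G-Sync-off figure (about 14.3% measured against the capped one), not 15%. A quick sketch of the arithmetic, using the numbers quoted in this exchange:

```python
# Relative FPS drop between G-Sync off and on, using the figures from the post above.
fps_off = 160.0  # Heaven @ 1080p, G-Sync off
fps_on = 140.0   # same run, G-Sync on

drop_vs_off = (fps_off - fps_on) / fps_off * 100  # loss relative to G-Sync off
drop_vs_on = (fps_off - fps_on) / fps_on * 100    # same delta measured the other way

print(f"{drop_vs_off:.1f}%")  # 12.5%
print(f"{drop_vs_on:.1f}%")   # 14.3%
```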


----------



## mouacyk

The percentage drops depend more on the game engine, your hardware, and driver quality (realistically partial frames are a result of astral entropy). G-Sync has a very specific purpose and has matured well, so it's a clear question of visual fidelity vs minimal input lag. There are games where one option is better than the other -- you have to judge for yourself. Typically, go with visual fidelity (G-Sync on) for SP and lower-paced games and minimal input lag (G-Sync off) for competitive online games.


----------



## Addsome

Quote:


> Originally Posted by *mouacyk*
> 
> The percentage drops depend more on the game engine, your hardware, and driver quality (realistically partial frames are a result of astral entropy). G-Sync has a very specific purpose and has matured well, so it's a clear question of visual fidelity vs minimal input lag. There are games where one option is better than the other -- you have to judge for yourself. Typically, go with visual fidelity (G-Sync on) for SP and lower-paced games and minimal input lag (G-Sync off) for competitive online games.


Isn't the whole attraction of G-Sync to have minimal input lag compared to V-Sync while still providing tear-free gaming?


----------



## mouacyk

Quote:


> Originally Posted by *Addsome*
> 
> Isn't the whole attraction of G-Sync to have *minimal* input lag compared to V-Sync while still providing tear-free gaming?


I believe your original question was answered, and I will not de-rail this owner's thread anymore with a G-Sync discussion.


----------



## Addsome

Quote:


> Originally Posted by *mouacyk*
> 
> I believe your original question was answered, and I will not de-rail this owner's thread anymore with a G-Sync discussion.


Alright thanks for the help


----------



## alucardis666

Quote:


> Originally Posted by *Joshwaa*
> 
> Can I join the "holds 2100.5 in Heaven" club?


Jealous


----------



## Joshwaa

Quote:


> Originally Posted by *alucardis666*
> 
> Jealous


Use the curve graph thingy and I am sure you could get there.

I will try Firestrike and Time Spy tonight.


----------



## Addsome

Quote:


> Originally Posted by *Joshwaa*
> 
> Use the curve graph thingy and I am sure you could get there.
> 
> I will try Firestrike and Time Spy tonight.


So you put the 1.062V curve point at 2100MHz and applied it?


----------



## Joshwaa

Quote:


> Originally Posted by *Addsome*
> 
> So you put the 1.062V curve point at 2100MHz and applied it?


Pretty much.
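What dragging the 1.062 V point to 2100 MHz does, in effect, is flatten every point from that voltage upward to the target clock, locking a single voltage/clock pair. A rough sketch of the idea; the sample voltage/clock points below are illustrative, not dumped from a real card:

```python
# Illustrative voltage (V) -> boost clock (MHz) points; real cards expose many more bins.
curve = {0.900: 1911, 0.950: 1974, 1.000: 2025, 1.062: 2050, 1.093: 2063}

def flatten_from(curve, volt, clock_mhz):
    """Pin every point at or above `volt` to `clock_mhz`, mimicking how
    dragging one point in Afterburner's curve editor flattens the rest."""
    return {v: (clock_mhz if v >= volt else mhz) for v, mhz in curve.items()}

locked = flatten_from(curve, 1.062, 2100)
print(locked[1.062], locked[1.093])  # 2100 2100
print(locked[1.000])                 # 2025 (points below the pinned voltage are untouched)
```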


----------



## Jbravo33

Quote:


> Originally Posted by *Silent Scone*
> 
> Getting that at 3440 with one card here in Andromeda. Lows of 65


Precision X was definitely not working correctly. I'm now getting 140fps; the lowest dip I saw was 122, ultra errthang. Thanks for pointing that out








that's with a 145 core / 425 mem OC
Quote:


> Originally Posted by *Joshwaa*
> 
> Use the curve graph thingy and I am sure you could get there.
> 
> I will try Firestrike and Time Spy tonight.


What's the curve graph thing?


----------



## Simkin

Very happy with my 1080 Ti FE.

Coming from a reference 980Ti, I can say it's well worth the upgrade even for 1080p.

In BF4, the 980Ti clocked to 1400 on the core could run the game at Ultra/4xMSAA at 120fps in almost any multiplayer situation, but turning the resolution scale up, even to 110%, made the fps drop more than I liked. The 1080Ti hammers through 150% resolution scale with no problem.

And on the same custom fan profile as my 980Ti in Afterburner, it runs 10C cooler. My 980Ti was running at 70-73C; the 1080Ti is at 60-63C, and that is with an overclock of 2025 on the core.

BF4 really deserves to have the resolution scale turned up on a 1080p monitor; it looks so much crisper and clearer, it really makes a difference.

Now I'm just waiting for Samsung to release its 31.5" 1440p 144Hz G-Sync monitor so I can use both of my 1080Ti's


----------



## Joshwaa

Press the three bars next to the gpu overclock slider in AB. You will see it.


----------



## alucardis666

Quote:


> Originally Posted by *Joshwaa*
> 
> Use the curve graph thingy and I am sure you could get there.
> 
> I will try Firestrike and Time Spy tonight.


I sure can't! lol. My card struggles to maintain 2000MHz, let alone 2100.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Joshwaa*
> 
> Pretty much.


Do you have screenshots of your graph or a tutorial on how to set it up?

I can boost to 2101 when I first start a benchmark, but thermal throttling brings me down to 2088. I run through about 5 scenes at 2101 with no issues and then boom, thermal throttle.

If I can add some voltage, then I think I can sustain 2101 MHz.


----------



## Slackaveli

Quote:


> Originally Posted by *fisher6*
> 
> If I can somehow confirm it's the RAM I will go get a faster stick.


Wait, stick as in a single stick? THAT is your problem. You need two to actually double the data rate, plugged into slots 1 and 3 on your mobo. And get at least 2400MHz with CL10 or CL11. You won't regret it. While you are at it, sell that 4790K and spend the extra $80 to get a 5775C. A couple of users have done it since yesterday's discussion already, and in a few PMs one of them is telling me how happy he is about it.


----------



## illypso

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *richiec77*
> 
> Woooo....hold on. With an exposed die you did the grain and squish?
> 
> If so, that is very very very risky. Being an exposed die, the entire and I mean entire die needs coverage. Not the sides though. Squishing it may not cover the corners and that can lead to the die burning up. Or parts of the die like ROPs or Memory controllers.
> 
> CPUs with IHS can use this method without ill effects, but you can not relie on this method for an exposed die.
> 
> Advice is to take it apart and reapply. Use a small plastic scraper...edge of a card. Something to act as a spreader. Or a clean finger tip. You can be stable and ok now...but that is not a risk I would take.
> 
> Prior results could be from too much paste/bad paste acting more like a heat insulator than a conductor of heat.






The prior unstable result was with the "card trick". I took my time, put one bead on one side, and did a swipe in one slow stroke so as not to trap air bubbles, covering the entire die.
Then I confirmed with a flashlight: the paste seemed uniform and mirror-like. I could not see the die through it, so there seems to have been enough, and when I removed it last time the quantity seemed perfect too (apart from the paste already on the sides).
I had stability problems, but as I said there was a lot of paste on the sides from the prior installation.

On the last try I used the X-squish method, which CAN be risky, BUT I applied, removed, and re-cleaned twice to see and confirm that the amount I used was enough to cover the entire die.

I did an X and the entire die was covered, there was not too much at the edges, and it did not touch the caps/resistors on the sides. So although it's dangerous on a direct die, I prefer this method now.

Maybe I trapped an air bubble with the card trick, but I'm pretty sure I did it right, and the reported temps were fine.
Can part of the die heat up more than the rest and still give a good temperature reading while a small part of it overheats?
If so, then yes, it could have been an air pocket on the die; the method used was different, so it's a variable.

So all the variables from 1900 not stable to 2037 stable would be:
Cleaner GPU side (res/cap).
MX-4 paste instead of Arctic 5.
X-bead method instead of card swipe.

Also, I have used thermal paste (Arctic 5) on those components before (caps/resistors, does anyone know which they are?), but on my CPU, to protect them from the CLU used in my delid, and I don't seem to have problems with my CPU overclock (same results as before, just cooler temps). But they may be less picky about it, and there are also a lot fewer of those components on my CPU (only on one side).


----------



## Cool Mike

Just grabbed an MSI Gaming X on Newegg!!! $749.99


----------



## Joshwaa

Quote:


> Originally Posted by *SlimJ87D*
> 
> Do you have screenshots of your graph or a tutorial on how to set it up?
> 
> I can boost to 2101 when I first start a benchmark, but thermal throttling brings me down to 2088. I run through about 5 scenes at 2101 with no issues and then boom, thermal throttle.
> 
> If I can add some voltage, then I think I can sustain 2101 MHz.


You have to keep the temps down, but you might be able to get 2100.5 at less than 1.062.


----------



## Slackaveli

Quote:


> Originally Posted by *BrainSplatter*
> 
> Just for comparison, FE on stock air cooler 100% fan, [email protected], 32GB 3200CL15 RAM, 378.78 driver, Win10 64bit:
> 
> Time Spy: 9624, Graphics 10584, CPU 6359
> Firestrike Ultra 1.1: 7574, Graphics 7622, Physics 16415, Combined 4084
> Firestrike Extreme 1.1: 14216, Graphics 15576, Physics 16397, Combined 7667
> 
> Regarding the somewhat off-topic discussion about 5775C vs 7700K, in my experience, with both overclocked, u get higher *max* framerates with the 7700K and also often higher average framerates but the *minimum* framerates and average framerates in RAM intensive games like Arma 3, all Total War games are better with the 5775C.
> 
> So it's essentially max frame rates for shooters the 7700K, higher minimum framerates for simulation and strategy games with the 5775C.


I would concur with that. I would point out that max framerates are less important than smooth, better lows, unless you aren't maxing your monitor's refresh rate (or are at least very close), or one side has a sizable lead in max frames, not the 2 or 3 fps in this case. I wouldn't ever recommend somebody go backwards in platforms, but if you are on a 4690K or a 4790K on Z97, it's a no-brainer to upgrade to a 5775C for peanuts. Except if you are like @Alucardis666 and have a CPU/mobo combo worth $2,000, are using it just for gaming, and only have one 1080Ti. In that case I actually WOULD recommend getting a 5775C/Z97/another 1080Ti, and an ultrawide 21:9 G-Sync monitor, for the same price.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Joshwaa*
> 
> You have to keep the temps down, but you might be able to get 2100.5 at less than 1.062.


My temps are 42C to 50C depending on my fan curve. That should be good enough right?


----------



## Joshwaa

Quote:


> Originally Posted by *SlimJ87D*
> 
> My temps are 42C to 50C depending on my fan curve. That should be good enough right?


Not sure. I did not mess with the graph till I was on water. Would not hurt to try and see what you get.


----------



## Slackaveli

Quote:


> Originally Posted by *pez*
> 
> Hope I'm not being lumped in here as ignorant.
> 
> I'm happy to start looking into the CPU, though. I'd love to see more benchmarks with >1080p, though.


Guys, I apologize for saying the forum was ignorant. I was just surprised that I had no fewer than 5 posters saying I was FoS, and none had yet confirmed that I indeed was not just lying and making up false claims about my CPU b/c I wanted to justify my purchase. It wasn't like that AT ALL; I was trying to share knowledge. I got frustrated. But later several informed posters came along, and the forum has now been educated on this topic. I am sorry I came across so salty; I am a highly functioning autistic and got a little carried away, so on that day less highly functioning lol. I apologize, I hope you guys understand.


----------



## DooRules

If you want higher than 1.062 on the curve you need to move the voltage slider on top of AB. Then you can max out to 1.093V


----------



## Slackaveli

Quote:


> Originally Posted by *Addsome*
> 
> Any idea on % drop? I read 3-5% online but according to my 20fps loss from 160 to 140 that was about a 15% loss in frames. Would that also translate to 15% loss in compatible games?


Bad frames are lost; you don't want them. Just keep a frame rate limiter (use RivaTuner, it is best for input lag) @ 142 on a 144Hz monitor for the best frame pacing and negligible input lag. And run Fast Sync in NCP. And leave G-Sync on, of course.


----------



## Addsome

Quote:


> Originally Posted by *DooRules*
> 
> If you want higher than 1.062 on the curve you need to move the voltage slider on top of AB. Then you can max out to 1.093V


How much would the life of the card be reduced by when using 1.093V?
Quote:


> Originally Posted by *Slackaveli*
> 
> Bad frames are lost; you don't want them. Just keep a frame rate limiter (use RivaTuner, it is best for input lag) @ 142 on a 144Hz monitor for the best frame pacing and negligible input lag. And run Fast Sync in NCP. And leave G-Sync on, of course.


Guides online say to turn V-Sync on instead of Fast Sync in the Nvidia control panel. What's the difference?


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> Except if you are like @Alucardis666 and have a CPU/mobo combo worth $2,000, are using it just for gaming, and only have one 1080Ti. In that case I actually WOULD recommend getting a 5775C/Z97/another 1080Ti, and an ultrawide 21:9 G-Sync monitor, for the same price.












I SAID DON'T TEMPT ME!!!


----------



## DooRules

I only use max voltage for overclocking, and the card is running very cool. If cooling is sufficient, I would not be too worried about it.


----------



## Addsome

Quote:


> Originally Posted by *DooRules*
> 
> I only use max voltage for overclocking, and the card is running very cool. If cooling is sufficient, I would not be too worried about it.


I'm running a 1080 Hybrid kit on my 1080Ti with the original shroud still on, so basically how GamersNexus did it. The VRM and VRAM modules are being cooled by the stock fan while the die is being cooled by the AIO. Would it still be wise to overvolt? My GPU doesn't go above 51C at full load.


----------



## DooRules

At 51c you are already subject to throttling. Adding voltage would likely just make it worse.


----------



## Slackaveli

Quote:


> Originally Posted by *Addsome*
> 
> How much would the life of the card be reduced by when using 1.093V?
> Guides online say to turn V-Sync on instead of Fast Sync in the Nvidia control panel. What's the difference?


Input lag is less with Fast Sync; both are better than V-Sync off, though. I read a bunch on this topic, but the gist of it is what I said. There are no advantages to V-Sync On vs. Fast, but there are advantages to Fast vs. On.


----------



## Slackaveli

Quote:


> Originally Posted by *Addsome*
> 
> How much would the life of the card be reduced by when using 1.093V?
> Guides online say to turn V-Sync on instead of Fast Sync in the Nvidia control panel. What's the difference?


Quote:


> Originally Posted by *alucardis666*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I SAID DON'T TEMPT ME!!!


sorry, man. You have the Epeen trophy, though. And I was just playin about gaming only. You may be editing or doing work or rendering or whatever that may use all those cores. And your screenshots are epic if your OSD is on with all those threads. And your physics score in Firestrike is Bossmode...


----------



## Addsome

Quote:


> Originally Posted by *Slackaveli*
> 
> Input lag is less with Fast Sync; both are better than V-Sync off, though. I read a bunch on this topic, but the gist of it is what I said. There are no advantages to V-Sync On vs. Fast, but there are advantages to Fast vs. On.


So I have a 100hz G-Sync monitor. Currently I have G-Sync and Vsync enabled in control panel and have my frames locked at 98 using rivatuner or 96 if the game allows in game limiting. Should I change my Vsync on to Fast Sync and leave everything else the same for a better experience?


----------



## Slackaveli

Quote:


> Originally Posted by *Addsome*
> 
> So I have a 100hz G-Sync monitor. Currently I have G-Sync and Vsync enabled in control panel and have my frames locked at 98 using rivatuner or 96 if the game allows in game limiting. Should I change my Vsync on to Fast Sync and leave everything else the same for a better experience?


Yes, although you may not notice it much b/c how you have it is already great. IMO Fast is better, though. I feel no input lag at all, and I notice input lag of any kind. It's a curse lol. And there is never tearing, which is my least favorite glitchy thing. Also, V-Sync is a small perf hit whereas Fast isn't.
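The caps being traded in this exchange (142 on a 144Hz panel, 98 on a 100Hz one) follow the same refresh-minus-a-couple rule of thumb for keeping the frame rate inside the G-Sync range. A trivial sketch; the 2 fps margin is the convention used by these posters, not an official NVIDIA figure:

```python
def gsync_frame_cap(refresh_hz: int, margin_fps: int = 2) -> int:
    """Frame limiter value that keeps fps just under the panel refresh,
    so G-Sync stays engaged instead of hitting the vsync ceiling."""
    return refresh_hz - margin_fps

print(gsync_frame_cap(144))  # 142, as suggested for a 144Hz monitor
print(gsync_frame_cap(100))  # 98, matching the cap used above
```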


----------



## Addsome

Quote:


> Originally Posted by *Slackaveli*
> 
> Yes, although you may not notice it much b/c how you have it is already great. IMO Fast is better, though. I feel no input lag at all, and I notice input lag of any kind. It's a curse lol. And there is never tearing, which is my least favorite glitchy thing. Also, V-Sync is a small perf hit whereas Fast isn't.


I never did understand why using V-Sync or Fast Sync helps smoothness when you already have G-Sync and frames limited to under your refresh rate.


----------



## Boost240

Gentlemen, I'm in!


----------



## mouacyk

Quote:


> Originally Posted by *Addsome*
> 
> I never did understand why using V-Sync or Fast Sync helps smoothness when you already have G-Sync and frames limited to under your refresh rate.


Don't need to bother with vsync or fast sync with GSYNC if you're already using frame limiting. They don't add anything but additional lag.


----------



## jelome1989

MSI GTX 1080 Ti Gaming X sold out on Newegg in just under an hour... And at $750!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *jelome1989*
> 
> MSI GTX 1080 Ti Gaming X sold out in Newegg in just under an hour... And at $750!


I used to be all about partner cards until I realized I was taking the freaking cooler off for AIO anyways. FE + Water is the way to go!


----------



## guttheslayer

Quote:


> Originally Posted by *KedarWolf*
> 
> I was soooooo lucky.
> 
> Launch day local store had a few Gigabyte in stock. I called, they actually held it until end of business day for me, paid a coworker to pick it up as I was on evening shift, got it!!
> 
> Got EK block and backplate about ten days later.
> 
> Below is a benching run. 10880 in TimeSpy.
> 
> All rad fans, system fans and pumps at 100% for bench run, +175 core, +659 memory, my 32GB CL14 Ripjaws 5 kit in at 13-14-13-29 1T 3228 MHZ, 5960x CPU at 4.742GHZ cache at 4.440 GHZ.
> 
> Time Spy 10880, Graphics Test 10853.
> 
> http://www.3dmark.com/3dm/18838458?


How did you even unlock the core voltage?


----------



## guttheslayer

Quote:


> Originally Posted by *EvilPieMoo*
> 
> I hope you're able to get more out of it being that it's a premium aftermarket card. This was my run with a Founders edition


It's all about the silicon lottery.


----------



## kevindd992002

Quote:


> Originally Posted by *Addsome*
> 
> I never did understand why using V-Sync or Fast Sync helps smoothness when you already have G-Sync and frames limited to under your refresh rate.


Quote:


> Originally Posted by *mouacyk*
> 
> Don't need to bother with vsync or fast sync with GSYNC if you're already using frame limiting. They don't add anything but additional lag.


Exactly! Why would one turn on V-Sync or Fast Sync if a frame limiter is turned on? V-Sync/Fast Sync won't get to activate that way, and what you get overall is just additional input lag.
Quote:


> Originally Posted by *SlimJ87D*
> 
> I used to be all about partner cards until I realized I was taking the freaking cooler off for AIO anyways. FE + Water is the way to go!


I feel the same but like my previous posts yesterday say, FE's are power-limited and the possibility of reaching higher OC's with them tends to be lower than what you get with partner cards.


----------



## JedixJarf

Quote:


> Originally Posted by *kevindd992002*
> 
> Exactly! Why would one turn on V-Sync or Fast Sync if a frame limiter is turned on? V-Sync/Fast Sync won't get to activate that way, and what you get overall is just additional input lag.
> I feel the same but like my previous posts yesterday say, FE's are power-limited and the possibility of reaching higher OC's with them tends to be lower than what you get with partner cards.


AIB's and FE have the same power limit...


----------



## Slackaveli

Quote:


> Originally Posted by *mouacyk*
> 
> Don't need to bother with vsync or fast sync with GSYNC if you're already using frame limiting. They don't add anything but additional lag.


That's not true, but I don't feel like arguing.


----------



## kevindd992002

Quote:


> Originally Posted by *JedixJarf*
> 
> AIB's and FE have the same power limit...


What, really? So what's the use of the 2x8-pin connectors on AIB cards compared to the 6+8-pin connectors on FE's?

And why did they decide to make the power limit the same for the 1080Ti when it's not the same for other Pascal cards?


----------



## Slackaveli

Quote:


> Originally Posted by *kevindd992002*
> 
> What, really? So what's the use of the 2x8-pin connectors on AIB cards compared to the 6+8-pin connectors on FE's?
> 
> And why did they decide to make the power limit the same for the 1080Ti when it's not the same for other Pascal cards?


It's for all the LEDs, etc.


----------



## guttheslayer

Quote:


> Originally Posted by *EvilPieMoo*
> 
> I hope you're able to get more out of it being that it's a premium aftermarket card. This was my run with a Founders edition


This is my best so far already.


----------



## kevindd992002

Quote:


> Originally Posted by *Slackaveli*
> 
> it's for all the LEDs, etc.


So the OC potential of both just depends solely on the silicon lottery?


----------



## alucardis666

Quote:


> Originally Posted by *kevindd992002*
> 
> So the OC potential of both just depends solely on the silicon lottery?


Pretty much.


----------



## kevindd992002

Quote:


> Originally Posted by *alucardis666*
> 
> Pretty much.


Has this been explained in most reviews also? And is it only true for the 1080Ti? Because in the 1080 official thread, it is well known that FE's usually hit their power limits, and that's where partner cards come in, as they have higher limits.


----------



## Addsome

Guys, something weird is happening when I use the Afterburner curve to overclock. The PerfCap Reason is highlighted with 4 different reasons even though I'm not being performance capped? What does this mean?


----------



## Mato87

I am officially part of the club now


----------



## alucardis666

Quote:


> Originally Posted by *Mato87*
> 
> I am officially part of the club now


Wooooooooo! Congratz bud


----------



## Mato87

Quote:


> Originally Posted by *alucardis666*
> 
> Wooooooooo! Congratz bud


Thanks dude, I am going to play until morning now







Quantum Break here I come


----------



## Menta

Quote:


> Originally Posted by *Mato87*
> 
> I am officially part of the club now


Lucky you....









Which country did you get the card from?


----------



## Norlig

What are the different temp limits and what happens when you reach them?


----------



## Slackaveli

Quote:


> Originally Posted by *Mato87*
> 
> I am officially part of the club now


THAT is the proper way to ride with a gpu.


----------



## Slackaveli

Quote:


> Originally Posted by *Mato87*
> 
> Thanks dude, I am going to play until morning now
> 
> 
> 
> 
> 
> 
> 
> Quantum Break here I come


hell yeah, man! give us the feedback


----------



## Jbravo33

Quote:


> Originally Posted by *Mato87*
> 
> I am officially part of the club now


haha nice! Can I ask you for a favour? Do a drive through the bayou, you know, the swamps, during daylight too?


----------



## Mato87

Quote:


> Originally Posted by *Menta*
> 
> Lucky you....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> where did you get the card from country ?


Czech Republic, but it seems to have been sent straight from China or wherever Asus has their factories; it was packed in the original factory box too, which is unusual.


----------



## Menta

Quote:


> Originally Posted by *Mato87*
> 
> Czech Republic, but it seems to have been sent straight from China or wherever Asus has their factories; it was packed in the original factory box too, which is unusual.












I guess I have to wait a little longer, but I'm getting more impatient by the minute, yes, minute


----------



## Mato87

Quote:


> Originally Posted by *Slackaveli*
> 
> THAT is the proper way to ride with a gpu.


lol
Quote:


> Originally Posted by *Slackaveli*
> 
> hell yeah, man! give us the feedback


Well, right now I have only tested Wildlands, the game that came with the card, and so far the performance is brutal: it's over 80 fps all the time with everything on ultra, and when I tone it down, especially the shadow draw distance, I can get a stable 100 fps.


----------



## Mato87

Quote:


> Originally Posted by *Jbravo33*
> 
> haha nice! Can I ask you for a favour? Do a drive through the bayou, you know, the swamps, during daylight too?


You cheeky **** you








Quote:


> Originally Posted by *Menta*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I guess i have to wait a little longer but getting impatient by the minute, yes minute


I couldn't wait for the Aorus Xtreme, so I went for the first non-reference one that was available at the local store. I was lucky though; it seems to have sold out in a matter of minutes...


----------



## ThingyNess

Quote:


> Originally Posted by *Addsome*
> 
> Guys, something weird is happening when I use the Afterburner curve to overclock. The PerfCap Reason is highlighted with 4 different reasons even though I'm not being performance capped? What does this mean?


How do you know you're not being performance capped? That looks like the normal state of things to me with a maxed out OC running into the TDP limits.

The Vop reason means that it can't boost any higher because you've run out of voltage bins to use. You've unlocked the couple of extra bins with the increased voltage slider in Afterburner, and the BIOS is locked and won't allow any more, so it won't increase the voltage and frequency any more because it's hit the 'hard' Nvidia BIOS limit. If you dial the voltage slider back in afterburner you won't see VOp as a perfcap reason anymore as the highest voltage bin will never be used.

PWR means you're power limited, and it's actually throttling your clocks back to ensure you don't go over your TDP limit.

Interesting that it's happening at as low as 108% TDP, because the slider in Afterburner goes up to 20%. What a lot of people don't know is that in addition to the overall power limit there are also sub-limits within the BIOS for how much power it can draw from the PCIe slot, and each of the PCIe power connectors on the card. Depending on how the BIOS was set up (every card is different, although the FEs should all be the same) the card could be running into one of the individual limits before it hits the +20% overall limit you set in Afterburner.

The only way around that and to stop it from power throttling is to hard-mod the card and shunt the current-sense resistors. GPU-Z refreshes too slowly to reflect the transient TDP throttling in its core clock readout a lot of the time. TDP throttling happens much quicker than the steps between voltage/temperature bins.
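For anyone curious about the arithmetic behind the shunt mod mentioned above, here's a rough sketch in Python. The 5 milliohm value and the "stack an equal resistor on top" approach are assumptions for illustration, not measured values from an actual 1080 Ti PCB:

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

# Assumed for illustration: a 5 milliohm current-sense shunt with an
# equal 5 milliohm resistor soldered on top of it.
r_stock = 0.005
r_modded = parallel(r_stock, 0.005)          # 2.5 milliohms

# The controller senses V = I * R across the shunt, so halving R
# halves the power the card *thinks* it is drawing.
reported_fraction = r_modded / r_stock
print(round(reported_fraction, 3))           # 0.5

# A card reading "250 W" after this mod would really be pulling about:
print(round(250 / reported_fraction, 1))     # 500.0
```

Which is also why shunt-modded cards need serious cooling: the BIOS power limit still "works", it's just fed a number half the size of reality.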


----------



## Menta

Quote:


> Originally Posted by *Mato87*
> 
> You cheeky **** you
> 
> 
> 
> 
> 
> 
> 
> 
> I couldn't wait for the aorus xtreme so I went for the first non reference one that was available at the local store. I was lucky though, it seems to have been sold out in a matter of minutes...


I would do the same thing, but Asus and Gigabyte are both looking good, and performance-wise they're the same. The Aorus has 3 DP plus all those HDMIs; that's the big difference, though not really important for most users.


----------



## ThingyNess

Quote:


> Originally Posted by *JedixJarf*
> 
> AIB's and FE have the same power limit...


Says who? EVGA is already on record that their FTW3 card has a 280W TDP target versus the 250W of the lesser cards and the FE models.

http://www.evga.com/products/product.aspx?pn=11G-P4-6696-KR

The reviews of the AIB cards so far have been infuriatingly silent on what the TDP limits are, but you can read between the lines from their power consumption results.

Most of the reviews so far show the AIB cards peaking 30-50W higher than the FE in power consumption, and that makes sense given I'd expect a lot of the AIB cards to ship with a 280W or even 300W power target out of the box. That's the whole point of beefing up their VRMs.


----------



## lilchronic

Did they release this a day too early?








http://www.geforce.com/gtx-g-assist


----------



## mouacyk

Quote:


> Originally Posted by *lilchronic*
> 
> Did they release this a day to early ?
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.geforce.com/gtx-g-assist


Somewhere in the world, tomorrow is here.


----------



## Mato87

Quote:


> Originally Posted by *Menta*
> 
> I would do the same thing, but Asus or Gigabyte are looking good, performance wise same thing, Auros has 3 DP plus all those hdmi´s thats the big difference. not really important for most users


yeah I don't care for vr, so I just went for asus







it's a fantastic card so far.


----------



## alucardis666

Quote:


> Originally Posted by *lilchronic*
> 
> Did they release this a day to early ?
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.geforce.com/gtx-g-assist


I'll take 12!


----------



## Joshwaa

Quote:


> Originally Posted by *lilchronic*
> 
> Did they release this a day to early ?
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.geforce.com/gtx-g-assist


GeForce Assisted Gaming (GAG). Yeah, April Fools!


----------



## KedarWolf

Quote:


> Originally Posted by *guttheslayer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I was soooooo lucky.
> 
> Launch day local store had a few Gigabyte in stock. I called, they actually held it until end of business day for me, paid a coworker to pick it up as I was on evening shift, got it!!
> 
> Got EK block and backplate about ten days later.
> 
> Below is a benching run. 10880 in TimeSpy.
> 
> All rad fans, system fans and pumps at 100% for the bench run, +175 core, +659 memory, my 32GB CL14 Ripjaws 5 kit at 13-14-13-29 1T 3228 MHz, 5960X CPU at 4.742 GHz, cache at 4.440 GHz.
> 
> Time Spy 10880, Graphics Test 10853.
> 
> http://www.3dmark.com/3dm/18838458?
> 
> 
> 
> 
> 
> How did you even unlock the core voltage?
Click to expand...

http://www.overclock.net/t/1625653/how-to-get-voltage-slider-in-afterburner-working-on-a-1080-ti/0_20

See last few posts as well in thread.


----------



## Addsome

Quote:


> Originally Posted by *mouacyk*
> 
> Don't need to bother with vsync or fast sync with GSYNC if you're already using frame limiting. They don't add anything but additional lag.


According to G-Sync 101 on the blur buster forum the optimum setting is:

"Optimal G-Sync Settings:

G-Sync + V-Sync On (Nvidia Control Panel) + fps limit:
100 Hz: 98 fps limit (In-game)
144 Hz: 142 fps limit (In-game)
165 Hz: 162 fps limit (In-game)"

http://forums.blurbusters.com/viewtopic.php?f=5&t=3073

So many different answers on this topic, not sure which one to go with.
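For what it's worth, the Blur Busters caps quoted above all work out to "a couple of fps below refresh". A tiny helper that encodes them, purely as an illustration; the fallback rule for other refresh rates is my own approximation, not an official formula:

```python
# Caps quoted from the Blur Busters post above; anything not in the
# table falls back to roughly 2% below refresh (rule-of-thumb only).
QUOTED_CAPS = {100: 98, 144: 142, 165: 162}

def gsync_fps_cap(refresh_hz):
    """Suggested in-game fps limit for a G-Sync display of refresh_hz."""
    if refresh_hz in QUOTED_CAPS:
        return QUOTED_CAPS[refresh_hz]
    return refresh_hz - max(2, round(refresh_hz * 0.02))

print(gsync_fps_cap(144))   # 142
print(gsync_fps_cap(120))   # 118
```

The point of the cap, per that guide, is just to keep the framerate from ever reaching the G-Sync ceiling, where V-Sync behavior (and its lag) would kick in.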


----------



## RedM00N

Quote:


> Originally Posted by *lilchronic*
> 
> Did they release this a day to early ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.geforce.com/gtx-g-assist


Not gonna lie, I like the concept lol. I can't wait until every game's AI works in a way similar to this, that is, it learns from how you play and adapts to your faults/weaknesses.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *kevindd992002*
> 
> Exactly! Why would one turn on vsync or fast sync if a frame limiter is turned on? Vsync/fast sync won't get to activate that way, and all you get overall is additional input lag.
> I feel the same, but as my posts yesterday say, FEs are power-limited and the chance of reaching higher OCs with them tends to be lower than with partner cards.


Can the partner cards even utilize that extra power with everything locked down and all? Do they come with editable BIOSes?

I don't know; I average around 115% power draw and sometimes hit 120% for a second. I don't think extra power means much with a voltage limit. Guys with shunt mods don't even get past 2100MHz on average.


----------



## netjack

Quote:


> Originally Posted by *smushroomed*
> 
> Anyone place an Arctic Accelero hybrid cooler on their 1080ti FE?
> 
> I ordered the 140mm off amazon


Did you install it? I just ordered the same kit.


----------



## mtbiker033

Quote:


> Originally Posted by *Mato87*
> 
> Iam officialy part of the club now


congratulations bro! I really wanted to wait for a STRIX, but the FE came in stock on Amazon Prime and I pulled the trigger!!


----------



## EvilPieMoo

Curious to see some OC numbers from you guys with AIB cards.


----------



## alucardis666

Quote:


> Originally Posted by *EvilPieMoo*
> 
> Curious to see some OC numbers from you guys with AIB cards.


I can't see them doing better than any of the FE cards under water.


----------



## D1RTYD1Z619

Anyone still running a 2600K with a GTX 1080 Ti at 4K ultra settings in Battlefield 1? I can't tell if I'm bottlenecking my GTX 1080. I want to buy a GTX 1080 Ti, but if my 2600K is bottlenecking my 1080, getting a Ti will just be worse. In that case I'll just upgrade the proc and motherboard instead of the video card and get the EVGA Ti when it's released.


----------



## alucardis666

Quote:


> Originally Posted by *D1RTYD1Z619*
> 
> Anyone still running a 2600k and using a GTX 1080 TI @4k Ultra settings in Battlefield 1? I cant tell if I'm bottlenecking my GTX 1080. I want to buy a GTX 1080 TI but if my 2600k is bottlenecking my 1080 getting a TI will just be worse. In that case ill just upgrade the proc and motherboard instead of the vid card and get the Evga TI when it is released.


I'd upgrade that cpu tbh.


----------



## Joshwaa

I can now join the held-2100.5-in-Time-Spy club as well.

http://www.3dmark.com/spy/1479891

Need to OC the 7700K and get a better score though. Funny thing: while holding 2100.5, if I go over +260 on the mem it will fail the benchmark. Might need to do the shunt mod.


----------



## mtbiker033

Quote:


> Originally Posted by *alucardis666*
> 
> I'd upgrade that cpu tbh.


I would have to agree. I'm just sitting here with a 4820K @ 4.6GHz wondering the same thing (somehow I feel a little better now).


----------



## KedarWolf

I checked this a.m. and you are right. Got my max core 2080MHz stable. The only issue I have is that the core and voltage jump between 1.052v and 1.093v, up and down, while running Heaven, and I'm not hitting the power limit, temps around 50C; not sure why.

At 1.062v with max +152 (I think), no custom power curve, it stays at 1.062v 99% of the time in Heaven.









I might have the voltage numbers a bit off, I'm not at home to check.


----------



## alucardis666

Quote:


> Originally Posted by *KedarWolf*
> 
> I checked this a.m. and you are right. Got my max core 2080MHz stable. The only issue I have is that the core and voltage jump between 1.052v and 1.093v, up and down, while running Heaven, and I'm not hitting the power limit, temps around 50C; not sure why.
> 
> At 1.062v with max +152 I think it is no custom power curve it stays at 1.062v 99% of the time in Heaven.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I might have the voltage numbers a bit off, I'm not at home to check.


Wait, what did you do to get 180 core?


----------



## NYU87

MSI GTX 1080 Ti Gaming X arrives next Thursday... can't wait.


----------



## mouacyk

Quote:


> Originally Posted by *D1RTYD1Z619*
> 
> Anyone still running a 2600k and using a GTX 1080 TI @*4k Ultra settings in Battlefield 1*? I cant tell if I'm bottlenecking my GTX 1080. I want to buy a GTX 1080 TI but if my 2600k is bottlenecking my 1080 getting a TI will just be worse. In that case ill just upgrade the proc and motherboard instead of the vid card and get the Evga TI when it is released.


Quote:


> Originally Posted by *alucardis666*
> 
> I'd upgrade that cpu tbh.


Quote:


> Originally Posted by *mtbiker033*
> 
> I would have to agree. I'm just sitting here with a 4820k @ 4.6ghz wondering the same thing (somehow I feel a little better now)


The question was about BF1 @4K, which would absolutely not be bottlenecked by a decent cooler and RAM upgrade to the existing 2600K overclocked. Instead of spending $700+ on new platform, I would recommend getting 16GB of 2400MHz DDR3 and decent air cooler like the D15 and oc that 2600K to 4.6-4.8GHz. If the goal was <=1440p and high refresh, I'd understand the suggestions to upgrade the CPU better.


----------



## KedarWolf

Quote:


> Originally Posted by *alucardis666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I checked this a.m. and you are right. Got my max core 2080MHz stable. The only issue I have is that the core and voltage jump between 1.052v and 1.093v, up and down, while running Heaven, and I'm not hitting the power limit, temps around 50C; not sure why.
> 
> At 1.062v with max +152 I think it is no custom power curve it stays at 1.062v 99% of the time in Heaven.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I might have the voltage numbers a bit off, I'm not at home to check.
> 
> 
> 
> Wait, what did you do to get 180 core?
Click to expand...

Actually meant to say 2080MHz core; not sure what that is on the slider. Custom voltage curve with 2080 topping out at 1.093v.

Edit: I'm on my way home from evening shift, which is why I can't check the + whatever.

Second edit: With fans and pump at 100% I can bench at 2083 core, +667 memory. I have a decent card.









My highest TimeSpy bench though was at +172, no custom curve, and +658 memory I think. Without a custom curve the clocks and voltages don't seem to drop or fluctuate, which is why I think it was my best result.


----------



## D1RTYD1Z619

Quote:


> Originally Posted by *mouacyk*
> 
> The question was about BF1 @4K, which would absolutely not be bottlenecked by a decent cooler and RAM upgrade to the existing 2600K overclocked. Instead of spending $700+ on new platform, I would recommend getting 16GB of 2400MHz DDR3 and decent air cooler like the D15 and oc that 2600K to 4.6-4.8GHz. If the goal was <=1440p and high refresh, I'd understand the suggestions to upgrade the CPU better.


I should have mentioned that my CPU is overclocked (most stable overclock) to 4.2 and my cooler is very good; it never gets hotter than 60 degrees Celsius, and I've upgraded to 16 gigs of memory. I'm just curious whether someone with a similar setup can tell me what they get in terms of FPS at 4K in Battlefield 1 with everything maxed out and AA off.


----------



## alucardis666

Quote:


> Originally Posted by *KedarWolf*
> 
> Actually meant to say 2080mhz core, not sure what that is on the slider, custom voltage curve with 2080 topping off at 1.093v.
> 
> Edit: On my way home from evening shift is why I can't check the + whatever.
> 
> Second edit: With fans and pump at 100% I can bench at 2083 core, + 667 memory, I have a decent card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Highest TimeSpy bench though was at +172 no custom curve and +658 memory I think it was. Without a custom curve the clocks and voltages don't seem to drop or fluctuate, why I think it was my best result.


Damn impressive!


----------



## rcfc89

Quote:


> Originally Posted by *Slackaveli*
> 
> I would concur with that. I would point out that max framerates are less important than smoother, better lows, unless you aren't maxing monitor refresh rate (or are at least very close) or one has a sizable lead in max frames, not the 2 or 3 fps in this case. I wouldn't ever recommend somebody go backwards in platforms, but if you are on a 4690k or a 4790k on Z97, it's a no-brainer to upgrade to a 5775C for peanuts. Except if you are like @Alucardis666 and have a cpu/mobo combo worth $2,000, are using it just for gaming, and only have one 1080 Ti. In that case I actually WOULD recommend getting a 5775C/Z97/another 1080 Ti, and an ultrawide 21:9 G-Sync for the same price


Why would someone with a new build choose a CPU that's older architecture and two years old? I'd much rather start with the newer, higher-clocking 7700K. It will hold its value much better for the foreseeable future.


----------



## hotrod717

Quote:


> Originally Posted by *KedarWolf*
> 
> Actually meant to say 2080mhz core, not sure what that is on the slider, custom voltage curve with 2080 topping off at 1.093v.
> 
> Edit: On my way home from evening shift is why I can't check the + whatever.
> 
> Second edit: With fans and pump at 100% I can bench at 2083 core, + 667 memory, I have a decent card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Highest TimeSpy bench though was at +172 no custom curve and +658 memory I think it was. Without a custom curve the clocks and voltages don't seem to drop or fluctuate, why I think it was my best result.


Making me curious to see what my card can do on water with shunt mod.


----------



## c0nsistent

Well I'm about to mount an Antec 620 onto my 1080 Ti... anything I should know about beforehand?


----------



## KaRLiToS

Hi guys,

Today I tried GR: Wildlands and I was astonished that my 4930K at 4.5GHz is at 45% utilization while the first core is at 65%. I still have a single GTX 1080 Ti and I'm about to receive the other one in a week or so. I already received a block but won't install it right now, as I don't want to flush my loop several times. I'll wait for everything to arrive.

I hope SLI will give great performance in my games and I won't be CPU bound.


----------



## keikei

Quote:


> Originally Posted by *KaRLiToS*
> 
> Hi guys,
> 
> Today I tried GR: Wildlands and I was astonished that my 4930K at 4.5GHz is at 45% utilization while the first core is at 65%. I still have a single GTX 1080 Ti and I'm about to receive the other one in a week or so. I already received a block but won't install it right now, as I don't want to flush my loop several times. I'll wait for everything to arrive.
> 
> I hope SLI will give great performance in my games and I won't be CPU bound.


----------



## kevindd992002

Quote:


> Originally Posted by *Slackaveli*
> 
> that's not true but i don't feel like arguing.


Please expound?
Quote:


> Originally Posted by *ThingyNess*
> 
> How do you know you're not being performance capped? That looks like the normal state of things to me with a maxed out OC running into the TDP limits.
> 
> The Vop reason means that it can't boost any higher because you've run out of voltage bins to use. You've unlocked the couple of extra bins with the increased voltage slider in Afterburner, and the BIOS is locked and won't allow any more, so it won't increase the voltage and frequency any more because it's hit the 'hard' Nvidia BIOS limit. If you dial the voltage slider back in afterburner you won't see VOp as a perfcap reason anymore as the highest voltage bin will never be used.
> 
> PWR means you're power limited, and it's actually throttling your clocks back to ensure you don't go over your TDP limit.
> 
> Interesting that it's happening at as low as 108% TDP, because the slider in Afterburner goes up to 20%. What a lot of people don't know is that in addition to the overall power limit there are also sub-limits within the BIOS for how much power it can draw from the PCIe slot, and each of the PCIe power connectors on the card. Depending on how the BIOS was set up (every card is different, although the FEs should all be the same) the card could be running into one of the individual limits before it hits the +20% overall limit you set in Afterburner.
> 
> The only way around that and to stop it from power throttling is to hard-modify the card and shunt the current sense resistors. GPU-Z refreshes too slowly to reflect he transient TDP throttling in its core clock readout a lot of the time. TDP throttling happens much quicker than the steps between voltage / temperature bins.


I think the 108% TDP limit, even though it's set at 120%, also exists on 1080 FE's.
Quote:


> Originally Posted by *ThingyNess*
> 
> Says who? EVGA is already on the record that their FTW3 card has a 280W TDP target versus 250W of the lesser cards and the FE models.
> 
> http://www.evga.com/products/product.aspx?pn=11G-P4-6696-KR
> 
> The reviews so far about the AIB cards have been infuriatingly silent on what the TDP limits are, but you can read between the lines from their power consumption results.
> 
> Most of the reviews so far show the AIB cards to be 30-50w higher than the FE as far as peak power consumption goes, and that makes sense given I'd expect a lot of the AIB cards to ship with a 280w or even 300w power target out of the box. That's the whole point of why they're beefing up their VRMs.


Thanks for the confirmation. When I read in this thread that the power limits of the FE's and AIB cards are the same, I was like, "impossible, and it doesn't make sense".


----------



## Slackaveli

Quote:


> Originally Posted by *rcfc89*
> 
> Why would someone with a new build choose a CPU that is older architecture and 2 years old? I'd much rather start with the newer higher clocking 7700k. It will hold its value much better in the foreseeable future.


It's not 2 years old, it's a year and a half old. But the obvious answer is that it has the L4 Crystalwell cache and it's the only one that does. But, as I already said, do it IF YOU ALREADY have a Z97 board and good DDR3. I was talking specifically to posters who are on the same arch; re-read the very post of mine that you quoted. I said clearly: "I don't recommend going backwards in arch, but if you have a 4690k/4790k it's a no-brainer."


----------



## geriatricpollywog

How are these cards doing under water? 2100mhz+ every time?


----------



## bfedorov11

Quote:


> Originally Posted by *0451*
> 
> How are these cards doing under water? 2100mhz+ every time?


Stability is hit or miss for most at 2100. My card will bench it if it stays cool. I would say if you get over 2050 you have a decent card; 2100 would be a golden sample.


----------



## c0nsistent

At 2075MHz on my Gigabyte FE card with no artifacts on stock air. I'm planning on putting an Antec 620 on it... 2100 should be no problem.


----------



## KedarWolf

Been experimenting with a power curve under water. At +180, 1.093v, under water, no shunt mod, voltages fluctuate from 1.050v up and down, often reaching 1.093v.








At +160, 1.075v, voltages fluctuate between 1.062v and 1.075v, sometimes dropping to 1.050v.










At +152, 1.062v, core and voltages stay maxed out 99% of the time, all of this while running the Heaven bench.









Temps average around 42C with an EK block and backplate.

So I'm thinking a stable 1.062v is better than higher clocks with the voltage jumping around. As you know, on a voltage curve, if voltages drop, clocks drop too.

You agree a constant voltage/clock is the way to go?


----------



## KingEngineRevUp

Hmmm, am I doing something wrong? With the voltage slider at 100%, my voltage won't go above 1031mv. Trying to run a custom curve.

Running Heaven in windowed mode, watching MSI Afterburner.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Joshwaa*
> 
> You have to keep the temps down, but you might be able to get 2100.5 at less than 1.062


Okay, your method worked. I don't know how to go above 1.062V, but with the voltage curve my card can push 2100MHz on the core. *Thank you. Joining the Twenty-One-Hundred Club now







.*



Edit: Wow, for a little bit my card ran at 2114MHz for like 5 scenes in Heaven. I think I can push this baby even further, but I'm settled at 2101MHz for now.


----------



## Dasboogieman

Quote:


> Originally Posted by *rcfc89*
> 
> Why would someone with a new build choose a CPU that is older architecture and 2 years old? I'd much rather start with the newer higher clocking 7700k. It will hold its value much better in the foreseeable future.


What he meant was that if you're already on Haswell, upgrading to a 5775C is comparable to a full platform upgrade to Kaby Lake in terms of performance. Obviously going 7700K is ideal for future-proofing, but an overclocked 5775C is a drop-in upgrade (meaning no reinstallation of Windows and no fiddling around unscrewing hardware), so it's minimal fuss.

I'm not at my desktop atm, but I suspect the 7700K needs to be at least 4.8GHz+ with 3000MHz CL15-type DDR4 to exceed a 5775C in games. I'll do a follow-up when I get home from holidays and get my 7700K delidded and tested.


----------



## KedarWolf

Joining the 2100 club.









No shunt mod. Sometimes clocks drop below 2100, but I know the driver never crashed or suspended; I turned off 'Restore settings after driver suspend' and voltages and clocks stayed at what I had them set at.

GPU-ZSensorLog.txt 125k .txt file


HardwareMonitoring.zip 6k .zip file
 Afterburner log.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Joining the 2100 club.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No shunt mod, sometimes clocks drop below 2100 but I know driver never crashed or suspended, I turned off 'Restore settings after driver suspend' and voltages and clocks stayed at what I had the set at.
> 
> GPU-ZSensorLog.txt 125k .txt file
> 
> 
> HardwareMonitoring.zip 6k .zip file
> Afterburner log.


I think the club is going to continue to grow once others start figuring out how to *tame the beast that is voltage curve!*


----------



## KingEngineRevUp

Stole this from reddit user. Instructions on voltage curve:


1. Reddit thread on hitting 2100 MHz on the 1080 Ti:
https://www.reddit.com/r/62r2nc/hes_reached_2100_mhz_oc_on_the_1080_ti_find_out/

2. Video on voltage curves





3. Guru3D method of unlocking voltage
http://www.guru3d.com/articles_pages/geforce_gtx_1080_ti_review,32.html

EDIT: Just did a run at 2114 Mhz on the core. Heaven score went from 2592 to 2603. Please see IMGUR.


http://imgur.com/V8th5pF


----------



## Jbravo33

Quote:


> Originally Posted by *Joshwaa*
> 
> You have to keep the temps down but you might be able to get 2100.5 at less that 1.062


Worked for me too, except I can't for the life of me get both cards to use the same curve. I can get 2113 on one card and 1911 base on the other. They're linked when I do this; if I unlink and do them separately, the opposite GPU hits 2113. Smh. Any suggestions? I can't install multiple copies of MSI Afterburner, can I? Frustrating.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Stole this from reddit user. Instructions on voltage curve:
> 
> 
> 1. Reddit thread on hitting 2100 MHz on the 1080 Ti:
> https://www.reddit.com/r/62r2nc/hes_reached_2100_mhz_oc_on_the_1080_ti_find_out/
> 
> 2. Video on voltage curves
> 
> 
> 
> 
> 
> 3. Guru3D method of unlocking voltage
> http://www.guru3d.com/articles_pages/geforce_gtx_1080_ti_review,32.html
> 
> EDIT: Just did a run at 2114 Mhz on the core. Heaven score went from 2592 to 2603. Please see IMGUR.
> 
> 
> http://imgur.com/V8th5pF


The lazy way of getting the voltage slider to work: use WildCards.zip, and see the last few posts in that thread for clarification.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Lazy way of getting voltage slider to work. Use WildCards.zip and see last few posts in thread for clarification.


Yeah, I did that previously, just thought it would be nice to post a guide from another user for others.


----------



## Mato87

Quote:


> Originally Posted by *mtbiker033*
> 
> congratulations bro! I really wanted to wait for a STRIX but the FE came in stock on Amazon prime and i pulled the trigger!!


Thanks man! Yeah, it was tempting for me too, but what stopped me from buying the FE was that the card didn't have any DVI port, so I just had to wait for a non-reference model.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Mato87*
> 
> Thanks man! Yeah, it was tempting for me too, but what stopped me from buying the FE was that the card didn't have any DVI port, so I just had to wait for a non-reference model.


Be quiet and go play quantum break!


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> Be quiet and go play quantum break!


Savage.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> Savage.


I know you were having issues earlier, but did you try the curve method?

What clock are you crashing at? What I recommend is to draw a line from that clock and voltage up to the next voltage.

Let's say you're crashing at 2000MHz at 1.032V; on the curve, make a line from 2000MHz at 1.025 up to 1.050 and then increase the curve from there.

The reason these cards are crashing is the preset voltage curve. You need to add voltage at the right point in relation to the core clock.
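Afterburner's curve editor isn't scriptable, so this is purely a model of the advice above as data: take the stock voltage/frequency points and pin the crash-prone clock to a higher voltage. All voltage and clock numbers here are illustrative, not from a real card:

```python
# Stock curve as (voltage in V, core clock in MHz) pairs; made-up numbers.
stock_curve = [
    (1.000, 1950),
    (1.025, 2000),       # say the card crashes at 2000 MHz on 1.025 V
    (1.050, 2025),
    (1.062, 2050),
]

def pin_clock(curve, clock_mhz, min_voltage):
    """Raise every point at or above clock_mhz to at least min_voltage."""
    return [(max(v, min_voltage), f) if f >= clock_mhz else (v, f)
            for v, f in curve]

fixed = pin_clock(stock_curve, 2000, 1.050)
print(fixed)   # the 2000 MHz point now sits at 1.050 V
```

The idea is the same as dragging the unstable point up in the curve editor: the card only ever runs the problem clock with extra voltage under it.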


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> I know you were having issues earlier, but did you try the curve method?
> 
> What clock are you crashing at? What i recommend is to have a line from that clock and voltage to the next voltage.
> 
> Lets say you're crashing at 2000 Mhz at 1.032V, well on the curve, make a line from 2000 Mhz at 1.025 to 1.050 and then increase the curve from there.
> 
> The reason these cards are crashing is because of the preset voltage curve. You need to add the voltage at the right time with relationship to core.


Still testing, very inconsistent results.


----------



## Mato87

Quote:


> Originally Posted by *SlimJ87D*
> 
> Be quiet and go play quantum break!


Damn, it was worth the wait. I can finally play at a stable framerate with the upscaling turned off, and it doesn't dip below 60 even during combat, but I had to turn down the volumetric lighting or whatever it's called. It's not an optimized game, unfortunately.


----------



## zipeldiablo

Quote:


> Originally Posted by *SlimJ87D*
> 
> Be quiet and go play quantum break!


Damn, totally forgot this game.
Is it fixed yet? I played it at the beginning and oh my god, there were bugs.
The funniest thing was not having an exit button on the main menu to go back to Windows


----------



## Mato87

Quote:


> Originally Posted by *zipeldiablo*
> 
> Damn totally forgot this game.
> Is it fixed yet? I played it in the beginning and oh my god there were bugs.
> The funniest thing was to not have an exit button on the main menu to go back to windows


It's a fantastic game so far, with such good gameplay; if you loved Max Payne or Alan Wake you'll love Quantum Break. I haven't noticed any bugs, but then again I am playing the retail version, not the gimped UWP version.


----------



## Biggu

Quote:


> Originally Posted by *Biggu*
> 
> I will know tomorrow for sure. I ordered Watercool HEATKILLER® IV for TITAN X (Pascal) - ACETAL Nickel-BL and Aquacomputer backplate for kryographics Pascal NVIDIA TITAN X, passive for mine


So quite a few pages ago we were talking about mounting the Aquacomputer backplate to the Heatkiller block, and I said it wasn't possible without mods. Well, I did those mods and I've got it mounted. For the most part it required longer screws: M3x12 worked in most spots, and I had to use a few M3x16 on one or two. M3x13 or 14 would have been perfect on all the holes, but I didn't bother; M3x12 was good enough for the majority. Second, it required drilling all the holes out to 3/32", even the two that were threaded; on those two I also added a bevel.

Regardless, here it is.


----------



## BoredErica

Can OP update the thread so I don't have to comb through hundreds and hundreds of posts to get to the good stuff?


----------



## Joshwaa

Quote:


> Originally Posted by *Jbravo33*
> 
> Worked for me too except I cannot for the life of me get both cards to use the same curve. I can get 2113 on one card and 1911 base on other. They are linked when I do this, if I unlink and do separately the opposite gpu hits 2113. Smh. Any suggestions? I can't install multiple msi can I? Frustrating.


So you're not able to un-link the two cards and do a graph for each of them?


----------



## madmeatballs

People who have Kraken G10s on their card, do you know how to make the fan work? I have the Gelid adapter; it seems like when I install the GPU driver (latest, also used DDU) it stops working. The fan works before logging in to Windows.


----------



## gstarr

hey,

how can I do a *simple test* of the GPU for chip quality, without cooling modification?

TDP 120%
max temp
max volt

worst: < 2000 MHz?
good: ~ 2050 MHz?
golden: >= 2085 MHz?

Sure, for longer tests and proper Fire Strike runs I'd need better cooling for higher clocks.


----------



## zipeldiablo

https://www.ekwb.com/news/official-list-ek-water-blocks-geforce-gtx-1080-ti-series/

Why would they release a block for the 1080 Ti FE if we can already use the one for the Titan (Pascal)? I don't get it


----------



## khemist

So it won't be annoying having a block that says Titan on it when you have a 1080 Ti.


----------



## Asus11

Quote:


> Originally Posted by *khemist*
> 
> So it wont be annoying having a block that says Titan on it when you have a 1080Ti.


It was the same for the 980 Ti / Titan X Maxwell, if I'm not mistaken


----------



## hotrod717

Any user OC comparisons of Strix vs. reference?


----------



## gstarr

Worst card? Without cooling modification:



Curve at 1.062 - 2025.

After 15 sec it drops to 2012-2000, a thermal issue, but everything above 2025 will crash 3D instantly.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Mato87*
> 
> It's a fantastic game so far, such a good gameplay, if you loved max payne or alan wake you'll love quantum break. I haven't noticed any bugs, but then again Iam playing the retail version, not the gimped UWP version.


I might have to play it now


----------



## hotrod717

Quote:


> Originally Posted by *gstarr*
> 
> worst card? without cooling modification
> 
> 
> 
> curve on 1.062 - 2025
> 
> after 15 sec drop to 2012 - 2000, thermal issue, but everything above 2025 will crash 3d instantly


Is that a Strix? I'm asking not to offend anyone, but most reviews are showing no improvement in OC vs. reference. In fact, most reviewers' OCs fall short of reference. I know it's early and reviewers aren't typically the best OC'ers. Just curious, strictly from an OC'ing standpoint.


----------



## gstarr

I was able to run 2025 stable at 1.050V; no way to clock higher. Nope, it's an EVGA FE. Is it better to order from NVIDIA directly?


----------



## Menta

Quote:


> Originally Posted by *hotrod717*
> 
> Is that a strix?? I am asking, not to offend anyone, but most reviews are showing no improvement in oc vs reference. In fact most reviewers oc fall short of reference. I know its early and reviewers aren't typically the best oc'ers. Just curious strictly from oc'ing stand point.


Don't expect differences from a maximum-overclock standpoint, AIB vs. Founders.

AIB cards will maintain higher frequencies; reference will throttle down to maintain reasonable temperatures. That's the big difference.


----------



## PewnFlavorTang

I'm almost certain one of the FE's I have will do 2150 on water and the other one would probably do 2114-2126. I have 2 fairly good clocking FE's.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Darkwizzie*
> 
> Can OP update the thread so I don't have to comb through hundreds and hundreds of posts to get to the good stuff?


We need an "Official" Nvidia GTX 1080 Ti Owners Thread. OP should never have made this thread if he doesn't want to maintain it. For such a hot card right now, this is one of the worst-kept OPs here.


----------



## EvilPieMoo

I really wish there was a Pascal BIOS editor just so GPU Boost 3.0 could be disabled. I don't see why it feels the need to downclock the card when it's nowhere close to the thermal limits.


----------



## PasK1234Xw

Quote:


> Originally Posted by *gstarr*
> 
> was able to run 2025 stable at 1.050.. no way to clock higher.. nope it's evga fe.. better way to order from nvidia directly?


EVGA is notorious for binning cards even though they won't admit it. I tried NVIDIA this time around but didn't get great results. I currently have an MSI FE that barely holds 2050. Now that I'm seeing an almost 10 fps increase at 2100, it's making me want to return it and try again.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *PasK1234Xw*
> 
> EVGA is notorious for binning cards even though they wont admit it. I tried nvidia this time around but didnt get great results. Currently have MSI FE that barely holds 2050. Now that im seeing almost 10fps increase at 2100 making me want to return and try again.


I agree... Boost 3.0 isn't meant for OC junkies like us.

Did you try the curve?

You can get a good idea of whether you can hit 2100 MHz: when you first run a benchmark, check if it boosts to 2100 while temperatures are still around 30C.

I always saw my card hit 2100 MHz at 1.032V just for a second before the temperature went up and it would drop to 2088 MHz.


----------



## c0ld

Quote:


> Originally Posted by *D1RTYD1Z619*
> 
> Anyone still running a 2600k and using a GTX 1080 TI @4k Ultra settings in Battlefield 1? I cant tell if I'm bottlenecking my GTX 1080. I want to buy a GTX 1080 TI but if my 2600k is bottlenecking my 1080 getting a TI will just be worse. In that case ill just upgrade the proc and motherboard instead of the vid card and get the Evga TI when it is released.


I knew a lot of noobs would jump the gun and say upgrade when there is hardly any difference between a 2600k and a 7700k. I would attribute more of the difference to the available DDR4 than to actual upgrades to the CPU. Intel has just been focusing on power savings.

If your 2600k isn't OC'ed, I would recommend doing an OC.

Check this article
https://www.hardocp.com/article/2017/01/13/kaby_lake_7700k_vs_sandy_bridge_2600k_ipc_review/


----------



## D1RTYD1Z619

Quote:


> Originally Posted by *c0ld*
> 
> I knew a lot of noobs would jump straight to the gun and say upgrade when there is hardly any difference between a 2600k vs 7700k. I would account more the difference because of the available DDR4 than actual upgrades to the CPU. Intel has just been focusing on power savings.
> 
> If your 2600k isnt OC'ed I would recommend doing an OC.
> 
> Check this article
> https://www.hardocp.com/article/2017/01/13/kaby_lake_7700k_vs_sandy_bridge_2600k_ipc_review/


It is overclocked at its highest stable speed, 4.2GHz, with 16 gigs of RAM. I went through that article when it was posted, but it isn't really addressing the question I'm after.


----------



## c0ld

Quote:


> Originally Posted by *D1RTYD1Z619*
> 
> It is overclocked at the highest stable speed 4.2ghz. with 16 gigs of RAM. I've gone through that article when it was posted but that isn't really talking about the answer I'm after.


I would recommend bumping the OC up to eliminate any chance of bottlenecking. There is about a 20% clock-for-clock advantage of the 7700k over the 2600k, and that's with the 7700k running faster DDR4 memory. Now, if it were a 2500k, I would definitely recommend upgrading the CPU.

My 2600k sits at a 4.8GHz OC and I'm sure I won't bottleneck the Aorus 1080 Ti I'm getting this Tuesday.


----------



## Alwrath

Quote:


> Originally Posted by *D1RTYD1Z619*
> 
> It is overclocked at the highest stable speed 4.2ghz. with 16 gigs of RAM. I've gone through that article when it was posted but that isn't really talking about the answer I'm after.


My Core i5 760 @ 4.2 was fine for BF1 with my GeForce 1080 @ 2101MHz @ 4K. Framerate was mostly 65; the lowest dip I saw was 55. People throw the term "bottleneck" around way too much. I don't think it means what you people think it means. You can still happily game on an overclocked 2600k with a 1080 Ti, and you could do so for the next 4-5 years, I'm sure.


----------



## c0ld

Quote:


> Originally Posted by *Alwrath*
> 
> My core i5 760 @4.2 was fine for BF1 with my geforce 1080 @ 2101 mhz @4K. Framerate was mostly 65, lowest dip I saw was 55. People throw the term " bottleneck " around way too much. I dont think it means what you people think it means. You can still happily game on a 2600k overclocked with a 1080ti. You could do so for the next 4-5 years im sure.


Exactly! I see so many noobs throwing around the claim that the CPU is gonna bottleneck, some of them with ridiculous numbers. That HardOCP article disproves the myth that 2+ year-old CPUs now bottleneck newer GPUs.
Quote:


> Originally Posted by *mtbiker033*
> 
> I would have to agree. I'm just sitting here with a 4820k @ 4.6ghz wondering the same thing (somehow I feel a little better now)


^ A prime example.


----------



## PasK1234Xw

Quote:


> Originally Posted by *SlimJ87D*
> 
> I agree... Boost 3.0 isn't meant for a OC junky like us.
> 
> Did you try the curve?
> 
> You can get a good idea if you can hit 2100 Mhz if when you first run a benchmark and it boosts to 2100 when temperatures are around 30C.
> 
> I always saw my card hit 2100 Mhz at 1.032V just for a second before temperature went up and I would go down to 2088 Mhz.


Anything over +150 eventually locks up; 2100 is an instant lockup, even with the voltage curve.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *c0ld*
> 
> Exactly! I see soo many noobs throwing the term that the CPU is gonna bottleneck. Some of them with ridiculous claims. That HardOC article disproves the myths that apparently 2+ year old CPU's now bottleneck newer GPU's.
> ^ A prime example.


Next time you have a question on something you're not sure about, I'll make sure to refer to you as a *"noob"* and try to answer it for you.


----------



## lilchronic

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Next time you have a question on something you're not sure about, I'll make sure to refer to as *"noob"* and try to answer it for you.


Speaking of noobs, I kinda feel bad for 'em.









Spoiler: Warning: Spoiler!


----------



## KedarWolf

Quote:


> Originally Posted by *c0ld*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alwrath*
> 
> My core i5 760 @4.2 was fine for BF1 with my geforce 1080 @ 2101 mhz @4K. Framerate was mostly 65, lowest dip I saw was 55. People throw the term " bottleneck " around way too much. I dont think it means what you people think it means. You can still happily game on a 2600k overclocked with a 1080ti. You could do so for the next 4-5 years im sure.
> 
> 
> 
> Exactly! I see soo many noobs throwing the term that the CPU is gonna bottleneck. Some of them with ridiculous claims. That HardOC article disproves the myths that apparently 2+ year old CPU's now bottleneck newer GPU's.
> Quote:
> 
> 
> 
> Originally Posted by *mtbiker033*
> 
> I would have to agree. I'm just sitting here with a 4820k @ 4.6ghz wondering the same thing (somehow I feel a little better now)
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> ^ A prime example.
Click to expand...

Pops the top off his beer and seductively puts the bottle neck in his mouth...


----------



## Jbravo33

Quote:


> Originally Posted by *Joshwaa*
> 
> So you are not able to un-link the two cards and do a graph for each of them?


When I unlink them, after saving the curve to one card, I hit the little graph button and a new graph pops up for the second card. I change the curve, then start a bench, and now the second card is at 2113 but the first card is back to stock. It's like there's only one graph in the software. Can anyone else with SLI confirm this?


----------



## Joshwaa

Quote:


> Originally Posted by *Jbravo33*
> 
> When I unlink them after saving the curve to one card I then hit the little graph button and a new graph pops up for that second card. I change the curve then I start a bench and now that second card is at 2113 but the first card is back to stock. It's like there's only one graph for the software. Can anyone else with sli confirm this?


I think the problem is you need to adjust the first one and hit apply (not save), then adjust the second one, hit apply, and then hit save. It should work that way.


----------



## DooRules

Use a separate curve for each GPU, as mentioned above. Each card is not the same and cannot hold the same clocks at a set voltage; don't be surprised to see one card at a certain clock with X voltage and the second card at the same clock with Y voltage.


----------



## Mato87

Quote:


> Originally Posted by *SlimJ87D*
> 
> I might have to play it now


Do that, you won't be disappointed believe me.


----------



## KaRLiToS

Quote:


> Originally Posted by *lilchronic*
> 
> Speaking of noobs.i kinda feel bad for em
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


HAx0r


----------



## lilchronic

Quote:


> Originally Posted by *KaRLiToS*
> 
> HAx0r


ehh just noobs


----------



## joder

Any pointers on getting this curve set right?

I have
1031 mV/+170 (2050 Mhz)
1043 mV/+173 (2062 Mhz)
1050 mV/+176 (2062 Mhz)
And flat after that

I noticed the following when firestrike crashes:

Power 113%
Temp 50C
2050Mhz
1.05V

Which then goes to

Power 40%
Temp 47C
2063 Mhz
1.081V

Currently an FE card with full fans going when benching. Possibly thermal throttling? I am going to get an EK block when they come out with the 1080Ti branded ones.


----------



## Slackaveli

Quote:


> Originally Posted by *Dasboogieman*
> 
> What he meant was that if you are already on Haswell, upgrading to a 5775c is comparable to a full platform upgrade to Kaby lake in terms of performance. Obviously, going 7700k is ideal for future proofing but an overclocked 5775c is a drop in upgrade (that means no reinstallation of Windows and no fiddling around with unscrewing hardware) so its minimal fuss.
> 
> I'm not at my desktop atm but I suspect the 7700k needs to be at least 4.8ghz + with 3000mhz CL15 type DDR4 to exceed a 5775c in games. Ill do a followup when I get home from holidays and get my 7700k delidded and tested.


thank you, sir.


----------



## hotrod717

Quote:


> Originally Posted by *Menta*
> 
> Dont expect differences from a maximum overclock standpoint, AIB VS Founders,
> 
> AIB cards will maintain higher frequencies, reference will throttle down to main reasonable temperatures. that´s the big difference


That is where I'm coming from. I would like to see some screens of AIB cards; I'm sure they will be popping up soon enough.
For someone staying with air cooling, an AIB card would seem to make sense. But for water or extreme cooling, looking at the info that's out right now, silicon quality, cooling, and soldering skills are the only factors.


----------



## lilchronic

Quote:


> Originally Posted by *joder*
> 
> Any pointers on getting this curve set right?
> 
> I have
> 1031 mV/+170 (2050 Mhz)
> 1043 mV/+173 (2062 Mhz)
> 1050 mV/+176 (2062 Mhz)
> And flat after that
> 
> I noticed the following when firestrike crashes:
> 
> Power 113%
> Temp 50C
> 2050Mhz
> 1.05V
> 
> Which then goes to
> 
> Power 40%
> Temp 47C
> 2063 Mhz
> 1.081V
> 
> Currently an FE card with full fans going when benching. Possibly thermal throttling? I am going to get an EK block when they come out with the 1080Ti branded ones.


You can try starting at the lowest voltage point you have seen your card throttle to.

For me, in the 3DMark Time Spy benchmark, I saw voltage drop down to 0.963v, so I just start there and add more core clock to each voltage point.


----------



## KaRLiToS

Is that debate still going on?

Do you want 39.5 fps or 38.5 fps?



We are all here with *GTX 1080 Tis*, mostly playing at *1440p or 4K*. I don't understand all this debate. Can someone help me understand if I'm missing something?


----------



## Slackaveli

Quote:


> Originally Posted by *lilchronic*
> 
> You can try to start at the lowest voltage point you have seen your card throttle too.
> 
> For me in the 3D Mark Timespy benchmark i saw voltage drop down to .963v so i just start there and add more core clocks to each voltage point.


Post a pic of your curve again? I came from 980 Tis, so I haven't conquered the curve yet.


----------



## Slackaveli

Quote:


> Originally Posted by *KaRLiToS*
> 
> Is that debate still going on?
> 
> Do you want 39.5 fps or 38.5 fps?
> 
> 
> 
> We are all here with *GTX 1080tis*, mostly playing at *1440p or 4k*. I don't understand all this debate? Can someone help me understand if I am missing something?


Not really. I think the debate is settled, with an occasional noob getting brought up to speed.


----------



## Norlig

Where do I get the MSI Afterburner Beta with the voltage graph?


----------



## Slackaveli

Quote:


> Originally Posted by *Norlig*
> 
> Where do I get the MSI Afterburner Beta with the voltage graph?


Yeah, you need it to see your curve graph.


----------



## pantsoftime

For those of you new to curve overclocking you can hold the control key to drag the entire curve. You can also highlight a point and hit L to lock the card to that VF point.


----------



## Norlig

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, to see your curve graph.


I got 4.3 stable, but I read you need a beta version to get the voltage graph. Where do I download that beta? (is what I meant)


----------



## joder

Quote:


> Originally Posted by *lilchronic*
> 
> You can try to start at the lowest voltage point you have seen your card throttle too.
> 
> For me in the 3D Mark Timespy benchmark i saw voltage drop down to .963v so i just start there and add more core clocks to each voltage point.


Thank you. I think my OC for 1.05V was too high for my card. I backed off a little bit and it seems to have gotten me through. Trying to hit 2.1 is going to be tough for me without unlocked voltage (thus never)


----------



## pantsoftime

Quote:


> Originally Posted by *Norlig*
> 
> I got 4.3 stable, but I read you need a beta verstion to get the voltage graph, where do I download that beta? (is what I meant)


You don't need a beta version. Just hit Ctrl+F.


----------



## Menta

Quote:


> Originally Posted by *hotrod717*
> 
> That iswhere I'm coming from. Would like to see some screens on aib cards. I'm sure they will be popping up soon enough.
> For someone staying with air cooling it would seem to make sense. Buy for water or extreme, looking at info that's out right now, silicon quality, cooling and soldering skills are the only factor.


Yeah, Asus, MSI, Zotac, Gigabyte, and EVGA all tick those boxes. Zotac has a 16+2 power-phase design, ASUS 10+2, Gigabyte 12+2, MSI 8+2.

The Zotac is a beast at 320W, but whatever card you get will perform about the same; acoustics should be close also.


----------



## c0ld

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Next time you have a question on something you're not sure about, I'll make sure to refer to as *"noob"* and try to answer it for you.


Glad I made my point, though.




Hopefully that can save some people money on useless upgrades.


----------



## Jbravo33

Quote:


> Originally Posted by *Joshwaa*
> 
> I think the problem is you need to adjust the first one and apply and not hit save then adjust the second one and apply then hit save. Should work that way.


Unfortunately I tried that as well; will do some more tinkering tonight now that I know how to adjust the graph faster, smh.

Quote:


> Originally Posted by *pantsoftime*
> 
> For those of you new to curve overclocking you can hold the control key to drag the entire curve. You can also highlight a point and hit L to lock the card to that VF point.


Wow! Wish I knew that last night; I spent so much time adjusting each one.


----------



## lilchronic

Quote:


> Originally Posted by *Slackaveli*
> 
> post a pic of your curve again? i came from 980ti's so i havent conquered the curve yet.


I came from Maxwell as well, a Titan X.

First I set my overclock to a +151 core offset with the power/voltage sliders maxed: 2050MHz.

Then you go to the next voltage bin below that (1.043v in my case) and add +13.

Then do that all the way down to 0.963v.


You can even try to run (in my case) 2050MHz at the 1.031v and 1.025v bins, so when it throttles from power or heat it will drop voltage but keep the 2050MHz.
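For anyone who'd rather see the bookkeeping spelled out, here's a rough Python sketch of the per-bin method above. The +151 top offset, the +13 step, and these voltage bins are just my card's numbers; yours will differ.

```python
# Sketch of the per-bin curve method: walking down from the top voltage bin,
# each lower bin gets a bigger clock offset, so the card can hold the same
# clock when power/thermal throttling drops the voltage.

def build_curve(bins, top_voltage, top_offset, step):
    """Map each voltage bin (volts) to a core-clock offset (MHz)."""
    curve = {}
    offset = top_offset
    for v in sorted(bins, reverse=True):
        if v > top_voltage:
            curve[v] = top_offset  # bins above the chosen top stay flat
            continue
        curve[v] = offset
        offset += step             # lower bins get 'step' more offset
    return curve

# Example numbers from the post above (card-specific, not universal):
curve = build_curve(
    bins=[0.963, 1.000, 1.025, 1.031, 1.043, 1.062],
    top_voltage=1.062, top_offset=151, step=13)
for v in sorted(curve, reverse=True):
    print(f"{v:.3f} V -> +{curve[v]} MHz")
```

Running it just prints the offset to dial in at each bin; in Afterburner you would apply these points on the Ctrl+F curve editor rather than the main slider.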


----------



## ThingyNess

Has anyone been able to dump a BIOS from their 1080 Ti yet? I've yet to find one 'in the wild' and would like to start writing a program to dump all of the power limits, fan speeds, etc. from them.
It looks like GPU-Z doesn't work with the 1080 Ti yet, but hopefully someone can dump one with NVFlash and upload it somewhere.

Also, interesting reading from the MSI 1080 Ti review on Guru3D. As with all the reviews so far, there is no mention of TDP targets, adjustability, or what any of the actual numbers are, but...
They do show a picture of Afterburner with the power limit slider maxed out, and even on air the card runs right up against its 117% TDP limit and power throttles.

Looks like I'll have to hold out for a review of the Gigabyte card - I have high hopes for its 12+2 phase VRM design.

http://www.guru3d.com/articles_pages/msi_gtx_1080_ti_gaming_x_review,30.html

Anyone have any guesses what the base TDP target is? They've traditionally been pretty generous on that front, with the 1080 Gaming X having 230w instead of the default 180w of the FE cards.

If they gave the 1080Ti an extra 50w too, that'd be 300w, and 300w + 17% is almost exactly 350w, a nice round number...


----------



## KingEngineRevUp

Quote:


> Originally Posted by *joder*
> 
> Any pointers on getting this curve set right?
> 
> I have
> 1031 mV/+170 (2050 Mhz)
> 1043 mV/+173 (2062 Mhz)
> 1050 mV/+176 (2062 Mhz)
> And flat after that
> 
> I noticed the following when firestrike crashes:
> 
> Power 113%
> Temp 50C
> 2050Mhz
> 1.05V
> 
> Which then goes to
> 
> Power 40%
> Temp 47C
> 2063 Mhz
> 1.081V
> 
> Currently an FE card with full fans going when benching. Possibly thermal throttling? I am going to get an EK block when they come out with the 1080Ti branded ones.


Do 2033 at 1.032 to 1.050 and then add up from there. You need to get more voltage before 2050, which is where you're crashing.


----------



## RedM00N

Quote:


> Originally Posted by *ThingyNess*
> 
> Has anyone been able to dump a BIOS from their 1080Ti yet? I've yet to find one 'in the wild' and would like to start writing a program to dump all of the power limits, fan speeds, etc. from them.
> Looks like GPU-Z doesn't work with the 1080Ti yet, but hopefully someone can dump one with NVFlash and upload it somewhere.
> 
> Also, interesting reading from the MSI 1080Ti review on Guru3d. Like all the reviewers so far there have been no mention about TDP targets, adjustability, or what any of the actual numbers are, but...
> They do show a picture of Afterburner with the power limit slider maxed out, and even on air the card runs right up against its 117% TDP limit and power throttles.
> 
> Looks like I have to hold out for a review of the Gigabyte card - I have high hopes for its 12+2 phase VRM design.
> 
> http://www.guru3d.com/articles_pages/msi_gtx_1080_ti_gaming_x_review,30.html
> 
> Anyone have any guesses what the base TDP target is? They've traditionally been pretty generous on that front, with the 1080 Gaming X having 230w instead of the default 180w of the FE cards.
> 
> If they gave the 1080Ti an extra 50w too, that'd be 300w, and 300w + 17% is almost exactly 350w, a nice round number...


If anything, it can possibly be dumped in Linux (I've done it with Maxwell, not Pascal) if you're willing to do a Linux install. There may be similar ways to do it in Windows, but I haven't looked.


----------



## KedarWolf

Quote:


> Originally Posted by *ThingyNess*
> 
> Has anyone been able to dump a BIOS from their 1080Ti yet? I've yet to find one 'in the wild' and would like to start writing a program to dump all of the power limits, fan speeds, etc. from them.
> Looks like GPU-Z doesn't work with the 1080Ti yet, but hopefully someone can dump one with NVFlash and upload it somewhere.
> 
> Also, interesting reading from the MSI 1080Ti review on Guru3d. Like all the reviewers so far there have been no mention about TDP targets, adjustability, or what any of the actual numbers are, but...
> They do show a picture of Afterburner with the power limit slider maxed out, and even on air the card runs right up against its 117% TDP limit and power throttles.
> 
> Looks like I have to hold out for a review of the Gigabyte card - I have high hopes for its 12+2 phase VRM design.
> 
> http://www.guru3d.com/articles_pages/msi_gtx_1080_ti_gaming_x_review,30.html
> 
> Anyone have any guesses what the base TDP target is? They've traditionally been pretty generous on that front, with the 1080 Gaming X having 230w instead of the default 180w of the FE cards.
> 
> If they gave the 1080Ti an extra 50w too, that'd be 300w, and 300w + 17% is almost exactly 350w, a nice round number...


Done with latest version of nvflash, BIOS date 1/18/2017 Gigabyte 1080 Ti FE.

1080Ti.zip 156k .zip file


Identifying EEPROM...
EEPROM ID (EF,6013) : WBond W25Q40EW 1.65-1.95V 4096Kx1S, page
Reading adapter firmware image...
IFR Data Size : 2184 bytes
IFR CRC32 : A73AE6C5
IFR Image Size : 2560 bytes
IFR Image CRC32 : 432F79F3
IFR Subsystem ID : 10DE-120F
Image Size : 260096 bytes
Version : 86.02.39.00.01
~CRC32 : E303990C
Image Hash : 901B1B9D77096B6AEA31138C207A1F2E
Subsystem ID : 10DE-120F
Hierarchy ID : Normal Board
Chip SKU : 350-0
Project : G611-0050
CDP : N/A
Build Date : 01/18/17
Modification Date : 01/18/17
Build GUID : 402D5B72FFFFFFB8FFFFFFA34DFFFFFFFBFFFFFFBEFFFFFFB8FFFFFFA9FFFFFFC5FFFFFFCDFFFFFF99FFFFFFEE31
UEFI Support : Yes
UEFI Version : 0x30006 (Oct 26 2016 @ 21305424 )
UEFI Variant Id : 0x0000000000000007 ( GP1xx )
UEFI Signer(s) : Microsoft Corporation UEFI CA 2011
InfoROM Version : G001.0000.01.04
InfoROM Backup Exist : NO
License Placeholder : Absent
GPU Mode : N/A
Saving of image completed.


----------



## RedM00N

Quote:


> Originally Posted by *KedarWolf*
> 
> Done with latest version of nvflash, BIOS date 1/18/2017 Gigabyte 1080 Ti FE.
> 
> 1080Ti.zip 156k .zip file
> 
> 
> Identifying EEPROM...
> EEPROM ID (EF,6013) : WBond W25Q40EW 1.65-1.95V 4096Kx1S, page
> Reading adapter firmware image...
> IFR Data Size : 2184 bytes
> IFR CRC32 : A73AE6C5
> IFR Image Size : 2560 bytes
> IFR Image CRC32 : 432F79F3
> IFR Subsystem ID : 10DE-120F
> Image Size : 260096 bytes
> Version : 86.02.39.00.01
> ~CRC32 : E303990C
> Image Hash : 901B1B9D77096B6AEA31138C207A1F2E
> Subsystem ID : 10DE-120F
> Hierarchy ID : Normal Board
> Chip SKU : 350-0
> Project : G611-0050
> CDP : N/A
> Build Date : 01/18/17
> Modification Date : 01/18/17
> Build GUID : 402D5B72FFFFFFB8FFFFFFA34DFFFFFFFBFFFFFFBEFFFFFFB8FFFFFFA9FFFFFFC5FFFFFFCDFFFFFF99FFFFFFEE31
> UEFI Support : Yes
> UEFI Version : 0x30006 (Oct 26 2016 @ 21305424 )
> UEFI Variant Id : 0x0000000000000007 ( GP1xx )
> UEFI Signer(s) : Microsoft Corporation UEFI CA 2011
> InfoROM Version : G001.0000.01.04
> InfoROM Backup Exist : NO
> License Placeholder : Absent
> GPU Mode : N/A
> Saving of image completed.


Nvflash version 5.353.0 works with the 1080ti? Or is there a newer one I missed.


----------



## KedarWolf

Quote:


> Originally Posted by *RedM00N*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Done with latest version of nvflash, BIOS date 1/18/2017 Gigabyte 1080 Ti FE.
> 
> 1080Ti.zip 156k .zip file
> 
> 
> Identifying EEPROM...
> EEPROM ID (EF,6013) : WBond W25Q40EW 1.65-1.95V 4096Kx1S, page
> Reading adapter firmware image...
> IFR Data Size : 2184 bytes
> IFR CRC32 : A73AE6C5
> IFR Image Size : 2560 bytes
> IFR Image CRC32 : 432F79F3
> IFR Subsystem ID : 10DE-120F
> Image Size : 260096 bytes
> Version : 86.02.39.00.01
> ~CRC32 : E303990C
> Image Hash : 901B1B9D77096B6AEA31138C207A1F2E
> Subsystem ID : 10DE-120F
> Hierarchy ID : Normal Board
> Chip SKU : 350-0
> Project : G611-0050
> CDP : N/A
> Build Date : 01/18/17
> Modification Date : 01/18/17
> Build GUID : 402D5B72FFFFFFB8FFFFFFA34DFFFFFFFBFFFFFFBEFFFFFFB8FFFFFFA9FFFFFFC5FFFFFFCDFFFFFF99FFFFFFEE31
> UEFI Support : Yes
> UEFI Version : 0x30006 (Oct 26 2016 @ 21305424 )
> UEFI Variant Id : 0x0000000000000007 ( GP1xx )
> UEFI Signer(s) : Microsoft Corporation UEFI CA 2011
> InfoROM Version : G001.0000.01.04
> InfoROM Backup Exist : NO
> License Placeholder : Absent
> GPU Mode : N/A
> Saving of image completed.
> 
> 
> 
> Nvflash version 5.353.0 works with the 1080ti? Or is there a newer one I missed.
Click to expand...

It took a bit to find one that worked, but due to an unfortunate hard drive wipe I lost the one that worked, and I'm not quite sure which it was, to be honest. I know it was one of the latest ones where you don't need the certificate-signing thingy, I think. This post is from a month or so ago; I just edited it and copied and pasted the info.


----------



## RedM00N

Quote:


> Originally Posted by *KedarWolf*
> 
> Took a bit to find one that worked but due to an unfortunate hard drive wipe I lost the one that worked and not quite sure to be honest. I know it was one of the latest ones you don't need the certificate signed thingy I think it was. This post is from a month or so ago, I just edited it, copied and pasted the info.


Interesting. Guess it doesn't require support for the card to dump, just support for the EEPROM, if an older version worked.


----------



## KedarWolf

Quote:


> Originally Posted by *RedM00N*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Done with latest version of nvflash, BIOS date 1/18/2017 Gigabyte 1080 Ti FE.
> 
> 1080Ti.zip 156k .zip file
> 
> 
> Identifying EEPROM...
> EEPROM ID (EF,6013) : WBond W25Q40EW 1.65-1.95V 4096Kx1S, page
> Reading adapter firmware image...
> IFR Data Size : 2184 bytes
> IFR CRC32 : A73AE6C5
> IFR Image Size : 2560 bytes
> IFR Image CRC32 : 432F79F3
> IFR Subsystem ID : 10DE-120F
> Image Size : 260096 bytes
> Version : 86.02.39.00.01
> ~CRC32 : E303990C
> Image Hash : 901B1B9D77096B6AEA31138C207A1F2E
> Subsystem ID : 10DE-120F
> Hierarchy ID : Normal Board
> Chip SKU : 350-0
> Project : G611-0050
> CDP : N/A
> Build Date : 01/18/17
> Modification Date : 01/18/17
> Build GUID : 402D5B72FFFFFFB8FFFFFFA34DFFFFFFFBFFFFFFBEFFFFFFB8FFFFFFA9FFFFFFC5FFFFFFCDFFFFFF99FFFFFFEE31
> UEFI Support : Yes
> UEFI Version : 0x30006 (Oct 26 2016 @ 21305424 )
> UEFI Variant Id : 0x0000000000000007 ( GP1xx )
> UEFI Signer(s) : Microsoft Corporation UEFI CA 2011
> InfoROM Version : G001.0000.01.04
> InfoROM Backup Exist : NO
> License Placeholder : Absent
> GPU Mode : N/A
> Saving of image completed.
> 
> 
> 
> Nvflash version 5.353.0 works with the 1080ti? Or is there a newer one I missed.
Click to expand...

NVFlash 5.353.0, but you need to use nvflash64.exe. 'nvflash64 --list' lists the cards (I have two, one dedicated to PhysX); 'nvflash64 --save filename' works if you only have the one card. I think I ran 'nvflash64 --protectoff' as well but got an error, so I don't know if you actually need to do that.

https://www.techpowerup.com/download/nvidia-nvflash/

nvflash_5.353.0.zip 3031k .zip file


----------



## Slackaveli

Quote:


> Originally Posted by *lilchronic*
> 
> I came from Maxwell as well Titan X.
> 
> First i set my overclock to +151 core offset and power / voltage slider maxed. 2050Mhz
> 
> Then you go to the next voltage bin below that (1.043v) in my case and add +13
> 
> Then do that all the way down to .963v
> 
> 
> you can even try to run in my case 2050Mhz @ 1.031v and 1.025v bin so when it throttles from power or heat it will drop voltage but keep the 2050Mhz.


Pretty much just what I did, and now I can complete the Heaven loop at 2050/2037 throughout, on air. THANKS, bro.


----------



## Slackaveli

Quote:


> Originally Posted by *PewnFlavorTang*
> 
> I'm almost certain one of the FE's I have will do 2150 on water and the other one would probably do 2114-2126. I have 2 fairly good clocking FE's.


"Fairly good" isn't a fair assessment. Those are two BEASTS.


----------



## KedarWolf

Quote:


> Originally Posted by *RedM00N*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Took a bit to find one that worked but due to an unfortunate hard drive wipe I lost the one that worked and not quite sure to be honest. I know it was one of the latest ones you don't need the certificate signed thingy I think it was. This post is from a month or so ago, I just edited it, copied and pasted the info.
> 
> 
> 
> Interesting. Guess it doesnt require support for the card to dump, just support for the EEPROM if an older version worked.
Click to expand...

NVFlash 5.353.0, but you need to use nvflash64.exe; 'nvflash64 --list' lists the cards. I have two, one dedicated to PhysX.

'nvflash64 --save filename' if you only have the one card.

I think I did 'nvflash64 --protectoff' as well but got an error so I don't know if you actually need to do that.

https://www.techpowerup.com/download/nvidia-nvflash/

nvflash_5.353.0.zip 3031k .zip file


Reposting so it doesn't get lost in an edit; so many posts, so little time.


----------



## Slackaveli

I hadn't even seen this one yet. Damn. https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c

These three pages for the tl;dr; damn-o:
https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,33

https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,38

https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,39


----------



## ThingyNess

Quote:


> Originally Posted by *KedarWolf*
> 
> Done with latest version of nvflash, BIOS date 1/18/2017 Gigabyte 1080 Ti FE.
> 
> 1080Ti.zip 156k .zip file


Thank you very much for this. Now we just need some AIB/aftermarket 1080Ti BIOSes to compare to. Not a single reviewer has published any hard TDP numbers for these cards yet, so if anyone has an AIB 1080Ti and can post the BIOS, I can extract all the information from it. I've included the offsets to everything in case anyone wants to do this on their own too.

Note that the 6 "fan curve" entries are the rpm levels set at the various temperature breakpoints within the stock fan curve. Unfortunately I haven't figured out where the temperature values for those are stored yet, but I think Afterburner already shows you the stock fan curve anyway, so no huge need for that.

From the FE 1080Ti BIOS KedarWolf posted:

Note: in the BIOS the numbers are stored LSB first, so if you see "48 e8 01 00" in your hex editor, you need to swap to MSB-first notation before converting to decimal. "48 e8 01 00" becomes "00 01 e8 48", which converts to 125000 decimal, i.e. your Min TDP target in milliwatts.

Code:

Min TDP target                          DWORD - 0x00030e70 - (48 e8 01 00) = 125000mW = 125W
Default TDP target                      DWORD - 0x00030e74 - (90 d0 03 00) = 250000mW = 250w
Max TDP limit                           DWORD - 0x00030e78 - (e0 93 04 00) = 300000mW = 300w

Motherboard Default TDP Target          DWORD - 0x00030ea2 - (d0 01 01 00) = 66000mW = 66W
Motherboard Max TDP Limit               DWORD - 0x00030ea6 - (d0 30 01 00) = 78000mW = 78W

PCIe1 Default TDP Target                DWORD - 0x00030ed0 - (d8 53 01 00) = 87000mW = 87W
PCIe1 Max TDP Limit                     DWORD - 0x00030ed4 - (b8 82 01 00) = 99000mW = 99W

PCIe2 Default TDP Target                DWORD - 0x00030efe - (d0 78 02 00) = 162000mW = 162W
PCIe2 Max TDP Limit                     DWORD - 0x00030f02 - (98 ab 02 00) = 175000mW = 175W

Fan Curve Entry 1                       WORD  - 0x0003150a - (40 05) = 1344 rpm
Fan Curve Entry 2                       WORD  - 0x0003150c - (4c 04) = 1100 rpm
Fan Curve Entry 3                       WORD  - 0x0003150e - (80 0a) = 2688 rpm
Fan Curve Entry 4                       WORD  - 0x00031510 - (60 09) = 2400 rpm
Fan Curve Entry 5                       WORD  - 0x00031512 - (50 0b) = 2896 rpm
Fan Curve Entry 6                       WORD  - 0x00031514 - (c0 12) = 4800 rpm

Fan Max RPM                             WORD  - 0x000314e6 - (c0 12) = 4800 rpm
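For anyone who'd rather not do the byte-swapping by hand, here's a minimal Python sketch of reading these little-endian values. The offsets come from the table above; the helper names are mine, and reading an actual dumped ROM is left to the reader:

```python
import struct

def read_dword_le(data: bytes, offset: int) -> int:
    # Little-endian 32-bit value, e.g. a TDP target in milliwatts
    return struct.unpack_from("<I", data, offset)[0]

def read_word_le(data: bytes, offset: int) -> int:
    # Little-endian 16-bit value, e.g. a fan curve RPM entry
    return struct.unpack_from("<H", data, offset)[0]

# Using the raw bytes quoted in the table above, not a real BIOS dump:
print(read_dword_le(bytes.fromhex("48e80100"), 0))  # 125000 mW = 125 W min TDP
print(read_word_le(bytes.fromhex("c012"), 0))       # 4800 rpm fan max
```

With a real dump you'd do `data = open("bios.rom", "rb").read()` and pass the offsets from the table, e.g. `read_dword_le(data, 0x00030E70)`.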


----------



## KingEngineRevUp

What's the trick to hitting 1.092V? Very low temperatures?


----------



## Addsome

Hey guys, my card sometimes reaches 51C with the 1080 hybrid cooler and thus downclocks. I want to keep the temps below 50C, and I was thinking of adding two NF-F12 fans to the hybrid kit. Should I connect these two fans to my mobo using a splitter? Will that make them run at 100%? I don't know if I should get the normal NF-F12s, the NF-F12 2000 Industrial, or the 3000 rpm ones. Or should I be looking at other fans, since the Noctua ones are pretty expensive?


----------



## ThingyNess

Quote:


> Originally Posted by *SlimJ87D*
> 
> What's the trick to hitting 1.092V? Very low temperatures ?


That and staying far away from the board power limit, which unfortunately is hard to do on an FE if you're benchmarking or running it hard. It already bumps up against the power delivery limits pretty quick even at stock voltage, and raising the voltage just makes it hit the power delivery brick wall that much quicker.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *ThingyNess*
> 
> That and staying far away from the board power limit, which unfortunately is hard to do on an FE if you're benchmarking or running it hard. It already bumps up against the power delivery limits pretty quick even at stock voltage, and raising the voltage just makes it hit the power delivery brick wall that much quicker.


Yeah, I can't sustain 2101 MHz; the card is hitting power limits and the voltage drops all the way down to 1.04, which causes it to crash in Witcher 3.

Although benchmarking in Heaven is possible at 2114 MHz for me, I have hit the power-limit wall. I don't want to do the shunt mod either, so I'll probably back my OC down to 2088 or 2075 and be happy with it.


----------



## Jbravo33

Now if I could just get this curve to work on both cards!!!

2113.5


----------



## caenlen

Wish I could have joined this round of fun, but I am waiting for GDDR6 or HBM2 with next year's round-up, and DDR5 is rumored to be out next year as well, overtaking DDR4 slots with double the bandwidth... hopefully early 2019 at the latest. Until then my laptop will hold me over since I am traveling anyway.

Was fun reading this thread though, got my OC itch going :D

Side note: I got my GTX 1070 laptop hitting 2000 core while gaming.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Jbravo33*
> 
> Now if I could just get
> this curve to work on both cards!!!


Are you running an FE? Because without the shunt mod, it's useless. After playing Witcher 3 at 2101, I get power-limiting issues: hitting 120% and crashing if the power goes above that. Voltage is unstable too.


----------



## Jbravo33

Quote:


> Originally Posted by *SlimJ87D*
> 
> Are you running a FE? Because without the shunt mod, it's useless. After playing Witcher 3 at 2101, I get power limiting issues. Hitting 120 and crash if the power goes above 120. Voltage is unstable also.


No shunt. Can you post a link on how to do that? Might mess with it in the future. I'm doing this purely just to bench the card; when I game I set power to 110, voltage to 50, core +125, mem +300.


----------



## illypso

In stock for Canada.

Damn, could not resist, just bought a second card...

Do I need a special SLI bridge, or will the one I used before with my GTX 970 work? (It's the basic single bridge.)

https://www.newegg.ca/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007708%20601294835%20601295933

Edit: From what I found on the net, at 1440p it was not making a huge difference in most games on a 1080. Don't know about the 1080 Ti though. I will use the old one to start and compare benches with others to see if it really makes a difference.


----------



## KedarWolf

If my card sits vertically in my case (motherboard flat, horizontal), doing a shunt mod under an EK block means the liquid metal will run and ruin my card, right?


----------



## bfedorov11

Quote:


> Originally Posted by *KedarWolf*
> 
> If my card sits vertically in my case, motherboard sits flat, horizontal, doing a shunt mod under an EK block the liquid metal will run, ruining my card, right?


Not unless you put way too much on them. My 1080 was vertical and didn't have a problem. The CLU ended up drying and I couldn't get it off, though. Decided not to do it on the Ti... not much to gain.


----------



## KingEngineRevUp

Hitting 2101 MHz isn't impressive to me anymore. It's not realistic for gameplay due to the power limiting.

Witcher 3, maxed everything, in Toussaint is a true benchmark of a stable OC. Your power limit will be near 120% the majority of the time.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> If my card sits vertically in my case, motherboard sits flat, horizontal, doing a shunt mod under an EK block the liquid metal will run, ruining my card, right?


You can put hot glue around the resistor, apply the CLU, then hit it with a hair dryer to soften the hot glue and stick a piece of wood on top to close it off so it can't run out ;D


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Hitting 2101 MHz isn't impressive to me anymore. It's not realistic for gameplay due to the power limiting.
> 
> Witcher 3 max everything in toussiant is a true benchmark to a stable OC. Your power limits will nearly be at 120% the majority of the time.


I can run at 2100 for benching, but my go-to everyday clock is 2062 at 1.062v; it can do a full Heaven bench and only dips for a few seconds the entire run under water. At 1.093v I can do 2083, but for 24/7 use it jumps up and down between 1.093v and 1.050v and everywhere in between.

Better a stable 2062 core, I think, than clocks all over the place.


----------



## Jbravo33

I can go in peace now!

So with the curve, one card hits 2113.5 and the other maxes out at 2088. Probably best, if you plan to SLI, to buy two of the same and hopefully close together in serial number on the same SKU.


----------



## mcg75

Quote:


> Originally Posted by *PasK1234Xw*
> 
> EVGA is notorious for binning cards even though they won't admit it. I tried Nvidia this time around but didn't get great results. I currently have an MSI FE that barely holds 2050. Now that I'm seeing almost a 10 fps increase at 2100, I'm tempted to return it and try again.


FE launch cards are luck of the draw. I'm pretty certain they are all assembled by Nvidia and shipped to each vendor as-is.

Binning may come into play once you see their custom PCB designs, as they are only buying chips at that point.


----------



## KedarWolf

Being obsessive about these things, I've found that under water, no shunt mod, voltage slider maxed out, the trick is to find the best clock speed you can hold at the lowest possible voltage, for sustained voltage and core speed.

If I run my core at +177 (2062) at 1.031v, I get a solid 1.031v and 2062 core through a full 1920x1080 Heaven run. This is the lowest I can go on voltage with zero driver crashes at +177 core.

If you look at my GPU-Z and Afterburner logs, not once did I dip below 1.031v or 2062 core.

GPU-ZSensorLog.txt 91k .txt file

HardwareMonitoring.zip 3k .zip file

With 2093 core at 1.093v it bounced up and down between 1.050v and 1.093v, clock speeds changing along with it, of course.

So while you can brag about maximum clocks (I got 2100 benching), for everyday 24/7 scenarios a balance between voltage and clock speed seems to be key.

Hope this helps some. I'm reluctant to do the shunt mod since my video card sits vertical and I'm not going to have the CLU run under my EK block and ruin my card.


----------



## ThingyNess

KedarWolf: Just out of curiosity, what sort of clock speeds can you sustain in a Furmark run with the water cooling?

(both at stock 100% TDP slider and with it maxed at 120%?)

Just trying to get a sense of just how TDP limited these cards are in pure compute loads on the FE boards.

As a comparison datapoint I can sustain 1911mhz/0.95v on my Strix 1070 indefinitely on air in FurMark, but that's with a 200w TDP limit (33% over a stock 1070!) and even then it's still running hard up against the TDP limiter. In games and other things it sits between 2025 and 2050mhz.


----------



## KedarWolf

Quote:


> Originally Posted by *ThingyNess*
> 
> KedarWolf: Just out of curiosity, what sort of clock speeds can you sustain in a Furmark run with the water cooling?
> 
> (both at stock 100% TDP slider and with it maxed at 120%?)
> 
> Just trying to get a sense of just how TDP limited these cards are in pure compute loads on the FE boards.
> 
> As a comparison datapoint I can sustain 1911mhz/0.95v on my Strix 1070 indefinitely on air in FurMark, but that's with a 200w TDP limit (33% over a stock 1070!) and even then it's still running hard up against the TDP limiter. In games and other things it sits between 2025 and 2050mhz.


FurMark isn't good to stress test with. It can stress a card too much and actually cause harm.

Here's a guide I made, though; it's pretty detailed, and after much tweaking it's the best way to do my card under water with no shunt mod. Check it out, interesting read.

http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> Furmark isn't good to stress test with. It can stress a card too much and actually cause harm.
> 
> Here is a guide I made though, it's pretty detailed, and after much tweaking the best way to do my card under water no shunt mod. Check it out, interesting read.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20


good guide. rep'd.


----------



## KedarWolf

This is the guide I made for peeps.

http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20#post_25977285

Do this first.

http://www.overclock.net/t/1625653/how-to-get-voltage-slider-in-afterburner-working-on-a-1080-ti/0_20

Next, hit Ctrl+F in Afterburner to open the custom voltage curve, then Ctrl+D to set it to defaults. Then hold Shift, drag it to, say, +145 (1999 core at the 1.050v point) and hit Apply in Afterburner.

It'll show around 1999 core at 1.062v in Afterburner when running Heaven. Don't run Heaven yet, though; drag the 1.031v point up level with the 1.050v point and hit Apply.

If you do it right, everything to the right of 1.031v should be in a straight line.

Try lower voltages if you are on an air cooler and want to keep temps down, though you may need to start with lower clock speeds as well.

Keep your memory between +400 and +500 to start even if you can do more.

Open Heaven at a lower resolution than your screen resolution, NOT in full-screen mode, so you can still see Afterburner.

When you run Heaven, if you get no driver crashes and no screen freezing for about three seconds and restarting, that's good. Now, with Heaven running, raise the 1.031v point one notch at a time to higher core speeds (like from +145 to +155), hit Apply, and wait 30 seconds or so between each bump.

Keep doing this until the driver crashes or the screen freezes for three seconds and Heaven restarts. Close Heaven, drop it back down one notch, hit Apply, and reboot. Your frame rate and stability will be compromised until you reboot.

Now do a full benchmark run with Heaven. If nothing crashes, the core is good. If the driver crashes or the screen freezes, just drop it down one more notch. At 1.031v and the max core you can get, you should see zero drops in voltage and core speed; it should stay at 1.031v and whatever core it's at.

You can have GPU-Z running and logging to confirm your core does not drop during the run and that you had no driver crashes. If core and volts drop drastically for a few seconds, then resume, your driver crashed.

Ideally, this is what your final voltage curve should look like, but with the maximum core you determine by this method.



After you get the core stable, run Heaven and hit Shift to pause it at a scene. It'll show the frame rate up top while paused. Stop at a lower-frame-rate scene where the frame rate only fluctuates a few FPS while paused, usually a scene with no clouds or smoke or anything.

Now adjust your memory up or down until you get the frame rate as high as it'll go with no driver crashes or artifacts. HIGHER ISN'T ALWAYS BETTER. I find +642 is a few frames slower than +610.

Your frame rate may only be higher a few FPS at best memory speed but that's fine.

You have now found a low voltage best clock speed compromise for your water/air cooled 1080 Ti.
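The flat-curve idea above can be sketched in code. This is purely a hypothetical illustration (the point values and the `flatten_curve` helper are made up; Afterburner does this through its GUI, not a script): every point at or above the chosen voltage gets pinned to one target clock so Boost can't walk up into higher-voltage bins.

```python
def flatten_curve(curve, pin_voltage, target_clock):
    """curve: list of (voltage_mV, clock_MHz) pairs, ascending voltage.

    Pin every point at or above pin_voltage to target_clock, producing
    the 'straight line to the right' described in the guide above."""
    return [(v, target_clock if v >= pin_voltage else c) for v, c in curve]

# Invented example values, not a real card's stock curve:
stock = [(1000, 1974), (1031, 2000), (1050, 2012), (1062, 2025), (1093, 2050)]
flat = flatten_curve(stock, pin_voltage=1031, target_clock=2062)
# Every point from the 1031 mV entry upward now requests 2062 MHz.
```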











----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> I can run at 2100 for benching but my go to everyday clocks are 2062, at 1.062v, can do a full Heaven bench and only dips a few seconds the entire bench under water. At 1.093v, 2083 for 24/7 use if I did that it jumps up and down from 1.093v to 1.050v and in between.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Better a stable 2062 core I think then clocks all over the place.


I don't think it matters if you have voltage fully unlocked to 1.075; if you play a power-hungry game, it will calculate power draw, voltage, and clocks and push you to a lower bin.

So if you drew a straight line from [email protected] to 1.050V and then a jump from [email protected] to [email protected], your card will calculate which bin to put you in according to your power draw.

The Witcher keeps the TDP close to 120% for me, so it will drop me down to [email protected] automatically in fight scenes, but when I'm walking out and about I'll be at 2075.

At least this is my experience.

Power draw goes up? Boost will make voltage go down. And whatever clock you have at that voltage is where you'll stay until power goes back down again.
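The binning behavior described here can be pictured as a toy model (NOT NVIDIA's actual Boost algorithm; `BINS`, `settle_bin`, and `est_power` are all invented names): the card walks down a voltage/clock table until the estimated board power fits under the limit.

```python
# Toy model of Boost dropping to a lower voltage/clock bin when the
# board power limit is exceeded. The real algorithm is NVIDIA's and
# far more involved; all values below are invented for illustration.
BINS = [(1093, 2088), (1062, 2062), (1031, 2012), (1000, 1974)]  # (mV, MHz), highest first

def settle_bin(power_limit_w, est_power):
    """Walk down the bins until estimated board power fits the limit."""
    for mv, mhz in BINS:
        if est_power(mv, mhz) <= power_limit_w:
            return mv, mhz
    return BINS[-1]  # already at the lowest bin

# With a made-up power model, a tighter limit lands in a lower bin:
est = lambda mv, mhz: mv * mhz / 8000.0
print(settle_bin(300, est))  # generous limit: stays at the top bin
print(settle_bin(275, est))  # tighter limit: steps down one bin
```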


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I can run at 2100 for benching but my go to everyday clocks are 2062, at 1.062v, can do a full Heaven bench and only dips a few seconds the entire bench under water. At 1.093v, 2083 for 24/7 use if I did that it jumps up and down from 1.093v to 1.050v and in between.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Better a stable 2062 core I think then clocks all over the place.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't think it matters if you have voltage fully unlocked to 1.075, if you play a power hungry game then it will calculate power draw , voltage and clocks and push you to a lower bin.
> 
> So if you drew a straight line from [email protected] to 1.050V and then a jump from [email protected] to [email protected], your card will calculate which bin to put you in according to your power draw.
> 
> The Witcher keeps the TDP at close to 120% for me, so it will drop me down to [email protected] automatically in fight scenes but when I'm walking out and about, then I'll be at 2075.
> 
> At least this is my experience.
> 
> Power draw goes up? Boost will make voltage go down. And whatever clocks you have at that voltage will be where you'll stay until power goes back down again.
Click to expand...

See two posts above.


----------



## KingEngineRevUp

*Hmm, honestly I just went back to using the slider.*

Throw on +175 core and +400 memory and it just runs smoother and better. Stable 2075 MHz.

The voltage curve method was cool for hitting 2114 MHz and showing off on here to compensate for my small penis, but that's all it was good for.


----------



## TWiST2k

Quote:


> Originally Posted by *SlimJ87D*
> 
> *Hmm, honestly I just went back to using the slider.*
> 
> Throw +175 and +400 on my memory, and it just runs more smooth and better. Stable 2075 Mhz.
> 
> The voltage curve method was cool to hit 2114 Mhz and show on here to compensate for my small penis but that's all it was good for.


Hahahahaha! The ePeen is all that counts in these forums.


----------



## gunit2004

Ordered an ASUS Strix 1080 Ti yesterday when I saw it was finally available.

The rest of my system is fairly old at the moment, and I have been thinking of upgrading but just haven't gotten around to it.

Currently running an i7 2600K clocked @ 4.6 GHz, 16GB of Corsair Vengeance DDR3-1600 RAM, and a 2GB Sapphire Radeon HD6970... so yeah, I've been running this build for about 6 years now; it has aged well and still runs great.

The only thing I don't like is that the 6970 has gotten annoying with its high temps and loudness. That's part of the reason I even wanted a new video card in the first place. I figured I would future-proof myself, get the brand spankin' new 1080 Ti, and figure out whether I want to upgrade the rest of my stuff later on.

So basically... should I upgrade my system now, or wait, since everything else still runs pretty great? Maybe there is better stuff on the horizon I should wait for?


----------



## Mato87

Quote:


> Originally Posted by *gunit2004*
> 
> Ordered a ASUS Strix 1080ti yesterday when I saw it was finally available.
> 
> The rest of my system is fairly old at the moment and I have been thinking of upgrading but just haven't gotten around to it.
> 
> Currently running a i7 2600k clocked @ 4.6 ghz, 16GB Corsair Vengeance DDR3-1600 RAM & a 2GB Sapphire Radeon HD6970.... So yeah I've been running this build for about 6 years now, it has aged well and still runs great.
> 
> Only thing I don't like is that the 6970 has gotten annoying with it's high temps and loudness. Part of the reason why I even wanted a new video card in the first place. I figured I would future proof myself and get the brand spankin' new 1080ti and figure out if I want to upgrade the rest of my stuff later on.
> 
> So basically... should I upgrade my system now or wait on it as everything else still runs pretty great? Maybe there is better stuff on the horizon I should wait for?


Most definitely upgrade the other components; a new CPU and mobo along with new RAM is a must, otherwise your new card will be bottlenecked by the CPU. We are talking about an increase in both the average framerate and, more importantly, the minimum framerate, which will just skyrocket with a new CPU.


----------



## alucardis666

Quote:


> Originally Posted by *Mato87*
> 
> Most definitely upgrade all the components, new cpu and new mobo along with new ram is a must, otherwise your new card will be bottlenecked by the cpu. We are talking about the increase of both the average framerate and more importantly the min framerate that will just skyrocket with the new cpu.


Agreed!


----------



## alucardis666

And for anyone who was curious my best curve is...


----------



## DStealth

Quote:


> Originally Posted by *gunit2004*
> 
> Ordered a ASUS Strix 1080ti yesterday when I saw it was finally available.
> 
> The rest of my system is fairly old at the moment and I have been thinking of upgrading but just haven't gotten around to it.
> 
> Currently running a i7 2600k clocked @ 4.6 ghz, 16GB Corsair Vengeance DDR3-1600 RAM & a 2GB Sapphire Radeon HD6970.... So yeah I've been running this build for about 6 years now, it has aged well and still runs great.


That moment when your video RAM amount gets close to your system RAM amount... priceless.


----------



## undercoverb0ss

Quote:


> Originally Posted by *SlimJ87D*
> 
> *Hmm, honestly I just went back to using the slider.*
> 
> Throw +175 and +400 on my memory, and it just runs more smooth and better. Stable 2075 Mhz.
> 
> The voltage curve method was cool to hit 2114 Mhz and show on here to compensate for my small penis but that's all it was good for.


Same here; I spent more time tweaking with no improvement over what I was getting with my slider settings.


----------



## G woodlogger

*gunit2004* You should just relax and test it out on the games you play. It depends a lot on what frame rate you want; if it's 60-72, there will only be problems in some places, in some games. More than that, maybe, but you should also think about what your upgrade target would be, and I would wait till Coffee Lake or Ryzen 2/3, when games start to make better use of the extra cores. Greed for the future FTW!


----------



## kevindd992002

Quote:


> Originally Posted by *Mato87*
> 
> Most definitely upgrade all the components, new cpu and new mobo along with new ram is a must, otherwise your new card will be bottlenecked by the cpu. We are talking about the increase of both the average framerate and more importantly the min framerate that will just skyrocket with the new cpu.


I thought this CPU upgrading debate was already addressed a few pages back? Like the others said, an OC'd 2600K will not really bottleneck a 1080Ti at 1440p and above.


----------



## Norlig

So I did this now.

How long should I wait before using it? (Guessing it needs to harden?)


----------



## PasK1234Xw

Quote:


> Originally Posted by *mcg75*
> 
> FE edition launch cards are luck of the draw. I'm pretty certain they are all assembled by Nvidia and shipped to each vendor as such.
> 
> When you see their custom pcb designs is when binning may be involved as they are only buying chips at that point.


It's not luck of the draw when dealing with EVGA, trust me. Yes, the FE cards they receive are from Nvidia, BUT they use the best-testing cards for their SC model.

After a month or so they start manufacturing their own reference PCB and make their own FE cards, and the SC gets shafted by even higher-end cards, as it becomes GPU binning, not just testing whole cards.

The telltale sign is the logo by the PCIe connector, and EVGA even told me they eventually start producing their own FE.

http://i.imgur.com/Vd9DKKX.png

http://i.imgur.com/SkzjIRB.jpg

http://i.imgur.com/vOqPsDR.jpg


----------



## Dasboogieman

Quote:


> Originally Posted by *gunit2004*
> 
> Ordered a ASUS Strix 1080ti yesterday when I saw it was finally available.
> 
> The rest of my system is fairly old at the moment and I have been thinking of upgrading but just haven't gotten around to it.
> 
> Currently running a i7 2600k clocked @ 4.6 ghz, 16GB Corsair Vengeance DDR3-1600 RAM & a 2GB Sapphire Radeon HD6970.... So yeah I've been running this build for about 6 years now, it has aged well and still runs great.
> 
> Only thing I don't like is that the 6970 has gotten annoying with it's high temps and loudness. Part of the reason why I even wanted a new video card in the first place. I figured I would future proof myself and get the brand spankin' new 1080ti and figure out if I want to upgrade the rest of my stuff later on.
> 
> So basically... should I upgrade my system now or wait on it as everything else still runs pretty great? Maybe there is better stuff on the horizon I should wait for?


Short answer: you're fine. Unless you are gunning for 120 FPS-type monitors with no G-Sync, I wouldn't really worry about the CPU.
I myself only noticed a difference from the CPU when I upgraded from a 4.5GHz 3770K to a 4.8GHz 7700K with the GTX 980 Ti (1450MHz) in one single game: FFXIV, while sitting in Idyllshire (basically the central hub town, with at least 50+ player characters being rendered). I went from 45 FPS to a smooth 60 FPS, but I knew this was a CPU bottleneck from the beginning since GPU usage was only 55%.

Long answer: it depends on the games you play and what your expectations are. I only upgraded because a mate was willing to buy my old stuff straight away for $450, so I went ahead with the heavily subsidized cost.


----------



## Mato87

Quote:


> Originally Posted by *kevindd992002*
> 
> I thought this CPU upgrading debate was already addressed a few pages back? Like the others said, an OC'd 2600K will not really bottleneck a 1080Ti at 1440p and above.


Good luck holding a playable minimum framerate in more demanding titles like Battlefield 1 or the new Ghost Recon, not to mention basically all the upcoming titles that can utilize 8 cores very efficiently.


----------



## PasK1234Xw

Really depends on res and framerate. A CPU bottleneck will happen when pushing something like 1440p at 100+ fps. BF1 is the perfect example: their forum is littered with people complaining on i5s and 4c/8t i7s, and 1440p G-Sync monitors are very popular.

Here is my 5960X usage at 1440p/165Hz; the game runs great and holds high, consistent frames.
No way a 2600K would not bottleneck when my 5960X usage can look like this in BF1.


----------



## SkItZo

Does anyone know if the Watercool Titan X Pascal/1080ti block works with the stock backplate? (This block: http://shop.watercool.de/HEATKILLER-IV-for-TITAN-X-Pascal-ACETAL/en)

Cheers


----------



## D1RTYD1Z619

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Really depends on res and frame CPU bottleneck will happes when pushing like 1440 and 100+ fps BF1 is perfect example their forum is littered with people complaining with i5 and 4/8 i7 and 1440 gsync monitors are very popular.
> 
> Here is my 5960x usage at 1440 /165hz game runs great and hold high consistent frames
> No way 2600k would not bottleneck when my 5960x usage can look like this in BF1.


Would you mind DSRing that to 4K and letting me know what your CPU usage is like? I'm not getting the answers I need about BF1 on ultra settings, minus AA, at 4K.


----------



## PasK1234Xw

I'm not home atm; that was a screen I took a while back and posted online. I will though when I get a chance.

Edit:

I'll shoot you a PM when I post it. I'll try for tonight.


----------



## kevindd992002

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Really depends on res and frame CPU bottleneck will happes when pushing like 1440 and 100+ fps BF1 is perfect example their forum is littered with people complaining with i5 and 4/8 i7 and 1440 gsync monitors are very popular.
> 
> Here is my 5960x usage at 1440 /165hz game runs great and hold high consistent frames
> No way 2600k would not bottleneck when my 5960x usage can look like this in BF1.


That makes sense! Can you say this applies to all games if your aim is 144fps (144Hz)?


----------



## pantsoftime

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Its not luck of draw when dealing with EVGA trust me yes their FE they receive are from nvidia BUT they use the best cards tested for their SC model.
> 
> After a month or so they start manufacturing their own reference PCB and make their own FE and sc gets shafted by even higher end cards as it becomes GPU binning not just testing cards


I agree that EVGA definitely tests their cards and sorts them into FE and SC performance levels, as do other manufacturers. Binning is a term better suited to chip-level testing, but your point is valid. Steve covers this pretty well in the latest Ask GN, where he actually talked to EVGA and Asus about this practice:


----------



## mcg75

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Its not luck of draw when dealing with EVGA trust me yes their FE they receive are from nvidia BUT they use the best cards tested for their SC model.
> 
> After a month or so they start manufacturing their own reference PCB and make their own FE and sc gets shafted by even higher end cards as it becomes GPU binning not just testing cards
> 
> Tell tail is the logo by the PCIE and EVGA even told me they eventually start produce their own FE.


The only thing an SC card gets tested for is holding the higher clocks within the same thermal envelope.

The message I originally responded to was talking about hitting 2100 MHz.

In that sense, it's still luck of the draw. There is no guarantee an SC card will OC higher than a normally-clocked one.


----------



## Mato87

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Really depends on res and frame CPU bottleneck will happes when pushing like 1440 and 100+ fps BF1 is perfect example their forum is littered with people complaining with i5 and 4/8 i7 and 1440 gsync monitors are very popular.
> 
> Here is my 5960x usage at 1440 /165hz game runs great and hold high consistent frames
> No way 2600k would not bottleneck when my 5960x usage can look like this in BF1.


That CPU you have is a true 8-core/16-thread processor, which means you won't run into any kind of bottleneck in any of the new games coming out in the foreseeable future. We are talking about mainstream CPUs, like the older i7 2600K he currently has vs. the Skylake i7 6700K or newer Kaby Lake i7 7700K. I myself run into bottlenecks from time to time on certain maps in Battlefield 1, where I see 100% utilization on all 8 cores. Same goes for games like Witcher 3 or the new Ghost Recon.


----------



## Norlig

Great Success on modifying the Shunt resistors on my end







, Max TDP now at 89%



GPU Voltage now always at 1.081 - 1.093 , but majority of the time 1.093









Haven't played with the core clock much yet, as I am waiting on some copper shims so I can mount my Accelero Hybrid.
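For anyone weighing the same mod, the arithmetic behind it is simple: the power controller infers current from the voltage drop across a tiny shunt resistor, and soldering a second resistor in parallel lowers the sensed resistance, so the card under-reports its real draw. A rough sketch with illustrative values (not measured from any particular card):

```python
# Arithmetic behind a shunt mod (illustrative values, not a real card's).
# The power controller infers current from the voltage drop across a tiny
# shunt resistor; a second resistor in parallel lowers the sensed
# resistance, so the card under-reports its actual draw.

def parallel(r1: float, r2: float) -> float:
    """Effective resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def reported_power_fraction(r_stock: float, r_added: float) -> float:
    """Fraction of real power the card reports after the mod.
    The reported figure scales with the sensed resistance (V = I * R)."""
    return parallel(r_stock, r_added) / r_stock

# Stacking an identical resistor (say 5 mOhm on 5 mOhm) halves the sensed
# resistance, so a real 250 W draw would read as roughly 125 W.
print(reported_power_fraction(0.005, 0.005))
```

This is why a modded card showing "89% TDP" can actually be pulling well past its rated limit, and why good cooling matters before doing it.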


----------



## Benny89

1440p 165Hz. Upgrading from 4790k DDR3 or not for 1080 Ti?


----------



## keikei

Quote:


> Originally Posted by *Benny89*
> 
> 1440p 165Hz. Upgrading from 4790k DDR3 or not for 1080 Ti?


Upgrade to 1440p 165hz?


----------



## Benny89

Quote:


> Originally Posted by *keikei*
> 
> Upgrade to 1440p 165hz?


Sorry, no (my bad). Upgrading CPU and RAM for 1080Ti.


----------



## keikei

Quote:


> Originally Posted by *Benny89*
> 
> Sorry, no (my bad). Upgrading CPU and RAM for 1080Ti.


If you're just gaming, it'd be a waste of an upgrade. I'm running a stock 3770K and I see around 50% CPU usage at 4K. If you have a workstation or a ton of background apps running, I can see going Ryzen. I can see the allure, though. Ryzen is a very good budget upgrade right now. The new R5 lineup is VERY attractive, not gonna lie.


----------



## Mato87

Quote:


> Originally Posted by *keikei*
> 
> If youre just gaming it'd be a waste of an upgrade. I'm running a stock 3770k and I see around 50% cpu usage @4k. If you have a workstation or a ton of background apps running i can see going ryzen. I can see the allure though. Ryzen is a very good budget upgrade right now. The new R5 lineup is VERY attractive, not gonna lie.


it depends on what it is that you're running at 4K with 50% cpu usage, if it's solitaire, then that's alright I guess.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Mato87*
> 
> That cpu you have is a true 8 core / 16 thread processor, which means you won't run into any kind of bottlenecks in any of the new games that will come out in the foreseeable future. We are talking about mainstream cpu's like the older i7 2600k he has currently vs skylake i7 6700k or newer kabylake i7 7700k. I myself run to bottlenecks from time to time on certain maps in battlefield 1 where I see 100% utilization on all 8 cores. Same goes for games like witcher 3 or the new ghost recon.


But aren't you playing at [email protected] or something like that? Yeah that would definitely cause CPU bottlenecking.


----------



## keikei

Quote:


> Originally Posted by *Mato87*
> 
> it depends on what it is that you're running at 4K with 50% cpu usage, if it's solitaire, then that's alright I guess.


That's my experience, unless you have some other info. Please enlighten me.


----------



## Benny89

Quote:


> Originally Posted by *keikei*
> 
> If youre just gaming it'd be a waste of an upgrade. I'm running a stock 3770k and I see around 50% cpu usage @4k. If you have a workstation or a ton of background apps running i can see going ryzen. I can see the allure though. Ryzen is a very good budget upgrade right now. The new R5 lineup is VERY attractive, not gonna lie.


I only game at 1440p 165Hz. Nothing else


----------



## keikei

Quote:


> Originally Posted by *Benny89*
> 
> I only game at 1440p 165Hz. Nothing else


You want some gaming numbers @ 1440p?


----------



## Benny89

Quote:


> Originally Posted by *keikei*
> 
> You want some gaming numbers @ 1440p?


7700K vs 4790K? Sure, that'd be cool!







thx


----------



## keikei

Quote:


> Originally Posted by *Benny89*
> 
> 7700k vs 4790k? Sure, that be cool!
> 
> 
> 
> 
> 
> 
> 
> thx


This is using a 1080:





----------



## Boost240

So I'm sorta let down right about now. Got the card on Friday and installed it Saturday night. First thing I noticed was an annoying whine, which I found out to be the infamous coil whine. It's very noticeable and annoying. Also, I think I expected far too much from this card, even for my 1080p 144Hz display. I was hoping I would be able to max out all my games and hit at least 120 fps so I can use ultra low motion blur, but games like Rise of the Tomb Raider still get dips into the 90s and even the 70s. What's weird is I notice sometimes that my GPU usage is pretty low.

I'm not really sure what to do. I can return the card to Amazon and hope to get another, but who knows if the whine will be better. I just find it a bit disappointing that my much cheaper 460 and 970 cards never had this issue at all, yet a $1100 card does.







I plan on getting a water block, but apparently that won't help.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Boost240*
> 
> So I'm sorta let down right about now. Got the card on Friday and installed it Saturday night. First thing I noticed was an annoying whine, which I found out to be the infamous coil whine. It's very noticeable and annoying. Also, I think I expected far too much from this card, even for my 1080p 144Hz display. I was hoping I would be able to max out all my games and hit at least 120 fps so I can use ultra low motion blur, but games like Rise of the Tomb Raider still get dips into the 90s and even the 70s. What's weird is I notice sometimes that my GPU usage is pretty low.
> 
> I'm not really sure what to do. I can return the card to Amazon and hope to get another, but who knows if the whine will be better. I just find it a bit disappointing that my much cheaper 460 and 970 cards never had this issue at all, yet a $1100 card does. I plan on getting a water block, but apparently that won't help.


Low GPU usage is due to a CPU bottleneck.


----------



## Vlada011

ASUS ROG Poseidon GTX1080Ti will be available soon.





https://www.asus.com/Graphics-Cards/ROG-POSEIDON-GTX1080TI-P11G-GAMING/?_ga=1.251805184.827311713.1490866924

What do you think? To me it's attractive and nice; I could see myself using this one.


----------



## mcg75

Quote:


> Originally Posted by *Mato87*
> 
> it depends on what it is that you're running at 4K with 50% cpu usage, if it's solitaire, then that's alright I guess.


As long as the gpu usage is hitting 99-100% when needed, cpu usage really doesn't matter because it's already feeding the gpu all it can take.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> And for anyone who was curious my best curve is...


Mine and your cards are similar. Mine doesn't even want to hold 2000 with a slider OC, but with the curve I get a solid 2025-2037-2050 now (depending on temps). Feel a lot better about her now.
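For anyone who hasn't tried the curve editor: the trick amounts to raising one voltage/frequency point to your target clock and flattening every point to its right, so the card parks on that clock instead of bouncing between boost bins. A toy sketch of the flattening step, with made-up curve values (not a real card's V/F table):

```python
# What the Afterburner curve-editor trick amounts to: pick a voltage
# point, raise its frequency to the target clock, and flatten every
# higher-voltage point to the same clock so the card parks there.
# Curve values below are made up, not a real V/F table.

def flatten_curve(points, lock_mv, lock_mhz):
    """Clamp every point at or above lock_mv to lock_mhz."""
    return [(mv, mhz if mv < lock_mv else lock_mhz) for mv, mhz in points]

stock = [(900, 1850), (950, 1911), (1000, 1949), (1050, 1987), (1093, 2012)]
locked = flatten_curve(stock, 1000, 2050)
print(locked)  # points from 1000 mV up now all sit at 2050 MHz
```

Since the card never boosts past the flattened clock, it stops hopping bins when the temperature changes, which is why the curve holds steadier than a plain slider offset.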


----------



## mtbiker033

ok just received my evga FE and installed it









did one run of Valley at stock settings with just a fan curve set up @ ultra 1440p, no AA and got:



temps maxed out at 65C

eww my gpu usage graph:
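The fan curve mentioned above is just a piecewise-linear map from GPU temperature to fan duty; Afterburner and Precision interpolate between the points you drag. A minimal sketch with made-up points (not anyone's actual settings):

```python
# A custom fan curve is a piecewise-linear map from GPU temperature to
# fan duty. Points below are made up, not anyone's actual settings.

CURVE = [(30, 30), (50, 45), (65, 70), (80, 100)]  # (temp C, fan %)

def fan_percent(temp_c, curve=CURVE):
    """Linearly interpolate the fan duty for a given temperature."""
    if temp_c <= curve[0][0]:
        return float(curve[0][1])
    if temp_c >= curve[-1][0]:
        return float(curve[-1][1])
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(65))  # 70.0
```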


----------



## Slackaveli

Quote:


> Originally Posted by *G woodlogger*
> 
> *gunit2004* You should just relax and test it out on the games you play. It depend a lot what frame rate you want, if it is 60-72 there will only be problems in some places, in some games. More that that, maybe but you should also think about what you upgrade target would be and i would wait till Coffee lake or Ryzen 2/3 when games start to better make use of it with cores and compatibility. Greed for the future FTW!


If he were to wait, it should be for the 7740K, very obviously if gaming. That thing will hit close to 6.0GHz.

Y'all get to posting so I'm not always triple posting, damnit


----------



## Slackaveli

Quote:


> Originally Posted by *Mato87*
> 
> Good luck holding a playable min framerate in more demanding titles like Battlefield 1 or the new ghost recon. Not to mention basically all the upcoming titles that can utilize 8 cores very efficiently.


I agree. I noticed a nice bump just dropping in a 5775C over a 4790K, ESPECIALLY in the crucial 0.1% lows and minimum frames in general. Overall, a MUCH smoother experience if you aren't running 90% CPU usage.


----------



## Blotto80

Think I just settled into my 24/7 clocks. 2050 @ 1.025v. Just gone through a few loops of Heaven and it hasn't dipped from 2050 at all and is topping out at 48c (EVGA Hybrid AIO with 2xSP120 in push/pull). I could get it higher for benches but the clocks couldn't hold at my highest (2090ish), I'd rather have a stable clock at a lower voltage than go all out.


----------



## Slackaveli

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Really depends on res and framerate. A CPU bottleneck will happen when pushing something like 1440p and 100+ fps. BF1 is the perfect example: their forum is littered with complaints from people with i5s and 4c/8t i7s, and 1440p G-Sync monitors are very popular.
> 
> Here is my 5960X usage at 1440p/165Hz. The game runs great and holds high, consistent frames.
> No way a 2600K would not bottleneck when even my 5960X usage can look like this in BF1.


In that game a 2600K bottlenecks a 1080, and no doubt a 1080 Ti will too. That said, he could still get a little more life out of that 2600K at 4K. At that rez it would be fine for a bit longer, until you need higher frames. It's all about higher frames. If you game at 60, less CPU is needed. Hitting 144 at 1080p/1440p/and soon 4K is very hard on a CPU AND a GPU.


----------



## Slackaveli

Quote:


> Originally Posted by *kevindd992002*
> 
> That makes sense! Can you say that this is applicable for all games if your aim is 144fps (144Hz)?


Not necessarily, as BF1 is a CPU hog. It is also a game that will utilize all your cores, and it CRUSHES i5s.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> 1440p 165Hz. Upgrading from 4790k DDR3 or not for 1080 Ti?


I can't resist









In YOUR case, buy a 5775C and sell your 4790K. Net ~$75 upgrade. Results like these...

https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,38

nasty

Seriously, IF you have the money to build a complete new rig, WAIT for the 7740K. For real. If you are getting new, it might as well be the future, and THAT for gaming (for a while) will be either Skylake-X or Kaby-X. That Kaby-X will be 4c/8t @ 135W TDP (vs the 95W TDP of the 7700K, which can already hit 5.2GHz) and may hit 6.0GHz+ overclocked with that 40% extra juice. Thing will be a monster.

If you want the smartest upgrade that will definitely give you max performance for the cheapest amount, DEFINITELY go with the 5775c for $350 and drop it right in and then sell your 4790k on Ebay for $275~. THAT is a great plan for folks in your shoes.

If you are mostly using it for workstation stuff and gaming is secondary, then Ryzen is pretty amazing for that usage. Just not so much for gaming at 1080p (although at 4K it should be pretty great). Word is it plays very smooth, like the 5775C does. The extra cache helps smooth everything out a lot.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Low GPU usage is due to a CPU bottleneck.


Usually. But it could be a RAM bottleneck, too. Low GPU usage with high CPU usage is a bottlenecked CPU for sure, though.
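That diagnosis (GPU-bound vs CPU-bound vs something else, with the caveat that a single pegged core counts as a CPU bind even when the average looks low) can be sketched as a simple rule of thumb. The thresholds here are illustrative, not canonical:

```python
# Rule-of-thumb read on usage numbers from an OSD (e.g. Afterburner).
# Thresholds are illustrative, not canonical; a single pegged core
# counts as a CPU bind even when the average looks low.

def classify(gpu_util, cpu_per_core):
    """Guess the limiting component from utilization percentages."""
    avg_cpu = sum(cpu_per_core) / len(cpu_per_core)
    if gpu_util >= 97:
        return "GPU-bound"     # what you want: the GPU is fully fed
    if max(cpu_per_core) >= 97 or avg_cpu >= 90:
        return "CPU-bound"     # one pegged core or all cores busy
    return "check RAM/engine"  # headroom everywhere: look at RAM speed,
                               # frame caps, or background tasks

print(classify(80, [100, 40, 35, 30]))  # CPU-bound: one core is pegged
```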


----------



## xNutella

that moment when you hear your door buzzer.


----------



## Slackaveli

BRO!! Solid delivery right there and definitely not one to leave on the front step!


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> I can't resist
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In YOUR case, buy a 5775c and sell your 4790k. Net $75~ upgrade. results like these....
> 
> https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,38
> 
> nasty
> 
> Seriously, IF you have the money to build a new complete rig, WAIT for the 7740K. For real. If you are getting new it might as well be the future, and THAT for gaming (for a while) will be either Skylake-X or Kaby-X. That Kaby-X will be 4c/8t @ 135W TDP (vs the 95W TDP of the 7700K, which can already hit 5.2GHz) and may hit 6.0GHz overclocked with that 40% extra juice. Thing will be a monster.
> 
> If you want the smartest upgrade that will definitely give you max performance for the cheapest amount, DEFINITELY go with the 5775c for $350 and drop it right in and then sell your 4790k on Ebay for $275~. THAT is a great plan for folks in your shoes.
> 
> If you are mostly using it for workstation stuff and gaming is secondary, than RyZen is pretty amazing for that usage. Just not so much for gaming (although at 4k it should be pretty great).


While I acknowledge the 5775C's power, I think this is an even bigger sidegrade than upgrading to a 7700K.

Also, from what I see the 7740K is only 100MHz faster at stock than the 7700K. And it will require (again...) a new mobo.

Why the hell do we need a new mobo for new CPUs every 2 years... seriously... I understand going from DDR3 to DDR4, but a new mobo just for new CPUs... ugh...

Thanks for that link btw, but those were 1080p benches. Please show me how the 5775C vs 4790K vs 7700K looks at 1440p.

Thanks for information!


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> While I acknowledge 5775c power I think this is even bigger sidegrade than upgrading to 7700k.
> 
> Also 7740k from what I see is only 100hz faster on stock than 7700k. And it will require (again...) new MOBO.
> 
> Why the hell every 2 years we need new MOBO for new CPUs....seriously... I understand getting from DDR3 to DDR4 but new MOBO just for new CPUs....ughh....
> 
> Thanks for that link btw but that were 1080p benches. Please show me how 5775c vs 4790k vs 7700k is on 1440p.
> 
> Thanks for information!


He has to buy a new mobo anyway, though (unless he does the sensible 5775C upgrade; THAT is what makes it a great upgrade, a child could do it and it's CHEAP). And the new one is the future platform, whereas Z170/Z270 is dying. (Intel are dirty for that.)

I can't show you benches as I don't have the 4790K available anymore, but I can tell you that I know 5 people who have taken my advice on this one, and to a man they were super excited about it. For $75 net...

How have you seen any tests on the 7740K? There are none, I thought. I get that stock it may be 100MHz or whatever, but all that HEADROOM with that juice, right? It wouldn't be possible for it to clock only 100MHz higher than the 7700K with all that extra power. Am I missing something? These new Intel chips are the answer to Ryzen and WILL be the best (at gaming), 100% guaranteed. Whether Skylake-X or Kaby-X turns out the best, I don't know. Maybe the 6-core will be best, but I have a feeling the 7740K will be a BEAST and I would bet a golden chip will hit 6.0GHz.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> he has to buy a new mobo anyway though (unless he does the sensible 5775c upgrade. THAT is what makes it a great upgrade. A child could do it and it CHEAP). And the new one is the future platform where-as z170/z270 is dying. (Intel are dirty for that)
> 
> I cant show you benches as I dont have the 4790k available anymore but I can tell you that I know 5 people who have taken my advice on this one and to a man they were super excited about it. For $75 net....
> 
> How have you seen any tests on the 7740k? there are none I thought. It wouldnt be possible for it to clock only 100mhz higher than the 7700k with all that extra power. Am I missing something? These new Intel chips are the answer to Ryzen and WILL be the best, 100% guaranteed. Whether or not the Skylake-x or Kaby-X turns out the best, i dont know. Maybe the 6 core will be best, but, I have a feeling that 7740k will be a BEAST and I would bet a golden chip will hit 6.0Mhz.


But they talk about it, and its base clock right now is only 100MHz more. Also, I did not see that it has anything significantly better than the 7700K, while requiring a new mobo. Besides hitting 6.0GHz? It is just another improved 14nm chip.

http://hothardware.com/news/intel-kaby-lake-x-core-i7-7740k-and-x299-platform-details-leak


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> But they talk about it and it's base right now is only 100hz more. Also I did not see that it has anything better (in significant manner) than 7700k. While requiring new Mobo. Besides hitting 6.0 GHz? It is just another 14nm improved chip.
> 
> http://hothardware.com/news/intel-kaby-lake-x-core-i7-7740k-and-x299-platform-details-leak


I think you are overlooking the obvious: 95W vs 135W.

I understand that someone with a 7700K of course won't upgrade unless they are "that guy", but for people on an old platform I think the few months' wait for a new platform and the 7740K would be smart. Or Skylake-X, which has the potential to be amazing too. And if you are on the new platform it should leave you with an upgrade path, which obviously the 5775C and 7700K do not; those are as high as those platforms will let you climb. And that is kind of why I mentioned all of this to begin with: a lot of people think the 4790K is the top of the line on Z97, but that is obviously not true, as the 5775C is a nice upgrade. Better perf, the magic cache, better temps, less power draw.

I guess it's possible that Intel has a poor showing in the Ryzen aftermath, but that would be a pathetic time to fumble. The way they appear to be unconcerned about Ryzen leads me to believe the answer is already coming.

The fact still stands that one just has to move up in rez to buy their CPU a couple more years of relevancy. At 4K, even the 2600K will do decently.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> mine and your cards are similar. they dont even want to hold 2000 with slider OC but with the curve i get a solid 2025-2037-2050 now (depending on temps). Feel a lot better about her now


Similar TVs, similar cards now too? Just admit it. You're me from another dimension.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Similar TVs, similar cards now too? Just admit it. You're me from another dimension.


Doppelganger, lol. Are you all tatted up, too? lol


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> doppleganger. lol. Are you all tatted up, too? lol


Nope.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Nope.


and it ends...







You probably need clean skin to keep making all that money


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> and it ends...











Quote:


> Originally Posted by *Norlig*
> 
> Great Success on modifying the Shunt resistors on my end
> 
> 
> 
> 
> 
> 
> 
> , Max TDP now at 89%
> 
> 
> 
> GPU Voltage now always at 1.081 - 1.093 , but majority of the time 1.093
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Havent played with the Core that much yet, as I am waiting on some copper shims so I can mount my Accelero Hybrid


That's great! I'm debating doing it to my card; I've seen the TDP go up as high as 140%.


----------



## KedarWolf

Quote:


> Originally Posted by *Norlig*
> 
> Great Success on modifying the Shunt resistors on my end
> 
> 
> 
> 
> 
> 
> 
> , Max TDP now at 89%
> 
> 
> 
> GPU Voltage now always at 1.081 - 1.093 , but majority of the time 1.093
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Havent played with the Core that much yet, as I am waiting on some copper shims so I can mount my Accelero Hybrid


Your card sits vertically?


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That's great! I'm debating doing it to my card, I've seen the TDP got up as high as 140%


Yeah, me too. My question is, can it be cleaned up like regular thermal paste? I don't want to void my warranty just for a number in the top left corner of my monitor. The only number that really matters up there is frametime, and I'm good on that. I admit my OCD wants to see nice round numbers and consistency, though.


----------



## Boost240

Quote:


> Originally Posted by *Slackaveli*
> 
> usually. But, it could be a RAM bottleneck, too. low gpu usage with high cpu usage is a bottlenecked cpu for sure though.


What percentage of cpu usage would you consider high?


----------



## Slackaveli

Really depends on the game, tbh. I experienced it on my son's rig in Arma 3: he was getting ~50% CPU and GPU usage but only 30s framerates. I upgraded his RAM from 1333 to 1866 C9, and he now gets 50s fps and much higher usage, but if his AMD setup allowed, he'd still benefit from faster RAM, up to around 2400 on DDR3 setups.
In BF1 on a 4690K/980 Ti setup I was hitting 100% CPU usage and ~80% GPU usage, so that was a classic CPU-bound scenario right there.

What numbers are you seeing?
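The bandwidth math behind that 1333 → 1866 swap is straightforward: peak theoretical DDR bandwidth is transfers per second times bus width times channels (timings matter too; this is only the headline number). A quick sketch:

```python
# Peak theoretical DDR bandwidth: transfers/s x bus width x channels.
# Timings matter too; this is only the headline number.

def ddr_bandwidth_gbs(mt_per_s, channels=2, bus_bytes=8):
    """GB/s for a DDR kit rated at mt_per_s (e.g. DDR3-1866 -> 1866),
    with 64-bit (8-byte) channels."""
    return mt_per_s * channels * bus_bytes / 1000

print(ddr_bandwidth_gbs(1333))  # ~21.3 GB/s dual channel
print(ddr_bandwidth_gbs(1866))  # ~29.9 GB/s, roughly 40% more
```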


----------



## Norlig

Quote:


> Originally Posted by *KedarWolf*
> 
> Your card sits vertically?


Nope, it sits horizontally; the motherboard is vertical, like most PCs.









I read that I should not do it if the GPU sits vertically, but mine doesn't, so I'm good!

It was nerve-wracking starting it up for the first time though; I had to film it in case it went boom, so I could share.


----------



## rcfc89

Quote:


> Originally Posted by *Slackaveli*
> 
> I think you are overlooking the obvious 95w vs 135w .
> 
> I understand that for someone with a 7700k, of course they wont upgrade unless they are "that guy", but for people on a old platform i think the few months wait for a new platform and the 7740k would be smart. Or Skylake-X, which has the potential to be amazing too. And if you are on the new platform it should leave you with an upgrade path, which obviously 5775c and 7700k do not. Those are as high as those platforms will let you climb. And that is kind of why I mentioned all of this to begin with, because a lot of people think that 4790k is the top of the line on z97, but that is obviously not true as the 5775c is a nice upgrade. Better perf, has magic cache, better temps, less power draw.
> 
> I guess it's possible that Intel has a poor showing in the RyZen aftermath but that would be a pathetic time to fumble . They way they appear to be unconcerned about Ryzen leads me to believe the answer is already coming.
> 
> Fact is still true that one just has to move up in rez to buy their cpu a couple years of relevancy. At 4k, even the 2600k will do decently.


I'm likely waiting until August and going with the 7740k for my new build. Skylake-X looks appealing but with a $1500 price tag it's likely overkill for a strictly gaming rig.


----------



## fisher6

If Ryzen is gonna continue on the path it is now then that's my next CPU upgrade. The fact that I don't need to change to a new socket is a huge plus.


----------



## Slackaveli

both good plans, guys.


----------



## PasK1234Xw

Quote:


> Originally Posted by *Mato87*
> 
> That cpu you have is a true 8 core / 16 thread processor, which means you won't run into any kind of bottlenecks in any of the new games that will come out in the foreseeable future. We are talking about mainstream cpu's like the older i7 2600k he has currently vs skylake i7 6700k or newer kabylake i7 7700k. I myself run to bottlenecks from time to time on certain maps in battlefield 1 where I see 100% utilization on all 8 cores. Same goes for games like witcher 3 or the new ghost recon.


You literally just said:
Quote:


> Originally Posted by *Mato87*
> 
> Good luck holding a playable min framerate in more demanding titles like Battlefield 1 or the new ghost recon. Not to mention basically all the upcoming titles that can utilize 8 cores very efficiently.


I posted to show how well the game utilises an 8-core CPU. If you were referring to a 4c/8t i7, then don't call it 8 cores, because it's not 8-core and that just causes confusion.

Also, contrary to what you think, I still run into some bottlenecks, even more so when I ran 1080 SLI. It's more due to the games' CPU utilization than a hardware problem, but it still happens even on a 5960X when pushing high frames at 1440p. It's not bad really, but it does happen.


----------



## mtbiker033

Ok, after playing a few games, this thing is amazing.

Squad maxed out with a solid 100-108 fps...

DOOM on Vulkan was amazing.

I love this thing


----------



## joder

Quote:


> Originally Posted by *mtbiker033*
> 
> ok after playing a few games, this thing is amazing
> 
> squad maxed out with a solid 100-108fps....
> 
> dooom on vulkan was amazing
> 
> I love this thing


Doom + Vulkan is just sexy.


----------



## mtbiker033

Quote:


> Originally Posted by *joder*
> 
> Doom + Vulkan is just sexy.


Totally unbelievable!! Why aren't more games on Vulkan, my word!


----------



## Boost240

Quote:


> Originally Posted by *Slackaveli*
> 
> really depends on the game tbh. I experienced it on my son's rig in Arma 3. he was getting 50%~ cpu and gpu usage but only 30s framerate. I upgraded his ram from 1333 to 1866 c-9 , and he now gets 50s fps and much higher usage, but if his AMD setup allowed he'd still benefit from faster ram, up to around 2400 on ddr3 set-ups.
> In BF1 on a 4690k/980ti setup i was hitting 100% cpu usage and ~80s% gpu usage, so that was a classic cpu bound scenario right there.
> 
> What numbers are you seeing?


It's mainly Rise of the Tomb Raider I'm concerned about. I'm in this area where sometimes the CPU will go to about 50-60% usage while the GPU goes to the 70s. Funny thing is, in The Witcher 3 I'm fully maxed and getting the 120 I want easily. And if I load an earlier area in Tomb Raider I hit my 120 no problem, so maybe it's the area I'm in. One other thing: I notice in some of my older games both CPU and GPU are at low usage but high framerates. Could this be because the game isn't that demanding?

I guess my real issue is that I raised my expectations too much. I was hoping for 120+ fps in every area of every game, lol. If I sit back and look at it realistically, this is still a massive jump from my 970 and an even larger jump from my 460. I still may exchange it because of the coil whine, though. It's real bad unless I crank up my speakers.


----------



## PasK1234Xw

Quote:


> Originally Posted by *D1RTYD1Z619*
> 
> would you mind DSRing that to 4k and let me know what your CPU usage is like. I'm not getting the answers I need about BF1 on ultra setting, minus AA, at 4k.


4K 60Hz. This is on Empire's Edge at the beginning of a 64-player Conquest match; usage will just increase the more the map gets destroyed.










St. Quentin Scar towards end of round










So many scenarios come into play; usage really depends on the map, also.


----------



## Mato87

Quote:


> Originally Posted by *Boost240*
> 
> So I'm sorta let down right about now. Got the card on Friday and installed it Saturday night. First thing I noticed was an annoying whine, which I found out to be the infamous coil whine. It's very noticeable and annoying. Also, I think I expected far too much from this card, even for my 1080p 144Hz display. I was hoping I would be able to max out all my games and hit at least 120 fps so I can use ultra low motion blur, but games like Rise of the Tomb Raider still get dips into the 90s and even the 70s. What's weird is I notice sometimes that my GPU usage is pretty low.
> 
> I'm not really sure what to do. I can return the card to Amazon and hope to get another, but who knows if the whine will be better. I just find it a bit disappointing that my much cheaper 460 and 970 cards never had this issue at all, yet a $1100 card does. I plan on getting a water block, but apparently that won't help.


Rise of the Tomb Raider is a very demanding game in a few of those larger hubs. Try lowering or turning off the demanding TressFX (now called PureHair) to get a major performance boost; you'll gain at least 15-20 fps. Also lower the AO setting from VXAO to HBAO and you'll get at least another 5 fps as well.


----------



## c0nsistent

Quite a large fps increase from overclocking my ram from 1600 to 2400 with my 3770k @ 4.7
















Sent from my XT1609 using Tapatalk


----------



## Mato87

Quote:


> Originally Posted by *PasK1234Xw*
> 
> i posted to show how well game utilise 8 core CPU if you were referring to 4/8 i7 then don't call it 8 cores because its not 8 core and just causes confusion.
> 
> Also on contrary to what you think i still run into some bottleneck even more so when I ran 1080 SLI its more so due to game CPU utilization rather than hardware problem but still it happens even on 5960x when pushing high frames at 1440 its not bad really but it does happen.


Well, technically the i7 6700K has 4 cores, but the CPU has hyperthreading, which means it has double the amount of cores, which is 8.


----------



## alucardis666

Quote:


> Originally Posted by *c0nsistent*
> 
> Quite a large fps increase from overclocking my ram from 1600 to 2400 with my 3770k @ 4.7
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sent from my XT1609 using Tapatalk


I oc'd my ram up from 2400-3000. No regrets









Most use cases today seem to prefer speed over timings.
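The "speed over timings" point is easy to sanity-check: absolute CAS latency in nanoseconds is the CAS count times the clock period, and a higher transfer rate shrinks the period. A quick sketch:

```python
# Absolute CAS latency in nanoseconds: CAS cycles x clock period.
# DDR transfers twice per clock, so the period is 2000 / (MT/s) ns.

def first_word_latency_ns(cas, mt_per_s):
    return cas * 2000 / mt_per_s

# Looser timings at a higher clock can still mean lower real latency:
print(round(first_word_latency_ns(9, 1600), 2))   # 11.25 ns, DDR3-1600 CL9
print(round(first_word_latency_ns(10, 2400), 2))  # 8.33 ns, DDR3-2400 CL10
```

So a 2400 CL10 kit is ahead of a 1600 CL9 kit on both bandwidth and absolute latency, which matches the gains people report from the overclock.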


----------



## Addsome

Quote:


> Originally Posted by *Blotto80*
> 
> Think I just settled into my 24/7 clocks. 2050 @ 1.025v. Just gone through a few loops of Heaven and it hasn't dipped from 2050 at all and is topping out at 48c (EVGA Hybrid AIO with 2xSP120 in push/pull). I could get it higher for benches but the clocks couldn't hold at my highest (2090ish), I'd rather have a stable clock at a lower voltage than go all out.


How loud are the SP120s compared to the stock fan on the hybrid?


----------



## Mato87

Quote:


> Originally Posted by *c0nsistent*
> 
> Quite a large fps increase from overclocking my ram from 1600 to 2400 with my 3770k @ 4.7
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sent from my XT1609 using Tapatalk


We need more info: what resolution, graphical settings, and so on?


----------



## Blotto80

Quote:


> Originally Posted by *Addsome*
> 
> How loud are the SP120s compared to the stock fan on the hybrid?


I didn't have the stock fan; I bought it used and he didn't have it, so he gave me the two SP120s instead. That said, adding the second didn't increase the noise at all, and I don't notice the system any more now than I would have with my Fury X and its Gentle Typhoon. I've also got an H100i with its two 120mm fans on low and 2x NZXT 120mm and 1x NZXT 140mm all on lowest settings running in an H440, and while I wouldn't call it silent, it's reasonably quiet.


----------



## Addsome

Quote:


> Originally Posted by *Blotto80*
> 
> I didn't have the stock fan, I bought it used and he didn't have it so he gave me the two SP120's instead. That said, adding the second didn't increase the noise at all and I don't really notice the system anymore now than I would have with my Fury X and it's Gentle Typhoon. I've also got an H100i with it's two 120mm fans on low and 2x NZXT 120mm and 1x NZXT 140mm all on lowest settings running in an H440 and while I wouldn't call it silent, it's reasonably quiet.


Is it worth adding the second in terms of performance? Are you able to control the fan speeds from the mobo?


----------



## Blotto80

Adding the second dropped temps by 2-ish degrees, so not tons, but where I'm hovering around that 51C throttle point, it's worth it for me. I could control them via a motherboard header, but I just used a splitter and plugged them right into the card. They are relatively low-wattage fans and should be fine off the card's header.


----------



## Slackaveli

Quote:


> Originally Posted by *Boost240*
> 
> It's mainly Rise of the Tomb I'm concerned about. I'm in this area where sometimes the CPU will go tp about 50-60 usage while the GPU goes to the 70s. Funny thing is in the Witcher 3 fully maxed and getting the 120 I want easily. If I load an earlier area in tomb I hit my 120 no problem. So maybe it's the area I'm in. One other thing. I notice in some of my older games both CPU and GPU are at low usages but high framerates.. Could this be because the game isn't that demanding?
> 
> I guess my real issue is that I raised my expectations too much. I was hoping for 120+fps in every area of every game lol. If I sit back and look at it realistically this is still a massive jump from y 970 and an eve larger jump from my 460. I still may exchange it because of the coil whine though. It's real bad unless I crank up my speakers.


Sounds to me like your RAM speed/timings are holding you back. If you're only at 70% GPU but not hitting over 120fps, and the CPU is only in the 50%s, then it's your RAM holding you back (though you have to make sure one of the cores isn't hitting 100% to know for sure; if that's happening, it's a CPU bind even with a 50% average showing). What RAM speed/timings?
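Rough logic of that check, as a toy sketch (not tied to any real monitoring tool; the function name, thresholds, and usage numbers below are all made up for illustration):

```python
# Toy sketch of the diagnosis above: an "average" CPU usage number can
# hide a single maxed-out core, which still makes the game CPU-bound.
def diagnose(gpu_usage, per_core_usage, fps, fps_target):
    """Return a rough guess at the bottleneck from usage percentages."""
    if fps >= fps_target:
        return "hitting target"
    if gpu_usage >= 95:
        return "gpu-bound"
    if max(per_core_usage) >= 95:
        # The average may read ~50%, but one saturated core is a CPU bind.
        return "cpu-bound (single core saturated)"
    # GPU and all cores have headroom: look at RAM speed/timings,
    # frame caps, or engine limits.
    return "check ram/engine/frame cap"

# Numbers like the post describes: 70% GPU, ~50% average CPU,
# below the fps target, but no single core pegged.
print(diagnose(70, [55, 48, 60, 42], fps=90, fps_target=120))
# -> check ram/engine/frame cap
```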


----------



## Addsome

Quote:


> Originally Posted by *Blotto80*
> 
> Adding the second dropped temps by 2ish degrees so not tons but where I'm hovering around that 51c throttle point, it's worth it for me. I could control via the motherboard header but I just used a splitter and plugged them right into the card. They are relatively low wattage fans and should be fine off the card's header.


Yeah, that's the reason I want to get the two SP120s. I sometimes hit 51C.


----------



## c0nsistent

Quote:


> Originally Posted by *Mato87*
> 
> Quote:
> 
> 
> 
> Originally Posted by *c0nsistent*
> 
> Quite a large fps increase from overclocking my ram from 1600 to 2400 with my 3770k @ 4.7
> 
> Sent from my XT1609 using Tapatalk
> 
> 
> 
> we need more info, what resolution, graphical settings and so on?


3770K @ 4.7ghz 1.35v DDR3 2400 10-12-12-34-2

I ran Timespy on this current setup and got a score of 9072 which is rank 3 for 3770K/1080Ti systems... so, not bad I guess. I could push the CPU to 4.8-4.9 and the 1080 Ti a little harder for the top spot, but the extra performance gain vs time spent tweaking just isn't worth it beyond this point.

Sent from my XT1609 using Tapatalk


----------



## Slackaveli

Quote:


> Originally Posted by *c0nsistent*
> 
> 3770K @ 4.7ghz 1.35v DDR3 2400 10-12-12-34-2
> 
> I ran Timespy on this current setup and got a score of 9072 which is rank 3 for 3770K/1080Ti systems... so, not bad I guess. I could push the CPU to 4.8-4.9 and the 1080 Ti a little harder for the top spot, but the extra performance gain vs time spent tweaking just isn't worth it beyond this point.
> 
> Sent from my XT1609 using Tapatalk


Nice. I'm at #1 on the 5775C charts.


----------



## Boost240

Quote:


> Originally Posted by *Slackaveli*
> 
> sounds to me like your ram speed/timings are holding you back. if you are only 70% gpu but not hitting over 120 fps and the cpu is only in the 50%s ( you have to make sure one of the cores isnt hitting 100% to know for sure though. if that is happening it's a cpu bind even with the 50% average showing)than it's your ram holding you back. What ram speed/timings?


So I was just looking at the average CPU usage. I checked the individual cores, and when the average is at about 50%, one or two of the cores regularly reach into the 80s. I'm not in the same area I originally posted about, but I can get the CPU and GPU usage to about the same. My CPU is overclocked to 4.8GHz. As for my RAM, it's running at 3000MHz and the timings are the stock values, 15-16-16-35.


----------



## c0nsistent

Quote:


> Originally Posted by *Slackaveli*
> 
> Nice. Im at #1 on the 5775c charts.


What clockspeed are you at on CPU/RAM?


----------



## Mato87

Quote:


> Originally Posted by *c0nsistent*
> 
> 3770K @ 4.7ghz 1.35v DDR3 2400 10-12-12-34-2
> 
> I ran Timespy on this current setup and got a score of 9072 which is rank 3 for 3770K/1080Ti systems... so, not bad I guess. I could push the CPU to 4.8-4.9 and the 1080 Ti a little harder for the top spot, but the extra performance gain vs time spent tweaking just isn't worth it beyond this point.
> 
> Sent from my XT1609 using Tapatalk


Thanks, interesting settings. I'll gladly do the test with the same settings, but I only have a full HD monitor, so I can only compare at that resolution. It's interesting that you chose to play at 1440p with high settings; I play at 1080p with very high / ultra settings.

Here are my results and settings:


----------



## Benny89

Has anyone managed to get an MSI 1080 Ti Gaming X and can share their temps and OC results?


----------



## Slackaveli

Quote:


> Originally Posted by *Boost240*
> 
> So I was just looking at the average CPU usage. I checked the individual cores and when the average is at about 50%, one or two of the cores will regularly reach into the 80s. I'm not in the same area I originally posted about but I can get the CPU and GPU usage to about the same. My CPU is overclocked to 4.8Ghz. As for my RAM, it's running at 3000Mhz and the timings are the stock values. 15-16--16-35.




Not the RAM, then; you're probably just seeing the first signs of actually using every drop of that CPU (in those threads). Going up in resolution will end that problem, as will turning more settings up to ultra or using DSR. Personally, if I'm at the point where I'm losing frames but not utilizing ~99% GPU, I might as well throw more load on the GPU.
Quote:


> Originally Posted by *c0nsistent*
> 
> What clockspeed are you at on CPU/RAM?


RAM is DDR3 2400 C10 Dominator Platinum, and I'm at 4.3GHz core, 3.8GHz uncore (stock is 3.3), which boosts the caches, including the L4. I don't have a monster 1080 Ti, though; I hit only 2025MHz in Time Spy.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Mato87*
> 
> Thanks, interesting settings. I'll gladly do the test with the same settings, but I only have a full hd monitor, so I can only do a comparison on that resolution, it's interesting you chose to play on 1440p with high settings. I play on 1080p with very high / ultra settings
> 
> Here are my results and settings


You could always DSR up to 1440P for comparisons.

Man... Quantum Break has given me motion sickness. The constant FPS swings from 60 to 30 to 75 to 100 to 25 literally make me feel like throwing up. I was nauseated after playing for 1.5 hours.

It's a good game though, but where did they go wrong with optimization... I think they need to rebuild their engine.


----------



## Mato87

Quote:


> Originally Posted by *SlimJ87D*
> 
> You could always DSR up to 1440P for comparisons.
> 
> Man... Quantum break has given me motion sickness. The constant variations in FPS of 60 to 30 to 75 to 100 to 25 is literally making me feel like throwing up. I am nauseated after playing playing for 1.5 hours.
> 
> It's a good game though, but where did they go wrong with optimization... I think they need to rebuild their engine.


Oh yeah, DSR... I haven't used it, like, ever.

lol

Yeah, I played through it and finished just yesterday. It was a blast, but man, the optimization was HORRIBLE. I played with upscaling turned off, and at times it dipped below 40 in a few gunfights... it was really pissing me off. I will play through the game again, this time with upscaling turned on, to see if the framerate will stay above 60 all the time, even in the most intense gunfights. But I gotta say, the graphics in this game, more specifically the effects, are probably the best I have ever seen in any game. With Alan Wake, Remedy did a fantastic job on the effects, and with Quantum Break they took it to a whole other level... brilliant game all around.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Mato87*
> 
> Oh yeah the dsr I haven't used it, like ever
> 
> lol
> 
> Yeah I played through it, finished it just yesterday, it was a blast, but man, the optimization was HORRIBLE. I played with upscaling turned off and at times it dipped below 40 in a few gunfights...it was really pissing me off. I will play through the game once again, but this time with upscaling turned on to see if the framerate wil stay above 60 all the time even in the most intense gunfights. But I gotta say, the graphics in this game, more specifically, the effects, are probably the best I have ever seen in any game. With alan wake, remedy has done a fantastic job on the effects and with quantum break they took it on a whole another level...brilliant game all around.


I think I need to play with upscaling on as well, because I don't want to feel like this again. Might have to take some medicine, lol.


----------



## Mato87

Quote:


> Originally Posted by *SlimJ87D*
> 
> I think I need to play with upscaling on as well, because I don't want to feel like this again. Might have to take some medicine, lol.


With upscaling turned off it cleans up the image significantly: it no longer renders the game at 720p and upscales it with heavy anti-aliasing, but renders at the resolution of your choice, which for me was 1080p. I tried DSR to 4K without the upscaling just now, and man, the game does look crisp and clean, but I can't be bothered to play this kind of game at sub-30 fps... no freaking way. Especially after I just installed and played Mad Max again after a year: with everything on ultra I'm getting over 300 frames per second, which is ridiculous. Over 100 fps with DSR to 4K, too. That's how games should be optimized.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> both good plans, guys.


Yeah, I'm also taking your advice and will wait for the 7740K. If I'm upgrading, I might as well grab the new best chip with a new generation of mobo as well...

In the meantime I was OCing my 4790K today. 4.8GHz stable right now, 61C under load, 1.25V. I will try for 5.0GHz tomorrow.

I'm also going to upgrade from 2133 RAM to 2400 and wait for now.

Thanks for the advice.

Going back on topic: I can't decide between the MSI Gaming X and the ROG Strix. Both will look sick in my black-and-red rig. I like the top ROG RGB Asus icon, but I also like the MSI side RGB dragon and text...

Can't decide.


----------



## Blotto80

Quote:


> Originally Posted by *SlimJ87D*
> 
> You could always DSR up to 1440P for comparisons.
> 
> Man... Quantum break has given me motion sickness. The constant variations in FPS of 60 to 30 to 75 to 100 to 25 is literally making me feel like throwing up. I am nauseated after playing playing for 1.5 hours.
> 
> It's a good game though, but where did they go wrong with optimization... I think they need to rebuild their engine.


I know, such a waste of a good game. I had to bite the bullet and turn upscaling on when I played it, it was way too janky at 1440p native.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Y, I am also taking your advice and I will wait for 7740k. If Upgrading I might as well snag new best chip with new generation of MOBO as well....
> 
> In the meantime I was OCing my 4790k today. 4.8 Ghz stable right now, 61 C under load, 1.25V. I will try to go for 5.0 Ghz tomorrow.
> 
> I am also gonna upgrade from 2133 RAM to 2400 and wait for now.
> 
> Thanks for advice
> 
> Going back to topic- can't decide between MSI GAMING X and ROG STRIX. Both will look sick in my black-red RIG but both looks great. I like top ROG RGB asus Icon but I also like MSI side RGB dragon and msi text....
> 
> Can't decide


Tough call, but it's a win/win. MSI has a 3-year warranty; not sure on Asus.

You're welcome, glad I could help. If you're hitting 4.8 stable at 1.25V, that is a golden 4790K. It should hit 4.9 easily and probably 5.0, which makes it worth more for resale, like how Silicon Lottery does it. I know their binned 4.9 chips were $380 and the 5.0s were around $425. Very nice one.


----------



## AMDATI

Quote:


> Originally Posted by *keikei*
> 
> If youre just gaming it'd be a waste of an upgrade. I'm running a stock 3770k and I see around 50% cpu usage @4k. If you have a workstation or a ton of background apps running i can see going ryzen. I can see the allure though. Ryzen is a very good budget upgrade right now. The new R5 lineup is VERY attractive, not gonna lie.


Actually, even if you're not seeing 100% CPU usage, you will still get less performance than with newer CPUs. For example, in most games the 7700K will beat the 4790K by 10% or more FPS. That holds even when the GPU is the bottleneck and the CPU isn't fully loaded. So just because you're not seeing full load doesn't mean you're not losing performance.

I think the reason for this behavior (and I've used this to explain why AMD chips perform worse even with 16 threads) is that it's not what your CPU can do in one full second that matters, it's what it can do in a few milliseconds. It reminds me of back when games would actually perform better on DDR1 with tighter timings, even though that meant less overall bandwidth.

Let's say your CPU is effectively idling every other 2 milliseconds... you'd never know, because your display doesn't update that quickly. Even on a 165Hz screen, the display only updates about once every 6ms. So the usage counter could read about half while the CPU is effectively saturated during the moments that matter, and a chip that's only half busy over a second can behave like one at 100%.

The thing about gaming is that if you want, say, 100fps, you need to draw a frame every 10ms. If your CPU can't perform the needed tasks in that 10ms and takes 12ms instead, that's how you lose frame rate, even if you actually have 'low' CPU usage.

There are probably also other factors at play, like newer instruction sets that accelerate certain math.

When it comes to gaming, sadly, CPU usage alone doesn't tell you whether you're losing performance.

When I did the math between Ryzen and the 7700K in gaming, both with a 1080 Ti, the Ryzen chip not only delivered fewer FPS, it was actually doing more total work on average. The average load across the whole Ryzen chip was something like 54%, while the 7700K was loaded to something like 90%. But the 7700K has half the cores, so 90% on half the cores is the equivalent of only 45%.
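The frame-budget arithmetic above works out like this (a minimal sketch; the 12ms figure is just the example from the post, and the core-count normalization assumes 8 threads vs 16):

```python
# Frame-time budget: to sustain a target FPS, the CPU (and GPU) must
# finish each frame within 1000/target milliseconds.
def frame_budget_ms(target_fps):
    return 1000.0 / target_fps

def achieved_fps(frame_time_ms):
    return 1000.0 / frame_time_ms

budget = frame_budget_ms(100)        # 10 ms per frame for 100 fps
print(budget)                        # -> 10.0

# The post's example: the CPU takes 12 ms instead of 10 ms, so the
# frame rate drops even though a 1-second usage average looks "low".
print(round(achieved_fps(12.0), 1))  # -> 83.3

# Normalizing load across core counts: 90% on 8 threads is the same
# total work as 45% on 16 threads ("90% of 50% is just 45%").
print(0.90 * 8 / 16)                 # -> 0.45
```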


----------



## Benny89

Quote:


> Originally Posted by *AMDATI*
> 
> Actually, even if you're not seeing 100% CPU usage, you will still get less performance than newer CPU's. For example, in most games, the 7700k will outdo the 4790k by 10% or more FPS. This is the behavior even when the GPU is the bottleneck and the CPU isn't 100% loaded. So just because you're not seeing full load, doesn't mean you're not losing performance.
> 
> I think the reason for this behavior, and I've used this to prove why AMD chips perform less even if they have 16 threads, is because it's not what your CPU can do in one full second, it's what it can do in a few milliseconds that matters.
> 
> Let's say your CPU is effectively idling every other 2 milliseconds....you'd never know, because your display doesn't update that quickly. Even on a 165hz screen, the display only updates once every 6ms. So you could effectively be at half CPU usage, and it could appear like 100% CPU usage.
> 
> The thing about gaming, is if you want say 100fps, you need to draw a frame every 10ms. If your CPU can't perform the needed tasks in that 10ms, and 12ms instead, that's how you lose frame rates, even if you actually have 'low' CPU usage.


Of course the 7700K will be better than the 4790K in most games, but how much depends on many things. Here you can see 4790K vs 7700K:

Of course the 7700K will provide higher minimum frame rates in most games, but again: is 5-9 fps worth upgrading for now? Then there's OC, game engine, resolution, optimization, etc. Clock for clock, 4.9 vs 4.9, both CPUs are very close.

Overall, the difference is not big enough to FEEL like a meaningful upgrade in games.

If you use your PC for more than just gaming, I would go for the 7700K of course. But for gaming alone... meh...

*Any review of the Aorus Extreme yet?*


----------



## AMDATI

Quote:


> Originally Posted by *Benny89*
> 
> Of course 770k will be better than 4790k in most games. But that depends on many things how much. Here you can see 4790k vs 7700k: Of course 7700k will provide in some games higher min frame rates, but again- are 5-9 fps worth upgrading now? Then there is OC, game engine, resolution, optimalization etc. Clock for clock 4,9 vs 4,9 both CPUs are very close.
> 
> Overall, difference is not big enough to FEEL like meaningfull upgrades in games.
> 
> If you use PC not only for gaming, I would go for 7700k of course.
> 
> *
> Any review of Aorus Extreme yet??*


I think you're missing the point.....CPU usage doesn't mean you're not losing performance. You could look at a game at 50% CPU usage and think "Oh I'm getting just as good performance as everyone else with room to spare", but that's not the case at all. That 50% usage might as well read 100% usage.

The comment had nothing to do with price to performance gain ratio.


----------



## Benny89

Quote:


> Originally Posted by *AMDATI*
> 
> I think you're missing the point.....CPU usage doesn't mean you're not losing performance. You could look at a game at 50% CPU usage and think "Oh I'm getting just as good performance as everyone else with room to spare", but that's not the case at all. That 50% usage might as well read 100% usage.


I am talking about pure fps numbers in games, not about usage. I know usage means poop; fps is what I'm talking about.

As I said, there is a difference, but a very small one. In an age when we play at 80-120 fps, it's nowhere near a meaningful difference. This is no longer a matter of going from 50 to 60 fps, especially at higher resolutions.


----------



## AMDATI

Quote:


> Originally Posted by *Benny89*
> 
> I am talking about pure fps numbers in games, not about usage. I know usage means poop. Fps are what I am talking about.
> 
> As I said there is difference but very small. In an age when we play at 80-120 fps this is no where meaningfull difference. This is not any longer a matter of going from 50 to 60 fps. Especially at higher resolutions.


Clearly the major point of your post was, and I quote you: "Overall, difference is not big enough to FEEL like meaningfull upgrades in games."

And at 4K, the 50 or 60fps issue is still very much relevant. Although I would argue that anything under 90fps is an issue when it comes to buttery smoothness, but that's another topic.

Still, the major point I was making was that CPU usage does not indicate relative performance, which is something many people don't realize. You'd expect two CPUs that are not at full load to perform virtually the same, at least when it comes to delivering gaming FPS. Yet that is not true, even Intel vs Intel with the same number of cores and threads.


----------



## Benny89

Quote:


> Originally Posted by *AMDATI*
> 
> Clearly the major point of your post was, and I quote you: "Overall, difference is not big enough to FEEL like meaningfull upgrades in games."
> 
> And at 4K, the 50 or 60fps issue is still very much relevant. Although I would argue that anything under 90fps is an issue when it comes to buttery smoothness.
> 
> Still the major point I was making was that CPU usage does not indicate relative performance, which is something that many people don't realize. I mean you'd expect if two CPU's are not at full load, they'd effectively perform virtually the same, when it comes to delivering gaming FPS anyways. Yet that is not true.


Yes, "overall, the difference is not big enough to FEEL like a meaningful upgrade in games" was my point. At 4K, 50 vs 60 fps is relevant, but at 4K it's the GPU that delivers that in 95% of cases, not the CPU. Considering that 90% of CPU buyers don't overclock and 90% of the "vs" benches on the internet are at stock clocks, I think we can agree that while one CPU is clearly faster, Intel has made almost no progress as far as gaming is concerned.

*Anyone with an Aorus 1080 Ti Xtreme Edition who can share their results? I saw that a few people already managed to obtain one.*


----------



## AMDATI

There's a difference between being faster at full load and being faster even when not at full load. I don't think we can agree that Intel hasn't made progress; this clearly proves they have. It seems like you're arguing for the sake of arguing and creating straw-man arguments in the process. OCing won't really help, because once you compare the other chip OC'd too, the difference is still there. In the end you're still losing a noticeable amount of performance even when not at full load, which also speaks to longevity.

You've gone way outside the context of the post I was replying to, and it's unclear why, unless you're just arguing for the sake of it, which really isn't welcome here.


----------



## Menta

Quote:


> Originally Posted by *Boost240*
> 
> SO I'm sorta let down right about now. Go the card on Friday and installed it Saturday night. First thing I noticed was an annoying whine which I found out to be the infamous coil whine. It's very noticeable and annoying. Also, I think I expected far too much from this card, even for my 1080p 144hz display. I was hoping I would be able to max out all my games and hit at least 120fps so I can us ultra low motion blur, but games like Rise of the Tomb still get dips into the 90s and even the 70s. What weird is I notice sometimes that my GPU usage is pretty low.
> 
> I'm not really sure what to do. I can return the card to Amazon and hope to get another, but who knows if the whine will be better. I just find it a bit disappointing that my much cheaper 460 and 970 cards never had this issue at all, yet a $1100 card does. I plan on getting a water block, but apparently that won't help.


The Ti is a great card, but you have to be realistic: it won't work miracles in some games. It will perform around 20% better than a 1080, give or take, depending on the game... coming from a 970 it's night and day!

This really isn't a card for 1080p. You might be getting coil whine because frames are being pushed so hard at that resolution; I'm nearly certain that at 1440p or 4K there will be no coil whine, or very little (maybe in some menus). Too much wasted power, to the point where some engines are the limiting factor.


----------



## SoftCircle

I just want to play witcher 3 at 1080p with all settings on ultra and hairworks maxed out. Can the 1080ti with a 6700k keep my frames from dropping below 60?


----------



## Boost240

Quote:


> Originally Posted by *SoftCircle*
> 
> I just want to play witcher 3 at 1080p with all settings on ultra and hairworks maxed out. Can the 1080ti with a 6700k keep my frames from dropping below 60?


Easily! I have the same setup and maxed out I hit over 120fps easy. Don't worry!


----------



## keikei

Quote:


> Originally Posted by *SoftCircle*
> 
> I just want to play witcher 3 at 1080p with all settings on ultra and hairworks maxed out. Can the 1080ti with a 6700k keep my frames from dropping below 60?


Yes. Go to 1:45 for 1080p only bench.


----------



## Boost240

Quote:


> Originally Posted by *Menta*
> 
> TI is a great card but you have to be realistic it wont work miracles in some games yes it will perform around 20% +- better then a 1080 depending on the game...coming from a 970 its night and day!!!
> 
> This is really not a card for 1080P you might be getting coile whine due to frames being pushed hard at that resolution, nearly certain that if you try a 1440p or 4k there will be no coil whine or very little maybe some menus, to much wasted powa! to the point where some engines are the limiting factor


I've tried everything from lowering the fps to increasing the resolution. It's a big deal, since I want to water-cool it and build a silent PC. I'll return it and try my luck.

That being said, I've decided to settle for 85fps with ULMB enabled for Tomb Raider. I can't really tell the difference between 85 and 100 or 120; maybe if I had three screens side by side. However, high framerates and ULMB have ruined 60fps for me. 85fps + ULMB is a godsend, and I'm sure the Ti will be able to eat any game at that fps. Once the whine is fixed I'll be a super happy camper!


----------



## Slackaveli

really... I may have to actually give ULMB a try. I've always just used g-sync.


----------



## Boost240

Quote:


> Originally Posted by *Slackaveli*
> 
> really... I may have to actually give ULMB a try. I've always just used g-sync.


Dude, ULMB changed my life. I didn't even notice the screen blur before I used it; once I did, I couldn't un-notice it. You can read scrolling text whether it's moving up and down or side to side; side-to-side movement doesn't blur at all. It's amazing. The only downside is it dims the screen a bit, but that doesn't bother me much. G-Sync is great, but I'd take ULMB over it any day of the week. You just gotta make sure you can hold the required FPS consistently: 85, 100 and 120 are the only refresh rates it works at.

Also, I lied: I can notice a difference between 85 and 120, though it's not too bad.


----------



## SoftCircle

Quote:


> Originally Posted by *Boost240*
> 
> Easily! I have the same setup and maxed out I hit over 120fps easy. Don't worry!


Quote:


> Originally Posted by *keikei*
> 
> Yes. Go to 1:45 for 1080p only bench.


Thank you for the information.


----------



## illypso

Quote:


> Originally Posted by *Boost240*
> 
> Dude. ULMB changed my life. I didn't even notice the screen blur before I used it. Once I did I couldn't unnotice it. You can read scrolling text whether it's moving up and dow or side to side. Side to side movement doesn't blur at all. It's amazing. The only downside is it dims the screen a bit. It doesn't bother me enough though. G-Sync is great but I'd take ULMB any day of the week over it. You just gotta make sure you can hold the required FPS consistently. 85, 100 and 120 are the only refresh rates it works at.
> 
> Also, I lied, I can notice a difference between 85 and 120 though it's not too bad.


Wow, you just made my day.

I had an Asus VG248QE that did the same thing with a program simulating 3D in 2D, and it was amazing.

Then it died and I bought a Predator XB27HU that was supposed to have a similar option, but I never really searched on the subject and thought it was always doing it by default.
I saw it was more blurry, but I thought that was because it's a slower IPS and not a TN.

I disabled G-Sync, set 120Hz, found the ULMB option in the menu (didn't know it existed), enabled it, boosted the brightness to compensate, and WOW!!!

I do have ghosting because it's an IPS, but the image is so clear. Now I can read the street names; before, at 144Hz, it was a blurry mess.

https://www.testufo.com/#test=photo&photo=toronto-map.png&pps=1920&pursuit=0&height=0

I hope the ghosting won't be too bad in gaming.


----------



## AMDATI

I can see the ULMB flicker; it gives me eye strain and headaches. It definitely reduces motion blur significantly, but mostly for things scrolling or moving slowly across the screen. In games I hardly notice a benefit, because my view isn't panning, it's twitching from one point to the next. It might look good if you use a controller with a joystick, but in FPS games I don't see any benefit.

Even though the 1080 Ti is good for driving ULMB, it's still not a good feature to me.


----------



## Amuro

Zotac 1080 Ti FE; done some taming on this beast.

4790K @ 4GHz, no boost, 1.024V
16GB DDR3 1600MHz CL9
Asus H97 Gaming Pro

Currently I'm playing at DSR 4K on my peasant VC239H monitor, which is why my CPU is downclocked. After years of an OC'd 4.5GHz for 1080p gaming, 4K doesn't need that much CPU power, so it now runs cooler and CPU power draw is cut in half. The 1080 Ti can handle 4K gaming at 60fps; my 980 Ti couldn't. Sorry for my English.

Some comparison of power limit vs power draw vs performance; my CPU here @ 4.5GHz:

1080 Ti FE OC +160 core / +500 memory
4790K @ 4.5GHz
1600MHz DDR3 CL9

Average power draw at each power limit, with a roughly equivalent card. Power draw is not static (rough estimates via HWiNFO64), stock voltage. Note these are Fire Strike *graphics* scores only.

| Power limit | Power draw | Fire Strike graphics | Roughly equivalent to |
| --- | --- | --- | --- |
| 50% | 125W | 19475 (OC) | Gigabyte G1 980 Ti |
| 60% | 150W | 25900 (OC) | Asus Strix / Zotac 1080 OC |
| 70% | 175W | 27800 (OC) | Titan X Pascal |
| 80% | 200W | 28760 (OC) | GTX 1080 Ti (stock) |
| 90% | 220W | 29794 (OC) | |
| 100% | 250W | 28756 (stock), 30302 (OC) | |
| 110% | 275W | 30789 (OC) | |
| 120% | 295W | 31253 (OC) | |

The 70% PL setting (max temp 69C @ 65% fan, 100% GPU load in games) is what I currently use for DSR 4K gaming: efficient power draw and less heat, since I'm locked at 60fps with vsync on.
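For what it's worth, those figures can be turned into rough points-per-watt numbers (a quick sketch using the OC scores posted above; the wattages are Amuro's own estimates, not measured values):

```python
# Fire Strike graphics score vs. estimated power draw at each power
# limit, taken from the post above (OC results only).
results = {
    50:  (125, 19475),
    60:  (150, 25900),
    70:  (175, 27800),
    80:  (200, 28760),
    90:  (220, 29794),
    100: (250, 30302),
    110: (275, 30789),
    120: (295, 31253),
}

for pl, (watts, score) in results.items():
    print(f"{pl:>3}% PL: {score / watts:6.1f} points/watt")

# Efficiency peaks around the 60-70% power limits; past ~90% each
# extra watt buys very little additional score.
```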


----------



## KedarWolf

Quote:


> Originally Posted by *Norlig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Your card sits vertically?
> 
> 
> 
> Nope, it sits Horisontally, motherboard is vertically, Like most PC's
> 
> I read that I should not do it, if the GPU sits vertically, but mine doesn't so I'm good!
> 
> It was nerve wrecking starting it up for the first time though, had to film it incase it went boom, so I could share

What exactly does the PCI-E shunt mod do?

I'm just curious; I can't do the shunt mods since my card sits vertically in my case.


----------



## TWiST2k

Quote:


> Originally Posted by *xNutella*
> 
> that moment when you hear your door buzzer.


I have that monitor on both my main systems, and there's no turning back after using it! I did have to exchange it a couple of times to get one without any BLB (thanks for being amazing, Amazon). You can read all about my and others' similar adventures in its owners' thread.


----------



## Jbravo33

Just in case anyone is wondering what SLI can do in Ghost Recon at ultra settings, 10320x1440: just a measly 57fps.







and dropped cards to stock before doing this.


----------



## Scorpion667

Interested in seeing the minimum FPS for the 1080 Ti in Black Ops 3 at 1080p, minimum graphics settings.

Looking to see if it would reliably feed 180+fps to the upcoming Asus 240Hz 1080p monitor.


----------



## Slackaveli

totally depends on what cpu you'd be trying it with.


----------



## KingEngineRevUp

Got Quantum Break for $17; it's a pretty nice-looking game! It runs like crap on Ultra, but honestly, turning half the settings down to High will let you run it at 60 to 85 FPS, and it still looks better than a lot of other games.

I think Remedy should not have shipped Ultra settings at all, or they should have called them "Uber" settings and called the High settings "Ultra."

Anyways, here's some screenshots with 1080 Ti and 4790K.

Below, comparison between 1440P, 1080P and Upscaling


1440P Max Everything


1080P


1440P with Upscaling

Below, you can see the game still looks gorgeous at some settings turned down a little bit.


Reflections set to Medium, Volumetric Lighting set to High, Shadows set to High, Upscale off, 1440P


Everything set to max, Upscale off, 1440P

Below, Reflections set to Medium, Volumetric Lighting set to High, Shadows set to High, Upscale off, 1440P


----------



## Scorpion667

Quote:


> Originally Posted by *Slackaveli*
> 
> totally depends on what cpu you'd be trying it with.


Best-case scenario; CPU budget is $2k.

Preliminary research shows the 7700K to be best for ridiculous fps @ 1080p.


----------



## leequan2009

Hi guys,
I just installed a water block on my 1080 Ti, and I'm not sure if there's a problem with my card.
Under full load, if I put my ear near the card, there is a faint sound from inside the card and block; at idle or under light load there is no sound. (I searched Google, and this sounds like coil whine?)
I don't know whether this is normal or abnormal for my card. Has anyone else had this?


----------



## Scorpion667

Quote:


> Originally Posted by *leequan2009*
> 
> Hi guys
> I just installed a block on my 1080 Ti, and I don't know if there is a problem with my card.
> Under full load, if I put my ear near the card, there is a small sound inside the card and block, but at idle or under light work there is no sound.
> I don't know whether this is normal or abnormal for my card. Has anyone else had this?


Most likely coil whine which is normal.

Many of us here in the forums have experience with the specific sound of coil whine. If you want to record it and post here you could get some opinions. Do temperatures appear to be normal?


----------



## KedarWolf

Quote:


> Originally Posted by *Scorpion667*
> 
> Interested in seeing Minumum FPS for 1080TI in Black Ops 3 at 1080p minimum graphics settings.
> 
> Looking to see if that would reliably feed 180+fps in to the upcoming Asus 240hz 1080p monitor


1080p Graphics settings at minimum, anti-aliasing off, 60Hz screen. Motion blur off. FPS top right.



1080p, settings maxed out, everything as high as it can go. Filmic SMAA maxed as well. Motion blur off.


----------



## Mato87

Quote:


> Originally Posted by *SlimJ87D*
> 
> Got Quantum Break for $17, it's a pretty nice looking game! It runs like **** in Ultra, but honestly, turning half the settings down to High will let you run it from 60 to 85 FPS. And it still looks better than a lot of other games.
> 
> I think Remedy should have just not released the game with Ultra settings, or they should have called them "Uber" settings and the High settings "Ultra."
> 
> Anyways, here's some screenshots with 1080 Ti and 4790K.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Below, comparison between 1440P, 1080P and Upscaling
> 
> 
> 1440P Max Everything
> 
> 
> 1080P
> 
> 
> 1440P with Upscaling
> 
> Below, you can see the game still looks gorgeous at some settings turned down a little bit.
> 
> 
> Reflections set to Medium, Volumetric Lighting set to High, Shadows set to High, Upscale off, 1440P
> 
> 
> Everything set to max, Upscale off, 1440P
> 
> Below, Reflections set to Medium, Volumetric Lighting set to High, Shadows set to High, Upscale off, 1440P
> 
> 
> ]


Nice, for 17 dollars it's literally a steal!
Yep, great game, and with upscaling off and a few settings turned down you can stay above 60 FPS even during heavier combat scenarios. I thought that when I got the new card I would be able to max the game out completely, but unfortunately, as I soon discovered, that's not the case. Optimization was not one of the developer's top priorities, it seems.







Which is a huge shame, because in terms of quality it's right on par with their previous fantastic work, like the Max Payne games and Alan Wake.


----------



## Norlig

Quote:


> Originally Posted by *KedarWolf*
> 
> What exactly does the pci-e shunt mod do?
> 
> I'm just curious, can't do the shunt mods as my card sits vertically in my case.


Dunno; the guide I followed included modding it. I guess it gives a few % more power, but not as much as the 8/6-pin shunts.


----------



## leequan2009

Quote:


> Originally Posted by *Scorpion667*
> 
> Most likely coil whine which is normal.
> 
> Many of us here in the forums have experience with the specific sound of coil whine. If you want to record it and post here you could get some opinions. Do temperatures appear to be normal?


Thanks for sharing. My temperature stays under 45°C at full load. When I hear that sound, I imagine that one day my card will burn up. LOL


----------



## gstarr

My first EVGA card was the worst, 2025 MHz cap... so I just ordered a second card directly from the NVIDIA store.


----------



## havoc315

Out of Grizzly Kryonaut, Arctic Silver 5, or Coollaboratory Liquid Ultra, which should I use to cool this thing off? Or should I use an Indigo pad?


----------



## Amuro

1080 Ti FE, OC +160 core / +500 memory
4790K @ 4.5 GHz, 1.190 V
1600 MHz DDR3 CL9

Average power draw at each power limit, with the Firestrike graphics score and a roughly equivalent GPU.
Note GPU power draw is not static; the average GPU TDP was taken via HWiNFO64. Stock voltage.

 50% PL  125 W  Firestrike graphics 19475 (OC)  ~ Gigabyte G1 980 Ti
 60% PL  150 W  Firestrike graphics 25900 (OC)  ~ Asus Strix 1080 OC
 70% PL  175 W  Firestrike graphics 27800 (OC)  ~ Titan (Pascal)
 80% PL  200 W  Firestrike graphics 28760 (OC)  ~ GTX 1080 Ti
 90% PL  220 W  Firestrike graphics 29794 (OC)
100% PL  250 W  Firestrike graphics 28756 (stock), 30302 (OC)
110% PL  275 W  Firestrike graphics 30789 (OC)
120% PL  295 W  Firestrike graphics 31253 (OC)

DSR 4K on my peasant Asus VC239H IPS is very useful atm; my previous 980 Ti couldn't handle that, at least with all the bells and whistles.
Finally my old 4790K can rest from overclocking, now at 4.0 GHz, 1.024 V.
I guess you don't need to OC the CPU at 4K; the 1080 Ti will handle everything. Sorry for my English.
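If anyone wants the efficiency angle on the sweep above, here's a quick back-of-the-envelope Python script. It just divides each graphics score by the average draw from my runs, so treat the points-per-watt numbers as rough.

```python
# Points-per-watt from the power-limit sweep above.
# (PL %, average draw in watts, Firestrike graphics score; OC runs)
data = [
    (50, 125, 19475),
    (60, 150, 25900),
    (70, 175, 27800),
    (80, 200, 28760),
    (90, 220, 29794),
    (100, 250, 30302),
    (110, 275, 30789),
    (120, 295, 31253),
]

for pl, watts, score in data:
    print(f"{pl:>3}% PL: {score / watts:6.1f} graphics points per watt")
```

Interestingly, points-per-watt peaks around the 60% power limit (~173 pts/W) and falls steadily after that: going from the 80% to the 120% limit costs roughly 48% more power for about 9% more score.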


----------



## guttheslayer

Anyone know how to get to the Ghost Recon Wildlands benchmark?

I got the game but can't seem to find the benchmark option.


----------



## Radox-0

Quote:


> Originally Posted by *guttheslayer*
> 
> Anyone know how to get the recon wildland benchmark?
> 
> I got the game but cant seem to find the benchmark option.


Go into the Options menu, then the Graphics sub-menu. You will get an option there to run the benchmark: "B" by default in the graphics subsection.


----------



## kevindd992002

Quote:


> Originally Posted by *Slackaveli*
> 
> not necessarily as BF1 is a cpu hog. It is also a game that will utilize all your cores and it CRUSHES I-5s.


For discussion purposes only: let's say two hypothetical games (each run on a different system) are CPU-bottlenecked and put exactly the same load on the CPU. Will both systems get the same FPS if one has a 1080 Ti and the other has a 1070?


----------



## Radox-0

Quote:


> Originally Posted by *kevindd992002*
> 
> For discussion purposes only: let's say two hypothetical games (each run on a different system) are CPU-bottlenecked and put exactly the same load on the CPU. Will both systems get the same FPS if one has a 1080 Ti and the other has a 1070?


It can happen. Total War: Warhammer is a pretty good example; I see this within the benchmark at 1080p. My GTX 1080 and GTX 1080 Ti get within a few FPS of one another on the same CPU and settings. Looking at the CPU usage, I can see it's usually pretty high and is the overall bound on the numbers. Move out to 1440p, or my normal resolution of 3440x1440, and I can see the bound move to the GPU and the overall delta between the two cards open up a fair bit.

Guru3D actually did a pretty good article on the game with a bunch of GPUs, and you can see they show pretty high CPU usage at 1080p; this is reflected in some cards, such as the 1080 and 1070, being fairly close in FPS, with the delta opening up at 1440p and 4K. http://www.guru3d.com/articles_pages/total_war_warhammer_directx_12_pc_graphics_performance_benchmark_review,6.html
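That convergence can be sketched with a toy min() model: whichever stage is slower, CPU frame preparation or GPU rendering, sets the frame rate. All throughput numbers below are made up purely for illustration.

```python
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The slower of the two pipeline stages caps the frame rate."""
    return min(cpu_fps, gpu_fps)

# Hypothetical numbers: CPU prep rate is roughly resolution-independent,
# while GPU throughput drops as the resolution rises.
CPU_CAP = 120.0
GPU_FPS = {
    "1080 Ti": {"1080p": 200.0, "1440p": 130.0, "4K": 70.0},
    "1070":    {"1080p": 125.0, "1440p": 80.0,  "4K": 44.0},
}

for res in ("1080p", "1440p", "4K"):
    a = effective_fps(CPU_CAP, GPU_FPS["1080 Ti"][res])
    b = effective_fps(CPU_CAP, GPU_FPS["1070"][res])
    print(f"{res}: 1080 Ti {a:.0f} fps vs 1070 {b:.0f} fps")
```

At 1080p both cards run into the same CPU cap, so they tie; at higher resolutions the bound moves to the GPU and the gap opens up, which matches what the Guru3D numbers show.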


----------



## Benny89

Quote:


> Originally Posted by *gstarr*
> 
> My first EVGA card was the worst, 2025 MHz cap... so I just ordered a second card directly from the NVIDIA store.


You know that other AIB cards like the MSI Gaming X and ROG Strix are achieving 2039-2050 MHz OCs? And that even on custom water loops not many people have achieved a stable 2100?

The 1080 Ti's Pascal has very, very little OC headroom on anything other than LN2. Is a max 50 MHz difference worth that much to you?


----------



## PasK1234Xw

^ Water doesn't magically make it hit 2100. You can get an idea of what a card can do by putting the fan at 100% and keeping it at 60-65°C.
Quote:


> Originally Posted by *gstarr*
> 
> My first EVGA card was the worst, 2025 MHz cap... so I just ordered a second card directly from the NVIDIA store.


Good luck; at least they don't bin. My first card came from NVIDIA, though, and couldn't even do 2050. Also, the warranty process sucks: it took almost 2 weeks after they received my item to get my refund email, and I'm still waiting for the funds to be refunded.


----------



## Benny89

Quote:


> Originally Posted by *PasK1234Xw*
> 
> ^ Water doesn't magically make it hit 2100. You can get an idea of what a card can do by putting the fan at 100% and keeping it at 60-65°C.


Thanks for the info. So, to add that to my point: so far it seems like the max OC for your average 1080 Ti is somewhere between 2020 and 2055 MHz, which won't give you even 1 FPS more, I think.









Even more so than with the 980 Ti previously, people should just go for looks and temps/noise. Seems like all AIBs are the same anyway.


----------



## PasK1234Xw

I agree, but 2050 to 2100 is yielding people an extra 10 FPS, so we're looking at a 20 FPS increase (best case scenario) with a well-OC'd 1080 Ti.

People like you and I, running high refresh and no longer running SLI, want all we can get.


----------



## KedarWolf

Quote:


> Originally Posted by *Benny89*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gstarr*
> 
> My first EVGA card was the worst, 2025 MHz cap... so I just ordered a second card directly from the NVIDIA store.
> 
> 
> 
> You know that other AIB cards like MSI GAMING X and STRIX ROG are achieving 2039 to 2050 OC? Also that on custom water loops there are not many people who have achieved stable 2100?
> 
> 1080 Ti Pascal has very very small OC room on anything other than LN2. Max 50mhz difference is worth that much for you?

I did a full Heaven benching loop on water, fans and pump at 100%, at 2100 core / +657 memory, with no driver crashes. It would power-limit, though, and go between 1.050 V and 1.093 V during the run.

For 24/7 use I run fans at 65% and pump at 75%, at 2062 core / +610 memory at 1.031 V on a custom power curve. Temps stay below 47°C.

With no shunt mod (my card sits vertical) I get zero power limiting, and my core never dips below 2062 at 1.031 V for an entire Heaven run.

See how I did that here.









http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20


----------



## PasK1234Xw

Just want to throw this out there for those not happy with the EVGA 1080 Ti hybrid kit price shenanigans: it seems the MSI 1080 Ti Sea Hawk cards are out and about and will be a much cheaper option than the FE + hybrid kit combo.

https://www.msi.com/Graphics-card/GeForce-GTX-1080-Ti-SEA-HAWK.html#hero-overview

https://www.msi.com/Graphics-card/GeForce-GTX-1080-Ti-SEA-HAWK-X.html#hero-overview


https://www.reddit.com/r/635uv2/denmark_proshop_just_received_their_msi_shipment/


----------



## Benny89

Quote:


> Originally Posted by *KedarWolf*
> 
> I did a full benching Heaven loop on water, fans and pump at 100% at 2100, +657 memory, no driver crashes. Would power limit though and go from between 1.050v and 1.093v during the run though.
> 
> For 24/7 use i run fans at 65%,pump at 75% at 2062 core, +610 memory at 1.031v custom power curve.. Temps stay below 47C.
> 
> With no shunt mod (my card sits vertical) I get zero power limiting and my core will never dip below 2062 at 1.031v an entire Heaven run.
> 
> See how I did that here.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20


I was reading your thread there. Good stuff!









But you are at 2062 MHz now, and in all the AIB reviews so far (Strix and Gaming X only) there was no problem hitting that 2040-2050 mark with a simple Afterburner OC. Reading around the internet, all the ROG Strix 1080 Tis so far were capping at 2050 on OC.

What I am trying to say is: you can try to squeeze every last drop out of your 1080 Ti, but the results won't be anywhere near what we saw in previous generations.

Besides, what will the difference in games (not benchmarks) be between 2025 MHz and 2065 MHz? If any.

Wish we had more OC room with Pascal.









I wonder if we will see any OC room with Volta if this keeps up.


----------



## Biggu

Quote:


> Originally Posted by *leequan2009*
> 
> Hi guys
> I just installed a block on my 1080 Ti, and I don't know if there is a problem with my card.
> Under full load, if I put my ear near the card, there is a small sound inside the card and block, but at idle or under light work there is no sound. (I searched Google; this sounds like coil whine?)
> I don't know whether this is normal or abnormal for my card. Has anyone else had this?


Please tell me you took the plastic off the thermal pads before you put that block on.


----------



## Slackaveli

Quote:


> Originally Posted by *kevindd992002*
> 
> For discussion purposes only, let's say two hypothetical games (each ran on a different system) are CPU-bottlenecked and they exactly have the same load on the CPU, will both systems have the same fps if one has a 1080Ti and the other has a 1070?


If the CPU is the bound, will the GPUs perform the same? The answer to THAT is yes. Not the exact same FPS, but very, very close. But if they have different CPUs, the one with the higher IPC / single-core performance / cache speed and size will have higher FPS.
Quote:


> Originally Posted by *havoc315*
> 
> Out of Grizzly Kryonaut, Arctic Silver 5, or Coollaboratory Liquid Ultra, which should I use to cool this thing off? Or should I use an Indigo pad?


Kryo, 100%.


----------



## kevindd992002

Quote:


> Originally Posted by *Slackaveli*
> 
> if the cpu is bound will the gpu's perform the same and the answer to THAT is yes. Not the exact same fps but very very close. But if they have different cpu's, the one with the higher IPC/single core perf/cache speed and size will have higher FPS.


I see. So in BF1, an OC'd 2600K (4.5GHz) will probably be CPU-bottlenecked with a 1080 Ti at 1440p, because the game is such a CPU hog that the CPU can't keep up with the number of draw calls the 1080 Ti needs to achieve a high FPS, correct?


----------



## BucketInABucket

The ghetto bench while I wait for the rest of my parts to arrive!


----------



## mshagg

Quote:


> Originally Posted by *KedarWolf*
> 
> What exactly does the pci-e shunt mod do?
> 
> I'm just curious, can't do the shunt mods as my card sits vertically in my case.


The three shunts report the power being drawn from the three power sources (PCIe slot, 8-pin, 6-pin). Screwing with them (lowering their resistance) causes them to read low. The PCIe slot has, I understand, quite a low power limit (someone interpreted the BIOS earlier in the thread with the exact readouts), so causing it to report lower than what it is actually drawing has obvious benefits for GPU Boost.
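A toy calculation of why that works, assuming the usual shunt-sense arrangement: the controller measures the voltage across the shunt and converts it to current using the factory resistance, so adding liquid metal or solder on top (lowering the effective resistance) scales the reading down proportionally. The resistance values here are invented for the example, not measured from a 1080 Ti.

```python
def reported_watts(actual_watts: float,
                   r_factory_mohm: float,
                   r_effective_mohm: float) -> float:
    # Voltage across the shunt: V = I * R_effective.
    # The controller computes I = V / R_factory, so both the current and
    # the power reading scale by R_effective / R_factory.
    return actual_watts * (r_effective_mohm / r_factory_mohm)

# e.g. a hypothetical 5 mOhm shunt halved to 2.5 mOhm by the mod:
print(reported_watts(300.0, 5.0, 2.5))  # reads 150.0 W, so boost keeps going
```

That's why the PCIe slot's shunt matters even though its limit is low: under-reporting that one source keeps the total under the cap GPU Boost enforces.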


----------



## leequan2009

Has anyone got coil whine after installing a block on their 1080 Ti? Is it normal to continue using the card and playing games?


----------



## Alwrath

Hey guys, whats everyone getting for Heaven 4K benches? This is my result with stock voltage, +115 on the core, +309 on the memory, 3.9 ghz Ryzen 1700.

No AA, everything else max.


----------



## JedixJarf

Quote:


> Originally Posted by *Biggu*
> 
> Please tell me you took the plastic off the thermal pads before you put that block on.


Lol, I was thinking the same thing.


----------



## leequan2009

Quote:


> Originally Posted by *Biggu*
> 
> Please tell me you took the plastic off the thermal pads before you put that block on.


Yes; that photo was just for show. When I mounted the block on the card I took the plastic off. LOL

Runs so cool with the EK block and a 480 radiator.


----------



## KingEngineRevUp

I need to stop coming here, lol. The more I visit this thread, the more I am tempted to do the shunt mod to play at 2114 Mhz... But for what? Literally 1.5% extra performance? RESIST THE URGE!


----------



## Blotto80

Quote:


> Originally Posted by *leequan2009*
> 
> Too cold with EK Block and 480 Radiator


Those temps!!! Now I want that. I was extremely happy plodding along at 47°C until I saw it.


----------



## Iceman2733

Quote:


> Originally Posted by *leequan2009*
> 
> Has anyone got coil whine after installing a block on their 1080 Ti? Is it normal to continue using the card and playing games?


Coil whine is normal for GPUs; a lot of the time it doesn't become noticeable until after installing a water block, because all the noise from the fans on the GPU is gone. Also, as asked above, did you remove the backings on your thermal pads when installing? In the pic you posted you still have some of the backings on.


----------



## leequan2009

Quote:


> Originally Posted by *Iceman2733*
> 
> Coil whine is normal for GPU's a lot of time it doesn't become noticeable until after installing a water block due to all the noise of the fans on the GPU being gone. Also as asked above did you remove the backings on your thermal pads when installing? In your pic you posted you still have some of the backings on??


Thanks for sharing your experience. Yes, I took the plastic off the thermal pads when I mounted the block on the card.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Blotto80*
> 
> Those temps!!! Now I want, I was extremely happy plodding along at 47c until I saw that.


But how much of a difference would it be to go from 47°C to 35°C? Nearly negligible?


----------



## Slackaveli

Quote:


> Originally Posted by *kevindd992002*
> 
> I see. So in BF1, an OC'd 2600K (4.5GHz) will probably be CPU-bottlenecked with a 1080Ti at 1440p because the game is CPU hog in such a way that it can't keep up with the number of draw calls needed by the 1080Ti to achieve a high FPS, correct?


at 144fps, yes. at 60fps, probably not.
Quote:


> Originally Posted by *SlimJ87D*
> 
> But how much of a difference would it be to go form 47C to 35C? Almost nearly negligible?


Yes. Just like gaming at 1974 is basically the same as 2100; it's all in our OCD'd-out heads. I guarantee you the rate of OCD on this board is very, very high







Now, if one were getting 57 FPS at 4K, or sitting at 96 FPS on a 3440x1440 screen and just needed to max it out (even then it's silly, but screw it!), then there's a reason to push for the MAX.


----------



## Blotto80

Quote:


> Originally Posted by *SlimJ87D*
> 
> But how much of a difference would it be to go form 47C to 35C? Almost nearly negligible?


It'll make a difference to how I feel







I've been doing everything I can to keep it under the 51°C throttle point the last few days. I've settled down to 2050 MHz @ 1.031 V, but with more temp headroom I could push the volts back up to stock and get 2100 MHz full time.

Or I could get no perceivable speed gain but get to brag that I run my overclocked Ti at 36c. lol.


----------



## Slackaveli

that's where it comes into play: Peace of Mind.


----------



## rcfc89

Quote:


> Originally Posted by *Slackaveli*
> 
> at 144fps, yes. at 60fps, probably not.




I'm not sure I would agree with this. This is my 3930K @ 4.5 GHz at only 45% usage with a pair of 980 Tis (98-99% usage), running 3440x1440 Ultra completely maxed out at 114 FPS.


----------



## Biggu

Quote:


> Originally Posted by *havoc315*
> 
> Out of Grizzly Kryonaut, Arctic Silver 5, or Coollaboratory Liquid Ultra, which should I use to cool this thing off? Or should I use an Indigo pad?


I used Conductonaut on my 1080 Ti... never using that stuff again. Now that it's in place it's staying, but I will go back to Kryonaut in the future.


----------



## p2im0

Quote:


> Originally Posted by *Biggu*
> 
> I used Conductonaut on my 1080 Ti..... Never using that stuff again. Now that its in place its staying but I will go back to kryonaut for the future.


I just used Conductonaut for the first time to do the shunt mod and when mounting my EVGA Hybrid cooler. It's super difficult to work with, and the first two times I mounted my hybrid cooler I didn't use enough, so I had to rip it apart and apply more. I also used it when remounting my CPU cooler and had the same problem (check the temps above); I needed to pull the CPU cooler off and re-apply too.

The results of the shunt mod worked well for me though (with vs without shunt)


----------



## gstarr

Anyone have an idea if the EVGA CLC 280 (a CPU cooler) will fit on the 1080 Ti GPU?


----------



## alucardis666

Quote:


> Originally Posted by *gstarr*
> 
> Anyone have an idea if the EVGA CLC 280 (a CPU cooler) will fit on the 1080 Ti GPU?


I'm kinda curious about this as well. Would be interesting and probably 5-10c better than the hybrid cooler depending on the fans you strap to it.


----------



## Boost240

Quote:


> Originally Posted by *PasK1234Xw*
> 
> I agree but 2050 to 2100 is yielding people an extra 10fps so looking at 20fps increase (best case scenario) with great OC 1080 ti
> 
> People like you and I running high refresh and no longer running SLI want all we can get.


An extra 10-20fps just by going up 50Mhz?


----------



## alucardis666

Quote:


> Originally Posted by *Boost240*
> 
> An extra 10-20fps just by going up 50Mhz?


It's probably more like 3-5fps best case scenario.









My card can bench @ 2100 but isn't game-stable over 2050, so 2050 is where I keep it.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Boost240*
> 
> An extra 10-20fps just by going up 50Mhz?


Lol, no way do you get 10-20 FPS from the 2.4% increase from 2050 to 2100...

If you're at 100 FPS, you'll get 101-102 FPS.

If you're at 50 FPS, you'll get 51 FPS.

If you're at 144 FPS, you'll get 145-147 FPS.

You'd have to boost to 2255-2460 MHz to get 10-20 FPS extra.

And this assumes a generous 1:1 linear scaling of FPS with clock.
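The arithmetic above, as a quick Python sanity check under the same generous 1:1 clock-to-FPS scaling assumption:

```python
def scaled_fps(base_fps: float, base_mhz: float, new_mhz: float) -> float:
    # Optimistic assumption: FPS scales 1:1 with core clock.
    return base_fps * new_mhz / base_mhz

print(round(scaled_fps(100, 2050, 2100), 1))  # ~102.4 fps
print(round(scaled_fps(144, 2050, 2100), 1))  # ~147.5 fps

# Clock needed for +10 and +20 fps on a 100 fps baseline at 2050 MHz:
print(round(2050 * 110 / 100))  # ~2255 MHz
print(round(2050 * 120 / 100))  # ~2460 MHz
```

In practice scaling is sub-linear (memory bandwidth and CPU limits get in the way), so the real-world gain from an extra 50 MHz is even smaller.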


----------



## dunbheagan

Hey guys, I am going to use Grizzly Conductonaut on my die, and I decided to isolate the small parts on the side of the die with black nail polish. I couldn't test it yet, but visually it looks OK, IMO. What do you think? I am planning the same thing to isolate the area around the shunts for the shunt mod.


----------



## Biggu

Looks fine to me. I never did that on mine, though, so I dunno if it's overkill or not. Then again, my 1080 Ti has a warranty at a local store, so if it blows up it gets exchanged, no questions asked.


----------



## dunbheagan

Yeah, I know what you mean; maybe it is not necessary at all. But it gives me a little extra feeling of security, which is very welcome since it is the first time I am playing around with liquid metal...

This is how it looks around the shunts:


----------



## p2im0

Quote:


> Originally Posted by *dunbheagan*
> 
> Hey guys, i am going to use grizzly conductonaut on my die and decided to isolate the small parts on the side of die with black nail polish. Couldnt test it yet but visually it looks ok imo. What do you think about it? I am planning the same thing to isolate the area around the shunts for shunt modding.


I did the exact same thing, but with liquid electrical tape around the GPU and regular electrical tape around the shunts (I did the shunts first and forgot about the liquid electrical tape; it put my mind at ease).

Start off really light with the Conductonaut on the GPU (the instructions say a pin-head-size amount, but you will end up needing more than that). Don't forget to do both the die and the GPU water block. I ended up not putting enough on and had my GPU temps much higher than with the Arctic Silver I had on previously.

You don't want it to the point where it's liquidy and moves around when you move the GPU, but enough that you can't see (or feel with the q-tip) the NVIDIA printing when rubbing the Conductonaut on the die.

Post pics when you do it!

Here are my shunts after the first attempt, then after the second (I didn't put enough on the first time)


----------



## Mato87

Quote:


> Originally Posted by *dunbheagan*
> 
> Yeah, i know what you mean, maybe it is not necesary at all. But it gives me a little extra feeling of secureness, which is very welcome since it is the first time i am playing around with liquid metall...
> 
> This is how it looks around the shunts:


Damn, what is that? Looks dangerous.


----------



## dunbheagan

You think so? I don't see why; it is just glossy black nail polish. The gloss makes it look very wet, maybe that's why you think it looks dangerous. But it has already totally hardened. I am still positive.


----------



## Mato87

Quote:


> Originally Posted by *dunbheagan*
> 
> You think so? I dont see why. It is just glossy black nail polish. The gloss makes it look very wet, maybe thats why you think it looks dangerous. But it has already totally hardened. I am still positive


What's it for ?


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Lol, no way do you get 10-20 FPS from a 2.4% increase from 2050 to 2100...
> 
> If you're at 100 FPS, you'll get 101-102 FPS
> 
> If you're at 50 FPS, you'll get 51 FPS
> 
> If you're at 144 FPS, you'll get 145-147 FPS
> 
> You'd have to boost to 2255 - 2460 Mhz to get 10-20 FPS extra.
> 
> This is assuming a generous 1:1 linear growth with Clock vs FPS.


foiled by that pesky math...


----------



## dunbheagan

@Mato87
I will use liquid metal on the shunts for the shunt mod. The nail polish is to isolate the small parts around the shunts, in case the liquid metal accidentally spills on them.


----------



## Slackaveli

Quote:


> Originally Posted by *Mato87*
> 
> What's it for ?


To keep the conductive Conductonaut off the transistors around the GPU die itself.
Quote:


> Originally Posted by *dunbheagan*
> 
> @Mato87
> I will use liquid metal on the shunts for the shunt mod. The nail polish is to isolate the small parts around the shunts, in case the liquid metal accidentally spills on them.


It works for the shunt mod, I'm told. I am seriously anxious about you using it on the die itself, though. I'd go Kryonaut there.


----------



## mouacyk

I think the most reliable solution so far has been soldering a thin wire across the resistor. I guarantee that for Volta, NVIDIA is going to put some kind of warranty-sealed cap over the resistors.


----------



## Cr4zy

Quote:


> Originally Posted by *KedarWolf*
> 
> I did a full benching Heaven loop on water, fans and pump at 100% at 2100, +657 memory, no driver crashes. Would power limit though and go from between 1.050v and 1.093v during the run though.
> 
> For 24/7 use i run fans at 65%,pump at 75% at 2062 core, +610 memory at 1.031v custom power curve.. Temps stay below 47C.
> 
> With no shunt mod (my card sits vertical) I get zero power limiting and my core will never dip below 2062 at 1.031v an entire Heaven run.
> 
> See how I did that here.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20


Nice. I've been running 2050 MHz @ 1.000 V and +500 memory for the past few days, and it has handled everything without issue so far; peaks at 41°C. It manages 2000 MHz at 0.931 V too.

I can't hit a stable 2100, it seems, but I'm happy with this.


----------



## gkolarov

Here is the BIOS file for the ROG-STRIX-GTX1080TI-O11G-GAMING, in case someone can extend the power limit and/or add a little more voltage.











1080ti_strix.zip 154k .zip file


----------



## dunbheagan

Quote:


> Originally Posted by *Slackaveli*
> 
> It works for the shunt mod, I'm told. I am seriously anxious about you using it on the die itself, though. I'd go Kryonaut there.


Honestly, I am a bit anxious too, messing around with liquid metal on such an expensive card. But I am surely not the first one using Conductonaut directly on the die, GPU or CPU. Reviews say you reliably get a few degrees off from it, so I want to try it. Wish me luck!


----------



## Mato87

Quote:


> Originally Posted by *dunbheagan*
> 
> @Mato87
> I will use liquid metal on the shunts for the shunt mod. The nail polish is to isolate the small parts around the shunts, in case the liquid metall accidantelly spills on them.


Oh ok, I guess that's a whole other level of modding.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *mouacyk*
> 
> I think the most reliable solution so far has been soldering a thin wire across the resistor. I guarantee that for Volta, NVidia is going to put some kind of warranty-sealed cap over the resistors.


The shunt mod has been around since the 600-series cards, so who knows when or if that warranty sticker will come about.

Hopefully one day someone figures out how to modify the BIOS; that would be better.


----------



## mouacyk

Quote:


> Originally Posted by *SlimJ87D*
> 
> The shunt has been going on since 600 series cards. So who knows when and if that warranty sticker will come about.
> 
> Hopefully one day someone figures out how to modify the BIOs, that's better.


As you can see, NVIDIA is locking things down more tightly each successive generation, almost in reaction to overclocking revelations, it seems. You didn't need shunt mods for the 600/700/900 series, because the BIOSes weren't locked; at least AIBs had access to unlock power and voltage beyond reference specs. With the 1000 series, you got Boost 3.0 and AIB BIOS lockdowns as well. The OC headroom and opportunities are dwindling, and the next logical step is to make hardware mods like this impossible.


----------



## alucardis666

Quote:


> Originally Posted by *mouacyk*
> 
> As you can see, NVidia is locking down more tightly each successive generation -- it seems almost in reaction to overclocking revelations. You didn't need shunts for 600/700/900 series, because the BIOSes weren't locked -- at least AIB's had access to unlock power and voltage beyond reference specs. With 1000, you had Boost 3.0 and AIB BIOS lockdowns as well. The OC headroom and opportunities are dwindling and the next logical step is to make hardware mods like this impossible.


Agreed!









Hoping they stop impeding the modding and ocing potential of their products.


----------



## mshagg

Quote:


> Originally Posted by *gkolarov*
> 
> Here is the BIOS file of ROG-STRIX-GTX1080TI-O11G-GAMING . If someone can extend the power limit and/or add a little more voltage
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1080ti_strix.zip 154k .zip file


Nice! There was a guy here who analysed the FE one; it will be interesting to see how this power limit compares to the FE's.

In theory this could be flashed to a card, right? I.e., it's already signed?


----------



## p2im0

Quote:


> Originally Posted by *dunbheagan*
> 
> Honestly i am a bit anxious too messing around with liquid metal on such an expensive card. But i am surely not the first one using conductonaut directly on the die, GPU or CPU. Reviews say you surely get a few degrees less from it, so i want to try it. Wish me luck


Good luck! I have cryonaut (edit: Conductonaut) on my GPU die now and I needed more than I expected to get good contact between the block and the die.


----------



## dunbheagan

Quote:


> Originally Posted by *p2im0*
> 
> Good luck! I have cryonaut (edit: Conductonaut) on my GPU die now and I needed more than I expected to get good contact between the block and the die.


Thanks for the advice!


----------



## JedixJarf

Quote:


> Originally Posted by *mshagg*
> 
> Nice! There was a guy here that analysed the FE one, will be interesting to see how the power limit compares to the FE.
> 
> In theory this could be flashed to a card right? i.e. it's already signed?


In theory yes, that is how I did it with the 1080 at least.


----------



## Boost240

I notice that my 1080 Ti won't clock down to 138 MHz or whatever at idle; it's stuck at 1544 MHz with no load. The same thing happened with my outgoing 970. I did a fresh install of the drivers and it seemed fine at first, but now it has gone back to doing it. Has anyone seen this before and found a solution? Thanks.


----------



## chibi

Quote:


> Originally Posted by *Boost240*
> 
> I notice that my 1080 Ti won't clock down to 138 MHz or whatever at idle; it's stuck at 1544 MHz with no load. The same thing happened with my outgoing 970. I did a fresh install of the drivers and it seemed fine at first, but now it has gone back to doing it. Has anyone seen this before and found a solution? Thanks.


What monitor are you using, and what is your desktop resolution/refresh rate? 144Hz may be keeping your card clocked higher.


----------



## Boost240

Two 1080p monitors, one 60hz, the other 144hz. I'll unplug one at a time and see if it makes a difference. Thanks.


----------



## p2im0

Quote:


> Originally Posted by *Boost240*
> 
> Two 1080p monitors, one 60hz, the other 144hz. I'll unplug one at a time and see if it makes a difference. Thanks.


Also check your Windows power plan. If it's set to high performance, your GPU and CPU will stay clocked up. This happens to me after exiting SteamVR with my Vive; it doesn't change the power plan back.


----------



## KingEngineRevUp

Okay, from a UL standpoint, nail polish IS NOT OKAY because it's flammable, the opposite of what UL looks for: flame-resistant material.

I would have used circuit board RTV instead; that's what it's made for!

Use RTV, DAMN IT!

You remember the EVGA VRMs blowing up? Would you want nail polish around that if it happened?


----------



## Mato87

Quote:


> Originally Posted by *SlimJ87D*
> 
> Okay, from a UL standpoint, nail polish IS NOT OKAY because it's flammable, the opposite of what ul looks for, flame resistant material.
> 
> I would have used circuit board RTV instead, that's what it's made for!
> 
> Use RTV DAMN IT!
> 
> You remember EVGA VRM blowing up? You want nail polish around that if it happened?


Oh damn, that guy fudged up, I guess?


----------



## spyui

Quote:


> Originally Posted by *Boost240*
> 
> I've noticed that my 1080 Ti won't clock down to 138MHz or so at idle. It's stuck at 1544MHz with no load. The same thing happened with my outgoing 970. I did a fresh install of the drivers and it seemed fine at first, but now it's back to doing it. Has anyone seen this before and found a solution? Thanks.


Do you have any programs running in the background? Some programs use GPU power for their UI. There's also an Nvidia bug that hasn't been fixed for the last three generations: if you're using more than two monitors, the card won't clock down to 138MHz at idle.


----------



## hotrod717

Quote:


> Originally Posted by *SlimJ87D*
> 
> Okay, from a UL standpoint, nail polish IS NOT OKAY because it's flammable, the opposite of what ul looks for, flame resistant material.
> 
> I would have used circuit board RTV instead, that's what it's made for!
> 
> Use RTV DAMN IT!
> 
> You remember EVGA VRM blowing up? You want nail polish around that if it happened?


People have been using nail polish as an insulator on circuit boards and chips for a long time. It's really not an issue as long as it's allowed to dry.


----------



## Boost240

Quote:


> Originally Posted by *spyui*
> 
> Do you have any programs running in the background? Some programs use GPU power for their UI. There's also an Nvidia bug that hasn't been fixed for the last three generations: if you're using more than two monitors, the card won't clock down to 138MHz at idle.


Nothing running in the background. I went into the Nvidia control panel and set power management to adaptive. It was set to performance. I rebooted and now it's idling fine.


----------



## Slackaveli

Quote:


> Originally Posted by *dunbheagan*
> 
> Honestly i am a bit anxious too messing around with liquid metal on such an expensive card. But i am surely not the first one using conductonaut directly on the die, GPU or CPU. Reviews say you surely get a few degrees less from it, so i want to try it. Wish me luck


definitely, Good luck, Brother!


----------



## Somasonic

Quote:


> Originally Posted by *Boost240*
> 
> Nothing running in the background. *I went into the Nvidia control panel and set power management to adaptive*. It was set to performance. *I rebooted and now it's idling fine*.


This means that something that was running was using the card, and with the card set to max performance it stayed at full clocks. Steam, GOG, and Origin are possible culprits. You can get around this by setting the NVCP back to max performance and then creating profiles set to adaptive for anything you don't want running at max clocks. Or just leave it as is.


----------



## Boost240

Quote:


> Originally Posted by *Somasonic*
> 
> This means that something that was running was using the card and being set to max performance it was staying at full clocks. Steam, GOG and Origin are possible culprits for this. You can get around this by setting the NVCP back to max performance and then setting up a profile for anything else that you don't want max clocks for and setting that to adaptive. Or leave it as is


I think it was Chrome actually. I will try the profile thing though.


----------



## kevindd992002

Quote:


> Originally Posted by *rcfc89*
> 
> I'm not sure I would agree with this. This is my 3930k @4.5ghz only at 45% usage with a pair of 980Ti's(98-99% usage) running 3440x1440p Ultra completely maxed out 114fps.


Now that's interesting.

@Mato87 and Slackaveli

Any thoughts on this behavior?
Quote:


> Originally Posted by *p2im0*
> 
> I just used Conductonaut for the first time to do the shunt mod and when mounting my EVGA Hybrid cooler. It's super difficult to work with and the first two times I mounted my hybrid cooler I didn't use enough so I had to rip it apart and apply more. I also used it when remounting my CPU cooler and had the same problem - check the temps above - I needed to pull off my CPU cooler and re-apply too.
> 
> The results of the shunt mod worked well for me though (with vs without shunt)


So the shunt mod completely zeroes out the power-limit reading? In theory it should (since you short out the resistor), but I thought it would just lower it significantly below the real limit?


----------



## KedarWolf

Quote:


> Originally Posted by *gkolarov*
> 
> Here is the BIOS file of ROG-STRIX-GTX1080TI-O11G-GAMING . If someone can extend the power limit and/or add a little more voltage
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1080ti_strix.zip 154k .zip file


I successfully flashed the Strix BIOS to my 1080 TI FE.

I'm getting 2088 core, 6102 memory on the first run in Heaven under water, with no throttling and no driver crashes. My power limit hasn't gone over 106%, no shunt mod.

Here's how I did it









nvflash_5.353.0.zip 3031k .zip file


1080ti_strix.zip 154k .zip file


Unzip NVFlash and the Strix bios to a folder.

Disable your video card in Control Panel in Device Manager.

Run an admin command prompt and cd to the folder you made.

Do in command prompt:

nvflash64 --protectoff

Then do:

nvflash64 --save filename.rom

to backup original bios.

nvflash64 -6 biosfilename.rom

Reboot, if the card is not enabled in Device Manager, enable it, reboot again. Good idea to reinstall Nvidia driver, reboot last time, done.

For more than one card.

nvflash64 --list

nvflash64 -6 --index=0 BIOSfilename.rom

nvflash64 -6 --index=1 BIOSfilename.rom

Choose yes when asked to override mismatched IDs.


----------



## alucardis666

Quote:


> Originally Posted by *KedarWolf*
> 
> I successfully flashed the Strix BIOS to my 1080 TI FE.
> 
> I'm getting 2088 core, 6102 memory first run in Heaven under water with no throttling, no driver crashes. My power limit hasn't gone over 106, no shunt mod.
> 
> Here's how I did it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> nvflash_5.353.0.zip 3031k .zip file
> 
> 
> 1080ti_strix.zip 154k .zip file
> 
> 
> Unzip NVFlash and the Strix bios to a folder.
> 
> Disable your video card in Control Panel in Device Manager.
> 
> Run an admin command prompt and cd to the folder you made.
> 
> Do in command prompt:
> 
> nvflash64 --protectoff
> 
> Then do:
> 
> nvflash -6 BIOSfilename.rom
> 
> Reboot, if the card is not enabled in Device Manager, enable it, reboot again. Good idea to reinstall Nvidia driver, reboot last time, done.
> 
> For more than one card.
> 
> nvflash64 --list
> 
> nvflash64 -6 --index=0 BIOSfilename.rom
> 
> nvflash64 -6 --index=1 BIOSfilename.rom
> 
> Choose yes when asked to override mismatched IDs.


You've got some balls, bud. Glad it worked and got you a better result. Idk if I should try it too with my EVGA FE card XD.

Also, how did you back up your stock bios? GPU-Z says bios reading isn't supported.


----------



## chibi

You must not know Kedarwolf then. He is the master at Titan X (M) bios editing. Good ole times








+rep and will try the flash once my waterblock is in.


----------



## alucardis666

Quote:


> Originally Posted by *chibi*
> 
> You must not know Kedarwolf then. He is the master at Titan X (M) bios editing. Good ole times
> 
> 
> 
> 
> 
> 
> 
> 
> +rep and will try the flash once my waterblock is in.


Not doubting his abilities, just saying I'd like to back up my card's bios before I try flashing another.


----------



## mshagg

Quote:


> Originally Posted by *kevindd992002*
> 
> Now that's interesting.
> 
> @Mato87 and Slackaveli
> 
> Any thoughts on this behavior?
> So the shunt mod completely zeroes out the reading of the power limit? I mean in theory it should (as you short out the resistor) but I thought it will just lower it down significantly below the real limit?


I don't believe it zeros it completely, there needs to be a minimum reading or the card will enter a 'limp home' strategy.
Quote:


> Originally Posted by *KedarWolf*
> 
> I successfully flashed the Strix BIOS to my 1080 TI FE.
> 
> I'm getting 2088 core, 6102 memory first run in Heaven under water with no throttling, no driver crashes. My power limit hasn't gone over 106, no shunt mod.
> 
> Here's how I did it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> nvflash_5.353.0.zip 3031k .zip file
> 
> 
> 1080ti_strix.zip 154k .zip file
> 
> 
> Unzip NVFlash and the Strix bios to a folder.
> 
> Disable your video card in Control Panel in Device Manager.
> 
> Run an admin command prompt and cd to the folder you made.
> 
> Do in command prompt:
> 
> nvflash64 --protectoff
> 
> Then do:
> 
> nvflash -6 BIOSfilename.rom
> 
> Reboot, if the card is not enabled in Device Manager, enable it, reboot again. Good idea to reinstall Nvidia driver, reboot last time, done.
> 
> For more than one card.
> 
> nvflash64 --list
> 
> nvflash64 -6 --index=0 BIOSfilename.rom
> 
> nvflash64 -6 --index=1 BIOSfilename.rom
> 
> Choose yes when asked to override mismatched IDs.


Now we're talking. I've been keen to get an OC bios on its own because Afterburner seems to cause dramas with SteamVR, presumably because they're both working with the GPU at quite a low level.


----------



## dentnu

Quote:


> Originally Posted by *KedarWolf*
> 
> I successfully flashed the Strix BIOS to my 1080 TI FE.
> 
> I'm getting 2088 core, 6102 memory first run in Heaven under water with no throttling, no driver crashes. My power limit hasn't gone over 106, no shunt mod.
> 
> Here's how I did it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> nvflash_5.353.0.zip 3031k .zip file
> 
> 
> 1080ti_strix.zip 154k .zip file
> 
> 
> Unzip NVFlash and the Strix bios to a folder.
> 
> Disable your video card in Control Panel in Device Manager.
> 
> Run an admin command prompt and cd to the folder you made.
> 
> Do in command prompt:
> 
> nvflash64 --protectoff
> 
> Then do:
> 
> nvflash -6 BIOSfilename.rom
> 
> Reboot, if the card is not enabled in Device Manager, enable it, reboot again. Good idea to reinstall Nvidia driver, reboot last time, done.
> 
> For more than one card.
> 
> nvflash64 --list
> 
> nvflash64 -6 --index=0 BIOSfilename.rom
> 
> nvflash64 -6 --index=1 BIOSfilename.rom
> 
> Choose yes when asked to override mismatched IDs.


How safe is it to flash this to an FE card? From what I saw, the Strix is a custom board made by Asus using 2x 8-pin, vs the FE's one 6-pin and one 8-pin. Could this bios damage the FE card or cause any other issues in the long run?


----------



## evassion

I have an EVGA 1080 Ti FE too... let me know if you decide to do it and flash your card.

What advantage did you get from flashing, compared to before? Higher clocks and better stability?


----------



## chibi

Quote:


> Originally Posted by *alucardis666*
> 
> You've got some balls bud, glad it worked and helped you get a better result. Idk if I should also try it or not with my EVGA FE card XD.
> 
> Also how did you backup your stock bios? GPU-z is saying bios reading isn't supported.


Forgot to answer your question: in post #3181 he explains how to do a bios dump with NVFlash.


----------



## alucardis666

Quote:


> Originally Posted by *chibi*
> 
> Forgot to answer your question, in Post# 3181, he explains how to do a bios dump with NVFlash.


Thanks!


----------



## looniam

i'll just leave this here: (sorry if already posted)

https://www.ekwb.com/news/official-list-ek-water-blocks-geforce-gtx-1080-ti-series/


----------



## KingEngineRevUp

Quote:


> Originally Posted by *hotrod717*
> 
> People have been using nail polish to isolate on circuit boards and chips for a long time. Really not an issue as long it is allowed to dry.


It's still not flame resistant when dry. It can actually catch fire; not saying it will ever reach those temperatures, but otherwise it would be a magical industry standard to just coat PCBs in nail polish.

But why not use RTV? You can also remove RTV much more easily for an RMA, and the kind made to be put on circuit boards is just that: made to be put on circuit boards.

The lid of a CPU is held on by a silicone sealing gasket.


----------



## p2im0

Quote:


> Originally Posted by *kevindd992002*
> 
> So the shunt mod completely zeroes out the reading of the power limit? I mean in theory it should (as you short out the resistor) but I thought it will just lower it down significantly below the real limit?


The power is still displayed, it's just lower since current is bypassing the shunts. If you blow up the original screenshot, you can see I went from ~114% to 87% power with higher clocks.
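To see why the reading drops but doesn't hit zero, here is a rough sketch of the arithmetic. The resistor values below are made-up assumptions for illustration, not measurements from any 1080 Ti: the controller still divides the shunt voltage by the stock resistance, so it under-reads in proportion to how much the parallel path lowers the effective resistance.

```python
# Rough sketch of shunt-mod under-reporting. R_STOCK and R_ADDED are
# illustrative assumptions, not values measured from any actual card.
R_STOCK = 0.005   # ohms, assumed stock shunt resistor
R_ADDED = 0.010   # ohms, assumed resistance of the added parallel path

def reported_fraction(r_stock: float, r_added: float) -> float:
    """The controller still assumes r_stock, so the reported current
    (and therefore power) scales by r_eff / r_stock, where r_eff is
    the parallel combination of the stock shunt and the added path."""
    r_eff = (r_stock * r_added) / (r_stock + r_added)
    return r_eff / r_stock

actual_watts = 300.0
print(f"reported: {actual_watts * reported_fraction(R_STOCK, R_ADDED):.0f} W "
      f"of {actual_watts:.0f} W actual")  # → reported: 200 W of 300 W actual
```

With these assumed values the card would report 200 W while actually drawing 300 W, matching the pattern described above: a lower reading at the same or higher real draw, but never zero.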


----------



## xartion

Just ordered the Strix 1080TI from Newegg after trying for the last 30 hours









Crossing my fingers -- even though the order went through and I got the confirmation email, it's still in the verification stage and newegg has been known to cancel orders.

Edit: 'We are happy to inform you that your order (Sales Order Number: #########) has been successfully charged to your VISA and order verification is now complete.' Does this mean that I got it? I really hope so


----------



## Mato87

Quote:


> Originally Posted by *xartion*
> 
> Just ordered the Strix 1080TI from Newegg after trying for the last 30 hours
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit: crossing my fingers -- even though the order went through and I got the confirmation email, it's still in the verification stage and newegg has been known to cancel orders


Good luck, the card is a beast. I fired up Crysis 3 a few hours ago, and with everything maxed out at 1080p, even with maxed-out AA, I get around 100-190 fps. Even in the demanding dome area I get around 80-140 fps. It's a crazy fast card.


----------



## Mato87

Quote:


> Originally Posted by *kevindd992002*
> 
> Now that's interesting.
> 
> @Mato87
> Any thoughts on this behavior??


Well, first of all he has two GTX 980 Tis, and second, he has a six-core, twelve-thread i7, so of course he'll only see around 50% CPU utilization.


----------



## dentnu

I just made a backup of my EVGA GTX 1080 Ti FE bios. Here it is if anyone needs it. It would be great if everyone could start making backups of their cards' bios so we can compare and try them out.

1. Download NVFlash 5.353.0 (https://www.techpowerup.com/download/nvidia-nvflash/)

2. Extract the contents of the nvflash zip file and place them in a new folder called nvflash on your C drive.

3. Disable the video card in Device Manager.

4. Run CMD (Admin)

5. Type CD C:\

6. Type CD nvflash

7. Use the command: nvflash64 -b backup.rom

8. You should now have the backup.rom file in your nvflash folder.

9. Enable the video card in Device Manager.

10. DONE (Please post a copy of your card's bios)

(Please note there is zero risk of anything happening to your card, as you are only reading the bios, not flashing anything to it)

Here is my EVGA 1080TI FE Bios

EVGA1080TIFEBios.zip 154k .zip file


THANKS







KedarWolf and gkolarov for posting the Strix bios and guide on how to flash it.


----------



## KedarWolf

Quote:


> Originally Posted by *dentnu*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I successfully flashed the Strix BIOS to my 1080 TI FE.
> 
> I'm getting 2088 core, 6102 memory first run in Heaven under water with no throttling, no driver crashes. My power limit hasn't gone over 106, no shunt mod.
> 
> Here's how I did it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> nvflash_5.353.0.zip 3031k .zip file
> 
> 
> 1080ti_strix.zip 154k .zip file
> 
> 
> Unzip NVFlash and the Strix bios to a folder.
> 
> Disable your video card in Control Panel in Device Manager.
> 
> Run an admin command prompt and cd to the folder you made.
> 
> Do in command prompt:
> 
> nvflash64 --protectoff
> 
> Then do:
> 
> nvflash -6 BIOSfilename.rom
> 
> Reboot, if the card is not enabled in Device Manager, enable it, reboot again. Good idea to reinstall Nvidia driver, reboot last time, done.
> 
> For more than one card.
> 
> nvflash64 --list
> 
> nvflash64 -6 --index=0 BIOSfilename.rom
> 
> nvflash64 -6 --index=1 BIOSfilename.rom
> 
> Choose yes when asked to override mismatched IDs.
> 
> 
> 
> How safe is it to flash this to a FE card ? From what is saw the Strix is a custom board made by Asus using 2x 8pin vs the FE using 1 6-pin & 1 8pin. Can this bios damage the FE card or cause any other issues in the long run ?
Click to expand...

I can't answer that, but I'm getting 2100 core at 1.075v, 6132 memory. The power limit never goes over 111% for an entire Heaven run, and as you can see from my GPU-Z log it never throttled or lowered the core clock or voltage once the entire run.

GPU-ZSensorLog.txt 35k .txt file


----------



## alucardis666

Ok. I successfully extracted my stock EVGA FE BIOS
Quote:


> Originally Posted by *dentnu*
> 
> I just made a backup of my EVGA GTX 1080 Ti FE bios. Here it is if anyone needs it. It would be great if everyone could start making backups of their cards' bios so we can compare and try them out.
> 
> 1. Download NVFlash 5.353.0 (https://www.techpowerup.com/download/nvidia-nvflash/)
> 
> 2. Extract the contents of the nvflash zip file and place them in a new folder called nvflash on your C drive.
> 
> 3. Disable the video card in Device Manager.
> 
> 4. Run CMD (Admin)
> 
> 5. Type CD C:\
> 
> 6. Type CD nvflash
> 
> 7. Use the command: nvflash64 -b backup.rom
> 
> 8. You should now have the backup.rom file in your nvflash folder.
> 
> 9. Enable the video card in Device Manager.
> 
> 10. DONE (Please post a copy of your card's bios)
> 
> Here is my EVGA 1080TI FE Bios
> 
> EVGA1080TIFEBios.zip 154k .zip file
> 
> 
> THANKS
> 
> 
> 
> 
> 
> 
> 
> KedarWolf for posting the Strix bios and guide on how to flash it.


Nice, I was gonna do this but you beat me to it... haha.

I also just flashed the Strix bios on my FE card. The only change I see is that I don't need to raise the core offset slider as much, since the Strix is factory-overclocked a bit compared to the FE.

Also, my card now idles a little higher, at 240MHz instead of 138MHz like it did before.

Finally, my voltage slider in AB is greyed out again.


----------



## dentnu

Quote:


> Originally Posted by *KedarWolf*
> 
> i can't answer that but I'm getting 2100 core at 1.075v 6132 memory, Power Limit never goes over 111% an entire Heaven run, and as you can see by my GPU-Z file it never throttled or lowered core or voltages once the entire Heaven run.
> 
> GPU-ZSensorLog.txt 35k .txt file


Wow that is very nice thanks for the update. Very tempted to give this a shot.


----------



## KedarWolf

Quote:


> Originally Posted by *mshagg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gkolarov*
> 
> Here is the BIOS file of ROG-STRIX-GTX1080TI-O11G-GAMING . If someone can extend the power limit and/or add a little more voltage
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1080ti_strix.zip 154k .zip file
> 
> 
> 
> 
> Nice! There was a guy here that analysed the FE one, will be interesting to see how the power limit compares to the FE.
> 
> In theory this could be flashed to a card right? i.e. it's already signed?
Click to expand...

I successfully flashed the Strix BIOS to my 1080 TI FE.

I'm getting 2100 core at 1.075v 6132 memory, Power Limit never goes over 111% an entire Heaven run, and as you can see by my GPU-Z file it never throttled or lowered core or voltages once the entire Heaven run, no shunt mod.

GPU-ZSensorLog.txt 35k .txt file


Here's how I did it









nvflash_5.353.0.zip 3031k .zip file


1080ti_strix.zip 154k .zip file


Unzip NVFlash and the Strix bios to a folder.

Disable your video card in Control Panel in Device Manager.

Run an admin command prompt and cd to the folder you made.

Do in command prompt:

nvflash64 --protectoff

Then do:

nvflash64 --save filename.rom

to backup original bios.

nvflash64 -6 biosfilename.rom

Reboot, if the card is not enabled in Device Manager, enable it, reboot again. Good idea to reinstall Nvidia driver, reboot last time, done.

For more than one card.

nvflash64 --list

nvflash64 -6 --index=0 BIOSfilename.rom

nvflash64 -6 --index=1 BIOSfilename.rom

Choose yes when asked to override mismatched IDs.


----------



## alucardis666

Quote:


> Originally Posted by *dentnu*
> 
> Wow that is very nice thanks for the update. Very tempted to give this a shot.


Do it!









Hope your results are more exciting than mine. lol.
Quote:


> Originally Posted by *alucardis666*
> 
> I just flashed the Strix Bios on my FE card, only changes I see are that I don't need to raise the slider as much for the core offset now as the strix is oc'd a bit from factory compared to FE.
> 
> Also my card now downclocks a little higher @ 240mhz at idle instead of 138mhz like it did before.
> 
> Finally, now my voltage slider in AB is greyed out again.


----------



## ChironX

Count me in as well guys. EVGA Founder's Edition, but might get the Poseidon when it releases ;-)

Took the pictures a little artistically


----------



## KedarWolf

Quote:


> Originally Posted by *alucardis666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *dentnu*
> 
> Wow that is very nice thanks for the update. Very tempted to give this a shot.
> 
> 
> 
> Do it!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hope your results are more exciting than mine. lol.
> Quote:
> 
> 
> 
> Originally Posted by *alucardis666*
> 
> I just flashed the Strix Bios on my FE card, only changes I see are that I don't need to raise the slider as much for the core offset now as the strix is oc'd a bit from factory compared to FE.
> 
> Also my card now downclocks a little higher @ 240mhz at idle instead of 138mhz like it did before.
> 
> Finally, now my voltage slider in AB is greyed out again.
> 
> Click to expand...
Click to expand...

Use the WildCards.zip one.

http://www.overclock.net/t/1625653/how-to-get-voltage-slider-in-afterburner-working-on-a-1080-ti/0_20


----------



## alucardis666

Quote:


> Originally Posted by *KedarWolf*
> 
> Use the WildCards.zip one.
> 
> http://www.overclock.net/t/1625653/how-to-get-voltage-slider-in-afterburner-working-on-a-1080-ti/0_20


Perfect! Thanks!


----------



## Asus11

Quote:


> Originally Posted by *KedarWolf*
> 
> I successfully flashed the Strix BIOS to my 1080 TI FE.
> 
> I'm getting 2100 core at 1.075v 6132 memory, Power Limit never goes over 111% an entire Heaven run, and as you can see by my GPU-Z file it never throttled or lowered core or voltages once the entire Heaven run, no shunt mod.
> 
> GPU-ZSensorLog.txt 35k .txt file
> 
> 
> Here's how I did it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> nvflash_5.353.0.zip 3031k .zip file
> 
> 
> 1080ti_strix.zip 154k .zip file
> 
> 
> Unzip NVFlash and the Strix bios to a folder.
> 
> Disable your video card in Control Panel in Device Manager.
> 
> Run an admin command prompt and cd to the folder you made.
> 
> Do in command prompt:
> 
> nvflash64 --protectoff
> 
> Then do:
> 
> nvflash64 --save filename.rom
> 
> to backup original bios.
> 
> nvflash64 -6 biosfilename.rom
> 
> Reboot, if the card is not enabled in Device Manager, enable it, reboot again. Good idea to reinstall Nvidia driver, reboot last time, done.
> 
> For more than one card.
> 
> nvflash64 --list
> 
> nvflash64 -6 --index=0 BIOSfilename.rom
> 
> nvflash64 -6 --index=1 BIOSfilename.rom
> 
> Choose yes when asked to override mismatched IDs.


A few people and I were the first to flash the Strix bios to the 1080 FE.

What we found was that we could get higher boost clocks, but performance was not on par with the FE at the same clocks.

I think the Strix bios shows higher clocks but is weaker. Test it out: set 2000MHz on the FE bios, run a benchmark, then run the Strix bios at 2000MHz and report back.


----------



## alucardis666

Quote:


> Originally Posted by *Asus11*
> 
> A few people and I were the first to flash the Strix bios to the 1080 FE.
> 
> What we found was that we could get higher boost clocks, but performance was not on par with the FE at the same clocks.
> 
> I think the Strix bios shows higher clocks but is weaker. Test it out: set 2000MHz on the FE bios, run a benchmark, then run the Strix bios at 2000MHz and report back.


Well... now that I have the voltage slider back, I did some testing in Heaven and my max TDP is 108.5%, vs the stock FE bios where I'd see TDP shoot as high as 138%. So I'll look at this as basically a poor/conservative man's shunt mod.


----------



## ThingyNess

Quote:


> Originally Posted by *KedarWolf*
> 
> I successfully flashed the Strix BIOS to my 1080 TI FE.
> 
> I'm getting 2100 core at 1.075v 6132 memory, Power Limit never goes over 111% an entire Heaven run, and as you can see by my GPU-Z file it never throttled or lowered core or voltages once the entire Heaven run, no shunt mod.


Be careful with this one: there is a genuine risk of overloading a couple of the VRM phases on the FE card by using the Strix BIOS, since the Strix has 2x 8-pin connectors and the current split between the phases is going to be very different.

In the FE BIOS the allowable limits with the adjustment slider maxed are:

78W from the PCIe slot, 99W from the 6-pin PCIe connector, and 175W from the 8-pin PCIe connector.

It's tough to tell how they partitioned the VRMs on the FE since I don't have one in hand, but if it were me, I'd probably power one phase from the PCIe slot, two from the 6-pin, and 4 from the 8-pin PCIe connectors.

However, the Strix 1080 Ti (and most of the aftermarket cards) has 2x 8-pin PCIe connectors, and allows 78W from the PCIe slot and *175W from each of the two 8-pin PCIe connectors.*

If the FE card did partition the VRMs such that there are double the phases assigned to the 8-pin versus the 6-pin connector, then using the Strix 1080 Ti BIOS might mean that you're underloading the 4 phases assigned to the 8-pin PCIe connector and heavily overloading the (2?) phases assigned to the 6-pin.

It might be worth pulling the backplate and using an infrared temp gun to see whether any of the VRM MOSFETs are at a very different temperature from the others. Out of the box with the stock BIOS they should be pretty well balanced.
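To put those limits side by side, here's a quick sketch totalling the connector budgets quoted in this post (the per-connector figures simply restate the numbers above; the totals are plain sums):

```python
# Connector power budgets with the adjustment slider maxed, per the
# figures quoted above (FE vs Strix 1080 Ti).
FE    = {"PCIe slot": 78, "6-pin": 99, "8-pin": 175}       # Founders Edition
STRIX = {"PCIe slot": 78, "8-pin A": 175, "8-pin B": 175}  # Strix 1080 Ti

fe_total    = sum(FE.values())
strix_total = sum(STRIX.values())
print(f"FE total:    {fe_total} W")                 # → FE total:    352 W
print(f"Strix total: {strix_total} W")              # → Strix total: 428 W
print(f"difference:  {strix_total - fe_total} W")   # → difference:  76 W
```

The totals make the mismatch concrete: the Strix bios budgets 76 W more than the FE's connectors are rated for, and routes none of it through a 6-pin, which is why the per-phase split matters more than the total.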


----------



## Asus11

Quote:


> Originally Posted by *alucardis666*
> 
> Well... Now that I got voltage slider back I did some testing in Heaven and my max TDP is 108.5% vs before with the stock FE bios I'd see my TDP shoot as high as 138% So I'll look at this as basically a poor/conservative man's shunt mod.


Forget what it's reading.

Do some performance tests!

Set it to 2000MHz and run Valley 3 times, take the average, then do the same on the Strix bios.


----------



## alucardis666

Quote:


> Originally Posted by *Asus11*
> 
> forget what its reading
> 
> 
> 
> 
> 
> 
> 
> 
> 
> do some performance tests!
> 
> put it at 2kMHz run valley 3 times take the average, do the same for the STRIX bios


Not much point really, aside from the stability from not hitting the TDP wall. My OC is the same; no changes with the Strix bios.


----------



## ThingyNess

Quote:


> Originally Posted by *gkolarov*
> 
> Here is the BIOS file of ROG-STRIX-GTX1080TI-O11G-GAMING . If someone can extend the power limit and/or add a little more voltage
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1080ti_strix.zip 154k .zip file


This is a great help, thank you!

*I've pulled the info from the Strix 1080 Ti OC BIOS: it has a 275W default TDP target and a 330W adjustment limit (+20%). Now everyone knows!*

Still waiting anxiously for a sample from an MSI Armor or Gaming X card, or any of the Gigabyte ones
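As a quick sanity check, the +20% slider limit is exactly the ratio between those two figures:

```python
# Sanity check on the Strix 1080 Ti OC BIOS power figures quoted above.
default_tdp = 275          # W, default power target
limit_pct   = 120          # % maximum slider position (+20%)
adj_limit   = default_tdp * limit_pct // 100
print(adj_limit)           # → 330
```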


----------



## Slackaveli

Quote:


> Originally Posted by *kevindd992002*
> 
> Now that's interesting.
> 
> @Mato87 and Slackaveli
> 
> Any thoughts on this behavior?
> So the shunt mod completely zeroes out the reading of the power limit? I mean in theory it should (as you short out the resistor) but I thought it will just lower it down significantly below the real limit?


My thought would be that there's not enough information there. He could be very CPU-bound with an average of "50%" GPU usage; you'd need to see each and every thread to tell. Also, he's getting 63% power out of one GPU and 45% out of the other. Way too many variables for me, sorry.


----------



## KedarWolf

Quote:


> Originally Posted by *Asus11*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I successfully flashed the Strix BIOS to my 1080 TI FE.
> 
> I'm getting 2100 core at 1.075v 6132 memory, Power Limit never goes over 111% an entire Heaven run, and as you can see by my GPU-Z file it never throttled or lowered core or voltages once the entire Heaven run, no shunt mod.
> 
> GPU-ZSensorLog.txt 35k .txt file
> 
> 
> Here's how I did it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> nvflash_5.353.0.zip 3031k .zip file
> 
> 
> 1080ti_strix.zip 154k .zip file
> 
> 
> Unzip NVFlash and the Strix bios to a folder.
> 
> Disable your video card in Control Panel in Device Manager.
> 
> Run an admin command prompt and cd to the folder you made.
> 
> Do in command prompt:
> 
> nvflash64 --protectoff
> 
> Then do:
> 
> nvflash64 --save filename.rom
> 
> to backup original bios.
> 
> nvflash64 -6 biosfilename.rom
> 
> Reboot, if the card is not enabled in Device Manager, enable it, reboot again. Good idea to reinstall Nvidia driver, reboot last time, done.
> 
> For more than one card.
> 
> nvflash64 --list
> 
> nvflash64 -6 --index=0 BIOSfilename.rom
> 
> nvflash64 -6 --index=1 BIOSfilename.rom
> 
> Choose yes when asked to override mismatched IDs.
> 
> 
> 
> me and a few people was the first to flash the STRIX bios to the 1080 FE
> 
> what we found out was we could get higher boost... but performance was not on par of the FE at lower clocks
> 
> I think the STRIX shows higher clocks but is weaker.. test it out.. put 2k on the FE bios run a benchmack and then go run STRIX bios at 2k and report back
Click to expand...

My best Time Spy score before the Strix BIOS was 10808. With the Strix BIOS at 2100 core, 6162 memory, I scored 10876.


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> i can't answer that but I'm getting 2100 core at 1.075v 6132 memory, Power Limit never goes over 111% an entire Heaven run, and as you can see by my GPU-Z file it never throttled or lowered core or voltages once the entire Heaven run.
> 
> GPU-ZSensorLog.txt 35k .txt file


Well, sir, mission accomplished. And you are the ONLY one with as pretty a "2100MHz verification file" as that one! You are the boss, dude.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> I successfully flashed the Strix BIOS to my 1080 TI FE.
> 
> I'm getting 2100 core at 1.075v 6132 memory, Power Limit never goes over 111% an entire Heaven run, and as you can see by my GPU-Z file it never throttled or lowered core or voltages once the entire Heaven run, no shunt mod.
> 
> GPU-ZSensorLog.txt 35k .txt file
> 
> 
> Here's how I did it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> nvflash_5.353.0.zip 3031k .zip file
> 
> 
> 1080ti_strix.zip 154k .zip file
> 
> 
> Unzip NVFlash and the Strix bios to a folder.
> 
> Disable your video card in Control Panel in Device Manager.
> 
> Run an admin command prompt and cd to the folder you made.
> 
> Do in command prompt:
> 
> nvflash64 --protectoff.
> 
> Then do:
> 
> nvflash64 --save filename.rom
> 
> to backup original bios.
> 
> nvflash64 -6 biosfilename.rom
> 
> Reboot, if the card is not enabled in Device Manager, enable it, reboot again. Good idea to reinstall Nvidia driver, reboot last time, done.
> 
> For more than one card.
> 
> nvflash64 --list
> 
> nvflash64 -6 --index=0 BIOSfilename.rom
> 
> nvflash64 -6 --index=1 BIOSfilename.rom
> 
> Choose yes when asked to override mismatched IDs.


Let me know how your testing goes! See if you can do 2114 Mhz without power throttling!

Are you hitting 1.092V consistently now? And also, can we backup our BIOs and reflash it for RMA purposes? Thanks for testing this out and being brave!


----------



## KedarWolf

Quote:


> Originally Posted by *ThingyNess*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I successfully flashed the Strix BIOS to my 1080 TI FE.
> 
> I'm getting 2100 core at 1.075v 6132 memory, Power Limit never goes over 111% an entire Heaven run, and as you can see by my GPU-Z file it never throttled or lowered core or voltages once the entire Heaven run, no shunt mod.
> 
> 
> 
> Be careful with this one - there is a genuine risk of overloading a couple of the VRM phases on the FE card by using the Strix BIOS as it has 2x 8-pin connectors and the current split between the phases is going to be very different.
> 
> In the FE BIOS the allowable limits with the adjustment slider maxed are:
> 
> 78W from the PCIe slot, 99W from the 6-pin PCIe connector, and 175W from the 8-pin PCIe connector.
> 
> It's tough to tell how they partitioned the VRMs on the FE since I don't have one in hand, but if it were me, I'd probably power one phase from the PCIe slot, two from the 6-pin, and 4 from the 8-pin PCIe connectors.
> 
> However, the Strix 1080Ti (and most of the aftermarket cards) have 2x 8-pin PCIe connectors, and allows 78w from the PCIe slot and *175w from each of the two PCIe connectors.*
> 
> If the FE card did partition the VRMs such that there's double the phases assigned the 8-pin versus the 6-pin connector, then using the Strix 1080Ti BIOS might mean that you're underloading the 4 phases assigned to the 8-pin PCIe connector and heavily overloading the (2?) phases assigned to the 6-pin.
> 
> It might be worth pulling the backplate and using an infrared temp gun and seeing if any of the VRM MOSFETS are at a very different temperature versus the others. Out of the box with the stock BIOS they should be pretty well balanced.
Click to expand...

My Corsair AX1500i can log the power going to each PCIe power connector. I have separate cables on my six and eight pin; doing that now.


----------



## Slackaveli

Quote:


> Originally Posted by *ThingyNess*
> 
> Be careful with this one - there is a genuine risk of overloading a couple of the VRM phases on the FE card by using the Strix BIOS as it has 2x 8-pin connectors and the current split between the phases is going to be very different.
> 
> In the FE BIOS the allowable limits with the adjustment slider maxed are:
> 
> 78W from the PCIe slot, 99W from the 6-pin PCIe connector, and 175W from the 8-pin PCIe connector.
> 
> It's tough to tell how they partitioned the VRMs on the FE since I don't have one in hand, but if it were me, I'd probably power one phase from the PCIe slot, two from the 6-pin, and 4 from the 8-pin PCIe connectors.
> 
> However, the Strix 1080Ti (and most of the aftermarket cards) have 2x 8-pin PCIe connectors, and allows 78w from the PCIe slot and *175w from each of the two PCIe connectors.*
> 
> If the FE card did partition the VRMs such that there's double the phases assigned the 8-pin versus the 6-pin connector, then using the Strix 1080Ti BIOS might mean that you're underloading the 4 phases assigned to the 8-pin PCIe connector and heavily overloading the (2?) phases assigned to the 6-pin.
> 
> It might be worth pulling the backplate and using an infrared temp gun and seeing if any of the VRM MOSFETS are at a very different temperature versus the others. Out of the box with the stock BIOS they should be pretty well balanced.


beat me to it. i was going to say the idiot version of this so kudos on the great explanation.

@KedarWolf dang, that is a dope psu bro.
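Summing the per-connector caps ThingyNess lists gives each BIOS's total authorizable board power with the slider maxed (just adding up the figures quoted above):

```python
# Per-connector power caps quoted in the thread, slider maxed.
fe_total = 78 + 99 + 175      # FE BIOS: slot + 6-pin + 8-pin = 352 W
strix_total = 78 + 175 + 175  # Strix BIOS: slot + 2x 8-pin = 428 W

# Extra board power the Strix BIOS can authorize over the FE BIOS.
headroom = strix_total - fe_total
```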


----------



## KingEngineRevUp

I forgot about that. My RM1000i PSU can also do that.


----------



## ThingyNess

Quote:


> Originally Posted by *KedarWolf*
> 
> My Corsair AX1500i can log the power going to each pci-e power connector. I have separate cables on my six and eight pin, doing that now.


Also great news - at least that way you can try with each BIOS and have a before/after comparison.









I actually am downgrading my previous dire warning to a gentle 'caution' - I took a closer look at the datasheet for the uP9511p phase controller that Nvidia used on the FE cards, and it doesn't look like it supports any sort of modification of phase balancing in software - all of the current limits and sensing are all programmed physically with resistors on the board.

AMD had this capability with the controller they used on their RX480 boards and were able to have it draw less power from the motherboard slot with a BIOS flash, which is why I was a little worried.

My only other worry was that perhaps they changed the current sense resistors from the FE to the Strix board, but I double checked and both boards use 5 milliohm resistors to sense the current for all 3 power sources, so we're good there too.
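Those 5 milliohm sense resistors only ever see tens of millivolts at the quoted caps. A rough sketch of the sense voltages, assuming a 12 V rail and a shunt in the supply path (the FE per-connector limits are the ones quoted earlier in the thread):

```python
# Shunt resistance both boards reportedly use for current sensing.
R_SENSE = 0.005  # ohms (5 milliohms)

def sense_voltage_mv(watts: float, rail_v: float = 12.0) -> float:
    """Voltage across the shunt at a given power draw, in millivolts."""
    amps = watts / rail_v
    return amps * R_SENSE * 1000

slot_mv = sense_voltage_mv(78)    # PCIe slot cap (FE BIOS)
six_mv = sense_voltage_mv(99)     # 6-pin cap (FE BIOS)
eight_mv = sense_voltage_mv(175)  # 8-pin cap (FE BIOS)
```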


----------



## mshagg

Quote:


> Originally Posted by *ThingyNess*
> 
> Also great news - at least that way you can try with each BIOS and have a before/after comparison.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I actually am downgrading my previous dire warning to a gentle 'caution' - I took a closer look at the datasheet for the uP9511p phase controller that Nvidia used on the FE cards, and it doesn't look like it supports any sort of modification of phase balancing in software - all of the current limits and sensing are all programmed physically with resistors on the board.
> 
> AMD had this capability with the controller they used on their RX480 boards and were able to have it draw less power from the motherboard slot with a BIOS flash, which is why I was a little worried.
> 
> My only other worry was that perhaps they changed the current sense resistors from the FE to the Strix board, but I double checked and both boards use 5 milliohm resistors to sense the current for all 3 power sources, so we're good there too.


The gamersnexus breakdown of the FE PCB suggested the VRM is fairly well overbuilt: even if you tried to pull 400A through the GPU core VRM, you'd be looking at about 40W of heat to deal with, so I suspect each phase individually has significant headroom.






Analysis of the PCIE power draw will certainly be interesting.
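Inverting the quoted 400A/40W figure gives a feel for the per-phase numbers involved (the 7-phase count for the FE core VRM is my assumption, not something stated in this thread):

```python
# Figures quoted from the gamersnexus breakdown in the post above.
P_TOTAL = 40.0   # watts of heat at...
I_TOTAL = 400.0  # ...amps through the core VRM
PHASES = 7       # assumed FE core phase count

# Current per phase, assuming an even split across phases.
i_per_phase = I_TOTAL / PHASES  # ~57 A each

# Implied effective resistance per phase: P = (I/N)^2 * R per phase,
# summed over N phases, so R = P * N / I^2.
r_per_phase = P_TOTAL * PHASES / I_TOTAL**2  # ~1.75 milliohms
```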


----------



## KedarWolf

Quote:


> Originally Posted by *ThingyNess*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> My Corsair AX1500i can log the power going to each pci-e power connector. I have separate cables on my six and eight pin, doing that now.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also great news - at least that way you can try with each BIOS and have a before/after comparison.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I actually am downgrading my previous dire warning to a gentle 'caution' - I took a closer look at the datasheet for the uP9511p phase controller that Nvidia used on the FE cards, and it doesn't look like it supports any sort of modification of phase balancing in software - all of the current limits and sensing are all programmed physically with resistors on the board.
> 
> AMD had this capability with the controller they used on their RX480 boards and were able to have it draw less power from the motherboard slot with a BIOS flash, which is why I was a little worried.
> 
> My only other worry was that perhaps they changed the current sense resistors from the FE to the Strix board, but I double checked and both boards use 5 milliohm resistors to sense the current for all 3 power sources, so we're good there too.
Click to expand...

According to my AX1500i log, the 8-pin power connector is pulling about twice the power of the 6-pin: a max of 20 amps versus 11 amps on the 6-pin.









Anything to worry about?

corsair_link_20170403_21_08_08.zip 1k .zip file
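Converting those logged currents to watts (12 V rail assumed) puts both connectors above their nominal spec ratings of 150 W for an 8-pin and 75 W for a 6-pin, which is expected territory for an overclocked card on a raised power limit:

```python
RAIL_V = 12.0

# Peak currents from the AX1500i log quoted above.
eight_pin_w = 20 * RAIL_V  # 240 W on the 8-pin
six_pin_w = 11 * RAIL_V    # 132 W on the 6-pin

# The 8-pin is carrying roughly 1.8x the 6-pin's load.
ratio = eight_pin_w / six_pin_w
```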


----------



## joder

Quote:


> Originally Posted by *KedarWolf*
> 
> My best TimeSpy before Strix BIOS was 10808, below with Strix BIOS at 2100 core, 6162 memory Score with Strix BIOS 10876


Nice work! Could you hit 2100 before?

Edit: Also can you flash back to stock if needed?


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I successfully flashed the Strix BIOS to my 1080 TI FE.
> 
> I'm getting 2100 core at 1.075v 6132 memory, Power Limit never goes over 111% an entire Heaven run, and as you can see by my GPU-Z file it never throttled or lowered core or voltages once the entire Heaven run, no shunt mod.
> 
> GPU-ZSensorLog.txt 35k .txt file
> 
> 
> Here's how I did it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> nvflash_5.353.0.zip 3031k .zip file
> 
> 
> 1080ti_strix.zip 154k .zip file
> 
> 
> Unzip NVFlash and the Strix bios to a folder.
> 
> Disable your video card in Control Panel in Device Manager.
> 
> Run an admin command prompt and cd to the folder you made.
> 
> Do in command prompt:
> 
> nvflash64 --protectoff.
> 
> Then do:
> 
> nvflash64 --save filename.rom
> 
> to backup original bios.
> 
> nvflash64 -6 biosfilename.rom
> 
> Reboot, if the card is not enabled in Device Manager, enable it, reboot again. Good idea to reinstall Nvidia driver, reboot last time, done.
> 
> For more than one card.
> 
> nvflash64 --list
> 
> nvflash64 -6 --index=0 BIOSfilename.rom
> 
> nvflash64 -6 --index=1 BIOSfilename.rom
> 
> Choose yes when asked to override mismatched IDs.
> 
> 
> 
> Let me know how your testing goes! See if you can do 2114 Mhz without power throttling!
> 
> Are you hitting 1.092V consistently now? And also, can we backup our BIOs and reflash it for RMA purposes? Thanks for testing this out and being brave!
Click to expand...

I have 2100 at 1.075v with zero throttling.

You can reflash it, yes; you just need to run the nvflash64 --protecton command when you are done, to lock the BIOS again so they have no idea you did --protectoff and flashed away from the original.

Trick I figured out from my Maxwell Titan X days
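Putting the backup/flash/restore cycle from the posts above together as one dry-run script (the NVFLASH indirection echoes commands instead of executing them, and the .rom filenames are placeholders; swap in the real nvflash64 binary only once you trust the BIOS file):

```shell
#!/bin/sh
# Dry-run sketch of the flash cycle described in this thread.
# NVFLASH echoes instead of executing; replace "echo nvflash64" with
# the path to the real binary at your own risk.
NVFLASH="echo nvflash64"

$NVFLASH --protectoff              # unlock the BIOS for writing
$NVFLASH --save stock_backup.rom   # back up the stock BIOS first
$NVFLASH -6 1080ti_strix.rom       # -6 overrides the board ID mismatch

# Later, to restore for an RMA: reflash the backup, then re-lock so
# the card looks untouched.
$NVFLASH -6 stock_backup.rom
$NVFLASH --protecton
```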


----------



## mshagg

Quote:


> Originally Posted by *KedarWolf*
> 
> According to my AX1500i log the 8 pin power connector is pulling about twice the power of the six pin power connector. Max 20 amps to 11 amps on six pin.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anything to worry about?
> 
> corsair_link_20170403_21_08_08.zip 1k .zip file


Don't suppose you have similar readings from the FE BIOS? I'd grab it myself but I don't have a fancy corsair PSU


----------



## KedarWolf

Quote:


> Originally Posted by *joder*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> My best TimeSpy before Strix BIOS was 10808, below with Strix BIOS at 2100 core, 6162 memory Score with Strix BIOS 10876
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nice work! Could you hit 2100 before?
Click to expand...

I could, at 1.093 volts, but it would jump all over from 1.050v to 1.093v, and the clocks would jump around as well, of course.









As you see by my GPU-Z log not once during Heaven does it vary from the 1.075v max and 2100.









My best TimeSpy score at 2100 before, at 1.093v and not using the Strix BIOS, was about 10758.


----------



## KedarWolf

Quote:


> Originally Posted by *mshagg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> According to my AX1500i log the 8 pin power connector is pulling about twice the power of the six pin power connector. Max 20 amps to 11 amps on six pin.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anything to worry about?
> 
> corsair_link_20170403_21_08_08.zip 1k .zip file
> 
> 
> 
> 
> Don't suppose you have similar readings from the FE BIOS? I'd grab it myself but I don't have a fancy corsair PSU
Click to expand...

No, and I'm getting such good results with this BIOS I don't want to flash it back even to test.









Edit: Actually I'm going to flash it back to test, you've been really helpful and are very knowledgeable.

Least I can do.


----------



## bfedorov11

If anyone is interested, here is the Nvidia branded FE card's bios.

nvidia_gp102.zip 257k .zip file


----------



## rcfc89

Quote:


> Originally Posted by *Slackaveli*
> 
> my thought would be that there is no information there. he could be very cpu bound with an average of "50%" gpu usage. Must see each and every thread to be able to tell. Also, he is getting 63% power out of one gpu and 45% power out of the other. Way too many variables for me, sorry.


The 50 and 63% are fan speeds. Gpu usage is 98-99% on both. Not sure why Msi has it arse backwards on BF1. I'll take another one showing all threads.


----------



## Somasonic

Just got my hands on a Strix last night and so far it seems pretty beastly. It's boosting into the mid-to-high 1900s straight out of the box, which I'm pretty happy with. The fans don't get too loud but sometimes pulse on and off when not gaming, which could get annoying (although it looks like it might only do this after gaming; need to test more).

Issues I have with the card (any help or suggestions much appreciated):

- When it boosts up the boost isn't stable, it jumps around constantly by small amounts even though the temp seems stable
- I'm getting jittering every few seconds, which could be related to the boost jumping around. It isn't unplayable by any means but it's annoying. I have vsync on and a frame cap at 58 fps on a 60Hz monitor. These settings were fine with the 980 Ti
- The colors seem washed out compared to my 980 Ti (same settings). I've tried adjusting settings in the NvCP and on the monitor but I can't seem to get back the 'richness' of color I had before. I'm using a display port cable if that makes any difference.

I should post a screen of what the boost is doing but I'm pretty short on time at the moment sorry and am just testing the card quickly when I can. I'll get to really work it out in the weekend hopefully.

Thanks to anyone who can help


----------



## kevindd992002

Quote:


> Originally Posted by *Mato87*
> 
> Well first of all he has two gtx 980 Ti's


Which means?

@All

If a lower-end card can run a certain game at, say, 120fps without being CPU-bottlenecked, can the same game also run at 120fps with a 1080Ti that is CPU-bottlenecked? I'm not sure I've got this right, but with a 1080Ti the CPU only becomes the bottleneck past the fps rate you were getting BEFORE the upgrade with the lower-end card, right?


----------



## rcfc89

Is this considered a bottleneck? 3 threads over 70%


----------



## KedarWolf

Quote:


> Originally Posted by *mshagg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> According to my AX1500i log the 8 pin power connector is pulling about twice the power of the six pin power connector. Max 20 amps to 11 amps on six pin.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anything to worry about?
> 
> corsair_link_20170403_21_08_08.zip 1k .zip file
> 
> 
> 
> 
> Don't suppose you have similar readings from the FE BIOS? I'd grab it myself but I don't have a fancy corsair PSU
Click to expand...

Only a tad higher with Strix BIOS.

corsair_link_20170403_22_09_24.zip 1k .zip file


----------



## braindamage

Tried the Strix bios on my EVGA 1080ti FE. Some notes:

-The topmost displayport is disabled (thought I bricked my card at first)
-100% fan speed with the reference fan tops out at ~3600 RPM now. This is about 75% on the FE bios.
-Minimum fan speed is now 0%.
-Power limit setting in MSI Afterburner does not seem to work past 108%, even when set to 120%.

Ended up flashing back to stock for now since I couldn't figure out why the card was still hitting the power limit.


----------



## KedarWolf

Quote:


> Originally Posted by *braindamage*
> 
> Tried the Strix bios on my EVGA 1080ti FE. Some notes:
> 
> -The topmost displayport is disabled (thought I bricked my card at first)
> -100% fan speed with the reference fan tops out at ~3600 RPM now. This is about 75% on the FE bios.
> -Minimum fan speed is now 0%.
> -Power limit setting in MSI Afterburner does not seem to work past 108%, even when set to 120%.
> 
> Ended up flashing back to stock for now since I couldn't figure out why the card was still hitting the power limit.


Can confirm, top display port does not work.









Edit: You do know in GPU-Z and Afterburner you want the Power Limit lower while benching, right? If it says 108% while running Heaven that's great, not bad!!

Under water my power limit never went over 111% with zero throttling.

And here is TimeSpy with the exact same settings on both runs: 2100 core, 6162 memory.

1080 Ti FE



1080 Ti Strix BIOS


----------



## KedarWolf

Quote:


> Originally Posted by *Asus11*
> 
> Quote:
> 
> 
> 
> Originally Posted by *alucardis666*
> 
> Well... Now that I got voltage slider back I did some testing in Heaven and my max TDP is 108.5% vs before with the stock FE bios I'd see my TDP shoot as high as 138% So I'll look at this as basically a poor/conservative man's shunt mod.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> forget what its reading
> 
> 
> 
> 
> 
> 
> 
> 
> 
> do some performance tests!
> 
> put it at 2kMHz run valley 3 times take the average, do the same for the STRIX bios
Click to expand...

See this TimeSpy, says it all.









http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-thread/3460_20#post_25982609


----------



## mshagg

Urgh, damn - currently using all of the outputs, would be a pain to have one disabled.


----------



## ThingyNess

Quote:


> Originally Posted by *KedarWolf*
> 
> According to my AX1500i log the 8 pin power connector is pulling about twice the power of the six pin power connector. Max 20 amps to 11 amps on six pin.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anything to worry about?
> 
> corsair_link_20170403_21_08_08.zip 1k .zip file


That looks like exactly what we'd expect and nothing to worry about. If they were the same or close to the same then it'd be cause to worry. Congrats on the upgraded TDP limit!


----------



## Addsome

Quote:


> Originally Posted by *ThingyNess*
> 
> That looks like exactly what we'd expect and nothing to worry about. If they were the same or close to the same then it'd be cause to worry. Congrats on the upgraded TDP limit!


So running the strix bios on a FE should be safe?


----------



## ThingyNess

Quote:


> Originally Posted by *braindamage*
> 
> Tried the Strix bios on my EVGA 1080ti FE. Some notes:
> 
> -The topmost displayport is disabled (thought I bricked my card at first)
> -100% fan speed with the reference fan tops out at ~3600 RPM now. This is about 75% on the FE bios.
> -Minimum fan speed is now 0%.
> -Power limit setting in MSI Afterburner does not seem to work past 108%, even when set to 120%.
> 
> Ended up flashing back to stock for now since I couldn't figure out why the card was still hitting the power limit.


Yeah, the displayport being disabled is normal as ASUS has that mapped to their 2nd HDMI port.
You do lose some peak cooling headroom on an FE card too if you haven't swapped coolers or watercooled your card, as the FE BIOS has the max fan rpm programmed to 4800rpm.
You will also inherit the 'zero fan rpm idle' of the Asus card, although the card will heat up enough at idle to turn the fan on around 55-60c I'd imagine, so no worries about potentially bricking it that way.

Are you saying that you're still showing as 'Pwr' limited in GPU-Z when you hit 108% even with the slider maxed to 120% in Afterburner? Or does it just not exceed 108% in general?


----------



## ThingyNess

Quote:


> Originally Posted by *Addsome*
> 
> So running the strix bios on a FE should be safe?


It looks that way so far, with the following caveats:

1) You lose one of your DisplayPort outputs as the Asus card only has two.
2) Your max fan RPM will be lower (3600rpm) versus the FE card's 4800rpm, so if you are overvolting or coming close to the stock cooler's limits, you will thermal throttle worse now instead of TDP throttling. This may or may not be desirable depending on the type of load you were putting on the card. You're sort of trading some steady-state performance for peak performance this way.
3) The stock fan curve on the Asus tries to turn the fans off at idle, so you might notice the blower speed moving around a bit more at idle temps, and your idle temps being higher than before. This is easily solvable with a custom fan curve in your overclocking program of choice - this won't get you higher than 3600rpm, though.
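The quoted fan ceiling also matches braindamage's earlier observation that 100% on the Strix BIOS was "about 75%" of the FE range:

```python
# Max fan RPM under each BIOS, as reported in this thread.
strix_max, fe_max = 3600, 4800

# The Strix ceiling is exactly three quarters of the FE ceiling.
fraction = strix_max / fe_max
```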


----------



## Addsome

Quote:


> Originally Posted by *ThingyNess*
> 
> It looks that way so far, with the following caveats:
> 
> 1) You lose one of your DisplayPort outputs as the Asus card only has two.
> 2) Your max fan RPM will be lower (3600rpm) versus the FE card's 4800rpm, so if you are overvolting or coming close to the stock cooler's limits, you will thermal throttle worse now instead of TDP throttling. This may or may not be desirable depending on the type of load you were putting on the card. You're sort of trading some steady-state performance for peak performance this way.
> 3) The stock fan curve on the Asus tries to turn the fans off at idle, so you might notice the blower speed moving around a bit more at idle temps, and your idle temps being higher than before. This is easily solvable with a custom fan curve in your overclocking program of choice - this won't get you higher than 3600rpm, though.


I'm using the half-shroud 1080 hybrid method so I don't care about fan speeds. I'm scared about melting the power inputs like what happened with the 1080 FE and that Asus BIOS


----------



## leequan2009

How to burn your card







LOL


----------



## braindamage

Quote:


> Originally Posted by *KedarWolf*
> 
> Can confirm, top display port does not work.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: You do know in GPU-Z and Afterburner you want the Power Limit lower while benching, right? If it says 108% while running Heaven that's great, not bad!!
> 
> Under water my power limit never went over 111% with zero throttling.
> 
> And here is TimeSpy exact same settings, 2100 core, 6162 memory.


Quote:


> Originally Posted by *ThingyNess*
> 
> Yeah, the displayport being disabled is normal as ASUS has that mapped to their 2nd HDMI port.
> You do lose some peak cooling headroom on an FE card too if you haven't swapped coolers or watercooled your card, as the FE BIOS has the max fan rpm programmed to 4800rpm.
> You will also inherit the 'zero fan rpm idle' of the Asus card, although the card will heat up enough at idle to turn the fan on around 55-60c I'd imagine, so no worries about potentially bricking it that way.
> 
> Are you saying that you're still showing as 'Pwr' limited in GPU-Z when you hit 108% even with the slider maxed to 120% in Afterburner? Or does it just not exceed 108% in general?


Yeah, I saw 'Pwr' under "PerfCap Reason", even with the power % at ~108. GPU voltage was down to about 1V in the Heaven benchmark. My card locks up past ~1940MHz at that voltage.


----------



## ThingyNess

Quote:


> Originally Posted by *Addsome*
> 
> I'm using the half shroud 1080 hybrid method so idc about fan speeds. Im scared about melting the power inputs like what happened with the 1080 FE and that Asus bios


If you are rocking a hybrid cooler I'd say you're okay. If you're talking about the 1080FE and the Asus XOC BIOS with the 1.25v vcore and *no* TDP limit, that's kind of a different story and was a much bigger risk. You're only adding 10% to the stock 300w TDP cap.


----------



## ThingyNess

Quote:


> Originally Posted by *braindamage*
> 
> Yeah, I saw 'Pwr' under "PerfCap Reason", even with power % at ~108. GPU voltage was down to about 1V in Heaven benchmark. My card locks up past ~1940 mhz at that voltage.


That's interesting. I've seen that happen on some other Pascal cards too, including my Strix 1070. My hypothesis is that the card is hitting one of the other individual power limits within the BIOS other than the 'overall' limit that most people look at.

There are also individual limits set for the Motherboard +12 from the PCIe slot and the PCIe1 and PCIe2 connectors, and exceeding any one of these will also cause a 'Pwr' perfcap reason even if the overall limit hasn't been breached.

Each card maps its phases differently - one might choose to power the memory and one or two of the Vcore phases from the motherboard +12, and all the rest of the phases from the PCIe power connectors. I did see one 1080 non-Ti card (Zotac?) that actually *only* powered the memory from the motherboard slot and split the VCore phases evenly between two 8-pin PCIe connectors. In that case you'd be highly unlikely to ever hit any of the smaller individual limits.

Sigh, I might have to dig through the NVAPI and see if it exposes any more details over and above what GPU-Z gives us.


----------



## Pandora's Box

My MSI Gaming X 1080 Ti arrives tomorrow. Can't wait! Selling this Founders Edition 1080 Ti. I have a small ITX case and also use speakers while gaming. The noise of the Founders Edition cooler is just too much, plus I want 2,000MHz. To get a 2GHz clock speed on the Founders Edition I have to run 75-80% fan speed, and even then it's not a stable 2GHz. Way too loud. My Founders Edition card usually hovers around 1800MHz at 60% fan speed. I figure if I can get 2GHz out of the MSI Gaming X it's a worthwhile exchange. I already have a buyer lined up for the FE card too.

I game at 1440P 165Hz so every little bit helps. I also stream to my Nvidia Shield connected to my 4K OLED TV.


----------



## alucardis666

Quote:


> Originally Posted by *leequan2009*
> 
> How to burn your card
> 
> 
> 
> 
> 
> 
> 
> LOL


Man, that's SEXY!


----------



## Slackaveli

Quote:


> Originally Posted by *rcfc89*
> 
> The 50 and 63% are fan speeds. Gpu usage is 98-99% on both. Not sure why Msi has it arse backwards on BF1. I'll take another one showing all threads.


ah, ok. turn on your gpu power usage, too.

From your newer pic, I wouldn't call that bottlenecked per se, but it is really close, even with all those cores.


----------



## KingEngineRevUp

So flashing the ASUS Bios will give us an extra (but possibly dangerous) 30 Watts?


----------



## tahsin95

Sorry guys, a bit unrelated to the heavy discussion about overclocking above.
I'm thinking of fitting the EVGA AIO (1080/1070) hybrid kit on my 1080 Ti. My biggest question is, are you able to put the EVGA shroud on top of your 1080 Ti?
What I mean is, gamersnexus has done a full video on this, and he basically just leaves it as is: fits on the water cooler and has that sticking out. Looks quite ugly imo, and if that's really the case I was thinking of going for a much cheaper approach with my own custom CPU water cooler + NZXT GPU case.

Thanks








It would be very helpful if I could get some feedback from those who decided to custom water cool their 1080 Tis.


----------



## Pandora's Box

Quote:


> Originally Posted by *tahsin95*
> 
> Sorry guys, a bit unrelated to the heavy discussion about overclocking above.
> I'm thinking of fitting the EVGA AIO (1080/1070) hybrid kit on my 1080 ti. My biggest question is, are you able to put on the evga shroud on top of your 1080 ti?
> What I mean is, gamersnexus has done a full video on this, and he basically just leaves it as it is, just fits on the water cooler and has that sticking out. Looks quite ugly imo and if that's really the case I was thinking of going for a much cheaper approach with my own custom cpu water cooler + nzxt case gpu case.
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> would be very helpful if I could get some feedbacks off those who decided to custom water cool their 1080 ti's.


That's the entire reason EVGA released a new AIO cooler for the 1080Ti. The only difference is the newer shroud. The 1080/1070 hybrid kits shroud does not fit on the 1080 Ti. The cooler itself does, just not the shroud. If you want the shroud you have to buy the 1080 Ti kit.


----------



## illypso

Quote:


> Originally Posted by *tahsin95*
> 
> Sorry guys, a bit unrelated to the heavy discussion about overclocking above.
> I'm thinking of fitting the EVGA AIO (1080/1070) hybrid kit on my 1080 ti. My biggest question is, are you able to put on the evga shroud on top of your 1080 ti?
> What I mean is, gamersnexus has done a full video on this, and he basically just leaves it as it is, just fits on the water cooler and has that sticking out. Looks quite ugly imo and if that's really the case I was thinking of going for a much cheaper approach with my own custom cpu water cooler + nzxt case gpu case.
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> would be very helpful if I could get some feedbacks off those who decided to custom water cool their 1080 ti's.


I have this AIO kit and followed the video for it, and ended up in the same situation at the end..... No, it doesn't fit afterwards.

I even tried to make it fit by removing plastic, but it didn't go well...

I wonder if it would have fit had I used the plate that came with the kit, but I didn't check afterwards whether there's a reason you don't use the plate that came with the kit.
Maybe the bolt placement; one sure thing is the thermal pads are not the same or in the same places.

It was said that you don't need the air to go everywhere to cool the card, so it's not a problem for airflow, but it does look kind of weird.

But for me, my case is a Frankenstein case, so it fits with the ... theme....

Also I can say that with the AIO from EVGA, the pump is noisy and vibrates a lot. It's not that bad, but it's way noisier than my Corsair H110 (which was more of a high-pitched sound)


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> I have 2100 at 1.075v with zero throttling.
> 
> You can reflash it, yes, just need to do the nvflash64 --protecton command when you are done to lock the BIOS again so they have no idea you did --protectoff and flashed it from original.
> 
> Trick I figured out from my Maxwell Titan X days


I can't seem to unlock the voltage with the .Oem2 file again. Is there a reason for this?

This is also really odd but I think I have coil whine now running this bios.


----------



## ThingyNess

Your PCI device ID will have changed after you flashed, so you'll have to use the one from the new Asus card whose BIOS you used, right? (just making sure you did this step; it's easy to forget)


----------



## KingEngineRevUp

Yeah, I did. The new ID is

85E41043

I added it as

[VEN_10DE&DEV_1B06&SUBSYS_85E41043&REV_??]
VDDC_Generic_Detection = 1

and it's not working









EDIT: Okay, got it... I think notepad messed it up. I used Notepad++
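For anyone else rebuilding that entry, here is a tiny hypothetical helper showing how the detection key is composed from the card's PCI vendor/device/subsystem IDs (the function and its parameter names are mine; only the resulting string comes from the post above):

```python
def oem2_key(vendor: str, device: str, subsys: str, rev: str = "??") -> str:
    """Assemble an Afterburner .Oem2-style PCI detection key.

    Hypothetical helper for illustration; the IDs are the ones
    KingEngineRevUp posted (vendor 10DE = NVIDIA, device 1B06 =
    GTX 1080 Ti, subsystem from the flashed Strix BIOS).
    """
    return f"[VEN_{vendor}&DEV_{device}&SUBSYS_{subsys}&REV_{rev}]"

key = oem2_key("10DE", "1B06", "85E41043")
```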


----------



## Juub

Any of you guys upgraded from a 980 Ti? If so what are the differences max OC vs max OC?


----------



## leequan2009

Quote:


> Originally Posted by *Juub*
> 
> Any of you guys upgraded from a 980 Ti? If so what are the differences max OC vs max OC?


What do you mean by max OC vs max OC? 980 Ti max OC vs 1080 Ti max OC?
I upgraded to a 1080 Ti from a 980 Ti Xtreme. I'd say it's about 50% stronger than the 980 Ti.


----------



## gkolarov

Quote:


> Originally Posted by *Juub*
> 
> Any of you guys upgraded from a 980 Ti? If so what are the differences max OC vs max OC?


I upgraded from a 980 Ti Windforce OC @1500 core and the performance gain is around 50%. Of course it depends on the particular game, but let's say that [email protected] is 30 to 60% faster.


----------



## KingEngineRevUp

Hmmm...

So with the ASUS Strix BIOS I now have coil whine; or maybe it was always there but wasn't coming through DP1 (which is now disabled) and now comes through the other DisplayPorts.

Also, my score running at 2088 MHz got worse.



Left with ASUS bios right with stock bios

Not sure what's going on here or if I'm doing something wrong.


----------



## Jbravo33

Quote:


> Originally Posted by *KedarWolf*
> 
> I successfully flashed the Strix BIOS to my 1080 TI FE.
> 
> I'm getting 2088 core, 6102 memory first run in Heaven under water with no throttling, no driver crashes. My power limit hasn't gone over 106, no shunt mod.
> 
> Here's how I did it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> nvflash_5.353.0.zip 3031k .zip file
> 
> 
> 1080ti_strix.zip 154k .zip file
> 
> 
> Unzip NVFlash and the Strix bios to a folder.
> 
> Disable your video card in Control Panel in Device Manager.
> 
> Run an admin command prompt and cd to the folder you made.
> 
> Do in command prompt:
> 
> nvflash64 --protectoff.
> 
> Then do:
> 
> nvflash64 --save filename.rom
> 
> to backup original bios.
> 
> nvflash64 -6 biosfilename.rom
> 
> Reboot, if the card is not enabled in Device Manager, enable it, reboot again. Good idea to reinstall Nvidia driver, reboot last time, done.
> 
> For more than one card.
> 
> nvflash64 --list
> 
> nvflash64 -6 --index=0 BIOSfilename.rom
> 
> nvflash64 -6 --index=1 BIOSfilename.rom
> 
> Choose yes when asked to override mismatched IDs.


I want to try this, but having SLI is holding me back. I might f it up.


----------



## KickAssCop

Anyone got 2 ASUS STRIX in SLi? Temps?


----------



## shalafi

Quote:


> Originally Posted by *Juub*
> 
> Any of you guys upgraded from a 980 Ti? If so what are the differences max OC vs max OC?


"me too" post - my 1080Ti @ 2000MHz+ is 50% faster than my previous EVGA 980Ti Hybrid @ 1500MHz+


----------



## Exilon

Shunt mod + water block installation completed! TDP now hovers around 90% in game and all power throttling is gone.
Power at the wall peaked at 500W in FSE, which meant the shunt mod added at least 70W worth of extra TDP headroom.



http://www.3dmark.com/fs/12211761

Memory actually went up to 12.2 GHz, forgot to capture GPU-Z for it.

I can see why Nvidia limited the voltage to 1.093v. The power scaling is crazy inefficient for a mere 100 MHz past 2GHz and any more would result in people blowing up their VRMs.

I've also noticed my system memory reset to 1333 MHz... no wonder I was getting so CPU bottlenecked in BF1
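As a rough illustration of why that last 100 MHz past 2 GHz is so expensive: to first order, CMOS dynamic power scales with frequency times voltage squared. A sketch using the clocks and the 1.093V cap mentioned above (the ~1.000V baseline is an assumed figure, and static leakage is ignored):

```python
# First-order CMOS dynamic power model: P ~ f * V^2 (ignores static leakage).
def dynamic_power_ratio(f1_mhz, v1, f2_mhz, v2):
    return (f2_mhz / f1_mhz) * (v2 / v1) ** 2

# ~2000 MHz at an assumed ~1.000 V baseline vs ~2100 MHz at the 1.093 V cap
ratio = dynamic_power_ratio(2000, 1.000, 2100, 1.093)
print(f"+{(2100 / 2000 - 1) * 100:.0f}% clocks costs ~+{(ratio - 1) * 100:.0f}% dynamic power")
# -> +5% clocks costs ~+25% dynamic power
```

Paying roughly a quarter more power for one extra bin of clocks is why the scaling past 2 GHz looks so bad.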


----------



## Mato87

Quote:


> Originally Posted by *Juub*
> 
> Any of you guys upgraded from a 980 Ti? If so what are the differences max OC vs max OC?


I did, don't know about the exact numbers in terms of OC, but the card is 50% faster in almost all the games I have tested so far.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Exilon*
> 
> Shunt mod + water block installation completed! TDP now hovers around 90% in game and all power throttling is gone.
> Power at the wall peaked at 500W in FSE, which meant the shunt mod added at least 70W worth of extra TDP headroom.
> 
> 
> 
> http://www.3dmark.com/fs/12211761
> 
> Memory actually went up to 12.2 GHz, forgot to capture GPU-Z for it.
> 
> I can see why Nvidia limited the voltage to 1.093v. The power scaling is crazy inefficient for a mere 100 MHz past 2GHz and any more would result in people blowing up their VRMs.
> 
> I've also noticed my system memory reset to 1333 MHz... no wonder I was getting so CPU bottlenecked in BF1


Lol, if only you'd come by earlier today: the shunt mod isn't needed anymore. You can flash the ASUS BIOS and get a similar effect.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Juub*
> 
> Any of you guys upgraded from a 980 Ti? If so what are the differences max OC vs max OC?


The 1080 is supposed to be about 30% faster than the 980 Ti, and the 1080 Ti is approximately 1.25 times as fast as a 1080.

So, compounded, it's about 63% faster than a 980 Ti.
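The compounding behind that estimate (the 30% and 25% figures are the rough claims from this thread, not benchmarks) is just multiplied speedups:

```python
# Compounded speedup from the two rough claims above (not measured data).
gain_1080_over_980ti = 1.30  # GTX 1080 ~30% faster than 980 Ti (claimed)
gain_ti_over_1080 = 1.25     # 1080 Ti ~1.25x a 1080 (claimed)

combined = gain_1080_over_980ti * gain_ti_over_1080
print(f"1080 Ti vs 980 Ti: {combined:.3f}x")  # -> 1.625x, i.e. ~63% faster
```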


----------



## Exilon

Quote:


> Originally Posted by *SlimJ87D*
> 
> Lol, if you only came earlier today the shunt mod isn't needed anymore. You can bios flash ASUS bios and get a similar affect.


The Asus BIOS only goes up to 320W, and from reading your experience with it, I gather it wasn't a stellar improvement. The GTX 1080 ASUS BIOS was the same story, with people clocking up to 2.2 GHz and getting worse scores.

I'll stick with the shunt mod


----------



## pez

Quote:


> Originally Posted by *Slackaveli*
> 
> Guys, I apologize for saying the forum was ignorant. I was just surprised that I had no less than 5 posters saying I was FoS, and none had yet confirmed that I indeed was not just lying and making up false claims about my CPU because I wanted to justify my purchase. It wasn't like that AT ALL; I was trying to share knowledge. I got frustrated. But later several informed posters came along and the forum has now been educated on this topic. I am sorry I came across so salty; I am a highly functioning autistic and got a little carried away, so on that day less highly functioning lol. I apologize, I hope you guys understand.


Hey man, no worries







. It's hard not to immediately want to be aggressive on forums (even as what I consider to be a functioning normal person







).

I *supposedly* got my hands on a Maximus VII Impact, and if it isn't too good to be true, I'll have an extra Z97 board that I'd love to put an i5 in, move the i7 to my GF's system, and give a serious look into that CPU. Even though it's a bit off-topic, what OCs are generally expected/sought after with those chips?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Exilon*
> 
> Asus BIOS only goes up to 320W and reading your experience with it, I gather it wasn't a stellar improvement. GTX 1080 ASUS BIOS was the same story with people clocking up to 2.2 GHz and getting worse scores.
> 
> I'll stick with the shunt mod


You're just saying that because you already did it, lol JK.

I probably need to set up my voltage curve better.

Running my card at a constant 2114 MHz at 1.072V, it never drew more than 320W; 305-310 watts at most. Do we really need to put more than that into these cards?


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> You're just saying thatbbec you already did it, lol JK.
> 
> I probably got to setup my voltag curve better.
> 
> Running my card at 2114 MHz constant at 1.072V never drew more than 320W. At most it drew 305-310 watts max. Do we really need to out more than that in these cards?


Man, that's great! My card is a fickle little b!tch and doesn't like to be pushed past 2050 on the core no matter what I do.


----------



## Exilon

Quote:


> Originally Posted by *SlimJ87D*
> 
> You're just saying thatbbec you already did it, lol JK.
> 
> I probably got to setup my voltag curve better.
> 
> Running my card at 2114 MHz constant at 1.072V never drew more than 320W. At most it drew 305-310 watts max. Do we really need to out more than that in these cards?


Guilty as charged







.

You got a nice sample. I need the last 1.093V bin to hit 2114 MHz benchmark stable so I backed it off to 2100 MHz.

What were you running to load the GPU? I found that Firestrike Extreme and Ultra just wrecked the power usage. Overwatch as well for some reason.


----------



## Benny89

Could someone be so kind as to brief me on this "shunt mod" and the ASUS BIOS? My Strix will be here tomorrow and I'm aiming to at least try for 2100 MHz in benchmarks.


----------



## Exilon

Quote:


> Originally Posted by *Benny89*
> 
> Can someone be so kind and brief me on this "shunt mod" and ASUS Bios? My STRIX Will be here tomorrow and I am aiming to at least try in bechmark to get 2100mhz.


Shunt mod is where you modify the power measurement resistors on the PCB to reduce their resistance. The GPU firmware limits board power by measuring the voltage drop across these resistors to calculate total power being consumed by the whole board (fans/LEDs included). Since these power measurement resistors are in series with the load, simply applying a lower resistance element (such as liquid metal) in parallel is enough to trick the power measurement. The surface tension of the liquid metal keeps the mod in place by itself. Don't get it on the solder because the gallium will weaken the tin and can cause the solder joint to fail.

BIOS mod is where you flash (overwrite) the stock BIOS with another one. In Maxwell, we could download our own card's BIOS and modify it at will, but with Pascal Nvidia has locked down on that. The ASUS mod is where we take the ASUS Strix BIOS and flash it onto the Founder's Edition (or other cards maybe). This is done to take advantage of the Strix BIOS's higher power limit, but you end up with a different board's BIOS on your card. This can cause ports to be disabled and strange behavior. Need more vetting, but a few people have gotten it to work.

But since you already have the Strix coming in, you don't really need to do either one.
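The trick in the shunt mod above is just parallel resistance: the firmware computes current (and hence power) from the voltage drop divided by the shunt's nominal resistance, so a lower-resistance path in parallel makes the same current produce a smaller drop. A numeric sketch (the 5 mOhm values are illustrative guesses, not measured from the 1080 Ti PCB):

```python
# Why a parallel liquid-metal bridge fools the power reading:
# firmware computes I = V_drop / R_shunt using the ORIGINAL shunt value,
# but the real effective resistance is the parallel combination.
def parallel(r1, r2):
    return (r1 * r2) / (r1 + r2)

r_shunt = 0.005   # 5 mOhm sense resistor (illustrative value, not measured)
r_bridge = 0.005  # liquid-metal path of similar resistance (illustrative)
r_eff = parallel(r_shunt, r_bridge)

actual_current = 50.0                                # amps actually flowing
reported_current = actual_current * r_eff / r_shunt  # what the firmware sees
print(f"{actual_current:.0f} A reads as {reported_current:.0f} A "
      f"-> board power appears halved")
```

With equal resistances the reported power is exactly half the real draw, which is why the mod buys so much apparent TDP headroom.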


----------



## Benny89

Quote:


> Originally Posted by *Exilon*
> 
> Shunt mod is where you modify the power measurement resistors on the PCB to reduce their resistance. The GPU firmware limits board power by measuring the voltage drop across these resistors to calculate total power being consumed by the whole board (fans/LEDs included). Since these power measurement resistors are in series with the load, simply applying a lower resistance element (such as liquid metal) in parallel is enough to trick the power measurement. The surface tension of the liquid metal keeps the mod in place by itself. Don't get it on the solder because the gallium will weaken the tin and can cause the solder joint to fail.
> 
> BIOS mod is where you flash (overwrite) the stock BIOS with another one. In Maxwell, we could download our own card's BIOS and modify it at will, but with Pascal Nvidia has locked down on that. The ASUS mod is where we take the ASUS Strix BIOS and flash it onto the Founder's Edition (or other cards maybe). This is done to take advantage of the Strix BIOS's higher power limit, but you end up with a different board's BIOS on your card. This can cause ports to be disabled and strange behavior. Need more vetting, but a few people have gotten it to work.
> 
> But since you already have the Strix coming in, you don't really need to do either one.


Nice mate, thanks! + Rep for that. I learnt something new


----------



## pez

So I bit the bullet on the GB/AORUS AIB...not sure how much longer it'll be in stock if people are interested: https://www.newegg.com/Product/Product.aspx?Item=N82E16814125954

Here's hoping I don't see those nasty QC issues that the Xtreme Gaming cards had for the 1080.


----------



## Luck100

What QC control issues? Got any links?


----------



## Benny89

Quote:


> Originally Posted by *pez*
> 
> So I bit the bullet on the GB/AORUS AIB...not sure how much longer it'll be in stock if people are interested: https://www.newegg.com/Product/Product.aspx?Item=N82E16814125954
> 
> Here's hoping I don't see those nasty QC issues that the Xtreme Gaming cards had for the 1080.


Just want to warn you: I read from one owner on YouTube that his Aorus Xtreme Gaming was hitting 78C under load without an OC (1990+ boost).

It confirms nothing, since there are no reviews yet, but it's worth posting here as an unbiased random owner's experience.


----------



## BrainSplatter

Quote:


> Originally Posted by *Benny89*
> 
> It confirms nothing, since there are no reviews yet, but better to post it here as this is non-biased random owner experience.


Without some fan noise/RPM numbers, this says essentially nothing because it could just be a conservative fan curve. Maybe in combination with high GPU VID. It's also annoying that default GPU voltage is usually missing from reviews since it can make a noticeable difference in power requirements, heat production and fan noise.
Quote:


> Originally Posted by *Luck100*
> 
> What QC control issues? Got any links?


The fan was hitting the frame; a number of people complained about it on previous versions of that cooler.


----------



## TWiST2k

Quote:


> Originally Posted by *leequan2009*
> 
> How to burn your card
> 
> 
> 
> 
> 
> 
> 
> LOL


Your setup looks awesome! What case and cooler is that?


----------



## pez

Quote:


> Originally Posted by *Luck100*
> 
> What QC control issues? Got any links?


The Gigabyte-branded Xtreme Gaming card had some initial launch issues where users (even reviewers) received cards with bent heatsink fins, and with the middle fan improperly assembled so that it contacted the other two fans.

Fan issue:

https://www.reddit.com/r/58dmz3/gtx_1080_xtreme_gaming_do_those_heatsinks_look/

Quote:


> Originally Posted by *Benny89*
> 
> Just want to warn you, as I read from one owner on YouTube that his Aorus Xtreme Gaming was hitting 78C under load without OC (1990+ boost).
> 
> It confirms nothing, since there are no reviews yet, but better to post it here as this is non-biased random owner experience.

Yeah, I'll definitely keep an eye on it. Tried to put in a cancellation actually as I don't believe the card is going to fit in my case like I previously thought. A bit bummed out by it, but if it does arrive and I decide to try and stick it out (maybe some modding) I'll update you guys.


----------



## PasK1234Xw

Quote:


> Originally Posted by *alucardis666*
> 
> Man that's great! My card is a fickle little b!tch and doesn't like to be pushed passed 2050 on the core no matter what I do.


Yeah, the curve isn't going to do anything for us.
The only time it will is if you can hold 2100 for prolonged periods and then lock up; for instant locks, even 1.093V isn't enough.

I've had 3 FEs so far (EVGA, MSI and NVIDIA), all about the same (2030-2050), with the MSI being slightly better at holding 2050, so I've kept that one for the moment.


----------



## mshagg

Quote:


> Originally Posted by *braindamage*
> 
> Tried the Strix bios on my EVGA 1080ti FE. Some notes:
> 
> -The topmost displayport is disabled (thought I bricked my card at first
> 
> 
> 
> 
> 
> 
> 
> )
> .


So, fingers crossed some of the other partner cards also have the extra 30W headroom on the TDP limit, as I believe it's only the ASUS cards that replaced one of the DP ports with an HDMI. Kinda need all three DP for monitors and the HDMI for the Vive; not a fan of workflows that require constant unplugging/plugging.


----------



## fisher6

I managed to get my card stable at 2076Mhz yesterday with the curve. Played Wildlands for hours without issues.


----------



## mshagg

TechPowerUp has the first lot of Ti BIOS files uploaded. Of note is the MSI Gaming X:

https://www.techpowerup.com/vgabios/190927/msi-gtx1080ti-11264-170320

Board power limit: target 280.0 W, limit 330.0 W

Unfortunately it also has the two HDMI ports, so this is likely to disable a DP port on the FE.


----------



## leequan2009

Quote:


> Originally Posted by *TWiST2k*
> 
> Your setup looks awesome! What case and cooler is that?


Case: Thermaltake Core X9
WaterCooler: Block EK for VGA and 480mm Radiator


----------



## KedarWolf

Quote:


> Originally Posted by *mshagg*
> 
> Techpower up has the first lot of TI BIOS files uploaded. Of note is the MSI Gaming X:
> 
> https://www.techpowerup.com/vgabios/190927/msi-gtx1080ti-11264-170320
> 
> Board power limit Target: 280.0 W
> Limit: 330.0 W
> 
> Unfortunately it also has the two HDMI ports, so this is likely to disable a DP port on the FE.


I'm getting such good results with the Strix BIOS on my FE I don't think I'm going to try the MSI one.

I get a constant 2100 core and 6132 memory, and the power limit never goes over 111% while benching. At 1.075v, the core clock and voltage never drop during an entire Heaven run at 2100.









I have an earlier post with the GPU-Z log of a Heaven bench.









Here's my GPU-Z log









GPU-ZSensorLog.txt 35k .txt file
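If you'd rather check a GPU-Z sensor log like the attached one for throttling instead of eyeballing it, here's a small sketch (the column header is an assumption; match it to whatever your GPU-Z version writes):

```python
import csv
import io

# Count samples where the core clock fell below a target in a GPU-Z sensor log.
# GPU-Z logs are comma-separated with padded headers; the exact column name
# ("GPU Core Clock [MHz]") is an assumption -- adjust it to your log file.
def clock_dips(log_file, target_mhz, column="GPU Core Clock [MHz]"):
    reader = csv.DictReader(log_file, skipinitialspace=True)
    dips = 0
    for row in reader:
        # header cells may keep trailing spaces, so match loosely
        key = next(k for k in row if k and column in k)
        if float(row[key]) < target_mhz:
            dips += 1
    return dips

# tiny stand-in for an actual log file
sample = io.StringIO(
    "Date , GPU Core Clock [MHz] , GPU Temperature [C]\n"
    "2017-03-28 20:00:01 , 2100.0 , 40.0\n"
    "2017-03-28 20:00:02 , 2088.5 , 41.0\n"
)
print(clock_dips(sample, 2100))  # -> 1
```

Point it at the real log with `open("GPU-ZSensorLog.txt")` and your target clock; zero dips over a full Heaven run means no throttling.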


----------



## fisher6

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm getting such good results with the Strix BIOS on my FE I don't think I'm going to try the MSI one.
> 
> Get a constant 2100 core, 6162 memory, power limit never goes over 111% while benching. And at 1.075v the core and voltage never downclocks an entire Heaven run at 2100.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have an earlier post with the GPU-Z log of a Heaven bench.


Is there any risk of bricking the card if something goes wrong during the flashing process?


----------



## KedarWolf

Quote:


> Originally Posted by *fisher6*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I'm getting such good results with the Strix BIOS on my FE I don't think I'm going to try the MSI one.
> 
> Get a constant 2100 core, 6162 memory, power limit never goes over 111% while benching. And at 1.075v the core and voltage never downclocks an entire Heaven run at 2100.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have an earlier post with the GPU-Z log of a Heaven bench.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is there any risk of bricking the card if something goes wrong during the flashing process?

If you have a second video card you can flash the original BIOS back if anything goes wrong, but I don't think there is much chance of bricking the card, no.

See here how to flash.

Here's how I did it









nvflash_5.353.0.zip 3031k .zip file


1080ti_strix.zip 154k .zip file


Unzip NVFlash and the Strix bios to a folder.

Disable your video card in Device Manager.

Run an admin command prompt and cd to the folder you made.

Do in command prompt:

To turn protection off so you can flash the BIOS.

nvflash64 --protectoff

Then do:

nvflash64 --save filename.rom

to backup original bios.

To flash bios.

nvflash64 -6 biosfilename.rom

Reboot, if the card is not enabled in Device Manager, enable it, reboot again. Good idea to reinstall Nvidia driver, reboot last time, done.

For more than one card.
To see cards.

nvflash64 --list

To flash BIOS.

nvflash64 -6 --index=0 BIOSfilename.rom

nvflash64 -6 --index=1 BIOSfilename.rom

Choose yes when asked to override mismatched IDs.
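One extra precaution worth adding to the steps above: verify the backup ROM from `nvflash64 --save` before flashing anything over it. A small sketch (the filename and 128 KiB size floor are hypothetical choices, not anything nvflash enforces, and nvflash itself obviously isn't run here):

```python
import hashlib
import pathlib
import tempfile

# Sanity-check the backup made with "nvflash64 --save backup.rom" before
# flashing: the file must exist, be a plausible size, and we record a SHA-256
# so a copy kept elsewhere can be verified later.
def checksum_backup(path, min_bytes=128 * 1024):
    rom = pathlib.Path(path)
    if not rom.is_file() or rom.stat().st_size < min_bytes:
        raise RuntimeError(f"{rom} is missing or suspiciously small; re-run --save")
    return hashlib.sha256(rom.read_bytes()).hexdigest()

# demo against a stand-in file (a real VBIOS dump is a few hundred KB)
with tempfile.TemporaryDirectory() as tmp:
    fake_rom = pathlib.Path(tmp, "backup.rom")
    fake_rom.write_bytes(b"\x55\xaa" * (128 * 1024))  # 256 KiB of dummy data
    print(checksum_backup(fake_rom)[:16])
```

Keep the hash with the offsite copy of the ROM; if the two hashes match, you know the backup you'd flash back is intact.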


----------



## madmeatballs

I really hope we get some statistics on the first page of this thread. Or maybe someone can start a new one while this is still kinda fresh. lol


----------



## pantsoftime

Here's hoping for an AIB BIOS that doesn't wipe out yet another port on an FE card. It was bad enough losing the DVI. The EVGA cards have 3 DPs, right? Hopefully their BIOS can boost power limits. On the 1080, flashing AIB BIOSes seemed to do nothing helpful at all (other than T4).


----------



## KedarWolf

Quote:


> Originally Posted by *pantsoftime*
> 
> Here's hoping for an AIB BIOS that doesn't wipe out yet another port on an FE card. It was bad enough losing the DVI. The EVGA cards have 3 DP's right? Hopefully their BIOS can boost power limits. On the 1080 flashing AIB BIOSes seemed to do nothing helpful at all (other than T4).


Big improvement with the Strix BIOS: about a 4% better bench score at the exact same clocks on each BIOS, and a fully stable, no-downclocking 2100 core at 1.075v.

Did a guide with info and results here.

http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe/0_20#post_25983584


----------



## pantsoftime

Quote:


> Originally Posted by *KedarWolf*
> 
> Big improvement with Strix BIOS, about 4% better bench with exact same clocks each BIOS, and a fully stable no downlocking 2100 core at 1.075v.
> 
> Did a guide with info and results here.
> 
> http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe/0_20#post_25983584


Yeah thanks for your write-up. Looking forward to trying this with a BIOS that doesn't drop a port.


----------



## mshagg

Quote:


> Originally Posted by *pantsoftime*
> 
> Here's hoping for an AIB BIOS that doesn't wipe out yet another port on an FE card. It was bad enough losing the DVI. The EVGA cards have 3 DP's right? Hopefully their BIOS can boost power limits. On the 1080 flashing AIB BIOSes seemed to do nothing helpful at all (other than T4).


Not expecting much of an improvement in peak clocks but if I can get this card to hold 2075 with the extra TDP headroom it's well worth something as simple as a BIOS flash. Currently steps down 2 or 3 bins on the 300W limit.

Not worth losing a display output for, if it's something you use.
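For context on the "bins": Pascal's GPU Boost moves clocks in roughly 13 MHz steps (a community-observed figure, not an official NVIDIA spec), so stepping down 2 or 3 bins from 2075 looks like this:

```python
# GPU Boost 3.0 steps clocks in ~13 MHz "bins"; the step size here is the
# commonly cited community figure, not an official spec.
BIN_MHZ = 13

def bins_down(base_mhz, n_bins):
    return base_mhz - n_bins * BIN_MHZ

for n in (1, 2, 3):
    print(f"{n} bin(s) below 2075 MHz -> ~{bins_down(2075, n)} MHz")
# -> ~2062, ~2049, ~2036 MHz
```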


----------



## fisher6

Quote:


> Originally Posted by *KedarWolf*
> 
> Big improvement with Strix BIOS, about 4% better bench with exact same clocks each BIOS, and a fully stable no downlocking 2100 core at 1.075v.
> 
> Did a guide with info and results here.
> 
> http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe/0_20#post_25983584


Does it hold 2100Mhz while gaming too?


----------



## KedarWolf

Quote:


> Originally Posted by *fisher6*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Big improvement with Strix BIOS, about 4% better bench with exact same clocks each BIOS, and a fully stable no downlocking 2100 core at 1.075v.
> 
> Did a guide with info and results here.
> 
> http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe/0_20#post_25983584
> 
> 
> 
> Does it hold 2100Mhz while gaming too?

I'll GPU-Z log Diablo 3 Reaper Of Souls and Rise of The Tomb Raider at 4K with G-Sync off and on when I get home from work in eight hours, post the results.

Can do Black Ops 3 and Wildlands as well.


----------



## Lefty23

First post here so hello,
…and sorry for the long read.

I finally found some time last weekend to install the waterblock (EK-FC Titan X Pascal - Acetal+Nickel) and put the card (EVGA 1080Ti FE) into my system.
Then I spent a few hours running synthetics like the 3DMark suite and Unigine (I'll do it again at the end of the year when I upgrade to a more current platform).

For reference the system is a 10 months old win10 installation.
CPU is a [email protected] - DDR3 [email protected]
Using the latest GeForce driver (378.92 WHQL).
There are 3×240mm rads and a D5 pump in a Core V21 case, all running @100% for the benches.
The room temperature is 21-22 C.

Card is running @+190 on the core (up to 2100, but it hits the PL depending on the bench - check screenshots below) and @+575 on the memory.
Also included are some comparisons against my water-cooled 980 Ti (Palit SJ) running at 1550/[email protected] on a modded BIOS that removed the thermal/power limits. This is on the same system, running the same driver and under the same conditions (room temps, etc.).
Judging from the overall scores, it seems the 4790K shows its age, at least in synthetics (not really a problem yet in the games I play @1440p). Graphics scores seem OK though.

Firestrike - Overall Score: 21821


Spoiler: Warning: Spoiler!






Graphics Score: 33076 (980ti - 22082)
http://www.3dmark.com/fs/12186356
http://www.3dmark.com/compare/fs/12150061/fs/12186356

FS Extreme - Overall Score: 13967


Spoiler: Warning: Spoiler!






Graphics Score: 16016 (980ti - 9974)
http://www.3dmark.com/fs/12180293
http://www.3dmark.com/compare/fs/12149955/fs/12180293

FS Ultra - Overall Score: 7672


Spoiler: Warning: Spoiler!






Graphics Score: 7885 (980ti - 5011)
http://www.3dmark.com/fs/12180196
http://www.3dmark.com/compare/fs/12149848/fs/12180196

Timespy - Overall Score: 9466


Spoiler: Warning: Spoiler!






Graphics Score: 11131 (980ti - 6699)
http://www.3dmark.com/spy/1493873
http://www.3dmark.com/compare/spy/1462441/spy/1493873

For the Unigine runs I just used the settings that gave me the best scores in 3DMark (for some reason the Heaven 1080p test didn't like 2100 MHz - any theories??? - so that run only is @2088).
Heaven
2088/6075 --- 167.2 FPS --- 4211 --- 1080p


Spoiler: Warning: Spoiler!






2100/6075 --- 104.7 FPS --- 2638 --- 1440p


Spoiler: Warning: Spoiler!






2100/6075 --- 66.5 FPS --- 1674 --- 4K wrong DSR ratio in NVCP
2100/6075 --- 57.7 FPS --- 1454 --- 4K


Spoiler: Warning: Spoiler!







Valley
2100/6075 --- 152.0 FPS --- 6360 --- 1080p


Spoiler: Warning: Spoiler!






2100/6075 --- 106.8 FPS --- 4469 --- 1440p


Spoiler: Warning: Spoiler!






2100/6075 --- 71.8 FPS --- 3005 --- 4K wrong DSR ratio in NVCP
2100/6075 --- 64.9 FPS --- 2715 --- 4K


Spoiler: Warning: Spoiler!







It is really a shame that we won't get the ability to modify our BIOS with Pascal. The card boosts to 2100 but hits the power limit, losing up to 4 bins (in the FS Ultra test). This happens only once in the 1080p bench (just 1 bin) but, as expected, it happens more and more as we go up in resolution. In Unigine, 1080p and 1440p are OK, but the PL comes back @4K.
Unfortunately, the card is positioned vertically in my case, so the CLU shunt mod is out of the question.
My best runs in 3DMark were @+575 on the memory (tried from 455 to 625 in intervals of 10).
Overall it's a nice card that runs cool @32-35 C during benches. Deltas on average are +12C at full blast for benching and +21C for gaming on a silent profile.
For gaming, depending on the game, I use either a 2076/12000 profile or the 2100/12150 profile used for benching above. Some games throttle back to 2050 due to the PL, others don't. It really depends on the engine, game settings, etc.
Up to now, for benching, my max voltage is 1.050v even with the slider all the way to the right. In games I've rarely seen 1.063v.
I just hate how GPU boost works since it looks like there is more potential there. I would really like to see what this card can do at say 350w PL.
I might try the Strix bios or the curve in afterburner next. Unfortunately, no time until next weekend.


----------



## KedarWolf

Quote:


> Originally Posted by *mshagg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pantsoftime*
> 
> Here's hoping for an AIB BIOS that doesn't wipe out yet another port on an FE card. It was bad enough losing the DVI. The EVGA cards have 3 DP's right? Hopefully their BIOS can boost power limits. On the 1080 flashing AIB BIOSes seemed to do nothing helpful at all (other than T4).
> 
> 
> 
> Not expecting much of an improvement in peak clocks but if I can get this card to hold 2075 with the extra TDP headroom it's well worth something as simple as a BIOS flash. Currently steps down 2 or 3 bins on the 300W limit.
> 
> Not worth losing a display output for, if it's something you use.

On the original BIOS I could only do 2062 at 1.031v without it downvolting and downclocking all the time during Heaven; that is, holding a continuous 2062 core without fluctuation.









Now I'm getting 2100 core at 1.075v with no drops in core clock or voltage during Heaven, so it's a big improvement.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Big improvement with Strix BIOS, about 4% better bench with exact same clocks each BIOS, and a fully stable no downlocking 2100 core at 1.075v.
> 
> Did a guide with info and results here.
> 
> http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe/0_20#post_25983584


My results were a little worse; did you do anything else to your voltage curve?

Also, what are your temperatures?


----------



## mshagg

Quote:


> Originally Posted by *KedarWolf*
> 
> I could only do 2062 at 1.031v without it down volting and down clocking all the time during Heaven, to get a continual 2062 core that is without it fluctuating on original BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now i'm getting 2100 core at 1.075v with no drops in core or voltages during Heaven, so big improvement.


Presumably this also has a custom voltage curve, which I can't really do as Afterburner and SteamVR don't play nice. Although for benching it's not an issue.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Big improvement with Strix BIOS, about 4% better bench with exact same clocks each BIOS, and a fully stable no downlocking 2100 core at 1.075v.
> 
> Did a guide with info and results here.
> 
> http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe/0_20#post_25983584
> 
> 
> 
> My results were a little worse, did you do anything else to your voltage curve?

I just topped it out at 2100 core at 1.075v and 6132 memory. But if you're on air you may be getting throttling from going over 50C. I think it starts to throttle then; some say it's 60C, but I've seen throttling start at 52C on my old BIOS.

See here about adjusting clocks for lower heat. You can even go below the 1.031v used in this guide; some do.

http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20


----------



## KedarWolf

Quote:


> Originally Posted by *mshagg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I could only do 2062 at 1.031v without it down volting and down clocking all the time during Heaven, to get a continual 2062 core that is without it fluctuating on original BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now i'm getting 2100 core at 1.075v with no drops in core or voltages during Heaven, so big improvement.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Presumably this also has a custom voltage curve, which I cant really do as Afterburner and SteamVR dont play nice. Although for benching goodness it's not an issue.

Yes, custom curve.









Edit: Couldn't you set clocks with Afterburner on boot, then close it before running VR?

You could try that, then log with GPU-Z and see if the clocks stay set.


----------



## fisher6

@KedarWolf Did you see any thermal throttling under water? Tempted to try this when I get home. Do you know what CPU usage was like during the benchmarks?


----------



## KedarWolf

Quote:


> Originally Posted by *fisher6*
> 
> @KedarWolf Did you see any thermal throttling under water? Tempted to try this when I get home. Do you know what CPU usage was like during the benchmarks?


No thermal throttling, no, temps hover around 40C. Core never dips below 2100 or 1.075v.

Haven't checked CPU usage, can do later with Heaven running.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> No thermal throttling, no, temps hover around 40C. Core never dips below 2100 or 1.075v.
> 
> Haven't checked CPU usage, can do later with Heaven running.


You might be right about that; more power means more heat.

My temps were around 47C with heavy benching. That extra power was pushing me a little higher, to around 51C.

Might have to run my fans at 100%... =/


----------



## dunbheagan

Quote:


> Originally Posted by *SlimJ87D*
> 
> Okay, from a UL standpoint, nail polish IS NOT OKAY because it's flammable, the opposite of what ul looks for, flame resistant material.
> 
> I would have used circuit board RTV instead, that's what it's made for!
> 
> Use RTV DAMN IT!
> 
> You remember EVGA VRM blowing up? You want nail polish around that if it happened?


This is NOT an issue. I am going to watercool the card, and I did not apply the nail polish directly on the die or the VRM. The hottest the nail polish will get is 50°C.
But let's create a worst-case scenario: let's assume I applied the polish directly on the hottest part of my graphics card, the VRM, and let's assume I have a VERY hot VRM at 125°C because of poor air cooling and heavy overclocking. Now let's add an extra 10°C for good measure and assume the nail polish gets heated to 135°C constantly over a longer period of time.
Would that be an issue? NO.
How do I know? Because I tested it on an old sound card. I applied two different brands of nail polish to different spots on the card, then I put it in the oven at 135°C for a whole hour. I placed it vertically, so the nail polish would run down the card if it melted.
The pictures in my next post were taken directly after I got it out. Every spot of nail polish was in exactly the same condition as before. No change in color, it was still hard, it did not even smell. Absolutely no change at all.
So don't panic. If you get problems with nail polish on your graphics card, it is definitely not because of heat.
You are right, there are surely insulators or paints with better heat resistance than nail polish, but for the temps you get in a PC, even worst case, it simply doesn't matter. And nail polish is easily accessible, especially if you are married  and it comes with a handy brush. What more could you want?

PS: I am not saying that applying nail polish to your graphics card is totally safe. I am doing this for the first time, so maybe reality will teach me not to do it again. But heat will not be a problem.
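For anyone wondering what's actually being coated here: the shunt mod this thread keeps circling back to stacks a second resistor on top of the card's current-sense shunt, so the controller under-reads power draw. A minimal sketch of the arithmetic, assuming an illustrative (not measured) 5 mΩ stock shunt:

```python
def parallel(r1, r2):
    """Effective resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

# Assumed stock shunt value, for illustration only.
stock_shunt = 0.005  # ohms

# Stacking an identical resistor on top halves the sensed resistance.
modded = parallel(stock_shunt, stock_shunt)

# The controller still assumes the stock value when converting the sensed
# voltage drop to power, so the card "sees" only half its real draw.
print(round(modded, 6))                 # 0.0025
print(round(modded / stock_shunt, 3))   # 0.5
```

That halved reading is why TiN-style modded cards can blow well past their nominal power limit without the BIOS stepping in.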


----------



## KingEngineRevUp

Quote:


> Originally Posted by *dunbheagan*
> 
> This is NOT an issue. I am going to watercool the card and i did not apply the nail polish directly on the DIE or the VRM. The hottest the nail polish will get is 50°C.
> But let's create a worst case scenario: Let's assume i apllied the polish directly on the hottest part of my graphics card, the VRM, and let's assume I have VERY hot VRM at 125°C because of poor air cooling and heavy overclocking. No let's add an extra 10°C for whoever knows and let's assume the nail polish gets heated up to 135°C constantly over a longer period of time.
> Would that be an issue? NO.
> How do I know? Because I tested it on an old sound card. I applied two different brands of nail polish to different spots on the card, then i put it in the oven at 135°C for a whole hour. I put it vertically, so the nail polish would run down the card if it melts.
> The pictures in my next post are taken directly after i got it out. Every spot of nail polish was exactly in the same condition as before. No change in color, it was still hard, it did not even smell. Absolutely no change at all.
> So dont panic. If you get problems with nail polish on your graphics card, it is definetly not because of heat.
> You are right, there are surely isolators or paints with a better heat resistance than nail polish, but for the temps you get in PC, even worst case, it simply doesn't matter. And nail polish is easily accessible, especially if you are married  and it comes with a handy brush. What more could you want?
> 
> PS: I am not saying that applying nail polish to your graphics card is totally safe. I am doing this for the first time, so maybe reality will teach me not to do it again. But heat will not be a problem.


From a UL standpoint, circuit board RTV would have been better; it is acceptable and nail polish is not. That's all I'm saying. Why use nail polish when you can use RTV?

You can also peel off black RTV with your finger if you need to RMA. I hope if other people do the mod, they use RTV.


----------



## Slackaveli

Quote:


> Originally Posted by *Juub*
> 
> Any of you guys upgraded from a 980 Ti? If so what are the differences max OC vs max OC?


Boost is ~2000 instead of ~1500, and it packs almost double the VRAM, almost double the bandwidth, and still manages 25% more cores.
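A rough back-of-envelope from those figures (illustrative only; real in-game gains also depend on memory, drivers, and the title):

```python
# Figures from the post above: ~2000 vs ~1500 MHz boost, 25% more cores.
clock_ratio = 2000 / 1500          # ~1.33x clock speed
core_ratio = 1.25                  # 25% more CUDA cores

# Raw shader throughput scales with clocks * cores.
theoretical = clock_ratio * core_ratio
print(round(theoretical, 2))       # 1.67 -> ~67% more raw throughput
```

Which is in the same ballpark as the ~50-60% real-world gains owners report once memory and game limits shave a bit off.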


----------



## dunbheagan

Quote:


> Originally Posted by *SlimJ87D*
> 
> From a UL standpoint, circuit board RTV would have been better, it is acceptable. Nail polish is not. That's all I'm saying. Why use nail polish when one can use RTV.
> 
> You can also peel off black RTV with your finger if you need to RMA. I hope if other people do the mod, they use RTV.


BTW: what is a UL standpoint?

And what makes RTV better for this particular case? You surely would not like to peel it off with your finger with all the small parts under it. Even if you would, it is not interesting for me. I am not the type of guy who shunt mods his card and then tries to RMA it if it goes wrong.


----------



## Slackaveli

Quote:


> Originally Posted by *pez*
> 
> Hey man, no worries
> 
> 
> 
> 
> 
> 
> 
> . It's hard not to immediately want to be aggressive on forums (even as what I consider to be a functioning normal person
> 
> 
> 
> 
> 
> 
> 
> ).
> 
> I *supposedly* got my hands on a Maximum VII Impact, and if it isn't too good to be true, I'll have an extra Z97 board that I'd love to put an i5 in, move the i7 to my GFs system and give a serious look into that CPU. Even though it's a bit off-topic, what OCs are generally expected/sought after with those chips?


Pretty much everybody can get 4.2-4.3 at anywhere from 1.25-1.35v, and these chips can take over 1.4v core and still run cool. I am running (daily) [email protected], up from the stock 3.3. Don't let those lower-sounding numbers scare you, as this chip at 4.2 is on par IPC-wise with Skylake at 4.5.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *dunbheagan*
> 
> btw: What is a UL standpoint?
> 
> And what makes RTV better for this particular case? You surely would not like to peel it off with your finger with all the small parts under it. Even if you would, it is not interesting for me. I am not the type of guy who shunt mods his card and then tries to rma it if it gors wrong.


UL is a standard we follow in the US to keep things safe. They do tons of research on computer boards and control panels, etc.

RTV is a silicone that is not conductive and is applied to boards quite often. It is like a sealing gasket you can form, and it can be peeled off.

You can get various types of RTV: ones that can be peeled off easily (in this case you would want that for RMA reasons), or some that can hold a heatsink to a VRM chip if you glued the side of it, etc.


----------



## Slackaveli

Quote:


> Originally Posted by *pez*
> 
> The Gigabyte branded Xtreme Gaming card had some initial launch issues where (even reviewers) users were receiving cards with the heatsink fins were bent as well as the middle fan being improperly assembled and contacting the other two fans.
> 
> Fan issue:
> 
> https://www.reddit.com/r/58dmz3/gtx_1080_xtreme_gaming_do_those_heatsinks_look/
> Yeah, I'll definitely keep an eye on it. Tried to put in a cancellation actually as I don't believe the card is going to fit in my case like I previously thought. A bit bummed out by it, but if it does arrive and I decide to try and stick it out (maybe some modding) I'll update you guys.


I'll trade you for my FE that cost me $770 w/ tax :/


----------



## dunbheagan

Quote:


> Originally Posted by *SlimJ87D*
> 
> UL is a standard we follow in the US to keep things safe. They do tons of research on computer boards and control panels, etc.
> 
> RTV is a silicon that is not conductive and is applied to boards quite often. It is like a sealing gasket you can form and can be peeled off.
> 
> You can get various types of RTV, ones that can be peeled off easily (in this case you would want that for RTV reasons), or some that can hold a heatsink to a VRM chip if you glued the side of it, etc.


Thanks for the explanation.

OK, let me put it like this: if I had nail polish and that RTV at home, I would use RTV. But for me nail polish is good enough; it's not worth running to the next store to buy RTV.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *dunbheagan*
> 
> thanks for the explanation
> 
> ok, let me put it like this: if i had nail polish and that RTV at home, i would use rtv. But for me nail polish is good enough, not to run to the next store to buy rtv.


You can also use gasket-maker black RTV from an auto store. It's what is used to hold lid heat spreaders onto CPUs.

Look at the images here; someone actually has images of them applying and also peeling off an RTV equivalent.


----------



## madmeatballs

Quote:


> Originally Posted by *SlimJ87D*
> 
> You can also use gasket seal maker black RTV from a auto store. It's what is used to hold lid heat spreaders onto CPU
> 
> Look at the images here, someone actually has images of them applying and also pealing off a RTV equal.


What about Kapton (polyimide) tape? Some people who delid use that instead of nail polish to protect from possible LM spillage.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *madmeatballs*
> 
> What about kapton (polyimide) tape. Some people who delid put that instead of nail polish to protect from possible LM spillage.


Should be fine, but how good a sealant it can be is a concern. I like RTV because it flows, and it isn't hard to clean.


----------



## Baasha

Flashing the BIOS should not be an issue but it's always something to be cautious about if you don't have a 2nd card to revert the card to its original state.

Also, I would be hesitant to flash the BIOS of the 1080 Ti using that of another GPU that has a totally different PCB (power delivery etc.). The FE/factory 1080 Ti may not handle the power as well and could cause issues IMO.


----------



## fisher6

Quote:


> Originally Posted by *Baasha*
> 
> Flashing the BIOS should not be an issue but it's always something to be cautious about if you don't have a 2nd card to revert the card to its original state.
> 
> Also, I would be hesitant to flash the BIOS of the 1080 Ti using that of another GPU that has a totally different PCB (power delivery etc.). The FE/factory 1080 Ti may not handle the power as well and could cause issues IMO.


I think some people looked at the power draw here and found it to be safe, but I'm not sure.


----------



## lilchronic

Quote:


> Originally Posted by *fisher6*
> 
> I think some people looked at the power draw here and found it to be safe but not sure.


And if water cooled it should not be an issue.


----------



## Addsome

Quote:


> Originally Posted by *lilchronic*
> 
> And if water cooled it should not be an issue.


Would it be an issue if only the GPU die is watercooled and the VRM and VRAM are still cooled by the reference fan? I'm basically talking about the EVGA 1080 Hybrid half-shroud method that GamersNexus did.


----------



## KedarWolf

Quote:


> Originally Posted by *Baasha*
> 
> Flashing the BIOS should not be an issue but it's always something to be cautious about if you don't have a 2nd card to revert the card to its original state.
> 
> Also, I would be hesitant to flash the BIOS of the 1080 Ti using that of another GPU that has a totally different PCB (power delivery etc.). The FE/factory 1080 Ti may not handle the power as well and could cause issues IMO.


http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe/0_20

Includes power draw logs with Strix BIOS on my FE lower in the OP.


----------



## fisher6

Can the onboard graphics be used to flash back the original BIOS if things go south?


----------



## Slackaveli

This forum ruins me, lol. I can't be in here with a single Nvidia card on air only doing +120 on the core, lol. I just ordered the Aorus 1080 Ti (thx stock tracker); it's been in and out of stock all day on Newegg. $739 total w/ 2-day shipping.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> this forum ruins me lol. i can't be in here with a nvidia single card on air and only dogging to a +120 core, lol. I just ordered the Aurus 1080ti (thx stock tracker) , it's been in/out of stock all day on newegg. $739 w/ 2 day shipping total.


Lol, what?! What about your FE?


----------



## Slackaveli

back to Nvidia for $770


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> back to Nvidia for $770


Hmmm. Alright then!


----------



## Slackaveli

Not a terrible plan, eh? I guess it's possible they could ho me on the dates, as I pre-ordered on March 2, 32 days ago... hmm. Better call now.

If they do, eBay is about to get a less-than-cost quick mover. If one of y'all wants an FE for $700 shipped, I may have one for you.


----------



## Juub

Quote:


> Originally Posted by *leequan2009*
> 
> What is the Max oc vs max oc?
> 980ti max oc vs 1080ti max oc.
> I upgaded 1080ti from 980ti xtreme. I saw it stronger about 50% than 980ti


Quote:


> Originally Posted by *gkolarov*
> 
> I upgraded from 980ti windforce OC @1500 core and the performance gain is around 50%. Of course it depends on the particular game, but let say that [email protected] is faster 30 to 60%.


Quote:


> Originally Posted by *shalafi*
> 
> "me too" post - my 1080Ti @ 2000MHz+ is 50% faster than my previous EVGA 980Ti Hybrid @ 1500MHz+


Quote:


> Originally Posted by *Mato87*
> 
> I did, don't know about the exact numbers in terms of OC, but the card is 50% faster in almost all the games I have tested so far.


Quote:


> Originally Posted by *SlimJ87D*
> 
> The 1080 is supposed to be about 30% faster than the 980 Ti, and the 1080 Ti is approximately 1.25 times a 1080.
> 
> So it's about 63% faster than a 980 Ti.


Thanks guys, so I'm looking at an improvement in the neighborhood of 50%. That's pretty good and would get awfully close to 60fps in most games at near-max settings (Shadows on Ultra be damned, lol).
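For what it's worth, the compounding in SlimJ87D's estimate works out like this (successive speedups multiply rather than add):

```python
# "About 30% greater" and "approximately 1.25 times" compound by
# multiplication, not addition.
gtx1080_over_980ti = 1.30
ti_over_1080 = 1.25
combined = gtx1080_over_980ti * ti_over_1080

print(f"{(combined - 1) * 100:.1f}%")   # 62.5%
```

Adding the percentages (30% + 25% = 55%) undershoots; multiplying gives the ~63% he quoted.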


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> If one of ya'll wants a FE for $700 shipped, I may have one for you.


I'll give you $650 for it


----------



## Slackaveli

Hopefully I won't be that desperate. They did give me trouble; I'm on a ticket looking for a waiver now :/ How are they gonna call it 30 days when the street date for release was March 9th??

It'd be a perfect match for your card, great for SLI.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> hopefully i wont be that desperate. they did give me trouble, im on a ticket looking for a waiver now :/ how they gonna call it 30 days when the street date for release was march 9th??
> 
> It'd be a perfect match for your card, great for sli


Lmk how it goes.


----------



## p2im0

Just an FYI, I just got a notification that the 1080 Ti hybrid kits by EVGA are available. $159 though?

http://www.evga.com/products/Product.aspx?pn=400-HY-5388-B1


----------



## KingEngineRevUp

This might be a good data point to make us feel safer: TiN, the guy who literally modified his card in all kinds of ways, made his card run at triple the power limit, and his VRM didn't die, lol.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *p2im0*
> 
> Just an FYI I just got notification the 1080 Ti hybrid kits by EVGA are available. $159 though ?
> 
> http://www.evga.com/products/Product.aspx?pn=400-HY-5388-B1


Yet you can buy a refurbished X41 for $65 and some mounting hardware at a hardware store, and hack it to mount onto the FE for $5 more. You'll need a copper shim too, though; another $5.

$75 for a mounted X41 hybrid. $60 if you want to go H55.

BTW, I have an H55 with a Noctua NF-F12 2000 RPM non-PWM for sale for $45 + split shipping. I'll throw in a copper shim with it. It's missing the Intel bracket, though, if anyone is interested.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yes you can buy a refurbished x41 for $65 and some mounting hardware at a hardware store and hack it to mount into the FE for $5 more. You'll need a copper shim too though, another $5.
> 
> $75 for mounted x41 hybrid.


Which will probably cool better I bet.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> Which will probably cool better I bet.


Yeah, with the X41 I get 42C if the card is stock and 48C with some OC. But last night the ASUS BIOS made it go up to 51C. Had to do 100% fan speed for 48C =/.

I can't wait to get home to give the ASUS BIOS a try again.


----------



## Menta

Incoming Ti on Thursday!!! The wait has been long, and to go with the 1080 Ti, a good spanking from the wife. FTW


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Lmk how it goes.


yeah, ill let you know if nvidia store craps on me.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, with x41 I get 42C if the card is stock and 48C with some OC. But last night the Asus bios made it go up to 51C. Had to do 100% fan speed for 48C =/.
> 
> I can't wait to get home to give the Asus bios a try again.


Well don't forget that the fan curve with the ASUS bios is reduced as well. 100% is actually tolerable now.


----------



## EniGma1987

Quote:


> Originally Posted by *SlimJ87D*
> 
> Okay, from a UL standpoint, nail polish IS NOT OKAY because it's flammable, the opposite of what UL looks for: flame-resistant material.
> 
> I would have used circuit board RTV instead, that's what it's made for!
> 
> Use RTV DAMN IT!
> 
> You remember EVGA VRM blowing up? You want nail polish around that if it happened?


Maybe I am not understanding what you mean by "UL standpoint". The only thing I know of with a capital UL that talks about safety is Underwriters Laboratories. That "governing body", however, does not stand for something being flame resistant at all. It is simply a regulatory agency that you submit a product to for a certain testing parameter, and if your product passes you get a UL number associated with that product. You *can* test products for flame resistance, but simply being a UL-listed product does not at all mean something is or isn't flame resistant.

Now, myself, I much prefer a silicone because it is easier to apply, easier to remove, and allows you to be even safer. I am just confused about the whole "UL" thing you are talking about.


----------



## RavageTheEarth

Hey guys, just came in to say that I'm joining the club. If you had asked me last week if I had plans to upgrade I would have said no, but I told myself that if any of the Aorus 1080 Tis were in stock by the time I woke up I would buy one. I was 100% sure they would all be gone immediately after the midnight launch... and they were. Feel bad for the people who stayed up. I woke up in the morning a couple of minutes after they put the Aorus back in stock. I was honestly shocked, but a promise is a promise! So $730 later, here I am. Just kind of stinks that the waterblock for this isn't coming out until early next month.

I'm going to try to avoid draining my loop, so I'm going to just stick this card in the slot underneath my 980 Ti, which is currently under water. Maybe use the 980 Ti as a PhysX card? Then when I get the waterblock for the Aorus I'll drain the loop and sell the 980 Ti. Hopefully EK doesn't delay the blocks!


----------



## alucardis666

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Hey guys just came in to say that I'm joining the club. If you asked me last week if I had plans to upgrade I would say no, but I told myself that if any of the Aorus 1080 Ti's were in stock by the time I woke up I would buy one. I was 100% sure they would all be gone immediately after midnight launch... and they were. Feel bad for the people who stayed up. I woke up in the morning a couple minutes after they put the Aorus back in stock. I was honestly shocked, but a promise is a promise! So $730 later here I am. Just kind of stinks that the waterblock for this isn't coming out until early next month.
> 
> I'm going to try to avoid draining my loop so I'm going to just stick this card in the slot underneath my 980 Ti which is currently under water. Maybe use the 980 Ti as a physics card? Then when I get the waterblock for the Aorus I'll drain the loop and sell the 980 Ti. Hopefully EK doesn't delay the blocks!


At this point PhysX cards are totally useless. You're talking a 1-3 fps difference on average, and maybe a 5 fps increase at most in a heavily PhysX-based game under optimal conditions. Save yourself the electric bill and the heat.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *EniGma1987*
> 
> Maybe I am not understanding what you mean by "UL standpoint". The only thing I know of with a capital UL and talking about safety is Underwriter Laboratories. That "governing body" however does not stand for something being flame resistant at all. It is simply a regulatory agency that you submit a product to for a certain testing parameter and if your product passes you get a UL number associated for that product. You *can* test products for flame resistance, but simply being a UL listed product does not at all mean something is or isnt flame resistant.
> 
> Now myself I do much prefer a silicone because it is easier to apply, and easier to remove, and allows you to be even safer. I am just confused about the whole "UL" thing you are talking about.


I had to submit a bill of materials to them of all the connectors, wires, etc. for them to approve.

They audited our soldering process and the materials we used to solder things.

They also went over the documentation on how our boards were fabricated.

And that's how Nissan, BMW, and Tesla got some of their car chargers.

I don't work in that field anymore, BTW. If I had used nail polish as a sealant, it would not have flown with them.

Sure, I think the nail polish will do its job for what the user wants it to do. But just because something works doesn't mean it should be done that way.

All I was pointing out is that RTV is made for what he was doing. Why not just use it? It costs like $3 at some auto stores.


----------



## Juub

So if I may ask among the 1080 Ti owners what games do you guys play at what res/settings and what kinds of frame rates do you get?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Juub*
> 
> So if I may ask among the 1080 Ti owners what games do you guys play at what res/settings and what kinds of frame rates do you get?


Bud, you're better off just looking at all the reviews for that. Here's a collection of nearly all of them:

https://www.reddit.com/r/5ycn82/gtx_1080_ti_review_launchday_megathread/

*THREE THOUSAND SIX HUNDRED AND SIXTY ******* BENCHMARKS ALL PUT TOGETHER!*


----------



## Juub

Quote:


> Originally Posted by *SlimJ87D*
> 
> Bud, you're better off just looking at all the reviews for that. Here's a collection of nearly all of them.
> 
> 
> https://www.reddit.com/r/5ycn82/gtx_1080_ti_review_launchday_megathread/
> 
> *THREE THOUSAND SIX HUNDRED AND SIXTY ******* BENCHMARKS ALL PUT TOGETHER!*


I don't even trust reviews anymore. The benchmarks vary so much from one site to another it's insane. One site reports a 40fps average and another might report 60 with the same cards and same settings. I don't bother with benchmarks anymore.

But thanks. 3000+ benchmarks should be accurate.


----------



## Somasonic

Quote:


> Originally Posted by *Somasonic*
> 
> Just got my hands on a Strix last night and so far it's seems pretty beastly It's boosting into the mid-high 1900s straight out of the box which I'm pretty good with. Fans don't get too loud but sometimes pulse on and off when not gaming which could get annoying (although it looks like it might just be after gaming it does this, need to test more).
> 
> Issues I have with the card (any help or suggestions much appreciated):
> 
> - When it boosts up the boost isn't stable, it jumps around constantly by small amounts even though the temp seems stable
> - I'm getting jittering every few seconds, which could be related to the boost jumping around. It isn't unplayable by any means but it's annoying. I have vsync on and a frame cap at 58 fps on a 60Hz monitor. These settings were fine with the 980 Ti
> - The colors seem washed out compared to my 980 Ti (same settings). I've tried adjusting settings in the NvCP and on the monitor but I can't seem to get back the 'richness' of color I had before. I'm using a display port cable if that makes any difference.
> 
> I should post a screen of what the boost is doing but I'm pretty short on time at the moment sorry and am just testing the card quickly when I can. I'll get to really work it out in the weekend hopefully.
> 
> Thanks to anyone who can help


So the jittery boost seems to have gone away and it's staying much more stable. Generally it clocks to 1974, and I've seen it go as high as 1999; it hovers around 63-64 degrees. I know it's not up there with the overclocks I'm seeing here, but straight outta the box I'm pretty happy with that. Now I just need to get these colors sorted and I'll be a happy chappy.

Edit: The coil whine at 4K is pretty bad. I don't notice it so much at 1440, but I would like to downsample, so that's a bummer. Any tips on getting rid of the whine would be much appreciated.


----------



## Slackaveli

"Thank you for ordering from NVIDIA. At your request, a return has been initiated. Your request to return items ordered will expire in 29 days. You must follow the instructions below before your request expires. We cannot issue a refund before these instructions are followed."

Sweeet. Getting the full $770 back, so even after shipping I made $20 and switched to an Aorus, which is almost the one I wanted most (Aorus Xtreme); as I don't SLI, I don't mind the 3-slot width. Gonna be sexy. My penance for ho'ing the Nvidia store is I can't catch the Xtreme in stock. The regular Aorus still comes at +114 and I only hold about that anyway, so it's all bonus headroom for me. BTW, they have been in and out of stock 30 times today on Newegg.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> "Thank you for ordering from NVIDIA. At your request, a return has been initiated. Your request to return items ordered will expire in 29 days. You must follow the instructions below before your request expires. We cannot issue a refund before these instructions are followed."
> 
> sweeet. getting the full $770 back so even after shipping I made $20 and switched to a Aurus, which is almost the one I wanted most (auros xtreme) as I dont sli i dont mind the 3 slot width. Gonna be sexy. My penence for ho'ing nvidia store is i cant catch the xtreme in stock. The regular Aurus still comes @ +114 and I only hold about that anyway so it's all bonus headroom for me. Btw, they are in stock off/on 30 times today on newegg.


Nice man! congratz


----------



## feznz

FYI that's about $1100 USD, for those niggling about the price in the U.S.
I will have it in my hot little hands for the weekend


----------



## Slackaveli

Hell yeah, I feel better about floating my rent money now (lmao, I love how we have no late fees until after 15 days).
Quote:


> Originally Posted by *feznz*
> 
> FYI that's about $1100 USD for those niggling about the price in the U.S.
> I will have it my hot little hands for the weekend


Damn, son, y'all get crushed. It's all relative, I guess. Congrats, bro. You'll be the envy of your block now.


----------



## Slackaveli

My Google fingers are broke. Any good reviews/breakdowns of the Aorus 1080 Ti non-Xtreme out there?
It's in stock at this moment on Newegg.


----------



## qwaz

What do you guys think about the reports of FE, and even some aftermarket cards, power throttling even at stock clocks? I have an order in for the MSI Gaming X 1080 Ti, but posts like https://www.reddit.com/r/60xdij/the_most_important_1080ti_card_spec_nobody_is/ make me wonder if these cards were designed well and if I should still cancel my order.

I'm hoping to do ~2000-2050 on the core, but if even the first batch of aftermarket cards (with higher limits) will throttle at stock speeds, what is the use? Does this mean only the Aorus/Zotac cards are going to be worth it (because of their even higher TDP limits)?


----------



## alucardis666

Quote:


> Originally Posted by *qwaz*
> 
> What do you guys think about the reports of FE, and even some aftermarket cards, power throttling on even stock overclocks? I have an order in for the MSI Gaming X 1080 ti but posts like
> 
> https://www.reddit.com/r/60xdij/the_most_important_1080ti_card_spec_nobody_is/ make me wonder if these cards were designed well and if I should still cancel my order.
> 
> I'm hoping to do 2000 - 2050~ ish on the core but if the even the first batch of aftermarket cards (with higher limits) will throttle on stock speeds, what is the use? Does this mean only the Aorus / Zotac cards are probably going to be worth it? (because of even higher TDP limits).


Honestly, these cards are more than enough for 1440p max settings, ultrawide or 16:9, at 60fps maxed out, so unless you're doing 4K or high refresh rates over 100fps there's not much point to OC or SLI, imo. Any additional gains through those routes are bragging rights for the 90% of people buying these cards.

So ask yourself if you're a part of that 90%, or if you're in the 10% who is after high refresh rates or 4K 60fps maxed out. If you're OK waiting, I'd stick it out for the Corsair Hydro cards, or the MSI Sea Hawk or anything with a hybrid cooler, really. Those should be the best-clocking cards with the least amount of throttling.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Honestly these cards are more than enough for 1440p max settings ultra wide or 16:9 @ 60fps maxed out, so unless you're doing 4k or high refresh rates over 100fps then there's not much point to OC or SLi, imo. Any additional gains through those routes is bragging rights for the 90% of people buying these cards.
> 
> So ask yourself if you're apart of that 90% or if you're in the 10% who is after high refresh rates or 4k 60fps maxed out. If you're ok waiting I'd stick it out for the corsair hydro cards, or the MSI SeaHawx or anything with a hybrid cooler really. Those should be the best clocking cards with the least amount of throttling


Agreed. Unless one intends to grab the 4K/144Hz/HDR monitors coming Q2 (any day now), or is running three ultrawide QHD monitors like the one guy here (10,400x1440, lol), one of these, any brand, will do just fine for 99% of people. Even in 4K. In fact, I find 4K/60 is about the same load as 1440/120; not 144, which is even harder to produce and harder on the CPU, too.
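That 4K/60 vs 1440p/120 comparison checks out on raw pixel throughput, which is a crude proxy for GPU load (it ignores the per-frame CPU cost, which is exactly why high refresh rates lean harder on the CPU):

```python
def pixel_rate(width, height, fps):
    """Pixels the GPU must render per second at a given resolution/refresh."""
    return width * height * fps

uhd_60  = pixel_rate(3840, 2160, 60)    # 497,664,000 px/s
qhd_120 = pixel_rate(2560, 1440, 120)   # 442,368,000 px/s
qhd_144 = pixel_rate(2560, 1440, 144)   # 530,841,600 px/s

# 1440p/120 pushes ~11% fewer pixels than 4K/60; 1440p/144 pushes ~7% more.
print(uhd_60, qhd_120, qhd_144)
```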


----------



## joder

Quote:


> Originally Posted by *Slackaveli*
> 
> hell yeah, i feel better about floating my rent money now (lmao, i love how we have no late fees until after 15 days )
> damn, son, ya'll get crushed. It's all relative, I guess. Congrats, bro. You'll be the envy of your block now.


He doesn't have a block... No one else could afford to live there.


----------



## joder

Quote:


> Originally Posted by *Slackaveli*
> 
> my google fingers are broke. any good reviews/breakdowns of the Auros 1080ti non extreme out there?
> It's in stock at this moment on newegg.


http://bfy.tw/B2kA


----------



## Slackaveli

Quote:


> Originally Posted by *joder*
> 
> He doesn't have a block... No one else could afford to live there.


I thought of that, too. In my head he is in a castle on a hill overlooking throngs of peasants.







Quote:


> Originally Posted by *joder*
> 
> http://bfy.tw/B2k7


thanks


----------



## mshagg

Quote:


> Originally Posted by *qwaz*
> 
> What do you guys think about the reports of FE, and even some aftermarket cards, power throttling on even stock overclocks? I have an order in for the MSI Gaming X 1080 ti but posts like
> 
> __
> https://www.reddit.com/r/60xdij/the_most_important_1080ti_card_spec_nobody_is/
> make me wonder if these cards were designed well and if should still cancel my order.
> 
> I'm hoping to do 2000 - 2050~ ish on the core but if the even the first batch of aftermarket cards (with higher limits) will throttle on stock speeds, what is the use? Does this mean only the Aorus / Zotac cards are probably going to be worth it? (because of even higher TDP limits).


Honestly, I think it's a testament to how well Boost 3.0 works. It's a 300/330W card that you're buying, and it pushes itself to that limit basically without us doing much. In that regard I'd say the design is quite amazing. As overclockers it feels like a pain, because we see single-digit % OCs. But the card is still a monster.
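To illustrate the behavior being described (a hypothetical toy model, not NVIDIA's actual algorithm; the constant `k` and the 13 MHz bin size are made up for illustration): the boost controller effectively steps the clock down in small bins until its power estimate fits under the board limit, which is why a card "throttling" at the cap is really the design working as intended.

```python
def settle_clock(target_mhz, voltage, power_limit_w, k=0.14, step=13):
    """Step the clock down in 13 MHz bins until estimated power
    (~ k * clock * voltage^2, a made-up model) fits under the limit."""
    clock = target_mhz
    while k * clock * voltage ** 2 > power_limit_w and clock > 0:
        clock -= step
    return clock

# A 2050 MHz target at 1.075 V against a 300 W cap settles lower...
print(settle_clock(2050, 1.075, 300))   # 1842
# ...while a raised 400 W cap lets it hold the full target.
print(settle_clock(2050, 1.075, 400))   # 2050
```

Same mechanism in reverse explains why BIOS flashes and shunt mods "unlock" clocks: they only move the limit the loop is checking against.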


----------



## qwaz

At some point (hopefully soon) I do want to go 3440x1440 @ 100Hz, so I do feel the more you can get out of these cards (stable and 24/7) the better. I don't think 2000/2050 should be that much of a stretch, but the power limits have to be able to handle it.

That being said, I have also been looking at my first EK custom water-cooling loop, so I would definitely want to keep my options open for that. The heat is not REALLY the issue even with fans, it seems; isn't it just the max TDP limits that are insufficient?


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> agreed. Unless one intends to grab the 4k/144Hz/HDR monitors coming Q2 (any day now) or is running three ultrawide QHD monitors like the one guy here (10,400x1440 lol), then one of these, any brand, will do just fine for 99% of people. Even in 4k. In fact, I find 4k/60 is about the same as 1440/120, not 144; that is even harder to produce and harder on the cpu, too.


I'm kinda bummed about those monitors, no 21:9s coming with that specsheet anytime soon







I need a display over 36" to feel comfortable. Ideally something 42-50". I watch lots of videos, edit and render and play games and my vision isn't 20/20 even with glasses.

If these were Gsync and HDR I'd buy that 48" in a heartbeat

http://www.guru3d.com/news-story/samsung-working-on-44-and-48-inch-monitors-with-299-and-329-aspect-ratios.html


----------



## Somasonic

Quote:


> Originally Posted by *feznz*
> 
> FYI that's about $1100 USD for those niggling about the price in the U.S.
> I will have it my hot little hands for the weekend


Nice. I paid $1605 for mine the day they went up on Ascent. They raised the price by $60 the next day so I was pretty happy I was able to get in before that


----------



## Slackaveli

Quote:


> Originally Posted by *joder*
> 
> http://bfy.tw/B2k7


Quote:


> Originally Posted by *alucardis666*
> 
> I'm kinda bummed about those monitors, no 21:9s coming with that specsheet anytime soon
> 
> 
> 
> 
> 
> 
> 
> I need a display over 36" to feel comfortable. Ideally something 42-50". I watch lots of videos, edit and render and play games and my vision isn't 20/20 even with glasses.
> 
> If these were Gsync and HDR I'd buy that 48" in a heartbeat
> 
> http://www.guru3d.com/news-story/samsung-working-on-44-and-48-inch-monitors-with-299-and-329-aspect-ratios.html


damn, those are hot.


----------



## mtbiker033

quick and dirty +150 core, +500 mem firestrike score (MSI AB voltage & power limit maxed):

http://www.3dmark.com/fs/12216291

cpu at 4598mhz beat my SLI 780's by a fair margin, all the while being cooler and much quieter

the single 1080ti is truly perfect for 1440p 144hz gaming


----------



## Slackaveli

agreed. very nice cards.

http://www.3dmark.com/fs/12160930


----------



## mtbiker033

Quote:


> Originally Posted by *Slackaveli*
> 
> agreed. very nice cards.
> 
> http://www.3dmark.com/fs/12160930


very nice score!! I just compared our scores, that 5775c really performs well!!!







even with a 400mhz advantage my 4820k can't hang


----------



## Slackaveli

I tried to tell y'all









you certainly still have enough perf in that johnny though.


----------



## D1RTYD1Z619

What do you guys think, EVGA GTX 1080 Ti RGB or no RGB? The only thing is I really want to get the Black edition, but it's got a clashy white LED on the logo. I preordered the Black already, but I'm debating canceling and getting the RGB one.


----------



## Mato87

So I just played all the Crysis games and I am finding the card performs almost flawlessly in Crysis 2 and especially 3, but the performance is kind of wonky in Crysis 1 and especially Warhead, where on ultra at 1080p I can't get a stable framerate above 100; it always dips below 80 and sometimes even below 60, which is just unacceptable. In Crysis 1 there are some framerate drops, but not as many as in Warhead. It's odd to say the least, because I remember playing Warhead on my older rigs with two 8800 GTS cards and then the GTX 295 and I never had these kinds of issues; it always performed much better than the original Crysis. Granted, I played on high settings only, because on ultra/enthusiast it ran very poorly, but still, those cards have maybe 1/50 the performance of the GTX 1080 Ti I currently have...


----------



## hotrod717

Not sure if anyone has posted or noticed this, but I just upgraded to 378.92 from 378.78 and I'm noticing what I feel are artificially high clocks. Meaning, the card is clocking higher, but the performance is definitely not there for the clock. It is definitely not a throttling issue.
On 378.78 I was getting 2088 core. Any higher and I'd crash in benches. On 378.92 I get 2120 core but not the same performance by a long shot; 2088 was way better than 2120. Anyone notice this, or are we caught up in clocks more than actual performance? I have some more testing to do, but after several bench runs (I've done a lot of testing on this GPU), that's what it looks like.
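One quick way to sanity-check this is to divide each run's benchmark score by the core clock it held and compare points-per-MHz across drivers. A minimal sketch; the run labels and scores below are made-up placeholders, not measured numbers:

```python
# Compare two (core MHz, graphics score) runs to see whether the higher
# clock actually delivered. All numbers here are illustrative only.
runs = {
    "378.78 @ 2088": (2088, 28900),
    "378.92 @ 2120": (2120, 28500),
}

def score_per_mhz(clock_mhz, score):
    """Benchmark points delivered per MHz of sustained core clock."""
    return score / clock_mhz

for label, (clock, score) in runs.items():
    print(f"{label}: {score_per_mhz(clock, score):.2f} pts/MHz")
```

If the higher-clocking driver shows fewer points per MHz, the extra clock is cosmetic.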


----------



## mtbiker033

Quote:


> Originally Posted by *Mato87*
> 
> So I just played all the Crysis games and I am finding the card performs almost flawlessly in Crysis 2 and especially 3, but the performance is kind of wonky in Crysis 1 and especially Warhead, where on ultra at 1080p I can't get a stable framerate above 100; it always dips below 80 and sometimes even below 60, which is just unacceptable. In Crysis 1 there are some framerate drops, but not as many as in Warhead. It's odd to say the least, because I remember playing Warhead on my older rigs with two 8800 GTS cards and then the GTX 295 and I never had these kinds of issues; it always performed much better than the original Crysis. Granted, I played on high settings only, because on ultra/enthusiast it ran very poorly, but still, those cards have maybe 1/50 the performance of the GTX 1080 Ti I currently have...


crysis 1 with blackfires ultimate mod............







it's the best experience ever, don't worry about the fps just enjoy it!!

the first crysis was the last and arguably the best PC game ever made


----------



## alucardis666

Quote:


> Originally Posted by *mtbiker033*
> 
> crysis 1 with blackfires ultimate mod............
> 
> 
> 
> 
> 
> 
> 
> it's the best experience ever, don't worry about the fps just enjoy it!!
> 
> the first crysis was the last and arguably the best PC game ever made







Don't think it looks that impressive personally.









Battlefield 1 looks better imo.


----------



## mtbiker033

Quote:


> Originally Posted by *alucardis666*
> 
> 
> 
> 
> 
> 
> Don't think it looks that impressive personally.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Battlefield 1 looks better imo.


frostbite is a great looking engine no doubt but considering when this came out and the overall experience (freedom to roam etc) it looks pretty good to me!


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> 
> 
> 
> 
> 
> Don't think it looks that impressive personally.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Battlefield 1 looks better imo.


Quote:


> Originally Posted by *mtbiker033*
> 
> crysis 1 with blackfires ultimate mod............
> 
> 
> 
> 
> 
> 
> 
> it's the best experience ever, don't worry about the fps just enjoy it!!
> 
> the first crysis was the last and arguably the best PC game ever made


I don't know, did you watch the video in 4k? Looks pretty great. Not better texture-wise than BF1, but it was just LUSH. Very nice with the mod.


----------



## KedarWolf

Quote:


> Originally Posted by *fisher6*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Big improvement with Strix BIOS, about 4% better bench with exact same clocks on each BIOS, and a fully stable no-downclocking 2100 core at 1.075v.
> 
> Did a guide with info and results here.
> 
> http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe/0_20#post_25983584
> 
> 
> 
> Does it hold 2100Mhz while gaming too?

Had to up the voltage to 1.093 for 2100 due to higher ambient temps.

Diablo 3: Reaper Of Souls, never goes below 2100 core, 1.093v.

GPU-ZSensorLog.txt 38k .txt file


Rise of The Tomb Raider DirectX 12. Only dipped between cut scenes.

GPU-ZSensorLog.txt 131k .txt file


Downloading Black Ops 3 and Ghost Recon: Wildlands now.
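For anyone wanting to check their own logs the same way, here's a rough sketch that scans a GPU-Z-style sensor log for the lowest core clock recorded. The column label is an assumption; GPU-Z versions vary, so adjust the match string to whatever your log's header actually says:

```python
import csv
import io

# Hypothetical sample in GPU-Z's sensor-log style (comma-separated, one
# header row; the exact column label may differ between GPU-Z versions).
SAMPLE_LOG = """Date , GPU Core Clock [MHz] , GPU Temperature [C]
2017-03-25 20:01:00 , 2100.5 , 44.0
2017-03-25 20:01:01 , 2113.0 , 45.0
2017-03-25 20:01:02 , 2100.5 , 46.0
"""

def min_core_clock(log_text):
    """Return the lowest core clock recorded in a GPU-Z-style sensor log."""
    reader = csv.reader(io.StringIO(log_text))
    header = [h.strip() for h in next(reader)]
    # Find whichever column mentions the core clock.
    col = next(i for i, h in enumerate(header) if "Core Clock" in h)
    return min(float(row[col]) for row in reader if row)

print(min_core_clock(SAMPLE_LOG))  # 2100.5
```

Point it at the attached .txt files and a minimum of 2100 confirms the "never goes below 2100" claim at a glance.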


----------



## Slackaveli

it worked out great for you, Kedar. Very nice. I'll dump the Aorus bios in a couple days. It may have a higher tdp/powermax as giga tends to have it higher than asus.


----------



## Pandora's Box

Got my MSI Gaming X GTX 1080 TI installed this evening. This thing is a beast, so ****ing huge! Barely fits in my case, lol. It's also silent, so beautifully silent







Will be selling my PNY GTX 1080 TI FE card to a friend this weekend. Aftermarket cards are so worth it over the founders edition.











http://www.3dmark.com/fs/12218591


----------



## Juub

Quote:


> Originally Posted by *Mato87*
> 
> So I just played all the Crysis games and I am finding the card performs almost flawlessly in Crysis 2 and especially 3, but the performance is kind of wonky in Crysis 1 and especially Warhead, where on ultra at 1080p I can't get a stable framerate above 100; it always dips below 80 and sometimes even below 60, which is just unacceptable. In Crysis 1 there are some framerate drops, but not as many as in Warhead. It's odd to say the least, because I remember playing Warhead on my older rigs with two 8800 GTS cards and then the GTX 295 and I never had these kinds of issues; it always performed much better than the original Crysis. Granted, I played on high settings only, because on ultra/enthusiast it ran very poorly, but still, those cards have maybe 1/50 the performance of the GTX 1080 Ti I currently have...


Isn't Crysis heavily single-threaded, and doesn't it perform like crap on quad-core and higher CPUs? I remember dual-cores with high clock speeds tended to run Crysis a lot better.


----------



## Slackaveli

very nice, man! That is about as much beef as you could cram on that burger. Sexy card. I was buying it or the Aorus today, hell or high water, and both would be in/out of stock for about 2 minutes per hour lol. Grabbed the one I could but I would have been stoked with either. Doesn't look like you could cram the extra-beefy Aorus in that case, though, so great build, man.
Quote:


> Originally Posted by *Pandora's Box*
> 
> Got my MSI Gaming GTX 1080 TI installed this evening. This thing is a beast, so ****ing huge! Barely fits in my case, lol. It's also silent, so beautifully silent
> 
> 
> 
> 
> 
> 
> 
> Will be selling my PNY GTX 1080 TI FE card to a friend this weekend. Aftermarket cards are so worth it over the founders edition.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/12218591


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Had to up the voltage to 1.093 for 2100 due to higher ambient temps.
> 
> Diablo 3: Reaper Of Souls, never goes below 2100 core, 1.093v.
> 
> GPU-ZSensorLog.txt 38k .txt file
> 
> 
> Rise of The Tomb Raider DirectX 12. Only dipped between cut scenes.
> 
> GPU-ZSensorLog.txt 131k .txt file
> 
> 
> Downloading Black Ops 3 and Ghost Recon: Wildlands now.


Sadly I am not getting better results. My temperatures also float at 51C. I believe I am being thermally throttled. The extra power seems to outdo my x41 AIO. I'm going to try an open-case test.


----------



## RavageTheEarth

Quote:


> Originally Posted by *alucardis666*
> 
> At this point physics cards are totally useless. You're talking 1-3 fps difference on average and maybe max a 5fps increase on a heavily physics based game in optimal conditions. Save yourself the electric bill and the heat.


Haha so can I have the 980 Ti installed, but not at all active? Never done anything like that before. Just so I can have the 1080 Ti the primary card until the waterblocks are released and I can get it underwater. Then I can sell the 980 Ti and the block and recoup some of this money I blew!!!


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> Sadly I am not getting better results. My temperatures also float at 51C. I believe I am being thermally throttled. The extra power seems to outdo my x41 AIO. I'm going to try an open-case test.


Like you, my card is running about 4-7C hotter on average while gaming and benching, and although I didn't get my clocks any higher, my boost clocks are more consistent and don't bounce around as much. Sucks that my card hates to be OC'd beyond 2000MHz; 2050 is a stretch and 2100 is really unstable. Dunno if I'll flash the stock BIOS back; hoping to see more BIOSes get posted to try out.
Quote:


> Originally Posted by *RavageTheEarth*
> 
> Haha so can I have the 980 Ti installed, but not at all active? Never done anything like that before. Just so I can have the 1080 Ti the primary card until the waterblocks are released and I can get it underwater. Then I can sell the 980 Ti and the block and recoup some of this money I blew!!!


I'd sell it, man. It won't really sit there inactive, it just doesn't make enough sense and is kind of impractical. If you were dead-set on keeping a dedicated card for PhysX I'd go with something like a 960 or 1050 Ti.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> Like you, my card is running about 4-7c hotter on average while gaming and benching, and although I didn't get my clocks any higher my boost clocks are more consistent and don't bounce around as much. Sucks that my card hates to be OC'd beyond 2000mhz, 2050, is a stretch and 2100 is really unstable. Dunno if I'll flash the stock bios back, hoping to see more bioses get posted to try out.
> I'd sell it man, it won't be inactive, just doesn't really make enough sense and is kind of impractical, if you were deadset on keeping a dedicated card for physics I'd go with something like a 960 or 1050ti.


Yeah, I'll need to do more testing but this is looking like a better method for people with 280mm radiators on their cards.

Actual gaming, where the power draw isn't pegged at the Asus BIOS's 100%-108% limit, might be better.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, I'll need to do more testing but this is looking like a better method for people with 280mm radiators on their cards.
> 
> Actual gaming, where the power draw isn't pegged at the Asus BIOS's 100%-108% limit, might be better.


For me it was worth it to lower my TDP, I'd regularly see it go up to ~138% with the stock FE bios, with the Asus one the most I've seen so far has been 111%


----------



## KedarWolf

Quote:


> Originally Posted by *Slackaveli*
> 
> it worked out great for you, Kedar. Very nice. I'll dump the Aorus bios in a couple days. It may have a higher tdp/powermax as giga tends to have it higher than asus.


Black Ops 3, 2100 sustained.

GPU-ZSensorLog.txt 14k .txt file


Ghost Recon: Wildlands, same.

GPU-ZSensorLog.txt 114k .txt file


I'm really liking this Strix BIOS, see no need to try others.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Had to up the voltage to 1.093 for 2100 due to higher ambient temps.
> 
> Diablo 3: Reaper Of Souls, never goes below 2100 core, 1.093v.
> 
> GPU-ZSensorLog.txt 38k .txt file
> 
> 
> Rise of The Tomb Raider DirectX 12. Only dipped between cut scenes.
> 
> GPU-ZSensorLog.txt 131k .txt file
> 
> 
> Downloading Black Ops 3 and Ghost Recon: Wildlands now.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sadly I am not getting better results. My temperatures also float at 51C. I believe I am being thermally throttled. The extra power seems to outdo my x41 AIO. I'm going to try an open-case test.

For some people the Strix BIOS is not working well; they get power limited, it seems. I suggested they try the MSI Gaming X BIOS instead.


----------



## alucardis666

Quote:


> Originally Posted by *KedarWolf*
> 
> For some people the Strix BIOS is not working well; they get power limited, it seems. I suggested they try the MSI Gaming X BIOS instead.


Have you tried it?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> For some people the Strix BIOS is not working well; they get power limited, it seems. I suggested they try the MSI Gaming X BIOS instead.


I didn't know that was out. I'll give that a try. Link?


----------



## alucardis666

Quote:


> Originally Posted by *VanuSovereignty*
> 
> LOVE this card.


Thanks for the very thorough write up







+Rep given
Quote:


> Originally Posted by *SlimJ87D*
> 
> I didn't know that was out. I'll give that a try. Link?


https://www.techpowerup.com/vgabios/190927/msi-gtx1080ti-11264-170320

About to try it myself.


----------



## Jbravo33

any videos on this bios yet? this my first time building and overclocking nevermind flashing bios..... noob


----------



## Tikerz

New to overclocking GPUs here. So is the goal to achieve the highest sustained overclock with zero PerfCaps? I flashed the Asus BIOS and my TDP levels decreased as expected. I can run sustained at 2050MHz at 1.31V and 1.43V with no PerfCaps through about half of the Heaven benchmark before it crashes. When I try to run at a higher voltage I get PerfCaps right off the bat. So I'm a little confused. I'm on water with the EVGA hybrid mod. Thanks.


----------



## alucardis666

Quote:


> Originally Posted by *Tikerz*
> 
> So is the goal here to achieve the highest sustained overclock with zero PerfCaps?


Yes, that's the goal.








Quote:


> Originally Posted by *Jbravo33*
> 
> any videos on this bios yet? this my first time building and overclocking nevermind flashing bios..... noob


Very impressive setup for a n00b


----------



## Tikerz

So if I get zero PerfCaps at lower voltages but PerfCaps at higher voltages then what does that mean?


----------



## theunknownkid

Hi All,
I have an Asus PG348Q (3440x1440, 100Hz) and a single 980 Ti Lightning OC'd to 1500/8000. The single 980 Ti is struggling to run games on max, dropping to 20fps.
Thinking of pulling the trigger on the ASUS ROG Strix GeForce GTX 1080 Ti OC. Will I get much performance gain?
Is anyone else running UWQHD on a single 1080 Ti?


----------



## blurp

Quote:


> Originally Posted by *Tikerz*
> 
> So if I get zero PerfCaps at lower voltages but PerfCaps at higher voltages then what does that mean?


Don't focus too much on PerfCap. It's too generic. Zero PerfCap means no cap is actually active. PerfCap = 1 means a cap somewhere, and that somewhere is more important than the PerfCap flag itself. It could be temp (not a problem on water), voltage (maybe manageable by increasing the voltage a bit), but most likely power. Identify those caps and try to attenuate them. That's the goal! So far I'm at 2025 GPU and +400 mem, super stable at 41-43C on water. Sometime I'll test a bit further!
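If you log the PerfCap reason over a session, a quick tally tells you which limiter dominates and is worth attacking first. A sketch with made-up sample labels ("Pwr", "VRel", "Thrm" are illustrative stand-ins for whatever your logging tool reports):

```python
from collections import Counter

# Hypothetical per-sample PerfCap readings from a logging session;
# "Idle" stands for "no cap active" and the other labels are illustrative.
samples = ["Idle", "Pwr", "Pwr", "VRel", "Pwr", "Idle", "Thrm"]

def perfcap_breakdown(readings):
    """Return the fraction of samples each limiter was active."""
    counts = Counter(readings)
    total = len(readings)
    return {reason: n / total for reason, n in counts.items()}

breakdown = perfcap_breakdown(samples)
# The dominant non-idle reason is the limiter to attenuate first.
worst = max((r for r in breakdown if r != "Idle"), key=breakdown.get)
print(worst)  # Pwr
```

A power-dominated breakdown points at the TDP limit (BIOS flash or slider), while a voltage-dominated one points at the V/F curve.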


----------



## Tikerz

Quote:


> Originally Posted by *blurp*
> 
> Don't focus too much on PerfCap. It's too generic. Zero PerfCap means no cap is actually active. PerfCap = 1 means a cap somewhere, and that somewhere is more important than the PerfCap flag itself. It could be temp (not a problem on water), voltage (maybe manageable by increasing the voltage a bit), but most likely power. Identify those caps and try to attenuate them. That's the goal! So far I'm at 2025 GPU and +400 mem, super stable at 41-43C on water. Sometime I'll test a bit further!


So as long as the voltage and core speed is stable throughout the benchmark then I should be ok?


----------



## alucardis666

So I flashed the MSI bios, idle clock is now 202.









Time to start testing again.


----------



## blurp

Quote:


> Originally Posted by *Tikerz*
> 
> So as long as the voltage and core speed is stable throughout the benchmark then I should be ok?


The main objective is: no crash! Then having a stable core speed most of the time is a good thing. Most likely during synthetic benchmarks, and sometimes in games, the core will throttle a bit because of a power or voltage limit even on water. You can try to optimize it or live with it.


----------



## KedarWolf

Quote:


> Originally Posted by *alucardis666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> For some people the Strix BIOS is not working well; they get power limited, it seems. I suggested they try the MSI Gaming X BIOS instead.
> 
> 
> 
> Have you tried it?

Just the Strix, not the Gaming X.


----------



## Tikerz

Quote:


> Originally Posted by *SlimJ87D*
> 
> Sadly I am not getting better results. My temperatures also float at 51C. I believe I am being thermally throttled. The extra power seems to outdo my x41 AIO. I'm going to try an open-case test.


Same here on the Asus BIOS.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> For some people the Strix BIOS is not working well; they get power limited, it seems. I suggested they try the MSI Gaming X BIOS instead.
> 
> 
> 
> I didn't know that was out. I'll give that a try. Link?

MSI BIOS there, same method as flashing Strix BIOS.

https://www.techpowerup.com/vgabios/?architecture=&manufacturer=&model=GTX+1080+Ti&interface=&memType=&memSize=&since=

http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe/0_20#post_25985321


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> For some people the Strix BIOS is not working well; they get power limited, it seems. I suggested they try the MSI Gaming X BIOS instead.


I didn't know that was out. I'll give that a try. Link?

EDIT: Nvm, I saw the link.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> For some people the Strix BIOS is not working well; they get power limited, it seems. I suggested they try the MSI Gaming X BIOS instead.
> 
> 
> 
> I didn't know that was out. I'll give that a try. Link?
> 
> Nvm, I saw the link.

Are you peeps trying the Strix BIOS on air? It has a higher power draw and it'll throttle. On my old BIOS I'd notice throttling started above 50C. I'm on water you see.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Are you peeps trying the Strix BIOS on air? It has a higher power draw and it'll throttle. On my old BIOS I'd notice throttling started above 50C. I'm on water you see.


I have a Kraken x41 on the GPU, which keeps it at 48C max on stock bios. I'm going to try the MSI one, but I get a feeling it will be the same thing.


----------



## alucardis666

Quote:


> Originally Posted by *KedarWolf*
> 
> Are you peeps trying the Strix BIOS on air? It has a higher power draw and it'll throttle. On my old BIOS I'd notice throttling started above 50C. I'm on water you see.


MSI Bios flashed and I'm on the EVGA hybrid mod.


----------



## Tikerz

Does your max power limit only go up to 117%? I just flashed the MSI BIOS and it only goes up to 117 for me.


----------



## alucardis666

Quote:


> Originally Posted by *Tikerz*
> 
> Does your max power limit only go up to 117%? I just flashed the MSI BIOS and it only goes up to 117 for me.


Same.


----------



## theunknownkid

Quote:


> Originally Posted by *VanuSovereignty*
> 
> In my post above I note a 60% improvement at 2560x1440 and my 980 Ti was at the same frequency as yours. For me this translated into being able to run The Witcher 3 and Rise of the Tomb Raider at 60 FPS in 3D Vision (120 FPS 2D), up from ~35-40 FPS, with the same compromises (no Hairworks of course, but TressFX is ok!). This was absolutely worth it to me. I actually put down these two titles until some future point because they just look SO good in 3D but I just didn't have the GPU to run them and 35-40 FPS hurts my eyes. For a while I was running The Witcher 3 at 2103x1183 and it was playable, but there was still stutter at the 50 FPS I would see in certain areas such as Novigrad. I even tried running the game at 1920x1080 but that was just too blurry on this monitor (PG278Q, 27"). Now, 1080 Ti is running this game at the same smoothness and fluidity at 2560x1440 as 980 Ti was running it at 1920x1080.
> 
> I was going to wait until Volta, but to be honest, this card is fantastic at this price-point. Had Nvidia priced this thing at $800 and gimped it down from Titan XP I probably would have passed. But they didn't. It's actually faster, minus 1GB of VRAM, and runs a good 8C cooler:
> 
> https://www.hardocp.com/article/2017/03/09/nvidia_geforce_gtx_1080_ti_video_card_review/10
> 
> For some, especially those on 1080p, I could see this as being a non-essential upgrade. But for others, say those of us trying to push 3D Vision at 2.5k, this card is absolutely worth the money. Oh and Titanfall 2 saw a massive FPS improvement, I used to see dips down to 60-70 FPS in Angel City, but now, even with texture quality turned up to Insane, the minimums are like 100-110 FPS here. This game is so smooth, it's amazing. Oh and I'm seeing 8GB of VRAM used in this game and Rise of the Tomb Raider, so that's another thing to consider. 6GB is still adequate 90% of the time, but there are a growing number of titles that need more than that.
> 
> Here's an idea of performance gains to be had in The Witcher 3. Basically, 1080 Ti FE out of the box is 3-5 FPS FASTER than 980 Ti SLI out of the box:


60% improvement is pretty darn good. I guess I am just attempting to justify $1300 AUD for an fps upgrade.
But I did invest in an expensive monitor; I guess I need the grunt to run it to its full potential.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> Same.


Man, this makes me miss my MSI 980 Ti, but the MSI BIOS is running much better than the Asus BIOS. It doesn't draw as much power for whatever reason. This is at 2088MHz on the core. It performs on par with the stock BIOS, but what will be nice is avoiding the power limiting.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> Man, this makes me miss my MSI 980 Ti, but the MSI Bios is running much better than the ASUS Bios. It doesn't draw as much power for whatever reason. This is at 2088 Mhz on the core. It performs on par with the Stock BIOs. But what will be nice is avoiding the power limiting.


Thanks for testing all 3 and posting your findings


----------



## Addsome

Quote:


> Originally Posted by *SlimJ87D*
> 
> Man, this makes me miss my MSI 980 Ti, but the MSI Bios is running much better than the ASUS Bios. It doesn't draw as much power for whatever reason. This is at 2088 Mhz on the core. It performs on par with the Stock BIOs. But what will be nice is avoiding the power limiting.


How can the score be the same if the stock bios is hitting power limit and downclocking while the MSI bios stays at a constant 2088Mhz?


----------



## prelude514

Quote:


> Originally Posted by *SlimJ87D*
> 
> Man, this makes me miss my MSI 980 Ti, but the MSI Bios is running much better than the ASUS Bios. It doesn't draw as much power for whatever reason. This is at 2088 Mhz on the core. It performs on par with the Stock BIOs. But what will be nice is avoiding the power limiting.


Thanks for posting. I'm going to assume Asus loosened the memory timings vs stock to give people the illusion that Asus cards OC VRAM better.

Has anyone who flashed the MSI BIOS noticed if 100% fan is gimped like with the Strix BIOS? I need 4800RPM.


----------



## c0ld

Bling bling, got freebies with my order of my 1080 Ti.


----------



## alucardis666

Quote:


> Originally Posted by *prelude514*
> 
> Has anyone who flashed the MSI BIOS noticed if 100% fan is gimped like with the Strix BIOS? I need 4800RPM.


2637 is the max RPM in GPU-Z with the fan @ 100% and it's very tolerable, so it's gimped even more than the Asus BIOS for sure.


----------



## prelude514

Quote:


> Originally Posted by *alucardis666*
> 
> 2637 is max RPM in GPUZ with fan @ 100% and it's very tolerable, so gimped more than the asus card for sure.


Dang, that sucks. Thanks for the info.


----------



## mshagg

Quote:


> Originally Posted by *Tikerz*
> 
> Does your max power limit only go up to 117%? I just flashed the MSI BIOS and it only goes up to 117 for me.


Nominal target TDP of 280W according to techpowerup, so the 17% increase gets you to 327W.


----------



## alucardis666

Quote:


> Originally Posted by *mshagg*
> 
> Nominal target TDP of 280W according to techpowerup, so the 17% increase gets you to 327W.


I'm good with that


----------



## mtbiker033

Quote:


> Originally Posted by *Slackaveli*
> 
> i dont know, did you watch the video in 4k? Looks pretty great. Not better texture wise than BF1, but, it was just LUSH. very nice with the mod.


watched it in 1440p here and thought it looked pretty good!
Quote:


> Originally Posted by *VanuSovereignty*
> 
> Yes this card is truly amazing, a word of advice though, in case you haven't read the entire thread up until this point, OC'ing the memory too high can actually reduce the performance. I initially read this in HardOCP's review where they criticized Guru3D's methodology of just setting the memory OC to +600 or whatever and when I got my card I discovered that they were indeed correct. Running The Witcher 3, with multiple Afterburner overclocking profiles on hotkey, switching between +150 core / +400 memory to +150 core / +500 memory, both PT: 120% drops my FPS a good 3-4 FPS at 2560x1440. And the same holds true for synthetic benchmarks. Someone on here had a graph and they referred to the phenomenon as error correction.
> 
> I've taken the time to investigate this further and for me +400 is my card's sweet-spot, at +387 performance dips and at +415 performance dips. +600 lol, the card's performance takes a massive dive.
> 
> Following up on a few previous comments: in terms of comparing the performance to an overclocked 980 Ti, my former card, I saw a 60% improvement in Firestrike Extreme and a 50% improvement in Firestrike Normal, and my 980 Ti was a good overclocker that did 1500MHz core / 8GHz memory with only 1.218v (load temps under 45C).
> 
> http://www.3dmark.com/compare/fs/11780853/fs/12060712
> 
> Only the former score is relevant to me as I'm at 2560x1440. This card begins to widen the gap with its Ti predecessor the higher the resolution.
> 
> As far as thermals go, I'm actually seeing very good performance with a side-panel fan and having removed the piece of metal in the PCI-E slot area between slot's 1 and 2 that obscure the exhaust of the card. I'm about 10C lower than average. I'm seeing 70C overclocked at 70% RPM (1:1 ratio), this is with PT at 120%, +150 core, +400 memory. If I set the fan to 80% the temps don't exceed 65C. If I set the fan to 100% the temps don't exceed 57C. Yes, this is loud, but I use headphones and don't notice the fan at 70% RPM. I can hear it at 80% RPM but don't use that RPM as there is no point, there is no discernible thermal throttling until 73-75C. It will hold 2.0GHz at 70C.
> 
> I point out the mod to the PCI-E bracket area in the following video, which can be accomplished with a pair of needlenose pliers and gently twisting this area obstructing the exhaust portion of the card off of the bracket area. Someone who took this advice reported it reducing their load temps by 5C alone over on PCgamer.com:
> 
> http://www.pcgamer.com/geforce-gtx-1080-ti-overclocking/#comment-3201515832
> 
> 
> 
> 
> 
> 
> (In this video I had memory at +500, hence the aggressive core clock throttling, and here also I was using the factory thermal compound, you can see it at 67-69C or so with the fan at 78% RPM, I've revisited this same area with Gelid GC Extreme on the core and with similar ambient it is now at 63-65C with the fan set to 80% RPM)
> 
> And here it's running Heaven at 70-71C with a 1:1 fan profile, overclocked, with PT at 120% before Gelid GC Extreme. I plan to do another update on another morning when I don't have to work and ambient is comparable to the two previous videos.
> 
> 
> 
> 
> 
> 
> Oh and I'm #2 in the world with an i7 4930k and a single 1080 Ti, and the person above me apparently won the silicon lottery as they have their 4930k at 4.7GHz. I'm not typically one to boast; I found this out after trying to work out a mysteriously lower-than-average Combined score in Firestrike (7700), which I resolved by setting 3DMark.exe's priority to "High Priority" via Process Hacker and also forcing 3DMark to run its 64-bit exe in the launcher options. Score went up a thousand points.
> 
> So yeah, FE runs fantastic if you address your airflow, I was expecting it to run around 80C and exhibit all kinds of thermal throttling, now I'm seeing it run at 65C with a 1:1 fan profile in some games that tend to not get the PT above 100% too much such as Titanfall 2. The Witcher 3 in 3D Vision is a different story, PT is always pegged, but even then temps still don't exceed 70C with 1:1 ratio and Gelid GC Extreme, an RPM that truly doesn't bother me with my Astro A40's on. Yeah at 100% RPM this thing is obnoxious, but at 70% it's perfectly fine IMHO.
> 
> LOVE this card.


Thank you very much for the advice!! I will do some testing with lower memory clocks!!
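For anyone skimming: the "1:1 fan profile" in the quoted post just means the fan duty in percent is set equal to the core temperature in C. A minimal sketch of the idea (the 30% floor and 100% ceiling are illustrative assumptions, not Afterburner defaults):

```python
# A "1:1" fan profile maps fan duty (%) to core temperature (C) one-to-one,
# clamped to the fan's usable range. So 70C -> 70% fan, as in the post above.
def fan_duty(temp_c, floor=30, ceiling=100):
    """Return the fan duty (%) a 1:1 curve would set at temp_c."""
    return max(floor, min(ceiling, temp_c))

for t in (25, 70, 110):
    print(f"{t}C -> {fan_duty(t)}% fan")
```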


----------



## prelude514

The lower memory OC = better performance is true. I get better results at +350 than I do at +500.


----------



## Pandora's Box

Quote:


> Originally Posted by *VanuSovereignty*
> 
> I hate to be the one to say it but I would run FE in that case vs. that AIB card. I've owned that same cooler, my previous card was a 980 Ti MSI 6G Gaming (I invariably swapped that cooler, which was wholly inadequate, i.e. 70C+ load, for an AIO) and although you have it oriented so that it is getting cool air from outside of the case it is expelling heat from the sides of the cooler with nowhere to go but circulate in that poor Silverstone Raven case.
> 
> There is really next-to-no performance to be had with that card over FE. Another 20W maybe?
> 
> Reference is better than AIB for small form factor cases as they push the heat out of the case. I'm sure you know all of this.
> 
> I have a post a ways up where I'm running games at 100% load at 2.0GHz with 67C.
> 
> Again, I would NOT go with that card over FE. There is no advantage in a case with proper cooling, and you sir, do not have a spacious, well ventilated case to accommodate said card even if there was a performance advantage.
> 
> Here's my Firestrike Extreme, I'm also seeing 67C load at 70% RPM. So yeah, do yourself a favor and keep the FE and sell / return the AIB card.
> 
> If you were running a 900D or my case it would be different. But you're not; you're running a case the size of a cereal box.
> 
> http://www.3dmark.com/fs/12060712


Actually, you're wrong. I had a Founders Edition 1080 Ti before this Gaming X. My CPU temps haven't changed at all, and my NVMe drive temp hasn't changed at all. The only thing that's changed is the GPU is now cooler, quieter, and overclocks to 2GHz with ease. There are plenty of air vents around the GPU side of the case. The CPU gets fresh air on the opposite side of the case. I have a feeling you don't know the layout of this case. The GPU and CPU/motherboard compartments are completely closed off from one another.

Edit: And there's no way you are at 2GHz at 67C with 70% fan speed on a Founders Edition 1080 Ti. I'm sure other posters here would agree with me on that statement. Even if you somehow magically are, enjoy your wind tunnel. 70% fan speed on a Founders Edition card is very loud. You even stated in a previous post that you are getting 70C at 70%. Contradicting yourself here, buddy: "I'm seeing 70C overclocked at 70% RPM (1:1 ratio), this is with PT at 120%, +150 core, +400 memory. If I set the fan to 80% the temps don't exceed 65C. If I set the fan to 100% the temps don't exceed 57C". In another post you state 65C at 65%. You are all over the place, man. You also state that you hardly ever go over 100% power target, but in some games you hit the 120% power limit. If you are hitting the 120% power limit, you are not at 2GHz core clock at all. Probably around 1750-1800.


----------



## mouacyk

Quote:


> Originally Posted by *prelude514*
> 
> The lower memory OC = better performance is true. I get better results at +350 than I do at +500.


It's called error correction penalty.
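The effect is easy to spot if you sweep the offset and record a score for each run; the numbers below are hypothetical, just illustrating the non-monotonic scaling being described:

```python
# Hypothetical sweep of memory offset (MHz) -> average benchmark FPS.
# GDDR5X error correction silently retries failed transfers, so past the
# sweet spot a higher offset costs effective bandwidth instead of adding it.
scores = {
    300: 96.2,
    350: 97.8,
    400: 98.4,  # sweet spot in this made-up data
    450: 97.1,
    500: 94.9,
    600: 90.3,
}

best = max(scores, key=scores.get)
print(f"best memory offset: +{best} MHz ({scores[best]} avg FPS)")
```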


----------



## Slackaveli

Quote:


> Originally Posted by *VanuSovereignty*
> 
> Yes this card is truly amazing, a word of advice though, in case you haven't read the entire thread up until this point, OC'ing the memory too high can actually reduce the performance. I initially read this in HardOCP's review where they criticized Guru3D's methodology of just setting the memory OC to +600 or whatever and when I got my card I discovered that they were indeed correct. Running The Witcher 3, with multiple Afterburner overclocking profiles on hotkey, switching between +150 core / +400 memory to +150 core / +500 memory, both PT: 120% drops my FPS a good 3-4 FPS at 2560x1440. And the same holds true for synthetic benchmarks. Someone on here had a graph and they referred to the phenomenon as error correction.
> 
> I've taken the time to investigate this further and for me +400 is my card's sweet-spot, at +387 performance dips and at +415 performance dips. +600 lol, the card's performance takes a massive dive.
> 
> Following up on a few previous comments: in terms of comparing the performance to my overclocked 980 Ti, my former card, I saw a 60% improvement in Firestrike Extreme and a 50% improvement in Firestrike Normal, and my 980 Ti was a good overclocker that did 1500MHz core / 8GHz memory with only 1.218v (load temps under 45C).
> 
> http://www.3dmark.com/compare/fs/11780853/fs/12060712
> 
> Only the former score is relevant to me as I'm at 2560x1440. This card begins to widen the gap with its Ti predecessor the higher the resolution.
> 
> As far as thermals go, I'm actually seeing very good performance with a side-panel fan and having removed the piece of metal in the PCI-E slot area between slots 1 and 2 that obscures the exhaust of the card. I'm about 10C lower than average. I'm seeing 70C overclocked at 70% RPM (1:1 ratio); this is with PT at 120%, +150 core, +400 memory. If I set the fan to 80% the temps don't exceed 65C. If I set the fan to 100% the temps don't exceed 57C. Yes, this is loud, but I use headphones and don't notice the fan at 70% RPM. I can hear it at 80% RPM but don't use that RPM as there is no point; there is no discernible thermal throttling until 73-75C. It will hold 2.0GHz at 70C.
> 
> I point out the mod to the PCI-E bracket area in the following video, which can be accomplished with a pair of needlenose pliers and gently twisting this area obstructing the exhaust portion of the card off of the bracket area. Someone who took this advice reported it reducing their load temps by 5C alone over on PCgamer.com:
> 
> http://www.pcgamer.com/geforce-gtx-1080-ti-overclocking/#comment-3201515832
> 
> 
> 
> 
> 
> 
> (In this video I had memory at +500, hence the aggressive core clock throttling, and here also I was using the factory thermal compound, you can see it at 67-69C or so with the fan at 78% RPM, I've revisited this same area with Gelid GC Extreme on the core and with similar ambient it is now at 63-65C with the fan set to 80% RPM)
> 
> And here it's running Heaven at 70-71C with a 1:1 fan profile, overclocked, with PT at 120% before Gelid GC Extreme. I plan to do another update on another morning when I don't have to work and ambient is comparable to the two previous videos.
> 
> 
> 
> 
> 
> 
> Oh and I'm #2 in the world with an i7 4930k and a single 1080 Ti, and the person above me apparently won the silicon lottery as they have their 4930k at 4.7GHz. I'm not typically one to boast; I found this out after I was trying to work out a mysterious lower-than-average Combined score in Firestrike (7700), which I resolved by setting 3DMark.exe's priority to "High Priority" via Process Hacker and also forcing 3DMark to run its 64-bit exe in the launcher options. Score went up a thousand points.
> 
> So yeah, FE runs fantastic if you address your airflow, I was expecting it to run around 80C and exhibit all kinds of thermal throttling, now I'm seeing it run at 65C with a 1:1 fan profile in some games that tend to not get the PT above 100% too much such as Titanfall 2. The Witcher 3 in 3D Vision is a different story, PT is always pegged, but even then temps still don't exceed 70C with 1:1 ratio and Gelid GC Extreme, an RPM that truly doesn't bother me with my Astro A40's on. Yeah at 100% RPM this thing is obnoxious, but at 70% it's perfectly fine IMHO.
> 
> LOVE this card.


awesome


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Addsome*
> 
> How can the score be the same if the stock bios is hitting power limit and downclocking while the MSI bios stays at a constant 2088Mhz?


I think the MSI and stock BIOSes are equal; the scores are a tiny bit off but still within margin of error.

The ASUS BIOS just ran hotter and I had a loud coil whine; it was bizarre. Even my speakers had a hum that matched the frequency of the coil whine from the card. The ASUS BIOS caused my card to get hot, and it probably had random thermal throttles with my card approaching 51C, even though I have an x41, which is quite beefy for an AIO.

All I know is that with an EVGA hybrid kit, or an H55 or x31, these cards run quite a bit hotter. They probably need a 280mm radiator to stay under 51C if you draw 330 watts or greater. Or we need a BIOS mod to modify thermal throttling.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> So I flashed the MSI bios, idle clock is now 202.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Time to start testing again.


Oh boy! This may be the one. I'll have another for you to try by Thursday.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> 2637 is the max RPM in GPU-Z with the fan @ 100%, and it's very tolerable, so it's gimped more than the Asus card for sure.


Makes sense, the MSI card has larger fans, so they don't have to spin as fast.


----------



## Slackaveli

Quote:


> Originally Posted by *theunknownkid*
> 
> 60% improvement is pretty darn good. I guess I am just attempting to justify $1300 AUD for an FPS upgrade.
> But I did invest in an expensive monitor; guess I need the grunt to run it to its full potential.


Of course, man; otherwise you WASTED a ton of money on that monitor.


----------



## Addsome

Quote:


> Originally Posted by *SlimJ87D*
> 
> I think the MSI and stock BIOSes are equal; the scores are a tiny bit off but still within margin of error.
> 
> The ASUS BIOS just ran hotter and I had a loud coil whine; it was bizarre. Even my speakers had a hum that matched the frequency of the coil whine from the card. The ASUS BIOS caused my card to get hot, and it probably had random thermal throttles with my card approaching 51C, even though I have an x41, which is quite beefy for an AIO.
> 
> All I know is that with an EVGA hybrid kit, or an H55 or x31, these cards run quite a bit hotter. They probably need a 280mm radiator to stay under 51C if you draw 330 watts or greater. Or we need a BIOS mod to modify thermal throttling.


I have a 1080 hybrid kit on my 1080 Ti, and even with the Asus BIOS, hitting 120% power limit at 1.075V, I was not going over 47C.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Man, this makes me miss my MSI 980 Ti, but the MSI BIOS is running much better than the ASUS BIOS. It doesn't draw as much power for whatever reason. This is at 2088MHz on the core. It performs on par with the stock BIOS. But what will be nice is avoiding the power limiting.


Quote:


> Originally Posted by *c0ld*
> 
> Bling bling, got freebies with my order of my 1080 Ti.


Say what? Details, man! That's a lot of swag, and I have the same GPU on order from the Egg; it already left California headed east.


----------



## Slackaveli

Quote:


> Originally Posted by *Pandora's Box*
> 
> Actually, you're wrong. I had a Founders Edition 1080 Ti before this Gaming X. My CPU temps haven't changed at all, and my NVMe drive temp hasn't changed at all. The only thing that's changed is the GPU is now cooler, quieter, and overclocks to 2GHz with ease. There are plenty of air vents around the GPU side of the case. The CPU gets fresh air on the opposite side of the case. I have a feeling you don't know the layout of this case. The GPU and CPU/motherboard compartments are completely closed off from one another.
> 
> Edit: And there's no way you are at 2GHz at 67C with 70% fan speed on a Founders Edition 1080 Ti. I'm sure other posters here would agree with me on that statement. Even if you somehow magically are, enjoy your wind tunnel. 70% fan speed on a Founders Edition card is very loud. You even stated in a previous post that you are getting 70C at 70%. Contradicting yourself here, buddy: "I'm seeing 70C overclocked at 70% RPM (1:1 ratio), this is with PT at 120%, +150 core, +400 memory. If I set the fan to 80% the temps don't exceed 65C. If I set the fan to 100% the temps don't exceed 57C"


To be fair, he explains it above. He has done a case mod that gave him 4-5C by improving the GPU's exhaust, he has a MASSIVE side fan giving the card fresh air and cooling the backplate and VRM by sheer wind power, and he has done the thermal paste mod for another 4-5C by replacing the stock compound with Gelid, which is almost as good as Thermal Grizzly. I get basically the same temps as him, and I have a heatsink with thermal paste and an extra fan on top of the backplate and did the Thermal Grizzly mod. If I cut the metal like he did, I'd probably beat him in temps. But my silicon itself is a dog so I switched; but if I could hold 2.0-something I'd probably be just as happy as ole Vince there.


----------



## c0ld

Quote:


> Originally Posted by *Slackaveli*
> 
> Say what? Details, man! That's a lot of swag, and I have the same GPU on order from the Egg; it already left California headed east.


There was a special pre-order through an Aorus Community manager in the FB group. They allocated 10 of each (Aorus Xtreme, Aorus, Gaming G1); you could pre-order directly from Gigabyte by requesting a form from the rep and filling in your payment info.

The perks were:
- The card would ship for free on the 3rd of April (1 day before Newegg) with 2 Day Air; mine went ground since I was in the same state, and I got it today








- Guaranteed availability once you filled the form and they accepted it.
- Freebies that they kept a secret; they were a nice surprise. I only paid for the card and got all that!


----------



## Pandora's Box

Quote:


> Originally Posted by *Slackaveli*
> 
> To be fair, he explains it above. He has done a case mod that gave him 4-5C by improving the GPU's exhaust, he has a MASSIVE side fan giving the card fresh air and cooling the backplate and VRM by sheer wind power, and he has done the thermal paste mod for another 4-5C by replacing the stock compound with Gelid, which is almost as good as Thermal Grizzly. I get basically the same temps as him, and I have a heatsink with thermal paste and an extra fan on top of the backplate and did the Thermal Grizzly mod. If I cut the metal like he did, I'd probably beat him in temps. But my silicon itself is a dog so I switched; but if I could hold 2.0-something I'd probably be just as happy as ole Vince there.


Wait he cut the metal on the exhaust part of the card? Lol RIP warranty. Fact is there's no way he is holding 2GHz at 70% fan speed on a founders edition. If he is then his case must sound like a wind tunnel.


----------



## Slackaveli

Quote:


> Originally Posted by *c0ld*
> 
> There was a special pre-order through an Aorus Community manager in the FB group. They allocated 10 of each (Aorus Xtreme, Aorus, Gaming G1), you could pre-order directly from Gigabyte by requesting a form from the rep and filling your payment info.
> 
> The perks were:
> - The card would ship for free on the 3rd of April (1 day before Newegg) with 2 Day Air; mine went ground since I was in the same state
> 
> 
> 
> 
> 
> 
> 
> 
> - Guaranteed availability once you filled the form and they accepted it.
> - Freebies that they kept a secret; they were a nice surprise. I only paid for the card and got all that!


Damn dude, I would have jumped all over that deal too. I guess not knowing the swag you took a chance, but it paid off big time! That's a decent mouse, nice pad, all kinds of schwag. Borat says, "veruh niiiiice".


----------



## alucardis666

So I had some points on my Amazon card available; long story short, I have *this* on its way for scheduled delivery tomorrow. Gotta love that one-day shipping!









Did I mention it was free?


----------



## c0ld

Quote:


> Originally Posted by *Slackaveli*
> 
> Damn dude, I would have jumped all over that deal too. I guess not knowing the swag you took a chance, but it paid off big time! That's a decent mouse, nice pad, all kinds of schwag. Borat says, "veruh niiiiice".


Yeah man! It was way better than fighting for one on Newegg. Some people got mechanical keyboards instead of the mousepad/mouse combo. They went all out; all the goodies were a nice touch.


----------



## Slackaveli

Quote:


> Originally Posted by *Pandora's Box*
> 
> Wait he cut the metal on the exhaust part of the card? Lol RIP warranty. Fact is there's no way he is holding 2GHz at 70% fan speed on a founders edition. If he is then his case must sound like a wind tunnel.


It probably is a bit loud, but he said he wears headphones to game.

He cut the case: the metal bar between the PCI-E slot cutouts that partially blocks the exhaust, probably ~15% of it. He cut that and improved flow.
Quote:


> Originally Posted by *alucardis666*
> 
> So I had some points on my Amazon card available; long story short, I have *this* on its way for scheduled delivery tomorrow. Gotta love that one-day shipping!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Did I mention it was free?


Smart move. There's another 4-5C. Being on a hybrid mod, that Thermal Grizzly could make the critical delta difference around a throttle point.


----------



## KingEngineRevUp

I don't know why, but I find this very hilarious. I have an x41 mounted on my FE reference card. Since the pump is doing most of the work, I didn't care to use that small blower fan on the card anymore; I just kept it at 30%.

Well, running the MSI BIOS did push my temperatures to 50C... Putting the blower at 75% actually helped hold the card at 48C, so now I'm not thermal throttling anymore... I wasn't going to do this, but I'm actually going to stick heatsinks on the midplate now so the blower can pass over them and get rid of more heat.

I'll stick heatsinks on the following areas. Literally whatever helps me stay right below 51C, stupid thermal throttle...



This *isn't* a picture of my card BTW.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> I don't know why, but I find this very hilarious. I have an x41 mounted on my FE reference card. Since the pump is doing most of the work, I didn't care to use that small blower fan on the card anymore; I just kept it at 30%.
> 
> Well, running the MSI BIOS did push my temperatures to 50C... Putting the blower at 75% actually helped hold the card at 48C, so now I'm not thermal throttling anymore... I wasn't going to do this, but I'm actually going to stick heatsinks on the midplate now so the blower can pass over them and get rid of more heat.
> 
> I'll stick heatsinks on the following areas. Literally whatever helps me stay right below 51C, stupid thermal throttle...


Same experience here, bud. *This* should also do the trick.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> So I had some points on my Amazon card available; long story short, I have *this* on its way for scheduled delivery tomorrow. Gotta love that one-day shipping!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Did I mention it was free?


Quote:


> Originally Posted by *c0ld*
> 
> Yeah man! It was way better than fighting for one on newegg. Some people got Mechanical Keyboards instead of teh mousepad/mouse combo. They went all out it was a nice touch all the goodies.


Dang, a free mechanical keyboard is a helluva freebie.

I guess you guys got a BIG ole box for a GPU delivery. You were probably excited as hell.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> Smart move. There's another 4-5C. Being on a hybrid mod, that Thermal Grizzly could make the critical delta difference around a throttle point.


That's what I'm hoping for! Gonna re-TIM my CPU and GPU and see how it goes.









*EDIT:* does it *REALLY* make that much of a difference?! 4-5C? I'm using MX-4 right now on both.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> I don't know why, but I find this very hilarious. I have an x41 mounted on my FE reference card. Since the pump is doing most of the work, I didn't care to use that small blower fan on the card anymore; I just kept it at 30%.
> 
> Well, running the MSI BIOS did push my temperatures to 50C... Putting the blower at 75% actually helped hold the card at 48C, so now I'm not thermal throttling anymore... I wasn't going to do this, but I'm actually going to stick heatsinks on the midplate now so the blower can pass over them and get rid of more heat.
> 
> I'll stick heatsinks on the following areas. Literally whatever helps me stay right below 51C, stupid thermal throttle...


Use thermal poly pads under a 100x50x15mm finned heatsink mounted horizontally between the blower and the cooler. I bet that'd give you gains.
Quote:


> Originally Posted by *alucardis666*
> 
> That's what I'm hoping for! Gonna retim my cpu and gpu and see how it goes.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *EDIT:* does it *REALLY* make that much of a difference?! 4-5c? I'm using MX-4 right now on both.


I got 8C on load over the default paste on my 980 Ti, and got ~6C on the Founders. You may only gain a degree or two, but it should register some gain. MX-4 is decent, but even the maths say it's half as good as Kryonaut.
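The "degree or two" estimate roughly checks out on paper with dT = P*t/(k*A). The conductivities below are vendor-quoted figures (which actually put MX-4 closer to two-thirds of Kryonaut than half), while the 50-micron bond line, 250W heat flux, and ~471mm^2 GP102 die area are illustrative assumptions:

```python
# Temperature rise across the TIM layer: dT = P * t / (k * A).
def tim_delta_t(power_w, thickness_m, k_w_per_mk, area_m2):
    return power_w * thickness_m / (k_w_per_mk * area_m2)

POWER = 250.0       # assumed watts through the die
THICKNESS = 50e-6   # assumed 50 micron bond line
AREA = 471e-6       # GP102 die, ~471 mm^2

for name, k in (("MX-4", 8.5), ("Kryonaut", 12.5)):
    print(f"{name}: {tim_delta_t(POWER, THICKNESS, k, AREA):.1f}C across the TIM")
```

With these assumptions the gap between the two pastes works out to about 1C, so a couple of degrees from a repaste is plausible but not dramatic.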


----------



## Slackaveli

nm


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> I got 8C on load over the default paste on my 980 Ti, and got ~6C on the Founders. You may only gain a degree or two, but it should register some gain. MX-4 is decent, but even the maths say it's half as good as Kryonaut.


Only half as good, huh? The hype is real. I'm *VERY* curious to see how it affects my CPU and GPU temps.

Use the applicator to make as thin a layer as possible, right? Or do I just stick with the rice-grain method?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Addsome*
> 
> I have a 1080 hybrid kit on my 1080 Ti, and even with the Asus BIOS, hitting 120% power limit at 1.075V, I was not going over 47C.


Are you running at 2114MHz?

Edit: Wait a minute... You're only at 120%? Brah, I'm running at 330 watts... That's like 132% relative to you... Unless you're doing 120% on the partner BIOS, I don't see how that's possible; it would be 360 watts O_O.

At stock BIOS with an OC and the 120% slider, I ran around 42C to 45C.
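Worth spelling out, since the two posters are using percentages of different base board powers. A quick worked example; the 250W figure is the reference-card spec, while the 300W partner-BIOS base is an assumption for illustration:

```python
# A power-target slider is a percentage of the BIOS's base board power,
# so the same "120%" means different watts on different BIOSes.
def power_limit_w(base_w, pt_percent):
    return base_w * pt_percent / 100

FE_BASE = 250   # reference 1080 Ti board power, watts
AIB_BASE = 300  # assumed partner-BIOS base

print(power_limit_w(FE_BASE, 120))   # FE at 120% -> 300.0 W
print(330 / FE_BASE * 100)           # 330 W is 132.0% of the FE base
print(power_limit_w(AIB_BASE, 120))  # partner BIOS at 120% -> 360.0 W
```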


----------



## Pandora's Box

Quote:


> Originally Posted by *VanuSovereignty*
> 
> 
> 
> 
> 
> 
> Thats 70C at 70% RPM BEFORE changing the TIM out for Gelid GC Extreme which dropped my temps 3-4C.
> 
> And here's The Witcher 3 at 99% load and a PT induced core clock throttled 1950Mhz or so at 67C with the fan at 78% RPM BEFORE Gelid GC Extreme, now it's 63-65C at 78% RPM. I have been meaning to do an update.
> 
> 
> 
> 
> 
> 
> Apparently you haven't read my previous post where I note that removing the metal area dividing PCI-E slots 1 and 2 has dropped load temps 5C for others who heeded this suggestion.
> 
> Yeah, having a second look at your case, sure, it may be sealed off, but your GPU's VRM and memory and everything else gets to sit in that oven of an enclosure. Great for Hot Pockets, bad for 300W AIB GPUs. Hopefully it all works out in the end. How about posting some videos of your card sitting in a Heaven loop at 2.0GHz, +400 memory and 120% PT in return? Predicting 75C.
> 
> Edit:
> 
> I said I am under 65C in Titanfall 2 (yes 1:1 ratio). I suppose I can do another video of this. For whatever reason PT in TF2 typically doesn't get much over 100, I mean there are a few spikes to 110,115,120% PT but usually it's at or under 100% and then the matches are only say 5-10 minutes each and then there's 3 minutes or so sitting back in a lobby waiting for another match so it doesn't really get up and stay up. The Witcher 3 it's 99% load for hours with PT at 110%+ so there is a lot for the card to cope with.
> 
> Sorry if I came off super negative; if I were you I would consider getting a larger case, though. The Silverstone Raven is a nice form-factor case, but it's really meant for reference-cooled cards. With summer approaching, you might find yourself looking at 80C load temps. That cooler is really inadequate; they haven't changed the design, and everyone with the 980 Ti variant complained to no end about the temperatures sitting in the mid 70's, and this was in cases that could breathe. Hell, I was seeing 70C in my case with the side-panel fan feeding air directly to it!


I'm not wasting bandwidth uploading a video of Heaven looping. Hell, it's been discussed extensively in this thread that Heaven does not stress these cards. Go run the 3DMark Firestrike Extreme stress test and report back with your core clock. lol @ 78% fan speed. Now that is a wind tunnel. And that's coming from a guy with bad hearing. I am locked at 2,000MHz at 68C in the Firestrike Extreme stress test.

Edit: I do not wear headphones. I guess if you have noise-cancelling headphones then maybe 78% fan speed is justified, but it's still insane.


----------



## JedixJarf

Quote:


> Originally Posted by *SlimJ87D*
> 
> I think the MSI and stock are equal, the scores are a tiny bit off but still in a margin of error.
> 
> The ASUS bios just ran hotter and I had a loud coil whine, it was bizarre. Even my speakers had a him that matched the frequency of the coil whine from the card. The ASUS bios caused my card to get hot, it probably had random thermal throttles with my card approaching 51C even though I have a x41 which is quite beefy for a AIO.
> 
> All I know is that with a EVGA hybrid kit, or H55, x31, these cards run quite hotter. They need a 280mm radiator to probably stay under 51C ifnyoubdraw 330 Watts or greater. Or we need a bios mod to modify thermal throttling.


When I had run the 1080 strix bios on my old 1080 g1 it developed a terrible coil whine that never went away after that.


----------



## Pandora's Box

Quote:


> Originally Posted by *VanuSovereignty*
> 
> 1:1 fan ratio makes for 70% RPM at 70C as indicated in the video, not 78%; it's a substantial difference.
> 
> In Titanfall 2 the fan doesn't get over 60-65% RPM (yes, I'm talking about PEAKS of 65C here).
> 
> Heaven isn't intensive since when? It's probably the only benchmark that is used for actual stress testing. You can be Firestrike stable but you won't be Heaven stable. Heaven also gets your PT up there.


Heaven hovers around 105% power target for me. Firestrike Extreme is pretty much maxing out at 117% (the max power target on the Gaming X, with its 350W TDP).


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Only half as good huh? The hype is real. I'm *VERY* curious to see how it affects my CPU and GPU temps.
> 
> Use the applicator to make as thin a layer as possible right? Or do I just stick with the rice grain method?


I use the X method or line method on CPU, but on GPU I spread it over every bit of it. A "not-TOO-thin" thin spread, lol.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> I use the X method or line method on CPU, but on GPU I spread it over every bit of it. A "not-TOO-thin" thin spread, lol.


Sounds good!


----------



## Slackaveli

Quote:


> Originally Posted by *VanuSovereignty*
> 
> TY
> 
> 
> 
> 
> 
> 
> 
> 
> TY for corroborating. Too bad your FE was a poor overclocker, hopefully your Aorus is a golden sample. Hopefully someone will unlock these vbios so these AIB cards with better thermal management and power delivery can truly shine. But yeah, another 30W or whatever and being able to go higher than +120 on the core and be quieter and good looking in the process for less than what you paid for FE, who could resist?
> 
> You got me thinking about trying to return mine, but it does +150 (flirts with 2050 very briefly before 50C) and well, I just can't be bothered with it. It would be a gamble for me actually as some AIB cards are only capable of +120 or so core (I forgot the review, possibly Strix) so yeah, better quit while I'm ahead. It would be nice to have a quieter cooler though, I can hear this thing at 80% over the headphones, particularly during dialogue or when there isn't anything going on in the game. At 70% I really don't notice it.


No problem, bro. If I were you I'd stand pat. I don't mind a little bit of fan noise. If it were coil whine I'd lose my mind, though.

For me it was really that $70 tax, TAX to those fools in the capital, pissing me off. So, getting a refund, like you said, who could resist? Got to play on the FE for 3 weeks very early on as well. Already got my shipping refunded because FedEx was 3 days late, and I got another game code out of the deal and traded that to mtbiker for PUBG, so I came out like a champ. Thanks again, broski.

Edit: I find it humorous that I (we) have no problem dropping the initial $700, but that $70 tax could have been used on upgraded cards, and it pisses us all off.


----------



## pez

Quote:


> Originally Posted by *Slackaveli*
> 
> Pretty much everybody can get 4.2-4.3, at anywhere from 1.25-1.35v, and these chips can take over 1.4v core and still run cool. I am running (daily) [email protected], up from stock which is 3.3. Don't let those lower-sounding numbers scare you, as this dude at 4.2 is on par IPC-wise with Skylake at 4.5.


That's good to know. After the card purchase and a case purchase (since the Aorus did actually get shipped and won't fit into my NCASE), it'll be on hold a bit longer. +rep though







Quote:


> Originally Posted by *Slackaveli*
> 
> ill trade you for my FE that cost me $770 w tax :/


I see you already ordered one







That didn't take long, hehe.
Quote:


> Originally Posted by *p2im0*
> 
> Just an FYI I just got notification the 1080 Ti hybrid kits by EVGA are available. $159 though ?
> 
> http://www.evga.com/products/Product.aspx?pn=400-HY-5388-B1


I saw this, laughed, and was even happier that I bought an AIB Ti. I was originally pretty interested in these for my TXP, but I'll be happy enough to avoid that and sell the TXP.
Quote:


> Originally Posted by *alucardis666*
> 
> At this point dedicated physics cards are totally useless. You're talking a 1-3 FPS difference on average, and maybe a max 5 FPS increase in a heavily physics-based game in optimal conditions. Save yourself the electric bill and the heat.


For those not on water, AIB cards are very desirable. FE cards pushed past the OEM fan speed limit (read: 50%) are simply noisy. Also, if staying on air, most AIB coolers keep the cards at very low temps compared to an FE, at a fraction of the noise output. They should hold boost a bit better as well.
Quote:


> Originally Posted by *Tikerz*
> 
> So as long as the voltage and core speed is stable throughout the benchmark then I should be ok?


Ensure that you try a few different OCs. For example, if you find +225MHz on the GPU and +550MHz on the VRAM works and is stable, also try +175/500 and +125/450 in benchmarks and games you play to see where you actually see better performance. Someone has mentioned it before, but Pascal can scale negatively with an OC that's too high.
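That back-off-and-retest advice amounts to running a small test matrix and keeping the combo with the best score, not the highest stable offsets. A sketch using the combos from above; the scores are made up:

```python
# Benchmark score per (core offset, memory offset) combo, all stable.
# Pascal can scale negatively past a point, so pick by score, not by offset.
results = {
    (125, 450): 9410,
    (175, 500): 9485,  # fastest in this hypothetical run
    (225, 550): 9440,  # stable, but slower, likely error correction
}

core, mem = max(results, key=results.get)
print(f"fastest combo: +{core} core / +{mem} mem -> score {results[(core, mem)]}")
```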
Quote:


> Originally Posted by *VanuSovereignty*
> 
> I hate to be the one to say it but I would run FE in that case vs. that AIB card. I've owned that same cooler, my previous card was a 980 Ti MSI 6G Gaming (I invariably swapped that cooler, which was wholly inadequate, i.e. 70C+ load, for an AIO) and although you have it oriented so that it is getting cool air from outside of the case it is expelling heat from the sides of the cooler with nowhere to go but circulate in that poor Silverstone Raven case.
> 
> There is really next-to-no performance to be had with that card over FE. Another 20W maybe?
> 
> Reference is better than AIB for small form factor cases as they push the heat out of the case. I'm sure you know all of this.
> 
> I have a post a ways up where I'm running games at 100% load at 2.0GHz with 67C.
> 
> Again, I would NOT go with that card over FE. There is no advantage in a case with proper cooling, and you sir, do not have a spacious, well ventilated case to accommodate said card even if there was a performance advantage.
> 
> Here's my Firestrike Extreme, I'm also seeing 67C load at 70% RPM. So yeah, do yourself a favor and keep the FE and sell / return the AIB card.
> 
> If you were running a 900D or my case it would be different. But youre not, youre running a case the size of a cereal box.
> 
> http://www.3dmark.com/fs/12060712


There's a very common misconception with cases like these. Getting direct ambient air is perfect for these cards. If it was a non-iCX EVGA card, I could see the concern, but for his scenario, this cooler will perform better and remain quieter than an FE. And let's be honest, 70% fan speed is not quiet by any means on these cards. It's not the end of the world, but simply put, it is loud. If you're seeing bad performance and temps on one of these MSI coolers, your airflow is insufficient.
Quote:


> Originally Posted by *Slackaveli*
> 
> to be fair he explains it above. He has done a case mod that gave him 4-5C by improving the GPU's exhaust, he has a MASSIVE side fan giving the card fresh air and cooling the backplate and VRM by sheer wind power, and he has done the thermal paste mod for another 4-5C by replacing the stock paste with Gelid, which is almost as good as Thermal Grizzly. I get basically the same temps as him and I have a heatsink with thermal paste and an extra fan on top of the backplate and did the Thermal Grizzly mod. If I cut the metal like he did, I'd probably beat him in temps. But my silicon itself is a dog so I switched, but if I could hold 2.0-something I'd probably be just as happy as ole Vince there.


Even in the NCASE build I have, a cooler like the ACX 3.0 on the 1080 performs far better than a blower-style cooler. This is with both GPUs run with two 120mm fans directly underneath them. The blower style is effective, but we're kidding ourselves if we think GPUs haven't progressed to the point where AIB coolers are more effective than an FE/reference cooler in just about any situation (outside of an AIB card in a tower with insufficient airflow).
Quote:


> Originally Posted by *c0ld*
> 
> There was a special pre-order through an Aorus Community manager in the FB group. They allocated 10 of each (Aorus Xtreme, Aorus, Gaming G1), you could pre-order directly from Gigabyte by requesting a form from the rep and filling your payment info.
> 
> The perks were:
> - The card would ship for free on the 3rd of April(1 day before Newegg) with 2 Day Air, mine was ground since I was in the same state got it today
> 
> 
> 
> 
> 
> 
> 
> 
> - Guaranteed availability once you filled the form and they accepted it.
> - Freebies that they kept a secret they were a nice surprise. I only paid for the card and got all that!


That's pretty awesome. That mouse is even half-decent if you get one without QC issues.


----------



## Pandora's Box

I don't think folding performance has been discussed yet. Check this out









1.4 Million PPD









Can now leave my GPU folding 24/7. With the Founders card I couldn't leave it folding 24/7; it was too loud.


----------



## Slackaveli

@pez, yeah I couldn't wait for you









$770 for a golden FE and I'd be alright. If I had an AIO without having to drop another $100+, maybe. But $770 for a dog, on air, was hurting me deeply.


----------



## feznz

Quote:


> Originally Posted by *Slackaveli*
> 
> hell yeah, i feel better about floating my rent money now (lmao, i love how we have no late fees until after 15 days )
> damn, son, ya'll get crushed. It's all relative, I guess. Congrats, bro. You'll be the envy of your block now.


Quote:


> Originally Posted by *joder*
> 
> He doesn't have a block... No one else could afford to live there.


Quote:


> Originally Posted by *Slackaveli*
> 
> I thought of that, too. In my head he is in a castle on a Hill overlooking throngs of peasants
> 
> 
> 
> 
> 
> 
> 
> 
> thanks


I guess in my LAN party group I will now be king of GPU and monitor. We are having a LAN night in a few weeks and I will be thinking... bad thoughts you guys put in my head. Friends are friends, even if they are peasants.
This card will go under water. I don't have a full-cover block, but it will slip into my current loop with universal blocks. I don't really want to do it this way, but I am sure it will be a few more weeks before the full-cover blocks become available.

Quote:


> Originally Posted by *Somasonic*
> 
> Nice. I paid $1605 for mine the day they went up on Ascent. They raised the price by $60 the next day so I was pretty happy I was able to get in before that


Nice. I almost got the EVGA with a water block when Computer Lounge had their big sale; I was hanging for a card. The EVGA would have been $1450-ish including the water block, but the ETA for the Strix was still 3-4 weeks away. At the online checkout someone beat me to it, and I'm glad they did, because I really wanted the Asus 1080 Ti Strix.
I might add I swore I would never deal with PlayTech again, but when it came to getting this card or waiting... I will just say they have not had my business for 4 years.

Quote:


> Originally Posted by *Slackaveli*
> 
> of course, man, otherwise you WASTED a ton of money on that monitor


You're right there. I've been looking at my 3840x1600 LG 38" and GTX 770 SLI teasing me; I could play some games... that are two years old now.


----------



## pez

Quote:


> Originally Posted by *Slackaveli*
> 
> @pez, yeah I couldn't wait for you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> $770 for a golden FE and I'd be alright. If I had an AIO without having to drop another $100+, maybe. But $770 for a dog, on air, was hurting me deeply.


Haha, oh I'm not bothered.

And yeah... $699.99 + tax + $160 for the AIO-specific cooler, or $720 for the AIB that will get a waterblock from EKWB (they run $130 IIRC)... and now we're at $10 of savings.


----------



## Jbravo33

I can't get past this, can anyone help? Probably shouldn't have tried this lol

Have no idea what I'm doing; can't change directory, or I should say I don't know how


----------



## Jbravo33

Ok got this far now I can't type anymore


----------



## guttheslayer

Ghost Recon: Wildlands makes my 1080 Ti beg for mercy.

Can't believe I cannot run a stable 60 fps at 1440p


----------



## alucardis666

Quote:


> Originally Posted by *Jbravo33*
> 
> Ok got this far now I can't type anymore


1. Close out MSI Afterburner.

2. Go to your folder where you have Nvflash and the bios you wanna flash saved. *Note: If they're not in the same folder then put them into the same folder.*

3. Once inside the folder mentioned earlier, hold *SHIFT*, *RIGHT CLICK* and choose *OPEN COMMAND WINDOW HERE*, then type "NVFLASH64 -6 [the name of the bios file.rom]" *Note: do not type the [ ] yourself.* Hit *ENTER*.
This *SHOULD* then prompt you to flash. Hit *Y*. Your screen will go black for a second and the card will be uninstalled in Device Manager, flashed, then redetected.

4. At this point once everything is done, you're going to want to restart your computer. There will be some additional redetecting more than likely... so you'll want to restart one more time.

5. Once you're back in you can relaunch MSI Afterburner and begin tweaking settings. *Note: You may need to reconfigure settings within the Nvidia Control Panel as well.*

Hope that helps.









This might also help.




Skip to about the 5:00 mark.


----------



## Jbravo33

Sweet, you got me a little further. I have two GPUs and this is what's throwing me off


----------



## alucardis666

Quote:


> Originally Posted by *Jbravo33*
> 
> Sweet, you got me a little further. I have two GPUs and this is what's throwing me off


Type: nvflash --protectoff
nvflash -4 -5 -6 --index=0 "Name of BIOS FILE".rom
nvflash -4 -5 -6 --index=1 "Name of BIOS FILE".rom

It should flash the first card then ask if you wanna flash the next I believe. So you'll just be prompted to hit Y 2x.

If it doesn't work I'm sorry, maybe someone else can chime in?

Or if you're really impatient, you can disable SLi, pull 1 card out, flash it, pull it out after flashing, pop the other in, flash that card, then reinstall both cards and reconfigure SLi.
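To avoid mistyping the per-index commands, here's a minimal Python sketch that only builds the argument lists. The `build_flash_cmd` helper is hypothetical, and the `--index=N` flag form should be verified against your own nvflash version's help output before flashing anything:

```python
# Build the nvflash argument list for each GPU in a multi-card system.
# build_flash_cmd is a hypothetical helper for illustration; verify the
# flags (especially --index) against your nvflash version first.

def build_flash_cmd(index, rom_file):
    """Return the nvflash command targeting one GPU by index."""
    return ["nvflash64", "-4", "-5", "-6", f"--index={index}", rom_file]

for i in range(2):  # two cards in SLI
    print(" ".join(build_flash_cmd(i, "bios.rom")))
```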


----------



## Jbravo33

Quote:


> Originally Posted by *alucardis666*
> 
> Or if you're really impatient, you can disable SLi, pull 1 card out, flash it, pull it out after flashing, pop the other in, flash that card, then reinstall both cards and reconfigure SLi.


Got It!! Thank you my man!!!!


----------



## alucardis666

Quote:


> Originally Posted by *Jbravo33*
> 
> Got It!! Thank you my man!!!!


You got your cards flashed?


----------



## Jbravo33

So I'm benching with the BIOS update and on Time Spy it says the score is not valid, that non-default settings were used. What's that mean?


----------



## Mato87

Quote:


> Originally Posted by *mtbiker033*
> 
> crysis 1 with blackfires ultimate mod............
> 
> 
> 
> 
> 
> 
> 
> it's the best experience ever, don't worry about the fps just enjoy it!!
> 
> the first Crysis was the last and arguably the best PC game ever made


I will try this one out. I think the game looks ******* fantastic; a bit rough around the edges, but that's expected from a 10-year-old game. It still holds up even to this date in terms of some of its graphical effects and so on. I read about the Blackfire ultimate mod and it basically changes the lighting and adds a bit of parallax occlusion. I will try it out nonetheless to see the difference. Thanks for the tip
Quote:


> Originally Posted by *alucardis666*
> 
> 
> 
> 
> 
> 
> Don't think it looks that impressive personally.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Battlefield 1 looks better imo.


Yeah, you're comparing a 10 year old game to a new game









Quote:


> Originally Posted by *mtbiker033*
> 
> frostbite is a great looking engine no doubt but considering when this came out and the overall experience (freedom to roam etc) it looks pretty good to me!


Yep, exactly, crysis still holds its ground after all those years.

Quote:


> Originally Posted by *Juub*
> 
> Isn't Crysis heavily single-threaded and perform like crap on quad-cores and up CPU's? I remember dual-cores with high clock speeds tend to run a lot better for Crysis.


Hmm, you could be onto something here. I remember having a Core 2 Duo processor and the game ran reasonably well with the graphics cards that were available back then... Should I limit my quad core to two cores just to see if it fixes the performance?


----------



## alucardis666

Quote:


> Originally Posted by *Jbravo33*
> 
> So I'm benching with bios update and on timespy it's stated score not valid. Says non-default settings were used. What's that mean


***?! lol.

Did you restart post flashing? And did you redo your MSI Afterburner settings?


----------



## Jbravo33

Quote:


> Originally Posted by *alucardis666*
> 
> ***?! lol.
> 
> Did you restart post flashing? And did you redo your MSI Afterburner settings?


Lol yea I restarted twice. And afterburner I put everything back on default


----------



## alucardis666

Quote:


> Originally Posted by *Jbravo33*
> 
> Lol yea I restarted twice. And afterburner I put everything back on default


Try running DDU and clearing out your display driver then clean installing.

http://www.guru3d.com/files-details/display-driver-uninstaller-download.html

http://www.guru3d.com/files-get/geforce-378-92-whql-driver-download,1.html


----------



## geriatricpollywog

Quote:


> Originally Posted by *guttheslayer*
> 
> Wildland recon make my 1080 ti beg for mercy.
> 
> Cant believe i cannot run 60 fps stable for 1440p


Maybe it's your see-pee-you?


----------



## fisher6

Quote:


> Originally Posted by *KedarWolf*
> 
> Black Ops 3, 2100 sustained.
> 
> GPU-ZSensorLog.txt 14k .txt file
> 
> 
> Ghost Recon: Wildlands, same.
> 
> GPU-ZSensorLog.txt 114k .txt file
> 
> 
> I'm really liking this Strix BIOS, see no need to try others.


Thanks for testing. Looks good, I might as well just flash it then.


----------



## Benny89

Quote:


> Originally Posted by *guttheslayer*
> 
> Wildland recon make my 1080 ti beg for mercy.
> 
> Cant believe i cannot run 60 fps stable for 1440p


Mate, Wildlands has been known for ABSOLUTELY BARBARIC optimization since the beta. This is also a Ubi game. It is one of the worst-optimized games ever.

Even in The Division at 1440p with a 980 Ti I was getting around 70-80 fps I think (it was long ago, but for SURE much more than 60).


----------



## shalafi

Quote:


> Originally Posted by *Benny89*
> 
> Mate, Wildland is known for ABSOLUTELTY BARBARIC optimalization since beta. This also Ubi game. It is one of the worst optimized games ever.
> 
> Even in the division at 1440p with 980Ti I was getting around 70-80 fps I think (was long ago, but for SURE much more than 60).


Speaking of which, I tried installing ARK again, just for the lolz. 1440p, Epic everything, 40fps (+weird input lag). I think ARK wins the unoptimized piece of garbage award.


----------



## MURDoctrine

Quote:


> Originally Posted by *shalafi*
> 
> Speaking of which, I tried installing ARK again, just for the lolz. 1440p, Epic everything, 40fps (+weird input lag). I think ARK wins the unoptimized piece of garbage award.


I would have to agree there. The sad thing is I saw a kid buy a $299 all-in-one HP cheapo to try and play that. All I could do was facepalm, since you couldn't convince him it wouldn't run the game. It had a Celeron, 4GB of RAM, and no GPU.


----------



## mshagg

The 1080 Ti crushes Wildlands on my setup at 5760x1080, which is about 70% more pixels than 2560x1440. It is pretty thirsty on CPU resources and gives this overclocked 5930K a better workout than any other game I've seen.

Admittedly I'm only targeting three 60Hz screens, but it pushes 75+ fps
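For reference, the raw pixel math (this assumes 2560x1440 for "1440p"):

```python
# Pixel counts for the two resolutions being compared.
surround = 5760 * 1080  # triple-1080p surround
qhd = 2560 * 1440       # standard 1440p

print(surround)         # 6220800
print(qhd)              # 3686400
print(surround / qhd)   # 1.6875, i.e. about 69% more pixels
```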


----------



## MURDoctrine

Quote:


> Originally Posted by *mshagg*
> 
> 1080Ti crushes Wildlands on my setup in 5760x1080, which is almost double the number of pixels as 1440p. It is pretty thirsty on CPU resources and gives this overclocked 5930k a better workout than any other game I've seen.
> 
> Admittedly im only targeting three 60Hz screens but it pushes 75+fps


Yeah, it is a CPU hog. It keeps my 4770K at 75-85% at 1080p 60Hz. I had to up the resolution scaling just to throw more load onto the GPU. It's the only game that lets me see my CPU bottleneck without just letting the 1080 Ti run unrestricted on FPS.


----------



## Mato87

Quote:


> Originally Posted by *mshagg*
> 
> 1080Ti crushes Wildlands on my setup in 5760x1080, which is almost double the number of pixels as 1440p. It is pretty thirsty on CPU resources and gives this overclocked 5930k a better workout than any other game I've seen.
> 
> Admittedly im only targeting three 60Hz screens but it pushes 75+fps


I am guessing you're playing on medium-to-high graphics settings too? Can we have a screenshot of the Afterburner overlay showing how much CPU utilization the game has?


----------



## pez

Quote:


> Originally Posted by *guttheslayer*
> 
> Wildland recon make my 1080 ti beg for mercy.
> 
> Cant believe i cannot run 60 fps stable for 1440p


I thought I had maxed out my settings when I tested it with the 1080 (non-Ti) on 21:9 1440p and was seeing 60+ in the open beta. I already got my code for the Ti I ordered, so I'm going to install it and test it on my TXP to see how it does. I remember being generally impressed that the game ran that well on a 1080 fully maxed at that res.


----------



## guttheslayer

Quote:


> Originally Posted by *0451*
> 
> Maybe it's your see-pee-you?


Haha, maybe you mean my 4.3GHz 6800K is getting obsolete


----------



## guttheslayer

Quote:


> Originally Posted by *pez*
> 
> I thought I had maxed out my settings when I tested it with the 1080 (non-Ti) on 21:9 1440p and was seeing 60+ in the open beta. I already got my code for the Ti I ordered, so I'm going to install it and test it on my TXP to see how it does. I remember being generally impressed that the game ran that well on a 1080 fully maxed at that res.


Currently the benchmark only shows 65.7 fps on my 2050MHz GPU and 4.3GHz CPU. I dunno man, the demand is crazy


----------



## Jbravo33

Feel like I got coil whine from the Strix BIOS; gonna load the MSI one. Anyone else notice noise?


----------



## pez

Quote:


> Originally Posted by *guttheslayer*
> 
> Currently the benchmark only show 65.7 fps on my 2050mhz gpu and 4.3ghz cpu. I dunno man the demand is crazy


Yeah, I'll install it and give it a try this afternoon for you on the TXP... I'm @ 4.5 on my 4770K, but I am on 21:9 1440p, so maybe I'm getting more GPU usage by default. I'll try both resolutions and report back for ya.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *VanuSovereignty*
> 
> Hey so youre unable to keep it under 50C and youre running a single H55 (EVGA kit)?
> 
> Well I have some experience with these AIO's:
> 
> 780 Ti + NZXT Kraken G10 + H55 single fan: 55-60C
> " " dual fan push-pull: 48-53C
> MSI 980 Ti 6G Gaming + H55 dual fan push pull: about the same, we are talking about the same TDP here, so same should apply for 1080 Ti.
> 
> MSI 980 Ti 6G Gaming + x41 dual fan push-pull 42-47C
> 
> Try adding a second fan if you haven't already.
> 
> The x41 is hands down a better AIO, I would keep the fan's at an inaudible 50% RPM and it was still at least 7C cooler than the H55. It's a lot thicker than H55 as well.
> 
> Also, I'm sure you know this but, you want this AIO pushing this heat out of the case!
> 
> You wouldn't believe it but I've seen people posting their AIO cooled GPU set up with the radiator pulling heat into the case and expelling it right on to the VRM fan and you just want to do a line of face palm emoji's.


I have an x41. I actually have it blowing into the case, but it's blowing at the exhaust fans. I did an open-case test too.

It runs great, at the temperatures you're saying: 50% fan speed at 42C to 47C. It's when I run a custom BIOS at 330 watts that it gains an extra 4C or 5C. That kind of makes sense: if you get about 1.5C for every 10 watts, at 300 watts you'd be at 45C and at 330 watts you'd be at 49.5C.

I have it in push-pull also. I'll probably try different things later today. But turning the blower fan up higher got me to 48C, which helped a lot. Must be cooling the midplate. Got to find ways of extracting heat by other means.
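That back-of-the-envelope scaling can be written out. This is purely the linear assumption implied by the post's 45C-at-300W and 49.5C-at-330W figures (about 1.5C per 10W), not measured data:

```python
# Linear watts-to-load-temperature estimate: ~1.5 C per 10 W of board
# power. Purely an illustrative rule of thumb, not measured data.

def estimated_temp(watts, c_per_10w=1.5):
    """Estimate load temperature (C) under the linear assumption."""
    return watts / 10 * c_per_10w

print(estimated_temp(300))  # 45.0
print(estimated_temp(330))  # 49.5
```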


----------



## guttheslayer

Quote:


> Originally Posted by *pez*
> 
> Yeah, I'll install it and give it a try this afternoon for you on the TXP... I'm @ 4.5 on my 4770K, but I am on 21:9 1440p, so maybe I'm getting more GPU usage by default. I'll try both resolutions and report back for ya.


I don't think it's the CPU. People on YouTube have tried SLI 1080 Tis at 4K and the game runs around 50+ fps. For a single card it's a low-40s average


----------



## Madness11

Hello guys. Does anyone here have the GTX 1080 Ti Aorus Xtreme? Please tell me your max OC


----------



## Sh0rtcut

Hey,

First post for me!









I need a second opinion here... As many of you know, the 1080 ti costs around $900 here in Europe







BUT I am going to the US next week, to visit some family and I want to get myself a new GPU!

We are only there for one week (next week), and as of right now most of the 1080 Tis out there are the FE edition; the only aftermarket model I could find in stock is the Gigabyte Gaming OC.

Do any of you own this model, or can you recommend another model? I currently own an Nvidia 1070 FE, but I think the FE cooler is a little too noisy for me, and as you can see the Gigabyte model costs the same as the FE, so why not get that one over the FE









GIGABYTE GeForce® GTX 1080 Ti Gaming OC 11G

This is the card:
http://www.gigabyte.us/Graphics-Card/GV-N108TGAMING-OC-11G#sp

Thanks


----------



## fisher6

Quote:


> Originally Posted by *guttheslayer*
> 
> I dont think its the cpu. Ppl at youtube have tried sli 1080 ti at 4k res and the game goes around 50 fps+. For single card its low 40+ average


I'm playing at 3440x1440 ultrawide with the very high preset and some things pushed higher like god rays I think. I always get 75-80 FPS with a single 1080 Ti, never dips below 60 at least. I'm also using reshade which costs like 5-7 FPS but the game is very smooth. The highest preset (Ultra?) is useless when you look at the performance hit you take.


----------



## Benny89

Quote:


> Originally Posted by *Sh0rtcut*
> 
> Hey,
> 
> First post for me!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I need a second opinion here... As many of you know, the 1080 ti costs around $900 here in Europe
> 
> 
> 
> 
> 
> 
> 
> BUT I am going to the US next week, to visit some family and I want to get myself a new GPU!
> 
> We are only there for one week, (next week) and as of right now most of the 1080ti out there are the FE edition, the only Aftermarket model I could find in stock is the Gigabyte Gaming OC.
> 
> Do any of you own this model, or can you recomend another model? I currently own a Nvidia 1070 FE, but I think the FE models is a little too noisy for me, and as you can see the Gigabyte model costs the same as the FE so why not get that one over the FE
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GIGABYTE GeForce® GTX 1080 Ti Gaming OC 11G
> 
> This is the card:
> http://www.gigabyte.us/Graphics-Card/GV-N108TGAMING-OC-11G#sp
> 
> Thanks


Hi Mate.

What you are doing is very risky in terms of hardware; let me tell you why:

1. The card you have your eyes on has not been reviewed even once. You buy it at your own risk; I would not buy an unreviewed card.
2. Buying the card in the US will give you a lot of trouble if, for example, you want to return it. Here in the EU, if you buy a card you have 2 weeks to return it without giving any reason (of course only for online orders). In the US the law is not that generous and you often have to pay restocking fees etc. Also, judging from your post, you will buy it directly from a store in the US, yes? If so you will be unable to return it.
3. If you return to the EU and something happens to the card (or you want to return it), you can't go through the store's support to handle your complaint/warranty since it is located in the US, and you would have to ship them the card (which means HUGE shipping costs, taxes, etc.). Or you have to try Gigabyte support directly.
4. A card bought in the US is under US warranty law, and everybody knows that in terms of warranty (in the EU 2 years is always guaranteed), returns, or complaints, it is easier in the EU. *Also, in the EU there are no restocking fees or shipping costs* when you are under warranty; the store/supplier pays for everything.
5. Buying from an EU store (ideally a local store via online order) means that if your card has any flaws, you just pack up the GPU, take it to the store, give it to them, and they handle everything (shipping, contact with the supplier, repair/replacement etc.). You just sit and wait.

I would not buy THAT expensive hardware outside of my *continent*, to be honest


----------



## Dasboogieman

Quote:


> Originally Posted by *Sh0rtcut*
> 
> Hey,
> 
> First post for me!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I need a second opinion here... As many of you know, the 1080 ti costs around $900 here in Europe
> 
> 
> 
> 
> 
> 
> 
> BUT I am going to the US next week, to visit some family and I want to get myself a new GPU!
> 
> We are only there for one week, (next week) and as of right now most of the 1080ti out there are the FE edition, the only Aftermarket model I could find in stock is the Gigabyte Gaming OC.
> 
> Do any of you own this model, or can you recomend another model? I currently own a Nvidia 1070 FE, but I think the FE models is a little too noisy for me, and as you can see the Gigabyte model costs the same as the FE so why not get that one over the FE
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GIGABYTE GeForce® GTX 1080 Ti Gaming OC 11G
> 
> This is the card:
> http://www.gigabyte.us/Graphics-Card/GV-N108TGAMING-OC-11G#sp
> 
> Thanks


Check your EU consumer laws. You may very well find it is worth it to buy in your home country.
For example:
I pay roughly 10-20% more on stuff in Australia but I have full statutory 2 years warranty regardless of manufacturer policy, if the manufacturer is defunct then the issue is taken up with the shop I bought the product at.

Basically, this entitles me to full protection, even bypassing "warranty void if removed stickers". In my opinion, these protections far outweigh any cost difference but YMMV.

As for the specific model the question really boils down to whether you intend to go Water/AIO cooling later. If you don't intend to hard mod then go for the ones with custom cooling. However, it is recommended to wait a bit so we can get more info on the specifics of a custom card.


----------



## KingEngineRevUp

See, this is why I get a good credit card. It will cover anything beyond the manufacturer warranty and even adds a year.


----------



## evassion

Aorus Xtreme Gtx1080Ti Heaven Bench Temperature..high?

and any more than 2025MHz result in crash.


----------



## Benny89

Quote:


> Originally Posted by *SlimJ87D*
> 
> See this is why I get a good credit card. Will cover anything over the manufacturer warranty and even adds 1 year.


Or you just don't need a credit card, and you don't create debt for yourself to pay. You don't need one because the law protects you as a consumer.

A simple debit card is good enough to pay for everything. At least in the EU, of course.


----------



## Benny89

Quote:


> Originally Posted by *evassion*
> 
> 
> 
> 
> 
> 
> Aorus Xtreme Gtx1080Ti Heaven Bench Temperature..high?
> 
> and any more than 2025MHz result in crash.


Yup, that's what I was talking about. Good thing I went with the Strix.

Those temps are much higher than the MSI Gaming X or ROG Strix.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> Or you just don't need credit card and you don't create for yourself debt to pay
> 
> 
> 
> 
> 
> 
> 
> . You don't need it because law is protecting you as a consumer.
> 
> Simple debit card is good enough to pay for everything. At least in EU of course.


I actually never have debt. I pay it off at the end of every month so no interest is charged. And you need a CC to build a credit score, might as well get a good one. You should learn about them.

I get FREE points, warranty extension, insurance, etc. A debit card won't get you that, there might be some that do but I highly doubt it.

For the next 3 months my gpu can be lost or stolen and I'll get a new one. That's another perk.


----------



## evassion

Is this a better thermal solution than the 1080 Ti hybrid kit on an FE?

http://www.aquatuning.es/refrigeracion-liquida/kits-y-sistemas/kits-internos/alphacool/eiswolf/21866/alphacool-eiswolf-120-gpx-pro-nvidia-geforce-gtx-titan-x-pascal-/-1080-ti-m02-black

And another question: is the price difference between the 1080 hybrid kit and the 1080 Ti hybrid kit worth it? I am not willing to pay 50 more euros for aesthetic reasons, if it works... ^^


----------



## Benny89

Quote:


> Originally Posted by *SlimJ87D*
> 
> I actually never have debt. I pay it off at the end of every month so no interest is charged. And you need a CC to build a credit score, might as well get a good one. You should learn about them.
> 
> I get FREE points, warranty extension, insurance, etc. A debit card won't get you that, there might be some that do but I highly doubt it.
> 
> For the next 3 months my gpu can be lost or stolen and I'll get a new one. That's another perk.


Those are actually very good perks and points. I will look into that at my bank. Thanks


----------



## KedarWolf

Quote:


> Originally Posted by *fisher6*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Black Ops 3, 2100 sustained.
> 
> GPU-ZSensorLog.txt 14k .txt file
> 
> 
> Ghost Recon: Wildlands, same.
> 
> GPU-ZSensorLog.txt 114k .txt file
> 
> 
> I'm really liking this Strix BIOS, see no need to try others.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for testing. Looks good, i might as well just flash it then.
Click to expand...

Just so you know, some peeps are having issues with the Strix BIOS, not sure why. It's working quite well for me under water.


----------



## KedarWolf

Quote:


> Originally Posted by *Benny89*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SlimJ87D*
> 
> I actually never have debt. I pay it off at the end of every month so no interest is charged. And you need a CC to build a credit score, might as well get a good one. You should learn about them.
> 
> I get FREE points, warranty extension, insurance, etc. A debit card won't get you that, there might be some that do but I highly doubt it.
> 
> For the next 3 months my gpu can be lost or stolen and I'll get a new one. That's another perk.
> 
> 
> 
> Those are actually a very good perks and points. I will look into that in my bank. Thanks
Click to expand...

I flashed it like this.

http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe/0_20


----------



## pez

Quote:


> Originally Posted by *Sh0rtcut*
> 
> Hey,
> 
> First post for me!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I need a second opinion here... As many of you know, the 1080 ti costs around $900 here in Europe
> 
> 
> 
> 
> 
> 
> 
> BUT I am going to the US next week, to visit some family and I want to get myself a new GPU!
> 
> We are only there for one week, (next week) and as of right now most of the 1080ti out there are the FE edition, the only Aftermarket model I could find in stock is the Gigabyte Gaming OC.
> 
> Do any of you own this model, or can you recomend another model? I currently own a Nvidia 1070 FE, but I think the FE models is a little too noisy for me, and as you can see the Gigabyte model costs the same as the FE so why not get that one over the FE
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GIGABYTE GeForce® GTX 1080 Ti Gaming OC 11G
> 
> This is the card:
> http://www.gigabyte.us/Graphics-Card/GV-N108TGAMING-OC-11G#sp
> 
> Thanks


This card should be pretty solid, actually. I think most people are avoiding it due to its garish color scheme and the fact that it does not have a backplate... but in the end, that doesn't really matter. The cooler on the 1080 G1 was very good, so I wouldn't be afraid to buy it... of course, only if you do like the color scheme







.
Quote:


> Originally Posted by *Benny89*
> 
> Hi Mate.
> 
> What you are doing is very risky in terms of hardware, let me tell you why:
> 
> 1. Card you have your eyes on has not been reviewed even once. You buy it on your own risk. I would not buy not reviewed card.
> 2. Buying card in US will give you a lot of trouble if for example you will want to return it. Here in EU if you buy card you have 2 weeks to return it without any reasons (of course only through online order). In US the law is not that generous and you have to many times pay restock fees etc. Also judging from your post you will buy it directly from a store in US, yes? If so you will be unable to return it.
> 3. If you return to EU and something happens to card (or you want to return it) you can't go through store support to handle your complaint/warranty since it is located in US, and you would have to send them card (that means shipped HUGE cost, TAXes etc). Or you have to try directcly Gigabyte Support.
> 4. Card bought in US is under US warranty law, and everybody knows that in term of warranty (in EU 2 years is always guarantee), return or complaint it is easier in EU. *Also in EU there are no restocking fees and shipp cost* when you are under warranty - store/supplier pays for everything.
> 5. Buying in EU store (best in some local store via online order) means that if your card has any flaws etc. You just pack GPU, go to store, give it to them and they handle everything (shipping, contact with supplier, repair/replacement etc.). You just sit and wait.
> 
> I would not buy THAT expensive hardware outside of my *continent* to be honest


Gigabyte RMA is serial-based, so country of origin should not matter. The SKUs are the same, so I don't see what you're ultimately getting at here.
Quote:


> Originally Posted by *Benny89*
> 
> Or you just don't need credit card and you don't create for yourself debt to pay
> 
> 
> 
> 
> 
> 
> 
> . You don't need it because law is protecting you as a consumer.
> 
> Simple debit card is good enough to pay for everything. At least in EU of course.


Quote:


> Originally Posted by *SlimJ87D*
> 
> I actually never have debt. I pay it off at the end of every month so no interest is charged. And you need a CC to build a credit score, might as well get a good one. You should learn about them.
> 
> I get FREE points, warranty extension, insurance, etc. A debit card won't get you that, there might be some that do but I highly doubt it.
> 
> For the next 3 months my gpu can be lost or stolen and I'll get a new one. That's another perk.


This... do people not understand that CCs are used to build credit, and of course to rack up free points? I'm the same as you: I buy my stuff and pay it off, so I never see interest. And depending on your spending, you're getting at least 10, 20, 30 bucks a month in cashback. It's almost silly not to use a CC.


----------



## Benny89

Quote:


> Originally Posted by *pez*
> 
> Gigabyte RMA is serial-based, so country of origin should not matter. The SKUs are the same, so I don't see what you're ultimately getting at here.


In short? Better to pay $100 more to save yourself any possible problems by buying on your own continent, and in your own country if possible. Things are faster and more reliable, and you are familiar with your local law. No surprises. That is all.

I only buy from US if said product is not available in EU.


----------



## Nico67

Got an EK Titan block on my Asus 1080 Ti FE and it's working very well.

Stock, air cooled at 120% power, it would boost to 1911 max but run up to 84 degrees and throttle back to around 1838. Kinda expected, since it was also running at the full 120%.

Stock, water cooled at 120% power, it boosts to 1911 and stays there.









+104 gives 2012
+137 gives 2038 max, power only 110%, temp 25C with the odd spike to 26C
+154 gives 2063 max, power only 114%, temp the same
+165 gives 2076 max, power only 117%, temp the same
+182 gives 2088 max, power and temp haven't spiked any higher
+193 gives 2101 max, power and temp haven't spiked any higher
+204 gives 2114. The driver crashed here, but just adjusting the core clock starts binning max volts at 1.043; adding +25 on voltage got it back to 1.050 and it passed the benchmark. Power and temp still haven't spiked any higher.
+216 gives 2126. It crashed again due to voltage binning, so I left it there for my first play under water.
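Side note for anyone mapping offsets to clocks: Pascal's GPU Boost moves in roughly 13 MHz bins, which is why the offsets above don't line up 1:1 with the resulting clocks. A rough sketch of the idea; the bin table is just reconstructed from the clocks reported in this post (not an official list), and `effective_clock` is a hypothetical helper:

```python
# Hypothetical bin table reconstructed from the clocks quoted above.
BOOST_BINS = [1911, 2012, 2025, 2038, 2050, 2063, 2076, 2088, 2101, 2114, 2126]

def effective_clock(stock_boost: int, offset: int) -> int:
    """Snap (stock boost + offset) down to the highest bin it reaches."""
    target = stock_boost + offset
    reachable = [b for b in BOOST_BINS if b <= target]
    return max(reachable) if reachable else BOOST_BINS[0]

# e.g. a +104 offset on a 1911 stock boost lands in the 2012 bin,
# matching the "+104 gives 2012" entry above.
```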









Admittedly I was only running the in-game benchmark from "The Division", and it is pretty short, but it's a pretty graphics-intensive game and it was running at full GPU usage.

Will play around with the AB graph tomorrow to get the volt bins a bit better. Before someone asks, it's on chilled water set at 20C, which is why the GPU temps are so low. Being colder also has a great effect on power efficiency, which is why power usage drops so much. I think I would still need a better BIOS if I wanted to go for max volts, though.


----------



## Addsome

Quote:


> Originally Posted by *SlimJ87D*
> 
> Are you running at 2114Mhz?
> 
> Edit: wait a minute... You're only at 120%? Brah, I'm running at 330 Watts... That's like 132% relative to you... Unless if you're doing 120% on the partner bios, I don't see how that's possible, it would be 360 Watts O_O.
> 
> At stock bios with OC and 120% slider, I ran around 42C to 45C.


Ya, for some reason even at 1.075V I would hit the 120% power limit with the Asus partner card BIOS flashed on my FE. But tbh there were barely any gains, and for now I'm back on the stock BIOS, though I might give the MSI one a try.


----------



## pez

Quote:


> Originally Posted by *Benny89*
> 
> In short? Better to pay 100$ more to save yourself any possible problems and having something bought on your continent and country if possible. Things are faster, more reliant and you are familliar with your local law. No supprises. That is all.
> 
> I only buy from US if said product is not available in EU.


Well sure, but this would only be the case if the warranty was carried out via invoice and registration. Unless Gigabyte has changed it recently, serial RMAs show no discrimination by country.


----------



## guttheslayer

Quote:


> Originally Posted by *Madness11*
> 
> Guys hello. Who have here Gtx 1080ti Aourus xtreme ? Tell me please max OC


I am using the Aorus GTX 1080 Ti and it's able to hold 2050 MHz on its own. Temps are high, but once you set the fan profile properly it's good.


----------



## RavageTheEarth

Quote:


> Originally Posted by *guttheslayer*
> 
> I am using Aorus GTX 1080 Ti and able to hold on its own for 2050 MHz. Temps is high but when u set the fan profile properly it is good.


Good to know! Do you have the Xtreme? I have the regular Aorus 1080 Ti on the way to replace my 980 Ti. I don't mind setting 100% fan until EK releases the waterblock for this card. Can't wait to try the card out, even though it seems the Aorus's temps aren't that great. Not a big problem since it's going under water anyway.


----------



## pez

Hmmm, where are people seeing temp results? I remember reading something about someone getting results only to find out the fan wasn't cutting on after 60C like it should have. If the previous cards are any indication, the cooler should perform along the lines of the MSI one.


----------



## c0ld

Quote:


> Originally Posted by *guttheslayer*
> 
> I am using Aorus GTX 1080 Ti and able to hold on its own for 2050 MHz. Temps is high but when u set the fan profile properly it is good.


I haven't even set up a fan profile on my Aorus and it stays at 70-72C at around 58% fan speed. This thing is amazingly quiet!


----------



## Sh0rtcut

Quote:


> Originally Posted by *Benny89*
> 
> Hi Mate.
> 
> What you are doing is very risky in terms of hardware, let me tell you why:
> 
> 1. Card you have your eyes on has not been reviewed even once. You buy it on your own risk. I would not buy not reviewed card.
> 2. Buying card in US will give you a lot of trouble if for example you will want to return it. Here in EU if you buy card you have 2 weeks to return it without any reasons (of course only through online order). In US the law is not that generous and you have to many times pay restock fees etc. Also judging from your post you will buy it directly from a store in US, yes? If so you will be unable to return it.
> 3. If you return to EU and something happens to card (or you want to return it) you can't go through store support to handle your complaint/warranty since it is located in US, and you would have to send them card (that means shipped HUGE cost, TAXes etc). Or you have to try directcly Gigabyte Support.
> 4. Card bought in US is under US warranty law, and everybody knows that in term of warranty (in EU 2 years is always guarantee), return or complaint it is easier in EU. *Also in EU there are no restocking fees and shipp cost* when you are under warranty - store/supplier pays for everything.
> 5. Buying in EU store (best in some local store via online order) means that if your card has any flaws etc. You just pack GPU, go to store, give it to them and they handle everything (shipping, contact with supplier, repair/replacement etc.). You just sit and wait.
> 
> I would not buy THAT expensive hardware outside of my *continent* to be honest


Thanks for the advice regarding the support. I am well aware of how the support works; this is not the first time I've bought from the US and brought it to the EU. I have not previously had any problems contacting support for any of these products.

My question wasn't really about the US vs EU warranty, but more about whether anyone had any info on Gigabyte GPUs and whether they are any good.
Quote:


> Originally Posted by *Dasboogieman*
> 
> Check your EU consumer laws. You may very well find it is worth it to buy in your home country.
> For example:
> I pay roughly 10-20% more on stuff in Australia but I have full statutory 2 years warranty regardless of manufacturer policy, if the manufacturer is defunct then the issue is taken up with the shop I bought the product at.
> 
> Basically, this entitles me to full protection, even bypassing "warranty void if removed stickers". In my opinion, these protections far outweigh any cost difference but YMMV.
> 
> As for the specific model the question really boils down to whether you intend to go Water/AIO cooling later. If you don't intend to hard mod then go for the ones with custom cooling. However, it is recommended to wait a bit so we can get more info on the specifics of a custom card.


Thank you, for the advice.


----------



## Benny89

Quote:


> Originally Posted by *Sh0rtcut*
> 
> Thanks for the advice regarding the support. I am well aware of how the supporting works, this is not the first time I buy from the US and bring it to the EU. I have not previously had any problems with contacting the support, for any of these product.
> 
> My question did not really go about the US vs EU warranty, but more regards if anyone had and info regarding Gigabyte GPU's and are they any good.
> Thank you, for the advice.


Ok, no problem then! Well, the card has not been reviewed yet. It also does not have a backplate, so not many people are interested in it. If I were you I would try to snag a different 1080 Ti in the US. Maybe by the time you actually visit your family, more brands will have restocked. I would aim for the ROG, Gaming X, or maybe the Aorus Xtreme.

I would not buy a card without proper reviews first.


----------



## pez

Quote:


> Originally Posted by *c0ld*
> 
> I havent even set up a fan profile on my Aorus and stays at 70-72c at around 58% fan speed this thins is amazingly quiet!


Glad to hear this as this is in line with how last gen cards did. Any coil whine from your unit?


----------



## Benny89

Quote:


> Originally Posted by *c0ld*
> 
> I havent even set up a fan profile on my Aorus and stays at 70-72c at around 58% fan speed this thins is amazingly quiet!


Good to hear. Maybe a few people got bad sample units, and that's where those high-temp user reports come from.

Actually, so far all the AIBs are really quiet. They did a great job on this card.


----------



## Sh0rtcut

Quote:


> Originally Posted by *Benny89*
> 
> Ok, no problem then! Well, card was not yet reviewed. Also it does not have backplate so not many people are interested in it. If I were you I would try to snag different 1080Ti in US. Maybe until you actually go visit family, more brands should have restock. I would AIM for ROG, GAMING X or maybe Aorus Xtreme.
> 
> I would not but card without proper reviews first.


Alright, I will cool it for the next few days and hope to see some restock on the Aorus.
Why should people not be interested because of the "missing" backplate? As far as I know it has no real function; it is only there for looks, like V1Tech backplates...


----------



## Benny89

Quote:


> Originally Posted by *Sh0rtcut*
> 
> Allright, I will cool it for the next few days and hope to see some re-stock on the Aorus.
> Why should people not be interested because of the "missing" backplate? As far as I know it has no real function, it is only there for looks, like V1Tech backplates...


People buying AIBs are similar to people buying AIOs, tempered glass cases, etc.







We love good looking shinies inside our RIGs









Considering everything and their mothers today is RGB, looks have become as important a factor in PC hardware as performance. If you want to win the market, you need both looks and performance. And RGB, of course.

This is the evolution of the PC from a work/gaming station to a piece of furniture inside your house that you want to look good. Hell, people repaint and redesign WHOLE rooms just for their gaming rigs. Looks are super important today.









Did I mention RGB?


----------



## mshagg

Gigabyte Extreme Gaming Bios is live on Techpowerup:

https://www.techpowerup.com/vgabios/190959/gigabyte-gtx1080ti-11264-170331

Get a load of the numbers - 250W target TDP with 50% adjustment, netting *375W.*

Not sure I'm brave enough to flash that on an FE lol.


----------



## pez

Unless that thing somehow holds 2.2GHz with some nice gains to back it up, it would be so wasteful to use that much power.


----------



## mshagg

1632 boost clock in OC mode; it looks like it's king of the mountain among the partner cards. Although the BIOS uploaded states 1608, which is the gaming mode.

Does this thing have a BIOS switch for the two modes, or is it done via software?

Have a go at this guy's video - pegged at 2062 looping heaven even with 74 degree core temps.






For out-of-the-box performance that's pretty impressive. I don't think my FE holds that looping Heaven under water thanks to the power limit (edit: scratch that, just tested and it holds 2075). I would have expected temperature throttling at 74 as well.


----------



## evassion

Some hours ago I posted about this video. In the comments he wrote that he crashed at 2025, and temps at 19xx weren't low with this card.


----------



## Slackaveli

Quote:


> Originally Posted by *pez*
> 
> Haha oh I'm not bothered
> 
> 
> 
> 
> 
> 
> 
> . And yeah...$699.99 + tax + $160 AIO specific cooler...or $720 for the AIB that will get a waterblock from EKWB (they run $130 IIRC)....and now we're at $10 of savings
> 
> 
> 
> 
> 
> 
> 
> .


Much better use for those tax dollars than giving them to Trump to vacay in Mar-a-Lago every weekend.
Quote:


> Originally Posted by *feznz*
> 
> I guess in my lan party group I will now be King of GPU and monitor. we are having a lan night in a few weeks and I will thinking........ bad thoughts you guys put in my head friends are friends even if they are peasants.
> This card will go under water I don't have a full cover but it will slip in my current water loop with universal blocks I don't really want to do this but I am sure it will be a few more weeks before the blocks become available.
> Nice I almost got the EVGA with water block when Computer lounge had their big sale I was hanging for a card I didn't want the EVGA would have been $1450ish including water block, but the ETA for the strix was still 3-4 weeks away but on the online check out someone bet to it and glad they did I really wanted the Asus 1080Ti Strix
> I might add I swore I would never deal with PlayTech again but when it came to getting this card or wait........... I will just say they have not had 4 years of my business.
> you right there I been looking at my 2840x1600 LG 38" and GTX 770 SLI to tease me I could play some games... that are 2 years old now.


Oh, I love that monitor, as I know you MEANT to say 3840x1600 WQHD+. So nasty: the horizontal res of 4K, the 21:9 aspect ratio, the extra pixels on the vertical... that thing is basically widescreen 4K. Or 3.5K lol, giving up a few pixels but gaining width and framerate. And it's HUGE. I am jello now!
Quote:


> Originally Posted by *guttheslayer*
> 
> Wildland recon make my 1080 ti beg for mercy.
> 
> Cant believe i cannot run 60 fps stable for 1440p


Hmmm. What CPU? I get ~45-50 in 4K.
Quote:


> Originally Posted by *MURDoctrine*
> 
> Yeah it is a CPU hog. It keeps my 4770k at 75-85% on 1080p 60hz. I had to up the resolution scaling just to throw more load onto the GPU. Its the only game to let me see my CPU bottleneck without just letting the 1080ti go unrestricted on the FPS.


you know what you COULD do with a small amount of cash...


----------



## Slackaveli

Quote:


> Originally Posted by *pez*
> 
> Yeah, I'll install it and give it a try this afternoon for you on the TXP... I'm @ 4.5 on my 4770K, but I am on 21:9 1440p, so maybe I'm getting more GPU usage by default. I'll try both resolutions and report back for ya.


yeah, you have a lot of extra pixels freeing up your cpu in wqhd.
Quote:


> Originally Posted by *SlimJ87D*
> 
> I have an x41. I actually have it blowing into the case but it's blowing at exhaust fans. I did a open case test too.
> 
> It runs great, at the temperatures you're saying. 50% fan speed at 42C to 47C. It's when I run a custom bios at 330 watts it gains an extra 4C or 5C. That kind of makes sense, if you get 4.5C for every 10 watts, at 300 watts you'd be at 45C and 330 watts you'd be a 49.5C.
> 
> I have it in push pull also. I'll probably try different things later today. But turning on the blower fan higher got me at 48C which helped a lot. Must be cooling the midplate. Got to find means of extracting heat in other ways.


Put a 100x50x15mm finned heatsink (or two) on your backplate right above the GPU/VRAM with some Fujipoly; that will help a degree or three. I don't guess the case-snip mod that @VanuSovereignty came up with will help in your case, but maybe a degree or two there, too. When you only need a degree or two more to avoid throttling, you gotta do what you gotta do. Also, did you use Thermal Grizzly Kryonaut? Any other paste and you could be leaving a degree or two on the table there.
Quote:


> Originally Posted by *Sh0rtcut*
> 
> Hey,
> 
> First post for me!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I need a second opinion here... As many of you know, the 1080 ti costs around $900 here in Europe
> 
> 
> 
> 
> 
> 
> 
> BUT I am going to the US next week, to visit some family and I want to get myself a new GPU!
> 
> We are only there for one week, (next week) and as of right now most of the 1080ti out there are the FE edition, the only Aftermarket model I could find in stock is the Gigabyte Gaming OC.
> 
> Do any of you own this model, or can you recomend another model? I currently own a Nvidia 1070 FE, but I think the FE models is a little too noisy for me, and as you can see the Gigabyte model costs the same as the FE so why not get that one over the FE
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GIGABYTE GeForce® GTX 1080 Ti Gaming OC 11G
> 
> This is the card:
> http://www.gigabyte.us/Graphics-Card/GV-N108TGAMING-OC-11G#sp
> 
> Thanks


I looked at it yesterday, but just set up instocktracker for the Asus, MSI Armor, MSI Gaming X, Aorus OC, and Aorus Xtreme. They come in and out of stock throughout the day, usually for 5-10 minutes at a time, so set the alarm and be ready. Have PayPal already logged in and have your decisions about next-day/2-day shipping already made so you can turbo-checkout like it's a race... b/c it is. I KNOW you'll have a shot at the Aorus, the Armor, and the Gigabyte you already saw (which looks decent tbh, but I wanted 8+8 pin), and you could get lucky on an Aorus Xtreme or Asus. If it doesn't happen for you before you are set to leave, you will still at least be able to get that Gigabyte or an FE for $699 as a fallback.
Quote:


> Originally Posted by *Dasboogieman*
> 
> Check your EU consumer laws. You may very well find it is worth it to buy in your home country.
> For example:
> I pay roughly 10-20% more on stuff in Australia but I have full statutory 2 years warranty regardless of manufacturer policy, if the manufacturer is defunct then the issue is taken up with the shop I bought the product at.
> 
> Basically, this entitles me to full protection, even bypassing "warranty void if removed stickers". In my opinion, these protections far outweigh any cost difference but YMMV.
> 
> As for the specific model the question really boils down to whether you intend to go Water/AIO cooling later. If you don't intend to hard mod then go for the ones with custom cooling. However, it is recommended to wait a bit so we can get more info on the specifics of a custom card.


But the Aorus, for example, has a full 4-year warranty. That is twice what you guys are quoting as "good".


----------



## rcfc89

Getting tired of waiting to get a pair of these beasts. Any tips on snagging them on Newegg? Might just wait until August when I do my 7740K build.


----------



## Slackaveli

Quote:


> Originally Posted by *guttheslayer*
> 
> I am using Aorus GTX 1080 Ti and able to hold on its own for 2050 MHz. Temps is high but when u set the fan profile properly it is good.


what are your afterburner settings?
Quote:


> Originally Posted by *pez*
> 
> Hmmm, where are people seeing temp results? I remember reading something about someone getting results only to find out the fan wasn't cutting on after 60C like it should have. If the previous cards are any indication, the cooler should perform along the lines of the MSI one.


It's misinfo based on a video where the n00by YouTuber had no idea what a custom fan profile even was (per the comments) and actually had FanStop illuminated during his run, where he reached 78C. Once he turned the fan on he was in the 60s.
Quote:


> Originally Posted by *mshagg*
> 
> Gigabyte Extreme Gaming Bios is live on Techpowerup:
> 
> https://www.techpowerup.com/vgabios/190959/gigabyte-gtx1080ti-11264-170331
> 
> Get a load of the numbers - 250W target TDP with 50% adjustment, netting *375W.*
> 
> Not sure im brave enough to flash that on a FE lol.


THAT is the reason to buy Aorus. 50%!!!

Comment from a follower on youtube "ELVIS G23 hours ago
Dim2go yeah same here I also talked with gigabyte they upgraded gpu bios so even better temps between 60-65c reviews in Nordic coming soon maybe day or two﻿"


----------



## c0ld

Quote:


> Originally Posted by *pez*
> 
> Glad to hear this as this is in line with how last gen cards did. Any coil whine from your unit?


I didn't have time to play around with it much yesterday, since I cleaned the case and erased any leftover AMD drivers.

I tested Witcher 3, GTA V, Squad, and PlayerUnknown's Battlegrounds, everything maxed out with 8x MSAA @ 3440x1440: no coil whine.

Left Unigine Valley running while I did other stuff and temps were really nice.

What's a program that might be able to stress it for coil whine? FurMark?
Quote:


> Originally Posted by *Benny89*
> 
> Good to hear, maybe few people got bad sample units, and there we have those high temps users reports.
> 
> Actually so far all AIBs are really quiet. They did great job on this card so far.


How high were they reporting? I haven't even tried an OC, just set the power slider to 125% and hit 2.0 GHz core.

It's amazing; I jumped from a 780 and it's more than double the performance.


----------



## Slackaveli

Quote:


> Originally Posted by *rcfc89*
> 
> Getting tired of waiting on getting a pair of these Beast. Any tips on snagging them on Newegg? Might just wait until August when I do my 7740k build.


right above you, bro


----------



## RavageTheEarth

I'll be receiving my Aorus (not the Xtreme edition) next Wednesday and will post some results on air; when the waterblock is released I'll post results under water as well. I was able to snag one when I woke up on the day of release. Luckily I wasn't one of the people who stayed up until midnight trying to grab one, only for them to go out of stock immediately. When I woke up from my nice 8 hours they had just put the Aorus back in stock, so I instinctively grabbed one. Should be a nice upgrade from my 980 Ti considering I'm using a 1440p 144Hz G-Sync monitor.


----------



## Benny89

Quote:


> Originally Posted by *c0ld*
> 
> How high were they reporting? I havent even tried OC just set the power slider to 125% and hit 2.0GHz core.
> 
> It's amazing I jumped from a 780 and its more than double performance.


People were reporting 78C without OCing.

The Xtreme has a 375W power limit, but IMO that won't help with a higher overclock on air or water. It hits higher clocks without OCing, but so far all 1080 Tis have a soft cap at around 2050 MHz and a hard cap at around 2100 MHz.


----------



## rcfc89

Quote:


> Originally Posted by *Slackaveli*
> 
> right above you, bro


You da man.............repped


----------



## Slackaveli

I concur with those temps (my experience almost exactly) on your FE.
Quote:


> Originally Posted by *Benny89*
> 
> People were reporting 78C without OCing.
> 
> Xtreme has 375W power limit but that won't imo help with higher overclock on air and water. It hits higher without OCing but so far all 1080Ti have soft cap at around 2050mhz and hard cap at around 2100mhz.


It's misinfo based on a video where the n00by YouTuber had no idea what a custom fan profile even was (per the comments) and actually had FanStop illuminated during his run, where he reached 78C. Once he turned the fan on he was in the 60s. This card cools better than any other besides the Asus atm, and does it silently. The Asus is the champ there, but it is the most expensive.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> People were reporting 78C without OCing.
> 
> Xtreme has 375W power limit but that won't imo help with higher overclock on air and water. It hits higher without OCing but so far all 1080Ti have soft cap at around 2050mhz and hard cap at around 2100mhz.


Quote:


> Originally Posted by *rcfc89*
> 
> You da man.............repped


Hope it helps, bud. Worked for me: 2 hours of stalking on and off with the alarm set and I was good to go on the Aorus. The first shopping cart failed b/c I wasn't signed into PayPal fast enough. Second time was a go: 5-second checkout, 2-day shipping selected.

dang, i swear im trying not to double post :/


----------



## Alwrath

Honestly guys, this really is the best cooler Nvidia has come out with for a video card. After much testing, all you have to do is set up a good fan profile and you're set. I will admit the stock fan profile is one of the worst I've ever seen for a gaming card, so it's very important to set up your own. I honestly don't mind the fan at 100%; sure, I can hear it, but it's no big deal, especially with a gaming headset on.

I will never understand people's complaints about fan noise. You want to complain about a loud air cooler? Try some old Radeon 290s or 4870s at 100% fan speed and then tell me the Nvidia cooler is loud. Those cards sounded like a jet taking off; this thing is tame in comparison.

I'm happy with the FE. Got it direct from EVGA's website, so no tax. With shipping I bought it for $720, have had it since 4 days after launch day, and I'm loving every minute of this beast. With no added voltage it fluctuates between 1950-2000 depending on the game, and my temp never goes above 60C.
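For anyone wondering what a custom fan profile actually does under the hood: it's just a handful of (temperature, fan %) points with linear interpolation in between, the same idea as Afterburner's curve editor. A minimal sketch; the curve points are made up for illustration, not a recommendation:

```python
# Illustrative (core temp C, fan %) points; tune to taste.
CURVE = [(30, 30), (50, 50), (65, 75), (75, 100)]

def fan_speed(temp_c: float) -> float:
    """Linearly interpolate the fan % for a given core temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

# e.g. at 40C this curve sits halfway between the 30% and 50% points.
```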
























If I do upgrade with EVGA stepup, it will be for a Classy or Kingpin.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, you have a lot of extra pixels freeing up your cpu in wqhd.
> put a 100mmx50x15mm (or 2) finned heatsinks on your backplate right above the gpu/vram with some Fujipoly; that will help a degree or three. I dont guess the case snip mod that @VanuSovereignty came up w will help in your case but maybe a degree or two there, two. When you only need a degree or two more to avoid throttle you gotta do what you gotta do. Also, did you use thermal grizz kyro? Any other paste and you could be leaving a degree or two on the table there.
> i looked at it yesterday, but, ust set up instocktracker to the asus, msi armor, msi gamingX, Aorus OC and Aorus Xtrme. They come in/out of stock throughout the day. Usually for 5-10 minutes at a time so set the alarm and be ready. have paypal already logged in and have your decisions about next day/2 day ship already made so you can turbo checkout like it's a race... b/c it is. I KNOW you'll have a shot at the Aorus, Armor, the giga you already saw (which looks decent tbh, but i wanted 8+8pin), and could get lucky on an extreme aorus or asus. If it doesnt appen for you before you are set to leave you will still at least be able to get that giga or an FE for $699 as a fallback.
> but the aorus for example has a full 4 year warranty. that istwice what you guys are quoting as "good".


I used Gelid Extreme. I'm going to try to do a test with the radiator outside of my case.

I get the feeling it's just the extra 30 watts of heat from the BIOS. An extra 30 watts can make quite a difference; it's like trying to hold onto a lit 30-watt light bulb :O
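For scale, the figures quoted earlier in the thread (about 45C at 300W and 49.5C at 330W on that AIO) work out to roughly 0.15C per extra watt. A quick back-of-envelope sketch; that rate is an assumption taken from one setup, not a constant for every loop:

```python
# Thermal scaling estimated from the figures quoted earlier in the thread.
DEG_PER_WATT = (49.5 - 45.0) / (330 - 300)  # roughly 0.15 C per watt

def extra_temp(extra_watts: float) -> float:
    """Estimated extra core temperature from extra board power."""
    return extra_watts * DEG_PER_WATT

# The "extra 30 W" from the higher-power BIOS then lands around +4.5C,
# in line with the 4C-5C gain reported above.
```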


----------



## GRABibus

Quote:


> Originally Posted by *RavageTheEarth*
> 
> I'll be receiving my Aorus (not the Xtreme edition) next Wednesday and will post some results on air and when the waterblock is released I'll post results under water for my card. I was able to snag one the when I woke up on the day of release. Luckily I wasn't one of the people who stayed up to midnight to try to grab one only for them to go out of stock immediately. Luckily when I woke up from my nice 8 hours they had just put the Aorus back in stock so I instinctively grabbed one. Should be a nice upgrade from my 980 Ti considering I'm using a 1440p 144hz G-Sync monitor.


Does someone know if Gigabyte will release the Xtreme Gaming Waterforce series for 1080Ti ?


----------



## Mato87

Quote:


> Originally Posted by *RavageTheEarth*
> 
> I'll be receiving my Aorus (not the Xtreme edition) next Wednesday and will post some results on air and when the waterblock is released I'll post results under water for my card. I was able to snag one the when I woke up on the day of release. Luckily I wasn't one of the people who stayed up to midnight to try to grab one only for them to go out of stock immediately. Luckily when I woke up from my nice 8 hours they had just put the Aorus back in stock so I instinctively grabbed one. Should be a nice upgrade from my 980 Ti considering I'm using a 1440p 144hz G-Sync monitor.


I game at 1080p 144Hz, and I felt the jump from the GB 980 Ti Xtreme Gaming to the Asus 1080 Ti Strix OC in every single game I have tested so far.


----------



## JedixJarf

Quote:


> Originally Posted by *Mato87*
> 
> I game on 1080p 144 hz and I felt the jump from gb 980 ti xtreme gamer to asus 1080ti strix oc and every single game I have tested so far.


I certainly would hope so : P. BTW did you feel satisfied by the quantum break 1080 ti experience lol?


----------



## c0ld

Quote:


> Originally Posted by *Benny89*
> 
> People were reporting 78C without OCing.
> 
> Xtreme has 375W power limit but that won't imo help with higher overclock on air and water. It hits higher without OCing but so far all 1080Ti have soft cap at around 2050mhz and hard cap at around 2100mhz.


Yikes! Any good guides for OCing with GPU Boost 3.0?

I'm gonna play around with it, but hitting 2.0 GHz while staying really quiet is a win for me. My case fans are slightly louder than my Aorus while gaming lol.

I would have loved a bit more headroom; most hit the wall at 2.1 GHz, right?


----------



## dunbheagan

Hey guys, today EK released their waterblocks for the 1080Ti FE:

https://www.ekwb.com/news/ek-releasing-full-cover-water-blocks-nvidia-gtx-1080-ti-graphics-cards/

Sorry if this has been posted before here, this thread is way too busy to read all posts...


----------



## Mato87

Quote:


> Originally Posted by *JedixJarf*
> 
> I certainly would hope so : P. BTW did you feel satisfied by the quantum break 1080 ti experience lol?


Well, yes and no. I finally played through the game without it looking blurry and ugly, but I had to turn down a few settings, mainly the volumetric lighting, to achieve a stable 60 even during combat. The game is an unoptimized mess, to be fair. But I still love it; it's like a blend of Max Payne and Alan Wake, and some of the effects are absolutely breathtaking.


----------



## kevindd992002

Quote:


> Originally Posted by *Slackaveli*
> 
> ah, ok. turn on your gpu power usage, too.
> 
> From your newer pic, i wouldnt call that bottlenecked per say but it is really close, even with all those cores.


How is it bottlenecked "really close"? Is it only because one or more CPU cores are close to 100% usage?
Quote:


> Originally Posted by *pez*
> 
> Well sure, but this would only be the case if the warranty was carried out via invoice and registration. Unless Gigabyte has changed it recently, serial RMAs show no discrimination by country.


When I bought my Gigabyte GTX 670 in the US and had it shipped here to the Philippines, I asked Gigabyte Taiwan if it had an international warranty. They outright said no. So what do you mean by serial-based RMA?


----------



## Slackaveli

Any of you three+ guys with the Aorus checked the thermal paste? Curious if the Kryonaut mod is worth the trouble on this one, but I can't even find a teardown video of the Aorus, not even for the 1080...


Quote:


> Originally Posted by *kevindd992002*
> 
> How is it bottlenecked really close? Is it only because one or more cpu cores is close to 100% usage?


In a nutshell, yes. Games are coded for (layman's terms) 1 core (rare, older games), 2 cores, 4 cores, 4 cores with HT, 8 cores, etc. (rare!!), so it's all game dependent. That's why the 6950K doesn't beat the 6700K in gaming (most games) except when all those cores are used. Older CPUs have lower single-threaded performance and lower IPC, so if you get stuck in a game utilizing mainly 2 cores (like above) you could be bottlenecking when those cores hit 99% usage, even though the average CPU usage may say 55%. That's why I always have all 8 threads on my OSD. It's not the end of the world if your GPU finally hands the bottleneck to your CPU, but it is preferred to have the GPU be the bottleneck because of gaming smoothness (amongst other things).

One thing you could try in addition to OCing your CPU is turning off Hyper-Threading, as it is rarely utilized and a lot of games will give a little better CPU performance with it off.
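The "pegged core hiding behind a low average" point can be sketched numerically. A minimal illustration with hypothetical per-thread usage samples, not measurements from any actual game:

```python
def cpu_summary(per_core_usage):
    """Return (average %, busiest core %) for one sample of per-core usage.

    A game pinned to two threads can max those cores while the package
    average still looks comfortable -- the bottleneck case described above.
    """
    avg = sum(per_core_usage) / len(per_core_usage)
    return avg, max(per_core_usage)

# e.g. a 2-thread-heavy game on an 8-thread CPU (made-up numbers):
sample = [99, 97, 35, 30, 20, 15, 25, 18]
avg, peak = cpu_summary(sample)
# average is ~42%, yet two cores sit near 99% -> effectively CPU-bound
```

This is why watching all threads on the OSD beats watching the single "CPU usage" figure.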


----------



## JedixJarf

Quote:


> Originally Posted by *Slackaveli*
> 
> any of you three + guys with the Aorus checked the thermal paste? Curious if Kyro mod is worth the trouble on this guy but i cant even find a teardown video of the Aorus, not even for the 1080...


My Gigabyte 1080 G1 OC came with the paste super caked on; it was all over the place when I took the stock cooler off.


----------



## Slackaveli

Quote:


> Originally Posted by *JedixJarf*
> 
> My Gigabyte 1080 G1 OC came with the paste super caked on, it was all over the place when I took the block cooler off.


Word, I had a feeling. How many screws to get to it? What's the best way to do it? I can't find a video, and I'm not sure how close the coolers are between the two. I'm sure I could figure it out, but I'd rather know the fastest/easiest/safest way first. Obviously I need to find somebody who has removed the cooler on one of the 3-fan coolers with the offset center fan, the Gigabyte Xtreme Gaming 1080 maybe. But all I can find is the WaterForce teardown, which helps nada.

Also, did it have the screw sticker?


----------



## rt123

You shouldn't need a teardown video to repaste a card, man. On most GPUs you only need to remove the 4 screws around the core and unplug the fan connector.
It's pretty straightforward.


----------



## kevindd992002

Quote:


> Originally Posted by *Slackaveli*
> 
> any of you three + guys with the Aorus checked the thermal paste? Curious if Kyro mod is worth the trouble on this guy but i cant even find a teardown video of the Aorus, not even for the 1080...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How is it bottlenecked really closed? Is it only because one or more cpu cores is close to 100% usage?
> 
> In a nutshell. games are coded for (layman's) 1 core (rare, older games) , 2 cores, 4 cores, 4 cores allowing H/T, 8 cores, ... etc (rare!!) so it's all game dependent. That's why 6950k doesnt beat 6700k in gaming (most games) except when all those cores are used. Older cpus have lower single threaded performance and lower IPC , so if you get stuck in a game utilizing mainly 2 cores (like above) you could be bottlenecking when they hit 99% usage, evn though the cpu average usage may say 55%. That's why I always have all 8 threads on my OBD. It's not the end of the world if your gpu finally gives the bottleneck to your cpu, but it is preferred to have the gpu be the bottleneck because of gaming smoothness (amongst other things).
> 
> One thing you could try in addition to OCing your cpu is to turn off hyperthreading as it is rarely utilized and a lot of games will give a little better cpu performance with it off.


Got it.

Btw, why do you call repasting with Kryonaut a "mod"? Curious


----------



## KingEngineRevUp

Quote:


> Originally Posted by *kevindd992002*
> 
> Got it.
> 
> Btw, why do you call repasting with Kryonaut a "mod"? Curious


Technically you still are modifying the card. You no longer have stock or OEM paste on it so your results aren't a reference for someone that wants to know let's say thermal performance of the average FE.


----------



## Slackaveli

Quote:


> Originally Posted by *rt123*
> 
> You shouldn't need a video teardown to repaste a card man. Most GPUs you only need to remove the 4 screws around the core & unplug the fan plug.
> Its pretty straight forward.


It looks like it has several different screw types; I'm just hoping for a 4-, 6-, or 8-screw solution. I'm sure I'll figure it out, I just like to be prepared. It'll probably be 2-4 side screws and 4 backplate screws. I've only done it 3 times; I'm not a long-time watercooler like a lot of y'all.

Judging by this I may need to remove a BUNCH of screws. The Aorus has a huge plate covering all the main components.

@10:53 is how I think it will look at the plate level.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Technically you still are modifying the card. You no longer have stock or OEM paste on it so your results aren't a reference for someone that wants to know let's say thermal performance of the average FE.


exactly. It's a thermal paste mod. It brings gains.


----------



## alucardis666

Hey guys I think I'm gonna flash that gigabyte bios.


----------



## madmeatballs

So which bios is best to flash to FE? And what are some drawbacks with bios flashing? Like does the MSI one also disable a DP port?


----------



## Slackaveli

I've been wondering who would be first. I had my money on you, but Bravo was up there too.

Dat 150% power limit....
Quote:


> Originally Posted by *madmeatballs*
> 
> So which bios is best to flash to FE? And what are some drawbacks with bios flashing? Like does the MSI one also disable a DP port?


I think the MSI is 3 DP / 1 HDMI like the Founders. The Aorus has 3 DP, 2 HDMI and a DVI, so I'm not sure.

Edit: @alucardis666 maybe watch this first. The power phases are done differently, doublers/quadruplers and whatnot. I'm no engineer, but I think this will be very similar to the PCB on the Aorus, which is what your BIOS would be driving.
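For anyone cross-flashing, one cheap sanity check before pointing nvflash at a ROM file is verifying it even looks like a vBIOS image. A minimal sketch: the 0x55 0xAA prefix is the standard PCI expansion-ROM signature; the filename in the usage comment is hypothetical:

```python
def looks_like_vbios(data: bytes) -> bool:
    # Every PCI expansion ROM image begins with the 0x55 0xAA signature bytes;
    # a file without them is certainly not a flashable vBIOS dump.
    return len(data) >= 2 and data[0] == 0x55 and data[1] == 0xAA

# usage (hypothetical filename):
# looks_like_vbios(open("aorus_xtreme_1080ti.rom", "rb").read())
```

This only rules out obviously wrong files; it says nothing about whether a BIOS suits your board's VRM layout, which is the real risk being discussed here.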


----------



## kevindd992002

Quote:


> Originally Posted by *SlimJ87D*
> 
> Technically you still are modifying the card. You no longer have stock or OEM paste on it so your results aren't a reference for someone that wants to know let's say thermal performance of the average FE.


Quote:


> Originally Posted by *Slackaveli*
> 
> exactly. It's a thermal paste mod. It brings gains.


Ok, that makes sense!


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> Ive been wondering who would be first. I had my money on you but Bravo was up there too.
> 
> Dat 150% power limit....
> i think the msi is a 3dp 1hdmi like founders. aorus has 3 dp and 2 hdmi and a dvi so im not sure


Gigabyte is a no-go for the FE. Boost isn't working like it did with the MSI or ASUS BIOSes.

The card won't take more than 0.800 V while running Heaven, and boost clocks are below stock values.


----------



## GraphicsWhore

Quote:


> Originally Posted by *Alwrath*
> 
> Honestly guys this really is the best cooler Nvidia has come out with for a video card. After much testing, all you have to do is set up a good fan profile and your set. I will admit though the stock fan profile is one of the worst ive ever seen for a gaming card, so its very important to set up your own fan profile. I honestly dont mind the fan at 100%, sure I can hear it, but its no big deal especially with gaming headset on. Honestly, I will never understand peoples complaints about fan noise, you want to complain about a loud air cooler? Try some old radeon 290's or 4870's at 100% fan speed and then tell me the Nvidia cooler is loud. Those cards sounded like a jet taking off, this thing is tame in comparison. Im happy with the FE. Got it direct from EVGA's website so no tax. With shipping I bought it for $720 and have had this thing since 4 days after launch day and im loving every minute of this beast. No voltage it fluctuates between 1950-2000 depending on the game, and my temp never goes above 60C.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If I do upgrade with EVGA stepup, it will be for a Classy or Kingpin.


Never goes above 60 on air? Are you playing solitaire?


----------



## Slackaveli

Quote:


> Originally Posted by *madmeatballs*
> 
> So which bios is best to flash to FE? And what are some drawbacks with bios flashing? Like does the MSI one also disable a DP port?


Quote:


> Originally Posted by *alucardis666*
> 
> Gigabyte is a no go for FE. Boost isn't working like it should with the MSI or the ASUS Bios' did.
> 
> 
> 
> Card won't take more than .800mv while running heaven, and boost clocks are below stock values.


It's because of the way they did the VRMs and the phase switching, like that video I posted shows. Oh well, unlucky. The ASUS PCB has a much more similar methodology to the FE.



Quote:


> Originally Posted by *GraphicsWhore*
> 
> Never goes above 60 on air? Are you playing solitaire?


He didn't claim _that_. But personally, after 2 hours of Battlefield 1 @ 1440p/144 or 4K/60, I level off at ~65C.


----------



## alucardis666

Gigabyte Bios is a no go :-(

GPU-ZSensorLog.txt 286k .txt file





Quote:


> Originally Posted by *Slackaveli*
> 
> it's because of the way they did the vrms and the phase switching like that video i posted shows. Oh well. Unlucky.


Really makes me wanna buy one and see how it runs. Would the hybrid cooler fit the PCB like it does with the reference? I'd guess not...
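As an aside, logs like the GPU-Z sensor file attached above can be skimmed programmatically instead of by eye. A minimal sketch; the column names and comma separator are assumptions about a typical GPU-Z-style export, so adjust them to match the header row of your own log:

```python
import csv
import io

def summarize_gpuz_log(text, clock_col="GPU Clock [MHz]",
                       temp_col="GPU Temperature [C]"):
    """Max core clock and temperature from a GPU-Z-style sensor log.

    GPU-Z pads its CSV fields with spaces, so keys and values are
    stripped before use. Column names here are assumptions.
    """
    rows = [{k.strip(): v.strip() for k, v in r.items()}
            for r in csv.DictReader(io.StringIO(text))]
    return (max(float(r[clock_col]) for r in rows),
            max(float(r[temp_col]) for r in rows))
```

Handy for answering "did it ever throttle / how high did it boost" from a long gaming session without scrolling the whole file.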


----------



## Alwrath

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Never goes above 60 on air? Are you playing solitaire?


DOOM, Crysis 3, Overwatch, League...

100% fan speed, and my case has excellent airflow. My fan profile ramps to 100% once the GPU hits 50C.









Most likely during the summer it will hit 65C max. Heh, I haven't even taken her apart to put on some Arctic Silver ceramic yet. That should lower temps by around 5C, give or take.
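A ramp like that, reaching 100% once the GPU hits 50C, is just linear interpolation between two curve points. A sketch, with the idle endpoint invented for illustration:

```python
def fan_percent(temp_c, idle=(30, 30), full=(50, 100)):
    """Linear fan curve: idle speed at/below the idle temp, 100% at/above full.

    The (30C, 30%) idle point is made up for the example; only the
    100%-at-50C endpoint comes from the post above.
    """
    t0, s0 = idle
    t1, s1 = full
    if temp_c <= t0:
        return s0
    if temp_c >= t1:
        return s1
    return round(s0 + (temp_c - t0) * (s1 - s0) / (t1 - t0))
```

On this curve 40C maps to 65%, which is roughly how curve editors like Afterburner's interpolate between the points you drag.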


----------



## alucardis666

Flashing back to the stock FE BIOS. The others didn't do anything but gimp my fan RPMs or make ports on the card stop working. Not a fault of the BIOSes; they did what they were designed for on their respective cards. I felt confident the Gigabyte one would be best for my particular card, but the VRM changes they made didn't do what I'd hoped. Kinda contemplating selling my card and getting a Gigabyte one now. Hmmm...


----------



## Alwrath

Quote:


> Originally Posted by *alucardis666*
> 
> Flashing back to the stock FE bios. The others didn't do anything but gimp my fan rpms or make ports on the card not function anymore. Not a fault of the bioses, they did what they're designed for for their respective cards. I feel confident that the gigabyte one would've been best for my particular card, but the VRM changes they made to it didn't prove to do what I'd hopped. Kinda contemplating selling my card and getting a Gigabyte one now. Hmmm...


Just do what im doing and use EVGA step up to a Classy or Kingpin, good things come to those who wait


----------



## Slackaveli

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Never goes above 60 on air? Are you playing solitaire?


Quote:


> Originally Posted by *alucardis666*
> 
> Gigabyte Bios is a no go :-(
> 
> GPU-ZSensorLog.txt 286k .txt file
> 
> 
> 
> 
> 
> Really makes me wanna buy one and see how it runs. Would the hybrid cooler fit the PCB like it does with the reference? I'd guess not...


I've had three hybrids in a row, but if it's on air and not throttling... that's all that matters. Folks are hitting over 2GHz at the "stock" OC with fan curves in the 60s C, and are reporting no thermal/power throttling. So this card may even take voltage...







Quote:


> Originally Posted by *alucardis666*
> 
> Flashing back to the stock FE bios. The others didn't do anything but gimp my fan rpms or make ports on the card not function anymore. Not a fault of the bioses, they did what they're designed for for their respective cards. I feel confident that the gigabyte one would've been best for my particular card, but the VRM changes they made to it didn't prove to do what I'd hopped. Kinda contemplating selling my card and getting a Gigabyte one now. Hmmm...


Once I got to that point I thought of the tax and it was over. Curious to see where you end up on this, my clone in the Windy City.


----------



## alucardis666

Quote:


> Originally Posted by *Alwrath*
> 
> Just do what im doing and use EVGA step up to a Classy or Kingpin, good things come to those who wait


That's an option for sure, but I recall someone saying they wouldn't allow FE cards to step up to AIB, if that is *NOT* the case then yea, I can and probably will do that.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Flashing back to the stock FE bios. The others didn't do anything but gimp my fan rpms or make ports on the card not function anymore. Not a fault of the bioses, they did what they're designed for for their respective cards. I feel confident that the gigabyte one would've been best for my particular card, but the VRM changes they made to it didn't prove to do what I'd hopped. Kinda contemplating selling my card and getting a Gigabyte one now. Hmmm...


Quote:


> Originally Posted by *alucardis666*
> 
> That's an option for sure, but I recall someone saying they wouldn't allow FE cards to step up to AIB, if that is *NOT* the case then yea, I can and probably will do that.


That was my understanding, but I've never been in the 3-month window so I've never dug deeper.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> Curious to see where you end up on this, my clone in teh windy city.











Quote:


> Originally Posted by *Slackaveli*
> 
> that was my understanding but ive never been in the 3 month window so ive never dug deeper.


We'll see then


----------



## Kashtan

Quote:


> Originally Posted by *jelome1989*
> 
> That's why I don't trust benchmarks these days...
> Here, the 5775c handily beats the 7700k in most of the tests both at stock and OC speeds, with the 7700k having a 500-800mhz clock lead over the 5775c
> https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,10
> 
> EDIT:
> Another one:
> https://www.purepc.pl/pamieci_ram/test_ddr3_vs_ddr4_jakie_pamieci_ram_wybrac_do_intel_skylake?page=0,18
> The only game where the 7700k beats the 5775c is Crysis 3.


I want to upgrade my CPU for StarCraft 2.
The 1080 Ti definitely won't be the limitation.
But these tests confused me. Until now I was looking at the Kaby Lake 7700K, 7600K and 7350K, and waiting for the 7740K and 7640K.
But...
http://www.pcgameshardware.de/Ryzen-7-1800X-CPU-265804/Tests/Test-Review-1222033/
So where is the truth? Kaby or Broadwell?


----------



## Drakeskull

Quote:


> Originally Posted by *Mato87*
> 
> Good luck holding a playable min framerate in more demanding titles like Battlefield 1 or the new ghost recon. Not to mention basically all the upcoming titles that can utilize 8 cores very efficiently.


On a 2600k @4.4 here, and bf1 ultra at 1440p never dips below 100fps


----------



## KingEngineRevUp

New MSI afterburner available, apparently all 1080 Ti should be unlocked.

http://forums.guru3d.com/showpost.php?p=5412373&postcount=216


----------



## krutoydiesel

Wish I knew they were going to make a 1080Ti specific block when Nvidia released the card. This looks beautiful, and single slot.


----------



## Mato87

Quote:


> Originally Posted by *Drakeskull*
> 
> On a 2600k @4.4 here, and bf1 ultra at 1440p never dips below 100fps


What card? If it's the new one, the GTX 1080 Ti, then high framerates are certainly possible. But I know for a fact that saying it never dips below 100 fps is a straight lie.


----------



## Slackaveli

Quote:


> Originally Posted by *Kashtan*
> 
> For Starcraft 2 i wish make upgrade my CPU.
> 1080Ti definitely will not become a limitation.
> But these tests confused me, until now I looked at Kaby Lake 7700k 7600k and 7350k, and waited 7740k and 7640k.
> But...
> http://www.pcgameshardware.de/Ryzen-7-1800X-CPU-265804/Tests/Test-Review-1222033/
> So where is the truth? Kaby or Broadwell?


On THAT game, if you have easy access to a Z97 board, get the Broadwell 5775C.
The page you quoted/linked has the 5775C at 3.3GHz and it STILL keeps up with Kaby running over 1200MHz higher. It'll hit 4.3 easy, which is over a 30% OC, something Kaby can't match (that high of an OC, %-wise).

**********Guys who bought/are buying the Aorus, or any Gigabyte 1080 Ti I guess: we can get the swag bags like that dude did. Check out the FB page: https://www.facebook.com/AorusOfficial/
"[Review and Win]
If you're the owner of a new #AORUS or #GIGABYTE GeForce GTX 1080 Ti graphics cards, we want to hear your honest feedback. In return, we'll provide you with SURPRISE SWAG to along with your new card!
Here's what you need to do to qualify:
- Leave an HONEST review on Newegg about your AORUS or GIGABYTE GeForce GTX 1080 Ti
- PM us a screenshot of your review and invoice (for verification)
- Sit back and relax while your swag ships out!
Happy reviewing, everyone!
*Purchases up until April 30th are eligible for this offer*
*Offer valid for USA only*"

I already reviewed before it's in hand. I know what I'm getting lol.
Quote:


> Originally Posted by *Kashtan*
> 
> For Starcraft 2 i wish make upgrade my CPU.
> 1080Ti definitely will not become a limitation.
> But these tests confused me, until now I looked at Kaby Lake 7700k 7600k and 7350k, and waited 7740k and 7640k.
> But...
> http://www.pcgameshardware.de/Ryzen-7-1800X-CPU-265804/Tests/Test-Review-1222033/
> So where is the truth? Kaby or Broadwell?


No tests for StarCraft, but here is a hard CPU game:
https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,37 It's one of the few where Kaby keeps up, but everything else wilts.

@krutoydiesel
> Wish I knew they were going to make a 1080Ti specific block when Nvidia released the card. This looks beautiful, and single slot.

eBay is your friend. If I was on water I'd be all over that one. Dang. Sorry for your pain, broski.


----------



## Drakeskull

Quote:


> Originally Posted by *Mato87*
> 
> What card? If it's the new one, the gtx 1080 ti, then it's certainly possible. But I know for a fact you saying it never dips below 100 fps ia straight lie.


It's a 1080 Ti FE; I only have it running at 1936MHz. I'll take some videos in a conquest tonight.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> on THAT game, if you have easy access to a z97 board, get Broadwell 5775c.
> That page you quoted/linked has the 5775c at 3.3ghz and it STILL keeps up with Kaby at over 1200 higher Mhz. It'll hit 4.3 easy which is over 30%oc, something Kaby can't match (that high of an OC %wise).
> 
> **********Guys who bought/are buying Aorus or the gigabyte 1080ti cards of any type i guess, we can get the swag bags like dude did- check out the fb page-https://www.facebook.com/AorusOfficial/
> "[Review and Win]
> If you're the owner of a new #AORUS or #GIGABYTE GeForce GTX 1080 Ti graphics cards, we want to hear your honest feedback. In return, we'll provide you with SURPRISE SWAG to along with your new card!
> Here's what you need to do to qualify:
> - Leave an HONEST review on Newegg about your AORUS or GIGABYTE GeForce GTX 1080 Ti
> - PM us a screenshot of your review and invoice (for verification)
> - Sit back and relax while your swag ships out!
> Happy reviewing, everyone!
> *Purchases up until April 30th are eligible for this offer*
> 
> i already reviewed before it's in hand. i know what im getting lol.
> *Offer valid for USA only*"
> no tests for Starcraft, but here is a hard cpu game here-
> https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,37 It's one of the few Kaby keeps up, but everything else wilts.
> 
> Wish I knew they were going to make a 1080Ti specific block when Nvidia released the card. This looks beautiful, and single slot.
> Ebay is your friend. If I was on water I'd be all over that one. dang. sorry, broski for your pain


Wish I could see 5775C vs 7700K tests at 1440p and 4K.

What core voltage does your 5775C need for 4.3?


----------



## alucardis666

Side note... I'm really gonna miss these winter weather temps.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Wish I could see 5775c vs 7700k at 1440p and 4K tests.
> 
> What is your Core Voltage for your 5775c 4,3?


Well, that's my wall, and I'm at 1.345v. However, these chips can take more volts than a Haswell; 1.25 on Haswell is like 1.32 or so on Broadwell. Anyway, I hold 4.2 at 1.285 (some can hold that at 1.25), and 4.2-4.4 is where you want to be to beat everything else.

Me too, I wish I had a 7700k to test against. I had the 4790k for a while and beat up on it pretty good.


----------



## zipeldiablo

Just got a 1080 Ti back from EVGA; the box was shrink-wrapped. Do you think it's a new card, or the same one re-packed and re-wrapped?

Ordered the block for the 1080 Ti, can't wait!


----------



## krutoydiesel

Quote:


> @Krutoydiesel Wish I knew they were going to make a 1080Ti specific block when Nvidia released the card. This looks beautiful, and single slot.
> Ebay is your friend. If I was on water I'd be all over that one. dang. sorry, broski for your pain


I refuse! I will see what Performance-PCs' return policy is.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> well, that's my wall and Im at 1.345v however these guys can take more volts than a haswell, 1.25 on haswell is like 1.32 or so on broadwell, anyway, i hold 4.2 at 1.285 (some can hold that at 1.25). and 4.2-4.4 is where you want to be to beat everything else. me too, i wish i had a 7700k to test against. I had the 4790k for awhile and beat up on it pretty good.


Yeah, because I am considering selling my 4.9GHz 4790k and getting a 5775c. If I OC it to 4.2/4.3GHz, it will beat my 4790k, correct?


----------



## palote99

Quote:


> Originally Posted by *Mato87*
> 
> What card? If it's the new one, the gtx 1080 ti, then it's certainly possible. But I know for a fact you saying it never dips below 100 fps ia straight lie.


Quote:


> Originally Posted by *Drakeskull*
> 
> On a 2600k @4.4 here, and bf1 ultra at 1440p never dips below 100fps


On a [email protected] here, and BF1 ultra at 1440p never dips below 130. Maybe on Amiens I occasionally see 120ish.

See u in the Battlefield!!!!


----------



## palote99

Quote:


> Originally Posted by *Benny89*
> 
> Yeah, cause I am considering selling my 4.9Ghz 4790k and get 5775c. If I OC it to 4.2/4.3GHZ- it will beat my 4790k, correct?


YES!


----------



## Mato87

Quote:


> Originally Posted by *Drakeskull*
> 
> It's a 1080 ti fe only have it running at 1936mhz. I'll take some videos in a conquest tonight.


Looking forward to it.


----------



## KingEngineRevUp

I might have to get a 5775c one day.


----------



## zipeldiablo

Quick question, by the way: have any of you guys encountered the signal cutting out since the last time I asked?
(Screen going black for one or two seconds, happens randomly.)

It seems the card I just received from EVGA is a new one, and I have the same issue as before the RMA.
My friend who has a GTX 1070 has had the exact same issue since he updated the NVIDIA driver to the latest.
Also, I am connected to a 4K TV over HDMI and he is connected to a 1440p monitor over DisplayPort.
Since I have no issue at all with the same display and same cable on a Maxwell GPU, that points to either a second brand-new faulty GPU (unlikely, and my friend's GPU is not new either) or to the driver itself; most likely the driver.

If it is a common issue, it would be great if everyone concerned could send a bug report to NVIDIA so this gets fixed.


----------



## alucardis666

Quote:


> Originally Posted by *zipeldiablo*
> 
> Quick question by the way, any of you guys encounter signal interrupting since the last time i asked?
> (screen going black for one or two seconds, happens randomly)
> 
> It seems the card i just received from evga is a new one and i got the same issue as before the rma.
> My friend who has a gtx 1070 has the exact same issue since he updated nvidia driver to the latest.
> Also i am connected to a 4k tvscreen with hdmi and he is connected to a 1440p monitor with display port.
> Since i have no issue at all with same display, same cable but maxwell gpu it would point the issue to either a second brand new faulty gpu (unlikely, also my friend gpu is not new) or to the driver directly, most likely the driver.
> 
> If it is a common issue it would be great if all the concerned people could send a bug report to nvidia to see this fixed.


Yes, I have the same issue; it depends on the card's power state and clocks. Certain OCs trigger it more frequently than others. I figured it was instability of the OC, but then did some digging. It should hopefully be fixed in the next NVIDIA driver, and it wasn't present in previous drivers I used.

I've only experienced it when idle, browsing the web or File Explorer windows, never while gaming or watching videos. Hbu?


----------



## zipeldiablo

Quote:


> Originally Posted by *alucardis666*
> 
> Yes, I have the same issue, depends on cards power state and clocks. Certain OCs on the card trigger it more frequently than others, I figureed it was instability of the OC but then did some digging. It should be fixed in the next Nvidia Driver hopefully. And wasn't present in previous drivers I used.


I've only had the last two driver versions, so I can't speak for the others; I've had the issue from the beginning.
Also, my card is not OC'd; I run it at the stock NVIDIA profile because I am waiting for my WC block to arrive.

Please send an email to NVIDIA; the more of us there are, the better the chance this gets fixed in the next release.


----------



## alucardis666

Quote:


> Originally Posted by *zipeldiablo*
> 
> I only had the last two versions so i cannot say for the other drivers since i had the issue from the beginning.
> Also my card is not oc, i run the card with stock nvidia profile because i am waiting for my wc block to arrive.
> 
> Please send an email to nvidia, the more we are the more chance we have for this issue to be fixed with next release


Mind PM-ing me the contact info, along with exactly what you told them?

I've got a few e-mail accounts I can spam them with


----------



## Slackaveli

Quote:


> Originally Posted by *Kashtan*
> 
> For Starcraft 2 i wish make upgrade my CPU.
> 1080Ti definitely will not become a limitation.
> But these tests confused me, until now I looked at Kaby Lake 7700k 7600k and 7350k, and waited 7740k and 7640k.
> But...
> http://www.pcgameshardware.de/Ryzen-7-1800X-CPU-265804/Tests/Test-Review-1222033/
> So where is the truth? Kaby or Broadwell?


Quote:


> Originally Posted by *Benny89*
> 
> Yeah, cause I am considering selling my 4.9Ghz 4790k and get 5775c. If I OC it to 4.2/4.3GHZ- it will beat my 4790k, correct?


absolutely, at lower temps, too.

Look here for that knowledge. http://www.overclock.net/t/1583537/intel-broadwell-c-ownership-club/600#post_25986920

Quote:


> Originally Posted by *krutoydiesel*
> 
> I refuse! I will see what Performance PCs return policy is.


a man after my own heart.

Quote:


> Originally Posted by *palote99*
> 
> On a [email protected] here, and BF1 ultra at 1440 never dips below 130. Maybe Amiens i hardly see 120ish.....
> 
> See u in the Battlefield!!!!


That's my dude! He listened and reaped the rewards. I told y'all about dat 5775c on BF1. It's great.

Quote:


> Originally Posted by *SlimJ87D*
> 
> I might have to get a 5775c one day.


I should be getting a commission. I've sold like 5 of these lol.


----------



## zipeldiablo

Quote:


> Originally Posted by *alucardis666*
> 
> Mind PM-ing me the info to contact them along with what it was you said exactly?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've got a few e-mail accounts I can spam them with


Done


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> absolutely, at lower temps, too.
> 
> Look here for that knowledge. http://www.overclock.net/t/1583537/intel-broadwell-c-ownership-club/600#post_25986920
> a man after my own heart.


Ok, I just put my golden i7 4790k on sale. Will grab 5775c if it sells.


----------



## alucardis666

Quote:


> Originally Posted by *zipeldiablo*
> 
> Done


Thanks!


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> I might have to get a 5775c one day.


Quote:


> Originally Posted by *alucardis666*
> 
> Yes, I have the same issue, depends on cards power state and clocks. Certain OCs on the card trigger it more frequently than others, I figureed it was instability of the OC but then did some digging. It should be fixed in the next Nvidia Driver hopefully. And wasn't present in previous drivers I used.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've not experienced this issue while gaming or watching videos, only when idle browsing the web or file explorer windows. Hbu?


Oh thank God, I was worried about the Sammy. I haven't seen it on my DP-connected monitor, but have on the HDMI 2.0b-connected UHDTV.
Quote:


> Originally Posted by *Benny89*
> 
> Ok, I just put my golden i7 4790k on sale. Will grab 5775c if it sells.


great, man, you won't regret. Make sure to list the Oc and voltages required that your golden cpu can hit, it'll help it sell.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> Oh thank God, i was worried about the Sammy. I havent seen it on my dp connected monitor but have on the hdmi2.0b connected uhdtv.


You too huh?!

Fill out this form here.

http://surveys.nvidia.com/index.jsp?pi=6e7ea6bb4a02641fa8f07694a40f8ac6



I just did mine.


----------



## Mato87

Quote:


> Originally Posted by *VanuSovereignty*
> 
> Ok youre struggling to attain 60 FPS at 1080p?!


Yep, as I said the game is very poorly optimized. With upscaling turned off it's very hard to get a stable 60 fps; you have to turn down a few settings.
Quote:


> Originally Posted by *VanuSovereignty*
> 
> Looking at your build, youre on 8.1, 10 is recommended, is this another DX12 title? I am dual booting both 7 and 10, if Quantum Break is DX12 I will definitely see a benefit to putting that game on my secondary SSD with 10 as I have a hexacore CPU. Also, I vaguely remember you saying this was on sale for $17, is that still going?
> r.


Still rocking 8.1 without a problem. I was tempted to go to Win 10, but decided not to, and I did the right thing. Win 10 is just crap.
Quantum Break was a DX12 title, until Remedy decided to release a Steam version that works under DX11 without any issues. I heard the UWP version is gimped and ****ty, performing even worse than the Steam version.

You should try it. It's no longer on sale, but Steam says 36 euro right now. It's a fantastic game; shame about the optimization, very little effort was put into that particular part of the porting process.


----------



## mshagg

Quote:


> Originally Posted by *alucardis666*
> 
> Flashing back to the stock FE bios. The others didn't do anything but gimp my fan rpms or make ports on the card not function anymore. Not a fault of the bioses, they did what they're designed for for their respective cards. I feel confident that the gigabyte one would've been best for my particular card, but the VRM changes they made to it didn't prove to do what I'd hopped. Kinda contemplating selling my card and getting a Gigabyte one now. Hmmm...


Thanks for testing the Aorus one. There are a lot of partner cards still to come out, all with tasty BIOSes we can try. I'm sure there'll be a suitable combo of outputs + VRM configuration that is well suited to flashing over a FE.


----------



## Slackaveli

@VanuSovereignty
The Aorus GPUs are binned! I saw them bragging about GPU Gauntlet on their website on both Aorus models. Of course, with that comes the knowledge that those 'lesser' chips gotta go somewhere, so the other Gigabyte editions should probably be avoided, which is probably why they call binning "GPU Gauntlet" so nubz won't realize the other side of it on their other products. Like, "Hey, we bin, so our regular models are guaranteed to be booty".

Also, Aorus cards have a 4-year warranty. Not that I'll be running it four years, but it's destined to be a hand-me-down someday (I have three teens, 2 boys and a girl, who are all gamers), so I appreciate the 4-year warranty. That is extreme!


----------



## Slackaveli

Quote:


> Originally Posted by *c0ld*
> 
> Bling bling, got freebies with my order of my 1080 Ti.


Just got logged by the admin on the Aorus Facebook page for the swag bag part 2. Instructions are 2 or 3 pages back (or on the Facebook page). I seriously doubt I'll get hooked up as much as you; I'm probably destined for a lanyard and wristband, lol. It's all good, though. I am due for a winner in the silicon lottery. My 1st 980 Ti Hybrid was good (1500MHz), but the replacement I got after having to RMA it for a failing cooler was booty. I then got a decent but not golden 5775C (although even bad 5775Cs are crazy good), and a poor 1070 that I 'borrowed' from Best Buy for a week waiting on my 1080 Ti, which was also pretty much booty. I haven't had a straight lottery winner since I bought my 4690K that hit 4.6, and that was before my 4790K (4.8). I am due for a golden sample, and these things are binned. Come on, Silicon Goddess!


----------



## mshagg

Quote:


> Originally Posted by *VanuSovereignty*
> 
> I never really use Win 10 but I love racing games, so I'm tempted to purchase Forza Horizon 3 which is a Win 10 only game but yeah, there is zero compelling reason to use 10 over 7.


WDDM2.0 was pretty significant in itself, I think Creators Update is about to push that to WDDM2.2.

But hey, I remember people raging against Windows 95 and the loss of booting to DOS. Fight the power!


----------



## KedarWolf

Quote:


> Originally Posted by *mshagg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *VanuSovereignty*
> 
> I never really use Win 10 but I love racing games, so I'm tempted to purchase Forza Horizon 3 which is a Win 10 only game but yeah, there is zero compelling reason to use 10 over 7.
> 
> 
> 
> WDDM2.0 was pretty significant in itself, I think Creators Update is about to push that to WDDM2.2.
> 
> But hey, I remember people raging against Windows 95 and the loss of booting to DOS. Fight the power!

The final version of the Creator's Update has been leaked. Google is your friend.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Are you peeps trying the Strix BIOS on air? It has a higher power draw and it'll throttle. On my old BIOS I'd notice throttling started above 50C. I'm on water you see.
> 
> I have a Kraken x41 on the GPU, which keeps it at 48C max on stock bios. I'm going to try the MSI one, but I get a feeling it will be the same thing.

http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe/0_20









You can do the MSI BIOS the same way.


----------



## c0ld

Quote:


> Originally Posted by *Slackaveli*
> 
> just got logged by the admin on Aorus facebook page for the swag bag part 2. Instructions are 2 or 3 pages back (or on the facebook page). I seriously doubt I'll get hooked up as much as you. I'm probably destined for a lanyard and wristband lol. It's all good, though. I am due for a winner in the silicon lottery. My 1st 980ti hybrid was good (1500mhz)but i had a replacement after i had to rma b/c of the cooler failing was booty. I then got a decent but not golden 5775c (although even bad 5775c's are crazy good), a poor 1070 that i 'borrowed' from best buy for a week waiting on my 1080ti, which then that was also was pretty much booty. I havent had a straight lottery winner since i bought my 4690k that hit 4.6 and that was before my 4790k (4.8). I am due for a golden sample and these things are binned. Come on, Silicon Goddess!


Hopefully you get something similar!

I've been lucky with the lottery too: my 2600K does 4.8GHz on really low voltage with HT on, and on my old 780 Lightning I got the core to about 1.38GHz with temps within limits on air.

I haven't tried OC'ing my Aorus yet, but from what I'm reading here, everything is hitting a 2.1GHz wall?


----------



## bluewr

Upgraded from my 780; now time to benchmark and run some games.


----------



## mshagg

Quote:


> Originally Posted by *KedarWolf*
> 
> The final version of the Creator's Update has been leaked. Google is your friend.


Yeah, been running it for a few days. Game Mode seems like a bit of a non-event from my testing so far.


----------



## KedarWolf

Quote:


> Originally Posted by *mshagg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> The final version of the Creator's Update has been leaked. Google is your friend.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah been running it for a few days. Game mode seems like a bit of a non event from my testing so far.

Don't even use Game Mode; Razer Cortex is OP!! It frees up almost a GB of RAM on my system with custom boost settings.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> @VanuSovereignty
> The Aorus gpu's are binned! I saw them bragging about Gpu Guantlet on their website on both Aorus models. Of course, with that comes the knwledge that those 'lesser' chips gotta go somewhere so the other gigabyte editions should probably be avoided, which is probably why they call binning "gpu gauntlet" so nubz wont realize the other side of it on their other products. Like, "Hey we bin so our regular models are guaranteed to be booty".
> 
> Also, Aorus have a 4 year warranty. Not that I'll be running it four years but it's destined to be a hand me down someday ( I have three teens, 2 boys and a girl who are all gamers) so I appreciate the 4 year warranty. That is extreme!


_Gigabyte Gauntlet_ has been a known process of theirs since the 900 series. *They are not "binned"* like the old Classifieds or Kingpins are. So don't expect that they will, on average, OC/boost higher than other brands. They just test the chips before putting them in their AIB cards. That is all. Binning is too expensive for a mainstream AIB card. Pascal has its own architecture limitation which no binning will remove.

*Here is the AORUS XTREME EDITION 1080 TI REVIEW*: https://www.techpowerup.com/reviews/Gigabyte/GTX_1080_Ti_Xtreme_Gaming/33.html

*Overclock and boost are (again) exactly the same as on the ROG STRIX and MSI GAMING X.* *2050 max.*

All 1080 Ti Pascals are basically the same. Architecture limit.


----------



## newjerseydamo

Is the EVGA 5288 FTW Hybrid pump/radiator exactly the same as the 5188 model?

The reason I ask is there might be issues sourcing the 5188 model locally, but the 5288 may be available.

I plan on keeping the original FE shroud on, so I just really need the pump/radiator.
If it's the same between the 2 kits, I'll jump on the 5288.


----------



## Slackaveli

Quote:


> Originally Posted by *VanuSovereignty*
> 
> Swag indeed my friend.
> 
> I'm not kidding when I almost wanted to follow suit and return my FE for the Aorus. If my FE only did +120MHz core like yours did I would absolutely have done so. I'm afraid that it may be a gamble though, I'm still trying to understand how the AIB cards perform vis a vis reference and right now it looks like we are all in the same silicon lottery. Extra power, better and quieter cooling and better aesthetics are definitely compelling though.
> 
> Edit:
> 
> Yeah I think youre due for getting a golden sample. I kinda feel like my FE is near golden sample, it did 1911MHz out of the box simply putting PT to 120%. It does 2050MHz, albeit briefly. It's not doing 2100MHz, but it definitely is doing 2.025MHz.


Sorry if my tale is needlessly tempting you. I think yours is golden. I bet if it were on water it'd be over 2100. The fact that you can hold mid-2000s on air is impressive. I am definitely due for golden. I will either hit this time, or I'll hit with Volta, but I _am_ due, damnit~
Quote:


> Originally Posted by *c0ld*
> 
> Hopefully you get something similar!
> 
> I've been lucky with the lottery too my 2600k is doing 4.8GHz on really low voltage with HT on, my old 780 Lightning I got the core at about 1.38GHz with temps within limit on air
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I haven't tried OC'ing my Aorus yet but from what I'm reading here everything is hitting a 2.1GHz wall?


Yeah, on water the golden chips hit that. On air, if you keep 2000+, you are very good.
Quote:


> Originally Posted by *Benny89*
> 
> _Gigabyte Guantlet_ is a known process of them since 900 series. *They are not "binned"* like old Classies or Kingpins are. So don't expect that they will on average OC/boost higher than other brands. They just test them before putting inside their AIB. That is all. Binning is too expensive for mainstream AIB card. Pascal has its own architecture limitation which no binning will remove.
> 
> *here is AORUS XTREMETE EDITION 1080 TI REVIEW*: https://www.techpowerup.com/reviews/Gigabyte/GTX_1080_Ti_Xtreme_Gaming/33.html
> 
> *Overlock and boost is (again) exact the same as on ROG STRIX and GAMING X.* *2050 max.*
> 
> All 1080 Ti Pascals are basicelly the same. Architecture limit.


party pooper


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> sorry if my tale is needlessly tempting you. I think yours is golden. I bet if it was on water it'd be over 2100. The fact that you can hold mid 2000's on air is impressive. I am definitely due for golden. I will either hit this time, or I'll hit with Volta, but I _am_ due, damnit~
> yeah, on water the golden chips hit that. on air if you keep 2000+ you are very good.


All AIBs reviewed so far had no problem OCing to a stable 2050 on air... in all reviews. This is a soft cap for them.


----------



## c0ld

Quote:


> Originally Posted by *Slackaveli*
> 
> sorry if my tale is needlessly tempting you. I think yours is golden. I bet if it was on water it'd be over 2100. The fact that you can hold mid 2000's on air is impressive. I am definitely due for golden. I will either hit this time, or I'll hit with Volta, but I _am_ due, damnit~
> yeah, on water the golden chips hit that. on air if you keep 2000+ you are very good.


I'm hitting 2.0GHz max boost just from maxing the power to 125%. Running about 72C with 58% fan speed.


----------



## KedarWolf

Quote:


> Originally Posted by *c0ld*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Slackaveli*
> 
> sorry if my tale is needlessly tempting you. I think yours is golden. I bet if it was on water it'd be over 2100. The fact that you can hold mid 2000's on air is impressive. I am definitely due for golden. I will either hit this time, or I'll hit with Volta, but I _am_ due, damnit~
> yeah, on water the golden chips hit that. on air if you keep 2000+ you are very good.
> 
> 
> 
> Im hitting 2.0GHz max boost on just maxing the Power to 125%. Running about 72c with 58% fan speed

Which 1080 Ti card do you have, and does Afterburner go higher than 120% power with it? Or do you use Precision X or something?


----------



## Slackaveli

It's not like I wouldn't be happy with 2050 after only seeing 2000 before dropping into the 1900s on my FE.
Quote:


> Originally Posted by *KedarWolf*
> 
> Which 1080 Ti card you have and Afterburner does higher than 120% boost with it? Or you use Precision X or something?


He's on an Aorus. We have a 150% power slider.


----------



## c0ld

Quote:


> Originally Posted by *KedarWolf*
> 
> Which 1080 Ti card you have and Afterburner does higher than 120% boost with it? Or you use Precision X or something?


Aorus 1080 Ti 11G; I used Afterburner to set it to 125%. I'm pretty sure it was 125%, but I'll double-check and get back to you after I get home.


----------



## Slackaveli

So, they may all clock the same, but the Aorus is king (for now) in power limit and is the quietest. Almost the coolest, too. https://www.techpowerup.com/reviews/Gigabyte/GTX_1080_Ti_Xtreme_Gaming/34.html


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> party pooper


Sorry. But better not to spread misleading info here. People are too hyped, and when the OC community hears "binned" they jump first, think later (joking of course).
Quote:


> Originally Posted by *Slackaveli*
> 
> it's not like i wouldnt be happy with 2050 after only seeing 2000 before dropping into the 1900s on my FE.
> he's on Aorus. We have a 150% power slider


You should have no problems with AIBs. You can check all the reviews so far, and that is basically a guarantee. All AIBs boost to almost 2000 without an OC anyway. 2050-60 is definitely OK in 98% of AIBs.

*Also, everyone, now we have reviews of the ROG STRIX, MSI GAMING X and AORUS XTREME EDITION.

All are the same. Same OC max, same temps range (69 ROG, 71 Xtreme, 72 MSI) on load, same noise performance (33-39 range for all three).

Just go for the looks, people. There is NO DIFFERENCE.*


----------



## KedarWolf

Quote:


> Originally Posted by *Slackaveli*
> 
> it's not like i wouldnt be happy with 2050 after only seeing 2000 before dropping into the 1900s on my FE.
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Which 1080 Ti card you have and Afterburner does higher than 120% boost with it? Or you use Precision X or something?
> 
> 
> 
> he's on Aorus. We have a 150% power slider

Can you:

nvflash_5.353.0.zip 3031k .zip file


Unzip NVFlash to a folder.

Disable your video card in Control Panel in Device Manager.

Run an admin command prompt and cd to the folder you made.

Do in command prompt:

*nvflash64 --save filename.rom*

Then zip the file with WinRAR and add your BIOS as an attachment here? Or anyone else with an Aorus?

I'd like to test it.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> So, they may all clock the same, but Aorus is King (for now) in power limit and is the quietest. Almost the coolest. https://www.techpowerup.com/reviews/Gigabyte/GTX_1080_Ti_Xtreme_Gaming/34.html


Yes, but:
1. Power limit will not give you anything. 2100 is still a hard cap that so far is not stable even with custom voltage curves. This is an architecture limitation, sadly.

As you saw in the review, despite the higher power limit, the Aorus boosted 6MHz higher than the ROG Strix, and the OC limit was still the same: 2050 stable.
2. The Aorus is so far the quietest, but the ROG is the coolest.

As I said: no difference. Just go for looks. There are no "golden chips" on Pascal, similar to the 7700K (at least not as golden as we had on older hardware). Not much OC room in new generations of hardware.


----------



## c0ld

Quote:


> Originally Posted by *Benny89*
> 
> Sorry
> 
> 
> 
> 
> 
> 
> 
> . But better not to spread missleading info hear. People are too hyped and when OC community hears "binned" the jump first, think later
> 
> 
> 
> 
> 
> 
> 
> (joking of course).
> You should have no problems with AIBs. You can check all so far reviews and that is basicelly guarantee. All AIBs boost without OC to almost 2000 anyway. 2050-60 is definitely ok in 98% of AIBs.
> 
> *Also, everyone, now we have reviews of ROG STRIX, MSI GAMING X and AORUS XTREME EDITION.
> 
> All are the same. Same OC max, same Temps range (69 ROG, 71 Xtreme, 72 MSI) on load, same quite performance (33-39 range of all three).
> 
> Just go for the looks people. There is NO DIFFERENCE
> 
> 
> 
> 
> 
> 
> 
> *


Are those temps with the stock fan profile?


----------



## Benny89

Quote:


> Originally Posted by *c0ld*
> 
> Are those temps with the stock fan profile?


Yes, auto stock profile in reviews.


----------



## KedarWolf

*Someone with an Aorus, please?*

nvflash_5.353.0.zip 3031k .zip file


Unzip NVFlash to a folder.

Disable your video card in Control Panel in Device Manager.

Run an admin command prompt and cd to the folder you made.

Do in command prompt:

*nvflash64 --save filename.rom*

Then zip the file with WinRAR and add your BIOS as an attachment here? Or anyone else with an Aorus?

I'd like to test it.


----------



## c0ld

Quote:


> Originally Posted by *Benny89*
> 
> Yes, auto stock profile in reviews.


Nice!

Is OC'ing the memory worth it? I upgraded from a 780, so I don't know much about newer cards; getting up to date as we speak.


----------



## Benny89

For comparison:
Quote:


> Originally Posted by *c0ld*
> 
> Nice!
> 
> Is OC'ing the memory worth it? I upgraded from a 780 I don't know much about newer cards getting up to date as we speak.


Yes, it is, since *all AIBs ship with the memory NOT OCed* (which is always written down in reviews as a downside). So OCing it is IMO necessary to get the max from the card.
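Since the thread keeps asking whether a memory OC is worth it, here's some napkin math. The 352-bit bus and 11 Gbps stock rate are the 1080 Ti's published specs; the idea that a ~+500 MHz Afterburner offset lands around 12 Gbps effective is an assumption about how the offset maps to the effective rate, so treat it as a sketch:

```python
# Napkin math for what a memory OC buys on a GTX 1080 Ti.
# Published specs: 352-bit bus, 11 Gbps effective GDDR5X rate.
# Assumption: a ~+500 MHz Afterburner offset corresponds to roughly
# +1 Gbps effective (the offset applies to the DDR clock, which is
# half the effective rate).

BUS_WIDTH_BITS = 352

def bandwidth_gbs(effective_rate_gbps: float) -> float:
    """Memory bandwidth in GB/s = rate (Gbit/s per pin) * bus width / 8."""
    return effective_rate_gbps * BUS_WIDTH_BITS / 8

stock = bandwidth_gbs(11.0)   # 484 GB/s, the card's rated bandwidth
oced  = bandwidth_gbs(12.0)   # with a rough +500 MHz offset

print(f"stock: {stock:.0f} GB/s, OC'd: {oced:.0f} GB/s "
      f"(+{100 * (oced / stock - 1):.1f}%)")
```

So a solid memory OC is worth roughly 9% extra bandwidth, which is why reviews flag the untouched memory as a downside.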


----------



## Benny89

*Here is a full comparison of all top 3 AIBs reviewed so far.*

As you can see, there is no reason not to choose your card based purely on looks or brand preference.


----------



## Pandora's Box

I question TechPowerUp's overclock method. They bench with Battlefield 3, for God's sake. It also looks like they are recording max clock speed and not the clock speed after 10-20 minutes of gaming/benching.


----------



## Jbravo33

Quote:


> Originally Posted by *Slackaveli*
> 
> Ive been wondering who would be first. I had my money on you but Bravo was up there too.
> 
> Dat 150% power limit....
> i think the msi is a 3dp 1hdmi like founders. aorus has 3 dp and 2 hdmi and a dvi so im not sure.
> 
> Edit : @alucardis666 maybe watch this first. power phases are done differently , quaduplers and whatnot. Im no engineer, but I think this will be very similar to the pcb on the aorus which is what your bios would be using.


Haha, f it, why not! I've spent more time messing with these cards than playing games.


----------



## alucardis666

Quote:


> Originally Posted by *mshagg*
> 
> Thanks for testing the Aorus one. There's a lot of partner cards to come out, all with tasty BIOSes we can try. Im sure there'll be a suitable combo of outputs + VRM configuration that is well suited to flashing over a FE.


Here's hoping.








Quote:


> Originally Posted by *Slackaveli*
> 
> @VanuSovereignty
> The Aorus gpu's are binned! I saw them bragging about Gpu Guantlet on their website on both Aorus models. Of course, with that comes the knwledge that those 'lesser' chips gotta go somewhere so the other gigabyte editions should probably be avoided, which is probably why they call binning "gpu gauntlet" so nubz wont realize the other side of it on their other products. Like, "Hey we bin so our regular models are guaranteed to be booty".
> 
> Also, Aorus have a 4 year warranty. Not that I'll be running it four years but it's destined to be a hand me down someday ( I have three teens, 2 boys and a girl who are all gamers) so I appreciate the 4 year warranty. That is extreme!


It does sound like a very nice card, that's for sure.








Quote:


> Originally Posted by *KedarWolf*
> 
> *Someone with on Aorus, please?*
> 
> nvflash_5.353.0.zip 3031k .zip file
> 
> 
> Unzip NVFlash to a folder.
> 
> Disable your video card in Control Panel in Device Manager.
> 
> Run an admin command prompt and cd to the folder you made.
> 
> Do in command prompt:
> 
> *nvflash64 --save filename.rom*
> 
> And zip the file with winrar add your BIOS as an attachment here? Or someone with an Aorus?
> 
> I'd like to test it.


https://www.techpowerup.com/vgabios/190963/gigabyte-gtx1080ti-11264-170331-2

Not good enough for you?


----------



## KedarWolf

Quote:


> Originally Posted by *alucardis666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mshagg*
> 
> Thanks for testing the Aorus one. There's a lot of partner cards to come out, all with tasty BIOSes we can try. Im sure there'll be a suitable combo of outputs + VRM configuration that is well suited to flashing over a FE.
> 
> 
> 
> Here's hoping.
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Slackaveli*
> 
> @VanuSovereignty
> The Aorus gpu's are binned! I saw them bragging about Gpu Guantlet on their website on both Aorus models. Of course, with that comes the knwledge that those 'lesser' chips gotta go somewhere so the other gigabyte editions should probably be avoided, which is probably why they call binning "gpu gauntlet" so nubz wont realize the other side of it on their other products. Like, "Hey we bin so our regular models are guaranteed to be booty".
> 
> Also, Aorus have a 4 year warranty. Not that I'll be running it four years but it's destined to be a hand me down someday ( I have three teens, 2 boys and a girl who are all gamers) so I appreciate the 4 year warranty. That is extreme!
> 
> 
> It does sound like a very nice card, that's for sure.
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> *Someone with on Aorus, please?*
> 
> nvflash_5.353.0.zip 3031k .zip file
> 
> 
> Unzip NVFlash to a folder.
> 
> Disable your video card in Control Panel in Device Manager.
> 
> Run an admin command prompt and cd to the folder you made.
> 
> Do in command prompt:
> 
> *nvflash64 --save filename.rom*
> 
> And zip the file with winrar add your BIOS as an attachment here? Or someone with an Aorus?
> 
> I'd like to test it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.techpowerup.com/vgabios/190963/gigabyte-gtx1080ti-11264-170331-2
> 
> Not good enough for you?

No, I want the Aorus one specifically.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Sorry
> 
> 
> 
> 
> 
> 
> 
> . But better not to spread missleading info hear. People are too hyped and when OC community hears "binned" the jump first, think later
> 
> 
> 
> 
> 
> 
> 
> (joking of course).
> You should have no problems with AIBs. You can check all so far reviews and that is basicelly guarantee. All AIBs boost without OC to almost 2000 anyway. 2050-60 is definitely ok in 98% of AIBs.
> 
> *Also, everyone, now we have reviews of ROG STRIX, MSI GAMING X and AORUS XTREME EDITION.
> 
> All are the same. Same OC max, same Temps range (69 ROG, 71 Xtreme, 72 MSI) on load, same quite performance (33-39 range of all three).
> 
> Just go for the looks people. There is NO DIFFERENCE
> 
> 
> 
> 
> 
> 
> 
> *


Looks AND price.
Yeah, I was wrong in my understanding of GPU Gauntlet, so thanks.
Quote:


> Originally Posted by *KedarWolf*
> 
> Can you:
> 
> nvflash_5.353.0.zip 3031k .zip file
> 
> 
> Unzip NVFlash to a folder.
> 
> Disable your video card in Control Panel in Device Manager.
> 
> Run an admin command prompt and cd to the folder you made.
> 
> Do in command prompt:
> 
> *nvflash64 --save filename.rom*
> 
> And zip the file with winrar add your BIOS as an attachment here? Or someone with an Aorus?
> 
> I'd like to test it.


Yeah, man, I got you, but I think the power phases are way different. Basically the controller runs lower watts (amps, whatever) but quadruples the current through the 12+2 phases while the switching frequency is super high, so it basically works like a very undervolted card compared to the FE. At least it did on alucardis666's card. I can't do it until tomorrow when mine arrives, if one of the others hasn't done it by then.


----------



## Kashtan

Quote:


> Originally Posted by *Benny89*
> 
> Wish I could see 5775c vs 7700k at 1440p and 4K tests.


http://www.anandtech.com/bench/product/1501?vs=1826
Direct 4K comparison, Shadow of Mordor at 4K Ultra settings:

- MSI GTX 770 Lightning 2GB ($245), average FPS: 5775C 19.63, 7700K 18.54
- MSI GTX 770 Lightning 2GB ($245), minimum FPS: 5775C 10.76, 7700K 9.89
- MSI R9 290X Gaming LE 4GB ($380), average FPS: 5775C 38.36, 7700K 37.89
- MSI R9 290X Gaming LE 4GB ($380), minimum FPS: 5775C 28.4, 7700K 19.79
- ASUS GTX 980 Strix 4GB ($560), average FPS: 5775C 39.5, 7700K 38.59
- ASUS GTX 980 Strix 4GB ($560), minimum FPS: 5775C 32.18, 7700K 30.98


----------



## Jbravo33

anyone try the new afterburner yet?


----------



## KedarWolf

Quote:


> Originally Posted by *Slackaveli*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Benny89*
> 
> Sorry
> 
> 
> 
> 
> 
> 
> 
> . But better not to spread missleading info hear. People are too hyped and when OC community hears "binned" the jump first, think later
> 
> 
> 
> 
> 
> 
> 
> (joking of course).
> You should have no problems with AIBs. You can check all so far reviews and that is basicelly guarantee. All AIBs boost without OC to almost 2000 anyway. 2050-60 is definitely ok in 98% of AIBs.
> 
> *Also, everyone, now we have reviews of ROG STRIX, MSI GAMING X and AORUS XTREME EDITION.
> 
> All are the same. Same OC max, same Temps range (69 ROG, 71 Xtreme, 72 MSI) on load, same quite performance (33-39 range of all three).
> 
> Just go for the looks people. There is NO DIFFERENCE
> 
> 
> 
> 
> 
> 
> 
> *
> 
> 
> 
> looks AND price.
> Yeah, i was wrong in my understanding of gpu gauntlet so thanks.
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Can you:
> 
> nvflash_5.353.0.zip 3031k .zip file
> 
> 
> Unzip NVFlash to a folder.
> 
> Disable your video card in Control Panel in Device Manager.
> 
> Run an admin command prompt and cd to the folder you made.
> 
> Do in command prompt:
> 
> *nvflash64 --save filename.rom*
> 
> And zip the file with winrar add your BIOS as an attachment here? Or someone with an Aorus?
> 
> I'd like to test it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> yeah, man, I got you but I think the power phases are way different. Basically the controller runs lower watts(amps, whatever)but quadruples the current through the 12+2 phases while switching frequency is super high and basically it works like a very undervolted card on the FE. At least on the alucardis666 's card it did. Can't do it until tomorrow when mine arrives if one of the others hasnt done it yet.

Ignore my PM, was before I saw this here. Thanks!!


----------



## KedarWolf

Quote:


> Originally Posted by *Jbravo33*
> 
> anyone try the new afterburner yet?


I'm using the new beta, yes, no issues.


----------



## Slackaveli

Quote:


> Originally Posted by *Jbravo33*
> 
> haha f it why not! ive spent more time messing with these cards then playing games,


lol, you and a lot of us, too!
Quote:


> Originally Posted by *Kashtan*
> 
> http://www.anandtech.com/bench/product/1501?vs=1826
> Direct 4K comparison, Shadow of Mordor at 4K Ultra settings:
> - MSI GTX 770 Lightning 2GB ($245), average FPS: 5775C 19.63, 7700K 18.54
> - MSI GTX 770 Lightning 2GB ($245), minimum FPS: 5775C 10.76, 7700K 9.89
> - MSI R9 290X Gaming LE 4GB ($380), average FPS: 5775C 38.36, 7700K 37.89
> - MSI R9 290X Gaming LE 4GB ($380), minimum FPS: 5775C 28.4, 7700K 19.79
> - ASUS GTX 980 Strix 4GB ($560), average FPS: 5775C 39.5, 7700K 38.59
> - ASUS GTX 980 Strix 4GB ($560), minimum FPS: 5775C 32.18, 7700K 30.98


Dang, it flexes hard in Shadow of Mordor on a 290X. But there's some more proof of my original claim of the 5775C being king. Nice.


----------



## bluewr

The card is bigger than my 780, and... NVIDIA broke all my custom resolutions again, and it's giving me the "your resolution isn't supported by your monitor" bug again.


----------



## alucardis666




----------



## JedixJarf

Quote:


> Originally Posted by *Benny89*
> 
> Sorry
> 
> 
> 
> 
> 
> 
> 
> . But better not to spread missleading info hear. People are too hyped and when OC community hears "binned" the jump first, think later
> 
> 
> 
> 
> 
> 
> 
> (joking of course).
> You should have no problems with AIBs. You can check all so far reviews and that is basicelly guarantee. All AIBs boost without OC to almost 2000 anyway. 2050-60 is definitely ok in 98% of AIBs.
> 
> *Also, everyone, now we have reviews of ROG STRIX, MSI GAMING X and AORUS XTREME EDITION.
> 
> All are the same. Same OC max, same Temps range (69 ROG, 71 Xtreme, 72 MSI) on load, same quite performance (33-39 range of all three).
> 
> Just go for the looks people. There is NO DIFFERENCE
> 
> 
> 
> 
> 
> 
> 
> *


Told you guys all 1080 Ti's would OC the same : P


----------



## KedarWolf

Quote:


> Originally Posted by *alucardis666*


That be me!!

Disclaimer: I'm hopelessly dyslexic, so don't mind me messing up words and stuff.


----------



## alucardis666

Quote:


> Originally Posted by *KedarWolf*
> 
> That be me!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Disclaimer, I'm hopelessly dyslexic so don't mind me messing up words and stuff.


Great vid


----------



## KedarWolf

Quote:


> Originally Posted by *alucardis666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> That be me!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Disclaimer, I'm hopelessly dyslexic so don't mind me messing up words and stuff.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Great vid

I normally never do this stuff, but someone on overclock.net asked for it, and there it is.


----------



## alucardis666

Quote:


> Originally Posted by *KedarWolf*
> 
> I normally never did this stuff but someone on overclock.net asked for it and there it is.


You should make more! Maybe a guide to flashing VBios as well?


----------



## Jbravo33

Quote:


> Originally Posted by *KedarWolf*
> 
> I normally never did this stuff but someone on overclock.net asked for it and there it is.


Haha, I did. I was scared ****less to flash, thinking I'd get a black screen or something. Great vid, thanks!


----------



## Caustin

In, finally got around to taking a few pictures.


----------



## alucardis666

Quote:


> Originally Posted by *Jbravo33*
> 
> haha I did I was scared ****less to flash, thinking id get a black screen or something. great vid thanks!


How's it going for you? Everything running ok?


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*


At least my eyes have seen a card hit 2100 and hold it. That was cathartic.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> at least my eyes have seen a card hit 2100 and hold it. That was cathartic.


I feel your pains bro. Same here.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> I feel your pains bro. Same here.


For real, though. That was perfect for those in our shoes.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> for real tho. that was perfect for those in our shoes.


Yup. My card seems happiest (most stable and best performing) with a max boost of 2012 @ 1062mv and mem @ +464. I hoped for more.


----------



## c0ld

Quote:


> Originally Posted by *Benny89*
> 
> For Comparison:
> Yes, it is, since *all AIBs have memory NOT OCed* (which is always written down in reviews as a downside). So OCing it is imo necessary to get the max from it.


Any recent guides for Overclocking with GPU Boost 3.0???


----------



## Drakeskull

I guess there was a slight lie lol. In Deathmatches and Rush it tends to stay over 100fps. In Conquest it will go below; up close to tanks and explosions I seemed to hit bottoms of 85fps... but it's still very playable with the 2600k. I wonder how much of a bottleneck it has, and how much I can increase the minimum by going to a 7700k? It's running maxed at 1440p, not sure why youtube made it 720p.


----------



## alucardis666

Well here's some bull*****



No one knocked, rang my doorbell, or anything.


----------



## c0nsistent

Quote:


> Originally Posted by *Drakeskull*
> 
> I guess there was a slight lie lol, in Deathmatchs and Rush it tends to stay over 100fps. In conquest it will go below, shortly after up close to tanks and explosions I seemed to hit bottoms of 85fps..... but its still very playable with the 2600k. I wonder how much of a bottleneck it has and how much I can increase the minimum by going to a 7700k? Its running maxed at 1440p not sure why youtube made it 720p


As someone running a 3770k @ 4.7ghz with 2400mhz DDR3, I can tell you that there is a bottleneck at 1440p in BF1. To compensate I use 120% resolution scale and the GPU is typically 95-99% usage most of the game. If I drop down to 100%, I'm in the 70-81 % range. My CPU has about 10% IPC over yours, so I'd definitely say you're in a worse spot than I am when it comes to that. I'm happy with 100-120fps @ 120% resolution scale 1440p until I get a decent 4K display. I'm just trying to find one that is around $500 USD, has Gsync, and good response time... or perhaps a 50" 4K TV with solid response time, but those are in the $1000+ range with the OLEDs.
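Worth spelling out for anyone comparing numbers: resolution scale applies per axis (Frostbite's convention, as far as I know), so the pixel cost grows with its square. A quick sketch of the math; the function name is just for illustration:

```python
# Pixel cost of resolution scaling, assuming the scale factor applies
# per axis so total rendered pixels grow with its square.
def scaled_pixels(width, height, scale_pct):
    """Pixels actually rendered at a given resolution scale percentage."""
    factor = scale_pct / 100
    return round(width * factor) * round(height * factor)

native = scaled_pixels(2560, 1440, 100)  # 3,686,400 pixels
scaled = scaled_pixels(2560, 1440, 120)  # 3072 x 1728 = 5,308,416 pixels
print(scaled / native)                   # 1.44 -> ~44% more shading work
```

So a 120% scale is a fair bit heavier than it sounds, which is why it soaks up the GPU headroom a CPU bottleneck leaves on the table.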


----------



## Drakeskull

Quote:


> Originally Posted by *c0nsistent*
> 
> As someone running a 3770k @ 4.7ghz with 2400mhz DDR3, I can tell you that there is a bottleneck at 1440p in BF1. To compensate I use 120% resolution scale and the GPU is typically 95-99% usage most of the game. If I drop down to 100%, I'm in the 70-81 % range. My CPU has about 10% IPC over yours, so I'd definitely say you're in a worse spot than I am when it comes to that. I'm happy with 100-120fps @ 120% resolution scale 1440p until I get a decent 4K display. I'm just trying to find one that is around $500 USD, has Gsync, and good response time... or perhaps a 50" 4K TV with solid response time, but those are in the $1000+ range with the OLEDs.


100+fps with a rare low in the mid 80s @1440p with gsync is very playable, but I am going to make it a point to upgrade my CPU in the next few months. Thanks for the info


----------



## KingEngineRevUp

Besides KedarWolf's, these other BIOSes aren't working well for my card. The stock BIOS still runs best.

Is it working much better for anyone else? As in 5% extra gains like Wolf's?


----------



## jorgerp86

I know everyone is going to hate me for this, but I actually returned my GTX 1080 Ti FE. I got so excited and rushed into ordering one, but I just couldn't bear the temps and loud blower compared to my standard EVGA GTX 1080 SC. I definitely noticed the improvement in overall FPS in all my games, but I decided to stick with what I've got until I see more Ti AIB reviews.

I also noticed a slight "smell" with mine when it would get hot; I'm sure it was the same thing someone else mentioned before. Overall, I'll be ready to get a quieter GTX 1080 Ti from an AIB.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> Besides kedrawolf, these other bios aren't working well for my card. The stock bios still runs best.
> 
> Is it working much better for anyone else? As in 5% extra gains like Wolf's?


Nah, Wolf's cheating.

J/k. I went back to the stock bios as well.








Quote:


> Originally Posted by *jorgerp86*
> 
> I know everyone is going to hate me for this but I actually returned my GTX 1080 Ti FE. Got so excited and rushed into ordering one but I just couldn't bare the temps and loud blower over my standard EVGA GTX 1080 SC. I definitely noticed the improvements in overall FPS with all my games but I decided to just stick with what I got until I see more Ti AIB reviews.
> 
> I also noticed a slight "smell" with mine when it would get hot. I'm sure it was just something that someone else mentioned before. Overall I'll be ready to get a quieter GTX 1080 Ti from a AIB.


Do whatever makes *YOU* happy. I kinda wish I'd waited for more AIB's as well. KingPin, HOF, Classified, etc. Those are what I wanna see.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> at least my eyes have seen a card hit 2100 and hold it. That was cathartic.


All he did was just open the benchmark lol. I ran it at 2114 MHz with the stock bios at 1.075 Volts.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> All he did was just open the benchmark lol. I ran it at 2114 MHz with the stock bios at 1.075 Volts.


Do *YOU* have a video?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> Do *YOU* have a video?


what I mean is that guy didn't have a video either. Lol.


----------



## mshagg

Quote:


> Originally Posted by *SlimJ87D*
> 
> Besides kedrawolf, these other bios aren't working well for my card. The stock bios still runs best.
> 
> Is it working much better for anyone else? As in 5% extra gains like Wolf's?


Are you water cooled? I haven't flashed anything onto my FE yet, as the Strix/MSI ones disable a much-needed DisplayPort and the GA Xtreme BIOS doesn't seem to be working on the FE.

But I will be trying to flash one when something suitable comes along. From what I can gather, W1zzard and co at techpowerup are uploading the BIOSes as they review the cards.

So when you see a new BIOS pop up on there, expect to see their review shortly after.


----------



## cekim

Quote:


> Originally Posted by *Drakeskull*
> 
> I guess there was a slight lie lol, in Deathmatchs and Rush it tends to stay over 100fps. In conquest it will go below, shortly after up close to tanks and explosions I seemed to hit bottoms of 85fps..... but its still very playable with the 2600k. I wonder how much of a bottleneck it has and how much I can increase the minimum by going to a 7700k? Its running maxed at 1440p not sure why youtube made it 720p


You are killing that poor 1080ti with the CPU.

Here's a run-down of what I see in BF1 @ 1440p ultra:
2x1080 @2100MHz (water cooled)= ~130-140 - dips to 115-120
1x1080ti @1897MHz (air cooled)= 120-130 - dips to 90-110 now and then
2x1080ti @1964MHz (water cooled) = 170-180 - dips to 150-160 now and then

Of course it's all over the map (and dependent on which map; the open ones tend to do better than the busy cities).

I have not been able to get these cards stable above 1964 at all on water, but they saturate the cooling loop at 55C and give me >144Hz @ 1440p all day. Keeps the room warm, that's for sure.

Memory OCs 500MHz without breaking a sweat. I haven't played around enough to know where the absolute limit really is there; by 600MHz it's generating an enormous amount of heat and higher temps despite my water loop. By contrast, the 1080 @ 2100MHz is rock solid @ 41-45C on the same loop.

I had hoped to clock them to 2000MHz, but it's an artificial goal; they are delivering the promised performance already.


----------



## cekim

Quote:


> Originally Posted by *VanuSovereignty*
> 
> Yeah just one minor correction.....FE IS SUPREME!


I don't know about that (despite owning 2 of them), but "good enough", that's for sure.

The FE cooler is crap. What you are getting out of the "non-reference" designs is something closer to water-cooled, non-thermally-throttled performance, but the chip itself, going back to the 1080, appears not to benefit from 1.21 gigawatts of power provision in any way, shape or form until you are pouring LN2 on it.


----------



## MURDoctrine

Quote:


> Originally Posted by *krutoydiesel*
> 
> Wish I knew they were going to make a 1080Ti specific block when Nvidia released the card. This looks beautiful, and single slot.


So glad I waited now. Gonna grab me one of those and an EVO to finally replace my old ass Rasa block. Time to refresh my entire loop.


----------



## JedixJarf

Quote:


> Originally Posted by *VanuSovereignty*
> 
> I was being funny, but yeah, FE cooler isn't crap, maybe you missed my string of preceding posts where I am showing my FE running at 70C at 70% at 2000MHz with no additional voltage, PT at 120% BEFORE the Gelid GC Extreme that reduced the temps another 3-4C further.
> 
> I will be doing a follow up video tomorrow.
> 
> This cooler runs 8C cooler than Titan XP and it's essentially the same card. Looks elegant, and expels all of the heat from the case, sexy illuminated Geforce GTX, man I was being funny but honestly, FE IS SUPREME.
> 
> 
> 
> 
> 
> 
> https://www.hardocp.com/article/2017/03/09/nvidia_geforce_gtx_1080_ti_video_card_review/10
> 
> 1080 Ti FE on wheels:
> 
> 
> __
> https://flic.kr/p/15790494174
> 
> MSI Gaming X and Strix on wheels:
> 
> http://www.city-data.com/forum/automotive/568543-riced-out-car-pics.html


That's supra funny. Get it? SUPRA


----------



## KingEngineRevUp

Quote:


> Originally Posted by *mshagg*
> 
> Are you water cooled? I haven't flashed anything onto my FE yet as the Strix/MSI ones disable a much-needed displayport and the GA extreme BIOS doesn't seem to be working on the FE.
> 
> But I will be trying to flash one when something suitable comes along. From what I can gather, wizard and co at techpowerup are uploading the BIOSes as they review the cards.
> 
> So, when you see a new BIOS pop up on there, expect to see their review shortly after .


I have an AIO, the x41.


----------



## cekim

Quote:


> Originally Posted by *VanuSovereignty*
> 
> I was being funny, but yeah, FE cooler isn't crap, maybe you missed my string of preceding posts where I am showing my FE running at 70C at 70% at 2000MHz with no additional voltage, PT at 120% BEFORE the Gelid GC Extreme that reduced the temps another 3-4C further.


I shouldn't say "crap". It is designed well to do something different than most users expect: to be packed into tight spaces where it can't dump its heat anywhere but out the rear.

As the non-reference cards show, you can get 20C+ reductions in temp, and correspondingly higher and more stable clocks with a lot less noise, from a non-blower fan.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Yup. My card seems happiest (most stable and best performing) with a max boost of 2012 @ 1062mv and mem @ +464. I hoped for more.


well, yeah, but mine is happiest at 1974 or 1961 :/, ugh. But it's still a beastly card even dogging it up. Been on BF1 for the last 2-3 hours gaming at ~135-145 fps in ultra 1440p @ 65c. Pretty solid.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Well here's some bull*****
> 
> 
> 
> No one knocked, rang my doorbell, or anything.


oh, damn. gated mansion or southside ghetto?


----------



## KCDC

Hey y'all, so V-Ray released a benchmark app. It's perfect for GPU benchmarking; think Cinebench for GPUs.

https://docs.chaosgroup.com/display/VRAYBENCH

I did it with dual FEs, with and without SLI on. Normally with GPU rendering you turn SLI off.

Since my score was exactly the same with and without it, I was concerned that the bench only sees one GPU. I wonder if any of you out there with a single card wouldn't mind running this bench? Just trying to see if the bench sees multiple cards or not.

My results are attached.

Thanks!!

EDIT: After doing some scoreboard reading, it definitely scales to multiple GPUs. No worries with SLI enabled; it ignores it.


----------



## cekim

Quote:


> Originally Posted by *Slackaveli*
> 
> well, yeah, but mine is happiest at 1974 or 1961 :/, uggh. But it's still a beastly card even dogging it up. been on BF1 for the last 2-3 hours gaming at ~135-140 fps in ultra 1440p @ 65c. Pretty solid.


Same here (1974), which vendor's FE? (aka: which BIOS)?

Anything beyond that so far and it eventually crashes while gaming


----------



## Slackaveli

Quote:


> Originally Posted by *cekim*
> 
> Same here (1974), which vendor's FE? (aka: which BIOS)?
> 
> Anything beyond that so far and it eventually crashes while gaming


it's the Nvidia store model and it's at +114 core. My temps are great because I did the Thermal Grizzly Kryonaut repasting, and my two-hour loop on Heaven ends at 70c at 70% fans (1:1).

I'm in the same boat. I can play at even +152 for an hour or two, but it's going to crash, and crash hard, eventually.
At that setting it holds 2000-2012 throughout, too. It's not temp or voltage or power; it's topping out the silicon. When it crashes, the GPU-Z log will look great: all straight-line 2000mhz, 65c, etc. No signs of a problem.
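If you're combing a GPU-Z sensor log after a crash like that, a small script beats scrolling. This is a hypothetical helper, assuming the comma-separated layout GPU-Z writes; the "GPU Clock [MHz]" header is an assumption and column names can differ between GPU-Z versions:

```python
# Summarize the GPU core clock column of a GPU-Z sensor log to spot
# dips or throttling before a crash. Assumes a comma-separated log with
# a header row containing something like "GPU Clock [MHz]".
import csv

def clock_summary(path, column="GPU Clock [MHz]"):
    """Return (min, max, average) of the matching clock column."""
    clocks = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # GPU-Z pads headers/values with spaces, so match by substring.
            key = next((k for k in row if k and column in k), None)
            if key and row[key] and row[key].strip():
                clocks.append(float(row[key]))
    return min(clocks), max(clocks), sum(clocks) / len(clocks)

# e.g. lo, hi, avg = clock_summary("GPU-Z Sensor Log.txt")
```

If min equals max equals your target clock right up to the crash, that backs up the "it's the silicon, not throttling" diagnosis.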


----------



## KedarWolf

Quote:


> Originally Posted by *c0ld*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Benny89*
> 
> For Comparison:
> Yes, it is, since *all AIBs have memory NOT OCed* (which is always written down in reviews as a downside). So OCing it is imo necessary to get the max from it.
> 
> 
> 
> Any recent guides for Overclocking with GPU Boost 3.0???
Click to expand...

http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> oh, damn. gated mansion or southside ghetto?


Neither! Suburbs just outside the city. I dunno why "Amazon Logistics" is so incompetent. I'm yelling at their customer support now to refund my points for the expedited shipping.








Quote:


> Originally Posted by *KCDC*
> 
> Hey y'all so, Vray released a benchmark app.. It's perfect for gpu benchmarking. Think cinebench for gpus.
> 
> https://docs.chaosgroup.com/display/VRAYBENCH
> 
> I did it with dual FEs, with and without sli on. Normally with gpu rendering you turn SLI off.
> 
> Since my score was exactly the same with and without it off, I am concerned that the bench only sees one GPU.. I wonder if any of you out there with a single card wouldn't mind running this bench? Just trying to see if the bench sees multiple cards or not.
> 
> My results are attached
> 
> Thanks!!
> 
> EDIT: After doing some scoreboard reading, it definitely scales to multiple gpus. No worry with SLI enabled, it ignores it.


Thanks for sharing!
Quote:


> Originally Posted by *Slackaveli*
> 
> it's the Nvidia store model and it's at +114 core. My temps are great because I did the thermal grizzly kyronaut repasting and my two hour loop on heaven ends at 70c at 70% fans (1:1)
> 
> Im in the same boat. I can play at even +152 for an hour or two but it's going to crash, and crash hard, eventually.
> At that setting it holds 2000-2012 throughout, too. it's not temp or voltage or power, it's topping out the silicon. When it crashes the gpu-z log will look great. all straightline 2000mhz, 65c, etc. no signs of a problem.


Long as the card can do 2000mhz, it's a winner in my book. I just wish I could hold a frequency over 1950.

My card likes to bounce around between bins when gaming or benching, 1964-2012.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Neither! Suburbs just outside the city, I dunno why "Amazon Logistics" is so incompetent. I'm yelling at their customer support now to refund my points for the expedited shipping.
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for sharing!
> Long as the card can do 2000mhz then it's a winner in my book. I just wish I could hold at a frequency over 1950
> 
> 
> 
> 
> 
> 
> 
> My card likes to bounce around between bins when gaming or benching 1964-2012


I was at 1961-1974 for the last 2 hours in BF1. I feel your pain. I could have run 2000 and crashed, but naw lol.

these swag bags, tho


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> i was at 1961-1974 for the last 2 hours in BF1. I feel your pain. I could have ran 2000 and crashed , but naw lol.
> 
> these swag bags, tho


Sexy card


----------



## KedarWolf

Full Heaven run, 2100 core, 6156 memory, zero throttling! Gigabyte 1080 Ti flashed with Asus Strix BIOS.

GPU-ZSensorLog.txt 88k .txt file






http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe/0_20




Edit: Going to make new video, headset was unplugged, no voice audio.


----------



## alucardis666

Quote:


> Originally Posted by *KedarWolf*
> 
> Full Heaven run, 2100 core, 6156 memory, zero throttling! Gigabyte 1080 Ti flashed with Asus Strix BIOS.
> 
> GPU-ZSensorLog.txt 88k .txt file
> 
> 
> http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe/0_20
> 
> 
> 
> 
> Edit: Going to make new video, headset was unplugged, no voice audio.


HOW'RE YOU DOING THAT?!

EDIT: Can you push further at the next 2 bins of voltage or no?


----------



## alucardis666

So Amazon is at least pretty speedy and helpful. Got my points refunded and got my order sorted. Nice.


----------



## KedarWolf

Quote:


> Originally Posted by *alucardis666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Full Heaven run, 2100 core, 6156 memory, zero throttling! Gigabyte 1080 Ti flashed with Asus Strix BIOS.
> 
> GPU-ZSensorLog.txt 88k .txt file
> 
> 
> http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe/0_20
> 
> 
> 
> 
> Edit: Going to make new video, headset was unplugged, no voice audio.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> HOW'RE YOU DOING THAT?!
> 
> EDIT: Can you push further at the next 2 bins of voltage or no?
Click to expand...

I could try, but at 2100 / 1.093v it throttles a few bins. Haven't tried higher than 2100 at 1.075v or at higher voltages, though.

I'm thinking 100% stable 24/7 clocks with no throttling is better than pushing it higher and having it throttle. My ambient temps are cool right now; if it wasn't so late (work in the a.m.), I'd try to push it higher at 1.093.









Nite peeps, peace out.

Oh, and one thing I know about video card clocking: you WON'T always get better results with the highest voltages. Sometimes a step or two down is better.


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> Full Heaven run, 2100 core, 6156 memory, zero throttling! Gigabyte 1080 Ti flashed with Asus Strix BIOS.
> 
> GPU-ZSensorLog.txt 88k .txt file
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe/0_20
> 
> 
> 
> 
> Edit: Going to make new video, headset was unplugged, no voice audio.


such a nasty FE, dude. You paid "tax" (your buddy) but it was worth it!


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> HOW'RE YOU DOING THAT?!
> 
> EDIT: Can you push further at the next 2 bins of voltage or no?


dude is a jedi


----------



## Bishop07764

Quote:


> Originally Posted by *KedarWolf*
> 
> Full Heaven run, 2100 core, 6156 memory, zero throttling! Gigabyte 1080 Ti flashed with Asus Strix BIOS.
> 
> GPU-ZSensorLog.txt 88k .txt file
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe/0_20
> 
> 
> 
> 
> Edit: Going to make new video, headset was unplugged, no voice audio.


Nice man! I might have to try the old Asus Bios. I'm not sure that it's worth it though as I can already score higher graphics scores on Firestrike and Timespy than some people who are at 2100 and 600+ memory with my card sitting at 2063 core and +500 memory. I only seldom hit the power cap now but no power throttling sounds nice.


----------



## Jbravo33

Quote:


> Originally Posted by *alucardis666*
> 
> How's it going for you? Everything running ok?


yes sir! thanks to you and Kedar I'm a flash god, and I'm not talking Barry Allen.


----------



## alucardis666

Quote:


> Originally Posted by *Jbravo33*
> 
> yes sir! thanks to you and kedar I'm a flash god, and I'm not talking barry allen.


Great! Glad I could help out


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> I could try but at 2100 1.093v it throttles a few bins. Haven't tried higher than 2100 at 1.075v or higher voltages though.
> 
> I'm thinking 100% stable 24/7 clocks here not throttling is better than pushing it higher and having it throttle. My ambient temps are cool right now, if it wasn't so late, work in the a.m., I'd try to push it higher at 1.093.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nite peeps, Peace out.
> 
> Oh, and one thing I know about video card clocking you WON'T always get better results with highest voltages, sometimes a step or two down is better.


I think what helps you out the most is being water cooled. My x41 can only keep the card at 48C.

I have become content with 2088 at 1.050V. Seems to be the most stable for me.

I wonder if there's some golden card out there that can do 2100 at 1.032V... just to find out it's with someone who isn't a turbo nerd like us and has it sitting with an FE blower on it.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> I think what helps you the most out is being water cooled. My x41 can only keep the card at 48C.
> 
> I have become content with 2088 at 1.050V. Seems to be the most stable for me.
> 
> I wonder if there's some golden card out there that can do 2100 at 1.032V... just to find out it's with someone who isn't a turbo nerd like us and has it sitting with an FE blower on it.


honestly, yours and Kedar's are golden. HOLDING 2088? That's dope, man. I probably won't even be able to do that with an Aorus. I'll be happy holding 2GHz+.


----------



## flint314

Same here. I had BF1 running with my 1080ti on a 4770k @ 4.7GHz with a 165Hz G-Sync monitor. I can tell everyone that BF1 is one of those games that hugely profits from anything with more than 4 cores. The 4770k was limiting my 1080 and my 1080ti; they barely ran above 80% @ 115 res scale.
I then put them into my work machine with a 5820k @ 4.5GHz and have since seen them running at 99%. I recommend that anyone who intends to primarily play BF1 buy at least a 6-core. No i5 will suffice, and even a 7700k has fps drops now and then in the 0.1 percentile, and it will bother you. BF1 on 64-player maps needs a lot of CPU power and no 4-core will deliver enough.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> honestly yours and kedar's are golden. HOLDING 2088? That's dope, man. I probably wont even be able to do that with an Aorus. I'll be happy holding 2ghz+


At the end of the day it's only like a 1.8% difference from 2050. Neither of us would ever notice.

I was actually pretty bummed out to have to buy a Zotac one, but I guess I got lucky. If I was under a full water block, the card could probably do 2114 or 2126 below 40C; sometimes at the beginning of my benchmark it will shoot up to 2126 for a few scenes in Heaven, only to go down when temps get into the 40s.
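For perspective on how little those last couple of boost bins buy you, here's the arithmetic spelled out (the clock numbers are the ones quoted in this thread):

```python
# How much a few boost bins actually buy you, as a percentage.
def pct_gain(new_clock, old_clock):
    return (new_clock - old_clock) / old_clock * 100

print(round(pct_gain(2088, 2050), 2))  # 1.85 -> under 2% over 2050 MHz
print(round(pct_gain(2126, 2000), 2))  # 6.3  -> even a golden chip's ceiling
                                       #        is single digits over 2000 MHz
```

Which is why "holding 2000+" is the practical win; the rest is bragging rights.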


----------



## Pandora's Box

Clearly he is watercooling that 1080 TI FE card. Look at the temps.


----------



## pez

Quote:


> Originally Posted by *Slackaveli*
> 
> what are your afterburner settings?
> it's misinfo based on a video where the N00by youtuber had no idea what a custom fan profile even was in the comments and actually had FanStop illuminated during his run where he reached 78c. once he turned on the fan he was in the 60s
> THAT is the reason to buy Aorus. 50%!!!
> 
> Comment from a follower on youtube "ELVIS G23 hours ago
> Dim2go yeah same here I also talked with gigabyte they upgraded gpu bios so even better temps between 60-65c reviews in Nordic coming soon maybe day or two﻿"


Ah, that was the story I had heard too. I mean, 78C without the fan is pretty impressive in its own right.
Quote:


> Originally Posted by *c0ld*
> 
> I didn't have time to play around with it much yesterday, since I cleaned the case and erased any leftover AMD drivers.
> 
> I tested Witcher 3, GTA V, Squad, and Playerunknown's Battlegrounds, everything maxed out with 8x MSAA @ 3440x1440; no coil whine.
> 
> Left Unigine Valley running while I did other stuff and temps were really nice.
> 
> What's a program that might be able to stress it for coil whine? FurMark?
> How high were they reporting? I haven't even tried OC, just set the power slider to 125% and hit 2.0GHz core.
> 
> It's amazing, I jumped from a 780 and it's more than double the performance.


Something that you get extremely high frames in can produce coil whine sometimes. CS:GO, the intro of Doom, or even some older titles can give you 300+ FPS and usually produce the noise.
Quote:


> Originally Posted by *Alwrath*
> 
> Honestly guys, this really is the best cooler Nvidia has come out with for a video card. After much testing, all you have to do is set up a good fan profile and you're set. I will admit the stock fan profile is one of the worst I've ever seen for a gaming card, so it's very important to set up your own. I honestly don't mind the fan at 100%; sure, I can hear it, but it's no big deal, especially with a gaming headset on. Honestly, I will never understand people's complaints about fan noise. You want to complain about a loud air cooler? Try some old Radeon 290s or 4870s at 100% fan speed and then tell me the Nvidia cooler is loud. Those cards sounded like a jet taking off; this thing is tame in comparison. I'm happy with the FE. Got it direct from EVGA's website so no tax; with shipping I bought it for $720, and I've had this thing since 4 days after launch day and I'm loving every minute of this beast. No voltage, it fluctuates between 1950-2000 depending on the game, and my temp never goes above 60C.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If I do upgrade with EVGA stepup, it will be for a Classy or Kingpin.


Because people who use open headphones find 70% fan noise atrocious. There's also a good number of people out there who like silent PCs, and with an appropriate setup, low noise with the same OCs everyone else sees around here is possible. That's where AIBs come in. It's fine if you don't mind the noise, but to say it's quiet is silly.
Quote:


> Originally Posted by *kevindd992002*
> 
> How is it bottlenecked really close? Is it only because one or more cpu cores is close to 100% usage?
> When I bought my Gigabyte GTX 670 before in the US and had it shipped here in the Philippines I asked Gigabyte Taiwan if it had international warranty. They outright said no. So what do you mean by serial-based RMA?


http://rma.gigabyte.us/

For the US site, if you go down to user support, you check your warranty based on SN. Of course if you go to any company that doesn't require registration and ask if they'll support you in another country they're going to say no.


Quote:


> Originally Posted by *alucardis666*
> 
> Hey guys I think I'm gonna flash that gigabyte bios.


Please apply caution when attempting to use the full power limit, since the FE is an 8 + 6 pin. Sure, the 6-pin can still handle the full 150W, but it's not a true 8-pin, and power issues are a potential factor.
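To put rough numbers on that caution, here's the back-of-envelope budget using the PCIe spec's connector ratings; the 120% figure is the FE's power-target slider maximum, and the labels are just labels:

```python
# PCIe power budget for the FE's 8+6-pin layout, per spec ratings:
# 75 W from the slot, 75 W per 6-pin, 150 W per 8-pin connector.
SPEC_WATTS = {"slot": 75, "6-pin": 75, "8-pin": 150}

board_budget = sum(SPEC_WATTS.values())  # 300 W total by spec
stock_tdp = 250                          # FE's stock power target in watts
max_slider = stock_tdp * 1.20            # 120% power slider -> 300 W

print(board_budget, max_slider)          # already at the spec ceiling; a
                                         # higher-limit AIB BIOS pushes past it
```

So the stock FE at 120% is already sitting at the rated ceiling, and anything a flashed BIOS adds comes out of connector headroom rather than spec.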


Quote:


> Originally Posted by *Benny89*
> 
> _Gigabyte Gauntlet_ has been a known process of theirs since the 900 series. *They are not "binned"* like old Classies or Kingpins are. So don't expect that they will, on average, OC/boost higher than other brands. They just test them before putting them inside their AIB card. That is all. Binning is too expensive for a mainstream AIB card. Pascal has its own architecture limitation which no binning will remove.
> 
> *Here is the AORUS XTREME EDITION 1080 Ti review*: https://www.techpowerup.com/reviews/Gigabyte/GTX_1080_Ti_Xtreme_Gaming/33.html
> 
> *Overclock and boost are (again) exactly the same as on ROG STRIX and MSI GAMING X.* *2050 max.*
> 
> All 1080 Ti Pascals are basically the same. Architecture limit.


What's more exciting to me is:

Gigabyte GTX 1080 Ti Xtreme Gaming
Idle - 46°C
Load - 71°C
Gaming Noise - 33 dBA

FE cooler at 50% is pushing 35dB-39dB between a couple reviews.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> At the end of the day it's only like 1.8% difference from 2050
> 
> 
> 
> 
> 
> 
> 
> . Neither of us would ever notice.
> 
> I was actually pretty bummed out to have to buy a Zotac one but I guess I got lucky. If I was under a full water block the card could probably do 2114 or 2126 below 40C as sometimes in the beginning of my benchmark it will shoot up to 2126 for a few scenes in heaven only to go down when temps go into the 40s


Now imagine if Nvidia and Intel started releasing new hardware that was around ~2% faster each generation.


----------



## cekim

Able to flash the Strix BIOS onto the FEs without issue - 2x1080ti (EVGA).

Curiously, precisionX and heaven OSD show different clock speeds?

Heaven is now running at 2050... I'll have to try more tomorrow.


----------



## cekim

Still baffled by:
1. it is running quite a bit cooler now with the strix bios
2. precisionX and heaven do not agree on the gpu clock rate.

But, it finished the test this way - no complaints... (it was 2100 all the way through)


----------



## YVWM-47

Hey, I have a question for anyone out there running these in SLI. I have been having a strange problem where games run fine, but when I exit back into Windows everything becomes extremely sluggish; the mouse takes 3-4 seconds to move and everything is just extremely slow. Restarting, or sleeping and then waking the computer, seems to fix the problem. It doesn't happen every time, maybe 30 percent of the time. Has anyone else out there had this issue, and is there a way to fix it, or is it driver related?


----------



## alucardis666

Quote:


> Originally Posted by *cekim*
> 
> Still baffled by:
> 1. it is running quite a bit cooler now with the strix bios
> 2. precisionX and heaven do not agree on the gpu clock rate.
> 
> But, it finished the test this way - no complaints... (it was 2100 all the way through)


If your minimum is 4.2fps, that means you had a display driver crash during testing; I'd back down from 2100.


----------



## cekim

Quote:


> Originally Posted by *alucardis666*
> 
> If your minimum is 4.2fps that means you had a display driver crash during testing, I'd back down from 2100.


Yeah, I just fired it up that's the first run, so some tuning to be done, but I'm now playing BF1 at these settings after adjusting voltage a little...

180-200fps 1440p ultra is sweet!


----------



## Exilon

Finally dialed in my 24/7 clocks by downclocking every time BF1 crashed. Was hoping to hit 2100 with 1.093V, but oh well.









2076 core 6156 mem.

http://www.3dmark.com/3dm/19078334?



Hats off to Micron and Nvidia for making such ludicrously fast memory.

~550 GB/s on the 1080 Ti's 352-bit bus is crazy.
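
That headline number is just bus width times effective data rate. A quick sanity check in Python (note: the 1080 Ti actually uses a 352-bit bus, not 384-bit, and GDDR5X is double data rate, so a 6156 MHz reading means 12312 MT/s effective):

```python
def memory_bandwidth_gbps(bus_width_bits: int, data_rate_mtps: float) -> float:
    """Peak theoretical bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_width_bits / 8) * data_rate_mtps / 1000

# 352-bit bus with a 6156 MHz overclock -> 12312 MT/s effective
print(memory_bandwidth_gbps(352, 12312))  # ~541.7 GB/s, roughly the "550" quoted above
# Stock 1080 Ti memory spec is 11 Gbps (11008 MT/s)
print(memory_bandwidth_gbps(352, 11008))  # ~484.4 GB/s
```

The same formula explains why the stock card is rated at 484 GB/s.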


----------



## Mato87

Quote:


> Originally Posted by *Drakeskull*
> 
> I guess there was a slight lie lol, in Deathmatchs and Rush it tends to stay over 100fps. In conquest it will go below, shortly after up close to tanks and explosions I seemed to hit bottoms of 85fps..... but its still very playable with the 2600k. I wonder how much of a bottleneck it has and how much I can increase the minimum by going to a 7700k? Its running maxed at 1440p not sure why youtube made it 720p


Told you so


----------



## fisher6

Flashed the Strix BIOS yesterday and it removed the power limit I had on the stock BIOS, which had caused downclocking to 2063. I was able to push it to 2100 at 1.093v and it was stable in Heaven, but I haven't tried any games yet (the real test). My temps also went up a few degrees but are still under 50.


----------



## cekim

Quote:


> Originally Posted by *alucardis666*
> 
> If your minimum is 4.2fps that means you had a display driver crash during testing, I'd back down from 2100.


I didn't see a crash in the first run, but 4.2 says otherwise... I played a little BF1 and tweaked the voltage a little and now it runs clean:


----------



## alucardis666

Quote:


> Originally Posted by *cekim*
> 
> I didn't see a crash in the first run, but 4.2 says otherwise... I played a little BF1 and tweaked the voltage a little and now it runs clean:


Very nice!

Are you running SLI?

200fps in BF1 is nuts!


----------



## BrainSplatter

Quote:


> Originally Posted by *YVWM-47*
> 
> Hey, I have a question for anyone out there running these in SLI. I've been having a strange problem: games run fine, but when I exit back into Windows everything becomes extremely sluggish; the mouse takes 3-4 seconds to move and everything is just extremely slow. Restarting, or sleeping and then waking the computer, seems to fix the problem. It doesn't happen every time, maybe 30 percent of the time. Has anyone else out there had this issue, and is there a way to fix it, or is it driver related?


Oh, I think I have the same problem. I haven't really paid much attention to it since I'm still fiddling with OC settings, and when I actually game I often switch the computer off right afterwards. But I have also observed this strange slowdown after exiting fullscreen games 2 or 3 times. Next time I'll pay more attention and check whether there is anything that fixes it.

Using MSI-AB for custom voltage curve, Win10, and 378.92 drivers. GPU power management is set to max power and Windows power scheme is also max. Z270 board.


----------



## Benny89

Quote:


> Originally Posted by *Kashtan*
> 
> http://www.anandtech.com/bench/product/1501?vs=1826
> Direct 4K results: Shadow of Mordor, 4K Ultra settings, 5775C vs 7700K:
> 
> MSI GTX 770 Lightning 2GB ($245): average 19.63 vs 18.54 fps; minimum 10.76 vs 9.89 fps
> 
> MSI R9 290X Gaming LE 4GB ($380): average 38.36 vs 37.89 fps; minimum 28.4 vs 19.79 fps
> 
> ASUS GTX 980 Strix 4GB ($560): average 39.5 vs 38.59 fps; minimum 32.18 vs 30.98 fps


Thanks. Hmm, aren't those bottlenecked by the GPU? 4K on those cards is... well...

As long as the 5775C beats my 4790k in gaming, it's a good upgrade for me.

But wth is with Shadow of Mordor...


----------



## kevindd992002

Quote:


> Originally Posted by *Benny89*
> 
> Yes, but:
> 1. Power Limit will not give you anything. 2100 is still a hard cap that so far is not stable even with custom voltage curves. This is architecture limitation sadly.
> 
> As you saw in review, despite higher power limit- Aorus boosted 6mhz higher than ROG Strix and OC limit was still the same- 2050 stable.
> 2. Aorus is so far the quitest, but ROG is the coolest.
> 
> As I said- no difference. Just go for looks. There are no "golden chips" on Pascals, simillar to 7700k (at least not that golden as we have in older hardware). Not much OC room in new generations of hardware


Quote:


> Originally Posted by *JedixJarf*
> 
> Told you guys all 1080 Ti's would OC the same : P


So can you guys suggest going with an FE if watercooling with a full waterblock is the plan? We still don't have the other AIB cards to tell, don't you think? I personally am waiting for the EVGA FTW3.

@pez

What does that mean? Gigabyte honors international warranty if I buy from the US and use it in the Philippines? Sorry, I don't follow. I understand that the warranty is serial-based, but what's the point?


----------



## Benny89

Quote:


> Originally Posted by *kevindd992002*
> 
> So can you guy suggest to go with an FE if watercooling it with a full waterblock is the plan? We still don't have the other AIB cards to tell, don't you think? I personally am waiting for the EVGA FTW3.
> 
> @pez
> 
> What does that mean? Gigabyte does international warranty if I buy from the US and use it in the Philippines? Sorry,I don't follow. I understand that the warranty is serial-based but what's the point?


Best suggestion if you have a full custom water loop: go FE, flash the STRIX BIOS, OC and enjoy.

If you don't have a CWL, just go for any AIB that was reviewed. You should have no problem getting a stable 2050 OC.


----------



## Lefty23

Quote:


> Originally Posted by *Slackaveli*
> 
> at least my eyes have seen a card hit 2100 and hold it. That was cathartic.


Quote:


> Originally Posted by *alucardis666*
> 
> I feel your pains bro. Same here.


Quote:


> Originally Posted by *SlimJ87D*
> 
> All he did was just open the benchmark lol. I ran it at 2114 MHz with the stock bios at 1.075 Volts.


2100 stable is doable in "Heaven/Valley @ 1080p & 1440p" and "Firestrike @ 1080p" tests under water.
e.g. Heaven @1440p:


If you want, check here for more info (open spoilers - you might have to right click on the image and open it in a new tab to see better)
http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-thread/3500_100#post_25983655
This is on stock bios. I'll probably try some of the other bioses this weekend.
By the way, I get driver crash as soon as I move to the next bin up (2114), in the first few seconds of all the benchmarks I tried.
Looks like 2100 on the core is the limit of my card.

I'm pretty sure as more people get their cards under full water loops we'll see more examples like this.
@KedarWolf or, anyone else who wants to chime in, if I could only try one* bios, which one would you recommend?
Card is an EVGA 1080ti FE, though I don't think it matters, right?

* Release of the card could not come at a worse time for me as April-May & October-November are really busy at work as we prepare our two major product releases. I'm lucky if I can spare like 5-6 hours per week for "hobbies" and that is usually on the weekends.
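
The jump from 2100 straight to 2114 reflects how Pascal moves in discrete clock bins rather than 1 MHz steps; a tiny sketch of that behavior (the 13 MHz step is an observed approximation from these reports, not an official NVIDIA figure):

```python
BIN_MHZ = 13  # approximate Pascal boost-bin step (observed; real steps are ~12.5-13.5 MHz)

def next_bin(clock_mhz: int, steps: int = 1) -> int:
    """Clock after moving `steps` bins up (negative = down) from a known bin."""
    return clock_mhz + steps * BIN_MHZ

print(next_bin(2100))      # 2113, right around the observed 2114 bin
print(next_bin(2100, -1))  # 2087, one bin down
```

This is why offset overclocking lands on the same handful of clocks regardless of the exact offset you type in.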


----------



## pez

Quote:


> Originally Posted by *kevindd992002*
> 
> So can you guy suggest to go with an FE if watercooling it with a full waterblock is the plan? We still don't have the other AIB cards to tell, don't you think? I personally am waiting for the EVGA FTW3.
> 
> @pez
> 
> What does that mean? Gigabyte does international warranty if I buy from the US and use it in the Philippines? Sorry,I don't follow. I understand that the warranty is serial-based but what's the point?


Basically you go to that site and enter the serial number of your card and they give you warranty status. The only thing you have to do is create an account on their site and they process your RMA based on that. ASUS is the same way IIRC, so it makes reselling GPUs from either GB or Asus a bit better.


----------



## kevindd992002

Quote:


> Originally Posted by *Benny89*
> 
> Best sugestion if you have full custom water loop- go FE, flash STRIX BIOS, OC and enjoy.
> 
> If you don't have CWL- just go for any AIB that was reviewed. You should have no problem getting 2050 stable OC.


I'm planning on expanding my Swiftech H220-X loop, so I guess I can say that's already a full CWL. Aside from one port being disabled when flashing a STRIX BIOS over an FE, what other disadvantages are there when you do this? It's only been a few days since KedarWolf tried this, and there are others who are not having positive experiences with cross-flashing.
Quote:


> Originally Posted by *pez*
> 
> Basically you go to that site and enter the serial number of your card and they give you warranty status. The only thing you have to do is create an account on their site and they process your RMA based on that. ASUS is the same way IIRC, so it makes reselling GPUs from either GB or Asus a bit better.


Ok. But I would still have to send back the card to the US, correct?


----------



## Benny89

Quote:


> Originally Posted by *Kashtan*
> 
> http://www.anandtech.com/bench/product/1501?vs=1826
> Direct 4K results: Shadow of Mordor, 4K Ultra settings, 5775C vs 7700K:
> 
> MSI GTX 770 Lightning 2GB ($245): average 19.63 vs 18.54 fps; minimum 10.76 vs 9.89 fps
> 
> MSI R9 290X Gaming LE 4GB ($380): average 38.36 vs 37.89 fps; minimum 28.4 vs 19.79 fps
> 
> ASUS GTX 980 Strix 4GB ($560): average 39.5 vs 38.59 fps; minimum 32.18 vs 30.98 fps


Btw, those benches are weird...

Here are benches from the same site comparing the 5775C to the 4790k: http://www.anandtech.com/bench/product/1501?vs=1260

Aaaand... both CPUs are basically toe to toe... the difference is almost none, and sometimes the 4790k is even better... The 5775C was supposed to be better than the 4790k, correct? So I don't know what to think about it...


----------



## pez

Quote:


> Originally Posted by *kevindd992002*
> 
> I'm planning on expanding my Swiftech H220-X loop so I guess I can say that that is already a full CWL. Aside from one port being disabled when flashing a STRIX BIOS over an FE, what other disadvantages are there when you do this? It's just been a few days since Kedarwolf have tried this and there are others that are not having positive experiences with cross-flashing.
> Ok. But I would still have to send back the card to the US, correct?


You shouldn't have to. Just speak to customer service and they should give you the address to send it back to.


----------



## Wishmaker

What do people think of this card?

*ZOTAC GeForce GTX 1080 Ti AMP Extreme Spectre*

I have never owned a Zotac product and have been extremely happy with my Asus DCU cards. I was this close to pulling the trigger on the Strix OC, but I don't like the port layout. I don't like the Aorus because it seems too heavy on one end and bends while sitting in the PCI-E slot.


----------



## pez

Quote:


> Originally Posted by *Wishmaker*
> 
> What do people think of this card?
> 
> *ZOTAC GeForce GTX 1080 Ti AMP Extreme Spectre*
> 
> I have never owned a Zotac product and have been extremely happy with my Asus DCU cards. I was this close to pull the trigger on the Strix OC but I do not like the port layout. I do not like the Aourus because it seems to heavy on one end and bends while sitting in the PCI-E port.


That's a super sexy looking card. Zotac seems to put out great cards, but aren't as popular as Asus, MSI, GB, or EVGA. I'm not sure why, but I've never seen anything negative about them. They even have a competitive warranty.


----------



## D1RTYD1Z619

Quote:


> Originally Posted by *pez*
> 
> That's a super sexy looking card. Zotac seems to put out great cards, but aren't as popular as Asus, MSI, GB, or EVGA. I'm not sure why, but I've never seen anything negative about them. They even have a competitive warranty.


I know people were bagging on their customer service for a while and I think that scared a lot of people off.


----------



## kevindd992002

Quote:


> Originally Posted by *pez*
> 
> You shouldn't have to. You should just have to speak to customer service and they should just give you an address of who to send it back to.


But I already asked them that, and they told me I have to send it to the nearest distributor, which is in Taiwan. They have distributors here in our country, but I don't understand why I can't just send it directly to them. I would be all for Gigabyte or ASUS if the warranty were valid in our country. If you do a Google search, there are a lot of hits saying that Gigabyte and ASUS don't offer international warranty when it comes to GPUs.


----------



## BrainSplatter

Quote:


> Originally Posted by *pez*
> 
> I'm not sure why, but I've never seen anything negative about them.


Well, the Zotac 980Ti Amp has a somewhat annoying fan speed bug which was never properly fixed (which is why I got 2 of them from the tons of Amazon returns for cheap). Cooler is great though. They also seem to be a bit more susceptible to coil whine than the ASUS , Gigabyte and MSI cards I had before.


----------



## Wishmaker

Thank you for the replies.
There is a gamble with every product, and coil whine is normal in GPUs; some exhibit more, others less.
Is that Zotac Spectre card a good choice? Fan bugs are not something I would want on a 900 euro card!


----------



## weskeh

Quote:


> Originally Posted by *BrainSplatter*
> 
> Well, the Zotac 980Ti Amp has a somewhat annoying fan speed bug which was never properly fixed (which is why I got 2 of them from the tons of Amazon returns for cheap). Cooler is great though. They also seem to be a bit more susceptible to coil whine than the ASUS , Gigabyte and MSI cards I had before.


I own an AMP Extreme 1070. That fan speed thing still exists, but it's easily fixed by setting fan speed above 30%, so for me it's either idle or ramping up from 30%. First time with Zotac for me too, and I am very happy with this card. No coil whine at all and solid build.


----------



## mtbiker033

Quote:


> Originally Posted by *YVWM-47*
> 
> Hey, I have a question for anyone out there running these in SLI. I've been having a strange problem: games run fine, but when I exit back into Windows everything becomes extremely sluggish; the mouse takes 3-4 seconds to move and everything is just extremely slow. Restarting, or sleeping and then waking the computer, seems to fix the problem. It doesn't happen every time, maybe 30 percent of the time. Has anyone else out there had this issue, and is there a way to fix it, or is it driver related?


A week ago, prior to getting my 1080 Ti, I finally switched over to Windows 10. I was using 780 SLI and had the exact same thing happen all the time. It never did this in Win 8.1, but as soon as I went to Win 10 it started happening. Since I got the single 1080 Ti it hasn't happened at all. It must be a Win10 + SLI thing.


----------



## BrainSplatter

Quote:


> Originally Posted by *weskeh*
> 
> That fan speed thing still exists but is easely fixed by putting fan speed above 30% so its either idle for me and ramps up from 30%.


Might be a different problem: from what I have read, the problem on the 980 Ti AMP was that the fans' starting voltage/behavior didn't properly match the design of the fan speed controller, making them switch on and off repeatedly when going from idle to operating speed or vice versa.

Regarding coil whine, from the 3 Zotac 980Ti's I had, 1 had strong coil whine, 1 medium and 1 none. The 2 1080Ti founder edition cards, my previous ASUS 970s and Gigabyte 7970s had none. Ofc, too small a sample size to draw any real conclusion but it's probably good to order from a shop with a generous return policy.


----------



## PasK1234Xw

Quote:


> Originally Posted by *KedarWolf*
> 
> Full Heaven run, 2100 core, 6156 memory, zero throttling! Gigabyte 1080 Ti flashed with Asus Strix BIOS.
> 
> GPU-ZSensorLog.txt 88k .txt file
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe/0_20
> 
> 
> 
> 
> Edit: Going to make new video, headset was unplugged, no voice audio.


Your score is too low for 2100. Flashing an AIB BIOS and clocking higher doesn't mean it's actually faster.

We dealt with this back with the 1080s.


----------



## KedarWolf

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Full Heaven run, 2100 core, 6156 memory, zero throttling! Gigabyte 1080 Ti flashed with Asus Strix BIOS.
> 
> GPU-ZSensorLog.txt 88k .txt file
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe/0_20
> 
> 
> 
> 
> Edit: Going to make new video, headset was unplugged, no voice audio.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Your score is too low for 2100 flashing AIB BIOS and clocking higher doesn't mean its actually faster
> 
> We dealt with this back with 1080s
Click to expand...

About a 4% improvement same clock speeds.









And here is TimeSpy exact same settings both runs, 2100 core, 6162 memory.

1080 Ti FE



1080 Ti Strix BIOS
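
For anyone reproducing this comparison, the "4% improvement" is just the relative delta between the two GPU scores; a one-liner for checking your own runs (the scores below are hypothetical placeholders, not the actual results from this post):

```python
def percent_gain(before: float, after: float) -> float:
    """Relative improvement of `after` over `before`, in percent."""
    return (after - before) / before * 100

# Hypothetical example: a 10500 -> 10920 GPU score would be a 4.0% gain.
print(round(percent_gain(10500, 10920), 1))  # 4.0
```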


----------



## KingEngineRevUp

Quote:


> Originally Posted by *cekim*
> 
> Still baffled by:
> 1. it is running quite a bit cooler now with the strix bios
> 2. precisionX and heaven do not agree on the gpu clock rate.
> 
> But, it finished the test this way - no complaints... (it was 2100 all the way through)


You can't use heaven to report your information. You have to use GPU-Z and afterburner graphs.

Heaven doesn't display the right information.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *KedarWolf*
> 
> About a 4% improvement same clock speeds.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And here is TimeSpy exact same settings both runs, 2100 core, 6162 memory.
> 
> 1080 Ti FE
> 
> 
> 
> 1080 Ti Strix BIOS


You got 10,800 gpu score before the strix bios though.

*http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-thread/2760#post_25967140*


----------



## Dasboogieman

Quote:


> Originally Posted by *Benny89*
> 
> Btw. Those benches are weird...
> 
> Here are benches from this site comparing 5775C to 4790k: http://www.anandtech.com/bench/product/1501?vs=1260
> 
> Aaaand...both CPUs are basicelly toe to toe...difference is almst none and even sometimes 4790k is better.... 5775C was supposed to be better than 4790k, correct?, so I don't know what to think about it...


Yeah, but check the clock speeds. The 4790k only manages to go toe to toe with a much higher base clock, and most Haswell chips seem to top out at around 4.5 GHz. The premise is that an overclocked 5775C can deliver performance in the same league as a heavily overclocked Skylake/Kaby Lake, making it a viable, cost-effective upgrade for Haswell platforms compared to a full platform move to Kaby/Skylake.


----------



## Benny89

Quote:


> Originally Posted by *Dasboogieman*
> 
> yeah but check the clockspeeds. The 4790k only manages to go toe to toe with a much higher base clockspeed. Considering most haswell chips seem to top out at 4.5ghz ish. The premise is an overclocked 5775c can deliver performance that is in the same league as a heavily overclocked Skylake/Kaby Lake thus making it a viable, cost effective upgrade to Haswell platforms compared to a full platform upgrade to Kaby/Skylake.


Nah, it still does not make sense.

Look here, a full article on Broadwell vs the 5th, 6th and 7th generation CPUs: https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,19

And at a stock 3.3 GHz it beats the stock 7700k, 6700k and 4790k with *Turbo enabled* (so 4.4 GHz for the 4790k). At 1080p! In almost all new games...

The anandtech.com benchmarks don't make any sense... I think they are false, or at least unreliable.


----------



## KedarWolf

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> About a 4% improvement same clock speeds.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And here is TimeSpy exact same settings both runs, 2100 core, 6162 memory.
> 
> 1080 Ti FE
> 
> 
> 
> 1080 Ti Strix BIOS
> 
> 
> 
> 
> 
> You got 10,800 gpu score before the strix bios though.
> 
> *http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-thread/2760#post_25967140*
Click to expand...

Yes, I did, different clock speeds but it was an exceptional run with fans and pump screaming at 100% and really low ambient temps with all my windows open in winter.

Higher ambient temps and my driver would crash.

I could only run 24/7 2062 +618 memory, 1.031v without clocks throttling and with regular pump and fan curves, had higher GPU temps and would hit power limit at times as well.

My scores at those clocks would always be well under 10500.

This run is my new 24/7 clocks with regular fan curves and pump at 75%, big improvement.

And the comparison was same fan curves, pump speed, ambient temps and exact same clock speeds both BIOS's.


----------



## Dasboogieman

Quote:


> Originally Posted by *Benny89*
> 
> Nah, it still does not make sense.
> 
> Look here, here is full articale about Broadwell vs 5,6 and 7 CPUs generation: https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,19
> 
> And on stock 3,3 it beats stock 7700k, 6700k and 4790k on stocks with *Turbo enabled* (so 4,4 for 4790l). On 1080p! In almost all new games...
> 
> The benchmarks from anandtech.com does not make any sense.... I think they are falsed or at least unreliable.


http://techreport.com/review/28751/intel-core-i7-6700k-skylake-processor-reviewed/9

No, these results are perfectly legit. As far as websites go, Anandtech is one of the best, so I doubt they're false. The reason for the massive performance is that the 5775C has a 128 MB L4 cache, which basically gives the CPU core the bandwidth of a HEDT processor coupled with the latency of Kaby Lake (with 3000 MHz class RAM).

Remember, the performance of the L4 cache on the 5775C was only recently matched by the brand-new next-gen memory controller on the 7700k, and only when coupled with extremely high performance DDR4.


----------



## PasK1234Xw

Quote:


> Originally Posted by *KedarWolf*
> 
> About a 4% improvement same clock speeds.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And here is TimeSpy exact same settings both runs, 2100 core, 6162 memory.
> 
> 1080 Ti FE
> 
> 
> 
> 1080 Ti Strix BIOS


So your FE on the stock BIOS was running 2100 to get that result?

I get a slightly higher GPU score in Time Spy, and my clocks fluctuate between 2050 and 2000 a lot from hitting the power limit, with VRAM at stock.
And in Heaven I got over 152 fps.

Like I said, compare against people who can hit 2100 on the stock FE BIOS.


----------



## Benny89

Quote:


> Originally Posted by *Dasboogieman*
> 
> http://techreport.com/review/28751/intel-core-i7-6700k-skylake-processor-reviewed/9
> 
> No these results are perfectly legit. As far as websites go, Anandtech is one of the best so I doubt they're false. The reason for the massive performance is that the 5775c has a 128mb L4 cache which basically gives the CPU core the Bandwidth of a HEDT processor coupled with the latency of Kaby Lake (with 3000mhz class RAM).
> 
> Remember, the performance of the L4 cache on the 5775c was only recently matched by a brand-new next gen Memory controller on the 7700k but only if coupled with extremely high performance DDR4.


What I meant is that on Anandtech the 4790k performs too well against the 5775C compared to the game benches here: https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,19 - where the 5775C beats the hell out of the 4790k and even a 5.0 GHz 7700k.

The Anandtech benches show the 4790k near the performance of the 5775C at 1080p, while the purepc.pl tests show even a stock 5775C beating a 4.4 GHz 4790k at 1080p Ultra by a huge margin.

That is why I think those Anandtech numbers are totally off. I just can't point to where and why.


----------



## KingEngineRevUp

But what happens when DX12 is utilized to its fullest and CPU overhead goes down? That's one reason I might not move away from my 4790K.


----------



## Dasboogieman

Quote:


> Originally Posted by *Benny89*
> 
> What I meant is that on Anandtech 4790k performs too well compare to 5775C compare to those game benches here: https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,19 - where 5775C beats the hell out of 4790k and even 5,0Ghz 7700k.
> 
> Anandtech benches shows that 4790k is near performance of 5775C in 1080p, while as we can see in purepc.pl tests that even stock 5775C beats 4.4Ghz 4790k in 1080p Ultra by huge margin.
> 
> That is why I think those Anandtech are totally off. Just can't point where and why.


I re-scanned the PurePC review, I'm 90% sure its memory related. PurePC used terribad 2133mhz DDR4 with atrocious latencies so the advantage of the L4 becomes really pronounced. I'm quite sure with 3000mhz+ sticks you will see the gap between the chips close.
Quote:


> Originally Posted by *KingEngineRevUp*
> 
> But what happens when DX12 is utilized to its fullest and CPU overhead goes down? That's one reason I might not move away from my 4790K.


If anything, DX12 will benefit all the chips equally, so I doubt it will close the performance difference. Plus, it's all down to perspective: if you are close to a boundary like 120fps and you need that slight extra push, the 5775C is a sensible choice, as it's a drop-in replacement with no other system changes needed. You can also get the same result with a full platform upgrade to a 7700k, but that costs a heck of a lot more, plus all the effort of reinstalling the mobo and Windows.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *KedarWolf*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Yes, I did, different clock speeds but it was an exceptional run with fans and pump screaming at 100% and really low ambient temps with all my windows open in winter.
> 
> Higher ambient temps and my driver would crash.
> 
> I could only run 24/7 2062 +618 memory, 1.031v without clocks throttling and with regular pump and fan curves, had higher GPU temps and would hit power limit at times as well.
> 
> My scores at those clocks would always be well under 10500.
> 
> This run is my new 24/7 clocks with regular fan curves and pump at 75%, big improvement.
> 
> And the comparison was same fan curves, pump speed, ambient temps and exact same clock speeds both BIOS's.


Well do the same kind of low ambient winter run with the Strix bios to make a fair comparison. Would be interesting to see.


----------



## Benny89

Quote:


> Originally Posted by *Dasboogieman*
> 
> I re-scanned the PurePC review, I'm 90% sure its memory related. PurePC used terribad 2133mhz DDR4 with atrocious latencies so the advantage of the L4 becomes really pronounced. I'm quite sure with 3000mhz+ sticks you will see the gap between the chips close.


I don't think that is the case. Here is again the 5775C vs the 6700k. The 6700k uses 3200 MHz memory while the 5775C uses 2400 MHz (a test from before the 7700k, so no comparison there): https://www.purepc.pl/pamieci_ram/test_ddr3_vs_ddr4_jakie_pamieci_ram_wybrac_do_intel_skylake?page=0,13

Besides, I was talking about the 4790k vs the 5775C; both use the same DDR3 memory, and we can see that the 4790k falls behind (as it should) compared to the 5775C.

That is why I think the Anandtech benches are off, since they show 4790k = 5775C, which is not true. The 5775C easily beats the 4790k in all gaming scenarios.


----------



## MrTOOSHORT

Is this the 1080ti thread?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Dasboogieman*
> 
> I re-scanned the PurePC review, I'm 90% sure its memory related. PurePC used terribad 2133mhz DDR4 with atrocious latencies so the advantage of the L4 becomes really pronounced. I'm quite sure with 3000mhz+ sticks you will see the gap between the chips close.
> If anything, DX12 will benefit all the chips equally so I doubt it will close the performance difference. Plus, its all down to perspective, if you are close to a boundary like 120fps and you need that slight extra push, the 5775c is a sensible choice as its a drop-in replacement with no other system changes needed. I mean, you can also get the same result with a full platform upgrade to 7700k but that costs a heck of a lot more plus all the effort in reinstalling the mobo + windows.


Would it make a difference if, let's say, all our CPU usage dropped to the 30% range? At that point the differences might be within a percent.


----------



## Benny89

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Is this the 1080ti thread?


Yes, but this topic came out of CPUs bottlenecking the 1080 Ti and min frames with a 1080 Ti, CPU vs CPU. It just went a little too far.

Back on topic: my STRIX got delayed to Saturday. I hope it's not later... Can't wait to put my system in an Enthoo Evolv ATX Glass with the ROG STRIX 1080 Ti.


----------



## Alwrath

Quote:


> Originally Posted by *flint314*
> 
> Same here. I had Bf1 running with my 1080ti ona 4770k @ 4.7Ghz with a 165hz Gsyn monitor. I can tell everyone that BF1 is one of these games that will hugely profit from anything with more than 4 cores. The 4770k was limiting my 1080 and my 1080ti - they barey ran above 80% @ 115 res scale.
> I then put them into my work machine with a 5820k @ 4.5Ghz and since them have been seeing them running at 99%. I recommend everyone that intends to primarly play BF1 to buy at least a 6core. Any i5 will not suffice and even a 7700k has fps drops now and then in the 0.1 percentile, and it will bother you. BF1 on 64p maps needs a lot of cpu power and no 4 core will deliver enough.


Agreed. The future is here and now. Battlefield 1 proves that games are finally starting to move on. More cores will mean a lot in the next few years. It's actually better to have a few extra threads/cores beyond what the game fully supports, especially if you have programs running in the background.


----------



## Benny89

Quote:


> Originally Posted by *Alwrath*
> 
> Agreed. The future is here and now. Battlefield 1 proves it that games are finally starting to move on. More cores will mean alot in the next few years. Its actually better to have a few extra threads/cores than what the game fully supports especially if you have programs running in the background.


I really hope for an enthusiast 6-core gaming chip from Intel in 2018, with at least 4.0-4.2 GHz stock and a 4.4-4.5 GHz Turbo boost.

I would jump on that train...


----------



## Silent Scone

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Is this the 1080ti thread?


----------



## pez

Quote:


> Originally Posted by *kevindd992002*
> 
> But I already asked them that and they told me I have to send it to the nearest distributor which is Taiwan. They have distributors here in our country but I don't understand why I can't just send it directly to them. I would be all for Gigabyte or ASUS if the warranty is valid in our country. If you do a Google search, there are lot of hits saying that Gigabyte and ASUS doesn't have international warranty when it comes to GPU's.


My point is that if you start an RMA with them, there should be no way they can tell where you got your card from.

Also, my Ti could have come today, but FedEx doesn't want to.


----------



## PasK1234Xw

One thing I'm seeing with reference vs AIB cards: the reference hits a wall unless you land a golden sample, and the odds of that seem very slim so far.
Even worse than the 1080 FE.

AIB cards like the Gaming X are getting better results at slightly lower clocks than an overclocked FE due to the higher power limit.

2000-2050, stock vram:

http://www.3dmark.com/3dm/19082809

http://i.imgur.com/XQuWy3Q.png

Even at 2050 and keeping the GPU under 65C I'm hitting a wall, and even clocking the vram yields no results whatsoever.


----------



## Wishmaker

Great, I wanted to buy the Zotac AMP Extreme, but EK does not make water blocks for it. Now I am back at square one...


----------



## Benny89

Quote:


> Originally Posted by *PasK1234Xw*
> 
> One thing I'm seeing with reference vs AIB cards: the reference hits a wall unless you land a golden sample, and the odds of that seem very slim so far.
> Even worse than the 1080 FE.
> 
> AIB cards like the Gaming X are getting better results at slightly lower clocks than an overclocked FE due to the higher power limit.
> 
> http://www.3dmark.com/3dm/19082809
> 
> http://i.imgur.com/XQuWy3Q.png
> 
> Even at 2050 and keeping the GPU under 65C I'm hitting a wall, and even clocking the vram yields no results whatsoever.


Yes, you need at least 330W to get the most out of 2GHz+ clocks. The Xtreme Edition has 375W, but that is overkill since the chip can't get past 2100 stable, which is another wall regardless of power limit. That one is an architecture wall.


----------



## pez

Quote:


> Originally Posted by *Wishmaker*
> 
> Great I wanted to buy Zotac AMP Extreme but EK does not make water blocks for it. Now I am back at square one ...


Wait, if you're going to put a waterblock on it, why not go with an FE card?


----------



## Wishmaker

Quote:


> Originally Posted by *pez*
> 
> Wait, if you're going to put a waterblock on it, why not go with an FE card?


Well, I wanted the Zotac card on air until fall, when I want to go water cooling in my rig. Until I get proper cooling, the high factory clocks should suffice. Unfortunately, I am forced to either ignore this card or drop a water block on the GPU in the fall when I move to my new build.


----------



## eXteR

Quote:


> Originally Posted by *PasK1234Xw*
> 
> One thing I'm seeing with reference vs AIB cards: the reference hits a wall unless you land a golden sample, and the odds of that seem very slim so far.
> Even worse than the 1080 FE.
> 
> AIB cards like the Gaming X are getting better results at slightly lower clocks than an overclocked FE due to the higher power limit.
> 
> 2000-2050, stock vram:
> 
> http://www.3dmark.com/3dm/19082809
> 
> http://i.imgur.com/XQuWy3Q.png
> 
> Even at 2050 and keeping the GPU under 65C I'm hitting a wall, and even clocking the vram yields no results whatsoever.


Quote:


> Originally Posted by *PasK1234Xw*
> 
> Your score is too low for 2100 flashing AIB BIOS and clocking higher doesn't mean its actually faster
> 
> We dealt with this back with 1080s


Here's mine, with clock jumping between 1984-2023 due to power limit. VRam +400 (12800).

[email protected] 4.4Ghz XMP Profile


----------



## pez

Quote:


> Originally Posted by *Wishmaker*
> 
> Well, I wanted the Zotac card on air until fall, when I want to go water cooling in my rig. Until I get proper cooling, the high factory clocks should suffice. Unfortunately, I am forced to either ignore this card or drop a water block on the GPU in the fall when I move to my new build.


Ah, I see. Well, if you don't mind the noise of the FE for that long, it could still be an option.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *PasK1234Xw*
> 
> One thing I'm seeing with reference vs AIB cards: the reference hits a wall unless you land a golden sample, and the odds of that seem very slim so far.
> Even worse than the 1080 FE.
> 
> AIB cards like the Gaming X are getting better results at slightly lower clocks than an overclocked FE due to the higher power limit.
> 
> 2000-2050, stock vram:
> 
> http://www.3dmark.com/3dm/19082809
> 
> http://i.imgur.com/XQuWy3Q.png
> 
> Even at 2050 and keeping the GPU under 65C I'm hitting a wall, and even clocking the vram yields no results whatsoever.


I've been lucky, but my card can OC at low voltages, so I'm not really hitting those power limits.

Testing at 2075 and 2088 at 1.040V on stock and MSI bios gives me similar results on both runs for my FE.
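For context on why those two clocks score alike: a purely GPU-bound benchmark can gain at most the core-clock ratio, which here is well under 1%, smaller than typical run-to-run variance. A minimal sketch of that sanity check; the helper is illustrative, not from any benchmarking tool:

```python
# Upper bound on benchmark gain from a core clock bump: a GPU-bound
# score scales at best linearly with core clock, so the clock ratio
# caps the gain. 2075 -> 2088 MHz is only about a 0.6% ceiling.
def max_expected_gain(clock_a_mhz: float, clock_b_mhz: float) -> float:
    """Upper bound on score gain (%) going from clock_a to clock_b."""
    return (clock_b_mhz / clock_a_mhz - 1) * 100

if __name__ == "__main__":
    # The two test clocks mentioned above; the gap is within noise.
    print(f"{max_expected_gain(2075, 2088):.2f}%")
```

Anything below that ceiling disappears into run-to-run variance, which is why the stock and MSI BIOS runs look the same.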


----------



## KingEngineRevUp

Quote:


> Originally Posted by *eXteR*
> 
> Here's mine, with clock jumping between 1984-2023 due to power limit. VRam +400 (12800).
> 
> [email protected] 4.4Ghz XMP Profile


Are you on water or just the air cooler? To me it sounds more like thermal throttling.


----------



## kevindd992002

Quote:


> Originally Posted by *pez*
> 
> My point is that if you start an RMA with them, there should be no way they can tell where you got your card from.
> 
> Also, my Ti could have come today, but FedEx doesn't want to
> 
> 
> 
> 
> 
> 
> 
> .


Is this the same with EVGA?

Also, what differences do most FE's have? Why are there FE's made by AIB partners?


----------



## Alwrath

Quote:


> Originally Posted by *eXteR*
> 
> Here's mine, with clock jumping between 1984-2023 due to power limit. VRam +400 (12800).
> 
> [email protected] 4.4Ghz XMP Profile


Any chance you could post a 4K score? Just curious what you get with your setup. My cousin has an ivy bridge as well.


----------



## Wishmaker

Well, I ordered the 1080 Ti Founders for EK water block compatibility. With NVIDIA just launching a new Titan card, I cancelled my order and will wait for NVIDIA to make up their mind.


----------



## Benny89

Quote:


> Originally Posted by *Wishmaker*
> 
> Well, I ordered the 1080 Ti Founders for EK water block compatibility. With NVIDIA just launching a new Titan card, I cancelled my order and will wait for NVIDIA to make up their mind.


Nice, welcome to the club!







Can't wait for my STRIX. The waiting is killing me. I wanna immediately go play BF1 and Witcher 3 when I get it!


----------



## PasK1234Xw

Quote:


> Originally Posted by *eXteR*
> 
> Here's mine, with clock jumping between 1984-2023 due to power limit. VRam +400 (12800).
> 
> [email protected] 4.4Ghz XMP Profile


Can you post results with stock vram?
I'm starting to think my vram is garbage; if I increase the slider at all and hit apply, my screen flashes for a quick second.


----------



## Wishmaker

Quote:


> Originally Posted by *Benny89*
> 
> Nice, welcome to the club!
> 
> 
> 
> 
> 
> 
> 
> Can't wait for my STRIX. THe waiting is killing me. Wanna immiedietly go play BF1 and Witcher 3 when I get it!


Thanks... but as I said, I cancelled the 1080 Ti because it is not top dog anymore.


----------



## KedarWolf

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> About a 4% improvement same clock speeds.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And here is TimeSpy exact same settings both runs, 2100 core, 6162 memory.
> 
> 1080 Ti FE
> 
> 
> 
> 1080 Ti Strix BIOS
> 
> 
> 
> 
> 
> So your FE in stock BIOS was running 2100 to get that result?
> 
> I get a slightly higher GPU score in TimeSpy, and my clocks fluctuate from 2050 to 2000 a lot from hitting the power limit, with VRAM stock.
> And in Heaven I got over 152fps.
> 
> Like I said, compare to people who can hit 2100 on the stock FE BIOS.

That compare IS 2100 on the FE.


----------



## PasK1234Xw

Quote:


> Originally Posted by *KedarWolf*
> 
> That compare IS 2100 on the FE.


lol, don't roll your eyes, I score better than you at lower clocks than your "2100".


----------



## KedarWolf

Quote:


> Originally Posted by *Alwrath*
> 
> Quote:
> 
> 
> 
> Originally Posted by *eXteR*
> 
> Here's mine, with clock jumping between 1984-2023 due to power limit. VRam +400 (12800).
> 
> [email protected] 4.4Ghz XMP Profile
> 
> 
> 
> 
> 
> Any chance you could post a 4K score? Just curious what you get with your setup. My cousin has an ivy bridge as well.

I'm running Heaven at a custom windowed resolution of 1920x1080, not just on the Extreme preset, to get my results.


----------



## Pandora's Box

Quote:


> Originally Posted by *PasK1234Xw*
> 
> One thing I'm seeing with reference vs AIB cards: the reference hits a wall unless you land a golden sample, and the odds of that seem very slim so far.
> Even worse than the 1080 FE.
> 
> AIB cards like the Gaming X are getting better results at slightly lower clocks than an overclocked FE due to the higher power limit.
> 
> 2000-2050, stock vram:
> 
> http://www.3dmark.com/3dm/19082809
> 
> http://i.imgur.com/XQuWy3Q.png
> 
> Even at 2050 and keeping the GPU under 65C I'm hitting a wall, and even clocking the vram yields no results whatsoever.


Don't say that too loudly. There are some delusional folks in this thread that think the Founders Edition is the best 1080 Ti variant out there.

Edit: Folks claiming to be running at 2.1 or 2.0GHz on the Founders Edition refuse to run the Firestrike Extreme stress test when asked to prove it. I have had a Founders Edition 1080 Ti. It will throttle down to 1850-1900 at best under Firestrike Extreme or Witcher 3 4K. And you're running at least 70-75% fan speed, AKA a wind tunnel.


----------



## PasK1234Xw

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm running Heaven at a custom windowed resolution at 1920x1080 , not just on the Extreme preset to get my results.


You're using the Strix vBIOS at 2100:

http://i.imgur.com/yfsbH6p.jpg

Me at 2050, stock vram, with downclocking:

http://i.imgur.com/XQuWy3Q.png


----------



## KedarWolf

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> That compare IS 2100 on the FE.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> lol dont role eyes i score better than you at lower clocks than your "2100"

You score better than 10880, AND your Heaven is running a custom resolution, not just a preset, right?

The fact that I'm NOT using the Extreme preset is why you are getting better scores.

But I should state in the video that I'm not using a preset, and the resolution I'm running at. You made me figure that out, and I'm grateful for that.


----------



## Benny89

The fights over a 50MHz difference have begun (jk).


----------



## PasK1234Xw

Lol, it's not that; it's that us Pascal veterans were already messing with cross-flashing back with our 1080s. Nothing has changed.
Quote:


> Originally Posted by *KedarWolf*
> 
> You score better than 10880 AND your Heaven is running a custom resolution, not just a preset, right?
> 
> I'm NOT using the Extreme preset is why you are getting better scores.
> 
> But I should state in the video I'm not using a preset and the resolution I'm running at, you made me figure that out, and I'm grateful for that.


lol, it even says 1080p in my pic, same as you, buddy. I'm done arguing. You also have this guy hitting over 160fps at lower clocks:

http://cdn.overclock.net/2/27/2721e5af_heaven_2017_04_05_19_19_49_475.png


----------



## PasK1234Xw

edit double post


----------



## Bishop07764

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Can you post results with stock vram?
> I'm starting to think my vram is garbage; if I increase the slider at all and hit apply, my screen flashes for a quick second.


I was having the same issue after installing the latest Rivatuner Statistics Server. I uninstalled, and it completely went away.

Quote:


> Originally Posted by *KedarWolf*
> 
> That compare IS 2100 on the FE.


Are you sure that your memory isn't clocked too high or something? This is my graphics score in Timespy. I remember people seeing this when I had my gtx 1080 Seahawk EK. Higher clocks didn't always translate into more performance. That cpu is beastly by the way. Awesome clocks.

http://www.3dmark.com/3dm/18939504?


----------



## PasK1234Xw

^ Rerun with G-Sync off; your score will be even higher, same with other benchmarks. In Heaven I gain back about 5fps with it off.

Also, I'll check out RivaTuner, but it was doing this before I updated it yesterday. I still think my card is garbage.


----------



## cekim

Quote:


> Originally Posted by *alucardis666*
> 
> Very nice!
> 
> Are you running SLI?
> 
> 200fps in BF1 is nuts!


Yes, SLI.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *PasK1234Xw*
> 
> ^ rerun with gsync off your score will be even higher same with other benchmarks. Heaven i gain back about 5fps with it off
> 
> also ill check out Riva tuner but was doing before i updated it yesterday still think my card is garbage


Yeah, I run all my tests with G-Sync on; I need to remember to turn it off.

Flashing the MSI BIOS didn't really make much of a difference for me, but then again it made my card run hotter.

Do you have a run at 1440p for me to compare to?


----------



## PasK1234Xw

Lol, Nvidia just released a new Titan Xp with 3840 cores.

https://www.nvidia.com/en-us/geforce/products/10series/titan-xp/
Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah I run all my test with g-sync, I need to remember to turn it off.
> 
> But flashing the msi bios didn't really make too much of a difference for me but then again it made my card run hotter.
> 
> Do you have a run at 1440P for me to compare to?


I score about 92fps at 1440p in Heaven; I'll run the test in a few and post.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Lol nvidia just released a new titan Xp with 3840 cores
> 
> https://www.nvidia.com/en-us/geforce/products/10series/titan-xp/
> i score about 92fps at 1440 in heaven ill run test in few and post


Yeah, I'm not sure if the Asus BIOS on the FE is helping.

Here's another user's results. This is the stock BIOS for the FE.



http://imgur.com/V8th5pF


----------



## cekim

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah I'm not sure if the Asus bios on the FE is helping.
> 
> Here's another user's results.
> 
> 
> 
> http://imgur.com/V8th5pF


I think you are right to be suspicious about the specifics, as PrecisionX and the Asus tweak tool both show different (lower) numbers than Heaven.

However, so far I'm seeing higher clock rates in PrecisionX and TTII that map to higher frame rates in BF1.

BF1 is a really sloppy way to benchmark, though, as a given location in a given map behaves very differently; but map-to-map, typical play shows a noticeable bump in frame rates and lower temps for me so far.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *cekim*
> 
> I think you are right to be suspicious about the specifics as precisionX and Asus tweak tool both show different (lower) numbers than Heaven.
> 
> However, so far I'm seeing higher clock rates in precisionX and TTII that map to higher frame rates in BF1.
> 
> BF1 is a really sloppy way to benchmark though as a given location in a given map behaves very differently, but map-to-map typical play shows a noticeable bump in frame rates and lower temps for me so far.


Heaven is weird; it never showed me my clocks on Maxwell or Pascal. I'd never trust it. GPU-Z and MSI Afterburner graphs agree with one another.


----------



## DooRules

Heaven has always been off with clocks shown. Just go by AB and pay no attention to what Heaven says.


----------



## pez

Quote:


> Originally Posted by *kevindd992002*
> 
> Is this the same with EVGA?
> 
> Also, what difference do most FE's have? Why are there FE'smade by AIB partners?


EVGA requires registration for warranty, so I don't think so.
Quote:


> Originally Posted by *Wishmaker*
> 
> Well, I ordered the 1080 Ti Founders for EK water block compatibility. With NVIDIA just launching a new Titan card, I cancelled my order and will wait for NVIDIA to make up their mind.


Yeah, for that reason my Ti is definitely going back now, and a TXp (I guess that's how I signify it now) is on its way. I mean, it's that or I'm back to going SLI every time.


----------



## Pandora's Box

One thing about this MSI Gaming X card: the fans shutting off on the desktop is a gimmick and pointless. The card runs at 48-52C idling on the desktop, probably because it occasionally spikes to 1500MHz for GPU-accelerated browsing. Setting the fan to 30% when below 45C still results in dead-silent operation (I can't tell the difference between fans off and 30%). It also results in idle temps of 30C, a 20C drop...
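The custom curve described above can be sketched as a simple temperature-to-duty mapping. This is a hypothetical illustration of the idea (a 30% floor below 45C instead of fan-stop); the ramp endpoints are assumptions, not MSI's actual curve:

```python
# Hypothetical sketch of the fan curve described above: instead of the
# stock zero-RPM mode, hold a quiet 30% floor below 45C, then ramp
# linearly to 100% at 85C. The ramp endpoints are illustrative
# assumptions, not values from the Gaming X BIOS or Afterburner.
def fan_percent(temp_c: float) -> int:
    """Map a GPU core temperature (C) to a fan duty cycle (%)."""
    if temp_c < 45:
        return 30                    # silent floor instead of fan-stop
    if temp_c >= 85:
        return 100                   # flat out at the hot end
    slope = (100 - 30) / (85 - 45)   # percent per degree C
    return round(30 + (temp_c - 45) * slope)

if __name__ == "__main__":
    for t in (30, 45, 65, 90):
        print(f"{t}C -> {fan_percent(t)}%")
```

In practice you would draw the same shape in Afterburner's fan curve editor; the point is just that a low constant floor beats fan-stop once idle clocks spike.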


----------



## fynxer

Quote:


> Originally Posted by *Pandora's Box*
> 
> One thing about this MSI Gaming X card - fans shutting off on the desktop is a gimmick and pointless. Card runs at 48-52C idling on the desktop, probably because it occasionally spikes to 1500mhz for GPU accelerated browsing. Setting the fan to 30% when below 45C results in still deadly silent operation (I cant tell the difference between fans off and 30%). Also results in idle temps of 30C, a 20C drop...


Does it really matter? By stopping them you save wear on the fans and extend their life.

48-52C is so low anyway; a temp that doesn't actually burn your skin should be completely safe for the GPU.

Or why are you bent on keeping a 30C temp at the expense of fan life?


----------



## Pandora's Box

Quote:


> Originally Posted by *fynxer*
> 
> Does it really matter, by stopping them you save the grind on the fans and extend their life.
> 
> 48-52C is so low anyways, temp that actually don't burn your skin should be completely safe for the gpu.
> 
> Or why are you bent on keeping 30C temp at the expense of fan life?


OCD? lol.

By the time the fan dies from running at 30% I hope to be on a GTX 1380 lol.


----------



## cekim

Quote:


> Originally Posted by *Pandora's Box*
> 
> OCD? lol.


Not to be underestimated in this community.


----------



## Besty

The Gigabyte Xtreme BIOS has been posted; has anyone applied it to an FE card yet?

https://www.techpowerup.com/vgabios/190963/gigabyte-gtx1080ti-11264-170331-2


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Pandora's Box*
> 
> One thing about this MSI Gaming X card - fans shutting off on the desktop is a gimmick and pointless. Card runs at 48-52C idling on the desktop, probably because it occasionally spikes to 1500mhz for GPU accelerated browsing. Setting the fan to 30% when below 45C results in still deadly silent operation (I cant tell the difference between fans off and 30%). Also results in idle temps of 30C, a 20C drop...


That's odd; other users on reddit have lower idles, I think. Good excuse for a re-TIM.


----------



## Pandora's Box

Quote:


> Originally Posted by *SlimJ87D*
> 
> That's odd, other users on reddit have lower idles I think. Good excuse for a re-tim


I'm running 165hz on my monitor. It causes the GPU to not always downclock to desktop speeds.


----------



## Menta

Quote:


> Originally Posted by *Pandora's Box*
> 
> One thing about this MSI Gaming X card - fans shutting off on the desktop is a gimmick and pointless. Card runs at 48-52C idling on the desktop, probably because it occasionally spikes to 1500mhz for GPU accelerated browsing. Setting the fan to 30% when below 45C results in still deadly silent operation (I cant tell the difference between fans off and 30%). Also results in idle temps of 30C, a 20C drop...


Do you notice any electrical noise when stressing the card or playing a game at high fps?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Pandora's Box*
> 
> I'm running 165hz on my monitor. It causes the GPU to not always downclock to desktop speeds.


Yeah, I run my desktop at 144Hz for that reason... This problem has existed since Maxwell! Quite annoying.


----------



## Madness11

Hey guys, I have a Strix 1080 Ti OC but can't play Far Cry 3. FPS jumps between 55-70 all the time (1440p). Tell me, what am I doing wrong? (6900K, no OC)


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Madness11*
> 
> Hey guys, I have a Strix 1080 Ti OC but can't play Far Cry 3. FPS jumps between 55-70 all the time (1440p). Tell me, what am I doing wrong? (6900K, no OC)


You're not giving us any data or graphs; that's what you're doing wrong.

Post GPU-Z or Afterburner charts.


----------



## fynxer

Quote:


> Originally Posted by *Pandora's Box*
> 
> OCD? lol.
> 
> By the time the fan dies from running at 30% I hope to be on a GTX 1380 lol.


You could think so, but it's more that I'm lazy: it avoids pushing dust into the card, so I don't have to clean it until I sell it.


----------



## Bishop07764

Well. It looks like I can push my memory a bit farther. This is probably close to the ragged edge for my memory clock.









http://www.3dmark.com/3dm/19088344?

Slight boost in Firestrike.

http://www.3dmark.com/3dm/19088949?


----------



## Slackaveli

Quote:


> Originally Posted by *flint314*
> 
> Same here. I had BF1 running with my 1080 Ti on a 4770K @ 4.7GHz with a 165Hz G-Sync monitor. I can tell everyone that BF1 is one of those games that will hugely profit from anything with more than 4 cores. The 4770K was limiting my 1080 and my 1080 Ti - they barely ran above 80% @ 115 res scale.
> I then put them into my work machine with a 5820K @ 4.5GHz and have since been seeing them running at 99%. I recommend everyone that intends to primarily play BF1 to buy at least a 6-core. Any i5 will not suffice, and even a 7700K has FPS drops now and then in the 0.1 percentile, and it will bother you. BF1 on 64p maps needs a lot of CPU power and no 4-core will deliver enough.


well, there IS ONE that will...


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Now imagine if Nvidia and Intel started releasing new hardware that was around ~2% faster each generation.


i almost spit coffee
Quote:


> Originally Posted by *cekim*
> 
> Able to flash the strix bios into FE without issue - 2x1080ti (EVGA).
> 
> Curiously, precisionX and heaven OSD show different clock speeds?
> 
> Heaven is now running at 2050... I'll have to try more tomorrow.


Heaven is a lie
Quote:


> Originally Posted by *cekim*
> 
> Still baffled by:
> 1. it is running quite a bit cooler now with the strix bios
> 2. precisionX and heaven do not agree on the gpu clock rate.
> 
> But, it finished the test this way - no complaints... (it was 2100 all the way through)


doesnt discredit it in my eyes or anything, but, it's Heaven that be lyin
Quote:


> Originally Posted by *Exilon*
> 
> Finally dialed in on my 24/7 clocks by downclocking everytime BF1 crashes. Was hoping to hit 2100 with 1.093V but oh well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2076 core 6156 mem.
> 
> http://www.3dmark.com/3dm/19078334?
> 
> 
> 
> Hats off to Micron and Nvidia for making such ludicrously fast memory
> 
> 550GB/s on a 384-bit bus is crazy


Yep. I found I could pass multiple Heaven runs at +152, but not +165 (although +165 is my TimeSpy max run), yet even +140 crashes in Battlefield 1, and then it crashed at +128 (that was disheartening) after a 2-hour session. But I played 4 hours w/o a crash at +114 (the same as a pre-clocked Aorus, so I guess I'm guaranteed to be at least no worse off than my FE when the Aorus comes this afternoon). So, my advice if you are crashing in games is to run 3 bins below your best Heaven/TimeSpy settings (-13MHz per step).
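That step-down rule of thumb can be put in numbers. A minimal sketch, assuming Pascal's roughly 13MHz clock bins; the helper and its defaults are illustrative, not from any overclocking tool:

```python
# Rough numeric sketch of the rule of thumb above: Pascal moves clocks
# in ~13MHz bins, so back your 24/7 gaming offset off three bins from
# the best benchmark-stable offset. Illustrative helper only.
BIN_MHZ = 13  # approximate size of one Pascal clock bin

def game_stable_offset(bench_offset_mhz: int, bins_back: int = 3) -> int:
    """Suggest a 24/7 core offset from a benchmark-stable offset."""
    return bench_offset_mhz - bins_back * BIN_MHZ

if __name__ == "__main__":
    # +152 passed repeated Heaven runs; three bins down lands at +113,
    # right next to the +114 that survived a 4-hour BF1 session.
    print(game_stable_offset(152))
```

It's only a starting point; long game sessions remain the real stability test.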


----------



## kevindd992002

Quote:


> Originally Posted by *Pandora's Box*
> 
> Don't say that too loudly. There are some delusional folks in this thread that think the Founders Edition is the best 1080 Ti variant out there.
> 
> Edit: Folks claiming to be running at 2.1 or 2.0GHz on the Founders Edition refuse to run the Firestrike Extreme stress test when asked to prove it. I have had a Founders Edition 1080 Ti. It will throttle down to 1850-1900 at best under Firestrike Extreme or Witcher 3 4K. And you're running at least 70-75% fan speed, AKA a wind tunnel.


What's your suggestion regarding the "get an FE if you're slapping a waterblock onto it" argument?


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Btw. Those benches are weird...
> 
> Here are benches from this site comparing the 5775C to the 4790K: http://www.anandtech.com/bench/product/1501?vs=1260
> 
> Aaaand... both CPUs are basically toe to toe. The difference is almost none, and sometimes the 4790K is even better... The 5775C was supposed to be better than the 4790K, correct? So I don't know what to think about it...


Are you seriously considering buying a 5775c and running it at 3.3ghz? Because that would be mad.


----------



## cekim

Quote:


> Originally Posted by *Slackaveli*
> 
> Heaven is a lie


"I don't know what its called, but I know the sound it makes when it LIES!!!"









Yes, it is definitely lying, but there do seem to be some real gains to be had; further quantification is required.


----------



## pez

Quote:


> Originally Posted by *kevindd992002*
> 
> What's your suggestion regarding the "get an FE if you're slapping a waterblock onto it" argument?


The FE is the cheapest around this time, outside of the MSI Armor and the Gigabyte OC model. If you're going to put it under water, the FE will be the easiest option for the time being. Plus, the trend is that all of these cards top out at about the same OCs no matter the cooler. Water just helps keep it more consistent.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> i almost spit coffee


You know you'd still buy it.


----------



## cekim

Quote:


> Originally Posted by *Slackaveli*
> 
> Are you seriously considering buying a 5775c and running it at 3.3ghz? Because that would be mad.


So, you are saying, I shouldn't clamp my 6950x to 1.2GHz 24/7 to save all of the electrons







but, but the baby seals? The Polar Bears? The spotted Pandas?


----------



## aylan1196

Long live the king? Titan Xp.


----------



## PasK1234Xw

I'm so tempted to get the new Titan. I broke even selling my 1080s, so I didn't spend any more on my 1080 Ti. IDK though; between the Xp and a hybrid kit, that's $700 out of pocket.
Quote:


> Originally Posted by *Bishop07764*
> 
> I was having the same issue after installing the latest Rivatuner Statistics Server. I uninstalled, and it completely went away.


Figured out it was because I had the monitor set to 165Hz. It's not affecting clocks or anything, just a fluke with the driver.


----------



## Slackaveli

Quote:


> Originally Posted by *Wishmaker*
> 
> Thanks... but as I said, I cancelled the 1080 Ti because it is not top dog anymore.


Yeah, that announcement, even though I had suspected a new Titan, caught the world off guard. That thing is an animal. I'm not mad, though, as I wouldn't cough up another $480 for ~275 CUDA cores. But, dang, that run (1080 Ti top dog) lasted LESS THAN 30 days.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah that announcement, even though i have suspected a new titan, caught the world off guard. That thing is an animal. Im not mad, though, as I wouldnt cough up another $480 for ~275 cuda cores. But, dang, that run (1080ti top dog) lasted for LESS THAN 30 days.


Wait... what're we talking about?


----------



## Madness11

Quote:


> Originally Posted by *SlimJ87D*
> 
> You're not giving any data of graphs to us, that's what you're doing wrong.
> 
> GPU-Z or afterburner charts.


Hey, my card only loads to 40-60% max and I don't know why.


----------



## cekim

I should know better than to log into steam... What a disaster... Oh well, firestrike will have to wait.


----------



## pez

Quote:


> Originally Posted by *alucardis666*
> 
> Wait... what're we talking about?


NVIDIA threw the world on its axis this morning by announcing 'Big Pascal'.


----------



## jelome1989

Any ideas as to how the Titan Xp will fare against AIB GTX 1080 Ti if it's not watercooled?


----------



## KedarWolf

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> That compare IS 2100 on the FE.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> lol dont role eyes i score better than you at lower clocks than your "2100"

Well, what do I go by? Same 2100, +650.

Heaven on FE BIOS.



Not sure I could keep these clocks on the FE BIOS, though; my ambient temps are quite low right now, not usually like this, and the graphics driver locks up with warmer temps. The Strix BIOS is fine at these clocks.









I normally run the FE BIOS at 2062, 1.031v, +610 memory. Let me do a run at those settings.

I get about 152 FPS in Heaven on the Strix BIOS, but 160 on the FE BIOS. In TimeSpy I get 10800 on the Strix BIOS, about 10500 on the FE BIOS.

Which results should I go by? I'm thinking TimeSpy would be the more reliable of the two.









Edit: Deleted the TimeSpy; that was my Strix result and I no longer have my FE result, but it was about 10500 the last time I tested.


----------



## alucardis666

Quote:


> Originally Posted by *pez*
> 
> NVIDIA threw the world on its axis this morning by announcing 'Big Pascal'.


So I should return my evga card to newegg and buy that, is what you're saying?


----------



## Slackaveli

Quote:


> Originally Posted by *Bishop07764*
> 
> I was having the same issue after installing the latest Rivatuner Statistics Server. I uninstalled, and it completely went away.
> Are you sure that your memory isn't clocked too high or something? This is my graphics score in Timespy. I remember people seeing this when I had my gtx 1080 Seahawk EK. Higher clocks didn't always translate into more performance. That cpu is beastly by the way. Awesome clocks.
> 
> http://www.3dmark.com/3dm/18939504?


Agreed, b/c here is mine: http://www.3dmark.com/spy/1447239, and mine is on air at a semi-stable +165.
Quote:


> Originally Posted by *pez*
> 
> EVGA requires registration for warranty, so I don't think so.
> Yeah for that reason, my Ti is definitely going back now and a TXp (I guess that's how I signify it now) is on its way. I mean it's that or I'm back to going SLI every time
> 
> 
> 
> 
> 
> 
> 
> .


I GUARANTEE YOU that it is no accident they waited until 34 days after pre-orders began, when the Nvidia store is no longer accepting returns, even though the street launch date was March 9th. I had to get a waiver for mine, and I may have been the very last refund issued. I am so glad I returned mine before this was announced, b/c those waivers are over with now. I had to mention YouTube to get it as it is...


----------



## Jbravo33

I can already see myself selling these cards, at least the two I have, once a Seahawk or Hydro Copper releases. The fan curve does not play well with SLI, especially having one Founders and another MSI Founders. You're better off going with two of the same, with serial numbers in sequence. The 2 Classifieds I had were like that, and I was able to OC to 2166 on air with 65% fan. That's what happens when you have zero patience. I can get one card to 2113 through a Heaven run; the other card won't get past 2050 without crashing.


----------



## mouacyk

Quote:


> Originally Posted by *alucardis666*
> 
> So I should return my evga card to newegg and buy that, is what you're saying?


You've been flashing the wrong BIOS to try to get beast performance out of your 1080 Ti. Instead, you now flash beast Titan Xp onto your lackluster BIOS for more corepleteness.


----------



## Slackaveli

Quote:


> Originally Posted by *cekim*
> 
> So, you are saying, I shouldn't clamp my 6950x to 1.2GHz 24/7 to save all of the electrons
> 
> 
> 
> 
> 
> 
> 
> but, but the baby seals? The Polar Bears? The spotted Pandas?


Yeah, I'm straight hippy-raised, tree-hugging, an unabashed socialist, and I still wouldn't dare not OC my i7! Hell, they're lucky I don't lock C-states and turn off all power-saving features, but the tree-hugger inside me makes me do at least that.







Quote:


> Originally Posted by *alucardis666*
> 
> Wait... what're we talking about?


https://www.nvidia.com/en-us/geforce/products/10series/titan-xp/ THIS, for sale NOW, before we can even find Tis in stock. I guess Vega really is a beast, too. What a year for GPUs.....
Quote:


> Originally Posted by *pez*
> 
> NVIDIA threw the world on its axis this morning by announcing 'Big Pascal'.


And called it Titan Xp (officially) so what is Titan X(p) 2016 then? FAIL on the naming!!


----------



## Slackaveli

edit: double post


----------



## alucardis666

Quote:


> Originally Posted by *mouacyk*
> 
> You've been flashing the wrong BIOS to try to get beast performance out of your 1080 Ti. Instead, you now flash beast Titan Xp onto your lackluster BIOS for more corepleteness.










If only...








Quote:


> Originally Posted by *Slackaveli*
> 
> Yeah, Im a straight Hippy raised, tree-hugging, unabashed socialist and I still wouldn't dare not OC my i7! Hell, they lucky I dont lock C-states and turn off all power saving features, but my tree-hugger inside makes me do at least that.
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.nvidia.com/en-us/geforce/products/10series/titan-xp/ THIS, for sale NOW, before we can even find Tis in stock. I guess Vega really is a beast, too. What a year for GPUs.....
> And called it Titan Xp (officially) so what is Titan X(p) 2016 then? FAIL on the naming!!


I'm throwing my Ti on ebay now and buying a new Titan.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> 
> 
> 
> 
> 
> 
> 
> If only...
> 
> 
> 
> 
> 
> 
> 
> 
> I'm throwing my Ti on ebay now and buying a new Titan.


THAT'S how you get off a dog! Your patience since yesterday... WOW, that paid off. GO NOW, before the world realizes what just happened, these are going to be gone any minute I predict. People are still on scramble and gather money mode. HURRY. I want to see you get a golden one, too. These are DEFINITELY binned as they have to be damn near perfect to have all cores. These are like freaking quadros! Damn, nvidia.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> THAT'S how you get off a dog! Your patience since yesterday... WOW, that paid off. GO NOW, before the world realizes what just happened, these are going to be gone any minute I predict. People are still on scramble and gather money mode. HURRY. I want to see you get a golden one, too. These are DEFINITELY binned as they have to be damn near perfect to have all cores. These are like freaking quadros! Damn, nvidia.


----------



## fynxer

Quote:


> Originally Posted by *mouacyk*
> 
> You've been flashing the wrong BIOS to try to get beast performance out of your 1080 Ti. Instead, you now flash beast Titan Xp onto your lackluster BIOS for more corepleteness.


What if the Titan Xp BIOS unlocks all the GP102 cores? That'd be just too much power to handle. Best not try it.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*


SLAP!!! Make sure you gave them a good delivery address this time









holy sh.... well played. If a man is going to pay taxes on a gpu, it should be on THE FULL FAT CHIP ON LAUNCH DAY







Quote:


> Originally Posted by *pez*
> 
> NVIDIA threw the world on its axis this morning by announcing 'Big Pascal'.


And called it Titan Xp (officially) so what is Titan X(p) 2016 then? FAIL on the naming!!


----------



## Slackaveli

p


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> SLAP!!! Make sure you gave them a good delivery address this time
> 
> 
> 
> 
> 
> 
> 
> 
> 
> holy sh.... well played. If a man is going to pay taxes on a gpu, it should be on THE FULL FAT CHIP ON LAUNCH DAY


It'll be here tomorrow morning!









And yes I gave the right address. I gave the right address to Amazon too >.<

Will I need to re-TIM this card or do Titans work differently?


----------



## Rhadamanthys

Quote:


> Originally Posted by *pez*
> 
> The FE is the cheapest around this time outside of the MSI Armor and Gigabyte OC model. If you're going to just put it under water, the FE will be the easiest option for the time being. Plus, the trend is that all of these cards top out to about the same OCs no matter the cooler. Water just helps keep it more consistent.


So the extra power limit of the custom cards won't help overclocking at all when put under water?


----------



## alucardis666

If anyone is interested...

http://www.ebay.com/itm/272621203851?ssPageName=STRK:MESELX:IT&_trksid=p3984.m1558.l2649


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> It'll be here tomorrow morning!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And yes I gave the right address. I gave the right address to Amazon too >.<
> 
> Will I need to retim this card or do Titan's work differently?


NVIDIA TITAN Xp (2017) Specifications
Process Node: 16nm
GPU Die Size: 471mm² (TBC)
SMs: 60
CUDA Cores Per SM: 64
FP32 CUDA Cores (Total): 3840
Core Clock: 1582 MHz
Compute Power: 12.15 TFLOPs
Memory Interface: 384-bit
Memory Size: 12 GB GDDR5X
Memory Clock: 11.4 Gbps
Bandwidth: 547.7 GB/s
TDP: 250W

should have over 600 GB/s of bandwidth after OC

"Of course, when talking about bandwidth its usually a good idea to look at the increase in bandwidth per core since an increase in bandwidth is usually offset by some amount by the increase in core count. In the older TITAN X's case, you had 480 GB/s of bandwidth available to 3584 cores, or roughly 134 MB/s/core. The TITAN Xp has 3840 cores with 547.7 GB/s worth of bandwidth for roughly 142.6 MB/s/core which is a pretty decent gain. The important thing when dealing with an increasing core count is to make sure that the bandwidth available per core doesn't drop while you scale, and this isn't the case here - so you can expect some decent returns with the TITAN Xp."

http://wccftech.com/nvidia-titan-xp-graphics-card-3840-cores-1200-usd/
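If anyone wants to reproduce the bandwidth-per-core numbers in that quote, the arithmetic is simple. A quick sketch (specs taken from the post above, treated as given, not independently verified):

```python
# Bandwidth-per-core check for the figures quoted above.
# Specs are as listed in the thread; treat them as assumptions.
cards = {
    "TITAN X (Pascal)": {"bw_gbs": 480.0, "cores": 3584},
    "GTX 1080 Ti":      {"bw_gbs": 484.0, "cores": 3584},
    "TITAN Xp":         {"bw_gbs": 547.7, "cores": 3840},
}

for name, c in cards.items():
    # GB/s -> MB/s, then split evenly across the CUDA cores
    mb_per_core = c["bw_gbs"] * 1000 / c["cores"]
    print(f"{name}: {mb_per_core:.1f} MB/s per core")
# TITAN X comes out to ~133.9 and TITAN Xp to ~142.6, matching the quote.
```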


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> NVIDIA TITAN Xp (2017) Specifications Comparison
> NVIDIA Graphics Card TITAN X (Pascal) GTX 1080 Ti TITAN Xp
> Process Node 16nm 16nm 16nm
> GPU Die Size 471mm² 471mm² 471mm² (TBC)
> SMs 56 56 60
> CUDA Cores Per SM 64 64 64
> FP32 CUDA Cores (Total) 3584 3584 3840
> Core Clock 1531 1584 1582
> Compute Power 10.97 TFLOPs 11.34 TFLOPs 12.15 TFLOPs
> Memory Interface 384-bit 352-bit 384-bit
> Memory Size 12 GB GDDR5X 11 GB GDDR5X 12 GB GDDR5X
> Memory Clock 10 Gbps 11 Gbps 11.4 Gbps
> Bandwidth 480 GB/s 484 GB/s 547.7 GB/s
> TDP 250W 250W 250W
> MSRP $1200 $700` $1200
> 
> ugh, that was a bad copy. lemme try to fix (edit)


Here's hoping it can do 2100mhz


----------



## PasK1234Xw

Well I caved returning my ti to microcenter and got titan xp


----------



## alucardis666

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Well I caved returning my ti to microcenter and got titan xp


Welcome to the club!


----------



## Slackaveli

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Well I caved returning my ti to microcenter and got titan xp


Today's theme. We'll miss you guys in here :/


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> todays theme. We'll miss you guys in here :/












I'm not going anywhere! I'll just have another thread to post in aswell!


----------



## rcfc89

Quote:


> Originally Posted by *alucardis666*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm not going anywhere! I'll just have another thread to post in aswell!


Speaking of that thread. Where is it?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Madness11*
> 
> Hey ))) my card load only 40-60% Max and i dont know Why )))


You need to share data graphs. Sounds like it's your cpu.


----------



## alucardis666

Quote:


> Originally Posted by *rcfc89*
> 
> Speaking of that thread. Where is it?


http://www.overclock.net/t/1627390/official-2017-nvidia-titan-xp-owners-thread

I just made one!


----------



## PasK1234Xw

Quote:


> Originally Posted by *alucardis666*
> 
> Welcome to the club!


Yea, not happy, this card seems to be underperforming. I really hope nothing changed on the PCB; I'll be putting a hybrid kit on it as it's no doubt going to run very hot.


----------



## alucardis666

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Yea not happy with this card seems to be under performaning. I really hope nothing changed on pcb I'll be putting hybrid kit as no doubt going to run very hot.


That's my intention as well, though I plan to rebuy the 1080 Ti version of the kit and cool the VRMs









http://www.evga.com/products/product.aspx?pn=400-HY-5388-B1


----------



## cekim

So, here's the uncomfortable question:
Did Nvidia make the Titan Xpp or epeen edition because the Ti over-compared to the existing Xp? Or because they got their hands on something from Team Red?


----------



## alucardis666

Quote:


> Originally Posted by *cekim*
> 
> So, here's the uncomfortable question:
> Did Nvidia make the Titan Xpp or epeen edition because the Ti over-compared to the existing Xp? Or because they got their hands on something from Team Red?


Probably the first.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm not going anywhere! I'll just have another thread to post in aswell!


good. I like you :


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> good. I like you :


D'awww


----------



## Slackaveli

Quote:


> Originally Posted by *cekim*
> 
> So, here's the uncomfortable question:
> Did Nvidia make the Titan Xpp or epeen edition because the Ti over-compared to the existing Xp? Or because they got their hands on something from Team Red?


a little of both. I've long called for Vega to beat the 1080. It may be close enough to the 1080 Ti that they wanted to guarantee top-dog status for however many years running. Also, b/c of demand. They can't keep the Ti in stock, and these will sell, so why not?

On that naming scheme:
NVIDIA GEFORCE GTX TITAN X
NVIDIA TITAN X
NVIDIA TITAN Xp
These guys are jokesters. They know this is supposed to be the Nvidia Titan X Black.


----------



## Madness11

Quote:


> Originally Posted by *SlimJ87D*
> 
> You need to share data graphs. Sounds like it's your cpu.


Don't know, but it's only using 1 core (core 2). What do I need to do?


----------



## Tikerz

Quote:


> Originally Posted by *Pandora's Box*
> 
> Folks claiming to be running at 2.1 or 2.0Ghz on the founders edition - refusing to run firestrike extreme stress test when asked to prove it.


I admit I'm too cheap to buy the full version of 3DMark to run Firestrike Extreme.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Madness11*
> 
> dont know
> 
> 
> 
> 
> 
> 
> 
> but use only 1 core (core2) and what i need to do ?)


Download GPU-Z and watch a YouTube video on how to use it. Use MSI Afterburner and take a screenshot of both graphs for us.

YouTube a how-to.


----------



## cekim

Quote:


> Originally Posted by *Slackaveli*
> 
> a little of both. I've long called for Vega to beat 1080. It may be close enough to 1080TI that they wanted to guarantee top dog status for however many years running. Also, b/c of demand. They cant keep Ti in stock, and these will sell, so why not?


Yep...

Team Red has the advantage of not urinating in your face if you need to work with openCL... So, I wish them both joy joy feelings and much success so that all of us get cheaper and better GPUs.


----------



## Foxrun

Well I just bought 2 1080Ti's for SLI, and am in the process of trying to sell my titan"x"p. I wonder what performance increases will see with the Xp over the Ti and xp. I feel a little burned right now.


----------



## alucardis666

Quote:


> Originally Posted by *Foxrun*
> 
> Well I just bought 2 1080Ti's for SLI, and am in the process of trying to sell my titan"x"p. I wonder what performance increases will see with the Xp over the Ti and xp. I feel a little burned right now.


Exactly how I felt... Then I bought into the hype.

Here's hoping this new card can actually *truly* do 4k 60fps all settings maxed.

*EDIT:*




Probably not though considering he's getting 48FPS min with 1080ti SLI.


----------



## cekim

I guess FurMark does not do SLI?

It does manage to show some 2000+ goodness on one GPU (note the 2088 hump - that was the FurMark run - so far it doesn't seem to throttle):


----------



## Slackaveli

fedex man has arrived. ill report later


----------



## Foxrun

Quote:


> Originally Posted by *alucardis666*
> 
> Exactly how I felt... Then I bought into the hybe.
> 
> Here's hoping this new card can actually *truly* do 4k 60fps all settings maxed.
> 
> *EDIT:*
> 
> 
> 
> 
> Probably not though considering he's getting 48FPS min with 1080ti SLI.


Yeah, I think it'll be best to stick with the Ti SLI for 4k max 60fps. On paper the Xp still doesnt seem like it would be able to maintain 60fps in demanding titles at 4k.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> fedex man has arrived. ill report later


Wooo!
Quote:


> Originally Posted by *Foxrun*
> 
> Yeah, I think it'll be best to stick with the Ti SLI for 4k max 60fps. On paper the Xp still doesnt seem like it would be able to maintain 60fps in demanding titles at 4k.


I refuse to do SLI... *EVER*! Just seems too wasteful.


----------



## cekim

Quote:


> Originally Posted by *alucardis666*
> 
> Exactly how I felt... Then I bought into the hybe.
> 
> Here's hoping this new card can actually *truly* do 4k 60fps all settings maxed.
> 
> *EDIT:*
> 
> 
> 
> 
> Probably not though considering he's getting 48FPS min with 1080ti SLI.


4K@144 or bust.... I really like playing at 4K, but FPS games penalize you for 60Hz... So, 1440 it is until that 4K/144 monitor comes out and then whatever SLI absurdity one needs to drive it.

Given the 180-200FPS I'm seeing now at 1440p/SLI, we are getting "close"...


----------



## alucardis666

Quote:


> Originally Posted by *cekim*
> 
> 4K@144 or bust.... I really like playing at 4K, but FPS games penalize you for 60hz... So, 1440 it is until that 4K/144 monitor comes out and then whatever SLI absurdity one needs to drive it.
> 
> Given the 180-200FPS I'm seeing now at 1440p/SLI, we are getting "close"...


1440p is *NOT* 4k. If you wanna do 4k 144+hz max settings you'd need like 3-4 of Nvidia's next gen Ti card *Volta Ti* in order to keep up.


----------



## cekim

Quote:


> Originally Posted by *alucardis666*
> 
> Wooo!
> I refuse to do SLi... *EVER*! Just seems to wasteful.


Lol. I suppose I have the benefit of "other things and stuff" to justify it, but particularly the "one down x2" approach (like 1080x2 or 1070x2 "back in the day" when the 1080 was the new hotness) can get you some pretty impressive performance per dollar.

Assuming your game makes use of it which is a bigger problem and not improving as rapidly as one would hope.


----------



## Foxrun

Quote:


> Originally Posted by *alucardis666*
> 
> Wooo!
> I refuse to do SLi... *EVER*! Just seems to wasteful.


When it works, it works very well. Most big titles support SLI now which makes it a better investment.
Quote:


> Originally Posted by *cekim*
> 
> 4K@144 or bust.... I really like playing at 4K, but FPS games penalize you for 60hz... So, 1440 it is until that 4K/144 monitor comes out and then whatever SLI absurdity one needs to drive it.
> 
> Given the 180-200FPS I'm seeing now at 1440p/SLI, we are getting "close"...


I'm getting older; 60fps still works for me. Guessing Vega or Volta may get us there.


----------



## cekim

Quote:


> Originally Posted by *alucardis666*
> 
> 1440p is *NOT* 4k. If you wanna do 4k 144+hz max settings you'd need like 3-4 of Nvidia's next gen Ti card *Volta Ti* in order to keep up.


You misread or I mistyped. I did not mean to imply it was at all. I very much understand the jump in resources required.

I'm saying that, given we can (today) with mere Ti SLI get nearly 200fps on ultra, we are getting close to the point that 4K with no MSAA (because why would you at that pixel density), and maybe some minor settings loss here or there, could be feasible at or near 144 within a generation.

Of course, then the next generation of games comes out and bloats it all up again...
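That "getting close" estimate can be sketched with naive pixel-count scaling. This assumes fps scales inversely with resolution and ignores CPU limits, memory bandwidth, and SLI scaling quirks, so treat it as a back-of-envelope upper bound:

```python
# Naive resolution-scaling estimate: assume fps scales inversely with
# pixel count (ignores CPU bottlenecks and SLI scaling efficiency).
res_1440p = 2560 * 1440   # 3,686,400 px
res_4k    = 3840 * 2160   # 8,294,400 px
scale = res_1440p / res_4k  # exactly 4/9

for fps_1440 in (180, 200):
    # 180-200 fps at 1440p maps to roughly 80-89 fps at 4K
    print(f"{fps_1440} fps @1440p -> ~{fps_1440 * scale:.0f} fps @4K")
```

So the 180-200fps figure at 1440p lands in the 80-90fps range at 4K under this crude model, which is why 4K/144 still looks a generation or two away.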


----------



## alucardis666

Quote:


> Originally Posted by *cekim*
> 
> You misread or I mistyped. I did not mean to imply it was at all. I very much understand the jump in resources required.
> 
> I'm saying that given that we can (today) with mere TI SLI get nearly 200fps on ultra. We are getting close to the point that 4K with no MSAA (because why) and maybe some minor loss here or there) could be feasible at or near 144 within a generation.
> 
> Of course, then the next generation of games comes out and bloats it all up again...


Yup. It's an uphill, never-ending battle.


----------



## jarble

This thread could use an updated FP with useful data, i.e. overclocking tips, links to the Afterburner mods, etc.


----------



## alucardis666

Ugh... why was my thread locked...

http://www.overclock.net/t/1627390/2017-nvidia-titan-xp-owners-thread/10#post_25991267


----------



## pez

Quote:


> Originally Posted by *alucardis666*
> 
> So I should return my evga card to newegg and buy that, is what you're saying?


Of course. It's what I'm doing when the Aorus arrives. That or I keep the Ti and sell the 1080 and 1070... nothing is certain yet. Still waiting on shipping confirmation.

Quote:


> Originally Posted by *Slackaveli*
> 
> agreed b/c here is mine http://www.3dmark.com/spy/1447239, and mine is on air and that is at a semi stable +165
> I GUARANTEE YOU that this is no accident: they waited until it's been 34 days since pre-orders began and the Nvidia store is no longer accepting returns, even though the street-date launch was March 9th. I had to get a waiver for mine and I may have been the very last refund issued. I am so glad I returned before this was announced, b/c those waivers are over with now. I had to mention YouTube to get it as is....


Yeah, not to mention it was announced on Twitter.
Quote:


> Originally Posted by *Rhadamanthys*
> 
> So the extra power limit of the custom cards won't help overclocking at all when put under water?


The Aorus Ti has a power limit of 150%, which translates to 375W. Yet TPU has found that it still clocks the same as the other AIBs.
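For anyone sanity-checking power-limit figures like that 150% = 375W conversion, a quick sketch (assuming the 250W reference TDP the FE spec lists; vendor BIOSes can use a different base, and the 120% example is just an illustrative slider value):

```python
# Power-limit slider math. Assumes a 250 W reference TDP (the FE spec);
# vendor BIOSes may use a different base, so treat this as illustrative.
TDP_W = 250

def limit_to_watts(percent):
    """Convert an Afterburner-style power-limit % into a wattage cap."""
    return TDP_W * percent / 100

print(limit_to_watts(150))  # 375.0 -> the Aorus figure quoted above
print(limit_to_watts(120))  # 300.0 -> an illustrative lower max slider
```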


----------



## Tikerz

Quote:


> Originally Posted by *alucardis666*
> 
> Ugh... why was my thread locked...
> 
> http://www.overclock.net/t/1627390/2017-nvidia-titan-xp-owners-thread/10#post_25991267


I think the mod doesn't know about the new card. He thinks the thread is for the other XP.


----------



## alucardis666

Quote:


> Originally Posted by *Tikerz*
> 
> I think the mod doesn't know about the new card. He thinks the thread is for the other XP.


Yea, I sent a pm. Hoping he fixes it.

That's part of the reason I stated 2017 in the title of the thread. Nvidia's naming scheme is just ridiculous.


----------



## Jbravo33

Right after I say I wanna return my cards I go on videocardz and see new titan. F my life! 1080ti's going back tomorrow!


----------



## Benny89

Quote:


> Originally Posted by *pez*
> 
> The Aorus Ti has a power limit of 150% which translates to 375w. Yet TPU has found that it still clocks the same as the other AIBs.


Because a bigger power limit allows clocks to stabilize above 2 GHz; however, we then hit the architecture limit, a clock wall at around 2100 MHz. That is why at 320-340W the power limit just lets you hit another wall, that is all.


----------



## Benny89

Quote:


> Originally Posted by *Jbravo33*
> 
> Right after I say I wanna return my cards I go on videocardz and see new titan. F my life! 1080ti's going back tomorrow!


Um, this is the 1080 Ti thread. But we are glad that you have a lot of money to throw away. Order me one also please.


----------



## alucardis666

Quote:


> Originally Posted by *Benny89*
> 
> Um, this is 1080Ti thread
> 
> 
> 
> 
> 
> 
> 
> . But we are glad that you have a lot of money to throw away
> 
> 
> 
> 
> 
> 
> 
> . Order me one also please.


Don't be salty, he's still on topic saying he's returning his Ti's


----------



## pez

Quote:


> Originally Posted by *Benny89*
> 
> Becasue bigger Power Limit allows clocks to stabilize above 2k, however then we hit Architecture limit, a clock wall which is 2100 mhz. That is why 320-340W is where Power Limit just allows to hit another wall, that is all.


When your temps are low enough under water, that becomes a blurred line. People running TXP cards at full tilt <60C on water are not really hitting a wall. The power limit isn't the reason you're hitting a wall; it's the fact that after 50C you start to lose one bin at a time.


----------



## Jbravo33

Quote:


> Originally Posted by *Benny89*
> 
> Um, this is 1080Ti thread
> 
> 
> 
> 
> 
> 
> 
> . But we are glad that you have a lot of money to throw away
> 
> 
> 
> 
> 
> 
> 
> . Order me one also please.


I know, but I just don't wanna leave this thread


----------



## alucardis666

Quote:


> Originally Posted by *Jbravo33*
> 
> i know but but i just dont wanna leave this thread


Same, I've had fun here.


----------



## dunbheagan

My first thought on the Titan XP2017 today was:

Man, I will miss the short feeling of owning the fastest graphics card... I enjoyed it for the first time in my life, and it just lasted a week...


----------



## Foxrun

Quote:


> Originally Posted by *dunbheagan*
> 
> My first thought on the Titan XP2017 today was:
> 
> Man, i will miss the short feeling of owning the fastest graphics card... I enjoyed it for the first time in my life, and it just lasted a week...


Don't sweat it, the Ti is still a beast of a card and it's cheaper to SLI.


----------



## hebrewbacon

Quote:


> Originally Posted by *dunbheagan*
> 
> My first thought on the Titan XP2017 today was:
> 
> Man, i will miss the short feeling of owning the fastest graphics card... I enjoyed it for the first time in my life, and it just lasted a week...


I got the Ti on the 10th so I had the fastest card for about a month







Didn't expect Nvidia to refresh their Titan X this early. It was a good run lol


----------



## Rhadamanthys

So having a higher power limit won't help me getting higher clocks, I get that. But (given the right temperatures) it will keep the oc more stable?


----------



## Menta

Testing out the Strix. Happy, but it has some slight coil whine/electrical noise when gaming at 300fps, sometimes even lower... not too bad I guess, but in comparison, on my 1080 MSI Gaming X I only noticed coil whine at around 4000 frames.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *jarble*
> 
> This thread could use a updated FP with useful data
> 
> 
> 
> 
> 
> 
> 
> IE overclocking tip links to the afterburner mods etc


Yeah, many of us have said that, but we need someone to start a new one and keep it alive.

The OP sucks, he just wanted to be first to make a 1080 Ti thread and abandoned it.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Rhadamanthys*
> 
> So having a higher power limit won't help me getting higher clocks, I get that. But (given the right temperatures) it will keep the oc more stable?


Not necessarily. Many factors come into play. Throwing in an extra 30 watts of power and adding 5C of heat might push you into a thermal-throttle bin and be counterproductive.

Then your card will use more voltage to keep those clocks and you're either back where you were or at even worse performance.

That's why the Asus BIOS doesn't work well for many people.

If you're under 40C then more power will help. But staying under 40C is kind of hard.

If you're at 45C and then you draw 30 more watts, it can push you to 49-51C and cause the card to start thermal throttling.
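The bin behavior described above can be sketched as a toy model. GPU Boost 3.0 moves the clock in roughly 13 MHz steps ("bins"); the temperature thresholds below are illustrative guesses for the sake of the example, not Nvidia-documented values:

```python
# Toy model of temperature-based boost-bin shedding. The ~13 MHz bin size
# matches what people observe with Pascal; the threshold curve here is a
# hypothetical illustration, not a documented spec.
BIN_MHZ = 13
# (threshold_C, bins_dropped_at_or_above_that_temp) -- illustrative only
THROTTLE_STEPS = [(40, 0), (45, 1), (50, 2), (55, 4), (60, 6)]

def effective_clock(base_boost_mhz, temp_c):
    dropped = 0
    for threshold, bins in THROTTLE_STEPS:
        if temp_c >= threshold:
            dropped = bins
    return base_boost_mhz - dropped * BIN_MHZ

print(effective_clock(2050, 38))  # 2050 -> under 40C, no bins lost
print(effective_clock(2050, 51))  # 2024 -> two bins shed past 50C
```

This is why a few extra degrees from a higher power limit can net out to the same or lower clocks, as described above.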


----------



## dunbheagan

Quote:


> Originally Posted by *Foxrun*
> 
> Dont sweat it, the Ti is still a beast of a card and it's cheaper to SLI.


Understand my statement with a twinkle in the eye. It's not that I am sad now or think the 1080 Ti is a bad card. But really, having the fastest card could have lasted a little longer... it's purely emotional. Besides, a full and uncut GP102 has a certain attraction to me that is not justified by the roughly 8% performance increase over the 1080 Ti; it's just because it is uncut... also purely emotional. But in no way am I going to spend the Titan price... If it was €1,000 I would say **** it and go for it, but not one € more...


----------



## Rhadamanthys

Quote:


> Originally Posted by *SlimJ87D*
> 
> Not necessarily. Many factors come into play. Throw in an extra 30 watts of power and add in 5C of heat might push you into a thermal throttle bin and be counter productive.
> 
> Then your card will use more voltage to keep those clocks and you're either back to where you were or even worse performance.
> 
> That's why the Asus bios doesn't work well for many people.
> 
> If you're under 40C then more power will help. But being under 40C is kind of hard.
> 
> If you're at 45C and then you draw 30 more watts it can push you to 49-51C and cause the card to start thermal throttling.


Alright, so it seems to be a bit of a gamble, since I won't really know my load temperatures beforehand. I was undecided between going SLI FE on water now (the EK Founders blocks will ship very soon) or waiting for the Aorus Extreme, for which the blocks won't be out for another month. In that case, I guess it's better to just stick with two FEs. Probably going to be enough for 4K@60 in most games no matter the OC.


----------



## zipeldiablo

Quote:


> Originally Posted by *dunbheagan*
> 
> Understand my statement with a twinkle in the eye. Its not that i am sad now or think the 1080Ti is a bad card. But really, having the fastest card could have lasted a little longer...its purely emotional. Beside, a full and uncut GP102 has a certain attraction to me, that is not justified by the roughly 8% performance increase over the 1080Ti, its just because it is uncut...also purely emotional. But in no way i am going to spend the Titan price... If it was 1.000€ i would say **** it and go for it, but not one € more...


They are saying 8%, but imo with the oc we are looking at much more.
Will wait for the benchmark and see from there


----------



## Bishop07764

I'm assuming that my Titan X Pascal waterblock would theoretically fit this Titan X Pascal squared card?


----------



## alucardis666

Got our thread back! For those buying the new Titan, I'll see you there!

http://www.overclock.net/t/1627390/2017-nvidia-titan-xp-owners-thread/10#post_25991607


----------



## feznz

Hi guys, just got my Strix 1080 Ti and have been having a little fun with it, simply amazing so far.....
Just a quick question on the voltage curve: I am using MSI AB 4.3.0 and I cannot find it in AB. I've seen plenty of screenshots of it. What am I missing?

Anyway, I got a solid 2050MHz core and am maxing at 43°C, so I've got a little headroom for OC.

Just a little pic of my ugly universal blocks, but in saying that, I had my card in and running under water within 1 hour of receiving it.











I know I've got to do something about those ugly industrial brass elbows, at least paint them black.


----------



## Benny89

Quote:


> Originally Posted by *alucardis666*
> 
> Don't be salty, he's still on topic saying he's returning his Ti's


You know that was a joke. Both of you seem to have a lot of money, surely you can buy me one too.


----------



## alucardis666

Quote:


> Originally Posted by *Benny89*
> 
> You know that was a joke
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Both of you seems to have a lot of money, sure you can buy me one too
> 
> 
> 
> 
> 
> 
> 
> .


Just unmarried.


----------



## Bishop07764

Quote:


> Originally Posted by *feznz*
> 
> Hi guys just got my strix 1080 ti been having a little fun with it simply amazing so far.....
> Just a quick question voltage curve I am using MSI AB 4.3.0 I cannot find it in AB I seen plenty of screen shot of it. what am I missing?
> 
> any way I got a solid 2050Mhz core and maxing 43°C so got a little head room for OC
> 
> just a little pic of my ugly universal blocks but in saying that I had my card in and running under water within 1 hour of receiving it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I know I got to do something about those ugly industrial brass elbows at least paint them black


Here's how you get voltage control.

http://www.guru3d.com/articles_pages/geforce_gtx_1080_ti_review,32.html


----------



## dunbheagan

Quote:


> Originally Posted by *zipeldiablo*
> 
> They are saying 8%, but imo with the oc we are looking at much more.
> Will wait for the benchmark and see from there


I don't think we need benchmarks; there will be no surprises. The full GP102 has ~7% more CUDA cores and a ~9% wider memory interface than the 1080 Ti. The OC headroom of the GPU and the memory will be the same, so you have to compare both cards at the same clocks. The Titan Xp 2017 at 2000/6000 will be roughly 8% faster than a 1080 Ti at 2000/6000, if the application scales well and there are no other bottlenecks. So consider that ~8% performance increase the theoretical maximum; in some cases it is going to be less.

I am talking about two OCed cards with the same cooler here, so both on the same water- or air-cooler.

If you compare the Titan Xp 2017 at stock settings to a good CUSTOM 1080 Ti with a better cooler and higher power target at stock settings, the performance will be even, or maybe even a tiny bit better on the custom 1080 Ti.
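Those spec ratios can be checked directly from the comparison table earlier in the thread (same clocks, perfect scaling assumed, so this is an upper bound, not a benchmark):

```python
# Theoretical uplift from the spec ratios: same clocks, perfect scaling.
# Core counts and bus widths are from the spec table quoted in this thread.
ti = {"cores": 3584, "bus_bits": 352}
xp = {"cores": 3840, "bus_bits": 384}

core_gain = xp["cores"] / ti["cores"] - 1        # ~7.1% more shader throughput
bw_gain   = xp["bus_bits"] / ti["bus_bits"] - 1  # ~9.1% wider memory bus

print(f"core gain: {core_gain:.1%}, bus-width gain: {bw_gain:.1%}")
```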


----------



## Tikerz

Quote:


> Originally Posted by *alucardis666*
> 
> Just unmarried.


That's how it's done folks.


----------



## alucardis666

Quote:


> Originally Posted by *Tikerz*
> 
> That's how it's done folks.


----------



## Benny89

Quote:


> Originally Posted by *Tikerz*
> 
> That's how it's done folks.


Even unmarried, that would still be two salaries... *cries in poor tears*


----------



## alucardis666

Quote:


> Originally Posted by *Benny89*
> 
> Even umarried that would still be two salaries
> 
> 
> 
> 
> 
> 
> 
> ... *cries in poor tears*


2 salaries? I make less than $50k USD a year.

I don't spend too much money needlessly, and don't go out much aside from dates with the GF. I've got a dog, my car is a lease and I pay $40 a month for my cellphone. The only real thing hurting is my student loans.


----------



## keikei

*381.65 WHQL*


----------



## alucardis666

Quote:


> Originally Posted by *keikei*
> 
> *381.65 WHQL*


Thanks for sharing the link!


----------



## Dasboogieman

Quote:


> Originally Posted by *jelome1989*
> 
> Any ideas as to how the Titan Xp will fare against AIB GTX 1080 Ti if it's not watercooled?


Quote:


> Originally Posted by *cekim*
> 
> So, here's the uncomfortable question:
> Did Nvidia make the Titan Xpp or epeen edition because the Ti over-compared to the existing Xp? Or because they got their hands on something from Team Red?


It would be the logical progression. Yields improve with time, so they had a bunch of perfect GP102 chips saved up that didn't meet the cut for Quadro for one reason or another, or there were just simply slow Quadro sales. So instead of manually cutting down the chips or waiting for anemic (relatively speaking, vs the 1080 Ti) Quadro demand, they naturally update the Titan.

Honestly, though looking at the specs and previous data on the older titans. This one might not come out too far ahead of the 1080ti. I mean, they are perfect chips in the sense the whole circuit is working but it will be a complete crapshoot as to how well binned they are for frequency (i.e. if they were centrally on the wafer or nearer to the edge). Coupled with simply more functional units to drive, the Titan Xp will almost certainly need a shunt mod, at least an AIO cooler and 1950mhz+ (which might be difficult due to more functional units) to match the 2050mhz type speeds people seem to easily be getting with the 1080ti. Any major performance gains will likely come from the increased VRAM bandwidth which will certainly be felt at 2-4K, especially open world games.

I was actually expecting the Titan Xp refresh to have HBM2 to fully differentiate itself as a complete refresh and to guarantee success vs Vega. This would've made more sense as HBM2 would largely fix the power + temp issues of the GDDR5X version thus even the FE cooler will perform well.


----------



## cekim

Quote:


> Originally Posted by *Bishop07764*
> 
> I'm assuming that my Titan X Pascal waterblock would theoretically fit this Titan X Pascal squared card? ?


So, it is the TitanXpeepee as I suggested? Capital p or small?









I keed, I keed.... If I didn't have to buy exotic luxury cars for the IRS (I can only assume that's what they do with the large checks I send them?), I'd buy 2 Xpp's and use my new TIs to display my email triumphantly in 1080p on a side monitor just to show them who's boss... (or I'd have them crunch numbers alongside my 980s and 1080s, but either way... I'm happy for Titan owners... sounds like fun).


----------



## DrFreeman35

Quote:


> Originally Posted by *alucardis666*
> 
> 2 salaries? I make less than $50k USD a year.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Don't spend too much money needlessly, don't go out much aside from dates with the GF. I got a dog, my car is a lease and I pay $40 a month for my cellphone. Only really thing hurting is my student loans.


Agreed, I make more and feel the same. It's an option in the end, hopefully this will make for people selling Ti's for cheap. I make more than 80k, but mostly because of OT..... and I don't find myself bothered by the release or the price. Still have yet to buy one, I'll wait for Volta and jump on the Titan bandwagon. I'm brand new to PC building, and have not done WC yet, so I want a Volta build to be my first. I'm so new I'm stepping up from 1080 ACX FTW to ICX FTW 2. Might sell the 2's and get the FTW3's.


----------



## Benny89

Quote:


> Originally Posted by *alucardis666*
> 
> 2 salaries? I make less than $50k USD a year.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Don't spend too much money needlessly, don't go out much aside from dates with the GF. I got a dog, my car is a lease and I pay $40 a month for my cellphone. Only really thing hurting is my student loans.


Mate, I am from Eastern Europe; here a big company manager's salary per month is like.... 1500-2000 USD at top, or 1300-1800 EUR if you like. So for your average mate that's 3 salaries; for the middle, good-earning class it's two. We don't have EUR or USD here









I make little above 10k USD per year dude







. These are good money here.

For 50k USD a year I would have to be a director or something here lol









Be glad you live in rich country









I don't make *EVEN ONE UNNECESSARY* purchase per month. How else could I spend a whole salary on a 1080Ti, eh?


----------



## Dasboogieman

dupe


----------



## alucardis666

Quote:


> Originally Posted by *Benny89*
> 
> Mate I am from Eastern Europe, here a big company Manager salary per month is like.... 1500-2000 USD at top or 1300-1800 EUR if you like. So for your average mate those are 3 salaries, for middle good earning class is two. We don't have EUR or USD here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I make little above 10k USD per year dude
> 
> 
> 
> 
> 
> 
> 
> . These are good money here.
> 
> 50k USD year I would have to be director or something here lol
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Be glad you live in rich country
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't make *EVEN ONE UNNECESSARY* spending per month. How else could I spend whole salary on 1080Ti, eh?


Gotcha. Sorry to hear all that. And I hope I didn't come off as grandstanding or anything.

Also just bought these to help pimp my rig out and make me feel TXp worthy!











and 2 sets of these

http://www.bestbuy.com/site/thermaltake-riing-120mm-case-cooling-fan-blue-green-red-white/5588906.p?skuId=5588906

*HYPE!!!







*


----------



## DrFreeman35

Quote:


> Originally Posted by *Benny89*
> 
> Mate I am from Eastern Europe, here a big company Manager salary per month is like.... 1500-2000 USD at top or 1300-1800 EUR if you like. So for your average mate those are 3 salaries, for middle good earning class is two. We don't have EUR or USD here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I make little above 10k USD per year dude
> 
> 
> 
> 
> 
> 
> 
> . These are good money here.
> 
> 50k USD year I would have to be director or something here lol
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Be glad you live in rich country
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't make *EVEN ONE UNNECESSARY* spending per month. How else could I spend whole salary on 1080Ti, eh?


Being responsible with money is a good thing, I am a bit frugal so to speak in other aspects of my life. I refrain from "blowing" money, but after building my PC noticed it's a whole different ball game. I spent way over what I thought originally, and realized I would have spent another $2k+ just water cooling it. I actually have quite a few pieces ready for a custom loop, but decided it's in my best interest to be smart about it and not go overboard all at once. I've had people bash me on other forums for spending what I did and not WC my system.


----------



## alucardis666

Quote:


> Originally Posted by *DrFreeman35*
> 
> I've had people bash me on other forums for spending what I did and not WC my system.


Don't feel bad, bro. Water isn't for everyone. And it can get *SUPER* expensive to do it right with all the necessary blocks and fittings. I've had my 6950X for over 8 months and just now decided to throw it under water.


----------



## Benny89

Quote:


> Originally Posted by *alucardis666*
> 
> Gotcha. Sorry to hear all that. And I hope I didn't come off as grandstanding or anything.
> 
> Also just bought these to help pimp my rig out and make me feel TXp worthy!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and 2 sets of these
> 
> http://www.bestbuy.com/site/thermaltake-riing-120mm-case-cooling-fan-blue-green-red-white/5588906.p?skuId=5588906
> 
> *HYPE!!!
> 
> 
> 
> 
> 
> 
> 
> *


Nah, no worries. Not all of us are lucky







. I know that there are people out there who can't afford even that







. Well, at least being not rich makes me not spend money for overpriced stuff









I also made my 1080Ti comfortable







Already here, waiting for a card:


----------



## DrFreeman35

Quote:


> Originally Posted by *alucardis666*
> 
> Don't feel badly bro. Water isn't for everyone. And it can get *SUPER* expensive to do it right with all the necessary blocks and fittings. I've had my 6950X for over 8 months and just now decided to throw it under water.


Haha true, and that is one beastly CPU. I would be scared to touch that thing. I hope to join the Ti club soon with some FTW's. I will be building a new rig for gaming/editing with Volta titans, and whatever new CPU is out at that time. Best of luck to everyone with the Ti's.


----------



## Benny89

Quote:


> Originally Posted by *DrFreeman35*
> 
> Being responsible with money is a good thing, I am a bit frugal so to speak in other aspects of my life. I refrain from "blowing" money, but after building my PC noticed it's a whole different ball game. I spent way over what I thought originally, and realized I would have spent another $2k+ just water cooling it. I actually have quite a few pieces ready for a custom loop, but decided it's in my best interest to be smart about it and not go overboard all at once. I've had people bash me on other forums for spending what I did and not WC my system.


What can I say, being in a poor country makes you always invest in the best value for your money.

For example, that is why I am not upgrading to a 7700K or anything: by selling my 4790K I can add 50$, buy a 5775C CPU, OC it, and be on par in games with a 7700K. That comes with being poor







I have to plan my spending on my hobby to get maximum for minimum.

That's why things like SLI, TITANs, overpriced "gaming" UW monitors, or the upcoming 2k USD 4K 144Hz monitors are not on my list, since those are poor value for money







.

The 1080Ti I consider great value for money. The TITAN and TITAN XP I consider throwing my money out unnecessarily









But hey, I have to calculate everything so you know


----------



## alucardis666

Quote:


> Originally Posted by *Benny89*
> 
> What can I say, being in poor country makes you always invest in best value for what money can't buy.
> 
> For example, that is why I am not upgrading to 7700k or anything since selling my 4790k I can add 50$ and buy 5775C CPU, OC it and be on pair in games with 7700k. That comes with being poor
> 
> 
> 
> 
> 
> 
> 
> I have to plan my spending on my hobby to get maximum for minimum.
> 
> That why things like SLI, TITANs, overpriced "gaming" UW monitors or upcoming 2k USD 4K 144hz monitors are not on my list since those are poor value per money
> 
> 
> 
> 
> 
> 
> 
> .
> 
> 1080Ti I consider great value per money. TITAN and TITAN XP I consider throwing my money out unnecessary
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But hey, I have to calculate everything so you know


Very nice case. I almost bought it, but I think it's overpriced for what it is, and the P400 looks cleaner imo.

If the TXp is a bad overclocker and not worth it, I'm sure I can refund it or sell it for a profit and pick up an AIB 1080ti.


----------



## dentnu

I personally will never again buy another Titan card, as I upgrade cards every time the new cards come out, which is every 7 to 10 months. I bought two Titans in the past and lost a lot of money reselling them; never again will I throw money away like that. I can see someone that makes or has a lot of money buying them, but unless you are going to keep it for 2-plus years it makes no sense getting one. Volta should be here before the end of the year or the start of 2018. $1200+ to waste and upgrade in 6+ months to the latest and best really is not smart. With that said, to each their own; if it makes you happy, get one. Life is short and you never know where you will be tomorrow.


----------



## alucardis666

Quote:


> Originally Posted by *dentnu*
> 
> I personally will never again buy another Titan card, as I upgrade cards every time the new cards come out, which is every 7 to 10 months. I bought two Titans in the past and lost a lot of money reselling them; never again will I throw money away like that. I can see someone that makes or has a lot of money buying them, but unless you are going to keep it for 2-plus years it makes no sense getting one. Volta should be here before the end of the year or the start of 2018. $1200+ to waste and upgrade in 6+ months to the latest and best really is not smart. With that said, to each their own; if it makes you happy, get one. Life is short and you never know where you will be tomorrow.


Can't argue against any of that. This will be my first Titan card. The last time I spent close to this much money was when I spent $900 on the 7950 GX2 back in the day.


----------



## Benny89

Quote:


> Originally Posted by *alucardis666*
> 
> Very nice case. Almost bought it but it's overpriced I think for what it is and the P400 looks cleaner imo.
> 
> If the TXp is a bad overclocker and not worth it I'm sure I can refund it or sell if for a profit and pick up an AIB 1080ti.


I don't like the P400 since it does not have a bracket or space for a radiator at the top. With the Enthoo Evolv you can fit even a 360mm rad with 3x120mm fans there, and the case is made so it does not interfere with any components. What's more, I like that the Enthoo Evolv shows a little of your PSU brand, it has a place for an SSD drive in front just behind the glass, and it has that bracket for radiators and AIOs at the top which I mentioned. Also it's 3mm aluminium with RGB lights (small, but I like RGB







).

But each to his own of course. Cases are super personal taste things







. But I thought with the 1080Ti it is finally time to jump on the tempered glass train







. I hate the scratches in the plexi of my 780T. Not to mention it takes up too much space, and I am ready to go back to mid-towers now that they've sorted out space for rads and AIOs (at least Phanteks did with the Enthoo Evolv).


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> Gotcha. Sorry to hear all that. And I hope I didn't come off as grandstanding or anything.
> 
> Also just bought these to help pimp my rig out and make me feel TXp worthy!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and 2 sets of these
> 
> http://www.bestbuy.com/site/thermaltake-riing-120mm-case-cooling-fan-blue-green-red-white/5588906.p?skuId=5588906
> 
> *HYPE!!!
> 
> 
> 
> 
> 
> 
> 
> *


Is the x62 going on your CPU?


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> Is the x62 going on your CPU?


Yes. Already have the hybrid cooler for the GPU. Assuming it works with the Titan.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> Yes. Already have the hybrid cooler for the GPU. Assuming it works with the Titan.


Man, you crazy guy. Gonna miss you around here.


----------



## Slackaveli

Man, I had to go deal with life, lol, I have barely even been able to do anything, but it's installed (THERE'S A NEW DRIVER OUT, BTW) and this thing boosts to 1974 ootb without touching anything. Oh, and even 100% fan is inaudible (also, no coil whine thank god).

So, I can't find it anywhere but I need the new MSI Afterburner. All I can find is 4.3. Isn't there a new one out with all the voltage unlocks for all the different cards?


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> Man, you crazy guy. Gonna miss you around here.


I'll still be here.








Quote:


> Originally Posted by *Slackaveli*
> 
> So, I can't find it anywhere but I need the new Msi afterburnner. All i can find is 4.3. isnt there a new one out with all the voltage unlocks for all the different cards?


If there is. I've heard/seen nothing on it.


----------



## KCDC

Quote:


> Originally Posted by *Slackaveli*
> 
> Man, I had to go deal with life, lol, I have barely even been able to do anything but it's installed (THERE'S A NEW DRIVER OUT, BTW ) but this thing boosts to 1974 ootb without touching anything. Oh, and even 100% fan is inaudible (also , no coil whine thank god).
> 
> So, I can't find it anywhere but I need the new Msi afterburnner. All i can find is 4.3. isnt there a new one out with all the voltage unlocks for all the different cards?


afterburner beta

http://forums.guru3d.com/showpost.php?p=5412373&postcount=216

someone posted it earlier in this thread. voltage bar is unlocked.


----------



## Bishop07764

Quote:


> Originally Posted by *cekim*
> 
> So, it is the TitanXpeepee as I suggested? Capital p or small?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I keed, I keed.... If I didn't have to buy exotic luxury cars for the IRS (I can only assume that's what they do with the large checks I send them?), I'd buy 2 Xpp's and use my new TIs to display my email triumphanly in 1080p on a side monitor just to show them who's boss... (or I'd have them crunch numbers along side my 980s and 1080s, but either way... I'm happy for Titan owners... sounds like fun).


That is perfect. Haha.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> Man, I had to go deal with life, lol, I have barely even been able to do anything but it's installed (THERE'S A NEW DRIVER OUT, BTW ) but this thing boosts to 1974 ootb without touching anything. Oh, and even 100% fan is inaudible (also , no coil whine thank god).
> 
> So, I can't find it anywhere but I need the new Msi afterburnner. All i can find is 4.3. isnt there a new one out with all the voltage unlocks for all the different cards?


Check my old posts, I posted it here.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> I'll still be here.
> 
> 
> 
> 
> 
> 
> 
> 
> If there is. I've heard/seen nothing on it.


Yeah, that's what they all say... Then they move on to bigger and better things


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Gotcha. Sorry to hear all that. And I hope I didn't come off as grandstanding or anything.
> 
> Also just bought these to help pimp my rig out and make me feel TXp worthy!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and 2 sets of these
> 
> http://www.bestbuy.com/site/thermaltake-riing-120mm-case-cooling-fan-blue-green-red-white/5588906.p?skuId=5588906
> 
> *HYPE!!!
> 
> 
> 
> 
> 
> 
> 
> *


oh, i love that case! HOT, DUDE


----------



## Pandora's Box

Quote:


> Originally Posted by *Menta*
> 
> Do you notice any electrical noise when stressing the card or playing a game high fps?


Nope!


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> What can I say, being in poor country makes you always invest in best value for what money can't buy.
> 
> For example, that is why I am not upgrading to 7700k or anything since selling my 4790k I can add 50$ and buy 5775C CPU, OC it and be on pair in games with 7700k. That comes with being poor
> 
> 
> 
> 
> 
> 
> 
> I have to plan my spending on my hobby to get maximum for minimum.
> 
> That why things like SLI, TITANs, overpriced "gaming" UW monitors or upcoming 2k USD 4K 144hz monitors are not on my list since those are poor value per money
> 
> 
> 
> 
> 
> 
> 
> .
> 
> 1080Ti I consider great value per money. TITAN and TITAN XP I consider throwing my money out unnecessary
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But hey, I have to calculate everything so you know


damn right. same reason i did it a couple months ago.


----------



## Pandora's Box

Quote:


> Originally Posted by *Rhadamanthys*
> 
> So the extra power limit of the custom cards won't help overclocking at all when put under water?


Lower heat - lower voltage needed. Water on the FE card is just fine for 2.1GHz.


----------



## Slackaveli

Quote:


> Originally Posted by *KCDC*
> 
> afterburner beta
> 
> http://forums.guru3d.com/showpost.php?p=5412373&postcount=216
> 
> someone posted it earlier in this thread. voltage bar is unlocked.


much obliged.


----------



## JedixJarf

Quote:


> Originally Posted by *Benny89*
> 
> Nah, no worries. Not all of us are lucky
> 
> 
> 
> 
> 
> 
> 
> . I know that there are people out there who can't afford even that
> 
> 
> 
> 
> 
> 
> 
> . Well, at least being not rich makes me not spend money for overpriced stuff
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I also made my 1080Ti comfortable
> 
> 
> 
> 
> 
> 
> 
> Already here, waiting for a card:


You will not be disappointed with that case


----------



## alucardis666

Quote:


> Originally Posted by *VanuSovereignty*
> 
> You're getting the new TItan Xp? How much faster is it really though? That's a lot of money, if it were me I would stay put and wait for Volta Ti.


My Ti was kind of a dog overclocker, barely doing 2000MHz. Really I should be doing SLI, but that's a crapshoot of issues.

I'm hoping to get more headroom, and the Titans retain their value better. Should hold me over till the next round of cards. At that point I can sell and get the next Titan or Ti, whichever comes first.









Worst case scenario is I return it to Nvidia and pick up an AIB 1080ti Hybrid or something crazy like a Classified/KingPin or HOF if they're out in the next 30 days.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> My Ti was kinda dog overclocker. Barely doing 2000mhz. Really I should be doing SLI, but that's a crapshoot of issues.
> 
> I'm hoping to get more headroom, and the Titan's retain their value better. Should hold me over till the next round of cards. At that point I can sell and get the next Titan or Ti, whatever comes first.


Titan not a bad choice if one hates sli, no doubt. That extra ~10% will come in handy.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> Titan not a bad choice if one hates sli, no doubt. That extra ~10% will come in handy.


And that's just stock vs stock. Who knows how much better the card will OC!

***GOD do I pray that it OC's better.







***


----------



## hotrod717

Just thought I would share. Unfortunately I didn't do a proper capture on the higher score. 5960X btw




Just about done with testing; if anyone is interested, shoot me a PM.


----------



## Pandora's Box

@Vanu Where's your firestrike extreme with 100% fan?


----------



## Pandora's Box

http://www.3dmark.com/fs/12239672


----------



## KedarWolf

Quote:


> Originally Posted by *hotrod717*
> 
> Just thought i would share. Unfortunately didnt do a proper capture on higher score. 5960x btw
> 
> 
> 
> 
> 
> Just about done with testing if anyone is interested, shoot me a pm.


My Catzilla, 4K, one card.


----------



## alucardis666

Quote:


> Originally Posted by *VanuSovereignty*
> 
> Well it's not a bad plan as long as you're within your Ti's return window with Nvidia and can manage it so that if the Titan Xp2 underwhelms you can return it for a Classified or HOF variant, although I'm not sure those cards are going to afford much more performance without someone, whether within the AIB industry or without, getting in there and unlocking this vbios. But yeah, Aorus, Classified, or HOF will be very nice in terms of thermals and being a lot quieter and looking good (subjective, I still think FE is the best looking, IMHO). Just don't try to do anything ******ed like attempt to shoehorn a Classified into a case the size of a cereal box and then complain because your suffocated 600W cooling apparatus is running at 70C with the fans on full bore.
> 
> 
> 
> 
> 
> 
> 
> 
> 10%? That's not worth it, TitanXp2 would need to be at least 25% faster than what we have for me to even begin to contemplate trading up to it with money I (we) do not have.
> Yeah, hopefully Nvidia is understating the performance potential. So far I'm seeing compute at 12 TFLOPS up from 11.3? But that cooler, man that looks like a Lamborghini Aventador GPU!
> 
> https://www.nvidia.com/en-us/geforce/products/10series/titan-xp/
> 
> 1080 Ti FE / Titan XP(2)
> 
> 
> 
> MSI Gamings X RE (Rice Edition):


Thanks for understanding. And those pics?!


----------



## Slackaveli

MSI Gaming X is now in stock at the egg.

This pic is pretty

Quote:


> Originally Posted by *alucardis666*
> 
> And that's just stock vs stock. Who knows how much better the card will OC!
> 
> ***GOD do I pray that it OC's better.
> 
> 
> 
> 
> 
> 
> 
> ***


yeah, man, i traded a ti for a ti and gained almost 100mhz. this dude hits 2076, holds 2050 thru the 60s and is still over 2ghz at 74c. that's +75MHz. if i run +100mv it hits 76c after 30 minutes of heaven, but for some reason my ambient is 80f today so that's a couple degrees right there. AC is on full bore now







am going to do the thermal grizzly in a few minutes.


----------



## Addsome

Quote:


> Originally Posted by *PasK1234Xw*
> 
> You using strix vBIOS at 2100
> 
> http://i.imgur.com/yfsbH6p.jpg
> 
> Me at 2050 stock vram with downclocking
> 
> http://i.imgur.com/XQuWy3Q.png


Try running the same benchmark but in windowed mode. I get 157.9 FPS windowed and 164.5 fullscreen.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, man, i traded a ti for a ti and gained almost 100mhz. this dude hits 2076, holds 2050 thru the 60s and is still over 2ghz at 74c. I am going to do the thermal grizzly in a few minutes.


So was it worth it?


----------



## Pandora's Box

Quote:


> Originally Posted by *VanuSovereignty*
> 
> It's not any better than with the fan at 80%, but it doesn't exceed 58C. It fluctuates between 2000 and 1974 MHz with occasional dips down to 1962 and seldom dips down to 1949 MHz. I took a video but there isn't any point in uploading it.
> 
> That said, here's my benches, even with the fluctuations. From what members here have stated so far, although MSI's variant seems to hold higher clocks it's not any better than FE, for whatever reason.
> 
> Go ahead, post a video of your Firestrike Extreme Stress Test, I'm curious to see what kind of frequency yours holds and what kind of temperatures. Good luck staying at 58C.
> 
> Lets compare benches until then:
> 
> http://www.3dmark.com/3dm/18721648?
> 
> http://www.3dmark.com/fs/12060734
> 
> http://www.3dmark.com/spy/1441608
> 
> Edit:
> 
> Oh and no additional voltage.


Quote:


> Originally Posted by *Pandora's Box*
> 
> http://www.3dmark.com/fs/12239672


Video is uploading now. I'm waiting for you to say you have a higher 3dmark score blah blah blah. 4930K has more cores than the 6700k. You got beaten in GPU score which is what we are comparing. Don't need any luck holding 58C lmao.




This "cereal box" PC is a beast and you have no idea what you are talking about. Multiple posters on here have told you this.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> So was it worth it?


oh hell yeah. it's super sexy, super beefy, runs ~70MHz-ish higher, holds clocks better, and even after shipping I made $30 and a free copy of Playerunknown's Battlegrounds (after trading For Honor code #2 to a poster here).
Quote:


> Originally Posted by *Addsome*
> 
> Try running the same benchmark but in windowed mode. I get 157.9FPS windowed and 164.5 fulscreen.


that was windowed. havent tried fullscreen but i will now.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> oh hell yeah. it's super sexy, super beefy, runs ~70MHZish higher, holds clocks better, and even after shipping I made $30 and a free copy of Playerunknown's battlegrounds (after trading for honor code #2 to a poster here)


Great! Glad it worked out for you


----------



## mtbiker033

did the new windows update and re-installed the new drivers... weird, but I'm getting less performance in PUBG (***?) and I also had to adjust gamma in a few games I tried, as everything had a washed-out look? anyone else?


----------



## alucardis666

Quote:


> Originally Posted by *mtbiker033*
> 
> did the new windows update and re-installed the new drivers...weird but I'm getting less performance in PUBG (***?) also had to adjust gamma in a few games I tried as everything had a washed out look? anyone else?


Yes! I noticed this with Mass Effect Andromeda. HDR mode just looked off. Thought I was going nuts.


----------



## KedarWolf

Quote:


> Originally Posted by *Addsome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PasK1234Xw*
> 
> You using strix vBIOS at 2100
> 
> http://i.imgur.com/yfsbH6p.jpg
> 
> Me at 2050 stock vram with downclocking
> 
> http://i.imgur.com/XQuWy3Q.png
> 
> 
> 
> Try running the same benchmark but in windowed mode. I get 157.9FPS windowed and 164.5 fulscreen.

Get much lower results fullscreen but I have a 4K screen.


----------



## mtbiker033

Quote:


> Originally Posted by *alucardis666*
> 
> Yes! I noticed this with Mass Effect Andromeda. HDR mode just looked off. Thought I was going nuts.


glad I'm not the only one!! surely this will get sorted out soon!


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> Get much lower results fullscreen but I have a 4K screen.


same. but on the 4k i do get 157.11 windowed. i bet if i went to 1080p it'd go up but, naw.


----------



## KedarWolf

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I'm running Heaven at a custom windowed resolution at 1920x1080 , not just on the Extreme preset to get my results.
> 
> 
> 
> You using strix vBIOS at 2100
> 
> http://i.imgur.com/yfsbH6p.jpg
> 
> Me at 2050 stock vram with downclocking
> 
> http://i.imgur.com/XQuWy3Q.png

I get 150-152 FPS with the Strix BIOS and 160 FPS at the same clocks with the FE BIOS in Heaven. I get 10876 Strix to 10458 FE BIOS though in TimeSpy.









The two games I'm playing are DirectX 12. It's a moot point anyways for me, other than benching, because I use RivaTuner with G-Sync on and the clocks etc. never top out anyways, I don't think.


----------



## alucardis666

Anyone know what this means?!



I still don't have my package...


----------



## Addsome

Quote:


> Originally Posted by *KedarWolf*
> 
> Get much lower results fullscreen but I have a 4K screen.


You can run 1080p fullscreen heaven on a 4k screen. I did on a 3440x1440 screen.


----------



## KedarWolf

Quote:


> Originally Posted by *Addsome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Get much lower results fullscreen but I have a 4K screen.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You can run 1080p fullscreen heaven on a 4k screen. I did on a 3440x1440 screen.

I get around 140 FPS fullscreen, 152 FPS windowed.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Pandora's Box*
> 
> Video is uploading now. I'm waiting for you to say you have a higher 3dmark score blah blah blah. 4930K has more cores than the 6700k. You got beaten in GPU score which is what we are comparing. Don't need any luck holding 58C lmao.
> 
> 
> 
> 
> This "cereal box" PC is a beast and you have no idea what you are talking about. Multiple posters on here have told you this.


That's a huge difference; your GPU score is about 2x his. But what's up with the condescending tone and hate?


----------



## Pandora's Box

Quote:


> Originally Posted by *SlimJ87D*
> 
> That's a huge difference; your GPU score is about 2x his. But what's up with the condescending tone and hate?


Check his previous posts.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Pandora's Box*
> 
> Check his previous posts.


Something is off with his score, shouldn't be that low.


----------



## KedarWolf

Quote:


> Originally Posted by *hotrod717*
> 
> Just thought i would share. Unfortunately didnt do a proper capture on higher score. 5960x btw
> 
> 
> 
> 
> 
> Just about done with testing if anyone is interested, shoot me a pm.


Is that one card or SLI? In GPU-Z it looks like you have a selection for a different card.

This is my 720p, one card.


----------



## pez

The problem isn't that the FE cooler is bad. It's that its sound output relative to its performance is quite extreme.


----------



## alucardis666

Quote:


> Originally Posted by *pez*
> 
> The problem isn't that the FE cooler is bad. It's that its sound output relative to its performance is quite extreme.


Truth.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *pez*
> 
> The problem isn't that the FE cooler is bad. It's that its sound output relative to its performance is quite extreme.


Yeah it's just to save that $50 to use it elsewhere in your system, like water cooling, etc.


----------



## Mad Pistol

Quote:


> Originally Posted by *Pandora's Box*
> 
> Video is uploading now. I'm waiting for you to say you have a higher 3dmark score blah blah blah. 4930K has more cores than the 6700k. You got beaten in GPU score which is what we are comparing. Don't need any luck holding 58C lmao.
> 
> 
> 
> 
> This "cereal box" PC is a beast and you have no idea what you are talking about. Multiple posters on here have told you this.


I see your 1080 Ti FE, and will raise you two GTX 1070 FE's!!! (and a slower CPU)

http://www.3dmark.com/3dm/19098527?



The FE's aren't bad, but it's pretty common knowledge that AIB coolers are usually better. No denying how sexy the FE cooler looks, though.


----------



## Slackaveli

Quote:


> Originally Posted by *Mad Pistol*
> 
> I see your 1080 Ti FE, and will raise you two GTX 1070 FE's!!! (and a slower CPU)
> 
> http://www.3dmark.com/3dm/19098527?
> 
> 
> 
> The FE's aren't bad, but it's pretty common knowledge that AIB coolers are usually better. No denying how sexy the FE cooler looks, though.


it's better-looking than several of the aftermarket ones. It also outperforms the Aero, the Armor, and the other cheapos.

Looks-wise, I like the Aorus, Asus, FTW3, and FE.

And I wouldn't call the FE cooler bad. This beast is just a lot to keep cool.


----------



## Nico67

Been playing with AB curve today and managed to get some more out of my card,

2126 is good at 1.062v; runs to 118% power on 5 runs of the benchmark.

2139 seems best so far and needs 1.075v; runs 120% power max on 5 runs.



2152 may be OK at 1.081v; it passed once but starts power capping and takes a few dips. 1.093 takes power dips too, so the next step would be a better BIOS.



The benchmark shot is from after the last big dip in clock, about mid-graph; it did pass OK but does dip with the power limit, and successive runs crashed.

It seems to clock up pretty smoothly so far, but maybe I'm reaching the wall:

2101 1.043
2114 1.050
2126 1.062
2139 1.075...

one voltage bin per clock bin seems to be required at the moment

Still to play with memory, but that ain't going to help the power situation so will try that next.
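The clock/voltage pairs above step up almost perfectly in lockstep, which is what "one voltage bin per clock bin" means in practice. As a rough sketch (assuming Pascal's ~12.5 MHz boost-bin granularity; the pairs are the ones observed in this post, not a spec):

```python
# Observed (clock MHz, voltage V) pairs from the post above.
observed = [(2101, 1.043), (2114, 1.050), (2126, 1.062), (2139, 1.075)]

def bins_between(lo_mhz, hi_mhz, bin_mhz=12.5):
    """Approximate number of boost clock bins between two clocks."""
    return round((hi_mhz - lo_mhz) / bin_mhz)

# Each step in the table is one clock bin apart, and each needed one
# more voltage bin, i.e. one voltage bin per clock bin.
steps = [bins_between(a[0], b[0]) for a, b in zip(observed, observed[1:])]
print(steps)
```

The 12.5 MHz bin size is an assumption based on how Pascal boost clocks are commonly reported to step; the exact granularity can vary slightly between cards.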


----------



## pez

So this is an e-peen battle about FE vs. AIB now? Can we all settle down so that you guys can enjoy your cards instead of being at each other's throats? If you want to save money, don't mind a noisy cooler, and/or plan to add a waterblock, go with an FE.

If you like AIB aesthetics, care about the noise-to-temp ratio, and don't mind spending another $20-50 for that, go with an AIB card.

At this point, you guys aren't even arguing about differences that are down to the cooler but down to factors influenced by the 'silicon lottery'.


----------



## MURDoctrine

Quote:


> Originally Posted by *VanuSovereignty*
> 
> Yeah the misconception here is that the AIB cards are faster and that FE "runs hot".
> 
> Now that I've thoroughly destroyed all of that nonsense it's "FE is too loud at 70% RPM".


Yeah, the days of AIB cards being faster died back with the 700 series, didn't they? Wasn't that the last generation where the EVGA Classifieds were actually binned and guaranteed to OC like beasts?


----------



## JedixJarf

Quote:


> Originally Posted by *alucardis666*
> 
> Anyone know *** this means?!
> 
> 
> 
> I still don't have my package...


One time the carrier threw a package over my fence; apparently it was a secure location...


----------



## Pandora's Box

Vanu, at the end of the day, man, our cards perform almost identically. The difference is that mine is quiet and yours is loud; that's my opinion. I think the big difference here is that I don't wear headphones and you do. If high fan speed doesn't bother you because you wear headphones, then more power to you. For everyday stuff I don't even overclock my card; it boosts to 1950 MHz out of the box and I can't hear the fans unless I put my head right against the card. Enjoy your card, I'm done benchmarking. Just don't be childish about other people's builds.


----------



## Pandora's Box

I miss the 780 Ti and 980 Ti card days: ******* around with custom BIOSes that gave the cards 500 watts or more. Nvidia really locked these things down; it ruins the fun for us enthusiasts. I think I had my 980 Ti running at 1500 MHz core with an EVGA AIO on it.


----------



## illypso

As I already have the shunt mod, am not getting past 2070 MHz, and a BIOS unlock seems far off,

I'm starting to think about doing the physical voltage mod on my 1080 Ti.

https://xdevs.com/guide/pascal_oc/#step4

Has anyone here done it?

I know about the risk and the temps; I just want to give it a little more juice, just a little...

I'm not sure yet, but it's tempting.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Pandora's Box*
> 
> I miss the 780 TI and 980 Ti card days. ******* around with custom bios's that gave the cards 500Watts or more. Nvidia really locked these things down. Ruins the fun for us enthusiasts. I think I had my 980 Ti running at 1500Mhz core with a EVGA AIO on it


Same here, had sli 980 ti at 1500 Mhz


----------



## Slackaveli

Speaking of juice (I guess it's the way they do the power delivery differently on the Giga), this guy does not like extra juice. At all. HOWEVER, it will take +162 core and hold 65C in BF1 after 2 hours; clocks were 2025 with an occasional 2037. Stayed solid as a rock. Going to bed now but will try more core tomorrow. BEASTY card, but don't add volts. Peace out, fellaz.
Quote:


> Originally Posted by *SlimJ87D*
> 
> Same here, had sli 980 ti at 1500 Mhz


Yeah, me too. It would crash any higher but held 1500 like a champ at ~48C, EVGA Hybrid.


----------



## Slackaveli

gn


----------



## KingEngineRevUp

I finally bought 3DMark. Fire Strike Extreme won't even start up for me; it just crashes instantly, even with no OC. What's wrong with it? It must be some other issue.

It says "test cancelled by user". The demo runs fine.


----------



## alucardis666

Quote:


> Originally Posted by *Nico67*
> 
> Been playing with AB curve today and managed to get some more out of my card,
> 
> 2126 is good at 1.062v runs to 118% power on 5 runs of benchmark
> 
> 2139 seems best so far and that needs 1.075v, runs 120% power max on 5 runs
> 
> 
> 
> 2152 maybe ok at 1.081, past once but starts power capping and does take a few dips, 1.093 takes power dips too so next step would be better bios.
> 
> 
> 
> benchmark is from the last big dip on clock about mid graph, and did pass ok but does dip with power limit, successive runs crashed.
> 
> seems to clock up pretty smoothly so far but maybe reaching the wall,
> 
> 2101 1.043
> 2114 1.050
> 2126 1.062
> 2139 1.075...
> 
> one voltage bin per clock bin seems to be required at the moment
> 
> Still to play with memory, but that ain't going to help the power situation so will try that next.


Golden chip








Quote:


> Originally Posted by *JedixJarf*
> 
> One time the carrier through a package over my fence, apparently it was a secure location...


Yea no clue where the F my kryonaut is...


----------



## KingEngineRevUp

Okay, I'm reading online that Afterburner is causing 3DMark to crash. ***?


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> Okay I'm reading online that afterburner is causing 3DMARK to crash. ***?


Strange... Clean install video drivers?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> Strange... Clean install video drivers?


It actually doesn't like Afterburner. It treats the FPS counter from Afterburner as a cheat or something.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> It actually doesn't like afterburner. It considers the FPS counter from afterburner like a cheat or something.


And if you turn the counter off? Does it work?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> And if you turn the counter off? Does it work?


I have to close Afterburner out and then it works.

Man, I've spent more time benchmarking, tweaking, and OCing than actually gaming... This is kinda sad... It needs to stop now; I know I've said that like 2 times already.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> I have to close afterburner out and it works.
> 
> Man, I've spent more time benchmark, tweaking and oc than actually gaming... This is kinda sad... It needs to stop now, I know I've said that like 2 times already.


hmmm. I'd test myself but I don't wanna buy a benchmark.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> hmmm. I'd test myself but I don't wanna buy a benchmark.


It's $7 on G2A, but yeah, what a waste. Wish I'd bought a cheeseburger instead.

In other news, RivaTuner supports DX12 now! Or at least I just found out.


----------



## Exilon

Quote:


> Originally Posted by *illypso*
> 
> Has I already have the shunt mod and not getting past 2070 mhz and bios unlock seems far,
> 
> I'm starting to think about doing the physical voltage mod on my 1080ti.
> 
> https://xdevs.com/guide/pascal_oc/#step4
> 
> Anyone here has done it ?
> 
> I know about risk and temps, just want to give it a little more juice, just a little...
> 
> I'm not sure yet but its tempting.


I wouldn't bother. I'm getting negative scaling on the last two voltage bins. My max stable overclock ( in Fire Strike Extreme Stress Test ) was 2100 @ 1.075v.

Max stable at 1.093 was 2088.


----------



## feznz

Quote:


> Originally Posted by *Bishop07764*
> 
> Here's how you get voltage control.
> 
> http://www.guru3d.com/articles_pages/geforce_gtx_1080_ti_review,32.html


Great thanks +1
I was too lazy to look myself


----------



## Taint3dBulge

Ordered an EVGA 1080 Ti reference card and the new EK water block. What's roughly the average OC for core and mem on the reference cards? +150 core / +550 mem?


----------



## Jbravo33

Someone say cheeseburger? Is the included thermal compound from EK no good? I see a lot of people using other brands. Any gains, or is it a crapshoot?


----------



## joder

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Ordered a evga 1080 ti reference card and the new ek water block. Whats kinda the average oc for core and mem on the ref cards. +150c +550m?


Sounds about right, though maybe more like +130 on the clock. 2037 and 2050 MHz seem to be pretty common for folks on here with boost.


----------



## kevindd992002

Quote:


> Originally Posted by *pez*
> 
> The FE is the cheapest around this time outside of the MSI Armor and Gigabyte OC model. If you're going to just put it under water, the FE will be the easiest option for the time being. Plus, the trend is that all of these cards top out to about the same OCs no matter the cooler. Water just helps keep it more consistent.


With the FE and a waterblock, will I be power-limited if I plan on OC'ing the card?
Quote:


> Originally Posted by *pez*
> 
> The Aorus Ti has a power limit of 150% which translates to 375w. Yet TPU has found that it still clocks the same as the other AIBs.


Could that be because it's just not a golden card? So there's really no use for the extra power limit there?
Quote:


> Originally Posted by *Benny89*
> 
> Becasue bigger Power Limit allows clocks to stabilize above 2k, however then we hit Architecture limit, a clock wall which is 2100 mhz. That is why 320-340W is where Power Limit just allows to hit another wall, that is all.


If I understand that correctly, the extra power limit would still be useful then. If I go with an FE plus a waterblock, I will hit the power limit wall before hitting the 2.1GHz wall. I wouldn't want that, would I? I wanna get past the power limit wall and hopefully reach the 2.1GHz wall.
Quote:


> Originally Posted by *pez*
> 
> When your temps are low enough under water, that becomes a blurred line. People running TXP cards at full tilt <60C on water are not really hitting a wall. The power limit isn't the reason you're hitting a wall. It's the fact that after 50C you start to lose one bin at a time.


So like my question above, what does power limit really signify when watercooling?
Quote:


> Originally Posted by *SlimJ87D*
> 
> Not necessarily. Many factors come into play. Throw in an extra 30 watts of power and add in 5C of heat might push you into a thermal throttle bin and be counter productive.
> 
> Then your card will use more voltage to keep those clocks and you're either back to where you were or even worse performance.
> 
> That's why the Asus bios doesn't work well for many people.
> 
> If you're under 40C then more power will help. But being under 40C is kind of hard.
> 
> If you're at 45C and then you draw 30 more watts it can push you to 49-51C and cause the card to start thermal throttling.


Let's just say, hypothetically, I can get under 40C with watercooling. I would have the room to add more voltage and consequently draw more power. But if I go with an FE card, I wouldn't have that room since I will be power-limited, right?
Quote:


> Originally Posted by *Pandora's Box*
> 
> Lower heat - lower voltage needed. Water on the FE card is just fine for 2.1GHz.
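
The power-limit percentages being thrown around in this exchange are all relative to the card's base power target, which is why the numbers only make sense once converted to watts. A quick sketch (assuming the 1080 Ti FE's 250 W stock TDP as the base; AIB BIOSes use their own base values):

```python
# Power-limit sliders are percentages of the card's base power target.
# 250 W is the 1080 Ti FE's stock TDP, used here as the assumed base.
def limit_watts(percent, base_tdp_w=250):
    """Convert a power-limit slider percentage to watts."""
    return base_tdp_w * percent / 100

print(limit_watts(120))  # the FE's typical max slider
print(limit_watts(150))  # the 150% figure quoted for the Aorus above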


----------



## Exilon

Quote:


> Originally Posted by *Jbravo33*
> 
> someone say cheese burger? is the included thermal compound from ek no good? i see alot of people stating other brands. any gains or a crap shoot?


It's about on par with AC MX-2

http://overclocking.guide/thermal-paste-roundup-2015-47-products-tested-with-air-cooling-and-liquid-nitrogen-ln2/6/

I don't understand why they don't include GC Extreme like they do with their CPU blocks.

Mounting pressure and making sure you have enough applied is really a lot more relevant to temperatures.


----------



## illypso

Quote:


> Originally Posted by *Exilon*
> 
> I wouldn't bother. I'm getting negative scaling on the last two voltage bins. My max stable overclock ( in Fire Strike Extreme Stress Test ) was 2100 @ 1.075v.
> Max stable at 1.093 was 2088.


So adding more voltage would make the clock less stable? Damn, I was a little hyped on the project; it would have been fun.

You were able to get a higher stable clock with lower voltage? It wasn't because of power-limit clock fluctuation or the extra heat from the voltage?

These Pascal chips are different from my GTX 970; that card was in love with voltage.

Quote:


> Originally Posted by *VanuSovereignty*
> 
> For another what, 50 MHz. Dude just be happy with what you got. I could see if shunt mod yielded 200 MHz, but 50-100, from what 2.0 to 2.1GHz, that isn't worth the hassle, risk and loss of warranty.


Hassle: not really, it would actually be fun to do.

Risk: without risk you get nothing.

Warranty: I have the shunt mod already, but I did it by soldering a wire on the shunt and, before that, a resistor on the power controller. There's too much carnage evidence left for a warranty claim.

As for why... for science ;-)


----------



## Emissary of Pain

Hey all

Random question: what temps are you guys getting at idle? ... With the P&T limits set to max and fan speed set to 70%, I idle around 50C.


----------



## Pandora's Box

Quote:


> Originally Posted by *Emissary of Pain*
> 
> Hey all
> 
> Random question, what temps are you guys getting at idle ? ... With P&T limit set to max with fan speed set to 70%, I idle around 50c


Gaming X card here. I run 30% fan speed at idle. Card is currently sitting at 29C.


----------



## Madness11

Guys, hello. Please help me: I want to OC my card with the curve, but I don't know how. Help!


----------



## Emissary of Pain

Quote:


> Originally Posted by *Pandora's Box*
> 
> Gaming X card here. I run 30% fan speed at idle. Card is currently sitting at 29C.


Damn, probably should have noted that I have the FE


----------



## madmeatballs

I hope we can really get statistics on this thread so we can average things out.


----------



## gstarr

Got my second card from Nvidia; the first was EVGA.
The first was limited to 2025.
Second card, first test on stock cooling: 1.093v, 2125... temps going too high... maybe water + the Strix BIOS?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *kevindd992002*
> 
> With the FE and a waterblock, will I be power-limited if I plan on OC'ing the card?
> Could that be because it's just not a golden card? So there's really no use for the extra power limit there?
> If I understand that correctly, the extra power limit would still be useful then. If I go with an FE plus a waterblock, I will hit the power limit wall before hitting the 2.1GHz wall. I wouldn't want that, would I? I wanna get past the power limit wall and hopefully reach the 2.1GHz wall.
> So like my question above, what does power limit really signify when watercooling?
> Let's just say, hypothetically, I can get under 40C with watercooling. I would have the room to add more voltage and consequently draw more power. But if I go with an FE card, I wouldn't have that room since I will be power-limited, right?


Well, you've still got to win the lottery and get a card that can OC to 2088+. Watercooling will get you some higher boost bins. I know for a fact that if I were below 40C I'd run benches at 2126, etc.

I've been playing with the Asus BIOS at night with an open window to get ambient temps down to 70F and keep my card at 45C. Yeah, the extra power draw helps avoid minor power limits.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *illypso*
> 
> So adding more voltage would make the clock less stable, Dam I was a little hype on the project. it would have been fun.
> 
> You where able to get a higher stable clock with lower voltage? it wasn't because of power limit clock fluctuation or the more heat from the volt.
> 
> those pascal chip are different than my gtx970, it was in love with voltage.
> Hassle, not really it would be fun to do,
> 
> risk, without risk you get nothing.
> 
> Warranty, I have shunt mod already but I did it by soldering a wire on the shunt and before that resistor on the power controller. there to much carnage evidence for warranty.
> 
> has for why.... for science ;-)


That's nice and all, but there's also such a thing as going too far and wasting time on something that might not get you any gains, or might even hurt you.

It's like doing all this soldering for a 25% chance at 2% more fps, or a 75% chance of the same or worse performance.

The reward wouldn't be worth the risk for me. Heck, the shunt mod wasn't worth it for me, not when the Asus BIOS is giving me similar results.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *madmeatballs*
> 
> I hope we can really get statistics on this thread so we can average things out.


Don't worry, HWBOT will start to have statistics.


----------



## kevindd992002

Quote:


> Originally Posted by *SlimJ87D*
> 
> Well you got to win the lottery still and get a card that can oc to 2088+. Watercooling will get you some higher boost bins. I know for a fact if I was below 40C I'd run benches at 2126, etc.
> 
> I've been playing with the Asus bios at night with an open window to get ambient Temps down to 70F and keep my card at 45C. Yeah the extra power draw helps avoid minor power limits.


Are you talking about the FE's here?


----------



## pez

Quote:


> Originally Posted by *kevindd992002*
> 
> With the FE and a waterblock, will I be power-limited if I plan on OC'ing the card?
> Could that be because it's just not a golden card? So there's really no use for the extra power limit there?
> If I understand that correctly, the extra power limit would still be useful then. If I go with an FE plus a waterblock, I will hit the power limit wall before hitting the 2.1GHz wall. I wouldn't want that, would I? I wanna get past the power limit wall and hopefully reach the 2.1GHz wall.
> So like my question above, what does power limit really signify when watercooling?
> Let's just say, hypothetically, I can get under 40C with watercooling. I would have the room to add more voltage and consequently draw more power. But if I go with an FE card, I wouldn't have that room since I will be power-limited, right?


Not really. It's an architecture issue first and then factors like silicon lottery, voltage, and temperatures come next. I wouldn't worry about OCs too much with whatever card you get.
Quote:


> Originally Posted by *VanuSovereignty*
> 
> Yeah the misconception here is that the AIB cards are faster and that FE "runs hot".
> 
> Now that I've thoroughly destroyed all of that nonsense it's "FE is too loud at 70% RPM".


Quote:


> Originally Posted by *Pandora's Box*
> 
> Vanu at the end of the day man our cards perform almost identically. The difference is mine is quiet and yours is loud, this is my opinion. I think the big difference here is I dont wear headphones and you do. If high fan speed doesn't bother you because you wear headphones, then more power to you. For everyday stuff I dont even overclock my card. It boosts to 1950mhz out of the box and I can't hear the fans unless I put my head right against the card. Enjoy your card, I'm done benchmarking. Just don't be childish about other peoples builds.


Quote:


> Originally Posted by *VanuSovereignty*
> 
> Yeah I agree, that may change if you get an unlocked vbios and can use the better cooling and power delivery. I have to say that your card is actually the better card in that it is much quieter, and from what you've shown from your Firestrike Extreme run much cooler as well (46C vs. 50C). But yeah, we are all in the same vbios boat, more or less. Some cards are showing better power delivery, as Kedarwolf has shown simply flashing to the Strix vbios. That and messing with the voltage is a bit more than I want to get myself into at this point, if anything should go wrong with the card, you are truly screwed if you don't have another GPU lying around to flash back to the default vbios. Nvidia does NOT play games, if you try and send that in with a different vbios you can pretty much kiss $700 goodbye.
> 
> But yeah, I have to take back the early criticism of your build, I actually put some thought into it and what is happening is that your card is intaking adequate air through the vented side panel (Silverstone Raven for those not following) and all of that hot air is actually working naturally, flowing out of what would normally be the side of the card up and out through the ceiling. There is not a whole lot of heat recirculating in the case as the fans care continually bringing air in which is forcing the exhaust out of the case. You are actually seeing really phenomenal temperatures. 46C peak in Firestrike Extreme, that is exceeding what an H55 equipped FE is capable of.
> 
> Here's to hoping someone unlocks this vbios so everyone rocking 600W 3rd party cooling solutions and 8+8 power delivery can put all of that jazz to use.
> 
> When that happens then yeah, AIB will be vastly superior to reference, hands down. For now they are far better acoustically speaking, aesthetically (subjective) and being that they run cooler, you can get away with maybe another 39Mhz or so higher peak frequency keeping the core under 60C without nearly as obtrusive of an acoustical profile.
> 
> Repped for putting up with my BS.
> 
> THUNDERCATS HO!
> 
> PS:
> 
> I just ran Normal, Ultra and Time Spy, I'm putting them all here for whoever else wants an E-Peen measuring contest.
> 
> Rules:
> 
> You must indicate if youre using a shunt mod and / or additional voltage.
> 
> http://www.3dmark.com/3dm/19100386? (picked up 300 points GPU compared to 7 but lower combined and overall)
> 
> http://www.3dmark.com/3dm/19100445? (direct improvement over 7)
> 
> http://www.3dmark.com/3dm/19100539?


Very glad to see you both come to an agreement and get along.


----------



## Jaysend

Hoping for some help here:

I have an EVGA card and it was running great on air; it overclocked well and all was good except for the atrocious fan noise.
So I decided to finally go with a full custom water loop. I got an EK Titan water block and backplate, followed the directions, and was pretty meticulous. Went to fire it up and all the lights came on for a split second, then a click and nothing. I checked the PSU and tested it thoroughly, then moved on to testing on the bench. I thought it was the motherboard, so I got another motherboard. Then I thought it was the CPU, so I got another CPU.

Same thing happened. So I went back to the PSU and tested one connection at a time. It posted with everything connected; the last thing to connect was the GPU. With the GPU connected it would not turn on, and when I removed the card everything turned on. So with power connected to the GPU, whether or not it is actually in the PCIe slot, nothing turns on. In the slot but with no power connected, the system will turn on.

What would cause this? Am I correct that my GPU is dead? It must be a short that is causing the system to shut down.

Now I am really worried. If I get a replacement GPU, will this happen again? Did I damage the new mobo and CPU?

I will try and get my hands on another video card to test.

Any ideas, tips, suggestions???

Thanks!!!


----------



## Matthew89

Just arrived


----------



## pez

It's gonna be so hard to leave that thing sealed and not keep it later. Arghhhh


----------



## Vapochilled

Hello guys!!

I currently own a 980 Ti and I just pulled the trigger on a 1080 Ti.
Going to pick it up today.
Over here, the MSI Gaming X = 820€ and the Aorus from Gigabyte is 850€ (not the Extreme).
Wondering if all Aorus cards have the 150% power limit, or just the Extreme edition.
The Gaming G1 is also here for 800€...

If I look at price to performance, the Gaming G1 for 800€ is the best bang for the buck.
What do you think?
The Strix is 879€. Asus is crazy...


----------



## Madness11

Hey guys, help me :)

http://imgur.com/D2gLG

Is it OK? Or can it be better? (It's my first time on the curve.)


----------



## OcSlave

Haven't had the MSI Gaming X long; stock boost is 1974 MHz.
Was going to get the GB Extreme, but at £130 more I thought what the hell. Not a bad decision, by the looks of the Extreme review.
I've settled at 2050 MHz for the OC.


----------



## Vapochilled

So, you would recommend saving the 30€ and going with the Gigabyte version? (Check my previous post right above.)


----------



## Benny89

Quote:


> Originally Posted by *Vapochilled*
> 
> So, you would recommend me to spare 30e from the Gigabyte verson? (check my previous post right above)


Mate, just buy which one looks better. All of them OC THE SAME. 2050mhz is soft cap most achieve while anything above that till 2100 is a matter of winning a golden chip which is a lottery and can happen with every brand.

Just buy what looks best for your RIG.

All AIBs are the same....


----------



## Vapochilled

Quote:


> Originally Posted by *Benny89*
> 
> Mate, just buy which one looks better. All of them OC THE SAME. 2050mhz is soft cap most achieve while anything above that till 2100 is a matter of winning a golden chip which is a lottery and can happen with every brand.
> 
> Just buy what looks best for your RIG.
> 
> All AIBs are the same....


Thanks Benny! I'm gonna trigger the best looking one for me then (taking pricing into account)


----------



## Benny89

Quote:


> Originally Posted by *Vapochilled*
> 
> Thanks Benny! I'm gonna trigger the best looking one for me then (taking pricing into account)


No problem


----------



## AsusFan30




----------



## Benny89

Quote:


> Originally Posted by *AsusFan30*


Nice, welcome to the club


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Golden chip
> 
> 
> 
> 
> 
> 
> 
> 
> Yea no clue where the F my kryonaut is...


That's crazy. You check any "hidden" spots on your porch? Mailbox? After all that trouble and you still can't get your Kryonaut...








Quote:


> Originally Posted by *VanuSovereignty*
> 
> Yeah but man, isn't this thing completely beast compared to 980 TI? I'm actually seeing a 60% improvement compared to my 980 Ti @ 1500 MHz core / 8 GHz memory. That's insane. It's basically as fast as 980 Ti SLI at 1200 MHz at default clocks (1850, but actually 1080 Ti is faster in this comparison by 3-5 FPS if you watch closely) and probably as fast as 980 Ti SLI at 1350-1400MHz at 2025Mhz:
> 
> 
> 
> 
> 
> 
> Well you went from +120 and 70C sounding like a hair drier to +162 and being whisper quiet! Congratulations.
> 
> To be honest, I don't care if this vbios never get's unlocked, the performance with it the way it is is phenomenal.


Oh yeah. In games where SLI worked, my fps is about the same now but much smoother, with much better minimums. And, of course, SLI only works about 1/3 of the time, so in all those other games (like Rainbow Six Siege) my framerate has basically doubled.


----------



## RavageTheEarth

Quote:


> Originally Posted by *AsusFan30*


Damn that's expensive! That reseller is making $130 per card! But hey, if you got the money do it up.


----------



## Nico67

Quote:


> Originally Posted by *Madness11*
> 
> Hey guys. Help me
> 
> 
> 
> 
> 
> 
> 
> )
> 
> 
> http://imgur.com/D2gLG
> 
> Its ok ?? Or can better (its my first time on Curve


A couple of things with the curve; first, it depends on whether you have voltage unlocked.

1/ It might as well be a straight line after 1.093v, as that's the max voltage even when unlocked.

2/ Relative to stock volts, it's the 1.050v point you really want to concentrate on; have the curve flat after that point.

Once you find the max stable clock at 1.050v, and if you are not power limited, then maybe try adjusting 1.062v up one clock bin higher and flat from there, and so forth.
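The curve shaping described here amounts to picking a max stable clock at a chosen voltage point and clamping every higher-voltage point down to it, which is what dragging the Afterburner curve flat does. A minimal sketch, with hypothetical curve points (the voltage/clock values below are made up for illustration, not from any card in this thread):

```python
# Hypothetical voltage (V) -> clock (MHz) points on an Afterburner-style curve.
curve = {0.950: 1962, 1.000: 2012, 1.050: 2063, 1.062: 2076,
         1.075: 2088, 1.093: 2101}

def flatten_after(curve, v_cap, clock_cap):
    """Clamp every point at or above v_cap down to clock_cap,
    leaving lower-voltage points untouched."""
    return {v: (min(c, clock_cap) if v >= v_cap else c)
            for v, c in curve.items()}

# Flat after the 1.050v point, per the advice above.
shaped = flatten_after(curve, 1.050, 2063)
```

With this shaping the card never requests more than its max stable clock no matter how much voltage headroom boost finds, which is the whole point of the flat tail.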


----------



## AsusFan30

Quote:


> Originally Posted by *Benny89*
> 
> Nice, welcome to the club


Thanks a lot! I'm just waiting for people to start talking about the price. LOL. I even had to pay an extra $145.00 USD for shipping.


----------



## AsusFan30

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Damn that's expensive! That reseller is making $130 per card! But hey, if you got the money do it up.


It was like $1900 USD after shipping to Japan. LOL. They made some good money off me, but hey, why not!


----------



## Slackaveli

Quote:


> Originally Posted by *VanuSovereignty*
> 
> For another what, 50 MHz. Dude just be happy with what you got. I could see if shunt mod yielded 200 MHz, but 50-100, from what 2.0 to 2.1GHz, that isn't worth the hassle, risk and loss of warranty.
> 
> Yeah but man, isn't this thing completely beast compared to 980 TI? I'm actually seeing a 60% improvement compared to my 980 Ti @ 1500 MHz core / 8 GHz memory. That's insane. It's basically as fast as 980 Ti SLI at 1200 MHz at default clocks (1850, but actually 1080 Ti is faster in this comparison by 3-5 FPS if you watch closely) and probably as fast as 980 Ti SLI at 1350-1400MHz at 2025Mhz:
> 
> 
> 
> 
> 
> 
> Well you went from +120 and 70C sounding like a hair drier to +162 and being whisper quiet! Congratulations.
> 
> To be honest, I don't care if this vbios never get's unlocked, the performance with it the way it is is phenomenal.


Yeah, I'm totally good now. Happy happy. This card is such a beast, and now I'm always gaming somewhere in the 60s quietly with 2GHz+ steady. For my OCD, gaming at 2025 or so feels so much better than 1961 or so, lol. That 2GHz mark is a mental killer if you end up on the wrong side of it, lol.
Quote:


> Originally Posted by *Taint3dBulge*
> 
> Ordered a evga 1080 ti reference card and the new ek water block. Whats kinda the average oc for core and mem on the ref cards. +150c +550m?


I'd say that is better than average. Average is about +100-120; some can hold +135, others +150. Very few hold higher than that.


----------



## Slackaveli

Quote:


> Originally Posted by *madmeatballs*
> 
> I hope we can really get statistics on this thread so we can average things out.


We can't read the thread for you, bud. But anything you could want to know is in here.


----------



## RavageTheEarth

Quote:


> Originally Posted by *AsusFan30*
> 
> It was like $1900 USD after Shipping to Japan. LOL. They made some good Money off me, but hey..why not!


JESUS! I can see patience is not your strong suit. Me neither haha. I was lucky enough to grab a Gigabyte Aorus 1080 Ti on launch day. I'm on vacation right now, but it'll be waiting for me when I get home. Should be a nice upgrade from my 980 Ti!


----------



## Bishop07764

Quote:


> Originally Posted by *kevindd992002*
> 
> With the FE and a waterblock, will I be power-limited if I plan on OC'ing the card?
> Could that be because it's just not a golden card? So there's really no use for the extra power limit there?
> If I understand that correctly, the extra power limit would still be useful then. If I go with an FE plus a waterblock, I will hit the power limit wall before hitting the 2.1GHz wall. I wouldn't want that, would I? I wanna get past the power limit wall and hopefully reach the 2.1GHz wall.
> So like my question above, what does power limit really signify when watercooling?
> Let's just say, hypothetically, I can get under 40C with watercooling. I would have the room to add more voltage and consequently draw more power. But if I go with an FE card, I wouldn't have that room since I will be power-limited, right?


Yes!!! You will be power limited big time if it's anything like my card. My temps have never gone over 36c ever since putting on my waterblock even when trying for 2.1 GHz and voltage at 1.093. It may allow you some room to do lower volts to hit some max clocks. I tested mine last night at 1.031 volts at 2063 core. Power limit for these is my only limitation right now.


----------



## AsusFan30

Quote:


> Originally Posted by *RavageTheEarth*
> 
> JESUS! I can see patience is not your strong suit. Me neither haha. I was lucky enough to grab a Gigabyte Aorus 1080 Ti on launch day. I'm on vacation right now, but it'll be waiting for me when I get home. Should be a nice upgrade from my 980 Ti!


I have always been an early adopter. I don't wait for any price drops. Very nice, you were lucky to get that on launch day. Lucky you to be on vacation. The sad part about my story is that I just bought (2) GTX 1080 Gaming X cards last week. lol. But I saw these 1080 Ti, and I just had to have them.


----------



## RavageTheEarth

Quote:


> Originally Posted by *Bishop07764*
> 
> Yes!!! You will be power limited big time if it's anything like my card. My temps have never gone over 36c ever since putting on my waterblock even when trying for 2.1 GHz and voltage at 1.093. It may allow you some room to do lower volts to hit some max clocks. I tested mine last night at 1.031 volts at 2063 core. Power limit for these is my only limitation right now.


Those are some fantastic temps under water! With my 980 Ti I get around 46C under load. I'm interested in seeing what the temps will be like when EK releases the waterblock for my Aorus.


----------



## Slackaveli

Quote:


> Originally Posted by *OcSlave*
> 
> Not long had the Msi gaming x, stock boost is 1974mhz.
> Was going to get the Gb extreme but at £130 more I thought what the hell, not a bad decision by the looks of the Extreme review.
> Ive settled at 2050mhz for the oc.


solid!!
Quote:


> Originally Posted by *Jaysend*
> 
> Hoping for some help here:
> 
> I have an EVGA card and it was running great on air, overclocked well and all was good except for the atrocious fan noise.
> So I decided to finally go with a full custom water loop. I got an EK Titan water block and back plate. Followed directions and was pretty meticulous. Went to fire it up and all the lights came on for a split second, then click and nothing. I checked the PSU and tested it thoroughly, then moved on to testing on the bench. I thought it was the motherboard, so I got another motherboard. Then I thought it was the CPU, so I got another CPU.
> 
> Same thing happened. So I went back to the PSU and tested one connection at a time. It posted with everything connected. The last thing to connect was the GPU, and it would not turn on. I removed the card and everything turned on. So with power to the GPU, whether or not it is actually in the PCIe slot, nothing turns on. In the slot but with no power, the system will turn on.
> 
> What would cause this? Am I correct that my GPU is dead? It must be a short that is causing the system to shut down.
> 
> Now I am really worried. If I get a replacement GPU, will this happen again? Did I damage the new mobo and CPU?
> 
> I will try and get my hands on another video card to test.
> 
> Any ideas, tips, suggestions???
> 
> Thanks!!!


Dang, dude, wish I knew what to say, but if your rig starts without the GPU and doesn't with it hooked up, it's the GPU. Maybe you forgot to plug something back in when you installed the waterblock? Made sure your 8-pin/6-pin are ALL the way in? Try reinstalling the original cooler and see if it will post (if not, the card is borked)?


----------



## mtbiker033

Anyone else get the new Windows update yesterday, install the new GeForce drivers for it, and notice that in games the colors seem washed out? Less performance?


----------



## Slackaveli

Quote:


> Originally Posted by *RavageTheEarth*
> 
> JESUS! I can see patience is not your strong suit. Me neither haha. I was lucky enough to grab a Gigabyte Aorus 1080 Ti on launch day. I'm on vacation right now, but it'll be waiting for me when I get home. Should be a nice upgrade from my 980 Ti!


hopefully not just sitting on your porch :/
Quote:


> Originally Posted by *mtbiker033*
> 
> anyone else get the new windows update yesterday, install the new geforce drivers for it and notice that in games the color's seem washed out? less performance?


Haven't got it yet, but I have heard that said elsewhere yesterday. Colors always look washed out to me when I update drivers because I play with digital vibrance at 66%.
Quote:


> Originally Posted by *AsusFan30*
> 
> I have always been an early adopter. I don't wait for any price drops. Very nice. You are lucky to get that on launch day. Lucky you to be on Vacation. The sad part about my Story is that I just bought (2) GTX 1080 Gaming X last week. lol. But I saw these 1080 Ti, and I just had to have them.


yeah, man, you were supposed to be keeping up with the tech news. Don't you know Titan XP3.0 released yesterday

But seriously, I bet you get a good resale out of your 1080s over there.


----------



## RavageTheEarth

Quote:


> Originally Posted by *AsusFan30*
> 
> I have always been an early adopter. I don't wait for any price drops. Very nice. You are lucky to get that on launch day. Lucky you to be on Vacation. The sad part about my Story is that I just bought (2) GTX 1080 Gaming X last week. lol. But I saw these 1080 Ti, and I just had to have them.


That's going to be a beastly SLI setup! You running a 4k monitor? That's A LOT of power at your fingertips! I'm running a 1440p 144hz G-Sync monitor so I could use the extra FPS.


----------



## mtbiker033

Quote:


> Originally Posted by *Slackaveli*
> 
> Havent got it yet but i have heard that said elsewhere yesterday. Colors always look washed to me when i update drivers b/c i play with digital vibrance on 66%.


Oh, I understand what you mean there, but I mean in game. I also use digital vibrance and use NVIDIA settings for the desktop color scheme, but a friend of mine and I both noticed it in games.

So with that said, I wanted to use DDU to do a clean driver install, but for some reason it won't start since the update. I click on it, get the "let this program make changes to your PC" prompt, hit yes, and then nothing happens.


----------



## AsusFan30

Quote:


> Originally Posted by *RavageTheEarth*
> 
> That's going to be a beastly SLI setup! You running a 4k monitor? That's A LOT of power at your fingertips! I'm running a 1440p 144hz G-Sync monitor so I could use the extra FPS.


I do have a 4k Monitor but I do not like 60Hz. I have an AOC Agon 27inch 165Hz G-sync.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *mtbiker033*
> 
> oh I understand what you mean there but I mean in game, I also use digital brightness and use nvidia settings for desktop color scheme but I and a friend of mine noticed it in games.
> 
> so with that said I wanted to use DDU to do a clean driver install but for some reason it won't start since the update, I click on it and I get the "let this program make changes to your pc" hit yes and then nothing happens


Just use the fresh install option in the NVIDIA driver. Does practically the same thing.


----------



## mtbiker033

Quote:


> Originally Posted by *SlimJ87D*
> 
> Just use the fresh install option in the NVIDIA driver. Does practically the same thing.


good idea, I had forgotten about that!


----------



## RavageTheEarth

Quote:


> Originally Posted by *AsusFan30*
> 
> I do have a 4k Monitor but I do not like 60Hz. I have an AOC Agon 27inch 165Hz G-sync.


Oh so you are using a 1440p monitor for gaming? I also can't go back to 60hz. It just feels so terrible and choppy when you get used to playing at high framerates. I remember my first day with the monitor it was weird because everything was TOO smooth. I'm using the Acer XB270HU. I got it off someone on here for $250. The thing is mint with one dead pixel. Love it.


----------



## Slackaveli

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Oh so you are using a 1440p monitor for gaming? I also can't go back to 60hz. It just feels so terrible and choppy when you get used to playing at high framerates. I remember my first day with the monitor it was weird because everything was TOO smooth. I'm using the Acer XB270HU. I got it off someone on here for $250. The thing is mint with one dead pixel. Love it.


helluva buy.


----------



## KedarWolf

Quote:


> Originally Posted by *Lefty23*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Slackaveli*
> 
> at least my eyes have seen a card hit 2100 and hold it. That was cathartic.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *alucardis666*
> 
> I feel your pains bro. Same here.
> 
> Click to expand...
> 
> Quote:
> 
> 
> 
> Originally Posted by *SlimJ87D*
> 
> All he did was just open the benchmark lol. I ran it at 2114 MHz with the stock bios at 1.075 Volts.
> 
> Click to expand...
> 
> 2100 stable, is doable in "Heaven/Valley @ 1080p & 1440p" and "firestrike @ 1080p" tests under water.
> e.g. Heaven @1440p:
> 
> 
> If you want, check here for more info (open spoilers - you might have to right click on the image and open it in a new tab to see better)
> http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-thread/3500_100#post_25983655
> This is on stock bios. I'll probably try some of the other bioses this weekend.
> By the way, I get driver crash as soon as I move to the next bin up (2114), in the first few seconds of all the benchmarks I tried.
> Looks like 2100 on the core is the limit of my card.
> 
> I'm pretty sure as more people get their cards under full water loops we'll see more examples like this.
> @KedarWolf or, anyone else who wants to chime in, if I could only try one* bios, which one would you recommend?
> Card is an EVGA 1080ti FE, though I don't think it matters, right?
> 
> * Release of the card could not come at a worse time for me as April-May & October-November are really busy at work as we prepare our two major product releases. I'm lucky if I can spare like 5-6 hours per week for "hobbies" and that is usually on the weekends.
Click to expand...

I tried the MSI gaming X, had a bit of throttling, pretty much the same as stock FE BIOS.

The Gigabyte BIOS had clocks all over the place, 500, 800, 1000; it never worked right at all. People said the same about the Aorus BIOS as the Gigabyte.

The Strix BIOS I get sustained 2100 at 1.075v, best one so far.


----------



## Slackaveli

Quote:


> Originally Posted by *Nico67*
> 
> Been playing with AB curve today and managed to get some more out of my card,
> 
> 2126 is good at 1.062v runs to 118% power on 5 runs of benchmark
> 
> 2139 seems best so far and that needs 1.075v, runs 120% power max on 5 runs
> 
> 
> 
> 2152 maybe ok at 1.081, past once but starts power capping and does take a few dips, 1.093 takes power dips too so next step would be better bios.
> 
> 
> 
> benchmark is from the last big dip on clock about mid graph, and did pass ok but does dip with power limit, successive runs crashed.
> 
> seems to clock up pretty smoothly so far but maybe reaching the wall,
> 
> 2101 1.043
> 2114 1.050
> 2126 1.062
> 2139 1.075...
> 
> one voltage bin per clock bin seems to be required at the moment
> 
> Still to play with memory, but that ain't going to help the power situation so will try that next.


dang, dude, that's a DIAMOND gpu. Very rare. It must have been cut from the very center of the die. It's like the heart of a watermelon!
Quote:


> Originally Posted by *KedarWolf*
> 
> I tried the MSI gaming X, had a bit of throttling, pretty much the same as stock FE BIOS.
> 
> The Gigabyte BIOS was clocks all over the place, 500, 800, 1000, never worked right at all. People said the same about the Aorus BIOS as the Gigabyte.
> 
> The Strix BIOS I get sustained 2100 at 1.075v, best one so far.


It makes sense. It is closest to the FE at the PCB level, just with a few more power phases running similar currents; in other words, the same thing but a bit more aggressive. The Aorus PCB is very different and favors lower volts. If this dude were under water it'd hold 2062+ at 1.05v.


----------



## kevindd992002

Quote:


> Originally Posted by *Bishop07764*
> 
> Yes!!! You will be power limited big time if it's anything like my card. My temps have never gone over 36c ever since putting on my waterblock even when trying for 2.1 GHz and voltage at 1.093. It may allow you some room to do lower volts to hit some max clocks. I tested mine last night at 1.031 volts at 2063 core. Power limit for these is my only limitation right now.


Why would higher power limits allow you to do lower volts to hit the max clocks? Isn't it the other way around (allow you to use higher voltage to hit max clocks)? Are you saying that you're always power-limited no matter what?


----------



## RavageTheEarth

Quote:


> Originally Posted by *Slackaveli*
> 
> helluva buy.


Yeah I got really lucky. I bought a broken XB271HU thinking I could buy a replacement panel. Nope. I called Acer and they can't even replace it if I send it in and pay them. I looked everywhere online for the panel. Nowhere to be found. They just don't sell it. So I posted about it and someone PM'd me saying they had a XB270HU sitting in his closet and that he'd throw it up on eBay for me for $250. Awesome dude. Now I have the 1080 Ti on the way to replace my 980 Ti and get some more FPS!


----------



## Slackaveli

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Yeah I got really lucky. I bought a broken XB271HU thinking I could buy a replacement panel. Nope. I called Acer and they can't even replace it if I send it in and pay them. I looked everywhere online for the panel. Nowhere to be found. They just don't sell it. So I posted about it and someone PM'd me saying they had a XB270HU sitting in his closet and that he'd throw it up on eBay for me for $250. Awesome dude. Now I have the 1080 Ti on the way to replace my 980 Ti and get some more FPS!


Yeah, this forum is pretty great at times. Some cool cats in here.

Just passed a two hour Heaven run @ 1650 core, always in the 2025+ range, tops out at 72c. I'm going to open her up, put dat thermal grizz on her, and see if that helps a bit.


----------



## mouacyk

For those flashing a 330W (or 375W) BIOS meant for a 2x8pin card to the FE, aren't you worried about forcing >=127W over the 6pin where it's only spec'ed for 75W? That's 70% more power than what the 6pin connector is rated for.
Quote:


> Board power limit
> Target: 275.0 W
> Limit: 330.0 W
> Adj. Range: -55%, +20%


Be prepared for possible burning plastic smell if you run furmark or high loads for any long period of time. I burned my PCIe cable with a 980 Ti, but that's because I was forcing ~325W through one cable (400W furmark load - 75W PCIe = 325W split over a y-connector.)
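For anyone wanting to sanity-check that arithmetic, here's a quick sketch. It assumes the PCIe slot supplies its full 75W and the remainder splits evenly between the two aux connectors; that's a simplification (real boards balance the rails through their shunt/VRM layout), but it reproduces the ~127W / +70% figures above.

```python
# Back-of-envelope check of the 6-pin loading under a flashed power limit,
# assuming the slot supplies its 75W share and the rest splits evenly
# between the 6-pin and 8-pin. Real cards balance rails via their shunts,
# so treat this as an estimate, not a measurement.

PCIE_SLOT_W = 75    # PCI-SIG slot limit
SIX_PIN_W = 75      # 6-pin connector spec
EIGHT_PIN_W = 150   # 8-pin connector spec

def six_pin_load(board_power_w):
    """Estimated 6-pin draw if the aux connectors share the load evenly."""
    aux = board_power_w - PCIE_SLOT_W
    return aux / 2

for limit in (275, 330, 375):
    w = six_pin_load(limit)
    over = (w / SIX_PIN_W - 1) * 100
    print(f"{limit}W BIOS -> ~{w:.1f}W on the 6-pin ({over:+.0f}% vs spec)")
```

At the 330W limit this lands on ~127.5W per aux connector, i.e. 70% over the 6-pin's 75W rating, matching the numbers in the post.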


----------



## Wishmaker

Quote:


> Originally Posted by *mouacyk*
> 
> For those flashing a 330W (or 375W) BIOS meant for a 2x8pin card to the FE, aren't you worried about forcing >=127W over the 6pin where it's only spec'ed for 75W? That's 70% more power than what the 6pin connector is rated for.
> Be prepared for possible burning plastic smell if you run furmark or high loads for any long period of time. I burned my PCIe cable with a 980 Ti, but that's because I was forcing ~325W through one cable (400W furmark load - 75W PCIe = 325W split over a y-connector.)


This is OCN! If you smell something burning you push more volts, and in this case watts!


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> I finally bought 3DMark. Firestrike Extreme won't even start up for me. It just crashes instantly, what's wrong with it? Even with no OC. It must be another issue.
> 
> It says test cancelled by user. The demo runs fine.


you gotta run rivatuner in stealth mode.
Quote:


> Originally Posted by *Wishmaker*
> 
> This is OCN! If you smell something burning you push more volts, and in this case watts!


just keep a can of Duster handy and cool it with that.


----------



## Lefty23

Quote:


> Originally Posted by *KedarWolf*
> 
> I tried the MSI gaming X, had a bit of throttling, pretty much the same as stock FE BIOS.
> 
> The Gigabyte BIOS was clocks all over the place, 500, 800, 1000, never worked right at all. People said the same about the Aorus BIOS as the Gigabyte.
> 
> The Strix BIOS I get sustained 2100 at 1.075v, best one so far.


Nice, thanks man.
I was just reading through the "How To Flash Strix ....." thread looking for more info.
If I do find time during the weekend I will try the Strix and post some results there.


----------



## NYU87

Good bye old friend,


----------



## Unnatural

Just ordered an EVGA FE and EK waterblock. Let's go study this thread!


----------



## alucardis666

Well FedEx now says my package was delivered... I was outside a whole 45 minutes before the delivery time and did not see any FedEx people... So my Titan Xp says it was delivered, but in actuality was not. I have a case # from FedEx now.

As for Amazon, they refunded me for the Kryonaut they were unable to deliver...

This is shaping up to be a great week!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> Well FedEx now says my package was delivered... I was outside a whole 45 minutes before the delivery time and did not see any FedEx people... So my Titan Xp says it was delivered, but in actuality was not. I have a case # from FedEx now.
> 
> As for Amazon, they refunded me for the Kryonaut they were unable to deliver...
> 
> This is shaping up to be a great week!


***, is someone stealing everything in your area somehow?


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> ***, is someone stealing everything in your area somehow?


Idk!?! How would they know, though? The Amazon product was their own carrier so I'll let that one slide, but this is FedEx we're talking about here.

I'm thinking of installing a security cam for my porch. And I've never had any issues like this before this week. In fact my GF ordered some stuff off amazon just last week that made it here just fine.


----------



## Bishop07764

Quote:


> Originally Posted by *kevindd992002*
> 
> Why would higher power limits allow you to do lower volts to hit the max clocks? Isn't it the other way around (allow you to use higher voltage to hit max clocks)? Are you saying that you're always power-limited no matter what?


I was just trying to say that under water, these things will be power limited almost always. For instance, I can hit the power limit easily in some benchmarks at 2063 core just by adding +160 or so to the core. I can get 2076 if I use the curve and drop my voltage to 1.031 or so, but I still hit the power limit even then. Hope that makes sense.
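The undervolt trick described above works because dynamic power scales roughly with frequency times voltage squared, so dropping a couple of voltage bins frees up power budget for clocks. A rough sketch using the reported points (the clock/voltage pairing here is illustrative, and P ~ f·V² is only a first-order approximation):

```python
# Why dropping voltage buys clock headroom under a power limit:
# dynamic power scales roughly with frequency * voltage^2.
# The two points below mirror the numbers in the post; the scaling
# law is a first-order approximation, not a measurement.

def rel_power(mhz, volts):
    """Relative dynamic power, P ~ f * V^2 (arbitrary units)."""
    return mhz * volts ** 2

stock = rel_power(2063, 1.093)   # letting the boost table pick the voltage
curve = rel_power(2076, 1.031)   # curve-tuned: higher clock, lower voltage

print(f"curve point draws ~{curve / stock:.0%} of the stock point's power")
```

Even with 13MHz more on the core, the curve-tuned point draws roughly a tenth less power, which is exactly the headroom that keeps the card off the power limiter a little longer.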


----------



## Bishop07764

Quote:


> Originally Posted by *Jaysend*
> 
> Hoping for some help here:
> 
> I have an EVGA card and it was running great on air, overclocked well and all was good except for the atrocious fan noise.
> So I decided to finally go with a full custom water loop. I got an EK Titan water block and back plate. Followed directions and was pretty meticulous. Went to fire it up and all the lights came on for a split second, then click and nothing. I checked the PSU and tested it thoroughly, then moved on to testing on the bench. I thought it was the motherboard, so I got another motherboard. Then I thought it was the CPU, so I got another CPU.
> 
> Same thing happened. So I went back to the PSU and tested one connection at a time. It posted with everything connected. The last thing to connect was the GPU, and it would not turn on. I removed the card and everything turned on. So with power to the GPU, whether or not it is actually in the PCIe slot, nothing turns on. In the slot but with no power, the system will turn on.
> 
> What would cause this? Am I correct that my GPU is dead? It must be a short that is causing the system to shut down.
> 
> Now I am really worried. If I get a replacement GPU, will this happen again? Did I damage the new mobo and CPU?
> 
> I will try and get my hands on another video card to test.
> 
> Any ideas, tips, suggestions???
> 
> Thanks!!!


That sucks. Did something fall into the pcie slot? Anything stuck on the bottom of the card? Sounds like it's shorting out on something. I'd definitely test another card.


----------



## ottoore

Quote:


> Originally Posted by *mouacyk*
> 
> For those flashing a 330W (or 375W) BIOS meant for a 2x8pin card to the FE, aren't you worried about forcing >=127W over the 6pin where it's only spec'ed for 75W? That's 70% more power than what the 6pin connector is rated for.
> Be prepared for possible burning plastic smell if you run furmark or high loads for any long period of time. I burned my PCIe cable with a 980 Ti, but that's because I was forcing ~325W through one cable (400W furmark load - 75W PCIe = 325W split over a y-connector.)


6- and 8-pin connectors are electrically identical; the 2 extra pins are ground.
If your PSU doesn't use 22AWG cables on the 6-pin, you won't have any problem.


----------



## alucardis666

Quote:


> Originally Posted by *ottoore*
> 
> 6- and 8-pin connectors are electrically identical; the 2 extra pins are ground.
> If your PSU doesn't use 22AWG cables on the 6-pin, you won't have any problem.


Good to know!


*EDIT:*

Also... if anyone has some leftover *Thermal Grizzly Kryonaut* they wouldn't mind sending me, I'd be very appreciative! I can pay for shipping via PayPal. I've just tried to order it 2x now from 2 different e-tailers and had issues getting it.


----------



## illypso

Quote:


> Originally Posted by *SlimJ87D*
> 
> That's nice and all, but there's also just going too far and wasting time on something that might not even get you gains or even negatively impact you.
> 
> It's like do all this soldering to get a 25% chance at 2% more fps or 75% chance of just getting the same or worse performance.
> 
> The reward for the risk wouldn't be worth it for me. Heck, the shunt mod wasn't worth it for me, not when the Asus bios is giving me similar results.


As I said, for me, just trying it for fun is enough to justify the attempt even if I get no result. I'm just a little crazy on that side... I love testing.

New BIOSes were out after I made my shunt mod, but I would have still done it physically, because we don't know all the possible hidden consequences of having another BIOS on an FE card (although chances are there are none).

And also, I'm talking about more VOLTS (not the shunt mod); if we could do it by bios everyone would try it (think about 1.1500V).

I'm starting to look for a small 1K pot. If more volts give better results, then we can hope for a bios unlock mod eventually for those who don't want to risk their card (which I understand greatly). But if not, then we would know that it's useless to get more volts (unless LN2, I presume), but at least we would know it.


----------



## KingEngineRevUp

Playing around more with the ASUS bios, and the extra power is definitely more stable, at least with these new drivers, which I think have lowered my performance.

Stock Below: Has a drop most likely due to power limiting



ASUS Below



I guess the ASUS bios is finally working out for me now.

Also, these little babies might be helping me drop just that 2C below thermal throttling. Had to use chopsticks to get them under the shroud, lol.


----------



## Jbravo33

no luck with gigabyte bios. gpu was all over the place. strix seems to be most stable, but I very faintly hear coil whine. if the glass is on I can't hear it at all. after 4 hours of squad last night the highest temp the gpus saw was 55. pump at 30% and fans on silent mode. what a difference watering a card makes. don't see myself on air ever again.


----------



## SortOfGrim

old vs new
also got the EK fwb on pre-order


----------



## Addsome

Quote:


> Originally Posted by *SlimJ87D*
> 
> Playing around more with the ASUS bios, and the extra power is definitely more stable, at least with these new drivers, which I think have lowered my performance.
> 
> Stock Below: Has a drop most likely due to power limiting
> 
> 
> 
> ASUS Below
> 
> 
> 
> I guess the ASUS bios is finally working out for me now.
> 
> Also, these little babies might be helping me drop just that 2C below thermal throttling. Had to use chopsticks to get them under the shroud, lol.


Maybe you should try a lower voltage and hold a constant core clock on the FE bios? I'm getting 103fps with the same settings as you running [email protected] and [email protected] The Asus bios lowered my scores.


----------



## eXteR

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Your score is too low for 2100 flashing AIB BIOS and clocking higher doesn't mean its actually faster
> 
> We dealt with this back with 1080s


Quote:


> Originally Posted by *SlimJ87D*
> 
> Playing around more with the ASUS bios, and the extra power is definitely more stable, at least with these new drivers, which I think have lowered my performance.
> 
> Stock Below: Has a drop most likely due to power limiting
> 
> 
> 
> ASUS Below
> 
> 
> 
> I guess the ASUS bios is finally working out for me now.
> 
> Also, these little babies might be helping me drop just that 2C below thermal throttling. Had to use chopsticks to get them under the shroud, lol.


Did you use a copper shim between the die and cooler to make contact?

I want to mod it like yours, but I saw that the only block that can be installed with the original baseplate is the EVGA one, because of the gap that the pump has.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Idk!?! How would they know though? And the Amazon product was their own carrier so I'll let that one slide, this is fedex we're talking about here.
> 
> I'm thinking of installing a security cam for my porch. And I've never had any issues like this before this week. In fact my GF ordered some stuff off amazon just last week that made it here just fine.


I have a porch cam. Porch cam is THE BOMB. I can't believe your Titan X-lil"p" is missing. HOLY MOTHER OF GOD

**Warning, Aorus Extreme in stock on the Egg. Won't last long.

This Aorus would have passed an extreme test. I am gaming a few bins higher than the Extreme's "OC mode" profile. Just played BF1 basically 5 hours straight. No crashes. Max gpu temp was 67c, basically the same as an aggressive-fan FE with Thermal Grizzly. So whatever the delta is between a stock-pasted aggressive-fan FE and the Aorus is what you get from the TG:K. LOVING this card.
Quote:


> Originally Posted by *Jbravo33*
> 
> no luck with gigabyte bios. gpu was all over the place. strix seems to be most stable, but very faintly hear coil whine. if glass is on cant hear at all. after 4 hours of squad last night highest temp gpus saw were 55. pump 30% and fans were on silent mode. what a difference watering a card does, don't see myself on air ever again.


I want to see someone try the Aorus bios and set up the curve from the 1.05v point, flattening it out after that point. Run with NO extra volts; I bet it does much better. Max the power limit, too. I am getting 2050 to hold at 1.05v, then it drops to 2037 @ 1.042 or so. But it does not like any higher voltage points than that. IT HAS NEVER POWER THROTTLED on me, even in Heaven and Timespy.
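The curve setup described above can be sketched in code: pick the clock you want at the 1.05v point and clamp every higher-voltage point to that clock, which is what dragging the upper part of the curve flat in Afterburner does. The curve values and the `flatten_curve` helper below are illustrative, not taken from any real card.

```python
# Minimal sketch of "flatten the curve after the 1.05v point".
# Voltage/frequency values are made up for illustration.

def flatten_curve(curve, cap_v, cap_mhz):
    """Clamp every point at or above cap_v to cap_mhz, like dragging
    the upper part of Afterburner's V/F curve flat."""
    return {v: (cap_mhz if v >= cap_v else mhz) for v, mhz in curve.items()}

# hypothetical stock voltage -> clock points (V -> MHz)
stock = {1.000: 1987, 1.025: 2012, 1.050: 2050,
         1.062: 2062, 1.075: 2075, 1.093: 2088}

tuned = flatten_curve(stock, 1.050, 2050)
# every point from 1.050v up now requests 2050 MHz, so the card never
# boosts into voltage bins it can't hold under the power limit
```

The point of the flat section is that the boost algorithm can still drop voltage when hot or power-limited, but it never *raises* voltage past the chosen point chasing a higher bin.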


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Addsome*
> 
> Maybe you should try a lower voltage and have a constant core clock on the FE bios? Im getting 103fps same settings as you running [email protected] and [email protected] Asus bios lowered my scores.


I do have some logs of older scores on the stock bios and you're right, it was higher before at lower voltages.


----------



## Addsome

Quote:


> Originally Posted by *SlimJ87D*
> 
> I do have some logs of older scores on the stock bios and you're right, it was higher before at lower voltages.


Ya I think the best bios is the stock one.


----------



## Slackaveli

Quote:


> Originally Posted by *Jbravo33*
> 
> no luck with gigabyte bios. gpu was all over the place. strix seems to be most stable, but very faintly hear coil whine. if glass is on cant hear at all. after 4 hours of squad last night highest temp gpus saw were 55. pump 30% and fans were on silent mode. what a difference watering a card does, don't see myself on air ever again.


Quote:


> Originally Posted by *SlimJ87D*
> 
> Playing around more with the ASUS bios, and the extra power is definitely more stable, at least with these new drivers, which I think have lowered my performance.
> 
> Stock Below: Has a drop most likely due to power limiting
> 
> 
> 
> ASUS Below
> 
> 
> 
> I guess the ASUS bios is finally working out for me now.
> 
> Also, these little babies might be helping me drop just that 2C below thermal throttling. Had to use chopsticks to get them under the shroud, lol.


You use Fujipoly under those? I like that mod; it's got to be worth 1-3 degrees. Also, those 5x mems get hella hot. I noticed a 2c drop going from +650 to +275, with negligible if any framerate decrease. I'm sure that is lottery based too (perf), but temps are going up even if perf is, too.

In my Battlefield tests today I was on Conquest, and I was locked at 144 a large portion of the time, almost every time I looked up. There were a few times it would be ~136. This Aorus 1080 Ti / 5775C combo is a destroyer.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Addsome*
> 
> Ya I think the best bios is the stock one.


Yeah, I don't know why it only benefits one person in this entire forum, lol.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, I don't know why it only benefits one person in this entire forum, lol.


It seems to remove the power limit for a few bins, so it can stabilize or add a bin or two; that's about it. It seems to help about 20% or so. I bet some more will find a sweet spot with it. I think you must be on water to get the gains. The Strix has more power, and that's more heat, so the power limit becomes a thermal/voltage limit. IF you have a golden chip and are under water, it can help, though.


----------



## Jaysend

Quote:


> Originally Posted by *Bishop07764*
> 
> That sucks. Did something fall into the pcie slot? Anything stuck on the bottom of the card? Sounds like it's shorting out on something. I'd definitely test another card.


The same thing happens on 3 different motherboards. Also, the system starts up and behaves fine with the GPU plugged into the PCIe slot; it's only when power is hooked up that nothing turns on. I tried multiple cables and PSUs. I think something in the GPU is shorted. I took it apart and couldn't find anything obvious...
Just got an RMA from the egg.


----------



## Slackaveli

Quote:


> Originally Posted by *Jaysend*
> 
> The same thing happens on 3 different motherboards. Also, the system starts up and behaves fine with the GPU plugged into the PCIe slot; it's only when power is hooked up that nothing turns on. I tried multiple cables and PSUs. I think something in the GPU is shorted. I took it apart and couldn't find anything obvious...
> Just got an RMA from the egg.


Best bet. If you don't have another GPU handy to test, just go grab a 1060 from Best Buy for the day and return it. Only requires $200.


----------



## Foxrun

How are you guys cooling the FE? I just got my two but the hybrid kits on EVGA are sold out.


----------



## Addsome

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, I don't know why it only benefits one person in this entire forum, lol.


I actually don't know if it is helping him. My scores at 1080p are higher than what he shows @2100MHz, and I'm only running 2038-2063MHz.
Quote:


> Originally Posted by *Foxrun*
> 
> How are you guys cooling the FE? I just got my two but the hybrid kits on EVGA are sold out.


I'm using the 1080 hybrid kit on my Ti.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> seems to remove power limit for a few bins, so it can stabilize or add a bin or two, that's about it. It seems to help about 20% or so. I bet some more will find a sweet spot with it. I think you must be on water to get the gains. Strix has more power and that's more heat so the power limit becomes thermal/voltage limit. IF you have a golden chip and are underwater, it can help, though.


My temps are at 45C, check my previous post. I was stable at 2114MHz with low temps and constant voltage.

I'm wondering where the extra power actually comes from. The 6-pin or our mobos? It could be that my motherboard doesn't benefit from drawing that extra power.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Addsome*
> 
> I actually don't know if it is helping him. My scores at 1080p are higher than what he shows @2100Mhz and im only running 2038-2063MHz.
> Im using the 1080 hybrid kit on my ti.


Yeah, same here. I agree. Everyone I've talked to has said the Asus Strix BIOS hurts their score.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> My Temps are at 45C,check my previous post. I was at a stable 2114Mhz low Temps and voltage was constant.
> 
> I'm wondering where the extra power actually comes from. The 6-pin or our mobos? It could be my motherboard. Doesn't benefit from drawing that extra power.


Ahhhh, thinking back to the PCB breakdown (on GamersNexus, I believe), there is some 'sharing' going on. I bet you are right, it's coming from the PCB. So, if you have a ridiculous mobo maybe it's useful, but mostly it's just scary.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Foxrun*
> 
> How are you guys cooling the FE? I just got my two but the hybrid kits on EVGA are sold out.


Poor man's hybrid. Look at my previous post a page back.


----------



## kevindd992002

Quote:


> Originally Posted by *Unnatural*
> 
> Just ordered an EVGA FE and EK waterblock. Let's go study this thread!


Why'd you choose the EVGA FE? Is it purely because of good after-sales support?

Quote:


> Originally Posted by *Bishop07764*
> 
> I was just trying to say that under water, the things will be power limited almost always. For instance, I can hit the power limit easily in some benchmarks at 2063 core by just adding +160 or so to core. I can get 2076 if I use the curve and drop my voltage to 1.031 or so. But I still hit the power limit even then. Hope that makes sense.


And that if you're not power-limited, your clocks will be more stable. That's just it? Nothing else?

Quote:


> Originally Posted by *Addsome*
> 
> I actually don't know if it is helping him. My scores at 1080p are higher than what he shows @2100Mhz and im only running 2038-2063MHz.
> Im using the 1080 hybrid kit on my ti.


We're referring to Kedarwolf here, right?


----------



## Foxrun

Quote:


> Originally Posted by *Addsome*
> 
> I actually don't know if it is helping him. My scores at 1080p are higher than what he shows @2100Mhz and im only running 2038-2063MHz.
> Im using the 1080 hybrid kit on my ti.


Did you install VRM heatsinks, or are you just keeping the blower fan for that?


----------



## Addsome

Quote:


> Originally Posted by *kevindd992002*
> 
> We're pertaining to Kedarwolf here, right?


Yes


----------



## Unnatural

Quote:


> Originally Posted by *kevindd992002*
> 
> Why'd you choose the EVGA FE? Is it purely because of good after-sales support?


Actually it was the cheapest one I could find here in Italy (less than 740€ shipping included).


----------



## Slackaveli

Quote:


> Originally Posted by *Unnatural*
> 
> Actually it was the cheapest one I could find here in Italy (less than 740€ shipping included).


It's got the best warranty, you did right. Also, no warranty void for cooling mods is nice, too.


----------



## Slackaveli

Aorus X drops in Canada tomorrow morning. Get your F5 finger stretched out.


----------



## Eco28

Sorry guys for not reading the entire thread here, but is there a long-story-short on the FE vs non-FE watercooled scenario? Planning to buy me some 1080 Ti goodness any day now.

1. I am tempted to go FE and sink it with an EKWB block just because it is "universal" and more common. Based on people's experience with 1080s, I heard that the extra $ is not worth it for non-FE when watercooling; you just pay for a better air cooler there.

2. On the other hand, some say non-FE cards usually have a superior power section, which allows for better overclocking, stability, and a longer lifespan of the card.

PS. I am still running two GTX 670s by Asus on a custom PCB and am not really amused by their OC potential (especially compared to reference cards back then). I know it can vary from model to model, card to card, generation to generation, but I feel like the custom PCB was only marketing. Does it still hold true on today's 10xx generation?

I like to rely on forum members' personal experience here rather than magazine tests. Appreciate your input.


----------



## KedarWolf

Quote:


> Originally Posted by *VanuSovereignty*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Emissary of Pain*
> 
> Hey all
> 
> Random question, what temps are you guys getting at idle ? ... With P&T limit set to max with fan speed set to 70%, I idle around 50c
> 
> Something is wrong, is this FE? My FE idles between 24 and 27C with 1:1 fan ratio.
> 
> Quote:
> 
> 
> 
> Originally Posted by *SlimJ87D*
> 
> That's nice and all, but there's also just going to far and wasting time on something that might not even get you gains or even negatively impact you.
> 
> It's like do all this soldering to get a 25% chance at 2% more fps or 75% chance of just getting the same or worse performance.
> 
> The reward for the risk wouldn't be worth it for me. Heck, the shunt mod wasn't worth it for me, not when the Asus bios is giving me similar results.
> 
> 
> Totally not worth it. What are you seeing with the Asus Strix BIOS without the shunt-mod? Any additional voltage? I'm somewhat interested in this as I have some thermal-overhead to play around with and my card currently can stay at 70C at 70% RPM so having my fan RPM reduced to 75% isn't an issue, unless I up the voltage significantly.
> 
> Quote:
> 
> 
> 
> Originally Posted by *mtbiker033*
> 
> anyone else get the new windows update yesterday, install the new geforce drivers for it and notice that in games the color's seem washed out? less performance?
> 
> 
> Redo your settings in NVCP, for me with the PG278Q I like:
> 
> Brightness: 47%
> Contrast: 52%
> Gamma: 98
> 
> Digital Vibrance: 53% (any more than 55% with this monitor and reds and blues look way way too saturated).
> 
> Edit: Don't forget to select "prefer maximum performance" per game / application (don't set this in global or your browsers will run at 1500MHz lol) Oh and don't install GeFarce experience! I installed it this time around with 378.78 out of curiosity and it dropped my Firestrike Normal GPU score over 1k points and I wasn't even using ShadowPlay to record anything, just having GeFarce running whatever processes was that impactive. And 90% of the time the recommended settings are garbage!
> 
> On water no shunt mod. I've seen throttling seem to start as low as 50C on old BIOS but I know for sure it'll throttle at 60C and higher several bins depending how much you are over.
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I tried the MSI gaming X, had a bit of throttling, pretty much the same as stock FE BIOS.
> 
> The Gigabyte BIOS was clocks all over the place, 500, 800, 1000, never worked right at all. People said the same about the Aorus BIOS as the Gigabyte.
> 
> The Strix BIOS I get sustained 2100 at 1.075v, best one so far.
> 
> 
> So youre holding 2100MHz with Strix at 1.075v with no shunt-mod? Is this because youre keeping the core under 50C? What do you think I would see at 70C with the same voltage?
> 
> Quote:
> 
> 
> 
> Originally Posted by *alucardis666*
> 
> Well FedEx now says my package was delivered... I was outside a whole 45 minutes before the delivery time and did not see any FedEx people... So my Titan Xp says it was delivered, but in actuality was not. I have a case # from FedEx now.
> 
> As for Amazon, they refunded me for the Kryonaut they were unable to deliver...
> 
> This is shaping up to be a great week!
> 
> 
> Man that sucks! I would be so paranoid and upset it wouldn't even be funny. That's why for high dollar items I always pay a little more for signature required, I used to work for the USPS and man, if there is no signature, we don't know the value of the item in the box and we would leave them right on door-mat's sometimes, we would take an effort to hide them but if there is no place to hide them and no signature required they go right on the doormat out in the open.
> 
> If it's signature required, you have to sign for it, the courier cannot leave it anywhere.


----------



## Eco28

Quote:


> Originally Posted by *VanuSovereignty*
> 
> I would base my decision on the cost and availability of said water-block and GPU....


I now have the option to get the FE, ASUS Strix, or MSI Gaming X cards. EKWB has usually made full-cover water blocks for Asus cards, and perhaps a few of the other most popular models, after some time. Though I'd assume it may take months to get my hands on other custom water blocks, and I am just not sure if it's worth waiting. Even without a BIOS flash or soldering, is the FE noticeably behind non-FE cards when it is not thermally throttled? Thanks for sharing.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *VanuSovereignty*
> 
> Something is wrong, is this FE? My FE idles between 24 and 27C with 1:1 fan ratio.
> Totally not worth it. What are you seeing with the Asus Strix BIOS without the shunt-mod? Any additional voltage? I'm somewhat interested in this as I have some thermal-overhead to play around with and my card currently can stay at 70C at 70% RPM so having my fan RPM reduced to 75% isn't an issue, unless I up the voltage significantly.
> Redo your settings in NVCP, for me with the PG278Q I like:
> 
> Brightness: 47%
> Contrast: 52%
> Gamma: 98
> 
> Digital Vibrance: 53% (any more than 55% with this monitor and reds and blues look way way too saturated).
> 
> Edit: Don't forget to select "prefer maximum performance" per game / application (don't set this in global or your browsers will run at 1500MHz lol) Oh and don't install GeFarce experience! I installed it this time around with 378.78 out of curiosity and it dropped my Firestrike Normal GPU score over 1k points and I wasn't even using ShadowPlay to record anything, just having GeFarce running whatever processes was that impactive. And 90% of the time the recommended settings are garbage!
> So youre holding 2100MHz with Strix at 1.075v with no shunt-mod? Is this because youre keeping the core under 50C? What do you think I would see at 70C with the same voltage?
> Man that sucks! I would be so paranoid and upset it wouldn't even be funny. That's why for high dollar items I always pay a little more for signature required, I used to work for the USPS and man, if there is no signature, we don't know the value of the item in the box and we would leave them right on door-mat's sometimes, we would take an effort to hide them but if there is no place to hide them and no signature required they go right on the doormat out in the open.
> 
> If it's signature required, you have to sign for it, the courier cannot leave it anywhere.


I don't really get anything out of it; if anything, it performs worse for me, as it does for many other users. I just did another test.



What I did learn is that my card likes more memory offset at higher core clocks. If I had the memory at +400MHz, I'd get lower scores or even a crash. Kinda weird.

Stock Below


ASUS Below


Otherwise, +400 memory at a core below 2100MHz actually hurt my scores.


----------



## Benny89

My rules are simple:

1. Hybrid kit or water loop - FE
2. Air - only AIBs
3. Lazy? Hybrid cards


----------



## krutoydiesel

Quote:


> Originally Posted by *Jaysend*
> 
> Hoping for some help here:
> 
> I have an EVGA card and it was running great on air, overclocked well and all was good except for the atrocious fan noise.
> So I decided to finally go with a full custom water loop. I got an EK Titan water block and back plate. Followed directions and was pretty meticulous. Went to fire it up and all the lights came on for a split second, then click and nothing. I checked the PSU. So I tested it thoroughly and moved on to testing on the bench. I thought it was the motherboard so I got another motherboard. Then I thought it was the CPU so I got another CPU.
> 
> Same thing happened. So I went back to the PSU and tested one connection at a time. It posted with everything connected. The last thing to connect was the GPU. It would not turn on. I removed the card and everything turned on. So power to the GPU, whether or not it is actually in the PCIe slot, causes nothing to turn on. In the slot but with no power, the system will turn on.
> 
> What would cause this? Am I correct that my GPU is dead? It must be a short that is causing the system to shut down.
> 
> Now I am really worried. If I get a replacement GPU, will this happen again? Did I damage the new mobo and CPU?
> 
> I will try and get my hands on another video card to test.
> 
> Any ideas, tips, suggestions???
> 
> Thanks!!!


*1:* It most definitely is a short.
*2:* You most likely did not damage the mobo or the CPU.

*Solution:* this will most likely work - remove the back-plate and try again. If that still doesn't work, remove the block and the back-plate and plug the card in naked (no danger, as the GPU won't be under load); it should post. Turn it off once it does.

Source: I had the same issue with my 980 Ti and the Swiftech back-plate. As soon as I took off the back-plate it worked just fine.


----------



## RavageTheEarth

Quote:


> Originally Posted by *Jaysend*
> 
> The same
> Thing happens on 3 different motherboards. Also the system starts up and behaves fine with the GPU plugged
> Into the Pcie slot. It is only when power is hooked up. I tried multiple cables and PSUs. I think something in the GPU is shorted. I took it apart and couldn't find anything obvious...
> Just got an RMA from the egg.


I remember you saying this happened after you put a waterblock on. If you tighten it too much and there is too much pressure on the GPU core, it will not boot. It happened to me before. Have you tried putting the stock cooler back on?


----------



## cekim

Quote:


> Originally Posted by *RavageTheEarth*
> 
> I remember you saying this happened after you put a waterblock on. If you tighten it too much and there is too much pressure on the gpu core it will not boot. It happened to me before. Have you tried putting the stock cooler back on?


GentlYEEEE...

Bare die, don't crank that thing.


----------



## hebrewbacon

Quote:


> Originally Posted by *Benny89*
> 
> My rules are simple:
> 
> 1. Hybri kit or water loop - FE
> 2. Air - only AIBs
> 3. Lazy? Hybrid cards


Definitely agree with this. FE cards are horrible if you don't swap it out with a better cooler. However, they are the best if you plan to watercool as they have the most options when it comes to waterblocks.


----------



## Blotto80

Quote:


> Originally Posted by *cekim*
> 
> GentlYEEEE...
> 
> Bare die, don't crank that thing.


I did that with a 290x (actually two, because I don't learn lessons) and a Kraken G10. When it came time to install the Hybrid on my Ti, I was so worried that I'd tighten it too much that I set my screwdriver to ratchet in the loosen direction, and as soon as it started clicking I stopped. That seemed to be a perfect amount of pressure; I max out around 48C @ 2062MHz.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Blotto80*
> 
> I did that with a (actually two because I don't learn lessons) 290x and a Kraken G10. When it came time to install the Hybrid on my Ti, I was so worried that I'd tighten it too much that I set my screw driver to ratchet in the loosen direction and as soon as it started clicking I stopped. That seemed to be a perfect amount of pressure, I max out around 48c @ 2062mhz.


Really? I just get a power tool and drill it on unevenly at 1000 ft-lbs until the fasteners strip and the die permanently deforms. I then RMA it and repeat the process until I finally get one that works.

In all seriousness, that's why it's important to use a steel backplate like the Kraken has. It will stop the board from flexing. Definitely good to just go finger tight and then do 1/8 to 1/4 of a turn all around.


----------



## Slackaveli

Quote:


> Originally Posted by *hebrewbacon*
> 
> Definitely agree with this. FE cards are horrible if you don't swap it out with a better cooler. However, they are the best if you plan to watercool as they have the most options when it comes to waterblocks.


Not really true. If you put the Thermal Grizzly on them and run a 1.1:1 fan curve, it'll be loud but fairly chilled. After two hours of Heaven, the TG FE was at 71C; my Aorus was the same. After I thermal-grizzed the Aorus, it is at 64C in the same scenario.

So, results from Thermal Grizzly Kryonaut on an Aorus: 4C idle, 7C-8C full load improvement. VERY nice. I can't imagine why they don't use better thermal paste with all the effort they put into cooling. I guess because it requires human hands.

Time to go see what my 2 hour high temp mark is in BF1 now.
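For reference, a 1.1:1 curve just maps fan duty to 1.1x the core temperature. A minimal sketch of that mapping (the 30%/100% floor and ceiling here are arbitrary assumptions, not anyone's actual settings):

```python
def fan_percent(temp_c, ratio=1.1, floor=30, ceiling=100):
    """Map core temperature (C) to fan duty (%) for a simple N:1 curve.

    ratio=1.1 gives the 1.1:1 curve discussed above; floor and ceiling
    clamp the duty cycle to something the fan controller will accept.
    """
    return max(floor, min(ceiling, round(temp_c * ratio)))
```

So at a 71C load temp this asks for roughly 78% fan, which is why it gets loud.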


----------



## cekim

Quote:


> Originally Posted by *Blotto80*
> 
> I did that with a (actually two because I don't learn lessons) 290x and a Kraken G10. When it came time to install the Hybrid on my Ti, I was so worried that I'd tighten it too much that I set my screw driver to ratchet in the loosen direction and as soon as it started clicking I stopped. That seemed to be a perfect amount of pressure, I max out around 48c @ 2062mhz.


It can handle more than that, but it just doesn't need it.

I always have to temper my torque as I'm used to car parts (and iron and steel ones at that), so pinkies out, small screwdrivers, etc.

I've not broken any of those stupidly small screws Nvidia now uses on their back plates. Watching Gamer's Nexus donk those was pretty funny, but they are stupidly tiny.

My finding last night is one of my cards is a bit of a nag compared to the other. I see 110% peak TDP to get/stay above 2000MHz on one card, but 126% with the Strix BIOS (FE cards, EK water blocks, Strix BIOS).


----------



## KingEngineRevUp

Quote:


> Originally Posted by *cekim*
> 
> It can handle more than that, but it just doesn't need it.
> 
> I always have to temper my torque as I'm used to car parts (and iron and steel ones at that), so pinkies out, small screw drivers, etc... )
> 
> I've not broken any of those stupidly small screws Nvidia now uses on their back plates. Watching Gamer's Nexus donk those was pretty funny, but they are stupidly tiny.
> 
> My finding last night is one of my cards is a bit of a nag compared to the other. I see 110% peak TDP to get/stay above 2000MHz on one card, but 126% with the Strix BIOS (FE cards, EK water blocks, Strix BIOS).


It's important to use a #00 Phillips on them; they shouldn't break. I know they had the right screwdriver, not sure why they broke them.


----------



## Jaysend

Quote:


> Originally Posted by *krutoydiesel*
> 
> *1:*It most definitely is a short
> *2:*You most likely did not damage mobo or the CPU
> 
> *Solution:* this most likely will work - remove the back-plate and try then. If that still doesn't work, remove the block and back-plate, plug it in naked- no danger as the GPU won't be under load, then turn it off once it posts- it should post.
> 
> Source: Had the same issue with my 980 TI and the Swiftech back-plate I had. As soon as I took off the back-plate it worked just fine.


Thanks,
Great minds think alike. I did try naked and same thing happened. (Also tried with stock cooler first.)
Baffled. Once the water block was in it did run for a little while so maybe a coincidence...


----------



## krutoydiesel

Quote:


> Originally Posted by *Jaysend*
> 
> Thanks,
> Great minds think alike. I did try naked and same thing happened. (Also tried with stock cooler first.)
> Baffled. Once the water block was in it did run for a little while so maybe a coincidence...


So were you able to get it to work or still no dice?


----------



## Jaysend

Quote:


> Originally Posted by *Slackaveli*
> 
> best bet. if you dont have another gpu handy to test, just go grab a 1060 from best buy for the day and return it. Only requires $200.


Yeah I just found my old 660ti. Just have to add a temporary section to my loop to not blow water all over.


----------



## Slackaveli

Quote:


> Originally Posted by *Jaysend*
> 
> Yeah I just found my old 660ti. Just have to add a temporary section to my loop to not blow water all over.


The loop certainly complicates GPU issues...

@alucardis666 update on your Titan? I'm trippin' for you, bro! That thing is worth its weight in silver, like ACTUALLY, and I think the FedEx guys are getting their hustle on.


----------



## Jaysend

Quote:


> Originally Posted by *krutoydiesel*
> 
> So were you able to get it to work or still no dice?


Nope still no dice. Off to RMA. Hoping it's not something inherent to the waterblock... going to try an old gpu as soon as I get the water loop rerouted.


----------



## krutoydiesel

Quote:


> Originally Posted by *Jaysend*
> 
> Nope still no dice. Off to RMA. Hoping it's not something inherent to the waterblock... going to try an old gpu as soon as I get the water loop rerouted.


You didn't accidentally drop any screws or gouge any of the components, did you?


----------



## Jbravo33

The Asus BIOS absolutely helped me achieve my highest score. Was 92nd but keep getting bumped. Won't be on the list for long, but I'm content seeing that there are not many 6850s on there, if any. All 8 or 10 cores. I think you need to be on water to see the card stabilize better with the BIOS. Oh hey Jay!


----------



## KedarWolf

Quote:


> Originally Posted by *cekim*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Blotto80*
> 
> I did that with a (actually two because I don't learn lessons) 290x and a Kraken G10. When it came time to install the Hybrid on my Ti, I was so worried that I'd tighten it too much that I set my screw driver to ratchet in the loosen direction and as soon as it started clicking I stopped. That seemed to be a perfect amount of pressure, I max out around 48c @ 2062mhz.
> 
> 
> 
> It can handle more than that, but it just doesn't need it.
> 
> I always have to temper my torque as I'm used to car parts (and iron and steel ones at that), so pinkies out, small screw drivers, etc... )
> 
> I've not broken any of those stupidly small screws Nvidia now uses on their back plates. Watching Gamer's Nexus donk those was pretty funny, but they are stupidly tiny.
> 
> My finding last night is one of my cards is a bit of a nag compared to the other. I see 110% peak TDP to get/stay above 2000MHz on one card, but 126% with the Strix BIOS (FE cards, EK water blocks, Strix BIOS).

Strix BIOS for me even at 1.093v never goes above 111% TDP.

Might have some to do with my power supply as well though, Corsair AX1500i with separate cables to each PCI-E power connector.


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> Strix BIOS for me even at 1.093v never goes above 111% TDP.
> 
> Might have some to do with my power supply as well though, Corsair AX1500i with separate cables to each PCI-E power connector.


I use a separate direct power cable to each 8-pin as well. It's the way to go. Love my EVGA G2 1000W.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Jbravo33*
> 
> 
> The asus bios absolutely helped me achieve highest score. Was 92nd but keep getting bumped. Won't be on list for long, but content seeing that there's not many 6850's on there if any. All 8 or 10 cores. I think u need to be on water to see card stabilize better with bios. Oh hey jay!


Did you actually do a very thorough test comparison between the two, or did you just flash and forget? I actually did a thorough comparison between the MSI, Asus, and stock BIOSes at 2063, 2075, 2088, 2100, and 2114, while varying memory between +400 and +450, and stock won every time. That was 10 tests, and temperatures were at 45C, so there wasn't any thermal throttling (which kicks in around 51C).

MSI performed really close to stock at times, or Asus would, but neither was as consistent as stock.

I really suggest you do a thorough test before you assume the Asus is helping that much.
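For anyone wanting to run that kind of comparison, here's a minimal sketch of how the logged runs could be summarized per BIOS - mean score plus spread, so you can see consistency and not just peaks. All the scores below are made-up placeholders, not real results:

```python
from statistics import mean, stdev

# Hypothetical benchmark graphics scores keyed by BIOS, one tuple per
# (core MHz, memory offset, score) run. Substitute your own logged numbers.
runs = {
    "stock": [(2063, 400, 28510), (2088, 400, 28700), (2100, 450, 28840)],
    "asus":  [(2063, 400, 28390), (2088, 400, 28605), (2100, 450, 28720)],
}

def summarize(results):
    """Return (mean score, score spread) for one BIOS's runs."""
    scores = [score for _core, _mem, score in results]
    return mean(scores), stdev(scores)

def best_bios(runs):
    """Pick the BIOS with the highest average score across identical settings."""
    return max(runs, key=lambda bios: summarize(runs[bios])[0])
```

The point is to bench each BIOS at the *same* clock/memory combinations and compare averages and spread, rather than eyeballing one run.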


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> you use Fujipoly under those? I like that mod. It's got to be worth 1-3 degrees. Also, those 5x mems get hella hot. i noticed a 2c drop from +650 to +275, and negligible if any framerate decrease. I'm sure that is lottery based, too(perf) but temps are going up even if perf is, too.
> 
> In my battlefield tests today, i was on conquest, and i was locked 144 a large portion of the time. Almost everytime i looked up. There were a few times it would be 136~. This Aorus1080ti/5775c combo is a destroyer.


I believe 3M thermal adhesive is on them, and it's legitimately adhering. If one dropped in place it was very hard for me to peel it off; I mean hard with a chopstick, at least.

Yeah, I've noticed a 2C drop so far on my core. Any heat dissipation will help.

This morning the max I was getting while messing with the Asus BIOS was 45C.

I took measurements with a temperature gun and could see it was around 40C near the plate, and the heatsinks went from 20C to 38C, so they're definitely absorbing heat and dissipating it.


----------



## Jbravo33

Quote:


> Originally Posted by *SlimJ87D*
> 
> Did you actually do a very thorough test comparison between the two? Or did you just flash and forget? I actually did a thorough comparison between msi, Asus and stock at 2063, 2075, 2088, 2100, 2114 while varrying memory at 400 to 450 and stock won all the time. That was 10 test I did and temperatures were at 45C so there wasn't any thermal throttling at 51C.
> 
> MSI performed really close to stock at times or Asus would. But neither would be as consistent as stock.
> 
> I really suggest you do a thorough test before you assume the Asus is helping that much.


I'll check my notes tonight. Haha, I have notes, like what!? No, seriously though, I have everything written down, pages of benches. The MSI worked fine as well; I just went back to the Strix because I would leave the card at base speeds for gaming, and that was the highest at 1999.


----------



## Foxrun

It never stops feeling cool when you mod a GPU. Swapped the 1080 hybrid kit from my Titan Xp to the first 1080 Ti. Now I just have to wait for the next kit to arrive to install on the other one. Temps dropped drastically, from averaging 70C to now 40C! I stupidly lost a few of the power cables needed to hook up my second GPU, so I ordered a new PSU since they don't make cables for the HX1000W anymore.


----------



## KedarWolf

Quote:


> Originally Posted by *kevindd992002*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Unnatural*
> 
> Just ordered an EVGA FE and EK waterblock. Let's go study this thread!
> 
> Why'd you choose the EVGA FE? Is it purely because of good after-sales support?
> Quote:
> 
> 
> 
> Originally Posted by *Bishop07764*
> 
> I was just trying to say that under water. The things will be power limited almost always. For instance, I cab hit the power limit easily in some benchmarks at 2063 core by just adding +160 or so to core. I can get 2076 if I use the curve and drop my voltage to 1.031 or so. But I still hit the power limit even then. Hope that makes sense.
> 
> 
> And that if you're not power-limited, your clocks will be more stable. That's just it? Nothing else?
> Quote:
> 
> 
> 
> Originally Posted by *Addsome*
> 
> I actually don't know if it is helping him. My scores at 1080p are higher than what he shows @2100Mhz and im only running 2038-2063MHz.
> 
> Im using the 1080 hybrid kit on my ti.
> 
> 
> We're pertaining to Kedarwolf here, right?

I get mixed results: Heaven stock BIOS, 160 FPS; Heaven Asus BIOS, 152 FPS.

TimeSpy stock BIOS, 10458; TimeSpy Asus BIOS, 10868, or something close to these numbers - I'm in transit, on my phone.


----------



## eXteR

Quote:


> Originally Posted by *Foxrun*
> 
> 
> 
> 
> 
> 
> It never stops feeling cool when you mod a gpu. Swapped the 1080 hybrid kit from my titan xp to the first 1080ti. Now I just have wait for the next kit to come to install on the other one. Temps dropped drastically, from averaging 70c to now 40c! I stupidly lost a few power cables to hook up my second gpu, so I ordered a new psu since they dont make cables for the hx1000w anymore.


Hi mate,

I want to make exactly the same mod as you.

Did you connect the pump and radiator fan with the Y connector on the GPU, or to the motherboard?

I was thinking of making a cable mod so I can connect them to the motherboard, so the pump and radiator fan don't draw power from the card and eat into the power limit.


----------



## Jaysend

Quote:


> Originally Posted by *krutoydiesel*
> 
> You didn't accidentally drop any screws or whatnot or gauge any of the components did you?


I hope so, because that would be a tangible explanation, but sadly my forensics didn't turn up anything...


----------



## cekim

Quote:


> Originally Posted by *KedarWolf*
> 
> Strix BIOS for me even at 1.093v never goes above 111% TDP.
> 
> Might have some to do with my power supply as well though, Corsair AX1500i with separate cables to each PCI-E power connector.


EVGA 1600 - same - separate cable for each.


----------



## Foxrun

Quote:


> Originally Posted by *eXteR*
> 
> Hi mate,
> 
> i want to make exactly the same mod as you.
> 
> Do you connected the pump and radiator fan, with the Y connector on the GPU, or to the motherboard?
> 
> Where thinking to make a cable mod, so i can connect to the motherboard so the pump and radiator fan don't draw power from the board and eat the power limit.


I connected the pump to the gpu along with the gpu fan. The radiator fan is connected to the mobo.


----------



## Bishop07764

Quote:


> Originally Posted by *Jaysend*
> 
> The same
> Thing happens on 3 different motherboards. Also the system starts up and behaves fine with the GPU plugged
> Into the Pcie slot. It is only when power is hooked up. I tried multiple cables and PSUs. I think something in the GPU is shorted. I took it apart and couldn't find anything obvious...
> Just got an RMA from the egg.


Hopefully the RMA will get it sorted out. Sorry about all that hassle. Hopefully the replacement wait will reward you with a golden card.








Quote:


> Originally Posted by *Foxrun*
> 
> How are you guys cooling the FE? I just got my two but the hybrid kits on EVGA are sold out.


Full cover EK block! It's awesome. I thought the founders blower sucked quite badly to be honest. But my expectations were completely unreasonable since I have been watercooling for a number of years now.









Quote:


> Originally Posted by *kevindd992002*
> 
> And that if you're not power-limited, your clocks will be more stable. That's just it? Nothing else?


Yes. Watercooled, you will most likely run right into the power limit wall. You need to have a decent chip too, of course. Mine won't do 2100 no matter what I do, but it holds 2063 almost constantly in games until I hit the said power limit occasionally. Hitting the limit happens more frequently in benchmarks, much less frequently in actual games from what I can tell. My card will do 2076 or so, but it scores slightly lower in benchmarks; it's my card's wall. Haven't tried other BIOSes yet, but I have been scoring higher graphics scores than some of the guys running 2100 or higher, so I'm not sure it's even worth me bothering with. Watercooling will make your clocks a lot more consistent by taking out the main hurdle with air coolers, aka heat, and then you most likely run into the said voltage/power limit issues next.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Jbravo33*
> 
> i'll check my notes tonight. haha i have notes. like what!? no seriously tho i have everything written down pages of benches. the msi worked fine as well i just went back to strix cause i would leave card at base speeds for gaming. and that was the highest 1999


Yeah, I have a folder with the runs saved, each named with the core clock, memory, and voltage of the run, plus the BIOS and the time and date.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *eXteR*
> 
> Hi mate,
> 
> i want to make exactly the same mod as you.
> 
> Do you connected the pump and radiator fan, with the Y connector on the GPU, or to the motherboard?
> 
> Where thinking to make a cable mod, so i can connect to the motherboard so the pump and radiator fan don't draw power from the board and eat the power limit.


I recommend connecting the pump directly to the PSU with an adapter. You'll need a mini 4-pin PWM to standard 4-pin PWM adapter.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> 
> not really true. if you put the thermal grizz on them and run a 1.1:1 fan curve, it'll be loud but fairly chilled. Two hours of heaven, TG FE was at 71c, my Aorus was the same. After I thermal grizzed the Aorus it is 64c same scenario.
> 
> So, results from Thermal Grizzly Kyronaut on an Aorus: 4c idle, 7c-8c full load improvement. VERY nice. I cant imagine why they dont use better thermal paste with all the effort they put into cooling. I guess because it requires human hands.
> 
> Time to go see what my 2 hour high temp mark is in BF1 now.


I just put in an order for Kryonaut paste so I can apply it to my STRIX and 5775C when I get them.

Is it hard to dismantle an AIB card? I've never done that to change thermal paste. Can I break something?


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> i use a separate direct power to each 8-pin as well. It's the way to go. Love my Evga G2 1000w.


EVGA G2 1000W here too







Love G2 PSUs.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> I just put order on Kryonout paste so I can apply it to my STRIX and 5775C when I get them.
> 
> Is it hard to dismantle AIB card? Never done that to change thermal paste. Can I break something??


The Aorus was a pain. Easy peasy, seven screws and pull, but unplugging those fan and LED plugs... I gave up and just had my daughter hold it while I cleaned/pasted. Took 10-15 minutes is all. The FE was easy.

You're gonna love that 5775C.

After a couple hours of BF1, I was gaming at 60-61°C the first hour and a half, but then Amiens came on, so the high according to HWMonitor was 62°C. The same scenario this morning, pre-paste, was 66°C. My CPU (65°C) was actually running hotter than my GPU throughout.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> the aorus was a pain. easy peasy 7 screws an pull, but, unplugging those fans and led plugs... i gave up and just had my daughter hold it while i cleaned/pasted. took 10-15 minutes is all. FE was easy.
> 
> you gonna love that 5775c.
> 
> After a couple hours of BF1, i was gaming at 60-61 first hour and a half but Amiens came on so the high i according to HWmonitor was 62. Same situation this morning pre-paste was 66c. My cpu was actually running hotter than my gpu (65c) throughout.


Nice. But I'll void the warranty on the Asus when I dismantle it and change the paste, right?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> Nice. But I will void warranty on Asus when I dismantle it and change paste, right?


They won't know


----------



## Slackaveli

Yeah, if you tell them, but they won't open it up just to check for that. Just don't break the fan plugs!
If it's too hard, abort. We are talking about 4-5°C here. Wanted, but not needed.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> Yeah, if you tell them, but they wont open it up just to check for that. just don't break the fan plugs!
> If it's too hard, abort. We are talking about 4-5c here. Wanted but not needed.


Well, if I break anything I have 14 days to return it anyway







God bless EU law I guess.

I will try to find some YT video about dismantling the STRIX.

How do you guys prefer to apply paste? I prefer the single-dot-in-the-middle technique, but I know some like the X technique.


----------



## manolith

Got my GPU last night. Been testing it today, and 4K 60fps is here. The 980 Ti was great, but this is just on another level.


----------



## Bishop07764

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Those are some fantastic temps under water! With my 980 Ti I get around 46C under load. I'm interested in seeing what the temps will be like when EK releases the waterblock for my Aorus.


Thanks! The last 3 full cover blocks that I've had from EK have done great. This one is doing phenomenally, but it isn't pushing the same kind of thermal load as my old 780 Lightning with its 860-watt BIOS. Feel the burn







And my ambient temps aren't too bad yet. I'm sure that temps will go up over the summer somewhat.


----------



## mtbiker033

Quote:


> Originally Posted by *manolith*
> 
> got my gpu last night. been testing it today and 4k 60fps is here. 980ti was great but this is just in another level.


wow still rocking the i7-920? old skool!!!!


----------



## optimus002

Upgraded from a 980 Ti to an FE 1080 Ti. I forgot how loud the fan was, but I planned on getting a hybrid kit. I was wondering if someone could post exactly what mods are needed, and where, for the 1080 hybrid kit to fit. Thanks for any help.


----------



## prelude514

My buddy who works at Canada Computers just called to let me know they received 2 Aorus Extremes and 4 Aorus non extremes. Both Extremes were snagged right away, but I had him put a regular aside for me. Tomorrow my MSI FE gets traded in.







Can't wait to see what she can do!


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Well, if I break anything I have 14 days to return it anyway
> 
> 
> 
> 
> 
> 
> 
> God bless EU law I guess.
> 
> I will try to find some YT video about dismanting STRIX.
> 
> How do you guys prefer to apply paste? I prefer single dot in the middle technique. But I know some like X technique.


The GPU needs full coverage. You have to use the smear method, all of it, even the sides.


----------



## Slackaveli

Quote:


> Originally Posted by *mtbiker033*
> 
> wow still rocking the i7-920? old skool!!!!


he is good to go (to an extent) too, because he is in 4k. Oh the irony.
Quote:


> Originally Posted by *prelude514*
> 
> My buddy who works at Canada Computers just called to let me know they received 2 Aorus Extremes and 4 Aorus non extremes. Both Extremes were snagged right away, but I had him put a regular aside for me. Tomorrow my MSI FE gets traded in.
> 
> 
> 
> 
> 
> 
> 
> Can't wait to see what she can do!


Welcome to the Aorus club. It's a sexy beast. I recommend stock volts; you start at +79 core, so it'll probably hold another +50-100 core. Mine's at +76; it's at 2067 at first, then 2050 until temps get into the low 60s (where they stay), then it levels out at 2025 and stays there. If I were under water, this card could hold 2067+ @ 1.05V.


----------



## Foxrun

Quote:


> Originally Posted by *optimus002*
> 
> Upgraded from a 980Ti to a FE 1080 Ti. Forgot how loud the fan was, but I planned on getting a hybrid kit. I was wondering if someone could post what and where are the exact mods needed for the 1080 hybrid kit to fit. Thanks for any help.


Look back at my previous posts. I slapped a 1080 hybrid kit on it, needing only a screwdriver kit and some thermal paste.


----------



## Slackaveli

The best bang for your buck is still Thermal Grizzly Kryonaut. ~5°C for basically 10 minutes and 10 dollars. That's solid, considering a hybrid kit works out to about $5-7 per 1°C gained. So if you ARE going hybrid, for sure get Kryonaut too. And if you are staying on air, Kryonaut gives you a bin or turbo's worth of performance easily. Think of it like this: if the manufacturers offered another GPU that was the same but averaged 5°C lower, would ANYBODY pass it up if it was only $10 more?
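To put quick numbers on the bang-for-buck claim, here's a minimal sketch; the ~$10 / ~5°C repaste figures and the $5-7 per °C hybrid figure are the ones quoted above, not measurements of mine:

```python
# Cost per degree Celsius of cooling improvement, using the figures
# quoted above: ~$10 Kryonaut repaste for ~5C; hybrid kit at $5-7 per 1C.
def cost_per_degree(total_cost_usd, degrees_gained_c):
    """Dollars spent per degree Celsius gained."""
    return total_cost_usd / degrees_gained_c

paste_cost_per_c = cost_per_degree(10, 5)   # the repaste case
print(f"Kryonaut repaste: ${paste_cost_per_c:.2f}/C")
print("Hybrid kit:       $5.00-$7.00/C (as quoted)")
```

So the repaste comes out roughly 2.5-3.5x cheaper per degree than the hybrid kit, under those assumed figures.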


----------



## KedarWolf

Hey peeps, when bench testing with Heaven or 3DMark or running any game, you want to change your Nvidia control panel to Single Display Performance Mode and Power Management Mode to Prefer Maximum Performance in the actual game profile.

If you set Prefer Maximum Performance in Global settings your card may not downclock properly when just web browsing and stuff.

Using these options will give you a bit more stability and give you an extra bin or two at the same voltages.


----------



## optimus002

I saw that. I wanted to see the mods done to the shroud of the hybrid to make it look more complete. I've seen one person's attempt, but I needed more info on what and where he modified things, just to be 100% sure. I don't want to **** it up, especially since this is my first attempt at modding a GPU!


----------



## Boyd

I can officially be a member of this fancy, seductive, flirtatious, erotic owners club









- MSI 1080 Ti GAMING X, my second MSI card other than a reference GTX 480 that I used to have. Upgraded from a Zotac AMP! Extreme 980 Ti


----------



## jarble

Just got my bitspower blocks installed







Also tried out the shunt mod, but I ran out of Liquid Ultra faster than I thought I would, so I'm not seeing the numbers others are. Still, I'm now able to hold the low 40s with heavy stress testing at 2060. The block is for the Titan but is compatible; you just need to add more thermal pads than they call for to cover the new/extra VRM area. Note that the LED wire naturally runs right over the lowest shunt, so be sure to route it up past the caps instead.









Air cooled fire strike ultra score 11807
Water cooled fire strike ultra score 12875
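For what it's worth, that air-to-water jump works out to about a 9% gain; a quick sketch using the two scores above:

```python
# Percent improvement between the two Fire Strike Ultra runs above.
air_score, water_score = 11807, 12875

gain_pct = (water_score - air_score) / air_score * 100
print(f"Fire Strike Ultra gain under water: {gain_pct:.1f}%")  # ~9%
```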


----------



## bloodhawk

Some benches i did with my 1080Ti (non power modded) -

http://www.3dmark.com/fs/12164642 - 7714

http://www.3dmark.com/spy/1490385 - 10 033

http://www.3dmark.com/fs/12195287 - 24 507


----------



## prelude514

Quote:


> Originally Posted by *Slackaveli*
> 
> he is good to go (to an extent) too, because he is in 4k. Oh the irony.
> Welcome to the aorus side club. It's a sexy beast. I recommend stock volts and you start at +79 core so it'll prob hold another +50-100 core. Mine's at +76, it is at 2067 at first, then 2050 until temps get in the low 60s (where it stays), level out at 2025 and stay there. If I was under water this card can hold 2067+ @ 1.05v


Thanks brother. Really looking forward to it. What fan speed are you running for those temps, and what's the approximate ambient temperature? I plan on running it at 100%, really don't mind the noise. I'm running my FE at 100% and it's completely fine for me. Loud = reference 290X in my book. I'm hoping to maintain 2075-2100 with that generous 375W TDP!


----------



## illypso

Quote:


> Originally Posted by *Benny89*
> 
> EVGA G2 1000W here too
> 
> 
> 
> 
> 
> 
> 
> Love G2 PSUs.


I have the G3 1000W and I found it noisy. It's small, so air has a hard time passing through and makes more wind noise. But I changed the fan and connected it to my motherboard to control fan speed and noise, so it's OK now. I would have preferred the bigger G2; it must be quieter.


----------



## Jbravo33

Quote:


> Originally Posted by *KedarWolf*
> 
> Hey peeps, when bench testing with Heaven or 3DMark or running any game, you want to change your Nvidia control panel to Single Display Performance Mode and Power Management Mode to Prefer Maximum Performance in the actual game profile.
> 
> If you set Prefer Maximum Performance in Global settings your card may not downclock properly when just web browsing and stuff.
> 
> Using these options will give you a bit more stability and give you an extra bin or two at the same voltages.


I also noticed you had an option for CPU under power management. I don't have that option. Is that due to me using the 2nd GPU for PhysX instead of the CPU?


----------



## Nico67

Quote:


> Originally Posted by *alucardis666*
> 
> Golden chip


Quote:


> Originally Posted by *Slackaveli*
> 
> dang, dude, that's a DIAMOND gpu. Very rare. It must have been cut from the very center of the die. It's like the heart of a watermelon!


Thx, seems I got lucky this time







Cold water is what makes the difference and lets me use the voltage more noticeably than on previous cards, for me at least.

The Strix BIOS seems to give similar benchmark results to the original, at least on "The Division" benchmark. It drops the power limit nicely, but it does seem to have some voltage limit spikes even when it has more voltage bins available, or when you give it more than it should need at that clock. A bit weird, but it does allow me to go a bit higher again.

2152 is OK at 1.081V, benchable at least, and max power is down to 107% through 5 passes.



2164 I got to pass once at 1.093V; it might be more usable with more voltage, but that's all that's available for now.



The voltage limit causes a little dip or two, but it's OK on power at least.









I think 2139 or less is where it's gameable, and probably better on the original BIOS, except I would be power limited with a memory bump. It would be close though.

Around +549 seemed to scale well, +597 didn't see a huge jump, and +649 took a hit, so somewhere around +500 is probably safe. Notably, it was a nice jump in performance and only added about 4% power, up to 111%.



didn't play with driver settings either which may have helped.


----------



## Slackaveli

Quote:


> Originally Posted by *Boyd*
> 
> I can officially be a member of this fancy, seductive, flirtatious, erotic owners club
> 
> 
> 
> 
> 
> 
> 
> 
> 
> - MSI 1080 Ti GAMING X, my second MSI card other than a reference GTX 480 that I used to have. Upgraded from a Zotac AMP! Extreme 980 Ti


welcome, sir


----------



## hotrod717

-2144


----------



## Slackaveli

Quote:


> Originally Posted by *prelude514*
> 
> Thanks brother. Really looking forward to it. What fan speed are you running for those temps, and what's the approximate ambient temperature? I plan on running it at 100%, really don't mind the noise. I'm running my FE at 100% and it's completely fine for me. Loud = reference 290X in my book. I'm hoping to maintain 2075-2100 with that generous 375W TDP!


It's honestly inaudible until ~75% and never gets louder even at 100%; that's just 2700 rpm on this card. I just run a 1:1 curve, except I drop it to zero at 39°C and it goes into full silent fan-stop mode. Thing is awesome. Here, I didn't have this luxury while I waited. Get you a taste, bro 




Mine is several bins better than his, and he's not a good OCer at all; keep that in mind on his tests. The vid has a breakdown, though, and that was cool. I did it today already. Those fans are hard to unplug; he had trouble, too.


----------



## illypso

Just got my 2nd 1080 Ti, popped it in with an old ribbon SLI bridge. Seems to work well too.

About 2000 MHz, jumping around because the 2nd is stock, so temp and power limited.
Did not tune anything.

Firestrike 26186 http://www.3dmark.com/fs/12251685
Firestrike ultra 12389 http://www.3dmark.com/fs/12251853

I feel like my CPU is holding me back.

But no more money for now









Bread and water for a while, then a new CPU, or maybe full water for both.


----------



## Slackaveli

Quote:


> Originally Posted by *illypso*
> 
> Just got my 2nd 1080ti, pop it in with a old ribon sli bridge. Seems to work well too
> 
> about 2000mhz jumping around because the 2nd it stock so temp and power limit.
> Did not tune anything
> 
> Firestrike 26186 http://www.3dmark.com/fs/12251685
> Firestrike ultra 12389 http://www.3dmark.com/fs/12251853
> 
> I feel like my cpu is holding me.
> 
> But no more money for now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> bread and water for a while than a new CPU or maybe full water for both.


You are being held by that ribbon, homey. You need an HB bridge, stat!

OMG, you are on a 4790k and just admitted money is tight now. GET THE 5775c and sell that 4790k. I have already talked 5 people into it, and all of them have done it for $50 or less net. Some broke even. like @Palotte, he sold his 4790k and got his 5775c on zero dollars. That is the best upgrade ever, easy peasy, and freeasy.


----------



## cyc2

I just got a FE 1080 Ti, and it runs at 2050mhz 1.062v max fan speed. What overclock can I expect to reach if I put it under water?


----------



## cekim

Quote:


> Originally Posted by *cyc2*
> 
> I just got a FE 1080 Ti, and it runs at 2050mhz 1.062v max fan speed. What overclock can I expect to reach if I put it under water?


2050 @ 1.062v or lower









Don't expect it to improve - the benefit is if you keep it cool, you won't clock down and it will be quieter.


----------



## guttheslayer

Guys, I'm encountering flashing of the screen at boot-up. Is that normal? This never happened with my previous card.

Using an X99 Sabertooth with an Aorus 1080 Ti 11G here.


----------



## Jbravo33

Quote:


> Originally Posted by *illypso*
> 
> Just got my 2nd 1080ti, pop it in with a old ribon sli bridge. Seems to work well too
> 
> about 2000mhz jumping around because the 2nd it stock so temp and power limit.
> Did not tune anything
> 
> Firestrike 26186 http://www.3dmark.com/fs/12251685
> Firestrike ultra 12389 http://www.3dmark.com/fs/12251853
> 
> I feel like my cpu is holding me.
> 
> But no more money for now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> bread and water for a while than a new CPU or maybe full water for both.


Do custom runs and shut off physics and combined. Focus on getting your graphics score as high as possible. That's what I did; I was able to get 59100, then do full runs. Once I did that, I was able to get my best overall, 32645. But on that run, which was my best, graphics was 57 and some change. Couldn't quite get the highest of all three in one run.


----------



## illypso

Quote:


> Originally Posted by *Slackaveli*
> 
> You are being held by that ribbon, homey. You need an HB bridge, stat!
> 
> OMG, you are on a 4790k and just admitted money is tight now. GET THE 5775c and sell that 4790k. I have already talked 5 people into it, and all of them have done it for $50 or less net. Some broke even. like @Palotte, he sold his 4790k and got his 5775c on zero dollars. That is the best upgrade ever, easy peasy, and freeasy.


Just tested Watch Dogs 2. I get 80% on both cards, but ALL my CPU cores hit 99% lol.

For the CPU switch, it's interesting, but where do I sell it? Kijiji, or is there a special place?

I'm in Canada, and if I check newegg.ca, the 4790K new is $519 and the 5775C is $744. I don't see how I could sell it and switch for a $100 difference ($50 US converted to CAD lol). If I could just throw in $100 and make the switch, I would do it.
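Rough math on the swap, using the Newegg.ca prices quoted above and an assumed used-sale price for the 4790K (the used price is a hypothetical figure, not from the thread):

```python
# New-for-new price difference vs. selling the old CPU used, in CAD.
# The two Newegg.ca prices are from the post; the used-sale price for
# the 4790K is a hypothetical figure, not from the thread.
new_4790k_cad = 519
new_5775c_cad = 744

new_for_new = new_5775c_cad - new_4790k_cad
print(f"New-for-new difference: ${new_for_new} CAD")

used_sale_cad = 400  # hypothetical used price for the 4790K
net_cost = new_5775c_cad - used_sale_cad
print(f"Net cost if the 4790K sells used for ${used_sale_cad}: ${net_cost} CAD")
```

So the realistic net cost depends almost entirely on what the used 4790K actually fetches.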


----------



## Slackaveli

Quote:


> Originally Posted by *guttheslayer*
> 
> Guys, I encounter flashing of the screen at boot up, is that normal? This never happen to my previous card.
> 
> Using a X99 Sabertooth with Aorus 1080 Ti 11G here.


Lemme guess: you have at least the 378.78 drivers? 378.92, 381.xx, whatever the latest one is? Those drivers have a flicker issue. It lasts like 2 seconds.


----------



## cekim

Quote:


> Originally Posted by *guttheslayer*
> 
> Guys, I encounter flashing of the screen at boot up, is that normal? This never happen to my previous card.
> 
> Using a X99 Sabertooth with Aorus 1080 Ti 11G here.


Yes. Did not happen with 1080, 980ti or 980, but I definitely and consistently see it now with an RVE.


----------



## cekim

Quote:


> Originally Posted by *Slackaveli*
> 
> lemme guess. You have at least 378.78 drivers? 378.92, 381.xx whatever the latest one is? Those drivers have a flicker issue. lasts like 2 seconds.


This happens during BIOS vga/EFI init.


----------



## pez

Quote:


> Originally Posted by *Slackaveli*
> 
> the aorus was a pain. easy peasy 7 screws an pull, but, unplugging those fans and led plugs... i gave up and just had my daughter hold it while i cleaned/pasted. took 10-15 minutes is all. FE was easy.
> 
> you gonna love that 5775c.
> 
> After a couple hours of BF1, i was gaming at 60-61 first hour and a half but Amiens came on so the high i according to HWmonitor was 62. Same situation this morning pre-paste was 66c. My cpu was actually running hotter than my gpu (65c) throughout.


Any pics by chance?


----------



## manolith

Quote:


> Originally Posted by *mtbiker033*
> 
> Quote:
> 
> 
> 
> Originally Posted by *manolith*
> 
> got my gpu last night. been testing it today and 4k 60fps is here. 980ti was great but this is just in another level.
> 
> 
> 
> wow still rocking the i7-920? old skool!!!!
Click to expand...

Yeah, still rocking it LOL. This bad boy is a legend. The first cards on it were good old GTX 285s a LONG time ago. Things have not changed much in the GPU department though; the latest pixel crunchers are a must.


----------



## KedarWolf

Quote:


> Originally Posted by *illypso*
> 
> Just got my 2nd 1080ti, pop it in with a old ribon sli bridge. Seems to work well too
> 
> about 2000mhz jumping around because the 2nd it stock so temp and power limit.
> Did not tune anything
> 
> Firestrike 26186 http://www.3dmark.com/fs/12251685
> Firestrike ultra 12389 http://www.3dmark.com/fs/12251853
> 
> I feel like my cpu is holding me.
> 
> But no more money for now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> bread and water for a while than a new CPU or maybe full water for both.


Use TWO ribbon bridges or get a high bandwidth SLI bridge. I read two ribbons work the same as one high bandwidth.


----------



## KedarWolf

Quote:


> Originally Posted by *hotrod717*
> 
> Just thought i would share. Unfortunately didnt do a proper capture on higher score. 5960x btw
> 
> 
> 
> 
> 
> Just about done with testing if anyone is interested, shoot me a pm.


I don't know how you're getting such high scores on one 1080 Ti.

Here's the best I can do with the FE BIOS at 2062 core, 6132 memory.


----------



## KedarWolf

Quote:


> Originally Posted by *hotrod717*
> 
> Just thought i would share. Unfortunately didnt do a proper capture on higher score. 5960x btw
> 
> 
> 
> 
> 
> Just about done with testing if anyone is interested, shoot me a pm.


Even with a custom 720 preset.


----------



## Slackaveli

Quote:


> Originally Posted by *pez*
> 
> Any pics by chance?


Naw, but I posted a video earlier where they do the breakdown. It's super easy except for unplugging the fans/LEDs; four of those buggers. I left 3 plugged in and let my daughter hold it.

If you mean my temps, I just wrote them down, but this is the 3rd GPU I have re-pasted with Kryonaut in 2017 and all got between 5-8°C gains. My EVGA Hybrid got 8°C all day. Both 1080 Tis got 5°C.
Quote:


> Originally Posted by *illypso*
> 
> just tested watch dog 2, I get 80% on both card but ALL my CPU core hit 99% loll
> 
> For the cpu switch its interesting but where do I sell that, kijiji ? or there is a special place.
> 
> I'm in canada and if I check newegg.ca the 4790K new is 519$ and the 5775C is 744$ I don't see how I could sell it and switch for 100$ (converted 50$us to ca lol)difference. If I could just throw a 100$ and have the switch I would do it.


Hmm, we all did eBay. I think ~$100 American was what I expected on that exchange, but we all beat it. eBay has the 4790K going for ~$300 pretty much steady, and the 5775C is $359 right now @ B&H Photo in the US. In Canada, I just don't know, brother.

Edit: went up $10, demand is up lately. I wonder why







https://www.bhphotovideo.com/c/product/1181928-REG/intel_bx80658i75775c_core_i7_5775c_3_3_ghz.html original list price was like $419 or something.


----------



## kevindd992002

Quote:


> Originally Posted by *Bishop07764*
> 
> Yes. Watercooled will run you right into the power limit wall most likely. You need to have a decent chip too of course. Mine won't do 2100 no matter what I do. But it holds 2063 almost constantly in games until I hit the said power limit occasionally. Hitting the limit happens more frequently in benchmarks. Much less frequently in actual games from what I can tell. My card will do 2076 or so, but it scores slightly lower in benchmarks. It's my cards wall. Haven't tried other BIOS yet, but I have been scoring higher graphics scores than some of the guys running 2100 or higher so I'm not sure that It's even worth me bothering with. Watercooling will make your clocks a lot more consistent by taking out the main hurdle with air coolers aka heat and then you run into the said voltage power limit issues most likely next.


Is it because the temps are so low that you get past the thermal throttle wall, and GPU Boost 3.0 will just dynamically OC the card to the point that it hits the power limit?

Quote:


> Originally Posted by *Slackaveli*
> 
> gpu needs full coverage. have to smear method, all of it, even the sides.


Why the sides?


----------



## jase78

Today I got the new EK block pre-ordered for my FE. Excited to water-cool a GPU for the first time!! I will be expanding my Swiftech 240X loop with an additional rad (HWL 280 GTS). Can't wait till the block ships to get started!

These are the best runs I've had so far: http://www.3dmark.com/spy/1465070 http://www.3dmark.com/fs/12240231 Do they seem average/below average?


----------



## NYU87

This MSI GTX 1080 Ti Gaming X is pretty beast @ 2.1GHz... almost 32K graphics score in Firestrike.


----------



## Slackaveli

Quote:


> Originally Posted by *kevindd992002*
> 
> Is it because the temps are so low you get past the thermal throttle wall and GPU Boost 3.0 will just dynamically OC the card to such a point that it hits the power limit?
> 
> Why the sides?


I'm sure it doesn't matter as much as the top, but there are a couple mm around all the sides. Why leave them naked? If you notice, the pre-applied stuff covers the sides. Pretty much the only way to do it wrong is to leave places bare; as long as it is covered, even some extra is OK. GPU and CPU are different. A pea, X, or line is fine for a CPU. For a GPU, cover that sucker.


----------



## Boost240

Quote:


> Originally Posted by *Slackaveli*
> 
> The best bang for your buck still thermal grizzly kyro. ~5c basically for 10 minutes and 10 dollar. That's solid. Considering that a hybrid kit is about $5-7 per 1c gained. So if you ARE going hybrid, FOR sure get Kyronaut, too. And if you are staying on air, Kyro gives you a bin or turbo worth of perf easily. Think of it like this. If the manufacturers offered another gpu that was the same but just was 5c lower average temp, would ANYBODY pass it up if it was only $10 more?


I have a ton of Conductonaut left over from delidding my CPU. Should be fine using this with the new EK blocks, no?


----------



## Hawk777th

I am going to go from TXM SLI to a 1080 Ti. I barely use SLI, so it seems it's a better option for me. Most of the software I use and games I play don't work with SLI at all.

I am looking at the EVGA cards. Is the FTW3 the best option from them? Should I be looking at something else?


----------



## straha20

Finally got the EK water block on my 1080 Ti. First time water cooling a GPU. I don't have my build complete yet, so I threw it into another system I have just to see if it would POST, as that system is not water cooled. After POST, I decided to load up Windows and look at it in HWMonitor and Afterburner. I then decided to update the video drivers. All in all, it was on and mildly used for about 15-20 minutes with only the water block to cool it, and the temps spiked up to 44°C at one point but hovered right around 38°C... so I'm thinking I got things right?


----------



## Juggalo23451

Quote:


> Originally Posted by *straha20*
> 
> Finally got the ek water block on my 1080ti. First time water cooling a gpu. I don't have my build complete yet, so I threw it into another system I have just to see if it would post, as that system is not water cooled. After post, I decided to load up windows and look at it in hwmonitor and afterburner. I then decided to update the video drivers. All in all, it was on and mildly used for about 15-20 minutes with only the water block to cool it, and the temps spike up to 44c at one point, but hovered right around 38...so I am thinking I got things right?


That seems about right to me. I still need to do some benching though. I delidded my cpu and I have a heatkiller water block on my MSI FE.


----------



## kevindd992002

Quote:


> Originally Posted by *Slackaveli*
> 
> im sure it doesnt matter as much as the top but there is a couple mm around all the sides. Why leave them naked? If you notice the pre applied stuff covers the sides. Pretty much the only way to do it wrong is to leave places bare. As long as it is covered, even some extra is ok. Gpu and cpu are different. pea or x or line is fine for cpu. gpu, cover that sucker.


But the heatsink never touches the side, does it?


----------



## straha20

Quote:


> Originally Posted by *Juggalo23451*
> 
> That seems about right to me. I still need to do some benching though. I delidded my cpu and I have a heatkiller water block on my MSI FE.


Just so I am clear here, I didn't have it actually installed in a loop yet, so no actual water in the block.


----------



## Jbravo33

Quote:


> Originally Posted by *NYU87*
> 
> This MSI GTX 1080 Ti Gaming X is pretty beast @ 2.1GHz... almost 32K graphics score in Firestrike.


Nice card !!


----------



## EvilPieMoo

Quote:


> Originally Posted by *NYU87*
> 
> This MSI GTX 1080 Ti Gaming X is pretty beast @ 2.1GHz... almost 32K graphics score in Firestrike.


You should be doing well over a 32K graphics score at 2.1GHz; I've pushed over 32K with only 2063.


----------



## NYU87

Quote:


> Originally Posted by *EvilPieMoo*
> 
> You should be doing well over 32K graphics score with 2.1Ghz, I've pushed over 32K with only 2063


It's probably not a 100% stable run; all I did was turn the core clock and memory sliders all the way up without really tinkering with it.

I'll hone in on max stable speeds, and I can probably increase my score as well.


----------



## KingEngineRevUp

Some OC discoveries of mine.

1. Memory and core can work with or against one another. Anything below 2050, +400 MHz on memory did better than +450. Above that, +450 was better.

2. The Asus Strix BIOS actually does have its uses.

Earlier I posted many of my benchmarking results showing the stock BIOS scoring better than the Asus Strix BIOS. I finally found a core clock and voltage where the stock and Asus BIOSes perform within 0.5% of one another in Heaven; the stock still scores a little higher. This is 2088 MHz on the core at 1.061V.

Although the stock BIOS scores higher, there are games that are heavily GPU intense, Witcher 3 being one of them. These games draw lots of power.

Gameplay is smoother and more stable in Witcher 3 with the Asus BIOS because of that extra 30 watts.

3. Even with 10% extra TDP, in Witcher 3 I can see slowdowns more often due to temperature. But the fps for Witcher 3 is very consistent on the Asus BIOS. This game demands a lot of power.

4. Building off of point 3, it turns out that the 10% from the Asus BIOS is not enough. From my understanding, the Asus BIOS only gives us access to an additional 30 watts of power. Well, Witcher 3 will definitely hit 111%, and then clocks will throttle a bin or even 2 bins down. I'm hitting power throttling even with the extra power from the Asus BIOS.

5. Guys with AIOs, be prepared to be thermally throttled if you're on a 120mm radiator. That extra 30 watts of power can push you from 45°C to 50°C, where you lose a bin in OCing.
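To translate those power-limit percentages into watts, here's a rough sketch. The +30 W for the Asus BIOS comes from the post; the 250 W stock board power is an assumption for illustration only, since the base figure varies by card and BIOS:

```python
# Rough translation of GPU power-limit percentages into watts.
# The +30 W headroom for the Asus Strix BIOS is from the post above;
# the 250 W stock board power is an assumed figure for illustration.
STOCK_BOARD_POWER_W = 250   # assumption; varies by card/BIOS
ASUS_EXTRA_W = 30           # per the post

def pct_to_watts(pct, base=STOCK_BOARD_POWER_W):
    """Translate a TDP percentage reading into an approximate wattage."""
    return base * pct / 100

print(f"100% -> {pct_to_watts(100):.0f} W")
print(f"111% -> {pct_to_watts(111):.0f} W "
      f"(near the assumed {STOCK_BOARD_POWER_W + ASUS_EXTRA_W} W ceiling)")
```

Under those assumptions, a 111% reading is already brushing up against the extra 30 W, which matches the throttling described in point 4.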


----------



## madmeatballs

Anyone here who is also planning to get an Aqua Computer block and backplate for their card?


----------



## feznz

Loving this card. Settled for 2038 MHz core, game stable @ max temp 55°C under water.
Not the best clocker, but it's a beast of a card even without an OC.


----------



## KedarWolf

My best TimeSpy score, clean Windows Creator Edition install, 378.92 Nvidia drivers, 2088 core, 6132 memory.

http://www.3dmark.com/3dm/19131596? 10921


----------



## jacknhut

How did you guys enable voltage control in MSI afterburner?

I tried everything but nothing works... I clicked on Unlock Voltage Control and Unlock Voltage monitoring in MSI Afterburner Settings. I edit MSIAfterburnner cfg file using notepad, I also edited MSIafterburner oem2 file using notepad and still didn't work.

Any other trick? I'm using newest version of MSI afterburner.


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> My best TimeSpy score, clean Windows Creator Edition install, 378.92 Nvidia drivers, 2088 core, 6132 memory.
> 
> http://www.3dmark.com/3dm/19131596? 10921
> 
> 
> See my signature for CPU, memory clocks etc.


----------



## KedarWolf

Quote:


> Originally Posted by *jacknhut*
> 
> How did you guys enable voltage control in MSI afterburner?
> 
> I tried everything but nothing works... I clicked on Unlock Voltage Control and Unlock Voltage monitoring in MSI Afterburner Settings. I edit MSIAfterburnner cfg file using notepad, I also edited MSIafterburner oem2 file using notepad and still didn't work.
> 
> Any other trick? I'm using newest version of MSI afterburner.


http://www.overclock.net/t/1625653/how-to-get-voltage-slider-in-afterburner-working-on-a-1080-ti/0_20


----------



## mtbiker033

Quote:


> Originally Posted by *manolith*
> 
> yeah still rocking it LOL. this bad boy is a legend. first cards on it where good old gtx285s a LONG time ago. things have not changed much in the gpu department though. latest pixel crunchers are a must.


that's awesome! and I bet at higher resolution when you're gpu bound it works fine.


----------



## guttheslayer

Quote:


> Originally Posted by *Slackaveli*
> 
> lemme guess. You have at least 378.78 drivers? 378.92, 381.xx whatever the latest one is? Those drivers have a flicker issue. lasts like 2 seconds.


Oh yes, only on these latest drivers. What is happening? Should I uninstall them?

It only happens during BOOT UP, before the Windows logo but after the BIOS splash screen.


----------



## PCBUILDER1980

Mine flickers before boot up too. Just a little annoying, but hopefully it gets fixed in a driver update.


----------



## Nico67

Quote:


> Originally Posted by *SlimJ87D*
> 
> Some OC discoveries of mine.
> 
> 3. Even with 10% extra TDP, In Witcher 3, I can see slow downs more often due to temperature. But the fps for Witcher 3 is very consistent on the Asus bios. This game demands a lot of power.
> 
> 4. Building off of point 3, it turns out that 10% from the Asus bios is not enough. From my understanding the Asus bios only gives us access to an additional 30 watts of power. Well the Witcher 3 will definitely hit 111% and then clocks will throttle a bin or even 2 bins down. I'm hitting power throttling with the extra power from the Asus bios.


That is curious. I have seen a lot of people at 111% on the Asus bios; has anybody seen higher? With an FE, that is.

275w x 111% = 305.25w, and I am kinda wondering if maybe we aren't getting 30w more, just a little more up to some other barrier, maybe a limit on one of the rails due to the way it draws power.

I haven't seen any power limit in AB at least, but I have seen voltage limit spikes.


----------



## eXteR

Quote:


> Originally Posted by *SlimJ87D*
> 
> I recommend connecting pump directly to PSU with an adapter. You'll need mini to normal pwm 4 pin


You mean the PSU, or the motherboard with 4-pin PWM?

Is there any cable I can buy, or do I have to build it myself?


----------



## Bishop07764

Quote:


> Originally Posted by *kevindd992002*
> 
> Is it because the temps are so low you get past the thermal throttle wall and GPU Boost 3.0 will just dynamically OC the card to such a point that it hits the power limit?


Yeah. The main problem is heat; then your next limits are voltage and power. Boost 3.0 does indeed complicate things compared to pre-Pascal cards. At total stock my gpu will boost to 1911 and stay there. But once you start adding to the core, Boost 3.0 decides where the final clocks will end. Some may get higher and others lower at the same +150 on the core in Afterburner. It depends on chip, cooling, and voltage.
Quote:


> Originally Posted by *NYU87*
> 
> This MSI GTX 1080 Ti Gaming X is pretty beast @ 2.1GHz... almost 32K graphics score in Firestrike.


I think if you up your memory clock some more you could easily get over 32k graphics score. Probably decent bump in Timespy too. It helped me out.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *eXteR*
> 
> You mean PSU or to motherboard with 4 pin PWM?
> 
> is there any cable that i can buy, or i have to build it myself?


On my old 980 Ti hybrid kit I got one of these adapters to feed power from the motherboard. But then I bought a SATA power to 4-pin adapter later and connected my pump directly to my PSU.

https://www.moddiy.com/products/4%252dPin-PWM-Fan-Connector-%28Female%29-to-4%252dPin-Mini-GPU-Fan-Connector-%28Male%29.html


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Nico67*
> 
> That is curious, I have seen a lot of people at 111% on the Asus bios, has anybody seen higher? with an FE that is.
> 
> 275w x 111% = 305.25w and I am kinda wondering if maybe we aren't getting 30w more just a little more up to some other barrier, maybe a limit on one of the rails due to the way it draws power.
> 
> I haven't seen any power limit in AB at least but I have seen voltage limit spikes


Actually, 100% for the Asus is 300 watts already. So if you flash the bios, then 110% is 330 watts; it's like you're running the shunt mod.


----------



## evassion

Evga 1080 ti, evga bios, no shunt mod on air. Waiting for an aio


Clocks are dancing a little... because of temps and the power limit.


----------



## Bishop07764

Hey guys. Some of us are getting a higher graphics score than the Titan X pp.









http://www.guru3d.com/news-story/nvidia-titan-xp-is-faster-as-custom-gtx-1080-ti.html


----------



## Asus11

Quote:


> Originally Posted by *evassion*
> 
> Evga 1080 ti, evga bios, no shunt mod on air. Waiting for an aio
> 
> 
> Clocks are dancing a litte.. because of temps and limit.


people still use orange?


----------



## PasK1234Xw

Quote:


> Originally Posted by *Bishop07764*
> 
> Hey guys. Some of us are getting a higher graphics score than the Titan X pp.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.guru3d.com/news-story/nvidia-titan-xp-is-faster-as-custom-gtx-1080-ti.html


A highly clocked Ti is on par with a stock Xp if you're lucky. Now let's see what the Xp looks like running at 2100:

http://www.3dmark.com/fs/12249956


----------



## evassion

In Spain, orange is not bad at all







I usually post from mobile.

Does the AIO help improve my results over air? I was thinking about sending the card back to the shop and getting an aftermarket one.


----------



## eXteR

Quote:


> Originally Posted by *SlimJ87D*
> 
> On my old 980 Ti hybrid kit I got one of these adapters to feed power from the motherboard. But then I bought a SATA power to 4-pin adapter later and connected my pump directly to my PSU.
> 
> https://www.moddiy.com/products/4%252dPin-PWM-Fan-Connector-%28Female%29-to-4%252dPin-Mini-GPU-Fan-Connector-%28Male%29.html


Wow, $10 for that small cable!

I'll buy one of these, and then change the connector to SATA to connect it to the PSU.

http://www.ebay.es/itm/121136389695

Thanks for the advice.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *eXteR*
> 
> wow 10$ for that small cable.!!
> 
> I'll buy one of this, and then change the connector with a SATA to connect into the PSU.
> 
> http://www.ebay.es/itm/121136389695
> 
> Thanks for the advice.


If you shop around that site you can find one for like $4 maybe. But yeah, if you can make your own, that's even better!


----------



## KingEngineRevUp

This is a noob question, but how do you get the latest Afterburner and RTSS to play nice with the 3DMark software? They don't like one another, and Firestrike won't run my OC...

*Stealth mode does nothing.*


----------



## Slackaveli

Quote:


> Originally Posted by *kevindd992002*
> 
> But the heatsink never touches the side, does it?


well, no, that's the point, eh? Now the paste touches paste that touches the shim lol. Idk, I'm just not gonna leave it naked.


----------



## Slackaveli

Quote:


> Originally Posted by *EvilPieMoo*
> 
> You should be doing well over 32K graphics score with 2.1Ghz, I've pushed over 32K with only 2063


not in Firestrike 1.1 he won't. That's in OG Firestrike.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> well, no, that's the point, eh? Noe the paste touches paste that touches the shim lol. Idk, im just not gonna leave it naked.


Paste has a thermal conductivity of around 12 W/m·K; aluminum is around 200 and copper around 400. So heat won't really travel fast through the paste. It will go faster through the silicon-to-copper or silicon-to-aluminum contact than through the paste on the sides lol.

I say don't paste the sides because it's easier to clean.
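The point that paste is a comparatively poor conductor can be sanity-checked with the one-dimensional conduction law Q = k·A·ΔT/L. A rough sketch with ballpark conductivities; the contact area, layer thickness, and temperature drop below are made-up illustrative numbers, not measurements:

```python
# One-dimensional steady-state conduction: Q = k * A * dT / L.
# Conductivities (W/m·K) are ballpark: good paste ~12, copper ~400.
def heat_flow_w(k, area_m2, thickness_m, delta_t_c):
    """Heat flow in watts through a flat layer of the given material."""
    return k * area_m2 * delta_t_c / thickness_m

area = 4e-4   # ~20 mm x 20 mm contact patch (assumed)
layer = 1e-4  # 0.1 mm layer thickness (assumed)
dt = 10.0     # 10 C across the layer (assumed)

paste = heat_flow_w(12, area, layer, dt)
copper = heat_flow_w(400, area, layer, dt)
print(copper / paste)  # copper path carries ~33x more heat than paste
```

So even generous paste on the die's sides moves a tiny fraction of the heat that the direct die-to-cooler contact does, which is the argument above.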


----------



## Slackaveli

Quote:


> Originally Posted by *jacknhut*
> 
> How did you guys enable voltage control in MSI afterburner?
> 
> I tried everything but nothing works... I clicked on Unlock Voltage Control and Unlock Voltage monitoring in MSI Afterburner Settings. I edit MSIAfterburnner cfg file using notepad, I also edited MSIafterburner oem2 file using notepad and still didn't work.
> 
> Any other trick? I'm using newest version of MSI afterburner.


it's all unlocked on Afterburner 4.4.0; I bet you are on 4.3.
Quote:


> Originally Posted by *guttheslayer*
> 
> Oh yes only on these latest drivers. What is happening? Should I uninstall them?
> 
> Only happens during BOOT UP before window logo but after BIOS trigger screen.


i've just been putting up with it, just glad to know it isnt my monitor or gear failing.


----------



## Slackaveli

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Highly clocked Ti on par with stock Xp if you're lucky, now lets see what Xp looks like running at 2100
> 
> http://www.3dmark.com/fs/12249956


that wasn't a stock Titan Xpp. It was tweaked. It won't clock as high; goldens may match it, and of course a golden Titan Xpp maxed out will win. But on AVERAGE, most of us will be running our 1080 Ti at higher perf than your average Titan Xpp user.
Quote:


> Originally Posted by *SlimJ87D*
> 
> Paste has a conduction rating of like 12. Aluminum is like 175 and copper is 300. So it won't really conduct that fast. It was go faster through the silicon to copper or aluminum contact than it would through the paste on the sides lol.
> 
> I say to not paste the side because it's easier to clean.


yeah, if I was ever going to be in there again that'd be a concern lol, but I'll pass. It's a beach to get past those fan plugs. You need a 5-year-old's fingers with the strength of pliers.


----------



## PasK1234Xw

Quote:


> Originally Posted by *Slackaveli*
> 
> that wasnt a stock Titan Xpp. It was tweaked. It wont clock as high, golden's may match it, and of course a titan xpp golden maxed will win. But on AVERAGE, most of us will be running our 1080ti at higher perf than your average titan xpp user.


What you talking about? Lol, I even said this is what the Xp looks like at 2100; of course it's not stock. If he's going to compare a golden Ti beating a stock Xp, it's only fair we compare to the new Xp overclocked.


----------



## Slackaveli

Quote:


> Originally Posted by *PasK1234Xw*
> 
> What the you talking about? Lol


the link he quoted and you responded to says the following, DUDE. (maybe read then post?)
"Update: I also added an overclocked result of the Aorus GTX 1080 Ti we tweaked to it's maximum stable 2038~2050 MHz. That Tweaked Aorus scores 31.3K on the GPU score, so does the Titan Xp at tweaked, at 31.9K, so tweaked the cards seem very close to each other."

http://www.guru3d.com/news-story/nvidia-titan-xp-is-faster-as-custom-gtx-1080-ti.html










titan xpp with +250 (7%)~ cores and slightly lower clocks and worse cooling will not be doing much if any better than our 1080ti, it just won't. If on water it may net 2 fps more at 4k..... For $600+ more dollars...... Even if underwater and oc to the same level, which will require a center die chip, it'll get, what, 3 fps more at 4k? WASTE


----------



## Bishop07764

Quote:


> Originally Posted by *PasK1234Xw*
> 
> What the you talking about? Lol


I think his point was that a Titan XPP won't clock to almost 2ghz core at default. Didn't look like they really touched the memory though. I look forward to those of you getting this card to show us what it can really do.


----------



## Slackaveli

Quote:


> Originally Posted by *Bishop07764*
> 
> I think his point was that a Titan XPP won't clock to almost 2ghz core at default. Didn't look like they really touched the memory though. I look forward to those of you getting this card to show us what it can really do.


exactly. It's like the people saying, "Oh no, Vega must be a beast if Nvidia is dropping this full-core Titan Xpp". Naw, they were always going to drop this card. If a Vega beats a 1080 Ti, it'll beat a Titan Xpp too (hint: it won't, though).


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*


> the link he quoted and you responded to says the following, DUDE. (maybe read then post?)
> "Update: I also added an overclocked result of the Aorus GTX 1080 Ti we tweaked to it's maximum stable 2038~2050 MHz. That Tweaked Aorus scores 31.3K on the GPU score, so does the Titan Xp at tweaked, at 31.9K, so tweaked the cards seem very close to each other."
> 
> http://www.guru3d.com/news-story/nvidia-titan-xp-is-faster-as-custom-gtx-1080-ti.html
> 
> titan xpp with +240 (7%)~ cores and slightly lower clocks and worse cooling will not be doing much if any better than our 1080ti, it just won't. If on water it may net 2 fps more at 4k..... For $600+ more dollars...... Even if under water and OC'd to the same level, which will require a center die chip, it'll get, what, 3 fps more at 4k? WASTE

Yeah, it's just for bragging rights.


----------



## Slackaveli

Quote:


> Originally Posted by *PasK1234Xw*
> 
> What you talking about? Lol I even said this is what Xp looks like at 2100 of course it's not stock. If he going to compare golden ti beating stock Xp only fair we compare to new XP overclocked.


of course, but the "comparison" is flawed b/c we can't even see his cpu scores or what he is even running; it is obviously a higher cpu score than the rig OC3D uses to test.

Yeah, it's all bragging rights. If it weren't, would they have ghost launched it rather than hyped it? They ghosted it.

Also, I don't know that it means much because I don't have the supply numbers, but this launched 30 hours ago and hasn't sold out once. It has been in stock throughout. So, is it really selling much?


----------



## prelude514

Well, I'm officially a dumbass.









I had taken the back plate off my FE since it's useless, and had a nice amount of airflow over the back side of the GPU to help a bit with cooling. I was going to bring the card back in today to trade for an Aorus, so I had to put the back plate back on, obviously. As I was putting the last screw back in, my dog bumped my leg and caused me to start the screw slightly cross-threaded. As soon as I noticed I backed off and started over, but the damage was done. Tried working it back, but to no avail; the screw sheared off close to the head. Looks like I'm stuck with my FE. What are the best AIO CPU coolers that I can mount to this thing? Don't really feel like paying ~$220 CAD for the EVGA if I can avoid it. Don't care about looks, just best cooling performance. That, along with a shunt mod, should be enough to keep me happy.

Anyone know where I can buy a replacement standoff and screw?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *prelude514*
> 
> Well, I'm officially a dumbass.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I had taken the back plate off my FE since it's useless and had a nice amount of airflow over the back side of the GPU to help a bit with cooling. Was going to bring the card back in today to trade for an Aorus, so I had to put the back plate back on obviously. As I was putting the last screw back in, my dog bumped my leg and caused me to start the screw slightly cross threaded. As soon as I noticed I backed off and started over, but the damage was done. Tried working it back but to no avail. Screw sheared off close to the head.. Looks like I'm stuck with my FE. What are the best AIO CPU coolers that I can mount to this thing? Don't really feel like paying $~220 CAD for the EVGA if I can avoid that. Don't care about looks, just best cooling performance. That, along with a shunt mod, should be enough to keep me happy.
> 
> Anyone know where I can buy a replacement standoff and screw?


Post on reddit hardwareswap and see if anyone has a broken 1070 or 1080 they would sell you loose hardware from.


----------



## prelude514

Great idea, thanks.


----------



## alucardis666

Sold my Ti for $750. Got the Titan setup with the hybrid cooler, works perfect. OC's like a beast!





+181 on the core, + 666 on the Mem.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Sold my Ti for $750. Got the Titan setup with the hybrid cooler, works perfect. OC's like a beast!
> 
> 
> 
> 
> 
> +181 on the core, + 666 on the Mem.


nice but that texture fillrate is borked! Glad you got a good overclocker, but i knew you would. Almost all these are perfect samples.

666 on the mems. i see you.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> nice but that texture fillrate is borked! Glad you got a good overclocker, but i knew you would. Almost all these are perfect samples.
> 
> 666 on the mems. i see you.


Borked how? lol









And thanks bud, I'm glad it worked out.


----------



## Slackaveli

it's really closer to ~400; it'll get fixed when GPU-Z updates.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> it's really like almost ~400, it'll get fixed when gpu-z updates.


So it's not reading accurately?

Here's stock


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> Sold my Ti for $750. Got the Titan setup with the hybrid cooler, works perfect. OC's like a beast!
> 
> 
> 
> 
> 
> +181 on the core, + 666 on the Mem.


dang, who bought your Ti for $750? Did you sell it locally?


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> dang who bought your ti $750? Did you sell it local?


eBay


----------



## KingEngineRevUp

Quote:


> Originally Posted by *prelude514*
> 
> Great idea, thanks.


Your best bet is anyone that went with a Kraken G10 mod and took their shroud off. Also post everywhere and anywhere.


----------



## joder

Quote:


> Originally Posted by *alucardis666*
> 
> eBay


So you get around $650 after eBay and Paypal fees (not factoring in shipping).
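That ~$650 figure is consistent with back-of-the-envelope fee math. A quick sketch; the fee rates are assumptions about 2017-era selling fees (~10% eBay final value fee, ~2.9% + $0.30 PayPal), not quoted from either company:

```python
# Rough seller-fee math for a $750 sale. Fee rates are assumptions, not quotes.
sale = 750.00
ebay_fee = sale * 0.10             # ~10% final value fee
paypal_fee = sale * 0.029 + 0.30   # ~2.9% + $0.30 per transaction
net = sale - ebay_fee - paypal_fee
print(round(net, 2))  # roughly $653 before shipping
```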


----------



## alucardis666

Quote:


> Originally Posted by *joder*
> 
> So you get around $650 after eBay and Paypal fees (not factoring in shipping).


Yea, I bought the card for $700, so not too bad, all things considered.


----------



## joder

Quote:


> Originally Posted by *alucardis666*
> 
> Yea, I bought the card for $700, so not too bad all things considering.


Nope not bad. I just hate eBay for that reason.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> eBay


They should let you make the titan owners club thread now lol


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> They should let you make the titan owners club thread now lol


They did reactivate it









http://www.overclock.net/t/1627390/2017-nvidia-titan-xp-owners-thread/320#post_25997302


----------



## cazpy

http://www.3dmark.com/fs/12260947

do you guys have problems aswell with driver validation?


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> eBay


Unbelievable, really, because they were in stock for $700 in a few places that day.


Quote:


> Originally Posted by *alucardis666*
> 
> So it's not reading accurately?
> 
> Here's stock


it did that with the 1080 Ti for the first few days too; it reported it 100 higher. But that may be right, idk. The bandwidth is correct, and maybe that bumps the texture rate way up too, b/c I am at 365 currently on TFR.


----------



## Tikerz

Anyone overclocking in Afterburner using the voltage curve notice that the saved profiles don't always load the same exact curve on startup?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Tikerz*
> 
> Anyone overclocking in Afterburner using the voltage curve notice that the saved profiles don't always load the same exact curve on startup?


Yeah, the nvidia bios does a correction on what the curve should run like. You might want to lower the point right before your highest voltage to ensure you run your max clock at your max voltage.

For example, if the point just below your max-voltage point is set at 1.062V, you might have to drop the 1.062V point to 2063.
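For what it's worth, the "correction" behaves as if the driver won't let a lower-voltage point carry a higher clock than the point above it. A hypothetical sketch of that clamping; the point values are made up, and this is a guess at the observed behavior, not NVIDIA's documented algorithm:

```python
# Guess at the curve "correction": clamp each point's clock so it never
# exceeds the clock of the next-higher voltage point. Values are hypothetical.
def normalize_curve(points):
    """points: list of (voltage_v, clock_mhz) tuples sorted by voltage."""
    fixed = list(points)
    for i in range(len(fixed) - 2, -1, -1):
        v, clk = fixed[i]
        fixed[i] = (v, min(clk, fixed[i + 1][1]))
    return fixed

curve = [(1.050, 2100), (1.062, 2076), (1.075, 2088)]
print(normalize_curve(curve))  # the 1.050V point gets pulled down to 2076
```

This would explain why a point set very close to the one above it tends to shift on reboot: any clock above its neighbor's gets snapped back down.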


----------



## xartion

The easter bunny just brought me a new toy


----------



## Slackaveli

Quote:


> Originally Posted by *xartion*
> 
> The easter bunny just brought me a new toy


nice. She's a beefy one. I like big gals.


----------



## Tikerz

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, the nvidia bios does a correction on what the curve should run like. You might want to lower the point right before your highest voltage to ensure you run your max clock at your max voltage.
> 
> For example, if the point just below your max-voltage point is set at 1.062V, you might have to drop the 1.062V point to 2063.


Thanks, I'll give it a shot. I just don't want to have to tinker with it every time I boot up.


----------



## Slackaveli

Quote:


> Originally Posted by *Tikerz*
> 
> Thanks, I'll give it a shot. I just don't want to have to tinker with it every time I boot up.


but your name is tinkers....


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> but your name is tinkers....


----------



## Bishop07764

Quote:


> Originally Posted by *alucardis666*
> 
> Sold my Ti for $750. Got the Titan setup with the hybrid cooler, works perfect. OC's like a beast!
> 
> +181 on the core, + 666 on the Mem.


Was it your card that FedEx lost? Let us know what you get in Firestrike. Congrats!

Quote:


> Originally Posted by *cazpy*
> 
> http://www.3dmark.com/fs/12260947
> 
> do you guys have problems aswell with driver validation?


I get the same message with the new driver.


----------



## alucardis666

Quote:


> Originally Posted by *Bishop07764*
> 
> Was it your card that FedEx lost? Let's us know what you get in firestrike. Congrats!
> I get the same message with the new driver.


They delivered it an hour later, dunno why they claimed it was delivered before it actually was.

Firestrike is paid right? I don't buy benchmarks.








Here's Heaven though!



*+181* on the core, *+666* on the Mem.


----------



## RavageTheEarth

Just got word that my Aorus 1080 Ti has been delivered. I'm vacationing in New Orleans, but will be back home on Wednesday. CANNOT wait to get home and play with this thing!!!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Tikerz*
> 
> Thanks, I'll give it a shot. I just don't want to have to tinker with it every time I boot up.


No, it shouldn't move after that. For some reason the bins move around if the one below is set really close.

The closer the voltage points are, the higher the chance of them shifting and moving.


----------



## joder

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Just got word that my Aorus 1080 Ti had been delivered. I'm vacationing in New Orleans, but will be back home in Wednesday. CANNOT wait to get home and play with this thing!!!


I think that is a valid reason to cancel your vacation. The Ti is a vacation all on its own...


----------



## RavageTheEarth

Quote:


> Originally Posted by *joder*
> 
> I think that is a valid reason to cancel your vacation. The Ti is a vacation all on its own...


I'm down! Think my GF would be cool with that? Can't wait to overclock this thing and do some benching!


----------



## GRABibus

Does someone own the GTX 1080 Ti SEA HAWK X from MSI ?
Any feedback ?


----------



## Slackaveli

10,733 graphics score. 1st run. No cherry picking here. Oh, and on my 4-hour-no-crash BF1-tested settings; I could get more with unstable settings. On a 4-core CPU, and you all know by now what I'm running there, lmao.

http://www.3dmark.com/3dm/19145398?

Look who's #1 in the world on my combo







And not just any combo. On the gaming king CPU and the gaming king GPU, the 1080 Ti Aorus (the Titan isn't a "gaming gpu" per Nvidia LMAO







) http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpugpu/spy/P/2007/1127/500000?minScore=0&cpuName=Intel Core i7-5775C&gpuName=NVIDIA GeForce GTX 1080 Ti



CAN I GET AN AMEN, Merica?!


----------



## alucardis666

Quote:


> Originally Posted by *GRABibus*
> 
> Does someone own the GTX 1080 Ti SEA HAWK X from MSI ?
> Any feedback ?


It's out?


----------



## GRABibus

Quote:


> Originally Posted by *alucardis666*
> 
> It's out?


I don't think so...
Just wondering whether some people could have gotten samples already.

I really plan to get it. With an ASUS Strix BIOS flash, that should be great.
Or, if Gigabyte releases the Xtreme Gaming WATERFORCE 1080 Ti, then I will go for that.


----------



## alucardis666

Quote:


> Originally Posted by *GRABibus*
> 
> I don't think so...
> Just to know if some could get some samples already.
> 
> I really plan to get it. With Bios flash ASUS Strix t4, that should be great.
> Or, if Gigabyte releases the Xtreme Gaming WATERFORCE 1080Ti, then I will go for it


Yea SeaHawX should be nice. Go for it!


----------



## Preim

Received my Aorus 1080Ti yesterday, pretty happy with it so far! Incredibly quiet even when pushing voltage


----------



## Nico67

Quote:


> Originally Posted by *SlimJ87D*
> 
> Actually 100% for Asus is 300 watts already. So if you flash the bios, than 110% is 330 watts, it's like you're running the shunt mod.


You sure? I thought it was 275w x 120% = 330w.

If it were 110% and I'm seeing 111%, I would think it would give me power limiting. Also it seemed comparable to the FE bios, in that I was at 120% = 300w on that and 108% = 297w on the Strix.


----------



## Slackaveli

Quote:


> Originally Posted by *Preim*
> 
> Received my Aorus 1080Ti yesterday, pretty happy with it so far! Incredibly quiet even when pushing voltage


yours likes volts? My Aorus thought volts were no bueno, but it gives me 2037-2063 steady at stock volts and +77 (these come +79 over the FE, so that's only +156 over the original core).


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> 10,733 graphics score. 1st run. No cherry picking here. Oh, on my 4 hour no crash on BF1 tested settings. I could get more with unstable settings. 4 core cpu as you all know by know what Im running there, lmao/
> 
> http://www.3dmark.com/3dm/19145398?
> 
> Look who's #1 in the world on my combo
> 
> 
> 
> 
> 
> 
> 
> And not just any combo. On the Gaming King cpu and and gaming King gpu,1080Ti, Aorus (the Titan isnt a "gaming gpu " per Nvidia LMAO
> 
> 
> 
> 
> 
> 
> 
> ) http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpugpu/spy/P/2007/1127/500000?minScore=0&cpuName=Intel Core i7-5775C&gpuName=NVIDIA GeForce GTX 1080 Ti
> 
> 
> 
> CAN I GET AN AMEN, Merica?!


Dude how do we run 3dmark with afterburner? Stealth mode does nothing.


----------



## PasK1234Xw

Quote:


> Originally Posted by *alucardis666*
> 
> It's out?


it's out but not in the US.
Should be anytime now.


https://www.reddit.com/r/63dpdd/my_sea_hawk_x_just_arrived/


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Nico67*
> 
> You sure, I thought it was 275w x 120% = 330w?
> 
> If it was 110% and I seeing 111% I would think it would give me power limiting. Also it seemed to be comparable to the FE bios in that I was 120% 300w on that and 108 297w on the Strix.


I believe 100% on the Strix is 300 watts. So 110% would be 330 watts.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Dude how do we run 3dmark with afterburner? Stealth mode does nothing.


idk, I have the Steam version and it works for me w/ the OSD on and everything (RivaTuner).

I run low detection and stealth mode and have the latest version of RivaTuner. And I run it through Steam, obviously.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> idk, i have steam version and it works for me w/ the OSD on and everything (riva)/
> 
> I run low detection and stealth mode and have the latest version of riva. And I run thru steam obviously.


I'm running the latest beta of Afterburner and RTSS... Hmmm... ***! Lol, what a waste of $7.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> I'm running the latest beta of Afterburner and RTSS... Hmmm... ***! Lol, what a waste of $7.


me too. idk. Is it Steam version? If not, there is a place when you first log in that says how to run it through Steam. I bet that's it.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> me too. idk. Is it Steam version? If not, there is a place when you first log in that says how to run it through Steam. I bet that's it.


Yeah, I'm running it through Steam. But maybe it's something I have set in Afterburner itself. Not sure if I have the patience to figure it out, but until then I can't run it with my OC.


----------



## Bishop07764

Quote:


> Originally Posted by *alucardis666*
> 
> They delivered it an hour later, dunno why they claimed it was delivered before it actually was.
> 
> Firestrike is paid right? I don't buy benchmarks.
> 
> 
> 
> 
> 
> 
> 
> '
> 
> Here's Heaven though!
> 
> *+181* on the core, *+666* on the Mem.


Haha. I'm with you on the paying for benchmark thing. I just have the free version. You can get it off Steam. It lets you run regular Firestrike and Timespy. Just curious. What kind of temps and throttling are you seeing?


----------



## Slackaveli

Quote:


> Originally Posted by *Bishop07764*
> 
> Haha. I'm with you on the paying for benchmark thing. I just have the free version. You can get it off Steam. It lets you run regular Firestrike and Timespy. Just curious. What kind of temps and throttling are you seeing?


i bought the basic version on a steam sale for i think it was $3.50 or something like that. Peanuts. Not even a decent cheeseburger.


----------



## Jbravo33

Quote:


> Originally Posted by *alucardis666*
> 
> Yea, I bought the card for $700, so not too bad all things considering.


which card did you have again? Wondering if I should put both of mine on eBay for 699 rather than returning them and waiting two weeks for the money to get refunded. I could potentially get reimbursed faster on eBay.


----------



## Bishop07764

Quote:


> Originally Posted by *Slackaveli*
> 
> i bought the basic version on a steam sale for i think it was $3.50 or something like that. Peanuts. Not even a decent cheeseburger.


I might grab it off Steam if it was that price. Maybe. But cheeseburger...Mmmm ?
Quote:


> Originally Posted by *GRABibus*
> 
> Does someone own the GTX 1080 Ti SEA HAWK X from MSI ?
> Any feedback ?


I had the 1080 Seahawk EK previously. It was an awesome card. I'm sure that the ti models will be up there in quality. Not a lick of coil whine from it either. I might have gone the Seahawk EK ti route this time if they had been available.


----------



## feznz

Quote:


> Originally Posted by *Slackaveli*
> 
> it's all unlocked on Afterburner 4.4.0 i bet you are on 4.3
> i've just been putting up with it, just glad to know it isn't my monitor or gear failing.


I can't find AB 4.4.0, must be a beta. Have you got a link please?

I have managed to unlock the sliders but not the voltage increase on 4.3.0, though in saying that I am game stable @ 2038MHz and can bench @ 2076MHz.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> gpu needs full coverage. have to smear method, all of it, even the sides.


Really? I thought it was similar to a CPU, just a large dot in the middle. So I have to cover the full chip with thermal paste?

SAD news - my STRIX ROG 1080Ti got delayed








Won't have it till maybe next week....

Well, I don't have CPU right now so no big deal...but sad anyway.


----------



## Slackaveli

Quote:


> Originally Posted by *Bishop07764*
> 
> I might grab it off Steam if it was that price. Maybe. But cheeseburger...Mmmm ?
> I had the 1080 Seahawk EK previously. It was an awesome card. I'm sure that the ti models will be up there in quality. Not a lick of coil whine from it either. I might have gone the Seahawk EK ti route this time if they had been available.


The Aorus Extreme hybrid will be a beast.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> Really? I thought it is simiallar to CPU, just large dot in the middle. So I have to cover full chip with thermal paste?
> 
> SAD news - my STRIX ROG 1080Ti got delayed
> 
> 
> 
> 
> 
> 
> 
> Won't have it till maybe next week....
> 
> Well, I don't have CPU right now so no big deal...but sad anyway.


Before you re-TIM, test your card out first.

And yes, you have to spread, because this is an exposed die. A CPU isn't exposed; it has a heat spreader. Look at a de-lid video: when you remove the heat spreader, that is the actual CPU bare die.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Really? I thought it is simiallar to CPU, just large dot in the middle. So I have to cover full chip with thermal paste?
> 
> SAD news - my STRIX ROG 1080Ti got delayed
> 
> 
> 
> 
> 
> 
> 
> Won't have it till maybe next week....
> 
> Well, I don't have CPU right now so no big deal...but sad anyway.


yeah. it's very easy. Easy as a peanut butter sandwich, easier than buttering bread


----------



## hotrod717

Quote:


> Originally Posted by *KedarWolf*
> 
> Don't know how you're getting such high scores on one 1080 Ti.
> 
> Here's the best I can do with the FE BIOS at 2062 core, 6132 memory.


#1: 99% of my computer use is benching. #2: My core. I'm over 2138. #3: Chiller. #4: The force.








Also will post my time spy. I was at 11232 without volt unlock. Have to rerun. Will also post higher Cat.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *hotrod717*
> 
> #1 99% of my computer use is benching. #2 My core. I'm over 2138.# 3 Chiller #4 The force.
> 
> 
> 
> 
> 
> 
> 
> 
> Also will post my time spy. I was at 11232 without volt unlock. Have to rerun. Will also post higher Cat.


For whatever reason, running above 2114 MHz is counterproductive for me. I believe you need more memory clock to be able to do so, and the max my card can do stable is +475 on the memory.

Are your results at 2138 actually better or worse than bins ahead?
Do you have any heaven runs and GPU-Z graphs running at 2138?


----------



## optimus002

Quote:


> Originally Posted by *prelude514*
> 
> Well, I'm officially a dumbass.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I had taken the back plate off my FE since it's useless and had a nice amount of airflow over the back side of the GPU to help a bit with cooling. Was going to bring the card back in today to trade for an Aorus, so I had to put the back plate back on obviously. As I was putting the last screw back in, my dog bumped my leg and caused me to start the screw slightly cross threaded. As soon as I noticed I backed off and started over, but the damage was done. Tried working it back but to no avail. Screw sheared off close to the head.. Looks like I'm stuck with my FE. What are the best AIO CPU coolers that I can mount to this thing? Don't really feel like paying $~220 CAD for the EVGA if I can avoid that. Don't care about looks, just best cooling performance. That, along with a shunt mod, should be enough to keep me happy.
> 
> Anyone know where I can buy a replacement standoff and screw?


Buy yourself an AIO like an H75, a Kraken G10, thermal pads, heatsinks and good thermal paste; I just did.


----------



## KedarWolf

Quote:


> Originally Posted by *feznz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Slackaveli*
> 
> it's all unlocked on Afterburner 4.4.0 i bet you are on 4.3
> i've just been putting up with it, just glad to know it isn't my monitor or gear failing.
> 
> 
> 
> I can't find AB 4.4.0 must be beta have you a link please?
> 
> I have managed to unlock the sliders but not voltage increase on 4.3.0 though in saying that I am game stable @ 2038Mhz can bench @ 2076Mhz

http://forums.guru3d.com/showpost.php?p=5412373&postcount=216


----------



## dansi

Anybody's FE able to reach 2.1GHz stably in games?


----------



## Boyd

Quote:


> Originally Posted by *dansi*
> 
> Any body FE able to reach 2.1ghz stably in games?


I doubt anyone would be able to without alternate cooling. FE gets really hot with the reference cooler.


----------



## cekim

Quote:


> Originally Posted by *Boyd*
> 
> I doubt anyone would be able to without alternate cooling. FE gets really hot with the reference cooler.


Even on water I can't get there. I do have one card that requires substantially more voltage than the other in the same loop, but I'm not alone in finding the Ti doesn't clock as well as the 1080. Not too surprising given the additional compute and memory.


----------



## feznz

Quote:


> Originally Posted by *KedarWolf*
> 
> http://forums.guru3d.com/showpost.php?p=5412373&postcount=216


Thanks, that worked for me; unlocked the voltage.
Still think I need better cooling, as more voltage didn't really gain anything.


----------



## jacknhut

Out of the 2 FE 1080 Tis I got, one runs at 2139 MHz stable with default volts; the other can't pass 2050 MHz without crashing Valley... Both cards never exceed 40C thanks to my custom loop.

Now if I can figure out how to increase the voltage, maybe I can get the 2nd card to at least 2100 MHz...


----------



## KedarWolf

Quote:


> Originally Posted by *jacknhut*
> 
> Out of the 2 FE 1080 TI I got, 1 run at 2139 Mhz stable with default volt, the other cant pass 2050 Mhz without crashing valley.... Both cards never exceed 40C due to my custom loop.
> 
> Now if I can figure out how to increase voltage maybe I can try to get the 2nd card to at least 2100 Mhz...


See the voltage curve video.









http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20


----------



## johnwayne117

Hey guys, what is up with the Asus BIOS? Does it allow you to set a higher voltage? So in those terms, Asus > MSI and Aorus?


----------



## glnn_23

Here's my EVGA 1080 Ti FE. Custom water, stock bios and no shunt mod.


http://www.3dmark.com/3dm/19152974


----------



## KedarWolf

Quote:


> Originally Posted by *glnn_23*
> 
> Here's my EVGA 1080 Ti FE. Custom water, stock bios and no shunt mod.
> 
> 
> http://www.3dmark.com/3dm/19152974


Got this today.


----------



## mshagg

Quote:


> Originally Posted by *glnn_23*
> 
> Here's my EVGA 1080 Ti FE. Custom water, stock bios and no shunt mod.
> 
> 
> http://www.3dmark.com/3dm/19152974


Epic. Those CPUs can take a beating, huh.

Been playing around with the voltage curve in AB. Turns out Firestrike and Time Spy are easier to pass stable than supersampled VR games at 1.8x (3880x2160 at 90FPS). Shock to the system when the driver crashes in VR lol.

Still looks like 2062 at 1.031V is the sweet spot for real-world use on this FE/EKWB; keen to push it into the higher voltage bins for some benching though.


----------



## Jbravo33

Anyone else running an EK Titan waterblock with the Titan nickel backplate? The screws that came with the nickel backplate are too wide for the 1080 Ti. Wondering if anyone else has encountered this problem. I had to mount it with black screws instead of the nickel-colored ones.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *johnwayne117*
> 
> Hey guys, what is up with Asus bios? Is it allow you to set higher voltage? So in this terms Asus>Msi and aorus?


As of right now I'm hoping someone can investigate the Asus BIOS, because so far it seems like it doesn't do anything.

Supposedly it's meant to give us a higher TDP, but another user measured the power usage and it was no different from the stock BIOS.


----------



## johnwayne117

Quote:


> Originally Posted by *SlimJ87D*
> 
> As of right now I'm hoping someone can investigate the Asus bios because so far it seems like it doesn't do anything.
> 
> Supposedly it's supposed to give us higher tdp but another user measured the power usage and it was no different than the stock bios.


So it can't allow you to set 1.1V?
And generally, looking at 1080 OC results, can we expect a softmod or BIOS mod that allows an easy 2100MHz at least?


----------



## Nico67

Quote:


> Originally Posted by *KedarWolf*
> 
> Got this today.


Back to the FE BIOS? And does the 123% power limit help very much?


----------



## Alwrath

Anyone know if/how you can increase the power limit on FE cards? Thanks in advance.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *johnwayne117*
> 
> So it can't allow you set 1.1v?
> And generally, looking at 1080 oc, can we expect softmod\ biosmod that allow easy at least 2.100Mhz?


Quote:


> Originally Posted by *johnwayne117*
> 
> Hey guys, what is up with Asus bios? Is it allow you to set higher voltage? So in this terms Asus>Msi and aorus?


Not as of right now


----------



## PasK1234Xw

Quote:


> Originally Posted by *prelude514*
> 
> Well, I'm officially a dumbass.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I had taken the back plate off my FE since it's useless and had a nice amount of airflow over the back side of the GPU to help a bit with cooling. Was going to bring the card back in today to trade for an Aorus, so I had to put the back plate back on obviously. As I was putting the last screw back in, my dog bumped my leg and caused me to start the screw slightly cross threaded. As soon as I noticed I backed off and started over, but the damage was done. Tried working it back but to no avail. Screw sheared off close to the head.. Looks like I'm stuck with my FE. What are the best AIO CPU coolers that I can mount to this thing? Don't really feel like paying $~220 CAD for the EVGA if I can avoid that. Don't care about looks, just best cooling performance. That, along with a shunt mod, should be enough to keep me happy.
> 
> Anyone know where I can buy a replacement standoff and screw?


Anyone using the EVGA hybrid kit for the 1070/80/Ti/XP has them. They are included with the kit, on top of the ones already on the card, so there are lots left over; ask around.

I would ask over in the EVGA forum also.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Alwrath*
> 
> Anyone know if/how you can increase the power limit on FE cards? Thanks in advance.


You can with shunt mod


----------



## Alwrath

Quote:


> Originally Posted by *SlimJ87D*
> 
> You can with shunt mod


There's no other way? Ugh...


----------



## hotrod717

Quote:


> Originally Posted by *SlimJ87D*
> 
> For whatever reason running above 2114 Mhz is counterproductive for me. I believe you need more memory to be able to do so and the max my card can do stable is 475 on memory.
> 
> Are your results at 2138 actually better or worse than bins ahead?
> Do you have any heaven runs and GPU-Z graphs running at 2138?


Memory also has something to do with it. There is a point where the mem OC will throttle your core OC, as they are both ultimately pulling from the same voltage. It all depends on the bench; some prefer mem, some core. You just have to figure out what the bench prefers.
Also, the curve ultimately should reflect the old "13" philosophy. Not doing it this way can severely limit you.
During Cat. I hit 121% for a brief moment; the rest is 100-110. The graph is at the bottom of AB, which should be in my screens. Also no shunt mod. I'm using a universal WB with PK-1.


----------



## corndog1836

I wish I could sell my 780 Ti Classifieds... hahahahahaha


----------



## Mad Pistol

Quote:


> Originally Posted by *corndog1836*
> 
> i wish i could sell my 780ti Classifieds.. hahahahahaha


You can, but it will only make you $130-170 per card. Obviously, a 1080 Ti costs quite a bit more than that.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *hotrod717*
> 
> Memory also has something to do with it. There is a point where mem oc will throttle your core oc as they are both pulling from same voltage ultimately. It all depends on bench. Some would rather see mem, some core. Just have to figure out what bench prefers.
> Also curve ultimately should be reflective of the old "13" philosophy. Not doing it this way can severely limit you.
> During Cat.I hit 121% for a brief moment. Rest is 100-110. The graph is on bottom of AB, which should be In my screens. Also not shunt mod. I'm using a universal wb with pk1.


What is this "13" you speak of?


----------



## looniam

Quote:


> Originally Posted by *SlimJ87D*
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *hotrod717*
> 
> Memory also has something to do with it. There is a point where mem oc will throttle your core oc as they are both pulling from same voltage ultimately. It all depends on bench. Some would rather see mem, some core. Just have to figure out what bench prefers.
> Also curve ultimately should be reflective of the old "13" philosophy. Not doing it this way can severely limit you.
> During Cat.I hit 121% for a brief moment. Rest is 100-110. The graph is on bottom of AB, which should be In my screens. Also not shunt mod. I'm using a universal wb with pk1.
> 
> 
> 
> 
> 
> 
> 
> 
> What is this "13" you speak of?

Boost bins of 13MHz. OC with that in mind and you'll get a stable clock speed, instead of using increments of, say, 10 or 15 (similarly, voltage likes 6mV adjustments)... it's been like that since Kepler.
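As a rough sketch of the bin math being described here (assuming the ~13MHz step size; the helper function is just illustrative, not anything from a real tool):

```python
BIN_MHZ = 13  # approximate boost-bin step described above (Kepler through Pascal)

def snap_to_bin(offset_mhz: int) -> int:
    """Round a requested core-clock offset to the nearest 13 MHz bin,
    which is roughly what the driver ends up applying anyway."""
    return round(offset_mhz / BIN_MHZ) * BIN_MHZ

print(snap_to_bin(100))  # a +100 request lands on the +104 bin
print(snap_to_bin(10))   # a +10 request lands on the +13 bin
```

This is why dialing in offsets in multiples of 13 gives repeatable results: anything in between just gets pulled to the nearest bin.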


----------



## KingEngineRevUp

Quote:


> Originally Posted by *looniam*
> 
> boost bins of 13Mhz. OC with that in mind and you'll get a stable clock speed instead of using increments of say 10 or 15 (think voltage likes 6mv adjustments) . . . like that since kepler.


Oh, Afterburner does that automatically for you. It won't let you choose 10 or 15; it'll jump up to the nearest 13MHz increment.


----------



## Mad Pistol

I see the battle will soon be brewing over which is the absolute fastest card on the market (when overclocked): the GTX 1080 Ti or the Titan Xp.

I am curious to see if this is a repeat of the Maxwell generation or not. Just for reference, the 980 Ti was faster than the original Titan X when overclocked (including some 980 Ti AIB models).


----------



## RadActiveLobstr

Quote:


> Originally Posted by *Mad Pistol*
> 
> I see the battle will begin brewing soon on which is the absolutely fastest card on the market (when overclocked): the GTX 1080 Ti or the Titan Xp.
> 
> I am curious to see if this is a repeat of the Maxwell generation or not. Just for reference, the 980 Ti was faster than the original Titan X when overclocked (including some 980 Ti AIB models.)


I don't think there will be any contest; the Titan Xp is going to be faster. The core count alone assures that.

The question will be how close AIB 1080 Tis can get to the reference Titan Xp.

If you put water on the Titan Xp, it's clearly going to be the fastest.


----------



## PasK1234Xw

^this^


----------



## Mad Pistol

Quote:


> Originally Posted by *RadActiveLobstr*
> 
> I don't think there will be any contest, the Titan Xp is going to be faster. The core count alone assures that.
> 
> The question will be how close can AIB 1080Ti's get to the reference Titan Xp.
> 
> If you put water on the Titan Xp it's clearly going to be the fastest.


I agree. Unfortunately, certain people on this forum believe that the 1080 Ti has the ability to beat the TXp on absolute performance. Honestly, for the $500 premium that Nvidia wants for the TXp, they can have their tiny performance lead. The 1080 Ti is far and away the winner on the value front.


----------



## PasK1234Xw

It's going to take hard modding, like Kingpin does, to push beyond Nvidia's set limits. For people new to Pascal: it's not like Maxwell, and BIOS editing will never happen. And AIB cards with higher power limits and shunt mods will still be hitting the clock wall.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Mad Pistol*
> 
> I agree. Unfortunately, certain people on this forum believe that the 1080 Ti has the ability to beat the TXp on absolute performance. Honestly, for the $500 premium that Nvidia wants for the TXp, they can have their tiny performance lead. The 1080 Ti is far and away the winner on the value front.


Yeah, but those people don't have any credibility.


----------



## mcg75

Quote:


> Originally Posted by *RadActiveLobstr*
> 
> I don't think there will be any contest, the Titan Xp is going to be faster. The core count alone assures that.
> 
> The question will be how close can AIB 1080Ti's get to the reference Titan Xp.
> 
> If you put water on the Titan Xp it's clearly going to be the fastest.


Yeah, I'm not sure why people argue things like this.

All things being equal, the Titan Xp will always be faster.


----------



## Slackaveli

Quote:


> Originally Posted by *Alwrath*
> 
> Anyone know if/how you can increase the power limit on FE cards? Thanks in advance.


The Asus BIOS works for some; the shunt mod always works if you do it right. Other than that, we're screwed. There won't be a custom BIOS coming.


----------



## Slackaveli

Quote:


> Originally Posted by *RadActiveLobstr*
> 
> I don't think there will be any contest, the Titan Xp is going to be faster. The core count alone assures that.
> 
> The question will be how close can AIB 1080Ti's get to the reference Titan Xp.
> 
> If you put water on the Titan Xp it's clearly going to be the fastest.


Yeah, because they are better samples AND have the 256 extra cores. The Titans are all center-die samples. They are overclocking like champs; everybody is getting 2050+. @Alucardis666's 1080 Ti was a 1900s overclocker; he sold it for $750, and his Titan clocks and holds 2114 @ 42C.

It CRUSHES us; in his case that is almost 20% gained between clocks and cores...

wow


----------



## hotrod717

Quote:


> Originally Posted by *SlimJ87D*
> 
> Oh, afterburner does that automatically for you. It won't let you choose 10 or 15, it'll jump up to the nearest 13 increment


Not always. Always check the curve again after applying. It's also about how you arrange your bins; just maxing the OC and locking it doesn't always yield the best results or scores.


----------



## Clukos

In theory, a TXp at a stable 1900MHz core clock performs similarly to a 1080 Ti at a 2050MHz core clock (~14.5 TFLOPS each). Of course, if you overclock both to the max, the TXp gains about 1 TFLOP of extra performance, not to mention the extra memory bandwidth. I still don't think that performance differential is worth $500, but I'm not against people wanting to buy the best thing out there either. The 1080 Ti is still a beast on its own.


----------



## Slackaveli

Quote:


> Originally Posted by *Clukos*
> 
> In theory a TXp at a stable 1900Mhz core clock performs similarly to a 2050Mhz core clock 1080 ti (14.5 tflops). Of course, if you overclock both to the max, the TXp gains about 1 tflop of extra performance, not to mention the extra memory bandwidth. I still don't think that performance differential is worth $500, but I'm not against people wanting to buy the best thing out there either. The 1080 ti is still a beast on its own


Yeah, no reason to hang our heads in here. If you are a millionaire, well, hell yeah, your rig sucks with a 1080 Ti... ya bum

For the rest of us, we are just obviously smarter consumers. Just don't make yourself look ignorant and start claiming we beat the Titan Xp (2017); that's ridiculous. (Obviously I don't mean you, Clukos







)


----------



## alucardis666

Quote:


> Originally Posted by *Mad Pistol*
> 
> I agree. Unfortunately, certain people on this forum believe that the 1080 Ti has the ability to beat the TXp on absolute performance. Honestly, for the $500 premium that Nvidia wants for the TXp, they can have their tiny performance lead. The 1080 Ti is far and away the winner on the value front.


Quote:


> Originally Posted by *Clukos*
> 
> In theory a TXp at a stable 1900Mhz core clock performs similarly to a 2050Mhz core clock 1080 ti (14.5 tflops). Of course, if you overclock both to the max, the TXp gains about 1 tflop of extra performance, not to mention the extra memory bandwidth. I still don't think that performance differential is worth $500, but I'm not against people wanting to buy the best thing out there either. The 1080 ti is still a beast on its own


Agreed. But I'm seeing an 8-20fps increase in my games at 4K maxed settings with the TXp. So worth it, imo. I can only imagine how well this would do at 100Hz 3440x1440 with G-Sync.


----------



## cyenz

Quote:


> Originally Posted by *alucardis666*
> 
> Agreed. But I'm seeing a 8-20fps increase in my games at 4k maxed settings with the TXp. So worth it, imo. I can only imagine how well this would be doing 100hz 3440x1440 gsync.


A 20fps increase over the 1080 Ti? Are we talking about CS:GO or something?


----------



## Slackaveli

He said 8-20, so the 20 being in the games that hit over 100 fps in 4K, I guess. I hit 80-85 fps in 4K in BF1 on ultra on some maps, so there are a TON of games that will hit 100fps nowadays (crazy, I know). I hit over 144fps in 4K in FIFA 17, for example. In Wildlands or Mass Effect, if he's getting 8 more fps in 4K, that's huge. But he did go from a dog to a golden sample, which doubles the usual delta between the TXp (2017) and the 1080 Ti.


----------



## mastabog

I arrived here from a google search about a bios vmod and was greeted with 4500+ posts in 35 days, or ~128 posts/day ... you guys are nuts.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> he said 8-20, so the 20 being in the games that hit over 100 fps in 4k i guess. I hit 80 -85 fps in 4k in BF1 on ultra in some maps so there are a TON of games that will hit 100fps now-a-days (crazy, i know). I hit over 144fps in 4k in FiFa17, for example. In wildlands or mass effect if he's getting 8 fps in 4k, that's huge. But he did go from a dog to a golden, that doubles the usual delta between TXp17 and 1080Ti.


Thank you. Couldn't have said it any better myself.








Quote:


> Originally Posted by *mastabog*
> 
> I arrived here from a google search about a bios vmod and was greeted with 4500+ posts in 35 days, or ~128 posts/day ... you guys are nuts.


Welcome to the party!


----------



## cyenz

Quote:


> Originally Posted by *Slackaveli*
> 
> he said 8-20, so the 20 being in the games that hit over 100 fps in 4k i guess. I hit 80 -85 fps in 4k in BF1 on ultra in some maps so there are a TON of games that will hit 100fps now-a-days (crazy, i know). I hit over 144fps in 4k in FiFa17, for example. In wildlands or mass effect if he's getting 8 fps in 4k, that's huge. But he did go from a dog to a golden, that doubles the usual delta between TXp17 and 1080Ti.


Well, if we are talking about the worst 1080 Ti ever versus a Titan Xp golden sample, I can believe that a game that's outputting over 100fps can gain 20 fps, but the difference will not turn an unplayable game at 4K into a playable one.


----------



## Slackaveli

Quote:


> Originally Posted by *mastabog*
> 
> I arrived here from a google search about a bios vmod and was greeted with 4500+ posts in 35 days, or ~128 posts/day ... you guys are nuts.


Welcome to the place that will make you hate your rig and max out your cards, but have the best PC of anybody you know in real life.


----------



## Slackaveli

Quote:


> Originally Posted by *cyenz*
> 
> Well if we are talking about the worst 1080ti ever to a titan Xp Golden Sample i can believe that a game thats outputting over 100fps can gain 20 fps but the difference will not turn an unplayable game at 4k into a playable one.


True, but it may turn a 54 fps 4K game into a 60 fps one, or a 92 fps game into a 100 fps one. Depends on A) what games you are playing and on what monitor, and B) just how OCD you are.


----------



## alucardis666

Quote:


> Originally Posted by *cyenz*
> 
> Well if we are talking about the worst 1080ti ever to a titan Xp Golden Sample i can believe that a game thats outputting over 100fps can gain 20 fps but the difference will not turn an unplayable game at 4k into a playable one.


Wrong. Ghost Recon Wildlands and Mass Effect Andromeda say hello.


----------



## cyenz

Quote:


> Originally Posted by *Slackaveli*
> 
> true but it may turn a 54 fps 4k game into a 60 one. or a 92 fps game into a 100 one. Depends on A) what games you are playing and on what monitor and B) just how OCD are you.


Well, if money is not a problem, why not? I'm not saying it isn't better, only that the move from a 1080 Ti to a Titan Xp will not redefine the gaming experience.


----------



## cyenz

Quote:


> Originally Posted by *alucardis666*
> 
> Wrong. Ghost Recon Wildlands and Mass Effect Andromeda say hello.


Care to show results then, please? Wildlands has a built-in bench.


----------



## Clukos

1900MHz 1080 Ti to 2100MHz TXp is an 18% jump in theoretical performance (13.6 TFLOPS vs 16.1 TFLOPS), so it is entirely possible to see such a difference in fps.
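For anyone who wants to check the arithmetic: peak FP32 throughput is 2 ops (one fused multiply-add) per CUDA core per clock, and the public core counts are 3584 for the 1080 Ti and 3840 for the Titan Xp, so a quick sketch gives the same numbers:

```python
def fp32_tflops(cuda_cores: int, clock_mhz: float) -> float:
    # 2 FP32 ops (one fused multiply-add) per core per clock
    return 2 * cuda_cores * clock_mhz / 1e6

ti = fp32_tflops(3584, 1900)   # GTX 1080 Ti at 1900 MHz
txp = fp32_tflops(3840, 2100)  # Titan Xp at 2100 MHz
print(f"{ti:.1f} vs {txp:.1f} TFLOPS -> +{100 * (txp / ti - 1):.0f}%")
# prints "13.6 vs 16.1 TFLOPS -> +18%"
```

Theoretical TFLOPS of course only sets an upper bound; real games won't scale perfectly with it.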


----------



## alucardis666

Quote:


> Originally Posted by *cyenz*
> 
> *The move from a 1080ti to a TitanXp will not redefine the gaming experience.*


Agree in 8/10 cases. AVG FPS isn't as important as the 1% and 0.1% lows, which both Mass Effect Andromeda and Ghost Recon suffer greatly from due to poor optimization; the gap widens even more beyond 2560x1440. I was averaging 48fps in Ghost Recon Wildlands and 42fps in Mass Effect Andromeda @ 4K maxed out with the Ti; I'm now getting 56fps average in Wildlands and 50fps in Mass Effect Andromeda.








Quote:


> Originally Posted by *cyenz*
> 
> Care to show results then please? Wildlands have a built in bench.


I'd love to, except I beat the game last night and uninstalled it :-( And Comcast data caps suck.

So I don't intend to redownload it just for the benchmark.











But there's my heaven score at 2100mhz.









With my Ti @ 2012mhz I was getting 1479


----------



## Joshwaa

Slack, there is a new processor I think you will be happy about. It might be a successor to your 5775C. Check out Kaby Lake-G. HBM2 mem and stuffs.


----------



## alucardis666

Quote:


> Originally Posted by *Joshwaa*
> 
> Slack there is a new processor I think you will be happy about. Might be a successor to your 5775c. Check out the Kaby Lake-G. HBM2 mem and stuffs.


*This?* Sounds interesting!


----------



## cyenz

Quote:


> Originally Posted by *alucardis666*
> 
> Agree in 8/10 cases. AVG FPS isn't as important as 1% and .1% lows. Which both Mass Effect Andromeda and Ghost Recon Suffer greatly from due to poor optimization, the gap widens even more beyond 2460x1440. I was averaging 48fps in ghost recon Wildlands and 42 FPS in Mass Effect Andromeda @ 4k maxed out with the TI, I'm now getting 56FPS average in Wildlands and 50FPS in Mass Effect Andromeda.


More is always better, I'm not disputing that; even if it were by a single fps it would still be the better card and the one to own if money is not a problem. If I saw value in that move I would do it. I had a Titan X Maxwell, so I'm not even the best example of how to spend money, since the 980 Ti was the better value card at the time and I bought the Titan anyway.

In your case maybe I could say that; I didn't even know anyone who had a 1080 Ti that could not do at least 2000MHz. But comparing apples to apples, a 1080 Ti at 2GHz to a Titan Xp at 2GHz will not redefine the gaming experience.


----------



## alucardis666

Quote:


> Originally Posted by *cyenz*
> 
> More is always better, im not saying that, even if it was by a single fps it would still be the better card and the one to own if money is not a problem. If i saw value in that move i would do it, i had a Titan X Maxwell so im not even the best example on how to spend money since the 980ti would be a better value card at the time and i bought the Titan anyway.
> 
> In your case maybe i could say that, i didnt even know anyone that had a 1080ti that could not do at least 2000mhz, but comparing apples to apples, a 1080ti 2Ghz to a Titan Xp 2Ghz will not redefine the gaming experience.


I think that goes back to how OCD you are. A game needs to hit a minimum of 48FPS for it to feel fluid to me. I'm playing on a 4K TV, so no G-Sync or anything like that, and the fluidity with the OC'd Titan does feel significant to me over the Ti it replaced.


----------



## Tikerz

Quote:


> Originally Posted by *Mad Pistol*
> 
> I agree. Unfortunately, certain people on this forum believe that the 1080 Ti has the ability to beat the TXp on absolute performance. Honestly, for the $500 premium that Nvidia wants for the TXp, they can have their tiny performance lead. The 1080 Ti is far and away the winner on the value front.


Yeah, I'm not paying $500 more for an extra GB of memory and 5-10% more performance. That would be like Intel charging almost double the price for the 7700K over the 6700K. The Ti, in my opinion, was worth it, being 30% faster than the 1080.


----------



## ReDdRaGoN39

Those 780ti's were sweet cards. But yes, its time.


----------



## Slackaveli

Quote:


> Originally Posted by *Clukos*
> 
> 1900Mhz 1080ti to 2100 TXp is a 18% jump in theoretical performance (13.6 tflops vs 16.1 tflops) so it is entirely possible to see such difference in fps.


See, my rounded off-the-top-of-my-head 20% math is pretty damn accurate.


----------



## cyenz

Quote:


> Originally Posted by *alucardis666*
> 
> I think that goes back to how OCD you are. A game needs to hit a minimum 48FPS for it to feel fluid to me. I'm playing on a 4k TV so no gsync or anything like that and the fluidity with the OC'd Titan does feel significant to me over the Ti it replaced.


I'm in the same situation, since I game on my KS7000. Even though my OCD levels are moderate, I cannot see the value proposition in spending $500 for 10% more performance, but I'm not criticising you or anyone who thinks it's worth it.

I would probably consider SLI first before moving to the Titan Xp, but SLI is hit or miss, so I will probably wait for the next GPU that offers at least 25% more performance.


----------



## Slackaveli

Quote:


> Originally Posted by *Joshwaa*
> 
> Slack there is a new processor I think you will be happy about. Might be a successor to your 5775c. Check out the Kaby Lake-G. HBM2 mem and stuffs.


I think they are going to try it! And probably transition that to on-die Optane... omg

These here are just laptop and embedded ones, but I bet it'll come to the big dogs eventually, now that Intel has to at least try to innovate again.


----------



## alucardis666

Quote:


> Originally Posted by *cyenz*
> 
> Im in the same situation since i game on my KS7000, even though my ocd levels are moderate i cannot see the value proposition moving 500$ for 10% performance, but im not criticising you or anyone how thinks its worth it.
> 
> I would probably consider sli first before moving to the Titan Xp but SLI is an hit or miss so i will probably wait for the next GPU that offers at least 25% more performance.


I've been burned 2x with SLi so I am a firm believer in buying the best you can afford until the next gen. I'll probably keep this card till Volta TI or the next fully unlocked Titan and get one of those.


----------



## Slackaveli

Quote:


> Originally Posted by *cyenz*
> 
> Im in the same situation since i game on my KS7000, even though my ocd levels are moderate i cannot see the value proposition moving 500$ for 10% performance, but im not criticising you or anyone how thinks its worth it.
> 
> I would probably consider sli first before moving to the Titan Xp but SLI is an hit or miss so i will probably wait for the next GPU that offers at least 25% more performance.


Same boat, I'm on a JS9000 and I feel y'all. But in alucardis' case: 1) he had the money, and 2) he was not happy with his Ti and was already plotting his next move on that front when the Titan dropped, so I'm just glad he's happy. But I also will be keeping my Aorus!


----------



## JedixJarf

Quote:


> Originally Posted by *alucardis666*
> 
> *This?* Sounds interesting!


Yes, new design is gonna be awesome.


----------



## Slackaveli

Quote:


> Originally Posted by *JedixJarf*
> 
> Yes, new design is gonna be awesome.



Yeah, I'm hanging on to the 5775c until these come to the big dogs, but seriously, I bet they do this type of thing with Optane eventually and it will SMOKE everything.


----------



## Mad Pistol

Quote:


> Originally Posted by *alucardis666*
> 
> Agree in 8/10 cases. AVG FPS isn't as important as 1% and .1% lows, which both Mass Effect Andromeda and Ghost Recon suffer greatly from due to poor optimization; the gap widens even more beyond 2560x1440. I was averaging 48 FPS in Ghost Recon Wildlands and 42 FPS in Mass Effect Andromeda @ 4K maxed out with the Ti; I'm now getting 56 FPS average in Wildlands and 50 FPS in Mass Effect Andromeda.
> 
> I'd love to, except I beat the game last night and uninstalled it :-( And Comcast data caps suck.
> 
> So I don't intend to redownload it just for the benchmark.
> 
> But there's my Heaven score at 2100MHz.
> 
> With my Ti @ 2012MHz I was getting 1479


I think it's interesting how my 1070s in SLI are able to match a TXp in benchmarks (at least in Heaven), especially considering that I have basically the same core count as a TXp (1920 x 2 = 3840). I would assume that the TXp would be far and away superior, since SLI does not scale perfectly.


----------



## Slackaveli

Quote:


> Originally Posted by *Mad Pistol*
> 
> I think it's interesting how my 1070's in SLI are able to match a TXp in benchmarks (at least in Heaven), especially considering that I have basically the same core count as a TXp (1920 x 2 = 3840). I would assume that the TXp would be far an away superior since SLI does not scale perfectly.


It scales at 95% in Heaven, I've heard. That looks more like 100% scaling.
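The core-count intuition can be put to rough numbers. A sketch, treating theoretical throughput (cores times clock) as the whole story, which it isn't, and using illustrative round-number clocks rather than anyone's measured boost:

```python
# Rough check: two 1070s at ~95% SLI scaling vs one Titan Xp,
# using theoretical TFLOPS (cores * clock * 2 FLOPs/cycle) as the yardstick.
def tflops(cores, ghz):
    return cores * ghz * 2 / 1000

one_txp  = tflops(3840, 2.0)             # one card, 3840 cores
sli_1070 = 2 * tflops(1920, 2.0) * 0.95  # 2 x 1920 cores at 95% scaling
print(sli_1070 / one_txp)  # ~0.95: near parity if the cards clock alike
```

Which is why matching a TXp in a well-scaling benchmark like Heaven isn't shocking; in games with poor or absent SLI profiles the picture is very different.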


----------



## alucardis666

Quote:


> Originally Posted by *Mad Pistol*
> 
> I think it's interesting how my 1070's in SLI are able to match a TXp in benchmarks (at least in Heaven), especially considering that I have basically the same core count as a TXp (1920 x 2 = 3840). I would assume that the TXp would be far an away superior since SLI does not scale perfectly.


So return my Titan Xp and get 2 1070 or 1080's in Sli?


----------



## Silent Scone

Plus, nobody really cares about Heaven performance as an absolute


----------



## Baasha

I know this isn't the right thread but nobody responded to the other one I created...

Last night I noticed that my Noctua iPPC PWM 3000RPM fans are NOT spinning - they are mounted on the radiator for the Kraken X62.

I am really concerned about this but what baffles my mind is that I was pumping 1.390V into the CPU and OC'ing the hell out of it - stress tests, games etc. and the CPU never broke 70C.

I changed the profile in CAM to 'Performance Mode' and let it start on logon to make sure it stays in that mode but still, the fans are not spinning.

Can someone/anyone PLEASE HELP me with this?

One of the fans is connected to the 4-pin PWM connection that comes with the Kraken X62 and the 2nd one is connected to the 'slave' connector which has only 3 pins.

The Kraken X62 comes with a 1-master & 3-slave fan connections.

EDIT: pics:


----------



## jsarver

so what would you guys do and why?

2 1080ti ftw 3 from evga in sli

1 Titan xp probably on water.

feedback very welcome

coming from 980ti's reference in sli.


----------



## alucardis666

Quote:


> Originally Posted by *jsarver*
> 
> so what would you guys do and why?
> 
> 2 1080ti ftw 3 from evga in sli
> 
> 1 Titan xp probably on water.
> 
> feedback very welcome
> 
> coming from 980ti's reference in sli.


2 1080Ti's would be ideal, *BUT* that would put you at the mercy of SLI implementation in games: scaling, and whether it's even supported.
It would also cost you ~$200-250 more. Worst case scenario, you play a game without SLI support and you have a $750 brick sitting in your rig, sucking up power and contributing no performance at all.

*I think* the TXp on water or with a hybrid cooler is the best bang for your buck, assuming you can shell out $1300 USD for it. If not, get one 1080Ti with an extreme OC like the Aorus and be happy for ~$775.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Baasha*
> 
> I know this isn't the right thread but nobody responded to the other one I created...
> 
> Last night I noticed that my Noctua iPPC PWM 3000RPM fans are NOT spinning - they are mounted on the radiator for the Kraken X62.
> 
> I am really concerned about this but what baffles my mind is that I was pumping 1.390V into the CPU and OC'ing the hell out of it - stress tests, games etc. and the CPU never broke 70C.
> 
> I changed the profile in CAM to 'Performance Mode' and let it start on logon to make sure it stays in that mode but still, the fans are not spinning.
> 
> Can someone/anyone PLEASE HELP me with this?
> 
> One of the fans is connected to the 4-pin PWM connection that comes with the Kraken X62 and the 2nd one is connected to the 'slave' connector which has only 3 pins.
> 
> The Kraken X62 comes with a 1-master & 3-slave fan connections.
> 
> EDIT: pics:


Too off topic man. Make a thread in one of the cpu threads.

But my advice is to connect fans into your mobo and control it yourself. Screw cam.


----------



## Mad Pistol

Quote:


> Originally Posted by *alucardis666*
> 
> So return my Titan Xp and get 2 1070 or 1080's in Sli?


Hahahahahaha.... no









1 single fast card > 2 somewhat fast cards... always.


----------



## jsarver

Quote:


> Originally Posted by *alucardis666*
> 
> 2 1080Ti's would be ideal, *BUT* that would put you at the mercy of SLi implementation in games; scaling and whether it's even supported.
> Would also cost you ~$200-250 more. Worse case scenario you play a game without SLI support and you have a $750 brick sitting in your rig sucking up power and not contributing any performance at all.
> 
> *I think* the TXp on water or with a hybrid cooler is the best bang for your buck assuming you can shell out $1300 USD for it. If not get 1 1080Ti with an extreme oc like the Aurous and be happy for ~$775.


thanks man. any further opinions always welcome. luckily i have a few weeks before the ftw3 even ships to make up my mind


----------



## KingEngineRevUp

So I wanted to share some GPU-Z graphs. Perhaps the ASUS BIOS truly is helping: as you can see in the perf cap graph, the stock BIOS is hitting power limits (green) and GPU utilization limits (grey) more often than the MSI and ASUS BIOS.

This is Witcher 3, which is hovering around 95 FPS average at 1440p on Ultra with HairWorks. *This game draws A LOT of power when playing.*

Anyone here with the shunt mod and GPU-Z graphs? I think the shunt mod might be the only way to truly get past the power limits of the FE cards.


----------



## Baasha

Quote:


> Originally Posted by *SlimJ87D*
> 
> Too off topic man. Make a thread in one of the cpu threads.
> 
> But my advice is to connect fans into your mobo and control it yourself. Screw cam.


I did make a thread but nobody responded.

Anyway, I tried plugging one fan into the CPU_FAN header on the mobo directly, and IT STILL DOESN'T SPIN!


----------



## Bishop07764

Quote:


> Originally Posted by *alucardis666*
> 
> Agree in 8/10 cases. AVG FPS isn't as important as 1% and .1% lows, which both Mass Effect Andromeda and Ghost Recon suffer greatly from due to poor optimization; the gap widens even more beyond 2560x1440. I was averaging 48 FPS in Ghost Recon Wildlands and 42 FPS in Mass Effect Andromeda @ 4K maxed out with the Ti; I'm now getting 56 FPS average in Wildlands and 50 FPS in Mass Effect Andromeda.
> 
> I'd love to, except I beat the game last night and uninstalled it :-( And Comcast data caps suck.
> 
> But there's my Heaven score at 2100MHz.


Impressive. Great clocks. Are those on the reference cooler or are you water-cooled? If it's on the gosh-awful reference cooler, that is truly impressive. Glad that you are seeing gains in things that really matter: games.


----------



## Bishop07764

Quote:


> Originally Posted by *Baasha*
> 
> I did make a thread but nobody responded.
> 
> Anyway, I tried plugging in one fan to teh CPU_FAN header on the mobo directly, and IT STILL DOESN'T SPIN!


Sorry man. Have you tried plugging it into another header to see if those work? If you are using any type of splitter or reducer, I would take those off.


----------



## KingEngineRevUp

I still consider myself an overclocking newbie, but I wanted to share.
Quote:


> Originally Posted by *Baasha*
> 
> I did make a thread but nobody responded.
> 
> Anyway, I tried plugging in one fan to teh CPU_FAN header on the mobo directly, and IT STILL DOESN'T SPIN!


Your fan just sounds like it's dead, man. If you're plugging it into other headers and it won't spin even at startup, it's just dead.


----------



## jsarver

Any of you guys running aftermarket air coolers in SLI with these cards or recent Pascal? How is the heat output into the case? See any issues with other temps rising?


----------



## Slackaveli

I'm #1 in the world with the Broadwell i7-5775C in Firestrike Ultra.

Somehow finally beat the guy with two Titan Xs in SLI in 2nd place.

http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpu/fs/R/2007/7431?minScore=6487&cpuName=Intel Core i7-5775C

#1 in Timespy for this cpu, too, if limited to 1 gpu.
http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpu/spy/P/2007/500000?minScore=0&cpuName=Intel Core i7-5775C

Bang, another one bites the dust!!!! http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpugpu/fs/X/2007/1127/14399?minScore=12800&cpuName=Intel Core i7-5775C&gpuName=NVIDIA GeForce GTX 1080 Ti

THIS IS FUN!!!!


----------



## Exilon

Quote:


> Originally Posted by *SlimJ87D*
> 
> 
> 
> So I wanted to share some GPU-Z graphs. Perhaps the ASUS Bios truly is helping as you can see in the perf cap graph, the stock bios is hitting power limits (green) and GPU Utilization limits (grey) more often than the MSI and ASUS bios.
> 
> This is Witcher 3 which is hovering around 95 FPS average at 1440P on Ultra with Hair works. *This game draws A LOT of power when playing.*
> 
> Anyone here with the shunt mod and GPU-Z graphs? I think shunt mod might be the only way to truly get over the power limits of the FE cards.


Shunt mod reporting in. No GPU-Z graphs or Witcher 3, but I'm hitting 350-380W (a guess, using a wall meter) at 100% TDP. I put more thermal pads on the backside of the PCB because the VRM caps and inductors still get really hot.



This is in Fire Strike Extreme Stress Test, so it's pretty much worst case.
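For anyone wondering why the card still reads "100% TDP" while pulling 350W+ at the wall: the board infers current from the voltage drop across its shunt resistors, so lowering the effective shunt resistance makes it under-read power. A sketch of the arithmetic; the 5 mOhm figure and the post-mod value here are illustrative assumptions, not measured values for this card:

```python
# Power sensing: the controller measures the voltage drop across a shunt
# and infers current as V / R. Lowering the effective R makes it under-read.
def reported_power(actual_watts: float, r_original: float, r_effective: float) -> float:
    # The sensed current (and thus power) scales by r_effective / r_original.
    return actual_watts * r_effective / r_original

# e.g. paralleling extra conductance that drops a 5 mOhm shunt to ~3.5 mOhm:
print(reported_power(357, 0.005, 0.0035))  # card "sees" ~250 W, i.e. 100% TDP
```

That ratio is consistent with the 350-380W wall readings above on a 250W-rated FE card.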


----------



## mshagg

That is impressive.

I've noticed Graphics Test 2 in Timespy tends to send my card running down through the bins more than any of the other benches I've been playing with. I can get it to dance in the 2100/1.093 bin in 3DMark runs, but under actual heavy gaming loads this card will freeze at 2075 after a few minutes. I keep falling back to old faithful, 2062/1.031, for actual gaming.

I wonder if the changing of frequency/voltage has an impact on stability. i.e. a card might be better off staying at 2100/1.093 rather than trying to back off to lower voltage bins due to the stock power limit.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Exilon*
> 
> Shunt mod reporting in. No GPU-Z graphs or Witcher 3 but I'm hitting 350-380W (guessing using wall meter) at 100% TDP. I put more thermal pads on the backside of the PCB because the VRM caps and inductors still get really hot.
> 
> 
> 
> This is in Fire Strike Extreme Stress Test, so it's pretty much worst case.


Wow, that's great! I might consider the shunt mod one day, but it would only net me 0.5% performance, lol. Maybe not worth it.

I guess the only thing limiting you is VRel. Not sure if that's critical at all, since our voltage is locked to 1.092 max.


----------



## stranger451

Quote:


> Originally Posted by *alucardis666*
> 
> I've been burned 2x with SLi so I am a firm believer in buying the best you can afford until the next gen. I'll probably keep this card till Volta TI or the next fully unlocked Titan and get one of those.


It depends on what resolution you're running. Years ago, 7680x1440 was only possible with two GTX Titans/780 Tis if you even wanted to attempt to hit 60 fps. Most modern games can be played with a single GTX 1080 Ti now, though there are still plenty of games that require two GTX 1080 Tis just to hit 60 fps at Nvidia Surround resolutions. Once the 4K 144Hz monitors hit the market, enthusiasts are going to have to adopt SLI to have any semblance of hope of reaching the required fps. Personally, I have two FTW3s on preorder, which gives me pause since I replaced my 7680x1440 setup with an LG C6 that's only 4K. I could probably get by with a single 1080 Ti, but there are plenty of games out there that will still require lowering settings to hit the magic 60 fps number.
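The resolution argument is easy to quantify: triple-1440p surround actually pushes more pixels per frame than a single 4K panel, which is why it demanded SLI earlier.

```python
# Pixel-throughput comparison: triple-1440p surround vs a single 4K panel.
surround = 7680 * 1440  # 11,059,200 pixels per frame
uhd_4k   = 3840 * 2160  #  8,294,400 pixels per frame
print(surround / uhd_4k)  # ~1.33: surround pushes ~33% more pixels than 4K
```

So dropping from surround to a 4K TV is, to a first approximation, a 25% reduction in per-frame work for the GPU.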


----------



## Exilon

Quote:


> Originally Posted by *SlimJ87D*
> 
> Wow that's great! I might consider the shunt mod one day, it would only net me 0.5% performance lol. Maybe not worth it.
> 
> I guess the only thing that is limiting you is VRel. Not sure if that's critical at all since our voltage is locked to 1.092 max.


My GPU actually loses max frequency beyond 1.075, so I capped it at that.


----------



## Baasha

Quote:


> Originally Posted by *Bishop07764*
> 
> Sorry man. Tried plugging it into another header to see if those work? If you are using any type of splitter or reducer, I would take those off.


Quote:


> Originally Posted by *SlimJ87D*
> 
> I still consider myself a overclocking noobie, but I wanted to share
> Your fan just sounds like its dead man. If you're plugging it into other headers and it wont' spin even at startup, it's just dead.


Yea - I'm thinking the fans are dead. What's odd is that there are TWO and neither one works - what are the odds of that?

I also tried using a fan controller (Lamptron FC5 V2) that has 30W/12V per channel (4-channel) and cranking that up to 12V doesn't make the fans spin either!

Definitely something wrong with the fans but it's so weird that I was able to use the PC for so long without the CPU overheating? LOL.. I was oblivious about it until last night.

Grr.. time to return these and get some replacements.


----------



## GRABibus

What would you advise me?
Go with SLI GTX 1080s, or a single GTX 1080 Ti?

Personally, I would go with a single GTX 1080 Ti, because:
- Easier to resell a GTX 1080 now than 2 GTX 1080s in a few months
- Less potential for game incompatibility
- Less power
- Less money spent in the end...

What's your opinion?


----------



## keikei

Quote:


> Originally Posted by *GRABibus*
> 
> What would you advise me ?
> Go to SLI of GTX 1080 or go to single GTX 1080Ti ?
> 
> Personnally, I would go to single GTX 1080Ti, because :
> - Easier to resale a GTX1080 now than 2 GTX1080 in some months
> - Less potential games incompatibility
> - Less power
> - Less spent money at the end...
> 
> What's you opinion ?


Single 1080Ti.


----------



## feznz

Quote:


> Originally Posted by *Mad Pistol*
> 
> Hahahahahaha.... no
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1 single fast card > 2 somewhat fast cards... always.


If money was no object:









4 fastest cards possible > 3 fastest cards possible > 2 fastest cards possible > 1 fastest card possible >> 2 somewhat fast cards... always
Quote:


> Originally Posted by *GRABibus*
> 
> What would you advise me ?
> Go to SLI of GTX 1080 or go to single GTX 1080Ti ?
> 
> Personnally, I would go to single GTX 1080Ti, because :
> - Easier to resale a GTX1080 now than 2 GTX1080 in some months
> - Less potential games incompatibility
> - Less power
> - Less spent money at the end...
> 
> What's you opinion ?


Have you got a 1080 already?
Because a 1080 Ti is absolutely smashing my LG UC99-38W (3840x1600) in any game so far.


----------



## Foxrun

Quote:


> Originally Posted by *alucardis666*
> 
> 2 1080Ti's would be ideal, *BUT* that would put you at the mercy of SLi implementation in games; scaling and whether it's even supported.
> Would also cost you ~$200-250 more. Worse case scenario you play a game without SLI support and you have a $750 brick sitting in your rig sucking up power and not contributing any performance at all.
> 
> *I think* the TXp on water or with a hybrid cooler is the best bang for your buck assuming you can shell out $1300 USD for it. If not get 1 1080Ti with an extreme oc like the Aurous and be happy for ~$775.


$1400 for two Tis would be the best bet. One Ti is pretty close in performance to an Xp, and the added benefit of another card for SLI games is a huge bonus, but only if you're gaming at 4K. Anything below 4K, a single card would be perfect.


----------



## prelude514

So I just got my hands on an Aorus non extreme after all. Afterburner only lets me bump power limit up to 125%, do I need to use Gigabyte's software to get to 150%? Or is there some other trick? Would really like to stick with Afterburner.

Cooler is absolutely fantastic BTW. Full load temps with both cards at 100% fan speed:

MSI FE 2000MHz @ 1.000v = 71c
Aorus 2012MHz @ 1.011v = 44c


----------



## Bishop07764

Quote:


> Originally Posted by *prelude514*
> 
> So I just got my hands on an Aorus non extreme after all. Afterburner only lets me bump power limit up to 125%, do I need to use Gigabyte's software to get to 150%? Or is there some other trick? Would really like to stick with Afterburner.
> 
> Cooler is absolutely fantastic BTW. Full load temps with both cards at 100% fan speed:
> 
> MSI FE 2000MHz @ 1.000v = 71c
> Aorus 2012MHz @ 1.011v = 44c


Wow. Those are amazing temps on air cooling. I bet it's a fair bit quieter to boot.


----------



## Slackaveli

Quote:


> Originally Posted by *mshagg*
> 
> That is impressive.
> 
> I've noticed Graphics Test 2 in Timespy tends to send my card running down through the bins more than any other of the benches I've been playing with. Can get it to dance in the 2100/1.093 bin in 3DMark runs but actual heavy gaming loads this card will freeze at 2075 after a few minutes. Keep falling back to old faithful 2062/1.031 for actual gaming.
> 
> I wonder if the changing of frequency/voltage has an impact on stability. i.e. a card might be better off staying at 2100/1.093 rather than trying to back off to lower voltage bins due to the stock power limit.


CTRL+L on your target voltage to lock it.


----------



## Slackaveli

Ok, EUREKA!!! I opened the door of my case, put a box fan in as exhaust, and set the fans on the GPU to 100% (hadn't tried either yet, wth, Ken), and I can now pass a full Timespy run with max temps at 49c... ON FREAKING AIR. House AC is turned down a bit too; my kids are beachin' about it, lol. 68F, it's usually 72.

new high in timespy, too.

http://www.3dmark.com/3dm/19171995?


----------



## prelude514

Quote:


> Originally Posted by *Bishop07764*
> 
> Wow. Those are amazing temps on air cooling. I bet it's a fair bit quieter to boot.


Definitely much quieter. I'm a little disappointed with the chip's quality compared to my FE, though. FE would run Witcher 3 for hours on end at 2000MHz with 0.993v, this bad boy just crashed after a minute at 1.025v. About to try 1.031v.


----------



## KingEngineRevUp

If you get a partner card that can OC to 2100 MHz+
Quote:


> Originally Posted by *Slackaveli*
> 
> Ok, EUREKA!!! I opened the door of my case and put a box fan as exhaust and fans on gpu at 100% (hadnt tried either yet, wth, Ken), and I can now pass a full Timespy run with max temps at 49c.... ON FREAKING AIR. House AC is turned down a bit, too, my kids are beeachin about it lol. 68f, it's usually 72.
> 
> new high in timespy, too.
> 
> http://www.3dmark.com/3dm/19171995?


Can you share your GPU-Z screenshots next time of the entire run? Thanks.


----------



## Slackaveli

Quote:


> Originally Posted by *Bishop07764*
> 
> Wow. Those are amazing temps on air cooling. I bet it's a fair bit quieter to boot.


yeah, man, love mine. i never see pink or green on perfcap anymore (except a blip here and there of green in FS ultra and TS)


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> If you get a partner card that can OC to 2100 MHz+
> Can you share your GPU-Z screenshots next time of the entire run? Thanks.


sure, i had the ticker on too fast last time. i have it on 2.5 second polling now. brb.

http://gpuz.techpowerup.com/17/04/10/uw5.png

damn, i forgot to hit "max" on the temp, i was trying to use snip to show everything and it wasn't taking so i snapshotted before it cycled off, sorry. But you can still tell, it hit 50c for a second that time.

Pretty damn salty for air.
But it's a benchmark air set-up right now. Those temps are 10c better, if not 12c better than i run with the door on and no giant fan exhaust lol. 62c is fine for gaming but for benching one must do what one must do.

Also, as y'all can see, the 125% power limit is plenty on this card. It held its 2076 throughout.


----------



## Bishop07764

Quote:


> Originally Posted by *Slackaveli*
> 
> Ok, EUREKA!!! I opened the door of my case and put a box fan as exhaust and fans on gpu at 100% (hadnt tried either yet, wth, Ken), and I can now pass a full Timespy run with max temps at 49c.... ON FREAKING AIR. House AC is turned down a bit, too, my kids are beeachin about it lol. 68f, it's usually 72.
> 
> new high in timespy, too.
> 
> http://www.3dmark.com/3dm/19171995?


Lol. They can put a jacket on. Dad needs to do some benching.
Quote:


> Originally Posted by *prelude514*
> 
> Definitely much quieter. I'm a little disappointed with the chip's quality compared to my FE, though. FE would run Witcher 3 for hours on end at 2000MHz with 0.993v, this bad boy just crashed after a minute at 1.025v. About to try 1.031v.


The silicon lottery is real. At least the power limit is higher.


----------



## Slackaveli

Quote:


> Originally Posted by *Bishop07764*
> 
> Lol. They can put a jacket on. Dad needs to do some benching. ?
> The silicone lottery is real. At least the power limit is higher.


I said "robe", but , yup!!!

Plus these are Gauntlet-binned, so the Extreme got a lot (not all, because you can still get lucky with the Gauntlet; I did. I'm beating top advertised Extreme clocks by +20 on the core or so, not too shabby) and probably all of the truly golden chips, which in this case means maybe 25 more on the core than mine if truly golden.


----------



## prelude514

The silicon lottery is real indeed! I actually thought about getting my FE back and throwing it under water for a second, LOL. I'm never satisfied...









1.031 held up for 20 minutes or so... Time to figure out how to enable 150% power limit and reach for the stars. Will report back later!


----------



## Slackaveli

Quote:


> Originally Posted by *prelude514*
> 
> Silicone lottery is real indeed! I actually thought about getting my FE back and throwing it under water for a second LOL. I'm never satisfied...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1.031 held up for 20 minutes or so... Time to figure out how to enable 150% power limit and reach for the stars. Will report back later!


I'm sure the Extreme BIOS will do it. It was advertised online at first as both of them having 150% power, but that is wrong. If you try the BIOS, report back, of course. I am satisfied, b/c my FE was a dog and I got at least a 'silver' Aorus, lul. Although my FE could hit +700 or more on the mem before artifacting (never actually saw any; who knows how much farther it could have gone, although probably not much), and this one artifacts in Heaven at +420.









oh, here's a better pic of a full run. This time i got zero power throttles. I am at 125% and 90c temp target, no extra volts. 1.05v locked.

http://gpuz.techpowerup.com/17/04/10/apy.png

I can make it hit 2100 and hold thru a bench, but these are at a game stable setting.

**pic: im lonely hehe


----------



## AngryLobster

So this is what overclocking Nvidia GPUs has amounted to. Everyone discussing clocks within 50mhz of each other lol. Amazing.


----------



## prelude514

Nice! I'll try 2075 @ 1.05v to see if my card can swing that.

I really hope the 150% isn't only for the extreme card. I'll definitely be flashing if so.


----------



## prelude514

Quote:


> Originally Posted by *AngryLobster*
> 
> So this is what overclocking Nvidia GPUs has amounted to. Everyone discussing clocks within 50mhz of each other lol. Amazing.


Yeah, overclocking nvidia is complete garbage. I'll be jumping ship to Vega when possible, if only because AMD doesn't meddle with what I'm allowed to do with the hardware I own.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *AngryLobster*
> 
> So this is what overclocking Nvidia GPUs has amounted to. Everyone discussing clocks within 50mhz of each other lol. Amazing.


It's deeper than that.

You have power limiting, thermal throttling and voltage limiting.

We discuss the balance of all 3, not just 50 Mhz.
Quote:


> Originally Posted by *Slackaveli*
> 
> i'm sure the extreme bios will do it. it was advertised as both of them having 150% power at first online but that is wrong. If you try the bios, report back of course. I am satisfied b/c my FE was a dog and i got at least a 'silver' aorus lul. Although my FE could hit +700+ on the mem before artifacting (never actually saw any. who knows how much farther it could have gone, although probably not much) and this one artifacts in Heaven at +420.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> oh, here's a better pic of a full run. This time i got zero power throttles. I am at 125% and 90c temp target, no extra volts. 1.05v locked.
> 
> http://gpuz.techpowerup.com/17/04/10/apy.png
> 
> I can make it hit 2100 and hold thru a bench, but these are at a game stable setting.
> 
> **pic: im lonely hehe


As a refresher, what happened to the Gigabyte bios? No one was able to flash those and get them to work?


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> It's deeper than that.
> 
> You have power limiting, thermal throttling and voltage limiting.
> 
> We discuss the balance of all 3, not just 50 Mhz.
> As a refresher, what happened to the Gigabyte bios? No one was able to flash those and get them to work?


The way the power phases are set up (I suspect) causes it to run way low power and clocks on an FE. I know Giga uses 14+2 phases, but also uses low-current, high-frequency phase switching and a quadrupler in there somewhere. End result: no bueno for FEs.

I just busted a barrier (for me) of 7500 firestrike ultra. http://www.3dmark.com/3dm/19172743?

I'm done benching. I have the top score for my CPU in all four main Firestrike tests now, lol. That's a wrap.

For reference, here is THE HIGHEST single-Ti, 4-core-CPU build's score in the world. I am only 300 points behind. Dang. No catching that, I don't guess, but I take solace in the fact we are both the same percentage vs the world.
http://www.3dmark.com/fs/12261205


----------



## AngryLobster

That's reaching pretty hard. None of those 3 aspects are even "limits" on a partner card. Sounds like trying to artificially make OCing these fun again when in fact you're either around 2050mhz or lucky and at or slightly above 2100mhz.

Have fun.


----------



## prelude514

So yeah... Looks like 150% is reserved only for Extreme cards. Time to flash this GPU.


----------



## Slackaveli

Quote:


> Originally Posted by *prelude514*
> 
> So yeah... Looks like 150% is reserved only for Extreme cards. Time to flash this GPU.


I bet it works fine. You may not need the whole 150% anyway. The only time I power throttle (I'm always monitoring), I see it's like 130%, 129%, 131%. I am pretty sure we'd be gucci with 135%.

Please post the extreme bios when you find it.


----------



## Slackaveli

Quote:


> Originally Posted by *AngryLobster*
> 
> That's reaching pretty hard. None of those 3 aspects are even "limits" on a partner card. Sounds like trying to artificially make OCing these fun again when in fact you're either around 2050mhz or lucky and at or slightly above 2100mhz.
> 
> Have fun.


You're right, but you're also poo-pooing what we have left, which admittedly isn't much. We have had to expand the topic a bit beyond just overclocking, that's for sure.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *AngryLobster*
> 
> That's reaching pretty hard. None of those 3 aspects are even "limits" on a partner card. Sounds like trying to artificially make OCing these fun again when in fact you're either around 2050mhz or lucky and at or slightly above 2100mhz.
> 
> Have fun.


In a nice way, you're trying to tell us we are wasting our time. But without this community, the following would not have been discovered:

1. A voltage curve can help you with stability. People have gotten past 2000 MHz to 2088 MHz or even more.
2. Flashing the ASUS BIOS gives you an additional 30 watts. This has helped eliminate 75%+ of the power limiting you would experience in a 1-hour game session.

And yeah, one day we might find a BIOS that gives us 50 or 75 watts. That will be number 3.

If I did the shunt mod, I could be playing at 2114-2128 MHz. That would be ~15% extra performance over stock clocks at 1850 MHz. But I don't want to do the shunt mod, and hopefully a BIOS is discovered.
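That ~15% figure checks out as a clock-ratio upper bound: for a fully GPU-bound load, performance scales at most linearly with core clock.

```python
# Sanity check on the "~15% over stock" estimate: best-case gain is the
# ratio of target clock to stock clock, assuming a purely GPU-bound load.
stock = 1850  # MHz, the quoted stock boost clock
for shunt_clock in (2114, 2128):
    gain = (shunt_clock / stock - 1) * 100
    print(f"{shunt_clock} MHz: +{gain:.1f}%")  # ~14.3% and ~15.0%
```

Memory bandwidth and power limits would eat into that in practice, so 15% is the ceiling, not a promise.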

So yeah, **** please


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> i bet it works fine. may not need the whole 150% anyway. Only time i power throttle (i always am monitoring) i see it's like 130%, 129%, 131%. I am pretty sure we'd be gucci with 135%.
> 
> Please post the extreme bios when you find it.


Yeah, I just tested the Gigabyte Extreme bios and no bueno like you said. Won't go past 0.7V.

Awesome scores BTW.

I think I've exhausted my research and mastering of OCing this card. Hopefully something else comes up later. Time to play games


----------



## rakesh27

Sorry to hijack this thread.

Anyways, I already have a Zotac 1080 AMP Edition with a Kraken G10 hybrid kit, and as I found an EVGA FE 1080 Ti for $699, I made the purchase... I know the Titan Xp has just been launched, and I don't think I wanna pay for a $1200 card again.

Just wondering: I play at 1440p @ 144Hz, and once I get my EVGA FE 1080 Ti I'm gonna move the Kraken G10 hybrid kit onto it from my Zotac AMP 1080. Will I be able to comfortably play at 4K virtual resolution on my monitor?

I hope I haven't made a bad purchase... I'm thinking do the mod on the 1080 Ti, then restore the original cooler on my Zotac AMP 1080 and sell it; the Zotac AMP 1080 is a very good card.

Like I said, sometimes (well, mostly) I play games at 1440p @ 144Hz all maxed out, and sometimes at 4K virtual resolution in some games, so long as I can hit 60Hz constantly. Can the 1080 Ti do this?

What do you all think...

Thanks all..


----------



## mshagg

Quote:


> Originally Posted by *prelude514*
> 
> Yeah, overclocking nvidia is complete garbage. I'll be jumping ship to Vega when possible, if only because AMD doesn't meddle with what I'm allowed to do with the hardware I own.


They sold us a card that boosts itself to its rated power draw on its own. I honestly don't see the problem.

They could sell us cards that actually did only boost up to the advertised clock speeds out of the box and we'd all be ****ting ourselves over 300+Mhz overclocks.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *mshagg*
> 
> They sold us a card that boosts itself to its rated power draw on its own. I honestly don't see the problem.
> 
> They could sell us cards that actually did only boost up to the advertised clock speeds out of the box and we'd all be ****ting ourselves over 300+Mhz overclocks.


Agreed! People forget that Boost technology is literally an OC technology for us.


----------



## Tikerz

Quote:


> Originally Posted by *Slackaveli*
> 
> i'm sure the extreme bios will do it. it was advertised as both of them having 150% power at first online but that is wrong. If you try the bios, report back of course. I am satisfied b/c my FE was a dog and i got at least a 'silver' aorus lul. Although my FE could hit +700+ on the mem before artifacting (never actually saw any. who knows how much farther it could have gone, although probably not much) and this one artifacts in Heaven at +420.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> oh, here's a better pic of a full run. This time i got zero power throttles. I am at 125% and 90c temp target, no extra volts. 1.05v locked.
> 
> http://gpuz.techpowerup.com/17/04/10/apy.png


There must be something wrong with the GPU-Z reading; the perfcap (Util) and TDP all point to the card being idle. When you say locked at 1.05 V, do you mean your voltage curve is flat?


----------



## prelude514

Quote:


> Originally Posted by *Slackaveli*
> 
> i bet it works fine. may not need the whole 150% anyway. Only time i power throttle (i always am monitoring) i see it's like 130%, 129%, 131%. I am pretty sure we'd be gucci with 135%.
> 
> Please post the extreme bios when you find it.


Just flashed the Extreme F4 beta BIOS from here without issue: http://www.gigabyte.us/Graphics-Card/GV-N108TAORUS-X-11GD#support-dl

But power limit was still 125% and default memory speed was still non-Extreme.

So I flashed this guy: https://www.techpowerup.com/vgabios/190959/gigabyte-gtx1080ti-11264-170331

Following the method here: http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe

And my card is now an Extreme.
















+150 now available in AB, and default memory clock is now 1404MHz instead of 1376MHz.

Time to beat this card up a bit!


----------



## prelude514

Quote:


> Originally Posted by *mshagg*
> 
> They sold us a card that boosts itself to its rated power draw on its own. I honestly don't see the problem.
> 
> They could sell us cards that actually did only boost up to the advertised clock speeds out of the box and we'd all be ****ting ourselves over 300+Mhz overclocks.


The point of being an overclocker is to go past rated power draw and push to the extreme. Proper overclocking of NVIDIA hardware died with Fermi, unless you're willing to physically modify your card.
Quote:


> Originally Posted by *SlimJ87D*
> 
> Agreed! People forget that Boost technology is literally a OC technology for us.


If someone is the type that enjoys pressing the automatic overclock button on a motherboard and living with that, that's fine and dandy. Most real overclockers will never touch that button, and don't want a company telling them what they can or cannot do with their hardware. I'm betting most people on this forum fall into that category.

Let me burn my hardware if I want to.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *prelude514*
> 
> Just flashed the Extreme F4 beta BIOS from here without issue: http://www.gigabyte.us/Graphics-Card/GV-N108TAORUS-X-11GD#support-dl
> 
> But power limit was still 125% and default memory speed was still non-Extreme.
> 
> So I flashed this guy: https://www.techpowerup.com/vgabios/190959/gigabyte-gtx1080ti-11264-170331
> 
> Following the method here: http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe
> 
> And my card is now an Extreme.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> +150 now available in AB, and default memory clock is now 1404MHz instead of 1376MHz.
> 
> Time to beat this card up a bit!


Really? Are you on an FE? I flashed that about an hour ago and it was no good. *Voltage never goes past 0.7V.*

Edit: BTW, the F4 beta might actually be drawing more power even though it still says 125%. Did you actually run a bench on it? If it works out, let me know.


----------



## prelude514

Quote:


> Originally Posted by *SlimJ87D*
> 
> Really? Are you on a FE? I flashed that like 1 hour ago and it was no good. *Voltage never goes past 0.7V.*
> 
> Edit: BTW the F4 beta actually might be drawing more power but it just says 125%, did you actually run a bench on it? If it works out, let me know.


No, I traded in my FE today for an Aorus non-Extreme, which is limited to 125% vs 150% on the Extreme. Sorry for the confusion!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *prelude514*
> 
> No, traded in my FE today for a Aorus non Extreme, which is limited to 125% vs 150% on the Extreme. Sorry for the confusion!


Ahhh, okay that makes sense.


----------



## alucardis666

Quote:


> Originally Posted by *prelude514*
> 
> No, traded in my FE today for a Aorus non Extreme, which is limited to 125% vs 150% on the Extreme. Sorry for the confusion!


Extreme or bust I'd say


----------



## prelude514

Quote:


> Originally Posted by *prelude514*
> 
> Just flashed the Extreme F4 beta BIOS from here without issue: http://www.gigabyte.us/Graphics-Card/GV-N108TAORUS-X-11GD#support-dl
> 
> But power limit was still 125% and default memory speed was still non-Extreme.
> 
> So I flashed this guy: https://www.techpowerup.com/vgabios/190959/gigabyte-gtx1080ti-11264-170331
> 
> Following the method here: http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe
> 
> And my card is now an Extreme.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> +150 now available in AB, and default memory clock is now 1404MHz instead of 1376MHz.
> 
> Time to beat this card up a bit!


Quote:


> Originally Posted by *alucardis666*
> 
> Extreme or bust I'd say


Would've gone Extreme, but they only received 2 and those sold before I got there. They got 4 non-Extremes, and I got the last one. I couldn't wait any longer, or I'd not have been able to trade my FE in for it, and I honestly didn't mind, since the GPUs are exactly the same minus one RGB LED and different stock clocks. I didn't realize Gigabyte had gimped the power limit on the non-Extreme, though. But that's not an issue anymore.


----------



## alucardis666

Quote:


> Originally Posted by *prelude514*
> 
> Would've gone Extreme, but they only received 2 and they sold before I got there. They got 4 non Extremes, and I got the last one. Couldn't wait any longer, or I'd not have been able to trade my FE in for it, and I honestly didn't mind since the GPUs are exactly the same minus 1 RGB LED and different stock clocks. Didn't realize that Gigabyte had gimped the power limit on the non Extreme, though. But that's not an issue anymore.


Gotcha, enjoy the new card


----------



## Slackaveli

Quote:


> Originally Posted by *Tikerz*
> 
> There must be something wrong with the GPU-Z reading. The perfcap (Util) and TDP all point to the card being idle? When you mean locked at 1.05V do you mean your voltage curve is flat?


Ctrl+L on the exact curve point of 1.05 V; everything to the right of that is flat, yeah.

Here's my pic after 3 hours straight of Battlefield 1. It NEVER left 2050, and I mean never.
http://gpuz.techpowerup.com/17/04/10/z5y.png
Quote:


> Originally Posted by *prelude514*
> 
> Just flashed the Extreme F4 beta BIOS from here without issue: http://www.gigabyte.us/Graphics-Card/GV-N108TAORUS-X-11GD#support-dl
> 
> But power limit was still 125% and default memory speed was still non-Extreme.
> 
> So I flashed this guy: https://www.techpowerup.com/vgabios/190959/gigabyte-gtx1080ti-11264-170331
> 
> Following the method here: http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe
> 
> And my card is now an Extreme.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> +150 now available in AB, and default memory clock is now 1404MHz instead of 1376MHz.
> 
> Time to beat this card up a bit!


OH SNAP.


----------



## alucardis666

Posted this in the 2017 TITAN Xp thread, but I had a revelation that I think is significant and can help you all with the Ti, *or any Pascal card really*.

*Figured it out!*

You want to adjust your curve points in increments or multiples of *13 MHz* at a time. You'll see offset values of *13, 26, 39, 52, 65, 78, 91, 104, 117, 130... etc.*

When testing a point on your curve, lock it in by pressing *CTRL + L* in the Afterburner *voltage/frequency curve editor*, open GPU-Z, go to the Sensors tab, run Heaven in benchmark mode or simply on a loop, and see what perf caps you're hitting. You want it to just say Utilization, meaning there's nothing else limiting your GPU: no bad clocks or insufficient voltages.

*EXAMPLE*



I'm still setting up my curve, but I will post back my values for each step once I'm locked in and fully stable with no perf caps.









Hope this helps you all and is clear.
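Since dialing in a curve is a repeated trial-and-error loop, here's a minimal Python sketch of the 13 MHz granularity described above. The helper names (`snap_offset`, `offset_grid`) are hypothetical, and the ~13 MHz bin size comes from the post, not from any official spec:

```python
# Pascal applies clock offsets in ~13 MHz bins (per the post above), so a
# requested offset is effectively snapped to the nearest multiple of 13.

STEP_MHZ = 13

def snap_offset(requested_mhz: int, step: int = STEP_MHZ) -> int:
    """Snap a requested core-clock offset to the nearest 13 MHz bin."""
    return round(requested_mhz / step) * step

def offset_grid(max_mhz: int, step: int = STEP_MHZ):
    """List the distinct usable offset values up to max_mhz."""
    return list(range(step, max_mhz + 1, step))

print(offset_grid(130))  # [13, 26, 39, 52, 65, 78, 91, 104, 117, 130]
print(snap_offset(120))  # 117 -- asking for +120 really tests +117
```

In practice you'd still verify each locked point against GPU-Z's perfcap readout; the snapping just tells you which offsets are actually distinct test points.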


----------



## Nico67

Quote:


> Originally Posted by *SlimJ87D*
> 
> In a nice way, you're trying to tell us we are wasting our time. But without this community the following would have not been discovered.
> 
> 1. A voltage curve can help you with stability. People have gotten past 2000 Mhz to get to 2088Mhz or even more.
> 2. Flashing the ASUS Bios gives you an additional 30 Watts. This has helped eliminate 75%+ of the power limiting you would experience in 1 hour game session.
> 
> And yeah, one day we might find a bios that gives us 50 Watts or 75 Watts. That will be number 3.
> 
> If I did the shunt mod, I could be playing at 2114-2128 Mhz. That would be 15% extra performance over stock clocks at 1850 Mhz. But I don't want to do the shunt mod, and hopefully a bios is discovered.
> 
> So yeah, **** please


Yeah, a shunt mod on the original BIOS is likely the best way to guarantee you aren't power limited, but I was curious whether some of the aftermarket-cooled reference cards, like a couple of the EVGAs, might get an FE-style BIOS with higher limits. Although I think it's unlikely.

For your previous BIOS comparisons in Witcher 3, what was the max power on each? Assuming they pull similar power, that should give a good indication of how much the ASUS or MSI BIOSes actually give the FE.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Nico67*
> 
> Yeah shunt mod on the original bios is likely the best way to guarantee you aren't power limited, but I was curious if some of the aftermarket cooled reference cards like a couple of the EVGA's might get a FE bios with higher limits. Although I think its unlikely.
> 
> For your previous bios comparisons on Witcher 3, what was the max power on each, assuming they are pulling similar power they should give a good indication of how much the asus or msi bioses actually give the FE


I honestly don't know... I don't think flashing BIOSes does anything at all. Scores are always worse on the ASUS BIOS for me.

I think the key for the FE, and even the Titan Xp, is to run at the lowest possible voltage at the highest clocks.

2088 MHz @ 1.043 V below, on the stock BIOS



ASUS BIOS: 2114 MHz @ 1.083 V
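To see why "lowest voltage that holds the clock" pays off, here's a rough first-order sketch (my own illustration, not a measurement): dynamic power scales roughly with f x V^2, so comparing the two operating points quoted above:

```python
# First-order check of the voltage/clock trade-off: dynamic power scales
# roughly with f * V^2, so a small voltage bump costs far more power
# headroom than the clock it buys. Numbers taken from the post above.

def rel_dynamic_power(f_mhz: float, v: float, f0_mhz: float, v0: float) -> float:
    """Power of (f, V) relative to a baseline (f0, V0), assuming P ~ f * V^2."""
    return (f_mhz / f0_mhz) * (v / v0) ** 2

# ASUS-BIOS point vs stock-BIOS point
extra = rel_dynamic_power(2114, 1.083, 2088, 1.043)
print(f"~{(extra - 1) * 100:.0f}% more power for "
      f"~{(2114 / 2088 - 1) * 100:.1f}% more clock")
# ~9% more power for ~1.2% more clock
```

Under a fixed power cap, that ~9% extra draw is headroom you no longer have, which is exactly why the higher-voltage point throttles first.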


----------



## Nico67

Quote:


> Originally Posted by *SlimJ87D*
> 
> I don't know honestly... I don't think flashing Bios do anything at all. Scores are always worse on the ASUS bios for me.
> 
> I think the key for the FE and even the Titan XP is to run at the lowest possible voltage at the highest clocks.


I think the ASUS BIOS is working, but not perfectly. Kind of like how the Aorus BIOS has real problems, the ASUS has minimal problems, but I really don't think it's giving 330 W; more like 305. If it were 330 W, I think that extra 30 W would get rid of all the power limiting in the ASUS and MSI BIOSes in your Witcher tests, since overclocking heaps doesn't increase power that much.

1911-2139 was only 12 W
RAM was about 3-4 W

Looking at the ASUS and MSI, they both seem to drop into some lower voltage bins on the graphs, particularly when there is power limiting, and I would even go as far as saying the MSI looks best, as it only shows little spikes, even if they're more frequent.


----------



## Exilon

Something seems to be up with the ASUS firmware's memory settings. I've been seeing people with lower memory clocks compared to stock FE firmware. Maybe timings?


----------



## Nico67

Quote:


> Originally Posted by *Exilon*
> 
> Something seems to be up with the ASUS firmware's memory settings. I've been seeing people with lower memory clocks compared to stock FE firmware. Maybe timings?


They probably are different, but I think there are other differences too. Maybe the PCIe power and the 8-pin power share the same limits, and only the 8-pin that replaces our 6-pin is increased, hence the higher total. The real issue may be that the way it allocates phase power means we don't get the full benefit of that 8-pin increase, and it then hits, say, the PCIe power limit instead. I could be completely wrong, but until I see an ASUS BIOS pulling more than 111%, I think it's a bit suspicious.

Also, unlike the days of BIOS switches on graphics cards, OC mode is set in GPU Tweak, I believe, so maybe that switches in something else we can't. Interestingly, Guru3D's card came with a BIOS locked to OC mode, which has a different version number; I think that one would be interesting to try.











The ASUS BIOS floating around is 86.02.39.00.23; that one ends in 26.


----------



## mshagg

Is it possible they're using different shunts to measure power draw on the partner cards, in conjunction with whatever changes are being made at the BIOS level? I'm not sure if W1zzard has been addressing those parts in his breakdowns of the PCBs.

i.e. the FE PCB:

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_Ti/images/front.jpg

The shunts are clearly labeled 5MO.

Gigabyte Extreme gaming:

https://www.techpowerup.com/reviews/Gigabyte/GTX_1080_Ti_Xtreme_Gaming/images/front.jpg

Not clear, but physically different.

Asus Strix:

https://www.techpowerup.com/reviews/ASUS/GTX_1080_Ti_Strix_OC/images/front.jpg

Not clear, but physically different.

MSI Gaming X:

https://www.techpowerup.com/reviews/MSI/GTX_1080_Ti_Gaming_X/images/front.jpg

Not clear, but physically different.


----------



## Exilon

Quote:


> Originally Posted by *mshagg*
> 
> Is it possible they're using a different shunt to measure power draw on the partner cards, in conjunction with whatever changes are being made at the BIOS level? I'm not sure if wizard has been addressing those parts in his breakdowns of the PCBs.


Good catch. The Gigabyte board looks like it uses 2 mΩ shunt resistors instead of the 5 mΩ ones on the others.

The FE uses this measurement IC:
http://www.ti.com/product/INA3221

It can be programmed by the firmware to trigger alerts based on the voltage drop sensed across the shunt, so the board firmware has to match the actual board or the power measurements will go off the rails.

It makes sense that the Gigabyte firmware doesn't work on the FE: calibrated for 2 mΩ, it interprets the drop across the FE's 5 mΩ shunts as 2.5x the real current, so the GPU thinks it's massively over its power limit.
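A quick sketch of that arithmetic (illustrative numbers, hypothetical helper name): the monitoring IC only sees a voltage drop, and the firmware converts it to current using whatever shunt value its calibration assumes, so a mismatched BIOS mis-scales every power reading.

```python
# Why a BIOS calibrated for a different shunt value misbehaves: the shunt
# monitor reports only a voltage drop; firmware converts it to current
# using the shunt resistance it *assumes* is on the board.
# V_drop = I * R_board, then I_reported = V_drop / R_assumed.

def reported_current(actual_amps: float, r_board_mohm: float,
                     r_assumed_mohm: float) -> float:
    """Current the firmware reports, given the board's real shunt value
    versus the value its calibration assumes."""
    v_drop_mv = actual_amps * r_board_mohm  # drop across the real shunt
    return v_drop_mv / r_assumed_mohm

# Gigabyte BIOS (assumes 2 mOhm) flashed onto an FE board (5 mOhm shunts):
print(reported_current(20.0, r_board_mohm=5, r_assumed_mohm=2))
# 50.0 -- a 20 A rail reads as 50 A, i.e. 2.5x the real draw
```

That 2.5x over-read would make the card slam into its power limit immediately, which matches the stuck-at-8xx-MHz behavior people saw after the flash.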


----------



## Nico67

Yeah, the Gigabyte ones do look like 2 mΩ, so you're right; that would explain a lot.







I think the ASUS ones are R005, which looks to be 5 mΩ, same as the FE, but it could be something as simple as that.


----------



## jacknhut

I managed to get both of my FE 1080 Tis to run stable at 2075 MHz core / 12 GHz memory (the weaker card only runs 2075 max at 1.075 V, while the stronger card can run 2126 MHz at 1.05 V all day...).

This is on the default BIOS without any mods, too. I don't think the ASUS BIOS is going to make much of a difference, since my cards aren't power limited.

Not a very high overclock, but for FE cards I think it's pretty good.


----------



## BoredErica

I'd read the posts here, but there are now almost 1500 new ones and it's just too many for me to go through.

Does anybody know if the EVGA SC Black/SC/FTW3 will get an EK full-cover waterblock? I think they will, but I'd like some confirmation. In the EK configurator (https://www.ekwb.com/configurator/) they seem to have them for the 1080 SC... not quite the SC Black. I think these are all non-reference. Too bad review sites don't clearly state whether a PCB is reference or not. The list here (https://www.ekwb.com/news/ek-releasing-full-cover-water-blocks-nvidia-gtx-1080-ti-graphics-cards/) makes it look like all the non-FE cards are non-reference PCBs...

Also, does anyone have any good ideas/info on the power limits of the three EVGA SKUs?

Thanks in advance.


----------



## somebadlemonade

The FTW3 had better get a block. And aren't the other versions EVGA makes just reference boards with more sensors and a different cooler?


----------



## BoredErica

Quote:


> Originally Posted by *somebadlemonade*
> 
> The ftw3 better get a block, and aren't the other versions EVGA makes just reference boards with more sensors and a different cooler?


You prompted me to recheck their site, which is a good thing. I must have mixed up the SC with the Armor X (which, despite its entry-level pricing, still has 8+8), whereas the non-FTW3 cards will be the usual. Not sure what to make of the phase-count disparity, though, or the fact that it's not on the 1080 Ti FE block list.



Yeah, I think it's likely the FTW3 will get a block. From what I've been hearing, extra power and voltage just don't help that much for overclocking in above-ambient conditions.


----------



## somebadlemonade

Looks like I need to eat my words. We won't know if reference blocks will fit until we compare the boards side by side; if they're different, chances are they won't get blocks (middle-ground cards never do unless they share board dimensions with the reference or top-tier cards).


----------



## BoredErica

Yeah... there was a 1080 SC full-cover block, but that was probably because it was actually a reference PCB. On their page (http://www.evga.com/products/product.aspx?pn=08g-p4-6183-kr) it shows the phase count being the same as the FE, I believe. So now it's: buy a cheaper card and possibly miss out on a full-cover block, or buy a more expensive card that I might not need.

I'm really looking for a card on Amazon too, so I can get 5% cash back. Other cards might not have a pre-order option, and the clock is ticking on my Prime (27 more days).

edit:

Google-fu netted me this:

http://www.tomshardware.com/news/ekwb-five-1080-ti-blocks,34036.html


| Graphics Card | EK Water Block model | Availability |
|---|---|---|
| NVIDIA® GeForce® GTX 1080 Ti | EK-FC1080 GTX Ti | Early April 2017 |
| ASUS® GeForce® GTX 1080 Ti Strix | EK-FC1080 GTX Ti Strix | Mid April 2017 |
| MSI® GeForce® GTX 1080 Ti GAMING X | EK-FC1080 GTX Ti TF6 | Early May 2017 |
| GIGABYTE® GeForce® GTX 1080 Ti Aorus | EK-FC1080 GTX Ti Aorus | Early May 2017 |
| EVGA® GeForce® GTX 1080 Ti FTW3 | EK-FC1080 GTX Ti FTW | Late May 2017 |


----------



## sew333

Hi, quick question. Today, maybe 8 seconds after quitting Fire Strike (1920x1080; I quit after the 10-second test), I got a weird single flicker on the desktop, like here:




It happened one single time, but I want to be sure the card is OK. I've spent 3 days trying to reproduce it with no luck. I'm on a recent NVIDIA driver, 378.92. The monitor is an iiyama GB2488HSU, 144 Hz; the desktop is 1920x1080 @ 144 Hz.
I can't understand why it flickered that once when it normally doesn't. Is my card faulty?

I don't observe this in games.
The card is a 1080 Ti at stock, no OC.


----------



## scarecrow22

Quote:


> I cant understand why it flickered once,when actually doesnt. Is my card faulty?


What version (build) of Windows are you on?

Are you using a DisplayPort cable? Check your cable either way; that looks like a cable issue.

If it only happened once and you haven't changed your overclock, I wouldn't worry about it.

Start worrying when you get artefacts in a game under high load and a mild overclock.


----------



## sew333

So my card is not producing artifacts in any games (tested GTA V, Andromeda, 3DMark 11/13, Heaven).

I don't know why it flickered just once. Since then I've tried to reproduce it for 3 days without result: just start Fire Strike, quickly close it, wait... and nothing.

My cable is DisplayPort.


----------



## Bishop07764

Quote:


> Originally Posted by *sew333*
> 
> So my card is not producing artifacts in any games ( tested GTA V , Andromeda, 3dmark 11,13,Heaven ).
> 
> I dont know why it flickered just once. After that i tried to reproduce this all 3 days,without result. Just start Fire strike,then fast close, wait and nothing.
> 
> My cable Display Port is.


I was having that type of flicker issue too. It started for me after I installed the latest RivaTuner Statistics Server. I could get it to flicker every time I hit apply or reset overclocks in Afterburner. After I uninstalled RivaTuner, it hasn't happened since.


----------



## sew333

I don't have RivaTuner installed, only GPU-Z, Fraps, and CPU-Z in the background. That's all. Maybe it flickered because the clocks changed? But why can't I reproduce it then?


----------



## Bishop07764

Quote:


> Originally Posted by *alucardis666*
> 
> Posted in the 2017 TITAN Xp thread, but I made a revelation that I think is significant and can help you all with the Ti, *or any pascal card really*
> 
> *Figured it out!*
> 
> You wanna adjust your curve points in increments or multiples of *13mhz* at a time. You'll see offset values of *13,26,39,52,65,78,91,104,117,130...etc*
> 
> When testing a point on your curve, lock it in by pressing *CTRL + L* in the afterburner *voltage/frequency curve editor* open GPU-Z and go to the sensors tab, run Heaven in benchmark mode or simply on a loop and see what perf caps you're hitting. You want it to just say Utilization, meaning there's nothing limiting your GPU Utilization, no bad clocks or insufficient voltages.
> 
> *EXAMPLE*
> 
> 
> 
> I'm still setting up my curve, but I will post back my values for each step once I'm locked in and fully stable with no perf caps.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hope this helps you all and is clear.


Hmmm... interesting. I didn't know about being able to lock the voltage at each point. Let us know how it turns out.

Quote:


> Originally Posted by *sew333*
> 
> I dont have riva tuner installed. Only gpuz,fraps,cpuz in background. Thats all. Maybe it flickered because clocks was changed? But why i cant reproduce this then>?


I also have a 144 Hz monitor and was trying for some 2.1 GHz clocks at the time too, I believe. It might have been related to a driver recovery on a 144 Hz monitor; I don't know. It doesn't sound like anything to worry about at this point, though.


----------



## sew333

Maybe the flicker was just some fluke with my 144 Hz monitor or something? But I'm on 378.92, almost the newest driver.


----------



## RavageTheEarth

I can't believe I'm actually counting down the days until my vacation is over and I can come home and play with that Aorus. Tomorrow is our last full day here. Already warned the GF that once we get back I'm going full introvert.


----------



## alucardis666

Quote:


> Originally Posted by *RavageTheEarth*
> 
> I can't believe I'm actually counting down the days until my vacation is over and I can come home and play with that Aorus. Tomorrow or last full day here. Already warned the GF that once we get back I'm going full introvert.












Awesome!


----------



## bloodhawk

Quote:


> Originally Posted by *prelude514*
> 
> Just flashed the Extreme F4 beta BIOS from here without issue: http://www.gigabyte.us/Graphics-Card/GV-N108TAORUS-X-11GD#support-dl
> 
> But power limit was still 125% and default memory speed was still non-Extreme.
> 
> So I flashed this guy: https://www.techpowerup.com/vgabios/190959/gigabyte-gtx1080ti-11264-170331
> 
> Following the method here: http://www.overclock.net/t/1627212/how-to-flash-strix-1080-ti-bios-in-an-1080-ti-fe
> 
> And my card is now an Extreme.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> +150 now available in AB, and default memory clock is now 1404MHz instead of 1376MHz.
> 
> Time to beat this card up a bit!


WELP, the Extreme BIOS doesn't work on the EVGA 1080 Ti FE, unfortunately. After the flash and a clean driver install, the clocks under load never cross 8xx MHz. GPU-Z says the GPU is hitting its power limit even at 90% load.
This is with the power limit maxed out at 150%.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *bloodhawk*
> 
> WELP, the Extreme BIOS doesnt work for the EVGA 1080 Ti FE unfortunately. After the flash and a clean driver install, the clocks under load never cross 8XXMhz. GPU-Z says that there the GPU is hitting power limit, even at 90% load.
> This is with the power limit maxed out at 150%.


I could have saved you time; I said all that in a post about a page back, lol.

Stop flashing BIOSes; they don't do anything. Trust me, you'll get worse results. If you want more power headroom, the shunt mod is the only way to go.

I'm hoping Kendrawolf deletes his topic, because it's kind of misleading until someone measures the difference between stock and a BIOS flash at the wall.


----------



## Slackaveli

Quote:


> Originally Posted by *sew333*
> 
> So my card is not producing artifacts in any games ( tested GTA V , Andromeda, 3dmark 11,13,Heaven ).
> 
> I dont know why it flickered just once. After that i tried to reproduce this all 3 days,without result. Just start Fire strike,then fast close, wait and nothing.
> 
> My cable Display Port is.


It's the NVIDIA drivers, guys. It happens to me every few driver releases. I'm on a multi-monitor setup with one 4K/60 and one 1440p/144, and I think it has something to do with that (or just the 144 Hz in general), but it isn't a sign of faulty hardware and should go away (again) in future driver releases.

MSI Gaming X back in stock, BTW.


----------



## bloodhawk

Quote:


> Originally Posted by *SlimJ87D*
> 
> I could have saved you time, a post like last page I said all that lol.
> 
> Stop flashing bios, they don't do anything. Trust me, you'll get worse results. If you want better power draw, shunt mod is the only way to go.
> 
> I'm hoping Kendrawolf deletes his topic because it's kind of misleading until someone measures the difference between stock and bios flash out of the wall.


For 1080p-1440p you are totally right, but for DX12 and FS Ultra I have seen a pretty significant jump in scores using the Strix BIOS. My card isn't a great clocker, though; the most I can do is 2050 MHz @ 1.093 V under water with sub-40C temperatures.

I will be doing a power mod later this week.

Flashing another compatible vBIOS isn't useless, as you're calling it; it's just only useful in specific scenarios and loads.

Same card, same OC, same temps, same voltage:

http://www.3dmark.com/compare/spy/1487985/spy/1531250

http://www.3dmark.com/compare/fs/12195231/fs/12264485

http://www.3dmark.com/compare/fs/12155467/fs/12264453

Personally, I didn't really follow or look at his guides/improvements; I'd just been waiting for the BIOS. I had the ASUS XOC unlocked BIOS on my EVGA FTW Hybrid before this, and that BIOS let me push past 2202 MHz, while the stock one would max out at 2126 MHz no matter what.

To properly test improvements, people will need to use proper V/F curves, have the card lock to its max possible clocks, and keep it under 40C.

Will report back later this week once my trimpots and resistors are here.

But yeah, I should have read the last few pages; it would have saved me time with the AORUS BIOS, lol.


----------



## dansi

Is ASIC quality important for the 1080 Ti?
I ordered an FE and can't wait for it to arrive. I hope to run at least 2050 on the core; pray for me.


----------



## alucardis666

Quote:


> Originally Posted by *dansi*
> 
> Is ASIC quality important for 1080Ti?
> I ordered a FE and cant wait for arrival, hope to run 2050 core at least, pray for me.


Nope, you can't even see it on Pascal.


----------



## JedixJarf

Quote:


> Originally Posted by *dansi*
> 
> Is ASIC quality important for 1080Ti?
> I ordered a FE and cant wait for arrival, hope to run 2050 core at least, pray for me.


2050 is pretty average for the most part. The coveted 2100 is what you should be looking for :P; you need those extra 2 FPS.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *bloodhawk*
> 
> For 1080p-1440p you are totally right, i but for DX12 and FS Ultra i have seen a pretty significant jump in scores, using the Strix BIOS. Even though my card isnt a great clocker, i can at most do 2050Mhz @ 1.093v under water with sub 40C temperatures.
> 
> I will be a doing a power mod later this week.
> 
> Flashing another compatible vBIOS isnt useless as you are calling, it will only be useful in specific scenarios and loads.
> 
> Same card , same OC, same temps , same voltage-
> 
> http://www.3dmark.com/compare/spy/1487985/spy/1531250
> 
> http://www.3dmark.com/compare/fs/12195231/fs/12264485
> 
> http://www.3dmark.com/compare/fs/12155467/fs/12264453
> 
> Personally i didnt really follow or look at his guides/improvements. Just been waiting for the BIOS, i had the ASUS XOC unlocked BIOS on my EVGA FTW Hybrid before this , and that BIOS let me push past 2202Mhz while the stock would max out at 2126Mhz no matter what.
> 
> In order properly test improvements, people will need to use proper VF curves and have the card lock to their max clocks possible and keep it under 40C.
> 
> Will report back later this week once my Trimpots and resistors are here.
> 
> But yeah i should have read the last few pages, would have saved me time with the AORUS BIOS lol.


You'd better look at your GPU-Z; at that voltage you're power limited, and very badly.

I do not recommend running voltages any higher than 1.050 V.

I've come to realize why our voltages are locked: an FE running above 1.050 V will become power limited, and your results will go backwards.


----------



## bloodhawk

Quote:


> Originally Posted by *SlimJ87D*
> 
> You better look at your GPU z at that voltage you're power limited, and very badly.
> 
> I do not recommend running voltages any higher than 1.050.
> 
> Ive come to realize why our voltages are locked, the FE running voltages above 1.050 will become power limited and your results will go backwards.


Yeah, that will happen if the TDP is way too low and limited to 120%. I've already tested that, and it's the reason FS Ultra power-throttles hard!

And there is no harm in running above 1.050 V IF you are under TDP. If the TDP hasn't been reached, the GPU will readily use that extra voltage. For example, in Time Spy 90% of the benchmark is able to run at 2050 MHz / 1.093 V, but certain sections throttle down to 1989-2012 MHz (1.050-1.070 V). With the stock FE BIOS, the throttle kicks in much more: more like 80% of the run stays under TDP and the rest throttles.

Also, I'll say it again: any accurate comparison can only be done with the GPU kept below 40C.
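As a toy illustration of the throttling behavior being described (entirely made-up curve points and power constant, not real calibration data): Boost effectively walks down the V/F curve until the estimated draw fits under the power limit, which is why a higher limit lets the top bins hold.

```python
# Toy model of Boost power throttling: step down the V/F curve until the
# estimated draw fits under the power limit. All numbers are illustrative.

CURVE = [  # (core MHz, volts), highest bin first
    (2050, 1.093),
    (2012, 1.070),
    (1989, 1.050),
    (1949, 1.025),
]

def est_power_w(mhz: float, volts: float, k: float = 0.105) -> float:
    """Crude dynamic-power estimate, P ~ k * f * V^2 (k is made up)."""
    return k * mhz * volts ** 2

def boost_bin(limit_w: float):
    """Highest V/F bin whose estimated power fits under the limit."""
    for mhz, v in CURVE:
        if est_power_w(mhz, v) <= limit_w:
            return mhz, v
    return CURVE[-1]  # floor of the curve

print(boost_bin(300))  # (2050, 1.093): top bin holds with headroom
print(boost_bin(250))  # (2012, 1.07): stepped down one bin
```

The same model also shows why "sections" of a benchmark throttle while others don't: heavier scenes correspond to a larger effective k, pushing the top bin over the limit only some of the time.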


----------



## KingEngineRevUp

Quote:


> Originally Posted by *bloodhawk*
> 
> Late limited?


Edited post. I was at a red light and it turned green. =/


----------



## Slackaveli

MSI X back in stock
Quote:


> Originally Posted by *bloodhawk*
> 
> For 1080p-1440p you are totally right, i but for DX12 and FS Ultra i have seen a pretty significant jump in scores, using the Strix BIOS. Even though my card isnt a great clocker, i can at most do 2050Mhz @ 1.093v under water with sub 40C temperatures.
> 
> I will be a doing a power mod later this week.
> 
> Flashing another compatible vBIOS isnt useless as you are calling, it will only be useful in specific scenarios and loads.
> 
> Same card , same OC, same temps , same voltage-
> 
> http://www.3dmark.com/compare/spy/1487985/spy/1531250
> 
> http://www.3dmark.com/compare/fs/12195231/fs/12264485
> 
> http://www.3dmark.com/compare/fs/12155467/fs/12264453
> 
> Personally i didnt really follow or look at his guides/improvements. Just been waiting for the BIOS, i had the ASUS XOC unlocked BIOS on my EVGA FTW Hybrid before this , and that BIOS let me push past 2202Mhz while the stock would max out at 2126Mhz no matter what.
> 
> In order properly test improvements, people will need to use proper VF curves and have the card lock to their max clocks possible and keep it under 40C.
> 
> Will report back later this week once my Trimpots and resistors are here.
> 
> But yeah i should have read the last few pages, would have saved me time with the AORUS BIOS lol.


For example, if you have an Aorus Ti and can hold clocks at least where the Extreme's pre-set OC mode is, why would you not flash the Extreme Bios (provided you are willing to risk warranty in case it dies with the other bios)?

Guys with CPU issues who are waiting on the next big thing, here's some good news.
http://wccftech.com/intel-basin-falls-x299-hedt-platform/

Quote:


> Originally Posted by *JedixJarf*
> 
> 2050 is pretty avg for the most part. The coveted 2100 is what you should be looking for : P, you need those extra 2 FPS.


More importantly, you need at least 2076 to say you are running a 15 TFLOPs card. My OCD loves this!
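For anyone wondering where the "15 TFLOPs" line comes from, here is the back-of-the-envelope math as a sketch; the 3584 CUDA cores and 2 FMA ops per core per clock are the usual published Pascal figures, assumed here:

```python
# Rough FP32 throughput estimate for a GTX 1080 Ti (assumed: 3584 CUDA
# cores, 2 fused multiply-add ops per core per clock -- standard Pascal figures).
def tflops(cuda_cores: int, clock_mhz: float) -> float:
    """FP32 TFLOPs = cores * 2 ops/clock * clock in Hz, scaled to 10^12."""
    return cuda_cores * 2 * clock_mhz * 1e6 / 1e12

print(round(tflops(3584, 2076), 2))  # ~14.88, i.e. the "15 TFLOPs" club
print(round(tflops(3584, 1480), 2))  # FE base clock: ~10.61
```

So 2076MHz lands just under the round 15 TFLOPs mark, which is close enough for bragging rights.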


----------



## la4ours

Picked up the GIGABYTE GeForce GTX 1080 Ti GAMING OC 11GB video card a few days back. Love the HUGE difference between my Sapphire Nitro 390 and this. Night and day. I think I literally doubled my fps at 1440p gaming.

I am going to return it, however, due to a couple of issues:

1. No backplate.
2. Heat is a bit higher than I'd like; it averages around 75°C while gaming.
3. I can't seem to get the OC higher than 1830 in the Aorus OC settings. I'm new to the NVIDIA side, so I'm not sure if that's the end result I'm looking at (don't burn me at the stake).

I'll probably end up dropping the extra $80 for the Asus Strix, or holding out for the EVGA FTW2 edition that comes out in May.

Anyone else with the same card, hit me up and let me know what settings you're using.


----------



## bloodhawk

Quote:


> Originally Posted by *SlimJ87D*
> 
> Edited post. I was at a red light and it turned green. =/


Whoa be careful man. Forums can wait until you reach your destination.

I edited my post above as well.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> MSI X back in stock
> For example, if you have an Aorus Ti and can hold clocks at least where the Extreme's pre-set OC mode is, why would you not flash the Extreme Bios (provided you are willing to risk warranty in case it dies with the other bios)?
> 
> Guys with cpu issues who are waiting on the next big thing, here's some good news.
> http://wccftech.com/intel-basin-falls-x299-hedt-platform/


Flashing a BIOS on non-FE cards is fine. You guys are not power limited, thanks to the way your cards are set up.


----------



## bloodhawk

Quote:


> Originally Posted by *Slackaveli*
> 
> MSI X back in stock
> For example, if you have an Aorus Ti and can hold clocks at least where the Extreme's pre-set OC mode is, why would you not flash the Extreme Bios (provided you are willing to risk warranty in case it dies with the other bios)?
> 
> Guys with cpu issues who are waiting on the next big thing, here's some good news.
> http://wccftech.com/intel-basin-falls-x299-hedt-platform/
> more importantly , you need that 2076 at least to say you are running a 15 TFLOPs card. My OCD loves this!


I would do it in a heartbeat, since I know how to recover from a bad flash and have the necessary tools to manually flash the stock BIOS back without causing physical damage. That's why I had the XOC unlocked BIOS as the slave BIOS on the FTW Hybrid.
For now I'm waiting on the Ti Classified; until then, let's see how far I can push the FE without breaking it.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *bloodhawk*
> 
> Yeah that will happen if the TDP is way too low and limited to 120%. Already tested that, and that is the reason why FS Ultra power throttles hard!
> 
> And there is no harm in running above 1.050V, IF you are under TDP. IF the TDP is not reached yet, then the GPU will easily utilize that extra voltage. For example , in TimeSpy 90% of the benchmark is able to run @ 2050Mhz - 1.093V, but certain sections throttle down to 1989-2012Mhz (1.050-1.070V). However when using the FE Stock BIOS, the throttle kicks in way more , more like 80% of the run stays under TDP and the rest throttles.
> 
> Also ill say it again, any accurate comparisons, can only be done when the GPU is kept sub 40C.


I wasn't saying there's harm in running higher voltages, but higher voltages will make you hit that power limit, and when you do, you get a second or two of poor performance as the card has to run its calculations, slow the clocks down, and lower voltages.

It's better to run a constant core clock and not hit power limits. Gameplay is much smoother this way.

Yeah, it was awesome in Witcher 3: I'd hit my highest fps ever, but then it would stutter, and I would notice clocks going from 2100 to 2088, 2075, then down to 2050 and back to 2100.

Now I'm just running low voltage and 2088MHz and it just stays there. Occasionally it will drop to 2075, but it doesn't jump around like crazy, and my gameplay is smooth.

Someone else measured the wattage of their system with the stock and Asus BIOS, and they didn't see the system draw 30 more watts either. There's that.


----------



## Slackaveli

Quote:


> Originally Posted by *JedixJarf*
> 
> 2050 is pretty avg for the most part. The coveted 2100 is what you should be looking for : P, you need those extra 2 FPS.


more importantly , you need that 2076 at least to say you are running a 15TFLOP card. My OCD loves this!
Quote:


> Originally Posted by *la4ours*
> 
> Picked up the GIGABYTE GeForce GTX 1080 Ti GAMING OC 11GB Video Card a few days back. Love the HUGE difference between my Sapphire Nitro 390 and this. Night and day. I think I literally doubled my fps at 1440p gaming.
> 
> I am going to return it however, due to a couple of issues.
> 
> 1. No backplate.
> 2. Heat seems a bit higher than i'd like. I think it's averaging around 75°C while gaming.
> 3. I can't seem to get the oc higher than 1830 in the Aorus oc settings. I'm new to the nvidia side, so i'm not sure if that's the end result that i'm looking at (Don't burn me at the stake).
> 
> I'll probably end up dropping the $80 extra bucks and get the Asus Strix, or hold out for the eVGA FTW2 edition that comes out in May.
> 
> Anyone else with the same card, hmu and let me know what settings you're using.


Bro, set your LEDs how you like them, reset the clocks/profiles in the Aorus software (the curve in it is terrible, and you need the curve), and close it out (make sure you don't have it set to load at startup), then use MSI Afterburner to OC. That's first. Definitely run high fan speeds, and have you considered a re-paste with Thermal Grizzly Kryonaut? That's 4-5°C right there for $10 and 10 minutes of your time.

You can get better clocks than that. Remember, though, you are already at +89 core off the top (edit: that's the regular Aorus; yours is +39, I believe), and some Tis only accept +100 to +120 core (from the default FE base of 1480). Just go up in increments of 13 until you freeze up in Heaven, then back down 13-26 for stability. Set up a fan curve that hits 100% at 59°C (it isn't bad noise-wise, and the benefit is huge temp-wise). Hit Ctrl+F to open the curve, click on the 1.05V point, and hit Ctrl+L to lock it there and save. Also, you may only be able to run about +250-350 on the memory for best performance and no artifacting.

ENJOY. Card is a beast.

EDIT: Misread which card you had, but the premise is the same. I recommend the Aorus, Aorus Xtreme, Asus Strix, or MSI Gaming X. Or the MSI Sea Hawk X hybrid. All of these can be had.
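The "go up in increments of 13" routine can be sketched as a simple ladder; the 1480MHz FE base clock is taken from the post above, and the +130 ceiling is just an illustrative stopping point:

```python
# Sketch of the "+13 at a time" overclocking ladder described above. Pascal
# boost clocks snap to ~13 MHz bins, so core offsets only matter in those
# steps. The 1480 MHz FE base is taken from the post; +130 is illustrative.
def offset_ladder(base_mhz=1480, stop_offset=130, step=13):
    """Yield (offset, resulting clock) pairs, one bin at a time."""
    for off in range(0, stop_offset + step, step):
        yield off, base_mhz + off

for off, clk in offset_ladder():
    print(f"+{off:3d} -> {clk} MHz")
# If a step freezes in Heaven, back the offset down 13-26 MHz for stability.
```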


----------



## bloodhawk

Quote:


> Originally Posted by *SlimJ87D*
> 
> I wasn't saying there's harm in running higher voltages, but higher voltages will make you hit that power limit and when you do, you get a second or two of poor performance as your card has to run calculations and slow things down and lower voltages.
> 
> It's better to run a constant core and not hit power limits. Game play is much smoother this way.
> 
> Yeah it was awesome in Witcher 3 I'd hit my highest fps ever but then it would stutter and I would notice clocks going from 2100 to 2088 2075 then down to 2050 and back to 2100.
> 
> Now I'm just running low voltage and 2088 Mhz and it just stays there, occasionally it will drop to 2075 but it doesn't jump around like crazy and my Gameplay is smooth.
> 
> Someone else measured the watts of their system with the stock and Asus bios and they didn't see the system draw 30 more watts either. There's that.


True that.

I think the main reason people aren't seeing benefits is because:

- Either they are running 2050+ @ 1.093V and hitting power/temp throttles, or

- Their temperatures aren't really optimal enough to test without even a bit of throttling.

I feel the main reason I'm seeing improvements is because my card is a ****ty clocker. On stock, my peak power draw on the GPU was @ 295W. With the Strix BIOS I'm able to touch 330W. So that is pretty much in line with the other person's observations.

At the end of the day, anything over 2025MHz will add another 3-5fps at most, and won't really cause too many frame inconsistencies unless the clocks start dropping down into the 19xx's.

In actual gaming, anything over 2000MHz is pointless.

This, however, is all for the sweet sweet epeen









This is what I can do with the hard TDP-limited 1080s in my laptop -

http://www.3dmark.com/fs/12084682

That cap is going away very soon. Right now they are at a 200W limit.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Flashing bios for non-fe is fine. You guys are not power limited thanks to the way your cards are setup.


Yeah, I always watch Wizard's PCB breakdown videos. I really like the way Giga has these running at very low frequencies but with high phase switching; it's very stable.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *bloodhawk*
> 
> True that.
> 
> I think the main reason why people arent seeing benefits is because -
> 
> - Either they are running 2050+ @ 1.093V and hitting power / Temp throttles.
> 
> - Their temperatures arent really optimal to test without even a but of throttling.
> 
> End of the day anything over 2025Mhz will add another 3-5fps at the most, and wont really causes too many frame incosistencies, unless the clocks start dropping down into the 19XX's.
> 
> In actually gaming anything over 2000Mhz is pointless.
> 
> This however is all for the sweet sweet epeen
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This is what i can do with my HARD TDP limited 1080's in my laptop -
> 
> http://www.3dmark.com/fs/12084682
> 
> That cap is going away very soon. Right now they are at a 200W limit.


Well, another thing about the Asus BIOS: I would set it to max 2088 at 1.050V, and before I knew it, it would be doing 2088 at 1.081V!

I think it's because it thinks we have 350 watts of TDP, and it just tells the card to run at the highest voltage + YOLO.

The Asus BIOS would not run at a lower voltage, at least not for long. It literally had a mind of its own, doing what it thought was right, because it probably thinks I'm using an Asus card.

KedarWolf recently did some tests and realized his results were better on stock. And he is, or was, the biggest advocate for the Asus BIOS.

What we truly need is a BIOS editor, but I don't know if that's going to happen.


----------



## Slackaveli

Quote:


> Originally Posted by *bloodhawk*
> 
> True that.
> 
> I think the main reason why people arent seeing benefits is because -
> 
> - Either they are running 2050+ @ 1.093V and hitting power / Temp throttles.
> 
> - Their temperatures arent really optimal to test without even a but of throttling.
> 
> I feel that the main reason im seeing improvements, is because my card is a ****ting clocker. With stock my peak power draw on the GPU was @ 295W. With the Strix BIOS im able to touch 330W. So that i pretty much in line with the other persons observations.
> 
> End of the day anything over 2025Mhz will add another 3-5fps at the most, and wont really causes too many frame incosistencies, unless the clocks start dropping down into the 19XX's.
> 
> In actually gaming anything over 2000Mhz is pointless.
> 
> This however is all for the sweet sweet epeen
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This is what i can do with my HARD TDP limited 1080's in my laptop -
> 
> http://www.3dmark.com/fs/12084682
> 
> That cap is going away very soon. Right now they are at a 200W limit.


Agreed. It wasn't until I locked voltages a little bit lower that I could hold 2076/2063 in games for long sessions (on air), but I can now. Love this card.
As for temps, I was at great temps, 62°C, in long sessions, but yesterday I beat all my scores in benches, at the same game-stable settings, just by opening the door, lowering the house temp 6°F to 67°F, and sticking a house fan next to my rig as an exhaust. I held 50°C like that. It makes a decent difference in benches; I smashed all my records, like 7350 to 7505 in Fire Strike Ultra, for instance. All other settings equal.
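A rough illustration of why the cooler room helps scores: Pascal's GPU Boost sheds clock in ~13MHz bins as the core heats up. The 5°C-per-bin step below is an assumption for illustration only, not a measured constant:

```python
# Toy model of temperature-driven boost loss: GPU Boost drops the clock in
# ~13 MHz bins as the core heats up past ~40C. The 5C-per-bin step is an
# assumption for illustration, not a measured constant.
def boost_clock(max_clock_mhz, temp_c, bin_mhz=13.0,
                temp_start_c=40.0, c_per_bin=5.0):
    if temp_c <= temp_start_c:
        return max_clock_mhz
    bins_lost = int((temp_c - temp_start_c) // c_per_bin)
    return max_clock_mhz - bins_lost * bin_mhz

print(boost_clock(2076, 50))  # two bins lost  -> 2050.0 MHz
print(boost_clock(2076, 62))  # four bins lost -> 2024.0 MHz
```

Under a model like this, dropping from ~62°C to ~50°C buys back a couple of bins, which matches the bench gains described above in direction if not in exact magnitude.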
Quote:


> Originally Posted by *SlimJ87D*
> 
> What we truly need is a bios editor but I don't know if that's going to happen.


It won't.


----------



## bloodhawk

Quote:


> Originally Posted by *SlimJ87D*
> 
> Well another thing about the Asus bios, I would set it to max 2088 at 1.050V and before I knew it it would be doing 2088 at 1.081V!
> 
> I think it's because it thinks we have 350 watts of tdp and it just tells the card to run at the highest voltage+Yolo
> 
> The Asus bios would not run at a lower voltage, at least not for long. It literally had a mind of its own, doing what it thought was right because it probably thinks I'm using an Asus card.
> 
> Kendrawolf recently did some test and came to realize his test were better on stock. And he's is or was the biggest advocate for the Asus bios.
> 
> What we truly need is a bios editor but I don't know if that's going to happen.


Yeah, that is clearly pointing to TDP throttling/limits.

I'll do the power mods on Wednesday, hopefully, and compare the two. I have a hex-edited version ready; I just need to flash it using my programmer and see if it works. Ah well, next weekend it is.

I actually had it the opposite way: my FE BIOS was all over the place, while the Strix is stable as hell for me.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bloodhawk*
> 
> Yeah that will happen if the TDP is way too low and limited to 120%. Already tested that, and that is the reason why FS Ultra power throttles hard!
> 
> And there is no harm in running above 1.050V, IF you are under TDP. IF the TDP is not reached yet, then the GPU will easily utilize that extra voltage. For example , in TimeSpy 90% of the benchmark is able to run @ 2050Mhz - 1.093V, but certain sections throttle down to 1989-2012Mhz (1.050-1.070V). However when using the FE Stock BIOS, the throttle kicks in way more , more like 80% of the run stays under TDP and the rest throttles.
> 
> Also ill say it again, any accurate comparisons, can only be done when the GPU is kept sub 40C.
> 
> 
> 
> I wasn't saying there's harm in running higher voltages, but higher voltages will make you hit that power limit and when you do, you get a second or two of poor performance as your card has to run calculations and slow things down and lower voltages.
> 
> It's better to run a constant core and not hit power limits. Game play is much smoother this way.
> 
> Yeah it was awesome in Witcher 3 I'd hit my highest fps ever but then it would stutter and I would notice clocks going from 2100 to 2088 2075 then down to 2050 and back to 2100.
> 
> Now I'm just running low voltage and 2088 Mhz and it just stays there, occasionally it will drop to 2075 but it doesn't jump around like crazy and my Gameplay is smooth.
> 
> Someone else measured the watts of their system with the stock and Asus bios and they didn't see the system draw 30 more watts either. There's that.
Click to expand...

My FE benches higher at 2050 core 1.025v on stock BIOS than at 2088 core, 1.075v Asus BIOS, not much higher, but a bit. And at 1.025v it throttles pretty much the same as 2088 at 1.075v.

In all cases I get slightly better benchmarks with stock BIOS at 2050 core, 1.025v.

*GPU-Z stock BIOS at 2050 core, 1.025v.*

Stock2050FullscreenHeaven.txt 68k .txt file


Stock2050Catzilla4K.txt 52k .txt file


Stock2050FireStrikeUltra.txt 51k .txt file


Stock2050TimeSpy.txt 71k .txt file


*Asus BIOS 2088 core, 1.075v.*

ASUS2088HeavenFullscreen.txt 99k .txt file


ASUS2088Catzilla4k.txt 52k .txt file


Asus2088FireStrikeUltra.txt 56k .txt file


Asus2088TimeSpy.txt 50k .txt file


Edit: Since I installed the Windows Creators Update, I can only do 2088 on the Asus BIOS no matter the voltage used, not 2100 any more.


----------



## bloodhawk

Quote:


> Originally Posted by *KedarWolf*
> 
> My FE benches higher at 2050 core 1.025v on stock BIOS than at 2088 core, 1.075v Asus BIOS, not much higher, but a bit. And at 1.025v it throttles pretty much the same as 2088 at 1.075v.
> 
> In all cases I get slightly better benchmarks with stock BIOS at 2050 core, 1.025v.
> 
> *GPU-Z stock BIOS at 2050 core, 1.025v.*
> 
> Stock2050FullscreenHeaven.txt 68k .txt file
> 
> 
> Stock2050Catzilla4K.txt 52k .txt file
> 
> 
> Stock2050FireStrikeUltra.txt 51k .txt file
> 
> 
> Stock2050TimeSpy.txt 71k .txt file
> 
> 
> *Asus BIOS 2088 core, 1.075v.*
> 
> ASUS2088HeavenFullscreen.txt 99k .txt file
> 
> 
> ASUS2088Catzilla4k.txt 52k .txt file
> 
> 
> Asus2088FireStrikeUltra.txt 56k .txt file
> 
> 
> Asus2088TimeSpy.txt 50k .txt file
> 
> 
> Edit: Since I installed the Creator's Update of Windows I can only do 2088 on Asus BIOS no matter the voltages used, not 2100 any more.


Yep, you are clearly hitting TDP limits, and pretty much at the same spots on both BIOSes.

Also, try disabling Game Mode after the Creators Update. That might help. My clocks have been better since I did a dirty update, but that could have been something else altogether.


----------



## KedarWolf

Quote:


> Originally Posted by *bloodhawk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> My FE benches higher at 2050 core 1.025v on stock BIOS than at 2088 core, 1.075v Asus BIOS, not much higher, but a bit. And at 1.025v it throttles pretty much the same as 2088 at 1.075v.
> 
> In all cases I get slightly better benchmarks with stock BIOS at 2050 core, 1.025v.
> 
> *GPU-Z stock BIOS at 2050 core, 1.025v.*
> 
> Stock2050FullscreenHeaven.txt 68k .txt file
> 
> 
> Stock2050Catzilla4K.txt 52k .txt file
> 
> 
> Stock2050FireStrikeUltra.txt 51k .txt file
> 
> 
> Stock2050TimeSpy.txt 71k .txt file
> 
> 
> *Asus BIOS 2088 core, 1.075v.*
> 
> ASUS2088HeavenFullscreen.txt 99k .txt file
> 
> 
> ASUS2088Catzilla4k.txt 52k .txt file
> 
> 
> Asus2088FireStrikeUltra.txt 56k .txt file
> 
> 
> Asus2088TimeSpy.txt 50k .txt file
> 
> 
> Edit: Since I installed the Creator's Update of Windows I can only do 2088 on Asus BIOS no matter the voltages used, not 2100 any more.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeap, you are clearly hitting TDP limits. And Pretty much at the same spots for both the BIOS'.
> 
> Also try and disable Gamemode after creators update. That might help. My clocks have been better since i did a dirty update, but that could have been something else all together.
Click to expand...

Way ahead of you, first thing I did was disable Game Mode on an ISO clean install.


----------



## bloodhawk

Quote:


> Originally Posted by *KedarWolf*
> 
> Way ahead of you, first thing I did was disable Game Mode on an ISO clean install.


Roger.

Tried 381.65? For me the scores improved a tiny bit compared to 378.92.


----------



## KedarWolf

Quote:


> Originally Posted by *bloodhawk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Way ahead of you, first thing I did was disable Game Mode on an ISO clean install.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Roger.
> 
> Tried 381.65? For me the scores improved a tiny bit compared to 378.92.
Click to expand...

381.65 scores went down for me, and the card seemed less stable at my normal clocks. I'm batting zero here.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *bloodhawk*
> 
> Yeah that is clearly pointing to TDP Throttling./Limits.
> 
> Ill do the power mods on Wednesday hopefully, and compare the 2. I have a hex edited version ready, just need to flash it using my programmer and see if it work. Ah well next weekend it is.
> 
> I actually had it opposite, my FE BIOS was all over the place, the Strix is stable as hell for me.


You haven't shown your GPU-Z screenshots from runs.

You might think you're at [email protected], but with power limiting it's gimped and not truly running to its full potential at that voltage. It's like a fake run at [email protected] with no power to fuel it.

The Asus BIOS is meant to be able to run at 350 watts, and we don't have access to that. So the BIOS says it's okay to run at [email protected] because it's doing the calculations and thinks you have access to 350 watts, but you don't.

The Asus BIOS + shunt mod might actually work better; please report on those two together if possible. But the Asus BIOS by itself is incomplete; it needs the true draw of 350 watts.


----------



## bloodhawk

Quote:


> Originally Posted by *SlimJ87D*
> 
> You haven't shown your gpu+z screenshots in runs.
> 
> You might think you're at [email protected] but with power limiting its gimped and not truly running to its full potential at that voltage. It's like a fake run at [email protected] with no power to fuel it.


Didn't save any, since I have it running on the other monitor.

But the power throttling was only during Time Spy and FS Ultra runs @ 2038MHz+ (in Time Spy it sometimes goes down to 1989-1974 during the sections panning past the display cabinets), but in FS Ultra, it's throttle galore.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *bloodhawk*
> 
> Didnt save any since i have it running on the other monitor.
> 
> But the power throttling was only during Time Spy and FS Ultra runs @ 2038Mhz + (For Time Spy, it sometimes go down to 1989-1974 during the sections where its going past the display cabinets) , but in FS Ultra, its a throttle galore.


Blah, I'm resisting the urge to shunt mod... It's only going to give me like 0.5% extra performance!

Being in these forums isn't healthy, lol.


----------



## bloodhawk

Quote:


> Originally Posted by *SlimJ87D*
> 
> Blah, I'm resisting the urge to shunt mod... It's only going to give me like 0.5% extra performance!
> 
> Bring in these forums aren't healthy, lol.


I already tried it; it barely made any difference. I tried both Conductonaut and CLU, and covering two or three shunts.
I'm not sure what it was, but it barely made a difference for me. Maybe too much or too little LM?

At the end of the day, I didn't find it worth the hassle. I'm just going to hard-mod this card till the Ti Classified comes out.

Yeah, I'll test both BIOSes with the power mods and report back on Wednesday/Thursday.
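For anyone weighing the shunt mod, a sketch of the arithmetic behind it; the 5 mOhm stock shunt value is a commonly cited figure for these cards and is treated here as an assumption:

```python
# Why a shunt mod raises the effective power ceiling: the controller infers
# current from the voltage drop across a known shunt (assumed 5 mOhm here).
# Liquid metal or a resistor in parallel lowers the resistance it actually
# measures across, so it under-reads current -- and therefore power.
def parallel(r1_ohm: float, r2_ohm: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return (r1_ohm * r2_ohm) / (r1_ohm + r2_ohm)

def reported_power(actual_watts: float, stock_shunt: float,
                   modded_shunt: float) -> float:
    # Reported power scales with the sensed voltage drop, i.e. with resistance.
    return actual_watts * (modded_shunt / stock_shunt)

stock = 0.005                    # 5 mOhm stock shunt (assumed value)
modded = parallel(stock, 0.005)  # equal resistor soldered in parallel
print(modded)                              # 0.0025 -> controller sees half
print(reported_power(330, stock, modded))  # 330 W real reads as 165 W
```

That halving is also why an uneven liquid metal bridge gives unpredictable results: the effective parallel resistance depends entirely on how well the LM contacts both pads.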


----------



## KingEngineRevUp

Quote:


> Originally Posted by *bloodhawk*
> 
> I already tried it. Barely made any difference. I tried both conductonaut and CLU. And covering 2 or 3 shunts.
> Im not sure what it was , but it barely made a difference for me. Maybe too much or too little LM?
> 
> End of the day didnt find it worth the hassle. Just going to hard mod this card till the Ti Classified comes out.


I'm in a bad position. I have a very good OC card, but it's held back by power limiting.

I can do [email protected] volts, but GPU-Z shows I'm solid green and power limited.

I could probably have pushed 2138 at 1.092V if I only had the power. But my wife actually got upset with me the other day; she has finally noticed I've been benchmarking and not keeping up with our errands.

She actually knows what the Heaven benchmark looks and sounds like.

Lol, if you guys don't see me here for a few weeks, that means she got to me. Call 911.


----------



## bloodhawk

Quote:


> Originally Posted by *SlimJ87D*
> 
> I'm in a bad position. I have a very good oc card, but it's held back by power limiting.
> 
> I can do [email protected] volts but GPU-Z sgows I'm solid green and power limited.
> 
> I could have probably pushed 2138 at 1.092V if I only had the power. But my wife actually got upset with me the other day, she has finally noticed I've been benchmarking and not keeping up with our errands.
> 
> She actually knows what heaven benchmark looks and sounds like.
> 
> Lol, if you guys don't see me here for a few weeks, that means she got to me. call 911.


LOL.

Dang, that's one hell of a chip, man. Most of the best ones I have seen have been from the first batches. Mine was a second batch.

It's great for gaming and all, but that sweet sweet epeen.

I'm slapping some resistors onto the caps next to the power controller; I'll let you guys know if that works better than the shunt mod, and whether there is a way to avoid soldering them on.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Blah, I'm resisting the urge to shunt mod... It's only going to give me like 0.5% extra performance!
> 
> Bring in these forums aren't healthy, lol.


Don't do it; it can degrade your solder even at room temps and eat through the whole shunt resistor.


----------



## bloodhawk

Quote:


> Originally Posted by *Slackaveli*
> 
> dont do it, it can degrade your solder to room temps and eat through the whole shunt resistor.


That is very rare, and it depends on the solder / liquid metal being used.

So far I have ONLY seen ONE case of that happening. But then again, caution is always better.


----------



## Outcasst

So I had some interesting results with the STRIX bios.

I've settled on 2000MHz at 1.0 volts as the lower voltage gets me about 5c cooler temperatures.

On the stock FE BIOS I was getting all three perfcap reasons at once, and I wasn't even hitting the power limit (max of 110%); I got 97.3 FPS in Heaven.

On the STRIX BIOS, with the same settings, there were no perfcaps, the max power limit hit was 96%, and I got 99.3 FPS in Heaven.

Can anybody explain this to me? I thought the STRIX BIOS would only help if I was hitting 120% on the stock BIOS?


----------



## Slackaveli

Quote:


> Originally Posted by *bloodhawk*
> 
> That is very rare.
> And depends on the solder / Liquid Metal paste being used.
> 
> And so far i have ONLY seen ONE case of that happening. But then again, caution is always better.


I've heard of 2 or 3 (one may have been the same dude) out of hundreds, true, but we don't always hear the after story. It's probably fine; I just go by the rule that if you are taking a risk, it needs to have a big payoff, and I don't think this qualifies. I don't have the money to gamble a $700+ card on one or two bins, but I don't begrudge those that do.


----------



## bloodhawk

Quote:


> Originally Posted by *Slackaveli*
> 
> I've heard of 2 or 3 (one may have been the same dude) out of hundreds, true, but we dont always hear the after story. Of course It's probably fine, I just go by the rule if you are going risky it needs to have a big payoff and I don't think this qualifies. I don't have the money to gamble with $700+ cards for one or two bins, but I don't begrudge those that do.


Yeah, at least on the 1080 Tis, the results are not worth the liquid metal shunt mod. It barely helped.


----------



## Slackaveli

Quote:


> Originally Posted by *bloodhawk*
> 
> Yeah, at least on the 1080Ti's the results are not worth the Liquid Metal Shunt mod. It barely helped.


IMO, if one were hell-bent on avoiding power throttling, sell the FE and get an Aorus.


----------



## bloodhawk

Quote:


> Originally Posted by *Slackaveli*
> 
> imo, if one were hell bent on avoiding power throttle, sell the FE and get an Aorus.


Nope, waiting for the EVGA Classified. Right now the 3-5fps advantage isn't worth the cost.

At least for FE owners.


----------



## Slackaveli

Quote:


> Originally Posted by *bloodhawk*
> 
> Nope waiting for the EVGA Classified. Right now the 3-5fps advantage isnt worth the cost.
> 
> 
> 
> 
> 
> 
> 
> At least for FE owners.


I wish I had the patience to wait for a Classy; those have always been my go-to model. I freaking LOVE a beefier, copper card. But I fear they'll release a month or two before Volta. I hate the timing on releases of third-party cards nowadays. Look at the Aorus 1080; it came out a month before the Ti dropped.


----------



## bloodhawk

Quote:


> Originally Posted by *Slackaveli*
> 
> I wish I had the patience to wait for a classy, those have always been my goto model. I freaking LOVE a beefier, copper card. But I fear they'll release a month or two before Volta. I hate the timing on release of 3rd party cards now-a-days. Look at the Arus 1080. It came out a month before Ti dropped.


Yep, and that is why I love EVGA and their Step-Up program.


----------



## Bishop07764

Quote:


> Originally Posted by *Slackaveli*
> 
> I wish I had the patience to wait for a classy, those have always been my goto model. I freaking LOVE a beefier, copper card. But I fear they'll release a month or two before Volta. I hate the timing on release of 3rd party cards now-a-days. Look at the Arus 1080. It came out a month before Ti dropped.


Yeah, and NVIDIA will probably release a refresh of the 1080 Ti with all CUDA cores enabled at the same price point.

Now, you never know. I'm pleased with my card, but I would be rather steamed if I had bought the older Titan Xp.


----------



## Slackaveli

Quote:


> Originally Posted by *Bishop07764*
> 
> Yeah and Nvidia will probably release a refresh of the 1080 ti with all CUDA cores at the same price point.
> 
> 
> 
> 
> 
> 
> 
> Now you never know. I'm pleased with my card. But I would be rather steamed if I had bought the older Titan Xp.


No doubt. But those guys should have known; I knew from the start, when it was cut down, that this was a two-Titan cycle. The other clue was the very fast and quiet launch.


----------



## PasK1234Xw

Just a heads up: the MSI 1080 Ti Sea Hawk X is in stock at Newegg.

https://www.newegg.com/Product/Product.aspx?Item=N82E16814137126


----------



## KingEngineRevUp

Quote:


> Originally Posted by *bloodhawk*
> 
> LOL.
> 
> Dang, thats one hell of a chip man. Most of the best ones i have seen have been from the first batches. Mine was a second batch.
> 
> Its great for gaming and all, but that sweet sweet epeen.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Im slapping on some resistors to the caps next to the power controller, will let you guys know if that works better than the shunt mod. And if there is a way to not have to solder them on.


There's someone in here that tried soldering some resistors and failed. I'd read his post before you start.


----------



## GRABibus

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Just a heads up MSI 1080Ti seahawk X in stock at newegg
> 
> https://www.newegg.com/Product/Product.aspx?Item=N82E16814137126












I'm waiting for it!

It will be available in France in mid or late April.


----------



## bloodhawk

Quote:


> Originally Posted by *SlimJ87D*
> 
> There's someone in here that tried soldering some resistors and failed. I'd read his post before you start.


Yeah, I saw that one; I'm not entirely sure whether it was done properly or not. I have a feeling he dropped the current way too much.


----------



## fisher6

Using the Strix BIOS with a voltage curve has definitely helped me. I'm game-stable at 2088 MHz; I've been playing Ghost Recon for hours without any issues. I could only do 2063 with the FE BIOS due to the power limit.


----------



## Addsome

Quote:


> Originally Posted by *fisher6*
> 
> Using the Strix bios with a voltage curve has definitely helped me. I'm game stable at 2088Mhz, been playing Ghost Recon for hours without any issues. I could only do 2063 with the FE bios due to the power limit.


Have you tried to see if in game performance actually increased?


----------



## D13mass

Guys, how can I fix this?

I already used DDU, but I still can't install the drivers.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *fisher6*
> 
> Using the Strix bios with a voltage curve has definitely helped me. I'm game stable at 2088Mhz, been playing Ghost Recon for hours without any issues. I could only do 2063 with the FE bios due to the power limit.


Just because it shows a constant clock doesn't mean anything.

Monitor your voltage, power limiting, and the clock.

This keeps happening to everyone; Kendrawolf needs to update the OP of the Strix thread.

The Strix BIOS will hold your core clock and then try to stabilize it by continuously adding voltage, and your power limiting will get worse and worse.

The Asus Strix BIOS thinks it has 350 watts available when it does not. It will try to tap into that power and come up short, but you will still see 2088 MHz even though you're power limited.


----------



## PasK1234Xw

Quote:


> Originally Posted by *D13mass*
> 
> Guys, how can I fix this?
> 
> I already used DDU, but I still can't install the drivers.


That's just the basic Windows driver; Windows doesn't have the Ti drivers to auto-load.

Double-check that you downloaded the correct driver. Also, if it fails, try again: I had this problem a few months back where it failed the first time, I rebooted, and the second time it went through.


----------



## fisher6

Quote:


> Originally Posted by *SlimJ87D*
> 
> Just because it shows a constant clock doesn't mean anything.
> 
> Monitor your voltage, power limiting, and the clock.
> 
> This keeps happening to everyone; Kendrawolf needs to update the OP of the Strix thread.
> 
> The Strix BIOS will hold your core clock and then try to stabilize it by continuously adding voltage, and your power limiting will get worse and worse.
> 
> The Asus Strix BIOS thinks it has 350 watts available when it does not. It will try to tap into that power and come up short, but you will still see 2088 MHz even though you're power limited.


My voltage is at 1093 mV when the clock is at 2088.


----------



## D13mass

Quote:


> Originally Posted by *PasK1234Xw*
> 
> edit: misread


Unfortunately I'm on Win10 LTSB, so no Creators Update for me.


----------



## PasK1234Xw

Yeah, ignore that; I overlooked which driver was installed. Re-read my other post.
Quote:


> Originally Posted by *fisher6*
> 
> My voltage is at 1093mv when the clock is at 2088.


lol, your avatar is perfect for when you quote the person above you and don't agree with them


----------



## RadActiveLobstr

Quote:


> Originally Posted by *D13mass*
> 
> Guys, how can I fix this?
> 
> I already used DDU, but I still can't install the drivers.


Make sure you didn't accidentally download the 32 bit version of the driver. I've done that once or twice.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *fisher6*
> 
> My voltage is at 1093mv when the clock is at 2088.


That's very bad; trust me, you're power limiting yourself.

You see, the Strix BIOS will hit 1.050 V or wherever you set it, but as you game it will increase that voltage to the max available to avoid voltage limiting.

Again, the Strix BIOS thinks it has 350 watts, but you're running an FE without a shunt mod.

The FE cannot do 1.092 V at those clocks without hitting power limits.


----------



## D13mass

Quote:


> Originally Posted by *RadActiveLobstr*
> 
> Make sure you didn't accidentally download the 32 bit version of the driver. I've done that once or twice.


I'm sure:
c:\NVIDIA\DisplayDriver\381.65\Win10_64\International\
Even from GeForce Experience I get the same problem.


----------



## Foxrun

Quote:


> Originally Posted by *D13mass*
> 
> Guys, how can I fix this?
> 
> I already used DDU, but I still can't install the drivers.


Ah, I get that error sometimes after using DDU, because Windows is attempting to install a driver on its own. I have to kill the process in Task Manager for the NVIDIA install to work.


----------



## KingEngineRevUp

Why do people still use DDU? Is the clean-install option in the driver installer not enough? They introduced it so we wouldn't have to use DDU.


----------



## PasK1234Xw

Quote:


> Originally Posted by *D13mass*
> 
> I'm sure:
> c:\NVIDIA\DisplayDriver\381.65\Win10_64\International\
> Even from GeForce Experience I get the same problem.


Make sure you are using the latest version of DDU.


----------



## D13mass

Anybody have instructions?


----------



## PasK1234Xw

Download the latest version of DDU, select "Launch in safe mode", then select "Clean and restart". After the reboot, install the driver.


----------



## fisher6

Quote:


> Originally Posted by *SlimJ87D*
> 
> That's very bad; trust me, you're power limiting yourself.
> 
> You see, the Strix BIOS will hit 1.050 V or wherever you set it, but as you game it will increase that voltage to the max available to avoid voltage limiting.
> 
> Again, the Strix BIOS thinks it has 350 watts, but you're running an FE without a shunt mod.
> 
> The FE cannot do 1.092 V at those clocks without hitting power limits.


That's strange. When I run Heaven and let it loop, it reaches 2088 at 1.093 V and GPU-Z shows UTIL as the PerfCap reason. My power target usually doesn't go above 110%; while gaming it's around 97% most of the time. I haven't tried any games other than GR Wildlands, though.


----------



## D13mass

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Download latest version of DDU select launch in safe mode select clean/restart then after you reboot install driver


The latest!
17.0.6.1, already done (safe boot, etc.).


----------



## PasK1234Xw

IDK then.

If I remember correctly, I had this problem because I got sick of updating DDU every time a new driver came out; come to find out, I should have downloaded the latest version.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *fisher6*
> 
> That's strange. When I run Heaven and let it loop then it reaches 2088 at 1.093v and GPU-Z shows UTIL in PerfCap reason. My power target usually doesn't go above 110%, while gaming it's around 97% most of the time. I haven't tried any other games other than GR Wildlands though.


Look at my images: I have comparison runs of the stock BIOS at 2088 MHz, 1.042-1.050 V, and it runs better than the Strix BIOS at 2088 MHz.

Edit: http://www.overclock.net/content/type/61/id/3001077/

Left is Asus, right is stock.

I did mess with the Asus one later and got it up to 2575, but my latest stock runs have also been 2600.

Feel free to run the same Heaven settings and compare them to mine.

I'm going to do one more test with the Asus BIOS tonight: I'll add no extra voltage at all, run it at 2088, and see if that keeps the BIOS from pushing 1.093 V.


----------



## feznz

I am amazed at people thinking that a 2076 MHz core is going to be a ton faster than a 2038 MHz core; even in benchmarks it is barely noticeable, let alone in in-game performance.
Performance scaling per MHz drops dramatically when you push for the last drop of MHz, and it puts a ton of extra load on the card for less than a 1% FPS gain; that's how I burnt out two GTX 580s squeezing the last drop out of them.
My Strix boosted to 1976 MHz straight out of the box on water, and I managed to get it running game-stable at 2038-2050 MHz with a small OC.


----------



## RadActiveLobstr

Quote:


> Originally Posted by *D13mass*
> 
> Latest !
> 17.0.6.1 already done (safe boot and etc.)


Does it happen with an older driver version? Can you install say the prior driver to this one and then install the latest over top?


----------



## bloodhawk

Quote:


> Originally Posted by *feznz*
> 
> I am amazed at people thinking that a 2076 MHz core is going to be a ton faster than a 2038 MHz core; even in benchmarks it is barely noticeable, let alone in in-game performance.
> Performance scaling per MHz drops dramatically when you push for the last drop of MHz, and it puts a ton of extra load on the card for less than a 1% FPS gain; that's how I burnt out two GTX 580s squeezing the last drop out of them.
> My Strix boosted to 1976 MHz straight out of the box on water, and I managed to get it running game-stable at 2038-2050 MHz with a small OC.


From 1950-2038 MHz I barely notice any difference in FPS (maybe 5 FPS at most?) when playing games like DOTA 2 and CS:GO. Beyond 2000 MHz, supposedly, even if users can maintain <45C, the FPS difference will be at MOST 2-5 FPS.
But yet again: those sweet, sweet epeen points, man.


----------



## Slackaveli

Quote:


> Originally Posted by *feznz*
> 
> I am amazed at people thinking that a 2076 MHz core is going to be a ton faster than a 2038 MHz core; even in benchmarks it is barely noticeable, let alone in in-game performance.
> Performance scaling per MHz drops dramatically when you push for the last drop of MHz, and it puts a ton of extra load on the card for less than a 1% FPS gain; that's how I burnt out two GTX 580s squeezing the last drop out of them.
> My Strix boosted to 1976 MHz straight out of the box on water, and I managed to get it running game-stable at 2038-2050 MHz with a small OC.


agreed. I've settled in at [email protected] 1.04v for long gaming sessions.


----------



## lilchronic

When using a voltage curve, I have noticed that you are limited to a 100% power target. Raising the slider to 120% with a voltage curve set does nothing; I get power limited @ 100%. Anyone else noticing this?


----------



## Slackaveli

Quote:


> Originally Posted by *RadActiveLobstr*
> 
> Does it happen with an older driver version? Can you install say the prior driver to this one and then install the latest over top?


I always use last fall's version of DDU, in normal mode. I've never had an issue in 50-plus uses.


----------



## D13mass

Quote:


> Originally Posted by *RadActiveLobstr*
> 
> Does it happen with an older driver version? Can you install say the prior driver to this one and then install the latest over top?


Guys, I can't install ANY drivers! Any!


----------



## lilchronic

Quote:


> Originally Posted by *D13mass*
> 
> Guys, I can't install ANY drivers! Any!


Did you flash your bios?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *lilchronic*
> 
> When using a voltage curve i have notice that you are limited to 100% power target. Raising the slider to 120% with voltage curve set does nothing. i get power limited @ 100% Anyone else noticing this?


Not me.


----------



## D13mass

Quote:


> Originally Posted by *lilchronic*
> 
> Did you flash your bios?


I only put the new card in the PC.


----------



## mshagg

Quote:


> Originally Posted by *feznz*
> 
> I am amazed at people thinking that a 2076 MHz core is going to be a ton faster than a 2038 MHz core; even in benchmarks it is barely noticeable, let alone in in-game performance.


Point taken, and I've settled for an easy OC in a lower voltage bin, but the higher core speeds definitely have an impact on benchmark scores. It's what those applications are designed to show, and they show it clearly.

I'm not sure why you'd question people pushing the most out of their cards on an overclocking forum. Also, warnings about 'burning cards' seem like hyperbole when these things are locked down to pretty safe voltages. If nothing else, it's also kinda interesting.


----------



## Slackaveli

Quote:


> Originally Posted by *D13mass*
> 
> Guys, I can't install ANY drivers! Any!


That is weird. Have you tried older drivers? It may be a borked card.


----------



## Slackaveli

Quote:


> Originally Posted by *mshagg*
> 
> Point taken, and I've settled for an easy OC in a lower voltage bin, but the higher core speeds definitely have an impact on benchmark scores. It's what those applications are designed to show, and they show it clearly.
> 
> Im not sure why you'd question people pushing the most out of their cards on an overclocking forum. Also, warnings about 'burning cards' seems like hyperbole when these things are locked down to pretty safe voltages. If nothing else it's also kinda interesting.


His mistake was not having a separate profile for gaming, with "Optimal Power" (or Adaptive) and a lower locked voltage or a lower clock, alongside a benchmark profile that is full-power, balls-to-the-wall, and perhaps not even quite 100% game-stable, used just for those benchmarks. I use both, and even a lower-power one for certain games. We do have five profiles available, after all.


----------



## ThingyNess

I didn't see anyone make a note of it in this thread, but here it is for reference, in case anyone gets a 1080 Ti for which there isn't a BIOS dump/analysis yet.
Rather than digging through the code and hex values like I've been doing, if you actually have the card in hand:

Load up a command prompt (cmd) and copy/paste the following:

Code:


"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power

You'll see a nice little text output like this (it's from my 1070, as I'm still waiting on my 1080 Ti to arrive):

Code:


Timestamp                           : Mon Apr 10 15:15:39 2017
Driver Version                      : 381.65

Attached GPUs                       : 1
GPU 0000:02:00.0
    Power Readings
        Power Management            : Supported
        Power Draw                  : 35.86 W
        Power Limit                 : 199.32 W
        Default Power Limit         : 166.10 W
        Enforced Power Limit        : 199.32 W
        Min Power Limit             : 75.00 W
        Max Power Limit             : 200.00 W
    Power Samples
        Duration                    : 2.69 sec
        Number of Samples           : 119
        Max                         : 36.45 W
        Min                         : 8.99 W
        Avg                         : 32.49 W

This also gives you an easy way to see absolute TDP numbers rather than percentages like GPU-Z gives you, which is nice.

From this, you can also see that the sliders in most GPU tweaking software seem to round to the nearest integer percentage. For example my slider only goes to 20%, which gives me 166.1w * 1.2 = 199.32w, even though the card technically supports 200w. If you really want that last .68w of TDP headroom, you can actually use nvidia-smi to set the power limit via the command prompt.
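If you want to sanity-check that rounding yourself, the arithmetic is trivial to reproduce (a quick sketch using the figures from my card above; substitute your own BIOS limits):

```python
# Reproduce the integer-percent slider rounding described above.
# The 166.10 W default and 200 W max are the nvidia-smi figures from my
# 1070 Strix OC BIOS; your card's values will differ.
default_limit_w = 166.10
max_limit_w = 200.00

# The exact ceiling, as a percentage of the default power limit:
exact_pct = max_limit_w / default_limit_w * 100  # ~120.4%

# Tweaking tools truncate to a whole percent, so the slider stops at 120%:
slider_pct = int(exact_pct)
enforced_w = default_limit_w * slider_pct / 100

print(f"slider max: {slider_pct}% -> {enforced_w:.2f} W "
      f"({max_limit_w - enforced_w:.2f} W left on the table)")
```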

This is also handy if you want to add the command to a batch file and increase your TDP limit without having to have any OC software start up in the background, as they eat up memory and (rarely) have negative interactions with certain games.

To set a new power target, make sure you run the command prompt or batch file with admin rights, and type:

Code:


"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -pl <newpowerlimit>

The output should look something like this. For me, it gave the extra 0.68 W of TDP that my 1070 Strix OC BIOS was capable of but that Afterburner and the ASUS GPU Tweak utilities weren't giving me:









Code:


C:\WINDOWS\system32>"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -pl 200
Power limit for GPU 0000:02:00.0 was set to 200.00 W from 199.32 W.
All done.
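On the batch-file idea: if you'd rather log these numbers from a script than eyeball the console, the plain-text output is easy to parse. A minimal sketch (the helper functions are mine, not part of nvidia-smi; the demo runs against the sample output above, so no GPU is needed to try it):

```python
import re
import subprocess

SMI = r"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe"

def parse_power_readings(text):
    """Extract 'Name : 123.45 W' pairs from `nvidia-smi -q -d power` output."""
    pattern = re.compile(r"^\s*([A-Za-z ]+?)\s*:\s*([\d.]+) W\s*$", re.MULTILINE)
    return {name: float(watts) for name, watts in pattern.findall(text)}

def query_power():
    """Run the same query as above (requires an NVIDIA driver install)."""
    out = subprocess.run([SMI, "-q", "-d", "power"],
                         capture_output=True, text=True, check=True).stdout
    return parse_power_readings(out)

# Demo on the sample output from this post:
sample = """\
    Power Readings
        Power Draw                  : 35.86 W
        Power Limit                 : 199.32 W
        Default Power Limit         : 166.10 W
        Enforced Power Limit        : 199.32 W
"""
readings = parse_power_readings(sample)
print(readings["Power Draw"], readings["Default Power Limit"])
```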


----------



## Benny89

Finally, my ROG STRIX will be in my hands tomorrow.

Are you still using those strange voltage curves via AB's custom-curve editor, or is there a new AB version where we can finally just add more voltage without playing with curves?


----------



## D13mass

Quote:


> Originally Posted by *Slackaveli*
> 
> that is weird. You've tried older drivers? it may be a borked card.


No. I even tried putting my 980 Ti back in; it installed the 381.65 drivers without problems. Then I swapped back to the 1080 Ti; the 980 Ti works like clockwork.


----------



## Benny89

Guys, for AIB cards: do you still use the AB custom voltage-curve thing, or is there a new AB version where we can finally just apply voltage with a slider?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *ThingyNess*
> 
> I didn't see anyone make a note of it in this thread, but for reference in case anyone gets a 1080Ti for which there isn't a BIOS dump / analysis yet.
> Rather than digging through the code and hex values like I've been doing, if you actually have the card in hand:
> 
> Load up a command prompt (cmd) and copy/paste the following:
> 
> Code:
> 
> 
> "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power
> 
> You'll see a nice little text output like this (it's from my 1070 as i'm still waiting on my 1080Ti to arrive)
> 
> Code:
> 
> 
> Timestamp                           : Mon Apr 10 15:15:39 2017
> Driver Version                      : 381.65
> 
> Attached GPUs                       : 1
> GPU 0000:02:00.0
> Power Readings
> Power Management            : Supported
> Power Draw                  : 35.86 W
> Power Limit                 : 199.32 W
> Default Power Limit         : 166.10 W
> Enforced Power Limit        : 199.32 W
> Min Power Limit             : 75.00 W
> Max Power Limit             : 200.00 W
> Power Samples
> Duration                    : 2.69 sec
> Number of Samples           : 119
> Max                         : 36.45 W
> Min                         : 8.99 W
> Avg                         : 32.49 W
> 
> This also gives you an easy way to see absolute TDP numbers rather than percentages like GPU-Z gives you, which is nice.
> 
> From this, you can also see that the sliders in most GPU tweaking software seem to round to the nearest integer percentage. For example my slider only goes to 20%, which gives me 166.1w * 1.2 = 199.32w, even though the card technically supports 200w. If you really want that last .68w of TDP headroom, you can actually use nvidia-smi to set the power limit via the command prompt.
> 
> This is also handy if you want to add the command to a batch file and increase your TDP limit without having to have any OC software start up in the background, as they eat up memory and (rarely) have negative interactions with certain games.
> 
> To set a new power target, make sure you run the command prompt or batch file with admin rights, and type:
> 
> Code:
> 
> 
> "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -pl <newpowerlimit>
> 
> The output should look something like this, which for me gave me the extra .68w of TDP that my 1070 Strix OC BIOS was capable of that Afterburner and the ASUS Gpu-Tweak utilities weren't giving me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Code:
> 
> 
> C:\WINDOWS\system32>"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -pl 200
> Power limit for GPU 0000:02:00.0 was set to 200.00 W from 199.32 W.
> All done.


That's interesting, but how does this data work with a BIOS flashed from another card?


----------



## c0nsistent

So basically, flashing another BIOS isn't going to help me much without a shunt mod? I game at 2050-2062 with an offset on a stock-cooler FE card that is about to be modded with an AIO.


----------



## ThingyNess

Quote:


> Originally Posted by *SlimJ87D*
> 
> That's interesting, but how does this data work with a bios flashed from another card.


It works exactly the way you'd think: it shows all the relevant data from whatever BIOS you currently have flashed to the card.
I have a Strix 1070 non-OC, which technically came with an FE power limit of 150 W/171 W, but I currently have the BIOS from the 1070 OC version flashed to it, which gives me the higher 166 W default TDP limit and 200 W max adjustment limit.


----------



## Slackaveli

Quote:


> Originally Posted by *ThingyNess*
> 
> I didn't see anyone make a note of it in this thread, but for reference in case anyone gets a 1080Ti for which there isn't a BIOS dump / analysis yet.
> Rather than digging through the code and hex values like I've been doing, if you actually have the card in hand:
> 
> Load up a command prompt (cmd) and copy/paste the following:
> 
> Code:
> 
> 
> "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power
> 
> You'll see a nice little text output like this (it's from my 1070 as i'm still waiting on my 1080Ti to arrive)
> 
> Code:
> 
> 
> Timestamp                           : Mon Apr 10 15:15:39 2017
> Driver Version                      : 381.65
> 
> Attached GPUs                       : 1
> GPU 0000:02:00.0
> Power Readings
> Power Management            : Supported
> Power Draw                  : 35.86 W
> Power Limit                 : 199.32 W
> Default Power Limit         : 166.10 W
> Enforced Power Limit        : 199.32 W
> Min Power Limit             : 75.00 W
> Max Power Limit             : 200.00 W
> Power Samples
> Duration                    : 2.69 sec
> Number of Samples           : 119
> Max                         : 36.45 W
> Min                         : 8.99 W
> Avg                         : 32.49 W
> 
> This also gives you an easy way to see absolute TDP numbers rather than percentages like GPU-Z gives you, which is nice.
> 
> From this, you can also see that the sliders in most GPU tweaking software seem to round to the nearest integer percentage. For example my slider only goes to 20%, which gives me 166.1w * 1.2 = 199.32w, even though the card technically supports 200w. If you really want that last .68w of TDP headroom, you can actually use nvidia-smi to set the power limit via the command prompt.
> 
> This is also handy if you want to add the command to a batch file and increase your TDP limit without having to have any OC software start up in the background, as they eat up memory and (rarely) have negative interactions with certain games.
> 
> To set a new power target, make sure you run the command prompt or batch file with admin rights, and type:
> 
> Code:
> 
> 
> "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -pl <newpowerlimit>
> 
> The output should look something like this, which for me gave me the extra .68w of TDP that my 1070 Strix OC BIOS was capable of that Afterburner and the ASUS Gpu-Tweak utilities weren't giving me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Code:
> 
> 
> C:\WINDOWS\system32>"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -pl 200
> Power limit for GPU 0000:02:00.0 was set to 200.00 W from 199.32 W.
> All done.


Nice work. So, as you can see, the regular Aorus has a 300 W limit with 375 W enforced at the 125% slider. No need for that 150% slider on the Extreme at all.


----------



## Slackaveli

Quote:


> Originally Posted by *D13mass*
> 
> No, I even tried to put back my 980ti, install without problems 381.65 drivers and change videocards to 1080ti, 980ti works like a watches.


The only time I've encountered it was on an old MSI R9 280X that wouldn't take drivers, and it had to go for an RMA. I'm very sorry to say that, and I hope I am wrong.
Quote:


> Originally Posted by *Benny89*
> 
> Guys, for AIBs- do you still use that AB custom voltage curves thing or do you use some new AB version where we can finally just apply voltage with slider?


Both. But the Aorus doesn't want any extra volts, and the consensus is that most cards work better locked to 1.04 V/1.05 V/1.063 V anyway, at the highest possible clocks. So: determine the highest possible clock at 1.050 V; on a crash, lower it by 13-25 MHz; drag everything to the right of that point down to the same flat line; click the 1.05 V point and hit Ctrl+L to lock; X out, apply, save, and enjoy. It's really easy now.
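For anyone who'd rather see that curve procedure written down, here's the core logic as a sketch (the function and the curve values are made up for illustration; in Afterburner itself this is drag-and-drop):

```python
# Sketch of the curve-lock step described above: every point at or above
# the chosen voltage gets clamped to the target clock, so the card never
# boosts past it. The curve values below are illustrative only.
def flatten_curve(curve, lock_mv, target_mhz):
    """curve: {voltage_mV: clock_MHz} -> locked copy of the curve."""
    return {mv: (target_mhz if mv >= lock_mv else mhz)
            for mv, mhz in curve.items()}

curve = {950: 1900, 1000: 1962, 1050: 2012, 1093: 2050}
locked = flatten_curve(curve, lock_mv=1050, target_mhz=2025)
print(locked)  # every point from 1050 mV up now sits at 2025 MHz
```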


----------



## KingEngineRevUp

Quote:


> Originally Posted by *ThingyNess*
> 
> It works exactly the way you'd think. It shows all the relevant data from whatever BIOS you currently have flashed to the card.
> I have a Strix 1070 non-OC which technically came with an FE power limit of 150w/171w, but I currently have the BIOS from the 1070 OC version of the card flashed to it, which gives me the higher 166w default TDP limit and 200w max adjustment limit.


Yeah, a big issue is people thinking that flashing the Asus BIOS is giving us extra power, when no one has actually measured it yet.

The Asus Strix BIOS has a 275 W default power target and a 330 W max. When we flash it onto the FE, we can only get up to 110% power target with this BIOS... guess what: 275 W x 110% ≈ 300 W, the maximum power draw of an FE card.

We're not getting any benefit from this BIOS, yet users keep swearing on their mothers' souls that they're getting improved performance when they haven't done any real tests to compare.
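The arithmetic being argued here is easy to check for yourself (these are the wattage figures claimed in this post, not measured values):

```python
# Why a flashed Strix BIOS buys an FE card nothing: the board's own
# ceiling caps the draw first. Figures are the ones claimed above.
strix_default_w = 275.0   # Strix BIOS default power target
strix_max_w = 330.0       # Strix BIOS advertised maximum (120% slider)
fe_ceiling_w = 300.0      # FE board maximum power draw

# Slider percentage at which the flashed BIOS hits the FE ceiling:
usable_pct = fe_ceiling_w / strix_default_w * 100
print(f"FE ceiling reached at ~{usable_pct:.0f}% of the Strix default; "
      f"the last {strix_max_w - fe_ceiling_w:.0f} W the BIOS advertises "
      f"is never reachable")
```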


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ThingyNess*
> 
> It works exactly the way you'd think. It shows all the relevant data from whatever BIOS you currently have flashed to the card.
> I have a Strix 1070 non-OC which technically came with an FE power limit of 150w/171w, but I currently have the BIOS from the 1070 OC version of the card flashed to it, which gives me the higher 166w default TDP limit and 200w max adjustment limit.
> 
> 
> 
> Yeah a big issue is people thinking that flashing the Asus bios is giving us extra power when no one has actually measured it yet.
> 
> The Asus strix bios has a power draw of 275 and 330 max. Well when we flash it on the FE, we can only get up to 110% power draw on this bios... Guess 275X1.1= 300 watts, the maximum power draw of an FE card.
> 
> We're not getting any benefit with this bios and other users keep swearing on their mom's souls that they are getting improved performance when they haven't even done any real test to compare to one another.
Click to expand...

The issue isn't that you're only getting 110% power draw; it's that it's only hitting 110-111% of the power limit, which is good. Most FE BIOSes start hitting the wall at 112-113%; it's well documented in the 1080 Ti forum.


----------



## ThingyNess

Quote:


> Originally Posted by *Slackaveli*
> 
> Nice work. So, as you can see, the regular Aorus has a 300 W limit with 375 W enforced at the 125% slider. No need for that 150% slider on the Extreme at all.


Which version of the Aorus Extreme BIOS are you running? There are apparently four in the wild (F1, F2, F3, F4), with reports that some of them have different power targets. Some people were unhappy when they found that their Aorus Extreme's TDP slider only went up to +25%, but your results make sense now if the base was raised from 250 W to 300 W.

Interestingly, the BIOS shipped to reviewers is definitely 250 W by default with a +50% adjustment limit, as shown in TechPowerUp's review. It's the same one they posted in their BIOS database; from the text string it looks to be F3.

https://www.techpowerup.com/vgabios/190959/gigabyte-gtx1080ti-11264-170331


----------



## ThingyNess

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah a big issue is people thinking that flashing the Asus bios is giving us extra power when no one has actually measured it yet.
> 
> The Asus strix bios has a power draw of 275 and 330 max. Well when we flash it on the FE, we can only get up to 110% power draw on this bios... Guess 275X1.1= 300 watts, the maximum power draw of an FE card.
> 
> We're not getting any benefit with this bios and other users keep swearing on their mom's souls that they are getting improved performance when they haven't even done any real test to compare to one another.


It really irritates me that all these programs insist on quoting TDP numbers in percent rather than absolute values. I think that also confuses the issue, as not everyone understands that 110% on the ASUS BIOS is not the same as 110% on the FE BIOS.

If you want absolute numbers, a quick and easy test is FurMark, as much as I know KedarWolf doesn't like it. It definitely blew up some VRMs a few card generations ago, back before TDP limiting and monitoring were a thing, but on modern cards it's no harder on the card than anything else.

Try FurMark with the FE BIOS' TDP slider maxed out, and then with the ASUS BIOS' TDP slider maxed out. If you're worried about the health of your GPU and/or VRMs, only run it for as long as necessary for temps/clocks to stabilize.

Rather than using GPU-Z's or Afterburner's percentages, if you want to see the absolute power consumption that NVIDIA thinks the card is using, just add '-l 1' to the nvidia-smi.exe command to make it run in a continuous 1000 ms polling loop, and keep the window open while FurMark is running.

Code:


nvidia-smi -q -d power -l 1

You should see your max power draw hitting very close to the advertised TDP limits within a few seconds of FurMark running, unless your limits are so high that even FurMark doesn't power throttle the GPU (my 1070 is fairly close, actually; I can sustain 1923-1950 MHz, and with another 20 W I'd probably be there).

With the stock non-OC BIOS and its 170 W TDP limit, the card would pull way back into the 1700s in FurMark.

Code:


==============NVSMI LOG==============

Timestamp                           : Mon Apr 10 17:10:15 2017
Driver Version                      : 381.65

Attached GPUs                       : 1
GPU 0000:02:00.0
    Power Readings
        Power Management            : Supported
        Power Draw                  : 196.80 W
        Power Limit                 : 200.00 W
        Default Power Limit         : 166.10 W
        Enforced Power Limit        : 200.00 W
        Min Power Limit             : 75.00 W
        Max Power Limit             : 200.00 W
    Power Samples
        Duration                    : 2.37 sec
        Number of Samples           : 119
        Max                         : 199.14 W
        Min                         : 173.41 W
        Avg                         : 192.07 W


----------



## KedarWolf

Quote:


> Originally Posted by *ThingyNess*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SlimJ87D*
> 
> Yeah a big issue is people thinking that flashing the Asus bios is giving us extra power when no one has actually measured it yet.
> 
> The Asus strix bios has a power draw of 275 and 330 max. Well when we flash it on the FE, we can only get up to 110% power draw on this bios... Guess 275X1.1= 300 watts, the maximum power draw of an FE card.
> 
> We're not getting any benefit with this bios and other users keep swearing on their mom's souls that they are getting improved performance when they haven't even done any real test to compare to one another.
> 
> 
> 
> It really irritates me that all the programs seem to want to quote the TDP numbers in percent rather than absolute values, which I think also confuses the issue as not everyone understands that 110% on the ASUS bios is not the same as 110% on the FE bios.
> 
> If you want absolute numbers, a quick and easy test is FurMark, as much as I know KedarWolf doesn't like it. It definitely blew up some VRMs a few generations of cards ago back before TDP limiting and monitoring was a thing, but on modern cards it's no harder on the card than anything else.
> 
> Try FurMark with the FE BIOS' TDP slider maxed out, and then with the ASUS BIOS' TDP slider maxed out. If you're worried about the health of your GPU and/or VRMs, only run it for as long as necessary for temps/clocks to stabilize.
> 
> Rather than using GPU-Z or Afterburner's percentages, if you want to really see the absolute power consumption that NVidia thinks the card is using, just add '-l 1' to the nvidia-smi.exe command to make it run in a continuous 1000ms polling loop and keep the window open with FurMark running.
> 
> Code:
> 
> nvidia-smi -q -d power -l 1
> 
> You should see your max power draw hitting very close to the advertised TDP limits within a few seconds of FurMark running, unless your limits are so high that even FurMark doesn't power throttle the GPU (my 1070 is fairly close actually, I can sustain 1923-1950mhz and with another 20w I'd probably be there)
> 
> With the stock non-OC bios and its 170W TDP limit the card would pull way back into the 1700s in FurMark.
> 
> Code:
> 
> ==============NVSMI LOG==============
> 
> Timestamp                           : Mon Apr 10 17:10:15 2017
> Driver Version                      : 381.65
> 
> Attached GPUs                       : 1
> GPU 0000:02:00.0
> Power Readings
> Power Management            : Supported
> Power Draw                  : 196.80 W
> Power Limit                 : 200.00 W
> Default Power Limit         : 166.10 W
> Enforced Power Limit        : 200.00 W
> Min Power Limit             : 75.00 W
> Max Power Limit             : 200.00 W
> Power Samples
> Duration                    : 2.37 sec
> Number of Samples           : 119
> Max                         : 199.14 W
> Min                         : 173.41 W
> Avg                         : 192.07 W
Click to expand...

To set the power limit with the command you mentioned in your earlier post, to get those few extra percentage points, do you need to run the .bat file you made on every boot, or just set it once after you install the driver?


----------



## ThingyNess

Quote:


> Originally Posted by *KedarWolf*
> 
> To set the power limit with the command you mentioned in your earlier post, to get those few extra percentage points, do you need to run the .bat file you made on every boot, or just set it once after you install the driver?


Has to be done on every boot unfortunately, but at least it is lightning quick to complete and automatically exits after the command is done, leaving no memory footprint and consuming no background CPU cycles.
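For anyone scripting that per-boot step, here's a minimal sketch of a wrapper around the `nvidia-smi -pl` command quoted earlier in the thread. The install path and the 200 W target are assumptions taken from the posts above, and `power_limit_cmd` / `apply_power_limit` are just illustrative names; it has to run elevated (e.g. from a Task Scheduler "at log on" task with highest privileges) for the limit to stick.

```python
# Hypothetical helper around `nvidia-smi -pl <watts>` (must run as admin).
# The NVSMI path below is the default from the posts above; adjust if yours
# differs. This is a wrapper sketch, not official NVIDIA tooling.
import subprocess

NVSMI = r"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe"

def power_limit_cmd(watts, exe=NVSMI):
    """Build the argument list for `nvidia-smi -pl <watts>`."""
    if watts <= 0:
        raise ValueError("power limit must be positive")
    return [exe, "-pl", str(watts)]

def apply_power_limit(watts):
    # Returns nvidia-smi's exit code; 0 means the new limit was accepted.
    return subprocess.call(power_limit_cmd(watts))

# Example: apply_power_limit(200) would request a 200 W limit, matching the
# `-pl 200` command quoted earlier in the thread.
```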


----------



## dboythagr8

Anybody pick up a Strix? Curious about its performance, I know it's out in some countries.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *ThingyNess*
> 
> It really irritates me that all the programs seem to want to quote the TDP numbers in percent rather than absolute values, which I think also confuses the issue as not everyone understands that 110% on the ASUS bios is not the same as 110% on the FE bios.
> 
> If you want absolute numbers, a quick and easy test is FurMark, as much as I know KedarWolf doesn't like it. It definitely blew up some VRMs a few generations of cards ago back before TDP limiting and monitoring was a thing, but on modern cards it's no harder on the card than anything else.
> 
> Try FurMark with the FE BIOS' TDP slider maxed out, and then with the ASUS BIOS' TDP slider maxed out. If you're worried about the health of your GPU and/or VRMs, only run it for as long as necessary for temps/clocks to stabilize.
> 
> Rather than using GPU-Z or Afterburner's percentages, if you want to really see the absolute power consumption that NVidia thinks the card is using, just add '-l 1' to the nvidia-smi.exe command to make it run in a continuous 1000ms polling loop and keep the window open with FurMark running.
> 
> Code:
> 
> nvidia-smi -q -d power -l 1
> 
> You should see your max power draw hitting very close to the advertised TDP limits within a few seconds of FurMark running, unless your limits are so high that even FurMark doesn't power throttle the GPU (my 1070 is fairly close actually, I can sustain 1923-1950mhz and with another 20w I'd probably be there)
> 
> With the stock non-OC bios and its 170W TDP limit the card would pull way back into the 1700s in FurMark.
> 
> Code:
> 
> ==============NVSMI LOG==============
> 
> Timestamp                           : Mon Apr 10 17:10:15 2017
> Driver Version                      : 381.65
> 
> Attached GPUs                       : 1
> GPU 0000:02:00.0
> Power Readings
> Power Management            : Supported
> Power Draw                  : 196.80 W
> Power Limit                 : 200.00 W
> Default Power Limit         : 166.10 W
> Enforced Power Limit        : 200.00 W
> Min Power Limit             : 75.00 W
> Max Power Limit             : 200.00 W
> Power Samples
> Duration                    : 2.37 sec
> Number of Samples           : 119
> Max                         : 199.14 W
> Min                         : 173.41 W
> Avg                         : 192.07 W


See, that's the problem though. We're given information on an Asus bios which has a power limit of 275 watts. We're only drawing 110% of that, about 300 watts.

So yeah, we're only getting what the FE is designed for, 300 watts.

If I have time I'll run furmark and measure total system power draw on stock BIOS and Asus. Don't really like running furmark though.


----------



## ThingyNess

Quote:


> Originally Posted by *SlimJ87D*
> 
> See, that's the problem though. We're given information on an Asus bios which has a power limit of 275 watts. We're only drawing 110% of that, about 300 watts.


That's my point though. With the Asus BIOS installed, use nvidia-smi to check your current power limit and the BIOS maximums. Also use nvidia-smi to check for certain that you've set it to 330w, and set it manually if need be, and then use it again in loop mode to monitor power draw in FurMark.

If FurMark draws > 300w as shown by nvidia-smi, you know you've broken through the 300w limit of the FE BIOS and are achieving performance that wouldn't be possible with the FE BIOS. If you aren't, then some other limitation is holding you back.

(also, as a sanity check, compare the FurMark clocks between the FE BIOS set to max and the ASUS BIOS set to max. The ASUS BIOS should settle on higher clocks.)

I've gone through this exact exercise on my 1070 flashing a whole bunch of BIOSes trying to get the highest and best power target and have significantly improved its performance in TDP-limited applications. Certain games like The Witcher 3, 4k/5k resolutions, and VR in particular are super power hungry.
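If you'd rather script that check than eyeball the loop output, here's a minimal sketch. The field parsing assumes the `nvidia-smi -q -d power` layout quoted above, `read_watts` and `beats_fe_ceiling` are illustrative names, and the sample text is trimmed from the 1070 log posted earlier in the thread.

```python
# Sketch: parse `nvidia-smi -q -d power` output and check whether the sampled
# max draw broke the FE BIOS's 300 W ceiling, per the procedure above.
import re

# Trimmed from the 1070 log quoted earlier; substitute your own output.
SAMPLE = """\
        Power Draw                  : 196.80 W
        Enforced Power Limit        : 200.00 W
        Max                         : 199.14 W
"""

def read_watts(field, text):
    """Extract a wattage like '199.14 W' for the given field name."""
    m = re.search(rf"{re.escape(field)}\s*:\s*([\d.]+) W", text)
    if not m:
        raise KeyError(field)
    return float(m.group(1))

def beats_fe_ceiling(text, ceiling=300.0):
    # True only if the sampled max draw exceeded the FE BIOS's ceiling.
    return read_watts("Max", text) > ceiling

print(read_watts("Power Draw", SAMPLE))  # 196.8
print(beats_fe_ceiling(SAMPLE))          # False: this log never broke 300 W
```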


----------



## KedarWolf

Quote:


> Originally Posted by *ThingyNess*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SlimJ87D*
> 
> See, that's the problem though. We're given information on an Asus bios which has a power limit of 275 watts. We're only drawing 110% of that, about 300 watts.
> 
> 
> 
> That's my point though. With the Asus BIOS installed, use nvidia-smi to check your current power limit and the BIOS maximums. Also use nvidia-smi to check for certain that you've set it to 330w, and set it manually if need be, and then use it again in loop mode to monitor power draw in FurMark.
> 
> If FurMark draws > 300w as shown by nvidia-smi, you know you've broken through the 300w limit of the FE BIOS and are achieving performance that wouldn't be possible with the FE BIOS. If you aren't, then some other limitation is holding you back.
> 
> (also, as a sanity check, compare the FurMark clocks between the FE BIOS set to max and the ASUS BIOS set to max. The ASUS BIOS should settle on higher clocks.)
> 
> I've gone through this exact exercise on my 1070 flashing a whole bunch of BIOSes trying to get the highest and best power target and have significantly improved its performance in TDP-limited applications. Certain games like The Witcher 3, 4k/5k resolutions, and VR in particular are super power hungry.
Click to expand...

I'm done comparing the BIOSes; been at it a few weeks now. 2050 on stock at 1.025v is what I'm settling at, but I'll definitely max out the power limit with a .bat file running as admin on boot.


----------



## BoredErica

So... 100+ messages since I went to sleep, I'm just going to assume not a single person knew the answer to my questions lol.

Quote:


> Originally Posted by *Darkwizzie*
> 
> Yeah... there was a 1080 SC full cover block, but that was probably because it was actually a reference PCB. On their page http://www.evga.com/products/product.aspx?pn=08g-p4-6183-kr it shows the phase count being the same as the FE, I believe. Now it's like... buy a cheaper card and possibly miss out on a full cover block, or buy a more expensive card that I might not need.
> 
> I'm really looking for a card on Amazon too so I can get 5% cash back. Other cards might not have pre-order option and the clock is ticking on my Prime (27 more days).
> 
> edit:
> 
> Google-fu netted me this:
> 
> http://www.tomshardware.com/news/ekwb-five-1080-ti-blocks,34036.html
> 
> 
> | Graphics Card | EK Water Block model | Availability |
> | --- | --- | --- |
> | NVIDIA® GeForce® GTX 1080 Ti | EK-FC1080 GTX Ti | Early April 2017 |
> | ASUS® GeForce® GTX 1080 Ti Strix | EK-FC1080 GTX Ti Strix | Mid April 2017 |
> | MSI® GeForce® GTX 1080 Ti GAMING X | EK-FC1080 GTX Ti TF6 | Early May 2017 |
> | GIGABYTE® GeForce® GTX 1080 Ti Aorus | EK-FC1080 GTX Ti Aorus | Early May 2017 |
> | EVGA® GeForce® GTX 1080 Ti FTW3 | EK-FC1080 GTX Ti FTW | Late May 2017 |


Quote:


> Originally Posted by *Darkwizzie*
> 
> You prompted me to recheck their site, which is a good thing. I must have mixed up the SC with the Armor X (which despite being entry level pricing still has 8+8), whereas the non FTW3 will be the usual. Not sure what to make of the phase count disparity though, and the fact that it's not on the 1080ti FE block list.
> 
> 
> 
> Yeah, I think it's likely FTW3 will get a block. I guess from what I have been hearing, extra power and voltage just doesn't help that much for overclocking in above ambient conditions.


Quote:


> Originally Posted by *Darkwizzie*
> 
> I'd read the posts here but now there are almost 1500 new posts and it's just too many for me to go through.
> 
> Anybody know if EVGA SC Black/SC/FTW3 will get an EK full cover waterblock? I think they will but I'd like some confirmation. In EK configurator (https://www.ekwb.com/configurator/) they seem to have them for 1080 SC... not quite the SC Black. I think these are all non-reference. Too bad review sites don't clearly state if a PCB is reference or not. List here (https://www.ekwb.com/news/ek-releasing-full-cover-water-blocks-nvidia-gtx-1080-ti-graphics-cards/) makes it look like all non FE are not ref PCB...
> 
> Also anyone got any good ideas/info on power limits on the 3 EVGA skus?
> 
> Thanks in advance.


----------



## mshagg

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm done comparing the BIOSes; been at it a few weeks now. 2050 on stock at 1.025v is what I'm settling at, but I'll definitely max out the power limit with a .bat file running as admin on boot.


Shouldn't be any need to. 250W scales up exactly to 300W at the 120% limit; it was showing 300W here when I checked with nvidia-smi.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *ThingyNess*
> 
> That's my point though. With the Asus BIOS installed, use nvidia-smi to check your current power limit and the BIOS maximums. Also use nvidia-smi to check for certain that you've set it to 330w, and set it manually if need be, and then use it again in loop mode to monitor power draw in FurMark.
> 
> If FurMark draws > 300w as shown by nvidia-smi, you know you've broken through the 300w limit of the FE BIOS and are achieving performance that wouldn't be possible with the FE BIOS. If you aren't, then some other limitation is holding you back.
> 
> (also, as a sanity check, compare the FurMark clocks between the FE BIOS set to max and the ASUS BIOS set to max. The ASUS BIOS should settle on higher clocks.)
> 
> I've gone through this exact exercise on my 1070 flashing a whole bunch of BIOSes trying to get the highest and best power target and have significantly improved its performance in TDP-limited applications. Certain games like The Witcher 3, 4k/5k resolutions, and VR in particular are super power hungry.


I might just run heaven in real time and log the system power consumption with corsair link and see the max power draw. Then do it for Asus bios


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Guys, for AIBs- do you still use that AB custom voltage curves thing or do you use some new AB version where we can finally just apply voltage with slider?


Quote:


> Originally Posted by *ThingyNess*
> 
> Which version of the Aorus Extreme BIOS are you running? There are actually four in the wild apparently (F1, F2, F3, F4) with reports that some of them actually have different power targets. Some people were unhappy when they found out that their Aorus Extreme only had the TDP slider going up to +25%, but your results make sense now if the base had been raised from 250-300w.
> 
> Interestingly, the BIOS shipped to reviewers is definitely 250W by default with a +50% adjustment limit, as shown in TechPowerUp's review - it's the same one they posted in their BIOS database as well - from the text string it looks to be F3.
> 
> https://www.techpowerup.com/vgabios/190959/gigabyte-gtx1080ti-11264-170331


Yeah, I'm not sure, but it's what they sent me. I had heard they did a last-minute bios revision and that is why they delayed launch by a couple weeks.


----------



## Slackaveli

Quote:


> Originally Posted by *ThingyNess*
> 
> It really irritates me that all the programs seem to want to quote the TDP numbers in percent rather than absolute values, which I think also confuses the issue as not everyone understands that 110% on the ASUS bios is not the same as 110% on the FE bios.
> 
> If you want absolute numbers, a quick and easy test is FurMark, as much as I know KedarWolf doesn't like it. It definitely blew up some VRMs a few generations of cards ago back before TDP limiting and monitoring was a thing, but on modern cards it's no harder on the card than anything else.
> 
> Try FurMark with the FE BIOS' TDP slider maxed out, and then with the ASUS BIOS' TDP slider maxed out. If you're worried about the health of your GPU and/or VRMs, only run it for as long as necessary for temps/clocks to stabilize.
> 
> Rather than using GPU-Z or Afterburner's percentages, if you want to really see the absolute power consumption that NVidia thinks the card is using, just add '-l 1' to the nvidia-smi.exe command to make it run in a continuous 1000ms polling loop and keep the window open with FurMark running.
> 
> Code:
> 
> nvidia-smi -q -d power -l 1
> 
> You should see your max power draw hitting very close to the advertised TDP limits within a few seconds of FurMark running, unless your limits are so high that even FurMark doesn't power throttle the GPU (my 1070 is fairly close actually, I can sustain 1923-1950mhz and with another 20w I'd probably be there)
> 
> With the stock non-OC bios and its 170W TDP limit the card would pull way back into the 1700s in FurMark.
> 
> Code:
> 
> ==============NVSMI LOG==============
> 
> Timestamp                           : Mon Apr 10 17:10:15 2017
> Driver Version                      : 381.65
> 
> Attached GPUs                       : 1
> GPU 0000:02:00.0
> Power Readings
> Power Management            : Supported
> Power Draw                  : 196.80 W
> Power Limit                 : 200.00 W
> Default Power Limit         : 166.10 W
> Enforced Power Limit        : 200.00 W
> Min Power Limit             : 75.00 W
> Max Power Limit             : 200.00 W
> Power Samples
> Duration                    : 2.37 sec
> Number of Samples           : 119
> Max                         : 199.14 W
> Min                         : 173.41 W
> Avg                         : 192.07 W


I always run HWMonitor and GPU-Z, b/c HWMonitor has EVERY value you could want measured. It's the best.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> I always run HWMonitor and GPU-Z, b/c HWMonitor has EVERY value you could want measured. It's the best.


Nothing beats a PSU that can measure everything itself


----------



## KedarWolf

Quote:


> Originally Posted by *ThingyNess*
> 
> I didn't see anyone make a note of it in this thread, but for reference in case anyone gets a 1080Ti for which there isn't a BIOS dump / analysis yet.
> Rather than digging through the code and hex values like I've been doing, if you actually have the card in hand:
> 
> Load up a command prompt (cmd) and copy/paste the following:
> 
> Code:
> 
> "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power
> 
> You'll see a nice little text output like this (it's from my 1070 as i'm still waiting on my 1080Ti to arrive)
> 
> Code:
> 
> Timestamp                           : Mon Apr 10 15:15:39 2017
> Driver Version                      : 381.65
> 
> Attached GPUs                       : 1
> GPU 0000:02:00.0
> Power Readings
> Power Management            : Supported
> Power Draw                  : 35.86 W
> Power Limit                 : 199.32 W
> Default Power Limit         : 166.10 W
> Enforced Power Limit        : 199.32 W
> Min Power Limit             : 75.00 W
> Max Power Limit             : 200.00 W
> Power Samples
> Duration                    : 2.69 sec
> Number of Samples           : 119
> Max                         : 36.45 W
> Min                         : 8.99 W
> Avg                         : 32.49 W
> 
> This also gives you an easy way to see absolute TDP numbers rather than percentages like GPU-Z gives you, which is nice.
> 
> From this, you can also see that the sliders in most GPU tweaking software seem to round to the nearest integer percentage. For example my slider only goes to 20%, which gives me 166.1w * 1.2 = 199.32w, even though the card technically supports 200w. If you really want that last .68w of TDP headroom, you can actually use nvidia-smi to set the power limit via the command prompt.
> 
> This is also handy if you want to add the command to a batch file and increase your TDP limit without having to have any OC software start up in the background, as they eat up memory and (rarely) have negative interactions with certain games.
> 
> To set a new power target, make sure you run the command prompt or batch file with admin rights, and type:
> 
> Code:
> 
> "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -pl <newpowerlimit>
> 
> The output should look something like this, which for me gave me the extra .68w of TDP that my 1070 Strix OC BIOS was capable of that Afterburner and the ASUS Gpu-Tweak utilities weren't giving me
> 
> Code:
> 
> C:\WINDOWS\system32>"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -pl 200
> Power limit for GPU 0000:02:00.0 was set to 200.00 W from 199.32 W.
> All done.


Seems to me mine is already maxed out.

Code:


Attached GPUs                       : 2
GPU 0000:01:00.0
    Power Readings
        Power Management            : Supported
        Power Draw                  : 12.83 W
        Power Limit                 : 300.00 W
        Default Power Limit         : 250.00 W
        Enforced Power Limit        : 300.00 W
        Min Power Limit             : 125.00 W
        Max Power Limit             : 300.00 W
    Power Samples
        Duration                    : 20.93 sec
        Number of Samples           : 119
        Max                         : 15.52 W
        Min                         : 10.91 W
        Avg                         : 11.49 W


----------



## jleslie246

Someone just doesn't want me to get a 1080 Ti. I preordered an EVGA 1080 Ti SC with my Discover card 2 weeks ago, and last week my wife lost her Discover card, so I had to cancel the cards (they use the same number). I talked to EVGA today in hopes of just prepaying the total, but no, they don't do that. I have to wait for the preorder to fill, let them try to bill a credit card that no longer exists, then wait for an email from EVGA to update my billing info. During this time I will likely lose my preorder. OMG


----------



## WarbossChoppa

I picked up a MSI 1080 Ti SEA HAWK X, can't wait to overclock that monster!


----------



## optimus002

I wish I had waited a few weeks lol. I could have gotten this for much cheaper than going FE+G10...


----------



## KingEngineRevUp

Quote:


> Originally Posted by *optimus002*
> 
> I wish I had waited a few weeks lol. I could have gotten this for much cheaper than going FE+G10...


Well, the G10 will stay with you for other GPUs; it's a one-time investment at least.


----------



## Tikerz

The voltage curve thing is driving me nuts. It never stays the same. I can make the same curve and will get no perfcaps one time and perfcaps a second time. Or the curve will load differently with every reboot. Or I make a curve testing the exact same voltage and clock and have different results because the other points on the curve are slightly different.

Sent from my MHA-L29 using Tapatalk


----------



## optimus002

Quote:


> Originally Posted by *SlimJ87D*
> 
> Well, the G10 will stay with you for other GPUs; it's a one-time investment at least.


That's true.


----------



## KedarWolf

Quote:


> Originally Posted by *Tikerz*
> 
> The voltage curve thing is driving me nuts. It never stays the same. I can make the same curve and will get no perfcaps one time and perfcaps a second time. Or the curve will load differently with every reboot. Or I make a curve testing the exact same voltage and clock and have different results because the other points on the curve are slightly different.
> 
> Sent from my MHA-L29 using Tapatalk


I think I know how to fix it. Say you put a curve point at 2066: it'll jump to 2062 when you hit apply, but after a reboot it will be at 2075 and mess things up.

What you do is put the point exactly where you want it, say 2062, and hit apply; it'll jump one notch down to 2050 every time. Then put the point exactly at 2062 again and hit apply; it'll stay at 2062 the second time and won't jump any higher on the next reboot.

So, to reiterate: put the point exactly where you want it and hit apply. If it lands one notch lower, set the exact point again and hit apply; it'll be right this time and stay there on reboot.


----------



## NYU87

After using voltage curve, looks like my MSI GTX 1080 Ti Gaming X is capable of 2.088GHz, trying to push for 2.1GHz crashes Firestrike. Anything else I can do to get 2.1GHz? Do you think lowering memory clocks can help me get 2.1GHz?

32.4K graphics score in FS at 2.088GHz.


----------



## Alwrath

Ryzen 4 ghz with stock voltage 1080ti 1950 core 5805 ram.


----------



## Tikerz

Quote:


> Originally Posted by *KedarWolf*
> 
> I think I know how to fix it. Say you put a curve point at 2066: it'll jump to 2062 when you hit apply, but after a reboot it will be at 2075 and mess things up.
> 
> What you do is put the point exactly where you want it, say 2062, and hit apply; it'll jump one notch down to 2050 every time. Then put the point exactly at 2062 again and hit apply; it'll stay at 2062 the second time and won't jump any higher on the next reboot.
> 
> So, to reiterate: put the point exactly where you want it and hit apply. If it lands one notch lower, set the exact point again and hit apply; it'll be right this time and stay there on reboot.


Thanks, I'll give that a shot.


----------



## alucardis666

Quote:


> Originally Posted by *Alwrath*
> 
> 
> 
> Ryzen 4 ghz with stock voltage 1080ti 1950 core 5805 ram.


Nice!


----------



## KingEngineRevUp

People thought that the ASUS bios gave you an extra 30 watts of TDP, I'm here to prove this absolutely wrong.

Here's the verdict: the ASUS bios doesn't give you anything. Please take a look at the results below; you don't get the extra 30 watts you thought you did. Running FurMark, you can see here that the ASUS bios drew 10 more watts, but that was only near the end of the test, where power usage was 112%.



Asus Power Draw 275 * 1.12 = 308 Watts

This is equal to the times on stock where I would hit 124% just for a brief few seconds playing Witcher 3. Here's the stock bios below hitting 122%, but I usually see it hit 124% a few times.



Stock Power Draw 250 * 1.24 = 310 Watts

So there you have it, folks. The claim that the ASUS bios gives you +30 extra watts is complete BS.

Please stop flashing it, it just gives you worse results. Those of you who think you're overcoming power limits and getting better results, sorry but you're not.
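The arithmetic behind this comparison, expressed as a tiny sketch (the function name is illustrative; the base TDPs and slider percentages are the ones quoted in this and the surrounding posts):

```python
# A BIOS's real power ceiling is its base TDP times the percent slider, so the
# "extra 30 W" from the ASUS BIOS mostly evaporates in absolute watts.
def ceiling_watts(base_tdp, slider_pct):
    """Effective power ceiling in watts for a base TDP and a percent slider."""
    return base_tdp * slider_pct / 100.0

asus_observed = ceiling_watts(275, 112)  # 308.0 W at the 112% peak seen on the ASUS BIOS
fe_observed = ceiling_watts(250, 124)    # 310.0 W at the 124% transient seen on stock
fe_nominal = ceiling_watts(250, 120)     # 300.0 W, the nominal FE cap mshagg quotes

print(asus_observed, fe_observed, fe_nominal)  # 308.0 310.0 300.0
```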


----------



## KickAssCop

Bios flashing on a 1080 Ti is a joke. Can't believe people want to talk about that extra 50 MHz and feel special.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KickAssCop*
> 
> Bios flashing on a 1080 Ti is a joke. Can't believe people want to talk about that extra 50 MHz and feel special.


Can't believe some hypocrite is posting in a forum dedicated to OC and judging people for trying to figure out how to push boundaries.

You're in the wrong forum to be judgmental. And I'm sorry, but it takes baby steps and a community working together to push boundaries. If everyone was as lazy and condescending as you, then nothing would be discovered :thumb:.

As for myself, I set out to prove that flashing the ASUS was a waste of everyone's time because I was concerned others would waste their time.


----------



## stoker

Any ideas if the EK Thermosphere will fit the FE while still retaining the stock ram and vrm plate?


----------



## Tikerz

Those with the EVGA Hybrid mod what do you all run your VRM fans at during full load?


----------



## mshagg

Quote:


> Originally Posted by *SlimJ87D*
> 
> As for myself, I set out to prove that flashing the ASUS was a waste of everyone's time because I was concerned others would waste their time.


Good analysis there, cheers. Not that it's gonna stop us from trying to push these things. Back to the drawing board


----------



## D13mass

Quote:


> Originally Posted by *D13mass*
> 
> Guys, how I can fix it ?
> 
> I already used DDU soft, but still can`t install drivers


How I fixed this: I had Windows 10 LTSB. I downloaded the latest original ISO of Windows 10 Enterprise build 1703 and installed it on top of my current system. I kept all my programs and files and everything worked out of the box; even the 381.65 drivers were applied automatically, because I had installed them for the 980 Ti before changing to the 1080 Ti.
So, I hate Microsoft


----------



## KingEngineRevUp

Quote:


> Originally Posted by *mshagg*
> 
> Good analysis there, cheers. Not that it's gonna stop us from trying to push these things. Back to the drawing board


Don't think I'm against pushing boundaries man. I would love more than anything to have a bios modifier and not have to rely on a shunt mod.

I'm doing my part to steer people in the right direction. Now you do yours and go unlock our bios.


----------



## prelude514

Can someone with an MSI Seahawk run the following command in a command prompt please? Make sure you have the Power Limit slider maxed out in afterburner before running the command. Really want to know what this card's TDP allowance is. Also interested in MSI Gaming X which should be more common by now.

"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power

Here's my dog of an Aorus. Having difficulty getting 2000MHz stable, really got a crap draw of the lottery.


----------



## cyenz

Quote:


> Originally Posted by *Slackaveli*
> 
> Yeah, I'm not sure, but it's what they sent me. I had heard they did a last-minute bios revision and that is why they delayed launch by a couple weeks.


I currently have an Aorus (non extreme), flashed the F4_Beta and now i have 150% power target, the original bios only had 125%. Strange.


----------



## geriatricpollywog

Quote:


> Originally Posted by *KickAssCop*
> 
> Bios flashing on a 1080 Ti is a joke. Can't believe people want to talk about that extra 50 MHz and feel special.


This is OCN. I am trying to load OSX onto my Kaby Lake build for no other reason than that I love computers.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *prelude514*
> 
> Can someone with an MSI Seahawk run the following command in a command prompt please? Make sure you have the Power Limit slider maxed out in afterburner before running the command. Really want to know what this card's TDP allowance is. Also interested in MSI Gaming X which should be more common by now.
> 
> "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power
> 
> Here's my dog of an Aorus. Having difficulty getting 2000MHz stable, really got a crap draw of the lottery.


Given time, TechPowerUp will post all that information.

Honestly, the card looks like a reference design.


----------



## DerComissar

Quote:


> Originally Posted by *SlimJ87D*
> 
> People thought that the ASUS bios gave you an extra 30 watts of TDP, I'm here to prove this absolutely wrong.
> 
> 
> Spoiler: Spoiler!
> 
> 
> 
> Here's the verdict: the ASUS bios doesn't give you anything. Please take a look at the results below; you don't get the extra 30 watts you thought you did. Running FurMark, you can see here that the ASUS bios drew 10 more watts, but that was only near the end of the test, where power usage was 112%.
> 
> 
> 
> Asus Power Draw 275 * 1.12 = 308 Watts
> 
> This is equal to the times on stock where I would hit 124% just for a brief few seconds playing Witcher 3. Here's the stock bios below hitting 122%, but I usually see it hit 124% a few times.
> 
> 
> 
> Stock Power Draw 250 * 1.24 = 310 Watts
> 
> So there you have it, folks. The claim that the ASUS bios gives you +30 extra watts is complete BS.
> 
> Please stop flashing it, it just gives you worse results. Those of you who think you're overcoming power limits and getting better results, sorry but you're not.


Been following all your hard work investigating this bios flash, and I really appreciate it.

It looked promising at first, but you've certainly proven that it isn't.
Reperoonie+


----------



## KingEngineRevUp

Quote:


> Originally Posted by *DerComissar*
> 
> Been following all your hard work investigating this bios flash, and I really appreciate it.
> 
> It looked promising at first, but you've certainly proven that it isn't.
> Reperoonie+


Thank you! I actually want to try flashing the Seahawk's bios O_O

It is very close to the reference and also has 3 DP instead of 2 HDMI.


----------



## Exilon

Quote:


> Originally Posted by *SlimJ87D*
> 
> People thought that the ASUS bios gave you an extra 30 watts of TDP, I'm here to prove this absolutely wrong.
> 
> Here's the verdict: the ASUS bios doesn't give you anything. Please take a look at the results below; you don't get the extra 30 watts you thought you did. Running FurMark, you can see here that the ASUS bios drew 10 more watts, but that was only near the end of the test, where power usage was 112%.


shunt mod! shunt mod! shunt mod!
Quote:


> Originally Posted by *NYU87*
> 
> After using voltage curve, looks like my MSI GTX 1080 Ti Gaming X is capable of 2.088GHz, trying to push for 2.1GHz crashes Firestrike. Anything else I can do to get 2.1GHz? Do you think lowering memory clocks can help me get 2.1GHz?
> 
> 32.4K graphics score in FS at 2.088GHz.


What did you do with the curve? When I tried it, I pushed the top voltage bin to the desired frequency, but then found that the top voltage bin has lower max stable frequency than the 1.075v bin.


----------



## alucardis666

Quote:


> Originally Posted by *Exilon*
> 
> shunt mod! shunt mod! shunt mod!


----------



## prelude514

Quote:


> Originally Posted by *SlimJ87D*
> 
> Given time, techpower up will post all that information up.
> 
> Honestly, the card looks like a reference.


I might have been spoiled by my previous reference card. It did 2000MHz locked playing Witcher 3 for hours on end with only 0.993v.


----------



## Outcasst

Either the 381.65 drivers are unstable or Mass Effect Andromeda is the best overclock instability finder in history. I keep getting the "Device Removed" DirectX error when it's Heaven stable.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *prelude514*
> 
> Can someone with an MSI Seahawk run the following command in a command prompt please? Make sure you have the Power Limit slider maxed out in afterburner before running the command. Really want to know what this card's TDP allowance is. Also interested in MSI Gaming X which should be more common by now.
> 
> "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power
> 
> Here's my dog of an Aorus. Having difficulty getting 2000MHz stable, really got a crap draw of the lottery.


Lookie lookie









1080TiMSISeaHawk.zip 155k .zip file


Whoever wants to flash and test this bios and see if there's anything special about it.

I need to go to bed


----------



## Exilon

Quote:


> Originally Posted by *Outcasst*
> 
> Either the 381.65 Drivers are unstable or Mass Effect Andromeda is the best overclock instability finder in history. Keep getting the "Device Removed" DirectX error when it's Heaven stable.


Heaven doesn't put enough load on the GPU to call an overclock stable.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Exilon*
> 
> Heaven doesn't put enough load on the GPU to call an overclock stable.


For me it has always been Witcher 3. That game will crash and crash quite often with an unstable OC.


----------



## Exilon

Quote:


> Originally Posted by *SlimJ87D*
> 
> For me it has always been Witcher 3. That game will crash and crash quite often with an unstable OC.


Witcher 3, FS Extreme or Ultra stress test, or Overwatch are pretty good at rooting out issues.

FS Extreme Stress Test wrecks my unstable overclocks faster than anything else I've run. Also wrecks the power limit. Good to grab when the suite is $5 during Winter sale season. Not worth full price for me, but this is a forum for people with very expensive PCs so opinions may differ.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Exilon*
> 
> Witcher 3, FS Extreme or Ultra stress test, or Overwatch are pretty good at rooting out issues.
> 
> FS Extreme Stress Test wrecks my unstable overclocks faster than anything else I've run. Also wrecks the power limit. Good to grab when the suite is $5 during Winter sale season. Not worth full price for me, but this is a forum for people with very expensive PCs so opinions may differ.


I got it for $7 but I can't run it because it doesn't like MSI Afterburner and RivaTuner, even if I run RTSS in stealth mode.

Haven't figured out why I can't run it and have Afterburner open at the same time. Might have to email 3DMark support.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> Both. But the Aorus doesn't want any extra volts, and the consensus is that most cards work better locked to 1.04v/1.05v/1.063v anyway, at the highest possible clocks. So, determine the highest possible clock at 1.050v; on a crash, lower it 13-25MHz, drop everything to the right of that point down to the same flat line, click 1.05v and hit ctrl+L to lock, x out, apply, save, and enjoy. It's really easy now.


Just follow Kedar's tutorial on YT, correct?


----------



## Exilon

Quote:


> Originally Posted by *SlimJ87D*
> 
> I got it for $7 but I can't run it because it doesn't like MSI Afterburner and RivaTuner, even if I run RTSS in stealth mode.
> 
> Haven't figured out why I can't run it and have Afterburner open at the same time. Might have to email 3DMark support.


Try setting the application detection threshold in rivatuner to low or just turning off rivatuner while running 3dmark.


----------



## prelude514

Quote:


> Originally Posted by *SlimJ87D*
> 
> Lookie lookie
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1080TiMSISeaHawk.zip 155k .zip file
> 
> 
> Whoever wants to flash and test this bios and see if there's anything special about it.
> 
> I need to go to bed


Sweet.







I'd give it a go if I still had a reference card.

Quote:


> Originally Posted by *SlimJ87D*
> 
> For me it has always been Witcher 3. That game will crash and crash quite often with an unstable OC.


Same. Witcher 3 roots out any instability pretty quickly. Heaven is definitely not to be trusted to call an OC stable.


----------



## Dasboogieman

Dat new GPU smell.

1080ti Aorus Generic model just came in. Completely mediocre overclocks in all respects. 
https://www.techpowerup.com/gpuz/details/7hawa

seems to cap out at 2037mhz with 1.093V. Get soft crashing in FFXIV with 2050mhz. But meh, dun't really mind.

Build quality is similar but not the same as the extreme version. The Copper back block is less chunky, the Aorus LED is smaller on the backplate and no backplate thermal pads for the mosfets.

TDP is capped to 375W if you slide the AB bar to max. Also perma VREL regardless of what I do, this thing is screaming for volt mods.

Also my 7700k is freshly delidded, 5.0 Ghz is now 1.35v stable (-2 AVX offset) with load temps <70 (down from total thermal shutdown lol).


----------



## cyenz

Quote:


> Originally Posted by *Dasboogieman*
> 
> Dat new GPU smell.
> 
> 1080ti Aorus Generic model just came in. Completely mediocre overclocks in all respects.
> https://www.techpowerup.com/gpuz/details/7hawa
> 
> seems to cap out at 2037mhz with 1.093V. Get soft crashing in FFXIV with 2050mhz. But meh, dun't really mind.
> 
> Build quality is similar but not the same as the extreme version. The Copper back block is less chunky, the Aorus LED is smaller on the backplate and no backplate thermal pads for the mosfets.
> 
> TDP is capped to 375W if you slide the AB bar to max. Also perma VREL regardless of what I do, this thing is screaming for volt mods.


Did you install the F4_Beta bios that is currently on the Gigabyte website?


----------



## Dasboogieman

Quote:


> Originally Posted by *cyenz*
> 
> Did you install the F4_Beta bios that is currently on the Gigabyte website?


Not yet, what does it do?


----------



## cyenz

Quote:


> Originally Posted by *Dasboogieman*
> 
> Not yet, what does it do?


Well, it states increased stability over F3, and I can now use a 150% power limit instead of 125%.


----------



## Dasboogieman

Quote:


> Originally Posted by *cyenz*
> 
> Well, it states increased stability over F3, and I can now use a 150% power limit instead of 125%.


This thing has 2 bioses, do I flash both? how do I know which is which?


----------



## cyenz

Quote:


> Originally Posted by *Dasboogieman*
> 
> This thing has 2 bioses, do I flash both? how do I know which is which?


One is for HDMI 1 and 2 to work; the other disables one HDMI and enables DVI. There is a PDF inside stating which BIOS is which.


----------



## braddocksp

One BIOS activates HDMI 1 and 2 and disables the DVI, and the other disables HDMI 2 and enables DVI? Is that correct? I read the PDF but I don't really understand it.


----------



## cyenz

Quote:


> Originally Posted by *braddocksp*
> 
> One BIOS activates HDMI 1 and 2 and disables the DVI, and the other disables HDMI 2 and enables DVI? Is that correct? I read the PDF but I don't really understand it.


Exactly that


----------



## Dasboogieman

Quote:


> Originally Posted by *braddocksp*
> 
> One BIOS activates HDMI 1 and 2 and disables the DVI, and the other disables HDMI 2 and enables DVI? Is that correct? I read the PDF but I don't really understand it.


Its similar to the 980ti Gaming edition. Which BIOS activates depends on what monitor output you are currently using as your primary.

I read through the doc; I'll flash the F4 and see what happens.


----------



## cyenz

Quote:


> Originally Posted by *Dasboogieman*
> 
> Its similar to the 980ti Gaming edition. Which BIOS activates depends on what monitor output you are currently using as your primary.
> 
> I read through the doc; I'll flash the F4 and see what happens.


Well, now I'm in doubt: do we have to flash both, then? Or only the one we want to use?


----------



## braddocksp

MMM, using display port, no hdmi or dvi.


----------



## Dasboogieman

Quote:


> Originally Posted by *cyenz*
> 
> Well, now I'm in doubt: do we have to flash both, then? Or only the one we want to use?


For mine, I use the DP that sits just above the DVI port. My Aorus is using the VR mode VBIOS by default so I flashed only that (I tested by plugging in a DVI and Windows considered it a brand new GPU being installed).

Seems good though, I'm getting the full fat 375W Power limit, the VREL is completely gone, and at 1.093V I'm boosting to 2063mhz stable so far on FFXIV.

Actually on that note, the FFXIV benchmark seems like a good stability test since it soft-crashes (CTD) upon the slightest instability.


----------



## braddocksp

You are using Display Port ok, but you have flashed AH or AD BIOS?


----------



## BoredErica

Following EKWB's website, EVGA SC Black will work with FE full cover block. SC2 will NOT. FTW 3 will have its own dedicated block instead.


----------



## Dasboogieman

Quote:


> Originally Posted by *braddocksp*
> 
> You are using Display Port ok, but you have flashed AH or AD BIOS?


AH


----------



## Eco28

Interestingly enough, is anyone switching to the new EK 1080TI water block from TitanX block?


----------



## madmeatballs

Quote:


> Originally Posted by *Outcasst*
> 
> Either the 381.65 Drivers are unstable or Mass Effect Andromeda is the best overclock instability finder in history. Keep getting the "Device Removed" DirectX error when it's Heaven stable.


As stated in Nvidia's driver release notes, ME:A has random memory errors on the 1080 Ti, so I wouldn't use ME:A to find out if an OC is stable.

See the release notes for 381.65, under Open Issues.


----------



## Benny89

There she is finally, my baby and salary









----------



## PasK1234Xw

Microcenters seem to be stocking up on STRIX today (in-store only), so check your local store. You also seem to be able to select in-store pickup.

http://www.microcenter.com/product/478276/GeForce_GTX_1080_Ti_STRIX_ROG_Overclocked_11GB_GDDR5X_GAMING_Video_Card


----------



## wholeeo

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Microcenters seem to be stocking up on STRIX today (in-store only) so check local. Also seem to be able to select in-store pickup.
> 
> http://www.microcenter.com/product/478276/GeForce_GTX_1080_Ti_STRIX_ROG_Overclocked_11GB_GDDR5X_GAMING_Video_Card


Finally, reserved one for pickup. You have better chances of running into a unicorn than finding one of these things in store. Now to see how to get the wife to approve.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> People thought that the ASUS bios gave you an extra 30 watts of TDP, I'm here to prove this absolutely wrong.
> 
> Here's the verdict: the ASUS bios doesn't give you anything. Please take a look at the results below, you don't get that extra 30 watts you thought you did. Running Furmark, you can see here that the ASUS bios drew 10 more watts, but I have to tell you that it was just near the end of the test, where power usage was 112%.
> 
> 
> 
> Asus Power Draw 275 * 1.12 = 308 Watts
> 
> This is equal to the times on stock where I would hit 124% just for a brief few seconds playing Witcher 3. Here's the stock bios below hitting 122%. But I usually have it hit 124% a few times.
> 
> 
> 
> Stock Power Draw 250 * 1.24 = 310 Watts
> 
> So there you have it, folks. The claim that the ASUS bios gives you +30 extra watts is complete BS.
> 
> Please stop flashing it, it just gives you worse results. Those of you who think you're overcoming power limits and getting better results, sorry but you're not.


Yes, but that doesn't change the fact that the Asus BIOS hits the power limits less often and throttles less because of that. Plus people are getting higher stable clocks with it. And your power supply output isn't really the definitive answer as to whether or not the BIOS is pulling extra watts. You might want to run that nvidia-smi.exe command someone suggested while benching with Furmark running, that would really tell you if it's pulling more wattage.
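For anyone who wants to actually log that while benching, here's a minimal sketch of parsing `nvidia-smi -q -d power` output in Python. The wattage figures in the sample text are made up for illustration, and the field names follow the usual NVSMI report layout, so check them against what your own card prints.

```python
import re

# Illustrative excerpt of "nvidia-smi -q -d power" output.
# These numbers are hypothetical, not from any particular card.
SAMPLE = """\
    Power Readings
        Power Draw                  : 248.13 W
        Power Limit                 : 300.00 W
        Default Power Limit         : 250.00 W
        Enforced Power Limit        : 300.00 W
        Min Power Limit             : 125.00 W
        Max Power Limit             : 300.00 W
"""

def parse_power(text):
    """Map each 'Field : 123.45 W' line to a float in watts."""
    fields = {}
    for name, value in re.findall(r"(\w[\w ]*?)\s*:\s*([\d.]+) W", text):
        fields[name] = float(value)
    return fields

power = parse_power(SAMPLE)
headroom = power["Enforced Power Limit"] - power["Power Draw"]
print(f"Draw {power['Power Draw']} W, headroom {headroom:.2f} W")
```

In practice you'd feed it the real output (e.g. via `subprocess.check_output`) in a loop while Furmark runs, and watch whether the draw ever pins against the enforced limit.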


----------



## fisher6

Quote:


> Originally Posted by *KedarWolf*
> 
> Yes, but that doesn't change the fact that the Asus BIOS hits the power limits less often and throttles less because of that. Plus people are getting higher stable clocks with it. And your power supply output isn't really the definitive answer as to whether or not the BIOS is pulling extra watts. You might want to run that nvidia-smi.exe command someone suggested while benching with Furmark running, that would really tell you if it's pulling more wattage.


It might be correct that the Asus bios doesn't do anything or add extra performance despite the higher OC. But for me personally, I could never reach 2088 with the FE bios; I tried 2100 yesterday and even got TS and FS to run:

Firestrike at 2100: http://www.3dmark.com/3dm/19189465
Timespy at 2100: http://www.3dmark.com/3dm/19189637


----------



## Addsome

Quote:


> Originally Posted by *fisher6*
> 
> It might be correct that the Asus bios doesn't do anything or add extra performance due to higher OC. But for me personally I could never reach 2088 with the FE bios, I tried 2100 yesterday and even got TS and FS to run:
> 
> Firestrike at 2100: http://www.3dmark.com/3dm/19189465
> Timespy at 2100: http://www.3dmark.com/3dm/19189637


Higher clocks on the ASUS bios actually perform worse than lower clocks on the FE bios. Run Heaven and make sure you're actually seeing an increase in performance. On the Asus bios I ran @ 2088MHz but I scored lower than running my stock FE bios @ 2050MHz.


----------



## fisher6

Quote:


> Originally Posted by *Addsome*
> 
> Higher clocks on the ASUS bios actually perform worse than lower clocks on the FE bios. Run Heaven and make sure you're actually seeing an increase in performance. On the Asus bios I ran @ 2088MHz but I scored lower than running my stock FE bios @ 2050MHz.


Yeah i heard multiple people having that issue. i will try to flash the FE bios when i get the time later.


----------



## BoredErica

Quote:


> Originally Posted by *Addsome*
> 
> Higher clocks on ASUS bios actually perform worse than lower clocks on FE bios.


Wonder why this is.


----------



## fisher6

Can confirm. I found a Timespy score from before I flashed the Strix bios, with the core running at 2076 instead of 2100. The FE bios yields a higher GPU score:

Strix bios at 2100: http://www.3dmark.com/3dm/19189637
FE bios at 2076: http://www.3dmark.com/3dm/18935211

The FE bios test does have a slightly higher OC but I doubt that is the difference here.


----------



## Benny89

Well, the STRIX, AORUS, and MSI bioses were never meant for the FE, so I am not surprised they do not provide an improvement.


----------



## mshagg

I'd consider 20 points within a margin of error. Were both runs running the same voltage curve? I got a better TS score by making sure the bins the card dropped back to weren't ridiculous like 1960Mhz. http://www.3dmark.com/spy/1534371

Not sure how useful furmark is, the whole idea there is to draw power, so even with an extra 10% TDP overhead it's going to send the card back down the curve. End of the day we're talking one or two potential bins that can be sustained on a partner card compared to the FE.

Really want to see how some of the partner cards go under full cover blocks.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Yes, but that doesn't change the fact that the Asus BIOS hits the power limits less often and throttles less because of that. Plus people are getting higher stable clocks with it. And your power supply output isn't really the definitive answer as to whether or not the BIOS is pulling extra watts. You might want to run that nvidia-smi.exe command someone suggested while benching with Furmark running, that would really tell you if it's pulling more wattage.


First off, the stock voltage for the Asus starts at 1.062V, so you're actually going to hit power limits sooner.

Second, this bios thinks the card has access to 350 watts, but it doesn't. So it'll keep your clocks where they are because it thinks you have 350 watts to draw from, but you do not. You're running a gimped OC, and that's why your score is worse in benchmarks. And this is the worst thing that can happen, because people here think "oh look, my clocks aren't changing." It's equal to me just going to a point on the graph and force-locking a voltage and core with ctrl + L, which shouldn't be done if you don't have the power to actually run that core at that voltage: in power-hungry games, you get worse performance.

Third, as I played Witcher 3 the Asus bios, even though I capped it at 1.050V with a voltage curve, would increase the voltage on its own to 1.081V and even 1.092V. From there my perf cap was bouncing in the green zone over and over again. This is probably happening because the Asus bios thinks there are 350 watts available to it, and there aren't. WE CAN ONLY DRAW 300!

If someone did a shunt mod, the Asus bios might work better. But by itself it just cripples your performance.

You should at least update your Asus topic and let people know this doesn't magically overcome power limiting. If anything, it causes power limiting, since the voltage starts at 1.062V unless you down-volt it with a curve, which again doesn't work.


----------



## ThingyNess

Quote:


> Originally Posted by *cyenz*
> 
> Well, it states increased stability over F3, and I can now use a 150% power limit instead of 125%.


Just FYI, the max Power Limit of all the Aorus 1080Ti BIOSes is the same. The ones that only have the slider going up to +25% start from a *300w base* instead of the default 250w limit (same as FE) that some of the other Aorus BIOSes have. They all max out at 375w though, whether it's 250w + 50% or 300w + 25%.


----------



## ThingyNess

Quote:


> Originally Posted by *SlimJ87D*
> 
> Lookie lookie
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1080TiMSISeaHawk.zip 155k .zip file
> 
> 
> Whoever wants to flash and test this bios and see if there's anything special about it.
> 
> I need to go to bed


Checked this in a hex editor as my 1080Ti hasn't arrived yet. Looks like stock FE power limits - 250w default, 300w max.
A lot of the water cooled / AIO BIOSes are like this and I don't know why. A water cooled card is the most likely to run into power limits since thermals aren't an issue, but the manufacturers keep making them on PCBs with weak VRMs and with low power limits.

On the MSI 1080 (non-Ti) Seahawk X they gave you a LOWER power limit than even the FE! At least with this one they kept it the same.

https://www.techpowerup.com/vgabios/183939/msi-gtx1080-8192-160509
vs
https://www.techpowerup.com/vgabios/184559/msi-gtx1080-8192-160607

I guess they really don't have a lot of faith in their hybrid VRM cooling....


----------



## KingEngineRevUp

Quote:


> Originally Posted by *ThingyNess*
> 
> Checked this in a hex editor as my 1080Ti hasn't arrived yet. Looks like stock FE power limits - 250w default, 300w max.
> A lot of the water cooled / AIO BIOSes are like this and I don't know why. A water cooled card is the most likely to run into power limits since thermals aren't an issue, but the manufacturers keep making them on PCBs with weak VRMs and with low power limits.
> 
> On the MSI 1080 (non-Ti) Seahawk X they gave you a LOWER power limit than even the FE! At least with this one they kept it the same.
> 
> https://www.techpowerup.com/vgabios/183939/msi-gtx1080-8192-160509
> vs
> https://www.techpowerup.com/vgabios/184559/msi-gtx1080-8192-160607
> 
> I guess they really don't have a lot of faith in their hybrid VRM cooling....


Yeah, you would think that powering a pump and possibly 2 fans, they would at least give it more power lol.


----------



## rcfc89

Serious question, fellas









I'm currently waiting on my Titan Xp and Evga hybrid cooler. With tax I'm looking at $1461. Although staying on a single card is tempting I'm really digging the new 1080Ti SeaHawk. Being a huge Seahawk fan and living in Seattle helps as well









I can run a pair in SLI for $1600 out the door. $139 more than the Xp setup. Opinions?


----------



## profundido

hey I'm about to make a new pc and I haven't been following the 1080ti train so far. So can anyone inform me fast what I would need to buy to get the best performance possible on a triple rad liquid cooled loop ?

Do I need a custom one or are they all locked anyway and may I buy a FE as well for that reason since it overclocks equal ?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *rcfc89*
> 
> Serious question fella's
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm currently waiting on my Titan Xp and Evga hybrid cooler. With tax I'm looking at $1461. Although staying on a single card is tempting I'm really digging the new 1080Ti SeaHawk. Being a huge Seahawk fan and living in Seattle helps as well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can run a pair in SLI for $1600 out the door. $139 more than the Xp setup. Opinions?


SLI is a gamble. You can only hope that DX12 one day pulls through with multi-GPU support, but by the time that rolls around, it's a headache.

Here's some math

Titan Xp = 1080 Ti + 10-12%
SLI 1080 Ti = 1080 Ti + 0%-70%

Personally, I'm so happy I'm not doing SLI 980 Ti anymore. I thought I could pass on the 1080 Ti, but my 1080 Ti OC is running better than my SLI 980 Ti.

*I think you should just get a single 1080 Ti and then get the 2080 in a year, which will be equal to the Titan Xp but probably have some more features. Then sell your 1080 Ti.* You'll get the most out of your money for a performance difference you would hardly notice, and come out on top in a year.


----------



## rcfc89

Quote:


> Originally Posted by *SlimJ87D*
> 
> SLI is a gamble. You can only hope that DX12 one day pulls through with multi-GPU support, but by the time that rolls around, it's a headache.
> 
> Here's some math
> 
> Titan Xp = 1080 Ti + 10-12%
> SLI 1080 Ti = 1080 Ti + 0%-70%
> 
> Personally, I'm so happy I'm not doing SLI 980 Ti anymore. I thought I could pass on the 1080 Ti, but my 1080 Ti OC is running better than my SLI 980 Ti.
> 
> *I think you should just get a single 1080 Ti and then get the 2080 in 1 year which will be equal to the Titan Xp but probably get some more features.*


I hear you. I'm currently on Lightning 980Ti's now. Even when SLI does work the frames going all over the place is annoying as well. I'm probably going to stick with the single fastest card. Thanks bro


----------



## KingEngineRevUp

Quote:


> Originally Posted by *profundido*
> 
> hey I'm about to make a new pc and I haven't been following the 1080ti train so far. So can anyone inform me fast what I would need to buy to get the best performance possible on a triple rad liquid cooled loop ?
> 
> Do I need a custom one or are they all locked anyway and may I buy a FE as well for that reason since it overclocks equal ?


They all OC to around 2050, so just get whatever works best for you. Waterblock? FE, then.


----------



## KraxKill

Quote:


> Originally Posted by *fisher6*
> 
> It might be correct that the Asus bios doesn't do anything or add extra performance due to higher OC. But for me personally I could never reach 2088 with the FE bios, I tried 2100 yesterday and even got TS and FS to run:
> 
> Firestrike at 2100: http://www.3dmark.com/3dm/19189465
> Timespy at 2100: http://www.3dmark.com/3dm/19189637


Were your scores actually higher though? I doubt it. Check each Bios clock for clock.

Also, I suspect, as with the numerous posts above, that higher clocks don't always translate to better performance.

My card is on water and shunted, and though I can run the card at 2100 in Firestrike/Heaven, 2138 in Valley, my best performance is actually at 2075.

My situation could be due to me running a 4790K, but it's at 5.0GHz so it isn't a slouch.


----------



## ThingyNess

Quote:


> Originally Posted by *KraxKill*
> 
> Were your scores actually higher though? I doubt it. Check each BIOS clock for clock.
> 
> Also, I suspect, as with the numerous posts above, that higher clocks don't always translate to better performance.
> 
> My card is on water and shunted, and though I can run the card at 2100 in Firestrike/Heaven, 2138 in Valley, my best performance is actually at 2075.
> 
> My situation could be due to me running a 4790K, but it's at 5.0GHz so it isn't a slouch.


Also, a reminder to all those benchmarking: not all tests/loads are power limited. The higher the resolution/complexity, the more likely it is that the card will be TDP limited.
In addition, the extra TDP headroom doesn't help your average FPS as much as you might think. What it does help a lot is the _minimum_ FPS. Even in Heaven, there's only about 5% of the benchmark that's really intensive; the rest of it won't power limit even with a 300w cap anyway.

Try running The Witcher 3 in 5k res with everything maxed, and watch your minimum FPS through a benchmark run.

Switching P-states does have a slight latency to it and can cause a bit more irregularity in frame times if they are bouncing around due to coming up against the TDP limiter.

Being able to stay in the absolute top P-state and max clocks throughout an entire benchmark run improves the perceived smoothness from my experience as well, even if it might only translate to a .1 difference in the average FPS at the end of the run.


----------



## nrpeyton

The ASUS BIOS _*does*_ give you an extra 30 watts.

Sorry, but calculations I read in a previous post here didn't look right.

A '1080 TI FE' has a power limit of 250 watts. (with power slider in your O/C'ing app at the default 100%)
Change that slider to the maximum 120% and you have 300 watts. (1.2 multiplied by 250 = 300)

The ASUS BIOS has a *default* power limit of 275 watts (with power slider in your O/C'ing app at the default 100%)

Change the slider to the maximum 120% with the ASUS BIOS installed and you have 330w. (1.2 multiplied by 275 = 330)
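Taking this post's figures at face value (other posters dispute the base TDPs and slider maxima, so treat these as the claimed numbers, not settled fact), the slider math works out like this:

```python
# Effective power limit = base TDP (watts) x slider percentage.
# Base TDPs (250 W FE, 275 W ASUS) and the 120% slider max are the
# figures claimed in this post, not measured values.
def limit_watts(base_tdp_w, slider_pct):
    return base_tdp_w * slider_pct / 100

fe_default   = limit_watts(250, 100)  # 250.0 W
fe_max       = limit_watts(250, 120)  # 300.0 W
asus_default = limit_watts(275, 100)  # 275.0 W
asus_max     = limit_watts(275, 120)  # 330.0 W

print(fe_max, asus_max)  # 300.0 330.0
```

Which is where the claimed 30-watt gap (330w vs 300w at maxed sliders) comes from.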


----------



## KingEngineRevUp

Quote:


> Originally Posted by *nrpeyton*
> 
> The ASUS BIOS _*does*_ give you an extra 30 watts.
> 
> Sorry, but your calculations are wrong.
> 
> An 1080 TI FE has a power limit of 250 watts. (with power slider in your O/C'ing app at the default 100%)
> Change that slider to the maximum 120% and you have 300 watts. (1.2 multiplied by 250 = 300)
> 
> The ASUS BIOS has a power limit of 275 watts (with power slider in your O/C'ing app at the default 100%)
> Change the slider to the maximum 120% with the ASUS BIOS installed and you have 330w. (1.2 multiplied by 275 = 330)


So you're telling me that my PSU, which measures watts, is wrong? Yeah, no thanks.

Running Furmark:
Stock bios at 120%, which is its max: 410 watts to the system.

ASUS bios at 110%, which is its max: 410 watts to the system.

What you're saying is 410 =/= 410...


----------



## JedixJarf

Quote:


> Originally Posted by *SlimJ87D*
> 
> So you're telling me that my PSU, which measures watts, is wrong? Yeah, no thanks.
> 
> Running Furmark:
> Stock bios at 120%, which is its max: 410 watts to the system.
> 
> ASUS bios at 110%, which is its max: 410 watts to the system.
> 
> What you're saying is 410 =/= 410...


410 = 400 duh, you forgot to magically subtract 10.


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> So you're telling me that my PSU, which measures watts, is wrong? Yeah, no thanks.
> 
> Running furmark
> Stock Bios at 120%, which is it's max, 410 Watt to system
> 
> ASUS Bios at 110%, which is its max, 410 Watts to system.
> 
> What your'e saying is 410 =/= 410...


If you are cross-flashing a BIOS from another card, the numbers won't report correctly. The card doesn't know what it is really drawing. The VRM, phase switching, and everything else are all different.

If you owned the ASUS card, you'd see the extra 30 watts.

I'm not recommending anyone flash their card, with a BIOS not designed for that card.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *nrpeyton*
> 
> So you're telling me that my PSU, which measures watts, is wrong? Yeah, no thanks.
> 
> Running Furmark:
> Stock bios at 120%, which is its max: 410 watts to the system.
> 
> ASUS bios at 110%, which is its max: 410 watts to the system.
> 
> What you're saying is 410 =/= 410...
> 
> If you are cross-flashing a BIOS from another card, the numbers won't report correctly. The card doesn't know what it is really drawing. The VRM, phase switching, and everything else are all different.


*You're not understanding. When you use the ASUS bios, the power usage NEVER GOES ABOVE 110%; it actually floats around and rarely goes above 108%.*

I am not using software to read power. I am using hardware to read wattage from the wall.


----------



## joder

In other news EK is shipping out the 1080Ti water blocks.


----------



## bloodhawk

Quote:


> Originally Posted by *fisher6*
> 
> It might be correct that the Asus bios doesn't do anything or add extra performance due to higher OC. But for me personally I could never reach 2088 with the FE bios, I tried 2100 yesterday and even got TS and FS to run:
> 
> Firestrike at 2100: http://www.3dmark.com/3dm/19189465
> Timespy at 2100: http://www.3dmark.com/3dm/19189637


Those aren't quite 2100MHz results -

http://www.3dmark.com/fs/12264485

http://www.3dmark.com/spy/1531250

@SlimJ87D - I ran the same test as yours last night using Furmark, and for me, weirdly, the clocks weren't crossing 1800MHz no matter what (with either BIOS). In 3DMark, however, they go up like normal.

The top power draw difference I noticed was about 10W higher on the ASUS BIOS. But my scores ARE higher with the ASUS BIOS since I don't throttle as often, and the clocks are much more stable under max TDP loads; the FE BIOS under the same loads sends the clocks break dancing.

However, the benefits will be visible only with cards that can NOT go over 2050MHz.


----------



## JedixJarf

Quote:


> Originally Posted by *nrpeyton*
> 
> If you are cross-flashing a BIOS from another card, the numbers won't report correctly. The card doesn't know what is really drawing. The VRM and phase switching and everything is all different.
> 
> If you owned the ASUS card, you'd see the extra 30 watts.
> 
> I'm not recommending anyone flash their card, with a BIOS not designed for that card.


What the cards report has nothing to do with it... he is reading the usage straight from the PSU, the psu doesn't care about what the heck it's plugged into, power usage is power usage.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *bloodhawk*
> 
> Those aren't quite 2100MHz results -
> 
> http://www.3dmark.com/fs/12264485
> 
> http://www.3dmark.com/spy/1531250
> 
> @SlimJ87D - I ran the same test as yours last night using Furmark, and for me, weirdly, the clocks weren't crossing 1800MHz no matter what (with either BIOS). In 3DMark, however, they go up like normal.
> 
> The top power draw difference I noticed was about 10W higher on the ASUS BIOS. But my scores ARE higher with the ASUS BIOS since I don't throttle as often, and the clocks are much more stable under max TDP loads; the FE BIOS under the same loads sends the clocks break dancing.


The FE will occasionally run past its maximum power draw; we have seen it plenty of times. Sometimes you can go up to 125%. That's similar to when the ASUS bios goes up to 112%.

Here you go, I just played a bit of witcher to prove my point. Hitting 124%



*This is the stock bios hitting 310 Watts*


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> *You're not understanding. When you use the ASUS bios, the power usage NEVER GOES ABOVE 110%, it actually floats rarely goes above 108%.*
> 
> I am not using software to read power. I am using hardware to read wattage from the wall.


When I cross-flashed my EVGA 1080 Classified with the ASUS T4 STRIX BIOS (1080) -- which has no power or temp limits at all, _they are completely removed_ -- my card would actually draw less power from the wall than it did before (and FPS would also go down).

This is due to the VRMs being different.

I also have a watt meter that plugs into the wall and gives me full system draw.

However, if I had used the ASUS BIOS in a compatible ASUS card, I would indeed have had no power or temp limits.

The same applies to you.

The ASUS BIOS gives ASUS card owners an extra 30 watts. But it won't give everyone who cross-flashes an extra 30 watts. It depends on the differences in the VRM (the card's power delivery & monitoring system).


----------



## KingEngineRevUp

Quote:


> Originally Posted by *nrpeyton*
> 
> When I cross flashed my EVGA 1080 Classified with the ASUS T4, STRIX BIOS (1080). (which has no power or temp limits at all -- _they are completely removed)._
> 
> My card would actually draw less power from the wall than it did before. (And FPS would also go down).
> 
> This is due to VRM's being different.
> 
> I also have a watt meter that plugs into the wall and gives me full system draw.
> 
> However if I used the ASUS BIOS, in a compatible ASUS card -- I would indeed of had no power or temp limits.


Okay, but we're talking about the 1080 Ti FE and flashing a non FE bios on it.

We're not talking about a 1080 classified, or a 1070 Wubba lubba dub dub.


----------



## bloodhawk

Quote:


> Originally Posted by *SlimJ87D*
> 
> The FE will occasionally run past it's maximum power draw, we have seen it plenty of times. Sometimes you can go up to 125%. That's similar to when the ASUS bios goes up to 112%.
> 
> Here you go, I just played a bit of witcher to prove my point. Hitting 124%
> 
> 
> 
> *This is the stock bios hitting 310 Watts*


Yeah that happens. But the point still stands, the clocks with the FE BIOS hitting TDP fluctuate down way more often than the ASUS BIOS.



http://imgur.com/B1Scx


----------



## KingEngineRevUp

Quote:


> Originally Posted by *bloodhawk*
> 
> Yeah that happens. But the point still stands, the clocks with the FE BIOS hitting TDP fluctuate down way more often than the ASUS BIOS.
> 
> 
> 
> http://imgur.com/B1Scx


One thing that both our tests are showing: we're not magically pulling an extra 30 watts.

ASUS BIOS =/= shunt mod substitute


----------



## bloodhawk

Quote:


> Originally Posted by *SlimJ87D*
> 
> One thing that both our test are showing, we're not magically pulling an extra 30 watts.
> 
> ASUS bios =/= Shunt mod substitute


Can't argue with that. I'll post results with the power mod in a day or 2.

JESUS CHRIST, I noticed I forgot to set my vcore back to 1.328V from 1.48V (for 4.8GHz), and accidentally set it to 1.38V.


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> Okay, but we're talking about the 1080 Ti FE and flashing a non FE bios on it.
> 
> We're not talking about a 1080 classified, or a 1070 Wubba lubba dub dub.


I hear what you're saying, but I've been here before.

The only way to bypass the power limit on an FE 1080 Ti is the shunt mod.

Unless you can find a more compatible BIOS (which could still come in the future). But I doubt that.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *bloodhawk*
> 
> Cant Argue with that. Ill post results with the power mod in a day or 2.


I say we pull kedrawolf to a soccer field and quarter his limbs off using 4 horse carriages!

Anyways, this is a shunt mod video with Witcher 3. Power draw doesn't go above 80% in most cases. The guy runs at 1.094V too with 2114 Mhz.






The shunt mod is where it's at. Risky, but the rewards are there.


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> I say we pull kedrawolf to a soccer field and quarter his limbs off using 4 horse carriages!
> 
> Anyways, this is a shunt mod video with Witcher 3. Power draw doesn't go above 80% in most cases. The guy runs at 1.094V too with 2114 Mhz.
> 
> 
> 
> 
> 
> 
> The shunt mod is just where it is. Risky, but rewards are risky.


It's only risky if you're not careful where the liquid metal lands ;-)

Edit:
Also don't leave it indefinitely; some liquid metals will eventually dissolve solder.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *nrpeyton*
> 
> I hear what you're saying, but I've been here before.
> 
> The only way to bypass the power limit on an FE 1080 TI is the shut mod.
> 
> Unless you can find a more compatible BIOS (which could still come in future). But I doubt that.


I'm surprised no one has started a bounty, gathered money from various forums and paid someone to figure out how to unlock and modify the pascal bios.

I'd pay $10 to $20 for this bounty. If various communities did the same we'd probably get a talented hacker to do it for us.


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> I'm surprised no one has started a bounty, gathered money from various forums and paid someone to figure out how to unlock and modify the pascal bios.
> 
> I'd pay $10 to $20 for this bounty. If various communities did the same we'd probably get a talented hacker to do it for us.


haha I agree lol....

I'd pay too lol

Nvidia has really locked it down hard.

All us 1080 guys were really hoping it would happen when the Ti released, as all the vets hanging onto their top spots at hwbot with Maxwell 980 Tis would have to make the switch eventually ;-)

Anyway, it's still *very early days*.

Most people still can't even buy a Ti yet. They're still out of stock on Nvidia's site daily, and not a single partner has released info on their full lineup yet.

Let's all keep our fingers crossed.


----------



## joder

Quote:


> Originally Posted by *SlimJ87D*
> 
> I'm surprised no one has started a bounty, gathered money from various forums and paid someone to figure out how to unlock and modify the pascal bios.
> 
> I'd pay $10 to $20 for this bounty. If various communities did the same we'd probably get a talented hacker to do it for us.


I'd imagine that something might be doable; however, it would have to be a security exploit that allows unsigned firmware to be flashed.

I don't think we are going to be able to sign it ourselves anytime soon.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Can't believe some hypocrite is posting in a forum dedicated to OC and judging people on trying to figure out how to push boundaries.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You're in the wrong forum to be judgmental. And I'm sorry, but it takes baby steps and a community together to push boundaries
> 
> 
> 
> 
> 
> 
> 
> . If everyone was lazy and condescending as you were, then nothing would be discovered:thumb:.
> 
> As for myself, I set out to prove that flashing the ASUS was a waste of everyone's time because I was concerned others would waste their time.


Hell, he's johnny law, so.... impossible for him not to be judgemental and overbearing and out of line. And lazy.


----------



## bloodhawk

Quote:


> Originally Posted by *ThingyNess*
> 
> Checked this in a hex editor as my 1080Ti hasn't arrived yet. Looks like stock FE power limits - 250w default, 300w max.
> A lot of the water cooled / AIO BIOSes are like this and I don't know why. A water cooled card is the most likely to run into power limits since thermals aren't an issue, but the manufacturers keep making them on PCBs with weak VRMs and with low power limits.
> 
> On the MSI 1080 (non-Ti) Seahawk X they gave you a LOWER power limit than even the FE! At least with this one they kept it the same.
> 
> https://www.techpowerup.com/vgabios/183939/msi-gtx1080-8192-160509
> vs
> https://www.techpowerup.com/vgabios/184559/msi-gtx1080-8192-160607
> 
> I guess they really don't have a lot of faith in their hybrid VRM cooling....


Can you point me to the addresses you found the power limits at?


----------



## Slackaveli

Quote:


> Originally Posted by *prelude514*
> 
> Can someone with an MSI Seahawk run the following command in a command prompt please? Make sure you have the Power Limit slider maxed out in afterburner before running the command. Really want to know what this card's TDP allowance is. Also interested in MSI Gaming X which should be more common by now.
> 
> "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power
> 
> Here's my dog of an Aorus. Having difficulty getting 2000MHz stable, really got a crap draw of the lottery.


Sorry to hear that, man. I always draw the short straw. I guess getting an Aorus that can hold 1650 core was a big win for me this time; I'd better at least appreciate it.

Can yours hold the 1607 base of the Extreme? Have you been trying to add voltage?


----------



## nrpeyton

Quote:


> Originally Posted by *joder*
> 
> I'd imagine that something might be able to be done, however, it would have to be a security exploit that would allow unsigned firmware to be flashed.
> 
> I don't think we are going to be able to sign it ourselves anytime soon.


The 'certificates bypassed' version of nvflash already exists, and it's even been updated to be Pascal compatible.

The problem is deciphering the BIOS files (figuring out what to edit).

I could use a hex editor to edit my BIOS file right now, and I have the tools to successfully flash it to my 1080.

The problem is that any edit I made would be a "shot in the dark".


----------



## Slackaveli

Quote:


> Originally Posted by *D13mass*
> 
> How I fixed this : I had Windows 10 LTSB, I downloaded last original iso of Windows 10 Enterprise build 1703 and installed it from system on top of my current system, I saved all programs and files and everything began to work out of the box, even drivers 381.65 was automatically applied because I installed them for 980ti and change card to 1080ti.
> So, I hate Microsoft


AWESOME! I am so glad you got that worked out. Yours was such a sad tale. Now we need the dude with the lost Discover card to get his EVGA.


----------



## Slackaveli

Quote:


> Originally Posted by *cyenz*
> 
> I currently have an Aorus (non extreme), flashed the F4_Beta and now i have 150% power target, the original bios only had 125%. Strange.


150% of what, though? 150% of 250W is 375W; 125% of 300W is also 375W. I think they will perform the same, and the only difference between the Aorus BIOS after the final revision (like I have: 300W target, 125% = 375W) and the 150% one is the target limit and the base clocks. It should perform the same when OCing, with the lottery being the only real variable.
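The two sliders landing on the same wattage is just percentage math; a throwaway sketch to check the arithmetic (base values taken from the posts above, nothing card-specific):

```python
# Power target sliders are percentages of the BIOS base TDP, so a 250 W
# base at 150% and a 300 W base at 125% reach the same absolute cap.
def power_cap_w(base_w: float, target_pct: float) -> float:
    return base_w * target_pct / 100

assert power_cap_w(250, 150) == 375.0  # 150% slider on a 250 W base BIOS
assert power_cap_w(300, 125) == 375.0  # 125% slider on a 300 W base BIOS
```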
Quote:


> Originally Posted by *SlimJ87D*
> 
> Given time, techpower up will post all that information up.
> 
> Honestly, the card looks like a reference.


It's close. The only differences are the look, the power phase consistency, the cooling, and the power limit (which won't be a problem with 375W to play with). But I'd say all of the above favor the Aorus.


----------



## ThingyNess

Quote:


> Originally Posted by *bloodhawk*
> 
> Can you point me to the addresses you found the power limits at?


Sure thing! For future reference - all the Pascal cards have mostly the same _relative_ offsets between bits of data in the BIOS, although the starting offsets are different from GPU to GPU (1070, 1080, and 1080Ti offsets are all different)

The below data is from the 1080Ti FE BIOS - the offsets should be the same between all 1080Ti BIOSes, so if you want to know the max TDP adjustment limit for any 1080Ti BIOS, just open up your hex editor of choice, jump to 0x00030e78 and read those 4 bytes. Don't forget to swap from little endian to big endian before converting to decimal.

Code:


1080Ti BIOS offsets:
Note - in the BIOS the numbers are stored LSB first, so if you see "48 e8 01 00" in your hex editor, you need to swap to MSB first notation before converting to decimal.

"48 e8 01 00" would become "00 01 e8 48" which when converted to decimal = 125000 which is your Min TDP target in Milliwatts.

Min TDP target                  DWORD - 0x00030e70 - (48 e8 01 00) = 125000mW = 125W
Default TDP target              DWORD - 0x00030e74 - (90 D0 03 00) = 250000mW = 250w
Max TDP limit                   DWORD - 0x00030e78 - (e0 93 04 00) = 300000mW = 300w

Motherboard Default TDP Target  DWORD - 0x00030ea2 - (D0 01 01 00) = 66000mW = 66W
Motherboard Max TDP Limit       DWORD - 0x00030ea6 - (B0 30 01 00) = 78000mW = 78W

PCIe1 Default TDP Target        DWORD - 0x00030ed0 - (D8 53 01 00) = 87000mW = 87W
PCIe1 Max TDP Limit             DWORD - 0x00030ed4 - (B8 82 01 00) = 99000mW = 99W

PCIe2 Default TDP Target        DWORD - 0x00030efe - (D0 78 02 00) = 162000mW = 162W
PCIe2 Max TDP Limit             DWORD - 0x00030f02 - (98 ab 02 00) = 175000mW = 175W

Fan Curve Entry 1               WORD  - 0x0003150a - (40 05) = 1344 rpm
Fan Curve Entry 2               WORD  - 0x0003150c - (4c 04) = 1100 rpm
Fan Curve Entry 3               WORD  - 0x0003150e - (80 0a) = 2688 rpm
Fan Curve Entry 4               WORD  - 0x00031510 - (60 09) = 2400 rpm
Fan Curve Entry 5               WORD  - 0x00031512 - (50 0b) = 2896 rpm
Fan Curve Entry 6               WORD  - 0x00031514 - (c0 12) = 4800 rpm

Fan Max RPM                     WORD  - 0x000314e6 - (c0 12) = 4800 rpm
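The LSB-first byte swap described above is easy to get wrong by hand; as a quick sketch, Python's `struct` module does it for you. The offsets are copied from the table above (treat them as that poster's findings and verify against your own dump); the ROM filename in the usage comment is made up.

```python
import struct

# Offsets from the 1080 Ti FE table above; verify before editing anything.
TDP_OFFSETS = {
    "min_tdp_target": 0x00030E70,      # 125 W
    "default_tdp_target": 0x00030E74,  # 250 W
    "max_tdp_limit": 0x00030E78,       # 300 W
}

def read_tdp_mw(bios: bytes, offset: int) -> int:
    """Read one DWORD at `offset`, LSB-first ('<I'), value in milliwatts."""
    return struct.unpack_from("<I", bios, offset)[0]

# Worked example with the bytes from the post: "48 e8 01 00" -> 125000 mW
assert struct.unpack("<I", bytes.fromhex("48e80100"))[0] == 125000

# Hypothetical usage against a saved ROM dump:
# with open("gtx1080ti_fe.rom", "rb") as f:
#     bios = f.read()
# print(read_tdp_mw(bios, TDP_OFFSETS["max_tdp_limit"]) / 1000, "W")
```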


----------



## bloodhawk

Quote:


> Originally Posted by *ThingyNess*
> 
> Sure thing! For future reference - all the Pascal cards have mostly the same _relative_ offsets between bits of data in the BIOS, although the starting offsets are different from GPU to GPU (1070, 1080, and 1080Ti offsets are all different)
> 
> The below data is from the 1080Ti FE BIOS - the offsets should be the same between all 1080Ti BIOSes, so if you want to know the max TDP adjustment limit for any 1080Ti BIOS, just open up your hex editor of choice, jump to 0x00030e78 and read those 4 bytes. Don't forget to swap from little endian to big endian before converting to decimal.
> 
> Code:
> 
> 
> 
> Code:
> 
> 
> 1080Ti BIOS offsets:
> Note - in the BIOS the numbers are stored LSB first, so if you see "48 e8 01 00" in your hex editor, you need to swap to MSB first notation before converting to decimal.
> 
> "48 e8 01 00" would become "00 01 e8 48" which when converted to decimal = 125000 which is your Min TDP target in Milliwatts.
> 
> Min TDP target                  DWORD - 0x00030e70 - (48 e8 01 00) = 125000mW = 125W
> Default TDP target              DWORD - 0x00030e74 - (90 D0 03 00) = 250000mW = 250w
> Max TDP limit                   DWORD - 0x00030e78 - (e0 93 04 00) = 300000mW = 300w
> 
> Motherboard Default TDP Target  DWORD - 0x00030ea2 - (D0 01 01 00) = 66000mW = 66W
> Motherboard Max TDP Limit       DWORD - 0x00030ea6 - (B0 30 01 00) = 78000mW = 78W
> 
> PCIe1 Default TDP Target        DWORD - 0x00030ed0 - (D8 53 01 00) = 87000mW = 87W
> PCIe1 Max TDP Limit             DWORD - 0x00030ed4 - (B8 82 01 00) = 99000mW = 99W
> 
> PCIe2 Default TDP Target        DWORD - 0x00030efe - (D0 78 02 00) = 162000mW = 162W
> PCIe2 Max TDP Limit             DWORD - 0x00030f02 - (98 ab 02 00) = 175000mW = 175W
> 
> Fan Curve Entry 1               WORD  - 0x0003150a - (40 05) = 1344 rpm
> Fan Curve Entry 2               WORD  - 0x0003150c - (4c 04) = 1100 rpm
> Fan Curve Entry 3               WORD  - 0x0003150e - (80 0a) = 2688 rpm
> Fan Curve Entry 4               WORD  - 0x00031510 - (60 09) = 2400 rpm
> Fan Curve Entry 5               WORD  - 0x00031512 - (50 0b) = 2896 rpm
> Fan Curve Entry 6               WORD  - 0x00031514 - (c0 12) = 4800 rpm
> 
> Fan Max RPM                     WORD  - 0x000314e6 - (c0 12) = 4800 rpm


Ooooh baby, can't thank you enough; made my life so much easier and saved me so much time.
First thing I'll try now is to manually edit the BIOS and flash it on using a programmer.

Will let you guys know how it goes tonight or tomorrow night.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> I got it for $7 but I can't run it because it doesn't like msi afterburner and Riva tuner, even if I run RSS in stealth mode.
> 
> Haven't figured out why I can't run it and have afterburner open at the same time. Might have to email 3dmark support.


It's your NVIDIA control panel settings (I've been meaning to tell you this; I got you, mane!!). One is shader cache, one is aniso filtering... there are more. Just set EVERYTHING to default or "application controlled". For me I kept getting hung on Time Spy and getting a launch error b/c I usually run 16x aniso filtering overriding from NCP, and Time Spy hates that.

Quote:


> Originally Posted by *wholeeo*
> 
> Finally, reserved one for pickup. You have better chances of running into a unicorn than finding one of these things in store. Now to see how to get the wife to approve.


just sell your Tis, why would she care then?


----------



## BoredErica

Does that extra power from the shunt mod really matter? What are we talking here, 25mhz on top from 150% power draw?

That PSU that can measure power draw sounds sweet, sort of want one.


----------



## Exilon

Quote:


> Originally Posted by *Darkwizzie*
> 
> Does that extra power from the shunt mod really matter? What are we talking here, 25mhz on top from 150% power draw?
> 
> That PSU that can measure power draw sounds sweet, sort of want one.


Before I did the shunt mod, I was throttling down to 1950 in super heavy load (Overwatch uncapped frames, FS Extreme) with the GPU barely boosting above the 1V bin. Now I can hold 2100 steady in all load scenarios.

Your mileage will vary based on silicon leakiness.

Not that I need it now that I'm on a Stellaris binge...


----------



## Slackaveli

Quote:


> Originally Posted by *Dasboogieman*
> 
> Dat new GPU smell.
> 
> 1080ti Aorus Generic model just came in. Completely mediocre overclocks in all respects.
> https://www.techpowerup.com/gpuz/details/7hawa
> 
> seems to cap out at 2037mhz with 1.093V. Get soft crashing in FFXIV with 2050mhz. But meh, dun't really mind.
> 
> Build quality is similar but not the same as the extreme version. The Copper back block is less chunky, the Aorus LED is smaller on the backplate and no backplate thermal pads for the mosfets.
> 
> TDP is capped to 375W if you slide the AB bar to max. Also perma VREL regardless of what I do, this thing is screaming for volt mods.
> 
> Also my 7700k is freshly delidded, 5.0 Ghz is now 1.35v stable (-2 AVX offset) with load temps <70 (down from total thermal shutdown lol).


I feel like you added volts. I am right.

Protip: DON'T.

Also, 1631 core is the most anyone should expect out of a non-Extreme Aorus. They bin over there (GPU Gauntlet), so if a GPU were passing at 1633 (plus whatever their headroom allowance dictates in Gauntlet policies), it would be an Extreme. I got very lucky in my eyes to hold 1646 on a non-Extreme, and I couldn't get close to that adding volts. Just lock the clock at 2050 at 1.05V and test, then try 2032.5, then 2020. You are better off than you realize, I think.
Quote:


> Originally Posted by *Dasboogieman*
> 
> For mine, I use the DP that sits just above the DVI port. My Aorus is using the VR mode VBIOS by default so I flashed only that (I tested by plugging in a DVI and Windows considered it a brand new GPU being installed).
> 
> Seems good though, I'm getting the full fat 375W Power limit, the VREL is completely gone, and at 1.093V I'm boosting to 2063mhz stable so far on FFXIV.
> 
> Actually on that note, the FFXIV benchmark seems like a good stability test since it soft-crashes (CTD) upon the slightest instability.


Agreed. Plus, it's the prettiest benchmark to watch~!
Quote:


> Originally Posted by *Dasboogieman*
> 
> For mine, I use the DP that sits just above the DVI port. My Aorus is using the VR mode VBIOS by default so I flashed only that (I tested by plugging in a DVI and Windows considered it a brand new GPU being installed).
> 
> Seems good though, I'm getting the full fat 375W Power limit, the VREL is completely gone, and at 1.093V I'm boosting to 2063mhz stable so far on FFXIV.
> 
> Actually on that note, the FFXIV benchmark seems like a good stability test since it soft-crashes (CTD) upon the slightest instability.


so, if one were to use the HDMI that is supposed to shut off when using DVI, that's putting it on the other bios, right?


----------



## KedarWolf

Quote:


> Originally Posted by *Exilon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Darkwizzie*
> 
> Does that extra power from the shunt mod really matter? What are we talking here, 25mhz on top from 150% power draw?
> 
> That PSU that can measure power draw sounds sweet, sort of want one.
> 
> 
> 
> Before I did the shunt mod, I was throttling down to 1950 in super heavy load (Overwatch uncapped frames, FS Extreme) with the GPU barely boosting above the 1V bin. Now I can hold 2100 steady in all load scenarios.
> 
> Your mileage will vary based on silicon leakiness.
> 
> Not that I need it now that I'm on a Stellaris binge...
Click to expand...

All this benching I've been doing is pretty much a moot point for myself.

I use a 4K G-Sync screen and have the frame rate capped at 59 FPS in all the games I play, so my clocks really don't max out anyway.


----------



## wholeeo

Quote:


> Originally Posted by *Slackaveli*
> 
> it's your NVIDIA control panel settings, (ive been meaning to tell you this- i got you, mane!!) . One is shader cache, one is Anio filtering, ... there are more. Just set EVERYTHING to default or "application controlled". For me I kept getting hung on Timespy and getting a launch error b/c I usually run 16x Anio filter from NCP overiding, and Timespy hates that.
> just sell your Tis, why would she care then?


I only have a 1080 at the moment which is already overkill for what I play.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> it's your NVIDIA control panel settings, (ive been meaning to tell you this- i got you, mane!!) . One is shader cache, one is Anio filtering, ... there are more. Just set EVERYTHING to default or "application controlled". For me I kept getting hung on Timespy and getting a launch error b/c I usually run 16x Anio filter from NCP overiding, and Timespy hates that.
> just sell your Tis, why would she care then?


Thanks, I'll give it a try, but I think my problem is RTSS related since I can run it with RTSS closed. But I just disabled something that has been causing other issues, custom Direct3D. Will try it tonight.


----------



## joder

Quote:


> Originally Posted by *nrpeyton*
> 
> The 'certificates bypassed version' of nvflash already exists, and its even been updated to be pascal compatible.
> 
> The problem is deciphering the BIOS files. (figuring out what to edit).
> 
> I could use a hex-editor to edit my BIOS file right now, and I have the tools to successfully flash it to my 1080.
> 
> The problem, is any edit I done.. would be a "shot in the dark".


Shows how much I know. I thought that was the sticking point at the moment.

See this post: http://www.overclock.net/t/1621789/pascal-bios-editor-any-news/20#post_25955570


----------



## Slackaveli

Quote:


> Originally Posted by *ThingyNess*
> 
> Checked this in a hex editor as my 1080Ti hasn't arrived yet. Looks like stock FE power limits - 250w default, 300w max.
> A lot of the water cooled / AIO BIOSes are like this and I don't know why. A water cooled card is the most likely to run into power limits since thermals aren't an issue, but the manufacturers keep making them on PCBs with weak VRMs and with low power limits.
> 
> On the MSI 1080 (non-Ti) Seahawk X they gave you a LOWER power limit than even the FE! At least with this one they kept it the same.
> 
> https://www.techpowerup.com/vgabios/183939/msi-gtx1080-8192-160509
> vs
> https://www.techpowerup.com/vgabios/184559/msi-gtx1080-8192-160607
> 
> I guess they really don't have a lot of faith in their hybrid VRM cooling....


The only ones that break this trend are the EVGA 1080 Hybrid FTW (maybe the Gigabyte Extreme Waterforce too; not sure, but I think so); all the others are stock PCB/power limit. Best perf with an AIO is still gonna be an Aorus Extreme with an AIO kit, or eventually Zotac's highest end with an AIO (obviously fully under water is best).


----------



## GRABibus

Quote:


> Originally Posted by *ThingyNess*
> 
> Checked this in a hex editor as my 1080Ti hasn't arrived yet. Looks like stock FE power limits - 250w default, 300w max.
> A lot of the water cooled / AIO BIOSes are like this and I don't know why. A water cooled card is the most likely to run into power limits since thermals aren't an issue, but the manufacturers keep making them on PCBs with weak VRMs and with low power limits.
> 
> On the MSI 1080 (non-Ti) Seahawk X they gave you a LOWER power limit than even the FE! At least with this one they kept it the same.
> 
> https://www.techpowerup.com/vgabios/183939/msi-gtx1080-8192-160509
> vs
> https://www.techpowerup.com/vgabios/184559/msi-gtx1080-8192-160607
> 
> I guess they really don't have a lot of faith in their hybrid VRM cooling....


Quote:


> Originally Posted by *PasK1234Xw*
> 
> Just a heads up MSI 1080Ti seahawk X in stock at newegg
> 
> https://www.newegg.com/Product/Product.aspx?Item=N82E16814137126


Already out of stock ?

https://www.newegg.com/Product/Product.aspx?Item=N82E16814137126


----------



## lilchronic

unigine superposition benchmark
1080p Extreme
2000Mhz / 6000Mhz @ .981v


----------



## Slackaveli

Quote:


> Originally Posted by *ThingyNess*
> 
> Also, don't forget to all those that are benchmarking. Not all tests/loads are power limited. The higher the resolution/complexity, the more likely it will be that the card is TDP limited.
> In addition, the extra TDP headroom doesn't help your average FPS as much as you might think. What it does help a lot is the _minimum_ FPS. Even in heaven, there's only like 5% of the benchmark that's really intensive, the rest of it won't power limit even with a 300w cap anyway.
> 
> Try running The Witcher 3 in 5k res with everything maxed, and watch your minimum FPS through a benchmark run.
> 
> Switching P-states does have a slight latency to it and can cause a bit more irregularity in frame times if they are bouncing around due to coming up against the TDP limiter.
> 
> Being able to stay in the absolute top P-state and max clocks throughout an entire benchmark run improves the perceived smoothness from my experience as well, even if it might only translate to a .1 difference in the average FPS at the end of the run.


Yeah, since I got the Aorus with its 300W base, I never power throttle in Heaven at 103% power and up. It just won't go over 300W (Heaven) even maxed. But FFXIV, now THAT is a power-hungry benchmark (and free, and BEAUTIFUL). I ran the command prompt while it was running (but the pic saved in a weird place, I think, b/c now I can't find it :/), and I was hitting 360W (with still no power throttling).


----------



## nrpeyton

oops sorry, deleted, wrong thread.


----------



## Slackaveli

Quote:


> Originally Posted by *bloodhawk*
> 
> Cant Argue with that. Ill post results with the power mod in a day or 2.
> 
> JESUS CHRIST, i noticed i forgot to set my vcore back to 1.328V from 1.48V (for 4.8Ghz), and accidentally set it to 1.38V.


How long have you been running like that? For the most part it would be lowered a bit automatically and is safe, but dang. You could have been running at 4.7 all that time.


----------



## asuindasun

Anyone have issues with coil whine on their FE cards? Found an EVGA FE in stock but have horrible coil whine from my 980 reference card that I'd like to avoid this go around.


----------



## bloodhawk

Quote:


> Originally Posted by *Slackaveli*
> 
> how long you been running like that? for the most part it would be lowered a bit automatically and is safe, but dang. You could have been running at 4.7 all that time.


2 days or so lol.
But under water it barely goes over 65C with anything lower than 1.4V (non-AVX), and my Windows power profile is always set to High Performance, so it never drops.

Plus I have 4 x 140mm Noctua 3000rpm fans on the 280mm rad, which keeps this puppy super quiet. Come month end, I'm getting a 6950X + a water chiller.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> I say we pull kedrawolf to a soccer field and quarter his limbs off using 4 horse carriages!
> 
> Anyways, this is a shunt mod video with Witcher 3. Power draw doesn't go above 80% in most cases. The guy runs at 1.094V too with 2114 Mhz.
> 
> 
> 
> 
> 
> 
> The shunt mod is just where it is. Risky, but rewards are risky.


UGGGH> Who uses WHITE for their OSD? U G L Y

Orange or Pink only, please.


----------



## bloodhawk

Quote:


> Originally Posted by *asuindasun*
> 
> Anyone have issues with coil whine on their FE cards? Found an EVGA FE in stock but have horrible coil whine from my 980 reference card that I'd like to avoid this go around.


I have a bit of coil whine on my EVGA FE, but its barely audible and only under specific loads.


----------



## JedixJarf

Quote:


> Originally Posted by *bloodhawk*
> 
> I have a bit of coil whine on my EVGA FE, but its barely audible and only under specific loads.


Same with my FE, only comes out while playing battlegrounds


----------



## joder

Quote:


> Originally Posted by *bloodhawk*
> 
> I have a bit of coil whine on my EVGA FE, but its barely audible and only under specific loads.


May as well wait for Skylake-X this summer. That's what I am going to do.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *bloodhawk*
> 
> ooooh baby cant thank you enough, made my life so much easier and saved me so much time.
> First thing ill try now is to manually edit the BIOS and flash it on using a programmer.
> 
> Will let you guys know how it goes tonight or tomorrow night.


This would be interesting. We know the card can temporarily go past 300 watts on its own. Would be nice if it were as simple as unlocking that limit and letting it draw up to 350 watts. It would be a true shunt mod replacement.


----------



## Slackaveli

Quote:


> Originally Posted by *Exilon*
> 
> Before I did the shunt mod, I was throttling down to 1950 in super heavy load (Overwatch uncapped frames, FS Extreme) with the GPU barely boosting above the 1V bin. Now I can hold 2100 steady in all load scenarios.
> 
> Your mileage will vary based on silicon leakiness.
> 
> Not that I need it now that I'm on a Stellaris binge...


Hilarious! That's like my buddy who has two Titan X(2016) and all he plays is LoL. And he plays it on a 1080p TV (60Hz of course) with uncapped framerates and always says stuff like, "dang. Im getting 700 FPS!!" Long time real life friend and I 've educated him but still he doesn't change. Too funny. My reply is usually something like, "Now if they only made 1Mhz 1080p tv's, you'd be in Heaven!"
Quote:


> Originally Posted by *lilchronic*
> 
> unigine superposition benchmark
> 1080p Extreme
> 2000Mhz / 6000Mhz @ .981v


oh, sweet, been waiting on this one!!!
Quote:


> Originally Posted by *bloodhawk*
> 
> 2 days or so lol.
> But under water it barely goes over 65C with anything lower than 1.4V. (non AVX) , and my Widnows Profile is always set to high performance, it never drops
> 
> 
> 
> 
> 
> 
> 
> 
> Plus i have 4 x 140mm Noctua 3000rpm fans on the 280mm rad, keeps this puppy super quite. Come month end, and im getting a 6950X + a water Chiller.


Lol, dang, that's going to be nasty. I've seen nothing but amazing results from folks using chilled water.

I love the Noctua F-12s that I have on mine, and the other 2 I have in my case: one as intake, one pointed directly at the GPU blowing towards the exhaust, half blowing below the card, half directly across the backplate and that copper towards the exhaust. It helped! Noctua F-12s are beast.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> UGGGH> Who uses WHITE for their OSD? U G L Y
> 
> Orange or Pink only, please.


Agreed! Lol, it was very hard to see.


----------



## bloodhawk

Quote:


> Originally Posted by *SlimJ87D*
> 
> This would be interesting. We know the card can temporarily go past 300 watts on its own. Would be nice if it was simple as unlocking that limit and letting it draw up to 350 watts. It would be a true shunt mod replacement.


Yeah, it should work, since I have done the same on notebook 1070s/1080s (MXM, desktop-class) and it works.
But if they have some checks or further limits implemented close to the power controller, then a shunt mod might still be needed.
Quote:


> Originally Posted by *joder*
> 
> May as well wait for Skylake-X this summer. That's what I am going to do.


I'm not really going to invest in a brand new platform, tbh. I already have a 2 x 18-core Xeon workstation, and this desktop is more for fun and games. Unless Skylake-X will fit in the same socket type?

Add to that a 6700k + 980 (desktop-class) laptop and another with a 7700k + 2 x 1080 (desktop-class). (More like small desktop replacements that are kinda portable.)


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> Hilarious! That's like my buddy who has two Titan X(2016) and all he plays is LoL. And he plays it on a 1080p TV (60Hz of course) with uncapped framerates and always says stuff like, "dang. Im getting 400 FPS!!" Long time real life friend and I 've educated him but still he doesn't change. Too funny. My reply is usually something like, "Now if they only made 1Mhz 1080p tv's, you'd be in Heaven!"


Man, people like to defend the 1080 Ti at 1080p. I always tell them to save up for a new monitor; before they know it, a 2080 or 2080 Ti will be out.


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> All this benching I've been doing is pretty much a moot point for myself.
> 
> I use a 4K G-Sync screen and have the frame rate capped at 59 FPS all the games I play so my clocks really don't max out anyways.


Wait, they _should_ be maxing out. I almost never go below 59 when I am on my 4K; I run it exactly the same as you. Are you using AA? If you turn on MFAA in the NVIDIA control panel and use no more than 2x AA in game, _that makes effective AA_ 4x, and across pixels, so you will never see jaggies. If you are using more than that just to say "all ultra," you are really screwing yourself. Framerate is THE most important setting to have maxed out, and if that requires something to be lowered, then so be it. Especially if it's a redundant setting like AA at 4K.


----------



## Slackaveli

Quote:


> Originally Posted by *GRABibus*
> 
> Already out of stock ?
> 
> https://www.newegg.com/Product/Product.aspx?Item=N82E16814137126&nm_mc=AFC-C8Junction&cm_mmc=AFC-C8Junction-VigLink-_-na-_-na-_-na&cm_sp=&AID=10446076&PID=6146846&SID=j1dx7i127h000kb500053


NOPE!!! For sale RIGHT NOW, HURRY.
Quote:


> Originally Posted by *wholeeo*
> 
> I only have a 1080 at the moment which is already overkill for what I play.


Hmmm, really depends on the wife. Here's a sure-fire way: sell your 1080 on eBay and get your 1080 Ti, because future-proofing and VRAM (you can always say you'll need it for the game you are waiting on and REALLY looking forward to). But tell her you really want her to not begrudge the purchase, so tell her it's OK for her to go buy whatever she wants around $200, or that you owe her a date at her favorite, most expensive restaurant. Once she accepts, she'll never be able to effectively hold it against you, or will never even be tempted to if she's a more understanding wifey.


----------



## BoredErica

Unless told otherwise, I'm going to pre-order EVGA SC Black Edition for $700 shipped.

I dunno what's going on with the power draw there. I know EK will do a full-cover block for it, and Heatkiller probably will but simply hasn't updated their compatibility sheet yet. Hoping their rep will fill me in soon.


----------



## ThingyNess

Quote:


> Originally Posted by *Slackaveli*
> 
> the only one(s) which breaks this trend was the EVGA 1080 hybrid FTW (maybe the gigabyte extreme waterforce, too, not sure but i think so), all the others are stock pcb/power limit. Best perf with an AIO is still gonna be an Aorus Extreme with an AIO kit or eventually the ZOtac highest end w/ a AIO (obviously fully underwater is best).


The special MSI 1080 Seahawk EK X had good power limits too, 240w default/291w max, but of course you have the extra expense and complexity of a custom loop if you didn't already have one - looks like it used the same PCB and power limits as the 1080 Gaming X.

https://www.techpowerup.com/vgabios/184410/msi-gtx1080-8192-160603


----------



## joder

What do folks recommend for paste these days? I have always used AS5 and been happy enough. I know there is better stuff out there. Obviously I am looking for ease of application and performance. I don't really want anything liquid metal-wise.


----------



## Blotto80

So I'm thinking of taking the plunge and building a custom loop. With one Ti and a 5930k, would one 360mm rad do the trick or should I plan for two? I've got an H440 so I've got room for two but my budget would prefer only having to buy one.


----------



## Slackaveli

Quote:


> Originally Posted by *ThingyNess*
> 
> The special MSI 1080 Seahawk EK X had good power limits too, 240w default/291w max, but of course you have the extra expense and complexity of a custom loop if you didn't already have one - looks like it used the same PCB and power limits as the 1080 Gaming X.
> 
> https://www.techpowerup.com/vgabios/184410/msi-gtx1080-8192-160603


good find.
Quote:


> Originally Posted by *joder*
> 
> What do folks recommend for paste these days? I have always used AS5 and been happy enough. I know there is better stuff out there. Obviously I am looking for ease of application and performance. I don't really want anything liquid metal-wise.


Thermal Grizzly Kryonaut is best; Gelid is a very close second.


----------



## bloodhawk

Quote:


> Originally Posted by *joder*
> 
> What do folks recommend for paste these days? I have always used AS5 and been happy enough. I know there is better stuff out there. Obviously I am looking for ease of application and performance. I don't really want anything liquid metal-wise.


I swear by Kryonaut and Conductonaut.
Gelid GCE and Mastergel Maker Nano a close second.


----------



## SEALBoy

Quote:


> Originally Posted by *NYU87*
> 
> After using voltage curve, looks like my MSI GTX 1080 Ti Gaming X is capable of 2.088GHz, trying to push for 2.1GHz crashes Firestrike. Anything else I can do to get 2.1GHz? Do you think lowering memory clocks can help me get 2.1GHz?
> 
> 32.4K graphics score in FS at 2.088GHz.


Hey, mind sharing a picture of your voltage curve and Afterburner settings? 2088MHz is higher than most people can get.


----------



## Slackaveli

NEW Ryzen benchmarks that INCLUDE the 5775c; omg, this thing is a beast (Broadwell FTW!). Even in games where the 8-core CPUs are getting utilized, I am still right there with CPUs that cost almost triple (for the most part, Warhammer notwithstanding). And still pooping all over a 5.0GHz 7700k (considering the older platform and slower RAM, it's amazing, really); I LOVE THIS BROADWELL/1080 Ti combo SO FREAKING MUCH.

(sorry for yelling)

https://www.purepc.pl/procesory/test_procesora_amd_ryzen_5_1400_konkurent_intel_core_i5_7400?page=0,30

 Ryzen is somewhere down there out of the shot :/


----------



## rcfc89

Quote:


> Originally Posted by *Blotto80*
> 
> So I'm thinking of taking the plunge and building a custom loop. With one Ti and a 5930k, would one 360mm rad do the trick or should I plan for two? I've got an H440 so I've got room for two but my budget would prefer only having to buy one.


SLI is so bad right now. Scaling is just bad, if the game uses it at all. I've run SLI for the last 6 years and this is the first time I'm writing it off and going with a single card. I'll have $1400 in an Xp on water, the same price as 1080 Ti SLI on air. If that doesn't tell you how bad it is, nothing will. Stick to a single card.


----------



## GRABibus

Quote:


> Originally Posted by *Slackaveli*
> 
> NOPE!!! For sale RIGHT NOW, HURRY.


I'm waiting for it to be available in France (beginning of May).
Newegg shipping to France must be expensive and slow.....


----------



## Addsome

Quote:


> Originally Posted by *rcfc89*
> 
> SLI is so bad right now. Scaling is just bad if the game uses it at all. I've ran SLI for the last 6 years and this is the first time I'm writing it off and going with a single card. I'll have $1400 in a Xp on water. Same price as 1080Ti SLI on Air. If that doesn't tell you how bad it is nothing will. Stick to a single card.


I think he's asking if he should put in 2 rads, not 2 Tis.


----------



## Blotto80

Quote:


> Originally Posted by *Addsome*
> 
> I think he's asking if he should put in 2 rads, not 2 Tis.


Yes, one or two rads. I ran Crossfire a few times and have zero intention of ever running mGPU again.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> NEW Ryzen benchmarks that INCLUDE 5775c, omg this thing is a beast (Broadwell FTW!). Even in games where the 8 core cpus are getting utilized, I am still right there with cpu's that cost almost triple (for the most part, warhammer notwithstanding). And still pooping all over a 5.0ghz 7700k (considering the older platform, slower ram, it's amazing, really); I LOVE THIS BROADWELL /1080ti combo SO FREAKING MUCH.
> 
> (sorry for yelling
> 
> 
> 
> 
> 
> 
> 
> )
> 
> https://www.purepc.pl/procesory/test_procesora_amd_ryzen_5_1400_konkurent_intel_core_i5_7400?page=0,30
> 
> Ryzen is somewhere down there out of the shot :/


It wouldn't make too much of a difference with my 4790K at 1440p or 4K, right?


----------



## Slackaveli

Quote:


> Originally Posted by *Addsome*
> 
> I think hes asking if he should put in 2 rads and not 2 Tis.


I've never run a loop, but I can still answer: one would be fine, especially if you add a chiller. You'd perhaps need a second one if you go SLI.
Quote:


> Originally Posted by *SlimJ87D*
> 
> It wouldn't make too much of a difference with my 4790K at 1440P or 4K right?


Considering you can sell your 4790k for ~$300+, it's a ~$50 upgrade. That is the deal of a lifetime, as you'll get a whole platform's worth of performance upgrade for that small money. Short answer: yes, it still outperforms in highs, lows, and average frames at 4K. The 134MB combined L3+L4 cache vs 8MB cache..... That's a roflstomp. And even at 1440p/4K you'll gain ~20% FPS in many titles.

PSA- If you are on z97, the very very very best upgrade you can do in perf per dollar is a drop-in cpu upgrade to Broadwell-C. It's literally a no-brainer, deal of a lifetime. THAT'S why you've never heard about it tbh, Intel tried very hard to kill this thing off.


----------



## bloodhawk

Quote:


> Originally Posted by *Slackaveli*
> 
> i've never ran a loop but i can still answer, one would be fine, especially if you add a chiller. It would require a second one perhaps if you go sli.


A single 1080 Ti can be run on a single 280mm, 35-40mm-thick rad, along with a 6-core processor, and it will still run slightly cooler than an AIO (push/pull).
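For the one-rad-or-two question, a common rule of thumb (and it's only a rule of thumb; thickness, fin density, and fan speed all move the number) is roughly 100-150 W of heat per 120mm of radiator at quiet fan speeds. A quick sketch under that assumption, with ballpark overclocked heat loads that are guesses, not measurements:

```python
import math

def rads_needed(heat_watts, watts_per_120mm=120):
    # Round up to whole 120mm sections, assuming the rule-of-thumb
    # dissipation per section at moderate fan speeds.
    return math.ceil(heat_watts / watts_per_120mm)

# Ballpark overclocked heat loads (assumptions for illustration):
gpu = 300   # 1080 Ti at a raised power limit
cpu = 200   # overclocked 5930k six-core
sections = rads_needed(gpu + cpu)
print(sections)  # 5 x 120mm worth of rad, e.g. a 360 plus a 240
```

By that math one 360 is marginal for a hot-clocked Ti plus a 5930k, which squares with the "one thick 280 push/pull works, two rads is comfortable" experience above.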


----------



## Addsome

Quote:


> Originally Posted by *Slackaveli*
> 
> i've never ran a loop but i can still answer, one would be fine, especially if you add a chiller. It would require a second one perhaps if you go sli.
> considering you can sell your 4790k for ~$300+ it's a ~$50 upgrade.That is the deal of a lifetime as you'll have a whole platform's worth of performance upgrade for that small money. Short answer, yes it still outperforms in highs, lows, and average frames in 4k. The 134mb combined L3+L4 cache vs 8mb cache..... That's a roflstomp. And even in 1440p/4k you'll gain ~20% fps in many titles.


Can you link benchmarks showing a 20% FPS gain?


----------



## Slackaveli

Nope. Intel has a ban on those, apparently. I can link the owner's club where 90% of us came from a 4690k/4790k and there is a 100% approval rating for this CPU. I've read the entire thread and every single one of us is super, super happy. Testimonials and what-not. Unfortunately you will NEVER see benches unless they come from very far-flung places, and I suspect that is part of Intel's blackballing of this CPU. There just aren't any benches (or super few, and all have the 5775c @ 3.3GHz when all of us can hit 4.2-4.4GHz), but 1080p is the very best way to bench a CPU anyway. I understand you want to know if it's better at that res, and I'm telling you NONE of us were gaming at 1080p and we all see vast improvements across the board, including HUGE HUGE gains in games, especially in my case at 144Hz 1440p, and power consumption is way down. These are 14nm CPUs; they sip power. Just go read 10 or 20 pages here... http://www.overclock.net/t/1583537/intel-broadwell-c-ownership-club/720#post_26005440

It's simple math: IPC improvement, die shrink, single-core and physics scores are up, and 134MB vs 8MB cache....... As the articles I linked are titled, "Broadwell Destroyer," and they call it "The Gaming Behemoth." I couldn't have said it any better. It beats the 7700k, so of course it roflstomps the 4790k, even the golden 5.0GHz ones like many of us sold or gifted when we jumped on The Gaming Behemoth. I'm pretty much all gamer, so for me I wouldn't trade it for a 7700k. Nope.


----------



## coolharris93

Just got my Asus 1080 Ti Strix. 69C highest temp with a 1939 boost clock without even touching anything. The fans are loud at 100%, but on the stock curve they're nearly silent even when the card is under heavy stress. So excited with this card so far!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> i've never ran a loop but i can still answer, one would be fine, especially if you add a chiller. It would require a second one perhaps if you go sli.
> considering you can sell your 4790k for ~$300+ it's a ~$50 upgrade.That is the deal of a lifetime as you'll have a whole platform's worth of performance upgrade for that small money. Short answer, yes it still outperforms in highs, lows, and average frames in 4k. The 134mb combined L3+L4 cache vs 8mb cache..... That's a roflstomp. And even in 1440p/4k you'll gain ~20% fps in many titles.
> 
> PSA- If you are on z97, the very very very best upgrade you can do in perf per dollar is a drop-in cpu upgrade to Broadwell-C. It's literally a no-brainer, deal of a lifetime. THAT'S why you've never heard about it tbh, Intel tried very hard to kill this thing off.


Resale value for the 4790K is $250 =/, so it'd cost $100 extra.


----------



## Addsome

Quote:


> Originally Posted by *Slackaveli*
> 
> Nope . Intel has a ban on those apparently. I can link the owner's club where 90% of us came from 4690k/4790k and there is a 100% approval rating for this cpu. I've read the entire thread and every single one of us are super super happy. Testimonials and what-not. Unfortunitely you will NEVER see benches unless they come from very far flung 3rd world places , and I suspect that is part of Intel's blackballing of this cpu. There just aren't any benches (or super few and all have the 5775c @ 3.3ghz when all of us can hit 4.2-4.4ghz), but 1080p is the very best way to bench cpu anyway . I understand you want to know if it's better at that rez, and im telling you NONE of us were gaming at 1080p and we all see vast improvements across the board, including HUGE HUGE gains in games, especially in my case when I'm at 144hz 1440p, and power consumption is way down. these are 14nm cpus. They sip power. Just go read 10 or 20 pages here... http://www.overclock.net/t/1583537/intel-broadwell-c-ownership-club/720#post_26005440
> 
> it's simple math. ipc improvement, die shrink, single core and physics scores are up, and 134MB vs 8mb cache.......


There's no way the 5775c gains 20% FPS @ 1440p/4K over a 4790k. The benchmark you showed before showed an 18% gain, and that was at 1080p. That gain will diminish at higher res. There's no doubt a 5775c is better, but not 20% better.


----------



## Slackaveli

Quote:


> Originally Posted by *Addsome*
> 
> Theres no way the 5775c gains 20%fps @ 1440p/4k over a 4790k. The benchmark you showed before showed a 18% gain and that was at 1080p. That gain will diminish at higher res. Theres no doubt a 5775c is better but not 20% better.


But it DOESN'T diminish at higher res. In fact it can become even more pronounced than that in heavy CPU titles. The gains are because of the ENORMOUS CACHE DELTA of 126MB; that's over 1500% more cache, bro. You are just wrong, but IDC, I'm good. And 18.7% is the same thing as ~20%, btw.

Regardless of what % it wins by, it will NEVER lose to a 4790k in any title and it will ALWAYS beat it by at least some measurable amount. For $50.
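For what it's worth, the cache arithmetic being thrown around does check out, assuming the 5775c's 6 MB L3 plus 128 MB eDRAM L4 against the 4790k's 8 MB L3:

```python
broadwell_c = 6 + 128   # MB: L3 + eDRAM L4 on the 5775c
haswell = 8             # MB: L3 on the 4790k

delta = broadwell_c - haswell
pct_more = delta / haswell * 100
print(delta, round(pct_more))  # 126 MB delta, 1575% more cache
```

Whether that delta translates to FPS at 1440p/4K is the part actually in dispute here; the raw numbers themselves aren't.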


----------



## Addsome

Quote:


> Originally Posted by *Slackaveli*
> 
> But it DOESN'T diminish at higher rez. In fact it can become even more pronounced than that in heavy cpu titles. The gains are because of the ENORMOUS CACHE DELTA of 126MB , that over 1500% more cache, bro. You are just wrong, but iDC, Im good. and 18.7% is the same thing as ~20%, btw.


At high res you are being limited by the GPU, not the CPU. No matter how good your CPU is, your GPU is still only pushing a certain number of frames.....


----------



## Slackaveli

Quote:


> Originally Posted by *Addsome*
> 
> At high res you are being limited by the GPU and not CPU. No matter how good your CPU is, your GPU is still only pushing a certain amount of frames.....


And those frames come out faster with more draw calls. And they have WAY more consistency. And way better minimums. And way more of a smooth gameplay experience. And some games are still CPU-limited at higher res. I feel like you are trying to educate me when you don't even know what you are talking about. You are using preconceived notions about CPUs that do not apply to this CPU because of its differences in architecture, namely and mostly its cache. Do you know what a cache does? Go do some googling. I'm gonna pass this argument off to the others, who will surely come and tell you the same thing I am. You can continue to save that one dinner-out's worth of money, continue to burn a crapload more power than us, and have a sub-par gaming experience in comparison. Cheers, enjoy that.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Resale value for the 4790K is $250 =/, so it'd cost $100 extra.


except that 99% of folks don't realize that the 4790k isn't the top upgrade path on a z97 board. All that ignorance helps with reselling the 4790k. Although it should be only worth $250, the five of us in this thread who have done it all got more than that. One guy got his 5775c for free (Palote99). One paid $30. And @Benny89 paid $50.
Quote:


> Originally Posted by *Benny89*
> 
> Just sell your 4790k. I sold mine, added 50$ and I upgrade to 5.0Ghz 7700k level of performance. For 50 bucks.


Quote:


> Originally Posted by *palote99*
> 
> You will notice a BIG improve ......
> 
> Waiting 4 your comments tomorrow......
> 
> HF


Quote:


> Originally Posted by *Benny89*
> 
> Y, I am little mad about it, but before second half of 2016 I didn't really need anything more than 4790k so glad I found this baby.
> 
> But still mad because I could have enjoyed it sooo much faster. That thing was one of the biggest scams Intel made.
> 
> I mean really, this is the LAST revolutionary CPU from Intel in the last 5 years.
> 
> This only proves how good CPUs could be, but Intel prefers to milk cows. Can't really blame them though..


Quote:


> Originally Posted by *Ansau*
> 
> In general the 7700k is sligthly ahead, but not by much:
> 
> 
> In some games the EDRAM does indeed its job and puts the 5775c as the best cpu in the world:
> 
> 
> 
> 
> It also puts it ahead of the 7700k in others:
> 
> 
> 
> 
> And when overclock, the 5775c manages to beat a 5GHz 7700k in some other games:


#BESTCPUINTHEWORLD


----------



## feznz

Quote:


> Originally Posted by *bloodhawk*
> 
> From 1950-2038MHz I barely even notice any difference in fps (maybe 5fps at most?) when playing games like DOTA2 and CSGO. Beyond 2000MHz, supposedly, if users can maintain <45C, the fps difference will be at MOST 2-5fps.
> But yet again, those sweet, sweet epeen points, man.


Quote:


> Originally Posted by *Slackaveli*
> 
> agreed. I've settled in at [email protected] 1.04v for long gaming sessions.


I think that's going to be the sweet spot for everyone.

Quote:


> Originally Posted by *mshagg*
> 
> Point taken, and I've settled for an easy OC in a lower voltage bin, but the higher core speeds definitely have an impact on benchmark scores. It's what those applications are designed to show, and they show it clearly.
> 
> Im not sure why you'd question people pushing the most out of their cards on an overclocking forum. Also, warnings about 'burning cards' seems like hyperbole when these things are locked down to pretty safe voltages. If nothing else it's also kinda interesting.


It's just that people are like, "look at my score with a quad core"; to be competitive you're going to need a 6950X, and then *benching is going to be about the size of your wallet, not your ability to OC*. To compare benches, as we all do, you have to look at the graphics score alone; the physics test is more of a money-spinner for Intel/AMD, trying to convince you that you need the latest CPU with the most cores. I am guilty too: when browsing bench results I always look at the top scores.

That said, I encourage you to bench; just look at the hours I spent. Doesn't feel like 46 hours.

If I look at my Fire Strike results with my GTX 770 SLI, I would be 3rd in the world with only a quad core, with only 6- and 10-core CPUs beating me, and FPS within a few frames of each other:
http://www.3dmark.com/compare/fs/3947052/fs/9368009/fs/2429341
That score was valid until new hardware came out that required even old scores to re-run the latest system info.

Actually, it was folding that fried the cards, at the maximum stable OC while keeping the GPUs under 60°C. Three weeks of 24/7 folding seemed a good idea at the time instead of turning on the heater.


----------



## Benny89

My 1080Ti STRIX is up and running. Time for some overclocking!


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> My 1080Ti STRIX is up and running. Time for some overclocking!


Benny, please chime in on this latest discussion, sir.

Quote:


> Originally Posted by *navjack27*
> 
> I'm too lazy/busy to revisit 5775c on my channel but I gave it high praise when it was my main chip. Still 100% the best min frame rate CPU you can buy. Broadwell-E is a joke and shouldn't have the broadwell name IMHO, you need the cache to have that name.


Quote:


> Originally Posted by *palote99*
> 
> Thanks for your answers!!
> 
> Thanks to MasterGamma12, Slackaveli and nizinizi too for helping me with my new toy!!
> 
> Y
> 
> Performance is great again and of course is better with high uncore and core voltage... I think the upgrade worth it completely from my old 4790k...... speaking about gaming i get a boost of 15-20fps and cpu is never over 60degrees Celsius
> 
> Thanks again and i wait for your opinion!!!
> 
> Cheers


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> Benny, pleas chime in on this latest discussion, sir.


Lel, my 5775C on stock is already giving me more min fps than my 4790k on 4.8 Ghz.

Gonna OC that baby when I finish playing with my new 1080Ti.

And yes it was 50$ Upgrade. Sold my 4.9 Ghz 4790k for 1500 PLN and bought my 5775C OEM for 1700 PLN. 200 PLN is roughly 50$.

5.0 Ghz 7700k performance in games for mere 50$. Come on....best deal in my life so far.


----------



## Benny89

What is the newest Afterburner version? Does it have voltage slider unlocked?


----------



## KingEngineRevUp

If I can sell my 4790K for $300 I'd love to get one, but it's selling for $250 right now.

Finding a used 5775C would be the key!


----------



## Addsome

Quote:


> Originally Posted by *Slackaveli*
> 
> and those frames come out faster with more draw calls. And they have WAY more consistency. And way better minimums. And way more of a smooth gameplay experience. And some games are still cpu limited at higher rez. I feel like you are trying to educate me when you dont even know what you are talking about. You are using pre-concieved notions about cpu's that do not apply to this cpu b/c of it's differences in archetecture, namely and mostly it's cache. Do you know what a cache does? Go do some googling. Im gonna pass this argument off to the others that will surely come and tell you the same thing I am. You can continue to save that one dinner's out worth of money and continue to burn a crap load more power than us and have a sub-par gaming experience in comparison to us. Cheers, enjoy that.


Until I see legit benchmarks @ 1440p and 4K of the 5775c having higher AVG fps, I won't believe you. All the benchmarks I'm seeing so far are either at 720p or 1080p. I'm not saying the 5775c is a bad chip; it's beating the 7700k @ 1080p. All I'm saying is that you're not gonna see a 20% AVG fps improvement at 1440p/4K. I'm running a 6700k, so there's really no point in me going to a 5775c, especially since I run @ 3440x1440.


----------



## joder

Quote:


> Originally Posted by *Slackaveli*
> 
> it's simple math. ipc improvement, die shrink, single core and physics scores are up, and 143MB vs 8mb cache....... As the articles I linked are titled "Broadwell Destroyer" and they call it "The Gaming Behemoth." I couldn't have said it any better. It beats 7700k, so of course it roflstomps 4790k, even the golden 5.0ghz ones like many of us sold/gifted when we jumped on The Gaming Behemoth. I'm pretty much all gamer so for me I wouldnt trade it for a 7700k. Nope.


I never would have given this CPU much attention without your endorsement. It really seems to be a beast, due to the eDRAM being available when using a discrete GPU, plus good overclockability. You would think they would have gimped it by locking the multiplier or something.


----------



## mshagg

Quote:


> Originally Posted by *SlimJ87D*
> 
> So you're telling me that my PSU, which measures watts, is wrong? Yeah, no thanks.
> 
> Running Furmark:
> Stock BIOS at 120%, which is its max: 410 watts to the system.
> 
> ASUS BIOS at 110%, which is its max: 410 watts to the system.
> 
> What you're saying is 410 =/= 410...


Again, thanks for your efforts in running these tests (so we don't have to, lol). The only question I'd raise is whether Furmark is the ideal method of testing. Given it is designed to maximise power draw, it would still hit TDP limits on the Strix BIOS and thus downclock... and, knowing what Pascal is like, no doubt overshoot the downclock, thus resulting in the 'low' power draw measurements.

I guess what I'm suggesting is, Furmark would only really prove the point if its power draw landed between 300W and 330W (which seems unlikely).

My own theory is that the different VRM configurations mean some extra load is being placed on the FE VRMs that pull power from motherboard voltage rail (whereas the Strix VRM configuration would be designed to pull the extra power from the 8+8 PCIE). Given the BIOS has 'sub limits' placed on each of the three power sources, it's likely to run up against the individual limit for the motherboard power source and trip the TDP limit power flag.

I guess the only way to test the hypothesis would be to apply the mod to the shunt which monitors power draw from the motherboard and leave the other two alone, whilst running the strix bios on an FE.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Lel, my 5775C on stock is already giving me more min fps than my 4790k on 4.8 Ghz.
> 
> Gonna OC that baby when I finish playing with my new 1080Ti.
> 
> And yes it was 50$ Upgrade. Sold my 4.9 Ghz 4790k for 1500 PLN and bought my 5775C OEM for 1700 PLN. 200 PLN is roughly 50$.
> 
> 5.0 Ghz 7700k performance in games for mere 50$. Come on....best deal in my life so far.


agreed. Deal of a lifetime.
Quote:


> Originally Posted by *Benny89*
> 
> What is the newest Afterburner version? Does it have voltage slider unlocked?


it's 4.4.0 and yes it's unlocked.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> If I can sell my 4790K for $300 I'd love to get one, but its selling for $250 right now.
> 
> Finding a used 5775C would be the key!


and THAT is the ultimate proof of how awesome these are. I have NEVER seen a used one for sale. Nobody gets rid of them , like, ever.
Quote:


> Originally Posted by *SlimJ87D*
> 
> If I can sell my 4790K for $300 I'd love to get one, but its selling for $250 right now.
> 
> Finding a used 5775C would be the key!


Honestly, it's a worthy upgrade at $100 as well. I even gifted my 4790k and I don't regret it even as a $350 upgrade. It's worth that.


----------



## Benny89

Hmm, OK, I started to OC and I noticed that, right after a start (+50 core, +350 memory, +100 voltage and max PL), the core clock is jumping around in Heaven like crazy: 2012, 1969, 1975, 2000. lol.

Temps are great. 65C was max in Heaven I had on 1440p Extreme.

But why are those clocks jumping like that.... ?


----------



## Slackaveli

Quote:


> Originally Posted by *joder*
> 
> I never would have really given this CPU much attention without your endorsement. It really seems to be a beast due the the eDRAM being available when using a GPU and good overclock-ability. You would think they would have gimped it by locking the multiplier or something.


Believe me, I was just as much in denial and shock when I discovered it (thanks, WCCFTECH comment section) as anybody. I couldn't believe it either. They couldn't lock it because they had already been promising this CPU for a long time, but they DID call it a "-C" so most of the world would THINK it's locked.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *mshagg*
> 
> Again, thanks for your efforts in running these tests (so we dont have to lol). The only question I'd raise is whether furmark is the ideal method of testing. Given it is designed to maximise power draw, it would still hit TDP limits on the Strix BIOS and thus downclock... and, knowing what Pascal is like, no doubt overshoots the downclock, thus resulting in the 'low' power draw measurements.
> 
> I guess what im suggesting is, Furmark would only really prove the point if its power draw landed between 300W and 330W (which seems unlikely).
> 
> My own theory is that the different VRM configurations mean some extra load is being placed on the FE VRMs that pull power from motherboard voltage rail (whereas the Strix VRM configuration would be designed to pull the extra power from the 8+8 PCIE). Given the BIOS has 'sub limits' placed on each of the three power sources, it's likely to run up against the individual limit for the motherboard power source and trip the TDP limit power flag.
> 
> I guess the only way to test the hypothesis would be to apply the mod to the shunt which monitors power draw from the motherboard and leave the other two alone, whilst running the strix bios on an FE.


That's a nice theory, but I think it's a combination of software and hardware, and possibly hardware only.

Something reads the three shunt resistor measurements and has the card cut power draw. We need to find which part at the software level changes the values and the math of how the software reads the resistors.

Flashing the BIOS did not do the trick. Something in the FE still reads the resistors and limits power draw. I'm not sure modifying the BIOS is enough.

The shunt mod is a hardware-level modification that changes the resistor values being read, so the card no longer limits power draw at 300W.
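To picture why the shunt mod fools the limiter, here is a rough sketch of the arithmetic. This is a toy model with assumed values, not measurements from any card:

```python
# The INA3221-style monitor infers current from the voltage drop across a
# known shunt resistor: I = V_drop / R_shunt. If the physical resistance is
# lowered (e.g. by adding conductive material in parallel) but the firmware
# still assumes the original value, reported power shrinks proportionally.

def reported_power(actual_watts, r_actual_mohm, r_assumed_mohm):
    """Power the controller *thinks* is drawn, given a shunt mod."""
    # The voltage drop is set by the real current through the real
    # resistance; the firmware divides that drop by its assumed resistance.
    return actual_watts * r_actual_mohm / r_assumed_mohm

# Stock: 5 mOhm shunt, firmware assumes 5 mOhm -> no skew.
print(reported_power(300, 5.0, 5.0))   # -> 300.0

# Mod: effective resistance halved to 2.5 mOhm; a real 360 W draw now
# reads as 180 W, comfortably under a 300 W limit.
print(reported_power(360, 2.5, 5.0))   # -> 180.0
```

This also lines up with Exilon's Gigabyte observation quoted earlier: flash a BIOS that assumes 2 mOhm onto a board with 5 mOhm shunts and the readings go the other way, which looks like a power fault.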


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> Hmm, ok I started to OC and I noticed after just a start (+50 core +350 memory) and + 100 V and PL the core clock is jumping in Heaven like crazy, 2012, 1969, 1975, 2000. lol.
> 
> Temps are great. 65C was max in Heaven I had on 1440p Extreme.
> 
> But why are those clocks jumping like that.... ?


What happens 10 minutes in? Still jumping around? Is voltage also jumping around?


----------



## Somasonic

Quote:


> Originally Posted by *Benny89*
> 
> Hmm, ok I started to OC and I noticed after just a start (+50 core +350 memory) and + 100 V and PL the core clock is jumping in Heaven like crazy, 2012, 1969, 1975, 2000. lol.
> 
> Temps are great. 65C was max in Heaven I had on 1440p Extreme.
> 
> But why are those clocks jumping like that.... ?


I noticed the same thing in benchmarks, it's like looking at a saw tooth. In gaming it holds a lot more steady so I haven't bothered to look into it further.


----------



## mshagg

Quote:


> Originally Posted by *SlimJ87D*
> 
> That's I nice theory but I think it's a combination of software and hardware and possibly just hardware only.
> 
> Something reads the 3 resistors measurements and has the card cut drawing power. We need to find what part in the software level to changes the values and math of how the software reads the resistors.
> 
> Flashing the bios did not do the trick. Something in the FE still reads the resistors and limits power draw. I'm not sure if modifying the bios is enough.
> 
> Shunt mod is a hardware level modification that changes the resistor values read and doesn't limit the power draw at 300W.


A few pages back we looked at the PCBs for the various cards (that we have info on at the moment). It was confirmed the Gigabyte card had different resistors for the shunts but I believe the others used the same 5 mOhm. However, Exilon pointed out the measurement IC can also be programmed, which may well indeed support your theory:
Quote:


> Originally Posted by *Exilon*
> 
> Good catch. The Gigabyte board looks like it uses 2 mOhm resistors instead of the 5 mOhm like the others.
> 
> The FE uses this measurement IC
> http://www.ti.com/product/INA3221
> 
> It can be programmed by the firmware to trigger alerts based on the voltage drop sensed on the shunt. So therefore the the board firmware has to match the actual board or the power measurements will go off the rails.
> 
> It makes sense that the Gigabyte firmware doesn't work on FE. The GPU thinks there's a power fault since power draw is 40% of the expected.


All interesting nonetheless!
Quote:


> Originally Posted by *Benny89*
> 
> Hmm, ok I started to OC and I noticed after just a start (+50 core +350 memory) and + 100 V and PL the core clock is jumping in Heaven like crazy, 2012, 1969, 1975, 2000. lol.
> 
> Temps are great. 65C was max in Heaven I had on 1440p Extreme.
> 
> But why are those clocks jumping like that.... ?


This is GPU Boost doing its thing, and the power limit (which we've been trying to get around by cross-flashing BIOSes) is forcing the card to downclock.

Your best bet is to start tweaking the voltage curve. For example, you could probably bump that 1975MHz bin to 2000MHz, so when the power limit kicks in, that's what it downclocks to.
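For anyone new to the curve editor, the bin-flattening idea can be modeled roughly like this. It's a toy curve, not Afterburner's internals; the voltage/clock pairs are made up from numbers in this thread:

```python
# A simplified voltage/frequency curve: voltage (V) -> clock (MHz).
curve = {0.95: 1936, 1.00: 1987, 1.05: 2050, 1.081: 2063, 1.093: 2088}

def lock_at(curve, v_lock):
    """Flatten the curve at one bin, mimicking a Ctrl+L style lock:
    every bin at or above v_lock is capped at that bin's clock, so the
    card cannot climb past it when it has voltage headroom."""
    clock = curve[v_lock]
    return {v: min(c, clock) if v >= v_lock else c
            for v, c in sorted(curve.items())}

locked = lock_at(curve, 1.05)
print(locked)   # every bin at/above 1.05 V is capped at 2050 MHz
```

The practical effect is what was described above: when the power limit forces a drop, the card lands on the bin you shaped rather than whatever the stock curve dictated.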


----------



## Benny89

Quote:


> Originally Posted by *SlimJ87D*
> 
> What happens 10 minutes in? Still jumping around? Is voltage also jumping around?


Hmm, strange. I gave it +75 on core and +400 on memory (still 100% V and PL) and it boosted to 2050 and then went down to 2038 and stayed 2038 whole time till the Benchmark end (04:09 min).

Hmmm.... Ok, will try to kick it more.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> If I can sell my 4790K for $300 I'd love to get one, but its selling for $250 right now.
> 
> Finding a used 5775C would be the key!


Honestly, it's a worthy upgrade at $100 as well. I even gifted my 4790K, and I don't regret it as a $350 upgrade. It's worth that.
Quote:


> Originally Posted by *Benny89*
> 
> Hmm, strange. I gave it +75 on core and +400 on memory (still 100% V and PL) and it boosted to 2050 and then went down to 2038 and stayed 2038 whole time till the Benchmark end (04:09 min).
> 
> Hmmm.... Ok, will try to kick it more.


Ctrl+F to open the curve, Ctrl+L to lock it at 1.05v. See if it can hold 2050 at that voltage.


----------



## Benny89

Playing with curve more. So far managed to get Heaven run stable at 1.081 V at 2075 Core and +450 memory. Will try to get little more. 69C max temp. ROG cooler is really darn good.


----------



## bloodhawk

Quote:


> Originally Posted by *Benny89*
> 
> Hmm, strange. I gave it +75 on core and +400 on memory (still 100% V and PL) and it boosted to 2050 and then went down to 2038 and stayed 2038 whole time till the Benchmark end (04:09 min).
> 
> Hmmm.... Ok, will try to kick it more.


Quote:


> Originally Posted by *Somasonic*
> 
> I noticed the same thing in benchmarks, it's like looking at a saw tooth. In gaming it holds a lot more steady so I haven't bothered to look into it further.


Quote:


> Originally Posted by *Benny89*
> 
> Hmm, ok I started to OC and I noticed after just a start (+50 core +350 memory) and + 100 V and PL the core clock is jumping in Heaven like crazy, 2012, 1969, 1975, 2000. lol.
> 
> Temps are great. 65C was max in Heaven I had on 1440p Extreme.
> 
> But why are those clocks jumping like that.... ?


TDP Throttling.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Playing with curve more. So far managed to get Heaven run stable at 1.081 V at 2075 Core and +450 memory. Will try to get little more. 69C max temp. ROG cooler is really darn good.


I agree with that. ROG would have been my choice; I was more or less torn between the ROG and the Aorus, I love both of their beefy butts. Had to grab whatever was available first because I was already two days past the Nvidia store return window (used my Slackaveli ways to get a return) and didn't want to delay.


----------



## c0ld

Quote:


> Originally Posted by *Slackaveli*
> 
> I agree with that. ROG would have been my choice, I was more or less tied between Rog and Aorus, I love both of their beefy butts. Had to grab whatever was available first b/c I was already two days past the Nvidia store return window and used my Slackaveli ways to get a return and didnt want to delay.


But isn't the ROG like 90+ over the FE while the Aorus is +20? Aorus wins.


----------



## Benny89

Ok, tested more with curve. At 2088 with 1.093 V it downclocked itself to 2076 and stayed there for the rest of the bench. +500 memory (will try +550 now). Seems like max of my card is around 2075 on core. Unless I am not getting that curve thing right....

My curve so far:


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> Hmm, strange. I gave it +75 on core and +400 on memory (still 100% V and PL) and it boosted to 2050 and then went down to 2038 and stayed 2038 whole time till the Benchmark end (04:09 min).
> 
> Hmmm.... Ok, will try to kick it more.


You're just being thermally throttled. Everything is settling down.

I would try to run the lowest voltage possible so you don't generate as much heat.
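A rough model of why lower voltage runs cooler: dynamic power scales with roughly V squared times frequency, so a small undervolt saves a disproportionate amount of power. Ballpark arithmetic only, with voltages and clocks taken from this thread:

```python
def relative_power(v_new, v_old, f_new, f_old):
    # Dynamic power ~ C * V^2 * f; returns new power as a fraction of old.
    # Ignores static leakage, so treat it as a lower-bound estimate.
    return (v_new / v_old) ** 2 * (f_new / f_old)

# Dropping 1.093 V -> 1.05 V while settling for ~2050 MHz instead of 2076:
frac = relative_power(1.05, 1.093, 2050, 2076)
print(f"{(1 - frac) * 100:.1f} % less dynamic power")  # about 8.9 %
```

Nearly 9% less heat for about 1% less clock is the trade being suggested here.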


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Ok, tested more with curve. At 2088 with 1.093 V it downclocked itself to 2076 and stayed there for the rest of the bench. +500 memory (will try +550 now). Seems like max of my card is around 2075 on core. Unless I am not getting that curve thing right....
> 
> My curve so far:


Sounds right to me. That is literally exactly what I'm getting. That card is fabulous, bro. We are getting that on air, and it really holds up well over long gaming sessions. I game at ~54c with the window open and about 58c-61c with it closed. I do have 2 other profiles: one with slightly lower clocks locked to a slightly lower voltage for power-hog games or if I ever get bad temps, and another one with +103 (which is +194 adjusted over stock core) for benching. Helped me get #1 on 5775c on all the Firestrike benches. Let's see if you end up beating me.


----------



## mshagg

Quote:


> Originally Posted by *Benny89*
> 
> Ok, tested more with curve. At 2088 with 1.093 V it downclocked itself to 2076 and stayed there for the rest of the bench. +500 memory (will try +550 now). Seems like max of my card is around 2075 on core. Unless I am not getting that curve thing right....
> 
> My curve so far:


Looks good, and 2076 is pretty common, although you might find you can run 2076 at a lower voltage bin than that. Performance/stability doesn't seem to scale particularly well into the higher voltage bins.


----------



## Slackaveli

Quote:


> Originally Posted by *mshagg*
> 
> Looks good, and 2076 is pretty common, although you might find you can run 2076 at a lower voltage bin than that. Performance/stability doesn't seem to scale particularly well into the higher voltage bins.


On water it is, but it's less common on air. Very few can hold 2076 on air.


----------



## Benny89

Quote:


> Originally Posted by *mshagg*
> 
> Looks good, and 2076 is pretty common, although you might find you can run 2076 at a lower voltage bin than that. Performance/stability doesn't seem to scale particularly well into the higher voltage bins.


At lower voltage it downclocks to 2063. I have to keep it at 2088 on 1.093V so it can downclock to 2076 when hitting 64C and stay there.

On a good note, +550 memory is OK, so I'm running 12k memory. I think I will leave it that way.
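For reference, the offset-to-effective-rate math works out like this, assuming the 1080 Ti's stock 5505 MHz GDDR5X clock as Afterburner/GPU-Z report it, doubled for the effective data rate:

```python
STOCK_MEM_MHZ = 5505   # 1080 Ti memory clock as the tools report it

def effective_rate(offset_mhz):
    """Effective (double data rate) memory speed for a given offset."""
    return (STOCK_MEM_MHZ + offset_mhz) * 2

print(effective_rate(550))   # 12110 -> the "12k memory" mentioned above
```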


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> On lower Voltage it downclocks to 2063. I have to keep him 2088 on 1.093 so it can downclock to 2076 when hitting 64C and stay there.
> 
> On good note- +550 memory is ok, so running 12k memory. I think I will leave it that way.


If that's the case, compare results thoroughly.

Your results at lower voltages might be better than at voltages of 1.092.

Post a screenshot of your GPU-Z also.


----------



## Benny89

Ok, did two full Heaven runs. 1.093V at 2076 stable (downclocked from 2088) with +550 memory (6055 MHz memory).

Maybe I will try to push memory to +600 tomorrow. We will see.

So far I am pleased with this card.

Two full runs at Extreme 1440p at 2076 core and 1.093V, and max temp was 69C. Beast cooler....

Although there is no point in OCing higher than 2050, really. The difference in fps between 2050 and 2063 was 0.2 fps in max and min, and the difference between 2063 and 2076 was another 0.2-0.3 fps.

I think for everyday use I will settle on 2063 at 1.081V.

Tomorrow some games and CPU OC
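Those tiny fps deltas match plain proportional scaling: even in a fully GPU-bound test, a 1% clock bump can only buy about a 1% fps bump. Illustrative arithmetic, with a hypothetical 60 fps baseline:

```python
def expected_fps(base_fps, base_clock, new_clock):
    # Best case (fully GPU-bound): fps scales linearly with core clock.
    return base_fps * new_clock / base_clock

# Going 2050 -> 2076 MHz is only a ~1.27 % clock increase...
gain = (2076 / 2050 - 1) * 100
print(f"{gain:.2f} % clock increase")

# ...so at ~60 fps the theoretical best case is well under 1 fps:
print(round(expected_fps(60, 2050, 2076) - 60, 2))   # ~0.76 fps
```

Which is why settling on 2063 at a lower voltage costs essentially nothing in practice.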


----------



## Slackaveli

http://gpuz.techpowerup.com/17/04/12/vqj.png

Guys, this new Unigine 2 benchmark is awesome. The 4k preset really hits the power hard; I'm still not throttling, but it's 350w all day.
http://www.guru3d.com/files-get/unigine-superpostition-benchmark-download,3.html


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Ok, did two full Heaven runs. 1.093 V on 2076 stable (downclock from 2088) with +550 memory (6055 memory).
> 
> Maybe I will try to push memory to +600 tomorrow. We will see.
> 
> So far I am pleased with this card.
> 
> 2 full runs at Extreme 1440p at 2076 core and 1.093 V and max temp was 69C. Beast cooler....
> 
> Althought there is no point in OC higher than 2050 really. Difference in fps between 2050 to 2063 was 0.2 fps in max and min. And difference between 2063 and 2076 was another 0.2 and 0.3 difference in fps.
> 
> I think for everyday use I will settle on 2063 on 1.081V.
> 
> Tomorrow some games and CPU OC


Try the Superposition bench on 4k.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> 
> http://gpuz.techpowerup.com/17/04/12/vqj.png
> 
> guys, this new Unigine 2 benchmark is awesome. The 4k preset really hits the power hard, but I'm still not throttling, but it's 350w all day.
> http://www.guru3d.com/files-get/unigine-superpostition-benchmark-download,3.html


Ouch, I wonder what's going to happen to us FE guys who only have 300 watts.
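For anyone mapping those percentages to watts: the slider is a percentage of the board's base power target. The FE's reference target is 250 W; the AIB figures below are just illustrative:

```python
def watts(base_power_w, limit_percent):
    """Convert a power-limit slider percentage into an actual wattage cap."""
    return base_power_w * limit_percent / 100

# 1080 Ti FE: 250 W base target, slider maxes out at 120 %.
print(watts(250, 120))   # -> 300.0, the FE ceiling mentioned above

# An AIB card with a (hypothetical) 300 W base and a 127 % slider:
print(watts(300, 127))   # -> 381.0, far more headroom
```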


----------



## dentnu

Quote:


> Originally Posted by *Slackaveli*
> 
> 
> http://gpuz.techpowerup.com/17/04/12/vqj.png
> 
> guys, this new Unigine 2 benchmark is awesome. The 4k preset really hits the power hard, but I'm still not throttling, but it's 350w all day.
> http://www.guru3d.com/files-get/unigine-superpostition-benchmark-download,3.html


You have your voltage set too low, as GPU-Z is showing the PerfCap Reason as "VRel". You need to raise your voltage a bit more.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> Try the Superposition bench on 4k.


Tomorrow

but I am not that interested. As I said, there's a non-existent difference between 2050, 2063, and 2075 in fps numbers: 0.2-0.4 fps.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Tomorrow
> 
> 
> 
> 
> 
> 
> 
> but I am not that interested. As I said non-existence differtence between 2050, 2063 and 2075 in fps numbers. 0.2-0.4 difference
> 
> 
> 
> 
> 
> 
> 
> .


You totally missed the point. Those tests you are running don't stress you power-wise, but this one will.

Damn, wrong pic. Anyway, I hit 10063 on my second run with the benchmark settings at 4k, only a taste of hitting the power limit.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Ouch, I wonder what's going to happen to us FE guys that only have 300 watts.


that's what Im waiting to see.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> 
> you totally missed the point. those tests you are running don't stress you power wise but this one will.
> 
> 
> 
> 
> 
> 
> 
> 
> damn, wrong pic. Anyway, I hit 10063 second run with the benchmark settings in 4k, only a taste of hitting power limit.


What do you mean "power wise"?

I will test in games now. This is where stability will show. Or not


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> What do you mean "power wise"?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will test on games now. This is where stability will show. Or not


Heaven only uses 300w. Superposition at 4k will throw at least 380w at you, thus indicating whether or not you are as good as you think you are. The allure is that it's a 1gb free download and a 5-minute test. Plus, I'm curious what your Asus scores at the same clocks. And if you want, there's a stability test. This is my new favorite benchmark. It's dope.


----------



## Slackaveli

Quote:


> Originally Posted by *dentnu*
> 
> You have your voltage set too low, as GPU-Z is showing the PerfCap Reason as "VRel". You need to raise your voltage a bit more.


Yeah, it's VREL; my card likes lower voltages with higher clocks better. I'm at 1.05v or 1.04v and still above 2050.


----------



## Qba73

Getting the same exact result on my Aorus 1080 Ti Xtreme (F4 beta BIOS):

starts at 2088 (Vcurve set just like yours), then settles at 2076 after a while for the rest of the bench. +500 on mem (fine for me); 2088-2076 very stable.

I flirted with a 2100 curve: ran 2100 for about 30 sec, then freeze.. ugh (all the while no temp over 69c, custom fan curve maxing at 80% and still quiet at 80% lol)

2062 will be my everyday too, I think.


----------



## Slackaveli

Quote:


> Originally Posted by *Qba73*
> 
> Getting same exact result on my Aorus 1080 ti xtreme (f4 beta bios),
> 
> starts at 2088 (Vcurve set just like yours) then settles at 2076 after a while for rest of bench, +500 on mem (fine for me) 2088-2076 very stable.
> 
> I flirted with a 2100 curve, ran 2100 for about 30 sec then freeze..ugh (all the while no tem over 69c, custom fan curve maxing at 80% and still quiet at 80% lol)
> 
> 2062 will be my everyday I think too.


It seems I indeed got very lucky (it's about damn time!) and caught one that would have passed the GPU gauntlet as an Xtreme. These are very nice, I really love it. I certainly would rather it be an Xtreme; I really love your backplate!


----------



## mshagg

Quote:


> Originally Posted by *SlimJ87D*
> 
> Ouch, I wonder what's going to happen to us FE guys that only have 300 watts.


Downloaded and installed it earlier; will post some results from an unmodded FE under water later.

Damn, VR mode needs the paid version to unlock.


----------



## dentnu

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah it's VREL, my card likes lower voltages with higher clocks better. I'm at 1.05v or 1.04v and still above 2050.


Won't running it at lower voltage and getting VREL produce lower FPS, since it's not getting all the voltage it needs at that frequency?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *dentnu*
> 
> Won't running it with lower voltage and getting VREL produce lower FPS since its not getting all the voltage it need at the freq?


Not always. VRel isn't as bad as Pwr.


----------



## Qba73

these gigabyte cards are great and beastly lol

I was always an EVGA guy, but could not wait for the FTW3,
although I do have the FTW3 preordered for 5/8 so I will still get it and test her out to see what results

the winner stays and the loser gets sold.

my wife is gonna kill me...lol but when she mentions it I just remind her of her handbags..then walk away.


----------



## dentnu

Hi Everyone,

I have been reading this thread from the start and want to thank all you crazy overclockers, always trying to push it a bit more no matter what happens. This thread has been a really great read. THANKS!

Well, with that out of the way, I would like to say that after years of trying, I finally hit the lottery… the GPU lottery, that is (wish it was a real one lol). I got an MSI 1080 Ti Gaming X and boy is she a keeper. I can bench at 2126MHz core @ 1.080V using curve overclocking; any higher and it crashes. Memory so far has been able to hit 6003MHz. I can run it at lower voltage, but I get VREL in GPU-Z.

I have been doing a bunch of benching these last 2 days, testing and making sure my clocks are performing correctly and not just giving worse performance. I can say that these clocks are producing higher numbers in benches than lower clocks, so I know they are good. The only issue is that I am hitting my power limit, causing it to drop to 2088-2100MHz for a bit at a few spots throughout the benchmarks. Then it recovers back to 2126MHz. I know this because I saw it listed in my GPU-Z log file. It is only power; my voltage and temps are good.

I removed the heatsink while keeping the midplate to cool the memory, and mounted my NZXT G10 + X41 with push/pull fans. My idle temps are 22-25C depending on ambient, and the max temp I've seen benching is 41C. These are benching clocks, as I am 99% sure they would crash in a game eventually, and I do not want to run this card at 1.080V all the time.

Here is my latest run on the new Superposition benchmark. This is the highest I have been able to hit so far; still pushing.


----------



## Qba73

Nice

Makes me want to go under water on my card too. Under 50c I know I can get her that high too.


----------



## mshagg

Isn't VRel just where the card wants to boost up, but you either have Ctrl+L locking that bin or the curve to the right of the current bin is flat?

i.e. "I would jump to the next voltage bin but I can't" = VRel
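Since the PerfCap flags keep coming up, here's a quick cheat sheet as code. The wording is paraphrased from how they're used in this thread, not official GPU-Z documentation:

```python
# Rough meanings of GPU-Z's PerfCap Reason flags (paraphrased).
PERFCAP = {
    "Idle": "no load, clocks parked",
    "Pwr":  "board power limit reached; the card downclocks to stay under TDP",
    "Thrm": "temperature limit reached",
    "VRel": "reliability voltage limit; the next boost bin needs more volts than allowed",
    "VOp":  "maximum operating voltage reached",
    "Util": "GPU not fully utilized",
}

def explain(flags):
    """Turn a list of logged flags into readable lines."""
    return [f"{f}: {PERFCAP.get(f, 'unknown flag')}" for f in flags]

for line in explain(["Pwr", "VRel"]):
    print(line)
```

Pwr is the one worth chasing with BIOS flashes or the shunt mod; VRel alone often just means the curve has nowhere left to climb.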


----------



## KingEngineRevUp

Awesome clocks. Yeah, power limiting and not being under 40C (I'm at 45-48C) keep me from running 2138, although I can hold it for a short while.

Depending on how my power limit affects my benchmarks, I really might have to consider doing the shunt mod.

Not at home, so I can't test the Superposition benchmark.


----------



## Slackaveli

Quote:


> Originally Posted by *mshagg*
> 
> Downloaded and installed earlier, will post some results from an unmodded FE under water later
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Damn VR mode needs the paid version to unlock.


Please run the 4k optimized and 8k optimized tests. They will push you hard. Here are my 8k optimized results. Notice my air-cooled max temp.


----------



## Slackaveli

Quote:


> Originally Posted by *Qba73*
> 
> these gigabyte cards are great and beastly lol
> 
> I was always an EVGA guy, but could not wait for the FTW3,
> although I do have the FTW3 preordered for 5/8 so I will still get it and test her out to see what results
> 
> the winner stays and the loser gets sold.
> 
> my wife is gonna kill me...lol but when she mentions it I just remind her of her handbags..then walk away.


well played, sir.

I too love EVGA; ideally I'd have an EVGA Classy. I almost waited for the FTW3, but, nah, these and the Asus were too good to pass up. Never owned a Gigabyte before, but it's dope.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> please run 4k optimized and 8k optimized tests. They will push you hard. Here's my 8k optimized results. Notice my air cooled max temp


So your power is at a constant 120%?


----------



## dentnu

nevermind


----------



## Slackaveli

Quote:


> Originally Posted by *dentnu*
> 
> Hi Everyone,
> 
> I have been reading this thread from the start and want to thank all you crazy overclockers always trying push it a bit more no matter what happens. This thread has been a really great read THANKS!
> 
> Well with that out the way I would like to say that I finally after years of trying I hit the lottery&#8230; GPU lottery that is wish it was a real one lol. I got a MSI 1080 TI Gaming X and boy is she a keeper. I can bench at 2126MHz Core @ 1080Mv using curve overclocking any higher and it crashes. Memory so far been able to hit 6003Mhz. I can run it with lower voltage but I get VREL in GPU-Z.
> 
> I have been doing a bunch of benching these last 2 days testing and making sure my clocks are performing correctly and not just giving worst performance. I can say that these clock are producing higher number than lower clocks in benches so I know they are good. Only issue I have is I am hitting my power limits causing it to lower down to 2088Mhz to 2100Mhz for a bit at a few spots throughout the benchmarks. Then it recovers back to 2126Mhz. I know this as I saw it listed in my GPU-Z log file. It is only Power my voltage and temps are good.
> 
> I removed the heatsink while keeping the midplate to cool the memory on it and mounted my NZXT G10 + x41 with Push/Pull fans. My idle temps are 22c to 25c depending on my ambient temps and the max temps I seen benching is 41c. These clock are benching clocks as I am 99% sure they would cash in a game eventually and I do not want to run this card at 1080Mv all the time.
> 
> Here is my latest run on the new superposition benchmark. This is the highest I have been able to hit so far still pushing.


Holy crap, dude, that score is beating everyone in the Titan Xp thread. GO POST IT IN THERE.


----------



## KedarWolf

Quote:


> Originally Posted by *Slackaveli*
> 
> 
> http://gpuz.techpowerup.com/17/04/12/vqj.png
> 
> guys, this new Unigine 2 benchmark is awesome. The 4k preset really hits the power hard, but I'm still not throttling, but it's 350w all day.
> http://www.guru3d.com/files-get/unigine-superpostition-benchmark-download,3.html


https://www.techpowerup.com/download/unigine-superposition/

Above link much faster download.


----------



## dentnu

Quote:


> Originally Posted by *Slackaveli*
> 
> holy crap, dude, that score is beating everyone in the Tian Xp thread. GO POST IT IN THERE.


WOW for real lol going to post it there.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> So your power is at a constant 120%?


No, but it actually hit 127% and was usually between 112-120%, with dips to 100%.


----------



## Slackaveli

Quote:


> Originally Posted by *mshagg*
> 
> Isnt Vrel just where the card wants to boost up but you either have ctrl-L locking that bin or the curve to right of the current bin is flat?
> 
> i.e. I would jump to the next voltage bin but I cant = VRel


That's my exact understanding of it too.
Quote:


> Originally Posted by *dentnu*
> 
> WOW for real lol going to post it there.


Yeah, dude, you no doubt hit the lotto on that lady. She's a freaking queen. You beat my score by 400, so 4%....

OK, check that: there is a dude with an 11,000 score in there, but you are definitely beating most of the people who have posted.
Quote:


> Originally Posted by *KedarWolf*
> 
> https://www.techpowerup.com/download/unigine-superposition/
> 
> Above link much faster download.


Good looking out. It was a slow-starting download. Heavy use, I imagine.


----------



## joder

Quote:


> Originally Posted by *Slackaveli*
> 
> guys, this new Unigine 2 benchmark is awesome. The 4k preset really hits the power hard, but I'm still not throttling, but it's 350w all day.
> http://www.guru3d.com/files-get/unigine-superpostition-benchmark-download,3.html




Stock clocks with temp/power limits maxed.

Seemed to hit Pwr quite a bit, and both Pwr/VRel as well. Didn't someone post what all of those mean?


----------



## Slackaveli

Quote:


> Originally Posted by *joder*
> 
> 
> 
> Stock clocks with temp/power limits maxed.


Dang. That is a LOT of power throttle, and it results in a 10% lower score. Of course that's stock; maybe it'll just power through that power throttle and get some of those points back. Wow, though. Definitely a good test for game stability.


----------



## joder

Quote:


> Originally Posted by *Slackaveli*
> 
> dang. That is a LOT of power throttle and it results in 10% lower score, of course that's stock, maybe it'll just power through that power throttle and get some of those points back. Wow, though. Definitely a good test for game stability.


Hit some VRel too. The temps were pretty much in the 60s with an FE. Would custom voltage mess with the power at all? I honestly can't remember.

Edit: The processor likely plays into this too. Just a measly 3570K.


----------



## Qba73

4k result

Running 2062. Must say it's a pretty benchmark, much better than Heaven and that music that makes me want to staple my head.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> dang. That is a LOT of power throttle and it results in 10% lower score, of course that's stock, maybe it'll just power through that power throttle and get some of those points back. Wow, though. Definitely a good test for game stability.


Ouch. I'm going to test when I get home, but it sounds like I'm going to have to shunt mod my card.


----------



## Slackaveli

Quote:


> Originally Posted by *Qba73*
> 
> 4k result
> 
> running 2062


Very nice, man. Looks like the good clockers can hit 10k in this. In the Titan Xp thread JBravo has his hitting 11k. No losers this time around; we are all sitting pretty.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Ouch, I'm going to test when I get home, but sounds like I'm going to have to shunt mod my card.


The Superposition benchmark will help Conductonaut sales lol! Just a brutal bench, but gorgeous. Ran the 8k bench on my 55" 4k and wow!


----------



## Qba73

Quote:


> Originally Posted by *Slackaveli*
> 
> very nice, man. Looks like the good clockers can hit 10k in this. In Titan Xp thread JBravo has his hitting 11k. No losers this time around as we are all sitting pretty.


yeah very good for air breathers, I'm happy..

now for timespy..


----------



## Slackaveli

Quote:


> Originally Posted by *Qba73*
> 
> yeah very good for air breathers, I'm happy..
> 
> now for timespy..


I'm glad you don't have that on a 5775c to come bash my records lol.
Quote:


> Originally Posted by *joder*
> 
> Hit some VRel too. The temps were pretty much in the 60s with an FE. Would custom voltage mess with the power at all? I can't honestly remember.
> 
> Edit: The processor likely plays into this too. Just a measly 3570k.


The processor plays in some, but not much at 4k, the exception being certain cache magic like a 5775c. But at any rate, you aren't likely to encounter anything like that much power draw in a game.


----------



## joder

Quote:


> Originally Posted by *Slackaveli*
> 
> processor plays in some but not much at 4k, exception being certain cache magic like a 5775c, but at any rate, you aren't likely to encounter anything like that much power in a game.


It will be interesting to see if @bloodhawk can mod the BIOS and flash it with higher power limits.

Interestingly enough, I have a higher idle power draw than you do. I noticed the fan uses 6-7 watts at full speed on the FE. She will be under water within the week, so I'll have a few watts back.


----------



## Qba73

Timespy: the CPU looks to be holding me under 10k, and the Ti dropped from 2062 to 2050. Well, good for me, cause I'm not going water anytime soon.


----------



## KedarWolf

Quote:


> Originally Posted by *Slackaveli*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Qba73*
> 
> 4k result
> 
> running 2062
> 
> 
> 
> 
> 
> very nice, man. Looks like the good clockers can hit 10k in this. In Titan Xp thread JBravo has his hitting 11k. No losers this time around as we are all sitting pretty.

Asus BIOS, 2088 core at 1.062v, 6147 memory. The only thing I changed in Nvidia Control Panel was Single Display Performance Mode and Power Mode set to Performance.

It's over a 300-point jump from my stock FE BIOS at 1.062v/2062; memory was 6112, I think.


----------



## joder

Quote:


> Originally Posted by *KedarWolf*
> 
> Asus BIOS, 2088 core at 1.062v, 6147 memory,only thing I changed in Nvidia Control Panel was Single Display Performace Mode and Power Mode to Performance.
> 
> 
> 
> 
> 
> 
> 
> It's over a 300 point jump in my stock FE BIOS at 1.062v 2062, memory was 6112 I think.


Nice. I'm updating my drivers and switching the Nvidia settings to see if I get any difference. I'm hitting a power wall on air/stock BIOS.

EDIT: Already had the performance and single-display settings. Guess the power limit is my enemy now.


----------



## Qba73

Quote:


> Originally Posted by *KedarWolf*
> 
> Asus BIOS, only thing I changed in Nvidia Control Panel was Single Display Performace Mode and Power Mode to Performance.


Yup, always switch to single and performance.

Saw your tutorial; very informative, thank you. I never tried the curve route until I saw it. Much better than conventional, I feel.


----------



## joder

@KedarWolf

What does your "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power show?
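If you'd rather log power over time, nvidia-smi's CSV query interface is easier to parse than `-q -d POWER`. A small sketch; the hard-coded sample output is illustrative, since the exact fields vary by driver version:

```python
import csv
import io
import subprocess

def read_power(sample=None):
    """Parse `nvidia-smi --query-gpu=power.draw,power.limit --format=csv`.
    Pass sample text for testing; otherwise run the real binary."""
    if sample is None:
        sample = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=power.draw,power.limit",
             "--format=csv"], text=True)
    rows = list(csv.reader(io.StringIO(sample)))
    # Row 0 is the header; row 1 holds values like "61.04 W".
    draw, limit = (float(x.strip().split()[0]) for x in rows[1])
    return draw, limit

# Example with captured output (values illustrative):
print(read_power("power.draw [W], power.limit [W]\n61.04 W, 300.00 W\n"))
```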


----------



## Exilon

Never realized that monkeying with monitor modes and power settings in the control panel makes much of a difference. But 2% is still 2%.









Normal mode: 2 monitors, adaptive



Single monitor, performance:



Shunt modded @ 2100/6156


----------



## Qba73

Quote:


> Originally Posted by *Exilon*
> 
> Never realized that monkeying with monitor modes and power in the control panel makes much of a difference. But 2% is still 2%
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Normal mode: 2 monitors, adaptive
> 
> 
> 
> Single monitor, performance:
> 
> 
> 
> Shunt modded @ 2100/6156


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Exilon*
> 
> Never realized that monkeying with monitor modes and power in the control panel makes much of a difference. But 2% is still 2%
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Normal mode: 2 monitors, adaptive
> 
> 
> 
> Single monitor, performance:
> 
> 
> 
> Shunt modded @ 2100/6156


Any tips on doing the shunt mod?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *joder*
> 
> It will be interesting to see if @bloodhawk can mod the bios and flash it with higher power limits.
> 
> Interestingly enough, I have a higher idle power draw than you do. I noticed the fan uses 6-7 watts at full speed on the FE. She will be under water within the week, so I'll have a few watts back
> 
> 
> 
> 
> 
> 
> 
> .


Damn really? What speed is your fan running at to zap that much power?


----------



## joder

Quote:


> Originally Posted by *SlimJ87D*
> 
> Damn really? What speed is your fan running at to zap that much power?


Even at 0% it is still drawing 60+ watts. I am not sure *** is happening on my card.


----------



## dentnu

Well here are the best scores my MSI GTX 1080 Ti Gaming X can do.

Superposition 4k



Superposition 8k


----------



## Qba73

Quote:


> Originally Posted by *joder*
> 
> Even at 0% it is still running 60+ watts. I am not sure *** is happening on my card.


here is mine


----------



## joder

Quote:


> Originally Posted by *Qba73*
> 
> here is mine


Interesting. My power draw is 15W after a reboot.

EDIT: Noticed it goes up to 61W after I start the Superposition benchmark loader.


----------



## mshagg

Amp! extreme review over on tweaktown.

http://www.tweaktown.com/reviews/8140/zotac-geforce-gtx-1080-ti-amp-extreme-edition-review/index12.html

Is it just me or is a review of a pascal card completely useless without looking at, at least, sustained clocks? Power limit not mentioned once in all 13 unique, clicktastic pages (each of which delivers a fresh set of ads)?

Doesn't stop them from arriving at this conclusion:

If you want the fastest graphics card known to man, it's right here: ZOTAC's GeForce GTX 1080 Ti AMP! Extreme Edition - ready for gaming now, and in the future.

Shameful.


----------



## dentnu

Here is my MSi 1080 Ti Gaming X


----------



## joder

Quote:


> Originally Posted by *mshagg*
> 
> ...
> 
> Is it just me or is a review of a pascal card completely useless without looking at, at least, sustained clocks? Power limit not mentioned once in all 13 unique, clicktastic pages (each of which delivers a fresh set of ads)?
> 
> ...
> 
> Shameful.


----------



## RavageTheEarth

In 19 hours I'll be back across the country and home. My Aorus was delivered last Saturday!!! Can't wait to play!


----------



## joder

Quote:


> Originally Posted by *RavageTheEarth*
> 
> In 19 hours I'll be back across the country and home. My Aorus was delivered last Saturday!!! Can't wait to play!


Were you the one in New Orleans?


----------



## Exilon

Quote:


> Originally Posted by *SlimJ87D*
> 
> Any tips on doing the shunt mod?


I only put it on the two shunt resistors near the PCIe power plugs, because I think the 3rd one near the bottom is for PCIe slot power.

Just dab a bit of CLU on the brush and paint a layer of it over the top of the resistor. Make sure to fully bridge the terminals, but don't get any on the solder on the sides. Taping up the vicinity with black electrical tape can help you spot any stray CLU and give some peace of mind, but it's not necessary.
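For anyone wondering why bridging the shunts raises the effective power limit: the controller measures the voltage drop across a known shunt resistance to infer current, and the CLU film adds a parallel path, so the same current produces a smaller drop and the card under-reports its draw. A rough sketch of the math; the 5 mOhm shunt value and the bridge resistance here are assumed round numbers, not measured off the card:

```python
def reported_fraction(r_shunt_ohms: float, r_bridge_ohms: float) -> float:
    """Fraction of actual power the controller reports after the mod.

    The controller still assumes r_shunt, but the real resistance is the
    parallel combination of the shunt and the liquid-metal bridge.
    """
    r_effective = (r_shunt_ohms * r_bridge_ohms) / (r_shunt_ohms + r_bridge_ohms)
    return r_effective / r_shunt_ohms

# Hypothetical numbers: a 5 mOhm shunt bridged by a ~5 mOhm CLU film.
frac = reported_fraction(0.005, 0.005)
print(f"card reports {frac:.0%} of actual draw")         # card reports 50% of actual draw
print(f"a 300 W limit then allows ~{300 / frac:.0f} W")  # a 300 W limit then allows ~600 W
```

The thinner the CLU layer, the higher its resistance and the smaller the effect, which may be why some people see barely any difference from the mod.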


----------



## KingEngineRevUp

If you guys have max performance mode on, your card won't downclock to idle clocks. It will stay at around 1500 MHz.

Make sure you select performance mode for benchmarks, but don't set it at a global level unless you want to waste electricity.


----------



## mshagg

I think it does downclock eventually, but yeah, it seems to hang around at ~1480MHz under no load for a while before reverting to a proper idle state.

Does single monitor actually disable multi monitors? Or just change the way the card handles them?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *mshagg*
> 
> Amp! extreme review over on tweaktown.
> 
> http://www.tweaktown.com/reviews/8140/zotac-geforce-gtx-1080-ti-amp-extreme-edition-review/index12.html
> 
> *Is it just me or is a review of a pascal card completely useless without looking at, at least, sustained clocks? Power limit not mentioned once in all 13 unique, clicktastic pages (each of which delivers a fresh set of ads)?
> *
> Doesn't stop them from arriving at this conclusion:
> 
> If you want the fastest graphics card known to man, it's right here: ZOTAC's GeForce GTX 1080 Ti AMP! Extreme Edition - ready for gaming now, and in the future.
> 
> Shameful.


Well, we're all about extreme OC here. But when review websites do their OC, they actually take everything into consideration, perf cap and all.

That's why they don't just crank voltage to the max and usually clock less than we do on here. They give you an idea of what's a realistic OC for the average OCer.


----------



## joder

Better score here with no voltage and just doing stock +120/+300 on core/mem on FE.


----------



## mshagg

Quote:


> Originally Posted by *SlimJ87D*
> 
> Well we're all extreme OC here. But when review websites do their OC, they actually are taking in consideration of everything, perf cap and all.
> 
> That's why they don't just crank voltage to the max and usually clock less than we do on here. They give you an idea of what's a realistic OC for the average OCer.


For sure, and when they have a pile of cards to test I wouldn't expect them to spend hours playing with voltage curves. But to not even look at clockspeeds over a benching run actually makes the review borderline useless.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Asus BIOS, 2088 core at 1.062v, 6147 memory,only thing I changed in Nvidia Control Panel was Single Display Performace Mode and Power Mode to Performance.
> 
> 
> 
> 
> 
> 
> 
> It's over a 300 point jump in my stock FE BIOS at 1.062v 2062, memory was 6112 I think.


Stock bios, 2088 core at 1.043V, but my memory is lower than yours: 5954 MHz.



I'm not sure how you scored 300 points more than the stock bios; maybe you need to fix your stock voltage curve. You should be scoring higher with the stock bios.

Once again, ASUS bios does worse than stock.


----------



## Dasboogieman

Quote:


> Originally Posted by *mshagg*
> 
> Amp! extreme review over on tweaktown.
> 
> http://www.tweaktown.com/reviews/8140/zotac-geforce-gtx-1080-ti-amp-extreme-edition-review/index12.html
> 
> Is it just me or is a review of a pascal card completely useless without looking at, at least, sustained clocks? Power limit not mentioned once in all 13 unique, clicktastic pages (each of which delivers a fresh set of ads)?
> 
> Doesn't stop them from arriving at this conclusion:
> 
> If you want the fastest graphics card known to man, it's right here: ZOTAC's GeForce GTX 1080 Ti AMP! Extreme Edition - ready for gaming now, and in the future.
> 
> Shameful.


I'd also add that without a PCB shot, the conclusions of the review are worthless.


----------



## illypso

Quote:


> Originally Posted by *bloodhawk*
> 
> I already tried it. Barely made any difference. I tried both conductonaut and CLU. And covering 2 or 3 shunts.
> Im not sure what it was , but it barely made a difference for me. Maybe too much or too little LM?
> 
> End of the day didnt find it worth the hassle. Just going to hard mod this card till the Ti Classified comes out.
> 
> Yeah ill test both the BIOS with the power mods and report back on Wednesday/Thursday.


On my card, at stock, there was a lot of solder resin left from the factory on my shunts; maybe yours had some too, and it was isolating the contact point between the shunt and the CLU. A little scratching/sanding would help. Though I prefer to solder a small wire on top of the 2 shunts.
Quote:


> Originally Posted by *bloodhawk*
> 
> Yeah i saw that one, im not entirely sure if that was done properly or not. I have a feeling he dropped the current way too much.


Yes, at that time with the resistor my load had dropped too much; the card was reading 50% max and I was having stability problems. But from my later tests, it was because too much Arctic Silver paste on the caps around the GPU was causing the instability. I removed that and it was stable, but soldering a wire on my shunts was best for me; I never go over 80% now.


----------



## illypso

Quote:


> Originally Posted by *Slackaveli*
> 
> i've never ran a loop but i can still answer, one would be fine, especially if you add a chiller. It would require a second one perhaps if you go sli.
> considering you can sell your 4790k for ~$300+ it's a ~$50 upgrade.That is the deal of a lifetime as you'll have a whole platform's worth of performance upgrade for that small money. Short answer, yes it still outperforms in highs, lows, and average frames in 4k. The 134mb combined L3+L4 cache vs 8mb cache..... That's a roflstomp. And even in 1440p/4k you'll gain ~20% fps in many titles.
> 
> PSA- If you are on z97, the very very very best upgrade you can do in perf per dollar is a drop-in cpu upgrade to Broadwell-C. It's literally a no-brainer, deal of a lifetime. THAT'S why you've never heard about it tbh, Intel tried very hard to kill this thing off.


PLEASE STOP!!!!!!!
Or else I won't be able to contain myself and I will buy the 5775C.
But my 4790K is delidded and late, so it will be hard to sell and I will end up stuck with it lol.

Every time I get a hold of myself and say that I don't need it and that it's not THAT magical... you come back and it's like magic again. I WANT ONE. lol


----------



## Amuro

zotac gtx 1080 ti fe blower
+150 +500


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Amuro*
> 
> zotac gtx 1080 ti fe blower
> +150 +500


Another Zotac FE person O_O... You're the first person I have met with my card lol


----------



## Exilon

Only reason I haven't bought a 5775C is because Broadwell-C doesn't work on Z87

Such a shame...


----------



## mshagg

Quote:


> Originally Posted by *SlimJ87D*
> 
> Another Zotac FE person O_O... You're the first person I have met with my card lol


Hey, I've been here all along lol


----------



## KedarWolf

Quote:


> Originally Posted by *joder*
> 
> @KedarWolf
> 
> What does your "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power show?


Attached GPUs : 2
GPU 0000:01:00.0
Power Readings
Power Management : Supported
Power Draw : 17.41 W
Power Limit : 330.00 W
Default Power Limit : 275.00 W
Enforced Power Limit : 330.00 W
Min Power Limit : 125.00 W
Max Power Limit : 330.00 W
Power Samples
Duration : 4.07 sec
Number of Samples : 119
Max : 18.47 W
Min : 17.41 W
Avg : 17.66 W
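If you want to log power over a whole bench run instead of eyeballing it, that `nvidia-smi -q -d power` dump is easy to poll and parse. A rough sketch; the NVSMI path is the one already posted in the thread, and the polling logic is just my own quick take, not an official tool:

```python
import re
import subprocess
import time

# Path as posted earlier in the thread (default NVIDIA driver install on Windows).
NVSMI = r"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe"

def parse_power_draw(report: str) -> float:
    """Pull the 'Power Draw : 17.41 W' value out of an nvidia-smi -q -d POWER dump."""
    match = re.search(r"Power Draw\s*:\s*([\d.]+)\s*W", report)
    if not match:
        raise ValueError("no power reading found")
    return float(match.group(1))

def log_peak_power(seconds: int = 60, interval: float = 0.5) -> float:
    """Sample power draw while a benchmark runs and return the peak value seen."""
    peak = 0.0
    end = time.time() + seconds
    while time.time() < end:
        out = subprocess.run([NVSMI, "-q", "-d", "POWER"],
                             capture_output=True, text=True).stdout
        peak = max(peak, parse_power_draw(out))
        time.sleep(interval)
    return peak
```

Kick off `log_peak_power(120)` just before starting a Superposition or Furmark run and you get the peak draw without having to watch a PSU display.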


----------



## KingEngineRevUp

Quote:


> Originally Posted by *mshagg*
> 
> Hey ive been here all along lol


Oh... You don't count... for obvious reasons









Lol, jk. Awesome. I thought I was the only one that got stuck with Zotac. Honestly, I got a good freaking bin from them. And if rumors are true, EVGA bins their FEs so they can stick the best chips onto their SC versions. Of course, rumors are rumors.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Asus BIOS, 2088 core at 1.062v, 6147 memory,only thing I changed in Nvidia Control Panel was Single Display Performace Mode and Power Mode to Performance.
> 
> 
> 
> 
> 
> 
> 
> It's over a 300 point jump in my stock FE BIOS at 1.062v 2062, memory was 6112 I think.


The reason your ASUS bios is doing badly is that it runs a gimped 2088 MHz. It thinks it has extra power to run 2088 MHz at 1.062V when it does not.

The stock bios reports the correct maximum power draw, so it will drop to a different voltage/clock point and optimize the 120% power limit for that clock and voltage instead of producing a gimped run. That run will hit its full potential since it has the power to do so.

*2050 MHz @ lower voltage + max power draw > 2088 MHz @ higher voltage + gimped power draw*

What you need to do with your stock bios is adjust the core clocks higher at 1.000V, 1.020V, 1.032V and 1.040V. That way, when the card downvolts to optimize power and core, you'll be at a higher clock at those lower voltages. The ASUS bios will not do this correctly because it thinks it has more power to play with, and it also won't drop the voltage as low, again because it thinks it has more power.

Otherwise, you can keep using the ASUS bios that thinks it has 350 watts, and it'll stay at a gimped 2088 to 2075 MHz with no power to actually run those cores at those volts.
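The tradeoff here, a lower voltage/clock point that stays inside the power budget versus a higher bin that starves itself, can be eyeballed with the usual first-order rule of thumb that dynamic power scales roughly with frequency times voltage squared. This is a generic CMOS approximation, not something measured on these cards:

```python
def relative_power(f_mhz: float, volts: float, f0_mhz: float, v0: float) -> float:
    """First-order CMOS estimate: P ~ f * V^2, relative to a baseline point."""
    return (f_mhz / f0_mhz) * (volts / v0) ** 2

# The two configs being compared above: 2088 MHz @ 1.062 V vs 2062 MHz @ 1.031 V.
ratio = relative_power(2088, 1.062, 2062, 1.031)
print(f"~{(ratio - 1) * 100:.0f}% more power for a {2088 / 2062 - 1:.1%} clock bump")
# prints: ~7% more power for a 1.3% clock bump
```

Roughly 7% more power for 1.3% more clock is why the downvolted bin often wins once the power limiter starts clipping the run.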


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Asus BIOS, 2088 core at 1.062v, 6147 memory,only thing I changed in Nvidia Control Panel was Single Display Performace Mode and Power Mode to Performance.
> 
> 
> 
> 
> 
> 
> 
> It's over a 300 point jump in my stock FE BIOS at 1.062v 2062, memory was 6112 I think.
> 
> 
> 
> 
> 
> The reason why your ASUS bios is doing bad is because it runs a gimped 2088 Mhz. It thinks it has extra power to run at 2088 Mhz at 1.062V when it does not.
> 
> The stock bios reports the correct maximum power draw, therefore it will change voltages to another clock and optimize 120% power for that clock and voltage and not have a gimp run. That run will run at its fullest potential since it has the power to do so.
> 
> *2050 Mhz @lower voltage+Max power draw > 2088 Mhz @ higher voltages + Gimped power draw*
> 
> What you need to do with your stock bios is actually adjust the core clocks higher at 1.000V, 1.020V, 1.032V and 1.040V. That way when the card down volts to optimize power and core, you'll be at a higher clock at those lower cores. ASUS Bios will not do this correctly because it thinks it has more power to play with.
> 
> Otherwise, you can keep using the ASUS bios that thinks it has 350 Watts and it'll stay at a gimped 2088 to 2075 Mhz with no power to actually runs those cores at those volts. .

I don't put any stock in your made-up theories. Did you even run the nvidia command with Furmark running to test power draw like someone said to do? Or did you, and the results were contrary to what you're claiming, so you won't post them?

This is 4K Optimized at Asus BIOS 2088 1.075v. 6142 memory.



This is stock BIOS 2062 1.031v 6142 memory.



And all my games I run 4K so I'm definitely going with the Asus BIOS.


----------



## madmeatballs

Quote:


> Originally Posted by *SlimJ87D*
> 
> Another Zotac FE person O_O... You're the first person I have met with my card lol


Count me in lol. I went the FE way since I'm gonna slap a waterblock on it anyway.


----------



## bloodhawk

So I have been able to read the BIOS using my programmer, but I didn't flash the modded BIOS just yet. The first thing I observed about the dump I made is that it's 512KB instead of the 257KB BIOSes that we are flashing. Going to flash and test tomorrow after work.


----------



## ThingyNess

Quote:


> Originally Posted by *KedarWolf*
> 
> I don't take any stock in your made up theories. Did you even run the nvidia command with furmark running to test power draw like someone said to do or you did and the results were contrary to what you're claiming so you won't post them.
> 
> This is 4K Optimized at Asus BIOS 2088 1.075v. 6142 memory.
> 
> 
> 
> This is stock BIOS 2062 1.031v 6142 memory.
> 
> 
> 
> And all my games I run 4K so I'm definitely going with the Asus BIOS.


If you folks really want to see some power-throttling insanity, try running it in 'custom' mode with 8k resolution, but with the '4k optimized' shaders.

On my 1070 with a 200w power limit (up from 150w default, so 33% higher!) I get no throttling at all in 8k res with the 8k shaders, but it hits that power limit hard with the '4k optimized' shaders.

It might not be obvious, but the '4k optimized' shaders are less complex than the '8k optimized' shaders, and both of those are less complex than the Medium or Extreme ones.

On 4k Extreme on my 1070, it only gets like 2.5fps, haha. Even if this benchmark supported SLI (which it doesn't for now), 4-way SLI Titan Xp would still be a slideshow at 4k Extreme.


----------



## ThingyNess

Quote:


> Originally Posted by *bloodhawk*
> 
> So i have been able to read the BIOS using my programmer, but i didnt flash the modded BIOS just yet. The first thing i observed about the dump i made was that its 512KB instead of the 257 KB BIOS's that we are flashing. Going to flash and stuff tomorrow after work.


There's likely a lot of zero-padding at the end, if memory serves.


----------



## bloodhawk

Quote:


> Originally Posted by *ThingyNess*
> 
> There's likely a lot of zero-padding at the end, if memory serves.


Probably, but with NVIDIA's Falcon checks and the like, I really don't want to run with that, since they have another check at the die level.
This dump is with the ASUS Strix BIOS flashed. I had to block the ground pin to avoid running into overcurrent protection, AND use the 1.8V adapter on top of that.

Here is the dump - https://www.sendspace.com/file/fyqe5u


----------



## Jbravo33

How's it going in here?

This thing is craving more power; memory is maxed.
2126 @ 1.062v; once it hit 2050 or 2064, it maxed out at 1.092v.


----------



## ThingyNess

Quote:


> Originally Posted by *Jbravo33*
> 
> hows it going in here?
> 
> this thing is craving more power. memory is maxed
> 2126 @1.062 once it hit 2050 or 2064 it maxed out at 1.092


If you're that power limited in the benchmark you might actually see slightly better results by knocking the voltage back some, even if it means you lose a bin or two on the core clock.


----------



## Jbravo33

Quote:


> Originally Posted by *ThingyNess*
> 
> If you're that power limited in the benchmark you might actually see slightly better results by knocking the voltage back some, even if it means you lose a bin or two on the core clock.


I'm not sure voltage is doing anything; I'm about to test that out now. Even memory seems to max out at 1676, which is actually 250 added, not 1000.


----------



## KedarWolf

Quote:


> Originally Posted by *ThingyNess*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I don't take any stock in your made up theories. Did you even run the nvidia command with furmark running to test power draw like someone said to do or you did and the results were contrary to what you're claiming so you won't post them.
> 
> This is 4K Optimized at Asus BIOS 2088 1.075v. 6142 memory.
> 
> 
> 
> This is stock BIOS 2062 1.031v 6142 memory.
> 
> 
> 
> And all my games I run 4K so I'm definitely going with the Asus BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you folks really want to see some power-throttling insanity, try running it in 'custom' mode with 8k resolution, but with the '4k optimized' shaders.
> 
> On my 1070 with a 200w power limit (up from 150w default, so 33% higher!) I get no throttling at all in 8k res with the 8k shaders, but it hits that power limit hard with the '4k optimized' shaders.
> 
> It might not be obvious, but the '4k optimized' shaders are less complex than the '8k optimized' shaders, and both of those are less complex than the Medium or Extreme ones.
> 
> On 4k Extreme on my 1070, it only gets like 2.5fps, haha. Even if this benchmark supported SLI (which it doesn't for now), 4-way SLI Titan Xp would still be a slideshow at 4k Extreme.

Asus BIOS at 2062 1.025v, no shunt mod, barely throttles, can do memory higher than stock BIOS stable at +659.


----------



## madmeatballs

My V/F curve. This is using an NZXT X61 waterblock and Kraken G10. (This is not an AIO; I did a ghetto mod to connect the waterblock to my custom loop with 2x 280mm rads. Ambient is at 21C.) Memory at +400 (5900MHz). My card is an FE with stock bios btw.





Another curve


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> I don't take any stock in your made up theories. Did you even run the nvidia command with furmark running to test power draw like someone said to do or you did and the results were contrary to what you're claiming so you won't post them.
> 
> This is 4K Optimized at Asus BIOS 2088 1.075v. 6142 memory.
> 
> 
> 
> This is stock BIOS 2062 1.031v 6142 memory.
> 
> 
> 
> And all my games I run 4K so I'm definitely going with the Asus BIOS.




How is this not enough? I don't see how that command is going to change anything. I ran Furmark and logged the max power draws, and it's funny because *you're more than capable of doing it yourself and you have a PSU that can read your wattage.*

BloodHawk did a test with his PSU and he saw similar results to mine.

I haven't seen you prove that the ASUS bios has a magic extra 30 watts; if anything, you're the one who's hiding. You have all the tools to prove that there's an extra 30 watts for us to draw from, and you haven't.


----------



## ThingyNess

Quote:


> Originally Posted by *KedarWolf*
> 
> Asus BIOS at 2062 1.025v, no shunt mod, barely throttles, can do memory higher than stock BIOS stable at +659.


Note that I said *8k res* with the '4k optimized' (more complex than 8k optimized) shaders.









4k res with 4k optimized shaders doesn't power throttle for me either.
If you haven't tried it yet, definitely worth trying 4k/8k with 'Extreme' shaders just for the eye candy, even if it is < 10fps.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I don't take any stock in your made up theories. Did you even run the nvidia command with furmark running to test power draw like someone said to do or you did and the results were contrary to what you're claiming so you won't post them.
> 
> This is 4K Optimized at Asus BIOS 2088 1.075v. 6142 memory.
> 
> 
> 
> This is stock BIOS 2062 1.031v 6142 memory.
> 
> 
> 
> And all my games I run 4K so I'm definitely going with the Asus BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How is this not enough? I don't see how that command is going to change anything. I ran Furmark and logged the max power draws, and it's funny because *you're more than capable of doing it yourself and you have a PSU that can read your wattage.*
> 
> BloodHawk did a test with his PSU and he saw similar results to mine.
> 
> I haven't see you prove that the ASUS bios has a magic extra 30 watts, if anything you're the one that's hiding. You have all the capable tools to try and prove that there's an extra 30 watts for us to draw form and you haven't.

Did you even see my last 4K run? It was way higher than my 4K stock run at the same clocks. And I've told you before, the Asus BIOS power limits a lot less, even at higher voltages.

I find it much more stable, though, with less power limiting going on at lower voltages and lower clocks. And remember, I have not done the shunt mod, so the Asus BIOS has its uses; I heard someone else say it's like a safe poor man's shunt mod.

I figure I'll let you test the voltages the way I suggested, since you're determined and on a mission to prove me wrong. I have no need to prove you wrong, to be honest; it's just annoying that you talk out of your hat so much, with no real proof of a lot of the claims you make, just half-baked theories of why you THINK you are right.









And I said before, the maximum power draw at the PSU really isn't indicative of what each or both PSU power leads are drawing.


----------



## KingEngineRevUp

@Thingyness

This is quite interesting: the stock bios at some point pulled 332 watts, and then software+hardware kicked in and told it to back off.



Here's hoping there is a way to turn off this regulator that kicks it down to 300 watts.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Asus BIOS at 2062 1.025v, no shunt mod, barely throttles, can do memory higher than stock BIOS stable at +659.


To each his own then. My results are pretty much the same as yours, except I can't OC my memory as high as your card. This is the stock bios. I have no clue why the stock bios scores so low for you.



I'll test the ASUS bios right now and see how that fares. If I get better results, I'll post them up.

Edit: crap I had g-sync on in the above run.

EDIT 2:

ASUS


----------



## Slackaveli

Quote:


> Originally Posted by *Qba73*
> 
> Timespy, cpu looks to be holding me under 10k. and ti dropped from 2062 to 2050. well good for me cause im not going water anytime soon.


9k is a beast in Timespy for 4 cores; 9.5k even more so. We usually go by graphics score, which you killed!
Quote:


> Originally Posted by *joder*
> 
> Even at 0% it is still running 60+ watts. I am not sure *** is happening on my card.


I run in optimized mode; that's why it's so low.
Quote:


> Originally Posted by *illypso*
> 
> PLEASE STOP!!!!!!!
> or else I wont be able to contain myself and I will buy the 5775C
> But my 4790K is delid and late, so it will be hard d to sell it and I will end up stuk with it loll
> 
> Everytime I get a hold of myself and say that I dont need it and that its not THAT magical.... you come back and it like magic again, I WANT ONE. loll


Sorry, bro. Try this: just list it on eBay, say you are on Overclockers.net and have a pro's 4790k, delidded and all, for "only" $350. Tell them what volts to set it at and what clock frequency. Do it like you are the Silicon Lottery store.

Just leave it up and you WILL get the "sold" email, I promise. People on a z97 board who know very little about computers can still drop in a 4790k now that they are sick of their 4690k, but can't build a new rig. I'm telling you, homey. It'll sell, especially if you list it with "Best Offer accepted". Somebody will throw $300 at you the first day.
Quote:


> Originally Posted by *Exilon*
> 
> Only reason I haven't bought a 5775C is because Broadwell-C doesn't work on Z87
> 
> Such a shame...


That really does suck. Another way they tried to kill it off pre-launch: they called it a "C" so peeps think it doesn't overclock, and set it at a measly 3.3GHz, too. ALL to trick people into upgrading to yet ANOTHER platform. SHAMEFUL.
Quote:


> Originally Posted by *ThingyNess*
> 
> If you folks really want to see some power-throttling insanity, try running it in 'custom' mode with 8k resolution, but with the '4k optimized' shaders.
> 
> On my 1070 with a 200w power limit (up from 150w default, so 33% higher!) I get no throttling at all in 8k res with the 8k shaders, but it hits that power limit hard with the '4k optimized' shaders.
> 
> It might not be obvious, but the '4k optimized' shaders are less complex than the '8k optimized' shaders, and both of those are less complex than the Medium or Extreme ones.
> 
> On 4k Extreme on my 1070, it only gets like 2.5fps, haha. Even if this benchmark supported SLI (which it doesn't for now), 4-way SLI Titan Xp would still be a slideshow at 4k Extreme.


this bench will be used for the next decade.


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> Asus BIOS at 2062 1.025v, no shunt mod, barely throttles, can do memory higher than stock BIOS stable at +659.


About the memory: that means your memory can take more than the stock bios will allow. It gets its power from the 6-pin, iirc, but it also shares it with part of the GPU. I don't remember exactly, but Wizard's PCB breakdown mentions it. The ASUS bios is giving you power differently, or else you have an amazing mobo that is feeding more juice through the PCIe slot than most of us can get. Plus your PSU in general is nasty.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> this bench will be used for the next decade.
> about the memory, that means that your memory can take more than the stock bios will allow. It gets that power from the 6-pin iirc, but it also shares it with part of the gpu. I dont remember exactly but Wizard's pcb breakdown mentions it. ASUS bios is giving you power differently, or either you have an amazing Mobo and it is giving you extra juice than most of us can get thru your PCI on your mobo. Plus your PSU in general is nasty.


I don't mean this as a joke, but he could very well have a magic PSU lol.

I know he has two separate PCIe cables, but that shouldn't make a difference since they're pulling from the same rail.

Could be his mobo.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> I don't mean this as a joke, but he could very well have a magic PSU lol.


It's not impossible to get extra perf from an amazing PSU and mobo combo, especially stability- and consistency-wise, no doubt about it.


----------



## Asmola

Asus Strix GTX 1080 Ti @ 2088MHz.



http://www.3dmark.com/spy/1524806


----------



## Benny89

I am now in work but I want to play around with my card more when I get back.

So far what I have discovered:

Voltage Curve:

At 1.093v it does 2088 but downclocks to 2076. It was stable most of the time, but 1 run in 3 it crashes Heaven close to the end.

At 1.081v, if I set it to 2088 or 2076, it downclocks to 2063 on the core and stays stable all the time.

Both cases with +550 memory.

Now the question: what can I do to improve this? I don't really understand what conclusions I should draw from these results.

Funny fact: the Heaven score at 1.081v / 2063 was better than 2076 at 1.093v (not by much).

Any way to push clocks higher and keep them stable?

I am a noob with this voltage curve.

Also, I didn't monitor power in GPU-Z. Should I do it, and what should I look for?


----------



## Silent Scone

Quote:


> Originally Posted by *Benny89*
> 
> I am now in work but I want to play around with my card more when I get back.
> 
> So far what I have discovered:
> 
> Voltage Curve:
> 
> At 1.093 it does 2088 but downclocks to 2076. It was stable most of the time, but 1 per 3 benchmarks it crashes Heaven close to end
> 
> At 1.081 If I set it to 2088 or 2076 It downclock to 2063 on core and stays stable all the time.
> 
> Both cases with +550 memory.
> 
> Now question- what can I do to improve this? I don't really understand what conclusions should I get from those results.
> 
> Funny fact- Heaven score at 1.081 V 2063 was better than 2076 at 1.093 (not much).
> 
> Anyway to push clocks higher to keep them stable?
> 
> I am noob with this Voltage curve.
> 
> Also I didn't monitor Power in GPU-Z, should I do it and what should I look for?


Not a lot, besides the shunt mod. It's just how GPU Boost works. Increasing the core offset obviously helps if you're stable at the next bin.


----------



## pez

I went ahead and triggered my step up from the 1080 to the Ti. Not sure if I'll decide on keeping it or selling it for an AIB.


----------



## mshagg

The point about memory power draw is not insignificant. Having a play around with the Superposition 4K benchmark:

With my custom curve and no mem OC it's happy to sit in the 1.05v and 1.043v bins. Plug in the 500MHz OC on the memory and it's tapping the 0.962v bin on occasion, although the extra bandwidth yields a massive 500 point improvement in scores (9700 versus 10200).


----------



## Nico67

Quote:


> Originally Posted by *SlimJ87D*
> 
> @Thingyness
> 
> This is quite interesting, the stock bios at some point pulled 332 Watts and then software+hardware kicked in and told it to back off.
> 
> 
> 
> Here's to hoping there is a way to turn off this regulator that kicks it down to 300 Watts


332 is just a power spike, but that's pretty extreme









Also, the Asus BIOS does give a bit more power, maybe 5+ W, so in the nicest possible terms, don't discredit it as no gain at all. In fact, if you were borderline power limiting with the FE BIOS, it can actually remove that power limiting, which is maybe what Kedarwolf is seeing. I wasn't getting any power limit issues with the Asus BIOS either, but I was seeing voltage limiting, which I think was just Vrel. Notably the FE BIOS is about one bin less stable for me, but even that might be one bin higher performance at lower voltage than the Asus.

No way I'm pulling the waterblock back off to do a shunt mod, so I'm still holding out and hoping for a power boost for the FE BIOS.


----------



## Benny89

Anyone here with STRIX 1080Ti that plans to remove/replace its backplate? If so I would like to buy original STRIX backplate for some custom modding.

Please kindly PM me if you'd like to sell it.


----------



## feznz

Quote:


> Originally Posted by *Benny89*
> 
> Anyone here with STRIX 1080Ti that plans to remove/replace its backplate? If so I would like to buy original STRIX backplate for some custom modding.
> 
> Please kindly PM me if you'd like to sell it.


I spent an extra $200 for the strix over any other card because of that pretty little logo









just a quick run with my 24/7 settings looks like my 3770k is not being killed by this bench
might try for some better benches winter is only a few months away


----------



## shalafi

Quote:


> Originally Posted by *feznz*
> 
> just a quick run with my 24/7 settings looks like my 3770k is not being killed by this bench
> might try for some better benches winter is only a few months away


What are your clocks on the 3770K? I don't suppose it's 3501 MHz as the benchmark reports...
EDIT: looking at your sig rig - 4.8GHz?


----------



## RavageTheEarth

Quote:


> Originally Posted by *joder*
> 
> Were you the one in New Orleans?


That's me! I'm at the airport right now. Gonna be a long day of traveling, I'll tell ya that much. I have a three-hour layover in Atlanta too. The price you pay when you have more important things to spend money on (1080 Ti), lulz


----------



## BoredErica

MSI Gaming X, Asus Strix = 330w?

MSI Armor, Gigabyte Aorus, EVGA SC = ???

Founders = 300w?

Solution for those with 300w power target: Wait for bios that works if it comes or shunt mod 4 all? (Unless your case is oriented in a way such that the liquid metal will just roll off, then RIP in peace pepperoni)

Is the reason why Gaming X and Strix have 330w power target that they have a custom PCB that affects the reference card's shunts?


----------



## Dasboogieman

Quote:


> Originally Posted by *Darkwizzie*
> 
> MSI Gaming X, Asus Strix = 330w?
> 
> MSI Armor, Gigabyte Aorus, EVGA SC = ???
> 
> Founders = 300w?
> 
> Solution for those with 300w power target: Wait for bios that works if it comes or shunt mod 4 all? (Unless your case is oriented in a way such that the liquid metal will just roll off, then RIP in peace pepperoni)
> 
> Is the reason why Gaming X and Strix have 330w power target that they have a custom PCB that affects the reference card's shunts?


The Aorus non-extreme version is 375W in software. The PCB also uses 002 resistors for the shunts for good measure. I can't speak for the other models.


----------



## FuriousReload

I have an FE with a Hybrid cooler on it. I can run 2126 MHz at 1.081 V for 30 minutes on both Heaven and Valley, but I can't get the card to stay at 1.093 V for long because of the power limit; for the time it does stay at 1.093 V, I run 2151 MHz. Would the shunt mod help keep me in the 1.093 V bracket? I tried the ASUS BIOS, but flashed back after that didn't help and I had some issues with it even running 2126 MHz at 1.081 V. My temps stay under 49C during these runs with the hybrid cooler and Kryonaut on the chip. Memory is at +500.

Link to my TimeSpy run with these settings: http://www.3dmark.com/spy/1546540


----------



## BoredErica

Quote:


> Originally Posted by *Dasboogieman*
> 
> The Aorus non-extreme version is 375W in software. The PCB also uses 002 resistors for the shunts for good measure. I can't speak for the other models.


Funny world if Gigabyte wins simply by having higher power target...


----------



## Benny89

Quote:


> Originally Posted by *Darkwizzie*
> 
> Funny world if Gigabyte wins simply by having higher power target...


They won't. There are people here with Gigabyte cards and they hit the same clocks as ASUS etc. You can raise the power limit all you want, but there is still the voltage limit and how much the core can actually achieve.

Also, a few FE cards here OC higher than AIBs, so....

Also, a higher core clock doesn't mean better performance, as has been proven here a few times. Often higher clocks at higher voltage will give you worse performance/fps/score than lower clocks at lower voltage.

If it were as simple as the power limit, it would be nice









It's all a silicon lottery, as always


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Darkwizzie*
> 
> Funny world if Gigabyte wins simply by having higher power target...


I know we make a big deal out of it, but honestly more power would just get the FE guys like 0.5-1% performance increase.

And nearly none if you can't oc that high.


----------



## Benny89

Quote:


> Originally Posted by *SlimJ87D*
> 
> I know we make a big deal out of it, but honestly more power would just get the FE guys like 0.5-1% performance increase.
> 
> And nearly none if you can't oc that high.


Can you show your curve?


----------



## Dasboogieman

Quote:


> Originally Posted by *Benny89*
> 
> They won't. There are people here with Gigabyte cards and they hit the same clocks as ASUS etc. You can raise the power limit all you want, but there is still the voltage limit and how much the core can actually achieve.
> 
> Also, a few FE cards here OC higher than AIBs, so....
> 
> Also, a higher core clock doesn't mean better performance, as has been proven here a few times. Often higher clocks at higher voltage will give you worse performance/fps/score than lower clocks at lower voltage.
> 
> If it was as simple as power limit it would be nice
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's all a silicon lottery, as always


Thing is, the power budget on the damn Aorus is so ridiculously overkill that I find I'm permanently capped by Vrel or temperature (the bins above 40 degrees) more than anything. I'm still looking for something that can max out the power limit; so far it's looking extremely difficult unless I do voltage hardmods.

It's the same as my old GTX 980 Ti with the unlocked BIOS: not so much about absolute best performance as pure consistency in all scenarios.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> it's not impossible to get extra perf because of an amazing PSU and Mobo combo. Especially stability and consistency wise, no doubt about it.


Can you post your curve? Where is your max, and when does it downclock?


----------



## BoredErica

Quote:


> Originally Posted by *Benny89*
> 
> They won't. There are people here with Gigabyte cards and they hit the same clocks as ASUS etc. You can raise the power limit all you want, but there is still the voltage limit and how much the core can actually achieve.
> 
> Also, a few FE cards here OC higher than AIBs, so....
> 
> Also, a higher core clock doesn't mean better performance, as has been proven here a few times. Often higher clocks at higher voltage will give you worse performance/fps/score than lower clocks at lower voltage.
> 
> If it was as simple as power limit it would be nice
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's all a silicon lottery, as always


It's true that the quality of the chip matters. It might be true that it's such a big factor it outweighs all else. But it's also true that a great FE card would be faster with the right power target for example.

Quote:


> Originally Posted by *SlimJ87D*
> 
> I know we make a big deal out of it, but honestly more power would just get the FE guys like 0.5-1% performance increase.


Maybe like 1.5% at the most.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Darkwizzie*
> 
> It's true that the quality of the chip matters. It might be true that it's such a big factor it outweighs all else. But it's also true that a great FE card would be faster with the right power target for example.
> 
> Maybe like 1.5% in the most.


Yeah, for the Superposition benchmark specifically.

But in the great majority of games, more like 0.5%. Games don't push you to the roof of the power draw for the entire session. My card will do Witcher 3 at 2088 MHz and not drop due to power 95% of the time.

In that small 5% fraction I'll drop to 2075 or 2063, so it'd probably make only about a 0.05% difference even if you compared average fps.

This is what makes me resist doing the shunt mod; it's all this effort and money for tiny gains. It would have been worth doing when I first installed my AIO, but I don't know if it's worth doing now.
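That estimate checks out as back-of-envelope math. A quick sketch, assuming (as described above) the card holds the top bin 95% of the time and splits the remaining 5% between the next two bins down:

```python
# Duty-cycle-weighted average clock: how much does dropping a bin or two
# for ~5% of the time actually cost? (Fractions below are assumptions
# matching the post, not measurements.)
def avg_clock(clock_shares):
    """clock_shares: list of (mhz, fraction_of_time); fractions sum to 1."""
    return sum(mhz * frac for mhz, frac in clock_shares)

full = 2088
avg = avg_clock([(2088, 0.95), (2075, 0.025), (2063, 0.025)])
loss_pct = (full - avg) / full * 100
print(round(loss_pct, 3))  # ~0.045, i.e. well under 0.1% average clock loss
```

So even in the worst case the averaged clock loss is a rounding error next to run-to-run benchmark variance.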


----------



## beastlyhax

Just installed my Accelero Hybrid III-140 on my 1080 Ti FE. I don't have any VRM cooling yet (just a fan blowing on them). I ran 2075 the whole time, and my max temp was only 47C! I plan on doing the shunt mod once I get some VRM cooling.


----------



## BoredErica

Quote:


> Originally Posted by *beastlyhax*
> 
> 
> 
> Just installed my Accelero Hybrid III-140 on my 1080 Ti FE. I don't have any VRM cooling yet (just a fan blowing on them). I ran 2075 the whole time, and my max temp was only 47C! I plan on doing the shunt mod once I get some VRM cooling.


Did your clocks change at all once you went hybrid?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *beastlyhax*
> 
> 
> 
> Just installed my Accelero Hybrid III-140 on my 1080 Ti FE. I don't have any VRM cooling yet (just a fan blowing on them). I ran 2075 the whole time, and my max temp was only 47C! I plan on doing the shunt mod once I get some VRM cooling.


I'm curious, why did you choose this instead of just mounting an AIO directly onto the FE shroud and keeping the OEM midplate to cool the VRAM and VRM?


----------



## ThingyNess

Quote:


> Originally Posted by *SlimJ87D*
> 
> @Thingyness
> 
> This is quite interesting, the stock bios at some point pulled 332 Watts and then software+hardware kicked in and told it to back off.
> 
> 
> 
> Here's to hoping there is a way to turn of this regulator that kicks it down to 300 Watts


If you look at the text there it'll give you some idea as to why - NVIDIA's power-level sampling is relatively slow: 119 samples over 2.47 seconds. In most cases that's even slower than your framerate, so it takes anywhere from 1-10 frames for it to figure out that you've exceeded the power cap and then take action to reduce clocks and voltage, power-gate (turn off) functional units temporarily, or do whatever else it does to keep you below the limit.

Because of this there will always be situations where you 'peak' well above the limit, and this is to be expected. It will try to keep the average < 300w though.
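For anyone curious, that averaging lag can be sketched with a toy windowed-average limiter. This is just an illustration of the effect, not NVIDIA's actual algorithm; the 119-sample window and 300 W cap are the figures from the post above, and the sample values are made up:

```python
# Toy sketch of why slow, averaged power sampling lets short spikes exceed
# the cap: the limiter only reacts once the windowed average crosses the
# target, so a brief 332 W spike on a ~290 W baseline never trips it.
from collections import deque

CAP_W = 300.0
WINDOW = 119  # samples over ~2.47 s, per the post above

def limiter(power_samples):
    """Yield (instant_power, windowed_avg, throttling?) per sample."""
    window = deque(maxlen=WINDOW)
    for p in power_samples:
        window.append(p)
        avg = sum(window) / len(window)
        yield p, avg, avg > CAP_W

# 100 samples at 290 W, a 5-sample 332 W spike, then back to 290 W:
samples = [290.0] * 100 + [332.0] * 5 + [290.0] * 14
throttled = [t for _, _, t in limiter(samples)]
print(any(throttled))  # False: the spike barely moves the average
```

The instantaneous draw peaks 32 W over the cap, but the windowed average never exceeds about 292 W, so no throttling ever triggers.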


----------



## dansi

Hey all, if I run the Time Spy demo and it locks up midway, does it mean my OC is at the limit?

I'm on an FE and it barely goes past 2 GHz before it locks up. I'm afraid I got the worst luck in the GPU lottery again.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *ThingyNess*
> 
> If you look at the text there it'll give you some idea as to why - NVidia's power level sampling is relatively slow. 119 samples over 2.47 seconds. In most cases that's even slower than your framerate, so it takes anywhere from 1-10 frames for it to figure out that you've exceeded the power cap and then take action to reduce clocks, frequency, power gate (turn off) functional units temporarily, or do whatever it does to keep you below the limit.
> 
> Because of this there will always be situations where you 'peak' well above the limit, and this is to be expected. It will try to keep the average < 300w though.


Yeah, I did this last night with the ASUS BIOS and the stock BIOS; the stock peaked at 305 Watts and the ASUS at 310 Watts.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> Can you show yuor curve?


Left is what I game at. Right is for the benchmark (scores around 10280 - 10290 on superposition)

My card is a pretty good bin, it can run [email protected]



[email protected] on heaven i score 2595 at 1440P benchmark.


----------



## gingerbreadman

I don't know if I got a bad card or what?

I keep crashing even at 1987 @ 1.093V









I'm only stable at 1974


----------



## beastlyhax

It would not have worked; the plate of the stock cooler interfered with the AIO coldplate. I'm aware you can do this with the EVGA cooler; the only reason I got this one is that it was on sale for £65.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *beastlyhax*
> 
> It would not have worked, the plate of the stock cooler interfered with the aio coldplate. I am aware that you can do this with the EVGA cooler, the only reason I got this is because I got it on sale for £65.


That's a good deal!

I bought a nzxt x41 for $65 refurbished and just slapped it into the FE.
Quote:


> Originally Posted by *gingerbreadman*
> 
> I don't know if I got a bad card or what?
> 
> I keep crashing even at 1987 @ 1.093V
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm only stable at 1974


Play with the voltage curve more, and if you have an FE, don't just shoot for 1.093V; that will get you closer to power limiting. Go read Kedarwolf's guide.


----------



## beastlyhax

They were rock solid; I never saw them change at all. Everything was perfectly static, including the voltage, which sat at 1.093v the whole time.


----------



## claymanhb

Just ordered this beast!

https://www.newegg.com/Product/Product.aspx?Item=N82E16814137126


----------



## gingerbreadman

Quote:


> Originally Posted by *SlimJ87D*
> 
> Play with voltage curves more, and if you have an FE, don't just shoot for 1.093V, that will get you closer to power limiting. Go read Kedarwolf's guide.


Mine is a MSI gaming X.

I don't know how you guys can hit 2000 @ 1.025V


----------



## dansi

Quote:


> Originally Posted by *gingerbreadman*
> 
> I don't know if I got a bad card or what?
> 
> I keep crashing even at 1987 @ 1.093V
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm only stable at 1974


Hey, does yours lock up the demo? If I run the Time Spy demo, it locks up midway; does it mean my OC is at the limit?

I'm on an FE and it just barely goes past 2 GHz before it locks up.

I'm afraid I got the worst luck in the GPU lottery again.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *gingerbreadman*
> 
> Mine is a MSI gaming X.
> 
> I don't know how you guys can hit 2000 @ 1.025V


I'm sorry man, it's just the silicon lottery. What temperatures is your card running at? Try to get it as cool as possible; you get a 13 MHz boost for every 10C lower, starting at 80C.

So if you're at 72C and you can get to 68C, then you'll get another 13 MHz bump.
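That rule of thumb can be written out as a quick sketch. The 13 MHz-per-10C step and the 80C reference point are the figures from this post; real GPU Boost breakpoints vary from card to card, so treat this as an approximation:

```python
# Estimate the effective boost clock from temperature: one ~13 MHz bin
# gained per full 10C step below the 80C reference (figures per the post
# above; actual per-card breakpoints differ).
def boost_clock(base_boost_mhz, temp_c, ref_temp_c=80, step_c=10, bin_mhz=13):
    """Return estimated clock: +1 bin for each full step below ref_temp_c."""
    if temp_c >= ref_temp_c:
        return base_boost_mhz
    bins_gained = (ref_temp_c - temp_c) // step_c
    return base_boost_mhz + bins_gained * bin_mhz

# Dropping from 72C to 68C crosses the 70C breakpoint, gaining one bin:
print(boost_clock(2000, 68) - boost_clock(2000, 72))  # 13
```

Note the step is quantized: going from 72C to 71C gains nothing; you have to cross a breakpoint to pick up the bin.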


----------



## Rhadamanthys

So which of the FEs should I get? Sure, in theory they're all the same, but someone earlier wrote there are rumours that EVGA is binning their cards. What about other manufacturers? For example, should I get a Gainward, given that (as far as I've seen) their lineup consists solely of the FE? Let's suppose everyone is binning; that would make it more likely to get a good sample, wouldn't it?


----------



## Serandur

Just got my Strix 1080 Ti. So excited.


----------



## gingerbreadman

Quote:


> Originally Posted by *SlimJ87D*
> 
> I'm sorry man, it's just the silicon lottery. What temperatures is you card running at? Try to get it as cool as possible, you get a 13 Mhz boost for every 10C lower starting at 80C.
> 
> So if you're at 72 and you can get to 68 then you'll get another bump of 13 Mhz.


I'm at about 68-69C. I notice that even with my power limit slider maxed at 117% (I think MSI can't go to 120%), the power % graph only fluctuates between 98 - 105%, but the voltage fluctuates from 1.062 - 1.093V.

I'm not sure if it's the power limit? How do I get it up to 117%?


----------



## BoredErica

Quote:


> Originally Posted by *Rhadamanthys*
> 
> So which of the FE's should I get? Sure in theory they're all the same but someone before wrote there are rumours that EVGA is binning their cards. What about other manufacturers? For example, should I get a Gainward, given that (as far as I've seen) their lineup consists solely of the FE? Let's suppose everyone is binning, that would make it more likely to get a good sample, wouldn't it.


EVGA Jacob has said before that there is no binning of chips. For any SKU.


----------



## GRABibus

Some results for the MSI Sea Hawk:


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Can you post your curve? Where are you max and when it downcklocks.


I just go down 13 MHz per step, starting at 1.075v @ 2088.


Playing around with these resolutions, I have seen our future, men. 10,240 x 4,320 is widescreen 21:9 8K.... ummmm


----------



## Slackaveli

Quote:


> Originally Posted by *pez*
> 
> I went ahead and triggered my step up from the 1080 to the Ti. Not sure if I'll decide on keeping it or selling it for an AIB.


which one? Keep it, man! You are PEZ, man.


----------



## lilchronic

2038Mhz with some power throttle down to 2000Mhz
voltage .963 -1.025


----------



## TheParisHeaton

Hi. I'm thinking about replacing 2x 980 Ti with a 1080 Ti. I've decided to do it, but I don't know which card to buy.

My favorites are the Aorus Xtreme or the FE. The FE is €100 cheaper. I have 2x 480mm 60mm-thick rads, and I also plan to buy the EK-FC block with the card.

I don't know what difference these versions achieve without changing the voltage. Any thoughts/advice? Thanks.


----------



## Slackaveli

1080p Extreme just for giggles. I literally never left 120 fps; it was locked through every frame of the bench, despite the reported fps min/max. That's got to be a decent score right there.


----------



## KingEngineRevUp

My Time Spy score. I have no clue why 3DMark doesn't like to run; it hates Afterburner on my system for some reason.

I can probably get the score a little higher playing with the voltage curve.


----------



## Slackaveli

Quote:


> Originally Posted by *feznz*
> 
> I spent an extra $200 for the strix over any other card because of that pretty little logo
> 
> 
> 
> 
> 
> 
> 
> 
> 
> just a quick run with my 24/7 settings looks like my 3770k is not being killed by this bench
> might try for some better benches winter is only a few months away


Winter is coming, lol, you Southern Hemisphere weirdo LOL


----------



## KingEngineRevUp

Quote:


> Originally Posted by *gingerbreadman*
> 
> I'm at about 68-69C. I notice that even with my power limit slider maxed at 117% (I think MSI can't go to 120%), the power % graph only fluctuates between 98 - 105%, but the voltage fluctuates from 1.062 - 1.093V.
> 
> I'm not sure if it's the power limit? How do I get it up to 117%?


MSI cards have a base power target of 280 W and ASUS 275 W, so they both end up around 330 W when you multiply by their max slider percentages.

117% * 280 = 327.6 Watts
120% * 275 = 330 Watts
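A quick sanity check of that math (the base wattages are the figures quoted here, not verified against vendor BIOS dumps):

```python
# Max board power = base power target * power-limit slider percentage.
def max_power_w(base_w, slider_pct):
    return base_w * slider_pct / 100

print(max_power_w(280, 117))  # 327.6 (MSI)
print(max_power_w(275, 120))  # 330.0 (ASUS)
```

So a lower slider ceiling doesn't necessarily mean a lower absolute power cap; it depends on the base target it multiplies.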


----------



## Slackaveli

Quote:


> Originally Posted by *FuriousReload*
> 
> I have an FE with a Hybrid cooler on it. I can run 2126 MHz at 1.081 V for 30 minutes on both Heaven and Valley, but I can't get the card to stay at 1.093 V for long because of the power limit; for the time it does stay at 1.093 V, I run 2151 MHz. Would the shunt mod help keep me in the 1.093 V bracket? I tried the ASUS BIOS, but flashed back after that didn't help and I had some issues with it even running 2126 MHz at 1.081 V. My temps stay under 49C during these runs with the hybrid cooler and Kryonaut on the chip. Memory is at +500.
> 
> Link to my TimeSpy run with these settings: http://www.3dmark.com/spy/1546540


Bruh. That's a monster. What temp delta did you gain with the Kryonaut? I have always seen 4-8C gains.
Quote:


> Originally Posted by *TheParisHeaton*
> 
> Hi. I'm thinking about replacing 2x 980 Ti with a 1080 Ti. I've decided to do it, but I don't know which card to buy.
> 
> My favorites are the Aorus Xtreme or the FE. The FE is €100 cheaper. I have 2x 480mm 60mm-thick rads, and I also plan to buy the EK-FC block with the card.
> 
> I don't know what difference these versions achieve without changing the voltage. Any thoughts/advice? Thanks.


How OCD are you? Will it bother you if you can't keep a certain clock, or do you just want to know you have a strong OC/card? Because if it's going to bother you: you will be power limited with the FE at times and not with the Aorus. However, said power limiting (while being quite bothersome to the ole OCD) only amounts to a ~0.5% - 2% difference AT MOST in actual performance.
Quote:


> Originally Posted by *gingerbreadman*
> 
> I don't know if I got a bad card or what?
> 
> I keep crashing even at 1987 @ 1.093V
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm only stable at 1974


Start with NO VOLTS ADDED. I crash all day at 2020 with volts added, but can hold 2050+ with no volts added, even after the first few temp bins drop.

**Somebody throw up a 1080p medium preset SuPo bench, please. I'm curious if the ole 5775C cache works in this bench at 1080p.


----------



## Slackaveli

nm


----------



## Slackaveli

dp


----------



## gingerbreadman

Quote:


> Originally Posted by *Slackaveli*
> 
> start with NO VOLTS ADDED. I crash all day at 2020 with volts but can hold 2050+ even after the first few temp bins drop with NO volts added.


Are you saying you OC more with no volts added?


----------



## FuriousReload

Quote:


> Originally Posted by *Slackaveli*
> 
> bruh. That's a monster. What temp delta gain with the Kyro? I have always seen 4-8c gains.
> how OCD are you? Will it bother you if you cant keep a certain clock or do you just want to know you have a strong OC/card? B/c if it's going to bother you, you will be power limited with t he FE at times and not with the Aorus. However, said power limiting (while being quite bothersome to the ole OCD) only amounts to a ~0.5% - 2% difference AT MOST in actual performance.
> start with NO VOLTS ADDED. I crash all day at 2020 with volts but can hold 2050+ even after the first few temp bins drop with NO volts added.
> 
> **Somebody throw up a 1080p medium preset SuPo bench, please. Im curious if the ole 5775c cache works in this bench at 1080p.


Kryonaut seemed to lower temps 4-5C; my wife's 1060 went down by 9C.


----------



## Slackaveli

Quote:


> Originally Posted by *FuriousReload*
> 
> Kryo seemed to lower temps 4-5C, My wifes 1060 went down by 9C.


YEP! I have it on all of our CPUs and GPUs in the house, too. 9C, wow. I got 8C out of my old 980 Ti Hybrid with it. It's great, hella perf per dollar, that stuff.
Quote:


> Originally Posted by *gingerbreadman*
> 
> Are you saying you OC more with no volts added?


Yessir. It's stable without added volts but always crashes at the same clocks when volts are added. Try finding your best clocks w/o volts, then maybe try 25mv and see if it's stable. I find none is best on the Aorus.


----------



## FuriousReload

Quote:


> Originally Posted by *Slackaveli*
> 
> YEP! I have it on all of our cpu gpu, too, in the house. 9c, wow. I got 8c out of my old 980yi hybrid with it. It's great , hella perf per dollar, that stuff.
> yessir. It's stable but always crashes at like clocks when volts are added. Try finding your best clocks w/o volts , then maybe try 25mv and see if it's stable. I find none is best on Aorus.


I used the Kryonaut for my 7700K delid, as my Conductonaut didn't show up. Later I redid it with Conductonaut and the difference was only 1C; it really is great paste.


----------



## Benny89

Hmm, I am not hitting the power limit... Even when I force my card to run at 1.093 V with 2076 on the core and +600 memory, my top power reading in GPU-Z is 108%...

How do I make it actually hit higher?

And has anyone tried a voltage higher than 1.093 on air for higher clocks?


----------



## Slackaveli

Quote:


> Originally Posted by *FuriousReload*
> 
> I used the Kryo for my 7700k delid, as my conductonaut didn't show up. Later on redid it with conductonaut and the different was only 1C, it really is great paste.


damn, that's kinda crazy. Wow.
Quote:


> Originally Posted by *Benny89*
> 
> Hmm, I am not hitting Power Limit... Even when I force my card to run at 1.093 V on 2076 on core and +600 memory my top Power Limit in GPU-Z is 108...
> 
> How to make it actually hit higher?\
> 
> And did anyone tried higher Voltage than 1.093 on air for higher clocks?


Just go into the Superposition bench and run the 4K or 8K preset; that will power you up. How much power you actually draw is application-dependent.


----------



## Slackaveli

.


----------



## gingerbreadman

Quote:


> Originally Posted by *Slackaveli*
> 
> yessir. It's stable but always crashes at like clocks when volts are added. Try finding your best clocks w/o volts , then maybe try 25mv and see if it's stable. I find none is best on Aorus.


Thanks! Can I check what is your graphics score on Superposition at 1080 extreme preset with your 1080 ti OC-ed?


----------



## Slackaveli

Quote:


> Originally Posted by *gingerbreadman*
> 
> Thanks! Can I check what is your graphics score on Superposition at 1080 extreme preset with your 1080 ti OC-ed?


lemme go run it. brb.


----------



## RavageTheEarth

I'm on my last airplane, and then I have another hour's ride home before I can finally tear that box open and hopefully fit it right below my water-cooled 980 Ti. I have no interest in draining the loop right now. I just want to slide it under the 980 Ti until the water block comes out. I have a feeling I'm going to run into issues though. Maybe I'll be the first person to use a 980 Ti as a PhysX card, ha!


----------



## Benny89

How do I lock the voltage on the card? Say I want to force it to 1.081 V at 2076 on the core; how do I lock it there?


----------



## KedarWolf

Quote:


> Originally Posted by *RavageTheEarth*
> 
> I'm on my last airplane and then I have another hour ride home and then I can finally tear that box open and hopefully able to fit it right below my water cooled 980 Ti. I have no interest in draining the loop right now. I just want to slide it under the 980 Ti until the water block comes out. I have a feeling I'm going to run into issues though. Maybe I'll be the first person to use a 980 Ti as a physics card ha!


I'm using an older water cooled Maxwell Titan X as a dedicated PhysX card.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> How do I lock voltage on card. Said I want to force it to get 1.081 V at 2076 core, how to lock it there?


in the curve, click the point and hit ctrl+L.
Quote:


> Originally Posted by *gingerbreadman*
> 
> Thanks! Can I check what is your graphics score on Superposition at 1080 extreme preset with your 1080 ti OC-ed?


Here's my 1080p Extreme run. Started at 2088, then held 2076 throughout, max temp 47c on air. Dat's nasty.
Quote:


> Originally Posted by *KedarWolf*
> 
> I'm using an older water cooled Maxwell Titan X as a dedicated PhysX card.


You know that will still sell for ~ $500 on Ebay ....


----------



## kevindd992002

Quote:


> Originally Posted by *Slackaveli*
> 
> Nope. Intel has a ban on those, apparently. I can link the owner's club where 90% of us came from a 4690K/4790K and there is a 100% approval rating for this CPU. I've read the entire thread and every single one of us is super, super happy. Testimonials and what-not. Unfortunately you will NEVER see benches unless they come from very far-flung third-world places, and I suspect that is part of Intel's blackballing of this CPU. There just aren't any benches (or super few, and all have the 5775C @ 3.3GHz when all of us can hit 4.2-4.4GHz), but 1080p is the very best way to bench a CPU anyway. I understand you want to know if it's better at that rez, and I'm telling you NONE of us were gaming at 1080p and we all see vast improvements across the board, including HUGE HUGE gains in games, especially in my case at 144Hz 1440p, and power consumption is way down. These are 14nm CPUs. They sip power. Just go read 10 or 20 pages here... http://www.overclock.net/t/1583537/intel-broadwell-c-ownership-club/720#post_26005440
> 
> It's simple math: IPC improvement, die shrink, single-core and physics scores are up, and 134MB vs 8MB cache....... As the articles I linked are titled, it's the "Broadwell Destroyer" and they call it "The Gaming Behemoth." I couldn't have said it any better. It beats the 7700k, so of course it roflstomps the 4790k, even the golden 5.0GHz ones like many of us sold/gifted when we jumped on The Gaming Behemoth. I'm pretty much all gamer, so for me I wouldn't trade it for a 7700k. Nope.


Quote:


> Originally Posted by *Slackaveli*
> 
> But it DOESN'T diminish at higher rez. In fact it can become even more pronounced in heavy CPU titles. The gains are because of the ENORMOUS CACHE DELTA of 126MB; that's over 1500% more cache, bro. You are just wrong, but iDC, I'm good. And 18.7% is the same thing as ~20%, btw.
> 
> Regardless of what % it wins by, it will NEVER lose to a 4790k in any title and it will ALWAYS beat it by at least some measurable amount. For $50.


If I decide to upgrade my 2600K to a new CPU that is most practical to handle a 1080Ti which is it going to be? Or do you guys recommend waiting for newer CPU's?


----------



## TheParisHeaton

Quote:


> Originally Posted by *Slackaveli*
> 
> how OCD are you? Will it bother you if you cant keep a certain clock or do you just want to know you have a strong OC/card? B/c if it's going to bother you, you will be power limited with t he FE at times and not with the Aorus. However, said power limiting (while being quite bothersome to the ole OCD) only amounts to a ~0.5% - 2% difference AT MOST in actual performance.


I don't run benchmarks, but I'd like a card that holds over 2000 MHz. I'm going to use it for 24/7 folding and sometimes some gaming.
As for overclocking... I can manage moving the sliders in Afterburner.

BTW: what's the difference between the Aorus and the Aorus Xtreme?


----------



## beastlyhax

Since I got water cooling, my voltages are higher and I never hit the power limit. It's really strange; before, I was hitting the limit all the time.


----------



## gingerbreadman

Quote:


> Originally Posted by *Slackaveli*
> 
> Here's my 1080p Extreme run. Started at 2088, then held 2076 throughout (
> 
> 
> 
> 
> 
> 
> 
> ) , max temp 47c on air. Dat's nasty.


Damn, that's a nice card. 47C on air, wut? lol... you living in Antarctica?


----------



## Benny89

Ok, I know my card can be stable at 2076, but I have two problems with that. First, I can't curve it to hold 2076, because above 61C it will downclock to 2063; that can't be avoided on air. However, when I set the card to 2088 so that it will downclock to 2076, it often crashes at 2088 before it can downclock itself to the 2076 where it's stable.

How can I make this card STAY at 2076? Any ideas?


----------



## Addsome

How much thermal grizzly kryonaut would I need if I want to put it on my GPU, CPU and have some left over for the future? Would 3.9grams be enough?


----------



## prelude514

So my Aorus non-Extreme is a terrible clocker and seems to have other issues. I had traded in what I consider a golden MSI FE which would do 2000mhz at only 0.993v. That's pretty good, right?

The store still has my FE card. I've just about convinced myself to go get it back and pick up an AIO and shunt mod it. 2 questions:

1) For the shunt mod, I'd prefer soldering a wire on each PCI-e connectors shunt. A single stand of thin aluminium speaker wire should do the trick? Or should I use copper wire?

2) This will be the first time I put an AIO on a GPU. These things are meant for CPUs; do I need to mod the AIO at all? Are there brackets sold for this? Ideally I'm looking for a 120mm thick-rad AIO, as that's all my current case will accommodate. If performance is really better with a 240mm, I could make that work as well. What are the general recommendations for an AIO on a 1080Ti? I want the best cooling performance. Here's where I'll most likely be buying the AIO:

http://www.canadacomputers.com/index.php?cPath=8

Oh, and are the VRMs decent on FE cards? Haven't found any info on them.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> Ok, I know that my card can get stable at 2076. But I have two problems with that. First I can't curve it to 2076 because it will downcloack after 61C to 2063. That can't be avoided on air. However when I set card up to 2088 so it will downclock to 2076 it crashes often at 2088 before it can downclock itself to 2076 where it can stay stable.
> 
> How Can I make this card STAY at this 2076? Any ideas?


You're fighting thermal throttling, not much you can really do besides improving your temperatures. Maybe buy some kryonaut and re-tim.


----------



## Slackaveli

Quote:


> Originally Posted by *kevindd992002*
> 
> If I decide to upgrade my 2600K to a new CPU that is most practical to handle a 1080Ti which is it going to be? Or do you guys recommend waiting for newer CPU's?


7700K, 5775C, or waiting for the 7740K are pretty much the options. If you already had a Z97, that'd be a no-brainer. If you are investing in a whole new platform, I'd wait for X299; it just got bumped up to a Computex reveal. They should be out by summer.


----------



## Slackaveli

Quote:


> Originally Posted by *TheParisHeaton*
> 
> I do not play benchmarks but I had the card that holds over 2000 MHz. I'm going to use it on 24/7 folding and sometimes some game.
> About overclocking .. I can move the scroll bars in the afterburner.
> 
> 
> 
> 
> 
> 
> 
> 
> BTW: the difference between Aorus and Aorus Xtreme?


+25 clocks off the top and a better backplate. Otherwise it's the same. A possible caveat is their GPU Gauntlet sorting, so the Xtreme probably has a better chance of being a 2088+ chip.

Here's my best 4K Optimized. Can't quite hit that 10,400 :/
Quote:


> Originally Posted by *gingerbreadman*
> 
> Damn, that's a nice card. 47C on air wut? lol..you living in antarctica?


I have the fan on 100%, which on the Aorus REALLY clears out the heat. It's 70F/21C ambient, and I did open up the window on my case for 3C to keep me under 50C. But even with the door closed and the fan on a normal 1:1 curve, I usually never pass 60C on long gaming sessions. I did the Kryonaut re-paste too; that helps 5C+.
Quote:


> Originally Posted by *Benny89*
> 
> Ok, I know that my card can get stable at 2076. But I have two problems with that. First I can't curve it to 2076 because it will downcloack after 61C to 2063. That can't be avoided on air. However when I set card up to 2088 so it will downclock to 2076 it crashes often at 2088 before it can downclock itself to 2076 where it can stay stable.
> 
> How Can I make this card STAY at this 2076? Any ideas?


That was my exact situation pre-Kryonaut re-paste, but now I keep that bin because of the 5C gained. You can set the 2088 bin to 2076, too, to avoid that crash: it's still locked there, but the clock will hold through the voltage switch.
Quote:


> Originally Posted by *Addsome*
> 
> How much thermal grizzly kryonaut would I need if I want to put it on my GPU, CPU and have some left over for the future? Would 3.9grams be enough?


Should be about 4 uses in that. They sell a 1 gram tube, and that is about 1 use. I got the 5.5 gram tube on eBay for $21, though. That was a buy right there. Did 3 GPUs and 2 CPUs with it and I have 1 use left.
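The tube math above can be sketched out quickly. This is just a rough illustration using the thread's own rule of thumb of ~1 gram per application; the function names and the $21 / 5.5 g figure quoted above are the only inputs.

```python
# Rough thermal-paste tube arithmetic. Assumes ~1 gram per application,
# which is the rule of thumb in the thread, not a manufacturer figure.

def applications(tube_grams, grams_per_use=1.0):
    """Approximate number of re-pastes a tube covers."""
    return round(tube_grams / grams_per_use)

def price_per_gram(price_usd, tube_grams):
    """Cost per gram, handy for comparing tube sizes."""
    return price_usd / tube_grams

print(applications(3.9))                    # the 3.9 g tube: about 4 uses
print(round(price_per_gram(21.0, 5.5), 2))  # the $21 eBay 5.5 g tube, $/g
```

So the 3.9 g tube is plenty for a GPU, a CPU, and leftovers, and the 5.5 g eBay deal works out to under $4/gram.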


----------



## FuriousReload

Quote:


> Originally Posted by *Addsome*
> 
> How much thermal grizzly kryonaut would I need if I want to put it on my GPU, CPU and have some left over for the future? Would 3.9grams be enough?


Should be plenty bud, I always buy the 11 gram tubes, and one lasts for a lot of CPUs and GPUs.


----------



## Slackaveli

Quote:


> Originally Posted by *FuriousReload*
> 
> Should be plenty bud, I always buy the 11 gram tubes, and it last for a lot of CPUs GPUs.


always going to need more!


----------



## FuriousReload

Quote:


> Originally Posted by *Slackaveli*
> 
> always going to need more!


I just bought two 11 gram tubes on Amazon for $28 each yesterday, should keep me good for awhile.


----------



## Slackaveli

Quote:


> Originally Posted by *FuriousReload*
> 
> I just bought two 11 gram tubes on Amazon for $28 each yesterday, should keep me good for awhile.


damn, that's a GOOD BUY; only ~$2.50/g. They want $12/gram, hell, that's more than I pay for nugs!


----------



## Exilon

Quote:


> Originally Posted by *beastlyhax*
> 
> Since I got water cooling my voltages are higher and I never hit the power limit. It is really strange. Before I was hitting the limit all the time.


Not that strange.

The board power includes the cooler and fans, which can pull up to 15W at 100%.

Lowering temperatures also reduces power by as much as 1W / 2C, especially with the smaller and leakier transistors.
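That arithmetic, sketched out (a hedged example: the 15 W fan figure and the 1 W per 2 C leakage figure are the rough numbers above, and the 250 W reference board power limit is an assumed figure, not from this post):

```python
# Back-of-the-envelope: why a water-cooled card can stop hitting the power
# limit. The cooler/fans can pull ~15 W at 100%, and lowering GPU temperature
# saves roughly 1 W per 2 C via reduced leakage (figures from the post above).

def board_power_saved(fan_watts_saved, temp_drop_c, watts_per_2c=1.0):
    """Watts of board-power headroom freed by moving to water."""
    return fan_watts_saved + (temp_drop_c / 2.0) * watts_per_2c

# e.g. removing the stock fan load (15 W) and running 30 C cooler:
saved = board_power_saved(15, 30)
print(saved)              # 30.0 W of headroom
print(saved / 250 * 100)  # as a % of an assumed 250 W reference limit
```

Thirty-odd watts is over 10% of a reference card's power budget, which is easily the difference between riding the limiter and never touching it.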
Quote:


> Originally Posted by *prelude514*
> 
> So my Aorus non-Extreme is a terrible clocker and seems to have other issues. I had traded in what I consider a golden MSI FE which would do 2000mhz at only 0.993v. That's pretty good, right?
> 
> The store still has my FE card. I've just about convinced myself to go get it back and pick up an AIO and shunt mod it. 2 questions:
> 
> 1) For the shunt mod, I'd prefer soldering a wire on each PCI-e connectors shunt. A single stand of thin aluminium speaker wire should do the trick? Or should I use copper wire?
> 
> 2) This will be the first time I put an AIO on a gpu. These things are meant for CPUs, do I need to mod the AIO at all? Are there brackets that are sold? I'm looking ideally for a 120mm thick rad AIO as that's all that my current case will accommodate. If performance is really better with a 240mm, I could make that work as well. What are the general recommendations for AIO on a 1080Ti? I want the best cooling performance. Here's where I'll be buying the AIO from most likely:
> 
> http://www.canadacomputers.com/index.php?cPath=8
> 
> Oh, and are the VRMs decent on FE cards? Haven't found any info on them.


1) If you're going into soldering, might as well just replace the 5 mOhm resistor with a 3-4 mOhm one. Pretty sure a 2512 Panasonic ERJ resistor will work. Much more precise than soldering a wire and hoping it doesn't trip a power fault from too little resistance.

http://www.mouser.com/ds/2/315/AOA0000C241-947534.pdf

2) There are brackets like the G10 or the new G12 that can work, but for the FE you can use EVGA's hybrid cooler.

3) FE can handle >350W just fine with enough cooling. Be careful if you're going to use the GPU fan header to drive the VRM fan; the VRM will cook if the GPU fan header doesn't command enough RPMs.


----------



## c0ld

Quote:


> Originally Posted by *kevindd992002*
> 
> If I decide to upgrade my 2600K to a new CPU that is most practical to handle a 1080Ti which is it going to be? Or do you guys recommend waiting for newer CPU's?


I would wait. I'm still running my 2600K with my Aorus 1080 Ti and don't feel like there is any bottleneck, since my CPU is sitting at 4.8GHz.


----------



## Addsome

What's the difference between Thermal Grizzly Kryonaut, Thermal Grizzly Hydronaut, and Thermal Grizzly Aeronaut?


----------



## Slackaveli

Quote:


> Originally Posted by *Addsome*
> 
> Whats the difference between Thermal Grizzly Kryonaut and Thermal Grizzly Hydronaut?


Hydronaut is the older stuff; Kryonaut is twice as good by the stats.


----------



## Addsome

Quote:


> Originally Posted by *Slackaveli*
> 
> hydro is the older stuff, kyro is twice as good by stats.


What about Thermal Grizzly Aeronaut?


----------



## kevindd992002

Quote:


> Originally Posted by *Slackaveli*
> 
> 7700k, 5775c, or wait for 7740k are pretty much the options. If you already had a z97, that'd be a no-brainer. If you are investing in a whole new platform, I'd wait for x299, it just got bumped up to computex reveal. They should be out by summer.


Any price forecast on the 7740K though?

Quote:


> Originally Posted by *c0ld*
> 
> I would wait im still running my 2600k with my Aorus 1080 Ti don't feel like there is any bottle neck since my CPU is sitting at 4.8GHz.


I'm all waiting but the thing is that I will be going custom water cooling next month with a new case and I would want to use the new platform (CPU, mobo, ram) already rather than using the old system, installing the water cooling loop, and then doing the same process all over again when a new CPU comes out.


----------



## Serandur

This card is incredible. Rise of the Tomb Raider, nearly max settings, 1440p, with *SSAA 2x (supersampling!!!)* at a _constant_ 60 FPS in geothermal valley.

Time for a new monitor I guess?


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> that was my exact situation pre-kyro re-pasting but now I keep that bin because of the 5c gained. You can set the 2088 bin to 2076, too, to avoid that crash. still lock it there but it's clock will hold on the voltage switch.


It will still downclock to 2063 due to temperature. How do you want me to lock it there, or at the 1.093V bin, if it still downclocks after 60C? There are two points where the core downclocks: at 60C it drops one bin, and at 71C it drops another. The 71C one I can avoid with a custom fan curve, but hell, I can't keep the temp below 60C on air.

Also, why, when I make a custom curve, save my profile, apply +50 more on memory, and hit apply, does Afterburner change my curve?


----------



## Slackaveli

Quote:


> Originally Posted by *Addsome*
> 
> What about Thermal Grizzly Aeronaut?


Aeronaut is the original Thermal Grizzly paste, half as good as Hydronaut. It's like 3, Hydronaut is 6, and Kryonaut is 13.2, or something like that. Just get the Kryonaut; it costs more because it's the best by far.
Quote:


> Originally Posted by *Benny89*
> 
> It will still downclock due to temperature to 2063. How do you want me to lock it there or 1.093 V bin if it will still downclock after 60C. There are two moments that Core downclocks. At 60 C it downclock one bin and at 71 it downclocks another bin. 71 I can avoid with custom fan curve but hell I can't keep temp on air below 60 C.
> 
> Also why when I made custom curve and save my profile and I apply +50 more on Memory and hit apply Afterburner changes my curve?


You need the Kryonaut.

The curve will always snap to its real frequencies, aka 2100, 2088, 2076, 2063, 2050, 2037, 2025, 2012, 2000, 1987, 1974, 1961, 1949... etc, etc. Every 2 steps is 25MHz.


----------



## Serandur

Quote:


> Originally Posted by *Benny89*
> 
> It will still downclock due to temperature to 2063. How do you want me to lock it there or 1.093 V bin if it will still downclock after 60C. There are two moments that Core downclocks. At 60 C it downclock one bin and at 71 it downclocks another bin. 71 I can avoid with custom fan curve but hell I can't keep temp on air below 60 C.
> 
> Also why when I made custom curve and save my profile and I apply +50 more on Memory and hit apply Afterburner changes my curve?


I haven't kept up with Pascal overclocking, but isn't anyone creating custom BIOSes to override GPU Boost's throttling behavior? I agree, it's annoying.


----------



## Norlig

I received my copper shims yesterday, 25mm x 25mm x 1.5mm.

I don't get why I didn't double-check those measurements, as they are just enough to cover the GPU die itself.

That's dangerous, isn't it?

Shouldn't it also rest on the silver edges, so as to distribute the pressure?

Anyway, I did test it, and my temps rose to >90C and I got a driver crash, so I must have mounted it skewed or something.
Works fine when I put the normal cooler back on.

A 45mm x 45mm x 1.5mm copper shim would be perfect...


----------



## Slackaveli

Quote:


> Originally Posted by *Norlig*
> 
> I received my Copper shims yesterday, 25mmx25mmx1.5mm.
> 
> I dont get why I didnt double check those meassurements, as they are just enough to cover the die of the GPU itself.
> 
> Thats dangerous isnt it?
> 
> Shouldn't it also rest on the silver edges, as to divide the preassure?
> 
> Anyway, I did test, and my Temps rose to >90'C and I got Driver crash, so I must've mounted it skewered or something.
> Works fine when I put the normal cooler on again.
> 
> a 45mmx45mmx1.5mm Copper shim would be perfect..


Unlucky. I'd be very careful when tightening that down.


----------



## Benny89

I was able to do a full Heaven benchmark at 1.093V, 2088 downclocking to 2076 after 50C. 100% fan speed let me keep 2076 core and 6100MHz memory for the full run at 1440p Extreme.

Mind you, this is on a stock 5775C (didn't OC it yet):


----------



## KingEngineRevUp

Quote:


> Originally Posted by *prelude514*
> 
> So my Aorus non-Extreme is a terrible clocker and seems to have other issues. I had traded in what I consider a golden MSI FE which would do 2000mhz at only 0.993v. That's pretty good, right?
> 
> The store still has my FE card. I've just about convinced myself to go get it back and pick up an AIO and shunt mod it. 2 questions:
> 
> 1) For the shunt mod, I'd prefer soldering a wire on each PCI-e connectors shunt. A single stand of thin aluminium speaker wire should do the trick? Or should I use copper wire?
> 
> 2) This will be the first time I put an AIO on a gpu. These things are meant for CPUs, do I need to mod the AIO at all? Are there brackets that are sold? I'm looking ideally for a 120mm thick rad AIO as that's all that my current case will accommodate. If performance is really better with a 240mm, I could make that work as well. What are the general recommendations for AIO on a 1080Ti? I want the best cooling performance. Here's where I'll be buying the AIO from most likely:
> 
> http://www.canadacomputers.com/index.php?cPath=8
> 
> Oh, and are the VRMs decent on FE cards? Haven't found any info on them.


I have an h55 for sale with a noctua fan on it. You can modify the bracket and install it like so


https://www.reddit.com/r/5z91t7/mod_the_poor_mans_hybrid_h55_gtx_10xx_fe/

Or just get a kraken g12


----------



## Foxrun

Quote:


> Originally Posted by *Serandur*
> 
> This card is incredible. Rise of the Tomb Raider, nearly max settings, 1440p, with *SSAA 2x (supersampling!!!)* at a _constant_ 60 FPS in geothermal valley.
> 
> Time for a new monitor I guess?


Time for 4k sir


----------



## FuriousReload

@Slackaveli I did manage to run my card at 2176MHz at 1.093v; it didn't downclock for two runs of Valley. On the 3rd run I hit the power wall and it backed down to 2151MHz at 1.081v. Seeing how I seem to be doing well so far, I don't think I should complain, but damn, stay at 1.093! Lol.


----------



## Slackaveli

Quote:


> Originally Posted by *FuriousReload*
> 
> @Slackaveli I did manage to run my card at 2176MHz at 1.093, didn't down clock for two runs of valley. on 3rd run I ran into power wall and it back down to 2151 MHz at 1.081. Seeing how I seem to be doing well so far, don't think I should complain, but damn, stay at 1.093! Lol.


I don't think anybody is touching that, bro. DIZAMN. Beating most of us by a good 100Mhz, ya freak! In theory, wouldn't a 2200Mhz Ti tie a 2000Mhz Titan Xp?
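That Ti-vs-Titan question can be sanity-checked with raw FP32 throughput. A rough sketch only: the CUDA core counts (3584 for the 1080 Ti, 3840 for the Titan Xp) are the public specs, it assumes 2 FLOPs per core per clock (FMA), and it ignores memory bandwidth entirely.

```python
# Rough FP32 throughput comparison: 2200 MHz 1080 Ti vs 2000 MHz Titan Xp.
# 2 FLOPs/core/clock (fused multiply-add); ignores memory bandwidth.

def fp32_tflops(cuda_cores, clock_mhz):
    """Theoretical single-precision TFLOPs at a given core clock."""
    return 2 * cuda_cores * clock_mhz * 1e6 / 1e12

ti = fp32_tflops(3584, 2200)  # GTX 1080 Ti, 3584 CUDA cores
xp = fp32_tflops(3840, 2000)  # Titan Xp, 3840 CUDA cores
print(round(ti, 2), round(xp, 2))  # 15.77 vs 15.36
```

So on shader math alone, a 2200MHz Ti would actually edge out a 2000MHz Titan Xp slightly; in practice the Xp's wider memory bus narrows the gap.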
Quote:


> Originally Posted by *Benny89*
> 
> I was able to do full Heaven benchmark on 1.093V 2088 downclocked to 2076 after 50C. 100% Fan speeds allowed me to keep 2076 core and 6100 Mhz Memory for full run at 1440p Extreme.
> 
> Mind it is on stock 5775C CPU (didn't OC it yet):


very nice, man. You've settled in now. Time for that 5775c to get up to 4.3, eh? Then we'll have basically identical systems


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> I was able to do full Heaven benchmark on 1.093V 2088 downclocked to 2076 after 50C. 100% Fan speeds allowed me to keep 2076 core and 6100 Mhz Memory for full run at 1440p Extreme.
> 
> Mind it is on stock 5775C CPU (didn't OC it yet):


Hmm, at those clocks your score should be 2550-2600

What does your perf cap look like?


----------



## shalafi

Quote:


> Originally Posted by *Slackaveli*
> 
> **Somebody throw up a 1080p medium preset SuPo bench, please. Im curious if the ole 5775c cache works in this bench at 1080p.


Hi-a, here you go:


----------



## Slackaveli

Quote:


> Originally Posted by *shalafi*
> 
> Hi-a, here you go:


wow! Those cores are scaling in this one!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Norlig*
> 
> I received my Copper shims yesterday, 25mmx25mmx1.5mm.
> 
> I dont get why I didnt double check those meassurements, as they are just enough to cover the die of the GPU itself.
> 
> Thats dangerous isnt it?
> 
> Shouldn't it also rest on the silver edges, as to divide the preassure?
> 
> Anyway, I did test, and my Temps rose to >90'C and I got Driver crash, so I must've mounted it skewered or something.
> Works fine when I put the normal cooler on again.
> 
> a 45mmx45mmx1.5mm Copper shim would be perfect..


No that's fine. That's what I'm using


----------



## Benny89

Quote:


> Originally Posted by *FuriousReload*
> 
> @Slackaveli I did manage to run my card at 2176MHz at 1.093, didn't down clock for two runs of valley. on 3rd run I ran into power wall and it back down to 2151 MHz at 1.081. Seeing how I seem to be doing well so far, don't think I should complain, but damn, stay at 1.093! Lol.


lol what? That is on water, right? Please post your benchmark scores!


----------



## Benny89

Quote:


> Originally Posted by *SlimJ87D*
> 
> Hmm, at those clocks your score should be 2550-2600
> 
> What does your perf cap look like?


What is perf cap?


----------



## FuriousReload

Quote:


> Originally Posted by *Benny89*
> 
> lol what? That is on water, right? Please post your benchmark scores!


Yea, I am using the EVGA Hybrid AIO.

I will get the full set of benches done tonight and upload them, including SuPo.


----------



## shalafi

Quote:


> Originally Posted by *Slackaveli*
> 
> wow! Those cores are scaling in this one!


out of curiosity I undid my overclock back to stock and disabled 4 cores, so here's [email protected]:


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> What is perf cap?


Run GPU-Z alongside your benchmark and then look at the sensors tab after the run.


----------



## feznz

I am loving this bench, unlike Futuremark, where the separate physics test makes the overall score misleading.
Can't wait till they get the online results page up so I can browse the world's top benches that are exclusively GPU-based.
Quote:


> Originally Posted by *shalafi*
> 
> what are your clocks on the 3770k? I don't supposer it's 3501MHz as the benchmark reports ..
> EDIT: looking at your sig rig - 4.8GHz?


You're right, 4.8GHz @ 1.35v.
I can run 5GHz 24/7, but at 1.45v.
You reminded me that I must edit my sig.
Quote:


> Originally Posted by *Slackaveli*
> 
> Winter is coming, lol, you Southern Hemisphere weirdo LOL


In the land down under, New Zealand, where the water goes down the toilet in a clockwise direction.


----------



## Slackaveli

Quote:


> Originally Posted by *FuriousReload*
> 
> Yea, I am using the EVGA Hybrid AIO.
> 
> I will get the full set of benches done tonight and upload them, including SuPo.


I started calling it that and I think it's going to stick. wooot.


----------



## Benny89

Quote:


> Originally Posted by *SlimJ87D*
> 
> Run gpu-z with your benchmark and then look at the sensors after the run.


Ok here it is:


----------



## FuriousReload

Quote:


> Originally Posted by *Slackaveli*
> 
> I started calling it that and I think it's going to stick. wooot.


You did it bro, Internet famous!


----------



## Lefty23

Once again late to the party.
My first attempt to OC using the curve (if no PL is hit, this gives 2126 core / 12265 mem @ 1.081V), and a new benchmark is out.

CPU is a [email protected] - DDR3 [email protected]
GPU is EVGA 1080Ti FE with EK-FC Titan X Pascal - Acetal+Nickel waterblock
Using the latest GeForce driver (381.65 WHQL).
There are 3*240 rads and a D5 pump in a Core V21 case - all running @100% for the benches.
The room temperature is 21-22 C.

Here are my runs:
1080p Extreme


4K Optimized


8K Optimised


Quote:


> Originally Posted by *Slackaveli*
> 
> **Somebody throw up a 1080p medium preset SuPo bench, please. Im curious if the ole 5775c cache works in this bench at 1080p.


I'm a few pages behind so not sure if you are still looking for this but, here you go


----------



## Slackaveli

sick, Lefty. I can almost keep up with you but your superior cooling solution definitely puts you in a boss position
Quote:


> Originally Posted by *Lefty23*
> 
> Once again late to the party.
> My first attempt to oc using the curve (If no PL is hit this gives 2126core - 12265mem @ 1.081V), and a new benchmark is out.
> 
> CPU is a [email protected] - DDR3 [email protected]
> GPU is EVGA 1080Ti FE with EK-FC Titan X Pascal - Acetal+Nickel waterblock
> Using the latest GeForce driver (381.65 WHQL).
> There are 3*240 rads and a D5 pump in a Core V21 case - all running @100% for the benches.
> The room temperature is 21-22 C.
> 
> Here are my runs:
> 1080p Extreme
> 
> 
> 4K Optimized
> 
> 
> 8K Optimised
> 
> I'm a few pages behind so not sure if you are still looking for this but, here you go


Quote:


> Originally Posted by *shalafi*
> 
> out of curiosity I undid my overclock back to stock and disabled 4 cores, so here's [email protected]:


Don't know what I did wrong the first time, but it was locked at 120 for some reason. THIS time, though,
.....


----------



## feznz

Quote:


> Originally Posted by *SlimJ87D*
> 
> 
> 
> My time spy score. I have no clue why 3DMark doesn't like to run, it hates Afterburner on my system for some reason.
> 
> I can probably get the score a little higher playing with the voltage curve.


Update to the latest RivaTuner; sometimes AB doesn't update RTSS properly.

Quote:


> Originally Posted by *Benny89*
> 
> Ok, I know that my card can get stable at 2076. But I have two problems with that. First I can't curve it to 2076 because it will downcloack after 61C to 2063. That can't be avoided on air. However when I set card up to 2088 so it will downclock to 2076 it crashes often at 2088 before it can downclock itself to 2076 where it can stay stable.
> 
> How Can I make this card STAY at this 2076? Any ideas?


Get water cooling, or put your PC in a commercial freezer.
Quote:


> Originally Posted by *Serandur*
> 
> This card is incredible. Rise of the Tomb Raider, nearly max settings, 1440p, with *SSAA 2x (supersampling!!!)* at a _constant_ 60 FPS in geothermal valley.
> 
> Time for a new monitor I guess?


I put that game to the side, planning on playing it with max eye candy. Now, with my UC99-38W at 3840x1600, using max settings and SMAA x2, I get a playable 45-50 FPS.







loving it


----------



## Slackaveli

@feznz I saw that video where a dude was OCing Ryzen in a deep freezer, and he got a plain 1700 to 4150MHz, I think, which is pretty gnarly. I think the deep freezer filled with dry ice and not plugged in would have yielded much safer and better results, though.


----------



## c0ld

Quote:


> Originally Posted by *kevindd992002*
> 
> Any price forecast on the 7740K though?
> 
> I'm all waiting but the thing is that I will be going custom water cooling next month with a new case and I would want to use the new platform (CPU, mobo, ram) already rather than using the old system, installing the water cooling loop, and then doing the same process all over again when a new CPU comes out.


Then you've got a reason to upgrade, go for it.


----------



## Slackaveli

Quote:


> Originally Posted by *FuriousReload*
> 
> You did it bro, Internet famous!


I think so, lol. It needed more than "SP" and "Superposition" is a mouthful. So , SuPo it is.
Quote:


> Originally Posted by *c0ld*
> 
> Then you got a reason to upgrade go for it


Agreed, and apparently I am in the minority on this (lol, I'm shocked), but I think the 7740K will be a nice chip. It could clock over 5.5GHz with that extra power headroom, and process improvements should net at least a 4770K-vs-Devil's-Canyon situation, and that isn't even accounting for the extra overclock headroom that more power and a far better platform will bring. I am unsure how they will TIM the thing, but if it is like all the other HEDT parts, then it will crush a 7700K, so to speak (and even if it's craptastic regular TIM like a 7700K, a delid would do the trick), as well as leaving several future upgrade paths. If I were building now, I'd wait, and try to save as much dough as I could in case I decided to go Skylake-X, which will be nasty, too.

But even if I'm trippin', who knows how long Z270 will be viable. There won't be a process shrink with clever caches coming as a magic life extender for Z270 like Z97 got with Broadwell late in its cycle, that I can guarantee. Not that it will become crap overnight or anything; it's a great platform. But we are talking about a fresh-build scenario with two processor lines and a new platform about to drop. My advice is to wait, unless you have a Z97 platform, and then you'd be CRAZY not to sell that 4690K/4790K and grab a 5775C.


----------



## Benny89

Quote:


> Originally Posted by *Benny89*
> 
> Ok here it is:


So, anyone see anything here? My PerfCap always seems to say Util. What does that mean?


----------



## c0ld

Quote:


> Originally Posted by *Slackaveli*
> 
> I think so, lol. It needed more than "SP" and "Superposition" is a mouthful. So , SuPo it is.
> Agreed, and apparently i am in the minority on this (lol, I'm shocked) but I think the 7740k will be a nice chip. It could clock over 5.5ghz with that extra power headroom and process improvements should net at least a 4770k vs Devil's Canyon situation, and that isn't really even accounting for the increased overclock headroom more power and a way better platform will bring. I am unsure how they will TIM the thing but if it is like all the other HEDT than it will crush a 7700k so to speak (and even if it's craptastic reggie TIM like a 7700k, a delid would do the trick), as well as leaving several future upgrade paths. If I were building now I'd wait, and try to save as much dough as I could in case I decided to go Skylake-X, which will be nasty, too.


Don't like the 7700K running hot; my 2600K runs way cooler at 4.8GHz. Waiting for Ryzen to mature and for Intel's next offering.

I wonder if my PCI-E 2.0 bus may be limiting my 1080 Ti?


----------



## Lefty23

Quote:


> Originally Posted by *Slackaveli*
> 
> sick, Lefty.
> 
> dont know what I did wrong the first time but it was locked at 120 for some reason but THIS time,
> .....


That's more like it.









Almost 7 fps higher mins. Last week you almost convinced me to try the 5775c.
Only thing is I'm already planning a platform update (6core+) at the end of the year.


----------



## Slackaveli

It looks great to me, brother. It's a balls-to-the-wall OC, squeezing out what you can, but it's pretty damn stable. If you can go play 3 straight hours of your most demanding title without crashing, then you should be Gucci.

Battlefield 1 is great because you are testing the OC on a new CPU and a new GPU, and that game pushes both hard at 144 FPS in 1440p. If you get a driver crash (freeze, then eventual recovery), that is the GPU. If it hard freezes and screams 'beeep, buuuzz' at you, then that is your CPU needing more voltage (usually).


----------



## Exilon

Quote:


> Originally Posted by *Benny89*
> 
> So anyone see anything here? My Perf says always Util it seems. What does it mean?


Util means the GPU driver detected that the load isn't high enough to warrant higher clock states so it's downclocking the GPU.

Interesting that you have all 4 PerfCaps active at one point...
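GPU-Z reports PerfCap as a set of flags that can be active simultaneously, which is why "all 4 at one point" is possible. A small decoder sketch: the flag names mirror GPU-Z's columns, but the bit values here are made up for illustration and are not GPU-Z's actual encoding.

```python
# PerfCap is a bitmask of simultaneous throttle reasons. The names below
# mirror GPU-Z's sensor columns; the bit values are illustrative only.

PERFCAP_FLAGS = {
    0x01: "vRel",  # limited by reliability voltage
    0x02: "VOp",   # limited by maximum operating voltage
    0x04: "Pwr",   # board power limit
    0x08: "Thrm",  # thermal limit
    0x10: "Util",  # load low enough that the driver drops clock states
}

def decode_perfcap(mask):
    """Return the list of active throttle reasons encoded in a bitmask."""
    return [name for bit, name in sorted(PERFCAP_FLAGS.items()) if mask & bit]

# "All 4 PerfCaps active at one point" would decode like this:
print(decode_perfcap(0x0F))  # ['vRel', 'VOp', 'Pwr', 'Thrm']
print(decode_perfcap(0x10))  # ['Util']
```

The takeaway is that Util by itself is benign (the driver simply sees no work), while any combination including Pwr or Thrm means the card is actively being held back.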


----------



## Tikerz

If I'm testing let's say 2050 @ 1.031 does it matter if my voltage slider is 0% or 100%?


----------



## shalafi

Quote:


> Originally Posted by *Slackaveli*
> 
> dont know what I did wrong the first time but it was locked at 120 for some reason but THIS time,
> .....


now THAT's better







what are your gpu clocks? Too bad the benchmark doesn't show them, it's the only thing I'm missing there.
my Gigabyte FE seems like a power hog, I have a 2012MHz OC dialled in at 1.043V via the curve, but in FS Extreme/Ultra and SuPo 4k+ it hits the power limit hard and downclocks to anywhere between 1980-1930 with the voltage dropping below 1.000V to keep it under 120%.
I wonder if I should take it apart and connect the pump to the mobo/psu rather than the card itself to save some power







I just don't feel like dismantling it anymore, for the 6th time now


----------



## Benny89

Quote:


> Originally Posted by *Exilon*
> 
> Util means the GPU driver detected that the load isn't high enough to warrant higher clock states so it's downclocking the GPU.
> 
> Interesting that you have all 4 PerfCaps active at one point...


Ok, what can I do about it, and what does it mean? And why is it interesting?









Slim said I should score higher in Heaven with those clocks, so is there anything wrong with my card?


----------



## Slackaveli

Util means fully utilized. It's what you want.


----------



## Slackaveli

Quote:


> Originally Posted by *Lefty23*
> 
> That's more like it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Almost 7 fps higher mins. Last week you almost convinced me to try the 5775c.
> Only thing is I'm already planning a platform update (6core+) at the end of the year.


might be worth it just in the short term, too, since it can be had for ~$50-75 on the switch, and the resell value on a 5775c is basically full price







In fact, they will sell out the last they made and then the used ones will go for over $400-500







It will always have a certain following/allure, at least longer than most CPUs.
But I am a bit tempted by Skylake-X myself. Looks sexy, and I've been wanting an 8-core, too. In the future, with Scorpio being the lead game-development platform, 8 cores should scale in every game, and then my cache magic no longer wins the day.


----------



## Tikerz

Quote:


> Originally Posted by *Slackaveli*
> 
> Util means fully utilized. it's what you want.


That's not how I understand it. Util means there's no activity on the card. It's always Util until you give the GPU something to do.


----------



## Slackaveli

Quote:


> Originally Posted by *Tikerz*
> 
> That's not how I understand it. Util means there's no activity on the card. It's always Util until you give the GPU something to do.


So the GPU isn't running full bore, because it is able to do everything required of it without ramping up; hence "Util". Or something like that. At any rate, it's nothing that requires tweaking to remove.
Quote:


> Originally Posted by *c0ld*
> 
> Don't like the 7700k running hot my 2600k runs way cooler at 4.8GHz. Waiting for Ryzen to mature and for intel's next offering.
> 
> I wonder if my PCI-E 2.0 bus may be limiting my 1080 Ti??


It is limiting you a little bit, but not as much as you'd think. Something well under 10%; IIRC it was more like 4%.


----------



## Tikerz

Quote:


> Originally Posted by *Slackaveli*
> 
> so, the gpu isnt full bore because it is able to do all required of it without ramping up, hence utilized. Or something like that. At any rate, nothing that requires tweaking to remove it.


Yeah, it's nothing to worry about


----------



## KedarWolf

Superposition at 1080p Extreme, Asus BIOS, 2088 @ 1.062v; it doesn't throttle at all the entire run.

1080PExtreme2088_1062mv.txt 42k .txt file




Can someone run 1080P Extreme on stock BIOS with GPU-Z logging?


----------



## Lefty23

Quote:


> Originally Posted by *Slackaveli*
> 
> might be worth it just in the short term, too, since it can be had for ~$50-75 on the switch, and the resell value on a 5775c is basically full price
> 
> 
> 
> 
> 
> 
> 
> In fact, they will sell out the last they made and then the used ones will go for over $400-500
> 
> 
> 
> 
> 
> 
> 
> It will always have a certain following/allure , at least longer than most cpus.
> But, I am a bit tempted by skylake-x myself. Looks sexy, and I've been wanting a 8 core, too. In the future with scorpio being the lead game developer platform 8-cores should scale in every game and then my cache magic no longer wins the day.


Maybe in the US man. Where I live it's more like 160-180 USD. I know cause I checked last week.
Anyway got to get some sleep. I'm supposed to get up to go to work in 5 hours.


----------



## Slackaveli

Quote:


> Originally Posted by *shalafi*
> 
> now THAT's better
> 
> 
> 
> 
> 
> 
> 
> what are your gpu clocks? Too bad the benchmark doesn't show them, it's the only thing I'm missing there.
> my Gigabyte FE seems like a power hog, I have a 2012MHz OC dialled in at 1.043V via the curve, but in FS Extreme/Ultra and SuPo 4k+ it hits the power limit hard and downclocks to anywhere between 1980-1930 with the voltage dropping below 1.000V to keep it under 120%.
> I wonder if I should take it apart and connect the pump to the mobo/psu rather than the card itself to save some power
> 
> 
> 
> 
> 
> 
> 
> I just don't feel like dismantling it anymore, for the 6th time now


yeah, my clocks are very solid on this Aorus as I don't hit the power limiter. Aorus is even advertising these as the only "4k/60 capable gpu's with a 375w power limit", which means they know exactly what they are doing lol. I run the first two scenes at 2088, then hit 40c (tops at 47c) and clocks go to 2076 through the end of the run.
Quote:


> Originally Posted by *Lefty23*
> 
> Maybe in the US man. Where I live it's more like 160-180 USD. I know cause I checked last week.
> Anyway got to get some sleep. I'm supposed to get up to go to work in 5 hours.


well, whatever the cost, minus resale on what the world thinks is the best i7 on Z97 (the 4790k), is the total upgrade cost, as everything else is unchanged. But I bet you'd get every dollar you spent on the 5775c back on resale. It'll hold all its value, fwiw.
g/n, bro.


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> Superposition at 1080P Extreme Asus BIOS 2088 1.062v doesn't throttle at all entire run.
> 
> 1080PExtreme2088_1062mv.txt 42k .txt file
> 
> 
> 
> 
> Can someone run 1080P Extreme on stock BIOS with GPU-Z logging?


Crazily, I can stay within 2c of you and I'm on air


----------



## Slackaveli

Quote:


> Originally Posted by *Slackaveli*


Log just shows some blue VRel b/c clocks are locked at lower voltage.


----------



## Benny89

Wish somebody would hack that stupid BIOS to disable the stupid downclocking.....


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Wish that somebody hack that stupid BIOS to disable that stupid downclocking.....


you just need cooling. Get a $10 tube of Kryonaut and you'll stay under 59c.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> you just need cooling. Get a $10 tube of kyronaut and you'll stay under 59c.


I don't want to void the warranty. From what I have heard, Asus cards have a warranty sticker, and if you damage it you void the warranty.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> I don't want to void warranty. There is a warranty sticker from what I have heard in Asus card that if you damage you void warranty.


ooooo. msi has those too. I don't blame you. Luckily the Aorus was sticker-free, as was the FE. Same reason I didn't do my son's msi 1070. But that thing runs in the 60s anyway.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> ooooo. msi has those too. I don't blame you. Luckily Aorus was sticker free, as was FE. Same reason I didnt do my son's 1070 msi. But that thing runs in the 60s anyway.


Hmm, I might consider returning STRIX then and getting Aorus Extreme if it finally hits stores in my country in next 2 weeks.

EDIT: I am not sure about those ASUS stickers. I just heard they had them on previous cards. I didn't find confirmation that the 1080 Ti has one.


----------



## lilchronic

Quote:


> Originally Posted by *Slackaveli*
> 
> sick, Lefty. I can almost keep up with you but your superior cooling solution definitely puts you in a boss position
> 
> dont know what I did wrong the first time but it was locked at 120 for some reason but THIS time,
> .....


2038Mhz - - 6210Mhz - - 5820k @ 4.6Ghz


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Hmm, I might consider returning STRIX then and getting Aorus Extreme if it finally hits stores in my country in next 2 weeks.
> 
> EDIT: I am not sure about those ASUS sticker. I just heard they had ones in previous cards. I didn't find confirmation if 1080Ti has it.


easy enough to do. especially if you have a good lil toolkit for pc work.
Quote:


> Originally Posted by *lilchronic*
> 
> 2038Mhz - - 6210Mhz - - 5820k @ 4.6Ghz


damn, son. very nice. that 5820 is still a beast, as is that Ti of course.


----------



## KedarWolf

Quote:


> Originally Posted by *Slackaveli*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Slackaveli*
> 
> 
> 
> 
> 
> log just shows some blue vrel bc/ clocks locked at lower voltage./

If you mean my log the lower clocks and voltages were when the bench was just starting up before it was actually running.


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> If you mean my log the lower clocks and voltages were when the bench was just starting up before it was actually running.


I meant mine. I was being lazy b/c there was nothing to see, really.


----------



## Benny89

So should I do SUPERPOSITION AT 4K, 1440p or 1080p? My monitor is 1440p


----------



## KedarWolf

Quote:


> Originally Posted by *Slackaveli*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> If you mean my log the lower clocks and voltages were when the bench was just starting up before it was actually running.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i meant mine. I was being lazy b/c there was nothing to see really.

Oh, I see...


----------



## KedarWolf

Quote:


> Originally Posted by *Benny89*
> 
> So should I do SUPERPOSITION AT 4K, 1440p or 1080p? My monitor is 1440p


You can do them all, even on a 1440P screen.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> So should I do SUPERPOSITION AT 4K, 1440p or 1080p? My monitor is 1440p


4k optimized is the best to test a gpu (or 8k). Dialed in you should get 10,000-10,400.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> 4k optimized is the best to test a gpu (or 8k). Dialed in you should get 10,000-10,400.


lol what is wrong with this bench. My clocks are jumping around in 4K.... In Heaven they were stable...


----------



## mshagg

Quote:


> Originally Posted by *Benny89*
> 
> lol what is wrong with this bench. My clocks are jumping around in 4K.... In Heaven they were stable...


It's the power draw due to the complexity of the scene, which is what makes it a fairly good test for these cards. You'll want to invest some time in tweaking your voltage curve to maximise scores.

The 8K scene brings my card to its knees, will barely hold 2000Mhz. Heaven will pass at 2075 and even 2100.

Heaven is a relatively light load on the GPU compared to supo.


----------



## Benny89

Quote:


> Originally Posted by *mshagg*
> 
> It's the power draw due to the complexity of the scene, which is what makes it a fairly good test for these cards. You'll want to invest some time in tweaking your voltage curve to maximise scores.
> 
> The 8K scene brings my card to its knees, will barely hold 2000Mhz. Heaven will pass at 2075 and even 2100.
> 
> Heaven is a relatively light load on the GPU compared to supo.


But what more can I tweak in the curve?







I already know my card's max stable is 2076 and I need 1.093/1.081 V for that. What else can I set in the curve?

Should I try to lock it for the SuPo bench?


----------



## mshagg

Quote:


> Originally Posted by *Benny89*
> 
> But what I can tweak more in curve?
> 
> 
> 
> 
> 
> 
> 
> I already know my card max stable is 2076 and I need 1.093/1.081 V for that. What else I can set in curve.
> 
> Should I try to lock it for SUP bench?


More meat in the curve. It's gonna hit the power limit and it's gonna move to a lower voltage bin; so the trick is to have the lower voltage bins at the highest core speed they can handle.

Once you've achieved that the only thing that's going to improve performance is a shunt mod to lift the TDP limit restriction.
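The "more meat in the curve" idea can be sketched in a few lines. This is a toy model, not Afterburner's actual API: every clock and voltage below is made up for illustration, and the point is just that each lower voltage bin gets raised to its own highest stable clock, so a power-limit drop into that bin costs the minimum frequency.

```python
# Toy model of flattening a voltage/frequency curve "with more meat" in the
# lower bins. All voltages and clocks are hypothetical example values.
def beef_up_curve(stock_curve, stable_ceilings):
    """stock_curve: {voltage_V: stock_clock_MHz}.
    stable_ceilings: {voltage_V: highest clock known stable at that voltage}.
    Returns a new curve where every bin runs its own ceiling."""
    return {v: max(clk, stable_ceilings.get(v, clk))
            for v, clk in stock_curve.items()}

stock    = {0.950: 1898, 1.000: 1936, 1.050: 1987, 1.093: 2012}
ceilings = {0.950: 1974, 1.000: 2012, 1.050: 2050, 1.093: 2076}
tuned = beef_up_curve(stock, ceilings)
print(tuned)  # each bin now sits at its ceiling, e.g. 0.950 V -> 1974 MHz
```

With a curve like `tuned`, dropping from the 1.093 V bin to the 1.000 V bin under power limit costs 64 MHz instead of 140 MHz.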


----------



## c0nsistent

So has anyone ziptied an AIO onto a founders edition yet? I have an old Antec 620 lying around and I've done 'the mod' in the past to various GPUs. It seems that a shim is required to do so whilst keeping the shroud attached. Does anyone know what size is ideal, and have there been any success stories?

Thanks.

I'm just not feeling the $159 price tag for the Hybrid kit when I have an AIO sitting right behind me that I can ziptie onto the card. I couldn't care less how it looks; I simply want more performance.


----------



## Slackaveli

THIS
Quote:


> Originally Posted by *mshagg*
> 
> It's the power draw due to the complexity of the scene, which is what makes it a fairly good test for these cards. You'll want to invest some time in tweaking your voltage curve to maximise scores.
> 
> The 8K scene brings my card to its knees, will barely hold 2000Mhz. Heaven will pass at 2075 and even 2100.
> 
> Heaven is a relatively light load on the GPU compared to supo.


Quote:


> Originally Posted by *Benny89*
> 
> lol what is wrong with this bench. My clocks are jumping around in 4K.... In Heaven they were stable...


BRO! This is what I've been telling you the last two days. The Heaven bench doesn't push your power limit, so why even tune with it? Do a 4k optimized SuPo run and then you'll see the nature of the power throttle. But, I will say that I can start at 2088, clock down to 2076, then it volts down again but still holds 2076 through to the end. So, it doesn't require much tweaking.
Quote:


> Originally Posted by *c0nsistent*
> 
> So has anyone ziptied an AIO onto a founders edition yet? I have an old Antec 620 lying around and I've done 'the mod' in the past to various GPUs. It seems that a shim is required to do so whilst keeping the shroud attached. Does anyone know what size is ideal, and have there been any success stories?
> 
> Thanks.
> 
> I'm just not feeling the $159 price tag for the Hybrid kit when I have an AIO sitting right behind me that I can ziptie onto the card. I could care less how it looks, as I simply want more performance.


yeah, man, it should work. you'll want a 40mm x 40mm x 1.5mm shim (or 45x45x1.5) and some Thermal Grizzly Kryonaut.


----------



## Benny89

Clocks were jumping from 2030-2076.

i7 5775C Overclock to 4.3 Ghz.

Weak score imo. Something is off... 4K Performance.


----------



## mshagg

Quote:


> Originally Posted by *Benny89*
> 
> Clocks were jumping from 2030-2076.
> 
> i7 5775C Overclock to 4.3 Ghz.
> 
> Weak score imo. Something is off... 4K Performance.


Looks like you had a stutter in that run with the 20FPS minimum. My gaming-stable OC, which is 2025-2037 for the 4K optimized run, was min 60.15/avg 75.58/max 98.54 for score of 10105.


----------



## Benny89

Little better score. Pushed all bins in the curve to higher clocks:


Quote:


> Originally Posted by *mshagg*
> 
> Looks like you had a stutter in that run with the 20FPS minimum. My gaming-stable OC, which is 2025-2037 for the 4K optimized run, was min 60.15/avg 75.58/max 98.54 for score of 10105.


What could cause stutter?

Do you guys bench with G-Sync on? I have it off.


----------



## mshagg

Quote:


> Originally Posted by *Benny89*
> 
> What could cause stutter?
> .


Anything, e.g. a background task interrupting the benchmark.

10k+ looks like everything is running pretty well.


----------



## KingEngineRevUp

This Superposition test is definitely a good one for showing off maximum horsepower. I've actually caved in and I'm going to do the shunt mod + liquid electrical tape + liquid metal on my AIO. Should have it all done next week









But this test doesn't show realistic results for actual gaming; at least, the majority of games don't run at maximum+ draw all the time.


----------



## c0nsistent

Terrible... this is what happens when you run 100% fan and try to get good results on an FE card. This is @ 2050mhz locked @ 1000mv with 6000mhz memory.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Little better socre. Put all bins in curve to higher clocks:
> 
> 
> What could cause stutter?
> 
> Do you guys bench with G-Sync on? I have it off.


yeah, you good. those scores are good.


----------



## Slackaveli

Quote:


> Originally Posted by *c0nsistent*
> 
> 
> 
> Terrible... this is what happens when you run 100% fan and try to get good results on an FE card. This is @ 2050mhz locked @ 1000mv with 6000mhz memory.


do a few adjustments in nvidia control panel to up that a bit. put prefer high perf, turn a few settings to performance over quality.


----------



## Qba73

You know what, guys, I think I'm just gonna get rid of my Ti and go with some real muscle. Feast your eyes on the Y2K beast, the one, the only, 9700 Pro, complete with added-on memory heatsinks. I think I can score like 2 points on Time Spy lol. Found my old buddy going through my closet of forgottens.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *c0nsistent*
> 
> 
> 
> Terrible... this is what happens when you run 100% fan and try to get good results on an FE card. This is @ 2050mhz locked @ 1000mv with 6000mhz memory.


Yeah... I think someone measured the fan power and it can go up to 12 watts =/, which probably makes you more power-limited.

But I wouldn't feel too bad; this benchmark is also a stress test. It's not often you'd be gaming with such draws, at least in the average game.


----------



## ThingyNess

Quote:


> Originally Posted by *Slackaveli*
> 
> Util means fully utilized. it's what you want.


Quote:


> Originally Posted by *Tikerz*
> 
> Yeah, it's nothing to worry about


After reading through all the pages since this, it looks like nobody addressed it, so I'll take a shot.

'Util' means the card is automatically downclocking itself to save power *because it has nothing to do* - the most common cause is a heavy CPU/driver bottleneck, so the GPU simply isn't being fed data fast enough to justify keeping it at the maximum clock rate.

It can also happen if you've enabled a framerate cap in Afterburner, or have Vsync/GSync forced and the card is hitting the framerate cap that way.

Basically, if you see 'Util' then you have a bottleneck somewhere (artificial or otherwise) that isn't the GPU.

This is why you see 'Util' as the PerfCap reason at idle when you're just looking at the Windows desktop. The very simple task of drawing the desktop at your monitor's refresh rate is so easy that the card can do it at the lowest P-state (~200MHz on most cards), so there's no reason to burn power by keeping the highest clock/voltage state active.
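If you want to see which limiter is actually biting during a run, GPU-Z's sensor log can be tallied after the fact. A rough sketch, with the caveat that the exact column header varies by GPU-Z version (so we match any column containing "PerfCap"), and the sample rows below are made up:

```python
# Tally PerfCap reasons from a GPU-Z sensor log (comma-separated text).
# Column header name is an assumption; we match any column containing "PerfCap".
import csv, io
from collections import Counter

def perfcap_histogram(log_text):
    reader = csv.DictReader(io.StringIO(log_text), skipinitialspace=True)
    col = next(name for name in reader.fieldnames if "PerfCap" in name)
    return Counter(row[col].strip() for row in reader)

# Made-up sample in the same shape as a GPU-Z log:
sample = """Date, GPU Core Clock [MHz], PerfCap Reason []
2017-04-01 12:00:00, 2088, Vrel
2017-04-01 12:00:01, 2076, Pwr
2017-04-01 12:00:02, 2076, Pwr
"""
print(perfcap_histogram(sample))  # Counter({'Pwr': 2, 'Vrel': 1})
```

A run dominated by `Pwr` is power-throttling; mostly `Util` means the card simply wasn't being fed, as described above.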


----------



## Benny89

Quote:


> Originally Posted by *ThingyNess*
> 
> After reading through all the pages after this, it looks like nobody addressed it, so i'll take a shot.
> 
> 'Util' means the card is automatically downclocking itself to save power *because it has nothing to do* - the most common cause is a heavy CPU/driver bottleneck, so the GPU simply isn't being fed data fast enough to justify keeping it at the maximum clock rate.
> 
> It can also happen if you've enabled a framerate cap in Afterburner, or have Vsync/GSync forced and the card is hitting the framerate cap that way.
> 
> Basically, if you see 'Util' then you have a bottleneck somewhere (artificial or otherwise) that isn't the GPU.
> 
> This is why you see 'Util' as the PerfCap reason at idle when you're just looking at the windows desktop. The very simple task of drawing the desktop at your monitor's refresh rate is so easy that the card can do it at the lowest P-state (~200mhz on most cards) so there's no reason to burn power by keeping the highest clock/voltage state active.


Interesting. But I get Util when running 2076 core and 6100MHz memory. The CPU is not a concern, since it is a 5775C OCed to 4.3 GHz.

So what else can cause that?


----------



## Addsome

Quote:


> Originally Posted by *Benny89*
> 
> Interesting. But I have Util when runnin 2076 core and 6100Mhz Memory. CPU is not a concern since it is 5775C OCed to 4.3 Ghz.
> 
> So what else can cause that?


Your graph shows multiple perf caps; that's why the rainbow is there. That means you're running the core too high on the voltage bins. Set your core lower and test. If there is no perf cap you are good. A high-MHz core with a perf cap will give a lower score than a lower-MHz one with no perf cap.


----------



## Benny89

Quote:


> Originally Posted by *Addsome*
> 
> Your graph shows multiple perf caps, thats why the rainbow is there. That means your running the core too high on the voltage bins. Put your core lower and test. If there is no perf cap you are good. A high Mhz core with perf cap will give a lower score than a lower Mhz with no perf cap.


You mean only on the one bin where I set the max, or on all bins? So I should try to lower clocks on all bins?


----------



## Serandur

Pascal's GPU boost is really annoying. My Strix 1080 Ti at stock started at 2000 MHz, got me all excited. Then it dropped as it warmed up... 1987, then 1974, then 1962, then 1949... in some scenarios as low as 1860 MHz. Maxwell never throttled so wildly.

It seems stable OC'd to 2038-2063 MHz at stock voltage though, which is like 14.5+ TFLOPs of peak shader throughput. Very monstrous. If only GPU boost could be tamed.


----------



## Benny89

Ehhh.....I am tired of all this voltage curve bs. I miss Maxwell and simple OC of it....


----------



## Qba73

Same here brother, same here. Maxwell was crank and bench, rinse and repeat, no curve bs

Damn boost 3.0


----------



## Benny89

Y, I'm even considering returning my 1080 Ti, putting back my 980 Ti and waiting for Volta.

Darn these perf caps, limits, boosts, curves, etc.

Took me 2 hours to OC my new CPU nice and steady. Two days trying to figure out whether my 1080 Ti OC is good or bad, too high, too low, bla bla...

Not pleased with Pascal....


----------



## mshagg

You're kidding right? The cards have to operate within limits.

Getting the most out of them is actually a moderately interesting challenge now, rather than moving a slider, hitting apply and finding out whether you won the silicon lottery or not.


----------



## Benny89

And my GPU does not downclock properly at idle now... after a PC reset. It sits at 1500 core, 50C on the desktop.

Anyone knows how to fix it?


----------



## RavageTheEarth

Alright boys, this day of hell traveling for 12 hours to get back home was absolutely miserable..... BUT now I'm FINALLY home playing with the new card. Man this thing is HUGE. Like HUGE. I had to move my reservoir and side panel in order to fit it in there. Can't wait until EK releases the waterblock for this card... but until then I'll have to deal with it. Time to sell my 980 Ti + waterblock and get some of this cash back!!!





I bumped the power limit up to 125% to get that sweet 375w limit. Didn't touch anything else and it's boosting to 2000Mhz right out of the box. I'm impressed! I set the fan at 100% and during firestrike extreme temps didn't even hit 50C which I'm surprised about since I heard the Aorus's ran a little hot.


----------



## KingEngineRevUp

Come on now guys, Boost 3.0 is boosting us safely from 1500MHz. All it did was take away 80% of the OC work for us. But don't take it for granted. I still remember the days in high school when I had to OC using the NVIDIA Control Panel, and it was terrible.


----------



## Stiltz85

I just ordered 3 FE's, not sure what I will do with them but, sign me up because I just busted my way into this club!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> And my GPU does not downclock properly on idle now... After PC reset. It sits at 1500 core 50C in desktop.
> 
> Anyone knows how to fix it?


You have it on max performance mode at global level. You just want to set that for the benchmark itself.


----------



## Qba73

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Alright boys, this day of hell traveling for 12 hours to get back home was absolutely miserable..... BUT now I'm FINALLY home playing with the new card. Man this thing is HUGE. Like HUGE. I had to move my reservoir and side panel in order to fit it in there. Can't wait until EK releases the waterblock for this card... but until then I'll have to deal with it. Time to sell my 980 Ti + waterblock and get some of this cash back!!!
> 
> 
> 
> 
> 
> I bumped the power limit up to 125% to get that sweet 375w limit. Didn't touch anything else and it's boosting to 2000Mhz right out of the box. I'm impressed! I set the fan at 100% and during firestrike extreme temps didn't even hit 50C which I'm surprised about since I heard the Aorus's ran a little hot.


It is beastly, isn't it? That's the first thing I thought (was always an evga guy myself) when I got this card. I even got the magnetic Master Accessory stand for the slight sag

MasterAccessory Universal VGA Holder (MCA-0005-KUH00) https://www.amazon.com/dp/B01IQCBKOI/ref=cm_sw_r_cp_api_9vS7ybKEBE9XS


----------



## Qba73

Quote:


> Originally Posted by *SlimJ87D*
> 
> Come on now guys, boost 3.0 is boosting us safely from 1500 Mhz. All it did was take away 80% of the oc work for us. But don't take it for granted. I still remem the days in hs where I had to oc using nvidia control panel and it was terrible.


Oh god I remember those days, good point


----------



## Addsome

Quote:


> Originally Posted by *Benny89*
> 
> You mean only on one bin when I set max or on all bins? So I should try to lower clocks on all bins?


Are you locking your voltage at a certain bin, like with the Ctrl+L method?


----------



## Qba73

Quote:


> Originally Posted by *SlimJ87D*
> 
> You have it on max performance mode at global level. You just want to set that for the benchmark itself.


Slim is correct; it will idle down to 278MHz, but if you open a browser it clocks back up


----------



## Addsome

Quote:


> Originally Posted by *Qba73*
> 
> Slim is correct, it will idle down to 278 but if you open a browser it clocks back up


I just set adaptive for chrome specifically and it clocks down now. I have max performance for the global tab.


----------



## Emmanuel

Just placed an order on a backordered Gigabyte Aorus Extreme 1080ti. I was just telling myself earlier today that I will wait until more reviews are out so I can pick the one with the best OC potential, and then here I am taking my credit card out! I will watercool it whenever EK releases a block for it.

I keep reading that Pascal usually tops out at around 2100MHz; do you foresee the EVGA FTW3 and Zotac AMP Extreme faring any better? What's the general consensus here on which non-reference card to get?


----------



## c0nsistent

Well I hit 10101 in Superposition 4K optimized. It seems regardless of whether I set the curve to 1000mv or just use an offset, the core bounces between 1950-2012mhz during the bench, so even 1.0v set to 2050mhz can't keep it from throttling. I might try some other BIOS just for the heck of it, but I'm really tempted to use a copper shim and the AIO cooler I have and go all out with the shunt, etc.


----------



## KedarWolf

Asus BIOS, 2075 core. +659 memory, 1.031v, is my best score yet.

I'm pretty happy I'm hitting those clocks at 1.031v.


----------



## s1rrah

Just bought an MSI Seahawk X 1080 ti from Newegg today ... looking forward to joining the club some time early next week ... wanna put these 980 Hybrids up for sale before their value drops below $12.00 each ... LOL ...


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Asus BIOS, 2075 core. +659 memory, 1.031v, is my best score yet.
> 
> I'm pretty happy I'm hitting those clocks at 1.031v.


Full water cooling must make a big difference; I'd love to have that last 13MHz. You're below 40C, right?


----------



## Stiltz85

I'm a jerk and got 3. Almost 4, but taxes alone were over $200 at that point.


----------



## Baasha

My run w/ 4x 1080 Ti in 4 Way SLI (8K Optimized):


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Stiltz85*
> 
> I'm a jerk and got 3, Almost 4 but taxes alone were over $200 at that point.


I would have just got 2 titan Xp. 3 and 4 way sli is terrible, sometimes does worse than 2 way sli.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Asus BIOS, 2075 core. +659 memory, 1.031v, is my best score yet.
> 
> I'm pretty happy I'm hitting those clocks at 1.031v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Full water cooling must make a big difference, I'd love to have that last 13 Mhz. Youre below 40C right?

I push around 45C, but only because my highly OC'd CPU, my 1080 Ti, and my dedicated PhysX card are all water cooled on the same loop with no res and one 360 rad.


----------



## Stiltz85

Quote:


> Originally Posted by *SlimJ87D*
> 
> I would have just got 2 titan Xp. 3 and 4 way sli is terrible, sometimes does worse than 2 way sli.


I never said I was keeping them, lol. I am saving my upgrade PCIe slots for something wet.

Wow that sounded wrong but I'll stick with it!

Also I bought these with store credit so I really did not have much of a choice.


----------



## HyperC

This is the hottest I have seen my card get. I noticed the core clock jumps around once it hits 40c. Unless I need to do that power curve or shunt mod?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> I push around 45C but only because I have a highly OC'd CPU, my 1080 Ti and my dedicated PhysX card also water cooled on the same loop with no rez and one 360 rad.


Does your physx card help with time spy?


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I push around 45C but only because I have a highly OC'd CPU, my 1080 Ti and my dedicated PhysX card also water cooled on the same loop with no rez and one 360 rad.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Does your physx card help with time spy?

No, it only works on PhysX games, all three of them.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *HyperC*
> 
> This is the hottest I have seen my card get , I noticed core clock jumps around once it hits 40c in less I need to do that power curve or stunt mod?


Yeah, look at all that power limiting... The shunt mod will unlock the last 3% of the card's potential. Lol.


----------



## RavageTheEarth

Quote:


> Originally Posted by *Qba73*
> 
> It is beastly isn't it, that's the first thing I thought (was always a evga guy myself) when I got this card. I event got the master accessory magnetic stand for the slight sag
> 
> MasterAccessory Universal VGA Holder (MCA-0005-KUH00) https://www.amazon.com/dp/B01IQCBKOI/ref=cm_sw_r_cp_api_9vS7ybKEBE9XS


Good idea! The waterblock and backplate should help a bit when they are released (HURRY UP EK!!!). Hey, I need to find that tutorial for setting a voltage curve on the 1080 Ti. Can't seem to find it. Can anyone link me to it? I'm still confused as to what the purpose of setting a curve is, but I'm interested. I tried +100 on the core and it crashed pretty quickly in firestrike. No issues with +50 though.


----------



## KedarWolf

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Qba73*
> 
> It is beastly isn't it, that's the first thing I thought (was always a evga guy myself) when I got this card. I event got the master accessory magnetic stand for the slight sag
> 
> MasterAccessory Universal VGA Holder (MCA-0005-KUH00) https://www.amazon.com/dp/B01IQCBKOI/ref=cm_sw_r_cp_api_9vS7ybKEBE9XS
> 
> 
> 
> Good idea! The waterblock and backplate should help a bit when they are released (HURRY UP EK!!!). Hey, I need to find that tutorial for setting a voltage curve on the 1080 Ti. Can't seem to find it. Can anyone link me to it? I'm still confused as to what the purpose of setting a curve is, but I'm interested. I tried +100 on the core and it crashed pretty quickly in firestrike. No issues with +50 though.

http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20

Video here.


----------



## RavageTheEarth

Quote:


> Originally Posted by *KedarWolf*
> 
> http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20
> 
> Video here.


Thanks!!!! +REP


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20
> 
> Video here.


You know something else I just realized? You're not running the blower fan, correct? That can give you back an extra 6W-12W.

That can be an extra 2.5-5% of TDP.


----------



## joder

I wonder if blood was able to flash the BIOS...?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *joder*
> 
> I wonder if blood was able to flash the BIOS...?


Same here, although his BIOS was twice the file size, which seems wrong.


----------



## Qba73

So I was curious to see what OC I could get with no extra volts and PL at 150%,

and I was bench- and game-stable at 2050 at 1.05 volts according to GPU-Z.

It also helped my temps stay at 63c at 60% fan speed.

think I'm happy here.

Edit: oh and +500 on mem

I can do 2088 stable but it needs the volts, which also generates more heat.

when my FTW3 comes in I will put her through her paces to see what thermals I get.


----------



## Slackaveli

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Good idea! The waterblock and backplate should help a bit when they are released (HURRY UP EK!!!). Hey, I need to find that tutorial for setting a voltage curve on the 1080 Ti. Can't seem to find it. Can anyone link me to it? I'm still confused as to what the purpose of setting a curve is, but I'm interested. I tried +100 on the core and it crashed pretty quickly in firestrike. No issues with +50 though.


on these try +64, +77, +90.
Quote:


> Originally Posted by *Qba73*
> 
> So I was curious to see oc with no volts and PL at 150% what I can get,
> 
> and I was bench and game stable at 2050 at 1.05 volts according to gpuz
> 
> also helped my temps to stay at 63c at 60% fan speed
> 
> think I'm happy here.
> 
> I can do 2088 stable but need the volts which also generates more heat
> 
> when my FTW3 comes in I will put her through her paces to see what thermals I get.


tolda ya, bro







. On these Aorus , ez oc is just no volts, custom fan curve, +51, +64, +77, or +90. If you are very very lucky, +103. that's 2100 right there. Then make 3 or 4 adjustments to a curve and lock it. Done. Bios on these things already has you squared away. If we could stay under 40c, we'd be 2088-2100 all day.


----------



## Qba73

Quote:


> Originally Posted by *Slackaveli*
> 
> tolda ya, bro
> 
> 
> 
> 
> 
> 
> 
> . On these Aorus , ez oc is just no volts, custom fan curve, +51, +64, +77, or +90. If you are very very lucky, +103. that's 2100 right there.


yup you did







thanks

if I keep the card I will do the Kryonaut to drop some C's


----------



## Slackaveli

Quote:


> Originally Posted by *Qba73*
> 
> yup you did
> 
> 
> 
> 
> 
> 
> 
> thanks
> 
> if I keep the card I will do the Kryonaut to drop some C's


I'm very happy with mine. In games it's crazy good. Going to be a heavyweight fight between that and the FTW3.


----------



## illypso

Quote:


> Originally Posted by *prelude514*
> 
> So my Aorus non-Extreme is a terrible clocker and seems to have other issues. I had traded in what I consider a golden MSI FE which would do 2000mhz at only 0.993v. That's pretty good, right?
> 
> The store still has my FE card. I've just about convinced myself to go get it back and pick up an AIO and shunt mod it. 2 questions:
> 
> 1) For the shunt mod, I'd prefer soldering a wire on each PCI-e connectors shunt. A single stand of thin aluminium speaker wire should do the trick? Or should I use copper wire?
> 
> 2) This will be the first time I put an AIO on a gpu. These things are meant for CPUs, do I need to mod the AIO at all? Are there brackets that are sold? I'm looking ideally for a 120mm thick rad AIO as that's all that my current case will accommodate. If performance is really better with a 240mm, I could make that work as well. What are the general recommendations for AIO on a 1080Ti? I want the best cooling performance. Here's where I'll be buying the AIO from most likely:
> 
> http://www.canadacomputers.com/index.php?cPath=8
> 
> Oh, and are the VRMs decent on FE cards? Haven't found any info on them.


For the wire, I used a very thin one that came from a composite cable (because those are the smallest ones I could find); it was copper, as thin as a hair, and very soft (wanted to measure it, but I can't find my caliper or the wire I used). I think common speaker wire would be too big. I used it on the 2 shunts coming from the 6-pin and the 8-pin, but not on the PCI-Express one, and it dropped my power from 120% max to 80% max without any problem. Seeing how small the wire was and how big the drop was, I think it would be easy to lower it too much if the wire is too big.
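For anyone curious why such a tiny wire moves the reading that much: the wire sits in parallel with the shunt resistor, so the controller sees a smaller voltage drop and under-reports current (and therefore power). A rough sketch of that arithmetic in Python; the 5 mOhm shunt and ~20 mOhm wire values are made-up illustrations, not measured from any card:

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def reported_power_fraction(r_shunt, r_wire):
    """Fraction of the true power the controller reports after bridging
    the shunt with a wire: reported current scales with the voltage drop
    across the shunt, which scales with the effective (parallel) resistance."""
    return parallel(r_shunt, r_wire) / r_shunt

# Illustrative values only: a 5 mOhm shunt bridged by ~20 mOhm of wire.
frac = reported_power_fraction(0.005, 0.020)
print(f"controller reports {frac:.0%} of the true draw")
```

With these example numbers the card reads 80% when it is really pulling its old 100%, which matches the "120% dropped to 80%" behavior described above; a thicker (lower-resistance) wire would drop the reading much further.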


----------



## mshagg

Quote:


> Originally Posted by *SlimJ87D*
> 
> Come on now guys, boost 3.0 is boosting us safely from 1500 Mhz. All it did was take away 80% of the oc work for us. But don't take it for granted. I still remem the days in hs where I had to oc using nvidia control panel and it was terrible.


They could ship them so they peak at 1600Mhz and we'd all be stoked with the 30% overclock lol.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *illypso*
> 
> For the wire, I used a very thin one, that came from a composite cable (because there are the smallest one I could find), it was copper and as small as a hair and very soft (wanted to measure it but I cant find my caliper or the wire I used
> 
> 
> 
> 
> 
> 
> 
> ), I think common speaker wire would be too big. I use it on the 2 shunt coming from the 6 pin and the 8 pin but not on the pciexpress one and it drop my power from 120% max to 80% max without any problem. seeing how small the wire was and how big the drop was, I think it would be easy to lower it too much if the wire is too big


Can I ask why you shunted the mobo resistor? Aren't you scared of pulling extra power from the motherboard slot?

Edit: Nvm, just read that you didn't shunt that one.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *joder*
> 
> I wonder if blood was able to flash the BIOS...?
> 
> 
> 
> Same here, although his bios was twice the file size, sounds wrong.
Click to expand...

I'll run that nvidia command when I get home on Asus BIOS. Maybe 30 minutes from now.


----------



## bloodhawk

Quote:


> Originally Posted by *joder*
> 
> I wonder if blood was able to flash the BIOS...?


Quote:


> Originally Posted by *SlimJ87D*
> 
> Same here, although his bios was twice the file size, sounds wrong.


Not yet. Might do it tonight if i get back home in time.


----------



## bluewr

How's this score?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *bluewr*
> 
> 
> 
> How's this score?


Close to average. But above stock for sure.


----------



## Stiltz85

Quote:


> Originally Posted by *SlimJ87D*
> 
> Close to average. But above stock for sure.


Pretty sure my ex told me something like that before she left me.


----------



## Slackaveli

Quote:


> Originally Posted by *bluewr*
> 
> 
> 
> How's this score?


it's good, man. not bad at all.
Quote:


> Originally Posted by *Stiltz85*
> 
> Pretty sure my ex told me something like that before she left me.


LOL!


----------



## Qba73

So I settled on 2062, with a downclock to 2050 (over 60c), for my 24/7: no volts, +54mhz, 150 PL, custom fan curve maxing at 75% (still dead quiet), hitting 1.032v-1.051v on gpu-z (she maxes out at 70c)

here is my score. game stable: Witcher 3, Battlefield 1, GTA V

not chasing 2100 unless I'm underwater with some kryonaut scuba gear lol, and that won't be until I put this aorus xtreme up against the FTW3 in early May when my preorder comes in.
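That 2062-to-2050 step over 60c is GPU Boost 3.0 shedding one clock bin as the core crosses a temperature threshold. A rough model of the behavior in Python; the threshold temperatures and the ~13 MHz bin size are assumptions based on commonly observed Pascal behavior, not a documented spec:

```python
BIN_MHZ = 13  # approximate size of one Pascal boost bin (assumption)

# Assumed temperatures where boost sheds one bin; real cards vary.
TEMP_STEPS_C = [60, 63, 68, 73, 78]

def boost_clock(cold_clock_mhz, temp_c):
    """Estimate the sustained core clock after temperature-based downbinning."""
    bins_lost = sum(1 for t in TEMP_STEPS_C if temp_c >= t)
    return cold_clock_mhz - bins_lost * BIN_MHZ

print(boost_clock(2063, 55))  # below every threshold: full clock holds
print(boost_clock(2063, 61))  # past the first threshold: one bin shed
```

This is also why water cooling (or just an aggressive fan curve) "adds" clocks: keeping the core under the first threshold means no bins get shed at all.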


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Stiltz85*
> 
> Pretty sure my ex told me something like that before she left me.


Lmao


----------



## RavageTheEarth

I cannot get over how much of an upgrade this Aorus was over my 980 Ti. I just loaded up Deus Ex Mankind Divided and maxed out every setting (except no MSAA). I loaded up the game and I was at 144hz in the Subway. Walked out in the city and I dropped down to 90-110FPS. HUGE upgrade from the 980 Ti.


----------



## Slackaveli

Quote:


> Originally Posted by *RavageTheEarth*
> 
> I cannot get over how much of an upgrade this Aorus was over my 980 Ti. I just loaded up Deus Ex Mankind Divided and maxed out every setting (except no MSAA). I loaded up the game and I was at 144hz in the Subway. Walked out in the city and I dropped down to 90-110FPS. HUGE upgrade from the 980 Ti.


Same. Almost double the fps of my 980Ti.

** So, memory scales well in this test. This is my highest mems with no artifacting, I can only get a true 100% clean mem oc of + 442, unfortunately. But this is my new high and by 100 points, in 4k. 
Quote:


> Originally Posted by *Qba73*
> 
> So I settled on 2062 to 2050 downclock (over 60c) for my 24/7, no volts, +54mhz, 150 PL, custom fan curve max at 75% (still bone quiet) hitting 1.032v 1.051v on gpu-z (she maxes out at 70c)
> 
> here is my score. game stable, witcher 3, Battlefield 1, GTA V
> 
> not chasing 2100 unless I'm underwater with some kryonaut scuba gear lol and that wont be until I put this aorus xtreme up against the FTW3 in early may when my preorder comes in.


what mem oc?


----------



## KickAssCop

What is this Superposition bench?


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I push around 45C but only because I have a highly OC'd CPU, my 1080 Ti and my dedicated PhysX card also water cooled on the same loop with no rez and one 360 rad.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Does your physx card help with time spy?
Click to expand...

TimeSpy custom run, settings maxed out, 4K, scene one only, two loops. Power Draw gets up to and exceeds 330 watts on a regular basis.

I copied and pasted the output from the command prompt into a txt file, just deleted the pauses before and between runs.









AsusBIOSPowerDraw.txt 147k .txt file


Edit: had my BIOS at 1.093v at 2088 core.


----------



## stoker

Getting some great results with undervolting.

1961-1939 @ 0.975v +500 mem


----------



## pez

Quote:


> Originally Posted by *Slackaveli*
> 
> which one? Keep it, man! You are PEZ, man.


I had 8 days left on my EVGA step up and it didn't look like a non-FE card was going to appear on there at all. In the end it's worth it for me to step up the 1080 to a Ti FE even if not to resell it. I'd lose more money trying to resell the 1080 than a potentially BNIB or very lightly used FE. And naturally if I sell my TXP (2016), I've considered SLI....this is the reason I went ITX/SFF







.

In the end, the goal is to end up with a single card that I'm happy with. The TXp is an awesome card at the end of the day, so I can attempt to put an AIO on it and probably have the best of both worlds, honestly.


----------



## Slackaveli

Quote:


> Originally Posted by *Qba73*
> 
> So I settled on 2062 to 2050 downclock (over 60c) for my 24/7, no volts, +54mhz, 150 PL, custom fan curve max at 75% (still bone quiet) hitting 1.032v 1.051v on gpu-z (she maxes out at 70c)
> 
> here is my score. game stable, witcher 3, Battlefield 1, GTA V
> 
> not chasing 2100 unless I'm underwater with some kryonaut scuba gear lol and that wont be until I put this aorus xtreme up against the FTW3 in early may when my preorder comes in.


Quote:


> Originally Posted by *KickAssCop*
> 
> What is this Superposition bench?


New Unigine Bench, it will be with us for years. It's pretty great.
Quote:


> Originally Posted by *pez*
> 
> I had 8 days left on my EVGA step up and it didn't look like a non-FE card was going to appear on there at all. In the end it's worth it for me to step up the 1080 to a Ti FE even if not to resell it. I'd lose more money trying to resell the 1080 than a potentially BNIB or very lightly used FE. And naturally if I sell my TXP (2016), I've considered SLI....this is the reason I went ITX/SFF
> 
> 
> 
> 
> 
> 
> 
> .
> 
> In the end, the goal is to end up with a single card that I'm happy with. The TXp is an awesome card at the end of the day, so I can attempt to put a AIO on it and probably have the best of both worlds, honestly.


true that. And definitely worth triggering the step-up even if just for the resell value.
Quote:


> Originally Posted by *stoker*
> 
> Getting some great results with undervolting.
> 1961-1939 @ 0.975v +500 mem


looks like it. Nice and stable and cool.

Mod edit: Stop double posting, please use the edit button.


----------



## stoker

Quote:


> Originally Posted by *Slackaveli*
> 
> looks like it. Nice and stable and cool.


Cheers, my core loses stability over 60C @ low volts and adding more just adds heat.

Can't wait for my WB to arrive and see if I can run 2000mhz @ 1.0v.

Have a medium test for you


----------



## dansi

Hey guys, how do you undervolt? Voltage control for my FE is unavailable in Afterburner


----------



## warbucks

Been playing around with the voltage curve. I managed to get 2050Mhz @ 1.050V stable with +450Mhz on the memory. Will now try lowering it to 1.043V with same clocks.

Updated Afterburner and RTSS to the latest beta versions which fixed Timespy not running for me.

Here's my Timespy score:



Here's a 1080P Extreme run of Superposition:


----------



## bloodhawk

Well, flashing the BIOS is a no-go, even with a programmer. Got Falcon punched in the nuts. The dump can be made inline, but inline flashing does not work. The SOP8 chip needs to be soldered off.

Buuuuttt, i did solder 3 x 10 ohm resistors onto the Power IC and - https://www.sendspace.com/file/i4ko7y (GPU-Z Log, during Time Spy) (Using the ASUS BIOS, will run tests tomorrow using the Stock BIOS)
http://www.3dmark.com/3dm/19239127


----------



## pez

I got the 'Creator's Update' pushed to my system. I recall there previously being an issue where performance was spotty or decreased with GameDVR turned on in the Xbox app. I wanted to remind you guys to turn that off again, as mine was re-enabled after the update.


----------



## nonnac7

Here is my Aorus 1080ti Xtreme - Superposition @ 4k and Timespy run. 2075 core that eventually goes down to 2050 due to heat and 6014 for memory. 6850k @ 4.2GHz.



Been using these clocks to play witcher 3 for the past 4 or 5 days with like 4+ hours sessions. Pretty happy overall.

Edit: forgot to mention my card lol


----------



## KedarWolf

Quote:


> Originally Posted by *dansi*
> 
> Hey guys how you make undervolt? My voltage control for FE is unavailable in afterburner


http://www.overclock.net/t/1625653/how-to-get-voltage-slider-in-afterburner-working-on-a-1080-ti/0_20


----------



## lilchronic

Quote:


> Originally Posted by *KedarWolf*
> 
> http://www.overclock.net/t/1625653/how-to-get-voltage-slider-in-afterburner-working-on-a-1080-ti/0_20


There is MSI Afterburner 4.4.0 beta 6, but I can't seem to find the download link.


----------



## pez

Nice, finally managed to pick up a STRIX OC. A card that will finally fit into my case







.


----------



## KedarWolf

Quote:


> Originally Posted by *lilchronic*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> http://www.overclock.net/t/1625653/how-to-get-voltage-slider-in-afterburner-working-on-a-1080-ti/0_20
> 
> 
> 
> There is MSi afterburner 4.4.0 beta 6 but i can't seem to find the download link.
Click to expand...

http://forums.guru3d.com/showpost.php?p=5412373&postcount=216


----------



## Nico67

Quote:


> Originally Posted by *KedarWolf*
> 
> TimeSpy custom run, settings maxed out, 4K, scene one only, two loops. Power Draw gets up to and exceeds 330 watts on a regular basis.
> 
> I copied and pasted the output from the command prompt into a txt file, just deleted the pauses before and between runs.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: had my BIOS at 1.093v at 2088 core.


Interesting, what did AB or GPU-Z report max power as during that? maybe power reading is just incorrect with the ASUS bios on AB etc?


----------



## KedarWolf

Quote:


> Originally Posted by *Nico67*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> TimeSpy custom run, settings maxed out, 4K, scene one only, two loops. Power Draw gets up to and exceeds 330 watts on a regular basis.
> 
> I copied and pasted the output from the command prompt into a txt file, just deleted the pauses before and between runs.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: had my BIOS at 1.093v at 2088 core.
> 
> 
> 
> Interesting, what did AB or GPU-Z report max power as during that? maybe power reading is just incorrect with the ASUS bios on AB etc?
Click to expand...

I used "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power -l 1 in a command prompt; neglected to say that.
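For anyone wanting to summarize one of those saved logs afterwards: the `-q -d power` output prints a "Power Draw" line each polling interval, and the peaks and averages are easy to pull out of the text. A small Python sketch that parses that plain-text format; the sample lines below are fabricated for illustration, not from a real log:

```python
import re

def power_draws(log_text):
    """Extract 'Power Draw : NNN.NN W' readings (in watts) from
    nvidia-smi -q -d power output saved to a text file."""
    return [float(w) for w in re.findall(r"Power Draw\s*:\s*([\d.]+)\s*W", log_text)]

# Fabricated sample for illustration only.
sample = """
    Power Draw                  : 295.10 W
    Power Limit                 : 330.00 W
    Power Draw                  : 312.45 W
    Power Draw                  : 330.72 W
"""
draws = power_draws(sample)
print(f"max {max(draws):.1f} W, avg {sum(draws) / len(draws):.1f} W")
```

Note the regex deliberately skips the "Power Limit" lines, so only the actual per-second draw samples are counted.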


----------



## Nico67

Quote:


> Originally Posted by *KedarWolf*
> 
> i used "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power -l 1 in a command prompt, neglected to say that.


Yeah, I had a look at that and even the averages were around 308-310w, so it would seem you are pulling a lot more power on the ASUS bios. If AB still reports a max around 111%, then it's possible it is just calculating wrong due to the different bios. Particularly if AB is reporting power limiting at 111% and NVSMI shows 330 peaking?


----------



## Benny89

Quote:


> Originally Posted by *SlimJ87D*
> 
> You have it on max performance mode at global level. You just want to set that for the benchmark itself.


No, I did not. I checked. I had to uninstall MSI Afterburner and Heaven, and then it downclocked itself properly at idle again.


----------



## Benny89

Quote:


> Originally Posted by *Addsome*
> 
> Are you locking your voltage at a certain bin? Like the cntrl+L method?


No, I don't lock it at a certain bin. Should I? I just set max clocks for each voltage bin and let it run.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Nico67*
> 
> Yeah I had a look at that and even the averages were around 308-310w, so It would seem you are pulling a lot more power on the ASUS bios. If AB still reports a max around 111% then its possible that it is just calculating wrong due to the different bios. Particularly if AB is report power limiting at 111% and NVSMI shows 330 peaking?


KedarWolf is fully water cooled so he also doesn't have the led and blower fan running which gives him some extra tdp too. I believe the fan is a 1 Amp rated one which can push 12w at 100% =/, it is spinning like at 5,000 rpms after all.

It's kinda sad, but every watt counts with these FE cards, though only when it comes to the Superposition benchmark. This benchmark is like a stress test for sure, not representative of real-life gaming.


----------



## gingerbreadman

Does anyone have this problem? I have an MSI Gaming X. My power slider is at 117%, but under load it only hits about 100%.

Anyone know why?


----------



## Benny89

It seems I got this curve all wrong, and that is why my scores and perf are off. I should have started with a low bin, like 1.000 V or 1.031 V, found the max core clock it can hold, then worked my way up until I found the lowest bin/highest core clock combo, and only then started running benches, correct?

Because I started by finding the max clock for 1.093V and putting the rest in a straight line. I think that's where the problem lies: I'm running too high a voltage for too high a clock, or something....

Once I find the lowest bin with max clocks - should I lock it for benches or let it run?

Can you people with air AIBs post your voltage curves and say what bin you're running at?

Maybe I should return the STRIX and grab an Aorus Xtreme... it seems the higher PL doesn't so much help max OC as allow a simple OC in AB without all these curves...
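For what it's worth, the curve method people describe here can be modeled simply: pick a voltage bin, raise that point to the target clock, and flatten every higher-voltage point to the same clock so boost never requests more voltage. A Python sketch of that flattening; the curve points below are invented examples, and this only models the idea, it does not talk to Afterburner:

```python
def lock_curve(curve, lock_v, lock_clock_mhz):
    """Return a new V/F curve locked at one voltage bin.

    Mirrors the usual Afterburner method: the chosen voltage point is raised
    to the target clock, every higher-voltage point is flattened to the same
    clock, and lower-voltage points are left alone (capped just in case)."""
    return [
        (v, lock_clock_mhz if v >= lock_v else min(clk, lock_clock_mhz))
        for v, clk in curve
    ]

# Illustrative stock-ish curve as (volts, MHz) pairs; not a real dump.
stock = [(0.950, 1911), (1.000, 1949), (1.031, 1987), (1.062, 2012), (1.093, 2025)]
locked = lock_curve(stock, 1.031, 2050)
print(locked)
```

The payoff of locking at a low bin is that the card holds the same clock at less voltage, which means less heat and less power limit throttling, which is exactly why a 2050 run at 1.031v can outscore a "stable" 2100 run at 1.093v.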


----------



## KedarWolf

Quote:


> Originally Posted by *Nico67*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> i used "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power -l 1 in a command prompt, neglected to say that.
> 
> 
> 
> Yeah I had a look at that and even the averages were around 308-310w, so It would seem you are pulling a lot more power on the ASUS bios. If AB still reports a max around 111% then its possible that it is just calculating wrong due to the different bios. Particularly if AB is report power limiting at 111% and NVSMI shows 330 peaking?
Click to expand...

Power Limit is definitely reporting right. If you check the three logs, all at one-second intervals, the power limits hit match the draw reached. It appears the Asus BIOS does raise the power limit to 330 watts.









PowerDraw.txt 289k .txt file


GPU-ZSensorLog.txt 32k .txt file


HardwareMonitoring.zip 2k .zip file


This is my custom TimeSpy settings so hitting a much higher Power Limit than 111% that I get in Heaven or Furmark.



Edit: Could it be the BIOS reports the Power Limits AND voltages wrong?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Power Limit definitely reporting right. if you check the three logs, all at a second timing each, the Power Limits hit equal the voltages reached. It appears Asus BIOS does increase the voltage to 330 watts.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PowerDraw.txt 289k .txt file
> 
> 
> GPU-ZSensorLog.txt 32k .txt file
> 
> 
> HardwareMonitoring.zip 2k .zip file
> 
> 
> This is my custom TimeSpy settings so hitting a much higher Power Limit than 111% that I get in Heaven or Furmark.
> 
> 
> 
> Edit: Could it be the BIOS reports the Power Limits AND voltages wrong?


Are you sure you're not measuring an initial spike? I posted an image in here of my stock BIOS running at 330 watts for a bit.

But I did do a measurement with my PSU and the nvidia command in real time when I saw it hit 111% power draw on Asus and 119%-122% on stock.

Stock was 305 and Asus was 310. The difference was consistent with what my PSU was showing as a system draw.

Your situation is also a bit different from non-water stock FE users, I believe, since you're not running the blower and LED fan; you're saving anywhere from 5 to 12 watts by not running them. That can give you an extra 3%-5% of TDP if true.

Stock bios hitting 330 watts occasionally:

http://www.overclock.net/content/type/61/id/3007995/

If only we could modify the bios and have it read "Enforced Power Limit 350 Watts" we'd be golden.

I don't know how this nvidia command works; we should get someone with a shunt mod to do a reading. If they're at 80% (which is really 120% for stock) and it only reads 200 watts being drawn, then the nvidia command is just multiplying what it reads as its minimum power draw by the percentage. But if it reads 350+ watts, then we know it's actually reading what is being drawn.

Can someone with a shunt mod run the nvidia command with furmark?


----------



## mcg75

Quote:


> Originally Posted by *gingerbreadman*
> 
> Anyone has this problem? I have a MSI Gaming X. My power slider is at 117%, but when i load it, it only hits about 100%
> 
> Anyone knows why?


Setting it to 117% doesn't mean it's going to use 117%.

It means you allow the card to use 117% IF the card needs it.


----------



## Norlig

Quote:


> Originally Posted by *Slackaveli*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Norlig*
> 
> I received my Copper shims yesterday, 25mmx25mmx1.5mm.
> 
> I dont get why I didnt double check those meassurements, as they are just enough to cover the die of the GPU itself.
> 
> Thats dangerous isnt it?
> 
> Shouldn't it also rest on the silver edges, as to divide the preassure?
> 
> Anyway, I did test, and my Temps rose to >90'C and I got Driver crash, so I must've mounted it skewered or something.
> Works fine when I put the normal cooler on again.
> 
> a 45mmx45mmx1.5mm Copper shim would be perfect..
> 
> 
> 
> unlucky. i'd be very careful when tightening that down.
Click to expand...

Quote:


> Originally Posted by *SlimJ87D*
> 
> No that's fine. That's what I'm using


My Grandpa had a copper plate lying around, it was only 1mm though, so I might have to use 2.

Cut them to 45mmx45mm - ish. and sanded them down with 3000grit

will mount it later today.


----------



## Qba73

Quote:


> Originally Posted by *Slackaveli*
> 
> Same. Almost double the fps of my 980Ti.
> 
> ** So, memory scales well in this test. This is my highest mems with no artifacting, I can only get a true 100% clean mem oc of + 442, unfortunately. But this is my new high and by 100 points, in 4k.
> what mem oc?


I settled on +450; turns out it's not stable at 500


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> tolda ya, bro
> 
> 
> 
> 
> 
> 
> 
> . On these Aorus , ez oc is just no volts, custom fan curve, +51, +64, +77, or +90. If you are very very lucky, +103. that's 2100 right there. Then make 3 or 4 adjustments to a curve and lock it. Done. Bios on these things already has you squared away. If we could stay under 40c, we'd be 2088-2100 all day.


Can you post your curve? So you have found max clocks for lowest bin and you lock it for 24/7?

I still have no clue if I am getting how to do this curve right....


----------



## Qba73

Quote:


> Originally Posted by *pez*
> 
> I got the 'Creator's Update' pushed to my system. I recall there previously being an issue where performance was spotty or decreased with GameDVR turned on in the Xbox app. I wanted to remind you guys to re-turn that off as mine was enabled again after the update.


Yup first thing I looked for after I upped to creators, that and the privacy settings ;-)


----------



## BucketInABucket

Does anyone know what the name of this connector is? I've been wracking my head for the past few days trying to find it to no avail.


Spoiler: Pictures!













EDIT: Just found it, it's an XH-2P connector.


----------



## D13mass

Do we have something like an OC manual for the 1080 Ti?

I just increased core +130 and memory +250 and got 1946 MHz and 11200 MHz with stable gaming, but I dislike the noise and am waiting for my water block.


----------



## Nico67

Quote:


> Originally Posted by *KedarWolf*
> 
> Power Limit definitely reporting right. if you check the three logs, all at a second timing each, the Power Limits hit equal the voltages reached. It appears Asus BIOS does increase the voltage to 330 watts.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This is my custom TimeSpy settings so hitting a much higher Power Limit than 111% that I get in Heaven or Furmark.
> 
> Edit: Could it be the BIOS reports the Power Limits AND voltages wrong?


Looking at those logs, it definitely looks like AB is reporting 119 max, and GPU-Z even went to 121, so it would seem that ties up with NVSMI. It does power limit on AB a bit more than I thought it might, but it might go higher than 119 due to polling intervals. Maybe, as you say, most tests don't stress it past 300w, so we didn't see higher than 111, and we're all running full voltage and similar clocks, so that would make sense.

Quote:


> Originally Posted by *SlimJ87D*
> 
> Are you sure you're not measuring an initial spike? I posted an image in here of my stock BIOS running at 330 watts for a bit.
> 
> But I did do a measurement with my PSU and the nvidia command in real time when I saw it hit 111% power draw on Asus and 119%-122% on stock.
> 
> Stock was 305 and Asus was 310. The difference was consistent with what my PSU was showing as a system draw.
> 
> Your situation is a bit different also from non water stock FE users I believe as you're not running the blower and led fan as you're saving anywhere from 5 to 12 watts not running them. That can give you an extra 3%-5% on tdp if true.
> 
> If only we could modify the bios and have it read "Enforced Power Limit 350 Watts" we'd be golden.
> 
> I don't know how this nvidia command works, we should get someone doing a shunt mod to do a reading because if they're at 80%, which is really 120% for stock, and if it only reads 200 watts being drawn, then the nvidia command is only multiplying what it reads as its minimum power draw by percentage. But if it reads 350+ watts then we know it's actually reading what is actually being drawn.
> 
> Can someone with a shunt mod run the nvidia command with furmark?


119-122 on stock is 300w and 111 on asus is 300w, so that's about right. 120 on asus is 330w, which is something I hadn't seen til now; 350w would be nice, but for extreme benches only. I would still say the asus is a little lower in performance, and given the performance scaling isn't that great, the FE bios running cleaner at a bin or two lower than the asus should perform on par. Probably better for general gaming that way at least.


----------



## Nico67

On a separate note, I did have gaming stability issues on the FE bios, so I reset the curve (ctrl+d) and just raised the voltage point I wanted to run at and hit apply; this also made all the following voltages flat from that point. That made it totally stable when previously it wasn't, at least coming from a curve that was core-clock raised and then curve tuned. May help others, I don't know, but worth mentioning.


----------



## FuriousReload

Slackaveli, you were right. On Heaven and Valley I could run 2151 stable the whole run; even in TimeSpy I would see it alternate between 2151 and 2126. I have to run it at 2101 MHz due to power limits. I got the screenshots at home, didn't have time to upload, but on 4K Optimized I got 10,350ish on all three runs. Still happy with 2101 even, but that load is way higher than other benchmarks. Sad I can't use the SuPo stress test in windowed mode while clicking elsewhere on the screen; it closes SuPo out.

I bought some heatsinks with double sided thermal tape on Amazon, I will be putting those on the base plate around the Hybrid pump, just to add some cooling back to VRMs and Memory. Also going to wire the pump to the motherboard WPUMP header, to let the GPU free up some more wattage.


----------



## Benny89

I think I will return my STRIX and wait for the AORUS XTREME to finally be available in my country. A few reasons for it:

1. You can't repaste the STRIX without voiding the warranty (stupid stickers). I have bought Kryonaut, so I know I can get an extra 4-5C by repasting.
2. The higher Power Limit really helps in basic OC and on benches.
3. Cosmetic reason. The side ROG RGB logo is too small and covered by my sleeved cables anyway, and I like RGB to be exposed








4. I have nothing exciting to play on PC right now apart from the DS3 DLC (which is capped at 60 fps) and BF1, where I already get 100fps with my 980 Ti.

I also want to see all AIBs released.


----------



## Benny89

Quote:


> Originally Posted by *FuriousReload*
> 
> Slackaveli, you were right. On Heaven and Valley I could run 2151 stable the whole run, even TimeSpy I would see it do 2151 and 2126 alternating. I have to run it at 2101 MHz due to power limits. I got the screenshots at home, didn't have time to upload, but 4K Optimized I got 10,350ish on all three runs. Still happy with 2101 even, but that load is way higher than other benchmarks. Sad I cant use SuPo stress in windows while clicking else where on the screen, it closes SuPo out.
> 
> I bought some heatsinks with double sided thermal tape on Amazon, I will be putting those on the base plate around the Hybrid pump, just to add some cooling back to VRMs and Memory. Also going to wire the pump to the motherboard WPUMP header, to let the GPU free up some more wattage.


Your score seems too low for 2101 on the core. People here were hitting 10.4k with 2050 on the core. You may be running too high a clock for your bin: it's stable, but performance is no better. I have the same issue with my max clock.


----------



## FuriousReload

Quote:


> Originally Posted by *Benny89*
> 
> Your score seems too low for 2101 core. People were hitting here 10,4k with 2050 on core. You may run too high clocks for your bin. It is stable but performance is not better. I have same issue with my max clock.


I don't have the same CPU they have either.


----------



## Benny89

Quote:


> Originally Posted by *FuriousReload*
> 
> I don't have the same CPU they have either.


True, but you may want to try a stable run with, let's say, 2063 on the core at a lower bin. Just compare and see if there is a difference.

Which GPU do you have btw?


----------



## FuriousReload

Quote:


> Originally Posted by *Benny89*
> 
> True but you may want to try a stable run with lets say 2063 on core and lower bin. Just compare if there is a difference.
> 
> Which GPU do you have btw?


I did some at 2076 and they were lower, by about 80 points I want to say?

I have the 1080 Ti FE from the NVidia store, no AIB.


----------



## AsusFan30




----------



## stoker

Quote:


> Originally Posted by *AsusFan30*


Nice looking Rig


----------



## DerComissar

Quote:


> Originally Posted by *D13mass*
> 
> Do we have something like OC manual for 1080ti?
> 
> I just increased core +130 and memory +250 and got 1946 and 11200 Mhz with stable gaming but I`m dislike this noise and waitng water block.


http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps#post_25977285


----------



## DStealth

Just received my MSI 1080 Ti Gaming X card... seems nice at a glance with stock cooling: ~2100/12k out of the box. Will retest these days; FS GPU score is close to 32k.










Edit: Just hit the 117% power limit with these settings...anyway the score looks good to me, 11136 GPU.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Norlig*
> 
> My Grandpa had a copper plate lying around, it was only 1mm though, so I might have to use 2.
> 
> Cut them to 45mmx45mm - ish. and sanded them down with 3000grit
> 
> will mount it later today.


1mm will not work, you need 1.5mm or thicker.


----------



## kevindd992002

Quote:


> Originally Posted by *c0ld*
> 
> Then you got a reason to upgrade go for it


Isn't there always?








Quote:


> Originally Posted by *Slackaveli*
> 
> I think so, lol. It needed more than "SP" and "Superposition" is a mouthful. So , SuPo it is.
> Agreed, and apparently i am in the minority on this (lol, I'm shocked) but I think the 7740k will be a nice chip. It could clock over 5.5ghz with that extra power headroom and process improvements should net at least a 4770k vs Devil's Canyon situation, and that isn't really even accounting for the increased overclock headroom more power and a way better platform will bring. I am unsure how they will TIM the thing but if it is like all the other HEDT than it will crush a 7700k so to speak (and even if it's craptastic reggie TIM like a 7700k, a delid would do the trick), as well as leaving several future upgrade paths. If I were building now I'd wait, and try to save as much dough as I could in case I decided to go Skylake-X, which will be nasty, too.
> 
> But even if I'm trippin, who knows how long z270 will be viable. There won't be a process shrink with ingenuitive caches coming as a magic life extender to z270 like z97 got with the Broadwell late cycle, that I can guarantee. Not that it will become crap overnight or anything, it's a great platform but we are talking about a fresh build scenario with two processor lines and a new platform about to drop. My advice is wait, unless you have a z97 platform, and then you'd be CRAZY to not sell that 4690k/4790k and grab a 5775c.


Is the 7740K one of the mainstream CPUs, though? When is its target release date?


----------



## outofmyheadyo

Has anyone actually got a Founders 1080 Ti without coil whine? I am trying to decide if I should grab one and hope for the best, or sell my entire watercooling setup and get an MSI Gaming X or something, since coil whine defeats the purpose of watercooling.
I know the Gaming X might whine as well, but with the money from the watercooling I wouldn't have to pay anything extra for the card. I find it ridiculous that we have to pay 800€ for a graphics card and then have to put up with a coil-whining piece of crap...


----------



## dunbheagan

I ordered the EK waterblock for the 1080 Ti directly on release with express shipping to Germany, and got it today. I think it looks absolutely awesome. The letters on top are silver, not white like I thought, which makes the look even better. The single-slot bracket has a nice dark gunmetal finish. Here are a few impressions:


----------



## chef1702

That's just funny now. I wanted to post the same. Got mine an hour ago. No express shipping









Easter is saved...


----------



## TheParisHeaton

Haha, Easter is saved.









This single-slot bracket with the new terminal looks sexy.


----------



## dunbheagan

I have to disappoint everybody who hoped the terminal cover is LED-lit. It is not, and there is absolutely no way to insert an LED yourself. There is no space in there, and the terminal can't be disassembled. Funny that the manual has a section "connecting the terminal cover LED". That is surely not for the actual terminal; either the section is outdated or EK already plans another terminal with an LED.


----------



## Slackaveli

Quote:


> Originally Posted by *nonnac7*
> 
> Here is my Aorus 1080ti Xtreme - Superposition @ 4k and Timespy run. 2075 core that eventually goes down to 2050 due to heat and 6014 for memory. 6850k @ 4.2GHz.
> 
> 
> 
> Been using these clocks to play witcher 3 for the past 4 or 5 days with like 4+ hours sessions. Pretty happy overall.
> 
> Edit: forgot to mention my card lol


nice, man, almost exactly like mine. I keep matching or occasionally beating all the Xtremes I see, which makes my reggie Aorus very happy!
Quote:


> Originally Posted by *Benny89*
> 
> No, I did not. I checked. I had to uninstall MSI Afterburner and Heaven, and then it downclocked itself properly on idle again.


when I get that glitch I switch primary monitors and reboot and it resets. It's very annoying; it's the old 144Hz glitch.
Quote:


> Originally Posted by *Norlig*
> 
> My Grandpa had a copper plate lying around, it was only 1mm though, so I might have to use 2.
> 
> Cut them to 45mmx45mm - ish. and sanded them down with 3000grit
> 
> will mount it later today.


those are actually gorgeous. I miss my ole granpaw....
Quote:


> Originally Posted by *Benny89*
> 
> Can you post your curve? So you have found max clocks for lowest bin and you lock it for 24/7?
> 
> I still have no clue if I am getting how to do this curve right....


just woke up, give me a bit. Basically I determined that I crash at +90, but I can hold +90 mostly everywhere except at that 1.043v mark, so I run that mark at +77, the others (over 1v) at +90, and the ones BELOW 1v at +77 (in case more are unstable like 1.043v is), and I flatten the line at 2088 @ 1.075v and lock it there.
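In case it helps picture the idea, here's a rough sketch of that flattened curve in Python. All voltage points, stock clocks, and offsets below are made-up illustrative numbers, not anyone's actual curve; the real editing happens in Afterburner's curve editor (Ctrl+F).

```python
# Sketch of the flattened voltage/frequency curve described above.
# Voltage points and stock clocks are illustrative, not real readings.

def build_curve(points, cap_mhz=2088, cap_v=1.075):
    """points: list of (voltage, stock_mhz). Returns (voltage, target_mhz)."""
    curve = []
    for v, stock in points:
        if v < 1.000:
            offset = 77          # conservative below 1.000 V
        elif abs(v - 1.043) < 1e-9:
            offset = 77          # the one unstable bin gets the lower offset
        else:
            offset = 90          # stable everywhere else at/over 1.000 V
        target = min(stock + offset, cap_mhz)
        if v > cap_v:            # flatten the line past the voltage cap
            target = cap_mhz
        curve.append((v, target))
    return curve

stock_points = [(0.950, 1900), (1.000, 1950), (1.043, 1990), (1.075, 2000), (1.093, 2025)]
for v, mhz in build_curve(stock_points):
    print(f"{v:.3f} V -> {mhz} MHz")
```

Everything at and past 1.075 V ends up pinned at 2088 MHz, which is what "flatten and lock" means in practice.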


----------



## KingEngineRevUp

My submission for the 1.000 Volt Challenge:

Ran mostly at 2063 MHz with power limiting


----------



## Slackaveli

Quote:


> Originally Posted by *FuriousReload*
> 
> Slackaveli, you were right. On Heaven and Valley I could run 2151 stable the whole run, even TimeSpy I would see it do 2151 and 2126 alternating. But SuPo is a *****, I have to run it at 2101 MHz due to power limits. I got the screenshots at home, didn't have time to upload, but 4K Optimized I got 10,350ish on all three runs. Still happy with 2101 even, but that load is way higher than other benchmarks. Sad I cant use SuPo stress in windows while clicking else where on the screen, it closes SuPo out.
> 
> I bought some heatsinks with double sided thermal tape on Amazon, I will be putting those on the base plate around the Hybrid pump, just to add some cooling back to VRMs and Memory. Also going to wire the pump to the motherboard WPUMP header, to let the GPU free up some more wattage.


yeah, I am a bit annoyed that I can't pause during the SuPo bench to tune. That's a good idea; before I sold the Founders I had two 100mm x 50mm x 15mm finned heatsinks with the high-grade Fujipoly pads right over the GPU/VRAM on the backplate, and they got pretty hot, so they were doing something.
Quote:


> Originally Posted by *FuriousReload*
> 
> I don't have the same CPU they have either.


score looks about right to me. I have to have a locked 2076 curve and max on my mems to beat it, and I only run about 10,400 maxed out myself.
Quote:


> Originally Posted by *AsusFan30*


damn, that's a lot of beef! That looks very dope with the dragon sli HB connector. sick!
Quote:


> Originally Posted by *kevindd992002*
> 
> Isn't there always?
> 
> 
> 
> 
> 
> 
> 
> 
> Is 7740K one of the mainstream CPU's though? When is its target release date?


yeah, it's the best 4/8 i7. It will OC to about 5.5GHz+. Unveiling at Computex in exactly 2 months, release shortly after.
Quote:


> Originally Posted by *outofmyheadyo*
> 
> Has anyone actually got a Founders 1080 Ti without coil whine? I am trying to decide if I should grab one and hope for the best, or sell my entire watercooling setup and get an MSI Gaming X or something, since coil whine defeats the purpose of watercooling.
> I know the Gaming X might whine as well, but with the money from the watercooling I wouldn't have to pay anything extra for the card. I find it ridiculous that we have to pay 800€ for a graphics card and then have to put up with a coil-whining piece of crap...


haven't heard a single whine out of either my FE or my Aorus.


----------



## joder

Quote:


> Originally Posted by *chef1702*
> 
> Thats just funny now. I wanted to post the same. Got mine a hour ago. No express shipping
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Easter is saved...


Mine just got here as well (US) and I did standard shipping. Too bad I won't be able to play with it until next week as I'm tied up until then.


----------



## Benny89

I returned my STRIX 1080 Ti today and got cash back. I am waiting for other AIBs to start showing up here. Back to my 980 Ti for a moment (at least no stupid voltage curve...)

I will probably go for the AORUS Xtreme now, or another high power limit card.

Any info on what power limit the EVGA FTW3 has?


----------



## KingEngineRevUp

Tried to do a 1.000V challenge run on the ASUS BIOS and it just ran like garbage:



For reference, stock:


----------



## Norlig

Update on my Hybrid mount.

Managed to mount it "properly" now. I don't get driver crashes due to temperatures, but they still seem a bit high, reaching 67°C after 20 minutes in Heaven.
Will try to look for a single shim that is 45mm x 45mm x 1.5mm, instead of 2 shims.

Had to file away a portion of the corner on the mount, as it was hitting the stock shroud and would not lay flat on the GPU.


















Filed away here


----------



## outofmyheadyo

Who made your Founders? On my way to the store now. Pretty set on the Gigabyte model; they also have the Asus Founders, sadly no EVGA, so it's either Gigabyte or Asus. I hear Gigabyte doesn't mind if you remove the cooler and slap a block on it.


----------



## Norlig

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Who made ur founders? Omw to the store now, pretty set on the gigabyte model they also have asus founders sadly no EVGA so either gigabyte or asus, I hear gigabyte doesnt mind if remove the cooler and slap a block on it.


Mine is made by Asus, but that warranty sticker on the screw is bogus anyway, at least in Norway.

No sticker can make you lose the warranty if it's there to "stop" you from opening your product. (In Norway.)

On my Asus card the sticker didn't even say anything; it was just a red circle with no text.

If you break the card when you take it apart, though, it's on you, but they need to prove it died due to your mistake. (In Norway.)

A removed sticker and a cracked GPU is usually enough proof, though.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Norlig*
> 
> Update on my Hybrid mount,
> 
> Managed to mount it "properly" now, I dont get driver crashes due to temperatures, but They seem a bit high still, reaching 67'C after 20 minutes in Heaven.
> Will try to look for a single Shim that is 45mmx45mmx1.5 , instead of 2 shims.
> 
> Had to file away a portion of the corner on the mount, as it was hitting the stock shroud and would not lay flat on the gpu.
> 
> Filed away here


Two shims are bad; too much contact resistance. I keep trying to tell you to use a 1.5mm to 2mm shim...

You need a solid shim; do the spread method and apply as little as possible. APPLY AS LITTLE AS POSSIBLE, DAMN IT! I can't stress that enough.

I'm getting 45C stock and 48C OC.



When I do the shunt mod next week, I'm going to use liquid metal instead because having the shim with two sides of thermal paste isn't as efficient.


----------



## AsterFenix

I saw the Gigabyte 1080 Ti Gaming OC today on Newegg. Can someone confirm if this is going to be a reference board? I want to add it to my water loop but retain a decent air cooler in case I want to pass it to my brother in the future.

https://www.newegg.com/Product/Product.aspx?Item=N82E16814125955&ignorebbr=1&nm_mc=KNC-GoogleAdwords-PC&cm_mmc=KNC-GoogleAdwords-PC-_-pla-_-Video+Card+-+Nvidia-_-N82E16814125955&gclid=Cj0KEQjww7zHBRCToPSj_c_WjZIBEiQAj8il5KGEoeMpShBb5gXnrlNUrMHptr3_9FMekRYIYAOoVU8aAqrS8P8HAQ&gclsrc=aw.ds


----------



## KingEngineRevUp

Quote:


> Originally Posted by *AsterFenix*
> 
> I saw the 1080 TI Gigabyte gaming OC today on Newegg. Can someone confirm if this is going to be Reference Board? I want to add it to my Water loop but retain a decent air cooler in case I want to pass it to my brother in the future.
> 
> https://www.newegg.com/Product/Product.aspx?Item=N82E16814125955&ignorebbr=1&nm_mc=KNC-GoogleAdwords-PC&cm_mmc=KNC-GoogleAdwords-PC-_-pla-_-Video+Card+-+Nvidia-_-N82E16814125955&gclid=Cj0KEQjww7zHBRCToPSj_c_WjZIBEiQAj8il5KGEoeMpShBb5gXnrlNUrMHptr3_9FMekRYIYAOoVU8aAqrS8P8HAQ&gclsrc=aw.ds


Compare photos of the bottom of the card to a FE one. See if they look the same component wise.


----------



## Hackslash

would like to share something




can strix owners reproduce this?

from 1-14% it is totally bugged, and I have this most of the time on my desktop while surfing and watching YouTube. It's really annoying.


----------



## kevindd992002

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, it's the best 4/8 i-7. Will OC to about 5.5Ghz+. Unveiling at Computex in exactly 2 months, release shortly after.


Any forecast yet as to how much it would approximately cost?


----------



## Norlig

Quote:


> Originally Posted by *SlimJ87D*
> 
> 2 Shims is bad, too much contact resistance. I keep trying to tell you to use a 1.5mm to 2mm shim...
> 
> You need a solid shim and do the spread method and apply as little as possible. APPLY AS LITTLE AS POSSIBLE DAMN IT! I can't stress that enough.
> 
> I'm getting 45C stock and 48C OC.
> 
> 
> 
> When I do the shunt mod next week, I'm going to use liquid metal instead because having the shim with two sides of thermal paste isn't as efficient.


Yeah, I am pretty sure I have used too much TIM between both the copper shims, on the GPU and on the AIO cooler.









I don't want to risk cracking my GPU die, though, so I will use these for now and will actively look for a single shim that is 45mm x 45mm x 1.5-2mm.
Then I'll do the spread method on that.

Also thought about what you are doing with the small heatsinks, but I only have thermal glue, not thermal tape.


----------



## Slackaveli

Quote:


> Originally Posted by *kevindd992002*
> 
> Any forecast yet as to how much it would approximately cost?


no, but it'll be $300-350 I'm sure, probably $330.

btw, I can get my 5775C to 4.4GHz, and at that frequency it beats the 7700K's single core score at 5.0GHz


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Norlig*
> 
> Yeh, I am pretty sure I have used too much TIM between both the copper shims, on the GPU and on the AIO cooler
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I dont want to risk cracking my gpu die though, so I will use these for now and will actively look for a single shim that is 45mmx45mmx1.5-2mm
> Then do the spread method on that
> 
> Also thought about what you are doing with the small heatsinks, but I only got thermal glue, not thermal tape.


Quote:


> Originally Posted by *Hackslash*
> 
> would like to share something
> 
> 
> 
> 
> can strix owners reproduce this?


If I were you I wouldn't worry about it; the fan starts spinning at 25%.

You really don't need 45x45mm; the heat transfer is cone-shaped. If the die is 25mm x 25mm, then it will transfer to 24mm x 24mm, for example. So the thinner the shim the better, as too thick a shim will transfer to only 22mm x 22mm. Those numbers are obviously made up for the example. I did a heat analysis a few weeks ago with some multi-thousand-dollar software and verified it with some hand calculations.

What you should focus on more is reinforcing the back of the die. You're not going to crack your GPU die if you just gently grip and turn the screwdriver until it slips out of your fingers. I've done this mod 4 times already.

Anyways, good luck.

Edit: if you're turning it with a screwdriver, turn it on the metal shaft itself so you get a better feel for how much torque you're applying. If you turn the handle itself, you're roughly doubling the applied torque.

Turn gently with your fingers on the skinny metal shaft closest to the screw.

And lastly, it's mandatory to get this pressure right, because if you don't you can gain 10C from poor pressure.
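For what it's worth, the one-shim-vs-two argument can be roughed out with a plain 1-D conduction estimate: each extra TIM joint adds far more thermal resistance than a millimeter of copper. Every number below (contact area, TIM layer thickness, conductivities) is an assumed illustrative value, not a measurement from anyone's card.

```python
# Back-of-envelope conduction estimate for the copper shim stack discussed above.
# Illustrative numbers only: area, TIM thickness, and conductivities are assumptions.

K_COPPER = 400.0     # W/(m*K), copper
K_TIM = 5.0          # W/(m*K), decent thermal paste
AREA = 0.020 * 0.020   # assumed 20 mm x 20 mm contact patch, in m^2
TIM_LAYER = 0.05e-3    # assumed ~50 um paste layer per joint

def resistance(thickness_m, k):
    """1-D conduction resistance R = t / (k * A), in K/W."""
    return thickness_m / (k * AREA)

def stack(shims_mm):
    """Total R for die -> shim(s) -> cold plate, one TIM joint per interface."""
    joints = len(shims_mm) + 1   # n shims means n+1 paste joints
    r = joints * resistance(TIM_LAYER, K_TIM)
    r += sum(resistance(t * 1e-3, K_COPPER) for t in shims_mm)
    return r

print(f"single 1.5 mm shim: {stack([1.5]):.3f} K/W")
print(f"two 1.0 mm shims:   {stack([1.0, 1.0]):.3f} K/W")
```

With these assumed numbers the paste joints dominate the total, which is why one thicker shim with thin paste layers beats two thinner shims with an extra joint.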


----------



## RavageTheEarth

I see a lot of you guys running Superposition. I didn't even realize Unigine released another benchmark. Here are my results; they seem to be about on par with other people's scores here.


----------



## Slackaveli

Quote:


> Originally Posted by *RavageTheEarth*
> 
> I see a lot of you guys running SuperPosition. I didn't even realize Unigine released another benchmark. Here's my results. Seem to be about on par with other people's scores here.


yep. looks good. We like this bench b/c it's new, awesome looking, & has great pre-sets that really push power limits. I can't wait for leaderboards.


----------



## KedarWolf

Quote:


> Originally Posted by *Slackaveli*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SlimJ87D*
> 
> I agree, would you start a new thread and maintain an op? I'm sure the mods will be okay with it. We should just have this one closed but reference it in a new op.
> 
> 
> 
> i will actively participate but I nominate Kedar . Have him put his slider unlock and Voltage curve videos up. Just start it and have a mod merge this thread into it at post 2 and just pre-empt dude's OP.
Click to expand...

I'm pretty busy with work and stuff and don't want to hijack someone else's thread, but if the OP here approved, I'd do my best to keep it up to date and add content.

Not just going to jump someone's thread, though.


----------



## Slackaveli

I'll stop derailing with 5775C stuff, but just look at this RAM score with 2400MHz DDR3 RAM. That's some SERIOUS cache magic right there. 99th percentile even with all that high-speed DDR4 running around... and all that quad channel memory out there. My my, my lil ole 2-channel 2400 DDR3 is 99th percentile!?!


----------



## KingEngineRevUp

I don't think you have to update it daily, but it would be nice to have some kind of informative OP.


----------



## KedarWolf

Oh, and I need to update the Asus BIOS flashing one. I have info and logs showing the power limit IS hitting 330 watts, and I'm getting really good benchmarks in the new Unigine with the Asus BIOS.


----------



## wsjackson5

Got this in today


----------



## Qba73

Quote:


> Originally Posted by *Hackslash*
> 
> would like to share something
> 
> 
> 
> 
> can strix owners reproduce this?
> 
> from 1-14% it is totally bugged, and I have this most of the time on my desktop while surfing and watching YouTube. It's really annoying.


small world, I responded to your reddit post

thenerdbomberr


----------



## RavageTheEarth

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, but the guy never even posted a second time. He just said "I pre-ordered" and never came back... :/
> 
> I'll stop derailing with 5775C stuff, but just look at this RAM score with 2400MHz DDR3 RAM. That's some SERIOUS cache magic right there. 99th percentile even with all that high-speed DDR4 running around... and all that quad channel memory out there. My my, my lil ole 2-channel 2400 DDR3 is 99th percentile!?!


Yeah, that CPU looks amazing. I upgraded to a Gigabyte G1 Gaming 7 Z170 mobo and a 6700K last year, so it's going to be a couple of years before I upgrade those components again, but if it was an LGA1150 chip I definitely would grab one. My delidded 6700K is doing great though.

It's funny: yesterday I removed my 980 Ti with its waterblock and installed the Aorus, so right now I'm running a Monsta 420mm, a 45mm-thick 360mm, and a 30mm-thick 240mm radiator just for my CPU, lol. EK needs to hurry up and release the Aorus blocks!!!!!! I cannot wait to get this three-slot cooler off this card.


----------



## Slackaveli

Quote:


> Originally Posted by *wsjackson5*
> 
> Got this in today


damn, that's the new design? That glassy black back is nice!
Quote:


> Originally Posted by *RavageTheEarth*
> 
> Yeah that CPU looks amazing. I upgraded to a Gigabyte G1 Gaming 7 Z170 MOBO and 6700k last year so it's going to be a couple years before I upgrade those components again, but if it was a LGA1150 chip I definitely would grab one. My delidded 6700k is doing great though.
> 
> It's funny, yesterday I removed my 980 Ti with waterblock and installed the Aorus so right now I'm running a Monsta 420mm, 45mm thick 360mm, and 30mm 240mm thick radiators just for my CPU lol. EK needs to hurry up and release the Aorus blocks!!!!!! I cannot wait to get this three slot cooler off of this card.


lmao, GOT THAT CPU RUNNING AT WHATEVER ROOM TEMP IS


----------



## Benny89

MSI Afterburner 4.4 beta with voltage already unlocked, if anyone needs it:

http://forums.guru3d.com/showpost.php?p=5412373&postcount=216

I am waiting for that Xtreme to be here, but so far no store has it.


----------



## Chris Ihao

Ok. From what I'm reading after having researched this for a while now, it seems like I might just as well get a standard Aorus instead of an Xtreme, if I'm really just interested in getting real-life fps performance out of it instead of marginally higher benches, correct? The Xtreme version costs 8% more atm and isn't available anywhere; I can get the standard much faster. I'm also considering the MSI Gaming X version, but that's also about 5% or so more expensive. The Asus Strix is as expensive as the Xtreme Aorus. I'm already on the minus side after recently having purchased some analog synths and whatnot, so I have to avoid going bankrupt (or getting kicked in the head by my missus; don't know what's worse). Any help deciding is appreciated, as I'm wavering back and forth between the alternatives here.


----------



## coolharris93

Here's a Unigine Superposition bench running on my 1080 Ti Strix, all stock, 1080p Extreme preset.
The video stuttering is caused by the NVENC used to record the video.


----------



## Slackaveli

Quote:


> Originally Posted by *Chris Ihao*
> 
> Ok. From what I'm reading after having researched this for a while now, it seems like I might just as well get a standard Aorus instead of a xtreme, if I'm really just interested in getting real life fps performance out of it instead of marginally higher benches, correct? The xtreme version costs 8% more atm, and isnt available anywhere. Can get the standard much faster. I'm also considering the msi gaming x version, but thats also about 5% or so more expensive. Asus strix is as expensive as the xtreme Aorus. Already on the minus side after recently having purchased some analog synths and whatnot, so I have to avoid getting bankrupt (or kicked in the head by my missus. Dont know whats worse). Any help deciding is appreciated as I'm waving back and forth between the alternatives here.


you could get lucky, too. My reggie Aorus would definitely have passed an Xtreme gauntlet test, as it holds stock Xtreme clocks plus 70MHz more. It's an awesome card, very sexy. Just don't try it in a tiny case or on a mobo with weak PCIe slots; it's a beast. Although I would have been happy to pay the extra $30 just for that backplate and the guarantee of at least 2000MHz (which you'll get with the pre-set on the Aorus XT). The backplate and the +25 guarantee are the only differences, but it's only $30, too. The backplate is worth the extra money imo, all day, with the bigger copper plate and the LED Aorus eagle. I wish I had it, but clocks-wise I beat several of the XT guys somehow, so I caught a golden one imo.


----------



## Foxrun

Anyone know why I would be getting a power limit throttle in GPU-Z if I'm just sitting at the desktop?


----------



## Slackaveli

Quote:


> Originally Posted by *Foxrun*
> 
> Anyone know why I would be getting a power limit throttle in gpu z if Im just sitting at the desktop?


hhmhmmhmhmh, what clocks? Sure nothing is running in the background? Try a quick reboot and see if it's gone.


----------



## Foxrun

Quote:


> Originally Posted by *Slackaveli*
> 
> hhmhmmhmhmh, what clocks? Sure nothing running in background? Try a reboot right quick and see if it's gone.




No avail, this is weird.


----------



## Chris Ihao

Quote:


> Originally Posted by *Slackaveli*
> 
> you could get lucky, too. my reggie Aorus would definitely have passed an extreme gauntlet test as it holds stock extreme clocks plus 70mhz more. It's an awesome card, very sexy. Jst dont try it in a tiny case or on a mobo with weak pcie slots, it's a beast. Although I would have been happy to pay the extar $30 just for that backplate and the guarantee of at least 2000MHZ (which you'll get with the pre-set on the Aorus XT). The backplate and +25 guarantee are the only differences, but it's only $30 , too. The backplate is worth $20 extra imo all day with the bigger copper plate and the LED Aorus Eagle. I wish I had it, but clocks wise I beat several of the XT guys somehow so I caught a golden one imo.


Ok. Cool. Hmm, I need to buy this thing soon before my common sense gets the better of me. I think my MSI Z97 Gaming 7's slots should be capable enough to hold the Aorus without falling apart.


----------



## Slackaveli

Quote:


> Originally Posted by *Foxrun*
> 
> 
> 
> No avail, this is weird.


PSU? Maybe a janky PCIe slot? Not to have you doing all kinds of stuff, but I'd try a different PCIe slot and see if it's still happening. May just be a glitch with the app. Maybe have HWMonitor up too, and make sure you are seeing it in both.
Quote:


> Originally Posted by *Chris Ihao*
> 
> Ok. Cool. Hmm, I need to buy this thing soon before my common sense gets the better of me. I think my MSI Z97 Gaming 7's slots should be capable enough to hold the Aorus without falling apart.


My MSI Z97 Gaming is doing fine, no sag. They are reinforced slots.


----------



## RavageTheEarth

Quote:


> Originally Posted by *Chris Ihao*
> 
> Ok. From what I'm reading after having researched this for a while now, it seems like I might just as well get a standard Aorus instead of a xtreme, if I'm really just interested in getting real life fps performance out of it instead of marginally higher benches, correct? The xtreme version costs 8% more atm, and isnt available anywhere. Can get the standard much faster. I'm also considering the msi gaming x version, but thats also about 5% or so more expensive. Asus strix is as expensive as the xtreme Aorus. Already on the minus side after recently having purchased some analog synths and whatnot, so I have to avoid getting bankrupt (or kicked in the head by my missus. Dont know whats worse). Any help deciding is appreciated as I'm waving back and forth between the alternatives here.


Grab that regular Aorus. I'm easily running 2088 core with 1.043v. At 100% the fan keeps my card under 50C, which I was extremely surprised about. This thing destroys every game I own on my 1440p 144Hz monitor. If you want to keep it really simple, pop in the card, install the drivers and MSI Afterburner, and crank the power limit up to 125% and the fans to 100%. At 100% the fans are barely louder than my TWO D5 pumps. You will most likely be boosting to 2000MHz if you do that. Can't wait to get a waterblock on it.


----------



## Chris Ihao

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Grab that regular Aorus. I'm easily running 2088 core with 1.043v. At 100% the fan keeps my card under 50C which I was extremely surprised about. This thing destroys every game I own on my 1440p 144hz monitor. If you want to keep it really simple, pop in the card, install the drivers and MSI afterburner, and crank the power limit up to 125% and the fans to 100%. At 100% the fans are barely louder than my TWO D5 pumps. You will most likely be boosting to 2000Mhz if you do that. Can't wait to get a waterblock on it.


Thanks for the info, mate. I'll just order it right now. I'll even join your club when I get it.

Also running at 1440p/144Hz, so this will be a perfect card to keep up to speed. Not to forget: keeping my Oculus at 90 fps, which honestly isn't all that easy with my 980 Ti. Cheers!

Edit: I HATE using forums on my iPad!


----------



## Benny89

I don't even have the normal Aorus here, so I can wait extra for the Xtreme.

Wasn't the AORUS power limit 150%? Or is it also 125%?


----------



## Stiltz85

Anyone hear any rumors on the price/release date for the Asus 1080 Ti Poseidon? I need to sell my FEs before pre-order time comes!


----------



## Nico67

Quote:


> Originally Posted by *SlimJ87D*
> 
> Tried to do 1.000V challenge run on ASUS bios and it just ran like garbage:
> 
> 
> 
> For reference, stock:


Wow, you're power limiting hard at 112% on the Asus, where KedarWolf was getting 119%. Maybe it is a case of limiting on a different rail early, where KedarWolf can pull a bit more power somehow. Probably the only way to see would be for Kedar to run benchmarks with graphs showing at what % power he starts to get the green power limit warning in GPU-Z.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> I don't even have here normal Aorus, so I can wait extra for Xtreme
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Wasn't AORUS 150% power Limit? Or it is also 125%?


both the same, 375W power limit.
Quote:


> Originally Posted by *RavageTheEarth*
> 
> Grab that regular Aorus. I'm easily running 2088 core with 1.043v. At 100% the fan keeps my card under 50C which I was extremely surprised about. This thing destroys every game I own on my 1440p 144hz monitor. If you want to keep it really simple, pop in the card, install the drivers and MSI afterburner, and crank the power limit up to 125% and the fans to 100%. At 100% the fans are barely louder than my TWO D5 pumps. You will most likely be boosting to 2000Mhz if you do that. Can't wait to get a waterblock on it.


yep, do exactly that and then try +25, +50, +75, +100. You may get another step up. When you finally crash, back off 13MHz. Gold. I crash at +100, but can hold +90.
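That try-it-until-it-crashes routine can be written down as a tiny sketch. `is_stable` here is a hypothetical stand-in for actually running a benchmark pass at each offset, and the 13 reflects Pascal boost moving in roughly 13 MHz clock bins:

```python
# Sketch of the offset-stepping routine described above: raise the core offset
# in +25 MHz jumps until a run crashes, then settle ~one 13 MHz bin below the
# crash point. `is_stable` is a hypothetical stand-in for a real benchmark run.

def find_offset(is_stable, step=25, backoff=13, limit=200):
    offset = 0
    while offset + step <= limit and is_stable(offset + step):
        offset += step
    # the last tested jump crashed (or we hit the limit): back off one bin
    return offset + step - backoff if offset + step <= limit else offset

# Example: a card that crashes once the offset exceeds +90, like the post above.
card = lambda mhz: mhz <= 90
print(find_offset(card))  # settles at +87, one bin under the +100 crash
```

In practice you'd replace the lambda with an actual Heaven/SuPo run at each step, but the shape of the search is the same.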


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> both the same, 375w power limit.
> yep, do exactly that and then try +25, +50, +75, +100. You may get another step up. When you finally crash, back off 13Mhz. gold. I crash at +100, but can hold +90 .


Nah, I want that sexy backplate with the big RGB eagle. I can wait for the Xtreme, and a better chance of a better chip.

I don't have anything to play on PC anyway.


----------



## RavageTheEarth

Quote:


> Originally Posted by *Benny89*
> 
> Nah, I want that sexy backplate with big RGB eagle
> 
> 
> 
> 
> 
> 
> 
> . I can wait for Xtreme. And better chance for better chip.
> 
> I don't have anything to play on PC anyway


That was another reason I chose the regular Aorus when I bought it on launch day. I'm going to be putting this card under water the day that EK releases the block and backplate for it so I have no use for fancy backplates or lighting.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Nico67*
> 
> Wow, you're power limiting hard at 112% on the Asus, where KedarWolf was getting 119%. Maybe it is a case of limiting on a different rail early, where KedarWolf can pull a bit more power somehow. Probably the only way to see would be for Kedar to run benchmarks with graphs showing at what % power he starts to get the green power limit warning in GPU-Z.


That's odd; no one can do 119% constantly on the Asus BIOS. Max is usually 111% or 112% for small periods.

Yeah, what happens when you run the Asus BIOS? I've already run tests on my computer and measured power recently with the nvidia command and my PSU itself.

Stock max was 305 and Asus was 310ish.

Kedar might get some extra power because he doesn't have to run the blower or isn't powering the LED; he's on a full water block. But I have no clue how he's doing 119% on the Asus BIOS when we're all floating around 108%, unless he's running the stock BIOS and forgot he didn't flash back.

Bloodhawk had some similar measurements to me too.

I have a good PSU, so I don't think it's my PSU. RM1000i.


----------



## mechwarrior

Has anyone tried using the new Titan Xp BIOS on the 1080 Ti?
Wondering if the power limit is higher, or more efficient.


----------



## RavageTheEarth

Quote:


> Originally Posted by *Benny89*
> 
> I don't even have here normal Aorus, so I can wait extra for Xtreme
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Wasn't AORUS 150% power Limit? Or it is also 125%?


The regular Aorus is 125% and the Xtreme is 150%, but both are a 375W limit, because the regular defaults to a 300W base.
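For anyone checking the math, the slider percentage is just applied to the card's base power target. A quick sketch (the 300 W base for the regular card is the figure quoted above; the 250 W base implied for the Xtreme is my own inference from the numbers, not a verified spec):

```python
def board_power_watts(base_tdp_w, slider_percent):
    """Convert a power-limit slider percentage into an absolute wattage cap."""
    return base_tdp_w * slider_percent / 100.0

# Regular Aorus: 300 W base with the slider maxed at 125%
print(board_power_watts(300, 125))  # 375.0

# Xtreme: a 150% slider reaching the same 375 W cap implies a 250 W base
print(board_power_watts(250, 150))  # 375.0
```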


----------



## stoker

Quote:


> Originally Posted by *dunbheagan*
> 
> I have to disappoint everybody who hopes the terminal cover is LED-lit. It is not, and there is absolutely no way to insert an LED yourself. There is no space in there, and the terminal can't be disassembled. Funny that the manual has a section "connecting the terminal cover led"; this is surely not for the actual terminal. Either the section is outdated or EK already plans another terminal with an LED.


That's disappointing; I would have expected a provision for a 3.5 LED.

Quote:


> Originally Posted by *Slackaveli*
> 
> bro, that's pretty salty at your settings.
> Sorry, bro, The OP of this thread is a lil dooshyboy who needs tied and quartered for creating an OP than running away. I guess he wanted to BRAG? About one stock FE Ti? I don't get it, but he needs banned and a new OP assigned.


Someone needs to contact a mod to see if they can take over.

Quote:


> Originally Posted by *SlimJ87D*
> 
> My submission for the 1.000 Volt Challenge:
> 
> Ran mostly at 2063 Mhz with power limiting


I like it


----------



## Chris Ihao

Ordered the regular Aorus (I need to think every time I'm spelling this), so there. Now, if only Easter didn't exist, as it will delay my favourite activity (except a few secret ones): testing new GPUs with all of my 700 PC games. OK, perhaps not Peggle and a "few" others.


----------



## Qba73

Quote:


> Originally Posted by *Chris Ihao*
> 
> Ordered the regular Aorus (need to think everytime I'm spelling this), so there. Now, if only easter wouldnt exist, as it will delay my favourite activity (except a few secret ones), testing new gpu's with all of my 700 pc games. Ok, perhaps not Peggle and a "few" others.


I thought I was the only one who had to think twice when spelling aorus..lol


----------



## Slackaveli

Quote:


> Originally Posted by *Qba73*
> 
> I thought I was the only one who had to think twice when spelling aorus..lol


i spelled it wrong the first week


----------



## Chris Ihao

Quote:


> Originally Posted by *Qba73*
> 
> I thought I was the only one who had to think twice when spelling aorus..lol


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nico67*
> 
> Wow, you're power limiting hard at 112% on the Asus, where Kedarwolf was getting 119%. Maybe it is a case of limiting on a different rail early, where Kedarwolf can pull a bit more power somehow. Probably the only way to see would be for Kedar to run benchmarks with graphs to show at what % power he starts to get the green power limit warning in GPU-Z.
> 
> 
> 
> That's odd, no one can do 119% constantly on the Asus bios. Max is usually 111% or 112% for small periods.
> 
> Yeah, what happens when you run the Asus bios? I've already ran test on my computer and measured power recently with the nvidia command and my PSU itself.
> 
> Stock Max was 305 and Asus was 310ish.
> 
> Kedar might get some extra power because he doesn't have to run the blower or isn't powering the led. He's on a full water block. But I have no clue how he's doing 119% on the Asus bios when we're all floating around 108% unless if he's running stock BIOS and forgot he didn't flash back.
> 
> Bloodhawk had some similar measurements to me too.
> 
> I have a good PSU so I don't think it's my PSU. Rm1000i.
Click to expand...

No, I'm on the Asus BIOS; the Nvidia command-line check confirms it.

If you look at the logs in my Asus BIOS flashing thread, I'm hitting 330+ watts off and on depending on the power limit I hit, and I spiked up to 360 watts over two full runs of tests one and two of TimeSpy at 8K with anisotropy and tessellation maxed out. I'm getting like 8 FPS, but it's only to max out the power draw.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> No, I'm on Asus BIOS, Nvidia CMD command confirms it.
> 
> If you look at the logs on my Asus bios flashing thread I'm hitting 330+ watts off and on depending on the Power Limit I hit and spiked up to 360 watts two full runs of test one and two TimeSpy at 8K maxed out anisotropy and tesselation. Getting like 8 FPS but it's only to max out the power draw.


That's interesting; what could you be doing to get the extra 30 watts that no one else can?

Like I said, the blower can use 12 watts, and there's the LED as well. If I measured 310 and you get an extra 10 from not running those, I'm not sure about the other 10.

Out of the 4 of us who actually took measurements, you're the only one who gets more watts. Hardware-wise, the only difference I can see is that you're on full water with no blower running. Software-wise, I'm not sure either.

So you're hitting 119% on the Asus BIOS? I'm on my phone, so I can't check your GPU-Z right now.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> No, I'm on Asus BIOS, Nvidia CMD command confirms it.
> 
> If you look at the logs on my Asus bios flashing thread I'm hitting 330+ watts off and on depending on the Power Limit I hit and spiked up to 360 watts two full runs of test one and two TimeSpy at 8K maxed out anisotropy and tesselation. Getting like 8 FPS but it's only to max out the power draw.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That's interesting, what can you be doing to get the extra 30 watts that no one else can't.
> 
> The blower like I said can use 12 watts and the led, there's that also. If I measured 310, and you get an extra 10 from not running those, the other 10 I'm not sure.
> 
> out of 4 of us that actually took measurements, you're the only one that gets more watts. Hardware wise, the only difference I can see is you're on full water and no blower running. Software, not sure either.
> 
> So you're hitting 119% on the Asus bios? I'm on my phone so I can't check your GPU-Z right now.
Click to expand...

I think the power draw you get depends on how you stress test the card, which is why I used a maxed-out custom TimeSpy test.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> I think it depends how you stress test the card the power draw you get, why I used a maxed out custom TimeSpy test.


Are you hitting and holding 119% on the Asus BIOS now? You have a previous post where you said you weren't going above 108-110%; if so, what could have changed?


----------



## CoreyL4

Managed to snag a Gaming X today.


----------



## Nico67

I ran SuPo off a USB drive (I hate installing anything unnecessary on a gaming rig; now that's OCD). This is the FE BIOS getting absolutely crushed.

2100 stable gaming setting (some games are just really bad)

2100 but curve tweaked to try and keep clocks as high as possible, 4th run

2138: my game(s) can't handle this, go figure

I could tune that better, but it only got above 2088 when the load wasn't bad enough to power limit.

Will try the Asus BIOS for comparison.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Nico67*
> 
> I ran SuPo off a USB drive (hate installing anything unnecessary on a gaming rig, now that's ocd
> 
> 
> 
> 
> 
> 
> 
> ), this is FE bios getting absolutely crushed
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2100 stable gaming setting (some games just really bad)
> 
> 
> 
> 2100 but curve tweaked to try and keep clocks as high as possible 4th run
> 
> 
> 
> 2138 my game(s) can't handle this, go figure
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> could tune that better, but it only got above 2088 when the load wasn't bad enough to power limit.
> 
> Will try Asus for comparison


Can you share your voltage curve? Are you on a water block?

Yeah, let me know how the Asus BIOS goes for you.

I'm going to shunt mod ASAP; I don't want to think about the power limiting stuff anymore.


----------



## ThingyNess

Quote:


> Originally Posted by *SlimJ87D*
> 
> That's interesting, what can you be doing to get the extra 30 watts that no one else can't.
> 
> The blower like I said can use 12 watts and the led, there's that also. If I measured 310, and you get an extra 10 from not running those, the other 10 I'm not sure.
> 
> out of 4 of us that actually took measurements, you're the only one that gets more watts. Hardware wise, the only difference I can see is you're on full water and no blower running. Software, not sure either.
> 
> So you're hitting 119% on the Asus bios? I'm on my phone so I can't check your GPU-Z right now.


One thing to remember is that all of these BIOSes actually have 4 separate power limits. The overall main limiter is the one that everyone talks about, but there are also individual limiters for each of the 3 +12v sources (the PCIe motherboard slot and each of the two PCIe power connectors).

Every PCB design routes power differently. One might power the memory VRM, RGB lighting, the blower fan, and other miscellaneous stuff from the PCIe slot, and have all of the power phases for the VCore VRM come from the two PCIe power connectors.

In that case, overclocking and overvolting the core is really never going to increase the power draw from the PCIe slot, and so you'll never hit that particular limiter.

On the FE, I highly suspect that at least one, if not two of the VCore phases are connected to the PCIe slot, and everyone is running up against that limiter. Unfortunately we currently have no way to tell whether this is the case in software. An intrepid adventurer could take an accurate multimeter and probe the 5 milliohm shunt resistor for the PCIe slot while under a constant heavy load with the power limiter maxed out.

The power limit for PCIe slot +12v is 78w on both the FE and the ASUS Strix BIOS with the slider maxed out.

78w equates to 6.5 amps of current at 12v.
6.5 amps of current running through a 5 milliohm resistor would cause a voltage drop of 32.5mV.

If you see 32.5mV across the shunt resistor for the PCIe slot (or very close to) then you're likely hitting that individual sub-limit and not the overall limit.

Edit: Actually, I forgot how peaky the current draw is from the VRM phases. A multimeter isn't going to be enough, you'd need to have an oscilloscope with an integrating/averaging function. Sorry if I got anyone's hopes up!
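The numbers above follow directly from Ohm's law; here is a minimal sketch reproducing them (the 78 W per-rail limit and the 5 mΩ shunt value are the ones quoted in the post, not anything I've measured):

```python
def shunt_drop_mv(power_w, rail_v=12.0, shunt_ohm=0.005):
    """Expected voltage drop across a current-sense shunt at a given power draw."""
    current_a = power_w / rail_v            # I = P / V
    return current_a * shunt_ohm * 1000.0   # V = I * R, converted to millivolts

# At the 78 W PCIe-slot limit: 6.5 A through a 5 mOhm shunt
print(round(shunt_drop_mv(78), 3))  # 32.5
```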


----------



## KingEngineRevUp

Quote:


> Originally Posted by *ThingyNess*
> 
> One thing to remember is that all of these BIOSes actually have 4 separate power limits. The overall main limiter is the one that everyone talks about, but there are also individual limiters for each of the 3 +12v sources (PCIe motherboard slot, and each of the two PCIe power connectors)
> 
> Every PCB design routes power differently. One might power the memory VRM, RGB lighting, the blower fan, and other miscellaneous stuff from the PCIe slot, and have all of the power phases for the VCore VRM come from the two PCIe power connectors.
> 
> In that case, overclocking and overvolting the core is really never going to increase the power draw from the PCIe slot, and so you'll never hit that particular limiter.
> 
> On the FE, I highly suspect that at least one, if not two of the VCore phases are connected to the PCIe slot, and everyone is running up against that limiter. Unfortunately we currently have no way to tell whether this is the case in software. An intrepid adventurer could take an accurate multimeter and probe the 5 milliohm shunt resistor for the PCIe slot while under a constant heavy load with the power limiter maxed out.
> 
> The power limit for PCIe slot +12v is 78w on both the FE and the ASUS Strix BIOS with the slider maxed out.
> 
> 78w equates to 6.5 amps of current at 12v.
> 6.5 amps of current running through a 5 milliohm resistor would cause a voltage drop of 32.5mV.
> 
> If you see 32.5mV across the shunt resistor for the PCIe slot (or very close to) then you're likely hitting that individual sub-limit and not the overall limit.
> 
> Edit: Actually, I forgot how peaky the current draw is from the VRM phases. A multimeter isn't going to be enough, you'd need to have an oscilloscope with an integrating/averaging function. Sorry if I got anyone's hopes up!


I'm just going to shunt mod and leave this all behind me.


----------



## buddatech

Can anyone tell me the max resolution I'd be able to achieve with the included DisplayPort to DVI adapter?

I'm using an older 30" 2560x1600 panel.


----------



## OneCosmic

I have done the shunt mod, but I only put CLU on the 6+8 pin shunts. Now during Furmark I can hit 105% TDP in MSI AB, but GPU-Z only shows ~68% TDP on the stock FE BIOS, and I can see the power limit being the limiter during Furmark. The shunt mod obviously works, because I have an AX860i PSU with power monitoring and the power draw is ~50W higher now, running 1924/1949MHz in Furmark instead of 1800-1860MHz before.

Actually, when I did the shunt mod I had the Strix BIOS loaded, and I saw the same behavior in MSI AB and GPU-Z, but the power draw was the same, as if the shunt mod wasn't working. As I suspected, flashing back the stock BIOS solved the issue and made the shunt mod work, so obviously the Strix BIOS and stock FE BIOS handle power management differently. Maybe it has something to do with the fact that I didn't put CLU on the PCIe shunt?
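A plausible (simplified, unverified) model for why GPU-Z reads only ~68% TDP after the mod: the monitoring circuit converts the shunt's voltage drop to power assuming the nominal shunt resistance, so lowering the effective resistance with CLU makes it under-report the real draw. The halved resistance below is purely illustrative:

```python
def reported_power_w(actual_power_w, nominal_shunt_ohm=0.005,
                     effective_shunt_ohm=0.005):
    """Power the controller reports when it assumes the nominal shunt value
    but the effective (CLU-bridged) resistance is actually lower."""
    # The measured drop scales with the effective resistance; the controller
    # divides that drop by the nominal resistance to estimate current.
    return actual_power_w * effective_shunt_ohm / nominal_shunt_ohm

# If CLU halved the effective resistance, a real 300 W draw reads as 150 W
print(reported_power_w(300, effective_shunt_ohm=0.0025))  # 150.0
```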


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I think it depends how you stress test the card the power draw you get, why I used a maxed out custom TimeSpy test.
> 
> 
> 
> Are you hitting and holding 119% on the Asus bios now? Because you have previous post where you said you weren't going above 108-110%, and if so what could have changed?
Click to expand...

I said when running Heaven I wasn't going over 111%, which is good; you DON'T want to hit the power limit or you'll throttle. Lower is better.

But when I'm stress testing with TimeSpy I'm getting 120% off and on, which I was doing to test whether I was getting the 330+ watts, which I was.

You'll never get a sustained 120% unless you can find a stress test that pulls that kind of load, and I haven't found one that does yet.

I DID hit 120%+ off and on regularly running the custom TimeSpy; it jumps all over the place though.


----------



## Taint3dBulge

Temps seem nice. I just poked in a few numbers and it seems stable thus far; I will tweak it more. At +170 it crashes in Heaven after 20 min. Not sure if adding more voltage will help.

This is after, I think, 45 min of Heaven running.


----------



## Nico67

Results for Asus bios,

2100 tuned to same as FE or close enough



Note: to get anywhere near the FE BIOS I added +100 more to the mem clock. Asus doesn't power limit as much, but I'm still hitting it around 111%, even if it can spike past to 114%.

2138 although I think it was going as high as 2152



115% and massively power limited.

Asus clocks higher, or I should say further, than FE. The curve would be similar, but the little extra power allows it to use a few more bins. Performance for me, at least, is not as good as FE, and I feel that's because they soften some settings to allow for bigger clocks on core and mem that take more voltage and power, but probably end up at a similar result with tweaking.

Quote:


> Originally Posted by *SlimJ87D*
> 
> Can you share your vintage curve? Are you on a water block?
> 
> Yeah let me know how the Asus bios goes for you.
> 
> I'm going to shunt mod asap, I don't want to think about the power limiting stuff anymore


The 26C temps are from a full Titan block on a chilled-water cooling system. The curve is defaulted, then raised:

1.000 - 2075
1.011 - 2075
1.025 - 2088
1.031 - 2088
1.043 - 2101

then flat at 2101

Just see if you drop lower than 2075 (in my case), and if so, raise the next voltage down to minimize the loss.

Hopefully the shunt mod works, or we get a fully unlocked Asus BIOS; I could work with that.
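The curve above is essentially a step function over those voltage points with a flat cap at the top; a rough sketch of the idea (treating each point as a hard step is a simplification of how Afterburner actually handles the curve):

```python
# Nico67's points from above: core voltage (V) -> target clock (MHz)
CURVE = {1.000: 2075, 1.011: 2075, 1.025: 2088, 1.031: 2088, 1.043: 2101}
CAP_MHZ = 2101  # held flat at 2101 beyond the last point

def target_clock_mhz(voltage):
    """Clock for the highest defined voltage point at or below `voltage`;
    everything above the last point is held flat at the cap."""
    if voltage >= max(CURVE):
        return CAP_MHZ
    eligible = [v for v in CURVE if v <= voltage]
    if not eligible:
        return min(CURVE.values())  # below the first point: lowest bin
    return CURVE[max(eligible)]

print(target_clock_mhz(1.031))  # 2088
print(target_clock_mhz(1.093))  # 2101 (flat region)
```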


----------



## DStealth

Quote:


> Originally Posted by *DStealth*
> 
> Just received my MSI 1080ti Gaming X card ...seems nice at glance with stock cooling out of the box ~2100/12k will retest this days again FS GPU score is close to 32k
> 
> Edit: Just hit the 117% power limit with thеsе settings...anyway the score looks good to me 11136 GPU ..


Exceeded 32k GPU score (32.3k actually) on air with 1.05V and full fan speed in order to stay below 50C, at constant 2076/12000 clocks.

Edit: Validation on FS - http://www.3dmark.com/fs/12318972


----------



## bloodhawk

Quote:


> Originally Posted by *SlimJ87D*
> 
> I'm just going to shunt mod and leave this all behind me.


I'll solder a wire across the shunts tonight.


----------



## MURDoctrine

Quote:


> Originally Posted by *buddatech*
> 
> Can anyone tell me what is the max resolution I'd be able to achieve with the included DisplayPort to DVI adapter?
> 
> I'm using an older 30" 2560x1600 panel


1920x1080p 60hz


----------



## Slackaveli

Quote:


> Originally Posted by *Nico67*
> 
> I ran SuPo off a USB drive (hate installing anything unnecessary on a gaming rig, now that's ocd
> 
> 
> 
> 
> 
> 
> 
> ), this is FE bios getting absolutely crushed
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2100 stable gaming setting (some games just really bad)
> 
> 
> 
> 2100 but curve tweaked to try and keep clocks as high as possible 4th run
> 
> 
> 
> 2138 my game(s) can't handle this, go figure
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> could tune that better, but it only got above 2088 when the load wasn't bad enough to power limit.
> 
> Will try Asus for comparison


good enough, my man. I felt a sense of relief when my first 10,450 popped up. I was like, "I'm DONE!".
Quote:


> Originally Posted by *bloodhawk*
> 
> Ill solder on wire across shunts tonight


tiny wire!
Quote:


> Originally Posted by *buddatech*
> 
> Can anyone tell me what is the max resolution I'd be able to achieve with the included DisplayPort to DVI adapter?
> 
> I'm using an older 30" 2560x1600 panel


You'll need an active adapter, or a new monitor.


----------



## DStealth

Quote:


> Originally Posted by *Nico67*
> 
> Results for Asus bios,
> 
> 2100 tuned to same as FE or close enough
> 
> 
> 
> Note to get anywhere near FE bios I added +100 more to the mem clk, Asus doesn't power limit as much but I'm still hitting it around 111% even if it can spike past to 114%
> 
> 2138 although I think it was going as high as 2152
> 
> 
> 
> 115% and massively power limited
> 
> Asus clks higher or I should say further than FE, curve would be similar but the little extra power allows it to use a few more bins, performance for me at least is not as good as FE and I feel that's because they soften some settings to allow for bigger clks on core and mem that take more voltage and power but probably end up at a similar result with tweaking.
> 26c temps are full titan block on chilled water cooler system. curve is defaulted, then raise,
> 
> 1.000 - 2075
> 1.011 - 2075
> 1.025 - 2088
> 1.031 - 2088
> 1.043 - 2101
> 
> then flat at 2101
> 
> just see if you drop lower than 2075 in my case, and if so raise the next voltage down to minimize the loss.
> 
> Hopefully the shunt mod works, or we get a full unlocked Asus bios, I could work with that


Your scores are not right for 2100+/+650 mem... I have 10600+ in 4K Optimized at 2063/+500; you're either throttling or ECC memory is correcting at such clocks...

[email protected] due to PT on the stock MSI cooler though

Any suggestions on which one to apply to the GPU?


----------



## TWiST2k

Quote:


> Originally Posted by *Slackaveli*
> 
> .


That is fascinating, please tell us more!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *OneCosmic*
> 
> I have done the shunt mod, but i only put CLU on those 6+8 pin shunts and now during Furmark i can hit 105% TDP in MSI AB, but GPU-Z only shows ~68% TDP on stock FE BIOS and i can see power limit being the limiter during Furmark. Shunt mod obviously works, because i have a AX860i PSU with Power monitoring and the power draw is ~50W higher now running 1924/1949MHz in Furmark instead of 1800-1860MHz before.
> 
> Actually when i did the shunt mod i had the Strix BIOS loaded in it and i have seen the same behavior in MSI AB and GPU-Z, but the power draw was the same, simply like if the shunt mod wasn't working and as i suspected flashing back stock BIOS solved the issue and made shunt mod working, so obviously Strix BIOS and Stock FE BIOS are handling power management differently, maybe it has something to do with the fact that i didn't put CLU on the PCIE shunt?


It's better you didn't put it on the PCIe one; that's safer, as it shouldn't draw more from the board itself.

Although, did you measure your resistors with a multimeter? I read it's recommended to measure them to see whether you put on enough CLU.
Quote:


> Originally Posted by *KedarWolf*
> 
> I said when running Heaven I wasn't going over 111% which is good, you DON't want to hit the power limit or you'll throttle, lower is better.
> 
> But when I'm stress testing with TimeSpy I'm getting 120% off and on which I was doing to test to see if I was getting the 330+ watts which I was.
> 
> You'll never get a sustained 120% unless you can find a stress test that pulls that kind of load and I haven't found one that does yet.
> 
> I DID hit 120%+ off and on regularly running the custom TimeSpy, it jumps all over the place though.


So is that not similar to when you hit 330 watts on the stock BIOS then? Because it does happen occasionally on the stock BIOS.
Quote:


> Originally Posted by *Nico67*
> 
> Results for Asus bios,
> 
> 2100 tuned to same as FE or close enough
> 
> 
> 
> Note to get anywhere near FE bios I added +100 more to the mem clk, Asus doesn't power limit as much but I'm still hitting it around 111% even if it can spike past to 114%
> 
> 2138 although I think it was going as high as 2152
> 
> 
> 
> 115% and massively power limited
> 
> Asus clks higher or I should say further than FE, curve would be similar but the little extra power allows it to use a few more bins, performance for me at least is not as good as FE and I feel that's because they soften some settings to allow for bigger clks on core and mem that take more voltage and power but probably end up at a similar result with tweaking.
> 26c temps are full titan block on chilled water cooler system. curve is defaulted, then raise,
> 
> 1.000 - 2075
> 1.011 - 2075
> 1.025 - 2088
> 1.031 - 2088
> 1.043 - 2101
> 
> then flat at 2101
> 
> just see if you drop lower than 2075 in my case, and if so raise the next voltage down to minimize the loss.
> 
> Hopefully the shunt mod works, or we get a full unlocked Asus bios, I could work with that


Looks like your Asus BIOS results are worse than FE, similar to everyone else's experience besides KedarWolf's.

Truth be told, when we're power limited we're running the core in a gimped state. Who knows if you could truly hold 2138-2152 MHz on a shunt mod. This is evident when you compare our scores to users without power limiting: them at 2050 MHz is like us power limited, occasionally hitting 2101 MHz.

Mod edit: Please use the edit button instead of double and triple posting.


----------



## Slackaveli

Quote:


> Originally Posted by *TWiST2k*
> 
> That is fascinating, please tell us more!


lol, if i double post, which apparently my autistic self does a BUNCH, i edit and since you can't "remove' a post, i just leave that so the clean-up mods have it easy and don't ban me lol.
Quote:


> Originally Posted by *DStealth*
> 
> Your scores are not right for 2100+/+650 mem ...i have 10600+ 4k optimized on 2063/+500 you're either Throttling or Ecc memory correcting for such clocks...
> 
> [email protected] due to PT on stock MSI cooler thru
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any suggestions which one to apply on the GPU


dat kyro, man. thin layer across the whole die.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *DStealth*
> 
> Your scores are not right for 2100+/+650 mem ...i have 10600+ 4k optimized on 2063/+500 you're either Throttling or Ecc memory correcting for such clocks...
> 
> [email protected] due to PT on stock MSI cooler thru
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any suggestions which one to apply on the GPU


If you haven't been paying attention, the FE is power limited so we're all stuck around 10300.

He did get 10428 on the normal bios a page back.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> If you haven't been paying attention, the FE is power limited so we're all stuck around 10300.
> 
> He did get 10428 on the normal bios a page back.


Not sure how he's getting 10,600, though. Dat's nasty.


----------



## Dasboogieman

Quote:


> Originally Posted by *Slackaveli*
> 
> not sure how he's getting 10,600 , though. Dat's nasty.


Never mind 10600 lol I can't even break 10k cuz mine is that bad of a bin. Sigh, the one time I get godlike VRAM and the core is turd.


----------



## Slackaveli

Quote:


> Originally Posted by *Dasboogieman*
> 
> Never mind 10600 lol I can't even break 10k cuz mine is that bad of a bin. Sigh, the one time I get godlike VRAM and the core is turd.


Happened to me on my replacement 980 Ti after my hybrid cooler failed. The 1st one was 1500 all day, but only +500 memory. That was fine, though. The replacement crashed after 1440 core (ugh), but somehow the memory could hold +675. That was crazy high on GDDR5.

I feel for ya, bro; it's all just a couple of % points, though. You'll get to the "acceptance" stage of grieving soon.


----------



## NYU87

Quote:


> Originally Posted by *DStealth*
> 
> Exceeded 32k GPU(32.3 actually) on air with 1.05v + full fan speed in order stay below 50* and constant 2076/12000 clocks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Validation on FS - http://www.3dmark.com/fs/12318972


Nice! I got a very similar FS GPU score of 32.4K at 2088/12000.


----------



## zswickliffe

Finished my custom loop and finally played with OCing a bit more.

Time Spy - 9,254 (http://www.3dmark.com/spy/1558393)

GPU temps stay right around 41°C


----------



## Nico67

Quote:


> Originally Posted by *DStealth*
> 
> Your scores are not right for 2100+/+650 mem ...i have 10600+ 4k optimized on 2063/+500 you're either Throttling or Ecc memory correcting for such clocks...


Yeah, that one is the Asus BIOS power limiting heaps; in fact all of mine are power limiting due to the FE card, but that wasn't the purpose of the post.

I was trying to show power limiting at % TDP to see if we could get close to a full usable 330W before we get limited. Unfortunately it doesn't seem we can, so the Asus BIOS does provide some benefit, but that might just be in softer times and lower scores stretching the overclockability a bit further. Chances are the component tolerances may vary the power readings, so one card might limit at 295 and another at 305, and that would again stretch the Asus BIOS further.

Personally, a shunt mod with the stock BIOS on the FE would be the way to go, unless an XOC Asus BIOS shows up that has no real limits on any rail or in total. Hopefully with full 1.2V capability as well.


----------



## Taint3dBulge

Quote:


> Originally Posted by *SlimJ87D*
> 
> If you haven't been paying attention, the FE is power limited so we're all stuck around 10300.
> 
> He did get 10428 on the normal bios a page back.


Just installed my EVGA 1080 Ti FE today. So far stable at +150 core (2050) and trying +575 mem (6075). I'm reading conflicting things: some say don't up the voltage, and some say it helps. What's the sweet spot? Been reading, but too many pages, lol. Using +70mV right now and it seems stable enough; temps aren't going over 39C, and power % is 113 max but mostly right around 110%.


----------



## Dasboogieman

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Just installed my EVGA 1080 Ti FE today. So far stable at +150 core (2050) and trying +575 mem (6075) Im reading conflicting things. Some say dont up the voltage, and some say it helps. Whats the sweet spot? Been reading but too many pages lol. Using +70mv right now and seems stable enough, temps not going over 39c and power % is 113max but mostly right around 110%.


Well, the issue with upping the voltage is that if it doesn't significantly increase your clocks, you may lose performance, because you hit the power ceiling, which will downclock you by 1-2 bins.

IIRC, only people who have done the shunt mod and are watercooled can tolerate max voltage without tripping the power limit. I'm on a 375W power limit Aorus and I can still hit the power limit with 4K FFXIV or Superposition. That being said, my GPU is at the bottom end of the voltage leakage curve (I need 1081mV to do 2012MHz, lol), so mine is generating more power than usual.


----------



## OneCosmic

Quote:


> Originally Posted by *SlimJ87D*
> 
> It's better you didn't put it on the pci-e one, safer. Shouldn't draw more from the board itself.
> 
> Although did you measure your resistors with a multimeter? I read its recommended to measure them to see if you put enough clu or not.


You need a special multimeter to measure such low resistance, and I didn't want to invest money in one right now just to measure the resistance on one card. I have a regular OWON B35 multimeter, but this one, like other regular ones, can only resolve down to about 0.5 ohm. I bought a silver lacquer at my local electronics shop and tested a straight 1cm line of it on paper, comparing it to CLU, also in a straight 1cm line. The B35 showed the CLU as practically conductive with no resistance; the lacquer was specified at several ohms per cm, which is also what I measured with my multimeter. With a shorter distance the resistance was lower, but still measurable at 2-3 ohms and more.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Just installed my EVGA 1080 Ti FE today. So far stable at +150 core (2050) and trying +575 mem (6075) Im reading conflicting things. Some say dont up the voltage, and some say it helps. Whats the sweet spot? Been reading but too many pages lol. Using +70mv right now and seems stable enough, temps not going over 39c and power % is 113max but mostly right around 110%.


Go for the lowest voltage and highest clock possible; avoid power limiting if at all possible.

For Witcher 3 I do 1.040V and 2088 MHz, and I rarely hit power limits.

But this Superposition benchmark is also a stress test in itself; it draws an unrealistic amount of power that games normally wouldn't. It does show you what a partner card without power limits can do: get like 0.5% more frames than you, lol.


----------



## MURDoctrine

Quote:


> Originally Posted by *zswickliffe*
> 
> Finished my custom loop and finally played with OCing a bit more.
> 
> Time Spy - 9,254 (http://www.3dmark.com/spy/1558393)
> 
> GPU temps stay right around 41*C
> 
> 
> Spoiler: Warning: Spoiler!


Looks good. I can't wait for my EK block and backplate to get here.

However, that res placement makes me cringe. I guess it's no different than any of the other points of failure, but all that water up there irks me.


----------



## alucardis666

I'm coming back!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> I'm coming back!


Wow, all that cost the same as a Titan Xp on its own, lol. Welcome back.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> Wow all that cost the same as just a titan Xp on its own lol. Welcome back.


Exactly...

And thanks!


----------



## Taint3dBulge

Time to bring my 4790K up to 4.9GHz and run it again. I don't get why these benchmarks can never get your actual CPU speed correct...


----------



## Benny89

Does anyone have any info on the power limit of the EVGA FTW3? I wonder if it is also 330W, or 375W like the Aorus cards.


----------



## DStealth

Mentioning 375W on the Aorus cards... to share my experience flashing the GB BIOS on my MSI Gaming card: it was quite unpleasant, and I strongly recommend MSI Gaming X users not try flashing it hoping for an extended power limit.
The card got stuck at 2D clocks, and it even throttled at default speeds... seems like a board incompatibility or memory-timing issue to me. I restored my original BIOS and the card is just fine.


----------



## alucardis666

Am I crazy or do I remember someone here who flashed the Aorus extreme bios on their Aorus regular?

I'd love to give that a shot and convert my card when it arrives if that's possible.









Anyone tried this?


----------



## Benny89

Does anyone know if the EVGA FTW3's side text "EVGA GeForce GTX 1080 Ti" is fully RGB LED? So far in all the screenshots it's shown only as white.


----------



## Hackslash

is there a PCB difference between Aorus and Aorus Xtreme?
Will bios flashing work?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> Am I crazy or do I remember someone here who flashed the Aorus extreme bios on their Aorus regular?
> 
> I'd love to give that a shot and convert my card when it arrives if that's possible.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone tried this?


Yeah someone here did do that.


----------



## shalafi

Quote:


> Originally Posted by *Slackaveli*
> 
> i spelled it wrong the first week


Speaking of spelling, it's kRYonaut, not kYRonaut


----------



## Clukos

Quote:


> Originally Posted by *alucardis666*
> 
> I'm coming back!


Went for the exact same combo (minus the Aorus; I bought the Gaming X because I found it 100 euros cheaper when they started stocking them). Should be fun going from Intel to AMD for once, they deserve it this round.


----------



## alucardis666

Quote:


> Originally Posted by *Clukos*
> 
> Went for the exact same combo (minus Aorus, I bought the Gaming X because I found it -100 euros cheaper when they started stocking them) should be fun going from Intel to AMD for once, they deserve it this round


Agreed! What are you coming from?


----------



## Clukos

Quote:


> Originally Posted by *alucardis666*
> 
> Agreed! What are you coming from?


A 3570k + Z77 mobo and the integrated gpu (sold my 970 some months ago). Should be a nice bump over the Intel 4000









Edit: I was thinking about a 6900k or a 5960x before Ryzen launched but the 1700 was a no-brainer really, for a non-multi gpu setup at least.


----------



## dunbheagan

Just a few impressions of my shunt modded 1080Ti FE on a EK waterblock. I used the Conductonaut between DIE and waterblock too. Insulation is nail polish and black heat resistant silicone.


----------



## dansi

Hey guys have you ran FF14 Heavensward benchmark?

It is a damn power virus, even in windowed 1080p I am getting pwr limited??









It is more stressful than Timespy.


----------



## s1rrah

Quote:


> Originally Posted by *dunbheagan*
> 
> Just a few impressions of my shunt modded 1080Ti FE on a EK waterblock. I used the Conductonaut between DIE and waterblock too. Insulation is nail polish and black heat resistant silicone..


Very nice. I used liquid electrical tape on mine (Hybrid cooler) but it didn't spread well, so I just ended up using standard electrical tape. I don't trust it though, so I'm going to clean it off and fall back to Kryonaut; between my two hybrid-cooled cards I didn't see any noticeable gain in performance ... temps roughly the same ... which probably has more to do with the efficiency ceiling of the Hybrid cooler than with the pastes.

BTW, on your die: in what order did you apply the nail polish and silicone? And could you post a link to the particular silicone you used? Did it already have an applicator that allowed that nice bead, or did you apply the silicone some other way?

Thanks for any info! Want to do my new 1080 ti Seahawk with Conductonaut some day and would like to replicate your process ...

Best,
joel


----------



## amer020

http://www.3dmark.com/3dm/19271932?



[email protected] volt with Strix 1080 Ti BIOS


----------



## dunbheagan

Quote:


> Originally Posted by *s1rrah*
> 
> Very nice. I used liquid electric tape on mine (Hybrid cooler) but it didn't spread well and so just ended up using standard electric tape. I don't trust it though and so am going to clean it off and fall back to Kryonaut as among my two hybrid cooled cards, I didn't see any noticeable gain in performance ... temps roughly the same ... which probably has more to do with the efficiency threshold of the Hybrid cooler more so that the pastes.
> 
> BTW: on your die? What order did you apply the nail polish and silicon? And could you post a link to the particular silicon you used? Did it already have an applicator that allowed that nice bead or did you somehow otherwise apply the silicon?
> 
> Thanks for any info! Want to do my new 1080 ti Seahawk with Conductonaut some day and would like to replicate your process ...
> 
> Best,
> joel


Hi Joel, I used black nail polish first:



then i put the conductonaut on the shunts:



last step, black silicone around the shunts to form a "wall":





On the small parts around the GPU, I only used black nail polish, because the silicone is not fluid enough imo.


----------



## s1rrah

Quote:


> Originally Posted by *dunbheagan*
> 
> On the small parts around the gpu, i only used black nail polish, because the silicone is not fluent enough imo.


Thanks so much. I'll just be doing the die on mine. So will any old black nail polish do? And is it known to withstand heat fairly well?

Thanks again
joel


----------



## leequan2009

Quote:


> Originally Posted by *dunbheagan*
> 
> Hi Joel, i used black nail polish first:
> 
> 
> 
> then i put the conductonaut on the shunts:
> 
> 
> 
> last step, black silicone around the shunts to form a "wall":
> 
> 
> 
> 
> 
> On the small parts around the gpu, i only used black nail polish, because the silicone is not fluent enough imo.


Sorry, can you tell me why you painted the black stuff onto the board?
Can we mount the block on the card without it?


----------



## dunbheagan

Quote:


> Originally Posted by *s1rrah*
> 
> thanks so much. I'll just be doing the die on mine. So any old black nail polish will do? And it's known to withstand heat fairly well?
> 
> Thanks again
> joel


There are different opinions on using nail polish as an insulator, but many people use it successfully. I think it is OK, although there might be more professional materials. Here is what I found on the topic of heat resistance:

http://www.overclock.net/t/1608437/tutorial-power-target-limit-hardware-mod-shunt-mod-for-titan-x-and-many-other-nvidia-gpus/130#post_25984068


----------



## DStealth

Just did the liquid metal shunt mod on my MSI Gaming X
Consumption in TimeSpy GT1 with [email protected] *338w* max

Consumption after the mod in TimeSpy GT1 with [email protected] *159w* max









Will retest soon 2114/12000 on stock air cooler seems possible now
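For anyone wondering how the mod roughly halves the reported figure: the card's controller infers current from the voltage drop across a known shunt resistor (commonly quoted as 5 mΩ on these boards), so bridging it with liquid metal adds a parallel path and scales the reported power by R_parallel/R_nominal. A minimal sketch of that arithmetic; the shunt and bridge resistances below are assumed values, not measurements:

```python
def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

R_SHUNT = 0.005    # nominal 5 mOhm shunt (assumed typical value for these boards)
R_BRIDGE = 0.0044  # resistance of the liquid-metal bridge (pure guess)

r_eff = parallel(R_SHUNT, R_BRIDGE)
scale = r_eff / R_SHUNT        # fraction of the real power the controller reports

actual_watts = 338.0           # real draw, unchanged by the mod
reported_watts = actual_watts * scale
print(f"reported ~ {reported_watts:.0f} W (scale {scale:.2f})")
```

With a bridge around 4.4 mΩ this reproduces the 338 W to ~159 W drop above; the card still pulls the same power, it just under-reports it, which is how the mod dodges the power limit.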


----------



## dunbheagan

Quote:


> Originally Posted by *leequan2009*
> 
> Sorry, Can u tell me why did you paste black oil to Boardcard?
> We can apply Block to Card without black oil?


Don't worry, you don't need it if you use normal thermal paste, which is non-conductive.
I only put the black nail polish on the small parts as an insulator because I use liquid metal as thermal paste.


----------



## DStealth

Quote:


> Originally Posted by *DStealth*
> 
> Just did the liquid metal shunt mod on my MSI Gaming X


Not bad for a stock air-cooled card at 28° room ambient, I suppose









Exceeding 80fps 4k optimized


----------



## KickAssCop

Ordered an MSI Gaming X. Here's hoping I didn't make a mistake, as just when it shipped the ASUS became available


----------



## Dasboogieman

Quote:


> Originally Posted by *s1rrah*
> 
> thanks so much. I'll just be doing the die on mine. So any old black nail polish will do? And it's known to withstand heat fairly well?
> 
> Thanks again
> joel


I got some nail hardener, chucked it on an old Core 2 Duo package and put a 150°C heat gun to it; it didn't melt or anything. TL;DR: as long as it's not mechanically disturbed it should be OK. I could upload photos of that particular nail hardener if u want lol.

Added bonus: this is super easy to remove in case of warranty, just use any form of nail polish remover and you're golden.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *s1rrah*
> 
> Very nice. I used liquid electric tape on mine (Hybrid cooler) but it didn't spread well and so just ended up using standard electric tape. I don't trust it though and so am going to clean it off and fall back to Kryonaut as among my two hybrid cooled cards, I didn't see any noticeable gain in performance ... temps roughly the same ... which probably has more to do with the efficiency threshold of the Hybrid cooler more so that the pastes.
> 
> BTW: on your die? What order did you apply the nail polish and silicon? And could you post a link to the particular silicon you used? Did it already have an applicator that allowed that nice bead or did you somehow otherwise apply the silicon?
> 
> Thanks for any info! Want to do my new 1080 ti Seahawk with Conductonaut some day and would like to replicate your process ...
> 
> Best,
> joel


Not sure why the liquid electrical tape didn't work for you; another user got it on really nicely on four of his cards.

But if I were you, I would go to the auto store and get some black RTV gasket maker.

https://www.amazon.com/Permatex-82180-Maximum-Resistance-Silicone/dp/B0002UEN1U/ref=br_lf_m_cgrjwkuajft29zs_img?_encoding=UTF8&s=automotive

Black silicone is what's used to seal the lid (IHS) on CPUs.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KraxKill*
> 
> My favorite new thing is Liquid electrical Tape!!!
> 
> For those that want significantly better thermals on air and don't want to bother with water.
> 
> Consider running liquid metal. It's not nearly as scary as people make it out to be. Unless you have shaky hands from Parkinson's or similar ofcorse.
> 
> Just flood the area around the GPU die with higher temp liquid tape and go to town. This will drastically lower your temps as it's nearly as good as solder in terms of thermal transfer and will drop your temps significantly.
> 
> When you're done with the card, just wipe it off carefully and peel off the tape.
> 
> He're how my old 1080 and Titan X Maxwell before it cleaned up after I was done with em. You can literally flood the area surrounding the socket with liquid electrical tape to ward off shorts and peel it off when done. Same with the Shunt mod frankly.
> 
> One can always practice on old useless hardware first if scurrd....


Best example of using liquid electrical tape. So if you have to RMA, just peel it off.


----------



## Fieldsweeper

So I just got this card (the EVGA one). It did not come with any disc, so I just installed the drivers manually. Which overclocking software should I use? First steps??

I downloaded the EVGA software but it asked for my SN; I put it in, but the software seems kind of crappy. I hear MSI Afterburner is nice, do I need an SN for that one?

So what's this shunt mod I keep seeing about? Risky?


----------



## outofmyheadyo

Afterburner is free, u don't need a serial


----------



## DStealth

If anybody wonders what 2100/1200 looks like on stock air cooling... here it is








http://www.3dmark.com/3dm/19276650

11300+ GPU score


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Fieldsweeper*
> 
> So I just got this card (the EVGA one) it did not come with any disc, so I just installed the drivers manually which overclocking software should I use? first steps??
> 
> I downloaded the evga software but it asked for my sn, i put it in but the software seems kind of crappy. I hear the MSI afterburner is nice, do I need a sn for that one?
> 
> so whats this shunt mod I see about? risky?


It depends on your OC. If you can OC past 2100 and you're being held back by power, then the shunt mod might be for you. But if your card is an OC dud, then it's not worth it.

I'm going to do it because my card can run at 2114 or higher with no issues, just limited by power right now.


----------



## Fieldsweeper

how come the voltage says 0 ?? lol


----------



## Fieldsweeper

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Fieldsweeper*
> 
> So I just got this card (the EVGA one) it did not come with any disc, so I just installed the drivers manually which overclocking software should I use? first steps??
> 
> I downloaded the evga software but it asked for my sn, i put it in but the software seems kind of crappy. I hear the MSI afterburner is nice, do I need a sn for that one?
> 
> so whats this shunt mod I see about? risky?
> 
> 
> 
> It depends on your oc, if you can oc past 2100 and you're being held back by power than the shunt mod might be for you. But if your card is a oc dud then it's not worth it.
> 
> I'm going to do it because my card can run up to 2114 or higher with no issues, just limited by power right now.
Click to expand...

What is the best way to test that? I would love to see 2100-plus


----------



## Fieldsweeper

http://prntscr.com/ewhgu6

why so low on the clocks?


----------



## Fieldsweeper

It has just been so long since I have overclocked a GPU I forget where to start HAHA


----------



## octiny

Count me in for 2.



Idle temps 26c/Load 48c @950 RPM w/+150/250

Will test more when I fully finish the build.


----------



## alucardis666

Quote:


> Originally Posted by *Clukos*
> 
> A 3570k + Z77 mobo and the integrated gpu (sold my 970 some months ago). Should be a nice bump over the Intel 4000
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: I was thinking about a 6900k or a 5960x before Ryzen launched but the 1700 was a no-brainer really, for a non-multi gpu setup at least.


Agreed! Hope it works out for us both!


----------



## KedarWolf

Edit, never mind, video quality is terrible.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *DStealth*
> 
> If anybody wonders how 2100/1200 looks like on stock air cooling...here it is
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/19276650
> 
> 11300+ GPU score


You're shunt modded also right? I would include that in your details.

I'm more interested in seeing your GPU-Z sensor graphs and your AB detached graphs.


----------



## Slackaveli

Quote:


> Originally Posted by *shalafi*
> 
> Speaking of spelling, it's kRYonaut, not kYRonaut


oy boy. that's probably been bothering y'all. I've only misspelled the two things I've been advocating most. Now, let me find a way to screw up 5775c...
Quote:


> Originally Posted by *dansi*
> 
> Hey guys have you ran FF14 Heavensward benchmark?
> 
> It is a damn power virus hog, even in windowed 1080p, i am getting pwr limited??
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It is more stressful than Timespy.


prettier, too.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> oy boy. that's probably been bothering ya'll. I've only misspelled the two things I've been advocating most. Now, let me find a way to screw up 5775c...
> prettier, too.


I think you mean the c5775


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> I think you mean the c5775


yeah, The Broodwells.


----------



## outofmyheadyo

Unfortunately my Gigabyte Founders 1080 Ti seems to crap out at a locked 1.050V, +169 core (2050), +550 memory (6055). It isn't even hitting the 120% power limit or anything, it just freezes; it's under water at ~40c.
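For anyone new to Afterburner's sliders: the figures in parentheses are just the stock clock plus the offset, and the memory figure doubles for the GDDR5X effective data rate. A quick sketch of that arithmetic; the stock values below are inferred from the numbers in the post, not official specs:

```python
# Assumed stock clocks, inferred from the post (offset +169 -> 2050, +550 -> 6055).
CORE_STOCK_MHZ = 1881   # top boost bin this particular card holds at stock
MEM_STOCK_MHZ = 5505    # memory clock as Afterburner reports it

def apply_offsets(core_offset: int, mem_offset: int):
    """Return (core MHz, memory MHz, effective GDDR5X data rate in MT/s)."""
    core = CORE_STOCK_MHZ + core_offset
    mem = MEM_STOCK_MHZ + mem_offset
    return core, mem, mem * 2   # Afterburner's reported clock doubles to the data rate

core, mem, effective = apply_offsets(169, 550)
print(core, mem, effective)  # 2050 6055 12110
```

The same math explains why people quote their memory as "12000": that's the doubled effective rate, not the clock Afterburner shows.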


----------



## cekim

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, The Broodwells.


BW is emo?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Unfortunately my gigabyte founders 1080ti seems to crap out at locked 1.050v +169 core ( 2050 ) + 550 memory ( 6055 ) it isn`t even hitting the 120% power limit or anything just freezes, its under water and ~40c.


Silicon lottery









But make sure your water block isn't shorting anything; I've heard stories of people having to reassemble their cards because their PC would power down.


----------



## cekim

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Unfortunately my gigabyte founders 1080ti seems to crap out at locked 1.050v +169 core ( 2050 ) + 550 memory ( 6055 ) it isn`t even hitting the 120% power limit or anything just freezes, its under water and ~40c.


Starting to seem pretty normal from following the internets.

Same thing here with one of my cards. The other is better, but it takes two to tango in SLI.

Still cranking out the frames, but the magic 2100 OCs elude me...


----------



## Silent Scone

Quote:


> Originally Posted by *cekim*
> 
> Starting to seem pretty normal from following the internets.
> 
> Same thing here with one of my cards. The other is better, but takes 2 to tango in SLI.
> 
> Still cranking out the frames, but magic 2100 OC's elude...


Hoping to get 2100 out of a pair of Strix, but will see!


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, The Broodwells.


Hey, what kind of temps are you getting with your Aorus? And what clocks/volts? Also, I assume you redid the paste with Kryonaut? And do you know if I can use my hybrid cooler with it? *Doubtful, as I assume there's no baseplate to use between the cooler block and the PCB*

Thanks bud!


----------



## outofmyheadyo

Well I don't mind, now I can go play some g
Quote:


> Originally Posted by *SlimJ87D*
> 
> Silicon lottery
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But make sure your water blocks aren't shorting anything, I've heard stories of people reassembling their cards because their pc would power down.


It doesn't power down, it just craps out as usual for a failed GPU OC


----------



## cekim

Quote:


> Originally Posted by *Silent Scone*
> 
> Hoping to get 2100 out of a pair of Strix, but will see!


Fingers crossed for Strixy goodness...

I wasn't patient enough to wait for them after the blah that was board-partner performance on the 1080... My two 1080 FEs bang out 2100 under water. The Ti still blows them out of the water even 100MHz+ lower, so all good.


----------



## Mrip541

Impulse buy after a beer too many. Had time this morning to cancel before it shipped but... NAH! Looks like I'll have one on Tuesday.


----------



## outofmyheadyo

I read the Titan Xp thread and people were talking about the shunt mod, and that you can also use a silver pen for it. Which method do you prefer? I have some CLU at hand but I'm afraid to mess something up if it spills or resistors fall off, since someone said it ate their solder?


----------



## eXteR

Quote:


> Originally Posted by *amer020*
> 
> http://www.3dmark.com/3dm/19271932?
> 
> 
> 
> [email protected] volt with Strix 1080 Ti BIOS


2038 with Stock BIOS.

http://www.3dmark.com/fs/12267950

31643 graphic points on mine vs 31483 on yours


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Hey, what kind of temps are you getting with your Aorus? and what clocks/volts? Also, I assume you redid the paste with Kryonaut? Also do you know if I can use my hybrid cooler with it? *Doubtful as I assume there's no baseplate to use between the cooler block and the pcb
> 
> 
> 
> 
> 
> 
> 
> *
> 
> Thanks bud!


Never above 62c, but in BF1 for long stretches it's 58c. All post-Kryo pasting, of course. Generally in the 50s benching.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> nevr above 62c, but in BF1 for long stretches it's 58c. All post Kryo pasting of course. generally in the 50's benching.


Nice, so it's not too far off from the hybrid cooler. What do you idle at?


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Nice, so it's not too far off from the hybrid cooler. What do you idle at?


Idle's a little different because these have zero-fan Fan Stop. If I wanted to leave the fans on, about 32c; but with the way I have the fan curve, it cuts off at 38c and just hangs out there, with zero fan and the "Fan Stop" LED illuminated on the GPU









And I'd say my temps are almost exactly what my 980 Ti hybrid temps were before I re-pasted it; back then it was 6-8c better than these. But that's with a hybrid and a lot fewer cores. Very nice air cooler on these. Just beefy.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> idle's a little different b/c these have zero fan fanstop. If i wanted to leave fans on, about 32c. but the way i have the fan curve it cuts off at 38c and just hangs out there, with zero fan and the "Fanstop" led illuminated on the gpu
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and i'd say my temps are almost exactly what my 980ti hybrid temps were before i re-pasted it, then it was 6-8c better than these. But that's with a hybrid and a lot less cores. Very nice air cooler on these. Just beefy.


Hmm. Alright, thanks. We'll see. I'm debating getting a second one for SLI once Newegg allows me to buy a second card. **The 48hr window is stupid AF**

I know I've preached long and hard against SLI, but with the TXp being a letdown I wanna finally play 4K at 60fps continuously, *or as close as possible with 1% and .1% lows.*


----------



## Asus11

Quote:


> Originally Posted by *outofmyheadyo*
> 
> I read the titan Xp thread and people were talkin about the shunt mod, that you can also use the silver pen for it, wich method do you preffer? I have some CLU at hand but afraid to mess something up if it spills or resistors fall off, since someone said it ate their solder ?


If you're careful: first clean the area with iso, then use some clear nail varnish around the bottom of the soldering points, then all around any areas where it could potentially drip.

After that, apply a decent amount over both shunts (CLU). Mine didn't even drip once and it was vertically mounted. Please refer to my pics in the Titan thread.

I have seen the silver pen and would like to use it next time, but I've heard there are versions with different resistances, so make sure you get the right one.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *outofmyheadyo*
> 
> I read the titan Xp thread and people were talkin about the shunt mod, that you can also use the silver pen for it, wich method do you preffer? I have some CLU at hand but afraid to mess something up if it spills or resistors fall off, since someone said it ate their solder ?


No one has used the silver pen yet, and some people think it has too much resistance.

Look up liquid electrical tape; that's what you'd apply around the resistors and on the solder.

The guy that had his resistor fall off never showed any more photos or discussed what happened, so I'm going to assume user error. Although I'm still going to buy a small paint brush and paint the solder on the resistors with liquid electrical tape.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> Hmm. Alright thanks. We'll see. I'm debating getting a second one for SLi once newegg allows me to buy a second card. **48hr window is stupid AF**
> 
> I know I've preached long and hard against SLi, but with the TXp being a let down I wanna finally play 4k 60fps continuously *or closest to as possible with 1% and .1% lows.*


Well, wait for your card and see if you won the silicon lottery.

A shunt mod and hybrid + G12 can maybe get you another 5% performance, and you can run 4K.

Save that money for the 2080 Ti, which will be like another "1080 Ti x 162.5%" if this trend continues.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> Well wait for your card and see if you won the silicon lottery.
> 
> Shunt mod and hybrid + g12 can get you another 5% performance maybe and you can run 4K.
> 
> Save that's moneybags for the 2080 Ti which will be like another "1080 Ti x 162.5%" if this trend continues.


Will G12 work with the EVGA Hybrid cooler? and would I need additional VRM cooling other than the fan that the G12 provides?


----------



## s1rrah

It lives. And it is beautiful.

(count me a new member)










...



...



...

Stock Corsair fan sucks ... super quiet, but super slow ... I'm gonna get a Delta and put it on a fan controller ... LOL ...

Can't dang wait to buy another one in the next few months for SLI ...


----------



## Chris Ihao

Edit: Never mind. Just some minor case of ocd.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> Will G12 work with the EVGA Hybrid cooler? and would I need additional VRM cooling other than the fan that the G12 provides?


Do you have an EVGA hybrid cooler already, and that's why you want to use it?

If not, just grab an X41 refurbished on eBay or on the NZXT website. Comes with a year warranty directly from NZXT. Costs $65.00


----------



## Tikerz

Why is it that when you overclock with the sliders you always seem to have perfcaps all over the place? The only way to get a stable, consistent overclock at a constant speed/voltage without any perfcaps is to do a custom voltage curve.

Is there anyone that can get an overclock where every game/benchmark holds the boost speed AND voltage AND no perfcaps whatsoever? Literally straight lines across for miles in GPU-Z on boost and voltage, and perfcap at zero.

Yes, people are posting "I hit 2100" or whatever, but as soon as they get into a bench or game it drops below 2000 most of the time. To me that's not really a 2100 overclock. Am I looking at this wrong?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Tikerz*
> 
> Why is it that when you do slider overclocking you seems to always have perfcaps all over the place? The only way to get a stable consistent overclock at a constant speed/voltage without any perfcaps is to do a custom voltage curve.
> 
> Is there anyone that can get an overclock where every game/benchmark holds the boost speed AND voltage AND no perfcaps whatsoever? Literally straight lines across for miles in GPU-Z on boost and voltage and perfcaps zero.
> 
> Yes, people are posting I hit 2100 or whatever but as soon as they get into a bench or game it drops to below 2000 for most of the time. To me that's not really a 2100 overclock. Am I looking at this wrong?


Well, with a voltage curve you're just fine-tuning and controlling core vs voltage, and power is a function of voltage.

So you're also controlling core vs power/voltage.


----------



## Tikerz

Quote:


> Originally Posted by *SlimJ87D*
> 
> We'll with a voltage curve you're just fine tuning and controlling core vs voltage, and power is a function of voltage.
> 
> So you're also controlling core vs power/voltage


I guess what I'm saying is that even at default out-of-the-box speeds you don't get clean perfcap readings and consistent runs. Then if you just use the sliders to increase your clock, it's basically the same thing. With a voltage curve you can get consistent boost and voltage throughout the benchmark/game with zero perfcaps. Someone must have come up with that and said, my people, if you achieve this you will be blessed.







I just haven't seen it here. I see you on here asking people for their GPU-Z sensor readings, looking for the exact same thing I'm talking about, so we must understand the same thing.
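What Tikerz describes, holding one boost/voltage pair with no perfcap bouncing, is usually done in Afterburner's curve editor by raising a single voltage point and flattening every point above it to the same frequency, so the card never requests more voltage. Here's a rough sketch of just that clamping logic; the curve points are invented for illustration, not real 1080 Ti bins:

```python
def flatten_curve(points, lock_mv, lock_mhz):
    """Clamp an Afterburner-style V/F curve: every point at or above
    lock_mv gets lock_mhz, so the card pins at one voltage/frequency
    pair instead of bouncing between boost bins."""
    return [(mv, lock_mhz if mv >= lock_mv else min(mhz, lock_mhz))
            for mv, mhz in points]

# Invented example points: (millivolts, MHz)
stock = [(900, 1886), (950, 1936), (1000, 1974), (1050, 2012), (1093, 2050)]
locked = flatten_curve(stock, lock_mv=1000, lock_mhz=2025)
print(locked)
# [(900, 1886), (950, 1936), (1000, 2025), (1050, 2025), (1093, 2025)]
```

Because the flattened section is all one frequency, the boost algorithm has nothing higher to reach for, which is why curve overclocks hold flat lines in GPU-Z where slider overclocks wander.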


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Tikerz*
> 
> I guess what I'm saying there is even at default out of the box speeds you don't get clean perfcap readings and consistent runs. Then if you just use the sliders to increase your clock, basically same thing. With a voltage curve you can get consistent boost and voltage throughout the benchmark/game with zero perfcaps. Someone must have came up with that and said, my people, if you achieve this you will be blessed.
> 
> 
> 
> 
> 
> 
> 
> I just haven't seen it here. I see you on here asking people for their GPU-Z sensor readings to look for the exact same thing I'm talking about so we must understand the same thing.


Yeah, Pascal benchmark runs need GPU-Z.

People like to show they're at a core above 2100, but that means nothing to me without GPU-Z. At least when it comes to Superposition.

For normal gaming I don't care to see the GPU-Z. Heaven is still a good benchmark to replicate true gaming power use; it's not obsolete. Heaven is closer to what we would see in a gaming session.

Superposition is good for seeing what someone's card can hold and their perfcap, but it's practically like Furmark, pushing a card to 100% power draw. Not very realistic.
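For reference, the perfcap reasons GPU-Z colors in its sensor row come from the driver's throttle-reasons bitmask (also visible via `nvidia-smi -q -d PERFORMANCE`). A small decoder sketch; the bit values are the constants documented in NVIDIA's NVML header (`nvmlClocksThrottleReason*`), but double-check against your driver's nvml.h:

```python
# Bit values per NVIDIA's NVML header (nvmlClocksThrottleReason*).
THROTTLE_REASONS = {
    0x01: "GpuIdle",
    0x02: "ApplicationsClocksSetting",
    0x04: "SwPowerCap",            # the "Pwr" cap GPU-Z highlights
    0x08: "HwSlowdown",
    0x10: "SyncBoost",
    0x20: "SwThermalSlowdown",
    0x40: "HwThermalSlowdown",
    0x80: "HwPowerBrakeSlowdown",
}

def decode_throttle(mask: int) -> list:
    """Turn a raw throttle-reason bitmask into readable names."""
    return [name for bit, name in sorted(THROTTLE_REASONS.items()) if mask & bit]

print(decode_throttle(0x04))         # ['SwPowerCap']
print(decode_throttle(0x04 | 0x20))  # ['SwPowerCap', 'SwThermalSlowdown']
```

A clean curve overclock of the kind discussed above should decode to an empty list during a run; anything with `SwPowerCap` in it means the card is bouncing off its power limit.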


----------



## JedixJarf

Quote:


> Originally Posted by *wsjackson5*
> 
> Got this in today


Oh boy, that's a Lanboy Air. Freaking loved that case


----------



## amer020

Quote:


> Originally Posted by *eXteR*
> 
> 2038 with Stock BIOS.
> 
> http://www.3dmark.com/fs/12267950
> 
> 31643 graphic points on mine vs 31483 on yours


CPU overclocked to 4.5


----------



## OneCosmic

Quote:


> Originally Posted by *SlimJ87D*
> 
> No one has used the silver pen yet, and some people think it has too much resistance.
> 
> Look up liquid electrical tape, that's what it going to apply around the resistors and on the solder.
> 
> The guy that had his resistor fall off never showed anymore photos or discussed what happened, so I'm going to assume user error. Although I'm still going to buy a small paint brush and paint the solder on the resistors with liquid electrical tape.


I tried to put a little bit of CLU on a piece of leaded solder wire, and I can definitely say it made the wire brittle in that area within a few minutes, so it seems like CLU really does dissolve solder.


----------



## eXteR

Quote:


> Originally Posted by *amer020*
> 
> cpu overclock to 4.5


My 4790K is stock. 4.4 XMP Profile


----------



## Benny89

My AORUS Xtreme is coming to me. Should have it 18.04









Gimme that Powah!


----------



## amer020

Quote:


> Originally Posted by *eXteR*
> 
> My 4790K is stock. 4.4 XMP Profile


You:
Physics Test: 38.7 fps
Combined Test: 36.26 fps

Me:
Physics Test: 55.09 fps
Combined Test: 43.82 fps


----------



## zipeldiablo

Quote:


> Originally Posted by *dunbheagan*
> 
> Just a few impressions of my shunt modded 1080Ti FE on a EK waterblock. I used the Conductonaut between DIE and waterblock too. Insulation is nail polish and black heat resistant silicone.


Did you order the block with fast shipping?
I preordered on day one and my order is still processing


----------



## Nico67

Got the backplate on my water block and the LEDs in, so I could finally close the case again















Did a bit of RAM performance testing, and the FE seems to like around +500, whereas the Asus was a lot stronger around +650. Not sure if that's a power limit thing, but most likely the Asus is tuned for higher memory clocks.
If you use the Strix BIOS, give it more RAM clock and that should make up for the performance difference, clk for clk.


----------



## Slackaveli

Quote:


> Originally Posted by *Nico67*
> 
> Got the back plate on my what block and leds in, so I could finally close the case again
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> did a bit of ram performance testing and FE seems to like around +500, where as the Asus was a lot stronger around +650. Not sure if that's a power limit thing but most likely Asus is tuned for higher memory clks.
> If you use Strix bios give it more ram and that should makeup for the performance difference clk for clk.


wow, dude, that looks awesome, I gotta say.


----------



## feznz

Quote:


> Originally Posted by *Silent Scone*
> 
> Hoping to get 2100 out of a pair of Strix, but will see!


I could only get 2050MHz until I used the volt/freq curve. Now I can get to 2101MHz but not hold it; it generally clocks down to 2076MHz/2088MHz.

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, Pascal benchmarks need GPU-Z.
> 
> People like to show they're at a core above 2100 but that means nothing to me without gpu-z. At least when it comes to superposition.
> 
> Normal gaming I don't care to see the gpu-z. Heaven is still a good benchmark to replicate true gaming power uses. It's not obsolete. Heaven is more close to what we would see in a gaming session.
> 
> Superposition is good to see what someone's card can hold and their perf cap. But it's practically like furmark, pushing a card to 100% power draw. Not very realistic.


or use MSI Afterburner with a pretty graph of GPU power %


Spoiler: Warning: Spoiler!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *feznz*
> 
> I could only get 2050Mhz until I used the volt/freq curve can now get to 2101Mhz But not hold it generally clocks down 2076Mhz /2088Mhz
> or use MSI Afterburner with a pretty graph of GPU power %
> 
> 
> Spoiler: Warning: Spoiler!


AB is nice, but it's easier to see perf cap with GPU-Z. It's colored like a rainbow for perf cap and lets us know straight away what someone is lacking.

With afterburner you have to zoom into the power and voltage limits and look at the 0 and 1, etc. Not as user friendly and not ideal for looking at perf cap.


----------



## Nico67

Quote:


> Originally Posted by *Slackaveli*
> 
> wow, dude, that looks awesome, I gotta say.


Thx, I quite enjoy building them and figuring out nice ways to plumb the water fittings


----------



## AsusFan30

Look at this insane GPU core clock! These are on air! I must have gotten some golden cards! I have included the link so people don't think it has been Photoshopped.

http://www.3dmark.com/fs/12324785


----------



## manolith

on air? you serious?


----------



## KingEngineRevUp

2658? Huh?


----------



## AsusFan30

Yes! On Air!!! Amazing!


----------



## manolith

That's kind of crazy. You sure it's not a bug? Have you checked the GPU-Z sensors? I just find it hard to believe that any Ti can get to 2600 on air out of the box, and you have 2 of them, which makes it even more incredible.


----------



## AsusFan30

I have not been able to reach that again, but have been hitting 2200+MHz. I wish I would have had GPU-z on for that one!


----------



## MrTOOSHORT

The GPU score is in line with 2100MHz, so it wasn't 2600MHz+


----------



## DooRules

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> The GPU score is in line with 2100MHz, so it wasn't 2600MHz+


ya beat me to it lol, that score is in no way indicative of a 2600+ core.


----------



## manolith

yeah, that must have been a bugged reading. But 2200 is above average anyway, which is great.


----------



## AsusFan30

These are some scores I have been getting. I will say it was so cold last night in my room when I was running benchmarks, I could see my breath.































Sent from my iPhone using Tapatalk Pro


----------



## manolith

your scores are the same or better at 2000-2200 than at 2600..


----------



## AsusFan30

Quote:


> Originally Posted by *manolith*
> 
> your scores are the same or better at 2000-2200 than at 2600..


I know... that is what had me confused. Still, 2200 on air is pretty good, especially since I didn't mess with any settings except power limit and temp.

Sent from my iPhone using Tapatalk Pro


----------



## mshagg

Quote:


> Originally Posted by *Tikerz*
> 
> Why is it that when you do slider overclocking you seem to always have perfcaps all over the place? The only way to get a stable consistent overclock at a constant speed/voltage without any perfcaps is to do a custom voltage curve.
> 
> Is there anyone that can get an overclock where every game/benchmark holds the boost speed AND voltage AND no perfcaps whatsoever? Literally straight lines across for miles in GPU-Z on boost and voltage and perfcaps zero.
> 
> Yes, people are posting I hit 2100 or whatever but as soon as they get into a bench or game it drops to below 2000 for most of the time. To me that's not really a 2100 overclock. Am I looking at this wrong?


The slider method effectively just moves the curve in a parallel fashion, so you retain the standard curve shape and end up with higher clocks at higher voltages, which is a recipe for perfcaps. Having a stable clock speed at, say, 1.031 and a flat curve to the right of it means the card won't boost into the higher voltage bins. Doing the curve manually ends up with a flatter curve.
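To picture the difference, here's a toy Python sketch of the two approaches. All the curve point values are made up for illustration, and GPU Boost obviously isn't driven by Python; this just models the curve shapes being described.

```python
# Toy voltage (V) -> frequency (MHz) curve; values are illustrative only.
base_curve = {0.950: 1936, 1.000: 1987, 1.031: 2025, 1.062: 2063, 1.093: 2100}

def slider_offset(curve, offset_mhz):
    """Core-clock slider: shifts every point up by the same amount,
    so the card still boosts into the hottest (highest-voltage) bins."""
    return {v: f + offset_mhz for v, f in curve.items()}

def flatten_above(curve, v_target):
    """Manual curve edit: clamp every point above v_target to the clock
    at v_target, so the card never requests the higher-voltage bins."""
    f_cap = curve[v_target]
    return {v: (f_cap if v > v_target else f) for v, f in curve.items()}

offset = slider_offset(base_curve, 100)
manual = flatten_above(offset, 1.031)

assert max(offset.values()) == 2200    # slider curve still peaks at 1.093 V
assert manual[1.093] == manual[1.031]  # manual curve is flat past 1.031 V
```

The flattened curve trades a little top-end clock for staying out of the voltage bins that trigger perfcaps.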


----------



## Slackaveli

Quote:


> Originally Posted by *AsusFan30*
> 
> I know..that is what had me confused. Still 2200 on Air is pretty good..especially since I didn't mess with any settings, except power limit and temp.
> 
> Sent from my iPhone using Tapatalk Pro


it's more than pretty good. mine won't touch 2114, it no likey past 2101.


----------



## AsusFan30

Quote:


> Originally Posted by *mshagg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Tikerz*
> 
> Why is it that when you do slider overclocking you seem to always have perfcaps all over the place? The only way to get a stable consistent overclock at a constant speed/voltage without any perfcaps is to do a custom voltage curve.
> 
> Is there anyone that can get an overclock where every game/benchmark holds the boost speed AND voltage AND no perfcaps whatsoever? Literally straight lines across for miles in GPU-Z on boost and voltage and perfcaps zero.
> 
> Yes, people are posting I hit 2100 or whatever but as soon as they get into a bench or game it drops to below 2000 for most of the time. To me that's not really a 2100 overclock. Am I looking at this wrong?
> 
> 
> 
> The slider method effectively just moves the curve in a parallel fashion, so you retain the standard curve shape and end up with higher clocks at higher voltages, which is a recipe for perfcaps. Having a stable clock speed at, say, 1.031 and a flat curve to the right of it means the card won't boost into the higher voltage bins. Doing the curve manually ends up with a flatter curve.

Wouldn't the 2100, or 2200 or whatever the Benchmark says, be constant through the Benchmark?

Sent from my iPhone using Tapatalk Pro


----------



## Slackaveli

Quote:


> Originally Posted by *Nico67*
> 
> Thx, I quite enjoy building them and figuring out nice ways to plumb the water fittings


well, it shows. i love the power cords, nice touch. even your tube placement seems "artsy" lol
Quote:


> Originally Posted by *AsusFan30*
> 
> Wouldn't the 2100, or 2200 or whatever the Benchmark says, be constant through the Benchmark?
> 
> Sent from my iPhone using Tapatalk Pro


yes, minus the 1, 2, 3, or 4+ bin drops from temps. So keep it in the 30s: no drops; 40s: 1 drop; 50s: 2-3 drops, etc.
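That rule of thumb can be sketched roughly in Python. The ~13 MHz bin size and the one-drop-per-decade pattern are just the forum lore repeated above, not an official NVIDIA spec:

```python
def estimate_boost(clock_mhz, temp_c, bin_mhz=13):
    """Rough rule of thumb (not an official spec): Pascal sheds boost
    bins of ~13 MHz as core temperature climbs. Mirroring the post:
    30s = no drops, 40s = 1 drop, 50s = 2 drops, and so on."""
    if temp_c < 40:
        drops = 0
    else:
        drops = (temp_c - 30) // 10   # one extra bin per 10 C decade
    return clock_mhz - drops * bin_mhz

assert estimate_boost(2100, 35) == 2100   # 30s: holds full boost
assert estimate_boost(2100, 45) == 2087   # 40s: one 13 MHz bin down
assert estimate_boost(2100, 55) == 2074   # 50s: another bin down
```

Which is why a card that "does 2100" on a cold open-bench run often settles at 2063-2076 once it warms up in a case.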


----------



## WarbossChoppa

I got my MSI 1080Ti Sea Hawk X to 2,012 MHz core and 6,075 MHz memory. If I messed around with the voltage curve instead of the slider, I think I could get the core clock higher.


----------



## DStealth

Quote:


> Originally Posted by *s1rrah*
> 
> It lives. And it is beautiful.
> 
> (count me a new member)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ...


Would you mind dumping your BIOS for the community?








NVIDIA NVFlash 5.353.0
Start CMD (admin), go to the NVFlash folder, disable the video adapter inside Device Manager, and run nvflash --save MSI1080tiSH.rom (or nvflash64),
then attach the file here as an archive.
Thanks in advance.

As for the guy with the bugged Futuremark frequency readings: a single card at under 2100MHz makes 32k+ GPU points in FS, so two of them should be in the 60k area, not 52k, not to mention whether 2600+Mhz is even reachable without LN2
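As a rough sanity check on those numbers: two-way SLI typically adds most, but not all, of a second card's graphics score. The 0.88 scaling factor below is an assumption for illustration, not a measured constant:

```python
def expected_sli_graphics_score(single_score, scaling=0.88):
    """Ballpark two-way SLI graphics score: the second card contributes
    a fraction (scaling) of its solo score. 0.88 is an assumed typical
    Fire Strike figure, not a benchmark-published constant."""
    return single_score * (1 + scaling)

single = 32_000                 # one 1080 Ti at just under 2100 MHz
expected = expected_sli_graphics_score(single)
assert expected > 56_000        # ~60k ballpark, nowhere near 52k
```

So a 52k graphics score from two cards is consistent with clocks well below what the frequency readout claimed.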


----------



## WarbossChoppa

Where are the 1080Ti BIOSes? I can't find any for my Sea Hawk X, and I've been told to update.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *DStealth*
> 
> Would you mind dumping you BIOS for the community purpose
> 
> 
> 
> 
> 
> 
> 
> 
> NVIDIA NVFlash 5.353.0
> Start CMD(admin) go to the folder disable the video inside Device manager and type nvflash/nvflash64 --save MSI1080tiSH.rom
> then attach the file here as archive
> Thanks in advance.
> 
> As for the guy with the bugged Futuremark frequency readings: a single card at under 2100MHz makes 32k+ GPU points in FS, so two of them should be in the 60k area, not 52k, not to mention whether 2600+Mhz is even reachable without LN2


I already posted the bios here.

1080TiMSISeaHawk.zip 155k .zip file


----------



## dansi

Quote:


> Originally Posted by *mshagg*
> 
> The slider method effectively just moves the curve in a parallel fashion, so you retain the standard curve shape and end up with higher clocks at higher voltages, which is a recipe for perfcaps. Having a stable clock speed at, say. 1.031 and flat to the right of it means the card wont boost into the higher voltage bins. Doing the curve manually ends up with a flatter curve.


Strangely, my FE is the opposite. I get more stability using the old offset method.

Using the curve, even at the derived stable clocks and voltages, I am getting multicolored perfcaps in GPU-Z.

GPU Boost 3.0, how does it work? Magic?


----------



## Slackaveli

Quote:


> Originally Posted by *dansi*
> 
> Strangely my FE is the opposite. I get more stable using the old offset method.
> 
> Using curve, even at the derived stable clocks and voltages, i am getting multi color perfcaps in GPUz.
> 
> GPU Boost 3.0, how does it work? Magic?


it does a pretty decent job. it definitely "works" better than any other auto-overclock i've ever seen.


----------



## DStealth

Quote:


> Originally Posted by *SlimJ87D*
> 
> You're shunt modded also right? I would include that in your details.
> 
> I'm more interested in seeing your GPU-Z sensor graphs and your AB detached graphs.


Here you are


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> Do you have a evga hybrid cooler and that's why you want to use it?


Yes. I do.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *DStealth*
> 
> Here you are


From the look of it, the only thing limiting you is the voltage limit, but it's better to look at a GPU-Z graph. The power limit graph is missing in that AB screenshot.

But either way, nice OC. I can't wait to do the shunt mod myself.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *dansi*
> 
> Strangely my FE is the opposite. I get more stable using the old offset method.
> 
> Using curve, even at the derived stable clocks and voltages, i am getting multi color perfcaps in GPUz.
> 
> GPU Boost 3.0, how does it work? Magic?


The voltage curve is there to get past crashes. You see, Boost 3.0 will literally run through the curve from first point to last point.

If someone crashes at a given clock at 1.031V, they can customize the voltage curve to flat-line from 1.031 to 1.040 and then apply that clock at 1.040, where they no longer crash.

That's the point of the voltage curve. You find out where you crash and you try to bypass it with a flat line to the next voltage.
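A minimal sketch of that trick. The curve points are hypothetical, and Boost 3.0 isn't actually driven by Python; this only models the curve edit of re-pinning an unstable clock to the next voltage bin:

```python
def shift_clock_to_voltage(curve, clock_mhz, v_apply, bin_mhz=13):
    """Re-pin clock_mhz to v_apply: points at or above v_apply are
    flat-lined at clock_mhz, and points below it are kept at least one
    ~13 MHz bin lower, so Boost 3.0 only reaches clock_mhz once it has
    the extra voltage."""
    fixed = {}
    for v, f in curve.items():
        if v >= v_apply:
            fixed[v] = clock_mhz
        else:
            fixed[v] = min(f, clock_mhz - bin_mhz)
    return fixed

curve = {1.025: 2088, 1.031: 2100, 1.040: 2100, 1.050: 2113}
# Card crashes at 2100 MHz @ 1.031 V: only allow 2100 from 1.040 V up.
stable = shift_clock_to_voltage(curve, 2100, 1.040)
assert stable[1.031] == 2087                  # no longer requests 2100 at 1.031 V
assert stable[1.040] == stable[1.050] == 2100
```

Same clock, one voltage bin later; the unstable clock/voltage pairing simply never gets requested.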


----------



## DStealth

Quote:


> Originally Posted by *SlimJ87D*
> 
> But either way, nice OC. I can't wait to do the shunt mod myself.


Thanks, seems the sweet spot for maximum benching on air is 1.081v at 2114/2126; higher voltage is not stable at all








http://www.3dmark.com/3dm/19287962


----------



## KingEngineRevUp

Quote:


> Originally Posted by *DStealth*
> 
> Thanks, seems the sweet spot for maximum benching on air is 1.081v at 2114/2126; higher voltage is not stable at all
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/19287962


Yeah I can do 2114 and 2126 runs, but I hit power limits. Can barely do 1.000V at 2088 with it power limiting still.


----------



## DStealth

Quote:


> Originally Posted by *SlimJ87D*
> 
> I already posted the bios here.
> 
> 1080TiMSISeaHawk.zip 155k .zip file


Could you share this info with your BIOS please


----------



## KingEngineRevUp

Quote:


> Originally Posted by *DStealth*
> 
> Could you share this info with your BIOS please


That BIOS I got from someone else; the Sea Hawk seems to be similar to, if not the same as, a FE.


----------



## alucardis666

So I'm guessing I can't use my Hybrid cooler with the Aorus unless I get a G12?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> So I'm guessing I can't use my Hybrid cooler with the Aorus? Unless I got a G12?


The G12 uses an Asetek mount.

You can use your hybrid mount if you want, but you'll have to stick some heatsinks on your VRM and VRAM and have something blowing at them. You'll have to be creative in keeping the VRM and VRAM cool.

Lots of people want to buy your hybrid kit; you can always sell it and grab a Kraken G12 + refurbished X41


__
https://www.reddit.com/r/65g878/usami_h_xfx_r9_295x2_wkoolance_block_sb6141_bnib/


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> The G12 uses an Asetek mount.
> 
> You can use your hybrid mount if you want, but you'll have to stick some heatsinks on your VRM and VRAMS and have something blowing at them. You'll have to be creative in keeping the VRM and VRAMs cool.
> 
> Lots of people want to buy your hybrid kit, you can always sell it and grab a Kraken G12 + Refurbished x41
> 
> 
> __
> https://www.reddit.com/r/65g878/usami_h_xfx_r9_295x2_wkoolance_block_sb6141_bnib/


Thanks!


----------



## WarbossChoppa

Quote:


> Originally Posted by *SlimJ87D*
> 
> I already posted the bios here.
> 
> 1080TiMSISeaHawk.zip 155k .zip file


When you load GPU-Z, is it the 86.02.39.00.3D BIOS ID?

Also, have you found a way to change the LED on the card? I can't find any way of doing it.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *WarbossChoppa*
> 
> When you load GPU-Z are they the 86.02.39.00.3D ID BIOS?
> 
> Also have you found a way to change the LED on the card? I can't find any way of doing it.


Like I said, this isn't my BIOS. It was from another user who was generous enough to upload it.


----------



## GraphicsWhore

Did some Firestrike and SuperPosition runs. EVGA FE.

+150, +550 FS: 29,442 (http://www.3dmark.com/fs/12332101)

Stats for the FS run (max temp is from an earlier run with fans on quiet; this one topped out at 67)



+170, +495 SP:



No voltage changes.

This is going on water but I like what I see on air.


----------



## joder

We really need the OP to update the OP with BIOSs, etc...


----------



## KingEngineRevUp

Quote:


> Originally Posted by *joder*
> 
> We really need the OP to update the OP with BIOSs, etc...


Not going to happen. The OP isn't even a regular poster here. He just made a topic saying he pre-ordered and disappeared.

Someone needs to contact a mod and see if they can make a new topic and merge this one underneath the new OP.


----------



## dunbheagan

Quote:


> Originally Posted by *zipeldiablo*
> 
> Did you order the block with fast shipping?
> I preorder on day one and my order is still processing


Yes, I ordered like three minutes after release with express shipping to Germany.


----------



## joder

Quote:


> Originally Posted by *SlimJ87D*
> 
> Not going to happen. The OP isn't even a normal poster here. He just made a topic that he pre-ordered and disappeared.
> 
> Someone needs to contact a mod and see if they can make a new topic and merge this one underneath the new OP.


I think it said he was here 4 hours ago lol. I'm game to help, but honestly I will likely get busy and only come here occasionally until the next big release. Not sure if anyone feels up to managing the OP. I guess there is no wiki-style format where we could have multiple folks helping with edits?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *joder*
> 
> I think it said he was here 4 hours ago lol. I'm game to help but honestly I will likely get busy and only come here occasionally until the next big release. Not sure if anyone feels up to managing the OP. I guess there is no sort of wiki style format where we could have multiple folks helping with edits?


That's exactly what I was thinking. We can just start a google drive and let people that post here a lot have access to it. It will be a master edit file and whoever owns the OP can just copy and paste it when it's updated.

I think Kedarwolf would be the primary candidate. He can start by copying his other 1080 Ti threads and having them as the OP. He can then throw his OP in the cloud and give some of us access to it.


----------



## Luckbad

Bought a Titan Xp. Wasn't impressed. Benches great and overclocks okay but the stock cooler is terrible and loud. Decided I didn't want to mod a $1200 card so the Titan Xp is going back.

Back to my 1080 FTW2 until I can score one of the better 1080 Tis. Something with a good stock OC, extra power headroom, etc.


----------



## leequan2009

Can anybody show me how to set the *Curve* (MSI Afterburner) for the 1080Ti FE version?
I was overclocking this card to 2050Mhz with 1.075v, and I couldn't get past 2050Mhz.


----------



## KedarWolf

Made a video of Asus BIOS hitting 330+ watts. Cut out as TimeSpy was loading


----------



## Fieldsweeper

where should I start if I wanna overclock this thing? evga FE version.


----------



## eXteR

Quote:


> Originally Posted by *amer020*
> 
> Quote:
> 
> 
> 
> Originally Posted by *eXteR*
> 
> My 4790K is stock. 4.4 XMP Profile
> 
> 
> 
> you:
> Physics Test: 38.7 fps
> Combined Test: 36.26 fps
> 
> me:
> Physics Test: 55.09 fps
> Combined Test: 43.82 fps

Seriously?

Are you comparing my 4790k vs 5930k?

The point here is that despite you having higher clocks, the GPU score is lower because of that Asus BIOS.

Sent from my SM-P550 using Tapatalk


----------



## Nico67

Quote:


> Originally Posted by *KedarWolf*
> 
> Made a video of Asus BIOS hitting 330+ watts. Cut out as TimeSpy was loading


I wonder if it power limits on the average of the samples, as it did seem to be hovering around 300w, although there were a lot of maxes over 340w and GPU-Z showed 124%. Hard to say, but it looks like you get more power than anybody else at any rate


----------



## outofmyheadyo

How does this give me 1900? I do not get it.

I really wish there were a better way to OC these 1080 Tis than the voltage curve. Whatever I do to the curve, the card just ignores it and does its own thing, running at random speeds and random volts; I can't figure it out. Even when the card is running @ 105% TDP or so, the Afterburner on-screen display flashes the power limit and reduces clocks, and that's using the Strix BIOS.

Seems buying a waterblock and watercooling is a waste of money on these cards, since you can't really get anything more out of them compared to air when they just do whatever they want. God, I miss the 980 Ti days when you could disable boost, force constant voltage and clocks, and do what you want.


----------



## KedarWolf

Quote:


> Originally Posted by *Nico67*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Made a video of Asus BIOS hitting 330+ watts. Cut out as TimeSpy was loading
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I wonder if it power limits on the average of the samples, as it did seem to be hovering around 300w, although there were a lot of maxes over 340w and GPU-Z showed 124%. Hard to say, but it looks like you get more power than anybody else at any rate

This is what I was running TimeSpy at to get that power draw. 30 FPS. Only scene 2.


----------



## Benny89

Quote:


> Originally Posted by *Luckbad*
> 
> Bought a Titan Xp. Wasn't impressed. Benches great and overclocks okay but the stock cooler is terrible and loud. Decided I didn't want to mod a $1200 card so the Titan Xp is going back.
> 
> Back to my 1080 FTW2 until I can score one of the better 1080 Tis. Something with a good stock OC, extra power headroom, etc.


That would be the Aorus Xtreme Edition: 375W power limit (highest of all AIBs) and the highest out-of-box boost.


----------



## stoker

Quote:


> Originally Posted by *KedarWolf*
> 
> How do I contact a mod? I'd be willing to take it on.


Click the flag on the first post


----------



## shalafi

Quote:


> Originally Posted by *Slackaveli*
> 
> oy boy. that's probably been bothering ya'll. I've only misspelled the two things I've been advocating most. Now, let me find a way to screw up 5775c...
> prettier, too.


A bit, but the thought of pointing it out to you has been bothering me a lot more .. should have just sent you a PM








Quote:


> Originally Posted by *s1rrah*
> 
> It lives. And it is beautiful.
> 
> (count me a new member)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ...
> 
> 
> 
> ...
> 
> Stock corsair fan sucks .,.. super quiet though but super slow ... I'm gonna get a Delta and put it on a fan controller ... LOL ...
> 
> Can't dang wait to buy another one in the next few months for SLI ...


sorry, but isn't this the worst possible rad placement? Any air bubbles in the loop will end up in the pump, causing more noise and worsening your cooling efficiency. The rad should always be higher than the pump.


----------



## Benny89

What are your temps in maxed-out games with your AIBs?

Best would be Ultra at 1440p or 4K temps for: ASUS STRIX, AORUS XTREME, MSI Gaming X.

Only on air please. I know on water it will be nice and cool.


----------



## Benny89

Zotac Xtreme:






I wonder what its power limit is


----------



## alucardis666

*New Nvidia Driver.*

*381.78*


----------



## zipeldiablo

Quote:


> Originally Posted by *alucardis666*
> 
> *New Nvidia Driver.*
> 
> *381.78*


This is not even on Nvidia's own website; not a chance on earth I am gonna download that


----------



## alucardis666

Quote:


> Originally Posted by *zipeldiablo*
> 
> This is not even on Nvidia's own website; not a chance on earth I am gonna download that


Guru3D is a pretty reliable source. I've never had any issues with their downloads. I just updated and figured it was worth the post.

*Fixes Windows 10 Creators Update booting to a black screen*


----------



## DStealth

http://nvidia.custhelp.com/app/answers/detail/a_id/4453


----------



## alucardis666

Quote:


> Originally Posted by *zipeldiablo*
> 
> This is not even on Nvidia's own website; not a chance on earth I am gonna download that


Quote:


> Originally Posted by *DStealth*
> 
> http://nvidia.custhelp.com/app/answers/detail/a_id/4453


Feel better now?


----------



## zipeldiablo

Weird, still not available on the drivers download page


----------



## PasK1234Xw

Quote:


> Originally Posted by *shalafi*
> 
> A bit, but the thought of pointing it out to you has been bothering me a lot more .. should have just sent you a PM
> 
> 
> 
> 
> 
> 
> 
> 
> sorry, but isn't this the worst possible rad placement? any air bubbles in the loop will be up in the pump causing more noise and worsening your cooling efficiency. the rad should always be higher than the pump.


Yes, it should be level with or above the card. In his case, at the back of the case as it is, it's putting too much stress on that little pump.

Quote:


> Originally Posted by *Luckbad*
> 
> Bought a Titan Xp. Wasn't impressed. Benches great and overclocks okay but the stock cooler is terrible and loud. Decided I didn't want to mod a $1200 card so the Titan Xp is going back.
> 
> Back to my 1080 FTW2 until I can score one of the better 1080 Tis. Something with a good stock OC, extra power headroom, etc.


I have both; even on water it's not much different from the Ti. Definitely not worth the extra $500 for at most a 10 fps increase over the current Xp/Ti.

Keeping my Ti; the Xp is definitely going back.


----------



## Nico67

Quote:


> Originally Posted by *outofmyheadyo*
> 
> How does this give me 1900? I do not get it.
> 
> I really wish there were a better way to OC these 1080 Tis than the voltage curve. Whatever I do to the curve, the card just ignores it and does its own thing, running at random speeds and random volts; I can't figure it out. Even when the card is running @ 105% TDP or so, the Afterburner on-screen display flashes the power limit and reduces clocks, and that's using the Strix BIOS.
> 
> Seems buying a waterblock and watercooling is a waste of money on these cards, since you can't really get anything more out of them compared to air when they just do whatever they want. God, I miss the 980 Ti days when you could disable boost, force constant voltage and clocks, and do what you want.


Looks like it's working OK; the only thing holding it back is that you need to add +volts to go higher than 1.050. You may, as it seems, be able to use 1.062, but just set the voltage slider to +100; all it does is allow the card to use more volts, up to 1.093, depending on what your curve allows.
Probably best to set 1.050 and all voltage points after it to, say, 2000, hit apply, and save to a profile button. Then increase the 1.050 point until you crash or start power limiting.
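That trial-and-error procedure is basically a loop. Here's a sketch; `run_benchmark` and all the numbers are stand-ins (real tuning is done by hand in Afterburner, there's no such API):

```python
def tune_point(curve, v_key, step_mhz, run_benchmark):
    """Raise the clock at one voltage point until the bench crashes or
    power-limits (run_benchmark returns False), then back off one step."""
    while run_benchmark(curve):       # True while stable and not limited
        curve[v_key] += step_mhz
    curve[v_key] -= step_mhz          # last increment failed: revert it
    return curve

# Fake benchmark: pretend anything past 2063 MHz at 1.050 V fails.
stable_limit = 2063
curve = {1.050: 2000}
tuned = tune_point(curve, 1.050, 13, lambda c: c[1.050] <= stable_limit)
assert tuned[1.050] == 2052           # highest 13 MHz step under the wall
```

One bin at a time, revert on the first failure; same idea as the apply/save/raise cycle described above.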


----------



## Eldux

Hello,

Need some help from you.

It seems that my EVGA nVidia GeForce GTX 1080 Ti Founders Edition is having some issues with the memory or a regulator. Could you tell me whether your GPUs hold an 11000 MHz memory clock during testing with the AIDA64 GPGPU test? It would really be interesting to know if only I have this problem. Also, in Ubuntu I see that the board never goes to the maximum power state P0.

The GPU is in the correct PCI-Express x16 slot, Windows 10 is freshly installed, drivers are the newest. Could you please check your Founders Edition boards to see whether they perform the same in the AIDA64 GPGPU test, and whether they also drop memory speed from 11 GHz to 10 GHz under high load?

I am adding a GPU-Z log and test report. Please say if you are getting the same report. Should I worry about my card being faulty?


GPU-ZSensorLog.txt 59k .txt file


----------



## Nico67

Quote:


> Originally Posted by *KedarWolf*
> 
> This is what I was running TimeSpy at to get that power draw. 30 FPS. Only scene 2.


It's a bit much to ask, but it would be nice to see how much you can draw without power limiting. Just run the same test but keep lowering the clocks until it doesn't limit anymore. Remember, it's not about how high you clock or score; I'm just interested in power


----------



## TheBoom

Looking to get a 1080ti.

Currently own a 1070 Strix and was looking to get the Strix 1080ti for the upgrade as well, but it's like $140 USD more than the Aorus in my country.

Prices weren't that different for the 1070; not sure why they are for the 1080ti.

What do you guys think? Should I just go for the Aorus Extreme instead?


----------



## outofmyheadyo

Of course; the Strix is not worth the $140 premium


----------



## DStealth

Quote:


> Originally Posted by *TheBoom*
> 
> What do you guys think? Should I just go for the Aorus Extreme instead?


Previously, seeing some GB Aorus OC results... judging from the power limit alone, something doesn't seem right


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Made a video of Asus BIOS hitting 330+ watts. Cut out as TimeSpy was loading




Mind doing the test with your PhysX card pulled out? It would also be nice to see a normal TimeSpy run recorded with AB+RTSS running to watch your power usage.

Still odd that you're the only person who gets this.

I still wouldn't recommend this BIOS to anyone else, as we've all shown it gives us worse scores and doesn't draw any more power than the normal FE BIOS.

You're definitely a special case though.


----------



## amer020

Quote:


> Originally Posted by *eXteR*
> 
> Seriously?
> 
> Are you comparing my 4790k vs 5930k?
> 
> The point here is that despite you having higher clocks, the gpu score is lower because that asus bios.
> 
> Sent from my SM-P550 using Tapatalk


I think you're talking about the total score


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> Made a video of Asus BIOS hitting 330+ watts. Cut out as TimeSpy was loading


you guys wanting to test power limits should try OG Titanfall. I was playing last night on Insane in 4k with 4xTSAA and it was pulling 400w at times according to HWMonitor.
Quote:


> Originally Posted by *Benny89*
> 
> Zotac Xtreme:
> 
> 
> 
> 
> 
> 
> I wonder what is its power limit


WOOOOO, that is nasty. Its base clock is my Aorus' max OC. That's the one for you guys who have to have the best, I'd guess. Dizamn.
Quote:


> Originally Posted by *TheBoom*
> 
> Looking to get a 1080ti.
> 
> Currently own a 1070 Strix and was looking to get the Strix 1080ti for the upgrade as well, but it's like a 140 USD more than the Aorus in my country.
> 
> Prices weren't that different for the 1070 not sure why they are for the 1080ti.
> 
> What do you guys think? Should I just go for the Aorus Extreme instead?


it SHOULD be $750 for the Xtreme, $775 for the Asus. Any more is gouging.


----------



## outofmyheadyo

So I decided to settle on 2025 @ a locked 1.000v with +500 memory. Beyond that, scaling is horrible, all kinds of random problems start, and it starts smashing the power limit (even with the Strix BIOS). A shunt mod might save me, but having to disassemble the loop just for that... I might do it another day (not sure if it's worth the risk). Superposition 4K Optimized gets me 10016.


----------



## TheBoom

Quote:


> Originally Posted by *DStealth*
> 
> Previous seeing some GB Aeorus OC results...only by judging from Power Limit doesn't seem right


Care to explain a little?
Quote:


> Originally Posted by *Slackaveli*
> 
> it SHOULD be $750 for extreme, $775 for Asus. any more is gouging.


Costs in my country (Singapore):

Aorus : $800 USD

MSI Armor : $800 USD

MSI Gaming X : $885 USD

Aorus Extreme : $890 USD

Asus Strix : $1000 USD

No other aftermarket cards apart from FE versions seem to have been released here yet, but I doubt the prices will be any lower than the Aorus'. Maybe Palit and Zotac versions will be slightly cheaper when released, but I've had issues with both brands in the past.

I really like my current 1070 Strix but don't see any way I'm going to pay that much extra for the 1080ti version over other variants.


----------



## alucardis666

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Yes, it should be level with or above the card. In his case, at the back of the case as it is, it's putting too much stress on that little pump.
> I have both; even on water it's not much different from the Ti. Definitely not worth the extra $500 for at most a 10 fps increase over the current Xp/Ti.
> 
> Keeping my Ti; the Xp is definitely going back.


Yea my XP is going back too sadly.


----------



## demon09

There are so many pages in this thread; I will look through them if I have to, but if any of you guys have a link to where you found MSI Afterburner 4.4.0, that would be appreciated.


----------



## Slackaveli

Quote:


> Originally Posted by *TheBoom*
> 
> Care to explain a little?
> Costs in my country (Singapore):
> 
> Aorus : $800 USD
> 
> MSI Armor : $800 USD
> 
> MSI Gaming X : $885 USD
> 
> Aorus Extreme : $890 USD
> 
> Asus Strix : $1000 USD
> 
> No other aftermarket cards apart from FE versions seem to be released here yet, but I'd doubt the prices will be any less than the Aorus. Maybe Palit and Zotac versions when released will be slightly cheaper but I've had issues with both brands in the past.
> 
> I really like my current 1070 Strix but don't see any way I'm going to pay that much extra for the 1080ti version over other variants.


seems the regular Aorus is a no-brainer at $800


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> seems the regular Aorus is a no-brainer at $800


Agreed


----------



## TheBoom

Quote:


> Originally Posted by *Slackaveli*
> 
> seems the regular Aorus is a no-brainer at $800


What are the differences between the two? Only the bios?


----------



## cugno87

Hi guys, can someone explain to me why I'm getting 4 performance caps at the same time?


----------



## Jaysend

Quote:


> Originally Posted by *krutoydiesel*
> 
> You didn't accidentally drop any screws or whatnot or gouge any of the components, did you?


I hope so, because that would be a tangible explanation, but sadly my forensics didn't turn up anything...


----------



## Slackaveli

Quote:


> Originally Posted by *TheBoom*
> 
> What are the differences between the two? Only the bios?


and the backplate has a bigger copper shim and an LED-lit eagle; not worth $100.


----------



## Jaysend

So I was impatient waiting for my Newegg replacement card and went out and got another. It is a gem
of a card: constant 1050mV and stable 2075MHz with 6000MHz memory.
However, the replacement came the very next day and it is not nearly as good :-( It runs less voltage, more heat, and gets about 2000MHz with only +100 on memory.

Oh well I have the good card in slot one for non SLI games and a nice OC profile for it and then a mild OC for SLI.


----------



## Luckbad

Just managed to snag a reasonable best-offer price on a Zotac 1080 Ti Amp Extreme via eBay. ~$25 premium all told, since everywhere that ever has stock of it charges me tax (California).

Looks like my first Zotac in over a decade (bought one when they were super cheap and new, then never again after they started matching prices with others).


----------



## alucardis666

Well, I said F it and bought my 2nd Aorus from SuperBiiz, never bought there before, but it was $5 cheaper than the egg and with free shipping. It's gonna be a *REALLY* fun next week.









Really hope SLI scaling and micro stutter are better now than they used to be 3-4 years ago.


----------



## Silent Scone

Quote:


> Originally Posted by *alucardis666*
> 
> Well, I said F it and bought my 2nd Aorus from SuperBiiz, never bought there before, but it was $5 cheaper than the egg and with free shipping. It's gonna be a *REALLY* fun next week.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Really hope SLI scaling and micro stutter are better now than they used to be 3-4 years ago.


Makes more sense than a single TXP


----------



## alucardis666

Quote:


> Originally Posted by *Silent Scone*
> 
> Makes more sense than a single TXP


That's what I'm hoping too!

And the funny part is when I return my TXp and sell my 6950x + board I'll have *MADE* *~$600*


----------



## kevindd992002

Quote:


> Originally Posted by *shalafi*
> 
> Speaking of spelling, it's kRYonaut, not kYRonaut


Quote:


> Originally Posted by *alucardis666*
> 
> That's what I'm hoping too!
> 
> And the funny part is when I return my TXp and sell my 6950x + board I'll have *MADE* *~$600*


What, how?!


----------



## RavageTheEarth

So I hear you guys talk about perfcap a lot, but I'm not sure what it is. How does this GPU-Z monitoring look? This is while playing Tom Clancy Wildlands. The clock speeds aren't correct because I had to go to the desktop to screenshot GPU-Z. My clock speed of 2062 held throughout playing.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> Well, I said F it and bought my 2nd Aorus from SuperBiiz, never bought there before, but it was $5 cheaper than the egg and with free shipping. It's gonna be a *REALLY* fun next week.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Really hope SLI scaling and micro stutter are better now than they used to be 3-4 years ago.


I don't know, I just came from SLI 980 Tis and it wasn't that great. For a long time BF1 had an issue where it would burn letters into your screen. Batman: Arkham Knight and Doom didn't support it. It took patch after patch for Witcher 3 to support it right. And for just about every other game I had to mess with SLI profiles.

I'm hoping it takes off with dx12 in a year or two so I can buy another 1080 Ti used in a year or two for $300.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *RavageTheEarth*
> 
> So I hear you guys talk about perfcap a lot, but I'm not sure what it is. How does this GPU-Z monitoring look? This is while playing Tom Clancy Wildlands. The clock speeds aren't correct because I had to go to the desktop to screenshot GPU-Z. My clock speed of 2062 held throughout playing.


Looks like you need to unlock your power slider all the way to the max, and also the voltage slider.

You pretty much have everything going on right now: thermal throttling, power limiting, voltage limiting.


----------



## bloodhawk

So, if anyone here is thinking about power modding: the shunt mod with LM works, but is rather unreliable at times. Soldering a thin wire works better than that. But the best way is to solder 10 Ohm resistors on top of the capacitors next to the power controller IC. That works the best.

But there is something else to note: there is another power limit, either at the PCIe power connectors or on the slot power draw. I still need to look into that.

Some images -



http://imgur.com/PLx3I


Liquid Metal Shunt Mod - Power Target at about 105% max.

Thin Battery Wires - Power Target at about 75%.

Resistors on Caps - Power Target at 55% max.
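For anyone curious about why the usable power-target range shrinks after the mod: the controller infers current from the voltage drop across a tiny shunt resistor, so any parallel path makes it under-read. A rough Python sketch of that arithmetic; the ~5 mOhm stock shunt value here is an assumption for illustration, not a measurement from this card:

```python
# Why a shunt mod lowers the *reported* power: the controller infers
# current from the voltage drop across a tiny shunt resistor. A parallel
# path lowers the effective resistance, so the same real current produces
# a smaller drop and the card under-reports its draw.

def parallel(r1: float, r2: float) -> float:
    """Effective resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def reported_power(real_power_w: float, r_stock: float, r_modded: float) -> float:
    """Power the controller reports after the mod, for the same real draw."""
    return real_power_w * (r_modded / r_stock)

r_stock = 0.005                   # ~5 mOhm stock shunt (assumed value)
r_mod = parallel(r_stock, 0.005)  # an equal parallel path halves it
print(reported_power(300.0, r_stock, r_mod))  # ~150 W reported for 300 W real
```

So a card really pulling 300 W reads as roughly half that, which is why a much lower power-target setting corresponds to the old 100% after the mod.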


----------



## Arizonian

ANNOUNCEMENT

As of now KedarWolf will be taking over as thread starter. He's also shown interest in making this an [Official] 1080 Ti club by getting a club roster going. Please have patience while he gets things set up; he will let you guys know when you can start adding yourselves to the owners list.

Thank you KedarWolf


----------



## RavageTheEarth

Quote:


> Originally Posted by *SlimJ87D*
> 
> Looks like you need to unlock your power slider all the way to the max and also the voltage slider.
> 
> You pretty much have everything right now. Thermal throttling, power limiting, voltage limiting.


Hmm. My power slider is all the way at 125% which is the max. I don't see how I'm getting thermal throttling. The GPU hasn't gone over 51C while playing and I haven't dropped any bins at all. It's held my overclock of 2063 the entire time.

The reason you see the clock speeds like that is because I had to alt tab to the desktop to take a screenshot and it drops my clocks when I do that.

Any second opinions?

EDIT: Looking at afterburner, my power is around 85% while playing the game so that doesn't even seem close to the 125% I have it set to.

Here is my power info from the cmd. Did this at the desktop hence the 101.7 watt draw.

==============NVSMI LOG==============

Timestamp : Sat Apr 15 14:35:44 2017
Driver Version : 381.65

Attached GPUs : 1
GPU 0000:01:00.0
Power Readings
Power Management : Supported
Power Draw : 101.70 W
Power Limit : 375.00 W
Default Power Limit : 300.00 W
Enforced Power Limit : 375.00 W
Min Power Limit : 150.00 W
Max Power Limit : 375.00 W
Power Samples
Duration : 2.37 sec
Number of Samples : 119
Max : 104.84 W
Min : 79.31 W
Avg : 91.10 W

C:\Program Files\NVIDIA Corporation\NVSMI>
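Side note for anyone checking power this way: rather than reading the full NVSMI log, nvidia-smi also has a query mode (`--query-gpu`). A rough Python sketch; the parsing helper and dict keys are mine, and it assumes a single-GPU CSV line matching the readings above:

```python
import subprocess

def parse_power_csv(line: str) -> dict:
    """Parse one GPU's line from:
      nvidia-smi --query-gpu=power.draw,power.limit --format=csv,noheader
    e.g. '101.70 W, 375.00 W' -> {'draw_w': 101.7, 'limit_w': 375.0}"""
    draw, limit = (field.strip().removesuffix(" W") for field in line.split(","))
    return {"draw_w": float(draw), "limit_w": float(limit)}

def query_power() -> dict:
    """Ask the driver for the current draw and enforced limit."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw,power.limit",
         "--format=csv,noheader"], text=True)
    return parse_power_csv(out.splitlines()[0])

print(parse_power_csv("101.70 W, 375.00 W"))  # {'draw_w': 101.7, 'limit_w': 375.0}
```

Handy for logging draw over a gaming session instead of eyeballing a one-shot snapshot.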


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Arizonian*
> 
> ANNOUNCEMENT
> 
> As of now KedarWolf will be taking over as thread starter. He's also shown interest in making this an [Official] 1080 Ti club by getting a club roster going. Please have patience while he gets things set up; he will let you guys know when you can start adding yourselves to the owners list.
> 
> Thank you KedarWolf


Awesome! Thank you KedarWolf! And the mods!
Quote:


> Originally Posted by *bloodhawk*
> 
> So here if anyone is thinking about Power modding, Shunt mod with LM works, but is rather unreliable sometimes. Soldering a thin wire, works better than that. But the best way is to solder 10Ohm resistors on top of the capacitors next to the power controller IC. That works the best.
> 
> But there is something else to note, there is another power limit, either at the PCIe power connectors or the slot power draw. Still need to look into that.
> 
> Some images -
> 
> 
> 
> http://imgur.com/PLx3I
> 
> 
> Liquid Metal Shun Mod- Power Target at about 105% Max.
> 
> Thin Battery Wires - Power Target at about 75%
> 
> Resistors in Caps - Power Target at 55% Max.


Not everyone can solder lol. And how would that fly RMA-wise? At least liquid metal you can blow-dry off and wipe. What do you mean 105% max for the LM shunt mod? As in you only gain 5% extra power? Because that's wrong; I've seen people get access to 400+ watts. Doesn't mean the card needs that much to run, though.


----------



## bloodhawk

Quote:


> Originally Posted by *SlimJ87D*
> 
> Not everyone can solder lol. And how would that fly RMA-wise? At least liquid metal you can blow-dry off and wipe. What do you mean 105% max for the LM shunt mod? As in you only gain 5% extra power? Because that's wrong; I've seen people get access to 400+ watts. Doesn't mean the card needs that much to run, though.


Definitely agreed and understandable.
I personally can do board-level work, so it won't be a problem, as long as I can find the schematics and the core isn't fried









But I would stay away from using LM; the current and resistance were varying quite a bit.


----------



## KedarWolf

Hey peeps, PM me or mention here if you run a 1080 Ti: how many you run, AIO, water, or stock air, and the clocks you run for 24/7 use.

Starting an owner's list in OP.









Example:

KedarWolf - 1x Gigabyte 1080 Ti FE - EK Water Block, Backplate - 2066 Core 6137 Memory


----------



## warbucks

Warbucks - 1x Nvidia 1080 TI FE - Ek Water Block, Backplate - 2050 Core 6003 Memory


----------



## outofmyheadyo

What is the meaning of this? I thought my OC was stable. Gigabyte Founders 1080 Ti under water, locked @ 1.063v @ 2050; after an hour or so of playing Mass Effect Andromeda it randomly crashed. Temps are like 46C or something.


----------



## sWaY20

Sway20-1x Evga 1080Ti FE- Ek water block and backplate-2063 core 6100 memory


----------



## KingEngineRevUp

Quote:


> Originally Posted by *bloodhawk*
> 
> Definitely agreed and understandable.
> I personally can do board-level work, so it won't be a problem, as long as I can find the schematics and the core isn't fried
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But I would stay away from using LM; the current and resistance were varying quite a bit.


It's already too late; I bought the liquid metal and I'm determined to use it. I bought paint brushes and liquid electrical tape. I'm going to put 2 coats around the resistors and even on the sides of the solder joints to protect them better.

There are more success stories with LM than there are failures, so I'll risk it.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Hey peeps, PM me or mention here if you run a 1080 Ti, how many you run, AIO, water or stock air, and clocks you run for 24/7 use.
> 
> Starting an owner's list in OP.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Example:
> 
> KedarWolf - 1x Gigabyte 1080 Ti FE - EK Water Block, Backplate - 2066 Core 6137 Memory


How about we start a Google Sheet and open it to the public? It always keeps a history, so if a troll tries to delete everything you can just revert it.

I'll create a sheet shortly for you.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *outofmyheadyo*
> 
> 
> 
> What is the meaning of this? I thought my OC was stable. Gigabyte Founders 1080 Ti under water, locked @ 1.063v @ 2050; after an hour or so of playing Mass Effect Andromeda it randomly crashed. Temps are like 46C or something.


This is what happens with an unstable OC.


----------



## outofmyheadyo

How is it unstable all of a sudden after an hour?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *outofmyheadyo*
> 
> How is it unstable all of a sudden after an hour ?


Just because you are running it through benchmarks doesn't make it stability-proof. You have to test games; that's the next step.

Benchmarks run through the same scenes over and over, but real-world gaming throws random calculations at the GPU, and that's the real test.

Witcher 3 can take your benchmark-stable OC and wipe its *** with it, for example.


----------



## MURDoctrine

Quote:


> Originally Posted by *outofmyheadyo*
> 
> 
> 
> What is the meaning of this? I thought my OC was stable. Gigabyte Founders 1080 Ti under water, locked @ 1.063v @ 2050; after an hour or so of playing Mass Effect Andromeda it randomly crashed. Temps are like 46C or something.


Quote:


> Originally Posted by *SlimJ87D*
> 
> This is what happens with an unstable OC.


Nope, that is just Andromeda. The game has a memory issue with the 1080 Ti; it is even mentioned in the driver release notes. If you are stable in every other game and bench, then ignore that error.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *MURDoctrine*
> 
> Nope that is just Andromeda. The game has a memory issue with the 1080ti. It is even mentioned in the driver release notes. If you are stable in every other game and bench then ignore that error.


For his case I hope that's true.


----------



## outofmyheadyo

Quote:


> Originally Posted by *SlimJ87D*
> 
> Just because you are running it through benchmarks doesn't make it stability-proof. You have to test games; that's the next step.
> 
> Benchmarks run through the same scenes over and over, but real-world gaming throws random calculations at the GPU, and that's the real test.
> 
> Witcher 3 can take your benchmark-stable OC and wipe its *** with it, for example.


Yes, I wasn't born yesterday, and this isn't my first rodeo, but it makes no sense that it's perfectly cool and stable for an hour and then crashes for no reason. As it turns out, Mass Effect Andromeda is to blame here, not the card.


----------



## madmeatballs

Yup, it is ME:A. I also see random memory issues in Battlegrounds (PUBG), but just on the lobby screen. Other games and Heaven/SP/3DMark work fine. In ME:A it mostly happens to me when I tab out of the game for too long or when I pause for too long (let's say around 10-20 mins).


----------



## fisher6

@KedarWolf fisher6 - 1x EVGA 1080 Ti FE - EK Water Block, Backplate - 2088 Core 6003 Memory - Strix Bios.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Yes, I wasn't born yesterday, and this isn't my first rodeo, but it makes no sense that it's perfectly cool and stable for an hour and then crashes for no reason. As it turns out, Mass Effect Andromeda is to blame here, not the card.


Not sure why you're taking this personally. This is a forum where people put time into helping one another, and all I was doing was attempting to help you. I have had "stable" OCs before, only to discover they were not stable in certain games.


----------



## alucardis666

Quote:


> Originally Posted by *KedarWolf*
> 
> Hey peeps, PM me or mention here if you run a 1080 Ti, how many you run, AIO, water or stock air, and clocks you run for 24/7 use.
> 
> Starting an owner's list in OP.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Example:
> 
> KedarWolf - 1x Gigabyte 1080 Ti FE - EK Water Block, Backplate - 2066 Core 6137 Memory


Congrats!


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> That's what I'm hoping too!
> 
> And the funny part is when I return my TXp and sell my 6950x + board I'll have *MADE* *~$600*


go ahead, thank me for planting that seed. In the end, you get a bunch of new toys, definitely better perf, higher clocks, a very awesome aorus sammich that is going to look dope as frick, and CHANGE.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> go ahead, thank me for planting that seed. In the end, you get a bunch of new toys, definitely better perf, higher clocks, a very awesome aorus sammich that is going to look dope as frick, and CHANGE.


+Rep


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> +Rep


can't wait to see it when you're through.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> can't wait to see it when youre thru.


Pics will be plenty!


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Hey peeps, PM me or mention here if you run a 1080 Ti, how many you run, AIO, water or stock air, and clocks you run for 24/7 use.
> 
> Starting an owner's list in OP.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Example:
> 
> KedarWolf - 1x Gigabyte 1080 Ti FE - EK Water Block, Backplate - 2066 Core 6137 Memory
> 
> 
> 
> How about we start a Google Sheet and open it to the public? It always keeps a history, so if a troll tries to delete everything you can just revert it.
> 
> I'll create a sheet shortly for you.
Click to expand...

No, I'd rather not have a public document for an owner's list; too many trolls and stuff might mess with it.

I WILL add you to the shared link of the OP doc though. And Slackaveli.









I'll PM you both an editable link; just don't share it with others unless you clear it with me first, please.


----------



## Jaysend

Jaysend
2x EVGA FE in SLI under water

Single GPU:
Core: 2067MHz
Memory: 5998MHz

SLI:
Core: 2012MHz
Mem: 5770MHz
Quote:


> Originally Posted by *KedarWolf*
> 
> Hey peeps, PM me or mention here if you run a 1080 Ti, how many you run, AIO, water or stock air, and clocks you run for 24/7 use.
> 
> Starting an owner's list in OP.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Example:
> 
> KedarWolf - 1x Gigabyte 1080 Ti FE - EK Water Block, Backplate - 2066 Core 6137 Memory


Quote:


> Originally Posted by *krutoydiesel*
> 
> You didn't accidentally drop any screws or whatnot or gauge any of the components did you?


I hope so, because that would be a tangible explanation, but sadly my forensics didn't turn up anything...


----------



## joder

Quote:


> Originally Posted by *Arizonian*
> 
> ANNOUNCEMENT
> 
> As of now KedarWolf will be taking over as thread starter. He's also shown interest in making this an [Official] 1080 Ti club by getting a club roster going. Please have patience while he gets things set up; he will let you guys know when you can start adding yourselves to the owners list.
> 
> Thank you KedarWolf


Thank you!


----------



## joder

Quote:


> Originally Posted by *KedarWolf*
> 
> No, I'd rather not have a public document for an owner's list, too many trolls and stuff might mess with it.
> 
> I WILL add you to the shared link of the OP doc though. And Slackaveli.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll PM you both an editable link, just don't share it with others unless you clear it with myself, please.


Thank you for taking this over. It will be nice to have a list of BIOSes in the OP.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> No, I'd rather not have a public document for an owner's list, too many trolls and stuff might mess with it.
> 
> I WILL add you to the shared link of the OP doc though. And Slackaveli.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll PM you both an editable link, just don't share it with others unless you clear it with myself, please.


Trolls can't really mess with it; a history will always exist and you can always revert it.


----------



## joder

Quote:


> Originally Posted by *KedarWolf*
> 
> No, I'd rather not have a public document for an owner's list, too many trolls and stuff might mess with it.
> 
> I WILL add you to the shared link of the OP doc though. And Slackaveli.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll PM you both an editable link, just don't share it with others unless you clear it with myself, please.


Quote:


> Originally Posted by *SlimJ87D*
> 
> Trolls can't mess with it, a history will always exist and you can always revert it back.


I would have to agree with him, KedarWolf, if you are remotely open to doing it this way. It will make your life easier down the road too.


----------



## Pandora's Box

@KedarWolf - Pandora's Box - 1x MSI GTX 1080 Ti Gaming X - 2050 Core 6000 Memory.


----------



## KedarWolf

Quote:


> Originally Posted by *joder*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> No, I'd rather not have a public document for an owner's list, too many trolls and stuff might mess with it.
> 
> I WILL add you to the shared link of the OP doc though. And Slackaveli.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll PM you both an editable link, just don't share it with others unless you clear it with myself, please.
> 
> 
> 
> Thank you for taking this over. It will be nice to have a list of BIOSs in the OP.
Click to expand...

Official BIOSes can all be found at the TechPowerUp link.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Official BIOS's are all found in the TechPowerUp link.


https://docs.google.com/spreadsheets/d/104MyVbmRX5folwK7cQoKxkipJrNKYLPbnVFdQbe3ITY/edit?usp=sharing

If you want, I can modify it so only you can edit it, but others can make comments and you can fill them in.

No one else can edit, but they can comment and you can just put their information in.


----------



## Pandora's Box

Quote:


> Originally Posted by *SlimJ87D*
> 
> https://docs.google.com/spreadsheets/d/104MyVbmRX5folwK7cQoKxkipJrNKYLPbnVFdQbe3ITY/edit?usp=sharing
> 
> If you want, I can modify it so only you can edit it, but others can make comments and you can fill them in.
> 
> No one else can edit, but they can comment and you can just put their information in.


People can edit it


----------



## outofmyheadyo

I have the Gigabyte Founders Edition and tried the Strix, Aorus, and Gaming X BIOSes; it still smashes the power limit even when running at around 1.063v. It's silly. Is there really no way to avoid the shunt mod? I might go grab some CLU tomorrow and get it over with; the only problem is it's the 3rd time draining my loop this weekend.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Pandora's Box*
> 
> People can edit it


Yeah, I have it open right now. I watched you edit and type your information in lol.

But I can restrict it to KedarWolf and a select few users and just open it for "comments" only.

So comments will show up on the side, and the users that are allowed to edit it can add people.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *outofmyheadyo*
> 
> I have the Gigabyte Founders Edition and tried the Strix, Aorus, and Gaming X BIOSes; it still smashes the power limit even when running at around 1.063v. It's silly. Is there really no way to avoid the shunt mod? I might go grab some CLU tomorrow and get it over with; the only problem is it's the 3rd time draining my loop this weekend.


I've given up. Going for the shunt mod with liquid electrical tape to protect everything around it and the solder joints itself.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Pandora's Box*
> 
> People can edit it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah, I have it open right now. I watched you edit and type your information in lol.
> 
> But I can restrict it to KedarWolf and a select few users and just open it for "comments" only.
> 
> So comments will show up on the side, and the users that are allowed to edit it can add people.
Click to expand...

Sure, that would work.









You need my email address to give me editing permissions?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Sure, that would work.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You need my email address to give me editing permissions?


Yeah, just PM me. I have updated the list and there's already 10 of us on it.


----------



## HeliXpc

What's the expected average overclock for the FE card?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *HeliXpc*
> 
> Whats the expected average overclock for the FE card?


Well, take a look at the sheet we just made and you'll get an idea.


----------



## joder

Quote:


> Originally Posted by *KedarWolf*
> 
> Official BIOS's are all found in the TechPowerUp link.


Thanks - I clearly didn't thoroughly read the OP. My bad


----------



## joder

Quote:


> Originally Posted by *SlimJ87D*
> 
> Well take a look at the sheet we just made and you'll get an idea.


Might I suggest doing a google survey form that feeds into the google sheet? Less work for both of you.

Edit: I can get one setup to start out.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *joder*
> 
> Might I suggest doing a google survey form that feeds into the google sheet? Less work for both of you.
> 
> Edit: I can get one setup to start out.


That's not a bad idea, whatever helps us.

Can the survey be edited? You'll have to keep up with all the latest cards that come out on it.


----------



## KingEngineRevUp

I have locked the sheet so only KedarWolf and I can edit it, but I can add more owners later. *Please fill in a comment so you can be put on the list*.

https://docs.google.com/spreadsheets/d/104MyVbmRX5folwK7cQoKxkipJrNKYLPbnVFdQbe3ITY/edit#gid=0


----------



## Pandora's Box

pretty sure there's a way to integrate the sheet into the first post of this thread. would look a lot better than the list like it is currently.


----------



## PasK1234Xw

So I got my Hybrid and went from 2050 to 2088.

On air, anything over 2050 would lock up, but now at 2088 with the curve locked at 1.093v I very rarely hit the power limit in games, and when I do it only drops to 2070 for a few seconds, then back to 2088.
Also, what is weird: my card does not downclock at all with temps from 30-55C (the max temp I've hit so far).

I had Hybrids on my 1080 SLI setup, but I forgot how much better these run when kept cool; voltage and everything is rock stable once set.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Pandora's Box*
> 
> pretty sure there's a way to integrate the sheet into the first post of this thread. would look a lot better than the list like it is currently.


So far I have a preview of what we're shooting for. Waiting for Kedarwolf to get back to me.


Spoiler: Warning: Spoiler!



*Introducing the Nvidia GTX 1080 Ti*



The GeForce® GTX 1080 Ti is NVIDIA's new flagship gaming GPU, based on the NVIDIA Pascal™ architecture. The latest addition to the ultimate gaming platform, this card is packed with extreme gaming horsepower, next-gen 11 Gbps GDDR5X memory, and a massive 11 GB frame buffer.

*Official Owners Club and Overclock Spreadsheet*


Spoiler: Warning: Spoiler!



Please use the Google Sheet and make a comment on whether you are using one or more 1080 Tis and how many you run in your PC. The sheet is self-explanatory. Starting a list of us owners. Yes, it's redundant that you need to tell me officially, but some have moved to the Xp, so if you can mention it here again it would help.

Below is the official owners club list along with their overclocking data:
https://docs.google.com/spreadsheets/d/104MyVbmRX5folwK7cQoKxkipJrNKYLPbnVFdQbe3ITY/edit?usp=sharing



*How to Flash a Bios*


Spoiler: Warning: Spoiler!



https://www.techpowerup.com/vgabios/?architecture=&manufacturer=&model=GTX+1080+Ti&interface=&memType=&memSize=&since=

http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti/0_20

Today I'm making a video on how exactly you flash a different BIOS on your 1080 Ti. The Gigabyte Extreme and Aorus BIOSes do NOT work right with other cards; it would have to be one of the other ones, for an FE etc.

How to do a custom voltage curve: see the thread for a detailed explanation if you prefer not to watch a video (I hate watching YouTube videos on how to do this stuff, but it was requested by some).









The link goes into a lot of detail on exactly the best way to get an overclock with little or no power limiting using a custom voltage curve.

http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20
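Conceptually, the curve method boils down to clamping every point at or above your chosen voltage to one target clock, so the card never requests more voltage than the lock point. A toy Python sketch of that transformation; the list-of-pairs representation is illustrative only, not Afterburner's actual data format:

```python
def flatten_curve(curve, v_lock, f_target):
    """Clamp every (voltage, clock) point at or above v_lock to f_target,
    and cap lower points at f_target, producing a flat ceiling so the
    card never boosts past the locked voltage."""
    return [(v, f_target if v >= v_lock else min(f, f_target))
            for v, f in curve]

points = [(0.800, 1800), (1.000, 1950), (1.063, 2050), (1.093, 2100)]
print(flatten_curve(points, 1.063, 2050))
# -> [(0.8, 1800), (1.0, 1950), (1.063, 2050), (1.093, 2050)]
```

That flat ceiling is why the method sidesteps most power limiting: the card holds the target clock without ever ramping to the top voltage bins.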







*How to get an unlocked voltage slider.*


Spoiler: Warning: Spoiler!



http://forums.guru3d.com/showpost.php?p=5412373&postcount=216

Go to C:\Program Files (x86)\MSI Afterburner\
Open MSIAfterburner.oem2 in a text editor, add the card to the third-party database yourself at the very bottom of all the text in the file, and save.

[VEN_10DE&DEV_1B06&*SUBSYS_120F10DE*&REV_??]
VDDC_Generic_Detection = 1

You will notice the bolded part, the SUBSYS ID. This can change per BIOS/brand. In Afterburner just click the 'I' (information) button to read out your proper string, and replace the bolded part with what is listed under the information button (if it differs).

In Afterburner, select third-party voltage control mode in properties; adding the following will do the trick.

Just replace the text between '1B06&' and '&REV' with what the 'I' (Info) button in Afterburner shows, and leave the rest as is.

So, for example, on an EVGA Ti FE the 'I' button says:

[VEN_10DE&DEV_1B06&*SUBSYS_63903842*&REV_A1&BUS_2&DEV_0&FN_0]

So for EVGA in the .oem2 file in the MSI Programs folder right at the very bottom of all the text in the file you add:

[VEN_10DE&DEV_1B06&*SUBSYS_63903842*&REV_??]
VDDC_Generic_Detection = 1

Reference cards, and most others, should be:

[VEN_10DE&DEV_1B06&*SUBSYS_120F10DE*&REV_??]
VDDC_Generic_Detection = 1

*I'm adding .zip files for reference cards and EVGA here.

Right-click on the zip, choose Properties, then Unblock, then unzip.*
*Try this one first.*

WildCard.zip 1k .zip file


*If not, this one.*

MSIAfterburner.zip 1k .zip file


*EVGA, this one if first one doesn't work.*

EVGA.zip 1k .zip file


*Afterburner select third party voltage control mode in properties.*

If you click the cog icon, make sure Enable Hardware Control and Monitoring, Enable Low-Level IO Driver, Enable Low-Level Hardware Access Interface 'User Mode', Restore Settings After Suspended Mode, Unlock Voltage Control 'Third Party', and Unlock Voltage Monitoring are all checked.

Download Afterburner here and replace its '.oem2' file with the one extracted from WildCard.zip.

http://download.msi.com/uti_exe/vga/MSIAfterburnerSetup.zip Direct download link.
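If you'd rather generate the two-line entry than hand-edit it, here's a minimal sketch (the helper name is mine; note that the asterisks in the guide above just mark the bolded SUBSYS portion and are not literal characters in the file):

```python
def oem2_entry(subsys_id: str) -> str:
    """Build the third-party detection entry to append to
    MSIAfterburner.oem2, given the SUBSYS ID shown under
    Afterburner's 'I' (information) button."""
    return (f"[VEN_10DE&DEV_1B06&SUBSYS_{subsys_id}&REV_??]\n"
            "VDDC_Generic_Detection = 1\n")

# e.g. the EVGA FE example from the guide above:
print(oem2_entry("63903842"))
```

Swap in whatever SUBSYS ID your own card reports; the VEN/DEV pair above is the 1080 Ti's, per the guide.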


----------



## KingEngineRevUp

Quote:


> Originally Posted by *PasK1234Xw*
> 
> So I got my Hybrid and went from 2050 to 2088.
> 
> On air, anything over 2050 would lock up, but now at 2088 with the curve locked at 1.093v I very rarely hit the power limit in games, and when I do it only drops to 2070 for a few seconds, then back to 2088.
> Also, what is weird: my card does not downclock at all with temps from 30-55C (the max temp I've hit so far).
> 
> I had Hybrids on my 1080 SLI setup, but I forgot how much better these run when kept cool; voltage and everything is rock stable once set.


You're on an FE? Superposition is a power-hungry son of a B... If you're on an FE, you'll definitely hit power limiting in it.

I hit power limits doing [email protected] 1.000V!


----------



## PasK1234Xw

Quote:


> Originally Posted by *SlimJ87D*
> 
> You're on a FE? Superposition is a power hungry son of a B... If you're on a FE you'll definitely hit power limiting on it.
> 
> I hit power limits doing [email protected] 1.000V!


I'm not counting benchmarks; gaming is all I care about.


----------



## RavageTheEarth

Quote:


> Originally Posted by *RavageTheEarth*
> 
> So I hear you guys talk about perfcap a lot, but I'm not sure what it is. How does this GPU-Z monitoring look? This is while playing Tom Clancy Wildlands. The clock speeds aren't correct because I had to go to the desktop to screenshot GPU-Z. My clock speed of 2062 held throughout playing.


Quote:


> Originally Posted by *SlimJ87D*
> 
> Looks like you need to unlock your power slider all the way to the max and also the voltage slider.
> 
> You pretty much have everything right now. Thermal throttling, power limiting, voltage limiting.


Quote:


> Originally Posted by *RavageTheEarth*
> 
> Hmm. My power slider is all the way at 125% which is the max. I don't see how I'm getting thermal throttling. The GPU hasn't gone over 51C while playing and I haven't dropped any bins at all. It's held my overclock of 2063 the entire time.
> 
> The reason you see the clock speeds like that is because I had to alt tab to the desktop to take a screenshot and it drops my clocks when I do that.
> 
> Any second opinions?
> 
> EDIT: Looking at afterburner, my power is around 85% while playing the game so that doesn't even seem close to the 125% I have it set to.
> 
> Here is my power info from the cmd. Did this at the desktop hence the 101.7 watt draw.
> 
> ==============NVSMI LOG==============
> 
> Timestamp : Sat Apr 15 14:35:44 2017
> Driver Version : 381.65
> 
> Attached GPUs : 1
> GPU 0000:01:00.0
> Power Readings
> Power Management : Supported
> Power Draw : 101.70 W
> Power Limit : 375.00 W
> Default Power Limit : 300.00 W
> Enforced Power Limit : 375.00 W
> Min Power Limit : 150.00 W
> Max Power Limit : 375.00 W
> Power Samples
> Duration : 2.37 sec
> Number of Samples : 119
> Max : 104.84 W
> Min : 79.31 W
> Avg : 91.10 W
> 
> C:\Program Files\NVIDIA Corporation\NVSMI>


I'm going to bump this because it was buried after @KedarWolf was announced as the new overseer of this thread (congrats)!! I'm just a little confused as to what is going on with my Perfcap. The card is averaging 51C in gaming and holds my overclock perfectly fine. Set the power limit to 125% (375w) in MSI afterburner and run a custom voltage curve for 2063 @ 1.043v. The card never drops bins. So what exactly is going on here? I want to learn. I feel like I have to learn everything over again with this 1080 Ti!
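The readout quoted above comes from NVIDIA's nvidia-smi tool (`nvidia-smi -q -d POWER`). If you want to log those numbers over time rather than eyeball the console, the fields are easy to scrape; a minimal sketch, assuming the same output format as the log above:

```python
import re

# Sample of the "nvidia-smi -q -d POWER" readout posted above (format assumed
# from the log in this thread; field names can differ between driver versions).
SAMPLE = """\
Power Readings
    Power Management : Supported
    Power Draw : 101.70 W
    Power Limit : 375.00 W
    Default Power Limit : 300.00 W
    Enforced Power Limit : 375.00 W
"""

def read_watts(text, field):
    """Pull a single wattage value (e.g. 'Power Draw') out of the readout."""
    m = re.search(rf"{re.escape(field)}\s*:\s*([\d.]+)\s*W", text)
    return float(m.group(1)) if m else None

draw = read_watts(SAMPLE, "Power Draw")    # 101.7
limit = read_watts(SAMPLE, "Power Limit")  # 375.0
print(f"Drawing {draw:.1f} W of a {limit:.0f} W limit ({100 * draw / limit:.0f}%)")
```

Pipe the live tool's output in instead of `SAMPLE` and you can log draw vs. limit per second while a benchmark runs.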


----------



## mtbiker033

Guys, quick question: I was using the curve method shown in the thread created by KedarWolf (thanks brah!) and had good results, but how do you save your curve? I was able to save the slider settings that corresponded with the curve, but I see pics on here of some members' MSI AB where it says "curve" on the slider. Is there some way of confirming or switching to curve that does that? I hit Ctrl+F to bring it up; is there a keyboard command that saves it?


----------



## outofmyheadyo

Quote:


> Originally Posted by *SlimJ87D*
> 
> You're on a FE? Superposition is a power hungry son of a B... If you're on a FE you'll definitely hit power limiting on it.
> 
> I hit power limits doing [email protected] 1.000V!


Judging by the curve, what you have done is limit yourself to 2088MHz, but voltage is uncapped. Select the point above 1.000v and press Ctrl+L; that will cap it. Close the curve and apply. 2088 is really high for 1.000v, you shouldn't power limit at that.
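For anyone new to the curve editor, the Ctrl+L lock is conceptually just a flatten: every point above the locked voltage gets clamped to the frequency at that point, so the card never requests more voltage. A tiny sketch of the arithmetic (this is not Afterburner's API, and the point values are made up for illustration):

```python
def flatten_curve(points, cap_voltage):
    """points: list of (voltage, mhz) pairs, ascending by voltage.
    Returns the curve with every point above cap_voltage flattened to the
    frequency at the cap, so the card stops boosting voltage past it."""
    cap_mhz = max(mhz for v, mhz in points if v <= cap_voltage)
    return [(v, mhz if v <= cap_voltage else cap_mhz) for v, mhz in points]

# Hypothetical curve points for illustration:
curve = [(0.950, 1999), (1.000, 2088), (1.050, 2101), (1.093, 2126)]
print(flatten_curve(curve, 1.000))
# Every point above 1.000 V now sits at 2088 MHz, i.e. a "locked" curve.
```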


----------



## KingEngineRevUp

Quote:


> Originally Posted by *RavageTheEarth*
> 
> I'm going to bump this because it was buried after @KedarWolf was announced as the new overseer of this thread (congrats)!! I'm just a little confused as to what is going on with my Perfcap. The card is averaging 51C in gaming and holds my overclock perfectly fine. Set the power limit to 125% (375w) in MSI afterburner and run a custom voltage curve for 2063 @ 1.043v. The card never drops bins. So what exactly is going on here? I want to learn. I feel like I have to learn everything over again with this 1080 Ti!


Try putting the voltage to 1.050, 1.062 and onward.

I believe your card wants more voltage + power to run the core to its true potential.

Lower 1.043v to 2050 and keep 2063 at 1.050v, and continue on. Let us know what the perf cap shows after that.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Judging by the curve, what you have done is limit yourself to 2088MHz, but voltage is uncapped. Select the point above 1.000v and press Ctrl+L; that will cap it. Close the curve and apply. 2088 is really high for 1.000v, you shouldn't power limit at that.


That was just me testing the power limiting of the card. I purposely did a "1 Volt challenge."

I play games at 2088 Mhz @ 1.050v and rarely hit a power limit. Superposition is just power hungry, it's a stress test in itself. Not realistic for game purposes.


----------



## joder

Quote:


> Originally Posted by *SlimJ87D*
> 
> That's not a bad idea, whatever helps us.
> 
> Can the survey be edited? You'll have to keep up with all the latest cards that come out on it.


Sent you the details and a follow-up for you and @KedarWolf to consider. Another route is to require a GPU-Z validation link to help weed out trolls. Just a thought.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Pandora's Box*
> 
> People can edit it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah I have it open right now. I watched you edit and type your information in lol.
> 
> But I can close it to Kedarwolf, and a select few users and just open it for "comments" only.
> 
> So comments will show on on the side and the users that are allowed to edit it can add people.
Click to expand...

Thanks for your help, really improved the OP!!


----------



## KedarWolf

Can all of you who joined the Owner's Club please provide a GPU-Z validation link as well?


----------



## KedarWolf

Quote:


> Originally Posted by *mtbiker033*
> 
> guys, quick question, I was using the curve method shown in the thread created by KedarWolf (thanks brah!) and had good results but how to you like save your curve? I mean I was able to save the settings on the sliders that corresponded with the curve but I see pics on here of some members MSI AB And I see where it says "curve" on the slider. Is there some way of like confirming or switching to curve that does that? Like I hit cntrl-F to bring it up is there a keyboard command that saves it?


Just hit the 'Apply' checkmark right in Afterburner. I don't know of any key command.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Pandora's Box*
> 
> pretty sure there's a way to integrate the sheet into the first post of this thread. would look a lot better than the list like it is currently.
> 
> 
> 
> So far I have a preview of what we're shooting for. Waiting for Kedarwolf to get back to me.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> *Introducing the Nvidia GTX 1080 Ti*
> 
> 
> 
> The GeForce® GTX 1080 Ti is NVIDIA's new flagship gaming GPU, based on the NVIDIA Pascal™ architecture. The latest addition to the ultimate gaming platform, this card is packed with extreme gaming horsepower, next-gen 11 Gbps GDDR5X memory, and a massive 11 GB frame buffer.
> 
> *Official Owners Club and Overclock Spreadsheet*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Please use the Google sheet and leave a comment on whether you are using one or more 1080 Ti's and how many you run in your PC. The sheet is self-explanatory. Starting a list of us owners. Yes, it's redundant that you need to tell me officially, but some have moved to the Xp, so if you can mention it here again it would help.
> 
> Below is the official owners club list along with their overclocking data:
> https://docs.google.com/spreadsheets/d/104MyVbmRX5folwK7cQoKxkipJrNKYLPbnVFdQbe3ITY/edit?usp=sharing
> 
> 
> 
> *How to Flash a Bios*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> https://www.techpowerup.com/vgabios/?architecture=&manufacturer=&model=GTX+1080+Ti&interface=&memType=&memSize=&since=
> 
> http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti/0_20
> 
> Today I'm making a video on how exactly to flash a different BIOS on your 1080 Ti. The Gigabyte Extreme and Aorus BIOSes do NOT work right with other cards; it would have to be one of the other ones, for an FE etc.
> 
> How to do a custom voltage curve: see the thread for a detailed explanation if you prefer not to watch a video (I hate watching YouTubes on how to do this stuff, but it was requested by some).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The link goes into a lot of detail on exactly the best way to get an overclock with little or no power limiting using a custom voltage curve.
> 
> http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20
> 
> 
> 
> 
> 
> Great OP. Tweaked it a bit.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *How to get an unlocked voltage slider.*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://forums.guru3d.com/showpost.php?p=5412373&postcount=216
> 
> Go to c:\Program Files (x86)\MSI AfterBurner\
> Open MSIAfterburner.oem2 in a text editor, add the card to the third-party database yourself by pasting the lines below at the very bottom of all the text in the file, and save.
> 
> [VEN_10DE&DEV_1B06&*SUBSYS_120F10DE*&REV_??]
> VDDC_Generic_Detection = 1
> 
> You will notice the part bolded, the Subsys_ID. This can change per BIOS/Brand. In AfterBurner just click the 'I' (information) button, to read out your proper string and replace the bolded part with what is listed under the information button (if it differs).
> 
> In Afterburner, select third-party voltage control mode in properties; adding the following will do the trick.
> 
> Just replace the text between '1B06&' and '&REV' with the value shown under the 'I' (Info) button in Afterburner and leave the rest as is.
> 
> So for example, on an EVGA Ti FE the 'I' button says:
> 
> [VEN_10DE&DEV_1B06&*SUBSYS_63903842*&REV_A1&BUS_2&DEV_0&FN_0]
> 
> So for EVGA in the .oem2 file in the MSI Programs folder right at the very bottom of all the text in the file you add:
> 
> [VEN_10DE&DEV_1B06&*SUBSYS_63903842*&REV_??]
> VDDC_Generic_Detection = 1
> 
> Reference cards and most others should be:
> 
> [VEN_10DE&DEV_1B06&*SUBSYS_120F10DE*&REV_??]
> VDDC_Generic_Detection = 1
> 
> *I'm adding .zip files for reference cards and EVGA here.
> 
> Right click on zip, choose Properties, and Unblock, then unzip.
> 
> *
> *Try this one first.*
> 
> WildCard.zip 1k .zip file
> 
> 
> *If not, this one.*
> 
> MSIAfterburner.zip 1k .zip file
> 
> 
> *EVGA, this one if first one doesn't work.*
> 
> EVGA.zip 1k .zip file
> 
> 
> *Afterburner select third party voltage control mode in properties.*
> 
> If you click the cog icon, make sure Enable Hardware Control and Monitoring, Enable Low level IO Driver, Enable Low Level Hardware Access Interface 'Use Mode', Restore Settings After Suspended Mode, Unlock Voltage Control 'Third Party' and Unlock Voltage Monitoring are all checked.
> 
> Download it here and replace the '.oem2' file with the extracted WildCards.zip.
> 
> http://download.msi.com/uti_exe/vga/MSIAfterburnerSetup.zip Direct download link.
Click to expand...
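The .oem2 edit described in the quoted guide boils down to appending two lines to a text file. A small Python sketch of the same thing; the install path and the SUBSYS_120F10DE reference-card ID are the examples from the post, so substitute the ID your own Afterburner 'I' button reports:

```python
from pathlib import Path

def add_voltage_entry(oem2_path, subsys_id):
    """Append a VDDC_Generic_Detection block for the given SUBSYS ID to
    MSIAfterburner.oem2, skipping the append if the ID is already present."""
    entry = (f"\n[VEN_10DE&DEV_1B06&SUBSYS_{subsys_id}&REV_??]\n"
             "VDDC_Generic_Detection = 1\n")
    path = Path(oem2_path)
    text = path.read_text()
    if subsys_id in text:  # don't add the same card twice
        return False
    path.write_text(text + entry)
    return True

# Example call (path and ID from the post; run as admin on a real install):
# add_voltage_entry(r"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.oem2",
#                   "120F10DE")
```

Back the file up first; Afterburner overwrites its own config on reinstall.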


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Just hit the 'Apply' checkmark right in Afterburner. I don't know of any key command.


I added a column at the end for GPU-Z validation.


----------



## Slackaveli

yay, a proper owner's club!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> yay, a proper owner's club!


Yes, this is so much better... Now hopefully people can get the questions answered easier.


----------



## feznz

Quote:


> Originally Posted by *SlimJ87D*
> 
> I've given up. Going for the shunt mod with liquid electrical tape to protect everything around it and the solder joints itself.


With resistors? I recall something about how using solder only will create no resistance and lock the card at base clock.


----------



## bloodhawk

Quote:


> Originally Posted by *feznz*
> 
> with resistors? I recall something about using solder only will create no resistance and lock the card on base clock.


If you do it like this, then the 10Ohm resistors work. BUT if you are doing the shunt mod, you need to try and drop the resistance a little bit. (Do not completely short/bypass them)

http://i.imgur.com/QTZqNJi.jpg
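The "drop the resistance a little bit, don't short it" advice comes straight from how the card meters power: it reads the voltage drop across tiny shunt resistors, so reported wattage scales with the shunt's effective resistance. A quick sketch of that arithmetic; the 5 milliohm stock value is an assumption for illustration, not a measured figure for this PCB:

```python
# The card under-reports power in proportion to how far the effective shunt
# resistance drops. A dead short would read ~0 W and make the card misbehave,
# which is why the mod only lowers the resistance slightly.

def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

R_STOCK = 0.005  # ohm; assumed stock shunt value, for illustration only

# Example: paralleling the shunt with 20 milliohm drops it to 4 milliohm.
r_eff = parallel(R_STOCK, 0.020)
fraction = r_eff / R_STOCK  # what the card now reports relative to reality
print(f"Effective shunt: {r_eff * 1000:.1f} mOhm -> card reports "
      f"{fraction:.0%} of the power it really draws")
# So after this mod, a '300 W' reading is really 300 / 0.8 = 375 W.
```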


----------



## mshagg

mshagg - 1x Zotac 1080Ti FE - EK Waterblock + backplate - 2050Mhz @ 1.031V 5897 Memory - Stock

https://www.techpowerup.com/gpuz/details/h4aad


----------



## KedarWolf

Quote:


> Originally Posted by *Nico67*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> This is what I was running TimeSpy at to get that power draw. 30 FPS. Only scene 2.
> 
> 
> 
> It's a bit much to ask, but it would be nice to see how much you can draw without power limiting. Just run the same test but keep lowering the clocks until it doesn't limit anymore. Remember it's not about how high you clock or score, I'm just interested in the power.
Click to expand...

It's only drawing that much power because of the custom settings I'm using to put a huge load on the card and max out the power draw.

But I'll drop it to 2050 core, 1.000v and see what kind of power limiting I get on a normal TimeSpy run. If I recall right, it still power limits a tiny bit.


----------



## RavageTheEarth

Quote:


> Originally Posted by *SlimJ87D*
> 
> Try putting the voltage to 1.050, 1.062 and onward.
> 
> I believe your card wants more voltage + power to run the core to its true potential.
> 
> lower 1.043 to 2050 and keep 2063 at 1.050 and continue on. Let us know what the perf cap shows after that.


So basically just move 2063 from 1.043v to 1.05v and then adjust the rest of the curve below 1.043? Thanks for the response!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *feznz*
> 
> with resistors? I recall something about using solder only will create no resistance and lock the card on base clock.


No, I'm going to use CLU on top of the resistors but first let 2 coats of liquid electrical tape dry around the components and solder joints of the resistors.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *RavageTheEarth*
> 
> So basically just move 2063 from 1.043v to 1.05v and then adjust the rest of the curve below 1.043? Thanks for the response!


You might not even have to adjust the other ones, just move 1.043v down to 2050 so your next highest bin is 2063 @ 1.050v. If you're still getting voltage perf cap, then try 1.062 but I think you'll be good with 1.050v.

My card likes [email protected] I still get some minor power limiting and I end up at [email protected]


----------



## KedarWolf

Quote:


> Originally Posted by *Nico67*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> This is what I was running TimeSpy at to get that power draw. 30 FPS. Only scene 2.
> 
> 
> 
> It's a bit much to ask, but it would be nice to see how much you can draw without power limiting. Just run the same test but keep lowering the clocks until it doesn't limit anymore. Remember it's not about how high you clock or score, I'm just interested in the power.
Click to expand...

I'm having issues running TimeSpy with Afterburner where it jumps from 1.000v to 1.062v on test two and power limits even though I have it in a flat line on Afterburner after 1.000v.

So I did a Unigine Superposition 1080p Extreme at 1.075 v 2088 core and had zero power limiting.

GPU-ZSensorLog.txt 30k .txt file


----------



## gavros777

Are EVGA Precision and MSI Afterburner stable these days?
They used to crash my games all the time back in 2015 when I was using them with the Titan X Maxwell.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm having issues running TimeSpy with Afterburner where it jumps from 1.000v to 1.062v on test two and power limits even though I have it in a flat line on Afterburner after 1.000v.
> 
> So I did a Unigine Superposition 1080p Extreme at 1.075 v 2088 core and had zero power limiting.
> 
> GPU-ZSensorLog.txt 30k .txt file


Yeah, Timespy doesn't like Afterburner. It kind of just seizes up; voltages and clocks go everywhere.

And for whatever reason, it takes Timespy like 60 seconds to initiate a benchmark on my PC.

I like Heaven as a benchmark for realistic game performance, and Superposition more as a stress test + benchmark.


----------



## OneCosmic

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm having issues running TimeSpy with Afterburner where it jumps from 1.000v to 1.062v on test two and power limits even though I have it in a flat line on Afterburner after 1.000v.
> 
> So I did a Unigine Superposition 1080p Extreme at 1.075 v 2088 core and had zero power limiting.
> 
> GPU-ZSensorLog.txt 30k .txt file


Time Spy GPU test 2 is very power demanding; that's why your clocks are jumping so much during it. It is actually the 2nd most power-demanding test I know of; 1st is Furmark.


----------



## OneCosmic

Quote:


> Originally Posted by *SlimJ87D*
> 
> No, I'm going to use CLU on top of the resistors but first let 2 coats of liquid electrical tape dry around the components and solder joints of the resistors.


Show us photos of how it looks when it's done.

What are your clocks and TDP when running Furmark in GPU-Z and MSI AB?


----------



## Slackaveli

Quote:


> Originally Posted by *OneCosmic*
> 
> Time Spy GPU test 2 is very power demanding, that's why are clocks jumping so much during it. It is actually the 2nd most power demanding test i know, 1st is Furmark.


Well, that was before SuPo came out. Now SuPo 8K optimized, or 8K with the extreme shaders, that's all power limit; I spike to ~400w in that, at like 10 fps lol.


----------



## ishbu

My take on SLI: it sure doesn't keep the top card cool, because its fan is blocked. I couldn't wait, so I got FE cards. Wish I had gone ATX, not mATX; then maybe temps would be better.


----------



## OneCosmic

Quote:


> Originally Posted by *Slackaveli*
> 
> well, before SuPo came out. Now supo 8k optimized, or 8k with the extreme shaders, that's all power limit, i use ~400w in that in spikes, like 10 fps lol.


400W for card only or total system consumption? Time Spy GPU test 2 dips to 1924MHz for a sec at one point every time, SuPo 1949MHz also for a sec, so Time Spy wins for me


----------



## Slackaveli

Quote:


> Originally Posted by *OneCosmic*
> 
> 400W for card only or total system consumption? Time Spy GPU test 2 dips to 1924MHz for a sec at one point every time, SuPo 1949MHz also for a sec, so Time Spy wins for me


The card only, but to be clear, I mean it spikes to that; it runs mostly around 340w in that test. It's crazy. I am on the Aorus, remember, it's got a 375w limit.

Hey, guys, here's the Aorus extreme teardown....


----------



## KingEngineRevUp

Quote:


> Originally Posted by *OneCosmic*
> 
> Show us photos of how it looks like when it's done
> 
> 
> 
> 
> 
> 
> 
> What are your clocks and TDP when running Furmark in GPU-Z and MSI AB?


Furmark seems to just pull a lot of power, but voltage and clocks stay the same.

In Superposition, clocks will drop to whatever I set them to in my voltage curve. My voltages will be everywhere from 0.993V to 1.050V due to power limiting.

Below is a 1 volt challenge I did where I locked my highest clock at 1 volt, which is 2088 Mhz at 1.000v.

There's a lot of power limiting, so the card isn't truly running at 2088 Mhz at 1.000v. It's just running a gimped 2088 Mhz, not enough to crash. But I suspect that with enough power draw, 2088 Mhz wouldn't be stable at 1.000v, because it isn't during gaming.



I am stable at 2088 Mhz @ 1.050v when gaming or running heaven benchmark with occasional Power limits which drop me to 2076 @ 1.031v-1.043v



*In theory*

The shunt mod should allow me to do 2127 @ 1.093v if I can obtain 400 Watts of power.

But I'm afraid of running the card at 400 Watts; I'll settle for 375 Watts if that is considered safe.

I did the math: if 80% is 300 watts with the shunt mod, then 100% should get me to 375 Watts, which sounds safe to me. If not, then I'll settle for 350 Watts, which would be 93%.

So we'll see. My CLU comes tomorrow. I already have some thermal pads, liquid electrical tape and other stuff to modify my card tomorrow. It will probably take 2 hours.

*What I will need to prepare for*

My card is about to receive anywhere from 50 to 75 Watts extra. That's a lot of heat. My AIO is going to get much hotter; the liquid might go from 40C to 50C, which is scary.

I'm going to put some thermal pads on my backplate and have some of the heat absorbed and distributed out onto my backplate.

I'm going to also apply CLU to my GPU die and protect the components around the die with liquid electrical tape.
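The "80% is 300 watts, so 100% is 375" math above is just linear scaling from one known calibration point; worth writing out, since the same one-liner gives you the real draw for any reported percentage after the mod:

```python
# After the shunt mod the card under-reads, so if a reported 80% is known to
# correspond to 300 W of real draw, real watts scale linearly with the
# reported percentage. Calibration numbers are the ones from the post.

def real_watts(reported_pct, calib_pct=80.0, calib_watts=300.0):
    """Actual draw, assuming the reported % scales linearly from one
    known (reported %, real watts) calibration point."""
    return reported_pct / calib_pct * calib_watts

print(real_watts(100))  # 375.0, the '100% = 375 W' figure above
print(real_watts(93))   # ~348.8, i.e. the ~350 W fallback target
```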


----------



## Nico67

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm having issues running TimeSpy with Afterburner where it jumps from 1.000v to 1.062v on test two and power limits even though I have it in a flat line on Afterburner after 1.000v.
> 
> So I did a Unigine Superposition 1080p Extreme at 1.075 v 2088 core and had zero power limiting.
> 
> GPU-ZSensorLog.txt 30k .txt file


Thx man, but looking at the logs you know what I'm thinking









Anything over the magic 111% on Asus is probably where you start limiting?


----------



## KedarWolf

Quote:


> Originally Posted by *OneCosmic*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I'm having issues running TimeSpy with Afterburner where it jumps from 1.000v to 1.062v on test two and power limits even though I have it in a flat line on Afterburner after 1.000v.
> 
> So I did a Unigine Superposition 1080p Extreme at 1.075 v 2088 core and had zero power limiting.
> 
> GPU-ZSensorLog.txt 30k .txt file
> 
> 
> 
> 
> 
> 
> Time Spy GPU test 2 is very power demanding, that's why are clocks jumping so much during it. It is actually the 2nd most power demanding test i know, 1st is Furmark.
Click to expand...

Yeah, but if my power curve is in a straight line above 1.000v no way it should jump to 1.062v. Should max out at 1.000v.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Yeah, but if my power curve is in a straight line above 1.000v no way it should jump to 1.062v. Should max out at 1.000v.


Can you test the stock BIOS with your same voltage curve?

When I had the ASUS bios, **** jumped everywhere and anywhere for some reason. It was doing clocks and voltages I didn't even have on my curve. But the stock bios seems to lock clocks and follow my voltage curve better.

Superposition and Timespy follow this voltage curve very closely, I don't see it deviate much.

My 1.000v challenge.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Yeah, but if my power curve is in a straight line above 1.000v no way it should jump to 1.062v. Should max out at 1.000v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can you test the stock mod with your same voltage curve?
> 
> When I had the ASUS bios, **** jumped everywhere and anywhere for some reason. It was doing clocks and voltages I didn't even have on my curve. But the stock bios seems to lock clocks and follow my voltage curve better.
> 
> Superposition and Timespy follow this voltage curve very closely, I don't see it deviate much.
> 
> My 1.000v challenge.
Click to expand...

TimeSpy test two at 1999 core, .950v does the same thing: voltages jumping around, going to 1.062v just like on the Asus BIOS. And it seems to power limit a tiny bit even at .950v. I got the same results at 2037 core, 1.000v on the Asus BIOS, exactly the same.


----------



## KShirza1

Count me in


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> TimeSpy test two at 1999 core, .950v does the same thing, voltages jumping around, going to 1.062v as the Asus BIOS. And seems to power limit a tiny bit even at .950v. I got the same results at 2037 core 1.000v on Asus BIOS, exactly the same.


Why have we not thought about testing the inno 3d 1080 Ti bios?

It has a reference design but pushes 50 more watts!

This might be the key to our power limiting issues!

http://hexus.net/tech/reviews/graphics/103423-inno3d-ichill-geforce-gtx-1080-ti-x3/?page=12


----------



## KedarWolf

Are the EVGA FTW3 or Classified out yet? And if it is, can someone do a nvflash64 --save ftw3.rom in an admin command prompt so I can test it?

nvflash_5.353.0.zip 3031k .zip file


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Are the EVGA FTW3 or Classified out yet? And if it is, can someone do a nvflash64 --save ftw3.rom in an admin command prompt so I can test it?
> 
> nvflash_5.353.0.zip 3031k .zip file


The new GPU-Z lets you save the BIOS out now, I think.

But our best bet might be the Inno3D; it's an 8+6 pin card, has a generally reference PCB design, and pulls 30-50 more watts I believe.


----------



## ttnuagmada

Quote:


> Originally Posted by *KedarWolf*
> 
> Can all of you who joined the Owner's Club please provide a GPU-Z validation link as well?


https://www.techpowerup.com/gpuz/details/h87rv


----------



## ttnuagmada

I submitted an Armor OC vbios just now.


----------



## outofmyheadyo

Quote:


> Originally Posted by *SlimJ87D*
> 
> Why have we not thought about testing the inno 3d 1080 Ti bios?
> 
> It has a reference design but pushes 50 more watts!
> 
> This might be the key to our power limiting issues!
> 
> http://hexus.net/tech/reviews/graphics/103423-inno3d-ichill-geforce-gtx-1080-ti-x3/?page=12


I don't know why there are so few 1080 Ti BIOSes available in the TPU database; I tried them all, and they all faceplant into the power limit.
So for now, anyone wanting to get rid of the silly power limit needs to do a shunt mod.


----------



## OneCosmic

Quote:


> Originally Posted by *SlimJ87D*
> 
> The new GPU-Z lets you save the BIOS out now, I think.
> 
> But our best bet might be the Inno3D; it's an 8+6 pin card, has a generally reference PCB design, and pulls 30-50 more watts I believe.


We definitely need some better BIOSes; I would be very satisfied with a real working 330-350W BIOS. If it was even like +20% PL enabled, then oh my god.


----------



## Silent Scone

Quote:


> Originally Posted by *OneCosmic*
> 
> We definitely need some better BIOS, like i would be very satisfied with a real working 330-350W BIOS, if it was even like +20% PL enabled then oh my god


Unlike Maxwell, that stuff is locked down.


----------



## OneCosmic

Quote:


> Originally Posted by *Silent Scone*
> 
> Unlike Maxwell, that stuff is locked down.


I obviously didn't mean editing the BIOS, just having somebody with an Inno3D iChill or EVGA SC/FTW3 extract their BIOSes for us.

I had a GTX 970 G1 Gaming from Gigabyte with the BIOS edited up to 360W. I could have set it even higher, but I was already afraid of burning the VRM. It was only hitting it in Furmark; no other benchmark or game could dethrone it from running 1600MHz rock solid stable with 8200-8300 VRAM.


----------



## DStealth

Seems I had a defective card, or at least some of the components were; not sure what they're for... but yesterday while playing [email protected], temperatures were in the 58-63C range with a very aggressive fan curve when my PC shut down instantly... it was late and I went to bed...
Today I disassembled the card and it looked burned on the back; attaching a picture.
The saddest thing is it was connected via the DVI port, and it took my good old precious 30" 2560x1600 IPS Apple Cinema HD monitor with it.


----------



## alucardis666

Quote:


> Originally Posted by *DStealth*
> 
> Seems I had a defective card, or at least some of the components were; not sure what they're for... but yesterday while playing [email protected], temperatures were in the 58-63C range with a very aggressive fan curve when my PC shut down instantly... it was late and I went to bed...
> Today I disassembled the card and it looked burned on the back; attaching a picture.
> The saddest thing is it was connected via the DVI port, and it took my good old precious 30" 2560x1600 IPS Apple Cinema HD monitor with it.


Ouch... that monitor









I'm so sorry.


----------



## sWaY20

@KedarWolf can you add the s to my name in the list, and you can add Kryonaut as well... Thanks for taking this over.

I'll add my validation too.


----------



## DerComissar

Quote:


> Originally Posted by *KedarWolf*
> 
> Hey peeps, PM me or mention here if you run a 1080 Ti, how many you run, AIO, water or stock air, and the clocks you run for 24/7 use.
> 
> Starting an owner's list in OP.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Example:
> 
> KedarWolf - 1x Gigabyte 1080 Ti FE - EK Water Block, Backplate - 2066 Core 6137 Memory


My thanks as well, for taking on the task of heading this thread.

Your guides will be a great source of information for me, as I slowly get into messing around with this intimidating Pascal card, lol.
Rep+


----------



## KingEngineRevUp

Quote:


> Originally Posted by *outofmyheadyo*
> 
> I dont know why there are so few 1080ti bioses available in TPU database, i tried them all, and all faceplant the powerlimit.
> So for now anyone wanting to get rid of the silly powerlimit needs to do a shuntmod.


All those BIOSes are from custom cards, and most if not all of them have an 8+8 pin setup.

The Inno3D is a reference card like an FE with an 8+6 pin configuration, but it is capable of drawing more power. That's why it's a primary candidate right now to help FE users gain more power.

But I haven't heard from anyone with this card so far.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *sWaY20*
> 
> @KedarWolf can you add the s to my name in the list, and you can add kryonaut as well... Thanks for taking this over.
> 
> I'll add my validation Tom.


Hey guys, we set up the comments on the sheet for this reason.

If you need an edit or want to be added, please leave a comment on the sheet and one of us will edit it when we have time.

That being said, we need some more help with it, so if any users who have been actively posting here would like to help us, let KedarWolf and me know.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *DStealth*
> 
> Seems i had a defective card or at least some of the components not sure what they're for...but yesterday while playing [email protected] temperatures were in 58-63* range with a very aggressive fan curve my PC shut down instantly ... it was late and went to bеd...
> Today disassembled the card and it looked burned on the back, attaching picture
> The saddest thing is it was connected via DVI port and took my good old precious 30" 2560х1600 IPS Apple cinema HD monitor with it


Dang, weren't you the one with the crazy core clocks at very high voltages? That didn't have anything to do with it did it?


----------



## DerComissar

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> TimeSpy test two at 1999 core, .950v does the same thing, voltages jumping around, going to 1.062v as the Asus BIOS. And seems to power limit a tiny bit even at .950v. I got the same results at 2037 core 1.000v on Asus BIOS, exactly the same.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Why have we not thought about testing the inno 3d 1080 Ti bios?
> 
> It has a reference design but pushes 50 more watts!
> 
> This might be the key to our power limiting issues!
> 
> http://hexus.net/tech/reviews/graphics/103423-inno3d-ichill-geforce-gtx-1080-ti-x3/?page=12
Click to expand...

And the version tested had no DVI-D, would be nice to snag the bios from that one, as the review stated Inno 3D will be including a DVI-D port on the retail units.

Nonetheless, a good idea, certainly worth testing this bios for those of us with an FE card.
[email protected]!


----------



## KingEngineRevUp

The hunt for the Inno3d X3 bios is on! Try and get ahold of it FE users!


----------



## DStealth

Quote:


> Originally Posted by *SlimJ87D*
> 
> Dang, weren't you the one with the crazy core clocks at very high voltages? That didn't have anything to do with it did it?


If the "very high voltages" are the maximum allowed by the BIOS, 1.093v, then yes, it was me... just for benching. Yesterday I played at 1.043v; that actually corresponds to less than 300w of the max allowed 330w... the components were defective... the bad thing is the monitor died too, and the next card 100% won't OC that well, for sure...


----------



## KingEngineRevUp

Quote:


> Originally Posted by *DStealth*
> 
> If the "very high voltages" are the maximum allowed by the BIOS 1.093v yes it was me...just for benching. Yesterday played at 1.043v actually this corresponds to less than 300w from the max allowed 330w...the components are defective...the bad thing is the monitor died also and 100% next card won't OC that well for sure...


That's very odd that the monitor died too... Can you RMA the monitor? If not, hopefully the DP port is on a separate board and you can just replace that.


----------



## Nico67

Quote:


> Originally Posted by *SlimJ87D*
> 
> Why have we not thought about testing the inno 3d 1080 Ti bios?
> 
> It has a reference design but pushes 50 more watts!
> 
> This might be the key to our power limiting issues!
> 
> http://hexus.net/tech/reviews/graphics/103423-inno3d-ichill-geforce-gtx-1080-ti-x3/?page=12


Would be great if this is true, as that's the default FE BIOS with the Nvidia ID. Let's hope it was a modified version and that the retail cards keep the same power limit.


----------



## DStealth

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DStealth*
> 
> If the "very high voltages" are the maximum allowed by the BIOS 1.093v yes it was me...just for benching. Yesterday played at 1.043v actually this corresponds to less than 300w from the max allowed 330w...the components are defective...the bad thing is the monitor died also and 100% next card won't OC that well for sure...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That's very odd the monitor died also... Can you RMA the monitor? If not, hopefully the dp port is a separate board and you can just replace that.

DP on Apple...good joke


----------



## KickAssCop

KickAssCop - ASUS 1080 Ti FE- Stock Cooling, 2050 core, 6050 memory.

Will update once I have the MSI Gaming X card in as well.


----------



## cyenz

Question for those who game at 4K.

Can anyone maintain a 2GHz clock playing Mass Effect Andromeda and Witcher at 4K maxed out without throttling because of the power limit? I cannot sustain 2GHz; sometimes it will even drop to 1.9 in more demanding scenes in Mass Effect, always getting power throttling.


----------



## evosamurai

Hi all, I just purchased the EVGA 1080 Ti FE and an EK waterblock for it. I bought Thermal Grizzly Conductonaut to use on my CPU when I delid it. Can I apply the Conductonaut to the GPU as well, or should I use something else?


----------



## outofmyheadyo

You can use it on the GPU as well, but get some liquid electrical tape so you can cover the components around the GPU; that way, if you spill some, it won't cause trouble. As an added benefit, you can also do a shunt mod with it to eliminate the power limit issue.


----------



## Pandora's Box

Quote:


> Originally Posted by *KickAssCop*
> 
> KickAssCop - ASUS 1080 Ti FE- Stock Cooling, 2050 core, 6050 memory.
> 
> Will update once I have the MSI Gaming X card in as well.


Sorry, but I don't think those clocks should be added to the sheet. There's no way in hell you are running 2050 core / 6050 memory on a stock cooler for 24/7 use unless it's at 100% fan speed, and even then it's throttling hard.

I guess this raises the question: when people report clock speeds to be added to the owners list, are we talking 24/7 clocks or crash-and-burn benchmark clocks?


----------



## undercoverb0ss

undercoverb0ss - nvidia 1080ti FE - hybrid cooler, 2063 core, 6003 memory


----------



## cyenz

Quote:


> Originally Posted by *undercoverb0ss*
> 
> undercoverb0ss - nvidia 1080ti FE - hybrid cooler, 2063 core, 6003 memory


Can you maintain that boost at 4K in Witcher or the new Mass Effect without power throttling?


----------



## undercoverb0ss

Quote:


> Originally Posted by *cyenz*
> 
> Can you mantain that boost at 4K on Witcher or the new Mass effect without power throttle?


I'm not currently playing either of those, but during extended gaming (Forza Horizon 3) it starts to throttle down to 2050. I have some conductonaut on its way to do the shunt mod.


----------



## eXteR

Quote:


> Originally Posted by *SlimJ87D*
> 
> Why have we not thought about testing the inno 3d 1080 Ti bios?
> 
> It has a reference design but pushes 50 more watts!
> 
> This might be the key to our power limiting issues!
> 
> http://hexus.net/tech/reviews/graphics/103423-inno3d-ichill-geforce-gtx-1080-ti-x3/?page=12


I wrote to the article's author, asking if they still have the card and whether it's possible for them to dump the BIOS and upload it for us.

Hope they can do it. 350W on a Founders would be amazing.


----------



## pantsoftime

Has anyone tried this Inno3D BIOS on an FE? Judging from the photos it's an FE-based card, and it looks like it has a higher power limit. Also wondering if the display outputs work properly (the card has a DVI port added but has all of the standard FE ports as well).

https://www.techpowerup.com/vgabios/191230/191230


----------



## mtbiker033

my gpu-z validation link:

https://www.techpowerup.com/gpuz/details/fcbrw


----------



## KingEngineRevUp

Quote:


> Originally Posted by *pantsoftime*
> 
> Has anyone tried this Inno3d BIOS on an FE? It's an FE-based card from looking at the photos and it looks like it has a higher power limit. Also wondering if display outputs work properly (the card has a DVI added but has all of the standard FE ports as well).
> 
> https://www.techpowerup.com/vgabios/191230/191230


Wow, that must have come up last night..
Edit: Nvm, it came from the unverified section. Maybe I'll give that a try in a bit. It only gives 315 watts max.

Looks like the review sample, the X3, gave 350 watts though. This one is the X4.


----------



## s1rrah

Quote:


> Originally Posted by *shalafi*
> 
> A bit, but the thought of pointing it out to you has been bothering me a lot more .. should have just sent you a PM
> 
> 
> 
> 
> 
> 
> 
> 
> sorry, but isn't this the worst possible rad placement? any air bubbles in the loop will be up in the pump causing more noise and worsening your cooling efficiency. the rad should always be higher than the pump.


No stress mate ... LOL .. I've had zero issues with that rad placement ... it's utterly silent, never heard so much as a single bubble short of the first time I powered it on .. both with this new 1080ti and also my previous 980(s) which also occupied that spot. I love it. Right in front of the fat dual 140mm intakes ... card rarely sees the 50's ... generally always 48C or so ...

Now the challenge is to stop myself from buying a second one .. LOL ... at least for a few months ...


----------



## RavageTheEarth

Quote:


> Originally Posted by *SlimJ87D*
> 
> Try putting the voltage to 1.050, 1.062 and onward.
> 
> I believe your card wants more voltage + power to run the core to its true potential.
> 
> lower 1.043 to 2050 and keep 2063 at 1.050 and continue on. Let us know what the perf cap shows after that.


So I'm going to try this when I get home, but I'm just confused about the WHY. My OC of 2063 @ 1.043v is completely stable so why do I need to add more volts? Even though I never drop bins it looks like I do have that issue with the perfcap. Is this holding back some of the performance of my card without me knowing? I just don't understand what this whole perfcap thing is in the first place.

Thanks!


----------



## prelude514

So I ended up bringing my Aorus back to the store and got my MSI FE back. Glad to have her back. First thing I did was pull the GPU apart and shunt mod the 8 and 6 pin connectors. Left the PCI-e slot stock. Seems like I dropped about 100w in the software readings, so I figure my FE is good for 390-400w now.







So nice not running into power limits anymore. Now I need to upgrade the cooling!


----------



## pantsoftime

Quote:


> Originally Posted by *SlimJ87D*
> 
> Wow that must have come up last night..
> Edit : Nvm it came from the unverified section. Maybe I'll give that a try in a bit. It only gives 315 watts max.
> 
> Looks like the review sample gave 350 watts though or the X3. This one is the x4.


Yeah I think it was posted this morning. The X3 review BIOS looks ideal but the X4 BIOS is in the wild already. I wouldn't mind an extra 15 watts even though 50 would be better.


----------



## outofmyheadyo

Quote:


> Originally Posted by *prelude514*
> 
> So I ended up bringing my Aorus back to the store and got my MSI FE back. Glad to have her back. First thing I did was pull the GPU apart and shunt mod the 8 and 6 pin connectors. Left the PCI-e slot stock. Seems like I dropped about 100w in the software readings, so I figure my FE is good for 390-400w now.
> 
> 
> 
> 
> 
> 
> 
> So nice not running into power limits anymore. Now I need to upgrade the cooling!


Which ones are the 6-pin and 8-pin shunts, and which one is the PCI-e shunt? The lowest one should be the PCI-e one by logic, no?


----------



## lanofsong

Hey GTX 1080Ti owners,

We are having our monthly Foldathon from Monday 17th - Wednesday 19th - 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

April 2017 Foldathon

BTW - make sure you sign up









To get started:

1.Get a passkey (allows for speed bonus) - need a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2.Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## prelude514

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Wich ones are the 6 pin and 8 pin and wich one is the PCIE shunt ? The lowest one should be the PCIE one by logic, no ?


Yes, lowest and closest to PCI-e slot is the one you want to leave stock if you don't want to draw more than 75w from your motherboard.

http://www.overclock.net/t/1608437/tutorial-power-target-limit-hardware-mod-shunt-mod-for-titan-x-and-many-other-nvidia-gpus
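
For anyone wondering why bridging the shunts changes the software reading, here's the rough math (the resistor values below are illustrative assumptions, not measurements from a 1080 Ti): the card computes current from the voltage drop across a known shunt resistance, so adding a parallel conductor on top of it makes the controller under-read.

```python
# Shunt-mod math sketch. All values are illustrative assumptions,
# not measured from a real 1080 Ti.

def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

stock_shunt = 0.005    # ohms: a typical 5 mOhm current-sense shunt (assumption)
bridge = 0.015         # ohms: resistance of the liquid-metal blob (assumption)

modded = parallel(stock_shunt, bridge)   # 0.00375 ohms
# The controller still assumes 5 mOhm, so it under-reads current
# (and therefore power) by this factor:
underread = modded / stock_shunt         # 0.75

true_power = 400                         # watts actually drawn
reported = true_power * underread
print(f"modded shunt: {modded * 1000:.2f} mOhm, card reports ~{reported:.0f} W")
```

With these made-up numbers the card reports ~300w while actually drawing 400w, which is roughly the ~100w drop in the software readings described above.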


----------



## KingEngineRevUp

Quote:


> Originally Posted by *pantsoftime*
> 
> Yeah I think it was posted this morning. The X3 review BIOS looks ideal but the X4 BIOS is in the wild already. I wouldn't mind an extra 15 watts even though 50 would be better.


Oh, I just discovered it was posted because I asked someone on reddit and they posted it lol.


----------



## leequan2009

Anyone have the stock BIOS for the Gigabyte 1080 Ti Founders Edition? Please upload it for me. I'm using the Asus Strix BIOS now and I want to go back to the stock BIOS (I forgot to back up the stock one before flashing the Asus BIOS).


----------



## KingEngineRevUp

The Inno3D X4 BIOS ran with 15 extra watts but sadly wasn't as stable.

I think we need the X3 one.


----------



## Addsome

Quote:


> Originally Posted by *leequan2009*
> 
> Anyone have Stock BIOS for Gigabyte 1080ti Founder Edition?.Please Upload for me. I'm using Asus Strix bios now and I wanna back stock bios (I forgot backup stock before flash asus bios).


Your lucky day.
https://drive.google.com/file/d/0B8626fz7LhLhSUI5Tjg2OHJzMEE/view?usp=sharing
Quote:


> Originally Posted by *SlimJ87D*
> 
> Inno3D X4 bios ran with 15 extra watts but wasn't as stable sadly.
> 
> I think we need the X3 one.


What do you mean by not as stable? Crashes at the same clocks the FE BIOS can run?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Addsome*
> 
> Your lucky day.
> https://drive.google.com/file/d/0B8626fz7LhLhSUI5Tjg2OHJzMEE/view?usp=sharing
> What do you mean by not as stable? Crashes at same clocks FE bios can run at?


Yeah, I went through a whole Superposition run and got 2 more FPS than my previous average, but it crashed right at the end. My previous average was 76.5 and this one got me 78.

It could also be that with more power, my clocks aren't actually stable at the voltage I had it.

I was still hitting perf cap power limiting but I did measure an extra 15 watts in the run compared to the ASUS Bios.

Maybe I need to spend time and optimize the voltage curve more but I'm going to shunt mod tomorrow so might not bother.

*This is definitely a bios for a reference card.* You do not lose DP1, and MSI Afterburner keeps your settings after you flash, but fix the voltage curve right away, because the Inno3D has a higher base clock: all of your bin points will be moved up 100 MHz.

It is worth playing with and trying to tame and optimize. Let me know if you try it.
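
A rough sketch of why the curve needs fixing after the flash (the base clocks and offsets below are made-up illustration values, not dumped from either BIOS): Afterburner's saved curve is effectively a set of offsets on top of the BIOS default clocks, so a BIOS with a ~100 MHz higher base shifts every saved point up by the same amount.

```python
# Why saved Afterburner curve points shift after flashing a bios with a
# higher base clock. The clocks and offsets below are made-up illustration
# values, not dumped from the FE or Inno3D bios.

fe_base = 1480      # MHz, assumed FE default base clock
inno_base = 1580    # MHz, assumed ~100 MHz higher on the Inno3D bios

# Afterburner stores per-point offsets; the effective curve is
# (bios default) + (saved offset), so the same saved offsets land
# higher after the flash.
saved_offsets = {0.950: 420, 1.000: 480, 1.043: 520}   # volts -> MHz offset

curve_before = {v: fe_base + off for v, off in saved_offsets.items()}
curve_after = {v: inno_base + off for v, off in saved_offsets.items()}

shift = curve_after[1.000] - curve_before[1.000]
print(f"every saved point moves up by {shift} MHz")
```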


----------



## Addsome

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, I went through a whole superposition and got 2 more FPS and my previous average but it crashed right at the end. My previous average was 76.5 and this one got me 78.
> 
> It could also be that with more power, my clocks aren't actually stable at the voltage I had it.
> 
> I was still hitting perf cap power limiting but I did measure an extra 15 watts in the run compared to the ASUS Bios.
> 
> Maybe I need to spend time and optimize the voltage curve more but I'm going to shunt mod tomorrow so might not bother.
> 
> *This is definitely a bios for a reference card.* You do not lose the DP1 and also MSI Afterburner keeps your settings after you flash, but please fix the voltage curve right away because the Inno3D has a higher base clock. All of your bin points will be moved up 100 Mhz.
> 
> It is worth playing with and trying to tame and optimize. Let me know if you try it.


Is running the extra 15w safe for long term?


----------



## hotrod717

Quote:


> Originally Posted by *Silent Scone*
> 
> Unlike Maxwell, that stuff is locked down.


This.
First and foremost, Nvidia is running the show and the AIBs are complying. I'm skeptical that we will see any unlocked card from any manufacturer. Galax is advertising one, but I've seen no proof retail cards will come this way, and the 1080 Kingpin is a great example of this.


----------



## prelude514

What's the maximum fan RPM on the Inno BIOS?


----------



## pantsoftime

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, I went through a whole superposition and got 2 more FPS and my previous average but it crashed right at the end. My previous average was 76.5 and this one got me 78.
> 
> It could also be that with more power, my clocks aren't actually stable at the voltage I had it.
> 
> I was still hitting perf cap power limiting but I did measure an extra 15 watts in the run compared to the ASUS Bios.
> 
> Maybe I need to spend time and optimize the voltage curve more but I'm going to shunt mod tomorrow so might not bother.
> 
> *This is definitely a bios for a reference card.* You do not lose the DP1 and also MSI Afterburner keeps your settings after you flash, but please fix the voltage curve right away because the Inno3D has a higher base clock. All of your bin points will be moved up 100 Mhz.
> 
> It is worth playing with and trying to tame and optimize. Let me know if you try it.


It was fairly common for this to happen to people running the T4 BIOS on the 1080. Higher power limits mean it's going to end up in different bins at different times. It could just be something like that. Since it's a reference BIOS it should be compatible.


----------



## pantsoftime

Quote:


> Originally Posted by *Addsome*
> 
> Is running the extra 15w safe for long term?


The buildzoid teardown suggests that the FE design has a lot of power margin. If you're running better cooling than the stock blower then it shouldn't be a concern at all.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Addsome*
> 
> Is running the extra 15w safe for long term?


If you shunt mod, you run your card at 350 to 400 watts.

Watch the Gamers Nexus teardown of the 1080 Ti; they talk about all the components there. The chokes, VRM and VRAM can take quite a few watts, from my understanding.


----------



## Addsome

Quote:


> Originally Posted by *pantsoftime*
> 
> The buildzoid teardown suggests that the FE design has a lot of power margin. If you're running better cooling than the stock blower then it shouldn't be a concern at all.


I'm running a 1080 hybrid kit on mine, but the VRM and VRAM are still being cooled by the reference fan. Half my shroud is off. Would this keep the VRM cool enough?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Addsome*
> 
> I'm running a 1080 hybrid kit on mine but the vrm and vram is still being cooled by the reference fan. Half my shroud is off. Would this keep the vrm cool enough?


That's fine. I have my card like this too.

For the 1080 Ti, it looks like with a hybrid cooler *the VRMs hover around and below 70C*. So I think 350-375 watts would be fine. Any opinions from others?



http://www.gamersnexus.net/guides/2841-gtx-1080-ti-hybrid-benchmarks-remove-thermal-limit/page-2

*VRM can go up to 150C*

http://www.mouser.com/ds/2/149/FDPC8016S-372149.pdf


----------



## Addsome

Quote:


> Originally Posted by *SlimJ87D*
> 
> That's fine. I have my card like this too.
> 
> For 1080 Ti, Looks like with a hybrid cooler, *the VRMs hover around and below 70C.* So I think 350 Watts - 375 Watts would be fine. Any opinions from others?
> 
> 
> 
> http://www.gamersnexus.net/guides/2841-gtx-1080-ti-hybrid-benchmarks-remove-thermal-limit/page-2
> 
> *VRM can go up to 150C*
> 
> http://www.mouser.com/ds/2/149/FDPC8016S-372149.pdf


What fan speed are you running the blower fan at?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Addsome*
> 
> What fan speed are you running the blower fan at?


30%, just like the Gamers Nexus article is using. Read it, it's very informative.

They kept it at 30% and the VRMs never went above 70C. So 15 extra watts would be basically nothing; it might raise you to 73C, still less than half of the 150C maximum these VRMs can take.


----------



## mtbiker033

Are these things ever coming in stock? I have an auto-notify with EVGA and B&W.


----------



## Addsome

Quote:


> Originally Posted by *SlimJ87D*
> 
> 30%, just like the nexus mods article is doing. Read it, it's very informative.
> 
> They kept it at 30% and the VRMs never went above 70C. So 15 watts extra would literally be nothing, maybe raise you to 73C and still half of the maximum temperature these VRMs can take which is 150C.


Just tried the Inno3D BIOS and I hit the power limit less often, but my temps also went from 48C under load to 52C. Do you know if the blower fan RPMs are the same between the BIOSes?

Also, my power limit in Afterburner only goes up to 116%, same for you?

----------



## prelude514

Quote:


> Originally Posted by *Addsome*
> 
> Just tried the inno3d bios and I hit the power limit less often but my temps also went from 48C under load to 52C. Do you know if the blower fan RPMs are the same between the bioses?
> 
> Also my power limit in afterburner is only going upto 116%, same for you?


Temperature increase is to be expected when using more power, so nothing to be worried about. Can you max out the fan and report what the RPM is? Should be 4800.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Addsome*
> 
> Just tried the inno3d bios and I hit the power limit less often but my temps also went from 48C under load to 52C. Do you know if the blower fan RPMs are the same between the bioses?
> 
> Also my power limit in afterburner is only going upto 116%, same for you?


The Inno3D BIOS has a default power limit of 275 watts, so 275 * 115% = 316.25 watts.

I do not know the fan speed for the Inno3D BIOS; just run the fan at 100% and look at the AB tachometer. It will tell you how many RPM your fan is spinning at.

For reference, the stock fan spins at 4800 RPM at 100%.

Let me know what RPMs you get on Inno3D.
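
The power-limit arithmetic spelled out (the Inno3D numbers are from this post; the FE figures of a 250w default with a 120% slider are the commonly cited ones, treat them as assumptions):

```python
# Max board power = bios default power x power-limit slider maximum.
# Inno3D numbers from the post above; FE numbers (250 W default, 120%
# slider) are commonly cited figures, treated here as assumptions.

def max_board_power(default_watts, slider_max_pct):
    return default_watts * slider_max_pct / 100

inno3d_x4 = max_board_power(275, 115)
fe_stock = max_board_power(250, 120)

print(inno3d_x4)   # 316.25
print(fe_stock)    # 300.0
```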


----------



## KingEngineRevUp

Quote:


> Originally Posted by *prelude514*
> 
> Temperature increase is to be expected when using more power, so nothing to be worried about. Can you max out the fan and report what the RPM is? Should be 4800.


Inno3D uses 3X 90mm fans, so it'll be based off of those fans.

They might only spin at like 2,000 RPMs or even less.


----------



## prelude514

Quote:


> Originally Posted by *SlimJ87D*
> 
> Inno3D uses 3X 90mm fans, so it'll be based off of those fans.
> 
> They might only spin at like 2,000 RPMs or even less.


Ah, ok. Thought it was a reference design. Hopefully 100% = 4800 RPM on FE cards.


----------



## Addsome

Quote:


> Originally Posted by *prelude514*
> 
> Temperature increase is to be expected when using more power, so nothing to be worried about. Can you max out the fan and report what the RPM is? Should be 4800.


100% is 1700RPM


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Addsome*
> 
> 100% is 1700RPM


So this would be equal to you running your fans at 35%.

I wouldn't recommend this if you are running a blower fan. But if you're on a hybrid then leave the fans at 100%.
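
The 35% figure is just the RPM ratio; a quick sketch of the conversion, using the 4800 RPM FE maximum quoted earlier:

```python
# Same-RPM comparison across bioses: convert an absolute fan RPM into the
# equivalent duty cycle on the FE blower (4800 RPM at 100%, per the thread).

def equivalent_duty_pct(rpm, reference_max_rpm=4800):
    return rpm / reference_max_rpm * 100

# The Inno3D bios only drives the FE blower to 1700 RPM at "100%":
print(round(equivalent_duty_pct(1700)))  # 35
```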


----------



## prelude514

Quote:


> Originally Posted by *Addsome*
> 
> 100% is 1700RPM


Damn, too bad. Thanks.


----------



## cugno87

I'm losing hope...


----------



## Addsome

I'm so confused with the voltage curve and perf cap. If I run [email protected] 1.062V I get Pwr, VRel, and VOp perf caps, but if I move that point up to [email protected] I get no perf cap. What's going on? My curve looks like this:


----------



## KingEngineRevUp

Quote:


> Originally Posted by *cugno87*
> 
> I'm losing hope...


For what?


----------



## Baasha

You guys want to test your GPUs? Play this in 8K 60fps!


----------



## KingEngineRevUp

I can confirm the bios is definitely giving 15 extra watts legitimately. My scores are slightly higher than with my stock BIOS.

The Inno3D is closer to a reference design, so that must be why it performs better than the Asus bios for me.

Still hitting perf cap, as 15 watts is honestly... not that much more power...

Higher score is Inno3D bios.




EDIT: Although I can run a benchmark with it, I still do not recommend this bios as I can't play Witcher 3 with it. It crashes.


----------



## cugno87

For a bios working on FE.


----------



## Slackaveli

Quote:


> Originally Posted by *DStealth*
> 
> Seems i had a defective card or at least some of the components not sure what they're for...but yesterday while playing [email protected] temperatures were in 58-63* range with a very aggressive fan curve my PC shut down instantly ... it was late and went to bed...
> Today disassembled the card and it looked burned on the back, attaching picture
> The saddest thing is it was connected via DVI port and took my good old precious 30" 2560x1600 IPS Apple cinema HD monitor with it


aaaaaaaaaaaaawwwwwwwwwwwwww, man, that's horrible!


----------



## RavageTheEarth

Quote:


> Originally Posted by *Addsome*
> 
> Im so confused with the voltage curve and perf cap. If I run [email protected] 1.062V I get Pwr, VRel, and VOp perf cap but if I move that point upto [email protected] I get no perf cap. What's going on? My curve looks like this:


I really don't understand this perfcap stuff. I'm running 2063 @ 1.043v and the card stays under 51C. I never drop any bins, but I guess I'm getting all those too. What exactly do those mean? Here is mine after playing Tom Clancy's Wildlands. I had to alt-tab to the desktop to get the screenshot, so ignore the clock speeds. If anyone can explain what all this means, it would be appreciated.


----------



## KedarWolf

Quote:


> Originally Posted by *Nico67*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> This is what I was running TimeSpy at to get that power draw. 30 FPS. Only scene 2.
> 
> 
> 
> It bit much to ask, but it would be nice to see how much you can draw without power limiting. just run the same test but keep lowering the clocks until it does limit anymore. Remember its not about how high you clk or score, just interested in power

I don't power limit at 2025 core, .981v, +618 memory in TimeSpy or Superposition 4K Optimized. The pic is my TimeSpy run.









Asus BIOS and still getting a respectable score.


----------



## Slackaveli

Perf cap shows what is capping your performance: heat exceeded, voltage needed, power limit reached, or all of the above. There is always going to be something limiting your performance. If it's VRel (blue), meaning "I needed more volts", and the answer is "sorry, bro, no more volts", then that's an acceptable color in that scenario, right? Util is best, I guess, because that's full utilization. I'm self-taught in all this as well, and it is confusing to decipher at times, no doubt.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> perf cap means what is capping your performance, so , heat exceeded or voltage needed or power limit reached, or all of the above. There is always going to be something limiting your performance, be it voltrel(blue) meaning :i needed more volts", well if the answer is "sorry, bro, no more volts" than that's an acceptable color in that scenario, right? Util is best, i guess, because that's full utilization. I'm self-taught in all this as well and it is confusing to decifer it all at times, no doubt.


Blank is best, where there's nothing. But you're right, blue is okay since we're locked on voltage unless we hardware mod the cards.

But blank is ideal.


----------



## Addsome

Quote:


> Originally Posted by *Addsome*
> 
> Im so confused with the voltage curve and perf cap. If I run [email protected] 1.062V I get Pwr, VRel, and VOp perf cap but if I move that point upto [email protected]1.062V I get no perf cap. What's going on? My curve looks like this:


Quote:


> Originally Posted by *SlimJ87D*
> 
> Blank is best, where there's nothing. But you're right, blue is okay as we're locked on voltage unless if we hardware mod the cards.
> 
> But blank is ideal.


Can you understand why this is happening?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Addsome*
> 
> Can you understand why this is happening?


I'm sorry, my electrical engineering knowledge is actually pretty limited. But it just sounds like your card likes a higher clock for that voltage.


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nico67*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> This is what I was running TimeSpy at to get that power draw. 30 FPS. Only scene 2.
> 
> 
> 
> It bit much to ask, but it would be nice to see how much you can draw without power limiting. just run the same test but keep lowering the clocks until it does limit anymore. Remember its not about how high you clk or score, just interested in power
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't power limit at 2025 core, .981v, +618 memory in TimeSpy or Superposition 4K Optimized. The pic is my TimeSpy run.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Asus BIOS and still getting a respectable score.

This is same clocks but tweaking the Nvidia control panel settings, only 100 points behind my best ever score.




















Edit: Now to test stock BIOS at same settings.


----------



## Addsome

Quote:


> Originally Posted by *SlimJ87D*
> 
> I'm sorry, me electrical engineering knowledge is actually pretty limited. But it just sounds like your card likes a higher clock for that voltage.


I get perf caps at any clock speed lower than 2050. Even [email protected] gives me a perf cap. This is frustrating.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Blank is best, where there's nothing. But you're right, blue is okay as we're locked on voltage unless if we hardware mod the cards.
> 
> But blank is ideal.


blank is also elusive lol


----------



## Benny89

Anyone know Power Limit for ZOTAC GeForce GTX 1080 Ti AMP! Extreme Edition?

I am thinking about sending back my Aorus Xtreme and grabbing that ZOTAC AMP! Extreme Edition.

It looks DOPE as hell!







(not my picture)


----------



## outofmyheadyo

What are you doing, Nvidia?
This is a Gigabyte GTX 1080 Ti Founders Edition with the stock BIOS. All I did was increase the PT by 20%, as much as Afterburner allowed; otherwise it's running stock, and it's watercooled. This happened while playing Mass Effect Andromeda...


----------



## pantsoftime

Quote:


> Originally Posted by *SlimJ87D*
> 
> I can confirm the bios is definitely giving 15 extra watts legitimately. My scores are slightly higher than my stock BIOS.
> 
> EDIT: Although I can run a benchmark with it, I still do not recommend this bios as I can't play Witcher 3 with it. It crashes.


When you shunt mod, if you find that Witcher still crashes, it might be related to your OC.


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nico67*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> This is what I was running TimeSpy at to get that power draw. 30 FPS. Only scene 2.
> 
> 
> 
> It bit much to ask, but it would be nice to see how much you can draw without power limiting. just run the same test but keep lowering the clocks until it does limit anymore. Remember its not about how high you clk or score, just interested in power
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't power limit at 2025 core, .981v, +618 memory in TimeSpy or Superposition 4K Optimized. The pic is my TimeSpy run.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Asus BIOS and still getting a respectable score.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This is same clocks but tweaking the Nvidia control panel settings, only 100 points behind my best ever score.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Now to test stock BIOS at same settings.

Stock BIOS, no throttling, 2012 core, .962v, 6147 memory, just a bit less than Asus BIOS in TimeSpy.


----------



## outofmyheadyo

benchmarks are irrelevant, play some games


----------



## Tikerz

Quote:


> Originally Posted by *Addsome*
> 
> Im so confused with the voltage curve and perf cap. If I run [email protected] 1.062V I get Pwr, VRel, and VOp perf cap but if I move that point upto [email protected] I get no perf cap. What's going on? My curve looks like this:


This is what I've been wondering this whole time. The voltage curve thing is confusing as hell. It's not consistent and doesn't make sense.

I can run Heaven all day long at a constant clock and voltage with no perf caps, but in other games or benchmarks it jumps all over the place. Basically, I don't think there's any way to do that in every application. It's impossible. We're trying for the impossible lol


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ThingyNess*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I don't take any stock in your made up theories. Did you even run the nvidia command with furmark running to test power draw like someone said to do or you did and the results were contrary to what you're claiming so you won't post them.
> 
> This is 4K Optimized at Asus BIOS 2088 1.075v. 6142 memory.
> 
> 
> 
> This is stock BIOS 2062 1.031v 6142 memory.
> 
> 
> 
> And all my games I run 4K so I'm definitely going with the Asus BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you folks really want to see some power-throttling insanity, try running it in 'custom' mode with 8k resolution, but with the '4k optimized' shaders.
> 
> On my 1070 with a 200w power limit (up from 150w default, so 33% higher!) I get no throttling at all in 8k res with the 8k shaders, but it hits that power limit hard with the '4k optimized' shaders.
> 
> It might not be obvious, but the '4k optimized' shaders are less complex than the '8k optimized' shaders, and both of those are less complex than the Medium or Extreme ones.
> 
> On 4k Extreme on my 1070, it only gets like 2.5fps, haha. Even if this benchmark supported SLI (which it doesn't for now), 4-way SLI Titan Xp would still be a slideshow at 4k Extreme.
> 
> 
> Asus BIOS at 2062 1.025v, no shunt mod, barely throttles, can do memory higher than stock BIOS stable at +659.

Broke 10400 with 2025 at .981v 6177 memory, best score yet, no throttling.









I didn't do the screenshot right away so the Util in GPU-Z is when the bench wasn't running, the clear area was when it was on.













Now to run TimeSpy again.


----------



## Tikerz

Quote:


> Originally Posted by *SlimJ87D*
> 
> 30%, just like the nexus mods article is doing. Read it, it's very informative.
> 
> They kept it at 30% and the VRMs never went above 70C. So 15 watts extra would be basically nothing; maybe it raises you to 73C, still under half of the 150C maximum these VRMs can take.


Well that's good to know. Those fans get really loud above 50%.


----------



## Addsome

Quote:


> Originally Posted by *KedarWolf*
> 
> Broke 10400 with 2025 at .981v 6177 memory, best score yet, no throttling.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I didn't do the screenshot right away so the Util in GPU-Z is when the bench wasn't running, the clear area was when it was on.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now to run TimeSpy again.


This is with Strix Bios?


----------



## Slackaveli

http://gpuz.techpowerup.com/17/04/16/2uy.png
2100Mhz run 4k optimized


----------



## KedarWolf

Quote:


> Originally Posted by *Addsome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Broke 10400 with 2025 at .981v 6177 memory, best score yet, no throttling.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I didn't do the screenshot right away so the Util in GPU-Z is when the bench wasn't running, the clear area was when it was on.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now to run TimeSpy again.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This is with Strix Bios?

Yes, Strix BIOS, and really close to my best TimeSpy score at these CPU clocks as well; zero throttling.









GPU-ZSensorLog.txt 25k .txt file


----------



## KedarWolf

Quote:


> Originally Posted by *outofmyheadyo*
> 
> benchmarks are irrelevant, play some games


Games don't really matter for me. I have a 4K G-Sync screen with Rivatuner capping framerates at 59, one less than my refresh rate, and clocks never max out even on the highest detail settings in the games I play the most.

It's all about waving my huge e-____ around and benching.


----------



## Slackaveli

here you go, 2100 with a couple dips to 2088, full 4k supo run. not too bad.


----------



## Tikerz

Did you ever notice that with slider overclocking you can leave the voltage slider at zero and increase the core clock higher than you would if you maxed out the voltage slider? Have heaven running and play with it. When people say omg I got +180 on core it would only be impressive if the voltage slider was at 100% lol.

But it's also an alternative to touching the voltage curve. You can tweak the voltage slider and core clock while heaven is running.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> here you go, 2100 with a couple dips to 2088, full 4k supo run. not too bad.


How did you do it all of a sudden? I remember you said your max stable clock was 2076, I think.

Did you use new GB BIOS?


----------



## Nico67

Quote:


> Originally Posted by *KedarWolf*
> 
> Yes, Strix BIOS. and really close to my best TimeSpy score at these CPU clocks as well, zero throttling.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GPU-ZSensorLog.txt 25k .txt file


Nice







Power looks like it's about the same as what I get, around 111% or less, so I think the Asus BIOS just gives you better clocks. Try dropping your memory on the FE BIOS and it would probably score better, or try that Inno BIOS as it may actually give you more real power.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> How did you do it all of a sudden? I remember you said your max stable clock was 2076, I think.
> 
> Did you use new GB BIOS?


I just tried to see if I could get stable by adding volts. I actually had a run going that was mostly 2114, but it crashed on scene 16. By default I start at 1.06v and then drop to 1.05v, 1.043v gaming. If I add volts I won't drop farther than 1.07v, but I eventually crash. So, still not adding volts except to try to max a bench score. It was 10,464 btw; that's all I can muster up.


----------



## RavageTheEarth

So I bumped my voltage on the curve from 1.043 @ 2063 to 1.063 @ 2063 and all of my perfcap errors went away! Looks like the card did want more voltage. Getting 100% GPU usage in Tom Clancy Wildlands (incredible game btw) with my GPU power being between 80%-95%.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *pantsoftime*
> 
> When you shunt mod if you find that witcher still crashes it might be related to your OC.


If you read back, it crashed because of the Inno3D BIOS. I play Witcher fine at [email protected] *with the stock BIOS*. In addition, every game I have runs fine as well.


----------



## bloodhawk

Peak power draw during ROTTR @ 1440P Ultra everything maxed out (Inno3D BIOS) -


http://i.imgur.com/HCrGbt4.jpg

During TimeSpy @ 2050MHz @ 1.080V and memory @ 6136MHz, the power draw peaks at 535W (system) (with the 3 x 10Ohm resistor mod and ASUS BIOS)

Using Stock BIOS -


----------



## KingEngineRevUp

Quote:


> Originally Posted by *bloodhawk*
> 
> Peak power draw during ROTTR @ 1440P Ultra everything maxed out (Inno3D BIOS) -
> 
> 
> http://i.imgur.com/HCrGbt4.jpg
> 
> During TimeSpy @ 2050MHz @ 1.080V and memory @ 6136MHz, the power draw peaks at 535W (system) (with the 3 x 10Ohm resistor mod and ASUS BIOS)


I don't know, it's hard to tell because you're shunt modded lol

I did measure 315 watts but it wasn't stable with witcher 3.


----------



## bloodhawk

Quote:


> Originally Posted by *SlimJ87D*
> 
> I don't know it's hard to tell bec you're shunt modded lol
> 
> I did measure 315 watts but it wasn't stable with witcher 3.


Not the shunt mod.


----------



## kevindd992002

Quote:


> Originally Posted by *Slackaveli*
> 
> Perf cap means what is capping your performance: heat exceeded, voltage needed, power limit reached, or all of the above. There is always going to be something limiting your performance. If it's VRel (blue), meaning "I needed more volts", and the answer is "sorry, bro, no more volts", then that's an acceptable color in that scenario, right? Util is best, I guess, because that's full utilization. I'm self-taught in all this as well and it is confusing to decipher at times, no doubt.


Util was discussed a few pages back and it means that the card is in idle. So no, it does not mean full utilization.


----------



## KedarWolf

Best 24/7 overclock score ever after reinstalling 3DMark and TimeSpy. I noticed there was a stutter in the bench so I reinstalled it.

This was at 2025 core, .981v, 6180 memory, Asus BIOS.


----------



## Nico67

Inno iChillx4 bios results

all but no power limiting at 116%



Power limiting is improved, but it seemed to score worse for me than the FE BIOS even with less limiting; memory clock performance is about the same as the FE BIOS (best for me at 6003).

SMI showing limits



I can push it into limiting, but nothing could get me close to the FE score

FE run for reference at similar settings


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Nico67*
> 
> Inno iChillx4 bios results
> 
> all but no power limiting at 116%
> 
> 
> 
> Power limiting is improved, but it seemed to score worse for me than the FE BIOS even with less limiting; memory clock performance is about the same as the FE BIOS (best for me at 6003).
> 
> SMI showing limits
> 
> 
> 
> I can push it into limiting, but nothing could get me close to the FE score
> 
> FE run for reference at similar settings


Yeah, the FE BIOS is still the best. That BIOS wasn't the original one we were after, though. The one we want is the X3 from that reviewer.


----------



## leequan2009

Quote:


> Originally Posted by *Addsome*
> 
> Your lucky day.
> https://drive.google.com/file/d/0B8626fz7LhLhSUI5Tjg2OHJzMEE/view?usp=sharing
> What do you mean by not as stable? Crashes at same clocks FE bios can run at?


Thanks very much!
With the stock BIOS I could run the benchmark stable at 2088 with max voltage 1.093, but with the Asus BIOS I couldn't do that; it runs at 2050MHz.


----------



## KingEngineRevUp

Preparing to shunt mod.

Didn't have a stupid 3mm socket driver... had to make one using a 4mm socket driver and tape. Luckily these small screws are only hand-tight.



Added more heatsinks by the VRAM, and even more around where heat would be coming from the VRMs. Every 1C counts right now, so removing heat from the VRM before it can transfer anywhere will surely help.



Liquid electrical tape dries fast and is very easy to coat; I recommend doing two coats. Covered around the shunted resistors and also covered the solder joints so they don't corrode.





Going to apply the CLU when it comes tomorrow on the shunt and GPU die. Need to prepare for the extra incoming heat from more power draw.
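For context on why the shunt mod raises power draw headroom: the card measures current as a voltage drop across a known sense resistor, so lowering that resistance makes the controller under-read. A back-of-envelope sketch of the commonly described variant (stacking an identical shunt on top); the 5 mOhm value and the 250 W figure are assumptions for illustration:

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

STOCK_SHUNT = 0.005                       # ohms; assumed 5 mOhm sense resistor
stacked = parallel(STOCK_SHUNT, 0.005)    # identical shunt soldered on top

# The controller still assumes STOCK_SHUNT, so it sees only this fraction
# of the real current (and, roughly, of the real power):
scale = stacked / STOCK_SHUNT             # 0.5 for an identical stacked shunt

reported_w = 250
actual_w = reported_w / scale             # a reported 250 W is really ~500 W
```

That is also why reported wattage (and total system draw at the wall) stops matching the software power limit after the mod.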


----------



## bloodhawk

Quote:


> Originally Posted by *SlimJ87D*
> 
> Preparing to shunt mod.
> 
> Didn't have a stupid 3mm socket driver... had to make one using a 4mm socket driver and tape. Luckily these small screws are only hand-tight.
> 
> 
> 
> Added more heatsinks by the VRAM, and even more around where heat would be coming from the VRMs. Every 1C counts right now, so removing heat from the VRM before it can transfer anywhere will surely help.
> 
> 
> 
> Liquid electrical tape dries fast and is very easy to coat; I recommend doing two coats. Covered around the shunted resistors and also covered the solder joints so they don't corrode.
> 
> 
> 
> 
> 
> Going to apply the CLU when it comes tomorrow on the shunt and GPU die. Need to prepare for the extra incoming heat from more power draw.


Are you using the Hybrid? The most gains you will see would be from replacing the VRM thermal pads with higher-conductivity ones (11W/mK or 17W/mK). I use 11W/mK ones, and with my water block the VRM temps dropped by ~15C depending on where my thermal sensor was placed. That helps more in holding higher clocks at higher voltages.

@KedarWolf, do you mind adding "ASUS Bios with 3 x 10Ohm resistor mod" to the owners sheet?


----------



## AngryLobster

Quote:


> Originally Posted by *KedarWolf*
> 
> Games don't really matter for me. I have a 4K G-Sync screen with Rivatuner capping framerates at 59, one less than my refresh rate, and clocks never max out even on the highest detail settings in the games I play the most.
> 
> It's all about waving my huge e-____ around and benching.


Why are you capping your FPS instead of just turning Vsync on? Vsync does it for you when you have Gsync enabled.


----------



## Addsome

Quote:


> Originally Posted by *AngryLobster*
> 
> Why are you capping your FPS instead of just turning Vsync on? Vsync does it for you when you have Gsync enabled.


Once V-Sync engages you lose the benefit of G-Sync, and it introduces input lag.

The best setting for G-Sync is to enable G-Sync and V-Sync in the NVIDIA Control Panel and then limit your FPS using RTSS. For example, I have a 100Hz monitor and I limit it to 98 FPS.
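The arithmetic behind that cap is trivial but worth spelling out: capping a frame or two under the refresh rate keeps every frame inside the variable-refresh window. A small sketch (the 2 fps margin is just the convention people here are using, not a fixed rule):

```python
def gsync_cap(refresh_hz, margin_fps=2):
    """FPS cap a couple of frames under the panel refresh, so frame delivery
    stays inside the variable-refresh window instead of tripping V-Sync."""
    cap = refresh_hz - margin_fps
    frametime_ms = 1000.0 / cap
    return cap, round(frametime_ms, 2)

print(gsync_cap(100))   # 100 Hz panel -> cap at 98 fps, ~10.2 ms per frame
print(gsync_cap(144))   # 144 Hz panel -> cap at 142 fps, ~7.04 ms per frame
```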


----------



## Pandora's Box

Quote:


> Originally Posted by *Addsome*
> 
> Once V-Sync engages you lose the benefit of G-Sync, and it introduces input lag.
> 
> The best setting for G-Sync is to enable G-Sync and V-Sync in the NVIDIA Control Panel and then limit your FPS using RTSS. For example, I have a 100Hz monitor and I limit it to 98 FPS.


What you are doing is redundant. Once you reach your monitor's max refresh rate, G-Sync switches off and V-Sync kicks in. Using RTSS or any other program to limit fps isn't doing anything.


----------



## zeroyon04

Has anyone tried soldering a DVI connector back into where it should be on the 1080Ti FE PCB, and see if that works?

I wonder if Nvidia just removed the connector, but the port is still electrically connected and enabled in the BIOS...

I have a 120Hz 27" monitor that only has dual-link DVI, no DisplayPort. I'm getting a 1080Ti FE from EVGA's step-up, but I'll be limited to 60Hz on my monitor without buying a $100+ active DP-to-dual-link-DVI adapter. If I can just solder the DVI connector back on instead, I'll do that. I'm throwing a hybrid WB on it anyway, so the vapor chamber won't be blocking...

Thanks for your help


----------



## KingEngineRevUp

Quote:


> Originally Posted by *bloodhawk*
> 
> Are you using the Hybrid? The most gains you will see would be from replacing the VRM thermal pads with higher-conductivity ones (11W/mK or 17W/mK). I use 11W/mK ones, and with my water block the VRM temps dropped by ~15C depending on where my thermal sensor was placed. That helps more in holding higher clocks at higher voltages.
> 
> @KedarWolf, do you mind adding "ASUS Bios with 3 x 10Ohm resistor mod" to the owners sheet?


Yeah I got some fujiblahblah ones too.

I have a kraken x41 mounted as a ghetto hybrid.

But it's crazy: adding only 8 heatsinks previously made my core go from 51C to 48C (at stock it's at 45). I used a thermal reader; the base plate was at 40C and the heatsinks were at 36C, the base of the copper was at 40C also, so the top fins were releasing heat. Previously the midplate was at 45C.

I added some more heatsinks because of the shunt mod which will introduce 50-75 more watts to my card. The vrm will be getting warmer so transferring any heat to the mid-plate is going to help like it did before.

I'm also going to use CLU on my die and shim, so that should help transfer heat more efficiently.

Then lastly, adding a thermal pad between the GPU backside and mid-plate to transfer some more heat. Since the pump keeps the GPU around 45C, the mid-plate shouldn't have a problem due to thermal expansion.

Will post results tomorrow. Hopefully everything goes smoothly.


----------



## bloodhawk

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah I got some fujiblahblah ones too.
> 
> I have a kraken x41 mounted as a ghetto hybrid.
> 
> But it's crazy, adding only 8 previously made my core go from 51C to 48C (at stock it's at 45). I used a thermal reader, the base plate was at 40C and the heatsinks were at 36C, base of copper was at 40C also, so the top fins were releasing heat. Previously the MIDPLATE was at 45C.
> 
> I added some more heatsinks because of the shunt mod which will introduce 50-75 more watts to my card. The vrm will be getting warmer so transferring any heat to the mid-plate is going to help like it did before.
> 
> I'm also going to use CLU on my die and shim, so that should help transfer heat more efficiently.
> 
> Then lastly, adding a thermal pad between the GPU backside and mid-plate to transfer some more heat. Since the pump keeps the GPU around 45C, the mid-plate shouldn't have a problem due to thermal expansion.
> 
> Will post results tomorrow. Hopefully everything goes smoothly.


Yeah, those copper ones probably help, but better thermal pads on GPUs are so underrated, especially when someone plans on using a power mod and running at high voltages.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Pandora's Box*
> 
> What you are doing is redundant. Once you reach your monitor's max refresh rate, G-Sync switches off and V-Sync kicks in. Using RTSS or any other program to limit fps isn't doing anything.


There are studies that show limiting fps 1 or 2 below the monitor refresh rate reduces input lag compared to letting V-Sync kick in.


----------



## Slackaveli

this
Quote:


> Originally Posted by *SlimJ87D*
> 
> There are studies that show limiting fps 1 or 2 below the monitor refresh rate reduces input lag compared to letting V-Sync kick in.


Quote:


> Originally Posted by *Pandora's Box*
> 
> What you are doing is redundant. Once you reach your monitor's max refresh rate, G-Sync switches off and V-Sync kicks in. Using RTSS or any other program to limit fps isn't doing anything.


Yeah it IS, it's eliminating input lag. I agree, the best method is 1 fps less than your Hz, locked, i.e. 59 on a 60Hz, or I do 142 on my 144Hz. Lock it, V-Sync off, G-Sync on. NO input lag. With the V-Sync method there is some lag, especially if you are hitting over 135fps, which we are on 1080 Tis.
Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah I got some fujiblahblah ones too.
> 
> I have a kraken x41 mounted as a ghetto hybrid.
> 
> But it's crazy, adding only 8 previously made my core go from 51C to 48C (at stock it's at 45). I used a thermal reader, the base plate was at 40C and the heatsinks were at 36C, base of copper was at 40C also, so the top fins were releasing heat. Previously the MIDPLATE was at 45C.
> 
> I added some more heatsinks because of the shunt mod which will introduce 50-75 more watts to my card. The vrm will be getting warmer so transferring any heat to the mid-plate is going to help like it did before.
> 
> I'm also going to use CLU on my die and shim, so that should help transfer heat more efficiently.
> 
> Then lastly, adding a thermal pad between the GPU backside and mid-plate to transfer some more heat. Since the pump keeps the GPU around 45C, the mid-plate shouldn't have a problem due to thermal expansion.
> 
> Will post results tomorrow. Hopefully everything goes smoothly.


it's a helluva well thought out plan.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> Yeah it IS, it's eliminating input lag. I agree, the best method is 1 fps less than your Hz, locked, i.e. 59 on a 60Hz, or I do 142 on my 144Hz. Lock it, V-Sync off, G-Sync on. NO input lag. With the V-Sync method there is some lag, especially if you are hitting over 135fps, which we are on 1080 Tis.


The problem is not everyone likes even numbers slack-man. I grew up with ocd so I need odd numbers. I keep it at 141.99 fps cap

All the clocks in my house have had all even numbers removed.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> There are studies that show limiting fps 1 or 2 below the monitor refresh rate reduces input lag compared to letting V-Sync kick in.


Quote:


> Originally Posted by *SlimJ87D*
> 
> The problem is not everyone likes even numbers slack-man. I grew up with ocd so I need odd numbers. I keep it at 141.99 fps cap
> 
> All the clocks in my house have had all even numbers removed.


Grew up with it? Shizzz, it's here every day! I like your thinking, though.

We are right about it, though: a framerate limiter with V-Sync off and G-Sync on is best imo, and I tried several methods.


----------



## mshagg

The Inno3D BIOS seems OK. Out of the box it boosts to 2025/5700. It still power-limits like crazy in Superposition 4K. Setting the power limit to 116% in Afterburner gets you like 313W; you need to use nvidia-smi to get the extra 2W.

All 3 displayports are enabled.
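Since nvidia-smi keeps coming up: besides raising the limit (`nvidia-smi -pl <watts>`), `nvidia-smi -q -d PERFORMANCE` lists which throttle reasons are currently active, which maps onto the PerfCap flags people are chasing in this thread. A sketch that pulls the active reasons out of that report; the sample text below is illustrative, and the exact wording can vary between driver versions:

```python
# Illustrative excerpt of `nvidia-smi -q -d PERFORMANCE` output.
SAMPLE = """\
    Clocks Throttle Reasons
        Idle                        : Not Active
        Applications Clocks Setting : Not Active
        SW Power Cap                : Active
        HW Slowdown                 : Not Active
        SW Thermal Slowdown         : Not Active
"""

def active_throttle_reasons(report):
    """Return the names of throttle reasons currently flagged Active."""
    reasons = []
    for line in report.splitlines():
        name, sep, state = line.partition(":")
        if sep and state.strip() == "Active":
            reasons.append(name.strip())
    return reasons

print(active_throttle_reasons(SAMPLE))
```

Run the real command in a second terminal while a benchmark loops and you can see whether it is the power cap, voltage reliability, or thermals holding the clock down.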


----------



## Slackaveli

Quote:


> Originally Posted by *mshagg*
> 
> The Inno3D BIOS seems OK. Out of the box it boosts to 2025/5700. It still power-limits like crazy in Superposition 4K. Setting the power limit to 116% in Afterburner gets you like 313W; you need to use nvidia-smi to get the extra 2W.
> 
> All 3 displayports are enabled.


sounds like a winner for the guys with middling or poor cards. May enable them to get 2000 to hold.


----------



## Dasboogieman

Quote:


> Originally Posted by *DStealth*
> 
> Seems I had a defective card, or at least some of its components (not sure what they're for)... but yesterday while playing [email protected], temperatures were in the 58-63C range with a very aggressive fan curve, and my PC shut down instantly... it was late and I went to bed...
> Today I disassembled the card and it looked burned on the back; attaching a picture.
> The saddest thing is it was connected via the DVI port and took my good old precious 30" 2560x1600 IPS Apple Cinema HD monitor with it


Sorry a bit late to post.
That blown component looks like a regulation IC for the power draw from the 8pin supply just from its location alone.

Looks to me like a power surge went through that cable and took out that part, and additionally your monitor. You may want to check your PSU and/or ensure your PC has surge protection, just to be safe in case your replacement GPU suffers the same fate.

The 2nd possibility, though extremely unlikely, is that the component blew due to overheating under the backplate; even less likely is that load balancing messed up and sent too much power through that rail.


----------



## DStealth

Yes, from the location my assumption is the same... I don't know how load balancing works on these cards, i.e. between the 1st/2nd 8-pin connectors and the PCI-e slot... but the shunt mod may be related to it, as the resistance on all 3 shunts changed by quite a margin, and not linearly for sure. So be careful modding them. Yes, my PSU has surge protection; I had to reset it manually with the paper clip on the 24-pin connector to get it to operate after the failure. And yes, overheating is also possible with these backplates, but judging from the Guru3d review there is not much heat generated at this point...


----------



## Clukos

Quote:


> Originally Posted by *DStealth*
> 
> Yes, from the location my assumption is the same... I don't know how load balancing works on these cards, i.e. between the 1st/2nd 8-pin connectors and the PCI-e slot... but the shunt mod may be related to it, as the resistance on all 3 shunts changed by quite a margin, and not linearly for sure. So be careful modding them. Yes, my PSU has surge protection; I had to reset it manually with the paper clip on the 24-pin connector to get it to operate after the failure. And yes, overheating is also possible with these backplates, but judging from the Guru3d review there is not much heat generated at this point...


I did a search for your PSU and it seems from user reviews that multiple people are saying it blew up and took components with it. Maybe the PSU is to blame and not the card.


----------



## KedarWolf

Really close to a 12000 TimeSpy with Gigabyte F4 beta BIOS.









I'll add it as an attachment here: right-click, Properties, Unblock, unzip, then run the .exe file in the main folder; it should say flashed successfully.









Too big for an attachment, link to it in Google Drive.









Deleted, doesn't work.


----------



## leequan2009

Quote:


> Originally Posted by *KedarWolf*
> 
> Really close to a 12000 TimeSpy with Gigabyte F4 beta BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll add it as an attachment here, right click, Properties, Unblock, unzip, run .exe file in main folder, should say flashed successfully.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Too big for an attachment, link to it in Google Drive.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://drive.google.com/file/d/0ByOOg6Qw8gHZRHVuUTdxNHlEbVU/view?usp=sharing


Will this BIOS work on a 1080Ti FE?


----------



## Clukos

Quote:


> Originally Posted by *KedarWolf*
> 
> Really close to a 12000 TimeSpy with Gigabyte F4 beta BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll add it as an attachment here, right click, Properties, Unblock, unzip, run .exe file in main folder, should say flashed successfully.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Too big for an attachment, link to it in Google Drive.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://drive.google.com/file/d/0ByOOg6Qw8gHZRHVuUTdxNHlEbVU/view?usp=sharing


Lol what is up with that vram speed


----------



## cugno87

Quote:


> Originally Posted by *KedarWolf*
> 
> Really close to a 12000 TimeSpy with Gigabyte F4 beta BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll add it as an attachment here, right click, Properties, Unblock, unzip, run .exe file in main folder, should say flashed successfully.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Too big for an attachment, link to it in Google Drive.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://drive.google.com/file/d/0ByOOg6Qw8gHZRHVuUTdxNHlEbVU/view?usp=sharing


Thank you. Can you tell me what's the 100% fan speed for that bios?


----------



## outofmyheadyo

Quote:


> Originally Posted by *KedarWolf*
> 
> Really close to a 12000 TimeSpy with Gigabyte F4 beta BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll add it as an attachment here, right click, Properties, Unblock, unzip, run .exe file in main folder, should say flashed successfully.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Too big for an attachment, link to it in Google Drive.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://drive.google.com/file/d/0ByOOg6Qw8gHZRHVuUTdxNHlEbVU/view?usp=sharing


This is the 1080 Ti thread; is that a Titan BIOS, or what am I missing?


----------



## DStealth

Quote:


> Originally Posted by *Clukos*
> 
> I did a search for your PSU and it seems from user reviews that multiple people are saying it blew up and took components with it. Maybe the PSU is to blame and not the card.


I have the second version of the unit; the problems you're reading about with the X-1250W SS are with the first model, not the XM2 (Seasonic SS-1250XM2 1250W).

Anyway, my monitor is alive; it seems some protection was activated, and today with another card it started up...









@KedarWolf push it further m8 still 250 points behind my already dead air cooled card in GPU score http://www.3dmark.com/spy/1562028


----------



## mshagg

Quote:


> Originally Posted by *KedarWolf*
> 
> Really close to a 12000 TimeSpy with Gigabyte F4 beta BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll add it as an attachment here, right click, Properties, Unblock, unzip, run .exe file in main folder, should say flashed successfully.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Too big for an attachment, link to it in Google Drive.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://drive.google.com/file/d/0ByOOg6Qw8gHZRHVuUTdxNHlEbVU/view?usp=sharing


Looks to me like you're still using the Strix BIOS. The Gigabyte one should be 86.02.39.00.57, and the vendor also shows as Asus in GPU-Z?

I tried the Gigabyte update utility; it said it worked but doesn't appear to have taken. There's a version of the BIOS on TPU you could try.


----------



## JJ1217

Hi guys, anyone know if it's safe to use Thermal Grizzly Conductonaut on the 1080 Ti FE? Idk what metal makes contact with the die. Thanks to anyone who can answer this for me. If not, I'll just use Kryonaut.


----------



## cugno87

Quote:


> Originally Posted by *JJ1217*
> 
> Hi guys, anyone know if its safe to use Thermal Grizzly Conductonaut on the 1080TI FE? Idk what metal it is that makes contact with the die. Thanks to anyone who can answer this for me. If not I'll just use kryonaut


I'm using CLU on the FE and it works fine (-7/6°C). The FE's cooler is nickel-plated copper. I also used it on my old 970 for a year and I had no problems.


----------



## KedarWolf

Quote:


> Originally Posted by *leequan2009*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Really close to a 12000 TimeSpy with Gigabyte F4 beta BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll add it as an attachment here, right click, Properties, Unblock, unzip, run .exe file in main folder, should say flashed successfully.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Too big for an attachment, link to it in Google Drive.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://drive.google.com/file/d/0ByOOg6Qw8gHZRHVuUTdxNHlEbVU/view?usp=sharing
> 
> 
> 
> 
> 
> Does 1080Ti FE use this BIOS?

Yes, using it on my FE.


----------



## KedarWolf

Quote:


> Originally Posted by *DStealth*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Clukos*
> 
> I did a search for your PSU and it seems from user reviews that multiple people are saying it blew up and took components with it. Maybe the PSU is to blame and not the card.
> 
> 
> 
> I have the second version of the unit, the problems you're reading with X-1250w SS are with the first, not XM2 model - Seasonic SS-1250XM2 1250W
> 
> Anyway my monitor is alive, seems some protection was activated and today with another card it has started...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @KedarWolf push it further m8 still 250 points behind my already dead air cooled card in GPU score http://www.3dmark.com/spy/1562028

I got 11098, a bit higher than 10578.


----------



## JJ1217

Quote:


> Originally Posted by *cugno87*
> 
> I'm using CLU on the FE and it works fine (-7/6°C). The FE's cooler is nickel-plated copper. I also used it on my old 970 for a year and I had no problems.


Thanks, that's super helpful. Mine comes soon; a store in Australia lowered the FE price from $1129 to $1000, just for the EVGA card. Can't wait to get it.


----------



## KedarWolf

Quote:


> Originally Posted by *mshagg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Really close to a 12000 TimeSpy with Gigabyte F4 beta BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll add it as an attachment here, right click, Properties, Unblock, unzip, run .exe file in main folder, should say flashed successfully.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Too big for an attachment, link to it in Google Drive.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://drive.google.com/file/d/0ByOOg6Qw8gHZRHVuUTdxNHlEbVU/view?usp=sharing
> 
> 
> 
> 
> 
> Looks like still using the strix bios to me. Gigabyte one should be 86.02.39.00.57. Vendor also shows as Asus in Gpuz?
> 
> I tried the Gigabyte update utlity, it said it worked but doesnt appear to have taken. There's a version of the bios on TPU you could try.

Yes, you're right. The Xtreme BIOS doesn't work; I already tried it.


----------



## cugno87

Quote:


> Originally Posted by *JJ1217*
> 
> Thanks, that's super helpful. Mine comes soon; a store in Australia lowered the FE price from $1129 to $1000, just for the EVGA card. Can't wait to get it.


You're welcome


----------



## KedarWolf

The FTW3 has been released; we REALLY NEED that BIOS to test.


----------



## KedarWolf

Quote:


> Originally Posted by *cugno87*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Really close to a 12000 TimeSpy with Gigabyte F4 beta BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll add it as an attachment here, right click, Properties, Unblock, unzip, run .exe file in main folder, should say flashed successfully.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Too big for an attachment, link to it in Google Drive.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://drive.google.com/file/d/0ByOOg6Qw8gHZRHVuUTdxNHlEbVU/view?usp=sharing
> 
> 
> 
> 
> 
> Thank you. Can you tell me what's the 100% fan speed for that bios?

The BIOS doesn't really flash...


----------



## DStealth

Quote:


> Originally Posted by *KedarWolf*
> 
> I got 11098. Bit higher than 10578.


We're discussing video cards and GPU score, not the influence a 6-core vs 8-core CPU has on the overall score... let alone your post about 12000... nvm









Edit: Let me simplify it for you:
You vs. me:
GT1: 71.69 vs 72.58 (1.2% less)
GT2: 63.84 vs 65.68 (2.9% less)


----------



## KedarWolf

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Really close to a 12000 TimeSpy with Gigabyte F4 beta BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll add it as an attachment here, right click, Properties, Unblock, unzip, run .exe file in main folder, should say flashed successfully.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Too big for an attachment, link to it in Google Drive.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://drive.google.com/file/d/0ByOOg6Qw8gHZRHVuUTdxNHlEbVU/view?usp=sharing
> 
> 
> 
> 
> 
> This is the 1089ti thread, but a titan bios or what am I missing?

The BIOS doesn't actually flash...


----------



## KedarWolf

Quote:


> Originally Posted by *AngryLobster*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Games don't really matter for me. I have a 4K G-Sync screen with Rivatuner capping framerates at 59, one less than my refresh rate, and clocks never max out even on the highest detail settings in the games I play the most.
> 
> It's all about waving my huge e-____ around and benching.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Why are you capping your FPS instead of just turning Vsync on? Vsync does it for you when you have Gsync enabled.

Less latency capping FPS with Rivatuner and no Vsync...


----------



## mshagg

Quote:


> Originally Posted by *KedarWolf*
> 
> Yes, you're right. Extreme BIOS doesn't work, already tried it.


There are some non-Extreme GA BIOSes to try:

https://www.techpowerup.com/vgabios/191240/191240
https://www.techpowerup.com/vgabios/191202/191202

The second one is from the Gaming OC card, with 8+2 power phases, but it seems to have only a 300W limit:

https://www.aorus.com/product-detail.php?p=168&t=17&t2=23&t3=27


----------



## RJacobs28

All finished up with my 2 Strix cards.


----------



## error-id10t

Sorry guys, haven't kept up; I've needed a card since I sold my 970s... two-odd years ago, I think. Which Ti should I grab, suggestions?


----------



## outofmyheadyo

Quote:


> Originally Posted by *KedarWolf*
> 
> FTW3 has been released, we REALLY NEED that BIOS to test.


If the FTW3 has dual 8-pin, it's as useless as the Xtreme Gigabyte one on the TPU database for us with 6+8, I'm afraid.


----------



## outofmyheadyo

Quote:


> Originally Posted by *outofmyheadyo*
> 
> If the ftw3 has double 8pin it's as useless as the xtreme gigabyte one on TPU database, for us with 6+8 im afraid.


It never ran above 0.8v and 1200mhz for me.


----------



## ttnuagmada

Can anyone speculate as to how risky it would be to try flashing one of my MSI Armor OC cards with the Asus Xtreme Gaming vbios? My card is dual 8 pin.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *ttnuagmada*
> 
> Can anyone speculate as to how risky it would be to try flashing one of my MSI Armor OC cards with the Asus Xtreme Gaming vbios? My card is dual 8 pin.


Just grow a pair and try it. As long as you back up your BIOS first.


----------



## outofmyheadyo

What is Asus Xtreme Gaming? Do you mean Gigabyte? I tried it on my FE; stuck at 0.8v and 1200MHz.


----------



## imLagging

I remember seeing a thread with someone comparing Windows 7 vs 10 in 3DMark Fire Strike, and the difference was around ~10%. Think I would gain some points with this benchmark on 10? The highest I have been able to push is 2139MHz; it crashes after a short time at 2154MHz in Heaven and Superposition.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *DStealth*
> 
> Yes, from the location my assumption is the same... I do not know how load balancing on these cards works, i.e. 1st/2nd 8-pin connector and PCIe line... but maybe the shunt mod is related to it, as the resistance on all 3 shunts changed by quite a margin, and not linearly for sure... So be careful modding them. Yes, my PSU has surge protection; I had to reset it manually with the paper clip on the 24-pin connector to operate again after the fail. And yes, overheating is also possible with these backplates, but judging from the Guru3D review, at this point there is not so much heat generated...


You had an MSI GTX 1080 Ti Gaming X, right? And you shunt modded it?

Did you ever measure how many watts it was drawing? Do you think you just drew too much? Doesn't your card already have access to 330 watts?


----------



## DStealth

It has more than 330w for sure...never measured it.


----------



## leequan2009

Quote:


> Originally Posted by *KedarWolf*
> 
> Yes, using it on my FE.


That BIOS has a max power of 375W with 8+8 pin, but the FE version is 300W with 8+6 pin. Will that cause damage to the card?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *DStealth*
> 
> It has more than 330w for sure...never measured it.


Did you leave the power slider maxed out? Because that gives 400+ watts on an FE. I can't imagine what it would give on a partner card.


----------



## feznz

Quote:


> Originally Posted by *error-id10t*
> 
> Sorry guys haven't kept up, need a card since I sold my 970s.. well two odd years ago I think. Which TI should I grab, suggestions..?


The cards are almost the same; it comes down to silicon lottery. Personally I would base my choice on aesthetics, then price, over brand.


----------



## OneCosmic

Quote:


> Originally Posted by *SlimJ87D*
> 
> You had a msi GTX 1080 Ti gaming x right? And you shunt modded it?
> 
> Did you ever measure how much watts it was drawing? Do you think you just drew too much? Doesn't your card already have access to 330 watts?


330W probably wasn't why that component burned. I think the shunt mod messed up something else on the PCB, some other voltage, and that's why it burned. Maybe if he had done the mod the other way around, by modding the caps with 10-ohm resistors, it would have been running fine. The VRM on the MSI Gaming X, as on any other 1080 Ti, can handle 400W or more, no problem.


----------



## ttnuagmada

So I "grew a pair" as I was instructed, and flashed the Gigabyte extreme gaming vbios on my MSI Armor OC's, but even though everything appeared to be good, it shows power as the perfcap, even at stock, and the GPU clock stays mostly below 1000mhz. So that would be a no go. The Armor already has a 330w TDP ([email protected]%) but I was hoping to get the 375w cap, as even with the 330w, I still can hit a power perfcap.

Someone needs to unlock this vbios.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *OneCosmic*
> 
> 330W wasn't probably why that component burned, i think the shunt mod messed up something else on the pcb, some other voltage and that's why it burned, maybe if he did the mod the other way arround by modding the caps with 10ohm resistors then it would have been running fine. The VRM on MSI Gaming X as on any other 1080 Ti can handle no problem 400W or more.


He said he shunted all 3 resistors too, so that's opening up even more power, although like you said, the area that burned out is by the pins.

Pretty scary stuff. I'm going to shunt mod my card today. I don't have the soldering skills to do the resistor mod, but I'm going to start by limiting power draw at 75% and working my way up, using my power supply to read how much current is being drawn.

Total system power draw was previously 410 watts max, so if it goes to 460 I'll keep the slider there.

I'm not sure if I'll be able to use the NVIDIA power reading, since the shunt mod might throw it off.
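The step-up plan above is easy to sanity-check at the wall. A rough sketch, assuming a 250W base TDP and ~160W for the rest of the system (both illustrative figures, not measurements, and wall readings also include PSU losses):

```python
# Rough sanity check for stepping the power slider up after a shunt
# mod, using total system draw at the wall as the guard rail.
# base_tdp and system_rest are assumed example values.

base_tdp = 250.0        # W, board power at a 100% slider (assumed)
system_rest = 160.0     # W, CPU/board/drives (assumed)
wall_cap = 460.0        # W, self-imposed total-system limit

for pct in (75, 90, 100, 110, 120):
    gpu = base_tdp * pct / 100
    total = gpu + system_rest
    flag = "OK" if total <= wall_cap else "over cap"
    print(f"{pct:3d}% -> GPU ~{gpu:.0f} W, system ~{total:.0f} W ({flag})")
```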


----------



## KingEngineRevUp

Quote:


> Originally Posted by *ttnuagmada*
> 
> So I "grew a pair" as I was instructed, and flashed the Gigabyte extreme gaming vbios on my MSI Armor OC's, but even though everything appeared to be good, it shows power as the perfcap, even at stock, and the GPU clock stays mostly below 1000mhz. So that would be a no go. The Armor already has a 330w TDP ([email protected]%) but I was hoping to get the 375w cap, as even with the 330w, I still can hit a power perfcap.
> 
> Someone needs to unlock this vbios.


Yeah, maybe we can start a community fund to motivate programmers to unlock the BIOS.


----------



## zipeldiablo

Quote:


> Originally Posted by *SlimJ87D*
> 
> He said he shunt all 3 resistors too so that's opening even more power although like you said, the area that burned out is from the pins.
> 
> Pretty scary stuff. I'm going to shunt mod my card today. Don't have the soldering skills to do the resistor mod. But I'm going to start by limiting power draw at 75% and working my way up using my power supply to read how much current is being drawn.
> 
> Total system power draw previously was 410 watts max so if it goes to 460 I'll keeping the slider there.
> 
> I'm not sure if I'll be able to use the nvidia power command since the shunt mod might throw its reading off.


What is the resistor mod? Security for the shunt?
There is so much stuff but nothing on the first page


----------



## Dasboogieman

Quote:


> Originally Posted by *zipeldiablo*
> 
> What is the resistor mod? Security for the shunt?
> There is so much stuff but nothing on the first page


TL;DR: it's a hard mod to increase the power limit.

NVIDIA uses a set of 3 low-resistance surface-mount shunt resistors to regulate the power draw across the 3 rails. It measures the voltage drop across each 5 mOhm resistor (1 for every power rail: the two 8/6-pin inputs and the PCIe slot) in conjunction with a voltage-monitoring chip.

What people are doing is reducing the resistance of these shunts by introducing a parallel electrical path, also with a nominal resistance, thus "fooling" the regulator into thinking a lot less power is being drawn than actually is. This is done by spreading liquid metal across the shunt to act as the bridge; it gives the most consistent result and also has some degree of reversibility in case of RMA.

However, the regulator chip has a failsafe mode: if any of the 3 rails shows zero resistance (i.e. you shorted it with solder), it forces the GPU into "safe" mode, locked at 100MHz IIRC.

----------



## beastlyhax

Was just wondering why people are calculating their shunt mod power draw based on load draw? Is it not more accurate to compare idle draw?


----------



## Dasboogieman

Quote:


> Originally Posted by *beastlyhax*
> 
> Was just wondering why people are calculating their shunt mod power draw based on load draw? Is it not more accurate to compare idle draw?


It's too hard to quantify a difference in idle draw because the conditions are difficult to replicate, as opposed to full draw. Plus, the small figures involved amplify any variance in measurement.

I suspect a lot of people who do the shunt mod don't really care exactly what their headroom is; as long as it's functionally "infinite", that's good enough for them.


----------



## outofmyheadyo

Quote:


> Originally Posted by *ttnuagmada*
> 
> So I "grew a pair" as I was instructed, and flashed the Gigabyte extreme gaming vbios on my MSI Armor OC's, but even though everything appeared to be good, it shows power as the perfcap, even at stock, and the GPU clock stays mostly below 1000mhz. So that would be a no go. The Armor already has a 330w TDP ([email protected]%) but I was hoping to get the 375w cap, as even with the 330w, I still can hit a power perfcap.
> 
> Someone needs to unlock this vbios.


Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, maybe we can start a community fund to motivate programmers to unlocking the bios.


I would so pay for this. I bet someone would do it for a decent sum; we should start a Kickstarter or other similar crowdfunding effort for this.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *zipeldiablo*
> 
> What is the resistor mod? Security for the shunt?
> There is so much stuff but nothing on the first page


That's because there's a shunt mod thread on its own


----------



## Dasboogieman

Quote:


> Originally Posted by *outofmyheadyo*
> 
> I would so pay for this, I bet someone would surely do it for a decent sum, should start a kickstarter or other similar crowdfunding effort for this.


To my knowledge, the current BIOS modders need some kind of decryption key that you can only get from the OEMs directly if you ask. I tried asking Gigabyte and MSI, but they wanted me to void my warranty before they'd hand it over lol.


----------



## mouacyk

Quote:


> Originally Posted by *Dasboogieman*
> 
> To my knowledge the current BIOS modders need some kind of decryption key that you can only get from the OEMs directly if you ask. I tried asking Gigabyte + MSI but they wanted me to void my warranty before they'd hand it over lol.


Sounds like a fair trade to me.


----------



## rolldog

I just placed a pre-order for 2 x Gigabyte AORUS GeForce GTX 1080 Ti Extreme Edition, GV-N108TAORUS X-11GD

http://www.gigabyte.com/Graphics-Card/GV-N108TAORUS-X-11GD#kf

Since I haven't been keeping up with these cards as much as I wanted to, what does everyone think of this card, which is soon to be released? It has 12+2 Power Phases, 2 x 8 pin power supply connectors, and looks like it'll be one of the best 1080 Tis on the market, once it's released. I'm just rolling the dice hoping EK releases a waterblock for this non-reference card, similar to the Gigabyte 980 Ti Gaming G1 GPUs I'm running now. If there's another card out, or soon to be out, which might have more potential power, please let me know. Since it's a pre-order I still have time to cancel this order, but from what I've seen, this might be the card to beat.


----------



## Dasboogieman

Quote:


> Originally Posted by *rolldog*
> 
> I just placed a pre-order for 2 x Gigabyte AORUS GeForce GTX 1080 Ti Extreme Edition, GV-N108TAORUS X-11GD
> 
> http://www.gigabyte.com/Graphics-Card/GV-N108TAORUS-X-11GD#kf
> 
> Since I haven't been keeping up with these cards as much as I wanted to, what does everyone think of this card, which is soon to be released? It has 12+2 Power Phases, 2 x 8 pin power supply connectors, and looks like it'll be one of the best 1080 Tis on the market, once it's released. I'm just rolling the dice hoping EK releases a waterblock for this non-reference card, similar to the Gigabyte 980 Ti Gaming G1 GPUs I'm running now. If there's another card out, or soon to be out, which might have more potential power, please let me know. Since it's a pre-order I still have time to cancel this order, but from what I've seen, this might be the card to beat.


Just a heads up: the Gigabyte cards have the highest TDP ceiling of all the 1080 Tis; however, the stock OC is extremely aggressive, so you may have instability straight out of the box.
Also, despite having 12+2, it actually only has 3 voltage drivers (each with a quadrupler), so it's really only 3 "true" Vcore phases. This means that while it can supply massive sustained current, it also has the worst theoretical voltage regulation of any GTX 1080 Ti (I believe ASUS has the best regulation with 6 Vcore drivers and no doublers, followed by MSI with the same layout as the FE but better-quality chokes and MOSFETs). Chances are any future MSI Lightning or EVGA Classified/Kingpin will have a heightened TDP ceiling as well.

Though from personal experience with the Aorus, the fan is extremely quiet at 100%. Possibly the quietest of any model (barring the Seahawk), so it makes little sense to run the fans <100%. In fact, some Aorus owners have noticed the fans are actually relatively inefficient at low RPMs (I think MSI is the best in this area, but I'm not certain).

The Aorus also stands to benefit hugely from a repaste; the stock paste is crap and/or non-existent.


----------



## KickAssCop

Quote:


> Originally Posted by *RJacobs28*
> 
> All finished up with my 2 Strix cards.


Temps at full load and clock speeds?


----------



## AorusGuy

Hey guys just want some advice/opinions.

Got my Aorus non-Extreme two days ago and it runs pretty well out of the box, but not so much when it came to overclocking. With the original BIOS, the OC mode (which is just a +25 offset) was extremely unstable. After hours of testing I found that I could only get a +20 offset, which would put me at about a 1949 boost clock. At that point I couldn't push memory past 200 without stability issues. This is with max power limit.

I flashed the F4 BIOS and it did help plenty. With F4 I can run the OC mode and it is stable. With the voltage/frequency curve I was able to get [email protected], which, while pretty terrible compared to most, is better than the [email protected] it was running at before. I am also able to push memory up to 500 or so now without issue.

You guys have any ideas for this card? New to overclocking with Boost 3.0/Pascal. I guess I just didn't win the lottery on this one?


----------



## Dasboogieman

Quote:


> Originally Posted by *AorusGuy*
> 
> Hey guys just want some advice/opinions.
> 
> Got my Aorus non-extreme two days ago and it runs pretty good out of the box but not so much when it came to overclocking. With the original bios the OC mode (which is just a +25 offset) was extremely unstable. After hours of testing I found that I could only get a +20 offset which would put me at about a 1949 boosted clock. At this time I couldn't push memory past 200 without stability issues. This is with max power limit.
> 
> I flashed the F4 bios and it did help plenty. With F4 I can run the OC mode and it is stable. With the voltage/freq curve I was able to get 1961mH[email protected] which while it is pretty terrible compared to most its better than the [email protected] that it was running at. I am also able to push memory up to 500 or so now without issue.
> 
> You guys have any ideas for this card? New to overclocking with boost 3.0/pascal. I guess I just didn't win the lottery on this one?


Mine was a poop overclocker as well; I'm capped out at 2025 with max voltage.

Try keeping the core <70 degrees and see if that improves your clocks. I spent an hour figuring out why I was crashing at previously stable 2025 clocks, then realised my temps were 72 degrees at 90% fan speed. I cranked it to 100%, temps dropped to 65, and the core became stable again.


----------



## outofmyheadyo

So after messing around with all the available BIOSes, I decided to shunt mod my Gigabyte Founders 1080 Ti. TDP seems to be sorted: Superposition @ 4K Optimized @ 2075MHz @ 1.050v (locked) didn't hit the TDP limit, not even close.
But what does VRel mean? It needs more voltage? I don't know.

stockbios+POWERMOD+20PT @ 2075/[email protected](locked)



[email protected] = TDP limit



some more pictures of the mod for those interested












----------



## KingEngineRevUp

Quote:


> Originally Posted by *outofmyheadyo*
> 
> So after messing around with all the available bioses I decided to shuntmod my gigabyte founders 1080ti, TDP seems to be sorted, superposition @ 4koptimized @ 2075mhz @1.050v(locked) didnt hit the TDP limit not even close
> But what does vrel mean ? It needs more voltage ? I dont know.
> 
> stockbios+POWERMOD+20PT @ 2075/[email protected](locked)
> 
> 
> 
> [email protected] = TDP limit
> 
> 
> 
> some more pictures of the mod for those interested
> 


I think you need voltage man. I get higher results while power limited.


----------



## outofmyheadyo

It's not only the GPU. Do you have a Ryzen 1700 running at 3.9 as well?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Its not only the GPU do you have a ryzen 1700 runnin at 3.9 aswell ?


4790K at 4.7GHz. But we're doing a 4K run; does the CPU really hold it back? That's possible, I guess.

Thanks for sharing photos. Will I have enough CLU to do this, put some on the GPU and also on the other side of a copper shim?


----------



## rolldog

Quote:


> Originally Posted by *Dasboogieman*
> 
> Just a heads up: the Gigabyte cards have the highest TDP ceiling of all the 1080tis, however, the stock OC is extremely aggressive so you may have instability straight out of the box.
> Also, despite having 12+2, it actually only has 3 voltage drivers (each has a quadrupler) thus its actually only 3 "true" Vcore phases. This means while it can supply massive sustained current, it also has the worst theoretical voltage regulation of any GTX 1080ti (I believe ASUS has the best regulation with 6 vcore drivers and no doublers, followed by MSI with same as FE but with better quality chokes and MOSFETs). Chances are any future MSI Lightning or EVGA Classified/Kingpin will have a heightened TDP ceiling as well.
> 
> Though from what I can say from personally using the Aorus, the fan is extremely quiet at 100%. Possibly the quietest of any model (barring the Seahawk), thus it makes little sense to run the fans <100%. In fact, some Aorus owners seem to have noticed the fans are actually relatively inefficient at low RPMs (I think MSI is the best in this area but I'm not certain).
> 
> The Aorus also stands to benefit hugely from a repaste, the stock paste is crap and/or non-existent.


I guess I'm getting the cart before the horse, since I'm buying the cards hoping EK releases a waterblock for them. With this in mind, knowing that the stock cooler is being removed, if they release a non-reference waterblock like they did for the 980 Ti, would I be better off with a different card?

Well, looks like EK will be releasing a waterblock for this GPU. Knowing that I'll be watercooling it, would that make any difference in which card you would get? I'm not sure what the benefits/drawbacks of having 6 vcore drivers vs 3 voltage drivers with quadruplers are, but having the highest TDP would imply higher overclocks, right? And since the cards will be watercooled, heat shouldn't be a problem either. Would this difference between the cards possibly lead to instability or something?


----------



## outofmyheadyo

Quote:


> Originally Posted by *rolldog*
> 
> I guess I'm getting the cart before the horse, since I'm buying the cards hoping EK releases a waterblock for them. With this in mind, knowing that the stock cooler is being removed, if they release a non-reference waterblock like they did for the 980 Ti, would I be better off with a different card?


If I were you I would just get a Founders; you have plenty of Founders cards to choose from (and plenty of blocks, not just EK), and after shunt modding mine does 2100/12000 with 1.063v.
If you plan on watercooling anyway, paying extra for the better cooler makes no sense, and be it the Mega, Super, or Hyper version, I am pretty sure none of them will come close to 2200MHz anyway, so save yourself some money and hassle and get a Founders.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *outofmyheadyo*
> 
> If I were you I would just get founders, you have plenty of founder cards to choose from ( and plenty of blocks, not just EK ) and after shuntmoding mine does 2100/12000 with 1.063v.
> If you plan on watercooling anyway paying extra for the better cooler makes no sense, and be it mega super or hyper version, I am pretty sure none of em will come close to 2200mhz anyway, so save yourself some money and hassle and get founders.


Yo, not sure if you saw my post earlier, but if I just bought CLU, will I have enough to shunt two resistors and do the GPU die and shim?


----------



## outofmyheadyo

Shunts take next to nothing, and you will surely have enough to do both sides of your shim!


----------



## AorusGuy

Ya this is with the temps never going above 64 degrees in heavy stress testing.


----------



## Slackaveli

Quote:


> Originally Posted by *Clukos*
> 
> I did a search for your PSU and it seems from user reviews that multiple people are saying it blew up and took components with it. Maybe the PSU is to blame and not the card.


What's crazy is the monitor going, too. I'd be sick about that. Try to RMA that, too.
Quote:


> Originally Posted by *zipeldiablo*
> 
> What is the resistor mod? Security for the shunt?
> There is so much stuff but nothing on the first page


well, you could have read the whole thread like me lol


----------



## GosuPl

GTX 1080Ti vs TITAN X Pascal (2016) in 11 games, part 1.

1080p/1440p/4k

1. Mass Effect Andromeda
2. Ghost Recon Wildlands
3. Battlefield 1
4. Shadow Warrior 2
5. Resident Evil 7

Part 2 and SLI vs SLI, soon. TITAN Xp and TITAN Xp SLI, a little bit later


----------



## Slackaveli

Quote:


> Originally Posted by *Dasboogieman*
> 
> Just a heads up: the Gigabyte cards have the highest TDP ceiling of all the 1080tis, however, the stock OC is extremely aggressive so you may have instability straight out of the box.
> Also, despite having 12+2, it actually only has 3 voltage drivers (each has a quadrupler) thus its actually only 3 "true" Vcore phases. This means while it can supply massive sustained current, it also has the worst theoretical voltage regulation of any GTX 1080ti (I believe ASUS has the best regulation with 6 vcore drivers and no doublers, followed by MSI with same as FE but with better quality chokes and MOSFETs). Chances are any future MSI Lightning or EVGA Classified/Kingpin will have a heightened TDP ceiling as well.
> 
> Though from what I can say from personally using the Aorus, the fan is extremely quiet at 100%. Possibly the quietest of any model (barring the Seahawk), thus it makes little sense to run the fans <100%. In fact, some Aorus owners seem to have noticed the fans are actually relatively inefficient at low RPMs (I think MSI is the best in this area but I'm not certain).
> 
> The Aorus also stands to benefit hugely from a repaste, the stock paste is crap and/or non-existent.


All of this is true. The current seems smooth and steady, though. Also, 80% fan speed and less does very little, but 100% fan is godly, and quiet, so like he said, set your curve to hit 100% by 45C or so. You'll game in the 50Cs if you repaste as well. Love mine.


----------



## Slackaveli

Quote:


> Originally Posted by *AorusGuy*
> 
> Hey guys just want some advice/opinions.
> 
> Got my Aorus non-extreme two days ago and it runs pretty good out of the box but not so much when it came to overclocking. With the original bios the OC mode (which is just a +25 offset) was extremely unstable. After hours of testing I found that I could only get a +20 offset which would put me at about a 1949 boosted clock. At this time I couldn't push memory past 200 without stability issues. This is with max power limit.
> 
> I flashed the F4 bios and it did help plenty. With F4 I can run the OC mode and it is stable. With the voltage/freq curve I was able to get [email protected] which while it is pretty terrible compared to most its better than the [email protected] that it was running at. I am also able to push memory up to 500 or so now without issue.
> 
> You guys have any ideas for this card? New to overclocking with boost 3.0/pascal. I guess I just didn't win the lottery on this one?


Mine is the same card and can hold 2063 for hours @ 1.05v, or 2050 @ 1.04v. I'm on the original BIOS, NO volts added. Sounds like your lottery was lesser, true, but try no volts before you go trying to stabilize with added voltage, b/c mine isn't stable above 2012 or so if I go the voltage route, for some reason. Probably the way the phase design/VRMs operate.


----------



## outofmyheadyo

What does the vrel actually mean in gpuz perfcap reason ?


----------



## AorusGuy

After I fini
Quote:


> Originally Posted by *Slackaveli*
> 
> mine is same card and can hold 2063 for hours @ 1.05v, or 2050 @ 1.04 volts. Im on original bios, NO VOLTS added. Sounds like you lottery was lesser , true, but try no volts before you go trying to stabilize with voltage add b/c mine isnt stable above 2012 or so if i go the voltage route for some reason. Probably the way the phase design/vrms operate.




At 1.043v it goes unstable at 1961MHz, so that's most likely not my answer.


----------



## Slackaveli

Quote:


> Originally Posted by *AorusGuy*
> 
> After I fini
> 
> For 1.043mV it goes unstable at 1961 mHz so thats most likely not my answer.


Yeah, BIOS F4 may be, though. Or RMA it, say it's unstable at stock, and try again. Mine is proof that some better-than-Extreme regular Aorus cards exist.
Quote:


> Originally Posted by *outofmyheadyo*
> 
> What does the vrel actually mean in gpuz perfcap reason ?


"more volts are needed to achieve more perf" is my understanding. It's ok to have if that's all the volts you can give b/c of stability, for instance. If you are still stable than adding more volts will produce a higher clock bin. So, in essence, it's saying, "more clock bins are attainable if more volts are given " but it doesn't factor stability, so if you know it'll crash, than that's your limit. If you can add more w/o crashing, you have more perf being 'left on the table' so to speak. Hope that helps.


----------



## AorusGuy

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah bios F4 may be though. or rma it, say it's unstable at stock and try again. Mine is proof some better than extreme reggie aorus' exist.


Most FEs can at least hit 2GHz. On the default BIOS, the card wouldn't even work in the Aorus Engine OC mode that it says it should in the specs. At least with F4 I can get more out of it, and the memory is much more overclockable. Going to see if I can bring it back to where I bought it and get another one.


----------



## Lexiconman

Quote:


> Originally Posted by *Pandora's Box*
> 
> What you are doing is redundant. once you reach your monitors max refresh rate, gsync switches off and vsync kick in. using RTS or any other program to limit fps isnt doing anything.


Factually incorrect.


----------



## Addsome

What's this F4 BIOS everyone's talking about?


----------



## Jaysend

@Kedarwolf

I still don't seem to be on the list.
Jaysend
SLI EVGA 1080 ti FE
2050/5999


----------



## RJacobs28

Quote:


> Originally Posted by *KickAssCop*
> 
> Temps at full load and clock speeds?


Top card sits at 82, bottom at 71 running at 2000MHz core and 12000MHz memory.


----------



## Slackaveli

Quote:


> Originally Posted by *Lexiconman*
> 
> Factually incorrect.


I tried to tell him








Quote:


> Originally Posted by *Addsome*
> 
> Whats this F4 bios everyones talking about?


It's the Aorus BIOS, the one they call "efficient performance"; it has a 250W base with a 150% slider, so if you have a card that crashes at stock on the 300W base, switching to this allows it to be stable. Of course, that also allows Giga to avoid an RMA; if mine wasn't stable at 300W, I'd RMA it.


----------



## Exilon

Is CLU on the GPU die actually worth it? I'm leaning towards no.

Say you have an idle-to-load temperature delta of 25C. If we analyze where the temperature gradients are, we see that they're mostly at the radiator-to-air interface and the GPU-to-coolant interface. In my experience, a typical Asetek cold plate can do a coolant-to-GPU delta of 8C with Kryonaut (on a 980 Ti) and maybe 6C with CLU. Great, that's a huge % change in temperature at that one interface... but we need to look at the gradient across the radiator as well. A 300W GPU still puts out 300W of heat that needs to be rejected, and a 120mm radiator might do that at 15C? So the end result of using CLU is a 2C drop overall, from 23C to 21C. Not going to make a difference at all in overclocking, and you've really ratcheted up the risk factor.
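The stack-up argument works out like this (a sketch using the example deltas from the post above; real interface deltas depend on mount pressure, pump speed, and airflow):

```python
# Sketch of the thermal-stack argument: repasting only changes the
# GPU-to-coolant interface, so the total delta over ambient barely
# moves. Deltas are the example figures from the post.

radiator_to_air = 15.0   # C, 120mm radiator rejecting ~300 W

gpu_to_coolant = {
    "Kryonaut": 8.0,     # C, typical Asetek cold plate
    "CLU": 6.0,          # C, liquid metal
}

totals = {paste: radiator_to_air + d for paste, d in gpu_to_coolant.items()}

for paste, total in totals.items():
    print(f"{paste:8s}: ~{total:.0f} C over ambient")

saving = totals["Kryonaut"] - totals["CLU"]
print(f"CLU saves only ~{saving:.0f} C overall")
```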


----------



## rolldog

Quote:


> Originally Posted by *outofmyheadyo*
> 
> If I were you I would just get founders, you have plenty of founder cards to choose from ( and plenty of blocks, not just EK ) and after shuntmoding mine does 2100/12000 with 1.063v.
> If you plan on watercooling anyway paying extra for the better cooler makes no sense, and be it mega super or hyper version, I am pretty sure none of em will come close to 2200mhz anyway, so save yourself some money and hassle and get founders.


I wouldn't necessarily be paying extra for a cooler, since it'll be removed anyway, I'm paying extra for a pre-binned card. I just want a card that overclocks well, stable, with a new firmware flashed to it. I can't justify the price increase for a Titan given what the 1080 Ti can do, but I can justify paying extra for a pre-binned card with a higher ASIC, no matter what kind of cooler it has. Actually, it doesn't even need a cooler. Too bad manufacturers wouldn't sell a stripped down version with no cooler, but it's all about the cooler, and the RGB lights, which are another way for manufacturers to differentiate themselves from the rest of the crowd trying to convince you to pay a premium for their card over someone else's. At the end of the day, it's all the same thing, except silicon lottery winners can be predetermined by binning, but you have to pay a premium for this. It's not really the cooler you're paying for. It's a better card. They just can't sell it like that so they throw in something that looks different to justify the higher price.


----------



## Benny89

Quote:


> Originally Posted by *rolldog*
> 
> I wouldn't necessarily be paying extra for a cooler, since it'll be removed anyway, I'm paying extra for a pre-binned card. I just want a card that overclocks well, stable, with a new firmware flashed to it. I can't justify the price increase for a Titan given what the 1080 Ti can do, but I can justify paying extra for a pre-binned card with a higher ASIC, no matter what kind of cooler it has. Actually, it doesn't even need a cooler. Too bad manufacturers wouldn't sell a stripped down version with no cooler, but it's all about the cooler, and the RGB lights, which are another way for manufacturers to differentiate themselves from the rest of the crowd trying to convince you to pay a premium for their card over someone else's. At the end of the day, it's all the same thing, except silicon lottery winners can be predetermined by binning, but you have to pay a premium for this. It's not really the cooler you're paying for. It's a better card. They just can't sell it like that so they throw in something that looks different to justify the higher price.


Well, I don't watercool, but if I were putting blocks on it anyway and didn't plan to do the shunt mod, I would grab the Aorus Xtreme, since it has a 375W power limit compared to 330W on other AIB cards and 300W on the reference design.

But again, I'm not on water, so yeah.


----------



## mshagg

I don't believe anyone is binning the chips for these cards. If they are, the results would suggest they're doing a terrible job of it.


----------



## KingEngineRevUp

*Unlocking the FE's true potential
*
After hitting power limits, I could not take it anymore. I knew it was time for me to do the shunt mod. My card could bench at 2114MHz in Heaven, and I believe it could go higher. After all, I've seen it boost to 2138MHz before without crashing.

I felt that, as an OC member, it was irresponsible of me to throw away so much potential. My theory was that I had a pretty nice golden card. That theory was proven today.

*What I needed to prepare for*

The shunt mod, if done right without too much CLU, will drop your reported power draw from 100% to 75-80%. This means that at a reported 75-80% you are actually drawing 300-310 watts. This can be dangerous, because at a reported 120% that works out to 450 watts!
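A quick sketch of that scaling, assuming the mod makes the card under-read so that the old 300W limit now shows up as roughly 80%:

```python
# Shunt-mod power math: after the mod the sensor under-reads, so the old
# 300W TDP shows up as roughly 80% reported power (assumption from this post).
def actual_watts(reported_pct, tdp_watts=300.0, tdp_reads_as_pct=80.0):
    """Convert the reported power percentage back to real watts."""
    return tdp_watts * reported_pct / tdp_reads_as_pct

print(actual_watts(80.0))   # -> 300.0 (the old stock limit)
print(actual_watts(100.0))  # -> 375.0
print(actual_watts(120.0))  # -> 450.0 (why this mod is dangerous)
```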

Because I set out to perform this shunt mod, I needed to upgrade my cooling. My X41 was nearly at its limits: the liquid temperature was at 40C and my stock GPU temperature was 48C. If the heat transfer were perfect, the liquid and GPU temperatures would be nearly equal, so I knew there were things I could do.

1. I could improve the TIM between my shim and the pump's cold plate. Originally I was planning to CLU the die and the other side of the shim, but after looking online, CLU is really bad for dies from an RMA standpoint: it leaves stains. So I decided to just use Gelid Extreme on the die and CLU between the shim and the pump. Eureka: this raised liquid temperatures from 40C to 44C, which means more heat was being transferred into the loop and out through the radiator.

2. The midplate is a heatsink in itself. It is connected to the VRMs, VRAM and other components. If I was going to push more watts through this card, I needed to keep them cool. Originally I had placed 8 heatsinks on my midplate, which dropped my core temperature from 51C to 48C. Taking thermal measurements, the shims were at 36C. I knew that adding more heatsinks, especially closer to the VRMs, would help with the extra 50 to 75 watts.



3. Finally, the backplate was not absorbing any heat whatsoever. The aluminum plate is somewhat thin; my theory is that it was left thermally isolated because a plate that thin, if it soaked up heat from the GPU, could warp and flex. But I knew my AIO was keeping my GPU at 48C, and 45-60C wouldn't harm the backplate at all. So I added a thermal pad between the back of the GPU and the backplate.



*The shunt mod*

For my shunt mod, I used CLU and liquid electrical tape.





*My results*

My previous calculations were correct. After the shunt mod, running a light Heaven load, I was drawing 300-310 watts at a reported 75-80% power draw. That means at a reported 100%, I would be drawing about 400 watts.

At max, my card was reporting 95-100%, which means I was drawing around 400 watts. This is confirmed below!



Here are my thermal results

Stock at 300 Watts, 48C.

Shunt modded at 400 Watts, 50C.

All my modifications have successfully fought off the incoming 100 Watts.

After some light benchmarks, I can easily hit 2152MHz @ 1.093V with no limiting.



I have decided to call it a day and spend time with my wife. I have not pushed this card to its maximum core clock yet.

*With great power... comes great responsibility... Conclusion*

This card is now drawing 100 extra watts, which is risky. Because I do not want a $700 paperweight, my next step is to find a balance between maximum core clock and voltage and try to limit my power draw to 350-375 watts.

Do I recommend this mod?
To people on water, yes!

To people on AIO, maybe. You must take the extra steps I did to ward off that 100 Watts of extra heat that will be coming to your card.

To people on air: no. I do not recommend this mod on air cooling. It can potentially be dangerous.

To people with partner cards: maybe... you have to think about it. You're adding 33% more power, and as a rough estimate that can increase your temperature rise over ambient by 33%. It can be counterproductive due to thermal throttling, and if you don't take extra precautions to cool your card, it can blow up!
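That "roughly 33%" estimate comes from treating the cooler as a fixed thermal resistance, so the rise over ambient scales linearly with dissipated power. A hypothetical air-cooled example (all numbers made up for illustration):

```python
# First-order estimate: with a fixed cooler, temperature rise over ambient
# scales linearly with dissipated power (constant thermal resistance).
def temp_after_power_bump(ambient_c, old_temp_c, old_watts, new_watts):
    rise = (old_temp_c - ambient_c) * new_watts / old_watts
    return ambient_c + rise

# Hypothetical air-cooled card at 70C in a 25C room, going from 300W to 400W:
print(temp_after_power_bump(25.0, 70.0, 300.0, 400.0))  # -> 85.0
```

On air that pushes you straight into throttling territory, which is why the mod can be counterproductive there.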


----------



## somebadlemonade

I feel dumb having to ask this, but will an EVGA 550W G2 PSU be enough for an overclocked FTW3 and an overclocked 6600K?

Since I kind of need to change to a short PSU anyway, for my Core 500.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *somebadlemonade*
> 
> I feel dumb having to ask this but will a EVGA 550w G2 PSU be enough for an overclocked FTW3 and an overclocked 6600k?
> 
> Since I kind of need to change to a short PSU anyways, for my core 500.


Look at my post above. My system with an FE at stock would draw 434 watts. The FTW3 will draw 30 to 50 watts more than a stock FE.

Do you really want to risk running your PSU close to 100%? I'll tell you, it won't last long.
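For a rough headroom check, using the 434W figure above and assuming the FTW3 adds the full 50W:

```python
# Rough PSU headroom check. 434W is the system draw quoted above for a
# stock FE; the +50W for the FTW3 is the upper end of the estimate.
def psu_load_pct(system_watts, psu_rating_watts):
    return 100.0 * system_watts / psu_rating_watts

ftw3_system_watts = 434 + 50  # assumed worst case, before further overclocking

for rating in (550, 750, 850):
    print(f"{rating}W PSU: {psu_load_pct(ftw3_system_watts, rating):.0f}% load")
# -> 550W PSU: 88% load
# -> 750W PSU: 65% load
# -> 850W PSU: 57% load
```

88% sustained load on a 550W unit leaves almost no margin for overclocking spikes, which is the concern here.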


----------



## somebadlemonade

Quote:


> Originally Posted by *SlimJ87D*
> 
> Look at my post above. My system with an FE when it was stock would draw 434 Watts. The FTW3 will draw 30 to 50 watts more than a stock FE.
> 
> Do you really want to risk running your PSU close to 100%? Ill tell you it won't last long.


I was thinking of picking up an EVGA 750W G3, maybe the 850W version. But if it lets me run stock speeds on both for a few days, I'll wait until I get the card before ordering the new PSU.


----------



## phenom01

I am in the market for an aftermarket 1080 Ti soon. I am reading and seeing reviews that say an entry-level $700 aftermarket 1080 Ti will get you within 1% of the performance of the high-end $750-780 USD units after overclocking. Is there any reason to get a high-end 1080 Ti that I am missing? I mean, what is 50-80 bucks when you are dropping $700+ on a card, but are there any tangible benefits like a custom BIOS or anything like that?


----------



## hotrod717

Quote:


> Originally Posted by *phenom01*
> 
> I am in the market for a aftermarket 1080ti soon. I am reading and seeing reviews that say a entry level $700 1080ti(aftermarket) will net you within 1% of the same performance of the high end $750-$780 USD units after overclocking. Is there any reason to get a high end 1080TI that I am missing? I mean what is 50-80 bucks when you are dropping $700+ on a card but is there any tangible benefits of custom bios or anything like that?


It all depends on the silicon and winning the lottery. Buying a more expensive AIB card is, in some ways, hedging the bet: better cooling and components will help mediocre silicon do better, so to speak. Buying a vanilla FE guarantees nothing, just as buying an $800 AIB card guarantees nothing; it's pure luck of the draw. If you aren't chasing FPS or benchmarks, by all means go with the cheapest card. If you are not looking to watercool, personally I think an AIB card is the way to go.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *phenom01*
> 
> I am in the market for a aftermarket 1080ti soon. I am reading and seeing reviews that say a entry level $700 1080ti(aftermarket) will net you within 1% of the same performance of the high end $750-$780 USD units after overclocking. Is there any reason to get a high end 1080TI that I am missing? I mean what is 50-80 bucks when you are dropping $700+ on a card but is there any tangible benefits of custom bios or anything like that?


I can sneak form experience, an FE clocked at the same clocks as an aib will perform 1-2% different

I just shunted my card, and although clocks are now consistent and I'm no longer power limited, it's just a 1-3% improvement.

But if you have a golden FE, the power limit can really hold you back: the FE cannot do 2100MHz and above without hitting it. Then again, hitting 2100MHz is still rare and not easy.


----------



## Nico67

Quote:


> Originally Posted by *SlimJ87D*
> 
> *Unlocking the FE's true potential
> *
> After hitting power limits, I could not take it anymore. I knew that it was time for be to do the shunt mod. My card could bench at 2114 Mhz in heaven, and I believe it could go higher. After all, I've seen it boost to 2138 Mhz before without crashing.
> 
> I found it as a OC member, it was irresponsible for me to have so much potential thrown away. My theory was that I had a pretty nice golden card. That theory was proved to me today.
> 
> *What I needed to prepare for*
> 
> The shunt mod, if done right without too much CLU, will drop your 100% to 75-80%. This means that at 75%-80% you are drawing 300-310 Watts.This can be dangerous because at 120%, that is 450 Watts!
> 
> Because I set out to perform this shunt mod, I need to upgrade my cooling. My x41 was nearly at its limits. The liquid temperature was at 40C and my GPU temperatures stock was at 48C. If efficiency was 100%, then liquid and GPU temperatures would be equal according to newton's energy equation. Thus, I knew there was things I could do.
> 
> 1. I could improve the TIM in between my shim and heat pump. Originally, I was planning to CLU the die and other side of the shim, but after looking online, CLU is really bad for dies from a RMA stand point, it leaves stains. So I decided to just use Gelid Extreme on the die and CLU on the shim and pump. Eureka, this got liquid temperatures from 40C to 44C, this means that more heat was being transferred to the radiator.
> 
> 2. The midplate is a heatsink itself. It is connected to VRMs, VRAMs and other components. If I was going to introduce more watts to this card, I need to keep them cool. Originally I had placed 8 heatsinks on my midplate. This dropped my core temperature from 51C to 48C. Taking thermal measurements, the shims were at 36C. I knew that if I added more heatsinks, specially closer to the VRMs, it would help with the extra 50 to 75 Watts.
> 
> 
> 
> 3. Finally, the backplate was not absorbing any heat whatsoever. The aluminum plate is somewhat thin, my theory is that the plate was too thin to absorb heat from the GPU, it would cause warping and flexing. But using my engineering knowledge and deduction, I knew that my AIO was keeping my GPU at 48C, so 45-60C wouldn't harm the back plate at all. So I added a thermal pad in between the back of the GPU and the backplate.
> 
> 
> 
> *The shunt mod*
> 
> For my shunt mod, I used CLU and liquid electrical tape.
> 
> 
> 
> 
> 
> *My results*
> 
> My previous calculations were correct. From the shunt mod, running a light heaven, I was drawing 300-310 Watts at 75-80% power draw. That means at 100%, I would be drawing 400 Watts.
> 
> At max, my card was drawing 95-100%, which mean I was drawing 400 Watts. This is confirmed below!
> 
> 
> 
> Here are my thermal results
> 
> Stock at 300 Watts, 48C.
> 
> Shunt modded at 400 Watts, 50C
> 
> All my modifications have successfully fought off the incoming 100 Watts.
> 
> After some light benchmarks, I can easily hit 2152 Mhz @ 1.093v with no limiting.
> 
> 
> 
> I have decided to call it a day and spend time with my wife. I have not pushed this card to is maximum core clock yet.
> 
> *With great power... comes great responsibility... Conclusion*
> 
> This card is now drawing 100 extra watts, which is very dangerous. Because I do not want a $700 paper weight, my next step is to find a balance between my maximum core clock and voltage and try to limit my power draw to 350-375 Watts.
> 
> Do I recommend this mod?
> To people on water, yes!
> 
> To people on AIO, maybe. You must take the extra steps I did to ward off that 100 Watts of extra heat that will be coming to your card.
> 
> To people on AIR, No. I do not recommend this mod to people on air. It can potentially be dangerous.
> 
> To people with partner cards. Maybe... you have to think about it, you're adding 33% more power and that can increase your temperatures by 33% as a rough estimate. It can be counter productive due to thermal throttling. And if you don't take extra pre-cautions to cool your card, it can blow up!


Grats, looks like it worked really well for you.

Quote:


> Originally Posted by *SlimJ87D*
> 
> I can sneak form experience


Awesome, I type really badly too


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Nico67*
> 
> Grats
> 
> 
> 
> 
> 
> 
> 
> looks like it worked real well for you
> Awesome, I type really badly too


Yeah, I was walking my dog and Swype-texting on my phone; not easy lol.


----------



## PeterOC

Is it really worth it to bother with backplates? Gamers Nexus just did a review of a 1080 Ti with a copper part on the backplate and concluded it made no difference.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *PeterOC*
> 
> Is it really worth it to bother with backplates? Gamers Nexus just did a review of 1080 Ti with a copper part on the backplate and concluded it made no difference.


Link? If there's no contact between the back of the GPU and the backplate, then nothing will happen; it can actually make things worse.

I applied a thermal pad to move heat into the backplate and carry it away. Before, the backplate was at around 40C; now it is 60C, so heat is definitely being transferred to it.


----------



## Dasboogieman

Quote:


> Originally Posted by *PeterOC*
> 
> Is it really worth it to bother with backplates? Gamers Nexus just did a review of 1080 Ti with a copper part on the backplate and concluded it made no difference.


It makes a huge difference if you pad between the plate and the PCB. Otherwise, it's worse than having no backplate, because the stale air layer acts as an insulator and actually increases temperatures for the components covered by the plate.

Back in ye olde days with the AMD 290, which had insanely hot VRMs, I used to pad the backplate around the VRAM and VRM area. It got super hot to the touch, so I attached an old GT 220 cooling blower to the backplate and it dropped the VRM temps by close to 10 degrees.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Dasboogieman*
> 
> It makes a huge difference if you pad the plate and the PCB. Otherwise, its worse than having no backplate because the stale air layer acts as an insulator and actually increases temperatures for the components covered by the plate.


Yep, I actually proved that today.

My backplate was at 40C before and now it is at 60C because of the thermal pad.

But I don't recommend adding a thermal pad if you are not on an AIO or water; your plate can go up to or past 100C on air.

This is for the FE. Partner cards' backplates are usually thicker.


----------



## PeterOC

http://www.gamersnexus.net/hwreviews/2881-gigabyte-gtx-1080-ti-aorus-xtreme-review-benchmarks/page-3

They did pad the backplate and came to the conclusion it didn't make a difference.


----------



## PeterOC

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yep, I actually proved that today.
> 
> My backplate was at 40C before and now it is at 60C because of the thermal pad.
> 
> But I don't recommend adding a thermal pad if you are not AIO. Your plate can go up to or past 100C on air.
> 
> This is for the FE. Partner cards back plates are usually thicker.


I'll have it under water and I'm planning on doing the shunt mod. And a PCI riser cable might be covering some of the backside, so maybe I should get a backplate... hmm. Is EK's backplate any good?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *PeterOC*
> 
> http://www.gamersnexus.net/hwreviews/2881-gigabyte-gtx-1080-ti-aorus-xtreme-review-benchmarks/page-3
> 
> They did pad the backplate and came to conclusion it didn't make a difference.


They said in the article that backplates occasionally work; it just didn't help here. And I don't know if the Aorus backplate has a thermal pad between the GPU and the backplate.

They're not drawing 400 watts of power either. But I'm telling you, I measured my backplate temperature before and after adding the thermal pad. It went from 40C to 60C, so heat is definitely transferring into the backplate now.

I just added 100 watts to my card and managed to keep it at 50C. It has to be a combination of everything I did. How much does the backplate help? Maybe 1C? Who knows.
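For intuition on how much heat a pad can actually move, Fourier's law gives a ballpark. Every number here is an assumption for illustration, not a measurement of this card:

```python
# One-dimensional steady-state conduction through a thermal pad:
# q = k * A * dT / t  (Fourier's law). All values are illustrative.
k_w_per_mk = 6.0      # pad conductivity, a high-end pad
area_m2 = 0.0004      # ~20mm x 20mm patch behind the GPU die
thickness_m = 0.001   # 1mm pad
delta_t_c = 20.0      # e.g. 80C die back to a 60C backplate

q_watts = k_w_per_mk * area_m2 * delta_t_c / thickness_m
print(f"{q_watts:.0f} W conducted through the pad")  # -> 48 W
```

So a well-placed pad can plausibly move tens of watts, but only if the backplate can then dump that heat somewhere.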


----------



## Benny89

My 1080 Ti Aorus Xtreme Edition should be here tomorrow.

I also ordered a new Asus ROG Strix 1080 Ti, and I will do a comparison between them in games: see which one holds higher clocks after a few hours in-game, with better temps and noise, and make my decision based on that.


----------



## madmeatballs

Quote:


> Originally Posted by *SlimJ87D*
> 
> To people with partner cards. Maybe... you have to think about it, you're adding 33% more power and that can increase your temperatures by 33% as a rough estimate. It can be counter productive due to thermal throttling. And if you don't take extra pre-cautions to cool your card, it can blow up!


They really have to think about it and expect the worst if they have a partner card (custom PCB?) and do the shunt mod. I did the shunt mod on my 1070 AMP Extreme not long ago and it did not end well: it left my card stuck at lower clocks and running very hot. (I don't know what I was thinking or trying to achieve at the time, lol.) When I removed the CLU the problem was still there, so the card was f'ed up.

Anyway good guide!


----------



## Dasboogieman

Quote:


> Originally Posted by *PeterOC*
> 
> http://www.gamersnexus.net/hwreviews/2881-gigabyte-gtx-1080-ti-aorus-xtreme-review-benchmarks/page-3
> 
> They did pad the backplate and came to conclusion it didn't make a difference.


I dunno, maybe the temps were not high enough for the padding to matter? Like I said, my old 290 had VRM temperature readouts, and the drop of close to 10 degrees was noticeable. That being said, my backplate did have active cooling (heatsink plus fan) instead of just a fan blowing on it.



Like this except eventually I mounted it on a backplate instead of just PCB


----------



## warbucks

Quote:


> Originally Posted by *madmeatballs*
> 
> They really have to think about it and expect the worse if they have a partner card(custom PCB?) and do the shunt mod. I did the shunt mod on my 1070 AMP Extreme not long ago and it did not end well. It made my card stuck at lower clocks and very hot. (I don't know what I was thinking and trying to achieve that time. LOL) When I removed CLU the problem was still there so the card was f'ed up.
> 
> Anyway good guide!


Your card being stuck at lower clocks was probably due to the long-standing driver bug that does just this. I have a 1080 Ti and experience it when the computer has been sitting idle for long periods, and I'm not shunted (yet).


----------



## RavageTheEarth

Quote:


> Originally Posted by *outofmyheadyo*
> 
> If I were you I would just get founders, you have plenty of founder cards to choose from ( and plenty of blocks, not just EK ) and after shuntmoding mine does 2100/12000 with 1.063v.
> If you plan on watercooling anyway paying extra for the better cooler makes no sense, and be it mega super or hyper version, I am pretty sure none of em will come close to 2200mhz anyway, so save yourself some money and hassle and get founders.


Yeah, I was originally going to jump on the Founders, but then I saw on Reddit that it was launch night for the Aorus, so I told myself that if there were any in stock when I woke up in the morning, I would get one (thinking "yeah right, those things are going to sell out in seconds", and they did). So I was very surprised to see the regular Aorus in stock the next morning, and I kept the promise I made to myself and grabbed it. I learned later that the Xtreme sold out in 22 seconds but came back in stock for seconds at a time throughout the day, and the regular Aorus also sold out and came back randomly throughout the day, one of those times being right before I woke up.

I haven't really tried to push the limits yet because I'm waiting for the waterblock to release, but I run a steady 2063MHz at 1.063V in games, although I do get a Vreg flag in the PerfCap reason sensor after playing for a while. When I bench with Firestrike Ultra or Superposition, I do drop down to 2050 though. I'm not sure if it would be worth shunt modding the card; with my power slider at 125%, that's 375 watts. I'm still amazed at the performance of this card compared to my 980 Ti at 1440p 144Hz.
Quote:


> Originally Posted by *somebadlemonade*
> 
> I was thinking of picking up a EVGA 750W G3 maybe the 850w version. But if it let me run stock speeds for both for a few days I'll wait until I get the card to order the new PSU.


You will be completely fine. The EVGA G2s are great PSUs. Just because it is rated at 550W doesn't mean it can't go over that; it's just not something you should do routinely.


----------



## jleslie246

Did anyone here preorder a Ti from EVGA that was supposed to ship today? Just curious if any actually shipped out. Mine did not.

UPDATE: I will have mine on the 24th!! EVGA 1080 Ti SC


----------



## evosamurai

Quote:


> Originally Posted by *outofmyheadyo*
> 
> You can use it on the GPU as well, but get some liquid electrical tape so you can cover the doodads around the GPU; in case you spill some, it won't cause trouble. As an added benefit, you can also do a shunt mod with it to eliminate the power limit issue.


Shunt mod?


----------



## somebadlemonade

Quote:


> Originally Posted by *RavageTheEarth*
> 
> You will be completely fine. EVGA G2's are great PSU's. Just because it is rated at 550w doesn't mean it can't go over that. It's just something that someone shouldn't do.


That's what I figured; that's part of the reason I got the G2 in the first place. But now I need a beefier PSU that is shorter: the Core 500 isn't big enough for another G2, so I have to get something smaller in either a 150mm or 140mm length, hence the 750W G3.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *evosamurai*
> 
> Shunt mod?


I just did a whole shunt mod guide one page back.


----------



## evosamurai

Quote:


> Originally Posted by *SlimJ87D*
> 
> I just did a whole shunt mod guide one page back.


Nice, I will be mounting mine vertically so I'd use hot glue to keep it from shorting out


----------



## KaRLiToS

Just removed my trusty R9 290x from my system and replaced it with 2 x GTX 1080ti in SLI


----------



## Exilon

Quote:


> Originally Posted by *KaRLiToS*
> 
> Just removed my trusty R9 290x from my system and replaced it with 2 x GTX 1080ti in SLI


Nice build! That's like $200 worth of rotary adapters, isn't it?

I'd love to use them but I'm too paranoid about swivel adapters leaking. All static compression and black EPDM tubing for me.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *evosamurai*
> 
> Nice, I will be mounting mine vertically so I'd use hot glue to keep it from shorting out


Yeah, but if the CLU runs off, you won't be shunted anymore. Worth a shot though! Do it!


----------



## evosamurai

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, but if the clu runs off you won't be shunt anymore. Worth a shot though! Do it!


Is solder an option?


----------



## DStealth

Quote:


> Originally Posted by *SlimJ87D*
> 
> *Unlocking the FE's true potential
> *
> All my modifications have successfully fought off the incoming 100 Watts.
> 
> After some light benchmarks, I can easily hit 2152 Mhz @ 1.093v with no limiting.
> 
> 
> 
> I have decided to call it a day and spend time with my wife. I have not pushed this card to is maximum core clock yet.


Something is definitely wrong with your result for such clocks: either that BIOS is slower clock-for-clock, or you are throttling or power limited in some way. I get 1fps more at 2076 core, and 1.5fps more after the shunt mod with a constant 2101 core. If you had a real 2152MHz, your score should be in the range of 81-82fps on the 4K optimized run. Maybe the curve OC gives a lower result for a given clock if it's not set up properly, as was the case with the 1080 Pascal cards. Anyway, your card seems pretty decent for an FE.


----------



## bloodhawk

Quote:


> Originally Posted by *DStealth*
> 
> Something is definitely wrong with your result for such clocks or the bios is slower clock for clock or you are either throttling or power limited of some kind as i have 1fps more at 2076 core and 1.5fps more after shunt mod with a constant 2101 core....if you have a real 2152Mhz your score should be in the range of 81-82fps 4k optimized run...maybe the curve OC gives lower result for the clock if not properly made as it was with 1080 pascal cards...


Yeap there is another power limit that causes issues.


----------



## MrTOOSHORT

Curve always scores lower for me compared to slider in AB.


----------



## TheBoom

So the AMP Extreme just came out here and seems reasonably priced. Unfortunately, there are still no reviews or breakdowns of the card. Does anyone here own one?

The card looks good IMO, second maybe to the Strix, but a lot better priced.

Someone here was saying the Asus Strix currently has the most stable power delivery, with 6 regulators and no doublers? I wonder what's in the AMP Extreme version.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *DStealth*
> 
> Something is definitely wrong with your result for such clocks or the bios is slower clock for clock or you are either throttling or power limited of some kind as i have 1fps more at 2076 core and 1.5fps more after shunt mod with a constant 2101 core....if you have a real 2152Mhz your score should be in the range of 81-82fps 4k optimized run...maybe the curve OC gives lower result for the clock if not properly made as it was with 1080 pascal cards...Anyway your card seems pretty decent for a FE


I don't think there's anything wrong with my score, because various factors come into play once you get to this point of OC. For example, it could be my CPU; you have a higher memory OC than I do; our RAM speeds differ; we're running different OSes; who knows. Not every card will OC the same, I guess. At this point all sorts of things matter: what's installed in our OS, what background apps are running, our motherboards, our RAM, and so on.

I still have a higher maximum clock than you do, but it could be my CPU, RAM or motherboard holding back my minimum scores.

The real question is how my card would perform in your rig, no pun intended.

Besides, your card had to explode to get those scores; mine hasn't yet, might soon lol.

But maybe a different BIOS might make it run better now. Maybe that Asus BIOS actually makes a difference for a change. Like I said though, I haven't tweaked anything in the Nvidia Control Panel or disabled things in the Windows Creators Update (which I'm hearing scores lower in benchmarks, BTW).


----------



## DStealth

Quote:


> Originally Posted by *SlimJ87D*
> 
> Besides, your card had to explode to get those scores
> 
> 
> 
> 
> 
> 
> 
> , mine hasn't yet, might soon lol.


Lol








The CPU does next to nothing under the heaviest GPU load at 4K... Your memory OC is a mere 50MHz lower than mine, against 80MHz more on the core, and your score is 1fps lower.
Please make another run without the curve: set, say, 2100 and a constant voltage, so we can compare the result with and without the curve.








I know you want to show 2150MHz even though that is not the real GPU speed, but it might help you to get a better score with lower but actual GPU clocks.


----------



## Dasboogieman

Quote:


> Originally Posted by *TheBoom*
> 
> So the Amp Extreme just came out here and seems reasonably priced. Unfortunately there are still no reviews or breakdowns of the card. Anyone here owns one?
> 
> The card looks good IMO, second to the strix maybe but a lot better priced.
> 
> Someone here was saying the Asus Strix currently has the most stable power delivery with 6 regulators and no doublers? I wonder whats in the Amp Extreme version.


Ahh, I see. The Zotac VRM design looks similar to their 1080's (from what I can glean from the official page), just with more phases. If it matters, the MOSFETs on that edition had a weird 2 low-side + 1 high-side (i.e. 3 MOSFETs per VRM phase) setup. The MOSFETs themselves are pretty cheap individually (i.e. low efficiency), which means they need seriously beefy cooling to reach their optimal output. The 2-by-1 setup also means your ultimate current capability is limited by the spike durability of the lone high-side FET, so that's the one that is going to blow if something goes wrong. As for the 16+2 phases, I have no idea how they're achieving that. 6 x 3, maybe? Triplers would be really weird. Chances are, unless they got permission to use a different voltage regulator, the actual regulation should be on par with the Aorus.

Also, I re-listened to the PCB breakdown, and I was wrong: the ASUS has a 5 x 2 + 2 setup, not a 6 x 1 + 2. However, it is still very likely that ASUS has the best voltage regulation, because their MOSFETs are godly: high efficiency, running at the same 300kHz as all the others so far, though MSI comes a close second.
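For a rough sense of why phase count matters at all, here is a hypothetical per-phase current estimate. The 400W and 1.0V figures are assumptions for illustration, not specs of any of these cards:

```python
# Hypothetical per-phase load: total core current split evenly across phases.
def amps_per_phase(core_watts, vcore_volts, phase_count):
    return core_watts / vcore_volts / phase_count

# A heavily overclocked card pulling ~400W at ~1.0V core:
for phases in (5, 7, 10):
    print(f"{phases} phases: {amps_per_phase(400.0, 1.0, phases):.0f} A/phase")
# -> 5 phases: 80 A/phase
# -> 7 phases: 57 A/phase
# -> 10 phases: 40 A/phase
```

Fewer, weaker FETs per phase means each one rides closer to its limit, which is the spike-durability concern above.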


----------



## KingEngineRevUp

Quote:


> Originally Posted by *DStealth*
> 
> Lol
> 
> 
> 
> 
> 
> 
> 
> 
> CPU does nothing comparing heaviest GPU load on 4k ... Your memory OC is a mere 50mhz less...compared to 80mhz more on the core and 1fps lower score...
> Please make other run w/o curve , set let say 2100 and constant voltage so we can compare result with and w/o curve
> 
> 
> 
> 
> 
> 
> 
> 
> I know you want show 2150Mhz while they are not the real GPU speed, but it might help you to get better score with less but actual GPU speeds


Sure, I can test it tomorrow. But I don't think it's true that the CPU and RAM don't affect you at 4K. People say it "doesn't matter" because it makes only a 2-4 fps difference, which is a few percent, so people wave it off.

What are we doing here? We're e-peening over literally a 0.009% performance difference and 1 or 2 frames.

Even if I test without a curve, we don't have the same hardware and OS, where getting 0.5 fps more gets you 100 more points and a 0.0009% difference in overall score.


----------



## DStealth

You can test on your own system and everything else will remain the same. Just to clarify: curve OC is not giving you the real GPU clock at some point.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *DStealth*
> 
> You can test on your own system and all will remain the same, just to clarify curve OC is not giving you the real GPU at some point.


Okay, I'll check it out.

BTW, did you test on the new Windows 10 at all?


----------



## DStealth

Checking my result http://www.3dmark.com/spy/1562028
Operating system - 64-bit Windows 10 (10.0.14393)
Don't know what you mean by "new Windows 10". The Creators Update, or?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *DStealth*
> 
> Checking my result http://www.3dmark.com/spy/1562028
> Operating system - 64-bit Windows 10 (10.0.14393)
> Don't know what you mean with "new windows 10" Creators update or ?


Yeah, the latest is 15063. A lot of people said their benchmark scores went down, and NVIDIA is still fixing the drivers for it; they even released a hotfix to try to address some issues. So I was curious whether you have tested on it yet.


----------



## madmeatballs

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, the latest is 15063. A lot of people said their benchmark scores went down, and NVIDIA is still fixing the drivers for it; they even released a hotfix to try to address some issues. So I was curious whether you have tested on it yet.


I'm running the Creators Update. I have no problems with it and haven't noticed any performance decrease or significant increase. I also installed NVIDIA's latest driver (381.78) for it before I applied the update.


----------



## TheBoom

Quote:


> Originally Posted by *Dasboogieman*
> 
> Ahh I see, the Zotac VRM design looks similar to their 1080 (from what I can glean on the Official page), just more phases. If it matters, the MOSFETs on that edition had a weird 2 phase low side + 1 phase high side (i.e. 3 MOSFETs per VRM phase) setup. However the mosfets themselves are pretty cheapo individually (i.e. low efficiency) which means they need some seriously beefy cooling to get their optimal output. The 2 by 1 setup means your ultimate current capability is limited by the spike durability of the lone high side FET so thats the one that is going to blow if something goes wrong. As for how to get 16 + 2 phases, I have no idea how they're achieving that. 6 x 3? maybe? but triplers really weird but chances are, unless they got permission to use a different voltage reegulator, the actual regulation should be on par with the Aorus.
> 
> Also, I re-listened to the PCB breakdown again, I was wrong. The ASUS has a 5 x 2 +2 setup not a 6 x 1 +2. However, it is also very likely ASUS still has the best voltage regulation because their MOSFETs are godly, high efficiency and run at the same 300khz that all the others so far are running at, though MSI come a close second.


So I'm guessing that means more stable clocks/boost over time and more reliability? Over higher but less stable boost clocks on the Aorus and Amp Extreme?

Over at the 1070 thread, though, the consensus is that the Amp Extreme cards seem to be the best overall. I own a Strix 1070 myself, but the asking price for the Strix 1080 Ti here is kind of ridiculous compared to other partner 1080 Tis.


----------



## Dasboogieman

Quote:


> Originally Posted by *TheBoom*
> 
> So I'm guessing that means more stable clocks/boost over time and more reliability? Over higher but less stable boost clocks on the Aorus and Amp Extreme?
> 
> Over at the 1070 thread the consensus is that the Amp Extreme cards seem to be the best overall though. I myself own a Strix 1070 but the asking price for the Strix 1080 ti is kinda ridiculous over other partner 1080tis here.


Despite all this geeky electrical talk, it's functionally irrelevant on air cooling (perhaps even water cooling). They're all functionally the same because we cannot push the kind of voltages and clocks where the power delivery actually matters.
The conditions where the VRMs on these cards will differentiate themselves is under LN2, where massive voltages and currents are involved.

The ASUS one is probably the most "elegant" design from a pure electrical perspective: because it uses such high-quality, efficient MOSFETs (which you pay for), you won't get much thermal output from the VRM area (thus actually making the ASUS model ideal for AIO cooling mods).

In terms of reliability, all the cards are functionally the same IMO, because none of us push them hard or hold them long enough for it to matter. From a pure engineering perspective, fewer components = fewer chances for something to fail, and more efficient = less heat = less degradation.

I'm personally very uneasy that Zotac used an asymmetrical low/high setup, and with cheapo MOSFETs at that. Historically, the most famous example of an asymmetric design is the AMD 290X: the MOSFETs it used were server grade, but they output an insane amount of heat, and the high side had a tendency to blow at high voltages because the low side was able to draw so much power in comparison. The other example I can think of is the infamous GTX 570 reference card, which also had a tendency to blow the high side.

It's more cheap than elegant, if you catch my drift: individually those FETs generate more heat, and you're getting the same current capability anyway, so the 16 + 2 is pretty much all marketing and little substance.

TL;DR: you get what you pay for with the ASUS model. Will it help with OC? Maybe under LN2, but air cooled it's irrelevant.
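The efficiency point is easy to put numbers on; a minimal sketch, assuming illustrative efficiency figures rather than measured values for any of these cards:

```python
# Why MOSFET efficiency dominates VRM heat: the loss for a given output
# power grows quickly as efficiency drops. Efficiency figures below are
# illustrative, not measured values for any specific card.

def vrm_loss_watts(output_w, efficiency):
    """Heat dissipated in the VRM while delivering a given power."""
    return output_w * (1.0 / efficiency - 1.0)

for eff in (0.95, 0.90, 0.85):
    print(f"{eff:.0%} efficient VRM delivering 300 W -> "
          f"{vrm_loss_watts(300, eff):.1f} W of heat")
```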


----------



## Taint3dBulge

I just don't get this 1080 Ti... I can bench and run the Heaven stress test for hours at 2037-2050, but even at 2012 I get driver crashes all the time in games. What the heck is going on, lol. I just rolled back from the 381.78 driver to 381.65. Not sure if it's the driver itself causing it, but the game I mostly play is BF1 on ultra with DX12. Could it be DX12 making it crash, or the old driver? Will test .65 more tomorrow, also turning off DX12.

Using EVGA precision X.
Power 120%
Tried upping the volts to +50-100
+140-160 core: and get the same thing no matter what in BF1
+400 mem

In Heaven, if I go past +160 I get crashes.

Memory at +500 seems stable.

I can bench at
+200 core
+625 mem

Thanks for any info.


----------



## PCBUILDER1980

My FE 1080ti flashing at boot and there is a solid horizontal line running down my monitor. Seems like its a problem with FE cards and having to disable csm in bios to boot normal. I tried everything under the sun and really don't want to load a new windows with csm disabled. Anyone on here have a fix? Once I reboot and get into windows its fine. Just annoying my 1080 FE didn't have this issue.....


----------



## Dasboogieman

Quote:


> Originally Posted by *Taint3dBulge*
> 
> I just dont get this 1080ti... I can bench and run heaven stress test for hours at 2037-2050 but even at 2012 in games i get driver crashes all the time. What the heck is going on lol. I just rolled back from the 381.78 driver to the 381.65... Not sure if its the driver itself that is causing it. But the game i mostly play is BF1 on ultra with DX12. So could it be DX12 that could be making it crash or the old driver... Willl test out the .65 more tomorrow also turning off DX12.
> 
> Using EVGA precision X.
> Power 120%
> Tried upping the volts to +50-100
> +140-160 core: and get the same thing no matter what in BF1
> +400 mem
> 
> in heaven if i go past 160 i get crashes
> 
> Memory at +500 seems stable
> 
> I can bench at
> +200 core
> +625 mem
> 
> Thanks for any info.


Because some games are much more sensitive to overclocks than others. In fact, there's little correlation between how games load the GPU and how stable an overclock is. I personally found the free FFXIV benchmark to be a really good indicator of stability (just loop it at 4K, max settings); if you make it through an hour it's truly "stable". Hell, I remember Ghost Recon Future Soldier used to crash on AIB cards unless you got rid of the factory overclock.

Lastly, get your GPU colder. Mine needs 100% fan speeds to keep temps under 70, otherwise my 2050 clocks are no longer stable.
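The temperature sensitivity follows from how GPU Boost steps clocks down in roughly 13 MHz bins as the core warms. A toy model (the thresholds below are assumptions; the real breakpoints vary per card and BIOS):

```python
# Pascal's GPU Boost trims the core clock in ~13 MHz steps as the GPU
# warms up. The breakpoints below are illustrative only.

BIN_MHZ = 13
TEMP_THRESHOLDS_C = [38, 46, 54, 62, 70]  # assumed breakpoints

def boosted_clock(cold_clock_mhz, temp_c):
    """Estimate the sustained clock after temperature-based bin drops."""
    bins_dropped = sum(1 for t in TEMP_THRESHOLDS_C if temp_c >= t)
    return cold_clock_mhz - bins_dropped * BIN_MHZ

print(boosted_clock(2050, 35))  # cool card keeps its peak clock
print(boosted_clock(2050, 69))  # warmed up: several bins lower
```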


----------



## feznz

Quote:


> Originally Posted by *TheBoom*
> 
> So I'm guessing that means more stable clocks/boost over time and more reliability? Over higher but less stable boost clocks on the Aorus and Amp Extreme?
> 
> Over at the 1070 thread the consensus is that the Amp Extreme cards seem to be the best overall though. I myself own a Strix 1070 but the asking price for the Strix 1080 ti is kinda ridiculous over other partner 1080tis here.

It is all about the aesthetics.









Quote:


> Originally Posted by *Dasboogieman*
> 
> Despite all this geeky electrical talk, its functionally irrelevant on Aircooling (perhaps even water cooling). They're all functionally the same because we cannot push the kind of voltages + clocks where the power delivery actually matters.
> The conditions where the VRMs on these cards will differentiate themselves from each other is under LN2 where massive voltages and current are involved.
> 
> The ASUS one is probably the most "elegant" design from a pure electrical design perspective because it uses such high quality efficient MOSFETs (which you pay for), you won't be getting much thermal output from the VRM area (thus actually making the ASUS model ideal for AIO cooling mods).
> 
> In terms of reliability, all the cards are functionally the same imo because none of us push them hard or hold them long enough for it to matter. From a long term perspective, if we look at purely from an engineering perspective, less components=less chances for something to fail also more efficient=less heat=less degradation.
> 
> I'm personally very uneasy at the fact that Zotac used an asymmetrical Low/High setup and with cheapo MOSFETs some more. Historically, the most famous example of asymmetric designs is the AMD 290x, those MOSFETs they used were server grade but they output an insane amount of heat plus the high side had a tendency to blow at high voltages because the low side was able to draw so much power in comparison. The other example I can think of was the infamous GTX 570 Reference which also had a tendency to blow the high side.
> 
> More of cheap not a particularly elegant design if you catch my drift, because individually, those fets generate more heat and you're getting the same current capability anyway so the 16+2 is pretty much all for marketing and little substance.
> 
> TLDR: you get what you pay for with the ASUS model, will it help with OC? maybe if ur under LN2 but Air cooled its irrelevant.


You are making the need to get a full-cover block go away.








I am still undecided, but I am probably too lazy to redo the loop and spend more time and money for minimal gains.


----------



## mshagg

Quote:


> Originally Posted by *TheBoom*
> 
> So I'm guessing that means more stable clocks/boost over time and more reliability? Over higher but less stable boost clocks on the Aorus and Amp Extreme?
> 
> Over at the 1070 thread the consensus is that the Amp Extreme cards seem to be the best overall though. I myself own a Strix 1070 but the asking price for the Strix 1080 ti is kinda ridiculous over other partner 1080tis here.


For us mere mortals, I just really struggle to see the point of a VRM setup that can pipe 600 W without breaking a sweat.

At the end of the day, your card will be stable at 2100 MHz at 1.093 V... or it won't. That seems like a function of the silicon lottery more than the stability of the power regulation delivered by the VRM. The next issue is what kind of clocks it can hold under heavy load within its TDP limits.

If you're soldering on the various bits and pieces to bypass power limits and manually adjust core voltage, with a view to pouring a cup of LN2 onto the card, I can see how the composition of the VRM phases would be relevant.

TDP limit and cooling solution seem to be the only variables that are going to impact performance on these things. Otherwise it's down to aesthetics and that sweet RGB street cred.


----------



## KedarWolf

Hey all,

Been busy with real-life stuff. If you want to be added to the Owner's List, please comment your details in the OP spreadsheet for our mods and myself to add you.

Thanks!!

Tested both of the new 'unofficial' Gigabyte BIOSes someone posted a while back; they're not working, same issues as the earlier Aorus BIOSes.


----------



## KedarWolf

Quote:


> Originally Posted by *mshagg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Yes, you're right. Extreme BIOS doesn't work, already tried it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> There's some non-extreme GA BIOSes to try:
> 
> https://www.techpowerup.com/vgabios/191240/191240
> https://www.techpowerup.com/vgabios/191202/191202
> 
> The second one is the gaming OC card, with 8+2 power phases, but only has 300W limit it seems:
> 
> https://www.aorus.com/product-detail.php?p=168&t=17&t2=23&t3=27
Click to expand...

Tried both BIOSes, no go, same issues as the earlier Aorus BIOSes.


----------



## KedarWolf

Quote:


> Originally Posted by *KickAssCop*
> 
> KickAssCop - ASUS 1080 Ti FE- Stock Cooling, 2050 core, 6050 memory.
> 
> Will update once I have the MSI Gaming X card in as well.


Comment on the spreadsheet in the OP and we'll add you. Thanks!!


----------



## scarecrow22

Quote:


> Originally Posted by *Taint3dBulge*
> 
> I just dont get this 1080ti... I can bench and run heaven stress test for hours at 2037-2050 but even at 2012 in games i get driver crashes all the time. What the heck is going on lol. I just rolled back from the 381.78 driver to the 381.65... Not sure if its the driver itself that is causing it. But the game i mostly play is BF1 on ultra with DX12. So could it be DX12 that could be making it crash or the old driver... Willl test out the .65 more tomorrow also turning off DX12.


I'd probably stick with the latest drivers; although they don't fix any DX12 driver crashes, they do improve DX12 performance. I played through ROTTR over the past couple of days and got some really weird driver crashes in DX12 mode that never happened once I switched to DX11. One particularly bad one seemed to trip a protection circuit somewhere and hard-shut-down my PC instantly, almost like a power outage. This happened multiple times during the same cut scene towards the end, which I learned was a known DX12 bug. Looks like the lower-level API also allows for lower-level ****ery.

I have been most happy with either a modest default-curve offset of 100-120 (power and voltage targets set to max) or a custom curve that tops out at 2012 MHz at 1.12 V. Once the card gets up to temp (53C) it bumps down a rung to 2000 at 1.00 V (it was stable at this setting in ROTTR for the entire playthrough). Anything above 140-150 on the offset and I start seeing an unacceptable amount of perfcaps and clocks bouncing around. At some stage I will rework some resistors on the shunts to get more power into my FE so I can overcome this. (I have already voided my warranty by knocking some caps off the backside, so now I can do the shunt mod the easy way







)

A question for anyone here who might know also:
Is there any advantage or disadvantage to locking the voltage at a certain frequency even if it's constantly power limited at that setting? Or is it better to let the card run on its own with an intermittent power limit? I ran a few Heaven benchmarks to test and saw minor gains in scores for the former, but there might have been a downside for minimum framerates. It wasn't conclusive from the frametime graphs, so I'm not really 100% sure.
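One way to reason about the question: dynamic power scales roughly with V² × f, so a locked, lower-voltage point trades a little clock for a lot of power headroom. A sketch using curve points loosely based on the post above (the scaling law is an approximation, not a measurement):

```python
# Back-of-envelope for why a locked, lower-voltage curve point can help
# under a power cap: dynamic power scales roughly with V^2 * f.

def relative_power(v, f_mhz, v_ref=1.093, f_ref=2000):
    """Dynamic power relative to a reference voltage/clock point."""
    return (v / v_ref) ** 2 * (f_mhz / f_ref)

boosting = relative_power(1.093, 2050)  # card left to boost on its own
locked = relative_power(1.000, 2000)    # locked lower-voltage point
print(f"boosting point draws {boosting / locked:.2f}x the locked point")
```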


----------



## KedarWolf

Quote:


> Originally Posted by *scarecrow22*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Taint3dBulge*
> 
> I just dont get this 1080ti... I can bench and run heaven stress test for hours at 2037-2050 but even at 2012 in games i get driver crashes all the time. What the heck is going on lol. I just rolled back from the 381.78 driver to the 381.65... Not sure if its the driver itself that is causing it. But the game i mostly play is BF1 on ultra with DX12. So could it be DX12 that could be making it crash or the old driver... Willl test out the .65 more tomorrow also turning off DX12.
> 
> 
> 
> I'd probably stick with the latest drivers as although they don't fix any DX12 driver crashes they do improve performance in DX12. I played through ROTTR over the past couple of days and got some really weird driver crashes in DX12 mode that never happened once I switched to DX11. One particularly bad one that seemed to trip a protection circuit somewhere and hard shutdown my PC instantly almost like a power outage. This happened multiple times during the same cut scene towards the end which I learned was a known DX12 bug. Looks like the lower level API also allows for lower level ****ery.
> 
> I have been most happy with either a modest default curve offset of 100-120 (power and voltage targets set to max) or a custom curve which tops out at 2012Mhz at 1.12V. Once the card gets up to temp (53C) it bumps down a rung to 2000 at 1V (it was stable at this setting in ROTTR for the entire playthrough). Anything above 140-150 on the offset and I start seeing an unacceptable amount of perfcaps and clock bouncing around. At some stage I will rework some resistors to the shunts to get some more power into my FE so I can overcome this. (I have already voided my warranty by knocking some caps off the backside so now I can do the shunt mod the easy way
> 
> 
> 
> 
> 
> 
> 
> )
> 
> A question for anyone here who might know also:
> Is there any advantage or disadvantage to locking the voltage at a certain frequency even if its constantly power limited at that setting? Or is it better to let the card go on its own and just have intermittent power limit? I ran a few heaven benchmarks to test and saw some minor gains in scores for the former but there might have been a downside to minimum framerates. It wasn't conclusive from frametime graphs so I'm not really 100% sure.
Click to expand...

In all my benches I'm more stable and get higher results with the previous driver; the new one seems not as good.


----------



## KedarWolf

Hey peeps, soon we are adding a form to the OP that you fill out, and it'll update the spreadsheet automatically.







Maybe a few other quality of life changes like that.


----------



## Jsunn

I am seeing some strange behavior on my 1080Ti FE.
I'll try to outline what I am seeing as best as I can.

I had my card mounted vertically using the Lian-Li riser card, you can see how I did it here:
Custom Vertical GPU mount Lian-Li PC-O11

I noticed that my benchmark scores were about 2,000-3,000 points lower in Fire Strike, and I seemed to get a random hang upon reboot. The video posted below shows the behavior I was getting.

Then, somewhere along the line, upon boot, the BIOS and BIOS load screen (where you press F2 or Del to enter BIOS or UEFI) switched to my secondary monitor.

I thought the bends in the riser cable might be too tight, so I did some experimenting and thought I had fixed it: I used more gradual bends and made sure the cables weren't folded back and touching each other. But then I got the weird boot hang again.












I decided to return to stock for now, but then I got it again. (The video posted is with the GPU installed in the motherboard, without the PCIe riser cable.)

I completely reset my BIOS and everything seems to be back to normal, but I am still concerned that something might be wrong. Can bending the cable too sharply cause problems?

Has anyone seen this behavior before? Is this a sign of something more serious?

Thanks,
Jason


----------



## Luckbad

My Titan Xp is on the way back to Nvidia. I couldn't tolerate the temps and volume and didn't want to mod a $1200 card.

Now I have 3 Zotac Amp Extremes and an Asus ROG Strix OC on the way to cherry pick the best of them and an EVGA FTW3 preordered.

Technically I have a 1080 Ti FE on the way from a Step-Up, but I plan to sell that without opening it.

The next couple weeks are gonna be fun.


----------



## amer020

Please, can anyone help me make the shunt mod?

First, what do I need to order from Amazon for this mod?

Thank you


----------



## PeterOC

Quote:


> Originally Posted by *amer020*
> 
> please can any one help me to make shunt mod
> 
> first thing what i need to order from amazon to make this mod
> 
> thank you


http://www.overclock.net/t/1608437/tutorial-power-target-limit-hardware-mod-shunt-mod-for-titan-x-and-many-other-nvidia-gpus


----------



## dboythagr8

Quote:


> Originally Posted by *Lexiconman*
> 
> Factually incorrect.


Interesting.

I've always left VSYNC off in NVCP when GSYNC was enabled (I'm on a 144hz monitor). Going to try what this pic suggests, play with both on, and cap FPS to 125 in RTS.

I always thought you should just let the game obtain as many frames as possible when using a high refresh monitor.


----------



## outofmyheadyo

Quote:


> Originally Posted by *dboythagr8*
> 
> Interesting.
> 
> I've always left VSYNC off in NVCP when GSYNC was enabled (I'm on a 144hz monitor). Going to try what this pic suggests, play with both on, and cap FPS to 125 in RTS.
> 
> I always thought you should just let the game obtain as many frames as possible when using a high refresh monitor.


Can I cap in nvidia inspector instead of RTSS?


----------



## amer020

Quote:


> Originally Posted by *PeterOC*
> 
> http://www.overclock.net/t/1608437/tutorial-power-target-limit-hardware-mod-shunt-mod-for-titan-x-and-many-other-nvidia-gpus


I don't know what I need to buy to make this mod.


----------



## outofmyheadyo

Coolaboratory liquid ultra

or

Thermal Grizzly Conductonaut


----------



## amer020

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Coolaboratory liquid ultra
> 
> or
> 
> Thermal Grizzly Conductonaut


Will this one work?

https://www.amazon.com/Thermal-Grizzly-Conductonaut-Grease-Paste/dp/B01EO2V332/ref=sr_1_2?ie=UTF8&qid=1492529637&sr=8-2&keywords=Thermal+Grizzly+Conductonaut


----------



## Swolern

Quote:


> Originally Posted by *dboythagr8*
> 
> Interesting.
> 
> I've always left VSYNC off in NVCP when GSYNC was enabled (I'm on a 144hz monitor). Going to try what this pic suggests, play with both on, and cap FPS to 125 in RTS.
> 
> I always thought you should just let the game obtain as many frames as possible when using a high refresh monitor.


Why limit to 125 fps though, why not higher, like 143? I always limit my FPS in RTSS to 1 fps less than my max refresh rate, so it introduces neither lag-inducing Vsync nor screen tearing.
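The arithmetic behind the rule of thumb, as a sketch (assuming G-Sync stays engaged only while frame times exceed the monitor's refresh interval):

```python
# Why cap a little below the refresh rate: G-Sync only tracks the GPU
# while frame times stay longer than the monitor's minimum refresh
# interval. The 1 fps margin is the common rule of thumb.

def min_frame_time_ms(refresh_hz):
    return 1000.0 / refresh_hz

def stays_in_gsync_range(fps_cap, refresh_hz):
    """True if capped frame times remain above the refresh interval."""
    return 1000.0 / fps_cap > min_frame_time_ms(refresh_hz)

print(stays_in_gsync_range(143, 144))  # True: frames arrive just slower
print(stays_in_gsync_range(144, 144))  # False: hits the vsync/tearing path
```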


----------



## outofmyheadyo

Quote:


> Originally Posted by *amer020*
> 
> this one working
> 
> https://www.amazon.com/Thermal-Grizzly-Conductonaut-Grease-Paste/dp/B01EO2V332/ref=sr_1_2?ie=UTF8&qid=1492529637&sr=8-2&keywords=Thermal+Grizzly+Conductonaut


Yes, that's the one, but to be honest, save yourself the hassle: if your card is a potato, a power mod won't help. Just flash the ASUS BIOS for a slightly higher power limit.
I might remove my own shunt mod; it does nothing.


----------



## ChickenInferno

Which 1080 Ti would people recommend if noise is a primary concern? Would be upgrading from an unbearably noisy R9 390.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Yes that`s the one, but to be honest save urself the hassle, if your card is a potato powermod wont help, just flash asus bios for slightly higher power limit.
> I might remove my own shuntmod, it does nothing.


From someone who has done the shunt mod myself, I agree with this.

If your card isn't a silicon winner, then the shunt mod will do nothing for you.

In my case, my card was being held back by power limiting. Now I can run The Witcher 3 at 2114 MHz @ 1.075-1.092 V without the clocks ever taking a hit or dropping.

Before, I would play at 2088 MHz, but in some areas it would bounce up and down through 2050, 2063, 2075 and 2088, causing stutter or slowdowns.

As for the ASUS BIOS, statistics from our community have shown that you get about 10 extra watts but benchmark lower. Not sure it's worth the hassle; I'm 100% sure it made things worse for me when I was stock.


----------



## sWaY20

In case anyone wonders why FedEx is late or unreliable... This was in front of the grocery store i was at.


----------



## keikei

Quote:


> Originally Posted by *ChickenInferno*
> 
> Which 1080 Ti would people recommend if noise is a primary concern? Would be upgrading from an unbearably noisy R9 390.


Any non-reference model would do the job. Seeing as many of them are not in stock its hard to recommend a particular brand. The evga has a preorder for 04/24/2017 on newegg.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *sWaY20*
> 
> In case anyone wonders why FedEx is late or unreliable... This was in front of the grocery store i was at.


*****


----------



## mouacyk

Quote:


> Originally Posted by *sWaY20*
> 
> In case anyone wonders why FedEx is late or unreliable... This was in front of the grocery store i was at.


How many GTX 1080 Tis were part of that fiasco?


----------



## amer020

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Yes that`s the one, but to be honest save urself the hassle, if your card is a potato powermod wont help, just flash asus bios for slightly higher power limit.
> I might remove my own shuntmod, it does nothing.


I have the ASUS BIOS already, and I can reach 2050 stable at 1.063 V, with memory at 6100.

I will try anyway.


----------



## RavageTheEarth

Quote:


> Originally Posted by *ChickenInferno*
> 
> Which 1080 Ti would people recommend if noise is a primary concern? Would be upgrading from an unbearably noisy R9 390.


The Aorus 1080 Ti. I have the regular version. These are the quietest fans at 100% that I have ever heard in my life. Plus you get a 375 W limit with the power slider at 125%.
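The slider math, for anyone backing out the base power target (figures follow the post above; other boards use different defaults):

```python
# The power slider is a percentage of the card's base power target, so a
# 375 W ceiling at 125% implies a 300 W default.

def watts_at_slider(base_w, slider_pct):
    """Absolute power limit for a given slider percentage."""
    return base_w * slider_pct / 100

base = 375 / 1.25                   # back out the default target
print(watts_at_slider(base, 100))   # 300.0
print(watts_at_slider(base, 125))   # 375.0
```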


----------



## KingEngineRevUp

Quote:


> Originally Posted by *amer020*
> 
> i have asus bois already and i can reach 2050 with 1063v stable and for mem 6100
> 
> i will try anyway


Can you run Heaven at all at 2101 MHz at 1.062 V? Or does it crash instantly?


----------



## amer020

Quote:


> Originally Posted by *SlimJ87D*
> 
> can you run heaven at all at 2101 Mhz at 1.062v?or does it crash instantly?


I ran it before at 2100 with 1.093 V.


----------



## dboythagr8

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Can I cap in nvidia inspector instead of RTSS?


Yeah, you can. From watching the link that was included with the picture, the guy says that Inspector introduces additional input lag; I'd assume RTSS does as well. He recommends using the built-in limiters in the game itself when possible.

Quote:


> Originally Posted by *Swolern*
> 
> Why limit to 125fps though, why not higher, 143? I always limit my FPS in RTSS by 1fps less than my max refresh rate, so it will not introduce lag inducing Vsync nor screen tearing.


If you can hold a steady 142-143 fps in a game, that's fine. However, if you can't, but can hold something below that, locking the frame rate in-game at that value in combination with G-Sync appears to be the best method.


----------



## outofmyheadyo

I am taking my Gigabyte 1080 Ti Founders Edition back to the store for a refund or replacement. It coil-whines like crazy and overclocks like crap; for 800€ it is not acceptable that I can hear it squealing from the other room.


----------



## dunbheagan

Quote:


> Originally Posted by *mshagg*
> 
> TDP limit and cooling solution seem to be the only variables that are going to impact performance on these things. Otherwise it's down to aesthetics and that sweet RGB street cred.


I thought exactly the same. All 1080 Ti PCBs are good enough not to bottleneck your OC, at least if you are not going sub-zero, which nobody does for gaming and "normal" OC. All this "clean and stable power supply" talk is mainly marketing, IMO. The only factors for your OC are the GPU (pure luck, no correlation to a certain brand), the cooling solution, and the TDP.

From an OC standpoint I would decide like this:
-Custom waterblock + shunt mod: NVIDIA FE
-Custom waterblock, no shunt mod: the custom board with the highest TDP that has a waterblock available (Gigabyte Aorus Extreme, max 375 W, waterblock from EK in early May; correct me if there is a card with a higher possible TDP)
-Stock air cooling, no shunt mod: a card with a high TDP and a huge cooler, for example the Zotac Extreme, but others are good as well

One must keep in mind that even the boards with the highest TDP are power limited to some degree, so the only way to get rid of the power limit entirely is the shunt mod (for now).
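For reference, the way the shunt mod "removes" the limit can be sketched numerically. The 5 mΩ stock shunt value below is a common assumption, not a confirmed figure for every 1080 Ti board:

```python
# How the shunt mod skews the power reading: the controller infers
# current from the voltage across a small shunt resistor. Paralleling
# it with liquid metal or another resistor lowers the effective
# resistance, so the card under-reports its draw. Values illustrative.

def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

R_STOCK = 0.005                      # assumed 5 mOhm stock shunt
r_mod = parallel(R_STOCK, 0.005)     # e.g. stacking an equal resistor

scale = r_mod / R_STOCK              # fraction of true power "seen"
print(f"reported power is {scale:.0%} of the real draw")
print(f"a 300 W limit now allows ~{300 / scale:.0f} W of real power")
```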


----------



## kevindd992002

Quote:


> Originally Posted by *Lexiconman*
> 
> Factually incorrect.


I still don't get this. Why would you want to enable VSYNC in NVCP and apply an FPS limit (either in-game or via RTSS) together? I understand the diagram, but if you apply an FPS limit, VSYNC will never engage, so it will never reach the "exceeds 1ms polling range" and "exceeds G-Sync range" areas in the diagram.

With that said, why would you then consider enabling VSYNC in NVCP part of the "optimal G-Sync settings for single GPU/monitor" tagline? Or am I missing something here?


----------



## KedarWolf

Quote:


> Originally Posted by *kevindd992002*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Lexiconman*
> 
> Factually incorrect.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I still don't get this. Why would you want to enable VSYNC in NVCP and apply an FPS limit (either in-game or via RTSS) together? I understand the diagram but if you apply an FPS limit, VSYNC will never engage so it will never reach those "exceed 1ms polling range" and "exceeds g-sync range" areas in the diagram.
> 
> With that said, why then would you consider enabling VSYNC in NVCP part of the "optimal g-sync settings for single gpu/monitor" tagline? Or am I missing something here?
Click to expand...

You get less input lag by limiting the frame rate in RivaTuner with G-Sync on and Vsync off.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> I don't think there anything wrong with my score because it can be various factors that come into play once you get to this point of oc. For example, it could be my cpu and you have higher memory oc than I do, our ram speeds and were running different OS, who knows. But not every card will oc the same I guess. At this point, various factors can come into play, what we have in our OS, what background apps were running, our mobo, our ram etc and etc.
> 
> I still have a higher maximum than you do, but it could be my cpu, ram, mobo that hold back my minimum scores.
> 
> The real question is how would my card perform in your rig, no sexual pun intended.
> 
> Besides, your card had to explode to get those scores
> 
> 
> 
> 
> 
> 
> 
> , mine hasn't yet, might soon lol.
> 
> But maybe a different bios might make it run better now. Maybe that Asus bios actually might make a difference for a change. But like I said, I haven't tweaked anything like nvidia control panel, disabled things in Windows creative update (which scores lower in benchmarks I'm hearing Btw).


Turning everything to basically "off" in the NVIDIA control panel raises my score from 10,300 to 10,467, so it has a bearing on the results.


----------



## Slackaveli

Quote:


> Originally Posted by *Dasboogieman*
> 
> Ahh I see, the Zotac VRM design looks similar to their 1080 (from what I can glean on the Official page), just more phases. If it matters, the MOSFETs on that edition had a weird 2 phase low side + 1 phase high side (i.e. 3 MOSFETs per VRM phase) setup. However the mosfets themselves are pretty cheapo individually (i.e. low efficiency) which means they need some seriously beefy cooling to get their optimal output. The 2 by 1 setup means your ultimate current capability is limited by the spike durability of the lone high side FET so thats the one that is going to blow if something goes wrong. As for how to get 16 + 2 phases, I have no idea how they're achieving that. 6 x 3? maybe? but triplers really weird but chances are, unless they got permission to use a different voltage reegulator, the actual regulation should be on par with the Aorus.
> 
> Also, I re-listened to the PCB breakdown again; I was wrong. The ASUS has a 5 x 2 + 2 setup, not a 6 x 1 + 2. However, it is also very likely ASUS still has the best voltage regulation, because their MOSFETs are godly, high efficiency, and run at the same 300 kHz that all the others so far are running at, though MSI comes a close second.


Aorus are running at 600 kHz. Hell, I run my mobo's phases at 670 kHz, is that bad?


----------



## madpete76

Update your drivers to the latest hotfix, mine did this too and that fixed it.
Quote:


> Originally Posted by *Jsunn*
> 
> I am seeing some strange behavior on my 1080Ti FE.
> I'll try to outline what I am seeing as best as I can.
> 
> I had my card mounted vertically using the Lian-Li riser card, you can see how I did it here:
> Custom Vertical GPU mount Lian-Li PC-O11
> 
> I noticed that my benchmark scores were about 2,000-3,000 points lower in Fire Strike, and I seemed to get a random hang upon reboot. The video posted below shows the behavior I was getting.
> 
> Then, somewhere along the line, upon boot, the BIOS and BIOS load screen (where you press F2 or Del to enter BIOS or UEFI) switched to my secondary monitor.
> 
> I thought the bends in the riser cable might be too tight, so I did some experimenting and thought I'd fixed it: I made more gradual bends and made sure the cables weren't folded back and touching each other. But then I got the weird boot hang again.
> 
> I decided to return to stock for now, but then I got it again. (The video posted is with the GPU installed in the motherboard, without the PCIe riser cable.)
> 
> I completely reset my BIOS, and everything seems to be back to normal, but I am still concerned that there might be something wrong. Can I bend the cable too sharply and cause problems?
> 
> Has anyone seen this behavior before? Is this a sign of something more serious?
> 
> Thanks,
> Jason


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Sure, I can test it tomorrow. But I don't think that's true about CPU and RAM not affecting you at 4K; people say it "doesn't matter" because it literally makes like a 2 to 4 FPS difference, which is a few percent, so people wave it off as if it doesn't matter.
> 
> What are we doing here? We're e-peening over literally a 0.009% performance difference and 1 or 2 frames.
> 
> Even if I test without a curve, we don't have the same hardware and OS, where literally getting 0.5 more FPS gets you 100 more points and a 0.0009% difference in overall score.


CPU and RAM DEFINITELY influence fps at any resolution, including 4K. More draw calls = more fps. Yes, it's a lot less of a difference than at a CPU-bound resolution. But, for instance, I get about ~120fps in BF1 at 1440p on my i7-5775C at stock 3.3GHz with 1333MHz RAM. But at 4.3GHz with a 3.7GHz uncore and 2133MHz CAS-9 RAM, I get ~144+ fps. And who thinks a stock Ryzen 4-core has the same fps at 4K as a 5.0GHz 7700K? C'mon, guys, let's stop with this old argument. That's like the guys who kept saying you'd get no fps gains at QHD going from a 4790K to a 5775C. Well, everyone that did it, like myself, Benny, palotte, and a few others, ALL report 10+ fps gains, just from a CPU drop-in.

TL;DR - CPU and RAM still affect performance at any resolution.


----------



## kevindd992002

Quote:


> Originally Posted by *KedarWolf*
> 
> Less input lag limiting frame rates with no Vsync and Rivatuner and G-Sync on.


Yes, I agree. But that's actually not what I was asking about, no pun intended 

My question was:

Why then would you consider enabling VSYNC in NVCP part of the "optimal g-sync settings for single gpu/monitor" tagline? Or am I missing something here? The box specifically says GSYNC ON + VSYNC NVCP ON + VSYNC IN-GAME OFF + FPS LIMIT.


----------



## Jsunn

Quote:


> Originally Posted by *madpete76*
> 
> Update your drivers to the latest hotfix, mine did this too and that fixed it.


Thank you! I will give it a try and post back.
I will also get everything plugged back up again and see if it works with the vertical GPU. Too many variables to try and track down the cause.
This weekend I installed the latest Win 10 update (the hotfix addresses issues caused by this), then installed the riser card.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> CPU and RAM DEFINITELY influence fps at any resolution, including 4K. More draw calls = more fps. Yes, it's a lot less of a difference than at a CPU-bound resolution. But, for instance, I get about ~120fps in BF1 at 1440p on my i7-5775C at stock 3.3GHz with 1333MHz RAM. But at 4.3GHz with a 3.7GHz uncore and 2133MHz CAS-9 RAM, I get ~144+ fps. And who thinks a stock Ryzen 4-core has the same fps at 4K as a 5.0GHz 7700K? C'mon, guys, let's stop with this old argument. That's like the guys who kept saying you'd get no fps gains at QHD going from a 4790K to a 5775C. Well, everyone that did it, like myself, Benny, palotte, and a few others, ALL report 10+ fps gains, just from a CPU drop-in.
> 
> TL;DR - CPU and RAM still affect performance at any resolution.


That's what I'm trying to tell him, but I think he just wants the crown for the highest core clock lol. We're literally comparing a 0.5 FPS difference here, and this is where RAM and CPU can make a difference.

If I literally got 1 more FPS in Superposition I would get an extra 200 points. I just downclocked my CPU, lost 0.5 fps, and lost like 100 points on the test.

To non-turbo-nerds, 0.5 FPS is a ridiculous subject to obsess about.

I need a 5775C! Just need to find a used one


----------



## KedarWolf

Quote:


> Originally Posted by *kevindd992002*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Less input lag limiting frame rates with no Vsync and Rivatuner and G-Sync on.
> 
> Yes, I agree. But that's actually not what I was asking about, no pun intended
> 
> My question was:
> 
> Why then would you consider enabling VSYNC in NVCP part of the "optimal g-sync settings for single gpu/monitor" tagline? Or am I missing something here? The box specifically says GSYNC ON + VSYNC NVCP ON + VSYNC IN-GAME OFF + FPS LIMIT.

G-Sync on, Vsync on in the NVIDIA Control Panel and off in-game, RivaTuner set 1 FPS lower than your refresh rate.
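The capping rule above is simple arithmetic; here's a toy sketch of it (REFRESH=144 is just an example value, substitute your own panel's refresh rate):

```shell
# Toy sketch of the G-Sync capping rule: set the RivaTuner limit
# 1 FPS below the panel refresh so the frame rate stays inside
# the G-Sync range. REFRESH is a placeholder for your monitor.
REFRESH=144
CAP=$((REFRESH - 1))
echo "RivaTuner framerate limit: $CAP"   # prints 143 for a 144 Hz panel
```

For a 60 Hz panel the same arithmetic gives 59.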


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> G-Sync on, Vsync on in the NVIDIA Control Panel and off in-game, RivaTuner set 1 FPS lower than your refresh rate.


But what if you have OCD and don't like odd numbers?


----------



## cekim

Quote:


> Originally Posted by *SlimJ87D*
> 
> But what if you have OCD and don't like odd numbers?


The struggle is real...


----------



## Taint3dBulge

Quote:


> Originally Posted by *Dasboogieman*
> 
> Because some games are much more sensitive to overclocks than others. In fact, there's little correlation between how games load the GPU and how stable an overclock is. I personally found the free FFXIV benchmark to be a really good indicator of stability (just loop the 4K run at max settings); if you make it through an hour, it's truly "stable". Hell, I remember Ghost Recon Future Soldier used to crash on AIB cards unless you got rid of the factory overclock.
> 
> Lastly, get your GPU colder; mine needs 100% fan speeds to keep temps under 70, otherwise my 2050 clocks are no longer stable


Oh, it's cold enough, never gets over 39 degrees. She's under a big loop.


----------



## kevindd992002

Quote:


> Originally Posted by *KedarWolf*
> 
> G-Sync on, Vsync on in the NVIDIA Control Panel and off in-game, RivaTuner set 1 FPS lower than your refresh rate.


Right, right, but the diagram says to enable Vsync in NVCP, doesn't it? That's what confuses me.


----------



## KingEngineRevUp

Any luck with the EVGA SC2 bios?


----------



## Addsome

Quote:


> Originally Posted by *kevindd992002*
> 
> Right right but the diagram says to enable vsync in nvcp, doesn't it? That's what confuses me.


I'm not sure of the exact reasoning behind it, but you can read about it on the Blur Busters forums. It has something to do with how G-Sync originally launched with Vsync forced on; later on they added the option of turning it off.
Quote:


> Originally Posted by *SlimJ87D*
> 
> Any luck with the EVGA SC2 bios?


Do we know if the other Inno3D BIOS came out? The first one helped me tons.


----------



## Mrip541

Fedex just delivered my card while I'm at work which means they left it more or less on the sidewalk in front of my building. I won't be home for 3 hours. I'm dying.


----------



## kevindd992002

Quote:


> Originally Posted by *Addsome*
> 
> I'm not sure of the exact reasoning behind it, but you can read about it on the Blur Busters forums. It has something to do with how G-Sync originally launched with Vsync forced on; later on they added the option of turning it off.
> Do we know if the other Inno3D BIOS came out? The first one helped me tons.


I see. Let me check and give that a good read for curiosity's sake.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> G-Sync on, Vsync on in the NVIDIA Control Panel and off in-game, RivaTuner set 1 FPS lower than your refresh rate.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But what if you have OCD and don't like odd numbers?

58 FPS is an even odd number.


----------



## Addsome

Quote:


> Originally Posted by *kevindd992002*
> 
> I see. Let me check and give that a good read for curiosity's sake.


Heres a link to the forum post:
http://forums.blurbusters.com/viewtopic.php?t=3073

Also the Gsync image in this thread is outdated, here is the updated image from that post.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Any luck with the EVGA SC2 bios?


You have a link to it?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> You have a link to it?


Trying to grab it for us now. But the Inno3D BIOS for sure gives everyone 15 watts extra. I'll post it if I get ahold of it.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Addsome*
> 
> I'm not sure of the exact reasoning behind it, but you can read about it on the Blur Busters forums. It has something to do with how G-Sync originally launched with Vsync forced on; later on they added the option of turning it off.
> Do we know if the other Inno3D BIOS came out? The first one helped me tons.


The one we want is a review-sample BIOS that gives a reference design 350 watts, and sadly, due to NDA, I don't think we'll ever get that BIOS.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> You have a link to it?
> 
> 
> 
> Trying to grab it for us now. But inno3d bios for sure gives everyone 15 watts extra. I'll post if I get ahold of it.

Post the Inno3D as well? I think I tried it without good results.


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> You have a link to it?
> 
> 
> 
> Trying to grab it for us now. But inno3d bios for sure gives everyone 15 watts extra. I'll post if I get ahold of it.
> 
> 
> Post the Inno3D as well? I think I tried it with not good results.

http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/5640_20#post_26019798

Yeah, tried the Inno3D BIOS; it wasn't too good.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Mrip541*
> 
> Fedex just delivered my card while I'm at work which means they left it more or less on the sidewalk in front of my building. I won't be home for 3 hours. I'm dying.


Dude, call a friend or family member to go get it for you. Do you live in an apartment complex? Ask the leasing office if they can grab it for you and tell them it's super important.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> You have a link to it?
> 
> 
> 
> Trying to grab it for us now. But inno3d bios for sure gives everyone 15 watts extra. I'll post if I get ahold of it.

Tried the Inno3D BIOS again: instant crash in Time Spy at the settings I use with the Asus BIOS. Back to Asus, no issues.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Tried the Inno3D BIOS again, instant crash in TimeSpy at settings I use for Asus BIOS, back to Asus, no issues.


Yeah I can't get it to work well either. I'm going to try Asus with the shunt later today. See if it actually helps this time.


----------



## Nico67

Quote:


> Originally Posted by *SlimJ87D*
> 
> But what if you have OCD and don't like odd numbers?


Tell me about it; my 100Hz X34 wasn't happy at 100Hz and I didn't like 95Hz, so I just use 90Hz lol.

It sounds better when it's like 50% more than 60Hz. Not that it mattered before, as the 980 Ti couldn't push that anyway; I must try 100Hz again with this card, and maybe try the original DP cable.


----------



## joder

Just got mine under water finally and boy are those temps nice. Sitting around 21 C idle with 21 C ambient.

Valley had no problem running 2100Mhz @ 1.062v. I can't say the same about Firestrike/Superposition though. I think the power is just too much.

Has anyone else put thermal tape on the chips to the left of the chokes? I noticed a few areas on the FE heatsink that had thermal tape; however, EK didn't recommend putting any there. Perhaps the waterblock already makes contact with these areas.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *joder*
> 
> Just got mine under water finally and boy are those temps nice. Sitting around 21 C idle with 21 C ambient.
> 
> Valley had no problem running 2100Mhz @ 1.062v. I can't say the same about Firestrike/Superposition though. I think the power is just too much.
> 
> Has anyone else put thermal tape on the chips left of the chokes? I noticed a few areas on the FE heatsink that had thermal tape, however, EK didn't recommend putting any there. Perhaps the waterblock makes contact with these areas already.


It could be that they need to run at a certain lukewarm temperature. And since you're taking lots of heat away from the chips around them, it shouldn't be a problem.

If EK left those spots bare, they have a good reason. I'd listen to them.


----------



## mshagg

Quote:


> Originally Posted by *Slackaveli*
> 
> TL;DR - CPU and RAM still affect performance at any resolution.


This is particularly true for an application which is designed to measure minuscule changes in system performance like supo is.


----------



## Caos

Hi. Good result or not? Asus strix 1080ti


----------



## gmpotu

Hey guys. I just got myself the 1080 Ti STRIX from ASUS and I'm super excited, but now my computer keeps crashing, losing power, and restarting.

Any chance you guys might be able to help with some tips? Should I start my own new thread for my issue or is there a page someone could link me to that has already discussed this issue with the ASUS cards?

I'm running stock with no overclock and it's crashing, but it looks like the card just boosts as high as it wants when I look at GPU-Z and GPU Tweak II. I see the TDP hit 94% sometimes and the perf cap reason is VRel, with the core hitting almost 2000 even though the "Gaming" setting has the boost listed as 1683.

I have discord, TS3, or vent if anyone wants to chat also.


----------



## NYU87

Quote:


> Originally Posted by *gmpotu*
> 
> Hey guys. Just got myself the 1080 TI STRIX from ASUS and super excited but now my computer keep crashing and losing power and restarting.
> 
> Any chance you guys might be able to help with some tips? Should I start my own new thread for my issue or is there a page someone could link me to that has already discussed this issue with the ASUS cards?
> 
> I'm running stock with no overclock and it's crashing but it looks like it just goes up as high as it wants when I look at GPU-Z and GPU Tweak II. I see the TDP hit 94% sometimes and the perf reason is VRel, with the core hitting almost 2000 even though the "Gaming" setting has the boost listed as 1683.
> 
> I have discord, TS3, or vent if anyone wants to chat also.


What's your power supply?


----------



## gmpotu

Quote:


> Originally Posted by *NYU87*
> 
> What's your power supply?


Ultra X3 1000W PSU (the 12V rail is supposed to be 70A / 840W). I originally ran two GTX 570s in SLI with no issues.
I also have a 1325VA CyberPower UPS that my computer is connected to.

One thing I found strange is that the Ultra X3 came with two 6-pin to 8-pin cables. Is it normal for a modular PSU to have a 6-pin going into the PSU but an 8-pin going to the GPU?


----------



## SuprUsrStan

SLI 1080Ti under water clocked to a constant 2050Mhz Core and 5750Mhz Memory.


----------



## Nico67

Quote:


> Originally Posted by *gmpotu*
> 
> Ultra X3 1000W PSU (12V rail is supposed to be 70A 840W) I originally was running two GTX 570 in SLI with no issues.
> Also have 1325V CyberPower UPS that my computer is connected to.
> 
> One thing I found strange is that the Ultra X3 has two 6pin to 8pin adapters. Is that normal for a modular PSU to have a 6pin going into the PSU but an 8 pin going to the GPU?


The PSU should be OK, but make sure to use two separate cables back to the PSU, one for each plug on the video card. Other than that, maybe the CPU overclock is not stable due to the different load on the motherboard?


----------



## HyperC

Quote:


> Originally Posted by *Syan48306*
> 
> SLI 1080Ti under water clocked to a constant 2050Mhz Core and 5750Mhz Memory.


Ummm, that score is very low


----------



## gmpotu

Quote:


> Originally Posted by *Nico67*
> 
> PSU should be ok, but make sure to use two separate cables back to the psu, one for each plug on the videocard, other than that, maybe CPU overclock is not stable due to different load on the motherboard?


Hmm, could be the CPU. I didn't think of that. My CPU is, I think, set to turbo and only hitting 4.2GHz as a 3770K, with temps below 40C. I don't think it's even manually overclocked.
For what it's worth, my mouse has started to go haywire too now and will get "stuck" for a second and then move, etc.

The system crashes as soon as I do something that demands serious power from the GPU. Furmark won't even start from the GPU Tweak II application now.


----------



## Nico67

Quote:


> Originally Posted by *gmpotu*
> 
> Hmm could be the cpu. Didn't think of that. My CPU is I think set to turbo though and only hitting 4.2ghz as a 3770k with temps below 40c. I don't think it's even manually overclocked.
> For what it's worth my mouse has started to go haywire too now and will get "stuck" for a second and then move etc.
> 
> The system crashes as soon as I do something that requires intense power from the GPU. Furmark won't even start from the GPU Tweak II application now.


Yeah, it all sounds as if it should be OK, but that's the trouble with PCs.

The PSU is the most likely suspect; it could just be getting old. I'd try a spare one if you have it, or try different VGA plugs on the PSU, particularly if it has multiple 12V rails.
It's kind of hit and miss tracking this stuff down, but maybe check one of the old video cards in the same slot again and see if it's still good, etc.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Syan48306*
> 
> SLI 1080Ti under water clocked to a constant 2050Mhz Core and 5750Mhz Memory.


What's wrong with your score? Did you forget to activate SLI?


----------



## Foxrun

How do you activate sli in superposition?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Foxrun*
> 
> How do you activate sli in superposition?


You shouldn't need to, but it should at least be active in the NVIDIA Control Panel.


----------



## Foxrun

I have SLI enabled, but I guess it doesn't support it? Regardless, my score seems uncharacteristically low compared to other users here.


----------



## gmpotu

I dropped my processor to Auto Turbo mode. Reinstalled Furmark and was able to run the bench mark one time to test.
So with CPU and GPU at default this is what GPU-Z is showing during the test.


----------



## Nico67

Quote:


> Originally Posted by *gmpotu*
> 
> I dropped my processor to Auto Turbo mode. Reinstalled Furmark and was able to run the bench mark one time to test.
> So with CPU and GPU at default this is what GPU-Z is showing during the test.


Yeah, that's pretty normal; use MSI Afterburner to raise the power limit to 120% and it should be a little better.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Foxrun*
> 
> 
> 
> I have SLI enabled but I guess it doesnt support it? Regardless my score seems to be uncharacteristically low compared to other users here


Maybe SLI isn't supported, but I think we have some SLI scores here.


----------



## KingEngineRevUp

I bring you... SC2 bios O_O

EVGA1080TiSC2.zip 155k .zip file


----------



## gmpotu

Quote:


> Originally Posted by *Nico67*
> 
> Yeah that's pretty normal, use MSI afterburner to raise power limit to 120 and it should be a little better.


I noticed that my 12V rail dropped from 12.48V to 12.28V after I set my CPU back to normal, so I'm not sure if that was it.


----------



## Addsome

Quote:


> Originally Posted by *SlimJ87D*
> 
> I bring you... SC2 bios O_O
> 
> EVGA1080TiSC2.zip 155k .zip file


Whats the power limit on this one? And different power design than reference board?


----------



## gmpotu

What does the power consumption limit mean, and how do I raise it beyond 120%? Or would I just start manually increasing core and memory clocks until the perf cap reason changes to VRel?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Addsome*
> 
> Whats the power limit on this one? And different power design than reference board?


I believe the design is similar if not the same as reference, I'm testing right now and it's going well.

Power draw is the same I think. I'll report in a little bit.

The Asus BIOS + shunt mod does not work, btw, just tested it... I don't even know what it's good for at this point.

But I can't give you a real reading since I am shunt modded and it will report something like 200 watts used. I'll let you know the NVIDIA command details.
Edit: same information as reference


----------



## Addsome

Quote:


> Originally Posted by *SlimJ87D*
> 
> I believe the design is similar if not the same as reference, I'm testing right now and it's going well.
> 
> Power draw is the same I think. I'll report in a little bit.
> 
> Asus bios + shunt mod does not work Btw, just tested it... I don't even know what it's good for at this point.
> 
> But I can't give you a real reading since I am shunt modded and it will report like 200 watts used. I'll let you know the nvidia command details
> Edit : same information as reference


So 300W max?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Addsome*
> 
> So 300W max?


300 watts max. So far, for me, it's more stable at OC. Time to test in Witcher 3.
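For what it's worth, the 300 W ceiling lines up with the connector-budget arithmetic for a reference 6+8-pin card; a rough sketch (using the usual PCIe spec limits of 75 W for the slot, 75 W for a 6-pin, and 150 W for an 8-pin):

```shell
# Rough connector-budget arithmetic behind the 300 W figure:
# PCIe slot (75 W) + 6-pin plug (75 W) + 8-pin plug (150 W).
SLOT=75
SIX_PIN=75
EIGHT_PIN=150
echo "Spec budget: $((SLOT + SIX_PIN + EIGHT_PIN)) W"   # prints "Spec budget: 300 W"
```

Cards can and do pull past these spec numbers, so treat this as a sanity check, not a hard electrical limit.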


----------



## KingEngineRevUp

So, reporting in: the ASUS BIOS doesn't support the shunt mod. It just goes berserk and doesn't even draw extra power. I believe it's just not meant for the FE card. The ASUS card has different components, so something must confuse it. I can't speak for KedarWolf, but I always get worse results with it. It's always bad news, and now, shunt modded with 400 watts for it to draw from, it just acts stupid.

If the ASUS BIOS can't recognize a shunt mod on a reference card, something is up with it. It's just not meant to be run on the FE.




Here are the old power readings:



Anyway, the SC2 BIOS seems to give me more stable OCs. I like it so far. It's worth trying, even though it seems equal in performance to the FE design. Also, if you're on a hybrid mod or AIO, you can now set the fan to turn off at lower temperatures, which is great for noise and extends the fan's life.

*TL;DR: So far, the SC2 BIOS doesn't seem to have any drawbacks, and you gain 0% fan speed at idle if you want it. ASUS BIOS + shunt mod = no.*


----------



## Dasboogieman

Quote:


> Originally Posted by *Slackaveli*
> 
> Aorus are running at 600 khz
> 
> 
> 
> 
> 
> 
> 
> . hell, i run my mobo's phases at 670 khz. is that bad?


Thing is, I don't think it runs at 600 kHz. It's the PWM signal split to 3 drivers, each with a quadrupler. Our signal is a sluggish 150 kHz, IIRC.
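The arithmetic behind the disagreement: a quadrupler on each driver turns the controller's base PWM frequency into the advertised per-phase number, even though the controller itself never runs that fast. A quick sketch (the 150 kHz base is the IIRC figure from the post, not a measured value):

```shell
# Frequency-quadrupler arithmetic from the discussion above:
# controller PWM base signal * quadrupler per driver.
BASE_KHZ=150
MULTIPLIER=4
echo "Effective per-phase switching: $((BASE_KHZ * MULTIPLIER)) kHz"   # prints 600 kHz
```

So "runs at 600 kHz" and "the signal is a sluggish 150 kHz" can both be true, depending on whether you count before or after the quadrupler.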


----------



## stoker

Quote:


> Originally Posted by *SlimJ87D*
> 
> I bring you... SC2 bios O_O
> 
> EVGA1080TiSC2.zip 155k .zip file


Rep+ Will try this one

Quote:


> Originally Posted by *Addsome*
> 
> Whats the power limit on this one? And different power design than reference board?


Same 7+1 VRM, 250W and 6+8-pin power; it's based on the reference design with the extra ICX temperature sensors.


----------



## HyperC

Quote:


> Originally Posted by *SlimJ87D*
> 
> So, reporting in: the ASUS BIOS doesn't support the shunt mod. It just goes berserk and doesn't even draw extra power. I believe it's just not meant for the FE card. The ASUS card has different components, so something must confuse it. I can't speak for KedarWolf, but I always get worse results with it. It's always bad news, and now, shunt modded with 400 watts for it to draw from, it just acts stupid.
> 
> If the ASUS BIOS can't recognize a shunt mod on a reference card, something is up with it. It's just not meant to be run on the FE.
> 
> Anyway, the SC2 BIOS seems to give me more stable OCs. I like it so far. It's worth trying, even though it seems equal in performance to the FE design. Also, if you're on a hybrid mod or AIO, you can now set the fan to turn off at lower temperatures, which is great for noise and extends the fan's life.
> 
> *TL;DR: So far, the SC2 BIOS doesn't seem to have any drawbacks, and you gain 0% fan speed at idle if you want it. ASUS BIOS + shunt mod = no.*


Take one for the team and solder on those 2 missing pin connectors, see if that helps the BIOS... I believe my paste comes Friday, so I will be doing my first shunt mod. I have a Core X9 case, so I think I will enclose the liquid metal with hot glue.


----------



## Caos

Quote:


> Originally Posted by *Caos*
> 
> Hi. Good result or not? Asus strix 1080ti


----------



## gmpotu

The Strix has two 8-pin power connectors but only comes with one adapter that converts two 6-pins into one 8-pin.

My power supply has four 6-pin sockets on the back, with two 6-pin to 8-pin cables. Should I be using the 6-pin to 8-pin cables, or should I look for another dual 6-pin to 8-pin adapter? (One 12V rail.)

I ask because my card is crashing (kernel power) at stock settings when on the "gaming" preset (Strix model).


----------



## KingEngineRevUp

Quote:


> Originally Posted by *HyperC*
> 
> Take one for the team and solder on those 2 missing pin connectors, see if that helps the BIOS... I believe my paste comes Friday, so I will be doing my first shunt mod. I have a Core X9 case, so I think I will enclose the liquid metal with hot glue.


No thanks, I've done enough for our team. Other people need to start experimenting, measuring, quantifying, and verifying lol.


----------



## Addsome

Quote:


> Originally Posted by *SlimJ87D*
> 
> So, reporting in: the ASUS BIOS doesn't support the shunt mod. It just goes berserk and doesn't even draw extra power. I believe it's just not meant for the FE card. The ASUS card has different components, so something must confuse it. I can't speak for KedarWolf, but I always get worse results with it. It's always bad news, and now, shunt modded with 400 watts for it to draw from, it just acts stupid.
> 
> If the ASUS BIOS can't recognize a shunt mod on a reference card, something is up with it. It's just not meant to be run on the FE.
> 
> 
> 
> 
> Here's old power readings
> 
> 
> 
> Anyway, the SC2 BIOS seems to give me more stable OCs. I like it so far. It's worth trying, even though it seems equal in performance to the FE design. Also, if you're on a hybrid mod or AIO, you can now set the fan to turn off at lower temperatures, which is great for noise and extends the fan's life.
> 
> *TL;DR: So far, the SC2 BIOS doesn't seem to have any drawbacks, and you gain 0% fan speed at idle if you want it. ASUS BIOS + shunt mod = no.*


How is it possible it is giving more stable results even though power limit and everything is the same? Are scores similar on SC2 bios vs FE bios at same clocks?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *gmpotu*
> 
> The Strix has two 8-pin power connectors but only comes with one adapter that converts two 6-pins to one 8-pin.
> 
> My power supply has four 6-pin on the back with two 6-pin to 8-pin cables. Should I be using the 6-pin to 8-pin cables or should I look for another dual 6-pin to 8-pin adapter? (One 12v rail)
> 
> I ask because my card is crashing (kernel power) at stock settings when on the "gaming" preset. (Strix) model.


I would call ASUS customer support.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Addsome*
> 
> How is it possible it is giving more stable results even though power limit and everything is the same? Are scores similar on SC2 bios vs FE bios at same clocks?


It could be placebo, or EVGA tweaked the BIOS because this card is meant to be OC'd. But the scores are very close to one another; if anything, I scored 10 more on the EVGA one. Earlier today I ran around in Witcher 3, which hates OCs, at 2101 MHz on the stock BIOS, and it eventually crashed. I haven't tested the EVGA BIOS enough, but they are equals. Maybe the only difference is the 0% fan spin.

BTW, 50% stock fan speed = 67% fan speed on the EVGA BIOS.

As for power draw, I can't tell you what the EVGA BIOS draws because, once again, I am shunt modded. What I can tell you is that it supports the shunt mod, which means it must recognize the components, or the way a reference card works.

The ASUS Strix BIOS DOES NOT. It just treats the card like a flashed FE without the shunt.


----------



## Nico67

Quote:


> Originally Posted by *gmpotu*
> 
> The Strix has two 8-pin power connectors but only comes with one adapter that converts two 6-pins to one 8-pin.
> 
> My power supply has four 6-pin on the back with two 6-pin to 8-pin cables. Should I be using the 6-pin to 8-pin cables or should I look for another dual 6-pin to 8-pin adapter? (One 12v rail)
> 
> I ask because my card is crashing (kernel power) at stock settings when on the "gaming" preset. (Strix) model.


6-pin to 8-pin cables should be fine, as the two extra pins that make up the 8-pin are ground connections. As long as it's not one 6-pin cable that splits into 2 x 8-pin, that could be a problem. Can you try the card in another system? It really is a process of elimination for the most part.


----------



## KedarWolf

Quote:


> Originally Posted by *gmpotu*
> 
> The Strix has two 8-pin power connectors but only comes with one adapter that converts two 6-pins to one 8-pin.
> 
> My power supply has four 6-pin on the back with two 6-pin to 8-pin cables. Should I be using the 6-pin to 8-pin cables or should I look for another dual 6-pin to 8-pin adapter? (One 12v rail)
> 
> I ask because my card is crashing (kernel power) at stock settings when on the "gaming" preset. (Strix) model.


I'm thinking you likely need the two six-pin to eight-pin cables. That might be your problem.


----------



## CoreyL4

Installed my Gaming X today; it clocked to 1936 out of the box. During games and benchmarks I see it downclock to like 1921, 1898, etc.

Why is this? The 980 Ti I had didn't downclock that rapidly. This is my first Pascal card, so I don't know if this is normal compared to Maxwell.


----------



## buddatech

PNY GTX 1080 Ti FE +169/500 2050/12K


9,062 Time Spy
http://www.3dmark.com/spy/1589847

6,249 SuperPosition 1080p EXTREME


----------



## KingEngineRevUp

Quote:


> Originally Posted by *CoreyL4*
> 
> Installed my Gaming X today, clocked to 1936 out of the box. During games and benchmarks I see it downclocks to like 1921, 1898 etc.
> 
> Why is this? The 980 ti I had didnt downclock that rapidly. This is my first pascal card so I dont know if this is normal compared to maxwell.


Maxwell cards clocked around 1300-1500 MHz. These cards clock around 1900-2100, much higher, so there are more points at which thermal throttling can kick in. Turn your fans up to 100% and you will probably maintain a pretty high OC.


----------



## KingEngineRevUp

For the first time, I have been CPU bottlenecked.

In BF1 at 1440p at 142 FPS (capped), I can see my CPU usage hitting 80% while my GPU utilization is only at 94%.


----------



## KedarWolf

Tested the SC2 BIOS; it's about on par with the Asus one, but it power-limits more and I had to use 0.993v at 2025 core instead of 0.981v.

If anyone has a card whose BIOS we haven't tested, right-click the zip file below, choose Properties, Unblock, Apply, OK, then run
`nvflash64 --save filename.rom` in an admin command prompt in the folder you unzipped nvflash to. It would greatly help us find the best BIOS for our FEs.


----------



## RavageTheEarth

Quote:


> Originally Posted by *SlimJ87D*
> 
> For the first time, I have been CPU bottle necked.
> 
> BF1 at 1440P at 142 FPS (capped), I can see my CPU clocks hitting 80% and my gpu utilization is only at 94%


Wow! I'm still kind of shocked by the power increase from the 980 Ti to the 1080 Ti. I mean, 2000 MHz out of the box with a maxed power slider ain't no joke. I'm luckily still getting 99% GPU utilization at 1440p 144 Hz with my delidded 6700K @ 4.7GHz. It's holding up well, but who knows what is to come in the future. If they are releasing 2000 MHz cards now, they really must have something in store for future generations. I mean, they have to sell cards, right?

What is your CPU? You should do a rig builder in your signature!


----------



## CoreyL4

Quote:


> Originally Posted by *SlimJ87D*
> 
> Maxwell cards clocked from 1300-1500. These cards clock from 1900-2100, much higher. So therefore there is more points of thermal throttling. Turn your fans to 100%, and you will probably maintain a pretty high OC.


Managed to overclock it to +60/+250 for a 1987-1949 core clock, and I didn't know how hard we can push the memory, so it's at 5760 MHz (11520 effective).

Sadly it would crash any higher on the core clock. No 2 GHz here. Still, this card rocks!
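For reference, the "5760 (11520)" numbers work out like this. A quick sketch, assuming the tool shows the GDDR5X command clock and the effective data rate is twice that; the 352-bit bus width is the 1080 Ti spec, and the function names are made up:

```python
# Why "5760" and "11520" are the same overclock: GDDR5X moves data
# at twice the displayed command clock, and peak bandwidth is the
# effective rate times the bus width in bytes.

def effective_rate_mts(displayed_mhz, multiplier=2):
    """Effective transfer rate in MT/s from the displayed clock."""
    return displayed_mhz * multiplier

def bandwidth_gbs(rate_mts, bus_bits=352):
    """Peak memory bandwidth in GB/s for a given bus width."""
    return rate_mts * 1e6 * (bus_bits / 8) / 1e9

rate = effective_rate_mts(5760)             # -> 11520 MT/s
print(rate, round(bandwidth_gbs(rate), 2))  # -> 11520 506.88
```

So a 5760 MHz memory OC on a 352-bit bus lands a little over 500 GB/s of theoretical bandwidth, versus ~484 GB/s at the stock 11 Gbps.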


----------



## leequan2009

Quote:


> Originally Posted by *stoker*
> 
> Rep+ Will try this one
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Same 7+1 vrm, 250w & 6+8 pin power, its based on reference with the extra icx temperature sensors.


Can I use this BIOS on the FE version? I see they have different power phases. What could go wrong if I flash the 1080 Ti SC BIOS?


----------



## gmpotu

Quote:


> Originally Posted by *Nico67*
> 
> 6pin to 8pin cables should be fine as the two extra they break off to make 8 are earth connections. as long as its not 1 x 6 pin cable that breaks off to 2 x 8 pin, that could be a problem. Can you try the card in another system? it really is a process of elimination for the most part.


I don't really have the luxury of trying it in another system. I might just buy a new PSU tomorrow and see if that works. I've had this PSU for like 10 years, so it's pretty old.
I'm going to try swapping one of the 6-to-8-pin cables with the adapter for the 2x6-to-8-pin since I have it. I tried to hit up Fry's Electronics today but they didn't have the cable, so who knows.
If I lower the Power Target to 80% it seems to do okay, but it should work at the stock "gaming" setting. That's what I paid for. So I have a ticket in with Asus and am waiting to hear back from them.

Anyone have a good suggestion for a PSU? 750W is probably way more than enough for me. Just one card in my system.
I'll let you guys know how it goes with swapping out the cables.

Is there a guide on OCN for the Asus GPU Tweak II utility?


----------



## Benny89

Quote:


> Originally Posted by *gmpotu*
> 
> Don't really have the luxury of trying it in another system. I might just buy a new PSU tomorrow and see if that works. I have had this PSU for like 10 years so it's pretty old.
> I'm going to try swapping one of the 6 to 8 pins with the adapter for the 2x6 to 8 pin since I have it. I tried to hit up Fry's Electronics today but they didn't have the cable so who knows.
> If I lower down the Power Target to 80% it seems to do okay, but it should work at the stock "gaming" setting. That's what I paid for. So I have a ticket in with Asus and waiting to hear back from them.
> 
> Anyone have a good suggestion for a PSU? 750W is probably way more than enough for me. Just one card in my system.
> I'll let you guys know how it goes with swapping out the cables.
> 
> Is there a guide on OCN for the Asus GPU Tweak II utility?


If you want to change PSUs anyway, it is better to invest in a bigger one, as you will be more future-proof. A bigger PSU can always deliver less, while a smaller one will never provide more.

So I would take either the EVGA 1000 G2 Gold or a Corsair Gold 1000W.

Just my opinion.
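A back-of-envelope sizing sketch along those lines. Every number here is an illustrative assumption (typical board power, a round figure for the rest of the system, a 30% headroom factor), not a measurement from this thread:

```python
# Rough PSU sizing: sum the big component draws, add a bit for
# drives/fans/board, then pad with headroom so the PSU sits near
# the efficient middle of its load curve. All inputs are assumed.

def recommended_psu_watts(gpu_w, cpu_w, other_w=75, headroom=1.3):
    """Suggested PSU rating in watts for the given component draws."""
    return (gpu_w + cpu_w + other_w) * headroom

# One 1080 Ti near its 250 W board power plus an overclocked CPU:
print(round(recommended_psu_watts(gpu_w=250, cpu_w=125)))  # -> 585
```

By this kind of estimate a quality 650-750W unit covers a single-card build comfortably; 1000W mostly buys SLI or upgrade headroom.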


----------



## SEALBoy

Could someone run Time Spy for me using these settings:

Clockspeed @ 2062 MHz
RAM @ 5900 MHz

I'm getting 10500-10600 graphics score and it feels like I should be getting higher than that.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Wow! I'm still kind of shocked by the power increase between the 980 Ti and the 1080 Ti. I mean, 2000 Mhz out of the box with a maxed power slider ain't no joke. I'm luckily still getting 99% GPU utilization at 1440p 144 hz with my delidded 6700k @ 4.7Ghz. It's seeming to hold up well, but who knows what is to come in the future. If they are releasing 2000 Mhz cards now they really must have something in store for the future generations. I mean, they have to sell cards right?
> 
> What is your CPU? You should do a rig builder in your signature!


Truth be told, I could be wrong. I do have my fps capped at 142 and it stays there so I guess that would put my card in lower utilization.

Quote:


> Originally Posted by *SEALBoy*
> 
> Could someone run Time Spy for me using these settings:
> 
> Clockspeed @ 2062 MHz
> RAM @ 5900 MHz
> 
> I'm getting 10500-10600 graphics score and it feels like I should be getting higher than that.


That's not too bad, at 2088 I got like 10700 so that sounds right.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *SEALBoy*
> 
> Could someone run Time Spy for me using these settings:
> 
> Clockspeed @ 2062 MHz
> RAM @ 5900 MHz
> 
> I'm getting 10500-10600 graphics score and it feels like I should be getting higher than that.


That looks about right.


----------



## SEALBoy

Quote:


> Originally Posted by *SlimJ87D*
> 
> That's not too bad, at 2088 I got like 10700 so that sounds right.


Guru3D got 10799 at like ~2025 MHz core and 6123 MHz RAM. My core is faster but my RAM is slower. Is Time Spy more sensitive to RAM speeds?


----------



## CoreyL4

What is a safe RAM overclock number? I'm at 5760 at the moment.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *SEALBoy*
> 
> Guru3D got 10799 at like ~2025 MHz core and 6123 MHz RAM. My core is faster but my RAM is slower. Is Time Spy more sensitive to RAM speeds?


TimeSpy is really buggy on my system and I haven't figured out why. It literally takes 5 minutes for it to initiate a benchmark. The program sits and does nothing.


----------



## KedarWolf

Quote:


> Originally Posted by *SEALBoy*
> 
> Could someone run Time Spy for me using these settings:
> 
> Clockspeed @ 2062 MHz
> RAM @ 5900 MHz
> 
> I'm getting 10500-10600 graphics score and it feels like I should be getting higher than that.


Running TimeSpy but I'm using Asus BIOS.


----------



## KedarWolf

Quote:


> Originally Posted by *SEALBoy*
> 
> Could someone run Time Spy for me using these settings:
> 
> Clockspeed @ 2062 MHz
> RAM @ 5900 MHz
> 
> I'm getting 10500-10600 graphics score and it feels like I should be getting higher than that.


TimeSpy on stock FE BIOS at 2062, 5900.


----------



## madmeatballs

Do I lose a port if I flash the SC2 BIOS to an FE?


----------



## OneCosmic

What is the maximum PL in Watts with SC2 BIOS?


----------



## OneCosmic

Quote:


> Originally Posted by *SlimJ87D*
> 
> So reporting in, the ASUS bios, doesn't support the shunt mod. It just goes berserk and doesn't even draw extra power. I believe it's just not meant for the FE card. The ASUS card has different kind of components so something must confuse it. I can't speak for Kedarwolf, but I always get worse results with it. It's always bad news and now shunt modded with 400 Watts for it to draw form, it just acts stupid.
> 
> If the ASUS bios can't recognize a shunt mod on a reference card, something is up with it. It's just not meant to be run on the FE.
> 
> 
> 
> 
> Here's old power readings
> 
> 
> 
> Anyways, the SC2 bios seems to give me more stable OCs. I like it so far. It's worth trying, even though it seems to be equal in performance to the FE design. Also, if you're on a hybrid mod or AIO, you can now set the fan to turn off at lower temperatures, which is great for noise and extends the fans life.
> 
> *TL;DR: So far, the SC2 bios doesn't seem to have any drawbacks, but you gain 0% fan speed at idle if you want. ASUS bios + Shunt mod = No.*


I confirmed this too, a few days before you; I made a post about it here.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *OneCosmic*
> 
> I confirmed this too few days before you, i made a post about it here.


Confirmed which point? The Asus bios being no good?


----------



## OneCosmic

Quote:


> Originally Posted by *SlimJ87D*
> 
> Confirmed which point? The Asus bios being no good?


The part about shunt mod not working with ASUS BIOS.


----------



## mshagg

Quote:


> Originally Posted by *madmeatballs*
> 
> Do I lost a port if I flash the SC2 bios to FE?


Should be fine. Has 3 DP ports.

Not sure what the benefit of this over an FE BIOS is, although the SC2 BIOS will have an OC baked into it.


----------



## SEALBoy

Quote:


> Originally Posted by *KedarWolf*
> 
> TimeSpy on stock FE BIOS at 2062, 5900.


Thanks. I ran mine again and got 10512. Not sure why it would be slower... I have a 2700K @ 4.8GHz, maybe my CPU is gimping the 1080 Ti?


----------



## imLagging

I remember seeing a thread where someone compared 3DMark Fire Strike on Windows 7 vs Windows 10 and the difference was around ~10%. Think I would gain some points with this benchmark on 10? The highest I have been able to push is 2139 MHz; it crashes after a short time at 2154 MHz in Heaven and Superposition.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *imLagging*
> 
> 
> I remember seeing a thread with someone showing 7 vs 10 3dmark firestrike and the difference was around ~10%. Think I would gain some points with this benchmark on 10? Highest I have been able to push is 2139mhz, crashes after a short time on 2154mhz with heaven and superposition.


Answer is no.


----------



## gmpotu

Thank you to everyone here at the OCN community for talking me through some of the details. I think the issue was a bad port on my PSU.
I switched which PCI-E power spot I was using and now I'm able to run OC mode getting 2000 core with WoW and Furmark running and a video streaming with like 20 windows open in chrome.

Thanks so much guys. I also learned a lot about this overclock tool so I'll be playing around with trying to step it up just slightly on my OC the next couple of days.


----------



## eXteR

I received the EVGA Hybrid kit (1080/1070 version) yesterday and mounted it using the stock Founders shroud.

Obviously max OC is the same, but I get sustained speed at these clocks on the stock BIOS: 2038 MHz core / 6000 MHz memory.

At the moment the pump is connected to the card; I'll shortly receive a cable to connect the pump to the PSU.

Temps while playing are 45-47°C. On the rad I'm using a Corsair ML120 PRO fan. Far better than the stock Hybrid fan, and barely audible up to 1500 RPM.

It's gorgeous being able to play with zero noise and low temps.


----------



## DStealth

Quote:


> Originally Posted by *SEALBoy*
> 
> Thanks. I ran mine again and got 10512. Not sure why it would be slower... I have a 2700K @ 4.8GHz, maybe my CPU is gimping the 1080 Ti?


PCI-e 2.0 and IPC of i7 [email protected]/4.44GHZ vs 2700K @ 4.8GHz


----------



## pez

Quote:


> Originally Posted by *Slackaveli*
> 
> cpu and ram DEFINITELY influence fps at any resolution, including 4k. More draw calls = more fpa. yes, it's a lot less of a difference than in a cpu bound resolution. But, for instance, i get about ~120fps in BF1 in 1440p on my i-7 5775c at stock 3.3ghz with 1333mhz ram. But at 4.3Ghz and 3.7ghz uncore and 2133mhz Cas-9 ram, i get ~144+ fps. ANd who thinks a ryzen 4 core stock has the same fps in 4k as a 5.0Ghz 7700k? C'mon, guys, let's stop with this old argument. That like the guys who kept saying you'd get no fps gains in QHD going from a 4790k to a 5775c. Well, everyone that did it like myself, Benny, palotte, and a few others ALL report 10+ fps gains, just from a cpu drop in.
> 
> TLR - cpu and ram still affect performance at any resolution.


BF1 is still pretty heavily influenced by the CPU, so it is affected by CPU and RAM at all resolutions, but there are quite a few games that aren't. You reduce CPU influence/bottleneck significantly as you go up in res, but most people are playing at resolutions that are CPU bottlenecked.

I do find it a bit silly how CPU-influenced BF1 is, though. It seems a bit backwards considering the benefits of DX12.


----------



## Taint3dBulge

So I was having some major driver crashes in BF1; it looks like DX12 is the culprit. If you switch from DX12 to DX11 you can clock your cards higher, and really the eye candy isn't much different. Also, I don't see any improvement in latency, and FPS only goes up a little using DX12.


----------



## Silent Scone

So far my Strix can't hold a candle to my Titan X. Even clock for clock at 2100 it's a far cry from my previous Super Position score. Needs water


----------



## pez

Quote:


> Originally Posted by *Silent Scone*
> 
> So far my Strix can't hold a candle to my Titan X. Even clock for clock at 2100 it's a far cry from my previous Super Position score. Needs water


2016 XP? I never have the patience to compare my cards, but in gaming I didn't feel a difference between the STRIX and the 2016 XP. Of course synthetics are different.


----------



## Silent Scone

Quote:


> Originally Posted by *pez*
> 
> *2016 XP*? I never have the patience to compare my cards, but in gaming I didn't feel a difference between the STRIX and 2016 XP. Of course synthetics are different
> 
> 
> 
> 
> 
> 
> 
> .


No, if it were the 2016 XP that would be glaringly obvious as to why. The Strix is scoring poorly compared to my original TX. I need EK to pull their finger out and release some blocks.

(There is a valid reason why I have both; I wouldn't expect anyone in their right mind to replace a TX with a 1080 Ti.)


----------



## Chocobo

I have a question regarding overclocking!

I got the Asus Strix OC 1080 Ti and expected the core clock to be at ~1700 MHz (as the official specs show), yet when I plugged it in and opened Afterburner, it showed that the actual clock was 1960 MHz.
I didn't touch anything; the overclock was at +0. Then I put +90 on the overclock to reach 2050 MHz and it worked fine.

Does Afterburner overclock it by default? Was my card overclocked that high from the factory? Am I missing something? I was expecting to overclock my card manually by up to +350 MHz, yet it seems all I needed was +90 MHz to reach 2050 MHz.

Thanks and sorry for my English!


----------



## pez

Quote:


> Originally Posted by *Chocobo*
> 
> I have a question regarding overclock!
> 
> I got the Asus Strix OC 1080 Ti, and I expected the core clock to be at 1700Mhz~ (as the official specs show), yet when I plugged it in and opened Afterburner, it showed that the actual clock was 1960Mhz.
> I didn't touch anything, the overclock was at +0. then I put +90 on the overclock to reach 2050Mhz and it worked fine.
> 
> Does Afterburner overclock it by default? Was my card overclocked that high from manufacture? Am I missing something? I was expecting to overclock my card manually up to +350Mhz, yet it seems all I needed was +90Mhz to reach 2050Mhz.
> 
> Thanks and sorry for my english!


That's GPU Boost 3.0 working its magic. With good temps and a generally more aggressive fan profile from the AIB, the GPU will boost on its own based on temperature and GPU usage alone. I believe mine currently boosts to around 1974 with nothing done to the card. Sounds like you've got good temps and a decent card to boot.


----------



## Chocobo

Quote:


> Originally Posted by *pez*
> 
> That's GPU boost 3.0 working its magic. With good temps and a mostly more aggressive fan profile from the AIB, the GPU will boost on its own based on factors of temperature and GPU usage alone. I believe mine currently will boost to around 1974 with nothing done to the card. Sounds like you've got good temps and a decent card to boot
> 
> 
> 
> 
> 
> 
> 
> .


Oh I see! Thanks a lot!


----------



## imLagging

Quote:


> Originally Posted by *SlimJ87D*
> 
> Answer is no.


So the gains could possibly apply to 3dmark, but not for unigine? Any particular reason?

User claims for 7 vs 10:
11208 http://www.3dmark.com/fs/8685261
12673 http://www.3dmark.com/fs/8706141


----------



## evosamurai

Quote:


> Originally Posted by *imLagging*
> 
> So the gains could possibly apply to 3dmark, but not for unigine? Any particular reason?
> 
> User claims for 7 vs 10:
> 11208 http://www.3dmark.com/fs/8685261
> 12673 http://www.3dmark.com/fs/8706141


Damn I was staring at my phone for a minute thinking it's slow as hell, realised it was your avatar lol


----------



## alucardis666

Part of the club again with the Aorus.

*EDIT:* Getting VERY close to my 6950X @ 4.3. That setup with the Aorus would get 1524.

This was achieved with the Ryzen 1700 @ 4.0.


----------



## Benny89

Did anyone try to disassemble the 1080 Ti ROG STRIX? Or find a video of someone doing it? I want to see if the 1080 Ti also has those stupid warranty stickers below the backplate.


----------



## BrainSplatter

Quote:


> Originally Posted by *Benny89*
> 
> Did anyone try to disasamble 1080 TI ROG STRIX? Or found a video of someone doing it? I want to see if 1080 TI also have those stupid warranty stickers below backplate.


On my ASUS FE card, there is a warranty sticker on one of the cooler screws. My Gigabyte FE doesn't have that sticker.

Therefore I'd guess the Strix will have one too.


----------



## pez

Heat guns and hair dryers are your friends when it comes to warranty stickers.


----------



## Dasboogieman

Guise guise, I found a use for the Intel stock HSF.
To the naysayers: the backplate definitely works as a heatsink; it's just a matter of getting moar cooling onto it.

Dropped load temps by around 5-10 degrees. It also lets a nice breeze flow around the VRAM area. Now to get a more powerful fan....
Quote:


> Originally Posted by *pez*
> 
> Heat guns and hair dryers are your friends when it comes to warranty stickers
> 
> 
> 
> 
> 
> 
> 
> .


Thank god Australian law doesn't recognise the legitimacy of warranty stickers. All manufacturers must give 2 years of warranty regardless of policy, and the consumer is entitled to fully disassemble the hardware to inspect it; as long as nothing is damaged or modified upon reassembly, the warranty still holds.


----------



## alucardis666

Quote:


> Originally Posted by *Dasboogieman*
> 
> Guise guise I found a use for the intel stock HSF.
> To the naysayers, the backplate definitely works as a heatsink, its just a matter of getting moar cooling on to it.
> 
> 
> 
> Dropped load temps by around 5-10 degrees. Also lets a nice breeze go around the VRAM area too. Now to get a more powerful fan....


----------



## KickAssCop

Quote:


> Originally Posted by *Dasboogieman*
> 
> Guise guise I found a use for the intel stock HSF.
> To the naysayers, the backplate definitely works as a heatsink, its just a matter of getting moar cooling on to it.
> 
> 
> 
> Dropped load temps by around 5-10 degrees. Also lets a nice breeze go around the VRAM area too. Now to get a more powerful fan....
> Thank god Australian law doesn't recognise the legitimacy of warranty stickers. All manufacturers must give 2 years regardless of policy and the consumer is entitled to fully disassemble the hardware to inspect, as long as nothing was damaged/modified upon re-assembly the warranty still holds.


You call it Mr. Frankenstein!
But seriously, is the stock 3-slot cooler not good enough for this card?


----------



## alucardis666

Quote:


> Originally Posted by *KickAssCop*
> 
> You call it Mr. Frankenstein!
> But seriously, stock 3 slot cooler not good enough for this card?


2.5 slot, tyvm! I hope it fits and I can SLI when my 2nd one shows up. It looks like it's gonna be very tight.


----------



## Benny89

Does anyone have experience with Zotac and its support? Do they also include warranty stickers below the backplate/heatsink?

I get mad when I can't repaste a GPU without losing the warranty.


----------



## Benny89

Is there any other difference between the STRIX OC version and the non-OC version of this card? If I OC the card with AB anyway, then there is really no difference between the non-OC and OC cards, right?


----------



## cugno87

Quote:


> Originally Posted by *Benny89*
> 
> Does anyone has experience with Zotac and its support? Do they also include warranty stickers below backplate/heatsink?
> 
> I am mad when I can't repaste GPU without losing warranty


Zotac FE doesn't have warranty stickers.


----------



## Blotto80

Quote:


> Originally Posted by *cugno87*
> 
> Zotac FE doesn't have warranty stickers.


Neither did mine straight from nVidia.


----------



## DStealth

The MSI Gaming X has one on one of the screw heads. The stupidest part is that you can unscrew it with pliers without touching the head itself. But the more interesting part is that the two shunts for the 8-pin connectors are visible and easily moddable without taking the cooler off. I don't know what exactly they're hiding behind this warranty sticker.


----------



## gstarr

Can someone confirm: will the FE backplate fit on EKWB's new GTX 1080 Ti block?


----------



## phenom01

Just ordered my MSI 1080 Ti X. Can't wait to play some games on my ultrawide without memory limits and SLI stutter.


----------



## Clukos

Just got mine (MSI Gaming X), boosts to 1950 out of the box, ran a firestrike with my 3570k (last day before Ryzen 1700): http://www.3dmark.com/3dm/19382980?

30k GPU score at stock clocks, not bad


----------



## Levesque

Just received my 2 MSI 1080 Ti FEs, and I should receive my 2 waterblocks today or tomorrow.

I don't want to read 602 pages lol, so can anyone tell me what people are able to OC to on average on air at stock voltage, before I get my blocks?

Ty.


----------



## s1rrah

Quote:


> Originally Posted by *Levesque*
> 
> Just received my 2 MSI 1080 Ti FE, and should receive my 2 waterblocks today or tomorrow.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't want to read 602 pages lol, so can anyone tell me what people are able to OC on average on air on stock voltage before I get my blocks?
> 
> Ty.


I'm pretty comfortable at 2000+ on the core and 6000mhz on memory ... no problems with stock volts thus far but I've only been running it a few days ... sure is a monstrous card, though ... can't get my head around how my 1440p performance has increased so dramatically over my dual 980's in SLI ...

...


----------



## Levesque

Quote:


> Originally Posted by *s1rrah*
> 
> I'm pretty comfortable at 2000+ on the core and 6000mhz on memory ... no problems with stock volts thus far but I've only been running it a few days ... sure is a monstrous card, though ... can't get my head around how my 1440p performance has increased so dramatically over my dual 980's in SLI ...


Just replaced 980 Ti SLI with 1080 Ti SLI.

I'm impressed with those 1080 Ti. Monsters!

Ty for the info, will try to push them for fun on air.


----------



## Clukos

Almost 32k GPU score when overclocked and I haven't touched voltage yet (Afterburner doesn't let me): http://www.3dmark.com/3dm/19383466?

FS Extreme: http://www.3dmark.com/3dm/19383570?


----------



## s1rrah

Quote:


> Originally Posted by *Levesque*
> 
> Just replaced 980 Ti SLI with 1080 Ti SLI.
> 
> I'm impressed with those 1080 Ti. Monsters!
> 
> Ty for the info, will try to push them for fun on air.


Oh, forgot to mention ... it's a Seahawk Hybrid water-cooled card ... so technically not air cooled ...


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Levesque*
> 
> Just received my 2 MSI 1080 Ti FE, and should receive my 2 waterblocks today or tomorrow.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't want to read 602 pages lol, so can anyone tell me what people are able to OC on average on air on stock voltage before I get my blocks?
> 
> Ty.


All you had to do was read the first post and see that we have a list of people's OCs.


----------



## CoreyL4

What is a good/safe OC for memory? I'm at 5760 currently.


----------



## leequan2009

Quote:


> Originally Posted by *Dasboogieman*
> 
> Guise guise I found a use for the intel stock HSF.
> To the naysayers, the backplate definitely works as a heatsink, its just a matter of getting moar cooling on to it.
> 
> 
> 
> Dropped load temps by around 5-10 degrees. Also lets a nice breeze go around the VRAM area too. Now to get a more powerful fan....
> Thank god Australian law doesn't recognise the legitimacy of warranty stickers. All manufacturers must give 2 years regardless of policy and the consumer is entitled to fully disassemble the hardware to inspect, as long as nothing was damaged/modified upon re-assembly the warranty still holds.


So cool


----------



## mshagg

Quote:


> Originally Posted by *CoreyL4*
> 
> What is a good/safe oc for memory? Im at 5760 currently.


You won't hurt it; things just become unstable after a point. But you need to do some testing to make sure the memory OC isn't hurting performance. There is a sweet spot with a mem OC that takes a while to find.
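The search itself can be sketched like this. `run_benchmark` is a made-up stub standing in for a real benchmark pass, and the score collapse past 6000 MHz is a toy model of memory error correction quietly retrying transfers, not measured data:

```python
# Finding the memory-OC sweet spot: pick the clock with the best
# benchmark score, not the highest clock that merely doesn't crash.

def run_benchmark(mem_clock):
    # Stub score curve (hypothetical): rises with clock, then falls
    # past 6000 MHz where, in this toy model, error correction
    # starts eating the gains. Replace with a real benchmark run.
    return mem_clock - max(0, (mem_clock - 6000) * 10)

def find_sweet_spot(clocks):
    """Return the clock that produced the best score."""
    return max(clocks, key=run_benchmark)

clocks = range(5500, 6301, 100)   # sweep in 100 MHz steps
print(find_sweet_spot(clocks))    # -> 6000
```

The point of the sketch: past a certain offset the card stays "stable" but scores drop, so always compare scores at each step instead of stopping at the first non-crashing clock.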


----------



## st0necold

Quote:


> Originally Posted by *Levesque*
> 
> Just replaced 980 Ti SLI with 1080 Ti SLI.
> 
> I'm impressed with those 1080 Ti. Monsters!
> 
> Ty for the info, will try to push them for fun on air.


Bro congratulations!!!

How much better do the games look with the new 1080ti SLI setup? I haven't decided on upgrading yet-- are the graphics better?


----------



## CoreyL4

Quote:


> Originally Posted by *mshagg*
> 
> You wont hurt it. Things will become unstable after a point. You need to do some testing to make sure that's not hurting performance. There is a sweet spot with mem OC that takes a while to find.


Any idea of what that sweet spot is? I see people going to 6000mhz.


----------



## Mrip541

Quote:


> Originally Posted by *SlimJ87D*
> 
> Dude call a friend or family to go get it for you. Do you live in a apartment complex? Ask the leasing office if they can grab it for you and tell them it's super important


Got someone to save it for me. whew.

It's the Gigabyte Gaming OC and I installed it just now. It has the most horrendous coil whine. I hesitate to call it coil whine because I've heard coil whine before and what I have now is so much worse. It sounds like someone being electrocuted at various voltages based on the load. As I look around in game the stuttering crackle changes in real time.


----------



## reflex75

Quote:


> Originally Posted by *Mrip541*
> 
> It's the Gigabyte Gaming OC and I installed it just now. It has the most horrendous coil whine. I hesitate to call it coil whine because I've heard coil whine before and what I have now sounds different. It sounds like someone being electrocuted at various voltages based on the load. As I look around in game the stuttering crackle changes in real time.


Sorry for you about this coil whine








But I was wondering: what is the acceptable amount of coil whine?
Because I tried 3 different 1080ti and they all produce some coil whine...


----------



## JedixJarf

Quote:


> Originally Posted by *phenom01*
> 
> Just ordered my MSI 1080ti X. Cant wait to play some game on my ultrawide without memory limits and sli stutter.


This card is glorious for ultrawide


----------



## Mrip541

Quote:


> Originally Posted by *reflex75*
> 
> Sorry for you about this coil whine
> 
> 
> 
> 
> 
> 
> 
> 
> But I was wondering: what is the acceptable amount of coil whine?
> Because I tried 3 different 1080ti and they all produce some coil whine...


I guess I'm not really sure... I had a 7970 that had a bit of whine that I kept because it wasn't really audible over game sound, but this 1080ti is unbearable. It sounds like my case is about to burst into flames.


----------



## tpain813

Sorry, this thread is pretty big and I tried searching, but I just want to confirm because I saw a comment a few pages back: are you guys still overclocking these by going into the voltage/frequency curve, or are you just adding straight to the core clock in Afterburner?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *tpain813*
> 
> Sorry this thread is pretty big, and tried searching but just want to confirm because I saw a comment a few pages back: you guys are still overclocking these by going into the Voltage/Frequency curve right? Or are you guys just adding straight to the core clock in afterburner?


It may or may not help you. It's a "YMMV" kind of thing.


----------



## Dasboogieman

Quote:


> Originally Posted by *Mrip541*
> 
> I guess I'm not really sure... I had a 7970 that had a bit of whine that I kept because it wasn't really audible over game sound, but this 1080ti is unbearable. It sounds like my case is about to burst into flames.


Quote:


> Originally Posted by *reflex75*
> 
> Sorry for you about this coil whine
> 
> 
> 
> 
> 
> 
> 
> 
> But I was wondering: what is the acceptable amount of coil whine?
> Because I tried 3 different 1080ti and they all produce some coil whine...


99% of modern GPUs will produce coil whine under specific conditions.

The good news is you can deal with coil whine by padding the chokes with something to dampen the vibration. This usually gets rid of 80% of the whine. It can be as involved as spreading hot glue around each choke; don't worry, it's easily removable.

If you're lazy like me, just run a long strip of Sekisui thermal adhesive across the top of all the inductors (but don't remove the white paper bit) and that also does the job. I did this to my AMD 290 Tri-X and haven't had noticeable coil whine in years.

https://en.wikipedia.org/wiki/Coil_noise#Reducing_coil_noise

Bear in mind coil whine is not actually indicative of the quality of the inductors used. Special low-noise units do exist (usually at a cost in price, heat, or current limits), but I haven't seen a GPU manufacturer use low-noise inductors since, IIRC, Gigabyte a few years back with those "Ultra Durable" branded solid ferrite chokes.


----------



## tpain813

Quote:


> Originally Posted by *SlimJ87D*
> 
> It may or may not help you. It's a "YMMV" kind of thing.


Oh, got it. So both methods work? I thought with 1080s you had to OC using the frequency curve. Good to know that simply adding an overall boost-clock offset works too!


----------



## KedarWolf

Hi kind peeps,

Been busy with real life stuff and work today.

Soon we are going to add a form to our OP, so you just add your 1080 Ti model number and info in the form and the spreadsheet is automatically updated. We may be able to actually embed the spreadsheet info in the OP as well.

When I'm home from work I'll try to set it up.

Shout out to @joder for setting up the form and possibly helping with the embedding and to @SlimJ87D for the great work he's done in improving the OP!!


----------



## KedarWolf

Quote:


> Originally Posted by *SEALBoy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> TimeSpy on stock FE BIOS at 2062, 5900.
> 
> 
> 
> 
> 
> Thanks. I ran mine again and got 10512. Not sure why it would be slower... I have a 2700K @ 4.8GHz, maybe my CPU is gimping the 1080 Ti?
Click to expand...

Yeah, you need to compare the graphics scores, not the overall score that includes the CPU; in the screenshots it's to the left of the CPU score.
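For anyone comparing numbers, it helps to know why the overall score diverges: 3DMark blends the graphics and CPU sub-scores with a weighted harmonic mean, so a slower CPU drags the total down even when the GPU result is identical. A rough sketch (the weights are illustrative assumptions, not Futuremark's published values):

```python
def overall_score(graphics, cpu, w_graphics=0.85, w_cpu=0.15):
    """Weighted harmonic mean of the sub-scores (weights are assumed)."""
    return (w_graphics + w_cpu) / (w_graphics / graphics + w_cpu / cpu)

# Same graphics result, two different CPU scores:
print(round(overall_score(10800, 9500)))   # modern CPU
print(round(overall_score(10800, 5500)))   # older quad-core drags the total down
```

The harmonic mean punishes the weaker sub-score, which is why two cards with identical graphics scores can show overall totals hundreds of points apart.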


----------



## Clukos

Alright, overclocking done for today, some results:

Firestrike (32k GPU Score): http://www.3dmark.com/3dm/19385621


Firestrike Extreme (15.6k GPU Score): http://www.3dmark.com/3dm/19385311


Firestrike Ultra (7.7k GPU Score): http://www.3dmark.com/3dm/19385522


Time Spy (10.8k GPU Score): http://www.3dmark.com/3dm/19386518


2038 curve OC and +750 on mem. I'm severely power limited but I don't really want to shunt mod the card.


----------



## reflex75

Quote:


> Originally Posted by *Dasboogieman*
> 
> 99% of modern GPUs will produce coil whine under specific conditions.
> 
> The good news is you can deal with coil whine by padding the chokes with something to dampen the vibration. This usually gets rid of 80% of the coil whine. It can be as involved as spreading hot glue around each choke. Don't worry, it's entirely removable.
> 
> 
> If you're lazy like me, just run a long strip of Sekisui thermal adhesive across the top of all the inductors (but don't remove the white paper bit) and that also does the job. I did this to my AMD 290 Tri-X and haven't had noticeable coil whine in years.
> 
> https://en.wikipedia.org/wiki/Coil_noise#Reducing_coil_noise
> 
> Bear in mind coil whine is not actually indicative of the quality of the inductors used. Special low noise units do exist (usually at a drawback of cost, heat or current limitations) but I haven't seen a GPU manufacturer use low noise inductors since iirc Gigabyte a few years back with those "ultra durable" branded solid ferrite chokes.


Thank you for the tip.
But removing the cooler will void the warranty...
And if it were that simple, why do manufacturers do nothing about it?


----------



## buddatech

Love my 1080 Ti. It even forced me into upgrading my old 2560x1600 30" XHD, which by the way still works LOL, but damn, I should have waited for an AIB card.


----------



## SuprUsrStan

Quote:


> Originally Posted by *gstarr*
> 
> someone can confirm? will FE backplate fit on ekwb new gtx1080ti cooler?


Nope, you need the EK backplate.


----------



## Clukos

Superposition benchmark 1080p extreme



And I'm done with benches, time to try some games


----------



## gmpotu

Nice memory OC clukos!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *reflex75*
> 
> Thank you for the tip.
> But removing the cooler will void the warranty...
> And if it was that simple, why manufacturers do nothing about it?


Because the chokes should be bare and have a fan blowing over them, lol. Don't put too much hot glue.


----------



## Clukos

Quote:


> Originally Posted by *gmpotu*
> 
> Nice memory OC clukos!


I got lucky both with the core and the memory but I'm still power limited. Wish we could mod our bios like we did with Maxwell, oh well


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Clukos*
> 
> Superposition benchmark 1080p extreme
> 
> 
> 
> And I'm done with benches, time to try some games


You're not done until you do 4K optimized bench!


----------



## DStealth

Quote:


> Originally Posted by *Clukos*
> 
> Alright, overclocking done for today, some results:
> 
> 2038 curve OC and +750 on mem. I'm severely power limited but I don't really want to shunt mod the card.


But then you'll have the chance to burn it... literally








Great memory OC you have there, but push some more - my results in these tests were, GPU-wise:
FS - 32294
FSX - 15930
FSU - 7850
TS - 11304


----------



## KingEngineRevUp

Quote:


> Originally Posted by *SEALBoy*
> 
> Guru3D got 10799 at like ~2025 MHz core and 6123 MHz RAM. My core is faster but my RAM is slower. Is Time Spy more sensitive to RAM speeds?


NVM, I scored 10789 while power limited with clock fluctuations from 1950-2050 Mhz.

I haven't had a chance to do the test with the shunt mod.


----------



## Clukos

Quote:


> Originally Posted by *SlimJ87D*
> 
> You're not done until you do 4K optimized bench!


Here you go


----------



## gmpotu

Couple of questions for you guys.
1) Now that I have my power stable (I think) I'm going to test my OC to see what I can achieve. I've read that some of you are flashing your bios so you can get a higher power draw. (I'm on an Asus Strix)
a) Does the FE Bios actually let the slider go above 120% or does 120% just draw more power than it would on another Bios?
b) Would I lose any features of my Strix like my DVI or HDMI port functionality or the color aura control etc if I flashed to a different Bios?

2) Anyone have a suggestion on the best (in your opinion) 21:9 curved monitor to go with?
a) from the ones I have seen in the store I like the curve of the LG 34" 88-b the most but I've read some bad reviews on this one though
b) I would prefer to have G-Sync as an option, but the Acer Predator and the ROG Swift don't have as much of a curve as the LG 88-b, and from the angle I was at in the store they looked washed out (maybe those two are matte and the LG is glossy?)


----------



## Clukos

Quote:


> Originally Posted by *DStealth*
> 
> But then you'll have the chance to burn it... literally
> 
> 
> 
> 
> 
> 
> 
> 
> Great memory OC you have there, but push some more - my results in these tests were, GPU-wise:
> FS - 32294
> FSX - 15930
> FSU - 7850
> TS - 11304


Yup TS is the worst one with power limiting, great scores but I don't want to risk it with shunt modding. I'm probably undervolting some more for 24/7 usage


----------



## RavageTheEarth

Quote:


> Originally Posted by *Clukos*
> 
> Alright, overclocking done for today, some results:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Firestrike (32k GPU Score): http://www.3dmark.com/3dm/19385621
> 
> 
> Firestrike Extreme (15.6k GPU Score): http://www.3dmark.com/3dm/19385311
> 
> 
> Firestrike Ultra (7.7k GPU Score): http://www.3dmark.com/3dm/19385522
> 
> 
> Time Spy (10.8k GPU Score): http://www.3dmark.com/3dm/19386518
> 
> 
> 
> 
> 2038 curve OC and +750 on mem. I'm severely power limited but I don't really want to shunt mod the card.


Wow, that CPU seems to really be holding back your scores. Here is a comparison between me and you: your graphics scores are a couple of points higher than mine, and I'm at 2063 core and 1465 memory. My card isn't a good overclocker when it comes to memory, so I am only running +400. At the moment I'm sticking with 2063 core @ 1.063v, although after gaming for a little while I start to get VRel as my PerfCap reason in GPU-Z, so it seems voltage hungry; but the card runs cool and only drops down 1 bin after gaming for a while, which is good, so I'm ignoring that for now. Just waiting until EK releases the block for the Aorus, and then I'm going to really see what the card can do.


----------



## Clukos

Yeah I'm getting a Ryzen 1700 tomorrow, these are the last benchmarks with my 3570k


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Clukos*
> 
> Here you go


Awesome, Now go enjoy some games!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Wow that CPU seems to really be holding back your scores. Here is a comparison between me and you. Your graphics scores are a couple points higher than mine and I'm at 2063 core and 1465 memory. My card isn't a good overclocker when it comes to memory so I am only running +400. At the moment I'm just sticking with 2063 core @ 1.063v although after gaming for a little while I start to get Vrel for my Perfcap reason in GPU-Z so it seems voltage hungry, but the card runs cool and I only drop down 1 bin after gaming for a while which is good so I'm just ignoring that for now. Just waiting until EK releases the block for the Aorus and then I'm going to really see what the card can do.


Yeah, but if he's playing at higher resolutions, then you guys would be about equal.

If he had a better CPU though, it would give him like 1 or 2 more FPS. Not that big of a deal.

If he's playing at 1080P, then that's a different story.


----------



## chibi

I like even numbers and simplicity. Will I run into any troubles with a core oc of 2000 MHz, mem oc of 6000 MHz without any curve adjustment and increased voltage? Can I go back to the Maxwell days of just bumping up the slider and not run into any voltage or power limits with the above oc gains? I'm keeping it mild to not have to tinker too much. GPU will be on a full waterblock and custom liquid.

Card is an EVGA FE. I can't test at the moment due to a wip build log, but hopefully in 2 weeks time I'll have it up and running.


----------



## RavageTheEarth

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, but if he's playing at higher resolutions, then you guys would be about equal.
> 
> If he had a better CPU though, it would give him like 1 or 2 more FPS. Not that big of a deal.
> 
> If he's playing at 1080P, then that's a different story.


Yeah I'm talking about the CPU holding him back benching-wise. Gaming is a whole other story. In games it's not as big of a deal if you don't have a newer CPU.
Quote:


> Originally Posted by *DStealth*
> 
> But then you'll have the chance to burn it... literally
> 
> 
> 
> 
> 
> 
> 
> 
> Great memory OC you have there, but push some more - my results in these tests were, GPU-wise:
> FS - 32294
> FSX - 15930
> FSU - 7850
> TS - 11304


Jesus you are running 2114 on the core? What voltage are you running on the curve to hit 2114? That's a nice card you've got there!


----------



## Funkynex

I just got this little puppy, Zotac GTX 1080 Ti Blower.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *chibi*
> 
> I like even numbers and simplicity. Will I run into any troubles with a core oc of 2000 MHz, mem oc of 6000 MHz without any curve adjustment and increased voltage? Can I go back to the Maxwell days of just bumping up the slider and not run into any voltage or power limits with the above oc gains? I'm keeping it mild to not have to tinker too much. GPU will be on a full waterblock and custom liquid.
> 
> Card is an EVGA FE. I can't test at the moment due to a wip build log, but hopefully in 2 weeks time I'll have it up and running.


You're going to have issues because the card is really running at 1999 Mhz and just rounds up to 2000 Mhz in reporting software. So your OCD will probably cause you to either hang yourself or jump out a window because it's truly not running an even number.
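There is real truth behind the joke: GPU Boost 3.0 steps Pascal clocks in roughly 12.5 MHz bins, so a round target like 2000 MHz simply lands on the nearest bin. A sketch of the ladder (the base offset and step size are approximations inferred from clocks commonly reported in this thread, not an official spec):

```python
def boost_bins(base_mhz=1987.5, step_mhz=12.5, count=8):
    """Approximate Pascal boost-clock ladder; monitoring tools round the
    half-megahertz values differently, hence 1987 vs 2038 style readouts."""
    return [base_mhz + step_mhz * i for i in range(count)]

print(boost_bins())
```

The 1987, 2012, 2025, 2038, 2050 and 2063 figures quoted all over this thread sit on this ladder once rounded.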


----------



## KingEngineRevUp

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Yeah I'm talking about the CPU holding him back benching-wise. Gaming is a whole other story. In games it's not as big of a deal if you don't have a newer CPU.
> Jesus you are running 2114 on the core? What voltage are you running on the curve to hit 2114? That's a nice card you've got there!


Running? He once ran it at that core until his card blew up. Now he's just a broken man, a shadow of his old self. Now he keeps epeening that he used to have a good card until it blew up. He waits for people to show their benchmarks and then brags about his veteran past with his old card before it blew up









Edit: Joking


----------



## chibi

Quote:


> Originally Posted by *SlimJ87D*
> 
> You're going to have issues because the card is really running at 1999 Mhz and just rounds up to 2000 Mhz in reporting software. So your OCD will probably cause you to either hang yourself or jump out a window because it's truly not running an even number.


Lawd almighty... /wrists









Ticks aside







, will 1999 run into any PL/VL issues at stock bios & voltage levels? I need to hurry up and get this build finished before all this hardware becomes obsolete.


----------



## Boogur

Quote:


> Originally Posted by *Benny89*
> 
> Did anyone try to disasamble 1080 TI ROG STRIX? Or found a video of someone doing it? I want to see if 1080 TI also have those stupid warranty stickers below backplate.


There's a warranty sticker on top of one of the four screws holding the heatsink.


----------



## outofmyheadyo

Depending on where you live, the warranty-void sticker doesn't mean anything; they can't just make stuff like this up. It's your card, you can disassemble it.


----------



## Clukos

Some undervolting results


I can run a stable 2025 on the core with "just" 1.012v (tested in several games at 4K maxed). I also lowered the memory from +750 to +600 just in case; this is for 24/7 usage. I prefer not to dump too much heat into my room/case, and I think this strikes a nice balance


----------



## RavageTheEarth

Quote:


> Originally Posted by *Clukos*
> 
> Some undervolting results
> 
> 
> I can run a stable 2025 on the core with "just" 1.012v (tested in several games at 4K maxed). I also lowered the memory from +750 to +600 just in case; this is for 24/7 usage. I prefer not to dump too much heat into my room/case, and I think this strikes a nice balance


How are you undervolting the card? Are you just lowering the values on the voltage curve?


----------



## evosamurai

Quote:


> Originally Posted by *Clukos*
> 
> Alright, overclocking done for today, some results:
> 
> Firestrike (32k GPU Score): http://www.3dmark.com/3dm/19385621
> 
> 
> Firestrike Extreme (15.6k GPU Score): http://www.3dmark.com/3dm/19385311
> 
> 
> Firestrike Ultra (7.7k GPU Score): http://www.3dmark.com/3dm/19385522
> 
> 
> Time Spy (10.8k GPU Score): http://www.3dmark.com/3dm/19386518
> 
> 
> 2038 curve OC and +750 on mem. I'm severely power limited but I don't really want to shunt mod the card.



My 980ti is almost the same


----------



## fisher6

Anybody getting a red screen when running superposition benchmark 4k optimized?


----------



## cekim

Quote:


> Originally Posted by *chibi*
> 
> Lawd almighty... /wrists
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ticks aside
> 
> 
> 
> 
> 
> 
> 
> , will 1999 run into any PL/VL issues at stock bios & voltage levels? I need to hurry up and get this build finished before all this hardware becomes obsolete.


Yeah, that's the problem with custom loops and OC: by the time you get everything sorted... it's time to start building again.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Clukos*
> 
> Some undervolting results
> 
> 
> I can run a stable 2025 on the core with "just" 1.012v (tested in several games at 4K maxed). I also lowered the memory from +750 to +600 just in case; this is for 24/7 usage. I prefer not to dump too much heat into my room/case, and I think this strikes a nice balance


I can do 2075 MHz at 1.000v

I did a one volt challenge earlier. Never caught on, no one wants to do it.


----------



## Clukos

Quote:


> Originally Posted by *RavageTheEarth*
> 
> How are you undervolting the card? Are you just lowering the values on the voltage curve?


Yup voltage curve looks like this:


Quote:


> Originally Posted by *evosamurai*
> 
> 
> My 980ti is almost the same


CPU difference









GPU is 21k vs 32k
Quote:


> Originally Posted by *SlimJ87D*
> 
> I can do 2075 Mhz at 1.000v
> 
> I did a one volt challenge earlier. Never caught on, no one wants to do it.


Damn that is nice! Thing is, I get power limited with anything higher than that and clocks jump all over the place.


----------



## CoreyL4

So my card is a dud in the silicon lottery. Final stable numbers are +60/+350, which is 1987/5858.

Any other ways to get some more juice out of the card, i.e. increasing voltage? (Not familiar with Pascal.)


----------



## KingEngineRevUp

Quote:


> Originally Posted by *CoreyL4*
> 
> so my card is a dud when it comes to silicon lottery. final stable numbers is +60/+350 which is 1987/5858
> 
> any other ways to get some more juice out of the card? ie. increase voltage? (not familiar with pascal)


Voltage curve and try to get past your crashing bins.


----------



## TheNoseKnows

Quote:


> Originally Posted by *evosamurai*
> 
> 
> My 980ti is almost the same


Come on, the 1080 Ti has a 51% higher score than the 980 Ti. That's not "almost the same".


----------



## NYU87

Quote:


> Originally Posted by *evosamurai*
> 
> 
> My 980ti is almost the same


32K vs 20K graphics score is almost the same?

Damn I must be going blind.


----------



## outofmyheadyo

Why are u comparing firestrike ?


----------



## evosamurai

Take it all back, was just looking at the score my bad, then I looked into it


----------



## IMI4tth3w

Quote:


> Originally Posted by *evosamurai*
> 
> 
> My 980ti is almost the same


You cannot simply compare the overall scores. Look at the graphics scores: his is 50% higher than yours, 32k vs 21k.


----------



## outofmyheadyo

I need to find a way to reduce the load temps on my 1080 Ti by 2-3C without increasing fan rpm; the GPU gets downclocked from 2050 to 2037 because it passes 45C, and 2037 is such a strange number.
But 600rpm on my fans is about as loud as I am willing to tolerate, and increasing the D5 Vario speed from setting 2 to 3 won't really help me, I guess.
Running a 560 Monsta and 1 CPU + 1 GPU fan at 600rpm with the D5 on setting 2, the GPU maxes out at 48C after a few hours of Mass Effect: Andromeda. What temps are you getting on your cards?


----------



## Clukos

With 1.000v on the core I can go up to 2012; that's not bad, but it's nothing amazing either. What is amazing is that at 1.000v I can pretty much avoid the power limit at all times: playing through RoTR maxed out at 4K right now, the card is pegged at 99% usage and only draws 80-85% TDP. I can live with 2012 on the core and +700 on the mem, to be honest. With 1.093v I can get close to 2100+, but I'm always power limited and the card runs way hotter; not worth it for me since I'm not on water.


----------



## hotrod717

Quote:


> Originally Posted by *Luckbad*
> 
> My Titan Xp is on the way back to Nvidia. I couldn't tolerate the temps and volume and didn't want to mod a $1200 card.
> 
> Now I have 3 Zotac Amp Extremes and an Asus ROG Strix OC on the way to cherry pick the best of them and an EVGA FTW3 preordered.
> 
> Technically I have a 1080 Ti FE on the way from a Step-Up, but I plan to sell that without opening it.
> 
> The next couple weeks are gonna be fun.


And that reference card could be better than any of them.


----------



## KedarWolf

I can do 2012 core, 6142 memory at .975v and avoid all throttling under water even in stock BIOS and I still get 10862 on TimeSpy at those settings.


----------



## CoreyL4

Quote:


> Originally Posted by *Clukos*
> 
> With 1.000v on the core I can go up to 2012; that's not bad, but it's nothing amazing either. What is amazing is that at 1.000v I can pretty much avoid the power limit at all times: playing through RoTR maxed out at 4K right now, the card is pegged at 99% usage and only draws 80-85% TDP. I can live with 2012 on the core and +700 on the mem, to be honest. With 1.093v I can get close to 2100+, but I'm always power limited and the card runs way hotter; not worth it for me since I'm not on water.


Wish I had your problem lol.

Im stuck at +60/+350 on stock voltage.


----------



## RavageTheEarth

Quote:


> Originally Posted by *Clukos*
> 
> Yup voltage curve looks like this:


I just tried this and after about 10 minutes of playing Tom Clancy's Wildlands I saw that my voltage had jumped up to 1.062v from 1.012v. Running a core clock of 2025. Not sure why it jumped up 5 bins like that. Before this I was running 2063 @ 1.062v. Looks like my card really likes 1.062v. BTW TC Wildlands is VERY CPU dependent. More so than any of the other games I play.


----------



## Foxrun

Quote:


> Originally Posted by *KedarWolf*
> 
> I can do 2012 core, 6142 memory at .975v and avoid all throttling under water even in stock BIOS and I still get 10862 on TimeSpy at those settings.


What happens when a card does hit the power limit? Does it lower fps due to the throttle?


----------



## outofmyheadyo

Quote:


> Originally Posted by *KedarWolf*
> 
> I can do 2012 core, 6142 memory at .975v and avoid all throttling under water even in stock BIOS and I still get 10862 on TimeSpy at those settings.


What's your TimeSpy graphics score at that setting?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *RavageTheEarth*
> 
> I just tried this and after about 10 minutes of playing Tom Clancy's Wildlands I saw that my voltage had jumped up to 1.062v from 1.012v. Running a core clock of 2025. Not sure why it jumped up 5 bins like that. Before this I was running 2063 @ 1.062v. Looks like my card really likes 1.062v. BTW TC Wildlands is VERY CPU dependent. More so than any of the other games I play.


It is doing that for stability.
Quote:


> Originally Posted by *Foxrun*
> 
> What happens when a card does hit the power limit? Does it lower fps due to the throttle?


Your card drops voltage bins. Higher voltage draws more power: if you're at 1V drawing 100 watts, at 1.1V you'd draw roughly 110 watts or more (dynamic power actually scales closer to the square of the voltage).

So your GPU down-volts until your power draw no longer hits the limit.
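To put rough numbers on that: dynamic power scales close to voltage squared times frequency, which is why dropping even one or two voltage bins frees up so much headroom. A back-of-envelope sketch (the 250 W at 1.000 V / 2000 MHz reference point is an assumption for illustration, not a measured figure):

```python
def est_power(volts, freq_mhz, ref_watts=250.0, ref_volts=1.0, ref_mhz=2000.0):
    """Crude dynamic-power model: P ~ V^2 * f, scaled from a reference point."""
    return ref_watts * (volts / ref_volts) ** 2 * (freq_mhz / ref_mhz)

print(round(est_power(1.062, 2063)))  # chasing one more bin costs a lot
print(round(est_power(0.975, 2012)))  # an undervolt like those in this thread
```

On this toy model the undervolt draws roughly 50 W less than the 1.062v profile, which lines up with why the undervolted cards in this thread stop hitting the power limit.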


----------



## KedarWolf

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Clukos*
> 
> Yup voltage curve looks like this:
> 
> 
> 
> 
> 
> I just tried this and after about 10 minutes of playing Tom Clancy's Wildlands I saw that my voltage had jumped up to 1.062v from 1.012v. Running a core clock of 2025. Not sure why it jumped up 5 bins like that. Before this I was running 2063 @ 1.062v. Looks like my card really likes 1.062v. BTW TC Wildlands is VERY CPU dependent. More so than any of the other games I play.
Click to expand...

After much experimenting with many BIOSes etc., what I need to do to stop voltages from jumping around is this: Ctrl+D for the default curve, hit the check mark to apply, then hold Shift and drag the top-most line one point higher than the clock you want. So if you want, say, 2032, drag the level line (holding Shift) to 2041 core and hit the check mark to apply; it'll stay at 2032 after you do.

Next you drag the voltage point you want to 2041 core - for me it's 1.000v to 2041 - and apply it again; now the straight line from 1.000v to the right will be at 2032. Lastly I smooth out the curve below 1.000v: 2025 at .993, 2012 at .981, and gradually lower from there.

By setting the curve point one bin higher than the core speed you want (like 2041 for a 2032 core speed), it doesn't jump higher.

Another example: if you put 2025 at 1.000v directly and hit apply, the voltage will at times jump to 1.062v for some reason.

But if you put it at 2032 at 1.000v and hit apply, it'll actually apply 2025 at 1.000v on a straight line to the right and NOT jump higher at all.

I might make a quick video about this trick.
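Conceptually, the trick above flattens the voltage/frequency curve at the chosen voltage so the card can never request a higher bin. A sketch of that clamping (the stock curve points below are made-up placeholders, not real Afterburner data):

```python
def flatten_curve(curve, max_volts, target_mhz):
    """Clamp every point at or above max_volts to target_mhz, mimicking the
    'drag one point up, apply, and flatten' Afterburner procedure."""
    return [(v, target_mhz if v >= max_volts else min(mhz, target_mhz))
            for v, mhz in curve]

# Hypothetical stock (voltage, boost MHz) points:
stock = [(0.950, 1950), (0.975, 1975), (1.000, 2000), (1.031, 2025), (1.062, 2050)]
print(flatten_curve(stock, 1.000, 2032))
```

Everything right of 1.000v collapses to 2032 MHz, so the card has no higher bin left to jump to.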


----------



## gmpotu

Quote:


> Originally Posted by *SlimJ87D*
> 
> Running? He once ran it at that core until his card blew up. Now he's just a broken man, a shadow of his old self. Now he keeps epeening that he used to have a good card until it blew up. He waits for people to show their benchmarks and then brags about his veteran past with his old card before it blew up
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Joking


Will moving the slider too high in the boost clock make the card request too much voltage and burn it or is the only way to burn it by messing with the actual voltage curve under the gear setting?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> After much experimenting with many BIOSes etc., what I need to do to stop voltages from jumping around is this: Ctrl+D for the default curve, hit the check mark to apply, then hold Shift and drag the top-most line one point higher than the clock you want. So if you want, say, 2032, drag the level line (holding Shift) to 2041 core and hit the check mark to apply; it'll stay at 2032 after you do.
> 
> Next you drag the voltage point you want to 2041 core - for me it's 1.000v to 2041 - and apply it again; now the straight line from 1.000v to the right will be at 2032. Lastly I smooth out the curve below 1.000v: 2025 at .993, 2012 at .981, and gradually lower from there. By setting the curve point one bin higher than the core speed you want (like 2041 for a 2032 core speed), it doesn't jump higher.
> 
> Another example: if you put 2025 at 1.000v directly and hit apply, the voltage will at times jump to 1.062v for some reason.
> 
> But if you put it at 2032 at 1.000v and hit apply, it'll actually apply 2025 at 1.000v and NOT jump around at all.
> 
> I might make a quick video about this trick.


Exactly what I do. Fastest way, too. But if you're power limited, I'd do the same thing going backwards, increasing the lower bins for Superposition and TimeSpy.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *gmpotu*
> 
> Will moving the slider too high in the boost clock make the card request too much voltage and burn it or is the only way to burn it by messing with the actual voltage curve under the gear setting?


No, your card when it hits power limits will down volt. It will down volt to around 0.992v and then probably start lowering your clocks.


----------



## KedarWolf

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I can do 2012 core, 6142 memory at .975v and avoid all throttling under water even in stock BIOS and I still get 10862 on TimeSpy at those settings.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Whats your timespy graphics score at that setting ?
Click to expand...

I don't know if I saved a screenshot; I'm on my way home from work.

On transit, darn it. I don't text and drive. What kinda wolf do you think I am, anyways?


----------



## gmpotu

Quote:


> Originally Posted by *SlimJ87D*
> 
> No, your card when it hits power limits will down volt. It will down volt to around 0.992v and then probably start lowering your clocks.


If the card down-volts on its own when it hits the power target, then how did DStealth end up burning his card? It seems like the safeguards would prevent that from happening.

Sorry if I am asking these noob questions. I'm just trying to understand more.


----------



## KedarWolf

I find that for 24/7 clocks, the best way to get high benches and stay stable for everyday use is to find the highest core at the lowest voltage (no shunt mod) where you don't hit the power limit at all in TimeSpy and Superposition, plus the best stable memory clock you can get; then back off a bit from that and run it as your 24/7 clocks.

For me on the stock BIOS it's .975v, 2012 core, 6147 memory.
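That procedure is essentially a small search: step the voltage down and, at each voltage, find the highest core clock that still passes your stress runs, then back off a safety margin for daily use. A sketch with a hypothetical is_stable() stub standing in for an actual TimeSpy/Superposition pass:

```python
def tune_24_7(voltages, clocks, is_stable, safety_bins=1, bin_mhz=13):
    """Walk voltages from highest to lowest; keep the last (lowest) voltage
    that still held a clock, minus a safety margin in boost bins.
    is_stable(volts, mhz) stands in for a real stress-test run."""
    best = None
    for v in sorted(voltages, reverse=True):
        stable = [c for c in clocks if is_stable(v, c)]
        if stable:
            best = (v, max(stable) - safety_bins * bin_mhz)
    return best

# Toy stability model: assume max clock scales linearly with voltage
# (pure illustration, not how real silicon behaves).
print(tune_24_7([1.062, 1.000, 0.975],
                range(1900, 2101, 13),
                lambda v, c: c <= v * 2000))
```

With the toy model it settles on the lowest voltage that still held a clock, minus one bin of margin, which mirrors the "find it, then run a bit lower" advice above.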


----------



## KedarWolf

Quote:


> Originally Posted by *Foxrun*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I can do 2012 core, 6142 memory at .975v and avoid all throttling under water even in stock BIOS and I still get 10862 on TimeSpy at those settings.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What happens when a card does hit the power limit? Does it lower fps due to the throttle?
Click to expand...

Core speeds go lower, so yes, it'll affect FPS. I get more consistent, better scores with clocks that aren't jumping around than when they jump all over.


----------



## alucardis666

So anyone else's *Aorus NON- EXTREME* hitting 80+C while running Heaven in 4K extreme or am I just lucky?

This was done with the fans at 100% too











On the auto fan curve the card got to 88 and then killed Heaven... I don't get why it's soooo hot. On the upside the card doesn't seem to throttle much; I'm not really seeing many perf caps, unlike my Founders or the TXp.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> So anyone else's *Aorus NON- EXTREME* hitting 80+C while running Heaven in 4K extreme or am I just lucky?
> 
> This was done with the fans at 100% too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> On the auto fan curve the card got to 88 and then killed Heaven... I don't get why it's soooo hot. On the upside the card doesn't seem to throttle much; I'm not really seeing many perf caps, unlike my Founders or the TXp.


And you did a re-tim?


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> And you did a re-tim?


I have not. I'm wondering if its gonna be worth it or not. Or if I should Rma/Return


----------



## Pandora's Box

Quote:


> Originally Posted by *alucardis666*
> 
> I have not. I'm wondering if its gonna be worth it or not. Or if I should Rma/Return


What is your room temp?


----------



## alucardis666

Quote:


> Originally Posted by *Pandora's Box*
> 
> What is your room temp?


~24-25c

My worry is what happens when the 2nd one gets here. How bad will temps be then?









My CPU is running cooler for Christ sake. *Granted it's cooled with a H100i V2, but still...*


----------



## Pandora's Box

Quote:


> Originally Posted by *alucardis666*
> 
> ~24-25c
> 
> My worry is what happens when the 2nd one gets here. How bad will temps be then?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My CPU is running cooler for Christ sake. *Granted it's cooled with a H100i V2, but still...*


Seems like a dud card IMO, though I haven't read reviews on the Aorus Non Extreme.

2-card SLI with aftermarket cards is a bad idea. Either go Founders Edition or watercooling. The top card will drown in the bottom card's heat.


----------



## alucardis666

Quote:


> Originally Posted by *Pandora's Box*
> 
> Seems like a dud card IMO, though I haven't read reviews on the Aorus Non Extreme.
> 
> 2-card SLI with aftermarket cards is a bad idea. Either go Founders Edition or watercooling. The top card will drown in the bottom card's heat.


That was my worry. I'd just like to get something that works.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *gmpotu*
> 
> If the card down volts on its own when it hits the power target them how did dstealth end up burning his card? It seems like the save guards would prevent that from happening.
> 
> Sorry if I am asking these noob questions. I'm just trying to understand more.


He shunt modded a card that doesn't need the shunt mod. His card could already access 375 watts IIRC if it needed to, and he shunt modded it anyway, so it could have accessed 500 watts, and it blew up.
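For background: the card estimates current from the voltage drop across tiny shunt resistors (I = V/R). Soldering another resistor in parallel lowers the effective resistance, so the controller under-reads current and the real power limit moves up by the same ratio. A sketch of the math (the 5 mΩ shunt value is a typical figure, assumed here):

```python
def parallel(r1, r2):
    """Combined resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

STOCK_SHUNT = 0.005                          # 5 milliohms (typical, assumed)
modded = parallel(STOCK_SHUNT, STOCK_SHUNT)  # identical shunt stacked on top
scale = STOCK_SHUNT / modded                 # factor the true power limit moves by

print(modded, scale)  # effective resistance roughly halves, so the limit doubles
print(250 * scale)    # e.g. a 250 W limit effectively becomes ~500 W
```

That's why stacking a mod on a card that can already pull well past spec is risky: the safeguards still "work", they just enforce a limit far above what the VRM was designed for.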


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> That was my worry. I'd just like to get something that works.


I think it's their TIM and mounting pressure.


----------



## Pandora's Box

Quote:


> Originally Posted by *alucardis666*
> 
> That was my worry. I'd just like to get something that works.


https://hardforum.com/threads/auros-1080ti-xtreme-unstable.1929932/

Other people reporting issues. Personally I think the Asus STRIX and MSI Gaming X are the cards to get for aftermarket aircooled cards.


----------



## outofmyheadyo

The Zotac AMP Extreme is solid as well.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Pandora's Box*
> 
> Seems like a dud card IMO, though I haven't read reviews on the Aorus Non Extreme.
> 
> 2-card SLI with aftermarket cards is a bad idea. Either go Founders Edition or watercooling. The top card will drown in the bottom card's heat.


A case with a side fan can negate that, or bottom intake fans with the top fans exhausting.

It depends on how you set up your case airflow.


----------



## CoreyL4

Quote:


> Originally Posted by *KedarWolf*
> 
> Core speeds go lower so yes, it'll affect FPS and I get more consistent better scores with clocks not jumping around than if they do jump all over.


Would upping the voltage help my dud of a card hit 2000 MHz? It can only do 1987/5858 stable at 1.06 V (stock voltage, unless that is already the max).


----------



## KraxKill

Did some more tweaking. Here are Fire Strike and Time Spy @ *2114*, zero throttling (shunted), but I suspect my CPU is holding me back a bit.


http://www.3dmark.com/fs/12381526


http://www.3dmark.com/spy/1595031


----------



## GRABibus

I receive my "MSI GTX 1080 Ti Sea Hawk X" tomorrow!








Hope this will be a good overclocker.


----------



## alucardis666

Quote:


> Originally Posted by *GRABibus*
> 
> I receive my "MSI GTX 1080 Ti Sea Hawk X" tomorrow !
> 
> 
> 
> 
> 
> 
> 
> 
> Hope this will be a good overclockable one


Jealous. Wish I could get my hands on 2 of those.


----------



## GRABibus

Quote:


> Originally Posted by *alucardis666*
> 
> Jealous. Wish I could get my hands on 2 of those.


It's difficult to find in stores in France.
I think I ordered the only one, and now lead times have increased to 2 or 3 weeks, lol.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *GRABibus*
> 
> I receive my "MSI GTX 1080 Ti Sea Hawk X" tomorrow !
> 
> 
> 
> 
> 
> 
> 
> 
> Hope this will be a good overclockable one


With a 300 watt power limit, your optimal voltage will be 1.042 V or lower.

So if you do hit gold, you'll have to think about shunt modding your card.
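To see why a 300 W cap translates into a voltage ceiling around 1.04 V, a crude dynamic-power model (P ≈ k·V²·f) is enough. The constant below is a made-up calibration chosen to land near the numbers in this thread, not a measured value for the Sea Hawk.

```python
# Rough sketch: GPU dynamic power scales roughly with voltage squared times
# clock. K is an illustrative constant chosen so ~1.04 V @ 2000 MHz lands
# near a 300 W board power; real cards vary.

K = 0.139  # W per (V^2 * MHz), illustrative only

def est_power(volts, mhz, k=K):
    """Estimated board power at a given voltage and core clock."""
    return k * volts ** 2 * mhz

def max_volts(power_cap_w, mhz, k=K):
    """Highest voltage a given power cap allows at a given clock."""
    return (power_cap_w / (k * mhz)) ** 0.5

print(round(max_volts(300, 2000), 3))    # ~1.039 V under a 300 W cap
print(round(est_power(1.093, 2050)))     # ~340 W: why 1.09 V needs a shunt mod
```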


----------



## mechwarrior

The Galax GeForce GTX 1080 Ti EXOC 11GB looks like a reference board, but I'm not sure about the power draw. The display connectors are the same as reference.
Has anyone got this board?
The BIOS might be worth testing.


----------



## alucardis666

About ready to order 2 EVGA FE's and another hybrid kit...


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> About ready to order 2 EVGA FE's and another hybrid kit...


Damn man, you've had such bad luck these last few days.

Did you at least try to re-TIM your card, or test it with the case open?

If you don't feel a lot of hot air blowing off the side of the card's heatsink, then it's definitely a TIM issue.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *mechwarrior*
> 
> Galax GeForce GTX 1080 Ti EXOC 11GB looks like a ref board but not sure about the power draw??? display connectors are the same as ref.
> has anyone got this board?
> the bios might be worth testing.


With a 6+8-pin, it'll probably be 300 watts maximum again, just like the EVGA SC2.
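The 300 W figure follows directly from the PCIe power-delivery limits (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin). A small sketch makes the arithmetic explicit; the connector names are just labels.

```python
# Spec power available to a card, summing the slot plus aux connectors
# (PCIe spec: 75 W slot, 75 W per 6-pin, 150 W per 8-pin).

PCIE_SLOT_W = 75
CONNECTOR_W = {"6pin": 75, "8pin": 150}

def board_power_budget(*connectors):
    """Total spec power for a card with the given auxiliary connectors."""
    return PCIE_SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(board_power_budget("6pin", "8pin"))   # 300 W: the EXOC / SC2 case
print(board_power_budget("8pin", "8pin"))   # 375 W: typical dual-8-pin cards
```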


----------



## GRABibus

Quote:


> Originally Posted by *SlimJ87D*
> 
> With a 300 watt power limit, your optimal voltage will be at 1.042v or lower.
> 
> So if you do hit gold, you'll have to think about shunt modding your card.


With the OC Strix BIOS, the power limit will be unlocked.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> Damn man, you have such bad luck these last few days.
> 
> Did you at least try to re-tim your card or open the case and test it with an open case?
> 
> If you don't feel a lot of hot air blowing off the side of the cards heatsink then it's definitely a Tim issue.


I know!

Not gonna bother, tbh. I'm returning it to Newegg.









The 2nd one is unfortunately already on its way from SuperBiiz; hope that one doesn't have coil whine or heating issues...


----------



## Benny89

*AORUS CARDS are poop*

*Don't buy the Xtreme cards. I can confirm the reported problems are true:*

I thought I had a good card with no problems: it could run Heaven, boosting nicely. NO.

First of all, my card started artifacting like crazy in every game and benchmark. Even with stock memory settings, it now artifacts every time.

What's more, I just can't start games, especially BF1. It crashes immediately after the map loads, when I'm supposed to choose a class. I get an error about a DirectX 11 failure and the card having been physically removed, which means the drivers crashed. Even on stock settings, without touching Afterburner at all...

So my card is broken and I can't play anything with it. Superb. I didn't have any of these problems with the Asus Strix or my 980 Ti.

I don't know what is wrong with these cards, but I recommend avoiding Aorus cards right now.

Super disappointed, considering the price and the wait.

Every time I try to start a game on stock settings, not even OC mode, anything:

And artifacts everywhere....


----------



## RavageTheEarth

Quote:


> Originally Posted by *alucardis666*
> 
> So anyone else's *Aorus NON- EXTREME* hitting 80+C while running Heaven in 4K extreme or am I just lucky?
> 
> This was done with the fans at 100% too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> On the Auto fan curve the card got to 88 and then killed Heaven... I don't get why it's soooo hot. On the upside, the card doesn't seem to throttle much; I'm not really seeing many perf caps, unlike my Founders or the TXp.
> 
> 
> Spoiler: Warning: Spoiler!


Something is VERY wrong there. I have the same card and run my fans at 100%, and I haven't seen temps go past 55C, and that's only under heavy benchmarking. While gaming I'm usually hanging between 45C and 50C. You need to take the cooler off, make sure all the thermal pads are seated correctly, and re-TIM the core.
Quote:


> Originally Posted by *KedarWolf*
> 
> After much experimenting with many BIOSes etc., what I need to do to stop voltages from jumping around is: Ctrl+D for the default curve, hit the check mark to apply, then hold Shift and drag the top line one point higher than the clock you want. So if you want, say, 2032, drag the level line (holding Shift) to 2041 core and hit the check mark to apply; it'll stay at 2032 after you do.
> 
> Next you drag the voltage point you want to 2041 core (for me it's 1.000v to 2041) and apply again; now, on a straight line from 1.000v to the right, it'll be at 2032. Lastly I smooth out the curve below 1.000v: 2025 at .993, 2012 at .981, and gradually lower.
> 
> By applying the core point one point higher than the core speed you want (like 2041 for a 2032 core speed), it doesn't jump higher.
> 
> Another example: if you put 2025 at 1.000v directly and hit apply, the voltage will jump to 1.062v at times for some reason.
> 
> But if you put it at 2032 at 1.000v and hit apply, it'll actually apply 2025 at 1.000v on a straight line to the right and NOT jump higher at all.
> 
> I might make a quick video about this trick.


Thanks! I have noticed that. I didn't think about it when trying the curve with the lower voltage. Do you get any PerfCap Reasons in GPU-Z with that voltage?
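The curve trick quoted above boils down to clamping every point at or beyond the chosen voltage to a single clock, set one step above the clock you actually want. Here is a hedged sketch of that shaping as plain data; the example points and the "one point higher" offset mirror the description above, not any Afterburner API.

```python
# Illustrative model of a flattened voltage/frequency curve. Each entry is
# voltage (V) -> core clock (MHz). flatten_curve() clamps everything at or
# above lock_voltage to target_mhz; per the trick above, target_mhz is set
# one point HIGHER than the clock you expect to run (e.g. ask 2041, land 2032).

def flatten_curve(curve, lock_voltage, target_mhz):
    """Return a new curve: a gentle ramp below lock_voltage, flat above it."""
    out = {}
    for volts, mhz in sorted(curve.items()):
        # below the lock point, never exceed the target; at/above it, pin it
        out[volts] = target_mhz if volts >= lock_voltage else min(mhz, target_mhz)
    return out

stock = {0.950: 1962, 0.981: 2012, 1.000: 2038, 1.043: 2050, 1.062: 2063}
flat = flatten_curve(stock, 1.000, 2041)
print(flat)  # every point from 1.000 V up sits at 2041; lower points keep their ramp
```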


----------



## KingEngineRevUp

Quote:


> Originally Posted by *GRABibus*
> 
> With the OC Strix BIOS, the power limit will be unlocked.


No, it will not be unlocked. Sorry to disappoint you, but four of us have already measured the power draw with the Asus BIOS, and it doesn't give you 30 watts. Bloodhawk, Nico, KraxKill and I have all measured the power output of the Asus BIOS, and at most it seems to give you 10 watts.










Second, 30 watts is not enough; you need 75-80 more watts to not be power limited. So even if the Asus BIOS gave you 30 watts, which I and three others are skeptical about, it would only stop you from hitting the power limit sometimes. You'll never be able to run 1.075-1.093 V without the shunt mod.

Stock BIOS with the shunt mod gives access to 75-100 watts of additional power.











KedarWolf needs to run his test with the stock BIOS and the Asus BIOS and read his PSU draw. His nvidia power command prompt reading is with his PhysX card still connected, so it's giving him the power draw for both cards. If he ran Furmark with the Asus and stock BIOS and measured using Corsair Link and the PSU, he'd make the same discovery the four of us already have.

Edit: I attached the wrong last image; I'll post the right one later.


----------



## KraxKill

Quote:


> Originally Posted by *Benny89*
> 
> *AORUS CARDS are poop*
> 
> *Don't buy the Xtreme cards. I can confirm the reported problems are true:*
> 
> I thought I had a good card with no problems: it could run Heaven, boosting nicely. NO.
> 
> First of all, my card started artifacting like crazy in every game and benchmark. Even with stock memory settings, it now artifacts every time.
> 
> What's more, I just can't start games, especially BF1. It crashes immediately after the map loads, when I'm supposed to choose a class. I get an error about a DirectX 11 failure and the card having been physically removed, which means the drivers crashed. Even on stock settings, without touching Afterburner at all...
> 
> So my card is broken and I can't play anything with it. Superb. I didn't have any of these problems with the Asus Strix or my 980 Ti.
> 
> I don't know what is wrong with these cards, but I recommend avoiding Aorus cards right now.
> 
> Super disappointed, considering the price and the wait.
> 
> Every time I try to start a game on stock settings, not even OC mode, anything:
> 
> And artifacts everywhere....


You got a "bad" card (which, from what you posted, is yet to be determined), and from that you're suggesting that people avoid them? There are bad apples among all vendors' models.

What troubleshooting have you tried? Removed drivers? Reseated the card? Done a clean install? What are your temps?


----------



## KedarWolf

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Quote:
> 
> 
> 
> Originally Posted by *alucardis666*
> 
> So anyone else's *Aorus NON- EXTREME* hitting 80+C while running Heaven in 4K extreme or am I just lucky?
> 
> This was done with the fans at 100% too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> On the Auto fan curve the card got to 88 and then killed Heaven... I don't get why it's soooo hot. On the upside, the card doesn't seem to throttle much; I'm not really seeing many perf caps, unlike my Founders or the TXp.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Something is VERY wrong there. I have the same card and run my fans at 100%, and I haven't seen temps go past 55C, and that's only under heavy benchmarking. While gaming I'm usually hanging between 45C and 50C. You need to take the cooler off, make sure all the thermal pads are seated correctly, and re-TIM the core.
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> After much experimenting with many BIOSes etc., what I need to do to stop voltages from jumping around is: Ctrl+D for the default curve, hit the check mark to apply, then hold Shift and drag the top line one point higher than the clock you want. So if you want, say, 2032, drag the level line (holding Shift) to 2041 core and hit the check mark to apply; it'll stay at 2032 after you do.
> 
> Next you drag the voltage point you want to 2041 core (for me it's 1.000v to 2041) and apply again; now, on a straight line from 1.000v to the right, it'll be at 2032. Lastly I smooth out the curve below 1.000v: 2025 at .993, 2012 at .981, and gradually lower.
> 
> By applying the core point one point higher than the core speed you want (like 2041 for a 2032 core speed), it doesn't jump higher.
> 
> Another example: if you put 2025 at 1.000v directly and hit apply, the voltage will jump to 1.062v at times for some reason.
> 
> But if you put it at 2032 at 1.000v and hit apply, it'll actually apply 2025 at 1.000v on a straight line to the right and NOT jump higher at all.
> 
> I might make a quick video about this trick.
> 
> I might make a quick video about this trick.
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> Thanks! I have noticed that. Didn't think about it when trying the curve with the lower voltage. Do you get any Perfcap Reason's in GPU-Z with that voltage?
Click to expand...

No, none at all; the graph was clear the entire time the benches were run.


----------



## mouacyk

Quote:


> Originally Posted by *Benny89*
> 
> *AORUS CARDS are poop
> 
> 
> 
> 
> 
> 
> 
> Don't buy Xtreme cards, I can confirm that problems with those are true:
> *
> I was thinking that I have good card, no problem- it could run Heaven, boosting nicely. NO.
> 
> First of all, my card started to artefacting in every game and Benchmark like crazy. Even with stock Memory settings it artefacting now everytime.
> 
> Whats more- I just can't start playing games, especially BF1. It crashes immidietly after map is loaded and I am suppose to chose class. I got error about DirectX 11 failure/something and that card has been physically removed. Which means drivers crash.... Even on stock settings not touching afterburner at all....
> 
> So my card is broken and I can't play anything with it... Superb. Didn't have all that problems with Asus Strix or my 980Ti. mad.gif
> 
> I don't know what is wrong with those cards but I recommend avoiding Aorus cards right now.
> 
> Super dissapointed considering price and waiting.
> 
> Every time I try to start game on stock settings, no even OC mode, anything:
> 
> 
> 
> And artefacts everywhere....


Ouch, $700 worth of frustration. You just don't expect that from an item of this value. How were the temps? Simple inadequate factory application of thermal pads or paste will do stuff like this; the problem, of course, is that you void the warranty trying to correct the issue yourself, if it can be fixed. Good luck.


----------



## alucardis666

So Newegg wants to charge a restocking fee... and the SuperBiiz card I bought will be here Monday...

My 2 Aorus cards are now on eBay. The TXp is going back in the mail tomorrow... And I just ordered 2 EVGA 1080 Ti FE's and another hybrid kit, and I'm going to mount the two hybrid rads in the front of my case and swap the H100i out for a Kraken X42, CPU temps should still be very manageable.

I just hope these FE cards clock well and have no issues.


----------



## RavageTheEarth

Quote:


> Originally Posted by *KedarWolf*
> 
> No, none at all, clear graph entire time benches were run.


Wow man, you are lucky! So I'm still having issues with this. After hitting Ctrl+D to default the curve, should I move the points below 1.043v up to match the line? I ask because when defaulted, the straight line starts at 1.043v. Should I move the points below 1.043v, all the way down to 1.012v, up to match the line BEFORE I hold Shift and move the entire curve up? I'm still a little confused; if you could make a video that would be great. I tried what you said, and when I launched The Witcher 3 it started at 1.043v @ 2038 when I wanted 1.012v @ 2025. Then after a minute it jumped up a bin to 1.05v. So I'm definitely doing something wrong. I'm in the process of trying this again, but I am still a little confused by what you mean by "hold shift and move it above the OC you want." When I look at your graph it is straight all the way across. Mine looked like this:


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> So Newegg wants to charge a restocking fee... and the SuperBiiz card I bought will be here Monday...
> 
> My 2 Aorus cards are now on eBay. The TXp is going back in the mail tomorrow... And I just ordered 2 EVGA 1080 Ti FE's and another hybrid kit, and I'm going to mount the two hybrid rads in the front of my case and swap the H100i out for a Kraken X42, CPU temps should still be very manageable.
> 
> I just hope these FE cards clock well and have no issues.


Man, I feel really bad for you and your experience. I hope it all works out.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> Man, I feel really bad for you and your experience. I hope it all works out.


Thanks man, it's been a journey these past few weeks. If I'd known I'd be back to the Ti, I would've kept my first EVGA FE, lol.


----------



## FattysGoneWild

Quote:


> Originally Posted by *SlimJ87D*
> 
> Man, I feel really bad for you and your experience. I hope it all works out.


Why? They have too much time/money and no idea what they want. It's good to see them take a hit, since they can't abuse the return policy Newegg has in place now.

Someone like that can never be pleased with anything they have or do.


----------



## Pandora's Box

I thought Newegg had a strict "replace only" return policy on video cards. I doubt he'll be able to return it for a refund.


----------



## outofmyheadyo

Quote:


> Originally Posted by *FattysGoneWild*
> 
> Why? They have too much time/money and no idea what they want. It's good to see them take a hit, since they can't abuse the return policy Newegg has in place now.
> 
> Someone like that can never be pleased with anything they have or do.


What's the new Newegg policy?


----------



## evosamurai

Is ASIC quality not a thing anymore? It says my card doesn't support it.


----------



## RavageTheEarth

Quote:


> Originally Posted by *evosamurai*
> 
> is asic quality not a thing anymore? says my card doesnt support it


Yep, no ASIC quality on Pascal.


----------



## Caos

Help with OC on my Strix? Do I need to unlock the voltage? MSI Afterburner does not display this option.


----------



## keikei

Quote:


> Originally Posted by *Pandora's Box*
> 
> I thought newegg had a strict "replace only" return policy on video cards. I doubt he'll be able to return it for a refund.


Correct.



Spoiler: Warning: Spoiler!


----------



## FattysGoneWild

Quote:


> Originally Posted by *outofmyheadyo*
> 
> What`s the new newegg policy ?


Actually they went even further: no returns on cards, period, except for replacements.

Return Policies
*Return for refund within: Non-refundable*
Return for replacement within: 30 days

It's great. People like him are forced to sell the cards, since they can't abuse the policy any more to get their money back after testing or playing with cards they have no intention of keeping.


----------



## Benny89

Quote:


> Originally Posted by *KraxKill*
> 
> What troubleshooting have you tried? Remove drivers? Reseat card? Do a clean install? What are your temps?


Drivers reinstall, AB reinstall, Aorus software uninstall. Temps were very good; 57C during benches was the max I saw on my card.

Also, I previously had a 1080 Ti Asus Strix, which I returned, and I did not have any problems with it whatsoever. My 980 Ti, which I have just installed again, also has no problems on the same drivers.

If you read the Newegg reviews on the Aorus Xtreme and the Aorus owners thread, you can see that this is a common problem. It was not just "bad luck" on my side.

If you read reviews of Asus, MSI, etc., you can see there are no such problems there.


----------



## RavageTheEarth

Quote:


> Originally Posted by *Caos*
> 
> Help with OC in my strix? I need to unlock the voltage? Msi afterburnet does not display this option


http://www.overclock.net/t/1625653/how-to-get-voltage-slider-in-afterburner-working-on-a-1080-ti/0_30


----------



## KraxKill

Quote:


> Originally Posted by *Benny89*
> 
> Drivers reinstall, AB reinstall, Aorus software uninstall. Temps were very good. 57C during benches were max I saw on my card.
> 
> Also I had before 1080 Ti Asus Strix which I returned and I did not have any problems with it whatsoever.
> 
> If you read reviews on Newegg about Aorus Xtreme and in Aorus Owners thread you can see that this is common problem. It was not just "bad luck" on my side.
> 
> If you read reviews on Asus, MSI etc you can see there are no such a problems there.


That sucks. Try to see what wattage, or at least TDP, the card is drawing before/when it crashes. Try to reseat the card; PCIe can be finicky sometimes.


----------



## Caos

Quote:


> Originally Posted by *RavageTheEarth*
> 
> http://www.overclock.net/t/1625653/how-to-get-voltage-slider-in-afterburner-working-on-a-1080-ti/0_30


Thank you very much... is it safe to increase the voltage?


----------



## evosamurai

Threw in the new card and ran it stock on air; first Time Spy test.


----------



## Benny89

Quote:


> Originally Posted by *KraxKill*
> 
> That sucks. Try to see what wattage or at least TDP the card is drawing before/when it crashes. Try to re-seat the card. PCIE can be finicky sometimes.


Already done that. TDP was OK; besides, I tried it at stock, max power limit, 110%, 120%, etc. PCIe was fine; I always double-checked that the card was seated well.

Also, artifacting at stock settings is just unacceptable on a $700 card...


----------



## KingEngineRevUp

Quote:


> Originally Posted by *GRABibus*
> 
> With OC strix Bios, power will be unlocked


Quote:


> Originally Posted by *FattysGoneWild*
> 
> Actually they went even further: no returns on cards, period, except for replacements.
> 
> Return Policies
> *Return for refund within: Non-refundable*
> Return for replacement within: 30 days
> 
> It's great. People like him are forced to sell the cards, since they can't abuse the policy any more to get their money back after testing or playing with cards they have no intention of keeping.


How is he abusing the replacement policy though? His card is faulty. It should be running in the high 60s, but it's running in the high 80s.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Caos*
> 
> Thank you very much ... nothing will happen if the voltage increases?


No man, I already answered your question a few pages back. Start reading the thread or at least use the search function.


----------



## evosamurai

Ran a Fire Strike; off to work I go. Sucks, I want to tinker with it, lol.


----------



## mshagg

Quote:


> Originally Posted by *Caos*
> 
> Thank you very much ... nothing will happen if the voltage increases?


If you look at the voltage curve (you can read about it in this thread), all the voltage slider does is allow the card to move further to the right along that curve.


----------



## Caos

Quote:


> Originally Posted by *SlimJ87D*
> 
> No man, I already answered your question a few pages back. Start reading the thread or at least use the search function.


Thanks man... I'm pretty sure it's the first time I've asked; sorry if it's not.

What is the average OC?


----------



## Caos

Quote:


> Originally Posted by *mshagg*
> 
> If you look at the voltage curve (you can read about it in here), all the voltage slider does is allows the card to move further to the right.


Yes I understand thank you


----------



## Benny89

Quote:


> Originally Posted by *SlimJ87D*
> 
> Weren't you the one I told to get a nice credit card? I have 90 days to return anything; the limit is $1K, I think. And if the vendor will not take it back, my credit card company will buy it off me for the price I paid.
> 
> I'll void warranty stickers because my CC company doesn't care. If the vendor doesn't honor their warranty, then my credit card company will pay for the repairs or buy me a new card.
> 
> It really baffles me why people don't take advantage of these free perks that come with credit cards.
> 
> Just pay off your credit card at the end of the year and you have no interest and no debt.


I think you misunderstood me. I said that I am glad I CAN RETURN ANYTHING within 14 days of purchase, just because of EU law. I don't need a credit card for that specific perk. However, the other perks are also cool; that is why I am looking for a nice CC here. I just don't need it for returns. 14 days is plenty to see if hardware works or suits me.

I can void warranty too and return it anyway







. But 90 days is awesome









That is why I am glad I don't need to buy from stores like Newegg.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> I think you misunderstood me. I said that I am glad I CAN RETURN ANYTHING within 14 days of purchase, just because of EU law
> 
> 
> 
> 
> 
> 
> 
> . I don't need credit card for that specific perk. However other perks are also cool.
> 
> I can void warranty too and return it anyway
> 
> 
> 
> 
> 
> 
> 
> .
> 
> That is why I am glad I don't buy from stores like Newegg.


I understood you clearly, and I'm just adding that no matter where you live, a CC will take care of all that for you.

14 days? How about 90 days? Someone can steal the GPU, or you can accidentally break it yourself, and still get a refund.

If the price drops $50 within 90 days, they'll give you that $50 back.

I know you don't need a CC, but I still recommend getting a free one that comes with some of these perks. There are a lot of free ones.


----------



## Addsome

Quote:


> Originally Posted by *SlimJ87D*
> 
> How is he abusing the replacement policy though? His card is faulty. It should be running in the high 60s, but it's running in the high 80s.


Well, he originally bought two EVGA FEs, returned them, got a Titan Xp, returned it, and now karma got him. You can't just keep abusing these policies; it ruins it for the rest of us.
Quote:


> Originally Posted by *alucardis666*
> 
> So Newegg wants to charge a restocking fee... and the SuperBiiz card I bought will be here Monday...
> 
> My 2 Aorus cards are now on eBay. The TXp is going back in the mail tomorrow... And I just ordered 2 EVGA 1080 Ti FE's and another hybrid kit, and I'm going to mount the two hybrid rads in the front of my case and swap the H100i out for a Kraken X42, CPU temps should still be very manageable.
> 
> I just hope these FE cards clock well and have no issues.


Please tell me you're going to tell the buyer that the one Aorus card has high temps, and not just sell it without mentioning that.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> That's what I'm trying to tell him, but I think he just wants the crown with the highest core clock lol. We're literally comparing 0.5 FPS difference here, and this is where ram and CPU can make a difference.
> 
> If I literally got 1 more FPS on my superposition I would get an extra 200 points. I just down clocked my CPU, lost 0.5 fps and lost like 100 points on the test.
> 
> To non-Turbo nerds, 0.5 FPS is a ridiculous subject to obsess about.
> 
> I need a 5775C! Just need to find a used one.


You'll love it when you do. Remember, though, nobody ever sells them used, lol; they are too good! And since hardly anyone knows that, people pay almost retail for used 4790Ks. The NOOBZ!


----------



## Slackaveli

Quote:


> Originally Posted by *Nico67*
> 
> Tell me about it, my 100hz X34 wasn't happy at 100hz and I didn't like 95hz so I just use 90hz lol
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It sounds better when its like 50% more than 60hz and not that it matter before as 980Ti couldn't push that anyway, must try 100hz again with this card and maybe try original DP cable


Yeah, I'm cool with 90 too; it sounds great, 50% more frames than 60. But the reason for the 59 lock is input lag: there is a tiny but measurable amount of input lag at 60 that's gone at 59. The same goes for 142 on a 144 Hz monitor. On yours, you need 98 locked: no input lag. Get your mind used to the idea of 98 locked. 98 fps at 98% GPU usage would make my OCD happy.
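The 59/98/142 caps are all "a couple frames under refresh." The arithmetic below shows the per-frame slack each cap buys; it's plain frame-time math under the assumption from the post above (that staying under the refresh rate is what keeps the sync path from adding lag), not a latency measurement.

```python
# Frame-time slack from capping just below the refresh rate. Keeping each
# frame slightly longer than the refresh interval means the GPU never
# queues a finished frame behind the next refresh.

def frame_ms(fps):
    """Duration of one frame in milliseconds at a given frame rate."""
    return 1000.0 / fps

def cap_headroom_ms(refresh_hz, cap_fps):
    """Extra milliseconds per frame the cap leaves versus the refresh interval."""
    return frame_ms(cap_fps) - frame_ms(refresh_hz)

for hz, cap in [(60, 59), (100, 98), (144, 142)]:
    print(f"{hz} Hz capped at {cap} fps: {cap_headroom_ms(hz, cap):.2f} ms of slack")
```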


----------



## KraxKill

Quote:


> Originally Posted by *Slackaveli*
> 
> you'll love it when you do. Remember , though, nobody ever sells them used, lol, they are too good! But since hardly anyone knows that they pay almost retail for used 4790ks, the NOOBZ!


I may sell my 4790K soon, but not until I find something significantly faster in gaming. Mine does [email protected], so it's hard to let go.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KraxKill*
> 
> I may sell my 4790K soon, but not until I find something significantly faster in gaming. Mine does [email protected], so it's hard to let go.


Everyone says not to go over 1.30v... I've been too scared to add voltage. I'm at [email protected], but I'm debating adding some more.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> For the first time, I have been CPU bottle necked.
> 
> BF1 at 1440P at 142 FPS (capped), I can see my CPU clocks hitting 80% and my gpu utilization is only at 94%


Notice how Battlefield runs us at like 85% power or less, too, NOT 125% like I wish it did. Makes for a very cool-temp experience, though. Time to up that resolution slider until you see 99% usage, tho.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> notice how battlefield runs us at like 85% or less power, too, NOT 125% like i wish it did. Makes for a very cool temp experience, though. Time to up that resolution slider until you see 99% usage, tho.


Well I'm at 142 FPS where I cap it. GPU usage is at 95%. So I don't think I can put the slider up that much, lol.

BF1 is pretty well optimized. BTW, add me on BF1, I don't know anyone! I usually just play TDM.

SlimJ87D is my ID.


----------



## Slackaveli

Quote:


> Originally Posted by *DStealth*
> 
> MSI Gaming X has on one of the screws heads...the stupidest part is you can unscrew it with pliers w/o using the head itself....but more interesting part is the two shunts for the 8-pin connectors are visible and easy moddable w/o taking down the cooler itself...don't know what exactly they're hiding behind this warranty sticker


hilarious! exposed shunts! What N()()BZ


----------



## Slackaveli

Quote:


> Originally Posted by *st0necold*
> 
> Bro congratulations!!!
> 
> How much better do the games look with the new 1080ti SLI setup? I haven't decided on upgrading yet-- are the graphics better?


Hell yeah, man: ultra everything, with AA too, and it still runs max frames. Can't look any better.


----------



## KraxKill

Quote:


> Originally Posted by *SlimJ87D*
> 
> Everyone says to not go over 1.30v... I've been too scared to add voltage. I'm at [email protected] but I'm debating about adding some more voltage.


I've been running [email protected] 1.328 for many moons now. Shoot, if it blows up tomorrow, I will have gotten my money's worth out of it. I think 1.4v is the danger limit, but even that is OK if you can keep the temps below 60C under load; 1.4v with temps in the 80C range is just asking for chip death.

Shoot, Silicon Lottery was binning 4790Ks for sale at 1.344v and 1.9v input, so you can definitely run more than 1.3v. I would also bump the I/O voltages a bit; that alone unlocked a few more bins for me.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Well I'm at 142 FPS where I cap it. GPU usage is at 95%. So I don't think I can put the slider up that much, lol.
> 
> BF1 is pretty well optimized. BTW, add me on BF1, I don't know anyone! I usually just play TDM.
> 
> SlimJ87D is my ID.


Bet, I'm KingSlackaveli. I'm at ultra, 142 locked, as well. See ya soon; I'll play TDM, DOM, Conquest, Rush, whatevs.


----------



## rakesh27

Hey,

Recently got an EVGA FE 1080 Ti and did the Kraken G10 mod on it. I noticed the card stays very cool compared to when I had the Kraken G10 on my Zotac AMP Edition 1080...

What I wanted to know is whether the Founders Edition cards are fairly good, as I have no problem overclocking to 2100+ with low temps. I had to do the Afterburner trick of editing the config files to get full voltage control; these cards seem to run well on hybrid water/air cooling.

What do you all think?


----------



## KraxKill

My shuntz....
Quote:


> Originally Posted by *Slackaveli*
> 
> hilarious! exposed shunts! What N()()BZ


Not mine! They have their own sarcophagus. Like a liquid metal filled jelly bean.


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> I may sell my 4790K soon, but not until I find something significantly faster in gaming. Mine does [email protected] so it's hard to let go.


yeah, I'm not even sure my cache could keep up with that. HOWEVER, if I were you...

Silicon Lottery sold 5.0 4790Ks for $450. They have been sold out for a while, but their 4.9s still sell for $400-ish. Now, a 5.1 would have been $475, and your guaranteed 5.2GHz @ 1.38v means you could list it on eBay for $500 and just see if it sells. When it does, that's damn near a new platform for you. In fact, it is easily whatever you want b/c you'd still have a mobo and RAM to sell. Do that and get an 1800X system if you do more than gaming, or a Skylake-X next month when they drop....

Here's my full system. Notice the line where it says this must be owned by a technical master LOL.

http://www.userbenchmark.com/UserRun/3449474

" Overall this PC is performing way above expectations (90th percentile). This means that out of 100 PCs with exactly the same components, 10 performed better. The overall PC percentile is the average of each of its individual components. This PC is likely operated by a technical master!"


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KraxKill*
> 
> My shuntz....
> Not mine! They have their own sarcophagus. Like a liquid metal filled jelly bean.


Aren't you scared of drawing more power from the mobo? There was that whole RX 480 fiasco that supposedly damaged a lot of mobos because the card's BIOS drew 85+ watts from the mobo.

A lot of guides have said that shunting the resistor by the BIOS won't do much anyway. Luckily I haven't gone over 380 watts in Time Spy or Superposition, so I don't even think I need the extra 400+ watts the shunt mod has added.

Nice sarcophagus though.


----------



## KraxKill

Quote:


> Originally Posted by *KraxKill*
> 
> My shuntz....
> Not mine! They have their own sarcophagus.


Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, I'm not even sure my cache could keep up with that. HOWEVER, if I were you...
> 
> Silicon Lottery sold 5.0 4790Ks for $450. They have been sold out for a while, but their 4.9s still sell for $400-ish. Now, a 5.1 would have been $475, and your guaranteed 5.2GHz @ 1.38v means you could list it on eBay for $500 and just see if it sells. When it does, that's damn near a new platform for you. In fact, it is easily whatever you want b/c you'd still have a mobo and RAM to sell. Do that and get an 1800X system if you do more than gaming, or a Skylake-X next month when they drop....


skylake-x

Not a bad idea.


----------



## Bishop07764

Here's the best that I've been able to do in 4K Optimized. Been so busy that I've barely had time to turn on my computer. I'm slamming into the power limit hard, all the time, on this benchmark, and downclocking pretty badly even though my temps never pass 34C so far on this bench. Mine won't hold 2063 core to save its life here; it spends a lot of time at 2050, down to even 2025.



Thanks Kedarwolf and the rest for all the work getting this organized. I'll try to submit my info here shortly.


----------



## KraxKill

Quote:


> Originally Posted by *SlimJ87D*
> 
> Aren't you scared of drawing more power from the mobo? There was that whole RX 480 fiasco that supposedly damaged a lot of mobos because the card's BIOS drew 85+ watts from the mobo.
> 
> A lot of guides have said that shunting the resistor by the BIOS won't do much anyway. Luckily I haven't gone over 380 watts in Time Spy or Superposition, so I don't even think I need the extra 400+ watts the shunt mod has added.
> 
> Nice sarcophagus though.


Scared? No. Concerned? Yes.

I start hitting TDP at 2063-2075 without the shunt, and my card seems to boost to 2100+ solid with it. It says I'm at 94% TDP, which could be an obscene number of watts through the mobo, but hopefully ASUS did their homework and over-engineered a bit. So far so good, and the power cables are cool.
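For anyone doing the math on what "94% TDP" means after a shunt mod: the card computes power from the voltage drop across its shunt resistors, so lowering the effective shunt resistance makes it under-report. A rough sketch of the correction; the halved-resistance factor and the 250 W limit are assumptions for illustration, not measured values:

```python
# Rough estimate of actual board power after a shunt mod.
# Assumption (illustrative): the mod halves the effective shunt
# resistance, so the card reports half the real current.

def actual_power(reported_pct_tdp, tdp_watts, resistance_ratio=0.5):
    """reported_pct_tdp: what the card claims (e.g. 94 for 94% TDP).
    resistance_ratio: new shunt resistance / original (0.5 = halved).
    Reported power scales with shunt resistance, so divide it out."""
    reported_watts = reported_pct_tdp / 100 * tdp_watts
    return reported_watts / resistance_ratio

# "94% TDP" on an assumed 250 W limit with a halved shunt:
print(round(actual_power(94, 250)))  # -> 470 (watts, estimated)
```

Under that assumption, a reading of 94% would really be around 470 W, which is why keeping an eye on cable and slot temperatures is sensible.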


----------



## Benny89

Quote:


> Originally Posted by *KraxKill*
> 
> Scared? No. Concerned? Yes
> 
> I start hitting TDP at 2063-2075 without the shunt and my card seems to boost to 2100+ solid with it. Says i'm at 94% TDP which could be an obscene amount of watts through the mobo but hopefully ASUS did their homework and over engineered a bit. So far so good and the power cables are cool.


You hit that at stock voltage or at higher? Did you use V slider or Voltage curve?

FE or STRIX?


----------



## KraxKill

Quote:


> Originally Posted by *Benny89*
> 
> You hit that at stock voltage or at higher? Did you use V slider or Voltage curve?
> 
> FE or STRIX?


With the FE, at anything over 1.050v; 1.063v if I remember correctly, so with the voltage slider untouched. I have to bump to 1.07v for 2075, and I'm at 1.093v at 2100+, the end of the line so to speak. It's a matter of cooling from there.


----------



## rakesh27

It is possible, as I'm at 2100 @ 1.093v. Only thing is you will probably have to put it under water or a hybrid kit... at least that way the card will stay cool even at higher clocks.


----------



## noles1983

Patiently waiting to trade in my 1070 Sea Hawk EK X for the 1080 Ti variant, if MSI ever decides to release it...


----------



## gmpotu

Quote:


> Originally Posted by *SlimJ87D*
> 
> He shunt modded a card that doesn't need the shunt mod. His card could already access 375 watts iirc if it needed to, and he shunt modded it. So it could have access 500 watts and blew up.


Oh wow. Ouch. Is there a tool to see how many watts the GPU is pulling by itself (not including the CPU, etc.)?
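One option that needs no extra hardware: `nvidia-smi`, which ships with the NVIDIA driver, reports board power per GPU, separate from the CPU. A minimal sketch of parsing its CSV output; the sample reading below is made up for illustration:

```python
# Per-GPU board power via nvidia-smi (installed with the NVIDIA driver):
#   nvidia-smi --query-gpu=index,power.draw --format=csv,noheader,nounits
# The sketch parses that CSV form; the sample text is illustrative.

def parse_power(csv_text):
    """Return {gpu_index: watts} from nvidia-smi CSV output."""
    readings = {}
    for line in csv_text.strip().splitlines():
        idx, watts = (field.strip() for field in line.split(","))
        readings[int(idx)] = float(watts)
    return readings

sample = "0, 281.42\n1, 17.03\n"  # e.g. GPU 0 under load, GPU 1 idle
print(parse_power(sample))  # {0: 281.42, 1: 17.03}
```

Run the `nvidia-smi` command in a loop (`-l 1`) while benchmarking to watch the draw live.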


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RavageTheEarth*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Clukos*
> 
> Yup voltage curve looks like this:
> 
> 
> 
> 
> 
> I just tried this and after about 10 minutes of playing Tom Clancy's Wildlands I saw that my voltage had jumped up to 1.062v from 1.012v. Running a core clock of 2025. Not sure why it jumped up 5 bins like that. Before this I was running 2063 @ 1.062v. Looks like my card really likes 1.062v. BTW TC Wildlands is VERY CPU dependent. More so than any of the other games I play.
> 
> 
> After much experimenting with many BIOSes etc.: what I need to do to stop voltages from jumping around is Ctrl+D for the default curve, hit the check mark to apply, then hold Shift and put the top-most line one point higher than the clock you want. So if you want, say, 2032, drag the level line (holding Shift) to 2041 core and hit the check mark to apply; it'll stay at 2032 after you do.
> 
> Next you drag the voltage point you want to 2041 core, so for me it's 1.000v to 2041, and apply again; now, from 1.000v to the right, the straight line will be at 2032. Lastly I smooth out the curve below 1.000v: 2025 at .993, 2012 at .981, and gradually lower.
> 
> By applying the core point one point higher than the core speed you want, like 2041 for a 2032 core speed, it doesn't jump higher.
> 
> Another example: if you put, say, 2025 at 1.000v directly and hit apply, the voltage will jump to 1.062v at times for some reason.
> 
> But if you put it at 2032 at 1.000v and hit apply, it'll actually apply 2025 at 1.000v on a straight line to the right and NOT jump higher at all.
> 
> I might make a quick video about this trick.

I take this all back.

While it's an ideal way to set your voltage curve, after some more testing, some voltages and clock points jump to 1.062v no matter how you apply them. 2012 core at .993v I think did it; 2000 at .950v definitely did it.

I think whether we hit this bug depends on the actual point in the curve.
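The trick in the quoted post boils down to: apply the bin one step above the clock you actually want, because the applied point settles one bin lower. A toy sketch, using only the bin clocks mentioned in the post above (real cards step in roughly 12-13 MHz increments that vary chip to chip):

```python
# Toy model of the "one point higher" Afterburner curve trick.
# BINS lists example boost bins taken from the post; real cards differ.
BINS = [1987, 2000, 2012, 2025, 2032, 2041, 2050]

def point_to_apply(target_mhz):
    """Drag the curve point to the bin one step above the target,
    so the card settles at the target instead of jumping around."""
    return BINS[BINS.index(target_mhz) + 1]

print(point_to_apply(2032))  # -> 2041, matching the 2032-via-2041 example
```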


----------



## madmeatballs

Quote:


> Originally Posted by *Benny89*
> 
> 
> 
> 
> 
> 
> *AORUS CARDS are poop
> 
> 
> 
> 
> 
> 
> 
> Don't buy Xtreme cards, I can confirm that the problems with those are true:
> *
> OK, these cards are ******! I was thinking that I had a good card, no problems: it could run Heaven, boosting nicely. NO.
> 
> First of all, my card started artifacting like crazy in every game and benchmark. Even with stock memory settings it artifacts every time now.
> 
> What's more, I just can't start playing games, especially BF1. It crashes immediately after the map is loaded, when I am supposed to choose a class. I get an error about a DirectX 11 failure and the card having been physically removed, which means a driver crash... even on stock settings, not touching Afterburner at all...
> 
> So my card is broken and I can't play anything with it... Superb. Didn't have any of these problems with the Asus Strix or my 980 Ti.
> 
> I don't know what is wrong with those cards, but I recommend avoiding Aorus cards right now.
> 
> Super disappointed considering the price and the wait.
> 
> Every time I try to start a game on stock settings, not even OC mode, anything:
> 
> And artifacts everywhere....


In what games does this happen for you? I get this too in Mass Effect Andromeda, but that game has a memory-error issue acknowledged by Nvidia. It happens to me randomly or when I have an aggressive OC.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *GRABibus*
> 
> With OC strix Bios, power will be unlocked
> 
> 
> 
> No, it will not be unlocked. Sorry to disappoint you, but 4 of us have already measured the power draw from the Asus BIOS and it doesn't give you 30 watts. Bloodhawk, Nico, KraxKill and I have all measured the power output of the Asus BIOS, and at most it seems to give you 10 watts.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Second, 30 watts is not enough, you need 75-80 more watts to not be power limited. So even if the ASUS bios gave you 30 watts, which I along with 3 others am skeptical about, it would only stop you from hitting power limits sometimes. You'll never be able to run 1.075-1.093 Volts without the shunt mod.
> 
> Stock bios with shunt mod, accessing 75-100 watts of more power.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> KedarWolf needs to do his test with the stock BIOS and Asus bios and read his PSU draw. His nvidia power command prompt is with his physx card still connected and it's giving him the power draw for both cards. If he did a furmark with the Asus and stock BIOS and measured using corsair link and PSU, he'd make the same discovery the 4 of us have already made.
> 
> Edit: I attached the wrong last image, I'll post the right one later.
Click to expand...

The power draw in the Nvidia command is shown separately for each card, not combined.

And I never used Furmark; I used a maxed-out custom Time Spy test to get the voltages I've shown on my 1080 Ti.

Even Furmark doesn't draw that kind of power.


----------



## Benny89

Quote:


> Originally Posted by *madmeatballs*
> 
> What game does this happen to you? I get this too on Mass effect andromeda. But that game has an issue acknowledged by Nvidia about memory errors. It happens to me randomly or when I have an aggressive oc.


All games I have currently installed- ME:A, BF1, Witcher 3, Dishonored 2, AC: Syndicate.

This doesn't happen randomly. It just happens all the time. There is something wrong with those cards from Aorus. Too many people have these problems. I could understand it doing that after some heavy OC, voltage, etc. But on stock settings? No freaking way.

Not to mention the artifacting. This card is simply broken on too many levels.


----------



## KedarWolf

I might try the shunt mod on my vertically mounted card, with nail lacquer around it to stop any CLU from running, and maybe use Conductonaut, which peeps say has a thicker consistency. If I've gotten 2100 at 1.075v without it, imagine what I could do with it.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> I might try the shunt mod on my vertically mounted card, with nail lacquer around it to stop any CLU from running, and maybe use Conductonaut, which peeps say has a thicker consistency. If I've gotten 2100 at 1.075v without it, imagine what I could do with it.


Yeah, I recommend it. No more downclocking, downvolting, or power limits to worry about.

But be ready: you'll be drawing around 100 more watts, which is roughly 33% more heat. So you might have to run the fans on your pumps higher.

If you're at 45C now, you'll be at around 58C.
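That guess can be sanity-checked with a simple steady-state model: at a fixed thermal resistance, the temperature rise above ambient scales linearly with dissipated power. A sketch; the 25C ambient and 300 W baseline are assumptions, and this cruder model lands a bit lower than the 58C estimate above because it scales the rise over ambient rather than the absolute temperature:

```python
# Back-of-envelope steady-state temperature after adding power.
# Model: delta-T above ambient scales linearly with dissipated watts
# (constant thermal resistance). Ambient of 25 C is an assumption.

def new_temp(old_temp_c, old_watts, new_watts, ambient_c=25.0):
    r_thermal = (old_temp_c - ambient_c) / old_watts  # degrees C per watt
    return ambient_c + r_thermal * new_watts

# Assumed: 45 C at 300 W, then +100 W from the shunt mod:
print(round(new_temp(45, 300, 400), 1))  # -> 51.7
```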


----------



## alucardis666

For anyone curious what the difference is between TXp and Ti on Ryzen.

*Ti*



*TXp*


----------



## gmpotu

Quote:


> Originally Posted by *SlimJ87D*
> 
> KedarWolf needs to do his test with the stock BIOS and Asus bios and read his PSU draw. His nvidia power command prompt is with his physx card still connected and it's giving him the power draw for both cards. If he did a furmark with the Asus and stock BIOS and measured using corsair link and PSU, he'd make the same discovery the 4 of us have already made.
> 
> Edit: I attached the wrong last image, I'll post the right one later.


Is Corsair Link a tool I can download and install to measure the power draw from my PSU? Does it separate the draw that your CPU and other components are pulling?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> For anyone curious what the difference is between TXp and Ti on Ryzen.
> 
> *Ti*
> 
> 
> 
> *TXp*


Wow, that's like $100 per 0.75% of extra performance.
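The quip above, made explicit; both inputs are assumptions (roughly a $500 street-price gap between a Ti and a Titan Xp, and a gain in the low single digits):

```python
# Dollars per percent of extra performance; both inputs are assumptions
# (~$500 price gap between Ti and Titan Xp, ~3.75% more performance).
def dollars_per_percent(price_gap_usd, perf_gain_pct):
    return price_gap_usd / perf_gain_pct

print(round(dollars_per_percent(500, 3.75), 2))  # -> 133.33, i.e. ~$100 per 0.75%
```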


----------



## KingEngineRevUp

Quote:


> Originally Posted by *gmpotu*
> 
> Is corsair link a tool i can download and install to measure the power draw from my PSU? Does it separate the draw that your CPU and other components are drawing?


You need a Corsair power supply that is capable of reading incoming power from the wall and outgoing power to your components.

So no, more than likely you will not be able to if you don't have a Corsair Link-compatible PSU.


----------



## FuriousReload

I have gotten lucky with my curve lately; my Ti is running 2151 99% of the time, with a small dip to 2126 every few minutes. At 1.0v I get 2076 stable, and at 1.031v I am at 2151.

http://www.3dmark.com/spy/1596413
11071 Graphics score!


----------



## mshagg

Quote:


> Originally Posted by *Benny89*
> 
> All games I have currently installed- ME:A, BF1, Witcher 3, Dishonored 2, AC: Syndicate.
> 
> This doesn't happen randomly. It just happens all the time. There is something wrong with those cards from Aorus. Too many people have this problems. I could understand that it does that after some heavy OC, voltage etc. But on stock settings? No freaking way.
> 
> Not to mention artefacting. This card is simply broken on too many levels.


Have you updated the BIOS? I thought it was strange that GA had pushed 2 BIOS updates already, perhaps it was because the one the card shipped with is a dog?


----------



## Dasboogieman

Quote:


> Originally Posted by *reflex75*
> 
> Thank you for the tip.
> But removing the cooler will void the warranty...
> And if it was that simple, why manufacturers do nothing about it?


Why don't they universally use DrMOS MOSFETs? Why don't AIBs all use vapor chambers? It's down to cost and effort vs. reward. At the end of the day, your average run-of-the-mill person only runs the card at 60 FPS, so the potential for coil whine is very low. I suspect that glue you see on the PCS card I linked has to be applied by hand (which is time-consuming and costly), which is why it makes less sense from a mass-production perspective.


----------



## Luckbad

Just got in a Zotac 1080 Ti Amp Extreme.

I think this one is going to be a great one.

Stock boost: 2037.5 MHz


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> My shuntz....
> Not mine! They have their own sarcophagus. Like a liquid metal filled jelly bean.


those look awesome.


----------



## Benny89

Quote:


> Originally Posted by *mshagg*
> 
> Have you updated the BIOS? I thought it was strange that GA had pushed 2 BIOS updates already, perhaps it was because the one the card shipped with is a dog?


Nah, it's going back anyway. The artifacting is something a BIOS update probably won't fix, and I have no respect for a company that can't give me a card that simply works out of the box. This is ridiculous.


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> skylake-x
> 
> Not a bad idea.


ikr, I'm gonna want one!
Quote:


> Originally Posted by *gmpotu*
> 
> Is corsair link a tool i can download and install to measure the power draw from my PSU? Does it separate the draw that your CPU and other components are drawing?


HWiNFO64 should do the trick.
Quote:


> Originally Posted by *FuriousReload*
> 
> I have got lucky with my curve lately, my Ti is running 2151 99% of the time, every few minutes I get a small dip to 2126. at 1.0V I get 2076 stable, and 1.031 I am at 2151.
> 
> http://www.3dmark.com/spy/1596413
> 11071 Graphics score!


that is an absurd card, bro!


----------



## FuriousReload

Well, got an 11,132 graphics score from 2126 and +800 memory. I was thinking shunt mod, but it already does so well. The only mods I have done to the card are adding the EVGA Hybrid pump, 10 heatsinks around the chip/pump, and extending the pump and fan cables to PWM headers on the mobo; that last one seemed to help with the power cap, though I don't have evidence to prove it.

Final Time Spy run: http://www.3dmark.com/spy/1596558

Final TimeSpy run: http://www.3dmark.com/spy/1596558


----------



## CoreyL4

So I unlocked voltage control and used the curve. I managed to get 2025-2012 stable. However, my memory was not able to increase at all; I am stuck at +350. +400 will give me slight artifacts.


----------



## Slackaveli

Quote:


> Originally Posted by *FuriousReload*
> 
> I have got lucky with my curve lately, my Ti is running 2151 99% of the time, every few minutes I get a small dip to 2126. at 1.0V I get 2076 stable, and 1.031 I am at 2151.
> 
> http://www.3dmark.com/spy/1596413
> 11071 Graphics score!


that is an absurd card, bro!
Quote:


> Originally Posted by *CoreyL4*
> 
> So I unlocked voltage control and used the curve. I managed to get 2025-2012 stable. However my memory was not able to increase at all. I am stuck at 350. 400 will give me slight artefacts.


I artifact at +480 myself. These mems are pretty different chip to chip. I just run 60 under the artifact point; seems to give a better Supo score, too.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *FuriousReload*
> 
> I have got lucky with my curve lately, my Ti is running 2151 99% of the time, every few minutes I get a small dip to 2126. at 1.0V I get 2076 stable, and 1.031 I am at 2151.
> 
> http://www.3dmark.com/spy/1596413
> 11071 Graphics score!


I think you're voltage limited or something is off, because at 2114 MHz KraxKill scored 11,118.

http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/6090#post_26029142


----------



## KingEngineRevUp

Quote:


> Originally Posted by *FuriousReload*
> 
> Well, got a 11,132 graphics score from 2126 and +800 memory. I was thinking shunt mod, but it already does so well. The only mods I have done to the card is adding the EVGA Hybrid pump, 10 heatsinks on around the chip/pump and one other mod I did was extending the pump and fan cables to PWM headers on the mobo, seemed to help with power cap, don't have evidence to prove that though.
> 
> Final TimeSpy run: http://www.3dmark.com/spy/1596558


You should show GPU-Z sensors.


----------



## FuriousReload

Quote:


> Originally Posted by *SlimJ87D*
> 
> You should show gpu-z sensors




Here we go, power limited indeed.

http://www.3dmark.com/spy/1596675


----------



## KingEngineRevUp

Quote:


> Originally Posted by *FuriousReload*
> 
> 
> 
> Here we go, power limited indeed.
> 
> http://www.3dmark.com/spy/1596675


You are a candidate for the shunt mod.


----------



## FuriousReload

Quote:


> Originally Posted by *SlimJ87D*
> 
> You are a candidate for the shunt mod.


I'll keep that in mind, probably until I get bored in a week using it as it is now.


----------



## Fieldsweeper

So I was able to get both games from the deal lol. I had taken my 1080ti FE back to get the STRIX OC version and was given another code lolol


----------



## CoreyL4

Why do my voltage and core clock go all over the place during the Superposition benchmark? I have my core clock set with the curve.


----------



## Bishop07764

Quote:


> Originally Posted by *CoreyL4*
> 
> Why do my voltage and core clock go all over the place during the Superposition benchmark? I have my core clock set with the curve.


That would likely be the power limit. My clocks definitely fluctuate in that benchmark too, even with my GPU at 33-34C.


----------



## DStealth

Quote:


> Originally Posted by *gmpotu*
> 
> Oh wow. Ouch. Is there a tool to see how many watts the GPU is pulling by itself (not including the CPU etc.)


HWiNFO or the Nvidia tool here - 
As you can see my card has a 330W peak, not 375W, and without the shunt it throttled like crazy in the 2088/2076 area with stock 1.062v...

Just to clarify: my card had a defective part, as it didn't blow up while benching with [email protected]+, but with ~300W [email protected], aggressively custom fan cooled (57-62C), while the first two-hour gameplay session was applied to it.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> Wow, that's like $100 per 0.75% of extra performance.


Yea, both my cards were kinda crappy OCers. The TXp def ain't worth it unless you plan to SLI.

But now that I've got that outta my system, literally and metaphorically... we'll see how the FEs do when I get them.









In other news, anyone wanna buy a used 6950X? OCs to 4.4 with 1.45v and can do a daily 4.2 @ 1.25v.


----------



## RavageTheEarth

Quote:


> Originally Posted by *KedarWolf*
> 
> I take this all back.
> 
> While it's an ideal way to set your voltage curve after some more testing some voltages and clock points have the jump to 1.062v no matter how you apply them. 2012 core at .993v v I think did it, 2000 at .950v definitely did it.
> 
> I think it depends on actual point in the curve if we have this bug.


I really wish the voltage bar in Afterburner allowed you to undervolt the card! Let me know what you figure out. I'm waiting on the block to be released for my Aorus so I won't have to worry about temps, although the hottest the card ever got was 55C under heavy benching; while gaming I'm usually between 40-50C. I'm just not sure if it's temp related or not.


----------



## Luckbad

I got two 1080 Tis in today: Zotac Amp Extreme and Asus ROG Strix OC

The Zotac stock boosts to 2037.5. The Asus in its OC mode boosts to 1920 something and crashes in benchmarks.

Welp.


----------



## Slackaveli

Quote:


> Originally Posted by *Luckbad*
> 
> I got two 1080 Tis in today: Zotac Amp Extreme and Asus ROG Strix OC
> 
> The Zotac stock boosts to 2037.5. The Asus in its OC mode boosts to 1920 something and crashes in benchmarks.
> 
> Welp.


dat lottery is real. my new and forevermore GPU method will be to buy my favorite 3 GPUs, test them all, and keep the golden one (unless I'm very unlucky), or whichever one I like best. Screw all this "lottery" business.

Thankfully my Aorus was one of their best chips, but their gauntlet sucks so bad it didn't even make it into an Xtreme version lol.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *CoreyL4*
> 
> Why do my voltage and core clock go all over the place during the Superposition benchmark? I have my core clock set with the curve.


Power limit. Superposition is a stressful test and requires a lot of power.
Quote:


> Originally Posted by *RavageTheEarth*
> 
> I really wish that the voltage bar in afterburner allowed you to under-volt the card! Let me know what you figure out. I'm waiting on the block to be released for my Aorus so I won't have to worry about temps although the hottest the card ever got was 55C under heavy benching, but while gaming I'm usually between 40-50C. I'm just not sure off its temp related or not.


I'm using the stock BIOS. I don't have this issue when I undervolt.

I even did a 1.000 volt run at 2088-2075 not too long ago.



With power limiting it would drop one bin below 1.000v.


----------



## Slackaveli

DON'T GET TOO COMFORTABLE, MEN!!!***

http://wccftech.com/nvidia-geforce-20-volta-graphics-card-q3-2017/


----------



## JedixJarf

Quote:


> Originally Posted by *Slackaveli*
> 
> DON'T GET TOO COMFORTABLE, MEN!!!***
> 
> http://wccftech.com/nvidia-geforce-20-volta-graphics-card-q3-2017/


Did you just assume my gender


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> DON'T GET TOO COMFORTABLE, MEN!!!***
> 
> http://wccftech.com/nvidia-geforce-20-volta-graphics-card-q3-2017/


Dude seriously? **** Nvidia if this is true. JayzTwoCents was right. Lol.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> Dude seriously? **** Nvidia if this is true. JayzTwoCents was right. Lol.


I'll take 2!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> I'll take 2!


You're a freak man, you buy cards left and right. Must be quite wealthy.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> You're a freak man, you buy cards left and right. Must be quite wealthy.


Lol not really. But if you sell at the right times and plan it out, the cost isn't too much different. Say they launch it at 700 and drop the price of the Ti to 550 or so. If I sell my Tis for 600 just before launch, it's about ~150 to trade up per card after eBay/PayPal and shipping costs. Not too bad really. And if this card does all it's rumored to do, I might be fine with just 1.

Plus I did just sell my *6950X* and my old board too, and I'll be getting my refund for the *TXp* soon. So there's that.
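The trade-up math above, spelled out; the fee rate and shipping cost are assumptions (roughly 13% combined eBay + PayPal fees and ~$15 shipping), not actual marketplace figures:

```python
# Net cost to trade up: sell the old card, buy the new one.
# fee_rate and shipping are assumptions, not actual eBay/PayPal figures.
def trade_up_cost(sale_price, new_card_price, fee_rate=0.13, shipping=15):
    net_from_sale = sale_price * (1 - fee_rate) - shipping
    return new_card_price - net_from_sale

# Sell a Ti at $600 just before launch, buy the new card at $700:
print(round(trade_up_cost(600, 700)))  # -> 193
```

With softer fee assumptions the number drifts toward the ~$150 figure in the post.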









It's all about budgeting and knowing how to "recycle" cash flow/reshift your assets.

I'm 27, and aside from my car payment/insurance/phone/food/ student loans/rent I don't have too many expenses. And I don't really spend my money on anything outside those other than tech *mainly my gaming/workstation rig* and entertainment/girlfriend.

I make about $45K a year.

You wanna talk about wealthy check this guy out... He posts here too!

He's got *4 TXps & 4 1080Tis'*...


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alucardis666*
> 
> Lol not really. But if you sell at the right times and plan it out the cost isn't too much different. Say they launch it at 700 and drop the price of the Ti to 550 or so. If I sell my Tis' for 600 just before launch, it's about ~150 to trade up per card after ebay/paypal and shipping costs. Not too bad really. And if this card will do all they're rumored to do I might be fine with just 1.
> 
> Plus I did just sell my *6950X* and my old board too, and I'll be getting my refund for the *TXp* soon. So there's that.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's all about budgeting and knowing how to "recycle" cash flow/reshift your assets.
> 
> I'm 27, and aside from my car payment/insurance/phone/food/ student loans/rent I don't have too many expenses. And I don't really spend my money on anything outside those other than tech *mainly my gaming/workstation rig* and entertainment/girlfriend.
> 
> I make about $45K a year.
> 
> You wanna talk about wealthy check this guy out... He posts here too!
> 
> He's got *4 TXps & 4 1080Tis'*...


Yeah, I used to be able to do all that until I got a mortgage and a wife. Then we started keeping track of one another's spending habits.









But I'll still sell my stuff to get other things.


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, I used to be able to do all that until I got a mortgage and a wife. Then we started keeping track of one another spending habits
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But I'll still sell my stuff to get other things.


Yea. Not ready to settle down just yet. Still got a little recklessness in me.


----------



## outofmyheadyo

Quote:


> Originally Posted by *FuriousReload*
> 
> 
> 
> Here we go, power limited indeed.
> 
> http://www.3dmark.com/spy/1596675


Did you lock the voltage curve?


----------



## eXteR

Quote:


> Originally Posted by *Luckbad*
> 
> Just got in a Zotac 1080 Ti Amp Extreme.
> 
> I think this one is going to be a great one.
> 
> Stock boost: 2037.5 MHz


Can you dump that bios so we can try it on founder cards?

Sent from my SM-P550 via Tapatalk


----------



## HyperC

Getting better


----------



## KingEngineRevUp

Quote:


> Originally Posted by *eXteR*
> 
> Can you dump that bios so we can try it on founder cards?
> 
> Sent from my SM-P550 via Tapatalk


Should we even attempt to flash it? This partner card seems to have a few more components that might not play well with the FE.

The power boost chips for example.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *HyperC*
> 
> Getting better


Damn, +720 on the memory that's intense lol


----------



## madmeatballs

Quote:


> Originally Posted by *SlimJ87D*
> 
> Damn, +720 on the memory that's intense lol


His FE won the lottery I guess. lol!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *madmeatballs*
> 
> His FE won the lottery I guess. lol!


I can still bench at 2152 at least, just need a 5775C for better minimum framerates.


----------



## DStealth

Quote:


> Originally Posted by *SlimJ87D*
> 
> I can still bench at 2152 at least, just need a 5775C for better minimum framerates.


No you can't. Stop throwing out frequencies based on the curve OC. Your results are in the 2076/2088 GPU range...


----------



## Clukos

2100 core +775 mem



I'm less power limited in the 1080p extreme test, I think my card can go up to 2130+ on the core with water + shunt mod.


----------



## KraxKill

With mem that high, are your scores actually higher throughout the rest of the benchmarks? I.e., +450 vs +750. I ask because I found that even though I could clock my mem into the stratosphere, the scores actually declined after +550 or so.


----------



## Clukos

Quote:


> Originally Posted by *KraxKill*
> 
> With mem that high, are your scores actually higher throughout the rest of the benchmarks? I.e., +450 vs +750. I ask because I found that even though I could clock my mem into the stratosphere, the scores actually declined after +550 or so.


I lose about 200 points with mem at 550, so I guess mine does scale up to 775, or close enough.
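Since GDDR5X memory can retry on errors, performance can start dropping before any visible artifacts appear, which is why the sweep both posts describe is worth scoring rather than eyeballing. A sketch of that sweep; `run_benchmark` is a hypothetical stand-in, stubbed here with scores shaped like the ones reported above:

```python
# Pick the memory offset with the best benchmark score, not the highest
# stable offset. run_benchmark is hypothetical; the stub below fakes it.
def best_offset(offsets, run_benchmark):
    scores = {off: run_benchmark(off) for off in offsets}
    return max(scores, key=scores.get)

fake_scores = {450: 10900, 550: 10932, 650: 11050, 750: 11132, 850: 10800}
offset = best_offset(fake_scores.keys(), fake_scores.get)
print(offset)  # -> 750
```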


----------



## evosamurai

So I bought the warranty through Micro Center that protects against overclock damage, so I should be able to shunt mod my card and, if it fails, clean it up and return it? That's in a perfect world.


----------



## HyperC

Quote:


> Originally Posted by *SlimJ87D*
> 
> Damn, +720 on the memory that's intense lol


I can go higher on the memory, I think 800 or 840.
Quote:


> Originally Posted by *madmeatballs*
> 
> His FE won the lottery I guess. lol!


Yeah, I got that Titan Xp score for only an extra $13.









Quote:


> Originally Posted by *SlimJ87D*
> 
> I can still bench at 2152 at least, just need a 5775C for better minimum framerates.


Damn, that's a pretty high core clock. I know if I go +190 without any extra voltage the benchmark crashes. Are you using the curve?


----------



## eXteR

Quote:


> Originally Posted by *SlimJ87D*
> 
> Should we even attempt to flash it? This partner card seems to have a few more components that might not play well with the FE.
> 
> The power boost chips for example.


Didn't remember those boost chips. Probably not a good idea to flash that BIOS.

My card is not golden: only 2050 with voltage maxed out, and 2020 with 1.043v, all with the EVGA Hybrid pump.

Superposition on 4K Optimized is heavily power limited, clocks bouncing all the time. I don't want to shunt mod, so I'm hoping for a magic BIOS to unlock a little more power.

Also, for now the pump is connected to the GPU, because I'm waiting for a mini GPU-fan-to-PWM adapter so I can plug it directly into the mobo. I suppose this is eating some power, because the GEFORCE GTX LED is now barely visible, like it doesn't receive enough power.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *DStealth*
> 
> No, you can't stop the frequencies bouncing around with a curve OC. Your results are in the 2076/2088 GPU range...


Quit the epeen, bro, people have different systems. My card's score is literally growing linearly as I add core clock. Feel free to look through my images at my other benchmarks at different core clocks. If what you were saying were true, then my scores wouldn't keep growing linearly as I add more to my core. So quit it with the BS just because you don't have a good card anymore.

You're literally upset over 0.5 fps and it's ridiculous.


----------



## DStealth

As I've written many times... I'm not upset; don't worry about me, I'll find a good card.
Quote:


> Originally Posted by *SlimJ87D*
> 
> *Unlocking the FE's true potential
> *
> After some light benchmarks, I can easily hit 2152 Mhz @ 1.093v with no limiting.


Your result is not at 2152 core speed. Your system with [email protected] cannot keep up in the 4K Optimized Superposition benchmark, not even within 0.5fps of other systems. The 10500 mark is where non-throttling cards in the 2050-2076MHz range land, and you are below it, which is why we need to discuss whether you have a "real" 2152MHz core frequency or not.
You can go back and see my score at 2063MHz core speed... here it is if you're lazy... and that was prior to the shunt mod










Edit: HyperC has 10632 with 2075MHz a few posts above as well.

Edit: I'm not upset about a 0.5fps difference; just face it, you're not running 2152MHz core speed. Actually you have 1.5fps less, a 2.3% difference, while claiming a 50MHz higher core speed.


----------



## gmpotu

Can someone look at this picture for me and explain what it means? I read the guide and made an attempt at a curve with these settings...
1.012v = 2038 / 1.050v = 2050 / 1.093v = 2062 / 1.100v+ = 2075
I set this curve, then opened and ran FurMark @ 1920x1080 stress test to put a load on my GPU, but it's saying I need more power? What does that mean, and is there anything I can do so it will bring my voltage above 0.993v like you see in my picture?



Also, can someone tell me how to zoom in or view the posted pictures full screen? I can't zoom in to see the clocks you guys have set because the pictures look too small in Chrome.


----------



## Nico67

Quote:


> Originally Posted by *gmpotu*
> 
> Can someone look at this picture for me and explain to me what this means? I read the guide and did an attempt at making a curve with these settings...
> 1.012v = 2038 / 1.050v = 2050 / 1.093v = 2062 / 1.100v+ = 2075
> Set this curve and opened and ran Furmark @ 1920 x 1080 stress test to put load on my GPU but it's saying I need more power? What does that mean or what can I do if anything so it will bring my voltage above 0.993v like you see in my picture?
> 
> 
> 
> Also, can someone tell me how to zoom in or view the posted pictures full screen? I can't zoom in to see the clocks you guys have set because the pictures look too small in Chrome.


You are full-on power limiting; that's what all the green is in GPU-Z. With the curve open, hit CTRL+D, then raise the 1.050v point to 2050 and hit apply. It should raise that voltage and all voltages after it to 2050. Then try that; ideally you should see very little green. Also, 1.093 is the highest voltage you can use anyway.


----------



## madmeatballs

@HyperC Did you do the shunt mod on your FE?


----------



## glnn_23

Just did a shunt mod on my EVGA 1080 Ti FE. Seems to have knocked down the TDP a bit.


----------



## HyperC

Quote:


> Originally Posted by *madmeatballs*
> 
> @HyperC Did you do the shunt mod on your FE?


Yes, but I guess I didn't apply that much, because my power limits aren't much lower. I will add some more soon; it's just a pain in the rear having to remove all the hot glue and drain my loop again. Plus I think my D5 is on its last legs, it barely pushes through my first two radiators. Not joking, it takes like 30 minutes to fill my system.
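For anyone weighing the shunt mod being discussed here: the card senses current across tiny shunt resistors, so adding resistance in parallel makes it under-read its own power draw. A minimal Python sketch of the math, assuming a hypothetical 5 mOhm stock shunt and an equal 5 mOhm added in parallel (check your own board, values vary):

```python
# The card senses power through small shunt resistors. Adding resistance in
# parallel lowers the effective resistance, so the card under-reads its true
# power draw by the ratio R_eff / R_stock.

def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel (ohms)."""
    return (r1 * r2) / (r1 + r2)

def reported_power(true_power_w: float, r_stock: float, r_added: float) -> float:
    """Power the card *thinks* it is drawing after the mod."""
    r_eff = parallel(r_stock, r_added)
    return true_power_w * (r_eff / r_stock)

# Hypothetical numbers: 5 mOhm stock shunt, 5 mOhm added on top.
r_eff = parallel(0.005, 0.005)              # 2.5 mOhm effective
half = reported_power(300.0, 0.005, 0.005)  # a real 300 W reads as ~150 W
```

This is why HyperC's partial application only moved his power limit a little: the reported-power scaling depends entirely on how much you drop the effective shunt resistance.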


----------



## OneCosmic

Quote:


> Originally Posted by *Clukos*
> 
> I lose about 200 points with mem at 550 so I guess mine do scale up to 775 or close enough


Mine scales too; with VRAM up to ~12700MHz performance keeps increasing, but past ~12500MHz it starts to hurt core stability.
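The scale-then-decline pattern a few of us are seeing fits the usual explanation that past some offset the memory's error handling costs more than the extra clock gains. A tiny Python sketch of picking the sweet spot from a benchmark sweep (the offsets and scores below are invented for illustration):

```python
# Memory OC often peaks before it crashes: past some offset, error retry
# eats more bandwidth than the extra clock provides. Sweep offsets, keep
# the best-scoring one. These (offset, score) pairs are made up.

runs = {450: 10350, 550: 10510, 650: 10575, 750: 10600, 850: 10480}

best_offset = max(runs, key=runs.get)   # offset with the highest score
print(best_offset, runs[best_offset])   # the +750 point wins this fake sweep
```

The point is just to compare scores rather than chasing the highest offset that doesn't artifact.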


----------



## madmeatballs

Quote:


> Originally Posted by *HyperC*
> 
> Yes but guess I didn't apply that much because my power limits aren't that much lower.. I will add some more soon just a pain in the rear have to remove all the hot glue and drain my loop again plus I think my D5 is on its last leg barely pushes through my 1st 2 radiators not joking it takes like 30 minutes to fill my system


Haha! Yea, I know how that feels. I wanna do that shunt mod, I already have all the necessary materials. Just don't have the guts yet.


----------



## DStealth

Quote:


> Originally Posted by *madmeatballs*
> 
> Just don't have the guts yet.


Just to stimulate you








This post and my answer beneath


----------



## Dasboogieman

Quote:


> Originally Posted by *Slackaveli*
> 
> DON'T GET TOO COMFORTABLE, MEN!!!***
> 
> http://wccftech.com/nvidia-geforce-20-volta-graphics-card-q3-2017/


My bet is NVIDIA will slightly tweak Pascal (and call it Volta) for a minor IPC gain, but even that is doubtful given the scheduled release. Unless the manufacturing node has changed or been heavily tweaked, I expect GV102 to have a similar clockspeed range to Pascal, with either 15-20% more CUDA cores at a similar die size, or 20-30% more CUDA cores at a 500-520mm2 die size. NVIDIA gets these gains essentially for free, likely because HBM2 memory controllers are smaller and HBM2 itself is less power hungry than GDDR5X, allowing the TDP to stay at 250W (so no real changes to the FE cooler necessary).

I highly doubt GV110 is a thing, though. GP100 actually has worse performance than GP102 at similar resource counts and similar VRAM, because it uses more power and thus limits its clockspeed within the 250W TDP. Unless NVIDIA finally plans to bifurcate the GeForce and Titan lines completely, with GV110 going to the Titan and GV102 to the 2080 Ti. I think that's unlikely too, since GV110 would cost more to make, and why put it in a Titan card when they can milk us just the same with the cheaper GV102?
The most likely explanation is that GV100 and GV102 will remain distinct as the GPGPU and graphics GPUs.


----------



## ocCuS

I think I finally found my future darling

https://videocardz.com/68741/galax-geforce-gtx-1080-ti-hof-features-163-phase-design

three 8-pin power connectors and 16+3 phase design


----------



## becks

Quote:


> Originally Posted by *ocCuS*
> 
> I think I finally found my future darling
> 
> https://videocardz.com/68741/galax-geforce-gtx-1080-ti-hof-features-163-phase-design
> 
> three 8-pin power connectors and 16+3 phase design


This is .....beautiful....
But I doubt we will have any water blocks available for it...


----------



## KingEngineRevUp

Quote:


> Originally Posted by *DStealth*
> 
> Many times written...I'm not upset don't worry for me will find a good card.
> Your result is not at 2152 core speed. Your system with [email protected] cannot keep up in the 4K Optimized Superposition benchmark, not even within 0.5fps of other systems. The 10500 mark is where non-throttling cards in the 2050-2076MHz range land, and you are below it, which is why we need to discuss whether you have a "real" 2152MHz core frequency or not.
> You can go back and see my score at 2063MHz core speed... here it is if you're lazy... and that was prior to the shunt mod
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: HyperC has 10632 with 2075MHz a few posts above as well.
> 
> Edit: I'm not upset about a 0.5fps difference; just face it, you're not running 2152MHz core speed. Actually you have 1.5fps less, a 2.3% difference, while claiming a 50MHz higher core speed.


Once again, you're wrong and not listening. Everyone's system is different. I'm not sure exactly why my score is off by 0.5 fps, but I already pointed out that my score at 2088 or 2075MHz isn't the same as yours because we have different system setups. And HyperC has a better CPU, oh, shocking!

You're in some kind of denial that a CPU makes a difference in a 4K test, but let's take a closer look at the fps itself. I'm scoring a higher maximum fps than HyperC, but his minimum framerates are slightly better, and slackman has multiple posts showing that even at higher resolutions a CPU can improve your minimum framerates.

What's consistent is that if I increase my clocks by 1% I gain about 1% in performance. It's scaling in my system.

The voltage curve doesn't matter for 2088MHz. Here's my score, once again on my system, at 2088 without using a voltage curve.



And running it at 2152 Mhz



So obviously the score is going up, not down. Compared to modern hardware, yeah, I might have a bottleneck affecting my score by 1 or 1.5 frames, where 1 frame literally makes a 125-point difference on a 10,000-point scale. But does that mean my card isn't running at 2152MHz? No, that's proven wrong by the images I just posted above.


----------



## ocCuS

Quote:


> Originally Posted by *becks*
> 
> This is .....beautiful....
> But I doubt we will have any water blocks available for it...


With the 980 Ti HOF they had a version that came with an EK waterblock out of the box. I hope they do that again


----------



## fisher6

Quote:


> Originally Posted by *ocCuS*
> 
> With the 980ti HOF they had a version directly with an EK waterblock out of the box. I hope they will do this again


The 1080 Ti HOF might be the one I will get to replace my 1080 Ti FE. I had the 980 Ti HOF and loved it, it boosted to over 1400Mhz out of the box. The block was made by Bitspower and not EK. From my experience with previous HOF cards, you are looking at September for the block to even start showing up at retailers and not everyone will have them. Just sucks that all those extra power phases won't mean anything because of how Pascal works. That said, HOF cards are the best I have owned.


----------



## mshagg

Quote:


> Originally Posted by *ocCuS*
> 
> I think I finally found my future darling
> 
> https://videocardz.com/68741/galax-geforce-gtx-1080-ti-hof-features-163-phase-design
> 
> three 8-pin power connectors and 16+3 phase design


Epic-looking card, but it seems like a lot of effort for a chip that hasn't gone much past 2100MHz in any variant to date.


----------



## becks

Saw the article mentioning a rear I/O panel button and a custom BIOS... so we might be in for a big surprise...


----------



## VESPA5

Quote:


> Originally Posted by *mshagg*
> 
> Epic looking card, but seems like a lot of effort to go to for a chip that seems not to have gone much past 2100Mhz in any variant to date.


There is a dude who OC'd this card to around 2.5GHz, but using insane cooling methods (liquid nitrogen?)
http://www.pcgamer.com/geforce-gtx-1080-ti-overclocked-to-25ghz-on-sets-world-record/

I'd be content just slapping the EVGA Hybrid cooler onto my 1080 Ti FE (like I did with my 980 Ti). I just thought it was shady that EVGA took the same cooler (for a Titan X) and bumped the price from $119.99 to $159.99 the day the 1080 Ti was released.


----------



## Silent Scone

Strix block ordered. Can't wait to stretch the card's legs


----------



## CoreyL4

So I need some help with the voltage curve...

I'm aiming for a max clock of 2025.

1.012v = 1999 / 1.050v = 2012 / 1.093v = 2025

Is that how it works?

Also, could I set 1.093 to 2025 and the next three lower voltage points to 2012, so I stay at 2012 for the majority of the time? So 1.093 at 2025, 1.081 at 2012, 1.075 at 2012, 1.062 at 2012, and then 1.050 at 1999? I'm a bit confused.


----------



## Blotto80

Anything you try to hit at 1.093v won't stay at that speed due to power throttling, unless you do a shunt mod or have one of the aftermarket cards with much higher TDP limits.

If your goal is 2025, start with the default curve and raise the 1.000v point to 2025; if it crashes, try 1.012v, and so on until you find the lowest voltage that can run 2025 without crashing (for me it's 1.000v). Then work backwards along your curve.

The lower the voltage at 2025, the more likely it stays at 2025 and doesn't jump around all over the place.
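In pseudo-form, the "work backwards" recipe above amounts to flattening the curve at your target clock from the lowest stable voltage upward. A rough Python sketch; the voltage points and default clocks are made up for illustration, not real Afterburner values:

```python
# Sketch of the flatten-the-curve approach: find the lowest voltage that
# holds your target clock, then pin every point at or above that voltage
# to the same clock so the card stops bouncing between bins.

def flatten_curve(curve: dict, stable_v: float, target_mhz: int) -> dict:
    """New curve where every point at or above stable_v runs target_mhz."""
    return {v: (target_mhz if v >= stable_v else min(mhz, target_mhz))
            for v, mhz in curve.items()}

# Hypothetical default curve (volts -> MHz).
default = {0.950: 1936, 1.000: 1987, 1.050: 2025, 1.062: 2038, 1.093: 2063}
flat = flatten_curve(default, stable_v=1.000, target_mhz=2025)
# Points from 1.000v up are now pinned to 2025; lower points are untouched.
```

The `min(mhz, target_mhz)` on the lower points just keeps the curve from ever exceeding the target below the stable voltage.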


----------



## Caos

Quote:


> Originally Posted by *Blotto80*
> 
> Anything you try to hit at 1.093v won't stay at that speed due to power throttling unless you do a shunt mod or have one of the aftermarket cards with much higher TDP limits.
> 
> If your goal is 2025, start with the default curve and raise 1.000 to 2025, if it crashes, try 1.012 and so on until you find the lowest volts that can run 2025 without crashing (for me it's 1.000v). Then work backwards on your curve.
> 
> The lower the volts at 2025 the more likelihood that it stays at 2025 and doesn't jump around all over the place.


Thanks for the information. Yesterday I was testing and my Strix sat at 2025mhz at 1.093v almost all the time in-game... is that not harmful?

Also, how do I lock points in the voltage curve editor in Afterburner?


----------



## leequan2009

Quote:


> Originally Posted by *ocCuS*
> 
> I think I finally found my future darling
> 
> https://videocardz.com/68741/galax-geforce-gtx-1080-ti-hof-features-163-phase-design
> 
> three 8-pin power connectors and 16+3 phase design


three 8-pin power connectors and 16+3 phase design
Ok. The New King of OC is coming....


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Caos*
> 
> Thanks information, yesterday I was testing and my strix was 2025mhz in 1093v for almost all the time in the game .. is not harmful?
> 
> How to lock points in the voltage editor in afterburn


You keep asking the same question and we keep telling you no.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *leequan2009*
> 
> three 8-pin power connectors and 16+3 phase design
> Ok. The New King of OC is coming....


Yeah, but sadly, if the die sucks and you lose the silicon lottery, all that power goes to waste.

The average 1080 Ti OC on hwbot right now seems to be 1850 on air and 2050 on water.


----------



## Blotto80

Quote:


> Originally Posted by *Caos*
> 
> Thanks information, yesterday I was testing and my strix was 2025mhz in 1093v for almost all the time in the game .. is not harmful?
> 
> How to lock points in the voltage editor in afterburn


No, it's not harmful. There's not much you can do on the software side of these cards that can cause harm, but... 1.093v is probably way more than you need for 2025mhz. Try only adjusting a lower voltage bin up; start at 1.000v and work up from there. To lock it, just select the bin you want and hit L. It will still throttle if it needs to, though, so locking it is kind of unnecessary.


----------



## Caos

Quote:


> Originally Posted by *SlimJ87D*
> 
> You keep asking the same question and we keeping telling you no.


And the voltage curve in Afterburner?

Ok, sorry


----------



## viz0id

So I'm glad you are already talking about voltage curves and undervolting. I have tried to stabilize my boost around 2025, and I've gotten it down to 0.981v.

I can run Time Spy, FurMark and my current game, ME: Andromeda, without it fluctuating. Can I then call it stable, or is there something else I should test it against?

How do I know if it doesn't work? Does it just clock down, or does it bluescreen?

I'm monitoring in GPU-Z and my TDP while gaming never goes above 100%, but in Time Spy it does, so I've allowed it to go to 120%, but it rarely (currently never) uses it.

EDIT: meant undervolting, not underclocking.


----------



## Caos

Quote:


> Originally Posted by *Blotto80*
> 
> No, it's not harmful. There's not much you can do on the software side of these that can cause harm but.... 1.093v is probably way more than you need for 2025mhz. Try only adjusting a lower voltage bin up, start at 1.000v and work up from there. To lock it, just select the bin you want and hit L. It will still throttle if it needs to though so locking it is kind of unnecessary.










Thank you very much for answering... I will try what you suggest.


----------



## BrainSplatter

Quote:


> Originally Posted by *SlimJ87D*
> 
> I bring you... SC2 bios O_O
> 
> EVGA1080TiSC2.zip 155k .zip file


Thanks for the SC2 BIOS; having 0% fan speed on FEs is great, especially with SLI. Flashed both FEs (ASUS and Gigabyte). OC properties seem to be about the same. Memory OC seems to vary much more than core: the ASUS FE does > +600, the Gigabyte FE artefacts @ +480. Both cores do about 2050 when cooled well.

My EKWB 1080Ti blocks have also just arrived, but I am not sure when I will actually build my first custom water loop for the GPUs. The prospect of not having these cards in action for about 2 or 3 days (my guess for how long it will take me) is somewhat discouraging. And with 0% fan speed at idle, my major complaint about the FE cooling has been solved, lol.

Quote:


> Originally Posted by *viz0id*
> 
> How do i know if it doesn't work? Does it just clock down or does it bluescreen?


If you are on the edge of stability, a slightly unstable overclock will usually only result in a driver reset / game crash, not a bluescreen.


----------



## viz0id

Quote:


> Originally Posted by *BrainSplatter*
> 
> If you are on the edge of stability, a slightly unstable overclock will usually only result in a driver reset / game crash, not a bluescreen.


Ok cool, good to know! Thanks for the quick response! Might try and push it even further then


----------



## KingEngineRevUp

Quote:


> Originally Posted by *viz0id*
> 
> So I'm glad you are already talking about voltage curves and undervolting. I have tried to stablize my boost around 2025, and I've gotten it to 0.9810.
> 
> I can run Time Spy, furmark and my current game ME:Andromeda without it fluctuating. Can i then call it stable or is there something else i should test it up against?
> 
> How do i know if it doesn't work? Does it just clock down or does it bluescreen?
> 
> I'm monotoring in GPU-Z and my TDP while gaming never goes above 100%, but in TimeSpy it does so I've enabled it to go to 120% but it rarely (never currently) uses it.
> 
> EDIT: meant undervolting not underclocking.


It's probably good to go. Just keep playing with it. You usually get a driver crash if it's a core clock instability, or artifacts and a system freeze if it's memory.

But if you can run through Time Spy and play Mass Effect, you're probably good to go.

Do you own Witcher 3? That game hates an unstable OC; it'll let you know within 10 minutes whether your OC is okay. Sometimes it can even crash during conversations with people.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *BrainSplatter*
> 
> Thanks for SC2 BIOS, having 0 fan speed on FEs is great, especially with SLI. Flashed both FEs (ASUS and Gigabyte). OC props seem to be about the same. Memory OC seems to vary much more than core. ASUS FE does > +600, Gig FE artefacts @ +480. Both cores do about 2050 when cooled well.
> 
> My EKWB 1080Ti blocks have also just arrived but I am not sure when I will actually build my first custom water loop for the GPUs
> 
> 
> 
> 
> 
> 
> 
> . The prospect of not having these cards in action for about 2 or 3 days (my guess for how long it will take 4 me) is somewhat discouraging. And with 0 fan speed at idle, my major complaint with the FE cooling has been solved, lol.
> If u are on the edge of stability, a slightly instable overclock will usually only result in a driver reset / game crash not a bluescreen.


You're welcome. Yes, I can actually OC my memory to +550 on the SC2. So far, I like it. It has been a bit more stable for me.

Witcher 3, BF1 and Quantum Break run at 2114MHz @ 1.075v.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Caos*
> 
> 
> 
> 
> 
> 
> 
> 
> Thank you very much for answering .. I will prove what you say to me.


I don't recommend locking a voltage and core, because then your card no longer downclocks or really goes idle.


----------



## viz0id

Quote:


> Originally Posted by *SlimJ87D*
> 
> It's probably good to go. Just keep playing with it. You usually get a driver crash issue if it's a core clock instability. Or artifacts and a system freeze if it's memory.
> 
> But if you can run through timespy and play Mass Effect, you're probably good to go.
> 
> Do you own Witcher 3? That is a game that hates unstable OC, it'll let you know within 10 minutes if your OC is okay or not. But sometimes it can crash in conversations with people.


Thanks for answering! Yeah, I do own Witcher 3, I'll add that to the test setup! It's really strange tweaking this card. It's my first Pascal card, and undervolting to get "better" (at least more stable) performance is really weird. With everything on full blast I was peaking somewhere in the 2075-2100 range with my Asus Strix, but it fluctuated so much with the power limiting. This feels a lot better.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *viz0id*
> 
> Thanks for answering! Yeah i do own Witcher 3, I'll add that to the test setup! But yeah its really strange tweaking this card. It's my first pascal card and to undervoltage to get "better" (at least more stable) performance is really weird. With everything on full blast i was peaking somewhere in the 2075-2100 range with my Asus Strix. But it fluctuated so much with the power limitation. This feels a lot better.


Well, let me tell you one more thing.

Superposition and Time Spy are like stress tests; they run the card at high power levels. Games won't run that intensively.

Witcher 3 only uses about 3/4 of the power Superposition does.

I found that playing the game at 1.042v rarely caused any power drops. This was before I shunt modded.

So you can definitely aim for 1.03-1.042v for gaming and be fine to hit a higher core.


----------



## viz0id

Quote:


> Originally Posted by *SlimJ87D*
> 
> Well let me tell you one more thing.
> 
> Superposition and Timespy are like stress test, they run the card at high power levels. But games will not run that intensively.
> 
> Witcher 3 will only uses 3/4 of the power Superposition does.
> 
> I found that playing the game at 1.042v rarely has any power drops. This was before I was shunt modded.
> 
> So you can definitely aim for 1.03-1.042v for gaming and be fine to hit a higher core.


Ok, I'll have a go at that as well. When I went for "as high as possible" I didn't do a voltage curve; I just did +100 on whatever the card had as base, and increased the power limit to 120%.

I guess if I use the same methodology there as I did with the undervolting, I should get a stable clock, right?

One thing I also wondered about with the higher clocks, which you might be able to explain: it starts at a higher clock, then drops down, even though the temp is in the 60s. Why is that? There is no link to the temp limit and it doesn't look like it hits the TDP limit, but it boosts up and then drops down for some reason. What is causing this?


----------



## mshagg

Quote:


> Originally Posted by *SlimJ87D*
> 
> Well let me tell you one more thing.
> 
> Superposition and Timespy are like stress test, they run the card at high power levels. But games will not run that intensively.
> 
> Witcher 3 will only uses 3/4 of the power Superposition does.
> 
> I found that playing the game at 1.042v rarely has any power drops. This was before I was shunt modded.
> 
> So you can definitely aim for 1.03-1.042v for gaming and be fine to hit a higher core.


I find playing Assetto Corsa in VR causes a Superposition/Time Spy-stable OC to crash. My theory is the card won't/can't spend much time at the higher clocks when running Superposition/Time Spy before getting downclocked, but after 20-odd minutes in the highest bin playing Assetto, the instabilities find their way to the surface.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *viz0id*
> 
> Ok I'll have a go at that as well. When i went for "as high as possible" i didn't do voltage curve, I just did "+100" on whatever the card had as base, and increased Power limiter to 120%.
> 
> I guess if i use the same methodology on that, as i did on the under voltage, I should get a stable clock right?
> 
> One thing i also wondered with the higher clocks that you might be able to explain to me, is that it starts at a higher clock, then drops down, even though the temp is in the 60's. Why is that? There is no link to temp limit and it doesn't look like it hits TDP limit, but it boosts up, then drops down for some reason. What is causing this?


You lose 13MHz for every 10C, starting at 40C, I believe.

So when you first run a game and your temperatures are coming up from idle, you get your max boost; then it instantly recalculates against your TDP. Is this what you're asking about?

When you test, make sure you're running GPU-Z in the background and look at the PerfCap Reason in the Sensors tab.

If it's green, it's clocking down due to power limiting.
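If it helps, the temperature behaviour above can be sketched like this, assuming one ~13MHz bin per 10C starting at 40C as described (the exact thresholds vary per card and BIOS, so treat the numbers as illustrative):

```python
# Back-of-envelope GPU Boost behaviour: the card sheds one ~13 MHz bin
# for every 10 C above roughly 40 C. Thresholds vary per card/BIOS;
# this only illustrates the step pattern.

def boost_offset_mhz(temp_c: float, start_c: float = 40.0, step_mhz: int = 13) -> int:
    """Clock lost (MHz) at a given core temperature."""
    if temp_c < start_c:
        return 0
    bins = int((temp_c - start_c) // 10) + 1
    return bins * step_mhz

lost_cool = boost_offset_mhz(35)   # below the first threshold: 0 MHz lost
lost_warm = boost_offset_mhz(45)   # first bin: 13 MHz lost
lost_hot = boost_offset_mhz(65)    # three bins: 39 MHz lost
```

That's why a card coming up from idle shows its peak boost for the first few seconds and then settles a couple of bins lower.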


----------



## cugno87

Quote:


> Originally Posted by *viz0id*
> 
> Ok I'll have a go at that as well. When i went for "as high as possible" i didn't do voltage curve, I just did "+100" on whatever the card had as base, and increased Power limiter to 120%.
> 
> I guess if i use the same methodology on that, as i did on the under voltage, I should get a stable clock right?
> 
> One thing i also wondered with the higher clocks that you might be able to explain to me, is that it starts at a higher clock, then drops down, even though the temp is in the 60's. Why is that? There is no link to temp limit and it doesn't look like it hits TDP limit, but it boosts up, then drops down for some reason. What is causing this?


It's a sort of soft thermal throttling; it occurs at 49°C and 69°C too.


----------



## viz0id

Quote:


> Originally Posted by *cugno87*
> 
> It's a sort of soft thermal throttling, occurs at 49 °C and 69°C too.


Quote:


> Originally Posted by *SlimJ87D*
> 
> You lose 13 Mhz for every 10C starting at 40C I believe.
> 
> So when you first run a game and you're temperatures are coming from idle, then you're getting your max boost. Then it instantly recalculates your TDP. Is this what you're asking about.
> 
> When you test, make sure you're running GPU-Z in the background and look at the sensors tab at perf cap.
> 
> If it's green, then it's clocking down due to power limiting.


Ok thanks for helping me understand, both of you!


----------



## tpain813

So I installed the EVGA Hybrid cooler last night. Man, was that a PITA. I'd heard the directions were bad, but I was not expecting that. It wasn't hard, just time-consuming, because I needed to look up every step to make sure I was doing it right. I'm still nervous I may have messed something up: I had to remove the cooler from the GPU chip and re-seat it, because I lost my grip on it when I had it upside down to screw in the spring-loaded screws. Also, I had thermal paste residue all around the chip (in the green board area) that I couldn't remove. And I wasn't sure what to do with the long fan wire that sits inside the shroud around the fan; I think it may be touching the heatsink, or even stuck a bit between it and the top of the shroud.

It took longer than expected to complete, so I haven't fired it up yet. Will do after work, and hoping it all works! Fingers crossed!


----------



## gkolarov

I am experiencing the "software" thermal throttling at 50°C and again at 60°C. I haven't passed 70°C, so no idea beyond that. At 59°C with 55% fan and 1.093V, the core holds 2038 without any throttling. The card is an Asus Strix.


----------



## Ayame77

Hi! What do you think about Gigabyte 1080Ti OC Gaming 11G?


----------



## CoreyL4

People with gaming x's, what are your temps/clocks/fan speeds when gaming/benchmarking?


----------



## Luckbad

Quote:


> Originally Posted by *eXteR*
> 
> Can you dump that bios so we can try it on founder cards?


Uploaded the Zotac 1080 Ti AMP! Extreme BIOS. It's not likely to be of much help to FE users. It has a power limit of 120% and won't start your fans until 55 °C.

https://www.techpowerup.com/vgabios/191335/zotac-gtx1080ti-11264-170330


----------



## eXteR

Quote:


> Originally Posted by *Luckbad*
> 
> Uploaded the Zotac 1080 Ti AMP! Extreme BIOS. It's not likely to be of much help to FE users. It has a power limit of 120% and won't start your fans until 55 °C.
> 
> https://www.techpowerup.com/vgabios/191335/zotac-gtx1080ti-11264-170330


Thanks mate!

The power limit is not a problem, because this card starts at 320W (more than the FE's stock power limit +20%) and goes up to 384W.

Fans are also not a problem, because you can make a custom fan profile in MSI Afterburner.

The point here is that this custom PCB is probably really different from the reference one. I'll wait until some of the more experienced guys in the thread give it a try.

@SlimJ87D


----------



## cugno87

Quote:


> Originally Posted by *Luckbad*
> 
> Uploaded the Zotac 1080 Ti AMP! Extreme BIOS. It's not likely to be of much help to FE users. It has a power limit of 120% and won't start your fans until 55 °C.
> 
> https://www.techpowerup.com/vgabios/191335/zotac-gtx1080ti-11264-170330


"WARNING: Firmware image PCI Subsystem ID (19DA.1471)
does not match adapter PCI Subsystem ID (10DE.120F)."

What now?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *eXteR*
> 
> Thanks mate!
> 
> Powerlimit is not a problem, because this card starts at 320W (more than FE stock Power limit +20%) and goes up to 384W.
> 
> Fans also not a problem, because you can make a custom profile on MSI Afterburner.
> 
> The point here is that this custom PCB should be really different than reference one. I'll wait sience other experienced guys here on the post, give it a try.
> 
> @SlimJ87D


No, I can't keep testing all these BIOSes. My wife is starting to get angry at me for staring at a benchmarking screen, lol.

Why don't you be the first to test it?


----------



## KedarWolf

Quote:


> Originally Posted by *Luckbad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *eXteR*
> 
> Can you dump that bios so we can try it on founder cards?
> 
> 
> 
> Uploaded the Zotac 1080 Ti AMP! Extreme BIOS. It's not likely to be of much help to FE users. It has a power limit of 120% and won't start your fans until 55 °C.
> 
> https://www.techpowerup.com/vgabios/191335/zotac-gtx1080ti-11264-170330
Click to expand...

Can't test it for another 6.5 hours, until I'm home from work. The AMP Extreme has two 8-pin power connectors and 16x2 power phases, so I'm not sure it'll work.


----------



## Pandora's Box

Quote:


> Originally Posted by *CoreyL4*
> 
> People with gaming x's, what are your temps/clocks/fan speeds when gaming/benchmarking?


2050 core, 12,000 mem, 70% fan speed, 65C while benchmarking.

2000 core, 11,000 mem, 60% fan speed, 65C while gaming.

I don't see a huge boost from memory overclocking while gaming, only while benchmarking.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Can't test it for another 6.5 hours until I'm home from work. Amp Extreme has two 8 pin power connectors and 16x2 power phases,not sure it'll work.


Yeah, on top of that, it has some separate chips and components for "power boost" on the board.

It's very different from the reference design. Almost like someone hard modding the other side of the board like that one guy did.

I wouldn't bother even trying to test it as it's probably written to find the "power boost" parts.


----------



## KedarWolf

Quote:


> Originally Posted by *cugno87*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Luckbad*
> 
> Uploaded the Zotac 1080 Ti AMP! Extreme BIOS. It's not likely to be of much help to FE users. It has a power limit of 120% and won't start your fans until 55 °C.
> 
> https://www.techpowerup.com/vgabios/191335/zotac-gtx1080ti-11264-170330
> 
> 
> 
> "WARNING: Firmware image PCI Subsystem ID (19DA.1471)
> does not match adapter PCI Subsystem ID (10DE.120F)."
> 
> What now?
Click to expand...

I think that's a normal message when cross-flashing. Hit the 'y' key twice and it should flash.
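For anyone unsure what that looks like in practice, here's a rough sketch of the usual nvflash cross-flash procedure (the `-6` switch is what overrides the PCI Subsystem ID mismatch and triggers those 'y' prompts; the file names are just placeholders, and you flash at your own risk):

```
nvflash --save backup.rom      # back up the current BIOS first
nvflash --protectoff           # disable the EEPROM write protect
nvflash -6 zotac-amp.rom       # -6 overrides the Subsystem ID mismatch; answer 'y' at the prompts
```

Keep the backup somewhere safe so you can flash back if the card misbehaves.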


----------



## eXteR

Quote:


> Originally Posted by *SlimJ87D*
> 
> No, I can't keep testing out all these bios. My wife is starting to get angry at me looking at a benchmarking screen, lol.
> 
> Why don't you be the first to test it.


Lost my balls









I'll give it a try tonight. Still at work.


----------



## CoreyL4

Quote:


> Originally Posted by *Pandora's Box*
> 
> 2050 core, 12,000 mem 70% fan speed 65C while benchmarking
> 
> 2000 core, 11,000 mem 60% fan speed, 65C while gaming.
> 
> I dont see a huge boost in memory overclocking while gaming only while benchmarking.


Do you know what your voltage is for both?


----------



## Pandora's Box

Quote:


> Originally Posted by *CoreyL4*
> 
> Do you know what your voltage is for both?


2000 - around 1.012

2050 - around 1.031

Obviously this fluctuates


----------



## CoreyL4

Quote:


> Originally Posted by *Pandora's Box*
> 
> 2000 - around 1.012
> 
> 2050 - around 1.031
> 
> Obviously this fluctuates


Thanks! Did you mess with your voltage curve?


----------



## Luckbad

Anyone else with a Zotac 1080 Ti AMP Extreme find a way to make the fans run all the time?

It has some sort of hardcoded fan stop below a certain temperature.

It follows your fan profiles in Afterburner et al. AFTER that limit is reached, but below that temp it's 0% all the time.

It wouldn't bother me except that as soon as it hits 45 C, it maxes the fan speed out and then drops it to your setting. Super annoying.


----------



## Pandora's Box

Quote:


> Originally Posted by *CoreyL4*
> 
> Thanks! Did you mess with your voltage curve?


Nope. Way too much fiddling for what amounts to not much gain if any (aftermarket card with higher power limit)


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Luckbad*
> 
> Anyone else with a Zotact 1080 Ti Amp Extreme find a way to make the fan run all the time?
> 
> It has some sort of hardcoded fan speed killer up to a certain temperature.
> 
> It follows your fan profiles in Afterburner et al AFTER that limit is reached, but before that temp, it is 0% all the time.
> 
> It wouldn't bother me except that as soon as it hits 45 C, it maxes the fan speed out then drops it to your setting. Super annoying.


Are you saying the fans are at 0% and then they rev up to max and then follow the fan profile?


----------



## cugno87

Quote:


> Originally Posted by *KedarWolf*
> 
> I think that's a normal message. Hit the 'y' key twice, should flash.


Sorry... lost my balls too


----------






## jckaboom

Quote:


> Originally Posted by *rcfc89*
> 
> Serious question fella's
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm currently waiting on my Titan Xp and Evga hybrid cooler. With tax I'm looking at $1461. Although staying on a single card is tempting I'm really digging the new 1080Ti SeaHawk. Being a huge Seahawk fan and living in Seattle helps as well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can run a pair in SLI for $1600 out the door. $139 more then the Xp setup. Opinions?


Depends on the games you play and the resolution. For 1440p I would go with a single card; at 4K I would take the SLI.
If the game you play at 1440p is not too heavy (running more than 100 fps), you will not see a big difference between the Titan Xp and the 1080 Ti.
IMO, if I were running 1440p, the 1080 Ti with custom water cooling (plus the shunt mod if needed) would give you nice performance, temps, and looks.
To play AAA games at 4K I would go SLI 1080 Ti. The SLI setup will need good airflow unless you go all in with a custom loop (DO IT!).


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> With mem that high, are your scores actually higher throughout the rest of the benchmarks? I.E. +450 vs +750. I ask, because I found that even though I could clock my mem into the stratasphere the scores actually declined after +550 or so.


Mine drops after +440 and artifacts at +480, so it scales until you start getting errors, which won't 'show' (except in scores) until they reach the artifacting level. So the ones who won the VRAM lottery are the winners in SuPo.
Quote:


> Originally Posted by *DStealth*
> 
> Many times written...I'm not upset don't worry for me will find a good card.
> Your result is not on 2152 core speed. Your system with [email protected] cannot hold 4k Optimised Superposition benchmark not even with 0.5fps compared to other systems. 10500 mark is where not throttling cards are hitting with 2050-2076Mhz range and you are bellow this mark what we need to discuss if you have a "real" 2152Mhz core frequency or not ?
> You can get back and see my score with 2063Mhz core speed...here you're if you're lazy ...and this was prior to the shunt mod
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: HyperC has 10632 with 2075Mhz some posts above also
> 
> Edit: Not upset for 0.5fps difference just get with it you're not running 2152Mhz core speed...actually the difference is you have 1.5fps less relating to 2.3% difference claiming 50mhz higher core speed


Bro, you aren't factoring in VRAM (bandwidth). Slim is like me, stuck around +400 memory. We can't get over 10,500 with that bandwidth, even if our core is 100 MHz faster (Slim). If his mem took +550, +600, +700 like some of y'all, he'd be hitting 10,600+, too. I can't get past 10,460 even at 2100 MHz. I should be seeing 10,550 at that clock, but only if my memory could hold +500. It can't. So I run +396 and max out at 10,464.


----------



## outofmyheadyo

Quote:


> Originally Posted by *Luckbad*
> 
> Uploaded the Zotac 1080 Ti AMP! Extreme BIOS. It's not likely to be of much help to FE users. It has a power limit of 120% and won't start your fans until 55 °C.
> 
> https://www.techpowerup.com/vgabios/191335/zotac-gtx1080ti-11264-170330


Thanks for the upload. This seems to be a great BIOS for Founders Edition owners who don't feel like messing with manual overclocking and the voltage curve. My Gigabyte Founders Edition boosted to 1898 with the stock BIOS without touching anything; the Zotac AMP Extreme BIOS boosted the same card to 2025 MHz, so I guess I'm gonna start using the Zotac one.


----------



## Luckbad

Quote:


> Originally Posted by *SlimJ87D*
> 
> Are you saying the fans are at 0% and then they rev up to max and then follow the fan profile?


Yep. Before 45 C, the fans are at 0%. As soon as it hits 45, they rev to >100% then drop to whatever speed is set.

It's unbearably annoying because during normal PC use, if you have the card overclocked, it reaches 45 C every few minutes, revs up, gets down to 40, repeats.


----------



## Slackaveli

Quote:


> Originally Posted by *DStealth*
> 
> Many times written...I'm not upset don't worry for me will find a good card.
> Your result is not on 2152 core speed. Your system with [email protected] cannot hold 4k Optimised Superposition benchmark not even with 0.5fps compared to other systems. 10500 mark is where not throttling cards are hitting with 2050-2076Mhz range and you are bellow this mark what we need to discuss if you have a "real" 2152Mhz core frequency or not ?
> You can get back and see my score with 2063Mhz core speed...here you're if you're lazy ...and this was prior to the shunt mod
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: HyperC has 10632 with 2075Mhz some posts above also
> 
> Edit: Not upset for 0.5fps difference just get with it you're not running 2152Mhz core speed...actually the difference is you have 1.5fps less relating to 2.3% difference claiming 50mhz higher core speed


Bro, you aren't factoring in VRAM (bandwidth). Slim is like me, stuck around +400 memory. We can't get over 10,500 with that bandwidth, even if our core is 100 MHz faster (Slim). If his mem took +550, +600, +700 like some of y'all, he'd be hitting 10,600+, too. I can't get past 10,460 even at 2100 MHz. I should be seeing 10,550 at that clock, but only if my memory could hold +500. It can't. So I run +396 and max out at 10,464.
Quote:


> Originally Posted by *BrainSplatter*
> 
> Thanks for SC2 BIOS, having 0 fan speed on FEs is great, especially with SLI. Flashed both FEs (ASUS and Gigabyte). OC props seem to be about the same. Memory OC seems to vary much more than core. ASUS FE does > +600, Gig FE artefacts @ +480. Both cores do about 2050 when cooled well.
> 
> My EKWB 1080Ti blocks have also just arrived but I am not sure when I will actually build my first custom water loop for the GPUs
> 
> 
> 
> 
> 
> 
> 
> . The prospect of not having these cards in action for about 2 or 3 days (my guess for how long it will take 4 me) is somewhat discouraging. And with 0 fan speed at idle, my major complaint with the FE cooling has been solved, lol.
> If u are on the edge of stability, a slightly instable overclock will usually only result in a driver reset / game crash not a bluescreen.


Yep, if you are getting a BSoD or a hard freeze with buzzing, that's a bad CPU OC.


----------



## Bishop07764

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Thanks for the upload, this seems to be a great bios for founders edition owners who dont feel like messing with the manual overclocking and voltage curve, my gigabyte founders edition boosted to 1898 with stock bios without touching anything, zotac amp xtreme bios boosted the same card to 2025mhz so I quess I'm gonna start using the zotac one.


Did you actually flash this onto a Founders edition? Let us know how it works. I'm power limited like crazy and don't want to bother with the shunt mod on my water-cooled Founders.


----------



## Slackaveli

Quote:


> Originally Posted by *Luckbad*
> 
> Yep. Before 45 C, the fans are at 0%. As soon as it hits 45, they rev to >100% then drop to whatever speed is set.
> 
> It's unbearably annoying because during normal PC use, if you have the card overclocked, it reaches 45 C every few minutes, revs up, gets down to 40, repeats.


Time to lower that house temp by 5°F lol.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Luckbad*
> 
> Yep. Before 45 C, the fans are at 0%. As soon as it hits 45, they rev to >100% then drop to whatever speed is set.
> 
> It's unbearably annoying because during normal PC use, if you have the card overclocked, it reaches 45 C every few minutes, revs up, gets down to 40, repeats.


They might have used PWM fans that need a full 12 V kick to start spinning.

Some fans cannot start unless they are driven at 9-12 V, or 75-100% duty cycle.

That's probably why this is happening.
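If that's the case, the behavior Luckbad describes is just a start-up kick plus temperature hysteresis. A minimal sketch of that logic (the 45 C on / ~40 C off thresholds are taken from the posts above; this is an illustration of the pattern, not Zotac's actual firmware):

```shell
# Simulate the reported behavior: fans stay at 0% until 45 C, kick to full
# speed to overcome static friction, then follow the profile, and stop
# again once the card cools back below ~40 C (assumed lower threshold).
fan=off
for temp in 38 42 46 44 41 39 47; do
  if [ "$fan" = off ] && [ "$temp" -ge 45 ]; then
    echo "${temp}C: kick to 100%, then settle on profile"
    fan=on
  elif [ "$fan" = on ] && [ "$temp" -lt 40 ]; then
    echo "${temp}C: fan stops (0%)"
    fan=off
  else
    echo "${temp}C: fan ${fan}"
  fi
done
```

That on/off cycling around the threshold is exactly the "reaches 45 C every few minutes, revs up, gets down to 40, repeats" loop described earlier.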


----------



## KedarWolf

Quote:


> Originally Posted by *Bishop07764*
> 
> Quote:
> 
> 
> 
> Originally Posted by *outofmyheadyo*
> 
> Thanks for the upload, this seems to be a great bios for founders edition owners who dont feel like messing with the manual overclocking and voltage curve, my gigabyte founders edition boosted to 1898 with stock bios without touching anything, zotac amp xtreme bios boosted the same card to 2025mhz so I quess I'm gonna start using the zotac one.
> 
> 
> 
> Did you actually flash this onto a Founders edition? Let us know how it works. I'm power limited like crazy and don't want to bother with the shunt mod on my water-cooled Founders.
Click to expand...

Can't wait to try it. I'm home from work in 5 hours from now.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> bro, you aren't factoring in vram (bandwidth). Slim is like me, stuck around +400 memory. We can't get over 10,500 witht hat bandwidth, even if our core is 100mhz faster (slim). If his mem took +550, +600, +700 like some of ya'll, he'd be hitting 10,600+, too. I can't get past 10, 460 even at 2100Mhz. I should be seeing 10,550 at that clock , but only if my memory can hold +500. It can't. So I run +396 and max out at 10,464.
> yep, if you are getting BoD or hard freeze with buzzing, that's a bad cpu OC.


Yeah, in my latest post I already showed him that my system runs the card differently. I might have a bottleneck of some kind; I probably need a 5775C for better minimums.

His theory is that the voltage curve doesn't give you true clocks, and he ignored that everyone's system performs differently. He's critical about 100 points in a 10,000 score, which is literally a 1% difference. 1 fps can literally change the score by +125 points.

And to prove him even more wrong, I ran a clock of 2088 MHz and compared it to 2152 MHz. It obviously scaled properly. So my system is bottlenecked elsewhere, but not the card.




This guy just needs to get over his epeen. It's like he used to have a huge member and it blew up and now he's stuck with an average one but likes to show people photos of his old big one.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Thanks for the upload, this seems to be a great bios for founders edition owners who dont feel like messing with the manual overclocking and voltage curve, my gigabyte founders edition boosted to 1898 with stock bios without touching anything, zotac amp xtreme bios boosted the same card to 2025mhz so I quess I'm gonna start using the zotac one.


All that's doing is running your card on the Zotac AMP Extreme's higher stock voltage curve.

More importantly, please try to test the TDP. When you run benchmarks, does your power limit ever approach the maximum you set in AB? In other words, if you run Superposition with the power limit slider at 120%, are you hitting 120% the majority of the time, and are your voltages and clocks dropping much?

Let us know.
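For reference, the AB slider is a percentage of the BIOS's base power target, so the watt numbers work out roughly like this (the 250 W FE base target is my assumption; the 320 W base / 384 W max figures are from eXteR's post quoted earlier):

```shell
# Power limit % is relative to the BIOS base target:
#   assumed FE base 250 W, AMP Extreme base 320 W (per the quoted post)
for base in 250 320; do
  for pct in 100 120; do
    echo "base ${base}W @ ${pct}% -> $(( base * pct / 100 )) W"
  done
done
```

So a 120% slider on the AMP Extreme BIOS lands at 384 W, which is why FE cards on that BIOS can stop hitting the power cap entirely.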


----------



## outofmyheadyo

My card is shunt-modded, yet it still sometimes flashes the power limit with the AMP Extreme BIOS while it's at 55-60% of its TDP. That makes no sense.


----------



## chibi

Anyone have the Aquacomputer GTX 1080 Ti water block + active XCS backplate? I just received mine today and it looks like the backplate doesn't use thermal pads on the memory chips the way EKWB does.

You guys reckon I should slap some on?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *outofmyheadyo*
> 
> My card is shuntmodded, yet it still sometimes flashed pwr limit with the amp xtreme bios, while it`s at 55-60% of it`s tdp that makes no sense.


Well, the ASUS BIOS does the same for me. It doesn't account for the shunt mod; it doesn't tap into that extra 100 watts I have.



So I don't think either of us can really find out what it can do for the non-shunt-modded guys.

Compared to stock with the shunt mod, no perf cap.


----------



## wrc05

Quote:


> "eXteR"
> 
> At the moment the pump is connected to the card, i'll recieve shortly a cable to connect the pump on PSU.


@eXteR post #5993 Page 600

May I ask which cable that is? Could you please provide the website link? Thanks.


----------



## CoreyL4

What is a good fan curve for a non reference card?

I do no fans till 40 degrees and have it max at 75% fan speed at 70 degrees or higher.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *wrc05*
> 
> @eXteR post #5993 Page 600
> 
> May I ask which cable is that? Please could you provide the website link, thanks.


If it's the EVGA Hybrid, you need a female mini 4-pin PWM to male 4-pin PWM so you can connect it to your mobo. And if you want to connect it to your PSU, then you'll need a SATA or Molex to 4-pin PWM adapter.


----------



## evosamurai

Quote:


> Originally Posted by *SlimJ87D*
> 
> No, I can't keep testing out all these bios. My wife is starting to get angry at me looking at a benchmarking screen, lol.
> 
> Why don't you be the first to test it.


My wife is starting to question all the packages that keep getting delivered here; the things I forget and have to order separately make the situation a little worse lol


----------



## Luckbad

Update on Zotac 1080 Ti Amp Extreme fans being hardcoded to stay at 0% until 45 C:

If I do not use any kind of fan control in Afterburner or FireStorm, it doesn't do the weird 100% fan thing at 45 C. It still stays at 0% until then, but it just spins them up to a reasonable and effectively silent RPM.

Long story short, do not try to use a custom fan profile for the Zotac 1080 Ti Amp Extreme or you'll get constantly angry with the fan.

The temperatures with the default profile are outstanding and it doesn't actually need a custom profile. I just wanted to keep the fans on at all times so the temperature could be minimized, but it seems that isn't an option.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Luckbad*
> 
> Update on Zotac 1080 Ti Amp Extreme fans being hardcoded to stay at 0% until 45 C:
> 
> If I do not use any kind of fan control in Afterburner or FireStorm, it doesn't do the weird 100% fan thing at 45 C. It still stays at 0% until then, but it just spins them up to a reasonable and effectively silent RPM.
> 
> Long story short, do not try to use a custom fan profile for the Zotac 1080 Ti Amp Extreme or you'll get constantly angry with the fan.
> 
> The temperatures with the default profile are outstanding and it doesn't actually need a custom profile. I just wanted to keep the fans on at all times so the temperature could be minimized, but it seems that isn't an option.


You can try SpeedFan; that might help.


----------



## wrc05

Quote:


> "SlimJ87D"


Yes, it is EVGA Hybrid, thanks for the info.


----------



## outofmyheadyo

Quote:


> Originally Posted by *SlimJ87D*
> 
> Well the ASUS bios does the same for me. It doesn't support the shunt mod. It doesn't tap into that extra 100 watts I have.
> 
> 
> 
> So I don't think either of us can really find out what it can do for the non-shunt modded guys.
> 
> Compared to stock with shunt mod, no perf cap


Well, it doesn't hurt to try; flashing a BIOS and reinstalling drivers is what, 5 minutes?


----------



## beastlyhax

I have a GTX 1080 Ti FE with an AIO cooler and the shunt mod, so I have no thermal or power limitations. When I play games or run benchmarks for a while, however, my frequency drops from 2050 down to 2038, the cause according to Afterburner being a voltage limit. Yet there seems to be no voltage issue, since it runs at 1.093 V all the time at both frequencies. Can anyone maybe shed some light on this?


----------



## joder

Quote:


> Originally Posted by *SlimJ87D*
> 
> Awesome, we need this.


Ha ha. Trying to figure out the embedded owner's club list. I have it nailed down now.


----------



## madmeatballs

Quote:


> Originally Posted by *chibi*
> 
> Anyone have the Aquacomputer GTX 1080 TI Water Block + Active XCS Backplate? I just received mine today and it looks like the backplate doesn't use thermal pads on the memory chips similar to how EKWB does.
> 
> You guys reckon I should slap some on?


I ordered both with the 1 slot bracket. I am still waiting for mine. Could you post some photos? hehe I wanna see it. I'd ask @Shoggy about your concern though.


----------



## claymanhb

This thing is going to be sitting with me at work for the next 5 hours. I'm already going nuts










-tapa


----------



## joder

Quote:


> Originally Posted by *claymanhb*
> 
> This thing is going to be sitting with me at work for the next 5 hours. I'm already going nuts
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> -tapa


Install it in your work computer.


----------



## outofmyheadyo

Most work computers don't come with watercooling loops and quick disconnects, I believe, not sure tho, and I'm not an expert, but I believe most people go to work, to "work".


----------



## eXteR

Quote:


> Originally Posted by *wrc05*
> 
> @eXteR post #5993 Page 600
> 
> May I ask which cable is that? Please could you provide the website link, thanks.


You have two options (expensive vs cheap).

https://www.moddiy.com/products/4%252dPin-PWM-Fan-Connector-%28Female%29-to-4%252dPin-Mini-GPU-Fan-Connector-%28Male%29.html

https://es.aliexpress.com/item/Hot-Sales-New-Product-Portable-Wireless-WiFi-Free-and-Easy-Sharing-Networks-Everywhere-Wifi-Router/1462054608.html


----------



## s1rrah

Quote:


> Originally Posted by *tpain813*
> 
> So I just installed the EVGA Hybrid cooler last night. Man was that a PITA. I heard the directions were bad, but was not expecting that. It wasn't hard, just time consuming because I needed to look up every step to make sure I was doing it right. Still nervous I may have messed something up - had to remove and put back on the Cooler to the GPU chip, because lost grip of it when I had it upside down to screw in the spring loaded screws. Also, I had thermal paste residue all around the chip (in the green board area), that I could't remove. And wasn't sure what to do with the long fan wire that sits inside the shroud around the fan - I think it may be touching the heat sink or even stuck between it and the top of the shroud a bit.
> 
> It took longer than expected to complete, so haven't fired it up yet. Will after work and hoping it all works! Fingers crossed!


I had two of the EVGA Hybrids for my old 980's and am now using a 1080 ti Seahawk that came already assembled with the Hybrid cooler ... and yes .. the EVGA's can be a PITA but mostly due to having to breakdown the reference cooler, which to me, was the most tedious part; all those tiny screws man!

Anyway, I've repasted my old 980's a gazillion times and it's easy peasy now ...

1. As far as seating the cooler on the die? Once you have the paste applied? I carefully lower the cooler from the top while holding the card with my other hand; I visually keep things aligned while doing this by "sighting" through the cooler mounting holes and just making sure they line up with the holes in the card; once it's seated, I simply hold it pressed very firmly with my left hand and flip it and one by one, apply the four screws. Super easy once you do it a couple more times.

2. The fan wire you mention? It's this wire indicated here, correct?

...



...

You just need to find a way to route it around the fan ... my 980's had a little channel that worked well for that ... once I found a path for it, I secured it with a couple small bits of electrical tape just to be sure it didn't "migrate" over and hit the fan. In fact, the last time I repasted, after assembling and reinstalling the GPU ... it turns out that little wire was ever so slightly touching the blower fan and made a bad clicking sound, so I had to take the shroud off and re-route/secure that wire.

All in all I think you'll find the performance of the cooler is well worth the bit of effort

Enjoy


----------



## mouacyk

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Most work computers dont come with watercooling loops and quickdisconnects, I belive not sure tho, and I`m not an expert but I belive most people go to work, to "work".


No they go to work to pick up the stuff they want for after work.


----------



## jestermx6

Got my Ti cooled and overclocked last night. Figured I'd post up here.

Bone stock on water (EK block/backplate), this is what I was seeing.










No BIOS flash or anything, just used Precision OC. Here's after about 30 minutes of the Heaven benchmark. Tried going higher with no luck.










Cooling loop is an Alphacool 480 X-Flow, pumping through rad > CPU > GPU > res > pump. Four Corsair SP120 LED fans at 50% (900 RPM).

My only real complaint is there's quite a bit of coil whine coming from the card. Not sure if I'm just noticing it now because it's not on air, or if it's a result of something to do with the water block.


----------



## rakesh27

I have the EVGA FE 1080 Ti with the Kraken G10, installed all fine. However, is there a way to put the backplate on to fully cover the card?


----------



## tpain813

Quote:


> Originally Posted by *s1rrah*
> 
> ...
> 
> You just need to find a way to route it around the fan ... my 980's had a little channel that worked well for that ... once I found a path for it, I secured it with a couple small bits of electrical tape just to be sure it didnt' "migrate" over and hit the fan. In fact the last time I repasted, after assembling and re installing the GPU ... it turns out that little wire was ever so slightly touching the blower fan and made a bad clicking sound and so had to take the shroud off and re route/secure that wire.
> 
> All in all I think you'll find the performance of the cooler is well worth the bit of effort
> 
> Enjoy


Did you plug the radiator fan back into the card or into the motherboard itself?

Yeah that was the fan wire I mentioned. Mine looks similar, it just seemed to bunch up around the heat sink more than yours in the pic.

Yeah, once I had the cooler seated on the die, I accidentally lifted it completely off while trying to get the first screw in. Instead of cleaning the thermal paste off and reapplying it, I just stuck it back on, so I'm hoping that won't hurt it too much....

The removal of the original cooler was definitely the most annoying part. So was reinstalling all of the hex screws so the backplate could be put back on, although I'm not even sure how necessary that was (if at all).

All in all, I'm sure it will be worth it, and I can't wait to test after work, but I was definitely not expecting it to be that annoying of a project! haha


----------



## Savatage79

I have an Aorus. Everything seems good but I'm trying to get idle temps down a bit. I'm at about 49 degrees when nothing is happening. My fan speed shows an on-and-off fluctuation between 0 and 59, is that normal?

Andromeda is running 2025 pretty solid at 4K, mid 60s temps; it gets to about 70 tops and then drops to 2000.

Need a few fan suggestions if possible, and whether it's normal to see fan speed fluctuate like that. It's about 3 seconds: 0, then 59, 0, then 59, etc.


----------



## claymanhb

Quote:


> Originally Posted by *joder*
> 
> Quote:
> 
> 
> 
> Originally Posted by *claymanhb*
> 
> This thing is going to be sitting with me at work for the next 5 hours. I'm already going nuts
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> -tapa
> 
> 
> 
> Install it in your work computer.
Click to expand...

I think it was a whopping 180 W PSU

-tapa


----------



## schmak01

Got some reference cards installed with the EVGA Titan Xp/1080 Ti Hybrid kits. They're doing pretty darn well right now. I haven't had time to really try and OC them yet, but I have them running at 2050 MHz. Running Heaven constantly for about 2 hours, the highest temp I got was 42C on the bottom card. The only thing I might change at this point is the bottom fan, currently a Corsair SP120; it probably should be an AF there for just a bit more VRAM cooling if I go and OC it. The LED lights work great with the EVGA controller software, although I wish there were a few more options. That is the new High Bandwidth Bridge as well.


----------



## schmak01

Quote:


> Originally Posted by *s1rrah*
> 
> I had two of the EVGA Hybrids for my old 980's and am now using a 1080 ti Seahawk that came already assembled with the Hybrid cooler ... and yes .. the EVGA's can be a PITA but mostly due to having to breakdown the reference cooler, which to me, was the most tedious part; all those tiny screws man!
> 
> Anyway, I've repasted my old 980's a gazillion times and it's easy peasy now ...
> 
> 1. As far as seating the cooler on the die? Once you have the paste applied? I carefully lower the cooler from the top while holding the card with my other hand; I visually keep things aligned while doing this by "sighting" through the cooler mounting holes and just making sure they line up with the holes in the card; once it's seated, I simply hold it pressed very firmly with my left hand and flip it and one by one, apply the four screws. Super easy once you do it a couple more times.
> 
> 2. The fan wire you mention? It's this wire indicated here, correct?
> 
> ...
> 
> 
> 
> ...
> 
> You just need to find a way to route it around the fan ... my 980's had a little channel that worked well for that ... once I found a path for it, I secured it with a couple small bits of electrical tape just to be sure it didnt' "migrate" over and hit the fan. In fact the last time I repasted, after assembling and re installing the GPU ... it turns out that little wire was ever so slightly touching the blower fan and made a bad clicking sound and so had to take the shroud off and re route/secure that wire.
> 
> All in all I think you'll find the performance of the cooler is well worth the bit of effort
> 
> Enjoy


Was this the 1080/1070 or 980 AIO kit being put on a 1080 Ti? The new EVGA kits specific to the 1080 Ti actually have the blower and plate included; it took less than 30 minutes per card. Much easier than the 980 Ti kit that required tearing down the reference blower. So now I have two whole 1080 Ti reference blowers sitting in a box. Oh, and a ton of tiny-ass screws, since EVGA gave you duplicates of EVERY screw.


----------



## Tonza

So I had a leaking waterblock and needed to slap my stock turbo cooler back on, which is way too loud for my taste and runs hot, so I decided to try undervolting with the Afterburner voltage/clock curve (with the waterblock and shunt mod I was stable at 2100 MHz core and stock 1.062 V). Anyway, just impressive results: the card is stable at 2025 MHz with just 0.975 V (GPU-Z reports around 0.98 V under load), runs over 10C cooler, and is not killing my ears now.


----------



## s1rrah

Quote:


> Originally Posted by *tpain813*
> 
> Did you plug the radiator fan back into the card or into the motherboard itself?


If you're okay with the card controlling the fan, then just plug it into the card.

I'm personally totally OCD about adjusting fan speeds, so I run all my fans on analogue fan controllers (with knobs and whatnot).

I just ordered a 3000 RPM Scythe fan to run on my fan controller with the Seahawk 1080 Ti ... silent when I need it and turbo when I need that ... 

joel


----------



## s1rrah

Quote:


> Originally Posted by *schmak01*
> 
> Was this the 1080/1070 or 980 AIO kit being put on a 1080 TI? The new kits that EVGA has that are specific for the 1080 TI's actually have the blower and plate included, took less than 30 minutes to do per card. Much easier than the 980 Ti kit that required the teardown of the reference blower. So now I have two whole 1080 Ti reference blowers sitting in a box. Oh and a ton of tiny ass screws, since EVGA gave you duplicates of EVERY screw.


No ... that's simply a reference GTX 980 before I put the shroud on ... those are stock reference cooling plate and blower ...

I think the EVGA cooling plate works better than the one on my new Seahawk 1080 Ti, as the air coming out the back of the Seahawk isn't that hot ... but man, my EVGA Hybrid kit was shooting 110F+ air out the back of the card, which was crazy to me as that was the same temp the radiator was exhausting ... can't believe those heat sinks made the difference, but I've ordered some nice Enzotech sinks to put on the Seahawk's cooling plate just to see ...


----------



## tpain813

haha nice! I have an m-ITX case so I barely have any fans to control







but have always liked the idea of having that!

Enjoy the Seahawk! If I didn't get the FE through EVGA's FTW upgrade then step-up, I would've probably picked that up!


----------



## schmak01

Quote:


> Originally Posted by *s1rrah*
> 
> No ... that's simply a reference GTX 980 before I put the shroud on ... those are stock reference cooling plate and blower ...
> 
> I think the EVGA cooling plate works better than the one on my new Seahawk 1080 ti as the air coming out the back of the Seahawk isn't that hot ... but man, my EVGA Hybrid kit was shooting 110F+ degree air out the back of the card which was crazy to me as that was the same temp as the radiator was exhausting ... can't believe those heat sinks made the difference but have ordered some nice Enzotech sinks to put on the cooling plate of the Seahawk just to see ...


Ahh I saw a lot of folks doing this 'mod', should work the same as their specific kits.

Do the copper sinks above the VRAM actually help any? I have a ton of those from Pis I have modded.


----------



## ru7hl355

Finally, after multiple delays, it's here!!


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, in my latest post, I already showed him that my system runs the card differently. I might have a bottleneck of some kind, probably need a 5775c for better minimums.
> 
> His theory is that the voltage curve doesn't give you true clocks and he ignored that everyones system performs differently. He's critical about 100 points in a 10,000 score which is literally a 1% difference. 1 fps can literally change the score to +125 points.
> 
> And to prove him even more wrong, I did a clock at 2088 Mhz and compared it at 2152 Mhz. It obviously scaled properly. So my system is bottle necked elsewhere, but not the card.
> 
> 
> 
> 
> This guy just needs to get over his epeen. It's like he used to have a huge member and it blew up and now he's stuck with an average one but likes to show people photos of his old big one.


The 5775C would help. At some point we are probably RAM bottlenecked, because running a 5775C and a 1080 Ti is a lot for DDR3 to handle, I think. It does well, though. Check out what I finally figured out. I guess I'm just slow, lol. Here is a two-map full playthrough of BF1. I'll just copy my post from the Aorus forum.

EUREKA. Guys, REMEMBER, I am on air, too. I did stick a heatsink on my copper plate and gained 5C somehow, lol. That and the thermal pasting. Also, I will admit my computer room's window was open for this 30-minute session, but it's only 72F in the house so not really cheating there.

Here is my run in Battlefield 1, two maps, ZERO PERFCAPS. I take back what I've been screaming about voltage, with caveats: first, know all your stable clocks at all voltages well before you mess with it. Also, these still hate 1.093 V (mine does) and aren't all that great with 1.08 either, but if I run a +90 curve (which would be +179 on a Founders Edition) and +51 mV and lock as you see in the pic, it doesn't crash. Gave me my highest Superposition score, too. And I just had 142 locked in Battlefield 1, all Ultra except post-process at medium, because I think it looks better and gives me a couple fps. I was 142 locked; it only dropped into the 130s while in full sprint on a heavy 64-man multiplayer map, and as soon as I stopped running it was 142 locked again. NO PERF CAPS AT ALL.

Look at those temps, that clock. Aorus FTW!!!



Good luck, guys, it took me awhile to figure this out.


----------



## chibi

Quote:


> Originally Posted by *madmeatballs*
> 
> I ordered both with the 1 slot bracket. I am still waiting for mine. Could you post some photos? hehe I wanna see it. I'd ask @Shoggy about your concern though.


Shoggy has advised that since the 1080 Ti pcb does not have memory chips on the backside, there's no need for pads.... Duh, lol. Brain fart on my end there.








There's a small thermal pad for the VRM though.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> The 5775C would help. At some point we are probably RAM bottlenecked, because running a 5775C and a 1080 Ti is a lot for DDR3 to handle, I think. It does well, though. Check out what I finally figured out. I guess I'm just slow, lol. Here is a two-map full playthrough of BF1. I'll just copy my post from the Aorus forum.
> 
> EUREKA. Guys, REMEMBER, I am on air, too. I did stick a heatsink on my copper plate and gained 5C somehow, lol. That and the thermal pasting. Also, I will admit my computer room's window was open for this 30-minute session, but it's only 72F in the house so not really cheating there.
> 
> Here is my run in Battlefield 1, two maps, ZERO PERFCAPS. I take back what I've been screaming about voltage, with caveats: first, know all your stable clocks at all voltages well before you mess with it. Also, these still hate 1.093 V (mine does) and aren't all that great with 1.08 either, but if I run a +90 curve (which would be +179 on a Founders Edition) and +51 mV and lock as you see in the pic, it doesn't crash. Gave me my highest Superposition score, too. And I just had 142 locked in Battlefield 1, all Ultra except post-process at medium, because I think it looks better and gives me a couple fps. I was 142 locked; it only dropped into the 130s while in full sprint on a heavy 64-man multiplayer map, and as soon as I stopped running it was 142 locked again. NO PERF CAPS AT ALL.
> 
> Look at those temps, that clock. Aorus FTW!!!
> 
> 
> 
> Good luck, guys, it took me awhile to figure this out.


Agreed, I don't have the best ram either. This will probably be close to the end of my PC's generation. Next generation we might get PCI-E 4.0 and I'll be really behind. But for now, I don't mind the 1% bottleneck I'm dealing with a 4790K and ram at 1888 Mhz.

Anyways, for me to run 1.093v, my system draws close to 400 watts on the card. So I've backed it down to 1.072v. And I'm going to try and back it down to 2114 Mhz at 1.062v.

So what you are saying makes sense. Lower voltage is still the safer and better way to go. I understand why NVIDIA limited the voltage to 1.092v, it's because the cards are drawing a lot of power to run these cores in the 2000s.

No way would anyone be able to do anything with the old 1.187v we used to get.

Do you got a picture of your heatsink on the back of your card? Lol. I want to see it.
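Back-of-the-envelope: dynamic power scales roughly with frequency times voltage squared (P ∝ f·V²), which is why dropping from 1.093 V to 1.072 V trims meaningful watts. A toy estimate, assuming the ~400 W figure above and ignoring static/leakage power (the 2126 MHz clock below is just a placeholder; it cancels out when the clock is unchanged):

```python
def est_power(p_watts, v_old, v_new, f_old, f_new):
    """Rough dynamic-power scaling estimate: P proportional to f * V^2.
    Ignores static/leakage power, so treat the result as a ballpark."""
    return p_watts * (f_new / f_old) * (v_new / v_old) ** 2

# ~400 W observed at 1.093 V; same clock at 1.072 V -> roughly 385 W
same_clock = est_power(400, 1.093, 1.072, 2126, 2126)
# 1.062 V and 2114 MHz vs. 2152 MHz at 1.093 V -> roughly 371 W
lower_bin = est_power(400, 1.093, 1.062, 2152, 2114)
```

So even a one- or two-bin undervolt plausibly saves 15-30 W under the numbers quoted above, which matches why the card stops slamming the power limit.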


----------



## Slackaveli

Quote:


> Originally Posted by *Savatage79*
> 
> I have an Aorus, everything seems good but I'm tryin to get idle Temps down a bit. I'm at about 49 degrees when nothing is happening. My fan speed shows a 0 to 59 off and on fluctuation, is that normal?
> 
> Andromeda is running 2025 pretty solid at 4k mid 60s, gets about 70 tops and drops to 2000.
> 
> Need a few fan suggestions if possible and if that's normal to see fan speed fluctuate like that. It's about 3 seconds, 0 then 59, 0 then 59 etc.


The fan has "fan stop" and the default curve kicks in at like 50C or something, I don't remember; I changed it immediately to kick in at 35C instead. Set up a fan curve that hits 100% at 49C for best effect. These are super quiet even at 100%, quieter than an FE at 40%. Look at my post above. I also stuck a finned aluminum heatsink on top of my backplate, contacting the copper. It gets hot fast, so it's working. Put an SP fan pointed right at it.

Also, a repaste of these is a must if you want great temps. Use Thermal Grizzly Kryonaut and do the full-coverage spread method.

If you look above I am at 32c idle and maxed at 45c in BF1 64man map, in 1440p 142fps Ultra.

*Pic is trash; I used my son's tablet and the lighting was terrible (it hates the LEDs, I think). My phone is terrible too, so this is the best I can do on the quick. It's 100mm x 30mm x 18mm or something like that. I can't find the Fujipoly pads I had left, so it's sitting there dry and working, so it is definitely 'a thing'. I could paste it, but that's messy. I don't feel like spending $20 on a thumb-sized piece of Fuji, so I'm still trying to find my stash, but no luck.
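For reference, the curve described above (kick in at 35C, hit 100% at 49C) amounts to a simple linear interpolation between two thresholds; a minimal sketch, with the thresholds as parameters since every card and case differs:

```python
def fan_duty(temp_c, start_c=35, full_c=49):
    """Fan duty (%) for a simple two-point curve: off at or below
    start_c, 100% at or above full_c, linear ramp in between."""
    if temp_c <= start_c:
        return 0
    if temp_c >= full_c:
        return 100
    return round(100 * (temp_c - start_c) / (full_c - start_c))
```

Plugging in the post's numbers, the fan sits at 0% on the desktop and is already at full tilt by 49C, which is why the card never gets a chance to heat-soak.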


----------



## GRABibus

Hello,

I would need the original Bios of the *MSI GTX1080 Ti Sea Hawk X* (Version of Sea Hawk with OC mode at 1569MHz).

Thank you in advance for the help.


----------



## GRABibus

Quote:


> Originally Posted by *s1rrah*
> 
> No ... that's simply a reference GTX 980 before I put the shroud on ... those are stock reference cooling plate and blower ...
> 
> I think the EVGA cooling plate works better than the one on my new Seahawk 1080 ti as the air coming out the back of the Seahawk isn't that hot ... but man, my EVGA Hybrid kit was shooting 110F+ degree air out the back of the card which was crazy to me as that was the same temp as the radiator was exhausting ... can't believe those heat sinks made the difference but have ordered some nice Enzotech sinks to put on the cooling plate of the Seahawk just to see ...


I sent you a PM


----------



## Somasonic

Hi all, looking for a bit of advice. I have an Asus Strix OC that has a bit of coil whine. It's not awful but I can hear it in quieter parts of games and it annoys me that a card this expensive comes with any sort of issue at all. I've spoken to my retailer and it looks like they'll let me return it based on coil whine but from what I've been reading I could well end up with one that's worse. I guess I'm wondering do I stick with a card that's not too bad and be happy with it or take a gamble on returning and see what I get back? I really can't decide at this point. Thanks!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *GRABibus*
> 
> Hello,
> 
> I would need the original Bios of the *MSI GTX1080 Ti Sea Hawk X* (Version of Sea Hawk with OC mode at 1569MHz).
> 
> Thank you in advance for the help.


 1080TiMSISeaHawk.zip 155k .zip file


----------



## mshagg

Lol has anyone been brave enough to try the Zotac BIOS?

Techpowerup reports TDP limit of 384W.


----------



## KedarWolf

Quote:


> Originally Posted by *claymanhb*
> 
> Quote:
> 
> 
> 
> Originally Posted by *joder*
> 
> Quote:
> 
> 
> 
> Originally Posted by *claymanhb*
> 
> This thing is going to be sitting with me at work for the next 5 hours. I'm already going nuts
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> -tapa
> 
> 
> 
> Install it in your work computer.
> 


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *claymanhb*
> 
> Quote:
> 
> 
> 
> Originally Posted by *joder*
> 
> Quote:
> 
> 
> 
> Originally Posted by *claymanhb*
> 
> This thing is going to be sitting with me at work for the next 5 hours. I'm already going nuts
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> -tapa
> 
> 
> 
> Install it in your work
> 


----------



## GRABibus

Quote:


> Originally Posted by *SlimJ87D*
> 
> 1080TiMSISeaHawk.zip 155k .zip file


Thanks!
This is the original BIOS of the Sea Hawk X version, right?

Do you own it?


----------



## GRABibus

Quote:


> Originally Posted by *mshagg*
> 
> Lol has anyone been brave enough to try the Zotac BIOS?
> 
> Techpowerup reports TDP limit of 384W.


Where is this BIOS?


----------



## KedarWolf

Sorry for those repeat posts, my phone messed up, I WASN'T trying to post those, I'll see as OP if I can delete them.


----------



## s1rrah

Quote:


> Originally Posted by *schmak01*
> 
> Ahh I saw a lot of folks doing this 'mod', should work the same as their specific kits.
> 
> Does the copper sinks above the VRAM actually help any? I have a ton of those from Pi's I have modded.


I would think they help a bit. The air flow inside the hybrids is quite restricted due to the pump being smack in the middle of the cooling plate so I figure anything helps...

Just ordered 8pc kit from Enzotech to put on the seahawk ... going to measure the exhaust air for any difference in temperatures ...


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> The 5775C would help. At some point we are probably RAM bottlenecked, because running a 5775C and a 1080 Ti is a lot for DDR3 to handle, I think. It does well, though. Check out what I finally figured out. I guess I'm just slow, lol. Here is a two-map full playthrough of BF1. I'll just copy my post from the Aorus forum.
> 
> EUREKA. Guys, REMEMBER, I am on air, too. I did stick a heatsink on my copper plate and gained 5C somehow, lol. That and the thermal pasting. Also, I will admit my computer room's window was open for this 30-minute session, but it's only 72F in the house so not really cheating there.
> 
> Here is my run in Battlefield 1, two maps, ZERO PERFCAPS. I take back what I've been screaming about voltage, with caveats: first, know all your stable clocks at all voltages well before you mess with it. Also, these still hate 1.093 V (mine does) and aren't all that great with 1.08 either, but if I run a +90 curve (which would be +179 on a Founders Edition) and +51 mV and lock as you see in the pic, it doesn't crash. Gave me my highest Superposition score, too. And I just had 142 locked in Battlefield 1, all Ultra except post-process at medium, because I think it looks better and gives me a couple fps. I was 142 locked; it only dropped into the 130s while in full sprint on a heavy 64-man multiplayer map, and as soon as I stopped running it was 142 locked again. NO PERF CAPS AT ALL.
> 
> Look at those temps, that clock. Aorus FTW!!!
> 
> 
> 
> Good luck, guys, it took me awhile to figure this out.


I still don't get what exactly you figured out... So where is the change/eureka? What is the reason behind the better score and no perf caps?


----------



## Addsome

I just had a SYSTEM_SERVICE_EXCEPTION bluescreen. My CPU overclock has been stable for months and my GPU overclock has been benchmark/game stable for a couple days. Is this bluescreen related to my overclock or something else?

This happened while running the new Superposition benchmark. I ran it again after the bluescreen using same overclock settings and it ran fine.

Edit: Timespy just crashed on the CPU test. Would that mean my CPU OC is unstable?


----------



## mshagg

Quote:


> Originally Posted by *GRABibus*
> 
> Where is this Bios ?


https://www.techpowerup.com/vgabios/191335/zotac-gtx1080ti-11264-170330


----------



## phenom01

So I just got my MSI Gaming X, and when I load up Afterburner the power limit is locked at 117%, but in all the reviews I'm seeing it's 120%. Am I missing something here?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *mshagg*
> 
> Lol has anyone been brave enough to try the Zotac BIOS?
> 
> Techpowerup reports TDP limit of 384W.


Why haven't you tried it? Grow a pair.


----------



## GRABibus

Did someone test it ?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *GRABibus*
> 
> Did someone test it ?


Why are you guys asking? If you're home just test it yourselves and if it doesn't work flash back.


----------



## mtbiker033

Quote:


> Originally Posted by *schmak01*
> 
> Got some Reference cards installed with the EVGA Titan Xp/1080 Ti Hybrid kits, doing pretty darn well right now, haven't had time to really try and OC them yet, but I have them running at 2050 Mhz. Having a heaven just run constantly for about 2 hours, the highest temp I got was 42C on the bottom card, only thing I might change at this time is the bottom fan, currently a corsair 120 SP, probably should have an AF there for just a bit more VRAM cooling, if I go and OC it. The LED lights work great with the EVGA controller software, although I wish there were a few more options. That is the new Hight Bandwidth Bridge as well.


where in the hell did you get those hybrid kits?!?! Been on auto-notify on evga and B&G and nada


----------



## gmpotu

This means my card is not getting the voltage it needs to go higher on the core, right?
Would getting a new power supply fix this if my power supply is very old, or do I just have a card that doesn't really do 2000+?
Should I bother trying to up the core clock with this? My guess is not, because this is saying it's not getting enough voltage to go higher than it is now?

I guess my two questions are....

1) If my Performance Cap = Vrel what can I do to fix that?
2) If my Performance Cap = Pow what can I do to fix that?

Also last night I was seeing a new one. Performance Cap = Vop


----------



## GRABibus

I have to understand how these 1080 Tis work.
I come from a GTX 1080 on which I could go over 1.093 V with the Asus Strix 1080 BIOS and no power limit.

Here, whatever the BIOS, I am power limited and get at best 2012 MHz.

I have a question:
With all those BIOSes for the GTX 1080 Ti that show different power limits (for example, Zotac at 384 W), why don't I see improvement using the Zotac BIOS?
How do you measure the fact that power is limited?

And a last question: why doesn't the Asus Strix BIOS unlock the voltage limit the way the Asus Strix BIOS did on GTX 1080s?

With the Sea Hawk X, if I put +65 MHz on the core in MSI AB (without tweaking the curve), I crash in Time Spy...
Not much headroom for overclocking due to the power limit, I assume?


----------



## schmak01

Quote:


> Originally Posted by *mtbiker033*
> 
> where in the hell did you get those hybrid kits?!?! Been on auto-notify on evga and B&G and nada


Straight luck with EVGA. I happened to refresh the site two minutes before the auto-notify fired and it was available to order. I had my confirmation before the auto-notify email. I was actually late to a meeting, but worth it.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> I still don't get what exactly did you figure out... So where is the change/eureka? What is the reason behind this better score and no perf?


Just the right curve settings to allow me to add some volts without crashing, but not so much that I slam into the power cap. I finally found a sweet spot. I can now game at a steady 2088 in BF1, at sub-50C temps, on air...


----------



## KedarWolf

Quote:


> Originally Posted by *GRABibus*
> 
> Did someone test it ?


Tried the Zotac BIOS: power limiting under the 100% power limit, some memory artifacts, and Time Spy crashes at 1.093 V / 2088 core and 0.993 V / 2025 core, both of which work on the stock and Asus BIOSes.


----------



## GRABibus

Quote:


> Originally Posted by *KedarWolf*
> 
> Tried Zotac BIOS, power limiting under 100% power limit, some memory artifacts and Time Spy crashes at 1.093v 2088 core and .993v 2025 core both which work on stock and Asus BIOS.


With the MSI Sea Hawk X I crash in Time Spy at 2025 MHz / 1.093 V, whether on the ASUS Strix BIOS or the original BIOS.


----------



## mtbiker033

Quote:


> Originally Posted by *schmak01*
> 
> Straight luck. I happened to refresh the site two minutes before the auto-notify fired and it was available to order. I had my confirmation before the auto-notify email. I was actually late to a meeting, but worth it.


/respect (you lucky bastard)!!!


----------



## lilchronic

Quote:


> Originally Posted by *KedarWolf*
> 
> Tried Zotac BIOS, power limiting under 100% power limit, some memory artifacts and Time Spy crashes at 1.093v 2088 core and .993v 2025 core both which work on stock and Asus BIOS.


Same problem I had with the Asus Strix BIOS on the EVGA FE... might have to try that BIOS.


----------



## mshagg

Quote:


> Originally Posted by *SlimJ87D*
> 
> why haven't you tried it and grow a pair


I'm not at home!


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *GRABibus*
> 
> Did someone test it ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Why are you guys asking? If you're home just test it yourselves and if it doesn't work flash back.

I tried all the aftermarket BIOSes available in the TPU BIOS database and can also confirm the Asus Strix OC one worked the best, but unfortunately it still crashes into the power limit even if I stay under 1.093 V in Afterburner. It's a bit less than with the original Founders Edition BIOS my card came with, but this card still needs a shunt mod. Nice video though, really helpful if someone has not done this before.

That's a comment on my YouTube, and it's not the first time I have heard the Asus BIOS is better.


----------



## gmpotu

Quote:


> Originally Posted by *KedarWolf*
> 
> I tried all the aftermarket bioses available in TPU bios database and can also confirm asus strix OC one worked the best, but unfortunately it still crashes the powerlimit even if I stay under 1.093v in afterburner. Its abit less than the original founders edition bios that my card came with but this card still needs a shuntmod.
> Nice video tho really helpful if someone has not done this before.﻿
> comment on my YouTube, and not the first Asus BIOS I have heard is better.


Which video are you referring to?


----------



## KedarWolf

Quote:


> Originally Posted by *gmpotu*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I tried all the aftermarket bioses available in TPU bios database and can also confirm asus strix OC one worked the best, but unfortunately it still crashes the powerlimit even if I stay under 1.093v in afterburner. Its abit less than the original founders edition bios that my card came with but this card still needs a shuntmod.
> Nice video tho really helpful if someone has not done this before.﻿
> comment on my YouTube, and not the first Asus BIOS I have heard is better.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Which video are you referring to?


----------



## stoker

Quote:


> Originally Posted by *Slackaveli*
> 
> The 5775C would help. At some point we are probably RAM bottlenecked, because running a 5775C and a 1080 Ti is a lot for DDR3 to handle, I think. It does well, though. Check out what I finally figured out. I guess I'm just slow, lol. Here is a two-map full playthrough of BF1. I'll just copy my post from the Aorus forum.
> 
> EUREKA. Guys, REMEMBER, I am on air, too. I did stick a heatsink on my copper plate and gained 5C somehow, lol. That and the thermal pasting. Also, I will admit my computer room's window was open for this 30-minute session, but it's only 72F in the house so not really cheating there.
> 
> Here is my run in Battlefield 1, two maps, ZERO PERFCAPS. I take back what I've been screaming about voltage, with caveats: first, know all your stable clocks at all voltages well before you mess with it. Also, these still hate 1.093 V (mine does) and aren't all that great with 1.08 either, but if I run a +90 curve (which would be +179 on a Founders Edition) and +51 mV and lock as you see in the pic, it doesn't crash. Gave me my highest Superposition score, too. And I just had 142 locked in Battlefield 1, all Ultra except post-process at medium, because I think it looks better and gives me a couple fps. I was 142 locked; it only dropped into the 130s while in full sprint on a heavy 64-man multiplayer map, and as soon as I stopped running it was 142 locked again. NO PERF CAPS AT ALL.
> 
> Look at those temps, that clock. Aorus FTW!!!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Good luck, guys, it took me awhile to figure this out.


Hey Slackaveli can you do a test on St Quentin scar 64P? Let us know your minimums between F & E flag.

Interested to know, Cheers


----------



## KedarWolf

Okay peeps, you need to add your card info to the form at the bottom of the OP, even if we had it before. More info, better OP.









Shoutout to @joder for making the form and updating the spreadsheet!!


----------



## Nico67

Quote:


> Originally Posted by *gmpotu*
> 
> 
> 
> This means my card is not getting the voltage it needs to go higher on the core right?
> Would getting a new power supply fix this if my power supply is very old or do I just have a card that doesn't really do 2000+?
> Should I bother trying to up the core clock with this? My guess is not because this is saying it's not getting enough voltage to go higher than it is now?
> 
> I guess my two questions are....
> 
> 1) If my Performance Cap = Vrel what can I do to fix that?
> 2) If my Performance Cap = Pow what can I do to fix that?
> 
> Also last night I was seeing a new one. Performance Cap = Vop


Vrel and VOp are just info; they don't downclock.

Vrel means it wants more voltage at that clock; play with + volts.

VOp means it can't get any more voltage to clock higher, usually when you have the curve set to go up another bin after 1.093 V.

Pow is limiting and clocks down, and can only be fixed by a BIOS that gives more power and actually works properly with the card. Unfortunately there isn't an XOC unlimited BIOS like there was for the Strix 1080, at least not yet. The best you can do is limit the overclock and voltage to keep power spikes minimal. Games don't use as much power as some benchmarks, so test in the games you play.
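The reason codes above are what GPU-Z shows in its PerfCap column. As a cheat sheet, that advice can be boiled down to a tiny lookup (an illustrative helper only, not part of GPU-Z or any NVIDIA tool):

```python
# Cheat sheet for GPU-Z PerfCap reason codes; advice strings restate
# the explanation above and are illustrative only.
PERFCAP_ADVICE = {
    "Vrel": "Informational: card wants more voltage at this clock; try + volts.",
    "VOp":  "Informational: no more voltage available; curve bins above 1.093 V do nothing.",
    "Pow":  "Power limit hit: the card WILL downclock; trim the OC/voltage to cut spikes.",
    "Thrm": "Thermal limit hit: improve cooling or raise the fan curve.",
}

def explain_perfcap(reason):
    """Return advice for a GPU-Z PerfCap reason string."""
    return PERFCAP_ADVICE.get(reason, "Unknown reason: " + reason)
```

The practical takeaway matches the post: Vrel/VOp alone are harmless status flags, while Pow (and Thrm) are the ones that actually cost you clocks.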


----------



## Addsome

Quote:


> Originally Posted by *Addsome*
> 
> I just had a SYSTEM_SERVICE_EXCEPTION bluescreen. My CPU overclock has been stable for months and my GPU overclock has been benchmark/game stable for a couple days. Is this bluescreen related to my overclock or something else?
> 
> This happened while running the new Superposition benchmark. I ran it again after the bluescreen using same overclock settings and it ran fine.
> 
> Edit: Timespy just crashed on the CPU test. Would that mean my CPU OC is unstable?


Anyone?


----------



## KedarWolf

New video.


----------



## KedarWolf

Quote:


> Originally Posted by *Addsome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Addsome*
> 
> I just had a SYSTEM_SERVICE_EXCEPTION bluescreen. My CPU overclock has been stable for months and my GPU overclock has been benchmark/game stable for a couple days. Is this bluescreen related to my overclock or something else?
> 
> This happened while running the new Superposition benchmark. I ran it again after the bluescreen using same overclock settings and it ran fine.
> 
> Edit: Timespy just crashed on the CPU test. Would that mean my CPU OC is unstable?
> 
> 
> 
> Anyone?

Likely CPU or system memory, yes.


----------



## Dasboogieman

Quote:


> Originally Posted by *Slackaveli*
> 
> The fan has "fan stop" and the default curve kicks in at like 50c or something, i dont remember, i changed it immediately to kick off at 35c instead. Set up a fan curve that hits 100% at 49c for best effect. these are super quiet even at 100%, quieter than an FE at 40%. Look at my above post. I also stuck a finned aluminum heatsink on top of my backplake and connecting with the copper. It gets hot fast so it's working. Put an sp fan pointed right at it.
> 
> Also, a repaste of these is a must if you want great temps. Use thermal Grizzly Kryonaut and do the full coverage spread method.
> 
> If you look above I am at 32c idle and maxed at 45c in BF1 64man map, in 1440p 142fps Ultra.
> 
> *pic is trash, used my son's tablet and the lighting was terrible, it hates the LEDs i think. My phone is terrible so, best i can do on the quick. it' a 100mmx30mmx18mm or something like that. I can't find my fujipoly pads i had left so it's sittin there dry and working so it is definitely 'a thing'. i could paste it but that's messy. I dont feel like spending $20 for a thumb sized piece of fuji so im still trying to find my stash but no luck.


yeah dat copper plate on the Aorus really responds to cooling lol. Imma see if getting an H80i onto it changes the temps appreciably. The hard part is getting the block to fit.

I used a bunch of old HeGrease Extreme for the interface with my Intel HSF. I did ponder using Fujipoly, but thermal pads need consistent downward pressure for good contact, while gravity is enough for the grease.


----------



## CoreyL4

OK, I need help with this voltage curve thing. I read up on it and attempted to find my lowest voltage for 2012 MHz; I think I found it at about 1.050 V. So I set that and every other voltage past it to 2012 MHz and did some benchmarking. It instantly downclocks to 1999 MHz after a couple of seconds of Heaven.

Yesterday, when I had 2025 set to 1.093 V and 2012 set to the next three lower voltage levels, it stayed at 2012 for the entire duration of the benchmark.

What am I not getting and/or doing wrong?


----------



## phenom01

So after a few hours with my MSI, I seem to be rock solid at 2038/5940, with 70C max temps over about an hour of Heaven looping at 3440x1440 8xAA. It did push up to 115% power limit; my slider in MSI maxed out at 117%. Still getting the hang of GPU Boost 3.0. Does this seem like a decent card? Anything I may be missing?

Afterburner settings:
+100 core voltage
117% power limit (maxed)
+100 core
+425 mem (instacrash at 500... 15 min crash at 450)


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> just the right curve settings to allow me to add some volts with out crashing but not so much that I just slam into the power cap. Just finally found a sweet spot. I can now game at a steady 2088 in BF1, and at sub 50c temps, on air....


Ah, OK. And "Aorus FTW" only if you don't get a broken card/chip with them :/

So I will mess with voltage curves on my Strix then.

What voltage bin do you recommend starting from to figure out max stable clocks before moving up? Should I start from 1.000 V and go up?


----------



## gmpotu

Quote:


> Originally Posted by *Nico67*
> 
> Vrel and VOp are just info; they don't downclock.
> 
> Vrel means it wants more voltage at that clock; play with + volts.
> 
> VOp means it can't get any more voltage to clock higher, usually when you have the curve set to go up another bin after 1.093 V.
> 
> Pow is limiting and clocks down, and can only be fixed by a BIOS that gives more power and actually works properly with the card. Unfortunately there isn't an XOC unlimited BIOS like there was for the Strix 1080, at least not yet. The best you can do is limit the overclock and voltage to keep power spikes minimal. Games don't use as much power as some benchmarks, so test in the games you play.


Thanks for going over this in detail. I didn't know that only power limiting will downclock. I also didn't know the relationship with VOp (PS: I did have points on the curve above 1.093 V).

My voltage % is at +100, so I don't think I can go up more there.

Q: So since my system is not crashing, should I keep increasing the boost clock a small amount and testing to see if I can push it more, even though it's showing Vrel? Is it possible I might get lucky creating a lower voltage curve at like 1.012 V and maybe lowering my clock just a little?


----------



## madmeatballs

Quote:


> Originally Posted by *chibi*
> 
> Shoggy has advised that since the 1080 Ti pcb does not have memory chips on the backside, there's no need for pads.... Duh, lol. Brain fart on my end there.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> There's a small thermal pad for the VRM though.


lol! Anyway, show us some photos of that sweet block. Haha! Also, some temp results if you can. Can't wait for mine.


----------



## Luckbad

Quote:


> Originally Posted by *Luckbad*
> 
> Update on Zotac 1080 Ti Amp Extreme fans being hardcoded to stay at 0% until 45 C:
> 
> If I do not use any kind of fan control in Afterburner or FireStorm, it doesn't do the weird 100% fan thing at 46 C. It still stays at 0% until then, but it just spins them up to a reasonable and effectively silent RPM.
> 
> Long story short, do not try to use a custom fan profile for the Zotac 1080 Ti Amp Extreme or you'll get constantly angry with the fan.
> 
> The temperatures with the default profile are outstanding and it doesn't actually need a custom profile. I just wanted to keep the fans on at all times so the temperature could be minimized, but it seems that isn't an option.


Okay, finally got this thing figured out. There's a hard-to-find arrow that brings you to a place where you can disable the 0% fan speed functionality.

You can only do this in Zotac Firestorm, which you'll need for this as well as changing or turning off the LEDs on the physical card. Note also that I've only been able to adjust voltage for the card via Firestorm. Doing the standard tricks in Afterburner have yielded nothing (yep, even editing the oem2 file and unlocking it in settings).

To solve the issue:
1) Click "Spectra"
2) Click the little down arrow on the right side of that UI
3) Disable the Fan Stop Setting (aka FREEZE Fan Stop)

Now, you can manipulate the fan speeds in your preferred software without getting the blast of 100% fan speed every time the card exceeds 45 C.


----------



## Slackaveli

Quote:


> Originally Posted by *stoker*
> 
> Hey Slackaveli can you do a test on St Quentin scar 64P? Let us know your minimums between F & E flag.
> Interested to know, Cheers


Sure thing. I was playing 64-man on the fortress map a few minutes ago and I did see some 125 dips, but I am also running at 110% resolution now to utilize all of the Ti, so that's a factor. It's 1440p ultra at 110% res. I'll let you know in a bit what Scar is like between F and E.
Quote:


> Originally Posted by *Addsome*
> 
> Anyone?


CPUs can degrade over time; well, they DO degrade. You can just try to add another baby voltage bump of .010v or so to stay stable, or take it down 100MHz and reduce the voltage a bit to scale down, which should stop the degradation from being a problem.


----------



## Slackaveli

Quote:


> Originally Posted by *Dasboogieman*
> 
> yeah dat copper plate on the Aorus really responds to cooling lol. Imma see if getting a H80i on to it changes the temps appreciably. The hard part is getting the block to fit.
> 
> I used a bunch of old HeGrease Extreme for the interface with my Intel HSF. I did ponder using Fujiploy but thermal pads need consistent downward pressure for good contact while gravity is enough for the grease.


Good point on the pressure. I am getting a little bit of pressure from my CPU cooler tube. I was surprised that the heatsink got hot without pad or paste; the copper is ribbed and everything, so there isn't a ton of contact there (although decent contact on the backplate), but it definitely heats up and the fan is dissipating some of that heat. Curious to see what you get out of the H80. Were you the one who had the old Intel stock cooler running on top of the copper plate? That worked!


----------



## StephenP85

So, started taking off the FE cooler from one of my TIs to install my blocks. Butterfingers with the socket driver, and snapped off an SMD capacitor....



Oddly enough, the card still works. Played a game on it for about an hour at near full utilization, temps were fine. Haven't done anything beyond that.

Does anyone happen to know what that cap was for, and whether I might see problems in other aspects?


----------



## Slackaveli

Quote:


> Originally Posted by *Luckbad*
> 
> Okay, finally got this thing figured out. There's a hard-to-find arrow that brings you to a place where you can disable the 0% fan speed functionality.
> 
> You can only do this in Zotac Firestorm, which you'll need for this as well as changing or turning off the LEDs on the physical card. Note also that I've only been able to adjust voltage for the card via Firestorm. Doing the standard tricks in Afterburner have yielded nothing (yep, even editing the oem2 file and unlocking it in settings).
> 
> To solve the issue
> 1) Click "Spectra"
> 2) Click the little down arrow on the right side of that UI
> 3) Disable the Fan Stop Setting (aka FREEZE Fan Stop)
> 
> Now, you can manipulate the fan speeds in your preferred software without getting the blast of 100% fan speed every time the card exceeds 45 C.


I admit I'm a little bit of jelly of that card. I never would have dreamed Zotac would end up being a major player like they are. Those power boost caps are sexy, man! Hell of an engineering marvel.


----------



## Luckbad

Quote:


> Originally Posted by *Slackaveli*
> 
> I admit I'm a little bit of jelly of that card. I never would have dreamed Zotac would end up being a major player like they are. Those power boost caps are sexy, man! Hell of an engineering marvel.


It's an outstanding card. It's probably best for those who want to either install it and forget about it (because it has a great stock OC) or who want to overclock a bit without total control. The power boost and voltage regulation is handled in some unique way that makes it unlike any card I've tried to overclock with.

That said, its stock boost is 2037.5MHz. I'm only going to push it 50-75 past that for daily use. Memory is stock overclocked and I have it up another 500 to 6100 MHz (does that make it 12200 MHz?).

Just a great set-and-forget card.

A goal I had was to be able to hit 2GHz on the core, and it exceeded that out of the box.

I also wanted to break 10,000 in Superposition 4k, and it does that without me even optimizing for benchmarks. CPU (i7 6700k) is only running at 4.4, and I usually bump that to 4.6 for benching. Power options are running normal, no running full throttle for the bench. I'm using two monitors, not disabling my antivirus or other programs, and I can still go over 10000 no problem.
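To answer my own 12200 MHz question: Afterburner shows the GDDR5X clock at half the effective (per-pin) data rate, so 6100 shown should indeed mean 12.2 Gbps effective. A quick sanity-check sketch — the x2 reporting convention and the 1080 Ti's 352-bit bus are the only assumptions here:

```python
# Afterburner reports the GDDR5X clock at half the effective (per-pin)
# data rate, so doubling the reading gives the "marketing" number.
# Assumptions: the x2 reporting convention, and the 1080 Ti's 352-bit bus.

def effective_rate_mhz(reported_mhz: float) -> float:
    """Effective GDDR5X data rate from the Afterburner-reported clock."""
    return reported_mhz * 2

def bandwidth_gb_s(reported_mhz: float, bus_width_bits: int = 352) -> float:
    """Total memory bandwidth in GB/s at a given reported clock."""
    return effective_rate_mhz(reported_mhz) * 1e6 * (bus_width_bits / 8) / 1e9

print(effective_rate_mhz(6100))        # 12200.0 -> yes, "12200 MHz"
print(round(bandwidth_gb_s(6100), 1))  # 536.8 GB/s
```

So the +500 bump is also a healthy chunk of extra bandwidth over the ~484 GB/s stock figure.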


----------



## alucardis666

Quote:


> Originally Posted by *Luckbad*
> 
> *Its stock boost is 2037.5MHz*.
> I'm only going to push it 50-75 past that for daily use.


2037 outta the box huh?! That's impressive. I've had a 1080 Ti FE and TXp both refuse to do that with OC...









As for an update from me...

I hit a minor inconvenience... Got my Kraken X42 to replace my H100i V2 and try the two hybrid rads for the 1080 Tis in the front of my case.
But I didn't realize that my case doesn't support a rear 140mm fan, so the X42 is going back to Amazon. And since I think it'd be stupid as hell to have one card in my SLI setup water-cooled and one air-cooled, I'm also sending back the hybrid cooler I just bought, and I'll have a spare hybrid cooler on hand.

Which means, unfortunately, I'll have my SLI 1080 Tis on stock FE coolers. I could just build a water cooling loop for it all, but idk if I wanna do that, as I've never done it before and there are too many components at risk.

If anyone wants my used hybrid cooler (which I also bought used), I'll ship it to you for $20 + shipping.

Mind you this is just the rad and pump block, didn't come with the shroud when I bought it.


----------



## Benny89

Darn, I can order ZOTAC GeForce GTX 1080 Ti AMP Extreme....

Who got that card and can tell good things about it?







I heard you need to use specific software for it


----------



## GRABibus

Quote:


> Originally Posted by *KedarWolf*
> 
> 
> 
> 
> 
> New video.


Nice.
But even with my Sea Hawk, I am unable to reach such frequencies...

The best I can do is to add an offset of 55MHz on core in MSI AB.
Starting from 1544MHz, this brings the stock frequency to 1599MHz.

If I set +60MHz on core, I crash.

Best boost frequency in Time Spy is 2000MHz at 1.093V....









I don't really get why I can't get more, and why I am so quickly power limited with the GPU only at 48°C during Time Spy.


----------



## Luckbad

The Corsair website looks like it has stock of the MSI Sea Hawk (branded as the Corsair Hydro GFX 1080 Ti):

Hydro GFX GTX 1080 Ti Liquid Cooled Graphics Card
http://www.corsair.com/en-us/hydro-gfx-gtx-1080-ti-liquid-cooled-graphics-card


----------



## Luckbad

Quote:


> Originally Posted by *Benny89*
> 
> Darn, I can order ZOTAC GeForce GTX 1080 Ti AMP Extreme....
> 
> Who got that card and can tell good things about it?
> 
> 
> 
> 
> 
> 
> 
> I heard you need to use specific software for it


See my last few posts. I have one and it boosts to 2037.5 MHz out of the box.

I have two more on the way because I wanted to cherry pick, but I don't think that will be necessary given how good this one is.


----------



## Benny89

Quote:


> Originally Posted by *Luckbad*
> 
> See my last few posts. I have one and it boosts to 2037.5 MHz out of the box.
> 
> I have two more on the way because I wanted to cherry pick, but I don't think that will be necessary given how good this one is.


Nice. I will first test my new ASUS STRIX and then I will see if I don't order two more 1080 Tis (FTW3 and ZOTAC) to choose the best one.

But if I can get 2050MHz stable on the Asus, or even downclock to 2025, I will be OK with it. For gaming I don't need more; 30-40MHz either way won't make any difference in games. Good for benchmark scores, though.


----------



## CoreyL4

I am very disappointed in my gaming x. Can barely get 2012/5800 stable if that.... still testing.

Idk if I should return it or not.


----------



## Luckbad

Quote:


> Originally Posted by *Benny89*
> 
> Nice, I will first test my new ASUS STRIX and then I will see if I don't order two more 1080 Ti (FTW3 and ZOTAC) to chose better one.
> 
> But if I can get 2050 mhz on Asus stable, even downclock to 2025 I will be ok with it. For Gaming I don't need more. 30-40 mhz this or that way won't make any difference in games. Good for benchmark scores though


I got a dud Asus. It wasn't fully stable with its standard OC setting (was crashing in FurMark). They're built super well though. I was expecting to keep that one but I lost the silicon lottery with it.

I also have an EVGA FTW3 preordered at BH. Really like that company but it'll be hard to beat this Zotac.


----------



## Slackaveli

Quote:


> Originally Posted by *Luckbad*
> 
> It's an outstanding card. It's probably best for those who want to either install it and forget about it (because it has a great stock OC) or who want to overclock a bit without total control. The power boost and voltage regulation is handled in some unique way that makes it unlike any card I've tried to overclock with.
> 
> That said, its stock boost is 2037.5MHz. I'm only going to push it 50-75 past that for daily use. Memory is stock overclocked and I have it up another 500 to 6100 MHz (does that make it 12200 MHz?).
> 
> Just a great set-and-forget card.
> 
> A goal I had was to be able to hit 2GHz on the core, and it exceeded that out of the box.
> 
> I also wanted to break 10,000 in Superposition 4k, and it does that without me even optimizing for benchmarks. CPU (i7 6700k) is only running at 4.4, and I usually bump that to 4.6 for benching. Power options are running normal, no running full throttle for the bench. I'm using two monitors, not disabling my antivirus or other programs, and I can still go over 10000 no problem.


Oh, snap. So my 10,470 with dual monitors and antivirus on was better than I even thought. What other benchmark tricks do you know? I'm gonna have to max-bench again.


----------



## mshagg

Quote:


> Originally Posted by *StephenP85*
> 
> Oddly enough, the card still works. Played a game on it for about an hour at near full utilization, temps were fine. Haven't done anything beyond that.
> 
> Does anyone happen to know what that cap was for, and whether I might see problems in other aspects?


Lol, **** happens hey. I've done it on cards in the past. Obviously a lot of redundancy built in, or perhaps many of the components are just to clean up voltages. Hard to tell from that side of the card what area it has affected.

Quote:


> Originally Posted by *CoreyL4*
> 
> I am very disappointed in my gaming x. Can barely get 2012/5800 stable if that.... still testing.
> 
> Idk if I should return it or not.


Just curious, on what grounds would you return the card? It doesn't appear to be faulty. Surely this kind of thing doesn't help with the high prices we pay for hardware lol.


----------



## Slackaveli

Well, my all-out max appears to be 10,484. Not too damn bad. Picked up 30 points or so by running one monitor instead of two.


----------



## KingEngineRevUp

A satisfied feeling is starting to fill me now. I think I can finally leave our community and just game.

Quote:


> Originally Posted by *CoreyL4*
> 
> I am very disappointed in my gaming x. Can barely get 2012/5800 stable if that.... still testing.
> 
> Idk if I should return it or not.


At the end of the day, you're only missing out on 1 or 2 FPS. So why bother wasting time returning or even possibly paying a restocking fee? Yeah, being on these boards and seeing people have some pretty high nice clocks can cause you to be insecure, but come on now, is it worth the hassle, time and money because you're missing out on 1-2 FPS?


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> I satisfied feeling is starting to fill into me now. I think I can finally leave our community and just game.
> 
> S
> At the end of the day, you're only missing out on 1 or 2 FPS. So why bother wasting time returning or even possibly paying a restocking fee? Yeah, being on these boards and seeing people have some pretty high nice clocks can cause you to be insecure, but come on now, is it worth the hassle, time and money because you're missing out on 1-2 FPS?


Well, I also have that satisfaction and it's great but don't leave completely. Pretty cool community on this site, and this thread.

it's really all about not wanting to be the guy with the crap card. And if you are crashing in the 1900s or something, i totally get that. I mean, i personally returned an FE, so I cant talk. But it couldnt hold 2000, and i paid an extra $70 in tax that I got back. I would have been happy if the Aorus could just hold 2000 and be sexier and quieter and I got change back. I got super lucky and it's a 2076/2088 card. Ok. But did I gain an appreciable gaming difference? yeah, the noise of the cooler. 10 degrees cooler. It looks cooler. But, that's about it. 2 or 3 fps, maybe.
Quote:


> Originally Posted by *CoreyL4*
> 
> I am very disappointed in my gaming x. Can barely get 2012/5800 stable if that.... still testing.
> 
> Idk if I should return it or not.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> Well, I also have that satisfaction and it's great but don't leave completely. Pretty cool community on this site, and this thread.
> 
> it's really all about not wanting to be the guy with the crap card. And if you are crashing in the 1900s or something, i totally get that. I mean, i personally returned an FE, so I cant talk. But it couldnt hold 2000, and i paid an extra $70 in tax that I got back. I would have been happy if the Aorus could just hold 2000 and be sexier and quieter and I got change back. I got super lucky and it's a 2076/2088 card. Ok. But did I gain an appreciable gaming difference? yeah, the noise of the cooler. 10 degrees cooler. It looks cooler. But, that's about it. 2 or 3 fps, maybe.


Oh, I didn't mean I'd leave completely, but I think I can stop checking it every few hours to see anything major on these boards. Lol.

But I think if he applies Kryonaut to his card and gets 5 to 10C lower, he can bin another 13MHz from boost and hopefully be happy.


----------



## Talon2016

Quote:


> Originally Posted by *CoreyL4*
> 
> I am very disappointed in my gaming x. Can barely get 2012/5800 stable if that.... still testing.
> 
> Idk if I should return it or not.


Just got my MSI Gaming X 1080 Ti today from Frys, and with a 100MHz offset, power at 117%, and fans at 70%, I'm getting great results. Sits around 2032-2050MHz in game. Haven't touched the voltage yet, and I can't figure out why I can't select 120% like reviews showed.

Temps on this card are insanely low. With 70% fan I'm topping out around 48-50C overclocked.


----------



## ttnuagmada

Quote:


> Originally Posted by *CoreyL4*
> 
> I am very disappointed in my gaming x. Can barely get 2012/5800 stable if that.... still testing.
> 
> Idk if I should return it or not.


That seems strange. I have the Armor OCs, which use the same PCB, and both of them can do 2050/6000. They even seemed to be stable at 2112/6000 when I was gaming with them. Power limit and temps are the main limitations with mine in particular.


----------



## ttnuagmada

Quote:


> Originally Posted by *Talon2016*
> 
> Just got my MSI Gaming X 1080 Ti today from Frys, and with a 100MHz offset, power at 117%, and fans at 70%, I'm getting great results. Sits around 2032-2050MHz in game. Haven't touched the voltage yet, and I can't figure out why I can't select 120% like reviews showed.
> 
> Temps on this card are insanely low. With 70% fan I'm topping out around 48-50C overclocked.


118% puts these cards at a 330W TDP (based off of a 280W max at 100%), so that's why they don't go all the way up to 120%.


----------



## Fieldsweeper

How do you guys overclock the card so it's only using the extra power while playing games, rather than blasting 100% all the time, or only when benchmarking?


----------



## illypso

Just received my second SLI bridge. Although the GeForce alarm is gone, it did not make a real difference in 1080p or 4K bench tests.

Both test with all fan set to max, with a preheating run to get same temps.

was 12573
http://www.3dmark.com/3dm/19422600?

is now 12601
http://www.3dmark.com/3dm/19422753?

Within margin of error.

Maybe less stutter but there was like none already...

Is there anyone with a Firestrike Ultra benchmark on a 5775C with two GTX 1080 Tis clocked at around 2038 to compare? (I did have some intermittent throttling in bench tests due to the heat of my second, non-watercooled card; the package is coming next week.)
I would like to see if I should buy that processor or keep the 4790K clocked at 4.9 (will try 5.0). It crashes the second I start Prime95 at 4.7, but it runs everything at 4.9 flawlessly... so I don't use Prime95 anymore lolll

Thank you

Edit: just overclocked it to 5.0 and got #1 in my category, nice
12975
http://www.3dmark.com/fs/12396804

Edit: just tried 5.1 at 1.4V, did not work; Windows froze after boot. I'm staying at 5.0 at 1.4V to see if it's stable in a long game. I don't want to go higher than 1.4V... for now...

edit: BSOD at 5.0 in game, back to 4.9


----------



## leequan2009

Quote:


> Originally Posted by *SlimJ87D*
> 
> 1080TiMSISeaHawk.zip 155k .zip file


Is this the 1080 Ti Sea Hawk BIOS? Can I use it on a 1080 Ti FE? I see they both have 6+8 pin connectors and a 250W TDP.


----------



## Nico67

Quote:


> Originally Posted by *gmpotu*
> 
> Thanks for going over this in detail I didn't know only power limiting will down clock. Also didn't know the relationship of Vop (ps, I did have points on the curve above 1.093)
> 
> My Voltage % is at +100 so I don't think I can go up more there.
> 
> Q:So since my system is not crashing should I keep increasing boost clock small amount and test to see if I can push it more... even though it's showing Vrel? Is it possible that I might get lucky creating a lower voltage curve at like 1.012v and maybe lower my clock just a little.


Yeah, only Temp and Power will downclock, and if you have it on +100 then you can use bins up to 1.093v, but I wouldn't.







For Vrel, it may give better scores if you drop the first/highest point of the flat voltage line down one bin. So if it was flat from 1.050v at 2050, say, then drop 1.050 down to 2037 so it now clocks up to 2050 at 1.062.
The best bet is to individually overclock-test each voltage point from 1.000v up to say 1.075v and make a curve based on those clocks. You should be able to raise each point until it locks up, then drop down one freq bin, Ctrl+D to reset the curve, then do the next voltage point. This way, if it does limit and drop down a bin, it's going as fast as it can at each downclock and giving you the best score it can. Once you start hitting green in games, flat-line from that voltage.
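If it helps to see the logic laid out, here's a rough sketch of that per-point process in Python. Purely illustrative: there is no Afterburner API for this, so every "stability test" below stands in for you running a benchmark by hand; the clock numbers and the fake stability model are made up.

```python
# Rough sketch of the manual per-voltage-point curve binning described above.
# 'test_stable' stands in for "run your benchmark and watch for a crash".
BIN_MHZ = 13  # Pascal clocks move in ~13 MHz bins

def find_max_clock(test_stable, start_mhz, limit_mhz=2200):
    """Raise the clock one bin at a time until a bump fails, then stop."""
    clk = start_mhz
    while clk + BIN_MHZ <= limit_mhz and test_stable(clk + BIN_MHZ):
        clk += BIN_MHZ
    return clk

def build_curve(voltage_points, stable_at, start_mhz=1900):
    """Test each voltage point independently (1.000v up) and record
    the best stable clock for it, as described above."""
    return {v: find_max_clock(lambda c, v=v: stable_at(v, c), start_mhz)
            for v in voltage_points}

# Example with a made-up stability model (real testing is bench-by-hand):
fake_stable = lambda v, clk: clk <= 1900 + int(v * 150)
print(build_curve([1.000, 1.050, 1.062], fake_stable))
```

In practice each `test_stable` run is you benching, crashing, hitting Ctrl+D, and moving to the next point; the loop just captures the bookkeeping.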
Quote:


> Originally Posted by *CoreyL4*
> 
> Ok I need help with this voltage curve thing. Read up on it and attempted to find my lowest voltage for 2012mhz, I think I found it at like 1.050 etc. So I set that and every other voltage past that to 2012mhz and did some benchmarking. It instantly downclocks to 1999mhz after a couple seconds of heaven.
> 
> Yesterday when I had 2025 set to 1.093 volts and 2012 set to the next 3 lower volt levels it stayed at 2012 for the entire duration of the benchmark.
> 
> What am I not getting and/or doing wrong?


That's what GPU Boost is supposed to do; it may downclock if it can't feed enough voltage, so set it to +100 to ensure it can feed as per your curve. Various temps and power can affect it too.


----------



## Slackaveli

Quote:


> Originally Posted by *Fieldsweeper*
> 
> How do you guys overclock the card so it's only using the extra power while playing games, rather than blasting 100% all the time, or only when benchmarking?


Optimal power setting in the NVIDIA control panel and at least two different Afterburner profiles: one with locked voltage, one without. Use the locked one for gaming/benching and switch to the other profile for an instant downclock from 2088 to 240MHz (in my case). Full load is over 300W, and idling at those locked clocks still uses ~100W+, whereas idling at 240MHz on the optimal power setting is a 24W power draw from the GPU.
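If you're curious what that idle difference actually costs, here's a back-of-envelope sketch — the 100W/24W numbers are mine from above, and the $0.12/kWh rate and 8 idle hours a day are just assumed figures:

```python
# Back-of-envelope cost of idling at locked clocks vs. letting the
# Optimal Power setting downclock. Wattages are from the post above;
# the $0.12/kWh electricity rate and 8 h/day of idle are assumptions.
RATE_USD_PER_KWH = 0.12

def idle_cost_usd(watts, hours, rate=RATE_USD_PER_KWH):
    """Electricity cost of holding a given GPU power draw for some hours."""
    return watts / 1000 * hours * rate

hours_per_year = 8 * 365
locked = idle_cost_usd(100, hours_per_year)   # locked-voltage profile
optimal = idle_cost_usd(24, hours_per_year)   # ~240 MHz on Optimal Power
print(round(locked - optimal, 2))  # 26.63 -> ~$27/yr saved by switching
```

Not huge money, but free for a ten-second profile switch.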
Quote:


> Originally Posted by *illypso*
> 
> Just received my second SLI bridge. Although the geforce alarm is gone it did not made a real difference at 1080p or 4K benchtest
> 
> Both test with all fan set to max, with a preheating run to get same temps.
> 
> was 12573
> http://www.3dmark.com/3dm/19422600?
> 
> is now 12601
> http://www.3dmark.com/3dm/19422753?
> 
> Whthin margin of error.
> 
> Maybe less stutter but there was like none already...
> 
> Is there anyone with a benchmark of firestrike ultra on a 5775C with 2 gtx 1080ti clock at arround 2038 to compare (I did have some intermittent throttle in bench test due to the heat of my second non-watercooled card, package is coming next week)
> I would like to see if I should buy that processor or keep the 4790k clock a 4.9 (will try 5.0) it crash the second I start prime95 a 4.7 but it run everything at 4.9 flawless... so I don't use prime95 anymore lolll
> 
> Thank you
> 
> Edit, just overclock it to 5.0 and got #1 1 in my category nice
> 12975
> http://www.3dmark.com/fs/12396804http://www.3dmark.com/fs/12396804
> 
> Edit: just tried 5.1 at 1.4V did not work, windows freeze after boot I'm staying 5.0 at 1.4V to see if stable in long game. I don't want to go higher than 1.4V... for now...
> 
> edit: BSOD at 5.0 in game, back to 4.9


I would, but I only have one 1080 Ti. I'll say this: everyone who gets a 5775C loves it and doesn't regret it, even those who have 5.0GHz 4790Ks.
Quote:


> Originally Posted by *Nico67*
> 
> Yeah only Temp and Power will down clk, and if you have it on +100 then you can use bins up to 1.093v but I wouldn't
> 
> 
> 
> 
> 
> 
> 
> For Vrel, it may give better scores if you drop the highest first point of flat voltage line down one. So if it was flat from 1.050v at 2050 say then drop 1.050 down to 2037 so it now clks up to 2050 at 1.062.
> The best better is to individually overclock test each voltage point from 1.000v up to say 1.075v and make a curve based on those clks. You should be able to raise each point til it locks up and then drop down of freq bin, cntrl+d to reset curve, then do next voltage point. This way if it does limit and drop down a bin then its going as fast as it can each down clk and giving you the best score it can. once you start hitting green in games then flat line from that voltage.
> That's what GPU boost is supposed to do, it may down clock if it can't feed enough voltage so set it to +100 to ensure it can feed as per your curve. various temps and power can effect it too.


Well done, dude. That's how you fatten up the curve right there. Tedious, but the most effective method.


----------



## Slackaveli

.dp


----------



## Slackaveli

Quote:


> Originally Posted by *CoreyL4*
> 
> I am very disappointed in my gaming x. Can barely get 2012/5800 stable if that.... still testing.
> 
> Idk if I should return it or not.


You gotta give us more to go on. What settings? Voltage? How much? What tests? How does it crash? Is your CPU OC'd? Is it for sure stable? Have you tried the curve? Did you lock it? What about at 1.043 or 1.05v; what does it crash at at those voltages?

I bet that card is fine. In fact, 2012/2025 is at or above average OC on these. they have 1000 more cores than a 1080, remember that. Seeing all these repeat posts of the same dozen guys who managed to get 2100 to stick can be disheartening. But look at how many cards some of us run through. Look at all the extra paste and heatsinks and shunt mods and soldered wires and a few burned out cards; and bios mods and time spent benching/testing/forum posting, returning and rebuying and reselling and tearing apart our rigs and draining loops again and again...

The guy who bought the Zotac and plugged it in and was running 2037 right off the bat with 2 minutes involved in the whole process is 100% the winner in this whole thread. Luckily I enjoy the whole process, but it is work, no doubt about it. I.T. work for fun.

Keep your chin up. That GPU smokes any you've ever had by A LOT. It's probably better than the GPU of anybody you know in real life. It's probably overkill for your current monitor anyway. It games silently and it doesn't run hot. It's all good, brother. We are here to help if you really need to see 2037 or 2050 to be happy. I bet we can get you there at least.


----------



## gmpotu

Hmm not sure if I am missing a step.

When I run Time Spy I get the graph below. It's a rainbow graph. The system stays at the lowest binned voltage in the flat line (1.012v) to maintain the OC. However, from what I can tell, the rainbow means it wants more than that lowest binned voltage for the requested frequency (2050, I think). So shouldn't it just jump up the voltage, since there are other binned voltage values available? Like move up to 1.025 and keep going up if it needs more voltage? The interesting thing is that it does keep the core frequency one bin below what I set, like has been discussed in the video example. So it runs at 2037 for the whole Time Spy run. I think it failed when I was testing the same clocks at 0.993v, before I adjusted my curve.

So do I just ignore the Rainbow colors as long as it's passing the benchmark and not downclocking below what I have set and not giving me worse performance in the benchmarks? If yes I assume I'd then follow @Nico67's advice and test to see what stable OC I can get at each Voltage Bin without a failure and then readjust the curve on each voltage bin to the best stable clocks that don't downclock me?

I watched the great video that KedarWolf made and I setup the voltage curve below (Thanks Kedar for the video)
I have not tested each point until it crashes yet @Nico67, however thanks for the tip. I will try testing all the points tomorrow after work.


----------



## Nico67

Quote:


> Originally Posted by *gmpotu*
> 
> Hmm not sure if I am missing a step.
> 
> So do I just ignore the Rainbow colors as long as it's passing the benchmark and not downclocking below what I have set and not giving me worse performance in the benchmarks? If yes I assume I'd then follow @Nico67's advice and test to see what stable OC I can get at each Voltage Bin without a failure and then readjust the curve on each voltage bin to the best stable clocks that don't downclock me?


Close and reopen GPU-Z; I think it can bug out and show the rainbow (it did that for me once too). But also default the curve using Ctrl+D and start testing at just 1.000v. Ignore the score for now, or you're just chasing your tail, basically.







All you're interested in is stability with no green at that voltage, and don't touch any voltage points below it, as they don't matter.
It is tedious, as Slackaveli rightly noted, but it's the same principle as overclocking a CPU: pick a voltage and see what the max clock is, then raise the voltage a step and see how much more you can get. 1.000v stable is likely a good point for all cards not to get green in most benchmarks. Tests like SuPo 4K are designed to ruin your day







, but games will be fine.
My gaming curve is default FE curve, with 1.050 raised to 2101 and flat after and its fine because the games I have been playing barely reach 120% if ever.


----------



## Slackaveli

Quote:


> Originally Posted by *gmpotu*
> 
> Hmm not sure if I am missing a step.
> 
> When I run Time Spy I get this graph below. It's a rainbow graph. The system stays at the lowest binned voltage in the flat line (1.012v) to maintain the OC, However from what I can tell the rainbow means it wants more than that lowest binned voltage for that requested frequency (2050 I think). So shouldn't it just jump up the voltage since there are other binned voltage values available? Like move up to 1.025 and keep going up if it needs more voltage? The interesting thing is that it does keep the core frequency one below what I set like has been discussed in the video example. So it runs at 2037 for the whole Time Spy run. I think it failed when I was testing the same clocks at (0.993v) before I adjusted my curve.
> 
> So do I just ignore the Rainbow colors as long as it's passing the benchmark and not downclocking below what I have set and not giving me worse performance in the benchmarks? If yes I assume I'd then follow @Nico67's advice and test to see what stable OC I can get at each Voltage Bin without a failure and then readjust the curve on each voltage bin to the best stable clocks that don't downclock me?
> 
> I watched the great video that KedarWolf made and I setup the voltage curve below (Thanks Kedar for the video)
> I have not tested each point until it crashes yet @Nico67, however thanks for the tip. I will try testing all the points tomorrow after work.


Yeah, do it the tried and true way and you'll know your card very well. If you push it more and it crashes, you'll know it so well you'll say, "Damn, I knew I pushed 1.043v a little too hard." Just knowing your limits is the key to maximizing your overclock.

That rainbow is weird. I only got it a few times. Try a baby voltage bump and lock it one higher up the curve, and see if it goes away or at least goes to just blue.


----------



## Nico67

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, do it the tried and true way and you'll know your card very well. If you push it more and it crashes, you'll know it so well you'll say, "Damn , I knew I pushed 1.043v a little too hard". Just knowing your limits is the key to maximizing your overclock.
> 
> That rainbow is weird. i only got it a few times. try a baby voltage bump and lock it one higher up the curve and see if it goes away or at least goes to just blue.


Exactly







it takes patience, but it's the only way to get a good baseline for what your card can do


----------



## koven

Anyone know when the Seahawk EK X is coming out?


----------



## ocCuS

Quote:


> Originally Posted by *ocCuS*
> 
> I think I finally found my future darling
> 
> https://videocardz.com/68741/galax-geforce-gtx-1080-ti-hof-features-163-phase-design
> 
> three 8-pin power connectors and 16+3 phase design


The 1080ti HOF is now online for pre-order on overclockers.co.uk

https://www.overclockers.co.uk/kfa2-geforce-gtx-1080ti-hof-11264mb-gddr5x-pci-express-graphics-card-gx-09g-kf.html


----------



## viz0id

If i run Time Spy and get Vrel,VOp and Pwr, but the clockspeed and voltage stay constant through the whole thing. Does it really matter?

I'm running 2025 at 0.9810v. So are they just telling me that I'm holding my card "back"? Or is there something I'm not understanding here?

Edit: Nvm ended up running this as my daily OC on Asus Strix


----------



## DStealth

Quote:


> Originally Posted by *ocCuS*
> 
> The 1080ti HOF is now online for pre-order on overclockers.co.uk
> 
> https://www.overclockers.co.uk/kfa2-geforce-gtx-1080ti-hof-11264mb-gddr5x-pci-express-graphics-card-gx-09g-kf.html


Holy








830 British Pounds equals 1,062.90 US Dollars.


----------



## shalafi

Quote:


> Originally Posted by *schmak01*
> 
> Got some Reference cards installed with the EVGA Titan Xp/1080 Ti Hybrid kits, doing pretty darn well right now, haven't had time to really try and OC them yet, but I have them running at 2050 Mhz. Having a heaven just run constantly for about 2 hours, the highest temp I got was 42C on the bottom card, only thing I might change at this time is the bottom fan, currently a corsair 120 SP, probably should have an AF there for just a bit more VRAM cooling, if I go and OC it. The LED lights work great with the EVGA controller software, although I wish there were a few more options. That is the new Hight Bandwidth Bridge as well.


Can you please post temps during a Firestrike Extreme/Ultra stress test?


----------



## alpsie

Has anyone installed an AIO on any of these cards?
I'm curious what temp drops they have seen, and whether anyone has a comparison between a small AIO (120) and a larger AIO (240).

What sort of bracket was used, or did you just mod the stock Founders Edition cooler?


----------



## fisher6

Don't ask me how I know but there WILL be a waterblock coming for the 1080 Ti HOF


----------



## ocCuS

Quote:


> Originally Posted by *fisher6*
> 
> Don't ask me how I know but there WILL be a waterblock coming for the 1080 Ti HOF


all I wanted to hear


----------



## Dasboogieman

Quote:


> Originally Posted by *Addsome*
> 
> Anyone?


what model of CPU, what cooling, what voltage and what workloads were you running?


----------



## fisher6

Quote:


> Originally Posted by *ocCuS*
> 
> all I wanted to hear


Yup, I will be selling my EVGA FE card and getting the HOF with a waterblock once it's in stock here. Might not even overclock more but I love the card itself.


----------



## TheParisHeaton

Block from Bitspower?


----------



## fisher6

Quote:


> Originally Posted by *TheParisHeaton*
> 
> Block from Bitspower?


Yes Bitspower is usually the one that makes blocks for the HOF cards and it will be the same this time.


----------



## optimus002

So, I bought some heatsinks, but they were too big when used with the G10 bracket, lol, so I decided to just leave the heatsinks off the VRAM, leave the speeds at stock, and have 2 fans blowing over the VRMs (1 on the card and another case fan blowing directly at the card). I plan on leaving it as is for at least a couple of years, or until the next Ti card is available. I have a moderate OC, +150 on the GPU, and temps stay around 60C during gaming. Do you guys think it'll be OK for prolonged use?


----------



## OneCosmic

Did anybody already try the ZOTAC Amp Extreme BIOS on the FE? https://www.techpowerup.com/vgabios/191335/zotac-gtx1080ti-11264-170330
It has a 320/384W power limit, which looks nice if it works on the FE.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *optimus002*
> 
> So, I had bought some heatsinks, but they were too big when used with G10 bracket lol, so I decided to just leave off heatsinks on the VRAM, leaving the speeds at stock and have 2 fans blowing (1 on the card and another case fan blowing directly on the card) over the VRM's. I plan on leaving it as is for at least couple years, or whenever the next Ti card is available. I have a moderate oc, +150 on the gpu and temps stay around 60 during gaming. Do you guys think it'll be ok for prolonged use?


Just buy smaller low profile heatsinks.
Quote:


> Originally Posted by *OneCosmic*
> 
> Did anybody already try ZOTAC Amp Extreme BIOS on FE? https://www.techpowerup.com/vgabios/191335/zotac-gtx1080ti-11264-170330
> It has 320/384W Power Limit, that looks nice if it can work on FE.


Doesn't work


----------



## cugno87

Quote:


> Originally Posted by *OneCosmic*
> 
> Did anybody already try ZOTAC Amp Extreme BIOS on FE? https://www.techpowerup.com/vgabios/191335/zotac-gtx1080ti-11264-170330
> It has 320/384W Power Limit, that looks nice if it can work on FE.


I did... no improvements.


----------



## ChickenInferno

Just ordered the Aorus Xtreme edition...my wife is going to kill me.


----------



## BrainSplatter

Quote:


> Originally Posted by *ChickenInferno*
> 
> Just ordered the Aorus Xtreme edition...my wife is going to kill me.


Time for the last will and a random pick from this forum to bequeath the card to!


----------



## kevindd992002

Quote:


> Originally Posted by *Addsome*
> 
> Heres a link to the forum post:
> http://forums.blurbusters.com/viewtopic.php?t=3073
> 
> Also the Gsync image in this thread is outdated, here is the updated image from that post.


Thanks. Now it makes sense to set VSYNC to ON (either in-game or in NVCP) even when using an fps limiter just below your monitor's refresh rate.

Quote:


> Originally Posted by *Luckbad*
> 
> Yep. Before 45 C, the fans are at 0%. As soon as it hits 45, they rev to >100% then drop to whatever speed is set.
> 
> It's unbearably annoying because during normal PC use, if you have the card overclocked, it reaches 45 C every few minutes, revs up, gets down to 40, repeats.


Quote:


> Originally Posted by *Luckbad*
> 
> Okay, finally got this thing figured out. There's a hard-to-find arrow that brings you to a place where you can disable the 0% fan speed functionality.
> 
> You can only do this in Zotac Firestorm, which you'll need for this as well as changing or turning off the LEDs on the physical card. Note also that I've only been able to adjust voltage for the card via Firestorm. Doing the standard tricks in Afterburner have yielded nothing (yep, even editing the oem2 file and unlocking it in settings).
> 
> To solve the issue
> 1) Click "Spectra"
> 2) Click the little down arrow on the right side of that UI
> 3) Disable the Fan Stop Setting (aka FREEZE Fan Stop)
> 
> Now, you can manipulate the fan speeds in your preferred software without getting the blast of 100% fan speed every time the card exceeds 45 C.


I'm really surprised Zotac still has problems with their fans regarding PWM. There was this fluctuation issue near the temp threshold with the 1070 explained here: http://www.overclock.net/t/1613126/gtx-1070-amp-extreme-owners#post_25567515 . I bet your 1080Ti also does this. Not a major issue but certainly a nuisance.


----------



## DStealth

Quote:


> Originally Posted by *ChickenInferno*
> 
> Just ordered the Aorus Xtreme edition...my wife is going to kill me.


Ask me what mine did when she understood that I'd burned the video card a day after the purchase, and in my country it cost 1000 USD... just to give you an idea, the average monthly salary for the whole country is $300-350.


----------



## Benny89

Quote:


> Originally Posted by *DStealth*
> 
> Ask me what mine did when she understood that I burned the video a day after the purchase and in my country it cost 1000USD...just to give you the idea...the average monthly salary for the whole country is 300-350$


Eastern Europe too?

My wife just looked at my order while standing behind me. Seeing that I had ordered 3x 1080 Ti cards from 3 different retailers, she just said "well, as long as we don't starve..."


----------



## s1rrah

Quote:


> Originally Posted by *Benny89*
> 
> Eastern Europe too?
> 
> 
> 
> 
> 
> 
> 
> My wifre just looked into my order when standing behind me. Seeing that I have ordered 3x 1080 Ti cards from 3 differente retailers and just said "well, as long as we don't starve..."


LOL ...

Does she have a sister?


----------



## s1rrah

Does anybody know what thickness the thermal pads are under the aluminum cooling plate? Wouldn't they be either 0.5mm or 1.0mm? When I repaste my card I want to replace the pads with some Fujipoly 17.0 W/mK pads ...

Thanks for any info

joel


----------



## Benny89

Quote:


> Originally Posted by *s1rrah*
> 
> LOL ...
> 
> Does she have a sister?


No, she does not. Unique female example. Well, she is also glad because she gets my golden 980 Ti after me and will finally be able to max out Witcher 3, so overall she wants me to finally upgrade.


----------



## SuprUsrStan

So I realized I had the frame limiter set to 58 when I first ran the benchmarks, and then I realized I still had G-Sync on, which capped some of the frames.

Now with those turned off, both 1080 Tis are pushed to the limit. Sadly, Superposition still seems to not work with SLI.

EDIT: Those were two FE 1080 Tis with EK blocks clocked to 2050MHz core / 5750 mem


----------



## Addsome

Quote:


> Originally Posted by *Dasboogieman*
> 
> what model of CPU, what cooling, what voltage and what workloads were you running?


i7 6700k, NH-D15 cooler, I was running [email protected] Initially I got a bluescreen during the Superposition benchmark; I restarted the PC and ran the benchmark again and it went through fine. Then I tried Time Spy and the program crashed during the CPU stress test. Then I upped the voltage to 1.37V and it seems fine, but I recently just bumped the voltage to 1.36 because I BSOD'd @ 1.35V even though that was stable for months. My chip can't be degrading this soon, can it? I do run a constant voltage though.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Addsome*
> 
> i7 6700k, NH-D15 cooler, I was running [email protected] Initially I got a bluescreen during Superposition benchmark, restarted PC and ran the benchmark again and it ran through it fine. Then i tried Timespy and the program crashed during the CPU stress test. Then i upped the voltage to 1.37V and it seems fine, but I recently just bumped the voltage to 1.36 because I BSOD @ 1.35V even though that was stable for months. My chip cant be degrading this soon can it? I do run a constant voltage tho.


Is there a reason why you didn't use adaptive voltage?


----------



## KedarWolf

Quote:


> Originally Posted by *Addsome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dasboogieman*
> 
> what model of CPU, what cooling, what voltage and what workloads were you running?
> 
> 
> 
> i7 6700k, NH-D15 cooler, I was running [email protected] Initially I got a bluescreen during Superposition benchmark, restarted PC and ran the benchmark again and it ran through it fine. Then i tried Timespy and the program crashed during the CPU stress test. Then i upped the voltage to 1.37V and it seems fine, but I recently just bumped the voltage to 1.36 because I BSOD @ 1.35V even though that was stable for months. My chip cant be degrading this soon can it? I do run a constant voltage tho.

I think it's ideal to run a 6700K under 1.3v or you might get some degradation. People do run it higher at times, but I'm pretty sure 1.3v is the recommended ceiling. And if you have your other voltages on Auto, your VCCIO (I think it is) may be running way too high; on Asus motherboards, Auto voltages have that bug.

It's one of the voltages that's supposed to default to 1.05v but on Auto runs at 1.24v or something, which is way too high. It is a problem.


----------



## tpain813

So that hybrid install was definitely worth it, and looking back it only took so long because the directions were so bad that I had to look everything up a few times before feeling confident enough to move forward. Currently rocking a stable 2GHz core once settled; the card isn't getting above 55 degrees C in my small case and I can't hear a thing. I may be able to get my core to settle at 2025, but I was getting crashes at 2050.

Have you guys seen improvements when OCing memory? I added 300 to my stock memory clocks, but didn't seem to notice any improvements in benchmarks, so I'm just running it at stock speeds.

edit: a sentence


----------



## Luckbad

Quote:


> Originally Posted by *kevindd992002*
> 
> I'm really surprised Zotac still has problems with their fans regarding PWM. There was this fluctuation issue near the temp threshold with the 1070 explained here: http://www.overclock.net/t/1613126/gtx-1070-amp-extreme-owners#post_25567515 . I bet your 1080Ti also does this. Not a major issue but certainly a nuisance.


It was mostly user error on my part. Leaving the fans alone, they would stay at 0% until 46 C, then they'd kick in at low RPM for a couple minutes until it hit 40 C again.

The fans are inaudible below 40% in my very quiet case, so I just wanted them on all the time. Once I found the setting in Firestorm to disable that zero fan function, everything works great.
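For anyone curious, the zero-RPM behavior is just simple temperature hysteresis. A minimal Python sketch of the logic, using the 46 C start / 40 C stop thresholds I observed (the function itself is my illustration, not Zotac's actual firmware):

```python
def fan_should_spin(temp_c, currently_spinning, start_temp=46.0, stop_temp=40.0):
    """Zero-RPM hysteresis: fans kick in at start_temp and keep spinning
    until the GPU cools back below stop_temp. Thresholds match what I
    observed on my card; the code is illustrative only."""
    if currently_spinning:
        return temp_c > stop_temp   # keep spinning until cooled past the lower bound
    return temp_c >= start_temp     # stay at 0% until the upper bound is hit

# The 6 C gap is what prevents rapid on/off cycling at a single threshold:
assert fan_should_spin(46, False)       # hits 46 C -> fans start
assert fan_should_spin(41, True)        # still warm -> keep spinning
assert not fan_should_spin(40, True)    # cooled back down -> 0% again
```

Disabling the Fan Stop setting in Firestorm effectively bypasses this loop so the fans never drop to 0%.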


----------



## KingEngineRevUp

According to this review, the SC2 bios should give you 315 watts on the FE.

This is similar to what I saw with the Inno3D and Asus bioses; I got roughly 10-15 more watts.

I cannot measure and test this anymore because I am shunt modded.

But I am using it, and it is shunt mod compatible since this bios is based on the reference design.

The Asus bios isn't compatible with the shunt mod. Inno3D crashes, as does the Zotac Amp Extreme. Gigabyte just plain doesn't work.

So far SC2 is the most compatible bios for me.
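For anyone wondering why a shunt mod defeats the power limit: stacking a resistor in parallel with the current-sense shunt lowers the voltage drop the card measures, so it under-reports its own draw. A rough Python sketch of the math (the 0.005 ohm values are illustrative, not measured from a 1080 Ti):

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def reported_power(actual_watts, r_stock=0.005, r_added=0.005):
    """A shunt mod stacks a resistor on top of the current-sense shunt.
    The card converts the (now smaller) voltage drop into a power reading,
    so it under-reports: reported = actual * R_new / R_stock.
    Resistor values here are illustrative only."""
    r_new = parallel(r_stock, r_added)
    return actual_watts * r_new / r_stock

# Stacking an equal-value resistor halves the reading: a real 400 W draw
# (like my modded card pulls) reports as ~200 W, comfortably under a
# 300 W bios power limit.
```

That's also why software power readings from a shunt-modded card can't be compared against stock numbers.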


----------



## TheBoom

Quote:


> Originally Posted by *Luckbad*
> 
> It was mostly user error on my part. Leaving the fans alone, they would stay at 0% until 46 C, then they'd kick in at low RPM for a couple minutes until it hit 40 C again.
> 
> The fans are inaudible below 40% in my very quiet case, so I just wanted them on all the time. Once I found the setting in Firestorm to disable that zero fan function, everything works great.


Good to know they don't have that annoying issue anymore. Just ordered one.

That was one of the initial reasons I didn't want to move from my 1070 Strix to the Amp Extreme version of the 1080 Ti.


----------



## jestermx6

I'm not sure I did this right. The info I put into the form in the OP is my max values from benchmarking. The GPU-Z validation doesn't line up with that, though, so I'm not sure what's going on there.

https://www.techpowerup.com/gpuz/z92zk


----------



## KingEngineRevUp

Quote:


> Originally Posted by *jestermx6*
> 
> i'm not sure i did this right. the info i put into the form in the OP is my max values from benchmarking. the GPU-z validation doesn't line up with that though so i'm not sure what's going on there.
> 
> https://www.techpowerup.com/gpuz/z92zk


GPU-Z is just proof that you own the card, not of your clocks.

Also, please post gaming clocks, not benchmark clocks


----------



## jestermx6

I haven't done much besides Diablo 3 so far. 180+ fps is pretty nice though lol


----------



## chibi

Quote:


> Originally Posted by *madmeatballs*
> 
> lol! Anyway, show us some photos of that sweet block. Haha! Also, some temp results if you can. Can't wait for mine.


No testing data yet, the build is still a work in progress (WIP) - my 1080 Ti still has the plastic protector on it.

The single slot bracket is nothing to write home about, a little rough around the edges and not quite as polished with the finish - but it serves its purpose. Here you go with pics!




Spoiler: Warning: Spoiler!


----------



## madmeatballs

Quote:


> Originally Posted by *chibi*
> 
> No testing data yet, the build is still a work in progress (wip) - my 1080 Ti still has the plastic protector on it.
> 
> 
> 
> 
> 
> 
> 
> 
> The single slot bracket is nothing to write home about, a little ruff around the edges and not quite as polished with the finish - but it serves it's purpose. Here you go with pics!
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Neat! I'll be getting mine on monday!


----------



## KedarWolf

Quote:


> Originally Posted by *jestermx6*
> 
> i'm not sure i did this right. the info i put into the form in the OP is my max values from benchmarking. the GPU-z validation doesn't line up with that though so i'm not sure what's going on there.
> 
> https://www.techpowerup.com/gpuz/z92zk


The GPU-Z link looks fine for me!!


----------



## eXteR

Quote:


> Originally Posted by *SlimJ87D*
> 
> According to this review, the SC2 bios should give you 315 watts on the FE.
> 
> This is similar to what I saw on the Inno3d and Asus bios. I roughly got 10-15 more watts.
> 
> I cannot measure and test this anymore because I am shunt modded.
> 
> But I am using it and it is shunt mod compatible since this bios is based off of reference design.
> 
> Asus bios isn't compatible with shunt mod. Inno3d crashes as well as zotac amp extreme. Gigabyte plain just doesn't work.
> 
> So far SC2 is the best compatible bios for me.


Where's that SC2 bios? Can't find it in the post or the TechPowerUp database.


----------



## ttnuagmada

I tried the Zotac bios on my Armor OC, and I got excited at first because I wasn't having the issues I had with the Asus bios, but it won't recognize any TDP adjustments over 100%, so it caps me at 320W vs 330W with the stock vbios.

Bummer.
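To put numbers on it: the power slider is just a percentage of the BIOS base TDP, clamped at whatever maximum the BIOS exposes. A quick sketch (the 250 W base / 132% ceiling for the stock Armor vbios are my back-calculation from the 330 W figure, not confirmed values):

```python
def power_limit_watts(base_tdp_w, slider_pct, max_pct):
    """Effective power limit: the slider scales the BIOS base TDP but is
    clamped to the BIOS's own maximum percentage. Values illustrative."""
    return base_tdp_w * min(slider_pct, max_pct) / 100.0

# Zotac bios on the Armor OC: 320 W base, but nothing above 100% sticks
print(power_limit_watts(320, 120, 100))  # 320.0
# Stock Armor vbios: assumed 250 W base with ~132% headroom -> 330 W
print(power_limit_watts(250, 150, 132))  # 330.0
```

So a bios with a higher base TDP can still end up with a lower ceiling if it refuses slider values above 100%.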


----------



## KedarWolf

Quote:


> Originally Posted by *eXteR*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SlimJ87D*
> 
> According to this review, the SC2 bios should give you 315 watts on the FE.
> 
> This is similar to what I saw on the Inno3d and Asus bios. I roughly got 10-15 more watts.
> 
> I cannot measure and test this anymore because I am shunt modded.
> 
> But I am using it and it is shunt mod compatible since this bios is based off of reference design.
> 
> Asus bios isn't compatible with shunt mod. Inno3d crashes as well as zotac amp extreme. Gigabyte plain just doesn't work.
> 
> So far SC2 is the best compatible bios for me.
> 
> 
> 
> Where's that SC2 bios? can't find on the post nor techpowerup database.

The SC2 BIOS crashed on me at the same settings where the Asus and stock BIOSes worked, no shunt mod.


----------



## CoreyL4

Quote:


> Originally Posted by *Slackaveli*
> 
> you gotta give us more to go on. what settings? voltage ? how much? what tests? how does it crash? is your cpu OC'd? Is it for sure stable? Have you tried the curve? Did you lock it? What about at 1.043 or 1.05v, what does it crash at at those voltages?
> 
> I bet that card is fine. In fact, 2012/2025 is at or above average OC on these. they have 1000 more cores than a 1080, remember that. Seeing all these repeat posts of the same dozen guys who managed to get 2100 to stick can be disheartening. But look at how many cards some of us run through. Look at all the extra paste and heatsinks and shunt mods and soldered wires and a few burned out cards; and bios mods and time spent benching/testing/forum posting, returning and rebuying and reselling and tearing apart our rigs and draining loops again and again...
> 
> The guy who bought the Zotac and plugged it in and was running 2037 right off the bat with 2 minutes involved in the whole process is 100% the winner in this whole thread. Luckily I enjoy the whole process, but it is work, no doubt about it. I.T. work for fun.
> 
> Keep your chin up. That GPU smokes any you've ever had by A LOT. It's probably better than anybody you know in real life's gpu. It's probably overkill for your current monitor anyway. It games silently and it doesnt run hot. It's all good, brother. We are here to help if you really need to see 2037 or 2050 to be happy. I bet we can get you there at least.


Thanks for the words of encouragement lol.

I'll try to answer your questions as best I can while at work and not at my computer. I tried voltage up to 1.093v. I made a custom curve.

About the curve: I did not lock it; I do not know what that does. My first day learning about the curve I set 2025 at 1.093v, and the next 3 or 4 voltage points to 2012. For the majority of runs, it would instantly downclock to 2012 and stay there throughout the entire Heaven benchmark. I read more the next day and learned to start at 1.000v and see where I can get 2025/2012 stable at the lowest voltage. I opted to try for 2012. I think it was 1.050 where 2012 seemed stable. Well, my problem was that when I set 1.05v and the rest of the voltage points after that flat at 2012, it would downclock pretty quickly in Heaven to 1999 and so forth, unlike the previous day where it would downclock from 2025 after about 20 seconds and stay at 2012 for the majority, if not all, of the Heaven benchmark run.

After getting discouraged by that I called it a night and made the previous post. I still think I don't understand the voltage curve and DEFINITELY need help.


----------



## gmpotu

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, do it the tried and true way and you'll know your card very well. If you push it more and it crashes, you'll know it so well you'll say, "Damn , I knew I pushed 1.043v a little too hard". Just knowing your limits is the key to maximizing your overclock.
> 
> That rainbow is weird. i only got it a few times. try a baby voltage bump and lock it one higher up the curve and see if it goes away or at least goes to just blue.


*What does "a baby voltage bump and lock it one higher up the curve" mean?*
Does this mean that if I currently have (1.012v @ 2037) and flatline above (1.012v), I should move to a lower clock like 1999 @ 1.012v and have the next voltage point at (1.025v @ 2037), i.e., try moving my curve up one voltage step?
Quote:


> Originally Posted by *Nico67*
> close and open GPU-Z, I think it can bug out and rainbow, did that for me once too, but also default the curve using cntrl+d and start testing just 1.000v, ignore score for now or your just chasing your tail basically smile.gif All your interested in is stability with no green at that voltage, and don't touch any voltage points below that as they don't matter.
> It is tedious as Slackaveli rightly noted, but its the same principle as overclocking a CPU, pick a voltage a see what the max clock is, the raise the voltage a step and see how much more you can get. 1.000v stable is likely a good point for all cards not to get green in most benchmarks. Tests like SuPo 4K are designed to ruin your day smile.gif, but games will be fine.
> My gaming curve is default FE curve, with 1.050 raised to 2101 and flat after and its fine because the games I have been playing barely reach 120% if ever.


When testing, should I make the curve flatline at everything above the voltage I'm testing? So, for example, flatline at 1.000v and above, then test.
Once I find the max stable clock, do I go to (1.012v) and flatline everything above (1.012v) to find the highest stable clock there? Etc., etc.?

I'm kind of repeating questions, but I want to make sure that I'm understanding correctly.

Also, if I don't have a 4K monitor (only 1080p), can I still run the 4K tests with SuPo and other benches? I'm assuming the answer would be NO, that it doesn't somehow virtually apply a 4K resolution in a sandbox environment.

ps. That rainbow happens to me a lot.

What is locking/unlocking the curve, and how do you "lock" the curve?
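If it helps to state it in code, here's my understanding of the flatten-and-test method as a Python sketch (the curve numbers are made up; the real tool is Afterburner's curve editor):

```python
def flatten_curve(curve, test_voltage, target_clock):
    """My understanding of the method: set the point at the voltage being
    tested to the target clock, and flatten every higher-voltage point to
    the same clock so the card can never boost past it. Curve values are
    made up for illustration, not from a real card."""
    return {v: (target_clock if v >= test_voltage else clk)
            for v, clk in sorted(curve.items())}

# A tiny made-up slice of a stock voltage/frequency curve (V -> MHz):
stock = {0.950: 1911, 1.000: 1962, 1.012: 1974, 1.050: 2000, 1.093: 2025}
flat = flatten_curve(stock, 1.000, 2012)
# flat[1.000] through flat[1.093] are now all 2012;
# points below 1.000v are left untouched.
```

Then you test stability at that clock/voltage pair, bump one step up the curve, and repeat. Is that right?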


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> SC2 BIOS crashed on me same settings Asus and stock worked, no shunt mod.


It crashed because of your OC, right? It might require different tweaking and playing with the voltage curve. From my experience, you can't just use the exact same voltage curve for every bios; they give slightly different results and all need to be tamed.

But the Inno3D bios literally crashes instantly, every time, even at stock clocks or downclocked in gaming. So that's what I mean by "crashing": it won't even load in any situation.
Quote:


> Originally Posted by *eXteR*
> 
> Where's that SC2 bios? can't find on the post nor techpowerup database.


As for the SC2 bios,

EVGA1080TiSC2.zip 155k .zip file


It's working well: you don't lose DP1, you should get the extra 15 watts like the other bioses, and you get the ability to have 0% fan speed at idle. It is written for and based on a reference card, so it works well with the shunt mod.


----------



## gmpotu

Quote:


> Originally Posted by *CoreyL4*
> 
> Thanks for the words of encouragement lol.
> 
> I'll try to answer your questions best I can while at work at not on my computer. I tried voltage up to 1.093v. I made a custom curve.
> 
> About the curve: I did not lock it, I do not know what that does. My first day learning about the curve I set 2025 at 1.093v, and the next 3 or 4 voltage points to 2012. For the majority, it would instantly downlock to 2012 and stay there throughout the entire Heaven benchmark. Read more the next day and learned to start at 1.000v and see where I can get lowest voltage 2025/2012 stable. I opted to try for 2012. I think it was 1.050 that 2012 seemed stable. Well my problem was when I set 1.05v and the rest of the voltage points after that flat on 2012, it would downlock pretty fast in heaven to 1999 and so forth unlike the previous day where it would downclock from 2025 after like 20 seconds and stay at 2012 for majority if not all of heaven benchmark run.
> 
> After getting discouraged from that I called it a night and made the previous post. I still think I dont understand the voltage curve and DEFINITELY need help.


From the tutorial video KedarWolf made (http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps) I think you are supposed to set your clocks one higher than what you actually want to be at. So if you want 2012 clock you would set the flatline to 2025 so that it will downclock to 2012 like you are noticing. (Hopefully one of the pros here can confirm this is correct)

My response is somewhat like "The blind leading the blind".


----------



## Addsome

Quote:


> Originally Posted by *SlimJ87D*
> 
> It crashed because of your OC right? It might require different tweaking and playing with the voltage curve. From my experience, you can't just use the same exact voltage curve for every bios, they give slightly different results and all need to be tamed.
> 
> But the Inno3D bios literally instantly crashes anytime, every time, even on stock clocks or down clocked in gaming. So that's what I mean by "crashing" as it won't even load in any situation.
> As for the SC2 bios,
> 
> EVGA1080TiSC2.zip 155k .zip file
> 
> 
> It's working well, you don't lose a DP1, you should get the extra 15 watts like the other bios and you get the ability to have 0% fan speed when idle. It is written and based off of a reference card so works well with shunt mod.


Surprisingly, the Inno3D bios worked best for me. On stock I would sometimes crash running 2050, but on Inno3D I'm stable at 2063MHz at a lower voltage. Also, my memory is able to clock way higher.


----------



## fisher6

Quote:


> Originally Posted by *SlimJ87D*
> 
> It crashed because of your OC right? It might require different tweaking and playing with the voltage curve. From my experience, you can't just use the same exact voltage curve for every bios, they give slightly different results and all need to be tamed.
> 
> But the Inno3D bios literally instantly crashes anytime, every time, even on stock clocks or down clocked in gaming. So that's what I mean by "crashing" as it won't even load in any situation.
> As for the SC2 bios,
> 
> EVGA1080TiSC2.zip 155k .zip file
> 
> 
> It's working well, you don't lose a DP1, you should get the extra 15 watts like the other bios and you get the ability to have 0% fan speed when idle. It is written and based off of a reference card so works well with shunt mod.


Are you saying that this bios really does give you extra power? I recall you saying the Asus bios did nothing, but this one works?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Addsome*
> 
> Surprisingly Inno3D bios worked best for me. On stock i would sometimes crash running 2050 but on Inno3D im stable at 2063MHz at a lower voltage. Also my memory is able to clock way higher.


Same here but with the SC2 bios, my memory can now go from 450 to 550 which is pretty nice.

Did you try the SC2 bios?


----------



## Addsome

Quote:


> Originally Posted by *SlimJ87D*
> 
> Same here but with the SC2 bios, my memory can now go from 450 to 550 which is pretty nice.
> 
> Did you try the SC2 bios?


Not yet, will try it once I get home. Are you sure it is giving 15W extra power? I know the Inno3D one is.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *fisher6*
> 
> Are you saying that this bios really does give you extra power? I recall you saying the Asus bios doing nothing, but this one works?


For Bloodhawk, kraxkills, Nico and me, we saw about 10 extra watts on average from the ASUS bios. But generally it just gave us 300-305 watts when measured by our PSUs or the nvidia command line.

KedarWolf, it seems, gets 330 watts; we never figured out how, or why it's not repeatable for the rest of us.

But yes, the SC2, Inno3D, MSI and ASUS bioses seem to give you 310-315 watts maximum. You actually don't hit that maximum very often, though, so it'll perform roughly equal to stock (or, in my experience, worse than stock).

From my testing, the SC2 bios performs on par with the stock bios with some extra benefits; it is, from my understanding, a reference-designed board after all. You get 0% fan at idle if you want to use that, and you don't lose the DP1 port.

I'm shunt modded now, so I can't measure the card's wattage at stock settings comparing the SC2 and stock bioses for you guys anymore. My card will draw 380-400 watts.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Addsome*
> 
> Not yet, will try it once I get home. Are you sure it is giving 15W extra power? I know the Inno3D one is.


I can't measure it for you guys; I'm shunt modded now. But the review website I posted measured it, and they saw it draw 15 watts more than the FE card.

This is akin to the Inno3D, which other review sites measured drawing 15 watts more than the FE.

Sadly, the bios we want is the Inno3D review sample bios, which gives the card 50 extra watts, but no one can get hold of that one since it is an NDA review sample bios.

Perhaps you guys can test the SC2 bios power draw for the community.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> SC2 BIOS crashed on me same settings Asus and stock worked, no shunt mod.


You know, there's a possibility that it's crashing for you because EVGA made it really important for the card to know what's going on with the fans at all times.

Since you're water block cooled, I believe that in your case, if the bios doesn't detect a fan spinning, it will crash and stop itself from running. It might be a safety feature so the card doesn't overheat.

This is just a *theory* of mine after reading the review and reading about the EVGA ICX fans.

Or are you able to run the bios at different settings? If so, I think you'd just have to tweak and optimize your voltage curve for this bios.


----------



## fisher6

Thinking of trying the SC2 bios then. Just realised my HX750i PSU can actually use the Corsair software to measure power draw, I think.


----------






## Benny89

Got my Asus ROG Strix 1080 Ti.

First of all I checked whether I can play games at stock... you know, after the Aorus bad experience. No problem: with max power limit, 1980MHz out of the box, 135-143 fps in BF1 at 1440p Ultra. Good.

Some simple OC to start: +55 on the core stayed stable at 2025 in Heaven for a full run.

"Limited by Util" in the PerfCap reason... What does it mean?

Anyway, I will see how far I can push it without touching voltage.

EDIT: +65 stable at 2038 in Heaven, downclocked from 2050


----------



## GNUster

Quote:


> Originally Posted by *Benny89*
> 
> Limited but Uti in Perf... What does it mean?


'Util' stands for GPU utilization, i.e., this is triggered when you don't run a 3d application that fully utilizes your GPU, e.g. when your 3d benchmark stops.


----------



## Benny89

Quote:


> Originally Posted by *GNUster*
> 
> 'Util' stands for GPU utilization, i.e., this is triggered when you don't run a 3d application that fully utilizes your GPU, e.g. when your 3d benchmark stops.


Hmm, it didn't stop, so that's strange...

Anyway, +51 on the voltage slider and so far a stable Heaven run at 2050 MHz. Max temp at 80% fan speed was 50C.

So far so good.

EDIT: Another run, 2063 Mhz stable in Heaven. 51C Max. I will stop here and play with memory.

Seems like good chip


----------



## GNUster

Quote:


> Originally Posted by *Benny89*
> 
> Hmm it didn't stop so strange....
> 
> Anyway +51Voltage and so far stable Heaven run 2050 mhz. Max temp on 80% fan speed was 50C.
> 
> So far so good.


It can also happen during benchmarking that the GPU idles because it has to wait on the CPU or other components on the system/PCIe bus.


----------



## Benny89

Quote:


> Originally Posted by *GNUster*
> 
> It can also be happen during benchmarking that the GPU idles since it has to wait on the CPU or other components on the system/PCIE bus.


CPU is a 5775C at 4.2 GHz and RAM is CL9 2133MHz, so it is not bottlenecked at all. But Heaven has moments when you don't hit 99% GPU utilization, so it could be because of that.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> Hmm it didn't stop so strange....
> 
> Anyway +51Voltage and so far stable Heaven run 2050 mhz. Max temp on 80% fan speed was 50C.
> 
> So far so good.
> 
> EDIT: Another run, 2063 Mhz stable in Heaven. 51C Max. I will stop here and play with memory.
> 
> Seems like good chip


This happens if GPU utilization is less than 99%.

Anyway, also run Superposition; it helps you understand how your lower voltage bins perform in situations where you are power limited.


----------



## GNUster

Quote:


> Originally Posted by *Benny89*
> 
> CPU is 5775C on 4.2 Ghz and RAM CL9 2133Mhz so it is not bottlenecked at all. But Heaven has moments when you don't utilize 99% of GPU so this can be because of that.


As you said, it does not necessarily mean that your system is bottlenecked somewhere, because it might be due to the nature of the benchmark. I notice 'Util' sporadically in 3DMark as well. So it's normal, unless it flashes up often and regularly during a test run.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> CPU is 5775C on 4.2 Ghz and RAM CL9 2133Mhz so it is not bottlenecked at all. But Heaven has moments when you don't utilize 99% of GPU so this can be because of that.


Hmmm, that's odd. I never see the 'Util' flag with Heaven.


----------



## Benny89

Quote:


> Originally Posted by *SlimJ87D*
> 
> This happens if gpus is utilization is less than 99%.
> 
> Anyways, also do superposition, it helps you understand your lower bin performances in the situations you are power limited.


While Superposition does that, it doesn't really happen in games. Superposition isn't a "game conditions" bench; Heaven, Valley or FS is closer to that.

But I will run Superposition one day when I play with curves. Right now I can see that without touching the curve the card can do 2075 at +51 on the voltage slider no problem, downclocking to 2063 because of the 50-51C temp.

Now I will find the memory OC and then go and play some games finally







Will have time for curve in next week


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> While SUPS does that- it doesn't happen in games really. SUPO isn't "game condition" bench. Heaven, Valley or FS is closer to that.
> 
> But I will do SUPs one day when I will play with curves. Right now I can see that without touching curve card can do with +51V 2075 no problem, downclocking to 2063 because 50-51C temp.
> 
> Now I will find Memory OC and then go and play some games finally
> 
> 
> 
> 
> 
> 
> 
> Will have time for curve in next week


EDIT: I think I spotted your problem. You either have v-sync or a frame limiter on, so you're not going above 144 FPS. I'm guessing you have a 144 Hz monitor?

*So you're seeing 'Util' when the card tries to go past 144 fps.*
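For what it's worth, that 'Util' reading under a frame cap follows from simple arithmetic: the GPU finishes each frame early and then idles until the cap interval expires. A back-of-the-envelope sketch (purely illustrative numbers; `capped_utilization` is a made-up helper, not anything from GPU-Z):

```python
def capped_utilization(uncapped_fps, cap_fps):
    """Rough model of GPU utilization under a v-sync/frame cap:
    the GPU is busy for the render time of each frame and idles
    the rest of the cap interval."""
    if uncapped_fps <= cap_fps:
        return 1.0  # the GPU itself is the limit, so it stays fully busy
    return cap_fps / uncapped_fps

# A card capable of ~180 fps capped at 144 Hz sits near 80% utilization:
print(round(capped_utilization(180, 144), 2))  # -> 0.8
```

So the lower the cap relative to what the card could do, the more often the 'Util' flag appears.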


----------



## Benny89

Quote:


> Originally Posted by *SlimJ87D*
> 
> EDIT: I think I spotted your problem. You either have v-sync or frame limiting on, so you're not going above 144 FPS. I'm guessing you have a 144 Hz monitor?


Yes I have 144Hz monitor, XB271HU


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> While SUPS does that- it doesn't happen in games really. SUPO isn't "game condition" bench. Heaven, Valley or FS is closer to that.
> 
> But I will do SUPs one day when I will play with curves. Right now I can see that without touching curve card can do with +51V 2075 no problem, downclocking to 2063 because 50-51C temp.
> 
> Now I will find Memory OC and then go and play some games finally
> 
> 
> 
> 
> 
> 
> 
> Will have time for curve in next week


Superposition is actually going to help you optimize the beginning of your voltage curve.

With boost 3.0, we have to optimize core and voltages from 1.000v and up.

So you'll want to see what's your best core at 1, 1.012, 1.032, 1.042, 1.050, etc.

1. Heaven benchmark in window mode. Test and lock with ctrl + l at each voltage bin and increase the core till crash.

2. Do this for each voltage bin.

3. Then run superposition.

Games like Wildlands and Witcher 3 will push you down 1 to 3 voltage bins, so if your 3 lower voltage bins sit at 1950 MHz, you'll potentially have stutter going from 2063 MHz down to 1950 MHz suddenly and back up again.

That's my advice.
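To make the per-bin procedure concrete, here's a minimal sketch (the bin voltages and clocks below are made-up placeholders, and `worst_downbin_drop` is a hypothetical helper, not an Afterburner feature):

```python
# Hypothetical record of the max stable core clock found per voltage bin,
# built by locking each bin (Ctrl+L in Afterburner) and raising the core
# until Heaven crashes. Values are placeholders, not real measurements.
stable_curve = {
    1.000: 1987,
    1.012: 2000,
    1.031: 2025,
    1.043: 2038,
    1.050: 2050,
    1.062: 2063,
}

def worst_downbin_drop(curve, top_v, bins=3):
    """Clock drop if a heavy game pushes you down up to `bins`
    voltage bins from top_v -- the stutter scenario described above."""
    vs = sorted(curve)
    i = vs.index(top_v)
    lower = vs[max(0, i - bins):i]
    if not lower:
        return 0
    return curve[top_v] - min(curve[v] for v in lower)

print(worst_downbin_drop(stable_curve, 1.062))  # 2063 - 2025 = 38 MHz
```

The point of tuning the low bins first is to keep that drop small, so a power-limited dip doesn't turn into visible stutter.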


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> Yes I have 144Hz monitor, XB271HU


Redo the test without v-sync or fps limiting; you'll get a higher score. Make sure v-sync isn't turned on in the Nvidia Control Panel either, at least not for Heaven.

Hope that helps.


----------



## DerComissar

Quote:


> Originally Posted by *Slackaveli*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CoreyL4*
> 
> I am very disappointed in my gaming x. Can barely get 2012/5800 stable if that.... still testing.
> 
> Idk if I should return it or not.
> 
> 
> 
> you gotta give us more to go on. what settings? voltage ? how much? what tests? how does it crash? is your cpu OC'd? Is it for sure stable? Have you tried the curve? Did you lock it? What about at 1.043 or 1.05v, what does it crash at at those voltages?
> 
> *I bet that card is fine. In fact, 2012/2025 is at or above average OC on these. they have 1000 more cores than a 1080, remember that. Seeing all these repeat posts of the same dozen guys who managed to get 2100 to stick can be disheartening. But look at how many cards some of us run through. Look at all the extra paste and heatsinks and shunt mods and soldered wires and a few burned out cards; and bios mods and time spent benching/testing/forum posting, returning and rebuying and reselling and tearing apart our rigs and draining loops again and again...*
> 
> The guy who bought the Zotac and plugged it in and was running 2037 right off the bat with 2 minutes involved in the whole process is 100% the winner in this whole thread. Luckily I enjoy the whole process, but it is work, no doubt about it. I.T. work for fun.
> 
> Keep your chin up. That GPU smokes any you've ever had by A LOT. It's probably better than anybody you know in real life's gpu. It's probably overkill for your current monitor anyway. It games silently and it doesnt run hot. It's all good, brother. We are here to help if you really need to see 2037 or 2050 to be happy. I bet we can get you there at least.
Click to expand...

Very well said, Mr. Slackaveli!
Rep+

Pretty much sums things up, as to the state we are in with Pascal.
But I agree, regardless of the hassle of dealing with the limitations, there is a lot of horsepower on tap with these cards.


----------



## eXteR

Quote:


> Originally Posted by *SlimJ87D*
> 
> It crashed because of your OC right? It might require different tweaking and playing with the voltage curve. From my experience, you can't just use the same exact voltage curve for every bios, they give slightly different results and all need to be tamed.
> 
> But the Inno3D bios literally instantly crashes anytime, every time, even on stock clocks or down clocked in gaming. So that's what I mean by "crashing" as it won't even load in any situation.
> As for the SC2 bios,
> 
> EVGA1080TiSC2.zip 155k .zip file
> 
> 
> It's working well, you don't lose a DP1, you should get the extra 15 watts like the other bios and you get the ability to have 0% fan speed when idle. It is written and based off of a reference card so works well with shunt mod.


This bios is definitely running better than the Nvidia stock one for me.

Still power limited in Superposition 4K, but I got a 200-point boost with the same core OC. And that's still without unlocking the voltage slider in AB: stock curve, only +100 on core and +450 mem.

The fan, as you said, can go to 0%, but being on an AIO I prefer to keep it at 30% all the time; that's about 1000 rpm, zero noise, and at least the VRM runs cooler.

Temps with this bios maxed at 57°C. When playing I get about 46-47°C.

I'll keep trying this bios over the weekend. As you said, it's a little better than the stock bios, at least for Founders cards.

Edit:
Unlocked the voltage slider, and now I can get 2063 on the core; previously 2038. Let's see whether this frequency shows up as "real" in the performance numbers.

I get coil whine when maxing the voltage. Is that safe? On the stock bios I got a little coil whine, but this is a bit louder.


----------



## Benny89

Quote:


> Originally Posted by *SlimJ87D*
> 
> Redo the test without v-sync or fps limiting. You'll get a higher score. Make sure v-sync isn't on nvidia control panel either, at least not for heaven.
> 
> Hope that helps.


Nice! Thanks for that!









2063 core, 6009Mhz memory Heaven run.







Thanks for advice!


----------



## KingEngineRevUp

Yeah, I keep saying this. Boost 3.0 took the fun out of OC, but at the same time it gets us close to the limit of these cards with hardly having to do anything.

So before you get the urge to shop around and return a card, just know that Boost 3.0 is taking good care of you.

And look at the average overclock for air cards

http://hwbot.org/hardware/videocard/geforce_gtx_1080_ti/

*1830 on air and 2045 on water.*

So don't feel insecure about your cards unless you're lower than the average.


----------



## Clukos

Superposition 4k optimized with Ryzen


----------



## jestermx6

So I took the time to read through this thread. I originally thought I'd left some power on the table, but now it seems like I got lucky?

Took about 10 minutes of tweaking and maybe 2 hours of testing to hit my 2100 MHz GPU / 6000 MHz RAM on water.

That being the case, would flashing a bios like the SC2 possibly get me even better results?

Pushing over 2100 MHz on the GPU would crash games/benchmarks instantly; pushing past 6000 MHz on the RAM caused major artifacting and an eventual full system crash.

edit: oh, and I've been using Precision X OC. Is there a benefit to using MSI Afterburner?


----------



## KedarWolf

Quote:


> Originally Posted by *gmpotu*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Slackaveli*
> 
> yeah, do it the tried and true way and you'll know your card very well. If you push it more and it crashes, you'll know it so well you'll say, "Damn , I knew I pushed 1.043v a little too hard". Just knowing your limits is the key to maximizing your overclock.
> 
> That rainbow is weird. i only got it a few times. try a baby voltage bump and lock it one higher up the curve and see if it goes away or at least goes to just blue.
> 
> 
> 
> *What is a "baby voltage bump and lock it one higher up the curve mean"?*
> Does this mean if I am currently having (1.012v @ 2037) and flatline above (1.012v) then I should move to a lower clock like 1999 @ 1.012v and have the next voltage (1.025v @ 2037) so like try moving my curve up one voltage step?
> Quote:
> 
> 
> 
> Originally Posted by *Nico67*
> close and open GPU-Z, I think it can bug out and rainbow, did that for me once too, but also default the curve using cntrl+d and start testing just 1.000v, ignore score for now or your just chasing your tail basically smile.gif All your interested in is stability with no green at that voltage, and don't touch any voltage points below that as they don't matter.
> It is tedious as Slackaveli rightly noted, but its the same principle as overclocking a CPU, pick a voltage a see what the max clock is, the raise the voltage a step and see how much more you can get. 1.000v stable is likely a good point for all cards not to get green in most benchmarks. Tests like SuPo 4K are designed to ruin your day smile.gif, but games will be fine.
> My gaming curve is default FE curve, with 1.050 raised to 2101 and flat after and its fine because the games I have been playing barely reach 120% if ever.
> 
> Click to expand...
> 
> Should I when testing make the curve flat line on anything above the voltage I'm testing? So like 1.000v and above flatline test.
> Once I find max stable I go to (1.012v) and flatline everything above (1.012v) to find the highest stable clock? etc. etc.?
> 
> I'm kind of repeating questions but I want to make sure that I'm correctly understanding.
> 
> Also, If I don't have a 4K monitor (only 1080p) can I still run the 4k tests with SuPo and other benches? I'm assuming the answer would be NO that it doesn't somehow virtually apply a 4k resolution in like a sandbox environment.
> 
> ps. That Rainbow happens to me a lot.
> 
> What is locking / unlocking the curve and how do you "lock" the curve?
Click to expand...

See the video.

http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20


----------



## KingEngineRevUp

Quote:


> Originally Posted by *jestermx6*
> 
> So i took the time to read through this thread. I originally was thinking i left some power on the table but now it seems like I got lucky?
> 
> Took about 10 minutes of tweaking and maybe 2 hours of testing to hit my 2100mhz gpu/6000mhz ram on water.
> 
> That being the case, would flashing a bios like the SC2 possibly get me even better results?
> 
> for pushing over 2100mhz gpu would crash games/benchmarks instantly. pushing past 6000mhz ram caused major artifacting and eventual full system crash.
> 
> edit: oh and i've been using Precision X OC. Is there a benefit to using MSI Afterburner?


I wouldn't say it was luck; credit all that hard work putting it on water. On water you're getting about 80 MHz more than someone on air.

The SC2 bios may or may not help. It helped me. Witcher 3 crashed a few times at 2114 MHz and 1.062v; I had to bump it to 1.093v.

Now I'm at 2114 MHz at 1.062-1.075v and can run Witcher 3 all day on the SC2 bios.


----------



## jestermx6

What kind of risks are involved with a bios flash? I'm not really willing to do the shunt mod and risk a $700 brick, but bios flashes seem simple enough.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *jestermx6*
> 
> what kind of risks are involved with a bios flash? I'm not really willing to do the shunt mod and risk a $700 brick, but bios flashes seem simple enough.


Nearly no risk. If anything goes wrong, you can just connect to your Intel motherboard's integrated GPU and reflash to fix it.

I've reflashed and gone back and forth at least 40 times now with no issues.

But I'm not sure what will happen for you, since you are on water and don't have the fan plugged into the controller anymore. I think EVGA might have the card crash on purpose if it doesn't detect a fan. Kedarwolf never got back to me.

But it's worth a try. You might get more OC. I actually gained +100 on my memory.


----------



## Benny89

Very pleased with my STRIX so far: 2063 core and 6009 MHz memory, stable in BF1 the whole time, 130-142 fps in game. But it seems I will really have to push up the resolution scale, as it's hard to get that card to 99% usage







.

This is my second 1080 Ti STRIX (the first one I returned). So far it seems like STRIX cards have nice chips, or I was lucky enough.


----------



## jestermx6

guess I'll give it a shot then. thanks


----------



## cekim

Quote:


> Originally Posted by *SlimJ87D*
> 
> Nearly no risk. If anything happens, you can just connect to your intel mobo integrated GPU and reflash it again to fix anything.
> 
> But I've reflashed and went back and forth at least 40 times now with no issues.
> 
> But I'm not sure what will happen to you since you are on water and don't have the fan plugged into the controller anymore. I think EVGA might have the card crash purposely if it doesn't detect a fan. Kedarwolf never got back to me.
> 
> But it's worth a try. You might get more OC. I actually gained +100 on my memory.


Crash intentionally?

I had the EVGA FE bios on 2 cards (one evga, one asus, but FE) with EK blocks. It was not stable above 1974 regardless of voltage, but did not "crash purposely" for lack of a fan... Unless of course, the hang/crash I'd see (DX11 device removed error from windows) was bogus and the driver caused it on purpose???


----------



## s1rrah

So I repasted my MSI 1080 ti Seahawk X today ... and man I tell you ... there's bad factory paste jobs and then there are unbelievably bad factory paste jobs ...

This is what I found when I took the pump/cooler off of the GPU die; honestly, I'd expected better from MSI ... LOL ...:

[photos: factory paste overflowing the GPU die and running down the copper cold plate]

I mean it literally poured out and over the GPU die edges ... even dripping down the copper cooling plate ... LOL!!!

I seriously had to take a five-minute break as I just kept staring at it in amazement; never in all my days have I seen that ...

Anyway ...

Repasted with Kryonaut, a nice even coat ... added eight Enzotech heatsinks to the VRM cooling plate ... and just for fun, strapped on a honker 3000 RPM Scythe Kaze fan ... and have quite literally dropped my load temps by 10C. Formerly I would load at 52C and now it only occasionally breaks 42C (hour-long 1440p all-maxed Unigine Heaven run; going to do different tests later); this in a 22C ambient room, mind you...

[photos: temps after the repaste]

But that factory paste job though!


----------



## GRABibus

Quote:


> Originally Posted by *s1rrah*
> 
> So I repasted my MSI 1080 ti Seahawk X today ... and man I tell you ... there's bad factory paste jobs and then there are unbelievably bad factory paste jobs ...
> 
> This is what I found when I took the pump/cooler off of the GPU die; honestly, I'd expected better from MSI ... LOL ...:
> 
> [photos: factory paste overflowing the GPU die]
> 
> I mean it literally poured out and over the GPU die edges ... even dripping down the copper cooling plate ... LOL!!!
> 
> I seriously had to take a five minute break as I just kept staring at in in amazement; never in all my days have I seen that ...
> 
> Anyway ...
> 
> Repasted with Kryonaut, nice even coat ... added eight Enzotech heatsinks to the VRM cooling plate ... and just for fun, strapped on a honker 3000RPM Scythe Kaze fan ... and have quite literally dropped my load temps by 10C. Formerly I would load at 52C and now it only occasionally breaks 42C (hour long 1440p all maxxed Unigine Heaven run, going to do different tests later); this in a 22C ambient room mind you...
> 
> [photos: temps after the repaste]
> 
> But that factory paste job though!


More headroom for OC ?
What was your best OC (Before repasting) ?


----------



## TheParisHeaton

oh my God. This paste is the worst I've ever seen.


----------



## s1rrah

Quote:


> Originally Posted by *GRABibus*
> 
> More headroom for OC ?
> What was your best OC (Before repasting) ?


Haven't tested yet ... currently I'm at 2012 MHz on the core with no issues and 5980 MHz on memory ... I haven't spent a whole lot of time perfecting the OC, though, and I've not tried any other BIOSes or anything. My main goal is simply to keep the card below 49C at all times to avoid any throttling, and it looks like that's going to be easily accomplished after the repaste. TBH I'm totally happy with a 2000 MHz overclock, as the real-world performance gain from another 50 to 100 MHz is pretty much negligible to me.

I did notice that during the hour-long Heaven run I saw no power limiting, but that's probably not the best app to test for that with ... dunno. Also, I'm not using a curve, just power/temp limit maxed and stock voltages. I also always have "Prefer Maximum Performance" enabled in the Nvidia Control Panel ... again, not an overclocking aficionado, so just sticking with basic settings.


----------



## joder

Quote:


> Originally Posted by *TheParisHeaton*
> 
> oh my God. This paste is the worst I've ever seen.


I think you have to try to make it that bad...


----------



## KingEngineRevUp

Quote:


> Originally Posted by *cekim*
> 
> Crash intentionally?
> 
> I had the EVGA FE bios on 2 cards (one evga, one asus, but FE) with EK blocks. It was not stable above 1974 regardless of voltage, but did not "crash purposely" for lack of a fan... Unless of course, the hang/crash I'd see (DX11 device removed error from windows) was bogus and the driver caused it on purpose???


SC2 bios, not FE.

Kedarwolf said the bios crashed on him instantly, so I don't know. We need more people on water to test it.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *s1rrah*
> 
> Haven't tested yet ... currently I'm at 2012mhz on core with no issues and 5980mhz on memory ... I haven't spent a whole lot of time perfecting the OC though ... and I've not tried any other bios' or anything. My main goal is to simply keep my card below 49C at all times to avoid any throttling and it looks like that's going to be easily accomplished after the repaste. TBH I'm totally happy with a 2000mhz overclock as the real world performance gains of 50 to 100mhz more is pretty much negligible to me.
> 
> I did notice that during the hour long Heaven run, I saw no power limiting but that's probably not the best app to test for that with ... dunno. Also, I'm not using a curve, just power/temp limit maxed and stock voltages. I also always have "Prefer Maximum Performance" enabled in Nvidia Control Panel ... again, not an overclock afficianado so just sticking with basic settings.


From my experience using an AIO, try to get your maximum OC at 1.042-1.050v. Anything higher will hit you with power limits.


----------



## GRABibus

Quote:


> Originally Posted by *s1rrah*
> 
> Haven't tested yet ... currently I'm at 2012mhz on core with no issues and 5980mhz on memory ... I haven't spent a whole lot of time perfecting the OC though ... and I've not tried any other bios' or anything. My main goal is to simply keep my card below 49C at all times to avoid any throttling and it looks like that's going to be easily accomplished after the repaste. TBH I'm totally happy with a 2000mhz overclock as the real world performance gains of 50 to 100mhz more is pretty much negligible to me.
> 
> I did notice that during the hour long Heaven run, I saw no power limiting but that's probably not the best app to test for that with ... dunno. Also, I'm not using a curve, just power/temp limit maxed and stock voltages. I also always have "Prefer Maximum Performance" enabled in Nvidia Control Panel ... again, not an overclock afficianado so just sticking with basic settings.


2012MHz at stock voltage? Nice.
On my side, in Time Spy I need 1.093V to be stable at 2012MHz in the first graphics test...
Same card as you.
It seems yours is better than mine


----------



## KingEngineRevUp

Quote:


> Originally Posted by *GRABibus*
> 
> 2012MHz at stock voltage ? Nice.
> From my side, in Time Spy, I need 1.093V to be stable at 2012MHz in first graphic test...
> Same card as you.
> It seems yours better than mine


If you were watercooled or on an AIO, you would get an additional 13 MHz for every 10C you can shed, down to 40C.

So keep that in mind. If you're at 75C, then at 40C or lower you'd have another 39 MHz, so you'd be at about 2050 MHz.
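That rule of thumb is easy to sanity-check with a quick sketch (assuming 13 MHz per full 10C step and a 40C floor, as stated above; `boost_gain_mhz` is a hypothetical helper, not anything official from Nvidia):

```python
def boost_gain_mhz(current_c, target_c, mhz_per_10c=13, floor_c=40):
    """Estimate extra Boost 3.0 headroom from better cooling, per the
    rule of thumb above: ~13 MHz per full 10C shed, no further gain
    below ~40C."""
    target = max(target_c, floor_c)  # Boost stops rewarding cooling below the floor
    if current_c <= target:
        return 0
    full_bins = int(current_c - target) // 10  # only full 10C steps count
    return full_bins * mhz_per_10c

print(boost_gain_mhz(75, 40))  # three full 10C bins -> 39 MHz
```

So a 75C air card dropped to 40C on water picks up roughly 39 MHz of boost on its own, before any manual OC.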


----------



## Benny89

I see that 1080 Ti STRIX has a warranty sticker on one of backplate screws. Picture below.



But I think I could get that screw out using pliers without touching the sticker itself. Can someone advise whether this is possible and easy to do? I think the MSI and the 1080 STRIX had the same stickers.

What I mean is I wish to repaste it with Kryonaut paste.


----------



## gmpotu

Quote:


> Originally Posted by *SlimJ87D*
> 
> Hmmm, that's odd. I never get utilization with heaven.


2088 @ 1.032v, SlimJ87D... NICE clocks you have there. Is this because the shunt mod gives you access to more power?
It seems like none of us should have problems at the low end of 1.000v-1.050v regardless of power draw, so you probably just have a really nice card!!

Does the shunt mod help with higher clocks at the same voltages as before? Can anyone explain why, if it does?

Also, I notice no one uses FurMark any more. A long time ago when I was into overclocking, FurMark was one of the go-to tests. Why did people stop using it? It's built into Asus GPU Tweak II as an option.


----------



## Clukos

Updated FS scores with Ryzen

FS: http://www.3dmark.com/3dm/19439020


FSE: http://www.3dmark.com/3dm/19439221


FSU:http://www.3dmark.com/3dm/19439437


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> I see that 1080 Ti STRIX has a warranty sticker on one of backplate screws. Picture below.
> 
> 
> 
> But I think I could get that screw out using Pliers and so do not touch the sticker itself. Can someone advise if this is possible and easy to do? I think MSI and 1080 STRIX had the same stickers.
> 
> What I mean is I wish to repaste it with Kryonaut paste.


Yeah, there's a guy who does it on YouTube with an MSI card. I can't find the video anymore, but it seemed pretty easy; just make sure you have some locking pliers so you don't slip.

Have someone hold the card, and apply a force couple: turn the pliers with two hands, one on each handle. That creates an even torque around the screw, so you're less likely to slip.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *gmpotu*
> 
> 2088 @ 1.032v SlimJ87D......NICE clocks you have there. Is this because of the shunt mod giving you access to more power?
> Seems like all of us should not have problems at the low end of 1.000v - 1.050v regardless of power draw so you probably just have a really nice card!!
> 
> Does shunt mod help with higher clocks at same voltages as before? Anyone able to explain why if it does?
> 
> Also, I notice no one uses Furmark any more. A long time ago when I was into overclocking Furmark was one of the goto test. Why did people stop using it? It's built in with the Asus GPU Tweak II as an option.


No, this was before I did the shunt. The silicon gods were just nice to me this round. It makes up for the crappy 980 Ti I got last generation.

2075-2088 @ 1.000v. No shunt mod here.

If your card does better at 1.075-1.092v running Heaven at 1080p, then the shunt mod is good for you.

Let's say you can reach 2101 MHz at 1.075v and complete a full Heaven run at 1080p; doing the shunt mod should then let you run Superposition and Time Spy at that voltage without worrying about power limiting.

An FE card will hit power limits in normal gameplay above 1.042v. And with Superposition, you'll already start hitting power limits above 1.000v.
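For context on why the shunt mod defeats the power limit: the card senses current as a voltage drop across small shunt resistors, and the mod parallels another resistor across them so the card under-reads its own draw. A hedged sketch of the arithmetic (the 5 mΩ figure is a commonly cited example value, not taken from this thread, and both functions are made-up helpers):

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

def reported_power_fraction(shunt_ohms, added_ohms):
    """The card measures current via the voltage drop across its shunt;
    paralleling another resistor lowers the effective resistance, so the
    card reports only this fraction of its true draw, which effectively
    raises the power limit by the inverse."""
    return parallel(shunt_ohms, added_ohms) / shunt_ohms

# Paralleling an equal-value resistor across a 5 mOhm shunt makes the
# card report half its true draw, i.e. roughly doubles the power limit:
print(reported_power_fraction(0.005, 0.005))  # -> 0.5
```

That is also why shunt-modded cards must be watched carefully: the real draw no longer matches what monitoring software reports.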


----------



## GRABibus

Quote:


> Originally Posted by *s1rrah*
> 
> Haven't tested yet ... currently I'm at 2012mhz on core with no issues and 5980mhz on memory ... I haven't spent a whole lot of time perfecting the OC though ... and I've not tried any other bios' or anything. My main goal is to simply keep my card below 49C at all times to avoid any throttling and it looks like that's going to be easily accomplished after the repaste. TBH I'm totally happy with a 2000mhz overclock as the real world performance gains of 50 to 100mhz more is pretty much negligible to me.
> 
> I did notice that during the hour long Heaven run, I saw no power limiting but that's probably not the best app to test for that with ... dunno. Also, I'm not using a curve, just power/temp limit maxed and stock voltages. I also always have "Prefer Maximum Performance" enabled in Nvidia Control Panel ... again, not an overclock afficianado so just sticking with basic settings.


Quote:


> Originally Posted by *SlimJ87D*
> 
> if you were watercooled or on a AIO, you would get an additional 13 Mhz for every 10C you can lose until 40C.
> 
> So keep that in mind. If you're at 75C, then at 40C or lower you'd have another 39C so you'd be at 2050 Mhz.


I don't understand what you mean.
I have the Sea Hawk X (water AIO).

In Time Spy, I crash at 2025MHz at 1.093V in the first graphics test.
If I add just +50MHz on the core in MSI AB (starting from 1544MHz) at stock voltage, I crash.
Max GPU temp = 51°C.

I have no headroom...

What should I expect, and what would you suggest I do, to raise this core frequency and stay stable?


----------



## PasK1234Xw

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, there's a guy that does it on youtube with a MSI card. I can't find the video anymore. But it seemed pretty easy, just make sure you have some lockjaw pliers so you don't slip.
> 
> Have someone hold the card, and make sure you do a coupling moment. That means, turn pliers using two hands on each handle. This will create an even torque around the screw and you're less likely to slip.


When you find it, tell him MSI doesn't actually void the warranty; that sticker is just a deterrent.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *GRABibus*
> 
> 2012MHz at stock voltage ?
> In Time Spy, I need 1.093V to be sta
> i don't understand what you mean.
> i have the Sea HAw (Water AIO).
> 
> In time spy, I crash for at 2025MHz at 1.093V in first graphic test.
> 
> I have no headroom...
> 
> What should I expect and do according to you to rise this Core frequency and being stable ?


I didn't know you were on water already. Sorry. I thought you were on air.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *PasK1234Xw*
> 
> When you find it tell him MSI doesn't actually void warranty that sticker is just a deterrent.


I believe the guy lived in another country where MSI does check stickers.

But yes, MSI US doesn't care, according to what they said on their message boards.


----------



## PasK1234Xw

Quote:


> Originally Posted by *SlimJ87D*
> 
> I believe the guy lived in another country where MSI did check stickers.
> 
> But yes, US MSI doesn't care according to what they said in their message boards.


Oh OK


----------



## GRABibus

Quote:


> Originally Posted by *SlimJ87D*
> 
> I didn't know you were on water already. Sorry. I thought you were on air.


Then there's no hope?


----------



## chef1702

Quote:


> We need more people on water to test it.


Done. Full-cover EKWB block, no fan connected. The card normally won't go past 41-42° because of the water, so the fan wouldn't spin at these temperatures anyway...

EVGA FE, SC2 bios flashed. The card is still working fine. Boost is really nice out of the box: on the FE bios I remember something around 185x without touching anything; now she goes to 195x at 1.05 V.

Interesting, though, that the version of the FE bios is the same as the SC2 one.



Which should mean it's just a subvendor change with another core clock curve mapped, plus those few extra watts.

I'm not that kind of "search for those 10 extra watts" guy, so no results from me here. But I just wanted to let you guys know that the card is still working fine under load. Now let's see if I can get it stable at 2000 MHz and 1.012 V, which wasn't possible on the FE bios. I don't wanna go under 2k MHz


----------



## macwin2012

Hey guys,

I run an ultrawide 3440x1440 monitor with a 1070. Is it worth upgrading to the Ti?

Also, I hear some news about NVIDIA Volta in Q3 2017.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *chef1702*
> 
> Done. Full-cover EKWB block. No fan connected. The card normally won't go past 41-42° because of the water, so the fan wouldn't spin at these temperatures anyway...
> 
> EVGA FE, SC2 BIOS flashed. Card is still working fine. Boost is really nice out of the box. On the FE BIOS I remember something around 185x without touching anything. Now she goes to 195x at 1.05 V.
> 
> Interesting, though, that the version of the FE BIOS is the same as the SC2 one.
> 
> 
> 
> Which should mean it's just a subvendor change with another core clock curve mapped and those few extra watts.
> 
> I'm not that kind of "search for those 10 extra watts" guy, so no results from me here. But I just wanted to let you guys know that the card is still working fine under load. Now let's see if I get it stable at 2000 MHz and 1.012 V, which wasn't possible on the FE BIOS. I don't wanna go under 2k MHz.


Yes, the board is one of the closest-to-reference partner boards. That's why it works so well with the FE, and it's the only BIOS I have recommended to everyone.

BTW, the last two digits are .90 because that's EVGA's vendor BIOS number, IIRC.


----------



## gmpotu

What does pressing Ctrl+L do in AB? I get a yellow vertical line on the curve, and my voltage (even if low, like 1.012v) doesn't seem to go any higher when I do this.


----------



## Benny89

Wow. So pleased with my STRIX. It does 2063 stable in games, and with 70% fan speed I don't see anything higher than 54C.







If I repaste that beast.... mmmm...

Finally settled on my 1080 Ti


----------



## RaleighStClair

Is there a way to lock certain frequencies to certain voltages? I have tried the curve method (Ctrl+F) and locking with it, but that doesn't seem to really do anything. Is there a specific way to do this to make it work?


----------



## joder

Quote:


> Originally Posted by *SlimJ87D*
> 
> No, this was before I had the shunt. The silicon gods were just nice to me this round. Makes up for the crappy 980 Ti I got last generation.
> 
> 2075-2088 @ 1.000v. No shunt mod here.


That's a great silicon lottery win. I can't get past 2025Mhz @ 1.000v. I made it to 15/17 on Superposition @ 2037Mhz. I wish we could just get a BIOS to up that power limit a bit more.


----------



## StephenP85

One more card to add! Redoing my loop always takes me days.



I wonder if it would look better if I put MSI's included m.2 shield over the 960 Pro, or just let it proudly display itself and its reflection (the red "Pro" is the only red part of the whole setup).


----------



## Nico67

Quote:


> Originally Posted by *gmpotu*
> 
> Should I when testing make the curve flat line on anything above the voltage I'm testing? So like 1.000v and above flatline test.
> Once I find max stable I go to (1.012v) and flatline everything above (1.012v) to find the highest stable clock? etc. etc.?


Yes, that is correct








Quote:


> Also, If I don't have a 4K monitor (only 1080p) can I still run the 4k tests with SuPo and other benches? I'm assuming the answer would be NO that it doesn't somehow virtually apply a 4k resolution in like a sandbox environment.


Yeah, it doesn't care what your monitor is. Just use 4K Optimized, since it uses a lot of power, but remember it can be misleading because games don't draw anywhere near that much.
Quote:


> ps. That Rainbow happens to me a lot.
> 
> What is locking / unlocking the curve and how do you "lock" the curve?


Ignore the rainbow for now and just get a feel for what is stable and what the crash point is per voltage bin.

Once you move a voltage point you hit Apply as you normally would, but you can also save it to a preset. Then there's locking it to a voltage bin, but I wouldn't worry about that, as it can make the clock bins drop a lot worse with power or temp limiting.
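For anyone trying to picture that flatline test: here is a toy model of it, where the Afterburner-style curve is just a list of (voltage, clock) points and everything at or above the bin under test gets clamped to the candidate clock. The function and the point values are made up for illustration; the real curve lives inside Afterburner.

```python
# Toy model of the "flatline" test described above: set the bin under test to
# the candidate clock and clamp every higher-voltage point to that same clock,
# so the card cannot boost past the clock/voltage pair you are validating.

def flatline_curve(curve, test_voltage, test_clock):
    """curve: list of (voltage_V, clock_MHz) points, ascending by voltage."""
    out = []
    for v, clk in curve:
        if v >= test_voltage:
            out.append((v, test_clock))   # flatten everything at/above the bin
        else:
            out.append((v, clk))          # leave lower bins untouched
    return out

# Example: test 2000 MHz at the 1.000 V bin on a made-up stock curve.
stock = [(0.950, 1900), (1.000, 1950), (1.050, 2000), (1.093, 2050)]
flat = flatline_curve(stock, 1.000, 2000)
```

Once that passes your benchmark, you move to the next bin up and repeat, exactly as described above.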


----------



## GNUster

Quote:


> Originally Posted by *SlimJ87D*
> 
> No, this was before I had the shunt. The silicon gods were just nice to me this round. Makes up for the crappy 980 Ti I got last generation.
> 
> 2075-2088 @ 1.000v. No shunt mod here.


That's crazy! The most I got was [email protected] to complete superposition, 3dmark, heaven.


----------



## s1rrah

Quote:


> Originally Posted by *GRABibus*
> 
> Then no hope ?


Dude ... 2000mhz is awesome ... 2012mhz is also awesome ... hell, you wouldn't really notice any in-game difference even if you were only running 1900mhz ... your card is fine, man.

And I'm positive you could run your card at 1.093 volts every day and never have any issue ...

BTW: after the repaste of the Seahawk, I just did a 45-minute continuous loop of the Metro: Last Light benchmark, 1440p all maxed (no SSAA) ... the card maxed at 41C .... LOL ... pre-repaste the Metro: LL bench would load at 54C ... amazing (I think the fan has a good bit to do with it as well) ... you'd think a 3000RPM Scythe fan would be thunderously loud but it's really not ... I can certainly hear it, but with gaming-loud speakers or headphones I'd never even notice ...

Metro: Last Light, believe it or not (the built-in benchmark program), still heats my card(s) up more than just about anything ...

Oh yeah ... not a single instance of power limiting according to the logs ... but lots of voltage limiting ... whatever that means ... LOL ..


----------



## s1rrah

Quote:


> Originally Posted by *macwin2012*
> 
> Hey guys ,
> 
> I run a ultra wide monitor 3440x1440p with 1070 , is it worth ugprade to ti ?
> 
> Also i hear some news about nvidia volta in Q3 2017 .


Yes


----------



## eXteR

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yes, the board is one of the closes partner reference boards. So that's why it works very well with the FE and the only bios I have recommended to everyone.
> 
> BTW, the last two digits is .90 because that's EVGA's vendor bios number IIRC.


Don't know why, but with the SC2 BIOS I get lower temps than stock, despite having a higher OC.

As an example, Rainbow Six Siege, playing some maps in terrorist hunt:

Stock BIOS: 1976Mhz @ 1.031v - 46º

SC2 BIOS: 2000Mhz @ 1.025v - 42º

The rad fan is between 900-1100 rpm (Corsair ML120 Pro), and the blower is fixed at 30%.

Extremely pleased with this BIOS.

Max OC is 2063Mhz @ 1.093v; on the stock BIOS it was 2038Mhz.

Max temp in Superposition is 54º, while on the stock BIOS it was 57º.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *eXteR*
> 
> Don't know why, but with SC2 bios, i get lower temps than stock, despite having a high OC.
> 
> As example, Rainbow Six Siege, playing some maps on terrorist hunt.
> 
> Stock bios: 1976Mhz @ 1.031v - 46º
> 
> SC2 bios: 2000Mhz @1.025 - 42º
> 
> Rad fan, is between 900-1100 rpm (Corsair ML120Pro), and blower fixed at 30%.
> 
> Extremely pleased with this bios.
> 
> Max OC is 2063Mhz @1.093v, Stock bios was 2038Mhz.
> 
> Max temps on superposition, 54º while stock bios was 57º.


You can paypal me a beer


----------



## Addsome

What RPM do stock FE fans max out at again? Trying to figure out what I should set the SC2 bios fan speed at.


----------



## eXteR

Quote:


> Originally Posted by *Addsome*
> 
> What RPM do stock FE fans max out at again? Trying to figure out what I should set the SC2 bios fan speed at.


The stock BIOS max is 4800 RPM.

The SC2 BIOS max is 3700 RPM.

At 30% fan with the SC2 BIOS I get about 1100 RPM.
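As a rough sanity check on those numbers: if you assume the RPM scales roughly linearly with the duty-cycle percentage up to the BIOS max (an assumption; real fan controllers aren't perfectly linear), 30% on the SC2 BIOS's 3700 RPM ceiling lands right around the ~1100 RPM reported:

```python
# Assumes fan RPM scales linearly with duty cycle up to the BIOS max fan speed.
# Real fan controllers aren't perfectly linear, so treat this as a ballpark.

def approx_rpm(duty_percent, max_rpm):
    return round(duty_percent / 100 * max_rpm)

print(approx_rpm(30, 3700))   # ~1110, close to the ~1100 RPM observed
print(approx_rpm(30, 4800))   # the same 30% duty on the stock FE BIOS: ~1440
```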


----------



## gmpotu

Okay, so I'm testing 1.000v in Unigine Heaven 4.0 @ 2000 core.
Settings - DX11 / Ultra / Extreme / X8 / 1920x1080

If I press Ctrl+L in AB, it locks me into 2000 @ 1.000v and my GPU-Z log is clean as a whistle. No VRel or Pow issues.
However, if I make a flat line from 1.000v to 1.100v @ 2000 and press Ctrl+L again to turn it off, then my core sits at like 350mhz and voltage at like 0.770, and GPU-Z shows Pow issues at the low end.


----------



## Caos

Which drivers are people using? The 381.78 hotfix driver or the official 381.65?


----------



## gmpotu

I don't think I can even do 1950 @ 1.000v on my Asus STRIX model.
I got a little frustrated with trying to test 1.000v so I will go back to it later.

Tested 1.093v and I was able to run 2076 for the full Heaven benchmark. There was one small part where it stuttered for a quarter of a second, around 213 seconds I think, close to scene 24/26.
My GPU-Z graphs were clear the whole time. No Pow or VRel issues and no rainbow.
My CPU is at 4.222ghz; not sure why the Heaven bench does not reflect that.



Ran Superposition with a new curve starting from the back side at 1.093v and only had a couple of spots with a Pow issue, but it never dropped me below 2050 @ 1.050v. The majority of the time I was at 2076 @ 1.093v or 1.081v.


----------



## Nico67

The SC2 BIOS only goes to 300w according to SMI; it feels as if it has been softened up a bit to let it clock higher too. As such it doesn't power limit as much as the FE, but scores are a bit lower too.



2100 max volts 1.031



Also felt like the mem clocked a bit higher, but only by +50 more than on the FE.

If it had more power available I think it would do quite well, which is probably why SlimJ87D is liking it. Still waiting on that elusive 350w FE bios.
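"SMI" here is NVIDIA's `nvidia-smi` tool; the board power limit can be read with a query like `nvidia-smi --query-gpu=power.limit,power.max_limit --format=csv,noheader` (check `nvidia-smi --help-query-gpu` for the exact field names on your driver). A minimal sketch of parsing that CSV output, using a canned sample line instead of a live GPU:

```python
# Parses the 'NNN.NN W, NNN.NN W' CSV line that nvidia-smi emits for power
# queries. The sample string stands in for a real query so this runs GPU-free.

def parse_watts(csv_line):
    return [float(field.strip().removesuffix("W").strip())
            for field in csv_line.split(",")]

sample = "300.00 W, 300.00 W"      # roughly what an SC2-flashed card might report
limit, max_limit = parse_watts(sample)
print(limit, max_limit)            # 300.0 300.0
```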


----------



## joder

Quote:


> Originally Posted by *Nico67*
> 
> if it had more power available I think it would do quite well, probably why SlimJ87D is liking it. Still waiting on that elusive 350w FE bios


Is there one out there or are you hoping for that?


----------



## feznz

Quote:


> Originally Posted by *Slackaveli*
> 
> cpu and ram DEFINITELY influence fps at any resolution, including 4k. More draw calls = more fps. Yes, it's a lot less of a difference than at a cpu-bound resolution. But, for instance, I get about ~120fps in BF1 at 1440p on my i7-5775C at stock 3.3ghz with 1333mhz ram. But at 4.3Ghz with 3.7ghz uncore and 2133mhz CAS-9 ram, I get ~144+ fps. And who thinks a stock 4-core Ryzen has the same fps in 4k as a 5.0Ghz 7700k? C'mon, guys, let's stop with this old argument. That's like the guys who kept saying you'd get no fps gains in QHD going from a 4790k to a 5775c. Well, everyone that did it, like myself, Benny, palotte, and a few others, ALL report 10+ fps gains, just from a cpu drop-in.


Bit of a throwback, but there is a point where more CPU power results in no more gain. I get about the same FPS with my 3770k @ 4.8Ghz; if I clock it to 5Ghz, all that happens is my CPU utilisation goes down from 70% to about 65%.
Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, I keep saying this. Boost 3.0 took the fun out of OC, but at the same time it got us close to the limit of these cards with hardly even having to do anything.
> 
> 
> 
> So before you get the urge to shop and return a card, just know that Boost 3.0 is taking good care of you.
> 
> And look at the average overclock for air cards
> 
> http://hwbot.org/hardware/videocard/geforce_gtx_1080_ti/
> 
> *1830 on air and 2045 on water.*
> 
> So don't feel insecure about your cards unless you're lower than the average.


Yeah, the e-peen bench. Then I play games and realise how much of a beast this card is, even without an OC.

Quote:


> Originally Posted by *Benny89*
> 
> Very pleased with my STRIX so far. 2063 and 6009mhz Memory stable in BF1 all the time. 130-142 fps in game. But really I will have to push up Resolution Scale it seems as its hard to put that card to 99% usage
> 
> 
> 
> 
> 
> 
> 
> .
> 
> This is my second 1080 TI STRIX (first one I returned). And so far seems like STRIX card have nice chips, or I was lucky enough.


Nice, I am happy for you. My Strix will only do 5900Mhz on memory; it will do 6000Mhz and pass the bench, but with artefacts.
I have settled on 2038Mhz for 24/7. This card will bench @ 2100Mhz core, but only with the curve.
With a +100mV offset I can only get to 2050, but that said, for some reason I get better benchmarks with the offset at the same core speed.

Quote:


> Originally Posted by *macwin2012*
> 
> Hey guys ,
> 
> I run a ultra wide monitor 3440x1440p with 1070 , is it worth ugprade to ti ?
> 
> Also i hear some news about nvidia volta in Q3 2017 .


Depends whether you've got a 60Hz or a 144Hz monitor. I have a feeling that if you've got a 60Hz monitor then no, you are probably near maxing your monitor out already; if you've got a 144Hz monitor then yes, it is worth the upgrade, but I would hold out if I could to see if the Vega RX 590 will push NVidia prices down.
I feel the 1080 Ti is perfectly matched to play at 3840x1600 @ 75Hz on my monitor with even the most demanding games, unless you really max out the settings, like GTA5 where the last few crazy settings push FPS down to 18FPS







but ROTTR will be around 35FPS


----------



## mshagg

Quote:


> Originally Posted by *Nico67*
> 
> SC2 Bios only goes to 300w according to SMI, feels as if it it has been soften up a bit to let it clock higher too. As such it doesn't power limit as much as FE, but scores are a bit lower too.
> 
> 
> 
> 2100 max volts 1.031
> 
> 
> 
> also felt like mem clk'd a bit higher but only by +50 more than FE
> 
> if it had more power available I think it would do quite well, probably why SlimJ87D is liking it. Still waiting on that elusive 350w FE bios


Yeah similar results here. Didnt lose anything, but didnt really gain anything either.


----------



## joder

Quote:


> Originally Posted by *Nico67*
> 
> SC2 Bios only goes to 300w according to SMI, feels as if it it has been soften up a bit to let it clock higher too. As such it doesn't power limit as much as FE, but scores are a bit lower too.
> 
> 2100 max volts 1.031


Did you hit 2100 no problem with the stock BIOS?


----------



## Benny89

Quote:


> Originally Posted by *feznz*
> 
> Nice I am happy for you my Strix will only do 5900Mhz on memory but will do 6000Mhz and pass the bench with artefacts.
> I have settled for 2038Mhz for 24/7. This card will bench @ 2100Mhz core but only with curve
> With +100v offset I can only get to 2050 but that said for some reason I get better benchmarks with offset with same core speed.


I didn't really push that card to the limits yet. Added +51 voltage in AB, 2063 no problem (boosting to 2073), stable with 6009 memory. Didn't try to push the memory further, as I think after you break 12k on memory you get diminishing returns in scores.

I could probably push the card with voltage curves up to 2080-2100. It did 2050 on 1.610 V no problem.

But again, I am not in a hurry, as this card is already remarkable and pushes all my games to the limit. An additional 20mhz on the core would not gain me anything in games, only in benches.

Surprisingly, the default voltage curve is really well made on the Strix. At least on mine.

I say this: as long as your card can stay stable in games over 2000mhz on the core and at least 5800 on memory, you are golden. Everything else is just a cherry on top. No real significant gains in games. 2-4 fps at most.


----------



## joder

Quote:


> Originally Posted by *joder*
> 
> Did you hit 2100 @ 1.03v no problem with the stock BIOS?


----------



## Nico67

Quote:


> Originally Posted by *joder*
> 
> Is there one out there or are you hoping for that?


The Inno X3 BIOS was looking promising, at least the early review sample. But until it's available, who knows.









Quote:


> Originally Posted by *joder*
> 
> Did you hit 2100 no problem with the stock BIOS?


Yeah, it can do that at 1.043 on the FE BIOS with the curve adjusted. I game at 2101 @ 1.050 just to be on the safe side.


----------



## alucardis666

1080 Ti FE SLi @ 2050/12000







R7 1700 @ 3.9Ghz


----------



## Benny89

Quote:


> Originally Posted by *alucardis666*
> 
> 1080 Ti FE SLi @ 2050/12000
> 
> 
> 
> 
> 
> 
> 
> R7 1700 @ 3.9Ghz


Nice one!







Little overkill for 60Hz, don't you think?









You finally settled on your cards?







or you will return and change again?


----------



## alucardis666

Quote:


> Originally Posted by *Benny89*
> 
> Nice one!
> 
> 
> 
> 
> 
> 
> 
> Little overkill for 60Hz, don't you think?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You finally settled on your cards?
> 
> 
> 
> 
> 
> 
> 
> or you will return and change again?


I'm happy so far, but the top card does have an issue with the fan: either coil whine, a bad bearing, or it's off balance. It makes a kind of chirping sound through the fan speed range, but if the fan is stopped the noise goes away. The bottom card is silent. TBH, I'll probably learn to live with it. Tired of swapping things in and out and returning and ebaying...



I may get a 4k 144hz HDR monitor down the road when they come out, so for now this works. It's overkill most of the time, but for the VERY demanding / poorly optimized games like ME Andromeda and GRWL it works for me!


----------



## MURDoctrine

Woot! I'm finally under water. I've started playing with the voltage curve, but I already knew my card's silicon was pretty bad. Is Superposition not as demanding as Heaven was? I can do bench passes on it at 2025mhz @ 1.025 or so, but that instantly crashes me in Heaven.


----------



## Slackaveli

Quote:


> Originally Posted by *kevindd992002*
> 
> Thanks. Now it makes sense to set VSYNC to ON (either in-game or in NVCP) even when using an fps limiter just below your monitor's refresh rate.
> 
> I'm really surprised Zotac still has problems with their fans regarding PWM. There was this fluctuation issue near the temp threshold with the 1070 explained here: http://www.overclock.net/t/1613126/gtx-1070-amp-extreme-owners#post_25567515 . I bet your 1080Ti also does this. Not a major issue but certainly a nuisance.


Not "On" though, "Fast", because it works better with no perf hit. Right back to what I said 500 pages ago... G-Sync on, Fast Sync in NVCP, frame limiter 1 below refresh (2 below for refresh rates over 90Hz).
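The cap rule in that last sentence, written out (this is just the poster's rule of thumb, not an official NVIDIA recommendation):

```python
# Frame-limiter rule of thumb from the post above: cap 1 fps under the refresh
# rate, or 2 under once the refresh rate is above 90 Hz, so the limiter kicks
# in before G-Sync hands off to the sync mode at the ceiling.

def fps_cap(refresh_hz):
    return refresh_hz - 2 if refresh_hz > 90 else refresh_hz - 1

for hz in (60, 90, 144):
    print(hz, "->", fps_cap(hz))   # 60 -> 59, 90 -> 89, 144 -> 142
```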







Quote:


> Originally Posted by *Benny89*
> 
> No, she does not
> 
> 
> 
> 
> 
> 
> 
> . Unique female example
> 
> 
> 
> 
> 
> 
> 
> . Well, she is also glad because she gets my golden 980Ti after me and she will finally be able to max out Witcher 3 so overall she want me to finally upgrade


I also have a gamer girl, and she's hot as hell with a fat ass and doesn't hate. I built her a nice rig and she mostly plays Dominion Online on it lol (it's a cool card game). NOBODY can beat her though. Ole RukiKusher is like 8000-250 in that game.
Quote:


> Originally Posted by *gmpotu*
> 
> *What is a "baby voltage bump and lock it one higher up the curve mean"?*
> Does this mean if I am currently having (1.012v @ 2037) and flatline above (1.012v) then I should move to a lower clock like 1999 @ 1.012v and have the next voltage (1.025v @ 2037) so like try moving my curve up one voltage step?
> Should I when testing make the curve flat line on anything above the voltage I'm testing? So like 1.000v and above flatline test.
> Once I find max stable I go to (1.012v) and flatline everything above (1.012v) to find the highest stable clock? etc. etc.?
> 
> I'm kind of repeating questions but I want to make sure that I'm correctly understanding.
> 
> Also, If I don't have a 4K monitor (only 1080p) can I still run the 4k tests with SuPo and other benches? I'm assuming the answer would be NO that it doesn't somehow virtually apply a 4k resolution in like a sandbox environment.
> 
> ps. That Rainbow happens to me a lot.
> 
> What is locking / unlocking the curve and how do you "lock" the curve?


I meant it may be crashing because it needs just a little more juice, like a CPU core would. So give it 25mv (a baby bump; only 25, 50, 75, and 100 do anything, all other 'bumps' are redundant), and then go up the scale one more and lock it there. Like, if it's crashing at 1.05v at 2050, try adding +25mv on the voltage slider, then open the curve and move one dot to the right (or 'up'), which would be 1.063, and lock it there. Always have one step up between each voltage. I find locking at 1.075v with a +50mV setting, with 1.08v being the 2100 mark and flat after that, gives me a locked 2088/2075 in benches and games alike. I no longer have any perfcap reasons besides the occasional "Util", which is nothing, and the bench-related power throttling. Haven't power throttled in a game yet, even Witcher 3, but I don't run uncapped frames in 4k. I lock it at 59fps, and I think that has kept me from power limiting in that game because it's pretty much locked at 59 in 4k.
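If the "only 25/50/75/100 do anything" claim above holds, any other slider value effectively falls back to the previous working step. A tiny model of that (the rounding-down behaviour is an assumption based on the post, not documented Afterburner behaviour):

```python
# Models the claim that only +25/50/75/100 mV slider offsets take effect:
# anything else acts like the nearest lower working step (assumption, per post).

def effective_bump_mv(requested_mv):
    steps = (0, 25, 50, 75, 100)
    capped = min(requested_mv, 100)           # the slider tops out at +100 mV
    return max(s for s in steps if s <= capped)

print(effective_bump_mv(30))    # 25 - a "+30" bump would act like +25
print(effective_bump_mv(100))   # 100
```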


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> i also have a gamer girl, and she's hot as hell with a fat ass and doesn't hate. I built her a nice rig and she mostly plays dominion online on it lol (it's a cool card game). NOBODY can beat her though. Ole RukiKusher is like 8000-250 in that games.


HAHA!







My wife plays mostly Overwatch or some MOBAs so she can be support, and RPGs because we love them (we still play pen-and-paper Warhammer RPG with friends).

BTW, it is really hard to push this card to the limit in BF1 at 1440p. On the Monarchy map GPU usage was rarely above 87% and sometimes in the 74-75% range. Of course that depends on the map; on the German city map it easily goes 96-99%. No CPU bottleneck with the 4.2 Ghz 5775C (65% lowest core usage on core 1, 75% highest core usage on core 3).

BF1 is a REALLY heavily CPU-dependent game.

I am downloading again Witcher 3 to push this card hard









Really beast cards, those 1080 Tis


----------



## KCDC

This may be a newb question, but how are you guys able to run 3dMark benchmarks with Afterburner running? I've read it's due to the OSD, which I have disabled, but I still get crashes during loading, works just fine without AB on, but then I don't have my applied overclock, right?

I'm using the latest beta with the voltage bar unlocked. Thanks for any assistance on the matter. I wanna post scores!!! Getting stable 2100+ and I wanna show off.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KCDC*
> 
> This may be a newb question, but how are you guys able to run 3dMark benchmarks with Afterburner running? I've read it's due to the OSD, which I have disabled, but I still get crashes during loading, works just fine without AB on, but then I don't have my applied overclock, right?
> 
> I'm using the latest beta with the voltage bar unlocked. Thanks for any assistance on the matter. I wanna post scores!!! Getting stable 2100+ and I wanna show off.


You have to disable that custom 3D setting in RTSS.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Nico67*
> 
> SC2 Bios only goes to 300w according to SMI, feels as if it it has been soften up a bit to let it clock higher too. As such it doesn't power limit as much as FE, but scores are a bit lower too.
> 
> 
> 
> 2100 max volts 1.031
> 
> 
> 
> also felt like mem clk'd a bit higher but only by +50 more than FE
> 
> if it had more power available I think it would do quite well, probably why SlimJ87D is liking it. Still waiting on that elusive 350w FE bios


We won't get that one; it was with a review sample of the Inno3D X3, and even the reviewer said the one they have will be different from the ones shipped. I don't think the reviewers can release it either, due to NDA.


----------



## KCDC

Quote:


> Originally Posted by *SlimJ87D*
> 
> You have to disable that custom 3D setting in RTSS.


Appreciate it.


----------



## Alwrath

Quote:


> Originally Posted by *Benny89*
> 
> Nice one!
> 
> 
> 
> 
> 
> 
> 
> Litttle overkill for 60Hz, don't you think?


Mass Effect at ultra in 4K / Crysis 3 in 4K already can't hit 60 fps on a single 1080 Ti. Imagine what games are going to require a year or two from now for 4K max settings. Two is not overkill.


----------



## Qba73

2075 solid, thanks to the tip from Slack to put a fan on the copper heatsink on the Xtreme. Kept me at 64c with no downclock, +400 on mem (regular fans @ 70%). Not bad for an air breather with factory paste.. lol


----------



## jase78

Trying to finally figure out the curve. How's this look? Any suggestions on improving the curve would be greatly appreciated.


----------



## Alwrath

Can anyone provide a link for the latest version of MSI Afterburner? The MSI site doesn't work anymore, and it's a rar file I can't get to work. My Afterburner is from Guru3D and I don't have voltage unlocked.


----------



## Blotto80

Ok, so anyone who hasn't re-pasted with Kryonaut needs to get on that ASAP. I put my EVGA Hybrid on my FE when I first got it with the MX4 I had kicking around and ordered a tube of TG: Kryonaut, it showed up today and the results are impressive.

I've settled into 2012mhz @ 0.981v for my 24/7 clocks, I've got 2x SP120 Performance Edition fans on my hybrid in a push pull. I run them off a MB header and have Speedfan setting a curve. Fan speed ramps up to 70% over 50c, prior to that it's reasonably quiet up to 50% (30% at idle),

Prior to re-pasting, I ran two back to back runs of Supo. Max temp I saw was 51c, it would flutter between 50-51 and top out there. I took it apart and slapped the TG on and immediately re-ran the same back to back Supo and the top was 44c. It would flutter back and forth between 43-44c now.

Not only did I gain 7c for a $15cdn spend but as long as it keeps under 50 degrees my ears will be very grateful.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> CPU is 5775C on 4.2 Ghz and RAM CL9 2133Mhz so it is not bottlenecked at all. But Heaven has moments when you don't utilize 99% of GPU so this can be because of that.


Time to update your sig rig, brother.
Our systems are so similar. We run the exact same ram and ram timings, too. I have my 2400Mhz C10 Corsair Dominator Platinum undervolted and running a tight 9-9-9-24 @ 2133. We (now, since you finally caught a winner with the ROG) are both running over 2063 on our 1080 Tis, and at very cool temps. And of course we are both rocking the mad-overclocked Broadwell destroyers.








Sick system, Bro!


----------



## KedarWolf

Quote:


> Originally Posted by *Alwrath*
> 
> Can anyone provide a msi afterburner link for the latest version? MSI site doesnt work anymore, its a rar file. I cant get it to work. My afterburner is from guru3d and I dont have voltage unlocked.


http://forums.guru3d.com/showpost.php?p=5412373&postcount=216 Works voltage unlocked out of the box.


----------



## 86JR

So I am honestly thinking of joining this club... this and an M.2 will future-proof my rig for another 3 years or so, I think (see sig).

But my question is, which is the best one to get? I seem to remember that on past cards MSI added their own PWM circuitry etc. for better overclocks?

I have an ASUS 690 and it has done well. I don't normally overclock graphics cards; my priority is really low noise and TDP.

I only play Rocket League and Arma 3, but Arma 3 hits the VRAM limit (2GB per core) at 1080p, so 11GB of VRAM would come in useful! Will be buying Escape from Tarkov too.

Thanks!


----------



## KaRLiToS

All you people should post in the *Unigine Superposition thread*


----------



## Slackaveli

Quote:


> Originally Posted by *s1rrah*
> 
> So I repasted my MSI 1080 ti Seahawk X today ... and man I tell you ... there's bad factory paste jobs and then there are unbelievably bad factory paste jobs ...
> 
> This is what I found when I took the pump/cooler off of the GPU die; honestly, I'd expected better from MSI ... LOL ...:
> 
> ...
> 
> 
> 
> ...
> 
> 
> 
> ...
> 
> 
> ...
> 
> I mean it literally poured out and over the GPU die edges ... even dripping down the copper cooling plate ... LOL!!!
> 
> I seriously had to take a five minute break as I just kept staring at in in amazement; never in all my days have I seen that ...
> 
> Anyway ...
> 
> Repasted with Kryonaut, nice even coat ... added eight Enzotech heatsinks to the VRM cooling plate ... and just for fun, strapped on a honker 3000RPM Scythe Kaze fan ... and have quite literally dropped my load temps by 10C. Formerly I would load at 52C and now it only occasionally breaks 42C (hour long 1440p all maxxed Unigine Heaven run, going to do different tests later); this in a 22C ambient room mind you...
> 
> ...
> 
> 
> 
> ...
> 
> 
> 
> ...
> 
> But that factory paste job though!


Exactly why you should ALWAYS repaste every single GPU you ever buy. It's pathetic, but it's also great to be able to get such gains for so little effort/cost. Makes it more fun tbh.
Quote:


> Originally Posted by *Benny89*
> 
> HAHA!
> 
> 
> 
> 
> 
> 
> 
> My wife plays mostly Overwatch or some MOBOas so she can be support and RPGs cause we love them (we still play pen-paper Warhammer RPG with friends).
> 
> BTW. It is really hard to push this card to limit in BF1 on 1440p. On Monarchy map GPU usage was rarely above 87% and sometimesin 74-75%. Of course that depends on map. In German City map it easly goes 96-99%). No CPU bottleneck with 4.2 Ghz 5775C (65 % lowest core usage on cire 1, 75% highest core usage on core 3).
> 
> BF1 is REALLY heavy CPU dependent game.
> 
> I am downloading again Witcher 3 to push this card hard
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Really beast cards, those 1080 Tis


Yeah, no doubt. I've settled in on all ultra but medium post-process and the 112% resolution slider. No jaggies AT ALL, I keep ~99% usage, and still stay over 135 fps. Usually still 144 on most maps. The lowest I've seen was on Scar, at about 122.
Quote:


> Originally Posted by *Qba73*
> 
> 2075 solid thanks to tip from Slack to put a fan on the copper heatsink on the xtreme, kept me at 64c no downclock, 400 on mem (reg fans @70%) not bad for air breather with factory paste..lol


Glad to hear it man! Franken cooling ftw! I love gaming in the 50c's on air. It's dope! Definitely order some Thermal Grizzly Kryonaut and you'll be in the 50c's, too!








Quote:


> Originally Posted by *Alwrath*
> 
> Can anyone provide a msi afterburner link for the latest version? MSI site doesnt work anymore, its a rar file. I cant get it to work. My afterburner is from guru3d and I dont have voltage unlocked.


Open the Windows Store and download a RAR extractor.


----------



## Nico67

Quote:


> Originally Posted by *Slackaveli*
> 
> not "on" though, "FAST", because it works better and no perf hit. Right back to what I said 500 pages ago... g-sync on, fast sync on NCP, frame limiter 1 below refresh (2 below >90Hz).


Why would you want Vsync / Fast Sync "On" with Gsync + the RTSS limiter? If anything it would never come on anyway, and at worst would add lag that didn't need to be there. No sync is always the least lag, and a limiter is designed to avoid hitting the Vsync threshold. I can't see Gsync+Vsync being less laggy than Gsync+No Sync within the Gsync operating range, but if it is I would be curious to know.


----------



## Slackaveli

Quote:


> Originally Posted by *Nico67*
> 
> Why would you want Vsync / Fast Sync "On" with Gsync+RSS limiter? if anything it would never come on anyway and at worse would add lag that did need to be there. No sync is always the least lag and a limiter is designed to avoid it hitting Vsync threshold. I can't see Gsync+Vsync being less laggy than Gysnc+No Sync within the Gsync operating range, but if it is I would be curious to know


Because G-Sync doesn't work between 135-144, and because I also have a 4k/60 and Fast Sync works great on that. And Fast Sync adds no input lag; it just stops screen tearing. Fast on or off is the same on input lag, so no worries there. I used to always run it off, too, but I prefer this method now. Blurbusters recommends Vsync on, but I just won't do it and settled on Fast to try it out. Been on it ever since. Works great on the 4k, too. Feels like free FreeSync lol.


----------



## CoreyL4

What does voltage lock do when you are tweaking the curve?


----------



## Nico67

Quote:


> Originally Posted by *Slackaveli*
> 
> because g-sync doesnt work between 135-144 and because I also have a 4k/60 and fast sync works great on that. And fast sync adds no input lag, it just stops screen tearing. fast on or off is the same on input lag so no worries there. i used to always run off , too, but i prefer this method now.


Cool, so G-Sync isn't usable all the way up to the refresh rate. Maybe better to run the limiter at 133fps in that case? I did see where they said Fast Sync was a bit stuttery even if lag was low. Think I should be safe with sync off at sub-100Hz.


----------



## Talon2016

Flashed my Gaming X with the Asus Strix OC vBIOS and it works perfectly.

After further testing today, I had to add 100MHz to the core and 117% power limit to get 2012MHz in game for BF1, and it was crashing on the stock Gaming X vBIOS.

Flashed the Strix BIOS this evening, and now a +75MHz OC and 120% power limit yields 2025MHz on the core; it was stable for over an hour of BF1. This also removed a weird microstutter issue the MSI vBIOS seemed to have. I will keep pushing the core and see what I can get before I even touch the voltage or manual curves.

Anyone having issues on the Gaming X should give the Asus vBIOS a go.
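For reference, the power-limit slider is just a percentage of the card's base power target, so figures like 117% or 120% translate to watts as below. The 250 W base used here is the reference 1080 Ti TDP and is an assumption for illustration; partner cards ship with different targets:

```python
def power_target_watts(base_tdp_w: float, limit_percent: float) -> float:
    """Convert an Afterburner/Precision power-limit percentage to watts."""
    return base_tdp_w * limit_percent / 100.0

print(power_target_watts(250, 117))  # 292.5
print(power_target_watts(250, 120))  # 300.0
```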


----------



## Slackaveli

Quote:


> Originally Posted by *Nico67*
> 
> Cool, so G-Sync isn't usable all the way up to the refresh rate. Maybe better to run the limiter at 133fps in that case? I did see where they said Fast Sync was a bit stuttery even if lag was low. Think I should be safe with sync off at sub-100Hz.


No doubt. Sync off it's still gravy; probably a lock at 135 would be ideal.

OR... maybe time to give ULMB another day in court. Could set the monitor up for ULMB @ 120Hz, frame rate locked at 119, with Fast Sync so no tearing. I've never felt any sort of stutter. Oh wow, I'm actually going to try that out.


----------



## gmpotu

Quote:


> Originally Posted by *slackaveli*
> I meant it may be crashing because it needs just a little more juice, like how a CPU core would. So give it 25mv (baby bump; only 25, 50, 75, and 100 do anything, all other 'bumps' are redundant), and then go up the scale one more and lock it there. Like, if it's crashing at 1.05v at 2050, try adding +25mv on the voltage slider, then open the curve and move one dot to the right (or 'up'), which would be 1.063, and lock it there. Always have one step up between each voltage. I find locking at 1.075v with a +50mv setting, with 1.08v being the 2100 mark and flat after that, gives me a locked 2088/2075 in benches and games alike. I no longer have any perfcap reasons besides occasional "util", which is nothing, and the bench-related power throttling. Haven't power throttled in a game yet, even Witcher 3, but I don't run uncapped frames in 4K. I lock it at 59fps and I think that has kept me from power limiting in that game because it's pretty much locked in 4K at 59.


Just finished playing World of Warcraft at max settings 1920x1080 with SSAA on for the whole night and during raid.

I thought I was good to go with locking my card [email protected] No issues all night, no power limits, and temps below 55c most of the night. Then out of the blue, when I was doing some easy solo questing, the entire computer crashed with power loss. Now any time I try to launch WoW the computer crashes, even if I go back to stock settings. Funny thing is that after playing with the clocks I wound up liking the default curve and just increasing the core by 80 and locking it with Ctrl+L at 1.093v. I was seeing avg 60-144 fps depending on location or if it was a crazy animated fight.
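The "bins" slackaveli's quoted advice relies on come from Pascal boost quantizing the core clock in roughly 13 MHz steps, which is why tiny slider nudges between bins are redundant. A toy sketch of that snapping; the step value is the commonly reported approximation, the function is hypothetical, and real bins are offsets from the card's own clock table, so this is illustration only:

```python
PASCAL_CLOCK_STEP_MHZ = 13  # commonly reported boost-bin granularity (approximate)

def snap_to_bin(clock_mhz: float, step: int = PASCAL_CLOCK_STEP_MHZ) -> int:
    """Snap a requested core clock to the nearest boost bin; offsets that
    don't cross a bin boundary change nothing."""
    return round(clock_mhz / step) * step

print(snap_to_bin(2080))  # 2080 (already on a bin)
print(snap_to_bin(2075))  # 2080 (snaps to the nearest bin)
```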


----------



## gmpotu

Hmm, now that my system crashed with WoW it won't run at all. I just ran Superposition and Heaven bench fine, but as soon as I launch the .exe for WoW the system reboots.


----------



## Nico67

Quote:


> Originally Posted by *gmpotu*
> 
> Just finished playing World of Warcraft at max settings 1920x1080 with SSAA on for the whole night and during raid.
> 
> I thought I was good to go with locking my card [email protected] No issues all night and no power limits and temps below 55c most of the night. Then out of the blue when I'm doing some easy solo questing the entire computer crashes with power loss. Now any time I try to launch wow the computer crashes even if I go back to stock settings. Funny thing is that after playing with the clocks I wound up liking the default curve and just increasing the core by 80 and locking it with Ctrl + L at 1.093v. I was noticing avg 60-144 fps depending on location or if it was a crazy animated fight.


looks like that was 2075 @ 1.075 even though it should have locked it at 1.093. Would be better to drop 1.075 and 1.081 a bin so that it ran 2075 at 1.093 properly.

Also sounds like you may have a corrupted file in WoW, and that's why it won't start. Could also be those power issues again?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> because g-sync doesnt work between 135-144 and because I also have a 4k/60 and fast sync works great on that. And fast sync adds no input lag, it just stops screen tearing. fast on or off is the same on input lag so no worries there. i used to always run off , too, but i prefer this method now. Blurbusters recommends vsync on, but i just wont do it and settled on fast to try it out. Been on it everr since. Works great on the 4k, too. Feels like free freeesync lol.


So fast sync + gsync + frame limiter?


----------



## gmpotu

Quote:


> Originally Posted by *Nico67*
> 
> looks like that was 2075 @ 1.075 even though it should have locked it at 1.093. Would be better to drop 1.075 and 1.081 a bin so that it ran 2075 at 1.093 properly.
> 
> Also sounds like you may have a corrupted file in WoW, and that's why it won't start. Could also be those power issues again?


Yeah, I think a file got corrupted in the launcher. I opened the wow.exe from the Program Files folder and it still crashed on me again, but eventually I was able to get the launcher to run and do a repair.
This could still be more issues with my PSU, even though it seemed to be doing well after switching the PCI-E connectors I was using.

I'll see if dropping the 1.075 and 1.081 a couple of bins to 2050, and dropping 1.093 down to 2067, works better and ends up being more stable. Then I'll post back with how it goes. I bought a Corsair HXi 1000W PSU the other day, so I might hook that up tomorrow and rerun all my cables to see if the new PSU gives me more stability. I'm pretty sure my current one is over 8 years old. Ultra isn't even a company anymore if I remember correctly.


----------



## illypso

Quote:


> Originally Posted by *Slackaveli*
> 
> no doubt. sync off its still gravy, probably a lock at 135 would be ideal.
> 
> OR... uh... Maybe time to give ULMB another day in court. Could set monitor up as a ULMB @ 120Hz , frame rate locked at 119 and have fast sync so no tearing. Ive never felt any sort of stutter. Oh, wow, Im actually going to try that out.


Personally, after comparing both ULMB and G-Sync a lot of times with my screen (Acer Predator XB271HU), I find ULMB to be a lot better in gaming than G-Sync because there is no blur anymore. I made a lot of tests to compare; as an example, going full speed in GTA in a car, you can see better and gauge distance, so you make better maneuvers between incoming cars. In first-person shooters you can almost keep eye contact with your target as you move your gun to him. Everything seems clear when you move; it's night and day.

BUT ONLY IF you have high FPS (I find 90 or more to be the sweet spot); 2x 1080 Ti is giving me that almost everywhere. Because it removes the blur, any stutter or low FPS becomes so visible that it's not fun to play, so G-Sync becomes better. And only if you are not able to see or feel the screen flickering, of course. And if your screen doesn't have too much ghosting; mine has some, but in gameplay I don't see it.

So don't be shy to give ULMB a try, you could love it.


----------



## CoreyL4

Quote:


> Originally Posted by *Talon2016*
> 
> Flashed my Gaming X with the Asus Strix OC vBIOS and it works perfectly.
> 
> After further testing today I had to add 100Mhz to core and 117% power limit to get 2012Mhz in game for BF1 and it was crashing on stock Gaming X vBios.
> 
> Flashed Strix bios this evening and now a +75mhz OC and 120% power limit yields 2025Mhz on core and was stable for over an hour of BF1. This also removed a weird microstutter issue the MSI vBios seemed to have. I will keep pushing the core and see what I can get before I even touch the voltage or manual curves.
> 
> Anyone having issues on Gaming X should give the Asus vBios a go.


I just flashed my Gaming X. I'll let you know if I see any improvement.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Talon2016*
> 
> Flashed my Gaming X with the Asus Strix OC vBIOS and it works perfectly.
> 
> After further testing today I had to add 100Mhz to core and 117% power limit to get 2012Mhz in game for BF1 and it was crashing on stock Gaming X vBios.
> 
> Flashed Strix bios this evening and now a +75mhz OC and 120% power limit yields 2025Mhz on core and was stable for over an hour of BF1. This also removed a weird microstutter issue the MSI vBios seemed to have. I will keep pushing the core and see what I can get before I even touch the voltage or manual curves.
> 
> Anyone having issues on Gaming X should give the Asus vBios a go.


You should run benchmarks and see if your scores are actually higher or not just in case.


----------



## alucardis666

Couldn't be happier with my SLi setup and the 1700 right now.

R7 1700 @ 4.0Ghz --- EVGA GTX 1080Ti FE SLi @ 2037Mhz --- 16886 --- 4K Optimized


----------



## Slackaveli

Quote:


> Originally Posted by *Nico67*
> 
> looks like that was 2075 @ 1.075 even though it should have locked it at 1.093. Would be better to drop 1.075 and 1.081 a bin so that it ran 2075 at 1.093 properly.
> 
> Also sounds like you may have a corrupted file in WoW, and that's why it won't start. Could also be those power issues again?


could be the unprecedented solar flares we are getting hammered with the next three days. Put your foil hats on, men and ladies!


----------



## Slackaveli

Quote:


> Originally Posted by *Slackaveli*
> 
> could be the unprecedented solar flares we are getting hammered with the next three days. Put your foil hats on, men and ladies!


Quote:


> Originally Posted by *alucardis666*
> 
> Couldn't be happier with my SLi setup and the 1700 right now.
> 
> R7 1700 @ 4.0Ghz --- EVGA GTX 1080Ti FE SLi @ 2037Mhz --- 16886 --- 4K Optimized


Lovely. If I didn't love my CPU so much I'd be on a 1600X or 1700 right now, too. But I'm gonna wait for another round or two being on this lil sexy lady.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> lovely. If i didnt love my cpu so much id be on a 1600x or 1700 right now, too. But I'm gonna wait for another round or two being on this lil sexy lady.


Don't blame you, bro. I'm just so impressed with the thermals and power draw on the Ryzen platform. My room is so much cooler and my performance isn't too far off from the old 6950X. And I'm sure performance will continue to improve with drivers and BIOS updates.

Only thing I need now is for *this* to drop in price to add it to the rig.

Only wish I could've done dual hybrid coolers on these FEs. I'm sure I'm leaving another 10% of performance on the table without doing curves in AB and having that sweet thermal headspace. But tbh, my cards aren't loading past ~70c, and I expected much worse with the SLI sammich.



Maybe I'll break things down next week and throw some Thermal Grizzly on them and see if I can't shave off a few C.


----------



## gmpotu

Spent another hour in game once I got everything back up and running.
[email protected] 1.081v

Settings
+51 voltage%
+25 core
+0 power
Fan Hard set to 50%

In Superposition I mostly see blue. So I either need to increase the voltage slider to 76+ tomorrow, or start manually adjusting the curve again, testing with the voltage step on the curve upped only one more bin. So I guess I'll check the stability of 2025 @ 1.093v, since 1.081v is showing blue in GPU-Z. If 1.093 doesn't play nice, then I'll jump to +76 on the voltage slider.

I'll be more than happy if I can test and find a sweet spot for 2000 that's not at the max 1.093v. Just hope my crashing is over for now.

Okay bed time. 03:50am here. See you guys in 10+ hours.


----------



## feznz

Quote:


> Originally Posted by *gmpotu*
> 
> Just finished playing World of Warcraft at max settings 1920x1080 with SSAA on for the whole night and during raid.
> 
> I thought I was good to go with locking my card [email protected] No issues all night and no power limits and temps below 55c most of the night. Then out of the blue when I'm doing some easy solo questing the entire computer crashes with power loss. Now any time I try to launch wow the computer crashes even if I go back to stock settings. Funny thing is that after playing with the clocks I wound up liking the default curve and just increasing the core by 80 and locking it with Ctrl + L at 1.093v. I was noticing avg 60-144 fps depending on location or if it was a crazy animated fight.


I found a bug in MSI AB: you need to boot into safe mode and remove the profiles in the MSI install folder. The file somehow gets corrupted, and it loads a previous setting that may not be stable all the time.


----------



## fisher6

Reporting back on the SC2 BIOS I flashed yesterday on my FE. It is definitely more stable for me than the Strix one. I still hit the power limit sometimes, but it looks very stable at 2100/6000. My temps also went down a degree or two, to 47-48C during load.


----------



## dunbheagan

Hey guys, took me a while to finish my build, but today I could finally press the power button for the first time. Fortunately, everything worked instantly.

First shot, no optimisation at all:

Settings:
Benchmark: Metro Last Light Redux, 4K, 4x Antialiasing, 1 run
GPU Voltage: +0%
Power Limit: +0%
Core Clock: +200MHz
Memory Clock: +500MHz

Results after 1 run (stable):
Core Clock: 2100MHz (not one single drop)
Memory Clock: 6003MHz
Max Temp: 36°C
Max Power Target: 96% (no power limit at any time)

Higher clocks seem to need a higher voltage.

I am extremely satisfied being able to run a steady 2100/6000 even at stock voltage. Shunt mod works perfectly!
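For anyone curious why the shunt mod works: stacking a second shunt on top of the stock current-sense resistor halves the resistance the controller measures, so it sees half the real current and the effective power limit roughly doubles. A quick sanity check; the 5 mΩ stock value is an assumption for illustration, not a measured spec:

```python
def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

stock_shunt = 0.005                       # assumed 5 mOhm sense resistor
modded = parallel(stock_shunt, stock_shunt)
print(modded)                # ~2.5 mOhm: controller sees half the current
print(stock_shunt / modded)  # ~2x: real power is about double what's reported
```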


----------



## dunbheagan

Same for DOOM 4K, Nightmare Settings

2100MHz GPU, no drops
6003MHz Memory
1.05V
GPU Temp 37°C

I am pretty sure I will keep it at these settings; I am totally satisfied.

PS: I used Conductonaut between the GPU die and waterblock


----------



## GRABibus

Quote:


> Originally Posted by *dunbheagan*
> 
> Same for DOOM 4K, Nightmare Settings
> 
> 2100MHz GPU, no drops
> 6003MHz Memory
> 1,05V
> GPU Temp 37 degree
> 
> I am pretty shure i will keep it at these settings, i am totally satisfied.
> 
> PS: I used Conductonaut between GPU Die and Waterblock


You won the Silicon lottery !


----------



## Luckbad

In a couple weeks, once I can find a pair of smaller cards (hoping for FTW3) that will fit in my case in SLI, I'll likely be selling my Zotac.

I'll ping this thread when I list it to give you guys dibs. It's a Zotac 1080 Ti Amp Extreme that boosts to 2037.5 MHz out of the box (no overclocking involved).

Not looking to profit or anything. Feel free to ping me via PM if you're interested.


----------



## dunbheagan

Quote:


> Originally Posted by *GRABibus*
> 
> You won the Silicon lottery !


Yeah, seems like I was lucky.

Nice motivation for my first watercooled build!


----------



## Chris Ihao

Hey guys. Got my Aorus (non-Xtreme version) yesterday. Increased the power limit to +125%, added 25 on the clock, and it basically stays at about 1950 core under load. Temps are excellent at about 71 degrees. I was planning on reaching 2GHz and leaving it be, but it seems to throttle down a bit (from 1987 at the moment). Any tips?

Oh, and I get around 14500 in Fire Strike Extreme with my 4790K clocked to 4.5GHz. Seems about normal, right?


----------



## Talon2016

I'm confused how my temps are so low with this beast on the stock air cooler. Last night I was seeing 47C with 70% fan speed on the Gaming X @2025MHz. Voltage is stock. This is after 1 hour of BF1 multiplayer.

My regular 1080 Gaming X with the same fan profile at 2088 would hit 58-60C. I would have expected the beefier Ti card to run hotter.


----------



## TheBoom

Quote:


> Originally Posted by *Talon2016*
> 
> I'm confused how my temps are so low with this beast on stock air cooler. Last night was seeing 47C with 70% fan speed on gaming X @2025Mhz. Voltage is stock. This is after 1 hour of BF1 multiplayer.
> 
> My regular 1080 gaming x with same fan profile at 2088 would hit 58-60C. I would have expected the beefier Ti card to run hotter.


Why would a beefier cooler run hotter?


----------



## pantsoftime

Didn't get any gains out of the SC2 BIOS on EVGA FE. I was power limiting quite a bit and couldn't get better clocks. Time to try something else.


----------



## derpa

Quote:


> Originally Posted by *Talon2016*
> 
> I'm confused how my temps are so low with this beast on stock air cooler. Last night was seeing 47C with 70% fan speed on gaming X @2025Mhz. Voltage is stock. This is after 1 hour of BF1 multiplayer.
> 
> My regular 1080 gaming x with same fan profile at 2088 would hit 58-60C. I would have expected the beefier Ti card to run hotter.


Hmmm....dynamic work load allowing the card to heat and cool enough to keep the temp down? I'm guessing you have everything maxed out?


----------



## KaRLiToS

With EK blocks, the idle temps on my cards are

Card1 : 25'C
Card2: 22'C

During load 2037mhz

Card1: 41'C
Card2: 38'C

With fans at 0%


----------



## mtbiker033

Quote:


> Originally Posted by *KaRLiToS*
> 
> With EK blocks, the idle temps on my cards are
> 
> Card1 : 25'C
> Card2: 22'C
> 
> During load 2037mhz
> 
> Card1: 41'C
> Card2: 38'C
> 
> With fans at 0%


nice!!

I was just going to post and ask what would be a good waterblock to use, as I'm tired of waiting for a hybrid kit. I think you just answered my question (EK!)

https://www.ekwb.com/custom-loop-configurator/shared/oW58fb69b81833f

wow, that's expensive!! haha, I'm sure this could be done cheaper


----------



## DStealth

Quote:


> Originally Posted by *dunbheagan*
> 
> Same for DOOM 4K, Nightmare Settings
> 
> 2100MHz GPU, no drops
> 6003MHz Memory
> 1,05V
> GPU Temp 37 degree
> 
> I am pretty shure i will keep it at these settings, i am totally satisfied.
> 
> PS: I used Conductonaut between GPU Die and Waterblock


Show some benchmarks please, in order to conclude you have no drops.

What card do you have? Just ordered the EK MSI nickel block... hope after the RMA the card won't be much worse than my previous one...

If it's worse, I'll have to take the soldering iron... and reverse-engineer the good one


----------



## fisher6

Has anybody tried the AMP! Extreme BIOS on an FE card?

EDIT: There is also this Ichill X4 Ultra bios https://www.techpowerup.com/vgabios/191230/191230


----------



## schmak01

Quote:


> Originally Posted by *shalafi*
> 
> can you please post temps during a Firestrike Extreme/Ultra Stress test?


FS Extreme Score:

http://www.3dmark.com/3dm/19459465?

And stats

Max temp there was only 39

I did the TimeSpy too

http://www.3dmark.com/3dm/19459936?

Stats


This one got a bit warmer at 45 C max temp.

I'm still getting a little throttling on the GPU clocks, which is weird; I don't get that with Heaven or Valley benchmarks, just 3DMark.


----------



## madmeatballs

Quote:


> Originally Posted by *dunbheagan*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Hey guys, took me a while to finish my build, but today I could finally press the power button for the first time. Fortunately, everything worked instantly.
> 
> First shot, no optimisation at all:
> 
> Settings:
> Benchmark: Metro Last Light Redux, 4K, 4x Antialiasing, 1 run
> GPU Voltage: +0%
> Power Limit: +0%
> Core Clock: +200MHz
> Memory Clock: +500MHz
> 
> Results after 1 run (stable):
> Core Clock: 2100MHz (not one single drop)
> Memory Clock: 6003MHz
> Max Temp: 36°C
> Max Power Target: 96% (no power limit at any time)
> 
> Higher clocks seem to need a higher voltage.
> 
> I am extremely satisfied being able to run a steady 2100/6000 even at stock voltage. Shunt mod works perfectly!


Nice build and OC!

What case is that?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *schmak01*
> 
> FS Extreme Score:
> 
> http://www.3dmark.com/3dm/19459465?
> 
> And stats
> 
> Max temp there was only 39
> 
> I did the TimeSpy too
> 
> http://www.3dmark.com/3dm/19459936?
> 
> Stats
> 
> 
> This one got a bit warmer at 45 C max temp.
> 
> I'm still getting a little throttling which is weird on the GPU clocks, I don't get that with Heaven or Valley benchmarks, just 3DMark.


Heaven and Valley are good tests for realistic gameplay scenarios.

3DMark and Superposition are very power-hungry tests that run your card to nearly its maximum power draw, so it makes sense your card would be hotter. These double as stress tests.


----------



## nyk20z3

http://www.guru3d.com/news-story/evga-teases-gtx-1080-ti-k%7Cngp%7Cn-edition.html


----------



## KingEngineRevUp

Quote:


> Originally Posted by *nyk20z3*
> 
> http://www.guru3d.com/news-story/evga-teases-gtx-1080-ti-k%7Cngp%7Cn-edition.html


***, am I reading that right? 2,531 Mhz on the core?


----------



## madmeatballs

Quote:


> Originally Posted by *SlimJ87D*
> 
> ***, am I reading that right? 2,531 Mhz on the core?


Yeah, with LN2.

Weird though, he reached 3GHz with an FE before (Here, or here on TPU)


----------



## KingEngineRevUp

Quote:


> Originally Posted by *madmeatballs*
> 
> Yea with LN2.
> 
> Weird tho, he reached 3GHz with an FE before Here


I think he did his own modifications there, but with the Kingpin maybe he just ran it as is?


----------



## Foxrun

Quote:


> Originally Posted by *Talon2016*
> 
> I'm confused how my temps are so low with this beast on stock air cooler. Last night was seeing 47C with 70% fan speed on gaming X @2025Mhz. Voltage is stock. This is after 1 hour of BF1 multiplayer.
> 
> My regular 1080 gaming x with same fan profile at 2088 would hit 58-60C. I would have expected the beefier Ti card to run hotter.


What resolution are you running at? Before I switched to 4k, I would have the same temps as you are.


----------



## madmeatballs

Quote:


> Originally Posted by *SlimJ87D*
> 
> I think he did his own modifications there, but the kingpin may be he just ran it as is?


Could be, but still in LN2 though. Probably also has a custom bios.


----------



## dunbheagan

Quote:


> Originally Posted by *DStealth*
> 
> Show some benchmarks pls in oreder to conclude you have no drops
> 
> 
> 
> 
> 
> 
> 
> What card you have?
> Just ordered EK MSI Nickel block...hope after the RMA the card won't be much worster than my previous one...
> 
> 
> 
> 
> 
> 
> 
> 
> If it's worst I have to take the soldering iron...and reverse-engineer the good one


Here it is, one full run Metro LL redux:



2101MHz, not one single drop, stock voltage









I have an Asus 1080 Ti FE, EKWB cooler + backplate, liquid metal as thermal paste, and the shunt mod


----------



## madmeatballs

Quote:


> Originally Posted by *dunbheagan*
> 
> Here it is, one full run Metro LL redux:
> 
> 
> 
> 2101MHz, not one single drop, stock voltage
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have a Asus 1080Ti FE, EKWB Cooler+Backplate, liquid metal as thermal paste and shunt mod


Do you have a photo of your shunt mod? hehe, I wanna see how you did it. Really considering doing it as well.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *dunbheagan*
> 
> Here it is, one full run Metro LL redux:
> 
> 
> 
> 2101MHz, not one single drop, stock voltage
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have a Asus 1080Ti FE, EKWB Cooler+Backplate, liquid metal as thermal paste and shunt mod


Nice, you can probably do higher with a bit of voltage.


----------



## dunbheagan

Quote:


> Originally Posted by *madmeatballs*
> 
> Do you have a photo of your shunt mod? hehe I wanna see how you did it.
> 
> 
> 
> 
> 
> 
> 
> Really considering on doing it as well.


----------



## madmeatballs

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *dunbheagan*






Is that liquid electrical tape, then RTV silicone?


----------



## iluvkfc

Is my 1080 Ti FE defective or something?

In gaming it cannot even hold 1.000V; it drops below due to the power limit! PL is set at 120%, and still, at 0.962-0.975V it reaches 120% and throttles. Haven't even tried benchmarks yet...


----------



## dunbheagan

Quote:


> Originally Posted by *madmeatballs*
> 
> Nice build and OC!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What case is that?


Thanks buddy! It is a Corsair Carbide Air 540. Still a bit messy, but I wanted to test first. Will do some work on the cables and stuff later...


----------



## dunbheagan

Quote:


> Originally Posted by *madmeatballs*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Is that Liquid Electric tape then RTV silicone?


No, it's black nail polish and then black heat-resistant silicone


----------



## Qba73

Quote:


> Originally Posted by *KedarWolf*
> 
> http://forums.guru3d.com/showpost.php?p=5412373&postcount=216 Works voltage unlocked out of the box.


KW,

4.4.0 beta 7 ;-) I tried changing the URL from 6 to 7 and it works. Nothing any higher though:

http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta7.rar


----------



## Dasboogieman

Cooling mod Mk2

I cannibalized an old Palit Twin Fan GTX 570 heatsink. Now the core backplate + VRAM gets solid copper block cooling. Temps now hold steady at 60 degrees when running Superposition 8K. Everything else doesn't break 55 degrees. Basically an additional 10-degree improvement on the Intel HSF mod.

I'm taking delivery of a dead single-slot Quadro GPU in a few weeks; can't wait to see this thing with a sexy blower on top.


----------



## Talon2016

Quote:


> Originally Posted by *Foxrun*
> 
> What resolution are you running at? Before I switched to 4k, I would have the same temps as you are.


1440p with 115% resolution scale. 4K would probably heat it more, yes. Regardless, it just seemed odd that the card ran cooler than the stock 1080, even with a beefier cooler, since as pointed out the card draws a hell of a lot more power.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Dasboogieman*
> 
> 
> 
> Cooling mod Mk2
> 
> I cannibalised an old Palit Twin Fan GTX 570 heatsink. Now the core backplate + VRAM gets solid copper block cooling. Temps now hold steady at 60 degrees when running Supo 8K. Everything else doesn't break 55 degrees. Basically an additional 10 degree improvement on the Intel HSF mod.
> 
> I'm taking delivery of a dead Quadro Single slot GPU in a few weeks, cant wait to see this thing with a sexy blower on top.


Quote:


> Originally Posted by *madmeatballs*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Is that Liquid Electric tape then RTV silicone?


Liquid electrical tape is good and it just peels off, like a sticker. Costs like $4.

If you're not using it, get a Fujipoly thermal pad with a 17 W/mK conductivity rating. That will help even more.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Talon2016*
> 
> 1440p with 115% resolution scale. 4K would probably heat it more yes. Regardless it just seemed odd the card ran cooler than the stock 1080. Even with a beefier cooler as pointed out the card draws a hell of a lot more power.


The coolers are larger in volume for the 1080 Ti, so I'm not surprised and don't find it weird. The MSI Gaming X cooler is almost twice the volume.


----------



## dunbheagan

Quote:


> Originally Posted by *SlimJ87D*
> 
> Nice, you can probably do higher with a bit of voltage.


Yeah, I tried a little, like +230MHz +100%V and +230MHz +40%V, but I haven't found a stable setting above +200MHz yet. 2100/6000 seems to work just fine. I think I will not spend too much time benchmarking and just keep these "relaxed" settings without any voltage increase.

Increasing the voltage also makes the coil whine louder, which is another reason for me not to touch it. At stock voltage the coil whine is audible in an open case, but it is OK. With higher voltage it gets annoying.


----------



## Talon2016

Quote:


> Originally Posted by *SlimJ87D*
> 
> The coolers are larger in volume for the 1080 Ti, I'm not surprised or find it weird. The msi gaming x is almost twice the volume.


Copy.


48C max. I'll flash back in a bit to see what the MSI vBIOS scores and if it can hold it.

Settings on chip are shown in afterburner.


----------



## Addsome

Alright, so the Inno3D BIOS is best for me on a Gigabyte FE with a 1080 hybrid kit. The BIOS lets me clock way higher on memory and actually gives FE users 15W more. I've also run benchmarks, and it does improve over the stock BIOS.

https://www.techpowerup.com/vgabios/191230/191230


----------



## shalafi

Quote:


> Originally Posted by *schmak01*
> 
> FS Extreme Score:
> 
> Max temp there was only 39


Yeah, the FS benchmark is too short to heat the card up to the max; that's why I asked for a FS Ultra/Extreme *Stress Test* specifically. Sorry to be so demanding.


----------



## KraxKill

Quote:


> Originally Posted by *Addsome*
> 
> Alright so the Inno3D bios is best for me on a Gigabyte FE with a 1080 hybrid kit. The bios lets me clock way higher on memory and actually provides FE users with 15W more. I've also tested benchmarks and it does improve over stock bios.
> 
> https://www.techpowerup.com/vgabios/191230/191230


More info please. How did you verify power usage?


----------



## KedarWolf

Quote:


> Originally Posted by *Addsome*
> 
> Alright so the Inno3D bios is best for me on a Gigabyte FE with a 1080 hybrid kit. The bios lets me clock way higher on memory and actually provides FE users with 15W more. I've also tested benchmarks and it does improve over stock bios.
> 
> https://www.techpowerup.com/vgabios/191230/191230


If it's the same X4 Inno3D BIOS I tried, it never worked right on my Gigabyte FE at all.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Addsome*
> 
> Alright so the Inno3D bios is best for me on a Gigabyte FE with a 1080 hybrid kit. The bios lets me clock way higher on memory and actually provides FE users with 15W more. I've also tested benchmarks and it does improve over stock bios.
> 
> https://www.techpowerup.com/vgabios/191230/191230


That's good news yo!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> if it's the same X4 Inno3D BIOS I tried it never worked right on my Gigabyte FE at all.


Same here, I couldn't launch a game at all.


----------



## fisher6

Also thinking of trying the iChill BIOS. Not 100% sure the SC2 BIOS adds 15W.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KraxKill*
> 
> More info please. How did you verify power usage?


We verified the power usage and tested it pages ago.

I did a PSU reading and also used the nvidia commands.

But I could only run benchmarks with those; I couldn't launch a game.
Quote:


> Originally Posted by *fisher6*
> 
> Also thinking to try the Ichill bios. Not 100% sure the SC2 bios adds 15w.


It doesn't; it has a max power target of 300 watts.
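For anyone wanting to replicate the verification: `nvidia-smi --query-gpu=power.draw --format=csv,noheader` prints the board power, one line per GPU. A minimal sketch of parsing that output; the sample line here is hypothetical, not a captured reading:

```python
def parse_power_draw(line: str) -> float:
    """Parse one line of `nvidia-smi --query-gpu=power.draw
    --format=csv,noheader` output, e.g. '291.53 W'."""
    return float(line.strip().rstrip("W").strip())

sample = "291.53 W"  # hypothetical output line
print(parse_power_draw(sample))  # 291.53
```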


----------



## Blotto80

It worked for my FE and resulted in better power limiting, but the blower fan didn't really work and I think it may have slowed the pump down; temps were through the roof, easily 10c higher in Superposition. The only BIOS that really worked for me so far was the SC2, but it seems to bench a bit lower than the stock FE.
I think I'm done chasing clock speed on this; I've moved on to trying to maintain low voltage/clocks. Out of the box my boost was 1911MHz, so I'm aiming for 2012MHz for a nice round 100MHz OC. So far I'm pretty stable at 0.981v at 44c max.


----------



## madmeatballs

Quote:


> Originally Posted by *SlimJ87D*
> 
> Liquid electrical tape is good and it just peels off.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Peels off like a sticker. Cost like $4
> 
> If you're not using it, get a Fujipoly thermal pad with a 17 W/mK conductivity rating. That will help even more.


Do you think liquid electrical tape would prevent what happened to the guy in the other thread whose shunt fell off? lol. That is what is stopping me from doing the shunt mod.









Anyway, do you know where else I could get some Fujipoly? Currently looking at ModDIY. I need 0.5mm thickness since I will be using an Aqua Computer block/backplate.


----------



## shalafi

Quote:


> Originally Posted by *KedarWolf*
> 
> if it's the same X4 Inno3D BIOS I tried it never worked right on my Gigabyte FE at all.


I just tried it with my Gigabyte FE and it worked; 315W confirmed with the nvidia utility. However, the clocks came out far, far higher than stock for the same offsets: I had to lower my Afterburner core offset by some 100MHz and mem by around 200MHz to end up at the same clocks as the stock FE BIOS (for example, to get the core to 2012 on this BIOS I had to go +40 instead of +140). Loading any previous OC profiles would send the clocks through the roof; instant crash.


----------



## RavageTheEarth

Quote:


> Originally Posted by *madmeatballs*
> 
> Do you think liquid electric tape would avoid what happened to the guy on the other thread where his shunt fell off? lol. That is what is stopping me from doing the shunt mod.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway, do you know where else I could get some fujipoly? Currently looking at moddiy. I need 0.5mm thickness since I will be using an Aqua computer block/backplate.


Performance-PCS.com has them. Was just looking at them last night getting prepared for the release of the Aorus water block. Can't wait to get this thing under water!


----------



## madmeatballs

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Performance-PCS.com has them. Was just looking at them last night getting prepared for the release of the Aorus water block. Can't wait to get this thing under water!


Sadly PPCs doesn't have the 17 W/mK version in 0.5mm. I guess I'll be getting it from ModDIY.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *madmeatballs*
> 
> Do you think liquid electric tape would avoid what happened to the guy on the other thread where his shunt fell off? lol. That is what is stopping me from doing the shunt mod.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway, do you know where else I could get some fujipoly? Currently looking at moddiy. I need 0.5mm thickness since I will be using an Aqua computer block/backplate.


Yes. I purchased a small paint brush and applied liquid electrical tape to the solder joints themselves, so only the top of the resistor came in contact with the CLU.

The other guy probably lathered the solder joints too, and it corroded them.

That guy never came back and explained what he did. I bet he tried to clean it off without warming it up with a blow dryer first, or didn't use acetone and just scrubbed the resistor till it came off.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *madmeatballs*
> 
> Sadly ppcs doesnt have the 17w/mK version for 0.5mm. I guess I'll be getting it from moddiy.


Fujipoly is like clay... it will form to 0.5mm if you need it to. You can technically roll it up into a ball, flatten it to the size you need, and apply it.

Just wear gloves and don't get finger oils on it, I guess.


----------



## joder

Quote:


> Originally Posted by *SlimJ87D*
> 
> Fujipoly is like a clay... It will form to 0.5mm if you need it to. You can technically roll it up into a ball, flatten it to the size you need and apply it.
> 
> Just wear gloves and don't get finger oils in it I guess.


If anything, oil will help the heat transfer







I went without gloves for ease of peeling the protective film.


----------



## joder

I'm excited about the 12 core Skylake-X coming in June







While the 5775C is appealing, I think I'll wait and get a CPU that will last for years.


----------



## chef1702

Some info regarding the Inno BIOS: full water, EVGA FE, no shunt. Temps while testing were 40-43.

SC2 BIOS: 2000 MHz @ 1.0125 V (not shown in GPU-Z, forgot to click max)



Inno Bios 2000 MHz @ 1.0125 V



It's just 30 points, and GPU-Z shows nearly the same power limit behavior all the time as with the SC2 BIOS. I guess the 30 points I gained are from the reboot after the flash and the "clean" boot, without having used the PC for the whole day.

The curve was the same on both. I double checked that even the lower points are the same, so that when power throttling happens, the card steps down exactly the same as with the SC2 BIOS. (Picture was made with the SC2 BIOS for reference.)



I did a 4K run too where the card never hit the power limit (SC2 and Inno). With the SC2 BIOS the max TDP was 118% of 120, which means 6 watts left. With the Inno BIOS the card went up to 113% of 117 (116 in the Afterburner setting), which means between 9 and 12 watts left (315 W total, so 1% -> 3.15 W; 116 - 113 -> 3; 3 x 3.15... you get it).

So can someone draw a conclusion: is it working or not? Are the 30 points in the 8K test within measuring tolerance, or does 15 watts really only give you 30 points?
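The percent-to-watts arithmetic above can be written out explicitly (this follows chef1702's own assumption of 1% = 3.15 W on the 315 W Inno BIOS and 1% = 3.0 W on the 300 W SC2 base; the function name is mine):

```python
def pct_to_watts(pct_points: float, watts_per_pct: float) -> float:
    """Convert power-target percentage points into watts."""
    return pct_points * watts_per_pct

# SC2 BIOS: ~3.0 W per %, peaked at 118% of a 120% cap.
sc2_headroom = pct_to_watts(120 - 118, 300 / 100)   # ~6 W left
# Inno3D BIOS: ~3.15 W per %, peaked at 113% of a 116-117% cap.
inno_low = pct_to_watts(116 - 113, 315 / 100)       # ~9.5 W left
inno_high = pct_to_watts(117 - 113, 315 / 100)      # ~12.6 W left
print(sc2_headroom, inno_low, inno_high)
```

Either way the difference between the two BIOSes is a handful of watts of headroom, which is consistent with a ~30-point score delta being near measurement noise.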


----------



## gmpotu

Quote:


> Originally Posted by *dunbheagan*
> 
> Same for DOOM 4K, Nightmare Settings
> 
> 2100MHz GPU, no drops
> 6003MHz Memory
> 1,05V
> GPU Temp 37 degree
> 
> I am pretty shure i will keep it at these settings, i am totally satisfied.
> 
> PS: I used Conductonaut between GPU Die and Waterblock


Is there any way that you can measure how much power draw your card is pulling now that you have the shunt mod?
I'm curious how much total power above 330W you're pulling when you are at 2100 on the clock speed.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Don't blame you bro. I'm just so impressed with the thermals and power draw on the Ryzen platform. My room is so much cooler and my performance isn't too far off from the old 6950x. And I'm sure performance will continue to improve with drivers and bios updates.
> 
> Only thing I need now if for *this* to drop in price to add it to the rig.
> 
> Only wish I could've done dual hybrid coolers on these FEs. I'm sure I'm leaving another 10% of performance on the table without doing Curves in AB and having that sweet thermal head space. But tbh, my cards aren't loading past ~70c, and I expected much worse with the SLi sammich.
> 
> 
> 
> Maybe I'll break things down next week and throw some Thermal Grizzly on them and see if I can't shave off a few C.


Looks great. Those are amazing temps for SLI FEs compared to the 80s my 980 Ti SLI hit. If you grizz them up, you'll be in the 60s, and that's great!

Yeah, I love that RAM, but those prices are ridiculous.

The best thing about SLI 1080 Tis is that the overabundance of power makes all that tweaking less needed. Of course you will want to determine which card is better, and by how much, for your single-card games/situations.

All in all, a GREAT decision to sell the 6950X and fund the WHOLE PC, and a great decision going with two 1080 Tis over one Titan, especially on an 8-core CPU like you have. NASTY.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *chef1702*
> 
> Some infos regarding the Inno Bios. Fullwater, EVGA FE no shunt. Temps. while testing at 40-43.
> 
> SC2 Bios 2000 MHz @ 1.0125 V. (not in GPU-Z shown, forgot to click max)
> 
> 
> 
> Inno Bios 2000 MHz @ 1.0125 V
> 
> 
> 
> Its just 30 Points and gpu-z shows nearly the same pwr limit all the time like with the sc2 bios. I guess the 30 points I gained are from the reboot after the flash and the "clean" boot without using the pc for the whole day.
> 
> Curve was on both the same. I double checked that even the lower points are the same so that the card, if the power throttling happens, goes exactly the same steps down like with the sc2 bios. (picture was made with sc2 bios for reference)
> 
> 
> 
> I did a 4k run too where the card never hit pwr (SC2 and Inno) and with the SC2 bios the max tdp was 118% from 120 which means 6 watt left. With the Inno Bios the card went up to 113% from 117 (116 in afterburner setting) which means between 9 and 12 watt left (315 total, 1% -> 3.15 watt. 116 - 113 -> 3. 3 x 3.15...you get it
> 
> 
> 
> 
> 
> 
> 
> ).
> 
> So can someone draw a conclusion: is it working or not? Are the 30 points in the 8K test within measuring tolerance, or does 15 watts really only give you 30 points?


Awesome test, thank you for that. Have you tested the Asus bios? I used to only go to 110%, which was 300 watts.

Honestly, the shunt mod seems to be the only way to go. I don't see a bios solving our issues: a bios is written specifically for a card and the components on it, so the only bios that can work well is a modified one of our own.

I lose more hope every day that a bios will give us access to 350+ watts and work correctly with the card.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> ***, am I reading that right? 2,531 Mhz on the core?


His 6950X is at 5.2GHz as well, and his RAM is CAS 13 at 3000-whatever it was. Dude is THE K|ngp|n for a reason.


----------



## chef1702

Quote:


> Originally Posted by *SlimJ87D*
> Awesome test, thank you for that. Have your tested the Asus bios? I used to only go to 110%, which was 300 watts.


No, haven't tested the Asus one yet. Maybe tomorrow, but really, what are 30 watts more? I'm currently playing ME:A at 2560 x 1080, all max with 1.2x resolution scale. Planning to step up to 3440 x 1440 when they get a good 21:9 screen ready. At those settings I'm hovering around 105-115% power all the time with just 2000 MHz and 1.0125 V. I guess when the 3440 x 1440 screen comes, I will hit the power limit hard even with 30 watts more.

Like you said, the real potential of these cards only comes out with the shunt mod. Upping the voltage to around 1.05 with a clock around 2100 and playing at resolutions above 2K will for sure pull 80-100 watts more out of the card. Against that, a +30 watt bios is like nothing... But for now, 2000 MHz without hitting the power limit on the SC2 bios is good enough. No need to hunt 50 MHz.


----------



## Slackaveli

Quote:


> Originally Posted by *Dasboogieman*
> 
> 
> 
> Cooling mod Mk2
> 
> I cannibalised an old Palit Twin Fan GTX 570 heatsink. Now the core backplate + VRAM gets solid copper block cooling. Temps now hold steady at 60 degrees when running Supo 8K. Everything else doesn't break 55 degrees. Basically an additional 10 degree improvement on the Intel HSF mod.
> 
> I'm taking delivery of a dead Quadro Single slot GPU in a few weeks, cant wait to see this thing with a sexy blower on top.


that is awesome!


----------



## Luckbad

Quote:


> Originally Posted by *Qba73*
> 
> KW
> 
> 4.4.0 beta 7 ;-) I tried changing url from 6 to 7 and it works. nothing any higher though
> 
> http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta7.rar


WARNING: I downloaded Afterburner Beta 7. Seemed to work great at first, then I got a BSOD and it corrupted my Windows 10 installation. Had to use ye olde Reset PC option and I'm still installing applications and drivers.

Use this at your own peril.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Luckbad*
> 
> WARNING: I downloaded Afterburner Beta 7. Seemed to work great at first, then I got a BSOD and it corrupted my Windows 10 installation. Had to use ye olde Reset PC option and I'm still installing applications and drivers.
> 
> Use this at your own peril.


Holy smokes, thanks for the warning man.


----------



## Slackaveli

Quote:


> Originally Posted by *Blotto80*
> 
> It worked for my FE and resulted in better power limiting but the blower fan didn't really work and I think it may have slowed the pump down, temps were through the roof, easily 10c higher in Supo. The only bios that really worked for me so far was the SC2 but it seems to bench a bit lower than the stock FE.
> I think I'm done chasing clock speed on this, I've moved on to trying to maintain a low voltage/clocks. Out of the box my boost was 1911mhz, so I'm aiming for 2012mhz for a nice round 100mhz OC. So far I'm pretty stable at 0.981v at 44c max.


nice. that's how you switch gears when chasing the dragon goes bad.
Quote:


> Originally Posted by *joder*
> 
> I'm excited about the 12 core Skylake-X coming in June
> 
> 
> 
> 
> 
> 
> 
> While the 5775c is appealing I think I'll wait and get a CPU that will last for years.


That's wise if you can afford it, definitely. The 5775C's real value is that it's like a poor man's 7700K platform mod, for less than $100 net if you're on a 4790K already, as a 5-minute drop-in.
Skylake-X is going to be a monster, though. And Ryzen is a beast now.


----------



## Chris Ihao

Is throttling unavoidable once you reach a certain heat level? I think it's strange that my Aorus already throttles quite a bit at 70-72 degrees. Don't think I got the best chip around, unfortunately, as I can't seem to hit 2GHz on air.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Holy smokes, thanks for the warning man.


v4.4.0.9855Beta6 has been good to me.
Quote:


> Originally Posted by *Chris Ihao*
> 
> Is throttling unavoidable if you reach a certain heat level? I think its strange that my Aorus throttles quite a bit already at 70-72 degrees. Dont think I got the best chip around unfortunately, as I cant seem to hit 2 ghz on air.


do the thermal grizzly kryonaut mod and keep her in the 60s and she's happy to give 2000+


----------



## Chris Ihao

Quote:


> Originally Posted by *Slackaveli*
> 
> REP'D
> v4.4.0.9855Beta6 has been good to me.
> do the thermal grizzly kryonaut mod and keep her in the 60s and she's happy to give 2000+


Hehe. Perhaps I will do just that. I presume you are talking about replacing the compound? Do the RAM and VRMs use pads? No need to replace those, I guess?


----------



## gmpotu

Hmm, this is sad. Last night I was able to do 2025 with the voltage slider at +51, and now it won't do it unless I move the slider up to +76.


----------



## Slackaveli

Quote:


> Originally Posted by *Chris Ihao*
> 
> Hehe. Perhaps I will do just that. I presume you are talking about replacing the compound? Do the ram and vrm's use pads? No need to replace those I guess?


it has a bunch of pads, just try not to move them around (or note where they were). it's an easy mod and very helpful.


----------



## Luckbad

If I end up chickening out on SLI and keeping my Zotac, Kryonaut is in my future. Is that also the preferred thermal compound for CPUs?

I used to try 2-3 pastes per CPU but eventually threw in the towel and went back to Arctic Silver 5.


----------



## gmpotu

Does the core voltage % slider have any benefit when trying to find the max core clock at lower voltages?

Like, if I'm trying to see how low a voltage I can run 2000 core at, but I'm already stable at 1.050v and above, is there any point in moving the core voltage % slider to +100%?


----------



## Chris Ihao

Quote:


> Originally Posted by *Slackaveli*
> 
> it has a bunch of pads, just try not to move them around (or note where they were). it's an easy mod and very helpful.


Ok. Thanks for the info.


----------



## gmpotu

Okay, this is really weird and I don't know what would cause it.
I was running Heaven at 1920x1080 with my curve in AB set to a flat line of 2000 from 1.012v and above.
I changed the resolution to 2560x1440 and kept everything the same on the curve: 2000 for 1.012v and above.

It is running 2012 @ 1.012v with no issue, and GPU-Z is clean, no VRel or power flags.
What I can't figure out is why it's at 2012 if my curve is set to max out at 2000 @ 1.012v.


----------



## joder

Quote:


> Originally Posted by *Slackaveli*
> 
> that's wise if you can afford it, definitely. the 5775-c's real value is that it is like a poor-man's-7700k-platform-mod, for less than $100net if you are on a 4790k already, in a 5 minute drop in.
> Skylake-x is going to be a monster, though. And ryzen is a beast now.


Yeah I am on the 77 platform right now. Ryzen supposedly might have a 16 core beast... I really hope they do.


----------



## MURDoctrine

Okay, I feel comfortable enough with my overclock now that I'm under water: 2038 core and 6003 mem at 1.093V. I've got a noob question about the voltage curve, however. If I play around with it, can I set, say, a max non-throttled value such as my 2038 @ 1.093 and then control the step-downs due to the power limit? Say, if I could be stable at 2025 @ 1.05 or so, to prevent more step-downs as I run into the power limit?

Quote:


> Originally Posted by *Slackaveli*
> 
> nice. that's how you switch gears when chasing the dragon goes bad.
> that's wise if you can afford it, definitely. the 5775-c's real value is that it is like a poor-man's-7700k-platform-mod, for less than $100net if you are on a 4790k already, in a 5 minute drop in.
> Skylake-x is going to be a monster, though. And ryzen is a beast now.


Yeah I was going to do a refresh with a 7700k on mine once I got the 1080ti but Skylake-X and Kabylake-X sound like they will be INSANE. I might go that route.


----------



## joder

Quote:


> Originally Posted by *Slackaveli*
> 
> his 6950x is at 5.2Ghz as well. and his ram is 13 cas at 3000-whatever it was. Dude's is THE K|ngp|n for a reason.


He's just a casual apparently...

http://wccftech.com/intel-core-i76950x-overclocked-whopping-57ghz-details/

Didn't say if it was stable for long though. Guessing not?


----------



## Nico67

Quote:


> Originally Posted by *MURDoctrine*
> 
> Okay I feel comfortable enough with my overclock now that I'm under water. Getting 2038 core and 6003 mem at 1.093V. I've got a noob question about the voltage curve however. If I play around with it can I do say a max non throttled value such as my 2038 @ 1.093 and then control the step downs due to power limit? Say if I could be stable at 2025 @ 1.05 or so to prevent more step downs as the I run into the power limit?


Yep, that's the ideal way to go. Use SuPo 4K as a test and try to keep the lowest voltage at the highest clock it can run. That way, when it inevitably does throttle, you don't lose too much performance.


----------



## gmpotu

I'm kinda stuck with AB right now. I don't know why, but when I try to apply my profile curve, it jumps the entire thing up a step or two.

My curve has everything from 1.000v+ @ 1999, and when I click apply it jumps to:
1.000v - 1.025v @ 2025
1.031v+ @ 2037

It breaks the whole thing, and then I have to manually pull down every point again, but when I hit apply it jumps again.
Anyone know how to fix / avoid that?

Also, is there a way to automatically set everything to the right of a bin to the same value if the point you're focused on is lower? So let's say 1.000v is set to 1950 and everything above 1.000v is 2012 or above, but I want to set everything 1.000v+ to 2000; is there a quick, easy way to do that, or do I have to manually click and drag each dot down?


----------



## stoker

The curve changes based on temperature; I have found that with Afterburner the drop points occur every 10C.
Try setting your curve when your GPU is in the high 30s under water, or high 40s on air. You should find you only lose 1 bin as temperature increases.
This video will help for setting the curve


----------



## gmpotu

Quote:


> Originally Posted by *stoker*
> 
> The curve changes based on temperature, I have found with afterburner the different points occur every 10C.
> Try setting your curve when your gpu is in the high 30s underwater or high 40s on air. You should find you may only lose 1 bin when temperature increases
> This video will help for setting the curve


Thanks for the info about the temperature drop. I've watched the video above a couple of times now, but the issue I was noticing is different: Afterburner is giving me higher clocks than what I set.
Let's say my curve is set to 2000; when I click apply, it shifts the entire graph upward to 2012 or 2025. I found a workaround: if I take one bin down a notch, set the bin just to the left of it back at 2000 (where I want all my bins above 1.000v), and then hit apply, it doesn't do that jumping thing. The weird thing is that the jump happens even when I set my curve by clicking on a saved profile. When I click the profile, everything looks right, but when I click apply it all shifts up to 2012 or 2025. Very strange behavior.

Another thing I notice is that as soon as I launch a 3D application it will jump my clocks to 2025 even though the highest clock on my entire curve is set to 2000 or 2012. Then the 3D application will run at 2025 until it needs to drop down to 2012 or 2000.

I don't have a youtube or anything to post a video but I might set one up just so you guys can see what I'm talking about.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *stoker*
> 
> The curve changes based on temperature, I have found with afterburner the different points occur every 10C.
> Try setting your curve when your gpu is in the high 30s underwater or high 40s on air. You should find you may only lose 1 bin when temperature increases
> This video will help for setting the curve


It's not really a mystery; how Boost 3.0 works was explained a while back.


----------



## dunbheagan

Superposition 4K, [email protected]/6000, [email protected]: 10171 Points

Superposition 4K, [email protected]/6000, [email protected]: 10265 Points

Funny, huh? I tried it a couple of times; the results stay the same.


----------



## stoker

Quote:


> Originally Posted by *SlimJ87D*
> 
> It's not really a mystery, it was explained awhile back on how Boost 3.0 works.


Yes, thank you, I know how Boost 3.0 works.

I was explaining that your saved curve changes depending on temperature in Afterburner when you hit apply.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *dunbheagan*
> 
> Superposition 4K, [email protected]/6000, [email protected]: 10171 Points
> 
> Superposition 4K, [email protected]/6000, [email protected]: 10265 Points
> 
> Funny, huh? I tried it couple times, results stay the same.


You found OCing your CPU hurt your score? Strange. It seems within margin of error, unless you ran this 10 times at each CPU clock.

I just did a 2088 MHz @ 1.050v run for fun and got 10425. I decided to go from 2114 MHz @ 1.075v to 2088 MHz @ 1.050v.

[email protected] 4.7GHz 1.312v.


----------



## gmpotu

Quote:


> Originally Posted by *stoker*
> 
> Yes thankyou I know how boost 3 works.
> I was explaining your saved curve changes depending on temp in afterburner when you hit apply


I noticed for me 50deg and 65deg are the temps that I'll lose clock speed at.
I didn't understand what you meant at first but that makes more sense. Like if my temps are at 51deg+ and I save my profile then AB already recognizes that I'm down a bin due to temperature. So when I go to load that profile and I'm running at 35deg it will start me off a bin higher since AB knows it's going to get dropped when I pass 50deg.

I tested this out with Heaven running.

If I apply my profile when my temps are low before running Heaven then it jumps up a bin. If I apply the same profile after I've passed 50deg then the profile stays exactly how I had it saved.

I've been making and saving my profiles while Heaven is running, making little adjustments here and there to see if it stays stable.
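The temperature behavior described above can be modeled as a rough sketch (the ~13 MHz bin size and the 50/65°C thresholds come from the posts above for one particular card; everything else is my assumption, not NVIDIA's actual Boost 3.0 algorithm):

```python
def effective_clock(set_clock_mhz: int, temp_c: float,
                    thresholds=(50, 65), bin_mhz=13) -> int:
    """Estimate the clock after Boost 3.0-style temperature bin drops.

    One ~13 MHz bin is assumed lost each time the GPU passes a
    temperature threshold; thresholds vary per card.
    """
    bins_lost = sum(1 for t in thresholds if temp_c > t)
    return set_clock_mhz - bins_lost * bin_mhz

print(effective_clock(2025, 45))  # no bins lost yet
print(effective_clock(2025, 55))  # one bin lost past 50C
print(effective_clock(2025, 70))  # two bins lost past 65C
```

This also explains the "jumping" profiles: if Afterburner saves the curve relative to the current temperature, reapplying it while cool starts you one bin higher than you saved.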


----------



## dunbheagan

Quote:


> Originally Posted by *SlimJ87D*
> 
> You found oc your cpu hurt your score? Strange. It seems within an error of margin unless if you did this 10 on each cpu core.
> 
> I just did a 2088 Mhz @ 1.050 run for fun and got 10425 I decided to go from 2114 Mhz @ 1.075v to 2088 Mhz @ 1.050v.
> 
> [email protected] 4.7 mhz 1.312v.


I tried it three times, same results. Same for the Metro benchmark. Strange; I will test more on that.

Something else:
I played around with a locked voltage at 1.093V, and during a benchmark my PC shut down and restarted with an "unstable PSU" warning. This is repeatable. I am pretty sure that my Seasonic Platinum 660W delivers enough power for the whole system, but I guess a shunt-modded card locked at max voltage might draw more power in certain situations than the 6+8-pin PCIe connectors can deliver. When I go back to less than 1.093V, this behavior does not occur anymore.

4.7GHz is the sweet spot for my 4790K too; it is stable at around the same voltage as yours, about 1.3V. For 4.8GHz I already need almost 1.4V.
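For reference, the spec power budget of a 6+8-pin card like the FE works out like this (per the PCIe spec: 75 W from the slot, 75 W from a 6-pin, 150 W from an 8-pin; shunt-modded cards routinely pull well past the connector numbers):

```python
SLOT_W = 75        # PCIe x16 slot
SIX_PIN_W = 75     # 6-pin PCIe connector
EIGHT_PIN_W = 150  # 8-pin PCIe connector

spec_budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(spec_budget)  # total on-paper budget for a 6+8-pin card
```

So a modded card at 350+ W is already beyond what the connectors are rated to deliver on paper, even before the rest of the system's draw hits the PSU.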


----------



## dunbheagan

Quote:


> Originally Posted by *gmpotu*
> 
> I noticed for me 50deg and 65deg are the temps that I'll lose clock speed at.


Yeah, first time my card clocks down a few MHz is above 48°C.


----------



## octiny

So I originally purchased four 1080 Tis: 2 Zotacs and 2 EVGAs. Finally got around to testing all 4 to pick the best 2. Both Zotacs were duds, go figure.

One of the EVGAs easily does 2100+ with +500 on mem before dropping bins @ 4K, while the other does 2038 with a crap +250 on mem. Haven't tested/optimized any further.

Don't plan on playing around with the curve since it doesn't play well with SLI in my testing at least, single card no problem.

With OC's & tempered glass on:

52c on both cards with 100% GPU usage @ 4K hovering around 2000mhz @ 1100 rpm
46C & under on both cards with near max GPU usage @ 1440P/1080P hovering around 2025mhz @ 1100rpm

Overall, I'm pleased! Though I wish I had received 2 golden GPU's


----------



## Nico67

Quote:


> Originally Posted by *dunbheagan*
> 
> I tried it three times, same results. Same for Metro benchmark. Strange, I will test more an that.
> 
> Something else:
> I played around with a locked voltage at 1.093V and during a benchmark my pc shut down and restarted with a "unstable PCU warning". This is repeatable. I am pretty shure that my Seasonic Platinum 660W delivers enough power for the whole system, but I guess a shunt modded card which is locked at max voltage might draw more power in certain situations than the 6+8pin PCIe connector can deliver. When i go back to less than 1.093V this behavior does not occure anymore.
> 
> 4.7Ghz is the sweet spot for my 4790k too, it is stable at around the the same voltage like yours, about 1,3V. For 4,8GHz i already need almost 1.4V.


Yeah, I was going to say maybe your CPU was taking power from the GPU, which it likely is, and you're right on the limit of what the PSU can deliver. Best to run a PSU at 50% load for better efficiency, so around 1000W is ideal.


----------



## Nico67

Quote:


> Originally Posted by *gmpotu*
> 
> I'm kinda stuck with AB right now. I don't know why but for some reason when I try to apply my profile curve it is jumping the entire thing up a step or two steps.
> 
> My curve has everything from 1.000v+ @ 1999 and when I click apply it jumps.
> 1.000v - 1.025v @ 2025
> 1.031v+ @ 2037
> 
> It's like breaking the whole thing and then I have to manually try to pull down every spot again but when I hit apply it jumps again.
> Anyone know how to fix / avoid that?
> 
> Also, is there a way to autoatically set everything to the Right of a bin to be the same thing if the one you're focused on is lower?
> So lets say 1.000v is set to 1950 and everything above 1.000v is 2012 or above, but I want to set everything 1.000v+ to 2000 is there quick easy way to do that or do i have to manually click each dot and drag each one down?


That's pretty normal; it often jumps, and I have to set it multiple times.


----------



## Luckbad

Quote:


> Originally Posted by *dunbheagan*
> 
> I tried it three times, same results. Same for Metro benchmark. Strange, I will test more an that.
> 
> Something else:
> I played around with a locked voltage at 1.093V and during a benchmark my pc shut down and restarted with a "unstable PCU warning". This is repeatable. I am pretty shure that my Seasonic Platinum 660W delivers enough power for the whole system, but I guess a shunt modded card which is locked at max voltage might draw more power in certain situations than the 6+8pin PCIe connector can deliver. When i go back to less than 1.093V this behavior does not occure anymore.
> 
> 4.7Ghz is the sweet spot for my 4790k too, it is stable at around the the same voltage like yours, about 1,3V. For 4,8GHz i already need almost 1.4V.


I wouldn't be confident in a 660w PSU with both an overclocked CPU and GPU. I also don't think you need 1000w.

A really solid 750w would probably be enough, but an 850w would definitely work.

I've overclocked power hungry (compared to current gen) CPUs and done SLI GPUs no problem with a decent 850w power supply.

When I decided to upgrade my PSU, I stuck with 850w and got the best I could reasonably acquire (EVGA 850 T2... Go figure. Tried the usual suspects and read reviews and loved it).

Not sure how much I believe this, but...
http://outervision.com/b/FKBJ4d

Looks like GPU overvoltage does murder to the calculation. Even removing that variable, you're redlining your power supply under load.


----------



## Alwrath

I have tested many power supplies; I've been through a lot. If you're a PC gamer, always, always, always get at least a 1000W power supply. I cannot stress this enough. "I don't need that much power" - even if you don't, if your power supply is running at half or less of its available power, it's going to last that much longer. We are talking years longer. I can't count how many 750W and 850W power supplies I went through before I figured it out; I had to replace them every 2 years because they just went. 1000W? I still have my original Corsair 1000W going in my wife's rig to this day, and I bought it in 2009. It's been running an overclocked Core i5 at high voltage with an overclocked Radeon 290 for years. Before that, it ran my overclocked 4GHz Phenom II 965 BE with crossfired 5870s.

Never get under 1000W if you're a gamer; if you do, be prepared to shell out $100 for a new power supply every 2 years, 3 if you're lucky. Also, when a power supply goes, it sometimes takes your motherboard and components with it. You have been warned.


----------



## MURDoctrine

Quote:


> Originally Posted by *Alwrath*
> 
> I have tested many power supplies, ive been through alot. If your a PC gamer, always always always get a 1000w power supply at least. I cannot stress this enough. " I dont need that much power ". Even if you dont, if your power supply is running at half or less than half of its available power, its going to last that much longer. We are talking years longer. I cant count how many 750w 850w power supplys I went through to figure it out. I had to replace every 2 years because they just went. 1000w? I still have my original corsair 1000w still going in my wife's rig to this day, and I bought it in 2009. Its been running an overclocked core i5 at high voltage with a radeon 290 overclocked for years. Before that, it ran my overclocked 4 GHZ phenom ii 965 BE with crossfired 5870's.
> 
> Never get under 1000w if your a gamer, if you do, be prepared to shell out $100 for a new power supply every 2 years. 3 if your lucky. Also when a power supply goes, it takes your motherboard and components with it sometimes. You have been warned.


This is by far the worst and most ill-informed advice about PSUs ever. You do not want to buy a huge PSU like that, because most rigs will only pull 350-500W under load max. Also, your PSU's efficiency drops significantly if it isn't under a certain amount of load. Don't make shilka come out of the woodwork and school us all on PSUs.


----------



## Alwrath

Quote:


> Originally Posted by *MURDoctrine*
> 
> This is by far the worst and most ill informed information about a PSU ever. You do not want to buy a huge PSU like that because most rigs will only pull 350-500W under load max. Also your PSU's efficiency drops significantly if it isn't under a certain amount of load. Don't make shilka come out of the wood works and school us all on PSU's.


I already came out of the woodwork, and I will school you on PSUs. I am an electrical engineer; I know how they operate, efficiency and all that. You just proved my point: 350-500W under load is ideal for a 1000W power supply; that's exactly where you want to be, at half load. Guess how many more years your PSU will last compared to one at 75-95% load? I'll wager five years, probably more. But hey, if you wanna shell out more money on power supplies over the next 5-10 years, be my guest. Make sure you keep us updated on when you have to buy a new one every few years. Keep us updated on your collection of dead 750W-850W PSUs. Display them as trophies in your room until one day you figure it out like I did.


----------



## AngryLobster

The quality of the PSU is what's important, not the wattage.

My guess is you were using some garbage power supplies that just happened to be lower-wattage units, so your educated electrical-engineer brain jumped to conclusions.

Most quality units now carry 7/10 year warranties anyway.


----------



## octiny




----------



## Nico67

Quote:


> Originally Posted by *Alwrath*
> 
> I already came out of the woodwork, and I will school you on PSUs. I am an electrical engineer; I know how they operate, efficiency and all that. You just proved my point: 350-500W under load is ideal for a 1000W power supply; that's exactly where you want to be, at half load. Guess how many more years your PSU will last compared to one at 75-95% load? I'll wager five years, probably more. But hey, if you wanna shell out more money on power supplies over the next 5-10 years, be my guest. Make sure you keep us updated on when you have to buy a new one every few years. Keep us updated on your collection of dead 750W-850W PSUs. Display them as trophies in your room until one day you figure it out like I did.


This is as true as it gets









It is true you'll be hard-pressed to draw much over 500W, but PSU specs rate efficiency at half load. Look at Titanium specs: they're rated to make spec at 50% load, so that's where they run best. 850-1000W is good for a single GPU, and 1000-1500W for multiple GPUs.

The choice is always the buyer's, but a good quality PSU is the first place to start with any build.


----------



## Luckbad

Anywho, about them 1080 Tis...


----------



## Foxrun

Quote:


> Originally Posted by *Nico67*
> 
> This is as true as it gets
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It is true you'll be hard-pressed to draw much over 500W, but PSU specs rate efficiency at half load. Look at Titanium specs: they're rated to make spec at 50% load, so that's where they run best. 850-1000W is good for a single GPU, and 1000-1500W for multiple GPUs.
> 
> Choice is always on the buyer, but a good quality PSU is the first place to start with any build


Yep, I just returned a 1000W unit for a 1200W because my computer was rebooting when powering both 1080 Tis.


----------



## KaRLiToS

I once had a LEPA G 1600 running my 4x R9 290X, and after a certain time (two years) it started to fail; at one point it couldn't deliver more than 900 watts from the wall. It had six different rails, so it was also hard to balance the wattage across them all, especially with four cards. I RMAed it, sold it, and bought an EVGA SuperNOVA 1600 T2; it was actually the second LEPA G 1600 to fail on me. The SuperNOVA has a single 1600W rail. Never had issues with the EVGA PSU, a Super Flower unit.

It's not all about the wattage, it is also about the quality of the PSU.

The GTX 1080 Ti only peaks at 267 watts. A quality 750W PSU is more than enough for a single GTX 1080 Ti.


----------



## DStealth

Non-FE cards are [email protected], some even more... and shunt-modded cards can draw as much as 400W themselves.







With an overclocked 6-core/12-thread X99 platform, even 700W can cold-reboot on some occasions... I have a friend with a 750W PSU who is downclocking his video card (MSI Gaming X 1080 Ti) to avoid occasional reboots... As was written previously, better to have a good 1kW+ PSU; the price difference is not huge, and it will last for years without the issues some 600-700W PSUs can have.


----------



## phenom01

wait wait wait. So my 760w seasonic platinum PSU may not be enough for a single 1080ti over my highly overvolted overclocked and modded 970 sli?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *phenom01*
> 
> wait wait wait. So my 760w seasonic platinum PSU may not be enough for a single 1080ti over my highly overvolted overclocked and modded 970 sli?


It's more than enough. You're also running a lower-powered quad-core versus that other guy's six-core.


----------



## KaRLiToS

Quote:


> Originally Posted by *DStealth*
> 
> nonFE are [email protected] some even more...and shunt modded cards could draw as high as 400w themselves
> 
> 
> 
> 
> 
> 
> 
> with OC'ed 6/12 x99 platform even 700w cold reboot in some occasions...i have a friend with 750w PSU and he's downclocking the video(MSI Gaming X 1080ti) in order not to have occasional reboots ... As was written previously - better have 1kw+ good PSU the difference is not huge but it will last for years w/o issues as some 600-700w PSU could have.


Did your friend test all the voltage rails (5V, 3.3V and 12V) on the PSU, and the watts pulled from the wall, when it was rebooting? A defective PSU can happen, especially if it's a poor brand/unit. What brand was your friend's PSU?


----------



## phenom01

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> It's more than enough. You're also running a lower powered quad core over that other guys 6 core.


Phew, that's what I like to hear. Just checked the thread and was a bit shocked.


----------



## dansi

Not sure if using the SC2 BIOS on an FE is a good idea?

Firstly, the 0% fan speed at idle hardly makes sense when you look at the size of the FE heatsink.

Can it cool passively when the small heatsink is already hidden inside a shroud?


----------



## rolldog

Quote:


> Originally Posted by *joder*
> 
> He's just a casual apparently...
> 
> http://wccftech.com/intel-core-i76950x-overclocked-whopping-57ghz-details/
> 
> Didn't say if it was stable for long though. Guessing not?


I guarantee that thing runs like a champion. He's one of the best when it comes to overclocking. There's a reason his avatar is Kingpin: no matter what overclocking site you go to, he's always in first place. Well, that could be a false claim; he's always been number one on the sites I've benched on, which are probably the most popular ones out there. It's possible he's not number one on all of them, since once you've claimed #1 on the most popular site, why prove yourself again to people who aren't even in the same league?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *dunbheagan*
> 
> I tried it three times, same results. Same for the Metro benchmark. Strange, I will test more on that.
> 
> Something else:
> I played around with a locked voltage at 1.093V, and during a benchmark my PC shut down and restarted with an "unstable PCU" warning. This is repeatable. I am pretty sure my Seasonic Platinum 660W delivers enough power for the whole system, but I guess a shunt-modded card locked at max voltage might draw more power in certain situations than the 6+8-pin PCIe connectors can deliver. When I go back to less than 1.093V, this behavior does not occur anymore.
> 
> 4.7GHz is the sweet spot for my 4790K too; it's stable at around the same voltage as yours, about 1.3V. For 4.8GHz I already need almost 1.4V.


At 1.092V you'll be drawing close to 400 watts, so if you do that again you shouldn't leave the power slider at 120%.

With 1.092V, you can easily hit 400 watts.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *dansi*
> 
> Not sure if using SC2 bios on FE is good idea?
> 
> Firstly the 0% fanspeed at idle hardly make sense when you look at the size of FE heatsink.
> 
> Can it passive cool when the tiny guy is hidden inside a shroud already?


You should be cautious flashing BIOSes if you're on an FE, period. All the fans spin anywhere from 40% to 75% of the FE's maximum RPM.

I'd only recommend the 0% fan setting for people on a hybrid.


----------



## DStealth

Quote:


> Originally Posted by *phenom01*
> 
> wait wait wait. So my 760w seasonic platinum PSU may not be enough for a single 1080ti over my highly overvolted overclocked and modded 970 sli?


A 760W Platinum Seasonic is more than enough. He's got a poor old Chieftec with bad 12V rail power distribution, and running [email protected] at 1.45V also drains a lot.
I'm saying that if you have to buy a new PSU, it's better to throw in $50-100 more and get a 1kW+ unit instead of a 600W or 700W one, just to be on the safe side as a future-proof solution.








Edit: Also, the efficiency and cooling/noise when using 50% of its capacity are much better than at 80-90%.


----------



## Slackaveli

Quote:


> Originally Posted by *KaRLiToS*
> 
> Did your friend tested all voltage rails (5v, 3.3v and 12v) on the PSU and the watts pulled from the wall when it was rebooting? Defective PSU can happen especially if it's a poor brand / unit. What was the brand of the psu of your friend?


Why I only buy Super Flower PSUs. My EVGA G2 1000W Gold has been great to me.


----------



## tpain813

Quote:


> Originally Posted by *Alwrath*
> 
> I have tested many power supplies; I've been through a lot. If you're a PC gamer, always, always, always get at least a 1000W power supply. I cannot stress this enough. "I don't need that much power." Even if you don't, if your power supply is running at half or less of its available power, it's going to last that much longer. We are talking years longer. I can't count how many 750W and 850W power supplies I went through before figuring it out; I had to replace them every two years because they just died. 1000W? I still have my original Corsair 1000W going in my wife's rig to this day, and I bought it in 2009. It's been running an overclocked Core i5 at high voltage with an overclocked Radeon 290 for years. Before that, it ran my overclocked 4GHz Phenom II 965 BE with CrossFired 5870s.
> 
> Never go under 1000W if you're a gamer; if you do, be prepared to shell out $100 for a new power supply every two years, three if you're lucky. Also, when a power supply goes, it sometimes takes your motherboard and components with it. You have been warned.


So true, and I don't understand why so few people get it. I just had to replace my 550W (EVGA GS) because I was getting random reboots at max load with an overclocked 4690K and a 1080 (the PSU was almost two years old). It sucks because it was recommended as "more than enough." So I just picked up an 850 G3 to go along with the 1080 Ti, and I was in r/buildapc asking if anyone had experience with it, and no one would give any insight other than "you don't need a new PSU, the 550 is fine." Right after I fired it up I regretted not just going with a 1000W, hahah. I don't understand where this mentality comes from when every PSU manufacturer's specs state the efficiency at different load ratings...

Edit: this was the first PC I built myself, so I relied heavily on forum help when I initially built it (it started as a budget build; now it's far from that, hahah), which is why I got the 550W to start. Now I know: the bigger the better with PSUs. At least it was a lesson learned.


----------



## CoreyL4

What are the pros/cons of pressing Ctrl+L on the voltage curve and locking it?


----------



## KedarWolf

Quote:


> Originally Posted by *Slackaveli*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KaRLiToS*
> 
> Did your friend tested all voltage rails (5v, 3.3v and 12v) on the PSU and the watts pulled from the wall when it was rebooting? Defective PSU can happen especially if it's a poor brand / unit. What was the brand of the psu of your friend?
> 
> 
> 
> why i only buy superflower psu. my evga g2 1000w gold has been great to me.

Love my Corsair AX1500i, a holdover from when I was doing quad SLI.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *CoreyL4*
> 
> What are the pros/cons of pressing control L in voltage curve and locking?


It's good for testing a specific bin but you shouldn't game with it.

Boost 3.0 is meant to adjust itself. If it needs more voltage it'll grab more voltage for stability purposes.

Don't lock a bin; let the card adjust itself. It's meant to.


----------



## CoreyL4

Quote:


> Originally Posted by *SlimJ87D*
> 
> It's good for testing a specific bin but you shouldn't game with it.
> 
> Boost 3.0 is meant to adjust itself. If it needs more voltage it'll grab more voltage for stability purposes.
> 
> Dont lock a bin, let the card adjust itself. It's meant to.


So when I am testing bins, should I lock?


----------



## gmpotu

Quote:


> Originally Posted by *SlimJ87D*
> 
> It's good for testing a specific bin but you shouldn't game with it.
> 
> Boost 3.0 is meant to adjust itself. If it needs more voltage it'll grab more voltage for stability purposes.
> 
> Dont lock a bin, let the card adjust itself. It's meant to.


I had to lock my bin to 2012MHz @ 1.000V today to game, for some reason. I was trying to stream a TV show on my second monitor while gaming, and I don't know if my driver crashed or what, but the clocks dropped to the normal 2D clocks. It's like it didn't realize I had a 3D application running. So I had to "lock" it to even get the clocks set right.


----------



## MURDoctrine

Quote:


> Originally Posted by *Alwrath*
> 
> I already came out of the woodwork, and I will school you on PSUs. I am an electrical engineer; I know how they operate, efficiency and all that. You just proved my point: 350-500W under load is ideal for a 1000W power supply; that's exactly where you want to be, at half load. Guess how many more years your PSU will last compared to one at 75-95% load? I'll wager five years, probably more. But hey, if you wanna shell out more money on power supplies over the next 5-10 years, be my guest. Make sure you keep us updated on when you have to buy a new one every few years. Keep us updated on your collection of dead 750W-850W PSUs. Display them as trophies in your room until one day you figure it out like I did.


I guess you can't read my sig rig specs, but that's on you: I have a Corsair HX1050 unit. I don't disagree with the efficiency numbers, but a quality unit will trump a shady OEM-rebranded unit claiming a set wattage. That's what will take out an entire system, and possibly their home with it. It's blanket statements like yours that make people buy 1200W+ units when they're pulling 350-400W max from the wall. But hey, I'm done arguing it, because this isn't the place.

Quote:


> Originally Posted by *Nico67*
> 
> Yep, that's the ideal way to go: use SuPo 4K as a test and try to keep lower voltages at the highest clocks it can run. That way, when it inevitably does throttle, you don't lose too much performance.


I've found SuPo 4K is less demanding than stuff like Heaven for me. I can pull off SuPo benches, then try the same curve in Heaven and it instantly crashes.


----------



## bloodhawk

Quote:


> Originally Posted by *SlimJ87D*
> 
> It's good for testing a specific bin but you shouldn't game with it.
> 
> Boost 3.0 is meant to adjust itself. If it needs more voltage it'll grab more voltage for stability purposes.
> 
> Dont lock a bin, let the card adjust itself. It's meant to.


Some tests with the ASUS BIOS (slightly modded) + the 10 Ohm resistor mod:

http://i.imgur.com/KDCoPBA.jpg


corsair_link_20170423_00_04_12.csv 41k .csv file


Even though it's reporting throttling, I now know the cause (mainly the PCIe connector power limit, and one other PL I've yet to find).
The next step is to solder on an 8-pin PCIe connector, since there are pinouts and active traces on the GPU board.


----------



## Clukos

Hit 11k gpu score in Time Spy







http://www.3dmark.com/3dm/19479512

No shunt mod, no water; that's pretty good for air, I think!


----------



## radrok

Hey guys I've just purchased two 1080 Tis and I'm currently working on max clock that I can achieve.

The cards seem stable at 2100+, but they constantly hit the power limit. Is there any possibility of raising that with a BIOS flash, or do I need to shunt the resistors?

I know it has been discussed previously, but I couldn't find anything about BIOSes.

Thanks


----------



## cluster71

I'm happy with the low-power system I can now have in the office [Asus Strix mITX, i7-7700K 4.9GHz, 1080 Ti FE, EVGA 750 G3 Gold PSU]. It draws ~450W at most. My previous rig was an X79 3970X at 5.0-5.4GHz with 4-way modded Titan SLI and 2x Corsair AX1200; I measured power draw of up to 1.9kW from the wall. I realized many years ago that you cannot build a high-performance PC and keep it quiet, because the room temperature rises and the fans spin up. I have had my computers in the basement under my office for over 10 years. They can't be heard, and it's cool down there, 15-18°C. I can handle 4-5m cables for monitors and USB; that's all I need. I run an external sound card. I never use an optical drive, but that also works as an external.


----------



## Clukos

Quote:


> Originally Posted by *radrok*
> 
> Hey guys I've just purchased two 1080 Tis and I'm currently working on max clock that I can achieve.
> 
> Cards seem stable at 2100+ but they constantly hit power limit, any possibility to raise that with a BIOS flash or do I need to shunt the resistors?
> 
> I know it has been discussed previously but I couldn't find about BIOSes
> 
> Thanks


If we had the option of BIOS flashing for a higher TDP, nobody would be shunt modding (except maybe LN2 overclockers). IMO it's not worth it for 24/7 usage; better to undervolt them and find a sweet spot where your GPU won't be power limited (usually 1.000-1.031V).


----------



## radrok

Thanks. Do you also know if the fan is going to impact the power limit? I'm going to put waterblocks on soon, so that might help, I think.


----------



## Clukos

Quote:


> Originally Posted by *radrok*
> 
> Thanks, do you also know if the fan is going to impact the power limit? I'm going to put waterblocks soon so that might help, I think.


It does, but it's not a big difference (same as a memory OC); you should get higher clocks mostly because of your lower temperatures under water.

Btw, I think I'm reaching the limits of my card in terms of power limits:


----------



## Clukos

Yup, dropping voltage a bit certainly helps. Got a higher score with lower voltage (less power limited):


----------



## looniam

I'll just leave this here...

On Efficiency


Spoiler: Warning: Spoiler!



Quote:


> So you should get a PSU the size so that your peak draw is around 50% of the PSU's load, right? No, not in most cases. Think: 90% of the time your computer will be idling, usually drawing only 75W-200W, depending on the computer. So you want your computer to be very efficient in that power range. Generally you want your idle power draw to fall into ~30%-40% of the PSU's rated wattage, and your peak power draw to fall into ~70%-80% of the PSU's rated wattage.
> 
> The exception is with computers that will be under heavy load all the time, such as folding rigs. These should have the extra headroom anyway; a power supply should be chosen so that the computer is drawing ~60% of the PSU's rated wattage under folding power consumption levels. But this does not apply to computers that will be idle most of the time, like 99% are.
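
The rule of thumb in that quote can be expressed in a few lines (a sketch only: the 30-40% and 70-80% bands come straight from the quote, and the example wattages are a made-up rig, not measurements):

```python
def recommended_psu_range(idle_w, peak_w):
    """Turn the quoted sizing rule into a rated-wattage range:
    idle draw should be ~30-40% of rated wattage,
    peak draw ~70-80% of rated wattage."""
    lo = max(idle_w / 0.40, peak_w / 0.80)  # smallest rated W satisfying both bands
    hi = min(idle_w / 0.30, peak_w / 0.70)  # largest rated W satisfying both bands
    return (lo, hi) if lo <= hi else None   # None: the two bands don't overlap

# Hypothetical rig: ~200W idle (multi-monitor, light use), ~500W gaming peak.
print(recommended_psu_range(200, 500))  # roughly a 625-667W unit
```

Note that for a rig that idles very low relative to its peak, the two bands can fail to overlap, in which case the rule can't be satisfied exactly and you have to pick which end to favor.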


----------



## TheNoseKnows

Quote:


> Originally Posted by *AngryLobster*
> 
> The quality of the PSU is what's important, not the wattage.
> 
> My guess is you were using some garbage power supplies that just happened to be lower watt units so your educated electrical engineer brain jumped to conclusions.
> 
> Most quality units now carry 7/10 year warranties anyway.


His Rosewill Quark 1200W only has a 5 year warranty. Shows how much confidence they have in their own product.


----------



## DStealth

Quote:


> Originally Posted by *TheNoseKnows*
> 
> His Rosewill Quark 1200W only has a 5 year warranty. Shows how much confidence they have in their own product.


I had a replacement Seasonic X-1250 V1 at the end of its fifth year; just before sending the unit back, I had already received the new SS-1250XM2 1250W version.







Very glad that there are companies like them...








Edit: And all of this took place in Eastern Europe... quite a small market for a company of that scale.


----------



## mshagg

Quote:


> Originally Posted by *dunbheagan*
> 
> Superposition 4K, [email protected]/6000, [email protected]: 10171 Points
> 
> Superposition 4K, [email protected]/6000, [email protected]: 10265 Points
> 
> Funny, huh? I tried it couple times, results stay the same.


What are you doing with uncore/cache ratios? There is a relationship between core/cache deltas and memory bandwidth. Typically it's best to maximize core speed, but perhaps SuPo is a unique scenario where memory bandwidth comes into play?


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> looks great. that is amazing temps for sli fe compared to the 80s 980ti sli hit. If you grizz them up, you'll be in the 60s and that's great!
> 
> Yeah, i love that ram but those prices are ******ed.
> 
> Best thing about sli 1080ti would be the over aboundance of power makes all that tweaking less needed. Of course you will want to determine which one is bettter , and how much better, for your single card games/situations.
> 
> All in all, GREAT decision to sell the 6950 and fund the WHOLE PC, and great decision in 2 1080ti over one Titan, especially on an 8 core cpu like you have. NASTY.


Thanks man








Quote:


> Originally Posted by *octiny*


Sick setup









Plan to do the exact same thing when those Hybrid Kits are back in stock


----------



## alucardis666

.


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> Love my Corsair AX1500i, a holdover from when I was doing quad SLI.


It is a nice leftover. One of the best things about SLI was that it forced me to get a much better PSU, and that's always a nice thing. I recommend not skimping on the PSU.


----------



## CoreyL4

Quote:


> Originally Posted by *Clukos*
> 
> If we had the option of bios flashing for higher tdp nobody would be shunt modding (except maybe LN2 overclockers). Imo it's not worth it for 24/7 usage, better downvolt them and find a sweetspot where your GPU won't be power limited (usually 1.000-1.0310).


How do you tell if your GPU is power limited? What does it say in GPU-Z, and what color is it, in the performance cap reason tab?


----------



## Slackaveli

Quote:


> Originally Posted by *KaRLiToS*
> 
> I once had a Lepa G 1600 to run my 4 x R9 290x and after a certain time (2 years) it started to fail and couldn't give more than 900watts from the wall at one point. It had 6 different rails so it was also hard to balance all the wattage accross them all especially with 4 x cards. I RMAed it sold it, and bought an EVGA Supernova 1600 T2...it was actually the second Lepa G 1600 failing on me. The Supernova has only a single rail of 1600watts. Never had issues with the EVGA PSU...a superflower unit.
> 
> It's not all about the wattage, it is also about the quality of the PSU.
> 
> The GTX 1080ti only peaks at 267watts. A Quality 750watts PSU is more than enough for a single GTX 1080ti.


LOL!!! Man, you really cannot go by a chart. My 1080 Ti pulls 375W from the wall at max load; total system load is around ~550W. So yeah, a 650W PSU would "work", until it worked itself to death and took your mobo, and possibly your CPU, RAM and GPU, with it. Hell, it can even kill a monitor. It ain't nothing to play with; a PSU run right at its limit is a mass murderer, man!
Quote:


> Originally Posted by *bloodhawk*
> 
> Some tests ASUS BIOS (Slightly modded) + 10 Ohm resistor mod -
> 
> http://i.imgur.com/KDCoPBA.jpg
> 
> 
> corsair_link_20170423_00_04_12.csv 41k .csv file
> 
> 
> Even though its reporting throttling, i now know what is the cause (PCIe Connector PL, mainly, and one other PL im yet to find)
> Next step is to solder on an 8 Pin PCie connector , since there are pin outs and active traces on the GPU board.


Now THAT'S a mod: turning an 8+6 into an 8+8...


----------



## RavageTheEarth

Quote:


> Originally Posted by *CoreyL4*
> 
> How do you tell your gpu is power limited? what does it say in gpuz, and color? for the peformance cap reason tab


Power limited would be green.


----------



## Clukos

Quote:


> Originally Posted by *CoreyL4*
> 
> How do you tell your gpu is power limited? what does it say in gpuz, and color? for the peformance cap reason tab


Check the power tab in Afterburner: if you are around your max limit (120% for the FE, IIRC) and the GPU core clock fluctuates, that means you are power limited. Lowering the voltage will help stabilize the core clock, same idea as the shunt mod, only you will probably clock a bit lower and consume *a lot* less power.
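
For anyone who would rather log this than watch the Afterburner graph, another option is to poll `nvidia-smi` (assuming the NVIDIA driver's command-line tool is available) and flag samples sitting at the board power limit. A sketch; the sample numbers below are made up and stand in for live output:

```python
import csv, io

# Sample output from:
#   nvidia-smi --query-gpu=power.draw,power.limit,clocks.sm --format=csv,noheader,nounits
# (made-up readings standing in for a live poll)
sample = """\
248.3, 250.0, 1987
251.1, 250.0, 1974
180.5, 250.0, 2012
"""

def power_limited_samples(text, margin_w=5.0):
    """Flag samples whose draw is within margin_w of the board power limit.
    Repeated hits alongside a fluctuating SM clock suggest power-limit throttling."""
    flagged = []
    for row in csv.reader(io.StringIO(text)):
        draw, limit, clock = (float(x) for x in row)
        if draw >= limit - margin_w:
            flagged.append((draw, int(clock)))
    return flagged

print(power_limited_samples(sample))  # the two ~250W samples
```

Same idea as watching the Afterburner power tab, just in a form you can leave running during a benchmark and inspect afterwards.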


----------



## OneCosmic

Is there already any BIOS with a higher power limit than the Inno3D one at 315W?


----------



## KraxKill

Quote:


> Originally Posted by *dunbheagan*
> 
> Superposition 4K, [email protected]/6000, [email protected]: 10171 Points
> 
> Superposition 4K, [email protected]/6000, [email protected]: 10265 Points
> 
> Funny, huh? I tried it couple times, results stay the same.


I suspect one of your power supply rails is running out of juice, or your CPU is throttling slightly at 4.8. My 4790K scales linearly all the way to 5.2GHz. What are your voltages at 4.5 vs 4.8? Also, as suggested above, it could be within the benchmark's margin of error. You never know when Windows decides to index a file or do something nasty in the background. Is this over multiple runs?


----------



## fisher6

Quote:


> Originally Posted by *OneCosmic*
> 
> Is there already any BIOS with higher Power Limit than Inno3D with 315W?


There is the AMP! Extreme BIOS, I think, but I'm not sure it works on an FE model.


----------



## GRABibus

I would like to understand something :

here is my score 3DMark with Sea Hawk X :



As you can see, the core frequency is 1617MHz.
In MSI AB, I put the voltage slider at 100% and the power limit at 120%.

Result: I only get a 2000MHz maximum boost frequency in Time Spy...

Now, when I look at this result, for example:

http://cdn.overclock.net/9/92/92e2e136_timespyahowe.png

The core frequency is lower than mine (1554MHz), yet the boost frequency is 2075MHz!

In Time Spy, why can't I go over 2000-2012MHz with a starting frequency of 1617MHz, the power limit at 120% and the voltage slider at 100% in MSI AB, while a lot of you reach a 2050MHz average under the same conditions...?

Is my card unable to reach its power limit??


----------



## KingEngineRevUp

Quote:


> Originally Posted by *GRABibus*
> 
> I would like to understand something :
> 
> here is my score 3DMark with Sea Hawk X :
> 
> 
> 
> As you can see, frequency core is 1617MHz.
> In MSI AB, I put voltage slider to 100% and power limit to 120%
> 
> Result => I only get 2000MHz maximum as boost frequency in time spy...
> 
> Now, when I look this result for example :
> 
> http://cdn.overclock.net/9/92/92e2e136_timespyahowe.png
> 
> The Core frequency is lower than mine (1554MHz) and the boost frequency is 2075MHz !
> 
> In Time Spy, why can't I go over 2000MHz-2012MHz with a start frequency at 1617MHz with power limit at 120% and slider voltage at 100% in MSI AB, while a lot of you can reach 2050MHz average in the same conditions....?
> 
> Is my card unable to go to its power limit ??


Power limited.


----------



## GRABibus

Quote:


> Originally Posted by *SlimJ87D*
> 
> Power limited.


And is this power limitation (very low in my case) only due to the silicon lottery? (And not a potential issue with my rig?)


----------



## KingEngineRevUp

Quote:


> Originally Posted by *OneCosmic*
> 
> Is there already any BIOS with higher Power Limit than Inno3D with 315W?


I don't think we'll find one. The best would have been the Aorus BIOS, which is 375 watts, and that doesn't work at all.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *GRABibus*
> 
> And this power limitation (Very low in my case) is only due to Silicon lottery ? (And not to a potential issue on my rig ?)


Has nothing to do with either, it's just the card being a 6+8 pin.


----------



## GRABibus

Quote:


> Originally Posted by *SlimJ87D*
> 
> Has nothing to do with either, it's just the card being a 6+8 pin.


Ah, OK, so this power limitation is due to the 6+8 pin? I didn't know that was the main reason; I thought it was the silicon lottery, as you told me yesterday by PM.

Does it mean I'd have a better chance of overclocking headroom with an FE card with 8+8 pins (even if it runs hotter)?


----------



## Luckbad

Quote:


> Originally Posted by *GRABibus*
> 
> Ah ok, this power limitation is due to 6+8 pin ? I ddin't know it was the main reason. Ithought it was Silicon lottery, as you told me yesterday by PM
> 
> 
> 
> 
> 
> 
> 
> 
> It means, i have more chance to have more headroom for overclock with a FE card 8+8pins (Even if it will be hotter) ?


Honestly, if you're thinking of modding your FE to have 2x 8-pins, sell your 1080 Ti FE and buy an aftermarket card that already has that configuration. You're going down a path that will destroy your warranty for sure since you can't undo that.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *GRABibus*
> 
> Ah ok, this power limitation is due to 6+8 pin ? I ddin't know it was the main reason. Ithought it was Silicon lottery, as you told me yesterday by PM
> 
> 
> 
> 
> 
> 
> 
> 
> It means, i have more chance to have more headroom for overclock with a FE card 8+8pins (Even if it will be hotter) ?


Well, the silicon lottery can play a part.

Some cards can reach higher clocks with higher voltages, but higher voltages require more power.

Say a card needs 300 watts at 1.000V; if you overvolt to 1.092V, it needs around 330 watts.

There are no FE cards with 8+8; they only come with 8+6.

I'm not sure you're understanding the silicon lottery. If you can't run your card at a certain clock and voltage doing a light task, then more power will do nothing for you.

If you set 2100MHz at 1.093V and you can't run the Heaven benchmark even for a second, then more power will do nothing for you.

If you can make it through a run of Heaven with Boost 3.0 adjusting clocks to match power availability, and you do run through a few scenes at 2100MHz, then more power might help you.
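
As a rough sanity check on those voltage/power numbers: the usual first-order model says dynamic power scales with frequency times voltage squared. A sketch applying that model (the 300W baseline is from the post above; the 2000MHz clock is a placeholder, and real cards add static leakage and VRM losses, so treat the result as an upper-end estimate rather than a measurement):

```python
def scaled_power(p0, v0, f0, v1, f1):
    """First-order CMOS dynamic-power model: P ~ f * V^2.
    A simplification -- real boards add leakage and VRM losses."""
    return p0 * (f1 / f0) * (v1 / v0) ** 2

# Using the numbers from the post above: 300W at 1.000V,
# overvolted to 1.092V at the same (placeholder) 2000MHz clock.
print(round(scaled_power(300, 1.000, 2000, 1.092, 2000)))  # ~358W
```

The model lands a bit above the ~330W quoted in the post, which is consistent with it being a worst-case bound: not all of a card's draw is voltage-scaled dynamic power.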


----------



## Luckbad

Quote:


> Originally Posted by *OneCosmic*
> 
> Is there already any BIOS with higher Power Limit than Inno3D with 315W?


I uploaded the Zotac AMP Extreme BIOS to TechPowerUp; it has a power limit of 384W. It does not play nicely with the FE and probably won't do you much good with other cards either: it has some fancy, weird stuff going on that other cards don't (16+2 power phases, 2x power boost chips, some sort of proprietary voltage regulation).

https://www.techpowerup.com/vgabios/191335/zotac-gtx1080ti-11264-170330


----------



## GRABibus

Quote:


> Originally Posted by *SlimJ87D*
> 
> Well silicon lottery can somewhat play.
> 
> Some cards can get higher clocks with higher voltages but higher voltages require higher power.
> 
> Let's say 1.000v for 300 watt, well if you overvolt to 1.092v than it needs around 330 watts.
> 
> There is no FE cards with 8+8. They only come with 8+6.


OK, thanks.

A 2000MHz maximum stable frequency at 1.093V in Time Spy... that's really a card far, far below average overclocking capability, especially on water...


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Luckbad*
> 
> I uploaded the Zotac Amp Extreme bios to TechPowerUp, which has a power limit of 384 W. It does not play nicely with the FE and probably won't do you much good with other cards either. It has some fancy weird stuff going on that other cards don't (16+2 power phases, 2x power boost chips, some sort of proprietary voltage regulation).
> 
> https://www.techpowerup.com/vgabios/191335/zotac-gtx1080ti-11264-170330


Someone tested it already, it's a no go.


----------



## Nico67

Quote:


> Originally Posted by *GRABibus*
> 
> I would like to understand something :
> As you can see, frequency core is 1617MHz.
> In MSI AB, I put voltage slider to 100% and power limit to 120%
> 
> Result => I only get 2000MHz maximum as boost frequency in time spy...
> 
> Now, when I look this result for example :
> 
> The Core frequency is lower than mine (1554MHz) and the boost frequency is 2075MHz !
> 
> In Time Spy, why can't I go over 2000MHz-2012MHz with a start frequency at 1617MHz with power limit at 120% and slider voltage at 100% in MSI AB, while a lot of you can reach 2050MHz average in the same conditions....?
> 
> Is my card unable to go to its power limit ??


People have got to stop fixating on core frequency in this respect; all that matters is what your card will do under load: the real boost clocks.

1/ Different versions of the card have different default curves. An FE can do up to 1911 by default, but something like the Zotac AMP Extreme can do 2037+. Just see what the top step on your default curve is; the card can't do more than that without you adding clocks or changing the curve. How much "+" you add is totally irrelevant; what matters is the highest boost you can reach and how much of it you can retain while limiting.

2/ Limitations will reduce your max boost. It could be temp and/or power, or even a voltage lock if it's less than the max curve voltage.

3/ Silicon lottery. Some cards just aren't going to do as well no matter which version you have or what mods you try, but you should always be able to optimise what you have, mainly by keeping it cooler.

First thing you should do is unlock and set voltage to +100 (it doesn't add any voltage; all it does is allow the curve to go up to 1.093 if it goes that high; most should go past the 1.050 default).
Second, set 120% power and the 90°C temp limit.
Third, set 100% fan (not necessary if you're on water, but maybe turn up the AIO fans etc.).

Now you can "test" what the card can do: raise clocks at 1.050 until it crashes. That is silicon lottery, and that you can't control.

The rest is down to cooling, shunt mods, etc.
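The test step at the end amounts to a simple loop: bump the clock one boost bin at a time at a fixed voltage until the stability test fails. A toy sketch (the ~13 MHz bin size is the figure usually quoted for Pascal boost steps; `run_benchmark` is a stand-in for an actual Heaven/Time Spy run, not a real API):

```python
BIN_MHZ = 13  # Pascal boost clocks move in roughly 13 MHz steps ("bins")

def find_max_stable(run_benchmark, start_mhz=1900, volts=1.050):
    """Raise the clock one bin at a time until the benchmark crashes.
    run_benchmark(clk, volts) should return True if the run completed."""
    clk = start_mhz
    while run_benchmark(clk, volts):
        clk += BIN_MHZ
    return clk - BIN_MHZ  # last clock that survived

# Toy stand-in: pretend this card is stable up to 2025 MHz at 1.050 V
print(find_max_stable(lambda clk, v: clk <= 2025))  # -> 2017
```

In practice each "run" is a full benchmark pass, so the loop is manual, but the logic is the same: the last bin that survives is your silicon-lottery result at that voltage.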


----------



## GRABibus

Quote:


> Originally Posted by *Nico67*
> 
> People have got to stop fixating on core frequency in this respect; all that matters is what your card will do under load: the real boost clocks.
> 
> 1/ Different versions of the card have different default curves. An FE can do up to 1911 by default, but something like the Zotac AMP Extreme can do 2037+. Just see what the top step on your default curve is; the card can't do more than that without you adding clocks or changing the curve. How much "+" you add is totally irrelevant; what matters is the highest boost you can reach and how much of it you can retain while limiting.
> 
> 2/ Limitations will reduce your max boost. It could be temp and/or power, or even a voltage lock if it's less than the max curve voltage.
> 
> 3/ Silicon lottery. Some cards just aren't going to do as well no matter which version you have or what mods you try, but you should always be able to optimise what you have, mainly by keeping it cooler.
> 
> First thing you should do is unlock and set voltage to +100 (it doesn't add any voltage; all it does is allow the curve to go up to 1.093 if it goes that high; most should go past the 1.050 default).
> Second, set 120% power and the 90°C temp limit.
> Third, set 100% fan (not necessary if you're on water, but maybe turn up the AIO fans etc.).
> 
> Now you can "test" what the card can do: raise clocks at 1.050 until it crashes. That is silicon lottery, and that you can't control.
> 
> The rest is down to cooling, shunt mods, etc.


I am on water.
Power limit 120%
Voltage slider at 100%
Fan of the AIO at 100%
Fan of the card (For vrm and memory) at 100%

=> Time spy : crash at 2012MHz / 1.093V...
Frequency never goes above 2012MHz.

Max GPU temperature is 45°C during this test, so heat shouldn't be limiting power...

No way


----------



## Nico67

Quote:


> Originally Posted by *GRABibus*
> 
> I am on water.
> Power limit 120%
> Voltage slider at 100%
> Fan of the AIO at 100%
> Fan of the card (For vrm and memory) at 100%
> 
> => Time spy : crash at 2012MHz / 1.093V...
> Frequency never goes above 2012MHz.
> 
> Max GPU temperature is 45°C during this test, so heat shouldn't be limiting power...
> 
> No way


The silicon lottery hasn't been as kind to you as to others, but try less voltage overall; it will probably give you better performance even if the clock is a bit lower.


----------



## dunbheagan

My best Fire Strike Ultra run today brought me 7598 Points, which is rank 6 of 800 for my configuration, only two guys above me.

1080Ti [email protected]/6200 (waterblock, shunt mod)
[email protected] (don't ask about the voltage, it's 1.525V, just for benching)


----------



## PasK1234Xw

Quote:


> Originally Posted by *octiny*


Not enough EVGA logos


----------



## Coopiklaani

Just realized: no matter what you do, shunt mod or not, the "normalized" TDP readout from MSI Afterburner and HWiNFO never exceeds 110%. The regular TDP reading can easily be fooled by a shunt mod, but this normalized TDP can't... Is it some sort of low-side current-sensing circuitry? An internal limit by NVIDIA/BIOS/driver?
Interestingly enough, this 110% max normalized TDP translates to exactly 375w, which is the highest TDP of any card out there.
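The 375w figure lines up with the PCI-SIG power budget for a card with two 8-pin connectors: 75 W from the slot plus 150 W per 8-pin. A quick arithmetic sketch (this shows only the spec math, not how the card actually senses power):

```python
PCIE_SLOT_W = 75    # PCIe x16 slot power limit
EIGHT_PIN_W = 150   # per 8-pin PEG connector
SIX_PIN_W = 75      # per 6-pin PEG connector

def spec_budget(n_eight_pin, n_six_pin=0):
    """Total spec power budget for a card with the given aux connectors."""
    return PCIE_SLOT_W + n_eight_pin * EIGHT_PIN_W + n_six_pin * SIX_PIN_W

print(spec_budget(2))     # -> 375, an 8+8 card
print(spec_budget(1, 1))  # -> 300, an 8+6 card like the FE
```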


----------



## Pandora's Box

lol! Even the cable ties have logos.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Coopiklaani*
> 
> Just realized: no matter what you do, shunt mod or not, the "normalized" TDP readout from MSI Afterburner and HWiNFO never exceeds 110%. The regular TDP reading can easily be fooled by a shunt mod, but this normalized TDP can't... Is it some sort of low-side current-sensing circuitry? An internal limit by NVIDIA/BIOS/driver?
> Interestingly enough, this 110% max normalized TDP translates to exactly 375w, which is the highest TDP of any card out there.


Not sure. I took measurements with my PSU, and at one point I pulled around 100 watts extra over stock. Let's say around 100 watts, because the extra 10 could be from fans, etc.

This was with FurMark, so at stock I was hitting 120% for sure.


----------



## Coopiklaani

Quote:


> Originally Posted by *SlimJ87D*
> 
> Not sure. I took measurements with my PSU, and at one point I pulled around 100 watts extra over stock. Let's say around 100 watts, because the extra 10 could be from fans, etc.
> 
> This was with FurMark, so at stock I was hitting 120% for sure.


The shunt mod does help to get extra power into the card. But this normalized TDP seems to be another barrier that hard-limits the card to 375w. I believe this normalized TDP is measured on the low side, since the high-side 12V currents are measured by the 3 shunt resistors.
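For anyone wondering why the regular TDP reading is fooled: the card infers current from the voltage drop across a known shunt resistance, so adding a second resistance in parallel lowers the effective resistance and shrinks the reported current. A sketch of the math, with an assumed (not measured) 5 mΩ stock shunt:

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

R_STOCK = 0.005                      # ohms; assumed stock shunt value
R_MOD = parallel(R_STOCK, R_STOCK)   # stack an identical resistance on top

actual_amps = 20.0
v_drop = actual_amps * R_MOD         # what the sense circuit measures
reported = v_drop / R_STOCK          # firmware still assumes the stock shunt
print(reported)  # -> 10.0: the card "sees" half the real current
```

That halving is exactly why software power readouts after a shunt mod can't be trusted, and why any sensing path that bypasses the modded shunts (like a low-side measurement) would still report the real draw.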


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Coopiklaani*
> 
> The shunt mod does help to get extra power into the card. But this normalized TDP seems to be another barrier that hard-limits the card to 375w. I believe this normalized TDP is measured on the low side, since the high-side 12V currents are measured by the 3 shunt resistors.


Then why did I measure pulling +100 to +110 watts more after the shunt mod, and not +75?

You can't trust software readings after a shunt mod; the software doesn't account for the hardware modification.


----------



## gmpotu

2/ limitations will reduce your max boost, could be temp and or power, even volt lock if its less than the max curve volts.

I have a question about voltage related limitations with regards to this statement.

If I set my curve to [email protected] 1.000v as a flat line all the way across to 1.093v, and my system runs fine in games [email protected] 1.000v, but during play GPU-Z shows blue for VRel, is this normal and what does it mean in this circumstance?

Also, if clocks are stable @1.000v but the card then needs more voltage to stay stable, will it jump up to 1.012v or higher if I have a flat line all the way across, or will it only attempt 1.000v and drop down to the 0.993v bin if it is no longer stable?

Ideally I would like to start by choosing a low voltage like 1.000v and flat-lining up to 1.093v, and if the card needs more than 1.000v, let it use more; but I don't know if that is how it actually works.


----------



## octiny

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Not enough EVGA logos


I know, right! I was thinking of adding 10-15 more


----------



## Somasonic

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Not enough EVGA logos












Seriously though, it's a very nice looking rig


----------



## KingEngineRevUp

Quote:


> Originally Posted by *dunbheagan*
> 
> My best Fire Strike Ultra run today brought me 7598 Points, which is rank 6 of 800 for my configuration, only two guys above me.
> 
> 1080Ti [email protected]/6200 (waterblock, shunt mod)
> [email protected] (dont ask for the voltage, its 1,525V, just for benching)


Firestrike doesn't like my PC for some reason; I always get an invalid score. It actually takes the damn stupid program 10 minutes to decide to run, but here's a run at 2088.



I might do a 2152 run later if I have time. I can't remember, what are your Superposition scores again?


----------



## Benny89

Witcher 3 can be tricky with GPU OC. I had crashes to desktop in Witcher 3 and seems like I was constantly hitting Power Limit. I had to drop one voltage bin down to prevent that.

So far Witcher 3 was the only game when I was hitting a Power Limit wall.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> Witcher 3 can be tricky with GPU OC. I had crashes to desktop in Witcher 3 and seems like I was constantly hitting Power Limit. I had to drop one voltage bin down to prevent that.
> 
> So far Witcher 3 was the only game when I was hitting a Power Limit wall.


Witcher 3 will take your stable benchmark oc and **** it out for breakfast.


----------



## Benny89

Quote:


> Originally Posted by *SlimJ87D*
> 
> Witcher 3 will take your stable benchmark oc and **** it out for breakfast.


True







But only one bin, so not that much. Instead of gaming at 2063 in W3 I had to drop to 2050 on lower voltage, or it would crash from time to time.

But good lord, I am playing Witcher 3 at 90-100 fps on 1440p Ultra







How awesome is that?


----------



## Nico67

Quote:


> Originally Posted by *gmpotu*
> 
> 2/ limitations will reduce your max boost, could be temp and or power, even volt lock if its less than the max curve volts.
> 
> I have a question about voltage related limitations with regards to this statement.
> 
> If I set my curve to [email protected] 1.000v as a flat line all the way across to 1.093v, and my system runs fine in games [email protected] 1.000v, but during play GPU-Z shows blue for VRel, is this normal and what does it mean in this circumstance?
> 
> Also, if clocks are stable @1.000v but the card then needs more voltage to stay stable, will it jump up to 1.012v or higher if I have a flat line all the way across, or will it only attempt 1.000v and drop down to the 0.993v bin if it is no longer stable?
> 
> Ideally I would like to start by choosing a low voltage like 1.000v and flat-lining up to 1.093v, and if the card needs more than 1.000v, let it use more; but I don't know if that is how it actually works.


A voltage limitation would be if your curve kept going up past 1.050 but you didn't add the +100 on the slider etc. to allow it to use, say, 1.093.

As for being able to use higher voltage on the flat line, it should be able to, but I'm not sure it really does. If you are happy with 2012 @ 1.000v, then try stepping up to 2025 at 1.050 and going flat from there.


----------



## octiny

Quote:


> Originally Posted by *alucardis666*
> 
> Sick setup
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Plan to do the exact same thing when those Hybrid Kits are back in stock


Thanks dude!

Yeah, I was lucky and got Hybrid kits the day they came out. Got a notification and raced to the computer, they were sold out 2 minutes after hitting submit









You'll definitely enjoy them! The instructions were terrible, but everything is pretty straightforward.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> True
> 
> 
> 
> 
> 
> 
> 
> But only one bin, so not that much. Instead of gaming at 2063 in W3 I had to drop to 2050 on lower voltage, or it would crash from time to time.
> 
> But good lord, I am playing Witcher 3 at 90-100 fps on 1440p Ultra
> 
> 
> 
> 
> 
> 
> 
> How awesome is that?


Very. I used to run 980 Ti SLI, and this is actually better.

An OC'd 1080 Ti is about 1.9x a 980 Ti, and SLI scaling is anywhere from 1.4-1.8x
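The comparison is just relative-throughput arithmetic (the 1.9x and 1.4-1.8x figures are rough forum estimates, not benchmark results):

```python
single_ti = 1.9          # one OC'd 1080 Ti, in units of one 980 Ti
sli_980ti = (1.4, 1.8)   # 980 Ti SLI throughput, depending on game scaling

best_case_sli = max(sli_980ti)
print(single_ti > best_case_sli)  # -> True: even ideal SLI scaling loses
```

And that is before counting the games where SLI scaling is zero, where the single card wins by an even wider margin.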


----------



## buddatech

Just ordered the cooler from the hybrid kit for my 1080 Ti FE; it should hopefully be here mid to late next week. I've done the full hybrid kit install on my original 1080 FE. My question: can/should I use CLU Pro on the GPU die, or should I stick with the other paste I have on hand, Antec Formula 7?


----------



## Slackaveli

Well, HA!! I finally cleared that damn 10,500 barrier in SuPo 4K Optimized. ON AIR, mind you. I jacked up the cache speed on my 5775C and that was enough to push me past. Also, I locked it at 1.05v and only had 2 green blips the whole run. I mean blips, too.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> Well, HA!! I finally cleared that damn 10,500 barrier in SuPo 4k op. ON AIR, mind you. I jacked up the cache speed on my 5775c and that was enough to push me past. Also, I locked it at 1.05v and only had 2 green blips the whole run. i mean blips, too.


Nice score. 5775c sounds like a champ.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Nice score. 5775c sounds like a champ.


Especially these late-batch ones we've all been getting. The last dozen or so of us up in the Broadwell thread have 4.2-4.4 ones; mine's a 4.3, and it definitely is badass. My best Time Spy is 9266, which with a 4-core is pretty beast, too.

Just nabbed a 10,965 graphics score. dizamn.


----------



## alucardis666

Quote:


> Originally Posted by *octiny*
> 
> Thanks dude!
> 
> Yeah, I was lucky and got Hybrid kits the day they came out. Got a notification and raced to the computer, they were sold out 2 minutes after hitting submit
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You'll definitely enjoy them! The instructions were terrible, but everything is pretty straightforward.


Yup! I previously ran the 1080 kit without the shroud on a FE Ti and TXp. Should be fun to get to use the shroud this time! Can you comment on the EVGA blower vs stock? Any quieter, and what RPMs does it hit? Or is it the same as reference with an EVGA "E" on it?










Thanks!

*EDIT:*

*Pulled from the Top 30 SuperPo Thread*

*AlucarDis666 --- R7 1700 @ 4.0Ghz --- EVGA GTX 1080Ti FE SLi @ 2025Mhz/6028 --- 11729--- 1080p Extreme*



*AlucarDis666 --- R7 1700 @ 4.0Ghz --- EVGA GTX 1080Ti FE SLi @ 2037Mhz/6066 --- 17802 --- 4K Optimized*



*AlucarDis666 --- R7 1700 @ 4.0Ghz --- EVGA GTX 1080Ti FE SLi @ 2037Mhz/6066 --- 8692 --- 8K Optimized*



*@Octiny* You should post *there* bro! see if you beat me with your Hybrid Tis' and I'll rebench/post when I get my kits in.


----------



## mechwarrior

So... flashed my GB FE with the Inno BIOS. Have to admit I was scared as hell. Max factory boost is 2038MHz, and I am not hitting the power limit as much; the problem is the fan speeds. I tried doing a fan curve in Afterburner, and they only work above 44%.
It's OK, as I'm on water, cooling with a 480 rad.


----------



## octiny

Quote:


> Originally Posted by *alucardis666*
> 
> Yup! I previously ran the 1080 kit without the shroud on a FE Ti and TXp. Should be fun to get to use the shroud this time! Can you comment on the EVGA blower vs stock? Any quieter, and what RPMs does it hit? Or is it the same as reference with an EVGA "E" on it?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks!
> 
> *EDIT:*
> 
> *Pulled from the Top 30 SuperPo Thread*
> 
> *AlucarDis666 --- R7 1700 @ 4.0Ghz --- EVGA GTX 1080Ti FE SLi @ 2025Mhz/6028 --- 11729--- 1080p Extreme*
> 
> 
> 
> *AlucarDis666 --- R7 1700 @ 4.0Ghz --- EVGA GTX 1080Ti FE SLi @ 2037Mhz/6066 --- 17802 --- 4K Optimized*
> 
> 
> 
> *AlucarDis666 --- R7 1700 @ 4.0Ghz --- EVGA GTX 1080Ti FE SLi @ 2037Mhz/6066 --- 8692 --- 8K Optimized*
> 
> 
> 
> *@Octiny* You should post *there* bro! see if you beat me with your Hybrid Tis' and I'll rebench/post when I get my kits in.


Roger that. I've been meaning to download it.

Regarding the blower fan, it does seem to give off less noise at the low end. HOWEVER, I had left it on auto and it was only hitting 31% because of the temps... that started to give me some issues, so I've been at 40% ever since and it's inaudible.

At 100% it hits 4800 RPM


----------



## alucardis666

Quote:


> Originally Posted by *octiny*
> 
> Roger that. I've been meaning to download it.
> 
> Regarding the blower fan, it does seem to give off less noise on the lower end. HOWEVER, I had left it on auto and it was only hitting 31% because of the temps... started to give me some issues so I've been *at 40% ever since and it's inaudible.
> *
> At 100% it hits 4800rpm


Cool!









Thanks for the reply, hope to see you in the SupPo thread


----------



## KickAssCop

This is the last time I am buying an MSI card. First time it was the 980 Ti that wouldn't clock past 1434 MHz. This time the 1080 Ti that won't clock past 1950 MHz without crashing (does 1974 but then backs down to 1950 for consistent performance).

What a POS.

My ASUS 1080 Ti FE did 2050/12100. This one does 1950/12100. Angry at it.

For now, I have set it to OC mode that boosts it to 1923-1936 and left it there. Screw MSI.


----------



## alucardis666

Quote:


> Originally Posted by *KickAssCop*
> 
> This is the last time I am buying an MSI card. First time it was the 980 Ti that wouldn't clock past 1434 MHz. This time the 1080 Ti that won't clock past 1950 MHz without crashing (does 1974 but then backs down to 1950 for consistent performance).
> 
> What a POS.
> 
> My ASUS 1080 Ti FE did 2050/12100. This one does 1950/12100. Angry at it.
> 
> For now, I have set it to OC mode that boosts it to 1923-1936 and left it there. Screw MSI.


Yea, I can relate. I'm not a big fan of MSI or Gigabyte and haven't had very good luck with them: lots of DOA cards and QC issues, and when they have worked for me they've been poor performers/clockers, always resulting in me exchanging or selling them off in favor of ASUS or EVGA cards.

I've also had decent luck with PNY, but ASUS and EVGA are tops


----------



## Slackaveli

Quote:


> Originally Posted by *KickAssCop*
> 
> This is the last time I am buying an MSI card. First time it was the 980 Ti that wouldn't clock past 1434 MHz. This time the 1080 Ti that won't clock past 1950 MHz without crashing (does 1974 but then backs down to 1950 for consistent performance).
> 
> What a POS.
> 
> My ASUS 1080 Ti FE did 2050/12100. This one does 1950/12100. Angry at it.
> 
> For now, I have set it to OC mode that boosts it to 1923-1936 and left it there. Screw MSI.


ebay dat joint, killa. Grab that dude's golden zotac from a few pages back.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Yea, I can relate. I'm not a big fan of MSI or Gigabyte and haven't had very good luck with them: lots of DOA cards and QC issues, and when they have worked for me they've been poor performers/clockers, always resulting in me exchanging or selling them off in favor of ASUS or EVGA cards.
> 
> I've also had decent luck with PNY, but ASUS and EVGA are tops


Damn sure can't complain about my Aorus, but a few others weren't so lucky. A 10,965 GPU score in Time Spy is salty.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *KickAssCop*
> 
> This is the last time I am buying an MSI card. First time it was the 980 Ti that wouldn't clock past 1434 MHz. This time the 1080 Ti that won't clock past 1950 MHz without crashing (does 1974 but then backs down to 1950 for consistent performance).
> 
> What a POS.
> 
> My ASUS 1080 Ti FE did 2050/12100. This one does 1950/12100. Angry at it.
> 
> For now, I have set it to OC mode that boosts it to 1923-1936 and left it there. Screw MSI.


Had an MSI 1080 SeaHawk EK X, clocked like a dream. All brands have there duds.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> Damn sure can't complain about my Aorus, but a few others weren't so lucky. A 10,965 GPU score in Time Spy is salty.


You're lucky my friend, my Aorus didn't go so hot... or rather all it went was hot. Loading at 87c, what a joke.









My SLi FE cards aren't on blocks yet and they're never going beyond 74C!









I plan to re-paste em with kryo or hydronaut when I get my hybrid kits in.
Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Had an MSI 1080 SeaHawk EK X, clocked like a dream. All brands have there duds.


Their*

Let's be honest though... Hybrid cooled cards are just usually guaranteed to perform/be built better.


----------



## MURDoctrine

It is all pure silicon lottery in my experience. I've had nothing but EVGA for the past few cycles and I've not had a golden card yet. My 670, 980, and now 1080ti are all meh. At least I can hit 2038 now that I'm under water on my 1080ti.


----------



## alucardis666

Quote:


> Originally Posted by *MURDoctrine*
> 
> It is all pure silicon lottery in my experience. I've had nothing but EVGA for the past few cycles and I've not had a golden card yet. My 670, 980, and now 1080ti are all meh. At least I can hit 2038 now that I'm under water on my 1080ti.


Anything over 1950mhz is good in my book.









*2050+ is golden*


----------



## Clukos

I've actually ended up running 0.993v and 2000 on the core +630 mem (gotta have that round 540 GB/s). Perfectly stable (even in Witcher 3) and stays usually in the 60-65 C range and is virtually silent. Benchmarking I can go up to 2113-2126 but I'm power limited because of the voltage.


----------



## alucardis666

Quote:


> Originally Posted by *Clukos*
> 
> I've actually ended up running 0.993v and 2000 on the core +630 mem (gotta have that round 540 GB/s). Perfectly stable (even in Witcher 3) and stays usually in the 60-65 C range and is virtually silent. Benchmarking I can go up to 2113-2126 but I'm power limited because of the voltage.


Impressive!


----------



## gmpotu

I don't get how you guys are getting 100+ fps @1440p in your games.
I'm playing World of Warcraft and dropping to about 35fps at 1080p in some areas.
My settings are everything maxed with SSAA turned on. My clocks are [email protected] 1.000v, +350 mem.

@nico67 I installed the PSU today. It didn't fix the rainbow in GPU-Z. I cannot figure out why I'm getting that rainbow, but it only seems to happen when AB messes up my initial profile and the computer is cooled down when I apply it. Going to try not overclocking until my card heats up above 50°C and see if that helps.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> You're lucky my friend, my Aorus didn't go so hot... or rather all it went was hot. Loading at 87c, what a joke.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My SLi FE cards aren't on blocks yet and they're never going beyond 74C!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I plan to re-paste em with kryo or hydronaut when I get my hybrid kits in.
> Their*
> 
> Let's be honest though... Hybrid cooled cards are just usually guaranteed to perform/be built better.


Yours was probably naked under the block, with paste next to the chip or something. Even before I changed a thing I never went over 70c. Now, hell, I was at 55c @ 2088/2076 all night in BF1. 125-150 fps, ultra'd up.

Y'all add me: KingSlackaveli on Origin, Slackaveli on Steam/Uplay
Quote:


> Originally Posted by *MURDoctrine*
> 
> It is all pure silicon lottery in my experience. I've had nothing but EVGA for the past few cycles and I've not had a golden card yet. My 670, 980, and now 1080ti are all meh. At least I can hit 2038 now that I'm under water on my 1080ti.


My experience as well. I did have a 1500MHz hybrid EVGA 980 Ti, but I've owned 4 different 980 Tis, 2 1080 Tis, a 980, and a 780 Ti in the last few years, and only that one EVGA and now this Aorus were above average. I'd call this Aorus golden: a regular version, on air, yet gaming all night at 2076+ and under 60c. FINALLY won the lottery. Not sure how this one didn't end up in an Extreme. Must've crashed in a gauntlet test on the bogus or something lol.


----------



## Luckbad

Quote:


> Originally Posted by *gmpotu*
> 
> I don't get how you guys are getting 100+ fps @1440p in your games?
> I'm playing World of Warcraft and dropping to about 35fps on 1080p resolution in some areas.
> My settings are maxed everything with SSAA turned on. My clocks are [email protected] 1.000v. +350 mem.
> 
> @nico67 I installed the PSU today. Didn't fix me getting the rainbow in GPU-Z. I cannot figure out why I'm getting that rainbow, but it only seems to happen when AB messes up my initial profile and the computer is cooled down when I apply it. Going to try not overclocking until my card heats up above 50deg and see if that helps.


What CPU are you using? I'm using 200% upscaling in WoW, so I'm generally around 100 FPS at 1440p. Don't run on max settings, either. That 10 view distance (increased significantly in Legion) is murder. I think I dropped my view distance to 8, environment detail at 9, and maxed everything else out, then render scale (under Advanced) is 200%. Note that 200% render scale is the same as SSAA 4x, so using that means you're rendering at double your resolution. Shouldn't be an issue at 1080p, though.

The biggest killer is view distance, though. Fly to the border of Val'sharah and Suramar and spin around. You'll see way too much and your framerate will plummet. WoW also cares a lot about your CPU, so if you are running something old you might be CPU bound at this point.

See video about CPUs and gaming: 




If you're rockin' a 1080 Ti, going older than a 4790k is probably hurting.
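The render-scale math is easy to sanity-check: 200% scale doubles each axis, so the GPU pushes 4x the pixels, which is the SSAA 4x equivalence mentioned above. A quick sketch:

```python
def rendered_pixels(width, height, render_scale_pct):
    """Pixels actually rendered; the scale applies to each axis."""
    scale = render_scale_pct / 100
    return int(width * scale) * int(height * scale)

native_1440p = rendered_pixels(2560, 1440, 100)   # 3,686,400 px
scaled_1440p = rendered_pixels(2560, 1440, 200)   # 14,745,600 px
print(scaled_1440p // native_1440p)  # -> 4, i.e. 4x the shading work
```

At 1080p with 200% scale you render roughly 1440p-and-a-half worth of pixels (8.3M), which is why the setting is survivable there but brutal at 1440p.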


----------



## MURDoctrine

Quote:


> Originally Posted by *Luckbad*
> 
> What CPU are you using? I'm using 200% upscaling in WoW, so I'm generally around 100 FPS at 1440p. Don't run on max settings, either. That 10 view distance (increased significantly in Legion) is murder. I think I dropped my view distance to 8, environment detail at 9, and maxed everything else out, then render scale (under Advanced) is 200%. Note that 200% render scale is the same as SSAA 4x, so using that means you're rendering at double your resolution. Shouldn't be an issue at 1080p, though.
> 
> The biggest killer is view distance, though. Fly to the border of Val'sharah and Suramar and spin around. You'll see way too much and your framerate will plummet. WoW also cares a lot about your CPU, so if you are running something old you might be CPU bound at this point.
> 
> See video about CPUs and gaming:
> 
> 
> 
> 
> If you're rockin' a 1080 Ti, going older than a 4790k is probably hurting.


Yeah, Legion dialed everything up to 15, it seemed. When I was playing on my 980 I was fine maxed out until I reached Suramar, where I was getting HORRIFIC FPS drops. I think it was a combination of the water effects and draw distances.


----------



## gmpotu

Quote:


> Originally Posted by *Luckbad*
> 
> What CPU are you using? I'm using 200% upscaling in WoW, so I'm generally around 100 FPS at 1440p. Don't run on max settings, either. That 10 view distance (increased significantly in Legion) is murder. I think I dropped my view distance to 8, environment detail at 9, and maxed everything else out, then render scale (under Advanced) is 200%. Note that 200% render scale is the same as SSAA 4x, so using that means you're rendering at double your resolution. Shouldn't be an issue at 1080p, though.
> 
> The biggest killer is view distance, though. Fly to the border of Val'sharah and Suramar and spin around. You'll see way too much and your framerate will plummet. WoW also cares a lot about your CPU, so if you are running something old you might be CPU bound at this point.
> 
> See video about CPUs and gaming:
> 
> 
> 
> 
> If you're rockin' a 1080 Ti, going older than a 4790k is probably hurting.


Yeah, I'm running an old 3770k @ 4.2GHz.
Suramar is usually where my frames drop into the 20s when I'm flying. I have all sliders maxed: view distance, SSAA x16 or whatever the max is. If there's a slider, it's maxed out.

I'll try to optimize tomorrow. I just figured you guys were maxing out all settings as far as they would go and still getting over 100fps.

I'm still not understanding something about this whole process even after watching the videos etc. Not sure what I'm missing. More testing tomorrow though.


----------



## MURDoctrine

Quote:


> Originally Posted by *gmpotu*
> 
> Yeah I'm running an old 3770k @ 4.2ghz.
> Suramar is usually where my frames drop down into the 20s when I'm flying. I have all sliders maxed, view distance, SSAA x 16 or whatever the max is. If there's a slider then it's maxed out.
> 
> I'll try to optimize tomorrow. I just figured you guys were maxing out all settings as far as they would go and still getting over 100fps.
> 
> I'm still not understanding something about this whole processes even after watching the videos etc. Not sure what I'm missing. More testing tomorrow though.


I had issues wrapping my head around it at first too. I usually game with just my slider set. The curve tuning is to find your best core clock at the different voltage bins. Say you do a bench and your card is stable at 2038 @ 1.093 but you run into the power limiter during the bench; your card will then start throttling back to meet the TDP limit. Mine drops down to around 1.000V in SuPo 4K just now. You try to find the max core your card can push at those bins without crashing. This way, when your card does throttle, the performance loss will be minimal.
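One way to picture the curve tuning: you end up with a table of the best stable clock per voltage bin, and when the power limiter kicks in, Boost drops to a lower bin and runs whatever clock your curve assigns there. A toy sketch (the bins, clocks, and power model below are made up for illustration, not from a real card):

```python
# best stable clock (MHz) found per voltage bin, from benching
curve = {1.093: 2038, 1.062: 2025, 1.031: 2012, 1.000: 2000}

def clock_under_limit(curve, power_of, limit_w):
    """Pick the highest-voltage bin whose estimated power fits the limit."""
    for volts in sorted(curve, reverse=True):
        if power_of(volts, curve[volts]) <= limit_w:
            return volts, curve[volts]
    return min(curve), curve[min(curve)]

# toy power model: ~300 W at 1.000 V / 2000 MHz, scaling with f * V^2
power = lambda v, f: 300 * (f / 2000) * (v / 1.000) ** 2

print(clock_under_limit(curve, power, 330))  # -> (1.031, 2012)
```

Because every bin on the tuned curve carries its own best stable clock, throttling two bins down only costs 26 MHz here instead of falling back to a default (untuned) clock.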


----------



## smonkie

I can't believe the amount of misinformed crap I've just read over the last few pages concerning PSUs' needed wattage, even when the 'data' came from alleged engineers.

A good-quality PSU (and I mean a good one) is rated to last you at least as many years as its warranty covers. I've been using a Seasonic X 650 for about 6 years now, and my rig has always been on the top side. Not a single problem whatsoever. Why in the hell would a company offer five or ten years of warranty if it weren't sure of it? It makes no marketing sense at all. Stop alarming people and urging them to buy 1000W+ PSUs.

Of course, power load is related to efficiency, but that doesn't mean lifespan also is, at least not in a noticeable manner. Overclocking may very well shorten the lifespan of our hardware, but I don't think that stops people from doing it. And that is because the rate at which a MODERATE OC degrades your hardware matters far less than the performance/fun you get from it.


----------



## WarbossChoppa

I'm pulling my hair out trying to OC my 1080 Ti Seahawk X to get to 2000Mhz. I'm about ready to sell it and just get a Zotac.


----------



## Clukos

Quote:


> Originally Posted by *WarbossChoppa*
> 
> I'm pulling my hair out trying to OC my 1080 Ti Seahawk X to get to 2000Mhz. I'm about ready to sell it and just get a Zotac.


Get the Gaming X instead. 2000MHz stable core clock at 0.993v, and some people in this thread can go even lower on the voltage and stay stable:






^ That was more of a CPU stress test run (at 1080p) but the GPU was always at 90+% usage, 2000MHz rock solid throughout the run.


----------



## WarbossChoppa

Quote:


> Originally Posted by *Clukos*
> 
> Get the Gaming X instead. 2000MHz stable core clock at 0.993mv, and some people in this thread can go even lower with the voltage and stay stable:
> 
> 
> 
> 
> 
> 
> ^ That was more of a CPU stress test run (at 1080p) but the GPU was always at 90+% usage, 2000MHz rock solid throughout the run.


I'm just sick of this thing, and I know the AMP Extremes can boost out of the box at good speeds.


----------



## imLagging

Quote:


> Originally Posted by *alucardis666*
> 
> Anything over 1950mhz is good in my book.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *2050+ is golden*


I haven't been following this thread; has the average OC been low?
I was under the impression most cards could hit 2050-2100 with max voltage.
Quote:


> Originally Posted by *WarbossChoppa*
> 
> I'm pulling my hair out trying to OC my 1080 Ti Seahawk X to get to 2000Mhz. I'm about ready to sell it and just get a Zotac.


Can you still return it for a replacement? I can only imagine; I would also be very frustrated.
I didn't hold out long enough to buy a custom card. Got a decent-clocking FE, but I had to shunt mod it to reach its potential, and man did the coil whine reveal itself after all was said and done








Been trying some benchmarks; the max I've clocked is 2152, pulling 590W from the wall. Tried 2160+ but it crashes after 10 seconds in Superposition.
For memory I can clock +650 on a cold run. Oddly enough, I ran +660 on the first few benchmarks and never again.


----------



## KickAssCop

I know it is silicon lottery but losing twice to the same vendor is enough for me to not consider that vendor. Didn't have such a bad run with EVGA, ASUS or Gigabyte for that matter.


----------



## Clukos

Quote:


> Originally Posted by *imLagging*
> 
> I was under the impression most could hit 2050-2100 with max voltage


Technically most "should"; the problem is that at max voltage you are power limited on most cards, and unless you shunt mod or undervolt you don't actually see a stable 2050-2100 core clock. It's more like 1949-2062 in a power limited scenario, even if the card could technically hit 2100 at that voltage. Water and shunt modding help if you want to keep the card at maximum voltage and core clocks, but overall I don't think it's worth it for 24/7 use (shunt modding in particular).
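As a rough illustration of why undervolting sidesteps the limiter: dynamic power scales roughly with frequency times voltage squared, so dropping a few voltage bins buys back a lot of power headroom. The baseline wattage, clocks, and limit in the sketch below are illustrative assumptions, not measured figures from any particular card.

```python
# Rough sketch of the power-limit trade-off described above: dynamic
# power scales roughly with frequency * voltage^2. All baseline
# numbers here are illustrative assumptions, not measurements.

def est_power(freq_mhz, volts, base_w=250.0, base_mhz=1900.0, base_v=1.050):
    """Estimate board power relative to an assumed 250 W baseline."""
    return base_w * (freq_mhz / base_mhz) * (volts / base_v) ** 2

STOCK_LIMIT_W = 250.0  # assumed 100% power-limit slider position

# Max-voltage point vs an undervolted point at nearly the same clock:
print(round(est_power(2063, 1.093)))  # ~294 W, over the stock limit
print(round(est_power(2000, 0.993)))  # ~235 W, fits under the limit
```

The point of the toy numbers: the two clocks differ by about 3%, but the voltage drop cuts estimated draw by roughly 20%, which is why an undervolted curve can hold its clock where a max-voltage curve bounces off the limiter.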


----------



## MrTOOSHORT

Quote:


> Originally Posted by *alucardis666*
> 
> Their*
> 
> Let's be honest though... Hybrid cooled cards are just usually guaranteed to perform/be built better.


I'm talking about this card:



Thanks for the spell check.


----------



## imLagging

Quote:


> Originally Posted by *Clukos*
> 
> Technically most "should", the problem is that at max voltage you are power limited with most cards and unless you shunt mod or undervolt, you don't actually see a stable core clock of 2050-2100. It's more like 1949-2062 in a power limited scenario, even if you could technically hit 2100 with that voltage. Water and shunt modding help if you want to keep the card at maximum voltage and core clocks but overall, I don't think it's worth it for 24/7 use (shunt modding in particular).


I remember how limited I was with the stock cooler. The card boosted to ~1980 and I was very happy, but at the same time I felt there was no room left to push it. I think the max I saw was ~2060MHz, power limited constantly at 120%.
The shunt mod is definitely not ideal. I tried to use the thinnest layer and barely coat up to the ends; the last thing I want is it spreading and reaching the solder joints. I used liquid electrical tape and let it cure for 2 days. Hoping nothing bad happens








Ironically, I'm running 0.988V and the highest I've gotten is ~2060-2075MHz depending on the game. You'd think I'd be running 1.09V and trying to burn this card into an RMA


----------



## Clukos

You don't really need to shunt mod for load voltages <= 1.031V; even Superposition 8K Optimized (the hardest thing to run in terms of power limit among benches) doesn't throttle at that voltage. But I guess it depends on the configuration too; the Gaming X I'm using has 2x 8-pin, so maybe I have more headroom at stock settings?


----------



## Benny89

Quote:


> Originally Posted by *KickAssCop*
> 
> I know it is silicon lottery but losing twice to the same vendor is enough for me to not consider that vendor. Didn't have such a bad run with EVGA, ASUS or Gigabyte for that matter.


I also had bad luck with a 980 Ti from MSI; it didn't go above 1960MHz, so I returned it. Grabbed a Gigabyte Gaming G1 and it's been stable at a 1555MHz OC on air since then.

I don't personally recommend MSI anymore. I also can't recommend Aorus: not that all of them are bad, but the bad ones are extremely bad, crashing games at stock settings, artifacting on stock memory, etc.

I would definitely stick with ASUS or EVGA this round. Very pleased with my STRIX.


----------



## trawetSluaP

People complaining their card doesn't go over 1950MHz when the advertised boost speed is around 1600-1700MHz is a bit ridiculous!

My card will hit 1999MHz but crashes if I add another bin. I'm disappointed that I can't hit the magic (in my head) 2GHz, but at the same time GPU Boost 3.0 is giving me an extra 300-odd MHz, so I can't complain.

What is starting to bug me is the coil whine in certain Folding WUs...


----------



## imLagging

Quote:


> Originally Posted by *Clukos*
> 
> You don't really need to shunt mod for load voltage <= 1.0310mv, even Superposition 8k Optimized (which is the hardest thing to run in terms of power limit from benches) doesn't throttle at that voltage. But I guess it depends on the configuration too, the Gaming X I'm using has 2x 8-pin so maybe I have more headroom at stock settings?


Yeah, your card has a higher power limit in the BIOS, along with the beautiful PCB circuitry and components








If someone were going to shunt mod in order to undervolt, I would be confused haha
Curious what my power draw was: I loaded 1.031V at 2050MHz and was reaching 90% power limit on the first scene. Adding the ~30-40% from the shunt, I could still lower the clocks or voltage and stay within the stock power limit range. I still need to find my voltage-clock limits for 1.03, 1.05, 1.07, and 1.09. I need to find the sweet spot and use that when gaming; I'm definitely not going to add ~30W for 13MHz plus the temperature increase in the loop.
-wow, I'm a bit surprised. I was testing in 4K and tried 8K: 1.03V at 2050MHz pulled up to 97% power, so I cranked it to 1.09V at 2100MHz and saw the first scene reach 111%








Quote:


> Originally Posted by *trawetSluaP*
> 
> People complaining their card doesn't' go over 1950 + MHz when the advertised boost speed is around 1600-1700MHz is a bit ridiculous!
> 
> My card will hit 1999MHz but crash if I add another bin. I'm disappointed that I can't hit the magic (in my head) 2GHz but at the same time GPU Boost 3.0 is giving me an extra 300 odd MHz so I can't complain.
> 
> What is starting to bug me is the coilwhine in certain Folding WUs...


You have to look deeper than the box specs; there is a reason they use those numbers








Also, this is one of those forums where people will break down the card, components, and GPU architecture, and guesstimate what range the card should land in. It's not difficult to determine the low and high ranges for overclocks.


----------



## PasK1234Xw

Quote:


> Originally Posted by *trawetSluaP*
> 
> People complaining their card doesn't' go over 1950 + MHz when the advertised boost speed is around 1600-1700MHz is a bit ridiculous!
> 
> My card will hit 1999MHz but crash if I add another bin. I'm disappointed that I can't hit the magic (in my head) 2GHz but at the same time GPU Boost 3.0 is giving me an extra 300 odd MHz so I can't complain.
> 
> What is starting to bug me is the coilwhine in certain Folding WUs...


Are you increasing the voltage? The default max is 1.062V; maxing the voltage slider unlocks 1.074, 1.081, and 1.093V.

But 2000 is pretty easy to hit even on stock voltage. I've had 4 Tis and 1 Xp, and all did over 2000 no problem.

Also, it's not just higher clocks; better binned cards run cooler and don't hit the power limit as often.

Also, that coil whine should only happen at very high frame rates, 1000+ fps, like when exiting the Heaven benchmark.


----------



## stranger451

I can confirm that the Corsair Hydro GFX GTX 1080 Ti comes with 3x DP, 1x HDMI and 1x DL-DVI-D.


----------



## Roadrunners

I have just received my replacement card from EVGA because my current one has coil whine. The new card is far better in terms of whine, even though the coil whine on my present card has calmed down since I first installed it some 5 weeks ago. The new card, however, is showing it's hitting the voltage limit at its stock boost of 1886MHz. My current card can run up to 2088MHz (on water) with no voltage limit being triggered in MSI Afterburner. I did a quick run and can get the new card to overclock to 2050MHz, settling at 2012MHz (on water), and it's stable even though it shows it's hitting the voltage limit.

I am just wondering: is this voltage limit going to cause issues? I'm guessing it will if I want to run it on water with decent clocks, i.e. around 2050MHz.


----------



## We Gone

Very happy with my new 1080 Ti FE: running 2012MHz at 105% power and +101, not hitting any limits, mem at stock. Max temps of 61C with the fan @ 80%. Can't wait for the water cooling kits to get back in stock!!


----------



## jestermx6

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Are you increasing voltage? Default max is 1.062 increasing voltage slider unlocks 1.074, 081, 093v
> 
> But 2000 is pretty easy to hit even on stock voltage Ive had 4 Ti and 1 Xp all did over 2000 no problem.
> 
> Also it s not just higher clocks better binned cards run cooler and don't hit power limit as often
> 
> Also that coil whine should only happens at very high frames 1000+ fps like when exiting heaven benchmark.


I get coil whine at anything above 150fps. The higher the FPS, the more whine, obviously.


----------



## BrainSplatter

Quote:


> Originally Posted by *Roadrunners*
> 
> I am just wondering is this voltage limit going to cause issues? I guessing it will if I want to run it on water with decent clocks ie around 2050Mhz.


Your new card has higher default voltage. Try to undervolt it with AB voltage curve and see how far down u can get it for 2000-2050.


----------



## Roadrunners

Quote:


> Originally Posted by *BrainSplatter*
> 
> Your new card has higher default voltage. Try to undervolt it with AB voltage curve and see how far down u can get it for 2000-2050.


Thanks for the info. I haven't used the AB voltage curve before, but I just did a quick Google on how to bring up the graph and grabbed a screenshot of each card.

Original 1080Ti



New 1080Ti


----------



## BrainSplatter

Quote:


> Originally Posted by *Roadrunners*
> 
> Thanks for the info, I haven't used AB Voltage curve before but I have just done quick google on how to bring up the graph and have done a screen grab of each card.


As u can see, the new card is configured to use 1100mv @ 1900Mhz, whereas the old one is set to 1050 @ 1900Mhz.

Just search this thread for voltage curve and u will get a lot of info on how to edit it. Quickest try would probably be to press the shift key (will move all voltage points), select some point and drag it up so it uses something like 1025mv @ 2000Mhz and check stability.

Most cards will do 2000Mhz with something between 1000mv - 1050mv as minimum voltage. Great cards can also do <1000mv @ 2000Mhz.
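A toy model of the curve edit described above, assuming the curve is just a list of (millivolt, MHz) points: raising one point and flattening everything to its right caps the boost clock at that voltage. This mirrors the effect of dragging a point up in Afterburner; it is not MSI's actual data structure, and the sample points are made up.

```python
# Toy model of the flatten-the-curve trick: a curve is a list of
# (millivolts, MHz) points, ascending by voltage. Clamping every point
# at or above the target voltage to one frequency stops the card from
# boosting past it. Illustrative only, not Afterburner's internals.

def flatten_curve(curve, target_mv, target_mhz):
    """Clamp every point at or above target_mv to target_mhz."""
    flattened = []
    for mv, mhz in curve:
        if mv < target_mv:
            flattened.append((mv, mhz))         # lower bins untouched
        else:
            flattened.append((mv, target_mhz))  # clamped to the cap
    return flattened

stock = [(950, 1850), (1000, 1911), (1025, 1949), (1050, 1974), (1093, 2012)]
locked = flatten_curve(stock, 1025, 2000)
# every point from 1025 mV upward now targets 2000 MHz
```

The design intuition matches the advice in the post: the bins below your chosen point keep their stock clocks for light loads, while the flat top means heavy loads sit at your tested voltage/clock pair instead of climbing to an unstable one.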


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Clukos*
> 
> You don't really need to shunt mod for load voltage <= 1.0310mv, even Superposition 8k Optimized (which is the hardest thing to run in terms of power limit from benches) doesn't throttle at that voltage. But I guess it depends on the configuration too, the Gaming X I'm using has 2x 8-pin so maybe I have more headroom at stock settings?


Your card has a legit 30 extra watts.

Superposition isn't as power demanding as some of the 3DMark benchmarks, but the 3DMark benchmarks aren't realistic gameplay either. I think Superposition and Heaven are still the two go-to benchmarks.


----------



## Dasboogieman

Quote:


> Originally Posted by *Benny89*
> 
> I also had bad luck with 980 Ti from MSI, didn't go above 1960 mhz so I returned it. Grabbed Gigabyte Gaming G1 and stable since then 1555 OC on air.
> 
> I don't personally recommend MSI anymore. Also I can't recommend Aorus- not that all of them are bad, but bad ones are extreme bad like crashin games on stock settings, artefacting on stock memory etc.
> 
> I would definitely stick with ASUS or EVGA this round. Very pleased with my STRIX.


I dunno if it's a good idea to judge brands based on OC potential lol. The chips nowadays are determined more and more by the silicon lottery and less by the PCB design. The main things IMO each manufacturer should be judged on are customer support, cooling design, and BIOS support (Gigabyte were genius with their 375W limit).

If anything, recommend the stock NVIDIA card, because those seem to be getting the cream-of-the-crop centre dies while the crap bins get pushed off to the AIBs.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *imLagging*
> 
> Yeah, your card has a higher power limit in bios. Along with the beautiful PCB circuirty and components
> 
> 
> 
> 
> 
> 
> 
> 
> If someone was going to shunt to undervolt I would be confused haha
> Curious what my power draw was, I loaded 1.031v with 2050mhz and I was reaching 90% power limit on first scene. Add the ~30-40% from shunt, I would still be able to lower the clocks or voltage and stay within stock power limit range. I still need to find my voltage-clock limits for 1.03, 1.05, 1.07, and 1.09. I need to find the sweet spot and use that when gaming, definitely not going to add ~30w for 13mhz and the temperature increase in loop.
> -wow, I'm a bit surprised. I was testing in 4k and tried 8k to see 1.03v 2050mhz pull up to 97% power so I cranked to 1.09v 2100mhz and seen first scene reach 111%
> 
> 
> 
> 
> 
> 
> 
> 
> You have to look deeper than box specs. Considering there is a reason they use those numbers
> 
> 
> 
> 
> 
> 
> 
> 
> Also this is one of those forums where people will break down the card, components, gpu architecture, and guesstimate what the card should be in range of doing. It is not difficult to determine the low and high ranges for overclocks.


Did you do your shunt mod right? What percentage are you hitting when you run heaven?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *We Gone*
> 
> Very happy with my new 1080Ti FE, running 2012mhz at 105% power & +101 power not hitting any limits Mem at stock. Max temps of 61c fan @ 80%. Can't wait for the water cool kits to get back in stock!!


You'll hit power limits with TimeSpy, superposition and firestrike and certain parts of games like Witcher 3.


----------



## Roadrunners

Quote:


> Originally Posted by *BrainSplatter*
> 
> As u can see, the new card is configured to use 1100mv @ 1900Mhz, thereas the old one is set to 1050 @ 1900Mhz.
> 
> Just search this thread for voltage curve and u will get a lot of info on how to edit it. Quickest try would probably be to press the shift key (will lower all voltage points), select some point and drag it up so it uses something like 1025mv @ 2000Mhz and check stability.
> 
> Most cards will do 2000Mhz with something between 1000mv - 1050mv as minimum voltage. Great cards can also do <1000mv @ 2000Mhz.


Thanks again, I will give this a go and post back.


----------



## TheParisHeaton

Still thinking about which card to pick. I want a maximum OC on water with an EKWB full-cover block without any mods.

STRIX OC
Aorus
Aorus Xtreme

Thanks.


----------



## Benny89

So I started to play with the voltage curve on my STRIX.

I began with 1.000V. The max clock I could hold stable in Heaven with a Ctrl+L lock on it was 2025, dropping to 2012 at the end of the run.

Throughout this, my AB/RT overlay constantly showed me:

*LIM: POWER*

However, the max power consumption the GPU hit during these runs was merely 93.3%, so nowhere near 120%.

Max voltage was of course 1.000 since it was locked.

Now I want to start working on other bins, but why did I have *LIM: POWER* the whole time?

*EDIT*: Also, if I play with voltage curves, do I need to touch the voltage slider at all?


----------



## BrainSplatter

Quote:


> Originally Posted by *TheParisHeaton*
> 
> I want a maximum OC on water with EKWB fc block without any mods.


Max OC depends on the silicon lottery, not the brand. A much higher voltage limit also seems to matter less than in previous gens. If the stock cooler doesn't matter to you, just get the one with the highest power limit to avoid throttling from hitting it. But another card with a lower power limit might still be faster in benches because its chip needs less voltage and draws less power (again, silicon lottery).


----------



## KedarWolf

Quote:


> Originally Posted by *Benny89*
> 
> So I started to play with Voltage Curve on my STRIX.
> 
> So I started with 1.000V. Max clock I was able to be stable in Heaven with Ctrl+L lock on it was 2025. Droped in the end of the run to 2012.
> 
> Now my AB/RT overlay showed me constantly during this:
> 
> *LIM: POWER*
> 
> However max Power Consumption used by GPU during this runs was merely 93,3% so nowhere near 120%.
> 
> Max Voltage was of course 1.000 since it was locked.
> 
> Now I want to start working on other bins but why did I have *LIM: POWER* all the time?
> 
> *EDIT*: Also If I play with voltage curves- do I need to touch Voltage Slider at all?


I've found some clock speeds limit no matter what; 2025 will limit to 2012. You need to use 2032 or 2012 to stop it limiting.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *TheParisHeaton*
> 
> Still thinking about which card to decide. I want a maximum OC on water with EKWB fc block without any mods.
> 
> STRIX OC
> Aorus
> Aorus Xtreme
> 
> Thanks.


Here, this has all the answers to your questions.

https://tinyurl.com/2tx


----------



## Foxrun

Quote:


> Originally Posted by *smonkie*
> 
> I can't believe the amount of misinformed crap I've just read in the last pages concerning psu's needed wattage, even so when the 'data' came from alleged engineers.
> 
> A good quality PSU (and I mean a good one) is rated to last you at least the same number of years it offers you in warranty. I've been using a Seasonic X 650 for about 6 years now, and my rig has always been on the top side. Not a single problem whatsoever. Why in the hell a company would offer five or ten years of warranty if it wouldn't be sure of it? Makes no marketing sense at all. Stop alarming people and urging them into buying +1000W psu's.
> 
> Of course, the power load is related with efficiency, but that doesn't mean lifespan also is, at least in a noticeable manner. Overclocking may very well shorten the lifespan of our hardware, bit I don't think that stops people from doing it. And that is because the rate in which a MODERATE oc degrades your hardware is not even as much important as the performance/fun you get from it.


For SLI 1080 Tis and my 5820K, a 1000 watt unit from EVGA did not cut it. I would get reboots due to loss of power. Now with the 1200 watt Corsair everything runs great, and no more reboots while gaming.


----------



## cekim

Quote:


> Originally Posted by *Foxrun*
> 
> For SLI 1080Ti's and my 5820k a 1000 watt from EVGA did not cut it. I would get reboots due to loss of power. Now with the 1200watt Corsair everything runs great and no more reboots while gaming.


Measured last night with a Kill-a-watt at the wall...

6950x + 2x1080ti FE OC to ~2000 + 2 EK pumps running aida64 stress cpu, gpu, cache (not FPU).

700-800W from the wall.

Fortunately I have a 1600W PSU.









Unfortunately, I need to move some of my computes off this circuit.


----------



## BrainSplatter

Quote:


> Originally Posted by *Foxrun*
> 
> For SLI 1080Ti's and my 5820k a 1000 watt from EVGA did not cut it. I would get reboots due to loss of power. Now with the 1200watt Corsair everything runs great and no more reboots while gaming.


I actually run SLI 1080Ti + [email protected] on an 800W BeQuiet E10-CM Straight Power lol. A review showed it can deliver >800W, but I'm feeling a little uneasy; no problems so far, but I'm thinking about upgrading









My device for measuring power draw at the outlet also doesn't work because its tiny battery is empty.


----------



## Foxrun

Quote:


> Originally Posted by *cekim*
> 
> Measured last night with a Kill-a-watt at the wall...
> 
> 6950x + 2x1080ti FE OC to ~2000 + 2 EK pumps running aida64 stress cpu, gpu, cache (not FPU).
> 
> 700-800W from the wall.
> 
> Fortunately I have a 1600W PSU.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Unfortunately, I need to move some of my computes off this circuit.


I've got the 1080 Tis to 2038/6000 @ 1.063V, the 5820K @ 1.32V, 9 fans, 3 pumps, OCed RAM at 1.37V, and I'm at 4K. 1000 watts from the SuperNOVA PSU was not enough. Maybe it's the quality of the PSU? I dunno, I was under the impression that EVGA was really good for PSUs.


----------



## cekim

Quote:


> Originally Posted by *BrainSplatter*
> 
> I actually run SLI 1080Ti + [email protected] on a 800W BeQuiet E10-CM-Straight Power, lol. I am feeling a little uneasy but so far no problems but thinking about upgrading
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My device to measure power draw on the outlet also doesn't work because it's tiny battery is empty.


You should be; see above. A 6950X will certainly add a bunch, higher TDP and OC'd, but you have to be pushing your PSU near or beyond 80% of max load. Without ideal cooling, I would not be surprised to see problems.


----------



## Benny89

By locking my card at lower bins like 1.000 and 1.012V and benching with it, I see this in GPU-Z:



Is it normal when benching/locking at such low voltage bins?


----------



## cekim

Quote:


> Originally Posted by *Foxrun*
> 
> Ive got the 1080Ti's to 2038/6000 @ 1.063, 5820k @ 1.32, 9 Fans, 3 Pumps, OCed Ram at 1.37, and Im at 4k. 1000 watts from the Supernova PS was not enough. Maybe it's the quality of the PSU? I dunno, I was under the impression that EVGA was really good for PSU's.


They generally are, but by the looks of it you are pulling as much if not more power than I am.

Sustained 800-900W from a 1000W supply is a lot to ask without ideal (as in fans blazing, unobstructed path) cooling.


----------



## Dasboogieman

Quote:


> Originally Posted by *TheParisHeaton*
> 
> Still thinking about which card to decide. I want a maximum OC on water with EKWB fc block without any mods.
> 
> STRIX OC
> Aorus
> Aorus Xtreme
> 
> Thanks.


Aorus and Aorus Xtreme are functionally the same lol.

If u are going for a uni-block mod/AIO, I recommend the ASUS model. The heatsink detaches neatly from the cooling plates covering the VRAM + VRMs. Plus, the VRMs themselves are much more efficient, so they need less cooling to be effective.

If you're going full block, the Aorus (normal/Xtreme doesn't matter) is technically the best due to its extremely high TDP (375W vs 330W on the ASUS)


----------



## Luckbad

Okay, I broke out measurement tools to talk about the 1080 Ti and power draw.

In short, with the Zotac 1080 Ti Amp Extreme, which has a higher power limit than pretty much every card out, the max GPU power being drawn was 372.251 W. That was at 1.081 V (can't seem to push it higher in Zotac Firestorm and don't want to mess with Afterburner right now).

The total power draw of my system with an overclocked i7 6700k was 492 W at real life settings. I went ahead and increased my overclock and set Vcore to 1.4 V to simulate a worse scenario and that only bumped it up to 504 W.

Adding a second Zotac to the equation would mean I'd be pushing 876+ W, so SLI on my 850 W power supply is actually out of the question with such a high power draw graphics card.

It is conceivable that with a CPU that draws a lot more power than my i7 6700K (do AMD chips still draw a lot?), 1000 W _could_ be too little for SLI in certain situations.
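The arithmetic above can be sketched as a quick headroom check. The component wattages below are illustrative assumptions in the spirit of the numbers in this thread, and the 80% sustained-load figure is only a common rule of thumb, not a spec.

```python
# Back-of-the-envelope PSU headroom check for an SLI build. The
# component wattages are illustrative assumptions, and the 80%
# sustained-load figure is only a common rule of thumb.

def psu_check(loads_w, psu_w, sustained_frac=0.8):
    """Return (total draw, comfortable sustained capacity, fits?)."""
    total = sum(loads_w.values())
    comfortable = psu_w * sustained_frac
    return total, comfortable, total <= comfortable

loads = {
    "gpu1": 372,  # Amp Extreme at its max power limit
    "gpu2": 372,  # hypothetical second card for SLI
    "cpu": 130,   # overclocked quad-core, rough figure
    "rest": 60,   # board, drives, fans, pump
}
total, comfortable, ok = psu_check(loads, 1000)
# total = 934 W against 800 W of comfortable sustained capacity
```

Even granting that both cards rarely peak at once, 934 W of worst-case draw against 800 W of comfortable sustained capacity is consistent with the reboots reported earlier in the thread on 1000 W units.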


----------



## TheParisHeaton

Quote:


> Originally Posted by *BrainSplatter*
> 
> Max OC depends on silicon lottery, not brand. Much higher voltage limit also seems to be less important than previous gens. If default cooler doesn't matter, just get the one which has the highest power limit to avoid throttling because of hitting power limit. But another card with lower power limit might still be faster in benches because the chip has lower voltage requirements and uses less power (again silicon lottery).


Thanks, so maybe I'll win with the FE and save some money.
Quote:


> Originally Posted by *SlimJ87D*
> 
> Here, this has all the answers to your questions.
> 
> https://tinyurl.com/2tx


Oh, it's not working







Quote:


> Originally Posted by *Dasboogieman*
> 
> Aorus and Aorus extreme are functionally the same lol.
> 
> If u are going for a uni block mod/AIO, I recommend the ASUS model. The heatsink detaches neatly from the cooling plates which cover the VRAM + VRMs. Plus, the VRMs themselves are much more efficient thus needing less cooling to be effective.
> 
> If you're going full block, the Aorus (norm/extreme doesnt matter) is technically the best due to its extremely high TDP (375W vs 330W on the ASUS)


Thanks.


----------



## cekim

Quote:


> Originally Posted by *Luckbad*
> 
> Okay, I broke out measurement tools to talk about the 1080 Ti and power draw.
> 
> In short, with the Zotac 1080 Ti Amp Extreme, which has a higher power limit than pretty much every card out, the max GPU power being drawn was 372.251 W. That was at 1.081 V (can't seem to push it higher in Zotac Firestorm and don't want to mess with Afterburner right now).
> 
> The total power draw of my system with an overclocked i7 6700k was 492 W at real life settings. I went ahead and increased my overclock and set Vcore to 1.4 V to simulate a worse scenario and that only bumped it up to 504 W.
> 
> Adding a second Zotac to the equation would mean I'd be pushing 876+ W, so SLI on my 850 W power supply is actually out of the question with such a high power draw graphics card.
> 
> It is conceivable that with a CPU that draws a lot more power than my i7 6700k (do AMDs still draw a lot?), 1000 W _could_ be too little for SLI in certain situations.


Ryzen chips are lower TDP than the 69xx chips, though not by a huge margin. Of course, if you OC them, that equation changes and you approach parity again.


----------



## Qba73

Quote:


> Originally Posted by *Luckbad*
> 
> WARNING: I downloaded Afterburner Beta 7. Seemed to work great at first, then I got a BSOD and it corrupted my Windows 10 installation. Had to use ye olde Reset PC option and I'm still installing applications and drivers.
> 
> Use this at your own peril.


thanks for the warning, uninstalling now, bullet dodged...lol

back to beta 6 for me


----------



## Luckbad

Quote:


> Originally Posted by *Foxrun*
> 
> Ive got the 1080Ti's to 2038/6000 @ 1.063, 5820k @ 1.32, 9 Fans, 3 Pumps, OCed Ram at 1.37, and Im at 4k. 1000 watts from the Supernova PS was not enough. Maybe it's the quality of the PSU? I dunno, I was under the impression that EVGA was really good for PSU's.


EVGA has incredible PSUs made by Super Flower.

I believe you are 100% right that the PSU was the problem, though. Looking at a few charts (example), power consumption for your 5820K overclocked is likely in the neighborhood of 330-340 W. Different chips also have different power draw, so if you have a "worse" chip (more power required), it could be even higher.

The CPU and both of your cards alone could actually reach 1000 W, leaving no room for anything else. Your particular setup indeed requires a 1200 W PSU.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Luckbad*
> 
> Okay, I broke out measurement tools to talk about the 1080 Ti and power draw.
> 
> In short, with the Zotac 1080 Ti Amp Extreme, which has a higher power limit than pretty much every card out, the max GPU power being drawn was 372.251 W. That was at 1.081 V (can't seem to push it higher in Zotac Firestorm and don't want to mess with Afterburner right now).
> 
> The total power draw of my system with an overclocked i7 6700k was 492 W at real life settings. I went ahead and increased my overclock and set Vcore to 1.4 V to simulate a worse scenario and that only bumped it up to 504 W.
> 
> Adding a second Zotac to the equation would mean I'd be pushing 876+ W, so SLI on my 850 W power supply is actually out of the question with such a high power draw graphics card.
> 
> It is conceivable that with a CPU that draws a lot more power than my i7 6700k (do AMDs still draw a lot?), 1000 W _could_ be too little for SLI in certain situations.


Well, SLI is pretty inefficient. Studies have shown that on average you get something like 40% to 50% scaling.

That would put both cards at around 75-80% usage. You'll rarely, if ever, run both cards at 100% usage and 100% power draw.

Maybe in some benchmarks, but gaming? You'll probably see around 80% power draw on both cards.


----------



## TheBoom

So I just received my Zotac Amp Extreme. A little disappointing, I would say. Max stable core clock seems to be around 2037, which would be fine if the memory weren't so bad: 11.7GHz artifacts all over, 11.6GHz seems stable for now.

And some quality QC from Zotac



Now I normally wouldn't bother, but for a card this expensive and brand new, a scratch and an indentation on the GTX logo are a little annoying.


----------



## Clukos

Quote:


> Originally Posted by *Benny89*
> 
> ****** why each time I hit appy on my Voltage Curve or change ONE bin and hit Apply - THE WHOLE STUPID CURVE IS MOVING AROUND TO DIFFERENT BINS/MHZ etc. It pisses me off


The easiest way to go about changing the voltage curve is to hit reset, then pick the clock/voltage point you want and raise it above all the others. Fast and reliable way to do it.


----------



## KedarWolf

https://www.techpowerup.com/vgabios/191499/palit-gtx1080ti-11264-170331

New BIOS to test on our FE's. 350W power limit!!!









Be home from work in three hours or so.


----------



## Roadrunners

I adjusted the voltage curve to lower the input voltage and I am still getting voltage limit warnings at 1.00V @ 1886MHz. Surely this isn't correct? Bad card?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> https://www.techpowerup.com/vgabios/191499/palit-gtx1080ti-11264-170331
> 
> New BIOS to test on our FE's. 350W power limit!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Be home from work in three hours or so.


Almost looks promising, but 2X 8-pins.


----------



## Clukos

Quote:


> Originally Posted by *Roadrunners*
> 
> Adjusted the voltage curve to lower input voltage and I am still getting Voltage Limit warnings at 1.00v @ 1886Mhz, surely this isn't correct? Bad card?


Voltage limit means you are not hitting the power limit. From what I understand they are opposites: while you are power limited you are not voltage limited, and the card is dropping voltage/clocks; while you are not power limited, the card reports being voltage limited because it could otherwise be drawing more power.


----------



## fisher6

Quote:


> Originally Posted by *KedarWolf*
> 
> https://www.techpowerup.com/vgabios/191499/palit-gtx1080ti-11264-170331
> 
> New BIOS to test on our FE's. 350W power limit!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Be home from work in three hours or so.


Just got home. Gonna flash now. I spend more time benching than playing games these days


----------



## eXteR

Quote:


> Originally Posted by *KedarWolf*
> 
> https://www.techpowerup.com/vgabios/191499/palit-gtx1080ti-11264-170331
> 
> New BIOS to test on our FE's. 350W power limit!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Be home from work in three hours or so.


That Palit has a 12+2 phase design. I think that BIOS won't work, like the Zotac AMP Extreme one.


----------



## joder

Quote:


> Originally Posted by *fisher6*
> 
> Just got home. Gonna flash now. I spend more time benching than playing games these days


Heavy breathing intensifies...


----------



## KedarWolf

Quote:


> Originally Posted by *eXteR*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> https://www.techpowerup.com/vgabios/191499/palit-gtx1080ti-11264-170331
> 
> New BIOS to test on our FE's. 350W power limit!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Be home from work in three hours or so.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That palit has 12+2 phase design. I think that bios wont work, like the Zotac Amp Extreme.


----------



## dunbheagan

Quote:


> Originally Posted by *TheBoom*
> 
> So I just received my Zotac AMP Extreme. A little disappointing, I would say. Max stable core clock seems to be around 2037, which would be fine if the memory wasn't so bad: 11.7GHz artifacts all over, 11.6GHz seems stable for now.
> 
> And some quality qc from Zotac
> 
> 
> 
> Now I normally wouldn't bother but for a card this expensive and brand new a scratch and indentation on the GTX logo is a little annoying.


I totally understand you. People are going to tell you it doesn't matter performance-wise, and they are right. But numbers do have a certain attraction by themselves. Before I got my card, I crossed my fingers that it could run the memory at 6000; everything below that would have been a disappointment for me too. Graphics cards for 800 bucks are luxury, and luxury is mainly emotional, not rational. I am not even a real gamer; I bench and test a lot more than I actually game.

By the way, is it possible that the first batch of 1080 Tis, like the FEs, had slightly better-overclocking memory than the later cards? I remember everybody saying their memory could hit 6000+ in the first weeks; now it seems common that cards can't do the magic 6GHz. I think it was similar with the 1070, but that was because of the Samsung/Hynix/Micron BIOS topic. Do I have the wrong impression, is it coincidence, or is there a pattern?


----------



## KedarWolf

Quote:


> Originally Posted by *Roadrunners*
> 
> Adjusted the voltage curve to lower input voltage and I am still getting Voltage Limit warnings at 1.00v @ 1886Mhz, surely this isn't correct? Bad card?


Some clocks will power limit no matter what (e.g. 2025 will drop to 2012), but 2012 or 2032 won't. Not sure why.


----------



## mtbiker033

Quote:


> Originally Posted by *buddatech*
> 
> Just ordered a cooler from the hybrid kit for my 1080 Ti FE, should hopefully be here mid to late next week. I've done the full hybrid kit install on my original 1080 FE. My question; can/should I use CLU Pro on GPU Die or should I stick with other paste I have on hand Antec Formula 7??


where in the world did you order it from?


----------



## Benny89

I was running W3 at 2038/6009 and got these readings in GPU-Z. I can't read what they mean, so can someone tell me what info I can get from them?

*Pwr and VRel* - what do they mean, and what needs to be increased/decreased?


----------



## fisher6

Quote:


> Originally Posted by *KedarWolf*


No point in flashing I guess







?


----------



## KedarWolf

Quote:


> Originally Posted by *Benny89*
> 
> I was running W3 at 2038/6009 and got these readings in GPU-Z. I can't read what they mean, so can someone tell me what info I can get from them?
> 
> *Pwr and VRel* - what do they mean, and what needs to be increased/decreased?


It usually does that if you locked the voltage/clock point; as long as your clocks and voltages are stable and not jumping up and down, it's fine.


----------



## joder

Quote:


> Originally Posted by *fisher6*
> 
> No point in flashing I guess
> 
> 
> 
> 
> 
> 
> 
> ?


You haven't flashed yet? lol


----------



## eXteR

Quote:


> Originally Posted by *Benny89*
> 
> I was running W3 at 2038/6009 and got these readings in GPU-Z. I can't read what they mean, so can someone tell me what info I can get from them?
> 
> *Pwr and VRel* - what do they mean, and what needs to be increased/decreased?


Pwr = throttling because the card is power limited.

VRel (voltage reliability) = the card could run a higher frequency but is limited by the current vcore.

I think VRel is not a problem. I mainly get this and the card works stable [email protected]

The problem is Pwr; that's when the card starts bouncing clocks up and down all the time. Superposition 4K is the best way to test this.
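For reference, GPU-Z's PerfCap Reason flags can be summarized like this (a hedged sketch: the flag names match GPU-Z's sensor tab, but the one-line descriptions are my paraphrase of the discussion above, not official wording):

```python
# GPU-Z "PerfCap Reason" flags and roughly what each one means.
PERFCAP_REASONS = {
    "Pwr":  "power limit: board hit its TDP cap, clocks bounce down",
    "VRel": "reliability voltage: boost wants more clock, vcore is capped",
    "VOp":  "operating voltage: at the maximum voltage the BIOS allows",
    "Thrm": "thermal limit: GPU at its temperature target, dropping bins",
    "Util": "utilization: GPU not fully loaded, no need to boost higher",
}

def explain(flags):
    """Translate a list of PerfCap flags into readable reasons."""
    return [PERFCAP_REASONS.get(f, f + ": unknown flag") for f in flags]

# Typical reading while benching a power-limited FE card:
print(explain(["Pwr", "VRel"]))
```

Seeing VRel alone is normal and harmless; Pwr is the one that costs you clocks.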


----------



## dunbheagan

Quote:


> Originally Posted by *Clukos*
> 
> Voltage limit means you are not hitting the power limit. From what I understand they are opposites: as long as you are power limited you are not voltage limited and the card is dropping voltage/clocks, and as long as you are not power limited the card reports voltage limited because it could otherwise be drawing more power.


That is not right. Voltage limit and power limit are not dependent on each other. In the best case you are neither power limited nor voltage limited.

Voltage limit has to do with temps:
as your card gets hotter, it increases voltage to compensate for the higher electrical resistance of the die at higher temps (I think that is the reason). If your card reaches the highest possible voltage, like 1.063 or 1.093, it won't let you have your highest possible clock speed and throttles a little. Then you are voltage limited. It begins quite early, at about 50°C, but on water you are normally not voltage limited. Here is a log of a Superposition 4K run on my card:


----------



## Clukos

Quote:


> Originally Posted by *dunbheagan*
> 
> That is not right. Voltage limit and power limit are not dependent on each other. In the best case you are neither power limited nor voltage limited.
> 
> Voltage limit has to do with temps:
> as your card gets hotter, it increases voltage to compensate for the higher electrical resistance of the die at higher temps (I think that is the reason). If your card reaches the highest possible voltage, like 1.063 or 1.093, it won't let you have your highest possible clock speed and throttles a little. Then you are voltage limited. It begins quite early, at about 50°C, but on water you are normally not voltage limited. Here is a log of a Superposition 4K run on my card:


Oh! I thought that behavior was just the thermal limit. Good to know









So if you are above 50C you should always be voltage limited, right?


----------



## dunbheagan

Quote:


> Originally Posted by *Clukos*
> 
> Oh! I thought that behavior was just the thermal limit. Good to know
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So if you are above 50C you should always be voltage limited, right?


No, but your card won't allow its max possible clock above 50°C. I think the voltage limit comes a little later, when the card is at the highest possible voltage.

Sorry, my explanation is not too good. Maybe somebody else knows it more exactly?


----------



## dunbheagan

Quote:


> Originally Posted by *SlimJ87D*
> 
> Firestrike doesn't like my PC for some reason; I always get an invalid score. It actually takes the damn stupid program 10 minutes before it decides to run, but here's a run at 2088.
> 
> 
> 
> I might do a 2152 run later if I have time. I can't remember, what are you superposition scores again?


On my current 24/7 settings I get 10111.
1080Ti [email protected]/[email protected],03V
[email protected],[email protected],225V


----------



## amer020

For now I use the ASUS Strix 1080 Ti BIOS on my MSI Founders Edition. I installed GPU Tweak II version 1.4.5.4, and with OC mode I play BF1 at 2025 or 2000.

I think it's good for me.


----------



## TheBoom

Quote:


> Originally Posted by *dunbheagan*
> 
> I totally understand you. People gonna tell you it doesnt matter performace-wise, and they are right. But numbers do have a certain attraction by themselfes. Before i got my card, i crossed my fingers it could run the memory at 6000, everything below that would have been a dissapointment for me too. Graphics cards for 800 bucks are luxury, and luxury is mainly emotional, not rational. I am not even a real gamer, i do bench and test a lot more than i actually game.
> 
> By the way, is it possible, that the first batch of 1080Ti, like the FEs, did have a slightly better-to-overclock memory than the later cards? I remember everybody telling their mem can hit 6000+ in the first weeks, now it seems to be common, that it can not do the magic 6GHz. I think with 1070 it was similar, but that was because of the samsung/hynix/micron BIOS topic. Do i have the wrong impression, is it coincedence or does it have a system?


Never mind the clocks; the card is technically damaged even though it's brand new in box. I don't even know if that qualifies for an exchange. A scratch on the GTX logo?

Anyway, I think all the memory chips on the 1080 Tis are Micron, so I don't think it has the issue you are talking about. I had a 1070 Strix with Samsung memory before this, and if I recall correctly it went up to +400MHz on memory max. There were people getting higher memory clocks with Micron chips after the BIOS update, though.

Out of curiosity, what is the average memory overclock on AIB cards here?


----------



## lilchronic

Quote:


> Originally Posted by *Roadrunners*
> 
> Adjusted the voltage curve to lower input voltage and I am still getting Voltage Limit warnings at 1.00v @ 1886Mhz, surely this isn't correct? Bad card?


You are hitting the voltage limit because you set your card to use no more than 1v, so it's limited. Nothing to worry about.


----------



## Slackaveli

Quote:


> Originally Posted by *gmpotu*
> 
> Yeah I'm running an old 3770k @ 4.2ghz.
> Suramar is usually where my frames drop down into the 20s when I'm flying. I have all sliders maxed, view distance, SSAA x 16 or whatever the max is. If there's a slider then it's maxed out.
> 
> I'll try to optimize tomorrow. I just figured you guys were maxing out all settings as far as they would go and still getting over 100fps.
> 
> I'm still not understanding something about this whole processes even after watching the videos etc. Not sure what I'm missing. More testing tomorrow though.


Well, that's your problem, man. You have the settings that Baasha would use, i.e. the settings of a 4-way Titan Xp SLI madman!


----------



## Slackaveli

Quote:


> Originally Posted by *imLagging*
> 
> I haven't been following this thread; has the average OC been low?
> I was under the impression most could hit 2050-2100 with max voltage.
> Can you still return it for a replacement? I can't imagine; I would also be very frustrated.
> I didn't hold out long enough to buy a custom card. Got a clocking FE, but I had to shunt mod it to reach its potential. Man, did the coil whine reveal itself after all was said and done.
> 
> 
> 
> 
> 
> 
> 
> 
> Been trying some benchmarks; the max I've clocked is 2152, pulling 590w from the wall. Tried 2160+ but it crashes after 10 seconds in Superposition.
> For memory I can clock +650 on a cold run; oddly enough I ran +660 on the first few benchmarks and never again.


Yeah, not even. The average is about 2000, even on overclockers.net, so we have a better average than regular folks do. The overall average is like 1900. If you can hold 2000, that's a good card. My first one topped out at 1974 :/

My Aorus holds 2088/2076 @ 1.05v. Thing is a beast.

That said, if you are only concerned about clocks, I'd buy the Zotac. Best chance of a 2100, best chance of a 2076, and basically guaranteed a 2025.


----------



## WarbossChoppa

Quote:


> Originally Posted by *imLagging*
> 
> I haven't been following this thread; has the average OC been low?
> I was under the impression most could hit 2050-2100 with max voltage.
> Can you still return it for a replacement? I can't imagine; I would also be very frustrated.
> I didn't hold out long enough to buy a custom card. Got a clocking FE, but I had to shunt mod it to reach its potential. Man, did the coil whine reveal itself after all was said and done.
> 
> 
> 
> 
> 
> 
> 
> 
> Been trying some benchmarks; the max I've clocked is 2152, pulling 590w from the wall. Tried 2160+ but it crashes after 10 seconds in Superposition.
> For memory I can clock +650 on a cold run; oddly enough I ran +660 on the first few benchmarks and never again.


I got it through newegg and I doubt they will.


----------



## GRABibus

Quote:


> Originally Posted by *Slackaveli*
> 
> That said, if you are only clock concerned , I'd buy the Zotac. Best chance of a 2100. Best chance of a 2076. And basically guaranteed a 2025.


Will you refund us if we buy a Zotac and this turns out not to be true for some of us?


----------



## TheBoom

Quote:


> Originally Posted by *Slackaveli*
> 
> Yeah, not even. The average is about 2000, even on overclockers.net, so we have a better average than regular folks do. The overall average is like 1900. If you can hold 2000, that's a good card. My first one topped out at 1974 :/
> 
> My Aorus holds 2088/2076 @ 1.05v. Thing is a beast.
> 
> That said, if you are only clock concerned , I'd buy the Zotac. Best chance of a 2100. Best chance of a 2076. And basically guaranteed a 2025.


Quote:


> Originally Posted by *GRABibus*
> 
> Do you refund us if we buy a Zotac and if this is not true for some of us ?


After some testing with 3DMark and Superposition, my Zotac holds about 2037-2050MHz max stable. So no, it's really a silicon lottery after all.

And as I mentioned a few posts back, anything past 11.6GHz on the memory starts artifacting.


----------



## nrpeyton

yeeehaaaa!!!!!

It's confirmed!

There *is* going to be EVGA 1080 TI K|NGP|N Edition.


----------



## Slackaveli

Quote:


> Originally Posted by *GRABibus*
> 
> Do you refund us if we buy a Zotac and if this is not true for some of us ?


If you are worried, buy the K|NGP|N. But if you buy from Amazon, they'll refund you. Just don't buy from the Egg without checking the return policy; they have had enough of GPU returns because of poor overclocking headroom.


----------



## WarbossChoppa

Quote:


> Originally Posted by *TheBoom*
> 
> After some testing with 3DMark and Superposition, my Zotac holds about 2037-2050MHz max stable. So no, it's really a silicon lottery after all.
> 
> And as I mentioned a few posts back, anything past 11.6GHz on the memory starts artifacting.


I can get to 6075 on my memory, although I wish I could trade higher core for memory speeds.


----------



## Slackaveli

Quote:


> Originally Posted by *TheBoom*
> 
> After some testing with 3DMark and Superposition, my Zotac holds about 2037-2050MHz max stable. So no, it's really a silicon lottery after all.
> 
> And as I mentioned a few posts back, anything past 11.6GHz on the memory starts artifacting.


That's actually pretty good, though. It's just a hair below mine in mem and core; all in all a good one. 2050 and 11.6 match well, too. You aren't missing out on much with the extra bandwidth you (I) can't quite get.
Quote:


> Originally Posted by *WarbossChoppa*
> 
> I can get to 6075 on my memory, although I wish I could trade higher core for memory speeds.


Ha, if we could trade I might give you a few. My max mem OC is +440, so 11.8 or so, but I can hold 2076-2088. That's about right, I guess, but artifacting before 12Gbps does peeve me a bit.


----------



## nrpeyton

I saw it as a possible "teaser" at Guru3D.

So I went digging a bit deeper and found a little "shout-out" in the chat box at the top of his website (not a "post" as such, so it will disappear as more people use the chat box).

But it *did* come from the man himself, on his account/name.

So it's definitely a confirmation  Not speculation.


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> Seen it as a possible "teaser" at guru3d.
> 
> So I went digging a bit deeper and found a little "shout-out" on the chat-box at the top of his website. (not a "post" as such) so it will disappear as more people use the chat-box.
> 
> But it *did* come from the man himself, on his account/name.
> 
> So its definitely a confirmation  Not speculation.


yeah, Vince confirmed it on twitter last friday i think.


----------



## nrpeyton

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, Vince confirmed it on twitter last friday i think.


My 1080 Ti Founders Edition arrives on Thursday. Not sure what to do now, lol.

In all honesty, I really wasn't sure it was going to happen.

I thought EVGA might have just stopped after the FTW3 this round, what with the new ICX and 9 thermal sensors etc., and all of NVIDIA's new lockdowns and BIOS lockouts (not to mention voltage).

This could also be a sign that some of that could still change on Pascal (with the Ti),
obviously, as the K|NGP|N edition is so overclocking-focused.


----------



## Benny89

For future 1080 Ti owners: don't worry that much about clocks past 2000MHz.

*Unless you are benching and aiming for the highest scores, you should NOT be concerned about a max core OC past 2000MHz.* We are talking about a 3.5-3.8% difference in core clock between 2000MHz and 2050-2078MHz.

In practice it gains you almost nothing in games.

I ran some tests in Witcher 3 today, testing three different clocks in the same area:

2038, 2050 and 2063 on the core.

The total difference between 2038 and 2063 was 1 fps at 1440p. We are talking about a scenario where I get anything from 88 to 105 fps in game; 1 fps is nothing.

Between 2038 and 2050 there was no difference in performance at all.

That said, you might see a 2 fps difference in games between 2000MHz and 2075MHz. And even that isn't guaranteed, as you can get better performance at lower clocks depending on how good a chip you got.

So if your card does 2000MHz, you're a winner.
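A quick sanity check of those percentages (a toy calculation: in the best case fps scales linearly with core clock, and real games gain less than this upper bound):

```python
def pct_gain(base_mhz, oc_mhz):
    """Relative core-clock gain, in percent."""
    return (oc_mhz - base_mhz) / base_mhz * 100.0

def best_case_fps(fps_at_base, base_mhz, oc_mhz):
    """Upper bound on fps if performance scaled perfectly with clock."""
    return fps_at_base * oc_mhz / base_mhz

print(round(pct_gain(2000, 2075), 2))            # 3.75 (% clock gain)
print(round(best_case_fps(100, 2000, 2075), 1))  # 103.8 fps at most
```

So even with perfect scaling, a 100 fps game picks up under 4 fps from 2000 to 2075, which matches the 1-2 fps measured above.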


----------



## Foxrun

Quote:


> Originally Posted by *Benny89*
> 
> For future 1080 Ti owners: don't worry that much about clocks past 2000MHz.
> 
> *Unless you are benching and aiming for the highest scores, you should NOT be concerned about a max core OC past 2000MHz.* We are talking about a 3.5-3.8% difference in core clock between 2000MHz and 2050-2078MHz.
> 
> In practice it gains you almost nothing in games.
> 
> I ran some tests in Witcher 3 today, testing three different clocks in the same area:
> 
> 2038, 2050 and 2063 on the core.
> 
> The total difference between 2038 and 2063 was 1 fps at 1440p. We are talking about a scenario where I get anything from 88 to 105 fps in game; 1 fps is nothing.
> 
> Between 2038 and 2050 there was no difference in performance at all.
> 
> That said, you might see a 2 fps difference in games between 2000MHz and 2075MHz. And even that isn't guaranteed, as you can get better performance at lower clocks depending on how good a chip you got.
> 
> So if your card does 2000MHz, you're a winner.


So true. 2038 vs 2012 at 4K gives me maybe 3 fps more in Wildlands. But the need to get a higher overclock is like an itch I can't scratch.


----------



## TheBoom

Quote:


> Originally Posted by *Benny89*
> 
> For future 1080 Ti owners: don't worry that much about clocks past 2000MHz.
> 
> *Unless you are benching and aiming for the highest scores, you should NOT be concerned about a max core OC past 2000MHz.* We are talking about a 3.5-3.8% difference in core clock between 2000MHz and 2050-2078MHz.
> 
> In practice it gains you almost nothing in games.
> 
> I ran some tests in Witcher 3 today, testing three different clocks in the same area:
> 
> 2038, 2050 and 2063 on the core.
> 
> The total difference between 2038 and 2063 was 1 fps at 1440p. We are talking about a scenario where I get anything from 88 to 105 fps in game; 1 fps is nothing.
> 
> Between 2038 and 2050 there was no difference in performance at all.
> 
> That said, you might see a 2 fps difference in games between 2000MHz and 2075MHz. And even that isn't guaranteed, as you can get better performance at lower clocks depending on how good a chip you got.
> 
> So if your card does 2000MHz, you're a winner.


Not so concerned about core clocks, more about memory. I see the average OC seems to be around 6000, or 12GHz effective. I do wonder what impact memory clocks have in games, though.
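As a rough answer to the memory question, the theoretical bandwidth is easy to work out (assuming the 1080 Ti's 352-bit bus and the usual Afterburner convention where the reported clock is half the effective GDDR5X data rate, so 5500MHz shown = 11Gbps per pin):

```python
BUS_BITS = 352  # 1080 Ti memory bus width

def bandwidth_gbs(reported_mhz):
    """Theoretical memory bandwidth in GB/s from the reported clock."""
    effective_gbps = reported_mhz * 2 / 1000  # per-pin data rate in Gbps
    return BUS_BITS / 8 * effective_gbps      # bytes moved per second

print(round(bandwidth_gbs(5500)))  # stock 11 Gbps: 484 GB/s
print(round(bandwidth_gbs(6000)))  # 12 Gbps OC:    528 GB/s
```

So the ~+500 overclock people chase is worth about 9% more raw bandwidth; how much of that shows up in games depends on how bandwidth-bound the title is.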


----------



## TheBoom

Quote:


> Originally Posted by *Slackaveli*
> 
> That's actually pretty good, though. it just a hair below me in mem and core. All-in-all a good one. 2050 and 11.6 matches well, too. You aren't missing out on much with the extra bandwidth you (I) can't quite get .


Not saying it's bad at all. I mean, the memory could do a bit better given the average OC around here, but it's still a good card.

My point was that even a Zotac AMP Extreme is not a guaranteed golden ticket to high clocks.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> For future 1080 Ti owners: don't worry that much about clocks past 2000MHz.
> 
> *Unless you are benching and aiming for the highest scores, you should NOT be concerned about a max core OC past 2000MHz.* We are talking about a 3.5-3.8% difference in core clock between 2000MHz and 2050-2078MHz.
> 
> In practice it gains you almost nothing in games.
> 
> I ran some tests in Witcher 3 today, testing three different clocks in the same area:
> 
> 2038, 2050 and 2063 on the core.
> 
> The total difference between 2038 and 2063 was 1 fps at 1440p. We are talking about a scenario where I get anything from 88 to 105 fps in game; 1 fps is nothing.
> 
> Between 2038 and 2050 there was no difference in performance at all.
> 
> That said, you might see a 2 fps difference in games between 2000MHz and 2075MHz. And even that isn't guaranteed, as you can get better performance at lower clocks depending on how good a chip you got.
> 
> So if your card does 2000MHz, you're a winner.


People aren't going to listen; we're on an overclocking forum, for god's sake


----------



## KCDC

Hey there, how were you able to get Superposition to run properly in SLI? Mine wasn't using my second card at all.


----------



## KCDC

Quote:


> Originally Posted by *Benny89*
> 
> For future 1080 Ti owners: don't worry that much about clocks past 2000MHz.
> 
> *Unless you are benching and aiming for the highest scores, you should NOT be concerned about a max core OC past 2000MHz.* We are talking about a 3.5-3.8% difference in core clock between 2000MHz and 2050-2078MHz.
> 
> In practice it gains you almost nothing in games.
> 
> I ran some tests in Witcher 3 today, testing three different clocks in the same area:
> 
> 2038, 2050 and 2063 on the core.
> 
> The total difference between 2038 and 2063 was 1 fps at 1440p. We are talking about a scenario where I get anything from 88 to 105 fps in game; 1 fps is nothing.
> 
> Between 2038 and 2050 there was no difference in performance at all.
> 
> That said, you might see a 2 fps difference in games between 2000MHz and 2075MHz. And even that isn't guaranteed, as you can get better performance at lower clocks depending on how good a chip you got.
> 
> So if your card does 2000MHz, you're a winner.


I dunno man, having my SLI setup running above 2000 vs. under 2000 @ 7680x1440 is much smoother; framerates aren't jumping all over the place in ME:A. However, I think I'm hitting the limit of my AX1200i. I had my first completely random shutdown the other day, and that's normally a PSU issue.


----------



## demon09

You guys who are saying 2000+: do you mean it runs at 2000, or starts at it? Mine starts much higher than what it runs at once it's warmed up. Curse the FE cooler, but it's what EVGA had for Step-Up. Water cooling kits tempt me a lot, but they would be a hard fit with my Phanteks PH-TC14PE. I kinda wish EVGA sold their air cooler shrouds so I could just swap the FE cooler for an open-air cooler.


----------



## Slackaveli

Quote:


> Originally Posted by *TheBoom*
> 
> Not saying its bad at all. I mean the memory could do a bit better given the average oc around here but it's still a good card.
> 
> What I meant is to prove that even an Zotac Amp Extreme is not a guaranteed golden ticket to high clocks.


No doubt about it. And the inverse is true, too. The Gauntlet process on the Gigabyte Aorus is their version of "binning", yet I have a regular edition that runs great at 2088/2076. It doesn't like max volts, however, so I suspect it crashed in a high-voltage test or something and got sent down to the regular bins. But I've seen plenty of Aorus Extremes top out around 2012/2025 even holding more voltage, so it's very much luck.

But with Asus/Zotac/AorusXT/FTW3/GalaxHOF you're pretty well assured of a 2GHz+ card. I've only seen a couple of MSI guys say otherwise. Well, and a few AorusX guys, but they RMA'd because the cards weren't even keeping stock core clocks; those were more proof of Aorus' poor binning process. My card is great: it games at high clocks and stays at 2076+ because it runs under 60°C. Great card. But it was my lottery luck finally turning.

But as Benny was just saying, we are talking about negligible gains in a 100MHz range. At 4K it's like 1-2 fps; at 1440p, maybe 4 fps, IF that. You'll only notice if you log everything. It's all in our heads. If you must fixate on a number, make it 2000MHz so you'll have a chance to be happy at least, lolz.


----------



## Slackaveli

Quote:


> Originally Posted by *demon09*
> 
> You guys who are saying 2000+: do you mean it runs at 2000, or starts at it? Mine starts much higher than what it runs at once it's warmed up. Curse the FE cooler, but it's what EVGA had for Step-Up. Water cooling kits tempt me a lot, but they would be a hard fit with my Phanteks PH-TC14PE. I kinda wish EVGA sold their air cooler shrouds so I could just swap the FE cooler for an open-air cooler.


We mean holding. You should have two numbers, imo: what you max and what you hold. That's why I say 2088/2076. It's cooling that makes you drop bins: you drop one at 49, 59, and 60-something °C, iirc, on an FE. I can usually keep my Aorus in the 50s, and that keeps my number up there. A hybrid cooler would get you there, but the cheaper, easier option is a Thermal Grizzly Kryonaut re-paste and a 100% fan curve. That will give you a running clock only 2-3 bins beneath your max clock; i.e. if you start at 2037, you'll hold 2012 or 2000.
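The bin-dropping behavior described above can be sketched as a toy model (the 49/59/63°C thresholds and the ~13MHz bin size are this thread's observations, not an official spec, and the real GPU Boost 3.0 ladder is not perfectly uniform):

```python
BIN_MHZ = 13                  # approximate size of one Pascal boost bin
TEMP_STEPS_C = [49, 59, 63]   # hypothetical thresholds: lose one bin at each

def running_clock(max_clock_mhz, temp_c):
    """Toy GPU Boost model: drop one bin per temperature threshold crossed."""
    bins_lost = sum(1 for t in TEMP_STEPS_C if temp_c >= t)
    return max_clock_mhz - bins_lost * BIN_MHZ

print(running_clock(2037, 45))  # 2037: cool card holds its max clock
print(running_clock(2037, 55))  # 2024: one bin down
print(running_clock(2037, 62))  # 2011: two bins down
```

This is why better cooling translates directly into a higher held clock even with no slider changes.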


----------



## gmpotu

Quote:


> Originally Posted by *Benny89*
> 
> By Locking my card at lower bins like 1.000 and 1.012V and benching with it I have in GPU Z:
> 
> 
> 
> Is it normal when benching/locking at such low voltage bins?


Welcome to the Rainbow club. This is what I've been seeing non-stop.


----------



## Slackaveli

Quote:


> Originally Posted by *KCDC*
> 
> Hey there, how were you able to get superposition to run properly on SLI? Mine was't using my second card at all.


Yep, I got you, sir!

Here is how to get SLI working...

#1 Download Superposition SLI Profile+Tools pack here: http://www.mediafire.com/file/n1y1kigoxhcdlre/Superposition_SLI_Profile_Tools.zip
#2 Extract all to desktop.
#3 Open Geforce 3D Profile Manager, Extract SLI Profiles to your desktop using the Extract SLI Profiles button. A text document called NVIDIA Profiles should show up on your desktop if you extracted the profiles to the correct location.
#4 Open NVIDIA Profiles text document.
#5 Open Superposition SLI Profile text document.
#6 Copy and paste the Superposition SLI Profile into the NVIDIA Profiles text document then save the NVIDIA Profiles text document. Refer to example pic to see how it should look after you have added the Superposition SLI Profile to the NVIDIA Profiles text document.
#7 Open Geforce 3D Profile Manager, Import modified NVIDIA Profiles text document into the nvidia driver using the Import SLI Profiles button.
#8 Enable SLI inside the nvidia control panel if you don't already have it enabled.
#9 Fire up the Superposition Benchmark and enjoy SLI.

This copy-and-paste did not work for me, but when I imported just the Superposition profile by itself, it worked.


----------



## gmpotu

Quote:


> Originally Posted by *Clukos*
> 
> Voltage limit means that you are not hitting power limit. From what I can understand they are the opposite of each other, as long as you are power limited you are not voltage limited and the card is dropping voltage/clocks. As long as you are not power limited the card says it is voltage limited because it could be drawing more power.


So the voltage limit is expected and OK to see in GPU-Z when locking to lower voltage bins, say 2000 @ 1.000v? If you see the voltage limit, is that a good thing, or should GPU-Z show no limit? (With the curve flatlined beyond 1.000v, of course.)


----------



## KedarWolf

Palit BIOS confirmed working on a Gigabyte 1080 Ti FE.

https://www.techpowerup.com/vgabios/191499/palit-gtx1080ti-11264-170331

350W power limit, 100 points more in Time Spy than any other BIOS I've tried. Getting 2052 core at 1.025v and not crashing, had to do 1.050v before.









And I'm getting 2100 core sustained in Heaven at 1.093v with no power limiting. The best I could do since the Windows 10 Creators Update was 2088 at 1.093v.









And not power limiting at 1.025v.









My new go-to BIOS!!


----------



## Roadrunners

Quote:


> Originally Posted by *KedarWolf*
> 
> Some clocks will power limit no matter what (e.g. 2025 will drop to 2012), but 2012 or 2032 won't. Not sure why.


Quote:


> Originally Posted by *dunbheagan*
> 
> That is not right. Voltage limit and power limit are not dependend. In best case you are neither power limited nor voltage limited.
> 
> Voltage limit has to do with temps:
> When your card is becoming hotter your card increases voltage to compensate the higher electric resistance of the die at higher temps(i think that is the reason). If your card reaches the highest possible voltage, like 1,063 or 1,093 it wont let you have your highest possible clockspeed but throttles a little. Then you are voltage limited. It begins quite early, at about 50°C. But on water you are normally not voltage limited. Here is log of a Superposition 4K run on my card:


This card is a replacement on an Advanced RMA from EVGA, as my current card has coil whine, and the voltage limit is coming up at stock clocks.

My current card is under water and I haven't seen the voltage limit come up at all, even when pushed up to 2088MHz. Are you saying that seeing the voltage limit is normal due to the higher temps on the air-cooled FE card?

Quote:


> Originally Posted by *lilchronic*
> 
> you are hitting the voltage limit because you set your card to use no more than 1v so it's limited, nothing to worry about.


I only dropped the voltage because one member suggested it might solve the issue, but I tried adding voltage all the way up to the max 1.092v and the voltage limit still comes up. No matter how I run this card it flags the voltage limit. Is this a duff card?


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> Palit BIOS confirmed working on a Gigabyte 1080 Ti FE.
> 
> https://www.techpowerup.com/vgabios/191499/palit-gtx1080ti-11264-170331
> 
> 350W power limit, 100 points more in Time Spy than any other BIOS I've tried. Getting 2052 core at 1.025v and not crashing, had to do 1.050v before.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And getting 2100 core sustained in Heaven at 1.093v, no power limiting. Best I could do since Windows 10 Creator Updater was 2088 1.093v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And not power limiting at 1.025v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My new go to BIOS!!


well, there it is (finally), FE owners. If I were still on FE I'd be flashing this right now.

Hey Slim...


----------



## KingEngineRevUp


People aren't going to listen; we're on an overclocking forum, for God's sake







Quote:


> Originally Posted by *KedarWolf*
> 
> Palit BIOS confirmed working on a Gigabyte 1080 Ti FE.
> 
> https://www.techpowerup.com/vgabios/191499/palit-gtx1080ti-11264-170331
> 
> 350W power limit, 100 points more in Time Spy than any other BIOS I've tried. Getting 2052 core at 1.025v and not crashing, had to do 1.050v before.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And getting 2100 core sustained in Heaven at 1.093v, no power limiting. Best I could do since Windows 10 Creator Updater was 2088 1.093v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And not power limiting at 1.025v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My new go to BIOS!!


SWEET! Others should test it ASAP! Looking forward to hearing from them









I hope this is true, because it literally made my stomach drop. Did you use your PSU to take some measurements?


----------



## KCDC

Quote:


> Originally Posted by *Slackaveli*
> 
> yep, i got you, sir!!
> 
> Here is how to get SLI working...
> 
> #1 Download Superposition SLI Profile+Tools pack here: http://www.mediafire.com/file/n1y1kigoxhcdlre/Superposition_SLI_Profile_Tools.zip
> #2 Extract all to desktop.
> #3 Open Geforce 3D Profile Manager, Extract SLI Profiles to your desktop using the Extract SLI Profiles button. A text document called NVIDIA Profiles should show up on your desktop if you extracted the profiles to the correct location.
> #4 Open NVIDIA Profiles text document.
> #5 Open Superposition SLI Profile text document.
> #6 Copy and paste the Superposition SLI Profile into the NVIDIA Profiles text document then save the NVIDIA Profiles text document. Refer to example pic to see how it should look after you have added the Superposition SLI Profile to the NVIDIA Profiles text document.
> #7 Open Geforce 3D Profile Manager, Import modified NVIDIA Profiles text document into the nvidia driver using the Import SLI Profiles button.
> #8 Enable SLI inside the nvidia control panel if you don't already have it enabled.
> #9 Fire up the Superposition Benchmark and enjoy SLI.
> 
> this copy and paste did not work for me, but when I just tried to input the superposition profile by itself it worked.


Awesome, thanks, I couldn't find anything specific to this via googling.
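For what it's worth, step #6 of the quoted instructions (pasting the Superposition SLI profile into the exported NVIDIA Profiles document) could also be scripted instead of done by hand. A minimal sketch; the file names and desktop paths are assumptions based on the instructions, so adjust them to wherever you extracted the pack:

```python
# Append the extracted SLI profile text to the exported NVIDIA Profiles
# document, then re-import the result with Geforce 3D Profile Manager.
# File names/paths are assumptions, not confirmed by the tool pack.

from pathlib import Path

def append_profile(profiles_txt: Path, sli_profile_txt: Path) -> str:
    """Append the SLI profile text to the exported profiles file and return the result."""
    merged = profiles_txt.read_text().rstrip("\n") + "\n\n" + sli_profile_txt.read_text()
    profiles_txt.write_text(merged)
    return merged

# Usage (paths assumed):
# append_profile(Path.home() / "Desktop" / "NVIDIA Profiles.txt",
#                Path.home() / "Desktop" / "Superposition SLI Profile.txt")
```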


----------



## gmpotu

Quote:


> Originally Posted by *dunbheagan*
> 
> That is not right. Voltage limit and power limit are not dependent. In the best case you are neither power limited nor voltage limited.
> 
> Voltage limit has to do with temps:
> When your card gets hotter it increases voltage to compensate for the higher electrical resistance of the die at higher temps (I think that is the reason). If your card reaches the highest possible voltage, like 1.063 or 1.093, it won't let you hold your highest possible clockspeed but throttles a little. Then you are voltage limited. It begins quite early, at about 50°C. But on water you are normally not voltage limited. Here is a log of a Superposition 4K run on my card:


If this is true, that is a much better explanation of the relationship between voltage reliability (VRel) and temperature. It also makes sense of why the card seems perfectly stable even when you are seeing VRel in GPU-Z with clocks at a low bin.


----------



## Slackaveli

Quote:


> Originally Posted by *gmpotu*
> 
> If this is true that is such a better explanation of the relationship of Voltage Reliability and temperature. Also, it makes sense as to why the card seems perfectly stable even when you are seeing Vrel in GPU-Z with clocks at a low bin.


yep, on air it's common.
Quote:


> Originally Posted by *KCDC*
> 
> Awesome, thanks, I couldn't find anything specific to this via googling.


no problemo. I found it in the Titan Xp owner's thread.


----------



## iluvkfc

Quote:


> Originally Posted by *KedarWolf*
> 
> Palit BIOS confirmed working on a Gigabyte 1080 Ti FE.
> 
> https://www.techpowerup.com/vgabios/191499/palit-gtx1080ti-11264-170331
> 
> 350W power limit, 100 points more in Time Spy than any other BIOS I've tried. Getting 2052 core at 1.025v and not crashing, had to do 1.050v before.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And getting 2100 core sustained in Heaven at 1.093v, no power limiting. Best I could do since Windows 10 Creator Updater was 2088 1.093v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And not power limiting at 1.025v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My new go to BIOS!!


Good find, consider updating your thread with these findings.


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Palit BIOS confirmed working on a Gigabyte 1080 Ti FE.
> 
> https://www.techpowerup.com/vgabios/191499/palit-gtx1080ti-11264-170331
> 
> 350W power limit, 100 points more in Time Spy than any other BIOS I've tried. Getting 2052 core at 1.025v and not crashing, had to do 1.050v before.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And getting 2100 core sustained in Heaven at 1.093v, no power limiting. Best I could do since Windows 10 Creator Updater was 2088 1.093v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And not power limiting at 1.025v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My new go to BIOS!!


Time Spy at 1.025v, 2050 core, 6163 memory; it power limits one bin lower at times on test 2. Note my CPU clocks are quite a bit lower than my sig (4.6GHz CPU, 4.3GHz cache, versus what I normally run); I flashed an older BIOS for stability reasons. I think it's one of my best ever graphics scores, if not my best.


----------



## GRABibus

Quote:


> Originally Posted by *Slackaveli*
> 
> well that's your problem, man. You have the setting that Baasha would use... ie the settings of a 4 Titan Xp
> yeah, not even. average is about 2000. even in overclockers.net, so, we have a better average than regulare fools do. average overall is like 1900. If you can keep 2000, that's a good card. My first one topped out at 1974 :/
> 
> My Aorus holds 2088/2076 @ 1.05v. Thing is a beast.
> 
> That said, if you are only clock concerned , I'd buy the Zotac. Best chance of a 2100. Best chance of a 2076. And basically guaranteed a 2025.


Quote:


> Originally Posted by *KedarWolf*
> 
> Palit BIOS confirmed working on a Gigabyte 1080 Ti FE.
> 
> https://www.techpowerup.com/vgabios/191499/palit-gtx1080ti-11264-170331
> 
> 350W power limit, 100 points more in Time Spy than any other BIOS I've tried. Getting 2052 core at 1.025v and not crashing, had to do 1.050v before.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And getting 2100 core sustained in Heaven at 1.093v, no power limiting. Best I could do since Windows 10 Creator Updater was 2088 1.093v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And not power limiting at 1.025v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My new go to BIOS!!


Could you list the improvements, voltages versus frequencies? (A kind of Excel sheet.)
Thanks.


----------



## c0ld

Quote:


> Originally Posted by *Slackaveli*
> 
> yep, i got you, sir!!
> 
> Here is how to get SLI working...
> 
> #1 Download Superposition SLI Profile+Tools pack here: http://www.mediafire.com/file/n1y1kigoxhcdlre/Superposition_SLI_Profile_Tools.zip
> #2 Extract all to desktop.
> #3 Open Geforce 3D Profile Manager, Extract SLI Profiles to your desktop using the Extract SLI Profiles button. A text document called NVIDIA Profiles should show up on your desktop if you extracted the profiles to the correct location.
> #4 Open NVIDIA Profiles text document.
> #5 Open Superposition SLI Profile text document.
> #6 Copy and paste the Superposition SLI Profile into the NVIDIA Profiles text document then save the NVIDIA Profiles text document. Refer to example pic to see how it should look after you have added the Superposition SLI Profile to the NVIDIA Profiles text document.
> #7 Open Geforce 3D Profile Manager, Import modified NVIDIA Profiles text document into the nvidia driver using the Import SLI Profiles button.
> #8 Enable SLI inside the nvidia control panel if you don't already have it enabled.
> #9 Fire up the Superposition Benchmark and enjoy SLI.
> 
> this copy and paste did not work for me, but when I just tried to input the superposition profile by itself it worked.


That's why I hated SLI when I had my old 780s.


----------



## Slackaveli

have ya'll seen this?????? 




Gtx 1080 voltage unlocked at 1.2v running 2379Mhz... NOT on LN2


----------



## KingEngineRevUp

Can someone with a Corsair Link capable PSU compare the FE stock BIOS to the new Palit BIOS? I might undo my shunt mod in the near future.

First, max out all your fans and pumps and record the power usage.

Second, run Superposition at 1.075v; it should have you hit your max.

Do this with both BIOSes.


----------



## gmpotu

Can I flash the Palit BIOS to my Asus STRIX model for testing, or can you only flash FE cards?

Also, how does increasing the power limit allow for better clocks, with respect to this comment?
Quote:


> Originally Posted by dunbheagan View Post
> 
> That is not right. Voltage limit and power limit are not dependent. In the best case you are neither power limited nor voltage limited.
> 
> Voltage limit has to do with temps:
> When your card gets hotter it increases voltage to compensate for the higher electrical resistance of the die at higher temps (I think that is the reason). If your card reaches the highest possible voltage, like 1.063 or 1.093, it won't let you hold your highest possible clockspeed but throttles a little. Then you are voltage limited. It begins quite early, at about 50°C. But on water you are normally not voltage limited. Here is a log of a Superposition 4K run on my card:


Does the card increase its power draw first to compensate for heat and resistance, and then, if it is at the power limit, increase the voltage to compensate? I'm still missing the whole relationship between power and voltage. It sounds like by increasing the max power limit you can potentially get higher clocks at the same voltage bins, but I don't know why that happens. Curious about the reasoning (the science) behind it, I guess.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> have ya'll seen this??????
> 
> 
> 
> 
> Gtx 1080 voltage unlocked at 1.2v running 2379Mhz... NOT on LN2


Not worth it to me. These cards already push 375 watts at 1.075v-1.093v. At 1.200v you would be pushing 420+ watts on the card. The VRMs will probably burst like that Stealth guy's did.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *gmpotu*
> 
> Can I flash the Palit BIOS to my Asus STRIX model for testing or can you only flash FE cards?
> 
> Also, how does increasing the power limit allow for better clocks with respect to this comment?
> Does the card increase it's power draw first to compensate for heat and resistance and then if it is at the power limit increase the voltage to compensate? I'm just still missing the whole relationship between power and voltage. It sounds like what I am reading is that by increasing max power limit you can get higher clocks at the same voltage bins potentially but I don't know why that happens? Curious as to the reasoning (the science) behind it I guess.


When you hit a power limit, your card will lower your voltage bins.

Power = voltage * current

Let's say you're running a benchmark, and to sustain your power draw you're at 1.000 volts and 300 amps of current:

300 watts = 1.000 volts * 300 amps

Now your card needs 310 amps of current to sustain itself, so your voltage drops, and therefore your clocks drop because of the voltage/clock curve:

300 watts = 0.968 volts * 310 amps

But if you're at a scene that only needs 250 amps, you won't be approaching the power limit. Say you had your voltage set to 1.050; then it would draw something like:

263 watts = 1.050v * 250 amps. No power limiting here.

*So if your card hits 300 watts, it lowers voltage to drop power. But lowering voltage lowers clocks according to how you set your voltage curve.*

Back in the day you could modify your BIOS and allow it to draw 330, 350, 400 watts, etc.

*Therefore your card can draw the current it needs without sacrificing voltage if, say, you have 375 watts of power budget or more.*

If this Palit BIOS truly does work, it'll give you 350 watts and you won't power limit as much. But 350 watts is not enough to prevent power limiting in Fire Strike and Superposition, as these tests require 375-400 watts of power. Still, you'll hit much less green, possibly eliminating 75-90% of it with 350 watts.

During normal gaming, you'll probably avoid green altogether.

At least this is my understanding of it.
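The behavior described above can be put into a toy model. All numbers here are hypothetical, not measurements from any real card; the curve and cap just illustrate the mechanism:

```python
# Toy model of Pascal power limiting: power = voltage * current, and when the
# demanded power exceeds the BIOS cap the card steps down a voltage bin,
# dragging clocks down along the voltage/frequency curve with it.
# All values are hypothetical, for illustration only.

POWER_LIMIT_W = 300.0  # stock FE cap; the Palit BIOS would raise this to 350

# Illustrative (voltage, clock MHz) bins, ordered from highest voltage down.
VF_CURVE = [(1.050, 2050), (1.025, 2025), (1.000, 2000), (0.968, 1974)]

def sustained_bin(current_a, limit_w=POWER_LIMIT_W):
    """Return the highest (voltage, clock) bin whose power fits under the cap."""
    for volts, clock in VF_CURVE:
        if volts * current_a <= limit_w:
            return volts, clock
    return VF_CURVE[-1]  # floor of the curve: the card can't drop further

# A light scene at 250 A fits at full voltage; a heavy 310 A scene forces a
# drop; raising the cap to 350 W keeps full voltage even at 310 A.
```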


----------



## Nico67

Quote:


> Originally Posted by *Roadrunners*
> 
> Thanks for the info, I haven't used AB Voltage curve before but I have just done quick google on how to bring up the graph and have done a screen grab of each card.
> 
> Original 1080Ti
> 
> 
> 
> New 1080Ti


Wow, are they both the same brand and model of card, with the same BIOS? That is really weird, as I would have figured they just had a stock curve for all FE cards regardless of vendor. The new one's curve is totally pointless and will always hit the voltage limit, as the curve goes beyond the max possible 1.093v. If you change it to look like the original one's curve, it should get rid of that limit.
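The suggestion amounts to trimming the curve back under the card's real voltage ceiling. A rough sketch; the ceiling matches the 1.093v figure discussed in the thread, but the curve points themselves are hypothetical:

```python
# Curve points that sit beyond the card's real voltage ceiling can never be
# applied, so the card reports a permanent voltage limit. Trimming the curve
# back under the ceiling mirrors the original card's curve.
# Example curve points are hypothetical.

VMAX = 1.093  # highest voltage the card will actually apply

def clamp_curve(points, vmax=VMAX):
    """Keep only (voltage, clock) points that are actually reachable."""
    return [(v, mhz) for v, mhz in points if v <= vmax]

# Example curve whose last point is unreachable and would flag VRel forever.
bad_curve = [(1.050, 2050), (1.093, 2088), (1.131, 2100)]
```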


----------



## lilchronic

Uhm, these BIOSes are not giving extra power. I'm testing the Palit BIOS and it works just like my Founders Edition BIOS, but it only goes to a 116% power limit and it still throttles just like the Founders Edition BIOS.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *lilchronic*
> 
> Uhm these bios are not giving extra power. Im testing the palit bios and it works just as my founders editon bios. but it only goes to 116% power limit and it still throttle just like the founders ED bios.


Do you have a PSU that can measure power draw?

On this BIOS:

300 watts = 100%
350 watts = 116-117%

That's why you're seeing 116%.

But I don't trust software for measuring power; we don't know how it was written. I only trust a PSU readout or a power meter at the wall.
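The 116% figure lines up with simple arithmetic, since the power slider is just a percentage of the BIOS default TDP. A quick sketch (the 300 W baseline is the FE default discussed above):

```python
# Power-limit sliders are percentages of the BIOS default TDP, so a 350 W cap
# over a 300 W baseline shows up as roughly 117%.

def limit_percent(cap_w, base_w=300.0):
    """Express a wattage cap as a percentage of the BIOS baseline TDP."""
    return 100.0 * cap_w / base_w

def cap_watts(percent, base_w=300.0):
    """Convert a slider percentage back into watts."""
    return base_w * percent / 100.0
```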


----------



## nrpeyton

You can watch your GPU's power draw (in real time, in watts) using HWiNFO64, or measure full system draw using a watt meter plugged in between the PC and the wall. (You can use both together to make a comparison and *validate*.)

You can even use RivaTuner with HWiNFO64 and send the info to your screen so you can watch your card's stats as you play.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *nrpeyton*
> 
> you can watch your power draw from your GPU (in real time in Watts) using hwinfo64
> 
> *or I measure full system draw using a watt meter plugged in between the PC and the wall


HWiNFO might not work, and I've been trying to figure it out.

My card gives HWiNFO its specs, which say 300 watts is the maximum. It looks at my power usage percentage, and if it's 100%, it reports 300 watts.

But I am shunt modded; my card can draw way more than 300 watts now. When I'm at 75% power usage it reports 225 watts, but my PSU actually shows I'm drawing 100-110 more watts than at stock.
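The under-reporting can be corrected roughly, if you know how much the mod changed the sense resistance: the card sees a smaller voltage drop across the shunt and so reports proportionally less power. A sketch; both resistor values below are hypothetical, purely to show the scaling:

```python
# A shunt mod lowers the current-sense resistance, so the card sees a smaller
# voltage drop and software reports less power than is really drawn. Scaling
# the reading back up by the resistance ratio gives a rough true figure.
# Resistor values are hypothetical assumptions.

R_STOCK_MOHM = 5.0    # assumed stock shunt resistance (milliohms)
R_MODDED_MOHM = 3.75  # assumed effective resistance after the mod

def actual_power_w(reported_w, r_stock=R_STOCK_MOHM, r_mod=R_MODDED_MOHM):
    """Scale a software-reported wattage back up by the resistance ratio."""
    return reported_w * (r_stock / r_mod)

# With this ratio, a reported 225 W (75% of a 300 W cap) is really ~300 W.
```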


----------



## Roadrunners

Quote:


> Originally Posted by *Nico67*
> 
> Wow are they both the same brand and model of card, same bios? that is really weird as I would figure they just had a stock curve for all FE cards regardless of vendor. The new ones curve is totally pointless and will always have Voltage limits, as the curve goes beyond the max possible 1.093v. If you change it to look like the original ones curve it should get rid of that limit


Yeah, they are both FE EVGA cards and they both have the same BIOS.

I have changed the curve to match my first card and I still get the same problem. No matter what settings I put this card at, be it stock, overvolted, etc., it hits the voltage limit.


----------



## lilchronic

Quote:


> Originally Posted by *SlimJ87D*
> 
> Do you have a PSU that can measure power draw?


Nope, don't need one. In fact this BIOS throttles to an even lower voltage than the original BIOS. I could run Unigine 1080p Extreme at 1.075v-1.063v, while this BIOS will throttle all the way down to 1.063v-1.031v, with lower scores as well.


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> hwinfo might not work and I've been trying to figure it out.
> 
> My card gives hwinfo my card specs, which is 300 watts is my maximum. It looks at my power use, and if it's 100%, then it reports 300 watts.
> 
> But I am shunt modded. my card can draw way more than 300 watts now. When I'm at 75% power usage, it reports 225 watts. But my PSU actually shows I'm drawing 100-110 more watts than at stock.


Aye, I see what you mean; with a shunt mod you're right of course, the reported power draw will be way off in HWiNFO64.

In the past I've combined it with a watt meter (plugged between the PSU and the wall) to make a rough _(but decent)_ estimation/validation using befores and afters. (I've noticed this is often necessary when using a cross-flashed BIOS.)


----------



## KingEngineRevUp

Quote:


> Originally Posted by *lilchronic*
> 
> Nope don't need one. In fact this bios throttles at even lower voltage than the original bios. i could run unigine 1080p extreme with 1.075v-1.063v while this bios with throttle all the way down to 1.063 -1.031 with lower scores as well.


Yeah, I can't investigate the exact power draw because I am not shunt modded. We would need someone with a PSU to measure power draw from the wall to really see what's going on.

I don't trust software since we are technically changing it.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *nrpeyton*
> 
> aye I see what you mean; with a shunt mod you're right of course.. the reported power draw will be wayyy off
> 
> in the past I've combined it with a watt meter (plugged between PSU and wall) to make a rough estimation/validation


HWiNFO works well if we don't do any modifications, but once you perform modifications it can be thrown off at either the software or hardware level. Same thing with the NVIDIA power commands; they don't read my power draw correctly anymore.


----------



## fisher6

I flashed the Palit BIOS a few hours ago but haven't had time to do any proper testing yet. I have a Corsair power supply which can read power draw using the Link software, but I won't be able to test until tomorrow. Any specific way I should measure? I haven't used Corsair Link in ages.


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> HWinfo works well if we don't do any modifications. But once you perform modifications, it can throw it off either at the software or hardware level. Same thing witht he Nvidia power commands, it doesn't read my power draw correctly anymore.


here's what I'm using:


----------



## KingEngineRevUp

Quote:


> Originally Posted by *fisher6*
> 
> I flashed the Palit bios a few hours ago but haven't had the time to do any proper testing yet. I have a Corsair power supply which can read power draw using the Link software but won't be able to test it until tomorrow. Any specific way I should measure? Haven't used Corsair Link in like ages.


I posted earlier on what to do. You want to flash the stock BIOS first.

Make sure your computer is running the same in both scenarios; for me, I maxed out all of the fans and the pump in my system. Take a screenshot of your system at idle with all fans at 100%.

Run FurMark for just 15 seconds (not even close to enough to harm your PC) and it will show your maximum power draw. Take a screenshot.

Then do the same with the Palit BIOS: take a screenshot at idle, and then take a screenshot during the FurMark run.

The most important thing is comparing the max power draw screenshots of both BIOSes.

If we see a 40-60 watt difference, then you're definitely pulling more than 300 watts. If they're within about 10 watts of one another, then we're not seeing anything.









Let's hope we see that extra 50!

This is me comparing the stock BIOS and the shunt mod: getting around 100 extra watts of power draw.
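The comparison procedure above reduces to simple arithmetic: subtract the idle wall draw from the FurMark draw for each BIOS, then look at the gap between the two deltas. A sketch using the thresholds from the post (all wattage values hypothetical):

```python
# Subtract idle wall draw from load draw to approximate the GPU's share, then
# compare the two BIOSes' deltas. Thresholds mirror the post: a 40+ W gap
# suggests the higher cap is real, ~10 W or less suggests no change.

def gpu_delta(idle_w, load_w):
    """Approximate GPU load draw from two wall-meter readings."""
    return load_w - idle_w

def verdict(stock_delta_w, palit_delta_w, meaningful_w=40.0):
    """Crude read of the result, per the thresholds in the post."""
    gap = palit_delta_w - stock_delta_w
    if gap >= meaningful_w:
        return "extra headroom"   # definitely pulling past the stock cap
    if gap <= 10.0:
        return "no real change"
    return "inconclusive"
```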


----------



## KingEngineRevUp

Quote:


> Originally Posted by *nrpeyton*
> 
> here's what I'm using:


Are you running an FE? You'd be the perfect person to test this for us. Just record the highest number you see with your phone's camera.


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> Are you running an FE? You'd be the perfect person to test this for us. Just record the highest number you see via recording with your phones camera.


My new 1080 TI FE arrives this Thursday (27th).

I just traded my 1080 Classy in for it.


----------



## KedarWolf

No shunt mod, and the Palit BIOS is great!! I'm getting 2062 core at 1.025v and this really great Time Spy run! The best I could do previously at 1.025v was 2032 core.


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> Are you running an FE? You'd be the perfect person to test this for us. Just record the highest number you see via recording with your phones camera.


I hear what you're saying, I'll see what I can do this weekend.


----------



## lilchronic

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, I can't investigate the exact power draw because I am not shunt modded. We would need someone with a PSU to measure power draw from the wall to really see what's going on.
> 
> I don't trust software since we are technically changing it.







^^ I feel the same way.









2050Mhz stock bios
1.05v


http://www.3dmark.com/3dm/18934258


----------



## octiny

The Palit BIOS is no good on my end.

EVGA Founders with Hybrid kits. Same power draw, same clocks (it comes with a stock OC though), same everything... slightly worse scores.

I've tried all the BIOSes out there and nothing compares to the stock FE BIOS.


----------



## KedarWolf

Quote:


> Originally Posted by *GRABibus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Slackaveli*
> 
> well that's your problem, man. You have the setting that Baasha would use... ie the settings of a 4 Titan Xp
> yeah, not even. average is about 2000. even in overclockers.net, so, we have a better average than regulare fools do. average overall is like 1900. If you can keep 2000, that's a good card. My first one topped out at 1974 :/
> 
> My Aorus holds 2088/2076 @ 1.05v. Thing is a beast.
> 
> That said, if you are only clock concerned , I'd buy the Zotac. Best chance of a 2100. Best chance of a 2076. And basically guaranteed a 2025.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Palit BIOS confirmed working on a Gigabyte 1080 Ti FE.
> 
> https://www.techpowerup.com/vgabios/191499/palit-gtx1080ti-11264-170331
> 
> 350W power limit, 100 points more in Time Spy than any other BIOS I've tried. Getting 2052 core at 1.025v and not crashing, had to do 1.050v before.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And getting 2100 core sustained in Heaven at 1.093v, no power limiting. Best I could do since Windows 10 Creator Updater was 2088 1.093v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And not power limiting at 1.025v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My new go to BIOS!!
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> Could you list improvments voltages versus frequencies ? (A kind of excel sheet
> 
> 
> 
> 
> 
> 
> 
> )
> Thanks.
Click to expand...

Only core voltages I've tested, Palit BIOS vs. Asus BIOS vs. stock BIOS:

2100 @ 1.093v Palit (minor power limiting in Heaven); 2088 @ 1.093v Asus; 2088 @ 1.093v stock
2062 @ 1.025v Palit; 2062 @ 1.062v Asus; 2062 @ 1.062v stock
2050 @ 1.012v Palit (no power limiting in Time Spy); 2050 @ 1.031v Asus; 2050 @ 1.043v stock
2012 @ 0.981v Asus (no power limiting); 2012 @ 0.993v stock (no power limiting in Time Spy)

PalitAsusStock.txt 0k .txt file


----------



## nrpeyton

When I learned of Pascal launch I bought the 1080 Classified.

Many of us were upset about lack of BIOS tweaks/voltage tweaks. But determined to succeed I kept pushing/digging and eventually got a hold of the software voltage tools I needed to over-volt and overclock my card further.

I got to 2379MHZ at 1.25v

Not under LN2, but under *chilled water*. Core temp on my 1080 was -10c to 6c (6c under load).

I modified/bypassed the thermostat on my 1/2 HP (790W cooling capacity) Hailea water chiller to allow subzero water temperatures. (Then, by running a high glycol-based coolant, I could go subzero without the coolant in my loop freezing.)

Here's the video:
(you can actually see the ice build-up on my 1080 Classified in the video).
My CPU was also under Dry Ice; ignore that; and notice the full-cover water block covered in ice on my 1080).




(Water block is covered in arma-flex insulation and I have "putty" on the back of the 1080 PCB to stop moisture touching the solder points).

Can't wait for my new 1080 TI to arrive on Thursday so I can see what she can do


----------



## lilchronic

Quote:


> Originally Posted by *nrpeyton*
> 
> When I learned of Pascal launch I bought the 1080 Classified.
> 
> Many of us were upset about lack of BIOS tweaks/voltage tweaks. But determined to succeed I kept pushing/digging and eventually got a hold of the software voltage tools I needed to over-volt and overclock my card further.
> 
> I got to 2379MHZ at 1.25v
> 
> Not under LN2, but under *chilled water*. Core temp on my 1080 was -10c to 6c (6c under load).
> 
> I modified/bypassed the Thermostat on my 1/2 HP (790w cooling capacity) Hailea Water Chiller to allow Subzero water temperatures. (Then by running a high gycol based coolant I could go subzero without the coolant in my loop freezing).
> 
> Here's the video:
> (you can actually see the ice build-up on my 1080 Classified in the video).
> My CPU was also under LN2 ignore that; and notice the full-cover water block covered in ice).
> 
> 
> 
> 
> 
> Can't wait for my new TI to arrive on Thursday so I can see what she can do


You had an XOC BIOS for that 1080, right? I remember when I had a 780 Ti Kingpin doing 1550MHz on cold winter days using the Classified tool with the XOC BIOS.


----------



## nrpeyton

Quote:


> Originally Posted by *lilchronic*
> 
> You had an XOC bios for that 1080 right? I remember when i had a 780Ti kingpin doing 1550Mhz on cold winter days using the classified tool with the xoc bios.


The XOC BIOS wasn't compatible with the Classy and HOF, but I had a few software voltage tools that were compatible (and good for another 150 millivolts).

The XOC T4 didn't brick the card, but the power reporting was soooo horribly off that the card didn't even function properly... it would actually draw _less power_ and I'd get lower FPS and a lower score.

The XOC BIOS worked great on FTWs and the like.

But aye, those tools are fantastic, lol; can't wait to see what my new TI does.

I had a friend from the 1080 owners' club who wheeled his rig outside into -20 for benching and got to roughly 2190 under water.


----------



## lilchronic

Quote:


> Originally Posted by *nrpeyton*
> 
> The XOC BIOS wasn't compatible with the Classy and HOF. But I had a few software voltage tools that were compatible. (and good for another 150 milli volts).
> 
> The XOC T4 didn't brick the card, but the power reporting was soooo horribly off that the card didn't even function properly... it would actually draw _less power_ and I'd get less FPS & lower score.
> 
> The XOC BIOS worked great on FTW's and the likes..
> 
> But aye those tools are fantastic lol; can't wait to see what my new TI does.


Well, I hope they get it right for the Kingpin 1080 Ti.


----------



## trickeh2k

Quick question: I'm about to get a 1080 Ti, moving on from my current 780 Classy. Is there any reason not to get an FE? Seeing how Maxwell and Pascal don't seem to scale with voltage, it just seems like a waste of money to get a custom-PCB card with extra power phases that do more or less nothing. Would there be any benefit in holding on for something like a Classy? I'm going to put the new one under water as well and play around with custom BIOSes. It seems like I'd only be paying for a custom fan solution and RGB lighting I won't use anyway, or am I missing something here? Thanks.


----------



## nrpeyton

Quote:


> Originally Posted by *lilchronic*
> 
> Well i hope they get it right for the kingpin 1080ti.


Exactly my thinking 

And if it grows at hwbot.org then we'll start to see more flexibility with BIOS tweaks / tools for Pascal TI. _(hopefully)_

The TI is still soooo new... even *two* weeks ago it was still going in and out of stock on the Nvidia website. So there's plenty of time.

_I've been in a queue with EVGA since launch for my TI, and it only shipped today (arriving Thursday from Germany to the U.K.)._


----------



## nrpeyton

Quote:


> Originally Posted by *trickeh2k*
> 
> Quick question, I'm about to get me a 1080Ti moving on from my current 780 Classy. Is there any reason not to get a FE? Seeing how Maxwell and Pascal doesn't seem to scale with voltage, it just seems like a waste of money to get a custom pcb card with extra power phases that seems to do more or less nothing. Would there be any benefits for me to hold on for something like a classy? I'm gonna put my new one under water as well and play around with custom bioses. Seems like I'm only paying for a custom fan solution and rgb lightning I won't use anyways or am I missing something here? Thanks.


-Bigger VRM = better heat dissipation (less heat spilling over from the VRM onto the core + memory = better overclocking).
The three GDDR5X memory chips closest to the VRM side of the card can get up to 25c hotter than the chips on the I/O side. (On a card with a smaller PCB, i.e. Founders Edition, that number is going to be even greater.)

-Bigger PCB = see above x 2

-Ability to take cards voltages using multi-meter from 'probe points'

-Voltage health LED's for core, mem + PCI-E

-*No* need to shunt mod.

-According to the KPE website the new EPOWER is going to have *EVBot* 'built in'. <--_my 1080 Classy had an EVBot plug port so I imagine the 'KPE 1080 TI' will have it._
-*Better compatibility* if you want to experiment/bench with Extreme Cooling

-*Easier to remove 'stock' heatsink/fan* and go Water cooling. (EVGA also allows water-cooling. You can remove heatsink and install water-block and they will still honour warranty as long as you return card in original condition).

-*Higher power target* = less power throttling. (We know for sure these cards *do* power throttle; I see it on this thread every day.)

-Sexier card.

-I think EVGA/Kingpin also have a nice surprise in store for us with the TI, as 1080 Classy owners got frustrated with all the lockdowns. (It was the general consensus and well known on the forums.)

If you can get temps down below 25c, voltage could still have a place up to about 1.25v (150 millivolts more than the usual limit). _(Silicon lottery dependent, of course.)_


----------



## KedarWolf

Updated OP and BIOS flashing thread with Palit BIOS results.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Updated OP and BIOS flashing thread with Palit BIOS results.


Isn't that a little too quick? People are reporting issues with it right now.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Updated OP and BIOS flashing thread with Palit BIOS results.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Isn't that a little too quick? People are reporting issues with it right now.
Click to expand...

Saw one person say it wasn't working and one who never really tested it or posted results.

And if they don't reboot, reinstall drivers and reboot one last time, they're going to have issues.

Why are you always so contrary?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Saw one person say it wasn't working and one who never really tested it or posted results.
> 
> And if they don't reboot, reinstall drivers and reboot one last time, going to have issues.
> 
> Why are you always so contrary.


I'm not trying to be contrary. If I can explain my background: I publish science papers, so it's in my profession to present findings only once I've gathered and sampled data. I'm more afraid of reporting misinformation than anything. We maintain an OP together, and it can affect an entire user base if we don't make sure something works for everyone, or at least write a disclaimer that it works for some but not for others.

But so far I actually believe this might be it: if someone is hitting 116%, this BIOS's maximum power draw, then that should be 350 watts.
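For anyone sanity-checking these percentages: GPU-Z and Afterburner report board power relative to the BIOS's base TDP, so the same slider value means different watts on different BIOSes. A quick sketch of the math (the 250W FE and 300W Palit base figures are the values tossed around in this thread, not something I've verified on every card):

```python
def pct_to_watts(base_tdp_w, pct):
    # GPU-Z/Afterburner report board power as a percentage of the
    # BIOS base TDP, so the same % means different watts per BIOS.
    return base_tdp_w * pct / 100.0

print(pct_to_watts(250, 120))  # FE BIOS: 120% of 250 W = 300.0 W
print(pct_to_watts(300, 116))  # Palit BIOS: 116% of 300 W = 348.0 W
```

Which is why 116% on the Palit BIOS lands a hair under the 350W hard cap.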


----------



## trickeh2k

Quote:


> Originally Posted by *nrpeyton*
> 
> -Bigger VRM = better heat dissipation (less heat spilling over from VRM onto core + memory = better overclocking)
> The two or three GDDR5X memory chips closest to the VRM side of the card can get up to 25c hotter than the chips on the I/O side. (On a card with a smaller PCB, i.e. Founders Edition, that number is going to be even greater.)
> 
> -Bigger PCB = see above x 2
> 
> -Ability to take cards voltages using multi-meter from 'probe points'
> 
> -According to KPE website the new epower is going to have *evbot* 'built in'. <--my 1080 classy had evbot plug port so I imagine 'KPE 1080 TI' will have it.
> 
> -*Better compatibility* if you want to go Extreme Cooling
> 
> -*Easier to remove 'stock' heatsink/fan* and go Water cooling. (EVGA also allows water-cooling. You can remove heatsink and install water-block and they will still honour warranty as long as you return card in original condition).
> 
> -*Higher Power Target* = Less power throttling. (we know for sure these cards *do* power throttle I see it on this thread every day).
> 
> -Sexier card.
> 
> -I think EVGA/Kingpin also have a nice surprise for us in-store with TI. As 1080 Classy owners got frustrated with and all the lockdowns. (it was the general consensus and well-known at the forums).
> 
> -If you can get temps down below 25c voltage still has a place up to about 1.25v (150 milli volts more than usual limit).


Thank you for your reply! Yeah, I figured the VRM cooling would be better, but I had no idea the difference could be that big! I'm not looking to do any sort of extreme OC, just fiddle around with different BIOS settings and see what I can push to be stable for games. I have no interest in pushing good benchmark runs.







I have no idea what epower is and have never heard of it, but EVBot seems like something I'd not really need if I'm just pushing my card for a stable gaming OC, right? KPE is the Kingpin Edition of the 1080 Ti? Sorry if that seems like a dumb question, I've been out of the loop for quite a while.

I'm fully aware of EVGA's policies regarding removing heatsinks and fans; that's one of the reasons I went with EVGA's Classy.







However, I do believe there are a few more manufacturers now offering the same kind of warranty. Back when my card was the hottest **** around, I think they were the only ones. There's nothing wrong with a sexy card, but I prefer performance over looks tbh.









I have a hard time seeing I'd get below 25c, since that's like 4-5c hotter than my ambient temp, and my CPU tends to run pretty hot since it's a bad chip and requires a lot of voltage (I never seem to be able to hit the silicon lottery with CPUs). Yeah, I sort of read about that. I was quite ready to jump the gun on the 1080 Classy but got so disappointed with it I held off ordering a new GPU. But it seems wise to hold off a few weeks or even a month to see what the HOF might offer or if the Classy comes around. Would the KPE (assuming I got that part right) be a better choice for me, or would it most likely be wasted potential since I'm aiming for stable game clocks? (I know it's only hypothetical until we really see the card, but oh well.)


----------



## RavageTheEarth

So I have an Aorus (non-Xtreme) and I figured that since I'm waiting on the waterblock to be released that I'll try the 1v challenge. Here are my results with 2012 core @ 1v. Absolutely no Perfcaps in GPU-Z and I've done multiple runs through Superposition 4k and Heaven and have been playing Ghost Recon Wildlands for a couple hours with no issues. Might be able to either get a higher clock or drop the voltage even more. Haven't tried that yet.

BTW, just an FYI: I learned that if you select a point in your voltage curve and press the "L" key, it will lock the card to that bin. I've found that it does still drop a bin, so lock it on the next point up from the one you are aiming for.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *RavageTheEarth*
> 
> So I have an Aorus (non-Xtreme) and I figured that since I'm waiting on the waterblock to be released that I'll try the 1v challenge. Here are my results with 2012 core @ 1v. Absolutely no Perfcaps in GPU-Z and I've done multiple runs through Superposition 4k and Heaven and have been playing Ghost Recon Wildlands for a couple hours with no issues. Might be able to either get a higher clock or drop the voltage even more. Haven't tried that yet.
> 
> BTW just an FYI, I learned that if you select a point in your voltage curve and then press the "L" key and it will lock it to that bin. I've found that it does still drop a bin so lock it on the next point up that you are aiming for.


Nice, 2K at 1V is good. What was your superposition score?


----------



## Benny89

Seems like my STRIX can hold a perfectly stable 2063 clock in Witcher 3 at 1.062V. This is the bin where it stops hitting the power limit in W3 and downclocking itself, as long as temps stay below 60C. Any higher voltage bin makes the card hit the power limit and drop bins in W3 until it reaches 1.062V, where it holds. At 70% fan speed, after a few hours in W3 I saw a max of 58C.

I think I will repaste it soon with Grizzly to drop it even more so I can maybe run it at 60% fan.

I can bench at 2088 at 1.081V, but that is useless to me.








At the lowest bin I could get 2012 at 1.000V. Funny, I couldn't get above 2038 until 1.04V, where I could finally hit 2050.


----------



## Nico67

Quote:


> Originally Posted by *octiny*
> 
> Palit bios no good on my end.
> 
> EVGA Founders with Hybrid Kits. Same power draw, clocks (comes with stock OC though), same everything....slightly worse scores.
> 
> I've tried all the bios's out there and nothing compares to the FE stock bios.


Same result here; probably better than the SC2 and Inno3D, but not as good as the FE. Also, it's not drawing more than 300W (plus spikes) for me, and it's power limiting.

2100 / 6055 (mem is better a bit higher, like the SC2)



The most notable thing is the conflict between AB and GPU-Z: AB acts like it's reading the FE BIOS and doing 120%, but GPU-Z is reading about 100% plus spikes, so 300W.

It seems to power limit less, but not significantly, and again probably due to a softer BIOS that doesn't demand quite the same power and slightly less voltage per clock bin. Again, better than the SC2, but I can't match the FE BIOS in SuPo 4K.

nvidia-smi shows maxes up to 313W and averages not exceeding 300W, and I checked my UPS power output and it was only peaking at 423W with the monitor connected as well. With PSU efficiency, what it's giving to the system is even less.

2126 / 6055 too, showing some power limiting



Lastly, it's not stable in games at the same clocks as the FE BIOS.









Edit: thinking about "but GPU-Z is reading about 100% plus spikes so 300w": that's exactly what the Asus BIOS did, 111% of 275W ≈ 300W, but AB agreed with that at least.
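Side note on reading those wall numbers: what the UPS shows is AC draw, so the DC load the PSU delivers is lower by its efficiency. A rough sketch; the 90% figure is an assumed efficiency for a decent unit at this load, not a measured one:

```python
def dc_load_watts(wall_watts, efficiency):
    # AC draw at the wall times PSU efficiency gives the DC power
    # actually delivered to the system components.
    return wall_watts * efficiency

print(dc_load_watts(423, 0.90))  # ~380.7 W to the system
```

So a 423W wall peak is only roughly 380W of actual system load.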


----------



## gmpotu

Is Driver 378.92 more stable than 381.65? Noticed you're running the older driver Nico67.

Do you guys run into the issue where your clocks won't go back to 3D mode? I play games in Fullscreen (Windowed) and stream or look at forums on my other monitor. If I click off the game and go back my clocks won't go up and it keeps me at like 0.712v.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Nico67*
> 
> Same result here, probably better than SC2 and Inno3d, but not as good as FE. also its not drawing more than 300w + spikes for me and power limiting.
> 
> 2100 / 6055 (mem better a bit higher like sc2)
> 
> 
> 
> most notable thing is the conflict between AB and GPU-Z, AB is like its reading the FE bios and doing 120%, but GPU-Z is reading about 100% plus spikes so 300w
> 
> seems to power limit less, but not significantly and again probably due to a softer bios that does demand quite the same power and slightly less voltage per clk bin. Again better than sc2 but I can't match the FE bios in SuPo 4K.
> 
> smi shows maxes upto 313w and averages not exceeding 300w, and I checked my UPS power out and it was only peaking at 423w with monitor connected also. With PSU efficiency then what its giving to the systems is even less.
> 
> 2126 / 6055 to showing some power limiting
> 
> 
> 
> lastly not stable in games at same clks as FE


So you measured the power draw and its similar to the Inno3d and Asus bios, just +15 watts over stock?


----------



## RavageTheEarth

Quote:


> Originally Posted by *SlimJ87D*
> 
> Nice, 2K at 1V is good. What was your superposition score?


It was around 10100-ish; somewhere in the hundreds. I was actually surprised, because I only score an extra 100 or so points when I'm running my daily 2063 @ 1.063v, but I think I'm going to stick with this for now. I've actually noticed smoother performance in games since it's not jumping around the bins all the time. I'm very happy I found out there was a lock function for the voltage curve. I've tried underclocking before without much success, but now that I can lock bins it's perfectly stable.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *RavageTheEarth*
> 
> It was around 10100-ish. Somewhere in the hundreds. Was actually surprised because I only score about an extra 100 or so points when I'm running my daily 2063 @ 1.063v, but I think I'm going to stick with this for now. I've actually noticed some smoother performance in games since it's not jumping around the bins all the time. I'm very happy I found out there was a lock function for the voltage curve. I've tried underclocking before without much success, but now that I can lock bins it's perfectly stable.


Yeah, but the only thing about locking is your card won't idle. It'd be annoying to have to unlock and relock it each time, even if you already have a profile set up.


----------



## Nico67

Quote:


> Originally Posted by *SlimJ87D*
> 
> So you measured the power draw and its similar to the Inno3d and Asus bios, just +15 watts over stock?


I would say the Inno3D is the only one giving you extra watts, but it's only 15W. The Asus may be giving you a few extra, but the Palit doesn't seem to be giving any, I would say, although I'll give it the benefit of the doubt and say a few, like the Asus.


----------



## Benny89

Quote:


> Originally Posted by *RavageTheEarth*
> 
> It was around 10100-ish. Somewhere in the hundreds. Was actually surprised because I only score about an extra 100 or so points when I'm running my daily 2063 @ 1.063v, but I think I'm going to stick with this for now. I've actually noticed some smoother performance in games since it's not jumping around the bins all the time. I'm very happy I found out there was a lock function for the voltage curve. I've tried underclocking before without much success, but now that I can lock bins it's perfectly stable.


Jumping around the bins in games? Strange, as mine stops changing clocks when it reaches the 1.063V bin. It never went below, even in Witcher 3 Ultra. In all other games, same thing.

From my experience there should be a bin where the card simply stops downclocking itself unless you start hitting the temperature cap...

But maybe different AIBs behave differently.


----------



## Coopiklaani

The Palit BIOS works great with the shunt mod. Over 425W TDP, confirmed with PSU power measurement.
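For anyone wondering how the shunt mod blows past the cap: the card measures current via the voltage drop across its shunt resistors, so lowering the effective shunt resistance (CLU painted over the shunt, or a resistor soldered in parallel) makes the controller under-read current and therefore power. A sketch of the math; the 5 mΩ values below are purely illustrative, not the actual 1080 Ti shunt spec:

```python
def true_power(reported_watts, r_stock_mohm, r_parallel_mohm):
    # A parallel resistance lowers the effective shunt value. The
    # controller still assumes the stock value, so it under-reads
    # current (and power) by the ratio r_stock / r_effective.
    r_eff = (r_stock_mohm * r_parallel_mohm) / (r_stock_mohm + r_parallel_mohm)
    return reported_watts * (r_stock_mohm / r_eff)

# Paralleling a shunt with an equal resistance halves what the card
# senses, so a "300 W" reading is really ~600 W at the rails.
print(true_power(300, 5.0, 5.0))  # 600.0
```

Which is also why software power readings stop meaning anything once the card is modded; a wall or PSU measurement like the one above is the only honest number.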


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Coopiklaani*
> 
> Palit bios works great with shunt mod. over 425w TDP confirmed with PSU power measurement.


That's what I get with stock bios. Is it giving you higher scores?


----------



## KingEngineRevUp

Timespy score.


----------



## octiny

Quote:


> Originally Posted by *Nico67*
> 
> Same result here, probably better than SC2 and Inno3d, but not as good as FE. also its not drawing more than 300w + spikes for me and power limiting.
> 
> 2100 / 6055 (mem better a bit higher like sc2)
> 
> 
> 
> most notable thing is the conflict between AB and GPU-Z, AB is like its reading the FE bios and doing 120%, but GPU-Z is reading about 100% plus spikes so 300w
> 
> seems to power limit less, but not significantly and again probably due to a softer bios that does demand quite the same power and slightly less voltage per clk bin. Again better than sc2 but I can't match the FE bios in SuPo 4K.
> 
> smi shows maxes upto 313w and averages not exceeding 300w, and I checked my UPS power out and it was only peaking at 423w with monitor connected also. With PSU efficiency then what its giving to the systems is even less.
> 
> 2126 / 6055 to showing some power limiting
> 
> 
> 
> lastly not stable in games at same clks as FE
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: thinking about "but GPU-Z is reading about 100% plus spikes so 300w", that's exactly what the Asus bios did, 111% of 275 = 300w but AB agreed with that at least.


+1

I have yet to use a BIOS that beats, let alone matches, the FE BIOS.

Nor have I seen any scores from people with any custom BIOS, outside the normal 100-300 point variance in benchmarks, that couldn't be achieved on the FE BIOS.

Maybe the Kingpin card will change that, but I doubt it. It's just a sh**ty generation of cards in regards to overclocking/unlocking.


----------



## KingEngineRevUp

Amazing, even with the shunt mod, some power limiting occurs. Drew about 385-400 Watts here.


----------



## KingEngineRevUp

.


----------



## Coopiklaani

Quote:


> Originally Posted by *SlimJ87D*
> 
> That's what I get with stock bios. Is it giving you higher scores?


+~100 on FSU. I got minor power limiting on the stock BIOS with the shunt mod. The max power I could get with the stock BIOS and power mod was 375W. The max power with the Palit BIOS is about 415W, with no more power limit. It also gives me more frequency/voltage in Furmark: the stock BIOS gives me 0.9v, the Palit gives 1.025v.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *fisher6*
> 
> I flashed the Palit bios a few hours ago but haven't had the time to do any proper testing yet. I have a Corsair power supply which can read power draw using the Link software but won't be able to test it until tomorrow. Any specific way I should measure? Haven't used Corsair Link in like ages.


Let us know what you find please.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Coopiklaani*
> 
> +~100 on FSU. I got minor pwr limit on the stock BIOS with shunt mod. The max pwr I could get with stock BIOS and power mod was 375w. The max power with the Palit BIOS is about 415w, no more pwr limit. Also it gives me more frequency/voltage on furmark; stock bios gives me 0.9v, Palit gives 1.025v.


That's weird, maybe you didn't put enough CLU on.


----------



## Nico67

Quote:


> Originally Posted by *SlimJ87D*
> 
> Amazing, even with the shunt mod, some power limiting occurs. Drew about 385-400 Watts here.


Wow, maybe it's limiting on the PCIe rail, as you didn't shunt mod that one. You could try full 120% power and see whether that fixes it; it may prove whether the PCIe rail is worth modding at all.


----------



## Slackaveli

Quote:


> Originally Posted by *RavageTheEarth*
> 
> So I have an Aorus (non-Xtreme) and I figured that since I'm waiting on the waterblock to be released that I'll try the 1v challenge. Here are my results with 2012 core @ 1v. Absolutely no Perfcaps in GPU-Z and I've done multiple runs through Superposition 4k and Heaven and have been playing Ghost Recon Wildlands for a couple hours with no issues. Might be able to either get a higher clock or drop the voltage even more. Haven't tried that yet.
> 
> BTW just an FYI, I learned that if you select a point in your voltage curve and then press the "L" key and it will lock it to that bin. I've found that it does still drop a bin so lock it on the next point up that you are aiming for.


Exactly my clocks in the 1v challenge. Very nice. I have found that if I lock to 1.05v at 2063, I can run SuPo 4K with only 1 green blip in the whole run. So I assume 1.04v (haven't tried it yet) would be the completely non-power-limited voltage for SuPo 4K on our Aoruses. We hit the jackpot with these two! They are both 2100 cards when watered.
Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah but the only thing about locking is your card won't idle. It'd be annoying to have to unlock and relock it each time even if your already have a profile setup.


I just keep a second profile that's exactly the same except unlocked, has "fan stop" working, and is on Optimal Power in NVCP, so I just switch profiles in AB after gaming/benching and it instantly drops to 240MHz and goes silent. I didn't like the unlocking/relocking method as it sometimes effed up my curve. This way only takes 2 seconds, with no screwing it up. And the fans remind me to do it, so I usually don't forget.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Seems like my STRIX can hold in Witcher 3 perfectly stable 2063 clock at 1.062V. This is the bin where it stops hitting Power Limit in W3 and downclock itself if temps stay below 60C. Any higher voltage bin makes card hits power limit and drop bin in W3 until it reach 1.062V where it holds. At 70% Fan speed after few hours in W3 I saw max 58C.
> 
> I think I will repaste it soon with Grizzly to drop it even more so I can maybe run it at 60% fan.
> 
> Can bench at 2088 at 1.081V but that is useless to me
> 
> 
> 
> 
> 
> 
> 
> .
> 
> At lowest bin I could get 2012 at 1.000V. Funny I couldn't get above 2038 until 1.04V where I could finally hit 2050.


Quote:


> Originally Posted by *SlimJ87D*
> 
> 
> 
> Timespy score.


my EXACT score from last night! damn!


----------



## RavageTheEarth

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah but the only thing about locking is your card won't idle. It'd be annoying to have to unlock and relock it each time even if your already have a profile setup.


That's a good point, but it's just a few key presses to unlock it anyway. I'm going back to a regular curve once the waterblock is released for it. It would be nice if they could implement a function where the bin is only locked when the GPU load hits a certain percentage. It would make things a lot easier.
Quote:


> Originally Posted by *Slackaveli*
> 
> exactly my clocks in the 1v challenge. Very nice. I have found that if i lock to 1.05v at 2063, i can run supo 4k with only 1 green blip in the whole run. So I assume, 1.04 (havent done it yet) would be the completely non power limited voltage for supo 4k on our Aorus'. We hit the jackpot with these two! They are both 2100 cards when watered.
> i just keep a second profile that's exactly the same except unlocked and has the "fanstop" working, and is on optimum power in NCP, so i just switch profiles after gaming/benching and it instantly drops to 240Mhz and goes silent.


Yeah man, I ordered mine on launch day and it boosted to 2000 out of the box with the power slider at 125%. I actually found my FPS to be higher with a memory OC of +350 compared to the "sacred" 12GHz OC, which is perfectly stable but performs worse in all scenarios for some reason. Memory is strange. I haven't pushed it past 1.063v (yet); with a core of 2063 I get a voltage-related performance cap in GPU-Z, but if I drop the voltage to the next bin down I get power-related performance caps instead. I can't wait for the waterblock to be released for this thing. I haven't run air in years, although this is the coolest air-cooled card I've ever used. Max temps during benchmarks don't usually break 53C; the max I've seen was 55C. During gaming I'm mostly in the mid to lower 40s, sometimes up to 50C depending on the game. This is at 1.063v.

These Aorus cards are great if you get a good one.


----------



## Robilar

What is currently the fastest factory overclocked ti available?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> my EXACT score from last night! damn!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Nico67*
> 
> Wow, maybe limiting on the PCIE rail as you didn't shunt mod that one. Could try full 120% power and see whether that fixes it, may prove whether the PCIE rail is worth modding at all


I'm good. I don't think it's worth drawing more power from the motherboard, as the RX 480s supposedly broke a few motherboards because they drew 10 extra watts from them.


----------



## Slackaveli

Quote:


> Originally Posted by *RavageTheEarth*
> 
> That's a good point, but it's just a few key presses to unlock it anyways. I'm going to be going back to a regular curve once the waterblock is released for it. It would be nice if they could implement a function where the bin is only locked when the GPU load hits a certain percentage. Would make things a lot easier.
> Yeah man I ordered mine on launch day and it boosted to 2000 out of the box with the power slider at 125%. I actually found my FPS to be higher with a memory OC of +350 composted to the "sacred" 12Ghz OC which is perfectly stable, but it performs worse in all scenarios four some reason. Memory is strange. I haven't pushed it past 1.063v (yet) with a core of 2063 I get a voltage related performance cap in GPU-Z , but if I drop down the voltage to the nex bin I also get power related performance caps also. I can't wait for the waterblock to be released for this thing I haven't run air in years, although this is the coolest card with an air cooler I've ever used. Max temps during benchmarks doesnt usually break 53C. Max I've seen was 55C. During gaming I'm mostly in the mid to lower 40s, sometimes up to 50C depending on game. This is at 1.063v.
> 
> These Aorus cards are great if you get a good one.


My mems only like up to +416, so I feel ya. With the clocks you get, I bet if you did +51mV on the voltage slider and locked to 1.08v at 2088, you'd hold it, then settle at 2063. Temps are about the same tbh. That's my sweet spot, it seems, and our cards are really close to the same. That is when all my perfcaps went away. I ordered that first morning after I woke up, just like you did.


----------



## Slackaveli

Quote:


> Originally Posted by *Robilar*
> 
> What is currently the fastest factory overclocked ti available?


The Aorus Xtreme is 1636 core. I don't think there is one higher yet. The Zotac AMP Extreme and Galax HOF are close, maybe higher. The K|ngp|n will be the highest as the line fills out.


----------



## RavageTheEarth

Quote:


> Originally Posted by *Slackaveli*
> 
> my mems only like up to +416, so I feel yeah. With the clocks you get i bet if you did +51Mv on the voltage slider and locked to 1.08 at 2088 you'd hold it, then settle at 2063. Temps are about the same tbh. That's my sweet spot it seems and our cards are really close to the same. That is when all my perfcaps went away. I ordered that first morning after I woke up just like you did.


Looks like we got them from the same batch. I actually hold 2063 @ 1.063 perfectly. I never drop bins while gaming. I think it's because I'm not power limited so the card wants to give it more voltage. Going to wait until it's under water before pushing it further.


----------



## Nico67

Quote:


> Originally Posted by *KedarWolf*
> 
> Palit BIOS confirmed working on a Gigabyte 1080 Ti FE.
> 
> https://www.techpowerup.com/vgabios/191499/palit-gtx1080ti-11264-170331
> 
> 350W power limit, 100 points more in Time Spy than any other BIOS I've tried. Getting 2052 core at 1.025v and not crashing, had to do 1.050v before.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And getting 2100 core sustained in Heaven at 1.093v, no power limiting. Best I could do since Windows 10 Creator Updater was 2088 1.093v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And not power limiting at 1.025v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My new go to BIOS!!


Just had a thought in regards to AB's weird behaviour with this BIOS: it seems you're running 4.4.0 beta 6. Just wondering if that might be applying power offsets properly where 4.3.0 isn't? Could you run a quick load test to see what max power is in AB and GPU-Z for comparison?

I did notice that the Palit is using a different voltage regulator from the FE cards, which I didn't think Nvidia were allowing, but there yah go.







Could it be that 4.3.0 doesn't recognise it and doesn't actually add the extra power?


----------



## Foxrun

Why would I keep hitting the power limit in Time Spy if it's not hitting 120%? I got a 15600 score, but the past couple of days I can barely get to 15k.


----------



## mshagg

Quote:


> Originally Posted by *SlimJ87D*
> 
> I'm not trying to be contrary. If I can explain my background, I publish science papers so it's in my profession to try and present findings once I've gathered and sampled data. I'm more afraid of reporting misinformation than anything. We maintain an op together and it can affect an entire user base if we don't make sure something works for everyone. Or at least write a disclaimer that it works for some but not for others.
> 
> But so far I actually believe this might be it: if someone is hitting 116%, this BIOS's maximum power draw, then that should be 350 watts


It's going to be difficult to get a representative sample, but FWIW I'll flash this shortly and report back. I can't measure actual power draw, which is unfortunate, but I've got a pretty good feel for the card and how it behaves in SuPo on the FE BIOS.


----------



## Dasboogieman

Quote:


> Originally Posted by *trickeh2k*
> 
> Thank you for your reply! Yeah, I was considering the VRM cooling to be better but I had no idea that the difference in cooling could be that much of a difference! I'm not looking to do any sort of more extreme oc, just fiddle around with different bios settings and see what I can push to be stable for games. I have no interest of pushing good benchmark runs
> 
> 
> 
> 
> 
> 
> 
> I have no idea what epower is and have never heard of it but evbot seems like something i'd not really need if I'm just pushing my card for stable gamin oc, right? kpe is kingpin edition for 1080ti? sorry if that seems like a dumb question, I've been out of the loop for quite a while.
> 
> I'm fully aware of EVGA's policys regarding removing heatsink and fans, that's one of the reasons I went with EVGA's classy
> 
> 
> 
> 
> 
> 
> 
> However I do believe there's a few more manufacturers now offering the same kind of warranty. Back when my card was the hottest **** around I think they where the only ones at the time. There's nothing wrong with a sexy card but I prefer performance over looks tbh
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have a hard time seeing I'd get below 25c since that's like 4-5c hotter than my ambient temp and my cpu tends to run pretty hot since it's a bad chip and requires a lot of voltage (i never seem to be able to hit the silicon lottery with cpu's). Yeah, I sorta read about that. I was quite ready to jump the gun with the 1080 classy but got so disappointed with it i held off ordering a new gpu. But it seems wise to hold off a few weeks or even a month to see what the HoF might offer or if the Classy comes around. Would KPE (assuming I got that part right) be a better choice for me or would it most likely be wasted potential since I'm aiming for stable game clocks(I know it's only hypothetical until we do really see the card but, oh well)?


A couple of nitpicks to consider. Not all VRMs are the same: while, as a general rule of thumb, more MOSFETs = better cooling, the quality of the MOSFETs also matters. Low-efficiency ones will output a crap-ton of heat, which pretty much negates any advantage you get from going wider.

Temperature really matters with FinFET technology. Leakage really soars with voltage, which in turn spikes the heat output. Keeping these 16nm FinFET chips cool is one of the best ways to get stable high OCs. Take the cooling capacity of the 1080 Ti seriously.


----------



## Alwrath

Well, I tried the Palit BIOS on an air-cooled EVGA 1080 Ti FE. Don't do it. The fan got stuck at half speed, temps went up 20C, and I couldn't get past 1898MHz due to the higher temps. Ugh.

Don't do it on air. Just don't.


----------



## gmpotu

Can someone help me understand what this means? This is from testing in Heaven @ 4K Extreme.



The blue line is where I dropped the voltage slider to +76 from +100.
(So from what I've read, and the fact that I saw it drop to the 1.075v bin @ 1974, I can tell this is because dropping the slider reduced the max voltage I'm allowing the card to use, so GPU-Z is saying I could do better with higher voltage.)

Then it's clear again when I jump it back up to +100 and it jumps back to 2025 @ 1.093v.

Then when I raise clocks above 2025 @ 1.093v, up to 2037 @ 1.093v, I get the rainbow in the graph. (This is what I have no clue about. What does this mean? It all seems stable and fine.)

Nothing is crashing, and the power draw hovers at 108-113% with the slider maxed at 120%. When I run the nvidia-smi command I can confirm the power draw is at about 110%.
Not sure why I am seeing VOp in GPU-Z, and this annoying rainbow.

I think my card is pretty mediocre; not bad, just not an all-star. It gets up to 60C at 75% fan speed on air with 1.093v.
From this it looks like the max I can hit at 1.093v is 2025, which I'm more than happy with, but I'm not sure I want to run at 1.093v all the time.
Memory freezes / artifacts at anything beyond +350, so 11700.
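On the "+350 so 11700" point: Afterburner's memory offset applies to the listed GDDR5X clock, and the effective data rate is double that. A quick sketch of the arithmetic (the ~5505 MHz stock figure is the commonly quoted 1080 Ti value in Afterburner, worth double-checking against your own card's readout):

```python
def effective_mem_mhz(stock_ab_mhz, offset_mhz):
    # Afterburner lists half the effective GDDR5X data rate, so an
    # offset adds twice its value to the effective clock.
    return (stock_ab_mhz + offset_mhz) * 2

print(effective_mem_mhz(5505, 350))  # 11710, i.e. ~11.7 Gbps
```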


----------



## mshagg

So, results with the Palit BIOS. This is a watercooled, non-shunt-modded FE. SuPo 4K Optimized, stock curve with +100 core and 116% power limit.

Out of the box it boosted to 2025/1.075v. Notably fewer power limit flags in the AB graphs compared to the FE or Inno3D BIOS. Max power % reported by AB was 116%.

Temperatures maxed at 48 degrees, which is pretty typical.

Nvidia-smi.exe reports max power limit of 350W. 116% is actually 348W, so if you want that extra 2W you'd have to set it manually with nvidia-smi.

My card is stable at 2025 @ 1.031V on other BIOSes I've tried, so there's a bit of magic to work on the curve to see what it can do.

Looks promising though.
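If anyone wants to log those nvidia-smi numbers over a benchmark run rather than eyeballing them, the CSV output is easy to parse. A minimal sketch; it assumes `nvidia-smi --query-gpu=power.draw,power.limit --format=csv,noheader` output, and the parser runs on a canned sample line so you can try it without a card attached:

```python
def parse_power_csv(line):
    # nvidia-smi --query-gpu=power.draw,power.limit --format=csv,noheader
    # emits lines like "312.45 W, 350.00 W"; return the two floats.
    draw, limit = (field.strip().rstrip(" W") for field in line.split(","))
    return float(draw), float(limit)

sample = "312.45 W, 350.00 W"
print(parse_power_csv(sample))  # (312.45, 350.0)
```

Pipe the live output (`-l 1` for one-second polling) through this and you get a proper trace of draw versus limit instead of a glance at a spike.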


----------



## KingEngineRevUp

Quote:


> Originally Posted by *mshagg*
> 
> So, results with the Palit BIOS. This is a watercooled, non-shunt-modded FE. SuPo 4K optimzed, stock curve with +100 core and 116% power limit.
> 
> Out of the box it boosted to 1025/1.075v. Notably fewer power limit flags in the AB graphs, compared to FE or the Inno3D BIOS. Max power % reported by AB was 116%.
> 
> Temperatures maxed at 48 degrees, which is pretty typical.
> 
> Nvidia-smi.exe reports max power limit of 350W. 116% is actually 348W, so if you want that extra 2W you'd have to set it manually with nvidia-smi.
> 
> My card is stable at 2025 @ 1.031V on other BIOSes I've tried, so a bit of magic to work on the curve to see what it can do.
> 
> Looks promising though.


Yeah, I posted back to another user that 116% should be 350 watts.

From my experience, and after taking lots of measurements: run your card at 1.000v and run Superposition; with around 50 extra watts it shouldn't down-volt or power-limit at all. And if it does, it should be very little.

Try 1.000v and 1.031v, I think you should be able to hold both of those.

The best way to know is measuring at the PSU or the wall outlet.

Can you please take screenshots of gpu-z for both runs? Thank you.


----------



## mshagg

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, I posted back to another user that 116% should be 350 watts.
> 
> From my experience, and after taking lots of measurements: run your card at 1.000v and run Superposition; with around 50 extra watts it shouldn't down-volt or power-limit at all. And if it does, it should be very little.
> 
> Try 1.000v and 1.031v, I think you should be able to hold both of those.
> 
> The best way to know is using a PSU or wall outlet.
> 
> Can you please take screenshots of gpu-z for both runs? Thank you.


Results with 1.000v are strange; it just throws constant VRel and power-limit perfcaps. Had the curve set at 1.000V and 2000MHz. Core clocks didn't budge from 2000MHz - i.e. it wasn't clocking down, but it still showed a power perfcap. What the hell nvidia lol.

Results with 1.031v made much more sense. At 2037MHz 1.031v, power maxes at 100.7% TDP. Zero perfcaps and clocks were stable. I'm trying to find a setting that will sit a bit higher. 100% TDP for the Palit BIOS equates to 300W.

I'm grabbing screenshots of it all, will post up shortly.


----------



## joder

So it looks like the Palit firmware just might give us 350W on the FE?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *mshagg*
> 
> Results with 1.000v are strange; it just throws constant VRel and power-limit perfcaps. Had the curve set at 1.000V and 2000MHz. Core clocks didn't budge from 2000MHz - i.e. it wasn't clocking down, but it still showed a power perfcap. What the hell nvidia lol.
> 
> Results with 1.031v made much more sense. At 2037MHz 1.031v, power maxes at 100.7% TDP. Zero perfcaps and clocks were stable. I'm trying to find a setting that will sit a bit higher. 100% TDP for the Palit BIOS equates to 300W.
> 
> I'm grabbing screenshots of it all, will post up shortly.


Put the voltage slider to 0. VRel means it wants to jump higher because it knows you have that slider maxed out, but your curve is limiting it.

If you have Firestrike or Time Spy, those are harder on the power limit than Superposition.

What will set it higher is more voltage. Do 1.050v and then 1.062v

Power = voltage * current
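To put numbers on that (a rough sketch; current is held constant here, and real draw rises with voltage and clock, so these are lower bounds):

```python
# P = V * I: how a voltage bump eats into the power budget.
# Current is held constant for simplicity; in reality it also rises
# with voltage and clock, so these figures are lower bounds.
current_a = 300 / 1.031  # amps implied by a 300W draw at 1.031V

for volts in (1.031, 1.050, 1.062):
    print(f"{volts:.3f}V -> {volts * current_a:.0f}W")
# 1.031V -> 300W, 1.050V -> 306W, 1.062V -> 309W
```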

So far sounds good. Hopefully someone with a PSU readout can do a measurement for us. I can't, since I've done the shunt mod.


----------



## Slackaveli

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Looks like we got them from the same batch. I actually hold 2063 @ 1.063 perfectly. I never drop bins while gaming. I think it's because I'm not power limited so the card wants to give it more voltage. Going to wait until it's under water before pushing it further.


sure looks like it, huh.
Scored a new SuPo high. Got my DDR3 Corsair Platinum RAM running past its ratings at 2667MHz C11 tight. sickness. SuPo 4K 10,521.


----------



## Slackaveli

Quote:


> Originally Posted by *Nico67*
> 
> Just had a thought in regards to AB's weird behaviour with this bios, and it seems you're running 4.4.0 beta 6. Just wondering if that might be applying power offsets properly where 4.3.0 isn't? Could you run a quick load test to see what max power is in AB and GPU-Z for comparison.
> 
> I did notice that the Palit is using a different voltage regulator from the FE cards, which I didn't think Nvidia were allowing, but there yah go
> 
> 
> 
> 
> 
> 
> 
> Could be 4.3.0 doesn't recognise it and doesn't actually add the extra power?


could very well be.


----------



## mshagg

Alright, here's a run at 1.05V 2037MHz (score is low as mem is at stock speed). No way could my card hold that through 4K SuPo on the FE BIOS. Didn't budge on clock speed or core volts.



GPU-Z reading at 0.1sec on the sensors.

Zero perfcap. Max TDP through the run was 105%, which equates to 315W.


----------



## Nico67

Quote:


> Originally Posted by *mshagg*
> 
> Alright, here's a run at 1.05V 2037MHz (score is low as mem is at stock speed). No way could my card hold that through 4K SuPo on the FE BIOS. Didn't budge on clock speed or core volts.
> 
> 
> 
> GPU-Z reading at 0.1sec on the sensors.
> 
> Zero perfcap. Max TDP through the run was 105%, which equates to 315W.


Which version of AB are you using?


----------



## mshagg

Quote:


> Originally Posted by *Nico67*
> 
> Which version of AB are you using?


4.4.0 Beta 6, but I close it once the OC is applied.


----------



## Nico67

Quote:


> Originally Posted by *mshagg*
> 
> 4.4.0 Beta 6, but i close it once the OC is applied.


Cool, I'll try upgrading to that when I get home, may explain why some people have had more success than others with some bioses


----------



## mshagg

OK, so I found a spot on the curve, with some mem OC, that pushes the Palit BIOS to the power limit: 2050MHz at 1.061V, with some power limiting in the first few scenes of SuPo.

Peak TDP was 109%, which equates to 327W.

I think it's a stretch to say we're going to get 350W from it. Presumably we're running up against a per-rail sub-limit, given the Palit would be drawing that extra power from its 2x 8-pin. That said, these are far better results than I've had with any of the other BIOSes.
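For context, the PCIe spec ceilings for the two connector layouts (a sketch using spec numbers only; assuming the FE's 6-pin + 8-pin versus the Palit's 2x 8-pin, and ignoring each board's own per-rail sub-limits):

```python
# PCIe spec power ceilings per connector (watts). These are spec
# limits, not what a given board's per-rail sub-limits will allow.
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150

fe_budget = SLOT + SIX_PIN + EIGHT_PIN  # FE layout: 6-pin + 8-pin
palit_budget = SLOT + 2 * EIGHT_PIN     # Palit layout: 2x 8-pin

print(fe_budget, palit_budget)  # 300 375
```

So a 350W limit fits within spec for a 2x 8-pin board, but on the FE's connectors it already exceeds the nominal 300W budget, which lines up with the card topping out below the flashed limit.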


----------



## lilchronic

It does not work. Not sure how you guys don't realize this?


----------



## mshagg

Quote:


> Originally Posted by *lilchronic*
> 
> It does not work. Not sure how you guys don't realize this?


Card just held 2050MHz @ 1.04V through a full run of Superposition 4K optimized. The Founders Edition BIOS would drop below 2000MHz in the first few scenes with perfcaps; even at 1987MHz/0.975V it would throw TDP perfcaps. Several hundred points up on actual benchmark scores.

I'm literally showing data that demonstrates the Palit BIOS results in less power limiting than the stock FE BIOS.

That being said, outside of this benchmark I can't really see any use for it. None of the games I play run up against the power limit anyway. This is an overclocking forum where we tweak hardware, yeah?


----------



## TheBoom

So is there a reason to upgrade to AB 4.4.0 Beta 6?

I'm still on 4.3.0 Beta 14 lol.


----------



## mechwarrior

Tried the ASUS BIOS: stable at 2063 through the Heaven and SuPo benchmarks, no power limit.

What's the command for seeing power usage?


----------



## pez

Well, the STRIX OC is going up for sale soon. It wasn't the right fit for my case, and I wasn't a fan of the overall noise profile of the card. The FE should be in next week or mid-week the following, and it will also find its way up for sale. Has anyone received ETAs on their EVGA AIB cards?


----------



## lilchronic

Quote:


> Originally Posted by *mshagg*
> 
> Card just held 2050MHz @ 1.04V through a full run of Superposition 4K optimized. The Founders Edition BIOS would drop below 2000MHz in the first few scenes with perfcaps; even at 1987MHz/0.975V it would throw TDP perfcaps. Several hundred points up on actual benchmark scores.
> 
> I'm literally showing data that demonstrates the Palit BIOS results in less power limiting than the stock FE BIOS.
> 
> That being said, outside of this benchmark I can't really see any use for it. None of the games I play run up against the power limit anyway. This is an overclocking forum where we tweak hardware, yeah?


Think what you want. It's not doing anything for me.


----------



## mshagg

Quote:


> Originally Posted by *lilchronic*
> 
> Think what you want. it's not doing anything for me.


I don't think anything, I look at results.

Sorry, I don't mean to be a jerk. I think there's enough variation in the actual GP102 chips that results are going to vary. This thing wouldn't hold 2000MHz in the 4K bench to save its life on the stock BIOS. Being able to hold 2050MHz is a big improvement for it, and very easy to notice after dozens and dozens (and dozens) of runs through that benchmark.


----------



## Nico67

Quote:


> Originally Posted by *Nico67*
> 
> Cool, I'll try upgrading to that when I get home, may explain why some people have had more success than others with some bioses


OK, tried installing AB 4.4.0 B6 and then flashing to Palit again; unfortunately I'm still seeing the same thing. 116% in AB is about 100% in GPU-Z, and I just start getting minor limiting. It's weird, but it just ain't playing ball for me









Next bios pls


----------



## Silent Scone

Quote:


> Originally Posted by *pez*
> 
> Well, the STRIX OC is going up for sale soon. It wasn't the right fit for my case, and I wasn't a fan of the overall noise profile of the card. The FE should be in next week or mid-week the following, and it will also find its way up for sale. Has anyone received ETAs on their EVGA AIB cards?


Noise profile? Are you a bat? I can't hear mine at all and the rest of my system is silent lol


----------



## alucardis666

Quote:


> Originally Posted by *Silent Scone*
> 
> Noise profile? Are you a bat? I can't hear mine at all and the rest of my system is silent lol


Everyone's different. The Aorus I tried was supposed to be "quiet", but at 100% fan speed, it sounded no quieter than my FE's @ 100%


----------



## pez

Quote:


> Originally Posted by *Silent Scone*
> 
> Noise profile? Are you a bat? I can't hear mine at all and the rest of my system is silent lol


I have a rather open/hollow case, so I hear any noise that happens inside - coil whine, AIO pump noise, etc.

Idle noise is perfect, but at 60%-plus the fans' noise profile is rather undesirable for me. It's not that it's 'loud'; it's more the quality of the sound that the 3x 90mm fans produce.


----------



## BrainSplatter

Quote:


> Originally Posted by *Nico67*
> 
> Wow are they both the same brand and model of card, same bios? that is really weird as I would figure they just had a stock curve for all FE cards regardless of vendor.


GPUs are binned with different VIDs, just like CPUs. Which is also the reason GPU power-usage comparisons in graphics card reviews are often pointless: the reviews usually never mention the specific GPU VID.


----------



## Roadrunners

Quote:


> Originally Posted by *BrainSplatter*
> 
> GPUs are binned with different VIDs, just like CPUs. Which is also the reason GPU power-usage comparisons in graphics card reviews are often pointless: the reviews usually never mention the specific GPU VID.


What I would like to know is whether it is normal for this second card to be hitting the voltage limit all the time, be it at stock or with +100 vcore. It's a replacement RMA card, so I just need to know whether I should return it to EVGA or not.

EVGA customer support are saying this behaviour is totally normal, but my first card doesn't trigger any voltage limits at all, even at 2088MHz.


----------



## pez

It sounds like you're in SLI? Or are you saying 'second card' as in the replacement card? If in SLI, swap them around and see if the behavior continues.


----------



## Roadrunners

It's a replacement card. No SLI. My first card has coil whine.


----------



## Silent Scone

Quote:


> Originally Posted by *pez*
> 
> I have a rather open/hollow case, so any noise that happens inside, I hear. This means coil whine, AIO pump noise, etc.
> 
> Idle noise is perfect, but at 60% plus, the fans noise profile is rather undesirable for me. It's not that it's 'loud' but it's more the quality of sound that the 3 x 90mm fans exhibit.


Mine is quite literally next to my head (open chassis), and I can barely hear it besides the chokes, which is to be expected. I've not modified the fan profile in any way.


----------



## BrainSplatter

Quote:


> Originally Posted by *Roadrunners*
> 
> What I would like to know is whether it is normal for this second card to be hitting the voltage limit all the time, be it at stock or with +100 vcore.


Hitting the 'voltage limit' is normal GPU Boost behaviour. Your new card just hits it more often than your old one because it has a higher VID. My assumption was that, with an identical voltage curve, the behaviour would be the same for your 2 cards, but it seems to be more complex than that. I already noticed with my 2 FE cards (their VIDs are also different) that the voltage points in the editor seem to jump around a little differently for each card when I press 'apply', even when I try to use the exact same curve.

In the end, the one thing that matters is the maximum OC you get from your new card. That depends on the silicon lottery, and your new one might not get as high as your old one. But if it runs stable somewhere between 2000-2050MHz, then that is a perfectly fine result. The new card will probably run a little hotter/louder than the old one due to the higher VID.

If you got yet another new card, it might have an even higher VID, overclock less, or maybe hit 2100MHz at 1v







Just a matter of luck.


----------



## viz0id

This has probably been asked a million times before, but I could not find a recent post about it.

Is there any real benefit to overclocking the memory frequency for gaming, or is it just for benchmarks? I ask because I seem to be able to get the core clock much higher without adding a memory OC.


----------



## KedarWolf

I got my older Maxwell Titan X with a waterblock and backplate on it and Predator QDC's for sale on the marketplace here.

It runs about the same as a 1070 but with 12GB of RAM. A guy PMs me, knowing it's not a Pascal Titan X, offering me a 1080 Ti plus cash for my card. Says he doesn't need the power of a 1080 Ti. Seems legit, right?









And no, I'm not going for it. Guy HAS to be a scammer.


----------



## KedarWolf

Okay, results on the Palit BIOS.

Superposition is a bit lower, Heaven about the same as stock, Time Spy quite a bit better. Not sure why some benches do better than others.









Edit: I'm going to keep using it with the higher clocks at lower voltages though.


----------



## BrainSplatter

Quote:


> Originally Posted by *viz0id*
> 
> Is there any real benefit of clocking memory freq. for gaming? Or is it just for the benchmark? Because i seem to be able to get the coreclock much higher without adding memory OC?


In general, the higher the resolution and/or MSAA/SSAA setting, the more VRAM performance will make a difference. And of course, each game is different. In the end, you have to test each game yourself to see whether overclocking core or memory makes more of a difference (and this might even depend on the current scene in the game).


----------



## viz0id

Quote:


> Originally Posted by *BrainSplatter*
> 
> In general, the higher the resolution and/or MSAA/SSAA setting, the more VRAM performance will make a difference. And of course, each game is different. In the end, you have to test each game yourself to see whether overclocking core or memory makes more of a difference (and this might even depend on the current scene in the game).


OK, thanks for answering. I guess there isn't just a black-and-white answer here then







But since I seem to be able to run lower voltages without the memory OC'ed as well, I think I might just skip OC'ing the memory.


----------



## MRCOCOset

Palit 350W BIOS on an ASUS GTX 1080 Ti FE. It works.







My video >>>


----------



## octiny

Quote:


> Originally Posted by *viz0id*
> 
> Probably have been asked a million times before but could not find a new post about it.
> 
> Is there any real benefit of clocking memory freq. for gaming? Or is it just for the benchmark? Because i seem to be able to get the coreclock much higher without adding memory OC?


On both the Titan X Pascal and the Ti, overclocking the memory helps immensely. They love bandwidth.

OCing the memory +500 on my Titan X Ps gives me 90% of what +170 on the core gives. Testing done @ 4K via ROTR, The Division, FS Ultra.

Same thing on the Ti; unfortunately the memory OCs like crap on one of my cards, so I'm stuck at +250. I envy those getting 12GHz+... I'd rather have that than an extra 50MHz on the core.

OC the core first, then find your max mem. If you lose a few bins off your core, that's fine. You'll find your frame rates are still higher versus lowering the mem for those last few core MHz, and if they aren't, it's because the memory is unstable.
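A sketch of what those memory offsets buy, using the 1080 Ti's stock figures (352-bit bus, 11000MHz effective) and treating the offsets as double-counted toward the effective rate:

```python
# Memory bandwidth for the 1080 Ti's 352-bit bus at a given
# effective GDDR5X rate (stock is 11000 MHz effective).
BUS_BITS = 352

def bandwidth_gbs(effective_mhz):
    """Effective memory clock (MHz) -> bandwidth in GB/s."""
    return effective_mhz * 1e6 * BUS_BITS / 8 / 1e9

for mhz in (11000, 11500, 12000):  # stock, +250 offset, +500 offset
    print(f"{mhz} MHz -> {bandwidth_gbs(mhz):.0f} GB/s")
# 11000 -> 484 GB/s, 11500 -> 506 GB/s, 12000 -> 528 GB/s
```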


----------



## pez

Quote:


> Originally Posted by *Silent Scone*
> 
> Mine is quite literally next to my head (open chassis), and I can barely hear it besides the chokes, which is to be expected. I've not modified the fan profile in any way.


What is your fan speed at?


----------



## viz0id

Quote:


> Originally Posted by *octiny*
> 
> On both the Titan X Pascal and the Ti, overclocking the memory helps immensely. They love bandwidth.
> 
> OCing the memory +500 on my Titan X Ps gives me 90% of what +170 on the core gives. Testing done @ 4K via ROTR, The Division, FS Ultra.
> 
> Same thing on the Ti; unfortunately the memory OCs like crap on one of my cards, so I'm stuck at +250. I envy those getting 12GHz+... I'd rather have that than an extra 50MHz on the core.
> 
> OC the core first, then find your max mem. If you lose a few bins off your core, that's fine. You'll find your frame rates are still higher versus lowering the mem for those last few core MHz, and if they aren't, it's because the memory is unstable.


Thanks for the detailed input. I got the Strix OC on air and I can get 2063MHz stable in the games I play (Andromeda and BF1). But if I increase the memory to +250, it throttles and won't go past 1950 on the core or so. Do I need to give it more voltage on the curve, or just dial back the core and find a middle ground?


----------



## octiny

Quote:


> Originally Posted by *viz0id*
> 
> Thanks for the detailed input. I got the Strix OC on air and I can get 2063MHz stable in the games I play (Andromeda and BF1). But if I increase the memory to +250, it throttles and won't go past 1950 on the core or so. Do I need to give it more voltage on the curve, or just dial back the core and find a middle ground?


I don't think I've ever dropped more than 2 bins after maxing out the memory on my XPs. On the Ti at +250 I drop no bins at all. Losing 100+ on the core with just +250 on memory is definitely not normal. I would play around with the curve/voltage, run some high-resolution tests, etc.


----------



## viz0id

Quote:


> Originally Posted by *octiny*
> 
> I don't think I've ever dropped more than 2 bins after maxing out the memory on my XP's. Ti's at +250 I drop no bins at all. Losing 100+ on the core with just +250 on memory is definitely not normal. I would play around with the curve/voltage, run some high resolution tests etc


That's why I haven't bothered much with the memory. Even with the default curve, or a fixed +50 or whatever on core, if I cross +200 memory it's like I choke the whole card; it runs rampant around 1850-1950 and doesn't know what to do. I feel I've nailed stable core clocks with the curve on this card, but once I add memory clocking to the mix, everything goes out the window.

EDIT: It's even worse than I thought. I took 2 screenshots. The first is 1999MHz on core and +100 memory, at 1.05v. The second screenshot is just adding +50 more to mem without changing anything. Look how it behaves: the voltage crashes, it goes up and down 100 on the core, and it can't get stable anywhere. Default voltage curve.




Any ideas? I have an open return; should I return it?


----------



## mtbiker033

Quote:


> Originally Posted by *MRCOCOset*
> 
> Palit 350W Bios on ASUS GTX 1080TI FE. Work
> 
> 
> 
> 
> 
> 
> 
> My video >>>


well that was interesting!







that's quite the set-up there

It was interesting watching the nvidia GPU information on the side screen; it's crazy that the watts jump around that much during the benchmark, sometimes a swing of 50W or more depending on the scene.


----------



## BrainSplatter

Quote:


> Originally Posted by *viz0id*
> 
> Any ideas?


Can you post the curve, or try without the custom curve?


----------



## viz0id

Quote:


> Originally Posted by *BrainSplatter*
> 
> Can u post the curve or try without custom curve.


Sure. With the curve removed, my card defaults to 1987 clocks, but has Pwr and VRel perfcaps. At least it has a stable clock and voltage. Then I add memory clock (this time I got to +200 before it started acting up).

EDIT: It fluctuates between almost 1750 and 1910-something. It looks terrible.


----------



## BrainSplatter

Looks weird. AB version? What happens when you raise the voltage? Maybe open a separate thread.


----------



## viz0id

Quote:


> Originally Posted by *BrainSplatter*
> 
> Looks weird. AB version ? What happens when u raise voltage? Maybe open a separate thread.


Doesn't seem to have any effect. AB version is the latest stable one, 4.3.0.

For reference, here is my card running my stable core-only OC:


----------



## ALSTER868

Quote:


> Originally Posted by *viz0id*
> 
> EDIT: It's even worse than I thought. I took 2 screenshots. The first is 1999MHz on core and +100 memory, at 1.05v. The second screenshot is just adding +50 more to mem without changing anything. Look how it behaves: the voltage crashes, it goes up and down 100 on the core, and it can't get stable anywhere. Default voltage curve.


Same thing here. Memory, when OCed, is a big power eater, so we have to find a balance between OCing the memory and the core so we don't hit the limits and get this crazy freq and volt jumping.


----------



## viz0id

Quote:


> Originally Posted by *ALSTER868*
> 
> Same thing here. Memory when OCed is a great power eater so we should find kind of balance between ocing memory and core not to hit the limits and getting this crazy freq and volt jumping.


Good to see that it's not just me. But you say it's a great power eater, and it shows "Pwr" in the perfcaps, yet the TDP hardly goes over 103%, so I just can't wrap my head around why it does that.


----------



## ALSTER868

Quote:


> Originally Posted by *viz0id*
> 
> Good to see that it is not just me. But you say it is a great power eater, and it shows "Pwr" on perf caps. But the TDP hardly goes over 103% so i just cant seem to wrap my head around why it does that.


I can see power hovering around 112-115% when the volts and frequency start getting unstable. At 103% they are still stable.
BTW, as far as I can tell from testing my FE, it's almost impossible to get a stable 2000MHz and over on an FE in most modern 4K games, maybe except BF1 - that's the only demanding title that lets me see the magic stable 2GHz.


----------



## Nico67

Quote:


> Originally Posted by *MRCOCOset*
> 
> Palit 350W Bios on ASUS GTX 1080TI FE. Work
> 
> 
> 
> 
> 
> 
> 
> My video >>>


While I agree you are peaking at 330W+, you are power limiting like crazy. Just look at how much it's clocking down









Show me the same test with more stable clocks, and GPU-Z with the max values on power. Barely limiting, I'm pulling closer to 300W averages.


----------



## viz0id

Quote:


> Originally Posted by *ALSTER868*
> 
> I can see power hovering around 112-115% when the volts and frequency start getting unstable. At 103% they are still stable.
> BTW, as far as I can tell from testing my FE, it's almost impossible to get a stable 2000MHz and over on an FE in most modern 4K games, maybe except BF1 - that's the only demanding title that lets me see the magic stable 2GHz.


If I set PerfCap to "MAX", the highest max TDP% I get is 104.2% while it is bouncing around.

So I can choose to have 2063 on core and +100 mem, or 1999 on core and +200 on mem. I even tried underclocking to like 1900 on the core; I still can't go over +200 memory. Feels like something is really, really wrong.

Might end up using my 45-day open return on the card. It has some pretty hefty coil whine as well.


----------



## Roadrunners

Quote:


> Originally Posted by *BrainSplatter*
> 
> Hitting the 'voltage limit' is normal GPU boost behaviour. Your new card just hits it more often than your old one because it has a higher VID. My assumption was, that with an identical voltage curve, the behaviour would be the same for your 2 cards. But it seems to be more complex than that. I already noticed with my 2 FE cards (their VIDs are also different) that the voltage points in the editor seems to jump around a little different for each card when I press 'apply', even when I try to use the exact same curve.
> 
> In the end, the one thing which matters is which maximum OC u get from your new card. Since that depends on silicon lottery and your new one might not get as high as your old one. But if it runs stable with something between 2000-2050 Mhz then that would be a perfectly fine result. The new card will probably run a little hotter/louder than the old one due to the higher VID.
> 
> Should get another new card, it might have an even higher VID, overclock less or maybe hit 2100Mhz with 1v
> 
> 
> 
> 
> 
> 
> 
> Just a matter of luck.


You're right, the second card does pull a higher voltage than my first card: 1.063v vs 1.050v. I haven't spent much time overclocking, but it seems to be capable of 2050MHz in the Valley benchmark on the stock FE cooler. The card will be on water if I decide to keep it, so temps aren't an issue.

I was really only concerned as I have had 3 different 1080 Ti FE cards and this was the only one with a constant voltage-limit warning popping up at load, and no matter what I do I can't get rid of it. Temp and power limits are easy to negate, and my thinking is the voltage limit should be too, but that doesn't seem to be the case.

Anyway, I will bench this card some more, and if it's stable I will keep it, although I must say I do not like seeing that voltage limit constantly on, especially at stock.


----------



## mshagg

Here's a similar video that shows the difference between GPU-Z and nvidia-smi. I think even with monitoring set at 0.1sec, GPU-Z just isn't sampling the card often enough, compared to the 119 samples nvsmi takes over a ~2-second period.

Of course, that also means GPU-Z won't be seeing all of the perfcaps that occur between its 0.1-second intervals.

The run is FS Extreme. It TDP-limits like crazy in the first scene, which isn't surprising given nvsmi is reading > 330W. The second scene starts around the 50-sec mark: 2075MHz @ 1.075V and plenty of peak readings > 300W from nvsmi. GPU-Z doesn't show any perfcaps but, as I suggested, that doesn't mean they aren't there; it might just be missing them.
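Putting that sampling gap in numbers (using the 119-samples-over-~2s figure above and GPU-Z's fastest 0.1s interval):

```python
# Polling-rate comparison: GPU-Z's fastest setting vs nvidia-smi's
# observed sampling (119 samples over ~2 s, per the post above).
gpuz_hz = 1 / 0.1    # 10 samples per second
nvsmi_hz = 119 / 2   # 59.5 samples per second

print(f"{nvsmi_hz:.1f} Hz vs {gpuz_hz:.0f} Hz "
      f"(~{nvsmi_hz / gpuz_hz:.0f}x)")  # 59.5 Hz vs 10 Hz (~6x)
```

A power spike shorter than 100ms can therefore trigger a perfcap without ever appearing in GPU-Z's max column.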


----------



## KingEngineRevUp

Quote:


> Originally Posted by *mshagg*
> 
> OK so found a spot on the curve, with some mem OC, that pushes the Palit BIOS to the power limit. 2050Mhz at 1.061V, with some power limiting in those first few scenes of SuPo.
> 
> Peak TDP was 109%, which equates to 327W.
> 
> I think it's a stretch to say we're going to get 350W from it. Presumably we are running up against the sub-limit for the motherboard's power supply, given Palit would be drawing that extra power from the 2x8-pin. That said, these are far better results than I've had with any of the other BIOSes.


Thanks for looking into this


----------



## ALSTER868

Quote:


> Originally Posted by *viz0id*
> 
> If i put on PerfCap on "MAX" the highest max TDP% i get is 104,2%, while it is bouncing around.
> 
> So i can choose to have 2063 on core and +100 mem. Or i can have 1999core and +200 on mem. I even tried to underclock and use like 1900 on core. Still can't go over +200 memory. Feel like something is really really wrong..


This kind of behaviour really does seem weird. I'm also hitting the power limits severely, but not the way you are. I think it's worth considering returning it.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *MRCOCOset*
> 
> Palit 350W Bios on ASUS GTX 1080TI FE. Work
> 
> 
> 
> 
> 
> 
> 
> My video >>>


Hey, this is great; thank you for doing a thorough test on this. Looks like your test is in line with mshagg's: the BIOS gives an extra 30 watts. But as other people have found, it either crashes, or it works but doesn't score as well as other BIOSes.

You're definitely hitting power limits though; you're probably holding higher clocks than stock, but 30 watts isn't enough to eliminate power limiting.

So far this is the highest TDP of all the BIOSes.


----------



## mshagg

Quote:


> Originally Posted by *viz0id*
> 
> If i put on PerfCap on "MAX" the highest max TDP% i get is 104,2%, while it is bouncing around.
> 
> So i can choose to have 2063 on core and +100 mem. Or i can have 1999core and +200 on mem. I even tried to underclock and use like 1900 on core. Still can't go over +200 memory. Feel like something is really really wrong.
> 
> Might end up using my 45 day open return on the card. It has got some pretty hefty coil whine as well.


Just testing my hypothesis against these observations: GPU-Z, at its lowest setting, will only poll the GPU every 0.1 seconds. I believe GPU Boost is working at a far higher frequency than this; nvsmi, for example, can take 119 observations over a ~2-second period, about six times as often as GPU-Z.

So who really knows what is actually happening when looking at GPU-Z's max values; it may have spiked well above 104% TDP between the 0.1-second polls.


----------



## viz0id

Quote:


> Originally Posted by *mshagg*
> 
> Just testing my hypothesis against these observations: GPU-Z, at its lowest setting, will only poll the GPU every 0.1 seconds. I believe GPU Boost is working at a far higher frequency than this; nvsmi, for example, can take 119 observations over a ~2-second period, about six times as often as GPU-Z.
> 
> So who really knows what is actually happening when looking at GPU-Z's max values; it may have spiked well above 104% TDP between the 0.1-second polls.


That is true. But when run long enough, would it not catch some higher values over a longer period? Eventually, one would think, it would poll during a moment of higher TDP.

And if it is true as you said, and it is in fact power throttling, I find it weird that just +10 more on the memory makes it go totally bananas. It's like there are no perfcaps whatsoever, then I increase the mem clock by +10 and it goes mental.

Really annoying that I can't make sense of it.

EDIT: I can even run the card at 2088 without power throttling, but once the temp goes above something in the 30-40C range it drops to 2075, then down again to 2062 somewhere in the 50s (the soft temp throttle).


----------



## KingEngineRevUp

Quote:


> Originally Posted by *viz0id*
> 
> That is true. But when run long enough, would it not catch some higher values over a longer period? Eventually, one would think, it would poll during a moment of higher TDP.
> 
> And if it is true as you said, and it is in fact power throttling, I find it weird that just +10 more on the memory makes it go totally bananas. It's like there are no perfcaps whatsoever, then I increase the mem clock by +10 and it goes mental.
> 
> Really annoying that I can't make sense of it.
> 
> EDIT: I can even run the card at 2088 without power throttling, but once the temp goes above something in the 30-40C range it drops to 2075, then down again to 2062 somewhere in the 50s (the soft temp throttle).


Look at the images; I posted an image showing a graph of voltage, temperature and core. I'm on my phone, so I can't repost it.


----------



## viz0id

Quote:


> Originally Posted by *SlimJ87D*
> 
> Look at the images, I posted a image showing a graph of voltage, temperature and core. I'm on my phone so I can't repost it.


Sorry, I clicked back quite a few pages but couldn't find it. What page was it on? Was it today or some other day?

Just to clarify, I understand the voltage curve and the soft temp throttle etc. What I don't understand is what makes my card go bananas with a moderate memory clock when the core clock is well above average.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *viz0id*
> 
> I'm sorry i clicked quite a few pages back but couldn't find it. What page was this at? Is it today or some other day?
> 
> Just to clarify i understand the voltage curve and the soft temp throttle etc. What i dont understand is what is making my card go bananas with a moderate memory clock when the core clock is well above average.


Sorry, it's actually not what you're looking for. But I saw an article relating core clocks and memory before.

There is no fixed relationship between memory OC and core OC; they can be completely random in how they affect one another.

In the upper right you can see all the images in this thread; it is near the end, as it was posted a few days ago.

Or easier, click my profile and look at my images for this thread.


----------



## foolycooly

Got my SC2 in yesterday and did a fresh install of w10 on a new SSD. I haven't had a chance to do much of anything other than a quick test game of PU battlegrounds on ultra and some FFXIV. The card is definitely not downclocking to an idle frequency which I kind of expected. I didn't have this issue with my 780ti, but I am running 1x144hz and 2x60hz panels and am aware that this is an issue on pascal cards. The card is idling at 1480 core and temperatures are in the mid 50s. Under sustained gaming load it capped at 72c on the stock fan curve and was somewhere in the upper 1900s core (i wasn't really watching). I still have my side panel off and I couldn't hear the card at all over my h100i, even under load. Also love the way it looks!

I plan to start with only capping the power slider in XOC and adjusting the fan curve to a more aggressive one. I don't plan on manually overclocking this card, but I am interested to see what kind of gains (if any) I see by just moving sliders and adjusting the curve. I will be playing the dark souls III expansion tonight and can report back with some results.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *foolycooly*
> 
> Got my SC2 in yesterday and did a fresh install of w10 on a new SSD. I haven't had a chance to do much of anything other than a quick test game of PU battlegrounds on ultra and some FFXIV. The card is definitely not downclocking to an idle frequency which I kind of expected. I didn't have this issue with my 780ti, but I am running 1x144hz and 2x60hz panels and am aware that this is an issue on pascal cards. The card is idling at 1480 core and temperatures are in the mid 50s. Under sustained gaming load it capped at 72c on the stock fan curve and was somewhere in the upper 1900s core (i wasn't really watching). I still have my side panel off and I couldn't hear the card at all over my h100i, even under load. Also love the way it looks!
> 
> I plan to start with only capping the power slider in XOC and adjusting the fan curve to a more aggressive one. I don't plan on manually overclocking this card, but I am interested to see what kind of gains (if any) I see by just moving sliders and adjusting the curve. I will be playing the dark souls III expansion tonight and can report back with some results.


The card should be downclocking to an idle state. Some people have their power management set to "maximum performance" globally, which will often keep it from idling at all.

Make sure you don't have that set in global settings; just leave it at optimal power. It's not worth the hassle. If you want high performance, set it on a per-game or per-app basis.

If none of the above applies to you, then I don't know. My GPU downclocks to 139 MHz fine with a 1440p 144 Hz monitor on my desktop, with power usage going down to 5-8%.


----------



## foolycooly

Quote:


> Originally Posted by *SlimJ87D*
> 
> The card should be downclocking to an idle. People have their performance set to "maximum performance" globally. So that will cause it not to idle quite often or at all.
> 
> Make sure you don't have this setting set in global settings. Just leave it at optimal power. Not worth the hassle. If you want to set high performance, do it per a game or app basis.
> 
> If none of the above applies to you, then I don't know. My GPU downlocks to 139 mhz fine with a 1440P 144hz monitor on my desktop with power usage going down to 5-8%


Based on the countless threads/posts on the topic, it seems to be related to running multiple monitors at mixed refresh rates (not sure if you are running more than your 144hz panel or not). I do have the power settings in NCP set to "optimal" (do not have maximum performance set globally). I have a list of a few things I want to try tonight to see if I can correct the issue, but I'm not overly concerned if I can't. Even in the mid 50s only one of the two fans spins and it's silent. It would be nice to get those idle temps down into the 20s or 30s, but I can't imagine running 50-60 24/7 would significantly shorten the lifespan of the card.

I would like to see how the stock "aggressive" fan curve does in games. I was hoping to see under 70 at full sustained load.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *foolycooly*
> 
> Based on the countless threads/posts on the topic, it seems to be related to running multiple monitors at mixed refresh rates (not sure if you are running more than your 144hz panel or not). I do have the power settings in NCP set to "optimal" (do not have maximum performance set globally). I have a list of a few things I want to try tonight to see if I can correct the issue, but I'm not overly concerned if I can't. Even in the mid 50s only one of the two fans spins and it's silent. It would be nice to get those idle temps down into the 20s or 30s, but I can't imagine running 50-60 24/7 would significantly shorten the lifespan of the card.
> 
> I would like to see how the stock "aggressive" fan curve does in games. I was hoping to see under 70 at full sustained load.


You're right, I'm only running one monitor.

Hopefully you find a solution and report back to us. But meantime, could you put your monitor on the desktop to 120 Hz and have it do max monitor refresh rates for games? I did that when the bug was really bad on maxwell cards. It seemed to work.

Gaming would go to 144 hz, desktop would be at 120 hz.


----------



## foolycooly

Quote:


> Originally Posted by *SlimJ87D*
> 
> You're right, I'm only running one monitor.
> 
> Hopefully you find a solution and report back to us. But meantime, could you put your monitor on the desktop to 120 Hz and have it do max monitor refresh rates for games? I did that when the bug was really bad on maxwell cards. It seemed to work.
> 
> Gaming would go to 144 hz, desktop would be at 120 hz.


Yeah, thanks, I will definitely report back if I find a solution.

Yeah, thanks, I will definitely report back if I find a solution.

I did try changing to 120 Hz and it didn't make a difference, which is odd because that corrects the issue for most people. My quick list of things to try, based on what has worked for other people:


- Enable K-Boost in XOC and then disable it again
- Shut down the PC fully, then power back on rather than doing a soft reset (didn't do this during my experimenting)
- Change one of my side 60 Hz panels to the main display in Windows, then revert back to the 144 Hz center panel being the main display
- Run all 3 monitors off of DisplayPort (right now I have 144 Hz DP, 60 Hz DP, 60 Hz DVI)
- Toggle G-Sync to "full screen only" then back to "full screen and windowed"
- Kill extraneous Nvidia processes in Task Manager (someone mentioned Nvidia "Share" processes that, when killed, resolved the issue); also check GPU usage through NCP to see if a process is using a lot of GPU
- DDU clean install of the latest driver with ONLY the 144 Hz panel connected; after install, plug the secondary monitors back in
- Use Nvidia Inspector to enable multi-display power saving mode (I would prefer not to do this)


----------



## viz0id

So I have no idea what I did. Well, I do, but I have no idea why it had such an impact.

I just reinstalled Afterburner and made a fresh curve and everything. I've ended up being able to run around 2037-2062 on the core through all of Superposition with +300 on the mem (which I never could do before). The TDP% now makes sense too, actually showing me hitting 120% when it gets throttled.

Pushed over 10k for the first time


----------



## BrainSplatter

Quote:


> Originally Posted by *Roadrunners*
> 
> Pushed over 10k for the first time


Congrats! Fortunately it seems to have been a software problem and not a hardware one.


----------



## viz0id

Quote:


> Originally Posted by *BrainSplatter*
> 
> Congrats! Fortunately it seems to just have been a software- and not a hardware problem.


Thanks! I have no idea why it says posted by Roadrunners, but yeah


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Everyone's different. The Aorus I tried was supposed to be "quiet", but at 100% fan speed, it sounded no quieter than my FE's @ 100%


crazy. My Aorus at 100% fans is about like an FE at 40% to my ears. I wonder how much difference there really is from card to card. It sounds like every component can have variance; a card may be 50% louder, or 50% better at not leaking voltage, etc.


----------



## Scooby Boostin

Quote:


> Originally Posted by *Slackaveli*
> 
> crazy. My Aorus at 100% fans is about like an FE at 40% to me ears. I wonder how much difference there really is between card to card. It sounds like every component can have variance. A card may be 50% louder of 50% better at not leaking voltages, etc etc


I don't understand comments like this. The FEs are way too loud in my case, and I have an H440. My 980 Ti SC was a louder two-fan card and it was dead silent in my case. That card's replacement, the 1080 Ti Strix OC, just sounds like the computer is off. FEs are only good from EVGA, since you would want/need to water cool them and not lose out on the warranty. To say they perform the same acoustically on AIR is insane IMO.


----------



## Slackaveli

Quote:


> Originally Posted by *viz0id*
> 
> Sure removed the curve, then my card default clocks to 1987, but has Pwr and VRel perf caps. But it has stable clock and voltage at least. Then i add memory clock (this time i got to 200 before it started acting up).
> 
> EDIT: It fluctuates between almost 1750 and 1910 something. It looks terrible.


so, your mems are just trash OCers, but the card still wants bandwidth. I would stay at +178 on the memory and go with your best curve. Then you at least hit the 500 GB/s mark on bandwidth, you avoid the strange downclocking behavior altogether, and you don't lose any bins.
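
For anyone wondering where a figure like 500 GB/s comes from: a rough, illustrative calculation, assuming a stock memory clock of 5505 MHz (as Afterburner reports it), offsets added directly to that figure, double data rate, and the 1080 Ti's 352-bit bus:

```python
# Theoretical memory bandwidth for a 1080 Ti at a given Afterburner
# memory offset. The 5505 MHz stock figure and offset handling are
# assumptions based on how Afterburner displays the clock.

BUS_WIDTH_BITS = 352  # 1080 Ti memory bus width

def bandwidth_gbps(clock_mhz: float, offset: float = 0.0) -> float:
    """Theoretical bandwidth in GB/s for a DDR clock plus an offset."""
    effective_mts = (clock_mhz + offset) * 2  # double data rate
    return effective_mts * BUS_WIDTH_BITS / 8 / 1000  # bits -> bytes, MB -> GB

print(f"stock: {bandwidth_gbps(5505):.1f} GB/s")      # ~484 GB/s
print(f"+178:  {bandwidth_gbps(5505, 178):.1f} GB/s")  # ~500 GB/s
```

Which is why +178 is roughly where the card crosses 500 GB/s.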
Quote:


> Originally Posted by *Scooby Boostin*
> 
> I don't understand the comments like this. The FE's are way too loud in my case and I have an h440. My 980ti SC was a louder two fan card and it was dead silent in my case. That cards replacement the 1080ti Strix OC it just sounds like the computers off. FE's are only good from EVGA since you would want to/need to water cool them. also not lose out on warranty. To say they perform the same acoustically on AIR is insane IMO.


The Aorus is just hella quiet, man, I don't know what to tell you. Temp-wise (after a re-paste, with 100% fans, which are quiet) it also performs almost as well as a Hybrid AIO does on an FE: ~55C loaded in games.


----------



## Scooby Boostin

Quote:


> Originally Posted by *Slackaveli*
> 
> crazy. My Aorus at 100% fans is about like an FE at 40% to me ears. I wonder how much difference there really is between card to card. It sounds like every component can have variance. A card may be 50% louder of 50% better at not leaking voltages, etc etc


I was responding to the same comment you were. I assumed it would copy what you replied to. I know those are quiet; the FEs are not.


----------



## Slackaveli

Quote:


> Originally Posted by *viz0id*
> 
> Doesn't seem like it has any effect. AB version is the latest 4.3.0 stable one.
> 
> For reference here is my card running my stable core only OC:


well, damn, son, go get the latest Afterburner so you can give it a drink of volts. It needs a +25 or maybe a +50, and it may stabilize.
Quote:


> Originally Posted by *Scooby Boostin*
> 
> I was responding to the comment you were. I assumed it copied what you replied to. I know those are quiet. The FE's are not.


yeah, no doubt. The FE at 100% was worse than a PS4. And I don't get how his ASUS is too loud, either. Maybe he got a weird one. But he did say it was more the sound signature or pitch than the decibels that bothered him.


----------



## Jeweettoch13

I want to know if the 1080 Ti reference cards are also suitable for the white GeForce GTX logo mod, before I sand the logo to bits.


----------



## Clukos

Quote:


> Originally Posted by *SlimJ87D*
> 
> Amazing, even with the shunt mod, some power limiting occurs. Drew about 385-400 Watts here.


This is weird, I got 11054 GPU score with no shunt mod, drawing 100 watts less: http://www.3dmark.com/3dm/19479512

Maybe your CPU is holding you back? What is your memory OC?

Edit: Oh 1513, maybe that's what is holding back your score.


----------



## eXteR

Hi guys,

thermal question here.

I bought an Arctic thermal pad.

Do you know if I'll get any benefit if I put a thermal pad between the backs of the components behind the GPU and the backplate?

The back of the GPU chip doesn't make contact with the backplate, so it isn't dissipating any heat.

Is it safe to put a thermal pad there, or better to leave it as is?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Clukos*
> 
> This is weird, I got 11054 GPU score with no shunt mod, drawing 100 watts less: http://www.3dmark.com/3dm/19479512
> 
> Maybe your CPU is holding you back? What is your memory OC?
> 
> Edit: Oh 1513, maybe that's what is holding back your score.


Memory is at +550, which is like 6050 MHz. GPU-Z doesn't know how to report it.

I don't think it's my CPU. Time Spy and all the 3DMark software is jacked up on my system; it literally takes 15 minutes to finally start a run, so I have issues running it. Sometimes it doesn't even give me a score: it will error right away, then the benchmarks start and run all the way through anyway. Initially it didn't even launch until I messed around with some things on my system.

Superposition just works best for me. But again, I am CPU bottlenecked, my minimums could be higher if I had a better CPU.

I haven't redone this test with +550 memory though, below is +450 memory.



Not sure if messing with the control panel will increase my scores. I haven't done anything in my Nvidia control panel, turned off Aero for Windows 10, etc. Anyone have a comment on how this stuff affects your score?
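
For anyone confused by the "+550 is like 6050 MHz" bit: a rough sketch of how the Afterburner offset maps to the reported numbers, assuming Afterburner's stock readout of 5505 MHz and that offsets add directly to it (the exact GDDR5X clock multipliers are an assumption here):

```python
# Mapping an Afterburner memory offset to the displayed and effective
# clocks on a 1080 Ti. Stock figures are assumptions based on what
# Afterburner shows for this card.

STOCK_DDR_MHZ = 5505  # Afterburner's stock memory readout

def afterburner_clock(offset_mhz: float) -> float:
    """Clock as Afterburner displays it after applying an offset."""
    return STOCK_DDR_MHZ + offset_mhz

def effective_rate(offset_mhz: float) -> float:
    """Effective data rate in MT/s (double data rate)."""
    return afterburner_clock(offset_mhz) * 2

print(afterburner_clock(550))  # 6055 -> "like 6050 MHz"
print(effective_rate(550))     # 12110 MT/s effective
```

GPU-Z reporting a different number is just it showing the underlying command clock rather than this figure.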


----------



## KingEngineRevUp

Quote:


> Originally Posted by *eXteR*
> 
> Hi guys,
> 
> thermal question here.
> 
> I bought an Arctic Thermal Pad.
> 
> Do you know if i'll get any benefit, if i put a thermal pad between the back of the GPU resistors and the backplate?
> 
> The back of the GPU chip, don't make contact on the backplate, so is not disipating any heat.
> 
> It's safe to put a thermal pad there or better leave as is?


Read the OP and look at my shunt mod post. I added a Fujipoly thermal pad back there and it's helping. How much? Not sure, maybe 1 or 2C?


----------



## Midian

MSI GeForce GTX 1080 Ti Gaming X 11GB on a 5960X and Rampage V Extreme board.

Idle temps around 25 degrees with custom fan profile 40%



Load temps (on Gaming mode) around 55 degrees with fan going up to around 65%

During Superposition Benchmark 1080p Extreme test:

Date 2017-04-25 18:30:31
GPU Core Clock [MHz] 1936.0
GPU Memory Clock [MHz] 1377.0
GPU Temperature [°C] 53.0
Fan Speed (%) [%] 63
Fan Speed (RPM) [RPM] 1638
Memory Used [MB] 3512
GPU Load [%] 99
Memory Controller Load [%] 31
Video Engine Load [%] 0
Bus Interface Load [%] 0
Power Consumption [% TDP] 86.8
PerfCap Reason [] 4
VDDC [V] 1.0620

It's obviously my gigantic case (Xigmatek Elysium) with negative airflow that allows the card to be this cool.


----------



## Dasboogieman

Quote:


> Originally Posted by *eXteR*
> 
> Hi guys,
> 
> thermal question here.
> 
> I bought an Arctic Thermal Pad.
> 
> Do you know if i'll get any benefit, if i put a thermal pad between the back of the GPU resistors and the backplate?
> 
> The back of the GPU chip, don't make contact on the backplate, so is not disipating any heat.
> 
> It's safe to put a thermal pad there or better leave as is?


Yes, that is exactly the principle the Gigabyte Aorus copper cold plate works on. However, the backplate alone will not do much; you also have to add some active cooling to see an effect. The improvement ranges from 2-5 degrees passively cooled, up to 15 degrees with substantial active cooling.


This mod of mine got 15 degrees of improvement.


----------



## Foxrun

Quote:


> Originally Posted by *viz0id*
> 
> So i have no idea what i did. Well i do, but i have no idea why it had such an impact.
> 
> I just reinstalled afterburner and did a fresh curve and everything. Ended up now being allowed to run around 2037-2062 on core all Superposition and +300 on the mem (which i never could do before). The TDP% now makes sense, actually showing me getting to 120% when it gets throttled.
> 
> Pushed over 10k for the first time


Prior to reinstalling Afterburner, were you getting power throttled a lot? Even at low voltages in Time Spy?


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> You're right, I'm only running one monitor.
> 
> Hopefully you find a solution and report back to us. But meantime, could you put your monitor on the desktop to 120 Hz and have it do max monitor refresh rates for games? I did that when the bug was really bad on maxwell cards. It seemed to work.
> 
> Gaming would go to 144 hz, desktop would be at 120 hz.


i got the fix: switch the primary monitor to the lower-Hz monitor and reboot. Then you can even switch back and you'll be back on your low power settings. I run a 4K/60 and a 1440p/144 and I get that bug quite a bit. That's the best fix I've found; it always works and takes less than a minute. PitA, but still, not bad.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Memory is at +550 which is like 6050 mhz. GPU-Z doesn't know how to report it.
> 
> I don't think it's my CPU, Timespy and all 3DMark software is jacked up on my system. It literally takes it 15 minutes to finally start a run. So I have issues running it. Sometimes it doesn't even give me a score, it will error right away and then the benchmarks start and will all run through anyways. Initially it didn't even launch until I messed around with some things on my system.
> 
> Superposition just works best for me. But again, I am CPU bottlenecked, my minimums could be higher if I had a better CPU.
> 
> I haven't redone this test with +550 memory though, below is +450 memory.
> 
> 
> 
> Not sure if messing with the control panel will increase my scores. I haven't done anything to my nvidia control panel, turning off aero for Windows 10, etc. Anyone have a comment on how this stuff affects your score?


yeah, you'll get +100 or so by turning off vsync, anisotropic filtering, etc. Just turn off anything with a perf hit and set the other stuff to high performance.


----------



## viz0id

Quote:


> Originally Posted by *Foxrun*
> 
> Prior to reinstalling afterburner, were you getting power throttled alot? Even at low voltages on timespy?


I only had power throttle when I clocked the memory. I could run 2025 at 1.000 V with no throttle if I kept the memory at stock.


----------



## Foxrun

Quote:


> Originally Posted by *viz0id*
> 
> Only had power throttle when i clocked memory. I could run 2025 at 1.000v with no throttle, if i kept memory at Stock.


I'm going to try this when I get home. I get power throttling at 2012 @ 1.000 V.


----------



## rakesh27

Hey all, I recently purchased an EVGA FE 1080 Ti and I'm loving it. From the get go I put on the Kraken G10 and a Corsair AIO, which I had from a Zotac AE 1080...

What I wanted to know: I put heatsinks on all the RAM chips, MOSFETs and small GPU components so I could get rid of heat quickly. I'm getting a 2100 MHz overclock; my Zotac AE 1080 would only do 2088 MHz with the same setup.

Since my EVGA FE 1080 Ti is a reference card and my Zotac AE 1080 is not, what's the difference between them, meaning a Founders Edition versus a non Founders Edition?

Thanks all.


----------



## cluster71

Something to read

http://www.gamersnexus.net/hwreviews/2881-gigabyte-gtx-1080-ti-aorus-xtreme-review-benchmarks

Gigabyte Aorus Xtreme Backplate Tests: Does it Work?

Based on our testing, it does not appear that Gigabyte's copper insert is aiding with temperatures in any way whatsoever.


----------



## trickeh2k

Quote:


> Originally Posted by *cluster71*
> 
> Something to read
> 
> http://www.gamersnexus.net/hwreviews/2881-gigabyte-gtx-1080-ti-aorus-xtreme-review-benchmarks
> 
> Gigabyte Aorus Xtreme Backplate Tests: Does it Work?
> 
> Based on our testing, it does not appear that Gigabyte's copper insert is aiding with temperatures in any way whatsoever.


People didn't know already? I thought this was really old news; I've seen it covered on a lot of sites as well as YT channels. But a backplate sure makes the card (in general) look a lot schmexier.


----------



## Slackaveli

Quote:


> Originally Posted by *Foxrun*
> 
> Im going to try this when I get home. I get power throttling with 2012 @ 1.000


whenever I get a new GPU, even same series, I always clean install the drivers, Afterburner/RivaTuner, and any other software that "tweaks" GPUs. It may leave remnants that mess things up if you don't. Likewise, if you have been changing BIOSes, you need to re-install the drivers and Afterburner then too, or you are potentially mucking up the works.


----------



## Baasha

hey guys - slightly off topic (not really), but what is the best way to utilize all the CUDA cores or power of the 1080 Ti for video encoding/rendering? Adobe Media Encoder CC, even with "GPU" selected, barely uses the GPUs - having 4x 1080 Ti's, I would assume rendering video should be super fast.

Is there any other program (maybe Handbrake etc.) that can and does use the GPUs to render/encode videos? I've been searching long and hard for that and it just doesn't seem to happen. I've tried Sony Vegas, Premiere Pro CC, Media Encoder CC and Handbrake, and none of them fully utilize the GPUs to encode/render.

Would appreciate advice on this.

TIA


----------



## Slackaveli

Quote:


> Originally Posted by *trickeh2k*
> 
> People didn't know already? I thought this was really old news, seen it been covered on a lot of sites as well as YT channels. But a backplate sure makes the card (in general) look a lot schmexier.


Quote:


> Originally Posted by *cluster71*
> 
> Something to read
> 
> http://www.gamersnexus.net/hwreviews/2881-gigabyte-gtx-1080-ti-aorus-xtreme-review-benchmarks
> 
> Gigabyte Aorus Xtreme Backplate Tests: Does it Work?
> 
> Based on our testing, it does not appear that Gigabyte's copper insert is aiding with temperatures in any way whatsoever.


actually it gains you 15C. GamersNexus < Overclockers.net

See the Aorus owners' thread. You have to put a heatsink/fan in the mix to get it to help, but the backplate/copper is all making contact; the backplate is working perfectly. Gamers Nexus, in their infinite wisdom, did NOTHING to aid dissipation, so of COURSE it "did nothing".

http://www.overclock.net/t/1627238/gigabyte-aorus-geforce-gtx-1080-ti-owners-thread/340

Check out @dasboogieman 's rig. I just use two 100 mm x 50 mm finned heatsinks and a Noctua F-12







works, bros.

Again, I'll say this... Overclockers.net > review sites. EVERY SINGLE TIME. This is a Mastermind Alliance situation: if you've ever studied PMA (positive mental attitude), the mastermind alliance is always greater than a few dudes who just tear everything down and speculate.


----------



## Alwrath

Hey guys, my Founders Edition tops out at 1950-1975 MHz on stock voltage; I am power limited. I downloaded and flashed the Palit BIOS last night and have since reverted back to the EVGA FE BIOS, but the test was interesting.

I ran Heaven at 2100 MHz core by mistake, at 116% power, and it was running fine for a bit until the temps hit 80C and then it crashed.

My question is, how high do you think I could push this EVGA FE with the EVGA AIO cooler? Right now I get 62C while gaming, and that's without a repaste. I have Thermal Grizzly Hydronaut on the way so I can try repasting and see what I get too, but again I'm power limited atm, so I'm thinking Palit BIOS + AIO would work wonders for me. Thoughts?


----------



## Slackaveli

Quote:


> Originally Posted by *Alwrath*
> 
> Hey guys my founders edition tops out at 1950 - 1975 mhz on stock voltage. I am power limited. I downloaded and flashed to the Palit bios last night and have since reverted back to the evga FE bios, but the test was interesting.
> 
> I ran Heaven at 2100 mhz core by mistake, at 116 power, and it was running fine for a bit till the temps hit 80C and then it crashed.
> 
> My question is, how high do you think I could push this EVGA FE with the EVGA AIO cooler? Right now I get 62C while gaming and thats without a repaste. I have Grizzly Hydronaught on the way so I can try repasting and see what I get too, but again im power limited atm so im thinking Palit bios + AIO would work wonders for me. Thoughts?


I would, if affordable. That card could probably hold it; it is just in bad thermal need. The repaste will help a lot. For the money, that's the best mod. It's like $2 per degree lol. But it only gets you 5-8C, depending on how bad the factory TIM is. My FE was just like yours, but was able to hold 2000 after the re-pasting. It probably had 2063 in it, but I didn't want to spend another hundred on cooling (I had $70 extra in it already because I had to pay tax), so I just returned it, grabbed an Aorus, and kept the $50 change lol. Then, because I always pay with PayPal (it gives all the perks of a CC), they even paid me back for return shipping









BTW, if you buy from Newegg and wish you could get a refund but are stymied by their "exchange only" policy: if you bought with PayPal, they will MAKE Newegg refund you, or will do it themselves. It has to be "not as described", though, but that would include not holding OC settings that are pre-programmed in, like, Aorus software. Or coil whine, since they describe it as nearly silent. Those types of complaints. NOT just that it won't clock as high as you'd like. It may take a couple weeks, but I'm just throwing that out there. Some CCs would probably do the same. So, us Mericans can get those EU protections through PayPal, basically. That's what I was going to do if my regular Aorus turned out terrible and didn't at least hold its "OC MODE", but boy did it NOT turn out terrible; it holds the EXTREME's OC mode plus 25 more..


----------



## KingEngineRevUp

Quote:


> Originally Posted by *cluster71*
> 
> Something to read
> 
> http://www.gamersnexus.net/hwreviews/2881-gigabyte-gtx-1080-ti-aorus-xtreme-review-benchmarks
> 
> Gigabyte Aorus Xtreme Backplate Tests: Does it Work?
> 
> Based on our testing, it does not appear that Gigabyte's copper insert is aiding with temperatures in any way whatsoever.


Yeah, but my test has shown otherwise.

I used a thermal gun. The backplate was 40C before; after adding the Fujipoly pad it reads 60C, so heat is obviously transferring into it.


----------



## alucardis666

New driver

http://www.guru3d.com/files-details/geforce-381-89-whql-driver-download.html


----------



## veg28

Quote:


> Originally Posted by *Slackaveli*
> 
> I would, if affordable. That card could probably hold it, it is just in bad thermal need. The repaste will help a lot. For the money, that's the best mod. It's like $2 per degree lol. But it only gets you 5c-8c, depending on how badly the factory tim is. My FE was just like yours, but was able to hold 2000 after the re=pasting. It probably had 2063 in it but I didnt want to spend another hundred on cooling (i had $70 extra in it already because i had to pay tax), so i just returned, grabbed an aorus, and kept the $50 change lol. Then b/c I always pay with paypal (it gives all the perks of a CC), they even paid me back for return shipping
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BTW, if you buy from Newegg and wish you could get a refund and are stymied by their "exchange only" policy. If you bought with Paypal, they will MAKE newegg refund you or will do it themselves. has to be "not as described", though, but that would include not holding OC settings that are pre-programmed in , like, aorus software. Or coil whine, since they desribe it as nearly silent. Those types of complaints. NOT just it won't clock as high as I'd like it too. May take a couple weeks though, but Im just throwing that out there. Some CCs would probably do the same. So, us Mericans can get those EU protections thru paypal, basically. What I was going to do if my regular aorus turned out terrible and not at least hold it's "OC MODE", but boy did it NOT turn out terrible, it holds the EXTREME'S OC mode plus 25 more..


Just buy it from amazon.com when in stock; you can choose to eat the return shipping cost, or if you are in an area that has $0.00 drop-off lockers then you are in luck! Or there's the less ethical, but I suspect common, approach of "defective" (i.e. coil whine) for a full refund.


----------



## Slackaveli

http://wccftech.com/intel-x299-hedt-skylake-x-kaby-lake-x-launch-26-june-nda/

May 30 for the reveal, June availability for Skylake-X/Kaby Lake-X
Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, but my test has shown otherwise.
> 
> I used a thermal gun. Backplate was 40C. After adding fujipoly, it is now at 60C so obviously heat is transferring to it.


guys coming on THIS website trying to disprove us with Gamer's Nexus videos. LMAO
Quote:


> Originally Posted by *veg28*
> 
> Just buy if from amazon.com when in stock, you can choose to eat the return shipping cost or if you are in area that has $0.00 drop off lockers then you are in luck! Or the less ethical but I suspect common approach of "defective" i.e. coil whine for a full refund.


I would have, but they were WEEKS late to the party. They ain't getting me to wait an extra month after release. They were TRIPPIN.


----------



## eXteR

Quote:


> Originally Posted by *SlimJ87D*
> 
> Read the OP and look at my shunt mod post. I added a fujipoly thermal pad back there and it's helping. How much? Not sure, maybe 1 or 2C?


how thick are those Fujipoly pads?

I bought 1.5 mm ones. Is that enough?


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> New driver
> 
> http://www.guru3d.com/files-details/geforce-381-89-whql-driver-download.html


I'd wait. The forum thread on it doesn't look promising: higher CPU usage, 10% lower benches, and in CPU-heavy games as well. Wait for the (now inevitable) hotfix.


----------



## Clukos

Quote:


> Originally Posted by *Slackaveli*
> 
> I'd wait. The forum on it doesnt look promising. Higher cpu usage, 10% lower benches, and games that are cpu heavy as well. Wait for the (now inevitable) hotfix.


Maybe it's the Ryzen driver; I got better scores in some initial tests


----------



## KingEngineRevUp

Quote:


> Originally Posted by *eXteR*
> 
> how thick are thous fujipoly?
> 
> I bought 1.5mm ones. It's enough?


That's thicker than needed, but don't worry, it's like clay; you can reform it, and it'll more than likely compress to fit.

1mm is perfect.


----------



## Foxrun

Is it possible to not run into a power limit on Time Spy? No matter how low I drop the voltage, I still hit the power limit.


----------



## joder

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, but my test has shown otherwise.
> 
> I used a thermal gun. Backplate was 40C. After adding fujipoly, it is now at 60C so obviously heat is transferring to it.


I wonder if it is helping in any other way than heat dissipation. I did notice better and more stable overclocks after going on water, however, I used fujipoly pads as opposed to the EK pads. I am willing to bet that these have helped too. Unfortunately I can't quantify it. I would love to see someone on an ICX card to compare stock EVGA pads to Fujipoly since there are loads of temp monitors.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Foxrun*
> 
> Is it possible to not run into a power limit on time spy? No matter how low I drop the voltage, I still hit the power limit.


It's just power hungry. I've pointed out that you need around 375-400 watts to run it without power limiting. Even the extra 30 watts from the Palit bios isn't enough.

A 1080 Ti needs 375+ watts to reach its true potential.

But let's be honest here, that's just about a 2% difference in performance. And most average people can barely hit 2000 MHz, so it only affects maybe 5% of owners who are watercooling and won the silicon lottery.
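To put that gap in numbers, here is a rough sketch. The 250 W reference TDP and 120% slider cap are the FE's published figures; the +15 W and +30 W BIOS deltas are the ones quoted in this thread, and the 375 W target is just the low end of the estimate above:

```python
# Power budget available under each BIOS vs. the ~375 W estimated to avoid
# throttling. 250 W reference TDP and the 120% cap are the FE's published
# figures; the +15 W / +30 W deltas are the ones quoted in this thread.
FE_TDP_W = 250
SLIDER_MAX = 1.20  # the FE power slider tops out at 120%

budgets = {
    "FE stock": FE_TDP_W * SLIDER_MAX,
    "ASUS/MSI BIOS (+15 W)": FE_TDP_W * SLIDER_MAX + 15,
    "Palit BIOS (+30 W)": FE_TDP_W * SLIDER_MAX + 30,
}

TARGET_W = 375  # low end of the 375-400 W estimate above
for name, watts in budgets.items():
    print(f"{name}: {watts:.0f} W (still {TARGET_W - watts:.0f} W short)")
```

Even the most generous cross-flash tops out around 330 W, which is why the shunt mod keeps coming up.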


----------



## lilchronic

Quote:


> Originally Posted by *Foxrun*
> 
> Is it possible to not run into a power limit on time spy? No matter how low I drop the voltage, I still hit the power limit.


Try .950v or lower.


----------



## Foxrun

Ah, so it is normal? I've got the AIO water coolers on them and temps never exceed 47C under Time Spy. I've seen some users report no power limiting when benching with Time Spy, but it seems impossible! I am worried that it affects performance somehow.


----------



## GRABibus

Quote:


> Originally Posted by *Foxrun*
> 
> Ah, so it is normal? I've got the AIO water coolers on them and temps never exceed 47C under Time Spy. I've seen some users report no power limiting when benching with Time Spy, but it seems impossible! I am worried that it affects performance somehow.


What's your card? MSI Sea Hawk?


----------



## Slackaveli

Quote:


> Originally Posted by *Clukos*
> 
> Maybe it's the Ryzen driver, I got better scores from some initial tests


interesting. i may give it a try.


----------



## Foxrun

Quote:


> Originally Posted by *GRABibus*
> 
> What's your card? MSI Sea Hawk?


I picked them both up from the nvidia store. I put the EVGA hybrid coolers on them.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Foxrun*
> 
> Ah, so it is normal? I've got the AIO water coolers on them and temps never exceed 47C under Time Spy. I've seen some users report no power limiting when benching with Time Spy, but it seems impossible! I am worried that it affects performance somehow.


Well, a good question is what card they are using and what BIOS. The ASUS and MSI BIOSes give 15 more watts; Palit gives 30 more. So at 1.000v that extra 15 to 30 watts can keep them from throttling.

With a shunt mod, they won't throttle at all.


----------



## Foxrun

Ill try flashing them when I get home from work and update this!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Foxrun*
> 
> Ill try flashing them when I get home from work and update this!


Well, don't get your hopes too high. Performance with other BIOSes has been mixed. In my experience, back before we had power-hungry benchmarks, they didn't score as high on Heaven as the FE stock bios even without power limiting. So that extra power might just leave you where you are, hit thermal throttling quicker, or give worse results.

Just do some thorough testing.


----------



## nrpeyton

Quote:


> Originally Posted by *Dasboogieman*
> 
> Yes, that is the exact basis as to how the Gigabyte Aorus copper cold plate works. However, the backplate alone will not do much, you have to also put some additional active cooling to see an effect. The improvements range from 2-5 degrees if passively cooled and up to 15 degrees for massive active cooling.
> 
> 
> This mod here of mine got 15 degrees of improvement.


Impressive mod 

Quote:


> Originally Posted by *SlimJ87D*
> 
> Well, a good question is what card they are using and what BIOS. The ASUS and MSI BIOSes give 15 more watts; Palit gives 30 more. So at 1.000v that extra 15 to 30 watts can keep them from throttling.
> 
> With a shunt mod, they won't throttle at all.


How can we be sure the power numbers are reported correctly after a cross-flash? *(Could 10 reported watts = 9 true watts?)* _(Different VRMs could be calibrated differently.)_

Last time I cross-flashed, I was getting ridiculously low power readings (like under 110w on a 1080 Classy) but FPS was only dropping 10-20% and voltages were either the same or higher.


----------



## alucardis666

Worth it to Flash my FE cards if I'm seeing 0 perf caps?


----------



## Foxrun

Quote:


> Originally Posted by *alucardis666*
> 
> Worth it to Flash my FE cards if I'm seeing 0 perf caps?


If you're not getting any caps then I wouldn't change the bios.


----------



## joder

New game ready drivers out today. Not sure if it will change much but worth a shot.

http://www.geforce.com/drivers/results/117914


----------



## KingEngineRevUp

Quote:


> Originally Posted by *nrpeyton*
> 
> Impressive mod
> 
> How can we be sure the power numbers are reported correctly after a cross-flash? *(Could 10 reported watts = 9 true watts?)* _(Different VRMs could be calibrated differently.)_
> 
> Last time I cross-flashed, I was getting ridiculously low power readings (like under 110w on a 1080 Classy) but FPS was only dropping 10-20% and voltages were either the same or higher.


That's why we're waiting for you to do your power draw test from the wall.

That's the last piece of evidence we can rely on.

We can't trust software 100% because we're tricking both software and hardware. Unless someone can dive in and tell us how the code reads what is going on, we're stuck speculating.

A readout from the wall is the most surefire way.
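A minimal sketch of why the software numbers go sideways after a shunt mod: the card infers current from the voltage drop across a sense resistor, so lowering its effective resistance scales every reading down. The 5 mOhm values below are illustrative assumptions, not measurements of this card:

```python
# Parallel-resistance model of a shunt mod (values are assumptions for
# illustration). The card computes power from V_drop / R_shunt, so a lower
# effective R means the firmware under-reports the true draw.
def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

R_STOCK = 0.005  # ohms: assumed 5 mOhm stock current-sense shunt
R_ADDED = 0.005  # ohms: assumed path added across it by the liquid metal

r_eff = parallel(R_STOCK, R_ADDED)
scale = r_eff / R_STOCK
print(f"effective shunt: {r_eff * 1000:.2f} mOhm")
print(f"software now reports ~{scale:.0%} of the true power draw")
```

With equal resistances the card would report half the real wattage, which is exactly why a wall-meter reading is the only number left to trust.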


----------



## joder

Quote:


> Originally Posted by *SlimJ87D*
> 
> That's why we're waiting for you to do your power draw test from the wall.
> 
> That's the last piece of evidence we can rely on.
> 
> We can't trust software 100% because we're tricking both software and hardware. Unless someone can dive in and tell us how the code reads what is going on, we're stuck speculating.
> 
> A readout from the wall is the most surefire way.


Figure out how I can fix my borked power meter and I'd be more than happy to test. The damn thing was off voltage-wise and I tried to calibrate it, which messed it up further. Oh well.


----------



## lilchronic

EVGA FE stock bios, no shunt mod.
Voltage and clock during the test @ 4K were .981-1.012v and 2000MHz-2038MHz, with a few seconds up to 2050MHz.


When you run these power-hungry tests you just need to raise the overclock for the lower voltage bins in the curve.


----------



## Lefty23

Quote:


> Originally Posted by *KedarWolf*
> 
> Okay peeps, need to add your card info on our form in bottom of OP, even if we had it before. Additional info, better OP.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Shoutout to @joder for making the form and updating the spreadsheet!!


Done.
Congrats on taking over the OP (also thanks to the guys that help maintain/improve it). Hopefully, in a couple of weeks, I will have more spare time and I won't be reading 4-5 days old posts.

I wasn't sure what was required for the "clocks" info. Benchmarking speeds, or 24/7 ones? Anyway, I used my "worst" 24/7 settings to date, i.e. 2100c/[email protected] (original BIOS - no shunt mod).
As I mentioned already, not enough time at the moment, but last weekend I installed the free game I got with the card - GR Wildlands - and ran around for a couple of hours (1440p Ultra preset with motion blur disabled).
There are games that run the same clocks @1.050V but I haven't tested for long periods of time (GR-W crashed after ~40 mins at this voltage).
In the spoiler below is a screenshot for reference and also the logfile from GPU-Z, which shows I don't hit any limits other than Util after ~2.5h.


Spoiler: Warning: Spoiler!





2.5hGRWildlands2100c12000m.txt 1612k .txt file




I'm sure that once I try other games I will find ones that hit the PL and downclock or are unstable at those clocks. It was the same with my (also WCed) 980ti though.
There were a few games I could play at 1550/[email protected], most were ok at 1530/[email protected] and then there always was the one that would have me downclock to 1500/[email protected] so as not to crash.


----------



## Alwrath

Quote:


> Originally Posted by *Slackaveli*
> 
> I would, if affordable. That card could probably hold it, it is just in bad thermal need. The repaste will help a lot. For the money, that's the best mod. It's like $2 per degree lol. But it only gets you 5c-8c, depending on how bad the factory tim is. My FE was just like yours, but was able to hold 2000 after the re-pasting. It probably had 2063 in it but I didn't want to spend another hundred on cooling (I had $70 extra in it already because I had to pay tax), so I just returned, grabbed an aorus, and kept the $50 change lol. Then b/c I always pay with paypal (it gives all the perks of a CC), they even paid me back for return shipping


Just pre-ordered the EVGA Hybrid 1080ti cooler from B+H. I didn't have to pay tax on my 1080ti because I was lucky enough to get it straight from EVGA on launch day while I was at work. Should have it May 4th. I will update you guys on what clocks I can get out of this thing with the Palit bios. I'm excited


----------



## OneCosmic

I tested the Palit GameRock BIOS and I see no improvement in power limit over the stock BIOS - I set the PL to 350W with the nvidia-smi command. SUPO is only stable at 1.0V on the stock or ASUS BIOS, and it downclocks the same as the Palit even at only 1.025V.


----------



## KraxKill

2100 @ 1.063v

Broke through 23000 on my pedestrian 4790K. Single card outscoring a few SLI results.



Now I have to stop and wait for Skylake/Kaby + Volta. No seriously! I need to stop and play some games now.


http://www.3dmark.com/fs/12446981


----------



## OneCosmic

Maybe because stock Fire Strike is a CPU test? Run Extreme, Ultra or Time Spy and then compare.


----------



## Benny89

Question:

Do I need to touch the voltage slider when I have a voltage curve set?

Because I was playing around with the voltage curve alone, and increasing % on the voltage slider did not seem to do anything to my clocks....

So if I have 2063 at 1.062V, do I need to change anything on the voltage slider, or can it sit at 0%?


----------



## mechwarrior

Flashed the Palit bios; it allows me to run 2100 in Heaven and the Superposition benchmark without hitting the power limit. But the strange thing is my scores are lower??? I tried all the bioses and this is the only one that allows me to run well at 2100.


----------



## RMBR

True or false?
Two 1080 Tis in SLI underperform.
In games like Wildlands, performance is equal to 1080 SLI.
FAIL?


----------



## Benny89

Quote:


> Originally Posted by *RMBR*
> 
> True or false?
> Two 1080 Tis in SLI underperform.
> In games like Wildlands, performance is equal to 1080 SLI.
> FAIL?


SLI is fail nowadays in 90% of cases....


----------



## KraxKill

Quote:


> Originally Posted by *OneCosmic*
> 
> Maybe because Stock FireStrike is a CPU test? Run Extreme, Ultra or TimeSpy and then compare.


What's your point? But sure, here you go. Not bad, huh?

I'd rather upgrade my CPU and mobo than get a second card and deal with micro stutter... but that's just me.

Time Spy

http://www.3dmark.com/spy/1595031

FS Ultra

http://www.3dmark.com/fs/12445747


----------



## feznz

SLI sorts the noobs out. My current build is the first non-SLI system I have had since the 8800 GTX days, and I only had minor issues, mainly when trying to play a game a few days after launch.

I guess, a bit like the Superposition bench, SLI was only broken for a few days till it got patched.

I am thinking of getting another 1080 Ti to finally max out games


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> That's why we're waiting for you to do your power draw test from the wall.
> 
> That's the last piece of evidence we can rely on.
> 
> We can't trust software 100% because we're tricking software and hardware. Unless if someone can dive in and tell us how the code reads what is going on, we're stuck speculating.
> 
> A readout from the wall is the most sure proof way.


Understood.

Thursday it is then.

I'll include methodology/exactly how the test was carried out & a walkthrough plus numbers; possibly in real time using a camera, or screenshots + photos with befores/afters.
Should be pretty straightforward.

Fingers crossed it's giving something ;-)

Anyone know if there's a VRM analysis anywhere for the PALIT card? (Showing the name of the chip at the heart of the power monitoring - _for instance, Texas Instruments UT6 D501 and VRM chip UP9571_.)


----------



## KingEngineRevUp

Quote:


> Originally Posted by *OneCosmic*
> 
> I tested the Palit GameRock BIOS and i see no improvement regarding power limit over stock BIOS - i set the PL to 350W with nvsmi command. SUPO is only stable with 1.0V on stock or ASUS BIOS and downclocks the same as Palit even only on 1.025V.


Still need someone with a Corsair Link-capable PSU to do the test for us. I honestly don't know if I trust software, since we're technically manipulating it.
Quote:


> Originally Posted by *nrpeyton*
> 
> Understood.
> 
> Thursday it is then.
> 
> I'll include methodology/exactly how the test was carried out & a walkthrough plus numbers, either at the time or in real time (I'll know more once I begin and am able to get a feel for it). But it's pretty straightforward.
> 
> Fingers crossed it's giving something ;-)


Thank you. I wish I could do it, but it took quite a while to install the shunt mod and I don't want to undo it, lol.


----------



## Foxrun

Quote:


> Originally Posted by *Benny89*
> 
> SLI is fail nowadays in 90% of cases....


SLI is pretty well supported for the most part and is needed right now to max games at 4K 60fps.


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> Still need someone with a Corsair Link-capable PSU to do the test for us. I honestly don't know if I trust software, since we're technically manipulating it.
> Thank you. I wish I could do it, but it took quite a while to install the shunt mod and I don't want to undo it, lol.


np

did you use liquid metal? (if so, which one, and any reason for that preference?)

(don't worry; I'll do the test _before_ I shunt mod myself).. not even sure I am going to do the shunt mod yet.... (ha; who am I kidding; myself probably.. it's more a matter of _when_ than _if_ lol).


----------



## KingEngineRevUp

Quote:


> Originally Posted by *nrpeyton*
> 
> np
> 
> did you use liquid metal? (if so, which one, and any reason for that preference?)
> 
> (don't worry; I'll do the test _before_ I shunt mod myself).. not even sure I am going to do the shunt mod yet.... (ha; who am I kidding; myself probably.. it's more a matter of _when_ than _if_ lol).


I used CLU and I insulated the resistor and solder joints using liquid electrical tape which is really cheap and removable.

Before you shunt mod, you can see if more voltage actually does anything for you.

Voltage allows me to go from 2088 MHz to 2114 MHz and requires 1.075-1.092v to run. So that power limited me when I was at that voltage.

So before you shunt mod, ask yourself

1. Does your card get higher clocks at higher voltages? A lot of people don't benefit from higher voltages.
2. Can your card do a run through Heaven benchmark and do some scenes at these voltages and clocks? My card was able to run at 2114 Mhz through certain scenes of Heaven, so that gave me an idea if the shunt was worth it for me or not.

If your answer is yes to both of these, then shunt mod might be worth it. If not, then shunt mod won't be worth it.

For me personally, it's nice playing Witcher 3 with the shunt mod. Previously, during intense fight scenes my card would be at 2088, then go down to 2073, 2068 and back up to 2088. It was jumping around and when it jumped around, I would experience stutter and slow downs even with G-Sync. Probably the card going berserk and having to recalculate TDP.

From my experience, if your maximum power draw is 300 watts, starting at 280 watts the card starts recalculating TDP and freaking out, since it's a safeguard to keep you from going past 300 watts. Although I have seen the card temporarily hit 330 watts stock a few times, it never lasts for more than a few seconds.

So be careful with these spikes if you see and read them. Try to read the power draw that is consistent. If, for example, Afterburner shows you hit 124% or 132% at some point, then that spike is in your data.



So when your card is reading 119%-120%, that's your real maximum power draw. Honestly, I would just point a phone at your wall reader and record it, and when you see power draw hit 119%-120%, stop the video 5 seconds after that. That way you know the last 10 seconds of your video captured the max power draw.

Then do it with the Palit bios, and when it hits 116% (hopefully it does), do the same thing. We don't want to read those temporary "spikes", which can be like 130% randomly.

Good luck! And thank you for investigating this in the name of science.
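The "ignore the one-off spikes" advice above can be sketched as a rolling-median filter over logged TDP-percent samples. The readings below are made up for illustration, not from a real log:

```python
# Sustained power draw = highest rolling median, which throws away
# single-sample spikes like a momentary 132% reading.
from statistics import median

def sustained_power(samples, window=5):
    """Return the max rolling median over `window` consecutive samples."""
    if len(samples) < window:
        return median(samples)
    return max(median(samples[i:i + window])
               for i in range(len(samples) - window + 1))

readings = [118, 119, 120, 132, 119, 120, 118, 124, 119]  # % TDP, two spikes
print(sustained_power(readings))  # 120: the 132 and 124 spikes are discarded
```

The same idea works on a GPU-Z log export: the sustained figure is what the limiter actually holds you to, while the spikes are transient.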


----------



## L0N3574R

Anyone get ahold of the new drivers yet?


----------



## KraxKill

Quote:


> Originally Posted by *Foxrun*
> 
> SLI is pretty well supported for the most part and is needed right now to max games at 4K 60fps.


Plug "your favorite card + SLI microstutter" into google and try to say that again.

I used to run nothing but SLI, and then I did the unthinkable and tried a single card. Lower ultimate FPS, yes, but the fluidity is noticeable. The lack of micro stutter and latency on a single card is alone worth it. Most are better off beefing up a single card over getting SLI.

I don't even game with max settings anyway, as I need ~100fps, so I game at 1440p. For competitive online play I turn off a lot of FPS killers like motion blur, and if that's not enough you can always turn down some settings. That's much more enjoyable than having no escape from microstutter. Once you notice it, there is no escape.

The power draw associated with it and the feeling of having a $700 paperweight in poorly optimized titles only adds to the grief. Not something I want to experience again.

I haven't even begun to talk about the lack of proper G-Sync support in SLI, where the cards are not 100% utilized and a single card actually performs better.

This thread has enough pain and frustration to have convinced me to not run 1080 in SLI and just run a single Ti.

https://forums.geforce.com/default/topic/946818/geforce-1000-series/gtx-1080-sli-low-gpu-usage-while-g-sync-enabled/

Maybe Volta will change all this, but don't hold your breath.

About the only use I see for SLI is in synthetic calculations, folding and GPU rendering...otherwise save yourself the $$, buy a better processor, better cooling or invest it in something worthy.


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> I used CLU and I insulated the resistor and solder joints using liquid electrical tape which is really cheap and removable.
> 
> Before you shunt mod, you can see if more voltage actually does anything for you.
> 
> Voltage allows me to go from 2088 MHz to 2114 MHz and requires 1.075-1.092v to run. So that power limited me when I was at that voltage.
> 
> So before you shunt mod, ask yourself
> 
> 1. Does your card get higher clocks at higher voltages? A lot of people don't benefit from higher voltages.
> 2. Can your card do a run through Heaven benchmark and do some scenes at these voltages and clocks? My card was able to run at 2114 Mhz through certain scenes of Heaven, so that gave me an idea if the shunt was worth it for me or not.
> 
> If your answer is yes to both of these, then shunt mod might be worth it. If not, then shunt mod won't be worth it.
> 
> For me personally, it's nice playing Witcher 3 with the shunt mod. Previously, during intense fight scenes my card would be at 2088, then go down to 2073, 2068 and back up to 2088. It was jumping around and when it jumped around, I would experience stutter and slow downs even with G-Sync. Probably the card going berserk and having to recalculate TDP.
> 
> From my experience, if your maximum power draw is 300 watts, starting at 280 watts the card starts recalculating TDP and freaking out, since it's a safeguard to keep you from going past 300 watts. Although I have seen the card temporarily hit 330 watts stock a few times, it never lasts for more than a few seconds.
> 
> So be careful with these spikes if you see and read them. Try to read the power draw that is consistent. If, for example, Afterburner shows you hit 124% or 132% at some point, then that spike is in your data.
> 
> 
> 
> So when your card is reading 119%-120%, that's your real maximum power draw. Honestly, I would just point a phone at your wall reader and record it, and when you see power draw hit 119%-120%, stop the video 5 seconds after that. That way you know the last 10 seconds of your video captured the max power draw.
> 
> Then do it with the Palit bios, and when it hits 116% (hopefully it does), do the same thing. We don't want to read those temporary "spikes", which can be like 130% randomly.
> 
> Good luck! And thank you for investigating this in the name of science.


Are you planning on leaving the CLU on after going through the "experimental" phase we all like to do when we first get new cards? Or removing the mod in a week or so?

And aye I can't wait to see what she's capable of; hope to god I don't get a lemon, lol.


----------



## Foxrun

Quote:


> Originally Posted by *KraxKill*
> 
> Plug in "your favorite card + SLI microstutter" into google and try to say that again?
> 
> I used to run nothing but SLI and then I did the unthinkable and tried a single card. Lower ultimate FPS yes but the fluidity is noticeable. The lack of micro stutter and latency on a single card is alone worth it. Most are better off to beef up a single card over getting SLI.
> 
> I don't even game with max settings anyway as I need ˜100fps so game at 1440p. For competitive online play I turn off a lot of FPS killers like motion blur and if that's not enough you can always turn down some settings. That's much more enjoyable than having no escape from microstutter. Once you notice it, there is no escape.
> 
> The power draw associated with it and the feeling of having a $700 paperweight in poorly optimized titles only adds to the grief. Not something I want to experience again.
> 
> I haven't even begun to talk about the lack of proper G-sync support in SLI. Where the cards are not 100% utilized and a single card actually performs better.
> 
> This thread has enough pain and frustration to have convinced me to not run 1080 in SLI and just run a single Ti.
> 
> https://forums.geforce.com/default/topic/946818/geforce-1000-series/gtx-1080-sli-low-gpu-usage-while-g-sync-enabled/
> 
> Maybe Volta will change all this, but don't hold your breath.
> 
> About the only use I see for SLI is in synthetic calculations, folding and GPU rendering...otherwise save yourself the $$, buy a better processor, better cooling or invest it in something worthy.


I get performance boosts in every game I play except for No Man's Sky, which doesn't have proper support. I never understood the argument against SLI when pushing for higher resolutions. If a game does not work well with SLI, then set the profile to single card. What about this thread has turned you off from SLI? All I've seen has been some extra steps to get Superposition to work.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *nrpeyton*
> 
> Are you planning on leaving the CLU on after going through the "experimental" phase we all like to do when we first get new cards? Or removing the mod in a week or so?
> 
> And aye I can't wait to see what she's capable of; hope to god I don't get a lemon, lol.


I'm leaving it on. You can still control how much power goes to the card via the power slider. It just scales like so:

75% = 300 watts
100% = 400 watts

I rarely hit 100% anyway; it mostly floats around 85-92%.

92% = 368 watts
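Those slider figures are just a linear scale. A tiny sketch, taking the ~400 W-at-100% figure above at face value (it's an estimate, since the modded card under-reports its own draw):

```python
# Post-shunt-mod slider mapping, assuming a ~400 W full scale at 100%
# (the poster's estimate, not a measured value).
MODDED_FULL_SCALE_W = 400

def slider_to_watts(percent: float) -> float:
    """Map a power-slider percentage to an approximate true wattage."""
    return MODDED_FULL_SCALE_W * percent / 100

for pct in (75, 92, 100):
    print(f"{pct}% -> {slider_to_watts(pct):.0f} W")
```

So after the mod the slider still works as a limiter; each percent is just worth about 4 W instead of 2.5 W.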
Quote:


> Originally Posted by *KraxKill*
> 
> Plug in "your favorite card + SLI microstutter" into google and try to say that again?
> 
> I used to run nothing but SLI and then I did the unthinkable and tried a single card. Lower ultimate FPS yes but the fluidity is noticeable. The lack of micro stutter and latency on a single card is alone worth it. Most are better off to beef up a single card over getting SLI.
> 
> I don't even game with max settings anyway as I need ˜100fps so game at 1440p. For competitive online play I turn off a lot of FPS killers like motion blur and if that's not enough you can always turn down some settings. That's much more enjoyable than having no escape from microstutter. Once you notice it, there is no escape.
> 
> The power draw associated with it and the feeling of having a $700 paperweight in poorly optimized titles only adds to the grief. Not something I want to experience again.
> 
> I haven't even begun to talk about the lack of proper G-sync support in SLI. Where the cards are not 100% utilized and a single card actually performs better.
> 
> This thread has enough pain and frustration to have convinced me to not run 1080 in SLI and just run a single Ti.
> 
> https://forums.geforce.com/default/topic/946818/geforce-1000-series/gtx-1080-sli-low-gpu-usage-while-g-sync-enabled/
> 
> Maybe Volta will change all this, but don't hold your breath.
> 
> About the only use I see for SLI is in synthetic calculations, folding and GPU rendering...otherwise save yourself the $$, buy a better processor, better cooling or invest it in something worthy.


I'm with you on that. I thought SLI 980 Ti was awesome until I got my single 1080 Ti. Can definitely notice the weird stuttering is gone when I play Witcher 3 or other games. It's just very smooth. There's no microsecond of a twitch going from one frame to another.


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> I'm leaving it on. You can still control how much power you want to go to the card via the power slider. It just affects it like so
> 
> 75% = 300 Watts
> 100% = 400 Watts
> 
> I rarely hit 100% anyways, it mostly floats around 85-92%


Aye; I was more worried about the liquid metal possibly eating away at the solder points. (It's not unheard of for a resistor to fall off as a result.)

I watched a buildzoid video where it was mentioned not too long ago... his answer was to try Conductonaut instead (_by Thermal Grizzly; their version of LM_).

_I got the impression he was only mentioning it as a "disclaimer".. he's never mentioned it in any of his VRM analysis vids since._

Not meaning to sound all the alarm bells; just something to be aware of  Better safe than sorry 

Also I'm not in any way claiming Conductonaut is any less likely to be a culprit. (Just the messenger on this one, I'm afraid.)


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Foxrun*
> 
> I get performance boosts in every game I play except for No Man's Sky, which doesn't have proper support. I never understood the argument against SLI when pushing for higher resolutions. If a game does not work well with SLI, then set the profile to single card. What about this thread has turned you off from SLI? All I've seen has been some extra steps to get Superposition to work.


Besides my post above, I decided not to go SLI for now and to wait on it (DX12 multi-GPU, please work!).

Battlefield 1 didn't support SLI correctly; it had a strange glitch that would actually cause burned-in text on your monitor.

https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=Battlefield+1+sli+flickering

Doom didn't even support SLI, since it was Vulkan.

DX12 doesn't support SLI.

I got tired of having to mess with profile settings.

And finally, there was slight microstuttering. It wasn't too noticeable since I adapted to it, but after getting a 1080 Ti and OCing it to be equal or greater than a 980 Ti SLI setup, I can see that the fluidity and smoothness of a single card has made a major difference to me.

Who knows, 2 years from now if I still have my 1080 Ti and they sell for cheap used, I'll grab a used one for $300 if DX12 Multi GPU takes off.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *nrpeyton*
> 
> Aye; I was more worried about the Liquid Metal possibly eating away at solder points. (It's not unheard of for a resistor to fall off as a result).
> 
> I watched a buildzoid video where it was mentioned, not too long ago... his answer was to try Conductonaut instead (_by Thermal Grizzly; their version of LM_)
> 
> Not meaning to sound all the alarm bells; but just something to be aware of  Better safe than sorry


No issues; the OP has been updated with my shunt mod. I'll post a picture here, but I applied liquid electrical tape on the solder joints, so CLU cannot touch them. It can only touch the resistor itself, the top metal part.



In the photo below, I took a tiny paint brush and painted 2 to 3 coats around the solder joint, covering the solder from any CLU run-off that could potentially happen.



The dude that had his resistor fall off never explained how it happened to him. I blame his possible negligence or the way his GPU was set up. Maybe he put too much on and it ran off to the side, or maybe he has a rig that mounts things upside down.

But yeah, I protected and coated the solder joint.


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> No issues, the OP has been updated with my shunt mod. I'll post a picture here, but I applied liquid electrical tape on the solder joint. So CLU cannot touch the solder joint. It can only touch the resistor itself, the top metal part.


very nice, impressive.. I've saved those images to my OC'ing folder on my desktop.

nice work & good thinking ;-)

Heatsink mods on the opening post look good too (especially the thermal pad between core and back-plate).

My last Liquid Electrical Tape mission on my 1080 Classified looked like this lol:

Was a painstaking nightmare to remove.

Next time I'm hoping not to make such a mess lol (so 2nd time lucky) ha.

_Worked a treat though, even with a full sheet of ice between the EK block & 1080 PCB, still never even so much as went into safe mode _


----------



## DabboPiff

I have an MSI 1080 Ti Gaming X overclocked to core +110, memory +400.

i7 6700k overclocked to 4.5

I can get a score just over 10k on Superposition, but at those same levels I crash on Time Spy and Fire Strike; at +90 core I get no crashes.

Should I lower my core to +90 or just leave it at the levels I have now?

Didn't touch my voltage.

Thanks


----------



## Alwrath

Quote:


> Originally Posted by *DabboPiff*
> 
> I have an MSI 1080 Ti Gaming X overclocked to core +110, memory +400.
> 
> i7 6700k overclocked to 4.5
> 
> I can get a score just over 10k on Superposition, but at those same levels I crash on Time Spy and Fire Strike; at +90 core I get no crashes.
> 
> Should I lower my core to +90 or just leave it at the levels I have now?
> 
> Didn't touch my voltage.
> 
> Thanks


Leave it at +90. You want the max stable OC.


----------



## KraxKill

Quote:


> Originally Posted by *SlimJ87D*
> 
> No issues, the OP has been updated with my shunt mod. I'll post a picture here, but I applied liquid electrical tape on the solder joint. So CLU cannot touch the solder joint. It can only touch the resistor itself, the top metal part.
> 
> 
> 
> In the photo below: I took a small paint brush and painted 2 to 3 coats around the solder joint, protecting the solder from any CLU run-off that could potentially happen.
> 
> 
> 
> The dude whose resistor fell off never explained how it happened to him. I blame possible negligence or the way his GPU was set up. Maybe he applied too much and it ran off to the side, or maybe his rig mounts the card upside down.
> 
> But yeah, I protected and coated the solder joint.


That's beautiful; similar to how I have mine done.

And to all the "resistor might fall off" folks: guess what you do if that happens? Go grab a soldering iron. Don't be scurrd. If you're afraid of the soldering iron, don't want to risk RMA issues, or are unwilling to part with some $$, well... don't do it.


----------



## Benny89

I have a question about PerfCap:

Why am I getting PerfCap: Pwr if my max TDP is 115% (PL slider is at 120%)? Isn't that strange?

Also, how do I get rid of VOp? Is it just a matter of increasing the voltage slider?


----------



## gmpotu

Check this out guys!



Been gaming like this for over an hour with no issues. I decided to forget the curve and just slide the whole default curve up and find a stable spot.

Is there anything wrong with the card running at 1.093v for 24/7 use if it seems stable like this?


----------



## mshagg

Quote:


> Originally Posted by *gmpotu*
> 
> Check this out guys!
> 
> 
> 
> Been gaming like this for over an hour with no issues. I decided to forget the curve and just slide the whole default curve up and find a stable spot.
> 
> Is there anything wrong with the card running at 1.093v for 24/7 use if it seems stable like this?


It seems like unnecessarily high voltage for 2037/2050 MHz, but I certainly wouldn't be worried about any negative impact on the card; especially given most of us wish Nvidia would let us play with more!


----------



## KraxKill

Quote:


> Originally Posted by *Benny89*
> 
> I have a question about PerfCap:
> 
> Why am I getting PerfCap: Pwr if my max TDP is 115% (PL slider is at 120%)? Isn't that strange?
> 
> Also, how do I get rid of VOp? Is it just a matter of increasing the voltage slider?


Is it constant Pwr, or maybe it's accounting for power spikes? GPU-Z only samples so often; try turning up the GPU-Z sample rate. Are you just picking it up here and there?


----------



## nrpeyton

Quote:


> Originally Posted by *mshagg*
> 
> It seems like unnecessarily high voltage for 2037Mhz/2050 but I certainly wouldnt be worried about any negative impact on the card; especially given most of us wish Nvidia would let us play with more!


Agreed, and you're only averaging 220 W; could maybe even get more out of her (for that game)?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KraxKill*
> 
> Thats beautiful, similar to how I have mine done.
> 
> And to all the "resistor might fall off folks". Guess what you do if that happens? Go grab a soldering iron. Don't be scurrd. If afraid of the soldering iron, don't want to risk RMA issues or unwilling to part with some $$, well...don't do it.


I did get the idea from you actually in your previous post here. Thanks again.


----------



## Benny89

Quote:


> Originally Posted by *KraxKill*
> 
> Is it constant Pwr, or maybe it's accounting for power spikes? GPU-Z only samples so often; try turning up the GPU-Z sample rate. Are you just picking it up here and there?


For example, at 1.050 V at 2050 it is here and there.

But at 1.063 V at 2063 I have constant Pwr, VRel and VOp. The voltage slider is maxed out, temps are below 60°C and max TDP is 118.7%.

I just have no idea why I am getting those perf caps or how to get rid of them...

Like VOp, for example...


----------



## s1rrah

Quote:


> Originally Posted by *SlimJ87D*
> 
> I used CLU and I insulated the resistor and solder joints using liquid electrical tape which is really cheap and removable.
> 
> Before you shunt mod, you can see if more voltage actually does anything for you.
> 
> Voltage allows me to go form 2088 Mhz to 2114 Mhz and requires 1.075-1.092v to run. So that power limited me when I was at that voltage.


Is CLU preferred for the shunt mod over something like a good old conductive ink pen because it's easily removed, or because of a certain amount of electrical conductivity? Seems like a conductive pen would be easier to work with, but it's been years since I did voltage mods on my old 580 LOL...


----------



## Nico67

Quote:


> Originally Posted by *nrpeyton*
> 
> Understood.
> 
> Thursday it is then.
> 
> I'll include methodology/exactly how test was carried out & a walkthrough plus numbers; possibly in real-time using camera or screenshots + photos with before's/after's.
> Should be pretty straight forward;
> 
> Fingers crossed its giving something ;-)
> 
> Anyone know if there's a VRM analysis anywhere for the PALIT card? (Showing the name of the chip at the heart of the power monitoring; _for instance, Texas Instruments UT6 D501 and VRM chip UP9571_.)


TechPowerUp review shows the VRM chip.


----------



## RavageTheEarth

I'm having a lot of luck with this 1 V challenge. I bumped up my core from 2012 to 2038 @ 1.00 V and it's completely stable. Before I learned about the lock function on the curve I was running 2063 @ 1.063 V. Now I just lock it at 1 V and the bins don't move. It's fantastic. I'm getting no PerfCaps, and with the locked bin gameplay seems smoother since the clock isn't always jumping all over the place. Been playing Ghost Recon Wildlands at this clock speed for a couple hours now. I'm really surprised I'm getting no PerfCaps. Will enjoy my 2038 tonight, and tomorrow after work I'll try to bump it up a little more.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *gmpotu*
> 
> Check this out guys!
> 
> 
> 
> Been gaming like this for over an hour with no issues. I decided to forget the curve and just slide the whole default curve up and find a stable spot.
> 
> Is there anything wrong with the card running at 1.093v for 24/7 use if it seems stable like this?


I don't even know what you're trying to show us here. Are you running Superposition, Time Spy or Firestrike? It doesn't look like it.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *RavageTheEarth*
> 
> I'm having a lot of luck with this 1v challenge. So I bumped up my core from 2012 to 2038 @ 1.00v and it's completely stable. Before I learned about the lock function on the curve I was running 2063 @ 1.063v. Now I just lock it at 1v and the bins don't move. It's fantastic. I'm getting no Perfcap's and with the locked bin and gameplay seems smoother since it's not always jumping all over the place. Been playing Ghost Recon Wildlands at this clock speed for a couple hours now. I'm really surprised I'm getting no Perfcaps. Will enjoy my 2038 tonight and tomorrow after work I'll try to bump it up a little more.


You're getting a GPU utilization perf cap; not sure if it's your CPU, or whether the card can't be fully utilized because you locked it. But locking isn't the way to go; you have to let these cards adjust themselves like they're meant to. Try doing some benchmarks and see if locking even helps you.


----------



## jleslie246

OK! I have my EVGA 1080Ti SC finally! Now what do I do?









It spins up to almost 2GHz with no adjustments playing Witcher 3. That game looks amazing now btw! going from a 780 to a 1080Ti.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *jleslie246*
> 
> OK! I have my EVGA 1080Ti SC finally! Now what do I do?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It spins up to almost 2GHz with no adjustments playing Witcher 3. That game looks amazing now btw! going from a 780 to a 1080Ti.


Just play games and enjoy the card. Don't post here, it'll eat your soul. You'll benchmark more than you'll actually game.


----------



## mechwarrior

Quote:


> Originally Posted by *SlimJ87D*
> 
> Just play games and enjoy the card. Don't post here, it'll eat your soul. You'll benchmark more than you'll actually game.


Agree 100%. I wish I never started down this path. I've watched the Heaven benchmark more than I've gamed since I got my 1080 Ti.

Hate you all!!
Lol


----------



## RavageTheEarth

Quote:


> Originally Posted by *SlimJ87D*
> 
> Youre getting gpu utilization perf cap, not sure if it's your cpu or the card can't be fully utilized because your locked it. But locking isn't the way to go, you have to let these cards adjust themselves like they're meant to. Try doing some benchmarks and see if locking even helps you.


I had no idea that grey meant a GPU perf cap, but the card is in an x16 slot on a Gigabyte G1 Gaming 7 with a 6700K @ a stable 4.7 GHz and 3400 MHz G.Skill Trident Z. I'll have to check, but I'm pretty sure I see that all the time even when I'm not locked. As for locking, it's easy to switch profiles in Afterburner. My card likes to jump up 3 or 4 bins, so it will be running 2000 core @ 125% power limit at 1.00 V, but then it will jump up bins and be running 1.063 V after a couple minutes in benchmarks; it just likes high voltages for some reason, so this solves that problem. Honestly I'm just messing around with the card with undervolting and stuff, but when EK releases the waterblock I want to see how the card performs under the same scenarios.


----------



## mechwarrior

Quote:


> Originally Posted by *RavageTheEarth*
> 
> I had no idea that grey meant gpu perf cap, but the card is in a x16 Gigabyte G1 Gaming 7 with a 6700k @ a stable 4.7 Ghz and 3400Mhz G.Skill Trident Z. I'll have to check, but I'm pretty sure I see that all the time even when I'm not locked. As for locking, it's easy to switch profiles in afterburner. My card likes to jump up 3 or 4 bins so it will be running 2000core @ 125% power limit at 1.00v, but then it will jump up bins and be running 1.063v after a couple minutes in benchmarks it just likes high voltages for some reason so this solves that problem. Honestly I'm just messing around with the card with undervolting and stuff, but EK releases the waterblock I want to see how the card outperforms under the same scenarios.


If it's green in GPU-Z it's power limited. Grey is perf cap utilization; I think it means the GPU is waiting to be used?? I get grey when my card is idle.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *RavageTheEarth*
> 
> I had no idea that grey meant gpu perf cap, but the card is in a x16 Gigabyte G1 Gaming 7 with a 6700k @ a stable 4.7 Ghz and 3400Mhz G.Skill Trident Z. I'll have to check, but I'm pretty sure I see that all the time even when I'm not locked. As for locking, it's easy to switch profiles in afterburner. My card likes to jump up 3 or 4 bins so it will be running 2000core @ 125% power limit at 1.00v, but then it will jump up bins and be running 1.063v after a couple minutes in benchmarks it just likes high voltages for some reason so this solves that problem. Honestly I'm just messing around with the card with undervolting and stuff, but EK releases the waterblock I want to see how the card outperforms under the same scenarios.


Are you sure locking is helping you? Have you compared benchmarks in several situations? If it is helping you, more power to you. These are just my findings personally.

Locking didn't help me at all. If there was a time my card was hitting power limits pre-shunt-mod, then it couldn't, and it would just run gimped with low GPU utilization, meaning the GPU wasn't being used to its full potential: grey. If there was a time my card actually wanted more voltage to make gameplay smoother or more stable, it couldn't get it: VRel orange. Not having a lock allowed my card to stay at 99% GPU utilization and avoid any voltage limitation; in other words, I was deactivating Boost 3.0 by locking my card.


----------



## RavageTheEarth

Quote:


> Originally Posted by *SlimJ87D*
> 
> Are you sure locking is helping you, have you compared benchmarks in several situations? If it is helping you, more power to you. These are just my findings personally.
> 
> Locking didn't help me at all. If there was a time my card was hitting power limits pre-shunt mod, then it couldn't and it would just run gimped with low GPU utilization meaning the GPU wasn't being used to its full potential. Grey. If there was a time my card actually wanted to access more voltage to make gameplay smoother or more stable, it couldn't. Vrel orange. Not having a lock allowed me card to stay at 99% GPU utilization and avoid any voltage limitation, in other words I was deactivating boost 3.0 by locking my card.


Oh, so grey is GPU utilization? If you look at the afterburner monitor you can see that while gaming I was at a steady 98-99% GPU utilization. I was on the desktop for a little while before snapping the picture.


----------



## Qba73

Quote:


> Originally Posted by *mechwarrior*
> 
> Agree 100% I wish I never started down this path. I've watched heaven benchmark more since I got my 1080ti than game.
> 
> Hate you all!!
> Lol


lol true story


----------



## Slackaveli

2100 on air club? How many of us are there?


----------



## KedarWolf

Has anyone tested the MSI Armor OC BIOS? It has a 330 W power limit.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Foxrun*
> 
> Ah so it is normal? Ive got the AIO water coolers on them and temps never exceed 47c under timespy. Ive seen some users report no power limiting when benching with timespy but it seems impossible! I am worried that it effects performance somehow.
> 
> 
> 
> Well, a good question is what card they are using and what BIOS. The Asus and MSI BIOSes give 15 more watts; Palit gives 30 more watts. So at 1.000 V that extra 15 to 30 watts can keep them from throttling.
> 
> With a shunt mod, they won't throttle at all.
Click to expand...

Okay, more testing of Palit over stock using the latest drivers: Superposition is quite a bit higher with stock even though clocks are lower, Time Spy is roughly the same as stock, and Heaven is 1.5 FPS better on stock than on the Palit BIOS.

Yes, you were right @SlimJ87D (eats crow), stock does seem to be better, I jumped the gun.


----------



## KraxKill

Quote:


> Originally Posted by *s1rrah*
> 
> Is CLU preferred for the shunt mod over something like good old conductive ink pens because it's easily removed or due to a certain amount of electric conductivity? Seems like a conductive pen would be easier to work with but it's been years since I did voltage mods on my old 580 LOL ..


You don't want to completely short the resistor. LM is not perfectly conductive, so a little on top is enough to drop the resistance only a little: enough to fool the shunts a good 20-30% or so, but not enough to completely short the resistor and cause the card to go into a fail-safe power state.
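To put rough numbers on the "partial short" idea: the monitoring circuit infers current from the voltage drop across a shunt resistor, and a film of liquid metal on top adds a parallel path that lowers the combined resistance, so the card under-reports power by that same ratio. The 5 mΩ shunt and 20 mΩ LM-path values below are illustrative assumptions, not measurements:

```python
def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

shunt = 0.005      # assumed stock shunt value: 5 milliohm
lm_path = 0.020    # assumed resistance of the liquid-metal film on top

r_eff = parallel(shunt, lm_path)
reported_fraction = r_eff / shunt            # fraction of true power the card reports
true_power_at_cap = 250 / reported_fraction  # what an assumed "250 W" reading really draws

print(f"effective shunt: {r_eff * 1000:.1f} mOhm")
print(f"card reports {reported_fraction:.0%} of true power")
print(f"a 250 W reading is really ~{true_power_at_cap:.0f} W")
```

With these assumed values the card reads 80% of true power, i.e. the shunts are fooled by about 25%, in line with the 20-30% figure above; a thicker or wider LM film (lower parallel resistance) fools them more.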


----------



## KedarWolf

Quote:


> Originally Posted by *s1rrah*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SlimJ87D*
> 
> I used CLU and I insulated the resistor and solder joints using liquid electrical tape which is really cheap and removable.
> 
> Before you shunt mod, you can see if more voltage actually does anything for you.
> 
> Voltage allows me to go form 2088 Mhz to 2114 Mhz and requires 1.075-1.092v to run. So that power limited me when I was at that voltage.
> 
> 
> 
> Is CLU preferred for the shunt mod over something like good old conductive ink pens because it's easily removed or due to a certain amount of electric conductivity? Seems like a conductive pen would be easier to work with but it's been years since I did voltage mods on my old 580 LOL ..
Click to expand...

A conductive pen won't work; it doesn't lower the electrical resistance nearly enough. It has to be CLU or Conductonaut, CLU preferred; some people have issues with Conductonaut not spreading very well.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> A conductive pen won't work. It doesn't lower the electrical resistance nearly enough. Has to be CLU or Conductonaut, CLU preferred, some people have issues with Conductonaut not spreading very well.


Yeah, I feel bad for the 3 to 5 people in the shunt thread that bought conductive pens and it only raised their TDP an extra 5%... They all got too excited by the idea and all bought the pens together.


----------



## KingEngineRevUp

New drivers seem good. Sounds like they're worth installing.


https://www.reddit.com/r/67gkfq/driver_38189_faqdiscussion_thread/


----------



## alucardis666

Quote:


> Originally Posted by *SlimJ87D*
> 
> New drivers seem good. Sounds like it's worth installing.
> 
> 
> https://www.reddit.com/r/67gkfq/driver_38189_faqdiscussion_thread/


Thanks for sharing


----------



## KedarWolf

I may have stopped the voltages jumping around, even on a straight voltage-curve line.

Right-click the .bat file, choose Edit to open it in Notepad, and change the number after -pl to your max power limit; you can look up your max power limit at:

https://www.techpowerup.com/vgabios/?architecture=&manufacturer=&model=GTX+1080+Ti&interface=&memType=&memSize=&since=

Open Task Scheduler (search for it in Run or Cortana) and add the .bat file to run at login: add it as a Basic Task, then double-click the entry in your task list and edit it like the screenshots. Have the .bat file start 30 seconds after login.

You need to click on Properties in the right window when you double-click the entry in your task list, then edit it like the pictures.

*Unzip this file for powerlimit.bat file.*









powerlimit.zip 0k .zip file













Or you can just edit and run the .bat file itself, but you'll have to do that on every login.









It seems that manually setting the power limit, even if it doesn't actually change it, fixes that issue.
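The attached .bat isn't visible here, but a power-limit script of this kind typically boils down to a single `nvidia-smi -pl` call. A minimal Python sketch of the same idea (the 300 W value and GPU index 0 are placeholders — use your own card's limit from the BIOS database above; setting the limit needs an elevated prompt, same as running the .bat as admin):

```python
import shutil
import subprocess

def power_limit_cmd(watts: int, gpu_index: int = 0) -> list[str]:
    """Build the nvidia-smi command that pins the board power limit."""
    return ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)]

if __name__ == "__main__":
    cmd = power_limit_cmd(300)  # placeholder: substitute your card's max power limit
    if shutil.which("nvidia-smi"):
        # Must be run from an elevated (admin) prompt to take effect.
        subprocess.run(cmd, check=True)
    else:
        print("nvidia-smi not found; would run:", " ".join(cmd))
```

Scheduling this at login (Task Scheduler, as described above) serves the same purpose as the .bat, since the limit resets on reboot.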


----------



## Slackaveli

90-minute run in Battlefield 1 at 2101, NO LOCK in the curve. Stayed at 2101 the entire time, and I am shocked tbh. At +100 mV. Sick.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> 
> 90 minute run in Battlefield One at 2101, NO LOCK in curve. Stayed at 2101 the entire time, and i am shocked tbh. At +100Mv. sick.


Battlefield 1 is pretty well optimized for both CPU and GPU usage. But I never found it a taxing game; at 95% GPU usage I cap out my FPS at 142. So I don't think it'd ever hit any performance limits.

BTW, we've got to play sometime. I think I sent you a friend request the other day.


----------



## Clukos

The one thing that will limit your GPU is power draw. At 4K it is far worse than at 1440p or 1080p, and 8K is worse still. Try running the 8K-optimized Superposition preset and watch your card jumping all around the place at 1.093 V.







Or watch it blow up if you botched the shunt mod


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Battlefield 1 is pretty well optimized with CPU and GPU usages on both. But I never found it as a taxing game, at 95% GPU usage I cap out my FPS to 142. So I don't think it'd ever hit any performance limits.
> 
> BTW, we got to play sometime. I think I sent you a friend request the other day.


Yessir, ok bet. I'll check my notifications.
You're right, it isn't taxing for a GPU; I can make it hit 99% usage by upping the res slider a bit. But it is certainly a good test for a stable CPU overclock! Brutal on CPUs.

I know this doesn't mean I'll hold 2100 in all games, but I did it there, for over an hour, with no perf caps, on air. Outstanding








Quote:


> Originally Posted by *Clukos*
> 
> The one thing that will lock your GPU is power draw. At 4k it is far worse than 1440p or 1080p, 8K is even worse. Try running that 8k optimized superposition and see your card jumping all around the place at 1.0930 voltage
> 
> 
> 
> 
> 
> 
> 
> Or watch it blow up if you botched the shunt mod


True, but I max out at 60 pretty quickly in 4K; it's easier than hitting 144 at 1440p. I understand what you mean about power draw, but if you lock it at 60 it doesn't come up in very many games on these beasts.


----------



## derpa

Okay, I fully admit; this may be a stupid question, and it makes me a terrible person for asking it, and I should be ashamed of myself, and what I'm about to do.







Regardless....

When I'm OCing my Ti, should I "care" about the Power Limit graph?

I understand this is an indication that the card has reached the "120%" power target I set, and the card is throttling because of it, but as I run through these benchmarks, the clocks aren't jumping all over the map in congruence with the Power Limit graph....







AND, the benchmarks run just fine with no artifacts.....soooo.....ummmm....yeah....

I've danced around the web watching vids on Pascal OCing, fallen into YT black holes, climbed back out, read articles, and in everything I can recall, no one ever really addressed if it matters.

Though, there is a distinct possibility I am blind, or have short term memory loss

THANKS!!!!









PS: I did watch the vid at the beginning of this thread as well. I tried using the slider graphey doohicky thingy, and had less success with that than using the "main" page sliders in PxOC


----------



## alucardis666

Good news! Should have my 1080 Ti Hybrid kits next week.









Thank you B&H!



I'm so pumped to get these cards going with a proper OC under water.


----------



## joder

Quote:


> Originally Posted by *KraxKill*
> 
> 2100 @ 1.063v
> 
> Broke through 23000 on my pedestrian 4790K. Single card outscoring a few SLI results.
> 
> http://www.3dmark.com/fs/12446981


Quote:


> Originally Posted by *KedarWolf*
> 
> Okay, more testing on Palit over stock using latest drivers, Superposition quite a bit higher with stock even though clocks are lower. TimeSpy roughly the same as stock, Heaven 1.5 FPS better on stock than Palit BIOS.
> 
> Yes, you were right @SlimJ87D (eats crow), stock does seem to be better, I jumped the gun.


----------



## Slackaveli

Quote:


> Originally Posted by *derpa*
> 
> Okay, I fully admit; this may be a stupid question, and it makes me a terrible person for asking it, and I should be ashamed of myself, and what I'm about to do.
> 
> 
> 
> 
> 
> 
> 
> Regardless....
> 
> When I'm OCing my Ti, should I "care" about the Power Limit graph?
> 
> I understand this is an indication that the card has reached the "120%" power target I set, and the card is throttling because of it, but as I run through these benchmarks, the clocks aren't jumping all over the map in congruence with the Power Limit graph....
> 
> 
> 
> 
> 
> 
> 
> AND, the benchmarks run just fine with no artifacts.....soooo.....ummmm....yeah....
> 
> I've danced around the web watching vids on Pascal OCing, fallen into YT black holes, climbed back out, read articles, and in everything I can recall, no one ever really addressed if it matters.
> 
> Though, there is a distinct possibility I am blind, or have short term memory loss
> 
> THANKS!!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PS: I did watch the vid at the beginning of this thread as well. I tried using the slider graphey doohicky thingy, and had less success with that than using the "main" page sliders in PxOC


None of those colors represents a true "problem" per se, with the exception of thermal throttling. You want to keep that out of the equation, just for the obvious reasons.


----------



## Silent Scone

Quote:


> Originally Posted by *alucardis666*
> 
> Everyone's different. The Aorus I tried was supposed to be "quiet", but at 100% fan speed, it sounded no quieter than my FE's @ 100%


Well, that's the thing you see. If you run it at 100%, it will be noisy. The difference is one will keep the core under 40c, the other will struggle to keep it under 65. Here's the best part, though; even when you have a conservative fan profile, it's quieter than the FE cooler and runs cooler, also.

Read that like a cheesy advertisement commercial for full sarcasm


----------



## viz0id

Quote:


> Originally Posted by *Slackaveli*
> 
> 
> 90 minute run in Battlefield One at 2101, NO LOCK in curve. Stayed at 2101 the entire time, and i am shocked tbh. At +100Mv. sick.


Yeah, I was pretty shocked after playing Andromeda the last few days. Did a full evening of BF1 yesterday and it all looked like this the whole evening (load is a bit lower because the screenshot was taken after the game completed, in the menus).


----------



## Clukos

Quote:


> Originally Posted by *Slackaveli*
> 
> true, but i max 60 pretty quick in 4k, easier than hitting 144 in 1440p. I understand what you mean about power draw, but if you lock it at 60 it doesnt come up in very many games on these beasts.


It's not about GPU utilization. I've noticed that the more you increase the resolution, even at 100% utilization in both cases, the run that draws more pixels draws more power. That's why the 8K Superposition benchmark is the ultimate test right now for the stability of your OC under load. If you try it out you'll see what I mean.
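For scale, the pixel counts behind that claim (power draw doesn't scale linearly with pixel count, but more pixels per frame generally means more shading work per clock):

```python
# Pixels per frame at common resolutions, relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} MPix ({w * h / base:.1f}x 1080p)")
```

8K pushes 16x the pixels of 1080p and 4x the pixels of 4K per frame, which is why it exposes power-limit instability that lower resolutions never touch.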


----------



## alucardis666

Quote:


> Originally Posted by *Silent Scone*
> 
> Well, that's the thing you see. If you run it at 100%, it will be noisy. The difference is one will keep the core under 40c, the other will struggle to keep it under 65. Here's the best part, though; even when you have a conservative fan profile, it's quieter than the FE cooler and runs cooler, also.
> 
> Read that like a cheesy advertisement commercial for full sarcasm


Lol. I'll stick to Hybrid 1080 TI FE's


----------



## Nico67

Upgraded to the CU update and 381.89, tuned the driver a bit and got this:











Mainly a 2088 run, but it hits 2101 when power drops a bit. Scores are reasonable; might try Palit again tomorrow and see if it's any different with the updates.


----------



## ocCuS

The upcoming Galax 1080 Ti Hall of Fame and the COLORFUL iGame GTX 1080 Ti Vulcan X OC will come with LCD panels on the cards to read clock speeds, temps, etc.

https://videocardz.com/68998/galax-geforce-gtx-1080-ti-hof-also-features-lcd-panel


https://videocardz.com/68961/colorful-igame-gtx-1080-ti-vulcan-x-oc-detailed


----------



## viz0id

Hi guys. I've seen from some of your screenshots that when your cards are idle they downclock to really low frequencies (like 200 MHz). How do I do that?

My card idles at 1550, which makes the temp much higher than it needs to be. I have a G-Sync monitor and G-Sync enabled in windowed mode. Does that have anything to do with it?

Card is an OC'ed 1080 Ti Strix on air.

EDIT: Should also add that I have a 1080p 144 Hz screen as well, so I'm running a 2x monitor setup.


----------



## pez

Quote:


> Originally Posted by *viz0id*
> 
> Hi guys. I've seen from screenshots of some of you guys, that when your cards are in idle they downclock to really low frequencies (like 200mhz). How do i do that?
> 
> My card idles at 1550 which makes the temp much higher than it needs to be. I have a Gsync monitor and Gsync enabled in windowed mode. Does that have anything to do with it?
> 
> Card is a OC'ed 1080ti Strix on air.


IIRC, it's been mentioned that if you have a 165 Hz panel, setting it to 144 Hz fixes this. I could be wrong; maybe it's 144 Hz > 120 Hz.

However, on my 100 Hz panel I did have the issue, and it was solved by a clean install of drivers. I think it was due to so many GPU switches within a short amount of time.


----------



## viz0id

Quote:


> Originally Posted by *pez*
> 
> IIRC, it's been mentioned that if you have a 165hz panel that making it 144hz fixes this. I could be wrong and maybe it's 144hz > 120hz.
> 
> However, on my 100Hz panel, I did have the issue that was solved by a clean install of drivers. I think it was due to so many GPU switches within a short amount of time.


I think the panel is OC'ed to 165 Hz, but I'm currently using 144 Hz so that it's the same refresh rate as my second monitor. I will try putting it at the "standard" 144 Hz and maybe reinstall the drivers, though I have the newest drivers installed so I'm not sure that's the problem.

EDIT: A complete uninstall and reinstall of the drivers did not help. I guess I have to hope it's the monitor at 165 Hz that is doing it. Will check when I get back home.


----------



## dunbheagan

Quote:


> Originally Posted by *KraxKill*
> 
> What's your point? But sure. Here you go. Not bad huh?
> 
> I'd rather upgrade my CPU and Mobo than get a second card and deal with micro stutter ..but that's just me.
> 
> Time Spy
> 
> http://www.3dmark.com/spy/1595031
> 
> FS Ultra
> 
> http://www.3dmark.com/fs/12445747


7748 in FS Ultra with a 4790k is a truly impressive score! I got 7590 with [email protected], [email protected], [email protected]

Which config did you have?


----------



## pez

Quote:


> Originally Posted by *viz0id*
> 
> I think the panel is OC'ed to 165hz, but im currently using 144hz so that its the same refreshrate as my second monitor. I will try and put it at "standard" 144hz and maybe uninstall and install drivers. I have installed the newest drivers though so I'm not sure that is the problem.
> 
> EDIT: Complete uninstall and install of drivers did not help. I guess i have to hope it's the monitor at 165hz that is doing it. Will check when i get back home.


Yep, it'd be worth it to try both (I'm sure someone will chime in and confirm the accuracy, though).


----------



## Silent Scone

Quote:


> Originally Posted by *alucardis666*
> 
> Lol. I'll stick to Hybrid 1080 TI FE's


Nah, that's half arsed. Got my Strix full cover block on order...


----------



## mechwarrior

I've been testing this Palit BIOS and I think it only gives us an extra 30 watts on the FE.
Anyone have any input on this?
In AB the power only goes up to 116%?


----------



## Coopiklaani

Managed to achieve 415 W+ TDP with the 3-resistor mod. Actually, 22 Ω or 47 Ω resistors are the best for this mod; 10 Ω is a little too small since NVIDIA changed their series resistor from 10 Ω to 20 Ω.


I also experimented with two conductive paints, carbon based, and silver based.




The carbon-based paint has around 20-30 Ω of resistance over a 1 mm gap; the silver one only has about 0.1-0.2 Ω.
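If I follow the 3-resistor mod correctly, the added resistor forms a voltage divider with the sense line's series resistor, so the monitoring chip only sees a fraction of the true shunt voltage. A rough sketch of the scaling, assuming the 20 Ω series value above, an ideal divider (real behavior depends on the monitoring chip's input impedance), and a nominal 250 W stock cap:

```python
def reported_fraction(r_series: float, r_added: float) -> float:
    """Fraction of the true shunt-sense voltage that survives the divider."""
    return r_added / (r_series + r_added)

R_SERIES = 20.0   # assumed sense-line series resistor, ohms
STOCK_CAP = 250   # assumed stock power limit, watts

for r_added in (10, 22, 47):
    frac = reported_fraction(R_SERIES, r_added)
    # The card hits its "250 W" limit only when true draw is 250 / frac watts.
    print(f"{r_added} ohm -> reads {frac:.0%}, {STOCK_CAP} W cap ≈ {STOCK_CAP / frac:.0f} W true")
```

Under these assumptions, 22 Ω roughly doubles the effective cap while 47 Ω raises it by ~40%, and 10 Ω would triple it, which fits the comment that 10 Ω is too aggressive now that the series resistor is 20 Ω.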


----------



## 4lek

Quick question: do you guys think I'd have any bottleneck with 1080 Ti SLI on my current 4930K [@4.8 ~ 4.6 depending on temperature]?

'Cause if I don't, I'm gonna buy today







Thank you.


----------



## Coopiklaani

Quote:


> Originally Posted by *4lek*
> 
> Quick question: do you guys think I'd have any bottleneck with 1080 Ti SLI on my current 4930K [@4.8 ~ 4.6 depending on temperature]?
> 
> 'Cause if I don't, I'm gonna buy today
> 
> 
> 
> 
> 
> 
> 
> Thank you.


It depends: if you are into 4K 60 FPS gaming, then NO. If it is 1440p 144 Hz, then YES.


----------



## 4lek

2k 144hz


----------



## Coopiklaani

Quote:


> Originally Posted by *4lek*
> 
> 2k 144hz


Then 7700k @ 5.0GHz is what you need.


----------



## Benny89

I have no idea how you guys avoid perf caps. I have tried anything from 1.031 V to 1.081 V and I always get VRel and VOp, even though my voltage and power sliders are maxed out and GPU-Z shows a max TDP of 118.7%. I have no idea how to avoid those perf caps...

What is the reason behind VOp or VRel? Is my clock too high for the bin, too low for the bin, or what?


----------



## foolycooly

Quote:


> Originally Posted by *viz0id*
> 
> I think the panel is OC'ed to 165hz, but im currently using 144hz so that its the same refreshrate as my second monitor. I will try and put it at "standard" 144hz and maybe uninstall and install drivers. I have installed the newest drivers though so I'm not sure that is the problem.
> 
> EDIT: Complete uninstall and install of drivers did not help. I guess i have to hope it's the monitor at 165hz that is doing it. Will check when i get back home.


Let me know if you figure it out. I tried several things last night with no luck. I still idle 1480-1520 in the 40c-55c range. Pretty annoying.


----------



## 4lek

Yeah, but I'd have to change the mobo as well..


----------



## dunbheagan

Quote:


> Originally Posted by *Coopiklaani*
> 
> Then 7700k @ 5.0GHz is what you need.


I don't think he NEEDS it. I mean, if he needs every last possible fps, then yes, the 7700k is the right choice. But I am pretty sure you can run 1080 Ti SLI @ 1440p on a 4930k @ 4.6GHz just fine without losing much performance. I mean, how much faster is a 7700k @ 5GHz over a 4930k @ 4.6GHz? Maybe 20%? Those 20% won't mean 20% more frames in games; maybe it will be 10% in some cases. I would even go further and say that 1080 Ti SLI will give you more performance than you need for 144Hz and 1440p, no matter if you are on a 4730k or 7700k.
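The "20% faster CPU, maybe 10% more frames" intuition follows from a frame being limited by whichever of CPU or GPU is slower. A toy frame-time model (all numbers purely illustrative, not measurements from these CPUs):

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Crude model: the slower of the CPU and GPU limits the frame rate."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Illustrative frame that costs 8 ms of CPU work and 7 ms of GPU work.
base = fps(8.0, 7.0)          # CPU-bound: 125 fps
faster = fps(8.0 / 1.2, 7.0)  # 20% faster CPU; now the GPU (7 ms) limits
print(f"{base:.0f} -> {faster:.0f} fps (+{faster / base - 1:.0%})")
```

Once the faster CPU gets its frame-time contribution under the GPU's, further CPU speed buys nothing, which is why the fps gain lands well below the raw clock/IPC gain.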


----------



## 4lek

Quote:


> Originally Posted by *dunbheagan*
> 
> I don't think he NEEDS it. I mean, if he needs every last possible fps, then yes, the 7700k is the right choice. But I am pretty sure you can run 1080 Ti SLI @ 1440p on a 4930k @ 4.6GHz just fine without losing much performance. I mean, how much faster is a 7700k @ 5GHz over a 4930k @ 4.6GHz? Maybe 20%? Those 20% won't mean 20% more frames in games; maybe it will be 10% in some cases. I would even go further and say that 1080 Ti SLI will give you more performance than you need for 144Hz and 1440p, no matter if you are on a 4730k or 7700k.


I have a *49*30k. I'm not sure a 4730k even exists; I know about the 4790k.

Anyhow, until I can get 90~100 fps in every game [I don't even care about max graphics settings] I'm good.

I will probably change mobo, CPU and RAM next year, but at the moment I'm just buying two 1080 Tis [+ waterblocks], that is..


----------



## Benny89

Quote:


> Originally Posted by *4lek*
> 
> I have a *49*30k. I'm not sure a 4730k even exists; I know about the 4790k.
> 
> Anyhow, until I can get 90~100 fps in every game [I don't even care about max graphics settings] I'm good.
> 
> I will probably change mobo, CPU and RAM next year, but at the moment I'm just buying two 1080 Tis [+ waterblocks], that is..


If you are on socket 1150, just upgrade to a 5775C, OC it to 4.2 and you get 7700k @ 5.0GHz gaming performance thanks to its eDRAM.

I did that coming from a 4790k and now I have equal, sometimes better, performance in games than a 5.0GHz 7700K.

And it cost me $50 to upgrade... lol...


----------



## 4lek

Lga 2011 here mate.


----------



## DabboPiff

Quote:


> Originally Posted by *Qba73*
> 
> lol true story


yeah, I became obsessed too since getting my 1080 Ti. I just finally uninstalled 3DMark and Superposition so I won't be tempted to tinker any more. haha


----------



## Benny89

Quote:


> Originally Posted by *4lek*
> 
> Lga 2011 here mate.


OK, sorry, I saw 4930k and thought it was the same socket as the 4790k


----------



## Benny89

Do you guys set in Nvidia Control Panel "Prefer Maximum Power" in all your games? I heard it helps with stable clocks.


----------



## mshagg

Quote:


> Originally Posted by *Benny89*
> 
> Do you guys set in Nvidia Control Panel "Prefer Maximum Power" in all your games? I heard it helps with stable clocks.


I played around with it, but didn't notice anything measurable. Presumably if an application starts rendering with a running start (because the clocks tend to sit high in max-power mode) there's the potential for an advantage, but I can't say I've noticed the boost clocks lagging at all.


----------



## KedarWolf

Quote:


> Originally Posted by *Benny89*
> 
> Do you guys set in Nvidia Control Panel "Prefer Maximum Power" in all your games? I heard it helps with stable clocks.


Yes, in the actual game/benchmark profile 'Single Display Mode' and 'Prefer Maximum Power'.


----------



## KedarWolf

Quote:


> Originally Posted by *mechwarrior*
> 
> I've been testing this Palit BIOS and I think it only gives us an extra 30 watts on the FE.
> Does anyone have any input on this?
> In Afterburner the power limit only goes up to 116%.


From OP.

http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti/0_20

The stock BIOS on FEs seems to perform the best, the Asus BIOS is a close second, and the Palit BIOS gets higher clock speeds at lower voltages and power limits less, but benches third.

For everyday use, to run your card at the lowest voltages with decent clocks and stop a non-shunt-modded card from power limiting and temp throttling on air, I recommend the Palit BIOS: decent clocks, lowest voltages of all the BIOSes tested.

For bench tests stock seems the way to go.

Edit: On air Palit BIOS fans run a lot slower so you may need to up your fan curve.


----------



## Chris Ihao

Just ordered some Thermal Grizzly Kryonaut for my Aorus non-extreme. Looking forward to seeing what clocks I will get with this one


----------



## mshagg

New NVIDIA drivers are worth a look; I don't think I've changed anything else that would yield a ~100 point increase in SuPo 4K scores, so I assume it's the new driver. Also, max power yielded a ~20 point improvement, which could well be within the margin of error.


----------



## RavageTheEarth

Quote:


> Originally Posted by *viz0id*
> 
> Hi guys. I've seen from screenshots of some of you guys, that when your cards are in idle they downclock to really low frequencies (like 200mhz). How do i do that?
> 
> My card idles at 1550 which makes the temp much higher than it needs to be. I have a Gsync monitor and Gsync enabled in windowed mode. Does that have anything to do with it?
> 
> Card is a OC'ed 1080ti Strix on air.
> 
> EDIT: Should also add that i have one 1080p 144hz screen as well. So running 2x monitor setup.


I have a 1440p 144Hz G-Sync monitor as well and don't have any issues with downclocking, but since you are running two monitors, that is most likely the reason why your card isn't clocking down.


----------



## alanthecelt

hi, I've read through a lot of pages so far... and got a general idea of the alternative BIOSes for our cards.
I have a Zotac 1080 Ti FE, and even on air it used to hit the power limit regularly.
Now I'm on a full custom loop with a full-cover block. In use at +150 core, +20% power limit and +400 memory (just for testing purposes still), I see the power limit flag all through the Afterburner logs. That's playing Ghost Recon Wildlands at 4K, nowhere near 60fps.








Thermals are good, sitting mid-50s °C max through the logs.

So I'm wondering if I can try undervolting slightly and see if I can get stable higher clocks.
Otherwise, would a Zotac FE have different power limits to another FE?
How safe is it to raise the power/thermal limit on an FE with one of the other BIOSes? I assume ours is currently set at 300W, and I've seen 330W mentioned.


----------



## BrainSplatter

Quote:


> Originally Posted by *alanthecelt*
> 
> So, i'm wondering if i can try undervolting slightly and see if i can get stable higher clocks


I would try that first. What are your current core clocks + voltage in game? 1.0v only seems to power limit in extreme cases, and with a decent GPU combined with your water cooling, 2000MHz should be stable.


----------



## alanthecelt

I will have to do some more logging; I haven't noted the voltage at all yet (or unlocked it in Afterburner). I literally got the new build home last night and spent a couple of hours seeing how it behaved.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *mshagg*
> 
> New NVIDIA drivers are worth a look; dont think I've changed anything else that would yield a ~100 point increase in SuPo 4K scores, so assume it's the new driver. Also, max power yielded a ~20 point improvement, which could well be within the margin of error.


I posted earlier that new drivers give around 1% to 2% performance increase.


----------



## BrainSplatter

Quote:


> Originally Posted by *alanthecelt*
> 
> i will have to do some more logging, haven't noted the voltage at all yet (or unlocked it in afterburner), literally got the new build home last night and spent a couple of hours seeing how it behaved


Check voltage first. You might have a high GPU VID.


----------



## alanthecelt

yeah, good plan. I wasn't aware that happened on GPUs... I've seen it on CPUs before now, mind.
Would like to hope that's the case at least


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alanthecelt*
> 
> hi, I've read through a lot of pages so far... and got a general idea of the alternative BIOSes for our cards.
> I have a Zotac 1080 Ti FE, and even on air it used to hit the power limit regularly.
> Now I'm on a full custom loop with a full-cover block. In use at +150 core, +20% power limit and +400 memory (just for testing purposes still), I see the power limit flag all through the Afterburner logs. That's playing Ghost Recon Wildlands at 4K, nowhere near 60fps.
> 
> 
> 
> 
> 
> 
> 
> 
> Thermals are good, sitting mid-50s °C max through the logs.
> 
> So I'm wondering if I can try undervolting slightly and see if I can get stable higher clocks.
> Otherwise, would a Zotac FE have different power limits to another FE?
> How safe is it to raise the power/thermal limit on an FE with one of the other BIOSes? I assume ours is currently set at 300W, and I've seen 330W mentioned.


No. I have a Zotac FE, and they're all actually made by NVIDIA. They have a 300 watt limit, and flashing the BIOS doesn't really do anything for you: worse performance traded for slightly more power.

Yes, you can undervolt to avoid hitting the power limit, but you shouldn't need to.

What you need to do is raise your other bins to their highest OC possible. Luckily for me, I can do 2088MHz from 1.000v to 1.093v. I also have a Zotac.

But my recommendation is to raise all the lower bin points to their maximum. You need to OC each point starting from 0.998v, 1.000v, 1.031v, etc.

Then just play the game and leave it be. Let the card adjust itself; if the bins are close enough it won't cause slowdowns or lag.

Slow downs, lag and stutter happen if your card has something like this.

1850 Mhz @ 1.000v
2088 Mhz @ 1.032v

When the card down volts to 1.000v it drops clocks rapidly from 2088 to 1850.

You want something like this.

2050 Mhz @ 1.000v
2063 Mhz @ 1.032v

Etc.
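The two curves above can be compared with a tiny sketch. The bins are the illustrative numbers from this post, not measurements from any particular card:

```python
# Two voltage/frequency curves as {voltage: clock MHz} bins.
steep = {1.000: 1850, 1.032: 2088}  # big gap between bins -> stutter
flat  = {1.000: 2050, 1.032: 2063}  # lower bins raised -> smooth

def worst_drop(curve: dict) -> int:
    """Largest clock loss (MHz) when the card steps down one voltage bin."""
    clocks = [curve[v] for v in sorted(curve)]
    return max(hi - lo for lo, hi in zip(clocks, clocks[1:]))

print(worst_drop(steep))  # 238 MHz swing when the card drops to 1.000v
print(worst_drop(flat))   # 13 MHz: barely noticeable
```

Same maximum clock either way; the difference is how hard the card falls when it down-volts mid-game.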


----------



## Benny89

Anyone know if you can slap EVGA Hybrid kit or NZXT Kraken + some AIO to ASUS STRIX 1080 Ti or MSI GAMING X?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> Anyone know if you can slap EVGA Hybrid kit or NZXT Kraken + some AIO to ASUS STRIX 1080 Ti or MSI GAMING X?


Yes, a Kraken G10 will more than likely work, as long as the components aren't any taller than on the FE or Titans, etc.

Put some heatsinks on the VRM and VRAM though. They can make a 30C difference.


----------



## foolycooly

What kind of fan curves are people on air running?

For some reason EVGA XOC doesn't seem to have a "default" profile. I realized this after changing to the "Aggressive" curve to experiment with lower temps. After that, the only options were "Aggressive", "Quiet" and "Custom" (which just copies whatever curve is currently active until you change it). Right now I'm running 30% up to 50C, then 50% from 50-60C, and 60% at 70C.

I was surprised at how loud the fans are on my SC2 compared to my 780 Ti SC. The fans become pretty noticeable at anything above 30% and are downright loud at 70%+. It seems like I would need to run the fans at 70%+ under full load to keep temps under 70. At my current curve, capping at 60%, temps settle in around 72-75 with the core clock settling around 1911MHz (no overclock). Pretty good stock boost, but the fan noise is something new to me.

It doesn't help that the card isn't downclocking on the desktop with my multiple-monitor config. It idles at 1500MHz and the upper 40s C at 30% fan.
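That custom curve can be written down as (temp C, fan %) points. Linear interpolation between points is an assumption on my part, but tools like Precision XOC and Afterburner draw similar piecewise curves:

```python
# The fan curve described above: 30% up to 50C, 50% at 60C, 60% at 70C.
CURVE = [(50, 30), (60, 50), (70, 60)]

def fan_speed(temp_c: float) -> float:
    """Fan duty (%) for a given GPU temperature, flat outside the curve."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(45))  # 30: flat below the first point
print(fan_speed(55))  # 40.0: halfway between 30% and 50%
print(fan_speed(75))  # 60: capped at the last point
```

Capping at 60% is what keeps the load temps in the low 70s here; pushing the last point to 70%+ trades noise for those few degrees.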


----------



## alanthecelt

Quote:


> Originally Posted by *SlimJ87D*
> 
> No. I have a Zotac FE, and they're all actually made by NVIDIA. They have a 300 watt limit, and flashing the BIOS doesn't really do anything for you: worse performance traded for slightly more power.
> 
> Yes, you can undervolt to avoid hitting the power limit, but you shouldn't need to.
> 
> What you need to do is raise your other bins to their highest OC possible. Luckily for me, I can do 2088MHz from 1.000v to 1.093v. I also have a Zotac.
> 
> But my recommendation is to raise all the lower bin points to their maximum. You need to OC each point starting from 0.998v, 1.000v, 1.031v, etc.
> 
> Then just play the game and leave it be. Let the card adjust itself; if the bins are close enough it won't cause slowdowns or lag.
> 
> Slow downs, lag and stutter happen if your card has something like this.
> 
> 1850 Mhz @ 1.000v
> 2088 Mhz @ 1.032v
> 
> When the card down volts to 1.000v it drops clocks rapidly from 2088 to 1850.
> 
> You want something like this.
> 
> 2050 Mhz @ 1.000v
> 2063 Mhz @ 1.032v
> 
> Etc.


that's interesting, so you get your max (2088) at 1.000v? To be honest I haven't noticed the downclocking specifically... I would just rather find instability before the power limit, ideally








but i will go through the voltage and bins


----------



## Luckbad

Quote:


> Originally Posted by *4lek*
> 
> Quick question: do you guys think I'd have any bottleneck with 1080 Ti SLI on my current 4930k [@4.8 ~ 4.6 depending on temperature]?
> 
> Because if I don't, I'm gonna buy today
> 
> 
> 
> 
> 
> 
> 
> Thank you.


You're good to go. That's a beast mode CPU.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *alanthecelt*
> 
> that's interesting, so you get your max (2088) at 1.000v? to be honest.. i haven't noticed the downclocking specifically...i would just rather find instability before power limit ideally
> 
> 
> 
> 
> 
> 
> 
> 
> but i will go through the voltage and bins


Stock, before the shunt mod on my Zotac FE: 2088MHz @ 1.000v on older NVIDIA drivers (not the current ones, which give a few hundred more points now). Superposition still causes massive power limiting, but I still get a nice score.

Games aren't as power hungry as Superposition. Anyway, I got lucky: my card was able to hold 2088MHz at 1.000v without power limiting during normal gameplay like Witcher 3, etc.

I have since shunt modded. I run 2114MHz @ 1.075v with no power limiting. I've played For Honor, Witcher 3 and BF1 so far without issues.


----------



## Benny89

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yes, Kraken G10 or G10 will more than likely work. As long as components aren't any higher than the FE or Titans, etc.
> 
> Put some heatsinks on the VRM and VRAMs though. They can make a 30C difference.


What are those heatsinks, where do I get them, and where do they go?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> What are those heatsinks and how to get them? And where to place them?


Amazon: just look up VRM and VRAM heatsinks. Some come with adhesive backing. You'll need 11 for the VRAM and, I believe, 16 to 20 for the VRM.


----------



## chibi

I'm getting closer to finishing the rig, slapped on these beauties last night!








The single slot bracket from Aquacomputer is very rough around the edges. The finish isn't up to par compared to the stock 2-slot bracket. Also, the screws stick out since the bracket is not counter-sunk.









For those looking to use this water block + backplate, I would also suggest ordering a different set of pads. The included thermal pads are about 0.5mm thick and leave a gap in some areas. I would look into 1.5mm or 1.0mm pads to make better contact.


----------



## madmeatballs

Quote:


> Originally Posted by *chibi*
> 
> I'm getting closer to finishing the rig, slapped on these beauties last night!
> 
> 
> 
> 
> 
> 
> 
> 
> The single slot bracket from Aquacomputer is very rough around the edges. The finish isn't up to par compared to the stock 2-slot bracket. Also, the screws stick out since the bracket is not counter-sunk.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For those looking to use this water block + backplate, I would also suggest ordering a different set of pads. The included thermal pads are about 0.5mm and it has a gap in some areas. I would look into 1.5mm or 1.0mm to make better contact.


I don't like the bracket either!! But the block is awesome! I got the copper version, it looks nice too!

I'm thinking of getting some Fujipoly thermal pads too! But do you think we should use 1.0 or 1.5?


----------



## viz0id

Quote:


> Originally Posted by *RavageTheEarth*
> 
> I have a 1440p 144hz G-Sync monitor also and don't have any issues with downclocking, but since you are running two monitors that is most likely the reason why you're cards aren't clocking down.


Tried removing one monitor and rebooting. Did not help. I have also now tried clocking the monitor back down to its native 144Hz. Still doesn't work.

I have tried disabling G-Sync and changing to 120Hz, 100Hz, 60Hz. Still doesn't work. It just wants to run at base clock all the freaking time.

Starting to run out of things to test.


----------



## KraxKill

Quote:


> Originally Posted by *viz0id*
> 
> Tried to remove one monitor, and reboot. Did not help. I have also now tried to downclock the monitor back to normal, so it is native 144Hz. Still doesnt work.
> 
> I have tried to disable G-sync, change to 120hz, 100hz, 60hz. Still doesnt work. It just wants to run at baseclock all the freaking time.
> 
> Starting to run out of things to test.


I've been dealing with this for days. I finally figured out what was preventing me from downclocking.

What software do you have running in the system tray? Turn everything off (Firefox/Chrome) and see if it down clocks. Also try it with just one display plugged in. I have a PG278Q @ 144Hz, and with G-Sync enabled it down clocks (you have to reboot after you enable it; it will not down clock otherwise); with it off it does not. NZXT CAM will not allow your card to down clock while it's running. Aquacomputer's Aquasuite is another culprit. Usually it's something in your sys tray.

I've since ditched my AIOs (H100i + NZXT X41), am no longer on the NZXT mod, and am running EK blocks, so this isn't an issue any more. But if I have Aquacomputer Aquasuite running, I'm at 1500+ as well. With it off, 135MHz.


----------



## chibi

Quote:


> Originally Posted by *madmeatballs*
> 
> I don't like the bracket either!! But it is awesome! I got the copper version, it looks nice too!
> 
> 
> I'm thinking of getting some fujipoly for thermal pads too! But you think we should use 1.0 or 1.5?


I would go with 1.5mm because it will compress and make better contact. The 0.5mm pad seems too thin in my opinion. I'm too impatient to wait and order 1.5mm pads since I already have the block put together.
Did you use any nut to secure the last two pan head screws to your backplate/pcb? I left mine off, but may try to secure it.


----------



## viz0id

Quote:


> Originally Posted by *KraxKill*
> 
> I've been dealing with this for days.
> 
> What software do you have running in the system tray? Turn everything off (Firefox/Chrome) and see if down clocks. Also try it with just one display plugged in. I have PG278Q @ 144hz and with gsync enabled it down clocks, with it off it does not. NZXT CAM will not allow your card to down clock with it running. Aquacomputer's aqua suite is another culprit. Usually it's something in your sys tray.


The only things I have running are GPU-Z, Afterburner, RivaTuner and the audio driver. I removed a monitor and just ran one. Did not help.

The thing is, I have never seen it run low clocks, so it's not something that started with the latest driver or anything. At first I thought I had horrible airflow, since everyone posted idle temps in the 30s on review cards, and I had 55.


----------



## Slackaveli

Quote:


> Originally Posted by *viz0id*
> 
> Yeah i was pretty shocked after playing Andromeda the last days. Did a full evening of BF1 yesturday night and it all looked like this the whole evening (Load Is a bit lower because screenshot was after game was completed, in menus)


it's a wonderful sight, no perfcaps.


----------



## foolycooly

Quote:


> Originally Posted by *viz0id*
> 
> The only things I have running are GPU-Z, Afterburner, RivaTuner and the audio driver. I removed a monitor and just ran one. Did not help.
> 
> The thing is, I have never seen it run low clocks, so it's not something that started with the latest driver or anything. At first I thought I had horrible airflow, since everyone posted idle temps in the 30s on review cards, and I had 55.


Same for me. I thought maybe it could be Corsair CUE for my kb and mouse, but I have tried closing everything other than XOC and it still doesn't down clock. I wonder if XOC itself is causing it. I could maybe try a different monitoring program.


----------



## KraxKill

Quote:


> Originally Posted by *viz0id*
> 
> The only things I have running are GPU-Z, Afterburner, RivaTuner and the audio driver. I removed a monitor and just ran one. Did not help.
> 
> The thing is, I have never seen it run low clocks, so it's not something that started with the latest driver or anything. At first I thought I had horrible airflow, since everyone posted idle temps in the 30s on review cards, and I had 55.


You have nothing else in the sys tray? Try turning off the audio driver helper just to check. Try running 60Hz just to see; reboot after you set it. It's a frustrating issue, but you'll figure it out. It's either the driver (less likely, though I've blamed it many times), or, more likely, something in the background making it clock up. Maybe try running just GPU-Z and turn off Afterburner. Try to isolate it.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> I have no idea how you guys avoid Perf Caps. I have tried anything from 1.031 V to 1.081V and I always get Vrel and V0p. Even though my Voltage and Power slider are maxed out and GPU-Z shows max TDP at 118,7% . I have no idea how to avoid those Perf Caps...
> 
> What is the reason behind V0p or Vrel? Is my clock to high for bin, or too low for bin or what?


man, I wish I could help you. It all boils down to a magic spot that has enough volts for the clock, but not too much, isn't power limited, and isn't hot. BF1 is a good place to try to achieve it b/c it never power throttles, which also keeps temps down.
Quote:


> Originally Posted by *mshagg*
> 
> I played around with it, but didnt notice anything measurable. Presumably if an application starts rendering with a running start (because the clocks tend to sit high in max power) there's the potential for an advantage but I cant say ive noticed the boost clocks lagging at all.


seems to me to basically have the same effect as locking at your highest voltage.

Quote:


> Originally Posted by *Chris Ihao*
> 
> Just ordered some Thermal Grizzly Kryonaut for my Aorus non-extreme. Looking forward to seeing what clocks I will get with this one


nice move, it works wonders.


----------



## zipeldiablo

Any of you guys still have signal interrupt with the new driver?
This is the 4th driver since the release of the card and it seems it still isn't fixed (though the issue seems to happen less).


----------



## KingEngineRevUp

Quote:


> Originally Posted by *zipeldiablo*
> 
> Any of you guys still have signal interrupt with the new driver?
> This is the 4th driver since the release of the card and it seems it still isn't fixed (though the issue seems to happen less).


Never had the issue.


----------



## viz0id

Quote:


> Originally Posted by *foolycooly*
> 
> Same for me. I thought maybe it could be Corsair CUE for my kb and mouse, but I have tried closing everything other than XOC and it still doesn't down clock. I wonder if XOC itself is causing it. I could maybe try a different monitoring program.


Quote:


> Originally Posted by *KraxKill*
> 
> You have nothing else in the sys tray? Try turning off the audio driver helper just to check. Try running 60Hz just to see; reboot after you set it. It's a frustrating issue, but you'll figure it out. It's either the driver (less likely, though I've blamed it many times), or, more likely, something in the background making it clock up. Maybe try running just GPU-Z and turn off Afterburner. Try to isolate it.


Just did another reboot. Set a lower Hz, removed Afterburner @ startup. Still the same base clock.

I could kind of understand it if it actually had some load. But the GPU load is 0%.

I might just accept it and run the fans at around 20% to keep it around 42-43C idle.


----------



## madmeatballs

Quote:


> Originally Posted by *chibi*
> 
> I would go with 1.5mm because it will compress and make better contact. The 0.5mm pad seems too thin in my opinion. I'm too impatient to wait and order 1.5mm pads since I already have the block put together.
> Did you use any nut to secure the last two pan head screws to your backplate/pcb? I left mine off, but may try to secure it.


Me too, I also put it all together already, as you can see. Anyway, I just screwed them in and they held, lol! I don't really understand what they are for.


----------



## foolycooly

Quote:


> Originally Posted by *viz0id*
> 
> Just did another reboot. Set a lower Hz, removed Afterburner @ startup. Still the same base clock.
> 
> I could kind of understand it if it actually had some load. But the GPU load is 0%.
> 
> I might just accept it and run the fans at around 20% to keep it around 42-43C idle.


That's my solution for now: I'm running my base fan speed at 30% (I think... maybe 20%) and it's keeping it in the low to mid 40s. I can't imagine it's going to matter in the long run, but having the fans completely silent on the desktop would be nice.

Edit: A couple of things I still plan to try: having all 3 of my monitors on the same cable type (DP). Right now I have 2x DP and 1x DVI; I'm just waiting for another DP cable to be delivered. When I go to swap it out, I might DDU my driver and do a clean install with only the 144Hz display plugged in, then add the 2nd and 3rd 60Hz monitors via DP after the driver is installed (read this worked for one person).


----------



## KraxKill

Quote:


> Originally Posted by *viz0id*
> 
> Just did another reboot. Set a lower Hz, removed Afterburner @ startup. Still the same base clock.
> 
> I could kind of understand it if it actually had some load. But the GPU load is 0%.
> 
> I might just accept it and run the fans at around 20% to keep it around 42-43C idle.


Leave GPUz running and unplug the display from the display port. Does it down clock with the display unplugged? Not sure what else it could be.


----------



## noles1983

Where is the EK Seahawk version?! Come on MSI...


----------



## chibi

Quote:


> Originally Posted by *madmeatballs*
> 
> Me too, I also put it all together already as you can see. Anyway, I just screwed it and it held it lol! Don't really understand what they are for.


Interesting... did the screw threads catch onto the PCB? Or do they just kind of sit there and hold onto the backplate only? I wish those screws were also countersunk into the backplate to match the rest of them. Looks a bit odd having them pop up like that, lol.


----------



## Slackaveli

Quote:


> Originally Posted by *viz0id*
> 
> Tried to remove one monitor, and reboot. Did not help. I have also now tried to downclock the monitor back to normal, so it is native 144Hz. Still doesnt work.
> 
> I have tried to disable G-sync, change to 120hz, 100hz, 60hz. Still doesnt work. It just wants to run at baseclock all the freaking time.
> 
> Starting to run out of things to test.


you DO have it set to "optimal" or "adaptive" in nvidia control panel, right? also, try setting the lower Hz of the two monitors as primary.


----------



## foolycooly

Quote:


> Originally Posted by *Slackaveli*
> 
> you DO have it set to "optimal" or "adaptive" in nvidia control panel, right? also, try setting the lower Hz of the two monitors as primary.


I know there are two of us responding to multiple people, but to summarize what I have tried (and hopefully I can add what Slackaveli has tried):

- Confirmed that power is set to "optimal" in NCP (with a shut down/restart in between toggling)
- Tried setting the 144Hz main display to 120Hz. Even tried setting it to 60Hz (to match the side panels) and it didn't help.
- Changed the lower-refresh-rate side monitor to "main display" (and restarted), then back again
- Toggled G-Sync off/on and full screen/full screen windowed modes
- Enabled K-Boost in XOC, then disabled it again
- Killed extraneous Nvidia processes in task manager (I couldn't find any... this is a fresh copy of W10 on a clean SSD. The only Nvidia processes were "Nvidia Container")
- Closed all task bar applications (the only things running were XOC and "GPU activity", which I enabled through NCP)

Things I still want to try:

- DDU removal of the current driver and a clean install with only the 144Hz main display connected, then re-connect the side displays
- Run all 3 monitors off DisplayPort (instead of 2x DP and 1x DVI)
- Nvidia Inspector's multi-display power saving mode (I don't want to do this; I'd rather just run a higher base fan speed to keep idle temps low)


----------



## Slackaveli

I'm gonna bet the DVI cable is the culprit. I have the issue from time to time since I'm multi-monitor, with a QHD/144Hz G-Sync panel on DP and a 4K/60 on HDMI 2.0. If G-Sync isn't on and that monitor is primary, I get the bug. If I switch to the 4K as primary and reboot, then as long as G-Sync is on I'll be back down to 240MHz and using 24W idle. At 1500MHz, w/ the bug, it's 95W idle.


----------



## viz0id

Quote:


> Originally Posted by *Slackaveli*
> 
> you DO have it set to "optimal" or "adaptive" in nvidia control panel, right? also, try setting the lower Hz of the two monitors as primary.


Quote:


> Originally Posted by *foolycooly*
> 
> I know there are two of us responding to multiple people, but to summarize what I have tried (and hopefully I can add what Slackaveli has tried):
> 
> - Confirmed that power is set to "optimal" in NCP (with a shut down/restart in between toggling)
> - Tried setting the 144Hz main display to 120Hz. Even tried setting it to 60Hz (to match the side panels) and it didn't help.
> - Changed the lower-refresh-rate side monitor to "main display" (and restarted), then back again
> - Toggled G-Sync off/on and full screen/full screen windowed modes
> - Enabled K-Boost in XOC, then disabled it again
> - Killed extraneous Nvidia processes in task manager (I couldn't find any... this is a fresh copy of W10 on a clean SSD. The only Nvidia processes were "Nvidia Container")
> - Closed all task bar applications (the only things running were XOC and "GPU activity", which I enabled through NCP)
> 
> Things I still want to try:
> 
> - DDU removal of the current driver and a clean install with only the 144Hz main display connected, then re-connect the side displays
> - Run all 3 monitors off DisplayPort (instead of 2x DP and 1x DVI)
> - Nvidia Inspector's multi-display power saving mode (I don't want to do this; I'd rather just run a higher base fan speed to keep idle temps low)


Ok so I finally fixed it. I thought I had tried everything, and I have, but apparently not the right things in the right order?

I have 2 native 144Hz monitors, so I thought it would be easiest to run them at the same Hz. But apparently that was not the case. (I came from running one 144Hz and one 60Hz, where it was better to run 120 and 60, so I thought it had to do with them being in "sync" with each other: 60, 120 and 180 would work, but 144Hz would **** it up.) ANYWAYS, I DIGRESS.

So what worked for me was having my main monitor, the XB271HU, at 144Hz, and the second monitor at 120Hz, not 144Hz. BUT THE 144 MONITOR AS MAIN MONITOR?!?!

I think I have done the exact same thing 999 times before, but I must have done something in the correct order or something, because now it works.

Oh, and I had Optimal on in NVCP the whole time.

EDIT:
Quote:


> Originally Posted by *Slackaveli*
> 
> Im gonna bet the dvi cable is the culprit. i have the issue from time to time as im multi monitor with a qhd/144Hz gsync on DP and a 4k/60 on HDMI2.0. if the gsync isnt on and that monitor is primary, i get the bug. If i switch to 4k as primary and reboot, as long as gsync is on ill be back to 240Hz and using 24w idle. at 1500Mhz, w/ the bug, it's 95w idle.


Main monitor is DP and second monitor is DVI. Just had to put that in there as well.


----------



## madmeatballs

Quote:


> Originally Posted by *chibi*
> 
> Interesting... did the screw threads catch onto the pcb? Or do they just kind of sit there and hold onto the backplate only? I wish those screws were also counter sunk with the backplate to match the rest of them. Looks a bit odd having them pop up like that, lol.


Nah, it never reached the pcb. lol.


----------



## foolycooly

Quote:


> Originally Posted by *viz0id*
> 
> Ok so i finally fixed it. I thought i had tried everything, and i have, but apparently not the right things in the right order?
> 
> I have 2 native 144hz monitors, so i thought it would be easiest to run them at the same Hz. But apparently that was not the case. (Came from running one 144hz and one 60hz, and then it was better to run 120 and 60. So i thought it had to do with being in "sync" with each other; 60, 120 and 180 would work, but 144hz would **** it up). ANYWAYS I DIGRESS.
> 
> So what worked for me was having my main monitor XB271HU at 144hz, and the second monitor at 120hz and not 144hz. BUT THE 144 MONITOR AS MAIN MONITOR?!?!
> 
> I think i have done the exact same thing 999 times before but I must have done something in the correct order or something because now it works.
> 
> Oh and i had Optimal on in NVCP the whole time.
> 
> EDIT:
> Main monitor is DP and second monitor is DVI. Just had to put that in there as well.


Wait so...just to clarify, you have 2 144hz monitors. You're running one at 144hz (designated as main display) and the second one at 120hz?

I find it bizarre that that works but it doesn't work if you run them both at 144hz. I never thought matched monitors had this issue.


----------



## viz0id

Quote:


> Originally Posted by *foolycooly*
> 
> Wait so...just to clarify, you have 2 144hz monitors. You're running one at 144hz (designated as main display) and the second one at 120hz?
> 
> I find it bizarre that that works but it doesn't work if you run them both at 144hz. I never thought matched monitors had this issue.


It is bizarre indeed. Both are 144hz. The Acer can OC to 165 but i disabled that. That one is a XB271HU connected with DP. The other is an older benq tn panel 1080p 144hz monitor. That one is connected with dvi.

So i run the Acer at 144hz and the benq at 120hz. Acer main monitor.

Tried to put the benq @ 144hz again after i got it working. Then it boosted up to 1550 again. So i have to have it at 120 or lower.


----------



## Bishop07764

Newest drivers 381.89 seem to give me the exact same results here. Going to actually try some more games when I get time. Dang this forum moves fast. There were about 25 pages of posts since I last was able to really read on here. I was kinda hoping that the Palit bios might be my answer to power limiting. Seems like the results are mixed. I might still give it a go if I get more time to mess with it.


----------



## RavageTheEarth

@SlimJ87D

So I took your advice and once again tried to do 1v without locking the curve. Worked out pretty well. Completely stable at 2050 core with a solid 1v. No dropped bins or anything. I'm surprised I'm able to hit 2050 at 1v. I tried 2063, but the game ended up freezing with some artifacts after playing for a little while. So it looks like 1v tops out at 2050 which is only 13Mhz lower than my old overclock which was hitting 2063 @ 1.063v. Also, notice how the grey GPU utilization PerfCap went away. Very happy with this card!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *RavageTheEarth*
> 
> @SlimJ87D
> 
> So I took your advice and once again tried to do 1v without locking the curve. Worked out pretty well. Completely stable at 2050 core with a solid 1v. No dropped bins or anything. I'm surprised I'm able to hit 2050 at 1v. I tried 2063, but the game ended up freezing with some artifacts after playing for a little while. So it looks like 1v tops out at 2050 which is only 13Mhz lower than my old overclock which was hitting 2063 @ 1.063v. Also, notice how the grey GPU utilization PerfCap went away. Very happy with this card!


Nice, I'm glad you trusted me. Was just looking out for you.


----------



## lilchronic

Quote:


> Originally Posted by *Coopiklaani*
> 
> 
> Managed to achieve 415w+ TDP with the 3-resistor mod. Actually, 22Ohm or 47Ohm resistors are the best for this mod. 10Ohm is a little bit too small since nVidia changed their series resistor from 10Ohm to 20Ohm.
> 
> 
> I also experimented with two conductive paints, carbon based, and silver based.
> 
> 
> 
> 
> The carbon based paint has around 20-30Ohm resistance over a 1mm gap. The silver one only has about 0.1-0.2Ohms


Dam i just ordered some 10ohm 0805 smd's to do this mod. Will it still work with the 10 ohm? I bought 100 of them but should have got the roll of 10ohm-100ohm.


----------



## KraxKill

Quote:


> Originally Posted by *lilchronic*
> 
> Dam i just ordered some 10ohm 0805 smd's to do this mod. will it still work with the 10 ohm? I bought 100 of them but should of got the roll of 10ohm-100ohm


Along with the TDP, the 1.093v limit is really a huge limiting factor on these cards. The Ti uses the uP9511 voltage controller. Anybody have a software hack or hard mod method to get say 1.25v out of it?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KraxKill*
> 
> Along with the TDP, the 1.093v limit is really a huge limiting factor on these cards. The Ti uses the uP9511 voltage controller. Anybody have a software hack or hard mod method to get say 1.25v out of it?


You can try that EVGA software hack stuff. Someone used it to do 1.24v on the 1080. There's a video here somewhere.


----------



## HyperC

Quote:


> Originally Posted by *SlimJ87D*
> 
> You can try that EVGA software hack stuff. Someone used it to do 1.24v on the 1080. There's a video here somewhere.


Stuff like this always makes the day better. I mean, are we going to see if Nvidia is lying about killing our cards in 1yr?







I got my new pump a few days ago, much better flow now, but I was too lazy to redo the shunt mod.


----------



## Coopiklaani

Quote:


> Originally Posted by *lilchronic*
> 
> Dam i just ordered some 10ohm 0805 smd's to do this mod. will it still work with the 10 ohm? I bought 100 of them but should of got the roll of 10ohm-100ohm


Yep, 10Ohm will also work. It will give you 5x TDP, 250w->1250w, a little bit overkill.
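For anyone checking the numbers, these multipliers fall out of a simple formula, assuming the card senses the shunt differentially through a 20Ohm series resistor on each of the two sense lines (the 20Ohm value mentioned earlier in the thread): a resistor Rp across the sense pair scales the effective TDP by (Rp + 2×20)/Rp. A quick sketch, with the 250w FE stock TDP as the base:

```shell
# Sketch of the resistor-mod TDP math. Assumption: a 20 ohm series
# resistor sits in EACH of the two sense lines, so a parallel resistor
# Rp attenuates the sensed shunt voltage by Rp / (Rp + 2*Rs) and the
# effective power limit scales by the inverse.
RS=20            # assumed series resistor per sense line, in ohms
BASE=250         # FE stock TDP in watts
for RP in 10 22 47; do
  awk -v rs="$RS" -v rp="$RP" -v base="$BASE" 'BEGIN {
    mult = (rp + 2 * rs) / rp
    printf "Rp = %2d ohm -> %.2fx TDP (%dw -> %.0fw)\n", rp, mult, base, base * mult
  }'
done
# Rp = 10 ohm -> 5.00x TDP (250w -> 1250w)
# Rp = 22 ohm -> 2.82x TDP (250w -> 705w)
# Rp = 47 ohm -> 1.85x TDP (250w -> 463w)
```

That matches the 5x for 10Ohm and 2.82x for 22Ohm quoted above; treat the formula as a working assumption about the sense circuit, not a schematic.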


----------



## CoreyL4

So what is the general consensus on the Zotac cards? Might pick one up.


----------



## HyperC

Quote:


> Originally Posted by *KedarWolf*
> 
> I may have stopped the voltages jumping around even on a straight voltage curve line.
> 
> Right click the .bat file, choose Edit and open it in Notepad, change the number after -pl to your max power limit, you can look your max power limit up at:
> 
> https://www.techpowerup.com/vgabios/?architecture=&manufacturer=&model=GTX+1080+Ti&interface=&memType=&memSize=&since=
> 
> Open your Task Scheduler search for it in Run or Cortana, add the bat file to run at login, just add it as a Basic Task, then double click the entry in your task list and edit it like the screenshots. Have the bat file start 30 seconds after login.
> 
> You need to click on Properties in the right window to edit it like the pictures when you double-click it from your task list.
> 
> *Unzip this file for powerlimit.bat file.*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> powerlimit.zip 0k .zip file
> *Unzip this file for powerlimit.bat file.*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Or you can just edit and run the bat file itself but you'll have to do it on every login.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Seems manually setting the power limit even if it doesn't actually change it fixes that issue.


Is that normal? Or are you not allowed to edit the number above the firmware limit?


----------



## KedarWolf

Quote:


> Originally Posted by *HyperC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I may have stopped the voltages jumping around even on a straight voltage curve line.
> 
> Right click the .bat file, choose Edit and open it in Notepad, change the number after -pl to your max power limit, you can look your max power limit up at:
> 
> https://www.techpowerup.com/vgabios/?architecture=&manufacturer=&model=GTX+1080+Ti&interface=&memType=&memSize=&since=
> 
> Open your Task Scheduler search for it in Run or Cortana, add the bat file to run at login, just add it as a Basic Task, then double click the entry in your task list and edit it like the screenshots. Have the bat file start 30 seconds after login.
> 
> You need to click on Properties in the right window to edit it like the pictures when you double-click it from your task list.
> 
> *Unzip this file for powerlimit.bat file.*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> powerlimit.zip 0k .zip file
> *Unzip this file for powerlimit.bat file.*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Or you can just edit and run the bat file itself but you'll have to do it on every login.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Seems manually setting the power limit even if it doesn't actually change it fixes that issue.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is that normal? or are you not allowed to edit the number above the firmware?
Click to expand...

You are not allowed to edit the number above the max TDP in the firmware, but on some BIOSes the enforced limit is less than the max limit. On the Palit, for example, the enforced limit is only 348w and it won't set the max to 350w until you run the bat with '-pl 350'.
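For reference, the bat boils down to a couple of nvidia-smi calls. A minimal sketch (the 350 value is just an example, use your own card's BIOS max, and nvidia-smi needs to run elevated to set the limit):

```shell
# Minimal sketch of a power-limit bat; the wattage is an example.
nvidia-smi -q -d POWER    # show default / enforced / max power limits
nvidia-smi -pl 350        # set the enforced limit (watts), up to the BIOS max
```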


----------



## dunbheagan

Quote:


> Originally Posted by *4lek*
> 
> I have a *49*30k. I'm not sure if 4730k even exists, i know about 4790k.
> 
> Anyhow, untill i can get 90~100 fps in every game [ i don't even care about max graphic settings] i'm good.
> 
> I will change mobo,cpu and ram next year probably, but atm if i buy two 1080ti [+ waterblocks] that is..


Yeah, sorry, i mixed up the numbers but i meant the same cpu. Your 4930k + 1080ti sli will give you 100+ fps at 1440p, so don't worry about it. No need to upgrade to get what you want.


----------



## lilchronic

Quote:


> Originally Posted by *Coopiklaani*
> 
> Yep, 10Ohm will also work. It will give you 5x TDP, 250w->1250w, a little bit overkill.


Thanks








So 22-30Ohm would give a lower power limit?

By the way, did you solder that on? Looks like a nice job.


----------



## feznz

Quote:


> Originally Posted by *viz0id*
> 
> Hi guys. I've seen from screenshots of some of you guys, that when your cards are in idle they downclock to really low frequencies (like 200mhz). How do i do that?
> 
> My card idles at 1550 which makes the temp much higher than it needs to be. I have a Gsync monitor and Gsync enabled in windowed mode. Does that have anything to do with it?
> 
> Card is a OC'ed 1080ti Strix on air.
> 
> EDIT: Should also add that i have one 1080p 144hz screen as well. So running 2x monitor setup.


Quote:


> Originally Posted by *foolycooly*
> 
> Let me know if you figure it out. I tried several things last night with no luck. I still idle 1480-1520 in the 40c-55c range. Pretty annoying.


Did you close down MSI AB or whatever GPU OC program you used, or reinstall the Nvidia drivers / GPU OC program?

I idle @ 240 Mhz


----------



## Thoth420

Hey all, I am deciding between the reference and the Strix 1080Ti for a brand new 4K build centered around the new Quantum Dot ASUS ROG panel, and plan to do a custom loop.
The two cards mentioned above are compatible with my preferred EK full-coverage water blocks, so I was looking for some input on whether it is worth going with the Strix or not, and whether there are rumors of any other EK custom AIB blocks coming. I prefer to buy my Nvidia cards from EVGA because processing an RMA if needed through ASUS is a nightmare in comparison. My case will be plenty big enough to house either card because the board I am planning on is E-ATX, so size of card is not a problem at all.


----------



## foolycooly

Quote:


> Originally Posted by *feznz*
> 
> Did you close down MSI Ab or what ever OC GPU programme used or reinstall NVdia drivers/ GPU OC program
> 
> I idle @ 240 Mhz


I just uninstalled EVGA XOC planning to do a DDU uninstall and reinstall of the drivers. I then tried unplugging my second and third monitor so that only my 144hz monitor was plugged in. It finally downclocked to around 215!

I'm running a few heaven benchmarks with only the single monitor plugged in for comparison to after I plug all 3 monitors back in. I then plan to plug one at a time in and see what happens.

Edit: Plugging in my second 60hz monitor with displayport kept the idles at 215. However, when I plugged my third 60hz panel in with DVI, the clocks spike back up to 1500+. So it seems DVI is the culprit after all. My displayport cable is arriving tomorrow--hoping that running all 3 displays on displayport will correct the issue permanently!

One strange thing is that while my clocks came down to 215, the temperature seems to be sticking at 51c. I find it hard to believe that a 1300mhz reduction in core clock isn't bringing the temperature down at all. Any thoughts?
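If anyone else wants to watch this live while plugging monitors in and out, a one-liner like this works (a sketch using standard nvidia-smi query fields):

```shell
# Poll core clock, temperature, power and fan every 2 seconds while
# plugging displays in and out; Ctrl+C to stop.
nvidia-smi --query-gpu=clocks.sm,temperature.gpu,power.draw,fan.speed --format=csv -l 2
```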


----------



## KraxKill

Quote:


> Originally Posted by *SlimJ87D*
> 
> You can try that EVGA software hack stuff. Someone used it to do 1.24v on the 1080. There's a video here somewhere.


It works using the "classified" utility on the 1080, but the Ti uses a different voltage controller.

I wonder what Afterburner sets to put the voltage at +10% +100%. Wonder what would happen if you manually set the value to say 130%. Would it grab an extra bin?


----------



## Benny89

Witcher 3 is a killer. No matter what I do, how I set it it will after some period of time downclock your card. I have tried everything but it is impossible to keep any sort of "higher" clocks with Witcher 3.

KedarWolf .bat file helped to let it keep steady clocks longer but it will downclock your card anyway.

Thing is, not once did I see the card hitting 120% PL. I have no idea what causes Witcher 3 to downclock cards all the time....

I can keep clocks like 2075 all the time in BF1. But Witcher 3 will not even allow me to keep 2038. It will downclock to 2012 after an hour or so. Temperatures are not any higher than when it was keeping the higher clock, and TDP is not even reaching 119%...

What is wrong with this game....


----------



## palote99

Quote:


> Originally Posted by *Benny89*
> 
> Witcher 3 is a killer. No matter what I do, how I set it it will after some period of time downclock your card. I have tried everything but it is impossible to keep any sort of "higher" clocks with Witcher 3.
> 
> KedarWolf .bat file helped to let it keep steady clocks longer but it will downclock your card anyway.
> 
> Thing is not even once I saw card hitting 120% PL. I have no idea what cause in Witcher 3 to downlock cards all the time....
> 
> I can keep clocks like 2075 all the time in BF1. But Witcher 3 will not even allow me to keep 2038. It will downclock to 2012 after like hour or so. Temperatures are not any higher then when it was keeping higher clock and TDP is not reaching 119% even...
> 
> What is wrong with this game....


I think it's not the game, it's how this card works..... I don't understand it either......


----------



## CoreyL4

So i am benchmarking at 1.000v. Limit seems to be 2012 where it will be stable. However during the run perf cap in gpuz is a rainbow. What does that mean?


----------



## KedarWolf

Quote:


> Originally Posted by *CoreyL4*
> 
> So i am benchmarking at 1.000v. Limit seems to be 2012 where it will be stable. However during the run perf cap in gpuz is a rainbow. What does that mean?


I'm not exactly sure what the rainbow means, but if clocks are not jumping around in Superposition 4K Optimized and Time Spy test 2 and stay at 2012, you're good even with the rainbow.
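For what it's worth, GPU-Z's PerfCap colors correspond to the driver's clock throttle reasons, and you can read the same flags from the command line (a sketch):

```shell
# Lists "Clocks Throttle Reasons" (SW Power Cap, HW Slowdown,
# SW Thermal Slowdown, etc.) -- the same flags GPU-Z shows as PerfCap.
nvidia-smi -q -d PERFORMANCE
```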


----------



## CoreyL4

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm not exactly sure what the rainbow means but if clocks are not jumping around in Superposition 4K Optimized, Time Spy test 2 and stay at 2012 you're good even with the rainbow.


Ok thank you!


----------



## Qba73

So Jacob Freeman just posted this on twitter, FTW3, nice temps for air, low PL too and no volts.

Getting mine to pit against my Aorus Extreme in about a week.


----------



## Luckael

Hello guys, is this normal for the GTX 1080 Ti FE? The debug mode is enabled. I'm having frame stuttering in all games like BF1 and The Division. I don't know if this is the issue; I never experienced frame drops on my GTX 1080 Xtreme Gaming before. Any idea? Thank you.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> Witcher 3 is a killer. No matter what I do, how I set it it will after some period of time downclock your card. I have tried everything but it is impossible to keep any sort of "higher" clocks with Witcher 3.
> 
> KedarWolf .bat file helped to let it keep steady clocks longer but it will downclock your card anyway.
> 
> Thing is not even once I saw card hitting 120% PL. I have no idea what cause in Witcher 3 to downlock cards all the time....
> 
> I can keep clocks like 2075 all the time in BF1. But Witcher 3 will not even allow me to keep 2038. It will downclock to 2012 after like hour or so. Temperatures are not any higher then when it was keeping higher clock and TDP is not reaching 119% even...
> 
> What is wrong with this game....


Benny, from one turbo nerd to another, we both just have to stop worrying about oc and just game. I'm a hypocrite when I say this, but who loses sleep over 1% more performance?


----------



## Benny89

Quote:


> Originally Posted by *SlimJ87D*
> 
> Benny, from one turbo nerd to another, we both just got to stop worrying about oc and just game. I'm a hypocrite when I say this, but who lose sleep over 1% more performance?


I know you are right, I am just obsessed with getting absolutely the max out of my hardware in any game, and Witcher 3 is denying me that







.

Yea, you're right. I was able to run Witcher now at 2012, 1.031V, at 62C with 55% fan speed.... Not bad. It runs 2025 below 60C. Maybe if I repaste with Grizzly I could at least get that.

I should really stop reading this thread, for god's sake.....


----------



## Qba73

This thread is the rock and we're the crackheads, we keep coming back lol ;-)

It's never enough, must have crack, I mean MHz (little voices in my head).

When you dream of voltage curves, or think about them at work at random times, you know it's time to step away.

Besides, in 6 months it's Volta (thank Vega for that) and back to the drawing board; the voices start: upgrade, upgrade lol. It's a vicious cycle I tell ya lol


----------



## eXistencelies

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm not exactly sure what the rainbow means but if clocks are not jumping around in Superposition 4K Optimized, Time Spy test 2 and stay at 2012 you're good even with the rainbow.


How much jumping around is ok? I am looping Time Spy graphics test 2 and the mhz jump around from 1936mhz-2050mhz over the whole loop. It mainly stays at 2037-2050mhz though. I am using a curve, +500 memory. This is on a 1080ti FE, water block on it of course. I see my power spiked to 126 at one point. That is bad, right?


----------



## Blotto80

I'm seeing some strangeness on the Palit bios. I've got the PL set to 116% and NVSMI reports 348w as the enforced PL, but during Supo it still power limits at 100% (300w). I've forced it to 350w with SMI and it takes, and the enforced limit changes, but there's no difference once in Supo.


----------



## Coopiklaani

There are actually 2 TDP limits: Total GPU Power (normalized) [% of TDP] and Total GPU Power [% of TDP]. GPU-Z reads out Total GPU Power [% of TDP], whereas AB reads out Total GPU Power (normalized) [% of TDP]. HWiNFO64 reads both TDPs. Both TDPs limit the max power the card can draw. I can't find a way to increase the normalized TDP unless I flash a BIOS with a higher normalized TDP.

The shunt mod only affects Total GPU Power [% of TDP] but not the normalized TDP.



You can see my card (shunt modded) hits power limit because of the normalized TDP.


----------



## Coopiklaani

Quote:


> Originally Posted by *KraxKill*
> 
> Along with the TDP, the 1.093v limit is really a huge limiting factor on these cards. The Ti uses the uP9511 voltage controller. Anybody have a software hack or hard mod method to get say 1.25v out of it?


Pascal doesn't scale well with voltage at room temperature. Voltages higher than 1.093v can cause weird problems. I ran my gtx1080 on the Strix OC bios; when I increased the voltage above 1.093v, event viewer started to be flooded with PCIE WHEA errors regardless of the GPU frequency.


----------



## Coopiklaani

Quote:


> Originally Posted by *lilchronic*
> 
> Thank's
> 
> 
> 
> 
> 
> 
> 
> 
> So 22-30Ohm would be lower power limit?
> 
> By the way did you solder that on? looks like a nice job


I'd recommend 22Ohm; it's 2.82x the original TDP, more than enough for anything. The added resistor will lower the RC constant of the sensor filter circuit, which means the power readings become noisier with more spikes.

No, I didn't solder them on. I simply secured them with some non-conductive thermal glue first, then applied some silver paint. It works very well for me.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Blotto80*
> 
> I'm seeing some strangeness on the Palit Bios. I've got the PL set to 116% and NVSMI reports 348 as the enforced PL but during Supo, it still power limits at 100% (300w). I've forced it to 350w with SMI and it takes and the enforced limit changes but no difference once in Supo.


Everyone has already reported poor results with it, and not just it but all bioses. Just use stock. Through all my testing, flashing and trying out bioses was just a waste of time.

If you really desperately need that power, just shunt mod and be done with it.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Coopiklaani*
> 
> There're actually 2 TDP limit,
> Total GPU Power (normalized) [% of TDP] and Total GPU Power [% of TDP]. GPU-Z reads out Total GPU Power [% of TDP] whereas AB reads out Total GPU Power (normalized) [% of TDP] . HWiNFO64 reads both TDPs. Both TDPs limit the max power the card can draw. I can't find a way to increase the normalized TDP unless I flash a BIOS with a higher normalized TDP.
> 
> The shunt mod only affects Total GPU Power [% of TDP] but not the normalized TDP.
> 
> 
> 
> You can see my card (shunt modded) hits power limit because of the normalized TDP.


What are you running here to have that power limit?


----------



## Coopiklaani

Quote:


> Originally Posted by *SlimJ87D*
> 
> Everyone has already reported poor results with it, and not just it but all bioses. Just use stock.


You need to use it with the shunt mod. The bios alone doesn't help much. My card manages to draw 450w of power with shunt mod + palit bios.


----------



## CoreyL4

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm not exactly sure what the rainbow means but if clocks are not jumping around in Superposition 4K Optimized, Time Spy test 2 and stay at 2012 you're good even with the rainbow.


Did SuPo 4k; what about clock drops from temps? It went from 2025 to 1987 gradually. No jumping all over the place.

EDIT: Just did 2037 at 1.031v and it only went down to 2025.


----------



## Coopiklaani

Quote:


> Originally Posted by *SlimJ87D*
> 
> What are you running here to have that power limit?










Good old Furmark. It was drawing 450w+ when I took the screenshot. Shunt mod + palit bios.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Coopiklaani*
> 
> You need to use it with the shunt mod. The bios alone doesn't help much. My card manages to draw 450w of power with shunt mod + palit bios


What's the point though? It runs like crap; if you run benchmarks that aren't power hungry, where stock wouldn't power limit, Palit gets worse scores. Heck, users without the shunt score better being power limited than having 30 extra watts.

So what's the point? Draw extra power to get worse results, higher thermals, plus a higher risk of damaging the card? In other words, all it accomplished for everyone is worse results, and in situations where that 30 watts helps, it only barely matched the stock BIOS. It sounds counterproductive to flash a bios to get more power and heat so that it can do worse or tie with the stock BIOS.

Lastly, I really don't think you did your shunt right; something is wrong, because I don't hit power limits like you do in that image. I can draw 410 watts on my card. But why do I need more power than this? Superposition doesn't power limit me. Games will never run like Superposition, so why do I need 500 watts going into my card?


----------



## s1rrah

Quote:


> Originally Posted by *Benny89*
> 
> Witcher 3 is a killer. No matter what I do, how I set it it will after some period of time downclock your card. I have tried everything but it is impossible to keep any sort of "higher" clocks with Witcher 3..


My go to overclock of 2025mhz mostly stays there ... but will eventually downclock to 2012mhz as well ... other areas load and it will go back to 2025mhz ... and so it goes ...

I'm just using a +80 overclock in EVGA Precision X with power and thermal sliders set to max. I never break 43C in game ... don't use a curve ... haven't overvolted either.

Also, I get power limiting all the time ... like every 5 to 20 seconds ... but things are smooth as silk and frames are always 115FPS to 144fps depending on the area so I mean, is that power limit thing even anything to worry about? Don't the cards just do that by default?


----------



## KraxKill

Quote:


> Originally Posted by *Coopiklaani*
> 
> Pascal doesn't not scale well with voltage under room temperature. Voltages higher than 1.093v can have weird problems. I ran my gtx1080 on Strix OC bios. When I increased voltage above 1.093v, event viwer started to be flooded with PCIE WHEA errors regardless of the GPU frequency.


My goal is to not be at room temperature
Quote:


> Originally Posted by *Coopiklaani*
> 
> There're actually 2 TDP limit,
> Total GPU Power (normalized) [% of TDP] and Total GPU Power [% of TDP]. GPU-Z reads out Total GPU Power [% of TDP] whereas AB reads out Total GPU Power (normalized) [% of TDP] . HWiNFO64 reads both TDPs. Both TDPs limit the max power the card can draw. I can't find a way to increase the normalized TDP unless I flash a BIOS with a higher normalized TDP.
> 
> The shunt mod only affects Total GPU Power [% of TDP] but not the normalized TDP.
> 
> 
> 
> You can see my card (shunt modded) hits power limit because of the normalized TDP.


Both my normalized and total maximums are within 1% of each other at 99% @ 2100mhz Stock FE Bios, Shunt modded @ 1.063v

99.2 vs 99.4


----------



## CoreyL4

Testing out my gaming x so far. 2050 at 1.050 seems to be stable. Haven't done memory yet.


----------



## JJ1217

Pretty crazy how a minor undervolt on the FE makes it so much more bearable, with better temperatures. Not the most intensive application (it was just Unigine Valley), but still, dropping from 78 degrees at 78% fan speed to 67 degrees at 67% fan speed just made it so much better. Might add some Conductonaut to try and get it down a few more degrees.


----------



## DabboPiff

Quote:


> Originally Posted by *Qba73*
> 
> lol true story


Quote:


> Originally Posted by *Benny89*
> 
> I know you are right, I am just obsesed with getting absolutely max out of my hardware in any game and Witcher 3 is denying me that
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Yea, you right. I was able to run Witcher now at 2012 1.031V at 62C with 55% Fan speed.... Not bad. It runs 2025 before 60C. Maybe if I repaste with Grizzly I could at least get that.
> 
> I should really stop reading this thread for god sake.....


Yeah, I was never into benchmarks till I got my 1080ti. I saw how much it was taking over my life in such a short amount of time, I had to stop it before it got too bad lol

I uninstalled 3dmark and Superposition so I don't even have the option. Was spending all my time chasing higher scores instead of playing games haha.


----------



## Qba73

Quote:


> Originally Posted by *DabboPiff*
> 
> Yeah I was never into benchmarks til I got my 1080ti I saw how much it was taking over my life in such a short amount of time I had to stop it before it got too bad lol
> 
> I uninstallled 3dmark and superposition so I don't even have the option. Was spending all my time chasing higher scores instead of playing games haha.


Same here, the temptation was too great for me.

now back to Zelda


----------



## Luckbad

Quote:


> Originally Posted by *foolycooly*
> 
> I just uninstalled EVGA XOC planning to do a DDU uninstall and reinstall of the drivers. I then tried unplugging my second and third monitor so that only my 144hz monitor was plugged in. It finally downclocked to around 215!
> 
> I'm running a few heaven benchmarks with only the single monitor plugged in for comparison to after I plug all 3 monitors back in. I then plan to plug one at a time in and see what happens.
> 
> Edit: Plugging in my second 60hz monitor with displayport kept the idles at 215. However, when I plugged my third 60hz panel in with DVI, the clocks spike back up to 1500+. So it seems DVI is the culprit after all. My displayport cable is arriving tomorrow--hoping that running all 3 displays on displayport will correct the issue permanently!
> 
> One strange thing is that while my clocks came down to 215, the temperature seems to be sticking at 51c. I find it hard to believe that a 1300 reduction in core clock isn't bring the temperature down at all. Any thoughts?


Fan speed


----------



## Bishop07764

Quote:


> Originally Posted by *s1rrah*
> 
> My go to overclock of 2025mhz mostly stays there ... but will eventually downclock to 2012mhz as well ... other areas load and it will go back to 2025mhz ... and so it goes ...
> 
> I'm just using a +80 overclock in EVGA Precision X with power and thermal sliders set to max. I never break 43C in game ... don't use a curve ... haven't overvolted either.
> 
> Also, I get power limiting all the time ... like every 5 to 20 seconds ... but things are smooth as silk and frames are always 115FPS to 144fps depending on the area so I mean, is that power limit thing even anything to worry about? Don't the cards just do that by default?


It depends on the card when you will hit the power limit. Some cards are obviously more efficient than others at lower voltages. I hit the power limit mostly in benches but not much in actual games. I haven't tried Witcher 3 yet since getting my Ti. It's definitely normal behavior; it happens to me too, even with my max temps sitting at 34-36c. The AIB cards might alleviate how quickly you hit the limit with their higher bios limits, and/or if you shunt mod.


----------



## Nico67

Quote:


> Originally Posted by *Blotto80*
> 
> I'm seeing some strangeness on the Palit Bios. I've got the PL set to 116% and NVSMI reports 348 as the enforced PL but during Supo, it still power limits at 100% (300w). I've forced it to 350w with SMI and it takes and the enforced limit changes but no difference once in Supo.


Quote:


> Originally Posted by *Coopiklaani*
> 
> There are actually 2 TDP limits,
> Total GPU Power (normalized) [% of TDP] and Total GPU Power [% of TDP]. GPU-Z reads out Total GPU Power [% of TDP] whereas AB reads out Total GPU Power (normalized) [% of TDP] . HWiNFO64 reads both TDPs. Both TDPs limit the max power the card can draw. I can't find a way to increase the normalized TDP unless I flash a BIOS with a higher normalized TDP.
> 
> The shunt mod only affects Total GPU Power [% of TDP] but not the normalized TDP.
> 
> 
> 
> You can see my card (shunt modded) hits power limit because of the normalized TDP.


That's exactly what I was seeing, so the Palit BIOS raises one of the TDPs, but not both. The Asus BIOS probably does something similar, and no matter what, you don't get past the default FE normalized TDP of 300w.
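On the shunt mod mentioned above: the card infers current from the voltage drop across fixed shunt resistors, so soldering a second resistor in parallel lowers the effective resistance and makes the card under-read its own draw, which is why the stock limit then kicks in later. A sketch of the arithmetic in Python (the 5 mOhm values are illustrative, not measured from a 1080 Ti):

```python
def parallel(r1_mohm: float, r2_mohm: float) -> float:
    """Equivalent resistance (milliohms) of two resistors in parallel."""
    return (r1_mohm * r2_mohm) / (r1_mohm + r2_mohm)

def sensed_power(actual_watts: float, stock_mohm: float, added_mohm: float) -> float:
    """Power the card *reports* after a shunt mod.

    The controller still assumes the stock shunt value, but the real
    resistance is the parallel combination, so the measured voltage drop
    (and hence the inferred current/power) scales by effective / stock.
    """
    return actual_watts * (parallel(stock_mohm, added_mohm) / stock_mohm)

# Illustrative: a 5 mOhm stock shunt with another 5 mOhm soldered on top.
print(parallel(5.0, 5.0))             # 2.5 -> sensed resistance halves
print(sensed_power(600.0, 5.0, 5.0))  # 300.0 -> a 300w limit now permits ~600w actual
```

Note this only models the raw sensor reading; as Coopiklaani observes above, the normalized TDP limit apparently isn't fooled the same way.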


----------



## CoreyL4

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm not exactly sure what the rainbow means but if clocks are not jumping around in Superposition 4K Optimized, Time Spy test 2 and stay at 2012 you're good even with the rainbow.


Was stable in Heaven at 2075 at 1.093v, but the clocks were jumping like crazy in SuPo. Think I'm gonna settle for 2062 at 1.075v for the overclock since that doesn't jump like crazy.


----------



## Slackaveli

Quote:


> Originally Posted by *foolycooly*
> 
> I just uninstalled EVGA XOC planning to do a DDU uninstall and reinstall of the drivers. I then tried unplugging my second and third monitor so that only my 144hz monitor was plugged in. It finally downclocked to around 215!
> 
> I'm running a few heaven benchmarks with only the single monitor plugged in for comparison to after I plug all 3 monitors back in. I then plan to plug one at a time in and see what happens.
> 
> Edit: Plugging in my second 60hz monitor with displayport kept the idles at 215. However, when I plugged my third 60hz panel in with DVI, the clocks spike back up to 1500+. So it seems DVI is the culprit after all. My displayport cable is arriving tomorrow--hoping that running all 3 displays on displayport will correct the issue permanently!
> 
> One strange thing is that while my clocks came down to 215, the temperature seems to be sticking at 51c. I find it hard to believe that a 1300MHz reduction in core clock isn't bringing the temperature down at all. Any thoughts?


YO, I CALLED that.


----------



## Slackaveli

Quote:


> Originally Posted by *Qba73*
> 
> This thread is the rock and we crackheads we keep coming back lol ;-)
> 
> it's never enough must have crack, I mean MHz (little voices in my head)
> 
> When you dream of voltage curves, or thinking about it at work at random times you know it's time to step away
> 
> Besides in 6 months it's Volta (thank vega for that) and back to the drawing board, the voices start, upgrade, upgrade lol. It a vicious cycle I tell ya lol


get outta my head!!










"upgrade, upgrade, upgrade"...

BUT BRAIN, I already have the best gpu, best cpu, best ram!!

"wit your offbrand non-NVMe ssd totin azz..."

AWW, PISS OFF, Brain!


----------



## RadActiveLobstr

Quote:


> Originally Posted by *foolycooly*
> 
> I just uninstalled EVGA XOC planning to do a DDU uninstall and reinstall of the drivers. I then tried unplugging my second and third monitor so that only my 144hz monitor was plugged in. It finally downclocked to around 215!
> 
> I'm running a few heaven benchmarks with only the single monitor plugged in for comparison to after I plug all 3 monitors back in. I then plan to plug one at a time in and see what happens.
> 
> Edit: Plugging in my second 60hz monitor with displayport kept the idles at 215. However, when I plugged my third 60hz panel in with DVI, the clocks spike back up to 1500+. So it seems DVI is the culprit after all. My displayport cable is arriving tomorrow--hoping that running all 3 displays on displayport will correct the issue permanently!
> 
> One strange thing is that while my clocks came down to 215, the temperature seems to be sticking at 51c. I find it hard to believe that a 1300MHz reduction in core clock isn't bringing the temperature down at all. Any thoughts?


It's not just DVI; it's having more than 2 monitors. I have the same issue with all three using DP, and now with two on DP and one on HDMI (as the DP input on one of my monitors has died).

It's an issue that's been pointed out to Nvidia for a long time, and it's gone from "We are aware of it and will work on a fix" to now "It's working as intended".


----------



## MURDoctrine

Quote:


> Originally Posted by *RadActiveLobstr*
> 
> It's not just DVI, it's having more than 2 monitors. I have the same issue with all three using DP and now two with DP and one HDMI (as a DP on my one monitor has died).
> 
> It's an issue that's been pointed out to Nvidia for a long time and it's gone from "We are aware of it and will work on a fix" to now "It's working as intended".


Is this an issue with above-60Hz displays? I run three 60Hz panels and mine downclock fine.


----------



## KickAssCop

Can someone guide me on how to unlock voltage on MSI Gaming X. Thanks.
My card is crap and only runs 1949 sustained clocks.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KickAssCop*
> 
> Can someone guide me on how to unlock voltage on MSI Gaming X. Thanks.
> My card is crap and only runs 1949 sustained clocks.


What happened to "oh wow never saw people waste time oc."

Karma.


----------



## Juggalo23451

Quote:


> Originally Posted by *KickAssCop*
> 
> Can someone guide me on how to unlock voltage on MSI Gaming X. Thanks.
> My card is crap and only runs 1949 sustained clocks.


You will need a text editor such as Notepad++ (it's completely free). Make sure MSI Afterburner is not running.

First open MSI Afterburner and click the (i) button; this shows the graphics card's info.

Copy this portion (this is an example):

*VEN_10DE&DEV_1B06&SUBSYS_120F10DE&REV_*

Then go to the folder where MSI Afterburner is installed.

Find the oem2 file and edit it with Notepad++.

Scroll down.

Put this in for your own reference if you want:

#1080ti voltage unlock#

If not, just put this in:

[VEN_10DE&DEV_1B06&*SUBSYS_120F10DE*&REV_??] (your VEN number may be different)
VDDC_Generic_Detection = 1

Once done, save.

Open MSI Afterburner, go into settings and select third-party voltage control.

Save and restart MSI Afterburner.

Credit goes to KedarWolf
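Put together, the edited entry in the Afterburner profile file would look roughly like this (a sketch: the hardware ID shown is the example from above, so substitute the ID reported for your own card in the (i) dialog):

```
#1080ti voltage unlock#
[VEN_10DE&DEV_1B06&SUBSYS_120F10DE&REV_??]
VDDC_Generic_Detection = 1
```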


----------



## KickAssCop

Thanks.


----------



## KickAssCop

Quote:


> Originally Posted by *SlimJ87D*
> 
> What happened to "oh wow never saw people waste time oc."
> 
> Karma.


Are you talking to the right person?


----------



## Juggalo23451

Quote:


> Originally Posted by *KickAssCop*
> 
> Thanks.


I forgot to add a thing or two that has been corrected.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KickAssCop*
> 
> Are you talking to the right person?


I sure am; the avatar is as distinguishable as the name.
Quote:


> Originally Posted by *KickAssCop*
> 
> Bios flashing on a 1080 Ti is a joke. Can't believe people want to talk about that extra 50 MHz and feel special.


Wasn't hard to Google your name and 1080 Ti.


----------



## Boyd

I can't keep up with this thread; there are a lot of people I'd like to reply to, but every time I attempt to do so, I realize their post is 75 to 100 pages old.


----------



## buddatech

Yes, like this question that got buried and never replied to:

Can/should I use CLU Pro on the GPU die, or should I stick with the other paste I have on hand, Antec Formula 7?


----------



## MURDoctrine

Quote:


> Originally Posted by *buddatech*
> 
> Yes like this question that got buried and never replied to
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can/should I use CLU Pro on GPU Die or should I stick with other paste I have on hand Antec Formula 7??


I'm using MX-4 on mine and it's working fine. I'm under water; I idle at ~22C and the highest I've seen under load was 46C after a few hours of gaming. There's usually only a few degrees' difference between most TIMs.


----------



## KickAssCop

Quote:


> Originally Posted by *SlimJ87D*
> 
> I sure am, avatar is pretty distinguishable as the name.
> Wasn't hard to Google your name and 1080 Ti.


Voltage unlock vs. BIOS flash: two different things. Also, I'm looking more for a 100 MHz gain here, since my ASUS 1080 Ti FE was doing 2050 and this one doesn't do more than 1950. If I can use a voltage unlock to get an additional 100 MHz, I will be pleased.

Anyways, I am selling the MSI Gaming X POS today and ordered an ASUS STRIX for another chance at the lottery.


----------



## MURDoctrine

Quote:


> Originally Posted by *KickAssCop*
> 
> Voltage unlock vs. Bios flash. Two different things.


While I avoid BIOS flashing myself, both are meant to achieve the same thing: higher clocks one way or another.


----------



## mshagg

Quote:


> Originally Posted by *KickAssCop*
> 
> Can someone guide me on how to unlock voltage on MSI Gaming X. Thanks.
> My card is crap and only runs 1949 sustained clocks.


The latest version of Afterburner works out of the box. I know a few people have had a hard time with the download; I've probably got the installer somewhere in my Downloads-folder cesspool if people need it.


----------



## KickAssCop

Didn't know. Thanks for the info. Will try and find the download location.


----------



## mshagg

4.4.0 Beta 6 is the one you seek


----------



## viz0id

Quote:


> Originally Posted by *s1rrah*
> 
> My go to overclock of 2025mhz mostly stays there ... but will eventually downclock to 2012mhz as well ... other areas load and it will go back to 2025mhz ... and so it goes ...
> 
> I'm just using a +80 overclock in EVGA Precision X with power and thermal sliders set to max. I never break 43C in game ... don't use a curve ... haven't overvolted either.
> 
> Also, I get power limiting all the time ... like every 5 to 20 seconds ... but things are smooth as silk and frames are always 115FPS to 144fps depending on the area so I mean, is that power limit thing even anything to worry about? Don't the cards just do that by default?


What kind of voltage are 2025 and 2012 at? I had huge problems with just a flat offset on the core. Once I moved to lower voltages (using the curve) I got much more stable clocks.

So I would recommend trying the curve if you haven't. Try putting 2012 at 1.000v; if that works, work your way up.


----------



## eXteR

It's been about a week since I flashed the SC2 BIOS.

In my case, this BIOS is better than the Nvidia stock one.

I can clock a little higher: 2050 @ 1.070v (stock) vs 2070 @ 1.063v (SC2).

Also, I can maintain better clocks at less voltage. With stock @ 1v it can only do 1950MHz, while with the SC2 BIOS it ramps up to 1987MHz.

I feel the TDP is less intrusive in Superposition 4K, and I get better scores in 1080p Extreme and 4K Optimized:

1080p Extreme: 6150 with stock BIOS - 6245 with SC2
4K Optimized: 9910 with stock - 1215 with SC2

Also, as I can undervolt a little more, I get lower temps while gaming: Rainbow Six Siege is about 44-46º, and Witcher 3 about 50-52.

Hybrid kit.

Keep in mind that at the moment the AIO pump is connected to the GPU fan connector, so I'm eating a little TDP from the card. I'm waiting to receive a cable to connect it directly to the motherboard.


----------



## g0barry

.


----------



## KedarWolf

Quote:


> Originally Posted by *KickAssCop*
> 
> Can someone guide me on how to unlock voltage on MSI Gaming X. Thanks.
> My card is crap and only runs 1949 sustained clocks.


http://forums.guru3d.com/showpost.php?p=5412373&postcount=216

Use this Afterburner: just install it, then enable 'third party', 'unlock voltage control' and 'voltage monitoring' in preferences, done!!


----------



## viz0id

Are there any benchmarks that show the gain in FPS / benchmark scores going from DDR3 to DDR4 (with higher clock speeds) on the Intel platform? I know the new Ryzen CPUs gain considerable amounts from higher-clocked RAM. I have a Z97 board with a 4790K and DDR3 1833MHz RAM. Thinking about changing it out, but maybe it's better to wait for the next bigger jump in CPU architecture? Haha, like I can hold out until 2019 and Ice Lake.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Blotto80*
> 
> I'm seeing some strangeness on the Palit Bios. I've got the PL set to 116% and NVSMI reports 348 as the enforced PL but during Supo, it still power limits at 100% (300w). I've forced it to 350w with SMI and it takes and the enforced limit changes but no difference once in Supo.
> 
> 
> 
> Everyone has already reported poorly results with it, and not just it but all bios. Just use stock. Through all my testing, flashing and trying out bios was just a waste of time.
> 
> If you really desperately need that power, just shunt mod and be done with it.
Click to expand...

I disagree with you again, Slim. The Palit BIOS has its uses for those it works for, and for some it does work. If you don't have the shunt mod and/or are on air, the Palit BIOS as I tested it gave me decent clocks at the lowest voltages I've ever tested on a BIOS, and while it's not the best performer for benching, for 24/7 gaming use it's a decent BIOS for avoiding power limiting and/or thermal throttling: it hardly power limits at all and can hold good clocks at really low voltages.

You know though, @SlimJ87D, you come across as an elitist everyone HAS to agree with, and you act like you're always right, which at times you're not. At least when you HAVE been right and I've been wrong, I owned up to it. But you're ALWAYS right and have ALL the answers, and it's really frustrating for me that you act like this.

Anyhow, this isn't really the forum to air grievances, but I'd wish you'd not respond in absolutes and realise that you are not the be-all and end-all of these forums. I'm sure you are an intelligent guy, no doubt; it's just that the 'I am right and you are wrong' kind of attitude isn't conducive to a learning and growing community that is here to help each other, not to cause division and strife.

Anyhow, my rant is done.

And yes, this person is having an issue with the Palit BIOS, but I've not had it: no power limiting until 117%.

I'm thinking that even though my card is a reference FE, because it's a first-run Gigabyte edition I got on release day there might be some kind of difference that gives some people varying results from mine while others see similar ones; I've seen both sides of that coin in the forums and elsewhere.

The Palit BIOS does have its uses for those it's working for, though.
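If you want to check for yourselves whether and when a card is power limiting, nvidia-smi can report the current draw and the enforced limit from the command line, as Blotto80 did. A minimal sketch in Python: the query flags are standard nvidia-smi, but the exact output format shown here is an assumption, so adjust the parsing if yours differs:

```python
import subprocess

QUERY = ["nvidia-smi", "--query-gpu=power.draw,power.limit",
         "--format=csv,noheader"]

def parse_power_line(line: str) -> tuple:
    """Turn one CSV line like '248.32 W, 300.00 W' into (draw, limit) in watts."""
    draw, limit = (float(field.split()[0]) for field in line.split(","))
    return draw, limit

def gpu_power() -> tuple:
    """Query the first GPU's current draw and enforced power limit."""
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True).stdout
    return parse_power_line(out.splitlines()[0])

# Parsing a captured sample line needs no GPU:
print(parse_power_line("248.32 W, 300.00 W"))  # (248.32, 300.0)
```

If the draw sits pinned at the reported limit during a bench run, the card is power limiting regardless of what the BIOS claims.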


----------



## KedarWolf

Quote:


> Originally Posted by *viz0id*
> 
> Are there any benchmarks that show the gain in fps / benchmark scores over going from DDR3 to DDR4 (with higher clockspeeds) on intel platform? I know the new Ryzen CPU's gain considerable amounts from higher clocked ram. I have a Z97 board with 4790k and DDR 1833mhz ram. Thinking about changing it out, but maybe it is better to wait for the next bigger jump in CPU architecture? Haha like i can hold out until 2019 and Icelake.


12 core X299 seems promising later this year.

I'm sooooo lucky: I found out from our lawyer at work that I'm eligible for a large class-action lawsuit here in Canada, and in a year or so I should get a big chunk of cash, enough to build a top-of-the-line X299 system with whatever Nvidia generation is available then!!!

The system I have now I could only afford from filing ten years of back tax refunds, and unless I get lucky in the lotto or something, my X299 may be the last high-end PC I can afford for I don't know how long.


----------



## viz0id

Quote:


> Originally Posted by *KedarWolf*
> 
> 12 core X299 seems promising later this year.
> 
> I'm sooooo lucky, found out from our lawyer at work I'm eligible for a large class action lawsuit here in Canada and in a year or so should get a big chunk of cash, enough to build a top of the line X299 system I'm sure with the gen of Nvidia that is available then!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> System I have now I could only afford from doing ten years of back tax refunds, and unless I get lucky in the lotto or something my X299 may be the last high-end PC I can afford for I don't know how long.


Yeah, I have no chance of upgrading to something like that. I had an unexpected cash inflow 1.5 months ago as well: $3,100 of unclaimed income on my YouTube channel that I had forgotten about. Bought a new couch, a 1080 Ti and an XB271HU.


----------



## KedarWolf

Quote:


> Originally Posted by *viz0id*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> 12 core X299 seems promising later this year.
> 
> I'm sooooo lucky, found out from our lawyer at work I'm eligible for a large class action lawsuit here in Canada and in a year or so should get a big chunk of cash, enough to build a top of the line X299 system I'm sure with the gen of Nvidia that is available then!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> System I have now I could only afford from doing ten years of back tax refunds, and unless I get lucky in the lotto or something my X299 may be the last high-end PC I can afford for I don't know how long.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah i have no chance of upgrading to something like that. I had a cash flow in that was unexpected 1,5 month ago as well. With having 3100dollars of unclaimed income on my youtube channel, that i had forgot about. Bought a new couch, 1080ti and a XB271HU
Click to expand...

I love my XB280HK 4K G-Sync screen!! Acer makes pretty decent screens, I think. On second thought, though, I kinda wish I'd gone 144Hz 1440p G-Sync.


----------



## fisher6

Quote:


> Originally Posted by *KedarWolf*
> 
> 12 core X299 seems promising later this year.
> 
> I'm sooooo lucky, found out from our lawyer at work I'm eligible for a large class action lawsuit here in Canada and in a year or so should get a big chunk of cash, enough to build a top of the line X299 system I'm sure with the gen of Nvidia that is available then!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> System I have now I could only afford from doing ten years of back tax refunds, and unless I get lucky in the lotto or something my X299 may be the last high-end PC I can afford for I don't know how long.


I'm also holding out for X299 to upgrade from my Z97 + 4790K setup, but I'm gonna wait for benchmarks against Ryzen. I want at least an 8-core CPU. If Intel is gonna charge a lot more than AMD for similar performance, then I'll be going with Ryzen.


----------



## TWiST2k

Quote:


> Originally Posted by *KedarWolf*
> 
> I love my XB280HK 4K G-Sync screen!! Acer makes pretty decent screens I think. I kinda wish I went 144HZ 1440p G-Sync though on second thought.


1440p 144Hz G-Sync is the bees knees, man! I have the PG279Q on BOTH of my workstations here and I can't imagine gaming without it. Granted, it took a couple of returns to get good ones, but once you get a clean one there really is no looking back!


----------



## BrainSplatter

Quote:


> Originally Posted by *viz0id*
> 
> Are there any benchmarks that show the gain in fps / benchmark scores over going from DDR3 to DDR4


Depends very much on the game. Simple FPS shooters usually don't gain much from faster RAM (Battlefield SP, COD, Quake, ...): 5-10% for 2x speed. Open-world games usually show a little more (e.g. Witcher, GTA, Fallout, Arma 3, ...): 10-20% for 2x speed. And more complex RTS games can benefit a lot (e.g. Ashes of the Singularity, Total War, ...): 10-30% for 2x speed.

When testing Total War: Attila, I found that RAM speed could actually bottleneck a 980 Ti at 1080p on high settings (the extreme setting was more GPU-limited). The difference there was up to 40% for 2x RAM speed.

Here is one RAM speed review: http://www.techspot.com/article/1171-ddr4-4000-mhz-performance/page3.html


----------



## KedarWolf

Quote:


> Originally Posted by *TWiST2k*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I love my XB280HK 4K G-Sync screen!! Acer makes pretty decent screens I think. I kinda wish I went 144HZ 1440p G-Sync though on second thought.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1440p 144hz G-Sync is the bees knees man! I have the PG279Q on BOTH of my workstations here and I cant imagine gaming without it. Granted it took a couple of returns to get good ones, but once you get a clean one, there really is no looking back!
Click to expand...

I'd Amazon one if I do; they have the best return policy of any company here in Canada and pay return shipping, so I could buy and try until I got zero dead pixels.


----------



## viz0id

Quote:


> Originally Posted by *BrainSplatter*
> 
> Depends very much on game. Simple FPS shooters usually don't gain much from faster RAM (Battlefield SP, COD, Quake,...), 5-10%for x2 speed. Open world can show decent games usually a little more (e.g. Witcher, GTA, Fallout, Arma 3, ...), 10-20% for x2 speed. And more complex RTS games can benefit a lot (e.g. Ashes of Singularity, Total War, ...), 10%-30% for x2 speed.
> 
> When testing Total War Attila, I found that RAM speed could actually bottleneck a 980Ti on high settings (extreme setting was more GPU limited) @ 1080p. The difference there was up to 40% for x2 RAM speed.
> 
> Here is one RAM speed review: http://www.techspot.com/article/1171-ddr4-4000-mhz-performance/page3.html


Thanks I'll have a look. I usually just game RPG and FPS.


----------



## TWiST2k

Quote:


> Originally Posted by *KedarWolf*
> 
> I'd Amazon one if I do, they have the best return policy of any company here in Canada, and pay return shipping, I could buy and try until I got zero dead pixels.


That is exactly what I did, haha! I basically purchase EVERYTHING from Amazon these days; their returns and Prime can't be beat. The only real issue I ever had with them was the backlight bleed; I never did see any dead pixels that I can recall.

I feel for you up in Canada, man. I have a good buddy that lives there and his prices are absurdly higher than mine in the States. We have joked about me buying stuff for him and sending it up, but I dunno how VAT or whatever you have there works if I were to send him anything.


----------



## KedarWolf

Quote:


> Originally Posted by *TWiST2k*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I'd Amazon one if I do, they have the best return policy of any company here in Canada, and pay return shipping, I could buy and try until I got zero dead pixels.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That is exactly what I did haha! I basically purchase EVERYTHING from Amazon these days, there returns and prime can't be beat. The only real issue I ever had with them was the back light bleed, never did see any dead pixels that I can recall at this time.
> 
> I feel for you up in Canada man, I have a good buddy that lives there and his prices are absurdly higher then mine in the states. We have joked about me buying stuff for him and sending it up, but I dunno how VAT or whatever you have there works if I were to send him anything.
Click to expand...

I send $400 USD every month to the USA for support payments. It costs me just over $575 in Canadian dollars to buy the $400 USD.


----------



## Taint3dBulge

Only had my block 3 weeks, and this is what it looks like. I flushed the system out when I got the new GPU, and I'm using EK coolant. What the heck?

That's a nickel block too, so copper isn't supposed to be showing.


----------



## MURDoctrine

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Only had my block 3 weeks, and this is what it looks like. Flushed the system out when I got the new gpu. Am using EK coolant. what the heck.
> 
> Thats a nickel block too. so copper isnt supposed to be showing.


I think mine is looking the same in less than a week. I just checked it after seeing yours. Guess I'll be tearing my loop down later tonight. I'm really hoping it's just some residue or something in mine as it doesn't look as bad as yours.


----------



## AngryLobster

Quote:


> Originally Posted by *KedarWolf*
> 
> I disagree with you again, Slim. Palit BIOS has its uses for those it works for and for some it does. If you don't have the shunt mod and/or are on air Palit as I tested it game me decent clocks and the lowest voltages I ever tested on a BIOS and while on benching it not the best performer, for 24/7 use for gaming to avoid power limiting and/or thermal throttling it is a decent BIOS as it hardly power limits at all and can get good clocks are really low voltages.
> 
> You know though, @SlimJ87D, you come across and an elitist that everyone HAS to agree with you and you act like you're always right which at times your not. At least when you HAVE been right and I've been wrong, I owned up to it. But, you're ALWAYs right and have ALL the answers, it's really frustrating for me you act like this.
> 
> Anyhow, it's not a forum to voice issues really but I'd wish you not respond in absolutes and realise that you are not the end all to beat all when it comes to the forums. I'm sure you are an intelligent guy, no doubt, it just seems that the 'I am right and you are wrong' kinda attitude and statements is not conducive to a learning and growing community that is here to help each other, not to cause division and strife.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyhow, my rant is done.
> 
> And yes, this person is having an issue with the Palit BIOS but I've not had this issue, no power limiting until 117%.
> 
> I'm thinking even though my card is a reference FE somehow because it's a first model Gigabyte edition I got on release day there might somehow be some kind of differences where some get varying results then I do while some have similar results and I've seen both sides of that coin in the forums and elsewhere.
> 
> Palit BIOS does have it uses for those it's working for though.


I've been reading through this thread and agree with your sentiment toward that poster.

I feel like because he won the silicon lottery with that card, his posts come off as if he's some GTX 1080 Ti Messiah.


----------



## mshagg

Quote:


> Originally Posted by *viz0id*
> 
> Are there any benchmarks that show the gain in fps / benchmark scores over going from DDR3 to DDR4 (with higher clockspeeds) on intel platform? I know the new Ryzen CPU's gain considerable amounts from higher clocked ram. I have a Z97 board with 4790k and DDR 1833mhz ram. Thinking about changing it out, but maybe it is better to wait for the next bigger jump in CPU architecture? Haha like i can hold out until 2019 and Icelake.


The higher speeds are offset by the looser timings to an extent, although now with DDR4 pushing 4000Mhz, that's a lot of extra bandwidth. Stuff like 3DMark physics test respond really well to the faster memory, but "real world differences" aren't great (again, until you get into the really fast stuff, which is $$$).

The 2011-3 platform gets a nice bump with quad channel, but the memory controllers are fussy and it takes a lot of work to push past 3200Mhz with respectable timings.

This gives you an idea of how gaming performance scales with memory bandwidth:

http://www.techspot.com/article/1171-ddr4-4000-mhz-performance/page3.html

My 2c - plenty of good reasons to upgrade to a newer platform, but DDR4 is only a part of it...

Also, chalk me up as a +1 for the Palit BIOS. Better clock speeds under heavy load, higher benchmark scores, and gaming seems more stable (hah! remember gaming?). I was having all sorts of problems in VR games, which are pretty demanding, with the FE BIOS above 2037MHz. The Palit BIOS has been happy gaming endlessly at 2075MHz.


----------



## Taint3dBulge

Quote:


> Originally Posted by *MURDoctrine*
> 
> I think mine is looking the same in less than a week. I just checked it after seeing yours. Guess I'll be tearing my loop down later tonight. I'm really hoping it's just some residue or something in mine as it doesn't look as bad as yours.


I contacted EK to see what they think. I hope I can get a new block. That really sucks, to say the least.


----------



## viz0id

Quote:


> Originally Posted by *mshagg*
> 
> The higher speeds are offset by the looser timings to an extent, although now with DDR4 pushing 4000Mhz, that's a lot of extra bandwidth. Stuff like 3DMark physics test respond really well to the faster memory, but "real world differences" aren't great (again, until you get into the really fast stuff, which is $$$).
> 
> The 2011-3 platform gets a nice bump with quad channel, but the memory controllers are fussy and it takes a lot of work to push past 3200Mhz with respectable timings.
> 
> This gives you an idea of how gaming performance scales with memory bandwith:
> 
> http://www.techspot.com/article/1171-ddr4-4000-mhz-performance/page3.html
> 
> My 2c - plenty of good reasons to upgrade to a newer platform, but DDR4 is only a part of it...


Thanks for answering. Yeah i have CL10 so not the lowest, but i feel the platform with the CPU i have still holds far too much value to change out for something like Kaby. Maybe if i bought a completely new system minus a new GPU ( i have a 980 i can put in this system), so i had 2 functional systems, i could maybe see myself doing it.


----------



## MURDoctrine

Quote:


> Originally Posted by *Taint3dBulge*
> 
> I contacted ek to see what they think. I hope i can get a new block. That really sucks to say the least.


Well, I just took mine out. In the case, with a flashlight, I could see something resembling the copper color you had; out of the case I couldn't see it. I still think I'm going to do a teardown and clean everything again. I know my Rasa CPU block looks horrible and will be replaced soon with an EK Supremacy, but this has me worried. I've been so wary after the plating issues of old.






----------



## Qba73

Quote:


> Originally Posted by *Slackaveli*
> 
> get outta my head!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> "upgrade, upgrade, upgrade"...
> 
> BUT BRAIN, I already have the best gpu, best cpu, best ram!!
> 
> "wit your offbrand non-NVMe ssd totin azz..."
> 
> AWW, PISS OFF, Brain!


Lol 4real right.... tech, it's a hell of a drug.


----------



## pez

Welp, just pulled the trigger to overnight a Black Edition SC. Hoping this is 'the one' for me.


----------



## DabboPiff

Quote:


> Originally Posted by *KickAssCop*
> 
> Voltage unlock vs. Bios flash. Two different things. Also I am looking more for 100 MHz gain here since my ASUS 1080 Ti FE was doing 2050. This one doesn't do more than 1950. If I can use some voltage unlock to get additional 100 MHz, I will be pleased.
> 
> Anyways, I am selling the MSI Gaming X POS today and ordered an ASUS STRIX for another chance at lottery.


Poor guy; my MSI Gaming X "POS" overclocks to 2050 at 1.050v: +100 core, +400 memory with overclock mode in the MSI Gaming App, and I didn't touch the voltage.


----------



## s1rrah

Quote:


> Originally Posted by *viz0id*
> 
> What kind of voltage is +2025 and 2012 at? I had huge problems with just flat offset on core. Once i moved to lower voltages (using the curve) i got much more stable clocks.
> 
> So i would recommend trying to use the curve if you haven't. Try to put 2012 at 1.000v, if that works work your way up.


Using no curve, with Power/Temp set to max and voltages untouched, ... core at 2012mhz and memory at 5895mhz ... max voltages recorded during stress testing (various tests) ... top out at 1.063v ...

And man I've had very little luck with the curve ... I find those nodes so tedious to move about ... I think the best I got was 2025mhz core at 1.050v ... but it was a little unstable ...

To be sure, though ... other than my OCD (which is horrible BTW) ... I detect no ingame FPS difference when playing at 1987mhz core vs 2025mhz core ... I haven't measured but feel positive 35mhz or so are completely not an issue when gaming.

Fun cards for sure ... just the sheer power of the thing ... but not that great of overclockers ...


----------



## foolycooly

Quote:


> Originally Posted by *MURDoctrine*
> 
> Is this an issue with above 60hz displays because I run 3 60hz panels and downclock fine?


Seems to definitely be related to mixed refresh rates. Although I've heard different things about 3 monitors. It works for some people and not for others. I'll find out later if it's dvi or if it's just 3 monitors. FWIW my 780ti downclocked fine with my same setup.


----------



## viz0id

Quote:


> Originally Posted by *s1rrah*
> 
> Using no curve, with Power/Temp set to max and voltages untouched, ... core at 2012mhz and memory at 5895mhz ... max voltages recorded during stress testing (various tests) ... top out at 1.063v ...
> 
> And man I've had very little luck with the curve ... I find those nodes so tedious to move about ... I think the best I got was 2025mhz core at 1.050v ... but it was a little unstable ...
> 
> To be sure, though ... other than my OCD (which is horrible BTW) ... I detect no ingame FPS difference when playing at 1987mhz core vs 2025mhz core ... I haven't measured but feel positive 35mhz or so are completely not an issue when gaming.
> 
> Fun cards for sure ... just the sheer power of the thing ... but not that great of overclockers ...


The curve can be tedious. Just want to let you know (in case you didn't) that there are hotkeys for the curve: if you hold Shift and click, you can slide the whole curve up and down. Then you just put the flat part where you want it and move the lower points up.

Takes some time fiddling though, and like you said, it's more for just "knowing" it is where you want it to be. You probably won't see it in game.


----------



## eXistencelies

I have been trying to set my curve. I noticed during benchmarks with all three (Time Spy, Fire Strike, Superposition 4K) that the MHz tends to jump around a lot; I see it range from 1934-2063. I am on a full custom 720mm water loop and the card is an EVGA FE 1080 Ti. How come when I put the curve at a certain clock at a certain mV, the crosshair drops below where I have it? If it does that, do I need to lower the point down to match? Also, I have seen my max power hit 127%. Isn't that too high?

My scores are:

Time Spy graphics score: 10,780.
Superposition 4K score: 10,011.

Didn't test Fire Strike, as I used it for a loop when adjusting; the last result I have was from before messing with the curve more.


----------



## mshagg

3x 60Hz monitors on DisplayPort-to-HDMI cables here; core clock currently 253MHz, mem 405MHz.


----------



## trawetSluaP

My card doesn't downclock at idle but I am running Performance mode at 1440p.


----------



## eXistencelies

Quote:


> Originally Posted by *mshagg*
> 
> 3x 60Hz monitors on DisplayPort-to-HDMI cables here; core clock currently 253MHz, mem 405MHz.


Quote:


> Originally Posted by *trawetSluaP*
> 
> My card doesn't downclock at idle but I am running Performance mode at 1440p.


I have two 60Hz monitors and my main 144Hz (165Hz OC), and it never idles under 1500MHz or so.


----------



## HyperC

Quote:


> Originally Posted by *eXistencelies*
> 
> I have been trying to set my curve. I noticed during benchmarks with all 3 (timespy, firestrike, supo 4k) that the mhz tends to jump around a lot. I see it range from 1934-2063. I am on a full custom 720mm water loop. The card is EVGA FE 1080ti. How come when I put the curve at a certain number and at a certain mv the crosshair thing goes below where I have it? If it does that do I need to actually lower the block to that point? Also I have seen my max power hit 127. Isn't that too high?
> 
> My scores are:
> 
> Timespy Graphics Score 10,780.
> Supo 4k Score was 10,011.
> 
> Didn't test firestrike as I used it for a loop when adjusting. Last one I have was before messing with the curve more.


The power limit is making your clocks jump around; that's why some of us have been doing the shunt mod.
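For anyone wondering what the shunt mod actually changes, here's a rough sketch of the arithmetic. All the numbers (5 milliohm stock shunt, 300 W draw) are assumptions for illustration, not measured values from a real 1080 Ti:

```python
# The card estimates power from the voltage drop across a tiny shunt
# resistor. Soldering a second resistor on top puts it in parallel,
# lowering the effective resistance, so the sensed drop (and the
# reported power) shrinks while the true draw stays the same.

def parallel(r1, r2):
    """Effective resistance of two resistors in parallel (ohms)."""
    return (r1 * r2) / (r1 + r2)

def reported_power(true_power, r_stock, r_effective):
    # The sense circuit still assumes r_stock, so the reported figure
    # scales by the ratio of effective to stock resistance.
    return true_power * (r_effective / r_stock)

R_STOCK = 0.005                    # assumed 5 milliohm stock shunt
r_mod = parallel(R_STOCK, 0.005)   # equal resistor on top -> 2.5 milliohm

# A true 300 W draw now reads as 150 W, so the stock power limit is
# effectively doubled.
print(reported_power(300.0, R_STOCK, r_mod))
```

That's also why shunt-modded cards show oddly low TDP percentages in monitoring tools afterwards: the software readout is scaled down by the same ratio.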


----------



## trawetSluaP

Quote:


> Originally Posted by *eXistencelies*
> 
> I have two 60hz and my main 144hz (165hz oc) and it never idles under 1500mhz or so.


Same.


----------



## eXistencelies

Quote:


> Originally Posted by *HyperC*
> 
> The power limit is making your clocks jump around; that's why some of us have been doing the shunt mod


So the only way to fix that is the shunt mod? I am curious why the power jumps higher than 120, though.


----------



## CoreyL4

Tested the core clock of my Gaming X last night. Haven't done memory yet, but here are my results for voltage and core.

It can do the majority of Heaven at 2088 @ 1.093v; it locks up about ~60% of the way through.

2075 at 1.093v/1.081v passes both Superposition 4K and Heaven, but clocks jump around too much in Superposition and Time Spy.

2062 @ 1.062v is perfect; clocks don't jump.

Time for memory tonight.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KickAssCop*
> 
> Voltage unlock vs. Bios flash. Two different things. Also I am looking more for 100 MHz gain here since my ASUS 1080 Ti FE was doing 2050. This one doesn't do more than 1950. If I can use some voltage unlock to get additional 100 MHz, I will be pleased.
> 
> Anyways, I am selling the MSI Gaming X POS today and ordered an ASUS STRIX for another chance at lottery.


No bro, you were judging people for trying to OC their cards and now you got a dud. That's how karma works.


----------



## viz0id

Quote:


> Originally Posted by *eXistencelies*
> 
> So the only way to fix that is the shunt mod? I am curious why the power jumps higher than 120 though?


Or force your card to run at lower voltages. That helped me at least get higher stable clocks without perfcapping.


----------



## eXistencelies

Quote:


> Originally Posted by *viz0id*
> 
> Or force your card to run at lower voltages. That helped me at least get higher stable clocks without perfcapping.


So to do that, I just lower the core at the higher voltage points? I also have my voltage slider at +100.


----------



## pez

Quote:


> Originally Posted by *SlimJ87D*
> 
> No bro, you were judging people for trying to OC their cards and now you got a dud. That's how karma works.


AIB cards OC on their own for the most part once you set a more aggressive profile. I mean, if we're honest, the Tis are all terrible OCers compared to the 1080s and TXPs.


----------



## BrainSplatter

Quote:


> Originally Posted by *eXistencelies*
> 
> So to do that just lower the core at higher voltage points? I also have my volt meter at +100.


Yes, it's just that with the way the curve editor works, you have to raise the clock speed of the lower voltage points. Another good starting point is to check what the maximum stable core clock is for 1.0v (set the clock at 1v to test and flatten the curve after it), because 1v is almost never power limited.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> I disagree with you again, Slim. The Palit BIOS has its uses for those it works for, and for some it does. If you don't have the shunt mod and/or are on air, the Palit BIOS as I tested it gave me decent clocks and the lowest voltages I ever tested on a BIOS, and while it's not the best performer for benching, for 24/7 gaming use it's a decent BIOS to avoid power limiting and/or thermal throttling, as it hardly power limits at all and can get good clocks at really low voltages.
> 
> You know though, @SlimJ87D, you come across as an elitist that everyone HAS to agree with, and you act like you're always right, which at times you're not. At least when you HAVE been right and I've been wrong, I owned up to it. But you're ALWAYS right and have ALL the answers; it's really frustrating for me that you act like this.
> 
> Anyhow, it's not really a forum to voice issues, but I'd wish you'd not respond in absolutes and realise that you are not the end all to beat all when it comes to the forums. I'm sure you are an intelligent guy, no doubt, it just seems that the 'I am right and you are wrong' kind of attitude and statements are not conducive to a learning and growing community that is here to help each other, not to cause division and strife.
> 
> Anyhow, my rant is done.
> 
> And yes, this person is having an issue with the Palit BIOS, but I've not had this issue: no power limiting until 117%.
> 
> I'm thinking that even though my card is a reference FE, because it's a first-model Gigabyte edition I got on release day there might somehow be some kind of difference where some people get varying results from mine while some get similar results; I've seen both sides of that coin in the forums and elsewhere.
> 
> The Palit BIOS does have its uses for those it's working for, though.


All I did was pull statistics; not sure how that's being an elitist? So anyone that believes in math is an elitist?

You have a power supply that measures watts; you had all the time in the world to test and see how many watts the Palit BIOS delivers, which is 30 watts, but instead you just flashed it and then "confirmed" it gives you 50 watts without confirming anything. You also went on about it being the best BIOS out there until everyone started giving it negative feedback. And now I'm an "elitist" because all this happened?

*Honestly, I wouldn't even make comments or suggestions to users about BIOSes, but people don't keep up with this forum and they might have read an old post that didn't give them the full scoop. Maybe my posts have come off as arrogant, and I'll work on that. But the attitude in my posts is due to retaliation over the above paragraph. And I'm sure your entire post is in retaliation for me questioning your methods all the time.*

I'm not always right, but we have a forum where smart people can gather to perform thorough testing. We get more done if people set out to perform research together as a group. And the results speak for themselves.

Just like the ASUS BIOS several posts back: you had five of us perform tests, and we only measured 15 watts of extra power using NVIDIA commands, HWiNFO64, and a PSU. We gathered together to investigate and do the community some good.

All the Palit BIOS did for me was lower my scores across the board in all benchmarks, same as other people have reported. So I'm not sure how it's beneficial to flash a BIOS that just scores worse even with more power draw. That's all I told the previous user.

Truth be told, we both need each other on this forum to conduct research and make discoveries with one another.


----------



## alanthecelt

I assume memory voltage is separate?
I seem to get artifacts around +450 occasionally when running Heaven; my gut says +500 should easily be achievable. I haven't played too much with the memory. The curve took ages last night, but I seem to have a much stronger overclock due to it, with much less power limiting.


----------



## BrainSplatter

Quote:


> Originally Posted by *alanthecelt*
> 
> I assume memory voltage is separate?
> I seem to get artifacts around +450 occasionally when running Heaven; my gut says +500 should easily be achievable


Unfortunately your gut is wrong. Memory OC seems to differ more from card to card than maximum core OC. The range seems to be between +250 and +750, with most cards falling between 350 and 600MHz and the average probably somewhere around 450-500MHz.

Of my two FE 1080 Tis, one does 650MHz, the other about 450MHz.

Also, some memory OCs you see might actually already trigger the error correction in the VRAM modules and therefore cause a performance penalty relative to a slightly lower overclock without ECC kicking in.


----------



## eXistencelies

Quote:


> Originally Posted by *BrainSplatter*
> 
> Yes, just that in the way the curve editor works, you have to raise the clock speed of the lower voltage points. Another good starting point is to check out what's the maximum stable core for 1.0v (set 1v to test core clock and flatten curve behind it) because 1v is almost never power limited.


So see what the highest core clock I can get at 1.000v is, then even out the line after that? Do I end up raising it later, though, or just leave it flat? Thanks, I will def try this when I get home. As long as I am at 2000MHz @ 1.000v I will be happy. I was easily hitting 2136MHz on my 1080, though.


----------



## mshagg

Quote:


> Originally Posted by *eXistencelies*
> 
> So the only way to fix that is the shunt mod? I am curious why the power jumps higher than 120 though?


It's the only reliable way to trick the card into drawing more power. We are testing various non-FE BIOSes but results have been mixed.

It is likely jumping above 120 momentarily; the card detects this and downclocks/downvolts itself.
Quote:


> Originally Posted by *BrainSplatter*
> 
> Unfortunately your guts are wrong. It seems memory OC differs more than maximum core OC. The range seems to be between +250 to +750, with most falling between 350 to 600 Mhz, with the average probably somewhere between 450-500Mhz.
> 
> From my 2 FE 1080Tis, one does 650Mhz the other one about 450Mhz.
> 
> Also, some memory OC's u see might actually already trigger the error correction in the VRAM modules and therefore cause some performance penalty over a slightly lower overclock without ECC kicking in.


It's a shame NVIDIA don't let us monitor ECC flags on these cards; it would be a good way to reliably test a mem OC. NVSMI has the capability, but I assume it's only for their professional/compute products.

Mine has a sweet spot at +452, no more no less, according to benchmarks, but can go +600 without crashing.
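Since we can't read the ECC flags, the practical workaround is what mshagg describes: score each memory offset with a fixed benchmark and keep the best scorer, not the highest offset that survives. A toy sketch (the offsets and scores are invented, not from a real card):

```python
# Hypothetical benchmark scores per memory offset (MHz). The card is
# "stable" all the way to +600, but ECC retries quietly eat
# performance past the sweet spot -- so pick the best-scoring offset,
# not the highest offset that doesn't crash.
scores = {
    0: 9500,
    300: 9740,
    452: 9810,   # sweet spot: best score
    500: 9790,   # still stable, but slower
    600: 9700,   # stable, slower again
}

best_offset = max(scores, key=scores.get)
print(best_offset, scores[best_offset])  # 452 9810
```

Run the same benchmark at each offset, same ambient temps, and the score curve will peak before the crash point if error correction is kicking in.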


----------



## BrainSplatter

Quote:


> Originally Posted by *eXistencelies*
> 
> So see what the highest core clock I can get at 1.000v then even out the line after that? Do I end up raising it later though or just leave it flat?


All voltage points after 1.0v should have a core clock equal/lower than the value for 1.0v (when u press apply in Afterburner it will raise all lower clocks after the 1.0v voltage point to the clock of 1.0v).

That curve u should store in an AB profile. After u find the max clock for 1.0v, u replace the profile with it. Then u can use that curve as a good starting point to find the max OC for 1.012v, 1.025v, ... because u only have to raise the core clock of the next voltage point.

For most games u can usually use higher voltages (-> higher core) before running into a power limit compared to many benchmarks.
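The flattening step is easy to picture as a tiny function over the voltage/clock points. This is only a sketch of the idea; the curve values below are invented and a real Afterburner curve has far more points:

```python
# Sketch of the procedure: find the max stable clock at 1.000v, then
# clamp every point at or above that voltage to it (roughly what
# Afterburner does when you hit apply).

def flatten_curve(curve, pivot_v, pivot_clock):
    """Return a new voltage->clock curve, flat from pivot_v upward."""
    flattened = {}
    for volts, clock in curve.items():
        if volts >= pivot_v:
            flattened[volts] = pivot_clock              # flat section
        else:
            flattened[volts] = min(clock, pivot_clock)  # keep lower points
    return flattened

stock = {0.950: 1900, 1.000: 1936, 1.012: 1949, 1.050: 1987, 1.093: 2012}
flat = flatten_curve(stock, 1.000, 2012)  # suppose 2012 MHz passed at 1.0v
print(flat)
```

From there you bump only the next voltage point (1.012v, then 1.025v, ...) and retest, exactly as described above.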


----------



## eXistencelies

Quote:


> Originally Posted by *BrainSplatter*
> 
> All voltage points after 1.0v should have a core clock equal/lower than the value for 1.0v (when u press apply in Afterburner it will raise all lower clocks after the 1.0v voltage point to the clock of 1.0v).
> 
> That curve u should store in a AB profile. After u found the max clock for 1.0v u replace the profile with it. Then u can then use that curve as a good starting point to find the max OC for 1.012v, 1025v, ... because u only have to raise the core clock of the next voltage point.
> 
> For most games u can usually use higher voltages (-> higher core) before running into a power limit compared to many benchmarks.


OK great! Will def check that out when I get home from work later. So find the highest core clock at 1.000v. Hit the check mark and the rest of the points behind level out. Save that profile. Then test the next point at the next voltage level and see what the highest I can get on that? Then repeat, yes? Also when I am running a benchmark and see the crosshair drop some core where I have the point set I should lower that point to there? Another thing is I should only have 120 power and voltage level on 0, right?


----------



## Dasboogieman

Quote:


> Originally Posted by *alanthecelt*
> 
> I assume memory voltage is seperate?
> i seem to get artifacts around +450 occasionally when i was running heaven, my gut is that +500 should easily be achievable, i havnet played too much with the memory, the curve last night took ages, but i seem to have a much stronger overclock due to it, much less power limiting


It's only a function of voltage if your IMC is the bottleneck, and even then the IMC on Pascal has its own voltage plane + VRM. The thing that has the biggest impact on memory OC is the CAS timings of the VRAM, which cannot be modified by us; it is up to the manufacturer of your GPU to release a BIOS that loosens the timings to allow the extra OC. After that it's a function of the silicon lottery; after all, the odds of landing 11 GDDR5X chips without a single one bottlenecking below +500 are actually on the uncommon side.


----------



## TheBoom

So Zotac were good enough to RMA my card for the cosmetic defect.

New card seems to hold the same core clocks, around 2037 stable. However this one crashes sooner at 2050 than the previous one. But it doesn't jump around as much with clocks stabilising around 2025-2037.

Memory also goes up to +450 on this one without artifacting (previous card could only do +300), this is inclusive of the stock overclock of +100 Zotac has on their Amp Extreme cards.

What I've noticed though is that power usage never goes anywhere near the 120% limit? Max was 115% but that lasted for less than a second before it downclocked and lowered back to around the 110% range.

So am I just being limited by voltage on this chip?

Edit: Also, do all 1080 Tis get stuck on lower bins after an instability or crash occurs, until restart? It's a bit annoying to have to keep restarting when trying to find the max stable clocks.


----------



## BrainSplatter

Quote:


> Originally Posted by *eXistencelies*
> 
> Hit the check mark and the rest of the points behind level out. Save that profile. Then test the next point at the next voltage level and see what the highest I can get on that? Then repeat, yes?


Yes. You just have to lower the core clock value of all voltages points after the 1.0v point once before hitting apply. Then AB will straighten out that line. That's a bit of fiddling due to the number of points. But after that u have a nice starting point for all further tests.

Quote:


> Originally Posted by *TheBoom*
> 
> But it doesn't jump around as much with clocks stabilising around 2025-2037.


Seems to have a lower VID than your first one. That's also why u don't see much power throttling. So there is probably some more power headroom for a higher voltage and clock before it will again start to fluctuate a lot from constantly hitting the power ceiling.


----------



## viz0id

Quote:


> Originally Posted by *eXistencelies*
> 
> OK great! Will def check that out when I get home from work later. So find the highest core clock at 1.000v. Hit the check mark and the rest of the points behind level out. Save that profile. Then test the next point at the next voltage level and see what the highest I can get on that? Then repeat, yes? Also when I am running a benchmark and see the crosshair drop some core where I have the point set I should lower that point to there? Another thing is I should only have 120 power and voltage level on 0, right?


Sorry, I was away and couldn't continue replying to you, but I see BrainSplatter has given you very good answers.

The reason you start at 1.000v is that it is important to find your card's sweet spot. For me, running 1.043v gives me the same clock speeds as running it at higher volts. The only thing higher volts give me is more power draw, which in turn gives me more perfcaps. So for me the sweet spot was 1.043v.

I'm not an expert or anything, I'm just reporting what worked for me.
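The "sweet spot" behaviour falls straight out of the rough rule that dynamic power scales with f·V². Under a fixed power cap, the clock the card can sustain grows as voltage drops. A back-of-envelope sketch (the constant is tuned by hand so the numbers land in a plausible range; nothing here is measured):

```python
# Dynamic power ~ k * f * V^2, so under a fixed budget the
# sustainable frequency is f = budget / (k * V^2).

def sustained_clock_mhz(budget_w, volts, k=0.111):
    # k is an assumed constant chosen so ~2000 MHz at 1.062 V sits
    # near a 250 W power limit (illustration only).
    return budget_w / (k * volts ** 2)

high_v = sustained_clock_mhz(250, 1.093)   # max-voltage profile
low_v = sustained_clock_mhz(250, 1.043)    # undervolted profile

# The undervolted profile sustains a meaningfully higher clock
# before the power limiter starts pulling it back down.
print(round(high_v), round(low_v))
```

Which is exactly the reported experience: more voltage just means hitting the power cap sooner, so the average clock drops.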


----------



## TheBoom

Quote:


> Originally Posted by *BrainSplatter*
> 
> Yes. You just have to lower the core clock value of all voltages points after the 1.0v point once before hitting apply. Then AB will straighten out that line. That's a bit of fiddling due to the number of points. But after that u have a nice starting point for all further tests.
> Seems to hava a lower VID than your first one. That's also why u don't see much power throttling. So there is probably some more power headroom for a higher voltage and clock before it will again start to fluctuate a lot due to hitting the power level ceiling constantly.


So I'm guessing I will get better results using the curve instead?


----------



## BrainSplatter

Quote:


> Originally Posted by *TheBoom*
> 
> So I'm guessing I will get better results using the curve instead?


Well, the curve definitely gives the finest control, especially when trying to limit/optimize the power draw. Whether it's worth fiddling around with to get a few percent extra performance is another thing.

One thing only the voltage curve can do is 'undervolt' the GPU. I have one curve which limits the OC to 1850MHz @ 0.9v, which I use to limit the power draw / noise of the SLI configuration, since I haven't put my FEs under water yet.


----------



## viz0id

Quote:


> Originally Posted by *BrainSplatter*
> 
> Well, the curve gives definitely the finest control, especially when trying to limit/optimize the power draw. Whether it's worth fiddling around with it to get a few percent extra performance is another thing
> 
> The one thing only the voltage curve do, is to 'undervolt' the GPU. I have one curve which limits the OC to 1850Mhz @ 0.9v which I use to limit power draw / noise of the SLI configuration since I haven't put my FE's under water yet.


Yeah, that is the strange thing with this card for me. Giving it more voltage just gave me more trouble; giving it less voltage gave me better clocks.

Usually (at least from what I understand) you undervolt for more efficiency, less power draw and a quieter fan profile. But with my card, giving it lower voltage at the higher clocks let it sit there more stably without bouncing around. (EDIT: Because of the perfcaps on power draw.)


----------



## dunbheagan

Quote:


> Originally Posted by *KedarWolf*
> 
> You know though, @SlimJ87D, you come across as an elitist that everyone HAS to agree with, and you act like you're always right, which at times you're not. At least when you HAVE been right and I've been wrong, I owned up to it. But you're ALWAYS right and have ALL the answers; it's really frustrating for me that you act like this.
> 
> Anyhow, it's not really a forum to voice issues, but I'd wish you'd not respond in absolutes and realise that you are not the end all to beat all when it comes to the forums. I'm sure you are an intelligent guy, no doubt, it just seems that the 'I am right and you are wrong' kind of attitude and statements are not conducive to a learning and growing community that is here to help each other, not to cause division and strife.
> 
> The Palit BIOS does have its uses for those it's working for, though.


I have to agree. Slim, you've made a lot of useful comments here, but sometimes you take things too seriously and forget about basic rules of politeness, and that makes your comments no fun to read.


----------



## Savatage79

Debating how to go forward. My Aorus seems solid, but I still have an FTW3 pre-ordered, and my Aorus has had a few weird moments; I'm not sure if it's considered a good GPU in terms of how these cards should be.

It idled at 49C, but I set a new fan curve and now it's around 31 to 34 with the fans moving; that shouldn't be correct?

As for my numbers, I could only get my core to +25, with memory at +300 alongside the +25 core; now I have my memory at +350 and core at about +10 or +15.

In Andromeda I'm over 2000 steadily overall; it fluctuates from 2012 to about 2038 or so, with a few moments at 2050. Temps mid 50s, maybe 61 at the highest, but I have fans pumping for that which are pretty quiet overall.

But in Andromeda I still get some slight frame pacing, it seems; sometimes a turn of the camera will have an instant chug. It's not anything major, but it's enough to notice. Anyone else experiencing that?

I'm just torn, as I'm an EVGA guy, so I feel I went out of my comfort zone with it. Should it OC a bit better than about +15/+350? At +400 mem it didn't crash, but it flashed the screen once or twice.

Maybe I'm chasing numbers too much. I think what has me a bit messed up is going away from EVGA and sort of feeling out of my comfort zone a bit.


----------



## Benny89

Quote:


> Originally Posted by *Savatage79*
> 
> Debating on how to go forward. My Aorus seems solid but I still have an ftw3 pre ordered and my aorus had a few weird moments but I'm not sure if it's considered a good gpu in terms of how these cards should be.
> 
> It idled at 49, but I set a new fan curve and it is around 31 to 34 but the fans are moving, that shouldn't be correct?
> 
> Then my numbers are basically I could only get my core to 25, memory at 300 with the core to 25, now I have my memory at 350 and core to about 10 or 15.
> 
> In Andromeda I'm over 2000 steadily overall, fluctuates from 2012 to about 2038 or so, a few moments 2050. Temps mid 50s, maybe 61 at highest but the fans I have pumping for that which are pretty quiet overall.
> 
> But in Andromeda I still get some slight frame pacing it seems, like sometimes a turn of the cam will have an instant chug, it's not anything major but it's enough to notice. Anyone else experiencing that?
> 
> I'm just torn as I'm an evga guy, so I feel I went out of my comfort zone with it. Should it OC a bit better than about 15/350? At 400 mem it didn't crash but it flashed the screen once or twice.
> 
> Maybe I'm chasing numbers to much, i think what has me a bit messed up is going away from evga and sort of feeling out of my pocket a bit.


Don't go where I am. There is no difference in performance between 2000 and 2050MHz. At all...

On a side note: I can pass Superposition 4K at 2050, downclocking from 2088.

But Witcher 3 will eat that alive and leave me at 2012MHz after an hour. So if you want to test your clocks, do it with Witcher 3, really. It will laugh at your "stable clocks".

I also find it really funny that some people here can do more than 2038 at 1.000V, while from speaking with some people on PW it seems like 2000 and 2012 are much more common at 1.000V.

I was supposed to stop writing here..... Damn, that card gives me a headache...


----------



## GNUster

Quote:


> Originally Posted by *Savatage79*
> 
> In Andromeda I'm over 2000 steadily overall, fluctuates from 2012 to about 2038 or so, a few moments 2050. Temps mid 50s, maybe 61 at highest but the fans I have pumping for that which are pretty quiet overall.
> 
> But in Andromeda I still get some slight frame pacing it seems, like sometimes a turn of the cam will have an instant chug, it's not anything major but it's enough to notice. Anyone else experiencing that?
> 
> I'm just torn as I'm an evga guy, so I feel I went out of my comfort zone with it. Should it OC a bit better than about 15/350? At 400 mem it didn't crash but it flashed the screen once or twice.


Speaking from my own experience: I've had my hands on one FE, two Strix and two Aorus. The FE, one of the Strix and one of the Aorus were good overclockers which all clocked very similarly, ~2075-2088 stable under air cooling w/o playing around with the curve, whereas the other Strix and Aorus cards were both duds. Hence, I highly doubt that changing the PCB has any notable impact on how much you can leverage the quality of the GP102 chip soldered onto the card. It's all about the silicon lottery this time.

The next logical step after you get relatively lucky in the silicon lottery is water cooling; e.g. here I summarized the improvement from an AIO mod that I applied to the 'good' Aorus of the two I got to play with.


----------



## TheBoom

What's funny is that the FE cards actually seem to be doing generally better than the aftermarket cards for the 1080 Ti.

The 10/12/16-phase power and power boost chips and whatnot aren't doing anything to change the silicon lottery itself.

With older cards the custom boards somewhat managed to provide a compromise for the unlucky ones, but that seems to not work anymore.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *TheBoom*
> 
> Whats funny is it seems the FE cards are actually doing generally better than the aftermarket cards for the 1080ti.
> 
> The 10/12/16 phase power and power boost chips and whatnot aren't doing anything to change the silicon lottery itself.
> 
> With older cards the custom boards somewhat managed to provide a compromise for the unlucky ones, but that seems to not work anymore.


Really? I think we just have a LOT more FE owners here in this thread judging from the owners club list.

But if more people filled in the data, we would have a better sample. Right now the owners list is almost 75% FE owners, and some aren't OCing at all.


----------



## derpa

Quote:


> Originally Posted by *viz0id*
> 
> Sorry I was away to continue replying you but i see BrainSplatter has given you very good answers.
> 
> The reason you start at 1.000v is because it is important to find your cards sweet spot. For me, running 1.043v gives me the same clockspeeds as running it at higher volts. The only thing higher volts gives me is more power draw, which again gives me more Perfcaps. So for me the sweet spot was 1.043v.
> 
> I'm not an expert or anything, I'm just reporting what worked for me.


Quote:


> Originally Posted by *TheBoom*
> 
> So I'm guessing I will get better results using the curve instead?


Quote:


> Originally Posted by *BrainSplatter*
> 
> Well, the curve gives definitely the finest control, especially when trying to limit/optimize the power draw. Whether it's worth fiddling around with it to get a few percent extra performance is another thing
> 
> The one thing only the voltage curve do, is to 'undervolt' the GPU. I have one curve which limits the OC to 1850Mhz @ 0.9v which I use to limit power draw / noise of the SLI configuration since I haven't put my FE's under water yet.


Quote:


> Originally Posted by *viz0id*
> 
> Yeah that is the strange thing with this card for me. Giving it more voltage just gave me more trouble, giving it less voltage gave me better clocks.
> 
> Usually (at least from what I understand) you undervoltage for more effeciency, less power draw and quieter fanprofile. But with my card, giving it lower voltage on the higher clocks, let it sit there more stable and didn't bounce around. (EDIT: Because of the perfcaps on power draw).


So just to clarify; in PxOC, if I use the curve editor and hit apply, then cycle back to the "dashboard", the curve settings will supersede any settings there (voltage limit, sliders, etc.)?


----------



## TheBoom

Quote:


> Originally Posted by *SlimJ87D*
> 
> Really? I think we just have a LOT more FE owners here in this thread judging from the owners club list.
> 
> But if more people filled in the data, we would have a better data sample. Right now in the owners list there's almost 75% FE owners, some aren't OC at all.


True. But I've been reading the posts and most of the high overclocks are coming from FE owners for some reason. The aftermarket owners are mostly complaining about low overclocks, like me.

The other half with good OCs are probably the shunt modders.


----------



## StreaMRoLLeR

Can someone plz teach me to use the MSI Afterburner curve? What I want is simple: the LOWEST voltage @ 2050 24/7, nothing more. I can hit 2063 and 2088 in benchmarks.


----------



## Benny89

Quote:


> Originally Posted by *Streamroller*
> 
> Can someone plz teach me to use the MSI Afterburner curve? What I want is simple: the LOWEST voltage @ 2050 24/7, nothing more. I can hit 2063 and 2088 in benchmarks


Try 1.050 or 1.062V; those are the most common voltages that hold 2050.

But it depends on the game. In some games like BF1 you will be able to hold it no problem, but Witcher 3, for example, will push your clocks down to lower bins no matter what you do. There is also thermal throttling.
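On the "lower bins" point: the clocks everyone keeps quoting (2012, 2025, 2038, 2050...) aren't arbitrary, because Pascal's boost table moves in steps of roughly 13 MHz and throttling sheds one bin at a time. A trivial sketch, assuming an exact 13 MHz step:

```python
# Pascal boost bins are ~13 MHz apart (treated as exactly 13 here for
# simplicity); power or thermal throttling drops one bin at a time.
BIN_MHZ = 13

def drop_bins(clock_mhz, bins):
    """Clock after the boost algorithm sheds `bins` steps."""
    return clock_mhz - bins * BIN_MHZ

# Starting from 2050 and shedding bins one at a time:
print([drop_bins(2050, n) for n in range(4)])  # [2050, 2037, 2024, 2011]
```

So "holding 2050" in a heavy game really means keeping the card from shedding even one bin under sustained load.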


----------



## StreaMRoLLeR

I've got a Strix 1080 Ti. With zero extra voltage I can hit 2038 (Witcher 3) 24/7, and with +50% I get 2050. I don't know how to do the curve graph; please explain with a screenshot :wubsmiley


----------



## Benny89

Quote:


> Originally Posted by *Streamroller*
> 
> i ve strix 1080 ti with zero voltage i can hit 2038(witcher3) 24/7 and with + 50 % i goes 2050. I dont know how to do curve graph plz explain in screenshot:wubsmiley


2038 with how much overclock on the core? At what resolution? By +50% do you mean the voltage curve?

Here is the guide: http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps


----------



## StreaMRoLLeR

+110 on the core, no curve, because like I said I don't know how to use the curve. Just sliders: PT 120 and +110 core, at 3440x1440.


----------



## KraxKill

Quote:


> Originally Posted by *Streamroller*
> 
> i ve strix 1080 ti with zero voltage i can hit 2038(witcher3) 24/7 and with + 50 % i goes 2050. I dont know how to do curve graph plz explain in screenshot:wubsmiley


You're seriously going to wait for somebody to show you how? Here let me help you. This is much faster!

http://bit.ly/2pDFlXV


----------



## StreaMRoLLeR

so funny


----------



## KraxKill

Just poking fun, but seriously the first result in that search should get you what you want.


----------



## Hulio225

Quote:


> Originally Posted by *TheBoom*
> 
> True. But I've been reading the posts and most of the high overclocks are coming from FE owners for some reason. The aftermarket owners are mostly complaining about low overclocks like me
> 
> 
> 
> 
> 
> 
> 
> .
> 
> The other half with good ocs are probably the shunt modders, undoubtedly.


I have a 1080 Ti FE under water and shunt mod on it too.

With 1.093V i can throw anything at it with Core @ 2100MHz and Memory 6304MHz (+800)

Edit: Not entirely true, i have to go one bin down for Superposition 8K Optimized, have to check if its the ram or the core overclock, since i tested just one time after crashing the first time, and i lowered both for that single run


----------



## KraxKill

Quote:


> Originally Posted by *Hulio225*
> 
> I have a 1080 Ti FE under water and shunt mod on it too.
> 
> With 1.093V i can throw anything at it with Core @ 2100MHz and Memory 6304MHz (+800)
> 
> Edit: Not entirely true, i have to go one bin down for Superposition 8K Optimized, have to check if its the ram or the core overclock, since i tested just one time after crashing the first time, and i lowered both for that single run


+800 is pretty heavy on the memory. Are your bench scores actually higher than at, say, +550? Many in the thread and in reviews report negative scaling beyond about +600.

Do you have a GPU-Z screen grab of you running 2100/6300? That's some of the fastest GDDR5X we've seen.
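For anyone puzzling over how a +800 offset becomes "6304 MHz", the arithmetic is just base plus offset, doubled for the effective transfer rate. A minimal sketch, where the ~5505 MHz base is an assumption for a reference 1080 Ti (the exact stock value differs by a few MHz per card, which is why +800 reads as 6304 on some cards and 6305 on others):

```python
# Hedged sketch: how Afterburner memory offsets map to the clocks people
# quote in this thread. BASE_MHZ is an assumption (~reference 1080 Ti);
# the exact stock figure varies slightly per card.
BASE_MHZ = 5505   # Afterburner/GPU-Z "memory clock" at stock (a DDR figure)

def mem_clocks(offset_mhz):
    """Return (reported_mhz, effective_mt_s) for a given AB offset."""
    reported = BASE_MHZ + offset_mhz   # what the monitoring graphs show
    effective = reported * 2           # GDDR5X moves 2 transfers per reported "MHz"
    return reported, effective

for offset in (0, 550, 800):
    reported, effective = mem_clocks(offset)
    print(f"+{offset}: {reported} MHz reported, {effective} MT/s effective")
```

Note the reported number is itself already a double-data-rate figure; GPU-Z's "actual" memory clock at stock is around a quarter of it, which is why the same card can be described as 1376 MHz, ~5505 MHz, or 11 Gbps depending on the tool.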


----------



## TheBoom

Quote:


> Originally Posted by *Hulio225*
> 
> I have a 1080 Ti FE under water and shunt mod on it too.
> 
> With 1.093V i can throw anything at it with Core @ 2100MHz and Memory 6304MHz (+800)
> 
> Edit: Not entirely true, i have to go one bin down for Superposition 8K Optimized, have to check if its the ram or the core overclock, since i tested just one time after crashing the first time, and i lowered both for that single run


Yeah, that's some crazy memory OC.

Anyway, for those who know how to curve OC: say I have 2012 at 1.000 V stable. When I start increasing clocks for the next voltage points, I can't seem to hold those clocks in Supo. After 1-2 scenes the card downclocks back to the 1.000 V bin. What am I doing wrong?


----------



## StreaMRoLLeR

Okay guys, please be honest and tell me: is this good silicon lottery or not?

1.0620 V, 60-61 C
2063 MHz core
Fan 1970 RPM (54%), proof here


----------



## KraxKill

Quote:


> Originally Posted by *TheBoom*
> 
> Yeah thats some crazy memory oc.
> 
> Anyway for those who know how to curve oc, say I have 2012 at 1.00v stable. Now when I start to increase clocks for the next voltages, I can't seem to hold those clocks in supo. The card after 1-2 scenes downclocks back to the 1.00v bin. What am I doing wrong?


You're likely between thermal throttle bins on boost 3.0. They start at about 28C and go up from there.
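The throttle behaviour described above can be pictured with a toy model: Pascal's Boost 3.0 moves the clock in roughly 13 MHz steps ("bins"), and the firmware sheds bins as the core crosses successive temperature thresholds. The first threshold (28 C) and the spacing between thresholds (10 C) below are illustrative assumptions only; the real tables live in the vBIOS and vary per card:

```python
# Toy model of Boost 3.0 thermal binning. Assumptions (not from NVIDIA
# docs): ~13 MHz clock steps, first threshold near 28 C, one bin shed
# per assumed 10 C after that.
BIN_MHZ = 13
FIRST_THRESHOLD_C = 28
STEP_C = 10  # assumed spacing between thermal thresholds

def thermal_bins_dropped(temp_c):
    """Number of ~13 MHz bins shed at a given core temperature."""
    if temp_c < FIRST_THRESHOLD_C:
        return 0
    return (temp_c - FIRST_THRESHOLD_C) // STEP_C + 1

def boost_clock(cold_clock_mhz, temp_c):
    """Estimated sustained clock for a card that boosts to cold_clock_mhz when cool."""
    return cold_clock_mhz - BIN_MHZ * thermal_bins_dropped(temp_c)

print(boost_clock(2063, 25))   # water-cooled territory: holds the top bin
print(boost_clock(2063, 61))   # typical air temps: a few bins lower
```

This is also why the same card that sheds bins on air can hold its top bin under water: the voltage requested doesn't change, only the temperature-driven bin drops do.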


----------



## KraxKill

Quote:


> Originally Posted by *Streamroller*
> 
> Okay please guys be honest and tell this good silicon lottery or not __
> 
> 1.0620V 60-61C
> 2063MHZ core
> Fan 1970 RPM %54 here proof


That's on the better end for sure. If you can bring the core temps down, I bet you can run 2075 or even 2100 at that voltage.


----------



## Hulio225

Quote:


> Originally Posted by *KraxKill*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> +800 is pretty heavy on the memory. Are your bench scores actually higher than say +550? Many in the thread and reviews report negative scaling beyond about +600
> 
> Do you have a GPUZ screen grab of you running 2100/6300? That's some of the fastest GDDR5 we've seen.




Or do you mean sensor monitoring of those clocks, or something else?

Edit: Ah, you asked whether the scores are higher compared to lower clocks. I guess so, but obviously just fractions of an fps.


----------



## TheBoom

Quote:


> Originally Posted by *KraxKill*
> 
> You're likely between thermal throttle bins on boost 3.0. They start at about 28C and go up from there.


If only that information were visible to us.

So now I should skip those bins and go even higher on the voltage and clocks, right?


----------



## KraxKill

Quote:


> Originally Posted by *Hulio225*


That looks really good, but you'd really need to show a GPU-Z graph over time rather than the other screens.


----------



## dunbheagan

Quote:


> Originally Posted by *TheBoom*
> 
> Whats funny is it seems the FE cards are actually doing generally better than the aftermarket cards for the 1080ti.
> 
> The 10/12/16 phase power and power boost chips and whatnot aren't doing anything to change the silicon lottery itself.
> 
> With older cards the custom boards somewhat managed to provide a compromise for the unlucky ones, but that seems to not work anymore.


I have the same impression, especially on the memory. I haven't heard of a single FE that couldn't handle 6000 MHz+ on the mem, but I'm reading about a lot of custom cards that don't reach 6000.

A shunt-modded, water-cooled FE seems to get everything out of the GP102 there is to gain. Custom cards can't do more, and I don't think the Kingpin or HOF will either.


----------



## KraxKill

Quote:


> Originally Posted by *TheBoom*
> 
> If only that information was visible to us.
> 
> So now I should skip those bins and go even higher on the voltage and clocks right?




http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/15

No real way to get around them; they are hardcoded into the firmware. Well, there is one way: better cooling.


----------



## StreaMRoLLeR

Is this still a safe voltage for Pascal? I don't want my card to die within a year of use. Is 1.06 V still high?


----------



## KraxKill

Quote:


> Originally Posted by *Streamroller*
> 
> Is this STILL SAFE voltages to pascal because i dont want my card to die in 1 year of usage 1.06V still high ?


No way to know until about a year from now, but many here will be running well beyond that and at higher clocks.


----------



## Hulio225

Quote:


> Originally Posted by *KraxKill*
> 
> That looks really good, but you'd really need to show a GPUZ graph over time vs the other screens.


What would you like to see? I mean, highest reading everywhere, average reading, or what exactly? I can set it up and re-run, no problem.


----------



## TheBoom

Quote:


> Originally Posted by *KraxKill*
> 
> 
> 
> http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/15
> 
> No real way to get around them. They are hardcoded into the firmware. Well there is....it's better cooling.


OK, but what I don't get is why I can be stable at 2025-2037 with a manual OC, but with the curve I throttle at anything past 2012 @ 1.000 V?
Quote:


> Originally Posted by *dunbheagan*
> 
> I have the same impression, especially on the memory. I havent heard of a single FE, which could not handly 6000MHz+ mem. But i am reading of a lot of custom cards which do not reach 6000 on the mem.
> 
> A shunt modded watercooled FE seems to get everything out of the GP102 that is to gain. Custom cards cant do more, and i think the Kingpin or HOF wont either.


Yeah, it seems it's all down to the chips themselves, and for some reason the FEs got the better of the lot.


----------



## StreaMRoLLeR

How many of the users reporting here are using water cooling?


----------



## KraxKill

Quote:


> Originally Posted by *Hulio225*
> 
> What would you like to see, i mean highest reading everywhere or average reading or what exactly, i can set it up and re run, no problem


You want to see how long your card spends at its top bin during the bench run. That's why most people post the GPU-Z graph: to illustrate the stability of the clock.

Not suggesting your card is one way or the other, just that it's a much more complete picture when you see the GPU-Z graph.


----------



## Hulio225

Quote:


> Originally Posted by *KraxKill*
> 
> You want to see how long your card spends at your top bin during the bench run. Thats why most people post the GPUZ graph to illustrate the stability of the clock.


I was just wondering what you guys meant, because someone was talking about memory clocks and I got confused... at 2100 it basically stays at that bin all the time ;-)


----------



## dunbheagan

Quote:


> Originally Posted by *Hulio225*
> 
> What would you like to see, i mean highest reading everywhere or average reading or what exactly, i can set it up and re run, no problem


Just make a screenshot of the Afterburner monitor graph for a full run of Supo 4K or FS Ultra, reading clocks, TDP and limits.

+800 on the mem is pretty high. My FE does +700 without any problems. +750 still gives me a full FS Ultra run, but I start to see artifacts. +700 is still faster than +600 or +500 on my card, but of course the scaling is poor. I can only see the difference in benchmarks, not in in-game fps. That's why I'm keeping it at 6000.


----------



## dunbheagan

Quote:


> Originally Posted by *Streamroller*
> 
> Many users you reporting are using water cooling ?


At least more than average. Maybe 50/50?


----------



## foolycooly

I can post a data point for an SC2 on air. I did some Heaven loops last night with only the EVGA XOC sliders at max (power/temp). I was only able to achieve a steady core OC of +80 and memory +300 (haven't tried any higher on the memory clock). I started with core only, and Heaven crashed at +90 and +85. Temperatures with my custom curve peaked around 75 C at 90% fan. GPU-Z and XOC reported a sustained core of around 1977 (Heaven reported around 2055, but I'm not sure that's accurate).

I ended up with a score of about 3600 at 1440p full screen at Extreme/Max with 0xAA. I'm running a 4770K @ 4.5 GHz.

Limited gaming showed the core settling around 1955 at 68 C (Dark Souls 3) and 72 C (PU Battlegrounds and FFXIV).

I'm happy with these results and probably won't be touching voltages or doing any further tweaks.


----------



## nrpeyton

Well it's Thursday today; and look what just arrived. _(Just as promised)_











back of box damaged in post? _(hope its okay inside)_


----------



## KraxKill

Quote:


> Originally Posted by *nrpeyton*
> 
> Well it's Thursday today; and look what just arrived. _(Just as promised)_
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> back of box damaged in post? _(hope its okay inside)_


Nice! Good luck figuring out a voltage bypass. Seems most cards will need more than 1.093 V to break past 2150 or so, with appropriate cooling of course.

Which I know you have.


----------



## nrpeyton

Quote:


> Originally Posted by *KraxKill*
> 
> Nice! Good luck figuring out a voltage bypass. Seems most cards will need more than 1.093 to break past 2150 or so. With appropriate cooling ofcorse
> 
> 
> 
> 
> 
> 
> 
> Which I know you have.


Lol, when I opened my 1080 Classy (5 months ago) I asked myself that exact question.

It even took me two or three months to finally crack it, but in the end I did figure it out (as the videos show), haha. It's just a matter of time.

Where I'm at now is this:

- Keep it sealed in the box and sell it at full price (or near full price) to pay for a KPE when it's launched. _(How long will I have to wait? No idea.)_
- Sell it 'used' for a KPE when the KPE launches.
- Open it now and never look back (don't bother with the KPE).

_Can't make my mind up. But it's sitting here looking at me; I doubt I can resist opening it lol._ Not sure how long I can hold out...


----------



## KraxKill

Quote:


> Originally Posted by *nrpeyton*
> 
> lol when i opened my 1080 Classy (5 months ago) I asked myself that exact question.
> 
> And it even took me two/three months to finally crack it, but in the end I did indeed figure that out (as the videos show) haha.. it just a matter of time lol
> 
> Where I'm at now is this:
> 
> Keep it sealed in the box and sell it at full price (or near full price) to pay for a KPE when its launched. _(how long will i have to wait)??? no idea_
> Sell it 'used' for a KPE when KPE launches.
> Open this now and never look back (not bother with KPE).
> _Can't make my mind up. But its sitting here looking at me; doubt I can resist opening it lol._ Not sure how long I can hold out.........


Good luck keeping it "sealed" in a box! I bet you're ripping it open right now.


----------



## nrpeyton

Quote:


> Originally Posted by *KraxKill*
> 
> Good luck keeping it "sealed" in a box! I bet you're ripping it open right now.


Not yet lol. FFS, I don't know what to do... but this is ridiculous, because I know what's going to happen in the end anyway.

How much do I lose as soon as I break that plastic seal? 100? 150? 200?


----------



## dunbheagan

Quote:


> Originally Posted by *nrpeyton*
> 
> not yet lol. ffs don't know what to do.. but this is ridiculous because i know in the end whats going to happen anyway....
> 
> How much do I lose as soon as i break that plastic seal? 100? 150? 200?


Do it.









You did not buy it to keep it sealed...


----------



## nrpeyton

Quote:


> Originally Posted by *dunbheagan*
> 
> Do it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You did not buy it to keep it sealed...


Lol i never read the KPE announcement until after it was already in the post.


----------



## Hulio225

Quote:


> Originally Posted by *dunbheagan*
> 
> Just make a screen of the the afterburner monitor graph for a full run of Supo4k or FSUltra, reading clocks and tdp and limits.
> 
> +800 on the mem is pretty high. My FE does +700 without any problems. +750 still gives me a full run FSUltra, but i get first artifacts. +700 is still faster than +600 or + 500 on my card, but of course the scaling is poor. I can only see the difference in benchmarks, not in game fps. Thats why i am keeping it at 6000.


Not my best run, but for those who wanted to see the graphs


voltage at 1.075V


----------



## dunbheagan

Quote:


> Originally Posted by *Hulio225*
> 
> Not my best run, but for those who wanted to see the graphs
> voltage at 1.075V


Thanks for sharing. Looks very good! At what clock do you have your 7700K? With my [email protected],7Ghz I get min frames of 60.x in Supo 4K.

Have you already tested whether your settings are totally stable? Like doing a Fire Strike Ultra loop for a couple of hours, or a Superposition stress test? I had my FE at 2100 at first too, and it was stable for a run or two or three. But when I did some more intense testing, I found out it crashes after 30 or 60 minutes of looped Metro or FS Ultra. Currently I am back at 2050/6000 on the stock voltage curve. No power limiting at all thanks to the shunt mod and water cooling, and I'm totally fine with that. I'm starting to lose interest in squeezing out the last possible MHz and trying to find a rock-stable 24/7 setting. Right now 2050/6000 @ stock voltage (mostly 1,05 V) and [email protected],[email protected],325V seems to be it for me. I've already had this setting on benchmark loops for some time. I'll test it two or three more nights on Supo 4K, FS Ultra and Metro LL Redux loops. If it's absolutely stable, maybe I keep it and stop testing...


----------



## Hulio225

Quote:


> Originally Posted by *dunbheagan*
> 
> Thanks for sharing. Looks very good! At what clock do you have your 7700k? With my [email protected],7Ghz i get min frames of 60,x in supo 4k.
> 
> Have you already tested if your settings are totally stable? Like doing a FireStrike Ultra Loop for a couple of hours, or a Superposition Stress test? I had my FE at 2100 first too and it was stable for a run or two or three. But when i did some more intense testing, i found out it crashes after 30 or 60 minutes looped Metro or FSUltra. Actually i am back at 2050/6000, stock voltage curve. No powerlimiting at all thanks to shunt mod and watercooling, and i'm totally fine with that. I start loosing interest to squeeze out the last possible MHz and try to find a rock stable 24/7 setting. Actually 2050/[email protected] voltage(mostly 1,05V) and [email protected],[email protected],325V seems to be it for me. I already had this setting on benchmark loops for some time. Will test it two or three nights more on Supo 4k, FSUltra and Metro LL Redux loops. If its absoluty stable, maybe i keep it and stop testing...


5.2 GHz on the CPU... yeah, this GPU OC and CPU OC are my bench settings. For everyday use my CPU is at 4.8 GHz and the GPU at 2063/6000 with my fans barely spinning, so an inaudible system. Watercooling FTW.

Edit: I haven't tested how long it's stable like that, but tbh I don't care; those borderline overclocks are for competitions and stuff. Like I said, for normal use my OC is a lot more conservative.


----------



## dunbheagan

Quote:


> Originally Posted by *Hulio225*
> 
> 5.2GHz on the CPU... yeah this GPU OC and CPU OC are my bench settings... for every day use my cpu is at 4.8GHz and gpu 2063 / 6000 while my fans barly spinning... so a not audible system. watercooling 4tw
> 
> Edit: haven't tested how long it is stable like that, but tbh i don't care those borderline overclocks are for competitions and stuff, like i said for normal use my oc is a lot more conservative


I see. Same for me. Two days ago I pushed my 4790K to [email protected],525V for a Fire Strike run, which it passed. But I would not use this setting regularly.


----------



## trawetSluaP

Quote:


> Originally Posted by *dunbheagan*
> 
> I see. Same for me. Two days ago i pushed my 4790k to [email protected],525V for a Fire Strike run, which it did. But i would not use this setting regulary


If I put that much voltage through my CPU I'd be able to use it as a barbecue even while it's being water-cooled!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *dunbheagan*
> 
> Thanks for sharing. Looks very good! At what clock do you have your 7700k? With my [email protected],7Ghz i get min frames of 60,x in supo 4k.
> 
> Have you already tested if your settings are totally stable? Like doing a FireStrike Ultra Loop for a couple of hours, or a Superposition Stress test? I had my FE at 2100 first too and it was stable for a run or two or three. But when i did some more intense testing, i found out it crashes after 30 or 60 minutes looped Metro or FSUltra. Actually i am back at 2050/6000, stock voltage curve. No powerlimiting at all thanks to shunt mod and watercooling, and i'm totally fine with that. I start loosing interest to squeeze out the last possible MHz and try to find a rock stable 24/7 setting. Actually 2050/[email protected] voltage(mostly 1,05V) and [email protected],[email protected],325V seems to be it for me. I already had this setting on benchmark loops for some time. Will test it two or three nights more on Supo 4k, FSUltra and Metro LL Redux loops. If its absoluty stable, maybe i keep it and stop testing...



Yeah, the 4790K is going to hold back those minimums and averages. Oh well, it's only 1 to 3 fps.


----------



## DabboPiff

I tried messing around with the curve and kept getting crashes. I'll just have to stick with my +100 core / +500 memory and be satisfied. I wasn't using any benchmarks, just a Steam game. It's not the best-optimized game, but it doesn't crash at my original settings, so I'll just assume I'm maxed out. That's fine; I'm definitely happy with over 2 GHz. Can't complain.


----------



## derpa

Okay, I'm hoping someone can help me out with something simple here. I switched from EVGA PxOC to MSI AB. PxOC was causing a lot of tearing in videos (YT, Netflix, etc.) as well as a lot of microstuttering in my benching. Weird, I know.

Anyway, since I switched to AB things have been good, BUT my card does NOT want to go above 1.000 V for some reason. I don't know if I have something set wrong or if there's something else I have to check. In PxOC it would go to 1.093 V as needed and settle at a less dramatic voltage, usually above 1.000. Now it barely, if ever, goes above 1.000 V.

Here are all the AB settings and graphs if it helps. THANK YOU!!!!!























_(I tried to catch the graphs right as I got out of Andromeda, but by the time I got everything situated, it was almost gone)_


----------



## Eco28

I remember when this thread had barely 100 posts. It grows so fast lol. I hope it's not too late to join the club.

After initial testing, it goes straight under the water.

Just waiting for EK to ship me the backplate (nickel version), which I ordered over a week ago. Funnily enough, as of today you can't find it on their store; it's been removed.


----------



## velocityx

Just got the same EVGA FE Ti today ;]

Nickel not available? That can't be right, I want nickel on my card...


----------



## Slackaveli

Quote:


> Originally Posted by *derpa*
> 
> Okay, I'm hoping someone can help me out simply here. I switched from using eVGA PxOC to MSI AB. PxOC was causing a lot of tearing in videos (YT, Netflix, etc) as well as a lot of microstuttering in my benching. Weird, I know.
> 
> Anyway, since I switched to AB, things have been good, BUT, my card does NOT want to go above 1.000V for some reason. I dunno if I have something set wrong, or if there's something else I have to check. In PxOC, it would go to 1.093V as needed, and settle at a less dramatic voltage, usually above 1.000. Now, it barely, if ever, goes above 1.000V.
> 
> Here are all the AB settings and graphs if it helps. THANK YOU!!!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> _(I tried to catch the graphs right as I got out of Andromeda, but by the time I got everything situated, it was almost gone)_


Did you completely uninstall Precision X? And did you DDU your drivers and reinstall? If not, I'd uninstall Afterburner, DDU the drivers, install the drivers with a clean install, then install Afterburner. You'll be good then, I bet.


----------



## derpa

Quote:


> Originally Posted by *Slackaveli*
> 
> did you complete uninstall Precision X? and did you ddu your drivers and re-install? If not, id uninstall afterburner, ddud drivers, install drivers with clean install, then install afterburner. You'll be good then i bet.


Nope and nope. Ha ha









I'll try that right now, and do a fresh install of AB as well after all the driver malarky is finished. Thanks!


----------



## EvilPieMoo

Currently the most I can get out of 1 of my Founders Ti's


----------



## TheNoseKnows

Quote:


> Originally Posted by *fisher6*
> 
> Yup, I will be selling my EVGA FE card and getting the HOF with a waterblock once it's in stock here. Might not even overclock more but I love the card itself.


You like the HOF? You can pre-order here:
https://www.overclockers.co.uk/kfa2-geforce-gtx-1080ti-hof-8-pack-edition-11264mb-gddr5x-pci-express-graphics-card-gx-09k-kf.html


----------



## mtbiker033

I pre-ordered a hybrid kit from B&H; they're receiving their stock on Monday, so I'm hoping to have my kit sometime next week. I've got my tools (4mm hex and Phillips bits), got some Grizzly Kryonaut, and am ready. My EVGA FE seems to be a good clocker (+175 core, +500 mem) with decent temps (aggressive fan curve) even on air, so we shall see how this goes. More than anything I'm looking for it to be quiet!


----------



## Slackaveli

Quote:


> Originally Posted by *viz0id*
> 
> Are there any benchmarks that show the gain in fps / benchmark scores over going from DDR3 to DDR4 (with higher clockspeeds) on intel platform? I know the new Ryzen CPU's gain considerable amounts from higher clocked ram. I have a Z97 board with 4790k and DDR 1833mhz ram. Thinking about changing it out, but maybe it is better to wait for the next bigger jump in CPU architecture? Haha like i can hold out until 2019 and Icelake.


A 5775C with 2400 MHz C10 RAM and you're golden. Just drop it in, too.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Really? I think we just have a LOT more FE owners here in this thread judging from the owners club list.
> 
> But if more people filled in the data, we would have a better data sample. Right now in the owners list there's almost 75% FE owners, some aren't OC at all.


That doesn't account for the guys who only post in their respective threads, i.e. 'Asus Ti owners club', etc. They all have them.
Quote:


> Originally Posted by *derpa*
> 
> Nope and nope. Ha ha
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll try that right now, and do a fresh install of AB as well after all the driver malarky is finished. Thanks!


No problem. We all learn the hard way.


----------



## derpa

Quote:


> Originally Posted by *Slackaveli*
> 
> that doesnt account for the guys who only post in their respective threads, ie 'arus ti owners club', etc. they all have them.
> no problem. we all learn the hard way.


Whelp.....

Order of Ops:
- Uninstalled eVGA PxOC
- Uninstalled MSI AB 4.4.0 Beta 6
- DDU drivers completely and restart
- CCleaner all files and registry (_just in case_)
- Installed fresh drivers from nVidia website
- NV Control Panel - change settings to *'Prefer Maximum Performance'*
- Installed MSI AB 4.4.0 Beta 6 and setup
- Run Unigine Valley - same artificial V cap
- Follow instructions on Pg 1 to add vendor to .oem2 file (_didn't exist when I opened it_)
- Change to *'Third Party'* voltage control in AB settings

The card hung around 1.000 V the whole time. Intermittently it would jump to 1.012 V, or for a split second to 1.025 V, but then immediately back down to 1.000 V or even 0.993.


Now, I'm not concerned about the performance I'm getting AT ALL. Frames are good while gaming, which is what matters. I'm more concerned that there may be something I have set up incorrectly, or something up with the voltage controller. Temps are good for all runs, and I don't hit the temp limit at all; just the power and voltage limits.

Should I just chalk this up to the silicon lottery and move on? I just can't figure out why it would volt higher in PxOC than in AB.









Thanks again!


----------



## havoc315

Been waiting since the 5th and finally got my waterblocks from EKWB, so I can finally finish my build and install them on my 1080 Tis. Quick question though: is there a certain RTV sealant that has to be used on circuits, or can you use the clear general-purpose RTV silicone adhesive sealant by Permatex?
Here's what came in the mail today though


----------



## derpa

Quote:


> Originally Posted by *havoc315*
> 
> Been waiting since the 5th and finally got my waterblocks from ekwb so I can finally finish my build and install them on my 1080ti's....quick question though is there a certain RTV sealant that you have to use on circuits or can you use the clear RTV silicone adhesive sealant the general purpose one by permatex.
> Here's what came in the mail today though


Very nice


----------



## s1rrah

Quote:


> Originally Posted by *mtbiker033*
> 
> I pre-ordered a hybrid kit from B&H, they are receiving their stock on Monday, hoping to have my kit sometime next week. I got my tools (4mm hex and phillips bits), got some grizzly kryonaut and am ready. My evga FE seems to be a good clocker +175c +500 mem with decent temps (aggressive fan curve) even on air so we shall see how this goes. More than anything I'm looking for it to be quiet!


Repaste when you install the cooler ... you're going to love it, man. My MSI Seahawk X 1080 Ti never breaks 43 C in *anything* (22 C ambient) ...


----------



## mtbiker033

Quote:


> Originally Posted by *s1rrah*
> 
> Repaste when you install the cooler ... your going to love it man. My MSI Seahawk X 1080 ti never breaks 43C in *anything* (22C ambient) ...


definitely will re-paste, I have grizzly kryonaut on hand!!







thanks for the reply!


----------



## Nico67

Quote:


> Originally Posted by *derpa*
> 
> Whelp.....
> 
> Order of Ops:
> - Uninstalled eVGA PxOC
> - Uninstalled MSI AB 4.4.0 Beta 6
> - DDU drivers completely and restart
> - CCleaner all files and registry (_just in case_)
> - Installed fresh drivers from nVidia website
> - NV Control Panel - change settings to *'Prefer Maximum Performance'*
> - Installed MSI AB 4.4.0 Beta 6 and setup
> - Run Unigine Valley - same artificial V cap
> - Follow instructions on Pg 1 to add vendor to .oem2 file (_didn't exist when I opened it_)
> - Change to *'Third Party'* voltage control in AB settings
> 
> Card hung around 1.000V the whole time. Intermittently, it would up to 1.012V or for a split second to 1.025V, but then immediately back down to 1.000V or even 0.993.
> 
> 
> Now, I'm not concerned about the performance I'm getting AT ALL. Frames are good while gaming, which is what matters. I'm more concerned there may be something I have setup incorrectly, or with the voltage controller. The Temps all good for all runs, and I don't hit Temp Limit at all; just Power and Voltage limits.
> 
> Should I just chalk this up to Silicon Lottery and move on? Just can't figure why it would volt higher in PxOC than AB.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks again!


It's just doing what you've set the curve to let it do. Ctrl+D resets the curve to default.

Quote:


> Originally Posted by *TheBoom*
> 
> Yeah thats some crazy memory oc.
> 
> Anyway for those who know how to curve oc, say I have 2012 at 1.00v stable. Now when I start to increase clocks for the next voltages, I can't seem to hold those clocks in supo. The card after 1-2 scenes downclocks back to the 1.00v bin. What am I doing wrong?


Similarly, that is just doing what it's supposed to do. Use GPU-Z to see if it's power limiting that's causing it to clock down.


----------



## Nico67

Quote:


> Originally Posted by *TheBoom*
> 
> Whats funny is it seems the FE cards are actually doing generally better than the aftermarket cards for the 1080ti.
> 
> The 10/12/16 phase power and power boost chips and whatnot aren't doing anything to change the silicon lottery itself.
> 
> With older cards the custom boards somewhat managed to provide a compromise for the unlucky ones, but that seems to not work anymore.


Yes and No







FE cards are sufficient, and it is totally down to the silicon lottery. But aftermarket cards have a bit in reserve, like power limit and cooling, that will help some average cards do better than they would otherwise. Aftermarket cards are no guarantee of a better lottery, though, unless they have been binned like some of the more exotic options, although even then, better may not be best, if you know what I mean.


----------



## Benny89

Ech.. I was so close to finally beating that Witcher 3 challenge. I had a curve that could hold 2038 at 1.043V in Witcher 3 for 3 hours. Max temp 62 C. At 64 C it dropped to 2025 but returned and held. No Power Limit, no VOp, just VRel. I was so happy, and then after 3 hours it crashed again....









Does anyone have an idea whether this can be caused by the memory OC? Maybe Witcher 3 is really sensitive to memory OC? Although that would usually show artifacts.

The game just freezes; you can still hear that, for example, Geralt is moving or a dialogue is still going, but it drops to the desktop anyway...

But I was closer than before......

Damn, this game is a challenge...

BTW, it is so good to have 87-90% CPU utilization in Novigrad with 99% GPU utilization, seeing 96-106 FPS on Ultra with HairWorks on in Witcher 3


----------



## nrpeyton

OMG it looks so *small*.



My 1080 Classy (which I just sold) was massive (not _quite_ but *almost* 2x the surface area).

This thing is like a baby.


----------



## JohnnyFlash

Has anyone with an FE solved the high idle clocks with multiple monitors?


----------



## CoreyL4

ok, need some help. yesterday i overclocked the core and it was stable between 2050 and 2062 at 1.062-1.075v.

today i am overclocking memory too. got 6003 to work in Superposition, except i was power limited the entire time. i brought the core down to 2037 and upped it to 1.093v. i pass, but i'm still power limited with the clocks jumping all over the place. it doesn't do this when memory is not overclocked.

any way to remedy this issue?


----------



## GosuPl

I have a problem with voltages on SLI 2x 1080 Tis.

I can change the voltage in MSI AB and EVGA Precision. All is OK when I use one card, but in SLI I have problems, and problems, and more problems ;-) With OC and stability.

Card 1 can go up to 1.062v and card 2 stays at 1.000v. When I add +75mv, and even +100, the first card goes up to 1.09v and the second only to 1.025v :/

*** with this lovely **** ? Any fix?


----------



## nrpeyton

Quote:


> Originally Posted by *CoreyL4*
> 
> ok need some help. so yesterday i overclocked the core and it was stable between 2050 and 2062 at 1.062-1.075v.
> 
> today i am overclocking memory too. got 6003 to work in supo except i was pwr limited the entire time. i brought down core to 2037 and upped it to 1093v. i pass, but still pwr limited with the clocks jumping all over the place. it doesnt do this when memory is not overclocked.
> 
> anyway to remedy this issue?


Have you already put the power slider in MSI Afterburner (or your O/C'ing app of choice) to 120%?

You could also try lowering voltage to 1.0v (and test for highest stable O/C at that voltage).

Failing that, the only currently reliable option is to carry out a physical hardware shunt mod (but you need to do a lot of research first, and think seriously about any possible repercussions).

There are some other options if you don't want to shunt mod (involving cross-flashing BIOSes with higher max power limits) but that also carries risks. And the benefits are still under investigation on this forum. _(stay tuned for more updates on that front)._
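For context on why the shunt mod raises the ceiling: the card estimates its power draw from the voltage drop across tiny shunt resistors, so soldering an equal-value resistor in parallel halves the sensed resistance and the card under-reports what it is actually pulling. A rough sketch of the arithmetic (the 5 mOhm value is illustrative, not a measured spec for this board):

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def reported_power(actual_watts, r_stock, r_modded):
    """Power the controller *thinks* it sees after a shunt mod.
    The sense voltage scales with shunt resistance, so the reading
    drops by the ratio r_modded / r_stock."""
    return actual_watts * (r_modded / r_stock)

r_stock = 0.005                    # 5 mOhm shunt (illustrative)
r_mod = parallel(r_stock, 0.005)   # equal resistor stacked on top -> 2.5 mOhm
print(reported_power(300, r_stock, r_mod))  # 300 W actual reads as 150 W
```

This is also why a shunt mod is riskier than a BIOS flash: the card's own power protection is now working from a number that is half of reality.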


----------



## CoreyL4

Quote:


> Originally Posted by *nrpeyton*
> 
> Have you already put the power slider in MSI Afterburner (or your O/C'ing app of choice) to 120%?
> 
> You could also try lowering voltage to 1.0v (and test for highest stable O/C at that voltage).
> 
> Failing that; the only current reliable option is to carry out a physical hardware shut mod. (but you need to do a lot of research first). And think seriously about what those implications are.
> 
> There are some other options if you don't want to shut mod (involving cross-flashing BIOS's with higher power max power limits) but that also carries risks. And the benefits are still under investigation on this forum. _(stay tuned for more updates on that front)._


Yeah I have the power slider to 117% on my gaming x. I almost want to flash the bios to one that has 120+

my highest stable was 2012 core at 1.000v


----------



## nrpeyton

Quote:


> Originally Posted by *CoreyL4*
> 
> Yeah I have the power slider to 117% on my gaming x. I almost want to flash the bios to one that has 120+
> 
> my highest stable was 2012 core at 1.000v


Anything over 2.0 GHz and you're good to go (at least that's the general consensus here).

Bear in mind also that games seem to draw less power than the synthetic benchmarks.


----------



## Nico67

Quote:


> Originally Posted by *CoreyL4*
> 
> ok need some help. so yesterday i overclocked the core and it was stable between 2050 and 2062 at 1.062-1.075v.
> 
> today i am overclocking memory too. got 6003 to work in supo except i was pwr limited the entire time. i brought down core to 2037 and upped it to 1093v. i pass, but still pwr limited with the clocks jumping all over the place. it doesnt do this when memory is not overclocked.
> 
> anyway to remedy this issue?


If you overclock the mem it will draw more power; something around 3-4W extra is what I saw when pushing into the power limit. It's a balancing act to stay under the power limit, so there's nothing you can effectively do other than a shunt mod for guaranteed power. It will bounce clocks off the power limit; that's just what it does.
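The point above can be put as simple arithmetic: the slider sets an absolute wattage ceiling, and the memory OC's few extra watts eat into whatever headroom the core OC leaves. A back-of-envelope sketch (all wattages illustrative, not measured specs):

```python
# Rough budget check: does a memory OC push you over the board power limit?
TDP_W = 250             # assumed reference board power
slider_pct = 117        # max power target on this particular card
core_draw_w = 280       # observed draw with the core OC alone (illustrative)
mem_oc_extra_w = 4      # extra draw from the memory OC, per the post above

limit_w = TDP_W * slider_pct / 100        # absolute ceiling in watts
total_w = core_draw_w + mem_oc_extra_w    # combined draw
headroom = limit_w - total_w              # negative means clocks will bounce
print(f"limit {limit_w} W, draw {total_w} W, headroom {headroom:.1f} W")
```

With these numbers the card squeaks under the limit; a slightly hungrier core OC or a cold-booted benchmark run would tip it into Pwr throttling.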


----------



## nrpeyton

Well this is strange.


Installed new 1080TI FE in slot 1.
Performed a clean-install of drivers + geforce experience _(included)_ from Nvidia website
Updated MSI Afterburner to beta 4.4 _(including RivaTuner Statistics Server)_ and rebooted _(to enable the out-of-box voltage slider)_
Updated GPU-Z to latest version. Restarted
Fired up The Witcher 3 _(was previously working perfectly)_
Now within 2 seconds of entering the menu the process is automatically ended and I land back on the desktop with no Witcher 3 tab to click on. (As if I never even started the game).








Repeat last step 3x times... no luck.








Back soon with an update (after I figure out what's going on).


----------



## Dasboogieman

Quote:


> Originally Posted by *Nico67*
> 
> If you overclock the mem, it will draw more power, something around 3-4w extra is what I saw pushing into power limit. Its a balancing act to stay under the power limit, so nothing you can effective do other than shunt mod for guaranteed power. It will bounce clks on the power limit, that just what it does


I can get 4-5 extra FPS from +500 mem alone in FFXIV at 4K. I don't even get that from core overclocking, so sometimes, if you are already at the limit of the core, it might be better to lower the core by a bin and go ham on the VRAM.


----------



## RadActiveLobstr

Jayztwocents has an FTW3 (per his Twitter) so hopefully a review of that should be coming soon. Interested to see how the thermals hold up against the larger 2.5- and 3-slot cards.


----------



## KraxKill

Quote:


> Originally Posted by *Coopiklaani*
> 
> There're actually 2 TDP limit,
> Total GPU Power (normalized) [% of TDP] and Total GPU Power [% of TDP]. GPU-Z reads out Total GPU Power [% of TDP] whereas AB reads out Total GPU Power (normalized) [% of TDP] . HWiNFO64 reads both TDPs. Both TDPs limit the max power the card can draw. I can't find a way to increase the normalized TDP unless I flash a BIOS with a higher normalized TDP.
> 
> The shunt mod only affects Total GPU Power [% of TDP] but not the normalized TDP.
> 
> 
> 
> You can see my card (shunt modded) hits power limit because of the normalized TDP.


So this guy made a great post and it was kind of overlooked, myself included, until I ran smack into the "normalized TDP" wall.

I started playing with lower liquid temps and higher clocks at voltages above 1.05v. I'm shunt modded and still hit the same normalized TDP limit in Superposition @ 4K.

I'm currently running 2100 @ 1.05v with +550 mem to stay under this limit.

I confirmed the limit by lowering my mem OC to +0; I was then able to run 1.06v before hitting the normalized TDP wall.

Other benches don't seem to show this. +rep on this one.


----------



## Slackaveli

Quote:


> Originally Posted by *derpa*
> 
> Whelp.....
> 
> Order of Ops:
> - Uninstalled eVGA PxOC
> - Uninstalled MSI AB 4.4.0 Beta 6
> - DDU drivers completely and restart
> - CCleaner all files and registry (_just in case_)
> - Installed fresh drivers from nVidia website
> - NV Control Panel - change settings to *'Prefer Maximum Performance'*
> - Installed MSI AB 4.4.0 Beta 6 and setup
> - Run Unigine Valley - same artificial V cap
> - Follow instructions on Pg 1 to add vendor to .oem2 file (_didn't exist when I opened it_)
> - Change to *'Third Party'* voltage control in AB settings
> 
> Card hung around 1.000V the whole time. Intermittently, it would up to 1.012V or for a split second to 1.025V, but then immediately back down to 1.000V or even 0.993.
> 
> 
> Now, I'm not concerned about the performance I'm getting AT ALL. Frames are good while gaming, which is what matters. I'm more concerned there may be something I have setup incorrectly, or with the voltage controller. The Temps all good for all runs, and I don't hit Temp Limit at all; just Power and Voltage limits.
> 
> Should I just chalk this up to Silicon Lottery and move on? Just can't figure why it would volt higher in PxOC than AB.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks again!


man, that seems odd for sure. i will say that a bunch of folks are running theirs locked at 1.0v to avoid all power throttling and b/c they get great clocks/frames at that voltage. tough call whether to rma it, but then again, if you are happy with the results it could be fine. if it is dying, you could always rma when that happens or becomes obvious.

Edit: after looking at your curve... try on a fresh set of settings and run it just +50 or something and see if it goes up to at least 1.063v. looks like your curve may just be set to the 1.0v challenge settings.


----------



## s1rrah

So I finally figured out this whole curve business ...

I reinstalled AFB and followed the instructions in post #1 about copying the string from the "Info" button in AFB ... something I didn't do before ... I added this to the "MSIAfterburner.oem2" file (the string I copied, related to my Seahawk X, was not in the .oem2 file, by the way) ...

Oh yeah .. another thing I did this time when setting up AFB is to select the "Third Party" option under settings and next to the "Unlock Voltage Control" option ...

And now the curve seems to work much better and more consistently ...

For starters, I just went with the max of 1.093v and set it to 2050mhz at that point and things run fine through Witcher 3, Tom Clancy Wildlands, Metro:LL, ResEvil 7 and Titanfall2 .. I played for about two hours among all those games and only VERY rarely saw a power limit moment.

Things ran at 2050mhz for a bit but, especially with Witcher 3, would always drop down to the 2025mhz range ... or even the 2012mhz range ... but generally stayed at 2025mhz ... even when no power limiting was present. I did see lots of voltage limiting though ... at least I think that's what it was in GPU-Z ... certainly wasn't power limiting ...

I don't really mind these occasional fluctuations and not sure I can do anything about it ...

With the curve, I see much less Power Limiting than when not using the curve and just setting everything via sliders ...

Tired of messing with this thing.

Making me crazy a bit ... LOL


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KraxKill*
> 
> So this guy made a great post and it was kind of overlooked, myself including until I ran smack into the "normalized TDP" wall.
> 
> I started playing with lower liquid temps and higher clocks at voltages above 1.05v. I'm shunt modded and got the same normalized TDP limits in superposition @ 4k.
> 
> I'm currently running 2100 @ 1.05v with +550 mem to stay under this limit.
> 
> I confirmed the limit by lowering my mem OC to +0 and I was able to run 1.06 before hitting the normalized TDP wall.
> 
> Other benches seem to not show this. +rep on this one.


And how does this affect your score?


----------



## CoreyL4

what BIOSes have a 125+ power limit?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *CoreyL4*
> 
> what bios' have 125+ power limit?


Palit's BIOS gives you an extra 30 watts. That'd be equal to running the card at 132% on the stock BIOS. It's not going to report that, of course, but that's what users have measured.
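That claim is easy to sanity-check. Assuming a 250 W reference TDP and a 120% max slider on the stock BIOS (both assumptions, not confirmed specs), an extra 30 W on top of the stock maximum works out to 132% of reference:

```python
def effective_pct(base_tdp_w, stock_slider_pct, extra_w):
    """Express an absolute wattage bump as a percent of the reference TDP."""
    stock_max_w = base_tdp_w * stock_slider_pct / 100
    return (stock_max_w + extra_w) / base_tdp_w * 100

# 250 W reference TDP, 120% max slider, +30 W from the cross-flashed BIOS
print(effective_pct(250, 120, 30))  # ~132
```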


----------



## s1rrah

What is "locking the voltage" and how do I do it?


----------



## CoreyL4

Quote:


> Originally Posted by *SlimJ87D*
> 
> Palit bios gives you an extra 30 watts. That'd equal to running the card at 132% on stock bios. It's not going to report that of course but that's what has been measured by users.


What is your opinion on flashing to it for the extra power? If I did it I'd go to 125.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *CoreyL4*
> 
> What is your opinion on flashing to it for the extra power? If I did it Id go to 125


It didn't work well for me and other users but your mileage may vary.


----------



## RavageTheEarth

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Only had my block 3 weeks, and this is what it looks like. Flushed the system out when I got the new gpu. Am using EK coolant. what the heck.
> 
> Thats a nickel block too. so copper isnt supposed to be showing.


You definitely should post this in the EK club. I don't want my Aorus block to do this when they release it. This is why I'm pissed that EK completely ditched copper for nickel-plated, and on the monoblock they just announced only plexi + nickel is available because they "need to keep the RGB"... I'm not liking the direction they are going.

Quote:


> Originally Posted by *Luckael*
> 
> hello guys, is this normal for the GTX 1080 Ti FE, the debug mode is enable. I'm having a frame stuttering in all games like BF1 and the division. i don't know if this is the issue. i never experience frame drops on my Gtx 1080 xtreme Gaming before. any idea? thank you


Try downloading the latest drivers and then make sure you click on "clean install" I think you might need to go to custom installation to get that option I forget. Also download ccleaner and grey rod of any crack you have lying around your computer and then go to the registry tab and run that and fix all the words that come up. Back up if you want, I usually don't.


----------



## Slackaveli

Quote:


> Originally Posted by *RavageTheEarth*
> 
> You definitely should post this on the EK club. I don't want my Aorus block to do this when they release it. This is why I'm pissed that EK completely ditched copper for nickel plated and on the monoblock they just announced only plexi + nickel is available because they "need to keep the RGB".... I'm not liking the direction they are going.
> Try downloading the latest drivers and then make sure you click on "clean install" I think you might need to go to custom installation to get that option I forget. Also download ccleaner and grey rod of any crack you have lying around your computer and then go to the registry tab and run that and fix all the words that come up. Back up if you want, I usually don't.


when i have crack laying around my computer i smoke it







, but you can 'grey rod' it if you want to.


----------



## CoreyL4

Is the Zotac BIOS any good to flash to? What is its power limit?


----------



## KraxKill

Quote:


> Originally Posted by *SlimJ87D*
> 
> And how does this affect your score?


Score is highest if you're just under the limiter. You can jack up your core and lower your mem a good bit to stay under but you're robbing Peter to pay Paul at that point.

My recommendation would be to run as high a clock as you can at a bin below your maximum which hopefully allows you to run less voltage and then clock the mem to meet the TDP limit. That should get the highest score.

Shooting for the highest GPU clock bin will most likely hit that power limiter; anything above these combos did for me:

2063 @ 1.094 +550 mem
2075 and 1.08v, +550 mem
[email protected] +550 mem

So if you can hit those clocks at lower voltages you can stay under the limiter.

I suspect this happens more in Superposition and The Witcher, as memory usage there is highest; the other benches don't really touch the mem like Superposition does at 4K and 8K.

For me, running 2100 @ 1.05 with mem at +550 keeps me under the limiter. Results may vary.


----------



## KCDC

Welp, pretty sure my AX1200i can't handle these two GPUs when overclocked. Either that or I killed it. Playing ME:A or trying any benchmark, when both GPUs go over 2000, bam, full restart; all signs point to the PSU peaking. I can play games and benchmark with one card active and no SLI or Nvidia Surround with zero issues.

Just ordered an AX1500i..

Never in my life did I expect to max out a 1200 watt premium PSU.


----------



## nrpeyton

OMG this is *madness*.

Can't believe how *handicapped* this card is on *power*.

At out-of-box settings on my newly (just opened) 1080TI *FE* it was only 2 FPS better than my fully overclocked and maxed out 1080 Classified in the Witcher 3 at 1440p.

It wasn't until I:


updated MSI Afterburner to Beta 4.4
put fan to 100%
raised power target to 120%
*locked* an *under*-volt at 2050 MHZ

_ONLY then_ did I finally see any gains from upgrading from the 1080 to the 1080 TI.

This was _even_ more evident at 4k, where overclocking with Afterburner beta 4.4 (undervolting, raising the core clock and increasing the power target to max) got me another 6 FPS (which, in the 40s and 50s FPS range, is *A LOT* and definitely visible to the naked eye during gameplay).

Shunt mod here I come... lol <--- _don't worry, I still have a lot more tinkering to do before I do the mod (I know people are waiting on my results with a Palit BIOS first)._

Then we'll see what this beast can do 

*Results so far:* (the Witcher 3; Ultra)
FPS at 4k on fully overclocked 1080 Classified on water:
38 avg

FPS at 4k on new 1080 TI (out of box settings)
46 FPS

FPS at 4k on new 1080 TI (with *u*ndervolt / core overclock / power 120% / mem overclock _& various downloads of beta o/c'ing software to make it all possible_):
52 FPS

.
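Putting those Witcher 3 numbers into percentages makes the point about tuning:

```python
def uplift_pct(old_fps, new_fps):
    """Percent frame-rate gain going from old_fps to new_fps."""
    return (new_fps - old_fps) / old_fps * 100

# the 4K Witcher 3 averages reported in the post above
print(round(uplift_pct(38, 46), 1))  # stock 1080 Ti vs the OC'd 1080
print(round(uplift_pct(46, 52), 1))  # tuned 1080 Ti vs stock 1080 Ti
print(round(uplift_pct(38, 52), 1))  # tuned 1080 Ti vs the OC'd 1080
```

In other words, the tuning roughly doubles the upgrade: about 21% out of the box becomes about 37% once undervolted and power-target-maxed.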


----------



## KraxKill

That "normalized TDP" limit is one major cockblock I tell ya.


----------



## nrpeyton

Quote:


> Originally Posted by *KraxKill*
> 
> That "normalized TDP" limit is one major cockblock I tell ya.


Indeed lol

How are you guys doing so far with memory overclocks?

My initial testing is this:

*+565* = 14 errors within 30 seconds

*+560* = 10 minute run 0 errors (completely stable)









(In other words, with my memory overclocked to within a ball hair of its life at +560 I'm completely stable... but even 1 MHz more and things go wild lol).

/\ that's at a temp of approx. 60c (fan at 100%, a power draw of 277W avg., and default clocks + volts)

*+675* = crash
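The stepwise testing described above (bump the offset, watch for errors, back off at the first failure) can be sketched as a simple search. The stability check here is a stand-in lambda; on real hardware it would be a stress run while watching an error counter:

```python
def find_max_stable(is_stable, start=0, stop=700, step=5):
    """Walk memory offsets upward and return the last one that passed.
    `is_stable` stands in for a real check (run a stress test at that
    offset, count errors); here it's just a function of the offset."""
    best = None
    for offset in range(start, stop + 1, step):
        if is_stable(offset):
            best = offset
        else:
            break   # first failure: stop, memory errors ramp up fast past here
    return best

# toy model of the card above: clean up to +560, errors beyond that
print(find_max_stable(lambda off: off <= 560))  # -> 560
```

A binary search would need fewer runs, but the linear walk matches how people actually do it, and it never tests far past the first unstable offset.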


----------



## havoc315

In the pictures below, I'm wondering why EKWB tells you to put the thermal pads in pretty much the same place on any waterblock you buy from them, but it differs from Nvidia's placement of pads.
Also, the backplate only shows 4 places to put thermal pads: 3 on memory modules and one long strip under the VRM section, which is odd because there are 11 memory modules.
Is this because when water cooling the board does not get as hot, so there is not as much need to cool everything Nvidia does?
Is there any recommended pad placement I should consider when installing the EKWB waterblock that they have not specified?


----------



## Nico67

Quote:


> Originally Posted by *KraxKill*
> 
> That "normalized TDP" limit is one major cockblock I tell ya.


What bios are you using when you see normalized TDP limit?


----------



## nrpeyton

Quote:


> Originally Posted by *havoc315*
> 
> In the picture's below I'm wondering why ekwb tells you to put the thrmal pads pretty much in the same place on any waterblock that you buy from them, but it differs from nvidia's placement of pads.
> Also on the backplate it only shows 4 places to put thermal pads 3 on memory modules and one long strip under the vrm section which is odd, because there are 11 memory modules.
> Is this because when water cooling, the board does not get as hot so there is not as much need to cool all of which nvidia does?
> Is there any recommended pad placement I should consider when installing ekwb waterblock that they have not specified?


In my experience, it's better to do trial and error and see which sizes get you the best temps.

For example:
In the EK manual for my last card (1080 Classy) the pad sizes were 0.5mm mem and 1.0mm VRM respectively.

However I found 1.0mm memory + 1.5mm VRM got me MUCH better temps. Having _really good compression_ on the pads helps tremendously. Any good pad will compress to 50% of its original height anyway.
Anything over 3 W/mK is a winner, and compression is more important than the W/mK rating.

_I've also seen some old AMD partners use 6mm thick slabs of padding on cards before (in the olden days 9 years ago) -- so when you think about that -- in contrast... it's easy to work out ;-)
_
I installed probes attached to the back of the PCB (directly behind components) to take temp measurements.
Core temps weren't really affected, but memory overclocking benefited massively. These GDDR5X chips respond very nicely to temps, and the 3 chips nearest the VRM side (furthest from the I/O side of the card) get really, really hot. On cards with a smaller PCB, such as the FE, the issue is even more apparent.

Just be careful.... apply the pads, screw everything back together,
then unscrew it all and see which pads made contact. (Put something slippery on top to see where contact is being missed and where it is good.) <--- I used thermal grease... if there was residue left on the part the pad was meant to touch, then ALL GOOD. If not, you have to look at it all again...

It can be painstaking... going too thick on one pad can affect the contact of adjacent parts (even on opposite ends of the card).

Once you crack it, you'll feel great. And you'll get an extra 100mhz out your memory that you never thought possible.


----------



## Nico67

Quote:


> Originally Posted by *havoc315*
> 
> Also on the backplate it only shows 4 places to put thermal pads 3 on memory modules and one long strip under the vrm section which is odd, because there are 11 memory modules.


I don't think they are for the memory modules, but rather just standoff protection so the backplate doesn't touch components behind the GPU. It would have been better to just place a pad behind the GPU, which I did as well.









As for the other locations, just try to match what the original heatsink had as closely as possible.


----------



## havoc315

Quote:


> Originally Posted by *nrpeyton*
> 
> In my experience; it's better to do trial-and-error and see what gets you best temps per sizes.
> 
> For example:
> In EK manual for my last card (1080 classy) pad sizes were 0.5 mem and 1.0 VRM respectfully.
> 
> However I found 1.0mm memory + 1.5mm VRM got me MUCH better temps. Having _really good compression_ on pads helps trumendously. Any good pad will compress to 50% it's original height anyway.
> Anything over 3 W/MK is a winner. Compression is also more important than w/mk rating.
> 
> _I've also seen some old AMD partners use 6mm thick slabs of padding on cards before (in the olden days 9 years ago) -- so when you think about that -- in contrast... it's easy to work out ;-)
> _
> I installed probes attached to the back of the PCB (directly behind components) to take temp measurements.
> Core temps weren't really affected, but memory overclocking benefited massively. These GDDR5X chips respond very nicely to temps. And the 3 chips nearest the VRM side (furthest away from I/O side of card) get really really hot. Especially on cards with smaller PCB such as FE the issue is even more apparent.
> 
> Just be careful.... apply pads.. screw everything back together..
> then unscrew it all and see what pads stick. (put something slithery ontop) to see where contact is being missed or where contact is good. <--- i used thermal grease... if there was residue left on the bit the pad was meant to make contact with then ALL GOOD.. If not then you have to look at it all again...
> 
> It can be painstaking... going too high on one pad can affect the contact of adjacent parts (even on opposite ends of card)


Thank you for the fast reply, because I'm literally sitting in front of my two 1080ti trying to figure out what to do


----------



## havoc315

Also, can you stack thermal pads on top of each other, as long as the compression is strong enough with a higher-end thermal pad?


----------



## nrpeyton

Quote:


> Originally Posted by *havoc315*
> 
> Also can you stack on top of thermal pads as long as the compression is strong enough with a higher-end thermal pad


Yes, as a temporary measure (and to get it all figured out yes absolutely).

But when you have time I'd change it over.

You can get cheap pads at around 3 W/mK on eBay for under a fiver (10cm x 10cm squares) that cut nicely into shape.

*The key is sticking something slippery on top of the pads (thermal grease?), screwing it together, then unscrewing to see where the residue is left. (The pads that have really good compression might even STICK to the block when you lift it back up, if the contact is absolutely excellent.)*

But you won't know for sure how good it is; until you fire it up and do some benches and compare temps.
(Also don't worry too much; if you stick to the manual you can't go wrong -- it might just not be as absolutely perfect as it could be).

There's also no harm in adding extra pads so more of the PCB makes contact with some part of the block. (Just be careful you don't compromise contact on other components in the process).

I've got temp probes attached to a cheap 25-buck fan controller like this:

Just some examples for a bit of inspiration _(don't worry about going as in-depth as me)_:





This is what I did with my 1080 Classified (I added pads that nvidia/evga & EK didn't use, for even more surface-area heat dissipation).

_right click & 'open new tab' for full size_

You don't need to get into it as much as me, or anywhere near as much detail.

Sorry, I don't mean to overwhelm you; I'm just as crazy as they come lol (and obsessed with temps and pushing to the absolute limit)...

But notice in my '2nd last picture' how hot the 3 memory chips are compared to the others? (Purely due to their proximity to the VRM -- _a lot of heat spills over from it._) After this mod (and a few other little tricks) on my last card I could hit a ridiculous +925 on memory.

But just to give you a bit of inspiration/food for thought


----------



## Clukos

Undervolting vs stock vs overclocking power draw at 4k max settings:


----------



## nrpeyton

Quote:


> Originally Posted by *Clukos*
> 
> Undervolting vs stock vs overclocking power draw at 4k max settings:
> 
> 
> Spoiler: Warning: Spoiler!


Interesting, I see the BIOS on the Gaming X also features an extra 30 watts over the FE.

Watts really seem a commodity on the TI.

I really think Nvidia has been far too economical, restricting power just to meet their TDP targets for shareholders.

They could have at least countered that by giving overclockers the ability in the BIOS to increase the slider further (from say 20% to 40%).

How far that slider will go is 100% down to the BIOS. Nothing else.

Stock cards and OEM systems wouldn't have been affected. Everyone wins.


----------



## Clukos

Oh yes, BIOS modding would be great, Maxwell was good without it, and great with it.


----------



## nrpeyton

Quote:


> Originally Posted by *Clukos*
> 
> Oh yes, BIOS modding would be great, Maxwell was good without it, and great with it.


Aye, absolutely.

Fingers crossed ;-)


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KraxKill*
> 
> That "normalized TDP" limit is one major cockblock I tell ya.


When are we supposed to notice the normalized TDP limit? Is the green perf cap supposed to light up?

I haven't experienced any power limiting in video games. Not saying you're wrong, but I'm wondering what in the "norm" will make me hit it. Superposition and Witcher 3 don't really make me hit it, but I believe I hit it a tiny bit during Time Spy.

At any rate, would video games really make us hit the normalized TDP? I'm guessing no?


----------



## reflex75

Hello there








After playing with 2 different 1080 Tis, I am confused about the boost clock behavior








If I put the same boost clock setting on both cards, for instance 1750 MHz, then one card boosts 50 MHz higher than the other; for instance, one boosts automatically to 2000 MHz and the other to 2050 MHz.
Is it normal to get such a difference at the same boost clock setting?
What's the point of the advertised boost clock we are paying for?


----------



## Luckbad

Quote:


> Originally Posted by *reflex75*
> 
> Hello there
> 
> 
> 
> 
> 
> 
> 
> 
> After playing with 2 different 1080ti, I am confused about the boost clock behavior
> 
> 
> 
> 
> 
> 
> 
> 
> If I put the same setting for the boost clock, for instance 1750Mhz on both cards, then one card will boosts 50Mhz higher than the other, for instance one boost automaticly to 2000mhz and the other to 2050mhz.
> Is it normal for the same boost clock to get such a difference?
> What's the point of the advertised boost clock we are paying?


Yes, 100% normal. You are guaranteed to reach the advertised boost clock, otherwise you have a defective unit.

If your boost goes higher than advertised (all of them do), you're good to go. If they can go over 2000MHz, you're in golden card territory and should be super happy.

I've seen boost clocks for the 1080 Ti defaulting anywhere between 1750 and 2050 for aftermarket cards (generally lower for reference at stock because they aren't overclocked by default).
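One way to think about the variance described above: Pascal's GPU Boost moves clocks in fixed steps of roughly 13 MHz ("bins"), so two cards with identical settings can settle several bins apart depending on silicon quality, temperature and power. A toy calculation (the 13 MHz bin size is an approximation, not an official spec):

```python
# Approximate size of a Pascal GPU Boost clock step ("bin") in MHz
BIN_MHZ = 13

def bins_apart(clk_a, clk_b, bin_mhz=BIN_MHZ):
    """How many boost bins separate two observed clocks."""
    return round(abs(clk_a - clk_b) / bin_mhz)

# reflex75's two cards: one settles at 2000 MHz, the other at 2050 MHz
print(bins_apart(2000, 2050))  # -> 4
```

A 50 MHz gap is only about four bins, which is well within normal card-to-card variation; the advertised boost is a floor, not a promise of where a given chip will land.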


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> OMG this is *madness*.
> 
> Can't believe how *handicapped* this card is on *power*.
> 
> At out-of-box settings on my newly (just opened) 1080TI *FE* it was only 2 FPS better than my fully overclocked and maxed out 1080 Classified in the Witcher 3 at 1440p.
> 
> It wasn't until I:
> 
> 
> updated MSI Afterburner to Beta 4.4
> put fan to 100%
> raised power target to 120%
> *locked* an *under*-volt at 2050 MHZ
> 
> _ONLY then_ I finally seen any gains from upgrading from 1080 to 1080 TI.
> 
> This was _even_ more evident at 4k. Where overclocking using afterburner beta 4.4 (by undervolting, raising core clock + increasing power target to max) that I got another extra 6 FPS. (which in the 40's and 50's FPS's that is *A LOT* and definitely visible to the naked eye during gameplay).
> 
> Shut mod here I come... lol <--- _don't worry I still have a lot more tinkering to do before I do the mod (I know people are waiting on my results with a palit BIOS first_
> 
> Then we'll see what this beast can do
> 
> *Results so far:* (the Witcher 3; Ultra)
> FPS at 4k on fully overclocked 1080 Classified on water:
> 38 avg
> 
> FPS at 4k on new 1080 TI (out of box settings)
> 46 FPS
> 
> FPS at 4k on new 1080 TI (with *u*ndervolt / core overclock / power 120% / mem overclock _& various downloads of beta o/c'ing software to make it all possible_):
> 52 FPS
> 
> .


yeah, man, that is the true curse: when you get good silicon and tweak it expertly and suck every drop out of it, and then you "upgrade" to 1000 more CUDA cores and get basically the same fps! You have to at least catch a good sample again (which you did) and re-tweak it. At stock it looks like a sidegrade.

Quote:


> Originally Posted by *nrpeyton*
> 
> Indeed lol
> 
> How are you guys doing so far with memory overclocks?
> 
> My initial testing is this:
> 
> *+565* = 14 errors within 30 seconds
> 
> *+560* = 10 minute run 0 errors (completely stable)
> 
> (In other words, with my memory overclocked to within a ball hair of its life at 560 I'm completely stable.. but even 1mhz more and things go wild lol).
> 
> /\ thats at a temp of approx. 60c (fan 100% and a power draw of 277w avg. + default clocks + volts)
> 
> *+675* = crash


how are you testing for vram errors?? PLEASE?..


----------



## Luckbad

I've pushed my Zotac 1080 Ti Amp Extreme as hard as I can on air at this point.

My best memory speed is 6010 or so. I can go a fair bit higher without crashing or artifacts, but performance tends to go down for the last 100 MHz or so of headroom on the overclock offset so I stop it there.

My best core speed is 2088. The moment I hit 2100, the card throws in the towel and crashes whatever is running at the time.

The highest voltage I've ever been able to get is 1.075. Not sure why it doesn't seem to want to go higher even at +100. Maybe I need to screw with curves in Afterburner, but I don't feel compelled to do so.

My normal settings for the card have it running at 2062 core, 6010 memory at 120% power. The temperature usually hovers in the low 60s with fan speed matched to temperature, so it never gets particularly loud either.

Golden card out of the box for sure. Doesn't have a ton of headroom since it's already so good out the gate (2037.5 is its stock boost with no GPU clock offset).


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah , man, that is the true curse when you get a good silicon and you tweak it expertly and suck every drop out of it, and then you "upgrade" to a 1000 more cuda cores and get basically the same fps! You have to at least catch a good sample again (which you did) and re-tweak it. At stock it looks like a sidegrade.
> rep'd (again)


Am I missing something here? He's getting about 25-35% extra performance over his 1080, which was to be expected?


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> Indeed lol
> 
> How are you guys doing so far with memory overclocks?
> 
> My initial testing is this:
> 
> *+565* = 14 errors within 30 seconds
> 
> *+560* = 10 minute run 0 errors (completely stable)
> 
> (In other words, with my memory overclocked to within a ball hair of its life at 560 I'm completely stable.. but even 1mhz more and things go wild lol).
> 
> /\ thats at a temp of approx. 60c (fan 100% and a power draw of 277w avg. + default clocks + volts)
> 
> *+675* = crash


how are you testing for vram errors?? PLEASE?..
Quote:


> Originally Posted by *SlimJ87D*
> 
> Am I missing something here? Hes getting about 25%-35% extra performance over his 1080 which was to be expected?


yeah, I guess you missed where at stock it was a 2 fps gain. He's OK now that he put the work in (and got a good sample, luckily).


----------



## nrpeyton

My max overclock (on core) at stock voltage (1050mv):

2075 Mhz @ 50c +
or
2062 @ 60c +++

Power Limit during gaming _*IS*_ showing its ugly face: (the Witcher 3 / Ultra / 4k)

See this screenshot. _(right click & 'open new tab' to view full size and you'll be able to make out the numbers in my screenshot)_


----------



## illypso

Quote:


> Originally Posted by *buddatech*
> 
> Yes like this question that got buried and never replied to
> 
> Can/should I use CLU Pro on GPU Die or should I stick with other paste I have on hand Antec Formula 7??


I was on MX4 paste with the EVGA AIO, overclocked to 2050 at 1.092V with the shunt mod; it stabilized at 51C after a lot of benchmark loops.
I switched to CLU and got 47C with the same fan settings.


----------



## nrpeyton

Quote:


> Originally Posted by *Slackaveli*
> 
> how are you testing for vram errors?? PLEASE?..


OCCT is perfect for this (if you carefully set it up).

It will allow you to test your max memory overclock to within 5mhz with absolute precision.

The only varying factor you need to worry about is temperature, which can and will affect results. Also try to set a max FPS limit that simulates a real-world power scenario (i.e. ~285w draw).

Moreover, 'shader complexity' should be set to 3.

Its method is simple but elegant: it simply compares each frame to the last, and any discrepancy = an error.

Don't be put off by the fact the programme only *claims* to test 2GB. It doesn't matter; it tests that 2GB across all modules, it only means 2GB is loaded at any one time. In recent versions that number is actually doubled, making the test even better. _(they just didn't update the GUI to say so)_
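The compare-successive-frames idea is easy to picture in code. Here is a toy CPU-side sketch (illustrative only, not OCCT's actual implementation; the real test renders on the GPU and reads frames back from VRAM):

```python
import random

def count_frame_errors(reference: bytes, readback: bytes) -> int:
    """Count bytes that differ between a reference frame and a frame
    read back from memory. The premise is the same as OCCT's VRAM test:
    rendering identical input twice must yield identical frames, so any
    per-byte discrepancy indicates a memory error."""
    if len(reference) != len(readback):
        raise ValueError("frames must be the same size")
    return sum(a != b for a, b in zip(reference, readback))

random.seed(42)
# Toy 256x256 RGBA "frame" standing in for a rendered test pattern.
frame = bytes(random.getrandbits(8) for _ in range(256 * 256 * 4))

copy = bytearray(frame)
assert count_frame_errors(frame, bytes(copy)) == 0  # stable memory: no errors

copy[1000] ^= 0xFF  # flip two bytes, mimicking corruption from an unstable memory OC
copy[2000] ^= 0xFF
print(count_frame_errors(frame, bytes(copy)))  # 2
```

At a stable offset the error count stays at zero over a long run; even a handful of errors per minute means backing the memory clock off.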


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> My max overclock (on core) at stock voltage (1050mv):
> 
> 2075 Mhz @ 50c +
> or
> 2062 @ 60c +++
> 
> Power Limit during gaming _*IS*_ showing it's ugly face: (the Witcher 3 / Ultra / 4k)
> 
> See this screenshot. _(right click & 'open new tab' to view full size and you'll be able to make out the numbers in my screenshot)_


yeah, man, it will be, straight post-shunt though. I'm not hitting it since I went to the Aorus and got that 300w base / 375w limit.


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> OCCT is perfect for this (if you carefully set it up).
> 
> It will allow you to test max memory overclock to within 5mhz with absolute precision.
> 
> The only varying factor you need worry about is varying temperatures which can / will affect results. Also try and lay in a max FPS limit that stimulates a real-world scenario power scenario (I.E. 285w draw)
> 
> Moreover 'shader complexity' should be set to 3.
> 
> Its method is simple but eloquent. It simply compares one frame to the last, any discrepancies = error.
> 
> Don't be put off by the fact the programme only *claims* to test 2GB. It doesn't matter, it tests that 2GB over all modules. It only means 2GB is loaded at any one time. However in recent versions this number is actually doubled. Making the test even better. _(they just didn't update the GUI to say so)_
> 


thx! rep'd


----------



## nrpeyton

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, man, it will be straight post shunt, though. im notting hitting it since i went to aorus and got that 300w base 375w limit.


2062/2075 at stock volts (1050mV) with a *near* 300W draw.
Is that any good at 65-70 degrees C with a +555 on memory?

Silicon winner, below average or above average?

What you guys think?

Or do you still need some more data?

(I've not tested above default voltage yet.. that's my next call.)

And aye, I'm quite looking forward to doing the mod; just want to *try a few BIOS's first, as promised.*

*Then* shunt mod here I come, lol.


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> 2062/2075 at stock volts (1050mV) with a *near* 300W draw.
> Is that any good at 65-70 degrees C with a +555 on memory?
> 
> Silicon winner, below average or above average?
> 
> What you guys think?
> 
> Or do you still need some more data?
> 
> (I've not tested above default voltage yet).. thats my next call?........
> 
> And aye, I'm quite looking forward to doing the mod; just want to try a few BIOS's first as promised I'd do.
> 
> Then shunt mod here I come, lol.


oh, that's well above average and will be a 2100+ under water. good pull, bro!


----------



## Clukos

Hitting power limit on Witcher 3 (set to 1.093 V, 2100MHz with a curve OC):

The undervolting test actually provides a better gaming experience


----------



## nrpeyton

Quote:


> Originally Posted by *Slackaveli*
> 
> oh, that's well above average and will be a 2100+ under water. good pull, bro!


lol I don't know if that's a good or a bad thing now!

Because that makes the decision on whether to sell and upgrade to the KPE or not (when it finally/eventually launches) even harder! haha.

But yeah, I'm happy.

Cheers for the feedback, appreciate it.


----------



## nrpeyton

Quote:


> Originally Posted by *Clukos*
> 
> Hitting power limit on Witcher 3 (set it to 1.0930mv 2100MHz with curve OC):
> 
> 
> The undervolting test actually provides a better gaming experience


yeah, I can understand that, and I've only "opened her up" three times in the last few hours I've had her.

And that makes complete sense...
Much smoother due to no/less "core jumping".

GPU Boost 3.0 does get awfully excitable at times lol.

Quote:


> Originally Posted by *Slackaveli*
> 
> thx! rep'd


Thanks means a lot

let me know how you get on with it


----------



## Clukos

I think for gaming without shunt modding undervolting + overclocking the memory to the absolute limit makes more sense. I got a bit lucky with mem because I can go up to +800 without any artifacting at all (I start seeing artifacts at +850), dropped it down to +700 for general use and no issues so far.


----------



## nrpeyton

Quote:


> Originally Posted by *Clukos*
> 
> I think for gaming without shunt modding undervolting + overclocking the memory to the absolute limit makes more sense. I got a bit lucky with mem because i can go up to +800 without any artifacting at all, dropped it down to +700 for general use and no issues so far.


Under water?

I noticed on my 1080 I could squeeze more out of the memory by lowering temps on the 3 chips nearest the VRM side of the card.


----------



## Clukos

Quote:


> Originally Posted by *nrpeyton*
> 
> Under water?


No, just air. I've got the MSI Gaming X variant.


----------



## nrpeyton

Quote:


> Originally Posted by *Clukos*
> 
> No, just air. I've got the MSI Gaming X variant.


What temps you hitting typically?

Even at 100% fan (which is unbearable) I am hitting 60-70c at a 275w draw.

I've noticed they love to be under 50c.


----------



## Clukos

Quote:


> Originally Posted by *nrpeyton*
> 
> What temps you hitting typically?
> 
> Even at 100% fan (which is unbearable) I am hitting 70c at a 275w draw.


65-70C with undervolting, when overclocked around 75+C


----------



## reflex75

Quote:


> Originally Posted by *Luckbad*
> 
> Yes, 100% normal. You are guaranteed to reach the advertised boost clock, otherwise you have a defective unit.
> 
> If your boost goes higher than advertised (all of them do), you're good to go. If they can go over 2000MHz, you're in golden card territory and should be super happy.
> 
> I've seen boost clocks for the 1080 Ti defaulting anywhere between 1750 and 2050 for aftermarket cards (generally lower for reference at stock because they aren't overclocked by default).


So what's the point of paying a higher price for the Zotac Extreme (advertised at 1759Mhz) if we can't know what we'll get?
Maybe it will only boost to 1800Mhz...


----------



## nrpeyton

Quote:


> Originally Posted by *Clukos*
> 
> 65-70C with undervolting, when overclocked around 75+C


Wow you've hit the jackpot on memory then ;-)


----------



## Clukos

The Gaming X has separate cooling for VRM and VRAM, maybe that helps?


----------



## nrpeyton

Quote:


> Originally Posted by *Clukos*
> 
> The Gaming X has separate cooling for VRM and VRAM, maybe that helps?


Aye, possibly very true; the PCB looks a good bit wider too.

Anyway, it could have been a lot worse I suppose.. +555 memory is nothing to be ashamed of (least I hope not) lol.

If I can get that to +675 after going under water I'll be happy.


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> lol when i opened my 1080 Classy (5 months ago) I asked myself that exact question.
> 
> And it even took me two/three months to finally crack it, but in the end I did indeed figure that out (as the videos show) haha.. it just a matter of time lol
> 
> Where I'm at now is this:
> 
> Keep it sealed in the box and sell it at full price (or near full price) to pay for a KPE when its launched. _(how long will i have to wait)??? no idea_
> Sell it 'used' for a KPE when KPE launches.
> Open this now and never look back (not bother with KPE).
> _Can't make my mind up. But its sitting here looking at me; doubt I can resist opening it lol._ Not sure how long I can hold out.........


I would have said get the Asus Strix (it has the hotwire points), but too late.
I used the hotwire with my GTX 770s and could go all the way to 1.5v. It would have been great for you to try. I am almost tempted to try them myself, but I am not convinced I would hit a stable 2200Mhz, so I am waiting for you to tell us how you go.

I know you will be sub-zeroing this card at some point in the very near future

Quote:


> Originally Posted by *JohnnyFlash*
> 
> Has anyone with an FE solved the high idle clocks with multiple monitors?


I used one of my old cards to run my 2 accessory monitors. The trade-off is the main card now runs at x8 PCIe 3.0, but it doesn't seem to have made much of a performance hit.


----------



## Luckbad

Quote:


> Originally Posted by *reflex75*
> 
> So what's the point to pay higher price for the Zotac extreme (advertised to 1759Mhz) if we can't guess what will happen?
> Maybe it will boost only to 1800Mhz...


Better cooling, quieter, more power. But yeah, you can still get a bad overclocker. That's true of every graphics card and CPU since... forever, basically.

It is more noticeable lately because NVIDIA's newer cards have Boost functionality that automatically overclocks. It used to be that you bought a card at a clock speed and that is what you got. Only a few people ever went so far as to overclock any further.


----------



## nrpeyton

Quote:


> Originally Posted by *feznz*
> 
> I would have said get the Asus strix it has the hot wire points but too late.
> I used the hot wire with my GTX 770s could get all the way to 1.5v would have been great for you to try, I am almost tempted to try them but I am not convinced I would hit a stable 2200Mhz so I am waiting for you to tell us how you go
> 
> I know you will be sub zeroing this card at some point in the very near future


Aye, well the Classy's and KPE's usually have an evbot port (even the 1080 Classy retained the port despite EVGA no longer manufacturing evbot, and they're a *very rare* find on eBay too). Even if I got my hands on one, they only updated the firmware for 1080 compatibility about 3 months ago.

However the new epower (by kingpincooling.com) is due to hit shelves soon. It has evbot built in.
Evbot allows direct hardware tuning of voltages/clocks and other treats.

I think they are probably going to time the release to coincide with the new 1080TI KPE, which would be perfect. <--- that's just an educated guess.

How do the *hot wire points* work? I never realised *ASUS* did anything like that. _(the evbot on EVGA's side was a little handheld computer or "bot" you plugged in)._



On the 1080's we found stock voltage could get us to 2250 at about 0-5c. Anything faster and we began to need more voltage. The fact the TI's are still able to hit 2100 on the full-size GPU indicates the manufacturing process has matured a bit, which could be good news.

The 1080TI is still exceptionally new. I think there's still time for some pleasant surprises. (I never in all honesty thought I'd be able to do half the stuff I ended up getting to do on my 1080 Classy; in the end it was hard fought, but we got most of it -- everything except the BIOS tweaks.) But a lot of the guys in the community able to write these programs (who work in the industry or at EVGA etc.) were hanging onto their 980TI's, as a 980TI on LN2 actually scored higher than a 1080 (non-TI) on LN2 in Firestrike.

Now the 1080TI on LN2 beats a 980TI on LN2, so now is when we will start to see guys making the switch to Pascal, and hopefully Nvidia overclocking beginning to grow again. (At hwbot, Nvidia overclocking has been dead for a while now -- _let's hope that is all about to change!_)


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> Aye well the Classy's and KPE's usually have an evbot port. (even the 1080 retained the port despite EVGA no longer manufacture evbot and they're a *very rare* find on Ebay too). Even if I got my hands on one they only updated the firmware for 1080 compatibility about 3 months ago.
> 
> However the new epower (by kingpincooling.com) is due to hit shelves soon. It has evbot built in.
> Evbot allows direct hardware tuning of voltages/clock and other treats
> 
> I think they are probably going to time the release to coincide with the new 1080TI KPE. Which would be perfect. <--- thats just an estimated guess.
> 
> How do the hot wire points work? i never realised ASUS did anything like that. _(the evbot on EVGA's side was a little handheld computer or "bot" you plugged in)._


I looked at evbot before; there is a Raspberry Pi hack to mimic the evbot.

Asus hotwire is basically: knock off a resistor / bridge two points, solder in a potentiometer, and you're good to go.
http://www.overclock.net/t/1409611/asus-gtx-770-dcuii-with-maximus-v-extreme-hotwire/0_20
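For the curious, the reason a soldered-in potentiometer raises the voltage: switching regulators hold the midpoint of a feedback divider at a fixed reference, so lowering the bottom leg's effective resistance forces the output up. A back-of-the-envelope sketch; the topology is the textbook one, and every value here (Vref and the resistances) is made up for illustration, not taken from any Asus card:

```python
def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

def vrm_output(v_ref: float, r_top: float, r_bottom: float) -> float:
    """Idealised feedback divider: the controller drives Vout until the
    divider midpoint equals v_ref, so Vout = v_ref * (1 + r_top / r_bottom)."""
    return v_ref * (1 + r_top / r_bottom)

V_REF = 0.8        # volts; hypothetical controller reference
R_TOP = 1000.0     # ohms; hypothetical upper feedback resistor
R_BOTTOM = 3200.0  # ohms; hypothetical lower feedback resistor

stock = vrm_output(V_REF, R_TOP, R_BOTTOM)
# Solder a 20k pot in parallel with the bottom resistor: its effective
# resistance drops, so the regulated output voltage rises.
modded = vrm_output(V_REF, R_TOP, parallel(R_BOTTOM, 20000.0))

print(f"stock: {stock:.3f} V, modded: {modded:.3f} V")
```

Dialing the pot further down keeps raising the rail, which is why hotwire can reach voltages no software slider will allow.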


----------



## nrpeyton

Quote:


> Originally Posted by *feznz*
> 
> I looked at evbot before there is a raspberry pi hack to mimic the evbot.
> 
> Asus hot wire is basically knock off a resistor/bridge two point and solder in a potentiometer and your good to go
> http://www.overclock.net/t/1409611/asus-gtx-770-dcuii-with-maximus-v-extreme-hotwire/0_20


Interesting. Is that like a variable resistor?

Great thing about these GPU's is they're so small + no heat-spreader = so easy to cool.

Even Furmark struggled to push my last card more than 13 degrees above coolant temperature. (I used to imagine it like melting a big chunk of ice-cold metal onto the top of a smaller bit of metal; electricity going through that lil smaller bit of metal or not, the temp still isn't going anywhere. lol)

Compare that to a CPU, which will rise to 50c above pot temperature in seconds....
(and that includes AMD soldered chips -- I imagine Intel is even worse)

Or another way to look at it is this: imagine cooling a CPU with a stock TDP of 250w on air and still managing to keep that temp at 65c while overclocked with 1 fan.


----------



## viz0id

Quote:


> Originally Posted by *Slackaveli*
> 
> 5775-c w/ 2400Mhz c-10 ram and you golden. just drop it in, too,.


I have to find 5775-c used then? Can't seem to find any new. I will have to look out for it. Thanks


----------



## jamexman

Quote:


> Originally Posted by *KCDC*
> 
> Welp, pretty sure my ax1200i can't handle these two GPUs when overclocked. Either that or I killed it. Playing ME:A or trying any benchmark, when gpus both go over 2000, bam, full restart, all signs point to PSU peaking. Can play games and benchmark with one active and no SLI or nVidia surround with zero issues.
> 
> Just ordered an AX1500i
> 
> Never in my life did I expect to max out a 1200 watt premium PSU.


I don't think you're maxing out; 1200w is more than plenty. My 1000w Platinum EVGA does fine with 1080 Ti SLI and an overclocked Ryzen 1800X. Those Corsair power supplies are known for doing that; check the Amazon and Newegg reviews, including for the one you just ordered... I hope you don't get a lemon. Personally I would have gone for an EVGA PSU.


----------



## nrpeyton

Quote:


> Originally Posted by *jamexman*
> 
> I don't think you're maxing out. 1200 is more than plenty. My 1000w plat Evga does fine with 1080ti sli and overclocked Ryzen 1800x. Those Corsair power supplies are known for doing that. Check amazon and newegg reviews. Including the one you just ordered... I hope you don't get a lemon, personally I would have gone for a Evga PSU.


That's mental.

Even if each GPU peaked at 350w (normally 300w fully overclocked) that's still only 700w.

Your CPU and other components would need to be drawing 500w to max that out.

My full system draw (fully overclocked) with a 125w TDP CPU, 250w TDP GPU, 2 SSD's & 1 HDD + pump and fans = 550w.

Sure there's nothing else you're missing? Have you connected every available EPS power connector from the PSU to your board? (Some boards even have an extra power connection to provide more juice beside the PCI-E lanes. I remember one even doing it via an old molex, and it wasn't a really old board.)

I'm running an 850w unit.

If I upgraded to a 1200w unit I'd have an extra 350w to play with. And even now I can't get my full system draw past *675w* (overclocked from 4.0 to 5.0GHz at 1.45v, prime95 max power and Furmark drawing 300w on the GPU, all hammering the system simultaneously) and it still *holds at 675w max.*
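The arithmetic above can be sanity-checked with a quick power-budget tally (the component figures below are illustrative worst-case estimates for the dual-GPU case being discussed, not measurements from any specific system):

```python
# Rough PSU headroom check for an overclocked dual-1080 Ti build.
# All figures are hypothetical worst-case draws in watts.
components_w = {
    "gpu_1 (1080 Ti, OC peak)": 350,
    "gpu_2 (1080 Ti, OC peak)": 350,
    "cpu (heavy OC)": 250,
    "motherboard/RAM/fans/pump": 75,
    "drives": 25,
}

total = sum(components_w.values())
psu_capacity = 1200

print(f"estimated peak draw: {total} W of {psu_capacity} W")
assert total <= psu_capacity  # comfortably within a 1200 W unit's rating
```

Even with pessimistic numbers the total lands around 1050w, which is why sudden restarts on a healthy 1200w unit point at a fault rather than an overload.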


----------



## TheBoom

Quote:


> Originally Posted by *jamexman*
> 
> I don't think you're maxing out. 1200 is more than plenty. My 1000w plat Evga does fine with 1080ti sli and overclocked Ryzen 1800x. Those Corsair power supplies are known for doing that. Check amazon and newegg reviews. Including the one you just ordered... I hope you don't get a lemon, personally I would have gone for a Evga PSU.


Yeah, some Corsair PSUs are infamous for premature failure, but not the AX series though? I've been running an AX860i for years now without any issues. Even with the Ti in the system it does fine.

It could be a batch issue though, maybe.

Anyway, for the guy wondering about the Zotac Amp Extreme: I've had 2 and both seem to like the 2025-2037 range on the core. Memory has a much wider range, with my first card only going +300 max and the second card going up to +450.

Temps hover around 70-75c with fan speed at about 70-80% and an ambient room temp of 32c.


----------



## nrpeyton

What about an old PSU laying around you don't need? Connect it up and just use it to power 1 GPU or the CPU. Leaving the other to handle the main load.

Could help isolate the problem.


----------



## jamexman

Quote:


> Originally Posted by *TheBoom*
> 
> Yeah some corsair psus are infamous for premature failure, but not the AX series though? I'm running a AX860i for years now without running into any issues yet. Even with the ti in the system it does fine.
> 
> It could be a batch issue though maybe.
> 
> Anyway for the guy wondering about the Zotac Amp Extreme, I've had 2 and both seem to like the 2025-2037 range on the core. Memory has a much wider range with my first card only going +300 max and the second card going up to +450.
> 
> Temps hover around 70-75c with fan speed at about 70-80% with an ambient room temp of 32c.


The AX are. Tons of Amazon and Newegg reviews of people complaining there should be a recall, all with those same symptoms: reboots under load. I was going to go with one, but after reading the reviews I decided not to play the lottery and got an EVGA; 10 year warranty, and EVGA customer service has been excellent in my previous GPU experiences.


----------



## imLagging

Quote:


> Originally Posted by *SlimJ87D*
> 
> Did you do your shunt mod right? What percentage are you hitting when you run heaven?


Yes, I compared before and after I did it.
Around ~75% in 1440p


----------



## TheBoom

Been playing with the curve; got 2037 at 1.093v stable for 11 or so scenes before it dropped to 2025 at 1.081v.

Also, is it normal to see a power perfcap when I'm only just hitting 105-110%?

I can tell the card is trying hard not to go past 110% even though I have the power limit maxed to 120%.


----------



## Clukos

Quote:


> Originally Posted by *TheBoom*
> 
> Been playing with the curve, got 2037 at 1.093 stable for 11 or so scenes before it dropped to 2025 at 1.081v.
> 
> Also is it normal to see power perfcap when I'm only just hitting 105-110%?
> 
> I can tell the card is trying hard to not go past 110% even though I have power limit maxed to 120%.


Stable with what exactly? Test 8k Superposition, I'm pretty sure your card isn't stable at that voltage and core clock


----------



## viz0id

Quote:


> Originally Posted by *TheBoom*
> 
> Been playing with the curve, got 2037 at 1.093 stable for 11 or so scenes before it dropped to 2025 at 1.081v.
> 
> Also is it normal to see power perfcap when I'm only just hitting 105-110%?
> 
> I can tell the card is trying hard to not go past 110% even though I have power limit maxed to 120%.


Like someone posted earlier, there are 2 TDP% readings: one that you can see in GPU-Z and one that shows up in HWiNFO as "TDP% normalized". If either of those hits close to 120% you will get perfcapped.


----------



## feznz

Quote:


> Originally Posted by *KCDC*
> 
> Welp, pretty sure my ax1200i can't handle these two GPUs when overclocked. Either that or I killed it. Playing ME:A or trying any benchmark, when gpus both go over 2000, bam, full restart, all signs point to PSU peaking. Can play games and benchmark with one active and no SLI or nVidia surround with zero issues.
> 
> Just ordered an AX1500i..
> 
> Never in my life did I expect to max out a 1200 watt premium PSU.


yip the old PSU debacle
I have had a real bad run on PSU. 5 in 10 years








even my current Enermax 1500W maxrevo is a warranty replacement always had a 1500w UPS pure sine wave, my friends say must be the OCing.
One my friends still got his ling long win wong no brand PSU going strong after 7 years even with SLI.
I don't rate corsair after my experience with 2 HX 850w could not run 580 SLI or 770 SLI stock or underclocked RMAed it to have it blow up after 1 year on my media PC probably less than 100w load.
took a chance with an EVGA

you just got a dud PSU, RMA it should be 7 year guarantee then sell the replacement


----------



## TheBoom

Quote:


> Originally Posted by *Clukos*
> 
> Stable with what exactly? Test 8k Superposition, I'm pretty sure your card isn't stable at that voltage and core clock


Lol how do you know so much about my card? Anyway that's on 4K Supo.

Quote:


> Originally Posted by *viz0id*
> 
> Like someone posted earlier, there is 2 TDP% one that you can see in GPU-Z and one that shows up in HWINFO as TDP% normalized. If any of those hit close to 120% you will get Perfcapped


Ahh I see, so I should run HWiNFO alongside GPU-Z on my benches then.


----------



## Clukos

Some charts from my testing results with The Witcher 3 maxed at 4k (with hairworks 8xAA) from this post: http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/7470#post_26053260





Note that these results are without the shunt mod, on air cooling (MSI Gaming X), and with the maximum curve OC I could get to run stable in this game (2113MHz at 1.093 V). I'm power limited in both the stock and overclocked tests. But it shows that while the 1080 Ti is a power-hungry beast, you can also make it a power-efficient beast if you want to (for thermals/acoustics) without losing too much performance.









- Recommended for those with aftermarket cards that don't want to shunt mod and want a quiet/cooler 1080 Ti
- *HIGHLY recommended* for those owning a 1080 Ti FE!

Note that memory OC doesn't affect power draw that much so you can push it as much as you want to. Core voltage affects power draw the most out of anything.
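The reason core voltage dominates power draw while the memory offset barely registers is the usual first-order CMOS relation, dynamic power ≈ C·V²·f. A rough sketch with example operating points in the spirit of the curve OCs above (the numbers are illustrative, not measurements from this card):

```python
def relative_dynamic_power(v: float, f: float, v0: float, f0: float) -> float:
    """Dynamic power scales roughly with C * V^2 * f, so the ratio to a
    baseline operating point is (v/v0)^2 * (f/f0). First-order only:
    leakage and temperature effects are ignored."""
    return (v / v0) ** 2 * (f / f0)

# Baseline: 2000 MHz at 1.093 V; undervolt: 1962 MHz at 1.000 V.
ratio = relative_dynamic_power(1.000, 1962, 1.093, 2000)
print(f"~{(1 - ratio) * 100:.0f}% less dynamic power for ~2% less clock")
```

Because voltage enters squared, shaving ~0.09 V buys a large power reduction for a tiny clock cost, which is exactly the undervolting trade-off shown in the charts.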


----------



## Benny89

Quote:


> Originally Posted by *Clukos*
> 
> Some charts from my testing results with The Witcher 3 maxed at 4k (with hairworks 8xAA) from this post: http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/7470#post_26053260
> 
> 
> 
> 
> 
> Note that these results are without shunt mod, air cooling (MSI Gaming X) and the maximum attainable curve OC I could get to run stable on this game (2113MHz 1.0930mv). I'm power limited in both stock and overclocking tests. But it shows that while the 1080 Ti is a power hungry beast, you can also make it a power efficient beast if you want to (for thermals/acoustics) without losing too much performance
> 
> 
> 
> 
> 
> 
> 
> 
> 
> - Recommended for those with aftermarket cards that don't want to shunt mod and want a quiet/cooler 1080 Ti
> - *HIGHLY recommended* for those owning a 1080 Ti FE!
> 
> Note that memory OC doesn't affect power draw that much so you can push it as much as you want to. Core voltage affects power draw the most out of anything.


Can you post your voltage curve for Witcher 3?

Also, try playing W3 for at least 3 hours first to see if your clocks are really stable. I reached many "stable" clocks in Witcher 3 (2063, 2050, 2038) but all eventually crashed in W3. The funny thing is that it can stay stable at 2050 at 1.062V and at 2038 at 1.050V. However, once it starts dropping to 2025 at 1.043V it usually crashes shortly after (like 20-30 min). I have no idea how or why.

I also figured out that giving the card less voltage OC on the voltage slider made it run stable longer. I went for +35%. Maybe it needs more or less to not crash once it reaches 1.043V.

It is really stupid that it can stay stable at a higher bin and higher clock but stops being stable at a lower bin and lower clock.

This makes me mental


----------



## BrainSplatter

Quote:


> Originally Posted by *Benny89*
> 
> It is really stupid that it can stay stable at higher bin, higher clock but stops being stable at lower bin, lower clock.


Is the temperature the same? Also, how long does it stay in each bin? The longer it stays, the higher the chance of crashing due to partial instability.


----------



## fisher6

Quote:


> Originally Posted by *TheNoseKnows*
> 
> You like the HOF? You can pre-order here:
> https://www.overclockers.co.uk/kfa2-geforce-gtx-1080ti-hof-8-pack-edition-11264mb-gddr5x-pci-express-graphics-card-gx-09k-kf.html


I'm in Norway so I will have to order it locally. But we usually get all the Galax cards here for a few months before they go out of stock forever.


----------



## Clukos

Quote:


> Originally Posted by *TheBoom*
> 
> Lol how do you know so much about my card? Anyway that's on 4K Supo.


They all behave similarly. 1.093 V is pushing it to the absolute limit in terms of power draw. There is no way your card isn't throttling anywhere with that amount of voltage; that is the primary reason people are shunt modding their cards in the first place.

You probably also don't need 1.093 V to hit 2037; more like 1.043-1.062 V for 100% stability is the norm for that clock speed.
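For context on why the shunt mod raises the effective power ceiling: the card infers current from the voltage drop across a known shunt resistance, so lowering that resistance makes it under-read power by the same ratio. A sketch using a typical-looking 5 mOhm value (illustrative; not the 1080 Ti's actual shunt spec):

```python
def reported_power(actual_power_w: float, r_stock_mohm: float, r_effective_mohm: float) -> float:
    """The card measures the voltage drop across a shunt of assumed
    resistance. If a mod lowers the effective resistance (stacking a
    second shunt, liquid metal across it, etc.), the measured drop
    shrinks and the card under-reports power by the resistance ratio."""
    return actual_power_w * (r_effective_mohm / r_stock_mohm)

R_STOCK_MOHM = 5.0  # hypothetical stock shunt, in milliohms
# Stacking an identical shunt in parallel halves the effective resistance:
r_modded = (R_STOCK_MOHM * R_STOCK_MOHM) / (R_STOCK_MOHM + R_STOCK_MOHM)

print(reported_power(300.0, R_STOCK_MOHM, r_modded))  # a 300 W real draw reads as 150.0 W
```

The firmware still enforces the same numeric limit, but since the sensor now reads half the true draw, the card effectively gets twice the power budget before it throttles.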


----------



## Benny89

Quote:


> Originally Posted by *BrainSplatter*
> 
> Is the temperature the same? Also, how long does it stay in each bin. The longer the higher the chance of crashing due to partial instability.


So it starts at 2063, downclocks to 2050 where it stays till 52C, then goes down to 2038 and stays there for like 2 and a half hours. It starts to downclock to 2025 when it starts hitting 64C. Usually it crashes then, after 20 min more or less.

I will try to make the curve higher. Right now 2038 was at 1.050V, so I will try to make it 1.062V.

The card does not hit the power limit or anything. No artifacting. I have no idea why it eventually crashes...
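The stepping described above is regular enough to model: Pascal boost bins are spaced 12.5 MHz apart, and the card sheds one bin as it crosses certain temperatures. A toy sketch; the 12.5 MHz spacing is Pascal's real granularity, but the temperature thresholds below are guesses based only on this post:

```python
def boost_clock_mhz(temp_c: float) -> float:
    """Toy model of GPU Boost 3.0 thermal stepping: drop one 12.5 MHz
    bin at each temperature threshold crossed. Thresholds mirror this
    particular card's reported behaviour and are not a general rule."""
    bins_dropped = sum(temp_c >= t for t in (45, 52, 64))  # hypothetical thresholds
    return 2062.5 - 12.5 * bins_dropped

assert boost_clock_mhz(40) == 2062.5  # cool: top bin (shows as "2063" in monitoring)
assert boost_clock_mhz(55) == 2037.5  # two bins down (the "2038" bin above)
print(boost_clock_mhz(65))            # three bins down: the bin where the crashes start
```

Since each bin carries its own voltage point on the curve, a card can be stable at the top bin yet crash at a lower one whose voltage is too lean, which matches the behaviour described here.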


----------



## Clukos

Quote:


> Originally Posted by *Benny89*
> 
> Card does not hit power limit or anything. No artefacting. I have no idea why it eventually crashes...


That is a sign of core instability, my card behaves the same way. Memory OC instability = artifacting, Core clock OC instability = Straight crash.


----------



## Benny89

Quote:


> Originally Posted by *Clukos*
> 
> That is a sign of core instability, my card behaves the same way. Memory OC instability = artifacting, Core clock OC instability = Straight crash.


I know that, but in every other game the card is stable at 2063 or 2050. I can play for hours in BF1 with 2063 core and no problem.

Witcher 3 is the only game that causes me so much trouble. For example, in Heaven I know that my card can stay stable at 2038 at 1.043 and 1.050V. It can also stay stable at 2050 at 1.062V. But Witcher 3 does not accept those results







. Seems like for Witcher 3 you have to drop the clock and bump up the bins at some point. So if in Heaven your clock was stable at, let's say, 2050 at 1.063V, in Witcher 3 you should make it 2050 at 1.075V.

This game is nuts when it comes to stable clocks. You can play for an hour and think you have 24/7 stable clocks, and then an hour later it shows you the middle finger


----------



## Clukos

Quote:


> Originally Posted by *Benny89*
> 
> Witcher 3 is the only game that causes me so much trouble.


It draws the most power and probably uses the GPU in some way that is very picky about overclocking or clock stability in general. It's a good way to test stability for sure


----------



## viz0id

The next game I'm looking forward to is Prey. Hopefully it runs really well on high-end systems.

EDIT: I'm running out of games to play, so it's going to be a good game to put the 1080 Ti to the test in.


----------



## Benny89

Can you guys tell me your stable clocks in Witcher 3? By stable I mean you can play it for long hours/days without a crash, not just an hour or two


----------



## BrainSplatter

Quote:


> Originally Posted by *Benny89*
> 
> It starts to downclock to 2025 when it starts hitting 64C. Usually it crashes about 20 minutes after that


So maybe the instability is related to temperature. Pascal is sensitive in that regard. If you can get it under 60C you might be good with your lower volts.

Higher voltage will cause the temperature to rise and might actually cause instability. So it's really a balancing act.


----------



## GNUster

Quote:


> Originally Posted by *Benny89*
> 
> I know that, but in every other game the card is stable at 2063 or 2050. I can play for hours in BF1 with 2063 core and no problem.
> 
> Witcher 3 is the only game that causes me so much trouble. For example in Heaven I know that my card can stay stable at 2038 at 1.043 and 1.050V. It can also stay stable at 2050 at 1.062V. But Witcher 3 does not accept those results
> 
> 
> 
> 
> 
> 
> 
> . Seems like for Witcher 3 you have to drop clock and up bins at one point. So if in Heaven your clock was stable at lets say 2050 at 1.063V in Witcher 3 you should make it 2050 at 1.075V.
> 
> This game is nuts when it comes to stable clocks. You can play for hour and think you have 24/7 stable clocks and then hour later it shows you middle finger


I guess this is simply related to the power consumed by the game and thus the heat produced. On the Strix, the GPU and VRAM conduct heat to both the heat sink and the backplate. Thus, in a more demanding game (like Witcher 3) your VRAM gets hotter due to the heat sink shared with the GPU (and hence your VRAM is less stable), even if you use the same memory clock that is stable in all your less demanding games (like BF1).


----------



## TheBoom

Quote:


> Originally Posted by *Clukos*
> 
> They all behave similarly. 1.0930mv is pushing it to the absolute limit in terms of power draw. There is no way your card isn't throttling anywhere with that amount of voltage, that is the primary reason people are shunt modding their cards in the first place
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You probably also don't need 1.0930mv to hit 2037, more like 1.0430-1.0620 for 100% stability is the norm for that clock speed.


Exactly what I'm trying to find, but I can't seem to get the curve to behave properly for the lower voltage bins. It decides to throttle down at voltages between 1.025-1.075 but keeps itself stable at [email protected] or [email protected] and [email protected] Anything in between and the behavior is different.


----------



## reflex75

Quote:


> Originally Posted by *TheBoom*
> 
> Anyway for the guy wondering about the Zotac Amp Extreme, I've had 2 and both seem to like the 2025-2037 range on the core. Memory has a much wider range with my first card only going +300 max and the second card going up to +450.
> 
> Temps hover around 70-75c with fan speed at about 70-80% with an ambient room temp of 32c.


Thank you for your feedback.
I've had the Zotac since yesterday, and I am trying to compare it to my 4 previous 1080 Tis: 1 FE, 2 MSI, 1 Asus.
I returned the FE because it was too loud, the 2 MSI had massive coil whine (you can hear it from another room), and the Asus was struggling to OC to 1900MHz...
Now I am happy with my first Zotac, the silent beast








But I'm trying to figure out why 2 different cards, if set to the exact same initial boost clock (let's say 1759MHz), will boost to different clocks.
Is it encoded in the chip itself? Is it controlled by the BIOS?


----------



## BrainSplatter

Quote:


> Originally Posted by *reflex75*
> 
> But I'm trying to figure out why 2 different cards, if set to the exact same level of initial boost clock (let's say 1759Mhz) will boost to different clocks?


Different cards often have different VIDs (= different thermals, power requirements, ...). This is one of the things almost all GPU reviewers completely ignore; they still compare, for example, power usage without mentioning the GPU base voltage.


----------



## mtbiker033

Quote:


> Originally Posted by *s1rrah*
> 
> What is "locking the voltage" and how do I do it?


With your curve open, press L. This locks the card to the point you selected and keeps it from going higher on clocks/voltage.


----------



## Benny89

Quote:


> Originally Posted by *GNUster*
> 
> I guess this is simply related to the power consumed by the game and thus the heat produced. That is, on the Strix the gpu and vram conduct heat to both the heat sink and backplate. Thus, in a more demanding game (like Witchter 3) your vram gets hotter due to the shared heat sinks with the gpu (and hence your vram is less stable) even if you use the same mem clock that is stable in all your less demanding games (like BF1).


So is it a downside of the STRIX that it also conducts heat from the VRAM? How do other AIBs handle it? Is this some sort of design flaw?


----------



## GNUster

Quote:


> Originally Posted by *Benny89*
> 
> So is it a downside of the STRIX that it also conducts heat from the VRAM? How do other AIBs handle it? Is this some sort of design flaw?


Not at all; most mid/high-end cards do the cooling this way because overall it gives you a more stable card for regular use. However, when you push the card to the limit, you have to consider those side effects and adapt your cooling solution appropriately.

Edit: The most typical solution to your problem would be to scale up your overall cooling capacity with a water block. With that, you would have enough headroom to keep all components below critical temps even under the most demanding conditions.


----------



## BrainSplatter

Quote:


> Originally Posted by *GNUster*
> 
> you have to consider those side effects and adapt your cooling solutions appropriately.


Exactly. Coming back to Benny89's situation, I would recommend a more aggressive fan curve to see whether keeping the GPU under 60C stabilizes the OC for Witcher 3.


----------



## nrpeyton

Quote:


> Originally Posted by *Clukos*
> 
> Some charts from my testing results with The Witcher 3 maxed at 4k (with hairworks 8xAA) from this post: http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/7470#post_26053260
> 
> 
> 
> 
> 
> Note that these results are without shunt mod, air cooling (MSI Gaming X) and the maximum attainable curve OC I could get to run stable on this game (2113MHz 1.0930mv). I'm power limited in both stock and overclocking tests. But it shows that while the 1080 Ti is a power hungry beast, you can also make it a power efficient beast if you want to (for thermals/acoustics) without losing too much performance
> 
> 
> 
> 
> 
> 
> 
> 
> 
> - Recommended for those with aftermarket cards that don't want to shunt mod and want a quiet/cooler 1080 Ti
> - *HIGHLY recommended* for those owning a 1080 Ti FE!
> 
> Note that memory OC doesn't affect power draw that much so you can push it as much as you want to. Core voltage affects power draw the most out of anything.


Great work with the graphs for comparison.
as a FE owner I appreciate stuff like this massively rep++


----------



## mtbiker033

Quote:


> Originally Posted by *nrpeyton*
> 
> Great work with the graphs for comparison.
> as a FE owner I appreciate stuff like this massively rep++


totally agree I gave him +rep too! nice work!


----------



## Benny89

Quote:


> Originally Posted by *GNUster*
> 
> Not at all, most mid/high end cards do the cooling this way because overall it gives you a more stable card for regular use. However, when you push the card to the limit, you have to consider those side effects and adapt your cooling solutions appropriately.
> 
> Edit: The most typical solution to your problem would be to overall scale up your cooling capacity by applying a water block. With that, you would have enough head room to keep all components below critical temps even under most demanding conditions.


OK, thanks for all the help and info. But since when are "critical temps" below 70C? I overclocked my 980 Ti to the absolute limit on air at 74C max, and it never downclocked more than 15MHz from the initial OC result.

Pascal cards downclock several times under load. The 980 Ti downclocked ONCE: initial OC of 1565 on the core, a drop to 1555 after 60C, and it stayed there forever and ever.

Strange that 60C is suddenly some temp threshold with these cards....


----------



## Clukos

Even 50C is a threshold; the lower you go with Pascal, the higher it goes in clocks.


----------



## BrainSplatter

Quote:


> Originally Posted by *Benny89*
> 
> Strange that 60C is suddenly some temp threshold with those cards....


There is no specific temperature. It's just that the higher the temperature, the more easily the GPU will crash when it's on the edge of stability. I only suggested the 60C threshold based on your Witcher 3 data.


----------



## fisher6

Out of curiosity, how do people downvolt the card using the curve? Do you force a specific clock/voltage bin? Would like to do it as well.


----------



## Benny89

Quote:


> Originally Posted by *BrainSplatter*
> 
> There is no specific temperature. It's just the higher the temperature the more easily the GPU will crash when it's on the edge of stability. The 60C threshold I only suggested based on your Witcher 3 data.


OK, I will repaste the card today with Grizzly Kryonaut. It should gain me an additional 4-5C. I will then test whether this lets my curve stay stable in Witcher 3. If yes, then at least I will know this is a temperature thing, and I will probably decide to do a water loop.

But if it still crashes, then meh...









Still funny because the card can do 2063 in Heaven with temps reaching 69C on the stock fan curve and it does not crash, and that is, I think, at 1.081V.

Oh well, will test more.


----------



## Nico67

Quote:


> Originally Posted by *fisher6*
> 
> Out of curiosity, how do people downvolt the card using the curve? Do you force a specific clock/voltage bin? Would like to do it as well.


Just make your curve flat from the voltage you want to run; it's that simple
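If it helps to picture what "flat from the voltage you want" means, here's a toy sketch of the idea (the voltage/clock numbers are purely illustrative, not any real card's curve):

```python
# Toy sketch of "flattening" a Pascal voltage/frequency curve, the way
# you would drag points in Afterburner's curve editor.
# All values below are illustrative, not a real card's curve.

def flatten_curve(curve, max_voltage):
    """Cap every point at or above max_voltage to the clock offered at
    max_voltage, so the boost algorithm never requests more voltage."""
    # Clock the card will run at the chosen voltage
    cap_clock = max(clk for v, clk in curve if v <= max_voltage)
    return [(v, clk if v < max_voltage else cap_clock) for v, clk in curve]

# (voltage in V, clock in MHz) pairs, loosely modeled on a 1080 Ti curve
curve = [(1.000, 1974), (1.025, 2000), (1.050, 2025),
         (1.062, 2050), (1.081, 2063), (1.093, 2076)]

flat = flatten_curve(curve, 1.050)
# Every bin from 1.050V upward now offers the same 2025MHz, so the card
# has no reason to boost into the hotter, power-hungry voltage bins.
```

Since no higher bin offers a higher clock, the card settles at the chosen voltage under load while still downclocking normally at idle.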


----------



## ALSTER868

Quote:


> Originally Posted by *Benny89*
> 
> Strange that 60C is suddenly some temp threshold with those cards....


They all downclock at these temps according to my observations: 30C -> 41C -> 52C -> 60C -> 69C.
And there is no way to escape that except by putting water on it to keep the clocks. And yes, W3 is a hell of a game to put the card to the test: while I can play BF1 easily @2000/[email protected] on the stock FE cooler for at least an hour (didn't try longer though), W3 can be played at only 1924. I mean stable clocks, not jumping all around.
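As a rough mental model of that behavior, each threshold crossed costs about one boost bin. This sketch uses the observed thresholds above and an approximate ~13MHz bin size (forum observations, not official specs):

```python
# Rough model of Pascal GPU Boost temperature throttling: each observed
# temperature threshold crossed costs roughly one ~13MHz boost bin.
# Thresholds and bin size are community observations, not official specs.

THRESHOLDS_C = [30, 41, 52, 60, 69]   # observed downclock points
BIN_MHZ = 13                          # approximate size of one boost bin

def sustained_clock(cold_clock_mhz, temp_c):
    """Estimate the sustained core clock at a given GPU temperature."""
    bins_lost = sum(1 for t in THRESHOLDS_C if temp_c >= t)
    return cold_clock_mhz - bins_lost * BIN_MHZ

print(sustained_clock(2063, 25))  # cold card: full boost
print(sustained_clock(2063, 55))  # crossed 30/41/52, three bins lost
```

This is why water cooling "keeps the clocks": holding the core below the first few thresholds means no bins are ever subtracted.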


----------



## Coopiklaani

Quote:


> Originally Posted by *KraxKill*
> 
> So this guy made a great post and it was kind of overlooked, myself including until I ran smack into the "normalized TDP" wall.
> 
> I started playing with lower liquid temps and higher clocks at voltages above 1.05v. I'm shunt modded and got the same normalized TDP limits in superposition @ 4k.
> 
> I'm currently running 2100 @ 1.05v with +550 mem to stay under this limit.
> 
> I confirmed the limit by lowering my mem OC to +0 and I was able to run 1.06 before hitting the normalized TDP wall.
> 
> Other benches seem to not show this. +rep on this one.


Thanks. People usually misunderstand the shunt mod, thinking it will remove the TDP limit entirely, which is not true. I believe this normalized TDP is measured either by a low-side sensor or by the GPU itself. The shunt mod does nothing to the normalized TDP. The only way to lift it is by flashing a BIOS with a higher TDP.
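A back-of-the-envelope sketch of why the shunt mod fools a shunt-based power reading but can't touch a limit measured elsewhere (the 5 milliohm stock value is the commonly cited figure; treat the numbers as assumptions):

```python
# Why a shunt mod under-reports power: the controller measures the
# voltage drop across the sense resistor but still assumes the STOCK
# resistance, so lowering the real resistance scales the reading down.
# A "normalized" limit measured by a different sensor is unaffected.

def sensed_power(true_watts, true_shunt_ohms, assumed_shunt_ohms):
    """Power the controller reports given the real vs assumed shunt.
    V_shunt = I * R_true; controller computes I' = V_shunt / R_assumed,
    so the reported power scales by R_true / R_assumed."""
    return true_watts * (true_shunt_ohms / assumed_shunt_ohms)

STOCK = 0.005     # stock 5 mOhm sense resistor (commonly cited value)
MODDED = 0.0025   # conductive material on top roughly halves it

print(sensed_power(300, STOCK, STOCK))    # unmodded: 300W reads as 300W
print(sensed_power(300, MODDED, STOCK))   # modded: 300W reads as ~150W
```

So the modded card can draw roughly twice its limit before the shunt-based reading trips, while any separately measured limit still sees the true draw.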


----------



## TheNoseKnows

Quote:


> Originally Posted by *fisher6*
> 
> Out of curiosity, how do people downvolt the card using the curve? Do you force a specific clock/voltage bin? Would like to do it as well.


Press L when a particular voltage is selected to lock it. A yellow vertical line should appear:









This seems to prevent the card from downclocking when at idle, however.


----------



## trawetSluaP

I take it the "Run" function to auto-tune the voltage curve in Precision is crap?


----------



## TheBoom

Quote:


> Originally Posted by *Nico67*
> 
> Just make your curve flat from the voltage you want to run, its that simple


This works for keeping your voltage from going higher, but it doesn't prevent it from going lower or downclocking, does it?


----------



## viz0id

Quote:


> Originally Posted by *TheBoom*
> 
> This works for keeping your voltage from going higher, but it doesn't prevent it from going lower or down clocking does it?


Well, no, but at lower voltages there should be less reason (PerfCaps) to force it to downclock.


----------



## BBEG

It's been so long since I've had a working, proper GPU... and thanks to an EVGA Founders Edition, I'm back.









Since I won't be able to install the card until Mon/Tues, it's given me a chance to research BIOS flashing again! I haven't touched this since the GTX 680/770 days, but I'm assuming it still works the same. Is there any particularly juicy BIOS for Founders Edition cards, particularly one with unlocked voltage? Might sound odd, but I'll actually be undervolting/downclocking my card at first until I can get my case cooling scheme properly set up.


----------



## fisher6

Quote:


> Originally Posted by *TheNoseKnows*
> 
> Press L when a particular voltage is selected to lock it. A yellow vertical line should appear:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This seems to prevent the card from downclocking when at idle, however.


That's kinda expected, since you locked the voltage to that particular bin. Wouldn't it work better to set the clock you want and just have a flat line at the higher bins, like others have suggested? Then it should downclock properly.


----------



## s1rrah

Quote:


> Originally Posted by *Benny89*
> 
> Can you guys tell me your stable clocks in Witcher 3? By stable I mean you can play it for long hours/day without crash, not hour/two or so


I'm using a curve that puts 1.093V into a 2050MHz core overclock ... (also, a 6000MHz memory overclock) ... but it's the core that is weird ...

Witcher 3 opens and plays for a few seconds at 2050MHz (as I have defined) but invariably downclocks to 2025MHz ... and sometimes down to 2012MHz ... and only VERY occasionally/briefly down into the 1989MHz range ... but generally it sticks at 2025MHz or 2012MHz. This is fine by me as the game plays like butter.

I'm just writing those weird fluctuations off to my card's voltage and power limits (both are set to max) ... it's certainly not temperature related, as my in-game load temps are never over 40C to 43C ... so no temp problems ... I haven't shunt modded or anything, but I might in the near future. Seems like a very easy mod to do ...

Best,
Joel


----------



## CoreyL4

Should I flash to a bios that has a higher power limit to avoid the 117% power limit of my gaming x?


----------



## Hulio225

Since I know what my card is capable of in terms of max clocks (2100/6300 @ 1.063V), I thought I'd give undervolting a shot.
I wanted to check how many volts I need to run 2075 on the core, so I adjusted my curve for that and ran the Time Spy Game Test 2 while monitoring the graphs.

And I came across this strange "behavior": I'm keeping 2075MHz constant the whole time while being Power Limited the whole time, at least that's what the RivaTuner statistics tell me ^^
It's a power-modded card; normally it should downclock itself if power limited... but I don't think the card is actually power limited while it's keeping the clock... I honestly can't explain what is happening there ^^


Quote:


> Originally Posted by *s1rrah*
> 
> I'm using a curve that puts 1.093v in to a 2050mhz core overclock ... (also, a 6000mhz memory overclock) ... but it's the core that is weird ...
> 
> Witcher 3 opens and plays for a few seconds at 2050mhz (as I have defined) but invariable always downclocks to 2025mhz ... and some times down to 2012mhz ... and only VERY occasionally/briefly down in to the 1989mhz range ... but generally always sticks at 2025mhz or 2012mhz. This is fine by me as the game plays like butter.
> 
> I'm just writing those weird fluctuations off to my cards voltage and power limits (both are set to max) ... it's certainly not temperature related as my in game load temps are never over 40C to 43C ... so no temp problems ... I haven't shunt modded or anything, but I might here in the near future. Seems like a very easy mod to do ...
> 
> Best,
> Joel


Edit: 40-43°C means you're losing at least one bin (12.5MHz) to the temperature; the drops to 2012 and 1989 are the power limit. In addition, try to go lower than 1.093V, because the higher your voltage, the faster you will hit the power limit. If you need those voltages to run those clock speeds in the first place, then it's a different story.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *CoreyL4*
> 
> Should I flash to a bios that has a higher power limit to avoid the 117% power limit of my gaming x?


No, if you have a Gaming X you're already getting 30 extra watts.

I recommend you just stick to what you can get at around 1.043V, and just game and be happy







.


----------



## wrc05

Quote:


> Originally Posted by *Clukos*
> 
> 
> 
> Spoiler: Warning: Spoiler!


@Clukos

Hi, how did you get MSI Afterburner to show GPU & CPU Power?

Thanks


----------



## Hulio225

Quote:


> Originally Posted by *wrc05*
> 
> @Clukos
> 
> Hi, how did you get MSI Afterburner to show GPU & CPU Power?
> 
> Thanks


HWiNFO has OSD functionality as well; you can set it up and it will feed the RivaTuner OSD statistics


----------



## wrc05

^^

Thanks a bunch


----------



## Benny89

Quote:


> Originally Posted by *s1rrah*
> 
> I'm using a curve that puts 1.093v in to a 2050mhz core overclock ... (also, a 6000mhz memory overclock) ... but it's the core that is weird ...
> 
> Witcher 3 opens and plays for a few seconds at 2050mhz (as I have defined) but invariable always downclocks to 2025mhz ... and some times down to 2012mhz ... and only VERY occasionally/briefly down in to the 1989mhz range ... but generally always sticks at 2025mhz or 2012mhz. This is fine by me as the game plays like butter.
> 
> I'm just writing those weird fluctuations off to my cards voltage and power limits (both are set to max) ... it's certainly not temperature related as my in game load temps are never over 40C to 43C ... so no temp problems ... I haven't shunt modded or anything, but I might here in the near future. Seems like a very easy mod to do ...
> 
> Best,
> Joel


It seems like you are on water so temp instability does not concern you







. However, you at least partially confirm my theory that Witcher 3 needs more voltage for lower clocks than other apps.

First I will try to confirm the temp throttle. If that is not the case, I will experiment with setting lower clocks for the higher bins.
Quote:


> Originally Posted by *Hulio225*
> 
> Since i know what my card is capable of in terms of max clocks (2100/6300 @ 1.063V) i thought i ll give undervolting a shot.
> I wanted to check how much volts i need to run 2075 on the core so i adjusted my curve for that and i ran time spy game 2 test while monitoring graphs.
> 
> And i came across this strange "behavior". Im keeping the 2075 MHz all the time constant while beeing Power Limited all the time, at least thats what the river tuner statistics tell ^^
> Its a Power Moded card, i mean normaly it should downclock it self if power limited... but i dont think that the card is actually power limited when keeping the clock... i honestly can't explain what is happening there^^


Sadly this works in benches and tests but is nowhere near sufficient in heavy GPU games. For example, on air I could easily bench 2075 at 1.081V no problem, or do 2025 at 1.000V. That will never work in Witcher 3, for example. You need to set a totally different curve for Witcher 3.


----------



## trawetSluaP

I really don't understand these cards and the voltage curve.

If I just add to the core or voltage then Unigine crashes, but if I run Unigine, note the current voltage, and just add an offset to that voltage, it works.

I'm so confused.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Coopiklaani*
> 
> Thanks, ppl usually misunderstand the shunt mod thinking it will remove TDP entirely, which is not true. I believe this normalized TDP is measure either by a low-side sensor or the GPU itself. Shunt mod does nothing to the normalized TDP. The only way to lift it is through flashing BIOSs with higher TDP.


When are we actually going to hit this normalized TDP though?

I don't hit it with any game I play, Superposition or Firestrike.

I don't want to flash the Palit BIOS because it doesn't score as well as stock; that seems kind of counterproductive. Here's hoping the EVGA FTW3 BIOS might be better for everyone.

But other than Furmark, is it really going to affect us that much? Even when I did hit the power limit in Time Spy, it was only two brief lines in perf cap, and this was at 1.075V.


----------



## TheBoom

OK, so I didn't realize how hot this card ran until I played Andromeda on it for the first time today. 100% fans barely keeping it at 80C @ 2037 1.075V with 11.8GHz mem.

I have my H115i v2 AIO for the CPU as exhaust, and my load temps just went up about 10C.

I'm only running my 6700K at 1.26V at 4.5GHz and it's hitting a max of 90C.

If I put my finger on the top of the case for more than 3 seconds I'll get scalded.

Zotac Amp Extreme btw.

Edit: There's even a smell of silicon coming from the exhaust lol. I hope it doesn't fry. *knocks on wood*


----------



## Sweetwater

Quote:


> Originally Posted by *Benny89*
> 
> Can you guys tell me your stable clocks in Witcher 3? By stable I mean you can play it for long hours/day without crash, not hour/two or so


I've got 2050MHz with 6000MHz memory at 1.025V stable in Witcher 3 with no crashing on my Strix. No downclocking. With the same OC, Ghost Recon Wildlands runs at 2063, no downclocking. Using a custom power curve.

I haven't tried to go higher yet, but I think it's pretty good as is anyway; the temps never exceed 57C @ 70% fan speed.


----------



## Hulio225

Quote:


> Originally Posted by *Benny89*
> 
> Sadly this work in benches and tests but is nowhere near sufficient in heavy GPU games. For example I could on air easly bench 2075 at 1.081V no problem. Or do 2025 at 1.000V. That will never work in Witcher 3 for example. You need to set totally different curve for Witcher 3.


I think you don't understand what the issue is with the screenshot and stuff I posted.
The question is: why is my card holding 2075MHz constantly while being Power Limited?

And the Time Spy Game Test 2 is a heavy GPU load; I would go as far as to say even heavier than Witcher 3

Screenshot again: 

Edit: I'll repeat myself... the guy who has set 2050MHz on his core and reaches 40-43°C temps is definitely losing 1-2 bins because of his temps, that's a fact


----------



## Benny89

Quote:


> Originally Posted by *Sweetwater*
> 
> I've got 2050mhz with 6000mhz memory at 1.025v stable in witcher 3 with no crashing on my Strix. No downclocking. Same oc ghost recon Wildlands runs at 2063 no downclocking.. Using a custom power curve.
> 
> I haven't tried to go higher yet but I think it's pretty good as is anyway, the temps never exceed 57c @ 70% fanspeed.


Damn, my card can do 2050 only at 1.050 and 1.063V







But yours is below 60C, so this might be the reason why yours does not crash. Did you play W3 for several hours with it? It didn't downclock a couple of times?

Can you post your custom voltage curve and your Afterburner settings, please? I'd be grateful


----------



## Benny89

Quote:


> Originally Posted by *Hulio225*
> 
> I think you don't understand what the issue is with that screenshot and stuff i posted.
> The question is why is my card holding 2075MHz constantly while beeing Power Limited?
> 
> And Time Spy Game 2 Test is heavy GPU Load, i would go that far to say even more heavy than Witcher 3
> 
> Screenshot again:
> 
> Edit: Ill repeat myself... the guy who has set 2050MHz on his core and reaches 40-43°C temps, is definetly loosing 1-2 bins because his temps, thats a fact


From my own experiments, I hit the Power Limit when my clocks were too high for their bins. Not even crashing, just too high. When I lowered the clock of each bin, I stopped hitting the Power Limit, even at higher voltage bins.

Also, it is not about pure GPU load, but also how the app works. Superposition 4K Optimized is an even heavier load, but I can keep higher clocks there than in Witcher 3.


----------



## Hulio225

Quote:


> Originally Posted by *Benny89*
> 
> Dam, my card can do 2050 at 1.050 and 1.063V
> 
> 
> 
> 
> 
> 
> 
> But your is below 60C so this might be the reason why yours does not crash.. Did you play for several hours in W3 with it? It didn't downclock couple of times?
> 
> Can you post your custom voltage curve please and your Afterburner settings? I'd be grateful


Keep in mind that every chip is different; these GPUs aren't perfect clones, and his settings won't necessarily work on your card.
In addition, yes, temperature can do a lot; even a 1°C delta can cause instability.
Quote:


> Originally Posted by *Benny89*
> 
> From my own experiments I got Power Limit when my clocks where too high for its bin. Not even crashing. Just too high. When I lowered clock of each bin, I stopped having Power Limit even at higher voltage bins.
> 
> Also it is not about pure GPU load, but also how app works. Suppo 4K OP is even more heavy load but I can keep higher clocks there than in Witcher 3.


Again: my card is power modded, the core clock is set to 2075MHz, and I am keeping it while it's showing that the card is Power Limited; that's strange. Normally the card should downclock and fluctuate in clock speeds, but it isn't.
I can run 2100 @ 1.063V and never see the Power Limit... I hope it's clear to you now ^^

Edit: even on that screenshot, if you check it again, the graph is showing Power Limit with that GPU-Z render test... I mean seriously, *** is that ****


----------



## Luckbad

Quote:


> Originally Posted by *TheBoom*
> 
> Ok so I didn't realize how hot this card ran until I played andromeda on it for the first time today. 100% fans barely keeping it at 80c @2037 1.075v with 11.8ghz mem.
> 
> I have my h115iv2 AIO for the CPU as exhaust and my load temps just went up about 10c.
> 
> I'm only running my 6700k at 1.26v at 4.5ghz and it's hitting a max of 90c.
> 
> If I put my finger on the top of the case for more than 3 seconds I'll get scalded.
> 
> Zotac Amp Extreme btw.
> 
> Edit : There's even a smell of silicon coming from exhaust lol. I hope it doesn't fry. *knocks on wood*


Is your ambient temperature super high? What's your cooling situation?

The 1080 Ti and the 6700k really aren't that hot.

I have both the Zotac Amp Extreme and 6700k as well. NZXT Kraken on my 6700k. My GPU rarely exceeds 65 C and CPU rarely exceeds 60 C.

No AC on in the house but it's generally mid 70s in the house when I'm gaming at night.


----------



## TheBoom

Quote:


> Originally Posted by *Luckbad*
> 
> Is your ambient temperature super high? What's your cooling situation?
> 
> The 1080 Ti and the 6700k really aren't that hot.
> 
> I have both the Zotac Amp Extreme and 6700k as well. NZXT Kraken on my 6700k. My GPU rarely exceeds 65 C and CPU rarely exceeds 60 C.
> 
> No AC on in the house but it's generally mid 70s in the house when I'm gaming at night.


Stock air cooler on the amp extreme and H115iv2 for the CPU.

Ambient temps are around 30-32c idle and it gets to 34-35c in the room if I'm gaming full load.


----------



## GRABibus

Got my new Zotac Founders Edition today.
It has more overclocking headroom than my former MSI Sea Hawk.



http://www.3dmark.com/3dm/19590292?

GTX1080Ti ZOTAC Founders Edition
+150MHz on core in MSI AB
Fan at 100%
Power limit => 120%
Voltage slider => 100%

PS: I don't understand why 3DMark says "drivers not approved"...
I have the latest from NVIDIA, 381.89 WHQL


----------



## Hulio225

Okay, so I found out that no matter what I do, as long as I undervolt below the standard 1050mV I'll see Power Limit in Afterburner the whole time, even if it's not actually power limited... so yeah, I don't care anymore...
I can run 2075MHz @ 1.024V if I want ;-)


----------



## Luckbad

Quote:


> Originally Posted by *TheBoom*
> 
> Stock air cooler on the amp extreme and H115iv2 for the CPU.
> 
> Ambient temps are around 30-32c idle and it gets to 34-35c in the room if I'm gaming full load.


That might be it. Seems pretty hot in your room. I'd still ensure you have plenty of air intake in the case.

If I had temps like that in the room, I'd personally not overclock. In the middle of summer here in California, I tend to disable my overclocks for a couple months.


----------



## TheBoom

Quote:


> Originally Posted by *Luckbad*
> 
> That might be it. Seems pretty hot in your room. I'd still ensure you have plenty of air intake in the case.
> 
> If I had temps like that in the room, I'd personally not overclock. In the middle of summer here in California, I tend to disable my overclocks for a couple months.


IMO below 80c for the 6700k is still fine. Which was the case, until I put this card in lol.

My previous 1070 strix wasn't putting out anywhere near this amount of heat.

I currently have 4 intake and 2 exhaust (on the radiator on top). Maybe too much intake and too little exhaust?


----------



## Hulio225

Quote:


> Originally Posted by *TheBoom*
> 
> ...
> 
> My previous 1070 strix wasn't putting out anywhere near this amount of heat.
> 
> ...


The 1080 Ti has almost double the number of CUDA cores xD you'd need almost two 1070s to put out the same heat


----------



## BrainSplatter

Quote:


> Originally Posted by *TheBoom*
> 
> Maybe too much intake and too little exhaust?


Best cooling for big GPUs on air is a side fan imho, but that's currently not 'en vogue'. I put a 200mm BitFenix Spectre into my Phantom 630, which helped a lot to cool [email protected]

With your case, the best option is probably just to remove the side panel


----------



## mouacyk

So is the normalized power throttling affecting all shunted Tis or just specific brands?


----------



## mshagg

Not sure where the impression comes from that the normalized value from HWiNFO is some sort of hard limit, compared to the outright TDP. Having a look at this FE with the Palit BIOS, SuPo 4K sees a maximum of 117.8% normalized and a "non-normalized" of 109.5%.

To me they look like the same reading, expressed differently, i.e. same numerator, different denominator. The normalized value appears to correlate with the % dialed in via Afterburner.
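If the two figures really are the same wattage over different denominators, the arithmetic is easy to sanity-check. A minimal sketch, where every number is made up for illustration rather than read off any particular card:

```python
# Two ways to express the same measured draw as a percentage.
# All values here are assumptions for illustration, not readings
# from any particular card or BIOS.

measured_w = 274.0        # hypothetical instantaneous board power
reference_tdp_w = 250.0   # the BIOS reference TDP
current_limit_w = 235.0   # whatever limit is currently dialed in

pct_of_reference = measured_w / reference_tdp_w * 100
pct_of_current = measured_w / current_limit_w * 100

# Same numerator, different denominator: 109.6% vs 116.6%
print(f"{pct_of_reference:.1f}% of reference TDP")
print(f"{pct_of_current:.1f}% of the dialed-in limit")
```

Which of the two HWiNFO labels "normalized" depends on the BIOS, but the point stands: one wattage, two denominators.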


----------



## Benny89

Quote:


> Originally Posted by *Hulio225*
> 
> i can run 2075Mhz @ 1.024 V if i want ;-)










My card can do only 2025 at 1.024V...

If the temps don't help today, I will play the lottery again.

I miss the days when I didn't even know I could overclock stuff, if you know what I mean


----------



## Sweetwater

Quote:


> Originally Posted by *Benny89*
> 
> Dam, my card can do 2050 at 1.050 and 1.063V
> 
> 
> 
> 
> 
> 
> 
> But your is below 60C so this might be the reason why yours does not crash.. Did you play for several hours in W3 with it? It didn't downclock couple of times?
> 
> Can you post your custom voltage curve please and your Afterburner settings? I'd be grateful


Starts off at 2076 in the 40c range for about 5mins before it warms to the mid 50's where it finally rests at 2050 and doesn't deviate. Yep can play as long as I like without it dropping below 2050.


----------



## TheBoom

Quote:


> Originally Posted by *BrainSplatter*
> 
> Best cooling for big GPUs on air is a side fan imho, but that's currently not 'en-vogue'. I did put a 200mm Bitfenix Spectre into my Phantom 630 which helped a lot to cool [email protected]
> 
> With your case, the best option is probably just to remove the side panel


That's a big no. My room collects dust like a drain collects water during the rainy season.

I might try switching the 140mm rear fan to exhaust instead and see how it goes.


----------



## Benny89

Quote:


> Originally Posted by *Sweetwater*
> 
> Starts off at 2076 in the 40c range for about 5mins before it warms to the mid 50's where it finally rests at 2050 and doesn't deviate. Yep can play as long as I like without it dropping below 2050.


Nice, thanks. Seems like I just didn't win the silicon lottery...

I will buy another STRIX and see if it's any better. I can return the worse one later.

Did you test each bin first to see what max clocks you can do?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *mouacyk*
> 
> So is the normalized power throttling affecting all shunted ti's or just specific brands?


It doesn't really affect you that much, at least not in gaming or Superposition. And from my experiment it doesn't really affect you at all at 1.050v.

It'll mostly affect you if you want to run FurMark and watch the donut spin around, lol.

But if you run games, you're not really going to hit the normalized TDP; at least I haven't playing Witcher 3 and For Honor (which runs as power hungry as Ghost Recon). This was me playing at 1.075v.


----------



## Sweetwater

Quote:


> Originally Posted by *Benny89*
> 
> Did you first test each bin to see what max clocks can you do?


To be honest I haven't tested every bin/voltage combo haha. However, when W3 crashes it's so quick to restart the game that it didn't take long to find the bins the card hates.

I will have to keep at it though, as I reckon it might have more MHz in it yet. Maybe not 2100, but hey, it's worth a shot =)


----------



## BrainSplatter

Quote:


> Originally Posted by *Benny89*
> 
> I will buy another STRIX and see if it's any better. I can return worse one later.


I think your current card is already above average. With water cooling you could probably run W3 @ 2050MHz. On air, I would try 2000MHz @ 1.025v and make sure the temperature doesn't exceed 70C. The performance difference between 2050MHz and 2000MHz is just 2.5%, or 1.5 frames @ 60Hz.


----------



## Slackaveli

Quote:


> Originally Posted by *viz0id*
> 
> I have to find 5775-c used then? Can't seem to find any new. I will have to look out for it. Thanks


bhphoto has them for $359, in stock, and all from the later batches where ALL of us have hit 4.2+, even a few 4.4s, which is more than enough to beat Skylake/Kaby Lake in gaming.

Sell the old i7 for $250+ and you have a whole new platform's performance for less than $100.

https://www.bhphotovideo.com/c/product/1181928-REG/intel_bx80658i75775c_core_i7_5775c_3_3_ghz.html

Edit: price went up $20. Word has been getting out on these and they have sold a bunch lately.

Good luck EVER finding one used. Nobody is going to part with The Gaming Behemoth lol.
Quote:


> Originally Posted by *jamexman*
> 
> I don't think you're maxing out. 1200 is more than plenty. My 1000w plat Evga does fine with 1080ti sli and overclocked Ryzen 1800x. Those Corsair power supplies are known for doing that. Check amazon and newegg reviews. Including the one you just ordered... I hope you don't get a lemon, personally I would have gone for a Evga PSU.


superflowerpowerFTW


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> [/INDENT]
> Thanks means a lot
> 
> let me know how you get on with it


well, I might not know what I'm doing, but I couldn't get the test to even run on my GPU. It ran the CPU tests fine, but even on stock settings the GPU test would stop after 1 second :/ . If you have a free moment, mind posting a screenshot of all your settings in OCCT, and can you think what I might be doing wrong?
Quote:


> Originally Posted by *Benny89*
> 
> Can you post your voltage curve for Witcher 3?
> 
> Also, try first to play W3 for at least 3 hours to see if your clocks are really stable. I reached many "stable" clocks in Witcher 3 (2063, 2050, 2038) but all eventually crash in W3. Funny is that it can stay stable at 2050 at 1.062 and at 1.050 at 2038. However once it starts droping to 2025 at 1.043V it usually crash short after (like 20-30 min). I have no idea how and why.
> 
> I also figured out that giving card less Voltage OC in Voltage slider made it run stable longer. I went for +35%. Now maybe it needs more or less to not crash once it reaches 1.043V.
> 
> It is really stupid that it can stay stable at higher bin, higher clock but stops being stable at lower bin, lower clock.
> 
> This makes me mentail


dude, I know it sounds crazy but it might be your VRAM frequency crashing it. What are you running your ramz at?

also, +35 is a strange number for voltage, and I'll tell you why: you get 1 bin per +25. So I usually add +1 to that, just to make sure it didn't, like, droop below the +25 mark. That's probably why some don't get that last bin past +75 when they go to full slider 100% and don't get the 1.093v (some users claim this is the case, yet I get the 1.093v, so it must be leakage/droop).
so, anyway, on the voltage slider, 0, +26mv, +51mv, +76mv, or +100mv are the numbers where I see a bump. It's like how you always add 13 (12.5 really) MHz between every clock increase on the core.
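The bin behavior described above can be sketched in a few lines; the ~25mV voltage bins and 12.5MHz clock steps are the assumption here, taken from the post rather than from any NVIDIA documentation:

```python
# Snap requested offsets to the coarse steps Pascal actually uses:
# roughly one voltage bin per ~25 mV and one clock bin per ~12.5 MHz.
# Bin sizes are assumptions based on the behavior described above.

V_BIN_MV = 25
CLK_BIN_MHZ = 12.5

def last_voltage_bin(offset_mv: int) -> int:
    """The last full 25 mV bin a slider offset actually buys you."""
    return (offset_mv // V_BIN_MV) * V_BIN_MV

def nearest_clock_bin(offset_mhz: float) -> float:
    """Core-clock offsets land on a ~12.5 MHz grid."""
    return round(offset_mhz / CLK_BIN_MHZ) * CLK_BIN_MHZ

print(last_voltage_bin(35))   # +35 only buys the +25 bin -> 25
print(last_voltage_bin(26))   # the "+26 to be safe" trick -> 25
print(nearest_clock_bin(30))  # a +30 request snaps to 25.0
```

On this model, +35 on the slider buys nothing that +26 doesn't, which is exactly the point being made.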
Quote:


> Originally Posted by *GNUster*
> 
> I guess this is simply related to the power consumed by the game and thus the heat produced. That is, on the Strix the gpu and vram conduct heat to both the heat sink and backplate. Thus, in a more demanding game (like Witchter 3) your vram gets hotter due to the shared heat sinks with the gpu (and hence your vram is less stable) even if you use the same mem clock that is stable in all your less demanding games (like BF1).


ah, I was just speaking on this. Good post.
Quote:


> Originally Posted by *TheBoom*
> 
> Ok so I didn't realize how hot this card ran until I played andromeda on it for the first time today. 100% fans barely keeping it at 80c @2037 1.075v with 11.8ghz mem.
> 
> I have my h115iv2 AIO for the CPU as exhaust and my load temps just went up about 10c.
> 
> I'm only running my 6700k at 1.26v at 4.5ghz and it's hitting a max of 90c.
> 
> If I put my finger on the top of the case for more than 3 seconds I'll get scalded.
> 
> Zotac Amp Extreme btw.
> 
> Edit : There's even a smell of silicon coming from exhaust lol. I hope it doesn't fry. *knocks on wood*


BRO! Ease up or the BOOM is going to live up to its name!!!


----------



## Slackaveli

.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> well, i might not know what im doing but i couldnt get the test to even run on my gpu. ran the cpu tests fine. but even on stock settings the test would stop after 1 second :/ . if you have a free moment, mind posting a screenshot of all your settings on OCCT, and can you think what i might be doing wrong?
> dude, i know it sounds crazy but it might be your vram frequency crashing it. what you running yo ramz at?
> 
> also, +35 is a strange number for voltage, and ill tell you why. you get 1 bin per +25. so i usually and +1 to that just to make sure it didnt , like, droop below the +25 mark. probably why some dont get that last bin past +75 when they go to full slider 100% and don't get the 1.093v (some users claim this is the case, yet i get the 1.093v, so it must be leakage/droop).
> so, anyway, voltage slider either 0, +26mv, +51mv, +76mv, or +100mv are the numbers where i see a bump. It's like how you always add 13(12.5 really)Mhz between every clock increase on the core.
> ah, i juust was speaking on this. good post.


You mean my GPU VRAM? I can do stable benches and other games at +500, so 6009 MHz. But for Witcher 3 I use +450 only.

But I doubt it's the VRAM, since I was running the same VRAM clock the whole time, and it was only messing with voltage, bins and core clocks that changed how long Witcher 3 went without crashing.

So I should try +26 on the voltage slider? Maybe it will help. I will do the repasting plus this and see if it helps...

Btw, I also tried running Witcher 3 at low voltage, like 1.000V at 2000MHz or 1.02V at 2012MHz, but the game will still downclock it...


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> You mean my GPU Vram? I can do stable benches and other games at +500 so 6009 Mhz. But for Witcher 3 I use 450 only.
> 
> But I doubt its VRam since I was running same VRam all the time and only messing with voltage, bins and clocks did exceed my no-crash in Witcher 3.
> 
> So I should try +26% Voltage Slider? Maybe it will help. I will do repasting plus this and see if it helps...
> 
> I tried to do btw low voltage running Witcher 3 like 1.000V 2000mhz or 1.02 at 2012 mhz but game will still downclock it...


repaste will help a lot. Also, you don't need your VRAM that high; that's more than enough bandwidth, and lowering it by 50 may just be prolonging the inevitable crash. Try running it at +178, that's 500GB/s of bandwidth, and I bet it will not overheat at that freq. Stuff gets hot and then becomes unstable. We have almost full plate coverage in these Aorus cards, so I think that's why you crash, but not for hours: the VRAM heats up just to an instability point. VRAM doesn't usually crash, but it can, and it seems to happen in a really intense scene or when you open a menu or something and bam, frozen. That is how VRAM crashes look. It's not all just "go as high as possible without artifacting" like it was on Maxwell anymore. Try the 1.05v or less settings on the core with +178 mem and see if that works.
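The 500GB/s figure works out if you run it through the 1080 Ti's 352-bit memory bus. A rough sketch, assuming Afterburner reads the stock memory clock as about 5505MHz and the effective GDDR5X data rate is twice that:

```python
# Rough bandwidth math for the 1080 Ti's GDDR5X.
# Assumption: Afterburner shows ~5505 MHz at stock, and the
# effective data rate is double the displayed clock.

BUS_WIDTH_BITS = 352
STOCK_MHZ = 5505  # as displayed in Afterburner (assumed)

def bandwidth_gbps(offset_mhz: int) -> float:
    """Memory bandwidth in GB/s for a given Afterburner offset."""
    effective_mtps = (STOCK_MHZ + offset_mhz) * 2       # MT/s
    return BUS_WIDTH_BITS / 8 * effective_mtps / 1000   # GB/s

print(round(bandwidth_gbps(0), 1))    # stock: ~484.4 GB/s
print(round(bandwidth_gbps(178), 1))  # +178:  ~500.1 GB/s
print(round(bandwidth_gbps(500), 1))  # +500:  ~528.4 GB/s
```

So +178 lands almost exactly on the 500GB/s mark quoted above, while +500 only buys another ~6% bandwidth over that.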


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> repaste will help a lot. Also, you dont need your vram that high, that's more than enough bandwidth and you lowering 50 may just be prolonging the inevitiable crash. Try running it at +178, that's 500Gbps bandwidth, and will not over heat at that freq i bet. Stuff gets hot and then becomes instable. We have almost full plate coverage in these aorus so i think that's why you crash but not for hours, the vram heats up just to an instability point but vram doesnt usually crash, but it can though, and it seems to happen in a real intense scene or when you open a menu or something and bam, froze. that is how vram crashes look. it's not all just go as high as possible without artifacting like it was on maxwell anymore.. try the 1.05v or less settings on core with a +178 mem and see if that works.


My crashing looks like: the game freezes, but you can still hear that the game is running behind it (you can even hear going to the menu, choosing an option in dialogue, etc.), but the screen freezes. Then you hit like ESC or the Windows button and you're back to desktop and the game is closed.

I will try getting the VRAM lower. I always thought that VRAM artifacts rather than crashing the game, or at least artifacts first... Oh well, you learn something new every day


----------



## TheBoom

Quote:


> Originally Posted by *Slackaveli*
> 
> BRO! Ease up or the BOOM is going to live up to it's name!!!


Lol good one.

Temps aside, I think I finally managed to get my head around how the curve works.

What's funny is with 2050 at 1.093v, I get absolutely no perfcaps. The box is just blank. But the chip unfortunately can't do 2050 for long and it just crashes after a while in Andromeda.

Now drop to the next bin, which is 2037 at 1.075v, and it holds and is perfectly stable, but I get a rainbow of perfcaps lmao.

Damn Nvidia logic.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *mshagg*
> 
> Not sure where the impression comes from that the normalized value from HWInfo is some sort of hard limit, compared to the outright TDP. Having a look at this FE with the Palit BIOS, SuPo 4k sees a maximum 117.8% and a "non normalized" of 109.5%.
> 
> To me they look like they're the same reading, expressed differently. i.e. same numerator, different denominator. The nomalized value appears to correlate with the % dialed in via afterburner.


I believe they are right.

Please look at my Timespy doing 1.050v and 1.075v.

Below, I do not hit any perf cap at all. Even though my power limit is set to 120%, I'm only using 101% maximum power draw.


Here I hit the perf cap of power limiting a tiny tiny bit in time spy. My maximum power draw was only 103% in AF, but it seems the normalized TDP was hit here.


Although I never hit it in gaming or with superposition. Here is 2152 Mhz @ 1.093v



I believe the problem is being overblown though. With the shunt mod, I'm drawing roughly 100 more watts, around 400 watts total. I haven't run a game that has drawn even close to this much power. I think For Honor (which is just as bad as Ghost Recon when it comes to optimization) drew up to 92% at just one point, but never in the 100s.

I believe this power limit "issue", if you want to call it that, is being overblown. Even if we figured out a way to get past it, it's nothing like the difference between the shunt and non-shunt mod in performance. Look at my pre-shunt-mod perf cap here.

At best we'd probably gain a fraction of 1% of performance, and only in Timespy and Firestrike? But no performance gain in any gameplay.

EDIT: I did not shunt mod the resistor by the motherboard, but that could possibly give you the extra TDP you need, or not; I don't know how it relates to normalized TDP. But why bother drawing more power through your mobo? It risks damaging it. AMD updated the RX 480 for a reason: from my understanding, it is not safe to draw more than the recommended power through the mobo slot. Correct me if I'm wrong.
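For context on why the shunt mod changes the reported numbers at all: the controller measures the voltage drop across a known sense resistor, so stacking a second resistor in parallel makes it under-read the current. A minimal sketch; the 5mOhm shunt value and the 400W draw are assumptions for illustration:

```python
# Why stacking a resistor on the current-sense shunt changes what
# the card *reports*: the controller measures the voltage across
# the shunt and assumes the original resistance.
# The resistor values and wattage below are assumptions.

def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

R_STOCK = 0.005    # 5 mOhm sense shunt (typical value, assumed)
R_STACKED = 0.005  # identical resistor soldered on top

r_eff = parallel(R_STOCK, R_STACKED)  # 2.5 mOhm
underread = r_eff / R_STOCK           # controller sees half the drop

actual_w = 400.0
reported_w = actual_w * underread
print(f"actual {actual_w:.0f} W -> reported {reported_w:.0f} W")
```

That halved reading is why a shunted card sails past its nominal power limit: the limiter is reacting to the reported figure, not the real draw.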


----------



## viz0id

Quote:


> Originally Posted by *Slackaveli*
> 
> bhphoto has them for $359, in stock, and all from the later batches where ALL of us have hit 4.2+, even a few 4.4s , which is more than enough to beat skylake/kabylake in gaming.
> 
> Sell the old i-7 for $250~+ and have you a whole new platform's performance for less than $100.
> 
> https://www.bhphotovideo.com/c/product/1181928-REG/intel_bx80658i75775c_core_i7_5775c_3_3_ghz.html
> 
> Edit, price went up $20. word has been getting out on these and they have sold a bunch lately.
> 
> Good luck EVER finding one used. Nobody is going to part with The Gaming Behemoth lol.
> superflowerpowerFTW


Thanks a bunch. I will probably have to flash the BIOS of my MSI G55 SLI mobo. Hopefully it supports it. I will have to Google it a bit and make sure I can use it. I live in Norway, so the shipping is 40 dollars and then 25% VAT on top of that. I need to know it works before I invest that.


----------



## Slackaveli

Quote:


> Originally Posted by *TheBoom*
> 
> Lol good one.
> 
> Temps aside, I think I finally managed to get around to how the curve works.
> 
> Whats funny is with 2050 at 1.093v, I get absolutely no perfcaps. The box is just blank. But the chip unfortunately can't do 2050 for long and it just crashes after a while with andromeda.
> 
> Now drop to the next bin which is 2037 at 1.075v, it holds and is perfectly stable, but I get a rainbow of perfcaps lmao.
> 
> Damn Nvidia logic.


crazy, innit? them perfcaps, if they aren't pink... screw 'em lol. Just run HWiNFO64 instead of GPU-Z. Hell, it's better anyway. You can add anything to RivaTuner's OSD and monitor everything on screen in game. Screw dem perfcaps.


----------



## Slackaveli

Quote:


> Originally Posted by *viz0id*
> 
> Thanks a bunch. I will probably have to flash the bios of my msi g55 sli mobo. Hopefully it supports it. I will probably have to Google it a bit and be sure i can use it. I live in Norway so the shopping is 40dollars and then 25% VAT on top og that. Need to know it works first before i invest that


aye, you have to have a Z97 mobo. You have to figure the cost of that minus the selling price of your old mobo to see the true cost, I guess. It's a no-brainer upgrade if you have the Z97 already, or if you can find a deal on a mobo. I grabbed my son a Z97 Gigabyte mobo reconditioned off Newegg for $79, though, and eBay'd his old Asus AMD board from his 6300 and fetched $75 for it, so it can be done at little to no cost. If in Norway that's possible, that is.

Also to consider is what RAM you have on hand or would need. DDR4 is freaking outrageously high right now, and even DDR3 prices have gone up, but you can still get G.Skill 2400MHz C10 2x8GB 16GB kits for $100. And that's top-of-the-line RAM for a Z97. If you are going to do a full upgrade you are going to be cussing those RAM prices, but if you have some to sell, life is good. I just sold a 2400MHz Corsair Dominator Plat 4x4GB 16GB kit on eBay for $220 and replaced it with that G.Skill kit, so I made $120 with no perf change at all. Just went from silver to red heatspreaders for $120 lol.


----------



## TheBoom

Quote:


> Originally Posted by *Slackaveli*
> 
> crazy, iddnit? them perfcaps, if they arent pink... screw em lol. just run HWinfo64 instead of gpu-z. Hell, it's better anyway. You can add anything to the rivatuners OSD and monitor everything on screen in game. Screw dem perfcaps.


Yeah, exactly.

Well, I get VRel: since the curve is set to make 1.075v the highest value, the card cannot go any higher due to the user-set limitation and hence reports VRel.

VOp, I honestly have no idea what it is.

Pwr is something that shouldn't be happening, since it had no issue at a higher clock and voltage.

Thrm also shouldn't be happening, since the temps are now lower with this bin compared to the higher bin, and yet it was not reported at the higher bin.

Mind = blown.


----------



## viz0id

Quote:


> Originally Posted by *Slackaveli*
> 
> aye, you have to have a z97 mobo. have to figure cost of that minus selling price of your old mobo to see true cost i guess. It's a no brainer upgrade if you have the z97 already. If you can find a deal on a mobo. I grabbed my son a z97 gigabyte mobo reconditioned of newegg for $79 bucks, tho. and ebay'd his old asus amd board for his 6300 and fetched $75 for it, so it can be done at little to no cost. If in Norway that's possible, that is.


I searched for the specs of my mobo. It is a Z97 and it says it supports 5th gen CPUs. But will I be able to clock it beyond 4GHz on air? I use a Noctua D15S today.


----------



## Slackaveli

Quote:


> Originally Posted by *viz0id*
> 
> I searched for the specs of my mobo. It is a Z97 and it says it supportsider 5th gen cpu. But will i be able to clock it beyond 4ghz on air? I use a noctua D15S today.


yeah, on that you will. They run cooler and hold more volts, both, than a 4790K.

They are really amazing lil suckers. I can run 4.3 at 1.38v core and it stays under 70C on my 120mm double-thick 'fatboy' lil ole AIO, in BF1, which pushes CPU usage through the roof. If I run it at 4.2 (I do) at 1.28v, it stays in the 50Cs in that game, 40Cs in any other games. And your cooler's perf is probably on par with my Asetek fatboy $100 cooler.

Damn, bro, you are good to go. Make sure you join us here. WEALTH of info in there. http://www.overclock.net/t/1583537/intel-broadwell-c-ownership-club/870#post_26053281
100% approval rating in that forum, and 90% of us came from 4790Ks, not sure if we'd even get a bump, and boy do you ever. GREAT gaming chip.


----------



## jckaboom

Quote:


> Originally Posted by *Baasha*
> 
> Wondering if someone can help me diagnose the following.
> 
> Just installed my 4x 1080 Ti GPUs on my 2nd rig and put the EVGA 1600W T2 PSU in there as well. Ran 3DMark Fire Strike Ultra - runs fine - all GPUs scale (score is much lower since I'm using the 3970X @ 4.50Ghz).
> 
> However, Heaven 4.0 doesn't scale well even at 4K w/ 8x AA!
> 
> Games also are not scaling anywhere as nearly well as the first rig w/ the 6950X @ 4.30Ghz.
> 
> Can the CPU alone cause this? I'm also running "only" DDR3 @ 2133Mhz (32GB) on the rig with the 4x 1080 Tis whereas on the other rig, I have DDR4 @ 3200Mhz (64GB).
> 
> Further, if I use DSR to play at 5K (this rig has the RoG Swift @ 1440P @ 144Hz), the scaling is pretty bad where as 5K on the other rig scales >90% across all 4 GPUs.
> 
> If I bump up resolution scale to 8K (150% from 5K or 200% from 4K UHD), the game (BF1) crashes instantly with a DirectX error.
> 
> The other rig can play at 8K all day (4x Titan XP).
> 
> Why is the scaling so bad when all other components (other than CPU & RAM) are the same? Could the X79 platform itself be limiting scaling somehow?
> 
> Would like some advice on this.


Make sure the PCIe speed is on Gen3.
There are problems on X79 reading the correct speed even if the BIOS is set to Gen3.
In my case, if I try to do a BCLK OC, my speed changes to Gen2. If I run without the OC, the speed stays on Gen3.
Look for the "x79 pcie patch" on the Nvidia or EVGA forums, run the file as admin and restart. You can do the regedit manually as well. Check the speed with GPU-Z.

With my E5-2690 and 1080 Ti SLI, in Battlefield 1 @ 4K Ultra, just starting the campaign:
Gen2: 75-80 FPS
Gen3: 105-110 FPS

Also, I'm not sure how many PCIe lanes the 3970X has; that could be another factor, but I guess the speed is the problem, because many people are using 8x/8x configurations without issues.
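For reference, the per-direction bandwidth difference between Gen2 and Gen3 on an x16 slot, computed from the published line rates and encoding overheads:

```python
# Per-direction PCIe bandwidth for an x16 slot, which is why Gen2 vs
# Gen3 matters for 4-way SLI: Gen3 roughly doubles the usable link.

def pcie_gbps(gt_per_s: float, encoding: float, lanes: int = 16) -> float:
    """Usable GB/s per direction after encoding overhead."""
    return gt_per_s * encoding / 8 * lanes

gen2 = pcie_gbps(5.0, 8 / 10)     # Gen2: 5 GT/s, 8b/10b encoding
gen3 = pcie_gbps(8.0, 128 / 130)  # Gen3: 8 GT/s, 128b/130b encoding
print(f"Gen2 x16: {gen2:.1f} GB/s, Gen3 x16: {gen3:.1f} GB/s")
```

That works out to roughly 8GB/s vs 15.8GB/s per direction, which lines up with the near-50% FPS gap reported above when the slots were stuck at Gen2.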


----------



## jamexman

Quote:


> Originally Posted by *Slackaveli*
> 
> bhphoto has them for $359, in stock, and all from the later batches where ALL of us have hit 4.2+, even a few 4.4s , which is more than enough to beat skylake/kabylake in gaming.
> 
> Sell the old i-7 for $250~+ and have you a whole new platform's performance for less than $100.
> 
> https://www.bhphotovideo.com/c/product/1181928-REG/intel_bx80658i75775c_core_i7_5775c_3_3_ghz.html
> 
> Edit, price went up $20. word has been getting out on these and they have sold a bunch lately.
> 
> Good luck EVER finding one used. Nobody is going to part with The Gaming Behemoth lol.
> *superflowerpowerFTW*


Indeed.


----------



## KCDC

Quote:


> Originally Posted by *jamexman*
> 
> I don't think you're maxing out. 1200 is more than plenty. My 1000w plat Evga does fine with 1080ti sli and overclocked Ryzen 1800x. Those Corsair power supplies are known for doing that. Check amazon and newegg reviews. Including the one you just ordered... I hope you don't get a lemon, personally I would have gone for a Evga PSU.


While I completely agree with you that I shouldn't be maxing out, there was someone else on OCN who was getting constant restarts with his new EVGA 1kW PSU; he swapped it for a 1200W and everything went fine after that, 1080 Ti SLI as well.

When monitoring Link, I would constantly see spikes above 950 watts under gaming load or rendering, sometimes resting at 920 watts, so that seems fine to me with a 1200 watt PSU.

This one has been solid for 3-4 years now on multiple builds, dual xeon workstations, old titan sli and 980ti sli setups.

It was running fine with one card non Surround until last night and did it again, so looks like I killed the bugger. I'll get a warranty replacement and keep it as a backup or something.

Honestly, I'm not really brand specific with these sort of things, I just like having the ability to monitor the PSU with Link. Don't get why other companies don't do it too.


----------



## Alwrath

While it is a nice chip, quad cores are just not impressive nowadays. Your CPU usage in BF1 is because that game is designed for more cores. More cores = higher and more consistent FPS in that game, including min FPS. If that Broadwell chip were an 8-core, I might be impressed. But quad cores are so 2007.


----------



## Benny89

Quote:


> Originally Posted by *Alwrath*
> 
> While it is a nice chip, quad cores are just not impressive nowadays. Your cpu usage in BF1 is because that game is designed for more cores. More cores = higher and more consistent FPS in that game, including min FPS. If that broadwell chip was an 8 core, I might be impressed. But quad cores are so 2007.


It is impressive because, being a forgotten child from Intel with a silent release, OCed to 4.2GHz it matches and sometimes even beats a 5.0GHz 7700K in games, while still being on socket 1150. This is impressive first of all because it is kind of a "collector's edition" chip, not mainstream like the 7700K, and it had eDRAM.

That above makes it, in my opinion, the LAST revolutionary chip that Intel has released. But since it was too good and the 6700K was about to hit the market, its premiere and its brutal power in games went silent. Intel didn't want to jeopardize 1151 platform sales.

In an hour or two I will be home and will test that VRAM in Witcher 3 to see if it causes problems or not.

I am a little obsessed at this point, please don't hate


----------



## ttnuagmada

I have an AX1200i, and two 1080 Tis are definitely not "too much" for one. I ran IBT on Very High along with Heaven on loop, and my 330W power-limited 1080 Tis hitting power perfcaps plus [email protected]@1.41v maxed out in the 900W range.


----------



## pez

Got my EVGA 1080 Ti SC BE in and this thing is nice. I didn't get any pics before installing, but I have to say it looks much better in person than in the stock photos from Newegg or EVGA. I can only imagine how the silver or even those custom colors are going to look.

The card essentially stays at 70C and 50% fan. Max OC on the GPU is a bit weak, but I'm happy. Sitting currently at +80/+400. It sat on 1987MHz in game after 30-45 minutes of BF1 -- so one bin under the STRIX at 20-25% less fan speed and noise. Oh, and it's not compromising my CPU cooling, either. I'm guessing the SC2 cards probably got the better binned chips, with the FTW3 getting the real golden ones.


----------



## CoreyL4

So my Gaming X can do 2062/stock memory at 1.062v stable.

When I try 2050/6000 at 1.093v, it becomes power limited, and if it drops below 1.050 or 1.043v it will crash, because I found I need high voltage to hold that memory overclock.

Is there any common ground I could take? As in 2037/5500-5800 at a lower voltage, or what?


----------



## KCDC

Quote:


> Originally Posted by *ttnuagmada*
> 
> I have an AX1200i and 2 1080ti's is definitely not "too much" for one. I ran IBT on Very High along with Heaven on loop, And my 330w power limited 1080ti's hitting power perfcaps and [email protected]@1.41v maxed out in the 900w range.


Yeah, signs are pointing to PSU failure. I hope it's not anything more than that, I just finished rebuilding the damn thing.


----------



## reflex75

Quote:


> Originally Posted by *BrainSplatter*
> 
> Different cards often have different VID (=different thermals, power requirement, ...). This is one of the things almost all GPU reviewers completely ignore and still compare for example power usage without mentioning the GPU base voltage.


My point is: can we trust the advertised boost clock?
I mean, can a card maker sell a crappy card with a high advertised boost clock, for instance 1759MHz, which only really boosts to, let's say, 1900MHz?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *CoreyL4*
> 
> So my gaming x can do 2062/stock memory at 1.062v stable.
> 
> When I try 2050/6000 at 1.093v, it becomes power limited and if it goes below 1.050 or 1.043v it will crash because i found i need high voltage to have that memory overclock.
> 
> Is there any common ground I could take? as in 2037/5500-5800 at a lower voltage or what?


I'm sorry CoreyL4, but it's just the silicon lottery. I've been reading your posts to try and figure out ways to help you, but at this point I think you either have to return your card and buy another one, hoping you can overclock 2% higher and get 1-2% more performance, or just enjoy some video games.

You're currently clocked at 2062; getting clocked at 2101MHz would only improve your performance by roughly 2% at best. That can take a 4K game from 60 FPS to about 61 FPS. You're not missing much, my friend
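For anyone curious, the arithmetic behind that estimate, assuming frame rate scales at best linearly with core clock (real games scale less than this):

```python
# Upper-bound FPS gain from a core-clock bump, assuming perfectly
# linear scaling. Real games scale sub-linearly, so the true gain
# is smaller than this.

def best_case_fps(fps: float, clk_old: float, clk_new: float) -> float:
    """Best-case FPS after raising the core clock."""
    return fps * clk_new / clk_old

gain = best_case_fps(60, 2062, 2101) - 60
print(f"best case: +{gain:.1f} FPS")  # about +1.1 FPS at 60 FPS
```

So chasing that last bin is worth barely a frame at 60 FPS even in the best case.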









I had a 980 Ti that did 1500MHz, but when I SLI'd it with another one, the new one couldn't go above 1450MHz. I tried everything and just settled.

At some point we just have to settle and enjoy the card the way it is.

But always remember that Boost 3.0 has already OC'd your card near to its maximum potential. It's not like you're not getting good gains; you're getting great gains above 2000MHz already.


----------



## Benny89

Quote:


> Originally Posted by *CoreyL4*
> 
> So my gaming x can do 2062/stock memory at 1.062v stable.
> 
> When I try 2050/6000 at 1.093v, it becomes power limited and if it goes below 1.050 or 1.043v it will crash because i found i need high voltage to have that memory overclock.
> 
> Is there any common ground I could take? as in 2037/5500-5800 at a lower voltage or what?


There is no answer to that; you have to try in different games. In some you will have 2063 OK, in some like W3 or ME:A you won't be able to hold the same clocks.

You have to run through apps and games to see where the clocks are stable EVERYWHERE, which is the hardest part, because each app reacts differently to the OC.

You can run around with 2075-2088 in BF1 and you will crash in 10 seconds in ME:A.

You have to waste your life like I am doing, chasing voltage curves

Just enjoy the card, don't be me

please.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> There is no answer to that, you have try in different games. In some you will have 2063 ok, in some like W3 or ME:A you won't be able to hold the same clocks.
> 
> You have to run through apps and games to see where are clocks that are stable EVERYWHERE which is hardest part because each app reacts differently to OC.
> 
> You can run around with 2075-2088 in BF1 and you will crash in 10 seconds in ME:A.
> 
> You have to waste your life like I am doing and chasing voltage curves
> 
> 
> 
> 
> 
> 
> 
> Just enjoy card, don't be me
> 
> 
> 
> 
> 
> 
> 
> please.


I know, at this rate we're going to have to start a support group here, lol.


----------



## CoreyL4

Quote:


> Originally Posted by *SlimJ87D*
> 
> I'm sorry CoreyL4, but it's just the silicon lottery. I've been reading your posts to try to figure out ways to help you, but at this point you either have to return your card and buy another one in the hope it overclocks 2% higher for 1-2% more performance, or just enjoy some video games.
> 
> You're currently clocked at 2062; getting to 2101 MHz would only improve your performance by roughly 1.5%. That can take a 4K game from 60 FPS to 60.9 FPS. You're not missing much, my friend.
> 
> I had a 980 Ti that did 1500 MHz, but when I SLI'd it with another one, the new one couldn't go above 1450 MHz. I tried everything and just settled.
> 
> At some point we just have to settle and enjoy the card the way it is.
> 
> But always remember that Boost 3.0 has already overclocked your card near its maximum potential. It's not like you're not getting good gains; you're getting great gains above 2000 MHz already.


Oh, I would never return the card because of the numbers I am hitting. They are still very good to me. I'll take 2050/55 on the core and 5800 on the memory any day.

Quote:


> Originally Posted by *Benny89*
> 
> There is no answer to that; you have to try different games. In some you'll hold 2063 fine; in others, like W3 or ME:A, you won't be able to hold the same clocks.
> 
> You have to run through apps and games to find clocks that are stable EVERYWHERE, which is the hardest part because each app reacts differently to an OC.
> 
> You can run around at 2075-2088 in BF1 and crash within 10 seconds in ME:A.
> 
> You have to waste your life like I am, chasing voltage curves.
> 
> Just enjoy the card; don't be me, please.


Haha last 2 days spent all night after working testing stuff to get no further.


----------



## KraxKill

Here's a 2151 @ 1.093v Firestrike for fun.

My top score is actually 23100 at 2100 @ 1.063v, as I'm hitting the power limit even with the shunt mod. Clocking to 2151 puts my voltage where I begin to trip the "normalized TDP" throttle, while 2100 @ 1.063v keeps me right at the limit. This is on a 4790K @ 5.2, 2133 CL9 DDR3.

The moral here is that these cards are TDP-limited beyond what the shunt mod appears to forgive. I could score higher at 2151 if I weren't hitting the power limit, but since the card bounces off the rev limiter a couple of times during the run, scores are ultimately higher at 2100, where I'm just under the limit due to voltage.

If somebody has a verifiable method of increasing that limit, I'm all ears.

Different drivers, I know, but my scores haven't changed much with the new driver. Everything is within 0.1% of where it was on the old driver.

2151 @ 1.093v
http://www.3dmark.com/fs/12477286


2100 @ 1.063v
http://www.3dmark.com/fs/12446981
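For anyone following along, the arithmetic behind why a shunt mod raises headroom but a "normalized TDP" cap can still bite can be sketched roughly like this. This is a toy model; the shunt resistance and power limit values below are assumptions for illustration, not measurements from this card:

```python
# Rough model of why a shunt mod changes the *reported* power while
# the real draw is unchanged. The card infers current from the
# voltage drop across a shunt resistor; lowering the effective shunt
# resistance (e.g. soldering a resistor in parallel) scales the
# reported current, and therefore the reported power, proportionally.

def reported_power(actual_watts, r_shunt_original, r_shunt_modded):
    """Power the limiter 'sees' after the shunt resistance changes."""
    return actual_watts * (r_shunt_modded / r_shunt_original)

R_ORIG = 0.005    # 5 mOhm stock shunt (assumed value)
R_MOD = 0.0025    # paralleling an equal resistor halves it
TDP_LIMIT = 300   # watts the BIOS allows (assumed value)

actual = 400.0    # hypothetical true draw after overclocking
seen = reported_power(actual, R_ORIG, R_MOD)
print(f"actual {actual:.0f} W -> reported {seen:.0f} W")

# The limiter compares the *reported* figure against the cap, so a
# 400 W real draw reads as 200 W and stays under a 300 W limit;
# push clocks/voltage further and the reported figure eventually
# trips the cap anyway.
print("throttles:", seen > TDP_LIMIT)
```

The point of the sketch is only that the limiter never sees true power after the mod, so a cap can still trip once the (scaled) reading climbs back up.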


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KraxKill*
> 
> Here's a 2151 @ 1.093v Firestrike for fun.
> 
> My top score is actually 23100 at 2100 @ 1.063v, as I'm hitting the power limit even with the shunt mod. Clocking to 2151 puts my voltage where I begin to trip the "normalized TDP" throttle, while 2100 @ 1.063v keeps me right at the limit. This is on a 4790K @ 5.2, 2133 CL9 DDR3.
> 
> The moral here is that these cards are TDP-limited beyond what the shunt mod appears to forgive. I could score higher at 2151 if I weren't hitting the power limit, but since the card bounces off the rev limiter a couple of times during the run, scores are ultimately higher at 2100, where I'm just under the limit due to voltage.
> 
> If somebody has a verifiable method of increasing that limit, I'm all ears.
> 
> Different drivers, I know, but my scores haven't changed much with the new driver. Everything is within 0.1% of where it was on the old driver.
> 
> 2151 @ 1.093v
> http://www.3dmark.com/fs/12477286
> 
> 
> 2100 @ 1.063v
> http://www.3dmark.com/fs/12446981


If we could SLI our cards together, we'd probably have the highest SLI score on water, lol. They seem to be nearly identical besides maybe memory. Only +550 here.


----------



## GosuPl

One of my 1080 Tis is the worst card ever ^^ Even 1961 is not stable... I have a problem with SLI too: one card gets 0.50 mV less than the other. My fix was to set the clocks separately, +40 MHz more on one card than the other, with the voltage sliders maxed.

Soon I will watercool them (or just send this worst card back to the shop, buy another, and then go LC on the SLI pair). How much in stable clocks do you gain on LC compared to the NVTTM cooler?


----------



## KingEngineRevUp

@Kraxkills, I'm not sure if you saw my post but I copied it below.
Quote:


> Originally Posted by *SlimJ87D*
> 
> I believe they are right.
> 
> Please look at my Timespy doing 1.050v and 1.075v.
> 
> Below, I do not hit any perf cap at all. Even though my power limit is set to 120%, I'm only using 101% maximum power draw.
> 
> 
> Here I hit the perf cap of power limiting a tiny tiny bit in time spy. My maximum power draw was only 103% in AF, but it seems the normalized TDP was hit here.
> 
> 
> Although I never hit it in gaming or with superposition. Here is 2152 Mhz @ 1.093v
> 
> 
> 
> I believe the problem is being overblown, though. With the shunt mod I'm drawing roughly 100 more watts, around 400 watts total. I haven't run a game that has drawn even close to this much power. I think For Honor (which is just as bad as Ghost Recon when it comes to optimization) drew up to 92% at one point, but never in the 100s.
> 
> I believe this power limit "issue", if you want to call it that, is being overblown. Even if we figured out a way to get past it, it's not like the difference between the shunt and non-shunt mod in performance. Look at my pre-shunt-mod perf cap here.
> 
> At best we'd probably gain a fraction of 1% of performance, and only in Timespy and Firestrike, with no performance gain in any gameplay.
> 
> EDIT: I did not shunt-mod the resistor by the motherboard slot, but that could possibly give you the extra TDP you need, or not; I don't know how it relates to normalized TDP. But why bother drawing more power through your mobo? You risk damaging it. AMD updated the RX 480 for a reason: from my understanding, it is not safe to draw more than the recommended power through the mobo. Correct me if I'm wrong.


I think the normalized TDP limit isn't that big of a deal when it comes to gaming.


----------



## Savatage79

Curious if anyone has a suggestion. With my Aorus I'm at +20 core, +350 mem... I can hold 2050 pretty easily; temps are mid/high 40s to mid 50s. I drop to 2038 occasionally, and frames are locked at 60.

Here's my issue: my frames don't drop, but I'm getting some sort of slight stutter, so far in Nier and Andromeda. It's driving me crazy because it's smooth, but not quite.

I'm assuming frame pacing; any suggestions on what to try? Running 16 GB with a 4770K, Win 10.
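A locked FPS counter can hide exactly this kind of stutter, because FPS averages over a second while a single late frame is visible to the eye. A minimal sketch of scanning a frametime log for such spikes (the frametime numbers here are made up for illustration, not captured from any of these games):

```python
# Why an FPS counter can read "locked 60" while the game still
# stutters: one 40 ms frame barely moves a 1-second average, but it
# is perceived as a hitch. A simple check flags frames well above
# the median frametime.

from statistics import median

def find_stutters(frametimes_ms, factor=1.5):
    """Return indices of frames whose frametime exceeds
    `factor` times the median frametime."""
    m = median(frametimes_ms)
    return [i for i, ft in enumerate(frametimes_ms) if ft > factor * m]

# 60 fps target is ~16.7 ms per frame; inject one 40 ms spike.
frames = [16.7] * 59 + [40.0]

avg_fps = 1000 * len(frames) / sum(frames)
print(f"average fps: {avg_fps:.1f}")     # ~58.5, still looks 'locked'
print("stutter frames:", find_stutters(frames))
```

Tools like FRAPS or PresentMon can export a real frametime log to feed into a check like this instead of the fabricated list above.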


----------



## Slackaveli

Quote:


> Originally Posted by *Alwrath*
> 
> While it is a nice chip, quad cores are just not impressive nowadays. Your cpu usage in BF1 is because that game is designed for more cores. More cores = higher and more consistent FPS in that game, including min FPS. If that broadwell chip was an 8 core, I might be impressed. But quad cores are so 2007.


lol! it beats all the 8 cores. i hold 135+ in 1440p ultra all day. usually 144 locked. it smashes the ryzen 8 cores.


----------



## Slackaveli

Quote:


> Originally Posted by *Savatage79*
> 
> Curious if anyone has a suggestion. With my Aorus I'm at +20 core, +350 mem... I can hold 2050 pretty easily; temps are mid/high 40s to mid 50s. I drop to 2038 occasionally, and frames are locked at 60.
> 
> Here's my issue: my frames don't drop, but I'm getting some sort of slight stutter, so far in Nier and Andromeda. It's driving me crazy because it's smooth, but not quite.
> 
> I'm assuming frame pacing; any suggestions on what to try? Running 16 GB with a 4770K, Win 10.


i have heard turning off Intel HPET in the BIOS settings can fix that.

Also, maybe try turning shader cache to off in nvidia control panel.


----------



## Luckbad

I already have a Zotac 1080 Ti Amp Extreme that does 2064 or so on core and 12000 or so on memory.

An EVGA 1080 Ti Founders Edition just arrived, and another Zotac 1080 Ti Amp Extreme was supposed to be in today but might not be until Monday.

The urge is real to open them both up and see what I can get for overclocks rather than selling them as completely new.

Curses!


----------



## Chris Ihao

Did the Kryonaut Godzilla mod on my Aorus non-Xtreme edition. It's running happily at 2012 MHz with a comfortable max temperature of 59°C. I put the power limit at +125 and +80 on the voltage slider on top of that. It stays at 2025 MHz for a good while but eventually drops down to 2012 and stays put, at the standard voltage of 1.0 V. I did a bit of messing around with my case fans and removed the stupid front panel as well. The gaming PC previously doubled as an audio workstation until I got a dedicated one, so for the longest time I hopelessly tried to combine overclocking on air with as silent a build as possible. What a relief!







Much better airflow now, which helps the temps quite a bit for this trial.

How did I test? I find that a maxed-out Witcher 3 (minus Hairworks, with the longest foliage distance) is an excellent stability test. If I stand beside some water during sunset and get a good view of the water, rays of sun and other stuff, it easily hits the 99% GPU usage mark constantly. In addition, this game takes up a whole lot of CPU power in certain areas, so you get to create some heat in that compartment as well, which is useful for real-world testing. I tried +100 but it crashed after a while, and I decided to stop at +80, as the card was still down at 2012 at +85. I got a bit freaked out when the screen went black over several reboots as I entered Windows after this; it turned out Windows had probably decided my Oculus Rift was the main monitor automatically. Phew. Hehe.

Just a word to the wise: I managed to break one of the plastic sides of one of the two fan headers I needed to remove when changing to the Kryonaut paste. Those fan headers really are a downright nightmare to disconnect when you don't have help from someone, while trying to avoid messing up the thermal pads placed around the cooler. In other words, it doesn't hurt to get some backup in case you run into trouble.

Anyhow, thanks again for the tip about the Grizzly paste, guys! I'm quite happy with the result, even though I'm not going to break any records today. At least I get to run about 100 MHz higher than before the mod.


----------



## Luckbad

@Chris Ihao how was the Kryonaut to apply? Did you heat it up in something watertight first to make it easier to spread, or was it no problem?


----------



## Chris Ihao

Quote:


> Originally Posted by *Luckbad*
> 
> @Chris Ihao how was the Kryonaut to apply? Did you heat it up in something watertight first to make it easier to spread, or was it no problem?


It was no problem at all. Very easy to apply, and they even include a sort of rubber "spade" with a hole in the middle, which you screw on the tip of the syringe. It lets you apply more paste easily by simply squeezing out more as you spread it out thinly. You can see it mounted on the tip here:



The paste that was there before was another story, though. Really hard to get off, even with isopropanol, for some reason. I've mostly used the "blob of goop in the middle" method previously, but as the Kryonaut instructions advise the spread method, I used it this time around. Hope that helps.


----------



## Savatage79

Quote:


> Originally Posted by *Slackaveli*
> 
> i have heard turning off Intel HPET in the BIOS settings can fix that.
> 
> Also, maybe try turning shader cache to off in nvidia control panel.


I actually got it, man. Playing Nier, I hadn't set the global illumination from 128 down to 64; that fixed me right up.


----------



## BBEG

Is it worth replacing the TIM, generally, on 1080 Ti's?


----------



## Savatage79

Quote:


> Originally Posted by *Savatage79*
> 
> I actually got it, man. Playing Nier, I hadn't set the global illumination from 128 down to 64; that fixed me right up.


Actually, it didn't :sigh:

I'm not sure what I'm experiencing with this GPU, but something is amiss. Its temps are great and frames are locked at 60, but so far in both Andromeda and now Nier I'm getting a subtle stutter in some areas and when spinning the camera around.

What's driving me nuts is that with my 1080 I could set it to 4K, lower some settings, and it was smooth as silk. But that was an EVGA 1080, and I've been an EVGA guy forever; this is the first time I've switched, and now I have this small underlying stutter issue I don't know how to solve. I can try turning off shader caching next.


----------



## Chris Ihao

Quote:


> Originally Posted by *BBEG*
> 
> Is it worth replacing the TIM, generally, on 1080 Ti's?


I, for one, dont regret it at least. I cant speak generally, but if the goop is as hard on other cards as on the aorus, there should be a few degrees to earn at least. I dropped over 5 celcius myself.


----------



## Chris Ihao

Quote:


> Originally Posted by *Savatage79*
> 
> Actually, it didn't :sigh:
> 
> I'm not sure what I'm experiencing with this GPU, but something is amiss. Its temps are great and frames are locked at 60, but so far in both Andromeda and now Nier I'm getting a subtle stutter in some areas and when spinning the camera around.
> 
> What's driving me nuts is that with my 1080 I could set it to 4K, lower some settings, and it was smooth as silk. But that was an EVGA 1080, and I've been an EVGA guy forever; this is the first time I've switched, and now I have this small underlying stutter issue I don't know how to solve. I can try turning off shader caching next.


Hmm. It COULD be a borderline unstable overclock. I've experienced this in the past when overclocking GPUs (probably due to the GPU not handling the load and dropping frames).


----------



## derpa

Quote:


> Originally Posted by *Savatage79*
> 
> Actually, it didn't :sigh:
> 
> I'm not sure what I'm experiencing with this GPU, but something is amiss. Its temps are great and frames are locked at 60, but so far in both Andromeda and now Nier I'm getting a subtle stutter in some areas and when spinning the camera around.
> 
> What's driving me nuts is that with my 1080 I could set it to 4K, lower some settings, and it was smooth as silk. But that was an EVGA 1080, and I've been an EVGA guy forever; this is the first time I've switched, and now I have this small underlying stutter issue I don't know how to solve. I can try turning off shader caching next.


Just wondering; do you have eVGA PxOC open while playing? I noticed when I was using PxOC, I was getting stuttering and tearing in random things. I did a complete uninstall/driver clean (outlined a couple pages back) and switched over to AB, and haven't experienced it since.


----------



## Bishop07764

Quote:


> Originally Posted by *CoreyL4*
> 
> So my gaming x can do 2062/stock memory at 1.062v stable.
> 
> When I try 2050/6000 at 1.093v, it becomes power limited and if it goes below 1.050 or 1.043v it will crash because i found i need high voltage to have that memory overclock.
> 
> Is there any common ground I could take? as in 2037/5500-5800 at a lower voltage or what?


Quote:


> Originally Posted by *SlimJ87D*
> 
> I'm sorry CoreyL4, but it's just the silicon lottery. I've been reading your posts to try to figure out ways to help you, but at this point you either have to return your card and buy another one in the hope it overclocks 2% higher for 1-2% more performance, or just enjoy some video games.
> 
> You're currently clocked at 2062; getting to 2101 MHz would only improve your performance by roughly 1.5%. That can take a 4K game from 60 FPS to 60.9 FPS. You're not missing much, my friend.
> 
> I had a 980 Ti that did 1500 MHz, but when I SLI'd it with another one, the new one couldn't go above 1450 MHz. I tried everything and just settled.
> 
> At some point we just have to settle and enjoy the card the way it is.
> 
> But always remember that Boost 3.0 has already overclocked your card near its maximum potential. It's not like you're not getting good gains; you're getting great gains above 2000 MHz already.


I'd second that. The actual difference in games is going to be pretty small at this point. I finally crashed in The Division running 2063 on the core. Dropped it down to 2050 and lost a whole 0.2 FPS off my average at 1440p. I have yet to try The Witcher 3; it sounds like it's pretty brutal on overclocks.
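The scaling being described can be put in rough numbers. This sketch assumes FPS moves sub-linearly with core clock; the 0.75 factor is an assumption chosen for illustration, not a measured constant for these games:

```python
# Back-of-the-envelope check of how little a small core-clock drop
# costs, matching the 2063 -> 2050 example above. GPU-bound FPS tends
# to move by somewhat less than the relative clock change; 0.75 of
# the clock delta is used here as an illustrative assumption.

def fps_after_clock_change(fps, old_mhz, new_mhz, scaling=0.75):
    """Estimate FPS after a core-clock change, assuming FPS moves
    by `scaling` times the relative clock change."""
    return fps * (1 + scaling * (new_mhz - old_mhz) / old_mhz)

fps = fps_after_clock_change(100.0, 2063, 2050)
print(f"{fps:.2f}")  # ~99.5: roughly a half-percent loss, consistent
                     # with losing only 0.2 FPS off an average
```

Under this assumption a 13 MHz drop from 2063 is well under 1% of performance, which is why chasing the last bin rarely shows up in gameplay.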


----------



## Hulio225

Quote:


> Originally Posted by *BBEG*
> 
> Is it worth replacing the TIM, generally, on 1080 Ti's?


I don't think it will give you a reasonable advantage, to be honest...


----------



## Nico67

Quote:


> Originally Posted by *TheBoom*
> 
> This works for keeping your voltage from going higher, but it doesn't prevent it from going lower or down clocking does it?


No; only staying out of the limiters will do that.


----------



## Savatage79

Quote:


> Originally Posted by *Chris Ihao*
> 
> Hmm. It COULD be a borderline unstable overclock. I've experienced this in the past when overclocking GPUs (probably due to the GPU not handling the load and dropping frames).


Possible, but would it actually show a slight drop in frames? Because that's what bugs me: my OSD never shows actual frame drops when it happens. But it could be an unstable OC. Is there any way to really test and see? Because I can run the settings I'm on without crashing or anything.
Quote:


> Originally Posted by *derpa*
> 
> Just wondering; do you have eVGA PxOC open while playing? I noticed when I was using PxOC, I was getting stuttering and tearing in random things. I did a complete uninstall/driver clean (outlined a couple pages back) and switched over to AB, and haven't experienced it since.


Yeah, I had it open. I can try that, though. The one thing about AB when I used it was that I could never remove the OSD once I had it up; have you ever experienced that?


----------



## Chris Ihao

Quote:


> Originally Posted by *Hulio225*
> 
> I don't think it will give you a reasonable advantage, to be honest...


A benefit not mentioned here is that you can double-check that the job was done properly when it comes to the mounting of the heatsink itself. I thought some of the screws were ridiculously lightly tightened. Of course, you don't have to change the TIM to do that, though.

Whether it's worth it is another question, but here we are on overclock.net. Gaining 1 FPS really isn't "worth" the hassle of potential instability etc., but here we are doing it anyway. Hehe.
Quote:


> Originally Posted by *Savatage79*
> 
> Possible, but would it actually show a slight drop in frames? Because that's what bugs me: my OSD never shows actual frame drops when it happens. But it could be an unstable OC. Is there any way to really test and see? Because I can run the settings I'm on without crashing or anything.
> Yeah, I had it open. I can try that, though. The one thing about AB when I used it was that I could never remove the OSD once I had it up; have you ever experienced that?


Hmm, I really don't know, man. What I'm referring to was a looong time ago, but you could run the 3DMark stability test to see what it says; it measures the stability of the clock etc. You can get some stutters if the clock drops up and down a lot, methinks, at least in my experience. I constantly monitor the clock at the start now, just to be sure. Oh, and instability in electronic components can manifest as pretty much anything. I guess if you're really lucky, you can crash while seeing a pattern of an elephant made up of many tiny dots of different colors.


----------



## Hulio225

Quote:


> Originally Posted by *Chris Ihao*
> 
> A benefit not mentioned here is that you can double-check that the job was done properly when it comes to the mounting of the heatsink itself. I thought some of the screws were ridiculously lightly tightened. Of course, you don't have to change the TIM to do that, though.
> 
> Whether it's worth it is another question, but here we are on overclock.net. Gaining 1 FPS really isn't "worth" the hassle of potential instability etc., but here we are doing it anyway. Hehe.


Yeah, sure, I'll give you that point ;-)


----------



## buddatech

How much of an impact does overclocking VRAM only vs. VRAM + core vs. core only have?


----------



## s1rrah

Quote:


> Originally Posted by *BBEG*
> 
> Is it worth replacing the TIM, generally, on 1080 Ti's?


YES!!

I repasted my MSI Seahawk X about a week ago ... check this super great factory paste job:

...



...


Repaste dropped load temps quite a bit (plus I added a 38mm Scythe fan which kicks so much @#@ that I can't even describe it)

Anyway, here's the full story...

http://www.overclock.net/t/1628718/so-i-re-pasted-my-msi-1080-ti-seahawk-the-other-day-and#post_26039363f :thumb:


----------



## Slackaveli

Quote:


> Originally Posted by *Savatage79*
> 
> Actually, it didn't :sigh:
> 
> I'm not sure what I'm experiencing with this GPU, but something is amiss. Its temps are great and frames are locked at 60, but so far in both Andromeda and now Nier I'm getting a subtle stutter in some areas and when spinning the camera around.
> 
> What's driving me nuts is that with my 1080 I could set it to 4K, lower some settings, and it was smooth as silk. But that was an EVGA 1080, and I've been an EVGA guy forever; this is the first time I've switched, and now I have this small underlying stutter issue I don't know how to solve. I can try turning off shader caching next.


im tellin you, try turning off HPET in the BIOS.


----------



## Alwrath

Quote:


> Originally Posted by *Slackaveli*
> 
> lol! it beats all the 8 cores. i hold 135+ in 1440p ultra all day. usually 144 locked. it smashes the ryzen 8 cores.


I'd be interested in seeing your BF1 results. Is there even a benchmark for it? You could be right, but remember your cores are all at over 90% usage and you can't have anything else open while you play; if you open up iTunes you'll see stuttering... Also, saying it "smashes" a Ryzen is kind of far-fetched... maybe higher scores at 120 Hz+, but I game at 4K 60 Hz so it really doesn't matter to me.


----------



## Coopiklaani

EK just murdered my card... There's more coolant outside the loop than what's left inside.
Don't know if the crack happened before or after the installation. This acrylic glass is really, really weak; I never had this problem before with acetal waterblocks.


----------



## KedarWolf

Anyone flashing a BIOS and having issues with really low power limits: unzip powerlimit.zip, right-click the powerlimit.bat file to edit it in Notepad, change the '-pl 3**' value to your BIOS max, then run it.

powerlimit.zip 0k .zip file


You can find your max TDP here.

https://www.techpowerup.com/vgabios/?architecture=&manufacturer=&model=GTX+1080+Ti&interface=&memType=&memSize=&since=

I found that after flashing different BIOSes, before the .bat file was run, the max TDP was 250 W rather than what it should be, and I think it did this after reinstalling the Nvidia drivers as well.
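For reference, a .bat like this most likely wraps nvidia-smi's power-limit flag (an assumption; the actual file contents aren't shown here). nvidia-smi's `-pl` option sets the board power limit in watts, must stay within the min/max range the flashed BIOS enforces, and needs administrator rights. A minimal sketch of composing that command:

```python
# Sketch of what a power-limit .bat likely wraps: nvidia-smi -pl,
# which sets the board power limit in watts for a given GPU index.
# This only builds the command string; running it requires admin
# rights and a value inside the BIOS-enforced min/max range.

def power_limit_cmd(watts, gpu_index=0):
    """Compose an nvidia-smi command that sets the power limit
    (in watts) for the GPU at `gpu_index`."""
    return f"nvidia-smi -i {gpu_index} -pl {watts}"

print(power_limit_cmd(300))   # -> nvidia-smi -i 0 -pl 300
```

Running `nvidia-smi -q -d POWER` first shows the current, default, min, and max enforced limits, which is a safer starting point than guessing the BIOS max.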


----------



## s1rrah

Quote:


> Originally Posted by *Coopiklaani*
> 
> 
> 
> 
> 
> 
> 
> 
> EK just murdered my card... There's more coolant outside the loop than what's left inside.
> Don't know if the crack happened before or after the installation. This acrylic glass is really, really weak; I never had this problem before with acetal waterblocks.


oh .... dude ...


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Coopiklaani*
> 
> 
> 
> 
> 
> 
> 
> 
> EK just murdered my card... There's more coolant outside the loop than what's left inside.
> Don't know if the crack happened before or after the installation. This acrylic glass is really, really weak; I never had this problem before with acetal waterblocks.


Aw man... Did it spray everywhere and what happened afterwards? Please share your story.

You had a golden silicon card also right?


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Anyone flashing a BIOS and having issues with really low power limits: unzip powerlimit.zip, right-click the powerlimit.bat file to edit it in Notepad, change the '-pl 3**' value to your BIOS max, then run it.
> 
> powerlimit.zip 0k .zip file
> 
> 
> You can find your max TDP here.
> 
> https://www.techpowerup.com/vgabios/?architecture=&manufacturer=&model=GTX+1080+Ti&interface=&memType=&memSize=&since=
> 
> I found that after flashing different BIOSes, before the .bat file was run, the max TDP was 250 W rather than what it should be, and I think it did this after reinstalling the Nvidia drivers as well.


Edit: Yes, it did it after a driver reinstall as well.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Coopiklaani*
> 
> 
> 
> 
> 
> 
> 
> 
> EK just murdered my card... There's more coolant outside the loop than what's left inside.
> Don't know if the crack happened before or after the installation. This acrylic glass is really, really weak; I never had this problem before with acetal waterblocks.
> 
> 
> 
> 
> 
> Aw man... Did it spray everywhere and what happened afterwards? Please share your story.
> 
> You had a golden silicon card also right?
Click to expand...

Oh man, sorry.

Does EK cover this?

I know with their Predator line they did replace hardware from faulty water cooling equipment.


----------



## Coopiklaani

Quote:


> Originally Posted by *SlimJ87D*
> 
> Aw man... Did it spray everywhere and what happened afterwards? Please share your story.
> 
> You had a golden silicon card also right?


A few drops on the GPU PCB, more on my PSU. My PC went to a black screen and restarted itself in the middle of video playback before I noticed something had gone wrong.

It's a max core 2100 MHz, VRAM +800 card. I'll try to clean the card with rubbing alcohol and give it another go.


----------



## Chris Ihao

Quote:


> Originally Posted by *Coopiklaani*
> 
> 
> 
> 
> 
> 
> 
> 
> EK just murdered my card... There's more coolant outside the loop than what's left inside.
> Don't know if the crack happened before or after the installation. This acrylic glass is really, really weak; I never had this problem before with acetal waterblocks.


RIP (hope not though). Crossing fingers.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Coopiklaani*
> 
> A few drops on the GPU PCB, more on my PSU. My PC went to a black screen and restarted itself in the middle of video playback before I noticed something had gone wrong.
> 
> It's a max core 2100 MHz, VRAM +800 card. I'll try to clean the card with rubbing alcohol and give it another go.


Soak it in a tub of 99% isopropyl for an hour and then let it dry for an hour afterwards; it should be good.


----------



## mshagg

Quote:


> Originally Posted by *Coopiklaani*
> 
> 
> 
> 
> 
> 
> 
> 
> EK just murdered my card... There's more coolant outside the loop than what's left inside.
> Don't know if the crack happened before or after the installation. This acrylic glass is really, really weak; I never had this problem before with acetal waterblocks.


Yikes. How long had the block been on there? It was ok during leak testing?


----------



## Slackaveli

Quote:


> Originally Posted by *Alwrath*
> 
> I'd be interested in seeing your BF1 results. Is there even a benchmark for it? You could be right, but remember your cores are all at over 90% usage and you can't have anything else open while you play; if you open up iTunes you'll see stuttering... Also, saying it "smashes" a Ryzen is kind of far-fetched... maybe higher scores at 120 Hz+, but I game at 4K 60 Hz so it really doesn't matter to me.


nope. my cores never go past 68%~72%. i have the broadwell destroyer







i never get stutter. i have 134mb of L3+L4 cache







i know ryzen's best attribute (to me) is its smoothness. I have that too, though, because of the Crystalwell cache. tbf, i beat the 6-cores from all vendors and the 8-core ryzens in fps, min and max, but not the 6900k/6950x. but im right there with them on an older platform, older ram, etc. it's amazing really. I wish it had a benchmark.


----------



## nrpeyton

Quote:


> Originally Posted by *Slackaveli*
> 
> well, i might not know what im doing but i couldnt get the test to even run on my gpu. ran the cpu tests fine. but even on stock settings the test would stop after 1 second :/ . if you have a free moment, mind posting a screenshot of all your settings on OCCT, and can you think what i might be doing wrong?




Things to note:

- Clicking on the OCCT rendering window (so it's at the *front* and currently highlighted on your desktop) will allow it to run at full speed. Clicking any other window or item during the render will cause it to automatically downclock to 20 FPS. _Simply click the OCCT render again to resume full speed._
- You can determine how much pressure it puts your card under by a) setting the 'FPS Limit' or b) re-sizing the window after the render has begun. Bigger = more power hungry.
- Don't go daft with 700+ FPS etc., lol; try to simulate a real-world scenario (make the window a bit bigger to keep FPS down while still drawing a typical gaming power draw, maybe 275 W). Utilise the FPS limiter.
- Play about with the window size after it's begun to 'perfect' your run.
- My max stable memory overclock is 12,120 MHz (effective).

Any more questions, don't hesitate.
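To put memory overclock figures like that in context, bandwidth follows directly from the bus width and the effective per-pin data rate. A quick sketch for the 1080 Ti's 352-bit GDDR5X bus (11 Gbps effective at stock):

```python
# Memory bandwidth from bus width and effective data rate:
# bandwidth (GB/s) = (bus width in bits / 8) * effective Gbps.
# The 1080 Ti has a 352-bit bus at 11 Gbps effective stock.

def bandwidth_gbs(bus_bits, effective_gbps):
    """Memory bandwidth in GB/s for a given bus width and
    effective per-pin data rate."""
    return bus_bits / 8 * effective_gbps

stock = bandwidth_gbs(352, 11.0)    # stock 1080 Ti
oc = bandwidth_gbs(352, 12.12)      # the overclock stated above
print(f"stock {stock:.0f} GB/s -> OC {oc:.0f} GB/s")
```

By this arithmetic the stated overclock takes the card from roughly 484 GB/s to around 533 GB/s, about a 10% bandwidth increase.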


----------



## Dasboogieman

Quote:


> Originally Posted by *buddatech*
> 
> How much of an impact does overclocking VRAM only vs. VRAM + core vs. core only have?


Well, everyone on this forum seems to be getting to around 2000 MHz (sometimes at stock) as a baseline, so any gains above that are single-digit percentages. The VRAM is another story: you get roughly half of your percentage overclock back as performance (so if I OC my VRAM 10%, I get roughly a 5% perf gain), but where it really helps is minimum frame rates, especially in open-world titles.

It's strongly recommended to OC the VRAM, as its performance uplift is measurable and the power impact is minimal versus gunning for higher voltage-augmented core clocks. Some members even noticed massive VRAM OC headroom just by beefing up the VRAM cooling.
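That half-the-percentage rule of thumb can be expressed as a quick calculator. The 0.5 factor is the poster's observation, not a measured constant, and real gains vary per title:

```python
# Rule-of-thumb estimate of performance gained from a VRAM overclock:
# roughly half of the relative bandwidth increase shows up as FPS,
# per the observation above (0.5 is an assumption, not a measurement).

def vram_oc_gain(stock_gbps, oc_gbps, factor=0.5):
    """Estimated performance gain (%) from a VRAM overclock,
    taking `factor` of the relative data-rate increase."""
    return factor * (oc_gbps - stock_gbps) / stock_gbps * 100

gain = vram_oc_gain(11.0, 12.1)   # a 10% memory OC
print(f"{gain:.1f}%")             # ~5% expected uplift
```

By the same observation, minimum frame rates in open-world titles may benefit more than this average figure suggests.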


----------



## Coopiklaani

Quote:


> Originally Posted by *mshagg*
> 
> Yikes. How long had the block been on there? It was ok during leak testing?


Only 3 weeks. No visible leak during my leak test. The crack is really small; it takes some time for enough coolant to accumulate.


----------



## Bishop07764

Quote:


> Originally Posted by *Coopiklaani*
> 
> 
> 
> 
> 
> 
> 
> 
> EK just murdered my card... There's more coolant outside the loop than what's left inside.
> Don't know if the crack happened before or after the installation. This acrylic glass is really, really weak; I never had this problem before with acetal waterblocks.


Oh man. So sorry. Hope nothing is ruined.


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> 
> 
> 
> 
> 
> 
> 
> EK just murdered my card... There's more coolant outside the loop than what's left inside.
> Don't know if the crack happened before or after the installation. This acrylic glass is really, really weak; I never had this problem before with acetal waterblocks.
> 
> 
> Spoiler: Warning: Spoiler!


Was the card powered on when it happened?

If not it might be fine.

Take it to a sauna and dry it out thoroughly then leave to sit on top of a warm radiator for a few days.

Really sorry for your loss if you've lost it.


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> Was the card powered on when it happened?
> 
> If not it might be fine.
> 
> Take it to a sauna and dry it out thoroughly then leave to sit on top of a warm radiator for a few days.
> 
> Really sorry for your loss if you've lost it.


It black-screened in the middle of video playback, so yes, it was powered on... It's now bathing in the rubbing alcohol. Hopefully it's still OK.


----------



## mshagg

Quote:


> Originally Posted by *Coopiklaani*
> 
> Only 3 weeks. No visible leak in my leak test. The crack is really small; it takes some time to really accumulate.


Oh wow. That definitely sounds like EK owe you!


----------



## Slackaveli

Quote:


> Originally Posted by *Coopiklaani*
> 
> It black-screened in the middle of video playback, so yes, it was powered on... It's now bathing in the rubbing alcohol. Hopefully it's still OK.


omg, that's terrible. so sorry, yo!
Quote:


> Originally Posted by *nrpeyton*
> 
> 
> 
> Things to note:
> 
> Clicking on the OCCT rendering window (so it's at the *front* and currently highlighted on your desktop) will allow it to run full speed. Clicking any other window or item during the render will cause it to automatically downclock to 20 FPS. _Simply click the OCCT render again to resume full speed._
> You can determine how much pressure it puts your card under by a) setting the 'FPS Limit' or b) re-sizing window after the render has begun. Bigger = more power hungry.
> Don't go daft with 700++ FPS etc lol, try to simulate a real world scenario (make the window a bit bigger to keep FPS down while still drawing a typical gaming power draw (maybe 275w)?? Utilise FPS limiter.
> Play about with window size after it's begun to 'perfect' your run.
> My max stable memory overclock is 12,120 Mbps
> 
> Anymore questions don't hesitate


ok, thanks man. i'll give it another round, even though i pretty much know my sweet spot is 11,880. i just want to confirm my tests and this should do it.


----------



## Benny89

Finally managed to get stable Witcher 3 clocks... played 5 hours straight, no crash or anything. Nice and stable 2050 at 1.050V.









Seems like the memory OC was the reason for the crashing. W3 doesn't like memory OC at all. Not only did I gain 4C by lowering the memory OC, I didn't lose any performance.

Also, it seems like temperature is really important. I used a more aggressive fan curve (65%) and kept my card in the range of 58-60C, and didn't have any perfcaps or power limit at 2050 and 1.050V. Not once did the clock drop below 2050MHz.

Uff...finally....

Tomorrow I will repaste it with Kryonaut and I hope I will gain 4-5C more.

So the key is a lower memory OC and keeping the card under 60C, which is not easy in W3 but totally possible with an aggressive STRIX fan curve.

Thanks all for all suggestions and help and big thanks to my man, @*Slackaveli* for Vram suggestion.

My final curve and AB settings:



I think its time to invest in water loop.

Could someone tell me what components Jay used in this build? I have the same case, so I want to order exactly the same EK res/pump and radiators:
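The "65% fan curve keeping the card at 58-60C" trick above is just point-to-point linear interpolation, which is how Afterburner-style custom fan curves behave between the points you set. A toy sketch (the curve points here are hypothetical, not Benny89's exact settings):

```python
# Toy sketch of a custom GPU fan curve (hypothetical points, not an
# Afterburner export). The tool interpolates linearly between points.

CURVE = [(30, 30), (50, 50), (60, 65), (70, 85), (80, 100)]  # (temp C, fan %)

def fan_percent(temp_c):
    """Linearly interpolate fan duty for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]          # clamp below the first point
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]         # clamp above the last point
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(60))  # 65.0 -> the "aggressive 65%" duty at 60C
```

The practical takeaway from the post is simply that the 60C point sits high enough (65%) that the card never crosses into the throttle range during long W3 sessions.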


----------



## BBEG

Quote:


> Originally Posted by *Coopiklaani*
> 
> It black screened in the middle of video playback, so yes, it was powered on... It's now bathing in rubbing alcohol. Hopefully it's still OK.


Could you explain this in detail? I've heard others mention it before but I'm trying to understand exactly what you're doing. Are you dabbing rubbing alcohol onto the PCB or literally dunking it?


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> 
> 
> Things to note:
> 
> Clicking on the OCCT rendering window (so it's at the *front* and currently highlighted on your desktop) will allow it to run full speed. Clicking any other window or item during the render will cause it to automatically downclock to 20 FPS. _Simply click the OCCT render again to resume full speed._
> You can determine how much pressure it puts your card under by a) setting the 'FPS Limit' or b) re-sizing window after the render has begun. Bigger = more power hungry.
> Don't go daft with 700++ FPS etc lol, try to simulate a real world scenario (make the window a bit bigger to keep FPS down while still drawing a typical gaming power draw (maybe 275w)?? Utilise FPS limiter.
> Play about with window size after it's begun to 'perfect' your run.
> My max stable memory overclock is 12,120 Mbps
> 
> Anymore questions don't hesitate


Quote:


> Originally Posted by *Benny89*
> 
> Finally managed to do stable Witcher 3 clocks... Playing 5 hours straight, no crash or anything . Nice and stable 2050 at 1.050V.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Seems like the memory OC was the reason for the crashing. W3 doesn't like memory OC at all. Not only did I gain 4C by lowering the memory OC, I didn't lose any performance.
> 
> Also, it seems like temperature is really important. I used a more aggressive fan curve (65%) and kept my card in the range of 58-60C, and didn't have any perfcaps or power limit at 2050 and 1.050V. Not once did the clock drop below 2050MHz.
> 
> Uff...finally....
> 
> Tomorrow I will repaste it with Kryonaut and I hope I will gain 4-5C more.
> 
> So the key is a lower memory OC and keeping the card under 60C, which is not easy in W3 but totally possible with an aggressive STRIX fan curve.
> 
> Thanks all for all suggestions and help and big thanks to my man, @*Slackaveli* for Vram suggestion.
> 
> My final curve and AB settings:
> 
> 
> 
> I think its time to invest in water loop.
> 
> Could someone tell me what components did Jay use in this build? I have same case so I want to order exactly same EK Res/Pump and radiators:



I knew it, man. our cards are so close and my mems are picky. glad you got it figured out. You'll probably get 13MHz just from the Kryonaut, too.
Quote:


> Originally Posted by *BBEG*
> 
> Could you explain this in detail? I've heard others mention it before but I'm trying to understand exactly what you're doing. Are you dabbing rubbing alcohol onto the PCB or literally dunking it?


bath. like, soaked in it. then completely dry for a couple days then pray.


----------



## Coopiklaani

Quote:


> Originally Posted by *BBEG*
> 
> Could you explain this in detail? I've heard others mention it before but I'm trying to understand exactly what you're doing. Are you dabbing rubbing alcohol onto the PCB or literally dunking it?


I filled a container, like a big lock-and-lock, with rubbing alcohol and SUBMERGED the card entirely. Close the lid, wait for 24hr, then drain and dry the card. It may seem like overkill, but I'm worried some coolant could actually have gone under the chip/die.


----------



## nrpeyton

EK make a bold statement on all their kits saying they take no liability for any damage to computer parts while using their equipment (used properly or improperly makes no difference), as the disclaimers/warnings are there in bold.

Everyone thinks "it'll never be me" and no one gives the warnings a second thought.

If you're within warranty period; EK *will* replace the block.

We're on our own with any damaged cards / mobos or the like.

Doesn't seem fair; but imagine if they _did_ cover you? It would be abused so much I doubt the company could survive a year.

Personally; I think EK have great customer service and I've always found them to be fast, competent, knowledgeable & eager to help. One of the best customer services I've received *online* _(definitely up there with EVGA)._

.

Quote:


> Originally Posted by *Chris Ihao*
> 
> I, for one, dont regret it at least. I cant speak generally, but if the goop is as hard on other cards as on the aorus, there should be a few degrees to earn at least. I dropped over 5 celcius myself.


Or could try going hardcore with liquid metal. Might even get 8c ;-)
Disclaimer: I'm not actually recommending this









Anyway time to figure out my max core overclock ;-)
Last night _(my first night)_ at stock voltage I got to....... 2062/2079

Hoping to get to 2100 tonight using voltage slider / curve


----------



## RavageTheEarth

Quote:


> Originally Posted by *Clukos*
> 
> Hitting power limit on Witcher 3 (set it to 1.093V, 2100MHz with curve OC):
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The undervolting test actually provides a better gaming experience


How did you get RivaTuner to show wattage? I haven't seen that before. I run 2025 @ 1V locked and it is much smoother without the bins dropping all the time.
Quote:


> Originally Posted by *Benny89*
> 
> Can you guys tell me your stable clocks in Witcher 3? By stable I mean you can play it for long hours/day without crash, not hour/two or so


Quote:


> Originally Posted by *TheBoom*
> 
> Ok so I didn't realize how hot this card ran until I played andromeda on it for the first time today. 100% fans barely keeping it at 80c @2037 1.075v with 11.8ghz mem.
> 
> I have my h115iv2 AIO for the CPU as exhaust and my load temps just went up about 10c.
> 
> I'm only running my 6700k at 1.26v at 4.5ghz and it's hitting a max of 90c.
> 
> If I put my finger on the top of the case for more than 3 seconds I'll get scalded.
> 
> Zotac Amp Extreme btw.
> 
> Edit : There's even a smell of silicon coming from exhaust lol. I hope it doesn't fry. *knocks on wood*


You need some water in your life man!
Quote:


> Originally Posted by *Benny89*
> 
> My crashing looks like: the game freezes, but you can still hear that the game is running behind it (you can even hear going to the menu, choosing an option in dialogue, etc.), but the screen freezes. Then you hit ESC or the Windows button and you're back to the desktop and the game is closed.
> 
> I will try to get the VRAM lower. I always thought that an unstable VRAM OC artifacts, not crashes the game. Or at least artifacts first... Oh well, you learn something new every day


I get the same type of crashes in Ghost Recon Wildlands with an unstable overclock. Can't even open task manager. Have to do a hard restart.
Quote:


> Originally Posted by *s1rrah*
> 
> YES!!
> 
> I repasted my MSI Seahawk X about a week ago ... check this super great factory paste job:
> 
> ...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> ...
> 
> [email protected][email protected][email protected]
> 
> Repaste dropped load temps quite a bit (plus I added a 38mm Scythe fan which kicks so much @#@ that I can't even describe it)
> 
> Anyway, here's the full story...
> 
> http://www.overclock.net/t/1628718/so-i-re-pasted-my-msi-1080-ti-seahawk-the-other-day-and#post_26039363 :thumb:


Holy NSFW!!


----------



## phenom01

So I was just running some benchmarks (the new FF14 bench) on my card. I haven't really even used it yet, maybe 2 hours of total non-idle use... and I got a driver crash/system lock. Thought nothing of it and restarted. Now my card will not even go fullscreen in any game or bench without instacrashing. It was acting a bit funny in the few TimeSpy and Heaven runs I got it through before this, giving "cannot go fullscreen" errors. But NOW the core clocks jump around from like 1400-2000, all over the place, instantly on boot of a game, and I get a crash. Tried reinstalling drivers with DDU. Just reformatted and clean installed: same issue. Tried it in my brother's system and same problem. The voltage and core jump crazy fast from the mid-1400s to 2000 and everything in between, and poof, hard system lock on a black screen within 1-5 seconds.









*edit* I'm talking voltages from 0.665 to 1.093 and clocks from 1390ish to 2038, changing in a fraction of a second, then black screen. It happens so fast I can't even get a grasp on the numbers flashing, along with visual anomalies all over the place: sparkles and lines and static. Then black.


----------



## Benny89

Btw guys, which Thermal Grizzly Kryonaut did you use for your GPUs?

I have the grey one; should I get the silver (metal) one? Which is better and which do you use when you repaste a GPU?


----------



## Slackaveli

Quote:


> Originally Posted by *phenom01*
> 
> So I was just running some benchmarks(the new FF14 bench) on my card havnt really even used it yet maybe 2 hours of total non idle use...and I got a driver crash/system lock. Thought nothing of it restarted. Now my card will not even go fullscreen in any game or bench w/o instacrashing. It was acting a bit funny when I used it the few timespy and heaven runs I got it through giving cannot go fullscreen errors before this. But NOW the core clocks jump around from like 1400-2000 like all over the place instantly on boot of a game and I get a crash. Tried reinstalling drivers with DDU. Just reformatted and clean installed same issue. Tried it on my brother system and same problem. Like the voltage and core jump crazy fast from mid 1400-2000 all in between and poof hard system lock on black screen now within 1-5 seconds.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *edit* Im talking like voltages from .0665 to 1.093 and 1390ish to 2038 changing on the fraction of a second. and black screen it happens so fast I cant even get a grasp on the numbers flashing along with visual anomalies all over the place sparkles and lines and static. Then black.


RMA time, my good sir. sorry. but, yeah. that card said screw it.
Quote:


> Originally Posted by *Benny89*
> 
> Btw guys, which Thermal Grizzly Kryonaut did you use for your GPUs?
> 
> I have the grey one; should I get the silver (metal) one? Which is better and which do you use when you repaste a GPU?


there's only one Kryonaut. there's also Conductonaut and another older type. you want Kryonaut. the Conductonaut introduces danger to the mix and only gives 1-2C more. really only worth doing if you are going water cooled anyway.


----------



## DStealth

Tested a friend of mine's Asus Strix and it seems to be at least 2-3 bins worse an OC'er compared to my previous MSI GamingX. 1.062V 2050 seems the last stable bin... benchable at 2076/2063 with max voltage, lol... 2088 is a no-go at all... at least it's quite cool and silent








http://www.3dmark.com/3dm/19604256

Edit: Also the memory artifacts with 12Ghz...


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> RMA time, my good sir. sorry. but, yeah. that card said screw it.
> there's only one kryonaut. there's also conductanaut and another older type. you want kryonaut. the conductonaut introduces danger to the mix and only gives 1-2c more. really only worth doing if you are going water cooled anyway.


Aaa! Ok, then ignore my PM pls







. So there is a danger in using Conductonaut?! What sort of danger?

This guy here used Liquid Metal One on air and saw 10C drop in temps:






That is why I was thinking about Conductonaut


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Aaa! Ok, then ignore my PM pls
> 
> 
> 
> 
> 
> 
> 
> . So there is a danger in using Conductonaut?! What sort of danger?
> 
> This guy here used Liquid Metal One on air and saw 10C drop in temps:
> 
> 
> 
> 
> 
> 
> That is why I was thinking about Conductonaut


i got 8c on kryonaut. but that liquid metal if you spill even a drop can fry your card.


----------



## phenom01

Quote:


> Originally Posted by *Slackaveli*
> 
> RMA time, my good sir. sorry. but, yeah. that card said screw it.
> .


Gah, that's what I figured. Just did Newegg's automated replacement system. Dunno if I should go for a replacement or a refund and try for a different card. Wish I woulda waited a week more for Amazon to get it in stock so I didn't have to deal with Newegg's RMA service.


----------



## Slackaveli

To those wanting to know how to get the voltage display in RivaTuner: use HWiNFO64, go into its settings, and you can add any of its sensors to the OSD via RivaTuner. It links itself if you check its box.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> i got 8c on kryonaut. but that liquid metal if you spill even a drop can fry your card.


Ughhhh :/ I am too poor for that risk







Ok, thanks, I will use Kryonaut then. Thanks!


----------



## Slackaveli

Quote:


> Originally Posted by *phenom01*
> 
> Gah that's what I figured just did Neweggs automated replacement system. Dunno if I should go replace or refund and try for a different card. Wish I woulda waited a week more for Amazon to get it in stock so I didnt have to deal with neweggs RMA service.


just bad luck, i imagine. replacement should be fine. dang, no bueno, though!
Quote:


> Originally Posted by *Benny89*
> 
> Ughhhh :/ I am too poor for that risk
> 
> 
> 
> 
> 
> 
> 
> Ok, thanks, I will use Kryonaut then. Thanks!


yeah, man, same. i will open her up but i aint putting liquid metal in her. screw that. if i was going to do that i would just have shunt mod'd my old FE.


----------



## nrpeyton

*Incredible*; my max overclock is *BANG* on *2100 MHz* (not 1 bin faster)

2100 MHz @ 1093mV = stable
2113 MHz @ 1093mV = crash

2079 MHz @ 1050mV _(stock voltage limit)_ = stable
2092 MHz @ 1050mV _(stock voltage limit)_ = crash

Maximum memory O/C = 12,120 Mbps _(+556)_

The detail:
2088 MHz @ 1081mV
seemed to be my limit _before_ switching to overclocking via the voltage/frequency curve (she didn't want to go over 1081mV even though the slider was at 100%).

Time to try the curve....

*-*Pulled ALL curve points up by +210 simultaneously (which is _just below_ max non-curve O/C of + 220).
*-*Dragged the curve point for 1.093v up another 15 points (to 2100)

Card now boosts to 2100 at 1.093 without the need to lock voltage. When I hit power target during the bench (only three short 1/2 second bursts throughout) she downclocks to 2088 momentarily.

...before returning to a definite stable 2100 Mhz at 1.093v @ 65c & 260w avg. power draw.

2113 Mhz = immediate crash

Okay it was _only_ Heaven; but I don't have anything else that won't induce continuous power throttling.
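The curve method described in the post (shift every point up by a global offset, then drag the top voltage point a bit higher) can be sketched as a toy model. The stock-curve point values below are purely illustrative, not a dump of any real card's curve; only the offsets (+210 global, +15 extra at 1.093V, landing on 2100) come from the post:

```python
# Toy model of the Afterburner voltage/frequency-curve overclocking trick.
# Stock curve values are ILLUSTRATIVE, chosen so the pinned point lands
# on 2100 as in the post. Real Afterburner also snaps clocks to ~13 MHz bins.

stock_curve = {0.800: 1607, 0.900: 1733, 1.000: 1822,
               1.050: 1860, 1.093: 1875}   # V -> MHz

def apply_curve_oc(curve, global_offset, pin_voltage, pin_extra):
    """Shift all points by a global offset, then drag one point further up."""
    oc = {v: mhz + global_offset for v, mhz in curve.items()}
    oc[pin_voltage] += pin_extra
    return oc

# +210 on every point, then +15 more on the 1.093V point (as in the post)
oc_curve = apply_curve_oc(stock_curve, 210, 1.093, 15)
print(oc_curve[1.093])  # 2100 -> the card boosts here without locking voltage
```

The design point worth noting: because every point moves up, the card can still step down the curve (e.g. to 2088 at a lower voltage) when it hits the power target, instead of crashing at a single locked state.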


----------



## Chris Ihao

Quote:


> Originally Posted by *nrpeyton*
> 
> Or could try going hardcore with liquid metal. Might even get 8c ;-)
> Disclaimer: I'm not actually recommending this


Hehe. Dont worry, I wont


----------



## BBEG

Quote:


> Originally Posted by *phenom01*
> 
> Gah that's what I figured. Just did Neweggs automated replacement system. Dunno if I should go replace or refund and try for a different card. Wish I woulda waited a week more for Amazon to get it in stock so I didnt have to deal with neweggs RMA service.


Newegg CS, such as it is, turned me off them for good when they tried screwing me on my R9 290s. Went through four, all defective, and they wouldn't do a damn thing to return or refund. Had to get PayPal involved (they took care of me, thankfully), but after that I decided the Egg wasn't for me.

Which is fine. Amazon takes better care of us anyway.


----------



## DStealth

Quote:


> Originally Posted by *nrpeyton*
> 
> *Incredible*; ... at 2100 Mhz at 1.093v @ 65c & 260w avg. power draw.


You should definitely be drawing 350+W with those settings; how did you measure your consumption?


----------



## CoreyL4

Quote:


> Originally Posted by *nrpeyton*
> 
> *Incredible*; my max overclock is *BANG* on *2100 Mhz* (not 1 bin faster)
> 
> 2100 Mhz @ 1093mV = stable
> 2113 Mhz @ 1093mV = crash
> 
> 2079 Mhz @ 1050mV _(stock voltage limit)_ = stable
> 2092 Mhz @ 1050mV _(stock voltage limit)_ = crash
> 
> Maximum memory O/C = 12,120 Mbps _(+556)_
> 
> The detail:
> 2088 Mhz @ 1081mV
> seemed to be my limit when _before_ switching to overclocking using Voltage/Frequency curve. (she didn't want to go over 1081mV even though slider at 100%)
> 
> Time to try the curve....
> 
> *-*Pulled ALL curve points up by +210 simultaneously (which is _just below_ max non-curve O/C of + 220).
> *-*Dragged the curve point for 1.093v up another 15 points (to 2100)
> 
> Card now boosts to 2100 at 1.093 without the need to lock voltage. When I hit power target during the bench (only three short 1/2 second bursts throughout) she downclocks to 2088 momentarily.
> 
> ...before returning to a definite stable 2100 Mhz at 1.093v @ 65c & 260w avg. power draw.
> 
> 2113 Mhz = immediate crash
> 
> Okay it was _only_ Heaven; but I don't have anything else that won't induce continuous power throttling.


Did you overclock memory also?


----------



## Benny89

Quote:


> Originally Posted by *nrpeyton*
> 
> *Incredible*; my max overclock is *BANG* on *2100 Mhz* (not 1 bin faster)
> 
> 2100 Mhz @ 1093mV = stable
> 2113 Mhz @ 1093mV = crash
> 
> 2079 Mhz @ 1050mV _(stock voltage limit)_ = stable
> 2092 Mhz @ 1050mV _(stock voltage limit)_ = crash
> 
> Maximum memory O/C = 12,120 Mbps _(+556)_
> 
> The detail:
> 2088 Mhz @ 1081mV
> seemed to be my limit when _before_ switching to overclocking using Voltage/Frequency curve. (she didn't want to go over 1081mV even though slider at 100%)
> 
> Time to try the curve....
> 
> *-*Pulled ALL curve points up by +210 simultaneously (which is _just below_ max non-curve O/C of + 220).
> *-*Dragged the curve point for 1.093v up another 15 points (to 2100)
> 
> Card now boosts to 2100 at 1.093 without the need to lock voltage. When I hit power target during the bench (only three short 1/2 second bursts throughout) she downclocks to 2088 momentarily.
> 
> ...before returning to a definite stable 2100 Mhz at 1.093v @ 65c & 260w avg. power draw.
> 
> 2113 Mhz = immediate crash
> 
> Okay it was _only_ Heaven; but I don't have anything else that won't induce continuous power throttling.


Nice one if you can bench on that!









I can bench at 2088, because on air the card downclocks from 1.093V no matter what. But only in Heaven and Valley; in Superposition or TimeSpy it will go down to 2063-2075 anyway.

Are you on air? 2100 is really impressive on air.

Good chip you have there


----------



## nrpeyton

Quote:


> Originally Posted by *CoreyL4*
> 
> Did you overclock memory also?


yeah, max memory O/C = 12,120 _(or +556)_









So 2100MHZ core.
and 12k memory

I'm happy with that for 65c on air with FE cooler  

*Not* as good an overclocker as my 1080 Classy (it went to 2151 MHz and +675 mem on air).
But I'm still pleased; the bigger chips don't seem to overclock as well.

I'll happily trade 50 MHz on the core and 119 MHz on the memory for:

30% extra TI performance

Any day of the week ;-)

Quote:


> Originally Posted by *Benny89*
> 
> Nice one if you can bench on that!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can bench on 2088, because card on air downclocks from 1.093V no matter what. But only Heaven and Valley. At suppo or TimeSpy it will go down to 2063-2075 anyway.
> 
> Are you on air? 2100 is really impressive on air.
> 
> Good chip you have there


On air, yeah ;-)

thanks 

can't wait to see what she'll do with a shunt mod and chilled / 'slightly' sub-zero water









.
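For reference, the "+556" offset and the "12,120" figure line up because Afterburner's memory offset for GDDR5X applies to the double-data-rate clock, so each +1 MHz of offset adds 2 Mbps to the effective per-pin rate. A back-of-envelope sketch, assuming the 1080 Ti's published 11,008 Mbps stock effective rate and 352-bit bus:

```python
# Back-of-envelope memory OC math for a GTX 1080 Ti (GDDR5X, 352-bit bus).
# Assumes the published stock effective rate of 11,008 Mbps per pin.

STOCK_EFFECTIVE_MBPS = 11_008
BUS_WIDTH_BITS = 352

def effective_rate(offset_mhz):
    # Afterburner's offset applies to the DDR clock, so it counts double.
    return STOCK_EFFECTIVE_MBPS + 2 * offset_mhz

def bandwidth_gb_s(effective_mbps):
    # per-pin rate (Mbps) * bus width (bits) / 8 bits-per-byte -> GB/s
    return effective_mbps * BUS_WIDTH_BITS / 8 / 1000

rate = effective_rate(556)
print(rate)                             # 12120, matching the figure above
print(round(bandwidth_gb_s(rate), 1))   # 533.3 GB/s, up from ~484.4 stock
```

So the +556 offset is worth roughly a 10% bump in raw memory bandwidth, which is why cards that artifact near "12GHz" are already close to the practical GDDR5X ceiling discussed in this thread.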


----------



## phenom01

Quote:


> Originally Posted by *BBEG*
> 
> Newegg CS, such as it is, turned me off to them for good when they tried screwing me on my R9 290s. Went through four, alll defective, and they wouldn't do a damn thing to return or refund. Had to get PayPal involved (they took care of me, thankfully), but after that I decided the Egg wasn't for me.
> 
> Which is fine. Amazon takes better care of us anyway.


Real reassuring, since I did not use PayPal, and in the last 2 RMAs I did with them they tried to screw me out of my warranty (they used to be GREAT).









Guess this will teach me that my impatience isn't a virtue. We shall soon see. No more clouding the Club posts up til I get news.


----------



## DerComissar

Quote:


> Originally Posted by *nrpeyton*
> 
> *Incredible*; my max overclock is *BANG* on *2100 Mhz* (not 1 bin faster)
> 
> 
> Spoiler: Spoiler!
> 
> 
> 
> 2100 Mhz @ 1093mV = stable
> 2113 Mhz @ 1093mV = crash
> 
> 2079 Mhz @ 1050mV _(stock voltage limit)_ = stable
> 2092 Mhz @ 1050mV _(stock voltage limit)_ = crash
> 
> Maximum memory O/C = 12,120 Mbps _(+556)_
> 
> The detail:
> 2088 Mhz @ 1081mV
> seemed to be my limit when _before_ switching to overclocking using Voltage/Frequency curve. (she didn't want to go over 1081mV even though slider at 100%)
> 
> Time to try the curve....
> 
> *-*Pulled ALL curve points up by +210 simultaneously (which is _just below_ max non-curve O/C of + 220).
> *-*Dragged the curve point for 1.093v up another 15 points (to 2100)
> 
> Card now boosts to 2100 at 1.093 without the need to lock voltage. When I hit power target during the bench (only three short 1/2 second bursts throughout) she downclocks to 2088 momentarily.
> 
> ...before returning to a definite stable 2100 Mhz at 1.093v @ 65c & 260w avg. power draw.
> 
> 2113 Mhz = immediate crash
> 
> Okay it was _only_ Heaven; but I don't have anything else that won't induce continuous power throttling.


I bet you're glad that you took the shrink wrap off the box now, eh?









Congrats on getting a good one!









Enjoying reading your progress, very comprehensive and educational as well.
It will be interesting to see what you can accomplish with more exotic cooling, if that's your chosen route!


----------



## pez

Quote:


> Originally Posted by *phenom01*
> 
> Reassuring since I did not use paypal and the last 2 RMA's I did with them they tried to screw me out of my warranty(they used to be GREAT).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Guess this will teach me that my impatience isn't a virtue. We shall soon see. No more clouding the Club posts up til I get news.


Yeah, same for me here. If I had waited and gone through Amazon I'd be able to return my STRIX instead of reselling it. I guess I won't lose a ridiculous amount, but it's not ideal. I'd say it's worth paying tax at this point to avoid Newegg's policy of no refunds on GPUs.

I did use PayPal, but there's nothing wrong with the GPU, so I think I'm out of luck







.


----------



## Slackaveli

Quote:


> Originally Posted by *phenom01*
> 
> Reassuring since I did not use paypal and the last 2 RMA's I did with them they tried to screw me out of my warranty(they used to be GREAT).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Guess this will teach me that my impatience isn't a virtue. We shall soon see. No more clouding the Club posts up til I get news.


ALWAYS use PayPal. that way, no matter what, you are covered. They even reimburse for return shipping.

In this case, the product is too new for them to have accumulated a bunch of "refurbished" return cards. you'll probably get another brand new one. thing's only been out a couple weeks.
Quote:


> Originally Posted by *pez*
> 
> Yeah same for me here. If I had waited and gone through Amazon I'd be able to return my STRIX instead of reselling. I guess I won't lose a ridiculois amount, but it's not ideal. I'd say it's worth paying tax at this point to avoid Neweggs policy if no refunds on GPUs.
> 
> I did use PayPal, but there's nothing wrong with the GPU, so I think I'm out of luck
> 
> 
> 
> 
> 
> 
> 
> .


i know a guy. he fixes problems... lol. seriously, though. just send it for rma. say you uninstalled the drivers and now they won't reinstall. done.


----------



## pez

Quote:


> Originally Posted by *Slackaveli*
> 
> ALWAYS use PayPal. that way, no matter what, you are covered. They even reimburse for return shipping.
> 
> In this case, the product is too new for them to have accumulated a bunch of "refurbished" return cards. you'll probably get another brand new one. thing's only been out a couple weeks.
> i know a guy. he fixes problems... lol. seriously, though. just send it for rma. say you uninstalled the drivers and now they won't reinstall. done.


Yeah, but that would just prompt them to send a replacement and not a refund







.


----------



## BBEG

Quote:


> Originally Posted by *Slackaveli*
> 
> ALWAYS use PayPal. that way, no matter what, you are covered. They even reimburse for return shipping.
> 
> In this case, the product is too new for them to have accumulated a bunch of "refurbished" return cards. you'll probably get another brand new one. thing's only been out a couple weeks.


A lesson I've learned several times. I'm extremely grateful to PayPal for how well they've watched out for me over the years, even when otherwise reputable retailers drop the ball.

So I'm faced with a dilemma. My EVGA FE card is still NIB, in the plastic wrap and everything. I figure EVGA's hybrids are soon to launch. Come fall I won't have the OEM cooler on the card anymore since I'll be using the new Calyos chassis. Any thoughts on stepping up to the Hybrid version in the meanwhile? Worth the performance gains on a presumably better PCB when I switch to phase cooling? Think there will be resale value of the cooler by then?

Edit: got a response back from EVGA. Looks like they have no ETA on 1080 Ti hybrids and Hybrids are not available for the step up program.


----------



## c0nsistent

Well, I ziptied an Antec 620 onto my 1080 Ti FE and it seems to hold around 68 to 71C @ max load for extended periods of time. That isn't exactly great for water, but it's a tiny 120mm radiator and I'm using a shim with thermal paste as the TIM, so it's not the most ideal setup by any means. BUT it is a LOT quieter than the FE cooler, and even at 100% fan the FE cooler struggled to keep the GPU that low, so for now it'll work. I have an old Raystorm CPU block I might rig up with a full-size 240mm rad soon, once I order some better TIM.

I'll post photos of my ghetto rigging tomorrow.

I know it's probably blasphemy to see such shenanigans on a $700 GPU, but I'd personally rather save the $100-150 on watercooling a GPU and invest it into the horsepower itself. Instead of a 1080 + waterblock, I went with a 1080 Ti + cheapo AIO


----------



## Slackaveli

Quote:


> Originally Posted by *pez*
> 
> Yeah, but that would just prompt them to send a replacement and not a refund
> 
> 
> 
> 
> 
> 
> 
> .


yeah, true, but maybe a NIB replacement for maximum resell.


----------



## pez

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, true, but maybe a NIB replacement for maximum resell.


That's not a terrible idea, but I'm not that bent on it, and ya know...morals







.


----------



## feznz

Quote:


> Originally Posted by *Coopiklaani*
> 
> 
> 
> 
> 
> 
> 
> 
> EK just murdered my card.. There's more coolant outside the loop than what's left inside..
> Don't know if the crack happened before or after the installation. Thid Acrylic Glass is really really weak, I never had this problem before with Acetal waterblocks.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> ]


I had the same scenario happen to me. I used CRC Brakleen and a blow gun to be sure to get all the coolant from under the chips. I couldn't wait, so I tested the card within an hour; all sweet as.
*Use a blow gun to dry off the card and get any coolant that may be trapped.*
Quote:


> Originally Posted by *nrpeyton*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> yeah, max memory O/C = 12,120 _(or +556)_
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So 2100MHZ core.
> and 12k memory
> 
> I'm happy with that for 65c on air with FE cooler
> 
> *Not* as good an over-clocker as my 1080 Classy (it went to 2151 MHZ and + 675 mem on air)
> But I'm still pleased, the bigger cards don't overclock as well?
> 
> 
> 
> I'll happily trade 50 MHZ on the core and 119mhz on the memory for:
> 
> 30% extra TI performance
> 
> Any day of the week ;-)
> 
> 
> 
> On air, yeah ;-)
> 
> thanks
> 
> can't wait to see what she'll do with a shut mod and chilled / 'slightly' sub-zero water
> 
> 
> 
> 
> 
> 
> 
> 
> 
> .


Me too







I just got a feeling it may not be much more. I also got a feeling the pro overclockers are getting special BIOSes directly from the card manufacturers.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *DStealth*
> 
> Tested a friend of mine's Asus Strix and it seems to be at least 2-3 bins worse an OC'er compared to my previous MSI GamingX. 1.062v 2050 seems the last stable bin...benchable at 2076/2063 with max voltage lol...2088 is a no go at all...at least it's quite cold and silent
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/19604256
> 
> Edit: Also the memory artifacts with 12Ghz...


Epeen about your old card again I see. Yeah, just don't blow his card up.


----------



## Luckbad

Never one to leave well enough alone, I finally decided to mess with Afterburner curves and abandon Zotac FireStorm. Now that I actually get what's going on (by finally bothering to read, watch, and experiment for a few minutes), I discovered that I can indeed get my Zotac 1080 Ti Amp Extreme to 1.093 V, and I finally got it to run stable at 2100 MHz. Works perfectly in FurMark, Superposition, World of Warcraft, Overwatch, Heroes of the Storm, and Battlefield 1.

+Rep to @KedarWolf not only for maintaining the OP but posting a succinct video on the Afterburner curve.
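For anyone curious what the curve trick actually does: raising one point and flattening everything at or above its voltage is basically it. A rough Python sketch of the idea (the voltage/clock numbers here are made up for illustration, not my card's actual table):

```python
def lock_curve_point(curve, target_mv, target_mhz):
    """Return a new V/F curve locked at (target_mv, target_mhz).

    Every point at or above target_mv is flattened to target_mhz so the
    card never selects a higher voltage bin; lower points are clamped so
    the curve stays non-decreasing.
    """
    locked = {}
    for mv, mhz in sorted(curve.items()):
        if mv >= target_mv:
            locked[mv] = target_mhz
        else:
            locked[mv] = min(mhz, target_mhz)
    return locked

# Hypothetical stock curve: {voltage_mV: clock_MHz}
stock = {1000: 1911, 1031: 1949, 1062: 1987, 1093: 2012}
print(lock_curve_point(stock, 1093, 2100))  # every bin >= 1093mV now at 2100MHz
```

The card then never boosts past the locked voltage, which is why you end up with one fixed clock/voltage pair instead of boost bouncing around.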


----------



## DStealth

Quote:


> Originally Posted by *SlimJ87D*
> 
> Epeen about your old card again I see. Yeah, just don't blow his card up.


Do you have any new bench results @"2152Mhz"


----------



## reflex75

Quote:


> Originally Posted by *Luckbad*
> 
> Never one to leave well enough alone, I finally decided to mess with Afterburner curves and abandon Zotac FireStorm. Now that I actually get what's going on (by ever bothering to read, watch, or try to use them for a few minutes), I discovered that I can indeed get my Zotac 1080 Ti Amp Extreme to 1.093 V and I finally got it to run stable at 2100 MHz. Works perfectly in FurMark, Superposition, World of Warcraft, Overwatch, Heroes of the Storm, and Battlefield 1.
> 
> +Rep to @KedarWolf not only for maintaining the OP but posting a succinct video on the Afterburner curve.


Nice








How is your curve?


----------



## TheBoom

So just when I thought I had the curve nicely set up and working as intended in ME:A and Heaven/Supo I decided to give FS Ultra a go and the card goes bat s**t crazy with clocks and voltage jumping all over the place lol.

It's as if the curve is no different from a manual oc in Fs Ultra.


----------



## trawetSluaP

If I RMA my card, can I reuse the thermal strips or do I need to get new ones for an EK Block?


----------



## Nico67

Quote:


> Originally Posted by *trawetSluaP*
> 
> If I RMA my card, can I reuse the thermal strips or do I need to get new ones for an EK Block?


If you're careful you should be OK to reuse them; just try to avoid touching them with your fingers. Use clean tweezers or something similar.


----------



## KickAssCop

I licked my thermal strips and still they were reusable.


----------



## feznz

did they taste good enough to eat?


----------



## Nico67

Quote:


> Originally Posted by *KickAssCop*
> 
> I licked my thermal strips and still they were reusable.


Might help them to stick better too, but not sure I'd recommend it


----------



## nrpeyton

Quote:


> Originally Posted by *DStealth*
> 
> You should be definitely drawing 350+W with such settings, how did you measure you consumption ?


In most games or benchmarks you're absolutely right.

But Valley benchmark isn't very hungry. ;-)

I chose it especially; to have *some* sort of possible way to test max O/C with minimal power throttling.

Not sure 2100 @ 1.093v will hold up in a more intense game/benchmark that draws 350W+ like you say; however by that time I'll be under water _(or chilled water if I reconnect my 'sub-zero modified' water chiller back into my loop)._
I figured if I can do Valley on air at 2100Mhz / 1093mv with a power draw of about 270w - 300w, I should _at least_ manage the same _(or better)_ at 350w in a more modern benchmark after I go water..... (hope that makes sense).

In answer to your Q: I was measuring using HWiNFO64 on a non-shunt-modded card (well, not shunt modded _*yet*_) lol

Nick
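
Quick back-of-envelope on why I think the headroom works out: dynamic power scales roughly with f x V^2, so you can sanity-check what a different point on the curve should draw. Python sketch (the 285W is just the midpoint of my 270-300W HWiNFO64 readings, and this ignores leakage and memory power entirely):

```python
def scale_power(p_watts, f1_mhz, v1, f2_mhz, v2):
    """Rough dynamic-power scaling: P ~ f * V^2.
    Ignores leakage and memory power, so it's a ballpark only."""
    return p_watts * (f2_mhz / f1_mhz) * (v2 / v1) ** 2

# ~285W at 2100MHz / 1.093v in Valley; what should a lower bin cost?
print(round(scale_power(285, 2100, 1.093, 2025, 1.043)))  # ~250W
```

The missing piece is the activity factor: a heavier benchmark keeps more of the chip busy at the same clock/voltage, which is where the extra 50W+ comes from. Same point on the curve, more of the silicon switching.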

Quote:


> Originally Posted by *DerComissar*
> 
> I bet you're glad that you took the shrink wrap off the box now, eh?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats on getting a good one!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Enjoying reading your progress, very comprehensive and educational as well.
> It will be interesting to see what you can accomplish with more exotic cooling, if that's your chosen route!


Funny you say that, because I *am* actually glad lol.









And thanks; will keep posting / keep u posted ;-)

P.S.
I've hardly even thought about gaming much tbh yet; instead I feel more like my card is a new toy to tinker with, lol. _(and there's something about reference design that I find interesting; as it's a 1st for me)._

.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *DStealth*
> 
> Do you have any new bench results @"2152Mhz"


I've just been gaming as of late. 2152 didn't hold in actual gaming. I don't know anyone who can sustain a clock that high in Witcher 3 for more than an hour.


----------



## Gandyman

Hey guys

First off, just want to say that I have always had heaps of help here from people much more knowledgeable than myself, and it's greatly appreciated.

I got my second strix 1080ti in the post yesterday (first one had no video signal out at all) and I'm having issues I've never experienced with a GPU before, any insight or others experience would be appreciated.

First thing I noticed was the coil whine. Not just a light hum or buzz or squeal when a load screen or map makes the fps skyrocket; I'm talking about a horrid grinding sound like someone using a chainsaw, under any load that isn't just the desktop. My wife thought someone was mowing the lawn outside my office window. I can't state enough how very, incredibly LOUD it is.... but not always. After about 2 days of trying to benchmark and play games, this is the pattern: I start a program, usually no noise off the bat. I play for a few minutes, get phenomenal frames (even coming from a 1080), and then anywhere from 1-20 minutes into the program, whether it be a game or synthetic bench, the whine will start up. And yes, I can go to the map in Witcher 3 and get NO coil sound at all until it seems to reach some critical mass. Then the sound will start: the nice gentle hum of normal coil whine first, then rapidly over a few seconds turning into a horrible grinding sound, like someone missing the clutch before shifting gears. Enough to make me jump. Then the program will crash. Depending on the game, it's a different type of crash. Overwatch crashes to desktop with a 'send crash report to Blizzard' window. ME: Andromeda just plain crashes. Wildlands just freezes (but a ctrl-alt-del to task manager closes it fine). The grinding sound will continue to get louder and louder after the crash for a few seconds, then fade away to silence once back on the desktop.

My system is an R5E 10th ed, 5930K @ 4.2 (1.1v), 4x4 3000MHz Dominator Platinum, 960 Pro NVMe, HX750i Platinum PSU.

The scammy e-tailer I bought the card from is saying coil whine is a natural byproduct of fast GPUs and they won't accept a return. They also suggested my PSU is at fault. I've told them my 1080 Hybrid works fine, and my 980 Ti Strix (which as far as I know has a higher PSU requirement), which usually lives in my wife's PC but is in my rig at the time of writing, works fine too. They assure me it's a fault with my PSU, that the drivers and the game devs are causing the crashes, and that coil whine is normal. I will be reading this thread to see if anyone has had any similar issues when I can get time, but has anyone else had similar experiences, or can shed some light or insight as to causes?

Appreciate any help,

Brendan


----------



## PCBUILDER1980

Faulty card! I had the same problem and I returned and the new one works fine.


----------



## Gandyman

Quote:


> Originally Posted by *PCBUILDER1980*
> 
> Faulty card! I had the same problem and I returned and the new one works fine.


Thanks for your reply man! You had a 1080ti with exact same problem? Geez sorry that happened to you but it legit makes me feel better someone else had the same problem and I'm not going crazy lol.


----------



## TheBoom

I thought my Amp Extreme had no issues but I was wrong.

I didn't notice until now that my fans were constantly spinning even though GPU-Z, MSI AB and Firestorm all show that the fan is at 0%.

And this is at 45c idle temps. Custom curves do nothing same as auto or manual.

Anyone ever encounter something like this before?

I really hope it's a software issue tho I'd hate to have to RMA again.


----------



## Luckbad

@TheBoom See my post below. Grab Zotac FireStorm and disable the fan freeze crap so you can actually control the card with custom curves.
Quote:


> Originally Posted by *Luckbad*
> 
> Okay, finally got this thing figured out. There's a hard-to-find arrow that brings you to a place where you can disable the 0% fan speed functionality.
> 
> You can only do this in Zotac Firestorm, which you'll need for this as well as changing or turning off the LEDs on the physical card. Note also that I've only been able to adjust voltage for the card via Firestorm. Doing the standard tricks in Afterburner have yielded nothing (yep, even editing the oem2 file and unlocking it in settings).
> 
> To solve the issue
> 1) Click "Spectra"
> 2) Click the little down arrow on the right side of that UI
> 3) Disable the Fan Stop Setting (aka FREEZE Fan Stop)
> 
> Now, you can manipulate the fan speeds in your preferred software without getting the blast of 100% fan speed every time the card exceeds 45 C.


Quote:


> Originally Posted by *TheBoom*
> 
> I thought my Amp Extreme had no issues but I was wrong.
> 
> I didn't notice until now that my fans were constantly spinning even though GPU-Z, MSI AB and Firestorm all show that the fan is at 0%.
> 
> And this is at 45c idle temps. Custom curves do nothing same as auto or manual.
> 
> Anyone ever encounter something like this before?
> 
> I really hope it's a software issue tho I'd hate to have to RMA again.


----------



## Luckbad

@Gandyman

What you're describing is not coil whine, it's broken fans.


----------



## Benny89

OK, so confirmed: the crashing in Witcher 3 is a matter of temperature.

Yesterday evening, with the window open (it was cold in the room) and 55% fan speed, the card didn't break 60C even once. Played for 5 hours, no problem at 2050MHz. Today I repasted it with Kryonaut and was playing during the day (so not even close to how cold it was in the evening). Temps went up to 62C, and as soon as it hit 63 the game froze and crashed. Not even an hour of playing...

Also seems like I didn't get much gain from repasting with Kryonaut... like 3C I think. Asus's factory paste job was really good though.

So as long as I can keep it under 61C I can play Witcher 3 at 2050 or 2038, basically above stock voltage.

Well, that is a little disappointing, but at least I know where the problem is....

Any news about EVGA FTW HYBRID 1080 Ti Card?


----------



## Gandyman

Hey, thanks for the reply! I also thought that, because the sound was so mechanical, but turning the fans up or down manually, and even stopping them with my finger, didn't change the noise at all. And the temperatures have never gone over 74C, so that can't be causing the crashes either =/ I mean I most certainly could be wrong; is there any way I can test it more definitively?


----------



## reflex75

Quote:


> Originally Posted by *TheBoom*
> 
> I thought my Amp Extreme had no issues but I was wrong.
> 
> I didn't notice until now that my fans were constantly spinning even though GPU-Z, MSI AB and Firestorm all show that the fan is at 0%.
> 
> And this is at 45c idle temps. Custom curves do nothing same as auto or manual.
> 
> Anyone ever encounter something like this before?
> 
> I really hope it's a software issue tho I'd hate to have to RMA again.


I also have a Zotac Extreme and I've also observed some strange behaviour with its automatic fan control.
But I don't think it's a hardware fault.
From what I've noticed, the fans start spinning at 50°C and are locked at 850rpm from 0% to 30%,
thus preventing stop/start cycling at low %.
Then they increase with the % (above 30%) as the temperature rises, as usual.
After the load, the temp decreases and the fans stop at 40°C.
And you don't have control during idle: even if you try to set 100%, the fans won't move.
But this is not the worst of it, because once, during a benchmark, I noticed the fan rpm was at 0 while the temperature was 75°!
I checked, and indeed they had stopped and forgotten to start!
It can be really dangerous if you don't notice it!
I think Zotac has tried to do something smart to prevent start/stop behaviour at low load, but has done something strange and buggy that I've never seen before...
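
The start-at-50°C / stop-at-40°C pattern with an 850rpm floor is classic hysteresis control, just with no manual override. A toy Python model of the behaviour I'm seeing (the thresholds and floor rpm are my own observations, so treat them as guesses):

```python
class ZotacFanModel:
    """Toy model of the start/stop fan behaviour observed on the card:
    fans kick on at 50C, won't drop below ~850rpm while spinning, and
    only stop again once the card cools to 40C (hysteresis)."""

    START_C, STOP_C, FLOOR_RPM = 50, 40, 850

    def __init__(self):
        self.spinning = False

    def rpm(self, temp_c, demanded_rpm):
        # On/off decision is temperature-only; the user's demanded rpm
        # is ignored for start/stop and floored while spinning.
        if not self.spinning and temp_c >= self.START_C:
            self.spinning = True
        elif self.spinning and temp_c <= self.STOP_C:
            self.spinning = False
        if not self.spinning:
            return 0
        return max(self.FLOOR_RPM, demanded_rpm)

fan = ZotacFanModel()
print(fan.rpm(45, 0))    # 0 - still below the 50C start point
print(fan.rpm(55, 600))  # 850 - on, clamped to the floor
print(fan.rpm(45, 0))    # 850 - hysteresis: won't stop until 40C
print(fan.rpm(39, 0))    # 0 - cooled past the stop point
```

If the firmware really works like this, it would explain why setting 0% (or even 100%) at idle does nothing: the controller only looks at temperature for the on/off decision.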


----------



## reflex75

Quote:


> Originally Posted by *Luckbad*
> 
> @TheBoom See my post below. Grab Zotac FireStorm and disable the fan freeze crap so you can actually control the card with custom curves.


Thanks!
I've switched this setting to OFF but I still don't have total control over the fans.
They are spinning at a constant 850rpm at idle, even if I set 0%!
They are still "locked" at 850rpm below 30%.
I've never seen such behaviour with any of my previous cards...


----------



## Benny89

*Ok, I have had enough of these cards. I have no idea how they work, why they work, why they don't, and when...*

*SAME Afterburner settings, same curve:*

1. Yesterday: played for 5 hours straight, temps 58-61C, fan speed 55%. Didn't go below 2050MHz on the clock in Witcher 3 at all.
2. Today: played for an hour, temp 63C, crash. OK, well, seems like temp is the key.
3. Played with a higher fan speed (70%) to keep temps lower (after a reboot ofc). Temps didn't break 57C. Downclocked to 2038 (same settings and curve!).
4. I think to myself: OK, maybe a higher fan speed causes more power consumption (didn't see any perfcap or higher TDP reading, but whatever) and the clocks drop.
5. OK, back to 55% fan speed (same as pt 1) to confirm my theory. Card didn't even break 58C. Downclocked to 2025.....

EVERYTHING ON THE SAME CURVE AND AB SETTINGS!!

I give up, I have had enough. I have no idea how these stupid 1080 Ti chips work...
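
Edit: after cooling off (heh) and some reading, I think this is GPU Boost 3.0, not the curve: on top of whatever curve you set, the boost algorithm drops the core roughly one 13MHz bin each time it crosses an internal temperature threshold. A toy Python model of the idea (the threshold temps here are guesses for illustration; the real table lives in the vBIOS):

```python
BIN_MHZ = 13  # Pascal boost steps are roughly 13MHz apart

def boost_clock(curve_max_mhz, temp_c, thresholds=(37, 46, 54, 60, 63)):
    """Drop one ~13MHz bin per temperature threshold crossed.
    Thresholds are illustrative guesses, not NVIDIA's actual table."""
    bins_dropped = sum(1 for t in thresholds if temp_c >= t)
    return curve_max_mhz - bins_dropped * BIN_MHZ

for temp in (55, 58, 61):
    print(temp, boost_clock(2088, temp))  # 55/58 -> 2049, 61 -> 2036
```

So the same curve really can land on different clocks day to day; a couple of degrees of ambient is enough to cross a threshold and lose a bin.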


----------



## Savatage79

Quote:


> Originally Posted by *Slackaveli*
> 
> im tellin you, try to turn off HEDT in bios.


Tried it last night, no go man. I still get a slight micro stutter in my games. Like, I tested Witcher out more last night, and when in the field it's about 95% smooth, but there are a few occasional hiccups. That's not too bad, however; then I ran into Novigrad and it's like the card is truly struggling to keep up with textures or something. It's not the worst stuttering I've ever seen, but it's there, know what I mean?

It can go for stretches where there's no stuttering, but then it'll eventually happen: you turn the camera and it hangs for a split second, though it might still be hard for a non-PC gamer to notice.

I'm just really disappointed... because the temps and clocks all hold nicely but I still get the stutters. 60 is locked in every game and I'm always over 2038 in games for stretches... but that micro stutter drives me nuts.

What makes me mad is this is the first time I jumped from EVGA, so it obviously makes me dislike Gigabyte GPUs now even more, which sucks because I hate being that way. It would just figure I get a GPU that looks to run well, but when you're playing it just hangs a bit.

Unless... do these cards still have slight issues pushing max 4K, and a little stuttering here and there is to be expected? Or should I have zero stuttering outside maybe once or twice a session in well-optimized games?


----------



## KraxKill

Quote:


> Originally Posted by *Gandyman*
> 
> Hey thanks for reply! I also thought that because the sound was so mechanical, but turning the fans up or down manually and even stopping them with my finger didn't edit the noise at all, and the temperatures have never gone over 74c so that can't be causing the crashes too =/ I mean I most certainly could be wrong is there any way I can test it more definitively?


What mode of payment did you use for the card? If PayPal, you should ask the retailer whether they would rather deal with a PayPal dispute or simply take the card back.

Another option would be to dispute it with the credit card company and see how they handle it.

You could also buy another card from the scammy place and return the old faulty card as the one you had just purchased. Not sure how, or if, they track the serials they sell, but I wouldn't think twice about doing this to places that refuse returns for what THEY perceive is a non-issue. Not in 2017. Not on a product worth $700+. Places like this make me sick, and if it was me, I would simply say the card is broken and doesn't work. It wouldn't be past me to break the card on purpose by clipping a power connector to prove to them that it is in fact broken.

Where exactly did you buy the card, so we know never to buy there? What is their policy?

The best bet may be to RMA the card with the manufacturer, if you can get them to agree to an RMA. It may take a lot of explaining and proving, but ultimately you would get a factory replacement. If they have a cross-shipping option, that may be the best bet: you pay for the card they ship, but get your money back when they receive the faulty card.


----------



## Foxrun

Quote:


> Originally Posted by *Benny89*
> 
> *Ok, I have enough of those cards. I have no idea how do they work and why they do work and why they don't and when...*
> *
> SAME Afterburner Settings, same curv*e:
> 
> 1. Yesterday- played for 5 hours straight, temps 58-61. Fan speed 55%- Didn't go below 2050mhz on clock in Witcher 3 at all.
> 2. Today- played for an hour, Temp 63- crash. Ok, well, seems like temp is the key.
> 3. Played with higher fan speed- 70% to keep temps lower (after reboot ofc)- Clocks didn't break 57C. Downclocked to 2038 (same settings and curve!).
> 4. I think to myself- ok, maybe higher fan speed causes more power consumption (didn't get any prefs or higher TDP but whatever) and clocks downclock.
> 5. Ok, back to 55% Fan speed (same as pt 1 ) to confirm my theory. Card didn't yet break 58C- downclock to 2025.....
> 
> EVERYTHING ON SAME CURVE AND AB SETTINGS!!
> 
> I give up, I have enough. I have no idea how those stupid 1080Ti chips work...


I've given up on the curve. I reverted back to the old method of cranking the voltage and power slider and then adjusting clocks that way. Since then I haven't had any stability issues reported in GPU-Z, and gaming feels smoother. My Time Spy scores have also increased. I got 2038/6000 stable with 1.093 max on one card and 1.062 on the other. Never doing curves again; it's almost impossible to get a curve to behave well in SLI when the cards are different.


----------



## TheBoom

Quote:


> Originally Posted by *Luckbad*
> 
> @TheBoom See my post below. Grab Zotac FireStorm and disable the fan freeze crap so you can actually control the card with custom curves.


Thanks for this.

+rep
Quote:


> Originally Posted by *Foxrun*
> 
> I've given up on the curve. I reverted back to the old method of cranking the voltage and power slider, and then adjusting clocks that way. Since then I haven't had any stability issues reported in gpu z and gaming feels smoother. My timespy scores have also increased. I got 2038/6000 stable with 1.093 max on one card and 1.062 on the other. Never doing the curves again, it's almost impossible to get a curve to behave well in SLI when the cards are different.


Exactly what I thought. I'm back to good ol manual oc too.

Ok no go on the fans. They're still stuck at 870 RPM on idle. Tried setting manual and curve in both Firestorm and AB and nothing.


----------



## KraxKill

Quote:


> Originally Posted by *Foxrun*
> 
> I've given up on the curve. I reverted back to the old method of cranking the voltage and power slider, and then adjusting clocks that way. Since then I haven't had any stability issues reported in gpu z and gaming feels smoother. My timespy scores have also increased. I got 2038/6000 stable with 1.093 max on one card and 1.062 on the other. Never doing the curves again, it's almost impossible to get a curve to behave well in SLI when the cards are different.


I only use the curve to control my top desired voltage bin, if the voltage the card is pulling is a bin below where I want to be.

For example, my card, even with the voltage slider all the way to the right (Afterburner beta) at +100%, wants to run 2100 at 1.040-1.050v and is reluctant to select the 1.063v bin when temps are low. Thus, I pull my 1.063v bin down to meet 2100 in order to be able to use 1.063v there.

As far as the rest of the curve goes, it seems the less you touch it the better. I would also re-check the curve after a reboot to make sure the desired max bin is still being selected appropriately. Afterburner tends to move the curve a fraction on reboot, at least sometimes, from what I've noticed.
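
If "pulling the bin down" sounds abstract, in curve terms it's just dragging one voltage point down to the clock you want so that bin gets selected. Rough Python sketch (the voltages/clocks are illustrative, not my exact table):

```python
def pull_bin_to(curve, mv, mhz):
    """Drag one voltage point down to a target clock, then clamp
    lower-voltage points so the curve stays non-decreasing (roughly
    what Afterburner enforces when you drop a single point)."""
    adjusted = dict(curve)
    adjusted[mv] = mhz
    for v in adjusted:
        if v < mv:
            adjusted[v] = min(adjusted[v], mhz)
    return adjusted

# Made-up offset curve: {voltage_mV: clock_MHz}
offset_curve = {1000: 2050, 1031: 2088, 1063: 2126, 1093: 2151}
print(pull_bin_to(offset_curve, 1063, 2100))  # 1063mV point now at 2100MHz
```

With the 1063mV point sitting exactly at 2100, boost can hit that clock at that voltage instead of stopping a bin short.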


----------



## Slackaveli

Quote:


> Originally Posted by *pez*
> 
> That's not a terrible idea, but I'm not that bent on it, and ya know...morals
> 
> 
> 
> 
> 
> 
> 
> .


well, yeah. I'm not 100% sure what issue you have w/ your card. If it's just clocks (and you can get ~2GHz), then, yeah, I wouldn't do it personally. I try not to abuse the system, so it's there for me when I do need it. Caveat: companies like Newegg exchange for the SAME item only. They could at least do a store credit, where if you don't like the model you could add a few dollars and try a better one. Them limiting it to the same model kind of excuses the return for bad clocks morally in my eyes, ya know? Shady is as shady does. In other words, if all I'm getting on a return is another same brand/model GPU, it better be able to do at least 2GHz and not crash. And it better not have coil whine!


----------



## TheBoom

Quote:


> Originally Posted by *reflex75*
> 
> Thanks!
> I've switched this setting to OFF but I still don't have total control on the fans.
> They are spinning at a constant 850rmp during idle, even if I choose to set 0%!
> They are still "locked" below 30% at 850rmp.
> I've never seen such behaviour with all my previous cards...


Did you manage to fix this?

I'm afraid it might be a bios implementation after all. Might need to contact Zotac about it on Monday.


----------



## GRABibus

Quote:


> Originally Posted by *Benny89*
> 
> *Ok, I have enough of those cards. I have no idea how do they work and why they do work and why they don't and when...
> I give up, I have enough. I have no idea how those stupid 1080Ti chips work...
> 
> 
> 
> 
> 
> 
> 
> *


They do what they want


----------



## Slackaveli

Quote:


> Originally Posted by *Gandyman*
> 
> Hey guys
> 
> First off, just want to say that I have always had heaps of help here from people much more knowledgeable then myself and it's greatly appreciated.
> 
> I got my second strix 1080ti in the post yesterday (first one had no video signal out at all) and I'm having issues I've never experienced with a GPU before, any insight or others experience would be appreciated.
> 
> First thing I noticed was the coil whine, not just a light hum or buzz or squeal when a load screen or map makes the fps skyrocket, I'm talking about a horrid grinding sound that sounds like someone using a chain saw, under any load that isn't just the desktop. My wife thought someone was mowing the lawn outside my office window. I can't state enough how its very, incredibly LOUD.... but not always. After about 2 days of trying to benchmark and play games this is the pattern. I start a program, usually no noise off the bat. I play for a few minutes, get phenomenal frames (even coming from a 1080), and then anywhere from 1 - 20 minutes into the program, whether it be game or synthetic bench, the whine will start up. And yes I can go to map on witcher 3 and get NO COIL sound at all untill it seems to reach some critical mass. Then the sound will start, the nice gentle hum of normal coil whine first, then rapidly over a few seconds turning into a horrible grinding sound, like someone missing the clutch before shifting gears. Enough to make me jump. Then the program will crash. Depending on the game different type of crash. Overwatch crashes to desktop with a 'send crash report to blizzard' window. ME: Andromeda just plain crashes. Wildlands just freezes (but with a ctrl alt del to task manager closes fine) The grindign sound will continue to get louder and louder after the crash for a few seconds, then fade away to silence once back on desktop.
> 
> My system is the a RV:E 10th ed, 5930k @ 4.2 (1.1v) 4v4 3k dom plat, 960 pro NVMe, hx750i plat psu.
> 
> My scammy e-tailer who I bought the card off are saying coil whine is natural byproduct of fast gpus and they wont accept a return. They also suggested that my PSU is at fault. I've told them that my 1080 Hybrid works fine, my 980ti strix (which as far as I know has a higher PUS requirement) which is in my wife's PC usually but I am using in my rig at the time of writing works fine. They assure me its a fault with the my PSU and the drivers and the game devs causing crashes and coil whine is normal. I will be reading this thread to see if anyone has had any similar issues when I can get time, but has anyone else had similar experiences or can shed some light or insight as to causes?
> 
> Appreciate any help,
> 
> Brendan


Card is broken. The initial coil whine is natural; CRASHING and GRINDING is not. Call that e-tailer and demand a manager, stat. Otherwise, contact your credit card company or PayPal, stat.


----------



## Slackaveli

Quote:


> Originally Posted by *Savatage79*
> 
> Tried it last night, no go man. I still get a slight micro stutter in my games. Like I tested witcher out more last night and when in the field it's about 95% smooth, but there are a few occasionsal hiccups. That's not to bad however until I ran into novigrad and it'd like the card is truly struggling to keep up with textures or something. It's not the worst stuttering I've ever seen been it's there, know what I mean?
> 
> It can go for stretches where there's no stuttering, but then it'll eventually happen and you turn the camera and it hangs for a split second but still might be hard for a non Pc gamer to notice.
> 
> I'm just really disappointed.. Because the Temps,, clocks all hold nicely but I still get the stutters. 60 is locked every game and I'm always over 2038 in games for stretches..but that micro stutter drives me nuts.
> 
> What makes me mad is this is the first time I jumped from evga, so it obviously makes me dislike gigabyte gpus now even more which sucks because I hate being that way. It would just figure I get a gpu that looks go run well but when you're playing it just hangs a bit.
> 
> Unless, do these cards still have slight issues pushing max 4k and a little stuttering here and there is to be expected? Or should I have zero stuttering outside maybe once or twice a session in well optimized games?


man, idk, maybe RMA it, say it is constantly crashing at stock clocks, and try again. I don't think that's normal at all. But it's hard for me to say b/c I am on a 5775C, the smoothest gaming CPU there is. I probably have a lot of stutter resistance naturally because my cache is L3+L4 = 134MB vs other i7s' 8MB... so, what, almost 17 TIMES the cache.


----------



## Savatage79

Quote:


> Originally Posted by *Slackaveli*
> 
> man, idk, maybe rma it and say it is constantly crashing at stock clocks and try again. I dont think that's normal at all. But it's hard for me to say b/c I am on a 5775-c, the smoothest gaming cpu there is. I probably have a lot of stutter resistance naturally because my cache is L3+L4= 134mb vs all other i-7s (8mb)... so, what, 17 TIMES (1700% gain) more cache.


Do you think a 4770 could possibly hold this gpu back a bit?


----------



## nrpeyton

Quote:


> Originally Posted by *Gandyman*
> 
> Hey guys
> 
> First off, just want to say that I have always had heaps of help here from people much more knowledgeable then myself and it's greatly appreciated.
> 
> I got my second strix 1080ti in the post yesterday (first one had no video signal out at all) and I'm having issues I've never experienced with a GPU before, any insight or others experience would be appreciated.
> 
> First thing I noticed was the coil whine, not just a light hum or buzz or squeal when a load screen or map makes the fps skyrocket, I'm talking about a horrid grinding sound that sounds like someone using a chain saw, under any load that isn't just the desktop. My wife thought someone was mowing the lawn outside my office window. I can't state enough how its very, incredibly LOUD.... but not always. After about 2 days of trying to benchmark and play games this is the pattern. I start a program, usually no noise off the bat. I play for a few minutes, get phenomenal frames (even coming from a 1080), and then anywhere from 1 - 20 minutes into the program, whether it be game or synthetic bench, the whine will start up. And yes I can go to map on witcher 3 and get NO COIL sound at all untill it seems to reach some critical mass. Then the sound will start, the nice gentle hum of normal coil whine first, then rapidly over a few seconds turning into a horrible grinding sound, like someone missing the clutch before shifting gears. Enough to make me jump. Then the program will crash. Depending on the game different type of crash. Overwatch crashes to desktop with a 'send crash report to blizzard' window. ME: Andromeda just plain crashes. Wildlands just freezes (but with a ctrl alt del to task manager closes fine) The grindign sound will continue to get louder and louder after the crash for a few seconds, then fade away to silence once back on desktop.
> 
> My system is the a RV:E 10th ed, 5930k @ 4.2 (1.1v) 4v4 3k dom plat, 960 pro NVMe, hx750i plat psu.
> 
> My scammy e-tailer who I bought the card off are saying coil whine is natural byproduct of fast gpus and they wont accept a return. They also suggested that my PSU is at fault. I've told them that my 1080 Hybrid works fine, my 980ti strix (which as far as I know has a higher PUS requirement) which is in my wife's PC usually but I am using in my rig at the time of writing works fine. They assure me its a fault with the my PSU and the drivers and the game devs causing crashes and coil whine is normal. I will be reading this thread to see if anyone has had any similar issues when I can get time, but has anyone else had similar experiences or can shed some light or insight as to causes?
> 
> Appreciate any help,
> 
> Brendan


Which 1080TI do you have?

Have you tried your 1080 Ti in your wife's computer (more to the point, on a different PSU)?

I've heard everything you're saying (just trying to rule stuff out first).


----------



## TheBoom

Quote:


> Originally Posted by *Savatage79*
> 
> Do you think a 4770 could possibly hold this gpu back a bit?


Pushing high frames with lower resolutions yes.

Otherwise no.

Only game I've seen my 6700K hit 100% in is ME: Andromeda, and only briefly during loading. Other times it's pushing out 70-90 fps in the Havarl open jungle with around 80% CPU usage.

4770k shouldn't be too far off in terms of performance.


----------



## foolycooly

Quote:


> Originally Posted by *Slackaveli*
> 
> YO, I CALLED that.


Quote:


> Originally Posted by *RadActiveLobstr*
> 
> It's not just DVI, it's having more than 2 monitors. I have the same issue with all three using DP and now two with DP and one HDMI (as a DP on my one monitor has died).
> 
> It's an issue that's been pointed out to Nvidia for a long time and it's gone from "We are aware of it and will work on a fix" to now "It's working as intended".


Quote:


> Originally Posted by *MURDoctrine*
> 
> Is this an issue with above 60hz displays because I run 3 60hz panels and downclock fine?


Looks like RadActiveLobstr was correct--I received my third DP cable today and all 3 monitors are running on DP. When I only have my main 144Hz and one side 60Hz panel plugged in, the card downclocks properly to 215MHz. When I plug the 2nd 60Hz side panel in (3 total) with DP, the card doesn't downclock below ~1500MHz.

Guess I'll just have to deal with some fan noise at idle. Maybe there will be a fix someday.
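For what it's worth, the usual explanation for this behaviour is that with a third display attached, the combined pixel clock gets too high for the card to service from its lowest memory p-state, so it holds boost clocks instead. A rough back-of-envelope sketch (the resolutions and the ~25% blanking overhead here are assumptions for illustration, not anyone's actual panels):

```python
# Rough per-display pixel-clock estimate. The 1.25 factor approximates
# CVT-style blanking overhead; real figures come from each monitor's EDID.
def pixel_clock_mhz(width, height, refresh_hz, blanking=1.25):
    return width * height * refresh_hz * blanking / 1e6

# Hypothetical three-monitor setup similar to the one described above.
displays = [
    (2560, 1440, 144),  # main 144Hz panel
    (1920, 1080, 60),   # side panel 1
    (1920, 1080, 60),   # side panel 2
]

total = sum(pixel_clock_mhz(*d) for d in displays)
print(f"combined pixel clock: ~{total:.0f} MHz")
```

Unplugging the third panel takes roughly 155 MHz off that total, which (per this common explanation) can be the difference between the low memory p-state being viable or not. Whether that is actually what's blocking the downclock here is speculation.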


----------



## Savatage79

Quote:


> Originally Posted by *TheBoom*
> 
> Pushing high frames at lower resolutions, yes.
> 
> Otherwise, no.
> 
> The only game I've seen my 6700K hit 100% in is ME: Andromeda, but only briefly during loading. Otherwise it's pushing out 70-90 fps in the Havarl open jungle with around 80% CPU usage.
> 
> A 4770K shouldn't be too far off in terms of performance.


Yeah, I mean some people would maybe say I need to ignore the slight stutter... and maybe they're right. But I never had stutters with any of my EVGA cards, and I've had a 1080, a 980 Ti, and all others on down; they might have coil whine or get too hot, but any stutter was easily fixable.

With the Aorus, my first time switching, I can't fix it. That's why it sucks, because damn, otherwise, when it's smooth, it's really smooth and looks awesome.

I'll have the FTW3 shortly as well, so that will tell the tale and help me see if it's the GPU or my PC somewhere else.

Here's a question: I have an older Corsair TX750W. Is there any chance, even a small one, that an older PSU could be a culprit? Or if it was going down, would I see more issues?


----------



## reflex75

Quote:


> Originally Posted by *TheBoom*
> 
> Did you manage to fix this?
> 
> I'm afraid it might be a bios implementation after all. Might need to contact Zotac about it on Monday.


I did my own fan curve:


----------



## TheBoom

Quote:


> Originally Posted by *Savatage79*
> 
> Yeah, I mean some people would maybe say I need to ignore the slight stutter... and maybe they're right. But I never had stutters with any of my EVGA cards, and I've had a 1080, a 980 Ti, and all others on down; they might have coil whine or get too hot, but any stutter was easily fixable.
> 
> With the Aorus, my first time switching, I can't fix it. That's why it sucks, because damn, otherwise, when it's smooth, it's really smooth and looks awesome.
> 
> I'll have the FTW3 shortly as well, so that will tell the tale and help me see if it's the GPU or my PC somewhere else.
> 
> Here's a question: I have an older Corsair TX750W. Is there any chance, even a small one, that an older PSU could be a culprit? Or if it was going down, would I see more issues?


I wouldn't rule the PSU out. It could be peaking and causing the stutters due to vdroop, although that's unlikely. Still, it won't hurt to test with another PSU before it goes and takes your whole system with it.

That being said, in my experience, if the GPU and its drivers are ruled out, the biggest culprit apart from a maxed-out CPU is RAM. I've had a lot of issues with overclocked RAM causing stutters, especially during heavy workloads.

Anyway, stupid question, but have you tried the max pre-rendered frames fix already?


----------



## Slackaveli

Quote:


> Originally Posted by *Savatage79*
> 
> Do you think a 4770 could possibly hold this gpu back a bit?


Yeah, it could, in certain apps. What resolution are you playing at? If 1080p, yeah, that could be your problem. What kind of CPU utilization are you seeing?


----------



## TheBoom

Quote:


> Originally Posted by *reflex75*
> 
> I did my own fan curve:


So you set the idle/minimum fan speed to 30%?

But that isn't fixing the issue, is it?

It seems more of an "if you can't beat them, join them" approach lol.
Quote:


> Originally Posted by *foolycooly*
> 
> Looks like RadActiveLobstr was correct--I received my third DP cable today and all 3 monitors are running on DP. When I only have my main 144Hz and one side 60Hz panel plugged in, the card downclocks properly to 215MHz. When I plug the 2nd 60Hz side panel in (3 total) with DP, the card doesn't downclock below ~1500MHz.
> 
> Guess I'll just have to deal with some fan noise at idle. Maybe there will be a fix someday.


I have this issue with one 165Hz and one 60Hz display as well.

The workaround is NVIDIA Inspector's Multi-Display Power Saver (NV MDPS). Just switch it on when you're not gaming and the card will downclock to the lowest bin.


----------



## Slackaveli

This
Quote:


> Originally Posted by *TheBoom*
> 
> I wouldn't rule the PSU out. It could be peaking and causing the stutters due to vdroop, although that's unlikely. Still, it won't hurt to test with another PSU before it goes and takes your whole system with it.
> 
> That being said, in my experience, if the GPU and its drivers are ruled out, the biggest culprit apart from a maxed-out CPU is RAM. I've had a lot of issues with overclocked RAM causing stutters, especially during heavy workloads.
> 
> Anyway, stupid question, but have you tried the max pre-rendered frames fix already?


Max pre-rendered frames and the shader cache setting need to be tried, for sure.


----------



## reflex75

Quote:


> Originally Posted by *TheBoom*
> 
> So you set the idle/minimum fan speed to 30%?
> 
> But that isn't fixing the issue, is it?
> 
> It seems more of an "if you can't beat them, join them" approach lol.


That's the spirit

BTW, what is your max boost clock out of the box?
And your max OC?
Because mine looks lazy.


----------



## Savatage79

Quote:


> Originally Posted by *TheBoom*
> 
> I wouldn't rule the PSU out. It could be peaking and causing the stutters due to vdroop, although that's unlikely. Still, it won't hurt to test with another PSU before it goes and takes your whole system with it.
> 
> That being said, in my experience, if the GPU and its drivers are ruled out, the biggest culprit apart from a maxed-out CPU is RAM. I've had a lot of issues with overclocked RAM causing stutters, especially during heavy workloads.
> 
> Anyway, stupid question, but have you tried the max pre-rendered frames fix already?


I actually have to take a look at my CPU stats, and I haven't tried max pre-rendered frames yet... Is that a fix that has actually worked for some folks?


----------



## TheBoom

Quote:


> Originally Posted by *reflex75*
> 
> That's the spirit
> 
> BTW, what is your max boost clock out of the box?
> And your max OC?
> Because mine looks lazy.


Max boost with my first card was 1999MHz. Max OC 2050. Memory +300 (including the out-of-the-box OC).

I exchanged that card for the one I have now due to cosmetic defects.

This one boosts to 1987MHz. Max OC 2037. Memory +400.

Yeah, they are kinda lazy lol.

But still a really good buy considering the 70 USD premium for the Strix where I live. And the Strix cards aren't performing great anyway.

Quote:


> Originally Posted by *Savatage79*
> 
> I actually have to take a look at my CPU stats, and I haven't tried max pre-rendered frames yet... Is that a fix that has actually worked for some folks?


It works for many folks and in many different games. I myself have it on for quite a number of games. Of course, as always, YMMV.


----------



## Slackaveli

Quote:


> Originally Posted by *Savatage79*
> 
> I actually have to take a look at my CPU stats, and I haven't tried max pre-rendered frames yet... Is that a fix that has actually worked for some folks?


Yes. Lowering the number (it's in the NVIDIA Control Panel) helps some; raising it helps others. Try Max Pre-Rendered Frames (not to be confused with max VR frames) at 1; if that doesn't work, try it at 4.


----------



## Savatage79

Sounds good, guys, I'll give that a try this evening... really hope that can help! I'll keep yas posted.


----------



## KingEngineRevUp

Hey guys, if you're having issues with the latest drivers, the Windows 10 Creators Update may have disabled some NVIDIA functionality.

This is caused by Windows fast startup; disable it.

I lost the ability to open the NVIDIA Control Panel, and I also noticed some bad frame drops in games. Disabling fast startup brought the NVIDIA Control Panel back and allowed GeForce Experience to launch again.

People are discussing this issue on the NVIDIA forums; the devs are trying to figure out what went wrong and why fast startup is causing it.


----------



## Chris Ihao

Great. Afterburner messed up my Win 10 install last night (must be, as I upgraded to 4.3.0 just before). System thread exception. Not possible to recover, start in safe mode or fix anything, and spent hours trying to fix it today. Finally decided to reinstall everything. Most fun activity I know of. Bah.


----------



## KedarWolf

Quote:


> Originally Posted by *Chris Ihao*
> 
> Great. Afterburner messed up my Win 10 install last night (must be, as I upgraded to 4.3.0 just before). System thread exception. Not possible to recover, start in safe mode or fix anything, and spent hours trying to fix it today. Finally decided to reinstall everything. Most fun activity I know of. Bah.


Likely just an ID-10-T error.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Chris Ihao*
> 
> Great. Afterburner messed up my Win 10 install last night (must be, as I upgraded to 4.3.0 just before). System thread exception. Not possible to recover, start in safe mode or fix anything, and spent hours trying to fix it today. Finally decided to reinstall everything. Most fun activity I know of. Bah.


Would doing a Windows recovery solve this issue?


----------



## nrpeyton

How far are you guys able to reach generally on 1.0v (fully stable)?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *nrpeyton*
> 
> How far are you guys able to reach generally on 1.0v (fully stable)?


2088 MHz for me.


----------



## Chris Ihao

Quote:


> Originally Posted by *KedarWolf*
> 
> Likely just an ID-10-T error.


Hehe, nope. I've been messing around with PCs since one of the first DOS versions, so I pretty much know what I'm doing. Did a heck of a lot of cmd stuff to try to fix it. I read about an error where Afterburner locks a DirectX file or something, so that may be the issue here.
Quote:


> Originally Posted by *SlimJ87D*
> 
> Would doing a Windows recovery solve this issue?


Unfortunately, no. Tried different ways of recovering. From now on I am creating a system image each time I install Afterburner.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Chris Ihao*
> 
> Hehe. Nope. Been messing around with pc's since one of the first dos versions, so I pretty much know what I'm doing. Did heck of a lot cmd stuff to try to fix it. I read about an error where afterburner locks a direct x file or something, so That may be the issue here.
> Unfortunately, no. Tried different ways of recovering. From now on I am creating a system image each time I install Afterburner.


Good idea, I'm going to store it on a separate HDD


----------



## Chris Ihao

Quote:


> Originally Posted by *SlimJ87D*
> 
> Good idea, I'm going to store it on a separate HDD


Yeah. Fortunately it's much easier now to migrate stuff, with Google etc.


----------



## done12many2

Quote:


> Originally Posted by *Chris Ihao*
> 
> From now on I am creating a system image each time I install Afterburner.


This is good practice when pushing things to the limits.

I keep images of all of my OSs (daily / bench) on a dedicated SSD. Whenever I do clean installs, I install all the drivers, software and updates that I want in the image and then I create the image. When stuff goes wrong, restoring things to exactly where I was after a clean install literally takes a couple of minutes. Comes in handy when you corrupt Windows files during memory / cache overclocking.


----------



## Chris Ihao

Quote:


> Originally Posted by *done12many2*
> 
> This is good practice when pushing things to the limits.
> 
> I keep images of all of my OSs (daily / bench) on a dedicated SSD. Whenever I do clean installs, I install all the drivers, software and updates that I want in the image and then I create the image. When stuff goes wrong, restoring things to exactly where I was after a clean install literally takes a couple of minutes. Comes in handy when you corrupt Windows files during memory / cache overclocking.


Good advice. I've basically been lucky for so long now that I'm probably a bit spoiled. Wish I had taken more care. Anyhow, as this is a purely gaming-oriented desktop, it's no crisis. It would be much worse if it were the audio workstation. Going to back that one up now as well :O

This basically became a clean install anyway, as somehow all my document files etc. were deleted, so I guess it's good for something.


----------



## the9quad

Quote:


> Originally Posted by *Chris Ihao*
> 
> Great. Afterburner messed up my Win 10 install last night (must be, as I upgraded to 4.3.0 just before). System thread exception. Not possible to recover, start in safe mode or fix anything, and spent hours trying to fix it today. Finally decided to reinstall everything. Most fun activity I know of. Bah.


Man if Unwinder sees these posts lol


----------



## NYU87

Quote:


> Originally Posted by *Alwrath*
> 
> I'd be interested in seeing your BF1 results. Is there even a benchmark for it? You could be right, but remember your cores are all at over 90% usage and you can't have anything else open while you play. If you open up iTunes you'll see stuttering... Also, saying it "smashes" a Ryzen is kinda far-fetched... maybe higher scores at 120Hz+, but I game at 4K 60Hz so it really doesn't matter to me.


I agree. At this point I would never pick up a quad core, especially a Broadwell-C. It's either Ryzen or Broadwell-E, which will both eat the quad-core Broadwell-C alive.

I would have gone with either platform, but with Intel's X299 and Skylake-X coming out in the summer I'm going to wait it out. Would love to get a 10-core Skylake-X chip.


----------



## Chris Ihao

Quote:


> Originally Posted by *the9quad*
> 
> Man if Unwinder sees these posts lol


Ouch. Didn't consider that.


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> 2088 MHz for me.


Wow, you've got a good one there then. ;-)

Even at only 50c core temp, mine won't do more than 2037MHz at 1.0v _(with 2025 being the true rock-solid number for 1.0v stability)._

What temp are you at for 2088 at 1.0v?

*Anyway (edit):*
Do you have a link to this BIOS you wanted me to look at?

About to do some baselines now, first.


----------



## GosuPl

1080 Ti SLI + G-Sync = less performance. On my former GPUs, 2x TITAN X Maxwell, I didn't have any performance loss with G-Sync. But on the 2x 1080 Tis, the loss is noticeable.

Any fixes?


----------



## mtbiker033

Quote:


> Originally Posted by *Chris Ihao*
> 
> Great. Afterburner messed up my Win 10 install last night (must be, as I upgraded to 4.3.0 just before). System thread exception. Not possible to recover, start in safe mode or fix anything, and spent hours trying to fix it today. Finally decided to reinstall everything. Most fun activity I know of. Bah.


Wow, that's weird! Sorry to hear it.

My 1080 Ti hybrid kit should ship from B&W on Monday/Tuesday. I've got everything ready: tools, Kryonaut, a place to mount the rad on my test bench, and fan extensions to my fan controller at the ready.


----------



## Chris Ihao

Quote:


> Originally Posted by *mtbiker033*
> 
> Wow, that's weird! Sorry to hear it.
> 
> My 1080 Ti hybrid kit should ship from B&W on Monday/Tuesday. I've got everything ready: tools, Kryonaut, a place to mount the rad on my test bench, and fan extensions to my fan controller at the ready.


Thanks man


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> How far are you guys able to reach generally on 1.0v (fully stable)?


2037 here.


----------



## Slackaveli

Quote:


> Originally Posted by *NYU87*
> 
> I agree. At this point I would never pick up a quad core, especially a Broadwell-C. It's either Ryzen or Broadwell-E, which will both eat the quad-core Broadwell-C alive.
> 
> I would have gone with either platform, but with Intel's X299 and Skylake-X coming out in the summer I'm going to wait it out. Would love to get a 10-core Skylake-X chip.


LOL! How misinformed can you possibly be?
I'm at 130-144+ fps in BF1 @ QHD, CRUSHING Ryzen 4-cores, beating 6-8 cores, beating INTEL 6 CORES, bro. Beating all Intel except 5.0GHz+ Kaby Lakes. Beating all of the ones you just mentioned. PROVEN fact. But, you do your thing. I just have to counter-point so the community isn't misled. To be clear, I'm not advocating a from-scratch Z97 system in 2017; it's for those with a 4690K/4790K on Z97. THEY get a sweet, sweet deal.


----------



## nrpeyton

Most games don't scale with cores.

Edit:
*Some* games will _*never*_ be able to scale properly with cores. It's still beyond our technology/programming skills.
Quote:


> Originally Posted by *Slackaveli*
> 
> 2037 here.


You and I seem to have equivalent lottery luck then? 2037 was my max at 1.0v too.

What's your max attainable OC on air with the full 1.093v?
Mine was 2100 at 65c.


----------



## pez

Quote:


> Originally Posted by *GosuPl*
> 
> 1080 Ti SLI + G-Sync = less performance. On my former GPUs, 2x TITAN X Maxwell, I didn't have any performance loss with G-Sync. But on the 2x 1080 Tis, the loss is noticeable.
> 
> Any fixes?


If I'm not mistaken, all benchmarks will perform worse with G-Sync, as G-Sync's main job is to get rid of partial or incomplete frames. Most of your results are within the margin of error, but there is some consistency. It just seems like the nature of G-Sync, really.


----------



## AMDATI

Quote:


> Originally Posted by *pez*
> 
> If I'm not mistaken, all benchmarks will perform worse with G-Sync, as G-Sync's main job is to get rid of partial or incomplete frames. Most of your results are within the margin of error, but there is some consistency. It just seems like the nature of G-Sync, really.


That makes absolutely zero sense. G-Sync doesn't 'get rid' of partial frames; GPUs don't draw partial frames. What G-Sync does is make sure a whole frame is displayed as soon as it's sent to the monitor, rather than displaying part of the next frame. With non-G-Sync monitors, the GPU essentially has to wait on the monitor (vsync). With adaptive sync displays, the monitor waits on the GPU and displays the whole frame before moving on to the next.

Also, even if frames were dropped, it wouldn't register in the system FPS readout; it would only happen at the screen hardware level.


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> Most games don't scale with cores.
> 
> Edit:
> *Some* games will _*never*_ be able to scale properly with cores. It's still beyond our technology/programming skills.
> You and I seem to have equivalent lottery luck then? 2037 was my max at 1.0v too.
> 
> What's your max attainable OC on air with the full 1.093v?
> Mine was 2100 at 65c


Same, post-Kryonaut mod, tho, so I'm maxing at 62c. My card isn't 100% stable at 2100 in EVERY sense, tho; I can play BF1 at that for hours. It really prefers 2088 at 1.08v and down to 2050 at 1.05v. So I have the curve at 2037 at 1.0v, 2050 at 1.043v, 2076 at 1.063v, and 2088 at 1.08v; flat between 1.0v and 1.043v, flat between the steps where I skipped a voltage, up to 2088 at 1.08v, and flat from there. So, depending on the app, I'm either at 2088, 2076, or 2050. Haven't seen it go to 2037 yet, and only The Witcher 3 knocks me down to 2050 so far, out of 2 dozen games or so tested. I can hold 2088+ in benches. Can hold 2101 in benches, actually, but on these settings it'd be 2088.
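A curve like the one described above is effectively a step function from voltage to a locked clock. A tiny sketch of that mapping (the step values are the poster's reported numbers; the lookup helper is just for illustration):

```python
import bisect

# (voltage, clock in MHz) steps from the curve described above;
# the curve is flat between steps, as in Afterburner's curve editor.
CURVE = [(1.000, 2037), (1.043, 2050), (1.063, 2076), (1.080, 2088)]

def clock_at(voltage):
    """Return the clock the curve holds at a given voltage (flat between steps)."""
    volts = [v for v, _ in CURVE]
    i = bisect.bisect_right(volts, voltage) - 1
    return CURVE[max(i, 0)][1]

print(clock_at(1.000))  # 2037
print(clock_at(1.050))  # 2050, between the 1.043 and 1.063 steps
print(clock_at(1.093))  # 2088, flat past the last step
```

The flat segments are what make the behaviour he describes possible: the card slides between voltage bins under load, but the clock only moves at the step boundaries.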


----------



## GosuPl

Quote:


> Originally Posted by *pez*
> 
> If I'm not mistaken, all benchmarks will perform worse with G-Sync, as G-Sync's main job is to get rid of partial or incomplete frames. Most of your results are within the margin of error, but there is some consistency. It just seems like the nature of G-Sync, really.


On 2x TITAN X Maxwell I didn't have any performance loss, and GPU usage was better than on the 2x 1080 Tis. People are having problems with SLI + G-Sync on Pascal. I'm thinking about returning to the 2x TITAN X Maxwell and just skipping Pascal. For me, the performance gain from changing 2x TITAN X [email protected]/8000 to 2x GTX [email protected]/11912 is too low...

30/35 fps more at 1440p and 20/25 fps at 4K is not worth it with these G-Sync problems :/


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> That's why we're waiting for you to do your power draw test from the wall.
> 
> That's the last piece of evidence we can rely on.


Is it the 'review sample' "GameRock Premium" BIOS?:
https://www.techpowerup.com/vgabios/191499/palit-gtx1080ti-11264-170331

Or somewhere else?


----------



## AMDATI

To me, 1-2 fps is within the margin of error, especially considering Pascal's dynamic, ever-changing core clock. You simply won't have the same exact clock rate at the same exact moment in a benchmark.


----------



## Slackaveli

Quote:


> Originally Posted by *AMDATI*
> 
> To me, 1-2 fps is within the margin of error, especially considering Pascal's dynamic, ever-changing clock rate. You simply won't have the same exact clock rate at the same exact moment in a benchmark.


My score with G-Sync on in SuPo is 10,240; with it off, it's 10,521.

Also, SuPo doesn't really have an MoE. It is hella consistent; I get the exact same fps at the exact same places in that test every single run unless I change a setting. It's remarkably consistent. The MoE on SuPo 4K Optimized (optimized, man) is <1 fps.
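Taking those two SuPo scores at face value, the G-Sync penalty is easy to quantify; this is just arithmetic on the numbers above, not a new measurement, and note the <1 fps claim is about run-to-run frame rates, not scores:

```python
# Superposition scores quoted above, with and without G-Sync.
gsync_on, gsync_off = 10240, 10521

# Percentage cost of running the bench with G-Sync enabled.
delta_pct = (gsync_off - gsync_on) / gsync_off * 100
print(f"G-Sync cost: {delta_pct:.1f}%")  # ~2.7%
```

A ~2.7% gap on a benchmark that repeats to within a fraction of a percent is consistent, so it does look like a real (if small) cost rather than noise.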


----------



## KingEngineRevUp

Quote:


> Originally Posted by *nrpeyton*
> 
> Wow you've got a good one there then. ;-)
> 
> Even at only 50c core temp; mine won't do more than 2037 Mhz at 1.0v. _(with 2025 being the true rock solid numb. for 1.0v stability)._
> 
> What temp you @ for 2088 at 1.0?
> 
> *Anyway (edit):*
> You got a link to this BIOS you wanted me to look at:?
> 
> About to do some baselines now, first.


It was the Palit bios.


----------



## Luckbad

Man, I swear I should play the lottery.

I got a second Zotac 1080 Ti AMP Extreme. Originally it was to cherry-pick, but the one I already have boosts stock to 2037.5 MHz, so I didn't need to.

But screw it, I wanted to check out the new one too.

Stock boost? You guessed it: 2037.5 as well.

I haven't put it through its paces, but I have a good feeling I'll be able to sell one of them without any loss even though they're both opened now.


----------



## Coopiklaani

My new 1080 Ti: 2063 core, +900 VRAM, passed the TS stress test at 99.5%.

Can do 2100 on the core, but not as rock solid..

http://www.3dmark.com/tsst/45381

Best score so far:
10 991 TS Graphics
http://www.3dmark.com/spy/1659925

7 907 FSU Graphics
http://www.3dmark.com/fs/12484916

Good silicon?


----------



## nrpeyton

Quote:


> Originally Posted by *NYU87*
> 
> I agree. At this point I would never pick up a quad core, especially a Broadwell-C. It's either Ryzen or Broadwell-E, which will both eat the quad core Broadwell-C alive.
> 
> I would have gone with either platforms but with Intel's x299 and Skylake X coming out in the summer I'm going to wait it out. Would love to get a 10C Skylake X chip.


For gaming FPS, the winning equation is raw single-threaded performance, every time.

You only get that on an Intel 4-core.

Then, for a further performance boost, you want 2 threads per core, so you're extracting _every_ ounce of performance.

The i7 7700K is an engine, packing incredible power into *just* 4 cores. Those cores are BEASTS, and for anything that utilises up to 4 cores it cannot be beaten.

Anything utilising more cores still never needs more than the combined processing power of those 4 cores / 8 threads at full charge. Those cores will form a stampede at 100% utilisation, while high-core-count machines float along at 50-60% utilisation, _never_ unleashing their true power into the game.

I desperately wanted to go Ryzen, but I just can't justify it.

The day will come when an i7 7700K is *only just playable* and a Ryzen 1700X narrowly falls below the playable threshold. At matching prices, does that example make the higher-core-count machine a bad buy? For me, it does.
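The "games don't scale with cores" argument is basically Amdahl's law. A hedged back-of-envelope (the 60% serial fraction is purely illustrative, not a measured figure for any game):

```python
def speedup(serial_fraction, cores):
    """Amdahl's law: overall speedup given the fraction of work that can't be parallelized."""
    return 1 / (serial_fraction + (1 - serial_fraction) / cores)

serial = 0.6  # hypothetical share of frame time stuck on one thread
for cores in (4, 8, 16):
    print(f"{cores:2d} cores: {speedup(serial, cores):.2f}x")
```

With a serial fraction that high, doubling from 4 to 8 cores buys under 8% extra throughput, which is why per-core speed dominates; engines with lower serial fractions scale much better, so the 7700K argument is workload-dependent rather than universal.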


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Coopiklaani*
> 
> my new 1080ti, 2063 core +900 VRAM passed TS stress test 99.5%
> 
> Can do 2100 on the core, but not as rock solid..
> 
> http://www.3dmark.com/tsst/45381
> 
> Best score so far,
> 10 991 TS Graphics
> http://www.3dmark.com/spy/1659925
> 
> 7 907 FSU Graphics
> http://www.3dmark.com/fs/12484916
> 
> Good silicon?


Did you shunt mod it? Also, did you buy your old one with a good CC? If you did then they will cover all damages to that card for 90 days, no matter what kind of damage.


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> It was the Palit bios.


I can only find two 1080 Ti *Palit* BIOSes via Google.

One is the BIOS I mentioned & linked in my last post (the 'GameRock Premium' review sample), and the other has the same power limit as an FE.


----------



## Slackaveli

Quote:


> Originally Posted by *Coopiklaani*
> 
> my new 1080ti, 2063 core +900 VRAM passed TS stress test 99.5%
> 
> Can do 2100 on the core, but not as rock solid..
> 
> http://www.3dmark.com/tsst/45381
> 
> Best score so far,
> 10 991 TS Graphics
> http://www.3dmark.com/spy/1659925
> 
> 7 907 FSU Graphics
> http://www.3dmark.com/fs/12484916
> 
> Good silicon?


Yeah, man. 2037-2063 solid out of the box is most likely a golden chip, but it's golden all the same.
Quote:


> Originally Posted by *nrpeyton*
> 
> For gaming FPS, the winning equation is raw single-threaded performance, every time.
> 
> You only get that on an Intel 4-core.
> 
> Then, for a further performance boost, you want 2 threads per core, so you're extracting _every_ ounce of performance.
> 
> The i7 7700K is an engine, packing incredible power into *just* 4 cores. Those cores are BEASTS, and for anything that utilises up to 4 cores it cannot be beaten.
> 
> Anything utilising more cores still never needs more than the combined processing power of those 4 cores / 8 threads at full charge. Those cores will form a stampede at 100% utilisation, while high-core-count machines float along at 50-60% utilisation, _never_ unleashing their true power into the game.
> 
> I desperately wanted to go Ryzen, but I just can't justify it.
> 
> The day will come when an i7 7700K is *only just playable* and a Ryzen 1700X narrowly falls below the playable threshold. At matching prices, does that example make the higher-core-count machine a bad buy? For me, it does.


Cache is also a factor, especially for minimums; it's why my chip's 1700% more cache than a 7700K lets me beat it. But other than that I agree.


----------



## OZrevhead

Guys, have any of the Tis got a higher power limit? Or are you all doing the shunt mod?


----------



## KraxKill

So here is the power limit for ya. If I could remove it, I know my card could go higher. As you can see, my highest score is at a lower clock and voltage, with room for a mem OC; as the clocks and voltages climb my card, despite being able to sustain the clocks, is being power limited despite the shunt mod. Somebody please, please, please find a way around this... I must have more voltage.

If anybody wants to verify the above, you can test this simply.

Set a mild mem overclock, say +450, and raise your core clock and voltage just to the point where you begin to see power throttling in something like a Heaven loop. Now, without allowing the card to grab higher voltage bins than your control loop, remove the memory OC and watch the power limit subside or disappear. For some, underclocking the mem may allow a favorable tradeoff, buying stability at a higher core clock by staying off the power limit, but the amount of mem underclock needed to grab an extra bin is usually not worth it. For those between bins, though, it's a worthy thing to test.

2100 @ 1.063v +550 mem
http://www.3dmark.com/fs/12446981


2151 @ 1.093v +550 mem
http://www.3dmark.com/fs/12477286


2163 @ 1.093v +400 mem

http://www.3dmark.com/fs/12490144

2163 @ 1.093v +0 mem

http://www.3dmark.com/fs/12489719
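For anyone wondering why the shunt mod raises the effective power limit at all: the controller infers power from the voltage drop across a tiny sense resistor, so paralleling a second shunt makes it under-read. A sketch with illustrative values (the 5 mΩ figures are assumptions, not measured from a 1080 Ti PCB):

```python
# Stock sense shunt and the one stacked in parallel by the mod (ohms).
r_stock = 0.005
r_added = 0.005

# Equivalent resistance of two resistors in parallel.
r_eff = (r_stock * r_added) / (r_stock + r_added)

# The controller still converts the sensed voltage drop using the stock
# value, so reported power scales by r_eff / r_stock (here, half).
actual_watts = 330.0
reported = actual_watts * (r_eff / r_stock)
print(f"card reports ~{reported:.0f} W while actually drawing {actual_watts:.0f} W")
```

That is also consistent with still hitting the limit after the mod: the mod only rescales the reading, so once real draw climbs past roughly twice the BIOS cap, the throttle returns.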


----------



## Chris Ihao

Quote:


> Originally Posted by *Slackaveli*
> 
> Same, post-Kryonaut mod, tho, so I'm maxing at 62c. My card isn't 100% stable at 2100 in EVERY sense, tho; I can play BF1 at that for hours. It really prefers 2088 at 1.08v and down to 2050 at 1.05v. So I have the curve at 2037 at 1.0v, 2050 at 1.043v, 2076 at 1.063v, and 2088 at 1.08v; flat between 1.0v and 1.043v, flat between the steps where I skipped a voltage, up to 2088 at 1.08v, and flat from there. So, depending on the app, I'm either at 2088, 2076, or 2050. Haven't seen it go to 2037 yet, and only The Witcher 3 knocks me down to 2050 so far, out of 2 dozen games or so tested. I can hold 2088+ in benches. Can hold 2101 in benches, actually, but on these settings it'd be 2088.


You must have pretty good airflow in that box of yours, mate.

I'm running the fan on the 1080 Ti close to max, with a heck of a lot of case fans, and it reaches around 60c at only 2012 "stable non-downclocking". Wish I had better ventilation where my cab is, though.

Also, phew! System up and running again, pretty much the way it used to be. Probably lost some saves here and there, but that's life. Do you guys use Afterburner 4.3.0? Just made a full image backup now, and I'm ready to try again.


----------



## Coopiklaani

Quote:


> Originally Posted by *SlimJ87D*
> 
> Did you shunt mod it? Also, did you buy your old one with a good CC? If you did then they will cover all damages to that card for 90 days, no matter what kind of damage.


Yes, I shunt modded it. It did about the same on the core, but much better on the VRAM. I bought both the card and the WB from overclocker.co.uk. They're nice enough to accept an RMA for both the card and the waterblock.


----------



## nrpeyton

Quote:


> Originally Posted by *KraxKill*
> 
> So here is the power limit for ya. If I could remove it, I know my card could go higher. As you can see, my highest score is at a lower clock and voltage, with room for a mem OC; as the clocks and voltages climb my card, despite being able to sustain the clocks, is being power limited despite the shunt mod. Somebody please, please, please find a way around this... I must have more voltage.
> 
> If anybody wants to verify the above, you can test this simply.
> 
> Set a mild mem overclock, say +450, and raise your core clock and voltage just to the point where you begin to see power throttling in something like a Heaven loop. Now, without allowing the card to grab higher voltage bins than your control loop, remove the memory OC and watch the power limit subside or disappear. For some, underclocking the mem may allow a favorable tradeoff, buying stability at a higher core clock by staying off the power limit, but the amount of mem underclock needed to grab an extra bin is usually not worth it. For those between bins, though, it's a worthy thing to test.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 2100 @ 1.063v +550 mem
> http://www.3dmark.com/fs/12446981
> 
> 
> 2151 @ 1.093v +550 mem
> http://www.3dmark.com/fs/12477286
> 
> 
> 2163 @ 1.093v +400 mem
> 
> http://www.3dmark.com/fs/12490144
> 
> 2163 @ 1.093v +0 mem
> 
> http://www.3dmark.com/fs/12489719


Try a different LM, maybe? Some are more conductive than others.

How much extra headroom did the mod net you?


----------



## Coopiklaani

Quote:


> Originally Posted by *Slackaveli*
> 
> Yeah, man. 2037-2063 solid out of the box is most likely a golden chip, but it's golden all the same.
> Cache is also a factor, especially for minimums; it's why my chip's 1700% more cache than a 7700K lets me beat it. But other than that I agree.










But it seems everyone's card can easily do 2100+ here


----------



## Slackaveli

Quote:


> Originally Posted by *Chris Ihao*
> 
> You must have pretty good airflow in that box of yours mate
> 
> 
> 
> 
> 
> 
> 
> I'm running the fan on the 1080 ti at close to max, and heck of a lot of fans, and it reaches around 60 at only 2012 "stable non-downclocking". Wish I had some better ventilation where my cab is though.
> 
> Also, phu! System up and running again, pretty much the way it used to be. Probably lost some saves here and there, but that is how life is. Do you guys use Afterburner 4.3.0? Just made a full image backup now, and am ready to try again.


yeah, it's breezy in there. 3 intake, 1 exhaust plus the cpu cooler's exhaust. fans are Thermaltake Ring and i have 2 F-12 Noctuas, and another f-12 on the rad. Plus, kryo re-paste and a couple heatsinks on the backplate and on the backside gpu copper plate.


----------



## Slackaveli

Quote:


> Originally Posted by *Coopiklaani*
> 
> 
> 
> 
> 
> 
> 
> 
> But it seems everyone's card can easily do 2100+ here


because we are all lunatics! Most needed cooling mods, shunt mods, extra heatsinks, thermal pads and full blocks to do it. ON AIR, if you are over 2000Mhz you are gold, so holding above 2050 on air means it IS a 2100 card; you just haven't done your job to release it yet


----------



## Chris Ihao

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, it's breezy in there. 3 intake, 1 exhaust plus the cpu cooler's exhaust. fans are Thermaltake Ring and i have 2 F-12 Noctuas, and another f-12 on the rad. Plus, kryo re-paste and a couple heatsinks on the backplate and on the backside gpu copper plate.


I got 5 Noctuas. 2 pulling in, 2 on top exhausting, running a bit slower, and one exhaust at the back. Also a Corsair on the bottom to pull up some "as cold as possible" air from the floor. The last one is probably not pulling much as I haven't cleaned the filter in a looong time now though.







It should be a good setup for pushing quite a bit of air, but like I said, probably not enough ventilation. I COULD try to repaste with the kryo, but those fan headers... man, they suck.


----------



## Slackaveli

Quote:


> Originally Posted by *Chris Ihao*
> 
> I got 5 Noctuas. 2 pulling in, 2 on top exhausting, running a bit slower, and one exhaust at the back. Also a Corsair on the bottom to pull up some "as cold as possible" air from the floor. The last one is probably not pulling much as I haven't cleaned the filter in a looong time now though.
> 
> 
> 
> 
> 
> 
> 
> It should be a good setup for pushing quite a bit of air, but like I said, probably not enough ventilation. I COULD try to repaste with the kryo, but those fan headers... man, they suck.


i did it without removing the fan headers. Actually, i removed one; one came off easy. And yes, they are stupid hard to get off. Either way, though, you just need another person, even a child, to hold the cooler while you quickly clean and repaste.

Your fans and mine are almost identical; with a repaste you'll be in the 50s. I'm usually in the 50s in most games, like 57c, 58c.


----------



## Coopiklaani

Quote:


> Originally Posted by *Slackaveli*
> 
> because we are all lunatics! Most needed cooling mods, shunt mods, extra heatsinks, thermal pads and full blocks to do it. ON AIR, if you are over 2000Mhz you are gold, so holding above 2050 on air means it IS a 2100 card; you just haven't done your job to release it yet


Even with the shunt mod, water blocks, and the best BIOS I could find, I can benchmark @2.1GHz fine. But it is just not stable enough for 24/7 use


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KraxKill*
> 
> So here is the power limit for ya. If I could remove it, I know my card could go higher. As you can see, my highest score is at a lower clock, voltage and with room for mem OC, as the clocks and voltages climb, my card, despite being able to sustain the clocks, is being power limited despite the shunt mod. Somebody please please please....find a way around this.. I must have more voltage.
> 
> If anybody wants to verify the above, you can test this simply.
> 
> Set a mild mem overclock, say +450, and raise your core clock and voltage just to the point you begin to see pwr throttling in something like a Heaven loop. Now, without allowing the card to grab higher voltage bins than your control loop, remove the memory OC and watch the pwr limit subside or disappear. For some, underclocking the mem may allow a favorable tradeoff: more stability at a higher clock while staying off the pwr limit, but the amount of mem underclock needed to grab an extra bin is usually not worth it. For those between bins, though, it's a worthy thing to test.
> 
> 2100 @ 1.063v +550 mem
> http://www.3dmark.com/fs/12446981
> 
> 
> 2151 @ 1.093v +550 mem
> http://www.3dmark.com/fs/12477286
> 
> 
> 2163 @ 1.093v +400 mem
> 
> http://www.3dmark.com/fs/12490144
> 
> 2163 @ 1.093v +0 mem
> 
> http://www.3dmark.com/fs/12489719


What's more relevant is game benchmarks: actual gameplay and what you would experience.

GTAV, Rise of the Tomb Raider, Hitman, The Division, and Ghost Recon all have in-game benchmarks.

These "tiny" power limits will probably just hit you in TimeSpy and Firestrike.

Do you notice the same thing happening in heaven and superposition?


----------



## Slackaveli

Quote:


> Originally Posted by *Coopiklaani*
> 
> Even with the shunt mod, water blocks, and the best BIOS I could find, I can benchmark @2.1GHz fine. But it is just not stable enough for 24/7 use


well, true, but i think there might only _really_ be 3 or 4 of those in here. Mine's not; it's a 2063+ all day, tho, with 2100 in some games. But mine is on air, no shunt. Just kryonaut and a good tunin.


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> Even with the shunt mod, water blocks, and the best BIOS I could find, I can benchmark @2.1GHz fine. But it is just not stable enough for 24/7 use


Bear in mind Nvidia designed these cards with a base clock of 1481 Mhz.

And most people's boosts max out at 1850-1900 Mhz.

People who don't overclock (like us) are gaming at between 1600 - 1750 avg. on default fan profiles.

The fact you can bench at 2100 semi-stable is fantastic.

Also, you've got that challenge to hold 2100... if you were hitting 2100 without any effort, wouldn't that be a tad boring? (*no* one is hitting 2.2).

Personally, I'll probably use 1.0v @ an overclock of 1999MHz for 24/7 gaming. (I'll enjoy running cool and fast while drawing a lot less power with a lot less noise.)

Don't be so hard on yourself


----------



## Coopiklaani

New single slot WB just arrived. The cover has NO LED in it... and I cannot find an LED strip that fits..
I added a T-sensor just underneath the mosfet area. Mosfets never go above 75C, even under a 450w load for 1-2hrs












Single slot backplate looks great!


Aquaero is a must have!


Done! with lots of blood!


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> well, true, but i think there might only _really_ be 3 or 4 of those in here. Mine's not; it's a 2063+ all day, tho, with 2100 in some games. But mine is on air, no shunt. Just kryonaut and a good tunin.


Depends on game. Mine is 2063 in everything but W3 and ME:A









I gave up on the curve. Just went back to good old sliders. Playing now at 2012 +50 core in Witcher 3 with Auto Fan at 1.062V, with a max temp of 68C after Kryonaut repasting. An extra 50MHz and 1fps is not worth all the hassle....

*BTW, guys: if you have an Asus STRIX there is no need to repaste it with Kryonaut; you will gain like 2-3C at max. The paste job on the Asus is god darn good, a nice thin layer of paste, although it isn't as thick a paste as the Grizzly.*

I have ordered another STRIX just to compare them.


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> New single slot WB just arrived. The cover has NO LED in it... and I cannot find an LED strip that fits..
> I added a T-sensor just underneath the mosfet area. Mosfets never go above 75C, even under a 450w load for 1-2hrs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Single slot backplate looks great!
> 
> 
> Aquaero is a must have!
> 
> 
> Done! with lots of blood!


A 450w load, lol? The safety limit for 8 pin + 6 pin = 510w?
Looks awfully close.

Anyway new block looks nice.

Anyone bothered with repasting EVGA's FE?


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> 450w load lol? The safety limit for 8 pin + 6 pin = 510w
> Looks awfully close.
> 
> Anyway new block looks nice.


450w is done using the shunt mod + Palit BIOS + Furmark. I don't think any real-world workload can go beyond this... 450w is 6pin+8pin+PCIE, so it is more like 400w or less from the 6pin+8pin.
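For anyone wondering why the sensed numbers move after a shunt mod: the controller infers current from the voltage drop across a tiny shunt resistor, so adding a parallel path lowers the effective resistance and the card under-reports its real draw. A rough sketch (the 5 mOhm stock value and the equal parallel resistance are assumptions for illustration):

```python
# How a parallel shunt skews the power the card *reports*.
# Resistor values are assumed for illustration (e.g. a 5 mOhm stock shunt).

def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def reported_power(actual_w, r_stock, r_added):
    """The controller reads the voltage across the shunt, so its power
    estimate scales with effective resistance relative to the stock shunt."""
    return actual_w * parallel(r_stock, r_added) / r_stock

# Equal resistance in parallel halves what the card sees:
# a real 450 W load is reported as only 225 W.
print(reported_power(450.0, 0.005, 0.005))  # -> 225.0
```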


----------



## nrpeyton

Sorry, I forgot about the PCI-E lane... wouldn't that make the safety limit 585w?

And aye you're right; I can't imagine any scenario pulling that, maybe LN2....
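For reference, the official PCI-SIG ratings are a lot lower than the "safety" figures being quoted here: 75w from the slot, 75w from a 6-pin and 150w from an 8-pin. A quick sketch of both budgets (the 510w connector figure is the thread's number, not a spec):

```python
# Spec power budget for a 6+8 pin card vs the looser figures in this thread.

SPEC_W = {"pcie_slot": 75, "6pin": 75, "8pin": 150}  # PCI-SIG ratings

spec_total = sum(SPEC_W.values())
print(spec_total)  # -> 300, the official budget

# The thread's "safety" numbers: 510w across the two plugs, plus the slot
thread_total = 510 + SPEC_W["pcie_slot"]
print(thread_total)  # -> 585
```

The connectors themselves tolerate well beyond the spec ratings, which is how shunt-modded cards get away with 450w loads.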


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Coopiklaani*
> 
> 450w is done using the shunt mod + palit BIOS + furmark. I don't think any real-world work load can go beyond this...450w is 6pin+8pin+PCIE, so it is more like 400w or less from 6pin+8pin.


Can you do a Heaven and Superposition benchmark comparing the shunt mod with stock and Palit?


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> Sorry, forgot about the PCI-E lane.. wouldn't that make the safety 585w?
> 
> And aye you're right; I can't imagine any scenario pulling that, maybe LN2....


With locked voltages, I don't think it is even possible to go beyond 450w or 500w..


----------



## Coopiklaani

Quote:


> Originally Posted by *SlimJ87D*
> 
> Can you do a Heaven and Superposition benchmark comparing the shunt mod with stock and Palit?


With the shunt mod, there's no difference between the stock bios and the Palit bios in Heaven, Superposition and Timespy scores, since there's no pwr limit in these tests with the stock bios.

The Palit BIOS does do a little bit better in FSU: 8000 vs 7900 with the stock bios. The first few seconds of the first scene of FSU are really demanding; it can draw up to 425w-450w there. Palit bios + shunt can do 460w; stock bios + shunt mod can only do about 400w max.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Depends on game. Mine is 2063 in everything but W3 and ME:A
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I gave up on the curve. Just went back to good old sliders. Playing now at 2012 +50 core in Witcher 3 with Auto Fan at 1.062V, with a max temp of 68C after Kryonaut repasting. An extra 50MHz and 1fps is not worth all the hassle....
> 
> *BTW, guys: if you have an Asus STRIX there is no need to repaste it with Kryonaut; you will gain like 2-3C at max. The paste job on the Asus is god darn good, a nice thin layer of paste, although it isn't as thick a paste as the Grizzly.*
> 
> I have ordered another STRIX just to compare them.


well, it depends on whether you are close to a bin increase. Like, if 3c for $10 gets you from 61c down to 58c, that's a bin. Jus sayin.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Coopiklaani*
> 
> With the shunt mod, there's no difference between the stock bios and the Palit bios in Heaven, Superposition and Timespy scores, since there's no pwr limit in these tests with the stock bios.
> 
> The Palit BIOS does do a little bit better in FSU: 8000 vs 7900 with the stock bios. The first few seconds of the first scene of FSU are really demanding; it can draw up to 425w-450w there. Palit bios + shunt can do 460w; stock bios + shunt mod can only do about 400w max.


Can you post results though? I got worse scores at the same clock speeds and voltages with the Palit bios, and so did others. Did you do a comparison test at all? I just want to know your results; maybe I need to tweak the curve a bit to see if the Palit can score the same or at least equal.


----------



## feznz

Quote:


> Originally Posted by *Coopiklaani*
> 
> 
> 
> 
> 
> 
> 
> 
> But it seems everyone's card can easily do 2100+ here


remember it's all about bragging here. I can do 2100 for like 10 sec before I downclock eventually to 2038 because of the voltage, but I can hold 2050 no problem with less voltage.
Also, the top results are so close that benching this card seems a little pointless unless I can annihilate my nearest rival, i.e. maintain 2200+ on a bench!
I'd rather game this card to death


----------



## Benny89

Quote:


> Originally Posted by *feznz*
> 
> remember it's all about bragging here. I can do 2100 for like 10 sec before I downclock eventually to 2038 because of the voltage, but I can hold 2050 no problem with less voltage.
> Also, the top results are so close that benching this card seems a little pointless unless I can annihilate my nearest rival, i.e. maintain 2200+ on a bench!
> I'd rather game this card to death


Precisely. Max clocks people post here are just benches, or many times BF1 results, which is a very low GPU-usage game nowadays. I can bench 2088 if I want, but in any game that will never hold. I can hold 2063 all day in BF1, but in Witcher 3 I have to go 2012mhz max.

And the gain of those couple tens of mhz is nothing really.

Benches are not games


----------



## KingEngineRevUp

Quote:


> Originally Posted by *feznz*
> 
> remember it's all about bragging here. I can do 2100 for like 10 sec before I downclock eventually to 2038 because of the voltage, but I can hold 2050 no problem with less voltage.
> Also, the top results are so close that benching this card seems a little pointless unless I can annihilate my nearest rival, i.e. maintain 2200+ on a bench!
> I'd rather game this card to death


Agreed, the average isn't actually 2100, and not "everyone" can even do it.

The average on air is like *1847* and on water it's *2032*.

http://hwbot.org/hardware/videocard/geforce_gtx_1080_ti/

Even looking closer at data, you can see not many are doing 2100 Mhz.

http://hwbot.org/benchmark/3dmark_-_time_spy/rankings?hardwareTypeId=videocard_2821&cores=1#start=0#interval=20


----------



## Addsome

Quote:


> Originally Posted by *Slackaveli*
> 
> ok, thanks man. i'll give it another round, even though i pretty much know my sweet spot is 11,880. i just want to confirm my tests and this should do it.


Did you get OCCT to work? I'm getting the same user-cancelled error.


----------



## Slackaveli

haven't tried again yet tbh.
Quote:


> Originally Posted by *Addsome*
> 
> Did you get OCCT to work? I'm getting the same user-cancelled error.


Quote:


> Originally Posted by *SlimJ87D*
> 
> Agreed, the average isn't actually 2100, and not "everyone" can even do it.
> 
> The average on air is like *1847* and on water it's *2032*.
> 
> http://hwbot.org/hardware/videocard/geforce_gtx_1080_ti/
> 
> Even looking closer at data, you can see not many are doing 2100 Mhz.
> 
> http://hwbot.org/benchmark/3dmark_-_time_spy/rankings?hardwareTypeId=videocard_2821&cores=1#start=0#interval=20


exactly. i can actually game on some games at 2101 and not crash, for hours. but witcher 3 knocks me down to 2063. it is what it is. It even went to 2050 and then bounced back and forth to 2063 when I played for 4 hours the other day.

But, on air, even if you call it 2050 because it blinks that sometimes, THAT is GOLDEN on air. 200MHz above the average. People in here sometimes seem MAD at 2050! I don't get that.


----------



## feznz

Quote:


> Originally Posted by *Benny89*
> 
> Precisely. Max clocks people post here are just benches, or many times BF1 results, which is a very low GPU-usage game nowadays. I can bench 2088 if I want, but in any game that will never hold. I can hold 2063 all day in BF1, but in Witcher 3 I have to go 2012mhz max.
> 
> And the gain of those couple tens of mhz is nothing really.
> 
> Benches are not games


Quote:


> Originally Posted by *SlimJ87D*
> 
> Agreed, the average isn't actually 2100, and not "everyone" can even do it.
> 
> The average on air is like *1847* and on water it's *2032*.
> 
> http://hwbot.org/hardware/videocard/geforce_gtx_1080_ti/
> 
> Even looking closer at data, you can see not many are doing 2100 Mhz.
> 
> http://hwbot.org/benchmark/3dmark_-_time_spy/rankings?hardwareTypeId=videocard_2821&cores=1#start=0#interval=20


I think you both deserve a +1

I can't believe the Time Spy ranks. Where are all the guys here with the 2100Mhz OCs, when a 2038 core is taking 4th place on a stock cooler







with only 2 LN2 boys taking the first 2 spots

I am where I want to be with the extra $150 premium I paid for a Strix, and under water, holy cow, I think I could take 3rd spot if it were a winter morning here (as it will be in a few months). I'd have to spend a few more $$$$ on an x99, but with X299 due in June I would just be throwing my money away



source http://hwbot.org/benchmark/3dmark_-_time_spy/rankings?hardwareTypeId=videocard_2821&cores=1#start=0#interval=20


----------



## KedarWolf

Quote:


> Originally Posted by *Addsome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Slackaveli*
> 
> ok, thanks man. i'll give it another round, even though i pretty much know my sweet spot is 11,880. i just want to confirm my tests and this should do it.
> 
> 
> 
> Did you get OCCT to work? I'm getting the same user-cancelled error.

I downloaded the 4.4.0 zip and it worked fine.

http://www.ocbase.com/download/OCCTPT4.4.0.zip


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> I downloaded the 4.4.0 zip and it worked fine.
> 
> http://www.ocbase.com/download/OCCTPT4.4.0.zip


huh. the test just wouldn't even start for me.

On another note, i saw spikes to 355w in World of Warships, but this card crushes it: full ultra, 4xSMAA, FXAA high, all very high everything it's got at 4096x2160, and I get mid-70s fps. Sweet.
Quote:


> Originally Posted by *feznz*
> 
> I think you both deserve a +1
> 
> I can't believe the Time Spy ranks. Where are all the guys here with the 2100Mhz OCs, when a 2038 core is taking 4th place on a stock cooler
> 
> 
> 
> 
> 
> 
> 
> with only 2 LN2 boys taking the first 2 spots
> 
> I am where I want to be with the extra $150 premium I paid for a Strix, and under water, holy cow, I think I could take 3rd spot if it were a winter morning here (as it will be in a few months). I'd have to spend a few more $$$$ on an x99, but with X299 due in June I would just be throwing my money away
> 
> 
> 
> source http://hwbot.org/benchmark/3dmark_-_time_spy/rankings?hardwareTypeId=videocard_2821&cores=1#start=0#interval=20


because not all of us have 8 core cpus, which is REQUIRED to make that leaderboard. the way you search it, idk, where am I? use a little bit of common sense and search by 4-core cpus and you'll see a lot of HIGHER gpu scores than those guys. That post is bad. It is a chart full of the couple dozen folks on earth with 6950Xs. jeez, dude. I have a gpu score of 10,900 in timespy.....

fps are the only # that matters anyway. And in games, not benches. 2088 on most games ive played, here. 2063 on the others (or 2100 on some). it isn't that unheard of.

I'm 3rd on my ENTIRE cpu's Timespy tests, btw. that sated me, personally, on the benching.

Edit: it won't let me post my in-game screenshots because it's too much data?? wth. How can you post a 4k screenshot on here?


----------



## NYU87

Quote:


> Originally Posted by *Slackaveli*
> 
> LOL! How misinformed could you possibly be?
> I'm 130-144+fps in bf1 @qhd, CRUSHING ryzen 4 cores, beating 6-8 cores, beating INTEL 6 CORES, bro. Beating all Intel except 5.0Ghz+ Kaby Lakes. Beating all of the ones you just mentioned. PROVEN fact. But, you do your thing. I just have to counter-point so the community isn't misled. TBC, I'm not advocating a from-scratch z97 system in 2017; it's for those with a 4690k/4790k on z97. THEY get a sweet, sweet deal.


Broadwell-E is the exact same core as Broadwell-C, so how could it be faster unless it's clocked higher? Broadwell-E is probably faster clock for clock due to its large L3 cache. Also, not everyone just plays games with their PC. Your Broadwell-C is not some magic CPU. It's slower than Skylake and Kaby Lake. Also, I would never get a quad core at this point.


----------



## done12many2

Quote:


> Originally Posted by *feznz*
> 
> I think you both deserve a +1
> 
> I can't believe the Time Spy ranks. Where are all the guys here with the 2100Mhz OCs, when a 2038 core is taking 4th place on a stock cooler
> 
> 
> 
> 
> 
> 
> 
> with only 2 LN2 boys taking the first 2 spots
> 
> I am where I want to be with the extra $150 premium I paid for a Strix, and under water, holy cow, I think I could take 3rd spot if it were a winter morning here (as it will be in a few months). I'd have to spend a few more $$$$ on an x99, but with X299 due in June I would just be throwing my money away
> 
> 
> 
> source http://hwbot.org/benchmark/3dmark_-_time_spy/rankings?hardwareTypeId=videocard_2821&cores=1#start=0#interval=20


You'll find that most of the guys on HWBot care more about the results than the actual frequency. I see the opposite of that in some posts here.


----------



## Slackaveli

Quote:


> Originally Posted by *NYU87*
> 
> Broadwell-E is the exact same core as Broadwell-C, how could it be faster unless it's clocked higher? Broadwell-E is probably faster clock for clock due to the massive L2 cache. Also not everyone just plays games with their PC. Your Broadwell-C is not some magic CPU. It's slower than Skylake and Kaby Lake. Also I would never get a quad core at this point.


LMAO. wow, dude, salty much. i clearly said IN GAMING.

Did you just say "massive cache"? Like, 20mb massive? 128mb is massive; the 5775-c has many TIMES more cache, which is exactly why a broadwell-c wrecks broadwell-e in games, badly and consistently, and you can't prove otherwise. I hope you spend hours trying, bud.

browse as you see fit. I've had this argument 2 dozen times and I haven't lost it yet, so... yeah.

https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,33


----------



## Nico67

Quote:


> Originally Posted by *GosuPl*
> 
> On 2x TITAN X Maxwell i don't have any performance loss and GPU usage is better than on 2x 1080Tis. People have problems with SLI + G-Sync on Pascal. I'm thinking about returning to 2x TITAN X Maxwell and just skipping Pascal. For me, the performance gain from changing 2x TITAN X [email protected]/8000 for 2x GTX [email protected]/11912 is too low...
> 
> 30/35 fps on 1440p and 20/25 fps on 4k is not worth it with those G-Sync problems :/


Could it be more related to the fact that Gsync is now operating at decent fps? and isn't 30/35fps 1440p and 20/25 fps 4k still like 60% better?


----------



## KedarWolf

Quote:


> Originally Posted by *GosuPl*
> 
> 1080Ti SLI + G-Sync = less performance. On my former GPUs, 2x TITAN X Maxwell, i didn't have any performance loss with G-Sync. But on 2x 1080Tis, the loss is noticeable.
> 
> 
> 
> 
> 
> Any fixes ?


Try turning off vsync, downloading RivaTuner, and capping your frame rate to 2 less than your refresh rate. Like on my 4K G-Sync screen, I cap my frame rate with RivaTuner at 58, two less than my refresh rate of 60hz.

Edit: You do know with G-Sync you need to keep your frame rate less than your refresh rate, right?
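The rule of thumb here (keep the cap a couple of fps under the refresh rate so G-Sync never hands off to V-Sync at the top of the range) generalizes trivially:

```python
# G-Sync frame-cap rule of thumb: stay a bit under the refresh rate.

def gsync_cap(refresh_hz, margin=2):
    """Frame cap to set in RivaTuner for a given refresh rate."""
    return refresh_hz - margin

print(gsync_cap(60))   # -> 58, the 4K/60hz example above
print(gsync_cap(144))  # -> 142 for a 144hz panel
```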


----------



## Nico67

Anyone tried Black Desert with a 1080Ti at a stable overclock? It runs fine at default for me, but at 2100/6003 it drops to a black screen, then comes back, and repeats this til it says it's out of memory. Tried higher voltage and lower clks, marginally better but it still does it; tried default memory clk and again no real improvement. Funnily it doesn't even use more than about 80% gpu and barely power limits.

Wanted to see what frames were like so I'm not overly bothered, but it seems some graphic effects have a larger bearing on stability than load does? Similar to what you are seeing with Witcher 3, I guess


----------



## Dasboogieman

Quote:


> Originally Posted by *NYU87*
> 
> Broadwell-E is the exact same core as Broadwell-C, so how could it be faster unless it's clocked higher? Broadwell-E is probably faster clock for clock due to its large L3 cache. Also, not everyone just plays games with their PC. Your Broadwell-C is not some magic CPU. It's slower than Skylake and Kaby Lake. Also, I would never get a quad core at this point.


Broadwell C has an even bigger cache, that L4 cache is 128mb and has only just recently been matched in latency + BW by extreme DDR4 speeds + Kaby lake @ 5ghz. That Crystalwell cache is the reason why the 5775c is so fast, even though its the same architecture, the 5775c core can maintain more of its IPC as datasets get bigger than the L2.


----------



## Benny89

Quote:


> Originally Posted by *Nico67*
> 
> Anyone tried Black Desert with a 1080Ti at a stable overclock? It runs fine at default for me, but at 2100/6003 it drops to a black screen, then comes back, and repeats this til it says it's out of memory. Tried higher voltage and lower clks, marginally better but it still does it; tried default memory clk and again no real improvement. Funnily it doesn't even use more than about 80% gpu and barely power limits.
> 
> Wanted to see what frames were like so I'm not overly bothered, but it seems some graphic effects have a larger bearing on stability than load does? Similar to what you are seeing with Witcher 3, I guess


Mate, 2100 is way too high for such games. You have to go lower, much lower.

2100 is something only a few very light games will allow you to keep. Any graphics-intensive game will break those clocks with ease







.

Some games just won't allow high OC at all.

You can see some people here holding 2050/63 clocks in Witcher 3 but they are at sub-50C temps.

And they really don't get any extra performance over lets say 2000 to 2025mhz







.

Try giving it like 2063 on the same bin as your 2100, and see if that will allow BD to work. If not, just go to a lower clock on higher bins, or just go back to good old sliders without playing with the curve


----------



## Nico67

Quote:


> Originally Posted by *Benny89*
> 
> Mate, 2100 is way too high for such games. You have to go lower, much lower.
> 
> 2100 is something only a few very light games will allow you to keep. Any graphics-intensive game will break those clocks with ease
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Some games just won't allow high OC at all.
> 
> You can see some people here holding 2050/63 clocks in Witcher 3 but they are at sub-50C temps.
> 
> And they really don't get any extra performance over lets say 2000 to 2025mhz
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Try giving it like 2063 on the same bin as your 2100, and see if that will allow BD to work. If not, just go to a lower clock on higher bins, or just go back to good old sliders without playing with the curve


It's funny, because The Division and The Secret World, both pretty demanding and in the latter case poorly optimised/unstable games, run fine at 2101/6003 for long sessions. Black Desert has always been problematic, but I will see if I can find a better setting for it. I guess we are getting closer each generation to having game-specific oc profiles









and you're right, performance would be negligible at lower clks; it's more the weird symptoms, not the usual crashing, locking or artifacts.


----------



## sperson1

Hey everyone, just got my Ti today and have to say I love it; don't miss my 780 Ti at all


----------



## buddatech

So am I still throttling? Depending on the game I play, temps are never over 49c, yet clocks jump around from 1987 to 2063MHz?


----------



## Slackaveli

Quote:


> Originally Posted by *Dasboogieman*
> 
> Broadwell C has an even bigger cache, that L4 cache is 128mb and has only just recently been matched in latency + BW by extreme DDR4 speeds + Kaby lake @ 5ghz. That Crystalwell cache is the reason why the 5775c is so fast, even though its the same architecture, the 5775c core can maintain more of its IPC as datasets get bigger than the L2.


well said, mate.


----------



## KraxKill

Quote:


> Originally Posted by *buddatech*
> 
> So am I still throttling? Depending on the game I play, temps are never over 49c, yet clocks jump around from 1987 to 2063MHz?


You'd have to go below 28C under load to not be hitting throttle points.


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> You'd have to go below 28C under load to not be hitting throttle points.


yeah, but that seems to me an extreme throttle.


----------



## Nico67

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, but that seems to me an extreme throttle.


Yeah, more likely to be power if it's jumping about


----------



## KingEngineRevUp

I thought G-Sync didn't report crap frames. I know that's not the right term, but I read G-Sync scraps certain frames which would have caused issues (stutter, tearing, etc.), so benchmarks will report lower scores.


----------



## dansi

If the 5775C can run stably at 4.5Ghz, I feel it will beat the 7700K at 5Ghz in gaming.
Too bad I don't see any 5775C at 4.5Ghz or more, IIRC.


----------



## Dasboogieman

Quote:


> Originally Posted by *SlimJ87D*
> 
> I thought G-Sync didn't report crap frames. I know that's not the right term, but I read G-Sync scraps certain frames which would have caused issues (stutter, tearing, etc.), so benchmarks will report lower scores.


I would believe that, AMD got a lot of flak a couple of years ago with the 69xx series sending out tons of runt frames that were inflating the benchmark scores but were stuttering like hell in games.
Quote:


> Originally Posted by *dansi*
> 
> If the 5775C can run stably at 4.5Ghz, I feel it will beat the 7700K at 5Ghz in gaming.
> Too bad I don't see any 5775C at 4.5Ghz or more, IIRC.


Yeah, I dunno why Haswell responds so badly to high clockspeeds. That being said, a delidded 5775c can probably do 4.5ghz on heavy WC.

And whether it beats the 7700k depends entirely on DDR4 speeds. A Kaby Lake with 3800mhz DDR4 with tight timings will likely cream the 5775c simply on IPC, because the advantage of the L4 gets negated.

That being said, the issue was never about absolute performance but performance per dollar.
Kaby Lake pretty much needs a delid, high voltage and excellent cooling to do 5+ghz. Plus, 3800mhz DRAM with tight timings is expensive. Compare that to the 5775c, which is a drop-in replacement for Z97 owners at a nominal cost (depending on how much they sold their previous CPU for 2nd hand) and an ez multiplier bump.


----------



## buddatech

Quote:


> Originally Posted by *KraxKill*
> 
> You'd have to go below 28C under load to not be hitting throttle points.


Dang that's loooow.... Thank you!

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, but that seems to me an extreme throttle.


hovers mostly @2050, but I do see an occasional drop to 1987MHz


----------



## Nico67

Quote:


> Originally Posted by *buddatech*
> 
> Dang that's loooow.... Thank you!
> hovers mostly @2050, but I do see an occasional drop to 1987MHz


How many bins does it drop to get to 1987? Just raise those bins a bit and see how it goes for stability. It may not be an extreme throttle, more of an extreme drop-off on your curve.
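For anyone counting bins: Pascal's boost steps land roughly 13MHz apart (an approximation; the ladder shows up in monitoring tools as 1987, 2000, 2012, 2025, 2038, 2050, 2063...), so a drop from 2050 to 1987 is about five bins, not one:

```python
# Rough model of Pascal's ~13 MHz boost bins (approximation, not a spec).

BIN_MHZ = 13

def bins_between(clk_a, clk_b, bin_mhz=BIN_MHZ):
    """Approximate number of boost bins between two observed clocks."""
    return round(abs(clk_a - clk_b) / bin_mhz)

print(bins_between(2050, 1987))  # -> 5, the drop described above
print(bins_between(2063, 2050))  # -> 1
```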


----------



## Slackaveli

Quote:


> Originally Posted by *dansi*
> 
> If 5775C can run stably at 4.5Ghz, i feel it will beat 7700K at 5Ghz, in gaming.
> Too bad i dont see any 5775C at 4.5Ghz or more IIRC.


SOURCE, because that's wrong. If it holds 4.2 it beats up to a 5.0ghz. 4.2, NOT 4.5. And damn near ALL of us can hold 4.3. You're just making stuff up now.









Also, a few of the guys in the owner's club are at 4.5. I can validate at 4.5 but can't game on it; I can game easily and all day at 4.3, though. 4.4 is too much voltage for my liking and no real gain.


----------



## Slackaveli

Quote:


> Originally Posted by *Nico67*
> 
> How many bins does it drop to get to 1987? Just raise those bins a bit and see how it goes for stability. It may not be an extreme throttle, more of an extreme drop-off on your curve.


yeah, just fix the curve point that drops to 1987 up to 2000.


----------



## feznz

Quote:


> Originally Posted by *Slackaveli*
> 
> huh. test just wouldnt even start for me.
> 
> On another note, i saw spikes to 355w in World of Warships, but this card crushes full ultra, 4xsmaa, fxaa high, all very high everything it's got at 4096x2160 and I get mid 70s fps. Sweet.
> because not all of us have 8 core cpus, which is REQUIRED to make that leaderboard. the way you search it, idk, where am I? use a little bit of common sense and search by 4 core cpu's and you'll see a lot of HIGHER gpu scores than those guys. That post is bad. It is a chart full of the couple dozen folks on earth with 6950xs. jeez, dude. I have a gpu score of 10,900 in timespy.....
> 
> fps are the only # that matter anyway. And in games, not benches. 2088 on most games ive played, here. 2063 on the others (or 2100 on some). it isnt that unheard of.
> 
> I'm 3rd on my ENTIRE cpu's Timespy tests, btw. that sated me, personally, on the benching.
> 
> Edit. it wont let me post my in game screenshots because it's too much data?? wth. How can you post a 4k screenshot on here?










seriously









Quote:


> Originally Posted by *done12many2*
> 
> You'll find that most of the guys on HWBot care more about the results then the actual frequency. I see the opposite of that in some posts here.


Yeah, a bunch of guys who got scores good enough for a leaderboard, which gives an idea of what the actual top 1080 Tis are clocking to.
I was aiming that scoreboard more at those who can't believe they can't game at the same clocks.


----------



## Slackaveli

Quote:


> Originally Posted by *feznz*
> 
> 
> 
> 
> 
> 
> 
> 
> seriously
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah, a bunch of guys who got scores good enough for a leaderboard, which gives an idea of what the actual top 1080 Tis are clocking to.
> I was aiming that scoreboard more at those who can't believe they can't game at the same clocks.


well, you POSTED a leaderboard with all 6950x owners...


----------



## feznz

Quote:


> Originally Posted by *Slackaveli*
> 
> well, you POSTED a leaderboard with all 6950x owners...


But if you look at the GPU core clock, you will see they are getting about the same clocks as most people here.
And with a 6950X, that's great; it means they're not shy of a dollar, so it was probably no holds barred on hardware and cooling. But then again, did you check every submission individually? The CPUs are not listed.

I've still got my trusty 3770K @ 5GHz, so I do look deeply into those things. As I said, when the X299 chipset is released I'll finally be tempted to upgrade my old Z77 MVE, knowing that I will also be getting another GPU to actually warrant the CPU upgrade.

Yeah, I do look at the GPU score separately and take no notice of the CPU... how did I imply I didn't?


----------



## buddatech

Quote:


> Originally Posted by *Nico67*
> 
> How many bins does it drop to 1987, just raise to bins a bit and see how it goes for stability. May not be an extreme throttle, more of an extreme drop off on your curve.


Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, just curve fix the point that drops to 1987 up to 2000.


Not doing a custom curve; a manual OC of +169/+500, PL 120. Tried voltage @ 0, +50 and +100.


----------



## Nico67

Quote:


> Originally Posted by *buddatech*
> 
> Not doing a custom curve; a manual OC of +169/+500, PL 120. Tried voltage @ 0, +50 and +100.


All you have to do is open the curve and adjust a few bins; it just continues fine-tuning from where you're at.


----------



## Clukos

Managed to get a stable 1898MHz core at 0.900V with +720 mem, stable for hours and hours in Witcher 3! 190-220 watt power draw, no throttling, silent, and stays under 65C under load.









Maybe I can push the core a bit more, I'll have to test when I'm done testing my CPU limits.


----------



## zipeldiablo

New driver and the screen still goes black for a second randomly.
Very annoying.
How many people are affected by this issue?


----------



## KedarWolf

Quote:


> Originally Posted by *zipeldiablo*
> 
> New driver and the screen still goes black for a second randomly.
> Very annoying.
> How many people are affected by this issue?


Try this, works for Windows 10 as well.







The TDR Delay thingy.

https://support.microsoft.com/en-us/help/2665946/-display-driver-stopped-responding-and-has-recovered-error-in-windows-7-or-windows-vista
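For reference, the "TDR Delay thingy" from the linked Microsoft article is a registry value under the GraphicsDrivers key. A sketch of what the .reg would look like, raising the timeout from the 2-second default to 8 seconds (the exact number is your call; back up the registry first and reboot after merging):

```
Windows Registry Editor Version 5.00

; Give the display driver 8 seconds (default: 2) to respond
; before Windows resets it (TDR).
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers]
"TdrDelay"=dword:00000008
```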


----------



## buddatech

Quote:


> Originally Posted by *Nico67*
> 
> All you have to do is open the curve and adjust a few bins, it just continues on fine tuning where you are at


Okay, tried this and clocks drop as low as 1974MHz now.

Clearly I'm doing something wrong. That was in GoW4; in Heaven it ran 2063-2076MHz, but I didn't let it finish, so I'm sure it would have dropped sooner or later.


----------



## Nico67

Quote:


> Originally Posted by *buddatech*
> 
> Okay, tried this and clocks drop as low as 1974MHz now.
> 
> Clearly I'm doing something wrong. That was in GoW4; in Heaven it ran 2063-2076MHz, but I didn't let it finish, so I'm sure it would have dropped sooner or later.


Nah, nothing's wrong; it's just dropping more than one bin. Raise the one at 1974 to 2000 as well and test. If it drops to a certain frequency, just see what bin that is and how many bins it's dropped. Usually tune it down to 1.000V and it should be pretty good.
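For anyone counting bins: Pascal boost clocks step in roughly 12.5-13MHz increments (that step size is inferred from the clocks reported in this thread, e.g. 2000 → 1987 → 1974, not an official figure), so you can estimate how many curve points need raising. A quick sketch:

```python
def bins_dropped(target_mhz, observed_mhz, step_mhz=13.0):
    """Estimate how many ~13MHz Pascal clock bins separate two clocks.

    step_mhz is an approximation; Pascal actually alternates 12/13MHz steps.
    """
    return round((target_mhz - observed_mhz) / step_mhz)

# 2000 -> 1987 is one bin; 2000 -> 1974 is two bins,
# so two curve points need raising, not one.
print(bins_dropped(2000, 1987))  # 1
print(bins_dropped(2000, 1974))  # 2
```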


----------



## zipeldiablo

Quote:


> Originally Posted by *KedarWolf*
> 
> Try this, works for Windows 10 as well.
> 
> 
> 
> 
> 
> 
> 
> The TDR Delay thingy.
> 
> https://support.microsoft.com/en-us/help/2665946/-display-driver-stopped-responding-and-has-recovered-error-in-windows-7-or-windows-vista


Not sure it fits the case.
The screen only goes out for a second before displaying again; basically a signal interruption.
The driver didn't crash.


----------



## KedarWolf

Quote:


> Originally Posted by *zipeldiablo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Try this, works for Windows 10 as well.
> 
> 
> 
> 
> 
> 
> 
> The TDR Delay thingy.
> 
> https://support.microsoft.com/en-us/help/2665946/-display-driver-stopped-responding-and-has-recovered-error-in-windows-7-or-windows-vista
> 
> 
> 
> Not sure it fits the case.
> The screen only goes out for a second before displaying again; basically a signal interruption.
> The driver didn't crash.

I'd still try it; it can be a driver crash even if you don't get the error message.


----------



## Benny89

Anyone else having problems with G-Sync since last driver update?

I am pretty sure G-Sync does not work in Witcher 3 for me. I see a lot of micro-stuttering when I play. Experience is not smooth as it should be. A lot of small stutter, jumps...


----------



## buddatech

Quote:


> Originally Posted by *Nico67*
> 
> Nah, nothing wrong, its just dropping more than one bin, just raise the one at 1974 to 2000 as well and test. if it drops to a certain frequency, just see what that bin is and how many bins its dropped, usually tune it down to 1.000v and it should be pretty good.


This works thank you +rep!


----------



## TheBoom

Quote:


> Originally Posted by *zipeldiablo*
> 
> Not sure it fits the case.
> The screen only goes out for a second before displaying again; basically a signal interruption.
> The driver didn't crash.


Might wanna lower your oc and test again.


----------



## KedarWolf

Quote:


> Originally Posted by *Benny89*
> 
> Anyone else having problems with G-Sync since last driver update?
> 
> I am pretty sure G-Sync does not work in Witcher 3 for me. I see a lot of micro-stuttering when I play. Experience is not smooth as it should be. A lot of small stutter, jumps...


Turn V-Sync off in game, download RivaTuner, and cap frame rates to 2 less than your refresh rate. Also, an unstable memory overclock can cause stuttering; try lowering it.


----------



## Benny89

Quote:


> Originally Posted by *KedarWolf*
> 
> Turn vsync off in game, download RivaTuner, cap frame rates to 2 less than your refresh rate, also an unstable memory overclock can cause stuttering, try lowering it.


It's Witcher 3; fps never goes higher than 119. V-Sync is off and memory OC is 0 because I'm testing slider clocks.

Didn't have this on the last drivers...


----------



## Nico67

Quote:


> Originally Posted by *buddatech*
> 
> This works thank you +rep!


Cool, glad you stuck with it. Once you get the idea of what's happening, you can work with it.


----------



## Nico67

Quote:


> Originally Posted by *Benny89*
> 
> Anyone else having problems with G-Sync since last driver update?
> 
> I am pretty sure G-Sync does not work in Witcher 3 for me. I see a lot of micro-stuttering when I play. Experience is not smooth as it should be. A lot of small stutter, jumps...


My G-Sync stuffed up with the latest drivers; the monitor's built-in frame counter stopped displaying at all, thought I'd killed G-Sync lol. Use the Nvidia Control Panel G-Sync indicator to see if it's functioning in game, or if your monitor has an info page showing the working mode, it should say G-Sync. If neither shows G-Sync working, switch G-Sync to Windowed + Full Screen mode and it should hopefully start working again; you can switch back to just Full Screen afterward and it should stay on G-Sync.


----------



## pez

Quote:


> Originally Posted by *AMDATI*
> 
> That makes absolutely zero sense. Gsync doesn't 'get rid' of partial frames, GPU's don't draw partial frames. What Gsync does is make sure a whole frame is displayed as soon as it's sent to the monitor, rather than display part of the next frame. In non gsync monitors, the GPU essentially has to wait on the monitor (vsync). In adaptive sync displays, the monitor waits on the GPU and displays the whole frame before moving on to the next.
> 
> Also, even if frames were dropped, it wouldn't register from the system FPS readout, it would only happen at the screen hardware level.


Apologies for the incorrectness; I was only vaguely recalling how G-Sync works.

There were some people complaining of worse benchmark scores with G-Sync enabled, but I couldn't remember the cause.


----------



## zipeldiablo

Quote:


> Originally Posted by *KedarWolf*
> 
> I'd still try it, can be a driver crash even if you don't get the error message.


That would be really weird then, since the issue can happen once a day, or once every ten seconds for up to two minutes (sometimes it happens twice within 6 or 7 seconds).
Will still give it a shot though.


----------



## TheBoom

So I gave the curve one last go.

Tried to match the same bins as the manual OC, as well as lower bins at lower voltages. A curve with voltage bins one clock bin higher crashes in 8K Supo, while matched bins remained stable at 2012-2025 with a score of 4278. One last test with a lower-voltage flat curve at 2012 yielded 4270.

A manual OC of +23 has clocks and voltages jumping all around, but at the end of the day it still gave me the highest score, 4472.

Yeah, I'm done with the curve lol. I think it gives you the impression of better performance, with more stable clocks at lower voltages and less power usage, but it can't seem to beat the manual OC in raw score/fps/performance.

Not worth the time fiddling with it IMHO.


----------



## PasK1234Xw

I only found the curve useful once I put the hybrid kit on and kept the card under 50C.


----------



## AMDATI

I think the only way you should be getting issues with G-Sync dropping frames is if you're going in and out of the refresh rate of your monitor. Otherwise, G-Sync is functionally no different to the video card than having V-Sync off and no G-Sync; it's done entirely on the monitor. Enabling G-Sync in the control panel only tells your G-Sync module to enable over the DisplayPort connection. So from the processing side of things, nothing should be affected. What G-Sync does is all done on the G-Sync module: it buffers frames and makes sure the entire frame is displayed at once, rather than allowing half of the next frame to be drawn.

A lot of people don't realize this, but G-Sync modules actually contain nearly 1GB of triple-channel DDR3, likely more in monitors bigger than 4K/60Hz. This is, effectively, their frame buffer, which is large enough to store multiple raw frames.

None of this should affect the performance of the GPU. G-Sync shouldn't be slowing things down or throwing frames away. Now maybe the program itself is misreading things because of some bug.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> Anyone else having problems with G-Sync since last driver update?
> 
> I am pretty sure G-Sync does not work in Witcher 3 for me. I see a lot of micro-stuttering when I play. Experience is not smooth as it should be. A lot of small stutter, jumps...


I made a post earlier about disabling Windows fast boot. It's conflicting with the current drivers.


----------



## Benny89

Quote:


> Originally Posted by *SlimJ87D*
> 
> I made a post earlier to disable Windows fastboot. It's conflicting with the current drivers.


Thing is, I have fast boot off. I did that a long time ago...

G-Sync is showing in NCP and in the monitor's OSD menu, so I guess it is working.

Weird...


----------



## Nico67

Quote:


> Originally Posted by *Benny89*
> 
> Thing is, I have fast boot off. I did that a long time ago...
> 
> G-Sync is showing in NCP and in the monitor's OSD menu, so I guess it is working.
> 
> Weird...


If your monitor has an OSD frame rate display, that's the best way to confirm it's working. NCP showed it on for me, but when I toggled the G-Sync indicator on, it wasn't showing in game until I changed the G-Sync mode, and then it started showing. I always check with the OSD frame rate, though, to confirm it's operating.


----------



## Savatage79

No FTW3 reviews yet? Slated to ship on the 1st, at least that's what my order says. But I'm dying to hear about it, and even more so to try it; can't wait for that sucker to arrive so I can send my Aorus back for RMA.


----------



## MRCOCOset

My best stable curve point on an ASUS 1080 Ti @ Palit BIOS:

http://obrazki.elektroda.pl/5232797500_1493570149.png


----------



## KingEngineRevUp

Quote:


> Originally Posted by *MRCOCOset*
> 
> My best stable curve point on an ASUS 1080 Ti @ Palit BIOS:
> 
> http://obrazki.elektroda.pl/5232797500_1493570149.png


At 1.093V you're going to have power limiting if you're not shunt modded. The Palit BIOS only gives you an extra 30 watts; you would need 400+ watts to run 1.093V without power limiting.

Read KraxKill's comment below. If you're not giving your card the power it needs, you're getting diminishing returns. KraxKill is also shunt modded, so he's getting around 375-400 watts of power, yet he's hitting another type of power limiting (normalized power limiting).
Quote:


> Originally Posted by *KraxKill*
> 
> Here's a 2151 @ 1.093v Firestrike for fun.
> 
> My top scores are actually 23100 at 2100 @ 1.063v, as I'm hitting the pwr limits even with the shunt mod. Clocking to 2151 puts my voltage where I begin to trip the "normalized TDP" throttle, while 2100 @ 1.063v keeps me right at the limit. This is on a 4790K @ 5.2, 2133 CL9 DDR3.
> 
> The moral here is these cards are TDP limited beyond what the shunt mod appears to forgive. I could score higher at 2151 if I wasn't hitting the pwr limit, but since the card bounces off the rev limiter a couple of times during the run, scores are ultimately higher at 2100, where I'm just under the limit due to voltage.
> 
> If somebody has a verifiable method of increasing that limit I'm all ears.
> 
> Different drivers I know, but my scores haven't changed much with the new driver. Everything is within 0.1% of where they were on the old driver.
> 
> 2151 @ 1.093v
> http://www.3dmark.com/fs/12477286
> 
> 
> 2100 @ 1.063v
> http://www.3dmark.com/fs/12446981


This is more than likely happening to you too, but sooner, maybe at 1.050V or less.
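To put rough numbers on why 1.093V slams into the power limit: to first order, dynamic power scales with frequency times voltage squared (P ∝ f·V²). A back-of-the-envelope sketch using KraxKill's two operating points; the ~340W starting figure is illustrative, and this ignores leakage, which makes the real increase even steeper:

```python
def scaled_power(p1_watts, f1_mhz, v1, f2_mhz, v2):
    """First-order CMOS dynamic power scaling: P ~ f * V^2."""
    return p1_watts * (f2_mhz / f1_mhz) * (v2 / v1) ** 2

# If 2100MHz @ 1.063V draws ~340W, then 2151MHz @ 1.093V wants roughly:
estimate = scaled_power(340, 2100, 1.063, 2151, 1.093)
print(round(estimate))  # ~368W -- past a stock-BIOS limit, shunt mod or not
```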


----------



## Slackaveli

Quote:


> Originally Posted by *feznz*
> 
> But if you look at the GPU core clock then you will know they are getting about the same clocks as most people here
> and with 6950x that's great it means that they are not shy of a dollar so they was probably a no holds barred in computer hardware and cooling But then again did you check every submission individually? the CPUs are not listed
> 
> Just I still got my trusty 3770k @ 5Ghz I look deeply into those things..... as I said when the x299 chipset is released I be finally tempted to upgrade my old z77 MVE knowing that I will be also getting another GPU to actually warrant the CPU upgrade
> 
> yeah I do look at the GPU score separately and take no notice of CPU...... how did I imply I didn't?


Why would they need to be listed? I know what scores they achieve. One can just look at the scores and tell they were all 5960Xs. That was a biased post and you know it. Those dudes don't tune or overclock better because they are rich (and dumb enough to waste that much money); in fact, it's quite the opposite. Rich people are not taking the time to dial in the perfect overclock. If anything, it means nothing.

Your point was something about all the 2100 cards getting beat in Time Spy, and then you posted a list of scores that were ONLY that high because of CPU score. Bruh, please. We ain't stupid.

How about you compare something relevant to your point, like GPU scores. Or, better yet, stop using a heavily CPU-slanted benchmark like Time Spy when you are making a point about GPUs. But at least use the search parameters. We aren't comparing ourselves to k|ngp|n, slinkyPC and them anyway lol.

Look, I really don't care. But this is overclockers.net, so coming in here trying to make fun of people for pushing their cards is not kosher.


----------



## Bishop07764

Quote:


> Originally Posted by *Benny89*
> 
> Anyone else having problems with G-Sync since last driver update?
> 
> I am pretty sure G-Sync does not work in Witcher 3 for me. I see a lot of micro-stuttering when I play. Experience is not smooth as it should be. A lot of small stutter, jumps...


Have you tried changing your max rendered frames in the Nvidia control panel for Witcher 3. I have to change it to a 1 from the default of 3 or Shadow warrior 2 stutters annoyingly despite gsync.


----------



## Slackaveli

Quote:


> Originally Posted by *zipeldiablo*
> 
> Not sure it fits the case.
> The screen only goes out for a second before displaying again; basically a signal interruption.
> The driver didn't crash.


The driver starts to crash but recovers; this regedit extends the time the driver has to recover.
Quote:


> Originally Posted by *buddatech*
> 
> This works thank you +rep!


i tried to tell ya yesterday ;p
Quote:


> Originally Posted by *Nico67*
> 
> My Gsync stuffed up with the latest drivers, monitors built in frame counter stopped displaying at all, thought I'd killed Gsync lol. Use the Nvidia control panel Gsync indicator to see if it is functioning in game, or if you monitor has an info page that shows working mode that should say Gsync. If none showing Gsync working, then switch Gsync to Windowed+full screen mode and it should hopefully start working again, you can switch back to just fullscreen again afterward and it should stay Gsync.


yah. this. i always have g-sync indicator on. i hate not knowing if it's working for sure or not.
Quote:


> Originally Posted by *pez*
> 
> Apologies for the incorrectness. I vaguely recalled the process of Gsync.
> 
> There was some people complaining of worse benchmark scores with Gsync enabled, but I couldn't remember what was the cause of it.


You were actually pretty much right: yes, it does lower scores, and yes, it's b/c partial frames are not being counted by the application. Your conclusion WAS correct.








Quote:


> Originally Posted by *AMDATI*
> 
> I think the only way you should be getting issues with Gsync dropping frames, is if you're going in and out of the refresh rate of your monitor. Otherwise, Gsync is functionally no different to the videocard than having Vsync off and no Gsync, it's done entirely on the monitor.....enabling Gsync on the control panel only tells your Gsync module to enable over the displayport connection. So from the processing side of things, nothing should be affected. What Gsync does is all done on the Gsync module....it buffers frames then makes sure the entire frame is discharged at once, rather than allowing half of the next frame to be drawn.
> 
> A lot of people don't realize this, but Gsync modules actually contains nearly 1GB of triple channel DDR3, likely more in monitors bigger than 4k/60hz. This is effectively, their frame buffer, which is large enough to store multiple raw frames.
> 
> None of this should effect the performance of the GPU. Gsync shouldn't be slowing things down or throwing frames away. Now maybe, the program itself is misreading things because of some bug.


100% of the time your score will be lower with G-Sync on. In 100% of my tests this is the case, it's forum common knowledge, and you even get a warning on a run that your score is artificially lower b/c G-Sync was on. It IS indeed b/c the application doesn't count frames that don't fall within the standard frametime; partial frames are not counted by the application (even though they are displayed on the G-Sync monitor). Or something like that. But the conclusion of lower scores... that's a real thing.


----------



## viz0id

Everyone has been talking so much about how demanding Witcher 3 is, so I played it for an hour and took this screenshot.

I don't think I'll be touching the card anymore. If it can run 2088 from the start and only downclocks to 2075, I'm beyond happy. I've got a Strix OC on stock air cooling.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *viz0id*
> 
> Everyone has been talking so much about how demanding Witcher 3 is, so I played it for an hour and took this screenshot.
> 
> I don't think I'll be touching the card anymore. If it can run 2088 from the start and only downclocks to 2075, I'm beyond happy. I've got a Strix OC on stock air cooling.


Not trying to brag here, but I'm able to play Witcher 3 at 2113.5MHz at 1.062-1.075V.

The only thing that has been bugging me is that on an X41 my fans have to run at 100% to keep the card at 48C, and it annoys me without headphones.

In the past, with my 970, 980 and 980 Ti, Witcher 3 definitely let me know if my OC was going to work or not. Pretty much you'd speak to someone and the conversation would go on, but the scene would freeze.


----------



## viz0id

Quote:


> Originally Posted by *SlimJ87D*
> 
> Not trying to brag here, but I'm able to play Witcher 3 at 2113.5 Mhz at 1.062-1.075v.
> 
> The only thing that has been bugging me is that on a x41 my fans have to run at 100% to keep the card at 48C and it annoys me without headphones.


Yeah, I did not mean to brag either. I know it is not an exceptional result; I just did it to test my OC, finally be content with the profile I have chosen, and stop spending more time on it.


----------



## sperson1

I have to say I love my Strix; it boosts all the way up to 1967 stock and stays around 63C.


----------



## Slackaveli

Quote:


> Originally Posted by *viz0id*
> 
> Yeah i did not mean to brag either. I know it is not an execptional result, I just did it to test my OC and finally be content with the profile i have chosen, and stop using more time on it


it's not bragging, just reporting results!

I think Witcher 3 just gets those VRAMs toasty, so a long play period could result in a crash as they heat up if on a high overclock. When I play, I run +250 and have been able to stay at 2050/2063 for hours.


----------



## viz0id

Quote:


> Originally Posted by *Slackaveli*
> 
> it's not bragging, just reporting results!
> 
> I think the witcher 3 just gets those vrams toasty so long play period could result in a crash as they heat up if on a high overclock. When I play, i run +250 and have been able to stay 2050/2063 for hours.


Yeah, maybe I did not have any problems since I always run +250. I can run higher, but it reduces my performance (at least in Supo).

But Witcher 3 is a bit special as well, because I had to force the load for the screenshot. When I go to the map while playing etc., the GPU downclocks to idle mode, so it always lets the temps come down a fair bit in between the high-load periods.


----------



## Slackaveli

Quote:


> Originally Posted by *viz0id*
> 
> Yeah maybe i did not have any problems since i always run +250. I can run higher, but it reduces my performances (at least in supo).
> 
> But witcher3 is a bit special as well. Because i had to force the load for the screenshot. Because when i go to map while playing etc, the GPU downclocks to idle mode, so it always lets the temps go down a fair bit in between the high load periods.


lol, gotta do what ya gotta do!

It's actually nice that it does cool off in the map, b/c flushing out the heat buildup periodically helps keep temps low. Especially for something like VRAM.


----------



## feznz

Quote:


> Originally Posted by *Slackaveli*
> 
> why would they need to be listed? i know what scores they achieve. One can just look at the scores and tell they were all 5960x. That was a biased post and you know it. Those dudes don't tune or overclock better because they are rich (and dumb to waste that much money), in fact it's quite the opposite. Rich people are not taking the time to dial in the perfect overclock. If anything it means nothing.
> 
> Your point was something about all the 2100 getting beat in Timespy, and then you posted the list of scores that were ONLY that high because of cpu score. Bruh, please. We aint stupid.
> 
> How about you compare something relevant to your point, like GPU scores. Or , better yet, stop using a heavily cpu slanted benchmark like Timespy when you are making a point about gpu... But at least use the search parameters. We arent comparing ourselves to k|ngp|n, slinkyPC and them anyway lol.
> 
> Look, I really dont care. But this is overclockers.net, so coming in here trying to make fun of people for pushing their cards is not Kosher.


posting is caring









OK, I didn't assume all the scores came from a 5960X, because I investigate before shooting someone down.
13th and below are Ryzen CPUs or 5820s, not 5960Xs. Please don't skim-read and assume that because you can't beat it, nobody can without a 5960X.
No. 13 submission out of 56:
http://hwbot.org/submission/3525401_thesirensong_3dmark___time_spy_geforce_gtx_1080_ti_10426_marks

Have you heard of tunnel vision?


----------



## TheBoom

Quote:


> Originally Posted by *SlimJ87D*
> 
> At 1.093v you're going to have power limiting if you're not shunt modded. Palit bios only gives you an extra 30 watts, you would need 400+ Watts to run 1.093v without power limiting.
> 
> Read Kraxkills comment below. If you're not giving your card the power it needs, you're having diminshing returns. Kraxkills is also shunt modded so he's getting around 375-400 Watts of power yet he's hitting another type of power limiting (normalized power limiting).
> This is more than likely happening to you but sooner, maybe at 1.050v or less.


Thing is, testing 8K Supo at a lower locked voltage via the curve shows lower results. Yes, my card only goes up to a max of 345W and stays stable at the clocks I've set, but with a manual OC, even though the clocks jump around, it hits a max of 400W. The power limit kicks in, but in the end I still get a higher score.


----------



## KraxKill

Testing the Palit GTX 1080 Ti GameRock Premium BIOS along with KedarWolf's TDP .bat file (it fixes my reporting in GPU-Z, so rep and props that way), along with the shunt mod. I'm no longer hitting pwr in any instance at 1.093V up to about 2164, and potentially higher clocks that I won't get to without additional voltage. As you can see, I have regained scaling, with the green pwr flag staying out of my GPU-Z reporting. I'm not suggesting that this BIOS is for everyone, but if your card is shunt modded, fetching the higher MHz bins, and you are still, like me, hitting pwr, the above has at least in my case moved that pwr wall a bit. The scores are not much higher, but I feel the performance is visually more fluid without pwr showing up in the GPU-Z graph.

2100 @ 1.063v +550 mem
http://www.3dmark.com/fs/12446981

Please note this particular score was done on the previous Nvidia driver and should not be compared to the ones below, which were all done on 381.89.

2151 @ 1.093v +550 mem
http://www.3dmark.com/fs/12477286


2163 @ 1.093v +400 mem

http://www.3dmark.com/fs/12490144

2163 @ 1.093v +0 mem

http://www.3dmark.com/fs/12489719

2151 @ 1.093v +600mem - Palit BIOS (scaling returns)

http://www.3dmark.com/fs/12499663


----------



## KedarWolf

Quote:


> Originally Posted by *KraxKill*
> 
> Testing Palit GTX 1080Ti GameRock Premium BIOS along with KedarVolf's TDP bat file (fixes my reporting in GPUz, so rep and props that way) along with shunt mod. I'm no longer hitting pwr in any instance at 1.093v up to about 2164 and potentially higher. Clocks I will not get to without additional voltage. As you can see, I have regained scaling with the pwr green staying out of my GPUz reporting. I'm not suggesting that this BIOS is for all, but If your card is shunt modded, fetching the higher hz bins and you are still like me hitting pwr, the above has at least in my case moved a bit that pwr wall. The scores are not much higher but I feel the performance is more fluid visually without pwr showing up in the GPUz graph.


To have the .bat file run at every boot (note: you NEED the 30-second delay):

Right-click the .bat file, choose Edit to open it in Notepad, and change the number after -pl to your max power limit. You can look your max power limit up at:

https://www.techpowerup.com/vgabios/?architecture=&manufacturer=&model=GTX+1080+Ti&interface=&memType=&memSize=&since=

Open Task Scheduler (search for it in Run or Cortana) and add the .bat file to run at login as a Basic Task. Then double-click the entry in your task list and edit it like the screenshots, so the .bat file starts 30 seconds after login.

You need to click on Properties in the right-hand window to edit it like the pictures when you double-click it in your task list.

*Unzip this file for powerlimit.bat file.*









powerlimit.zip 0k .zip file














Or you can just edit and run the .bat file itself, but you'll have to do it on every login.









It seems manually setting the power limit, even if it doesn't actually change the value, fixes that issue.
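For anyone who can't grab the attachment, the .bat boils down to a delayed nvidia-smi call. A hypothetical reconstruction (the 30-second delay and -pl flag are from the post above; the 300W value and the nvidia-smi path are placeholders you must adjust for your card and install, and the scheduled task has to run elevated):

```
@echo off
rem Wait 30 seconds after login so the driver is fully loaded.
timeout /t 30 /nobreak
rem Set the board power limit in watts (replace 300 with YOUR card's max).
"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -pl 300
```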


----------



## RavageTheEarth

So I just got home from the weekend vacation and, as usual, I'm having issues running my 1V/2050 curve. Sometimes it works fine, but other times the card will decide to boost the voltage up to 1.05V/1.063V. I don't get it. So I once again just locked 1V @ 2050 and everything ran great. This was all during Wildlands, which is pretty demanding. In The Witcher 3, with the same settings, the clock dropped to 2038 and stayed there; that's the only game that drops the clock on me, though. I didn't even get a GPU Utilization flag in the PerfCap Reasons, which I always get. I've been using a big fan aimed directly at the card and it has really helped with temps. Just waiting for EK to release the block for this card; I can't wait to have silence and extremely cool temps. I'm not used to running on air at all.









Here's MSI Afterburner and GPU-Z while playing Ghost Recon Wildlands at 1440p 144Hz. Pretty happy the card isn't asking for more voltage, since I'm running a 2050 core with only 1V. Doing +300 on the memory because it seems to be the sweet spot for my card; I scored higher benching at +300 than I did at +400 and +500.


----------



## Luckbad

Thanks, whoever it was that suggested Reddit for selling. Not sure it's the best place to sell, though. People are uncool with $760 for a Zotac 1080 Ti AMP Extreme that clocks at 2037.5 stock and easily goes over 2100. They think it's overpriced.

I'll just keep answering unanswered threads here until people start giving rep, then I'll sell it here.


----------



## sperson1

Quote:


> Originally Posted by *Luckbad*
> 
> Thanks whoever it was that suggested Reddit for selling. Not sure it's the best place to sell, though. People are uncool with $760 for a Zotac 1080 Ti Amp Extreme that clocks at 2037.5 stock and easily goes over 2100. They think it's overpriced.
> 
> I'll just keep answering Unanswered threads here until people start giving rep then will sell it here.


I wish I had enough rep to sell on here. I have some GPUs and water cooling parts I don't need anymore.


----------



## TheBoom

Quote:


> Originally Posted by *Luckbad*
> 
> Thanks whoever it was that suggested Reddit for selling. Not sure it's the best place to sell, though. People are uncool with $760 for a Zotac 1080 Ti Amp Extreme that clocks at 2037.5 stock and easily goes over 2100. They think it's overpriced.
> 
> I'll just keep answering Unanswered threads here until people start giving rep then will sell it here.


I think it's more the fact that it's used/second-hand, and also probably because they know you overclocked/benched it.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *TheBoom*
> 
> Thing is testing 8K Supo at a lower locked voltage via the curve shows lower results. Yes my card only goes up to a max of 345w and stays stable at the clocks I've set it to, but with a manual oc even though clocks are jumping around it hits a max of 400w. Power limit kicks in but in the end i still get a higher score


If you're not shunt modded, then you should do some thorough tests and comparisons, that's all I'm recommending. All I'm saying is it can be a mistake to assume more on the core is going to get you better results; that's evident from my tests and kraxkill's.

And if you are not shunt modded, you're not getting 400 watts. I can't confirm this myself; another user measured it and saw not even 350 watts. If you have other tests, it would be beneficial to share them for knowledge purposes. 350 watts would be pretty sweet, but the only measurements were from a video another user made, and he settled at around 325 watts.


----------



## Nico67

Quote:


> Originally Posted by *KedarWolf*
> 
> To have the .bat file run at every boot. Note, you NEED the 30 second delay.
> 
> Right click the .bat file, choose Edit and open it in Notepad, change the number after -pl to your max power limit, you can look your max power limit up at:
> 
> https://www.techpowerup.com/vgabios/?architecture=&manufacturer=&model=GTX+1080+Ti&interface=&memType=&memSize=&since=
> 
> Open your Task Scheduler search for it in Run or Cortana, add the bat file to run at login, just add it as a Basic Task, then double click the entry in your task list and edit it like the screenshots. Have the bat file start 30 seconds after login.
> 
> You need to click on Properties in the right window to edit it like the pictures when you double-click it from your task list.
> 
> *Unzip this file for powerlimit.bat file.*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> powerlimit.zip 0k .zip file
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Or you can just edit and run the bat file itself but you'll have to do it on every login.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Seems manually setting the power limit even if it doesn't actually change it fixes that issue.


Tried this with the Palit bios and it didn't make a difference for me. I am wondering if the Palit max is 350, but this normalized business is trying to keep the average below 300?

GPU-Z (which seems to show the normalized reading) caps out, and AB shows about the same level at the same time, so it's like limiting kicks in and tries to keep the average at 100%, or 300w in this case, like the average reading in nvidia-smi.


----------



## nrpeyton

Just tried the PALIT BIOS myself, but due to losing 2000 fan RPM, I'm afraid any experiments and their results are going to have to wait until *after* I've picked and installed a full-cover waterblock on this card.

It idles at 65C if I leave the fan profile at default, lol. And maxed out it doesn't go higher than 2700 RPM. _No good on an FE!! Normal max is 4700 RPM!!_

Any recommendations? There are no reviews yet comparing temps for different 1080 Ti blocks.
I may just grab the EK block; the rest of my equipment is all EKWB, _(barring the chiller of course)._

On my Classy 1080 I never gave it a second thought; the massive PCB guaranteed great temps. But the baby-sized PCB of the reference design will impact cooling, so I want a block that scrapes out every last possible degree C.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Nico67*
> 
> Tried this with Palit bios and it didn't make a difference for me. I am wondering if Palit max is 350, but this normalized business is trying to keep the average below 300?
> 
> 
> 
> GPU-Z which seems to show normalized reading caps and AB shows about the same level at the same time, so its like limiting kicks in and is trying to keep the average at 100% or 300w in this case, like the average reading in smi.


No, Palit bios gives you 325-330 watts according to this user's testing. But a PSU reading would be better.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Luckbad*
> 
> Thanks whoever it was that suggested Reddit for selling. Not sure it's the best place to sell, though. People are uncool with $760 for a Zotac 1080 Ti Amp Extreme that clocks at 2037.5 stock and easily goes over 2100. They think it's overpriced.
> 
> I'll just keep answering Unanswered threads here until people start giving rep then will sell it here.


That was me. I gave you a pity rep lol.


----------



## CoreyL4

After a couple of days of testing, it seems my stable gaming overclock is 2037/5900. I'll take it!


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> No, Palit bios gives you 325-330 watts according to this user's testing. But a PSU reading would be better.


*Only* got 60 seconds in due to temps going out of control _(see my previous post)_

but here:

Click play at the same time guys, and you can watch them side-by-side.

*STOCK BIOS (300w limit)*

*PALIT BIOS (350w limit)*

1) Testing carried out with Furmark.

2) No other system settings changed. Just the GPU, bios.

3) I *did* run the power -pl command, using the batch file from the opening post, after flashing. Power was set to 348w; the command changed it to 350w. I didn't touch MSI Afterburner again before beginning the test.

I don't think the PALIT bios is reporting power correctly, as power seemed to max out at 330w in HWiNFO64 (it should be 350w). However, it's only about 20w off (but you still lose that 20w). Shame it's not off the other way around. You can see by comparing the two videos that the PALIT BIOS only averages about an extra 30w *(in other words, this is confirmed in the videos above).*

I also noticed GPU boost 3.0 doesn't seem as "aggressive" using PALIT bios. (But that might just be me) lol.

*Can't do any official testing with 'scores' until I get a water-block, as temps go out of control and cause temp throttling before the bench finishes due to reduced fan RPM on PALIT BIOS vs FE BIOS (you lose 2000 RPM).*


----------



## Benny89

Quote:


> Originally Posted by *viz0id*
> 
> Everyone have been talking so much about how demanding Witcher 3 was. So i played it for an hour and took this screenshot.
> 
> 
> 
> I don't think i'll be touching the card anymore. If it can run 2088 from the start and only downclock to 2075 I'm beyond happy. I got a Strix OC on stock air cooling.


Is this with the curve or just good old sliders?

And thanks for that. You convinced me to return my card. Yours can do 2075 in Witcher 3 at 1.032V.... mine can't even do 2038 at 1.063V in W3. I already ordered another STRIX, ready to pick up. Gonna return this one the day after tomorrow.

I have to say the one thing I hate about PC hardware is the silicon lottery, be it GPU or CPU.... everything should be the same. Well, that will be my 4th 1080 Ti


Hope the next one will finally be a decent overclocker.

EDIT: And BTW, an hour in W3 is not enough. I thought I had a stable 2050 in W3; it crashed after 2.5h of gaming. Play for like 5-6 hours, then you can say it's stable. Not trying to say it won't be, just giving you some advice. One hour in one game is not proof of stable clocks. You need a couple of hours at least to say that.


----------



## nrpeyton

Quote:


> Originally Posted by *Benny89*
> 
> This is curve or just good old sliders?
> 
> And thanks for that. You convinced me to return my card. Yours can do 2075 in Witcher 3 at 1.032V.... Mine can't do even 2038 at 1.063V in W3. Already ordered another STRIX, ready to pick up. Gonna return this one day after tomorrow.
> 
> I have to say that one thing I hate about PC hardware is sillicon lottery, be it GPU or CPU.... everything should be the same. Well, that will be my 4th 1080Ti
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hope next one will finally be decent overclocker.


How did you return 3 cards? What did you say?


----------



## Addsome

Quote:


> Originally Posted by *Benny89*
> 
> This is curve or just good old sliders?
> 
> And thanks for that. You convinced me to return my card. Yours can do 2075 in Witcher 3 at 1.032V.... Mine can't do even 2038 at 1.063V in W3. Already ordered another STRIX, ready to pick up. Gonna return this one day after tomorrow.
> 
> I have to say that one thing I hate about PC hardware is sillicon lottery, be it GPU or CPU.... everything should be the same. Well, that will be my 4th 1080Ti
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hope next one will finally be decent overclocker.
> 
> EDIT: and BTW hour in W3 is not enough. I thought I had stable 2050 in W3, crashed after 2,5h of gaming. Play for like 5-6 hours then I can say this is stable. Not trying to say it won't be, just giving you an advice. One hour in one game is not proof of stable clocks. You need couple of hours at least to say that.


You know there's barely any difference between 2038 and 2075, right? You're just wasting cards at this point.....


----------



## Benny89

Quote:


> Originally Posted by *nrpeyton*
> 
> How did you return 3 cards? What did you say?


Lol, I am in the EU. I can return anything I ordered online within 14 days without any reason. I just go to the store, say I'm returning it, and unless I physically destroyed the hardware they have to accept it and return the money within 14 days. There is no way for a store to avoid that. No restocking fees, nothing. This is the law here.

So I will keep returning until I have a decent overclocker STRIX. I am a sick person, but this hobby is sick by definition

Quote:


> Originally Posted by *Addsome*
> 
> You know theres barely any difference between 2038 and 2075 right? Your just wasting cards at this point.....


I have no excuses. This is an overclockers' forum. We are obsessed here with every MHz and every frame.

The card is not going to waste. Somebody will buy a fully working card from the store for a lower price. Good for him.


----------



## nrpeyton

Quote:


> Originally Posted by *Benny89*
> 
> Lol, I am in EU. I can return anything I ordered online withing 14 days without any reason. I just go to store, say I return it and unless I physically destroyed a hardware they have to accept it and return money within 14 days. There is no way to avoid that for a store. No restocking fees, nothing. This is the law here.
> 
> So I will return till I have decent overclocker STRIX. I am sick person, but this hobby is sick by definition


I'm in the U.K. lol (Europe).

But what's to stop a shop "catching on" and deciding to screw with you in revenge? They could say the card was returned damaged, etc. (even when it wasn't).

Maybe not right away, but after a few returns maybe it would eventually get flagged?

I say it's a slippery slope (a risky game you're playing) lol.

Just be careful 

That said, don't get me wrong.. if I got a *really* bad card, I can't say I wouldn't consider doing the same. 

Are you sure about the restocking fee? (I thought the law changed recently to allow shops to cover the costs of restocking.) I.e. the card would have to be sent back to the manufacturer, tested, cleaned, repackaged in brand-new packaging, etc. etc. before being sent on its way to a new customer.
Or some shops just sell it as "B grade" and take a 10% loss. (I know 'Scan Computers International Ltd' do this.)


----------



## Benny89

Quote:


> Originally Posted by *Addsome*
> 
> You know theres barely any difference between 2038 and 2075 right? Your just wasting cards at this point.....


Quote:


> Originally Posted by *nrpeyton*
> 
> I'm in the U.K lol. (Europe).
> 
> But whats to stop a shop "catching on" and deciding to screw with you in revenge? They could say the card was returned damaged etc?
> 
> I say it's a slippery slope lol.
> 
> Just be careful


They can't do anything. I have done it many times. The law is clear: any item ordered online can be returned (apart from some exceptions, like chemicals, drugs etc.) within 14 days of the purchase date without a reason. There are no exceptions apart from when I've used an item beyond its purpose, like if I was trying to fry eggs on the backplate.

I can just return it and that's all. It's pretty common here really. Nothing strange about that.

And they can't say the card is destroyed, since they would have to prove it. When you return an item, the store has a chance to note that something you returned is broken. If they don't, that means they accept the return.

What's more, if the hardware simply doesn't work, that doesn't matter. It would have to be "destroyed", as in showing visual evidence of being broken by you. If, let's say, the GPU just doesn't work, that doesn't matter.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *nrpeyton*
> 
> *Only* got 60 seconds in due to temps going out of control _(see my previous post)_
> 
> but here:
> 
> Click play at the same time guys, and you can watch them side-by-side.
> 
> *STOCK BIOS (300w limit)*
> 
> 
> 
> 
> 
> *PALIT BIOS (350w limit)*
> 
> 
> 
> 
> 
> Testing carried out with Furmark.
> 
> No other system settings changed. Just the GPU, bios.
> 
> I don't think the PALIT bios is reporting power correctly. As power seemed to max out at 330w in HWINFO64 (should be 350w) However it's only about 20w off. (but you still lose that 20w). Shame it's not off "the other way around". You can see by comparing the two videos; the PALIT BIOS does only average about an extra 30w *(in other words, this is confirmed in the videos above).*
> 
> I also noticed GPU boost 3.0 doesn't seem as "aggressive" using PALIT bios. (But that might just be me) lol.
> 
> *Can't do any official testing with 'scores' until I get a water-block, as temps go out of control and cause temp throttling before the bench finishes due to reduced fan RPM on PALIT BIOS vs FE BIOS (you lose 2000 RPM).*


Thank you for your time and research. +Rep.


----------



## nrpeyton

Quote:


> Originally Posted by *Benny89*
> 
> They can't do anything. I have done it many times. Law is clear: any item ordered online can be returned (apart from some expceptions, like chemicals, drugs etc.) withing 14 days from purchase date without a reason. There is no exceptions apart when I used an item beyond its purpouse, like I was trying to fry egs on backplate
> 
> 
> 
> 
> 
> 
> 
> .
> 
> I can just return it and thats all. It's pretty common here really. Nothing strange in that.


What are you aiming for then? 2100? 2150? It's subjective though.... some of it depends on what game you're running, how many watts you're drawing, even whether you're CPU bottlenecked... (that could help keep the GPU more stable)... What one person thinks is stable at 2100 might be completely unstable for someone else......

I could hit 2100 yesterday, but today I can't seem to hit it.

It's like the card was more stable brand new than it is today (like it's already degraded by 0.0001% and that's enough to throw it off a little). And I haven't done anything stupid with it.. voltage has never been above 1.093v (as nothing exists yet to allow more).


----------



## Benny89

Quote:


> Originally Posted by *nrpeyton*
> 
> What are you aiming for then? 2100?
> 
> I could hit 2100 yesterday, but today I can't seem to hit it.
> 
> It's like the card was more stable brand new than it is today. (like it's already degraded by 0.0001% and thats enough to throw it off a little). And I haven't done anything stupid with it.. Voltage has never been above 1.093v. (as currently nothing exists yet to allow this).


I will be content with 2050-63 in W3. If I can get that I will rest in peace till Volta comes.


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> Thank you for your time and research. +Rep.


No probs, hope that was helpful.


----------



## Nico67

Quote:


> Originally Posted by *SlimJ87D*
> 
> No, Palit bios gives you 325-330 watts according to this user's testing. But a PSU reading would be better.


At the end of the day it has no impact on power limiting, so it's going to downclock and lose performance. With 50w extra I would expect to see a significant change if I was barely limiting on the FE bios. I do think more power is getting through, as in high spikes, but it seems to me to downclock much the same as the FE, performance-wise. Meaning it seems like it's been softened up a bit, so I would expect it to clock even higher again.

Show me 50w worth of actual performance gain and I'd be more inclined to believe it.

How much do you gain with the 50w extra on a shunt?


----------



## nrpeyton

Quote:


> Originally Posted by *Nico67*
> 
> At the end of the day it has no impact on power limiting, so its going to downclock and lose performance. With 50w extra I would expect to see a significant change if I was barely limiting on the FE bios. I do think more power is getting thru as in like high spikes, but it seems to me to downclock much the same as FE within performance considerations. Meaning it seems like its been soften up a bit so I would expect it to clk even higher again.
> 
> Show me 50w worth of actual performance gain and I'd be more inclined to believe it.
> 
> How much do you gain with 50w extra on shunt?


A shunt mod gives you more than 50w extra, and allows you to stick to your stock BIOS, eliminating any compatibility problems that might cause the card to perform differently than it normally would (which could result in lower scores, etc.).

In my opinion, your own BIOS + a shunt mod is going to be the best way to go. However, after I get my waterblock my opinion on that could change; I just highly doubt it. (That doubt is based on testing others here have already done and their results.) But I believe what I read when people post evidence to back up their testing (which indeed they have)


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Nico67*
> 
> At the end of the day it has no impact on power limiting, so its going to downclock and lose performance. With 50w extra I would expect to see a significant change if I was barely limiting on the FE bios. I do think more power is getting thru as in like high spikes, but it seems to me to downclock much the same as FE within performance considerations. Meaning it seems like its been soften up a bit so I would expect it to clk even higher again.
> 
> Show me 50w worth of actual performance gain and I'd be more inclined to believe it.
> 
> How much do you gain with 50w extra on shunt?


It's more about holding clocks; not about gains, but stopping losses.

If you can do 2101 at 1.062v but only 2032 at 1.050v, then you'll lose your 1.062v OC in power-hungry moments in games like Witcher 3 or Mass Effect, or benchmarks like Superposition, Time Spy, Fire Strike, etc.
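One way to see why the lower-voltage point survives the power limiter: to first order, dynamic power scales with frequency times voltage squared. A quick Python sketch using the two operating points above (this is a simplification; real draw also depends on workload and leakage, so treat the number as illustrative):

```python
def relative_dynamic_power(freq_mhz: float, volts: float,
                           ref_freq_mhz: float, ref_volts: float) -> float:
    """Dynamic power relative to a reference operating point, using the
    first-order CMOS approximation P ~ f * V^2."""
    return (freq_mhz / ref_freq_mhz) * (volts / ref_volts) ** 2

# The two points from the post: 2101 MHz @ 1.062 V vs 2032 MHz @ 1.050 V.
ratio = relative_dynamic_power(2101, 1.062, 2032, 1.050)
print(f"2101 MHz @ 1.062 V draws roughly {(ratio - 1) * 100:.1f}% more power")
```

So in a power-hungry scene the limiter only needs to claw back a few percent, and it does that by knocking you off the 1.062v point; a lower locked voltage keeps the whole curve inside the cap.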


----------



## Slackaveli

Quote:


> Originally Posted by *feznz*
> 
> posting is caring
> 
> 
> 
> 
> 
> 
> 
> 
> 
> OK I didn't assume all the scores came from a 5960x because I investigate before shooting someone.
> 13th and below are Ryzen cpus or 5820 not 5960x please don't skim read and assume because you can't beat it then no body can without a 5960x
> No. 13 submission out of 56
> http://hwbot.org/submission/3525401_thesirensong_3dmark___time_spy_geforce_gtx_1080_ti_10426_marks
> 
> Have you heard of tunnel vision?


The point is still the same. My initial reaction was simply, "Because we all don't have 8-core CPUs". Don't bring a CPU benchmark leaderboard up to argue your point about a GPU. Very simple. Especially when YOU YOURSELF have an old 4-core. Why not bring up a 4-core leaderboard? You just tried to make a weak point with a weak pic, and got called out on it. Why? Because you have jumped on other posters in this thread and I didn't feel like letting you slide on it.
Quote:


> Originally Posted by *Luckbad*
> 
> Thanks whoever it was that suggested Reddit for selling. Not sure it's the best place to sell, though. People are uncool with $760 for a Zotac 1080 Ti Amp Extreme that clocks at 2037.5 stock and easily goes over 2100. They think it's overpriced.
> 
> I'll just keep answering Unanswered threads here until people start giving rep then will sell it here.


why not Ebay?


----------



## nrpeyton

Possible important info on Memory Overclocking.

I trust some of you have now been using OCCT to find your max stable memory O/C (which we found is great for finding a more precise, stable upper memory O/C).

But one more thing of importance to note:

I believe there 'could be' different memory timing straps programmed into the BIOS, which are automatically activated depending on your memory O/C. (A good comparison would be how DDR3/4 XMP profiles load optimised (looser) timings for the higher speeds you select in the motherboard BIOS.)

Anyway, these timings can affect your score when you're trying to find the best memory O/C.

During the 1080 days (at the owners' club over there) there was even a rumour circulating that odd overclocks scored higher than even overclock numbers, or that memory overclocks ending in 0 didn't score as well as those ending in 5 _(i.e. +495/+605 always scored higher than +500/+600)._

What is more likely to be the case is that pre-optimised 'memory timing straps' are selected by the BIOS depending on the O/C you enter.
For example: +250 lost FPS, but +240 and +255 to +260 = gain.

And remember, OCCT is still the best we have for testing memory O/C *"stability"*.


If anyone has anything to add, or any corrections (or different opinion), fire away  *Would be good to see some 'memory overclocking' spirited discussion.*
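To make the strap idea concrete, here's a toy sweep in Python showing why it's worth benching offsets on both sides of a suspected strap boundary instead of assuming a bigger offset always scores higher. The offsets and scores below are invented for illustration:

```python
# Hypothetical (offset MHz, benchmark score) pairs around a suspected
# timing-strap boundary at +250. Real numbers vary per card and BIOS.
sweep = [
    (230, 10450),
    (240, 10480),
    (250, 10390),   # strap change: looser timings eat the bandwidth gain
    (255, 10500),
    (260, 10510),
    (270, 10470),   # past the sweet spot: errors/retries cost performance
]

def best_offset(results):
    """Return the offset with the highest score, not the highest offset."""
    return max(results, key=lambda pair: pair[1])[0]

print(best_offset(sweep))  # the winner is not simply the largest offset
```

The point is just that score vs. offset can be non-monotonic, so sweep in small steps and keep the offset that scores best, then confirm it with OCCT for stability.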


----------



## Slackaveli

LOL! You are just doing a public service, letting some folks get a strix for the cost of an FE!

Oh, I thought you crashed because of VRAM, not clocks. I haven't been keeping up too well.
Quote:


> Originally Posted by *Benny89*
> 
> Lol, I am in EU. I can return anything I ordered online withing 14 days without any reason. I just go to store, say I return it and unless I physically destroyed a hardware they have to accept it and return money within 14 days. There is no way to avoid that for a store. No restocking fees, nothing. This is the law here.
> 
> So I will return till I have decent overclocker STRIX. I am sick person, but this hobby is sick by definition
> 
> 
> 
> 
> 
> 
> 
> 
> I have no excuses. This is overclockers forum. We are obsessed here about every mhz and every frame.
> 
> Card is not going to waste. Somebody will buy for lower price from store fully 100% working card. Good for him.


----------



## Luckbad

Quote:


> Originally Posted by *Slackaveli*
> 
> why not Ebay?


I went ahead and listed my FE card on eBay and have the Amp Extreme going up in an hour. Wanted to avoid the 10% fee but it's easier to sell there.

Hopefully by the next time I need to sell stuff (next Nvidia gen) I'll have the rep on Overclock to sell here.


----------



## Luckbad

Quote:


> Originally Posted by *nrpeyton*
> 
> Possible important info on Memory Overclocking.
> 
> I trust some of you have now been using OCCT to find max stable memory O/C, (which we found is great at finding a more precise, stable, upper memory O/C.
> 
> But one more thing to note of importance:
> 
> I believe there 'could be' different memory timings straps programmed into BIOS which are automatically activated depending on your memory O/C. (for instance a good comparison to this would be how DDR3/4 XMP profiles load optimised (looser) timings for higher speeds you select in motherboard BIOS).
> 
> Anyway these timings can affect your score when you're trying to find best memory O/C.
> 
> During the 1080 days (at the owners club over there) there was even a roomer circulating that odd overclocks scored higher than even overclock numbers or memory overclocks ending in 0's didn't score as well as those ending in 5. _(I.E. +495/+605 always scored higher than +500/+600).
> _
> What is more likely to be the case is that: pre-optimised 'memory timings straps' are selected by BIOS depending on the O/C you enter.
> For example; +250...lost FPS. But +240 and +255-+260 = gain.
> 
> And remember OCCT is still the best we have for testing memory O/C stability.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If anyone has anything to add, or any corrections (or different opinion), fire away  *Would be good to see some 'memory overclocking' spirited discussion.*


Interesting. I'll try to dig in further tonight. I know +407 gives better performance than +417 on my Zotac (6010 MHz at 407).

I don't get memory-related artifacts until something like +550 or 600.


----------



## Slackaveli

Quote:


> Originally Posted by *Luckbad*
> 
> I went ahead and listed my FE card on eBay and have the Amp Extreme going up in an hour. Wanted to avoid the 10% fee but it's easier to sell there.
> 
> Hopefully next time I need to sell stuff (next Nvidia gen) I have the rep on overclock to sell here .


Might as well. Because of limited availability, they have been going for pretty much MSRP, or more, on eBay.


----------



## nrpeyton

The 10% fee has put me off selling on eBay many times.

It's daylight robbery. For 98.5% of sales they probably have next to ZERO costs involved. The cost of hosting a page with maybe 50 views? At the rates they will be paying for bandwidth? lol.

That's £70 profit they make on the sale of your card.

They must make an absolute bloody fortune on that site. 5% would be a more reasonable offering, or at least on anything costing over a certain amount.

By the time you include the 10% eBay fee, PayPal fees, and postage (which is usually at least £27 (or 35 in U.S. dollars) if you want insurance, which is a MUST when selling cards), you're looking at nearly £100 in costs just to sell a card. Add the discount you need to knock off the price to get a fast sale (£50) and you're looking at a £150 loss every time you use eBay. Then there's also the constant worry that the buyer could screw you over: destroy the card, return it, and eBay sides with the buyer _(as always)._

eBay, for me, is an extremely risky venture for one-time sellers of expensive merchandise.


----------



## Nico67

Quote:


> Originally Posted by *nrpeyton*
> 
> But I believe what I read when people post evidence to back up their testing (which indeed they have)


They have shown something, but what I'm trying to illustrate is that it's not proof of 50w. Would you agree that the stock bios hits the power limit at 300w? Then testing up to that point indicates what you can do non-limited. Do the same with the Palit bios, so it's not downclocking or showing power limiting in GPU-Z, then see what HWiNFO says. Just because you can put the card into overload doesn't show what you've gained. I liken it to running benches on LN2 when it's not 24/7 viable.

I'm not trying to cause trouble, I just really want to see evidence of a higher power limit working


Quote:


> Originally Posted by *SlimJ87D*
> 
> It's more about hold clocks, not about gains but stopping losses.
> 
> If you can do 2101 at 1.062v but only 2032 at 1.050v, they you'll lose your 1.062v oc in power hungry moments in games like Witcher 3,mass effect or benchmarks like superposition, TimeSpy, firestrike, etc.


I agree, you should be able to hold higher clocks, and with 50w probably a bin or two if you have a good card. That's the evidence I want to see. It should translate to performance gains as well; or what you might think are higher clock gains due to power are actually due to softer timings.

Probably the best comparison would be running 300w (120%) and 350w (140%) on an Aorus card to see how much the expected gain would be for 50w. It needs a good card, and the result may be almost negligible on some cards. The best chance would be where 300w sits at, say, 2000 at 1.000v and you know you can do 2138 at 1.093v or so; then you should see some gains by unlimiting the power an extra 50w.


----------



## nrpeyton

Quote:


> Originally Posted by *Nico67*
> 
> They have shown something, but what I'm trying to illustrate is its not proof of 50w. Would you agree that the stock bios hits power limt at 300w. Then testing up to that indicates what you can do non limited. Do the same with the Palit bios, so its not downclocking or showing power limiting in GPU-Z then see what HWINFO says. Just because you can put the card into overload does show what you've gain. I liken it to running benches on LN2 when its not 24/7 viable.
> 
> I'm not trying to cause trouble, just really want to see evidence of a higher power limit working
> 
> 
> 
> 
> 
> 
> 
> 
> I agree, you should be able to hold higher clks, and with 50w probably a bin or two if you have a good card. that's the evidence I want to see. It should translate to performance gains as well, or what you might think are higher clk gains due to power are actually due to softer timings.


I posted proof that the PALIT bios does give an extra 30w a few pages back. The 20w loss (max 330w instead of 350w) is likely down to the calibration between the BIOS software and the PCB power sensor being "off" due to hardware differences. The extra 30w *is real* and *is proven* a few pages back (it's hard to miss, as the post has two videos in it).

_In other words, using the PALIT bios with an actual PALIT graphics card will give you the full 350w the BIOS allows. FE owners flashing this BIOS still get an extra 30w._

If anyone got different results than me, let me know. (I do plan to do more extensive testing after I get the card under water.)

Sorry you missed it, check it out


----------



## Nico67

Quote:


> Originally Posted by *nrpeyton*
> 
> I posted proof the PALIT bios does give an extra 30w a few pages back. The 20w loss (max 330w instead of 350w) is likely down to calibration between BIOS software and PCB power sensor being "off" due to hardware differences. The extra 30w *is real* and *is proven* a few pages back. (it's hard to miss as the post has two videos in it).
> 
> _In other words using the PALIT bios with an actual PALIT graphics card will give you the full 350w use the BIOS allows. FE owners flashing this BIOS still get an extra 30w_
> 
> If anyone got different results than me, let me know. (I do plan to do more extensive testing after I get the card under water).
> 
> Sorry you missed it, check it out


I saw it; what I didn't see is whether it was limiting. My results showed no gain in clocks or limiting, and less performance. Maybe it's something peculiar with my card, as none of the BIOSes have shown any real gains except maybe the Inno3D one. I still believe it's doing something, but it's not holding higher clocks, so I don't see the benefit.


----------



## nrpeyton

Quote:


> Originally Posted by *Nico67*
> 
> I saw it, what it didn't see is whether it was limiting. My results showed no gain in clks or limiting and less performance. Maybe its something peculiar with my card as none of the bios have shown any real gains except maybe the Inno3d one. I still believe its doing something, but its not holding higher clocks so I don't see the benefit.


One thing I want to try after i get my waterblock is this:


Making a note of all voltage/frequency plot points (on the curve) for stock FE BIOS.
Flashing PALIT BIOS.
Changing the curve on the Palit BIOS so it matches the FE BIOS precisely. Might take 30 mins (and could be a little painstaking entering it precisely, accurate to 1 MHz at each point), but it would allow a truer comparison of clock-holding with the extra 30 watts.

The trouble with simply leaving the stock Palit curve and trying to compare clocks is that the curve on the Palit BIOS is dramatically different. The Palit BIOS may be using higher voltages for the same boost speeds, which would negate any gains (or even result in lower scores).

Maybe something someone who already has a block on their card (who has the BIOS) and hasn't already done the shunt mod could do? (if they have the time/patience, of course)

Or; wait for me to grab my block and I'll do it next weekend.
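If anyone wants to script the bookkeeping for this, here's a rough sketch of the idea (the curve numbers below are made-up placeholders, not real data from either BIOS):

```python
# Hypothetical sketch of the curve-matching idea above: record the stock FE
# voltage/frequency points, then compute the per-point MHz offset you'd need
# to dial in after flashing the Palit BIOS so both curves match exactly.
# All curve values here are illustrative placeholders, NOT measured data.

fe_curve = {0.800: 1607, 0.900: 1809, 1.000: 1987, 1.050: 2050, 1.093: 2088}
palit_curve = {0.800: 1645, 0.900: 1835, 1.000: 2000, 1.050: 2076, 1.093: 2101}

def offsets_to_match(target, current):
    """MHz offset needed at each voltage point to make `current` match `target`."""
    return {v: target[v] - current[v] for v in sorted(target) if v in current}

for volt, off in offsets_to_match(fe_curve, palit_curve).items():
    print(f"{volt:.3f} V: offset {off:+d} MHz")
```

With negative offsets like these you'd be pulling the Palit curve down to the FE's frequencies, isolating the extra power headroom as the only variable.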

*Also;* anyone interested in *over-volting* your card should watch this video. In an interview, an Nvidia engineer discusses why Nvidia imposes restrictions on voltage. He even makes the bold claim that if you *always* use the 100% voltage offset setting in your O/C'ing app your card could last only 1 year, as opposed to 5 years by sticking to stock voltages (the 0% slider).

Of course, my opinion is that the "1 year" figure is completely subjective.

They also discuss other interesting developments such as "the best TI ever":

Example:

780 to 780TI = 16% faster
980 to 980TI = 25% faster
1080 to 1080TI = 35% faster

*Anyway here's the video in full: (a very interesting watch):*


----------



## Nico67

Quote:


> Originally Posted by *nrpeyton*
> 
> One thing I want to try after i get my waterblock is this:
> 
> 
> Making a note of all voltage/frequency plot points (on the curve) for stock FE BIOS.
> Flashing PALIT BIOS.
> Changing the curve on the palit BIOS so it matches the FE BIOS precisely. Might take 30 mins (and could be a little painstaking entering it in precisely, accurate to 1 MHZ at each point). But it would allow a truer comparison on clock-holding with the extra 30 watts).
> 
> The trouble with simply leaving the stock PALIT curve and trying to compare clocks is the curve on the palit bios is dramatically different. The Palit BIOS may be using higher voltages for the same boost speeds which would negate any gains. (or even result in lower scores).
> 
> Maybe something someone who already has a block on their card (who has the BIOS) and hasn't already done the shunt mod could do? (if they have the time/patience, of course)
> 
> Or; wait for me to grab my block and I'll do it next weekend.


My results were with very similar curves, both running 2101 @ 1.050v max down to a 2088 @ 1.000v downclock range, but I look forward to seeing what you get.


----------



## KedarWolf

Quote:


> Originally Posted by *Nico67*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nrpeyton*
> 
> One thing I want to try after i get my waterblock is this:
> 
> 
> Making a note of all voltage/frequency plot points (on the curve) for stock FE BIOS.
> Flashing PALIT BIOS.
> Changing the curve on the palit BIOS so it matches the FE BIOS precisely. Might take 30 mins (and could be a little painstaking entering it in precisely, accurate to 1 MHZ at each point). But it would allow a truer comparison on clock-holding with the extra 30 watts).
> 
> The trouble with simply leaving the stock PALIT curve and trying to compare clocks is the curve on the palit bios is dramatically different. The Palit BIOS may be using higher voltages for the same boost speeds which would negate any gains. (or even result in lower scores).
> 
> Maybe something someone who already has a block on their card (who has the BIOS) and hasn't already done the shunt mod could do? (if they have the time/patience, of course)
> 
> Or; wait for me to grab my block and I'll do it next weekend.
> 
> 
> 
> My results were with very similar curves, both running 2101 1.050 max down to 2088 1.000v downclock range, but I look forward to seeing what you get
Click to expand...

I found that with no shunt mod you could get slightly higher clocks at lower voltages with the Palit BIOS, but even then it never performed as well as the stock or Asus BIOS.

It'd be good for someone with a shunt mod to compare all three; on a clean Windows install I got over 100 more points with the Asus BIOS over stock in Superposition.


----------



## KraxKill

Quote:


> Originally Posted by *nrpeyton*
> 
> One thing I want to try after i get my waterblock is this:
> 
> 
> Making a note of all voltage/frequency plot points (on the curve) for stock FE BIOS.
> Flashing PALIT BIOS.
> Changing the curve on the palit BIOS so it matches the FE BIOS precisely. Might take 30 mins (and could be a little painstaking entering it in precisely, accurate to 1 MHZ at each point). But it would allow a truer comparison on clock-holding with the extra 30 watts).
> 
> The trouble with simply leaving the stock PALIT curve and trying to compare clocks is the curve on the palit bios is dramatically different. The Palit BIOS may be using higher voltages for the same boost speeds which would negate any gains. (or even result in lower scores).
> 
> Maybe something someone who already has a block on their card (who has the BIOS) and hasn't already done the shunt mod could do? (if they have the time/patience, of course)
> 
> Or; wait for me to grab my block and I'll do it next weekend.
> 
> *Also;* anyone interested in *over-volting* your card should watch this video. In an interview, an Nvidia engineer discusses why Nvidia imposes restrictions on voltage. He even makes the bold claim that if you *always* use the 100% voltage offset setting in your O/C'ing app your card could last only 1 year, as opposed to 5 years by sticking to stock voltages (the 0% slider).
> 
> Of course my opinion is that 1 year is completely subjective.
> 
> They also discuss other interesting developments such as "the best TI ever":
> 
> Example:
> 
> 780 to 780TI = 16% faster
> 980 to 980TI = 25% faster
> 1080 to 1080TI = 35% faster
> 
> *Anyway here's the video in full: (a very interesting watch):*


Sorry, I don't believe one bit of that video. How do you even quantify "one year"? One year of what? At what voltage? At what clocks? At what temps? Folding and crunching H.265 continuously? Furmark? Idle? Or what, exactly? If this guy can't contextualize "one year", it's just plain fearmongering and disclaimer lingo.

I call BS. What is so magical and fragile in the 1080 Ti that is different from the 1080 and the Titan XP, which have thus far been put through much more and made it through hard mods, overvolts, and BIOS hacks for more than a year now?

Also, the guy is the "director of technical marketing" at Nvidia, not an engineer. An engineer would never make such a claim without context.


----------



## nrpeyton

Quote:


> Originally Posted by *KraxKill*
> 
> Sorry, I don't believe one bit of that video. How do you even quantify "one year"? One year of what? At what voltage? At what clocks? At what temps? Folding and crunching H.265 continuously? Furmark? Idle? Or what, exactly? If this guy can't contextualize "one year", it's just plain fearmongering and disclaimer lingo.
> 
> I call BS. What is so magical and fragile in the 1080ti that is different from that of the 1080 and the Titan XP that have thus far been put through much more, made it through hard mods, overvolts and bios hacks for more than a year now?


I agree completely.

But I'd guess the "1 year" the Nvidia engineer quoted would mean running the GPU at 1.093v on the stock FE cooler with the stock fan profile, 60 minutes an hour, 24 hrs a day, 365 days a year.

Which, unless you have a 24/7 machine set up just for charity (Folding@home), is never going to happen.

The average user here, who probably does the odd bench and games a few hours a week, could still have a strong, healthy card after 7 years at 1.093v (in my opinion).

The one bit of what the engineer said that did at least make sense was about partners getting competitive on voltage (which isn't something Nvidia wanted to endorse): such a situation would make it harder for companies like EVGA to compete with bigger brands such as ASUS (who can probably afford more RMAs).


----------



## DerComissar

Quote:


> Originally Posted by *nrpeyton*
> 
> Just tried the PALIT BIOS myself, but due to losing 2000 FAN RPM; I'm afraid any experiments & their results are going to have to wait until *after* I've picked & installed a full-cover waterblock on this card.
> 
> It idles at 65C if I leave the fan profile to default, lol. And maxed out It doesn't go higher than 2700 RPM. _No good on a FE!! normal max is 4700RPM!!_
> 
> Any recommendations? There are no reviews yet comparing temps for different 1080 TI blocks.
> Meaning I may just grab the EK block; the rest of my equipment , is all EKWB, _(barring the Chiller of course)._
> 
> On my Classy 1080 I never gave it a 2nd thought, massive PCB guaranteed great temps. But the baby-sized PCB of the reference design will impact cooling; so I want a block that scrapes every last possible degree C.


Another block consideration for you may be the Aquacomputer Kryographics block, with their active cooled backplate.
This is the bit tech review of the TXP block that I ordered, which also fits the 1080Ti:
https://www.bit-tech.net/hardware/graphics/2016/09/23/titan-x-pascal-water-cooling-review/1

I ordered it directly from Aquacomputer in Germany, along with the active-cooled backplate:
https://shop.aquacomputer.de/product_info.php?products_id=3458&XTCsid=bpvgo27iuaini7b5km10vmqpi
https://shop.aquacomputer.de/product_info.php?products_id=3463&XTCsid=bpvgo27iuaini7b5km10vmqpi
They tend to be in short supply at times. Aquatuning and Performance PC also sell them, but are often out of stock.

I found a recent review, with the active cooled backplate, from VSG at Thermalbench:
http://thermalbench.com/2017/04/26/aqua-computer-kryographics-pascal-1080/

Note that this review is for the 1080 version, but should be very similar to the TitanXP/1080Ti block.

The gpu temps were slightly lower than the EK block, but of interest was the significant reduction in vrm temps, from using the active backplate.:


I've used several EK gpu blocks in the past. I had an Aquacomputer block on one of my 780Ti's, and was impressed with its thick copper construction. The backplate was also much thicker than EK's backplate, but I didn't have the optional active cooling attachment on that block.

The test results from the Thermalbench review look very good, as keeping the vrm temps low is very important as well.

I had purchased an EK TitanXP block (which fits the 1080Ti as well) before buying the AC block, but have decided to use the Aquacomputer Kryographics block instead.

Edit:
OCUK may have them as well, I see the backplate listed there, but not sure if they have the blocks yet.
https://www.overclockers.co.uk/water-cooling/water-blocks?ckSuppliers=44&sSort=5&n=384&ckTab=0


----------



## nrpeyton

Quote:


> Originally Posted by *DerComissar*
> 
> Another block consideration for you may be the Aquacomputer Kryographics block, with their active cooled backplate.
> This is the bit tech review of the TXP block that I ordered, which also fits the 1080Ti:
> https://www.bit-tech.net/hardware/graphics/2016/09/23/titan-x-pascal-water-cooling-review/1
> 
> I ordered it directly from Aquacomputer in Germany, along with the active-cooled backplate:
> https://shop.aquacomputer.de/product_info.php?products_id=3463&XTCsid=bpvgo27iuaini7b5km10vmqpi
> They tend to be in short supply at times. Aquatuning and Performance PC also sell them, but are often oos.
> 
> I found a recent review, with the active cooled backplate, from VSG at Thermalbench:
> http://thermalbench.com/2017/04/26/aqua-computer-kryographics-pascal-1080/
> 
> Note that this review is for the 1080 version, but should be very similar to the TitanXP/1080Ti block.
> 
> The gpu temps were slightly lower than the EK block, but of interest was the significant reduction in vrm temps, from using the active backplate.:
> 
> 
> I've used several EK gpu blocks in the past. I had an Aquacomputer block on one of my 780Ti's, and was impressed with its thick copper construction. The backplate was also much thicker than EK's backplate, but I didn't have the optional active cooling attachment on that block.
> 
> The test results from the Thermalbench review look very good, as keeping the vrm temps low is very important as well.
> 
> I had purchased an EK TitanXP block (which fits the 1080Ti as well) before buying the AC block, but have decided to use the Aquacomputer Kryographics block instead.


Interesting _(thank you)_, they look nearly 10c cooler (even compared to EK).

Any insights into how they manage it?


----------



## KraxKill

Quote:


> Originally Posted by *nrpeyton*
> 
> I agree completely.
> 
> But I'd guess the "1 year" the nvidia engineer quoted would mean running the GPU at 1.093v on stock FE cooler at stock fan profiles 60 minutes an hour, 24hrs a day, 365 days a year.
> 
> Which, unless you have a 24/7 machine set up just for charity (Folding@home), is never going to happen.
> 
> The average user here, who probably does the odd bench and games a few hours a week. Could still have a strong healthy card after 7 years using 1.093v. (in my opinion).
> 
> The one bit of what the engineer did say which did at least make sense, was about partners getting competitive on voltage (which isn't something they wanted to endorse).. such a situation would make it harder for companies like EVGA to compete with bigger brands such as ASUS (who can probably afford more RMA's).


I agree with you also, but the guy is not an engineer. He's the director of technical marketing. He is paid to say exactly what he said in exactly the way he said it.


----------



## Luckbad

Some preliminary runs with memory overclocks...

Actually, first, what speed should we be quoting? 6010 as reported by Afterburner? Double that to "DDR" and call it 12020? Divide it by 4 to get GPU-Z's number?

Whatever, I'll call it by Afterburner's number unless someone (@nrpeyton) decides we should refer to the doubled rate.
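For what it's worth, the three conventions are just fixed multiples of each other, so (assuming the x2 / ÷4 relationships described above are right) converting between them is trivial:

```python
# Convert between the three memory-speed conventions mentioned above:
# Afterburner's number, the doubled "DDR" rate, and GPU-Z's quarter rate.
# (Assumes the x2 and /4 relationships described in this post are correct.)

def ddr_rate(afterburner_mhz):
    return afterburner_mhz * 2

def gpuz_rate(afterburner_mhz):
    return afterburner_mhz / 4

print(ddr_rate(6010))   # 12020
print(gpuz_rate(6010))  # 1502.5
```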

There are definitely some memory clocks that are just objectively worse in SuperPosition than those higher or lower.

An offset of 407 yields 6010 MHz. An offset of 417 yields 6014 MHz.

+407 ALWAYS gives me a better score than +417 AND better than +427. For that reason, I thought 6010 was my ideal ram speed.

The plot thickens.

If I go to +650 or so, I'm guaranteed to start artifacting, so I knocked it down a fair bit to +595 and started to mess around from there.

+587 gives me a better score than +407 (~60 higher in SuperPosition). That's a memory clock of 6180.

+618 (mem clock 6220 MHz) is a stinker. Multiple runs of SuperPosition and I'm ~100 points LOWER than +407.

+608 (6210 MHz) in multiple runs always gives me better scores than +587.

This is pretty wacky stuff. Memory overclocking definitely deals in bins, just like the core, in some way. Offsetting it by a couple of MHz won't actually change the reported clock speed until it hits a threshold, but maybe it is changing timings or something.

Once I find my best memory overclock that never artifacts in SuperPosition, OCCT, or EVGA Furry E GPU memory burner, I'll start adding and subtracting single digits to see if I can get any real consistent data.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *nrpeyton*
> 
> I agree completely.
> 
> But I'd guess the "1 year" the nvidia engineer quoted would mean running the GPU at 1.093v on stock FE cooler at stock fan profiles 60 minutes an hour, 24hrs a day, 365 days a year.
> 
> Which, unless you have a 24/7 machine set up just for charity (Folding@home), is never going to happen.
> 
> The average user here, who probably does the odd bench and games a few hours a week. Could still have a strong healthy card after 7 years using 1.093v. (in my opinion).
> 
> The one bit of what the engineer did say which did at least make sense, was about partners getting competitive on voltage (which isn't something they wanted to endorse).. such a situation would make it harder for companies like EVGA to compete with bigger brands such as ASUS (who can probably afford more RMA's).


The thing that kills cards quicker than voltage is heat, and heat just so happens to be a by-product of voltage and current: power.

Nvidia already put too many safeguards into the card. Voltage is capped, power is limited, and temperature throttles the clocks.

Most of these safeguards are more about heat than current or voltage. The VRAM, VRM, capacitors, resistors, conductors, etc. are all specced way above what's needed. The VRMs can go up to 150C and draw 40 watts each on an FE.

This guy isn't completely wrong when he says voltage will decrease the life of a card, but he's not specific enough. Like you said: is it on air? Stock blower? Etc. He shares no data.

1.092v is such a tiny amount I can't imagine it harming anything.

From all our conversations, it just makes sense that voltage is capped so low due to power draw, not because of the life of the card.

The dang FE can barely run at 1.050v without power limiting in Witcher 3. What would be the point of allowing us to do 1.24v only to discover we'd be power limited down to 1.050v anyway? That's what Nvidia was thinking when they capped the voltage. *They were using math to determine what the card would be used for.*

I don't think they were concerned with the life of the card when capping them at 1.092v, as we used to have 1.187v in the past with the 600/700/900-series cards (correct me if I'm wrong).

But maybe they predicted an irresponsible turbo nerd breaking their card with a shunt mod drawing 500+ watts at 1.187v, lol.


----------



## DerComissar

Quote:


> Originally Posted by *nrpeyton*
> 
> 
> 
> Spoiler: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *DerComissar*
> 
> Another block consideration for you may be the Aquacomputer Kryographics block, with their active cooled backplate.
> This is the bit tech review of the TXP block that I ordered, which also fits the 1080Ti:
> https://www.bit-tech.net/hardware/graphics/2016/09/23/titan-x-pascal-water-cooling-review/1
> 
> I ordered it directly from Aquacomputer in Germany, along with the active-cooled backplate:
> https://shop.aquacomputer.de/product_info.php?products_id=3463&XTCsid=bpvgo27iuaini7b5km10vmqpi
> They tend to be in short supply at times. Aquatuning and Performance PC also sell them, but are often oos.
> 
> I found a recent review, with the active cooled backplate, from VSG at Thermalbench:
> http://thermalbench.com/2017/04/26/aqua-computer-kryographics-pascal-1080/
> 
> Note that this review is for the 1080 version, but should be very similar to the TitanXP/1080Ti block.
> 
> The gpu temps were slightly lower than the EK block, but of interest was the significant reduction in vrm temps, from using the active backplate.:
> 
> 
> I've used several EK gpu blocks in the past. I had an Aquacomputer block on one of my 780Ti's, and was impressed with its thick copper construction. The backplate was also much thicker than EK's backplate, but I didn't have the optional active cooling attachment on that block.
> 
> The test results from the Thermalbench review look very good, as keeping the vrm temps low is very important as well.
> 
> I had purchased an EK TitanXP block (which fits the 1080Ti as well) before buying the AC block, but have decided to use the Aquacomputer Kryographics block instead.
> 
> 
> 
> 
> 
> 
> 
> Interesting _(thank you)_, they look nearly 10c cooler (even compared to EK).
> 
> Any insights into how they manage it?
Click to expand...

Imo, it's because the backplate is actually actively cooled, via a heatpipe in contact with the plate, which is then plumbed, so to speak, into the coolant flowing through the loop. The backplate itself is in firm contact with the hot spots on the back of the PCB through thermal pads. This seems to provide far better results than a simple backplate with thermal pads; such backplates are often more decorative than anything.


----------



## pez

Quote:


> Originally Posted by *AMDATI*
> 
> I think the only way you should be getting issues with Gsync dropping frames, is if you're going in and out of the refresh rate of your monitor. Otherwise, Gsync is functionally no different to the videocard than having Vsync off and no Gsync, it's done entirely on the monitor.....enabling Gsync on the control panel only tells your Gsync module to enable over the displayport connection. So from the processing side of things, nothing should be affected. What Gsync does is all done on the Gsync module....it buffers frames then makes sure the entire frame is discharged at once, rather than allowing half of the next frame to be drawn.
> 
> A lot of people don't realize this, but Gsync modules actually contains nearly 1GB of triple channel DDR3, likely more in monitors bigger than 4k/60hz. This is effectively, their frame buffer, which is large enough to store multiple raw frames.
> 
> None of this should affect the performance of the GPU. Gsync shouldn't be slowing things down or throwing frames away. Now maybe, the program itself is misreading things because of some bug.


Quote:


> Originally Posted by *Slackaveli*
> 
> drivers start to crash but recover. this regedit helps extend the time the driver has to recover.
> i tried to tell ya yesterday ;p
> yah. this. i always have g-sync indicator on. i hate not knowing if it's working for sure or not.
> you were actually pretty much right, and yes it does lower scores, and yes it's b/c partial frames are not being counted by the application. You were basically correct, and your conclusion WAS correct.
> 
> 
> 
> 
> 
> 
> 
> 
> 100% of the time your score will be lower with g-sync on. In 100% of my tests this is the case; it's forum common knowledge, and there's even a warning when you do a run that your score is artificially lower b/c g-sync was on. It IS indeed b/c the application doesn't count the frames that don't fall within the standard frametime; partial frames are not counted by the application (even though they are displayed on the g-sync monitor). Or something like that. But the conclusion of lesser scores... that's a real thing.


Yeah, my technical definition was still off by a bit. I researched it a bit more and couldn't find anything that really showed that it was discarding 'partial' frames. But yes, what you said is what I was going for... just misconstrued.


----------



## Luckbad

Okay, I believe at this point I've found my best memory overclock.

6210 MHz yields the highest scores in SuperPosition. 6220 does not artifact, but it yields much lower scores. 6237 will artifact or freeze.

*Test Environment
*Note that I'm not running in a 100% perfect benchmark environment for maximum scores. I was only interested in testing the memory on my GPU.

My CPU isn't overclocked to its limit. It's just a 6700k running at lowish voltage at 4.4 GHz (my daily speed, not a real benching speed).
My RAM is not overclocked beyond its XMP right now. 16GB of G.Skill TridentZ DDR4 3600 at 15-15-15-35
My GPU core is not absolutely maxed out. It's running at +92, which is 2075.5 MHz.
I'm allowing for some overvoltage, but it's limited to 1.075 and is not currently locked via a curve.
That is a long preamble for nothing earth-shattering at all. Basically, I mentioned in my previous post that as I clock the GPU memory up or down, I see pretty wild fluctuations in my SuperPosition scores. A +407 offset is definitively better than +417 every single time, well beyond the margin of error. I stepped up until I started seeing artifacts and/or freezing and found a comfortable memory overclock where that would never happen.

Then I started changing numbers and running SuperPosition (4K Optimized). I kept the fans on my Zotac 1080 Ti Amp Extreme at 100% and let the GPU cool down to 35 °C between runs (not its actual idle temp, but it was getting to ~55-58 °C during each run and didn't take long to get to 35).

Offset | Clock | Score
+597 | 6210 | 10270
+615 | 6210 | 10281
+628 | 6220 | 10123
+610 | 6210 | 10279
+605 | 6210 | 10282
+616 | 6220 | 10126
+615 | 6210 | 10285
+599 | 6210 | 10280
+596 | 6210 | 10281
+618 | 6220 | 10123
+596 | 6210 | 10276
+620 | 6220 | 10126

Long story short, everything clocked at 6210 MHz gives scores within the margin of error. There are thresholds at which the memory will gain frequency, and it seems like it doesn't really matter what the offset is within a frequency band, just what the actual clock is.
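A quick way to see that the 6210-vs-6220 gap dwarfs run-to-run noise is to group the runs by reported clock; here's a small sketch using the numbers from the table above:

```python
# Group the runs from the table above by reported memory clock and compare
# the per-clock average score with the run-to-run spread within each clock.
from statistics import mean

runs = [  # (offset, reported clock MHz, SuperPosition score)
    (597, 6210, 10270), (615, 6210, 10281), (628, 6220, 10123),
    (610, 6210, 10279), (605, 6210, 10282), (616, 6220, 10126),
    (615, 6210, 10285), (599, 6210, 10280), (596, 6210, 10281),
    (618, 6220, 10123), (596, 6210, 10276), (620, 6220, 10126),
]

by_clock = {}
for _, clk, score in runs:
    by_clock.setdefault(clk, []).append(score)

for clk, scores in sorted(by_clock.items()):
    spread = max(scores) - min(scores)
    print(f"{clk} MHz: mean {mean(scores):.1f}, spread {spread}")
```

The within-clock spread is around 15 points at 6210 and 3 at 6220, while the gap between the two clock bins is roughly 155 points, so the drop at 6220 is clearly real rather than noise.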


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> I agree completely.
> 
> But I'd guess the "1 year" the nvidia engineer quoted would mean running the GPU at 1.093v on stock FE cooler at stock fan profiles 60 minutes an hour, 24hrs a day, 365 days a year.
> 
> Which unless you have a 24/7 machine just set up for charity ([email protected]) is never going to happen.
> 
> The average user here, who probably does the odd bench and games a few hours a week. Could still have a strong healthy card after 7 years using 1.093v. (in my opinion).
> 
> The one bit of what the engineer did say which did at least make sense, was about partners getting competitive on voltage (which isn't something they wanted to endorse).. such a situation would make it harder for companies like EVGA to compete with bigger brands such as ASUS (who can probably afford more RMA's).


I can tell you that back in the day of the GTX 580s, I determined that 1000 MHz was my max stable overclock with software-based voltage. I had had these cards for about a year, and probably used them for 8 hours a week gaming and 12 hours a week idling on the web; the rest of the time the PC was off. (the days before boosting clocks)
Hearing lots about failing cards etc., I really had to see for myself how long these cards would last folding 24/7, while keeping them under 60°C at the max stable OC of 1000 MHz, which never crashed during folding.

Answer;


Spoiler: Warning: Spoiler!



the first card lasted 3 weeks, the second card about another 2 weeks of normal use


----------



## feznz

Quote:


> Originally Posted by *Slackaveli*
> 
> point is still the same. my initial reaction was simply, "Because we all don't have 8 core cpus". *Don't bring a cpu benchmark leaderboard up to argue your point about gpu.* very simple. Especially when YOU YOURSELF have an old 4 core. Why not bring a 4 core leaderboard? You just tried to make a weak point with a weak pic, and got called out on it. Why? Because, you have jumped on other posters in this thread and I didn't feel like letting you slide on it.


You win, I can't argue with a guy of your calibre.
I learnt something today: TimeSpy is solely a CPU-based benchmark. I had always thought it was a combined system benchmark with 3 separate GPU and CPU tests plus a combined test.



Spoiler: Warning: Spoiler!



When I compare, I only look in the red box. It seems I might need to buy another $3000 CPU/mobo/memory combo to gain 5% FPS, or I could spend a little less and just go SLI with another 1080 Ti. What do you think?



Spoiler: Warning: Spoiler!




feel free to get your buddies to PM me to correct me no need to fill this thread with...........


----------



## Nico67

Quote:


> Originally Posted by *Luckbad*
> 
> Okay, I believe at this point I've found my best memory overclock.
> 
> 6210 MHz yields the highest scores in SuperPosition. 6220 does not artifact, but it yields much lower scores. 6237 will artifact or freeze.
> 
> *Test Environment
> *Note that I'm not running in a 100% perfect benchmark environment for maximum scores. I was only interested in testing the memory on my GPU.
> 
> My CPU isn't overclocked to its limit. It's just a 6700k running at lowish voltage at 4.4 GHz (my daily speed, not a real benching speed).
> My RAM is not overclocked beyond its XMP right now. 16GB of G.Skill TridentZ DDR4 3600 at 15-15-15-35
> My GPU core is not absolutely maxed out. It's running at +92, which is 2075.5 MHz.
> I'm allowing for some overvoltage, but it's limited to 1.075 and is not currently Locked via a curve
> That is a long preamble for nothing earth shattering at all. Basically, I mentioned in my previous post that as I clock the GPU memory clock up or down, I see pretty wild fluctuations in my SuperPosition scores. +407 offset is definitively better than +417 every single time, well beyond the margin of error. I skipped up until I started seeing artifacts and/or freezing and found a comfortable memory overclock where that would never happen.
> 
> Then I started changing numbers and running SuperPosition (4K Optimized). I kept the fans on my Zotac 1080 Ti Amp Extreme at 100% and let the GPU cool down to 35 °C between runs (not its actual idle temp, but it was getting to ~55-58 °C during each run and didn't take long to get to 35).
> 
> Offset | Clock | Score
> +597 | 6210 | 10270
> +615 | 6210 | 10281
> +628 | 6220 | 10123
> +610 | 6210 | 10279
> +605 | 6210 | 10282
> +616 | 6220 | 10126
> +615 | 6210 | 10285
> +599 | 6210 | 10280
> +596 | 6210 | 10281
> +618 | 6220 | 10123
> +596 | 6210 | 10276
> +620 | 6220 | 10126
> 
> Long story short, everything clocked at 6210 MHz gives scores within the margin of error. There are thresholds at which the memory will gain frequency, and it seems like it doesn't really matter what the offset is within a frequency band, just what the actual clock is.


Yeah, looks like it's as @nrpeyton alluded to: there are frequency points where it changes multipliers and loosens timings. Definitely around +616 it drops off in your case. It'd be interesting to test with OCCT, but I also find different BIOSes slant that too; FE was good for me at 6003, Inno3D, SC2 and Palit seemed to like 6055, and Asus 6156.


----------



## Gandyman

Thanks to all who answered my original post about my Strix 1080 Ti's.

Firstly, after talking to someone who was not, I presume, the work experience kid, the e-tailer has agreed to a speedy return.

Secondly, I have recently made another observation, and have spent painful hours trying to replicate and fix issues. Windows is giving me BSODs with faults in the nonpaged area and other memory-controller issues. I'll wait for the replacement card to arrive and see if the issues are resolved. Other forums have been steering me towards a faulty memory controller on my CPU, so to eliminate that as an issue I ordered a 6900K... So much money being dumped for seemingly no reason; I just want to go back to my 1080 Hybrid where everything worked ok lol. Very frustrating, especially when you feel all alone with no one else around my town that I know of with X99 stuff to help me test =p Thanks again to all who replied, you guys make me feel not so alone =D


----------



## viz0id

Quote:


> Originally Posted by *Benny89*
> 
> This is curve or just good old sliders?
> 
> And thanks for that. You convinced me to return my card. Yours can do 2075 in Witcher 3 at 1.032V.... Mine can't do even 2038 at 1.063V in W3. Already ordered another STRIX, ready to pick up. Gonna return this one day after tomorrow.
> 
> I have to say that one thing I hate about PC hardware is the silicon lottery, be it GPU or CPU.... everything should be the same. Well, that will be my 4th 1080Ti
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hope next one will finally be decent overclocker.
> 
> EDIT: and BTW an hour in W3 is not enough. I thought I had a stable 2050 in W3; it crashed after 2.5h of gaming. Play for like 5-6 hours, then I can say it's stable. Not trying to say it won't be, just giving you advice. One hour in one game is not proof of stable clocks. You need a couple of hours at least to say that.


Alright, I will put it on now, let it run for a couple of hours, and see.









EDIT: Oh and it is curve. No locking.


----------



## Atotarho

Hello, I accidentally touched the components on the back of the board and 3 of them came off. The card keeps working fine, but tell me, can I leave it like this?
They are highlighted with red circles in the photo.

IMG_20170430_211621933.jpg 3753k .jpg file


----------



## Coopiklaani

Just did a semi-thorough test on the memory clock

With the core stable at 2062.5 throughout the Superposition benchmark, I got the following results:

Vram | SP score
+925 | 10311
+900 | 10303
+875 | 10289
+850 | 10254
+750 | 10221
+500 | 10071

It starts to artifact at +1000, and fails a 20-loop stress test at +950.

I'm now running at +900 on the memory. It seems to be the sweet spot for my card.


----------



## TheBoom

Quote:


> Originally Posted by *SlimJ87D*
> 
> If you're not shunt modded, then you should do some thorough test and comparisons, that's all I'm recommending. All I'm saying is it can be a mistake assuming more on the core is going to get you better results, that's evident with my test and kraxkills.
> 
> And if you are not shunt modded, you're not getting 400 watts. I can't confirm this, another user did but not even 350 watts according to his measurements. If you have other test, it would be beneficial to share them for knowledge purposes. 350 watts would be pretty sweet, but the only measurements was from a video another user made and he settled at around 325 watts.


Well, that's the thing: I matched the curve to give me what the manual OC was giving me, so more or less the same core, not higher.

Yeah, I was using HWMonitor for the raw power numbers, so I'm not sure how accurate that is. But I did observe the normalized TDP hitting a max of 124%, which does make sense, since 320w + 24% is roughly 400w.

Second observation: total system power draw was ~600w on my AX860i's graphing. This is in 8K Supo, which isn't that CPU intensive, so unless my CPU was drawing more than 200w at 1.245v / 4.5GHz in Supo, there can't be anything else that adds up to 600w..

Of course it is not holding at that max; it hit it briefly before it power limited itself and possibly throttled down.
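For what it's worth, the arithmetic in the post checks out; a quick sketch using the numbers stated above:

```python
# Sanity-check: a 320 W power target reading 124% normalized TDP
# implies roughly 400 W of actual board power.
base_tdp_w = 320        # Zotac Amp Extreme stock power target (per the post)
normalized_tdp = 1.24   # 124% max reading from monitoring software

actual_w = base_tdp_w * normalized_tdp
print(round(actual_w))  # 397, i.e. "roughly 400 W"
```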


----------



## TheBoom

Quote:


> Originally Posted by *nrpeyton*
> 
> Possible important info on Memory Overclocking.
> 
> I trust some of you have now been using OCCT to find max stable memory O/C, (which we found is great at finding a more precise, stable, upper memory O/C).
> 
> But one more thing to note of importance:
> 
> I believe there 'could be' different memory timings straps programmed into BIOS which are automatically activated depending on your memory O/C. (for instance a good comparison to this would be how DDR3/4 XMP profiles load optimised (looser) timings for higher speeds you select in motherboard BIOS).
> 
> Anyway these timings can affect your score when you're trying to find best memory O/C.
> 
> During the 1080 days (at the owners club over there) there was even a rumor circulating that odd overclock numbers scored higher than even ones, or that memory overclocks ending in 0 didn't score as well as those ending in 5 _(i.e. +495/+605 always scored higher than +500/+600)._
> 
> What is more likely to be the case is that: pre-optimised 'memory timings straps' are selected by BIOS depending on the O/C you enter.
> For example; +250...lost FPS. But +240 and +255-+260 = gain.
> 
> And remember OCCT is still the best we have for testing memory O/C *"stability"*.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If anyone has anything to add, or any corrections (or different opinion), fire away  *Would be good to see some 'memory overclocking' spirited discussion.*


That explains why my testing showed anything but linear results. I settled at +350, but if, according to this, even a +5 or +10 change makes a difference, I might have to do more extensive testing.


----------



## mshagg

Quote:


> Originally Posted by *Atotarho*
> 
> Hello, I accidentally touched the components on the back of the board and 3 of them came off. The card keeps working fine, but tell me, can I leave it like this?
> They are highlighted with red circles in the photo.
> 
> IMG_20170430_211621933.jpg 3753k .jpg file


Two of them appear to be on the backs of memory modules. I'm not entirely sure, but I think many of those capacitors exist to improve the quality of electrical signals on the board.

I'd be looking to do some stress testing of the VRAM. However, in my experience it has always been difficult to find a tool that reliably stresses the huge amounts of VRAM we have now.

EDIT: Try this. It only allocates 3.5GB per instance, but I opened three of them, ran them all, and GPU memory usage was reported as 11195 MB in Afterburner.

http://www.programming4beginners.com/gpumemtest
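A quick back-of-the-envelope on why it takes three instances (just arithmetic on the numbers in this post):

```python
# Three 3.5 GB instances cover most, but not all, of an 11 GB card.
# The 11195 MB Afterburner reading is higher because it also counts
# allocations from the desktop and other processes.
instances = 3
per_instance_gb = 3.5
card_gb = 11

tested_gb = instances * per_instance_gb
print(tested_gb)            # 10.5
print(card_gb - tested_gb)  # 0.5 GB left untested
```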


----------



## Atotarho

Quote:


> Originally Posted by *mshagg*
> 
> Two of them appear to be on the backs of memory modules. I'm not entirely sure, but I think many of those capacitors exist to improve the quality of electrical signals on the board.
> 
> I'd be looking to do some stress testing of the VRAM. However, in my experience it has always been difficult to find a tool that reliably stresses the huge amounts of VRAM we have now.
> 
> EDIT: Try this. It only allocates 3.5GB per instance, but I opened three of them, ran them all, and GPU memory usage was reported as 11195 MB in Afterburner.
> 
> http://www.programming4beginners.com/gpumemtest


Thanks, I started 3 tests at once, all OK. But the memory ran at 10,400 MHz; is that how it should be? I have the iChill 4 BIOS, and my memory usually runs at 11,400 MHz. So these 2 capacitors don't need to be soldered back on? And what role does the transistor perform?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *TheBoom*
> 
> Well, that's the thing: I matched the curve to give me what the manual OC was giving me, so more or less the same core, not higher.
> 
> Yeah, I was using HWMonitor for the raw power numbers, so I'm not sure how accurate that is. But I did observe the normalized TDP hitting a max of 124%, which does make sense, since 320w + 24% is roughly 400w.
> 
> Second observation: total system power draw was ~600w on my AX860i's graphing. This is in 8K Supo, which isn't that CPU intensive, so unless my CPU was drawing more than 200w at 1.245v / 4.5GHz in Supo, there can't be anything else that adds up to 600w..
> 
> Of course it is not holding at that max; it hit it briefly before it power limited itself and possibly throttled down.


That math seems a little off to me. Are you using the stock BIOS?

It would be 124% * 250 watts, and the stock BIOS will briefly hit above 120% for anyone, but that doesn't mean it won't back down below 120% literally a second later.

Measuring power draw via software isn't really reliable anyway, especially since you are manipulating it. It's best to verify software and hardware together, using a power meter. Look a page or two back: nrpeyton measured it at the wall, and the Palit BIOS only gives you 30 watts.

I'm quite certain that with only 330 watts you won't hold 1.092v. I'm just letting you know you might want to consider shunt modding your card to unlock its true potential, or settle for what you can get around 1.032-1.050v, which is fine as well.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Atotarho*
> 
> Thanks, I started 3 tests at once, all OK. But the memory ran at 10,400 MHz; is that how it should be? I have the iChill 4 BIOS, and my memory usually runs at 11,400 MHz. So these 2 capacitors don't need to be soldered back on? And what role does the transistor perform?


How did you knock them off btw?


----------



## Atotarho

Quote:


> Originally Posted by *SlimJ87D*
> 
> How did you knock them off btw?


I had an Arctic Cooling Accelero Hybrid III-140. The radiator on the back of the board was stuck fast; when I tried to pull it off, it jumped away and knocked off the parts.


----------



## Alwrath

Quote:


> Originally Posted by *feznz*
> 
> I can tell you that back in the day of the GTX 580s, I determined 1000MHz was my max stable overclock with software-based voltage. I had had these cards for about a year, used for maybe 8 hours a week gaming and 12 hours a week idling on the web; the rest of the time the PC was off. (The days before boosting clocks.)
> Hearing lots about failing cards etc., I really had to see for myself how long these cards would last folding 24/7, while kept under 60°C at the max stable OC of 1000MHz, which never crashed during folding.
> 
> Answer;
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> the first card 3 weeks, the second card about another 2 weeks of normal use


My question to you: what was your voltage at? Was it maxed? What were the slider setting and the volts, please?


----------



## TheBoom

Quote:


> Originally Posted by *SlimJ87D*
> 
> That math seems a little off to me. Are you using the stock bios?
> 
> It would be 124% * 250 watts, and the stock BIOS will briefly hit above 120% for anyone, but that doesn't mean it won't back down below 120% literally a second later.
> 
> Measuring power draw via software isn't really reliable anyway, especially since you are manipulating it. It's best to verify software and hardware together, using a power meter. Look a page or two back: nrpeyton measured it at the wall, and the Palit BIOS only gives you 30 watts.
> 
> I'm quite certain that with only 330 watts you won't hold 1.092v, I'm just letting you know you might want to consider shunt modding your card to unlock its true potential or settle for what you can get around 1.032-1.050v which is fine as well.


My bad, I should probably have mentioned this earlier, but my card is a Zotac Amp Extreme, not an FE. I'm using the stock Amp Extreme BIOS.

So it should be 320 * 124%.

I think the curve probably works better for those on FE or shunt-modded cards.


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> The thing that kills cards quicker worse than voltage is heat, heat just so happens to be a by product of voltage and current, power.
> 
> Nvidia already put too much safeguards into the card. Voltage is capped, power is limited, temperatures throttle.
> 
> Most of these safeguards are more due to heat than they are current and voltage. The specs for the vram, vrm, capacitors, resistors, conductors, etc are all way over spec. The vrm can go up to 150C and draw 40 watts each on an FE.
> 
> This guy isn't completely wrong when he says voltage will decrease the life of a card, but he's not specific enough. Like you said, is it on air? Stock blower? Etc. He shares no data.
> 
> 1.092v is such a tiny amount I can't imagine it harming anything.
> 
> From all our conversations, it just makes sense that voltage is capped so low due to power draw, not because of the life of the card.
> 
> The dang FE can barely run at 1.050v without power limiting on Witcher 3. What would be the point of allowing us to do 1.24v only to discover we'd be power limited and binned to 1.050v anyways? That's what nvidia was thinking when they capped the voltage. *They were using math. To determine what the card will be used for.*
> 
> I don't think they were concerned with the life of the card and capping them at 1.092v as we used to have 1.187v in the past with the 600,700,900 cards correct me if I'm wrong.
> 
> But maybe they predicted an irresponsible turbo nerd breaking their card with a shunt mod drawing 500+ watts at 1.187v, lol.


Lol, true.

Think you hit the nail on the head with power, though.

*The number 250..............:* _(the stock TDP of your 1080 TI)_

    

Sure, the number *250* is famous for many things, lol. But it's also a "nice _even_ number".

And it's also Nvidia's TDP figure for our 1080 Ti.... but how can something as volatile as a GPU be exactly *250* W TDP? lol... of course it's not.
It's a marketing ploy. No one is going to release a card with a TDP of 279, or 296, or 178. Because it's bad marketing.

Also, consider CPUs as a comparison. The most power-hungry CPU (that I recall) ever released for the home market was AMD's FX-9590 at *220w*. Despite socket compatibility, you couldn't even run that CPU on a mid-range mobo because they thought you'd blow the VRM up (in fact, at its release it was OEM-only -- _they wanted it restricted to system builders_).

With all the focus on the environment & energy etc., plus "performance per watt"... how could Nvidia possibly release a card with a TDP greater than *250w*? Of course they couldn't, because they'd have been slammed for it by the press! Big time!

So instead they artificially limit the performance of our cards for this marketing stunt. For shareholders and for their profit. (But if that's the reason, can you blame them?)

Also, Nvidia doesn't strike me as the kind of company that likes anyone doing anything different from their ideology (and rules). You just have to watch their chief exec, Jensen Huang, talking at CES 2017 and you get the picture lol.

I never blamed them at first, until I watched Jensen Huang talk about self-driving cars at CES.
What? Self-driving cars?
Yeah!
lol
That's right. That's what all your hard-earned cash is going on.
Millions of dollars poured into research into things like self-driving cars, amongst other big grand claims made at CES; hardly any of it was actually about GPU development!

...now I start to wonder... lol

Anyway; these cards are *crying out* for a shunt mod.
But to be honest that's not so bad:
It feels good doing something "forbidden".


----------



## RMBR

Many tests have shown that 1080 Ti SLI doesn't scale as well as 1080 SLI: the 1080 loses only up to 10% in SLI, while the 1080 Ti can lose up to 30%.
Did anyone else notice that?


----------



## nrpeyton

Quote:


> Originally Posted by *Luckbad*
> 
> Okay, I believe at this point I've found my best memory overclock.
> 
> 6210 MHz yields the highest scores in SuperPosition. 6220 does not artifact, but it yields much lower scores. 6237 will artifact or freeze.
> 
> *Test Environment
> *Note that I'm not running in a 100% perfect benchmark environment for maximum scores. I was only interested in testing the memory on my GPU.
> 
> My CPU isn't overclocked to its limit. It's just a 6700k running at lowish voltage at 4.4 GHz (my daily speed, not a real benching speed).
> My RAM is not overclocked beyond its XMP right now. 16GB of G.Skill TridentZ DDR4 3600 at 15-15-15-35
> My GPU core is not absolutely maxed out. It's running at +92, which is 2075.5 MHz.
> I'm allowing for some overvoltage, but it's limited to 1.075 and is not currently Locked via a curve
> That is a long preamble for nothing earth shattering at all. Basically, I mentioned in my previous post that as I clock the GPU memory clock up or down, I see pretty wild fluctuations in my SuperPosition scores. +407 offset is definitively better than +417 every single time, well beyond the margin of error. I skipped up until I started seeing artifacts and/or freezing and found a comfortable memory overclock where that would never happen.
> 
> Then I started changing numbers and running SuperPosition (4K Optimized). I kept the fans on my Zotac 1080 Ti Amp Extreme at 100% and let the GPU cool down to 35 °C between runs (not its actual idle temp, but it was getting to ~55-58 °C during each run and didn't take long to get to 35).
> 
> Offset | Clock | Score
> +597 | 6210 | 10270
> +615 | 6210 | 10281
> +628 | 6220 | 10123
> +610 | 6210 | 10279
> +605 | 6210 | 10282
> +616 | 6220 | 10126
> +615 | 6210 | 10285
> +599 | 6210 | 10280
> +596 | 6210 | 10281
> +618 | 6220 | 10123
> +596 | 6210 | 10276
> +620 | 6220 | 10126
> 
> Long story short, everything clocked at 6210 MHz gives scores within the margin of error. There are thresholds at which the memory will gain frequency, and it seems like it doesn't really matter what the offset within that frequency band is, just what the actual clock is.


Looks good.









Did you also do a quick check with OCCT?

Memory technology hasn't changed much since the 1080 (non-Ti); we're just using higher-binned/rated chips. Therefore I don't think posting this next graph would be too off-topic for this thread (given, as I said, the memory technology is so similar and the architecture is the same).

But anyway here's what happened to me with my 1080 Classy:

*(You may have to right click & select 'open new tab' and view in FULL SIZE).*

And just a little info on different GDDR5X memory specs (from micron datasheet).


Anyway, you'll notice from the graph that at +925 I was actually scoring higher than at +775.
But what I also had to take into account was that at +925 the card was error correcting (dropping frames). It just so happens that, because of the utterly ridiculous extent of my memory overclock, it was so fast it actually made up for the error correcting & lost frames and even surpassed the clean score.

However, I don't believe that's a healthy situation for my card. Therefore I dropped my memory overclock back to my highest-scoring overclock that didn't throw up errors in OCCT.









Felt more comfortable.

Also, Micron memory responds quite nicely to temperature drops. (I am hoping I can improve on my maximum error-free +556 after I go to water.)
_
P.S.
Before I forget: don't pay any attention to the 1.5v between +650 and +675 on the graph (I messed up and lost data, and was forced to fill in the blanks due to limited time). I can assure you, though, that the rest is a 110% accurate & true account._
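The trade-off described here (a heavily error-correcting overclock still out-scoring a clean one) can be put into a toy model; every number below is made up purely for illustration:

```python
# GDDR5X retries corrupted transfers, so past a point the raw clock
# gains fight a growing replay penalty. All numbers are illustrative.

def effective_throughput(clock_mhz, replay_overhead):
    """Usable memory throughput after an error-replay penalty (toy model)."""
    return clock_mhz * (1.0 - replay_overhead)

clean_oc = effective_throughput(6280, 0.00)  # error-free overclock
dirty_oc = effective_throughput(6430, 0.01)  # 1% of transfers replayed

print(clean_oc, dirty_oc)  # the dirty clock can still come out ahead
```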


----------



## Bishop07764

Quote:


> Originally Posted by *nrpeyton*
> 
> Just tried the PALIT BIOS myself, but due to losing 2000 FAN RPM; I'm afraid any experiments & their results are going to have to wait until *after* I've picked & installed a full-cover waterblock on this card.
> 
> It idles at 65C if I leave the fan profile to default, lol. And maxed out It doesn't go higher than 2700 RPM. _No good on a FE!! normal max is 4700RPM!!_
> 
> Any recommendations? There are no reviews yet comparing temps for different 1080 TI blocks.
> Meaning I may just grab the EK block; the rest of my equipment is all EKWB, _(barring the chiller of course)._
> 
> On my Classy 1080 I never gave it a 2nd thought, massive PCB guaranteed great temps. But the baby-sized PCB of the reference design will impact cooling; so I want a block that scrapes every last possible degree C.


I've been impressed with my EK block. Mine has maxed at 36C so far. If it cools the other components (VRM, memory, etc.) anything like my old 780 Lightning, those are probably around 30C. My old Lightning could monitor memory, VRM, and PCB temps: with my EK block, the memory sat no higher than the low 30s, with the same temps or lower for the VRM. That was with me overclocking and overvolting; the VRMs never got out of the 30s even when pushing 1.4 volts on the core. My computer is even in the hottest room in my house by far.


----------



## nrpeyton

Quote:


> Originally Posted by *feznz*
> 
> I can tell you that back in the day of the GTX 580s, I determined 1000MHz was my max stable overclock with software-based voltage. I had had these cards for about a year, used for maybe 8 hours a week gaming and 12 hours a week idling on the web; the rest of the time the PC was off. (The days before boosting clocks.)
> Hearing lots about failing cards etc., I really had to see for myself how long these cards would last folding 24/7, while kept under 60°C at the max stable OC of 1000MHz, which never crashed during folding.
> 
> Answer;
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> the first card 3 weeks, the second card about another 2 weeks of normal use


Wow, so there is truth in it after all.

I wonder how long it would have lasted without any O/C at stock voltage.... (for comparison)

Quote:


> Originally Posted by *DerComissar*
> 
> Another block consideration for you may be the Aquacomputer Kryographics block, with their active cooled backplate.
> This is the bit tech review of the TXP block that I ordered, which also fits the 1080Ti:
> https://www.bit-tech.net/hardware/graphics/2016/09/23/titan-x-pascal-water-cooling-review/1
> 
> I ordered it directly from Aquacomputer in Germany, along with the active-cooled backplate:
> https://shop.aquacomputer.de/product_info.php?products_id=3458&XTCsid=bpvgo27iuaini7b5km10vmqpi
> https://shop.aquacomputer.de/product_info.php?products_id=3463&XTCsid=bpvgo27iuaini7b5km10vmqpi
> They tend to be in short supply at times. Aquatuning and Performance PC also sell them, but are often oos.
> 
> I found a recent review, with the active cooled backplate, from VSG at Thermalbench:
> http://thermalbench.com/2017/04/26/aqua-computer-kryographics-pascal-1080/
> 
> Note that this review is for the 1080 version, but should be very similar to the TitanXP/1080Ti block.
> 
> The gpu temps were slightly lower than the EK block, but of interest was the significant reduction in vrm temps, from using the active backplate.:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I've used several EK gpu blocks in the past. I had an Aquacomputer block on one of my 780Ti's, and was impressed with its thick copper construction. The backplate was also much thicker than EK's backplate, but I didn't have the optional active cooling attachment on that block.
> 
> The test results from the Thermalbench review look very good, as keeping the vrm temps low is very important as well.
> 
> I had purchased an EK TitanXP block (which fits the 1080Ti as well) before buying the AC block, but have decided to use the Aquacomputer Kryographics block instead.
> 
> Edit:
> OCUK may have them as well, I see the backplate listed there, but not sure if they have the blocks yet.
> https://www.overclockers.co.uk/water-cooling/water-blocks?ckSuppliers=44&sSort=5&n=384&ckTab=0


Excellent info, I'm definitely going to read up more about this block.

The idea of an actively cooled backplate is highly intriguing.

I'm one of the guys who uses a fan blowing onto the back of the CPU socket on my mobo for another 10C off my CPU temp, so actively cooling a backplate with water strikes me as a very natural and excellent idea 

rep+ for the info ;-)

Quote:


> Originally Posted by *Bishop07764*
> 
> I've been impressed with my EK block. Mine has maxed at 36c so far. If it cools the other components (vrm, mem, etc) anything like my old 780 lightning, those are probably around 30 c. My old lightning could monitor mem, vrm, and PCB temps. With my EK block, the memory sat no more than low 30s. Same temps or lower for vrm. That was with me overclocking and overvolting. Vrms never got out of the 30s even when pushing 1.4 volts on the core. My computer is even in the hottest room in my house by far.


So with the EK Titan / 1080 Ti block, you're maxing out *under* 40C at full load?

Nice.

You just using regular water/radiators?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *TheBoom*
> 
> My bad, I should probably have mentioned this earlier, but my card is a Zotac Amp Extreme, not an FE. I'm using the stock Amp Extreme BIOS.
> 
> So it should be 320 * 124%.
> 
> I think the curve probably works better for those on FE or shunt-modded cards.


Oh yeah, I thought you had an FE. No, you're not power limited on the Amp Extreme in gaming. You're good.


----------



## nrpeyton

Anyone bothered with a shunt mod on a stock air-cooled FE?


----------



## Alwrath

Quote:


> Originally Posted by *nrpeyton*
> 
> Anyone bothered with a shunt mod on a stock air-cooled FE?


No point really unless you do a repaste; temps will increase.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *nrpeyton*
> 
> Anyone bothered with a shunt mod on a stock air-cooled FE?


I wouldn't do it, man. It's introducing 75-100 watts into the card; you'll be power limited hardcore and the VRM will also get hotter.

If you do, maybe upgrade the thermal pads to Fujipoly 17 W/mK ones and mount an AIO on the card. But on air, you'll be thermal limited for sure.

EDIT: I meant you'll be thermal limited hardcore.

Look at the OP and read my shunt mod post. I was at 48C pre-shunt-mod. I had to do a lot of modifications to my AIO, midplate and backplate to keep my card at 50C with the extra 100 watts.
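For anyone new to the thread, the reason the mod "adds" those watts is that it changes what the card measures, not what the limiter allows. A sketch with illustrative resistor values (not taken from any specific board):

```python
# A shunt mod solders a second resistor in parallel with the card's
# current-sense shunt. The effective resistance drops, so the sensed
# voltage drop -- and thus the REPORTED power -- shrinks while the real
# draw rises. Resistor values here are illustrative only.

def parallel(r1_ohm, r2_ohm):
    """Equivalent resistance of two resistors in parallel."""
    return r1_ohm * r2_ohm / (r1_ohm + r2_ohm)

stock_shunt = 0.005                           # an example 5 mOhm shunt
piggyback = 0.005                             # equal resistor stacked on top
effective = parallel(stock_shunt, piggyback)  # 2.5 mOhm

actual_draw_w = 400
reported_w = actual_draw_w * effective / stock_shunt
print(reported_w)  # ~200: the limiter now sees only half the real draw
```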


----------



## Alwrath

Quote:


> Originally Posted by *SlimJ87D*
> 
> I wouldn't do it, man. It's introducing 75-100 watts into the card; you'll be power limited hardcore and the VRM will also get hotter.
> 
> If you do, maybe upgrade the thermal pads to Fujipoly 17 W/mK ones and mount an AIO on the card. But on air, you'll be thermal limited for sure.


this ^^^


----------



## derpa

Just wondering, but could the front page be updated with some "specs" of the optional BIOSes people are flashing? For instance, what the power limit is for _X_ BIOS, and a list of BIOSes people use, with card restrictions (e.g. _can't be flashed on Y card_).

I am currently planning my WC loop for the card, and the info would be convenient when I have more thermal headroom and adequate cooling to compensate for the extra power.

Just an idea! Thanks!


----------



## KedarWolf

Quote:


> Originally Posted by *derpa*
> 
> Just wondering, but could the front page be updated with some "specs" of the optional BIOSes people are flashing? For instance, what the power limit is for _X_ BIOS, and a list of BIOSes people use, with card restrictions (e.g. _can't be flashed on Y card_).
> 
> I am currently planning my WC loop for the card, and the info would be convenient when I have more thermal headroom and adequate cooling to compensate for the extra power.
> 
> Just an idea! Thanks!


If you click on the TechPowerUp link in the 'How To Flash Different BIOS' spoiler, the BIOSes are listed there, and if you follow the link for any of them, all the specs are there.


----------



## Coopiklaani

My temps after a 20-loop FS Ultra stress test. My rads are 240mm + 280mm + 120mm, with 3x120mm 1100RPM fans (exhaust) + 2x140mm 1500RPM fans (intake).
The whole system draws about 450w of power: 380w from the GPU and 40w from the CPU.
My coolant temp can go up to 48C, and I have no room for more rads.


----------



## Luckbad

@nrpeyton
Yep, checked with OCCT and EVGA OC Scanner's Furry E Memory test (basically like OCCT but I can actually make it punish my card more).


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> Look at the OP and read my shunt mod post. I was at 48C pre shunt mod. I had to do a lot of modifications to my AIO, midplate and backplate to keep my card at 50C with the extra 100 watts.


Quote:


> Originally Posted by *SlimJ87D*
> 
> *Quotes From Page 1 By SlimJ87D*
> 
> 
> "After hitting power limits, I could not take it anymore. I knew that it was time for me to do the shunt mod."
> "I found, as an OC member, it was irresponsible for me to have so much potential thrown away***! --SlimJ87D"


haha brilliant, that made my day*** ;-)







/\










Also, excellent info about the shunt mod; I desperately want to carry this out.

Shunt mod now!

But sadly for me, I will have to wait....

*OR*.. I *do* have an unused Corsair H80iGT AIO CPU cooler, but of course I don't have any special bracket or anything to go with it for GPU?

I don't suppose there is easy *temporary* DIY "mount job" I could do? (I don't want to be cutting any plastic or doing any damage to FE cooler).

*BUT*........ I do have two complete sets of hardware for the H80iGT due to an RMA _(they never asked for the hardware/brackets etc. but sent out a brand-new sealed boxed product)._

So I have one expendable set of hardware for it... hmmmmm..... I wouldn't mind cutting/drilling into a set..
........but...... I *would* obviously be against touching any plastic on the GPU.

However.. I've never even installed an AIO on a card before.. only ever done the full-cover water-block..

And I heard taking the stock cooler off the FE is a lot trickier than simply removing the heatsink/fan from an EVGA card, for example... _(so would it be worth the hassle?)_

Could it work (as a DIY job)? *Until* I can get a full-cover block after next weekend?

What you think guys, *can it be done?* Or am I insane? lol


----------



## KingEngineRevUp

Quote:


> Originally Posted by *nrpeyton*
> 
> haha brilliant, that made my day*** ;-)
> 
> 
> 
> 
> 
> 
> 
> /\
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, excellent info about the shunt mod; I desperately want to carry this out.
> 
> Shunt mod now!
> 
> But sadly for me I will have to wait.... ????
> 
> *OR*.. I *do* have an unused Corsair H80iGT AIO CPU cooler, but of course I don't have any special bracket or anything to go with it for GPU?
> 
> I don't suppose there is easy *temporary* DIY "mount job" I could do? (I don't want to be cutting any plastic or doing any damage to FE cooler).
> 
> *BUT*........ I do have two complete sets of hardware for the H80IGT due to an RMA _(they never asked for the hardware/brackets e.t.c but sent a brand new sealed boxed product out)...
> _
> So I have one expendable set of hardware for it... hmmmmm..... I wouldn't mind cutting/drilling into a set..
> ........but...... I *would* obviously be against touching any plastic on the GPU.
> 
> However.. I've never even installed an AIO on a card before.. only ever done the full-cover water-block..
> 
> And I heard taking the stock cooler off the FE is a lot trickier than simply removing the heatsink/fan from an EVGA card for example... _(so would it be worth the hassle)??
> _
> Could it work? (DIY job)? *Until* I can get full-cover block after next weekend??
> 
> What you think guys, *can it be done?* Or am I insane? lol


Use zip ties on the GPU; it wouldn't be too different from other things I've seen you do lol.

Here you go. You'll need a shim if you decide to keep the midplate blower.


__
https://www.reddit.com/r/5z91t7/mod_the_poor_mans_hybrid_h55_gtx_10xx_fe/


----------



## Clukos

I've found that if I set my voltage to 0.925v or below, the GPU doesn't throttle at all when temps increase. Can someone else verify this for me? Does it change from GPU to GPU, or is 0.925v the absolute maximum where you don't see thermal throttling? I've tested 0.931v and it's a no-go.


----------



## viz0id

@Benny89

Played and ran W3 for 6 hours today; let it run while I ate and took breaks. In normal play it holds 2088 all the time, since I regularly use the map and it gets a rest. No perfcaps and no crashes or artifacts. Lowest clock was 2075. My room got a bit warm from it though lol, but max temp was 65 with fan speed peaking at 80%.


----------



## ALSTER868

Quote:


> Originally Posted by *viz0id*
> 
> @Benny89
> 
> Played and ran W3 for 6 hours today; let it run while I ate and took breaks. In normal play it holds 2088 all the time, since I regularly use the map and it gets a rest. No perfcaps and no crashes or artifacts. Lowest clock was 2075. My room got a bit warm from it though lol, but max temp was 65 with fan speed peaking at 80%.


At which resolution? Your card must be a golden one to be able to maintain such clocks for hours. Was it the curve or just the old-style sliders?
Mine can't come close: in W3 it can only do 1936 max, stable with no jumping all over the place, at 4K. The power draw in W3 is just horrendous, it seems.


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> Use zip ties onto the gpu, wouldn't be too different from other things I've seen you do lol.
> 
> Here's you go, you'll need a shim if you decide to keep the MIDPLATE blower.
> 
> 
> __
> https://www.reddit.com/r/5z91t7/mod_the_poor_mans_hybrid_h55_gtx_10xx_fe/


Nice article.. I've been reading and viewing the pictures...

I do actually have some copper shims.

Reason:

When I was experimenting with trying to get the best possible temps on my 1080 Classy, I tried replacing the 0.5mm thermal pads with 0.5mm copper shims (for the memory).. copper is 401 W/mK vs a thermal pad's 3 W/mK.
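Those conduction numbers make the case clearly; a quick sketch using the 0.5mm thickness and 15mm x 15mm shim size mentioned in this post (ideal conduction only -- in practice the difference is smaller, because interface/contact resistance dominates):

```python
# Conductive thermal resistance of a slab: R = t / (k * A).
def thermal_resistance(thickness_m, k_w_per_mk, area_m2):
    return thickness_m / (k_w_per_mk * area_m2)

area = 0.015 * 0.015   # 15mm x 15mm contact patch
t = 0.0005             # 0.5mm thickness

r_pad = thermal_resistance(t, 3, area)       # ~0.74 K/W (3 W/mK pad)
r_copper = thermal_resistance(t, 401, area)  # ~0.0055 K/W (copper)

print(r_pad / r_copper)  # ~134x lower resistance for the copper shim
```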

Anyway, because they were so cheap, and because I knew I wanted to "try everything" and didn't quite know what I was getting myself into or what I might need after I ripped the block off (and knowing they would take 4 weeks to arrive from China), I just ordered everything.. a load of different sizes.... 0.5, 0.6, 1.0mm, 1.5mm, etc.

Trouble is, they're *ALL* only 15mm x 15mm (LxB)....

Yep; I've got all different thicknesses; but only *one* size (length / breadth).

The article says I'd need 2.5mm x 2.5mm. :-(

So looks like I'm going to have to just leave it...

You know, I've even been tempted to measure up my 1080 Classy block and see if it would do any good lol... in theory, no matter what the design, _all_ blocks will have the same 4 screws around the GPU core (all the other screw positions will obviously be different -- but 4 would be enough)... even if the block hangs over drastically on the upper side (the part furthest from the board) it wouldn't really matter.. lol...

Anyway; I think I'm now beginning to become guilty of 'wishful thinking' lol.

Better stop while I'm still ahead haha ;-)

Mind you, imagine if it did, by some miracle, actually fit (lined up _just_ barely where it needs to in the most important places)... lol

nahhh...........


----------



## Crazy Chuckster

I just got my shiny new ASUS STRIX OC 1080 ti installed, and wow it is a beast!

What are you ASUS owners using to control the color on the video card, ASUS Aura?

Last question: Afterburner 4.3.0 or GPU Tweak?


----------



## Silent Scone

Quote:


> Originally Posted by *Crazy Chuckster*
> 
> I just got my shiny new ASUS STRIX OC 1080 ti installed, and wow it is a beast!
> 
> What are you Asus owners using to control the color on the video card, ASUS Aura?
> 
> Last question Afterburner 4.3.0 or GPUTweak?


Currently using the latest version of GPU Tweak II with no issues, as well as Aura Graphics.


----------



## viz0id

Quote:


> Originally Posted by *ALSTER868*
> 
> At which resolution? Your card must be a golden one to maintain such clocks for hours. Was it the curve, or just the old-style sliders?
> Mine can't keep up, not even close - in W3 it can only do 1936 max stable without jumping all over the place, and that's at 4K. The power draw in W3 is just horrendous, it seems.


The curve is set to 2100 at 1.05V; it clocks down to 2088 at 1.043V and sometimes down to 2075 at 1.031V.

I'm playing full ultra, full HairWorks, at 1440p. Maybe the resolution is the reason?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *viz0id*
> 
> The curve is set to 2100 at 1.05V; it clocks down to 2088 at 1.043V and sometimes down to 2075 at 1.031V.
> 
> I'm playing full ultra, full HairWorks, at 1440p. Maybe the resolution is the reason?


Check your perfcap? Are you shunt modded?

That can for sure power limit you. It likes 1.032v for absolute smoothness.


----------



## Benny89

Quote:


> Originally Posted by *viz0id*
> 
> @Benny89
> 
> Played and ran W3 for 6 hours today. Let it run while I ate and took breaks. In normal play it holds 2088 all the time, since I regularly use the map and the card gets a rest. No perfcaps, and no crashes or artifacts. Lowest clock was 2075. My room got a bit warm from it though lol, but max temp was 65 with fan speed peaking at 80%.


Nice mate! Your card is platinum... Compared to this, mine is poop.

Well, gonna grab a new one tomorrow and return the old one. We will see if I will also be lucky









Question: how were you finding your max clocks? Were you just dragging each bin higher and higher with Heaven running and seeing when it crashes?


----------



## DerComissar

Quote:


> Originally Posted by *nrpeyton*
> 
> 
> 
> Spoiler: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *DerComissar*
> 
> Another block consideration for you may be the Aquacomputer Kryographics block, with their active cooled backplate.
> This is the bit tech review of the TXP block that I ordered, which also fits the 1080Ti:
> https://www.bit-tech.net/hardware/graphics/2016/09/23/titan-x-pascal-water-cooling-review/1
> 
> I ordered it directly from Aquacomputer in Germany, along with the active-cooled backplate:
> https://shop.aquacomputer.de/product_info.php?products_id=3458&XTCsid=bpvgo27iuaini7b5km10vmqpi
> https://shop.aquacomputer.de/product_info.php?products_id=3463&XTCsid=bpvgo27iuaini7b5km10vmqpi
> They tend to be in short supply at times. Aquatuning and Performance PC also sell them, but are often oos.
> 
> I found a recent review, with the active cooled backplate, from VSG at Thermalbench:
> http://thermalbench.com/2017/04/26/aqua-computer-kryographics-pascal-1080/
> 
> Note that this review is for the 1080 version, but should be very similar to the TitanXP/1080Ti block.
> 
> The gpu temps were slightly lower than the EK block, but of interest was the significant reduction in vrm temps, from using the active backplate.:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I've used several EK gpu blocks in the past. I had an Aquacomputer block on one of my 780Ti's, and was impressed with its thick copper construction. The backplate was also much thicker than EK's backplate, but I didn't have the optional active cooling attachment on that block.
> 
> The test results from the Thermalbench review look very good, as keeping the vrm temps low is very important as well.
> 
> I had purchased an EK TitanXP block (which fits the 1080Ti as well) before buying the AC block, but have decided to use the Aquacomputer Kryographics block instead.
> 
> Edit:
> OCUK may have them as well, I see the backplate listed there, but not sure if they have the blocks yet.
> https://www.overclockers.co.uk/water-cooling/water-blocks?ckSuppliers=44&sSort=5&n=384&ckTab=0
> 
> 
> 
> 
> 
> 
> 
> excellent info, I'm definitely going to read up more about this block.
> 
> the idea of an actively cooled backplate is highly intriguing.
> 
> I'm one of the guys who uses a fan blowing onto the back of the socket of my mobo for another 10c off my CPU temp, so actively cooling a backplate with water to me is actually a very natural and excellent idea
> 
> rep+ for info ;-)
Click to expand...

Thanks!

Two members who have installed the AC block on their cards:
http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/7190#post_26048117
Would be nice to see some further results from them.

Not to knock the EK block or any of the others, but I want to give the active cooling thing a go.

I saw a later post where you are considering doing a temporary AIO on yours, and I can't blame you!
That would at least get some decent cooling for your card while you arrange to get a block.
The AIO coolers certainly work decently from what I've seen.

I'm in the same boat, can't do much on air with the FE cooler, until I get the loop re-done, and the block on the card.
I will certainly post some less than scientific results from the AC block, once I get it up and running.


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> wow, so there is truth in it after all.
> 
> I wonder how long it would of lasted without any o/c at stock voltage.... (for comparison)
> 
> excellent info, I'm definitely going to read up more about this block.
> 
> the idea of an actively cooled backplate is highly intriguing.
> 
> I'm one of the guys who uses a fan blowing onto the back of the socket of my mobo for another 10c off my CPU temp, so actively cooling a backplate with water to me is actually a very natural and excellent idea
> 
> rep+ for info ;-)
> 


http://www.masterbond.com/properties/thermally-conductive-epoxy-adhesives

I was going to look at gluing a heatsink onto the back of my mobo socket for the same purpose, but in such a way that I could increase the mounting pressure.
I was also going to fill the voids inside the socket, effectively making a solid block of epoxy so there's no CPU PCB flex. For the thermal conductivity, I think I'd have had to achieve 20kg/cm² of mounting pressure for it to be equivalent to a fused bond.

Quote:


> Originally Posted by *Alwrath*
> 
> My question to you, what was your voltage at? Was it max? What was the slider at and volts please.


That's testing my brain. I think I could go to a museum and see a GTX 580 by now, but I believe it was in the 1.4V range.
The 580s were notorious for faulting and running HOT, which is partly why NVIDIA has locked down their BIOSes. I miss those days.

Quote:


> Originally Posted by *Silent Scone*
> 
> Currently using the latest version of GPU Tweak II with no issues, as well as Aura Graphics.


How is the Aura software? I heard it was a little fiddly and often needed resetting.
I personally never installed the software; I like to keep my OS as minimal as possible, and I quite liked the default light. It reminds me of a heart beating, and it is the heart of my PC.


----------



## nrpeyton

Quote:


> Originally Posted by *DerComissar*
> 
> Thanks!
> 
> Two members who have installed the AC block on their cards:
> http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/7190#post_26048036
> Would be nice to see some further results from them.
> 
> Not to knock the EK block or any of the others, but I want to give the active cooling thing a go.
> 
> I saw a later post where you are considering doing a temporary AIO on yours, and I can't blame you!
> That would at least get some decent cooling for your card while you arrange to get a block.
> The AIO coolers certainly work decently from what I've seen.
> 
> I'm in same boat, can't do much on air with FE cooler, until I get the loop re-done, and the block on the card.
> I will certainly post some less than scientific results from the AC block, once I get it up and running.


Yeah, I would love to see your temps, and (if possible) a 'with' and 'without' the actively cooled backplate comparison for core, memory (important) and even VRM would be fantastic.

Just read a post claiming an EK block maxing out at 36c under load. (Which isn't impossible, but it _would_ require _very_ low ambient temps and an absolutely IMMACULATE thermal-paste job + mount balance.)

A 12c differential between coolant and core temp should be possible with an EK block (I've done it on the Classy), though *only at sub-ambient temps did I believe it possible.*

(The colder the coolant, the better it conducts heat.) And I was also attributing that to the larger Classy PCB = less heat spillage from the VRM etc. onto the core & memory.

But hey; maybe it's easier than all that?

*And*, that brings me to *your* block... surely having water on the back of that GPU core _too_ could/should knock at least a few more degrees off? (I'll be eager to see your results)

A 10c differential between coolant temp and GPU load temp would be fantastic,
and reason enough to buy the block. (A test at 250 watt draw would be fair, as it _is_ the card's official TDP.)
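The comparison being proposed here reduces to the block's effective thermal resistance, deltaT divided by power. A tiny sketch (the 250W figure is the card's official TDP; the example temperatures are purely illustrative):

```python
# Effective core-to-coolant thermal resistance of a water block: R = deltaT / P.
def block_resistance(core_c, coolant_c, power_w):
    return (core_c - coolant_c) / power_w

# Illustrative numbers: a 10 C and a 12 C differential at the 250 W official TDP.
r_good = block_resistance(30.0, 20.0, 250.0)   # 10 C delta -> 0.040 C/W
r_ok   = block_resistance(32.0, 20.0, 250.0)   # 12 C delta -> 0.048 C/W
print(r_good, r_ok)
```

Measured this way, two blocks can be compared fairly even when coolant temps or power draws differ between systems.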

I get paid on Thursday, which is when I want to order my block. Hopefully it arrives by Saturday so I have it for the coming weekend.

I admit I'll struggle not to go EK when I'm at the checkout.

But hey; can't stick to one manufacturer all my life -- and I've always dreamed of "back of socket water" -- so if I'm doing it on the GPU.. it's now or never!!


----------



## chibi

Quote:


> Originally Posted by *DerComissar*
> 
> Two members who have installed the AC block on their cards:
> http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/7190#post_26048117
> Would be nice to see some further results from them.


Hey hey, my build is still a wip. I can't even load Winders yet, let alone testing


----------



## nrpeyton

Quote:


> Originally Posted by *feznz*
> 
> http://www.masterbond.com/properties/thermally-conductive-epoxy-adhesives
> 
> I was going to look at gluing a heatsink on the back of my Mobo socket for the same purpose but in such a way that I could increase the mounting pressure
> I was also going to fill the voids inside the socket effectively making a solid block of epoxy so no CPU PCB flex as the thermal conductivity I think I had to achieve 20kg/cm² mounting pressure would be equivalent to a fused bond


Yeah, I thought about that too.. this was my individual take on it:



I even thought about getting two, one for the GPU ;-)
Then they do all shapes/sizes so GPU memory was also an option too (back of PCB)

I liked to think of it as "attacking from all sides"! lol Like literally _invading_ the component with coolant.

Quote:


> Originally Posted by *Silent Scone*
> 
> Currently using the latest version of GPU Tweak II with no issues, as well as Aura Graphics.
> 
> 
> Spoiler: Warning: Spoiler!


aww, I remember using GPU Tweak 2 back in my GTX 980 days (on my 980 Strix). I liked how Tweak II showed the true effective memory speed (like 11,120 etc.) even if it did confuse MHz with Gbps; it was still nice to see ;-)

Nowadays I have to work it out:
double the GPU-Z figure, then multiply it by 4.
argh
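That mental arithmetic can be written out. GPU-Z reports the GDDR5X base clock (about 1376 MHz at stock on the 1080 Ti); "double it, then multiply by 4" is x8 overall, which gives the effective data rate, and the bus width then gives bandwidth. A quick sketch:

```python
# Effective GDDR5X data rate and bandwidth from the clock GPU-Z shows.
def effective_rate_mhz(gpuz_clock_mhz):
    # "double the GPU-Z figure, then multiply it by 4" = x8 overall
    return gpuz_clock_mhz * 2 * 4

def bandwidth_gb_s(gpuz_clock_mhz, bus_width_bits):
    # MT/s * (bus width in bytes), converted to GB/s
    return effective_rate_mhz(gpuz_clock_mhz) * 1e6 * bus_width_bits / 8 / 1e9

rate = effective_rate_mhz(1376)   # stock 1080 Ti memory clock as shown in GPU-Z
bw = bandwidth_gb_s(1376, 352)    # 352-bit memory bus on the 1080 Ti
print(rate, round(bw, 1))         # 11008 effective, ~484.4 GB/s
```

So a "+500" memory offset in an OC tool moves the effective figure by 8x the per-step change you'd expect from the base clock alone.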


----------



## Slackaveli

Quote:


> Originally Posted by *feznz*
> 
> I could tell you: back in the day of the GTX 580s, I determined that 1000MHz was my max stable overclock with software-based voltage. I'd had these cards for about a year, probably used them for 8 hours a week gaming and 12 hours a week idling on the web; the rest of the time the PC was off. (The days before boosting clocks.)
> Hearing lots about failing cards etc., I really had to see for myself how long these cards would last folding 24/7, while keeping them under 60°C with the max stable OC of 1000MHz, which never crashed during folding.
> 
> Answer;
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> the first card 3 weeks, the second card about another 2 weeks of normal use


WOW!! That's freaking crazy.


----------



## viz0id

Quote:


> Originally Posted by *Benny89*
> 
> Nice mate! Your card is platinum... Compared to this, mine is poop.
> 
> Well, gonna grab new one tomorrow and return old one. We will see if I will also be lucky
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Question: how were you finding your max clocks? Were you just dragging each bin higher and higher with Heaven running and seeing when it crashes?


The card has yet to crash on me. The only thing I've seen my card be bad at is memory, because if I go higher than +250 I perfcap and lose performance.

But yeah, since I'm on air, I just tried lower and lower bins for 2100 to try to get under the TDP limit. So I found the lowest bin that would do 2100, did a flat curve after that, and reduced one clock step per bin below that.

I don't think I've done anything special. I'm just lucky, I guess.

Oh, and I used Superposition 4K to check the highest clock. 2100 perfcapped there, so I didn't bother going higher
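That "lowest bin that holds the target, flat curve above it" approach can be sketched as a transform on (voltage, frequency) points. Purely illustrative: real curve editing happens in Afterburner's curve editor, and the voltage bins below are assumptions; 13.5 MHz per bin is the usual Pascal clock step.

```python
# Sketch of a "flat" OC curve: pin every bin at or above the knee voltage to
# the target clock, and step each bin below the knee down one notch.
def flatten_curve(voltages, knee_v, target_mhz, step=13.5):
    vs = sorted(voltages)
    below = [v for v in vs if v < knee_v]
    curve = {}
    for v in vs:
        if v >= knee_v:
            curve[v] = target_mhz                              # flat section
        else:
            # one clock step down per bin of distance from the knee
            curve[v] = target_mhz - step * (len(below) - below.index(v))
    return curve

# Hypothetical bins; knee/target mirror the "2100 at the lowest stable bin" idea.
curve = flatten_curve([1.000, 1.012, 1.025, 1.031, 1.043, 1.050], 1.031, 2100)
print(curve[1.050], curve[1.031], curve[1.025])  # 2100 2100 2086.5
```

The flat section is what keeps the card from requesting more voltage (and power) than the lowest bin that can already hold the target clock.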


----------



## Bishop07764

Quote:


> Originally Posted by *nrpeyton*
> 
> So with EK Titan / 1080 TI block; you're maxing out *under* 40 at full load?
> 
> Nice.
> 
> You just using regular water/radiators?


Yeah. My Ti has a 480 and a 280 rad all to itself. Yes, the max load temps I have seen are 36c, after playing The Division at 99% GPU usage for hours. My setup was originally designed for two highly overclocked 780 Lightning cards. Once summer gets in full swing, temps might rise a bit. My card is definitely screaming for more power.


----------



## nrpeyton

Quote:


> Originally Posted by *Bishop07764*
> 
> Yeah. My Ti has a 480 and a 280 rad all to itself. Yes, the max load temps I have seen are 36c, after playing The Division at 99% GPU usage for hours. My setup was originally designed for two highly overclocked 780 Lightning cards. Once summer gets in full swing, temps might rise a bit. My card is definitely screaming for more power.


Aye, well you're sitting on a 1300W PSU I see too, lol.

You not shunt modded?

What's your max stable at 36c? (Best temp I've seen on this thread yet.)

Lol, my card idles higher.


----------



## Nico67

Quote:


> Originally Posted by *Bishop07764*
> 
> I've been impressed with my EK block. Mine has maxed at 36c so far. If it cools the other components (vrm, mem, etc) anything like my old 780 lightning, those are probably around 30 c. My old lightning could monitor mem, vrm, and PCB temps. With my EK block, the memory sat no more than low 30s. Same temps or lower for vrm. That was with me overclocking and overvolting. Vrms never got out of the 30s even when pushing 1.4 volts on the core. My computer is even in the hottest room in my house by far.


Yeah, the EK block is working very well. Mine's water-chilled at a set point of 20c, so it ranges up to 21c before the chiller kicks in again, yet the highest I have seen the GPU is 26c, so that's only a 5-6c delta. The chiller is very good at controlling temps, but previous cards have usually been around the 28-30c mark, so this seems to be doing a lot better.


----------



## nrpeyton

Quote:


> Originally Posted by *Nico67*
> 
> Yeah the EK block is working very well, mines waterchilled at set point of 20c so it ranges up to 21c before the chiller kicks in again, yet the highest I have seen the GPU is 26c so that's only a 5-6c delta. The chiller is very good at controlling temps, but previous cards have usually been around the 28-30c mark so this seems to be doing a lot better.


What chiller you running?


----------



## RMBR

SLI 1080 ti FAIL!

The Division

SLI 1080ti 78fps
http://www.eurogamer.net/articles/digitalfoundry-2017-nvidia-geforce-gtx-1080-ti-asus-strix-sli-review


SLI 1080 73fps


SLI TITAN X pascal 96fps


Wildlands.

SLI 1080ti 53fps






SLI 1080 54fps


SLI TITAN X pascal 69fps


----------



## Nico67

Quote:


> Originally Posted by *nrpeyton*
> 
> What chiller you running?


It's a 1/2hp Hailea, but it's the one they make for some other company whose name eludes me. It has a scroll compressor, which is supposed to be quieter than the HC500a's.


----------



## Coopiklaani

My card doesn't seem to care about the temperature at all. It runs at the same core frequency from 40C to 55C.



I'm using an EK full-cover block; deltaT is always below 10C with ~200LPM flow rate under a 350W load.


----------



## KraxKill

That is all.


----------



## Bishop07764

Quote:


> Originally Posted by *nrpeyton*
> 
> Aye, well you're sitting on a 1300W PSU I see too, lol.
> 
> You not shunt modded?
> 
> What's your max stable at 36c? (Best temp I've seen on this thread yet.)
> 
> Lol, my card idles higher.


Yeah. My PSU is definitely overkill for this card right now. Not shunt modded. I was hoping to see a BIOS that would eliminate the need for it. It doesn't look like that's going to happen on Pascal anytime soon, anyway. I miss my 780 Lightning BIOS that allowed a 287% power limit.







I was hoping for a tweaked BIOS that allowed maybe 350-375 watts. It's bench stable at 2063 core and can push to 2076 if I mess with the curve. Scores go down at 2076, though. Gaming is at 2050 core. Memory definitely fares better: I have tried up to +675 on the memory and still gotten positive gains in Superposition. When I tried the card out on air, my idle temps were definitely higher than my load temps are now.

Quote:


> Originally Posted by *Nico67*
> 
> Yeah the EK block is working very well, mines waterchilled at set point of 20c so it ranges up to 21c before the chiller kicks in again, yet the highest I have seen the GPU is 26c so that's only a 5-6c delta. The chiller is very good at controlling temps, but previous cards have usually been around the 28-30c mark so this seems to be doing a lot better.


Ha ha. Can't touch this.







Those are awesome temps. Don't get temps like that until the dead of winter here. I remember my 1080 seahawk ek x idling at under 10 C last winter. Ha ha. Ambient temps can make a huge difference without a chiller.


----------



## Bishop07764

Quote:


> Originally Posted by *KraxKill*
> 
> 
> 
> That is all.


Lol. I'll take some of those temps please.


----------



## KraxKill

Quote:


> Originally Posted by *Bishop07764*
> 
> Lol. I'll take some of those temps please.


I finally got my chillbox all sorted. Ditched my AIO for a full block. Excuse the worm screw clamps. The pressures involved make for a mess without them.


----------



## TheNoseKnows

Quote:


> Originally Posted by *nrpeyton*
> 
> And it's also Nvidias TDP design for our 1080TI.... but how can something so volatile as a GPU be exactly *250* TDP? lol... of course it's not.
> It's a marketing ploy. No one is going to release a card with a TDP of 279, or 296. Or 178. Because it's bad marketing.


There are several 1080 Tis with >250W TDP. The Gigabyte 1080 Ti Aorus Xtreme, for example, has a default TDP of 250W or 300W (depending on the BIOS), which can be increased to 375W regardless of BIOS version. Another example is Zotac's 1080 Ti Amp Extreme, which has a base power consumption of 320W. In fact, the majority of custom-PCB 1080 Tis have a >250W TDP.
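Those TDP figures are what the power-limit slider scales: the slider percentage multiplies the base power target set in the BIOS. A quick sketch of the arithmetic, using the wattages quoted above:

```python
# Board power target = base TDP (from the BIOS) x power-limit slider percent.
def power_target_w(base_tdp_w, slider_percent):
    return base_tdp_w * slider_percent / 100.0

# Examples built from the figures quoted above:
print(power_target_w(300, 125))  # 300 W BIOS at 125% -> 375.0 W
print(power_target_w(250, 150))  # a 250 W BIOS would need 150% for the same 375 W
```

This is why the same card can report different maximum slider percentages depending on which BIOS (and base TDP) it shipped with.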


----------



## Slackaveli

Quote:


> Originally Posted by *RMBR*
> 
> SLI 1080 ti FAIL!
> 
> The Division
> 
> SLI 1080ti 78fps
> http://www.eurogamer.net/articles/digitalfoundry-2017-nvidia-geforce-gtx-1080-ti-asus-strix-sli-review
> 
> 
> SLI 1080 73fps
> 
> 
> SLI TITAN X pascal 96fps
> 
> 
> Wildlands.
> 
> SLI 1080ti 53fps
> 
> 
> 
> 
> 
> 
> SLI 1080 54fps
> 
> 
> SLI TITAN X pascal 69fps


Look, guys, a new guy has arrived! And he aims to bring a lot to the discussion!


----------



## Chris Ihao

Ok. I posted a comment a few days ago about having a system thread exception where I couldn't boot into Windows. Ended up reinstalling everything, and lost quite a few savegames etc. in the process, as I thought my "user" folder would be retained in Windows.old. Could have backed it up, stupid me, although fortunately most saves are stored in the cloud these days anyhow. Also, to be honest, I should really restart several games, as I don't even know what the heck is going on storywise loading these old saves. In other words, not a BIG loss.

ANYHOW, to finally get to the point: I blamed the install of the newest MSI Afterburner for the loop of death, but it seems I might have been wrong. Today, not even really trying to find info about this anymore, I stumbled onto the following thread on the Oculus forums about a VERY similar problem pertaining to the Oculus and USB 3.0 ports: https://answers.microsoft.com/en-us/windows/forum/windows_10-update/system-restore-failed-while-restoring-the/005bdbfa-75aa-4ad4-85d0-b427563e7928

Having had my fair share of problems with the Rift headset suddenly not being detected, and having to swap ports to get it up and running again, I have no doubt that the Oculus is to blame for this predicament. I suppose I have to buy this "recommended by Oculus" PCIe USB card to be certain everything works from now on. Stupid... The worst part is that there is a likely fix for the problem in that thread, using the command line, which would have saved me all the trouble of reinstalling and losing documents. The only positive thing is that I now have a fresh install, after years of dragging along lots of bits and pieces from long ago, which are now simply gone. Most of the important documents are residing in the cloud anyways. Feels somewhat... fresh.

To conclude this oversized post where I don't even mention the 1080 Ti: I wanted to apologize to the makers of Afterburner (and Unwinder) for putting the blame where it doesn't belong. If I hadn't been so sure I "knew" what the problem was, I could even have found an actual fix. Lesson learnt, I guess. Cheers!


----------



## Nico67

Quote:


> Originally Posted by *Bishop07764*
> 
> Ha ha. Can't touch this.
> 
> 
> 
> 
> 
> 
> 
> Those are awesome temps. Don't get temps like that until the dead of winter here. I remember my 1080 seahawk ek x idling at under 10 C last winter. Ha ha. Ambient temps can make a huge difference without a chiller.


Yeah, my ambient is too warm; water temps would be in the 30-35c range most of the year, so with aircon, 20c is good without condensation.








Quote:


> Originally Posted by *KraxKill*
> 
> 
> 
> That is all.


Very nice, or cool at least. You'd need some serious antifreeze for that. How cool is it there, since you don't seem to have any insulation on the system?


----------



## RMBR

Quote:


> Originally Posted by *Slackaveli*
> 
> Look, guys, a new guy has arrived! And he aims to bring a lot to the discussion!


Hello!
I am very disappointed with the 1080 Ti SLI results.
I have 1080 SLI and was thinking of upgrading.
But the 1080 Ti SLI looks bad.


----------



## Slackaveli

Quote:


> Originally Posted by *RMBR*
> 
> Hello!
> I am very disappointed with the 1080 Ti SLI results.
> I have 1080 SLI and was thinking of upgrading.
> But the 1080 Ti SLI looks bad.


No it's not; that reviewer is an idiot. Everybody in here on SLI is getting exactly what you'd expect out of SLI, beating 1080 SLI by 30-40%, as expected. Not sure what is up with that article. Probably a salty amDolt who just realised that Vega competes with a 1070, not a 1080 Ti. Don't worry about it.


----------



## Alex132

amDolt?

Also has anyone here got a 1080 Ti FTW3? They're out of stock everywhere I look.


----------



## Alwrath

I stopped watching the second video at " I have 2 Xeon 2.3 ghz processors with Hyperthreading "


----------



## done12many2

Quote:


> Originally Posted by *Slackaveli*
> 
> no it's not, that reviewer is an idiot. everybody in here on sli is getting exactly what you'd expect out of sli, beating 1080 sli by 30-40%, as expected. not sure what is up with that article. Probably a salty amDolt who just realised that Vega competes with a 1070, not a 1080Ti. dont worry about it.


Agreed. My 1080 Ti SLI scales the same as my 1080 SLI, but higher.

Since others have mentioned the EK blocks, mine are doing better than expected. I'm seeing the same or better temps than those of my 1080 SLI cards, which were on EK blocks as well. I did go with liquid metal this time around. I also shunt modded the Ti cards.


----------



## KraxKill

Quote:


> Originally Posted by *Nico67*
> 
> Very nice or cool at least, need some serious antifreeze for that, how cool is it there as you don't seem to have any insulation on the system?


The temperature where I am is actually quite nice today. The high for the day was 87F while the low is expected to be 57F tonight.

There is no need for insulation as my case is the insulator.

I've built a little mini Antarctica. It's a sealed, airtight (or near-airtight) case, and the air temps inside the case are at or near the temperature of the coolant being pumped through the loop. So there is no possibility of condensation.

I simply dial up the liquid temp I desire and the temp in the case drops to match that of the liquid.


----------



## CoreyL4

Quote:


> Originally Posted by *KraxKill*
> 
> The temperature where I am is actually quite nice today. The high for the day was 87F while the low is expected to be 57F tonight.
> 
> There is no need for insulation as my case is the insulator.
> 
> I've built a little mini Antarctica. It's a sealed air tight, or near airtight case and the air temps inside the case are at or near the temperature of the coolant being pumped through the loop. So there is no possibility for condensation.
> 
> I simply dial up the liquid temp I desire and the temp in the case drops to match that of the liquid.


Wait, how did you do this? Is that a mini cooler???? I am so lost.


----------



## Randomocity

Quote:


> Originally Posted by *nrpeyton*
> 
> Aye, well you're sitting on a 1300W PSU I see too, lol.
> 
> You not shunt modded?
> 
> What's your max stable at 36c? (Best temp I've seen on this thread yet.)
> 
> Lol, my card idles higher.


I can also attest: I'm getting wonderful temperatures with my EK FC. I've yet to break 40c in FurMark, which is incredible considering my old 1080 FTW was in the 50s. With the stock BIOS, my max overclock is somewhere between 2063 and 2075, but I've only played with it for a couple of hours. Going to try and flash the Palit BIOS in the next couple of days to see if I can hit the magical 2100 mark. No shunt mod for me either; I don't have the stomach for it.


----------



## Nico67

Quote:


> Originally Posted by *KraxKill*
> 
> The temperature where I am is actually quite nice today. The high for the day was 87F while the low is expected to be 57F tonight.
> 
> There is no need for insulation as my case is the insulator.
> 
> I've built a little mini Antarctica. It's a sealed air tight, or near airtight case and the air temps inside the case are at or near the temperature of the coolant being pumped through the loop. So there is no possibility for condensation.
> 
> I simply dial up the liquid temp I desire and the temp in the case drops to match that of the liquid.


Awesome, how do you go for coolant at -17c?


----------



## KraxKill

Quote:


> Originally Posted by *Nico67*
> 
> Awesome, how do you go for coolant at -17c?


Just really, really clean ethylene glycol and water, at a ratio where the viscosity of the liquid at the desired temperature is still good for both flow and heat transfer.
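For reference, the glycol fraction needed for a given freeze margin can be estimated from the commonly published EG/water freezing-point table. The figures below are rounded approximations; check a real glycol chart before trusting a loop (or a sub-zero chiller) to them.

```python
# Approximate freezing points of ethylene glycol/water mixes (% EG by volume).
# Rounded, commonly published ballpark figures -- not a substitute for a chart.
FREEZE_C = {0: 0.0, 20: -8.0, 30: -15.0, 40: -24.0, 50: -37.0}

def eg_percent_for(freeze_target_c):
    """Linearly interpolate the EG volume % needed for a target freeze point."""
    pts = sorted(FREEZE_C.items())
    for (p0, t0), (p1, t1) in zip(pts, pts[1:]):
        if t1 <= freeze_target_c <= t0:
            return p0 + (p1 - p0) * (t0 - freeze_target_c) / (t0 - t1)
    raise ValueError("target outside table range")

print(round(eg_percent_for(-17.0), 1))  # roughly 32% EG for a -17 C loop
```

Note the trade-off mentioned above: more glycol lowers the freeze point but raises viscosity and cuts the mix's heat capacity, so you want only as much as the target temperature actually requires.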


----------



## DerComissar

Quote:


> Originally Posted by *nrpeyton*
> 
> 
> 
> Spoiler: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *DerComissar*
> 
> Thanks!
> 
> Two members who have installed the AC block on their cards:
> http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/7190#post_26048036
> Would be nice to see some further results from them.
> 
> Not to knock the EK block or any of the others, but I want to give the active cooling thing a go.
> 
> I saw a later post where you are considering doing a temporary AIO on yours, and I can't blame you!
> That would at least get some decent cooling for your card while you arrange to get a block.
> The AIO coolers certainly work decently from what I've seen.
> 
> I'm in same boat, can't do much on air with FE cooler, until I get the loop re-done, and the block on the card.
> I will certainly post some less than scientific results from the AC block, once I get it up and running.
> 
> 
> 
> 
> 
> 
> 
> Yeah, would love to see your temps and (if possible) a 'with' and 'without' the actively cooled backplate for core, memory (important) and even VRM would be fantastic.
> 
> Just read a post claiming EK block maxing out at 36c, load. (Which isn't impossible but _would_ require _very_ low ambient temps) and absolutely IMMACULATE thermal-paste job + balance.).
> 
> A 12c differential between coolant and core temp should be possible with an EK block (I've done it on the Classy), though *only at sub-ambient temps did I believe it possible.*
> 
> (The colder the coolant, the better it conducts heat.) And I was also attributing that to the larger Classy PCB = less heat spillage from the VRM etc. onto the core & memory.
> 
> But hey; maybe it's easier than all that?
> 
> *And*, that brings me to *your* block... surely having water on the back of that GPU core _too_ could/should knock at least a few more degrees off? (I'll be eager to see your results)
> 
> A 10c differential between coolant temp and GPU load temp would be fantastic,
> and reason enough to buy the block. (A test at 250 watt draw would be fair, as it _is_ the card's official TDP.)
> 
> I get paid on Thursday, when I want to order my block. Hopefully for delivery by Saturday so I have coming weekend.
> 
> I admit I'll struggle not to go EK when I'm at the checkout.
> 
> But hey; can't stick to one manufacturer all my life -- and I've always dreamed of "back of socket water" -- so if I'm doing it GPU.. it's now or never!!
Click to expand...

Yeah, I would like to do a comparison with and without the active cooling backplate "module" lol.

But it will be a while before I get the loop rebuilt and the system finished.

Tom's tested the AC block on a 1080 Ti, and reviewed the temps without the active backplate, compared to the stock FE cooler:


Review:
http://www.tomshardware.com/reviews/geforce-gtx-1080-ti-water-cooling,4975.html

Note that the AC block uses no thermal pads on the memory chips; they make direct contact with the block, using thermal paste instead.

Beyond that, if you go to chilled water, that would really lead to fantastic temps, lol.
Quote:


> Originally Posted by *chibi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DerComissar*
> 
> Two members who have installed the AC block on their cards:
> http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/7190#post_26048117
> Would be nice to see some further results from them.
> 
> 
> 
> Hey hey, my build is still a wip. I can't even load Winders yet, let alone testing
Click to expand...

And a lovely WIP!









Looking forward to seeing how you do when you can load Winders, lol.


----------



## nrpeyton

Quote:


> Originally Posted by *Nico67*
> 
> Its a 1/2hp Hailea, but its the one they do for some other company who's name eludes me, has a scroll compressor which is supposed to be quieter, than the HC500a.


I have the exact same model.

Hailea HC500a,

790w cooling capacity (1/2 HP).

You had it sub-zero?

Quote:


> Originally Posted by *DerComissar*
> 
> Yeah, I would like to do a comparison with and without the active cooling backplate "module" lol.
> 
> But it will be a while before I get the loop rebuilt and the system finished.
> 
> Tom's tested the AC block on a 1080Ti, and had a review of the temps. without using the active backplate, compared to the stock FE cooler:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Review:
> http://www.tomshardware.com/reviews/geforce-gtx-1080-ti-water-cooling,4975.html
> 
> Note that the AC block uses no thermal pads on the memory chips, they have direct contact to the block, using thermal paste instead.
> 
> Beyond that, if you go to chilled water, that would really lead to fantastic temps., lol.
> And a lovely WIP!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looking forward to seeing how you do when you can load Winders, lol.


Nice, I like the idea of direct contact with memory chips (no pads)...

But wow I'm surprised that works... (there's a reason manufacturers don't normally do it).

The block would need to be absolutely perfectly flat and symmetrical. Even a 0.25mm bend (natural manufacturing inconsistency) in the metal block from left side to right side could cause contact issues. They would have to have *absolutely utterly perfect measurements* so that all 11 memory chips *and* the core all make perfect contact (with no pads to soak up any difference).
In theory, if they've nailed it, temps on those memory chips should indeed be better... but I'd have to see it to believe it lol.

Maybe they have screws all around the chips to help hold the PCB against the block perfectly....

I'm definitely going to do a bit of digging and find all the reviews and info I can on this block.

And the actively cooled backplate on top of it too... it seems like there are things going on here that no other manufacturer is doing. ;-)

*P.S.*
When I swapped 0.5mm thermal pads out for 0.5mm copper shims on my 1080 Classy, it didn't work as expected (due to the reasons above, i.e. manufacturing imperfections/inconsistency). Don't get me wrong, temps weren't bad... they were excellent on 3 chips on one side, mediocre in the middle, but didn't seem right on the opposite side. I even tried changing one side from 0.5mm to 0.6mm. Eventually I was able to _match_ or even _better_ the performance of 1.0mm thermal pads, but only by maybe 1/2 a degree (which could be within margin of error).
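Since the thread keeps coming back to pads vs. direct paste contact, here's a rough 1-D conduction sketch of why a thin paste film can beat a thick pad when the fit is good. All the numbers (per-chip watts, chip footprint, conductivities, layer thicknesses) are illustrative assumptions, not measured values for any particular block:

```python
# Rough 1-D conduction estimate: temperature rise across the interface
# layer on one memory chip. Illustrative assumptions only.

def interface_delta_t(watts, thickness_mm, k_w_mk, area_mm2):
    """dT = Q * t / (k * A): plain 1-D conduction through the layer."""
    t_m = thickness_mm / 1000.0
    a_m2 = area_mm2 / 1e6
    return watts * t_m / (k_w_mk * a_m2)

chip_watts = 2.0        # assumed per-chip dissipation
chip_area = 14 * 10     # assumed chip footprint in mm^2

# 1.0 mm thermal pad (~5 W/mK) vs a thin paste film (~0.05 mm, ~8 W/mK)
pad_dt = interface_delta_t(chip_watts, 1.0, 5.0, chip_area)
paste_dt = interface_delta_t(chip_watts, 0.05, 8.0, chip_area)

print(f"pad:   {pad_dt:.2f} C")
print(f"paste: {paste_dt:.2f} C")
```

The catch, as the post above says, is that the thin-film advantage only exists if every chip actually touches; a pad absorbs tolerance, a paste film doesn't.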


----------



## stranger451

Quote:


> Originally Posted by *Alex132*
> 
> amDolt?
> 
> Also has anyone here got a 1080 Ti FTW3? They're out of stock everywhere I look.


You had to pre-order on 03/28/2017, within the first 30 minutes on EVGA's website, to have an FTW3 ship today. It will most likely be weeks before they are available to purchase without a pre-order from either EVGA or Newegg. With my AIO 1080 Ti that won't go past 2038MHz at 53°C, I'm going to regret giving up my pre-order on two of them. The preliminary analysis of the PCB looks promising, but these are not binned chips, so overclocking is still the luck of the draw.


----------



## Nico67

Quote:


> Originally Posted by *nrpeyton*
> 
> I have the exact same model.
> 
> Hailea HC500a,
> 
> 790w cooling capacity (1/2 HP).
> 
> You had it sub-zero?


Nah, mine's a Hailea Aqua Medic HL 380 CA, same 790w 1/2hp, but a slightly different design. It's supposed to be quieter and has a slightly lower flow rate requirement. It has a horizontal reservoir as the base, which may be a bit larger. I get condensation when it's at 18c, lol, at least in summer; too warm and humid even with AC.


----------



## nrpeyton

Quote:


> Originally Posted by *Nico67*
> 
> Nah mines a Hailea Aqua Medic HL 380 CA, same 790w 1/2hp, but slight different design. Its supposed to be quieter and had slightly lower flow rate requirement. It has Horizontal reservoir as the base which maybe a bit larger. I get condensation when its at 18c lol at least in summer, too warm and humid even with AC


Aww, I use a dew point & relative humidity meter. Some nights the dew point goes down as low as 7 or 8c, so I'm able to run the chiller that low with no condensation. (The meter is only 20 bux off eBay.)
Other days the dew point is of course about 16c; it does change day-to-day.

They are really easy to modify for sub-zero. You just need to apply a little liquid electrical tape to the GPU and insulate your blocks/tubing with Armaflex to stop air getting in (and condensation), and you can get as low as -14c coolant temp by bypassing the thermostat. I had great fun with mine.

It's currently disconnected, as after I modified it to run regardless of temperature (i.e. as soon as it was switched on at the wall), I got annoyed having to power it up/down every hour or so. (With the PC idling it would actually take about an hour for the temp to go from 16c to about 35-40c.)

I got some great O/C'ing results on my last Pascal as a result. 2300+ MHz ;-)

Can't wait to see what my Ti is going to do. It's still sitting beside me now; I could have it reconnected to the loop in 5 mins ;-)

A quieter compressor would be nice, I have to say... the noise does get on your nerves eventually lol.
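For anyone wanting to replicate the dew-point check before buying a meter, the dew point can be estimated from room temperature and relative humidity with the Magnus approximation (a standard formula; the constants below are one common parameter set). Keep the coolant a few degrees above the result and condensation shouldn't form:

```python
import math

def dew_point_c(temp_c, rh_percent):
    """Magnus approximation; good to roughly +/-0.4 C in room conditions."""
    a, b = 17.62, 243.12
    gamma = (a * temp_c) / (b + temp_c) + math.log(rh_percent / 100.0)
    return (b * gamma) / (a - gamma)

# e.g. a 21 C room at 40 % relative humidity:
dp = dew_point_c(21.0, 40.0)
print(f"dew point: {dp:.1f} C")  # safe chiller setpoint = dp + a margin
```

A cheap meter is still the safer bet on humid days, since RH near the cold blocks can differ from the room average.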


----------



## DerComissar

Quote:


> Originally Posted by *nrpeyton*
> 
> 
> 
> Spoiler: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *DerComissar*
> 
> Yeah, I would like to do a comparison with and without the active cooling backplate "module" lol.
> 
> But it will be a while before I get the loop rebuilt and the system finished.
> 
> Tom's tested the AC block on a 1080Ti, and had a review of the temps. without using the active backplate, compared to the stock FE cooler:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Review:
> http://www.tomshardware.com/reviews/geforce-gtx-1080-ti-water-cooling,4975.html
> 
> Note that the AC block uses no thermal pads on the memory chips, they have direct contact to the block, using thermal paste instead.
> 
> Beyond that, if you go to chilled water, that would really lead to fantastic temps., lol.
> 
> 
> 
> 
> 
> 
> Nice, I like the idea of direct contact with memory chips (no pads)...
> 
> But wow I'm surprised that works... (there's a reason manufacturers don't normally do it).
> 
> The block would need to be absolutely perfectly flat and symmetrical. Even a 0.25mm bend (natural manufacturing inconsistency) on the metal block from left-side to right-side could cause contact issues. They would have to have *absolutely utterly perfect measurements* so all 11 memory chips *and* the core, all makes perfect contact (with no pads to soak up any difference)..
> In theory; if they've nailed it, temps on those memory chips should indeed be better.. but I'd have to see it to believe it lol.
> 
> Maybe they have screws all around the chips to help hold the PCB against the block perfectly....
> 
> I'm definitely going to do a bit of digging and find all the reviews and info I can on this block.
> 
> And the active cooled backplate too, on-top of it.. all seems like there are things going on here that no other manufacturer is doing. ;-)
> 
> *
> P.S.
> *When I swapped 0.5mm thermal pads out for 0.5mm copper shims on my 1080 Classy it didn't work as expected (due to reasons above I.E. manufacturing imperfections/inconsistency). Don't get me wrong, temps weren't bad... they were excellent on 3 chips on one side, mediocre in the middle but didn't seem right on the opposite side.. I even tried changing one side from 0.5mm to 0.6mm (eventually I was able to _match_ or even _better_ performance against using 1.0mm thermal pads but only maybe by 1/2 degree (which could be within margin of error).
Click to expand...

I see what you mean with the shims; again, they are basically rigid compared to thermal pads.
The precision fit that would be required just isn't there.

And the surfaces are never going to be perfectly flush.
But there must be some way that AC's block manages to work; the thermal paste would not fill any significant gaps the way thermal pads do. I've not seen any issues mentioned with their method, which they've been employing on their blocks for years.

Perhaps the fastening of the blocks has some bearing on this, as you mentioned.
Here are the .pdf files on the block and active backplate:

kryographics_TITAN_X_Pascal_english_20160818.pdf 297k .pdf file


Backplate_TITAN_X_Pascal_active_english_20160830.pdf 243k .pdf file


----------



## ViRuS2k

Not sure how you guys are getting under 50c with your EK waterblocks, but I'm struggling to get my cards under 50c at load.
The water temps in my custom loop just keep going up and up, very very slowly I might add...

I have 2x 1080 Tis with EK blocks and backplates.
I currently run them @ 2025/12GHz.

The water creeps its way up until it reaches about 46c,
with card temps @ 54c,
and running a single card without SLI it creeps up to about 49c.

This is with 2x Alphacool radiators, very thick ones: 480mm on the top and 420mm on the bottom, with 1 EK pump, a D5 @ 10 on the dial.

The CPU is in the same loop @ 4.8GHz (Haswell i7 4790K).

I can't for the life of me figure out how to get my temps down to around the low 40s under load. Once I get there, I'm planning on doing the shunt mod.

Anyone have any suggestions?
I bought an EK-XTOP Revo Dual D5 PWM Serial (incl. 2x pumps) thinking it might be my flow that's causing the issues, though I have yet to fit it.
I also have another 120mm Alphacool Monster radiator that I could fit.
Adding stuff is no issue, as I have a 2000w Super Flower 8Pack PSU in the system. Probably overkill lol, but you can never buy cheap when buying PSUs...

Gotta figure out a way to lower these fluid temps.


----------



## pez

I'm no water expert, but I imagine you're running that Haswell chip (is it delidded?) at a vcore of 1.28 or so. It's not the coolest-running CPU, and these guys might be running lower-TDP/more efficient chips. Your temps don't sound bad at all.


----------



## ViRuS2k

Quote:


> Originally Posted by *pez*
> 
> I'm not water expert, but I imagine you're running that Haswell chip (is it delidded) on a vcore of 1.28 or so. It's not the coolest running CPU, and these guys might be running lower TDP/more efficient chips. Your temps don't sound bad at all.


I'm delidded, yes,
with CLU under the hood of that CPU @ 4.8 @ 1.34v; under stress its temps are around the high 60s. Surely that can't be the cause of the high temps though?

The reason I want under 50c is that I lose a bin once my cards jump over 50c, and I will probably see even higher temps once I shunt them both.
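On the "lose a bin over 50c" point: Pascal's GPU Boost drops the clock in roughly 13 MHz steps as the core crosses certain temperature thresholds. The exact thresholds vary from card to card; the ones below are purely illustrative assumptions, just to show the shape of the behaviour:

```python
# Very rough sketch of Pascal temperature-related clock bins.
# GPU Boost drops roughly one ~13 MHz bin at certain temperature
# thresholds; the threshold values below are ASSUMED, not official.

BIN_MHZ = 13
ASSUMED_THRESHOLDS_C = [37, 52, 59]  # illustrative only

def clock_at_temp(base_boost_mhz, temp_c):
    bins_lost = sum(1 for t in ASSUMED_THRESHOLDS_C if temp_c >= t)
    return base_boost_mhz - bins_lost * BIN_MHZ

print(clock_at_temp(2050, 49))  # below the assumed 52 C threshold
print(clock_at_temp(2050, 54))  # one more bin lost past 52 C
```

Which is why holding the water under ~40c, and hence the core under ~50c, is worth real clock speed, not just bragging rights.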


----------



## MURDoctrine

Quote:


> Originally Posted by *ViRuS2k*
> 
> No sure how you guys are getting under 50c with your EK waterblocks but im suffering trying to get my cards under 50c load.
> the watertemps in my custom loop just keep going up and up and up very very very slowly i might at...
> 
> i have 2x 1080ti`s ek blocks and back plates.
> run them currently @ 2025/12ghz
> 
> water creeps its way up until it reaches about 46c
> with card temps @54c
> and running a single card without sli creeps up to about 49c
> 
> this is with 2x alpha cool radiators very thick ones 480mm on the top and 420mm on the bottom. with 1 ek pump. d5 @ 10 on the dial.
> 
> CPU is under the same loop @4.8ghz I7 haswell 4970k
> 
> I cant for the life of me figure out how to get my temps down to around low 40s. under load, once i get around that im planning on doing the shunt mod.
> 
> anyone have any suggestions ?
> EK-XTOP Revo Dual D5 PWM Serial - (incl. 2x pump) i bought this thinking it might be my flow thats cauing the issues though yet to fit it.
> i also have another 120 alpha cool monster radiator that i could fit
> adding stuff is no issue as i have a 2000w super flower 8pack PSU in the system probably overkill lol but you can never buy cheap when buying psu`s....
> 
> gotta figure a way to lower these fluid temps.


Honestly, that sounds about right. I have a single 1080 Ti under water and it hits 44-46C after long gaming sessions. With 2 of them and a CPU in the loop, that seems about right. I also have a 360mm and a 480mm rad in my loop. You have to remember you get diminishing returns once you go past 120mm of radiator per block, with 1 extra for headroom.
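To put the diminishing-returns rule in numbers: a common back-of-envelope figure is on the order of 10 W dissipated per 120mm radiator section per degree of water-over-ambient delta at moderate fan speeds (an assumption that varies a lot with rad thickness and fans). Under that assumption, the loop above pencils out roughly like this:

```python
# Back-of-envelope coolant delta over ambient: dT ~= Q / (C * sections),
# where C is an ASSUMED W per 120 mm section per degree of delta.

def coolant_delta_c(heat_watts, rad_120mm_equiv, w_per_section_per_c=10.0):
    return heat_watts / (w_per_section_per_c * rad_120mm_equiv)

# Two OC'd 1080 Tis (~300 W each, assumed) + an OC'd CPU (~150 W, assumed):
heat = 2 * 300 + 150
sections = 4 + 3.5   # 480 mm top + 420 mm bottom ~= 7.5 x 120 mm

delta = coolant_delta_c(heat, sections)
print(f"expected coolant delta over ambient: {delta:.1f} C")
```

So with ~7.5 sections already in the loop, one more 120mm section only trims the delta by a degree or so; faster fans or a cooler room move the number far more.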


----------



## ViRuS2k

Quote:


> Originally Posted by *MURDoctrine*
> 
> Honestly that sounds about right. I have a single 1080ti under water and it hits 44-46C after long gaming sessions. With 2 of them and a CPU in the loop that seems about right. I also have a 360mm and 480mm rad in my loop. You have to remember you get diminished returns once you go past 120mm per block with 1 extra for headroom.


So do you know, if I add the other 120mm-thick radiator and the dual pump and put CLU on the GPUs, whether I could knock around 6-7c off load temps?
I know when adding CLU to my CPU it knocked off about 20-25c alone compared to that stock thermal crap lol.

But I'm not sure how much it would knock off GPU temps, as I have never used CLU on GPU dies. I do have the stuff ordered and in my house ready, which is some black liquid electrical tape that I plan on putting around the GPU die area, then putting CLU on the die and doing the same with the shunts.

I should knock off quite a bit of temp, I'm hoping, anyway... Also, better flow using 2x D5 pumps in series should give better thermals by getting the heat away from the GPUs faster, I hope. Well, that's my theory anyway...


----------



## pez

What is your OC on the Tis currently? Is it worth adding the noise, power draw and decreased space of a 120 rad for a few degrees? I get wanting to have a crazy OC, but outside of benchmarks, you're not going to see that difference in a measurable way (read: margin of error).


----------



## AMDATI

Quote:


> Originally Posted by *ViRuS2k*
> 
> No sure how you guys are getting under 50c with your EK waterblocks but im suffering trying to get my cards under 50c load.
> the watertemps in my custom loop just keep going up and up and up very very very slowly i might at...
> 
> i have 2x 1080ti`s ek blocks and back plates.
> run them currently @ 2025/12ghz
> 
> water creeps its way up until it reaches about 46c
> with card temps @54c
> and running a single card without sli creeps up to about 49c
> 
> this is with 2x alpha cool radiators very thick ones 480mm on the top and 420mm on the bottom. with 1 ek pump. d5 @ 10 on the dial.
> 
> CPU is under the same loop @4.8ghz I7 haswell 4970k
> 
> I cant for the life of me figure out how to get my temps down to around low 40s. under load, once i get around that im planning on doing the shunt mod.
> 
> anyone have any suggestions ?
> EK-XTOP Revo Dual D5 PWM Serial - (incl. 2x pump) i bought this thinking it might be my flow thats cauing the issues though yet to fit it.
> i also have another 120 alpha cool monster radiator that i could fit
> adding stuff is no issue as i have a 2000w super flower 8pack PSU in the system probably overkill lol but you can never buy cheap when buying psu`s....
> 
> gotta figure a way to lower these fluid temps.


There are a few things you can do. First off, you may be able to run your CPU with less voltage. For example, I run mine at 4.4GHz, and the typical suggested voltage for that is 1.2v, yet I can get it stable at 1.1v, a whole 0.1v savings.

Another way to shave off a few C is to look for a setting in your BIOS called 'render standby' and disable it. This keeps your onboard GPU from sitting in a standby mode, which actually does affect temps. For some reason it's designed to always be in standby, even with a dedicated graphics card installed, unless you explicitly disable standby mode. It's a little-known fact.

That should make things a bit easier on your loop.
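The undervolting advice pays off because dynamic (switching) power scales roughly with frequency times voltage squared, so even a small vcore drop cuts heat noticeably. A quick sketch of the scaling (the formula is the standard CMOS approximation; the voltages are the ones from the post above):

```python
# Dynamic power ~ f * V^2 (standard CMOS switching-power approximation).
# An undervolt from 1.2 V to 1.1 V at the same clock:

def relative_power(v_new, v_old, f_new=1.0, f_old=1.0):
    return (f_new / f_old) * (v_new / v_old) ** 2

saving = 1.0 - relative_power(1.1, 1.2)
print(f"~{saving * 100:.0f}% less switching power at the same clock")
```

Static/leakage power doesn't follow this curve, so the real-world saving is a bit smaller, but the direction holds.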


----------



## fisher6

Quote:


> Originally Posted by *ViRuS2k*
> 
> So do you know if i add the other 120mm thick radiator and dual pump and put clu on the gpus that i could knock off around 6-7c load temps ?
> i know when adding clu to my cpu it knocked off about 20-25c alone compaired to that stock thermal crap lol
> 
> but not sure about how much it would knock off on gpus temps... as i have never done clu on gpu dies but i do have the stuff ordered and in my house ready wich is some liquid electric black tape that i plan on putting around the gpu die area then putting clu on the die and doing the same with the shunts.....
> 
> i should knock off quite a bit of temps im hoping anyway ..... also with better flow using 2x d5 pumps in series should give better thermals due to getting the heat away faster from the gpus i hope, well thats my theory anyway...


Are you using any custom BIOS by any chance? I noticed yesterday my temps going up quite a bit, to 55C, while playing ME:A. Afaik it seems to have happened around the time I flashed the Palit BIOS. Gonna flash back the stock BIOS today and see if it changes anything. Running a 350 30mm EK and a 45mm Alphacool rad here with a single D5 pump; CPU temps are usually around 55-60 while gaming.


----------



## ViRuS2k

Quote:


> Originally Posted by *pez*
> 
> What is your OC on the Tis currently? Is it worth adding the noise, power draw and decreased space of a 120 rad for a few degrees? I get wanting to have a crazy OC, but outside of benchmarks, you're not going to see that difference in a measurable way (read: margin of error).


The temps I get are not even under synthetic loads... they're under gaming loads...
The cards are OC'd between 2012/2037; that's the sweet spot, though it does fluctuate due to the power limit. I'm planning on fixing that with the shunt mods so I can run a stable 2050/12GHz without fluctuations, in games that is. Synthetics I'm not bothered about lol; fluctuations there are the norm due to the over-the-top power limits they require, etc...

But if I can get under 50c load with the shunt in place, I will be very happy.







Getting there is going to be a mission though, it seems.

So even adding an extra 120mm-thick radiator won't do much for the load temps? Even if I can knock off 5c more, that would be great haha..
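For anyone new to the shunt mod being discussed: the card estimates power draw from the voltage across tiny sense resistors (shunts), so stacking a resistor in parallel with a shunt makes the controller under-read current, effectively raising the power limit. A sketch with assumed values (5 mOhm is a commonly cited stock shunt value, not confirmed for this exact card):

```python
# Shunt mod sketch: a resistor soldered in parallel with the stock
# current-sense shunt lowers the sensed resistance, so the controller
# under-reports power. Resistor values below are ASSUMPTIONS.

def parallel(r1, r2):
    return r1 * r2 / (r1 + r2)

STOCK_SHUNT = 0.005   # ohms, assumed stock sense resistor
ADDED = 0.005         # ohms, resistor stacked on top

effective = parallel(STOCK_SHUNT, ADDED)
scale = STOCK_SHUNT / effective   # factor by which real draw can exceed the limit

print(f"reported power is 1/{scale:.0f} of actual draw")
```

With equal values the sensed resistance halves, so a 250 W limit effectively allows ~500 W of real draw; that's exactly why temps climb after the mod and why the extra rad question matters.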


----------



## ViRuS2k

Quote:


> Originally Posted by *fisher6*
> 
> Are using any custom bios by any chance? I noticed yesterday my temps going up quite a bit to 55C while playing ME:A. Afaik it seems to have happened around the time i flashed the Palit bios. Gonna flash back the stock bios today and see if it changes anything. Running a 350 30mm EK and a 45mm Alphacool rad here with a singel D5 pump, CPU temps are usually around 55-60 while gaming.


Yeah, that Mass Effect is a heat-killing game...
I'm also running a custom BIOS, the Zotac Xtreme 1080 Ti BIOS with the 384w TDP limit...
Not sure it's drawing all of that, though; probably around 330w.

As for temps: I get about 70c or thereabouts in Prime95 or IBT, but under normal gaming it's around the high 50s/low 60s.

The internal GPU is disabled on the CPU, as I run dedicated... If only I could steal back 5-8c in load temps lol.


----------



## ViRuS2k

Quote:


> Originally Posted by *AMDATI*
> 
> There's a few things you can do. First off, you may be able to run your CPU with less voltage. For example, I run mine at 4.4Ghz, and the typical suggested voltage for that is 1.2v, Yet, I can get it stable at 1.1v, a whole 0.9v savings.
> 
> Another way to shave off a few C is to look for a setting in your bios called 'render standby', and disable it. This will keep your onboard GPU from staying in a standby mode, which actually does effect temps. For some reason, they designed it so it's always in standby even with a dedicated graphics card installed, unless you explicitly disable standby mode. It's a very little known fact.
> 
> That should make things a bit easier on your system loop.


Hi mate, I couldn't really run my CPU any lower, otherwise I'm going to end up running into a CPU bottleneck...
Currently I have 2 monitors, one @ 4k 60hz and one @ 1440p 144hz... I need the CPU to be as fast as possible for the 144hz monitor or I will end up bottlenecking something...


----------



## ViRuS2k

One other thing guys, would this gain anything...

Currently I run the custom loop @

Flow goes ->>>> PUMP/420MM RAD/GPU2/GPU1/480MM RAD/CPU/270MM PHOTON RES

back to PUMP.

Actually, now that I think about it, hahaha, I think I have the flow going the wrong way, hmm, lol, or something I'm just not seeing... Then again, I don't think changing the flow order would change anything regarding temps.
Also, I did notice I get a lot of pressure build-up in my loop: if I unscrew the fill cap on the top of the res I get a lot of air out, like unscrewing the radiator cap with the car engine still hot hehe.
Is that normal?????? Or is that me letting out trapped air? Anyway, again, it probably wouldn't help with temps... Gonna have to figure something out, or I might just make a separate loop just for the CPU and have the GPUs cooled by both the 480 and 420mm rads... though that would mean I need another res, and where to put it, hahaha, not sure... suggestions are welcome...
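On the loop-order question: for steady-state temperatures, block order largely doesn't matter, because the whole coolant volume converges to one equilibrium temperature set by total heat in versus total radiator capacity. A toy balance (all figures assumed) to illustrate why reordering the loop drops out of the math:

```python
# Toy steady-state balance: heat removed = rad_w_per_c * (coolant - ambient).
# Setting heat in = heat out gives one coolant temp, independent of the
# order components appear in the loop. All figures are assumptions.

def equilibrium_coolant_c(ambient_c, total_watts, rad_w_per_c):
    return ambient_c + total_watts / rad_w_per_c

# Same parts in two different orders; the total heat is identical:
order_a = ["pump", "420rad", "gpu2", "gpu1", "480rad", "cpu", "res"]
order_b = ["pump", "480rad", "cpu", "420rad", "gpu2", "gpu1", "res"]

t_a = equilibrium_coolant_c(25.0, 750, 75.0)  # ~750 W in, ~75 W/C of rad
t_b = equilibrium_coolant_c(25.0, 750, 75.0)

print(sorted(order_a) == sorted(order_b), t_a == t_b)
```

There are small per-segment gradients at high flow restriction, but they're a degree or less in a D5-driven loop; the pressure/air symptom is more worth chasing than the order.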


----------



## Nico67

Quote:


> Originally Posted by *ViRuS2k*
> 
> No sure how you guys are getting under 50c with your EK waterblocks but im suffering trying to get my cards under 50c load.
> the watertemps in my custom loop just keep going up and up and up very very very slowly i might at...
> 
> i have 2x 1080ti`s ek blocks and back plates.
> run them currently @ 2025/12ghz
> 
> water creeps its way up until it reaches about 46c
> with card temps @54c
> and running a single card without sli creeps up to about 49c
> 
> this is with 2x alpha cool radiators very thick ones 480mm on the top and 420mm on the bottom. with 1 ek pump. d5 @ 10 on the dial.
> 
> CPU is under the same loop @4.8ghz I7 haswell 4970k
> 
> I cant for the life of me figure out how to get my temps down to around low 40s. under load, once i get around that im planning on doing the shunt mod.
> 
> anyone have any suggestions ?
> EK-XTOP Revo Dual D5 PWM Serial - (incl. 2x pump) i bought this thinking it might be my flow thats cauing the issues though yet to fit it.
> i also have another 120 alpha cool monster radiator that i could fit
> adding stuff is no issue as i have a 2000w super flower 8pack PSU in the system probably overkill lol but you can never buy cheap when buying psu`s....
> 
> gotta figure a way to lower these fluid temps.


About the only two things you can affect are fan speed and ambient temperature. Lowering the room temp will help the most, but fan speed will help lower the delta between ambient and water temps. With the amount of radiator you have, I don't think a 120mm would make much difference at all. If your water temp is hitting 46c, that's insane, as it should be about 5c above room temp or thereabouts. Fan speed is probably the first place to start.


----------



## mshagg

Yep, as above. The GPU delta to coolant temp is fine, which suggests no problems with the block interface.

Your coolant temps are ridiculously high unless it's like 36 degrees in the room. What are the fan configurations on those radiators?

A single D5 should be fine for that loop; I ran something very similar just fine on one. An additional pump will just add more heat to the loop.


----------



## ViRuS2k

Quote:


> Originally Posted by *Nico67*
> 
> About the only two things you can effect is fan speed and ambient temperature, if you can lower room temp it will help the most, but fan speed will help lower the delta between ambient and water temps. With the amount of Radiator you have I don't think a 120mm would make much difference at all. If your water temp is hitting 46c, that's insane as it should be 5c higher than room temp or thereabouts. Fan speed is probably the first place to start.


That's under load though, with the water temps... and the GPUs are about 10c above water temp at load.
At idle the graphics cards are 25-27c, with water temps currently at 27c...

You might be right about the fans though: they are pretty loud and don't push a lot, but I was told that with these radiators you don't need static-pressure fans, as the fins are not as compact... or dense...

I might try adding another 4x 140mm fans to both radiators,

as currently the 480mm radiator is on top with 4x fans only, in pull configuration,
and the bottom rad has 3x 120mm under the radiator, also in pull configuration.

No fans on the other side of the radiators.

You might be right that I'm not getting the air out of the radiators fast enough. Hmm, food for thought lol, buy more fans.









Well guys, I work nightshifts so I have to go now, but I will check up on any replies/ideas you guys have. Here is a video of my system; ignore the light show, as I had just got the lights at the time and still need to configure them lol.

The video is with my 2x 980 Tis, though everything is the same apart from the cards being replaced with 1080 Tis...





or





Also, the res is only half full in the video; hahaha, it's completely full now.


----------



## Nico67

Quote:


> Originally Posted by *ViRuS2k*
> 
> Thats under load though with the water temps... and the gpus are about 10c above watertemps at load.
> at idle graphics cards are 25-27c with water temps currenlty at 27c...
> 
> you might be right about fans though they are pretty loud and dont push a lot but i was told with these radiators that you dont need pressure fans for them as the fins are not as compact... or dence....
> 
> might try adding another 4x 140 fans to both radiators
> 
> as currently the radiators are 480mm on top with 4x fans only in pull configuration
> and the bottom rad has 3x 120 under the radiator in the pull configurations
> 
> no fans on the other side of the radiators
> 
> you might be right that im not getting the air out of the radiators fast enough hmm food for thought lol buy more fans
> 
> 
> 
> 
> 
> 
> 
> 
> 
> well guys i work nightshifts so i have to go now but i will check up on replys to any idea`s you guys have will also here is a video of my system, ignore the light show as i just got them at the time and need to configure them lol
> 
> though the video is with my 2x980TI`s though everything is the same apart from the cards replaced with 1080ti`s.....
> 
> 
> 
> 
> 
> or
> 
> 
> 
> 
> 
> also the res is only half full, hahaha its completely full now


Fans in push are a lot more effective. I've never found fans on both sides helped that much, but I'd wind them up as much as you can and switch them to push if possible, making sure to blow the hot air out of the case.


----------



## feznz

Quote:


> Originally Posted by *KraxKill*
> 
> The temperature where I am is actually quite nice today. The high for the day was 87F while the low is expected to be 57F tonight.
> 
> There is no need for insulation as my case is the insulator.
> 
> I've built a little mini Antarctica. It's a sealed air tight, or near airtight case and the air temps inside the case are at or near the temperature of the coolant being pumped through the loop. So there is no possibility for condensation.
> 
> I simply dial up the liquid temp I desire and the temp in the case drops to match that of the liquid.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 


Quote:


> Originally Posted by *nrpeyton*
> 
> aww, I use a Dew Point & Relative Humidity Meter. Sometimes at night dew point goes down as low as 7 or 8c. So I'm able to run the Chiller that low with no condensation. (Meter is only 20 bux off Ebay).
> Other days dew point is of course about 16c but it does change day-to-day.
> 
> They are really easy to modify for sub-zero. You just need to apply a little bit liquid electrical tape to the GPU and insulate your blocks/tubing with armaflex to stop air getting in (and condensation) and you can get as low as -14c coolant temp by bypassing the thermostat. I had great fun with mine.
> 
> It's currently disconnected. As after I modified it to run regardless of temperature (as soon as it was switched on at wall) I got annoyed having to periodically power it up/down every hour or so. (with PC idling it would actually take about an hour for temp to go from 16c to about 35-40c.
> 
> I got some great O/C'ing results on my last pascal as a result. 2300++ MHz ;-)
> 
> Can't wait to see what my TI is going to do. It's still sitting beside me now. I could have it reconnected to loop in 5 mins ;-)
> 
> A quieter model of compressor would be nice; I have to say.. the noise does get on your nerves eventually lol.


I had almost gone down the rabbit hole with you guys, but time constraints got the best of me, and when the GTX 1080 didn't respond well to the cold I sold my 1HP condenser unit.
I will give you my final take on where I was heading; you're both obviously familiar with the chill box:
- Get rid of the water loop altogether: every heat exchange was going to lose energy, and pumps were always going to add heat to the loop. Large volumes of water are a pain in the butt to chill, insulate, or even pump at sub-zero.
- Replace the water blocks with custom-made evaporator blocks.
- Use a precise controller; the problem being that it would require one per GPU or CPU, plus multiple check valves, but it's all doable, and you can get ready-made AC hose that would handle -40°C.

Or you could just get one of these chiller units and build your own chill box. I always thought these units were a little underpowered.


----------



## Dasboogieman

Quote:


> Originally Posted by *ViRuS2k*
> 
> No sure how you guys are getting under 50c with your EK waterblocks but im suffering trying to get my cards under 50c load.
> the watertemps in my custom loop just keep going up and up and up very very very slowly i might at...
> 
> i have 2x 1080ti`s ek blocks and back plates.
> run them currently @ 2025/12ghz
> 
> water creeps its way up until it reaches about 46c
> with card temps @54c
> and running a single card without sli creeps up to about 49c
> 
> this is with 2x alpha cool radiators very thick ones 480mm on the top and 420mm on the bottom. with 1 ek pump. d5 @ 10 on the dial.
> 
> CPU is under the same loop @4.8ghz I7 haswell 4970k
> 
> I cant for the life of me figure out how to get my temps down to around low 40s. under load, once i get around that im planning on doing the shunt mod.
> 
> anyone have any suggestions ?
> EK-XTOP Revo Dual D5 PWM Serial - (incl. 2x pump) i bought this thinking it might be my flow thats cauing the issues though yet to fit it.
> i also have another 120 alpha cool monster radiator that i could fit
> adding stuff is no issue as i have a 2000w super flower 8pack PSU in the system probably overkill lol but you can never buy cheap when buying psu`s....
> 
> gotta figure a way to lower these fluid temps.


What's your ambient temperature? That has possibly the biggest impact on your temperatures, more so than any other factor. Some members here live in very cold climates, so they get a free 5-10 degree boost on that alone.

The order of troubleshooting I'd do:
1. Check block seating + TIM application; this has a massive impact on temperatures.
2. Check that the fans are well matched to the rads. MagLev or Swiftech fans seem to have the best static-pressure-to-noise-to-CFM ratio. Also, fans are much, much better at pushing air than pulling it.
3. Check whether flow is the issue; it could be with a single D5, because you have a lot of blocks and rads. That said, hot coolant suggests the water-to-air interface is the bottleneck. Maybe check for blockages?


----------



## feznz

Quote:


> Originally Posted by *RMBR*
> 
> SLI 1080 ti FAIL!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> The Division
> 
> SLI 1080ti 78fps
> http://www.eurogamer.net/articles/digitalfoundry-2017-nvidia-geforce-gtx-1080-ti-asus-strix-sli-review
> 
> 
> SLI 1080 73fps
> 
> 
> SLI TITAN X pascal 96fps
> 
> 
> Wildlands.
> 
> SLI 1080ti 53fps
> 
> 
> 
> 
> 
> 
> SLI 1080 54fps
> 
> 
> SLI TITAN X pascal 69fps


Welcome to Overclock.net

That is normal for SLI and new titles. I have been in the SLI boat before, and you will find yourself checking for new drivers so that SLI profiles get created for newer games.

And @KedarWolf, this could be useful for the OP's 1080 Ti card roundup.


----------



## Bishop07764

Quote:


> Originally Posted by *ViRuS2k*
> 
> Thats under load though with the water temps... and the gpus are about 10c above watertemps at load.
> at idle graphics cards are 25-27c with water temps currenlty at 27c...
> 
> you might be right about fans though they are pretty loud and dont push a lot but i was told with these radiators that you dont need pressure fans for them as the fins are not as compact... or dence....
> 
> might try adding another 4x 140 fans to both radiators
> 
> as currently the radiators are 480mm on top with 4x fans only in pull configuration
> and the bottom rad has 3x 120 under the radiator in the pull configurations
> 
> no fans on the other side of the radiators
> 
> you might be right that im not getting the air out of the radiators fast enough hmm food for thought lol buy more fans
> 
> 
> 
> 
> 
> 
> 
> 
> 
> well guys i work nightshifts so i have to go now but i will check up on replys to any idea`s you guys have will also here is a video of my system, ignore the light show as i just got them at the time and need to configure them lol
> 
> though the video is with my 2x980TI`s though everything is the same apart from the cards replaced with 1080ti`s.....
> 
> 
> 
> 
> 
> or
> 
> 
> 
> 
> 
> also the res is only half full, hahaha its completely full now


Lowering ambient temps, as others have suggested, could definitely help. I just used Kryonaut on my GPU with great results so far. My rad fans are push/pull on my GPU's 480 mm rad at just 600-700 RPM or so. Provided you have a good mount on your GPUs, try increasing fan speed to see if that nets you anything. I know I'm going to have to run A/C in my room this summer to maintain the max 36°C temps I'm seeing on my Ti. My dedicated GPU loop probably helps a bit too.


----------



## eXistencelies

Quote:


> Originally Posted by *ViRuS2k*
> 
> No sure how you guys are getting under 50c with your EK waterblocks but im suffering trying to get my cards under 50c load.
> the watertemps in my custom loop just keep going up and up and up very very very slowly i might at...
> 
> i have 2x 1080ti`s ek blocks and back plates.
> run them currently @ 2025/12ghz
> 
> water creeps its way up until it reaches about 46c
> with card temps @54c
> and running a single card without sli creeps up to about 49c
> 
> this is with 2x alpha cool radiators very thick ones 480mm on the top and 420mm on the bottom. with 1 ek pump. d5 @ 10 on the dial.
> 
> CPU is under the same loop @4.8ghz I7 haswell 4970k
> 
> I cant for the life of me figure out how to get my temps down to around low 40s. under load, once i get around that im planning on doing the shunt mod.
> 
> anyone have any suggestions ?
> EK-XTOP Revo Dual D5 PWM Serial - (incl. 2x pump) i bought this thinking it might be my flow thats cauing the issues though yet to fit it.
> i also have another 120 alpha cool monster radiator that i could fit
> adding stuff is no issue as i have a 2000w super flower 8pack PSU in the system probably overkill lol but you can never buy cheap when buying psu`s....
> 
> gotta figure a way to lower these fluid temps.


When I put my waterblock on my Ti, I was somehow getting high temps, even though I have done this several times with previous cards. So I took it back out of the loop, and guess what? One of the clear backings from the thermal pads had made its way onto the nickel block where it sits on the GPU core, blocking the cooling path. I was idling fine in the 20s, but under load it shot up into the 50s almost instantly. I removed the backing, and temps dropped back to no more than 45°C during long gaming sessions. This is at 2050 MHz too.

BTW I am running two 360mm GTS rads with a delidded i7 7700k @ 5ghz @ 1.285v in a Phanteks TG Evolv


----------



## fisher6

Quote:


> Originally Posted by *Bishop07764*
> 
> Lowering ambient temps as others have suggested could definitely help. I just used Kryonaut on my GPU with great results so far. My rad fans are push/pull on my GPU 480 mm rad at just 600-700rpm or so. Provided you have a good mount on your gpus, try increasing fan speed to see if that nets you anything. I know that I'm going to have to run A/c in my room this summer to be able to maintain the max 36c temps in seeing on my Ti. My dedicated GPU loop probably helps a bit too.


Is Kryonaut much better than what EK includes with their block? Tempted to try it on my GPU since many people seem to be mentioning it in this thread. How much difference are we talking about in temps?


----------



## nrpeyton

Quote:


> Originally Posted by *ViRuS2k*
> 
> Thats under load though with the water temps... and the gpus are about 10c above watertemps at load.
> at idle graphics cards are 25-27c with water temps currenlty at 27c...
> 
> you might be right about fans though they are pretty loud and dont push a lot but i was told with these radiators that you dont need pressure fans for them as the fins are not as compact... or dence....
> 
> might try adding another 4x 140 fans to both radiators
> 
> as currently the radiators are 480mm on top with 4x fans only in pull configuration
> and the bottom rad has 3x 120 under the radiator in the pull configurations
> 
> no fans on the other side of the radiators
> 
> you might be right that im not getting the air out of the radiators fast enough hmm food for thought lol buy more fans
> 
> 
> 
> 
> 
> 
> 
> 
> 
> well guys i work nightshifts so i have to go now but i will check up on replys to any idea`s you guys have will also here is a video of my system, ignore the light show as i just got them at the time and need to configure them lol
> 
> though the video is with my 2x980TI`s though everything is the same apart from the cards replaced with 1080ti`s.....
> 
> 
> 
> 
> 
> or
> 
> 
> 
> 
> 
> also the res is only half full, hahaha its completely full now


*High static pressure fans* are a must.

_*If you lay a sheet of paper on top of the roof of your PC (above the radiator), does it float or blow away? It should. If it doesn't, you're not getting enough air through.
*_
I'd say this is _even_ more important with thick radiators.

The highest static pressure fans I've ever seen/used are rated at *7.63 mm H2O*, can spin up to 3000 RPM, and move 186 cubic metres of air every hour
(Noctua NF-F12 industrialPPC-3000 PWM).

For comparison, the Corsair SP120s (also high static pressure) are rated at 3.1 mm H2O.

Also, radiators sometimes benefit from a really good cleaning (dust removal).
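Since Noctua quotes airflow in m³/h while most retailers list CFM, a quick conversion helps when comparing spec sheets:

```python
# m³/h to CFM: 1 m³ = 35.3147 ft³ and 1 hour = 60 minutes.

def m3h_to_cfm(m3h):
    return m3h * 35.3147 / 60

print(round(m3h_to_cfm(186), 1))    # NF-F12 industrialPPC-3000: ~109.5 CFM
print(round(m3h_to_cfm(269.3), 1))  # NF-A14 industrialPPC-3000: ~158.5 CFM
```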


----------



## Coopiklaani

Quote:


> Originally Posted by *fisher6*
> 
> Is Kryonaut much better than what EK includes with their block? Tempted to try it on my GPU since many people since to be mentioning it in this thread. How much difference are we talking about in temps?


2-3°C difference maximum at full load. With Kryonaut and the EK-FC GPU block, my GPU-water delta is about 9°C under full load at a 225 LPH flow rate.
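A side note on flow rate: at 225 LPH the water barely warms up crossing a single block, which is why the GPU-water delta is dominated by the block and TIM rather than the pump. A rough sketch (the 300 W block heat load is an assumed figure):

```python
# Coolant temperature rise across one block: dT = P / (mdot * c_p).
# Treats the coolant as water: c_p ~ 4186 J/(kg*K), density ~1 kg/L.

def coolant_rise_c(power_w, flow_lph):
    mdot_kg_s = flow_lph / 3600.0  # L/h -> kg/s for water
    return power_w / (mdot_kg_s * 4186.0)

# An assumed ~300 W GPU at the quoted 225 LPH:
print(round(coolant_rise_c(300, 225), 2))  # ~1.15 degC rise across the block
```

Doubling the flow only halves that already-small rise, so past a certain point more pump mostly buys noise, not temperature.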


----------



## BrainSplatter

Quote:


> Originally Posted by *nrpeyton*
> 
> The highest static pressure fans I've ever seen/used have a static pressure of *7.63 mm H20*. And are capable of spinning up to 3000 RPM. (and moving 186 cubed metres of air every hour).
> (Noctua NF-F12 industrialPPC-3000 PWM).


Recently I tried out the Noctua 140mm industrial and it's pretty crazy: it moves 269.3 m³/h of air at 10.52 mm H₂O. It's pretty silent at low RPM and can idle at 0 RPM, which is nice.
http://noctua.at/en/products/product-line-industrial/nf-a14-industrialppc-3000-pwm

Something different you can do to save some power and lower temperatures is to use a framerate limiter like the one in Afterburner. Especially on SLI systems, this also often minimizes micro-stutter and gives a better gaming experience than running uncapped.


----------



## derpa

Quote:


> Originally Posted by *BrainSplatter*
> 
> Recently I tried out the Noctua 140mm industrial and it's pretty crazy. Moves 269,3 m³/h air and 10,52 mm H₂O. It's pretty silent at low RPMs and can idle at 0 RPM, which is nice.
> http://noctua.at/en/products/product-line-industrial/nf-a14-industrialppc-3000-pwm


If it's all about airflow (*252.85 CFM*) and static pressure (*35.865 mmH2O*), you could check out https://www.newegg.com/Product/Product.aspx?Item=N82E16835706015

I had two of these in my case at one point, but I'm down to one now. I have run them both at 12V and 5V, and at either voltage, have no issue keeping my CPU Kraken AiO cool. Fair warning, they're loud! LOL


----------



## Luckbad

Quote:


> Originally Posted by *BrainSplatter*
> 
> Recently I tried out the Noctua 140mm industrial and it's pretty crazy. Moves 269,3 m³/h air and 10,52 mm H₂O. It's pretty silent at low RPMs and can idle at 0 RPM, which is nice.
> http://noctua.at/en/products/product-line-industrial/nf-a14-industrialppc-3000-pwm
> 
> Something different to save some power and to lower temperature is to use a framerate limiter like the one from Afterburner. Especially for SLI systems, this also often minimizes micro stutters and gives a better gaming experience than unlimited.


I'll second Noctua here. I had some initial issues until I discovered that a noise I was hearing came from a daisy-chained fan.

I got the Noctua NF-A14 industrialPPC-2000 PWM fans for my radiator. I grabbed these because the 3000 RPM models go way beyond a reasonable noise level, and my Kraken's pump speed is tied to the fan speed.
http://noctua.at/en/nf-a14-industrialppc-2000-pwm

A simple fan curve keeps these incredibly quiet, but they have the horsepower to push some real air through the radiator if necessary. My PC runs at ~40 dB including all ambient noise around me even with these fans (it jumps to ~48 dB if I'm pushing the CPU hard).


----------



## KedarWolf

Hey peeps, been busy with real life stuff, haven't been ignoring you all.

Will get caught up soon.


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> *High static pressure fans* are a must.
> 
> _*If you lay a sheet of paper on top of the roof of your PC (above the radiator) does the sheet of paper float or blow away? It should. If it doesn't you're not getting enough air through.
> *_
> I'd say this is _even_ more important with thick radiators.
> 
> The highest static pressure fans I've ever seen/used have a static pressure of *7.63 mm H20*. And are capable of spinning up to 3000 RPM. (and moving 186 cubed metres of air every hour).
> (Noctua NF-F12 industrialPPC-3000 PWM).
> 
> For comparison, the Corsair SP 120's (also high static pressure) are 3.1 mm H2O).
> 
> Also, sometimes radiators benefit from a really good cleaning (dust removal).


Yeah, after switching to the Noctua NF-F12s... I'll never go back. They put all the fans I've tried before to shame.


----------



## Bishop07764

Quote:


> Originally Posted by *fisher6*
> 
> Is Kryonaut much better than what EK includes with their block? Tempted to try it on my GPU since many people since to be mentioning it in this thread. How much difference are we talking about in temps?


It's the best paste that I've ever used, but you're probably talking about a 2-3°C drop, as others have said, provided the factory TIM job was good. Otherwise it might be a much bigger temp difference. I believe Slack saw a pretty big one.


----------



## Benny89

Got another STRIX today. It is even worse than my current one. It won't even do a stable 2012 MHz with a simple OC in Heaven.

...... I am sick of the silicon lottery









----------



## Clukos

Quote:


> Originally Posted by *Benny89*
> 
> Got another STRIX today. It is even worse than my current one. Won't even do stable 2012 with simple OC in Heaven.
> 
> ...... I am sick of sillicon lottery
> 
> 
> 
> 
> 
> 
> 
> .


If it wasn't for this site you wouldn't even care about it. Mine can hit 2100 with enough voltage and be stable in anything from benchmarks to games, but because of power limits I've decided to keep it quiet and cool at 0.925 V and a 1936 MHz core clock.


----------



## DStealth

Quote:


> Originally Posted by *Benny89*
> 
> Got another STRIX today. It is even worse than my current one. Won't even do stable 2012 with simple OC in Heaven.
> 
> ...... I am sick of sillicon lottery
> 
> 
> 
> 
> 
> 
> 
> .


Choose another vendor instead... I have a Strix 1080 Ti card, and despite the temps and cooler it doesn't shine at all.


----------



## Benny89

Quote:


> Originally Posted by *Clukos*
> 
> If it wasn't for this site you wouldn't even care about it. Mine can hit 2100 with enough voltage and be stable in anything from benchmarks to games but because of power limits I've decided to keep it quiet and cool at 0.925mv and 1936 core clock


If it wasn't for this site I would probably still be sitting on my 980 Ti, since honestly I don't "need" to upgrade.

But yeah, I am an upgrade freak, and it's starting to become a problem


----------



## Benny89

Quote:


> Originally Posted by *DStealth*
> 
> Choose other vendor instead...a have a Strix 1080Ti card and despite temps and cooler it doesn't shine at all


I am thinking about the FTW3 from EVGA, but no reviews are out yet. I can pre-order it now, but I want to see at least one review first...

Does anyone know the FTW3's TDP? Is it 330W too?


----------



## Slackaveli

Quote:


> Originally Posted by *Bishop07764*
> 
> It's the best paste that I've ever used. But you are probably talking 2-3 c drop as others have said provided that the factory Tim job was good. Otherwise it might be a much bigger temp difference. I believe Slack saw a pretty big difference.


4 GPUs so far. 5°C, 5°C, 6°C, and 8°C are the gains.


----------



## IMI4tth3w

How long has the 1080 Ti been out now? Why is it impossible to get any of the higher factory-clocked custom-PCB board partner cards? The FTW3 has been on pre-order for what, 2 months now? What is going on?

Trying to get an upgrade for my wife's PC and not sure what to do. Her 780 Ti is starting to take a dump and she needs an upgrade pretty soon. She wants a 1070, but 50% more performance for 40% more cost for a 1080 Ti makes a whole lot more sense... sigh...


----------



## Benny89

Quote:


> Originally Posted by *IMI4tth3w*
> 
> how long has the 1080ti been out now? why is it impossible to get any of the higher factory clocking custom pcb board partner cards? the ftw3 has been on pre-order for what 2 months now? what is going on?
> 
> trying to get an upgrade for my wife's pc and not sure what to do. her 780ti is starting to take a dump and she needs an upgrade pretty soon. she wants a 1070 but 50% more performance for 40% more cost for a 1080ti makes a whole lot more sense.. sigh..


Well, EVGA is really slow this round. I only got the option to pre-order the FTW3 today, and sadly only the silver one.

There is also an FTW3 Elite with a different colored fan shroud, but I don't think it will be available in normal stores.

I am also thinking about upgrading my wife's PC, but she first needs to buy herself a new monitor, because a 1080 Ti does not make sense at 1080p...


----------



## Alex132

Quote:


> Originally Posted by *Benny89*
> 
> Quote:
> 
> 
> 
> Originally Posted by *IMI4tth3w*
> 
> how long has the 1080ti been out now? why is it impossible to get any of the higher factory clocking custom pcb board partner cards? the ftw3 has been on pre-order for what 2 months now? what is going on?
> 
> trying to get an upgrade for my wife's pc and not sure what to do. her 780ti is starting to take a dump and she needs an upgrade pretty soon. she wants a 1070 but 50% more performance for 40% more cost for a 1080ti makes a whole lot more sense.. sigh..
> 
> 
> 
> Well, EVGA is really slow in this round. I only got option to pre-order FTW3 today. And sadly only silver one.
> 
> There is also FTW3 Elite one with different colored fan shroud but I don't think it will be available in normal stores.
> 
> I am also thinking about upgrading my wifes PC, but she needs to first buy herself new monitor, because 1080Ti does not make sense in 1080p...
Click to expand...

Not a fan of the FTW3 design or look, but the changes in the PCB, cooling, etc. are too attractive to turn down. I signed myself up for one... just playing the waiting game now


----------



## nrpeyton

Quote:


> Originally Posted by *derpa*
> 
> If it's all about airflow (*252.85 CFM*) and static pressure (*35.865 mmH2O*), you could check out https://www.newegg.com/Product/Product.aspx?Item=N82E16835706015
> 
> I had two of these in my case at one point, but I'm down to one now. I have run them both at 12V and 5V, and at either voltage, have no issue keeping my CPU Kraken AiO cool. Fair warning, they're loud! LOL


lol that is crazy (the link)

40 dollars

consumes 60w

nearly 6000 RPM lol

Quote:


> Originally Posted by *Benny89*
> 
> I am thinking about FTW3 from EVGA, but no reviews are out yet. I can pre-order it now but I wish to see at least one review...
> 
> Anyone knows FTW3 TDP? Is it 330W too?


There are great video reviews of the FTW3 1080 Ti by Gamers Nexus and also Buildzoid on YouTube.

You're also allowed an extra 36 watts compared to the FE (so a 336 W TDP).

Quote:


> Originally Posted by *Benny89*
> 
> Got another STRIX today. It is even worse than my current one. Won't even do stable 2012 with simple OC in Heaven.
> 
> ...... I am sick of sillicon lottery
> 
> 
> 
> 
> 
> 
> 
> .


How many have you tried?


----------



## ilikelobstastoo

I have Noctua NF-A14 2000 RPM fans running on two 280 rads here: an FE with an EK block and a 4790K at 4.7 GHz, and GPU temps stay in the high 30s while benching (36 to 38). Absolutely love these fans! Nice and quiet when I want, and they perform well when turned up.


----------



## lilchronic

Quote:


> Originally Posted by *Slackaveli*
> 
> 4 gpus so far. 5c, 5c, 6c, and 8c are the gains.


What a shame wasting 4 GPU's. SMH


----------



## Slackaveli

Quote:


> Originally Posted by *lilchronic*
> 
> What a shame wasting 4 GPU's. SMH


A waste? Like I'm gonna miss the lost heat...

What in the world are you on about?


----------



## ilikelobstastoo

I'm more of a live-with-what-I-get type of person, but I can see the urge to keep returning cards like these guys do.


----------



## ilikelobstastoo

I always read this thread, just never post, lol.
Slack, are you planning on slapping a waterblock on that beast if/when one's available?

I've been tempted to try out the 5775C you've been on about


----------



## Benny89

Quote:


> Originally Posted by *ilikelobstastoo*
> 
> I'm more of a live with what I get type of person bit I can see the urge to keep returning cards like these guys do.


Honestly, don't be like us

I will return my 4th 1080 Ti (well... the Aorus was returned since it was faulty and poop), and that is really not something I am proud of. But I can't accept the fact that I pay the same amount of money for the same hardware and can't get a stable 2000 MHz while someone else is rocking 2075 or 2100 MHz...

I know it's not a good attitude, but when I pay the same, I want the same. Naive thinking given the silicon lottery bs, but well... what can I say...


----------



## ilikelobstastoo

Don't worry, I won't. I would feel too guilty if I did. Damn morals! Now, if I had a real problem with the GPU, I wouldn't hesitate.


----------



## Slackaveli

Quote:


> Originally Posted by *ilikelobstastoo*
> 
> Dont worry i wont I would feel too guilty if I did . Damn morals ! Now if I had a real problem with the gpu I wouldnt hesitate .


Quote:


> Originally Posted by *ilikelobstastoo*
> 
> I always read this thread just never post lol .
> Slack ypu planning on slapping a waterblock on that beast if when avail?
> 
> Have been tempted to try out 5775c you've been on about


I think I'm gonna stay on air, since I'm topping out around 59°C. If I had the money for a full watercooling setup I definitely would, though.


----------



## Slackaveli

Quote:


> Originally Posted by *ilikelobstastoo*
> 
> Dont worry i wont I would feel too guilty if I did . Damn morals ! Now if I had a real problem with the gpu I wouldnt hesitate .


I think y'all are assuming that I repasted four 1080 Tis? I had one that I returned within my "satisfaction guarantee" window, which the next guy will love when he gets it, because it actually runs cool for an FE. The other two I re-pasted were a 980 Ti Hybrid and my son's 1060. Not sure what @lilchronic is on about








Quote:


> Originally Posted by *lilchronic*
> 
> What a shame wasting 4 GPU's. SMH


----------



## chibi

For those looking for an Aquacomputer 1080 Ti water block, there are 3x nickel blocks available at Aquatuning.us right now. I would recommend the Active XCS Backplate and some thicker 1.0-1.5mm thermal pads if you go this route.

Link


----------



## MURDoctrine

Quote:


> Originally Posted by *Slackaveli*
> 
> i think im gonna stay on air as im topping out around 59C. If I had the money for a full watercooling set-up I definitely would, though.


Yeah, it adds up fast. I just made the jump from soft tubing to hard tubing, and I think I've dropped $300 at PPCs this past week.


----------



## TheNoseKnows

Quote:


> Originally Posted by *nrpeyton*
> 
> There's a great video review been done by Gamers Nexus and also Buildzoid @ youtube on FTW3 1080TI






That wasn't a review. It was a tear-down of the cooler and heatsink, and a look at the PCB. There were no benchmarks involved; in fact, the card wasn't even installed into a motherboard.


----------



## fisher6

I need your opinion on my loop's temps and the GPU. I flashed back to the stock BIOS, but my temps still feel too high lately: I'm seeing 53-55°C on my GPU, and the CPU is at 55-60 while playing ME:A. I'm running a 30mm-thick 360 rad and a 45mm-thick 240 rad, the fans are Noctua NF-F12s running at 1100 RPM, and the D5 pump is at max. A few weeks ago my temps while gaming in Ghost Recon Wildlands were around 48°C max. I'm not sure what is wrong here.


----------



## Benny89

Has anyone tried to mount an NZXT Kraken on a 1080 Ti FE? I think this might be my best route right now. I can get an FE for 600 USD here and slap an AIO cooler on it with the NZXT bracket.

Thoughts?

EDIT: It seems NZXT came out with a new Kraken G12.


----------



## Slackaveli

Quote:


> Originally Posted by *lilchronic*
> 
> What a shame wasting 4 GPU's. SMH


Quote:


> Originally Posted by *MURDoctrine*
> 
> Yeah it adds up fast. I just made a jump from soft tubing to hard tubing and I think I've dropped $300 at PPC's this past week.


I mean, I'd love to, but click on my profile and you'll see why, lol. 5 kids....


----------



## lilchronic

Quote:


> Originally Posted by *Slackaveli*
> 
> a waste? like Im gonna miss lost heat...
> 
> what in the world are you onabout?


you go through 4 cards just to find one that oc's better?

ok i see sorry


----------



## Benny89

Quote:


> Originally Posted by *lilchronic*
> 
> you go through 4 cards just to find one that oc's better?
> 
> ok i see


That's me, lol

He was getting improvements from repasting across 4 different cards from different generations, not four 1080 Tis


----------



## Benny89

Guys, just need some info:

Asus, MSI - 330W
Palit - 350W
Aorus - 375W
FE - 300W

*What about the MSI AERO, EVGA SC and FTW3?*

Also, does anyone know if Palit has any sort of warranty-void stickers or seals for modifications, if you want to mod your card with a Kraken etc.?


----------



## nrpeyton

Quote:


> Originally Posted by *TheNoseKnows*
> 
> 
> 
> 
> 
> That wasn't a review. It was a tear-down of the cooler and heatsink, and a look at the PCB. There were no benchmarks involved; in fact, the card wasn't even installed into a motherboard.


An:


in-depth look at the *PCB* _(the card itself)_,
*the power delivery system* _(and O/C'ing)_, and
*the cooling system* _(heatsink + fan & thermal pad arrangement)_

is my *perfect* kind of review!

In terms of performance, it is going to be determined by the silicon lottery.
If a "review sample" is a decent clocker, in general it will perform about 5% faster than an FE of the same 'lottery'.
(However, official reviews often don't take into account FEs which have had the "shunt mod" and water, which would _eliminate_ that 5% gap.)

Other things to consider would be EVGA's reputation as a company and their customer service.

If it's 50 benchmarks _(bar graphs etc.)_ showing comparative performance against other 1080 Tis (or older cards) in 50 different games, then you may have to wait for more reviews to be done.

Another great benefit of the FTW3 is the 9 thermal sensors (which I don't believe any other partner is offering -- I know that doesn't help performance), but they 'could' be a good indicator of GPU health.


----------



## Slackaveli

Quote:


> Originally Posted by *lilchronic*
> 
> you go through 4 cards just to find one that oc's better?
> 
> ok i see sorry


no worries man
Quote:


> Originally Posted by *Benny89*
> 
> Thats me, lol
> 
> 
> 
> 
> 
> 
> 
> . He was getting improvement with repasting across 4 different cards from different generations. Not 4 1080Tis


this


----------



## ilikelobstastoo

Has anyone tinkered around with different thermal pads on 1080 Tis with EK blocks? If I go with Fujipoly pads, would I want 1.0 and 1.5mm thicknesses (with the 1mm replacing the 0.5 and the 1.5 replacing the 1)? These pads are expensive and I don't want to get it wrong!! I'm also currently using Gelid Extreme paste and considering trying out Kryonaut while I'm at it. Just wishing I had installed a drain in my loop


----------



## nrpeyton

Quote:


> Originally Posted by *ilikelobstastoo*
> 
> has anyone tinkered around with different thermal pads on 1080tis with an ek blocks? So if I go with fujiopoly pads would i want a 1.0 and 1.5mm thickness? (with the 1mm replacing the .5 and 1.5 replacing 1) These pads are expensive and i don't want to get it wrong!! also am currently using gelid extreme paste and considering trying out kryonaut while im at it. just wishing i had installed a drain in my loop:sadsmiley


I'd say go with the manual.

(Unless you have a good way of testing, i.e. temperature probes you could stick to the back of the PCB behind the relevant components.)
If something doesn't look right, then going to the next pad size up can sometimes help absorb any natural manufacturing flaws that cause imperfect contact.

I wrote a guide about this for the 1080 (non-Ti) that goes _against_ the EK manual, but that was for installing a block on a card it wasn't designed for (nor was it even an officially supported combo). Anyway, in that instance I did indeed get better temps using 1.0mm instead of 0.5mm and 1.5mm instead of 1.0mm.


----------



## Bishop07764

Quote:


> Originally Posted by *Slackaveli*
> 
> 4 gpus so far. 5c, 5c, 6c, and 8c are the gains.


That's awesome. It's hard for me to quantify my gains exactly, because I've always repasted with Kryonaut while going from an air cooler to a waterblock, so "technically" my gains were along the lines of 30-50°C.

I only got on the Kryonaut train a while back. My wife's 1060 in our HTPC might hit the mid 30s at most on a super hot day, on a single 280 mm rad with a 4.1 GHz CPU in the loop.

Quote:


> Originally Posted by *Benny89*
> 
> Honestly don't be like us
> 
> 
> 
> 
> 
> 
> 
> I will retun my 4th 1080 Ti (well.. Aorus was returned since it was faulty and poop) and that is really not something I am proud of. But I can't accept fact that I pay the same amount of money buying same hardware and I can't get stable 2000mhz while someone is rocking 2075 or 2100 mhz...
> 
> I know it is not good attitude but when I pay the same, I want the same. Naive thinking with Silicon Lottery bs but well... what can I say...


What happened? I thought you had a Strix that did like 2063 all day.

Quote:


> Originally Posted by *fisher6*
> 
> I need your opinion on my loops temps and the GPU. Flashed back stock bios but my temps are still too high lately I feel, seeing 53-55C on my GPU, CPU is at 55-60 while playing ME:A. Running a 360 30mm and 45mm 240 rads, fans are Noctua NFF-12 running at 1100 rpm, D5 pump is at max. A few weeks ago my temps while gaming Ghost recon wildlands was around 48C max. I'm not sure what is wrong here.


Yeah, that sounds rather toasty for a full waterblock. Are you sure you took the protective film off the block before mounting? Are your ambient temps way up? Fans definitely working? Hope you can get it sorted out.


----------



## Benny89

Quote:


> Originally Posted by *Bishop07764*
> 
> What happened? I thought you had a Strix that did like 2063 all day.


I thought so, but then it turned out that in more GPU-demanding games like W3 or ME:A I can't get more than 2012 without crashing, so like 20 MHz more than stock boost... I decided to try another one, and I can't even get that 2012 stable without crashing.

I think I will try an FE/EVGA SC/Palit/FTW3 with the Kraken mod plus a Corsair H90 and Thermal Grizzly Kryonaut next.

I just hate having a stock card. I am an overclocker and I need to overclock my stuff. A mere 20 MHz OC is not an OC









Does anyone know the power limit on the EVGA SC, SC2 or FTW3?


----------



## feznz

Quote:


> Originally Posted by *Benny89*
> 
> Honestly don't be like us
> 
> 
> 
> 
> 
> 
> 
> I will retun my 4th 1080 Ti (well.. Aorus was returned since it was faulty and poop) and that is really not something I am proud of. But I can't accept fact that I pay the same amount of money buying same hardware and I can't get stable 2000mhz while someone is rocking 2075 or 2100 mhz...
> 
> I know it is not good attitude but when I pay the same, I want the same. Naive thinking with Silicon Lottery bs but well... what can I say...


I have a theory when it comes to getting golden cards: wait at least 6 months after release.
As the chip-making process is refined, there is room for them to improve; I believe it takes 3-4 months from start to finish to make a silicon chip, so any fixes take several months to surface at retailers, and then you have to wait for old stock to move.
Just a theory.

I would have been happier with the Strix in your sig.

And on comparing against others' achievements: I'll tell you the truth, I only post my best results. It might have taken a frosty morning, and there are plenty of other tricks to boost your score, especially drivers (I had a few old favourites), or a fresh spread of TIM every few weeks/months, which is probably the best trick I know of.

Or people simply lie about stability. For me, stability means a 6-hour non-stop game session; for others it could be anything.
I notice that stability issues normally show after 1-2 hours of non-stop gaming.


----------



## RavageTheEarth

So I picked up a Kill-A-Watt since they are so cheap, and I was interested in how many watts my computer draws. With a 6700K @ 1.38 V & 4.7 GHz, G.Skill Trident Z 3400 MHz RAM @ 1.65 V, and my Aorus @ 1.00 V & 2050, my max power draw during Superposition 4K was 467 W. I was expecting it to be higher than that, considering I'm also running two D5 pumps, 14 radiator fans, and one case fan for exhaust.
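One caveat on Kill-A-Watt numbers: they read AC draw at the wall, so the components themselves are pulling less, scaled by PSU efficiency. A rough sketch (the 90% figure is an assumption; your PSU's actual efficiency depends on the model and load point):

```python
# Estimate DC-side (component) draw from an AC wall reading.

def dc_draw_w(wall_w, psu_efficiency=0.90):
    """Power actually delivered to the components, given an assumed efficiency."""
    return wall_w * psu_efficiency

print(round(dc_draw_w(467)))  # ~420 W on the DC side of an assumed 90%-efficient PSU
```

So the 467 W wall reading leaves plenty of headroom on any reasonably sized PSU.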


----------



## Benny89

Quote:


> Originally Posted by *feznz*
> 
> I can say stability for me is a 6 hour non stop game session for others it could be anything
> I notice that stability issues normally show after 1-2 hours of non-stop gaming


Yup, I could keep 2063 in BF1 and Dishonored 2 for like 4-5 hours. Then playing W3 or ME:A they would crash in 2 hours at best. I once even held 2050 for 5 hours in W3, but it crashed anyway after that.

Sometimes people think they have stable clocks before checking all the games they have, or playing longer than 3 hours.

Well, not too much waste. I love playing with hardware, so I am looking forward to the Kraken G10 mod.


----------



## Bishop07764

Quote:


> Originally Posted by *Benny89*
> 
> I thought so, but then it turned out that in more GPU-demanding games like W3 or ME:A I can't get more than 2012 without crashing, so like 20mhz more than stock boost... I decided to try another one and can't even get that 2012 stable without crashing.
> 
> I think I will try FE/EVGA SC/Palit/FTW3 with Kraken mod plus Corsair H90 and Grizzly Kryonaut next.
> 
> I just hate having a stock card. I am an overclocker and I need to overclock my stuff. A mere 20mhz OC is not an OC.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone knows what is Power Limit on EVGA SC, SC2 or FTW3?


Oh, I understand. I was hoping for a 2100 card myself, but it was hard enough to get my hands on the one that I have. When I get around to playing all the Witcher 3 expansions, I will find out if my 2050 clock is truly stable. I already changed my 24/7 clock back from 2063 because I finally got a crash in The Division; my average fps went down about 0.2 fps. It seems I'm at the point of diminishing returns anyway.


----------



## mshagg

I just find it hard to believe that retailers accept poor overclocking as a reason to return a card. You'd be laughed out of the store trying that here. Little wonder these things cost $1100 if people are binning them like that.
Quote:


> Originally Posted by *fisher6*
> 
> I need your opinion on my loop's temps and the GPU. Flashed back to the stock BIOS, but my temps still seem too high lately: 53-55C on my GPU, CPU at 55-60 while playing ME:A. Running a 30mm 360 rad and a 45mm 240 rad, fans are Noctua NF-F12s at 1100 rpm, D5 pump at max. A few weeks ago my temps while playing Ghost Recon Wildlands were around 48C max. I'm not sure what is wrong here.


Really need to know coolant temperatures to get any valid insights. I'd try running the D5 at a lower speed; I have CPU + GPU + a 50mm-thick 480 rad + a 420 rad, running the pump at about half speed.


----------



## KingEngineRevUp

I think a person needs some serious help, and I don't mean this in a joking way, if they need to continuously return and purchase multiple cards so they can get one that performs an extra 1-2%.

This has to be some type of OCD thing, an obsession with numbers. I don't even think car hobbyists purchase multiple mufflers and try them all to keep the one that gives them 0.25 extra HP.

Guys, if you're over 2000MHz, just be happy with it.


----------



## fisher6

Quote:


> Originally Posted by *mshagg*
> 
> I just find it hard to believe that retailers accept poor overclocking as a reason to return a card. You'd be laughed out of the store trying that here. Little wonder these things cost $1100 if people are binning them like that.


Most stores in the EU are required by law to accept a return within 14 days (some up to 45 days here in Norway). You don't need to say it's a bad overclocker; saying the product is not for you, or that you changed your mind, is completely fine. I've done it plenty of times.


----------



## Addsome

Quote:


> Originally Posted by *SlimJ87D*
> 
> I think a person needs some serious help, and I don't mean this in a joking way, if they need to continuously return and purchase multiple cards so they can get one that performs an extra 1-2%.
> 
> This has to be some type of OCD thing, an obsession with numbers. I don't even think car hobbyists purchase multiple mufflers and try them all to keep the one that gives them 0.25 extra HP.
> 
> Guys, if you're over 2000MHz, just be happy with it.


+1


----------



## Luckbad

EVGA GeForce GTX 1080 Ti SC2 HYBRID GAMING

In Stock at NewEgg for probably only a couple more minutes:
https://www.newegg.com/Product/Product.aspx?Item=N82E16814487346

Details: http://www.evga.com/articles/01107/evga-geforce-gtx-1080-ti-sc2-hybrid/


----------



## mshagg

Quote:


> Originally Posted by *fisher6*
> 
> Most stores in the EU are required by law to accept a return within 14 days (some up to 45 days here in Norway). You don't need to say it's a bad overclocker, saying this product is not for me or you changed your mind is completely fine. I've done it plenty of times.


Presumably there are also laws in the EU that say an opened and tested card can no longer be sold as a new product.


----------



## Benny89

Quote:


> Originally Posted by *mshagg*
> 
> Presumably there are also laws in the EU that say an opened and tested card can no longer be sold as a new product.


There is no such law in the EU, only when you "used a product in a way other than advertised". You can use a GPU for 14 days and return it, no problem. You can return any hardware within 14 days whether you used it or not; only if you damaged it by "using it in a way other than advertised" can you not return it freely.

The only other exceptions are products that are damaged or spent after opening/use, like chemicals, drugs, medicines, etc.

We know EU law very well, and we use the 14-day return a lot here.

EDIT: Also, returned cards are sold again at a lower price. So each time, someone else gets a fully working 1080 Ti at a lower price.


----------



## nrpeyton

I could understand if someone wasn't getting 2000 (even Nvidia claim they O/C that far), so it matters to someone very interested in _overclocking_, which is what this site is about.

I've never done it myself. And I received my card through Step-Up, so it's not even an option.

But still, no need to be _so_ hard on him.

Think about how what you say may affect how others feel.

Anyway, it's easy to say when you're hitting 2150.

I imagine he heard of other people returning stuff for similar reasons.
Imagine it was badly timed: the day he opened his first card, he was probably on THIS SITE listening to everyone bragging about their 2100+ ventures.
So he says, "what the heck, I'll return it once" (all legal and above board; *he wasn't happy*, and that's the point).
Gets a card that is even worse?
How's he going to feel then?
So he does it again.
Repeat of the above.
Then, after all the hassle he's already been through, is he going to give up now, when his luck must surely be about to change?
Probably not.
You can't have one opinion about someone who returned something once and a different opinion about someone who did it 2-3 times.

What if he'd sold the cards and bought brand new ones every time? I hear a lot of people do that, but I've never seen anyone complain.

I'm not saying he's right and anyone complaining about his actions is wrong.

And I understand that if everyone did this, we'd all be getting gouged on price. *But then we probably wouldn't*, because if that happened, companies would just bin everything and we'd pay per bin.

Funny, because I'd hazard a guess the highest-binned cards would be the ones always out of stock. Regardless, it's only 2%.

A lot of people _(or most)_ don't care how far a card overclocks.

So no harm in having the best ones in the hands of the people who _do_ care.

We are on an O/C'ing site, after all.

Also, what if he competes at hwbot, or on scoring?

Each to their own, I say. lol


----------



## mshagg

Quote:


> Originally Posted by *Benny89*
> 
> There is no such law in the EU, only when you "used a product in a way other than advertised". You can use a GPU for 14 days and return it, no problem. You can return any hardware within 14 days whether you used it or not; only if you damaged it by "using it in a way other than advertised" can you not return it freely.
> 
> The only other exceptions are products that are damaged or spent after opening/use, like chemicals, drugs, medicines, etc.
> 
> We know EU law very well
> 
> 
> 
> 
> 
> 
> 
> and we use 14 days return a lot here.


Sorry, I don't think I made my point very well. You're returning all of these cards, which the retailer is forced to accept by law. However, they can't turn around and sell them as new cards, as you seem to confirm here:

Quote:


> EDIT: Also, returned cards are sold again at a lower price. So each time, someone else gets a fully working 1080 Ti at a lower price.


They are incurring an economic loss due to this practice.

So the retail price for the new cards, in addition to covering the cost of development, logistics and mark up, must now also factor in the cost of having to discount cards that people are returning for, frankly, no good reason.

I don't have a problem with it, you are entitled to do this under law. My point is that it contributes to the outrageous prices we pay for this hardware.

But we are not here to discuss consumer law. We should get back to discussing the cards themselves.


----------



## nrpeyton

I hear what you're saying.

But the reason we pay so much is that Nvidia has such an outrageous R&D budget to support.

I watched Nvidia at CES 2017, and most of that money doesn't go into GPU development.

Companies will always charge what they can get away with, regardless.

I doubt a handful of "number crazy" overclockers returning cards _really_ affects the current price of a 1080 Ti.


----------



## Nico67

Quote:


> Originally Posted by *mshagg*
> 
> Sorry, I don't think I made my point very well. You're returning all of these cards, which the retailer is forced to accept by law. However, they can't turn around and sell them as new cards, as you seem to confirm here:
> They are incurring an economic loss due to this practice.
> 
> So the retail price for the new cards, in addition to covering the cost of development, logistics and mark up, must now also factor in the cost of having to discount cards that people are returning for, frankly, no good reason.
> 
> I don't have a problem with it, you are entitled to do this under law. My point is that it contributes to the outrageous prices we pay for this hardware.
> 
> But, we are not here to discuss consumer law. We should get back to discussing the cards themselves


Yeah, surely if there is no restocking fee and they have to sell those cards as "seconds", then they're going to have to add an extra cost to everything to compensate for average losses, so inherently you're adding to your own markup. Surprised retailers don't ban customers for excessive returns.


----------



## Nico67

Quote:


> Originally Posted by *RavageTheEarth*
> 
> So I picked up a Kill-A-Watt since they are so cheap and I was curious how many watts my computer draws. With a 6700K @ 1.38v & 4.7GHz, G.Skill Trident Z 3400MHz RAM @ 1.65v, and my Aorus @ 1.00v & 2050, my max power draw during Superposition 4K was 467w. I thought it would be higher than that, considering I'm also running two D5 pumps, 14 radiator fans, and one case fan for exhaust.


Looks about right. I was seeing 430s max on my UPS power-out, which feeds the PC and monitor. My PC is a very similar spec, but my pumps are not fed from that and I only have two case fans at low rpm.


----------



## mshagg

Quote:


> Originally Posted by *nrpeyton*
> 
> I hear what you're saying.
> 
> But the reason we pay so much is because nvidia has such an outrageous R&D budget to support.
> 
> I watched Nvidia at CES 2017; and most of that money doesn't go into GPU development.
> 
> Companies will always charge what they can get away with; regardless.
> 
> I doubt a handful of "number crazy" over-clockers returning cards _really_ affects the current price of a 1080 TI.


The retailers would only be concerned with a margin. The wholesale price is no doubt inflated due to a lack of competition at this performance point.

Yes, no doubt we're contributing to their self driving cars or whatever crazy hijinks they're up to at the moment! So long as I can push 90FPS in a VR game I'm a happy customer.


----------



## nrpeyton

It's the "Distance Selling Regulations". They only apply when you buy online or over the telephone; you have the right to a 14-day "cooling-off period".

If he bought it in-store, he wouldn't have that right, unless the product was "not as described" or faulty.

Some companies probably do ban people.

I'd be more worried about a reputable company struggling to hit targets one month claiming a "working" returned card was damaged on receipt.

Then someone who really did have a broken card (through no fault of their own) loses out. I think that's a bigger potential problem than the price of a 1080 Ti.

It's certainly a risky game to play. (I already encouraged _caution_ the other day.)


----------



## Benny89

Quote:


> Originally Posted by *Luckbad*
> 
> EVGA GeForce GTX 1080 Ti SC2 HYBRID GAMING
> 
> In Stock at NewEgg for probably only a couple more minutes:
> https://www.newegg.com/Product/Product.aspx?Item=N82E16814487346
> 
> Details: http://www.evga.com/articles/01107/evga-geforce-gtx-1080-ti-sc2-hybrid/


Finally! That means maybe we will soon see an FTW Hybrid with 2x 8-pin and a higher TDP.

For now I want to try a Kraken G10 plus a Corsair H90. Which card would you recommend? I thought about the Palit since it has a 350W limit, the Aorus non-Xtreme for its 375W TDP, or the FE, since with sub-50 temps I should be able to squeeze it a little more.

Anyone have experience with Palit? Do they void the warranty if you remove the cooler and slap a block/AIO on it?


----------



## Slackaveli

Quote:


> Originally Posted by *mshagg*
> 
> I just find it hard to believe that retailers accept poor overclocking as a reason to return a card. You'd be laughed out of the store trying that here. Little wonder these things cost $1100 if people are binning them like that.
> Really need to know coolant temperatures to get any valid insights. I'd try running the D5 at a lower speed; I have CPU + GPU + a 50mm-thick 480 rad + a 420 rad, running the pump at about half speed.


I agree that the "whatever reason" return policy drives up the price; it's the equivalent of theft driving up the price, i.e. uncontrollable costs passed on to the customer.

But if that's their law, it's already figured in, so you might as well take advantage of the situation, be it by binning or by buying those returns at a discount.

Meanwhile, in The Land of the Free, we get this:

Return Policies
Return for refund within: Non-refundable
Return for replacement within: 30 days
This item is covered by Newegg.com's Replacement Only Return Policy.

Sounds like ole Ralph Nader and Bernie Sanders and them were right...


----------



## the9quad

Quote:


> Originally Posted by *Slackaveli*
> 
> I agree that the "whatever reason" return policy drives up the price; it's the equivalent of theft driving up the price, i.e. uncontrollable costs passed on to the customer.
> 
> But if that's their law, it's already figured in, so you might as well take advantage of the situation, be it by binning or by buying those returns at a discount.
> 
> Meanwhile, in The Land of the Free, we get this:
> 
> Return Policies
> Return for refund within: Non-refundable
> Return for replacement within: 30 days
> This item is covered by Newegg.com's Replacement Only Return Policy.
> 
> Sounds like ole Ralph Nader and Bernie Sanders and them were right...


The real dirtbags are the ones who shunt mod the cards and then return them. Seen a few advocating that crap: "well, if it goes wrong, just clean it up and send it back, hardee har"... scum of the earth. Seriously, I can't believe how terrible some people are around here.


----------



## nrpeyton

I've never actually read a post on here where someone performing a shunt mod actually broke their card.

On the 1080 owners thread, over about 5 months there was one isolated incident (in the whole timeframe I spent there) where the solder points were eaten away by the LM, causing a resistor to fall off. But it was nothing that was going to make his card blow up, or be unreasonably expensive or difficult for him to fix.

In fact, I think he soldered it back on himself. The card is probably still going strong, with Liquid Tape protecting the base of the resistor this time ;-)


----------



## Slackaveli

Quote:


> Originally Posted by *the9quad*
> 
> The real dirtbags are the ones who shunt mod the cards and then return them. Seen a few advocating that crap: "well, if it goes wrong, just clean it up and send it back, hardee har"... scum of the earth. Seriously, I can't believe how terrible some people are around here.


I totally feel you, but I actually think there is some truth to what Slim was saying. Some of us are straight up on the spectrum; I know I am. But knowing is half the battle, and I check myself. Usually there is a period of madness before I accept that I have reached the absolute highest clocks, etc.

Then something weird happens and I start geeking out on low power consumption, topping out perf at fewer watts/volts, and undoing all my overclocking. I'm a nut, lmao.

However, I do agree that we have to keep it real and not "blow up the hustle", b/c we used to be able to return GPUs to Newegg, and people have ruined that FOREVER. Once they stop, they don't start back. One day Amazon will do the same, and then what?

I side with consumers over corporations all day, every day. But there does come a point where you have to let the company be profitable, ya know? Hardmodding a card and then returning it to a STORE is douchey as hell. Now, sending it in for a warranty replacement, where they send you a refurbished one anyway, that I can get behind. But don't be shunt modding and whatnot and then walk that dude back into Best Buy.

Speaking of Best Buy, I have been known to "borrow" a GPU from them for up to 10 days before, lol. Waiting on that cross-shipment from EVGA, I grabbed a 1070 for the week. But I didn't hurt it, and Best Buy put that dude RIGHT back on the shelf like it was brand new...


----------



## nrpeyton

Consumers will always be consumers. Some things never change, like human nature: the rich, the poor, the hard worker, the lazy, the privileged. We're all different, we all see it from different perspectives, and we would all do things another wouldn't do.

Nvidia is the true bandit.

Look at the recent Ryzen launch. How come AMD can release the same performance at *half* the price? lol

And everyone still makes a profit.

That's right; I think we all know Vega isn't going to nail it. But imagine if it did: $350 for a 1080 Ti.


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> I've never actually read a post on here where someone performing a shut mod did actually break their card.
> 
> On the 1080 owners thread, over about 5 months there was one isolated incident (in that whole timeframe i spent there) where the solder points were eaten away by the LM causing the resistor to fall off. But it was nothing that was going to make his card blow up, become unreasonably expensive or difficult for him to fix.
> 
> In fact I think he soldered it back on himself. Card is probably still going strong with Liquid Tape protecting the base of the resistor this time ;-)


I think that's true. But there have definitely been some who have shunted, put LM on the die, slapped blocks on, OC'd it until it crashed repeatedly going for 2100MHz, then ended up "returning it to its previous condition" and returning it. I seriously doubt they tell the store what the card's been through.

It's a fine line. We can't be getting screwed by multi-billion dollar corporations, getting beat out of a bi-weekly paycheck's (for most) worth of money when a shady company refuses a return on a faulty product. But if we take too much advantage of the system, binning half a dozen cards per actual purchase, the tide will turn and we'll be the ones getting screwed. All it takes is money in politicians' pockets and consumer rights erode away...


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> Consumers will always be consumers. Some things never change. Like human nature. The rich, the poor. The hard-worker, the lazy. The privileged. We all different, all see it from different perspective and would all do things the other wouldn't do.
> 
> Nvidia is the true bandit.
> 
> Look at recent Ryzen launch. How come AMD can release the same performance at *half* the price? lol
> 
> That's right; I think we all know VEGA isn't going to nail it. But imagine if it did. 350 for a 1080 TI.


Jah, but even AMD bilked their customers by selling a $500 CPU (1800X) with the exact performance of the $320 CPU (1700). Companies be shady.
But I do cheer AMD's success (while not expecting it), because monopolies SUCK.

In all seriousness, if AMD delivers on Vega, like actually competes with the Ti for <$500, especially if it's like $350-400, it would do wonders for the industry. We need that competition. And it would make Volta come proper, too.

They really are bandits, though, Nvidia. Because we are giving them SO MUCH MONEY, and instead of re-investing it into graphics R&D, they are using it to get in good with Big Brother, and self-driving cars, and AI supercomputing, none of which will advance gaming tech at all. I don't like that crap.


----------



## nrpeyton

Quote:


> Originally Posted by *Slackaveli*
> 
> Jah, but even AMD bilked their customers by selling a $500 CPU (1800X) with the exact performance of the $320 CPU (1700). Companies be shady.
> But I do cheer AMD's success (while not expecting it), because monopolies SUCK.


Lol, then I amend my statement: for nearly a third of the price (as you're right, you can overclock the 1700 to the same level as the 1800X for nearly £200 less).

So imagine that, folks: 1080 Ti performance for $238. Because that's effectively what AMD just did with Ryzen.


----------



## RavageTheEarth

Quote:


> Originally Posted by *Nico67*
> 
> Yeah, surely if there is no restocking fee and they have to sell those cards as "seconds", then they're going to have to add an extra cost to everything to compensate for average losses, so inherently you're adding to your own markup. Surprised retailers don't ban customers for excessive returns.


I'm only measuring the PC's usage. Think I might grab that Kill-A-Watt power strip. I'm only using a little less than 1/3 of my PSU's capacity; I'm still rocking an EVGA G2 1300W from my tri-fire 7950 setup. Man, I remember when you had a golden card if you could hit 1200 on the core. That's chump change compared to what we get out of the box these days.


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> Lol, then I amend my statement: for nearly a third of the price (as you're right, you can overclock the 1700 to the same level as the 1800X for nearly £200 less).
> 
> So imagine that, folks: 1080 Ti performance for $238. Because that's effectively what AMD just did with Ryzen.


that's nuts. Would they still be so consumer friendly if they were in a stronger position? Probably not, but the competition alone would still keep prices down.


----------



## Luckbad

Quote:


> Originally Posted by *Benny89*
> 
> Finally! That means maybe we will see soon FTW Hybrid with 2x8 pin and higher TDP.
> 
> For now I want to try Kraken G10 plus Corsair H90. Which card would you recommend? I thought about Palit since it has 350W limit or Aorus non-Xtreme for 375W TDP or FE because with sub 50 temps I should be able to squeeze it a little bit more.
> 
> Anyone had any experience with Palit? Do they void warranty if you remove cooler and slap block/AIO on it?


I ordered the EVGA SC2 Hybrid. No guarantee they'll do a FTW variant.

If I can get it close to the performance of my Zotac Amp Extreme, I'll sell the Zotac and keep the Hybrid.

Also ordered a Noctua NF-F12 industrialPPC-2000 PWM fan to swap onto the radiator.


----------



## nrpeyton

Well, I'm rocking my old £100 CPU (*AMD FX-8350*) just now: lol

Witcher 3
4K
Ultra
*FPS: 52*

GPU:
1999MHz at 981mv (drawing only 229 watts)
Memory: +555
Temp: 65C
Fan: 100%
Noise: okay with headphones on, otherwise unbearable lol.

Well, I must say, it sure as hell isn't greedy. I'm only feeding it bread and water and it'll still do my old FX proud at 4K ;-)



Under-volting has so many benefits ;-)
The clock has only changed bin once (in the first 40 seconds of gameplay).

_Over 2GHz, it's a totally different ball game.
It wants 1 millivolt per 1 megahertz._
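A minimal sketch of that rule of thumb, purely illustrative; the 2000MHz / 1.000V anchor point is an assumption taken from the numbers above, and every chip's real V/F curve differs:

```python
def estimated_mv(target_mhz, anchor_mhz=2000, anchor_mv=1000):
    """Rule-of-thumb voltage estimate: ~1 mV per MHz above ~2 GHz.

    Below the anchor clock the curve is much flatter (e.g. 1999 MHz
    at 981 mV), so we just return the anchor voltage there.
    """
    if target_mhz <= anchor_mhz:
        return anchor_mv
    return anchor_mv + (target_mhz - anchor_mhz)

# By this rule, 2100 MHz would want roughly 1100 mV (1.100 V)
```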


----------



## Coopiklaani

The FTW3 has a 10-phase VRM, which is actually a true 5-phase VRM with doublers. The FE has a true 8-phase VRM with 2 phases sharing an inductor. In terms of voltage purity, I think FE > FTW3.


----------



## Coopiklaani

My best scores are actually with EVGA BIOS

Time Spy Graphics Score: 11 088
http://www.3dmark.com/spy/1677135

Time Spy stress test: 99.5%
http://www.3dmark.com/tsst/45857

FSU graphics score: 7 910
http://www.3dmark.com/fs/12510310

FSU stress test: 98.9%
http://www.3dmark.com/fsst/421941

No power limit hits in Time Spy at all, some minor power limiting in FSU's first scene.


----------



## Benny89

Quote:


> Originally Posted by *Coopiklaani*
> 
> FTW3 has a 10-phase VRM, which is actually a true 5-phase VRM with doublers. FE has a true 8-phase VRM with 2 phases sharing an inductor. In term of voltage purity, I think FE > FTW.


Can you explain that a little more, please? I am debating between ordering the FTW3 or getting an FE and slapping a Kraken G10 on it, so any additional info would be appreciated.
Quote:


> Originally Posted by *Luckbad*
> 
> I ordered the EVGA SC2 Hybrid. No guarantee they'll do a FTW variant.
> 
> If I can get it close to the performance of my Zotac Amp Extreme, I'll sell the Zotac and keep the Hybrid.
> 
> Also ordered a Noctua NF12 industrial PWM 2000 fan to swap out on the radiator.


Nice, be sure to share your OC results. I don't like that it is 8+6 pin only, and that the copper cover from the pump is also cooling the VRAM, since imo that will only add temps to the core...

Also, I didn't find any information about the power limit of the Hybrid SC. If it is only 300W, then it is really disappointing; 350W with a hybrid cooler would allow for better OC potential.

But I have to wait till they appear in my country anyway, so I am in no rush.


----------



## Coopiklaani

Quote:


> Originally Posted by *Benny89*
> 
> Can you explain that a little more please? I am debating about ordering FTW3 vs getting FE and slapping Kraken G10 on it so any additional info would be appreciated
> 
> 
> 
> 
> 
> 
> 
> .
> Nice, be sure to share your OC results. I don't like that it is 8+6 pin only and that copper cover from pump itself is also cooling VRAM since imo that will only adds temps to core...
> 
> 
> 
> Also, I didn't find any information about Power Limit of Hybrid SC. If it is only 300W then it is really dissapointing. 350W with Hybrid cooler would allow for better OC potential.
> 
> But I have to wait anyway till they will appear in my country so I am in no rush.


My sources:
http://www.gamersnexus.net/news-pc/2844-gtx-1080-ti-fe-pcb-analysis-shunt-mod
http://www.gamersnexus.net/guides/2896-evga-1080ti-ftw3-pcb-analysis

Basically, a 10-phase controller doesn't exist; the most you can get is an 8-phase controller.
More true phases = cleaner voltage = better overclocking, maybe.
More total phases = higher current capability = higher TDP on air.

It means the FE has cleaner voltage while the FTW3 can supply more current. And current capability doesn't really matter for GTX 1080 Ti cards anyway if you are willing to do a shunt mod/BIOS swap.
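To see why total phase count matters for air-cooled TDP, here's a rough per-phase current estimate. The 300W / 1.0V figures and the "~85% of board power goes through the core VRM" split are assumptions for illustration only:

```python
def per_phase_amps(board_watts, core_volts, phase_count, core_fraction=0.85):
    """Rough current carried by each core VRM phase.

    core_fraction is an assumed share of board power that flows through
    the core VRM (the rest feeds memory, fans, etc.).
    """
    core_amps = (board_watts * core_fraction) / core_volts
    return core_amps / phase_count

# A 300 W card at 1.0 V: a 7-phase VRM carries ~36 A per phase;
# doubling the phase count halves the per-phase (and per-MOSFET) load,
# which is why more total phases help TDP on air even if regulation
# quality comes from the true phase count.
```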


----------



## Luckbad

Isn't the FE 7+2 power phases?

Not sure if any card other than Zotac does 16+2 but that's the highest I've seen.


----------



## Coopiklaani

Quote:


> Originally Posted by *Luckbad*
> 
> Isn't the FE 7+2 power phases?
> 
> Not sure if any card other than Zotac does 16+2 but that's the highest I've seen.


The FE has 8 sets of drivers + MOSFETs, but 2 of the driver/MOSFET combos share 1 inductor for load-balancing purposes, so it is advertised as a 7-phase VRM design. As long as you keep those MOSFETs cool enough that they don't burst into flames, the FE's 7+2 will perform just as well as 16+2.


----------



## Dasboogieman

Quote:


> Originally Posted by *Luckbad*
> 
> Isn't the FE 7+2 power phases?
> 
> Not sure if any card other than Zotac does 16+2 but that's the highest I've seen.


More phases are not necessarily better. In fact, on power-constrained cards like the GTX 1080 Ti, it's better to have fewer but more efficient phases to minimize heat output and power losses.
Quote:


> Originally Posted by *Coopiklaani*
> 
> FTW3 has a 10-phase VRM, which is actually a true 5-phase VRM with doublers. FE has a true 8-phase VRM with 2 phases sharing an inductor. In term of voltage purity, I think FE > FTW.


Actually, the EVGA card might win here over the FE because the switching speed of its MOSFETs is faster; additionally, it doesn't have that weird alternating design the FE has.

Faster switching is extremely beneficial because you need a lot less VDROOP to prevent spikes, which in turn helps voltage stability and the longevity of the components.


----------



## bathrobehero

I just bought an MSI 1080 Ti Gaming X and it gets *artifacts* even at stock speeds in 3D applications.

It's pretty rare but it's there; usually just a few small red/pink artifacts that look like space invaders (from the old game).

Once it was huge red rectangles across the screen playing Elite Dangerous.

Downclocking the memory doesn't seem to help (I didn't bother restarting, as I have to run the PC pretty much 24/7), and overclocking +400MHz is an instant freeze and rainbows. It also seems ~8-10% slower than the FE cards I find in benchmarks/reviews. Latest Nvidia driver and latest GPU BIOS.

Previously I used a 1070 without any issues.

I'm finding lots of threads about both the 1080 and 1080 Ti having artifacts; how common are these really? I'll send this card back for sure, but I'm not sure if I should go with another brand of 1080 Ti.


----------



## Dasboogieman

Quote:


> Originally Posted by *bathrobehero*
> 
> I just bought an MSI 1080 Ti Gaming X and it gets *artifacts* even at stock speeds in 3D applications.
> 
> It's pretty rare but it's there; usually just a few small red/pink artifacts that look like space invaders (from the old game).
> 
> Once it was huge red rectangles across the screen playing Elite Dangerous.
> 
> Downclocking the memory doesn't seem to help (I didn't bother restarting, as I have to run the PC pretty much 24/7), and overclocking +400MHz is an instant freeze and rainbows. It also seems ~8-10% slower than the FE cards I find in benchmarks/reviews. Latest Nvidia driver and latest GPU BIOS.
> 
> Previously I used a 1070 without any issues.
> 
> I'm finding lots of threads about both the 1080 and 1080 Ti having artifacts; how common are these really? I'll send this card back for sure, but I'm not sure if I should go with another brand of 1080 Ti.


Try re-seating the cooler and check the VRAM pads. If that still doesn't work, then RMA it; something is dodgy with the core.


----------



## ilikelobstastoo

Quote:


> Originally Posted by *Benny89*
> 
> Can you explain that a little more please? I am debating about ordering FTW3 vs getting FE and slapping Kraken G10 on it so any additional info would be appreciated
> 
> 
> 
> 
> 
> 
> 
> .
> Nice, be sure to share your OC results. I don't like that it is 8+6 pin only and that copper cover from pump itself is also cooling VRAM since imo that will only adds temps to core...
> 
> 
> 
> Also, I didn't find any information about Power Limit of Hybrid SC. If it is only 300W then it is really dissapointing. 350W with Hybrid cooler would allow for better OC potential.
> 
> But I have to wait anyway till they appear in my country, so I am in no rush.


you planning to do the honorable thing this time and sell yours first? Or another return? lol


----------



## illypso

After reading so much about it, and to get the most out of my 2x 1080 Ti,

I finally just bought a 5775C to replace my 4790K. (I have a Kijiji offer of 360 CAD for my 4790K... not sure I will sell it at that price, but there is no market for that here in Quebec; I will probably end up keeping both.)

It will arrive next week, so... any benchmark recommendations to run on my 4790K while waiting, so I can compare and see the gain after the switch?

I did 1336 on Fire Strike Ultra; I hope I will get a better score.

Also, I'm on Win 7. Would switching to Win 10 be better for gaming? From your experience, do the new NVIDIA drivers work better on Win 10?
I used Win 10 for a year, but it seems heavy compared to 7, and I felt like Microsoft was watching my every move on Win 10; there was some weird behavior, so I switched back to Win 7.


----------



## feznz

Quote:


> Originally Posted by *Benny89*
> 
> Yup, I could keep 2063 in BF1 and in Dishonored 2 for like 4-5 hours. Then playing W3 or ME:A they would crash in 2hours best. I even once hold 2050 for 5 hours in W3 but it crashed anyway after that.
> 
> Sometimes people think they have stable clocks before checking all games they have or play longer than 3 hours.
> 
> Well, not too much waste. I love playing with hardware so I am looking foward to Kraken G10 mod


Good luck on the 2100 MHz Gaming OC card. Looking at the OP of this thread, you got 4/40, or one in 10 cards clocking to 2100 MHz.
My opinion: 2100 is just a number, like with my CPU. Unlike you, I had to buy every single one and sell the dud OCers. I was lucky and got a 5 GHz 3770K on the 3rd, BUT I run 4.8 GHz as a daily clock.
Not that it's unstable @ 5 GHz, it's just so much more load for 4% more MHz, almost 0.15 V more, but in all honest truth the performance scales more like a <2% OC.
I have not done the numbers for the 1080 Ti, I simply don't have time. I do know the same happened with my GTX 770 SLI setup: after 1350 MHz the performance simply didn't scale.

Maybe someone with time and a 2100 MHz capable card could do the numbers for all of us, the actual gains per bin from 2000 MHz to 2100 MHz, to put this to the test.

I wasn't sure which clock speed I should report, @KedarWolf: my maximum attainable speed of 2100 MHz or my maximum stable sustained OC of 2038 MHz.


----------



## KedarWolf

Quote:


> Originally Posted by *feznz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Benny89*
> 
> Yup, I could keep 2063 in BF1 and in Dishonored 2 for like 4-5 hours. Then playing W3 or ME:A they would crash in 2hours best. I even once hold 2050 for 5 hours in W3 but it crashed anyway after that.
> 
> Sometimes people think they have stable clocks before checking all games they have or play longer than 3 hours.
> 
> Well, not too much waste. I love playing with hardware so I am looking foward to Kraken G10 mod
> 
> Good luck on the 2100 MHz Gaming OC card. Looking at the OP of this thread, you got 4/40, or one in 10 cards clocking to 2100 MHz.
> My opinion: 2100 is just a number, like with my CPU. Unlike you, I had to buy every single one and sell the dud OCers. I was lucky and got a 5 GHz 3770K on the 3rd, BUT I run 4.8 GHz as a daily clock.
> Not that it's unstable @ 5 GHz, it's just so much more load for 4% more MHz, almost 0.15 V more, but in all honest truth the performance scales more like a <2% OC.
> I have not done the numbers for the 1080 Ti, I simply don't have time. I do know the same happened with my GTX 770 SLI setup: after 1350 MHz the performance simply didn't scale.
> 
> Maybe someone with time and a 2100 MHz capable card could do the numbers for all of us, the actual gains per bin from 2000 MHz to 2100 MHz, to put this to the test.
> 
> I wasn't sure which clock speed I should report, @KedarWolf: my maximum attainable speed of 2100 MHz or my maximum stable sustained OC of 2038 MHz.
Click to expand...

24/7 gaming clocks you run.


----------



## pez

Quote:


> Originally Posted by *Alex132*
> 
> Not a fan of the FTW3 design or look - but the changes in the PCB, cooling, etc. are too attractive to turn down. I signed myself up for one... but just playing the waiting game now


FWIW, the EVGA coolers look much better in person than they do in pics. The ACX 3.0 coolers were a serious turnoff for me, and then when I had one in my hands I felt differently. The ICX cooler looks much better to boot and still has that same effect. I guess it's just the way the OEM takes the photos.
Quote:


> Originally Posted by *Slackaveli*
> 
> i mean, i'd love to, but click on my profile and you'll see why lol. 5 kids....


Beautiful family. That poor child being an only boy amongst 4 sisters.


----------



## fisher6

Planning to reseat my block later today and was wondering if I do the shunt mod, what's the easiest and safest way to do it?


----------



## Alex132

Quote:


> Originally Posted by *pez*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Not a fan of the FTW3 design or look - but the changes in the PCB, cooling, etc. are too attractive to turn down. I signed myself up for one... but just playing the waiting game now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> FWIW, the EVGA coolers look much better in person than they do in pics. The ACX 3.0 coolers were a serious turnoff for me, and then when I had one in my hands I felt differently. The ICX cooler looks much better to boot and still has that same effect. I guess it's just the way the OEM takes the photos.
Click to expand...

Watched the Gamer's Nexus review on the FTW3 (Or rather break down), and I was sold on it. Ordered it (or rather signed up for order when it comes available) locally.


----------



## pez

Quote:


> Originally Posted by *Alex132*
> 
> Watched the Gamer's Nexus review on the FTW3 (Or rather break down), and I was sold on it. Ordered it (or rather signed up for order when it comes available) locally.


Very nice!


----------



## Alex132

Quote:


> Originally Posted by *pez*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Watched the Gamer's Nexus review on the FTW3 (Or rather break down), and I was sold on it. Ordered it (or rather signed up for order when it comes available) locally.
> 
> 
> 
> Very nice!
Click to expand...

Got tired of the R9 295X2's special issues; things like BSODing when I open GPU-Z, MSI AB, etc. got old 2 years ago.









2500k + 1080 Ti is gonna be a bit weird, but waiting on Ryzen revision for higher clocks - or 6C|12T Coffee Lake.


----------



## pez

Quote:


> Originally Posted by *Alex132*
> 
> Got tired of R9 295X2 special issues, things like BSODing when I open GPUZ, MSI AB, etc. got old 2 years ago
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2500k + 1080 Ti is gonna be a bit weird, but waiting on Ryzen revision for higher clocks - or 6C|12T Coffee Lake.


Indeed....that's a large leap. Your res on that monitor will help relieve that bottleneck. I'm hoping Coffee Lake 6C/12T will be affordable...as in I hope it takes the current price-point of the top end i7.


----------



## Alex132

Quote:


> Originally Posted by *pez*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Got tired of R9 295X2 special issues, things like BSODing when I open GPUZ, MSI AB, etc. got old 2 years ago
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2500k + 1080 Ti is gonna be a bit weird, but waiting on Ryzen revision for higher clocks - or 6C|12T Coffee Lake.
> 
> 
> 
> Indeed....that's a large leap. Your res on that monitor will help relieve that bottleneck. I'm hoping Coffee Lake 6C/12T will be affordable...as in I hope it takes the current price-point of the top end i7.
Click to expand...

If it doesn't, then I'll just get the Ryzen revision. So not too worried to be honest. Or heck, X99 revision 6C|12T.


----------



## max883

Tomorrow I'm getting mine!!!


----------



## TWiST2k

Quote:


> Originally Posted by *Alex132*
> 
> Watched the Gamer's Nexus review on the FTW3 (Or rather break down), and I was sold on it. Ordered it (or rather signed up for order when it comes available) locally.


I pre-ordered the FTW3 direct from EVGA on 3/28 during the first wave, and they're having a stock issue at the moment. They had a 5/1 "estimated" ship date, only a handful of orders from that day have shipped, and I read today that the rest won't be going out until Thursday or Friday. I am excited to get it, but it's the typical delays, like everything these days. Tell me it's gonna ship 5/8 and ship it 5/5 and I will be over the moon about it, instead of telling me 5/1 and then "maybe" it will ship by 5/5.


----------



## Benny89

Quote:


> Originally Posted by *ilikelobstastoo*
> 
> you planing to do the honorable thing this time and sell yours first? or another return? lol


Um, I know some people here do not approve of taking advantage of that legal option provided to me by our law (which I don't really care about, sorry), but using logic: why should I sell it if I can return it?

Quote:


> Originally Posted by *Alex132*
> 
> Watched the Gamer's Nexus review on the FTW3 (Or rather break down), and I was sold on it. Ordered it (or rather signed up for order when it comes available) locally.


I also think this is a nice card, but I will wait for some reviews on it first. I want to know its TDP though.


----------



## Alex132

Quote:


> Originally Posted by *Benny89*
> 
> I also think this is a nice card, but I will wait for some reviews on it first. I want to know its TDP though.


I'm honestly not that concerned. I love that it's 2-slot, iCX looks amazing, and I won't manually overclock it to run 24/7, only for maybe benching.


----------



## Benny89

Quote:


> Originally Posted by *Alex132*
> 
> I won't manually overclock it to run 24/7 - only for maybe benching.


Eh, wish my stupid head could do it...


----------



## foolycooly

For people who have not touched the voltage curve--do the temp and power sliders actually do anything?

I have a small manual offset of +70 and +300 on a custom fan curve and the card will settle at 1944-1966 at 75c @ 65% fan during long sustained gaming sessions. This is with the sliders in their stock position in Precision XOC. I tried maxing the sliders out with the same settings and my core didn't move at all.


----------



## Alex132

Quote:


> Originally Posted by *Benny89*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> I won't manually overclock it to run 24/7 - only for maybe benching.
> 
> 
> 
> Eh, wish my stupid head could do it...
Click to expand...

I just don't like having to start MSI AB / EVGA Precision X every single time I boot; it's annoying. Plus the gains per power consumption/heat generation are not worth it IMO. (Especially on my 295X2; this thing is crazy at stock, let alone OCed.)


----------



## stranger451

Quote:


> Originally Posted by *foolycooly*
> 
> For people who have not touched the voltage curve--do the temp and power sliders actually do anything?
> 
> I have a small manual offset of +70 and +300 on a custom fan curve and the card will settle at 1944-1966 at 75c @ 65% fan during long sustained gaming sessions. This is with the sliders in their stock position in Precision XOC. I tried maxing the sliders out with the same settings and my core didn't move at all.


GPU Boost 3.0 will adjust your clock frequency depending on a variety of factors. Simply raising the core voltage to 100% and the power limit to 120% nets a 25 MHz increase in clock rate on my card. Maxing the core voltage and power limit will give you the best potential for a higher overclock. Unfortunately, GPU Boost 3.0 will also downclock your card due to very aggressive thermal management. The voltage curve is used to give you better stability, since GPU Boost 3.0 creates instability in your overclock. Ideally you would overclock by manually offsetting the core, memory clock and voltage, but GPU Boost will override your settings, causing crashes. The performance gained through having a slightly higher and more consistent core clock using a voltage curve is minimal at best.
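The slider behaviour described above can be reasoned about as a voltage/frequency curve that the boost logic clamps to whatever the power/thermal limiter currently allows. A minimal toy sketch of that idea (the curve point, limits, and the ~13 MHz bin size are illustrative assumptions, not NVIDIA's actual tables):

```python
# Toy model of GPU Boost 3.0-style clock selection (illustrative only).
# The card carries a factory V/F curve; a core offset shifts the whole
# curve, and boost then clamps the result to the power/thermal limit,
# snapping to ~13 MHz bins as Pascal does.

BIN_MHZ = 13  # Pascal adjusts clocks in roughly 13 MHz steps

def snap_to_bin(clock_mhz):
    """Round a clock down to the nearest boost bin."""
    return (clock_mhz // BIN_MHZ) * BIN_MHZ

def effective_clock(curve_top_mhz, core_offset_mhz, limit_mhz):
    """Shift the curve's top point by the offset, then clamp to the
    clock the power/thermal limiter currently allows."""
    requested = curve_top_mhz + core_offset_mhz
    return snap_to_bin(min(requested, limit_mhz))

# With generous limits, a +70 offset raises the sustained clock...
print(effective_clock(1898, 70, limit_mhz=2100))
# ...but if the limiter already caps the card below the requested
# clock, maxing the sliders changes nothing, which matches the
# "sliders did nothing" observation above.
print(effective_clock(1898, 70, limit_mhz=1950))
```

This is only a mental model; the real boost algorithm weighs voltage, temperature, and power together per curve point.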


----------



## Nico67

Quote:


> Originally Posted by *stranger451*
> 
> The voltage curve is used to give you better stability since GPU Boost 3.0 creates instability in your overclock. Ideally you could be overclocking by manually offsetting the core, memory clock and voltage but GPU Boost will override your settings causing crashes. The performance gained through having a slightly higher and more consistent core clock using a voltage curve is minimal at best.


Using a core offset is still playing with the curve; it's just moving the whole curve rather than individual points. GPU Boost just operates off that curve. The gain from using the curve is more consistent clocks, so better minimum fps.


----------



## bathrobehero

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bathrobehero*
> 
> I just bought an MSI 1080 Ti Gaming X and it gets *artifacts* even on stock speeds in 3D applications.
> 
> It's pretty rare but it's there; usually it's just a few, small red/pink stuff that looks like space invaders (from the old game).
> 
> Once it was huge red rectangles across the screen playing Elite Dangerous.
> 
> Downclocking memory doesn't seem to help (didn't bother restarting as I kind of have to run the PC 24/7), and overclocking +400 MHz is an instant freeze and rainbows. It also seems ~8-10% slower than the FE cards I find in benchmarks/reviews. Latest NVIDIA driver and latest GPU BIOS.
> 
> Previously I used a 1070 without any issues.
> 
> I'm finding lots of threads with both the 1080 and 1080 Ti having artifacts; how common are these really? I'll send this card back for sure, but I'm not sure if I should go with another brand of 1080 Ti.
> 
> 
> 
> Try re-seating the cooler and check the VRAM pads. If that still doesnt work then RMA it, something is dodgy with the core.
Click to expand...

I tried everything other than disassembling it as it's already getting picked up tomorrow for RMA.

Thinking about swapping it to an Asus Strix, but I may have to wait a few weeks for one.

So I guess while there really are dozens of threads about artifacting 1080/1080 Ti's (which blows my mind considering the price), it isn't that widespread right? Thanks.

Also, this just happened again, minutes ago after hours of no artifacts: http://i.imgur.com/3ShmtHY.png


----------



## Dasboogieman

Quote:


> Originally Posted by *Benny89*
> 
> Um, I know some people here do not approve taking advantage of that legal way (which I don't really care, sorry), provided to me by our law - but using logic- why should I sell it if I can return it?
> I also think this is nice card, but I will wait first for some reviews on it. I want to know its TDP though.


Legal =/= ethical. You're only returning it because you can get away with any mods you did (if we wanna be nitpicky, overclocking a card is considered a mod). Otherwise, you would have no choice but to sell it.

People don't care enough about each other these days, it's all me me me lol, and you're a prime example. You return your card, it's technically used (thus at the very least some electromigration has happened), they repackage it and re-sell it to someone else, who inherits all the problems you may have introduced with your "reversible" modding. If enough people do this, RMA rates get higher and prices go up for everyone; everybody loses.

But... this is a tech forum, not a life-ethics discussion, so I will speak no more of this.


----------



## mshagg

Quote:


> Originally Posted by *bathrobehero*
> 
> So I guess while there really are dozens of threads about artifacting 1080/1080 Ti's (which blows my mind considering the price), it isn't that widespread right? Thanks.


Where are these 'dozens' of threads? Presumably on some other forum?


----------



## nrpeyton

Quote:


> Originally Posted by *fisher6*
> 
> Planning to reseat my block later today and was wondering if I do the shunt mod, what's the easiest and safest way to do it?


Read *"A shunt mod story by SlimJ87D"* in the opening post (page 1 of this thread).

It's a great article and covers how you can do it safely by using a little liquid electrical tape at the base of the resistors.

Nice big photos too, clear and easy to use to visually compare everything to your own card.

After you've read the article, if you've still got any questions then ask away, and I'm sure SlimJ87D or one of the others who have already carried out the mod safely will be happy to help you ;-)
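For background on why the mod works at all: the card estimates power from the voltage drop across low-value shunt resistors, and soldering (or conductively bridging) a second resistor in parallel lowers the effective resistance, so the controller under-reads the drawn current. A rough sketch of the arithmetic (the 5 mΩ value and 300 W figure are typical examples, not measured from any specific card):

```python
# Rough arithmetic behind a shunt mod (illustrative values only).
# The controller computes current as I = V_drop / R_shunt. If a second
# resistor of equal value sits in parallel, the effective resistance
# halves, so the reported current (and power) halves too, roughly
# doubling the usable power-limit headroom.

def parallel(r1, r2):
    """Effective resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def reported_power(true_power_w, r_original, r_effective):
    """Power the controller *thinks* is drawn after the mod."""
    return true_power_w * (r_effective / r_original)

r_stock = 0.005                        # a typical 5 milliohm shunt
r_modded = parallel(r_stock, r_stock)  # equal resistor stacked on top

# A true 300 W draw is now reported as only 150 W, so the card keeps
# boosting where it would previously have throttled:
print(r_modded, reported_power(300, r_stock, r_modded))
```

The flip side, and why the guide's safety notes matter, is that all power-protection thresholds are fooled by the same factor.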


----------



## nrpeyton

deleted, double post (by accident)


----------



## Slackaveli

Quote:


> Originally Posted by *illypso*
> 
> After reading so much about it and to maximize my 2 X 1080TI
> 
> I finally just bought a 5775C to replace my 4790K. ( I have a kijiji offer for my 4790k at 360 CAD... not sure I will sell it at that price but there is no market for that here in quebec, I will probably end up keeping both)
> 
> It will arrive next week so... any benchmark recommendation to do on my 4790k while waiting to be able to compare and see the gain after the switch.
> 
> I did 1336 on firestrike ultra,I hope I will get a better score.
> 
> Also I'm on win 7, would switching to Win 10 be better in gaming? from you experience, does the new Nvidia driver work better in win10
> I used win 10 for a year but it seems heavy compare to 7 and I feel like Microsoft was watching my every move on win10, there some weird behavior. I switch back to win7


congrats, man!! you'll see more gains in games than benches as the cache is more useful, but you'll see gains in both. Be sure to pop by here http://www.overclock.net/t/1583537/intel-broadwell-c-ownership-club/890#post_26064099 , lots of good talk in there.


----------



## bathrobehero

Quote:


> Originally Posted by *mshagg*
> 
> Where are these 'dozens' of threads? Presumably on some other forum?


Yes, just all through Google. But then again, you only find what you look for, so it might not be that prevalent; which is why I posted, to find that out.


----------



## Slackaveli

Quote:


> Originally Posted by *pez*
> 
> Beautiful family
> 
> 
> 
> 
> 
> 
> 
> . That poor child being an only boy amongst 4 sisters
> 
> 
> 
> 
> 
> 
> 
> .


yeah, thanks man. but there is one long-haired hippy boy hiding in them girls. He's a pretty boy and it happens all the time (which he HATES, lol, of course). but the two boys are still outnumbered and they get crushed by the 3 girls, who seem to stick together lol.


----------



## TheNoseKnows

Quote:


> Originally Posted by *nrpeyton*
> 
> An:
> 
> 
> indepth look at the *PCB* _(the card it's self),_
> *the power delivery system* _(and O/C'ing)_ and
> *the cooling system* _(heatsink+fan & thermal pad arrangement)_
> 
> is my *perfect* kind of review!


Who would trust a film review written by someone who didn't watch the film? Or a car review by someone who didn't drive the car? Or a hardware review by someone who didn't use the hardware? As I said, he didn't even install the card, let alone make any judgements or comparisons. All the stuff you listed is purely superficial and can easily be done just by looking at pictures of the card (which is exactly what they did for the VRM analysis, where one guy used a picture of the PCB in GIMP without even physical access to the card). Not even Gamers Nexus are calling it a review, because they know better. They just want to get the preliminary stuff out of the way early so people who are looking forward to the actual review have something to watch while they wait.


----------



## dansi

Hey, all you 1080 Ti owners on water cooling: can you run >2 GHz stable for a full gaming session?
I've heard the 1080 Ti is purely heat limited.


----------



## Slackaveli

Quote:


> Originally Posted by *dansi*
> 
> Hey all 1080Ti on water cooling, can you run >2ghz stable for a full gaming session?
> Heard the 1080Ti are purely heat limited.


i do it on air at 2063+. just need to re-tim for $10


----------



## nrpeyton

Quote:


> Originally Posted by *TheNoseKnows*
> 
> Who would trust a film review written by someone who didn't watch the film? Or a car review by someone who didn't drive the car? Or a hardware review by someone who didn't use the hardware? As I said, he didn't even install the card, let alone make any judgements or comparisons. All the stuff you listed is purely superficial and can easily be done just by looking at pictures of the card (which is exactly what they did for the VRM analysis, where one guy used a picture of the PCB in GIMP without even physical access to the card). Not even Gamers Nexus are calling it a review, because they know better. They just want to get the preliminary stuff out of the way early so people who are looking forward to the actual review have something to watch while they wait.


Others have been "sold" just watching those videos already 

But fair enough.


----------



## dunbheagan

Quote:


> Originally Posted by *dansi*
> 
> Hey all 1080Ti on water cooling, can you run >2ghz stable for a full gaming session?
> Heard the 1080Ti are purely heat limited.


Quote:


> Originally Posted by *Slackaveli*
> 
> i do it on air at 2063+. just need to re-tim for $10


You have no downclocking at all during the full run? I would consider this a very good result!

How's your card doing in FS Ultra? I found this test very demanding in terms of power limiting. My FE (water, shunt mod) can do Superposition 4K, Metro Redux, Valley, and Doom at 2100 MHz without any downclocking, but it crashes in FS Ultra above 2075. And at 2050 it still has a few minor downclocks to 2038 and 2025.


----------



## nrpeyton

Quote:


> Originally Posted by *dansi*
> 
> Hey all 1080Ti on water cooling, can you run >2ghz stable for a full gaming session?
> Heard the 1080Ti are purely heat limited.


With an undervolt, *many* can run 2 GHz all day long (the 1999 MHz bin), even without water, and still have 50 watts of power left to play with on an FE. _(in games)._ Benches: maybe a bit different.

Cards which struggle in the lottery (maybe *just* falling short of 2 GHz on air) will probably still do 2 GHz with a really good custom water loop.

The scale on Pascal is about 100 MHz for every 50 degrees C.
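Taking the poster's rule of thumb at face value (100 MHz per 50 °C, i.e. about 2 MHz per degree; this is a forum estimate, not an NVIDIA spec), you can sketch what a cooler loop buys you, remembering that boost only moves in ~13 MHz steps:

```python
# Rough estimate of Pascal thermal downclocking, using the ~100 MHz per
# 50 C rule of thumb from the post above, snapped to the ~13 MHz bins
# GPU Boost actually steps in. Illustrative only; real behaviour also
# depends on the power limit and the card's V/F curve.

BIN_MHZ = 13
MHZ_PER_DEGREE = 100 / 50  # the post's rule of thumb: 2 MHz per degree C

def estimated_clock(cool_clock_mhz, cool_temp_c, hot_temp_c):
    """Expected sustained clock after warming from cool_temp to hot_temp."""
    drop_mhz = (hot_temp_c - cool_temp_c) * MHZ_PER_DEGREE
    bins = round(drop_mhz / BIN_MHZ)      # boost steps in whole bins
    return cool_clock_mhz - bins * BIN_MHZ

# A card boosting to 2050 MHz at 40 C, warming to 72 C on air, would
# be expected to settle a few bins lower:
print(estimated_clock(2050, 40, 72))
```

By the same arithmetic, a water loop that holds the core 30 °C cooler buys back roughly four or five bins, which is why a card "just short" of 2 GHz on air can get there under water.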


----------



## dmt123

I have been looking at some reviews and benchmarks of this card, and it has caught my eye that the 0.1% minimum frame rate seems to dip very noticeably when the card is overclocked to its limit, although the average and max fps increase.
Any ideas what could be causing this? I find it very strange. Maybe some kind of aggressive hardware-level throttling is happening?
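For anyone wanting to check this against their own logs: 0.1% lows are usually derived from per-frame times, converting the slowest 0.1% of frames back to FPS. A minimal sketch (the frametime numbers are made up) showing why a borderline OC can raise the average while cratering the lows:

```python
# Compute average FPS and a "worst N%" low from a list of frame times
# in milliseconds. A single long frame barely moves the average but
# dominates the 0.1% low, which is why an OC on the edge of stability
# (brief throttle dips, driver recovery stalls) can improve average
# FPS while the 0.1% minimums get visibly worse.

def fps_stats(frametimes_ms, worst_fraction=0.001):
    avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, int(len(worst) * worst_fraction))
    low_fps = 1000 / (sum(worst[:n]) / n)  # FPS of the slowest n frames
    return avg_fps, low_fps

# 999 smooth frames at 10 ms plus one 100 ms hitch:
times = [10.0] * 999 + [100.0]
avg, low = fps_stats(times)
print(round(avg, 1), round(low, 1))  # average barely dips; the low collapses
```

Different review sites bucket the percentile slightly differently, so absolute 0.1% numbers aren't comparable across tools, only across runs on the same tool.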


----------



## D13mass

Guys, if my card is stable at 2050/12000 MHz with a water block and the stock BIOS, do I need to flash a custom BIOS? Will it give more performance?


----------



## nrpeyton

Quote:


> Originally Posted by *D13mass*
> 
> Guys, if my card stable on 2050/12000 Mhz with water block and stock bios, do I need flash custom bios ? Will it give more performance?


If you're hitting the power limit, a shunt mod would probably net you more performance than cross-flashing another manufacturer's BIOS.

You would then indeed get a more consistent frequency at higher clocks; but in terms of the real-world performance gain that brings, in all honesty it's negligible.

If you're going for a "benching score" then by all means go for it... if it's just for gaming then you may be disappointed.

If you like to tinker and push the envelope and scrape out every last degree of performance (just to say you can, _like me lol_) because you enjoy overclocking, then again, I'd say definitely go for it!

By the way, I checked out your system pictures. I like your radiator, it's huuuuge lol. Very nice setup ;-)


----------



## Mrip541

I received my replacement Gigabyte OC today. The coil whine is much improved, but I'm getting the exact same black grid flashing during WoW loading screens, intermittent horizontal black bar flashes while in game, and the screen goes black for a split second every 30 min or so. Could this be something other than the card?


----------



## nrpeyton

Quote:


> Originally Posted by *Mrip541*
> 
> I received my replacement Gigabyte OC today. The coil whine is much improved, but I'm getting the exact same black grid flashing during WoW loading screens, intermittent black bar flashes horizontally while in game, and the screen goes black for a split second every 30min or so. Could this be something other than the card?


You could try overclocking your card and see if it remains stable with a little boost (if it does, then you know it's a software issue).

If not then it could be an unlucky silicon lottery.

If that's the case (poor lottery); I'd try the following first in this order:

1) Under-volt the card while still aiming for a decent, playable core frequency you're happy with. The reduced current through the core could help stabilise it at a decent clock rate.

_if that doesn't work then:_

2) Lower the frequency and maintain stock voltage. If the problem goes away, your card may be struggling to hit "stock" boost frequencies. In that case you have the choice of either *a)* returning the card or *b)* manually setting up a profile to slightly down-tune the card to a stable clock that you're still happy with and that still nets you the performance you want.

3) If none of that works, then there may be either:
a) a software issue (try different games and see if the problem persists)
or
b) a more serious fault with the card that requires an RMA


----------



## dunbheagan

Quote:


> Originally Posted by *nrpeyton*
> 
> If you're hitting the power-limit, a shut-mod would probably net you more performance than cross-flashing another manufacturers BIOS.
> 
> You would then indeed get a more consistent frequency at higher clocks; but in terms of the real-world performance gain that brings -- in all honesty it's negligible.
> 
> If you're going for a score then by all means go for it.. if its just for gaming then you may be disappointed.
> 
> If you like to tinker and push the envelope and scrape every last degree of performance (just to say you can) because you enjoy overclocking; then again-- I'd say definitely go for it!
> 
> By the way; I checked your system pictures out. I like your radiator. it's huuuuge lol. very nice setup ;-)


Please don't get me wrong on this, just asking out of curiosity: do you call it "shut mod" instead of "shunt mod" for a special reason, like a joke I don't get? Really, I don't want to be a know-it-all, and I'm pretty sure you know it's "shunt". I just see you write "shut" all the time and ask myself why.


----------



## D13mass

Quote:


> Originally Posted by *nrpeyton*
> 
> If you're hitting the power-limit, a shut-mod would probably net you more performance than cross-flashing another manufacturers BIOS.
> 
> You would then indeed get a more consistent frequency at higher clocks; but in terms of the real-world performance gain that brings -- in all honesty it's negligible.
> 
> If you're going for a score then by all means go for it.. if its just for gaming then you may be disappointed.
> 
> If you like to tinker and push the envelope and scrape every last degree of performance (just to say you can) because you enjoy overclocking; then again-- I'd say definitely go for it!
> 
> By the way; I checked your system pictures out. I like your radiator. it's huuuuge lol. very nice setup ;-)










Thanks, but that's my old radiator; 3 days ago I built a new config, something like that.


And thanks for the reply, I will stay on the stock BIOS.


----------



## nrpeyton

Quote:


> Originally Posted by *dunbheagan*
> 
> Please, dont get me wrong on this, just asking out if curiosity: Do you call it shut mod instead of shunt mod for a special reason, like a joke i dont get? Really, i dont want to be a know-it-all and i am pretty sure you know it's shunt. I just see you write shut all the time and ask myself why.


My apologies; I stand corrected 

"Shunt mod" it is, lol


----------



## Slackaveli

Quote:


> Originally Posted by *dunbheagan*
> 
> You have no downclock at all during the full run? I would consider this a very good result!
> 
> How's your card doing in FS Ultra? I found this test very demanding concerning power limiting. My FE(water, shunt mod) can do Supo 4k, Metro Redux, Valley, Doom at 2100MHz without any downclock. But it crashes in FS ultra above 2075. And at 2050 it still has a few minor downclocks to 2038 and 2025.


a couple of games it touches 2050, i don't lock it. but, yeah, i got lucky (finally) this time.


----------



## dunbheagan

Quote:


> Originally Posted by *nrpeyton*
> 
> My apologies; I stand corrected
> 
> "Shunt mod" it is, lol


No problem at all, I just thought I didn't get the joke.


----------



## velocityx

Guys, how is the stock TIM on Founders Edition Ti cards? (This is my first FE card.) I discovered that my card is losing OC stability around 71-72 C. I will be adding a waterblock next month, but I was thinking maybe I should swap to Gelid Extreme to gain a few degrees until my waterblock ships.


----------



## nrpeyton

Quote:


> Originally Posted by *dunbheagan*
> 
> No problem at all, i just thought i dont get the joke


lol no worries, I think I'm on "auto-pilot" half the time, and due to the MASSIVE size of this thread (813 pages now and probably close to half a million views?) sometimes I'm guilty of just "skimming" and missing the finer points of detail. The word "shut" came more naturally; I guess I maybe read it somewhere a little while ago (6 months maybe), it just "stuck", and I never gave it another thought until now. lol

Funny thing is, now that I think about it.. "shunt" does make a lot more sense lol


----------



## dunbheagan

Quote:


> Originally Posted by *Slackaveli*
> 
> a couple games it touches 2050, i dont lock it. but, yeah, i got lucky (finally) this time.


Happy to hear that! I wish you a lot of fun with your card!

I know how it feels to get a below-average overclocker. I came from a 1070 AMP Extreme, which had been quite expensive last year, and I expected high clocks, but I was one of the first unlucky guys who got a 1070 with Hynix memory instead of Samsung, and it absolutely made me mad that I got like 300 MHz less on the memory than expected, although I knew it didn't make a noticeable real-world performance difference.

The 1080 Ti is the first card which really leaves me satisfied, with 2075/6200 max and 2050/6000 for everyday use.


----------



## dunbheagan

Quote:


> Originally Posted by *nrpeyton*
> 
> lol no worries, I think i'm on "auto-pilot" half the time and due to the MASSIVE size of this thread (813 pages now and probably close to nearly half-a-million views?) sometimes I'm guilty of just "skimming" and sometimes miss the finer points of "detail". The word "shut" comes more naturally; and I guess I maybe read it somewhere a little while ago (6 months maybe) and possibly it just "stuck" and I never ever gave it another thought, until now. lol
> 
> Funny thing is, now that I think about it.. "shunt" does make a lot more sense lol


Yeah, it's really hard to follow this thread, it is so active







I have no idea how a few people manage to reply to so many posts, respect! I just read the last posts every day and skip over the 100 new posts from the last 24h...


----------



## nrpeyton

Quote:


> Originally Posted by *velocityx*
> 
> Guys, how is the stock TIM on Founders Edition Ti cards? (this my first FE card) I discovered that my card is losing OC stability around 71-72 C, will be doing a waterblock next month but was thinking maybe I should change for gelid extreme to get a few degrees until my waterbloc ships.


Very good question:


I was wondering the exact same thing.
I'm in the same boat as you.
Although won't it also differ from manufacturer to manufacturer?
E.g.:
EVGA FE
Gigabyte FE
ASUS FE
PALIT FE
Nvidia Store FE


----------



## Slackaveli

Quote:


> Originally Posted by *dunbheagan*
> 
> Happy to hear that! I wish you a lot of fun with your card!
> 
> I know how it feels to get an under average overclocker. I came from a 1070 Amp Extreme which had been quite expensive last year and I expected high clocks, but i was one of the first unlucky guys who got a 1070 with hynix memory instead of samsung and it absolutely made me mad that i got like 300MHz less on the memory than expected, although i knew it didnt make a noticable real world performance difference.
> 
> The 1080Ti is the first card which really makes me satisfied with a 2075/6200max and and 2050/6000 for everyday use.


man, i actually caught a very good 980 Ti Hybrid after several straight mediocre-at-best cards. It held 1539MHz and ran cool as can be. Then it died in 1 year (pump went out; wish the cooler came off so I could've kept my chip at least) and the replacement crashed at 1450. Had to run her at 1438MHz, and even though it had an extra +100 on the mems, it was no consolation. My 1080 Ti FE was a 1974MHz card, and I had accepted it. But then, as I was still in the 30-day window and had overpaid for the FE by $100 (next-day ship plus tax, $801.00), I said screw it, returned it, and got "just" a regular Aorus. Wanted the Extreme, but I had to get something and they were out. Lo and behold, I caught a regular Aorus that beats every Extreme I've seen yet. Freaking AWESOME feeling, even if it is much ado about nothing.


----------



## Slackaveli

Quote:


> Originally Posted by *dunbheagan*
> 
> Yeah, it's really hard to follow this thread, it is so active
> 
> 
> 
> 
> 
> 
> 
> I have no idea how a few people manage to reply to so many posts, respect! I just read the the last posts every day and jump over the 100 new posts from the last 24h...


email alerts, my man!
Quote:


> Originally Posted by *velocityx*
> 
> Guys, how is the stock TIM on Founders Edition Ti cards? (this my first FE card) I discovered that my card is losing OC stability around 71-72 C, will be doing a waterblock next month but was thinking maybe I should change for gelid extreme to get a few degrees until my waterbloc ships.


yeah, man, I gained ~8c on my re-TIM of the FE. Gained 5c on my re-TIM of the Aorus. They all need re-TIMming. The Asus seems the best from the factory. Damn robot company. I used to be an Asus guy, but I don't support companies that replace people with full automation. We will all be replaced someday. I'm racist against robots, lol. Ironically, my grandfather is one of the founding fathers of AI. He is the acknowledged genius behind automated theorem proving. He was even on the Unabomber's kill list. Wish he were still alive to speak of the future of the world. RIP grandpappy, loved ya. To be clear, the Unabomber didn't kill him; ALS got him. But he was the 2nd name on the Unabomber's kill list, and the FBI checked my grandparents' mail for a full year after dude was busted.


----------



## Alex132

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheNoseKnows*
> 
> Who would trust a film review written by someone who didn't watch the film? Or a car review by someone who didn't drive the car? Or a hardware review by someone who didn't use the hardware? As I said, he didn't even install the card, let alone make any judgements or comparisons. All that stuff you listed is purely superficial and can easily be done just by looking at pictures of the card (which is exactly what they did for the VRM analysis, where one guy used a picture of the PCB in GIMP without even physical access to the card:
> 
> 
> 
> ). Not even Gamers Nexus are calling it a review, because they know better. They just want to get the preliminary stuff out of the way early so people who are looking forwards to the actual review have something to watch while they wait.
> 
> 
> 
> Others have been "sold" just watching those videos already
> 
> But fair enough.
Click to expand...

Yeah it's a 1080 Ti, what magical performance are you expecting beyond the typical stock overclock gains?

Only thing I would care about is the temp/noise test from that. But I trust it won't be bad, plus I only really care about noise when idle - and it turns the fans off then.


----------



## dunbheagan

Quote:


> Originally Posted by *nrpeyton*
> 
> Very good question:
> 
> 
> I was wondering the exact same thing.
> I'm in the same boat as you.
> Although won't it also differ from manufacturer to manufacturer?
> I.E:
> EVGA FE
> Gigabyte FE
> ASUS FE
> PALIT FE
> Nvidia Store FE


When I removed the cooler on my FE, the TIM visually made quite a good impression. It was not hardened, and there was neither too much nor too little:



Anyway, i changed it to Conductonaut.





But why are you asking if you plan to watercool it? I mean, you have to reapply the TIM anyway when you change the cooler, don't you?


----------



## Lefty23

Quote:


> Originally Posted by *feznz*
> 
> Maybe someone with time and a 2100Mhz capable card could do the numbers for all of us the actual gains per bin from 2000Mhz to 2100Mhz to put this to the test of truth


I know this is not enough data to reach any solid conclusions, but here are the results of the in-game benchmark from Ghost Recon Wildlands (1440p Ultra preset) at every bin from 2000-2100. Unfortunately I don't have a lot of free time, but this benchmark is really short, so the whole thing took like 15 minutes







.
2000 --- 61.63 Avg FPS
2012 --- 61.90 (+0.44%)
2025 --- 62.14 (+0.38%)
2038 --- 62.43 (+0.47%)
2050 --- 62.56 (+0.21%)
2063 --- 62.84 (+0.45%)
2076 --- 63.11 (+0.43%)
2088 --- 63.50 (+0.62%)
2100 --- 64.51 (+1.6%)

So, in total 2.98 fps (+4.84%) for a 100MHz (5%) OC in this specific game benchmark (screenshots in the spoiler).
As I said in another thread a few days ago, I don't believe there are people who can actually tell the difference between 62 and 65 fps.
Even at higher fps that is what, 120 to 126? Again, I don't think this is really noticeable or something to worry about unless you intend to bench competitively.
As far as I'm concerned, I realised a couple of days ago that I've had this card in my "gaming" PC for over a month and gamed for less than 10 hours
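For anyone who wants to redo this math on their own numbers, here's a minimal Python sketch of the per-bin percentage calculation. The FPS values are the ones from the table above; small rounding differences from the posted percentages are expected, since I round each step independently.

```python
# Per-bin scaling math for the GR:W 1440p Ultra numbers posted above:
# percent gain of each ~13MHz bin over the previous one, plus the total.
clocks = [2000, 2012, 2025, 2038, 2050, 2063, 2076, 2088, 2100]
avg_fps = [61.63, 61.90, 62.14, 62.43, 62.56, 62.84, 63.11, 63.50, 64.51]

def per_bin_gains(fps):
    """Percent gain of each reading over the one before it."""
    return [round((b / a - 1) * 100, 2) for a, b in zip(fps, fps[1:])]

gains = per_bin_gains(avg_fps)
total = round((avg_fps[-1] / avg_fps[0] - 1) * 100, 2)

for clk, g in zip(clocks[1:], gains):
    print(f"{clk} MHz: +{g}%")
print(f"total: +{total}% for +{clocks[-1] - clocks[0]} MHz "
      f"({(clocks[-1] / clocks[0] - 1) * 100:.0f}% core OC)")
```

Plugging in the Rise of the Tomb Raider numbers from the update below the table works the same way.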







.
I uninstalled everything with Unigine, 3DMark, etc in its name and will only game from now on.


Spoiler: Warning: Spoiler!



2000 
2012 
2025 
2038 
2050 
2063 
2076 
2088 
2100 



As can be seen in the screenshot below I just used the slider for the core OC. I put the first 3 sliders all the way to the right, left the memory clock at stock and only changed the core clock from +100 to +190 (2000-2100) so as to keep the variables to a minimum (e.g. I obviously don't need 1.062 for 2000 on the core).


Spoiler: Warning: Spoiler!






Card is an EVGA FE under water ("EK-FC Titan X Pascal - Acetal+Nickel", 3*240rad - using Vardar 2200rpm fans, D5 pump) and room temperature was ~25C.
Original BIOS, no shunt mod, CPU is a 4790k @ 4.7GHz @ 1.296v.

-- Update with results from the Rise of the Tomb Raider in-game benchmark
2000 --- 121.17
2012 --- 121.80 (+0.52%)
2025 --- 122.53 (+0.6%)
2038 --- 123.22 (+0.56%)
2050 --- 123.82 (+0.49%)
2063 --- 124.17 (+0.28%)
2076 --- 124.24 (+0.06%)
2088 --- 126.54 (+1.85%)
2100 --- 127.49 (+0.75%)

So, in total 6.32 fps (+5.2%) for a 100MHz (5%) OC in this one.
There is a big jump going from 2076 -> 2088 in this one. It seemed weird, so I tried a second time but got similar results (within 0.3 fps).
For some reason unknown to me I couldn't take screenshots, so I took photos with my phone in case anyone is interested in detailed / per-test results (though not tonight, I'm supposed to wake up in 5 hours to go to work)


Spoiler: Warning: Spoiler!







Sorry for any grammar mistakes but English is not my first language and I did write this very quickly - didn't check it afterwards.


----------



## fisher6

Anybody care to rate this TIM job:



Gonna try kryonaut for the first time.


----------



## dunbheagan

Quote:


> Originally Posted by *Slackaveli*
> 
> Freaking AWESOME feeling, even if it is much adeu about nothing.


I am 100% with you


----------



## dunbheagan

Quote:


> Originally Posted by *fisher6*
> 
> Anybody cares to rate this tim job:
> 
> 
> 
> Gonna try kryonaut for the first time.


Hey man, somebody puked on your card...


----------



## nrpeyton

Quote:


> Originally Posted by *dunbheagan*
> 
> When i removed my cooler on the FE, the TIM visually made quite a good impression. It was not hardened or too much or too less:
> 
> 
> 
> Anyway, i changed it to Conductonaut.
> 
> 
> 
> 
> 
> But why are you asking if you plan to watercool it? I mean you have to reapply the TIM anyway when you change the cooler, dont you?


Because I like to tinker about.

And if I take the heatsink and everything off the FE tonight, then it will go much quicker when I do the block next week.

Moreover there's less chance I'll forget where something goes (in a year or so when I sell it) if I've had it apart twice.

Also just pure curiosity.

Anyway, you're using Conductonaut; are you not worried about corrosion or discolouration?
Here's what happened to my copper CPU block after 8 weeks:
 <- right click & open in a new tab, then enlarge for *full size*

It's been scrubbed relentlessly; it's like the LM is permanently "moulded" into the microscopic pores of the copper _(not sure if that's a good or a bad thing)._

Still, I like it.. Conductonaut on GPU.. hardcore


----------



## Mrip541

Quote:


> Originally Posted by *nrpeyton*
> 
> You could try overclocking your card, and see if it remains sable with a little boost. (If it does then you know its a software issue)?
> 
> If not then it could be an unlucky silicon lottery.
> 
> If that's the case (poor lottery); I'd try the following first in this order...


Thanks for the tips. I played around with voltage and frequency but couldn't resolve the issues even with a huge underclock. I'll play around a bit more but who knows. This is what I get for cheating on EVGA... One other annoying bit is that they left a little piece of plastic screwed between the fan and the card, so it made a helicopter sound whenever the fan turned on. Took me 10sec to remove but Gigabyte couldn't be bothered, or they never even tested it.


----------



## nrpeyton

Quote:


> Originally Posted by *Mrip541*
> 
> Thanks for the tips. I played around with voltage and frequency but couldn't resolve the issues even with a huge underclock. I'll play around a bit more but who knows. This is what I get for cheating on EVGA... One other annoying bit is that they left a little piece of plastic screwed between the fan and the card, so it made a helicopter sound whenever the fan turned on. Took my 10sec to remove but Gigabyte couldn't be bothered, or they never even tested it.


aww then maybe it's RMA time.

or you're always going to feel unfaithful, lol

I'd just double check it's not software first (i.e. try it in a different machine, if possible) or with a clean install.
_What about a drive you don't use often?_ (or even use msconfig to configure a minimal, diagnostic-style boot)

I noticed that with the latest version of MSI Afterburner, RivaTuner always crashes The Witcher 3 back to the desktop. Point is, it can often be some silly conflict between pieces of software.

Could even be a loose cable. Or a background task.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *nrpeyton*
> 
> Because I like to tinker about.
> 
> And if I take the heatsink and everything off the FE tonight, then it will go much quicker when I do the block next week.
> 
> Moreover there's less chance I'll forget where something goes (in a year or so when I sell it) if I've had it apart twice.
> 
> Also just pure curiosity.
> 
> Anyway you're using conductonaut; you not worried about corrosion or discolouration?
> Here's what happened to my copper CPU block after 8 weeks:
> <- right click & open new tab then enlarge for *full size*
> 
> It's been scrubbed like relentlessly-- it's like the LM is permanently "moulded" into the microscopic pores of the copper. _(not sure if that's a good or a bad thing).
> 
> _Still - i like it.. conductonaut on GPU.. hardcore


It won't hurt its performance; it's just like a thin film on the surface of the die.


----------



## nrpeyton

Quote:


> Originally Posted by *SlimJ87D*
> 
> It won't hurt its performance, it's like a film die of the surface.


Aye; I agree.

Temps certainly didn't look any different, after.

What I have always wondered, though, is if EK would still honour the warranty. (Not that I can imagine anything ever going wrong with a block -- it has no moving parts, etc.) But I'm still curious.

Moreover, is any discolouration of the GPU die possible?

My CPU IHS didn't look _too_ badly affected, although there did seem to be a slight "fading" of the text etched on top of it. _(8 weeks)_

Due to using a chiller now when doing anything partially extreme, I can't use LM anyway. It's ineffective below about 8c _(if I recall correctly)._

Anyway it netted me about 3 degrees C I think, at the time. (compared to Kryonaut).

*So all-in on Conductonaut, you're talking 6 degrees C.* If you think about it:
normal paste --> Kryonaut: 3c
Kryonaut --> Conductonaut: 3c
*= 6c*

Could be worthwhile as a re-paste on air with the stock FE cooler, at 6c of a difference. I do have some left ;-)
hmm... got me thinking now... lol

what you guys think? 6c would certainly be worth it until I get my block. ;-)


----------



## done12many2

Quote:


> Originally Posted by *nrpeyton*
> 
> Because I like to tinker about.
> 
> And if I take the heatsink and everything off the FE tonight, then it will go much quicker when I do the block next week.
> 
> Moreover there's less chance I'll forget where something goes (in a year or so when I sell it) if I've had it apart twice.
> 
> Also just pure curiosity.
> 
> Anyway you're using conductonaut; you not worried about corrosion or discolouration?
> Here's what happened to my copper CPU block after 8 weeks:
> <- right click & open new tab then enlarge for *full size*
> 
> It's been scrubbed like relentlessly-- it's like the LM is permanently "moulded" into the microscopic pores of the copper. _(not sure if that's a good or a bad thing).
> 
> _Still - i like it.. conductonaut on GPU.. hardcore


Use metal polish such as Brasso or Never Dull and the discoloration will wipe away easily. Works on the block and the CPU IHS.

It literally makes them look brand new.

Make sure you follow up with a good cleaning with alcohol to remove any residual polish.

Good luck.


----------



## Newtocooling

Can someone let me know if this overclock looks decent for a first-timer?












I don't really understand why MSI Afterburner is telling me I'm limited by Temp, Power, and Voltage??

I have a Heatkiller block on the card and I don't go over 31C under load.

GTA V also seems to be doing a lot of stuttering, even though the FPS stays around 85 to 100.


----------



## Bishop07764

Been testing out The Witcher 3 for stability. Where is the best place on the map to test my clocks? Tried it for a couple of hours without crashes so far. Maxed at 1440p, including HairWorks. Haven't really seen any downclocking, except on the map and inventory screens anyway. Been sticking pretty close to the 120% power limit the majority of the time.


----------



## nrpeyton

Quote:


> Originally Posted by *done12many2*
> 
> Use metal polish such as Brasso or Never Dull and the discoloration will wipe away easily. Works on the block and the CPU IHS.
> 
> It literally makes them look brand new.
> 
> Make sure you follow up with a good cleaning with alcohol to remove any residual polish.
> 
> Good luck.


+1


----------



## KedarWolf

Quote:


> Originally Posted by *Bishop07764*
> 
> Been testing out the Witcher 3 for stability. Where is the best place in the map to test my clocks? Tried it for a couple hours without crashes so far. Maxed 1440p including hair works. Haven't really seen any downclocking except for the map and inventory screens that I've noticed anyway. Been sticking pretty close to 120% power limit for the majority of the time.


With no shunt mod, I tested Rise of the Tomb Raider in DirectX 12. It's a very power-hungry game; I had to run 2012 at 0.962v to keep it from hitting the power limit.


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> My apologies; I stand corrected
> 
> "Shunt mod" it is, lol


At least you weren't talking about shirts and forgetting the r









I think we are all probably guilty of skim reading

Quote:


> Originally Posted by *Lefty23*
> 
> I know this is not enough data to reach any solid conclusions but here are the results of the ingame benchmark from Ghost Recon Wildlands (1440p Ultra preset) in all bins from 2000-2100. Unfortunatelly, I don't have a lot of free time but this benchmark is really short so the whole thing took like 15 minutes
> 
> 
> 
> 
> 
> 
> 
> .
> 2000 --- 61.63 Avg FPS
> 2012 --- 61.90 (+0.44%)
> 2025 --- 62.14 (+0.38%)
> 2038 --- 62.43 (+0.47%)
> 2050 --- 62.56 (+0.21%)
> 2063 --- 62.84 (+0.45%)
> 2076 --- 63.11 (+0.43%)
> 2088 --- 63.50 (+0.62%)
> 2100 --- 64.51 (+1.6%)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> So, in total 2.98fps (+4.84%) for 100Hz (5%) OC in this specific game benchmark (screenshots in the spoiler).
> As I said in another thread a few days ago, i don't believe that there are people that can actually tell the difference between 62 and 65 fps.
> Even at higher fps that is what, 120 to 126. Again I don't think this is really noticeable or, something to worry about unless you intend to bench competitively.
> As far as I'm concerned, I realised a couple of days ago that I have this card in my "gaming" PC for over a month and gamed for less than 10 hours
> 
> 
> 
> 
> 
> 
> 
> .
> I uninstalled everything with Unigine, 3DMark, etc in its name and will only game from now on.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 2000
> 2012
> 2025
> 2038
> 2050
> 2063
> 2076
> 2088
> 2100
> 
> 
> 
> As can be seen in the screenshot below I just used the slider for the core OC. I put the first 3 sliders all the way to the right, left the memory clock at stock and only changed the core clock from +100 to +190 (2000-2100) so as to keep the variables to a minimum (e.g. I obviously don't need 1.062 for 2000 on the core).
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Card is an EVGA FE under water ("EK-FC Titan X Pascal - Acetal+Nickel", 3*240rad - using Vardar 2200rpm fans, D5 pump) and room temperature was ~25C.
> Original BIOS, no shunt mod, CPU is a 4790k @ 4.7GHz @ 1.296v.
> 
> Sorry for any grammar mistakes but English is not my first language and I did write this very quickly - didn't check it afterwards
> 
> 
> .


Well deserved first +1. I am surprised that the performance scales so consistently.
Yeah, maybe we need an FPS test for those who say they can tell the difference between 60 and 65 FPS. I think I can tell when it is a huge difference, like 45 FPS vs 60 FPS.
Above 50 FPS is where I like to be with games, unless it is an online FPS; then 100+ FPS, because every little advantage helps.
Don't worry about the grammar, it is spot on. I read my posts sometimes and realise I wrote them in haste with poor grammar, and I only speak English.


----------



## keikei

Quote:


> Originally Posted by *Bishop07764*
> 
> Been testing out the Witcher 3 for stability. Where is the best place in the map to test my clocks? Tried it for a couple hours without crashes so far. Maxed 1440p including hair works. Haven't really seen any downclocking except for the map and inventory screens that I've noticed anyway. Been sticking pretty close to 120% power limit for the majority of the time.


Any large city. Those should be very taxing.


----------



## KraxKill

Quote:


> Originally Posted by *feznz*
> 
> At least you weren't talking about shirts and forgetting the r
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think we are all probably guilty of skim reading
> well deserved first +1 I am surprised that the performance is on scale
> Yeah maybe we need a FPS test for those who say they can tell the difference between 60 and 65 FPS I think I can tell when it is a huge difference like 45FPS and 60FPS
> Above 50FPS is where I like to be with games unless it is FPS online then 100+ FPS because every little advantage helps.
> don't worry about the grammar it is spot on I read my posts sometimes ad realise I wrote them in a haste with poor grammar and I only speak English.


I'd really like to see a fraps output on a similar scale, because I believe that min fps is where you will see the greatest impact from higher OCs. If the 0.1% and 1% lows climb with the averages, the greatest perceptible impact from all this OC'ing is at the lower end of the fps scale. A difference between 28 and 34 is a lot more noticeable than the difference between 100 and 110.

It's not as much about the max fps, or even the averages once you get over 100 or so, but more about the raised low-fps numbers, which translate very perceptibly to the user experience.

Just playing devil's advocate, batting for the OC'oholics like myself: if the scaling of the low fps is better than the avg or max fps, chasing those last few MHz is not as frivolous as it may seem.

GNexus has some nice reviews accounting for increases in % lows due to overclocks, and I think he makes some very valid points.
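To make the 1% / 0.1% lows point concrete, here's a rough Python sketch of how you could compute them from a FRAPS-style frametime dump. This is not any particular reviewer's exact method (outlets aggregate slightly differently); the convention below, averaging the slowest slice of frames, and the toy frametime data are my own assumptions.

```python
# Compute average FPS and 1% / 0.1% lows from per-frame render times (ms).
def fps_stats(frametimes_ms):
    frames = sorted(frametimes_ms, reverse=True)  # slowest frames first

    def low(pct):
        # Average the slowest pct of frames, then convert ms/frame to FPS.
        n = max(1, int(len(frames) * pct))
        worst = frames[:n]
        return 1000.0 / (sum(worst) / len(worst))

    avg = 1000.0 / (sum(frames) / len(frames))
    return {"avg": avg, "1% low": low(0.01), "0.1% low": low(0.001)}

# Toy data: mostly ~10ms frames (100 FPS) with a handful of 30ms hitches.
sample = [10.0] * 990 + [30.0] * 10
print({k: round(v, 1) for k, v in fps_stats(sample).items()})
```

Note how a few hitches barely move the average but dominate the lows; that asymmetry is exactly why the lows track perceived smoothness better than the average does.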


----------



## feznz

Quote:


> Originally Posted by *KraxKill*
> 
> I'd really like to see a fraps output of a similar scale. Because I believe that the min fps is where you will see the greatest impact from higher OC's If the .01 and 1% lows climb with the averages the greatest perceptible impact from all this OC'ing is at the lower end of the fps scale. A difference between 28 and 34 is a lot more noticible than the difference between 100 and 110.
> 
> It's not as much about the max fps or even averages after you get over 100 or so but more about the raised low fps numbers which translate very perceptibly to the user experience.


So true, but looking back at Lefty23's post, the min FPS remains almost identical between 2000MHz and 2100MHz, within about 0.5 FPS margin of error. As Lefty23 said, it is inconclusive, but it does give a good indication.


----------



## Lefty23

Quote:


> Originally Posted by *feznz*
> 
> well deserved first +1 I am surprised that the performance is on scale


Thanks for the rep

I updated my previous post with numbers from the Rise of the Tomb Raider ingame benchmark in case you are interested.
Similar results really 121.17->127.49 fps which is about 5.2% increase.

Scaling seems weird to me too, as you usually lose some performance at higher clocks. Maybe it has to do with the fact the card is not at its limit yet?
I have benched up to 2126-2139 though I'm heavily limited by power since I need 1.081-1.093 for those clocks.
Or maybe because these are still benchmarks so drivers are "optimised" for them?

Anyway, in case this is useful: from another user's question in the Superposition thread, I checked 4K Supo results at stock vs my gaming OC vs my max OC:
"For Superposition 4Kopt, going from no core/memory OC (1911c/11000m), to my 24/7 gaming profile (2100c/12000m), to my max to date (2126c/12555m):
Average fps: 71.58 --> 77.50 (+8.3%) --> 79.38 (+2.4%)"
Again, these are barely noticeable differences - or maybe as I'm getting older my sight is going bad...


----------



## fisher6

Reseating the block on my GPU and switching to Kryonaut has reduced my temps by almost 10C, to a 47C max in ME:A. Kryonaut absolutely helps, but I think so did refitting the block and tightening the screws around the core. Also switched to new fluid.


----------



## airisom2

Heads up, the EVGA Hybrid 1080Ti kit is in stock.

https://www.evga.com/products/Product.aspx?pn=400-HY-5388-B1

If you want to save a bit and don't mind not using the included shroud (VRM and VRAM temps will be fine, I can confirm), you can purchase the 1080 kit and only use the CLC. B&H is backordered for a couple of days, and it's out of stock everywhere else, though.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Newtocooling*
> 
> Can one someone let me know if this overclock for a first timer looks decent?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't really understand my MSI is telling me I'm limited by Temp, Power, and Voltage??
> 
> I have a Heatkiller block on the card and I don't go over 31C on load.
> 
> GTA V seems to be doing a lot of stuttering as well to me, even though the FPS is staying around 100 to 85.


Dude, smooth out that curve; you don't want an exponential jump to your final OC.


----------



## Benny89

Quote:


> Originally Posted by *Bishop07764*
> 
> Been testing out the Witcher 3 for stability. Where is the best place in the map to test my clocks? Tried it for a couple hours without crashes so far. Maxed 1440p including hair works. Haven't really seen any downclocking except for the map and inventory screens that I've noticed anyway. Been sticking pretty close to 120% power limit for the majority of the time.


Not in the large cities, as you'll be CPU-limited there and the GPU won't be pushed as hard.

You want windy weather with rain in the forest, where the trees around you are moving a lot. Jump on a horse and speed through the forest in that weather. This is where I hit the Power Limit most in W3, as it takes your card to its limits.

I never hit the Power Limit in large cities: the CPU was the bottleneck there, and there aren't many graphical/physics effects in a city, as everything is static and not moving much.


----------



## dunbheagan

Quote:


> Originally Posted by *nrpeyton*
> 
> Aye; I agree.
> 
> Temps certainly didn't look any different, after.
> 
> What I have always wondered though; is if EK would still honour warranty. (Not that I can imagine anything ever going wrong with a block) -- it's no moving parts e.t.c. But I'm still curious.
> 
> Moreover is any discolouration to the GPU DIE possible?
> 
> My CPU IHS didn't look _too_ badly affected. Although there did seem to be a slight "fading" of the text etched on the top of it. _(8 weeks)_
> 
> Due to using a Chiller now when doing anything partially-extreme; I can't use LM anyway. It's inaffective below about 8c _(if i recall correctly)._
> 
> Anyway it netted me about 3 degrees C I think, at the time. (compared to Kryonaut).
> 
> *So all-in on Conductonaut; you're talking 6 degrees C.* If you think about it:
> normal paste to kryonaut --> 3c
> kryonaut --> conductonaut -> 3c
> *=6c*
> 
> Could be worthwhile as a re-paste on air using stock FE at 6c of a difference. I do have some left ;-)
> hmm... got me thinking now... lol
> 
> what you guys think? 6c would certainly be worth it until I get my block. ;-)


I would not use Conductonaut short-term, because I think it will be harder and more dangerous to remove than normal paste. I don't think it is worth the hassle for one week.

Honestly, I knew about the possible warranty problems and the staining of the waterblock, but I didn't care. I haven't RMA'd anything in my whole life. I am not planning to disassemble the waterblock ever again, and when I sell the card, I'll sell it as it is. If I get less money for it, so be it. Those are the costs of my hobby

Using Conductonaut became a little obsession for me. First I just bought it for the shunt mod, then I decided to delid my 4790k and use it between the CPU die and heatspreader too. Then I thought, why not use it between the heatspreader and the CPU cooler too. The last logical step was to use it on the GPU die as well...

The results are pretty good: the GPU maxes out below 45 degrees and the CPU below 55 degrees after a couple of hours at full load. You'd get these or better results with a good normal paste as well, for sure, but at least I can be certain that the TIM is no bottleneck for the temps in my system. I do not have the most powerful water loop on earth, because I made some aesthetics-over-performance decisions: I have a 240mm and a 360mm EK radiator (both 25mm thick) and five Corsair SP120 RGB fans. Considering this, I think the temps are pretty good.

Please excuse my spelling, it's already 1am in Germany...


----------



## KedarWolf

If you don't want Chrome to ramp up the clocks on your video card, type chrome://flags/ into the address bar and make sure 'Override software rendering list', 'GPU rasterization', 'Hardware-accelerated video decode', 'Accelerated 2D canvas', and 'Cast Streaming hardware video encoding' are disabled, and that '3D software rasterizer' is enabled.

And in 'Settings', make sure 'Use hardware acceleration when available' is disabled.
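An alternative to flipping flags by hand: Chromium also accepts command-line switches, so a small launcher script can start the browser with GPU acceleration off. This is just a sketch of that approach; `--disable-gpu` is a real Chromium switch, but the binary name and path vary per platform, and command-line switches are not an officially supported interface and can change between versions.

```shell
#!/bin/sh
# Launch Chrome with GPU acceleration disabled, so the browser renders in
# software and doesn't ramp the video card's clocks. Adjust the binary name
# for your platform (e.g. "chrome" or "chromium" instead of google-chrome).
google-chrome --disable-gpu "$@"
```

On Windows the same switch can be appended to the shortcut's Target field instead.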


----------



## nrpeyton

Quote:


> Originally Posted by *dunbheagan*
> 
> I would not use conductonaut for short-time, because i think it will be harder and more dangerous to remove it than normal paste. I dont think it is worth the hassle for one week.
> 
> Honestly, i knew about possible warranty problems and staining of the waterblock, but i didnt care. I havent rma anything in my whole life. I am not planning to disassemble the waterblock ever again, and when i sell the card, i sell it like it is. If i get less money for it, so be it. This are the costs for my hobby
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Using the Conductonaut got a little obsession for me. First i just bought it for the shunt mod, then i decided to delid my 4790k and use it between cpu die and heatspeader too. Then i thought, why not use it between heatspeader and cpu cooler too. Last logical step was to use it on the gpu die as well...
> 
> The results are pretty good, gpu maxes out below 45 degrees and cpu below 55 degree under couple hours full load. You get these or better results with a good normal paste as well for sure, but at least i can be certain, that the tim is no bottleneck for the temps in my system. I do not have the most powerful waterloop on earth, because i made some asthetic over performance decicions. i have a 240 25mm and a 360 25mm EK radiator and five corsair sp120 rgb fans. considering this, i think the temps are pretty well.
> 
> please excuse my spelling, its already 1am in germany...


A bit concerned that some people on custom loops are getting max temps of 38c at load, while others are still sitting nearly 10c higher.

All parties had *more than enough* radiator space for 1 CPU & 1 GPU, plus *plenty* more left over..

I assume most are using the EK blocks, with a few exceptions. And everyone seems to be pasting with Kryonaut or liquid metal..

When I get my block next week, I need to decide whether to connect my water chiller back into the loop or buy another radiator.

Currently I only have a medium-thickness 360mm EK radiator and a top-of-the-range D5.

Since I disconnected the water chiller (it's on the CPU only just now), it still just doesn't feel like I'm getting the temps I used to when I first installed it all about 7 months ago (before I got the chiller).

Saying that, I have changed my motherboard from one that had no LLC setting to one that does. However, it's at its lowest setting (meaning vcore matches what I enter in the BIOS -- in fact it's pretty bang-on in any situation).


----------



## dunbheagan

Quote:


> Originally Posted by *nrpeyton*
> 
> A bit concerned how some people on custom loops are getting max temps of +38 at load; while others are still sitting nearly 10c higher.
> 
> All parties had *more than enough* radiator space for 1 CPU & 1 GPU. Plus *plenty* more left over..
> 
> I assume most are using the EK blocks; with a few exceptions. And everyone seems to be pasting with Kryonaut.
> 
> When I get my block next week I need to decide whether to connect my Water Chiller back into the loop or buy another radiator.
> 
> Currently I have a medium thickness 360mm EK radiator and a top-of-the-range D5.
> 
> Since I disconnected the Water Chiller; (on CPU only just now) it still just doesn't feel like I'm getting the temps I used to when I first installed it all about 7 months ago.
> 
> Saying that, I have changed my motherboard from one that had no LLC setting to one that does. However it's at its lowest setting (meaning v-core matches what I enter in BIOS).


It really matters how you measure max temps. If I do one, two or three runs of Superposition 4K, my GPU stays under 40 degrees too. But I did a full-night Fire Strike Ultra loop to measure the max temps, and up to a certain point the water temp slowly rises.
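If you want to put numbers on that slow creep: the water keeps warming until the radiators shed exactly the power you're putting in, so short benchmark runs end before the loop settles. Here's a quick first-order sketch; every constant in it is an illustrative guess, not a measurement of anyone's loop:

```python
def coolant_warmup(power_w=450.0, ambient_c=24.0, rad_w_per_c=45.0,
                   loop_j_per_c=8000.0, hours=8.0, dt_s=10.0):
    """Lumped thermal model of a custom loop under constant load.

    power_w      : heat dumped into the water by CPU + GPU (guess)
    rad_w_per_c  : radiator dissipation per degree over ambient (guess)
    loop_j_per_c : heat capacity of coolant + blocks + rads (guess)
    Returns the water temperature history over the run.
    """
    temp_c = ambient_c
    history = []
    for _ in range(int(hours * 3600 / dt_s)):
        # Net heat into the water = input power minus what the rads reject.
        net_w = power_w - rad_w_per_c * (temp_c - ambient_c)
        temp_c += net_w / loop_j_per_c * dt_s
        history.append(temp_c)
    return history

temps = coolant_warmup()
# Water settles at ambient + power/rad capacity: 24 + 450/45 = 34 C.
```

A real loop takes far longer to settle than this toy model suggests (case air warms up too), which is exactly why a three-run Superposition session under-reports your true max temps versus an overnight loop.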


----------



## alucardis666

Is MSI Afterburner's latest beta still 4.4.0.9855 Beta6?

Thanks


----------



## nrpeyton

Quote:


> Originally Posted by *dunbheagan*
> 
> It really matters how you measure max temps. If I do one, two or three runs of Superposition 4K, my GPU stays under 40 degrees too. But I did a full-night Fire Strike Ultra loop to measure the max temps, and up to a certain point the water temp slowly rises.


I see...

That's the beauty of the chiller, to be honest.

Because as long as you get a good mount and paste, you pretty much get to _choose_ the GPU temperature you want.

Just set the chiller coolant target (thermostat) to 10 degrees lower than you want your core temp, and bang.. you have exactly the core temp you want (within 2-3 degrees).

For instance, if I wanted a GPU core temp of 20c in Furmark, I'd probably have to set the chiller to 8c.

If I wanted 10c during normal gaming, a setting of 10c would probably suffice.

During idle, the GPU temp usually matches the coolant temp within 1-3 degrees.

Watching the core temp drop the SECOND you terminate the benchmark: before you can blink, your GPU temp is matching the coolant temp again.

The only downside is that the chiller is awfully expensive: £500 (or about 600 in U.S. dollars, I believe). And it has moving parts (a compressor, for example), so wear-and-tear is a real thing..

After the initial novelty wore off, I now prefer not to use it unless I'm actually tinkering/experimenting or benching (and usually I'm aware of my plans a few days in advance).
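The "coolant target minus the load delta" rule is just arithmetic, but for anyone dialing it in, here's the sketch. The deltas are the rough figures from my post above (roughly 10-12c core-over-coolant in Furmark, near 0 at light load), not measured constants; your loop will differ:

```python
def chiller_setpoint(target_core_c, core_over_coolant_c=10.0):
    """Pick a chiller thermostat setting for a desired GPU core temp.

    core_over_coolant_c is how far the core sits above the water
    under your load (assumed values, per the post above).
    """
    return target_core_c - core_over_coolant_c

print(chiller_setpoint(20, 12))  # Furmark example from above -> 8
print(chiller_setpoint(10, 0))   # light gaming load -> 10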


----------



## KedarWolf

Quote:


> Originally Posted by *alucardis666*
> 
> Is MSI After Burner's latest beta still 4.4.0.9855 Beta6?
> 
> Thanks


http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta7.rar is the latest but one person had issues with it here, said it corrupted their Windows install.

I think it just may have been an ID 10T error though.


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta7.rar is the latest but one person had issues with it here, said it corrupted their Windows install.
> 
> I think it just may have been an ID 10T error though.


I have also experienced issues since installing the 4.4 beta (to unlock the voltage slider, as suggested on the opening page).

I think the issue is with the accompanying "RivaTuner Statistics Server" app that is also installed.

If Riva is running when I start The Witcher 3, the game _always_ crashes to desktop within seconds.

If I start W3 first, get into the game and begin playing, then start Riva, everything runs fine.

Riva sits in the background (doesn't even show a tab on the bottom taskbar), so I can quite imagine how someone not noticing this would think their Windows install had become corrupted.


----------



## alucardis666

Quote:


> Originally Posted by *KedarWolf*
> 
> http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta7.rar is the latest but one person had issues with it here, said it corrupted their Windows install.
> 
> I think it just may have been an ID 10T error though.


Lol. Thanks.
Quote:


> Originally Posted by *nrpeyton*
> 
> I have also experienced issues since installing the 4.4 beta (to unlock voltage slider -- as suggested on the opening page).
> 
> I think the issue is with the accompanying "RivaTuner Statistics Server" app that is also installed.
> 
> If riva is running when I start the Witcher 3. The game _always_ crashes to desktop within seconds.
> 
> If I start W3 first; get into the game and begin playing.. then start Riva everything runs fine.


Same issues here with Watchdogs 2, worked fine for a few days, today nothing but crashes...


----------



## nrpeyton

Quote:


> Originally Posted by *alucardis666*
> 
> Lol. Thanks.
> Same issues here with Watchdogs 2, worked fine for a few days, today nothing but crashes...


Try closing Riva, opening the game (getting RIGHT into the game and beginning to play, i.e. past all the menus etc. so you're actually in-game), then restarting Riva.

See if that helps.

Let us know if that fixes it, because if it does we should all maybe submit a bug report. If they get a few reports of the same nature, a hotfix is more likely to be implemented quite quickly.

Would be a shame if people begin re-installing Windows etc., only to have a repeat of the same problem as soon as they re-install Afterburner.
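If the workaround holds up, it's scriptable too. A throwaway Python sketch of "game first, RTSS later" (both paths are hypothetical placeholders, and the delay just needs to outlast your game's menus):

```python
import subprocess
import time

# Hypothetical install paths; adjust for your own system.
GAME_EXE = r"C:\Games\The Witcher 3\bin\x64\witcher3.exe"
RTSS_EXE = r"C:\Program Files (x86)\RivaTuner Statistics Server\RTSS.exe"

def launch_game_then_rtss(game=GAME_EXE, rtss=RTSS_EXE, delay_s=90,
                          run=subprocess.Popen, wait=time.sleep):
    """Start the game first, wait long enough to be past the menus and
    actually in-game, then start RTSS -- the order that avoids the crash."""
    run(game)
    wait(delay_s)  # pick a delay that outlasts the game's menus
    run(rtss)
    return [game, rtss]  # launch order, handy for sanity-checking
```

Nothing clever here; it just encodes the "game before Riva" ordering so you don't forget it on the nights you actually want the overlay.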


----------



## alucardis666

Quote:


> Originally Posted by *nrpeyton*
> 
> Try closing Riva, opening the game (getting RIGHT into the game and beginning to play, i.e. past all the menus etc. so you're actually in-game), then restarting Riva.
> 
> See if that helps.
> 
> Let us know if that fixes it, because if it does we should all maybe submit a bug report. If they get a few reports of the same nature, a hotfix is more likely to be implemented quite quickly.
> 
> Would be a shame if people begin re-installing Windows etc., only to have a repeat of the same problem as soon as they re-install Afterburner.


I just trashed the beta and went back to 4.3









We'll see if I still crash.


----------



## mtbiker033

Quote:


> Originally Posted by *airisom2*
> 
> Heads up, the EVGA Hybrid 1080Ti kit is in stock.
> 
> https://www.evga.com/products/Product.aspx?pn=400-HY-5388-B1
> 
> If you want to save a bit and don't mind not using the included shroud (VRM and VRAM temps will be fine, I can confirm), you can purchase the 1080 kit and only use the CLC. B&H is backordered for a couple days, and it's oos everywhere else, though.


yep ordered mine FINALLY!!! just emailed B&H to cancel my pre-order.









got my tools, got my kryonaut I'm ready


----------



## alucardis666

Quote:


> Originally Posted by *mtbiker033*
> 
> yep ordered mine FINALLY!!! just emailed B&H to cancel my pre-order.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> got my tools, got my kryonaut I'm ready


Same here. Tired of waiting on B&H, I should have mine here Friday


----------



## nrpeyton

Quote:


> Originally Posted by *alucardis666*
> 
> I just trashed the beta and went back to 4.3
> 
> 
> 
> 
> 
> 
> 
> 
> 
> We'll see if I still crash.


Great, let us know how you get on.

Meanwhile, I'll try to track down an email address we can use, if it fixes it for you.


----------



## mtbiker033

Quote:


> Originally Posted by *alucardis666*
> 
> Same here. Tired of waiting on B&H, I should have mine here Friday


awesome!!!







I did the 3-day select and I'm cross-country from EVGA, so I probably won't get mine until Monday. I'll be looking forward to seeing your results with it!!


----------



## alucardis666

Quote:


> Originally Posted by *nrpeyton*
> 
> great let us know how you get on
> 
> meanwhile I'll try and track down an email address we can use. if it fixes it for you.


Will do. Gonna load it up now and see how I fare








Quote:


> Originally Posted by *mtbiker033*
> 
> awesome!!!
> 
> 
> 
> 
> 
> 
> 
> I did the 3 day select and I'm cross country from EVGA I probably won't get mine until Monday so I will be looking forward to seeing your results with it!!


Already know what to expect! I've run the 1080 Hybrid on one of my TIs before. Load temps ~50c, idle ~25c, with a 21-22c ambient room temp.









Really wanna see how my clocks improve/stabilize with the thermal throttle gone.

EDIT: Thanks for the +Rep!


----------



## mtbiker033

Quote:


> Originally Posted by *alucardis666*
> 
> Will do. Gonna load it up now and see how I fare
> 
> 
> 
> 
> 
> 
> 
> 
> Already know what to expect! I've ran the 1080 hybrid on one of my TIs before. Load temps ~50c, idle ~25C 21-22c ambient room temp.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Really wanna see how my clocks improve/stabilize with the thermal throttle gone.
> 
> EDIT: Thanks for the +Rep!


right! me too!!!


----------



## alucardis666

Quote:


> Originally Posted by *nrpeyton*
> 
> great let us know how you get on
> 
> meanwhile I'll try and track down an email address we can use. if it fixes it for you.


Still crashing under 3 minutes of gameplay...


----------



## Blotto80

Anyone using a universal waterblock with the FE? I'm currently running an EVGA Hybrid cooler on my FE and am generally pretty pleased with the performance, but I'd like to integrate it into the loop I'm building (1x 360mm, 1x 240mm). At the moment I'm thinking that if I could leave the stock midplate/blower on for the VRMs and VRAM and throw something like an EK-VGA Supremacy on, it would be at least as good as the Hybrid temp-wise but let me get rid of the extra 120mm rad and fans. I'm having trouble finding one that would definitely fit in the heatsink cutout of the stock midplate without alteration. Any ideas?


----------



## alucardis666

Quote:


> Originally Posted by *alucardis666*
> 
> Still crashing under 3 minutes of gameplay...


Gonna try uninstalling RivaTuner Statistics altogether and see if it helps.


----------



## shadow85

I just ordered 2x GTX 1080 Ti STRIX OC.

Will I be bottlenecked by an i7 5930K @ 4.3GHz at 4K res?


----------



## KedarWolf

Quote:


> Originally Posted by *shadow85*
> 
> I just ordered 2x GTX 1080 Ti STRIX OC.
> 
> Will I be bottlenecked by an i7 5930K @ 4.3GHz at 4K res?


5930k should be fine.


----------



## TheBoom

Quote:


> Originally Posted by *shadow85*
> 
> I just ordered 2x GTX 1080 Ti STRIX OC.
> 
> Will I be bottlenecked by an i7 5930K @ 4.3GHz at 4K res?


4K is the resolution at which you will almost never see a CPU bottleneck, simply because of the GPU overhead required for rendering.


----------



## Slackaveli

Quote:


> Originally Posted by *shadow85*
> 
> I just ordered 2x GTX 1080 Ti STRIX OC.
> 
> Will I be bottlenecked by an i7 5930K @ 4.3GHz at 4K res?


at 4k even a 4-core isn't a bottleneck, you're straight


----------



## Slackaveli

Quote:


> Originally Posted by *mtbiker033*
> 
> yep ordered mine FINALLY!!! just emailed B&H to cancel my pre-order.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> got my tools, got my kryonaut I'm ready


it's been a long wait, huh?


----------



## KedarWolf

Quote:


> Originally Posted by *Slackaveli*
> 
> Quote:
> 
> 
> 
> Originally Posted by *shadow85*
> 
> I just ordered 2x GTX 1080 Ti STRIX OC.
> 
> Will I be bottlenecked by an i7 5930K @ 4.3GHz at 4K res?
> 
> 
> 
> at 4k even a 4-core isn't a bottleneck, you're straight

5930k is 6 cores, 12 threads.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Gonna try uninstalling RivaTuner Statistics altogether and see if it helps.


nooooo. I love RivaTuner. I'd be lost w/o miss Riva!
Quote:


> Originally Posted by *KedarWolf*
> 
> 5930k is 6 cores, 12 threads.


yeah... I know, homey. I was just telling him that even a 4-core isn't a bottleneck in SLI at 4k. I know all the CPUs by number, except some of the Xeons


----------



## shadow85

Quote:


> Originally Posted by *Slackaveli*
> 
> at 4k even a 4-core isn't a bottleneck, you're straight


Sweet. Can't wait till they get here!!!


----------



## Bishop07764

Quote:


> Originally Posted by *alucardis666*
> 
> Gonna try uninstalling RivaTuner Statistics altogether and see if it helps.


I ended up just uninstalling statistics server myself.
Quote:


> Originally Posted by *KedarWolf*
> 
> No shunt mod I tested Rise Of The Tomb Raider DirectX12, it's a very power hungry game, had to do 2012 at .962v to get it not to power limit.


Wow. I might have to try that game out again to see how mine responds. If yours throttled at less than 1v then I might be majorly power limited.
Quote:


> Originally Posted by *keikei*
> 
> Any large city. Those should be very taxing.


I've been running around a fair bit in Novigrad without problems.
Quote:


> Originally Posted by *fisher6*
> 
> Resitting the block on my GPU and switching to Kryonaut has reduced my temps by almost 10C to 47C max temp in ME:A. I think Kryonaut absolutely helps but so did refitting the block and tightening the screws around the core I think. Also switched to new fluid.


Glad to see that it brought your temps back down.
Quote:


> Originally Posted by *Benny89*
> 
> Not in the large cities, as CPU will take a lot of from GPU there.
> 
> You want some windy weather with rain in the forest where trees around you moves a lot due to very windy weather and rain is falling. Jump on horse and speed through forest in such weather. This is were I got most Power Limit in W3 as this takes your card to limits.
> 
> Never got Power Limit in large cities as CPU was helping there and there are no much graphical/physic effects in city as everything is stable there and not moving much.


I might be good then. I played for 4-5 hours or so today. I was all over the map and in all kinds of weather, including storms, riding through the woods. Finally made my card briefly flirt with a 37c core temp. I finished the game a while back and I'm still finding a few random encounters after I thought I got pretty much everything. This game is just huge. Expansion packs are next on the list.


----------



## Bishop07764

Quote:


> Originally Posted by *nrpeyton*
> 
> A bit concerned how some people on custom loops are getting max temps of +38 at load; while others are still sitting nearly 10c higher.
> 
> All parties had *more than enough* radiator space for 1 CPU & 1 GPU. Plus *plenty* more left over..
> 
> I assume most are using the EK blocks; with a few exceptions. And everyone seems to be pasting with Kryonaut or Liq Metal..
> 
> When I get my block next week I need to decide whether to connect my Water Chiller back into the loop or buy another radiator.
> 
> Currently I only have a medium thickness 360mm EK radiator and a top-of-the-range D5.
> 
> Since I disconnected the Water Chiller; (on CPU only just now) it still just doesn't feel like I'm getting the temps I used to when I first installed it all about 7 months ago. (before I got the Chiller).
> 
> Saying that, I have changed my motherboard from one that had no LLC setting to one that does. However it's at its lowest setting (meaning v-core matches what I enter in BIOS; in fact it's pretty bang-on in any situation).


Yeah. Everyone's loop is going to be different though. Routing, air flow, fans, ambient temps, number/size/thickness of rads, mounting. A lot of variables. Hope that you get good enough performance that you won't even need the chiller.


----------



## Slackaveli

Quote:


> Originally Posted by *shadow85*
> 
> Sweet. Can't wait till they get here!!!


you'll crush everything


----------



## BBEG

For anyone interested, EVGA's 1080 Ti / Titan XP hybrid cooler is back in stock. Or was as of a few hours ago when I ordered one. Previous EVGA hybrid cards have always measured and metered well so it should be a worthy upgrade in terms of noise and thermals for whoever managed to pick one up. Bonus that it is covered by EVGA's warranty.

Do we have a favorite TIM replacement? While I'm replacing the cooler I may as well upgrade the TIM if any prove significantly better than others.


----------



## alucardis666

Quote:


> Originally Posted by *BBEG*
> 
> For anyone interested, EVGA's 1080 Ti / Titan XP hybrid cooler is back in stock. Or was as of a few hours ago when I ordered one. Previous EVGA hybrid cards have always measured and metered well so it should be a worthy upgrade in terms of noise and thermals for whoever managed to pick one up. Bonus that it is covered by EVGA's warranty.
> 
> Do we have a favorite TIM replacement? While I'm replacing the cooler I may as well upgrade the TIM if any prove significantly better than others.


They're gone. Sold out within the hour.

As for TIM, I'd recommend this!

https://www.amazon.com/gp/product/B00ZJSDB7W/ref=oh_aui_detailpage_o05_s00?ie=UTF8&psc=1


----------



## mtbiker033

Quote:


> Originally Posted by *Slackaveli*
> 
> it's been a long wait, huh?


I was starting to wonder if they would ever be available!


----------



## alucardis666

Is it possible that Visual C++ is corrupt somehow? I just crashed a minute into Outlast 2 as well. I've uninstalled RivaTuner Statistics and I'm on 4.3 of MSI Afterburner... I also ran DDU and CCleaner and clean-installed the latest driver. Oh, and I've reverted to stock clocks too...

Still crashing...

But where Watch Dogs 2 doesn't give me an error, I just got this in Outlast 2...



Any ideas?









Oh look what got installed today! Must've been a windows update.











I guess I gotta revert to the older version. Thanks *Microsoft*.









*EDIT:* For Ss&Gs, I decided to repair both new versions instead of uninstalling them. Outlast 2 ran solid for 20 minutes before I exited the game.









Now to go back and test Watch Dogs 2 again...

Here's hoping the issue was resolved with that repair. Which means everything I did before was for nothing!









*EDIT 2:* Watch Dogs 2 is stable again too!


----------



## shadow85

Quote:


> Originally Posted by *Slackaveli*
> 
> you'll crush everything


I hope so. I have been waiting since the GTX 9 series to be able to do that @ 4K.


----------



## alucardis666

Quote:


> Originally Posted by *shadow85*
> 
> I just ordered 2x GTX 1080 Ti STRIX OC.
> 
> Will I be bottlenecked by an i7 5930K @ 4.3GHz at 4K res?


Quote:


> Originally Posted by *shadow85*
> 
> I hope so. I have been waiting since the GTX 9 series to be able to do that @ 4K.


Saw your posts and had to chime in,

YMMV *severely* depending on the game... I currently have 2 FE 1080 Tis **Previously 2 Aorus 1080 Tis (sold due to poor thermals) and 1 TXp (yes, the new one. I sent it back.)** and I also used to have an X99 and 6950X. **Now on R7 1700 and X370.** But in both cases, FPS at 4k can be well over 300 in some titles and in the 30s in others with everything turned up. Scaling and optimization are a real PITA. And I don't think Pascal can really deliver a true 4k 60fps gaming experience. **VOLTA and VEGA can't come quick enough if you wanna do 4k 144Hz on one of those new monitors coming out**

Be prepared to turn settings down, and occasionally forgo AA, if you wanna get 60fps in some titles; if you don't wanna compromise there, then game @ 2560x1440. Or get yourself a 3440x1440 or 3840x1600 display. **MUCH MUCH MUCH less demanding than 2160p.**

And as for your choice to go with the Strix in SLI, be careful, as card spacing and case airflow can cook your cards pretty good. If you don't have a board with good slot spacing, or don't plan to get one, I wouldn't recommend stacking AIB cards. Had this issue with the Aorus: the top card would cook and flush heat onto the 2nd card, making card 1 load @ 95c and card 2 load at 80c. *Yikes, I know.* You're better served saving some $$$ and water cooling 2 FEs with blocks or hybrid kits. If you're dead-set on the Strix and don't want to watercool, you're gonna need some VERY good airflow pushing the heat out of your case, and a lot of fans. I'd also suggest re-TIMing the cards, as the stock paste job and TIM itself aren't great. And also use MSI Afterburner to create a custom fan curve for GPU load/gaming, as yes, 0dB is nice and all for web browsing and some FHD video watching, but it can spell thermal issues with your cards under any kind of actual load.

Finally, if you really wanna maximize your gaming, get a *5775c* or a *7700k*; the bigger cache and/or higher clock speeds really shine in gaming (we're talking 5-20fps more over a chip with more than 4 cores!!!) *this is due to most modern games not being well optimized for multi-core systems and topping out at either dual or quad core*. However, if you use your PC as a workstation and do more than just game, then stick with your CPU and OC the crap outta it and your cards to compensate as much as you can and claw back as much of your FPS fluidity as possible. *Personally* I'd get the 7700k as it's newer and can push 5GHz pretty easily with a modest OC; some can even run up to 5.5GHz.









**Fast memory over 3000mhz helps with an extra 2-10fps depending on the game as well.**

*TL;DR:* DO NOT get an AIB card that vents heat into your case for SLI; FE would be better, and even better if you can water/hybrid cool it. You have the wrong CPU for 4k gaming; consider changing it, getting a different display, or gaming at 2k or 21:9 if that's an option. Finally, get fast RAM over 3000MHz if you can / don't already have it.

Hope this info helps, coming from someone with similar specs and more experience than I care to admit.


----------



## Slackaveli

Quote:


> Originally Posted by *shadow85*
> 
> I hope so. I have been waiting since the GTX 9 series to be able to do that @ 4K.


you and me both. I started my 4k dragon chase on the release of the GTX 980. It's nice finally having the frames.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> you and me both. i started my 4k dragon chase on the release of gtx980. it's nice finally having the frames.


And me makes 3.


----------



## pez

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, thanks man. but there is one long haired hippy boy hiding in them girls. He's a pretty boy and it happens all the time (which he HATES, lol, of course). but the two boys are still out numbered and they get crushed by the 3 girls who seem to stick together lol.


Ah I did miss that! I should know better since I have long hair







.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> And me makes 3.


Jerry Garcia- "What a looong, strange trip it's been...."


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> Jerry Garcia- "What a looong, strange trip it's been...."












Assuming Volta ends up 15-30% faster than TXp I'll sell my TIs and either get the GTX 1180 or the next Titan and call it a day. If games don't become more demanding too that is.









A 100Hz (or better) UW monitor @ sub-4K (3440x1440 or 3840x1600) looks VERY appealing considering the grueling requirements for 4K 60FPS locked @ max settings in ALL games.

Maybe next tax season.


----------



## Slackaveli

Quote:


> Originally Posted by *pez*
> 
> Ah I did miss that! I should know better since I have long hair
> 
> 
> 
> 
> 
> 
> 
> .










It's so pretty and blond, ole Josiah's hair. his problem is the twins are as tall as him as girls 2 years his younger and they look so much alike that he gets called a triplet girl all the time lol.

@alucardis666 tax time next year is when volta should appear, too. And we'll have a few new monitors by then, too. For this year I feel good to go.


----------



## pez

Quote:


> Originally Posted by *Slackaveli*
> 
> 
> 
> 
> 
> 
> 
> 
> It's so pretty and blond, ole Josiah's hair. his problem is the twins are as tall as him as girls 2 years his younger and they look so much alike that he gets called a triplet girl all the time lol.
> 
> @alucardis666 tax time next year is when volta should appear, too. And we'll have a few new monitors by then, too. For this year I feel good to go.


One day he'll look back on it and laugh







.

As for on-topic, I finally decided to keep the EVGA Ti SC BE and the EVGA FE that I got through step-up. Returned the TXp and the TXP is still for sale. Now I just need to decide if I want to build a living room system with this extra ITX, i5 and 1070 that I'll have...

I'm going to pop the FE into my rig to see how good of an OC'er it is at 100% fan and then I'll decide if I want to get a AIO for it and put it in my system. If not, it's going to sit at 'stock' in my GFs system.


----------



## EarlZ

I'm planning to get the 1080 Ti AMP Extreme by the end of May, though I'm wondering if it will fit on my Gigabyte G1 Gaming M5. I have a Sound Blaster Z in the very bottom slot (4th slot). The 16x PCIe slot is the very first slot, followed by a 2x slot and a 16x slot, and the AMP Extreme is 3 slots wide?


----------



## AngryLobster

I would not SLI two partner cards. I found anything over 150w TDP in SLI is just noise and heat. I can't imagine 2 of these dumping 300w each right beside each other.

The only viable way is put them under water or use a open air card on top and a FE below.


----------



## KickAssCop

Any EVGA owners? Are the ICX coolers better than the POS ACX coolers? What about FTW3? Any reviews out?


----------



## Slackaveli

Quote:


> Originally Posted by *AngryLobster*
> 
> I would not SLI two partner cards. I found anything over 150w TDP in SLI is just noise and heat. I can't imagine 2 of these dumping 300w each right beside each other.
> 
> The only viable way is put them under water or use a open air card on top and a FE below.


run a hybrid on top and any other you want below works great, too.


----------



## KedarWolf

Quote:


> Originally Posted by *Slackaveli*
> 
> Quote:
> 
> 
> 
> Originally Posted by *AngryLobster*
> 
> I would not SLI two partner cards. I found anything over 150w TDP in SLI is just noise and heat. I can't imagine 2 of these dumping 300w each right beside each other.
> 
> The only viable way is put them under water or use a open air card on top and a FE below.
> 
> 
> 
> run a hybrid on top and any other you want below works great, too.

I've had top water, bottom air before; different series, not Tis.


----------



## Alex132

Quote:


> Originally Posted by *AngryLobster*
> 
> I would not SLI two partner cards. I found anything over 150w TDP in SLI is just noise and heat. I can't imagine 2 of these dumping 300w each right beside each other.
> 
> The only viable way is put them under water or use a open air card on top and a FE below.


I don't think you'd like the 295X2 then.


----------



## pez

Quote:


> Originally Posted by *KickAssCop*
> 
> Any EVGA owners? Are the ICX coolers better than the POS ACX coolers? What about FTW3? Any reviews out?


Outside of the issues with the ACX 3.0 that arose, I'm not sure why they're 'POS'???


----------



## feznz

Quote:


> Originally Posted by *Lefty23*
> 
> Scaling seems weird to me too, as you usually lose some performance at higher clocks. Maybe it has to do with the fact the card is not at its limit yet?
> I have benched up to 2126-2139 though I'm heavily limited by power since I need 1.081-1.093 for those clocks.
> Or maybe because these are still benchmarks so drivers are "optimised" for them?
> 
> Anyway, in case this is usefull, from another question of a user in the Superposition thread I checked 4k Supo results at stock vs my gaming OC vs max OC:
> "For Superposition 4Kopt going from no core/memory OC (1911c/11000m), to my 24/7 gaming profile (2100c/12000m), to my max up to date (2126c/12555m):
> Average fps: 71.58 -- > 77.50 (+8.3%) -- > 79.38 (+2.4%)"
> Again these are barely noticeable differences - or maybe as I'm getting older my sight goes bad...


Thank you for the effort, for the benefit of everyone. I was thinking maybe the great scaling was because you started at the top clock and worked your way down; if you started at the bottom and worked your way up, maybe there might be a difference. But I did notice that 2088MHz seems to be the magic number for ROTTR and Ghost Recon.
Then again, it could simply be the new architecture/BIOS not letting you get near the poor-scaling clocks. Who knows, but you've got one nice FE card there: you have a whole 3% on my Strix, but that's ok, I'd rather have the ROG logo.

Quote:


> Originally Posted by *nrpeyton*
> 
> A bit concerned how some people on custom loops are getting max temps of +38 at load; while others are still sitting nearly 10c higher.
> 
> All parties had *more than enough* radiator space for 1 CPU & 1 GPU. Plus *plenty* more left over..
> 
> I assume most are using the EK blocks; with a few exceptions. And everyone seems to be pasting with Kryonaut or Liq Metal..
> 
> When I get my block next week I need to decide whether to connect my Water Chiller back into the loop or buy another radiator.
> 
> Currently I only have a medium thickness 360mm EK radiator and a top-of-the-range D5.
> 
> Since I disconnected the Water Chiller; (on CPU only just now) it still just doesn't feel like I'm getting the temps I used to when I first installed it all about 7 months ago. (before I got the Chiller).
> 
> Saying that, I have changed my motherboard from one that had no LLC setting to one that does. However it's at its lowest setting (meaning v-core matches what I enter in BIOS; in fact it's pretty bang-on in any situation).


Some people just apply the hardware differently; TIM is probably the main factor. But then, I have dialled down all my fans. I am happy as long as I am under 55°C, and my PC is barely audible. I have been meaning to replace this 60mm UPS fan; it simply roars compared to the PC.
Quote:


> Originally Posted by *Slackaveli*
> 
> at 4k even a 4-core isn't a bottleneck, you're straight


even an old 3770k







I have been meaning to replace it for a while. Still sporting a whole 8GB of RAM seems not to be a problem so far; I just close browsers etc. before gaming. Simply put, I have a rule that I have to double my current build's power to justify an upgrade. The X299 will be pretty appealing for me.
I believe in getting the most out of hardware; it's just a balance of where I spend my wife's money. It can get expensive: my wife got some 1-carat diamond earrings because I got a monitor and a 1080 Ti (we pay a premium; it works out to $1100 USD for one 1080 Ti here in NZ).


----------



## Dasboogieman

Quote:


> Originally Posted by *AngryLobster*
> 
> I would not SLI two partner cards. I found anything over 150w TDP in SLI is just noise and heat. I can't imagine 2 of these dumping 300w each right beside each other.
> 
> The only viable way is put them under water or use a open air card on top and a FE below.


I ran a pair of 290s in Crossfire, air-cooled. I have never, up until that point nor since, seen so much power density in such a small area. I needed a server-like network of fans to feed cold air through the front and vent the hot air out the side panel.

In fact, that particular setup killed my previous Silverstone 1000W PSU, it had a group regulation system so the excessive draw on the 12V rail strained the caps and MOSFETs too much. Got me a 1300W G2 and no problems ever since.


----------



## KickAssCop

Quote:


> Originally Posted by *pez*
> 
> Outside of the issues with the ACX 3.0 that arose, I'm not sure why they're 'POS'???


Poor cooling performance compared to competitor offering.
High noise compared to competitor offering (even now, the SC2 cooler runs at 37 dba vs. 39 dba for reference - source: techpowerup)
Bearing issues.
Coil whine issues.
Kingpin cooling for 980 Ti was also crap (ran hotter than normal classified cooling).

Anyways, I just want to see if FTW3 is on par w/ competitor offering - like 33 dba, 70 C or lower temps, no coil whine etc.


----------



## BrainSplatter

Quote:


> Originally Posted by *AngryLobster*
> 
> I would not SLI two partner cards. I found anything over 150w TDP in SLI is just noise and heat. I can't imagine 2 of these dumping 300w each right beside each other.


You just need the right setup. The first build in my sig has no problem running 2x Zotac 980 Tis @ 1500 MHz on the original coolers, and relatively quietly too.

Key was a mobo with 3 slots of spacing between GPUs, a big case, and a big side-panel fan pulling heat out of the case (I replaced the 200mm NZXT fan with a Bitfenix Spectre 200mm, which is quieter and moves more air).


----------



## trawetSluaP

Quote:


> Originally Posted by *KickAssCop*
> 
> Poor cooling performance compared to competitor offering.
> High noise compared to competitor offering (even now, the SC2 cooler runs at 37 dba vs. 39 dba for reference - source: techpowerup)
> Bearing issues.
> Coil whine issues.
> Kingpin cooling for 980 Ti was also crap (ran hotter than normal classified cooling).
> 
> Anyways, I just want to see if FTW3 is on par w/ competitor offering - like 33 dba, 70 C or lower temps, no coil whine etc.


Agreed. The ACX was utter crap!


----------



## ocCuS

According to videocardz.com, MSI is launching their first custom GTX 1080 Ti with a full cover water block.

https://videocardz.com/newz/msi-launches-geforce-gtx-1080-ti-sea-hawk-ek-x


----------



## Alex132

Quote:


> Originally Posted by *KickAssCop*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pez*
> 
> Outside of the issues with the ACX 3.0 that arose, I'm not sure why they're 'POS'???
> 
> 
> 
> Poor cooling performance compared to competitor offering.
> High noise compared to competitor offering (even now, the SC2 cooler runs at 37 dba vs. 39 dba for reference - source: techpowerup)
> Bearing issues.
> Coil whine issues.
> Kingpin cooling for 980 Ti was also crap (ran hotter than normal classified cooling).
> 
> Anyways, I just want to see if FTW3 is on par w/ competitor offering - like 33 dba, 70 C or lower temps, no coil whine etc.
Click to expand...

Keep in mind they are smaller and have more aggressive fan curves. You can change that yourself and achieve lower noise.

Also, bearing noise and coil whine? Do you have a source showing more than one person having these issues?

edit - you meant ACX, not iCX. Not sure about the ACX 3.0 beyond the design being a bit dated and the thermal pad issues.


----------



## Dasboogieman

Quote:


> Originally Posted by *Alex132*
> 
> Keep in mind they are smaller and have more aggressive fan-curves. You can change that yourself and achieve lower noise.
> 
> Also bearing noise and coilwhine noise? Source on more than just ONE/etc. person(s) having this issue?
> 
> edit- you meant ACX not iCX. Not sure about ACX3.0 much beyond the design was a bit too dated and thermal pad issues.


They also dropped the ball with the fans. I don't know where EVGA sourced their fans, but those motors had the most irritating sound profile; they sounded more annoying than their dBA rating suggests. Thus far, the best fans have been either Gigabyte Aorus or Asus Strix.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *AngryLobster*
> 
> I would not SLI two partner cards. I found anything over 150w TDP in SLI is just noise and heat. I can't imagine 2 of these dumping 300w each right beside each other.
> 
> The only viable way is put them under water or use a open air card on top and a FE below.


There are a lot of cases with a side fan for exactly this reason.










If you had a case like this, SLI with partner cards shouldn't be a problem.


----------



## BBEG

Quote:


> Originally Posted by *Dasboogieman*
> 
> I ran a pair of 290s in crossfire aircooled. I have never up until that point, nor since then, seen so much power density in such a small area before. I needed a server like network of fans to feed cold air through the front and vent the hot air out the side panel.
> 
> In fact, that particular setup killed my previous Silverstone 1000W PSU, it had a group regulation system so the excessive draw on the 12V rail strained the caps and MOSFETs too much. Got me a 1300W G2 and no problems ever since.


Unless your 290s were voltage unlocked, it's difficult to imagine any system with Crossfire 290s drawing over 1 kW and damaging the PSU (overclocked Bulldozer aside). Faulty 12V rail?


----------



## Clukos

Quote:


> Originally Posted by *ocCuS*
> 
> According to videocardz.com, MSI is launching their first custom GTX 1080 Ti with a full cover water block.
> 
> https://videocardz.com/newz/msi-launches-geforce-gtx-1080-ti-sea-hawk-ek-x


That looks nice. It also looks like they are using the Gaming X PCB with 2x 8-pin connectors instead of the reference PCB everybody else is using.


----------



## BBEG

Quote:


> Originally Posted by *alucardis666*
> 
> They're gone. Sold out within the hour.
> 
> As for TIM, I'd recommend this!
> 
> https://www.amazon.com/gp/product/B00ZJSDB7W/ref=oh_aui_detailpage_o05_s00?ie=UTF8&psc=1


Wow, that went quick. Glad I jumped on it when I got the email.

I had a quick exchange with EVGA and they assured me that using my own thermal compound, in particular the one you suggested, would not void the warranty. I'm glad EVGA is willing to warranty the card with the cooler correctly installed (this was part of why I wanted their hybrid cooler), and I'm doubly glad they'll let me use my own goop with it. I look forward to the lower noise and temps. This reference cooler is a little better in tone than my old 680 coolers, but not by much, and the 1080 Ti heats up much faster. At least with the 680s I could keep them under the throttle limit at 100% fan speed, unbearable as that was.


----------



## Sweetwater

How low can you go!?

This is an overclocking forum, and it's usually about how high you can go, but I'm having fun undervolting this Strix. So far I'm at 2000 MHz core / 6003 MHz memory at 0.950 V, stable in Witcher 3, Mass Effect Andromeda, and Wildlands. At 2050/6003 @ 1.025 V my temp topped out at 57°C; now it tops out at 52°C. I need to do more testing and see if I can go lower, haha.

In the Wildlands benchmark, 1440p Ultra preset:

2050/6003 @ 1.025 V: Avg FPS 63.5
2000/6003 @ 0.950 V: Avg FPS 61.5
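Those numbers line up with the usual first-order rule that dynamic power scales with frequency times voltage squared. A rough sketch of the trade-off (clocks and voltages are the figures above; this ignores static leakage and board power, so treat it as an estimate only):

```python
def rel_dynamic_power(f_old, v_old, f_new, v_new):
    """First-order CMOS estimate: dynamic power scales with f * V^2."""
    return (f_new * v_new ** 2) / (f_old * v_old ** 2)

# 2050 MHz @ 1.025 V down to 2000 MHz @ 0.950 V
power = rel_dynamic_power(2050, 1.025, 2000, 0.950)
fps = 61.5 / 63.5  # measured Wildlands average FPS ratio
print(f"~{(1 - power) * 100:.0f}% less power for ~{(1 - fps) * 100:.0f}% fewer FPS")
```

By this estimate the undervolt trades roughly 16% of the power for about 3% of the frames, which is why the temps drop so noticeably.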


----------



## done12many2

Picked up a couple of better cards and things are much better. I'm not sure I'll even bother shunt modding this pair, as they run a lot better than the last. Much closer matched cards.


----------



## Addsome

What's the best method to apply Grizzly Kryonaut to the GPU die? The X method, or spreading it with the included spreader? I originally had some Noctua paste on, applied as three dots in a triangle, and my load temps were 47°C. My Kryonaut just came in; I applied it with the spreader and my load temps went up to 59°C...


----------



## nrpeyton

Quote:


> Originally Posted by *Addsome*
> 
> Whats the best method to apply grizzly kryonaut to the GPU die? X method? Or spread it with the included spreader? I originally had some noctua paste on using three dots in a triangle and my temps at load was 47c. My grizzly kryo just came in and I applied it using the spreader and my temps under load went up to 59c....


I found really good results spreading it with the included spreader: a nice thin layer, but making sure every square mm is completely covered.

Then try to bring the block down evenly on all four corners.

Tighten from corner to corner, alternating between corners.

Has anyone done the *Shunt Mod* on a card that is sitting vertically in the mobo? (I.e., mobo laid flat on a bench table instead of vertical in a tower case.)

I'm worried about the liquid metal running and dripping onto other components because the card isn't flat.


----------



## ALSTER868

Quote:


> Originally Posted by *Addsome*
> 
> Whats the best method to apply grizzly kryonaut to the GPU die? X method? Or spread it with the included spreader? I originally had some noctua paste on using three dots in a triangle and my temps at load was 47c. My grizzly kryo just came in and I applied it using the spreader and my temps under load went up to 59c....


I think your increased temps are due to improper/weak mounting pressure rather than the method of applying the TIM.







The temps speak for themselves.


----------



## Addsome

Quote:


> Originally Posted by *nrpeyton*
> 
> I found really good results spreading it with the included spreader. A nice thin layer but ensuring every square mm is completely covered.
> 
> Then try bring the block down evenly on all 4 corners.
> 
> Tighten from corner to corner, alternating between corners.
> 
> Anyone done the *Shunt Mod* on a card that is sitting in mobo vertically? (I.E. mobo laid flat on a bench table instead of vertically in a tower case).
> 
> I'm worried about the Liq Metal running and dripping off onto other components due to card not being flat.


I'll give it another try. What would you say is the best way to clean paste off the PCB around the GPU die?


----------



## Addsome

Quote:


> Originally Posted by *ALSTER868*
> 
> I think your increased temps are due rather to improper/weak tightening than to the method of how to apply the TIM
> 
> 
> 
> 
> 
> 
> 
> Temps are saying for themselves


I tightened it the same as last time: put on the cooler and screw it in, alternating which screw you tighten.


----------



## ALSTER868

Quote:


> Originally Posted by *Addsome*
> 
> I've tightened the same as last time. Put on the cooler and screw in alternating which one you tighten.


Try once more. There can't be another reason for such an increase in temps, from 47°C to 59°C, other than that. Just check.


----------



## done12many2

Quote:


> Originally Posted by *nrpeyton*
> 
> I found really good results spreading it with the included spreader. A nice thin layer but ensuring every square mm is completely covered.


Agreed. The included spreader works well and I've had great success with applying nice even coats.

Quote:


> Originally Posted by *nrpeyton*
> 
> Anyone done the *Shunt Mod* on a card that is sitting in mobo vertically? (I.E. mobo laid flat on a bench table instead of vertically in a tower case).
> 
> I'm worried about the Liq Metal running and dripping off onto other components due to card not being flat.


I did the shunt mod on my first pair of 1080 Ti cards, which were vertically mounted. All I can say is be careful and it shouldn't be a problem. I was paranoid and fabricated some barriers out of the plastic from the CLU packaging to prevent any possible runoff from continuing down the PCB. I'm glad I did, because the next day I found one bead of CLU sitting on the plastic. Admittedly, I was a bit aggressive with the CLU application.

Before applying CLU, I coated the PCB surrounding the shunts with liquid tape, and then I used liquid tape to glue the barriers to the PCB just below the shunts.

Hope that helps.


----------



## KedarWolf

Quote:


> Originally Posted by *done12many2*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nrpeyton*
> 
> I found really good results spreading it with the included spreader. A nice thin layer but ensuring every square mm is completely covered.
> 
> 
> 
> Agreed. The included spreader works well and I've had great success with applying nice even coats.
> 
> Quote:
> 
> 
> 
> Originally Posted by *nrpeyton*
> 
> Anyone done the *Shunt Mod* on a card that is sitting in mobo vertically? (I.E. mobo laid flat on a bench table instead of vertically in a tower case).
> 
> I'm worried about the Liq Metal running and dripping off onto other components due to card not being flat.
> 
> Click to expand...
> 
> I did the shunt mod to my first pair of 1080 Ti cards that were vertically mounted. All I can say is be careful and it shouldn't be a problem. I was paranoid and fabricated some barriers out of the plastic from the CLU packaging to prevent any possible runoff from continuing down the PCB. I'm glad I did, because the next day if found one bead of CLU sitting on the plastic. Admittedly, I was a bit aggressive with the CLU application.
> 
> Before applying CLU, I coated the PCB surrounding the shunts with liquid tape and then I used liquid tape to glue the barriers to the PBC just below the shunts.
> 
> Hope that helps.
Click to expand...

When I can afford some Conductonaut (I've been so broke), I'm going to coat the surrounding areas and sides of the shunts with nail polish, let it dry, apply the Conductonaut with a Q-tip (I saw a video on how to do it properly), put the water block back on, and hopefully have no issues.

My card is vertical as well.

Edit: Conductonaut because I heard it's thicker than CLU, and someone else who used it moved the card around and it never budged.


----------



## nrpeyton

Quote:


> Originally Posted by *done12many2*
> 
> Agreed. The included spreader works well and I've had great success with applying nice even coats.
> I did the shunt mod to my first pair of 1080 Ti cards that were vertically mounted. All I can say is be careful and it shouldn't be a problem. I was paranoid and fabricated some barriers out of the plastic from the CLU packaging to prevent any possible runoff from continuing down the PCB. I'm glad I did, because the next day if found one bead of CLU sitting on the plastic. Admittedly, I was a bit aggressive with the CLU application.
> 
> Before applying CLU, I coated the PCB surrounding the shunts with liquid tape and then I used liquid tape to glue the barriers to the PBC just below the shunts.
> 
> Hope that helps.
> 
> 
> Spoiler: Warning: Spoiler!


*Like this??*


----------



## Alwrath

Does anyone know the difference between getting an EVGA 1080 Ti Founders Edition plus the hybrid cooler kit, and the $810 EVGA GeForce 1080 Ti Hybrid SC2? Are they essentially the same thing? I'm asking because one is cheaper than the other. I could step up to the SC2, but then I'd be subject to the silicon lottery again, and my card already hits 1979 MHz at 100% fan speed completely stock (no voltage, no paste), game stable in Mass Effect. What should I do? I could cancel my B&H order for the hybrid kit, but I think I have a decent clocker on my hands.


----------



## nrpeyton

Quote:


> Originally Posted by *Alwrath*
> 
> Does anyone know of the difference between getting a founders edition from EVGA 1080ti, getting the hybrid waterblock, and the $810 EVGA geforce 1080ti hybrid sc2? Are they essentially the same thing? Im asking cause one is cheaper than the other.


Yes, they are essentially the same, as both have the same maximum power target as the Founders Edition (300 W).

The Hybrid does have a slightly higher out-of-box boost frequency, but you can get there yourself by overclocking your card.

So essentially, yes, they are the same.

EVGA also doesn't bin these cards, so it's quite possible a non-hybrid, once overclocked, could boost higher than the Hybrid.

https://www.evga.com/products/Compare.aspx


----------



## done12many2

Quote:


> Originally Posted by *nrpeyton*
> 
> [/SPOILER]
> 
> *Like this??*


Good eye my friend. I'm glad that I tend to overbuild everything.


----------



## Alwrath

Quote:


> Originally Posted by *nrpeyton*
> 
> Yes they are essentially the same. As they both feature the same maximum power target as the Founders Edition. (300 w).
> 
> The hybrid does have a slightly higher out-of-box boost frequency. But you could set that up yourself by overclocking your card.
> 
> So essentially; yes they are exactly the same.
> 
> https://www.evga.com/products/Compare.aspx


See my post above again; I edited it. With a 1979 MHz max boost already, should I stick with what I've got? Roughly what OC could I expect on water?


----------



## nrpeyton

Quote:


> Originally Posted by *done12many2*
> 
> Good eye my friend. I'm glad that I tend to overbuild everything.


okay nice one, rep+


----------



## done12many2

Quote:


> Originally Posted by *KedarWolf*
> 
> When I can afford some Conductonaut (been so broke) I'm going to coat the surrounding areas and sides of the shunts with nail polish, let it dry, apply the Conductonaut with a Q-Tip, saw a video how to do it properly, put water block back on, hoping to have no issues.
> 
> My card is vertical as well.
> 
> Edit: Conductonaut because I heard it's thicker than CLU and someone else who used it moved the card around and it never budged.


I have used both. I used Conductonaut on the cards that I'm using now and CLU on the previous cards. Truthfully, they are about the same. I actually think CLU is a tad easier to spread/apply. I do like the included fine tip nozzle that comes with the Conductonaut, but you need to be careful with it. I accidentally squirted LM all over the place the first time I tried to use it.

I have a few extra tubes of CLU and wouldn't mind hooking you up.


----------



## KraxKill

Quote:


> Originally Posted by *Addsome*
> 
> Whats the best method to apply grizzly kryonaut to the GPU die? X method? Or spread it with the included spreader? I originally had some noctua paste on using three dots in a triangle and my temps at load was 47c. My grizzly kryo just came in and I applied it using the spreader and my temps under load went up to 59c....


I dip my finger in rubbing alcohol, let it dry and then use my finger like an ape.

The only compound I use a tool for (nylon qtips) is for spreading liquid metal. Otherwise I finger paint it on like a chimp.


----------



## dunbheagan

Quote:


> Originally Posted by *KedarWolf*
> 
> When I can afford some Conductonaut (been so broke) I'm going to coat the surrounding areas and sides of the shunts with nail polish, let it dry, apply the Conductonaut with a Q-Tip, saw a video how to do it properly, put water block back on, hoping to have no issues.
> 
> My card is vertical as well.
> 
> Edit: Conductonaut because I heard it's thicker than CLU and someone else who used it moved the card around and it never budged.


I found a good way to apply Conductonaut on the shunts is to put a little drop directly on the shunt and then spread it with the flat end of a cable tie.

A Q-tip is fine for spreading LM on a larger surface like a heat spreader, but I wouldn't use it for the shunt mod, for two reasons:

-The Q-tip is bigger than the shunt, which makes it imprecise, and the chances are higher that you spill LM somewhere you don't want it.

-The Q-tip is good for applying a thin layer, but you want more LM on the shunts. I applied almost a little hill on both shunts and still see a TDP reading of about 75% during gaming and up to 100% in heavy benchmarks like Fire Strike Ultra. I think the danger lies in applying too little LM rather than too much.

And don't forget to scrape the metal ends of the shunts clean; there may be a coating on them that will work against your mod if you don't remove it.


----------



## dunbheagan

Quote:


> Originally Posted by *KraxKill*
> 
> I dip my finger in rubbing alcohol, let it dry and then use my finger like an ape.
> 
> The only compound I use a tool for (nylon qtips) is for spreading liquid metal. Otherwise I finger paint it on like a chimp.


hehe, had a good laugh on this







Thumbs up buddy!


----------



## Coopiklaani

Quote:


> Originally Posted by *done12many2*
> 
> Agreed. The included spreader works well and I've had great success with applying nice even coats.
> I did the shunt mod to my first pair of 1080 Ti cards that were vertically mounted. All I can say is be careful and it shouldn't be a problem. I was paranoid and fabricated some barriers out of the plastic from the CLU packaging to prevent any possible runoff from continuing down the PCB. I'm glad I did, because the next day if found one bead of CLU sitting on the plastic. Admittedly, I was a bit aggressive with the CLU application.
> 
> Before applying CLU, I coated the PCB surrounding the shunts with liquid tape and then I used liquid tape to glue the barriers to the PBC just below the shunts.
> 
> Hope that helps.


Why not do the resistor mod? It's much safer and much more accurate. You can still use the power readouts from the GPU after the resistor power mod, just with proper scaling.


----------



## KedarWolf

Quote:


> Originally Posted by *done12many2*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> When I can afford some Conductonaut (been so broke) I'm going to coat the surrounding areas and sides of the shunts with nail polish, let it dry, apply the Conductonaut with a Q-Tip, saw a video how to do it properly, put water block back on, hoping to have no issues.
> 
> My card is vertical as well.
> 
> Edit: Conductonaut because I heard it's thicker than CLU and someone else who used it moved the card around and it never budged.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have used both. I used Conductonaut on the cards that I'm using now and CLU on the previous cards. Truthfully, they are about the same. I actually think CLU is a tad easier to spread/apply. I do like the included fine tip nozzle that comes with the Conductonaut, but you need to be careful with it. I accidentally squirted LM all over the place the first time I tried to use it.
> 
> I have a few extra tubes of CLU and wouldn't mind hooking you up.
Click to expand...

PM'd you with an offer you can't refuse (said in best Godfather voice).


----------



## done12many2

Quote:


> Originally Posted by *Coopiklaani*
> 
> Why not to do the resistor mod, much safer and much more accurate. You can still use the power readouts from the GPU after the resistor power mod just with proper scaling.


Sounds like that involves some soldering? If so, I've got no skills.


----------



## bloodhawk

Quote:


> Originally Posted by *Coopiklaani*
> 
> Why not to do the resistor mod, much safer and much more accurate. You can still use the power readouts from the GPU after the resistor power mod just with proper scaling.


The resistor mod messes up the readouts; this is using 3 x 10 Ohm resistors -





http://imgur.com/UXIgx






http://imgur.com/2S6wk


----------



## Coopiklaani

Quote:


> Originally Posted by *bloodhawk*
> 
> The resistormod messes up the readouts, this is using 3 x 10Ohm Resists -
> 
> 
> 
> 
> 
> http://imgur.com/UXIgx
> 
> 
> 
> 
> 
> 
> http://imgur.com/2S6wk


So does the shunt mod. With the resistor mod you can actually re-calibrate your power readouts; for example, the 3 x 10 Ohm resistor mod reduces power readings by a factor of 5, so simply multiplying the readouts by 5 gives you the real power.







You cannot do the same with a liquid metal shunt mod.

Quote:


> Originally Posted by *done12many2*
> 
> Sounds like that involves some soldering? If so, I've got no skills.


No soldering required; conductive gel/paint/glue will do.
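If the factor-of-5 figure quoted above holds for your card, correcting the software readout is just a multiplication. A minimal sketch (the factor of 5 is the poster's claim for the 3 x 10 Ohm mod, not something I've measured; establish your own baseline before trusting it):

```python
READOUT_REDUCTION = 5  # claimed factor for the 3 x 10 Ohm resistor mod; verify yours

def true_power(readout_watts, factor=READOUT_REDUCTION):
    """Rescale the post-mod software power readout back to real watts."""
    return readout_watts * factor

print(true_power(60))  # a 60 W readout would mean ~300 W of real draw
```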


----------



## KedarWolf

Quote:


> Originally Posted by *Coopiklaani*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bloodhawk*
> 
> The resistormod messes up the readouts, this is using 3 x 10Ohm Resists -
> 
> 
> 
> 
> 
> http://imgur.com/UXIgx
> 
> 
> 
> 
> 
> 
> http://imgur.com/2S6wk
> 
> 
> 
> 
> 
> So does shunt mod. With the resistors mod you can actually re-calibrate your power readouts, for example, 3 x 10Ohm resistor mod reduces power readings by 5 times. Simply multiplying the power readouts by 5 you will get your real power.
> 
> 
> 
> 
> 
> 
> 
> You can not do the same with liquidmetal shunt mod.
> 
> Quote:
> 
> 
> 
> Originally Posted by *done12many2*
> 
> Sounds like that involves some soldering? If so, I've got no skills.
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> No soldering required, just conductive gel/paint/glue will do .
Click to expand...

Conductive paint/gel etc. won't work; it doesn't increase conductivity enough. It has to be liquid metal, or the resistors like someone said.


----------



## Coopiklaani

Quote:


> Originally Posted by *KedarWolf*
> 
> Conductive paint/gel etc. won't work, doesn't increase conductivity enough, has to be liquid metal or the resisters like someone said.


Conductive paint won't work if you are doing a shunt mod, since the shunt resistors are only 0.005 Ohm. But it's completely fine if you are doing a resistor mod, since the resistors you are adding will be in the 10 to 100 Ohm range; that totally works with conductive paint.


----------



## KedarWolf

Quote:


> Originally Posted by *Coopiklaani*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Conductive paint/gel etc. won't work, doesn't increase conductivity enough, has to be liquid metal or the resisters like someone said.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Conductive paint won't work if you are doing a shunt mod, since shunt resistor are only 0.005Ohm in resistance. But it is completely ok if you are doing a resistor mod since the resistor you are adding will be in the range of 10Ohm to 100Ohm, totally work with conductive paint.
Click to expand...

Oh sorry, never knew that. +1


----------



## mr2cam

Question for those who have a 1080 Ti with a full water block: what kind of temps are you seeing? I'm using an EK water block with an EK 240mm rad, and my max temp is 41°C. Does that seem right to you, or a little too warm?


----------



## eXistencelies

Quote:


> Originally Posted by *mr2cam*
> 
> Question for those who have a 1080ti with a full water block, what kind of temps are you seeing? I am using an EK waterblock along with an EK 240mm rad and max temp is 41c, does that seem right to you or a little to warm?


That is cooler than mine, though it depends on the game and how long I play. My 1080 Ti runs around 41-45°C playing PUBG, with 720mm of rad space in a Phanteks Evolv TG case, paired with a delidded 7700K. In Rocket League my temps stay in the mid-to-high 30s.


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> Why not to do the resistor mod, much safer and much more accurate. You can still use the power readouts from the GPU after the resistor power mod just with proper scaling.


u got a link?

and does it involve soldering? _(invalidating warranty etc)_


----------



## KedarWolf

Quote:


> Originally Posted by *Coopiklaani*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Conductive paint/gel etc. won't work, doesn't increase conductivity enough, has to be liquid metal or the resisters like someone said.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Conductive paint won't work if you are doing a shunt mod, since shunt resistor are only 0.005Ohm in resistance. But it is completely ok if you are doing a resistor mod since the resistor you are adding will be in the range of 10Ohm to 100Ohm, totally work with conductive paint.
Click to expand...

You have a video or link how to do this with conductive paint?

I'm totally interested, have a vertically mounted card.


----------



## done12many2

Quote:


> Originally Posted by *mr2cam*
> 
> Question for those who have a 1080ti with a full water block, what kind of temps are you seeing? I am using an EK waterblock along with an EK 240mm rad and max temp is 41c, does that seem right to you or a little to warm?


Quote:


> Originally Posted by *eXistencelies*
> 
> That is cooler than mine. Well depends on the games I play and for how long. My 1080ti runs around 41-45c when playing PUBG with 720mm rad space in a Phantek's Evolv TG case. Paired with it is a 7700k delidded. Rocket league my temps stay in the mid to high 30's.


Ambient temps are an important factor. At 20°C ambient, both of my 1080 Tis stay in the high 20s most of the time and go as high as 30°C.


----------



## KraxKill

Quote:


> Originally Posted by *Coopiklaani*
> 
> Why not to do the resistor mod, much safer and much more accurate. You can still use the power readouts from the GPU after the resistor power mod just with proper scaling.


I would think you could do the same with the liquid metal shunt mod by simply noting your baseline TDP values and deriving a correction value that way. I prefer not to do anything permanent to the card like soldering or gluing in resistors. Otherwise you may as well forget the shunt mod altogether and just solder in a trim pot to control your voltage and TDP with external DIP switches.

https://xdevs.com/guide/pascal_m_oc/


----------



## KedarWolf

Quote:


> Originally Posted by *Coopiklaani*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Conductive paint/gel etc. won't work, doesn't increase conductivity enough, has to be liquid metal or the resisters like someone said.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Conductive paint won't work if you are doing a shunt mod, since shunt resistor are only 0.005Ohm in resistance. But it is completely ok if you are doing a resistor mod since the resistor you are adding will be in the range of 10Ohm to 100Ohm, totally work with conductive paint.
Click to expand...

Would these work if I scratched the metal contacts at each end of the shunt and stuck them on with silver conductive paint?

http://www.ebay.ca/itm/100pcs-SMD-Chip-Surface-Mount-0805-Resistor-10-10-ohm-1-8W-1-/222190717547?hash=item33bb994a6b:g:To0AAOSw0fhXjOha


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> u got a link?
> 
> and does it involve soldering? _(invalidating warranty etc)_


I did do a brief tutorial here:
http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/7160

No soldering, but silver paint


----------



## KedarWolf

Quote:


> Originally Posted by *Coopiklaani*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bloodhawk*
> 
> The resistormod messes up the readouts, this is using 3 x 10Ohm Resists -
> 
> 
> 
> 
> 
> http://imgur.com/UXIgx
> 
> 
> 
> 
> 
> 
> http://imgur.com/2S6wk
> 
> 
> 
> 
> 
> So does shunt mod. With the resistors mod you can actually re-calibrate your power readouts, for example, 3 x 10Ohm resistor mod reduces power readings by 5 times. Simply multiplying the power readouts by 5 you will get your real power.
> 
> 
> 
> 
> 
> 
> 
> You can not do the same with liquidmetal shunt mod.
> 
> Quote:
> 
> 
> 
> Originally Posted by *done12many2*
> 
> Sounds like that involves some soldering? If so, I've got no skills.
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> No soldering required, just conductive gel/paint/glue will do .
Click to expand...

Would these work if I scratched the metal contacts at each end of the shunt and stuck them on with silver conductive paint?

http://www.ebay.ca/itm/100pcs-SMD-Chip-Surface-Mount-0805-Resistor-10-10-ohm-1-8W-1-/222190717547?hash=item33bb994a6b:g:To0AAOSw0fhXjOha


----------



## Coopiklaani

Quote:


> Originally Posted by *KraxKill*
> 
> I would think you could do the same with the Liquid Metal shunt mod by simply noting your baseline TDP values and getting a correction value that way. I prefer to not do anything permanent to the card like soldering or gluing in resistors. Otherwise you may as well forget the shunt mod all together and just solder in a trim pot to control your voltage and TDP using external dip switches.
> 
> https://xdevs.com/guide/pascal_m_oc/


The problem with the liquid metal shunt mod is uncertainty. You can end up with an unbalanced power reading reduction on each of the shunts, which is pretty hard to compensate for accurately.

The resistor mod is also not permanent; you can remove the resistors at any time and clean the silver paint off with rubbing alcohol.
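The uncertainty is easy to see from the parallel-resistor formula: the reading shrinks by R_eff / R_shunt, and R_eff depends on the resistance of the liquid metal bridge, which you can't control precisely. A sketch, assuming the 0.005 Ohm stock shunt mentioned earlier; the two bridge resistances are made-up examples to show the sensitivity:

```python
def reading_scale(r_shunt, r_bridge):
    """Fraction of real power the controller reports after bridging the shunt.

    The liquid metal sits in parallel with the shunt, so the sensed
    voltage drop (and thus the power reading) shrinks by r_eff / r_shunt.
    """
    r_eff = (r_shunt * r_bridge) / (r_shunt + r_bridge)
    return r_eff / r_shunt

R_SHUNT = 0.005  # 5 mOhm stock shunt
# Two plausible-but-unknown bridge resistances give very different scales:
print(round(reading_scale(R_SHUNT, 0.005), 3))   # -> 0.5
print(round(reading_scale(R_SHUNT, 0.0025), 3))  # -> 0.333
```

Halving the (unknown) bridge resistance changes the correction factor from 2x to 3x, and each shunt gets its own blob, which is exactly the unbalanced-readout problem described above.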


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> I did do a brief tutorial here:
> http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/7160
> 
> No soldering, but silver paint


Okay, I'm not exactly sure what I'm looking at here.

Coopiklaani, can you do us a more detailed article?



Where is the paint?

The idea of still being able to calculate my correct power usage in watts is very enticing.

HWiNFO64 even lets you enter that calculation, so the correct power usage would be displayed in all your games through RivaTuner.


----------



## madmeatballs

Quote:


> Originally Posted by *mr2cam*
> 
> Question for those who have a 1080ti with a full water block, what kind of temps are you seeing? I am using an EK waterblock along with an EK 240mm rad and max temp is 41c, does that seem right to you or a little to warm?


I am using an Aqua Computer water block with an active backplate. At an ambient temperature of 22°C I get 35-36°C max temps in Mass Effect Andromeda. I'll take note of my PUBG temps tomorrow. I've got 2x 280mm HWLabs GTS X-Flows.


----------



## nrpeyton

Quote:


> Originally Posted by *madmeatballs*
> 
> I am using an Aqua Computer waterblock with active backplate. At an ambient temperature of 22c I get 35c max temps on Mass Effect Andromeda. I'll take note of my PUBG temps tomorrow. I got 2x 280mm HWLabs GTS X-flows.


Let's see some pics of your block installation + PC.

Would be nice.

I'm in two minds whether to get that block also, or grab the EK one. Can't bloody decide lol.


----------



## KedarWolf

Quote:


> Originally Posted by *Coopiklaani*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KraxKill*
> 
> I would think you could do the same with the Liquid Metal shunt mod by simply noting your baseline TDP values and getting a correction value that way. I prefer to not do anything permanent to the card like soldering or gluing in resistors. Otherwise you may as well forget the shunt mod all together and just solder in a trim pot to control your voltage and TDP using external dip switches.
> 
> https://xdevs.com/guide/pascal_m_oc/
> 
> 
> 
> The problem with the liquid metal shunt is uncertainty. Basically you can end up with unbalanced power reading reduction on each of the shunts. It is pretty hard to compensation accurately.
> 
> Resistor mod is also not permanent, you can remove them anytime and clean the silver paint with rubbing alcohol.

http://www.ebay.ca/itm/Silver-Conductive-0-2ML-Glue-Wire-Electrically-Paste-Adhesive-Paint-PCB-Repair-/142142596935?hash=item21185be747:g:dCgAAOSwzaJX-Twi
and

http://www.ebay.ca/itm/100pcs-SMD-Chip-Surface-Mount-0805-Resistor-10-10-ohm-1-8W-1-/222190717547?hash=item33bb994a6b:g:To0AAOSw0fhXjOha

Would these work?


----------



## madmeatballs

Quote:


> Originally Posted by *nrpeyton*
> 
> Lets see some pics of your block installation + PC.
> 
> would be nice
> 
> I'm in two minds, weather to get that block also; or grab the EK one. Can't bloody decide lol.


Here you go!







I'm planning on swapping out the thermal pads for Fujipoly ones when I replace my tubing soon. The only disappointment from Aqua Computer was their single-slot bracket: somehow it doesn't fit in my Enthoo Pro M, it seems to hit the next bracket below it. I managed to make it fit by force though.


----------



## nrpeyton

*Shunt mod & calculating correct power (after the mod)*

Couldn't you just use your idle power draw to calculate the offset/equation you need to find the true power reading after modding?

So on average idle power draw is *16w.*

After the mod I just need to try output values until I get 16w at idle.

So for example, if *after* the mod I'm reading *11w* _(even though I know my true idle power draw is always 16w)_,

I'd try multiplying *11* by various factors until I get back to 16.

Once I get 16 I know my offset; then I can enter it into HWiNFO64 to customise the output.

So for example I might start like this:

1.3 x *11* = 14.3 _(not quite there)_

Try again:

1.4 x *11* = 15.4 _(nearly)_

1.46 x *11* = *16.06*

Then in HWiNFO64, under "customise values" → Power, just change the "multiply" field to 1.46.

??


----------



## Hulio225

Quote:


> Originally Posted by *nrpeyton*
> 
> *Actually regarding shunt mod. & calculating correct power (after mod)*
> 
> Couldn't u just use your idle power draw to calculate the offset/equation you need to find out the true power reading after modding.
> 
> So on average idle power draw is *16w.*
> 
> After power draw I just need to try calculating the output value until i get 16w at idle.
> 
> So for example if *after* the mod I'm reading *11w.* _(even though i know my true idle power draw is always 16w)
> _
> I'd try multiplying *11* by various factors until I get back to 16.
> 
> Once I get 16 I know my offset; then can enter it into hwinfo64 to customise the output.
> 
> So for example I might start like this:
> 
> 1.3 x *11* = 14.3 _(not quite there)_
> 
> try again:
> 
> 1.4x *11*=15.4 _(nearly)_
> 
> 1.46 x *11* = *16.06*
> 
> then in hwinfo 64, in "cusomise values" under power, just change the "multiply field" to 1.46
> 
> ??


11 × X = 16
X = 16 / 11
X ≈ 1.4545

You don't have to trial-and-error it, it's just simple math.
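The same arithmetic as a throwaway Python snippet (the 16 W / 11 W figures are just the example numbers from the posts above):

```python
# Known true idle draw (watts) before the shunt mod, and the
# reading after the mod -- example numbers from the posts above.
true_idle = 16.0
reported_idle = 11.0

# Solve reported_idle * X = true_idle for X:
factor = true_idle / reported_idle
print(round(factor, 4))  # → 1.4545
```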


----------



## dunbheagan

Quote:


> Originally Posted by *mr2cam*
> 
> Question for those who have a 1080ti with a full water block, what kind of temps are you seeing? I am using an EK waterblock along with an EK 240mm rad and max temp is 41c, does that seem right to you or a little to warm?


Are you kidding? 41 is nothing. That is a very good result!

My FE drops the first 13MHz bin above 48 degrees, so my goal is to keep it below 48, which my cooling does easily. But I get up to 45 degrees under heavy load, and I have a 360 and a 240 rad.

Which program did you use to measure max temps? Try something heavy like FS Ultra looped for two hours; I would be surprised if your temp didn't go up to at least 45 too.


----------



## nrpeyton

Quote:


> Originally Posted by *Hulio225*
> 
> 11 * X = 16 | :11
> X = 16:11
> X= 1.4545
> 
> you dont have to try, you simply do simple math


Right, but you get the point lol.

My point is: would this scale up properly after the mod?

It should, right?

If the shunt mod gets me 30%,

then my idle power draw would read 11.2 after the mod (16 originally, which is also the "true" draw).

The multiplier is then 16 / 11.2 ≈ 1.43. So if I'm drawing 300w (reported) after the mod, I'd really be drawing about 429w, and 429 would be reported in HWiNFO64 and in-game via RivaTuner *if* I entered 1.43 into the "multiply" field under customise values.
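Assuming the mod scales every reading by one constant factor (which it should, since the shunt resistance change is fixed), the idle-derived multiplier corrects load readings too. A quick sanity check in Python; note that a 30% reading reduction implies a multiplier of 1/0.7 ≈ 1.43 rather than 1.3:

```python
true_idle = 16.0       # known pre-mod idle draw in watts
reported_idle = 11.2   # idle reading after a mod that cuts readings by 30%

factor = true_idle / reported_idle   # = 1/0.7 ≈ 1.4286, not 1.3

# Correct a reported load figure back to true watts:
reported_load = 300.0
print(round(reported_load * factor, 1))  # → 428.6
```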


----------



## ilikelobstastoo

Even if you do solder, it can easily be returned to stock condition after removing the resistors. A little flux paste and a retouch of the pads, and the solder will melt back to how it was before.


----------



## Hulio225

Quote:


> Originally Posted by *nrpeyton*
> 
> Right but you get the point lol.
> 
> My point; is would this scale up the way properly? Afer mod?
> 
> It should, right?


I think it should, since the lowered resistance stays static and the readings are scaled by a static value. So if your idle power reads 30% lower, the load power should be off by the same factor.
Quote:


> Originally Posted by *dunbheagan*
> 
> Are you kidding? 41 is nothing. That is a very good result!
> 
> My FE clocks down the first 13MHz above 48 degree, so my goal is to keep it below 48, which my cooling does easily. but i get up to 45 degree under heavy load, and i have a 360 and a 240 rad.
> 
> which programm did you use to measure max temps? Try something heavy like FS ultra in a loop for two hours and i would be surprised if your temp would go higher to at least 45 too.


I'm pretty sure your card is dropping the first 13 MHz (12.5 MHz in reality) when you go from 36°C to 37°C.


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> Okay not exactly sure what I'm looking at here.
> 
> Coopiklaani? can u do us a more detailed article?
> 
> 
> 
> where is the paint?
> 
> The idea of still being able to calculate my correct power usage in watts is very inticing
> 
> hwinfo64 even allows you to enter that calculation in; so the correct power usage would be displayed in all your games through Riva Tuner.


The paint is underneath the resistors; all you need is a little paint to make sure there's good contact between the resistor and the capacitor underneath. You don't want to overdo the paint work because it may short the capacitor completely, resulting in a 0A current reading which may lock the GPU into a low-frequency state.

Here's what this resistor | silver paint | capacitor structure looks like:



I also put some non-conductive thermal glue in the middle to make sure the silver paint doesn't short the capacitor, and it also gives the whole structure some strength:thumb:


----------



## nrpeyton

Quote:


> Originally Posted by *Hulio225*
> 
> I think it should since the lowered resistance stays static and the readings are manipulated by a static value. So if your idle power is 30% lower the load power should be by the same factor.
> Im pretty sure your card is downclocking the first 13 MHz (12.5MHz in reality) when you leave 36°C to 37°C.


Excellent, I can't wait to try this.

It'll be great seeing 380w draw in-game on my FE.

People will wonder what's going on lol ;=p


----------



## ilikelobstastoo

I too am considering trying this out (after all, I have a ton of SMD resistors lying around from RGH-ing Xboxes lol). How many watts did 100 ohm resistors net you @coopiklaani?


----------



## Coopiklaani

Quote:


> Originally Posted by *ilikelobstastoo*
> 
> I too am considering trying this out. (after all I have a ton of smd resistors laying around from rghing xboxs lol. )How many watts did 100 ohm resistors net you @coopiklaani?


100ohm resistors will give you 1.4x TDP.

Formula:

(R+40)/R

For example:
R = 10ohm, 5x TDP
R = 22ohm, 2.81x TDP
R = 47ohm, 1.85x TDP
R = 100ohm, 1.4x TDP

I'd recommend 22ohm or 47ohm resistors. 5x is a little bit overkill, you'll hit the low-side TDP limit anyway. 1.4x TDP is a little bit on the low side, you probably want more.
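Plugging the listed resistor values into the formula above gives the same table (a throwaway check; `tdp_multiplier` is just a name for this sketch, and the constant 40 comes straight from the (R+40)/R formula):

```python
def tdp_multiplier(r_ohms):
    # TDP scaling for a resistor of r_ohms placed across the shunt
    # sense line, per the (R+40)/R formula above.
    return (r_ohms + 40) / r_ohms

for r in (10, 22, 47, 100):
    print(f"R = {r} ohm -> {tdp_multiplier(r):.2f}x TDP")
# R = 10 ohm -> 5.00x TDP
# R = 22 ohm -> 2.82x TDP
# R = 47 ohm -> 1.85x TDP
# R = 100 ohm -> 1.40x TDP
```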


----------



## dunbheagan

Quote:


> Originally Posted by *Hulio225*
> 
> I think it should since the lowered resistance stays static and the readings are manipulated by a static value. So if your idle power is 30% lower the load power should be by the same factor.
> Im pretty sure your card is downclocking the first 13 MHz (12.5MHz in reality) when you leave 36°C to 37°C.


Yeah, you're kind of right.

For example, when I apply +150 on the GPU and start a bench, it first clocks to 2063, and after a couple of seconds, when the temp is in the high 30s, it drops to 2050. So you already have the first downclock at about 38 degrees.

The reason I said I have the first downclock above 48 degrees is that I set the baseline at the official 1900MHz stock max boost. For my example, 1900 + 150 is 2050, so in my mind the 2063 below 38 degrees is an additional overclock due to very low temps. 2050 is the baseline for a +150 OC, so the first downclock from that baseline is above 48 degrees. Does this make any sense? Tell me the truth


----------



## Alwrath

Ok guys, I flashed and I'm on the Nvidia FE BIOS now. Ran a Heaven loop and my Grizzly thermal paste is a definite improvement over stock: low-to-high 50C range instead of 63C stock. Clocks are roughly the same though, mostly 1962-1974; sometimes it drops to 1949, or can do 1987 if I add 50% voltage, but obviously that's not worth it. I saw it hit 2000mhz when the loop first started, so it looks like if I keep it in the 40C range I'll see it jump to 2000mhz or higher. I don't know if that's worth $170 for the EVGA hybrid cooler though, what do you guys think? I could always just use the Step-Up program and roll the dice on a new card later.


----------



## nrpeyton

BIOS for Voltage over 1.093v

According to the very reputable author of this video (Buildzoid's teardown of the 1080 Ti FE PCB/VRM):
*



*
Buildzoid says a BIOS _*does*_ exist for one card that allows voltage up to 1.25v.

Anyone know which card it is?

_Or is he confusing it with the STRIX T4 BIOS that everyone was using on the 1080s (non-Tis)?_


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> BIOS for Voltage over 1.093v
> 
> According to very reputable author of this video. (buildzoids tear down of the 1080 TI FE PCB/VRM):
> *
> 
> 
> 
> *
> Buildzoid says A BIOS _*does*_ exist for one card, that does allow voltage up to 1.25v
> 
> Anyone know which card it is?


I guess he meant Kingpin edition + evga voltage tool


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> I guess he meant Kingpin edition + evga voltage tool


From the tone of the video _(and how he said it)_, I got the impression it was something that's already been released.


----------



## Benny89

Managed to finally get 2012mhz in ME: Andromeda on Ultra with the Lighting/DOF mod and the right curve on this STRIX. But I need 1.050V to get a stable 2012mhz







. I saw people here getting 2000mhz on 1.000V...

I have to run it at 60% fan speed to keep it under 62C, which makes the card a little too loud for my taste...

I'll have to try the FTW3 and EVGA Hybrid SC next to see if I can get above 2000mhz stable, with low noise and better stability in power-hungry games...


----------



## Hulio225

Quote:


> Originally Posted by *dunbheagan*
> 
> Yeah, you are kind of right.
> 
> For example, when i apply +150 on the gpu and start a bench, first it clocks to 2063 and after a couple of seconds when temp is above high 30ies, it clocks to 2050. So you already have the first downclock at about 38 degree.
> 
> Why i said, i have the first downclock above 48 degree, is because i set the baseline on the official 1900MHz stock max boost. For my example 1900 + 150 is 2050, so the 2063 below 38 degree in my mind is an additional overclock because of very low temps. 2050 is the baseline for a +150 oc, so the first downclock from that baseline is above 48 degree. Does this make any sense? Tell me the truth


Stock max boost is 1911 ;-)
Quote:


> Originally Posted by *Benny89*
> 
> Managed to finally get 2012mhz in ME:Andromeda on Ultra with Lighting/DOF mod with right curve on this STRIX. But I need 1.050V to get stable 2012 mhz
> 
> 
> 
> 
> 
> 
> 
> . I saw people here getting 2000mhz on 1.000V...
> 
> I have to run it with 60% fan speed to keep it under 62C which makes card little too loud for my taste...
> 
> I will have to try FTW3 and EVGA Hybrid SC next to see if I can get above 2000mhz stable with low noise and better stability in power hunger games....


You don't know under what conditions they needed just 1.0V for 2000MHz.
2012 stable in ME:A is a good value.

And you can always say: I saw guys with an i7 7700k doing 5500MHz @ 1.45V, but they probably binned 100+ CPUs, as an example^^


----------



## dunbheagan

Quote:


> Originally Posted by *Hulio225*
> 
> Stock max boost ist 1911 ;-)


Really? Ok, that is by far the simpler explanation...









Funny how the mind creates quite complex models based on assumptions just to explain what it doesn't understand...


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> I've had top water, bottom air before, different series, not Ti's. )


Works great. Both cards stay cool.


----------



## DerComissar

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *madmeatballs*
> 
> I am using an Aqua Computer waterblock with active backplate. At an ambient temperature of 22c I get 35c max temps on Mass Effect Andromeda. I'll take note of my PUBG temps tomorrow. I got 2x 280mm HWLabs GTS X-flows.
> 
> 
> 
> Lets see some pics of your block installation + PC.
> 
> would be nice
> 
> I'm in two minds, weather to get that block also; or grab the EK one. Can't bloody decide lol.

Can't decide?

That's been my problem for years, lol.
I often wind up buying one thing, then changing my mind and buying the other.









I'm sure that the EK block is a fine one, I've had both brands in the past, mostly EK.
I actually had an EK and an AC block at the same time, running 780Ti's together in SLI:


They seemed to be neck and neck with the gpu temps.

But this time I decided (after buying the EK block already) to go with AC, I like their robust copper block, and thick backplate.
As for the active cooling on the backplate, it seems that it really does improve the vrm cooling considerably, based on VSG's testing, as posted earlier.

Good luck picking, whichever one you choose. I can't wait to get back to water again, and I'm sure you can't either!


----------



## phenom01

The thread is moving fast, but Newegg accepted my RMA. The new card gets shipped out today.


----------



## shalafi

Quote:


> Originally Posted by *Benny89*
> 
> Managed to finally get 2012mhz in ME:Andromeda on Ultra with Lighting/DOF mod with right curve on this STRIX. But I need 1.050V to get stable 2012 mhz
> 
> 
> 
> 
> 
> 
> 
> . I saw people here getting 2000mhz on 1.000V...


1.050 is exactly what I need to be stable at 2012MHz on my GB FE + EVGA Hybrid. 2038 is about the max I can squeeze out of it, but then the TDP is through the roof. In SuPo and FS Ultra/Extreme it is in the high 1900s to meet the 300W, in games it holds 2012. I have considered my options, but decided it's not worth reselling and trying something else.


----------



## nrpeyton

Quote:


> Originally Posted by *Benny89*
> 
> Managed to finally get 2012mhz in ME:Andromeda on Ultra with Lighting/DOF mod with right curve on this STRIX. But I need 1.050V to get stable 2012 mhz
> 
> 
> 
> 
> 
> 
> 
> . I saw people here getting 2000mhz on 1.000V...
> 
> I have to run it with 60% fan speed to keep it under 62C which makes card little too loud for my taste...
> 
> I will have to try FTW3 and EVGA Hybrid SC next to see if I can get above 2000mhz stable with low noise and better stability in power hunger games....


So you've finally hit a nice stable 2GHz then lol.

After all you've been through.

Not bad considering Nvidia designed the card with a stock clock of 1481 and a 1582 boost.

Congrats.

A lot of people can't even hit 2100.

Only 10% (1 in 10) can hit 2100.

In fact the "average" on air is only 1849-1940 MHz.
Average on water: 2032 MHz.
LN2: 2401 MHz.

What about your memory?


----------



## Alwrath

Called and cancelled my order with B+H. The EVGA Hybrid 1080ti is back in stock at EVGA; placed my order and can't wait to get it


----------



## Slackaveli

Quote:


> Originally Posted by *dunbheagan*
> 
> Really? Ok, that is by far a more simple explanation...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Funny how the mind creates quite complex models based on assumptions just to explain what it doesnt understand...


Religion in a nutshell right there. Yes, it is profound.


----------



## shalafi

Oh, and here's a weird angle to show off most of what I've built in the past few weeks. Still can't be bothered with a custom loop. The fans for the X62 are in a pull setup above the rad to make cleaning the eventual dust buildup easier; the Enthoo Evolv's top airflow is pants anyway, so this isn't hurting it much further.



EDIT: looking at the image, I really cannot fathom why Phanteks decided on white brackets in a completely black case.







looks a bit daft ..


----------



## Slackaveli

Quote:


> Originally Posted by *feznz*
> 
> Thank you for the effort for benefit of everyone I was thinking maybe the great scaling was because you started at the top clock and worked your way down if started at the bottom and worked your way up, maybe there might be a difference but I did notice the 2088Mhz seems to be the magic number for ROTTR and Ghost Recon
> Then again it could be simply new architecture/bios not letting you get near the poor scaling clocks who knows but you got one nice FE card there you have a whole 3% on my Strix but that's ok I rather have the ROG logo
> some people just apply the hardware differently TIM is probably the main factor But then I have dialled down all my fans I am happy as long as I am under 55°C my PC is barely audible I have been meaning to replace this 60mm UPS fan it simply roars compared to the PC
> even an old 3770k
> 
> 
> 
> 
> 
> 
> 
> I have been meaning to replace it for a while still sporting a whole 8Gb ram seems to not to be a problem so far just close off browsers etc before gaming. Simply I have a rule that I have to double my current build power to justify an upgrade. the 299x will be pretty appealing for me
> I believe in getting the most out of hardware just a balance of where I spend my wife's money it can get expensive my wife got some 1carat diamond earrings because I got a monitor and 1080Ti (we pay a premium it works out to $1100 USD for 1 1080Ti here in NZ)


That's funny, dude. My ex was the same way. If I "blew" a grand, she got a grand too. She let me decide. It was TORTURE. I would be like, "Oh snap, I want that $600 gpu!!! Um... damn. Wait. Definitely not worth $1200, though. I'll pass." Sucked bigtime, as I'm a broke dude.


----------



## mshagg

Quote:


> Originally Posted by *mr2cam*
> 
> Question for those who have a 1080ti with a full water block, what kind of temps are you seeing? I am using an EK waterblock along with an EK 240mm rad and max temp is 41c, does that seem right to you or a little to warm?


It comes down to ambient temps, which in turn affect coolant temps. And load, of course.

Idle:

23 degree room temp
25 degree coolant temp
27 degree core temp

Supo 4k run:

23 degree room temp
27 degree coolant temp by end of run
43 degree core temp by end of run

Supo 1080P high - same as above but 41 degree core temp.

Can probably improve on this by blasting the radiator fans for a few minutes before a run.


----------



## nrpeyton

Just found a guy (one of the extreme O/C'ing guys) who has access to an XOC BIOS for 1080Ti. It allows voltage up to 1.25v and removes all power and TDP limits.

He's waiting on permission from ASUS to release the BIOS to the public (same as they did with the older 1080s, the non-Tis).

Hopefully if ASUS hit their RMA targets in the last two quarters they'll release the BIOS.

Anyhow, I've asked if he'll send me the BIOS privately via PM. Waiting on a response.

I'll keep you all updated.

If I get my hands on it, it might be "share via PM only", but hopefully not.

Watch this space. And keep your fingers crossed.

Those passionate about O/C'ing may be in for a nice treat soon ;-)

Not making any promises. But my calls _did_ get answered last round. So it's looking *50/50.*


----------



## Addsome

Quote:


> Originally Posted by *nrpeyton*
> 
> Just found a guy (one of the extreme O/C'ing guys) who has access to an XOC BIOS for 1080Ti. It allows voltage up to 1.25v and removes all power and TDP limits.
> 
> He's waiting on permission from ASUS to release the BIOS to the public (same as they did on the older 1080 (non Ti's).
> 
> Hopefully if ASUS hit their RMA targets in the last two quarters they'll release the BIOS.
> 
> Anyhow, I've asked if he'll send me the BIOS privately via PM. Waiting on a response.
> 
> I'll keep you all updated.
> 
> if I get my hands on it; it might be "share via PM only" but hopefully not.
> 
> Watch this space. And keep your fingers crossed.
> 
> Those passionate about O/C'ing may be in for a nice treat soon ;-)
> 
> Not making any promises. But my calls _did_ get answered last round. So its looking 50/50.


Great news +rep


----------



## mshagg

Awesome, let's toast some cards


----------



## DerComissar

Quote:


> Originally Posted by *Addsome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nrpeyton*
> 
> Just found a guy (one of the extreme O/C'ing guys) who has access to an XOC BIOS for 1080Ti. It allows voltage up to 1.25v and removes all power and TDP limits.
> 
> He's waiting on permission from ASUS to release the BIOS to the public (same as they did on the older 1080 (non Ti's).
> 
> Hopefully if ASUS hit their RMA targets in the last two quarters they'll release the BIOS.
> 
> Anyhow, I've asked if he'll send me the BIOS privately via PM. Waiting on a response.
> 
> I'll keep you all updated.
> 
> if I get my hands on it; it might be "share via PM only" but hopefully not.
> 
> Watch this space. And keep your fingers crossed.
> 
> Those passionate about O/C'ing may be in for a nice treat soon ;-)
> 
> Not making any promises. But my calls _did_ get answered last round. So its looking 50/50.
> 
> 
> 
> Great news +rep

It is great news, and I second the Rep+

Certainly a much better option than what we've had so far.


----------



## KedarWolf

Hey peeps, to fix stuttering in games and benchmarks: go to the Nvidia Control Panel and turn Shader Cache off in Global Settings (and in the per-game profile too, if you've changed it there from the global setting).

Zero stuttering now in Heaven after doing this.









That's pretty much how you want your Nvidia Control Panel set up, in Global Settings AND in the per-game profile.

I have a G-Sync screen so I disable Vsync and cap my frame rate with Rivatuner.


----------



## Benny89

Quote:


> Originally Posted by *nrpeyton*
> 
> Just found a guy (one of the extreme O/C'ing guys) who has access to an XOC BIOS for 1080Ti. It allows voltage up to 1.25v and removes all power and TDP limits.
> 
> He's waiting on permission from ASUS to release the BIOS to the public (same as they did on the older 1080 (non Ti's).
> 
> Hopefully if ASUS hit their RMA targets in the last two quarters they'll release the BIOS.
> 
> Anyhow, I've asked if he'll send me the BIOS privately via PM. Waiting on a response.
> 
> I'll keep you all updated.
> 
> if I get my hands on it; it might be "share via PM only" but hopefully not.
> 
> Watch this space. And keep your fingers crossed.
> 
> Those passionate about O/C'ing may be in for a nice treat soon ;-)
> 
> Not making any promises. But my calls _did_ get answered last round. So its looking *50/50.*


Nice! Maybe that could help keep my clocks a little higher without the TDP limit







. I won't push it past 1.15V, but I would at least try it









+Rep man, hope we get it one way or the other


----------



## Benny89

Quote:


> Originally Posted by *nrpeyton*
> 
> So you've finally hit a nice stable 2GHZ then lol.
> 
> After all that you've been through.
> 
> Not bad considering Nvidia designed cards with a stock clock of 1481, and 1582 boost.
> 
> Congrats,
> 
> A lot of people can't even hit 2100
> 
> 10% (or 1 in 10) can hit 2100
> 
> In fact the "average" on air is only: 1849-1940 Mhz
> Avage on water is: 2032 Mhz
> LN2: 2401 Mhz
> 
> What about your memory?


Memory can do a max of 6009mhz, but I noticed it doesn't give any boost over 5850mhz, so I just keep it there.

I pre-ordered FTW3 anyway to check it and I am waiting for Hybrid SC to be in stock in my country which will take a while.


----------



## nrpeyton

Right I get paid in exactly 55 minutes lol.

And I still can't for the life of me decide which freaking block to get.

argh I hate decisions like this lol.


----------



## shadow85

Quote:


> Originally Posted by *alucardis666*
> 
> Saw your posts and had to chime in,
> 
> YMMV *severely* depending on the game... I currently have 2 FE 1080Ti's **Previous 2 Aorus 1080Tis (Sold due to poor thermals) and 1 TXp (yes the new one. I sent it back.)** and I also used to have an X99 and 6950x. **Now on R7 1700 and X370.** But in both cases, FPS at 4k can be well over 300fps in some titles and in others be in the 30's with everything turned up. Scaling and optimization are a real PITA. And I don't think Pascal can really deliver a turn 4k 60fps gaming experience. **VOLTA and VEGA Can't come quick enough if you wanna do 4k 144hz on one of those new monitors coming out**
> 
> Be prepared to turn settings down, and occasionally forgo AA, if you wanna get 60fps in some titles, if you don't wanna compromise there then game @ 2560x1440. Or get yourself a 3440x1440 or 3840x1600 display. **MUCH MUCH MUCH less demanding than 2160p.**
> 
> And as for your choice to go with the Strix in SLi, be careful as card spacing and case airflow can cook your cards pretty good. IF you don't have a board with good slot spacing or don't plan to get one one I wouldn't recommend stacking AIB cards. Had this issue with the Aorus. Top card would cook and flush heat onto the 2nd card making for card 1 to load @ 95C and card 2 to load at 80C. *Yikes, I know* You're better served saving some $$$ and water cooling 2 FE's with blocks or hybrid kits, if you're dead-set on the Strix and don't want to watercool you're gonna need some VERY good airflow pushing the head out of your case, and a lot of fans, I'd also suggest re-timing the cards as the stock paste jobs and tim itself, isn't great. And also using MSI Afterburner to create a custom curve for GPU load/gaming, as yes 0db is nice and all for web-browsing and some FHD video watching, but it can spell thermal issues with your cards when under any kind of actual load.
> 
> Finally, if you really wanna maximize your gaming get a *5775c* or a *7700k*, the higher cache and or clock speeds really shine in gaming (we're talking 5-20fps more over a chip with more than 4 cores!!!) *this is due to most modern games not being well optimized for multi-core systems and topping out at either dual or quad core*, however if you use your PC as a workstation and do more than just game, then stick with your CPU and OC the crap outta your CPU and cards to compensate as much as you can to get as much of your FPS fluidity back. *Personally* I'd get the 7700k as it's newer and can push 5ghz pretty easy with a modest OC. some can even run up to 5.5ghz.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> **Fast memory over 3000mhz helps with an extra 2-10fps depending on the game as well.**
> 
> *TL;DR,* DO NOT get an AIB card that vents heat into your case for SLI, FE would be better, and even better if you can water/hybrid cool it. You have the wrong CPU for 4k Gaming, consider changing or getting a different display, or gaming at 2k or 21:9, if that's an option, finally get fast ram over 3000mhz if you can/don't already have it.


Thanks for the info buddy. But the STRIX is fine for SLI. I'm already running GTX 1080 STRIX OCs in SLI and card 2 only reaches a max of 76°C @ 2050 boost. That's completely acceptable to me. I have a Corsair 900D with an X99S Gaming 7 M/B; I think the cards are 2 slots apart.

But your comment on getting the 7700k is interesting. I would need to see evidence of it being considerably better than the i7-5930K for gaming. If anyone could link benchmarks of the i7-5930K OC'd vs the 7700K OC'd at 4K gaming, and the latter showed better results, I would definitely put it on my to-buy list.


----------



## mr2cam

Quote:


> Originally Posted by *mshagg*
> 
> It comes down to ambient temps, which in turn affect coolant temps. And load, of course.
> 
> Idle:
> 
> 23 degree room temp
> 25 degree coolant temp
> 27 degree core temp
> 
> Supo 4k run:
> 
> 23 degree room temp
> 27 degree coolant temp by end of run
> 43 degree core temp by end of run
> 
> Supo 1080P high - same as above but 41 degree core temp.
> 
> Can probably improve on this by blasting the radiator fans for a few minutes before a run.


Ok that is right in line with what mine runs at, just making sure it is doing the job!


----------



## Alwrath

I went AM4, not gonna look back. Quad cores are too outdated at this point to consider, and I can drop in Zen 2 or Zen 3 in the future with the same mb/ram and blow the 7700k quad out of the water. I plan on getting big Volta when it comes out and will use my 1080ti Hybrid kit on it on launch day; that's gonna be a fun day for me.


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> Just found a guy (one of the extreme O/C'ing guys) who has access to an XOC BIOS for 1080Ti. It allows voltage up to 1.25v and removes all power and TDP limits.
> 
> He's waiting on permission from ASUS to release the BIOS to the public (same as they did on the older 1080 (non Ti's).
> 
> Hopefully if ASUS hit their RMA targets in the last two quarters they'll release the BIOS.
> 
> Anyhow, I've asked if he'll send me the BIOS privately via PM. Waiting on a response.
> 
> I'll keep you all updated.
> 
> if I get my hands on it; it might be "share via PM only" but hopefully not.
> 
> Watch this space. And keep your fingers crossed.
> 
> Those passionate about O/C'ing may be in for a nice treat soon ;-)
> 
> Not making any promises. But my calls _did_ get answered last round. So its looking *50/50.*


Plz! Just secretly leak it and don't let Asus know. I won't tell anyone! Just give me the BIOS!


----------



## nrpeyton

fingers crossed lol


----------



## Coopiklaani

Quote:


> Originally Posted by *mshagg*
> 
> It comes down to ambient temps, which in turn affect coolant temps. And load, of course.
> 
> Idle:
> 
> 23 degree room temp
> 25 degree coolant temp
> 27 degree core temp
> 
> Supo 4k run:
> 
> 23 degree room temp
> 27 degree coolant temp by end of run
> 43 degree core temp by end of run
> 
> Supo 1080P high - same as above but 41 degree core temp.
> 
> Can probably improve on this by blasting the radiator fans for a few minutes before a run.


The core temp seems a little high to me. I'd expect a 10-12°C water-to-GPU delta under intensive loads.


----------



## mshagg

I'm comfortable with 20 degree core-ambient deltas. Could probably drop a couple of degrees with some better TIM, but low 40s under deliberately high 4K loads is not of concern to me.
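The deltas being debated here are just subtractions over the numbers mshagg posted earlier in the thread (23°C ambient, 27°C coolant, 43°C core by the end of the Superposition 4K run); a quick check:

```python
# Temperature deltas from the figures quoted above (mshagg's Superposition
# 4K run): ambient 23°C, coolant 27°C, core 43°C by end of run.
ambient_c, coolant_c, core_c = 23, 27, 43

print(core_c - coolant_c)     # water-to-GPU delta -> 16
print(core_c - ambient_c)     # core-to-ambient delta -> 20
print(coolant_c - ambient_c)  # coolant rise over ambient -> 4
```

So the water-to-GPU delta is 16°C (a bit above the 10-12°C Coopiklaani expects), while the core-to-ambient delta is the 20°C mshagg quotes; the two posters are measuring against different baselines.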


----------



## nrpeyton

Anyone with a 1080 Ti Founders Edition waterblock (that you've not yet fitted) do me a quick favour please?

Put a measuring tape across your block like I have done in this picture below:




Then either post or PM me the photos please?

Thanks a million!


----------



## alucardis666

Quote:


> Originally Posted by *BBEG*
> 
> Wow, that went quick. Glad I jumped on it when I got the email.
> 
> I had a quick exchange with EVGA and they assured me that using my own thermal compound, in particular the one you suggested, would not void the warranty. I'm glad EVGA is willing to warranty the unit with the card when correctly installed (this was part of me wanting their hybrid cooler) and I'm doubly glad they'll let me use my own goop with it. I look forward to the lower noise and temps. This reference cooler is a little better in tone than my old 680 coolers, but not by much and the 1080 Ti heats up much quicker. At least with the 680s I could keep them under the throttle limit at 100% fan speed, unbearable though that was.


Awesome!








Quote:


> Originally Posted by *shadow85*
> 
> Thanks for the info buddy. But STRIX is fine for SLi. I already am using GTX 1080 STRIX OC in SLi and Card2 only reaches a max of 76°C @2050 boost. That's competly acceptable by me. I have a Corsair 900D with a X99s Gaming 7 M/B, i think 2 slots apart for the cards.
> 
> But your comments on getting the 7700k is interesting. But *I would need to see evidence of that being considerably better than the i7-5930k for gaming*. If anyone could link me any benchmarks of the i7-5930K OC'd and the 7700K OC'd @4K gaming and the latter did show better results then I would definitly like to put that on my to-buy list.






Skip to the 6 minute mark. Bear in mind he is using a 980 Ti and these are stock clocks. Pay extra close attention to minimum frames, not averages. OC'ing a 7700K to 5.0GHz or beyond will widen the gap, as most CPUs with more cores can't go much higher than 4.0 (Ryzen) or 4.3 (Broadwell-X/Skylake-X).


----------



## KedarWolf

Added *'How To Maximise Your Memory Overclock'* to the OP; I need to search the thread to find who I picked this up from so I can give credit.


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Possible important info on Memory Overclocking.
> 
> I trust some of you have now been using OCCT to find max stable memory O/C, (which we found is great at finding a more precise, stable, upper memory O/C).
> 
> But one more thing to note of importance:
> 
> I believe there 'could be' different memory timings straps programmed into BIOS which are automatically activated depending on your memory O/C. (for instance a good comparison to this would be how DDR3/4 XMP profiles load optimised (looser) timings for higher speeds you select in motherboard BIOS).
> 
> Anyway these timings can affect your score when you're trying to find best memory O/C.
> 
> During the 1080 days (at the owners club over there) there was even a rumour circulating that odd overclocks scored higher than even overclock numbers, or that memory overclocks ending in 0 didn't score as well as those ending in 5. _(I.E. +495/+605 always scored higher than +500/+600.)_
> What is more likely to be the case is that: pre-optimised 'memory timings straps' are selected by BIOS depending on the O/C you enter.
> For example; +250...lost FPS. But +240 and +255-+260 = gain.
> 
> And remember OCCT is still the best we have for testing memory O/C *"stability"*.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If anyone has anything to add, or any corrections (or different opinion), fire away  *Would be good to see some 'memory overclocking' spirited discussion.*


Here is the peep!! @nrpeyton Added this to OP with procedure.

Thank you!!

I got an amazing +723 stable on memory with this help.
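The "timing strap" theory in the quoted post can be sketched as a simple threshold lookup. The strap boundaries below are invented purely for illustration (Pascal's real memory straps, if they exist, are not public), but they show how two nearby offsets could land in different timing sets:

```python
# Hypothetical illustration of the "memory timing strap" theory from the
# quoted post: the BIOS is assumed to switch to looser timings once the
# memory offset crosses certain thresholds. The boundaries below are
# invented for illustration -- Pascal's real straps are not documented.
STRAP_BOUNDARIES = [0, 250, 500, 750]  # assumed offsets (MHz) where timings loosen

def strap_for(offset_mhz: int) -> int:
    """Return the index of the (hypothetical) timing strap an offset lands in."""
    strap = 0
    for i, boundary in enumerate(STRAP_BOUNDARIES):
        if offset_mhz >= boundary:
            strap = i
    return strap

# Under these assumed boundaries, +240 and +250 land in different straps,
# which would explain why a slightly lower offset can score higher.
print(strap_for(240), strap_for(250))  # -> 0 1
```

If something like this is going on, the practical takeaway matches the post: test offsets on both sides of a round number rather than assuming FPS scales monotonically with the slider.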


----------



## GosuPl

I compared GTX 1080Ti vs TITAN X Pascal 2016, clock vs clock and + 1000 on memory effective.

1080Ti @2012/12000
TITAN X Pascal @2012/11000

http://www.3dmark.com/compare/fs/12537346/fs/12536362

http://www.3dmark.com/compare/fs/12537379/fs/12536406

http://www.3dmark.com/compare/fs/12537309/fs/12536330

NVIDIA claims the "1080 Ti is faster than TITAN X"; nope, it is not faster. ONLY stock vs stock, and that's because of better temperatures on the 1080 Ti and a higher core clock.

Maybe soon I will buy a TITAN Xp and compare it vs the 1080 Ti and TX Pascal, clock vs clock. But the performance gain is not worth it in real scenarios.

On the GPU-Z screen you can see memory bandwidth, pixel fillrate and texture fillrate.

Please don't look at the clocks in 3DMark, it's bugged as always (same GPU and CPU). Boost on GPU-Z is misleading too, because GPU Boost 3.0 is very strange.

Cards work at the same clocks; the TX P even throttles more, from 2012 to 1974/1961-1923.

The 1080 Ti stays at 1974 most of the time.

I am satisfied with trading 2x 1080 Ti for 2x TITAN X Pascal, now time to LC them


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> Here is the peep!! @nrpeyton Added this to OP with procedure.
> 
> Thank you!!
> 
> I got an amazing +723 stable on memory with this help.


Thanks for the mention.

This means a lot to me 
Slightly 'moved' actually









I had a look around to find the original posts; here they are: _(they feature some additional detail for anyone interested in finding max memory overclock using OCCT)_

original posts: _(hope this helps)_

Quote:


> Originally Posted by *nrpeyton*
> 
> How are you guys doing so far with memory overclocks?
> 
> My initial testing is this:
> 
> *+565* = 14 errors within 30 seconds
> 
> *+560* = 10 minute run 0 errors (completely stable)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (In other words, with my memory overclocked to within a ball hair of its life at 560 I'm completely stable.. but even 1mhz more and things go wild lol).
> 
> /\ thats at a temp of approx. 60c (fan 100% and a power draw of 277w avg. + default clocks + volts)
> 
> *+675* = crash


Quote:


> Originally Posted by *Slackaveli*
> 
> how are you testing for vram errors?? PLEASE?..


Quote:


> Originally Posted by *nrpeyton*
> 
> OCCT is perfect for this (if you carefully set it up).
> 
> It will allow you to test max memory overclock to within 5mhz with absolute precision.
> 
> The only varying factor you need worry about is varying temperatures, which can/will affect results. Also try and lay in a max FPS limit that simulates a real-world power scenario (I.E. 285w draw)
> 
> Moreover 'shader complexity' should be set to 3.
> 
> Its method is simple but elegant. It simply compares one frame to the last; any discrepancy = error.
> 
> Don't be put off by the fact the programme only *claims* to test 2GB. It doesn't matter; it tests that 2GB over all modules. It only means 2GB is loaded at any one time. However, in recent versions this number is actually doubled, making the test even better. _(They just didn't update the GUI to say so.)_


Also, nearly forgot; you can change the power draw in real time without restarting the test simply by dragging the OCCT render "window" bigger. Or reduce it by making the window smaller. _(No need to restart the stress-test)._
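The frame-comparison idea described in the quoted post (render, compare to the previous frame, count mismatches) can be sketched roughly like this; the "frames" here are plain lists of RGB tuples standing in for GPU render output, not OCCT's actual implementation:

```python
import random

# Rough sketch of the frame-comparison idea described above: render the same
# scene twice and count the pixels that differ. A deterministic render on
# stable hardware is identical frame to frame; VRAM errors show up as
# mismatching pixels. The "frames" here are plain lists of RGB tuples
# standing in for GPU render output.

def count_errors(frame_a, frame_b):
    """Number of pixels that differ between two frames of the same scene."""
    return sum(1 for a, b in zip(frame_a, frame_b) if a != b)

random.seed(0)
stable = [(random.randrange(256),) * 3 for _ in range(16)]  # 16 grey pixels

# A stable card reproduces the frame exactly: zero errors.
print(count_errors(stable, list(stable)))  # -> 0

# Simulate one flipped bit in VRAM: a single-pixel discrepancy = one error.
glitched = list(stable)
r, g, b = glitched[5]
glitched[5] = (r ^ 0x10, g, b)
print(count_errors(stable, glitched))  # -> 1
```

That's also why VRAM temps matter to the result: a marginal overclock produces occasional bit errors, and any non-zero count means the offset isn't stable.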


----------



## MURDoctrine

Quote:


> Originally Posted by *GosuPl*
> 
> I compared GTX 1080Ti vs TITAN X Pascal 2016, clock vs clock and + 1000 on memory effective.
> 
> 1080Ti @2012/12000
> TITAN X Pascal @2012/11000
> 
> http://www.3dmark.com/compare/fs/12537346/fs/12536362
> 
> http://www.3dmark.com/compare/fs/12537379/fs/12536406
> 
> http://www.3dmark.com/compare/fs/12537309/fs/12536330
> 
> NVIDIA claims the "1080 Ti is faster than TITAN X"; nope, it is not faster. ONLY stock vs stock, and that's because of better temperatures on the 1080 Ti and a higher core clock.
> 
> Maybe soon I will buy a TITAN Xp and compare it vs the 1080 Ti and TX Pascal, clock vs clock. But the performance gain is not worth it in real scenarios.
> 
> On the GPU-Z screen you can see memory bandwidth, pixel fillrate and texture fillrate.
> 
> Please don't look at the clocks in 3DMark, it's bugged as always (same GPU and CPU). Boost on GPU-Z is misleading too, because GPU Boost 3.0 is very strange.
> 
> Cards work at the same clocks; the TX P even throttles more, from 2012 to 1974/1961-1923.
> 
> The 1080 Ti stays at 1974 most of the time.
> 
> I am satisfied with trading 2x 1080 Ti for 2x TITAN X Pascal, now time to LC them


I'm honestly not sure what you expected. It is the same GPU with just different clocks out of the box and 1GB of VRAM removed.


----------



## DerComissar

Quote:


> Originally Posted by *nrpeyton*
> 
> Anyone with a 1080 Ti Founders Edition waterblock (that you've not yet fitted) do me a quick favour please?
> 
> Put a measuring tape across your block like I have done in this picture below:
> 
> Then either post or PM me the photos please?
> 
> Thanks a million!


PM'd.


----------



## nrpeyton

Quote:


> Originally Posted by *DerComissar*
> 
> PM'd.


+1 thanks ;-)


----------



## GosuPl

Quote:


> Originally Posted by *MURDoctrine*
> 
> I'm honestly not sure what you expected. It is the same GPU with just different clocks out of the box and 1GB of VRAM removed.


Not exactly the SAME:

The 1080 Ti has 88 ROPs and a 352-bit memory bus.
The TX Pascal has 96 ROPs and a 384-bit memory bus.

+1000 effective on the memory of both (1080 Ti 12000, TX P 11000), and the TX P still has higher memory bandwidth.

Funny thing is, people sometimes sell the TX P even cheaper than a 1080 Ti costs...
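For reference, peak GDDR bandwidth is just bus width in bytes times the effective data rate. Interestingly, plugging in the two configurations quoted above, the +1000 offsets work out to the same figure on both cards:

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate.
# Figures from the post above: both cards at +1000 on the memory.
def bandwidth_gbs(bus_bits: int, effective_mhz: int) -> float:
    """Peak bandwidth in GB/s for a GDDR bus (using 1 GB/s = 1000 MB/s)."""
    return bus_bits / 8 * effective_mhz / 1000

gtx_1080_ti = bandwidth_gbs(352, 12000)      # 352-bit @ 12 Gbps effective
titan_x_pascal = bandwidth_gbs(384, 11000)   # 384-bit @ 11 Gbps effective
print(gtx_1080_ti, titan_x_pascal)  # -> 528.0 528.0
```

So at those particular overclocks the wider bus and the faster data rate cancel out exactly; the TX P's bandwidth lead only holds at equal memory clocks.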


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> +1 thanks ;-)


OCCT DX11 doesn't run on my platform for some unknown reason. I'd recommend using EVGA OC Scanner instead. https://www.evga.com/ocscanner/


----------



## nrpeyton

Anyone else just had the little
"NVIDIA GeForce Game Ready Driver is available" popup?

Have to say; they're churning these updates out pretty fast ;-)


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> OCCT DX11 doesn't run on my platform for some reason unknown. I'd recommend to use EVGA oc scanner instead. https://www.evga.com/ocscanner/


Have you tried v. 4.4.2. ?

I'm on 64bit windows 10.


----------



## rexbinary

Hi all. I just joined the club!











I replaced two 980 Ti cards with this as I got sick of the poor SLI support nowadays.

It's running everything great at 3440x1440. Very happy!


----------



## Qba73

Interesting, Gamernexus review of FTW3


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> Have you tried v. 4.4.2. ?
> 
> I'm on 64bit windows 10.


Tried both 4.4.2 and 4.5.0; neither worked. I'm on X99. I guess it has something to do with the X99 platform.


----------



## Benny89

Quote:


> Originally Posted by *Qba73*
> 
> Interesting, Gamernexus review of FTW3


Already watched it today. I love the conclusion that the SC2 is basically a better buy than the FTW3 from EVGA.

It is also cooler at stock vs other AIBs, but ONLY because EVGA has a more aggressive auto-fan profile, and it is louder than other AIBs' auto-fan profiles this way.

In terms of OC, the FTW3 doesn't provide anything more than other AIBs; it is all Silicon Lottery in the end.









It has 127% Power Limit Slider if anyone wants to know.


----------



## Qba73

Quote:


> Originally Posted by *Benny89*
> 
> Already watched today, I love conclusion that SC2 is basicelly better buy than FTW3 from EVGA.
> 
> It is also cooler on stock vs other AIBs, but ONLY because EVGA has more aggresive auto-fan profile and it is louder than othe AIBs auto-fan profiles this way.
> 
> In terms of OC- FTW3 doesn't provide anything more than other AIBs, it is all Silicon Lottery in the end
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It has 127% Power Limit Slider if anyone wants to know.


Yup, at the $780 tag for the FTW3, the SC2 is the better purchase.

But I preordered mine at 749 (same price as the SC2 is now) ;-)

Coming on Monday... gonna pit the Xtreme against it, keep the winner.

Yeah, all AIBs come down to the silicon lottery, at least this gen... Maxwell, oh how I miss thee..


----------



## Alwrath

Quote:


> Originally Posted by *Benny89*
> 
> It has 127% Power Limit Slider if anyone wants to know.


So is it 7% higher than the FE? That would mean it's not as limited as an FE for OC potential.


----------



## lilchronic

Quote:


> Originally Posted by *Coopiklaani*
> 
> Tried both 4.4.2 and 4.5.0, none worked. I'm on X99. I guess it has something to do with x99 platform


4.5.0 works for me.


----------



## Benny89

Quote:


> Originally Posted by *Alwrath*
> 
> So is it 7% higher than the FE? That would mean its not as limited as a FE for OC potential.


I think it is different: it's 7% higher than the 330W from Asus and MSI, which are 120% Power Sliders. FE is 300W.

So I guess that places EVGA somewhere in the range of 340-350W, if my assumptions are correct?
Quote:


> Originally Posted by *Qba73*
> 
> yup at the $780 tage for the FTW3, Sc2 is better purchase.
> 
> but I preordered mine at 749 (same price as sc2 is now) ;-)
> 
> coming on Monday..gonna pit the xtreme against it, keep the winner.
> 
> yeah all AIBs come down to the silicon lottery at least this gen...Maxwell oh how I miss thee..


True, I miss Maxwells so much







Still having a hard time giving up on my 980 Ti G1.

The EVGA FTW3 and SC2 are REALLY LOUD. I mean, at 50% the Aorus and MSI are 34 dB and the SC is like 47 dB!!! LOL

EVGA need to finally upgrade their fans, for god's sake....


----------



## Alwrath

Quote:


> Originally Posted by *Benny89*
> 
> I think it is different- its 7% higher than 330W from Asus and MSI which ar 120% Power Sliders. FE is 300W.
> 
> So I guess that place EVGA somewhe in the ground of 340-350W if my assumptions are correct?


I'm contemplating stepping up to a FTW3 or possibly a Classified with my Hybrid cooler to see what clocks I can get. If it's around 350W, that might be a good try for 2100MHz.


----------



## Benny89

Quote:


> Originally Posted by *Alwrath*
> 
> Im contemplating stepping up to a FTW3 or possibly a classified with my Hybrid cooler to see what clocks I can get. If its around 350w that might be a good try for 2100 mhz.


Sadly, 2100MHz is pure Silicon Lottery. Power Limit of course helps if you win a golden chip, but the silicon lottery is what matters most. We have guys here with Aorus and Zotac cards, which have a 375W Power Limit, and most of them didn't get to 2100MHz. Silicon Lottery...

From the review you can see that the FTW3's OC results were no better than any other AIB's.

QUESTION: What heatsinks do I need for the VRAM and the other smaller chips on a 1080 Ti?

Are those ok? :

https://www.komputronik.pl/product/109991/aab-cooling-ram-heatsink-2.html?gclid=Cj0KEQjwoqvIBRD6ls6og8qB77YBEiQAcqqHe8OuTmr2DumxaPI-bXH0HnQ9NR2hqBkAa-wBfVPM57AaAiXH8P8HAQ&gclsrc=aw.ds

And those? :

http://www.aquatuning.pl/chlodzenie-powietrzne/chlodzenia-pasywne/13663/alphacool-gpu-ram-aluminium-heatsinks-15x15mm-black-10-stk.?sPartner=googleshoppingpl&gclid=Cj0KEQjwoqvIBRD6ls6og8qB77YBEiQAcqqHe1YMwkaeoHYHaPuIDpLZqnfFAGxcONZ17ZviqlsMlFIaAgqA8P8HAQ


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> Tried both 4.4.2 and 4.5.0, none worked. I'm on X99. I guess it has something to do with x99 platform


okay, i'll see if I can find anything out

Edit:
what error are you getting?


----------



## Sweetwater

Quote:


> Originally Posted by *Benny89*
> 
> True, I miss Maxwells so much
> 
> 
> 
> 
> 
> 
> 
> Still having hard time giving up on my 980Ti G1.


I do and don't at the same time haha. My G1 Gaming got 1480MHz but would top out at 70°C, while my 1080 Ti tops out at 52°C with much better fps. The 980 Ti was a beast and is still a good card for sure.


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> okay, i'll see if I can find anything out
> 
> Edit:
> what error are you getting?


"Stopped User Cancelled" 1 second after I clicked ON button.
DX9 mode seems to work fine, but it only utilizes less than 1000MB of the vram.


----------



## Benny89

Quote:


> Originally Posted by *Sweetwater*
> 
> I do and don't at the same time haha. My g1 gaming got 1480mhz but would top out at 70c while my 1080ti tops out at 52c and much better fps. 980ti was a beast and still a good card for sure


It's because the chip I have now is above average and can do 2012/2000MHz on air at max, while my 980 Ti Gaming G1 could do 1555 24/7 at 72°C, which was golden back in the Maxwell days.

So I still think higher of my 980 Ti LOL


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> "Stopped User Cancelled" 1 second after I clicked ON button.
> DX9 mode seems to work fine, but it only utilizes less than 1000MB of the vram.


kk I've dropped them a quick email to see if anyone is manning their email address under the contact page.
(Their forum is down for maintenance).

Might be a quick fix or a known issue.

I'll let you know when I hear back from them.


----------



## alucardis666

Update for me. My EVGA 1080Ti Hybrid Kits arrive tomorrow! So excited!


----------



## nrpeyton

Quote:


> Originally Posted by *alucardis666*
> 
> Update for me. My EVGA 1080Ti Hybrid Kits arrive tomorrow! So excited!


I'm ordering the following this month too: _(but my hesitation is ruining my excitement)_









CPU:
i7 7700k _(upgrade from a 2012 'AMD FX-8350')_

Motherboard:
ASUS ROG MAXIMUS Hero IX Z270 _(upgrade from old 'Asrock 990FX Extreme9')_

Hesitation: But I *can't* decide on the 16GB memory *or* the new GPU block for my 1080Ti FE.

argh it's doing my head in lol.

I grudge paying "nearly" the price of my CPU _(and more than the cost of my mobo)_ for two 8GB sticks of bloody memory

absolute daylight robbery
memory companies should be ashamed!


----------



## Slackaveli

Quote:


> Originally Posted by *Coopiklaani*
> 
> "Stopped User Cancelled" 1 second after I clicked ON button.
> DX9 mode seems to work fine, but it only utilizes less than 1000MB of the vram.


same. i never got past that one. since i pretty well knew my best clocks pre artifacting, i just stopped trying, although i see it's helping others. my rams arent as great as my core.


----------



## nrpeyton

Quote:


> Originally Posted by *Slackaveli*
> 
> same. i never got past that one. since i pretty well knew my best clocks pre artifacting, i just stopped trying, although i see it's helping others. my rams arent as great as my core.


are you getting the same error as Coopiklaani?

I just tried the EVGA OC Scanner. It never seems to find errors/artifacts... then eventually, if I raise the mem O/C high enough, it'll just crash altogether.

I've emailed the OCCT team to see if we can liaise to get something sorted.

As the only working app capable of testing memory O/Cs on modern GPUs, it's in their vested interest to see these bugs ironed out.

I pointed out that their app is now at the forefront of a thread with what will shortly be nearly half a million views, so hopefully I'll get someone competent to get back to me.

I'll keep you guys updated.

If you can provide any more info on your system that I could copy/paste, plus a quick list of what you've tried and when exactly the error occurs, I'll compile it together and forward it on to them. And then I'll push them for a timeframe.

Hope that helps; not much else I can do until I hear back from them.

Just PM me if it's easier (and you can be bothered - lol)

But honestly, OCCT is fantastic. I find it so *precise* at finding the max memory O/C. It's also clearly impacted by VRAM temps.

Just need to get these bugs ironed out, so everyone can benefit.


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> I'm ordering the following this month too: _(but my hesitation is ruining my excitement)_
> 
> 
> 
> 
> 
> 
> 
> 
> 
> CPU:
> i7 7700k _(upgrade from a 2012 'AMD FX-8350')_
> 
> Motherboard:
> ASUS ROG MAXIMUS Hero IX Z270 _(upgrade from old 'Asrock 990FX Extreme9')_
> 
> Hesitation: But I *can't* decide on the 16GB memory *or* the new GPU block for my 1080Ti FE.
> 
> argh it's doing my head in lol.
> 
> I grudge paying "nearly" the price of my CPU _(and more than the cost of my mobo)_ for two 8gb sticks of bloody memory
> 
> absolute daylight robbery
> memory companies should be ashamed!


yeah, man, the memory bs is robbery. damn cell phones taking all the fabs.

yeah, same error. basically can't get it to start. I'm on Win10 64-bit, Z97 platform on Broadwell.


----------



## nrpeyton

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, man, the memory bs is robbery. damn cell phones taking all the fabs.


Aye, I really don't have the patience to scroll through another 10 memory reviews discussing speeds and performance per dollar etc. (I know: diminishing returns after 3000)....

But I'm also hearing a lot about the benefits of 3600+ for FPS when paired with a 1080 Ti.

And other one-liners about Samsung B-die being more overclockable.
(Recognising B-die isn't easy without ripping the heat-sink off the chip.)

*But* there are "tell-tale" signs you can watch out for in the shops; for example a 3200 CL14 kit is probably Samsung B-die.. a CL16 kit probably isn't.

However, when you get to 3600++ (which is what I'm aiming for) I believe it becomes harder to make a safe, educated assumption.
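The "tell-tale signs" above boil down to a rule of thumb on rated speed vs CAS latency. A toy sketch of that heuristic (the threshold is this thread's folklore, anchored at 3200 CL14; it is not a vendor spec, and other posters note exceptions at 3600+):

```python
# Toy version of the B-die rule of thumb discussed above: unusually tight
# CAS at a given rated speed suggests Samsung B-die. The threshold follows
# the thread's folklore (3200 CL14 = likely B-die, 3200 CL16 = likely not);
# it is a heuristic, not a vendor specification.
def likely_b_die(speed_mhz: int, cas: int) -> bool:
    """Guess whether a DDR4 kit is Samsung B-die from rated speed and CAS."""
    if speed_mhz <= 0 or cas <= 0:
        raise ValueError("speed and CAS must be positive")
    # B-die's signature is roughly linear speed/CAS scaling from 3200 CL14.
    return speed_mhz / cas >= 3200 / 14

print(likely_b_die(3200, 14))  # -> True
print(likely_b_die(3200, 16))  # -> False
print(likely_b_die(3600, 15))  # -> True
```

As the post says, the guess gets less reliable at 3600 and above, where looser-timed kits can still be B-die; per-kit research (or the vendor's QVL) beats the ratio there.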


----------



## Hulio225

Quote:


> Originally Posted by *nrpeyton*
> 
> aye, I really don't have the patience to scroll through another 10 memory reviews discussing speeds and performance per dollar e.t.c (I know diminishing returns after 3000)....
> 
> But I also hearing a lot about the benefits of 3600+ for FPS when paired with a 1080TI.
> 
> And other one-liners about Samsung B die being more overclockable.
> (recognising B-die isn't easy without ripping the heat-sink off the chip)


It is easy because it's known which manufacturer uses what ;-)

Basically all G.Skill 3600MHz 16GB kits with CL15, CL16 or CL17 are Samsung B-die, for example.


----------



## nrpeyton

Quote:


> Originally Posted by *Hulio225*
> 
> It is easy cause its know which manufacturer uses what ;-)
> 
> basically all g.skill 3600mhz 16gb kits with cl 15 or cl 16 or cl 17 are samsung b-die for example


ahh really, excellent.... 

What about Corsair?


----------



## Hulio225

Quote:


> Originally Posted by *nrpeyton*
> 
> ahh really, excellent....
> 
> What about Corsair?


You'll have to research Corsair yourself; I haven't checked which of their kits use what.
I bought G.Skill since I knew what was what xD


----------



## CoreyL4

Is it worth it to repaste an MSI Gaming X card? My temps are fine but I'm wondering if I can get like a 10°C decrease? Temps stay in the high 50s with the occasional spike to 60-62.


----------



## Hulio225

Quote:


> Originally Posted by *CoreyL4*
> 
> Is it wortj it to repaste a msi gaming x card? My temps are fine but im wondeting if i can get like a decrease of 10c? Temps stay in high 50s wigh occasional dip in 60-62


You can try, but the difference will be more like 2-3°C I guess.


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> aye, I really don't have the patience to scroll through another 10 memory reviews discussing speeds and performance per dollar e.t.c (I know diminishing returns after 3000)....
> 
> But I also hearing a lot about the benefits of 3600+ for FPS when paired with a 1080TI.
> 
> And other one-liners about Samsung B die being more overclockable.
> (recognising B-die isn't easy without ripping the heat-sink off the chip)
> 
> *But* there are "tell-tale" signs you can watch out for in the shops, for example a 3200 CS 14 kit are probably Samsung B Die.. a CS 16 kit probably isn't
> 
> However when you get to 3600++ (which is what I'm aiming for; I believe it becomes harder to make a safer educated assumption.)


for Ryzen they are for sure (the Samsung-B being the magic sticks). not sure about on kaby.


----------



## Slackaveli

Quote:


> Originally Posted by *Hulio225*
> 
> you can try but the difference will be mor like 2-3°c i guess


This, but possibly as much as ~7°C if it was a bad TIM and you use Thermal Grizzly Kryonaut.
Quote:


> Originally Posted by *CoreyL4*
> 
> Is it wortj it to repaste a msi gaming x card? My temps are fine but im wondeting if i can get like a decrease of 10c? Temps stay in high 50s wigh occasional dip in 60-62


----------



## Hulio225

Quote:


> Originally Posted by *Slackaveli*
> 
> for Ryzen they are for sure (the Samsung-B being the magic sticks). not sure about on kaby.


Samsung B-die was best for Kaby before Ryzen had even risen.
All world records on HWBot are done with B-die.


----------



## alucardis666

Quote:


> Originally Posted by *nrpeyton*
> 
> I grudge paying "nearly" the price of my CPU _(and more than the cost of my mobo)_ for two 8gb sticks of bloody memory
> 
> absolute daylight robbery
> memory companies should be ashamed!


Supply and demand.


----------



## Luckbad

Quote:


> Originally Posted by *Benny89*
> 
> I think it is different- its 7% higher than 330W from Asus and MSI which ar 120% Power Sliders. FE is 300W.
> 
> So I guess that place EVGA somewhe in the ground of 340-350W if my assumptions are correct?
> True, I miss Maxwells so much
> 
> 
> 
> 
> 
> 
> 
> Still having hard time giving up on my 980Ti G1.
> 
> EVGA FTW3 and SC2 are REALLY LOUND. I mean at 50% Aorus and MSI are 34db and SC is like 47db!!! LOL
> 
> EVGA need to finally upgrade their fans for god sake....


They can't unless they're willing to sacrifice size like everyone else. All the quiet cards are huge, generally 2.5 slots.

EVGA cards can also get pretty quiet if you make a custom fan profile (you should). I have a very quiet computer and ran an EVGA 1080 FTW and FTW2 in it. With a custom profile it is appreciably quieter.

That being said, the quietest and most pleasant-sounding graphics card I've ever had that wasn't on water is my current Zotac Amp Extreme. It's gigantic, basically 2.75 slots, but very quiet.

I briefly had an Asus Strix OC and hated the sound of it. Quiet but a distracting pitch to me.

I have an EVGA SC2 Hybrid coming tomorrow if my shipping department at work sorts through today's mail in time.

That's the only card that might pull me away from the Zotac other than the FTW3.


----------



## Slackaveli

Quote:


> Originally Posted by *Hulio225*
> 
> Samsung B-Die was best for kaby before ryzen has even risen.
> All worldrecords on hwbot are done with b die


word, man. i appreciate your posts. Samsung mems are always the best!


----------



## Slackaveli

Quote:


> Originally Posted by *Luckbad*
> 
> They can't unless they're willing to sacrifice size like everyone else. All the quiet cards are huge, generally 2.5 slots.
> 
> EVGA cards can also get pretty quiet if you make a custom fan profile (you should). I have a very quiet computer and ran an EVGA 1080 FTW and FTW2 in it. With a custom profile it is appreciably quieter.
> 
> That being said, the quietest and most pleasant-sounding graphics card I've ever had that wasn't on water is my current Zotac Amp Extreme. It's gigantic, basically 2.75 slots, but very quiet.
> 
> I briefly had an Asus Strix OC and hated the sound of it. Quiet but a distracting pitch to me.
> 
> I have an EVGA SC2 Hybrid coming tomorrow if my shipping department at work sorts through today's mail in time.
> 
> That's the only card that might pull me away from the Zotac other than the FTW3.


Feel the same about the Aorus. The fans really get the job done, and silently. But you are right, because it is also a damn near 3-slot card.


----------



## madmeatballs

My temps with the AC waterblock and backplate in PUBG are 37-38°C max at 22°C ambient; after a while it settles at 36°C.


----------



## nrpeyton

New build almost picked out:

-Asus ROG MAXIMUS IX CODE
-Intel Core i7-7700K

/\ found a great 'combo deal' on the above....

But
I just _can't_ get my head around DDR4 memory prices....

To make *full* *use* of my boards max memory capability, I'm looking at £250 (that's about $323 in U.S dollars) for a 4133 kit

(Bear in mind I'm coming from an AM3+ AMD setup; the board's max mem speed is 2133MHz _(8GB of which is often $50 on eBay)_.)

In _some_ games using the latest Nvidia GPUs I'm seeing 10+ FPS differences between 3000MHz DDR4 and 4000MHz DDR4.
.. but in _other_ games it's only 1-2 FPS

_Anyone else been through this recently?_


----------



## ViRuS2k

Arghhhh lol, wish this super BIOS would come out








cause tomorrow I'm going to be shunt modding my 1080 Tis







and if this BIOS does come out after I have shunt modded, I will be pretty annoyed lol


----------



## nrpeyton

Quote:


> Originally Posted by *ViRuS2k*
> 
> arghhhh lol wish this super bios would come out
> 
> 
> 
> 
> 
> 
> 
> 
> cause tomorrow im going to be shunt modding my 1080ti`s
> 
> 
> 
> 
> 
> 
> 
> and if this bios does come out after i have shunt modded i will be pretty annoyed lol


No need to be annoyed.

A lot of ppl will still prefer to shunt mod.

Cross-flashing doesn't always result in higher benchmark scores, because a particular BIOS is specifically tuned to its native hardware.
It's been proven time and again that you _usually_ score higher using your native BIOS.

A few exceptions to that were last round (1080 non Ti's) where FTW2 1080's scored higher using the ASUS unlocked bios.

It's also down to the silicon lottery: if your card is capable of utilising the extra power (it has to be able to clock higher to do so)...
...and if you're one of the lucky ones able to take advantage of voltage up to 1.25v (because the lottery of your card allows it), then you can be in for a treat.

Unfortunately, more often than not that's not the case.

ALSO:
Even if ASUS makes the right decision or the XOC guy is kind enough to leak the file

then
1) I'll still be doing the shunt mod anyway
2) I'll still be trying the BIOS

and 3) I'll enjoy the chance to compare results of both and do my own benchmarking and analysis (as I'm sure you and many others will too)


----------



## ViRuS2k

nrpeyton







It's not the voltage I'm after in this BIOS
















Voltages are fine; I just want to be able to reliably run 2050 in SLI without the clocks jumping around due to power limitations.
With all power and thermal limits gone, I'd be in heaven with that BIOS.


----------



## mshagg

Quote:


> Originally Posted by *Alwrath*
> 
> So is it 7% higher than the FE? That would mean its not as limited as a FE for OC potential.


They have different base TDPs.

FTW3 is 280W base (I think), so 127% gets you ~356W. Founders Edition is 250W base; at 120% that's 300W.
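The slider math in this exchange is just base TDP times the power-limit percentage. A quick check of the figures quoted above (note the 280W FTW3 base is the poster's "I think", not a confirmed spec):

```python
# Max board power = base TDP * power-limit slider.
# Base TDPs are the figures from the posts above; the FTW3's 280W base
# is the poster's guess, not a confirmed specification.
def max_power_w(base_tdp_w: float, slider_pct: int) -> float:
    """Board power ceiling in watts at a given power-limit percentage."""
    return base_tdp_w * slider_pct / 100

print(max_power_w(250, 120))  # Founders Edition -> 300.0
print(max_power_w(280, 127))  # FTW3 (assumed base) -> 355.6
```

This is why the 127% slider alone doesn't say much: the percentage is relative to each card's own base TDP, so a bigger slider on a lower base can still mean fewer watts.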


----------



## nrpeyton

Quote:


> Originally Posted by *ViRuS2k*
> 
> nrpeyton
> 
> 
> 
> 
> 
> 
> 
> its not the voltage im after in this bios
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> voltages are fine, i just want to be able to reliably run 2050 in sli without the clocks jumping around due to power limitations
> with all power and thermal limitations gone i would be in heaven with that bios


hehe ;-)


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> New build almost picked out:
> 
> -Asus ROG MAXIMUS IX CODE
> -Intel Core i7-7700K
> 
> /\ found a great 'combo deal' on the above....
> 
> But
> I just _can't_ get my head around DDR4 memory prices....
> 
> To make *full* *use* of my boards max memory capability, I'm looking at £250 (that's about $323 in U.S dollars) for a 4133 kit
> 
> (Bear in mind; I'm coming from an AM3+ AMD setup. Boards max mem speed is 2133Mhz _(which often for 8gb is $50 on Ebay)_
> 
> On _some_ games using latest Nvidia GPU's I'm seeing up to 10+ FPS differences between 3000 Mhz DDR4 and 4000 Mhz DDR4.
> .. but _other_ games it's only 1/2 FPS
> 
> _Anyone else been through this recently?_


You are a true OCer, so anything less than the fastest available will always be a niggle; don't think twice, just get it. I went through that with a 3570K and memory: I saved a little money, but the knowledge that something faster was out there was simply too much. "Buy cheap, buy twice" is my motto. I ended up with quite a good media PC in the end, though.

As for the ASUS BIOS, I believe ASUS are compelled to release it because of the "hotwire" solder points on the Strix; it would be false advertising if the card didn't let you go past software voltage mods.

Edit: a lot of answers here.

This was an interesting read, the ASUS LN2 ultimate guide:
http://forum.hwbot.org/showthread.php?t=169488
I am getting tempted to try this even though I got a bad clocker; it depends on the XOC BIOS release, and I'm still not sure if the original BIOS has a voltage limit.

I believe that was the reason the EVBot got cancelled: NVIDIA were not happy that RMAed cards were impossible to tell apart from modded ones, unlike with ASUS.

I have also had 3 BSODs since putting in the 1080 Ti beast, but only when the PC has been idling for 6+ hours, never at any other time, so I am hoping it will fix itself with NVIDIA driver updates. I haven't looked deeply into it yet, as this OS is due for a fresh install; it has been through 3 different motherboards and an 8.1-to-10 upgrade.
Also, with the bad rap SLI is getting over game profiles, NVIDIA are probably working around the clock to fix them, with Vega due in the next month.


----------



## alucardis666

So... anyone buying these?!

http://www.guru3d.com/news-story/manli-releases-geforce-gtx-1080-ti-gallardo.html

http://www.guru3d.com/news-story/colorful-launches-igame-geforce-gtx-1080-ti-vulcan-x-oc.html


----------



## Alex132

Quote:


> Originally Posted by *alucardis666*
> 
> So... anyone buying these?!
> 
> http://www.guru3d.com/news-story/manli-releases-geforce-gtx-1080-ti-gallardo.html
> 
> http://www.guru3d.com/news-story/colorful-launches-igame-geforce-gtx-1080-ti-vulcan-x-oc.html


Lol no. I have never had a good experience with Colorful.

I am getting the EVGA 1080 Ti FTW3 myself, it's overbuilt and overkill. Just how I like it


----------



## feznz

Those little warranty stickers on the screws holding the cooler on: I was thinking about it, and I have not had a problem RMAing a card with the seals broken. Actually, after reading some members' antics (no names mentioned), I think they are aimed more at the EU market, so resellers can easily check a card for the 14-day no-questions-asked return.


----------



## DStealth

I live in the EU and my RMA claim with MSI was approved. Awaiting my new card, even though my seal was perforated.


----------



## KedarWolf

Quote:


> Originally Posted by *Coopiklaani*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nrpeyton*
> 
> +1 thanks ;-)
> 
> 
> 
> OCCT DX11 doesn't run on my platform for some reason unknown. I'd recommend to use EVGA oc scanner instead. https://www.evga.com/ocscanner/
Click to expand...

Go into Settings and put Hardware Monitoring to Disabled.


----------



## Clukos

Quote:


> Originally Posted by *nrpeyton*
> 
> New build almost picked out:
> 
> -Asus ROG MAXIMUS IX CODE
> -Intel Core i7-7700K
> 
> /\ found a great 'combo deal' on the above....
> 
> But
> I just _can't_ get my head around DDR4 memory prices....
> 
> To make *full* *use* of my boards max memory capability, I'm looking at £250 (that's about $323 in U.S dollars) for a 4133 kit
> 
> (Bear in mind; I'm coming from an AM3+ AMD setup. Boards max mem speed is 2133Mhz _(which often for 8gb is $50 on Ebay)_
> 
> On _some_ games using latest Nvidia GPU's I'm seeing up to 10+ FPS differences between 3000 Mhz DDR4 and 4000 Mhz DDR4.
> .. but _other_ games it's only 1/2 FPS
> 
> _Anyone else been through this recently?_


What you are describing is that some games are more mem bandwidth limited than others. In such games, in CPU limited scenarios, you can see 10+ fps difference between slower ram and faster ram.
Quote:


> Originally Posted by *DStealth*
> 
> I live in EU and my RMA claim from MSI is approved. Awaiting my new card even my seal was perforated


So we can re-paste no problem?


----------



## hotrod717

Quote:


> Originally Posted by *feznz*
> 
> I have theory when it comes to getting golden cards that is wait for at least 6 months after release
> As the process of making the chips are refined I believe it takes 3-4 months from start to finish to make a silicon chip that means that there is room for them to improve them but the fixes will take several months to surface to the retailers then you got to wait for old stock to move.
> Just a theory
> 
> I would have been more happy with the Strix in your sig
> 
> and the old looking others achievements I tell you the truth I only post my best results, it might of taken a frosty morning or there are plenty of other tricks to boost your score especially drivers I had a few old favourites or a fresh spread of TIM every few weeks/months is probably the best trick I know of.
> 
> or simply lie about your stability I can say stability for me is a 6 hour non stop game session for others it could be anything
> I notice that stability issues normally show after 1-2 hours of non-stop gaming


Would have to disagree: release cards. Both my 1080 Ti and TXp are release cards and both were at the front of the pack, 2144 on the Ti and 2138 on the TXp, neither shunt modded. I would say the refined process after months of manufacture improves your chances of getting a better card, or makes you less likely to get a total dud. Also, for water or extreme cooling, the AIB becomes a moot point unless they come out with a totally unlocked version that offers total voltage control, VRM frequency, PEX, LLC, etc.


----------



## Benny89

Quote:


> Originally Posted by *feznz*
> 
> Those little warrantee stickers on the screws holding the cooler on, I was thinking about it I have not had a problem RMA the card with the seals broken actually after reading some members antics no names mentioned I think they are aimed more for the EU market so resellers can easily check a card for the 14 day no questions asked return.


Actually, missing stickers won't affect the return policy at all, as long as the card is working and not physically damaged.

And that really depends on the brand. I know MSI doesn't care, but with ASUS, for example, who knows. I will email ASUS and ask them.


----------



## mtbiker033

Quote:


> Originally Posted by *alucardis666*
> 
> Update for me. My EVGA 1080Ti Hybrid Kits arrive tomorrow! So excited!


sounds like today is going to be a good day for you!! good luck and take lots of pics!!!

mine isn't due in until Tuesday


----------



## TheBoom

Quote:


> Originally Posted by *alucardis666*
> 
> So... anyone buying these?!
> 
> http://www.guru3d.com/news-story/manli-releases-geforce-gtx-1080-ti-gallardo.html
> 
> http://www.guru3d.com/news-story/colorful-launches-igame-geforce-gtx-1080-ti-vulcan-x-oc.html


Colorful's is really good looking, I must say.

Never heard of Manli though; very creative and original name for their card lul.


----------



## Asmola

Quote:


> Originally Posted by *nrpeyton*
> 
> Just found a guy (one of the extreme O/C'ing guys) who has access to an XOC BIOS for 1080Ti. It allows voltage up to 1.25v and removes all power and TDP limits.
> 
> He's waiting on permission from ASUS to release the BIOS to the public (same as they did on the older 1080 (non Ti's).
> 
> Hopefully if ASUS hit their RMA targets in the last two quarters they'll release the BIOS.
> 
> Anyhow, I've asked if he'll send me the BIOS privately via PM. Waiting on a response.
> 
> I'll keep you all updated.
> 
> if I get my hands on it; it might be "share via PM only" but hopefully not.
> 
> Watch this space. And keep your fingers crossed.
> 
> Those passionate about O/C'ing may be in for a nice treat soon ;-)
> 
> Not making any promises. But my calls _did_ get answered last round. So its looking *50/50.*


You can find all the information about XOC bios (and overclock guide) from here: http://forum.hwbot.org/showthread.php?t=169488&highlight=1080

Also asked if i can get that bios.


----------



## DStealth

Quote:


> Originally Posted by *Clukos*
> 
> What you are describing is that some games are more mem bandwidth limited than others. In such games, in CPU limited scenarios, you can see 10+ fps difference between slower ram and faster ram.
> So we can re-paste no problem?


No idea how different brands and distributors are acting.

Just telling what happened to me: I removed the screws to apply liquid metal and put them back. After the card burned, it was sent back to the MSI hub, and I already have a replacement on the way...
I'm not going to suggest anyone do this; better to know you're doing it at your own risk.


----------



## Bishop07764

Quote:


> Originally Posted by *nrpeyton*
> 
> New build almost picked out:
> 
> -Asus ROG MAXIMUS IX CODE
> -Intel Core i7-7700K
> 
> /\ found a great 'combo deal' on the above....
> 
> But
> I just _can't_ get my head around DDR4 memory prices....
> 
> To make *full* *use* of my boards max memory capability, I'm looking at £250 (that's about $323 in U.S dollars) for a 4133 kit
> 
> (Bear in mind; I'm coming from an AM3+ AMD setup. Boards max mem speed is 2133Mhz _(which often for 8gb is $50 on Ebay)_
> 
> On _some_ games using latest Nvidia GPU's I'm seeing up to 10+ FPS differences between 3000 Mhz DDR4 and 4000 Mhz DDR4.
> .. but _other_ games it's only 1/2 FPS
> 
> _Anyone else been through this recently?_


And here I am, still sitting on my Z77 and i7-3770K. This forum isn't healthy; you all are aggravating my upgrade itch.

But wouldn't memory that fast affect your core clock? I would imagine it puts a lot of strain on the memory controller.

I'm trying to convince myself that I wouldn't notice a true difference because of my G-Sync monitor.


----------



## TheBoom

Quote:


> Originally Posted by *Asmola*
> 
> You can find all the information about XOC bios (and overclock guide) from here: http://forum.hwbot.org/showthread.php?t=169488&highlight=1080
> 
> Also asked if i can get that bios.


Seems like this wouldn't be a good idea on air cooling though.

Unless there was a way to keep the temp limit on.


----------



## Asmola

Quote:


> Originally Posted by *TheBoom*
> 
> Seems like this wouldn't be a good idea on air cooling though.
> 
> Unless there was a way to keep the temp limit on.


True, but I'm going to use it with watercooling, though the result might be the same.


----------



## fisher6

Quote:


> Originally Posted by *Asmola*
> 
> True, but im going to use it with watercooling, but result might be the same.


Doubt we will see that BIOS anytime soon; he says he's waiting for ASUS's permission. Hope I'm wrong though.


----------



## max883




----------



## Asmola

Quote:


> Originally Posted by *fisher6*
> 
> Doubt we will see that bios anytime soon, says waiting for Asus permission. Hope I'm wrong though.


I hope I'm able to get it.

But unfortunately, I'm not going to be able to share it if I get it.


----------



## dunbheagan

Quote:


> Originally Posted by *Slackaveli*
> 
> religion in a nutshell right there. yes, it is profound.


Very good comparison!


----------



## fisher6

Quote:


> Originally Posted by *Asmola*
> 
> Well, i know elmor and Xtreme Addict personally, so i hope i'm able to get it.
> 
> 
> 
> 
> 
> 
> 
> But unfortunetly, im not going to be able to share it if i get it.


Rest my case!


----------



## pez

Quote:


> Originally Posted by *KickAssCop*
> 
> Poor cooling performance compared to competitor offering.
> High noise compared to competitor offering (even now, the SC2 cooler runs at 37 dba vs. 39 dba for reference - source: techpowerup)
> Bearing issues.
> Coil whine issues.
> Kingpin cooling for 980 Ti was also crap (ran hotter than normal classified cooling).
> 
> Anyways, I just want to see if FTW3 is on par w/ competitor offering - like 33 dba, 70 C or lower temps, no coil whine etc.


Quote:


> Originally Posted by *Alex132*
> 
> Keep in mind they are smaller and have more aggressive fan-curves. You can change that yourself and achieve lower noise.
> 
> Also bearing noise and coilwhine noise? Source on more than just ONE/etc. person(s) having this issue?
> 
> edit- you meant ACX not iCX. Not sure about ACX3.0 much beyond the design was a bit too dated and thermal pad issues.


I didn't have this issue on my ACX 3.0 1080 whatsoever, and I'm not having it on my iCX cooler either. Also keep in mind that while reviewers seem to have set the fan to 50% and then measured the noise, they don't point out that the EVGA card uses higher-RPM fans, or that cooling performance at 50% (their default fan profile) is not equivalent or really comparable to another AIB's default fan profile.

In my experience, I've had better results with the EVGA cards than the Strix OC. To put it into perspective, my SC Ti gets to 71-72C if I leave it on 50% fan, and the Strix I had would be at least 5C higher than that. Case conditions and airflow are going to be a more important factor here.


----------



## KraxKill

My card at -17C and only 1.093v is like a Ferrari with a brick under the gas pedal. I need more voltage! I bet I could run 2200+ with ease on that XOC BIOS.

I have Firestrike runs at 2168 MHz at 1.093v, and the card does 2100+ at ambient, so I feel fairly confident that with about 1.12v I could hit 2200 MHz.

I don't want to go the hard-mod route, but if I'm caged up much longer I may have to.

So far the only thing cooling has gained me is a few extra bins and the ability to run ridiculously low voltages at reasonable clocks. For example, with just -10C liquid I can run 2100 at just 1.040v, but there is no extra headroom with the locked voltage bins.


----------



## Hulio225

Quote:


> Originally Posted by *nrpeyton*
> 
> New build almost picked out:
> 
> -Asus ROG MAXIMUS IX CODE
> -Intel Core i7-7700K
> 
> /\ found a great 'combo deal' on the above....
> 
> But
> I just _can't_ get my head around DDR4 memory prices....
> 
> To make *full* *use* of my boards max memory capability, I'm looking at £250 (that's about $323 in U.S dollars) for a 4133 kit
> 
> (Bear in mind; I'm coming from an AM3+ AMD setup. Boards max mem speed is 2133Mhz _(which often for 8gb is $50 on Ebay)_
> 
> On _some_ games using latest Nvidia GPU's I'm seeing up to 10+ FPS differences between 3000 Mhz DDR4 and 4000 Mhz DDR4.
> .. but _other_ games it's only 1/2 FPS
> 
> _Anyone else been through this recently?_


You don't need to buy a 4133 MHz kit; those are sometimes even worse than a 3600 MHz CL16 B-die kit.
I talked with a guy on HWBot who has several world records and has binned over 100 RAM kits.

I'll copy-paste what he said about RAM kits:

"Tested 4266 rgb,failed to train 4000 12 12 12 1.85-2V. Also latest 4266 were worser on tight timings compared to 3600c16. Latest 3733 c17 and 3600c15 again were worser,1707 serial week,1701 on pcb,644 week on IC.

On the average from 5 sets of 3600C16 i had way better average than on 7 sets of 4266.Not to mention that 4266 i had 3 kits with one module DOA.

It is what it is , i binned over 100 kits overall and i have 3 modules super good,found about 6 GOOD(4133 12 11 11 Spi 32M wazza) and about 10 4000-4040 which is in my book average.

My advice would be to get a few kits of 3600 17-18-18 or 16 16 16 NON-RGB and keep the best,return the rest."

After that i just went to a local vendor and bought a 3600MHz CL 16 Kit.
For 24/7 it runs those timings, 16-16-16-32 1T @ 4000 MHz @ 1.45V.

And it can run 4080 MHz @ 12-12-12-24 1T with very tight sub and tertiary timings @ 1.96V.

If I loosen the timings I can run 4133 instantly (I am on an ASUS Apex board, though), and if I loosen the timings even more I can go even higher...
But RAM performance is best if you balance clocks with the tightest timings possible.

There is no need to buy B-die kits over 3600 MHz, IMO.
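A quick way to see why a tight 3600 kit can beat a loose 4133 kit is first-word latency: CAS cycles divided by the actual clock (half the data rate, since DDR transfers twice per clock). The kits below are just the examples from this discussion:

```python
# First-word latency in nanoseconds.
# DDR data rate is in MT/s; the real clock in MHz is data_rate / 2,
# so one clock cycle takes 1000 / (data_rate / 2) nanoseconds.
def latency_ns(data_rate_mt_s, cas):
    return cas / (data_rate_mt_s / 2) * 1000

print(round(latency_ns(3600, 16), 2))  # 3600 CL16
print(round(latency_ns(4000, 12), 2))  # 4000 CL12 (benching profile)
print(round(latency_ns(4133, 19), 2))  # 4133 CL19 (typical XMP)
```

The 3600 CL16 kit comes out around 8.9 ns against roughly 9.2 ns for 4133 CL19, which is the point being made: the higher-rated kit is not automatically faster once timings are accounted for.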


----------



## eXistencelies

Quote:


> Originally Posted by *KraxKill*
> 
> My card at -17C and only 1.093v is like a Ferrari with a brick under the gas pedal. I need more voltage!!!!!! I bet I could run 2200+ with ease on that XOC
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have firestrike scores at 2168 at 1.093 and the card does 2100+ at ambient so I feel fairly confident with about 1.12v I could hit 2200mhz.
> 
> Don't want to go the hard mod, but if I'm caged up much longer I may have to go that route.
> 
> So far the only thing cooling has gained me is a few extra bins and the ability to run ridiculously low voltages at reasonable clocks. For example with just -10C liquid, I can run 2100 at just 1.440v but there is no extra headroom with the locked vbins.


So you are negative Celsius? Am I missing something?


----------



## done12many2

Quote:


> Originally Posted by *eXistencelies*
> 
> So you are negative Celsius? Am I missing something?


This should clarify things for you.









http://www.overclock.net/t/1533164/the-24-7-sub-zero-liquid-chillbox-club/700_100#post_26039563


----------



## Hulio225

Quote:


> Originally Posted by *eXistencelies*
> 
> So you are negative Celsius? Am I missing something?


I was wondering the same, and asked him about that in the BIOS flash thread, but he didn't answer ^^

I want to know as well, and I want to know the cooling solution. Since he's saying -10C liquid: which liquid? What are you doing, mate? ^^

Edit:
Quote:


> Originally Posted by *done12many2*
> 
> This should clarify things for you.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1533164/the-24-7-sub-zero-liquid-chillbox-club/700_100#post_26039563


Thanks!


----------



## KraxKill

Quote:


> Originally Posted by *eXistencelies*
> 
> So you are negative Celsius? Am I missing something?


Yeah I'm sub zero 24/7.

I run a chiller and a chillbox that let me keep the computer sub-zero 24/7 at whichever temp I choose. I run it as my gaming rig, and as an HTPC the rest of the time. You can check a few pages back where I posted some pics.





More pics on page 266 of the thread.
http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/7950#post_26062336


----------



## Hulio225

Quote:


> Originally Posted by *Alwrath*
> 
> Gonna have to disagree. I had great success on my new Ryzen build with G Skill RGB 4133mhz 1.35v kit. I managed to get 3300 mhz at 14-13-13-13-33 at 1.35 voltage and I will push it to 3600 this month after agesa bios update.


Hehe, what you are telling me makes no sense; let me explain why.

You are running laughable clocks and timings for B-die in general, because of Ryzen.

I am talking about 4000 MHz @ 12-12-12-24 1T, which is a whole new dimension compared to 3300 @ 14-13-13-33.

I'm pretty sure that every B-die kit can do 3300 at those timings, so IMO you would have the same success with any other B-die kit.

No offense, but you are comparing B-die on a platform which has a lot of RAM problems at the moment with a platform which has none.

I mean, it's obvious that the limiting factor is the IMC on Ryzen and/or the mainboards etc., not the RAM kits themselves, since those run with tight timings and high clocks on other platforms.


----------



## lilchronic

Quote:


> Originally Posted by *Alwrath*
> 
> Gonna have to disagree. I had great success on my new Ryzen build with G Skill RGB 4133mhz 1.35v kit. I managed to get 3300 mhz at 14-13-13-13-33 at 1.35 voltage and I will push it to 3600 this month after agesa bios update.


lol


----------



## KraxKill

Quote:


> Originally Posted by *Hulio225*
> 
> I was wondering the same, and asked him abou that in the bios flash thread, but he didn't answer^^
> 
> I wanna know aswell and i wanna know the cooling solution since he is saying -10 C liquid, which liquid what are you doing mate?^^
> 
> Edit:
> Tanks!


The liquid is electronics grade, ethylene glycol and distilled water. Some people run propylene glycol and water but ethylene is less viscous at lower temperatures.
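For anyone sizing a glycol mix for sub-zero loops like this, the freezing point depends on concentration. A minimal sketch, assuming rounded handbook-style figures for ethylene glycol/water mixes (the table values are approximations for illustration; always check the actual coolant datasheet):

```python
# Approximate freezing point (deg C) of an ethylene glycol / water mix,
# linearly interpolated between rounded reference points.
# NOTE: the table values are illustrative approximations, not a datasheet.
EG_FREEZE = {0: 0, 10: -3, 20: -8, 30: -15, 40: -24, 50: -37}

def freeze_point_c(vol_pct):
    """Interpolated freezing point for a given EG volume percentage."""
    if vol_pct in EG_FREEZE:
        return float(EG_FREEZE[vol_pct])
    keys = sorted(EG_FREEZE)
    lo = max(k for k in keys if k < vol_pct)
    hi = min(k for k in keys if k > vol_pct)
    frac = (vol_pct - lo) / (hi - lo)
    return EG_FREEZE[lo] + frac * (EG_FREEZE[hi] - EG_FREEZE[lo])

# For -17 C coolant you'd want a mix that freezes comfortably lower:
print(freeze_point_c(35))
```

A 35% mix already sits near -20 C by this rough table, which is why mixes in the 40-50% range are common for chiller loops running well below zero.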


----------



## Hulio225

Quote:


> Originally Posted by *KraxKill*
> 
> The liquid is electronics grade, ethylene glycol and distilled water. Some people run propylene glycol and water but ethylene is less viscous at lower temperatures.


I'm very interested in your chilly cube ;-P Maybe that's something I'll consider building in the future xD


----------



## Alwrath

Quote:


> Originally Posted by *Hulio225*
> 
> Im am talking about 4000 MHz @ 12-12-12-24 1T which is a whole new dimension compared to 3300 14-13-13-33
> 
> im pretty sure that every b die kit can do 3300 at those timings... so imo you would have the same sucess with every other b die kit


I'm pretty sure I wouldn't be able to get my current timings/MHz with a 3600 CL16 kit; that's the point I was trying to make. Also, 4000 MHz at CL12 requires super high voltage that's not suitable for a 24/7 gaming setup.


----------



## done12many2

Quote:


> Originally Posted by *Alwrath*
> 
> *Im pretty sure I wouldent be able to get my current timings / mhz with a 3600 cl16 kit*, thats the point I was trying to make. Also, 4000 mhz at 12 cl requires super high voltage thats not suitable in a 24/7 gaming setup.


Sounds pretty scientific.







Hence why I asked if you tried a 3600 kit. They are pretty amazing.


----------



## Alwrath

Quote:


> Originally Posted by *done12many2*
> 
> Sounds pretty scientific.
> 
> 
> 
> 
> 
> 
> 
> Hence why I asked if you tried a 3600 kit. They are pretty amazing.


Well, I'm sure I could with higher voltage, but not at 1.35v.


----------



## Hulio225

Quote:


> Originally Posted by *Alwrath*
> 
> Im pretty sure I wouldent be able to get my current timings / mhz with a 3600 cl16 kit, thats the point I was trying to make. Also, 4000 mhz at 12 cl requires super high voltage thats not suitable in a 24/7 gaming setup.


Quote:


> Originally Posted by *done12many2*
> 
> Sounds pretty scientific.
> 
> 
> 
> 
> 
> 
> 
> Hence why I asked if you tried a 3600 kit. They are pretty amazing.


Yeah, 4000 MHz 12-12-12-24 1T is at 1.8-2.0 volts, just for benching.

But how come you think you wouldn't achieve those timings with a 3600 C16 kit?

It's the identical Samsung B-die mounted on the PCB; just the XMP profile is different. Oo

Please explain ^^


----------



## Alwrath

Quote:


> Originally Posted by *Hulio225*
> 
> But how does it come that you think , that you wouldn't achieve those timings with a 3600 C16 kit?


Well, I should have edited my post to include voltage. Of course I could get it, but I don't think it compares as well to the 4133 RGB kit I have. That's all I'm saying.


----------



## Hulio225

Quote:


> Originally Posted by *Alwrath*
> 
> Well I should have edited my post to include voltage. Of course I could get it, but I dont think it compares as well to the 4133 RGB kit I have. Thats all im saying.


Just for you, because I like you, I restarted my PC with a short visit to the BIOS xD

This is on a 3600 MHz C16 kit.

Edit: like I already said, it's the identical Samsung B-die mounted on the RAM PCB; just the XMP profile is different.


----------



## Garrett1974NL

Anyone else use Conductonaut or Liquid Ultra on their precious 1080 Ti GPU die?

I mounted an Accelero Xtreme III on it; max stable is 2050 MHz @ 1.081/1.093v, and with the Xtreme III @ 65% fan speed temps are 61C max.

I have the MSI Armor OC version. The cooler is complete trash lol... but I deliberately chose it because it DOES have a cooling plate on the VRMs, where the Gaming X version doesn't (only on the memory; the VRMs are cooled by the main Twin Frozr cooler).
Couldn't be happier with the cooling atm.


----------



## FuriousReload

Well, even though my FE runs amazing, I went ahead and ordered the FTW3 in black. I'm liking how overbuilt it is. I might sell my current FE with the Hybrid if the FTW3 comes close to it in clocks, or shunt mod it to try to bench higher than the 2126 I bench at now. If only that 2151 clock had held on for a bit longer.

2151 with the +753 memory should produce some good scores!


----------



## Alwrath

Quote:


> Originally Posted by *Hulio225*
> 
> Just for you because i like you, i restarted my pc with a short visit in the bios xD
> 
> This is on a 3600MHz c16 Kit
> 
> Edit: like i already said:
> Its the identical samsung b-die mounted on the ram pcb, just the xmp profile is different


I'm on 1T, but it looks like you do have proof. So basically what you're telling me is I paid an extra $100 for RGB lighting? But what if I push beyond 3600 MHz; will the investment be worth it then?


----------



## KCDC

Swapped the AX1200i for the AX1500i last night.

Now, I'm not sure if it's because the 1200i was on its way out, or the new NVIDIA driver, or what, but I never drop below 60 FPS and the GPUs never leave 2000 with a basic OC: GPU +100, mem +200, power 120%, default voltage. This is SLI at 7680x1440 under water, testing with Mass Effect Andromeda.

Before, with higher settings, I would get choppy FPS, always going up and down from 30/40 to 60 depending on location. Now it's buttery smooth no matter where. I was amazed! I didn't expect such an improvement. Inside the Tempest is the worst area for maintaining decent FPS, and now it's 55-60 there. Temps stay around 50 on load.

I didn't have as much time as I wanted last night to get into benchmarks, since I also found a leaking EK fitting on one of the GPU blocks; I ended up having to drain, fix, and check the rest of them, which took some time. Perhaps tonight.

Happy camper over here.


----------



## Slackaveli

Quote:


> Originally Posted by *Garrett1974NL*
> 
> Anyone else use Conductonaut or Liquid Ultra on their precious 1080Ti GPU die?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I mounted an Accelero Xtreme III on it, max stable is 2050MHz @ 1.081/1.093v and with the Xtreme III @ 65% fanspeed temps are 61c max
> 
> 
> 
> 
> 
> 
> 
> 
> I have the MSI Armor OC version, the cooler is complete trash lol...but I deliberately chose it because it DOES have the cooling plate on the VRM's, where the Gaming X version doesn't (only on the memory, VRM's are cooled by the main Twin Frozr cooler)
> Couldn't be happier with the cooling atm


Well, that's how you turn a lemon into lemonade right there.


----------



## done12many2

Quote:


> Originally Posted by *Alwrath*
> 
> So basically what your telling me is, I paid an extra $100 for RGB lighting?


Basically yes, that's what we're telling you. I purchased an RGB 4133 kit for testing; after comparing it to two sticks of my 3200 B-die, I returned the 4133 kit.
Quote:


> Originally Posted by *Alwrath*
> 
> But what if I push beyond 3600 mhz, will the investment be worth it then?


Not really because the 3600 kits go well past 3600 MHz as well.


----------



## Slackaveli

Quote:


> Originally Posted by *Alwrath*
> 
> Im on 1T but you do have proof it looks like. So basically what your telling me is, I paid an extra $100 for RGB lighting? But what if I push beyond 3600 mhz, will the investment be worth it then?


Could be, in the long run. Yours is basically 'binned' to hold 4133; some of the 3600s wouldn't hold it (Samsung's silicon is great, and this process is hella refined, but it's still silicon), if Ryzen ever gets that high. Yours also may have headroom above 4133, but that, again, may never be seen.

Also, those are some pretty sexy RGBs, tho...







And the resale value on that RAM kit is SKY HIGH. You could always grab a 3200/3600 kit and make $100-150 on the flip. Far more people than not have no idea how to make a 3600 kit do 4133; they'll pay top dollar, aka MSRP, for that "New - other" slightly tested RAM.









@KCDC, it was probably a dying 1200i, but, shoot, I would probably have done the same thing; you covered both possibilities. It's extreme overkill, but PSUs are always more efficient at 1/3-1/2 load, so nothing wrong with that. It'll last for years and years.


----------



## D13mass

Hi guys!

Very often I hit the power limit in RivaTuner, so I am looking for a new BIOS; I want to try flashing one with a bigger power limit.

I have an NVIDIA FE card with an EK water block. Could you tell me where I can download a BIOS, and which one will be good for my card?


----------



## Slackaveli

Quote:


> Originally Posted by *D13mass*
> 
> Hi guys!
> 
> Very often I have a 'Power limit' in rivaTuner and I am looking for new bios. I want to try flash new one with big power limit.
> 
> I have Nvidia FE card with EK water block, could you tell me when I can download bios and which will be good for my card ?


They have all been tested here and there throughout this thread, with varying results. Not sure what the current established conclusion is on which BIOS is best to try now, though, TBH. @Slim @KedarWolf could help.


----------



## Slackaveli

Quote:


> Originally Posted by *Bishop07764*
> 
> And here I am still sitting on my z77 and i7 3770k. This board isn't healthy. You all aren't aggravating my upgrade itch.
> 
> But wouldn't memory that fast affect your core clock? I would imagine it putting a lot of strain on the memory controller.
> 
> I'm trying to convince myself that I wouldn't notice a true difference because of my gsync monitor. ?


If you were on Z97 I'd say drop in the 5775C.

At higher resolution it wouldn't be a huge performance upgrade. And until Samsung re-fabs for DDR4, RAM prices are going to be sky-high (predicted not to go back down for at least the next 6 months, and predicted to rise a bit more first, too). AND the new Intel stuff coming, Skylake-X/Kaby Lake-X, will just piss you off if you upgrade now.

But yes, this forum will make you hate your rig at times LOL


----------



## Hulio225

Quote:


> Originally Posted by *Alwrath*
> 
> Im on 1T but you do have proof it looks like. So basically what your telling me is, I paid an extra $100 for RGB lighting? But what if I push beyond 3600 mhz, will the investment be worth it then?


Not necessarily, because like I said, it's the same B-die chip on the sticks everywhere. And the RGB PCB probably adds some signal noise, which is bad if you go ham with overclocks. Other than that, the other guys said everything else I would have said.
Quote:


> Originally Posted by *D13mass*
> 
> Hi guys!
> 
> Very often I have a 'Power limit' in rivaTuner and I am looking for new bios. I want to try flash new one with big power limit.
> 
> I have Nvidia FE card with EK water block, could you tell me when I can download bios and which will be good for my card ?


For non-shunt-modded FE cards, it's the ASUS Strix OC BIOS or the Palit GameRock Premium, which give an advantage in power limit compared to the stock BIOS.
If shunt modded, the NVIDIA stock FE BIOS is the way to go.

Edit: Do I have a 3600 MHz C16 kit or do I have a 4133 C19 kit? xD Nobody knows xD


----------



## CoreyL4

Would you guys rather have 2012/6000 or 2037/5900 as a daily clock?


----------



## D13mass

Quote:


> Originally Posted by *Hulio225*
> 
> Not necessarily, because like is said, its the same B-Die chip on the sticks everywhere. And the rgb pcb add probably some signal noise which is bad if you go ham with overclocks. Other than that the other guys said everything else which was what i would say too.
> For not shunt moded fe cards its asus strix oc bios or the palit game rock premium which are giving advantages in power limitation compared to the stock bios.
> If shunt moded nvidia stock fe bios is the way to go
> 
> Edit: Do i have a 3600MHz c16 kit or do i have a 4133c19 kit ? xD noboday knows xD


Thank you, I`m going to try this one https://www.techpowerup.com/gpudb/b4306/asus-rog-strix-gtx-1080-ti-gaming-oc


----------



## D13mass

deleted


----------



## Hulio225

Quote:


> Originally Posted by *CoreyL4*
> 
> Would you guys rather have 2012/6000 or 2037/5900 as a daily clock?


If I had to guess right now, I'd guess 2037/5900 will give you higher FPS in most cases compared to 2012/6000.


----------



## Hulio225

Quote:


> Originally Posted by *D13mass*
> 
> Thank you, I`m going to try this one https://www.techpowerup.com/gpudb/b4306/asus-rog-strix-gtx-1080-ti-gaming-oc


You're welcome. Please keep in mind that the first DisplayPort output on your card is disabled with that BIOS, so plug your cable into another one! Otherwise you won't have a video signal and will panic after the reboot ^^


----------



## kiwwanna

I seem to have an issue with my clocks; they are stock in the pic. Neither the core nor the mem seems to be what it should be.
Does anyone have advice on this? I'm at a bit of a loss.

I originally had an MSI 1080 Hybrid. I uninstalled the drivers and used CCleaner, then reinstalled the newest drivers when the 1080 Ti went in.

Edit - Opened dedicated thread.


----------



## Norlig

Thought I killed my card today,

I had taken the "round" shroud around the fan away, to more easily fit some copper heatsinks.



After that, when I pressed the power button, it would spin for a quarter second, then shut down with a "Tzzt!"

Took it apart again, and noticed the 2 wires going to the Geforce GTX LED, had been pinched by the round fan shroud, and were making contact with the metal of the shroud and shorting out the wires.

After putting it back together more carefully, it would boot but go to a black screen before reaching the Windows login.

Apparently it had also reset the boot mode to legacy instead of UEFI, kind of weird...

All working good now though, actually seems to have helped a bit with Core overclocking too. (the copper heatsinks, that is)


----------



## zipeldiablo

Guys, anyone who bought the ek 1080ti backplate?
The manual says to put thermal pads behind the card (so in front of the backplate) but i didn't have any.
Is it safe to put the backplate anyway or should i contact ek for thermal pads and wait to install it?


----------



## done12many2

Quote:


> Originally Posted by *zipeldiablo*
> 
> Guys, anyone who bought the ek 1080ti backplate?
> The manual says to put thermal pads behind the card (so in front of the backplate) but i didn't have any.
> Is it safe to put the backplate anyway or should i contact ek for thermal pads and wait to install it?


Your backplate should have come with its own set of thermal pads. Both of my EK 1080 Ti backplates did. Double check the box and between the cardboard inserts. If not, contact EK and they'll ship you some right away.


----------



## D13mass

Ok, I flashed 2 BIOSes on my Nvidia FE card: Palit and Strix.

Out of the box the Palit has a slightly higher core clock, but overclocking is the same as with the stock BIOS - 2050/12000 - and benchmark results are the same too









I don't see a reason to use a custom BIOS.


----------



## Bishop07764

Quote:


> Originally Posted by *Slackaveli*
> 
> If you were on Z97 I'd say drop in the 5775C.
> 
> Being on a higher resolution, it wouldn't be a huge perf upgrade. And until Samsung re-fabs for DDR4, RAM prices are going to be sky-high (predicted not to come back down for at least the next 6 months, and predicted to rise a bit more first, too). AND the new Intel stuff coming, Skylake-X/Kaby-X, will just piss you off if you upgrade now.
> 
> But, yes, this forum will make you hate your rig at times LOL


If I was running Z97, then I totally would. I've been waiting on more info on the X299 stuff to see how that pans out. But I still don't feel CPU limited at 1440p yet, because G-Sync smooths things out right now.

I had read that RAM prices were heading for the stratosphere. It makes me much more gun-shy about pulling the trigger on a platform upgrade. Honestly, I would love to give AMD another go. The AMD 64 processors were beasts back in the day.


----------



## Hulio225

Quote:


> Originally Posted by *D13mass*
> 
> Ok, I flashed 2 BIOSes: palit and strix on my Nvidia FE card.
> 
> Out of the box palit has little bit higher core clock, but overclocking is the same as when I had stock bios - 2050/12000, results in benchmarks are the same too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Don`t see reason to use custom bios.


Let me explain why it can be an advantage:

The main factor is the power limit: while your card is power limited, it downclocks and clocks back up several times per second.
So you have constant frequency fluctuations, which cause worse frame-times than you would get without them.
Worse frame-times can feel like lag or stutter while gaming, giving a worse experience even though the average frames per second don't show it.
That's the reason you don't want to hit the power limit and get those clock fluctuations.
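To illustrate (with made-up frame-time numbers, not measurements from any card): two runs with the same average FPS can have very different 1% lows, and it's the 1% lows you feel as stutter.

```python
# Two hypothetical frame-time traces (milliseconds) with identical averages.
steady = [16.7] * 100          # constant clocks: every frame takes ~16.7 ms
fluct  = [12.0, 21.4] * 50     # power-limit throttling: frame times bounce around

def avg_fps(frame_times_ms):
    # Average FPS = frames rendered / total seconds elapsed
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def worst_1pct_fps(frame_times_ms):
    # "1% low": the FPS implied by the slowest 1% of frames
    slowest = sorted(frame_times_ms, reverse=True)
    n = max(1, len(slowest) // 100)
    return 1000.0 / (sum(slowest[:n]) / n)

print(avg_fps(steady), avg_fps(fluct))              # same average (~59.9 FPS)
print(worst_1pct_fps(steady), worst_1pct_fps(fluct))  # ~59.9 vs ~46.7 FPS
```

Same ~59.9 FPS average in both traces, but the fluctuating one bottoms out around 46.7 FPS on its slowest frames - an FPS counter hides exactly the difference you can feel.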


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Norlig*
> 
> Thought I killed my card today,
> 
> I had taken the "round" shroud around the fan away, to more easily fit some copper heatsinks.
> 
> 
> 
> After that, when I pressed the power button, it would spin for a quarter second, then shut down with a "Tzzt!"
> 
> Took it apart again, and noticed the 2 wires going to the Geforce GTX LED, had been pinched by the round fan shroud, and were making contact with the metal of the shroud and shorting out the wires.
> 
> After putting it back together more carefully, It would boot but go black screen before coming to the windows login.
> 
> Apparently it had also reset the boot mode to legacy, instead of UEFI, kind of weird....
> 
> All working good now though, actually seems to have helped a bit with Core overclocking too. (the copper heatsinks, that is)


I want to mention that you didn't kill your card, but you may have halved the life (or worse) of a trace in a few places. It's impossible to predict how it'll affect you in the future, but your warranty will cover you in that time period anyway.


----------



## rolldog

Well, now I get to see what these GPUs can do.


----------



## D13mass

Quote:


> Originally Posted by *Hulio225*
> 
> Let me explain why it can be an advantage:
> 
> The main factor is Power Limit, while your card is power limiting it downclocks itself and overclocks back itself couple times per second.
> So you have constant frequency fluctuations, which are causing worse frame-times than without frequency fluctuations.
> Worse Frame-times can feel like lags or stutter while gaming, causing a worse gaming experience while the average frames per second don't show that.
> Thats the reason why you don't want power limit and the fluctuations in clocks.


Yes, you are right. I switched to the Palit BIOS - without any tweaking in MSI Afterburner I get 2000/11000


----------



## RavageTheEarth

Quote:


> Originally Posted by *rolldog*
> 
> Well, now I get to see what these GPUs can do.


Good choice!!!


----------



## KedarWolf

Did the memory overclock thing in the OP; now at +723 memory, stable in Heaven, Superposition and Time Spy.


----------



## s1rrah

Quote:


> Originally Posted by *rolldog*
> 
> Well, now I get to see what these GPUs can do.


Oh dear lord. That's going to stomp...


----------



## KCDC

Quote:


> Originally Posted by *Slackaveli*
> 
> @KCDC, was probably a dying 1200i. but, shoot, i would have probably done the same thing. you covered both possibilities. it's extreme overkill tho, but psu's are always better at 1/3-1/2 load so, nothing wrong with that. It'll last for years and years.


You're probably right, and yeah could have just replaced it with another 1200 watt PSU, but I use these times as an excuse to upgrade if I can. I'm helping a buddy build a design workstation on a budget so I'll RMA the 1200i and sell the replacement to him at a reduced price, make some money back on the purchase.

I like having the headroom anyway... Possibly getting a third card for octane.. Maybe it's been the psu holding me back this whole time. Looking forward to pushing these cards now.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Hulio225*
> 
> Let me explain why it can be an advantage:
> 
> The main factor is Power Limit, while your card is power limiting it downclocks itself and overclocks back itself couple times per second.
> So you have constant frequency fluctuations, which are causing worse frame-times than without frequency fluctuations.
> Worse Frame-times can feel like lags or stutter while gaming, causing a worse gaming experience while the average frames per second don't show that.
> Thats the reason why you don't want power limit and the fluctuations in clocks.


Honestly, if someone goes for their highest OC at 1.000v, it'd probably be better than using a BIOS that performs worse. Especially since a lot of games won't have you hitting the power limit at 1.000v.

Before I shunt modded, I don't think I ever dropped below 1.032v, except in benchmarks.

I'd recommend just going for the highest OC at 1.000v.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Did the memory overclock thing in OP now at +723 memory stable in Heaven, Superposition and Time Spy.


I'm going to go home and give this a try, although I can't OC my memory past 600MHz with just a midplate.

To all the others: I did notice better memory OCs the lower I could get my VRAM temperature.


----------



## zipeldiablo

Quote:


> Originally Posted by *done12many2*
> 
> Your backplate should have come with its own set of thermal pads. Both of my EK 1080 Ti backplates did. Double check the box and between the cardboard inserts. If not, contact EK and they'll ship you some right away.


Yes, they were hidden somewhere








Couldn't fit one of the screws though; there was no threading, weird


----------



## done12many2

Quote:


> Originally Posted by *zipeldiablo*
> 
> Yes there were hidden somewhere
> 
> 
> 
> 
> 
> 
> 
> 
> Couldn't fix one of the screw though, there was no threading, weird


Was it one of the screws for the backplate or for the waterblock?


----------



## zipeldiablo

Quote:


> Originally Posted by *done12many2*
> 
> Was it one of the screws for the backplate or for the waterblock?


The waterblock, it is the spot just beneath the fittings.


----------



## nrpeyton

Quote:


> Originally Posted by *feznz*
> 
> you are a true OCer so anything less than the fastest available will always be a niggle so don't think twice just get it I went through that with a 3570k and memory I saved a little money but the knowledge that something out there that is faster will simply be too much you buy cheap you buy twice is my motto, ended up with quite a good media PC in the end though.
> 
> As for the Asus bios I believe Asus are compelled to release it because of the "hotwire" solder points that are on the Strix would be false advertising if it doesn't allow you to go past software voltage mods.
> 
> Edit a lot of answers here
> 
> This was an interesting read Asus on LN2 ultimate guide
> http://forum.hwbot.org/showthread.php?t=169488
> I am getting tempted to try this though I got a bad clocker depends on the XOC bios release though am still not sure if the original bios has a voltage limit
> 
> I believe That was the reason the EVbot got cancelled because NVidia were not happy that the cards that were RMAed were impossible to tell if they were modded unlike Asus
> 
> I am also had 3 BSOD since putting in the 1080Ti beast ,only when the PC has been idling for 6+ hours never at any other time so I am hoping it will fix itself with the NVidia driver updates I haven't looked deeply into it yet as this OS is due for a fresh install I am thinking it has gone though 3 different motherboards and 8.1 to 10 upgrade
> Also with the bad rap SLI is getting with game profiles NVidia are probably working around the clock to fix them with the Vega due in the next month


All very good points.

Thanks very much 

I like your motto  Makes sense lol.

Quote:


> Originally Posted by *Asmola*
> 
> You can find all the information about XOC bios (and overclock guide) from here: http://forum.hwbot.org/showthread.php?t=169488&highlight=1080
> 
> Also asked if i can get that bios.


Indeed.

I read that article too.

It's great to see something fresh and exciting starting to happen over there again ;-)

Who's your contact re: the BIOS? (PM me if it's easier) ;-)

Quote:


> Originally Posted by *Bishop07764*
> 
> And here I am still sitting on my z77 and i7 3770k. This board isn't healthy. You all aren't aggravating my upgrade itch.
> 
> But wouldn't memory that fast affect your core clock? I would imagine it putting a lot of strain on the memory controller.
> 
> I'm trying to convince myself that I wouldn't notice a true difference because of my gsync monitor. ?


Lol, I do seem to be guilty of number fixation recently. Don't know why.

Think there's a bit of adrenaline involved in pushing for the limits.

Seeing my new i7 clocked at 5GHz with a 4GHz memory O/C too would just feel phenomenal. I mean, that's almost half the speed of the GDDR5X in the last round before the Tis. lol

Quote:


> Originally Posted by *Hulio225*
> 
> You dont need to buy a 4133MHz kit, those are sometimes even worse than a 3600MHz CL16 B-Die kit.
> I talked with a guy on HWbot who has several world records and he has binned over 100 ram kits...
> 
> Ill copy pasta what he said to ram kits:
> 
> "Tested 4266 rgb,failed to train 4000 12 12 12 1.85-2V. Also latest 4266 were worser on tight timings compared to 3600c16. Latest 3733 c17 and 3600c15 again were worser,1707 serial week,1701 on pcb,644 week on IC.
> 
> On the average from 5 sets of 3600C16 i had way better average than on 7 sets of 4266.Not to mention that 4266 i had 3 kits with one module DOA.
> 
> It is what it is , i binned over 100 kits overall and i have 3 modules super good,found about 6 GOOD(4133 12 11 11 Spi 32M wazza) and about 10 4000-4040 which is in my book average.
> 
> My advice would be to get a few kits of 3600 17-18-18 or 16 16 16 NON-RGB and keep the best,return the rest."
> 
> After that i just went to a local vendor and bought a 3600MHz CL 16 Kit.
> For 24/7 it runs those timings 16-16-16-32 1T @ 4000MHz @ 1.45V
> 
> And it can run 4080 MHz @ 12-12-12-24 1T with very tight sub and teritary timings @ 1.96V
> 
> If i losen the timings i can run 4133 instantly, i am on an asus apex board thou and if i losen timings even more i can go even higher...
> But Ram performance is best if you balance clocks with tightest timings possible...
> 
> There is no need of buying B-Die kits over 3600MHz imo.


Very informative.

Thanks v much.

+1









Quote:


> Originally Posted by *done12many2*
> 
> This should clarify things for you.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1533164/the-24-7-sub-zero-liquid-chillbox-club/700_100#post_26039563


Nice setup,

Very nice indeed. I love it.

Lowest I seem to be able to get on my Hailea HC-500A (1/2 HP compressor & 790W cooling capacity) is about -14c (with extreme difficulty).
Or -10c with relative ease --- but I lose it FAST as soon as I allow _any_ load from the CPU/GPU at all.

I think it's to do with the more environmentally friendly refrigerant used these days. Its boiling point is only about -28c or so, meaning it's almost impossible to get further than I already have.

Quote:


> Originally Posted by *KraxKill*
> 
> The liquid is electronics grade, ethylene glycol and distilled water. Some people run propylene glycol and water but ethylene is less viscous at lower temperatures.


Have you had a look at "Mayhems XT-1"?

It allows temps as low as -50c at low viscosity.

It's also specifically designed for chillers in PC liquid cooling loops (one of its kind).
So it has all the correct corrosion inhibitors etc. and also won't clog up your system.

I run a sub-zero water-chiller and I swear by it 

Quote:


> Originally Posted by *KCDC*
> 
> Swapped the Ax1200i with the Ax1500i last night..
> 
> Now, I'm not sure if it's because the 1200i was on its way out, or the new nvidia driver or what, but I never drop below 60 FPS, GPUs never leave 2000 with a basic OC, gpu +100, Mem +200, Power 120%, default voltage. This is SLI, 7680x1440 under water, testing with Mass Effect Andromeda.
> 
> Before, with higher settings I would get choppy fps always going up and down from 30/40 to 60 depdning on location. Now it's so buttery smooth no matter where.. I was amazed!! I didn't expect such an improvement. Inside the Tempest, its the worst as far as maintaining decent FPS, and now it's 55-60. Temps stay around 50 on load.
> 
> Didn't have as much time as I wanted last night to get into benchmarks since I also found a leaking EK fitting from one of the gpu blocks..ended up having to drain and fix and check the rest of them which took some time. Perhaps tonight.
> 
> Happy camper over here.


Wow, nice one. I've never heard of such a great result just from changing over a PSU.

I always assumed it would either work or blow up (or maybe not blow up, but shut down the PC).
A bit like a digital signal: it's either perfect or it's not there at all. There's no in-between with 1's & 0's. lol

Anyway, nice one, glad to hear things are going smoothly for you now 

Quote:


> Originally Posted by *SlimJ87D*
> 
> Honestly, if someone goes for their highest OC at 1.000v, then it'd probably be better than using a bios that performs less. Specially since a lot of games won't have you hit that power limit at 1.000v.
> 
> Before I was shunt mod, I don't think I was ever dropped below 1.032v, only in benchmarks.
> 
> I'd recommend to just go for the highest OC at 1.000v.


That's precisely what I do for gaming.

Benching is a diff story though









Quote:


> Originally Posted by *KedarWolf*
> 
> Did the memory overclock thing in OP now at +723 memory stable in Heaven, Superposition and Time Spy.


Nice one.

Can't wait to see where water takes me on memory.

Shame my voltage tools aren't working on the Ti, because my Classy responded nicely to a lil mem voltage boost from 1.35v (stock) to 1.5v.
+725 was roughly what I could get _without_ any extra voltage on my 1080. I miss my memory O/C lol. Only maxing out at +545 before errors start appearing.

Quote:


> Originally Posted by *SlimJ87D*
> 
> I'm going to go home and give this a try, although I can't OC my memory past 600 Mhz with just a midplate.
> 
> To all the others, I did notice better memory OC the lower temperature you can get your VRAM.


*I agree 100%*

Definitely second that. +++

GDDR5X *LOVES* the cold ;-)

The chips everyone wants to concentrate on the most when doing any mods for memory temps are the 3 nearest the VRM side of the card. They run up to 25c hotter than the ones on the I/O side in my previous testing.

I measured like this from the back of the PCB using probes attached to a $25 fan controller:









And look at how temps differ in this one (*a LOT of heat "spills over" on the VRM side GDDR5X* ):



The VRM on the new 1080 Ti FE is also *1 inch CLOSER* to those memory chips compared to my Classy ; making the problem even worse. I'd hazard a guess they are running 30c hotter at least.

The best way to combat this is to get as much cooling to that VRM as possible (extra pads over components that aren't normally specified to be covered in EK manual). And also place extra pads between VRM and backplate and extra pads between those memory chips and backplate.

It can take a while to figure out the perfect heights, but the rewards will be well worth it. And I fully intend to do this when my block arrives. 

When it does; I'll post pictures and detailed "before" and "after".

Another good source worth looking at is how EVGA have added pads to VRM components that NO other manufacturer puts pads on. You can see this in Buildzoid's VRM tear-down and PCB analysis of the new FTW3 on Gamers Nexus (YouTube).


----------



## KedarWolf

Last night I had quite low ambient temps and got +723. Now with normal ambient temps (warmer day, closed a bunch of windows) I'm only getting +667 memory, Heaven and Superposition stable. Only lost 1 FPS in Heaven though, so it's still decent.


----------



## done12many2

Quote:


> Originally Posted by *zipeldiablo*
> 
> The waterblock, it is the spot just beneath the fittings.


You don't put a screw there.









@nrpeyton Holy never ending post.











Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *zipeldiablo*
> 
> The waterblock, it is the spot just beneath the fittings.


Quote:


> Originally Posted by *nrpeyton*
> 
> All very good points.
> 
> Thanks very much
> 
> I like your moto  Makes sense lol.
> 
> Indeed.
> 
> I read that article too.
> 
> Its great to see something fresh and exciting starting to happen over there again ;-)
> 
> Whose your contact? r.e. the BIOS? (PM me if it's easier) ;-)
> 
> Lol, I do seem to be guilty of number fixation recently. Don't know why.
> 
> Think there's a bit of adrenaline involved in pushing for the limits.
> 
> Seeing my new i7 clocked at 5GHZ with a 4GHZ memory O/C too would just feel phenomenal. I mean that's almost half the speed of GDDR5X in the last round before TI's. lol
> 
> Very informative.
> 
> Thanks v much.
> 
> +1
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nice setup,
> 
> Very nice indeed. I love it.
> 
> Lowest I seem to be able to get on my Hailea HC 500a (1/2 HP compressor & 790w cooling capacity) is about -14 (with extreme difficulty).
> Or -10c with relative ease --- but I lose it FAST as soon as I allow _any_ load from CPU/GPU at all.
> 
> I think it is to do with this more environmentally friendly refrigerant that is used these days. It's boiling point is only about -28 or so. Meaning that it's almost impossible to get further than I already have.
> 
> Have you had a look at "Mayhems XT-1"?
> 
> It allows temps as low as -50c at low viscosity.
> 
> It's also specifically designed for Chillers in PC liquid cooling loops. (one of it's kind).
> So it has all the correct corrosion inhibiters e.t.c and also won't clog up your system.
> 
> I run sub-zero water-chiller and I swear by it
> 
> Wow nice one, I've never heard of such a great result just by changing over a PSU.
> 
> I always assumed it would either work, or blow up. (or maybe not blow up, but shut down the PC).
> A bit like the digital signal, it's either perfect or it's not there at all. There's no in-between with 1's & 0's. lol
> 
> Anyway nice one, glad to to hear things going smoothly for you now
> 
> That's precisely what I do for gaming.
> 
> Benching is a diff story though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nice one.
> 
> Can't wait to see if where water takes me on memory.
> 
> Shame my voltage tools aren't working on the Ti because my Classy responded nicely to a lil mem voltage boost from 1.35v (stock) to 1.5v
> .
> +725 was roughly what I could get _without_ any extra voltage on my 1080. I miss my memory O/C lol. Only maxing out at +545 before errors start appearing.
> 
> *
> I agree 100%*
> 
> Definitely second that. +++
> 
> GDDR5X *LOVES* the cold ;-)
> 
> The chips you want to concentrate on the most when doing any mods are the 3 nearest the VRM side of the card. They run up to 25c hotter than the ones on the I/O side.
> 
> I measured like this from the back of the PCB using probes attached to a $25 fan controller:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And look at how temps differ in this one (*a LOT of heat "spills over" on the VRM side GDDR5X* ):


----------



## feznz

Quote:


> Originally Posted by *hotrod717*
> 
> Would have to disagree. Release cards. Both my 1080ti and TXp are release cards and both were at the front of the pack. 2144 on ti and 2138 on TXp. Neither shunt modded.I would say the refined process after months of manufacture improves your chances of getting a better card or you are less likely to get a total dud. Also, for water or extreme cooling, aib becomes a mute point unless they come out with a totally unlocked version that offers total voltage control, vrm freq, pex, llc, ect, ect.


Lucky you, some have all the luck







Of course there are some really good chips out there in the first batches. Nvidia already know which chips are potentially better: the ones from the centre of the wafer, where the lithography focus is always sharper in the manufacturing process.
Quote:


> Originally Posted by *Bishop07764*
> 
> And here I am still sitting on my z77 and i7 3770k. This board isn't healthy. You all aren't aggravating my upgrade itch.
> 
> But wouldn't memory that fast affect your core clock? I would imagine it putting a lot of strain on the memory controller.
> 
> I'm trying to convince myself that I wouldn't notice a true difference because of my gsync monitor. ?


I am with you, with my 3770K and 2400MHz RAM downclocked to 2200 with tighter timings because of a weak IMC. All I can say is wait till the X299 release to open up your options.
I am in a LAN party group and most of my friends have done multiple slight upgrades. In the end, after 5 years, I will have spent less or similar money overall, and currently my system will smoke every member's PC by a hefty margin.
The funniest thing happened at the last party (partly to do with the medications): comparing my new LG 38" side by side with a 1st-gen VA ultrawide, my mate spent half an hour trying to adjust the contrast etc. to get blacker blacks. In the end, after giving me crap about my old monitor, he went out and got an IPS monitor - his 4th in 5 years to my 1.
Just look at the following bench comparison; it certainly curbs my itch to micro-upgrade.


Spoiler: Warning: Spoiler!







Quote:


> Originally Posted by *Hulio225*
> 
> You dont need to buy a 4133MHz kit, those are sometimes even worse than a 3600MHz CL16 B-Die kit.
> I talked with a guy on HWbot who has several world records and he has binned over 100 ram kits...
> 
> Ill copy pasta what he said to ram kits:
> 
> "Tested 4266 rgb,failed to train 4000 12 12 12 1.85-2V. Also latest 4266 were worser on tight timings compared to 3600c16. Latest 3733 c17 and 3600c15 again were worser,1707 serial week,1701 on pcb,644 week on IC.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> On the average from 5 sets of 3600C16 i had way better average than on 7 sets of 4266.Not to mention that 4266 i had 3 kits with one module DOA.
> 
> It is what it is , i binned over 100 kits overall and i have 3 modules super good,found about 6 GOOD(4133 12 11 11 Spi 32M wazza) and about 10 4000-4040 which is in my book average.
> 
> My advice would be to get a few kits of 3600 17-18-18 or 16 16 16 NON-RGB and keep the best,return the rest."
> 
> After that i just went to a local vendor and bought a 3600MHz CL 16 Kit.
> For 24/7 it runs those timings 16-16-16-32 1T @ 4000MHz @ 1.45V
> 
> And it can run 4080 MHz @ 12-12-12-24 1T with very tight sub and teritary timings @ 1.96V
> 
> If i losen the timings i can run 4133 instantly, i am on an asus apex board thou and if i losen timings even more i can go even higher...
> But Ram performance is best if you balance clocks with tightest timings possible...
> 
> There is no need of buying B-Die kits over 3600MHz imo
> 
> 
> .


Interesting - have you got a link?
My theory was to get the fastest possible memory, and if the IMC is not up to it, underclock your RAM and tighten up the timings. Not saying you are incorrect, but it is hard to believe that 4200MHz RAM would be smoked by 3600MHz - say 3600 17-18-18 versus a 4200MHz kit clocked to 3600 at 9-9-9, which in theory should be possible.

I had 3 BSODs within 30 min this morning with the latest driver, 382.05, so I rolled back to 381.65, which I wasn't having any BSODs on. 4 hours later, all good. Couldn't catch the error codes, but it was something with kvldd???? STOP_SERVICE_EXCEPTION and over-allocated memory.


----------



## nrpeyton

Quote:


> Originally Posted by *done12many2*
> 
> @nrpeyton Holy never ending post.


Was on the late shift tonight lol; had catching up to do haha


----------



## Bishop07764

Quote:


> Originally Posted by *feznz*
> 
> I am with you with my 3770k and 2400Mhz ram downclocked to 2200 with tighter timings as a weak IMC all I can say wait till x299 release to open up your option
> I am in a LAN party group and most of my friends have done the multiple slight grades in the end after 5 years I would have spent less or similar money overall and currently my system will smoke every members PCs by a hefty margin
> The funniest thing happened at the last party partly to do with the medications but side by side comparison of my new LG 38" to 1st gen VA ultra wide screen my mate spent 1/2 an hour trying to adjust the contrast etc to get the blacker blacks in the end after giving me crap about my old monitor he went out and got a IPS monitor 4th in 5 years to my 1.
> Just look at the following bench comparison it certainly curbs my itch to micro upgrade
> 
> 
> Spoiler: Warning: Spoiler!


Wow. Mine is also clocked at 4.8GHz. That isn't a whole lot of difference. Same score as an 1800X too. Dang, and I was possibly thinking about Ryzen.

Edit: And I just came home after work to compare my Time Spy results. Man, only a 1-2 FPS difference from a 5.2GHz 7700K. The itch isn't nearly so bad now.


----------



## Hulio225

Quote:


> Originally Posted by *feznz*
> 
> My theory was get the fastest possible memory and if the IMC is not up to it the underclock your ram and tighten up the timings not saying you are incorrect but it is hard to believe that 4200Mhz ram would be smoked by 3600Mhz lets say @ 3600 17-18-18 with lets say 4200Mhz clocked to 3600 9-9-9 which in theory should be possible.


So in theory you think you can get a 4266 kit which has these timings - CL19 26-26-46 - downclock it to 3600MHz, and set timings of CL 9-9-9-22 or something?
Have I understood that right? If so, no, that's not how it works in practice.


----------



## nrpeyton

Quote:


> Originally Posted by *feznz*
> 
> Lucky you some have all the luck
> 
> 
> 
> 
> 
> 
> 
> of course there is some really good chips out there in the first batches NVidia already know which chips are the potently better; ones from the centre of the wafer as where the light focus is always shaper in the manufacture process.
> I am with you with my 3770k and 2400Mhz ram downclocked to 2200 with tighter timings as a weak IMC all I can say wait till x299 release to open up your option
> I am in a LAN party group and most of my friends have done the multiple slight grades in the end after 5 years I would have spent less or similar money overall and currently my system will smoke every members PCs by a hefty margin
> The funniest thing happened at the last party partly to do with the medications but side by side comparison of my new LG 38" to 1st gen VA ultra wide screen my mate spent 1/2 an hour trying to adjust the contrast etc to get the blacker blacks in the end after giving me crap about my old monitor he went out and got a IPS monitor 4th in 5 years to my 1.
> Just look at the following bench comparison it certainly curbs my itch to micro upgrade
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Interesting, have you got a link?
> My theory was: get the fastest possible memory, and if the IMC is not up to it, underclock your RAM and tighten up the timings. Not saying you are incorrect, but it is hard to believe that 4200MHz RAM would be smoked by, say, 3600MHz at 17-18-18, when the 4200MHz kit clocked down to 3600 at 9-9-9 should in theory be possible.
> 
> I had 3 BSODs within 30 min this morning with the latest driver, 382.05, so I rolled back to 381.65. I wasn't having any BSODs 4 hours later, so all good. I couldn't catch the error codes, but it was something with kvldd???? STOP_SERVICE_EXCEPTION and over-allocated memory STOP_SERVICE_EXCEPTION


There's a little equation you can do to calculate "true latency" (in nanoseconds). I've found it quite helpful:
CAS*2000/Speed (in MT/s) = true latency (ns)

Anyway tonight I'm on a mission to find a memory kit to buy.

But effectively what we're saying then *(to summarise)* is: I can grab a good 3600 kit, change the timings so they match the timings advertised with a 4133 kit... and the memory will effectively run like a 4133 kit at that kit's normal _(advertised)_ timings?

So for example I could buy a cheaper 3600 CL16-16-16-36 kit _(Trident Z F4-3600C17Q-64GTZSW)_
Then change timings so it effectively becomes a 4200 19-19-19-39 _(Trident F4-4266C19D-16GTZKW)_ kit?
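Incidentally, the "true latency" equation is easy to turn into a quick comparison script. This is a hypothetical little helper, just for illustration, plugging in the kits being discussed:

```python
# True latency (ns) = CAS * 2000 / data rate (MT/s).
# The 2000 converts the DDR data rate (two transfers per clock)
# into a clock period in nanoseconds.

def true_latency_ns(cas: int, speed_mts: int) -> float:
    """First-word latency in nanoseconds for a DDR kit."""
    return cas * 2000 / speed_mts

# Kits mentioned in the thread:
kits = [
    ("3600 CL16", 16, 3600),
    ("3600 CL17", 17, 3600),
    ("4133 CL19", 19, 4133),
    ("4266 CL19", 19, 4266),
]

for name, cas, speed in kits:
    print(f"{name}: {true_latency_ns(cas, speed):.2f} ns")
```

By this measure a 3600 CL16 kit (~8.89 ns) already edges out a 4133 CL19 kit (~9.19 ns) on first-word latency, so the higher-clocked kit is mainly buying bandwidth, not latency.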


----------



## Coopiklaani

Finally, +850 on the vram with 8hrs of OCCT GPU scanner 0 error.


----------



## Hulio225

Quote:


> Originally Posted by *nrpeyton*
> 
> There's a little equation you can do to calculate "true latency". I've found it quite helpful:
> CAS*2000/Speed(in MHz)= true latency
> 
> Anyway tonight I'm on a mission to find a memory kit to buy.
> 
> But effectively what we're saying then *(to summarise) is*: I can grab a good 3600 kit; change the timings so they match the timings advertised with a 4133 kit... and the memory will effectively run like a 4133 kit at normal timings for a 4133 kit? _(as advertised)?_
> 
> So for example I could buy a 3600 CL16-16-16-36 kit _(Trident Z F4-3600C17Q-64GTZSW)_
> Then change timings so it effectively becomes a 4200 19-19-19-39 _(Trident F4-4266C19D-16GTZKW_) kit ?


Frequencies over 4133MHz will most likely only run on an Asus Apex, but in general, as I already showed here in this thread, you can set looser timings and achieve the same clocks.


This is a 3600C16 kit.

I'm running 4000 @ 16-16-16-32 1T for 24/7 on my 3600C16 kit, which on average should be better than 4133 with crap timings.


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> Finally, +850 on the vram with 8hrs of OCCT GPU scanner 0 error.


Wow, what's your power draw during the run? (You can't just leave it as a background task, because as soon as you click another window, OCCT downclocks to 20 FPS until you bring it to the front again.)

(Just wanting to double check).

Maybe I ought to write a little guide/article on this.

Sorry if you've done it correctly and I'm writing this pointlessly.. but I'm just worried in case someone doesn't notice these things and thinks they're stable when they're not!


----------



## nrpeyton

Quote:


> Originally Posted by *Hulio225*
> 
> Frequencies over 4133MHz will most likely only run on an Asus Apex, but in general, as I already showed here in this thread, you can set looser timings and achieve the same clocks.
> 
> 
> This is a 3600C16 kit.
> 
> I'm running 4000 @ 16-16-16-32 1T for 24/7 on my 3600C16 kit, which on average should be better than 4133 with crap timings.


Okay so with the APEX mobo (which I'm almost sold on).

Would I be better off grabbing a 3600 CL16 kit and manually tuning it to 4133 CL19,
or just buying the 4133 CL19 kit?


----------



## Hulio225

Quote:


> Originally Posted by *nrpeyton*
> 
> Okay so with the APEX mobo (which I'm almost sold on).
> 
> Would I be better off grabbing a 3600 CL16 kit and manually tuning it to 4133 CL19,
> or just buying the 4133 CL19 kit?


I'm pretty sure that every 3600C16 kit is able to run 4133C19. If the price difference isn't that much you can go with the 4133; your decision.
I personally would save the money and buy a keg of beer or something ;-)


----------



## nrpeyton

Quote:


> Originally Posted by *Hulio225*
> 
> I'm pretty sure that every 3600C16 kit is able to run 4133C19. If the price difference isn't that much you can go with the 4133; your decision.
> I personally would save the money and buy a keg of beer or something ;-)


Okay, excellent. Would you say CL16 is the tightest you'd go on a 3600 kit?

So for example if I want to hit 4133...

But there are two kits available:
a 3600 CL16
and a
3600 CL17

And the CL17 is cheaper... would I still have the same chance of hitting 4133/4266 etc. on the CL17 kit,
or are you definitely recommending the CL16?

OR - IN OTHER WORDS:
Are the 3600 CL17 kits still Samsung B-die? Or is B-die limited to 3600 CL15/16?

(Thank you; I appreciate you clearing this up for me; I'm coming over from the AMD platform.)
And memory isn't cheap, lol.


----------



## feznz

Quote:


> Originally Posted by *Hulio225*
> 
> So in theory you think you can get a 4266 kit which has this Timings: CL19 26-26-46 downclocked to 3600MHz and set those timings CL 9-9-9-22 or somethin?
> Have i understood that right? If so, no thats not how it works practicaly.


Quote:


> Originally Posted by *nrpeyton*
> 
> There's a little equation you can do to calculate "true latency". I've found it quite helpful:
> CAS*2000/Speed(in MHz)= true latency
> 
> Anyway tonight I'm on a mission to find a memory kit to buy.
> 
> But effectively what we're saying then *(to summarise) is*: I can grab a good 3600 kit; change the timings so they match the timings advertised with a 4133 kit... and the memory will effectively run like a 4133 kit at normal timings for a 4133 kit? _(as advertised)?_
> 
> So for example I could buy a cheaper 3600 CL16-16-16-36 kit _(Trident Z F4-3600C17Q-64GTZSW)_
> *T*hen change timings so it effectively becomes a 4200 19-19-19-39 _(Trident F4-4266C19D-16GTZKW_) kit ?


I'm not the best person to talk to about memory; those numbers were plucked out of the air.
Put your questions here: http://www.overclock.net/t/1268061/ocn-ram-addict-club-gallery/8000_20#post_25768027

In my testing, tighter timings yielded more FPS than higher frequency in the Valley benchmark with GTX 770 SLI (that thread is now pretty much dead). But it is chasing that last 1%, maybe 2%, and that was on the P77 platform.

I am getting the competitive bench itch again, but it's kinda hard to find the time with children aged 3 & 5 to look after at the same time.


----------



## Hulio225

Quote:


> Originally Posted by *feznz*
> 
> I'm not the best person to talk to about memory; those numbers were plucked out of the air.
> Put your questions here: http://www.overclock.net/t/1268061/ocn-ram-addict-club-gallery/8000_20#post_25768027
> 
> In my testing, tighter timings yielded more FPS than higher frequency in the Valley benchmark with GTX 770 SLI (that thread is now pretty much dead). But it is chasing that last 1%, maybe 2%, and that was on the P77 platform.
> 
> I am getting the competitive bench itch again, but it's kinda hard to find the time with children aged 3 & 5 to look after at the same time.


Yeah, that's what I'm preaching all the time here: a combination of frequency and tight timings is better than just frequency with bad timings.
Those high-MHz kits are just binned for high MHz, which often means they can be bad at tight timings.
Quote:


> Originally Posted by *nrpeyton*
> 
> Okay, excellent. Would you say CL16 is the tightest you'd go on a 3600 kit?
> 
> So for example if I want to hit 4133...
> 
> But there are two kits available:
> a 3600 CL16
> and a
> 3600 CL17
> 
> And the CL17 is cheaper... would I still have the same chance of hitting 4133/4266 etc. on the CL17 kit,
> or are you definitely recommending the CL16?
> 
> OR - IN OTHER WORDS:
> Are the 3600 CL17 kits still Samsung B-die? Or is B-die limited to 3600 CL15/16?
> 
> (Thank you; I appreciate you clearing this up for me; I'm coming over from the AMD platform.)
> And memory isn't cheap, lol.


It is B-die too... I can't guarantee anything, but I would guess those will do 4000 @ 16-16-16-32 1T as well, which is what I would aim for. 4133 with bad timings will most likely perform worse than 4000 with better timings.


----------



## c0nsistent

Well my Antec 620 ziptied onto my FE card holds temps under 75C even in 4K playing the Division. Clocks are in the 1924 to 2000 range throughout, and in BF1, I typically see clockspeeds around 2012-2025 which is fine for me until I shunt mod the card and try to squeeze out 2100 which I already know the card can do without insta crashing at lower temps.

I was gaming on a QNIX 1440p display that is 120hz capable but was only able to use 75hz with an adapter, so I was thinking about trying to swap my card with someone that has a TXP 2016 with a little cash on the side as well so that I would have the DL-DVI connector to use and wouldn't have to buy a $500+ monitor with DP or HDMI 2.0.

Well, in my infinite wisdom I decided to pick up a cheapo 4K TV from Walmart for $300 and see how 4K was on it. The 43" size is passable for a monitor, but the noticeable delay compared to a 120Hz panel is something to get used to indeed. It is a cheap LED panel, so the lag isn't horrendous. However, it is noticeable, and when there is tearing, it's impossible to ignore.

This TV will only do 4:4:4 chroma at 4K 30Hz, and 4:2:0 at 4K 60Hz, so I've been having to switch between modes for gaming and desktop usage.

Has anyone else tried using a 4K TV with their 1080 Ti? I'm satisfied with the FPS I'm getting as well. I'm able to max The Division out (free trial weekend), albeit with some shadows and particles turned down a notch, and I rarely dip below 60fps even when in firefights.
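The 4:4:4-at-30Hz / 4:2:0-at-60Hz split is exactly what an HDMI 1.4-class input forces. Here's a rough back-of-the-envelope check; the pixel clocks are the standard CTA-861 values and the link rates are the usual HDMI effective data rates (illustrative constants, not measurements from this particular TV):

```python
# Why a cheap 4K TV does 4:4:4 only at 30Hz: an HDMI 1.4-class input
# simply lacks the bandwidth for 4K60 4:4:4. 4:2:0 halves the bits
# per pixel, which is what squeezes 4K60 under the limit.

PIXEL_CLOCK_MHZ = {"4K30": 297.0, "4K60": 594.0}          # 3840x2160 timings
BITS_PER_PIXEL = {"4:4:4": 24, "4:2:2": 16, "4:2:0": 12}  # 8-bit colour

HDMI_1_4_GBPS = 8.16   # 10.2 Gbit/s raw, minus 8b/10b overhead
HDMI_2_0_GBPS = 14.4   # 18 Gbit/s raw, minus 8b/10b overhead

def video_gbps(mode: str, chroma: str) -> float:
    """Uncompressed video data rate in Gbit/s."""
    return PIXEL_CLOCK_MHZ[mode] * BITS_PER_PIXEL[chroma] / 1000.0

for mode in ("4K30", "4K60"):
    for chroma in ("4:4:4", "4:2:0"):
        rate = video_gbps(mode, chroma)
        verdict = "fits HDMI 1.4" if rate <= HDMI_1_4_GBPS else "needs HDMI 2.0"
        print(f"{mode} {chroma}: {rate:.2f} Gbit/s -> {verdict}")
```

4K60 4:4:4 works out to ~14.26 Gbit/s, over the HDMI 1.4 budget but inside HDMI 2.0, while dropping to 4:2:0 halves the payload to ~7.13 Gbit/s, which is why the TV accepts that at 60Hz.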


----------



## ttnuagmada

I wasn't aware that any televisions with TN panels even existed. I imagine it's a VA panel.


----------



## Luckbad

Got the EVGA 1080 Ti SC2 Hybrid in today. Clocks itself to 1936 in Boost 3.0 by default. Already looking pretty good...

But the fan for the Power/Memory is slightly higher pitched than I prefer.

Damn it.

I've become picky about noise in my old age. It's not even loud, it's just not a pitch I like to hear. It's basically a stock FE card type fan over that one spot and I hate that sound.

Yes, I'm the worst. I should just stick with my Zotac.

This card is going back unless someone wants to grab it before I get a refund.


----------



## c0nsistent

Quote:


> Originally Posted by *ttnuagmada*
> 
> I wasn't aware that any Televisions with TN panels even existed. I imagine that it's a VA panel.


It has a terrible viewing angle so it definitely resembles one to me.


----------



## ttnuagmada

The cheap VA panels can have pretty bad viewing angles too.


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> wow, whats your power draw during the run? (you can't just leave it as a background task -- because as soon as you click another window, OCCT downclocks to 20 FPS until you bring it to the front again).
> 
> (Just wanting to double check).
> 
> Maybe I ought to write a little guide/article on this.
> 
> Sorry if you've done it correctly and I'm writing this pointlessly.. but I'm just worried in case someone doesn't notice these things and thinks they're stable when they're not!


Since I only used OCCT to test the VRAM, I put the power limit all the way down to 50%, undervolted and downclocked the core in Afterburner, and left it to run overnight. It only burned about 200W of power during the run. You need to turn off the screensaver and the screen-off timer to prevent OCCT from downclocking.


----------



## Nico67

Decided to try the Zotac AMP Extreme BIOS today. As far as benchmarks go it's not bad, but I couldn't get it to play games at the same clocks as the FE BIOS.

2100/6148 @ 1.043v


Holds clocks better, and like the Asus Strix BIOS it does like higher RAM clocks. GPU-Z reports it as power limited when not under load, but just ignore that. Max power 310W at 96.7% TDP, which looks right for 97% of 320W. However, it is clearly limiting on the 117.9% normalized power that AB is also measuring.

This is the FE, which didn't run as high as previously, but it's still a good power indication.


300W, around 120% on everything.

The funny thing is that most BIOSes other than the FE one just don't play games stably, but I'm looking forward to seeing what this XOC BIOS can do. Surely Asus won't want to let it slip, as it's generally the advantage that gets people buying their product.


----------



## Slackaveli

Quote:


> Originally Posted by *c0nsistent*
> 
> Well my Antec 620 ziptied onto my FE card holds temps under 75C even in 4K playing the Division. Clocks are in the 1924 to 2000 range throughout, and in BF1, I typically see clockspeeds around 2012-2025 which is fine for me until I shunt mod the card and try to squeeze out 2100 which I already know the card can do without insta crashing at lower temps.
> 
> I was gaming on a QNIX 1440p display that is 120hz capable but was only able to use 75hz with an adapter, so I was thinking about trying to swap my card with someone that has a TXP 2016 with a little cash on the side as well so that I would have the DL-DVI connector to use and wouldn't have to buy a $500+ monitor with DP or HDMI 2.0.
> 
> Well, in my infinite wisdom I decided to pick up a cheapo 4K TV from walmart for $300 and see how 4K was on it. The 43" size is passable for a monitor but the noticeable delay compared to a 120hz panel is something to get used to indeed. It is a cheap LED panel so the lag isn't horrendous. However, it is noticeable and when there is tearing, it's impossible to ignore it.
> 
> This TV will only do 4:4:4 chroma @ 4K 30hz, and 4:2:0 at 4K 60hz, so I've been having to switch between modes for gaming and desktop usage.
> 
> Has anyone else tried using a 4K TV with their 1080 Ti? I'm satisfied with the FPS I'm getting as well. I'm able to Max The Divison out (Free trial weekend) albeit with some shadows and particles turned down a notch, and I rarely dip below 60fps even when in firefights.


use nvidia "fast sync" in the control panel and a frame rate limiter to 59 fps. it will have much less input lag and very little screen tear. (i play many games on my samsung 4k tv)


----------



## feznz

Quote:


> Originally Posted by *c0nsistent*
> 
> Well my Antec 620 ziptied onto my FE card holds temps under 75C even in 4K playing the Division. Clocks are in the 1924 to 2000 range throughout, and in BF1, I typically see clockspeeds around 2012-2025 which is fine for me until I shunt mod the card and try to squeeze out 2100 which I already know the card can do without insta crashing at lower temps.
> 
> I was gaming on a QNIX 1440p display that is 120hz capable but was only able to use 75hz with an adapter, so I was thinking about trying to swap my card with someone that has a TXP 2016 with a little cash on the side as well so that I would have the DL-DVI connector to use and wouldn't have to buy a $500+ monitor with DP or HDMI 2.0.
> 
> Well, in my infinite wisdom I decided to pick up a cheapo 4K TV from walmart for $300 and see how 4K was on it. The 43" size is passable for a monitor but the noticeable delay compared to a 120hz panel is something to get used to indeed. It is a cheap LED panel so the lag isn't horrendous. However, it is noticeable and when there is tearing, it's impossible to ignore it.
> 
> This TV will only do 4:4:4 chroma @ 4K 30hz, and 4:2:0 at 4K 60hz, so I've been having to switch between modes for gaming and desktop usage.
> 
> Has anyone else tried using a 4K TV with their 1080 Ti? I'm satisfied with the FPS I'm getting as well. I'm able to Max The Divison out (Free trial weekend) albeit with some shadows and particles turned down a notch, and I rarely dip below 60fps even when in firefights.


Check this out if you plan on a replacement TV

https://displaylag.com/display-database/
Quote:


> When using dual link, at a refresh rate of 60Hz you can get a resolution of 2560×1600 and for intense gaming monitors refreshing at 120Hz you can get 1920×1200. All modes that require more than 24 bits per pixel, or 165 MHz pixel clock frequency must use dual-link mode.


Source https://seriousseverity.wordpress.com/2013/10/10/monitor-connections-explained/
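The dual-link figures in that quote can be sanity-checked with a rough pixel-clock estimate. The fixed blanking overhead below is a crude stand-in for the real CVT reduced-blanking formula (an assumption for illustration), so the MHz numbers are approximate:

```python
# Single-link DVI tops out at a 165 MHz pixel clock; dual link
# doubles that to 330 MHz. Approximate CVT reduced blanking with a
# fixed 160-pixel horizontal / 35-line vertical overhead (a rough
# estimate, not the exact CVT arithmetic).

SINGLE_LINK_MHZ = 165.0
DUAL_LINK_MHZ = 2 * SINGLE_LINK_MHZ  # 330 MHz

def approx_pixel_clock_mhz(h_active: int, v_active: int, hz: int) -> float:
    """Estimated pixel clock in MHz for a given mode."""
    return (h_active + 160) * (v_active + 35) * hz / 1e6

for h, v, hz in ((2560, 1600, 60), (1920, 1200, 120)):
    clk = approx_pixel_clock_mhz(h, v, hz)
    link = "single link" if clk <= SINGLE_LINK_MHZ else "dual link"
    fits = clk <= DUAL_LINK_MHZ
    print(f"{h}x{v}@{hz}Hz ~ {clk:.0f} MHz -> {link}, fits dual link: {fits}")
```

Both modes land in the high-200s/low-300s of MHz: past single link's 165 MHz but inside dual link's 330 MHz, which matches the quote.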

Buy Cheap Buy twice
Quote:


> Originally Posted by *nrpeyton*
> 
> Okay so with the APEX mobo (which I'm almost sold on).
> 
> Would I be better grabbing a 3600 CS 16 kit and manually tuning it to 4133 CS 19
> Or just buying the 4133 CS 19 kit?


If they are both B-die, surely they are binned memory modules, meaning the 4133 would have the better bin.
I can't get my head around lower-binned chips being used for faster memory.
Albeit a small advantage, for benching every iota counts IMHO.


----------



## Dasboogieman

Quote:


> Originally Posted by *BBEG*
> 
> Unless your 290s were not voltage locked, it's difficult to imagine any system with xfire 290s abl not draw > 1Kw and damage the PSU (overclocked bulldozer aside). Faulty 12v rail?


My unit was fairly well used at that point already, and PSUs generally don't group-regulate at 1kW anymore, for a reason.


----------



## nrpeyton

Quote:


> Originally Posted by *feznz*
> 
> Buy Cheap Buy twice
> If they are both B-Die surely they are binned memory modules meaning the 4133 would have the better bin
> I can't get my head around lower binned chips used for faster memory
> albeit a small small advantage for benching every iota counts IMHO


Went for this: _(order submitted)_

-G.Skill 16GB DDR4-3600 16GB DDR4 3600MHz memory module (16GB(8GBx2) G.SKILL Trident Z DDR4 PC4 28800 3600MHz C16 Kit

-ASUS ROG MAXIMUS IX APEX Intel Z270 ATX motherboard (Asus ROG MAXIMUS IX APEX Intel Z270 1151 ATX DDR4 EATX XFire/SLI HDMI DP LED Lighting Extreme Overclocking

-Intel Core i7-7700K 4.2GHz 8MB Smart Cache Box processor (Intel Core I7-7700K CPU 1151 4.2 GHz Quad Core 91W 14nm 8MB Overclockable Kaby Lake NO HEATSINK/FAN

Can't wait to pair it with my new 1080 Ti.
Coming from an AMD-FX 8350

Think I'll notice the difference in FPS lol?


----------



## alucardis666

So I swapped CPU coolers and installed hybrid kits on my 1080tis. I go to turn on my computer and now it's stuck on post code 0d...

Tried flashing bios, pulling jumper, clearing cmos, 1 card, 1 dimm, different ram. Nothing...

Should I RMA the board? It's either the board or the CPU that is dead or defective I'd imagine.

Thoughts?!?


----------



## Slackaveli

Quote:


> Originally Posted by *feznz*
> 
> Check this out if you plan on a replacement TV
> 
> https://displaylag.com/display-database/
> Source https://seriousseverity.wordpress.com/2013/10/10/monitor-connections-explained/
> 
> Buy Cheap Buy twice
> If they are both B-Die surely they are binned memory modules meaning the 4133 would have the better bin
> I can't get my head around lower binned chips used for faster memory
> albeit a small small advantage for benching every iota counts IMHO


yeah, man, good source. here's what i use for my action games and whatnot. use the g-sync 1440p 144hz for the shooters.

2015 55" Samsung UN55JS9000 4K 120hz 3D LED HDTV 22ms


----------



## done12many2

Quote:


> Originally Posted by *alucardis666*
> 
> So I swapped CPU coolers and installed hybrid kits on my 1080tis. I go to turn on my computer and now it's stuck on post code 0d...
> 
> Tried flashing bios, pulling jumper, clearing cmos, 1 card, 1 dimm, different ram. Nothing...
> 
> Should I RMA the board? It's either the board or the CPU that is dead or defective I'd imagine.
> 
> Thoughts?!?


Remove the cooler and CPU. Reseat the CPU and reinstall the cooler, making sure that you aren't putting too much pressure on the CPU. If excessive clamping force was your issue, it should boot then.


----------



## alucardis666

Quote:


> Originally Posted by *done12many2*
> 
> Remove the cooler and CPU. Reseat the CPU and reintstall the cooler making sure that you aren't putting too much pressure onto the CPU. It should boot then if excessive clamping force was your issue.


Nope. No luck.


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> Went for this: _(order submitted)_
> 
> -G.Skill 16GB DDR4-3600 16GB DDR4 3600MHz memory module (16GB(8GBx2) G.SKILL Trident Z DDR4 PC4 28800 3600MHz C16 Kit
> 
> -ASUS ROG MAXIMUS IX APEX Intel Z270 ATX motherboard (Asus ROG MAXIMUS IX APEX Intel Z270 1151 ATX DDR4 EATX XFire/SLI HDMI DP LED Lighting Extreme Overclocking
> 
> -Intel Core i7-7700K 4.2GHz 8MB Smart Cache Box processor (Intel Core I7-7700K CPU 1151 4.2 GHz Quad Core 91W 14nm 8MB Overclockable Kaby Lake NO HEATSINK/FAN
> 
> Cant wait to pair it with my new 1080 Ti
> Coming from an AMD-FX 8350
> 
> Think I'll notice the difference in FPS lol?


LOL, bro, you have been so gimped you just don't even know. Do you play Arma 3? Do a last test before the switch. You'll see an enormous gain in that game. ENORMOUS.

Quote:


> Originally Posted by *alucardis666*
> 
> So I swapped CPU coolers and installed hybrid kits on my 1080tis. I go to turn on my computer and now it's stuck on post code 0d...
> 
> Tried flashing bios, pulling jumper, clearing cmos, 1 card, 1 dimm, different ram. Nothing...
> 
> Should I RMA the board? It's either the board or the CPU that is dead or defective I'd imagine.
> 
> Thoughts?!?


OMG, dude, nooooo

So hard to figure out the true cause in that situation! Troubleshooting nightmare; I hope somebody here can help. Ryzen forum perhaps?


----------



## Alwrath

Quote:


> Originally Posted by *alucardis666*
> 
> Nope. No luck.


Try taking out one graphics card; it may be your power supply.


----------



## alucardis666

Quote:


> Originally Posted by *Alwrath*
> 
> Try taking out 1 graphics card, may be your power supply.


If it was the PSU wouldn't it not even power on?


----------



## Luckbad

Quote:


> Originally Posted by *alucardis666*
> 
> If it was the PSU wouldn't it not even power on?


If your CPU has an integrated graphics chip, take both of your GPUs out. If it still doesn't boot, clear your CMOS.


----------



## madmeatballs

Quote:


> Originally Posted by *Slackaveli*
> 
> LOL, bro, you have been so gimped you just don't even know. Do you play Arma 3? do a last test before the switch. You'll see an enormous gain in that game. ENORMOUS>
> OMG, dude, nooooo


This! I play ARMA 3 a lot. I jumped from the 8350 to a 4790K before and got a big FPS increase. Now I have a 7700K and it improved even more since it's at 5GHz. Higher RAM frequency also adds some FPS. I base everything I buy on running Arma, since it is what I play.


----------



## Slackaveli

Quote:


> Originally Posted by *madmeatballs*
> 
> This! I play ARMA 3 alot, jumped from the 8350 to 4790k before I got a big fps increase. Now I have a 7700k and it improved more since its on 5GHz. Higher ram frequency also adds some fps. I base everything I buy on running arma since it is what I play.


there you go, man. if you want to see if your ram/cpu combo is up to the challenge, play arma 3! it's a beautiful game and it's a lot to render.


----------



## alucardis666

Quote:


> Originally Posted by *Luckbad*
> 
> If your CPU has an integrated graphics chip, take both of your GPUs out. If it still doesn't boot, clear your CMOS.


I do not have an igpu


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> Went for this: _(order submitted)_
> 
> -G.Skill 16GB DDR4-3600 16GB DDR4 3600MHz memory module (16GB(8GBx2) G.SKILL Trident Z DDR4 PC4 28800 3600MHz C16 Kit
> 
> -ASUS ROG MAXIMUS IX APEX Intel Z270 ATX motherboard (Asus ROG MAXIMUS IX APEX Intel Z270 1151 ATX DDR4 EATX XFire/SLI HDMI DP LED Lighting Extreme Overclocking
> 
> -Intel Core i7-7700K 4.2GHz 8MB Smart Cache Box processor (Intel Core I7-7700K CPU 1151 4.2 GHz Quad Core 91W 14nm 8MB Overclockable Kaby Lake NO HEATSINK/FAN
> 
> Cant wait to pair it with my new 1080 Ti
> Coming from an AMD-FX 8350
> 
> Think I'll notice the difference in FPS lol?


I thought you might go the bang-for-buck route. I just looked into it myself, and I honestly can't quite justify a mobo upgrade yet........
I will get 4133MHz or better when I do upgrade. It will probably be an 8-core-plus CPU; it has to be at least a 4.8GHz minimum guaranteed OC, and I will be sold as long as the whole package retails around or under $2000 USD.

Those are my demands.









Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, man, good source. here's what i use for my action games and whatnot. use the g-sync 1440p 144hz for the shooters.
> 
> 2015 55" Samsung UN55JS9000 4K 120hz 3D LED HDTV 22ms


I also have a nice TV on the wish list. The current TV has been attacked by children (dead pixels) and fell over in an earthquake, which ripped out the TOSLINK.

Am I right that that TV has no DP input? That would be a deal breaker for me. It just seems a bit weird to rate a panel much higher than any display input is rated for.

Which brings me to the next point: why is there no DisplayPort version 2? It seems there is already a limitation on available bandwidth, where 4K 120Hz would need 2 DP cables in unison to create the required bandwidth to drive the panel to its full potential.


----------



## dansi

How easy is it to install the EVGA 1080 AIO? The 1080 Ti AIO is not going in stock for months, it is also $50 more, and it seems to only cover the VRAM? At least from what I've seen of the SC Hybrid.

For the 1080 AIO, I just need to remove the backplate, unscrew the vapor chamber, remove the front cover, dismantle the acrylic window, screw on the cooling plate and reassemble the backplate? Seems easy, amirite?


----------



## wrc05

^^ @dansi

I didn't remove backplate, only the 4 spring screws that hold vapor chamber.



Spoiler: Warning: Spoiler!










Excuse the poor quality of the photos; even I was tired of waiting for the Ti cooler.
I did get the EVGA in-stock notification email somewhere in the middle of the night, and by the time I woke up they were gone.


----------



## Chris Ihao

Weird. I just got an Nvidia Shield for my living room, which works brilliantly btw, and ran into an unexpected issue. My so-far rock solid +125 power, +80 core (2012 MHz) OC causes artifacting: smallish white squares that pop up here and there, disappearing quickly. Not on my monitor, but on my TV. At first I thought there was some compression issue going on and tried to fix it by changing misc settings, but it reminded me a lot of memory artifacting back in the day. I finally tried turning off the overclock, and voila! No artifacts. I had to relax it down to around +120 power and +50 core to OC without the artifacts.

Anyone have a clue what's going on? I know Shield uses some kind of processing on the 1080 Ti, but this was really unexpected. No worries though, I am perfectly happy running the above settings, landing at a stable 1972 MHz.


----------



## RavageTheEarth

So I've taken a step back from the whole "curve" scene and decided to try overclocking the old school way, and I've found some interesting results. Overclocking this way, I never have any PerfCaps under load. Also, my clock speed never drops from what I set it to. I always had the GPU utilization PerfCap under load, and that's gone too. I know I'm stable at +70/500, which nets me 2063 core. The voltage starts low at 1.012v and slowly rises until it settles at 1.062v. The card never drops from 2063 the entire time, which is cool, because when I'm using the curve my core clock will drop, but not when overclocking this way for some reason.

The interesting part is when I do +80 which is 2076. It holds the clock speed perfectly with no perfcaps in Superposition UNTIL the card is about to hit 50C which results in a crash every time. Right when the card is teetering between 49C-50C is when the card crashes. Looks like I needed this waterblock more than I thought. Also, I have +100 on the voltage of the card, but the card never will go over 1.062v which is strange. I'm going to play around some more and try to figure this out.

Here is the screenshot of my 2076 run that resulted in a crash.


----------



## Garrett1974NL

Quote:


> Originally Posted by *CoreyL4*
> 
> Would you guys rather have 2012/6000 or 2037/5900 as a daily clock?


IF the 2012 doesn't dip below 2000 I think you're good. 6000 mem just looks nicer hahah.. I think performance wise you won't notice any difference, maybe in 3DMark but that's something I really don't care about.

On a side note, would a shunt mod help achieve a higher overclock? 2050 is rock solid right now, but anything above it just isn't... MSI 1080 Ti Armor OC (Gaming X PCB except the RGB and LED connectors).


----------



## Benny89

Quote:


> Originally Posted by *Luckbad*
> 
> Got the EVGA 1080 Ti SC2 Hybrid in today. Clocks itself to 1936 in Boost 3.0 by default. Already looking pretty good...
> 
> But the fan for the Power/Memory is slightly higher pitched than I prefer.
> 
> Damn it.
> 
> I've become picky about noise in my old age. It's not even loud, it's just not a pitch I like to hear. It's basically a stock FE card type fan over that one spot and I hate that sound.
> 
> Yes, I'm the worst. I should just stick with my Zotac.
> 
> This card is going back unless someone wants to grab it before I get a refund.


Well, it was to be expected, as the Hybrid is basically an enhanced reference card plus their hybrid cooler.

However, what are your OC results so far? And what are temps after max OC? Please kindly share.


----------



## dansi

@wrc05 Good to hear less disassembly is needed! You didn't even have to remove the I/O shield. But did you cut up your front plate? Why do I see RAM sinks on it?

On a 1-10 scale, how easy would you rate it, and how fast did you get it done? TIA!


----------



## Benny89

Can anyone please tell me what size heatsinks I need for all the RAM modules on a 1080 Ti? I need them for the VRAM around the GPU die, but also for those smaller RAM chips too, right? How big do they need to be?


----------



## wrc05

^^ I tried to do it like this:

https://forums.evga.com/My-1080-Ti-Hybrid-m2629639.aspx

but backed out. I didn't cut the front plate; I had those heatsinks, so I stuck 'em on hoping they may help a bit.
Installation was easy. My PSU is at RMA and the cooler arrived, so I installed it.

Few more members installed the same, maybe they will fill in.


----------



## toncij

Does anyone have a temperature throttle list? At what temperature does the 1080 Ti FE start to throttle down, and what are the operational parameters? Expected clocks? How much do you reach without a manual overclock on an AIO, etc.?


----------



## mtbiker033

Quote:


> Originally Posted by *alucardis666*
> 
> If it was the PSU wouldn't it not even power on?


oh no!! I was wondering how you were getting on with the hybrid kits!

you mentioned you changed your cpu cooler? what cooler did you switch to? surely by now you have re-seated it

hang in there, go step by step


----------



## mshagg

Quote:


> Originally Posted by *RavageTheEarth*
> 
> So I've taken a step back from the whole "curve" scene and decided to try overclocking the old school way.


I ended up doing something similar. Still running the Palit BIOS. +37 on the core, 100% volts, 116% power, +483 memory. Job done.

Rock solid 2050Mhz, couldn't even tell you what voltage bin it sits in. ~10500 SuPo 4K.

Enjoying the card a lot more since I stopped tweaking it and started playing games with it lol.


----------



## Benny89

Regarding Asus warranty stickers on Backplate screw:

*I was speaking via email with an Asus representative and he confirmed that removing, damaging, etc. the sticker automatically voids the warranty. Removing the cooler, repasting the card, etc. is an automatic warranty void.*

So if you ever plan to have any fun with your card, I strongly recommend not picking Asus.

2017 and they still go with those warranty stickers... omg... .


----------



## Dasboogieman

Quote:


> Originally Posted by *Benny89*
> 
> Regarding Asus warranty stickers on Backplate screw:
> 
> *I was speaking via emails with Asus representative and he confirmed that removing, damaging etc. of sticker is automatic warranty void. Removing cooler, repasting card etc is auto warranty void.*
> 
> So if you ever plan to have any fun with your card, I strongly recommend not picking Asus.
> 
> 2017 and they still go with those warranty stickers... omg... .


check your consumer laws carefully, those override whatever the f ASUS says.

e.g. In Australia, the consumer is entitled to 2 years defect free warranty regardless of manufacturer policies, including warranty stickers. As an added bonus, the consumer is also entitled to inspect their product to ensure no fault in manufacturing or function exists and this must be warranted unless the manufacturer can prove beyond reasonable doubt that modifications were done.

The devil is in the wording. You may find those stickers mean crap when contested against your local laws.


----------



## Luckbad

Quote:


> Originally Posted by *Benny89*
> 
> Well, it was to be expected, as the Hybrid is basically an enhanced reference card plus their hybrid cooler.
> 
> However, what are your OC results so far? And what are temps after max OC? Please kindly share.


I decided not to try overclocking. Didn't want to get attached since the fan sound bugs me. GPU temp never exceeded the 40s while I was testing it.


----------



## c0nsistent

So I'm the only one around here to do the old school 'mod' to their 1080 Ti and ziptie an AIO?


----------



## KickAssCop

Finally the card I wanted.


----------



## Benny89

Quote:


> Originally Posted by *Dasboogieman*
> 
> check your consumer laws carefully, those override whatever the f ASUS says.
> 
> e.g. In Australia, the consumer is entitled to 2 years defect free warranty regardless of manufacturer policies, including warranty stickers. As an added bonus, the consumer is also entitled to inspect their product to ensure no fault in manufacturing or function exists and this must be warranted unless the manufacturer can prove beyond reasonable doubt that modifications were done.
> 
> The devil is in the wording. You may find those stickers mean crap when contested against your local laws.


Good point. Does anyone here know how it looks in the EU?


----------



## done12many2

Quote:


> Originally Posted by *KickAssCop*
> 
> Finally the card I wanted.
> 


Are you running x99 in a dual channel memory configuration? That's illegal where I live. They can confiscate your rig for that.


----------



## RavageTheEarth

Quote:


> Originally Posted by *Benny89*
> 
> Regarding Asus warranty stickers on Backplate screw:
> 
> *I was speaking via emails with Asus representative and he confirmed that removing, damaging etc. of sticker is automatic warranty void. Removing cooler, repasting card etc is auto warranty void.*
> 
> So if you ever plan to have any fun with your card, I strongly recommend not picking Asus.
> 
> 2017 and they still go with those warranty stickers... omg... .


Those stupid stickers. I hate companies like that. Let me do what I want with my card. I'm not a 5 year old and know how to deal with electronics. I just checked my Aorus for stickers and I don't see any. Good thing, since I'm slapping a block on this card the second EK releases it. My card is crying for water. It's so thirsty








Quote:


> Originally Posted by *KickAssCop*
> 
> Finally the card I wanted.
> 


Throw some more RAM in there! Looks so sad seeing an X99 with only two sticks of memory on one side


----------



## RavageTheEarth

Quote:


> Originally Posted by *mshagg*
> 
> I ended up doing something similar. Still running the Palit BIOS. +37 on the core, 100% volts, 116% power, +483 memory. Job done.
> 
> Rock solid 2050Mhz, couldn't even tell you what voltage bin it sits in. ~10500 SuPo 4K.
> 
> Enjoying the card a lot more since I stopped tweaking it and started playing games with it lol.


I'm still playing with it. It is addictive. I couldn't get +80 without crashing, so now I'm back to the curve and have obtained a perfectly stable 2076 @ 1.081v. I'll probably just go back down to my underclocked 1.00v/2050/6000 OC for now. 0.081 extra volts for an extra 27MHz isn't really worth it. Once I get this card under water I'll push it further.
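For what it's worth, that "not worth it" call can be roughed out with the usual dynamic-power approximation P ∝ V²·f. This is a simplification that ignores leakage and temperature, just plugging in the numbers above:

```python
# Rough worth-it check for a voltage/clock step, using the common
# dynamic-power approximation P ~ V^2 * f (ignores leakage and temperature).

def step_cost(v0, f0, v1, f1):
    """Return (fractional perf gain, fractional power gain) for a V/f step."""
    perf_gain = f1 / f0 - 1.0
    power_gain = (v1 / v0) ** 2 * (f1 / f0) - 1.0
    return perf_gain, power_gain

perf, power = step_cost(1.000, 2050, 1.081, 2076)
print(f"+{perf:.1%} clocks for +{power:.1%} power")
# roughly +1.3% clocks for +18.3% power
```

By this crude model the 1.081v step buys about 1.3% more clock for roughly 18% more heat to dump, which is why dropping back to 1.00v/2050 until the waterblock arrives is a sensible call.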


----------



## Benny89

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Those stupid stickers. I hate companies like that. Let me do what I want with my card. I'm not a 5 year old and know how to deal with electronics. I just checked my Aorus for stickers and I don't see any. Good thing, since I'm slapping a block on this card the second EK releases it. My card is crying for water. It's so thirsty
> 
> 
> 
> 
> 
> 
> 
> 
> Throw some more RAM in there! Looks so sad seeing an X99 with only two sticks of memory on one side


Yeah, I pre-ordered the EVGA FTW3 to check it out. If it is even marginally better than my STRIX I will keep it instead. I like to have options, for example if I want to water-cool the card, or strap an AIO to it, etc. With Asus I feel limited in what I can do. The STRIX is still the best-looking Ti on the market for me, but EVGA has always had the best warranty on the market.


----------



## KickAssCop

My X99 left hand side ram slots are busted. So I can only run dual channel for now.


----------



## mtbiker033

Quote:


> Originally Posted by *c0nsistent*
> 
> So I'm the only one around here to do the old school 'mod' to their 1080 Ti and ziptie an AIO?


haha maybe! did you post pictures of your mod? I would love to see it but maybe missed them


----------



## Hulio225

Quote:


> Originally Posted by *Benny89*
> 
> Good point. Anyone here that has knowledge about how does it look like in EU?


In Germany, I know those warranty stickers are worth nothing... As long as you haven't broken something yourself, you can basically do whatever you want with the hardware....
otherwise you wouldn't be able to swap the cooler for a waterblock, for example...


----------



## KickAssCop

Actually this card is doing much better than the rest.
Finally. 2050/12100 sustained. 68 C. No sound.








Fan doesn't go over 42%.

Screw MSI Gaming X and Founder's Edition.


----------



## CoreyL4

So I am doing the OCCT memory overclock test and I am confused. After 5 or 6 seconds, my memory drops down to ~5500, so I can't see if 6000+ is stable.

Why is it doing that?


----------



## Qba73

Its on.. let the best card win.. will report back findings.


----------



## Slackaveli

Quote:


> Originally Posted by *feznz*
> 
> I thought you might go Bang For Buck that route I just looked into it myself I honestly can't quite justify a mobo Upgrade yet........
> I will get the 4133Mhz or better when I do upgrade, probably be a 8core plus CPU have to be at least 4.8Ghz min guaranteed OC I will be sold as long as the whole package retails around or under $2000USD.
> 
> They are my demands
> 
> 
> 
> 
> 
> 
> 
> 
> I also got a nice TV on the wish list the current TV has been attacked by children dead pixels and fallen over in a earthquake ripped out the tos link
> 
> you could tell me that TV has no DP input ? which would be a deal breaker for me just seems a bit weird rate a panel much higher than any display input is rated for.
> 
> which brings me to the next point why is there no Display Port version 2? seems that there is already a limitation on available bandwidth where 4k 120Hz would need 2 DP cables in unison to create the required bandwidth to drive the panel to the full potential


i believe DP 1.4 has us covered there, and HDMI 2.1 has enough bandwidth for 4k/120Hz+HDR and VRR (variable refresh rate) tech. So, in future we're good.

HDMI 2.0 on my 4k TV is what made me "upgrade" from the 290x to gtx 980s (the 1st HDMI 2.0 gpu), but it has had me complaining about the cable tech ever since. DP 1.2 is pathetic, and monitor manufacturers are the ones holding us ALL back. The DP 1.3 standard has been out for years and no monitors had an input for it. DP 1.4 was out before even the first gaming monitor with DP 1.3 tech came out. It's pathetic, tbh. We could have had high refresh rate 4k 2 years ago. We had the gpu for it with the Titan/980 Ti, just in SLI or on weaker games, of course. But that is often overlooked. People act like if you can't play AAA games at 4k 120 then it's all pointless. But that's crap, because i also play a lot of non-AAA games and they run at high refresh rates in 4k... but we can't see them. dumb as hell....


----------



## hotrod717

Quote:


> Originally Posted by *nrpeyton*
> 
> Went for this: _(order submitted)_
> 
> -G.Skill 16GB DDR4-3600 16GB DDR4 3600MHz memory module (16GB(8GBx2) G.SKILL Trident Z DDR4 PC4 28800 3600MHz C16 Kit
> 
> -ASUS ROG MAXIMUS IX APEX Intel Z270 ATX motherboard (Asus ROG MAXIMUS IX APEX Intel Z270 1151 ATX DDR4 EATX XFire/SLI HDMI DP LED Lighting Extreme Overclocking
> 
> -Intel Core i7-7700K 4.2GHz 8MB Smart Cache Box processor (Intel Core I7-7700K CPU 1151 4.2 GHz Quad Core 91W 14nm 8MB Overclockable Kaby Lake NO HEATSINK/FAN
> 
> Cant wait to pair it with my new 1080 Ti
> Coming from an AMD-FX 8350
> 
> Think I'll notice the difference in FPS lol?


Good choices. Just keep in mind you may have to go through several of those kits to find some gems, or you could get lucky on the 1st try.
Personally I like the 3600c15 kits as they are binned better. No guarantee with any of them for 4000c12+


----------



## Slackaveli

Quote:


> Originally Posted by *RavageTheEarth*
> 
> So I've taken a step back from the whole "curve" scene and decided to try overclocking the old school way. I've found some interesting results. I found that overclocking this way, I never have any PerfCaps underload. Also, my clock speed never drops from what I set it to. I always had the GPU utilization Perfcap under load and that's gone too. I know I'm stable at +70/500 which nets me 2063 core. The voltage starts low at 1.012v and slowly raises voltage until it settles at 1.062v. The card never drops from 2063 the entire time which is cool because when I'm using the curve my core clock will drop, but not when overclocking this way for some reason.
> 
> The interesting part is when I do +80 which is 2076. It holds the clock speed perfectly with no perfcaps in Superposition UNTIL the card is about to hit 50C which results in a crash every time. Right when the card is teetering between 49C-50C is when the card crashes. Looks like I needed this waterblock more than I thought. Also, I have +100 on the voltage of the card, but the card never will go over 1.062v which is strange. I'm going to play around some more and try to figure this out.
> 
> Here is the screenshot of my 2076 run that resulted in a crash.


try +51Mv. that's my sweet spot and our cards are probably twins from the same center die








Quote:


> Originally Posted by *done12many2*
> 
> Are you running x99 in a dual channel memory configuration? That's illegal where I live. They can confiscate your rig for that.


thought the same thing. he should arrest himself!
Quote:


> Originally Posted by *Qba73*
> 
> Its on.. let the best card win.. will report back findings.


word, brother. Evga better have sent you a "reviewer's quality" sample or they about to get a refurb to sell!


----------



## s1rrah

I play 1440p with my 1080 ti ... it's the sweet spot for me as I require 100+ FPS for Full Personal Satisfaction(tm) ... LOL ...

But yesterday, I had an epiphany and realized that I have a 48" Samsung JS9000 4K TV sitting in front of me and thought, "Hey ... why not test this thing on my 4K TV!?" ... really, I had forgotten all about my TV being 4K (and JS9000 is a pretty decent gaming TV at that).

So I hook everything up through my receiver and start playing Witcher 3 (Ultra settings, no hair works) ... 3840x2160 ... and dude ... compared to my dual 980's on the same TV? Absolute night and day. I cruised around and was amazed at the snappiness and the fact that I was always between 65FPS and 80FPS ... generally, towards the higher end of those numbers ... and with the "ultra" settings, it literally looked like an animated film ...

I won't be playing on the TV much but man was it totally playable ... no bothersome lag (though the lag was there, it's a TV after all) snappy mouse response time and most importantly ... frame rates generally averaging around 70FPS ...

In the end, I love my 27" 1440p G Sync IPS screen more ... but sure was nice to see such smooth play on my 4K TV; I will say this though ... once a single card can push 4K to frame rates between 100FPS and 144FPS, I'll totally be upgrading to 4K ... but for now? 1440p is it ...

What a card man ...


----------



## pantsoftime

Neal I really was hoping you'd go with a ryzen build, I have to say I'm a bit surprised.


----------



## Slackaveli

Quote:


> Originally Posted by *s1rrah*
> 
> I play 1440p with my 1080 ti ... it's the sweet spot for me as I require 100+ FPS for Full Personal Satisfaction(tm) ... LOL ...
> 
> But yesterday, I had an epiphany and realized that I have a 48" Samsung JS9000 4K TV sitting in front of me and thought, "Hey ... why not test this thing on my 4K TV!?" ... really, I had forgotten all about my TV being 4K (and JS9000 is a pretty decent gaming TV at that).
> 
> So I hook everything up through my receiver and start playing Witcher 3 (Ultra settings, no hair works) ... 3840x2160 ... and dude ... compared to my dual 980's on the same TV? Absolute night and day. I cruised around and was amazed at the snappiness and the fact that I was always between 65FPS and 80FPS ... generally, towards the higher end of those numbers ... and with the "ultra" settings, it literally looked like an animated film ...
> 
> I won't be playing on the TV much but man was it totally playable ... no bothersome lag (though the lag was there, it's a TV after all) snappy mouse response time and most importantly ... frame rates generally averaging around 70FPS ...
> 
> In the end, I love my 27" 1440p G Sync IPS screen more ... but sure was nice to see such smooth play on my 4K TV; I will say this though ... once a single card can push 4K to frame rates between 100FPS and 144FPS, I'll totally be upgrading to 4K ... but for now? 1440p is it ...
> 
> What a card man ...


yeah man, it's (samsung js9000) pretty great for certain games. i have mine on my desk next to my 1440p/144Hz. It really just depends on the game, but some experiences are just way better on the big 4k. for instance, last night i was drunk and not in any condition to play any real games so i fired up pinballFX on the 4k big screen. Holy hell, man. i played for 2 hours. it was fabulous. 8x AA on a 55" 4k from 2 feet away. Just nasty. Another is xcom2. way better on the bigscreen in 4k.


----------



## s1rrah

Quote:


> Originally Posted by *Slackaveli*
> 
> last night i was drunk and not in any condition to play any real games so i fired up pinballFX on the 4k big screen. Holy hell, man. i played for 2 hours. it was fabulous. 8x AA on a 55" 4k from 2 feet away. Just nasty. Another is xcom2. way better on the bigscreen in 4k.


Oh man ... I so relate. LOL!!

That's precisely the state I was in when I decided to roll my 4K JS9000 over in front of me ... so you have that sucker 2ft away? LOL ... mine is on a really sweet castor based stand that I can roll all over my room ... I can get it within 3ft but the other day, I was playing the TV back from about six feet ...

Going to do some more testing this evening ....


----------



## Benny89

Quote:


> Originally Posted by *KickAssCop*
> 
> Actually this card is doing much better than the rest.
> Finally. 2050/12100 sustained. 68 C. No sound.
> 
> 
> 
> 
> 
> 
> 
> 
> Fan doesn't go over 42%.
> 
> Screw MSI Gaming X and Founder's Edition.


Silicon Lottery. I now have my second STRIX. It won't do higher than 2000MHz stable in Witcher 3, 2012 stable in ME:A. My previous one could do 2012 stable.

I know there was someone here who had a STRIX stable at 2075 for 6 hours in Witcher 3.

Silicon Lottery sadly :/


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> word, brother. Evga better have sent you a "reviewer's quality" sample or they about to get a refurb to sell!


It all depends on the silicon lottery. For the past weeks I have watched probably every OC test, "vs" and 1080 Ti review on YT. The AIB design doesn't really give you much. Only if you won the silicon lottery and got a golden chip will the extra stuff like power limit, cooling etc. let you squeeze 100% out of it. If you didn't win the lottery, all those things won't help you anyway.

I have watched several Aorus reviews and some could only do a max of 2012 or 2025 MHz stable in games. The Aorus Xtreme Edition has the lowest user scores/worst reviews of all the AIBs on Newegg and Amazon, and the highest RMA ratio. I personally experienced that when I got my Aorus 1080 Ti Xtreme: it was artifacting and couldn't start any game on stock settings (I didn't install the Aorus software at all).

It is all about the silicon lottery. There is one guy here with an Asus STRIX that he tested for 6 hours stable at 2075 at 1.031V in Witcher 3. That doesn't mean the STRIX are the best cards. It only means he won the lottery.

Your Aorus is the same stuff - it is not better in any way than any other AIB - you just won.

I know you are very happy with your Aorus, but judging from user reviews on the internet and reddit, Aorus is probably the worst AIB, since it feels like a third of them do not work at all for people, crashing all the time.

So his FTW3 vs. Aorus comes down to just this: which chip is better. And that's all.


----------



## Slackaveli

Quote:


> Originally Posted by *s1rrah*
> 
> Oh man ... I so relate. LOL!!
> 
> That's precisely the state I was in when I decided to roll my 4K JS9000 over in front of me ... so you have that sucker 2ft away? LOL ... mine is on a really sweet castor based stand that I can roll all over my room ... I can get it within 3ft but the other day, I was playing the TV back from about six feet ...
> 
> Going to do some more testing this evening ....


i kind of had an epiphany one day. i was thinking, if only we could sit closer our immersion would be so nice, but alas, the screen door effect.... wait.... NOT ON 4k there isn't. park it right in front of you. it was maybe 3 feet when i lean back.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> Silicon Lottery. I have now my second STRIX. Won't do higher than 2000mhz stable in Witcher 3, 2012 stable in ME:A. My previous one could do 2012 stable.
> 
> I know there was someone here who had stable on STRIX for 6 hours 2075 in Witcher 3.
> 
> Silicon Lottery sadly :/


Isn't buying and returning cards a waste of your time?


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> It all depends on the silicon lottery. For the past weeks I have watched probably every OC test, "vs" and 1080 Ti review on YT. The AIB design doesn't really give you much. Only if you won the silicon lottery and got a golden chip will the extra stuff like power limit, cooling etc. let you squeeze 100% out of it. If you didn't win the lottery, all those things won't help you anyway.
> 
> I have watched several Aorus reviews and some could only do a max of 2012 or 2025 MHz stable in games. The Aorus Xtreme Edition has the lowest user scores/worst reviews of all the AIBs on Newegg and Amazon, and the highest RMA ratio. I personally experienced that when I got my Aorus 1080 Ti Xtreme: it was artifacting and couldn't start any game on stock settings (I didn't install the Aorus software at all).
> 
> It is all about the silicon lottery. There is one guy here with an Asus STRIX that he tested for 6 hours stable at 2075 at 1.031V in Witcher 3. That doesn't mean the STRIX are the best cards. It only means he won the lottery.
> 
> Your Aorus is the same stuff - it is not better in any way than any other AIB - you just won.
> 
> I know you are very happy with your Aorus, but judging from user reviews on the internet and reddit, Aorus is probably the worst AIB, since it feels like a third of them do not work at all for people, crashing all the time.
> 
> So his FTW3 vs. Aorus comes down to just this: which chip is better. And that's all.


all true. i made the comment because i already know his Aorus' results, which are on par with mine. One thing you didn't mention is that the regular Aorus Ti, which we own, has a 5-star newegg rating. Almost all the returns are people with Xtremes, and half of those people or more just didn't try Afterburner. The Aorus software largely screwed up their launch. Your experience notwithstanding, almost everybody in the Aorus owners' thread who bought a regular Aorus is 100% happy. the ones who weren't almost universally became happy once they uninstalled the Aorus software. Not to say they are all hitting 2088, but they ARE all hitting ~2000MHz, and are right there with any of the AIBs. As you said, it's all a lottery. However, if you do get lucky, it sure is great to have that 375w power limit.


----------



## s1rrah

Quote:


> Originally Posted by *Slackaveli*
> 
> i kind of had an epiphany one day. i was thinking, if only we could sit closer our immersion would be so nice, but alas , the screen door effect.... wait.... NOT ON 4k there isnt. park it right in front of you. it was maybe 3 feet when i lean back.


What was mind boggling is that my dual 980's in SLI were horrible at 4K ... stuttery, jumpy, laggy as all get out and maybe 30fps ... totally unplayable; after that, I wrote off playing 4K at all.

But the new 1080 ti? Lordy! 100% playable and enjoyable at 4K on that Samsung JS9000 and I wasn't even in "gaming mode" ... I'm super sensitive to lag and input response and the 1080 ti / JS9000 combo was pretty dreamy at an average of 70FPS ...

Just not as dreamy as my 1440p screen at 120FPS ..

Going to get back to it this evening with some 4K testing ...


----------



## dunbheagan

Hey guys, did a few Supo 4k runs today, this has been the best for today at 2113/[email protected],093V and a [email protected],7GHz:

10556 Points


----------



## Slackaveli

Quote:


> Originally Posted by *s1rrah*
> 
> What was mind boggling is that my dual 980's in SLI were horrible at 4K ... stuttery, jumpy, laggy as all get out and maybe 30fps ... totally unplayable; after that, I wrote off playing 4K at all.
> 
> But the new 1080 ti? Lordy! 100% playable and enjoyable at 4K on that Samsung JS9000 and I wasn't even in "gaming mode" ... I'm super sensitive to lag and input response and the 1080 ti / JS9000 combo was pretty dreamy at an average of 70FPS ...
> 
> Just not as dreamy as my 1440p screen at 120FPS ..
> 
> Going to get back to it this evening with some 4K testing ...


make sure to have vsync off and FAST sync on, then use rivatuner to lock frames at 59. you won't feel input lag anymore. i do get a little bit of "overshoot" which can be annoying, but it's totally playable. OH! Also, hit "source" on the samsung remote and rename the input to "PC". then you get the same as game mode without the degradation. confirmed, this works. So, try those settings in your testing.

i agree my 980s were awful on it. but my single 980Ti was good in some games, like Fifa 16, xcom, and such. just had to lower settings a bit. On the 1080Ti, full-on Ultra 4k over 60fps is the norm, for sure.


----------



## dunbheagan

Also did some work on the waterloop and changed the tubes to black. What do you think?



Still need to tidy up the cables...


----------



## Slackaveli

that is hot, heagan


----------



## dunbheagan

Quote:


> Originally Posted by *Slackaveli*
> 
> that is hot, heagan


Thanks a lot!


----------



## Qba73

initial results

same as the aorus in terms of clock (it comes down to chip luck). haven't been able to hit 2075 yet, but 2050 is solid on the ftw3 at +55 on the gpu clock, no extra volts. haven't played much though. (no downclocking because of temps)

like benny and slack said, nothing to do with the AIB but rather the luck of the chip.

one thing I'll tell you: even at 70% fans it's quiet and it runs cooool. 58-61 c is the max I hit under load

on the mem, +500 (gonna see what the ceiling is, but +500 is ~12GHz effective and I'm fine with that)

tried +75 on the clock but may need to add some juice because it locked up.

I'm just surprised at how a 2 slot cooler can run 10c cooler than the 2.5 slot monster sink on the aorus. I can only imagine what temps would be with kryonaut.

I'm using Precision and not AB to make sure I have full support of the card (ever tell you how much I hate EXOC lol)

gonna have some screenies soon, the wife is coming home soon and she will kill me if I'm playing with another card.

on a side note, I decided I am giving the loser to my lil bro who is rocking a 780ti still.
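For anyone wondering about the "+500 = 12K" arithmetic: in Afterburner-style tools the memory offset is added to the clock GPU-Z reports (about 5505 MHz on a stock 1080 Ti, since the GDDR5X runs 11 Gbps effective), and the effective data rate is double that figure. A quick sketch under those assumptions:

```python
# Sketch of the memory-offset arithmetic on a 1080 Ti. Assumes the
# Afterburner-style offset adds to the ~5505 MHz clock GPU-Z reports
# (stock 11 Gbps effective = 11010 MHz double data rate).

STOCK_MEM_MHZ = 5505  # stock reported clock (assumption, FE figure)

def effective_mem_mhz(offset_mhz):
    """Effective (doubled) memory data rate for a given offset."""
    return 2 * (STOCK_MEM_MHZ + offset_mhz)

print(effective_mem_mhz(0))    # 11010 MHz stock
print(effective_mem_mhz(500))  # 12010 MHz -- the "12K" figure
```

So a +500 offset lands almost exactly on the 12,000 MHz effective mark people quote.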


----------



## Slackaveli

Quote:


> Originally Posted by *Qba73*
> 
> initial results
> 
> same as the aorus in terms of clock (it comes down to chip luck). haven't been able to hit 2075 yet, but 2050 is solid on the ftw3 at +55 on the gpu clock, no extra volts. haven't played much though. (no downclocking because of temps)
> 
> like benny and slack said, nothing to do with the AIB but rather the luck of the chip.
> 
> one thing I'll tell you: even at 70% fans it's quiet and it runs cooool. 58-61 c is the max I hit under load
> 
> on the mem, +500 (gonna see what the ceiling is, but +500 is ~12GHz effective and I'm fine with that)
> 
> tried +75 on the clock but may need to add some juice because it locked up.
> 
> I'm just surprised at how a 2 slot cooler can run 10c cooler than the 2.5 slot monster sink on the aorus. I can only imagine what temps would be with kryonaut.
> 
> I'm using Precision and not AB to make sure I have full support of the card (ever tell you how much I hate EXOC lol)
> 
> gonna have some screenies soon, the wife is coming home soon and she will kill me if I'm playing with another card.
> 
> on a side note, I decided I am giving the loser to my lil bro who is rocking a 780ti still.


GOOD BROTHER!!! wow, he will be stoked.

Hey guys, I cracked the stupid No Start error on OCCT. You need to go into OCCT's settings and turn off the hardware monitoring and it will run. Somebody actually mentioned something like this 100 pages ago, so shout out to that guy. I just thought you meant in AB's settings and that didn't work. Tried again with OCCT's settings, and, bingo. Will report back testing results.

lol, figures i would have already nailed my own max vram overclock, exactly +435. +442 errors. SuPo 4k runs had me squared away. Only took me 6 hours instead of 10 minutes...


----------



## Coopiklaani

Very strange, my card can run 2100core and 6399memory without a single dip in frequency. but my SP score is still lower than ppl with lower clocks. Anyone has a clue why?

BTW, CPU 6800k 4.5GHz



GPU-ZSensorLog.txt 58k .txt file


----------



## Benny89

Quote:


> Originally Posted by *Qba73*
> 
> initial results
> 
> same as the aorus in terms of clock (it comes down to chip luck). haven't been able to hit 2075 yet, but 2050 is solid on the ftw3 at +55 on the gpu clock, no extra volts. haven't played much though. (no downclocking because of temps)
> 
> like benny and slack said, nothing to do with the AIB but rather the luck of the chip.
> 
> one thing I'll tell you: even at 70% fans it's quiet and it runs cooool. 58-61 c is the max I hit under load
> 
> on the mem, +500 (gonna see what the ceiling is, but +500 is ~12GHz effective and I'm fine with that)
> 
> tried +75 on the clock but may need to add some juice because it locked up.
> 
> I'm just surprised at how a 2 slot cooler can run 10c cooler than the 2.5 slot monster sink on the aorus. I can only imagine what temps would be with kryonaut.
> 
> I'm using Precision and not AB to make sure I have full support of the card (ever tell you how much I hate EXOC lol)
> 
> gonna have some screenies soon, the wife is coming home soon and she will kill me if I'm playing with another card.
> 
> on a side note, I decided I am giving the loser to my lil bro who is rocking a 780ti still.


How high is the TDP/power limit on the FTW3? Is it around 350W?

Thanks for the results. I would appreciate pictures of the FTW3 inside your case - I want to see how the LEDs look on it.

Nice results for the FTW3 cooler.


----------



## Benny89

Quote:


> Originally Posted by *Coopiklaani*
> 
> Very strange, my card can run 2100core and 6399memory without a single dip in frequency. but my SP score is still lower than ppl with lower clocks. Anyone has a clue why?
> 
> BTW, CPU 6800k 4.5GHz
> 
> 
> 
> GPU-ZSensorLog.txt 58k .txt file


Happens. It has been mentioned a few times over the past weeks in this thread - higher clocks on Pascal do not guarantee better results. There are diminishing returns here. There have been examples of people here who had better scores at lower clocks (both core and memory) than at their max.

I can relate, as my first STRIX had a better score at 2038 core than at 2050 core.


----------



## Slackaveli

Quote:


> Originally Posted by *Coopiklaani*
> 
> Very strange, my card can run 2100core and 6399memory without a single dip in frequency. but my SP score is still lower than ppl with lower clocks. Anyone has a clue why?
> 
> BTW, CPU 6800k 4.5GHz
> 
> 
> 
> GPU-ZSensorLog.txt 58k .txt file


nvidia control panel settings can account for ~200 pts. also, the score seems entirely based on "average fps", which is why clockspeeds and high fps marks seem arbitrary. b/c they are. it's all about average fps, however you can achieve it.
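To put a number on the "average fps" point (a toy example, not Superposition's actual scoring code): average fps over a run is total frames divided by total time, so a brief high-fps burst barely moves it compared with raising the whole run's baseline:

```python
# Toy illustration: benchmark scores that track *average* fps.
# Average fps = total frames rendered / total run time.

def average_fps(frame_times_ms):
    """Average fps over a run, given per-frame render times in ms."""
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

steady = [10.0] * 1000               # 100 fps for the whole run
spiky  = [10.0] * 900 + [5.0] * 100  # same run with a short 200 fps burst

print(average_fps(steady))  # 100.0
print(average_fps(spiky))   # ~105.3 -- the burst adds little
```

A spike to 200 fps for 10% of the frames only lifts the average by about 5%, which is why a higher peak clock that can't hold under load doesn't show up in the score.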


----------



## Benny89

Quote:


> Originally Posted by *Qba73*
> 
> initial results
> 
> same as the aorus in terms of clock (it comes down to chip luck). haven't been able to hit 2075 yet, but 2050 is solid on the ftw3 at +55 on the gpu clock, no extra volts. haven't played much though. (no downclocking because of temps)
> 
> like benny and slack said, nothing to do with the AIB but rather the luck of the chip.
> 
> one thing I'll tell you: even at 70% fans it's quiet and it runs cooool. 58-61 c is the max I hit under load
> 
> on the mem, +500 (gonna see what the ceiling is, but +500 is ~12GHz effective and I'm fine with that)
> 
> tried +75 on the clock but may need to add some juice because it locked up.
> 
> I'm just surprised at how a 2 slot cooler can run 10c cooler than the 2.5 slot monster sink on the aorus. I can only imagine what temps would be with kryonaut.
> 
> I'm using Precision and not AB to make sure I have full support of the card (ever tell you how much I hate EXOC lol)
> 
> gonna have some screenies soon, the wife is coming home soon and she will kill me if I'm playing with another card.
> 
> on a side note, I decided I am giving the loser to my lil bro who is rocking a 780ti still.


One more thing! Did you OC your FTW3 on the 1st or 2nd BIOS? It has dual BIOS - one has only a 117% power limit I think, and the other should have 127% from what I have heard.

Can you please confirm that?


----------



## Qba73

Quote:


> Originally Posted by *Benny89*
> 
> One more thing! Did you OC your FTW3 on 1st or 2nd BIOS? It has dual BIOS- one has only 117% Power Limit I think and the other one should have 127% from what I have heard.
> 
> Can you please confirm that?


I switched it to the slave (2nd) BIOS off the bat, which gives you 127.

The master (1st) BIOS gives you 117.

The limit I'm seeing in GPU-Z is PWR on the FTW3 (and I'm maxed at 127), so I'm hitting a wall. The Aorus never showed PWR on the sensors.


----------



## Coopiklaani

Quote:


> Originally Posted by *Benny89*
> 
> Happens. It was mentioned few times over past weeks in this thread- higher clocks on Pascal does not guarantee better results. There is a dimishion return here. There were examples of people here who had better scores at lower clocks (both core and memort) than their max.
> 
> I can relate also as my first STRIX had better score at 2038 core than at 2050 core.


Also, GPU Boost 3.0 seems to be broken on my card - well, in a good way. My card doesn't care about temperature at all. It maintains the same max clock from 30C to 55C. It simply crashes when the temperature exceeds 60C.
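Normally GPU Boost sheds the core clock in ~13 MHz bins as the temperature crosses successive thresholds, which is what makes a card holding one flat clock from 30C to 55C unusual. A toy model of the expected behaviour (the threshold values here are illustrative guesses, not NVIDIA's actual table):

```python
# Toy model of GPU Boost-style temperature binning: the core sheds
# one ~13 MHz bin at each temperature threshold it crosses.
# THRESHOLDS_C are illustrative assumptions, not official values.

BIN_MHZ = 13
THRESHOLDS_C = [37, 46, 54, 60]

def boosted_clock(max_clock_mhz, temp_c):
    """Expected core clock after temperature-driven bin drops."""
    bins_shed = sum(1 for t in THRESHOLDS_C if temp_c >= t)
    return max_clock_mhz - bins_shed * BIN_MHZ

print(boosted_clock(2100, 35))  # 2100 -- no bins shed yet
print(boosted_clock(2100, 55))  # 2061 -- three bins shed
```

A card that behaves like this model would already be a couple of bins below its peak by 55C, so holding 2100 flat across that range (and then hard-crashing past 60C) really is the boost logic not stepping in.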


----------



## Lefty23

Quote:


> Originally Posted by *Coopiklaani*
> 
> Very strange, my card can run 2100core and 6399memory without a single dip in frequency. but my SP score is still lower than ppl with lower clocks. Anyone has a clue why?


As Slackaveli said, if you are really interested in a few more points, you could change these in NVCP:
Power Management Mode --> Prefer Maximum Performance
Texture Filtering Quality --> High Performance
and set "Single Display Performance Mode"

These are actually recommended in all the "benchmarking" threads here on OCN.

From what I have seen, Superposition really likes memory OC. Try a few runs with your memory OC as high as possible, even if you sacrifice a few MHz on the core.


----------



## lanofsong

Hey there 1080 Ti owners,

Would you consider signing up with Team OCN for the 2017 Pentathlon (*May 5th through May 19th*)? There is still plenty of time left, and we really could use your help.

This event is truly a GLOBAL battle, with Team OCN going up against many teams from across the world, and while we put in a good showing at last year's event by finishing 6th, we could do with a lot more CPU/GPU compute power. All you need to do is sign up and crunch on any available hardware that you can spare.

The cool thing about this event is that it is spread over 5 disciplines of *varying lengths of time* (different projects), so there is a lot of *strategy/tactics* involved.

We look forward to having you and your hardware on our team. Again, this event lasts for two weeks and takes place May 5th through the 19th.


Download the software here.

https://boinc.berkeley.edu/download.php

Presently we really would like some help with the following project:

Add the following *GPU* project - *Einsteinathome.org*



Note: For every project you fold on, you will be asked if you want to join a team - type in overclock.net, press Enter, then JOIN team.


Remember to sign up for the BOINC team by going here. You can also post any questions that you may have - this group is very helpful.









8th BOINC Pentathlon thread

To find your Cross Project ID# - sign into your account and it will be located under Computing and Credit


Please check out the GUIDE - How to add BOINC Projects page for more information about running different projects:

This really is an exciting and fun event. I look forward to it every year, and I am hoping that you will join us and participate.









BTW - There is an awesome BOINC Pentathlon badge for those who participate









lanofsong

OCN - FTW


----------



## Benny89

Quote:


> Originally Posted by *Qba73*
> 
> I switched it to slave (2nd Bios) off the bat, gives you 127
> 
> Master (1st Bios) gives you 117
> 
> limits I'm seeing in gpuz is pwr on ftw3 (and I'm maxed at 127) so I'm hitting a wall, that's where the aorus never had a pwr on the sensors.


So how many watts does 127% give you? 125% on the Aorus gives 375W. Could you let us know?







Thanks.


----------



## Norlig

My fan was hitting the copper heatsink at higher RPM, so I had to move it up a little.

I always thought there was a slight opening for air to come out from the fan and hit the back VRM heatsink fins, but that one doesn't get any active cooling, so I left the black shroud part off; that should keep the VRMs slightly cooler, I hope.


----------



## CoreyL4

I'm testing the compromise of a lower core clock to see how high I can get my mem clock stable.

So far I am at 2025/6123 with my Gaming X.

Still going to push the memory further.

Can someone help me with that OCCT mem tester? I follow the directions, but my mem goes down to 5508 when I re-click on OCCT to make it the focus.


----------



## Qba73

Quote:


> Originally Posted by *Benny89*
> 
> So 127% gives you how much Watts? 125% on Aorus gives 375W. Could you let us know?
> 
> 
> 
> 
> 
> 
> 
> Thanks.


See below; the FTW3 looks like 350W at 127%.
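Sanity-checking these numbers: the slider percentage is relative to the board's 100% power target, so the base TDPs can be back-calculated. A quick sketch in Python (the base figures below are implied by the wattages quoted in this thread, not official specs):

```python
# Back-calculate the 100% power target from a reading at the max slider value:
# FTW3: ~350 W at 127%, Aorus: 375 W at 125%.
def base_tdp(watts_at_max: float, max_percent: float) -> float:
    """Base (100%) power target implied by a wattage at a given slider %."""
    return watts_at_max / (max_percent / 100.0)

def watts_at(base: float, percent: float) -> float:
    """Board power at a given power-limit slider percentage."""
    return base * percent / 100.0

print(round(base_tdp(350, 127)))  # 276 -> implied FTW3 base, in watts
print(round(base_tdp(375, 125)))  # 300 -> implied Aorus base, in watts
```

So the two cards end up in the same ballpark at their max sliders even though the percentages differ, which is why comparing raw percentages between boards is misleading.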


----------



## KedarWolf

Quote:


> Originally Posted by *Qba73*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Benny89*
> 
> So 127% gives you how much Watts? 125% on Aorus gives 375W. Could you let us know?
> 
> 
> 
> 
> 
> 
> 
> Thanks.
> 
> 
> 
> see below FTW3 looks like 350w at 127

 nvflash_5.370.0.zip 3078k .zip file


Can you run nvflash and save your BIOS, zip it with WinRAR, and attach it here?

http://www.rarlab.com/download.htm

I'd like to test it on my FE.

*Unzip NVFlash to a folder.

Run an admin command prompt and cd to the folder you made.

nvflash64 --save filename.rom*


----------



## Benny89

Quote:


> Originally Posted by *Qba73*
> 
> see below FTW3 looks like 350w at 127


Nice, thanks. 350W is ok.

I have to say that, from what I've seen on the internet so far, all FTW3s hit about the same clocks. Of course the sample is small, as not many FTW3s were shipped from pre-orders, but someone on the internet said that EVGA saved the better chips for the FTW3 and the rest went to the SC, SC2 and SC Hybrid.

Well, it's only a rumor, so don't take it too seriously. But really nice OCs so far for the FTW3 cards out there (2075 with a manual OC).


----------



## RavageTheEarth

Quote:


> Originally Posted by *Benny89*
> 
> Yeah, I pre-ordered EVGA FTW3 to check it. if it is marginally better than my STRIX I will keep it instead. I like to have options, for example if I will want to water cool card, or STICK Aio to it etc etc. With Asus I feel limited with what I can do. Still best looking Ti for me on market, but EVGA has always best warranty on market.


Can't argue that. EVGA's customer service is top notch.
Quote:


> Originally Posted by *s1rrah*
> 
> I play 1440p with my 1080 ti ... it's the sweet spot for me as I require 100+ FPS for Full Personal Satisfaction(tm) ... LOL ...
> 
> But yesterday, I had an epiphany and realized that I have a 48" Samsung JS9000 4K TV sitting in front of me and thought, "Hey ... why not test this thing on my 4K TV!?" ... really, I had forgotten all about my TV being 4K (and JS9000 is a pretty decent gaming TV at that).
> 
> So I hook everything up through my receiver and start playing Witcher 3 (Ultra settings, no hair works) ... 3840x2160 ... and dude ... compared to my dual 980's on the same TV? Absolute night and day. I cruised around and was amazed at the snappiness and the fact that I was always between 65FPS and 80FPS ... generally, towards the higher end of those numbers ... and with the "ultra" settings, it literally looked like an animated film ...
> 
> I won't be playing on the TV much but man was it totally playable ... no bothersome lag (though the lag was there, it's a TV after all) snappy mouse response time and most importantly ... frame rates generally averaging around 70FPS ...
> 
> In the end, I love my 27" 1440p G Sync IPS screen more ... but sure was nice to see such smooth play on my 4K TV; I will say this though ... once a single card can push 4K to frame rates between 100FPS and 144FPS, I'll totally be upgrading to 4K ... but for now? 1440p is it ...
> 
> What a card man ...


I have yet to play on 4K, but I can't give up my sweet XB270HU's 144Hz. The people who argue that FPS above 60 is useless are the people who haven't tried it. I remember it took me a day to get used to it. At first it was almost TOO smooth.
Quote:


> Originally Posted by *Benny89*
> 
> Silicon Lottery. I have now my second STRIX. Won't do higher than 2000mhz stable in Witcher 3, 2012 stable in ME:A. My previous one could do 2012 stable.
> 
> I know there was someone here who had stable on STRIX for 6 hours 2075 in Witcher 3.
> 
> Silicon Lottery sadly :/


That is crazy. Do you know what his temps were like? The Witcher 3 pulls more power than any benchmark I have run. I'm on an Aorus with a 375W limit and this game constantly pulls SO much power. I measured 510W with my Kill A Watt, compared to the 490ish watts I pull from the wall in Firestrike Ultra and Superposition. This is with a 6700K @ 4.4GHz, 1.3v. The game heats my card up like no other and makes my clocks unstable when I hit 60C. I need the waterblock for this card to be released ASAP. I can run a stable 2076 @ 1.081v in The Witcher 3 until my card hits 60C, about 5 minutes into playing, and then it will crash.


----------



## KedarWolf

I PM'd @Qba73 to get ahold of the FTW3 BIOS to test on my FE. I'm pretty sure it'll work.


----------



## Luckbad

@Qba73 can you upload your FTW3 bios to TechPowerUp?

Grab GPU-Z then...


----------



## feznz

Anyone having driver problems? After rolling back to 381.65, no more BSODs; tested for 24 hours. Just a suggestion to anyone having stability issues: roll back your drivers.
Not sure, because my system specs are a little backward; I can say it is a first that I have more GPU memory than system memory, which







could be the issue. A friend has offered me his 4x4GB 1600MHz kit out of his media PC; I have some other RAM lying around to give him (buy cheap, buy twice).
All happy now: my 5-year-old Windows install lives.









Quote:


> Originally Posted by *Slackaveli*
> 
> i believe DP1.4 has us covered there. and HDMI 2.1 with enough bandwidth for 4k/120Hz+HDR and VRR (variable refresh rate) tech. So, in future we're good.
> 
> HDMI 2.0 on my 4k tv is what made me "upgrade" from 290x to gtx 980s (1st hdmi 2.0 gpu) , but has had me complaining about the cable tech ever since. DP1.2 is pathetic. and monitor manufacturers are the ones holding us ALL back. the DP 1.3 standard has been out for years and no monitors had an input for it. DP 1.4 was out before even the first gaming monitor with dp1.3 tech came out. It's pathetic, tbh. We could have had high refresh rate 4k 2 years ago. We had the gpu for it with Titan/980ti , just in sli or on weaker games, of course. But that is often overlooked. People act like if you cant play AAA games at 4k 120 than it's all pointless. But that's crap because i also play a lot of not AAA games and they run high refresh rates in 4k... But , we can't see them. dumb as hell....


I was just looking at the 1080 Strix DP outputs; they are specified as just "regular DP", which could mean regular-size DP, or version 1.0, or it's a bit of a fail if the FE is listed as DP 1.4.

There was a bit of confusion last night, probably the medicinal meds; I love playing ROTTR in that state.

But here is an actual scenario where two DP 1.4 cables are needed to drive a monitor, so where is DP 1.5, or another cable type altogether?

http://www.anandtech.com/show/11220/dells-32-inch-8k-up3218k-display-now-for-sale
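The bandwidth math behind that dual-cable requirement is easy to sketch. A rough Python check (payload rates are the published DisplayPort link speeds; blanking overhead is ignored, so real requirements are somewhat higher):

```python
# Raw pixel-data rate vs. DisplayPort payload capacity (blanking ignored).
# Effective payloads: DP 1.2 (HBR2) ~17.28 Gbit/s, DP 1.3/1.4 (HBR3) ~25.92 Gbit/s.
HBR3 = 25.92

def stream_gbps(w, h, hz, bits_per_pixel):
    """Uncompressed video data rate in Gbit/s."""
    return w * h * hz * bits_per_pixel / 1e9

print(stream_gbps(3840, 2160, 120, 24))  # ~23.9 -> 4K/120 8-bit just fits one HBR3 link
print(stream_gbps(3840, 2160, 120, 30))  # ~29.9 -> 4K/120 10-bit HDR does not
print(stream_gbps(7680, 4320, 60, 24))   # ~47.8 -> 8K/60 needs two DP 1.4 links
```

Which is why that Dell 8K panel hangs off two DP 1.4 cables: one link can't carry it uncompressed, two can.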



I love this: can you see the benefits of this new monitor? Just looking at it through your old monitor = fail.

And on a personal note, day 10 cold turkey giving up the cigarettes; my daughter prayed to God that daddy won't be grumpy.







I am feeling a million times better


----------



## Luckbad

Quote:


> Originally Posted by *Coopiklaani*
> 
> Very strange, my card can run 2100core and 6399memory without a single dip in frequency. but my SP score is still lower than ppl with lower clocks. Anyone has a clue why?
> 
> BTW, CPU 6800k 4.5GHz
> 
> 
> 
> GPU-ZSensorLog.txt 58k .txt file


Shrug, variables. Sometimes my memory is best at 6210 MHz, sometimes it's slightly better a little higher or a little lower.

My best score using the Zotac FireStorm utility is a little lower than Afterburner, but it's still a slightly lower score than others with slightly worse cards.

This is with 6177 memory clock and 2075 GPU clock (I can get slightly higher GPU clock in Afterburner because Zotac FireStorm never lets it hit 1.093 V). i7 6700k running at 4.6 GHz.


----------



## Qba73

Quote:


> Originally Posted by *KedarWolf*
> 
> I PM'd @Qba73 to get ahold of the FTW3 BIOS to test on my FE. I'm pretty sure it'll work.


Yes, will do. Should be home by 8 EST; will post here.


----------



## KraxKill

Quote:


> Originally Posted by *Qba73*
> 
> Yes will do, should be home by 8est, will post here


+1 +rep


----------



## Qba73

Here it is: the FTW3 BIOS.

As a zip; also uploaded to TechPowerUp.

GP102.zip 153k .zip file


Note: this is the slave BIOS with the 127% PL, not the conservative master.


----------



## Slackaveli

@RavageTheEarth "I have yet to play on 4k, but i can't give up my sweet XB270HU's 144hz. The people who argue over FPS above 60 being useless are the people who haven't tried it. I remember it took me a day to get used to it. At first it was almost TOO smooth"

Who said give it up? I'm not a one-or-the-other kind of guy, so I have both.


----------



## Qba73

Quote:


> Originally Posted by *Luckbad*
> 
> @Qba73 can you upload your FTW3 bios to TechPowerUp?
> 
> Grab GPU-Z then...


Done; also posted it on here, see above.


----------



## Benny89

Did anyone try to FLASH one AIB's BIOS onto another AIB's card?

I only see people flashing BIOSes to FE cards. I wonder if anyone has tried flashing one onto another AIB card instead.

It would be interesting to see if, for example, we can have a STRIX/Gaming X running with the Aorus BIOS.


----------



## DerComissar

Quote:


> Originally Posted by *Qba73*
> 
> here it is FTW3 Bios
> 
> as zip also uploaded to Techpowerup
> 
> GP102.zip 153k .zip file
> 
> 
> note this is the Slave Bios with 127PL not the conservative master.


Reperoonie+


----------



## Qba73

Quote:


> Originally Posted by *Benny89*
> 
> Did anyone tried to FLASH one AIB BIOS to another AIB?
> 
> I see only people flashing BIOS to FE card. Wonder if anyone tried to flash one for AIB instead.
> 
> Would interesting to see if for example we can have STRIX/GAMING X running with Aorus BIOS.


I was curious about this myself.


----------



## dansi

Quote:


> Originally Posted by *Norlig*
> 
> My fan was hitting the copper heatsink at higher RPM, so I had to move it up a little.
> 
> I always thought there was a slight opening for air to come out from the fan to hit the back VRM Heatsink fins, but that one doesnt get any active cooling, so now I left the black shroud part off, should keep the VRM's slightly cooler I hope


So the Arctic Cooling CLC also worked? Did you need to install its ugly, cumbersome backplate, or can you just use the original 4 spring screws like with the EVGA CLC?

Since the EVGA 1080 CLC is also out of stock, I figured one can use the Arctic version in the same manner?


----------



## Benny89

Quote:


> Originally Posted by *Qba73*
> 
> I was curious of this myself..


Could you possibly test it? You have dual BIOS on the FTW3, so if you flashed one (with, let's say, the AORUS Xtreme BIOS) and it wouldn't boot or anything, you could just switch to the other BIOS to boot and then flash your correct BIOS back with no problem.

That is also why I want to get the FTW3, btw. Having dual BIOS is always great for testing stuff









I will be honest here, I am a little afraid of flashing the BIOS on my STRIX as I have only one BIOS


----------



## KedarWolf

Quote:


> Originally Posted by *Qba73*
> 
> here it is FTW3 Bios
> 
> as zip also uploaded to Techpowerup
> 
> GP102.zip 153k .zip file
> 
> 
> note this is the Slave Bios with 127PL not the conservative master.


Testing the FTW3 BIOS on my FE.

If you unzip this and then run the .bat file after every boot (or add it to your task scheduler to run 30 seconds after boot), you'll get a 358W power limit.

powerlimit.zip 0k .zip file


Good results so far.

In Superposition getting 2050 at .993v.

It'll power limit to 2037 most of the run but I get 120 more points in Superposition compared to the next best BIOS I tried, the Asus Strix OC.

It seems to power limit even less than the Strix or Palit BIOS.


----------



## done12many2

Quote:


> Originally Posted by *KedarWolf*
> 
> Testing the FTW3 BIOS on my FE.
> 
> If you unzip this then run this .bat file after every boot (or add it to your task scheduler to run 30 seconds after boot) you'll get a 358 power limit.
> 
> powerlimit.zip 0k .zip file
> 
> 
> Good results so far.
> 
> In Superposition getting 2050 at .993v.
> 
> It'll power limit to 2037 most of the run but I get 120 more points in Superposition compared to the next best BIOS I tried, the Asus Strix OC.
> 
> It seems to power limit even less than the Strix or Palit BIOS.


Thanks for the update. Looking forward to your final overall opinion.


----------



## Coopiklaani

Quote:


> Originally Posted by *KedarWolf*
> 
> Testing the FTW3 BIOS on my FE.
> 
> If you unzip this then run this .bat file after every boot (or add it to your task scheduler to run 30 seconds after boot) you'll get a 358 power limit.
> 
> powerlimit.zip 0k .zip file
> 
> 
> Good results so far.
> 
> In Superposition getting 2050 at .993v.
> 
> It'll power limit to 2037 most of the run but I get 120 more points in Superposition compared to the next best BIOS I tried, the Asus Strix OC.
> 
> It seems to power limit even less than the Strix or Palit BIOS.


Nice one. I'm going to test it on my shunt modded FE shortly. So far I'm getting good results with stock BIOS in most of the benchmarks/games. The only places it pwr limits are SP 8k, TS scene 2 and FSU scene 1.


----------



## Qba73

Quote:


> Originally Posted by *Benny89*
> 
> Could you possible test it? You have dual BIOS on FTW3 so if you flashed one (with lets say AORUS Xtreme BIOS) and then it wouldn't boot or anything- you can just switch to other BIOS for boot and then flash back your correct BIOS again with no problem.
> 
> That is also why I want to get FTW3 btw. Having dual BIOS is always great for testing stuff
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will be honest here, I am little afrain of flashing BIOS on my STRIX as I have only one BIOS


Yeah I want to try the aorus bios and the strix one.


----------



## Coopiklaani

FTW3 bios works like a charm on my FE card with shunt mod. Higher TDP than the Palit BIOS, and the same efficiency as the stock bios.


----------



## KedarWolf

Quote:


> Originally Posted by *Coopiklaani*
> 
> FTW3 bios works like a charm on my FE card with shunt mod. Higher TDP than the Palit BIOS, and the same efficiency as the stock bios.


Be sure to +1 @Qba73, who shared it with us. I'm getting good results even with no shunt mod.


----------



## Benny89

Quote:


> Originally Posted by *Qba73*
> 
> Yeah I want to try the aorus bios and the strix one.


Glad to hear it! Can't wait for your results. I don't think you will get anything from the STRIX one, as it is only 330W compared to the FTW3's BIOS 2 with its 350W+ TDP.

But I can't wait for AORUS or ZOTAC XTREME test on your FTW3









Thanks for contributing!


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> Be sure to +1 @Qba73 who shared it with us. I'm getting good results even no shunt mod.


I knew one of them would eventually work, and I had actually been thinking the best candidate was the FTW3: higher TDP, similar PCB. Winner winner, chicken dinner. Congrats, FE OGs. Good share, Qba73.


----------



## Coopiklaani

Quote:


> Originally Posted by *KedarWolf*
> 
> Be sure to +1 @Qba73 who shared it with us. I'm getting good results even no shunt mod.


Sure thing!
It may be the best BIOS for FE cards before the elusive strix XOC bios.


----------



## done12many2

Quote:


> Originally Posted by *Qba73*
> 
> here it is FTW3 Bios
> 
> as zip also uploaded to Techpowerup
> 
> GP102.zip 153k .zip file
> 
> 
> note this is the Slave Bios with 127PL not the conservative master.


Works beautifully on my EVGA FE cards!

Thanks again.


----------



## Nico67

Quote:


> Originally Posted by *Benny89*
> 
> Did anyone tried to FLASH one AIB BIOS to another AIB?
> 
> I see only people flashing BIOS to FE card. Wonder if anyone tried to flash one for AIB instead.
> 
> Would interesting to see if for example we can have STRIX/GAMING X running with Aorus BIOS.


Forget the Aorus BIOS, as that card uses R002 shunt resistors. If you're going to try anything, go for the Zotac AMP Extreme; that works on the FE as it has R005 shunt resistors, the same as the FE, so it should be OK on the Asus etc., and it has an even higher TDP.
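An aside on why the shunt values matter for cross-flashing: the power controller infers current from the voltage across a shunt of known resistance, so a BIOS calibrated for one shunt value misreads on a board with another. A simplified sketch of the arithmetic (real boards have several shunts and calibration constants, so treat this as an illustration only):

```python
# Reported power when a BIOS assumes one shunt value but the board has another.
# The controller reads V = I * R_board, then infers I' = V / R_assumed,
# so reported power scales by R_board / R_assumed.
def reported_ratio(r_board_mohm: float, r_bios_mohm: float) -> float:
    """Factor by which power is over- or under-reported."""
    return r_board_mohm / r_bios_mohm

# An R002-calibrated BIOS (2 mOhm) on an R005 board (5 mOhm):
print(reported_ratio(5, 2))  # 2.5 -> card reads 2.5x actual power, throttles early

# A BIOS expecting R005 on an R005 board reads 1:1:
print(reported_ratio(5, 5))  # 1.0
```

That over-reading is also the shunt mod in reverse: lowering the effective shunt resistance makes the card under-read, which raises the real power ceiling.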


----------



## Qba73

Quote:


> Originally Posted by *Nico67*
> 
> Forget Auros bios, as that card uses R002 shunt resistors. If your going to try anything, go for Zotac AMP extreme, that works on FE as it has R005 shunt resistors same as FE so should be ok on Asus ETC and has even higher TDP.


Thanks for the heads up. I was going to try the Aorus Xtreme BIOS on the FTW3, but I think I will just try the Zotac one.

A lil update on the FTW3 chip I have: she'll do 2050 solid without downclocking, because the thermals never go over 62C. But 2075 is another story; no matter what combo, she hates 2075. I'm happy with 2050, the same as the Aorus was getting me for my 24/7 clock.

I am thinking I will repaste for better temps. Now I gotta wrap up the Aorus for my bro; he will pee his pants lol

Glad I could help with the bios for those using it.


----------



## alucardis666

Quote:


> Originally Posted by *alucardis666*
> 
> So I swapped CPU coolers and installed hybrid kits on my 1080tis. I go to turn on my computer and now it's stuck on post code 0d...
> 
> Tried flashing bios, pulling jumper, clearing cmos, 1 card, 1 dimm, different ram. Nothing...
> 
> Should I RMA the board? It's either the board or the CPU that is dead or defective I'd imagine.
> 
> Thoughts?!?


Fixed my issues! Apparently when I reseated the CPU my pins got bent, *3 to be exact!* I bent them back and posted with no issues...









EDIT:

Also... How's about adding the links to the latest drivers and version of msi afterburner in the op?


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Fixed my issues! Apparently when I reseated the CPU my pins got bent, *3 to be exact!* I bent them back and posted with no issues...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT:
> 
> Also... How's about adding the links to the latest drivers and version of msi afterburner in the op?


you n00b!!







Glad you got it fixed!
Quote:


> Originally Posted by *Qba73*
> 
> Thanks for the heads up, was going to try the aorus xtreme bios on ftw3, but I think I will just try the zotac one.
> 
> lil update on the ftw3 chip I have, she'll do 2050 solid without downclocking cause the thermals never go over 62. But 2075 is another story, no matter what combo she hates 2075. I'm happy with 2050 same the aorus was getting me for my 24/7 clock
> 
> I am thinking I will repaste for better temps. Now gotta wrap the aorus for my bro, he will pee his pants lol
> 
> Glad I could help with the bios for those using it.


Oh, wow, you are the man, qba. take a video!


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> you n00b!!
> 
> 
> 
> 
> 
> 
> 
> Glad you got it fixed!
> Oh, wow, you are the man, qba. take a video!


Lol. Thanks.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Lol. Thanks.


It is weird going from Intel to AMD; the things you have to be careful of are backwards.


----------



## Bishop07764

Quote:


> Originally Posted by *RavageTheEarth*
> 
> I have yet to play on 4k, but i can't give up my sweet XB270HU's 144hz. The people who argue over FPS above 60 being useless are the people who haven't tried it. I remember it took me a day to get used to it. At first it was almost TOO smooth.
> That is crazy. Do you know what his temps were like? The Witcher 3 pulls more power than any benchmark I have ran. I'm on an Aorus with a 375w limit and this game constantly pulls SO much power. I measured 510w with my kill-a-watt compared to the 490ish watts I'm pulling from the wall in Firestrike Ultra and Superposition. This is with a 6700k @ 4.4GHZ 1.3v. The game heats my card up like no other and makes my clocks unstable when i hit 60C. I need the waterblock for this card to be released ASAP. I can run a stable 2076 @ 1.081v in The Witcher 3 until my card hits 60C about 5 minutes into playing the game it will crash.


Gsync 144hz monitors are indeed awesome. It's hard to look at anything else now.

Everyone's Witcher 3 results are quite interesting to me. I've played nothing but Witcher 3 the past couple of days, and I haven't seen any power throttling on my end that I have noticed. But I definitely get power throttling in the benchmarks that you listed. I've only seen the Witcher 3 downclocking in the menu and map screens which appears to be normal behavior. I'm constantly at 117% or so power though at 2050 core.


----------



## alucardis666

Quote:


> Originally Posted by *Slackaveli*
> 
> it is weird going from intel to amd. the things you have to be careful of are backwards.


So true! lol


----------



## Nico67

FTW3 doesn't seem any better to me,

2101/6059



It's pulling 314W but again smashing the limiter, and scoring about on par with the FE. It gives me more stability issues in games, where I wasn't power limited anyway.

The Asus XOC will be the only useful one outside of a shunt mod. That's assuming it can get around the normalized power limit?


----------



## Bishop07764

Has anyone flashing the FTW3 BIOS on an FE actually tried any games? Lol. I've finally backed my BIOS up. Maybe it's time to go ahead and flash.

Oh, and you guys jumping on this cross-flashing of BIOSes to test them out: you are all awesome... and crazy, but mostly awesome.


----------



## Bishop07764

Quote:


> Originally Posted by *Nico67*
> 
> FTW3 doesn't seem any better to me,
> 
> 2101/6059
> 
> 
> 
> its pulling 314w but again smashing the limiter and scoring about on par with FE. Gives me more stability issues in game where I wasn't power limited anyway.
> 
> Asus XOC will be the only useful one outside of a shunt mod. That's assuming it can get around the normalized power limit?


More stability issues at your normal gaming clocks?


----------



## KickAssCop

Quote:


> Originally Posted by *Benny89*
> 
> Silicon Lottery. I have now my second STRIX. Won't do higher than 2000mhz stable in Witcher 3, 2012 stable in ME:A. My previous one could do 2012 stable.
> 
> I know there was someone here who had stable on STRIX for 6 hours 2075 in Witcher 3.
> 
> Silicon Lottery sadly :/


I know it is the silicon lottery, and I wanted to play it. Finally, I am at a decent place. Should have bought the STRIX in the first place.


----------



## Luckbad

Quote:


> Originally Posted by *Bishop07764*
> 
> Anyone flashing the FTW3 bios on FE actually tried any games? Lol. I've finally backed my Bios up. Maybe it's time to go ahead and flash. ?
> 
> Oh and you guys jumping on this cross flashing of bios to test them out, you are all awesome... and crazy, but mostly awesome. ?


I've flashed a few on non-FE cards.

They generally work, but some cards have weird things that could screw up.

EVGA cards with ICX might not have proper fan control since their fans are independent, the Zotac AMP Extreme behaves oddly on other BIOSes probably because of its power phase chips, etc.


----------



## KedarWolf

Quote:


> Originally Posted by *Bishop07764*
> 
> Anyone flashing the FTW3 bios on FE actually tried any games? Lol. I've finally backed my Bios up. Maybe it's time to go ahead and flash. ?
> 
> Oh and you guys jumping on this cross flashing of bios to test them out, you are all awesome... and crazy, but mostly awesome. ?


I'm running Diablo 3 at 2062 core 1.050v, G-Sync on at 4K, not power limiting at all.









Just upped it to 1.093v 2088, zero power limiting at all.









Edit: FTW3 BIOS is great, Diablo 3: Reaper Of Souls at 2100 core, +667 memory (6177) , zero power limiting while playing the game, completely stable.


----------



## madmeatballs

Is my SP score too low for my OC? The CPU is an i7 7700K @ 5GHz (4.7GHz AVX). This is as far as I can get without flashing a BIOS and a shunt mod. I already tried the Inno3D, SC2, and Palit BIOSes; no good. The FE stock BIOS seems to give me the best results.



Here is my fan curve


----------



## ViRuS2k

The EVGA FTW3 BIOS works great with my 2x Gigabyte FEs.









It still power limits, but I'd need to shunt mod for that lol

Though I'm interested to try out that power limit/no limit BIOS whenever it surfaces lol


----------



## Nico67

Quote:


> Originally Posted by *Bishop07764*
> 
> More stability issues at your normal gaming clocks?


I can't load or play games at 2101/6003, which I can on the FE BIOS. Each time I flash back to the FE BIOS, games are fine again. Each card is different, so you may have better luck.


----------



## iluvkfc

So I finally put my FE card under water and performed the shunt mod. Good results power-wise: my control test, where I was getting a 77% power-limit reading, is now at 55%... so if everything scales proportionally, my 100% PT corresponds to 350W and 120% is 420W... though I have not seen anything hit 120 yet! Also, the GPU is extremely cool; it runs at 45C max during a run of Fire Strike.
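That proportional-scaling reasoning checks out exactly. A quick check in Python (assuming the FE's stock 100% power target is 250W, which is what these numbers imply; the shunt mod makes the card under-read, so actual draw is the reading times the before/after ratio):

```python
# Shunt mod makes the card under-read power: actual draw = reading * ratio.
before, after = 77, 55      # % power-limit readings for the same control test
scale = before / after      # 1.4 -> card now under-reads power by this factor
stock_pt = 250              # W; assumed FE 100% power target

print(round(stock_pt * scale))        # 350 -> effective 100% PT in watts
print(round(stock_pt * scale * 1.2))  # 420 -> effective 120% PT in watts
```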

Seems my max stable core clock for benching is 2113 (I haven't found a 24/7 clock yet); I have not played with memory much yet, but I will probably settle at +700 or +750, since +800 causes artifacts.



Edit: Also, should I connect 2 different power cables for the 6-pin & 8-pin, or using 1 cable is ok? PSU is an EVGA 850W GQ.


----------



## Luckbad

Quote:


> Originally Posted by *madmeatballs*
> 
> Is my SP score too low for my oc? CPU is a i7 7700k @5GHz (4.7GHz AVX). This is as far as I can get without flashing a bios and shunt mod. I already tried Inno3d, SC2, Palit bioses, no good. FE stock bios seems to give me best results.


Looks fine to me. Only thing that might be able to improve your scores is higher memory clock and fixing that power limit so your results are more consistent.


----------



## Slackaveli

Quote:


> Originally Posted by *iluvkfc*
> 
> So I finally put my FE card under water and performed the shunt mod. Good results power-wise, my control test where I was getting 77% power limit is now at 55%... so if everything scales proportionally my 100% PT corresponds to 350W and 120% is 420W... Though I have not seen anything hit 120 yet! Also GPU is extremely cool, runs at 45C max during a run of Fire Strike.
> 
> Seems my max stable core clock for benching is 2113 (haven't found 24/7 clock yet), have not played with memory much yet but I will probably settle to +700 or +750, +800 causes artifacts.
> 
> 
> 
> Edit: Also, should I connect 2 different power cables for the 6-pin & 8-pin, or using 1 cable is ok? PSU is an EVGA 850W GQ.


Generally 2 cables would be better. And those are some great clocks; you won the lottery on the core and the RAM. You really do have to play the lotto twice per card. I won on core (2101), but my mem goes bonkers at +450. Weird. Lucky and unlucky both.


----------



## KedarWolf

Quote:


> Originally Posted by *iluvkfc*
> 
> So I finally put my FE card under water and performed the shunt mod. Good results power-wise, my control test where I was getting 77% power limit is now at 55%... so if everything scales proportionally my 100% PT corresponds to 350W and 120% is 420W... Though I have not seen anything hit 120 yet! Also GPU is extremely cool, runs at 45C max during a run of Fire Strike.
> 
> Seems my max stable core clock for benching is 2113 (haven't found 24/7 clock yet), have not played with memory much yet but I will probably settle to +700 or +750, +800 causes artifacts.
> 
> 
> 
> Edit: Also, should I connect 2 different power cables for the 6-pin & 8-pin, or using 1 cable is ok? PSU is an EVGA 850W GQ.


I'm waiting on 0805 47 Ohm resistors to do a shunt mod rather than CLU, as my card is vertical. It will give me 1.72x TDP, I think, or something like that.

The resistors, silver conductive paint, and nonconductive thermal glue to do it cost me less than $10 in total on eBay; long delivery times, though.

No rush on it, but will be great!!


----------



## iluvkfc

Quote:


> Originally Posted by *Slackaveli*
> 
> generally 2 cords would be better. And those are some great clocks. You won the lottery on the core and the ram. You really do have to play the lotto twice per card. i won on core (2101) but my memz go bonkers at +450. weird. lucky and unlucky both,.


Noted, will add the second cable tomorrow. Also, I agree it's a great card; I am very happy with the result. It feels good to have a decent clocking card now, after the sad cards I had before... my 1070 couldn't do 2050, my 970 couldn't do 1500. The sad thing is, these cards have so much potential but need a waterblock + shunt mod to unlock it...
Quote:


> Originally Posted by *KedarWolf*
> 
> I'm waiting on 0805 47 Ohm resistors to do a shunt mod rather than CLU as my card is vertical.
> 
> Resistors, silver conductive paint, and nonconductive thermal glue to do it cost me less that $10 in total on Ebay, long delivery times though.
> 
> No rush on it, but will be great!!


Doing the mod with resistors would be much cleaner; I just don't trust my soldering skills enough for this... but definitely go for it and report back here when it's done. Seems you have a good card already; can't wait to see what it can do.


----------



## KedarWolf

Quote:


> Originally Posted by *iluvkfc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Slackaveli*
> 
> generally 2 cords would be better. And those are some great clocks. You won the lottery on the core and the ram. You really do have to play the lotto twice per card. i won on core (2101) but my memz go bonkers at +450. weird. lucky and unlucky both,.
> 
> 
> 
> Noted, will add second cable tomorrow. Also I agree it's a great card, am very happy with the result. Feels good to have a decent clocking card now, after the sad cards I had before... 1070 couldn't do 2050, 970 couldn't do 1500. Sad thing is, these cards have so much potential but need waterblock + shunt mod to unlock it...
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I'm waiting on 0805 47 Ohm resistors to do a shunt mod rather than CLU as my card is vertical.
> 
> Resistors, silver conductive paint, and nonconductive thermal glue to do it cost me less that $10 in total on Ebay, long delivery times though.
> 
> No rush on it, but will be great!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Doing mod with resistors would be much cleaner, I just don't trust my soldering skills enough for this... but definitely go for it and report here when it's done. Seems you have a good card already, can't wait to see what it can do.

No need to solder; the conductive paint and the glue keep them on. They are these, the same size as the shunts.


----------



## nrpeyton

Quote:


> Originally Posted by *Dasboogieman*
> 
> check your consumer laws carefully, those override whatever the f ASUS says.
> 
> e.g. In Australia, the consumer is entitled to 2 years defect free warranty regardless of manufacturer policies, including warranty stickers. As an added bonus, the consumer is also entitled to inspect their product to ensure no fault in manufacturing or function exists and this must be warranted unless the manufacturer can prove beyond reasonable doubt that modifications were done.
> 
> The devil is in the wording. You may find those stickers mean crap when contested against your local laws.


Also, you could just say there was no sticker on the card.

Has anyone ever heard of a GPU company refusing an RMA due to a missing sticker? (Everyone's noticed the stickers, but I've never heard of an incident where someone was actually refused an RMA because of one.)

It's there as an "idiot proof" line of defence, or a "deterrent" to 'encourage' a 14yo NOT to take their new Xmas present GPU apart.

In supermarkets here, they put yellow "security protected" stickers on some expensive items that are difficult to tag, maybe due to size/shape. In reality there's no tag or security device on the product, but usually the sight of these official stickers is enough of a deterrent.

And do we think anyone who's ever bought and installed an official EKWB waterblock for an ASUS card has ever been refused an RMA because a sticker was removed? I highly doubt it.

On some cards the stickers aren't even put on top of the screws.

On my last GPU the sticker wasn't even on the GPU at all, but on a bit of outer packaging that you needed to open to get the card out of its box lol (the sticker said the warranty was void if I removed it) haha

look I took a photo:




Quote:


> Originally Posted by *Slackaveli*
> 
> LOL, bro, you have been so gimped you just don't even know. Do you play Arma 3? do a last test before the switch. You'll see an enormous gain in that game. ENORMOUS>
> OMG, dude


lol, I wonder if my card will still be as stable in the Witcher 3 (and other benches) now that the bottleneck will be back on the GPU (and not the CPU).

Will be very interesting if this does affect how stable it is. Because if it does affect my Witcher 3 stability; it basically means you can't really compare lottery from one person to another. And that stability and highest overclock is completely subjective depending on other hardware & factors.


----------



## KedarWolf

Quote:


> Originally Posted by *iluvkfc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Slackaveli*
> 
> generally 2 cords would be better. And those are some great clocks. You won the lottery on the core and the ram. You really do have to play the lotto twice per card. i won on core (2101) but my memz go bonkers at +450. weird. lucky and unlucky both,.
> 
> 
> 
> Noted, will add second cable tomorrow. Also I agree it's a great card, am very happy with the result. Feels good to have a decent clocking card now, after the sad cards I had before... 1070 couldn't do 2050, 970 couldn't do 1500. Sad thing is, these cards have so much potential but need waterblock + shunt mod to unlock it...
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I'm waiting on 0805 47 Ohm resistors to do a shunt mod rather than CLU as my card is vertical.
> 
> Resistors, silver conductive paint, and nonconductive thermal glue to do it cost me less that $10 in total on Ebay, long delivery times though.
> 
> No rush on it, but will be great!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Doing mod with resistors would be much cleaner, I just don't trust my soldering skills enough for this... but definitely go for it and report here when it's done. Seems you have a good card already, can't wait to see what it can do.

Here it is done by someone else. Different card I think, but the same kind of thing.


----------



## KedarWolf

Quote:


> Originally Posted by *Coopiklaani*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lilchronic*
> 
> Dam i just ordered some 10ohm 0805 smd's to do this mod. will it still work with the 10 ohm? I bought 100 of them but should of got the roll of 10ohm-100ohm
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yep, 10Ohm will also work. It will give you 5x TDP, 250w->1250w, a little bit overkill.

Did you have to do anything to the top of the metal contacts on the shunts? 47 Ohm 0805 resistors on the way from Ebay.


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> Here is with it done by someone else. Different card I think but same kinda thing.


What am I 'missing' in that photo?

It looks "stock" to me?


----------



## iluvkfc

Quote:


> Originally Posted by *KedarWolf*
> 
> Don't need to solder, conductive paint and the glue keeps them on, they are these, same size as the shunts.


Let us know how it goes, might consider this method in the future.


----------



## Nico67

Quote:


> Originally Posted by *nrpeyton*
> 
> What am I 'missing' in that photo?
> 
> It looks "stock" to me?


22R0 resistors stacked on capacitors C250, C252 and C257, one 22 Ohm resistor on each capacitor


----------



## Nico67

Quote:


> Originally Posted by *KedarWolf*
> 
> Don't need to solder, conductive paint and the glue keeps them on, they are these, same size as the shunts.


The capacitors the resistors go on are much smaller than the shunts; it may pay to figure out the correct dimensions, as that will make it much easier.


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Here is with it done by someone else. Different card I think but same kinda thing.
> 
> 
> 
> 
> 
> What am I 'missing' in that photo?
> 
> It looks "stock" to me?

There's the 22R0 resistors glued on top with silver conductive paint and thermal glue.


----------



## KedarWolf

Quote:


> Originally Posted by *Nico67*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Don't need to solder, conductive paint and the glue keeps them on, they are these, same size as the shunts.
> 
> 
> 
> 
> 
> The capacitors the resistors go on are much smaller then the shunts, may pay to figure out the correct dimensions as it will make it much easier.

0805 47 Ohm resistors fit. 0805 is the size you need.

I know the method to do this from someone on this thread, it's pretty easy and removable with 99% alcohol.


----------



## Norlig

Quote:


> Originally Posted by *dansi*
> 
> So artic cooling clc also worked ? Did you need to install its ugly cumbersome backplate, or you can just use the original 4 spring screws like evga clc?
> 
> Since evga 1080 clc is also out of stock, i figured one can also use arctic version in the same manner?


You need a 1.5-2mm copper shim.

I used the Arctic screws, but borrowed the springs from the NVIDIA stock screws. 
No backplate


----------



## feznz

Quote:


> Originally Posted by *KedarWolf*
> 
> Here is with it done by someone else. Different card I think but same kinda thing.


I am wondering what I am looking at. It looks more like a force-3D voltage or LLC mod; the resistor is rather small for a shunt mod. It could be a voltage mod, but normally you would cut the circuit and solder in a potentiometer so you can adjust the voltage, rather than using a fixed resistor.
And that looks more like the 3221 voltage controller.



I always thought the shunt was the large resistors near the main 6 or 8 pin plug, labelled R002 or R005.
The other resistor near the PCIe lane is normally associated with the memory power delivery.


----------



## ViRuS2k

Quote:


> Originally Posted by *KedarWolf*
> 
> Don't need to solder, conductive paint and the glue keeps them on, they are these, same size as the shunts.


http://www.ebay.co.uk/itm/300745229871?_trksid=p2060353.m2749.l2648&var=600050687744&ssPageName=STRK%3AMEBIDX%3AIT

just bought some of those resistors, 47 Ohm, 10 for £1.09

----

Could you tell me the process to use these? Can you lay them on top of the old resistor with silicone thermal glue?
A list of things I need, if you could please; it needs to be as easy as possible to remove and leave no trace in case of RMA etc. lol


----------



## Nico67

Quote:


> Originally Posted by *feznz*
> 
> I am wondering what I am looking at looks like more like a force 3D voltage or LLC mod the resister is rather small for a shunt mod could be a voltage mod but normally you would cut the circuit and solder in a potentiometer to be able to adjust the voltage than a fixed resistor
> And that looks more like the voltage controller 3221
> 
> 
> 
> I always thought the shut shunt was the largest resistor's near the main 6 or 8 pin plug labelled R002 or R005
> The other resistor near the PCIe lane is normally associated with the memory power delivery


The chip you are looking at is for current sensing, so you are manipulating the reading coming from the drop across the shunt. Shunt resistor is left untouched this way, but you have a known change rather than a random change via CLU on the shunt.
Point of interest though: by doing three capacitors you are doing the PCIe bus as well. Not that it should pose a problem, but just so you know.


----------



## Benny89

Quote:


> Originally Posted by *Nico67*
> 
> Forget Auros bios, as that card uses R002 shunt resistors. If your going to try anything, go for Zotac AMP extreme, that works on FE as it has R005 shunt resistors same as FE so should be ok on Asus ETC and has even higher TDP.


Thanks for the input, but I don't understand why we should not try the Aorus one. I am not much of a technician







. Can you elaborate?


----------



## cyenz

XOC Bios with removed powerlimit for Asus 1080 Ti Strix

http://forum.hwbot.org/showthread.php?p=486472

EDIT: Didn't try it myself, don't know if it's safe for use or not!


----------



## Asmola

Quote:


> Originally Posted by *cyenz*
> 
> XOC Bios with removed powerlimit for Asus 1080 Ti Strix
> 
> http://forum.hwbot.org/showthread.php?p=486472


The XOC BIOS and voltage tool are now public!


----------



## nrpeyton

_S1080Ti XOC (Extreme Overclock) BIOS:_

S1080TIXOC.zip 1820k .zip file


-Allows voltage up to 1.2v
-Completely removes power limit. _I was drawing 450 watts in Furmark on my Founders Edition. Stock maximum is 300 watts_

Warning:
-Temps will increase
-You lose 20% of your maximum fan speed on a FE
-Don't use with cheap PSU PCI-E power cables. If the cables aren't of the ATX-recommended gauge, they could overheat and cause a fire due to the increased power draw.
-Using this BIOS invalidates your warranty. Do so at your _own_ risk + expense.
-*You will lose one DisplayPort.* If your monitor is plugged into the one that's disabled, just switch your cable over to another _(don't let this fool you into thinking you've bricked your card)_
-Removes UPPER & LOWER temperature limits. Caution! The card will overheat *without* throttling. These safeguards were disabled to allow the card to also operate normally at extremely LOW temperatures (i.e. -140 degrees C).

1080 voltage tool (v.2) for ASUS STRIX:

s1080_tool_v02.zip 503k .zip file


-Command line interface
-Allows voltage much higher than 1.2v _(possibly 1.5v -- be careful: voltages at those levels without extreme cooling will evaporate transistors on your card's GPU core)._
-Doesn't work with Founders Edition _(different controllers)_



More information r.e. extreme O/C LN2 session and original post at hwbot.org:
http://forum.hwbot.org/showthread.php?t=169488


----------



## Benny89

Quote:


> Originally Posted by *nrpeyton*
> 
> _S1080Ti XOC (Extreme Overclock) BIOS:_ *Warning - Not yet tested on FE*
> 
> S1080TIXOC.zip 1820k .zip file
> 
> 
> So.... who wants to be the guinea pig lol?
> 
> 1080 voltage tool (v.2) for ASUS STRIX:
> 
> s1080_tool_v02.zip 503k .zip file
> 
> 
> More information r.e. extreme O/C LN2 session and original post at hwbot.org:
> http://forum.hwbot.org/showthread.php?t=169488


I have a STRIX so I will test it; it was originally a BIOS for the STRIX, right?







The first one above is BIOS, correct? The second one is some sort of tool like Afterburner?


----------



## cyenz

Quote:


> Originally Posted by *Benny89*
> 
> I have STRIX so I will test it, it was origianlly BIOS for STRIX, right?
> 
> 
> 
> 
> 
> 
> 
> The first one above is BIOS, correct? The second one is some sort of tool like Afterburner?


Yes, first one is a Bios with nvflash included.


----------



## feznz

Dam, I know it works for a Strix, so technically I ain't the guinea pig.
I would nominate nrpeyton


----------



## feznz

first double post bug
dam
Quote:


> Originally Posted by *Nico67*
> 
> The chip you are looking at is for current sensing, so you are manipulating the reading coming from the drop across the shunt. Shunt resistor is left untouched this way, but you have a known change rather than a random change via CLU on the shunt.
> Point of interest though, by doing three capacitors you are doing the PCIE bus as well. Not that it should pose a problem, but just so you know


Ah, now I get it. Always 2 ways to skin a cat


----------



## nrpeyton

Quote:


> Originally Posted by *Benny89*
> 
> I have STRIX so I will test it, it was origianlly BIOS for STRIX, right?
> 
> 
> 
> 
> 
> 
> 
> The first one above is BIOS, correct? The second one is some sort of tool like Afterburner?


Nope, the second one is a command line interface tool that allows changing of V.CORE, V.MEM and PCI-E voltage (in that order).

But I'm not sure which cards it's able to talk to yet.

After you apply the voltage it WON'T show in Afterburner or hwinfo64 (as those apps only display the voltage selected in the BIOS), but the tool is *overriding* the BIOS. What you *WILL* see is your power draw in hwinfo64 rise slightly when you increase the voltage offset.

Be careful; remember you're not under extreme cooling, so don't do anything mental with voltages. They could cause damage at normal temps if you go _too_ high.

Anyone with a STRIX will be feeling extremely lucky just now lol. Let's hope the BIOS is at least compatible with the FE.


----------



## Benny89

Quote:


> Originally Posted by *nrpeyton*
> 
> nope second one is a command line interface tool, that allows changing of V.CORE, V.MEM and PCI.E voltage (in that order).
> 
> But not sure which cards it's able to talk to yet.
> 
> After you apply the voltage it WON'T show in afterburner or hwinfo64 (as those apps only display the voltage selected in BIOS) but the tool is overriding the BIOS. What you *WILL* see is your power draw in hwinfo64 raise slightly when you increase the voltage offset.
> 
> be careful, remember you're not under extreme cooling.. so don't do anything mental with voltages. they could cause damage at normal temps if you go _too_ high.


Wow... ok, I will first try only the BIOS itself. I don't even know what "command line interface tool" means







so I will wait for some smarter people to test it and post a guide for it.

Give me time to get back home and I will flash this BIOS.

The FE BIOS flashing guide is enough, yes? It is the same process, correct?


----------



## mtbiker033

Quote:


> Originally Posted by *alucardis666*
> 
> Fixed my issues! Apparently when I reseated the CPU my pins got bent, *3 to be exact!* I bent them back and posted with no issues...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT:
> 
> Also... How's about adding the links to the latest drivers and version of msi afterburner in the op?


good to hear!! I was checking the thread all day looking for good news from you!

how did it go with the hybrid kits? are they performing well? need pics!!!


----------



## nrpeyton

Quote:


> Originally Posted by *Benny89*
> 
> Wow...ok, I will first try only BIOS itself. Don't even know what "command line interface tool" means
> 
> 
> 
> 
> 
> 
> 
> so I will wait for some smarter people to test it and post guide for it.
> 
> Give time to get back home and I will flash this BIOS.
> 
> FE Bios Flashing guide is enough, yes? It is the same process, correct?


nvflash is included in the file, but it should be pretty straightforward:

-make sure you back up your original BIOS first
-always remember to disable display adapter in device manager *before* you flash
-restart computer, open device manager and 'enable' display driver (if not enabled automatically through reboot)

let us know how you get on


----------



## Nico67

Quote:


> Originally Posted by *Benny89*
> 
> Thanks for input but I don't understand why we should not try Aorus one. I am not that much technician
> 
> 
> 
> 
> 
> 
> 
> . Can you elaborate?


The same reason it didn't work on the FE: the shunt value is lower, so it limits the power because it thinks there's a problem.


----------



## Benny89

Quote:


> Originally Posted by *nrpeyton*
> 
> nvflash is included in the file, but should be pretty straight forward:
> 
> -make sure you back up your original BIOS first
> -always remember to disable display adapter in device manager *before* you flash
> -restart computer, open device manager and 'enable' display driver (if not enabled automatically through reboot)
> 
> let us know how you get on


Hm, I looked here http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti and I don't see anything about disabling the display adapter in device manager.

Won't that give me black screen when I do it? How to flash BIOS then?


----------



## Nico67

Quote:


> Originally Posted by *nrpeyton*
> 
> nvflash is included in the file, but should be pretty straight forward:
> 
> -make sure you back up your original BIOS first
> -always remember to disable display adapter in device manager *before* you flash
> -restart computer, open device manager and 'enable' display driver (if not enabled automatically through reboot)
> 
> let us know how you get on


You only need to use nvflash -6 S1080TIXOC.rom

The package only has the 64-bit version, so the command doesn't need the ...64 suffix.


----------



## OneCosmic

Oh wow, I am really curious how the XOC BIOS will work on FE... if it can hold SUPO 4K/8K and TimeSpy without throttling, that would be golden.


----------



## nrpeyton

I'm about to flash it to find out.

*Wish me luck.*

Hope it doesn't brick my FE Ti lol.

It's my birthday today

So as a present to myself, I get to be the first one to flash it on FE argh _test_ it on FE. lol









*Edit:*
-Flash done.

-Rebooting now. If I don't come back in 5 mins you know why lol


----------



## cyenz

On my Strix it works as intended and removes the power limit completely; however, it gives 1.1v on full load :\


----------



## Nico67

Works fine on FE, no limiting so far


----------



## nrpeyton

And it's official lol.

The BIOS does *NOT* brick a Founders Edition.











Also, only losing 1000 fan RPM on FE compared to the PALIT BIOS losing 2000 fan RPM.

100% still gives 3700 RPM on FE.


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> I'm about to flash it to find out.
> 
> *Wish me luck.*
> 
> Hope it doesn't brick my FE Ti lol.
> 
> It's my birthday today
> 
> So as a present to myself, i get to be the first one to flash it on FE argh _test_ it on FE. lol
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Edit:*
> -Flash done.
> 
> -Rebooting now. If I don't come back in 5 mins you know why lol


Happy birthday! Aaaah, wrong reply! I'm going to try it now!


----------



## nrpeyton

Also official.

It allows voltage over 1.093v (up to *1.2v*) on the *Founders Edition*


----------



## ffodi

Thanks for the info.









ASUS XOC BIOS works like a charm on an MSI 1080TI Gaming X, no power limit, no temp limit (sliders disappeared in MSI AB as well).

However, the voltage tool doesn't work, I think mainly because of the different VRM controller (uP9511 vs. NCP81274), but the good news is you can still apply up to 1.2V to the GPU using the curve.


----------



## OneCosmic

Oh wow, even higher voltage? That's GPU porn for us FE users, finally







Did you find 1.2V beneficial in terms of overclocking on water? (No sense on air coolers because those can't cool such power output)


----------



## nrpeyton

Aye,

I also did a quick Furmark run at *2100MHz at 1.2v* (it was *drawing 450 watts on a Founders Edition* with NO throttling), and it didn't crash.









Had to stop the test after a few seconds due to temps. _(and my own nerves due to FE cooler)_ lol

And it was drawing 376 watts in The Witcher 3 with no power throttling, at _over_ 2100 MHz at 1.2v.

I'm guessing on a _really_ good custom loop I could do 2.2GHz easily. _(My temp was 80c, and a -50c reduction nets about 100MHz on Pascal.)_

*Anyone already on water care to test?*
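Those wattages line up with a back-of-envelope CMOS scaling estimate: dynamic power goes roughly as V squared times frequency. A minimal sketch, assuming a ~300 W / 1.093 V / ~1900 MHz stock reference point (assumed numbers for illustration, not measurements):

```python
# Rough dynamic-power scaling: P ~ V^2 * f. All reference numbers here
# are assumptions for illustration, not measured values.

def scaled_power(p0, v0, f0, v1, f1):
    """Estimate power at (v1, f1) from a known point (p0, v0, f0)."""
    return p0 * (v1 / v0) ** 2 * (f1 / f0)

# From ~300 W at 1.093 V / 1900 MHz to 1.2 V / 2100 MHz:
print(round(scaled_power(300, 1.093, 1900, 1.2, 2100)))  # ~400 W
```

Gaming loads draw less than a power virus like Furmark, which is why the Witcher 3 number sits below the Furmark one.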


----------



## Clukos

Sounds like the perfect bios for watercooling, also no more shunt mods? How did this go through Nvidia?


----------



## Nico67

Quote:


> Originally Posted by *nrpeyton*
> 
> Also official.
> 
> It allows voltage over 1.093v (up to *1.2v) on Founders Edition*


Wow, leaping straight in there









Are you seeing higher power draw? I'm not getting limiting, but the power doesn't seem that much higher, and performance is not really scaling as much as I thought it might. Maybe still some fine tuning, I guess.









2114/6003



Did 2126/6158, but forgot to save it and shut down for the night.


----------



## nrpeyton

Quote:


> Originally Posted by *Nico67*
> 
> Wow leapen straight on in there
> 
> 
> 
> 
> 
> 
> 
> 
> 
> are you seeing higher power draw, as I'm not getting limiting, but power doesn't seem that much and performance is not really scaling as much as I thought it might, maybe still some fine tuning I guess
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2114/6003
> 
> 
> 
> did 2126/6158, but forgot to save it and shutdown for the nite


Yeah, the Witcher 3 had me at 376 watts and Furmark had me at 450 watts. _(showing in hwinfo64)_

I think the power sensing chip is the same on the STRIX and FE?

Reason you're not seeing any extra power draw is because you've not increased the voltage. You're still sitting at 1.05v. (from your screenshot). You can do that _without_ the BIOS lol.



----------



## done12many2

Quote:


> Originally Posted by *nrpeyton*
> 
> I'm guessing on a _really_ good Custom Loop I could do 2.2GHZ easily
> 
> *Anyone already on water care to test?*


Happy birthday bud.









I've got two FE cards on water that I'll be flashing in a few minutes.


----------



## nrpeyton

Quote:


> Originally Posted by *done12many2*
> 
> Happy birthday bud.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've got two FE cards on water that I'll be flashing in a few minutes.


lol thanks.

will be good to know how you get on with it.

My block still isn't here yet lol, so my testing is pretty limited.


----------



## Hopesolo

My Shunt Modded Ti @2088/6250MHz


----------



## kiwwanna

Quote:


> Originally Posted by *nrpeyton*
> 
> Aye,
> 
> I also done a quick Furmark run at *2100MHZ at 1.2v* (it was *drawing 450 watts on Founders Edition* with NO throttling). And didn't crash.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Had to stop the test after a few seconds due to temps. _(and my own nerves due to FE cooler)_ lol
> 
> And drawing 376 watts in The Witcher 3 with no power throttling at _over_ 2100 Mhz at 1.2v
> 
> I'm guessing on a _really_ good Custom Loop I could do 2.2GHZ easily. _(As my temp was 80c and -50c reduction nets about 100mhz on pascal)._
> 
> *Anyone already on water care to test?*


Is the shunt mod needed for this?


----------



## becks

I want to see the service guy's face at Nvidia when he receives an FE card with an MSI BIOS









That's priceless....


----------



## Clukos

Quote:


> Originally Posted by *kiwwanna*
> 
> Is the shunt mod needed for this?


No, the bios removes the power limit.


----------



## nrpeyton

Quote:


> Originally Posted by *kiwwanna*
> 
> Is the shunt mod needed for this?


We were talking about the new XOC BIOS

See my original post here:
http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/8600#post_26075655


----------



## Benny89

Can someone guide me through flashing BIOS







? I tried to do it with nvflash64 but I have no idea how to use it. I tried cmd and then typed cd D:\nvflash, but it doesn't go there and I have no idea what to do next









You have to first turn some protection off, back up the BIOS and then flash; I have no idea how to do this...


----------



## Sweetwater

Crikey, this seems tempting to flash to my Strix. I get a lot of pwr limit/vrel in GPU-Z, so would this remove that and be more stable clocks-wise?

Are there any drawbacks, or is it all positive? Convince me! Haha


----------



## macwin2012

Hey guys,

I am deciding between the Zotac 1080 Ti AMP Edition and the 1080 Ti AMP Extreme.

The difference between them is around $100 in my country.

Is the extra $100 justified?


----------



## Clukos

Quote:


> Originally Posted by *Sweetwater*
> 
> Crickey this seems tempting to flash to my Strix, I get a lot of pwr limit/vrel in gpuz so would this remove that and be more stable clocks wise?
> 
> Are there any draw backs or is it all positive. Convince me! Haha


Pros: There are no limits
Cons: There are no limits


----------



## asdkj1740

how high can we raise the voltage?


----------



## gammagoat

This is my Superposition/ slider overclock.

Core voltage +100
Power limit +120
Core clock +210
Memory clock +501

With my newly installed Arctic Accelero 3 and an ambient temp of 58F, the GPU temp was 49C.

My only perf cap was power. Maybe a couple of vrel blips.

I am interested in this BIOS everyone is talking about. I don't really want to run an extreme overclock 24/7, I just want to get rid of power limiting at, say, 2050. Would flashing this BIOS be fairly safe for my needs?


----------



## cyenz

Quote:


> Originally Posted by *Sweetwater*
> 
> Crickey this seems tempting to flash to my Strix, I get a lot of pwr limit/vrel in gpuz so would this remove that and be more stable clocks wise?
> 
> Are there any draw backs or is it all positive. Convince me! Haha


I tried it on my Strix; not having a PL is good, but... those VRMs get upwards of 100C with an aggressive fan curve playing Witcher 3 at 4K. I've already rolled back to the stock BIOS, since even at stock the XOC BIOS drives the GPU to 1.125v and the VRM gets over 100C with the stock fan curve.


----------



## nrpeyton

Quote:


> Originally Posted by *Benny89*
> 
> Can someone guide me through flashing BIOS
> 
> 
> 
> 
> 
> 
> 
> ? I try to do it with nvflsh64 but I have no idea how to use it. I tried cmd and then type cd D:\nvflash but It doesn't go there and I have no idea what to do next
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You have to first do some protection off, backup bios and then flash- I have no idea how to do this...


BACK UP YOUR ORIGINAL BIOS FIRST

1) Disable the graphics adapter in Device Manager

2) open the folder where you unzipped the BIOS _(and nvflash)_
3) copy the path with CTRL + C (e.g. C:\Users\nickp\Desktop\1080TI FE overclocking\S1080TIXOC\S1080TIXOC)

4) type "command" next to the start button in the lower left hand corner of the screen (in Windows 10).
5) when the command prompt appears in the search results, right-click it and select "run as administrator"

6) type CD followed by a space, then press the RIGHT mouse button to paste and hit ENTER (the directory will change to the path you copied to the clipboard)

so it looks like this:


7) type nvflash --protectoff

8) type nvflash --overridesub S1080TIXOC.rom

9) restart the computer

10) re-enable the graphics driver in Device Manager


----------



## Benny89

Quote:


> Originally Posted by *nrpeyton*
> 
> 1) Disable graphics adapter in device driver
> 
> 2) open the folder where you unzipped nvflash and the BIOS
> 3) copy the path with CNTRL + C (e.g. C:\Users\nickp\Desktop\1080TI FE overclocking\S1080TIXOC\S1080TIXOC)
> 
> 4) type "command" next to the start button on lower left hand corner of screen (in windows 10).
> 5) when the command prompt appears in search results, right click it and select run as administrator
> 
> 6) type "CD then spacebar" then press the RIGHT mouse button then hit ENTER (the directory will change to the path you copied into clipboard)
> 
> so looks like this:
> 
> 
> 7) type nvflash --protectoff
> 
> 8)type nvflash --overridesub S1080TIXOC.rom
> 
> 9) restart computer
> 
> 10) re-enable graphics driver in device manager


Ok, I did the flash, but I didn't disable the graphics adapter in Device Manager. Oh well, I will reinstall the drivers now anyway to the new ones.

So after that, do I have to push the Power Limit slider to 120? If this doesn't have any power limit...


----------



## ffodi

Pascal_autoflash0.0a_ffodi.zip 3123k .zip file

Quote:


> Originally Posted by *Benny89*
> 
> Can someone guide me through flashing BIOS
> 
> 
> 
> 
> 
> 
> 
> ? I try to do it with nvflsh64 but I have no idea how to use it. I tried cmd and then type cd D:\nvflash but It doesn't go there and I have no idea what to do next
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You have to first do some protection off, backup bios and then flash- I have no idea how to do this...


I've made a simplified flash script for it, see attachment.









1, Unpack
2, Copy the BIOS into the same folder, and rename it to BIOS.rom
3, Start FLASH.bat

(please read README as well)

The script automatically makes a backup, disables and re-enables the card, and does the flashing, of course...
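For anyone wondering what such a script boils down to, this is a hypothetical dry-run sketch of the same nvflash sequence described earlier in the thread. The nvflash64 binary name and the backup.rom / BIOS.rom filenames are assumptions, the device-manager disable/re-enable step is omitted, and it only prints the commands unless you explicitly opt in:

```python
# Hypothetical outline of the flash sequence (dry-run by default).
import subprocess

STEPS = [
    ["nvflash64", "--save", "backup.rom"],  # back up the original BIOS first
    ["nvflash64", "--protectoff"],          # lift the EEPROM write protection
    ["nvflash64", "-6", "BIOS.rom"],        # flash, overriding ID mismatches
]

def flash(dry_run=True):
    for cmd in STEPS:
        if dry_run:
            print(" ".join(cmd))  # only show what would run
        else:
            subprocess.run(cmd, check=True)  # stop on the first failure

flash()  # dry run: prints the three commands without touching the card
```

Pass dry_run=False only on a machine where you have already backed up the original BIOS and accepted the risks discussed above.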


----------



## done12many2

Quote:


> Originally Posted by *Benny89*
> 
> Ok, I did flash, but I didn't Disable graphics adapter in device driver. Ow well, I will reinstall drivers now anyway to new ones.
> 
> So do I have to after that push Power Limit slider to 120? If this doesn't have any power limit....


The PL and TL sliders no longer function; they are grayed out with the XOC BIOS.


----------



## nrpeyton

Quote:


> Originally Posted by *Benny89*
> 
> Ok, I did flash, but I didn't Disable graphics adapter in device driver. Ow well, I will reinstall drivers now anyway to new ones.
> 
> So do I have to after that push Power Limit slider to 120? If this doesn't have any power limit....


It's dangerous not to disable the graphics driver first.

You could brick a card by forgetting to do that.


----------



## Benny89

Quote:


> Originally Posted by *nrpeyton*
> 
> It's dangerous not to disable graphics driver first.
> 
> You could brick a card forgetting to do that.


Will remember next time. Rebooted with no problem; installing new drivers now and then we'll go to testing.


----------



## nrpeyton

Quote:


> Originally Posted by *Benny89*
> 
> Will remember next time. Rebooted with no problem; installing new drivers now and then we'll go to testing.


Be careful, watch your temperatures.


----------



## Benny89

Quote:


> Originally Posted by *nrpeyton*
> 
> Be careful, watch your temperatures.


Thanks! Yup, I will start with 60-70% fans


----------



## nrpeyton

Quote:


> Originally Posted by *Benny89*
> 
> Thanks! Yup, I will start with 60-70% fans


If you go beyond 1.093v I'd go with 100% fans.

In fact I don't even recommend using this BIOS at all without water.

Its true calling is Chilled Water / Dry Ice & LN2.


----------



## done12many2

Quote:


> Originally Posted by *nrpeyton*
> 
> If you go beyond 1.093v I'd go with 100% fans.
> 
> In fact I don't even recommend using the BIOS at all without water.
> 
> Its true calling is Chilled Water / Dry Ice & LN2.


How do you lock voltage at 1.093v?


----------



## nrpeyton

Quote:


> Originally Posted by *done12many2*
> 
> How do you lock voltage at 1.093v?


In the voltage/frequency curve window:

1) select the voltage you want, then press the *L key on keyboard*
2) you also have to plot an overclock above the voltage point you want
3) then click Apply in the main window

If you're not using the XOC BIOS you also have to drag the voltage slider to 100% in the main window.


----------



## chef1702

Using the L key to lock voltage means full power all the time, because my card doesn't downclock anymore after I locked the voltage. Or is it a bug on my side?


----------



## Hulio225

Quote:


> Originally Posted by *nrpeyton*
> 
> If you go beyond 1.093v I'd go with 100% fans.
> 
> In fact I don't even recommend using the BIOS at all without water.
> 
> Its true calling is Chilled Water / Dry Ice & LN2.


Installing fresh drivers after the flash reboot; will report what I can do with my custom loop... 19°C ambient at the moment though....


----------



## nrpeyton

Quote:


> Originally Posted by *chef1702*
> 
> Using the L key to lock voltage means full power all the time, because my card doesn't downclock anymore after I locked the voltage. Or is it a bug on my side?


Not a bug, normal behaviour: if you lock it, it's locked.


----------



## done12many2

Quote:


> Originally Posted by *nrpeyton*
> 
> In the voltage/frequency curve window:
> 
> 1) select the voltage you want, then press the *L key on keyboard*
> 2) you also have to plot an overclock above the voltage point you want
> 3) then click Apply in the main window
> 
> If you're not using the XOC BIOS you also have to drag the voltage slider to 100% in the main window.


I knew about hitting the L to lock it, but when I hit apply and put a load on the GPU it's not doing what I expected. In the screenshot, you can see the yellow line for the lock and the red line for where it's actually running.

I'm sure that I'm skipping something simple.


----------



## ffodi

Quote:


> Originally Posted by *chef1702*
> 
> Using the L key to lock voltage means full power all the time, because my card doesn't downclock anymore after I locked the voltage. Or is it a bug on my side?


Yep, with this method it won't downclock; it will run at 3D clocks all the time.


----------



## nrpeyton

Quote:


> Originally Posted by *done12many2*
> 
> I knew about hitting the L to lock it, but when I hit apply and put a load on the GPU it's not doing what I expected. In the screenshot, you can see the yellow line for the lock and the red line for where it's actually running at.
> 
> I'm sure that I'm skipping something simple.


From looking at your screenshots it looks like you've had a driver crash recently.

That can throw things off.

In an attempt to keep itself stable, the card/BIOS gets really grumpy and will try to keep the voltage low.

You might need to power down the PC completely and drain all power from the card and reset to defaults.

Hold the power button on your PC down for 8 seconds while it's off at the wall to drain everything out.

Then reboot PC.


----------



## done12many2

Sounds good. I'll give it a spin. Yes, I've had many crashes while testing things out.


----------



## eXistencelies

Quote:


> Originally Posted by *ffodi*
> 
> Pascal_autoflash0.0a_ffodi.zip 3123k .zip file
> 
> I've made a simplified flash script for it, see attachment.
> 
> 1, Unpack
> 2, Copy the BIOS into the same folder, and rename it to BIOS.rom
> 3, Start FLASH.bat
> 
> (please read the README as well)
> 
> The script automatically makes a backup, disables and re-enables the card, and of course does the flashing....


So this will work with any FE 1080 Ti, yes? Also, where is this BIOS I have to copy into the folder? And which folder? I have never reflashed a GPU before, so I am curious.


----------



## done12many2

Quote:


> Originally Posted by *nrpeyton*
> 
> From looking at your screenshots it looks like you've had a driver crash recently.
> 
> That can throw things off.
> 
> In an attempt to keep itself stable, the card/BIOS gets really grumpy and will try to keep the voltage low.
> 
> You might need to power down the PC completely and drain all power from the card and reset to defaults.
> 
> Hold power key on your PC down for 8 seconds while it's off at the wall to drain everything out.
> 
> Then reboot PC.


Thanks man. Worked perfectly!!

+1

Spoke too soon. I showed the desired voltage, but then dropped again. I'll keep tinkering.


----------



## Hulio225

First test, Time Spy Game Test 2:
2151 MHz core and +750 MHz VRAM passed @ 1.2V, max temp 39°C

Edit: But scores are significantly lower :-(


----------



## ffodi

Quote:


> Originally Posted by *eXistencelies*
> 
> So this will work with any FE 1080ti, yes? Also where is this bios I have to copy into the folder? Also which folder? I have never reflashed a gpu before so I am curious.


Yes, it will work with any FE (with any 1080 Ti)....

1, Download the BIOS you want to install, e.g. the ASUS XOC BIOS from this post.

2, Unpack the attached zip file with the script (all files) to e.g. "C:\autoflash" (or desktop or whatever)

3, Copy the mentioned BIOS (in the case of the XOC BIOS, you have to unpack it first) to the same folder where you unpacked the flasher, and rename it to BIOS.rom.

(all the needed info is described in the README file in detail...)


----------



## Benny89

Initial testing with the new BIOS on my STRIX. The card is now hitting around 1.125V but temps did not change. Still maxing around 60C with 55% fan speed.

With +55 on the core the card is hitting 2050 MHz.

Temperature downclocks are gone. The card does not downclock on temp limits.

*HOWEVER, what's funny: I still get Power and Voltage Limit in the OSD when running Heaven or Witcher 3, and the card downclocks to 2038.*

The card will still crash if it hits a certain temp. For example I can play Witcher 3 at 2038 or run Heaven as long as it stays below 62C. Once it hits 64 it crashes.

After it downclocks to 2038 I am still sitting on Power and Voltage perf caps anyway.

I have no idea how it can hit any power limit if the power limit is totally gone....

So... conclusion: if you have a bad chip like I have, it will really give you nothing. If you already have a golden one, it will help.

I am disappointed it still hits the Power perf cap lol....


----------



## nrpeyton

Quote:


> Originally Posted by *Benny89*
> 
> Initial testing with the new BIOS on my STRIX. The card is now hitting around 1.125V but temps did not change. Still maxing around 60C with 55% fan speed.
> 
> With +55 on the core the card is hitting 2050 MHz.
> 
> Temperature downclocks are gone. The card does not downclock on temp limits.
> 
> *HOWEVER, what's funny: I still get Power and Voltage Limit in the OSD when running Heaven or Witcher 3, and the card downclocks to 2038.*
> 
> The card will still crash if it hits a certain temp. For example I can play Witcher 3 at 2038 or run Heaven as long as it stays below 62C. Once it hits 64 it crashes.
> 
> After it downclocks to 2038 I am still sitting on Power and Voltage perf caps anyway.
> 
> I have no idea how it can hit any power limit if the power limit is totally gone....
> 
> So... conclusion: if you have a bad chip like I have, it will really give you nothing. If you already have a golden one, it will help.
> 
> I am disappointed it still hits the Power perf cap lol....


hmm something isn't right there.

Try flashing the BIOS again? But this time disable graphics driver first _(as recommended)_?
I was drawing 450+ watts with no power limiting. With the stock BIOS it power limits at 300 watts.

No shunt mod here yet.


----------



## Hulio225

Quote:


> Originally Posted by *Benny89*
> 
> Initial testing with the new BIOS on my STRIX. The card is now hitting around 1.125V but temps did not change. Still maxing around 60C with 55% fan speed.
> 
> With +55 on the core the card is hitting 2050 MHz.
> 
> Temperature downclocks are gone. The card does not downclock on temp limits.
> 
> *HOWEVER, what's funny: I still get Power and Voltage Limit in the OSD when running Heaven or Witcher 3, and the card downclocks to 2038.*
> 
> The card will still crash if it hits a certain temp. For example I can play Witcher 3 at 2038 or run Heaven as long as it stays below 62C. Once it hits 64 it crashes.
> 
> After it downclocks to 2038 I am still sitting on Power and Voltage perf caps anyway.
> 
> I have no idea how it can hit any power limit if the power limit is totally gone....
> 
> So... conclusion: if you have a bad chip like I have, it will really give you nothing. If you already have a golden one, it will help.
> 
> I am disappointed it still hits the Power perf cap lol....


Can't imagine that you were hitting the power limit, since that's removed in that BIOS...

I wasn't hitting anything, though I'm additionally shunt modded...

But I was getting ~5% less FPS than with the FE BIOS

Edit:

That really sucks, because I know with extra voltage I can go much higher in clocks, but this BIOS isn't optimized for FE cards :-(


----------



## madmeatballs

Damn, I wanna try the XOC BIOS but I'm too chicken to do it just yet.

I got watercooling though. Just wanna see how it goes for other people first before I do it.
Is it possible to just disable the PL and still stay at 1.093v?


----------



## Hulio225

Quote:


> Originally Posted by *madmeatballs*
> 
> Damn, I wanna try the XOC BIOS but I'm too chicken to do it just yet.
> 
> I got watercooling though. Just wanna see how it goes for other people first before I do it.
> Is it possible to just disable the PL and still stay at 1.093v?


The Power Limit is disabled by default in that XOC BIOS; you can play with voltages up to 1.2V in Afterburner... and higher with tools,
so obviously you can just stick to 1.093V.


----------



## done12many2

Quote:


> Originally Posted by *done12many2*
> 
> Thanks man. Worked perfectly!!
> 
> +1
> 
> Spoke too soon. I showed the desired voltage, but then dropped again. I'll keep tinkering.


I figured it out. In SLI, you have to manually lock each card separately. You can't lock them at the same time.


----------



## nrpeyton

Quote:


> Originally Posted by *madmeatballs*
> 
> Damn I wanna try the XOC bios but too chicken to do it just yet
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I got watercooling though. Just wanna see how it goes for other people first before I do it.
> Is it possible to just disable PL? still stay at 1.093v?


*How to use the XOC BIOS to remove power limits without going over stock voltage:*

You could achieve that using the XOC BIOS yes, but you'd have to manipulate the curve.

The way it works is: if the card can achieve a given clock speed by selecting a lower voltage point on the voltage/frequency curve, it will.

It will always try to boost to the highest point on the curve (and select the voltage required to do that). So whatever voltage point is associated with the highest boost on your curve, that is the voltage that gets selected. _(Normally with stock BIOSes that would be influenced by power/temp throttling, but with the XOC no such restrictions exist.)_

You could set your curve up so that all voltages above 1.093v don't give any more frequency than 1.093v.

I.E:
1.093v @ 2000mhz
1.1v still at 2000mhz
1.125v still at 2000mhz
1.2v still at 2000mhz

instead of

1.093v @ 2000
1.1v at 2050
1.125v at 2100

etc, etc.

Your card will know there's no point selecting a higher voltage, because it can get to 2000 without that voltage. (GPU boost 3.0 is smart) ;-)

You have to set all that up by manipulating your curve.

Hope that makes sense.

So yes, you can use the BIOS just to remove the power target then manipulate the curve to prevent it using the extra voltage (so it still maxes out at 1.093v without power limits).
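The flattening described above amounts to a simple clamp; here is a hypothetical Python sketch (the `flatten_curve` name and the curve points are illustrative, not an Afterburner API):

```python
def flatten_curve(curve, v_cap=1.093):
    """Clamp a voltage/frequency curve so every point above v_cap
    offers no more MHz than the best point at or below v_cap."""
    cap_mhz = max(mhz for v, mhz in curve if v <= v_cap)
    return [(v, mhz if v <= v_cap else cap_mhz) for v, mhz in curve]

# Example: the 2000 MHz curve from the post above
curve = [(1.093, 2000), (1.100, 2050), (1.125, 2100), (1.200, 2150)]
print(flatten_curve(curve))
# → [(1.093, 2000), (1.1, 2000), (1.125, 2000), (1.2, 2000)]
```

With every point above 1.093v flattened to the same frequency, GPU Boost has no reason to select the higher voltages, so the card stays at 1.093v while keeping the removed power limit.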


----------



## Benny89

I reflashed the old stock STRIX BIOS (disabling the GPU this time), rebooted, let it run: OK. Then I disabled again, flashed the XOC BIOS again, rebooted, enabled the device, and did a small OC of +55 on the core.

Still hitting Perfs of Power and Voltage:



Is my 1000W PSU too low lol? How am I still hitting a power limit?

The Power Limit slider is disabled in Afterburner, so it should not hit it at all....


----------



## madmeatballs

Quote:


> Originally Posted by *Hulio225*
> 
> The Power Limit is disabled by default in that XOC BIOS; you can play with voltages up to 1.2V in Afterburner... and higher with tools,
> so obviously you can just stick to 1.093V.


Quote:


> Originally Posted by *nrpeyton*
> 
> *How to use the XOC BIOS to remove power limits without going over stock voltage:*
> 
> You could achieve that using the XOC BIOS yes, but you'd have to manipulate the curve.
> 
> The way it works is: if the card can achieve a given clock speed by selecting a lower voltage point on the voltage/frequency curve, it will.
> 
> 
> It will always try to boost to the highest point on the curve (and select the voltage required to do that). So whatever voltage point is associated with the highest boost on your curve, that is the voltage that gets selected. _(Normally with stock BIOSes that would be influenced by power/temp throttling, but with the XOC no such restrictions exist.)_
> 
> You could set your curve up so that all voltages above 1.093v don't give any more frequency than 1.093v.
> 
> I.E:
> 1.093v @ 2000mhz
> 1.1v still at 2000mhz
> 1.125v still at 2000mhz
> 1.2v still at 2000mhz
> 
> instead of
> 
> 1.093v @ 2000
> 1.1v at 2050
> 1.125v at 2100
> 
> etc, etc.
> 
> Your card will know there's no point selecting a higher voltage, because it can get to 2000 without that voltage. (GPU boost 3.0 is smart) ;-)
> 
> You have to set all that up by manipulating your curve.
> 
> Hope that makes sense.
> 
> So yes, you can use the BIOS just to remove the power target then manipulate the curve to prevent it using the extra voltage (so it still maxes out at 1.093v without power limits).


Alright, I'll give it a try then. Oh, did the bios disable a port on your FE?


----------



## nrpeyton

Quote:


> Originally Posted by *Benny89*
> 
> I reflashed stock old STRIX BIOS (disabling GPU this time), rebooted, let it run, ok. I disabled again, flashed XOC BIOS again, rebooted, enabled device, did small OC +55 on core.
> 
> Still hitting Perfs of Power and Voltage:
> 
> 
> 
> Is my 1000W PSU too low lol? How am I still hitting a power limit?
> 
> Power Limit slider is disabled in Afterburner so it should not hit it at all....


Can you show me the GPU-Z screenshot of the perfcap, and the HWiNFO64 readings showing power draw and voltage?

And your curve in MSI AB, etc.


----------



## Hulio225

Quote:


> Originally Posted by *madmeatballs*
> 
> Alright, I'll give it a try then. Oh, did the bios disable a port on your FE?


Yes, like the standard Strix BIOS, so choose another DisplayPort.
Quote:


> Originally Posted by *Benny89*
> 
> I reflashed stock old STRIX BIOS (disabling GPU this time), rebooted, let it run, ok. I disabled again, flashed XOC BIOS again, rebooted, enabled device, did small OC +55 on core.
> 
> Still hitting Perfs of Power and Voltage:
> 
> 
> 
> Is my 1000W PSU too low lol? How am I still hitting a power limit?
> 
> Power Limit slider is disabled in Afterburner so it should not hit it at all....


If your clock is stable (not fluctuating) it's just a display bug with that power limit.


----------



## madmeatballs

Quote:


> Originally Posted by *Hulio225*
> 
> Yes like the standard strix bios, so choose another display port


Okay, would you know which port? I don't want to freak out.

I got my DP plugged into the left-most port (regular GPU position).


----------



## Hulio225

Quote:


> Originally Posted by *madmeatballs*
> 
> Okay, would you know which port? I don't want to freak out.
> 
> I got my DP plugged into the left-most port (regular GPU position).


The right-most one is disabled if the card is mounted in a standard case.


----------



## madmeatballs

Quote:


> Originally Posted by *Hulio225*
> 
> The right-most one is disabled if the card is mounted in a standard case.


Alright, I'll get back here and post results.


----------



## nrpeyton

Aye,

The STRIX features 2 HDMI & 2 DisplayPort.

The FE has 3 DisplayPort and 1 HDMI.

Don't let this trick you into thinking you've bricked your card after flashing.

Just move the cable into a different connection on the GPU.

I've updated my original post (with warnings etc) to include a warning about the missing DisplayPort.


----------



## Benny89

Quote:


> Originally Posted by *nrpeyton*
> 
> Can you show me the GPU-Z screenshot of the perfcap, and the HWiNFO64 readings showing power draw and voltage?
> 
> And your curve in MSI AB, etc.


Now it downclocked 3 times for me, from 2050 to 2012 during the bench. All the time I had Power and Voltage perf caps. Here are the logs:

There is no curve really, as I simply OCed manually. I did not set up a curve yet, I just wanted to try no power limit....


----------



## johnwayne117

So the ASUS 1080 Ti Strix OC is the closest we can get to that no-limit BIOS? I mean in terms of VRM etc. And do you need to do something besides flashing this no-limit BIOS?


----------



## chef1702

@Benny89

That's kinda strange. I pulled around 430W out of my FE @ 1.13V and a 206x clock with an 850W PSU. Are you using any adapters for your 6- and 8-pin cables? Make sure you don't have any other OC program in the background.


----------



## KraxKill

Quote:


> Originally Posted by *nrpeyton*
> 
> Aye,
> 
> The STRIX features 2 HDMI & 2 Displayport.
> 
> FE has 3 Displayport and 1 HDMI
> 
> Don't let this trick you into thinking you've bricked your card after flashing.
> 
> Just move the cable into a different connection on GPU.
> 
> I've updated my original post (with warnings etc) to include a warning about the missing displayport.


Just happened to me, luckily I know what's up and have RDP access into my computer.

I'm waiting for my chillbox to come up to ambient so I can move the DisplayPort cable to another port and then I will test the XOC against the FE clock for clock volt for volt.

If, as with the Palit BIOS, it scores about half an fps lower on a ~65 fps test like Time Spy Graphics Test 2, then it will likely need to run a bin higher to deliver the same performance.

If I can hit 2200+ on it, I will likely run the XOC.

Has anybody done a clock-for-clock, volt-for-volt comparison against the FE BIOS yet?


----------



## nrpeyton

Quote:


> Originally Posted by *Benny89*
> 
> Now it downclocked 3 times for me, from 2050 to 2012 during the bench. All the time I had Power and Voltage perf caps. Here are the logs:
> 
> There is no curve really, as I simply OCed manually. I did not set up a curve yet, I just wanted to try no power limit....


Quote:


> Originally Posted by *chef1702*
> 
> @Benny89
> 
> That's kinda strange. I pulled around 430W out of my FE @ 1.13V and a 206x clock with an 850W PSU. Are you using any adapters for your 6- and 8-pin cables? Make sure you don't have any other OC program in the background.


I agree, me too.

Are you able to take a video, Benny, so we can try to work out what's happening? It doesn't seem to be affecting anyone else.

It might drop bins (13 MHz increments) as it passes through the normal temperature thresholds. But the *UPPERMOST* temp limit _(i.e. what's displayed as the temp limit in MSI AB)_ and the power limits have all been removed.


----------



## done12many2

Heaven @ 2200 MHz with 1.193v in SLI

Power draw at the wall peaked at 810w. Snapped off an 803w shot.

Still tinkering.


----------



## Hulio225

Quote:


> Originally Posted by *KraxKill*
> 
> Just happened to me, luckily I know what's up and have RDP access into my computer.
> 
> I'm waiting for my chillbox to come up to ambient so I can move the DisplayPort cable to another port and then I will test the XOC against the FE clock for clock volt for volt.
> 
> If as with the Palit bios it scores about a half fps lower on about a ~65fps test like TimeSpy Graphics Test 2 the. It will likely need to run a bin higher to deliver the same performance.
> 
> If I can hit 2200+ on it, I will likely run the XOC.
> 
> Has anybody done a clock for clock volt for volt on it yet vs the FE bios?


Mate, I already tested that: with 2152 MHz on the core and +750 MHz on VRAM I scored 63 FPS on average in Time Spy Game Test 2 with the XOC BIOS on an FE card.
With the NVIDIA FE BIOS and 2100/+750 I'm doing 65.5-66 FPS.

So ~5% worse performance with the XOC BIOS on an FE card, sadly.


----------



## KedarWolf

Quote:


> Originally Posted by *ViRuS2k*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Don't need to solder, conductive paint and the glue keeps them on, they are these, same size as the shunts.
> 
> 
> 
> 
> 
> http://www.ebay.co.uk/itm/300745229871?_trksid=p2060353.m2749.l2648&var=600050687744&ssPageName=STRK%3AMEBIDX%3AIT
> 
> Just bought some of those resistors, 47 ohm, 10 for £1.09.
> 
> ----
> 
> Could you tell me the process for using these? Can you lay them on top of the old resistor with silicone thermal glue?
> A list of things I need, if you could please; it needs to be as easy as possible to remove and leave no trace in case of RMA etc lol
Click to expand...

On the end of each shunt, where the metal contacts are, you put silver conductive paint; in the middle between the metal contacts you put nonconductive thermal glue (I think silicone thermal glue will work). Use the 47 Ohm resistors; it'll give you 1.72x TDP. Let me find the post showing diagrams.

Where it says 'thermal paste', use nonconductive thermal glue.

http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/8260_20#post_26069283


----------



## KraxKill

Quote:


> Originally Posted by *Hulio225*
> 
> Mate, I already tested that: with 2152 MHz on the core and +750 MHz on VRAM I scored 63 FPS on average in Time Spy Game Test 2 with the XOC BIOS on an FE card.
> With the NVIDIA FE BIOS and 2100/+750 I'm doing 65.5-66 FPS.
> 
> So ~5% worse performance with the XOC BIOS on an FE card, sadly.


Can you do it clock for clock though? 2100 vs 2100, for example, at the same voltage. Running different clocks introduces variables like error correction on the memory, GPU, etc.

I'd like to get an idea of the clock-for-clock deficit to gauge the offset required on the XOC to match the FE BIOS.

There is also the issue of frame times, 0.1% and 1% lows etc, which would need to be tested with Fraps.


----------



## Benny89

Quote:


> Originally Posted by *chef1702*
> 
> @Benny89
> 
> Thats kinda strange. I pulled around 430 out of my FE @ 1.13 V and 206x CLock with an 850 PSU. Are you using some adapters for your 6 and 8 pin cables? Make sure you dont have any other oc prog in the background.


No adapters, straight custom sleeved cables from PSU to GPU.

Nothing else is running in the background, only Afterburner, GPU-Z and Corsair Utility Engine.


----------



## KedarWolf

Quote:


> Originally Posted by *Benny89*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nrpeyton*
> 
> nvflash is included in the file, but should be pretty straight forward:
> 
> -make sure you back up your original BIOS first
> -always remember to disable display adapter in device manager *before* you flash
> -restart computer, open device manager and 'enable' display driver (if not enabled automatically through reboot)
> 
> let us know how you get on
> 
> 
> 
> Hm, I look here http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti And I don't see anything about disable display adapter in device manager.
> 
> Won't that give me black screen when I do it? How to flash BIOS then?
Click to expand...

Newer versions of nvflash automatically disable the graphics driver, and since the OP has version 5.353.0 you don't need to disable it manually.


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> Newer versions of nvflash automatically disable the graphics driver, and since the OP has version 5.353.0 you don't need to disable it manually.


I still prefer to do it manually, in case the app fails to do it and then flashes over an active driver.
Just an extra safety net.


----------



## madmeatballs

Alright, this is using the XOC BIOS on my FE. I tried the same V/F curve from my FE OC; looks like I lost 130 points using this BIOS. I'll try going beyond 1.093v.



@nrpeyton do you think I should put the core voltage slider to max?


----------



## nrpeyton

Quote:


> Originally Posted by *Hulio225*
> 
> Mate, I already tested that: with 2152 MHz on the core and +750 MHz on VRAM I scored 63 FPS on average in Time Spy Game Test 2 with the XOC BIOS on an FE card.
> With the NVIDIA FE BIOS and 2100/+750 I'm doing 65.5-66 FPS.
> 
> So ~5% worse performance with the XOC BIOS on an FE card, sadly.


Strange.

That goes against the laws of physics lol.

It's clocked higher. Memory is clocked higher. Yet it's turning out _fewer_ frames.

Now's about the time we could really use a scientific answer as to why.

The only thing I can think of so far is that maybe the overclock or volts are too high for the temperature, and the card is being forced to "error correct", or in other words drop frames, to prevent itself from crashing.

But you found it quite stable? So that doesn't look like the reason.


----------



## done12many2

Quote:


> Originally Posted by *madmeatballs*
> 
> Alright, this is using the XOC BIOS on my FE. I tried the same V/F curve from my FE OC; looks like I lost 130 points using this BIOS. I'll try going beyond 1.093v.


I'm seeing the same thing in Superposition. It's running at a much higher clock speed and holding it while drawing a ton more power, but the frames are lower.


----------



## nrpeyton

KK, we just need to ascertain whether that's also the case on the STRIX, i.e. does the BIOS perform better on its native card?

Looks like the shunt mod is still the best answer for the FE at the moment, until we learn _why_ scores are lower at the same or greater clocks.

The voltage controllers must be different between the FE and STRIX too, as the tool also doesn't seem to be doing anything for me.

*How is the XOC comparing vs. the PALIT BIOS on scores ?* (if someone on *Water* could test)?


----------



## KraxKill

Quote:


> Originally Posted by *madmeatballs*
> 
> Alright, this is using the XOC BIOS on my FE. I tried the same V/F curve from my FE OC; looks like I lost 130 points using this BIOS. I'll try going beyond 1.093v.
> 
> 
> 
> @nrpeyton do you think I should put the core voltage slider to max?


That's only about a 1% deficit and well within the margin of error for the bench. Hulio above claims a 5% handicap. 1% is a small price to pay to be able to run 1.2v.


----------



## madmeatballs

Quote:


> Originally Posted by *nrpeyton*
> 
> KK, we just need to ascertain whether that's also the case on the STRIX, i.e. does the BIOS perform better on its native card?
> 
> Looks like the shunt mod is still the best answer for the FE at the moment, until we learn _why_ scores are lower at the same or greater clocks.


Quote:


> Originally Posted by *KraxKill*
> 
> That's only about a 1% deficit and well within the margin of error for the bench. Hulio above claims a 5% handicap. 1% is a small price to pay to be able to run 1.2v.


I'll keep on testing, but as far as I can see this BIOS is much better than Inno3D's, EVGA's, or Palit's. I was getting much worse scores with the same OC on those (I lose like 800-1k points).


----------



## nrpeyton

Quote:


> Originally Posted by *KraxKill*
> 
> That's only about a 1% deficit and well within the margin of error for the bench. Hulio above claims a 5% handicap. 1% is a small price to pay to be able to run 1.2v.


I am able to get 1.2v without touching the voltage slider, so I'm not sure it even does anything anymore.

If it does (i.e. allows voltage beyond 1.2v), I'm not even sure how we'd dial that in, as MSI Afterburner's volt/freq curve window tops out at 1.2v.


----------



## Clukos

Maybe the cards are reaching GPU clock instability and that's the reason for the lower scores? I noticed that when I tried pushing 2126 at 1.093v it didn't crash, but FPS was lower all around.


----------



## Benny89

I don't know what is wrong with that BIOS...

I let Heaven run with +55 on the core. It started at 2050 and downclocked all the way to 1974 MHz. Much worse than on the stock BIOS. Power and Voltage perf caps all the time, beginning from around 60C.

This XOC BIOS does not seem to work as advertised for me...


----------



## nrpeyton

Quote:


> Originally Posted by *Benny89*
> 
> I don't know what is wrong with that BIOS...
> 
> I let Heaven run with +55 on the core. It started at 2050 and downclocked all the way to 1974 MHz. Much worse than on the stock BIOS. Power and Voltage perf caps all the time, beginning from around 60C.
> 
> This XOC BIOS does not seem to work as advertised for me...


That's strange, because you have a STRIX, which is the same card the LN2 guy used with this BIOS in his article at hwbot on his LN2 run _(Liquid Nitrogen benching session)_.

_I wonder if there's a handicap built into the BIOS to prevent full GPU utilisation until a certain lower temperature is reached, i.e. below zero degrees C._ _<--- *only* a theory_


----------



## Hulio225

The Strix PCB is different from the FE's. Maybe some code in the BIOS file is optimized for the ICs used on the Strix PCB, and therefore it performs worse on FE cards, which probably use other ICs, or whatever.


----------



## KraxKill

Quote:


> Originally Posted by *nrpeyton*
> 
> That's strange, because you have a STRIX, which is the same card the LN2 guy used with this BIOS in his article at hwbot on his LN2 run _(Liquid Nitrogen benching session)_.
> 
> _I wonder if there's a handicap built into the BIOS to prevent full GPU utilisation until a certain lower temperature is reached, i.e. below zero degrees C._ _<--- *only* a theory_


I bet there is a lot of truth to your temp-throttle comment, because my FE behaves differently at sub-zero than it does at ambient. It actually wants to select lower vbins than it otherwise would at the same clock. In some instances it picks unrealistically low voltages for a given clock. It would make logical sense that the thermal characteristics of the different cooling solutions people use would cause the cards to react differently to various BIOS versions.


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> 100ohm resistors will give you 1.4x TDP
> 
> Formula:
> 
> (R+40)/R
> 
> for example:
> R = 10ohm, 5x TDP
> R = 22ohm, 2.81x TDP
> R = 47ohm, 1.85x TDP
> R = 100ohm, 1.4x TDP
> 
> I'd recommend 22 ohm or 47 ohm resistors. 5x is a bit overkill; you'll hit the low-side TDP anyway. 1.4x TDP is a little on the low side; you probably want more.


So if I'm correct the above would translate to:

R = 10ohm, 1500 watt limit _<-- won't work as will probably put card into safe mode + completely pointless as card would never draw this much, and even if it did it would blow up lol_
R = 22ohm, 843 watt limit
R = 47ohm, 555 watt limit
R = 100ohm, 420 watt limit

?
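The arithmetic above can be sanity-checked with a short script. This is just a sketch of the quoted formula, assuming the FE's 300 W stock limit mentioned in this thread; note the post's 843 W figure for 22 ohm comes from rounding the multiplier to 2.81 before multiplying, while computing directly gives ~845 W.

```python
# Shunt-mod TDP scaling from the formula quoted above: multiplier = (R + 40) / R,
# where R is the added resistor value in ohms. Watt figures assume the FE's
# 300 W stock power limit.
STOCK_TDP_W = 300

def tdp_multiplier(r_ohm: float) -> float:
    """TDP multiplier for a resistor of r_ohm ohms."""
    return (r_ohm + 40) / r_ohm

def power_limit_w(r_ohm: float, stock_w: float = STOCK_TDP_W) -> float:
    """Effective power limit in watts after the mod."""
    return tdp_multiplier(r_ohm) * stock_w

for r in (10, 22, 47, 100):
    print(f"R = {r:>3} ohm -> {tdp_multiplier(r):.2f}x TDP, ~{power_limit_w(r):.0f} W")
```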


----------



## hotrod717

Quote:


> Originally Posted by *nrpeyton*
> 
> _S1080Ti XOC (Extreme Overclock) BIOS:_
> 
> S1080TIXOC.zip 1820k .zip file
> 
> 
> -Allows voltage up to 1.2v
> -Completely removes power limit. _I was drawing 450 watts in Furmark on my Founders Edition. Stock maximum is 300 watts_
> 
> Warning:
> -Temps will increase
> -You lose 20% of your maximum fan speed on a FE
> -Don't use with cheap PSU PCI-E power cables. If cables aren't of the ATX-recommended gauge, they could overheat and cause a fire due to the increased power draw.
> -Using this BIOS invalidates your warranty. Do so at your _own_ risk + expense.
> -*You will lose one DisplayPort.* If your monitor is plugged into the one that's disabled, just switch your cable over to another _(don't let this fool you into thinking you've bricked your card)_
> -Removes UPPER & LOWER temperature limits. Caution! The card will overheat *without* throttling. These safeguards were disabled to allow the card to also operate normally at extremely LOW temperatures (i.e. -140 degrees C).
> 
> 1080 voltage tool (v.2) for ASUS STRIX:
> 
> s1080_tool_v02.zip 503k .zip file
> 
> 
> -Command line interface
> -Allows voltage much higher than 1.2v _(1.5v possibly; be careful, voltages at those levels without extreme cooling will evaporate transistors on your card's GPU core)._
> -Doesn't work with Founders Edition _(different controllers)_
> 
> 
> 
> More information r.e. extreme O/C LN2 session and original post at hwbot.org:
> http://forum.hwbot.org/showthread.php?t=169488


Awesome. Not sure if I will use this today, but it will definitely prove useful next LN2 session. I may have to get a STRIX solely based on the availability of voltage and the tool.

Radio Shack stores everywhere are shutting down. Picked up all kinds of resistors. Anyone near one, I suggest stopping by. $20 to fill a plastic bag with anything that can fit.


----------



## madmeatballs

Hmmm, at 2126MHz with the same mem clock as my FE OC, I can't beat my score from 2062MHz on the FE BIOS.







Looks like it just doesn't like the FE?

edit:
Even at 2150MHz @ 1.2v I still can't beat my 2062MHz FE BIOS OC score.







Max temp with 1.2v in SP was 44C (24C ambient).
Going back to the FE BIOS for now.

I hope that with the BIOS and the tool, maybe someone can dissect them and make an editor?


----------



## nrpeyton

Quote:


> Originally Posted by *Hulio225*
> 
> The STRIX PCB is different from the FE cards; maybe some code in the BIOS files is optimized for the ICs used on the STRIX PCB, and therefore it performs worse on FE cards, which probably use other ICs or whatever.


That makes sense too.. but I'm not sure the BIOS is as "involved" as that?
But of course I could be wrong.

Just brainstorming.

Seems we still don't have a solution for voltage yet on FE 1080ti then.

Quote:


> Originally Posted by *KraxKill*
> 
> I bet there is a lot of truth to your temp throttle comment, because my FE behaves differently at sub-zero than it does at ambient. It actually wants to select lower vbins than it would otherwise at the same clock. In some instances it picks unrealistically low voltages for a given clock. It would make logical sense that the thermal characteristics of the different cooling solutions people use would cause the cards to react differently to various BIOS versions.


After I get my new GPU pot, I'll do a sub-zero run with the XOC BIOS and see if it behaves any differently.

For now at least, we still need to live with stock voltage on FE cards.

The problem is that Nvidia won't release anything for more voltage.

And the only company that seems willing is ASUS. It wouldn't help their profits to release something for the FE, as they'd lose the extra sales they get from people who know that going with ASUS is the only way to get more voltage.

Galax usually eventually releases something too (a tool).

As does Kingpin/TIN for Classified & KPE models (not *officially* sponsored by EVGA, but their way of doing it without pissing off Nvidia is to release it via kingpincooling.com).

This is another reason LN2 benching is so important: it's these guys who always get us these tools in the end.


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> Also you could just say there was no sticker on the card.
> 
> Has anyone ever heard of a GPU company refusing RMA due to missing sticker? (Everyone's noticed the stickers but never heard of an incident where someone was actually refused an RMA because of it).
> 
> It's there as an "idiot-proof" line of defence, or a "deterrent" to 'encourage' a 14-year-old NOT to take their new Xmas present GPU apart.
> 
> In supermarkets here; they put yellow "security protected" stickers on some expensive items that are difficult to tag, maybe due to size/shape. In reality there's no such tag or security device on the product. But usually the sight of these official stickers is enough of a deterrent.
> 
> And do we think anyone who's ever bought and installed an official EKWB waterblock on an ASUS card has ever been refused an RMA because a sticker was removed? Highly doubt it.
> 
> On some cards the stickers aren't even put on top of the screws.
> 
> On my last GPU the sticker wasn't even on the GPU at all, but on a bit of outer packaging that you needed to open to get the card out of its box lol (the sticker said the warranty was void if I removed it) haha
> 
> look I took a photo:
> 
> 
> lol, I wonder if my card will still be as stable in the Witcher 3 (and other benches) now that the bottleneck will be back on the GPU (and not the CPU).
> 
> Will be very interesting if this does affect how stable it is. Because if it does affect my Witcher 3 stability, it basically means you can't really compare the lottery from one person to another, and that stability and highest overclock are completely subjective depending on other hardware and factors.


Quite possibly it will. It's like if you play a game like, say, FIFA 17 in 4K with 4xAA ultra settings (everything ticked) locked to 59 fps: it runs my 1080 Ti at 40% power, 27% util (lol), so if I am at 2114 it won't crash. I can run a maxed but very unstable GPU setting there with no repercussions. But if I turn off the frame rate limiter and it's uncapped 4K ultra everything with 4xSSAA... well, as you know, it will crash quickly. So you removing the FX was you removing the "frame limiter" in some games. At QHD or FHD it is going to happen in every game. Now, you stabilized with SuPo/other benches plus OCCT/other tools, so you may only need a minor adjustment here or there. But yeah, it's going to feel like you have a whole new rig.

My son was on an FX 6300 and a 960. Gave that to my daughter for a Sims box (all she plays), and put in my old 4790K with a GTX 1060 for my son when I got my 5775C, and it was night and day in most games. In a few he was already locked at 72 ultra (like BF1), mostly games that utilize more than 4 cores. But now he is locked at 72 fps at Full HD ultra on everything (except Arma 3, which dips to ~55 fps). Very good perf.

So, in summary, you are going to freaking be blown away by your rig now. Your rig is Bae.


----------



## KraxKill

Quote:


> Originally Posted by *madmeatballs*
> 
> Hmmm, at 2126MHz with the same mem clock as my FE OC, I can't beat my score from 2062MHz on the FE BIOS.
> 
> 
> 
> 
> 
> 
> 
> Looks like it just doesn't like the FE?
> 
> edit:
> Even at 2150MHz @ 1.2v I still can't beat my 2062MHz FE BIOS OC score.
> 
> 
> 
> 
> 
> 
> 
> max temp with 1.2v on SP was 44c (24c ambient)


Do a test at the same clocks and same volts. Higher clocks may simply be destabilizing your core/mem at the temps you're running.


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> Here is with it done by someone else. Different card I think but same kinda thing.


so it'd be clean, prep with
Quote:


> Originally Posted by *KedarWolf*
> 
> 0805 47 Ohm resistors fit. 0805 is the size you need.
> 
> I know the method to do this from someone on this thread, it's pretty easy and removable with 99% alcohol.


Yeah, this is a shunt mod I could get behind. I just ain't with that "living" LM in my case... stuff be relocatin' on ya without permission!


----------



## madmeatballs

Quote:


> Originally Posted by *KraxKill*
> 
> Do a test at the same clocks and same volts. Higher clocks may simply be destabilizing your core/mem at the temps you're running.


I already did that: same V/F curve and mem OC scored 10060, while I scored 10193 with the stock FE BIOS at the same OC and voltages.

Recap photo for my FE bios oc


Recap photo for XOC bios oc same everything


----------



## Hulio225

Quote:


> Originally Posted by *KraxKill*
> 
> Can you do it clock for clock thought. 2100 vs 2100 for example at the same voltage. Running different clocks introduces variables like error correction on the mem, gpu etc.
> 
> I'd like to get an idea of a clock for clock deficit to gauge the offset required on the XOC to match the FE BIOS.
> 
> There is also the issue of Frame times, .01 and 1% lows etc which would need to be tested with fraps.


I'm repeating my testing with 2101 / 5508 @ 1075mV.
Time Spy Game Test 1 (because no Power Limit and **** here)
Nvidia FE BIOS: AVG 68.0 +/- 0.1 FPS

XOC BIOS: AVG 67.85 +/- 0.1 FPS

So my previous testing was somehow corrupted...

Now the behaviour is as I would have expected in the first place, since I've tested every BIOS out there extensively. And this is now in line with my previous tests with the normal STRIX BIOS and the ASUS FE BIOS, which differ from the Nvidia BIOS.

So I have to play a bit more with this BIOS now.

edit:
Quote:


> Originally Posted by *madmeatballs*
> 
> I already did that, same vfcurve and mem oc scored 10060 while I scored 10193 with stock FE bios on the same oc and voltages.
> 
> Recap photo for my FE bios oc


those scores can vary by 150 points from run to run sometimes...


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> _S1080Ti XOC (Extreme Overclock) BIOS:_
> 
> S1080TIXOC.zip 1820k .zip file
> 
> 
> -Allows voltage up to 1.2v
> -Completely removes power limit. _I was drawing 450 watts in Furmark on my Founders Edition. Stock maximum is 300 watts_
> 
> Warning:
> -Temps will increase
> -You lose 20% of your maximum fan speed on a FE
> -Don't use with cheap PSU PCI-E power cables. If cables aren't of the ATX-recommended gauge, they could overheat and cause a fire due to the increased power draw.
> -Using this BIOS invalidates your warranty. Do so at your _own_ risk + expense.
> -*You will lose one DisplayPort.* If your monitor is plugged into the one that's disabled, just switch your cable over to another _(don't let this fool you into thinking you've bricked your card)_
> -Removes UPPER & LOWER temperature limits. Caution! The card will overheat *without* throttling. These safeguards were disabled to allow the card to also operate normally at extremely LOW temperatures (i.e. -140 degrees C).
> 
> 1080 voltage tool (v.2) for ASUS STRIX:
> 
> s1080_tool_v02.zip 503k .zip file
> 
> 
> -Command line interface
> -Allows voltage much higher than 1.2v _(1.5v possibly; be careful, voltages at those levels without extreme cooling will evaporate transistors on your card's GPU core)._
> -Doesn't work with Founders Edition _(different controllers)_
> 
> 
> 
> More information r.e. extreme O/C LN2 session and original post at hwbot.org:
> http://forum.hwbot.org/showthread.php?t=169488


holy mother of god. the kraken is in the wild


----------



## Benny89

Quote:


> Originally Posted by *nrpeyton*
> 
> Thats strange because you have a STRIX. Which is the same card the LN2 guy used with this BIOS in his article at hwbot on his LN2 run _(Liquid Nitrogen Benching session)_.
> 
> _I wonder if there's a handicap built into the BIOS to prevent full GPU utilisation until a certain lower temperature is reached. I.E. below zero degrees C._ _<--- *only* a theory_


*Well, so far I have tested the XOC BIOS on my STRIX, and it seems like this:*

If I run fans normally (stock or 55%), it will start hitting Power and Voltage PerfCaps close to 58-60C. From there on it will start downclocking.

However, what is interesting is that if I start at 100% fan speed and then start benchmarks, I get Power and Voltage PerfCaps IMMEDIATELY from the start, but clocks will not drop because I keep them at around 51-53C.

Seems like the BIOS does not override the PerfCaps that you normally have on the stock BIOS. It removes the voltage and temp limits, but not the power-limit PerfCap. It may allow the card to draw more, but if your card starts hitting around 317-320W (like on stock) you will get PerfCaps anyway with the XOC BIOS. And if you exceed 60C, the Power PerfCap will make your card downclock anyway.

So when I started at 100% fan speed and higher voltage, the GPU was already at its typical peak draw (317-320W) where PerfCaps start to pop up (as normally the STRIX has a 330W limit). However, since temps were low it did not downclock. But when temps exceeded 58-60C it started to downclock, as always with the Power Limit.

So that BIOS doesn't really "remove all limits"; it removes them only under certain conditions.

*So to all STRIX owners*: please keep that in mind; unless my card is broken, you won't get much from it unless you can keep temps low.


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> nope second one is a command line interface tool, that allows changing of V.CORE, V.MEM and PCI.E voltage (in that order).
> 
> But not sure which cards it's able to talk to yet.
> 
> After you apply the voltage it WON'T show in Afterburner or HWiNFO64 (as those apps only display the voltage selected in the BIOS), but the tool is *overriding* the BIOS. What you *WILL* see is your power draw in HWiNFO64 rise slightly when you increase the voltage offset.
> 
> be careful, remember you're not under extreme cooling.. so don't do anything mental with voltages. they could cause damage at normal temps if you go _too_ high.
> 
> Anyone with a STRIX will be feeling extremely lucky just now lol.. lets hope the BIOS is at least compatible with FE.


yeah, STRIX sales will get a bump today/tomorrow


----------



## KraxKill

Quote:


> Originally Posted by *Hulio225*
> 
> I'm repeating my testing with 2101 / 5508 @ 1075mV.
> Time Spy Game Test 1 (because no Power Limit and **** here)
> Nvidia FE BIOS: AVG 68.0 +/- 0.1 FPS
> 
> XOC BIOS: AVG 67.85 +/- 0.1 FPS
> 
> So my previous testing was somehow corrupted...
> 
> Now the behaviour is as I would have expected in the first place, since I've tested every BIOS out there extensively. And this is now in line with my previous tests with the normal STRIX BIOS and the ASUS FE BIOS, which differ from the Nvidia BIOS.
> 
> So I have to play a bit more with this BIOS now.


Interesting. It could be, as nrpeyton suggested, that this XOC BIOS is just very temperature sensitive, as it's built for LN2 purposes. I wonder if the variance shrinks at lower clocks and bins, where temps are even lower.


----------



## Coopiklaani

Just like the first version of the GTX 1080 XOC BIOS, it performs slightly worse than the stock BIOS at the same clock. Hopefully ASUS will release a new version, like the T4 for the GTX 1080, which fixes this problem.
But at least it removes the power constraint completely, which is good for people who don't want to perform the shunt mod.
For people who already have the shunt mod, the FTW3 BIOS seems to be the better option for now: better performance at the same clock.


----------



## Benny89

Further testing of the XOC BIOS on my STRIX card:

*1. It does not remove the Power Limit totally from the card.*

If I run fans normally (stock or 55%), it will start hitting Power and Voltage PerfCaps close to 58-60C. From there on it will start downclocking.

However, it will keep the max clocks from the curve/OC until that point without downclocking at the earlier temp points (40 and 50C). So the card holds its max clock until then.

However, what is interesting is that if I start at 100% fan speed and then start benchmarks, I get Power and Voltage PerfCaps IMMEDIATELY from the start, but clocks will not drop because I keep them at around 51-53C.

Seems like the BIOS does not override the PerfCaps that you normally have on the stock BIOS. It removes the voltage and temp limits, but not the power-limit PerfCap. It may allow the card to draw more, but if your card starts hitting around 317-320W (like on stock) you will get PerfCaps anyway with the XOC BIOS. And if you exceed 60C, the Power PerfCap will make your card downclock anyway.

*2. You need low temps to prevent downclocking at higher power draw*

So when I started at 100% fan speed and higher voltage, the GPU was already at its typical peak draw (317-320W) where PerfCaps start to pop up (as normally the STRIX has a 330W limit). However, since temps were low it did not downclock. But when temps exceeded 58-60C it started to downclock, as always with the Power Limit.

*3. Temperature is the key*

When keeping the card at 51-52C I managed a full run at 2050 in Heaven, even with Power and Voltage PerfCaps the whole time (with the stock BIOS my best run was 2012, as it downclocked).

So that BIOS doesn't really "remove all limits"; it removes them only under certain conditions.

So to all STRIX owners: please keep that in mind; *unless my card is broken*, you won't get much from it unless you can keep temps low.

This is from testing on a ROG STRIX 1080 Ti OC. Either my card is broken or the BIOS behaves differently on FE cards.


----------



## Hulio225

So further testing done... we are reaching weird territory again:

2101 / 5508 @ 1075mV vs 2151 / 5508 @ 1200mV

They score identically Oo, the same average FPS in Time Spy Game Test 1.









My Afterburner is showing constant clocks, and temps never over 37°C.

What the heck is that lol.

That's both on the XOC BIOS.


----------



## madmeatballs

Quote:


> Originally Posted by *Hulio225*
> 
> So further testing done... we are reaching weird territory again:
> 
> 2101 / 5508 @ 1075mV vs 2151 / 5508 @ 1200mV
> 
> They score identically Oo, the same average FPS in Time Spy Game Test 1.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My Afterburner is showing constant clocks, and temps never over 37°C.
> 
> What the heck is that lol.
> 
> That's both on the XOC BIOS.


I didn't bother to continue testing as it would only waste time. Now we just have to wait for some miracle (again): that some BIOS editor comes out soon.


----------



## OneCosmic

Quote:


> Originally Posted by *Benny89*
> 
> Further testing on XOC Bios on STRIX CARD:
> 
> *1. It does not remove the Power Limit totally from the card.*
> 
> If I run fans normally (stock or 55%), it will start hitting Power and Voltage PerfCaps close to 58-60C. From there on it will start downclocking.
> 
> However, it will keep the max clocks from the curve/OC until that point without downclocking at the earlier temp points (40 and 50C). So the card holds its max clock until then.
> 
> However, what is interesting is that if I start at 100% fan speed and then start benchmarks, I get Power and Voltage PerfCaps IMMEDIATELY from the start, but clocks will not drop because I keep them at around 51-53C.
> 
> Seems like the BIOS does not override the PerfCaps that you normally have on the stock BIOS. It removes the voltage and temp limits, but not the power-limit PerfCap. It may allow the card to draw more, but if your card starts hitting around 317-320W (like on stock) you will get PerfCaps anyway with the XOC BIOS. And if you exceed 60C, the Power PerfCap will make your card downclock anyway.
> 
> *2. You need low temps to prevent downclocking at higher power draw*
> 
> So when I started at 100% fan speed and higher voltage, the GPU was already at its typical peak draw (317-320W) where PerfCaps start to pop up (as normally the STRIX has a 330W limit). However, since temps were low it did not downclock. But when temps exceeded 58-60C it started to downclock, as always with the Power Limit.
> 
> *3. Temperature is the key*
> 
> When keeping the card at 51-52C I managed a full run at 2050 in Heaven, even with Power and Voltage PerfCaps the whole time (with the stock BIOS my best run was 2012, as it downclocked).
> 
> So that BIOS doesn't really "remove all limits"; it removes them only under certain conditions.
> 
> So to all STRIX owners: please keep that in mind; *unless my card is broken*, you won't get much from it unless you can keep temps low.
> 
> This is from testing on a ROG STRIX 1080 Ti OC. Either my card is broken or the BIOS behaves differently on FE cards.



You need a proper waterblock


----------



## Coopiklaani

Quote:


> Originally Posted by *Benny89*
> 
> Further testing on XOC Bios on STRIX CARD:
> 
> *1. It does not remove the Power Limit totally from the card.*
> 
> If I run fans normally (stock or 55%), it will start hitting Power and Voltage PerfCaps close to 58-60C. From there on it will start downclocking.
> 
> However, it will keep the max clocks from the curve/OC until that point without downclocking at the earlier temp points (40 and 50C). So the card holds its max clock until then.
> 
> However, what is interesting is that if I start at 100% fan speed and then start benchmarks, I get Power and Voltage PerfCaps IMMEDIATELY from the start, but clocks will not drop because I keep them at around 51-53C.
> 
> Seems like the BIOS does not override the PerfCaps that you normally have on the stock BIOS. It removes the voltage and temp limits, but not the power-limit PerfCap. It may allow the card to draw more, but if your card starts hitting around 317-320W (like on stock) you will get PerfCaps anyway with the XOC BIOS. And if you exceed 60C, the Power PerfCap will make your card downclock anyway.
> 
> *2. You need low temps to prevent downclocking at higher power draw*
> 
> So when I started at 100% fan speed and higher voltage, the GPU was already at its typical peak draw (317-320W) where PerfCaps start to pop up (as normally the STRIX has a 330W limit). However, since temps were low it did not downclock. But when temps exceeded 58-60C it started to downclock, as always with the Power Limit.
> 
> *3. Temperature is the key*
> 
> When keeping the card at 51-52C I managed a full run at 2050 in Heaven, even with Power and Voltage PerfCaps the whole time (with the stock BIOS my best run was 2012, as it downclocked).
> 
> So that BIOS doesn't really "remove all limits"; it removes them only under certain conditions.
> 
> So to all STRIX owners: please keep that in mind; *unless my card is broken*, you won't get much from it unless you can keep temps low.
> 
> This is from testing on a ROG STRIX 1080 Ti OC. Either my card is broken or the BIOS behaves differently on FE cards.


Hmm, I don't have the problem you described. Under similar temps my card, an FE card with the STRIX XOC BIOS, can draw about 525W of power when running SP 8K. No power wall or downclocking for me.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Initial testing with the new BIOS on my STRIX. The card is now hitting around 1.125V but temps did not change. Still maxing around 60C with 55% fan speed.
> 
> With +55 on the core the card is hitting 2050MHz.
> 
> Temperature downclocks are no longer here. The card does not downclock on temp limits.
> 
> *HOWEVER, what's funny: I still get Power and Voltage Limit in the OSD menu when running Heaven or Witcher 3, and the card downclocks to 2038.*
> 
> The card will still crash if it hits a certain temp. For example, I can play Witcher 3 at 2038 or run Heaven as long as it stays below 62C. Once it hits 64 it crashes.
> 
> After it downclocks to 2038, I am sitting on it with Power and Voltage PerfCaps anyway.
> 
> I have no idea how it can hit any power limit if the power limit is, like, gone totally....
> 
> Soo... conclusion: if you have a bad chip like I have, it will give you nothing really. If you already have a golden one, it will help.
> 
> I am disappointed it still hits the Power PerfCap lol....


that ain't right. you must've got a bad BIOS copy when you didn't disable the driver adapter.

Edit: this
Quote:


> Originally Posted by *nrpeyton*
> 
> hmm something isn't right there.
> 
> Try flashing the BIOS again? But this time disable graphics driver first _(as recommended)_?
> I was drawing 450 watts + with no power limiting. With the stock BIOS it power limits at 300 watts.
> 
> No shunt mod here yet.


----------



## nrpeyton

Quote:


> Originally Posted by *KraxKill*
> 
> Interesting. It could be like NRPeyton suggested that this XOC BIOS may just be very temperature sensitive as it's built for NL2 purposes. I wonder if the variance shrinks at lower clocks and bins where temps are even lower.


aye, pity I can't "DIY mount" my CPU pot to the card, as I'd have been able to find out. lol



they're awfully expensive.


by the time I paid for the pot,
about $30 for shipping from Taiwan _(cheapest option)_
import duties
parcel-force handling fees
then petrol to travel to the depot

I paid about £350 UK pounds at least, all in, which is $454 or €413.

My GPU pot I'm going to try and source second hand.

How many rep points do you need to start trading here on overclock.net?


----------



## KedarWolf

Quote:


> Originally Posted by *OneCosmic*
> 
> Oh wow i am really curious how XOC BIOS will work on FE...if it can hold SUPO 4K/8K and TimeSpy without throttling that would be golden.


Tried it on my FE, couldn't even do 2172 core at 1.2 v.









Went back to FTW3 BIOS.


----------



## KraxKill

Quote:


> Originally Posted by *Hulio225*
> 
> So further testing done.... We are reaching a weird territory again:
> 
> 2101 / 5508 @ 1075mV vs 2151 / 5508 @ 1200mV
> 
> They score identically Oo the same average fps in time spy game test 1
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My Afterburner is showing me constant clocks temps never over 37°C
> 
> what the heck is that lol
> 
> thats both on the XOC bios


That's more reason to believe that this BIOS is coded for power draw vs temp. Maybe it's running 2151 but power-capping in the background down to where 2101 is in terms of draw, and so they score similarly? If so, it's clearly a BIOS designed to protect air-cooled cards and is really built for sub-ambient cooling. Are you drawing the same wattage at the wall?

@nrpeyton, you have that fancy wall meter. Do you mind checking? I have to wait for my rig to get up to ambient.


----------



## hotrod717

Quote:


> Originally Posted by *Hulio225*
> 
> So further testing done.... We are reaching a weird territory again:
> 
> 2101 / 5508 @ 1075mV vs 2151 / 5508 @ 1200mV
> 
> They score identically Oo the same average fps in time spy game test 1
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My Afterburner is showing me constant clocks temps never over 37°C
> 
> what the heck is that lol
> 
> thats both on the XOC bios


Keep in mind that going outside normal operating parameters you may see anomalies. Just like going below -50°C, some cards will drop bins and you may see some GPU-Z issues reading the correct clocks. Sometimes it's like flying blind and figuring them out.


----------



## Hulio225

Quote:


> Originally Posted by *KedarWolf*
> 
> Tried it on my FE, couldn't even do 2172 core at 1.2 v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Went back to FTW3 BIOS.


Will try the FTW3 BIOS too; I could reach 2176 core stable, but scoring the same as the FE BIOS at 2101 lol.
It looks like the BIOS has some problems on FE cards if you go over 2101 on the core clock or something...
I don't understand what's going on there; I can simply say it's extremely strange.


----------



## KraxKill

Quote:


> Originally Posted by *nrpeyton*
> 
> aye, pity I can't "DIY mount" my CPU pot to the card, as I'd have been able to find out. lol
> 
> 
> 
> they're awfully expensive.
> 
> 
> by the time I paid for the pot,
> about $30 for shipping from Taiwan _(cheapest option)_
> import duties
> parcel-force handling fees
> then petrol to travel to the depot
> 
> I paid about £350 UK pounds all in, which is $454 or €413.
> 
> My GPU pot I'm going to try and source second hand.
> 
> How many rep points do you need to start trading here on overclock.net?


DUDE! Get that thing on there!!! That looks legit!


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> that ain't right. you must've got a bad BIOS copy when you didn't disable the driver adapter.
> 
> Edit: this


I did try to fresh flash BIOS again.

1. Downloaded BIOS from: http://forum.hwbot.org/showthread.php?p=486472

2. Removed BIOS protection

3. Disabled GPU in Device Manager

4. Flashed my stock STRIX BIOS using command: nvflash64 -6 STROX.rom

5. Press Y in command. Flashed. Reboot

6. Enable GPU. Reboot.

7. Disable GPU

8. Flashed XOC Bios using command: nvflash64 -6 S1080TIXOC.rom

9. Press Y, Reboot.

So I don't know what could go wrong. It just works the way it works on my STRIX.


----------



## nrpeyton

Quote:


> Originally Posted by *KraxKill*
> 
> That's more reason to believe that this BIOS is coded for power draw vs temp. Maybe it's running 2151 but power-capping in the background down to where 2101 is in terms of draw, and so they score similarly? If so, it's clearly a BIOS designed to protect air-cooled cards and is really built for sub-ambient cooling. Are you drawing the same wattage at the wall?
> 
> @nrpeyton, you have that fancy wall meter. Do you mind checking? I have to wait for my rig to get up to ambient.


hmm, good idea actually; if I do a clock-for-clock and voltage-for-voltage comparison it should give an indication.

I'll grab something to eat, then I'll see what I can do.


----------



## Hulio225

Quote:


> Originally Posted by *Benny89*
> 
> I did try to fresh flash BIOS again.
> 
> 1. Downloaded BIOS from: http://forum.hwbot.org/showthread.php?p=486472
> 
> 2. Removed BIOS protection
> 
> 3. Disabled GPU in Device Manager
> 
> 4. Flashed my stock STRIX BIOS using command: nvflash64 -6 STROX.rom
> 
> 5. Press Y in command. Flashed. Reboot
> 
> 6. Enable GPU. Reboot.
> 
> 7. Disable GPU
> 
> 8. Flashed XOC BIOS using command: nvflash64 -6 S1080TIXOC.rom
> 
> 9. Press Y, Reboot.
> 
> So I don't know what could go wrong. It just work as it works on my STRIX.


idk, try DDU on the drivers and check again


----------



## Slackaveli

Quote:


> Originally Posted by *done12many2*
> 
> Heaven @ 2200 MHz with 1.193v in SLI
> 
> Power draw at the wall peaked at 810w. Snapped off a 803w shot.
> 
> Still tinkering.


well, THAT'S a 15.6Tflop 1080Ti. I'll remember that when I'm trolling plebes on WCCFTech comment section.

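For anyone checking Slackaveli's TFLOP figure: the usual peak-throughput estimate is shader cores x 2 FLOPs per clock (FMA) x clock speed. A minimal sketch follows; the 2200 MHz value is the clock from the quoted run, while 2176 MHz is just an illustrative clock at which the estimate lands near 15.6 TFLOPS.

```python
# Peak FP32 throughput estimate: cores x 2 FLOPs per clock (FMA) x clock.
CUDA_CORES_1080TI = 3584  # GTX 1080 Ti shader count

def peak_tflops(cores: int, clock_mhz: float) -> float:
    """Theoretical single-precision TFLOPS at a given core clock."""
    return cores * 2 * clock_mhz * 1e6 / 1e12

print(f"{peak_tflops(CUDA_CORES_1080TI, 2200):.1f} TFLOPS at 2200 MHz")  # ~15.8
print(f"{peak_tflops(CUDA_CORES_1080TI, 2176):.1f} TFLOPS at 2176 MHz")  # ~15.6
```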

----------



## Benny89

Quote:


> Originally Posted by *Hulio225*
> 
> idk try ddu drivers and check again


I have tried that: uninstalled the drivers and installed new NVIDIA drivers again.


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> Strange.
> 
> That goes against the laws of physics lol.
> 
> It's clocked higher. Memory is clocked higher. Yet is turning out _less_ frames.
> 
> Now's about the time we could really use a scientific answer as to why.
> 
> The only thing I can think of so far is that maybe the overclock or volts are too high for the temperature, and the card is being forced to "error correct", or in other words drop frames, to prevent itself from crashing.
> 
> But you found it quite stable? So that doesn't look like the reason.


i suspect memory errors. he should back that vram down -125


----------



## Hulio225

Quote:


> Originally Posted by *Slackaveli*
> 
> i suspect memory errors. he should back that vram down -125


I already retested with 0 memory OC, and up to 2101 I have in-line results, but 2177 scores the same as 2101 with no memory OC...
this is very strange


----------



## nrpeyton

Quote:


> Originally Posted by *KraxKill*
> 
> DUDE! Get that thing on there!!! That looks legit!


lol, I'm just trying to figure out how I'd do it...

on the GPU pots themselves they have the contact plate (the bit that touches the DIE) on the *SIDE*.

But on a CPU pot it's on the bottom.

I'd have to have the motherboard balanced on its side, with some way of holding the card up so it doesn't get ripped out of the slot when I put the pot down. My pot weighs about 5KG lol.
You really get a feel for the weight of it when you pick it up. Think of it, it's just a big chunk of copper.

*bottom of pot \/*


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> that aint right. you must've got a bad bios copy when you didnt disable teh driver adapter.
> 
> Edit: this


I reflashed the XOC BIOS again for the 4th time, disabling the GPU in Device Manager.

Same PerfCaps from the start of Heaven: Power and Voltage.


----------



## done12many2

Quote:


> Originally Posted by *Slackaveli*
> 
> well, THAT'S a 15.6Tflop 1080Ti. I'll remember that when I'm trolling plebes on WCCFTech comment section.


Too bad it's performing worse than expected, but it is cool seeing those clock speeds.


----------



## OneCosmic

Quote:


> Originally Posted by *Benny89*
> 
> I reflashed XOC Bios again for 4th time, disabling GPU in Device manager.
> 
> Same, PerfCaps from start of Heaven- Power and Voltage.


Maybe it has limits built in above a certain temperature so that it is still a little dummy-proof, because you don't want to have it running 400W+ at 80C+.


----------



## nrpeyton

Quote:


> Originally Posted by *OneCosmic*
> 
> Maybe it has a limits built in above certain temperature so that it is still a little dumbproof, because you don't want to have it running 400W+ at 80C+.


aye exactly, that's one theory we're indeed discussing.

Anyone here just now got a GPU LN2/dry ice evaporation pot?
That could test this BIOS for us on an FE?

Or anyone willing to lend me one? I'll pay postage both ways and report back with findings.

Can't really afford another pot right now. Just spent £870 on new computer parts already this month.


----------



## Slackaveli

Quote:


> Originally Posted by *Hulio225*
> 
> I already made retesting with 0 memory oc and up to 2101 i have inline results, but 2177 scores the same like 2101 with no memory oc...
> this is very strange


yeah, i see that now. initially it looked as if i was correct since you backed down to 5500 and it seemed stable. idk, i think it's just a sub-zero-timed bios and it'll be crap above 30c.


----------



## Slackaveli

Quote:


> Originally Posted by *done12many2*
> 
> Too bad it's performing worse then expected, but it is cool seeing those clock speeds.


yeah, that part sucks, but those nerds will never know that lol


----------



## KedarWolf

My Superposition score on the FTW3 BIOS at 2037 core, 1.000v and +667 memory: a full 200 points more than on the Asus Strix OC BIOS.


----------



## KraxKill

Quote:


> Originally Posted by *nrpeyton*
> 
> hmm good idea actually, if I do a clock for clock and voltage for voltage comparison it should give indication.
> 
> I'll grab something to eat then i'll see what I can do.


Quote:


> Originally Posted by *nrpeyton*
> 
> lol i'm just trying to figure out how i'd do it...
> 
> on the GPU pots themselves they have the contact plate (bit that touches the DIE) on the *SIDE*.
> 
> But on a CPU pot it's on the bottom.
> 
> I'd have to have the motherboard balanced on it's side then some way of holding the card up so it doesn't get ripped out the slot when I put the pot down. My pot weighs about 5KG lol
> You really get a feel for the weight of it when you pick it up. Think of it-- it's just like a big chunk of copper.
> 
> *bottom of pot \/*


Yeah, you'd have to set the mobo on its side and maybe have an acrylic shop make you an L-shaped support bracket for the GPU, on top of which you would place a piece of 1-3cm thick closed-cell foam to brace the card up against the bracket. Then just set the pot on top as usual?

There is a company that makes acrylic ATX trays. You can find them if you look. Cheap. You could grab one of those and take it to a plastics shop (assuming you have one around, obviously); they can usually make you a bracket by bending the acrylic to whatever shape you want. That way you can make a motherboard breadboard you can configure how you want, plus a few pieces of closed-cell foam to bridge gaps, add extra support etc.


----------



## Hulio225

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, i see that now. initially it looks as if i was correct since you backed down to 5500 and it seemed stable. idk, i think it's just a sub zero-timed bios and it'll be crap abve 30c.


This could actually be true. I opened the window and cooled the room down by a couple of degrees... now I don't see over 32°C max load temp, and magically I gain like 0.5 - 0.6 fps, memory still untouched... either it's the temp doing something, or it's a coincidence and something else caused it that I'm not aware of.


----------



## KedarWolf

Also peeps, if you run the .bat file in Task Scheduler on each boot, set it to run one minute after boot, not 30 seconds. If you run it too soon it won't actually correct the TDP from the wrong setting.

I figured that out with the FTW3 BIOS: the TDP is a bit lower on boot than the max it should be, and after the .bat file ran at 30 seconds the 358 power limit still wasn't set until I manually ran it a second time later.


----------



## KraxKill

Quote:


> Originally Posted by *KedarWolf*
> 
> Also peeps, if you run the .bat file in task scheduler on each boot set it to run one minute after boot, not 30 seconds. If you run it too soon it actually won't set the TDP from the wrong setting.
> 
> I figured that out with FTW3 BIOS as the TDP is a bit lower on boot than the max it should be and after the .bat file ran after 30 seconds then I manually ran it a second time later the 358 power limit wasn't set until I manually ran it.


What do you actually set the bat file to? Your currently running BIOS's advertised TDP? I know it changes my GPU-Z reporting to be correct, but other than that is it actually affecting TDP?


----------



## Slackaveli

Quote:


> Originally Posted by *Hulio225*
> 
> This could actually be true, i opened the window and cooled the room down by couple degrees... now i dont see over 32 ° C max load temp. and magically i gain like 0.5 - 0.6 fps, memory still untouched... either its the temp doing something or its coincidence and something else caused that which im not aware of


it's the answer, i'm convinced. Also, it has a 'dummy' protector above 60c, too, I think. Only way I can see Nvidia even allowing it in the wild tbh.

You guys see this? pathetic! they really need to get this TIM crap worked out.
http://wccftech.com/intel-i7-7700k-owners-flood-forums-with-overheating-complaints/?utm_source=wccftech&utm_medium=related


----------



## Hulio225

Quote:


> Originally Posted by *Slackaveli*
> 
> it's the answer, i'm convinced. Also, it has a 'dummy' protector above 60c, too, I think. Only way I can see Nvidia even allowing it in the wild tbh.


And what I think is happening in addition: if you set, let's say, 2178 MHz and you cross the temp where the card normally would downclock, it still does it, but it doesn't show on the graphs in Afterburner... this I have to check by warming up my water though...
but this could be true, since on LN2 the temp bins don't affect anything


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> it's the answer, i'm convinced. Also, it has a 'dummy' protector above 60c, too, I think. Only way I can see Nvidia even allowing it in the wild tbh.
> 
> You guys see this? pathetic! they really need to get this tim crap worked out.
> http://wccftech.com/intel-i7-7700k-owners-flood-forums-with-overheating-complaints/?utm_source=wccftech&utm_medium=related


Pfff...there goes XOC Bios.. Glad I didn't cancel FTW3 preorder today when I was excited about XOC Bios on my STRIX.


----------



## KedarWolf

Quote:


> Originally Posted by *KraxKill*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Also peeps, if you run the .bat file in task scheduler on each boot set it to run one minute after boot, not 30 seconds. If you run it too soon it actually won't set the TDP from the wrong setting.
> 
> I figured that out with FTW3 BIOS as the TDP is a bit lower on boot than the max it should be and after the .bat file ran after 30 seconds then I manually ran it a second time later the 358 power limit wasn't set until I manually ran it.
> 
> What do you actually set the bat file to? Your currently running BIOS' advertised TDP? I know it changes my GPUz reporting to be correct but other than that is it actually affecting TDP?

For FTW3, unzip this and run the .bat; it'll set the TDP to 358.

powerlimit.zip 0k .zip file


You can get your BIOS's max TDP here. Set the '-pl 3**' in the .bat file with Notepad to your max TDP, Save As with 'All Files' selected (not .txt), and save the .bat file.

https://www.techpowerup.com/vgabios/?architecture=&manufacturer=&model=GTX+1080+Ti&interface=&memType=&memSize=&since=

For FTW3 I opened the .bat file in Notepad and changed it to '-pl 358'.

It raises the default max TDP from 355.5 to 358.

You can go to Run in Windows, find Task Scheduler, make a Basic Task to run the .bat, then in Properties have it run one minute after Windows login at highest privileges.

It needs to run on every boot on some BIOSes like the FTW3, or after you reinstall your Nvidia driver.
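For anyone rebuilding the .bat from scratch, it's typically just a one-line nvidia-smi call; a minimal sketch (the install path and the 358 value follow this thread's FTW3 example and may differ on your system):

```bat
@echo off
rem Sketch of the power-limit batch file discussed above (values are examples).
rem nvidia-smi ships with the GeForce driver; the path below is the common
rem default install location and may differ per driver version.
rem Must run with administrator rights or the -pl call will be rejected.
cd /d "C:\Program Files\NVIDIA Corporation\NVSMI"
nvidia-smi -pl 358
```

If flashing a different BIOS, swap 358 for that BIOS's advertised limit from the TechPowerUp table linked above.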


----------



## Slackaveli

Quote:


> Originally Posted by *Hulio225*
> 
> And what i think in addition is happening is, that if you set lets say 2178 MHz and you cross the temp where the card normaly would downclock, it still does it but it doesnt show on graphs in afteburner... this i have to check with warming up my water thou...
> but this could be true since on ln2 temp bins doesnt affect anything


yep yep. that's it right there.


----------



## KedarWolf

Quote:


> Originally Posted by *feznz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Here is with it done by someone else. Different card I think but same kinda thing.
> 
> I am wondering what I am looking at looks like more like a force 3D voltage or LLC mod the resister is rather small for a shunt mod could be a voltage mod but normally you would cut the circuit and solder in a potentiometer to be able to adjust the voltage than a fixed resistor
> And that looks more like the voltage controller 3221
> 
> 
> 
> I always thought the shut shunt was the largest resistor's near the main 6 or 8 pin plug labelled R002 or R005
> The other resistor near the PCIe lane is normally associated with the memory power delivery

Here are the correct shunts to do.


----------



## RavageTheEarth

Quote:


> Originally Posted by *KedarWolf*
> 
> Here is the correct shunts to do.


All 3? I have an Aorus so I don't know what I'm talking about here, but don't people recommend only doing the upper two to avoid drawing too much power? Genuinely curious.


----------



## KedarWolf

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Here is the correct shunts to do.
> 
> All 3? I have an Aorus so I don't know what I'm talking about here, but don't people recommend to only do the top upper two to avoid drawing too much power? Genuinely curious.

Some peeps say do two, some do all three. If you do them at all you should be on water cooling, so I'm thinking all three is fine if you are.


----------



## RavageTheEarth

Quote:


> Originally Posted by *KedarWolf*
> 
> Some peeps say do two, some do all three, if you do them at all should be on water cooling so I'm thinking all three is fine if you are.


I'm jelly. Waiting for EK to announce the block for the Aorus any day now. Can't wait to get this thing under water and see what it can really do! Nice to have the 375w limit out of the box. Have people shunted aftermarket cards like the Aorus? I haven't seen any posts about that, but perhaps I missed them.


----------



## Hulio225

Quote:


> Originally Posted by *RavageTheEarth*
> 
> I'm jelly. Waiting for EK to announce the block for the Aorus any day now. Can't wait to get this thing under water and see what it can really do! Nice to have the 375w limit out of the box. Have people shunted aftermarket cards like the Aorus? I haven't seen any posts about that, but perhaps I missed them.


since shunt mods work the same on every card, i guess there are several people who have done it... maybe ask in the Aorus owners club here on the forums...


----------



## lilchronic

Strix XOC bios on evga FE


----------



## Hulio225

Quote:


> Originally Posted by *lilchronic*
> 
> Strix XOC bios on evga FE


what was your score with nvidia fe bios?


----------



## lilchronic

Quote:


> Originally Posted by *Hulio225*
> 
> what was your score with nvidia fe bios?


this is the FE bios @ same clocks, 2063MHz / 6210MHz, but it had a lot of power throttling down to 2000MHz


----------



## Hulio225

Quote:


> Originally Posted by *lilchronic*
> 
> this is FE bios @ same clocks 2063Mhz / 6210Mhz but had alot of power throttling down to 2000Mhz


Okay, so for non-shunt-modded cards it seems to be nice


----------



## lilchronic

Quote:


> Originally Posted by *Hulio225*
> 
> Okay so for not shunt moded cards it seems to be nice


my max temps are like 48c on water with the XOC; with the FE bios max temp was around 42c


----------



## chef1702

So, I did some testing too. NO SHUNT

All tests were done with 1987 MHz @ 1.0125 V and +300 mem. Not a golden card clock-wise, and mem at +300 to avoid error-correction mistakes. Card was around 40 - 43°C

FE
XOC

 

Okay no difference... moving on

FE
XOC

 

XOC wins because of no power-limit throttling.

FE
XOC

 

Hm, strange min fps with the XOC BIOS. But only 1.2 fps difference. Even if it's real, it's less than 2%. The benchmark takes around 5 minutes; didn't want to run it again, maybe tomorrow.

Now some games; I don't play benchmarks.

Division, Reso Scale 100% 2560 x 1080

FE
XOC

 

XOC one fps more, but no PWR throttling at all, not even with the FE BIOS...

Division, Reso Scale 200% 2560 x 1080

FE
XOC

 

I would say same.

Wildlands, Reso Scale 1 2560 x 1080

FE
XOC

 

Same

Wildlands, Reso Scale 1.5 2560 x 1080

FE
XOC

 

Look at that **** it's freakin' 50.96 to 50.96. Min and max fps differ a bit.

Just for the record: *the curve was the same*. I copied my FE curve point for point onto the XOC BIOS, besides the first 2 points, because they won't go below 1500 MHz; but the card doesn't use those points while benchmarking anyway.

 

Conclusion time:

- in games, no difference

- in benchmarks, it depends on the benchmark and how often you run it

- if you have no shunt mod and PWR throttling happens, then XOC is the way to go.


----------



## fisher6

The XOC bios results in a much more stable OC for me, which is great, but it seems like I need a bigger power supply to let the GPU pull more power. My HXi 750 already shut down my PC twice at 2100 and 2200 with 1.2. Can anyone recommend a 1000W+ PSU?


----------



## Hulio225

Quote:


> Originally Posted by *fisher6*
> 
> XOC bios results in much more stable OC for me which is great but it seems like I need bigger power supply to let the GPU pull more power. My HXi 750 already shut down my PC twice at 2100 and 2200 with 1.2. Anyone can recommend a 1000+ PSU?


are you sure it's the psu? for one card, even with oc, 750w should be okay i would assume

Edit: never thought too much about the psu being too weak... maybe that's my problem as well, since i have just 650w even if it's gold rated... hmm
damn, have no wattmeter at hand atm


----------



## Hulio225

Quote:


> Originally Posted by *Hulio225*
> 
> are you sure its the psu? for one card even with oc, 750w should be okay i would assume
> 
> Edit: never thought to much about psu been to wheak... maybe thats my problem as well, since i have just 650w even its gold rated... hmm
> damn have no wattmeter at hand atm


edit2: now after i thought about it, i guess my psu is too weak haha... assuming my i7 7700k is pulling 200 watts and the gpu 400 or something... damn, i have to get a new psu

lol and now i even quoted myself...
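The back-of-the-envelope math above can be sketched quickly; all the wattage figures here are the guesses from this post, not measurements (a wattmeter is still the real answer):

```python
# Rough PSU budget check using the draws assumed in the post above.
def psu_headroom(psu_watts, draws):
    """Return remaining headroom in watts (negative = undersized PSU)."""
    return psu_watts - sum(draws.values())

# Assumed figures: OC'd 7700K ~200W, shunted 1080 Ti ~400W, rest ~75W.
draws = {"i7_7700k_oc": 200, "gtx1080ti_shunt": 400, "drives_fans_board": 75}
print(sum(draws.values()))       # estimated total system draw
print(psu_headroom(650, draws))  # negative headroom suggests a bigger PSU
```

On those guesses a 650W unit comes out roughly 25W short, which matches the "i guess my psu is too weak" conclusion.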


----------



## KraxKill

Currently looping Heaven at 2200, 1.181v. -5c liquid temps, 12c load temps. Need a lot more testing, but daaaaamn this XOC bios is ridiculous.

I really want to find a stable clock and compare my frame times, but it just seems more fluid compared to my FE bios. Similar to what I experienced on the Palit.


----------



## nrpeyton

Right, due to the lower scores being reported by some people using the XOC BIOS, and a request from someone for a power-usage comparison, I have done the comparison!

Power Draw Comparison
*XOC BIOS.........vs. ......STOCK Founders Edition BIOS*

Settings *same* for both runs:

Voltage: 1.0v
Clock: 1999Mhz (2000 set in curve)
Fan Speed _(adjusted manually to get same RPM on both BIOS's):_ 3700rpm
Memory: +535

Run both the videos simultaneously for comparison!









*XOC BIOS*





*Stock Founders Edition BIOS*


----------



## Hulio225

Quote:


> Originally Posted by *nrpeyton*
> 
> Right due to the lower scores being reported by some people using the XOC BIOS and a request from someone for a power usage comparison I have done the comparison!
> 
> Power Draw Comparison
> *XOC BIOS.........vs. ......STOCK Founders Edition BIOS*
> 
> Settings *same* for both runs:
> 
> Voltage: 1.0v
> Clock: 1999Mhz (2000 set in curve)
> Fan Speed _(adjusted manually to get same RPM on both BIOS's):_ 3700rpm
> Memory: +535
> 
> Run both the videos simultaneously for comparison!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *XOC BIOS*
> 
> 
> 
> 
> 
> *Stock Founders Edition BIOS*


after this, even if you go up in voltage (and i mean, this plug is measuring the whole system)... i don't think 650w or even 750w is a limiting factor anymore...


----------



## KraxKill

Thanks for doing that. Looks like the XOC is pulling about 10w more?


----------



## done12many2

I wanted to compare "out of the box" clock speed performance to see if there were any differences between the FTW3 and XOC.

I flashed the FTW3 BIOS and ran Valley, Heaven and Superposition 8k twice each. I repeated this with the XOC BIOS. No changes were made to anything else and both BIOS versions defaulted to a 1999.5 MHz clock speed.

FTW3
Valley: 7540 - 180.2
Run 2: 7546 - 180.4

XOC
Valley: 7545 - 180.3
Run 2: 7547 - 180.4

FTW3
Heaven: 5866 - 232.9
Run 2: 5864 - 232.8

XOC
Heaven: 5868 - 233
Run 2: 5865 - 232.8

FTW3
SP 8k: 8708 - 65.14
Run 2: 8716 - 65.2

XOC
SP 8k: 8917 - 66.7
Run 2: 8920 - 66.72

The only change was in SP 8k, where the lack of a power limit with the XOC BIOS allowed the clock speed to remain at 1999.5 MHz, compared to the slight fluctuation occurring with the FTW3 BIOS. The degradation in performance scaling that we are all seeing with the XOC definitely occurs when we start getting crazy with manually pushing voltage or clock speed. I'm sticking with the XOC BIOS as there's clearly no loss. Now I just have to figure out how to make it work without triggering whatever is causing the drop in performance.
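Plugging the runs above into a quick percent-difference check (the helper name is mine; scores are copied from this post) bears that out: Valley and Heaven land within noise, while Superposition 8K gains roughly 2.4%:

```python
# Percent gain of XOC over FTW3, averaged over the two runs quoted above.
def pct_gain(ftw3_runs, xoc_runs):
    ftw3 = sum(ftw3_runs) / len(ftw3_runs)
    xoc = sum(xoc_runs) / len(xoc_runs)
    return (xoc - ftw3) / ftw3 * 100

print(round(pct_gain([7540, 7546], [7545, 7547]), 2))  # Valley: ~0.04
print(round(pct_gain([5866, 5864], [5868, 5865]), 2))  # Heaven: ~0.03
print(round(pct_gain([8708, 8716], [8917, 8920]), 2))  # SP 8k: ~2.37
```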


----------



## nrpeyton

Quote:


> Originally Posted by *KraxKill*
> 
> Thanks for doing that. Looks like the XOC is pulling about 10w more?


Yeah, it did look that way.

Roughly, yeah, and temps were the same too. They weren't that far off each other in power draw (and sometimes the meter was showing equal draw, give or take 5-10w).

In HWiNFO64 it actually even looked like the stock BIOS was "reporting" a slightly higher draw.

But all in all they didn't look too different at all.

So it still doesn't explain why some people were getting 5% less FPS at the same clocks.


----------



## nrpeyton

Quote:


> Originally Posted by *done12many2*
> 
> I wanted to compare "out of the box" clock speed performance to see the if there were any differences between the FTW3 and XOC.
> 
> I flashed the FTW3 BIOS and ran Valley, Heaven and Superposition 8k twice each. I repeated this with the XOC BIOS. No changes were made to anything else and both BIOS versions defaulted to a 1999.5 MHz clock speed.
> 
> FTW3
> SP 8k: 8708 - 65.14
> 8716 - 65.2
> 
> XOC
> SP 8k: 8917 - 66.7
> *8920* - 66.72
> 
> The only change was in SP 8k where the lack of power limit with the XOC BIOS allowed the clockspeed to remain at 1999.5 MHz compared to slight fluctuation occurring with the FTW3 BIOS. The degradation in performance scaling that we are all seeing with the XOC is definitely occurring when we start getting crazy with manually pushing voltage or clockspeed. I'm sticking with the XOC BIOS as there's clearly no loss. Now I just have to figure out how to make it work without triggering whatever is causing the drop in performance.


*Very interesting.*

+1

Nice one.

This is the first post we've seen with real scores showing a benefit to the XOC in scoring against another BIOS.

Edit:
Now I wonder if you could increase the score on the XOC further by utilising the extra clocks / voltage and removed power limits?

_This is interesting, because I also remember now from last round (1080 non Ti's) that FTW 2's benefited most from the unlocked "T4 XOC 1080 BIOS". More-so than the FE cards._


----------



## Slackaveli

Quote:


> Originally Posted by *fisher6*
> 
> XOC bios results in much more stable OC for me which is great but it seems like I need bigger power supply to let the GPU pull more power. My HXi 750 already shut down my PC twice at 2100 and 2200 with 1.2. Anyone can recommend a 1000+ PSU?


evga g2 or p2 1000w


----------



## done12many2

Quote:


> Originally Posted by *nrpeyton*
> 
> *Very interesting.*
> 
> +1
> 
> Nice one.
> 
> This is the first post we've seen with real scores showing a benefit to the XOC in scoring against another BIOS.


Yeah, it only shows itself in a direct comparison when a power limit situation presents itself, which is exactly what we would expect. Otherwise, it's the same level of performance between the two.

I think some of us got a bit aggressive and took the voltage or clockspeed a little too far since it would actually do it, but didn't actually net any gain. It seems like when pushed too far, the opposite happens. There's a sweet spot in there somewhere.


----------



## fisher6

Quote:


> Originally Posted by *Hulio225*
> 
> edit2: now after i thought about it, i guess my psu is to weak haha... assueming my i7 7700k is puling 200 watts and the gpu 400 or something... damn i have to ge t a new psu
> 
> lol and now i even quoted myself...


I can use Corsair Link with my PSU, and with the XOC bios it shows close to 600W at only 2000MHz while playing ME:A. If I set it to 2100 and 1.2, the PC shuts down as soon as I launch the game (although Valley works fine).

Quote:


> Originally Posted by *nrpeyton*
> 
> Yeah, it did look that way.
> 
> Yeah roughly, temps were the same too. But they weren't that farr off each other in power draw. (and sometimes meter was showing equal draw, give or take 5-10w).
> 
> In Hwinfo64 it actually even looked like the stock BIOS was "reporting" a slightly higher draw in hwinfo64.
> 
> But all-in-all it they didn't look too different at all.
> 
> So still doesn't explain why some people were getting 5% less FPS at same clocks.


This is really confusing and seems to contradict what I'm getting. Maybe it's different from GPU to GPU?


----------



## feznz

Lower scores with XOC don't surprise me; it's probably something to do with looser memory timings allowing higher memory clocks and helping with stability. Remember this bios is aimed at LN2 or sub-zero; even at 10°C you probably won't see too many benefits.
Unless your card is hitting power limits at very low clocks; then this bios will help.


----------



## nrpeyton

Quote:


> Originally Posted by *fisher6*
> 
> I can use corsair link with my PSU and with XOC bios it shows close to 600W at only 2000Mhz while playing ME:A. If I set it to 2100 and 1.2, the PC shutsdown as as soon as I launch the game (although valley works fine).
> This is really confusing and seems to contradict what I'm getting. Maybe it's different from GPU to GPU?


To be fair, 1.2v is probably far too much for air.. and probably still too much for water.

These bigger GPUs can handle the voltage a bit better than their older 1080 brothers, but at those voltages the GPU may still be getting more than it can handle.. on a 1080 Ti it's maybe not enough to crash it.. but it may be enough to reduce performance due to the GPU having to error-correct itself or drop frames.

Maybe under normal temps 1.093v is enough. On my 1080 I was able to get to well over 2200 at 1.093v *IF* I got temps low enough.
(But that *does* show what the GPU is capable of at stock voltage.)

According to Nvidia, 1.05v is actually the maximum.
1.093v is already over-volted.

Maybe anything beyond that (up to 1.2v+) should be reserved for more extreme methods of cooling.

Maybe a comparison between scores at the same clocks/voltages (XOC vs shunt mod) is what we need to see next?


----------



## Hulio225

Quote:


> Originally Posted by *nrpeyton*
> 
> To be fair, 1.2v is probably far too much for air.. and probably still too much for water.
> 
> These bigger GPU's can handle the voltage a bit better than their older 1080 brothers, but that doesn't mean that at those voltages the GPU is still getting more voltage than it can handle.. on 1080Ti it's maybe not enough to crash it.. but it may be enough to reduce performance due to GPU having to error correct it's self or drop frames.
> 
> Maybe under normal temps; 1.093v is enough. On my 1080 I was able to get to well over 2200 at 1.093v *IF* I got temps low enough.
> (But that *does* show what the GPU is capable of at stock voltage).
> 
> According to Nvidia 1.05v is actually the maximum.
> 1.093v is already over-volted.
> 
> Maybe beyond that (up to 1.2v+ should only be reserved for more extreme methods of cooling).
> 
> Maybe now a comparison between scores at same clocks/voltages (XOC vs SHUNT mod) is what we need to see next?


we already saw that... that's what i have done... up to a certain point both bioses are close to each other; from a certain point on, weird stuff is going on


----------



## Hulio225

Quote:


> Originally Posted by *fisher6*
> 
> I can use corsair link with my PSU and with XOC bios it shows close to 600W at only 2000Mhz while playing ME:A. If I set it to 2100 and 1.2, the PC shutsdown as as soon as I launch the game (although valley works fine).
> This is really confusing and seems to contradict what I'm getting. Maybe it's different from GPU to GPU?


my pc isn't shutting down even though i just have a 650w psu

heavy oc on cpu and gpu with shunt mod... now i don't know if i need a new psu or not... without a wattmeter it's kinda difficult to figure out^^

Edit: those psu calculators on the interwebs are saying i need a new one, but i'm not sure atm tbh^^


----------



## lilchronic

@ 1.2v I made it up to 2139MHz, but that's it; after that it locks up and sometimes I get a shutdown. The weird part is that even at 2139MHz my scores were lower than what I did at 2063MHz, 1.081v.









I just went back to the FE bios and run 2000MHz @ 0.975v with a max temp of 38c. With the XOC bios @ 1.2v, 2139MHz, my temps were hitting 53c.


----------



## Hulio225

Quote:


> Originally Posted by *lilchronic*
> 
> @ 1.2v i made it up to 2139Mhz but that's it after that it loock's up and sometimes i get a shut down. The weird part is even at 2139Mhz my scores were lower than what i did at 2063Mhz 1.081v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I just went back to FE bios and run 2000Mhz @ .975v with max temp 38c. With the XOC bios @ 1.2v 2139Mhz my temps were hitting 53c.


yeah, i had that weird stuff too; got lower scores with higher core clocks...


----------



## nrpeyton

That's my point.

Everyone is using the XOC to push clocks + voltages as high as they'll go, and the GPU doesn't like the extra voltage. So scores are lower.

But what about testing the XOC at a maximum of 1.093v and just enjoying the higher power targets? Will it still score lower then?

@Hulio225 sorry; I don't recall from your earlier post what clocks/voltages you were testing at when you had the XOC flashed. Were you above 1.093v?

Like @done12many2 said, there's a sweet spot.

Or maybe its true calling is just LN2 / dry ice.

I am only exploring it as a possible alternative to a shunt mod, instead of utilising its extra voltage. (Unless you have some dry ice or LN2, of course.)


----------



## lilchronic

Quote:


> Originally Posted by *Hulio225*
> 
> yeah i had those weird stuff too, got lower scores with higher core clocks...


It's not worth the extra heat in my opinion, especially when you are seeing no improvements in benchmarks.


----------



## Hulio225

Quote:


> Originally Posted by *nrpeyton*
> 
> That's my point.
> 
> Everyone is using the XOC to push clocks + voltages as high as they'll go and the GPU doesn't like the extra voltage. So scores are lower.
> 
> But what about testing the XOC at a maximum of 1.093v and just enjoying the higher power targets. Will it still score lower then?
> 
> @Hulio225 sorry; I don't recall from your earlier post what clocks/voltages you were testing at when you had the XOC flashed? Were you above 1.093v?


2101 @ 1075mV, no mem OC, both on the FE bios and XOC...
FE was a tiny, tiny bit better, but not worth mentioning... and after I started to go higher in voltage and clock I still got the same scores, until I opened my window and cooled the room down a bit; then I got more points again, but still wasn't able to reach the same scores as with FE max OC on ram and core... so I'm confused and don't know what's up ^^

edit: we came to a probable conclusion that maybe Afterburner isn't showing downclocks from temps while on the XOC bios; that's why you think you are on higher clocks, but in reality you are maybe throttled... who knows how the XOC bios is coded... it's meant for LN2... it's a bit weird how it behaves from a certain point on


----------



## lilchronic

Quote:


> Originally Posted by *nrpeyton*
> 
> That's my point.
> 
> Everyone is using the XOC to push clocks + voltages as high as they'll go and the GPU doesn't like the extra voltage. So scores are lower.
> 
> But what about testing the XOC at a maximum of 1.093v and just enjoying the higher power targets. Will it still score lower then?
> 
> @Hulio225 sorry; I don't recall from your earlier post what clocks/voltages you were testing at when you had the XOC flashed? Were you above 1.093v?


I worked my way up from 1.081v, 2063MHz, and everything higher than that started to score lower.
This was my best run @ 2063MHz / 6210MHz, 1.081v, with the XOC bios.
Anything higher and all my scores were in the 10450 - 10490 range for 4k Supo.

Time Spy and FSU had the same results.


----------



## KedarWolf

I've officially given up on the XOC BIOS on my FE. At 2088 core, +642 memory (got a TDR BSOD at my regular +667), I score 50 points less in Superposition than at 2037 core, +667 memory on the FTW3 BIOS.

At 2037 core, +667 memory I get 100 points less, and if I try higher voltages or overclocking over 2100 the system is completely unstable.
At 2037 core, +667 memory I get 100 points less and if I try higher voltages and overclocking over 2100 system is completely unstable.









Going back to FTW3 BIOS.


----------



## Slackaveli

Quote:


> Originally Posted by *lilchronic*
> 
> I worked my way up from 1.081v 2063Mhz and everything higher than that started to score lower.
> This was my best run @ 2063Mhz / 6210Mhz 1.081v with XOC bios
> Anything higher and all my scores were in the 10450 - 10490 for 4k supo
> 
> Time spy and FSU had the same result's.


but what memory overclock? you may need to dial that back ~ 20%
Quote:


> Originally Posted by *KedarWolf*
> 
> I've officially given up on the XOC BIOS on my FE, 2088 core +642 memory (got a TDR BSOD at my regular +667) I score 50 points less in Superposition than 2037 core +667 memory FTW3 BIOS.
> 
> At 2037 core, +667 memory I get 100 points less and if I try higher voltages and overclocking over 2100 system is completely unstable.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Going back to FTW3 BIOS.


----------



## nrpeyton

Quote:


> Originally Posted by *Hulio225*
> 
> 2101 @ 1075mV no mem oc both on fe bios and xoc...
> fe was a tiny tiny bit better, but not worth mentioning... and after i started to go higher in voltage and clock i still got the same scores until i opened my window and cooled the room down a bit then i got more points again, but still wasn't able to reach the same scores as with fe max oc ram and core... so im confused and dont know whats up ^^


Okay.

All we can do then, I suppose, is either:
1) keep tinkering around with it, for those of us who enjoy that
1.1) if anyone has access to extreme cooling, it would be good to see if the BIOS helps in that regard
1.2) I fully intend to test that myself as soon as finances allow (I already have a CPU pot; the GPU pot is next on my list)

2) However, most will probably prefer to go back to the stock BIOS and run with an undervolt at 2000 MHz for smoother gameplay. I know that's what I'm going to do in the short term.

After I get my water block, I'll fire up my water chiller at sub-zero temps and do another comparison.

Then later, in a month or two, I do plan on getting my GPU really cold ;-)


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> I've officially given up on the XOC BIOS on my FE, 2088 core +642 memory (got a TDR BSOD at my regular +667) I score 50 points less in Superposition than 2037 core +667 memory FTW3 BIOS.
> 
> At 2037 core, +667 memory I get 100 points less and if I try higher voltages and overclocking over 2100 system is completely unstable.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Going back to FTW3 BIOS.


You too. You need to try lower memory clocks. The timings are different and you can't push the memory as far. Hulio said the memory timings were looser, but I think they are TIGHTER. Back off on the memory.


----------



## lilchronic

Quote:


> Originally Posted by *Slackaveli*
> 
> but what memory overclock? you may need to dial that back ~ 20%


My scores are even lower with lower mem clocks....


----------



## Slackaveli

Quote:


> Originally Posted by *lilchronic*
> 
> I have score's are even lower with lower mem clocks....


It's odd; I just think there will be a different "sweet spot", 100% for sure. Where that falls, idk.


----------



## nrpeyton

OCCT would be perfect to see how it affects memory overclocks.


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> OCCT would be perfect to see how it affects memory overclocks.


agreed


----------



## KraxKill

Ok, after testing this BIOS for a bit, and after my excitement over new MEGA CLOCKS subsided, I conclude that the XOC too sucks when compared to the vanilla FE BIOS.

My scores were lower clock for clock, volt for volt, when compared to the FE.

Time Spy Test 2 @ 2100 MHz and 1.063 V scores consistently higher on the FE BIOS compared to the XOC. I think people may be lying to themselves, but I can have the card at sub-zero under load with my chiller/chillbox, and no matter what settings I tried, I cannot beat my FE BIOS scores when isolating the GPU in the 3DMark stress tests.

I can bench at 2200+ on the XOC BIOS, which looks cool and makes my wee wee slightly larger, but in order for my scores to equal my FE BIOS scores I would have to run the card at 2300 MHz+ at volts not even this BIOS will allow.

On the FE BIOS, I can go all the way up to 2186 at just 1.093 V, and at 2100 and just 1.050 V my scores are higher than anything I could pull on the XOC. I used Fraps to compare frame times, which also appear to be identical or even better on my FE, despite my initial impression to the contrary.

So if you don't let the numbers go to your head, we're talking a whole 150+ MHz disparity between running the XOC and just running the FE BIOS.

For me to beat my scores on the FE at 2186, I would need 2300 MHz+ on LN2, and even more voltage than the BIOS can provide on its own.


----------



## nrpeyton

Quote:


> Originally Posted by *KraxKill*
> 
> Ok after testing this BIOS for a bit and my excitement of new MEGA CLOCKS subsided I conclude that the XOC too sucks when compared to the vanilla FE Bios.
> 
> My scores were lower clock for clock, volt for volt when compared to the FE.
> 
> Time Spy Test 2 @ 2100 mhz and 1.063v scores consistently higher on the FE BIOS compared to the XOC. I think people may be lying to themselves but I can have the card at sub zero under load with my chiller/chillbox and no matter what settings I tried, I cannot beat my FE BIOS scores when isolating GPU in the 3Dmark stress tests.
> 
> I can bench at 2200+ on the XOC Bios which looks cool and makes my wee wee slightly larger but in order for my scores to equal my FE BIOS scores, I would have to run the card at 2300mhz+ at volts not even this BIOS will allow.
> 
> On the FE BIOS, I can go all the way up to 2186 at just 1.093v and at 2100 and just 1.050v, my scores are higher than anything I could pull on the XOC. Used Fraps to compare frame times which also appear to be identica or even better on my FE despite my initial impression to the contrary.
> 
> So if you don't let the numbers go to your head we're talking a whole 150mhz disparity between running the XOC and just running the FE BIOS.
> 
> For me to beat my scores on the FE at 2186, I would need 2300mhz+ NL2 and even more voltage than the bios can provide on it's own.


What's your core temp with Chill Box maxed out, at average 24/7 gaming loads?


----------



## Hulio225

Quote:


> Originally Posted by *KraxKill*
> 
> Ok after testing this BIOS for a bit and my excitement of new MEGA CLOCKS subsided I conclude that the XOC too sucks when compared to the vanilla FE Bios.
> 
> My scores were lower clock for clock, volt for volt when compared to the FE.
> 
> Time Spy Test 2 @ 2100 mhz and 1.063v scores consistently higher on the FE BIOS compared to the XOC. I think people may be lying to themselves but I can have the card at sub zero under load with my chiller/chillbox and no matter what settings I tried, I cannot beat my FE BIOS scores when isolating GPU in the 3Dmark stress tests.
> 
> I can bench at 2200+ on the XOC Bios which looks cool and makes my wee wee slightly larger but in order for my scores to equal my FE BIOS scores, I would have to run the card at 2300mhz+ at volts not even this BIOS will allow.
> 
> On the FE BIOS, I can go all the way up to 2186 at just 1.093v and at 2100 and just 1.050v, my scores are higher than anything I could pull on the XOC. Used Fraps to compare frame times which also appear to be identica or even better on my FE despite my initial impression to the contrary.
> 
> So if you don't let the numbers go to your head we're talking a whole 150mhz disparity between running the XOC and just running the FE BIOS.
> 
> For me to beat my scores on the FE at 2186, I would need 2300mhz+ NL2 and even more voltage than the bios can provide on it's own.


Amen, another one with a shunt mod concluding the same as I did... the FE is still the way to rock with a shunt on an FE card, haha.


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> Ok after testing this BIOS for a bit and my excitement of new MEGA CLOCKS subsided I conclude that the XOC too sucks when compared to the vanilla FE Bios.
> 
> My scores were lower clock for clock, volt for volt when compared to the FE.
> 
> Time Spy Test 2 @ 2100 mhz and 1.063v scores consistently higher on the FE BIOS compared to the XOC. I think people may be lying to themselves but I can have the card at sub zero under load with my chiller/chillbox and no matter what settings I tried, I cannot beat my FE BIOS scores when isolating GPU in the 3Dmark stress tests.
> 
> I can bench at 2200+ on the XOC Bios which looks cool and makes my wee wee slightly larger but in order for my scores to equal my FE BIOS scores, I would have to run the card at 2300mhz+ at volts not even this BIOS will allow.
> 
> On the FE BIOS, I can go all the way up to 2186 at just 1.093v and at 2100 and just 1.050v, my scores are higher than anything I could pull on the XOC. Used Fraps to compare frame times which also appear to be identica or even better on my FE despite my initial impression to the contrary.
> 
> So if you don't let the numbers go to your head we're talking a whole 150mhz disparity between running the XOC and just running the FE BIOS.
> 
> For me to beat my scores on the FE at 2186, I would need 2300mhz+ NL2 and even more voltage than the bios can provide on it's own.


It is good for ONE thing: take some screenshots. E-peen for those trolling forums, lol. And yeah, you and Hulio prove that golden cards don't benefit from it (sans [email protected] anyway).

Could be useful at regular volts for an extremely power-limited mediocre card, if you didn't have to curve-lock it to achieve it. Oh well.


----------



## Hulio225

Quote:


> Originally Posted by *Slackaveli*
> 
> It is good for ONE thing. Take some screenshots. E-Peen for those trolling forums lol. and, yeah, you and Hulio prove that golden cards dont benefit from it (sans [email protected] anyway).
> 
> Could be useful at regular volts for an extremely powerlimited mediocre card if you didnt have to curve lock it to achieve it. Oh well.


Yeah, on a non-shunt-modded card it can be useful in power-hungry games... but if you go for max bench scores on water and you have a shunt-modded card, there is nothing better than the stock FE NVIDIA BIOS.


----------



## nrpeyton

Aye, that does make sense.

XOC seems to belong in the Extreme Cooling domain.

Or for those who don't want to SHUNT mod.

But what are we going to recommend to newcomers here with FE's, SHUNT MOD or XOC?

I'd vote SHUNT MOD. Then XOC only if the SHUNT MOD isn't a possibility, or if they want to tinker with extreme cooling?

You guys agree with that? Or you think there's still some more tinkering/testing to be done first?


----------



## Hulio225

Quote:


> Originally Posted by *nrpeyton*
> 
> Aye, that does make sense.
> 
> XOC seems to belong in the Extreme Cooling domain.
> 
> Or for those who don't want to SHUNT mod.
> 
> But what are we going to recommend to newcomers here with FE's, SHUNT MOD or XOC?
> 
> I'd vote SHUNT MOD. Then XOC only if SHUNT MOD isn't a possibility or if they want to tinker with Extreme Cooling.?
> 
> You guys agree with that?


Maybe even the FTW3 BIOS: no risks, and KedarWolf has good results with it on a non-shunt-modded FE card.


----------



## KraxKill

Quote:


> Originally Posted by *nrpeyton*
> 
> What's your core temp with Chill Box maxed out, at average 24/7 gaming loads?


Maxed out, I'm at about -4°C at full load on the core with -17 to -20°C liquid, lower than that when gaming (Arma 3), but I run it at about -10°C liquid for 24/7 use the rest of the time, which puts my load temps right around 0.

I figure I can run much lower than that in the winter. Ambient where I am right now is 30°C+, but in the winter it can get to near freezing outside, especially at night, so I may be able to fire off some obscene benches then.


----------



## Hopesolo

SHUNT MOD









2088/6250MHz


----------



## Nico67

Quote:


> Originally Posted by *KraxKill*
> 
> I can bench at 2200+ on the XOC Bios which looks cool and makes my wee wee slightly larger but in order for my scores to equal my FE BIOS scores, I would have to run the card at 2300mhz+ at volts not even this BIOS will allow.
> .


This would be my conclusion too. It's designed to scale upwards from 2300+ clocks, so I think they may have softened the timings through 2100-2200 etc., since there are no gains there. I agree it may benefit some severely power-limited cards and/or games, letting them work better in normal ranges (sub-2100), but make sure to set the curve not to use too much voltage.

It would be interesting to see clock-scaling results from 2050-2200, to see if there is a pattern or point of recovery.


----------



## wammos

Newb looking for a little help...

I have an EVGA FE, no shunt, onto which I flashed the XOC BIOS. I got the voltage slider unlocked by adding this:

;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
; Strix 1080 Ti
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
[VEN_10DE&DEV_1B06&SUBSYS_85E41043&REV_A1]
VDDC_Generic_Detection = 1

to the MSIAfterburner.oem2 file, and enabled the third-party voltage control and voltage monitoring unlocks. I move the Core Voltage slider to +100, but when I test the card it is stuck at 1.062 V.

I must be missing something?


----------



## KraxKill

Quote:


> Originally Posted by *wammos*
> 
> newb looking for a little help...
> 
> I have a EVGA FE, no shunt that I flashed the XOC bios. I got the voltage slider unlocked adding this:
> 
> ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
> ; Strix 1080 Ti
> ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
> [VEN_10DE&DEV_1B06&SUBSYS_85E41043&REV_A1]
> VDDC_Generic_Detection = 1
> 
> to the MSIAfterburner.oem2 file and the unlock voltage 3rd party and monitoring. I move the Core Voltage slider to +100 but when I test the card it is stuck on 1.062.
> 
> I must be missing something?


I found you have to manually edit the V/F curve by pressing Ctrl+F to bring up the curve editor. Have Valley or something else looping in the background, then pull the last selected bin down just slightly; you should see it select the next bin on the curve.

The +100 slider is just there to allow it to pick the bins between 1.063-1.093 V, but it doesn't force those bins unless dictated by the curve.
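A rough way to picture what that curve edit does (a sketch only; all voltage and frequency values below are made up, not from a real card): locking the card to one voltage bin amounts to flattening every higher-voltage point of the V/F curve down to that bin's frequency, so the boost algorithm has nowhere higher to go.

```python
# Toy model of the "curve lock" trick: an Afterburner V/F curve maps
# voltage bins to core frequencies; locking to a bin flattens every
# point at or above that voltage to the same frequency.
def lock_curve(curve, lock_voltage):
    """Flatten all bins at/above lock_voltage to that bin's frequency."""
    freq = curve[lock_voltage]
    return {v: (freq if v >= lock_voltage else f) for v, f in curve.items()}

# Illustrative curve only (voltage -> MHz), not a real card's values.
curve = {1.000: 1962, 1.031: 2000, 1.063: 2050, 1.093: 2088}
locked = lock_curve(curve, 1.063)
print(locked)  # {1.0: 1962, 1.031: 2000, 1.063: 2050, 1.093: 2050}
```

With the curve locked like this, the card can still pick a higher-voltage bin, but it no longer gains any clock from doing so.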


----------



## fisher6

Totally agree with what has been mentioned with regard to the XOC BIOS. My card is not shunt-modded, so I'm just gonna use the XOC BIOS to avoid the power limiting I get with the FE, and lock the voltage as low as possible for 2000 MHz.


----------



## Hulio225

.


----------



## lilchronic

Quote:


> Originally Posted by *fisher6*
> 
> Totally agree with what has been mentioned with regards to the XOC bios. My card is not shunt modded so I'm just gonna use the XOC bios to avoid power limiting I get with the FE and lock the voltage as low as possible for 2000Mhz


You can do the same thing with the Founders FE BIOS and get better results at the same clocks.


----------



## Luckbad

Memory has proven the most interesting overclocking variable for the 1080 Ti.

After excessive tinkering, I have found my real day to day sweet spot at just over 6100 MHz.

I can bench considerably higher even without voltage increases and even score higher.

However, using a few games and the Heaven benchmark with minimal framerate variance (paused on a scene), I get more FPS fluctuation at higher memory clocks.

2062 core, 6102 memory gives me good temps, rock solid stability, and minimal framerate variance. No voltage increase necessary either.

This is on a Zotac Amp Extreme btw.
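For anyone who wants to quantify that kind of framerate fluctuation instead of eyeballing it, here's a minimal sketch. It assumes you have per-frame times in milliseconds (Fraps, mentioned earlier in the thread, can log these); the two sample runs below are invented numbers purely to show the comparison.

```python
import statistics

def frametime_stats(frametimes_ms):
    """Summarise a per-frame log (times in ms): mean FPS plus a jitter
    measure (standard deviation of instantaneous FPS)."""
    fps = [1000.0 / ft for ft in frametimes_ms]
    return statistics.mean(fps), statistics.stdev(fps)

# Two made-up runs with similar average FPS: the "higher mem OC" run
# alternates between fast and slow frames, i.e. more visible stutter.
steady  = [16.6, 16.7, 16.8] * 40
jittery = [14.0, 20.0] * 60
print(frametime_stats(steady)[1] < frametime_stats(jittery)[1])  # True
```

Comparing the jitter number at a few memory offsets is an easy way to find the kind of "sweet spot" described above, independent of the benchmark score.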


----------



## chef1702

Don't know if my post got overlooked because I wrote "Strix" instead of "XOC" above the pictures... I meant XOC.

click

The XOC does well in normal situations: normal clock, normal voltage, *no difference* to the FE BIOS. And when the power limit kicks in, the XOC scores higher. I just played ME:A for an hour and went up to 36x watts at 1987 MHz and 1.0125 V. Not possible with an FE BIOS. Sure, it was just for one or two seconds, but why not use the XOC? When not power limiting it's as good as the FE, and when power limiting it's better...


----------



## Hulio225

Quote:


> Originally Posted by *chef1702*
> 
> Dont know if my post got overlooked because I wrote "Strix" instead of "XOC" above the pictures... i meant XOC
> 
> click
> 
> The XOC doing well in normal situations. Normal clock normal voltage *no difference* to FE Bios. And when power limit kicks in the XOC scores higher. I just played ME:A for an hour and went up to 36x watts on 1987 MHz and 1.0125 V. Not possible with an FE Bios. For sure it was just one or two seconds but why dont use the XOC. When not power limiting its as good as FE and when power limiting its better...


Do you have an average FPS comparison between the FE BIOS and XOC while playing ME:A for an hour, to prove that it is better? ^^
If not, our conclusion still stands that the FE BIOS performs better in terms of FPS etc...
Another thing: there are a lot of use scenarios, and it's hard to compare everything... our conclusions are about overall performance while overclocking.


----------



## wammos

Quote:


> Originally Posted by *KraxKill*
> 
> I found you have to manually edit the vcurve by pressing CTRL-F to bring up the curve. Have Valley or something else looping in the background. Then pull the last selected bin down just slightly, you should see it select the next bin on the curve.
> 
> the +100 slider is just there to allow it to pick the bins between 1.063-1.093, but it doesn't force those bins unless dictated by the curve.


thank you


----------



## KraxKill

Quote:


> Originally Posted by *chef1702*
> 
> Dont know if my post got overlooked because I wrote "Strix" instead of "XOC" above the pictures... i meant XOC
> 
> click
> 
> The XOC doing well in normal situations. Normal clock normal voltage *no difference* to FE Bios. And when power limit kicks in the XOC scores higher. I just played ME:A for an hour and went up to 36x watts on 1987 MHz and 1.0125 V. Not possible with an FE Bios. For sure it was just one or two seconds but why dont use the XOC. When not power limiting its as good as FE and when power limiting its better...


Not sure what your testing methodology is, because it's 5% slower clock for clock, volt for volt. If you're seeing something different from the few people earlier in the thread in back-to-back, repeatable testing, then I'm all ears, but in my testing it seems to operate at about a 5% deficit to the FE BIOS, with a theoretical advantage only coming at around 2300 MHz, where the voltage required for stability would be greater than the BIOS will provide without hard mods, in the 1.3 V+ range. I'm running -17°C idles and can't touch this, which means LN2, exotic phase-change cascades and the like; not air, water, or even chilled water. And this is just to break even.

We're talking: if you're running 2150 on the FE BIOS, you'd require 2300+ MHz just to break even, and that's assuming perfect linear scaling.
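That break-even arithmetic is easy to sanity-check. A back-of-envelope sketch, assuming the ~5% clock-for-clock deficit reported in this thread and perfectly linear scaling of score with core clock (both assumptions, not measurements):

```python
# If the XOC BIOS scores (1 - deficit) of the FE BIOS at the same clock,
# the XOC needs fe_clock / (1 - deficit) to match the FE score.
def break_even_clock(fe_clock_mhz, deficit=0.05):
    """Clock the XOC BIOS would need to match an FE-BIOS score."""
    return fe_clock_mhz / (1.0 - deficit)

print(round(break_even_clock(2150)))  # 2263, i.e. ~2300 MHz once any margin is added
```

So even under the friendliest assumptions, a 2150 MHz FE result takes roughly 2260+ MHz on the XOC just to tie, which matches the "2300+ to break even" estimate above.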


----------



## chef1702

Quote:


> have you average fps comparison between fe bios and xoc while playing 1 hour ME.A to prove that it is better?^^


For what? I benchmarked 5 different scenarios: 2 benchmark programs with different settings, and two games with different resolution scaling. Look at Wildlands: at 1.5 resolution scale, both BIOSes performed *exactly* the same. Even Superposition performed the same at 4K, and at 8K the XOC did better. What more is there to prove? Just because one or two guys raised their mem clock so high that they can't even tell whether it's error correction or not, they say the XOC is not worth it.
I play ME:A capped at 75 fps because of my 75 Hz display, and I know some spots (the ship, the planet map, Havarl with lots of jungle vegetation) where the card runs into the power limit. Now there is no power limit and still 75 fps. If I put 1+1 together, that's a benefit. I mean, even that guy here showed that the XOC is the same in normal situations, *plus* the bonus of no power limit. And he said it correctly: don't go crazy with the clocks. Use your card normally with a good OC and benefit from no power limit. You can't tell me that even if your card could do 2.2 GHz at 1.15x voltage, you would run that 24/7. It's unimportant that the XOC BIOS does worse when pushing clocks that high. With normal usage the XOC does fine.


----------



## done12many2

At stock boost FTW3 and XOC were exactly the same for me except for power limiting conditions such as SP 8k where XOC was slightly better.

I currently just bested my previous SP 8k max score with XOC and am working on my 1080p and 4k scores now.

There is no difference in efficiency of the BIOS. It's just unlocked.


----------



## Hulio225

Quote:


> Originally Posted by *KraxKill*
> 
> Not sure what your testing methodology is, because it's 5% slower clock for clock volt for volt. If you're seeing something different than the few people earlier in the thread in back to back, repeatable testing, then I'm all ears but in my testing it seems to operate at about a 5% deficit to the FE Bios with theoretical advantage only coming at around 2300mhz where the voltage required for stability would be greater than the BIOS will provide without hard mods and in the 1.3+v range. I'm running -17C idles and can't touch this which means NL2, exotic phase change cascades and the like, but not on air, water or even chilled water. And this is just to break even.
> 
> We're talking if you're running 2150 on the FE bios you'd require 2300+mhz or more just to break even and that's assuming perfect linear scaling.


Actually, with the command-prompt XOC tool you can go over 1.5 V; it's just Afterburner that stops at 1.2 V.

But yeah, I have the same results as you on water when I overclock the card higher.


----------



## KraxKill

Quote:


> Originally Posted by *chef1702*
> 
> For what? I benchmarked 5 different szenarios. 2 benchmark programms with different settings and two games with different resoluation scaling. Look at Wildlands. At 1.5 reso scale both bioses performced *exactly* the same. Even Superposition performced the same on 4k and on 8k XOC did better. Whats more to proof? Just because one or two guys who raised their mem clock that high that they cant even tell if its error correction or not say that the XOC is not worth it.
> I play ME:A fps capped at 75 cause 75 Hz and I know some spots (ship, planet card, Havari lots of jungle vegetation) where the card run into power limit. Now there is no power limit and still 75 fps. If i count 1+1 then thats a benefit. I mean even that guy here showed, that the XOC is the same in normal situations *plus* the bonus of no power limit. And he said it correct. Dont go crasy with the clocks. Use your card normal with good oc and benefit from no power limit. You can't tell me even if your card could do 2,2 GHZ with 1.15x Voltage you would ran that 24/7. Its unimportant that the XOC Bios do worse when pushing clocks that high. With normal usage the XOC does fine.


What you said is all sorts of gibberish and endless variables. Why not make it easier? Just run your card at a desired clock in Time Spy and post links to your results with one BIOS vs the other at the same clocks and voltages. Until then, you're just spewing hypotheticals and wishful thinking.

It's going to take a lot more than the above to prove you're the one guy getting higher scores with it.


----------



## done12many2

Quote:


> Originally Posted by *KraxKill*
> 
> What you said is all sorts of gibberish and endless variables. Why not make it easier. Just run your card at a desired clock in Time Spy and post links to your results with one bios vs the other at the same clocks and voltages. Until then you're just spewing hypotheticals and wishful thinking.


I'm getting the same results as @chef1702. With that said, it's possible that others are seeing different results than you. No real need to force your results onto others.


----------



## chef1702

Oh, now it's Time Spy I need to run. Did you read my post completely? All tests (Heaven, Superposition 4K and 8K, The Division, and Wildlands) were at the *same clock* with the *same settings*, the *same curve*, and the *same mem*. I don't wanna sound harsh, but I ran 5 different tests at the exact same clocks and settings and the results are clear. And now I'm told I need to run yet another test, and the XOC is rubbish. Hm, maybe we should just leave it here.


----------



## KraxKill

Quote:


> Originally Posted by *chef1702*
> 
> Oh now it's timespy i need to run. You read my post complete? All tests (Heaven, Superposition 4k and 8k, Division and Wildlands) were on the *same clock* and the *same settings* and the *same curve* and the *same mem*. I don't wanna sound harsh but I ran 5 different tests at the exact same clocks and settings and results are clear. And now i got told I need to run another test and the xoc is rubbish. Hm maybe we should just leave it here


Sounds like you have no proof


----------



## Hulio225

Relax guys. On non-shunt-modded cards, "low clock" speeds on the XOC could be worthwhile... high clocks like ours on shunt-modded cards, in the 2100+ range, are a different story though.


----------



## done12many2

Quote:


> Originally Posted by *Hulio225*
> 
> Relax guys, no shunt moded "low clock" speeds on xoc could be worth... high clocks like ours on shunt moded cards in 2100+++ areas are a diffeent story thou


That's what I'm getting at. This isn't a one shoe size fits all type deal. It will even vary within the "high clock" speed cards.


----------



## KraxKill

Quote:


> Originally Posted by *done12many2*
> 
> That's what I'm getting at. This isn't a one shoe size fits all type deal. It will even vary within the "high clock" speed cards.


You do realize the disparity we're talking about here is about 5%. It would be fine if it was a small difference. I'm seeing 65-66 fps at 2100 in Time Spy GPU test 2 on the FE BIOS w/ shunt mod, and 63-64 at 2100 on the XOC in the same test at 0°C load temps, with both BIOSes locked at 1.063 V. Both at +550 on the mem.


----------



## done12many2

Quote:


> Originally Posted by *KraxKill*
> 
> You do realize the disparity we're talking about here is about 5%. It would be fine if it was a small difference. I'm seeing 65-66fps at 2100 in Time Spy GPU test 2 on the FE Bios w/shunt mod and 63-64 at 2100 on the XOC in the same test at 0C load temps. Both at +550 on the mem.


What I am telling you is plain and simple. I am not getting a 5% drop. It's literally dead even for me. I am not saying that you aren't getting a 5% drop, but the simple fact of the matter is that I am not.

I appreciate you sharing your experience with the XOC BIOS.


----------



## Hulio225

Quote:


> Originally Posted by *KraxKill*
> 
> You do realize the disparity we're talking about here is about 5%. It would be fine if it was a small difference. I'm seeing 65-66fps at 2100 in Time Spy GPU test 2 on the FE Bios w/shunt mod and 63-64 at 2100 on the XOC in the same test at 0C load temps wit both BIOS' locked at 1.063v. Both at +550 on the mem.


Worth mentioning that the NVIDIA FE BIOS even power limits in Time Spy game test 2 (even with the shunt mod), while the XOC doesn't, yet the XOC still manages lower average FPS.








shame on you xoc bios shame on you


----------



## OneCosmic

Do you guys use any tweaks for the SuPo scores you are posting, like lowering the graphics quality in the drivers?


----------



## Slackaveli

Quote:


> Originally Posted by *Nico67*
> 
> This would be my conclusion too. Its designed for 2300+ clks scaling upwards, so I think they may soften timings thru 2100-2200 etc to distract people from using it as there is no gains. I agree it may benefit some severely power limited cards and or games letting them work better in normal ranges, sub 2100, but make sure to set the curve not to use to much voltage.
> 
> be interesting to see clk scaling results from 2050 - 2200 to see if there is a pattern or point of recovery.


I bet it's literally bad from 2101-2300...
Quote:


> Originally Posted by *Slackaveli*
> 
> the difference is clearly that those dudes are shunt mod'd.


Quote:


> Originally Posted by *Slackaveli*
> 
> yes they are. 'off' settings on AF,AA,vsync, extra, plus high performance everywhere you can, fixed refresh, and some of them even run their monitors in 720p...


----------



## Slackaveli

..


----------



## Slackaveli

..


----------



## done12many2

Quote:


> Originally Posted by *Slackaveli*
> 
> ..


Quote:


> Originally Posted by *Slackaveli*
> 
> ..


Getting a little trigger happy?


----------



## nrpeyton

Anyone else checked to see if their max stable memory O/C is any different using the XOC yet?
(OCCT is great for testing memory O/C'ing stability).


----------



## Slackaveli

Quote:


> Originally Posted by *done12many2*
> 
> Getting a little trigger happy?


Yeah, it lied to me and said 3 new posts, so I thought I was good to go lol.
Quote:


> Originally Posted by *nrpeyton*
> 
> Anyone else checked to see if their max stable memory O/C is any different using the XOC yet?
> (OCCT is great for testing memory O/C'ing stability).


I really would like that, as my memory is decidedly MEH, but I ain't puttin' that BIOS on an Aorus. Too different; I know it won't play nice.


----------



## KraxKill

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, it lied to me and said 3 new posts so i thought i was good to go lol.
> i really would like that as my memory is decidely MEH, but i aint puttin that bios on an Aorus. too different, i know it wont play nice.


Away from the rig for a min, but has anybody tried to see if the XOC volt tool works to override the voltage while on the FE Bios?


----------



## done12many2

Quote:


> Originally Posted by *KraxKill*
> 
> Away from the rig for a min, but has anybody tried to see if the XOC volt tool works to override the voltage while on the FE Bios?


I've got a feeling that you'll be the first. Not too many people with the cooling capacity like yours to really take advantage of what the tool would offer. Let us know.


----------



## bloodhawk

Quote:


> Originally Posted by *KraxKill*
> 
> Away from the rig for a min, but has anybody tried to see if the XOC volt tool works to override the voltage while on the FE Bios?


Unless you are trying to bench on LN2, do not push it over 1.2 V.
I tried pushing my EVGA FE with the XOC BIOS, and I can easily push clocks to 2126 MHz @ 1.15 V by setting up a V/F curve. With the normal Strix BIOS or stock FE BIOS, the most my card does is 2038 MHz @ 1.093 V.

Again, DO NOT PUSH THE CARD BEYOND 1.2V ON AIR, or even on water for longer periods. Most should be able to hit benchable clocks using sub-1.15 V.

As a side note, my card has the 3 x 10 Ohm resistor Power IC mod.
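For anyone wondering why these current-sense mods raise the power ceiling: the card computes power from the voltage drop across tiny sense shunts, so lowering the effective sense resistance (for example by paralleling another resistor across the shunt) makes the controller under-report the actual draw. A toy calculation with made-up values (real shunt values vary by board; nothing here is measured from a 1080 Ti):

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

def reported_power(actual_watts, r_stock, r_modded):
    # The power controller still assumes the stock shunt value, so the
    # sensed current (and hence the reported power) scales by the ratio
    # r_modded / r_stock.
    return actual_watts * (r_modded / r_stock)

r_stock = 0.005                    # hypothetical 5 mOhm sense shunt
r_mod = parallel(r_stock, 0.005)   # a second 5 mOhm placed across it
print(reported_power(300.0, r_stock, r_mod))  # a real 300 W load reads as ~150 W
```

Which is exactly why a shunt-modded card stops hitting the BIOS power limit: the limit is still enforced, but against a number that is now roughly half the true draw.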


----------



## OneCosmic

What is the highest power you have been running on the FE? I was probably going over 450 W in SuPo 4K/8K at 1.175 V.


----------



## Hulio225

Quote:


> Originally Posted by *bloodhawk*
> 
> Unless you are trying to bench on LN2, do not push it over 1.2V.
> I tried pushing my EVGA FE with the XOC BIOS, and i can easily push clocks to 2126 Mhz @ 1.15V, by easily setting up a VF curve. With the Normal Strix BIOS or Stock FE BIOS, the most my card does is 2038Mhz @ 1.093V.
> 
> Again DO NOT PUSH THE CARD BEYOND 1.2V ON AIR, or even water for longer periods. Most should be able to hit benchable clocks using sub 1.15V .
> 
> As a side note, my card has the 3 x 10Ohm resistor Power IC mod.


KraxKill is on chillbox cooling; his liquid temps are sub-zero, so he will be fine if he goes a bit higher than 1.2 V.


----------



## bloodhawk

Quote:


> Originally Posted by *Hulio225*
> 
> KraxKill is on chillbox cooling his liquid temps are sub zero, he will be good if goes a bit higher than 1.2v


Oh sweet! That would work awesome for him then. I pushed my voltage to 1.3 V for a few seconds to see if it works, and it held pretty well. I did stick my portable AC's vent to the desktop's radiator intake fans. :> (Just in case)


----------



## KraxKill

Quote:


> Originally Posted by *Hulio225*
> 
> KraxKill is on chillbox cooling his liquid temps are sub zero, he will be good if goes a bit higher than 1.2v


I bet you're right, and I probably would be, but my experience with the XOC BIOS wasn't that great.

I'm wondering if the XOC voltage-offset command line tool works while flashed to the vanilla FE BIOS. The XOC is great in terms of outright clocks, I was throwing 2200+ at her no problem, but my scores were a flat line with it regardless of what I did to the voltage.

It would be great if the tool could apply an offset to any BIOS; then we could grab a couple of extra vbins on the FE while keeping the performance we've seen the FE BIOS offer over the XOC for those with shunt mods.


----------



## Hulio225

Quote:


> Originally Posted by *KraxKill*
> 
> I bet you're right, I probably would, but my experience on the XOC bios wasn't that great.
> 
> I'm wondering if the XOC voltage offset command line tool works while flashed to the vanilla FE bios. The XOC is great in terms of outright clocks, I was throwing 2200+ at her no problem, but my scores were a flat line with it regardless of what I would do to the voltage.
> 
> It would be great if the tool worked to allow an offset on any BIOS; then we could grab a couple of extra vbins on the FE while keeping the performance we've seen the FE bios offers over the XOC for those with shunt mods.


yeah I know, I know, I had the same results, higher clocks but garbage performance... I wanted to write this:
Quote:


> Originally Posted by *bloodhawk*
> 
> Oh sweet! That would work awesome for him then. I pushed my voltage to 1.3V for a few seconds to see if it works, and it held pretty well. I did stick my portable AC's vent to the desktops radiator intake fans. :> (Just in case)


but we already tested the XOC bios... and it's worse than the FE bios in performance; he just wants to know if the software tool is capable of it...

then i saw new post


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> Anyone else checked to see if their max stable memory O/C is any different using the XOC yet?
> (OCCT is great for testing memory O/C'ing stability).


Checked, no change of my max stable mem OC on the xoc bios.

XOC bios seems to have a programmed shutdown temperature and it is around 54-58C. No matter what voltage/frequency combo I use, 3DM stress test always cancels itself around this temperature range.


----------



## bloodhawk

Quote:


> Originally Posted by *Hulio225*
> 
> yeah I know, I know, I had the same results, higher clocks but garbage performance... I wanted to write this:
> but we already tested the XOC bios... and it's worse than the FE bios in performance; he just wants to know if the software tool is capable of it...
> 
> then i saw new post


Yeah, for the FE cards the scores don't go up by much, because there are still a few limits that plague the FE cards. But I did get a 200 point bump in Time Spy using the XOC + current mod, over the Strix/FE vanilla BIOS.
I'll run more tests with graphs tonight once I'm home.


----------



## Nico67

Quote:


> Originally Posted by *KraxKill*
> 
> You do realize the disparity we're talking about here is about 5%. It would be fine if it was a small difference. I'm seeing 65-66fps at 2100 in Time Spy GPU test 2 on the FE Bios w/shunt mod and 63-64 at 2100 on the XOC in the same test at 0C load temps with both BIOSes locked at 1.063v. Both at +550 on the mem.


I think you'll find it's bad because you're at 2100. I see the same thing, and it gets worse the higher I go. You would need to run tests from 2000 upwards and see how it scales. I think that's why I got bad results with a lot of the other bioses too: they just don't scale well at around 2100 and above, other than the FE.


----------



## Benny89

Is it just me or does it seem like FE generally clocks higher than AIBs?

At least reading this thread hehe


----------



## Hulio225

Quote:


> Originally Posted by *Benny89*
> 
> Is it just me or it seems like FE generally clocks higher than AIBs?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> At least reading this thread hehe


we have very good cooling since day one

edit: you can't compare an AIB 60°C load temp to a watercooled FE card which tops out at 38°C with 19°C ambient or something like that


----------



## KedarWolf

Quote:


> Originally Posted by *Hulio225*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Benny89*
> 
> Is it just me or it seems like FE generally clocks higher than AIBs?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> At least reading this thread hehe
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> we have very good cooling since day one
> 
> edit: you can't compare an AIB 60°C load temp to a watercooled FE card which tops out at 38°C with 19°C ambient or something like that

My watercooled FE, XOC BIOS at 2100, +642 memory scored lower than FTW3 BIOS at 2032 +667 memory in Superposition.


----------



## done12many2

Quote:


> Originally Posted by *Hulio225*
> 
> we have very good cooling since day one
> 
> edit: you can't compare an AIB 60°C load temp to a watercooled FE card which tops out at 38°C with 19°C ambient or something like that


I'd love to have constant 19C temps this time of year, but on cool days when I get down to 20C ambient, my GPUs stay in the high 20's and may hit 30C. They are doing better than my 1080s were.


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> Aye, that does make sense.
> 
> XOC seems to belong in the Extreme Cooling domain.
> 
> Or for those who don't want to SHUNT mod.
> 
> But what are we going to recommend to newcomers here with FE's, SHUNT MOD or XOC?
> 
> I'd vote SHUNT MOD. Then XOC only if SHUNT MOD isn't a possibility or if they want to tinker with Extreme Cooling.?
> 
> You guys agree with that? Or you think there's still some more tinkering/testing to be done first?


I have another saying:
the early bird gets the worm; the second mouse gets the cheese.

As mentioned earlier, my suspicion is loose memory timings to allow higher core clock stability. This is where DICE cooling could come into play, and it would explain the clock-for-clock lower benchmark scores with XOC, though I have no idea how to read memory timings on a GPU.

I would love to see your testing with standard cooling on the CPU; get a PCIe cable extender to use your LN² pot with DICE.

Superposition doesn't need a good CPU. I am tempted to use the standard Asus bios and do the shunt mod to find out if the standard ASUS bios has a voltage cap limit.

I don't care about voiding the warranty; I might just wait a little longer to be fully satisfied. There is no reason to RMA before voiding the warranty while waiting for mother nature to cool things a little more here.

Quote:


> Originally Posted by *KraxKill*
> 
> Sounds like you have no proof


I'd like to see definitive proof too,
but I'm still waiting for the definitive proof in another discussion about RAM, 3600MHz 16-16-16 vs 4133 19-19-19. I firmly believe that the 4133 would run 3600 15-15-15 or better at stock volts; I have no proof, just a hunch.


----------



## Luckbad

Quote:


> Originally Posted by *Benny89*
> 
> Is it just me or it seems like FE generally clocks higher than AIBs?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> At least reading this thread hehe


Nah. It's more common and a lot of people here put the FE on water, so you'll see more people with high clocks.


----------



## bloodhawk

Quote:


> Originally Posted by *KedarWolf*
> 
> My watercooled FE, XOC BIOS at 2100, +642 memory scored lower than FTW3 BIOS at 2032 +667 memory in Superposition.


Yeah there are way more limits in place. Unfortunately no method of bypassing them without voiding the warranty.


----------



## Slackaveli

Quote:


> Originally Posted by *feznz*
> 
> I have another saying;
> the early bird gets the worm the second mouse gets the cheese
> 
> mentioned it earlier and my suspicions are loose memory timings to allow higher core clock stability this is DICE cooling could come into play and explains the clock for clock lower benchmark scores with XOC though I have no idea how to read memory timings on a GPU
> 
> I would love to see your testing with standard cooling on CPU get a PCIe cable extender to use your LN² pot with DICE
> 
> superposition don't need a good CPU I am tempted to use the standard Asus bios do the shunt mod to find out if the standard ASUS bios has a voltage cap limit.
> 
> I don't care about voiding the warranty; might just wait a little longer to be fully satisfied. There is no reason to RMA before voiding the warranty while waiting for mother nature to cool a little more here.
> I'd like to see definitive proof too,
> but I'm still waiting for the definitive proof in another discussion about RAM, 3600MHz 16-16-16 vs 4133 19-19-19. I firmly believe that the 4133 would run 3600 15-15-15 or better at stock volts; I have no proof, just a hunch.


makes sense to me (the ram), it has to at least be the same sammy b-dies but binned.


----------



## Hulio225

Quote:


> Originally Posted by *Slackaveli*
> 
> makes sense to me (the ram), it has to at least be the same sammy b-dies but binned.


as I said, binning for MHz doesn't guarantee tight timings...

and 3600 15-15-15 isn't that hard at stock voltages; the 4133 kit will do it in most cases, like the 3600 kit as well...

since I just have a 3600C16 kit I can't show you what a 4133 can and can't do... I just stated the experience of people who have binned over 100 kits


----------



## done12many2

Quote:


> Originally Posted by *Hulio225*
> 
> as I said, binning for MHz doesn't guarantee tight timings...
> 
> and 3600 15-15-15 isn't that hard at stock voltages; the 4133 kit will do it in most cases, like the 3600 kit as well...
> 
> since I just have a 3600C16 kit I can't show you what a 4133 can and can't do... I just stated the experience of people who have binned over 100 kits


In addition to that, primaries are only a small part of the story.


----------



## feznz

I know next to nothing about tertiary RAM timings, I have had a play, but to my way of thinking it's a bit like the hot tip for the winning horse that came from who-knows Uncle Bob, who has the supplier's contract to the stable and talks to....... hey, I like black and white proof, just me.
They are binned higher; maybe someone is downplaying how good they are because they are a little elusive. Who knows why someone would bin over 100 kits; it just sounds like a hot tip to me.


----------



## Hulio225

Quote:


> Originally Posted by *feznz*
> 
> I know next to nothing about tertiary RAM timings, I have had a play, but to my way of thinking it's a bit like the hot tip for the winning horse that came from who-knows Uncle Bob, who has the supplier's contract to the stable and talks to....... hey, I like black and white proof, just me.
> They are binned higher; maybe someone is downplaying how good they are because they are a little elusive. Who knows why someone would bin over 100 kits; it just sounds like a hot tip to me.


you know hwbot? those top guys are even binning motherboards... if you want world records you bin almost every part...
believe it or not, I don't care; I can't prove it, I just repeated what those guys discussed on the hwbot forums.
you could be right, but a lot of peeps are getting good results with 3600 and 3733 kits, myself included


----------



## feznz

Quote:


> Originally Posted by *Hulio225*
> 
> you know hwbot? those top guys are even binning motherboards... if you want world records you bin almost every part...
> believe it or not, I don't care; I can't prove it, I just repeated what those guys discussed on the hwbot forums.
> you could be right, but a lot of peeps are getting good results with 3600 and 3733 kits, myself included


Look I will leave it at I will be kicking myself on race day when the hot tip comes in







I do appreciate your input Thanks


----------



## KedarWolf

https://www.techpowerup.com/vgabios/191335/zotac-gtx1080ti-11264-170330

This 384W TDP Zotac BIOS works with our non-shunted FEs, but I'm not getting the same performance as the FTW3 BIOS at similar clocks.

Zotac



FTW3


----------



## Luckbad

Quote:


> Originally Posted by *KedarWolf*
> 
> https://www.techpowerup.com/vgabios/191335/zotac-gtx1080ti-11264-170330
> 
> This 384W TDP Zotac BIOS works with our non-shunted FEs, but I'm not getting the same performance as the FTW3 BIOS at similar clocks.
> 
> Zotac
> 
> 
> 
> FTW3


The Zotac has so many weird features, I doubt its BIOS is going to work right on any other card.

Your minimum FPS on there is terrible for the Zotac BIOS; maybe it was trying to give you 384W and couldn't?


----------



## KingEngineRevUp

So what's the general consensus on the FTW3 BIOS? The SC2 worked similarly, if not equal, to the stock FE bios.

Does it play well with the shunt mod? Is everyone getting better or equal scores, or is it YMMV?


----------



## Nico67

Quote:


> Originally Posted by *KedarWolf*
> 
> https://www.techpowerup.com/vgabios/191335/zotac-gtx1080ti-11264-170330
> 
> This 384 TDP Zotac BIOS works with our non-shunted FEs but not getting same performance as FTW3 BIOS at similar clocks.
> 
> Zotac
> 
> 
> 
> FTW3


I found it needed, or could use, higher mem clocks than the FTW3, more like the Strix bios.


----------



## Nico67

Been playing with XOC and found something interesting. Gain per clock was insignificant; however, gain per volt was very noticeable.

2050/6003 @ 1.000v 10207 281w
2050/6003 @ 1.025v 10284 295w
2050/6003 @ 1.050v 10317 313w
2050/6003 @ 1.075v 10346 327w seems enough at this point
2050/6003 @ 1.100v 10333 337w

Started at 2000 and a 10194 score, so still testing, but it seems you need to give it enough voltage even if it's stable. Vdroop or something affecting scores, maybe?
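Nico67's table can be read as a points-per-watt curve; a quick sketch (using only the numbers from the post above) shows where the voltage scaling flattens out and then reverses:

```python
# (voltage, Superposition score, watts) at 2050/6003 on the XOC BIOS, from the post above
runs = [
    (1.000, 10207, 281),
    (1.025, 10284, 295),
    (1.050, 10317, 313),
    (1.075, 10346, 327),
    (1.100, 10333, 337),
]

# marginal points gained per extra watt between consecutive voltage steps
gains = [(s1 - s0) / (w1 - w0) for (_, s0, w0), (_, s1, w1) in zip(runs, runs[1:])]

for (v0, _, _), (v1, _, _), g in zip(runs, runs[1:], gains):
    print(f"{v0:.3f}V -> {v1:.3f}V: {g:+.1f} pts/W")
```

Past 1.075v the marginal gain goes negative, which matches the "seems enough at this point" observation.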


----------



## Norlig

Is it a memory overclock issue if light reflections in games show a bright spot that blinks?


----------



## fisher6

Quote:


> Originally Posted by *Norlig*
> 
> Is it a memory overclock issue, if light reflections in games give away a bright spot that might blink?


Could be, artifacts are usually a sign of unstable memory OC. Try reducing it a little bit and see what happens.


----------



## dunbheagan

Quote:


> Originally Posted by *Norlig*
> 
> Is it a memory overclock issue, if light reflections in games give away a bright spot that might blink?


Yes, that sounds exactly like what I get when I clock my memory too high. I'd say it is a memory issue. At what clock do you have your memory? Try clocking it at least 50MHz lower and see if that helps. Mine starts to artifact at 6250 and crashes at 6300.


----------



## Norlig

Quote:


> Originally Posted by *fisher6*
> 
> Could be, artifacts are usually a sign of unstable memory OC. Try reducing it a little bit and see what happens.


Quote:


> Originally Posted by *dunbheagan*
> 
> Yes, that sounds exactly like what I get when I clock my memory too high. I'd say it is a memory issue. At what clock do you have your memory? Try clocking it at least 50MHz lower and see if that helps. Mine starts to artifact at 6250 and crashes at 6300.


Got it at +525


----------



## Hulio225

Quote:


> Originally Posted by *Norlig*
> 
> Got it at +525


+525 can be too much at 25°C ambient, for example...

this GDDR5X memory is temp sensitive like hell...

for example, if my room is 15°C or lower I can run Time Spy with +850 and still gain fps; if temps are higher I can't go over +800, and at 19 or 20°C I can run max 700-750, depends...

like I said, temps dictate clocks ^^
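The ambient-vs-offset relationship described above can be sketched as a simple interpolation. The data points are Hulio225's own anecdotal figures (725 is the midpoint of the quoted 700-750 range); every card has its own curve, so this is illustrative only:

```python
# Hulio225's anecdotal (ambient °C, max stable mem offset) points from the post above
points = [(15, 850), (20, 725), (25, 525)]

def max_offset(ambient_c):
    """Linearly interpolate a rough max stable GDDR5X offset for a given ambient temp."""
    xs, ys = zip(*points)
    if ambient_c <= xs[0]:
        return ys[0]
    if ambient_c >= xs[-1]:
        return ys[-1]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= ambient_c <= x1:
            return y0 + (y1 - y0) * (ambient_c - x0) / (x1 - x0)

print(max_offset(17.5))  # → 787.5, halfway between the 15°C and 20°C observations
```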


----------



## dunbheagan

Quote:


> Originally Posted by *Norlig*
> 
> Got it at +525


Could be too high. I read about a lot of Tis that do not make it over the 6000MHz border stable. Try +400MHz maybe.


----------



## Nico67

More XOC testing: not only is voltage a big factor (mine seems to like 1.075v), but ram speed is really picky and can change depending on your core clock.

2126/6003 - 10384
2126/6026 - 10234
2126/6059 - 10411
2126/6085 - 10268
2126/6106 - 10432
2126/6132 - 10293
2126/6142 - 10452
2126/6147 - 10309
2126/6166 - 10311
2126/6177 - 10473 yet 2138/6177 - 10316
2126/6180 - 10318 yet 2138/6180 - 10477
2126/6210 - 10485 yet 2138/6210 - 10340
2126/6221 - 10337
2126/6237 - 10499 starting to see artifacts but score seemed to scale ok.

SuPo is not a very good indicator of core scaling, but it does go up a bit

2152/6180 - 10487

also I could game at 2138/6180, at least with TSW, which usually crashes with other bioses at FE clocks 2101/6003.

Probably with a clean driver reinstall it may do better, and hopefully they bring out a better version of the XOC
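The sawtooth in the list above is easy to quantify: the scores fall into two interleaved bands roughly 150 points apart, and almost every memory step flips direction. A minimal sketch over the 2126MHz runs (the 2138MHz variants are left out); whether the cause is something like GDDR5X error handling at alternating offsets is speculation:

```python
# (mem clock, Superposition score) for the 2126MHz core runs from the post above
runs = [
    (6003, 10384), (6026, 10234), (6059, 10411), (6085, 10268),
    (6106, 10432), (6132, 10293), (6142, 10452), (6147, 10309),
    (6166, 10311), (6177, 10473), (6180, 10318), (6210, 10485),
    (6221, 10337), (6237, 10499),
]

scores = [s for _, s in runs]
diffs = [b - a for a, b in zip(scores, scores[1:])]

# count how often the score direction reverses between consecutive mem steps
flips = sum(1 for d0, d1 in zip(diffs, diffs[1:]) if d0 * d1 < 0)
print(f"{flips} reversals in {len(diffs)} steps")  # nearly every step lands in the other band
```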









Has anybody queried the HWBOT guy as to the bios performance?


----------



## hotrod717

I can tell you there is another set of issues once you take these cold. At different temp points and different voltages, FE will experience various bin drops and soft crashes during a benchmark. More than -50° and the card scored worse, and anomalies became almost uncontrollable. Some cards seem to do fine with more than -50° (-130° CB on this card), and for some, -50° is about all you want to go.
Blew through about 30L in 4hrs for only 2 decent scores. And on one I got the screen but forgot the benchmark save, so no boints there. Next time I'm capping at -50° to see if I can get some consistent results. I can say PL was not a factor, as 105 was the highest I saw even at 2200MHz, if it was reading correctly.


----------



## MRCOCOset

Asus GTX 1080 TI + XOC BIOS Video Test


----------



## Benny89

Quote:


> Originally Posted by *MRCOCOset*
> 
> Asus GTX 1080 TI + XOC BIOS Video Test


Is this FE Asus or STRIX?

My STRIX could not get above 320W even at 1.131V.... So if you are on a STRIX, I don't know why this BIOS works differently on your STRIX than mine. I still got the power limit. But you are also below 60C all the time.


----------



## Benny89

*GREAT NEWS!! EVGA Confirmed there will be FTW3 Hybrid Version!!!*


__ https://twitter.com/i/web/status/861355973302116352
Imagine the FTW3 with all its power limit, ICX cooling, and a Hybrid cooler on top of that!









No ETA for now.


----------



## Hulio225

Quote:


> Originally Posted by *Benny89*
> 
> Is this FE Asus or STRIX?
> 
> My STRIX could not get above 320W even at 1.131V.... So if you are on a STRIX, I don't know why this BIOS works differently on your STRIX than mine. I still got the power limit. But you are also below 60C all the time.


The video thumbnail clearly says it's a 1080 Ti *FE*

lol


----------



## MRCOCOset

FE


----------



## madmeatballs

Looks like the FTW3 BIOS beats my FE BIOS scores with almost the same OC. On my FE bios I had the mem OC at +395 and it would give me a score of 10193 in SP. Now with the FTW3 BIOS I can get a 10255 score in SP with +410 mem. I guess my card and @KedarWolf's card are similar.

I used the same VF curve.


----------



## Benny89

Quote:


> Originally Posted by *Hulio225*
> 
> The video thumbnail clearly says its 1080 Ti *FE*
> 
> lol


Had to watch it minimized (I am at work) so I didn't notice. Thanks for that.


----------



## Hulio225

Quote:


> Originally Posted by *madmeatballs*
> 
> Looks like the FTW3 BIOS beats my FE BIOS scores with almost the same OC. On my FE bios I had the mem OC at +395 and it would give me a score of 10193 in SP. Now with the FTW3 BIOS I can get a 10255 score in SP with +410 mem. I guess my card and @KedarWolf's card are similar.
> 
> I used the same VF curve.


Could be; on the other hand it's a long test, and 62 points more or less is kinda standard fluctuation if you benched on your normal 24/7 OS. If benched on a stripped-down bench OS it's another story.

It's mostly better to use short tests which you can repeat several times quickly, like Time Spy test 1 or 2
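Hulio225's point about run-to-run fluctuation can be made concrete: repeat a short test a few times and compare the gap between means against the spread, instead of trusting one long run. A small sketch; the scores below are made-up illustrative numbers, not anyone's actual results:

```python
from statistics import mean, stdev

# hypothetical repeated Time Spy GT1 runs on two BIOSes (illustrative numbers only)
fe_runs   = [10190, 10205, 10181, 10198, 10210]
ftw3_runs = [10240, 10262, 10251, 10233, 10259]

diff = mean(ftw3_runs) - mean(fe_runs)
noise = max(stdev(fe_runs), stdev(ftw3_runs))

# only call it a real win when the gap clearly exceeds run-to-run noise
verdict = "real" if diff > 2 * noise else "within noise"
print(f"gap {diff:.0f} pts vs noise ~{noise:.0f} pts -> {verdict}")
```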


----------



## madmeatballs

Quote:


> Originally Posted by *Hulio225*
> 
> Could be, on the other hand its a long test and 62 points more or less is kinda standard fluctuation if you benched on your normal 24/7 OS. if benched on a stripped down bench OS its another story
> 
> its mostly better to use short tests which you can repeat several times fast, like time spy test 1 or 2


Yes, I will be testing it in 3DMark afterwards. Will probably do the FSU stress test and Time Spy. Also gonna try it in Witcher 3 and probably ME:A; I found that if my OC wasn't stable it would give me a driver crash within 5 mins in the game.


----------



## Sweetwater

Well I flashed the XOC on my Strix and it's glorious. On the original bios I would eventually crash at 2063 and over (2050 was stable) from power limiting in The Witcher 3/Andromeda, but now I'm running @ 2076/6003, 1.050, no crashes or downclocks, and picked up +2fps average and minimum in the Wildlands benchmark too.

Still more testing to do but it's very promising for sure


----------



## dunbheagan

Hey guys, I tried some undervolting; this is how it looks after 3h of PREY


----------



## dunbheagan

will see if can go deeper with voltage


----------



## dunbheagan




----------



## Hulio225

gratz on that triple post


----------



## dunbheagan

i am very sorry, has been a long weekend..


----------



## asdkj1740

Quote:


> Originally Posted by *Benny89*
> 
> *GREAT NEWS!! EVGA Confirmed there will be FTW3 Hybrid Version!!!*
> 
> 
> __ https://twitter.com/i/web/status/861355973302116352
> Imagine the FTW3 with all its power limit, ICX cooling, and a Hybrid cooler on top of that!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No ETA for now.


the TTXP version of the evga hybrid AIO took evga a year to finalize... don't expect it to be available soon...


----------



## Benny89

Quote:


> Originally Posted by *Sweetwater*
> 
> Well I flashed the XOC on my Strix and it's glorious. Original bios I would eventually crash at 2063 and over(2050 was stable) from power limiting in the Witcher 3/ Andromeda but now I'm running @ 2076 / 6003, 1.050 no crashes or downclocks, picked up +2fps avg and minimum in the Wildlands benchmark too.
> 
> Still more testing to do but it's very promising for sure


Glad it worked for you. It didn't work on my STRIX. Still got power limit perf caps no matter what. Clocks could go higher, but they did crash eventually after an hour or two in games.


----------



## KraxKill

Anybody bored with their top bins tried undervolting yet? My 24/7 is 2100 @ 1.063 +550, but if I can bring myself to give up just 3 lousy fps, I can run 2GHz @ just 0.950v over here.


----------



## asdkj1740

really wish to see some gaming review about average fps and frametime//1% low fps


----------



## done12many2

Quote:


> Originally Posted by *KraxKill*
> 
> Anybody bored with your top bins try undervolting yet? My 24/7 is 2100 @ 1.063 +550, but if I can bring myself to give up just 3 lousy fps, I can run 2ghz @ just 0.950v over here.


While running sub ambient temps, right?


----------



## CoreyL4

Quote:


> Originally Posted by *Benny89*
> 
> Glad it worked for you. It didn't work on my STRIX. Still got power limit perfs no matter what. Clocks could go higher but they did crash eventually after hour or two in games.


Is the XOC bios the gigabyte xtreme one? I might try and flash this on my gaming x.


----------



## Hulio225

Quote:


> Originally Posted by *done12many2*
> 
> While running sub ambient temps, right?


Yeah, it always looks like he has "the golden" card, but with liquid temp at -17°C and card load temp at 0°C, this is a pretty average card though


----------



## done12many2

Quote:


> Originally Posted by *Hulio225*
> 
> Yeah, it always looks like he has "the golden" card, but with liquid temp at -17°C and card load temp at 0°C, this is a pretty average card though


----------



## KraxKill

Quote:


> Originally Posted by *done12many2*
> 
> While running sub ambient temps, right?


Yes sir. And this way they are actually sub-zero at load with my liquid at -10C, where it is set for 24/7 use. But at 1.05v I'm at about +4C under load, where I'd run -15C liquid to have sub-zero loads.


----------



## Sweetwater

Quote:


> Originally Posted by *Benny89*
> 
> Glad it worked for you. It didn't work on my STRIX. Still got power limit perfs no matter what. Clocks could go higher but they did crash eventually after hour or two in games.


I'm pretty lucky, as I think my card is slightly above average. I feel for you though man; wasn't your previous Strix a bit better than the one you have now? That would have been interesting to try with the XOC.

I just don't know why yours is still getting performance capped, very strange indeed


----------



## KraxKill

Quote:


> Originally Posted by *done12many2*


My card is no slouch though; it does 2100 with ambient liquid, but at 1.093v. So the chiller is more for my CPU than it is for the GPU really, and simply lets me run lower voltages. The ceiling is about the same: about a bin above 2100 on ambient and a couple of bins under 2200 when chilled.


----------



## Hulio225

Quote:


> Originally Posted by *KraxKill*
> 
> My card is no slouch though; it does 2100 with ambient liquid, but at 1.093v. So the chiller is more for my CPU than it is for the GPU really, and simply lets me run lower voltages.


I have to build such a chill box in the future too, really nice 
But first I have to get some more knowledge about that stuff in general...
I would like to know how far below zero you could go with the liquid if the compressor and insulation and everything are very good


----------



## wammos

I do not see any difference between ROMs, honestly. I have an EVGA FE (no shunt) and have tried the SC2, FTW3 and XOC ROMs. They all bench the same for me, but the biggest difference is stability. The FE and FTW3 are not stable at 2100 @ 1.075V, while the XOC is. ME:A crashes the FE and FTW3 ROMs and they also throttle. The XOC does not throttle and does not crash. My card is under water and never gets over 40°C.

I do not see consistent benchmarks when OCing the RAM above +500.


----------



## ViRuS2k

I've come to the conclusion that the XOC bios is best for people without a shunt who run between 2000-2050 with lower voltages....
as even with the limit removed I get near enough the exact same scores, or games within a 4-5fps difference, running much higher voltages at 2100+ as I do in the 2000-2050 range with much less voltage....

probably good for those people as it has no power limit, which is the best part, so no throttling at your desired OC...

but for those on SHUNT modded cards, stick to the FTW3 bios..... as you actually do benefit from higher clocks and no power limit with that.....

until I shunt both my Tis I will be using the XOC bios with 2050 on both my cards at +600 memory and 1.050 voltages.
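For anyone new to the shunt mod being weighed against the XOC route here: the board estimates power from the voltage drop across small shunt resistors, so soldering another resistor in parallel lowers the effective resistance and scales the reported power down. A minimal sketch of the arithmetic; the 5mΩ values are illustrative assumptions, not measured 1080 Ti parts:

```python
def parallel(r1_ohm, r2_ohm):
    """Effective resistance of two resistors in parallel."""
    return r1_ohm * r2_ohm / (r1_ohm + r2_ohm)

stock_shunt = 0.005  # assumed 5 mOhm stock shunt (illustrative, not a measured value)
added = 0.005        # assumed 5 mOhm resistor stacked on top

effective = parallel(stock_shunt, added)

# the power controller still assumes the stock resistance, so its reading
# scales by effective/stock: equal resistors halve the reported power
reported_fraction = effective / stock_shunt
print(f"reported power is {reported_fraction:.0%} of actual")  # → reported power is 50% of actual
```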


----------



## Hulio225

Quote:


> Originally Posted by *ViRuS2k*
> 
> Come to the conclusion that the XOC bios is best for people without shunt and run between 2000-2050 with lower voltages....
> as even with the limit removed i get the exact near enough scores or games within 4-5fps difference running much higher voltages at 2100+ and above... that are exact same near enough scores in the 2000-2050 range with much less voltage....
> 
> probably good for those people as it has no power limit which is the best part so no throttling at your desired oc...
> 
> but for those on SHUNT modded cards stick to the FTW3 bios..... as you actually do benifit from higher clocks and no powerlimit with that.....
> 
> until i shunt both my ti`s i will be using the XOC bios with 2050 on both my cards at 600 memory and 1050 voltages.


The FTW3 bios also performs worse on a shunt modded card than the nvidia FE bios... I don't know why, but it's worse^^


----------



## Benny89

Quote:


> Originally Posted by *Sweetwater*
> 
> I'm pretty lucky as I think my card is slightly above average. I feel for you though man as wasn't your previous strix a bit better than the one you have now? That would have been interesting to try the XOC
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I just don't know why yours is getting performance capped still very strange indeed


It was better by like 12MHz. So no big deal.

Waiting for my EVGA FTW3 to arrive. It will take 2 weeks, but whatever. I can't seem to win the lottery on the STRIX; maybe I will have better luck with the FTW3.


----------



## ViRuS2k

Quote:


> Originally Posted by *Hulio225*
> 
> FTW3 bios also performs worse on shunt moded card than nvidia fe bios... i dont know why but its worse^^


Correction









Use the best bios that is the most compatible with your SHUNT modded card


----------



## shadow85

Just got my 2x GTX 1080 Ti STRIX OC to replace my 2x GTX 1080 STRIX OC, lol.

I ran GoW4 ingame benchmark at 4K all ULTRA+INSANE (no AA) settings and I went from 43 fps to 55fps.

I am surprised how much thicker the Ti STRIX OC is than the non-Ti STRIX OC. I had to remove some of the front panel case connectors from the X99S Gaming 7 M/B to fit the 2nd card in!

Bit annoyed that some GoW4 cutscenes still stutter and seem choppy sometimes.

Anyways will post pics later. Still happy with the upgrade!!


----------



## done12many2

Quote:


> Originally Posted by *KraxKill*
> 
> My card is no slouch though it does 2100 with ambient liquid but at 1.093v So the chiller is more for my CPU than it is the gpu really and simply let's me run lower voltages. The ceiling is about the same. About a bin above 2100 on ambient and a couple of bins under 2200 when chilled.


I'm definitely not saying that your card is a slouch, but comparing sub ambient voltage / clock speed to those of ambient cards isn't realistic. That's all I was pointing out.

Both of my FE cards do 2100+ (1.050v & 1.063v) on water as well, but I went through quite a few cards to get there. I wanted two matched up really close for SLI use. My OCD was killing me with the first pair that I had.









My 1080s were great on water, but phenomenal at sub-ambient. This past winter I rolled the rig outside to get the water nice and cold and was blown away by how much faster they would run on stock voltage, along with my CPU (2200+ on the 1080s and 5.6 GHz on the CPU).

Are you going to submit scores to the OCN benchmarking threads?


----------



## KraxKill

Quote:


> Originally Posted by *Hulio225*
> 
> I have to build such a chill box in the future too, really nice
> But first i have to get some more knowledge about that stuff in general...
> I would like to know how far below zero you could go with the liquid if compressor and isolation and stuff is very good


You can go to about -30C liquid. Lower if you build a cascade, but then you're into impractical space, like LN2 in my book. Also, below about -30C you will be hard pressed to find a fluid that can still be pumped while maintaining reasonable viscosity. Everything seems to operate on exponential curves.

Getting the liquid that cold is not a problem, I can get it that low in the winter, but pumping it would require 4-5 pumps in series vs just the one 24v PMP-600 I'm using now. This would add an additional 200w of heat load. So robbing Peter to pay Paul. And the gains of going that low become marginal, to a point where you're better off running LN2, since you're unlikely to go down that low for 24/7 use anyway.

A refrigerant system totaling about 12000btu will get you down to -10 to -20C liquid without too much trouble. My reservoir is 8gal, and at full load, which is looping Heaven and running Prime95 on the CPU at the same time, the compressor runs once for 5min every 20min or so while keeping the liquid at -10C. During casual gaming this is even less frequent. At idle with no heat load, it comes on just a few times per day, so it's actually quite power efficient and allows for 24/7 operation vs, for example, using direct liquid phase cooling or LN2, which is really just for fun. I originally was going to build a direct on-die phase change unit but then realized how much power that thing uses just sitting there at idle.

My compressor is just under 1000w and has a 20%-30% duty cycle at full load. The rest of the time, at idle, it just sits there.

The liquid is also important to match to your intended target temp range, because the more ethylene glycol (antifreeze) you run with water, the more freeze protection you get, but the less heat transfer capacity you get as well, so it's a balancing act in terms of ethylene/water concentration, viscosity, heat carrying capacity, etc.

I run 60/40 ethylene/water to keep my flow rates reasonable and the heat transfer rates optimal at -10 to -15C liquid, where my compressor can get to with relative ease.
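KraxKill's 60/40 mix sits on a well-known trade-off curve: more glycol means a lower freeze point but worse heat transfer and higher viscosity. A rough sketch with approximate handbook freeze points; these are ballpark values I'm assuming for illustration, not a coolant datasheet:

```python
# approximate freeze points for ethylene glycol/water mixes: % glycol by volume -> °C
FREEZE_POINTS = [(0, 0), (20, -8), (30, -15), (40, -24), (50, -37), (60, -52)]

def freeze_point(pct_glycol):
    """Linearly interpolate the freeze point between tabulated mixes."""
    for (x0, y0), (x1, y1) in zip(FREEZE_POINTS, FREEZE_POINTS[1:]):
        if x0 <= pct_glycol <= x1:
            return y0 + (y1 - y0) * (pct_glycol - x0) / (x1 - x0)
    raise ValueError("outside tabulated range (0-60% glycol)")

# a 60/40 mix leaves a wide margin below the -10 to -15C liquid temps in the post
print(freeze_point(60))  # → -52.0
```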

Feel free to PM me if you decide to build a chillbox. It's a project, but once constructed you can basically swap whatever hardware you want into it. Can't wait to see what skylake x does at sub zero.


----------



## Hulio225

Quote:


> Originally Posted by *done12many2*
> 
> I'm definitely not saying that your card is a slouch, but comparing sub ambient voltage / clock speed to those of ambient cards isn't realistic. That's all I was pointing out.
> 
> Both of my FE cards do 2100+ (1.050v & 1.063v) on water as well, but I went through quite a few cards to get there. I wanted two matched up really close for SLI use. My OCD was killing me with the first pair that I had.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My 1080s were great on water, but phenomenal at sub ambient. This past winter I rolled the rig outside and get the water nice and cold and was blown away by how much faster they would run on stock voltage along with my CPU. (2200+ on 1080s and 5.6 GHz on CPU).
> 
> Are you going to submit scores to the OCN benchmarking threads?


5.6 GHz on your 7700K at slightly above 0°C?
How many 7700Ks have you binned?


----------



## done12many2

Quote:


> Originally Posted by *Hulio225*
> 
> 5.6 GHz on your 7700K at slightly above 0°C?
> How many 7700Ks have you binned?


I went through a couple.


----------



## Hulio225

Quote:


> Originally Posted by *done12many2*
> 
> I went through a couple.


So you are buying that many and reselling the "bad" ones used at a lower price, or how are you doing it exactly?
If you buy that many, you need a big starting budget, I guess xD. Some spare good ones will make up for the loss on the bad ones, which you have to sell at a loss, right?


----------



## CoreyL4

Just to be clear, the XOC is the gigabyte xtreme bios?


----------



## Benny89

Quote:


> Originally Posted by *CoreyL4*
> 
> Just to be clear, the XOC is the gigabyte xtreme bios?


No, Asus STRIX.


----------



## Hulio225

Quote:


> Originally Posted by *CoreyL4*
> 
> Just to be clear, the XOC is the gigabyte xtreme bios?


*X*treme *O*ver*C*locking

It's the XOC Asus Strix LN2 BIOS


----------



## CoreyL4

Thanks, I'll flash my Gaming X tonight with that BIOS and see what it gets me... if nothing else, more stable clocks.


----------



## CoreyL4

Quote:


> Originally Posted by *Hulio225*
> 
> *X*treme *O*ver*C*locking
> 
> Its the XOC Asus Strix LN2 Bios


I didn't see the XOC BIOS in the link on the front page. Is it found somewhere else?


----------



## Hulio225

Quote:


> Originally Posted by *CoreyL4*
> 
> I didnt see the XOC bios in the link on the front page. Is it found somewhere else?


http://forum.hwbot.org/showthread.php?t=169488


----------



## done12many2

Quote:


> Originally Posted by *Hulio225*
> 
> So you are buying that many and reselling the "bad" ones used at a lower price, or how are you doing it exactly?
> If you buy that many, you need a big starting budget, I guess xD. Some spare good ones will make up for the loss on the bad ones, which you have to sell at a loss, right?


I've made some money here and lost some there, but it works out in the end. The 7700k is a great chip that gets a bad rap from those shooting for high clock speeds without much experience/adequate configurations. I've really enjoyed it as a daily / gaming chip.

I'm really looking forward to starting all over when x299 drops.


----------



## CoreyL4

Quote:


> Originally Posted by *Hulio225*
> 
> http://forum.hwbot.org/showthread.php?t=169488


Awesome thank you!


----------



## Newtocooling

Can someone help me out with this overclock? I tried to even out my curve like suggested, but why does it keep saying I'm limited by power, temp, and voltage?


My temp has not gone over 31C on my Heatkiller block.









On a side note, has anyone playing GTA V been able to get smooth gameplay on one 1080 Ti? Mine seems to get pretty choppy in some of the city and, of course, the desert areas with grass. I turned grass down to just High quality. I know this game scales great with SLI, and it was great with my SLI 1080s, but I think I'm done with SLI for now.


----------



## Hulio225

Quote:


> Originally Posted by *Newtocooling*
> 
> Can someone help me out with this overclock? I tried to even out my curve like suggested, but why does it keep saying I'm limited by power, temp, and voltage?
> 
> 
> My temp has not gone over 31C on my Heatkiller Block??
> 
> 
> 
> 
> 
> 
> 
> 
> 
> On a side note, has anyone playing GTA V been able to get smooth gameplay on one 1080 Ti? Mine seems to get pretty choppy in some of the city and, of course, the desert areas with grass. I turned grass down to just High quality. I know this game scales great with SLI, and it was great with my SLI 1080s, but I think I'm done with SLI for now.


If I undervolt my card I get a permanent power-limit message too, even though I'm not power limited. Could be that bug.


----------



## done12many2

Quote:


> Originally Posted by *Newtocooling*
> 
> Can someone help me out with this overclock I tried to even out my curve like suggested, but why does it keep saying I'm limited by power, temp, and voltage?


It's just that when you start pushing the card, it's going to hit power limits even at stock voltages. You can try finding the sweet spot and undervolting to avoid hitting power limits, or you can check out some other BIOS flash options. FTW3 is a decent one that will give you a little more power headroom.

Quote:


> Originally Posted by *Newtocooling*
> 
> My temp has not gone over 31C on my Heatkiller Block??
> 
> 
> 
> 
> 
> 
> 
> 
> 
> On a side note has anyone playing GTA V been able to get smooth gameplay on one 1080ti? Mine seems to be getting pretty choppy in some of the city and of course desert with grass areas. I turned grass down to just high quality. I know this game scales great with SLI, and was great with my SLI 1080's, but I think I"m done for now with SLI.


SLI is a win/lose affair. When you have it, you don't need it, but when you don't have it, you wish you did for occasions such as this. I chose to continue with SLI as most of the stuff I play benefits from it, but there's no doubt that a lot of games don't. I choose to avoid them.


----------



## CoreyL4

So what are the benefits/cons of that XOC bios?


----------



## done12many2

Quote:


> Originally Posted by *CoreyL4*
> 
> So what are the benefits/cons of that XOC bios?


No power or voltage restrictions. Some folks are experiencing degradation in performance above 2100 MHz, but if you plan to run lower than that, it's a great BIOS for avoiding power limits at stock clocks or moderately higher.


----------



## CoreyL4

Quote:


> Originally Posted by *done12many2*
> 
> No power or voltage restrictions. Some folks are experiencing degradation in performance above 2100 MHz, but if you plan to run lower than that, it's a great BIOS for avoiding power limits at stock clocks or moderately higher.


Yeah I planned on 2050/6100 max for everyday with it at 1.031v.


----------



## done12many2

Quote:


> Originally Posted by *CoreyL4*
> 
> Yeah I planned on 2050/6100 max for everyday with it.


Run Unigine Superposition 8K Optimized at the clock speed you intend to run daily. Watch your clock speed during the benchmark and have GPU-Z's sensors tab running in the background. If your clock speeds are bouncing and you see power limiting, you might want to give XOC a try. You should see your score increase with XOC if you were power limited on your stock BIOS. Just keep in mind that as you climb much higher in clock speed, XOC may or may not be less efficient performance-wise.


----------



## Newtocooling

Quote:


> Originally Posted by *done12many2*
> 
> It's just that, when you start pushing the card it's going to hit power limits even at stock voltages. You can try finding the sweet spot and undervolting to avoid hitting power limits or you can check out some other BIOS flash options. FTW3 is a decent one that will give you a little more power headroom.
> SLI is a win/lose affair. When you have it, you don't need it, but when you don't have it, you wish you did for occasions such as this. I chose to continue with SLI as most of the stuff I play benefits from it, but there's no doubt that a lot of games don't. I choose to avoid them.


I hear that. I may just go back to SLI in the future.


----------



## KedarWolf

Quote:


> Originally Posted by *done12many2*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CoreyL4*
> 
> Yeah I planned on 2050/6100 max for everyday with it.
> 
> 
> 
> Run Unigine Superposition 8K Optimized at the clock speed you intend to run daily. Watch your clock speed during the benchmark and have GPU-Z's sensors tab running in the background. If your clock speeds are bouncing and you see power limiting, you might want to give XOC a try. You should see your score increase with XOC if you were power limited on your stock BIOS. Just keep in mind that as you climb much higher in clock speed, XOC may or may not be less efficient performance-wise.

Having a 4K G-Sync screen, most games will run with no power limiting with G-Sync on and frame rates capped.

I was running Diablo 3 with detail settings maxed out at 2100 core / +667 memory with no power limiting on the FTW3 BIOS.

The joys of G-Sync.


----------



## KingEngineRevUp

Anyone with the shunt mod and FTW3 bios have any feedback on its performance?


----------



## Hulio225

Quote:


> Originally Posted by *SlimJ87D*
> 
> Anyone with the shunt mod and FTW3 bios have any feedback on its performance?


Yes,
Nvidia FE BIOS (2113MHz +805MHz VRAM), Time Spy Game Test 1 avg FPS: 72.85

FTW3 BIOS (2113MHz +805MHz VRAM), Time Spy Game Test 1 avg FPS: 71.00

More VRAM clock results in lower FPS on both BIOS versions, and lower core clocks too, so this was the maximum I was able to pull out at the given ambient temp.
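For anyone weighing the two BIOSes from the numbers above, the FE run works out to roughly 2.6% faster than the FTW3 run at identical clocks; a quick check:

```python
fe_fps, ftw3_fps = 72.85, 71.00  # Time Spy Game Test 1 averages from this post

# Percent advantage of the FE BIOS run over the FTW3 run at the same clocks.
delta_pct = (fe_fps - ftw3_fps) / ftw3_fps * 100
print(f"FE BIOS ahead by {delta_pct:.1f}%")  # FE BIOS ahead by 2.6%
```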


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Hulio225*
> 
> Yes,
> Nvidia FE BIOS (2113MHz +805MHz VRAM), Time Spy Game Test 1 avg FPS: 72.85
> 
> FTW3 BIOS (2113MHz +805MHz VRAM), Time Spy Game Test 1 avg FPS: 71.00
> 
> More VRAM clock results in lower FPS on both BIOS versions, and lower core clocks too, so this was the maximum I was able to pull out at the given ambient temp.


Thanks buddy, +1 rep for you.

Sounds like the resistor or shunt mod with the stock BIOS is still the best option for me.
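For context on the shunt option being discussed: the card estimates board power from the voltage drop across small shunt resistors, so putting a resistor in parallel with a stock shunt lowers the sensed resistance and makes the card under-report power. A hedged sketch of the arithmetic only; the 5 mΩ and 0.5 mΩ values below are illustrative assumptions, not measurements from a 1080 Ti:

```python
def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

def reported_fraction(r_stock: float, r_added: float) -> float:
    """Fraction of the real power the card 'sees' after paralleling a
    resistor across the stock shunt (sense voltage scales with resistance)."""
    return parallel(r_stock, r_added) / r_stock

# Illustrative values only: 5 mOhm stock shunt, 0.5 mOhm paralleled across it.
frac = reported_fraction(0.005, 0.0005)
print(f"card reports ~{frac:.0%} of actual power")  # ~9%
```

That scaling is why a modded card can sit far above its power target while the sensors still read under the limit.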


----------



## Newtocooling

Is it hard or dangerous to flash to the XOC BIOS? I've never flashed a BIOS before. I downloaded the RAR file, but I'm not sure how to use it. Sorry to seem dumb; is anyone willing to help walk me through it?


----------



## dunbheagan

Quote:


> Originally Posted by *done12many2*
> 
> I went through a couple.


Seriously?!! That's 50 i7 7700Ks, worth at least US$15k? Tell me honestly, did you really buy them yourself? If so, have you tested them all? What are your experiences, and what is the average max OC on these puppies? I am curious!

edit: corrected the quote


----------



## Hulio225

Quote:


> Originally Posted by *Newtocooling*
> 
> Is it hard or dangerous to flash to the XOC BIOS? I've never flashed a BIOS before. I downloaded the RAR file, but I'm not sure how to use it. Sorry to seem dumb; is anyone willing to help walk me through it?


Not really risky, and very easy; follow KedarWolf's instructions here:
http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti
Quote:


> Originally Posted by *dunbheagan*
> 
> Seriously?!! That's 50 i7 7700Ks, worth at least US$15k? Tell me honestly, did you really buy them yourself? If so, have you tested them all? What are your experiences, and what is the average max OC on these puppies? I am curious!


You quoted the wrong guy
Quote:


> Originally Posted by *SlimJ87D*
> 
> Thanks buddy, +1 rep for you.
> 
> Sounds like resistor or shunt with stock is still the best option for me.


Yeah, I'm disappointed; there is nothing we can do at this point to get better performance... I'll maybe switch from the shunt to the resistor method, since the shunt mod still power limits.


----------



## Slackaveli

Quote:


> Originally Posted by *done12many2*
> 
> I went through a couple.


Dude, you are gangster, I bow down, bruh. Holy crap. I wish we were friends in real life!


----------



## willverduzco

What is everyone using to raise voltage past 1.063 on the XOC BIOS? For some reason any time I try to add anything to the afterburner curve at 1.075 or above, it doesn't stick... Even with the voltage slider at 100%. I even tried using the XOC tool on hwbot, but had no idea what I was doing. No other BIOS behaves like that for me. On the upside, it's really nice to have absolutely no power limiting.


----------



## Hulio225

Quote:


> Originally Posted by *willverduzco*
> 
> What is everyone using to raise voltage past 1.063 on the XOC BIOS? For some reason any time I try to add anything to the afterburner curve at 1.075 or above, it doesn't stick... Even with the voltage slider at 100%. I even tried using the XOC tool on hwbot, but had no idea what I was doing. No other BIOS behaves like that for me. On the upside, it's really nice to have absolutely no power limiting.


Curve works just fine, idk what your problem is^^


----------



## Slackaveli

Quote:


> Originally Posted by *Newtocooling*
> 
> I hear that I may just go back to SLI in the future.


We ALL will when 4K/144, 5K/120 21:9, and 8K/90 hit the scene in the next 12+ months. We all will...

SLI Volta Tis on a 5K ultrawide 21:9 HDR display will be all the rage. At least, that's what my dreams are like.


----------



## done12many2

Quote:


> Originally Posted by *dunbheagan*
> 
> Seriously?!!


Yes

Quote:


> Originally Posted by *dunbheagan*
> 
> That's 50 i7 7700Ks, worth at least US$15k?


Yes

Quote:


> Originally Posted by *dunbheagan*
> 
> Tell me honestly, did you really buy them yourself?


It would be odd to have 50 x 7700K CPUs lying on my kitchen counter that weren't paid for. That goes slightly beyond casual shoplifting.









Quote:


> Originally Posted by *dunbheagan*
> 
> If so, have you tested them all?


I did not test every one of them, but a lot of them.

Quote:


> Originally Posted by *dunbheagan*
> 
> What are your experiences, what is the average max OC on these puppies? I am curious!


Most chips that I've tested run between 4.9 and 5.1 RealBench stable before delid. A few have done 5.2 RealBench stable prior to delid and 5.3+ after delid. All that I have delidded have gone 5.1 or more once delidded.


----------



## Hulio225

Quote:


> Originally Posted by *done12many2*
> 
> Most chips that I've tested run between 4.9 and 5.1 RealBench stable before delid. A few have done 5.2 RealBench stable prior to delid. All that I have delidded have gone 5.1 or more once delidded.


I bought just one and delidded it instantly; it needs 1.42 V to be stable at 5.0 GHz. I can get it to a max of 5252 Time Spy stable @ 1.54 V, or 5200 @ 1.56-1.58 V XTU stable for benching...
I call that a potato chip  I'll bin a bit myself next time, I guess... not to the extent you did, but a couple at least...


----------



## TheBoom

Anyone with the Amp Extreme tried the bios yet?

It is possible to flash back to the stock BIOS with NVFlash, right?


----------



## Hulio225

Quote:


> Originally Posted by *TheBoom*
> 
> Anyone with the Amp Extreme tried the bios yet?
> 
> It is possible to flash back stock bios with NVFlash right?


Yes, nothing is permanent


----------



## done12many2

Quote:


> Originally Posted by *Hulio225*
> 
> I bought just one and delidded it instantly; it needs 1.42 V to be stable at 5.0 GHz. I can get it to a max of 5252 Time Spy stable @ 1.54 V, or 5200 @ 1.56-1.58 V XTU stable for benching...
> I call that a potato chip  I'll bin a bit myself next time, I guess... not to the extent you did, but a couple at least...


What do you call stable, as we might be talking about two different things? I use RealBench, but if you're running a Prime95 blend, then your chip is not too bad at all. I don't suspect that you are, based on the benchmark speeds/voltages.

"Potato chip". I like your honesty, man.


----------



## eXistencelies

Quote:


> Originally Posted by *done12many2*
> 
> Run Unigine Superposition 8K Optimized at the clock speed you intend to run daily. Watch your clock speed during the benchmark and have GPU-Z's sensors tab running in the background. If your clock speeds are bouncing and you see power limiting, you might want to give XOC a try. You should see your score increase with XOC if you were power limited on your stock BIOS. Just keep in mind that as you climb much higher in clock speed, XOC may or may not be less efficient performance-wise.


Yea, I noticed my clocks jump around a bit during these tests. I have +150 on core and +550 on memory, and I scored 10150 on the 4K test with those settings. So running the new BIOS will keep my clocks more stable and not jumping around as much, yes? I won't be going above 2100; probably keep it at 2050 or 2075.


----------



## Coopiklaani

Tested some mem frequencies on my FE card with the FTW3 BIOS. It scales very linearly with frequency all the way from +650 to +950 offset. All tests were conducted in Time Spy with the core fixed at a solid 2050MHz.

Mem offset | Time Spy graphics score (core @ 2050MHz)
650 10765
675 10766
700 10777
725 10802
750 10808
775 10812
800 10815
825 10834
840 10840
850 10853
875 10862
900 10872
925 10876
950 10897
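A quick least-squares fit over the table above (data copied verbatim) puts the scaling at roughly 0.4-0.5 graphics points per MHz of offset, which backs up the "very linear" observation:

```python
# Memory offsets (MHz) and Time Spy graphics scores from the table above.
offsets = [650, 675, 700, 725, 750, 775, 800, 825, 840, 850, 875, 900, 925, 950]
scores = [10765, 10766, 10777, 10802, 10808, 10812, 10815, 10834,
          10840, 10853, 10862, 10872, 10876, 10897]

# Ordinary least-squares slope: covariance over variance.
n = len(offsets)
mean_x = sum(offsets) / n
mean_y = sum(scores) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(offsets, scores))
         / sum((x - mean_x) ** 2 for x in offsets))
print(f"~{slope:.2f} graphics points per +1 MHz mem offset")
```

The full +300 MHz sweep is worth a bit over 100 graphics points, so memory offset is a small but steady win until artifacts appear.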


----------



## Hulio225

Quote:


> Originally Posted by *done12many2*
> 
> What do you call stable as we might be talking about two different things? I use RealBench, but if you're running a Prime95 blend, then your chip is not too bad at all. I don't suspect that you are based on the benchmark speeds/voltages.
> 
> "potato chip". I like your honesty man.


I wrote that a bit confusingly; 5.0 GHz @ 1.42 V is RealBench stable. The other voltages and frequencies I wrote down are indeed just for the mentioned benchmarks... so nothing I would use daily.
Like I said, potato; probably in reality something average-ish...
Quote:


> Originally Posted by *Coopiklaani*
> 
> Tested some mem frequencies on my FE card with FTW3 bios. It scales very linearly with frequency all the way from +650 to +950 offset. All tests were conducted using TS with core fixed at 2050MHz solid.


Yep, that looks good. What are your temps? +950 for VRAM is better than my card can do, which I don't often see^^


----------



## Coopiklaani

Quote:


> Originally Posted by *Hulio225*
> 
> I wrote that a bit confusingly; 5.0 GHz @ 1.42 V is RealBench stable. The other voltages and frequencies I wrote down are indeed just for the mentioned benchmarks... so nothing I would use daily.
> Like I said, potato; probably in reality something average-ish...
> Yep, that looks good, what are your temps, +950 for vram is better than my card can do, which i don't often see^^


50C at the core; I guess it is about the same for the mem, since I use an EK-FC waterblock.


----------



## Hulio225

Quote:


> Originally Posted by *Coopiklaani*
> 
> 50C at the core; I guess it is about the same for the mem, since I use an EK-FC waterblock.


Wow, my card has never seen 50C while under an EK-FC block; I need like 11°C ambient for +850 in Time Spy, so your card is going ham with higher temps on the VRAM^^

Probably able to go +1k with lower temps lol


----------



## done12many2

Quote:


> Originally Posted by *eXistencelies*
> 
> Yea I noticed my clocks jump around a bit during these tests. I have a 150+ on core and 550 on memory. I scored a 10150 on the 4k with those settings. So running the new BIOS will keep my clocks more stable and not jump around as much, yes? I won't be going above 2100. Probably keep it at 2050 or 2075.


Yeah, the 4K test will work too, as it pulls almost as much power from the wall. If your goal is just to take care of the power limiting and clock speed jumping, I think you're a good candidate for XOC at your desired daily operating clock speed.

Try it out and see if your clock speed stabilizes and your score goes up. Remember that your score will only increase in those benchmarks that are causing your card to power limit. I'd run an SP 4K and 8K before doing the BIOS flash and then again afterward. Something like Valley isn't going to be enough to trigger a PL; I don't consider Valley anything but a CPU test.









Good luck man.


----------



## Coopiklaani

Quote:


> Originally Posted by *Hulio225*
> 
> Wow, my card has never seen 50C while under an EK-FC block; I need like 11°C ambient for +850 in Time Spy, so your card is going ham with higher temps on the VRAM^^
> 
> Probably able to go +1k with lower temps lol


I start to see artifacts at +975, and the card crashes the benchmark at +1000. Maybe LN2 could be the key

I like my rads running dead quiet. I have about a 9C GPU-to-coolant temp delta, and my coolant is in the range of 35-40C with my 280mm + 360mm rads. All fans run below 900 RPM.


----------



## Slackaveli

Quote:


> Originally Posted by *Hulio225*
> 
> Wow, my card has never seen 50C while under an EK-FC block; I need like 11°C ambient for +850 in Time Spy, so your card is going ham with higher temps on the VRAM^^
> 
> Probably able to go +1k with lower temps lol


Yeah, that's nuts. I certainly have potato-chip RAM; +435 is pedestrian as frick.


----------



## eXistencelies

Quote:


> Originally Posted by *Hulio225*
> 
> Wow, my card has never seen 50C while under an EK-FC block; I need like 11°C ambient for +850 in Time Spy, so your card is going ham with higher temps on the VRAM^^
> 
> Probably able to go +1k with lower temps lol


My card has seen 50C on that block, especially in a Phanteks Evolv TG case with all panels on. Getting my panels modded though for better airflow. Mostly my card sits in the mid 40s when gaming.


----------



## Coopiklaani

Quote:


> Originally Posted by *eXistencelies*
> 
> My card has seen 50C on that block, especially in a Phanteks Evolv TG case with all panels on. Getting my panels modded though for better airflow. Mostly my card sits in the mid 40s when gaming.


Same here; the front panel of the case has an insane amount of air resistance. Simply removing the front panel drops my coolant temp by 8C. That's 8C for aesthetics.


----------



## Hulio225

Quote:


> Originally Posted by *eXistencelies*
> 
> My card has seen 50C on that block, especially in a Phanteks Evolv TG case with all panels on. Getting my panels modded though for better airflow. Mostly my card sits in the mid 40s when gaming.


Hehe, yeah, I mean there are a lot of variables we have to look at: ambient temp, radiator size, fan configs, fan RPM, etc...
Sure, if I turn my fans off I'll hit 50C too^^ Running them inaudible I'll sit in the low 40s, I guess; I have two 480 rads though


----------



## TheBoom

So for this to work I need to flash the XOC BIOS and also use the voltage unlock tool, am I right?


----------



## eXistencelies

Quote:


> Originally Posted by *Coopiklaani*
> 
> Same here; the front panel of the case has an insane amount of air resistance. Simply removing the front panel drops my coolant temp by 8C. That's 8C for aesthetics.


Yea, I tend to remove the side glass panel and it helps a good bit. I am in the process now of getting my panels modded by the "Rockit Cool" guy, as he lives in my city.
Quote:


> Originally Posted by *Hulio225*
> 
> Hehe, yeah, I mean there are a lot of variables we have to look at: ambient temp, radiator size, fan configs, fan RPM, etc...
> Sure, if I turn my fans off I'll hit 50C too^^ Running them inaudible I'll sit in the low 40s, I guess; I have two 480 rads though


Yea, I have two 360mm rads in that case. I have the TT Riing fans and use their high setting; with all panels on it isn't that loud, and I have my headphones on 99% of the time anyway. I think my secondary gaming PC is quieter on air than my primary on water lol. That one uses the NZXT H440 case.


----------



## KraxKill

So who's tried this?

Anybody with the shunt mod tried this Zotac BIOS yet? I doubt it helps, but it may be worth a spin.

Target: 320.0 W
Limit: 384.0 W

https://www.techpowerup.com/vgabios/191335/zotac-gtx1080ti-11264-170330


----------



## asdkj1740

Quote:


> Originally Posted by *KraxKill*
> 
> So who's tried this?
> 
> Anybody with the shunt mod tried this Zotac BIOS yet? I doubt it helps, but it may be worth a spin.
> 
> Target: 320.0 W
> Limit: 384.0 W
> 
> https://www.techpowerup.com/vgabios/191335/zotac-gtx1080ti-11264-170330


check the voltage controller ic first.


----------



## Hulio225

Quote:


> Originally Posted by *TheBoom*
> 
> So for this to work I need to flash both the XOC and use the voltage unlock tool am I right?


Just flash the BIOS, don't use the tool at all; you won't understand it, no offense, and you won't need it anyway.

Just flash the BIOS and you can play with voltages up to 1.2V in Afterburner with the curve


----------



## done12many2

Quote:


> Originally Posted by *eXistencelies*
> 
> Yea. I tend to remove the side glass panel and it helps a good bit. I am in the process now of getting them modded by the "rockit cool" guy as he lives in my city.
> Yea I have two 360mm in that case. I have the TT Riing fans and use their high setting. With all panels on it isn't that loud. I have my headphones on 99% of the time anyways. I think my secondary gaming PC is quieter on air than my primary on water lol. That one uses the NZXT H440 case.


Yeah, with water it does get quieter, but in order to do so you have to add a lot of radiator surface area when using power hungry components.


----------



## KraxKill

Quote:


> Originally Posted by *asdkj1740*
> 
> Zotac this time uses another voltage controller IC, not the uP9511 any more.


I thought the same thing, but two people in this thread report having decent results on it while on reference cards.

http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti/240#post_26077449


----------



## wammos

Quote:


> Originally Posted by *willverduzco*
> 
> What is everyone using to raise voltage past 1.063 on the XOC BIOS? For some reason any time I try to add anything to the afterburner curve at 1.075 or above, it doesn't stick... Even with the voltage slider at 100%. I even tried using the XOC tool on hwbot, but had no idea what I was doing. No other BIOS behaves like that for me. On the upside, it's really nice to have absolutely no power limiting.


Hey Will, I had the same problem. Open up the Heaven benchmark, not in full screen but windowed. Have it on the left side and Afterburner on the right. With Heaven looping, hit Ctrl+F and it will open up the voltage curve. Select the dot showing the current voltage in MSI AB; there should be a big red vertical line. Mine was stuck on 1.062, so you want to drop that dot by -5, for example, which should move the voltage up to 1.075 at the overclock you want. Save it with the check mark in Afterburner; sometimes it would update right then and there, sometimes I had to reboot. You can keep moving the voltage up that way until you're stable.


----------



## willverduzco

Quote:


> Originally Posted by *wammos*
> 
> Hey Will, I had the same problem. Open up the Heaven benchmark, not in full screen but windowed. Have it on the left side and Afterburner on the right. With Heaven looping, hit Ctrl+F and it will open up the voltage curve. Select the dot showing the current voltage in MSI AB; there should be a big red vertical line. Mine was stuck on 1.062, so you want to drop that dot by -5, for example, which should move the voltage up to 1.075 at the overclock you want. Save it with the check mark in Afterburner; sometimes it would update right then and there, sometimes I had to reboot. You can keep moving the voltage up that way until you're stable.


Thank you! That certainly helped.

I was able to get past 1.063 (ended up setting steps up to 2126 @ 1.1V) by manually deleting the profiles I had in the AB folder and starting fresh and using freakishly high clocks just to get it to stick, and then toning down to my desired clocks. However, that didn't work for all of the steps in between. Your method worked brilliantly for the stubborn ones!

Thanks to this XOC BIOS, I was finally able to match my old Titan XP (not Xp), so I'm glad I made the side-grade / slight downgrade, since there will never be an equivalent BIOS for that card. Strangely, my TXP hit 12000 MHz on the memory with 100% stability, which is quite a bit more bandwidth than 12000 MHz on the 1080 Ti, and I can't seem to get my 1080 Ti's memory stable at anything higher. At least I made back some cash in the process.
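The bandwidth gap at the same memory clock comes down to bus width: the Titan XP carries a 384-bit memory bus while the 1080 Ti's is cut to 352-bit, so the same 12000 MT/s effective data rate yields more GB/s on the Titan:

```python
def bandwidth_gb_s(effective_mt_s: float, bus_bits: int) -> float:
    """Peak memory bandwidth in GB/s: data rate (MT/s) times bus width in bytes."""
    return effective_mt_s * (bus_bits / 8) / 1000

print(bandwidth_gb_s(12000, 384))  # Titan XP at 12000 MT/s: 576.0 GB/s
print(bandwidth_gb_s(12000, 352))  # 1080 Ti at 12000 MT/s: 528.0 GB/s
```

So a 1080 Ti needs roughly a +545 MHz higher effective data rate to match the Titan's bandwidth at 12000 MT/s.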


----------



## fat4l

Hi guys.

Does the XOC BIOS also work on the FE Ti?
http://forum.hwbot.org/showthread.php?t=169488


----------



## willverduzco

Quote:


> Originally Posted by *fat4l*
> 
> Hi guys.
> 
> is the XOC bios working also for FE Ti ?
> http://forum.hwbot.org/showthread.php?t=169488


Yes. Mine is FE and it works brilliantly with this BIOS.


----------



## TheBoom

Quote:


> Originally Posted by *Hulio225*
> 
> Just flash the BIOS, don't use the tool at all; you won't understand it, no offense, and you won't need it anyway.
> 
> Just flash the BIOS and you can play with voltages up to 1.2V in Afterburner with the curve


Well, actually I went back a few pages and I kind of understand what it is now. Not gonna be messing with it on air though.

The other thing is I'm not even sure the XOC BIOS will work with my Zotac, and I don't have a spare card anymore to re-flash the 1080 Ti if it doesn't work.

Edit: Never mind, I totally forgot my 6700K has an iGPU lol.


----------



## Hulio225

Quote:


> Originally Posted by *TheBoom*
> 
> Well actually I went back a few pages and I kinda understand what it is now. Not gonna be messing with it on air though.
> 
> The other thing is I'm not even sure the XOC bios will work with my Zotac. And I don't have a spare card anymore to re flash the 1080ti if it doesn't work with it.
> 
> Edit : Nevermind I totally forgot my 6700k has an iGPU lol.


If you still have an i7 6700K as your CPU, you have an iGPU









Edit: ah lol you figured it out by yourself^^


----------



## nrpeyton

Quote:


> Originally Posted by *KraxKill*
> 
> Away from the rig for a min, but has anybody tried to see if the XOC volt tool works to override the voltage while on the FE Bios?


I've tried the *voltage tool* with stock FE BIOS & while on the XOC BIOS.

I also ran it with admin privileges.

I just get errors saying "failed to read" and "failed to write".

*Did anyone have better luck?*

The tool WOULD have been the perfect solution for FE.

Would have allowed us free rein over voltage,
and free rein over power (with the shunt mod),
all on the stock BIOS! Such a shame...

Quote:


> Originally Posted by *Hulio225*
> 
> KraxKill is on chillbox cooling; his liquid temps are sub-zero, so he will be good if he goes a bit higher than 1.2v


I don't know if I'd be comfortable pushing voltages that high without LN2, or at least some dry ice.

If you apply too much voltage, scores and FPS always take a drop (on both BIOSes).

The DISTANCE between conductive pathways (separated by non-conductive walls) is so short now that electrons jump through the non-conductive walls and cause instability. The higher the voltage, the more it's going to happen.

In the winter you got to over 2200 MHz at stock voltages on your 1080, and it's the same architecture.

I just think it needs to be temps first, volts second. Can't stress that enough on Pascal.

Quote:


> Originally Posted by *bloodhawk*
> 
> Oh sweet! That would work awesome for him then. I pushed my voltage to 1.3V for a few seconds to see if it works, and it held pretty well. I did stick my portable AC's vent to the desktops radiator intake fans. :> (Just in case)


Ouch; that sounds very risky at normal temps.

See my response to the previous quote (directly above this one).

Quote:


> Originally Posted by *Coopiklaani*
> 
> Checked, no change of my max stable mem OC on the xoc bios.
> 
> XOC bios seems to have a programmed shutdown temperature and it is around 54-58C. No matter what voltage/frequency combo I use, 3DM stress test always cancels itself around this temperature range.


That's odd; I wasn't experiencing that at all on my FE... I hit 83C at one point and it was still running fine. I shut it down myself at that point, as I wasn't comfortable with those temps.

Quote:


> Originally Posted by *Nico67*
> 
> Been playing with XOC and found something interesting. gain per clk was insignificant, however gain per volts was very noticeable.
> 
> 2050/6003 @ 1.000v 10207 281w
> 2050/6003 @ 1.025v 10284 295w
> 2050/6003 @ 1.050v 10317 313w
> 2050/6003 @ 1.075v 10346 327w seems enough at this point
> 2050/6003 @ 1.100v 10333 337w
> 
> started at 2000 and 10194 score so still testing, but it seems you need to give it enough voltage even if its stable. Vdroop or something effecting scores maybe?


"Stability" is a hard word to use with GPU's.

It's not like CPU's where we just get them to do a whole load of complex calculations (that we already know the answer to) then check if the CPU got the correct one.

We're generating images. Any small part of that image that isn't perfect will go unnoticed by the human eye. And we only know when a card completely drops a frame when our scores are lower at the end of a benchmark. It's so much less precise.

So you're absolutely right, and I agree with your statement 100%. Too little or too much voltage will affect "STABILITY" and scoring even when it LOOKS stable.
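Nico67's sweep above is a nice dataset for this. A quick script (numbers copied straight from his post) shows where the marginal gain per voltage step turns negative:

```python
# Marginal score gain per 25 mV step, from Nico67's 2050/6003 sweep
# quoted above: (volts, graphics score, watts).
sweep = [
    (1.000, 10207, 281),
    (1.025, 10284, 295),
    (1.050, 10317, 313),
    (1.075, 10346, 327),
    (1.100, 10333, 337),
]

# Print the score delta for each voltage step.
for (v0, s0, _), (v1, s1, w1) in zip(sweep, sweep[1:]):
    print(f"{v0:.3f}V -> {v1:.3f}V: {s1 - s0:+d} points at {w1}W")

# The best point in the sweep is 1.075V, after which the score falls.
best = max(sweep, key=lambda p: p[1])
print(f"best: {best[1]} at {best[0]:.3f}V")
```

The deltas (+77, +33, +29, then -13) make the diminishing-returns point obvious: past 1.075V the extra volts only buy extra watts.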

Quote:


> Originally Posted by *hotrod717*
> 
> I can tell you there is another set of issues once you take these cold. At different temp points and different voltages, FE will experience various bin drops and soft crashes during benchmark. More than -50* and card scored worse and anomalies became almost uncontrollable. Some cards seem to do fine with more than -50*(-130* CB on this card), and some -50* is about all you want to go.
> Blew thru about 30L/4hrs. for only 2 decent scores.And one I got screen and forgot benchmark save, so no boints there. Next time I'm capping at -50* to see I can get some consistent results. I can say PL was not a factor as 105 was the highest I saw even at 2200mhz if it was reading correctly.


So you got down to -130°C?

Did you flash the XOC BIOS before you went cold?

Also, just out of interest: how high were you able to get your clock speed, and at what voltages?

Did you take a note of your power draw?

Thanks!


----------



## TheBoom

Quote:


> Originally Posted by *Hulio225*
> 
> If you still have a i7 6700k as your cpu, you have an iGPU
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: ah lol you figured it out by yourself^^


Lol yeah


----------



## KraxKill

Quote:


> Originally Posted by *willverduzco*
> 
> Yes. Mine is FE and it works brilliantly with this BIOS.


Provided you're not already shunt modded and are running sub-2100 MHz; otherwise, there are multiple reports of the XOC BIOS producing slower performance than the vanilla FE BIOS, to the tune of 5%.


----------



## Hulio225

Quote:


> Originally Posted by *KraxKill*
> 
> Provided you're not already Shunt modded and are running sub 2100+ MHz otherwise there are more reports of the XOC Bios producing slower performance than the vanilla FE bios to the tune of 5%


Imho, something is wrong with that BIOS. Even Xtreme Addict, the guy who published it on hwbot, is getting just a 12316 GPU score in Time Spy with a 2440 MHz core clock on LN2. I'm getting 11376 with a 2113 MHz core clock on water.
Kingpin is getting a 13230 GPU score with 2530 MHz... there is no way that 90 MHz is worth 900 points, if you compare those two LN2 scores...

If that were the case, Xtreme Addict would be three times 900 points ahead of me...

I'm pretty sure the XOC BIOS is faulty in some way.

Edit:

2530/2113 = 1.197 -> 19.7% higher clock speed | 13230/11376 = 1.163 [Kingpin's 1080 Ti world record, volt-modded card, etc. vs. mine, shunt-modded on water]
We know clock and performance scaling isn't completely linear, so this result is in line with what we would expect.

2440/2113 = 1.155 -> 15.5% higher clock speed | 12316/11376 = 1.083 [Xtreme Addict on the XOC BIOS vs. me]
As you can see, this result isn't in line with the above...

15.5% more clock should give 12-13% more GPU score, but it doesn't, so the conclusion is that the BIOS is crap.

He's getting 8.3% more GPU score for 15.5% more clock speed.

He's missing those ~5% of performance, like KraxKill and me. Period.
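If anyone wants to sanity-check those ratios, here's the same arithmetic as a small script (clocks and scores copied from the post; "efficiency" here is just score gain divided by clock gain):

```python
# Score-per-clock scaling from the numbers above. Baseline: the
# shunt-modded FE on water (2113 MHz, 11376 graphics score), compared
# against Kingpin's volt-modded run and Xtreme Addict's XOC-BIOS run.
BASE_CLK, BASE_SCORE = 2113, 11376

def scaling(clk, score):
    """Return (clock gain, score gain, score gain per unit clock gain)."""
    clk_gain = clk / BASE_CLK - 1
    score_gain = score / BASE_SCORE - 1
    return clk_gain, score_gain, score_gain / clk_gain

for name, clk, score in [("Kingpin", 2530, 13230), ("Xtreme Addict", 2440, 12316)]:
    c, s, eff = scaling(clk, score)
    print(f"{name}: +{c:.1%} clock, +{s:.1%} score, efficiency {eff:.2f}")
```

Kingpin converts roughly 0.83 of his extra clock into extra score, while the XOC-BIOS run converts only about 0.53, which is exactly the "missing ~5%" being argued here.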


----------



## CoreyL4

Quote:


> Originally Posted by *Hulio225*
> 
> Imho, something is wrong with that bios. even xtreme addict, the guy who published that bios on hwbot, is getting just 12316 gpu score in time spy with a core clock of 2440MHz on ln2. Im getting 11376 with 2113 core clock on water.
> Kingpin is getting 13230 gpu score with 2530MHz... there is no way that 90MHz are giving 900 points, if you compare those two ln2 scores...
> 
> if this would be the case than xtreme addict would have 3 times 900 points more than me...
> 
> im pretty sure the xoc bios is faulty in some way
> 
> Edit:
> 
> 2530/2113=1,197 -> 19,7% higher clock speed | 13230/11376=1,163
> we know clock and performance scaling isn't completely linear so this result is inline what we would expect.
> 
> 2440/2113=1,155 ->15,5% higher clock speed | 12316/11376=1,083
> as you can see here result isn't in line with above...
> 
> 15,5% more clock should result in 12-13% more gpu score, but it is not, so the conclusion the bios is crap


So if it's faulty, should I flash it on my Gaming X to get rid of the power limit if I'm staying around 2050?


----------



## Hulio225

Quote:


> Originally Posted by *CoreyL4*
> 
> So if its faulty should I flash it on my gaming x to get rid of the power limit if I am staying around 2050?


Try and test... I just know that in my case it's performing worse than every other BIOS I tested.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Hulio225*
> 
> Imho, something is wrong with that bios. even xtreme addict, the guy who published that bios on hwbot, is getting just 12316 gpu score in time spy with a core clock of 2440MHz on ln2. Im getting 11376 with 2113 core clock on water.
> Kingpin is getting 13230 gpu score with 2530MHz... there is no way that 90MHz are giving 900 points, if you compare those two ln2 scores...
> 
> if this would be the case than xtreme addict would have 3 times 900 points more than me...
> 
> im pretty sure the xoc bios is faulty in some way
> 
> Edit:
> 
> 2530/2113=1,197 -> 19,7% higher clock speed | 13230/11376=1,163 [Kingpin 1080Ti world record, volt modded card etc VS mine shunt moded on water]
> we know clock and performance scaling isn't completely linear so this result is inline what we would expect.
> 
> 2440/2113=1,155 ->15,5% higher clock speed | 12316/11376=1,083 [Xtreme addict xoc bios VS me]
> as you can see here result isn't in line with above...
> 
> 15,5% more clock should result in 12-13% more gpu score, but it is not, so the conclusion the bios is crap
> 
> he is getting 8.3% more gpu score for 15,5% more clock speed
> 
> he is missing those ~5% performance like KraxKill and me. period.


We've kind of proven time and time again in the past that flashing over a different BIOS just gives you worse performance. You trade one thing for another.

It's no surprise to me that what you are saying is true.

Kingpin takes the time to hardware mod his cards at the level an electrical engineer would. He's legitimately bypassing sensors and power/voltage limits.

Xtreme Addict is just flashing a BIOS, one that wasn't written for his card.

It sure looks like Kingpin is getting a truly unlocked card with nothing holding him back, while Xtreme Addict loses points in some fields and regains them in others, which seems like *half measures*.


----------



## Hulio225

Quote:


> Originally Posted by *SlimJ87D*
> 
> We've kind of proven time and time again that flashing a bios over just gives you worse performance in the past. You trade one thing for another.
> 
> It's no surprise to me that what you are saying is true.
> 
> Kingpin takes the time to hardware mod his cards on the level of an electrical engineer would. He's bypassing sensors, power, voltage limits legitimately.
> 
> Xtreme addict is just flashing a bios, one that's not written for his card.
> 
> It sure looks like Kingpin is getting a true unlocked card with nothing holding him back while Xtreme Addict loses points in some fields and regains them in other fields which seem like *half measures*.


Yeah, I posted my calculations on hwbot in his thread; maybe he will talk to the ASUS people and they'll recode the BIOS or whatever...


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Hulio225*
> 
> Yeah, i posted my calculations on hwbot in his thread, maybe he will talk to the asus people and they re code the bios or whatever...


I don't think they'd recode the BIOS if it was written for a specific card with specific components, right?

I mean, we're trying to use a BIOS that wasn't meant for our cards; I'm not sure why they would bother accommodating us.


----------



## Hulio225

Quote:


> Originally Posted by *SlimJ87D*
> 
> I don't think they'd recode the bios if it was written for a specific card with specific components right?
> 
> I mean we're trying to use bios that weren't meant for us, not sure who they would bother accommodating us.


I don't know; I just thought it was from ASUS for him and his Strix cards, since he released those BIOS files into the wild^^... I don't know how closely they work with each other, or whether the BIOS was coded by someone else. I just don't know.
But I thought I'd leave a post on hwbot just showcasing that this BIOS has its flaws... time will tell if something happens in that regard.


----------



## dunbheagan

Quote:


> Originally Posted by *done12many2*
> 
> Yes
> Yes
> It would be odd to have 50 x 7700k CPUs laying on my kitchen counter that weren't paid for. That goes slightly beyond casual shoplifting.
> 
> 
> 
> 
> 
> 
> 
> 
> I did not test everyone of them, but a lot of them.
> Most chips that I've tested run between 4.9 to 5.1 RealBench stable before delid. A few have done 5.2 RealBench stable prior to delid and 5.3+ after delid. All that I have delidded have gone 5.1 or more once delidded.


wow









thanks for the info buddy, interesting!


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> We've kind of proven time and time again that flashing a bios over just gives you worse performance in the past. You trade one thing for another.
> 
> It's no surprise to me that what you are saying is true.
> 
> Kingpin takes the time to hardware mod his cards on the level of an electrical engineer would. He's bypassing sensors, power, voltage limits legitimately.
> 
> Xtreme addict is just flashing a bios, one that's not written for his card.
> 
> It sure looks like Kingpin is getting a true unlocked card with nothing holding him back while Xtreme Addict loses points in some fields and regains them in other fields which seem like *half measures*.


tbf, you guys are proving over and over that shunt-modded cards...


----------



## dunbheagan




----------



## Coopiklaani

Some experiments with different curve types on the XOC BIOS: one uses the "cliff" type, the other the smooth type. The smooth curve results in much, much better scores; the cliff type (F-V locking) results are significantly worse.


----------



## Hulio225

Quote:


> Originally Posted by *Coopiklaani*
> 
> Some experiments with different curve types with XOC bios. One uses the "cliff" type, the other uses the smooth type. Smooth curve does result in much much better scores. The cliff type or F-V locking results are significantly worse.


Lol, if this is real, just lol. Could you do me a favor and test this on Time Spy game test 1 only, please?
Not the whole thing; just, say, 2100 MHz cliff vs. 2100 MHz smooth.

Edit: ah wait, I think I got something wrong. Sure, smooth curves should result in better scores if you thermal throttle or something like that, because you don't jump 5 bins down at once.
What are your temps while testing?


----------



## Coopiklaani

Quote:


> Originally Posted by *Hulio225*
> 
> Lol if this is real just lol. could you do me a favor and test this on time spy game test 1 only plz?
> not the whole thing just lets say 2100 mhz cliff vs 2100 mhz smooth
> 
> Edit: ah wait i think i got something wrong, sure smooth curves should result in better scores if you thermal throttel or something like that because you dont jump 5 bins down or something.
> what are your temps at while testing?


Temps are well under 50°C, and frequencies are rock solid.


----------



## done12many2

Quote:


> Originally Posted by *Coopiklaani*
> 
> Temps are well under 50C.and frequencies are rock solid.


That graph right next to a graph of the stock FE BIOS would be sweet!


----------



## Hulio225

Quote:


> Originally Posted by *done12many2*
> 
> That graph right next to a graph of the stock FE BIOS would be sweet!


This!

I predict ~5% less performance for the XOC BIOS across the board!









Edit:
Quote:


> Originally Posted by *Coopiklaani*
> 
> Temps are well under 50C.and frequencies are rock solid.


Yesterday while I tested, I had the feeling the MHz readouts were wrong on the XOC BIOS, because I noticed an average FPS gain in my testing after I cooled my room down and my load temps dropped below 36°C.
The monitored MHz were rock solid the whole time, but the performance gain after temps went 2°C down could be the explanation.

If this is true, my theory is that with a cliff curve you're dropping several bins because of your temps, but you don't see it because of the XOC BIOS.
With that, smooth would be better, since you would just drop 1 bin at a time.


----------



## KraxKill

Who wants to guinea pig the new bios? 382.05


----------



## Hulio225

Quote:


> Originally Posted by *KraxKill*
> 
> Who wants to guinea pig the new bios? 382.05


pardon?


----------



## KraxKill

Duh. I mean driver. Clearly all this bios talk is making my brain hurt.


----------



## wammos

Quote:


> Originally Posted by *KraxKill*
> 
> Duh. I mean driver. Clearly all this bios talk is making my brain hurt.


That is what I have been testing with the whole time.


----------



## done12many2

Quote:


> Originally Posted by *KraxKill*
> 
> Duh. I mean driver. Clearly all this bios talk is making my brain hurt.


I've been running it since yesterday. I notice nothing different.


----------



## Hulio225

Quote:


> Originally Posted by *KraxKill*
> 
> Duh. I mean driver. Clearly all this bios talk is making my brain hurt.


I've already been using it for 2 or 3 days, maybe even 4...
Performance is identical to the previous driver, in Time Spy at least.


----------



## Coopiklaani

Quote:


> Originally Posted by *Hulio225*
> 
> Lol if this is real just lol. could you do me a favor and test this on time spy game test 1 only plz?
> not the whole thing just lets say 2100 mhz cliff vs 2100 mhz smooth
> 
> Edit: ah wait i think i got something wrong, sure smooth curves should result in better scores if you thermal throttel or something like that because you dont jump 5 bins down or something.
> what are your temps at while testing?


OK, here are the results. Extreme cliff vs. extreme smooth, both running at a rock-solid 2100.5 MHz, same vram, no perf cap.


----------



## rolldog

Who's ya daddy?


----------



## Hulio225

Quote:


> Originally Posted by *Coopiklaani*
> 
> OK, here are the results. Extreme cliff vs extreme smooth, both run at rock solid 2100.5MHz, same vram, no perf cap.


Hmm, that's actually weird somehow...
Now I know what you mean by cliff and smooth... weird results...
Why is this happening? Now more than ever we need these graphs compared to the stock NVIDIA BIOS.


----------



## OneCosmic

Quote:


> Originally Posted by *Coopiklaani*
> 
> OK, here are the results. Extreme cliff vs extreme smooth, both run at rock solid 2100.5MHz, same vram, no perf cap.


This is exactly what I also noticed yesterday when I was benching Superposition 4K, trying to figure out for myself whether the FE BIOS really performs better than the XOC BIOS. When I set the curve as a "cliff" on the FE BIOS I get the same low score as on XOC with a "cliff"; when set to smooth, the score is about 400 points higher in Superposition 4K, even though the FE BIOS is throttling between 2088 and 2025 MHz at 1.050V while XOC holds 2088 MHz the whole time at 1.05V.


----------



## Coopiklaani

Quote:


> Originally Posted by *Hulio225*
> 
> hmm thtas actually weird somehow...
> now i know what you mean with cliff and smooth... weird results...
> why is this happening, now we need more than ever this graphs compared to stock nvidia bios


I remember some people reported the same results earlier, here and in the GTX 1080 thread. How exactly GPU Boost 3.0 works is still a mystery.


----------



## Hulio225

Quote:


> Originally Posted by *Coopiklaani*
> 
> I remember some ppl reported the same results earlier here and in the gtx 1080 thread. How exactly GPU boost 3.0 works is still a mystery.


But my theory could still be valid, because if you just don't see the downclocking caused by temps, with the cliff curve you instantly jump 100 MHz down.
With the smooth one it's bin by bin...


----------



## CptSpig

Quote:


> Originally Posted by *SlimJ87D*
> 
> We've kind of proven time and time again that flashing a bios over just gives you worse performance in the past. You trade one thing for another.
> 
> It's no surprise to me that what you are saying is true.
> 
> Kingpin takes the time to hardware mod his cards on the level of an electrical engineer would. He's bypassing sensors, power, voltage limits legitimately.
> 
> Xtreme addict is just flashing a bios, one that's not written for his card.
> 
> It sure looks like Kingpin is getting a true unlocked card with nothing holding him back while Xtreme Addict loses points in some fields and regains them in other fields which seem like *half measures*.


KingPin is using Frankenstein cards. Not too many people can do these kinds of mods: voltage fed directly to the card from the PSU.

http://s1164.photobucket.com/user/C...1ce700c_1080Tivoltmodded_zpsw2szzqo0.jpg.html


----------



## Coopiklaani

Quote:


> Originally Posted by *Hulio225*
> 
> But my theory could still be valid, because if you just don't see the downclocking because of temps, with the cliff one you instantly jump 100mhz down.
> Withe the smooth one its bin for bin...


But there's no sign of a perf cap anywhere. And this time I kept the GPU temp around 40°C the entire time to ensure there was no thermal downclocking whatsoever.


----------



## OneCosmic

Quote:


> Originally Posted by *CptSpig*
> 
> King Pin is using Frankenstein Cards. Not to many people can do these kind of mods. Voltage directly to the card from PSU.
> 
> http://s1164.photobucket.com/user/C...1ce700c_1080Tivoltmodded_zpsw2szzqo0.jpg.html


Those cables, which look like power cables, are just there so he can easily connect a DMM and measure voltage; they aren't there to supply voltage. He is still using the standard 6/8-pin PEG connectors to power the cards; those cables would be too thin to power them.


----------



## gmpotu

Hey guys. Been out of the game for about a week now. I'm about 1000 posts behind.
Anyone experiencing power issues with any of the current nvidia drivers?

When I put my machine into sleep mode, it now turns everything off (monitors, keyboard, mouse, etc.) but the fans stay running, and when I press the power button the system crashes and restarts. This never happened before I got my new card, and I don't remember it happening when I was first testing. Just curious if anyone else has had power issues?


----------



## Hulio225

Quote:


> Originally Posted by *CptSpig*
> 
> King Pin is using Frankenstein Cards. Not to many people can do these kind of mods. Voltage directly to the card from PSU.
> 
> http://s1164.photobucket.com/user/C...1ce700c_1080Tivoltmodded_zpsw2szzqo0.jpg.html


There is even a tutorial on how this is done; it's not that hard... it's just a bit of soldering, and you need some variable resistors.


----------



## TWiST2k

My FTW3 arrived today! I slapped it in and did a quick bench with everything stock and got a 9162 in Superposition 4K Optimized. Now it's time to dig in and see what she can do. I did notice that only 1 of the 3 fans was spinning; I wonder if I will be able to control all of the fans through Afterburner or if I will be forced to use Precision X, ugh. Time to find out.


----------



## Eco28

Now I think I need some help verifying this. I have been toying with my FE and testing some basic OC over the past few days. Nothing crazy: GPU to the 2000 MHz mark (flat +100-150 on the core with 120% power). It did pretty well on both ends; in particular, I was getting a rather solid 2012-2020 MHz OC on the GPU with the stock FE cooler before it got too hot and throttled. Now the card is under water, and I was happy to finally lose those thermal chains. To my surprise, the card is hitting PerfCap heavily; hell, it is power limited even at stock clocks (stock BIOS, no shunt though). Is this "normal", or even remotely possible, to be power limited without an OC in the old Heaven benchmark? Could I have broken or messed something up during my initial OC? PSU problems, perhaps?

Can anyone still on the FE BIOS reset the clocks, power target, and voltage, then test and confirm, please?
I am like 75% sure I wasn't seeing any power limit when I first ran this benchmark.









Thanks!


----------



## CptSpig

Quote:


> Originally Posted by *OneCosmic*
> 
> Those cables which looks like power cables are there for him just to be able to easily connect a DMM and measure voltage, they aren't there to supply voltage, he is still using standard 6/8PIN PEGs to power the cards, those cables would be too slim to power the cards.


He is not powering the cards with these cables; he is adding extra voltage. It is similar to adding voltage to the PCIe lanes with an extra six- or eight-pin connector on the motherboard.


----------



## CptSpig

Quote:


> Originally Posted by *Hulio225*
> 
> there is even a tutorial how this is done, its not that hard... its just a bit soldering and you need some variable resistors


Can you provide a link to this tutorial? This is not easy; if it were, everyone would be doing this mod.


----------



## Hulio225

Quote:


> Originally Posted by *CptSpig*
> 
> Provide a link on this tutorial? This is not easy if it was everyone would be doing this mod.


http://forum.hwbot.org/showthread.php?t=167865

I didn't say it's easy, I said it's not that hard...
Sure, the risk of damaging something is higher, and the real benefit on air or water tends toward zero...
If I were playing around with LN2, I wouldn't hesitate one second to do the mod.


----------



## Hulio225

Quote:


> Originally Posted by *Coopiklaani*
> 
> But there's no sign of pref cap anywhere. And this time I kept the GPU temp around 40C for the entire time to ensure there's no thermal downclock whatsoever.


I did some comparisons too now:

2101 MHz cliff vs. smooth on the stock FE BIOS... max load temp 34°C, and it looks like your results: all cliffed curves score worse.

Before, I made all my curves smooth up to a certain point. The question now is whether it has to be smooth all the way down, or if it's enough to be smooth the last 5, 6, or 7 bins below your max clock.


----------



## Coopiklaani

Quote:


> Originally Posted by *Hulio225*
> 
> i did some comparison too now,
> 
> 2101 mhz cliff vs smooth on stock fe bios... max load temp 34 ° C and it looks like your results... all cliffed curves scoreing worse
> 
> before i made all my curves to a certain point smooth, the question is now does it have to be all way long smooth or is it enough like 5,6 or 7 bins away from your max clock


I find it performs best when the jump is at most 2 bins (25 MHz).
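For anyone wanting to automate that, here's a rough sketch of what "smoothing" a cliff curve means. This is only an illustration, assuming ~13 MHz bins; the voltage points and offsets are made up, and `smooth_curve` is a hypothetical helper, not anything Afterburner actually exposes:

```python
BIN_MHZ = 13  # Pascal's clock-bin granularity, roughly (assumption)

def smooth_curve(cliff, max_step_bins=2):
    """cliff: list of (voltage, MHz) points sorted by ascending voltage.
    Walk down from the top point, raising any point that sits more than
    max_step_bins below its higher-voltage neighbour."""
    smoothed = [cliff[-1]]  # keep the top (voltage, clock) point as-is
    for v, mhz in reversed(cliff[:-1]):
        floor = smoothed[-1][1] - max_step_bins * BIN_MHZ
        smoothed.append((v, max(mhz, floor)))
    return list(reversed(smoothed))

# A made-up "cliff" curve: flat everywhere, then one big jump at the top.
cliff = [(0.80, 1500), (0.90, 1500), (1.00, 1500), (1.05, 2100)]
print(smooth_curve(cliff))
```

The result ramps up in 26 MHz steps to the same 2100 MHz top point, so if the card ever drops a bin for temperature it lands one step down instead of off a cliff.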


----------



## CptSpig

Quote:


> Originally Posted by *Hulio225*
> 
> http://forum.hwbot.org/showthread.php?t=167865
> 
> I haven't said its easy, i said its not that hard...
> sure the risk to damage something is higher and the real benefit while being on air or water tends against zero...
> if i would play around with ln2, i wouldn't hesitate 1 second to do the mod


Yes, this is KingPin's mod. Did you read the procedure? It provides extra voltage; the small wires are for monitoring voltage. A BIOS flash is not going to make this happen. This is what the original post was comparing. Thanks for the link.


----------



## done12many2

Quote:


> Originally Posted by *rolldog*
> 
> 
> Who's ya daddy?


Not you.









That 6950X and its CPU score are helping you out a lot, but that graphics score though!

18,697



21,650


----------



## Hulio225

Quote:


> Originally Posted by *CptSpig*
> 
> Yes this is KingPin's mod. Did you read the procedure it provides extra voltage. The small wires are for monitoring voltage. A bios flash is not going to make this happen. This is what the original post was comparing. Thanks for the link.


Yeah, I know this is Kingpin's mod^^ and I read the whole article when it came out xD
But I guess you meant OneCosmic^^

And you mean what I was comparing: the crap XOC BIOS versus a real mod. The original comparison post is from me^^

But who knows, maybe there is some fine tuning possible in that BIOS; if not, the LN2 OC guys will still need hard mods for #1, like it should be ^^


----------



## Nico67

Quote:


> Originally Posted by *Eco28*
> 
> Now I think I need some help to verify this. I have been toying with my FE and testing some basic OC since past few days. Nothing crazy, GPU to 2000Mhz mark (set flat +100-150 on core with 120% power). It kinda did well on both ends. Especially the fact that I have been getting rather solid 2012-2020 OC on GPU when using stock FE cooler before it got too hot and throttled. Now... the card is under the water and I was happy to finally lose these thermal chains. To my surprise card is getting PrefCap heavily , hell it is power limited even on stock clocks (stock bios, no shunt though). Is this "normal" or remotely possible to be power limited without OC even on old Heaven benchmark? Could I just break and mess something up when doing some initial OC? PSU problems perhaps?
> 
> 
> 
> Can anyone still on FE bios reset the clocks, power target, voltage, test and confirm please?
> I am like 75% sure I wasn't seeing any power limit when I first ran this benchmark.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks!


It doesn't look like you're using 120% in that pic, but it's easier to tell if you use MAX on the stats.


----------



## Nico67

Quote:


> Originally Posted by *Coopiklaani*
> 
> Some experiments with different curve types with XOC bios. One uses the "cliff" type, the other uses the smooth type. Smooth curve does result in much much better scores. The cliff type or F-V locking results are significantly worse.


Very interesting; it shouldn't really have an effect if it's unlimited, but given we don't see a lot of the limiting stats with XOC, it's possible something is occurring that we don't see.








All my testing was done with a cliff curve, just raising the voltage I was testing at and hitting apply, so I'll do some more tests tonight with flatter curves.


----------



## fg2chase

got mine on the 27th!

Gigabyte Founders Edition


----------



## CptSpig

Quote:


> Originally Posted by *Hulio225*
> 
> Yeah i know that this is kingpins mod^^ and i have read the whole article when it came out xD
> but i guess you meant OneCosmic^^
> 
> And you mean what i was comparing, the crap xoc bios to a real mod, the original comparing post is from me^^
> 
> but who knows maybe there is some fine tuning possible in that bios, if not ln2 oc guys still need hard mods for #1, like it should be ^^


KingPin is sponsored by EVGA; if he did this on a Titan Xp he would go over 14000 easily! I am getting close to 12000 with my Titan Xp on a Predator 360 with a prefilled block, at a 195 core and 741 memory OC.


----------



## KedarWolf

Quote:


> Originally Posted by *Hulio225*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CptSpig*
> 
> Provide a link on this tutorial? This is not easy if it was everyone would be doing this mod.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://forum.hwbot.org/showthread.php?t=167865
> 
> I haven't said its easy, i said its not that hard...
> sure the risk to damage something is higher and the real benefit while being on air or water tends against zero...
> if i would play around with ln2, i wouldn't hesitate 1 second to do the mod
Click to expand...

I'm going to do the easy part: three 47-ohm 0805 resistors on the three shunts, for I think 1.85x TDP. My card sits vertical under water, which is why I'd much rather do it this way than with CLU.









Waiting on the resistors, thermal glue, and conductive thermal paint from eBay. Cost me less than $10 in total with (really slow) free delivery.
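For context on why putting resistors on the shunts raises the effective power limit, here's the general principle as a sketch. The 5 mΩ shunt value is an assumption for illustration; the 47 Ω sense-pin variant KedarWolf describes arrives at its ~1.85x factor by a different route, so don't read these exact numbers into his mod:

```python
# The card estimates power from the voltage drop across each shunt
# resistor. Lowering the effective shunt resistance makes the controller
# under-read power by the same factor, which acts like raising the
# power limit.

def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

stock = 0.005     # assumed 5 mOhm sense shunt
stacked = 0.005   # soldering an identical resistor on top of it
effective = parallel(stock, stacked)
limit_scale = stock / effective  # how far the reported power is scaled down
print(f"effective shunt {effective * 1000:.2f} mOhm -> ~{limit_scale:.2f}x power headroom")
```

Stacking an identical shunt halves the resistance, so the card reports half the real power, effectively doubling the limit; other resistor values shift the factor accordingly.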


----------



## TahoeDust

I just ordered an EVGA SC2 Black Edition. What kind of overclock should I expect?


----------



## Benny89

Quote:


> Originally Posted by *TWiST2k*
> 
> My FTW3 arrived today! I slapped it in and did a quick bench with everything stock and got a 9162 on Superposition 4k optimized. Now it's time to dig in and see what she can do. I did notice that only 1 of the 3 fans was spinning, I wonder if I will be able to control all of the fans thru afterburner or if I will be forced to use precision x ugh, time to find out.


Can't wait for your OC results. Please share asap


----------



## Hulio225

Quote:


> Originally Posted by *TahoeDust*
> 
> I just ordered a EVGA SC2 Black Edition. What kind of overclock should I expect?


Like every other 1080 Ti: the die is the same on all cards, so it's the silicon lottery, like always... somewhere between 2000-2100 is realistic.


----------



## Luckbad

Quote:


> Originally Posted by *TahoeDust*
> 
> I just ordered a EVGA SC2 Black Edition. What kind of overclock should I expect?


Don't expect a specific overclock; all of the cards have roughly the same potential. You might get lucky and find a nice card that does over 2 GHz, you might get REALLY lucky and find one that can do over 2100 if cooled properly, or you might get screwed and find one that can't exceed 1800 or so. Memory is the same story: most people can get a few hundred, some can do 600+.


----------



## dunbheagan

Quote:


> Originally Posted by *TahoeDust*
> 
> I just ordered a EVGA SC2 Black Edition. What kind of overclock should I expect?


A good realistic OC without too much effort should be around 2025/6000. Try to avoid power limiting and keep the temps down; that is more important than the highest possible clock. Try [email protected] for instance.

Don't forget, most Tis can do 2000 stable and most of them do not get above 2100 totally stable, so in most cases we are talking about a 5% performance difference in the silicon lottery; that's not much. Look for 2025-2050 on the core and 5800-6000 on the memory. If you can go higher, you are lucky, but there's not much to gain, FPS-wise. A 1080 Ti at 2113/6100 is only about 4% faster than at 2012/6000.


----------



## Coopiklaani

Quote:


> Originally Posted by *Luckbad*
> 
> Don't expect an overclock. All of the cards have roughly the same potential. You might get lucky and find a nice card that does over 2 GHz, you might get REALLY lucky and find one that can do over 2100 if cooled properly, you might get screwed and find one that can't exceed 1800 or so. Memory is the same story. Most people can get a few hundred, some can do 600+.


I currently have two 1080 Ti FEs with me. One can do [email protected] on the core but only +300 on the vram. The other can do [email protected] on the core and a crazy +950 on the vram... which one should I keep?


----------



## done12many2

Quote:


> Originally Posted by *Coopiklaani*
> 
> I have currently 2 1080ti FEs with my. One can do [email protected] on the core and only +300 on the vram. The other can do [email protected] on the core and a crazy +950 on the vram.. which one should I keep..


I'd keep both.









In all seriousness though, I'd keep the second only because I know I can more easily fix the core speed with improved cooling and voltage.


----------



## phenom01

So here's an odd question: why does Amazon Prime cap my streaming quality at SD with my 1080 Ti installed, but when I plug my GTX 970 into the same system I can stream 1080p HD all day?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Coopiklaani*
> 
> I have currently 2 1080ti FEs with my. One can do [email protected] on the core and only +300 on the vram. The other can do [email protected] on the core and a crazy +950 on the vram.. which one should I keep..


Number 2, just because it can run at a lower voltage.

Running at 1.093 isn't ideal because you'll hit normalized power limiting in some benchmarks and possibly even games.


----------



## dunbheagan

Quote:


> Originally Posted by *Coopiklaani*
> 
> I have currently 2 1080ti FEs with my. One can do [email protected] on the core and only +300 on the vram. The other can do [email protected] on the core and a crazy +950 on the vram.. which one should I keep..


I would definitely go for the card with the higher memory! +15MHz core against +650MHz memory; that's an easy decision!


----------



## Luckbad

Quote:


> Originally Posted by *Coopiklaani*
> 
> I have currently 2 1080ti FEs with my. One can do [email protected] on the core and only +300 on the vram. The other can do [email protected] on the core and a crazy +950 on the vram.. which one should I keep..


The second card for sure. It will both perform better because of that crazy memory clock and last longer at the lower voltage (I'd consider knocking it down another bin or two and reducing the voltage further unless you're on water).


----------



## Coopiklaani

Quote:


> Originally Posted by *done12many2*
> 
> I'd keep both.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In all seriousness though, I'd keep the second only because I know I can more easily fix the core speed with improved cooling and voltage.


Not a big fan of SLI.







But the second card doesn't really scale well with voltage, as I tested with the XOC BIOS. I haven't tested the first one yet.
Quote:


> Originally Posted by *SlimJ87D*
> 
> No 2,justice because it can run at a lower voltage.
> 
> Running at 1.093 isn't ideal bec you'll hit normalized power limiting in some benchmarks and possibly even games.


Power limiting isn't a problem now with the XOC BIOS. I don't have a power limit problem even with the FTW3 BIOS and a shunt mod. I can probably push the first card past 2100 with the additional voltage from the XOC BIOS, but I can't do anything to help its VRAM overclocking. As for the second card, I'm currently running it under water; with the XOC BIOS and additional voltage I can push it to 2151MHz on the core, but at 1.2V. Not very green for everyday use.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Coopiklaani*
> 
> Not a big fan of SLI.
> 
> 
> 
> 
> 
> 
> 
> But the second card doesn't really scale well with voltage as I tested with the xoc bios. havn't tested the first one yet.
> pwr limit isn't a problem now with xoc bios. I don't have pwr limit problem even with the FTW3 bios and shunt mod. I can probably push the first card to break the 2100 with the help of additional voltage from the xoc bios, but I can't do anything to help it with its vram overclocking. As for the second card, I'm currently running it underwater, with the xoc bios and additional voltage, I can push it to 2151MHz on the core, but @1.2v. not very green for everyday use.


Why even bother asking, then?


----------



## hotrod717

Quote:


> Originally Posted by *CptSpig*
> 
> King Pin is using Frankenstein Cards. Not to many people can do these kind of mods. Voltage directly to the card from PSU.
> 
> http://s1164.photobucket.com/user/C...1ce700c_1080Tivoltmodded_zpsw2szzqo0.jpg.html


Um, you mean Tin is modding the cards that he benches, right?? Tin is the electrical engineer and one half of team Kingpin. Without Tin.......


----------



## Luckbad

Quote:


> Originally Posted by *TWiST2k*
> 
> My FTW3 arrived today! I slapped it in and did a quick bench with everything stock and got a 9162 on Superposition 4k optimized. Now it's time to dig in and see what she can do. I did notice that only 1 of the 3 fans was spinning, I wonder if I will be able to control all of the fans thru afterburner or if I will be forced to use precision x ugh, time to find out.


Afterburner can't control the independent fans, but you can use Precision only to control fans and Afterburner for the rest.

After you find your sweet spots, you can switch completely to Precision. It has a voltage/clock curve as well, it's just not as easy to work with.


----------



## CoreyL4

So what is the general consensus on the XOC BIOS? I see mixed reviews of it.


----------



## willverduzco

To those convinced that the XOC BIOS results in decreased performance at a given clock, please refer to my tests here.

Previous comparisons people have made involved using deltas of scores vs clocks across different systems. I cannot emphasize how flawed of an approach this is, given the countless lurking variables that could come into play. I set out to look for differences, while minimizing the variable to just one: BIOS used. My tests were extremely controlled. Temperature, clocks, and system settings did not vary between runs. The only thing that changed was the BIOS.

TLDR: I ran two different clock speeds (1949 and 1886) on two different BIOSes (XOC and EVGA FE extracted from my own card), three times each using Time Spy. Clocks were chosen to make sure that there would be absolutely no GPUBoost throttling shenanigans on the EVGA FE BIOS. I also ensured that no step in the FV curves was greater than 2 frequency bins. I then made sure clocks were locked (both watching while the benchmarks were running and examining logs after). Long story short, XOC was negligibly (but statistically significantly) higher scoring than the EVGA FE BIOS that came with my card when locked at either frequency. Nothing else changed with the test setup, and standard deviations on each test were quite small compared to the overall magnitude of each test.

Because of the equivalent performance (and the fact that non-shunted cards can avoid power limit problems on the XOC BIOS), I have no reason to think that the EVGA FE BIOS that shipped with my EVGA FE card is in any way better than the XOC BIOS.
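For anyone wanting to reproduce this kind of controlled A/B comparison, a minimal sketch of a Welch's t-test on two sets of locked-clock benchmark scores, using only the standard library (the score values below are made up for illustration, not the actual data from these tests):

```python
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variance."""
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / ((va / len(a) + vb / len(b)) ** 0.5)

# Hypothetical Time Spy graphics scores at one fixed, verified-locked clock:
xoc_bios = [9850, 9862, 9858]
fe_bios = [9820, 9815, 9828]

t = welch_t(xoc_bios, fe_bios)
print(f"t = {t:.2f}")  # a large |t| means the gap is unlikely to be run-to-run noise
```

The key point of the methodology above survives in the sketch: both samples come from the same system at the same locked clock, so the BIOS is the only variable left.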


----------



## Hulio225

Quote:


> Originally Posted by *willverduzco*
> 
> To those convinced that the XOC BIOS results in decreased performance at a given clock, please refer to my tests here.
> 
> *Previous comparisons people have made involved using deltas of scores vs clocks across different systems. I cannot emphasize how flawed of an approach this is, given the countless lurking variables that could come into play.* I set out to look for differences, while minimizing the variable to just one: BIOS used. My tests were extremely controlled. Temperature, clocks, and system settings did not vary between runs. The only thing that changed was the BIOS.


This was a simple addition to testing already done, exactly like yours, isolated even further.
On a stripped-down bench OS, comparing just Time Spy GPU test 1 or 2, and not just 3 runs: I did over 100 runs in total on the FE and XOC BIOS and compared them. KraxKill did the same, with the same conclusion.
From a certain point on, XOC performs ~5% worse.
You joined the XOC party today and missed everything we discussed and concluded yesterday.
At low clocks, close to stock boost, we already know the performance is the same as or close to the FE BIOS; it starts to get worse at around 2100, and that's somehow strange.


----------



## KedarWolf

Quote:


> Originally Posted by *willverduzco*
> 
> To those convinced that the XOC BIOS results in decreased performance at a given clock, please refer to my tests here.
> 
> Previous comparisons people have made involved using deltas of scores vs clocks across different systems. I cannot emphasize how flawed of an approach this is, given the countless lurking variables that could come into play. I set out to look for differences, while minimizing the variable to just one: BIOS used. My tests were extremely controlled. Temperature, clocks, and system settings did not vary between runs. The only thing that changed was the BIOS.
> 
> TLDR: I ran two different clock speeds (1949 and 1886) on two different BIOSes (XOC and EVGA FE extracted from my own card), three times each using Time Spy. Clocks were chosen to make sure that there would be absolutely no GPUBoost throttling shenanigans on the EVGA FE BIOS. I also ensured that no step in the FV curves was greater than 2 frequency bins. I then made sure clocks were locked (both watching while the benchmarks were running and examining logs after). Long story short, XOC was negligibly (but statistically significantly) higher scoring than the EVGA FE BIOS that came with my card when locked at either frequency. Nothing else changed with the test setup, and standard deviations on each test were quite small compared to the overall magnitude of each test.
> 
> Because of the equivalent performance (and the fact that non-shunted cards can avoid power limit problems on the XOC BIOS), I have no reason to think that the EVGA FE BIOS that shipped with my EVGA FE card is in any way better than the XOC BIOS.


In all my tests, the FTW3 BIOS at lower clocks outperforms the XOC BIOS at 2100 core. And I flashed and tested both on two different occasions. It seems that at higher clocks you lose performance on the XOC BIOS.

And several others have drawn the same conclusion.

Why test at low clocks when next to no one is going to run either BIOS at those clocks?


----------



## willverduzco

Quote:


> Originally Posted by *KedarWolf*
> 
> All my tests FTW3 BIOS at lower clocks outperforms XOC BIOS at 2100 core. And I flashed and tested both on two different occasions. It seems at higher clocks in both BIOS's you lose performance in XOC.
> 
> And several others have drawn the same conclusion.
> 
> Why test at low clocks when next to no-one is going to run either BIOS at those clocks.


Testing at low clocks is required for me since I am not going to shunt mod my card. I didn't shunt mod my Titan XP and I am not going to shunt mod my 1080TI. This way, I avoided any throttling due to inability to hold clocks.

I must add the caveat that I never tried the FTW3 BIOS, as once I learned that this BIOS worked, I was already busy flashing and testing XOC. At some point in the near future, I'll run clock for clock tests on FTW3 vs XOC as I have already done versus my EVGA FE BIOS. I also want to test the particular FE BIOS that Hulio is using, as that may be the cause for these differences.
Quote:


> Originally Posted by *Hulio225*
> 
> This was a simple addition, to testing already done, exactly like yours, isolated even more like you did.
> On a bench OS stripped down, just compared time spy gpu test 1 or 2, and not just 3 runs, i did in total over 100 runs on fe and xoc bios and compared. KraxKill did the same, with the same conclusion.
> From a certain point on xoc is performing ~5% worse.
> You joined the party in terms of xoc today, and missed everything we discussed and concluded yesterday.


I'd love to try the FE BIOS that you and KraxKill used, as it certainly could not have been the EVGA FE BIOS that shipped with my card. Perhaps the FE BIOS you are using is better than mine? In any case, using my FE BIOS and the XOC BIOS, there was a negligible (but repeatable and statistically significant) *improvement* when going to the XOC BIOS from the EVGA FE BIOS that I pulled from my card and posted in the 1080 Ti BIOS thread. Finally, my joining the party late in no way invalidates the results that I measured, analyzed statistically, and will gladly repeat.


----------



## Hulio225

Quote:


> Originally Posted by *willverduzco*
> 
> Testing at low clocks is required for me since I am not going to shunt mod my card. I didn't shunt mod my Titan XP and I am not going to shunt mod my 1080TI. This way, I avoided any throttling due to inability to hold clocks.
> 
> I must add the caveat that I never tried the FTW3 BIOS, as once I learned that this BIOS worked, I was already busy flashing and testing XOC. At some point in the near future, I'll run clock for clock tests on FTW3 vs XOC as I have already done versus my EVGA FE BIOS. I also want to test the particular FE BIOS that Hulio is using, as that may be the cause for these differences.
> I'd love to try the FE BIOS that you and KraxKill used, as it certainly could not have been the EVGA FE BIOS that shipped with my card. Perhaps the FE BIOS you are using is better than mine? In any case, using my FE BIOS and the XOC BIOS, there was a negligible (but repeatable and statistically significant) *improvement* when going to the XOC BIOS from the EVGA FE that I pulled from my card and posted in my post in the 1080TI BIOS thread.


It's the standard NVIDIA FE BIOS, and yes, I'm certain it's better than the EVGA FE BIOS; I tested that as well. Not by much, but I couldn't pass Time Spy test 1 with the EVGA FE BIOS @ 2113, while with the NVIDIA one it always passed...

Edit: sure, XOC at low clocks will be better than FE at low clocks, since there's no power limit...

Edit 2: all FE BIOSes are the same NVIDIA FE BIOS except for EVGA and ASUS; those FE BIOSes differ. I compared them with a hex editor, and they are less stable and in general perform a tiny, tiny bit worse clock for clock.


----------



## OneCosmic

Quote:


> Originally Posted by *Hopesolo*
> 
> SHUNT MOD
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2088/6250MHz


Why only 2088MHz? Can't you do more than 2088MHz using the 1.093V?


----------



## Coopiklaani

Quote:


> Originally Posted by *Hulio225*
> 
> Its the standard nvidia fe bios, and yes im certainly sure its better than the evga fe bios. i tested that as well. not by much but i couln't pass time spy test 1 with evga fe bios @ 2113 and with the nvidia one always...
> 
> Edit: sure xoc at low clocks will be better than fe on low clocks since no power limit...
> 
> Edit2: all fe bios are the same nvidia fe bios, but evga and asus, those fe bios differ, i compared them with a hex editor, and they are less stable and perform in general clock for clock a tiny tiny bit worse


Aren't all FE BIOSes the same except for the vendor identifier? Both EVGA Precision X and ASUS GPU Tweak use the vendor identifier to recognize their FE cards and unlock specific functions, such as Precision X's OC scanner. Beyond that, I don't think the FE BIOSes differ in any other way.


----------



## CptSpig

Quote:


> Originally Posted by *hotrod717*
> 
> Um, you mean Tin is modding the cards that he benches, right?? Tin is the electrical engineer and one half of team Kingpin. Without Tin.......


You are absolutely correct!


----------



## niobium615

Quote:


> Originally Posted by *Coopiklaani*
> 
> Not a big fan of SLI.
> 
> 
> 
> 
> 
> 
> 
> But the second card doesn't really scale well with voltage as I tested with the xoc bios. havn't tested the first one yet.
> pwr limit isn't a problem now with xoc bios. I don't have pwr limit problem even with the FTW3 bios and shunt mod. I can probably push the first card to break the 2100 with the help of additional voltage from the xoc bios, but I can't do anything to help it with its vram overclocking. As for the second card, I'm currently running it underwater, with the xoc bios and additional voltage, I can push it to 2151MHz on the core, but @1.2v. not very green for everyday use.


Which one of those did you do the resistor mod on? Any reason you're experimenting with other locked BIOSes like the FTW3 if you already removed the power limit?


----------



## Hulio225

Quote:


> Originally Posted by *Coopiklaani*
> 
> Aren't all FE bios' are the same except for the vendor identifier? Both EVGA precisionX and ASUS GPU tweaker use the vendor identifier to recognise their FE cards to unlock specific functions, such as EVGA precisionX's OC scanner. Besides, I don't think FE bios' are different in any other ways.


The EVGA and ASUS FE BIOSes even differ in version number from the other FE BIOSes.

If you compare all the FE BIOSes in a hex editor, you see one line is different; but on the ASUS FE and EVGA FE there are far more lines that differ than just the vendor ID line.


----------



## willverduzco

Quote:


> Originally Posted by *Hulio225*
> 
> Its the standard nvidia fe bios, and yes im certainly sure its better than the evga fe bios. i tested that as well. not by much but i couln't pass time spy test 1 with evga fe bios @ 2113 and with the nvidia one always...
> 
> *Edit: sure xoc at low clocks will be better than fe on low clocks since no power limit...*


From your edit, I can tell that you did not bother to read the testing methodology on my post in the other thread. I selected the low clocks specifically in order to avoid the power limit, which was then verified in real time via Afterburner mobile app and post hoc using logs. There was absolutely no power throttling, and clocks stayed 100% consistent at 1886. At 1949, they were 99% locked (with the exception of around 5 seconds during GFX test 2 in Time Spy). That is why I had to go so low on my non-shunted card.

Regarding the BIOS, I will test again using the NVIDIA FE BIOS (as opposed to the EVGA FE BIOS that shipped with my card) in the near future, although I'm not expecting a huge difference... at least not anything that would lead me to conclude that any of them are "defective" as others have said in this and the 1080TI BIOS thread. For reference, this is the BIOS to which you refer, correct? Out of curiosity, mind trying my BIOS, which was linked in my post on the other thread? I'd love to see if you see decreased performance using my card's stock FE BIOS.


----------



## CoreyL4

I was going to flash my Gaming X to the XOC BIOS for the unlocked power limit, but now I'm second-guessing it after all the analysis.


----------



## OneCosmic

Quote:


> Originally Posted by *Hulio225*
> 
> EVGA and Asus FE bios even differ in version number compared to the other FE Bios
> 
> if you compare all fe bios in hex editor you see one line is different, but on asus fe and evga fe there are way more lines different than just the vendor id line


Is this BIOS the same version as yours when compared by version number and in a hex editor?

https://www.techpowerup.com/vgabios/190928/nvidia-gtx1080ti-11264-170118

So basically you are saying that the pure NVIDIA FE BIOS performs better than any branded FE BIOS?


----------



## Hulio225

Quote:


> Originally Posted by *willverduzco*
> 
> From your edit, I can tell that you did not bother to read the testing methodology on my post in the other thread. I selected the low clocks specifically in order to avoid the power limit, which was then verified in real time via Afterburner mobile app and post hoc using logs. There was absolutely no power throttling, and clocks stayed 100% consistent at 1886. At 1949, they were 99% locked (with the exception of around 5 seconds during GFX test 2 in Time Spy). That is why I had to go so low on my non-shunted card.
> 
> Regarding the BIOS, I will test again using the NVIDIA FE BIOS (as opposed to the EVGA FE BIOS that shipped with my card) in the near future, although I'm not expecting a huge difference... at least not anything that would lead me to conclude that any of them are "defective" as others have said in this and the 1080TI BIOS thread. For reference, this is the BIOS to which you refer, correct? Out of curiosity, mind trying my BIOS, which was linked in my post on the other thread? I'd love to see if you see decreased performance using my card's stock FE BIOS.


I have read your methodology; I'm simply tired as f***, it's 4:39 in the morning here xD
I'm happy for you if it works out better for you with the XOC BIOS.

If your EVGA FE BIOS is the same as the one on TPU, I already tested it.

And yep, that's the NVIDIA BIOS. I'll compare yours with the one on TPU and then decide whether to test it.


----------



## Alwrath

Quote:


> Originally Posted by *CoreyL4*
> 
> I was gonna flash my gaming x to the xoc bios for no power limits, but now i am second guessing it from alll the analysis


Try the FTW3 BIOS; I'm pretty sure a user on this forum tested it on your MSI Gaming X and had great results.


----------



## Hulio225

Quote:


> Originally Posted by *willverduzco*
> 
> ........


TPU EVGA FE vs. your ROM:

[screenshot of BIOS comparison]

It's identical except for one chart.

NVIDIA FE vs. EVGA FE:

[screenshot of BIOS comparison]

As you can see, there are more differences.


----------



## TWiST2k

Quote:


> Originally Posted by *Luckbad*
> 
> Afterburner can't control the independent fans, but you can use Precision only to control fans and Afterburner for the rest.
> 
> After you find your sweet spots, you can switch completely to Precision. It has a voltage/clock curve as well, it's just not as easy to work with.


The curve in Precision is pretty atrocious, haha. It's a shame there isn't a better way to control the FTW3 fans at this time. My 980 Ti has a custom BIOS and works great; then my 1080 needed me to run Afterburner, and now I have to run Afterburner AND Precision X, ugh. I sure do miss the custom BIOS days.


----------



## willverduzco

Quote:


> Originally Posted by *Hulio225*
> 
> TPU EVGA FE vs your ROM
> 
> 
> 
> its identical, but one chart
> 
> NVIDIA FE vs EVGA FE
> 
> 
> 
> as you can see there are more differences.


Thank you for checking in a hex editor. Looking forward to testing the Nvidia BIOS!


----------



## Hulio225

Quote:


> Originally Posted by *willverduzco*
> 
> Thank you for checking in a hex editor. Looking forward to testing the Nvidia BIOS!


There are significant differences...

And now the funny part: the ASUS FE BIOS, which has the same version number as the NVIDIA one,

[screenshot of BIOS comparison]

is different...

All the other FE BIOSes look like the NVIDIA FE BIOS in the compare tool; only the one vendor ID line differs.

Like I said, I tested them all, isolated under the same test conditions, compared them all, etc.

I'm always pursuing more performance... but sadly it was all a waste of time ^^


----------



## Bishop07764

Quote:


> Originally Posted by *Luckbad*
> 
> I've flashed a few on non-FE cards.
> 
> They generally work, but some cards have weird things that could screw up.
> 
> EVGA cards with ICX might not have proper fan control since their fans are independent, Zotac Amp Extreme behaves oddly on other Bios probably because of its power phase chips, etc.


At least I won't have to worry about fan control since mine is under a waterblock.









Quote:


> Originally Posted by *KedarWolf*
> 
> I'm running Diablo 3 at 2062 core 1.050v, G-Sync on at 4K, not power limiting at all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just upped it to 1.093v 2088, zero power limiting at all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: FTW3 BIOS is great, Diablo 3: Reaper Of Souls at 2100 core, +667 memory (6177) , zero power limiting while playing the game, completely stable.


Quote:


> Originally Posted by *Nico67*
> 
> can't load or play games at 2101/6003 which I can on the FE bios. Each time I flash back to FE, games are fine again. Each card is different so you may have better luck


The FTW3 BIOS sounds kind of promising. Results seem mixed all around with the different BIOSes in general, though; it definitely seems very card-dependent. I've been getting back to playing games, and haven't felt the need to flash a BIOS since.

At 2050 core and +625 on memory I have been completely stable in games. I haven't seen any crashing or downclocking, even in The Witcher 3, which is all I've been playing for the past 5 days or so. In fact, I have yet to notice power limiting in any game. Benchmarks are a different story, and it might also be different if I were pushing 4K.


----------



## KraxKill

Quote:


> Originally Posted by *willverduzco*
> 
> From your edit, I can tell that you did not bother to read the testing methodology on my post in the other thread. I selected the low clocks specifically in order to avoid the power limit, which was then verified in real time via Afterburner mobile app and post hoc using logs. There was absolutely no power throttling, and clocks stayed 100% consistent at 1886. At 1949, they were 99% locked (with the exception of around 5 seconds during GFX test 2 in Time Spy). That is why I had to go so low on my non-shunted card.
> 
> Regarding the BIOS, I will test again using the NVIDIA FE BIOS (as opposed to the EVGA FE BIOS that shipped with my card) in the near future, although I'm not expecting a huge difference... at least not anything that would lead me to conclude that any of them are "defective" as others have said in this and the 1080TI BIOS thread. For reference, this is the BIOS to which you refer, correct? Out of curiosity, mind trying my BIOS, which was linked in my post on the other thread? I'd love to see if you see decreased performance using my card's stock FE BIOS.


I'm sure he read your post; it's just that there is nothing to gather from it. There just isn't. You're not testing what needs to be tested.

It's already been mentioned that the BIOSes perform near identically at pedestrian clocks, and you went ahead and tested clocks that are even more pedestrian. I wouldn't expect any performance separation there. You'll find most BIOSes perform identically at some common (low-clock) denominator.

You're trying to compare the performance of two 'identical' Ferraris, and you choose to do so by running one car at the speed limit and the other just slightly above it, while enforcing a rev limit on both cars by avoiding the power limit.

Power-limit behavior is precisely what needs to be tested, and, besides voltage, it's what should make one BIOS better than another.

We're comparing the performance of a BIOS that has a power limit against a BIOS that doesn't and has unlocked voltage.

The XOC claims no power limit, but at clocks and voltages where the power limit IS a part of life, it underperforms to the tune of 5%, and this is reported by KedarWolf as well. He is not (yet) shunt modded.

I ran countless tests on it at 2100MHz and higher, all the way to 2200MHz, and the results are identical to what Hulio is seeing at similar clocks and voltages.

At clocks and voltages where the power limit is a lifestyle, the BIOS that has power and temp throttles disabled is NOT performing better than the BIOS that doesn't! Not only is it not performing better, it's performing 5% slower in GPU-isolated testing such as the 3DMark bench tests 1 and 2 of your choosing.

Sure, if you clock it under 2K, nowhere near the edge of potential, where power-limit events are minimal if present at all, you'll be hard pressed to see a difference. At slightly elevated clocks, where a non-shunted card begins to experience power-limit events more frequently but before the clocks become unstable, you MAY even see a performance increase.

As clocks climb higher, into a space where the room between power-limit events is even more constrained, something happens and the BIOS begins to lose out to the FE, especially for those with elevated power thresholds due to shunt modding.


----------



## TheBoom

So I finally got to flashing the XOC to my Zotac Amp Extreme.

No throttling at my usual gaming clocks of 2025 core and 11.7GHz mem at a voltage offset of +0, which is 1112mV. This is in Firestrike Ultra, which previously would constantly make the card jump around in clocks and voltage.

Best part is it fixed the issue I had with the stock BIOS not stopping the fans at idle and low temps.

Something I noticed, though: the fans seem much louder than they were before, and I mean at the same 100%. Does this BIOS actually change the RPM the fans spin at, or am I hearing things?


----------



## TWiST2k

Quote:


> Originally Posted by *TheBoom*
> 
> So I finally got to flashing the XOC to my Zotac Amp Extreme.
> 
> No throttling at my usual gaming clocks of 2025 and 11.7ghz mem and a voltage of +0 which is 1112mv. This is in Firestrike Ultra which would constantly cause the card to jump around in clocks and voltage previously.
> 
> Best part is it fixed the issue I had with the stock bios not stopping the fans at idle and low temps.
> 
> Something I noticed though, the fans seem much louder than they were before? And I mean at the same 100%. Does this bios actually change the RPM the fans spin at or am I hearing things
> 
> 
> 
> 
> 
> 
> 
> ?


Yes, different BIOS files will have different RPM limits for the fans; I noticed this when cross-flashing 1080s.


----------



## TheBoom

Quote:


> Originally Posted by *TWiST2k*
> 
> Yes, different bios files will have different RPM limits for the fans, I noticed this in the cross flashing of the 1080s.


Hmm, I see. I hope it doesn't ruin the fans, though.

After some testing I've found that the card refuses to go over 1125mV, occasionally hitting 1131mV and then quickly dropping. It seems to really like 1112mV, though, and holds 2050 core and 11.8GHz memory stable in 8K Superposition.

Increasing the core to +75, or 2075, causes an instant crash. I have not tried further overclocking the memory yet.

Temps hit about 80-84°C at 90% fan speed, which is about 3200 RPM with this BIOS. Power draw hit 430 watts.

8K Superposition score of 4622 with the above clocks and a 6700K.


----------



## IMI4tth3w

Finally upgrading the wife's rig. I got her on the waiting list for a 1080 Ti FTW3 on Amazon; hopefully it won't be 10 years before we see the card. It should be a nice upgrade from her 780 Ti!

Her 780 Ti has a Kraken G10 and an AIO cooler on it, but I just don't think I have the heart to install them on the FTW3 card. Even if the cooling is better, the FTW3 cooler is just so beautiful. Legit the best-looking cooler I've ever seen, super clean. And the over-engineering of the card with the extra sensors is total overkill, but the EE in me loves it. I just have to hide it from the 980 Ti in my rig before it gets jealous, although mine's got a nice EK block on it. Maybe one day I'll convert the wife's rig to a custom water loop.


----------



## Slackaveli

Quote:


> Originally Posted by *IMI4tth3w*
> 
> finally upgrading the wifes rig. got her on the waiting list for a 1080ti ftw3 on amazon. hopefully it won't be 10 years before we see the card. should hopefully be a nice upgrade from her 780ti!
> 
> her 780ti has a kraken g10 and AIO cooler on it but i just don't think i have the heart to install it on the ftw3 card. even if the cooling is better, the ftw3 cooler is just so beautiful. like legit the best looking cooler i've ever seen. super clean. and the overengineering of the card with the extra sensors.. total overkill but the EE in me loves it. just have to hide it from my 980ti in my rig before it gets jealous. although mine's got a nice ek block on it. maybe one day i'll convert the wife's rig to a custom water loop.


somebody's gettin some tonight.


----------



## ViRuS2k

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm going to do the easy part, three 47 Ohm 0805 resisters on the three shunts for I think it is 1.85x TDP. My card sits vertical is why under water, much rather do it this way than CLU.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Waiting on the resistors, thermal glue and conductive thermal paint from eBay. Cost me less then $10 in total with (really slow) free delivery.


Can you give me any idea what type of thermal glue and thermal paint I should buy?
Also, after the shunt mod, if I end up selling the card or something happens, is this stuff easy to remove without ripping the IC under the resistor off the board? lol

One other thing, m8: is there a guide out there? I saw pictures on this thread of someone who did it on their 1080s, or it might have been Tis. Did he put up a guide or a YouTube video showing what to do?

I'm not exactly sure when to use the paint or the glue, hehe. Also, I know most people shunt the top 2 resistors near the power cables, but what does that 3rd shunt do? Is it there to gain more vcore voltage, or does it supply better voltage to the card for stability? Is it worth doing all 3 or just the 2? Though when I do have all the bits, I'll probably do all 3, haha.
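For context on why paralleling a resistance across a shunt raises the effective power limit: the card measures the voltage drop across the shunt to estimate current, so lowering the effective resistance makes it under-read power. A hedged sketch of the math (the 5 mOhm stock shunt value is an assumption for illustration, not a measured spec for this card; note that a literal 47 Ohm part in parallel with a milliohm shunt barely moves the reading, so the real-world multiplier depends on the effective resistance of the part plus its solder joints):

```python
def power_read_scale(r_shunt, r_parallel):
    """Factor by which true power exceeds what the card reports after
    paralleling r_parallel (ohms) across a current-sense shunt of r_shunt (ohms)."""
    r_eff = (r_shunt * r_parallel) / (r_shunt + r_parallel)  # parallel resistance
    return r_shunt / r_eff  # algebraically equal to 1 + r_shunt / r_parallel

r_shunt = 0.005  # assumed 5 mOhm stock shunt

# To under-read by the ~1.85x quoted above, the added parallel resistance
# (including solder and pad resistance) would need to be roughly:
r_needed = r_shunt / 0.85
print(f"needed parallel resistance: {r_needed * 1000:.2f} mOhm")  # ~5.88 mOhm
```

In other words, it's the milliohm-scale effective resistance of the mod that does the work, which is also why removal (and any leftover conductive paint) matters for warranty and resale.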


----------



## KingEngineRevUp

While brushing my teeth, I just had an epiphany.

There was a time when people thought flashing a BIOS would give 30-50 extra watts and drastically increase our performance. We all had peace of mind and probably got better sleep, or at least gamed more.

But after testing, research and discoveries, those times are long gone, and the search continues... for a way to gain an extra 0.5%-1% in performance... There are even users who purchase and return cards on a weekly basis to try to get Willy Wonka's golden ticket.

But remember, people: there was a time you had peace of mind because you thought a magic BIOS solved all your problems. It was all mental. While it's great for us to research, test and discover new possibilities, don't feel insecure about 0.5%-2% of performance. You're not missing out on much.


----------



## willverduzco

Quote:


> Originally Posted by *KraxKill*
> 
> I'm sure he read your post, it's just that there is nothing to gather from it. There just isn't. You're not testing what needs to be tested.
> 
> It's already been mentioned that the BIOS performs near identical at pedestrian clocks and you went ahead and tested clocks that are even more pedestrian. I wouldn't expect any performance separation there. You'll find most bios will perform identical at some common (low clock) denominator.
> 
> You're trying to compare the performance of two 'identical' Ferraris and choose to do so by running one car at the speed limit and the other just slightly above it while enforcing a rev limit on both cars by avoiding pwr.
> 
> Pwr behavior is precisely what needs to be tested and what besides voltage should make one bios better than another.
> 
> We're comparing the performance of a bios that has a pwr, against a bios that doesn't and has unlocked voltage.
> .
> The XOC claims no PWR, but at clocks and voltages where PWR IS a part of life it's underperforming to the tune of 5%, and this is reported by KedarWolf as well. He is not 'yet' shunt modded.
> 
> I ran countless tests on it at 2100MHz and higher, all the way to 2200MHz, and the results are identical to what Hulio is seeing at similar clocks and voltages.
> 
> At clocks, frequencies and voltages where pwr is a lifestyle the bios that has pwr and temp throttles disabled is NOT performing better than a bios that doesn't! Not only is it not performing better, it's performing 5% slower in gpu isolated testing such as the 3DMark bench test 1 and 2 of your choosing.
> 
> Sure if you clock it under 2k, nowhere near the edge of potential where pwr events are minimal if present at all you'll be hard pressed to see a difference. While at slightly elevated clocks where a non shunted card begins to experience pwr events more frequently but before you achieve clocks that are unstable you MAY even see performance increase.
> 
> As clocks climb higher into a space where the area between pwr events is even more constrained, something happens and the BIOS begins to lose out to the FE, especially for those with elevated power thresholds due to shunt modding.


I am not going to argue with you beyond clarifying my point with this reply, as it was tiring trying to wade through the grammatical errors and rhetoric of your post to extract actual content, but you clearly do not understand what I set out to demonstrate. I was trying to check for clock-for-clock performance differences at a frequency that both BIOSes can hit, without power throttling even in the picture (which would impact and thus bias against the FE BIOS). I found that when clocks were identical, there were only minor (but statistically significant) differences, such that the XOC BIOS performed measurably (albeit marginally) better when neither card was hampered by a power limit throttle.

I couldn't test at a higher clock because anything higher than 1886 on my non-shunt modified card results in at least a second or two of power limit clock reduction with the FE BIOS. This ONLY affects my card when running the FE BIOS, which would bias the results against the FE BIOS, making the XOC seem even faster in comparison.

You could try to argue that this lacks external validity to higher clocks, but that doesn't make sense for core clocks. If we were modifying memory clocks, I would concede the possibility. For memory clocks, this would be feasible since BIOSes often have different memory timings for different clock ranges. This was a pretty big deal on my last AMD cards, where flashing to a BIOS with tighter memory timings yielded significantly improved per-clock performance. I've not seen proof of this for Pascal (perhaps due to Nvidia's tight control of the supply chain and most aspects of AIB cards in general), but it's entirely possible. However, the information stored for core clocks should just be the target F-V curve for GPU Boost, as well as the various limits that would detract from that.

As long as neither BIOS reports erroneously inflated clocks at higher speeds, my results should theoretically generalize to higher performance bins as well. For reference, in my 1948 MHz tests, where the non-shunted FE BIOS throttled for about 5 seconds even at 120% Power Limit, the performance delta between the two BIOSes was even larger. At higher clocks on non-shunt modified cards, the difference will be even larger than that.

You refer to when "_PWR IS a part of life_" (which I can only infer refers to how cards behave near / at the power limit). However, comparing at this point will logically favor a BIOS that lacks a power limit entirely. This is because at absolutely no time when running the XOC BIOS, did I ever encounter a throttling occurrence, as verified by looking at frequency and Perf Limit logs at speeds ranging from 1886 to 2126. Similarly, I found that at [email protected] locked using XOC BIOS, I achieved the higher scores I would expect when compared to my previous bests (albeit at lower and somewhat variable clocks) using the other BIOSes. Ideally, I would have tested at 2126 on the other BIOSes as well, but that was not sustainable on the other BIOSes, further leading one to believe that XOC > EVGA FE for non-shunted cards. Ultimately, there's no voodoo occurring in the low 2100s from what I've seen, although I am willing to admit that it is possible that something _may_ be occurring higher than 2126, as that is the absolute highest my card will clock at any sane voltage for ambient cooling. However, at 2126 and below (which includes what you refer to "_As clocks climb higher into a space where the area between pwr events is even more constrained_"), the XOC BIOS holds its clocks and delivers performance one would expect.

You also briefly mention there being a point between where non-shunted begin to throttle and when XOC BIOS shows instability ("_While at slightly elevated clocks where a non shunted card begins to experience pwr events more frequently but before you achieve clocks that are unstable you MAY even see performance increase_"). However, I just haven't seen this. On my cards, both are stable at the same speeds when restricting to 1.093V or less. However, the FE BIOS results in a diminished clock as the test goes on, which would otherwise be preserved with the XOC. Both seem to exhibit core crashing (i.e. stalls and crashes without visual artifacts) at the same speed.

One idea I had while writing this reply is that perhaps memory clocks could be an issue here. Since core clock stability is often impacted by memory clock (even if memory is at previously stable speeds) and vice versa, this could be something to look into / remove from the equation by lowering memory clocks a little on all BIOSes while testing to make sure that borderline stable memory doesn't behave strangely. My memory doesn't overclock well at all. I've found that by maximizing core first and only gradually increasing memory until performance stops rising, I get the best results. When I start reaching the borderline stable frequency, the scores go down noticeably. That said, my memory clocks are numerically quite low. Perhaps high memory clocks in general perform poorly on this BIOS due to potentially different memory timings at higher memory speeds. However, this is just a potential explanation for the discrepancy since I cannot test higher memory speeds on my card's poorly overclocking memory.

Finally, your car analogy fails for a number of reasons. Here's perhaps the most relevant: My testing is similar to measuring torque output (rather than horsepower) on two engines using a crankshaft dyno. In case you're unfamiliar, horsepower = torque * RPM / 5252. Although horsepower typically rises with RPM, torque curves are usually flat throughout most of the rev range. While yes, torque figures don't tell the entire story, they provide quite a bit of information about how an engine performs when not at redline. This translates to day-to-day drivability while not at the absolute limit. Yes, we're limiting peak horsepower by limiting revolutions per minute to a set number (i.e. limiting frequency), but we are doing so to prevent the FE BIOS from being unfairly handicapped during testing. (For reference, there are pretty large differences between different types of high performance engines at low RPMs, with large displacement engines performing well on low RPM torque figures and smaller, higher revving engines performing better at high RPMs.) For the sake of this analogy, the FE BIOS can be thought of as a large displacement engine, which builds a lot of torque at a moderate RPM. The XOC BIOS on the other hand, would be another large displacement engine, but with a large and slow-spooling turbo that builds peak boost at high RPMs (or has its rev limiter removed).
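The hp/torque relationship leaned on above is easy to sanity-check in a few lines. Note the 400 lb-ft flat torque curve here is a made-up illustration, not a measurement of any real engine:

```python
def horsepower(torque_lbft: float, rpm: float) -> float:
    """SAE relationship: hp = torque (lb-ft) * RPM / 5252."""
    return torque_lbft * rpm / 5252

# Hypothetical flat 400 lb-ft torque curve, sampled across the rev range.
for rpm in (2000, 4000, 5252, 6000):
    print(f"{rpm} RPM -> {horsepower(400, rpm):.1f} hp")
```

At exactly 5252 RPM the hp and torque figures are equal, which is why the two curves on a dyno chart always cross there.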


----------



## madmeatballs

Quote:


> Originally Posted by *SlimJ87D*
> 
> While brushing my teeth, I just had an epiphany.
> 
> There was a time when people thought flashing a BIOS would give 30-50 extra watts and drastically increase our performance. We all had peace of mind and probably got better sleep, or at least gamed more.
> 
> But after testing, research and discoveries, those times are long gone, and the search continues... for a way to gain an extra 0.5%-1% in performance... There are even users who purchase and return cards on a weekly basis to try to get Willy Wonka's golden ticket.
> 
> But remember, people: there was a time you had peace of mind because you thought a magic BIOS solved all your problems. It was all mental. While it's great for us to research, test and discover new possibilities, don't feel insecure about 0.5%-2% of performance. You're not missing out on much.


True! I questioned myself too: is it really worth it for 1-2 FPS?







Imagine the time wasted testing stability just for 1-2 FPS. Heck, I'm actually satisfied with the card without any OC.

Anyway, I decided that if I don't have to do any physical mods, why not.


----------



## PriestOfSin

Got myself an EVGA GTX 1080 Ti SC Black over the weekend! Feels pretty good so far, just curious about performance. At the "Ultra" preset of Ghost Recon: Wildlands, I get an average FPS of 64.77, a minimum of 57, and a max of 71.57 @ 2560x1080. Is this normal? What are some other games to test this puppy out on?


----------



## OZrevhead

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm going to do the easy part: three 47 Ohm 0805 resistors on the three shunts for, I think, 1.85x TDP. My card sits vertical under water, which is why I'd much rather do it this way than CLU.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Waiting on the resistors, thermal glue and conductive thermal paint from eBay. Cost me less than $10 in total with (really slow) free delivery.


Quote:


> Originally Posted by *ViRuS2k*
> 
> http://www.ebay.co.uk/itm/300745229871?_trksid=p2060353.m2749.l2648&var=600050687744&ssPageName=STRK%3AMEBIDX%3AIT
> 
> just bought some of those resisters 47ohm 10 for £1.09
> 
> ----
> 
> Could you tell me the process to be able to use these can you lay them on top of the old resister with silicone thermal glue ?
> list of things i need if you could please, need it to be easy as possible to remove and leave no trace in case of RMA ect lol


Mate, do you know what resistance I should be aiming for with the shunt? I'm going to try using circuit pen, but with a low-resistance multimeter to take the guesswork out of it.

Thanks


----------



## ViRuS2k

Quote:


> Originally Posted by *OZrevhead*
> 
> Mate, do you know what resistance I should be aiming for with the shunt? I'm going to try using circuit pen, but with a low-resistance multimeter to take the guesswork out of it.
> 
> Thanks


I'm not sure. I'm just following what KedarWolf is saying and trying to follow in his footsteps in shunt modding my cards, so that I can get back to stock in case of RMA or selling, and do it in a way that looks like I haven't done anything and is easy to remove in case I need to.









So I'm waiting for him to reply to my post above about the guide methods or videos for doing this, and what types of stuff I need to be able to do it properly.


----------



## Nico67

Quote:


> Originally Posted by *ViRuS2k*
> 
> I'm not sure. I'm just following what KedarWolf is saying and trying to follow in his footsteps in shunt modding my cards, so that I can get back to stock in case of RMA or selling, and do it in a way that looks like I haven't done anything and is easy to remove in case I need to.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So I'm waiting for him to reply to my post above about the guide methods or videos for doing this, and what types of stuff I need to be able to do it properly.


It's very alarming when you guys keep saying shunt modding when talking about these resistors. Please, whatever you do, don't put them on the shunts - put them on the caps.











Got this off the Kingpin forums, but added a bit more detail to indicate where you do the different mods.


----------



## ttnuagmada

Can anyone give me any tips on using the voltage curve with SLI cards? It seems like I can only apply a curve to one card.


----------



## BrainSplatter

Quote:


> Originally Posted by *ttnuagmada*
> 
> Can anyone give me any tips on using the voltage curve with SLI cards? It seems like I can only apply a curve to one card.


You just have to do separate curves. First configure AB to allow separate settings for each card, then adjust the curves separately. Note that the curve window will not update correctly after selecting a different card; it's best to close the curve window, switch to the other card, and then open the curve window again. Settings for each card should be saved to a profile (each card can use all profile slots).


----------



## Nico67

Quote:


> Originally Posted by *Coopiklaani*
> 
> Some experiments with different curve types with XOC bios. One uses the "cliff" type, the other uses the smooth type. Smooth curve does result in much much better scores. The cliff type or F-V locking results are significantly worse.


Yep, for some bizarre reason this does help.







I tried a few variations,

2138/6180 - 10487 - 323W - cliff
2138/6180 - 10533 - 333W - 2-point: 2126 @ 1.063 and 2138 @ 1.075
2138/6180 - 10598 - 337W - 3-point
2138/6180 - 10615 - 337W - 4- or 5-point curve



2152/6180 - 10632 - 337W - 1.075v - 5-point curve



2164 looks like it would give a nice score, but I can't get it to pass even at 1.100v. The card seems good up to 2152, then takes a ton more voltage, which wouldn't be worth it except for benchmarks, and I only really do those for gaming stability testing anyway.









So with XOC, it would seem like it's downclocking even though it doesn't show it - just not on power, and seeing as the temp never goes over 26°C, it wouldn't seem to be that either. Boost 3.0 may have some other tricks, I'm just not sure what they are.









If you try it: test voltages for best performance at that clock speed, test memory steps (they were all over the place for me), and lastly still use a curve, even when common sense would suggest you don't need to.









XOC has given me the highest score, but at higher clocks, memory and watts than I could use with FE. For gaming, I have learnt a lot more about how the card responds, so I could probably do better with FE. All I really need is an FE bios with 350W and I'd be laughing.


----------



## feznz

Quote:


> Originally Posted by *gmpotu*
> 
> Hey guys. Been out of the game for about a week now. I'm about 1000 posts behind.
> Anyone experiencing power issues with any of the current nvidia drivers?
> 
> My machine when I put it into sleep mode now turns everything off (monitors, keyboard, mouse, etc) but the fans stay running and I have to press the power button then it crashes the system and restarts. This never happened before I got my new card and I don't remember it happening when I was first testing. Just curious if anyone else has had power issues?


I posted about this a couple thousand posts ago, yesterday.







I was getting BSODs, mainly idle on the desktop, with the last 2 drivers; the current driver was the worst, 3 BSODs in half an hour.
Rolled back to 381.65, left the computer running 48 hours with no reboots, maybe 4 hours of gaming, and all sweet again.









Quote:


> Originally Posted by *Hulio225*
> 
> Imho, something is wrong with that bios. Even Xtreme Addict, the guy who published that bios on hwbot, is getting just a 12316 GPU score in Time Spy with a core clock of 2440MHz on LN2. I'm getting 11376 with a 2113 core clock on water.
> Kingpin is getting a 13230 GPU score with 2530MHz... there is no way that 90MHz gives 900 points, if you compare those two LN2 scores...
> 
> If that were the case, then Xtreme Addict would have 3 times 900 points more than me...
> 
> I'm pretty sure the XOC bios is faulty in some way
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Edit:
> 
> 2530/2113 = 1.197 -> 19.7% higher clock speed | 13230/11376 = 1.163 [Kingpin's 1080 Ti world record, volt-modded card etc. VS mine, shunt modded on water]
> We know clock and performance scaling isn't completely linear, so this result is in line with what we would expect.
> 
> 2440/2113 = 1.155 -> 15.5% higher clock speed | 12316/11376 = 1.083 [Xtreme Addict's XOC bios VS me]
> As you can see, this result isn't in line with the above...
> 
> 15.5% more clock should result in 12-13% more GPU score, but it does not, so the conclusion is that the bios is crap.
> 
> He is getting 8.3% more GPU score for 15.5% more clock speed.
> 
> He is missing those ~5% of performance, like KraxKill and me. Period.


Good point, you're probably right. Kingpin is the face of a multi-million dollar business, with his own electrical engineer and probably a team of binners who bin hundreds, more likely thousands, of GPUs, CPUs and RAM - hell, he probably bins his PSUs.
So it wouldn't surprise me if he has a personal BIOS person writing custom BIOSes; with a reputation to keep and world records to hold, there is a lot more to staying at the top than a bit of LN² and a few clicks of the mouse.
It's all top secret, albeit it's one thing to show off the electrical mods - the average person could copy those, and there are always plenty of online guides to help with the basic volt mods.
I think what we see is the roughly 5% that photos cannot hide; the other 95% will be closely guarded secrets. They couldn't have someone like me or you posting a world-record-beating bench and beating the score of the top dog with celebrity status.
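The scaling arithmetic in the quoted post is easy to re-run; the clocks and GPU scores below are taken straight from the quote:

```python
def scaling(clk_hi: int, clk_lo: int, score_hi: int, score_lo: int):
    """Return (clock ratio, score ratio) between two benchmark results."""
    return clk_hi / clk_lo, score_hi / score_lo

# Kingpin's LN2 world record vs. Hulio225's shunt-modded card on water.
clk_r, score_r = scaling(2530, 2113, 13230, 11376)
print(f"Kingpin vs. water: {clk_r:.3f}x clock -> {score_r:.3f}x score")

# Xtreme Addict's XOC-bios LN2 run vs. the same water result.
clk_r, score_r = scaling(2440, 2113, 12316, 11376)
print(f"XOC bios vs. water: {clk_r:.3f}x clock -> {score_r:.3f}x score")
```

The first pairing gains score roughly in line with clock (1.197x clock for 1.163x score); the second falls well short (1.155x clock for only 1.083x score), which is the ~5% gap being attributed to the XOC bios.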

Quote:


> Originally Posted by *willverduzco*
> 
> I am not going to argue with you beyond clarifying my point with this reply, as it was tiring trying to wade through the grammatical errors and rhetoric of your post to extract actual content, but you clearly do not understand what I sought out to demonstrate. I was trying to check for clock-for-clock performance differences at a frequency that both BIOSes can hit, without power throttling even in the picture (which would impact and thus bias against the FE BIOS). I found that when clocks were identical, there were only minor (but statistically significant) differences, such that the XOC BIOS performed measurably (albeit marginally) better when neither card was hampered by a power limit throttle.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I couldn't test at a higher clock because anything higher than 1886 on my non-shunt modified card results in at least a second or two of power limit clock reduction with the FE BIOS. This ONLY affects my card when running the FE BIOS, which would bias the results against the FE BIOS, making the XOC seem even faster in comparison.
> 
> You could try to argue that this lacks external validity to higher clocks, but that doesn't make sense for core clocks. If we were modifying memory clocks, I would agree at the possibility. For memory clocks, this would be feasible since BIOSes often have different memory timings for different clock ranges. This was a pretty big deal on my last AMD cards, where flashing to a BIOS with tighter memory timings yielded significantly improved per-clock performance. I've not seen proof of this for Pascal (perhaps due to Nvidia's tight control of the supply chain and most aspects of AIB cards in general), but it's entirely possible. However, the information stored for core clocks should just be the target F-V curve for GPU Boost, as well as the various limits that would detract from that.
> 
> As long as neither BIOS reports erroneously inflated clocks at higher speeds, my results should theoretically generalize to higher performance bins as well. For reference, in my 1948 MHz tests, where the non-shunted FE BIOS throttled for about 5 seconds even at 120% Power Limit, the performance delta between the two BIOSes was even larger. At higher clocks on non-shunt modified cards, the difference will be even larger than that.
> 
> You refer to when "_PWR IS a part of life_" (which I can only infer refers to how cards behave near / at the power limit). However, comparing at this point will logically favor a BIOS that lacks a power limit entirely. This is because at absolutely no time when running the XOC BIOS, did I ever encounter a throttling occurrence, as verified by looking at frequency and Perf Limit logs at speeds ranging from 1886 to 2126. Similarly, I found that at [email protected] locked using XOC BIOS, I achieved the higher scores I would expect when compared to my previous bests (albeit at lower and somewhat variable clocks) using the other BIOSes. Ideally, I would have tested at 2126 on the other BIOSes as well, but that was not sustainable on the other BIOSes, further leading one to believe that XOC > EVGA FE for non-shunted cards. Ultimately, there's no voodoo occurring in the low 2100s from what I've seen, although I am willing to admit that it is possible that something _may_ be occurring higher than 2126, as that is the absolute highest my card will clock at any sane voltage for ambient cooling. However, at 2126 and below (which includes what you refer to "_As clocks climb higher into a space where the area between pwr events is even more constrained_"), the XOC BIOS holds its clocks and delivers performance one would expect.
> 
> You also briefly mention there being a point between where non-shunted begin to throttle and when XOC BIOS shows instability ("_While at slightly elevated clocks where a non shunted card begins to experience pwr events more frequently but before you achieve clocks that are unstable you MAY even see performance increase_"). However, I just haven't seen this. On my cards, both are stable at the same speeds when restricting to 1.093V or less. However, the FE BIOS results in a diminished clock as the test goes on, which would otherwise be preserved with the XOC. Both seem to exhibit core crashing (i.e. stalls and crashes without visual artifacts) at the same speed.
> 
> One idea I had while writing this reply is that perhaps memory clocks could be an issue here. Since core clock stability is often impacted by memory clock (even if memory is at previously stable speeds) and vice versa, this could be something to look into / remove from the equation by lowering memory clocks a little on all BIOSes while testing to make sure that borderline stable memory doesn't behave strangely. My memory doesn't overclock well at all. I've found that by maximizing core first and only gradually increasing memory until performance stops rising, I get the best results. When I start reaching the borderline stable frequency, the scores go down noticeably. That said, my memory clocks are numerically quite low. Perhaps high memory clocks in general perform poorly on this BIOS due to potentially different memory timings at higher memory speeds. However, this is just a potential explanation for the discrepancy since I cannot test higher memory speeds on my card's poorly overclocking memory.
> 
> Finally, your car analogy fails for a number of reasons. Here's perhaps the most relevant: My testing is similar to measuring torque output (rather than horsepower) on two engines using a crankshaft dyno. In case you're unfamiliar, horsepower = torque * RPM / 5252. Although horsepower typically rises with RPM, torque curves are usually flat throughout most of the rev range. While yes, torque figures don't tell the entire story, they provide quite a bit of information about how an engine performs when not at redline. This translates to day-to-day drivability while not at the absolute limit. Yes, we're limiting peak horsepower by limiting revolutions per minute to a set number (i.e. limiting frequency), but we are doing so to prevent the FE BIOS from being unfairly handicapped during testing. (For reference, there are pretty large differences between different types of high performance engines at low RPMs, with large displacement engines performing well on low RPM torque figures and smaller, higher revving engines performing better at high RPMs.) For the sake of this analogy, the FE BIOS can be thought of as a large displacement engine, which builds a lot of torque at a moderate RPM. The XOC BIOS on the other hand, would be another large displacement engine, but with a large and slow-spooling turbo that builds peak boost at high RPMs (or has its rev limiter removed).


You lost me at the second sentence. Good for you, you're getting results that you are happy with.









edit: grammar - I am tired and I couldn't understand my own post


----------



## ViRuS2k

Quote:


> Originally Posted by *Nico67*
> 
> Its very alarming when you guys keep saying shunt modding when talking about resistors, please whatever you do, don't put them on the shunts, put them on the caps
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> got this off kingpin forums but added a bit more detail to indicate where you do different mods.


I'm confusing myself lol, I didn't mean to say resistors, I meant those 3 little ICs in purple - the SHUNTS lol

Wonder where KedarWolf is :/


----------



## niobium615

Quote:


> Originally Posted by *ViRuS2k*
> 
> I'm confusing myself lol, I didn't mean to say resistors, I meant those 3 little ICs in purple - the SHUNTS lol
> 
> Wonder where KedarWolf is :/


Those three little "ICs" in purple are the shunt resistors. What Nico is saying, and I agree with him, is that you should not touch those. Put 47 Ohm resistors in parallel with the caps circled in green in Nico's photo instead.

On a different note, has anyone probed the board yet to figure out which cap is connected to the PCIe shunt? It might be prudent to leave that one at stock to prevent overdrawing power from the PCIe slot and popping a trace on your mobo.
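For anyone wondering why the 47 Ohm parts go on the caps rather than the shunts themselves: a resistor that large in parallel with a milliohm shunt barely changes the combined resistance at all. A quick sketch (the 5 mOhm shunt value is an assumption for illustration, not measured from this card):

```python
def parallel(r1: float, r2: float) -> float:
    """Combined resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

shunt = 0.005  # ASSUMED 5 mOhm current-sense shunt, for illustration only
combined = parallel(47.0, shunt)
print(combined)  # still ~5 mOhm, i.e. the shunt reading is essentially unchanged
```

The parallel combination is dominated by the smaller resistance, so stacking a 47 Ohm part directly on a shunt would do nothing; the mod only works through the sense-line caps.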


----------



## ViRuS2k

Quote:


> Originally Posted by *niobium615*
> 
> Those three little "IC's" in purple are the shunt resistors. What Nico is saying, and I agree with, is that you should not touch those. Put 47Ohm resistors in parallel with the caps, circled in green in Nico's photo, instead.
> 
> On a different note, has anyone probed the board yet to figure out which cap is connected to the PCIe shunt? It might be prudent to leave that one at stock to prevent overdrawing power from the PCIe slot and popping a trace on your mobo.


OK, now I'm really confused. I thought I lay the resistors I'm getting on top of the purple shunt resistors?
OK, I'm totally confused lol


----------



## niobium615

Quote:


> Originally Posted by *ViRuS2k*
> 
> OK, now I'm really confused. I thought I lay the resistors I'm getting on top of the purple shunt resistors?
> OK, I'm totally confused lol


You aren't getting any ICs; you should be getting SMD resistors. Those need to be put in parallel with the three caps near the INA3221 IC. Look for Coopiklaani's post in this thread for more detail.


----------



## ViRuS2k

Quote:


> Originally Posted by *niobium615*
> 
> You aren't getting any ICs, you should be getting SMD resistors. Those need to be put in parallel with the three caps near the INA3221 IC, look for Coopiklaani's post in this thread for more detail.


Ah right, I think I'm seeing it now lol, blind as a bat like.

I bought these: SMD/SMT 0805 Resistors 10 - 10M Ohm Range - 10, 22, 47, 100, 220, 470 R K M Ohms

Selected: 47 Ohms - getting 10x of them...

In that picture I see the 3x resistors surrounding the black square INA3221 IC...

I place these SMD resistors on top of the 3 surrounding it, and that will do the same thing, instead of putting anything on or messing with the purple shunts?









--- If that's the case, then all I need to know now is what type of glue and what thermal paint I need so that they can be removed in the future, as well as a guide or video showing how to do it, if that's possible lol


----------



## KedarWolf

Quote:


> Originally Posted by *Nico67*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Coopiklaani*
> 
> Some experiments with different curve types with XOC bios. One uses the "cliff" type, the other uses the smooth type. Smooth curve does result in much much better scores. The cliff type or F-V locking results are significantly worse.
> 
> 
> 
> 
> 
> Yep for some bizarre reason this does help
> 
> 
> 
> 
> 
> 
> 
> I tried a few variations,
> 
> 2138/6180 10487 323w cliff
> 2138/6180 10533 333w 2 point 2126 @ 1.063 and 2138 @ 1.075
> 2138/6180 10598 337w 3 point
> 2138/6180 10615 337w 4 or 5 point curve
> 
> 
> 
> 2152/6180 10632 337w 1.075v 5 point curve
> 
> 
> 
> 2164 looks like it would be a nice score, but can't get it to pass even at 1.100v. card seems good upto 2152 then takes a ton more voltage which wouldn't be worth it except for benchmarks but only really do those for gaming stability testing anyway
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So with XOC, it would seem like its downclocking even though it doesn't show it, just not on power, and seen as temp never goes over 26c it wouldn't seem to be that either. BOOST 3.0 may have some other tricks, just not sure what they are
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you try it, test voltages for best performance at that clk speed, test mem steps, as they were all over the place for me, and lastly still use a curve, when common sense would suggest you don't need too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> XOC has given me the highest score, but at higher clocks, mem and watts I couldn't use with FE. For gaming I have learnt a lot more about how the card responds so I could probably do better with FE. All I really need is an FE bios with 350w and I'd be laughing.
Click to expand...

FE BIOS with 350W, kinda like the FTW3 BIOS with 358W, right?


----------



## JimmyMo

Howdy, fellow Ti owners - and particularly those with the ZOTAC AMP Extreme!

Loving my ZOTAC, and I am considering re-pasting it with Kryonaut.

Did anyone who re-pasted take pictures of the teardown? I'd be curious to see those and hear your advice on disassembling.

There are often gotchas and "wish I'd known that before I started taking it apart" details.

Appreciate anyone who can help!


----------



## kolkoo

Here is a higher res picture of the three resistors needed to do the resistor mod:

https://image.ibb.co/h5cEfk/1080_Ti_Resistor_Mod.png

You need to put three 0805 SMD resistors on top of these, making sure they get good contact, and do so in such a way that you can remove them later. People have suggested conductive low-resistance silver ink/glue on the sides plus non-conductive thermal paste in the middle, so that it's sticky but easily removed, and the sides stay isolated from one another so as not to short the resistor.

And also a nice write up about it for Pascal cards -> http://forum.kingpincooling.com/showthread.php?t=3879 (just skip to section STEP 3 - Uncorking your power )
This article also links to previous articles describing this process in more detail.
I read something about this mod also uncorking the power you can draw through PCI-E slot on the mobo and not just the two additional PCI-E power connectors - can anyone elaborate on this?
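For anyone wondering what these sense-circuit mods actually accomplish numerically: the card's INA3221 computes power from the voltage drop across a tiny shunt resistor, so lowering the effective resistance it senses makes the controller under-report power and throttle later. A rough sketch of the arithmetic (the 5 mOhm shunt value is an assumption for illustration, not measured from the card):

```python
# Hedged sketch: why shunt-type mods raise the effective power limit.
# The monitoring IC reads the voltage drop across a small shunt resistor
# (~5 mOhm ASSUMED here). Adding resistance in parallel with what it senses
# lowers the measured drop, so reported power scales down proportionally.

def reported_power(actual_watts, r_shunt=0.005, r_parallel=0.005):
    """Power the controller believes is drawn, given a parallel resistance."""
    r_eff = (r_shunt * r_parallel) / (r_shunt + r_parallel)
    return actual_watts * (r_eff / r_shunt)

# With an equal-value parallel resistance the sensed drop roughly halves,
# so a real 330 W draw reports as about 165 W to the limiter:
print(reported_power(330))
```

The same ratio applies to every rail the IC monitors, which is why the mod shifts the whole power budget rather than one connector.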


----------



## kolkoo

Quote:


> Originally Posted by *KedarWolf*
> 
> FE BIOS with 350W, kinda like the FTW3 BIOS with 358W, right?


I've been meaning to ask you: I have the FTW3 BIOS and am using your bat file, but my card still power limits and drops to a previous voltage bin at about 328W - am I missing something here?


----------



## Coopiklaani

Quote:


> Originally Posted by *ViRuS2k*
> 
> Ah right i think im seeing it now lol blind as a bat like,
> 
> i bought these : SMD/SMT 0805 Resistors 10 - 10M Ohm Range - 10, 22, 47, 100, 220, 470 R K M Ohms
> 
> Selected : 47 Ohms - Getting 10x of them....
> 
> That picture i see 3x ic resisters surrounding the black squared INA3221 ic...
> 
> i place these SMD resisters on top of the 3 surrounding it and that will do the same thing instead of putting or messing with the purple shunts
> 
> 
> 
> 
> 
> 
> 
> 
> 
> --- if thats the case then all i need to know now is what type of glue and what thermal paint i need so that they can be removed if i need to remove them in the future as well as a guide or video showing how to do it if thats possible lol


I posted a brief instruction like, a thousand posts ago. Basically I used arctic thermal glue to secure the resistors and silver paint to make the connection.
Links here:
https://www.amazon.co.uk/Arctic-Silver-Thermal-Adhesive-ASTA-7G/dp/B01I76H3HQ/ref=sr_1_1?ie=UTF8&qid=1494340041&sr=8-1&keywords=arctic+thermal+glue
https://www.amazon.co.uk/Conducting-Silver-conductive-repairing-heaters/dp/B000YJ2P2I/ref=sr_1_2?ie=UTF8&qid=1494340057&sr=8-2&keywords=conductive+paint

Or you can get away with this conductive glue to do both, but it has a huge resistance (10-30 ohm over a 1mm gap, as I tested earlier).
https://www.amazon.co.uk/PCB-Soldering-835-2699-Electric-Paint/dp/B00CSMDT8S/ref=sr_1_1?ie=UTF8&qid=1494340140&sr=8-1&keywords=conductive+paint



A link to my post here:
http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/7160


----------



## Coopiklaani

Quote:


> Originally Posted by *kolkoo*
> 
> I've been meaning to ask you: I have the FTW3 BIOS and am using your bat file, but my card still power limits and drops to a previous voltage bin at about 328W - am I missing something here?


FE BIOS + shunt mod gives you about 350 to 375w.


----------



## kolkoo

Quote:


> Originally Posted by *Coopiklaani*
> 
> FE BIOS + shunt mod gives you about 350 to 375w.


I have not bitten the bullet on the shunt or resistor mod yet, as I just - this weekend - put my GPU under water for the first time in my life, so I am reluctant to take it apart yet... hence I went the BIOS flashing route. I read here and there that the FTW3 BIOS can give me 358W; however, during Fire Strike, Time Spy and Superposition I can see in HWiNFO and GPU-Z that my GPU is throttling down at 328W, or 117% TDP... so I was wondering if that's expected or not


----------



## KedarWolf

Quote:


> Originally Posted by *kolkoo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> FE BIOS with 350W, kinda like the FTW3 BIOS with 358W, right?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've been meaning to ask you: I have the FTW3 BIOS and am using your bat file, but my card still power limits and drops to a previous voltage bin at about 328W - am I missing something here?

I find the card still hits the 117% power limit, but throttles much less. I run 2050 at 1.025V and it pretty much stays there in Superposition, though you may need to go to 2025 or 2012 at 1.000V to get zero power limiting.









Edit: If you right-click the .bat file, choose Run As Administrator, and it reports a 258W power limit, then the .bat file worked.
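The specific clocks people keep landing on here (2050, 2025, 2012) aren't arbitrary: Pascal's boost clock moves in roughly 12.5 MHz bins, so a power-limit drop steps down one bin at a time. A quick sketch (the 12.5 MHz step is an assumption based on commonly observed Pascal behavior, not something stated in this thread):

```python
# Hedged sketch: Pascal boost clocks step in ~12.5 MHz bins (ASSUMED value),
# which is why offsets and throttle drops land on values like
# 2050 / 2038 / 2025 / 2012 MHz rather than arbitrary numbers.
def clock_bins(top_mhz=2050.0, count=4, step_mhz=12.5):
    """Return the first few boost bins at and below top_mhz, rounded to MHz."""
    return [round(top_mhz - i * step_mhz) for i in range(count)]

print(clock_bins())  # -> [2050, 2038, 2025, 2012]
```

So "dropping a voltage bin" at the power limit usually means losing one or two of these ~13 MHz steps, not a free-form clock change.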


----------



## KraxKill

Quote:


> Originally Posted by *willverduzco*
> 
> I couldn't test at a higher clock because anything higher than 1886 on my non-shunt modified card results in at least a second or two of power limit clock reduction with the FE BIOS. This ONLY affects my card when running the FE BIOS, which would bias the results against the FE BIOS, making the XOC seem even faster in comparison.


Will it? See, that's the point. You're assuming without testing - hence my Ferrari analogy. You're testing two race cars and choosing to race them at your speed limit. You then remove this speed limit for the XOC BIOS and ASSUME it's faster because it's not hitting the rev limiter. This is a blatant error in your methodology. It was when it was first pointed out, and it clearly still is.

You state multiple times that the XOC performed faster but at the same time say that you didn't bother to actually test the FE there because you simply assumed it would not perform.

Which is it? Why not confirm your assumption?

Why not test both at 2050 and locked vbins? You seem to be able to hit that high. Can you hit 2100? Then do it there. Let that FE hit pwr and SEE if your fps is actually lower.

Better yet. Don't even look at pwr. Clock your card as high as it will go on the FE Bios without crashing and then duplicate the frequency and voltage on the XOC. Then, lets see your results.

To clarify. We are talking about the vanilla NVIDIA FE vs the XOC Bios.

Also stop wasting your time doing complete 3D mark runs. Save time. Just run custom and select only test one or two and record your output fps. Takes much less time and makes it easy to validate quickly.

As for your "grammar" comment: take a look at your wall of text that begins with a run-on sentence, and then tell me something about grammar. This only detracts from the topic.

You're behaving like somebody does when faced with disillusionment by avoiding testing or doing the very thing that would question the illusion you've built for yourself.


----------



## mtbiker033

finally getting my hybrid kit today! and of course I have something I have to do today and can't get to it until later tonight...


----------



## Eco28

Quote:


> Originally Posted by *Nico67*
> 
> It doesn't look like your using 120% in that pic, but easier to tell if you use MAX on stats


Yup, 100% power target; I even removed any OC software. My point was to check whether it really throttles (and whether it should) when you take it as-is out of the box - no OC, no power/voltage tweaks. It just hovers from 1880 to 1920MHz, give or take, and PerfCap, as shown in the pic, switches from Util to Pwr most of the time. I use the Heaven benchmark (desktop resolution 1440p, Extreme tessellation and highest quality settings) because it tends to be light on power draw compared to Superposition or even Time Spy.
I find all this throttling rather peculiar - then again, I've just made a huge leap from SLI'd OC'ed Asus 670s onto this FE 1080 Ti: different Boost, different BIOS characteristics, a whole different generation.

Thanks!


----------



## OneCosmic

Does anybody know if there is any other
Quote:


> Originally Posted by *Nico67*
> 
> Yep for some bizarre reason this does help
> 
> 
> 
> 
> 
> 
> 
> I tried a few variations,
> 
> 2138/6180 10487 323w cliff
> 2138/6180 10533 333w 2 point 2126 @ 1.063 and 2138 @ 1.075
> 2138/6180 10598 337w 3 point
> 2138/6180 10615 337w 4 or 5 point curve
> 
> 
> 
> 2152/6180 10632 337w 1.075v 5 point curve
> 
> 
> 
> 2164 looks like it would be a nice score, but I can't get it to pass even at 1.100V. The card seems good up to 2152, then takes a ton more voltage, which wouldn't be worth it except for benchmarks - and I only really do those for gaming stability testing anyway
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So with XOC, it would seem like it's downclocking even though it doesn't show it - just not on power, and seeing as temp never goes over 26C it wouldn't seem to be that either. BOOST 3.0 may have some other tricks; I'm just not sure what they are
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you try it, test voltages for best performance at that clock speed, and test memory steps, as they were all over the place for me; lastly, still use a curve, even when common sense would suggest you don't need to
> 
> 
> 
> 
> 
> 
> 
> 
> 
> XOC has given me the highest score, but at higher clocks, memory and watts I couldn't use with FE. For gaming I have learnt a lot more about how the card responds, so I could probably do better with FE. All I really need is an FE BIOS with 350W and I'd be laughing.


However, a guy here has a higher score with just 2088/6300MHz shunt modded:
Quote:


> Originally Posted by *Hopesolo*
> 
> SHUNT MOD
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2088/6250MHz


----------



## done12many2

Quote:


> Originally Posted by *OneCosmic*
> 
> However, a guy here has a higher score with just 2088/6300MHz shunt modded:


Which means absolutely nothing. You can't compare scores between different configurations. Scaling within the same rig is what is important.


----------



## Dasboogieman

Quote:


> Originally Posted by *OneCosmic*
> 
> Doesn't anybody know if there is any other
> However, a guy here has a higher score with just 2088/6300MHz shunt modded:


Just changing the NVIDIA Texture Filtering setting to High Performance can net you a hundred or so points. Comparisons within the same rig are golden for this reason, same settings.


----------



## JimmyMo

Quote:


> Originally Posted by *JimmyMo*
> 
> Howdy, fellow Ti owners - - and particularly those with the ZOTAC Amp Extreme!
> 
> Loving my ZOTAC, and I am considering re-pasting it with Kryonaut.
> 
> Did anyone who re-pasted take pictures of the teardown? I'd be curious to see those and hear your advice on disassembling.
> 
> There's often gotchas and "wish I'd known that before I started taking it apart" details.
> 
> Appreciate it, anyone that can help!


Think I answered my own question, if any ZOTAC owners that have replaced their TIM can confirm - -

New article reviewing the Amp! Extreme came out (https://www.techpowerup.com/reviews/Zotac/GeForce_GTX_1080_Ti_Amp_Extreme/)

There are some nice teardown pictures in that article.

Looks like 5 screws, all from the backside, unclip the fan power plug, and I am in? That simple?









Circled in red, here - -


----------



## Luckbad

Quote:


> Originally Posted by *JimmyMo*
> 
> Think I answered my own question, if any ZOTAC owners that have replaced their TIM can confirm - -
> 
> New article reviewing the Amp! Extreme came out (https://www.techpowerup.com/reviews/Zotac/GeForce_GTX_1080_Ti_Amp_Extreme/)
> 
> There are some nice teardown pictures in that article.
> 
> Looks like 5 screws, all from the backside, unclip the fan power plug, and I am in? That simple?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Circled in red, here - -


If you replace the TIM on your Zotac Amp Extreme, please update us on the quality of what's there and your results!

I have some Phanteks thermal compound on hand and might grab Kryonaut, but haven't done anything with mine as of yet.


----------



## lanofsong

Hey there GTX 1080Ti owners,

We truly could use your help here. Presently we are #1, just ahead of two of the great TITANs, when it comes to Distributed Computing, and to stay there we could use the help of you and your BOSS GPUs. Only 4 days left.




Download the software here.

https://boinc.berkeley.edu/download.php

Add the following *GPU* project - *Einsteinathome.org*



Note: For every project you fold on, you will be asked whether you want to join a team - type in overclock.net, press Enter, then JOIN team.


Remember to sign up for the BOINC team by going here. You can also post any questions that you may have - this group is very helpful:









8th BOINC Pentathlon thread

Thanks in advance.

lanofsong

OCN - FTW


----------



## niobium615

Quote:


> Originally Posted by *Coopiklaani*
> 
> I posted a brief instruction like, a thousand posts ago. Basically I used arctic thermal glue to secure the resistors and silver paint to make the connection.
> Links here:
> https://www.amazon.co.uk/Arctic-Silver-Thermal-Adhesive-ASTA-7G/dp/B01I76H3HQ/ref=sr_1_1?ie=UTF8&qid=1494340041&sr=8-1&keywords=arctic+thermal+glue
> https://www.amazon.co.uk/Conducting-Silver-conductive-repairing-heaters/dp/B000YJ2P2I/ref=sr_1_2?ie=UTF8&qid=1494340057&sr=8-2&keywords=conductive+paint


You aren't worried about being able to remove the thermal glue if/when you need to? I thought that stuff formed pretty strong bonds.


----------



## islandgam3r

Can't wait till I order my 1080ti FTW3 Elite this week, and I will pray like mofo that it ships before the 22nd LOL


----------



## kolkoo

Quote:


> Originally Posted by *islandgam3r*
> 
> Can't wait till I order my 1080ti FTW3 Elite this week, and I will pray like mofo that it ships before the 22nd LOL


Would be cool if you shared the BIOS so we can see if it is any good.


----------



## Coopiklaani

Quote:


> Originally Posted by *kolkoo*
> 
> I have not bitten the bullet on the shunt or resistor mod yet, as I just - this weekend - put my GPU under water for the first time in my life, so I am reluctant to take it apart yet... hence I went the BIOS flashing route. I read here and there that the FTW3 BIOS can give me 358W; however, during Fire Strike, Time Spy and Superposition I can see in HWiNFO and GPU-Z that my GPU is throttling down at 328W, or 117% TDP... so I was wondering if that's expected or not


Also, the FTW3 BIOS works best with the shunt mod. You are likely to power limit your card with the FTW3 BIOS on the FE card due to the limit of the 6-pin connector shunt.
Quote:


> Originally Posted by *niobium615*
> 
> You aren't worried about being able to remove the thermal glue if/when you need to? I thought that stuff formed pretty strong bonds.


I did some experiments on some junk components, the glue can be softened and removed by rubbing alcohol. Just don't put too much.


----------



## Slackaveli

Quote:


> Originally Posted by *feznz*


yeah, man, can't have SlinkyPC and JPBBoy just stealing his top spots.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Coopiklaani*
> 
> Also, the FTW3 BIOS works best with the shunt mod. You are likely to power limit your card with the FTW3 BIOS on the FE card due to the limit of the 6-pin connector shunt.
> I did some experiments on some junk components, the glue can be softened and removed by rubbing alcohol. Just don't put too much.


Why does Hulio's test show that the FTW3 BIOS and shunt score worse than stock FE?

How does it perform in actual gaming? That's all I care about. The reason is that no game hits power limits; Superposition doesn't even hit power limits.

I'm not concerned with 3DMark's minor power limits with the shunt, I'm concerned with actual gaming.

Does the FTW3 BIOS actually score better in actual gameplay than the FE stock BIOS? If so, I'll flash it today


----------



## Hulio225

Quote:


> Originally Posted by *SlimJ87D*
> 
> Why does Hulio's test show that the FTW3 BIOS and shunt score worse than stock FE?
> 
> How does it perform in actual gaming? That's all I care about. The reason is that no game hits power limits; Superposition doesn't even hit power limits.
> 
> I'm not concerned with 3DMark's minor power limits with the shunt, I'm concerned with actual gaming.
> 
> Does the FTW3 BIOS actually score better in actual gameplay than the FE stock BIOS?


Haven't tested this since I just care about benchmark performance; this is something that has to be put to the test.

Edit: and I found a bug with 3DMark which I'll present to you in a few minutes or hours, depending on how fast I make the presentation^^

I basically can produce "bad" and "good" scores in Time Spy at the same clocks and voltages, and I'm not talking about smooth curve vs. cliff curve... my curves are maximally smooth. It's something else that is causing it.

I'll present later; you guys will be surprised.

Maybe some testing has to be redone, because it was probably flawed by this "bug".


----------



## Slackaveli

Quote:


> Originally Posted by *JimmyMo*
> 
> Howdy, fellow Ti owners - - and particularly those with the ZOTAC Amp Extreme!
> 
> Loving my ZOTAC, and I am considering re-pasting it with Kryonaut.
> 
> Did anyone who re-pasted take pictures of the teardown? I'd be curious to see those and hear your advice on disassembling.
> 
> There's often gotchas and "wish I'd known that before I started taking it apart" details.
> 
> Appreciate it, anyone that can help!


Do it. 5-8C gains, easy to do: 10 minutes, $10. 10C is possible if it was a particularly poor paste job and/or you use a bit more thermal padding, especially on the VRAM and core backside, or heatsinks. And yes, use Kryonaut.







Quote:


> Originally Posted by *mtbiker033*
> 
> finally getting my hybrid kit today! and of course I have something I have to do today and can't get to it until later tonight...


finally!!


----------



## KedarWolf

Quote:


> Originally Posted by *Coopiklaani*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ViRuS2k*
> 
> Ah right i think im seeing it now lol blind as a bat like,
> 
> i bought these : SMD/SMT 0805 Resistors 10 - 10M Ohm Range - 10, 22, 47, 100, 220, 470 R K M Ohms
> 
> Selected : 47 Ohms - Getting 10x of them....
> 
> That picture i see 3x ic resisters surrounding the black squared INA3221 ic...
> 
> i place these SMD resisters on top of the 3 surrounding it and that will do the same thing instead of putting or messing with the purple shunts
> 
> 
> 
> 
> 
> 
> 
> 
> 
> --- if thats the case then all i need to know now is what type of glue and what thermal paint i need so that they can be removed if i need to remove them in the future as well as a guide or video showing how to do it if thats possible lol
> 
> 
> 
> I posted a brief instruction like, a thousand posts ago. Basically I used arctic thermal glue to secure the resistors and silver paint to make the connection.
> Links here:
> https://www.amazon.co.uk/Arctic-Silver-Thermal-Adhesive-ASTA-7G/dp/B01I76H3HQ/ref=sr_1_1?ie=UTF8&qid=1494340041&sr=8-1&keywords=arctic+thermal+glue
> https://www.amazon.co.uk/Conducting-Silver-conductive-repairing-heaters/dp/B000YJ2P2I/ref=sr_1_2?ie=UTF8&qid=1494340057&sr=8-2&keywords=conductive+paint
> 
> Or you can get away with this conductive glue to do both. but it has a huge resistance. (10-30ohm over a 1mm gap as I tested earlier)
> https://www.amazon.co.uk/PCB-Soldering-835-2699-Electric-Paint/dp/B00CSMDT8S/ref=sr_1_1?ie=UTF8&qid=1494340140&sr=8-1&keywords=conductive+paint
> 
> 
> 
> A link to my post here:
> http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/7160

The resistors don't go on the same shunts the CLU goes on?


----------



## done12many2

No. They go on the 3 tiny capacitors located midway between the top two shunts and the 3rd lower shunt for the PCIe power.


----------



## kolkoo

Quote:


> Originally Posted by *KedarWolf*
> 
> The resistors don't go on the same shunts the CLU goes on?


Check my post here - that's where they go http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/9070#post_26081225


----------



## islandgam3r

EVGA over-engineered this bad boy; better to over-engineer than under-engineer, I always say. Plus they value their consumers and made sure they didn't repeat the mistake they made with some of the FTW 1080s.


----------



## islandgam3r

Quote:


> Originally Posted by *kolkoo*
> 
> Would be cool if you share the bios so we can see if it is any good


EVGA never disappointed me but I'll share my info if everything works out and I get my order out


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Hulio225*
> 
> Haven't tested this since I just care about benchmark performance; this is something that has to be put to the test.
> 
> Edit: and I found a bug with 3DMark which I'll present to you in a few minutes or hours, depending on how fast I make the presentation^^
> 
> I basically can produce "bad" and "good" scores in Time Spy at the same clocks and voltages, and I'm not talking about smooth curve vs. cliff curve... my curves are maximally smooth. It's something else that is causing it.
> 
> I'll present later; you guys will be surprised.
> 
> Maybe some testing has to be redone, because it was probably flawed by this "bug".


I think we need to start differentiating between the "best BIOS" for benchmarking and for gaming, because there are a few factors that might perform well in one but not the other - the TDP-normalized power limiting, which you'd hit a tiny bit with a shunt mod in 3DMark but not in Superposition and gaming.

That being said, it differs even further for non-shunt-modded people.


----------



## Hulio225

Quote:


> Originally Posted by *SlimJ87D*
> 
> I think we need to start differentiating between the "best BIOS" for benchmarking and for gaming, because there are a few factors that might perform well in one but not the other - the TDP-normalized power limiting, which you'd hit a tiny bit with a shunt mod in 3DMark but not in Superposition and gaming.
> 
> That being said, it differs even further for non-shunt-modded people.


On Time Spy GPU Test 1, I'm not hitting the power limit at all, but like I said I found a bug related to 3DMark which causes issues if you overclock and start 3DMark in a certain order, and stuff like that...

I'll make a video now and demonstrate what I mean


----------



## Alexxuss

Can I flash the XOC BIOS on the Gigabyte Aorus?


----------



## Hulio225

Quote:


> Originally Posted by *Alexxuss*
> 
> Can I flash the XOC BIOS on the Gigabyte Aorus?


If it has the same voltage controller, it should work. Otherwise I don't think so...


----------



## KedarWolf

Quote:


> Originally Posted by *kolkoo*
> 
> Here is a higher res picture of the three resistors needed to do the resistor mod:
> 
> https://image.ibb.co/h5cEfk/1080_Ti_Resistor_Mod.png
> 
> You need to put 3 0805 SMD resistors on top of these, making sure they get good contact, and do so in such a way that you can remove them later. People have suggested conductive low-resistance silver ink/glue on the sides plus nonconductive thermal paste in the middle, so that it's sticky but easily removed, and the sides are isolated from one another so as not to short the resistor.
> 
> And also a nice write up about it for Pascal cards -> http://forum.kingpincooling.com/showthread.php?t=3879 (just skip to section STEP 3 - Uncorking your power )
> This article also links to previous articles describing this process in more detail.
> I read something about this mod also uncorking the power you can draw through PCI-E slot on the mobo and not just the two additional PCI-E power connectors - can anyone elaborate on this?


Thanks for the high rez picture, now I understand. I would have totally messed it up. I assumed it was done in the same place the CLU is put on.


----------



## kolkoo

Quote:


> Originally Posted by *KedarWolf*
> 
> Thanks for the high rez picture, now I understand. I would have totally messed it up. I assumed it was done in the same place the CLU is put on.


Yeah it's much smaller but I guess if you put the conductive ink + tim glue + conductive ink onto the resistor and just place it on top it should be fine. Also I bet you can use liquid electrical tape to connect the resistors to each other in the middle instead of glue


----------



## KraxKill

Quote:


> Originally Posted by *Hulio225*
> 
> On Time Spy GPU Test 1, I'm not hitting the power limit at all, but like I said I found a bug related to 3DMark which causes issues if you overclock and start 3DMark in a certain order, and stuff like that...
> 
> I'll make a video now and demonstrate what I mean


I wonder if it's a bug with Afterburner vs. 3DMark? I always felt it behaved oddly running 3DMark and then adjusting with Afterburner, vs. running Afterburner and then starting 3DMark. I was seeing about a 1/3rd of an fps delta depending on the order, but then it would level off, so I just assumed the first run was an outlier. I could never figure out how to reproduce it, though. Are you seeing the same thing - where the first run is sometimes faster, but subsequent runs are all a fraction slower by a consistent, nearly identical amount?


----------



## TheBoom

Quote:


> Originally Posted by *JimmyMo*
> 
> Think I answered my own question, if any ZOTAC owners that have replaced their TIM can confirm - -
> 
> New article reviewing the Amp! Extreme came out (https://www.techpowerup.com/reviews/Zotac/GeForce_GTX_1080_Ti_Amp_Extreme/)
> 
> There are some nice teardown pictures in that article.
> 
> Looks like 5 screws, all from the backside, unclip the fan power plug, and I am in? That simple?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Circled in red, here - -


Won't it void the warranty though?


----------



## Hulio225

Quote:


> Originally Posted by *KraxKill*
> 
> I wonder if it's a bug with Afterburner vs 3D Mark? I always felt it behaved oddly running 3D mark and then adjusting with Afterburner. Vs running Afterburner and then starting 3D mark. I was seeing about a 1/3rd of an fps delta depending on the order. Are you seeing the same thing? Where the first run is sometimes faster?


Exactly that, and sometimes I feel like it's Afterburner and sometimes I feel like it's 3DMark.

But I could reproduce it constantly: load an Afterburner profile with a given mem clock -> start 3DMark and run it with the set mem clock, getting for example 70 fps -> increase the mem clock by 50MHz, rerun, and get worse scores -> increase by another 50MHz and get better scores again, scores which are in line.

Closing 3DMark, setting the mem clock back to the middle value, and starting 3DMark again gets a score in between, like it should... increase the mem clock by 50MHz and the score is worse again...

Something is happening here which can kill your testing results if you are not aware of it...

My recommendation is to close 3DMark and Afterburner every time you adjust clocks; otherwise your testing will be flawed by this bug at some point.
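This kind of order-dependence is exactly why single back-to-back passes can't settle a BIOS or clock comparison: one "first run faster" outlier decides the result. A small helper sketching a more robust protocol (the fps numbers below are hypothetical, purely for illustration):

```python
import statistics

# Hedged sketch: with run-order effects like the 3DMark/Afterburner bug
# described above, compare two configurations by collecting several runs of
# each (restarting the tools between runs) and comparing MEDIANS, so a single
# outlier run can't flip the conclusion.
def compare_medians(runs_a, runs_b):
    """Return (median_a, median_b) for two series of fps results."""
    return statistics.median(runs_a), statistics.median(runs_b)

# Hypothetical fps readings from two BIOSes (illustrative only):
a, b = compare_medians([70.1, 69.4, 69.5], [69.9, 69.8, 69.8])
print(a, b)  # -> 69.5 69.8
```

Here config A "wins" on its best run (70.1) but loses on the median - the same trap the first-run outlier sets in real testing.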


----------



## kolkoo

Quote:


> Originally Posted by *TheBoom*
> 
> Won't it void the warranty though?


Unless it's an EVGA Card pretty much any manufacturer can void your warranty for opening up a card... however some of them do not if you return it properly assembled back to factory condition and the card is physically fine... I've also heard that by law they cannot void the warranty for just disassembly at least in the US - but then you have to be ready to possibly sue them. Here is a thread that was based on European warranty cases -

https://www.reddit.com/r/54m23z/gpu_warranty_spreadsheet_compiled_from_german/


----------



## seckzee

I realize they haven't shipped yet but is it possible to add a hybrid AIO cooler to one of the other EVGA cards with more power consumption + (2) 8pin connectors such as the following? The fan design isn't the same as the FE/titan design, obviously, which is why I am bringing up this question.

EVGA GeForce GTX 1080 Ti FTW3 ELITE GAMING

https://www.evga.com/products/product.aspx?pn=11G-P4-6796-KR

Will the additional power supply allow higher clocking as well? Or is this locked through the bios/limits


----------



## Silent Scone

Not balls to the wall, as I've only just fitted it.

But this is the same settings at 2050Mhz core, Strix 1080Ti OC.

Air cooler



EK Block


----------



## KraxKill

Quote:


> Originally Posted by *Hulio225*
> 
> Exactly that, and sometimes I feel like it's Afterburner and sometimes I feel like it's 3DMark.
> 
> But I could reproduce it constantly: load an Afterburner profile with a given mem clock -> start 3DMark and run it with the set mem clock, getting for example 70 fps -> increase the mem clock by 50MHz, rerun, and get worse scores -> increase by another 50MHz and get better scores again, scores which are in line.
> 
> Closing 3DMark, setting the mem clock back to the middle value, and starting 3DMark again gets a score in between, like it should... increase the mem clock by 50MHz and the score is worse again...
> 
> Something is happening here which can kill your testing results if you are not aware of it...
> 
> My recommendation is to close 3DMark and Afterburner every time you adjust clocks; otherwise your testing will be flawed by this bug at some point.



Yeah, I thought it was just me lol. I now test by setting a clock and saving a profile in Afterburner. Clicking it once to 'load' it. And Un checking and checking the 'load at startup button'. I then close everything and reboot. Wait a few seconds. Open 3D mark and run the test.

Otherwise I'm not sure the results are correct, because I would set a clock, run a bench and think, great, that's an improvement - then close Afterburner, run the test again and see a decline, without being able to replicate the gain. So I've been rebooting after every change when fine-tuning.


----------



## TheBoom

Quote:


> Originally Posted by *kolkoo*
> 
> Unless it's an EVGA Card pretty much any manufacturer can void your warranty for opening up a card... however some of them do not if you return it properly assembled back to factory condition and the card is physically fine... I've also heard that by law they cannot void the warranty for just disassembly at least in the US - but then you have to be ready to possibly sue them. Here is a thread that was based on European warranty cases -
> 
> https://www.reddit.com/r/54m23z/gpu_warranty_spreadsheet_compiled_from_german/


Something I'd honestly rather avoid. Which is why I don't bother with replacing TIM these days.

But I remember my Zotac 970 Amp also didn't have any stickers on it. I don't actually see any stickers on the backplate of my 1080ti though, it may be underneath.


----------



## KraxKill

Quote:


> Originally Posted by *Silent Scone*
> 
> Not balls to the wall, as I've only just fitted it.
> 
> But this is the same settings at 2050Mhz core, Strix 1080Ti OC.
> 
> Air cooler
> 
> 
> 
> EK Block


Look at that! Nice gain at the top fps, but more impressive is the fact that you gained 3fps at your minimums by simply cooling your card.

Nice contrast.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Hulio225*
> 
> exactly that, and sometimes i feel like its afterburner and sometimes i feel like its 3dmark.
> 
> but i could reproduce it constantly, load afterburner profile with given mem clock -> start 3mmark run it with set mem clock get for example 70 fps -> increasing mem clock by 50 mhz rerun getting worse scores -> increasing again 50mhz getting again better scores, scores which are inline
> 
> closing 3d mark setting the mem clock again to the middle value starting 3dmrk getting score inbetween again, like it should... increasing mem clock by 50 mhz and the score is worse again...
> 
> something is happening here which can kill your testing results if you are not aware of that...
> 
> my recommendation is to close 3dmark and afterburner every time you adjust clocks, otherwise your testing will be flawed by that bug at certain point


I actually saw this phenomenon happen with heaven benchmark last month.

I would alt+tab to adjust my clocks and memory but get worse results. I always had to completely close heaven, apply new clocks and memory and reopen it.

Adjusting on the fly causes issues of some sort.


----------



## BoredErica

Hi.

Got 1080ti SC Black. I'm assuming no power target unlock bios for it.

Anyways: At +480mhz on vram my vram runs at 5977, at +481 to +485 mhz my vram runs at 5994mhz. Is this behavior normal?

I seem to be crashing at 5994mhz in Valley with repeat loops. Readings are same on Hwinfo or Precision or Afterburner.


----------



## TahoeDust

Quote:


> Originally Posted by *Darkwizzie*
> 
> Hi.
> 
> Got 1080ti SC Black. I'm assuming no power target unlock bios for it.
> 
> Anyways: At +480mhz on vram my vram runs at 5977, at +481 to +485 mhz my vram runs at 5994mhz. Is this behavior normal?
> 
> I seem to be crashing at 5994mhz in Valley with repeat loops. Readings are same on Hwinfo or Precision or Afterburner.


I have this card on the way. How do you like it? What kind of clock speeds are you able to get out of it?


----------



## BoredErica

Quote:


> Originally Posted by *TahoeDust*
> 
> I have this card on the way. How do you like it? What kind of clock speeds are you able to get out of it?


It's ehhh. It hits the power target maximum. If I can't adjust between 5994 and 5977 then I have to go with 5977, I was hoping for 6000. Clocks are at maybe 2025mhz? (It can fall to 2012 or even 2000 when it gets bad but it can go up to 2065 in Skyrim SE.) Varies based on load. I wonder if putting it under water will decrease the fluctuations, but that might just be power target or voltage talking.

Of course bear in mind I'm only a sample size of 1. It should just be Founders + better cooler. I got mine for $700 shipped so it's w/e.


----------



## Norlig

My card rests at 1480MHz when at idle (762mV).

Max OC is 2088MHz.

If I reset the OC, the idle will drop to the normal idle MHz.

I've done the shunt mod.

Any tips on how I can lower my idle MHz and still keep the OC enabled? (don't want to mess with profiles in AB)


----------



## caymandive

After weeks of waiting for the ASUS STRIX 1080 Ti to be available, I finally received mine yesterday and I think it looks great! I haven't played around with overclocking, but out of the box, compared to the EVGA 1080 Classified I had, I'm seeing at least a 25-30% improvement.


----------



## Nico67

Quote:


> Originally Posted by *KedarWolf*
> 
> FE BIOS with 350W, kinda like the FTW3 BIOS with 358W, right?


If it didn't actually start limiting at about 310W, then yes. Trouble is it's not a true 358W; the XOC at least doesn't appear to power limit, so on it I can get the 350W for real.

On the XOC, it really seemed to take a huge leap in voltage to get from 2152 -> 2164, so maybe it's regaining performance around the 2150+ mark, as the max frames around scene 5/6 in SuPo jumped quite a bit even though I hadn't been able to finish the test. Might look into that tonight.


----------



## Slackaveli

Quote:


> Originally Posted by *Nico67*
> 
> If it didn't actually start limiting at about 310w, then yes. Trouble is its not a true 358w, at least XOC doesn't appear to power limit so I can get the 350w for real.
> 
> On the XOC, it really seemed to take a huge leap in voltage to get from 2152 -> 2164, so maybe its regaining performance around 2150+ mark, as the max frames around scene 5/6 in SuPo jumped quite a bit even though I hadn't been able to finish the test. Might look into that tonite.


It uses the 2101-2202 range as a dummy blocker.


----------



## TahoeDust

Quote:


> Originally Posted by *Darkwizzie*
> 
> It's ehhh. It hits the power target maximum. If I can't adjust between 5994 and 5977 then I have to go with 5977, I was hoping for 6000. Clocks are at maybe 2025mhz? (It can fall to 2012 or even 2000 when it gets bad but it can go up to 2065 in Skyrim SE.) Varies based on load. I wonder if putting it under water will decrease the fluctuations, but that might just be power target or voltage talking.
> 
> Of course bear in mind I'm only a sample size of 1. It should just be Founders + better cooler. I got mine for $700 shipped so it's w/e.


Yeah...I got mine for $720 shipped, figured I would give it a spin. Have you tried putting the FTW3 bios on it?


----------



## TheBoom

Can someone tell me why my card doesn't want to go over 1125mV with the XOC BIOS? Is it the card's hard power limit kicking in?


----------



## Hulio225

Quote:


> Originally Posted by *TheBoom*
> 
> Can some one tell me why my card doesn't want to go over 1125mv with the XOC bios? Is it the cards hard power limit kicking in?


With my FE card it was possible to go to 1.2V with the XOC.

I don't know why you're having problems with it... show me the curve you're using.


----------



## Rhadamanthys

Hey guys, need some urgent help please. While disassembling my FE for water block installation I must have accidentally ripped off one of these tiny little things (not sure what they're called or what their purpose is). I didn't even notice I'd done it, I just saw it lying on the PCB. Did I kill my card? Is there a way of fixing this, e.g. by soldering?


----------



## Hulio225

Quote:


> Originally Posted by *Rhadamanthys*
> 
> Hey guys, need some urgent help please. While disassembling my FE for water block installation I must have accidently ripped off one of these tiny little things (not sure how they're called or what's their purpose). I didn't even notice that I did it, just saw it lying on the pcb. Did I kill my card? Is there a way of fixing this, e.g. by soldering?


This is a capacitor; there is still one left. They were connected in parallel, which simply doubles the capacitance... you could be fine, or your card may get unstable... you might have to resolder it.

At least from what I can see in that picture; I can't see the traces properly because of the red circle and the angle^^


----------



## JimmyMo

Quote:


> Originally Posted by *kolkoo*
> 
> Unless it's an EVGA Card pretty much any manufacturer can void your warranty for opening up a card... however some of them do not if you return it properly assembled back to factory condition and the card is physically fine... I've also heard that by law they cannot void the warranty for just disassembly at least in the US - but then you have to be ready to possibly sue them. Here is a thread that was based on European warranty cases -
> 
> __
> https://www.reddit.com/r/54m23z/gpu_warranty_spreadsheet_compiled_from_german/


I'm not too worried about it. I'll take a picture of the original TIM application, and if I ever need to send it in, I'll clean off my Kryonaut and glop on some Arctic Silver or similar in the same messy way they originally had it!









There don't appear to be any warranty void stickers, but I'll let people know if I see any.

What's the "gold standard" for before and after temperature comparisons?

I thought I would start from ambient room temperature (I'll measure it) with the computer having been on for several hours, leave my case fans at standard "daily driving" speeds and the ZOTAC fan at the "gaming" level of 50% that I usually use, then run Heaven for a good hour or so and log the temps to a file for graphing?

Replace TIM, rinse and repeat?
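The before/after plan above can be reduced to a few lines once the temps are in a file. A sketch under stated assumptions: the log is one reading per interval (e.g. from `nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader -l 5`), and the sample readings below are invented:

```python
# Compare steady-state temps from two logged runs (same load, fans, ambient).
# Averaging only the tail of each run skips the warm-up ramp.

def steady_state_avg(temps, tail_fraction=0.5):
    """Average the last portion of a run, after temps have plateaued."""
    tail = temps[int(len(temps) * (1 - tail_fraction)):]
    return sum(tail) / len(tail)

# Invented readings: warm-up ramp, then a plateau.
before = [45, 55, 62, 66, 68, 69, 69, 70, 69, 70]   # stock TIM
after  = [44, 52, 58, 61, 62, 63, 62, 63, 63, 62]   # after repaste

delta = steady_state_avg(before) - steady_state_avg(after)
print(f"repaste lowered steady-state temps by {delta:.1f} C")
```

Comparing steady-state averages rather than peak values makes the result less sensitive to a single spike in either log.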


----------



## Rhadamanthys

Quote:


> Originally Posted by *Hulio225*
> 
> This is an capacitor, there is still one left, they were connected parallel, which simply doubls the capacity... you could be fine, or your card will get unstable... maybe you will have to resolder it


So I guess my best bet would be to solder it back on. Boy, it's been ages since I got out my solder kit, and the last stuff I worked on wasn't this small. Goddamnit.


----------



## Hulio225

Quote:


> Originally Posted by *Rhadamanthys*
> 
> So I guess my best bet would be to solder it back on. Boy it's been ages since I got out my solder kit, and the last stuff I worked wasn't that small.


Yeah, soldering SMD parts isn't easy if you haven't practiced.


----------



## Rhadamanthys

Quote:


> Originally Posted by *Hulio225*
> 
> yeah soldering smd parts isn't easy if you haven't practised


Any advice on dos and don'ts when soldering such small parts on a gpu pcb? Getting the kit out right now.


----------



## Hulio225

Quote:


> Originally Posted by *Rhadamanthys*
> 
> Any advice on dos and don'ts when soldering such small parts on a gpu pcb? Getting the kit out right now.


Ehm, check some YouTube tutorials. I'm not soldering SMD parts on a daily basis, so I don't have that much experience with it; maybe someone else can give you tips.


----------



## BoredErica

I have downloaded the beta Afterburner for the voltage slider. But raising the voltage doesn't really raise the voltage, due to running face first into the power target wall.

AHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHH

lf> gewd bios


----------



## Psykoking

Quote:


> Originally Posted by *Rhadamanthys*
> 
> So I guess my best bet would be to solder it back on. Boy it's been ages since I got out my solder kit, and the last stuff I worked wasn't that small. Goddamnit.


Quote:


> Originally Posted by *Rhadamanthys*
> 
> Any advice on dos and don'ts when soldering such small parts on a gpu pcb? Getting the kit out right now.


Check out these:

https://www.sparkfun.com/datasheets/Prototyping/General/SMD-SolderingWorkshop.pdf
https://www.johansontechnology.com/soldering-profiles-and-guidelines-for-smt-ceramic-components.html
Also, if you have some within reach, try soldering other components to a matrix prototyping board first, so you get used to it again.


----------



## BoredErica

As far as liquid metal TIM for the shunt mod goes, I am worried that with the way my GPU is mounted the metal will just melt and roll off into other parts of the board.

Quote:



> Originally Posted by *TahoeDust*
> 
> Yeah...I got mine for $720 shipped, figured I would give it a spin. Have you tried putting the FTW3 bios on it?


O wat, there's a FTW 3 bios? Is that safe? o.o


----------



## Hulio225

Quote:


> Originally Posted by *Darkwizzie*
> 
> O wat, there's a FTW 3 bios?


Yeah, just visit the thread here: http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti


----------



## KedarWolf

Quote:


> Originally Posted by *Darkwizzie*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TahoeDust*
> 
> Yeah...I got mine for $720 shipped, figured I would give it a spin. Have you tried putting the FTW3 bios on it?
> 
> 
> 
> O wat, there's a FTW 3 bios? Is that safe? o.o

 GP102.zip 156k .zip file


----------



## Rhadamanthys

Quote:


> Originally Posted by *Psykoking*
> 
> Check out these:
> 
> https://www.sparkfun.com/datasheets/Prototyping/General/SMD-SolderingWorkshop.pdf
> https://www.johansontechnology.com/soldering-profiles-and-guidelines-for-smt-ceramic-components.html
> Also if you have some in range, try soldering others components first to a matrix prototyping board, so you get used to it again.


Wow, you've been lurking on these forums since August 2016 and this is your first post, helping me? First rep well deserved.


----------



## done12many2

Quote:


> Originally Posted by *Darkwizzie*
> 
> O wat, there's a FTW 3 bios? Is that safe? o.o


Hey wizzie, welcome to the Ti club bud! No 8 hour stress test required in this thread.


----------



## KedarWolf

Quote:


> Originally Posted by *Darkwizzie*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TahoeDust*
> 
> Yeah...I got mine for $720 shipped, figured I would give it a spin. Have you tried putting the FTW3 bios on it?
> 
> 
> 
> O wat, there's a FTW 3 bios? Is that safe? o.o

Never heard of anyone yet bricking a 1080 Ti on a BIOS flash.


----------



## BoredErica

Quote:


> Originally Posted by *done12many2*
> 
> Hey wizzie, welcome to the Ti club bud! No 8 hour stress test required in this thread.


Lol. I thought it would be nice to have a 1080ti thread like my CPU threads where I chart overclocks but also sort by SKU.

Quote:


> Originally Posted by *KedarWolf*
> 
> Never heard of anyone yet bricking a 1080 Ti on a BIOS flash.


Okay, I'll give it a shot. Thanks a bunch!

*explodes*


----------



## KedarWolf

Quote:


> Originally Posted by *done12many2*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Darkwizzie*
> 
> O wat, there's a FTW 3 bios? Is that safe? o.o
> 
> 
> 
> Hey wizzie, welcome to the Ti club bud! No 8 hour stress test required in this thread.

Wait, there's not an eight hour stress test requirement here? (7 hours 42 minutes in)


----------



## KedarWolf

@Darkwizzie

I never started this thread, but when peeps saw the original owner wasn't keeping it up, I kinda got drafted into taking over.


----------



## done12many2

Quote:


> Originally Posted by *KedarWolf*
> 
> Wait, there's not an eight hour stress test requirement here? (7 hours 42 minutes in)










Our secret.

Quote:


> Originally Posted by *Darkwizzie*
> 
> Lol. I thought it would be nice to have a 1080ti thread like my CPU threads where I chart overclocks but also sort by SKU.


Do it, but how about not doing such stringent requirements for stability. Maybe something simple like passing 3DMark Time Spy stress test. It loops quite a few times.
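For anyone who wants to prototype that chart, grouping by SKU and ranking within each group is only a few lines. The submissions below are invented placeholders, not real entries from this thread:

```python
# Group overclock submissions by SKU, then rank each group by core clock.
from collections import defaultdict

submissions = [  # invented example entries
    {"user": "A", "sku": "FE",       "core": 2025, "mem": 5977},
    {"user": "B", "sku": "Strix OC", "core": 2050, "mem": 6003},
    {"user": "C", "sku": "FE",       "core": 2088, "mem": 5940},
]

chart = defaultdict(list)
for entry in submissions:
    chart[entry["sku"]].append(entry)
for sku in chart:
    chart[sku].sort(key=lambda e: e["core"], reverse=True)

for sku, entries in sorted(chart.items()):
    print(sku, [(e["user"], e["core"]) for e in entries])
```

Sorting per SKU rather than across the whole list keeps FE cards from being compared directly against 330W+ custom boards.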


----------



## KraxKill

Quote:


> Originally Posted by *Hulio225*
> 
> Yeah, just visit the thread here: http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti


Are you getting any performance over your shunted FE on the FTW3? I'm fed up with flashing my card; it would save me some time if you've already tested it.


----------



## KedarWolf

Quote:


> Originally Posted by *done12many2*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Wait, there's not an eight hour stress test requirement here? (7 hours 42 minutes in)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Our secret.
> 
> Quote:
> 
> 
> 
> Originally Posted by *Darkwizzie*
> 
> Lol. I thought it would be nice to have a 1080ti thread like my CPU threads where I chart overclocks but also sort by SKU.
> 
> 
> Do it, but how about not doing such stringent requirements for stability. Maybe something simple like passing 3DMark Time Spy stress test. It loops quite a few times.

Hmmm, if we did that it would have to be something that peeps never had to pay for. I mean, we're all broke from having to buy our Tis, right?


----------



## KedarWolf

Quote:


> Originally Posted by *KraxKill*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Hulio225*
> 
> Yeah, just visit the thread here: http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti
> 
> 
> 
> Are you getting any performance over your shunted FE on the FTW3? I'm fed up flashing my card, would save me some time if you've already tested it.

Not shunted here, but getting the best benches of any BIOS I've tested so far.

Today I got 10458 in Superposition, a full 100 points better than the Strix OC BIOS, the next best BIOS I've tried.


----------



## Hulio225

Quote:


> Originally Posted by *KraxKill*
> 
> Are you getting any performance over your shunted FE on the FTW3? I'm fed up flashing my card, would save me some time if you've already tested it.


Tested it again today since I wanted to be sure because of that weird bug... but the results stay the same... it's worse.


----------



## Psykoking

Quote:


> Originally Posted by *Rhadamanthys*
> 
> Wow, you been lurking in these forums since August 2016 and this is your first post helping me? first rep well deserved.


Well, finals kept me busy those days, but since I'm studying informatics at university, I can finally focus on contributing stuff to the forums I've been following for quite some time.


----------



## Rhadamanthys

For testing if the card (FE) works properly after my ****ty solder job, can I just leave the backplate off or will this hurt performance/cooling under load?


----------



## done12many2

Quote:


> Originally Posted by *KedarWolf*
> 
> Hmmm, if we did that would have to be something that peeps never had to pay for, I mean we're all broke having to buy our Ti's, right?


Good point. I forgot the TS stress test was in the paid version.

Either way, it would be nice to see overclocks in an organized and standardized layout.

Kinda weird how a lot of the folks sharing benchmarks here aren't posting them to the 3D benchmark threads.


----------



## Hulio225

Quote:


> Originally Posted by *Rhadamanthys*
> 
> For testing if the card (FE) works properly after my ****ty solder job, can I just leave the backplate off or will this hurt performance/cooling under load?


tbh the backplate does nothing but protect the card from water dripping out of a leak in your CPU loop haha

so no worries


----------



## TheBoom

Quote:


> Originally Posted by *Hulio225*
> 
> With my FE card it was possible to go 1.2V with XOC.
> 
> Don't know why you have problems with it.... show me your Curve you are using


Oh, so a manual OC won't push the voltage to 1.2V?


----------



## Hulio225

Quote:


> Originally Posted by *TheBoom*
> 
> Oh so manual oc will not push the voltage to 1.2v?


If you're not overclocking with the curve, you won't be able to change voltages at all. The slider allows the card to overvolt, but you still have to set the voltages yourself...

The standard curve maxes out at a certain voltage; by just increasing clocks with the slider you only move the standard curve vertically, without changing the voltages.
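A toy model of what Hulio225 describes (all numbers invented): the core slider shifts the whole curve up without touching its voltage ceiling, while the curve editor is what changes the frequency at a specific voltage point.

```python
# Voltage/frequency curve as a mapping of V -> MHz (invented values).
curve = {0.900: 1850, 1.000: 1950, 1.062: 2000, 1.093: 2025}

def apply_slider(curve, offset_mhz):
    """Core-clock slider: every point moves up; the max voltage is unchanged."""
    return {v: f + offset_mhz for v, f in curve.items()}

def edit_point(curve, voltage, new_freq):
    """Curve editor: pin one voltage point to a new frequency."""
    edited = dict(curve)
    edited[voltage] = new_freq
    return edited

shifted = apply_slider(curve, 100)   # ceiling is still the same top voltage
pinned = edit_point(curve, 1.093, 2100)  # only the curve editor moves a point
```

This is why the slider alone never raises the voltage the card runs at; it only changes which frequency each existing voltage point requests.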


----------



## CoreyL4

Quote:


> Originally Posted by *KedarWolf*
> 
> Not shunted here but getting best benches of any BIOS I've tested so far.
> 
> Today got 10458 on Superposition, full 100 points better than Strix OC BIOS, next best BIOS I've tried.


Would you suggest I try the FTW3 BIOS on my Gaming X? I want to see if it power limits less than the MSI BIOS.


----------



## TheBoom

Quote:


> Originally Posted by *Hulio225*
> 
> if you're not overclocking with curve you won't be able to change voltages at all, the slider is allowing the card to overvolt, but you have to set voltages...


I see, thanks for the heads up. Might dabble with the curve if I have time, although I must say I'm already pleased with this BIOS and its improvements.


----------



## Hulio225

Quote:


> Originally Posted by *CoreyL4*
> 
> Would you suggest I try the ftw3 bios on my gaming x? I want to see if it doesnt power limit as much as msi bios.


Just try it, you won't brick anything, and you can always flash back...


----------



## CoreyL4

Quote:


> Originally Posted by *Hulio225*
> 
> Just try it, you won't brick anything, you always can flash back...


Did you post a few pages back that the FTW3 BIOS was worse for you?


----------



## Hulio225

Quote:


> Originally Posted by *CoreyL4*
> 
> Did you post a few pages back the ftw3 bios was worse for you?


Indeed I did, but on a shunt-modded FE card!


----------



## KedarWolf

Quote:


> Originally Posted by *CoreyL4*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Not shunted here but getting best benches of any BIOS I've tested so far.
> 
> Today got 10458 on Superposition, full 100 points better than Strix OC BIOS, next best BIOS I've tried.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Would you suggest I try the ftw3 bios on my gaming x? I want to see if it doesnt power limit as much as msi bios.

Yes, I would.


----------



## feznz

Quote:


> Originally Posted by *Darkwizzie*
> 
> Hi.
> 
> Got 1080ti SC Black. I'm assuming no power target unlock bios for it.
> 
> Anyways: At +480mhz on vram my vram runs at 5977, at +481 to +485 mhz my vram runs at 5994mhz. Is this behavior normal?
> 
> I seem to be crashing at 5994mhz in Valley with repeat loops. Readings are same on Hwinfo or Precision or Afterburner.


temp is key right?
Quote:


> Originally Posted by *Nico67*
> 
> If it didn't actually start limiting at about 310w, then yes. Trouble is its not a true 358w, at least XOC doesn't appear to power limit so I can get the 350w for real.
> 
> On the XOC, it really seemed to take a huge leap in voltage to get from 2152 -> 2164, so maybe its regaining performance around 2150+ mark, as the max frames around scene 5/6 in SuPo jumped quite a bit even though I hadn't been able to finish the test. Might look into that tonite.


probably the wall of voltage leak

Quote:


> Originally Posted by *KedarWolf*
> 
> Yes, I would.


Thanks for looking after this thread. Got one more request: can you put your collection of BIOSes in the OP? Thanks.


----------



## KedarWolf

Quote:


> Originally Posted by *feznz*
> 
> dam I know it works for a strix so technically I aint the Guinea pig
> I would nominate
> Quote:
> 
> 
> 
> Originally Posted by *Darkwizzie*
> 
> Hi.
> 
> Got 1080ti SC Black. I'm assuming no power target unlock bios for it.
> 
> Anyways: At +480mhz on vram my vram runs at 5977, at +481 to +485 mhz my vram runs at 5994mhz. Is this behavior normal?
> 
> I seem to be crashing at 5994mhz in Valley with repeat loops. Readings are same on Hwinfo or Precision or Afterburner.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nico67*
> 
> If it didn't actually start limiting at about 310w, then yes. Trouble is its not a true 358w, at least XOC doesn't appear to power limit so I can get the 350w for real.
> 
> On the XOC, it really seemed to take a huge leap in voltage to get from 2152 -> 2164, so maybe its regaining performance around 2150+ mark, as the max frames around scene 5/6 in SuPo jumped quite a bit even though I hadn't been able to finish the test. Might look into that tonite.
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Yes, I would.
> 
> 
> Thanks for looking after this thread got one more request can you put your collection of bios on the OP thanks

I could, for all the ones I've tested.


----------



## BoredErica

Quote:


> Originally Posted by *KedarWolf*
> 
> @Darkwizzie
> 
> Never started this thread but when peeps saw the original owner wasn't keeping it up I kinda got drafted to take over.


Yeah... I was kinda complaining too. Especially before the BIOSes came along, because without those, the brand and SKU a person gets changes the power targets, which can affect overclocking (along with reference and non-reference PCBs not fitting the same custom blocks... etc). Then again, I've been a little lazy with my own threads, so don't throw stones at glass houses?









Quote:


> Originally Posted by *done12many2*
> 
> 
> 
> 
> 
> 
> 
> 
> Our secret.
> Do it, but how about not doing such stringent requirements for stability. Maybe something simple like passing 3DMark Time Spy stress test. It loops quite a few times.


I was thinking maybe 3 hrs? I'd be more lax on GPUs anyway because they don't typically crash the entire computer, so the posts I get of people complaining about 8 hours not being enough because of Windows corruption aren't as relevant.

--

Seems +580 on vram is not stable long term... I'm using Valley right now. It's my own go-to stress test. On the 980 Ti I ran Heaven, Valley, and Superposition, and found that Valley showed artifacts at 5 MHz lower core clocks than the other two. It's not the end-all of testing of course, but Valley's not bad. Plus I believe it's really heavy on the RAM chips too.

As I mentioned an hour ago, I find the memory overclocking to be weird. It seems to go up in steps. +464 to +465 takes the vram from 5962 to 5977, whereas +463 = +464. Is this normal behavior on the 1080 Ti, or on GPUs in general? I don't recall this happening on my 980 Ti, but maybe I just never paid enough attention.
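One way to make sense of the stepping: if the driver snaps the offset to a fixed grid, nearby offsets land in the same bucket and report the same clock. A toy sketch, where BASE and STEP are assumed illustration values, not the card's real granularity:

```python
BASE = 5505   # assumed stock reading in MHz, for illustration only
STEP = 15     # assumed offset granularity in MHz, for illustration only

def reported_clock(offset_mhz):
    """Clock the card would report if offsets snap down to a step grid."""
    return BASE + (offset_mhz // STEP) * STEP

# Offsets inside one bucket report the same clock...
assert reported_clock(463) == reported_clock(464)
# ...and crossing a step boundary bumps the reading by one full step.
assert reported_clock(465) - reported_clock(464) == STEP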

Quote:


> Originally Posted by *feznz*
> 
> temp is key right?


Fans are at 100%.

--

Just tested the FTW3 BIOS. The voltage and power readings are higher... voltage by maybe 0.07V, the power reading in HWiNFO by maybe 7W. Didn't think to pull out my Kill A Watt.

Afterburner now lets me push the power slider to 127% instead of 125%.


----------



## KedarWolf

Quote:


> Originally Posted by *feznz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Darkwizzie*
> 
> Hi.
> 
> Got 1080ti SC Black. I'm assuming no power target unlock bios for it.
> 
> Anyways: At +480mhz on vram my vram runs at 5977, at +481 to +485 mhz my vram runs at 5994mhz. Is this behavior normal?
> 
> I seem to be crashing at 5994mhz in Valley with repeat loops. Readings are same on Hwinfo or Precision or Afterburner.
> 
> 
> 
> temp is key right?
> Quote:
> 
> 
> 
> Originally Posted by *Nico67*
> 
> If it didn't actually start limiting at about 310w, then yes. Trouble is its not a true 358w, at least XOC doesn't appear to power limit so I can get the 350w for real.
> 
> On the XOC, it really seemed to take a huge leap in voltage to get from 2152 -> 2164, so maybe its regaining performance around 2150+ mark, as the max frames around scene 5/6 in SuPo jumped quite a bit even though I hadn't been able to finish the test. Might look into that tonite.
> 
> 
> probably the wall of voltage leak
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Yes, I would.
> 
> 
> Thanks for looking after this thread got one more request can you put your collection of bios on the OP thanks

Put it in the How To Flash A Different BIOS Spoiler in OP.


----------



## done12many2

Quote:


> Originally Posted by *Darkwizzie*
> 
> I'm using Valley right now. It's my own go to stress test. On 980ti I ran Heaven, Valley, Superposition and found that Valley showed artifacts at 5 mhz lower core clocks than the other two. It's not the end all of testing of course, but Valley's not bad. Plus I believe it's really heavy on the ram chips too.


Valley is nothing but a CPU test these days. Way too little demand on the GPU, but a lot of demand on a single thread of your CPU.


----------



## BoredErica

Quote:


> Originally Posted by *done12many2*
> 
> Valley is nothing, but a CPU test these days. Way too little demand on the GPU. A lot of demand on a single thread of your CPU though.


Then what do you suggest? I ran Superposition and checked for artifacts and I got artifacts slightly sooner with Valley.

Granted different tests can push the clock down or let it get higher and that can change results.

I just realized by 'taking over the thread', KedarWolf is the OP now, LOL. I didn't even know that was possible.


----------



## Nico67

Quote:


> Originally Posted by *Eco28*
> 
> Yup 100% power target, even removed any OC software. My point was to check does it really (should it) throttling when you take it as it is from the box? No OC, no power/voltage tweaks. It just hovers from 1880 to 1920Mhz give or take and PrefCap is as shown on the pic switching from Util to Pwr most of the time. I use Heaven benchmark (desktop resolution 1440p, Extreme tesselation and highest quality settings) because it tends to be light on power draw compared to Superposition or even TimeSpy.
> I find it all this throttling rather peculiar then again I've just made a huge leap from SLI Asus' OC'ed version of 670s onto this FE 1080ti - different Boost, Bios characteristics, whole different generation.
> 
> Thanks!


Yes, at default out of the box and 100% power it will limit heaps.

Quote:


> Originally Posted by *Hulio225*
> 
> Exactly that, and sometimes I feel like it's Afterburner and sometimes I feel like it's 3DMark.
> 
> But I could reproduce it consistently: load an Afterburner profile with a given mem clock -> start 3DMark, run with the set mem clock, get for example 70 fps -> increase the mem clock by 50 MHz, rerun, get worse scores -> increase by another 50 MHz, get better scores again, scores which are in line.
> 
> Close 3DMark, set the mem clock back to the middle value, start 3DMark, get a score in between again, like it should be... increase the mem clock by 50 MHz and the score is worse again...
> 
> Something is happening here which can kill your testing results if you are not aware of it...
> 
> My recommendation is to close 3DMark and Afterburner every time you adjust clocks, otherwise at some point your testing will be flawed by that bug.


Sounds like what I was seeing testing mem in SuPo the other night; every second test was low but always scaling up. Might check that again too.


----------



## CoreyL4

Quote:


> Originally Posted by *KedarWolf*
> 
> Yes, I would.


I didn't see the FTW3 BIOS in the link on the front page.

Is it somewhere else?


----------



## Hulio225

Quote:


> Originally Posted by *CoreyL4*
> 
> I didnt see the ftw3 bios in the link on the front page.
> 
> Is it somewhere else?


http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti


----------



## Nico67

Quote:


> Originally Posted by *OneCosmic*
> 
> Doesn't anybody know if there is any other
> However a guy here, has higher score with just 2088/6300MHz shunt moded:


Quote:


> Originally Posted by *done12many2*
> 
> Which means absolutely nothing. You can't compare scores between different configurations. Scaling within the same rig is what is important.


Quote:


> Originally Posted by *Dasboogieman*
> 
> Just changing the NVIDIA Texture Filtering setting to High Performance can net you a hundred or so points. Comparisons within the same rig are golden for this reason, same settings.


Couldn't have said it better myself


----------



## CoreyL4

So I flashed the FTW3 BIOS onto my Gaming X. I can definitely clock higher without power throttling. My question is about the powerlimit zip file.

It says to manually change the TDP for the card. Do I set it for the physical card or for the BIOS it has on it? I think the FTW3 is 350/355W and the MSI is 330W.

So do I set it to 330 or 350?


----------



## Nico67

Quote:


> Originally Posted by *CoreyL4*
> 
> So I flashed the ftw3 bios on my gaming x. I can definitely get higher without power throttling. My question is about the powerlimit zip file.
> 
> It says to manually change my tdp to the card. Do I do it for the physical card or bios it has on it? i think ftw3 is 350/355w and msi is 330.
> 
> So do i set it for 330 or 350?


358w


----------



## Hulio225

Quote:


> Originally Posted by *CoreyL4*
> 
> So I flashed the ftw3 bios on my gaming x. I can definitely get higher without power throttling. My question is about the powerlimit zip file.
> 
> It says to manually change my tdp to the card. Do I do it for the physical card or bios it has on it? i think ftw3 is 350/355w and msi is 330.
> 
> So do i set it for 330 or 350?


KedarWolf already prepared that file for that particular FTW3 BIOS; that's why it's kinda bundled at the bottom of the page.


----------



## CoreyL4

Quote:


> Originally Posted by *Hulio225*
> 
> KedarWolf already prepared that file for that particular ftw3 bios, thats why its kinda bundled at the bottom of the page


What? Bundled at the bottom of where? I'm confused. I went ahead and manually changed my TDP to 358.


----------



## Hulio225

Quote:


> Originally Posted by *CoreyL4*
> 
> What? Bundled bottom of where? I'm confused. I went ahead and manually changed my tdp to 358.


http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti

Here, first post, at the bottom^^
But if you have it now it doesn't matter at all^^


----------



## CoreyL4

Quote:


> Originally Posted by *Hulio225*
> 
> http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti
> 
> here first post at the bottom^^
> But if you have it now it doesn't matter at all^^


Ah thanks, didn't see that. I'm still hitting the power limit at 119% when AB shows 127% for the power limit?


----------



## KedarWolf

Quote:


> Originally Posted by *CoreyL4*
> 
> So I flashed the ftw3 bios on my gaming x. I can definitely get higher without power throttling. My question is about the powerlimit zip file.
> 
> It says to manually change my tdp to the card. Do I do it for the physical card or bios it has on it? i think ftw3 is 350/355w and msi is 330.
> 
> So do i set it for 330 or 350?


FTW3 BIOS is 358, you change it to '-pl 358'
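For anyone scripting the limit instead of editing the batch file by hand: assuming the bundled powerlimit file just wraps nvidia-smi's `-pl` switch (which takes the limit in watts and needs admin rights to apply), here is a small sketch that builds and range-checks the command. The 358 W ceiling is the FTW3 BIOS figure; adjust it for other BIOSes.

```python
# Build the nvidia-smi power-limit command and range-check the value first.
# Assumption: the powerlimit batch file is just a wrapper around `nvidia-smi -pl`.

FTW3_MAX_WATTS = 358  # cap of the FTW3 BIOS discussed in this thread

def power_limit_cmd(watts):
    if not 0 < watts <= FTW3_MAX_WATTS:
        raise ValueError(f"limit must be within (0, {FTW3_MAX_WATTS}] W")
    # Run with admin rights, e.g. subprocess.run(power_limit_cmd(358))
    return ["nvidia-smi", "-pl", str(watts)]
```

Checking against the BIOS cap before running avoids nvidia-smi rejecting an out-of-range value at apply time.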


----------



## KedarWolf

Quote:


> Originally Posted by *CoreyL4*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Hulio225*
> 
> http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti
> 
> here first post at the bottom^^
> But if you have it now it doesn't matter at all^^
> 
> 
> 
> Ah thanks didn't see that. I'm still getting power limit at 119% when it shows 127% for power limit in AB?
Click to expand...

Yes, FE cards will still power limit at 117+ just a lot less with FTW3 BIOS.


----------



## CoreyL4

Quote:


> Originally Posted by *KedarWolf*
> 
> Yes, FE cards will still power limit at 117+ just a lot less with FTW3 BIOS.


Ok thanks. I'm using it with my MSI Gaming X.


----------



## Nico67

Quote:


> Originally Posted by *CoreyL4*
> 
> Ok thanks. I'm using it with my MSI Gaming X.


Interesting that a 330w card would power limit the same as a 300w FE card. Does it seem to power limit less than your original bios?


----------



## CoreyL4

Quote:


> Originally Posted by *Nico67*
> 
> Interesting that a 330w card would power limit the same as a 300w FE card. Does it seem to power limit less than your original bios?


It will keep the second bin for a lot longer than my MSI BIOS (at least it feels that way); however, it will still jump clocks during graphics test 2 in Time Spy, and 8K Supo wrecks it.

edit: I didn't have the voltage slider at 100%. After that it didn't really jump at all in Time Spy test 2, maybe twice, and then right back to where it's set.


----------



## Hulio225

Quote:


> Originally Posted by *CoreyL4*
> 
> It will keep the second bin for a lot longer than my msi bios (at least it feels that way), however it will still jump clocks during graphics test 2 in timespy and 8k supo wrecks it
> 
> edit: didnt have voltage slider at 100%. after that it didnt really jump at all in timespy test 2, maybe twice and then right back to where its set at.


Time Spy test 2 even limits a shunt-modded card a bit, and 8K Super does all the time...


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Hulio225*
> 
> time spy test 2 even limits a shunt moded card a bit and 8k super all the time...




Yep, I have photos of just that, lol. Very tiny bit of power limiting at 1.075v. Probably wouldn't happen at 1.050v.


----------



## mtbiker033

hybrid kit installed, took my time, was VERY careful. It wasn't too hard but it's a bit nerve wracking when you are taking apart a $700 graphics card lol.

just turned it back on and everything is good, nice and quiet, it's idling at 25C and my ambient is 19C. time for some load!


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Hulio225*
> 
> time spy test 2 even limits a shunt moded card a bit and 8k super all the time...
> 
> 
> 
> 
> 
> Yep, I have photos of just that, lol. Very tiny bit of power limiting at 1.075v. Probably wouldn't happen at 1.050v.
Click to expand...

I test my card with 4K Optimized and run the lowest clocks at which it doesn't power limit. It won't power limit in Time Spy test 2 either.

I can run 2012 core at .950v +667 memory and it will not power limit in either test.


----------



## madmeatballs

Are you guys running SP with texture filtering - High Perf in NVCP?


----------



## BoredErica

Overclock on air SC Black for me is +80 on core.

So like ~2020 on core in tough gaming situations and 5962 on memory.

The freq on this thing is so unstable. Skyrim SE it goes to 2060, on Superposition 8k it's pushed to 1880s. That's a 180mhz difference.

I got 46.21fps on Extreme 1080p Superposition. This is NOT a balls to the walls bench number, this is just so you can compare to make sure nothing is wrong with both of our ends.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> I test my card with 4K Optimized and run the lowest clocks it doesn't power limit. It won't Time Spy test 2 as well.
> 
> I can run 2012 core at .950v +667 memory and it will not power limit both tests.


Yeah, once you've shunt mod and since you're on a full Waterblock you'll be able to go higher.


----------



## BoredErica

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, once you've shunt mod and since you're on a full Waterblock you'll be able to go higher.


I plan to bring the card into a custom loop in the near future... but I'm not so sure a shunt mod is in my future. Plus my card is very vertical right now.

EDIT:
Wow, I pass Superposition for 3 hours and then run Valley no problem. Then I run The Last Remnant benchmark and it crashes hardcore at +80-85.

Lowered to +75...
Clock swings of almost 100mhz in the bench.


----------



## sperson1

Strix owners, I wanted to know: do you use your own fan curve or do you just leave it on auto? If you do use a custom fan curve, could you please tell me what curve you use and your temps so I could compare? Thank you


----------



## pez

Quote:


> Originally Posted by *sperson1*
> 
> Strix owners I wanted to know do you use your own fan curve or do you just leave it on auto? If you do use a custom fan curve could you please tell me what curve you use and temps so I could compare thank you


I ran mine at about 1%/1C after the card would hit 50C. I have a tiny case, so this is a worst-case scenario. With that in mind, it would top out at 75%/75C.
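For reference, a curve like pez's can be sketched as a simple linear ramp. Note the ~50% fan floor at 50C here is an assumption just to make the numbers line up with the 75%/75C top; the post only states the slope and the cap.

```python
def fan_percent(temp_c, base_temp=50, base_fan=50, slope=1.0, cap=75):
    """Linear fan curve: flat at base_fan below base_temp,
    then +slope percent per degree C, capped at cap percent."""
    if temp_c <= base_temp:
        return base_fan
    return min(base_fan + slope * (temp_c - base_temp), cap)
```

So at 60C the fan would sit at 60%, and anything past 75C pegs it at the 75% cap.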


----------



## KedarWolf

To get really good benchmarks in Superposition (and others):

1. Run Superposition until it starts the benchmark, then hit escape and stop it.
2. Right-click on your desktop and open Nvidia Control Panel > Manage 3D Settings > Program Settings tab.
3. Click Add; if you ran Superposition you'll see it there. Select it and choose Add Selected Program.
4. Now run it using these settings.

Got this score doing that.

If you have a G-Sync screen, disable G-Sync for the bench in Global Settings; you'll get a better score than just disabling it for the benchmark in Program Settings.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I test my card with 4K Optimized and run the lowest clocks it doesn't power limit. It won't Time Spy test 2 as well.
> 
> I can run 2012 core at .950v +667 memory and it will not power limit both tests.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah, once you've shunt mod and since you're on a full Waterblock you'll be able to go higher.
Click to expand...

Yeah, I've got all the info I need here to do resistor mod, just waiting on really cheap, really slow shipping stuff from eBay.


----------



## KedarWolf

Quote:


> Originally Posted by *madmeatballs*
> 
> Are you guys running SP with texture filtering - High Perf in NVCP?


See here.

http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/9180_20#post_26083222


----------



## BoredErica

Why is 4 pre-rendered frames better for perf than more than 4? And why is shader cache off better for perf?


----------



## KedarWolf

Quote:


> Originally Posted by *Darkwizzie*
> 
> Why is 4 prerendered frames better for perf than more than 4? And why is shader cache off better perf?


I can't answer that; just after hours of testing I get better results with both settings as is. You can try both and see if it works for you.

It may be because my hard drive is a really fast PCI-E SSD, but I'm not 100% sure that matters.


----------



## BoredErica

Quote:


> Originally Posted by *KedarWolf*
> 
> I can't answer that, just after hours of testing I get better results with both settings as is. You can try both, see if it works for you.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It may be because my hard drive is a really fast PCI-E SSD, but I'm not 100% sure that matters.


Hmm, I'll try to do some testing next time on that. I forgot about the pre-rendered frames thing for benching entirely.

(I have 960 Pro)


----------



## Talon2016

http://imgur.com/8sape


MSI Gaming X 1080 Ti running Strix OC vBIOS.

Total beast at 70% fan speed. Max temp 46C. Can any air cooler beat that?


----------



## ViRuS2k

Quote:


> Originally Posted by *KedarWolf*
> 
> Yeah, I've got all the info I need here to do resistor mod, just waiting on really cheap, really slow shipping stuff from eBay.


I'm also waiting for my stuff *****

When you get your stuff, mate, if you have a video or streaming account etc., it would be awesome if you could video a guide for me to follow hehehe

As I have time off work next Friday, I will be doing it then 100%

+ I'm still waiting on the thermal paint and conductive glue from Amazon


----------



## Asmola

Quote:


> Originally Posted by *sperson1*
> 
> Strix owners I wanted to know do you use your own fan curve or do you just leave it on auto? If you do use a custom fan curve could you please tell me what curve you use and temps so I could compare thank you


Here's the fan curve I'm using on my 1080 Ti Strix, oc'd to 2063 MHz / memory 1488 MHz; max temp while gaming is around 61°C with the fans running at about 65%.

Data from Wildlands.


----------



## BoredErica

Quote:


> Originally Posted by *Talon2016*
> 
> 
> 
> http://imgur.com/8sape
> 
> 
> MSI Gaming X 1080 Ti running Strix OC vBIOS.
> 
> Total beast at 70% fan speed. Max temp 46C. Can any air cooler beat that?


http://www.gamersnexus.net/hwreviews/2905-asus-1080-ti-rog-strix-review-vs-ftw3-gaming-x


----------



## TheBoom

For those
Quote:


> Originally Posted by *Talon2016*
> 
> 
> 
> http://imgur.com/8sape
> 
> 
> MSI Gaming X 1080 Ti running Strix OC vBIOS.
> 
> Total beast at 70% fan speed. Max temp 46C. Can any air cooler beat that?


Top 2 coolers are Strix and Amp Extreme with the Strix being the quieter of the two.

Most coolers do beat MSI though lol, no offense.


----------



## TheBoom

For those of you who are reporting performance degradation with the XOC bios, are you experiencing worse performance at the same clocks or are you experiencing diminishing returns?

I'm getting identical results at [email protected] and [email protected] with memory fixed at +400 or 11.8ghz effective in Supo 8k.


----------



## Clukos

Quote:


> Originally Posted by *Darkwizzie*
> 
> http://www.gamersnexus.net/hwreviews/2905-asus-1080-ti-rog-strix-review-vs-ftw3-gaming-x


Seems like the Gaming X is the quietest, or second quietest, in every case.

The Strix is matching the FE at 100%


----------



## KedarWolf

Quote:


> Originally Posted by *ViRuS2k*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Yeah, I've got all the info I need here to do resistor mod, just waiting on really cheap, really slow shipping stuff from eBay.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Im also waiting for my stuff *****
> 
> 
> 
> 
> 
> 
> 
> 
> 
> when you get your stuff mate if you have a video or streaming account ect, would be awesome if you could video a guide
> 
> 
> 
> 
> 
> 
> 
> for me to follow hehehe
> as i have time off work next friday so i will be doing it then 100%
> 
> 
> 
> 
> 
> 
> 
> 
> 
> + im still waiting on the thermal paint and conductive glue from amazon
Click to expand...

I'd use Arctic Silver electrically non-conductive thermal paste. Electrically non-conductive thermal glue may be hard to get off if you ever had to RMA your card. And it needs to be electrically non-conductive for sure.

I'm using some Cooler Master MasterGel I have already. The conductive paint is sticky enough that the resistors will stay on after it dries, so paste is fine.


----------



## Asmola

Quote:


> Originally Posted by *Clukos*
> 
> Seems like the Gaming X is the most quiet or second best in all cases
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The Strix is matching the FE at 100%


I think this is the one you should look at, because all the cards have different fan profiles; on this diagram all coolers are at 40 dBA.


----------



## KedarWolf

Quote:


> Originally Posted by *ViRuS2k*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Yeah, I've got all the info I need here to do resistor mod, just waiting on really cheap, really slow shipping stuff from eBay.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Im also waiting for my stuff *****
> 
> 
> 
> 
> 
> 
> 
> 
> 
> when you get your stuff mate if you have a video or streaming account ect, would be awesome if you could video a guide
> 
> 
> 
> 
> 
> 
> 
> for me to follow hehehe
> as i have time off work next friday so i will be doing it then 100%
> 
> 
> 
> 
> 
> 
> 
> 
> 
> + im still waiting on the thermal paint and conductive glue from amazon
Click to expand...

In red are the resistors you do, NOT the shunts.


----------



## Talon2016

Quote:


> Originally Posted by *TheBoom*
> 
> For those
> Top 2 coolers are Strix and Amp Extreme with the Strix being the quieter of the two.
> 
> Most coolers do beat MSI though lol, no offense.


But it wouldn't appear so here, my friend. Saying it runs cooler means nothing; show me a screenshot with the same overlay I showed in the Superposition 4K benchmark.


----------



## TheBoom

Quote:


> Originally Posted by *Talon2016*
> 
> But it wouldn't appear so here my friend. Saying it runs cooler means nothing, show me a screen shot with the same overlay I showed in Superposition 4K benchmark.


I'm basing it off reviews done in similar conditions, like the one posted above and also the one on TechPowerUp.

Saying yours runs at a certain temperature at a certain fan speed means nothing when your ambient and my ambient temperatures are completely different, not to mention case cooling and other factors.


----------



## alucardis666

So I have 2 cards in SLi and for some reason my first card won't downclock anymore. Any ideas?


----------



## TheBoom

Quote:


> Originally Posted by *alucardis666*
> 
> So I have 2 cards in SLi and for some reason my first card won't downclock anymore. Any ideas?


Multi monitor issue?


----------



## alucardis666

Quote:


> Originally Posted by *TheBoom*
> 
> Multi monitor issue?


Nope. Just 1, 60hz.


----------



## TheBoom

Quote:


> Originally Posted by *alucardis666*
> 
> Nope. Just 1, 60hz.


Hmm, then it might be a driver or SLI bug I suppose. Unfortunately I don't run SLI so I can't help with that.

Might wanna try DDU + reinstall if you haven't already.


----------



## alucardis666

Quote:


> Originally Posted by *TheBoom*
> 
> Hmm, then it might be a driver or SLI bug I suppose. Unfortunately I don't run SLI so I can't help with that.
> 
> Might wanna try DDU + reinstall if you haven't already.


Trying to avoid it, but I guess I'll have to as I've already done anti-malware, virus and rootkit scans and haven't found anything.


----------



## ViRuS2k

Quote:


> Originally Posted by *KedarWolf*
> 
> In red is the resistors you do, NOT the shunts.


Thanks man for the tips

Gonna order some of this stuff as well: Cooler Master MasterGel

there are 3 versions of it lol:

MasterGel
MasterGel Pro
MasterGel Maker

Which one do you have?

Also, will you do a guide/video of your experience that noobs like me can follow heheh
Really I want to be able to watch someone do this on video lol


----------



## feznz

Quote:


> Originally Posted by *Darkwizzie*
> 
> I plan to bring the card into a custom loop in the near future... but I'm not so sure a shunt mod is in my future. Plus my card is very vertical right now.
> 
> EDIT:
> 
> Wow, I pass Superposition for 3 hours and then run Valley no problem. Then I run The Last Remnant benchmark and it crashes hardcore at +80-85.
> 
> Lowered to +75...
> 
> Clock swings of almost 100mhz in the bench.


I couldn't remember the flavour of your 1080 Ti (it's not in your sig), but I had this pointed out as the average OC on air:
http://hwbot.org/hardware/videocard/geforce_gtx_1080_ti/

100MHz is a huge downclock. It looks like the BIOS is not happy, so it downclocks the card to prevent damage; normally that's temp or power draw related.
Quote:


> Originally Posted by *sperson1*
> 
> Strix owners I wanted to know do you use your own fan curve or do you just leave it on auto? If you do use a custom fan curve could you please tell me what curve you use and temps so I could compare thank you


I have a Strix but it is under water; I never used the original cooler.
It really depends on what you are aiming for: maximum quietness or maximum performance.
Normally I pick a point where the curve ramps steeply up to 100%, usually around 70°C, with a low gentle curve up to 65°C for quietness.

Quote:


> Originally Posted by *alucardis666*
> 
> So I have 2 cards in SLi and for some reason my first card won't downclock anymore. Any ideas?


I do recall something about multiple monitors being the cause.
Try a different display output.


----------



## Nico67

Quote:


> Originally Posted by *TheBoom*
> 
> For those of you who are reporting performance degradation with the XOC bios, are you experiencing worse performance at the same clocks or are you experiencing diminishing returns?
> 
> I'm getting identical results at [email protected] and [email protected] with memory fixed at +400 or 11.8ghz effective in Supo 8k.


It just takes a lot of tuning, but I could see positive scaling; it just wasn't huge, maybe 20 points per bin at the same voltage. You might find any gains in clock are diminished by the increase in voltage. Also tune up at least 4 voltage bins below what you are running at; all BIOSes should benefit from tuning 4 bins below the minimum working clock speed.

Quote:


> Originally Posted by *KedarWolf*
> 
> In red is the resistors you do, NOT the shunts.


C is for capacitor







just sayin


----------



## Clukos

Quote:


> Originally Posted by *Asmola*
> 
> I think this is the one you should look at because all cards have different fan profile,on this diagram all coolers @ 40 dBA.


That is only relevant if you plan on running at 40dBA all the time. Also, the Strix they've tested has a lower TDP than both the FTW3 and MSI Gaming X, which means it'll run cooler than both of them (I know, I've been running mine undervolted and it is _much_ cooler than stock).

TDP is just as important as cooling; the "normal" Strix has a 250 watt TDP, 300 watts if you raise it through Afterburner.


----------



## alucardis666

Quote:


> Originally Posted by *TheBoom*
> 
> Hmm, then it might be a driver or SLI bug I suppose. Unfortunately I don't run SLI so I can't help with that.
> 
> Might wanna try DDU + reinstall if you haven't already.


Fixed. DDU + clean install of drivers. That was odd... hope I don't run into that again.


----------



## BrainSplatter

Quote:


> Originally Posted by *Asmola*
> 
> I think this is the one you should look at because all cards have different fan profile,on this diagram all coolers @ 40 dBA.


I didn't find it in the article. Did they fix the clock speed and GPU voltage for the 40dBA test?
Without the same voltage that test would not be very accurate and GPU reviewers almost always forget to take GPU VID into account when comparing power usage/temperatures/noise.


----------



## Clukos

Quote:


> Originally Posted by *BrainSplatter*
> 
> I didn't find it in the article. Did they fix the clock speed and GPU voltage for the 40dBA test?
> Without the same voltage that test would not be very accurate and GPU reviewers almost always forget to take GPU VID into account when comparing power usage/temperatures/noise.


They probably used stock settings.

Strix: https://www.techpowerup.com/vgabios/191621/asus-gtx1080ti-11264-170328 -> *250 - 300 Watt TDP*
MSI Gaming X: https://www.techpowerup.com/vgabios/190927/msi-gtx1080ti-11264-170320 -> *280 - 330 Watt TDP*
FTW3 -> ? It's even higher than the Gaming X afaik
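The base/max TDP pairs above map directly onto the Afterburner power-limit slider: the slider percentage is just the power target over the base TDP. A tiny sketch using the figures quoted from the TPU links (function names are made up for illustration):

```python
def slider_max_percent(base_w, max_w):
    """Afterburner power-limit ceiling (%) for a BIOS with the
    given base and max TDP in watts."""
    return round(max_w / base_w * 100)

def watts_at(base_w, percent):
    """Actual power target in watts for a given slider setting."""
    return base_w * percent / 100
```

So the Strix's 250/300 W BIOS gives a 120% ceiling, and the Gaming X's 280/330 W works out to about 118%, which is why two "330 W" and "300 W" cards can show such different-looking sliders.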


----------



## Nico67

Ok, no way I can run SuPo at 2164, but that's ok









Went back to FE and applied a few lessons I had learnt.

2138/6180 runs largely at 2075-2088 due to limiting, power peaking at 314w where I would need around 337w minimum.

Biggest thing I confirmed is that FE also suffers from cliff loss from the minimum running clock. So in the above case I tuned 4 bins below 2075 and got some pretty good gains.

Second thing: FE has the best min FPS performance of all the BIOSes, which at the end of the day is more important in gaming than max or even average FPS.

Lastly, SuPo is not a great indicator of scaling, particularly of clock speed; you get better gains with memory, curve and tuning. But it's still useful, and it runs off a standalone USB which is nice.


----------



## pez

Quote:


> Originally Posted by *Clukos*
> 
> Seems like the Gaming X is the most quiet or second best in all cases
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The Strix is matching the FE at 100%


Quote:


> Originally Posted by *Asmola*
> 
> I think this is the one you should look at because all cards have different fan profile,on this diagram all coolers @ 40 dBA.


Quote:


> Originally Posted by *Clukos*
> 
> That is only relevant if you plan on running at 40dBA all the time. Also, the Strix they've tested has lower TDP than both the FTW3 and MSI Gaming X, which means it'll run cooler than both of them (I know, I've been running mine undervolted and it is _much_ cooler than stock)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> TDP is just as important as cooling, the "normal" Strix has 250 watts TDP, 300 Watts is you raise it through afterburner.


Let's also not forget how important noise profiles are. While the regular ICX cooler (2-fan design) from EVGA is louder than the STRIX percentage for percentage, the STRIX has a more annoying noise profile. Of course this comes down to the quality of fans being used, their size, and of course, 2 fans vs 3 fans. I imagine the MSI Gaming X being even better than the EVGA cards based on what I've read, but I have no personal experience with that particular cooler.

That being said, both are great cards and coolers.


----------



## Clukos

The Gaming X has two big fans instead of 3 smaller ones, which also helps with the acoustics for sure. I think that's the same for the Palit/Zotac Tis.


----------



## Nico67

Quote:


> Originally Posted by *Hulio225*
> 
> There are significant differences...
> 
> and now the funny part, the asus fe bios which has the same version number like the nvidia one
> 
> 
> 
> Its different....
> 
> all other FE bios look like the nvidia fe bios in comparo meter just one line vendor id differs
> 
> Like i said, i tested them all, isolated under same test conditions, compared them all etc.
> 
> Im always pursuing more performance... but sadly it all was time waste^^


After flashing back to my Asus FE BIOS I checked the version details, and it looks like mine is the same one as the TPU DB; it's 1/18/17, but has a modified date of 3/1/17. Checksums are different, but I saved mine with flash rather than GPU-Z.

Was there a significant difference in performance? Maybe the changes were only for GPU Tweak compatibility or something.


----------



## KickAssCop

Tried both the MSI Gaming X and the STRIX. No issue with fans, and noise was about the same. Cooling, however: the MSI Gaming X ran at 72C with 65% fan speed (slightly noisy at that level) at 1949/12100 clocks. The STRIX that I have right now runs at 68C with 42% fan speed at 2050/12100 clocks. Without the overclock the STRIX runs at about 66-67C and 38% fan speed. No noise. At that level the card auto boosts to 1949 MHz.

My experience. Don't know about the ICX, but the ACX were crappy coolers. I think the FTW3 is the only EVGA cooler to buy right now, especially since the card runs much cooler and you can adjust the fans to your liking in terms of sound.


----------



## done12many2

Quote:


> Originally Posted by *KedarWolf*
> 
> In red is the resistors you do, NOT the shunts.


Quote:


> Originally Posted by *Nico67*
> 
> C is for capacitor
> 
> 
> 
> 
> 
> 
> 
> just sayin


@KedarWolf Shunt vs resistors vs capacitors is kicking your butt buddy.

I think liquid electrical tape will make a perfect adhesive for use between the painted ends. It dries quickly, is not electrically conductive and holds well. The process seems very simple: paint the ends of the capacitor, place a dab of liquid electrical tape in the middle, and slap a resistor on top of that bad boy before any of that dries. Much nicer than the shunt mod, and it remains reversible.
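For anyone wondering what these mods actually do electrically: putting a resistor across a current-sense resistor lowers the equivalent resistance, so the monitoring circuit sees a smaller voltage drop and the card under-reports its power draw. A rough sketch of the math; the 5 mΩ / 8 mΩ values below are illustrative placeholders (common in shunt-mod writeups), not the actual values on any particular 1080 Ti.

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

# Illustrative values only: a 5 milliohm shunt paralleled with 8 milliohms.
shunt, added = 0.005, 0.008
r_eq = parallel(shunt, added)          # ~3.08 milliohms
reported_fraction = r_eq / shunt       # fraction of real power the card "sees"
```

With those placeholder values the card would report only about 61.5% of its true draw, which is why even a small paralleled resistor pushes the effective power limit so far up.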


----------



## kingofsorrow

Hello,
At what temp does the throttling kick in?


----------



## TheBoom

Quote:


> Originally Posted by *kingofsorrow*
> 
> Hello,
> At what temp does the throttling kicks in?


There are many different stages at which GPU Boost 3.0 throttles, and I believe it starts from 25-30C, as someone here was saying. He had a watercooled card with a chiller and was able to keep the card below 25C.

That being said, it depends on your card as well, and its BIOS.


----------



## cluster71

A question. I bought a Zotac 1080 Ti Amp Extreme (I have an FE). I usually switch the cooling paste to Coollaboratory Liquid Pro or Ultra on the CPU and GPU, plus padding. I have the Zotac in my Core 500 mITX chassis in the standing position. Do you isolate the circuits around the GPU to avoid short circuits? I mean the small ones.


----------



## BoredErica

Quote:


> Originally Posted by *TheBoom*
> 
> There are many different stages at which GPU Boost 3 throttles and I believe it starts from 25-30c as someone here was saying. He had a watercooled card with a chiller and he was able to maintain the card at below 25C.
> 
> That being said it depends on your card as well and it's bios.


Meh, 25-30C seems like a dumb range. Why didn't they make it 35-40C? :'(


----------



## kolkoo

Quote:


> Originally Posted by *Darkwizzie*
> 
> Meh, 25-30C seems like a dumb range. Why didn't they make it 35-40C? :'(


You can get a rough idea of when Boost 3.0 drops bins over here: http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/15

On their image of the 1080, the first bin drop is at around 36-37C, then the next at ~41-43C, etc. Of course their chart starts at 35C, so lower temp bins are not covered.
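The stepping behaviour described above can be sketched roughly like this. The ~13 MHz bin size matches Pascal's boost granularity, but the threshold temperatures here are eyeballed approximations from the chart, not measured values, and real cards will differ.

```python
# Approximate temperatures (deg C) at which GPU Boost 3.0 drops one bin,
# eyeballed from the AnandTech 1080 chart linked above; values vary per
# card and BIOS, and bins below 35C are not covered by that chart.
THRESHOLDS = [37, 42, 47, 52, 57, 62]
BIN_MHZ = 13  # Pascal boost bin granularity

def boost_offset_mhz(temp_c):
    """Clock reduction relative to the max boost bin at a given temperature."""
    bins_dropped = sum(1 for t in THRESHOLDS if temp_c >= t)
    return -bins_dropped * BIN_MHZ
```

E.g. a card sitting at 50C would be roughly three bins (about 39 MHz) below what it clocks at 35C, all else equal.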


----------



## ALSTER868

Quote:


> Originally Posted by *TheBoom*
> 
> There are many different stages at which GPU Boost 3 throttles and I believe it starts from 25-30c as someone here was saying. He had a watercooled card with a chiller and he was able to maintain the card at below 25C.
> 
> That being said it depends on your card as well and it's bios.


According to my observations on my FE, the temp drop scale looks something like this: 29C -> 41C -> 55C -> 68C. May vary on different cards though.
So if you want to keep the clocks, you have to maintain temps. Simple.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Darkwizzie*
> 
> Meh, 25-30C seems like a dumb range. Why didn't they make it 35-40C? :'(


That's the range. GPU voltage will jump around 37C


----------



## BoredErica

Quote:


> Originally Posted by *SlimJ87D*
> 
> That's the range. GPU voltage will jump around 37C


Quote:


> Originally Posted by *kolkoo*
> 
> You can get out a rough idea about when Boost 3.0 drops bins over here http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/15
> 
> On their image with 1080 the first bin drop is at around 36-37 C then the next one at ~ 41-43C etc. of course their chart starts at 35 C so lower temp bins are not covered.


Will look at this stuff when I wake up. Looks interesting.


----------



## done12many2

Quote:


> Originally Posted by *Darkwizzie*
> 
> Will look at this stuff when I wake up. Looks interesting.


Amazing how you responded without waking up!


----------



## mtbiker033

loving this hybrid kit!

I did set it up push/pull with two corsair SP120's I had laying around and connected them to my fan controller instead of the card itself.

just maxing out the Power Limit slider and letting it run, it settles into a solid 1911mhz in Heaven. With fans at 1000rpm it was holding 44 deg C; cranking the fans up to max rpm (~2200) the temp dropped to 39 deg C, and it was still quieter than the FE stock blower fan lol.

quick question: is there a need to set up a fan curve for the hybrid blower fan for the VRMs and memory?


----------



## wrc05

^^ you have to, I guess; idle at 35% to load 50%, depending on ambient room temp


----------



## mtbiker033

Quote:


> Originally Posted by *wrc05*
> 
> ^^ have to I guess, idle at 35% to load 50% depending ambient room temp


okay cool! that roughly what I have now, thanks!!!


----------



## TheBoom

Quote:


> Originally Posted by *SlimJ87D*
> 
> That's the range. GPU voltage will jump around 37C


The review doesn't cover the lower temp bins simply because they can't achieve that kind of cooling unless they had a chiller like one of our members. Unfortunately that doesn't mean there is no throttling below 35C, I think.

Go back 100 pages or so; he did tests all the way down to 10C or so if I remember correctly, and up till 25-30C there was still some form of throttling.


----------



## Benny89

Decided to give this STRIX to my wife. Ordered two more STRIX OC since my FTW3 pre-order was moved to 31.05 MAYBE - lol.... My wife never OC stuff, so stock with maxed V and PL will be enough for her.

Wanna have fun with OCing this card and benching but it is hard to have fun when your card can't go higher than 2012 and can do 2012 only on 1.06V lol...


----------



## ViRuS2k

Well my RES/SMD 0805 47 Ohm resistors came today in the post....

HOLY **** they're tiny. It's going to be a pretty hard job putting thermal paste in the middle of them and then thermal electric tape on the sides.
They're really small, way smaller than I thought hahaha


----------



## bloodhawk

Quote:


> Originally Posted by *ViRuS2k*
> 
> Well my RES/SMD 0805 47 Ohm Resisters came today in the post....
> 
> HOLY **** there tiny its going to be a pretty hard job putting thermal paste on the middle of them then thermal electric tape on the sides
> there really small, way smaller than i thought hahaha


----------



## KedarWolf

Quote:


> Originally Posted by *bloodhawk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ViRuS2k*
> 
> Well my RES/SMD 0805 47 Ohm Resisters came today in the post....
> 
> HOLY **** there tiny its going to be a pretty hard job putting thermal paste on the middle of them then thermal electric tape on the sides
> there really small, way smaller than i thought hahaha
Click to expand...

I'm thinking of using a toothpick and a very light layer of both.


----------



## bloodhawk

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm thinking of using a toothpick and a very light layer of both.


I just used a tiny amount of solder paste and my hot air station.


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bloodhawk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ViRuS2k*
> 
> Well my RES/SMD 0805 47 Ohm Resisters came today in the post....
> 
> HOLY **** there tiny its going to be a pretty hard job putting thermal paste on the middle of them then thermal electric tape on the sides
> there really small, way smaller than i thought hahaha
> 
> 
> 
> 
> 
> Click to expand...
> 
> I'm thinking of using a toothpick and a very light layer of both.
Click to expand...

You know you need to use silver conductive paint on the metal contacts on the ends of the capacitors and electrically non-conductive thermal paste in the middle between them, right?


----------



## Coopiklaani

Quote:


> Originally Posted by *bloodhawk*


This for sure requires brain surgery degree of accuracy.


----------



## bloodhawk

Quote:


> Originally Posted by *KedarWolf*
> 
> You know you need to use silver conductive paint on the metal contacts on the ends of the capacitors and electrically non-conductive thermal paste in the middle between them, right?


Nope. I'm not sure where you are getting that from. But if you just solder them on, it's the same thing.

Either just use solder paste and hot air, or add some solder using a micro-tip soldering iron, place the resistors on top, hold them in place with tweezers and touch each end with the soldering tip.

Why are you tremendously increasing the amount of work needed, by doing all the paint and thermal paste business?

Quote:


> Originally Posted by *Coopiklaani*
> 
> This for sure requires brain surgery degree of accuracy.


Oh yeah, especially when using a soldering iron, it definitely needs steady hands. That can be mitigated by using a tiny amount of solder paste and hot air.


----------



## KedarWolf

Quote:


> Originally Posted by *bloodhawk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> You know you need to use silver conductive paint on the metal contacts on the ends of the capacitors and electrically non-conductive thermal paste in the middle between them, right?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nope. Im not sure where you are getting that from. But if you just solder them on its the same thing.
> 
> Either just use solder paste and hot air. Or add some solder using a micro tip soldering iron, place the resistors on top, hold them in place with a tweezer and touch each end the soldering tip.
> 
> Why are you tremendously increasing the amount of work needed to be done, by doing all the paint and thermal paste business?
> 
> Quote:
> 
> 
> 
> Originally Posted by *Coopiklaani*
> 
> This for sure requires brain surgery degree of accuracy.
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> Oh yeah, specially when using a soldering iron, definitely needs steady hands. That can definitely be mitigated by using a tiny amount of solder paste and hot air
Click to expand...

Because then the resistors can be easily removed should you need to RMA your card or anything.









And there's no way I'm going to be able to solder them on properly; I have zero experience with such things.


----------



## bloodhawk

Quote:


> Originally Posted by *KedarWolf*
> 
> Because than the resistors can be easily removed should you need to RMA your card or anything.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And no way am I going to be able to solder them on properly, have zero experience with such things.


Ah, that would make sense.
But I did try clamping them on, and the resulting drop isn't as accurate or as large as when they are soldered on. If you use hot air, they can be removed as if nothing was ever there.

However, how are you going to hold the resistors in place? Does the paint hold them?


----------



## KedarWolf

Quote:


> Originally Posted by *bloodhawk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Because than the resistors can be easily removed should you need to RMA your card or anything.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And no way am I going to be able to solder them on properly, have zero experience with such things.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ah that would make sense.
> But i did try clamping them on and the resulting drop isnt as accurate and as much as when they are soldered on. If you us hot air, they can be removed as if nothing was ever there.
> 
> However how are you going to hold the resistors in place? Does the paint hold them in place?
Click to expand...

Yes, the conductive silver paint, once it dries, holds them on just enough that they won't fall off, and I use thermal paste instead of thermal glue so they're easier to remove: a bit of 99% alcohol and off they come.


----------



## bloodhawk

Quote:


> Originally Posted by *KedarWolf*
> 
> Yes, the conductive silver paint when it dries just hold them on enough they won't fall off, and I use thermal paste instead of thermal glue to remove them easier, bit of 99% alcohol, off they come.


Conductive paint seems interesting. Do you know what its conductivity/resistance is like?

Instead of thermal paste, you could even just use a very narrowly cut piece of electrical tape / Kapton tape.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Decided to give this STRIX to my wife. Ordered two more STRIX OC since my FTW3 pre-order was moved to 31.05 MAYBE - lol.... My wife never OC stuff, so stock with maxed V and PL will be enough for her.
> 
> Wanna have fun with OCing this card and benching but it is hard to have fun when your card can't go higher than 2012 and can do 2012 only on 1.06V lol...


That's smart. That also absolves all your binning up to now in her eyes lol. Nice wifework.


----------



## Coopiklaani

Quote:


> Originally Posted by *bloodhawk*
> 
> Conductive paint seems interesting. Do you know what its conductivity/resistance is like?
> 
> Instead of thermal paste, you can even just use a very narrowly cut piece of Electric tape / Kaptop Tape.


I did resistance measurements here:
http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/7160


----------



## bloodhawk

Quote:


> Originally Posted by *Coopiklaani*
> 
> I did resistance measurements here:
> http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/7160


Nice, now I understand why people were talking about the 47 Ohm resistors.
If someone decides to solder them on, however, the 10 Ohm resistors are more than enough.


----------



## niobium615

Quote:


> Originally Posted by *bloodhawk*
> 
> Nice, now i understand why people were talking about the 47Ohm resistors.
> If someone decides to solder it on however, the 10 Ohm resistors are more than enough.


What do you mean? The 10Ohm resistors should provide a higher power limit than the 47Ohm resistors.


----------



## bloodhawk

Quote:


> Originally Posted by *niobium615*
> 
> What do you mean? The 10Ohm resistors should provide a higher power limit than the 47Ohm resistors.


Correct me if I'm mistaken, but the higher the resistance, the bigger the voltage drop, and thus the higher the TDP (to a certain point).


----------



## KingEngineRevUp

Quote:


> Originally Posted by *TheBoom*
> 
> The review doesn't cover the lower temp bins simply cause they can't achieve that kind of cooling unless they had a chiller like one of our members. Unfortunately that doesn't mean there is no throttling below 35C I think.
> 
> Go back 100 pages or so and he did a test on tests all the way down to 10C or so if I remember correctly and up till 25-30C there was still some form of throttling.


My card starts at 23C and will sit there for a good 30 seconds before it moves to 29C.

I do not see the bins, voltages or anything jump anywhere until the mid 30s. I'd really like to see a video or graph as proof of what you say.


----------



## KedarWolf

Broke 10500 with this voltage curve. The reason I have it staggered like that is that I know the points I'm stable at, so when the card power-limits (no resistor mod yet) it just goes to the highest clock it'll run at that voltage: 2062 at 1.050V, 2050 at 1.025V, 2025 at 1.000V and 2012 at 0.963V. It only drops to the lowest voltage and highest core clock at which it is stable.
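A tiny sketch of that staggered-curve idea in Python, for anyone curious. The voltage/clock pairs are the ones from the post; the helper function is just an illustration of how the card picks a point when it power-limits, not anything the card actually runs:

```python
# Voltage (V) -> highest stable core clock (MHz) at that voltage,
# taken from the staggered curve described above.
curve = {1.050: 2062, 1.025: 2050, 1.000: 2025, 0.963: 2012}

def clock_when_limited(voltage):
    """When the card power-limits down to `voltage`, it runs the
    highest curve point at or below that voltage (illustrative only)."""
    usable = [v for v in curve if v <= voltage]
    return curve[max(usable)] if usable else None

print(clock_when_limited(1.025))  # -> 2050
print(clock_when_limited(0.990))  # highest point <= 0.990 is 0.963 -> 2012
```

The point of staggering the curve is exactly this: whatever voltage the limiter lands on, there's a known-stable clock waiting there instead of an untested one.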


----------



## niobium615

Quote:


> Originally Posted by *bloodhawk*
> 
> Correct me if I'm mistaken, but the higher the resistance, the higher the voltage drop(and higher the observed TDP).


Just going off of Coopiklaani's calculations (I didn't run them myself), the observed voltage drop across the shunt will be changed by a factor of R/(40 + R). For a 10 Ohm resistor this is 0.2x; for a 47 Ohm resistor it is 0.54x. The effective power limit should be the inverse of these numbers, so a 5x higher power limit for the 10 Ohm resistor and a 1.85x higher power limit for the 47 Ohm resistor. Someone correct me if I screwed something up.
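For anyone who wants to plug in other resistor values, the arithmetic is just this (a sketch assuming the ~40 Ohm stock sense-filter path, i.e. 2x 20 Ohm, that's been discussed in this thread; as above, nobody has bench-verified the exact multipliers):

```python
# Scaling math for the resistor-on-caps mod.
# drop:  factor by which the observed shunt-voltage drop shrinks
# power: resulting effective power-limit multiplier (its inverse)
def shunt_mod_factors(r_added_ohm, r_filter_ohm=40.0):
    drop = r_added_ohm / (r_filter_ohm + r_added_ohm)
    power = 1.0 / drop
    return drop, power

for r in (10, 47):
    drop, power = shunt_mod_factors(r)
    print(f"{r:>2} Ohm: drop x{drop:.2f}, power limit x{power:.2f}")
```

Which reproduces the 0.2x / 5x and 0.54x / 1.85x numbers for the 10 and 47 Ohm parts.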


----------



## lilchronic

Quote:


> Originally Posted by *bloodhawk*
> 
> Correct me if im mistaken, but the higher the resistance, the more the current drop and then higher the TDP (to a certain point).


http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/8250_50#post_26069316


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> that's smart. that also absolves all your binning up to now in her eyes lol. Nice wifework.


Yup. Maybe I should order 2x Zotac Extremes instead... But I really like the STRIX looks and cooling...

EVGA is really losing some money coming to the party this late with their flagship 1080 Ti.


----------



## bloodhawk

Quote:


> Originally Posted by *niobium615*
> 
> Just going off of Coopiklaani's calculations(I didn't run them myself), the observed voltage drop across the shunt will be changed by a factor of R/(40+ R). For a 10Ohm resistor, this is 0.2x, for a 47Ohm resistor, this is 0.54x. The effective power limit should be the inverse of these numbers, so 5xhigher power limit for the 10Ohm resistor and a 1.85x higher power limit for the 47Ohm resistor. Someone correct me if I screwed something up.


Quote:


> Originally Posted by *lilchronic*
> 
> http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/8250_50#post_26069316


Makes sense! Thank you, gentlemen (or women).


----------



## niobium615

Something interesting I noticed is that the T|N guide for the 1080 Ti seems to indicate that those filtering resistors are 10 Ohm: http://forum.kingpincooling.com/showthread.php?t=3961

The numbers they give for adding an extra 10 Ohm resistor correlate to 10 Ohm filtering resistors, not 20 Ohm ones. @Coopiklaani How did you find out about the component change, or did you just measure them?


----------



## KedarWolf

Has anyone tested this 384 TDP Amp Extreme BIOS with shunt mods? This version works with our FEs, but non-shunted it scored a bit less than the FTW3 BIOS.

ZotacExtreme.zip 156k .zip file


----------



## Norlig

What is wrong with doing the shunt mod versus resistors on capacitors, exactly?

Except for the spilling risk, of course.


----------



## mtbiker033

I heard you guys liked nudes lol



and this was the final product of installing it



Next generation I buy into I'm definitely going full-on WC, but I'm rocking the AIOs for now. I just noticed in this pic it looks like the rad is floating in the air lol; I'm using a fan mount similar to the one you can see with the 120mm fan over the CPU area.


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Anyone test this 384 TDP Amp Extreme BIOS with shunt mods. This version works with our FE's but not shunted it scored a bit less that FTW3 BIOS.
> 
> ZotacExtreme.zip 156k .zip file


A word of warning on the Zotac BIOS: memory clocks much higher at the same offsets, so adjust them accordingly.


----------



## niobium615

Quote:


> Originally Posted by *Norlig*
> 
> What is wrong with doing the shunt mod, versus resistors on capacitors exactly?
> 
> Except for the spilling risk ofcourse.


The shunt mod is a rather crude way of doing it; the change in the power limit won't be exact, it'll depend on the application of the liquid metal.

Adding resistors to the caps allows for a well-defined and much larger shift of the power limit (and there's no spilling risk).


----------



## KedarWolf

Zotac BIOS power limits more than FTW3 BIOS and scored almost 150 points less in Superposition 4K as well.


----------



## LuisG7

My voltage/clock is unstable with the FTW3 BIOS on my shunt-modded FE; only the FE BIOS works correctly with the shunt mod. Why? :/
Sorry for my English


----------



## Benny89

*Asus STRIX vs FTW3 review noise and temps* (featuring also GAMING X and Aorus Xtreme in charts) *from Gamers Nexus*:




In terms of noise and temps the Asus STRIX is definitely the best air-cooled card right now, according to this detailed review.

However, I am still thinking that for playing with OC the FTW3 might be a little better due to its 358W power limit (vs 330W on the STRIX) and dual BIOS.

What do you guys think about the review's results?


----------



## HollowKnight

Anyone know why I'm still getting power and voltage limit flags in Afterburner while the clock rates are stable with no drops?
I was playing Prey for a couple of hours with Vsync on at 60Hz @ 4K when I noticed it.


----------



## BoredErica

Quote:


> Originally Posted by *Benny89*
> 
> Decided to give this STRIX to my wife. Ordered two more STRIX OC since my FTW3 pre-order was moved to 31.05 MAYBE - lol.... My wife never OC stuff, so stock with maxed V and PL will be enough for her.
> 
> Wanna have fun with OCing this card and benching but it is hard to have fun when your card can't go higher than 2012 and can do 2012 only on 1.06V lol...


Did you just casually mention buying 4 1080tis?

Quote:


> Originally Posted by *done12many2*
> 
> Amazing how you responded without waking up!


You get that ability at level 50!

I was up at 7 in the morning lol, really had a hard time reading things.

Quote:


> Originally Posted by *TheBoom*
> 
> The review doesn't cover the lower temp bins simply cause they can't achieve that kind of cooling unless they had a chiller like one of our members. Unfortunately that doesn't mean there is no throttling below 35C I think.
> 
> Go back 100 pages or so and he did a test on tests all the way down to 10C or so if I remember correctly and up till 25-30C there was still some form of throttling.


From our point of view as water coolers at most, why does that matter? Does throttling mean the clocks will not be perfectly consistent? If it's just a blanket 20MHz throttle across the board, then it's just a flat 20MHz I'll never have access to, since the most I can do is ambient temps.

Quote:


> Originally Posted by *KedarWolf*
> 
> Yes, the conductive silver paint when it dries just hold them on enough they won't fall off, and I use thermal paste instead of thermal glue to remove them easier, bit of 99% alcohol, off they come.


How does that compare to the TIM shunt mod, especially with the card vertical?

Quote:


> Originally Posted by *Benny89*
> 
> *Asus STRIX vs FTW3 review noise and temps* (featuring also GAMING X and Aorus Xtreme in charts) *from Gamers Nexus*:
> 
> 
> 
> 
> In case of noise and temps Asus STRIX is definitelly the best air-cooled card right now according to this detailed review.
> 
> However, I am still thinking that for playing with OC the FTW3 might be a little bit better due to 358W power limit vs 330W STRIX and dual BIOS.
> 
> What do you guys think about this review results?


I posted it already, some peeps think he should've controlled for stock clocks etc.

--

Some readings from tests... Average over 7 minutes after GPU has been at full load for 5 minutes.


----------



## KedarWolf

Quote:


> Originally Posted by *Darkwizzie*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Benny89*
> 
> Decided to give this STRIX to my wife. Ordered two more STRIX OC since my FTW3 pre-order was moved to 31.05 MAYBE - lol.... My wife never OC stuff, so stock with maxed V and PL will be enough for her.
> 
> Wanna have fun with OCing this card and benching but it is hard to have fun when your card can't go higher than 2012 and can do 2012 only on 1.06V lol...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Did you just casually mention buying 4 1080tis?
> Quote:
> 
> 
> 
> Originally Posted by *done12many2*
> 
> Amazing how you responded without waking up!
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> You get that ability at level 50!
> 
> I was up at 7 in the morning lol, really had a hard time reading things.
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheBoom*
> 
> The review doesn't cover the lower temp bins simply cause they can't achieve that kind of cooling unless they had a chiller like one of our members. Unfortunately that doesn't mean there is no throttling below 35C I think.
> 
> Go back 100 pages or so and he did a test on tests all the way down to 10C or so if I remember correctly and up till 25-30C there was still some form of throttling.
> 
> Click to expand...
> 
> From our point of view as water coolers at most, why does that matter? When throttling mean the clocks will not be perfectly consistent? If it's just a blanket 20mhz throttle across the board then it's just a flat 20mhz I'll never have access to since most I can do is ambient temps.
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Yes, the conductive silver paint when it dries just hold them on enough they won't fall off, and I use thermal paste instead of thermal glue to remove them easier, bit of 99% alcohol, off they come.
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> How is that compared to tim shunt mod, especially with card vertical?
> Quote:
> 
> 
> 
> Originally Posted by *Benny89*
> 
> Originally Posted by *Benny89*
> 
> 
> *Asus STRIX vs FTW3 review noise and temps* (featuring also GAMING X and Aorus Xtreme in charts) *from Gamers Nexus*:
> 
> 
> 
> 
> 
> In case of noise and temps Asus STRIX is definitelly the best air-cooled card right now according to this detailed review.
> 
> However, I am still thinking that for playing with OC the FTW3 might be a little bit better due to 358W power limit vs 330W STRIX and dual BIOS.
> 
> What do you guys think about this review results?
> 
> Click to expand...
> 
> I posted it already, some peeps think he should've controlled for stock clocks etc.
> 
> --
> Some readings from tests... Average over 7 minutes after GPU has been at full load for 5 minutes.
Click to expand...

My card is vertical, which is why I haven't used CLU and am doing it this way.


----------



## KedarWolf

Quote:


> Originally Posted by *HollowKnight*
> 
> Any one knows why I'm still getting power and voltage limits In afterburner while the clock rates are stable with no drops?
> I was playing Prey for a couple of hours with Vsync on 60hz @ 4K when I noticed that.


If you're seeing the rainbow and clocks not dropping you're fine.


----------



## HollowKnight

Quote:


> Originally Posted by *KedarWolf*
> 
> If you're seeing the rainbow and clocks not dropping you're fine.


Cool, I was wondering if that's normal or not. Thanks.


----------



## BoredErica

From what I've seen, my 1080 Ti doesn't artifact nearly as much as my 980 Ti did. My 980 Ti would artifact quite a bit before anything crashed, mostly red bits. This one mostly just crashes, which is annoying because I have to wait longer to see if something bad happens.


----------



## Benny89

Has anyone tried the FTW3 BIOS on a ROG STRIX card? I wonder if it would allow a real 358W limit and keep scores the same.


----------



## Savatage79

So are you guys getting decent FTW3 temps with custom profiles? I set mine to aggressive, but in Prey today it reached about 72-73 degrees, which seems a bit high for a card of this caliber. The Aorus I sent back was holding mid 50s and barely hitting 60.

If my FTW3 doesn't pan out, I'm debating trying the Gaming X, Strix or Zotac.


----------



## Hulio225

Quote:


> Originally Posted by *KedarWolf*
> 
> Anyone test this 384 TDP Amp Extreme BIOS with shunt mods. This version works with our FE's but not shunted it scored a bit less that FTW3 BIOS.
> 
> ZotacExtreme.zip 156k .zip file


I'll test it now, mate!

Edit: Testing done. One of the worst BIOSes I've come across in terms of performance... under identical circumstances I'm getting ~1.2 FPS less compared to the FE BIOS:
71.87 avg vs ~70.65 avg in Time Spy Game Test 1.
The FTW3 BIOS was ~0.7 FPS below the FE BIOS, for comparison.


----------



## TWiST2k

Don't know if this has been posted, but I noticed there is a new Beta of Afterburner, version 4.4.0 Beta 8. I have not personally tested it yet.

http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta8.rar


----------



## Blotto80

Just waiting on the rest of my gear to show up. Fingers crossed I can get this bad boy installed tomorrow.


----------



## OneCosmic

Quote:


> Originally Posted by *Hulio225*
> 
> Ill test it now mate!
> 
> Edit: Testing done. One of the worst Bios i came across in terms of performance... under identical circumstances im getting ~ 1.2 FPS less in comparison to the FE Bios.
> 71.87 AVG vs ~70.65 AVG in Time Spy Game Test 1
> Ftw3 Bios was ~0,7 FPS less in comparison to FE Bios, as an example.


Did you also test Inno3D X4 BIOS?


----------



## Hulio225

Quote:


> Originally Posted by *OneCosmic*
> 
> Did you also test Inno3D X4 BIOS?


Haha, I flashed my card with it one minute ago; installing drivers atm, will tell you in a few if it's good.


----------



## blurp

Quote:


> Originally Posted by *TWiST2k*
> 
> Don't know if this has been posted, but I noticed there is a new Beta of Afterburner, version 4.4.0 Beta 8. I have not personally tested it yet.
> 
> http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta8.rar


Link does not work for me.


----------



## rolldog

Quote:


> Originally Posted by *OneCosmic*
> 
> Those cables which looks like power cables are there for him just to be able to easily connect a DMM and measure voltage, they aren't there to supply voltage, he is still using standard 6/8PIN PEGs to power the cards, those cables would be too slim to power the c
> ards.


The large sleeved cables you see coming into the picture are straight from the PSU, but as you can see, the sleeving is then removed so each individual power wire can be run exactly where it needs to go, instead of plugging into the card's power input and relying on the board circuitry to deliver the power. Each individual wire that makes up the 6/8-pin connector is hooked directly to what it's supplying power to. Similar to running nitrous oxide on your car: instead of routing everything through the hoses, or circuits in this case, you fire the NOS directly into the manifold and you have instant power. Of course, not all the circuits on the GPU can withstand the same increase in voltage, otherwise everyone would just increase the voltage to the entire plug. This way it's more precise: apply additional voltage directly where it's needed.

Why don't you go check out Vince's website, www.kingpincooling.com, and get a rig set up like what he's doing. He's been at this for years, and it's obvious he knows what he's doing. He released his own thermal paste earlier this year, which you can buy direct from his website. I've been wanting to try it for a while, and since I sold my 2 x 980 Ti on eBay last night, 30 minutes after listing them, I think I'll finally be able to use it on my 1080 Tis once the waterblocks are available.

Oh, he also designs LN2 cooling equipment for CPUs and GPUs, which can be purchased from his website, if you want to go that route.


----------



## Hulio225

Quote:


> Originally Posted by *rolldog*
> 
> The large sleeved cables you see coming into the picture are straight from the PS, but as you can see, the sleeving is then removed so each individual power cable can be run exactly where it needs to go, instead of plugging it into the card's power input and relying on the circuit to deliver the power. Each individual power cable that makes up the 6/8 pin connector is hooked directly to what it's supplying power to. Similar to running nitrous oxide on your car. Instead of using something to increase performance running all through the hoses, or circuits in this case, just fire the NOS directly into the manifold and you have instant power. Of course, all the circuits on the GPS can't withstand the same increase in voltage, otherwise, everyone would just increase the voltage to the entire plug. This way, it's more precise. Apply additional voltages directly to where they're needed.
> 
> Why don't you go check out Vincent website, www.kingpincooling.com and get a rig setup like what he's doing. He's been at this for years, and it's obvious he knows what he's doing. *He released his own thermal paste* earlier this year, which you can buy direct from his website. I've been wanting to try it for a while, but since I sold my 2 x 980 Ti on eBay last night, 30 minutes after listing them, I think I'll finally be able to use it on my 1080 Tis, once the waterblocks are available.
> 
> *Oh, he also designs LN2 cooling equipment for CPUs and GPUs*, which can be purchased from his website, if you want to go that route.


Kingpin is a trademark; Vincent himself is just benching these days, his modifications are done by "TiN", and his "own" paste is most likely a paste we already know, just in different packaging with the "Kingpin" brand name on it. It's just marketing...

Just my 2 cents here


----------



## Hulio225

Quote:


> Originally Posted by *OneCosmic*
> 
> Did you also test Inno3D X4 BIOS?


Testing done. It's closer than any other BIOS (Strix, FTW3, Zotac), but still worse than the FE BIOS, and it's not giving any other benefits...

For shunt-modded cards the Nvidia FE BIOS is still the best...


----------



## OneCosmic

Quote:


> Originally Posted by *Hulio225*
> 
> Testing done, its closer than any other bios (strix, ftw3, zotac), but still worse than FE bios and it is not giving any other benefits...
> 
> For shunt modded cards Nvidia FE bios still the best...


I think it is throttling a little less than any other AIB BIOS or the FE BIOS. Seems like those extra 15W on it are really working. What settings are you using for testing?


----------



## Hulio225

Quote:


> Originally Posted by *OneCosmic*
> 
> I think it is throttling a little bit less than any other AIB BIOS or FE BIOS. Seems like those 15W on it are really working. What settings are you using for testing?


I mean it still throttled in Time Spy Game Test 2, and I had lower average FPS in tests 1 and 2, so for me it's worse in general; it won't produce the same or higher scores.

I'm comparing 2101MHz core and a 500/550/600 VRAM offset against the FE BIOS in Time Spy Test 1, and sometimes Test 2 if I see the FPS is close.

Temps always below 34°C under load...

Enough info, or was there something else you wanted to know?


----------



## Luckbad

Quote:


> Originally Posted by *pez*
> 
> Let's also not forget how important noise profiles are. While the regular ICX cooler (2-fan design) from EVGA is louder than the STRIX percentage for percentage, the STRIX has a more annoying noise profile. Of course this comes down to the quality of fans being used, their size, and of course, 2 fans vs 3 fans. I imagine the MSI Gaming X being even better than the EVGA cards based on what I've read, but I have no personal experience with that particular cooler.
> 
> That being said, both are great cards and coolers.


I hated the Strix OC noise profile.


----------



## KedarWolf

Quote:


> Originally Posted by *TWiST2k*
> 
> Don't know if this has been posted, but I noticed there is a new Beta of Afterburner, version 4.4.0 Beta 8. I have not personally tested it yet.
> 
> http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta8.rar


It's in the OP, in a few spoilers.


----------



## KedarWolf

Quote:


> Originally Posted by *blurp*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TWiST2k*
> 
> Don't know if this has been posted, but I noticed there is a new Beta of Afterburner, version 4.4.0 Beta 8. I have not personally tested it yet.
> 
> http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta8.rar
> 
> 
> 
> Link does not work for me.
Click to expand...

Link works for me.


----------



## wrc05

Quote:


> okay cool!












@mtbiker033











Only two pics? Thought you'd post more. Since you have the Ti kit, a stock fan speed of ~45% at load should be good.


----------



## demon09

Quote:


> Originally Posted by *KedarWolf*
> 
> Link works for me.


Hmm, I click the link and for some reason I just go to Guru3D's home page. Was there anything specific that changed in Beta 8? Currently on Beta 6 here.


----------



## KedarWolf

Quote:


> Originally Posted by *demon09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Link works for me.
> 
> 
> 
> hmm i click the link and for some reason i just go to guru3d's home page was there anything specific that got changed in 8 currently on 6 here
Click to expand...

Changed the OP; it seems you need to highlight the link text, copy it, and paste it into your browser.


----------



## demon09

Quote:


> Originally Posted by *KedarWolf*
> 
> Changed OP, seems you need to highlight the link text, copy the text and paste in browser.


:O thanks that worked


----------



## Talon2016

My MSI Gaming X 1080 Ti went back to Fry's this afternoon after it started developing a popping/snapping noise under load in the Superposition 4K benchmark, which continued, less so, at idle. It sounded like someone static-shocking my GPU; sometimes continuous, sometimes it would go a while and then give a single snap/pop. Games began to stutter like crazy, and at that point I called Fry's and was told to return it ASAP.

They didn't have any Gaming X in stock to replace my card, so I took a Strix OC home. So far fan noise is about the same at 70%, but temps are definitely a tad higher than on my Gaming X. It overclocks better though, going by a brief single run: it was stable at +100MHz on the core at stock voltage for a 4K run in SP, which the Gaming X wasn't. I am going to do some heavy OC testing to find out how far it can go. I really am loving the RGB and looks of the card, very nice.


----------



## BoredErica

I'm feeling a little too lazy to test performance at same clocks for SC Black normal bios vs FTW3.


----------



## Coopiklaani

Has anyone tried beta8 yet?
http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta8.rar


----------



## demon09

Quote:


> Originally Posted by *Talon2016*
> 
> My MSI Gaming X 1080 Ti was returned to Frys this afternoon after it started developing a "popping/snapping" noise when under load in Superposition 4K benchmark and continued less so at idle. It sound like someone static shocking my GPU, sometimes continuously and sometimes it would go awhile and then have a single snap/pop noise. Games began to stutter like crazy and at that point I called Fry's and was told to return it ASAP.
> 
> They didn't have any Gaming X in stock to replace my card so I took the Strix OC card home. So far fan noise is about the same at 70% but temps are definitely a tad higher than on my Gaming X card. Although it overclocks better from a brief single run. I was stable at +100Mhz on the core on stock voltage for a 4K run in SP which the Gaming X wasn't. I am going to do some heavy OC testing to find out how far it can go. I really am loving the RGB and looks of the card though, very nice.


Nice. The EVGA FE card I got off Step-Up is a pretty poor overclocker, but I am glad I was able to get it.


----------



## KedarWolf

Quote:


> Originally Posted by *Coopiklaani*
> 
> Has anyone tried beta8 yet?
> http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta8.rar


I'm using it, no issues.


----------



## Coopiklaani

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm using it no issues.


Good to know. I'm more curious about what changes they made since beta6.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> To get really good benchmarks in Superposition (and others) run Superposition until it starts the benchmark, hit escape, stop it, right click on your desktop and go to Nvidia Control Panel, manage 3D Settings, Program Settings tab, click Add, if you ran Superposition you'll see it there, select it, choose Add Selected Program.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now run it using these settings.
> 
> 
> 
> 
> 
> Got this score doing that.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you have a G-Sync screen, disable G-Sync for the bench in Global Settings and you'll get a better score than just disabling it for the benchmark in Program Settings.


Interesting, just ran it with these settings and got this. Probably would get better results with a better CPU.


----------



## BoredErica

Quote:


> Originally Posted by *Coopiklaani*
> 
> Good to know. I'm more curious about what changes they made since beta6.


http://forums.guru3d.com/showpost.php?p=5429350&postcount=354


----------



## TheBoom

Can't do a test at 25C, obviously, since I don't own a chiller or even watercool my card. But there was a post a few hundred pages ago with a graph about exactly that. Nevertheless, I don't think it matters, since most of us can't hold temps like that anyway.

Anyway, since my question got buried: those of you on the XOC BIOS, are you seeing worse performance at identical clocks than on the stock BIOS, or just diminishing returns on the higher clocks achieved with the XOC BIOS?

Don't know why, but 8K Superposition gives me the same results at 2050 and 2100, even with no perfcap reason.

Should I actually be aiming for the rainbow?


----------



## Dasboogieman

Quote:


> Originally Posted by *TheBoom*
> 
> Can't do a test at 25C, obviously, since I don't own a chiller or even watercool my card. But there was a post a few hundred pages ago with a graph about exactly that. Nevertheless, I don't think it matters since most of us can't hold temps like that anyway.
>
> Anyway, since my question got buried: those of you on the XOC bios, are you seeing worse performance at identical clocks versus the stock bios, or just diminishing returns on the higher clocks achieved with the XOC bios?
>
> Don't know why, but 8K Superposition gives me the same results at 2050 and 2100, even with no perfcap reason.
>
> Should I actually be aiming for the rainbow?


Check your NVIDIA settings; they might have changed when the drivers reinstalled.
Check the VRAM speeds. I suspect the performance disparity is VRAM related because the XOC BIOS uses looser timings.


----------



## -JunGLe-

Quote:


> Originally Posted by *ThingyNess*
> 
> I didn't see anyone make a note of it in this thread, but for reference in case anyone gets a 1080Ti for which there isn't a BIOS dump / analysis yet.
> Rather than digging through the code and hex values like I've been doing, if you actually have the card in hand:
> 
> Load up a command prompt (cmd) and copy/paste the following:
> 
> Code:
>
> "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power
> 
> You'll see a nice little text output like this (it's from my 1070 as i'm still waiting on my 1080Ti to arrive)
> 
> Code:
>
> Timestamp                           : Mon Apr 10 15:15:39 2017
> Driver Version                      : 381.65
> 
> Attached GPUs                       : 1
> GPU 0000:02:00.0
> Power Readings
> Power Management            : Supported
> Power Draw                  : 35.86 W
> Power Limit                 : 199.32 W
> Default Power Limit         : 166.10 W
> Enforced Power Limit        : 199.32 W
> Min Power Limit             : 75.00 W
> Max Power Limit             : 200.00 W
> Power Samples
> Duration                    : 2.69 sec
> Number of Samples           : 119
> Max                         : 36.45 W
> Min                         : 8.99 W
> Avg                         : 32.49 W
> 
> This also gives you an easy way to see absolute TDP numbers rather than percentages like GPU-Z gives you, which is nice.
> 
> From this, you can also see that the sliders in most GPU tweaking software seem to round to the nearest integer percentage. For example my slider only goes to 20%, which gives me 166.1w * 1.2 = 199.32w, even though the card technically supports 200w. If you really want that last .68w of TDP headroom, you can actually use nvidia-smi to set the power limit via the command prompt.
> 
> This is also handy if you want to add the command to a batch file and increase your TDP limit without having to have any OC software start up in the background, as they eat up memory and (rarely) have negative interactions with certain games.
> 
> To set a new power target, make sure you run the command prompt or batch file with admin rights, and type:
> 
> Code:
>
> "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -pl <newpowerlimit>
> 
> The output should look something like this, which for me gave me the extra .68w of TDP that my 1070 Strix OC BIOS was capable of that Afterburner and the ASUS Gpu-Tweak utilities weren't giving me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Code:
>
> C:\WINDOWS\system32>"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -pl 200
> Power limit for GPU 0000:02:00.0 was set to 200.00 W from 199.32 W.
> All done.


May I ask if smi.exe can be used to keep the GPU at full clock speed in both 2d and 3d mode?
I just installed my Asus ROG Strix GTX 1080 Ti OC and can't for the life of me find a way for it to maintain full clock speed in Warframe.
It only clocks up when I'm in my landing craft, which puts the GPU under 60% load. As soon as I join a mission, the GPU drops to under 35% load and downclocks right away.
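As an aside on the quoted post: the `nvidia-smi -q -d power` text output is easy to parse if you want to log absolute power numbers without any OC software running. A minimal sketch (the helper name is made up; the sample lines are taken from the output quoted above):

```python
import re

# Sample lines copied from the nvidia-smi output quoted above.
SAMPLE = """\
Power Draw                  : 35.86 W
Power Limit                 : 199.32 W
Default Power Limit         : 166.10 W
Enforced Power Limit        : 199.32 W
Min Power Limit             : 75.00 W
Max Power Limit             : 200.00 W
"""

def parse_power_fields(text):
    """Map field names ('Power Draw', 'Max Power Limit', ...) to floats in watts."""
    pattern = re.compile(r"^\s*([A-Za-z][A-Za-z ]*?)\s*:\s*([\d.]+) W\s*$", re.M)
    return {name: float(value) for name, value in pattern.findall(text)}

limits = parse_power_fields(SAMPLE)
# Remaining TDP headroom the percent slider can't quite reach (~0.68 W in the quoted example)
headroom = limits["Max Power Limit"] - limits["Enforced Power Limit"]
```

Lines whose value isn't in watts (e.g. `Power Management : Supported`) simply don't match the pattern and are skipped.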


----------



## Clukos

Quote:


> Originally Posted by *Benny89*
> 
> *Asus STRIX vs FTW3 review noise and temps* (featuring also GAMING X and Aorus Xtreme in charts) *from Gamers Nexus*:
> 
> 
> 
> 
> In case of noise and temps Asus STRIX is definitelly the best air-cooled card right now according to this detailed review.
> 
> However, I am still thinking that for playing with OC the FTW3 might be a little bit better due to 358W power limit vs 330W STRIX and dual BIOS.
> 
> What do you guys think about this review results?


What you see there is just 250 watt TDP gpu (Strix non-OC) vs 280 watt TDP (MSI Gaming X) vs 300 Watt TDP (FTW3). Higher power limit = higher clocks = higher voltage = higher temps = more noise at stock. This only really matters if you plan on using everything at stock... And this is overclock.net


----------



## Dasboogieman

Quote:


> Originally Posted by *Clukos*
> 
> What you see there is just 250 watt TDP gpu (Strix non-OC) vs 280 watt TDP (MSI Gaming X) vs 300 Watt TDP (FTW3). Higher power limit = higher clocks = higher voltage = higher temps = more noise at stock. This only really matters if you plan on using everything at stock... And this is overclock.net


What irks me is that the reviewer only looked at raw dBA. The specific acoustic frequencies of the fans matter a lot, especially nearer to 100% (which is what OCN cares about the most).

That is why the FE "sounds" pleasant at 100% compared to most units despite the higher dBA.
I hope EVGA changed the fan motors they used on the ACX, because those sounded really whiny at high speeds. The MSI and Strix also have mild high-frequency pitches.

I love how they didn't even bother to test the Aorus; its fans actually sound pleasant at 100% because there aren't any high-pitched frequencies.


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Interesting, just ran it with these settings and got this. Probably would get better results with a better CPU.


i have told you guys about this about a dozen times, lol.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> i have told you guys about this about a dozen times, lol.


Yeah, I understand. But it's kinda funny how benchmark scores can change just by adjusting or changing settings. There's no real baseline for us to use to compare and understand the performance difference between cards unless someone comes out and says what settings they used.

If I were a reviewer or writing an article, I would definitely mention the settings I used on all the cards I'm comparing, so as not to mislead anyone.


----------



## KedarWolf

Quote:


> Originally Posted by *Darkwizzie*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Coopiklaani*
> 
> Good to know. I'm more curious about what changes they made since beta6.
> 
> 
> 
> http://forums.guru3d.com/showpost.php?p=5429350&postcount=354
Click to expand...

Going back to beta 6; beta 8 seems to have a bug where my 2050 core setting jumps to 2062 on a reboot.


----------



## Hulio225

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, I understand. But it's kinda funny how benchmark scores can change just by adjusting or changing settings. There's no real medium for us to use to compare and understand performance difference between cards unless someone comes out and says what settings they used.
>
> If I was a reviewer or writing an article I would definitely mention the settings I used in all the cards I'm comparing to not mislead someone.


A reviewer shouldn't change anything; everything at default. At least that's what I would do, and I guess most are doing it that way.


----------



## BoredErica

Press Steve on those issues enough and he'll make a change for the better.


----------



## TheBoom

Quote:


> Originally Posted by *Dasboogieman*
> 
> Check your NVIDIA settings, they might've changed when the drivers re-installed
> Check the VRAM speeds. I suspect the performance disparity is VRAM related because the XOC BIOS uses looser timings.


Oh it does? I didn't know the bios changed the memory timings on the VRAM.

Oh, and I wasn't comparing the old bios to the XOC. I meant the scores on the XOC bios were similar at different clocks.

Nvidia settings always backed up and reapplied via Inspector.


----------



## AngryLobster

How does one card throttling almost 200 MHz not set off the alarm that something isn't right?

When equalized, the Gaming X seems to have the most cooling surface area of the bunch in absolute terms, so I'd assume it will win once he reruns the benches.


----------



## TheBoom

Quote:


> Originally Posted by *AngryLobster*
> 
> How does one card throttling almost 200mhz not set the alarm off that something isn't right?
> 
> When equalized, in terms of absolute cooling the Gaming X seems to have the most surface area of the bunch so I'd assume that will win once he reruns the benches.


I think the Zotac is still by far the biggest in dimensions.


----------



## pez

Quote:


> Originally Posted by *Luckbad*
> 
> I hated the Strix OC noise profile.


Glad to know I'm not the only one. That card is under water now, and apparently is a decent OC'er. I think I was told that it plateaus somewhere around 2088MHz.


----------



## feznz

Quote:


> Originally Posted by *Benny89*
> 
> Did anyone tried FTW3 Bios on ROG STRIX card? I wonder if it will allow real 358W limit and keep scores the same.


I would be interested too

Quote:


> Originally Posted by *Dasboogieman*
> 
> Check your NVIDIA settings, they might've changed when the drivers re-installed
> Check the VRAM speeds. I suspect the performance disparity is VRAM related because the XOC BIOS uses looser timings.


Mmmm, I had just assumed this as an educated guess, but how do you actually check what the timings are?
Being able to set those timings back to stock on the XOC BIOS would be epic for us watercoolers.

Quote:


> Originally Posted by *KedarWolf*
> 
> Going back to beta 6, beta 8 seems to have a bug where my 2050 core points jumps to 2062 on a reboot.


Edit: Not sure what you mean by that, but if it's the bug I had, go into your MSI installation folder and manually delete the profiles. In my case the core was set to 2050 in the interface but was actually clocking to 2077.

I won't bother testing beta 8 then, but in your opinion which BIOS has been the best so far?


----------



## 4lek

Anything else i should cover with thermal pad guys?
thank you.


----------



## madmeatballs

Quote:


> Originally Posted by *feznz*
> 
> Edit: Not sure what you mean by that, but if it's the bug I had, go into your MSI installation folder and manually delete the profiles. In my case the core was set to 2050 in the interface but was actually clocking to 2077.
>
> I won't bother testing beta 8 then, but in your opinion which BIOS has been the best so far?


Happens to me too on beta 8 but I've swapped bios like 3 times, I'm guessing MSI AB is getting confused lol. I'll try deleting profiles if it would work.


----------



## Benny89

Quote:


> Originally Posted by *Clukos*
> 
> What you see there is just 250 watt TDP gpu (Strix non-OC) vs 280 watt TDP (MSI Gaming X) vs 300 Watt TDP (FTW3). Higher power limit = higher clocks = higher voltage = higher temps = more noise at stock. This only really matters if you plan on using everything at stock... And this is overclock.net


I didn't notice that this is the non-OC STRIX card; the OC version has a 275 W base power limit.

Also, the FTW3 base is 280 W and the MSI GAMING X base is 250 W.

Here is a chart of base power limits and clocks:


http://imgur.com/yAERIDw


But guys, considering all the pros and cons: if you were to buy a GPU, STRIX OC vs FTW3, which would you prefer? I like them both in terms of looks, so let's leave that aside. From a pure performance standpoint, which one?


----------



## mtbiker033

Quote:


> Originally Posted by *wrc05*
> 
> @mtbiker033
>
> Only two pics? Thought you'd post more. Since you have the Ti kit, a stock fan speed of ~45% at load should be good.


here are some more pics!


Spoiler: Warning: Spoiler!


OK, 45% at load sounds good, will give that a go. I need to work on my OC today; I'm actually thinking about flashing the FTW3 BIOS.


----------



## TWiST2k

Quote:


> Originally Posted by *blurp*
> 
> Link does not work for me.


It did not work for me just now, but I copied it and pasted it and it worked. No idea why the link is flaky, but it is there.


----------



## Clukos

Quote:


> Originally Posted by *Benny89*
> 
> I didn't notice that this is non-OC STRIX card. OC has 275W base Power Limit.
> 
> Also FTW3 base is 280W and MSI GAMING X base is 250W.
> 
> Here is a chart of base power limits and clocks:
> 
> 
> http://imgur.com/yAERIDw


That chart includes only what Nvidia and the AIBs rate it at, not the actual power draw in the BIOS. They might say 250 for the Gaming X but the BIOS allows 280 at 100% and 330 at 117%. The FTW3 is 300 watts at 100%, the non-OC Strix is 250 watts at 100%.
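For reference, the percent slider maps to watts multiplicatively against the BIOS's 100% limit. A quick sketch using the figures Clukos quotes (the helper is illustrative, not any tool's API):

```python
def slider_to_watts(base_limit_w, slider_percent):
    """Convert a power-limit slider percentage into an absolute limit in watts."""
    return base_limit_w * slider_percent / 100.0

# Gaming X figures from the post above: 280 W at 100%, ~117% max slider
print(slider_to_watts(280, 117))  # 327.6 W, marketed as roughly 330 W
```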


----------



## Benny89

Quote:


> Originally Posted by *Clukos*
> 
> That chart includes only what Nvidia and the AIBs rate it at, not the actual power draw in the BIOS. They might say 250 for the Gaming X but the BIOS allows 280 at 100% and 330 at 117%. The FTW3 is 300 watts at 100%, the non-OC Strix is 250 watts at 100%.


Interesting. So between the FTW3 and OC STRIX, which one would you choose (ignoring looks and LEDs)?


----------



## Rhadamanthys

Quote:


> Originally Posted by *Rhadamanthys*
> 
> Hey guys, need some urgent help please. While disassembling my FE for water block installation I must have accidentally ripped off one of these tiny little things (not sure what they're called or what their purpose is). I didn't even notice that I did it, just saw it lying on the PCB. Did I kill my card? Is there a way of fixing this, e.g. by soldering?


Quote:


> Originally Posted by *Hulio225*
> 
> This is a capacitor; there is still one left. They were connected in parallel, which simply doubles the capacitance... you could be fine, or your card could get unstable... maybe you will have to resolder it.
>
> At least from what I see in that picture; I can't see the traces properly because of the red circle and angle ^^


I did my best to put that thing back on, but my soldering job looks really bad and I'm afraid the capacitor isn't actually "fixed". So just in case, could you or anyone else please tell me what exactly "unstable" means here? That way, when testing the card, I could better find out if there's something wrong with it. What negative effects from a non-working capacitor am I possibly looking at here?

- card will die soon
- reduced fps under load at stock or OC
- unstable core/mem speed or voltage
- higher temps
- anything else you can think of

I did test the card briefly after resoldering, and in FS Ultra and Witcher 3 (the only benchmarks I did before ripping the cap) there was no immediately noticeable drop in performance compared to before.
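On the parallel-capacitor point above: capacitances in parallel simply add, so losing one of two equal caps halves the local decoupling capacitance at that spot. A toy illustration (the values are made up, not the FE's actual part values):

```python
def parallel_capacitance(*caps_farads):
    """Total capacitance of capacitors wired in parallel: they simply add."""
    return sum(caps_farads)

c = 470e-6  # hypothetical value for each of the two pads
both = parallel_capacitance(c, c)  # with both caps fitted
one = parallel_capacitance(c)      # half the decoupling after one rips off
```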


----------



## Clukos

Quote:


> Originally Posted by *Benny89*
> 
> Interesting. So between the FTW3 and OC STRIX, which one would you choose (ignoring looks and LEDs)?


The FTW3 allows more power draw, which means you can push it further on its stock BIOS; on the other hand, you can flash the XOC BIOS on the Strix and not have to worry about incompatibilities. Of the two, the FTW3 appears to have the better VRM, and because it is 2 slots, if you are going for SLI it might be easier to cool two FTW3s than two Strix Tis. But you can't go wrong with either overall.


----------



## asdkj1740

Quote:


> Originally Posted by *Clukos*
> 
> The FTW3 allows more power draw, which means you can push it further on its stock BIOS; on the other hand, you can flash the XOC BIOS on the Strix and not have to worry about incompatibilities. Of the two, the FTW3 appears to have the better VRM, and because it is 2 slots, if you are going for SLI it might be easier to cool two FTW3s than two Strix Tis. But you can't go wrong with either overall.


However, the FTW3 uses a different vcore controller chip, which may result in instability or even a bricked card.


----------



## KedarWolf

Quote:


> Originally Posted by *ViRuS2k*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> In red is the resistors you do, NOT the shunts.
> 
> 
> 
> 
> 
> Thanks man
> 
> 
> 
> 
> 
> 
> 
> for the tips
> 
> gonna order some of this stuff aswell : Coolermaster Makergel
> 
> there are 3 versions of it lol
> 
> MasterGel
> MasterGel Pro
> MasterGel Maker
> 
> what stuff is what you have
> 
> 
> 
> 
> 
> 
> 
> 
> 
> also will you do a guide/video of your experience that noobs like me can follow heheh
> really i want to be able to watch someone on a video do this lol
Click to expand...

I have Mastergel Maker.

It depends if I can get my phone setup to record as I'm doing it. Because it's so small and detailed I don't want to have to hold my phone as I do it.

At least I'll put step by step pictures as I do if I can't make a video.


----------



## Alex132

EVGA FTW3 still out of stock


----------



## feznz

Quote:


> Originally Posted by *Benny89*
> 
> Interesting. So between the FTW3 and OC STRIX, which one would you choose (ignoring looks and LEDs)?


Silicon lottery. I recall an FE a while back that was game-stable at 2100 MHz; I am jelly. So honestly I think it comes down to aesthetics.
I am a bit biased if you look at my sig rig, so.....


----------



## mtbiker033

Quote:


> Originally Posted by *KedarWolf*
> 
> I have Mastergel Maker.
> 
> It depends if I can get my phone setup to record as I'm doing it. Because it's so small and detailed I don't want to have to hold my phone as I do it.
> 
> At least I'll put step by step pictures as I do if I can't make a video.


Just to make sure, do you recommend using the FTW3 BIOS on an FE? I am using the hybrid cooler.


----------



## VESPA5

My EVGA Hybrid Cooler finally arrived (talk about a company that takes advantage of demand by spiking the price up an additional $40 - smh). Took about a good hour to put on the new custom shroud (doesn't feel as cheap as my previous 980Ti Hybrid shroud did) as well as applying a new die casting module with pre-attached thermal pads (looks identical to the original though).

I get a decent 2015 MHz core and 5530 MHz memory clock while never going over 52C at 1440p. I thought I'd be replacing the stock fan that came with the radiator, but amazingly enough it's actually quiet at 75% or lower. I'm pretty content. I haven't really done any aggressive overclocking with it yet, given that I have a lot more thermal headroom now. Prior to receiving my hybrid cooler, I was running on the reference blower for a couple of weeks, and I never got over how loud it could be regardless of how conservative a fan curve I applied. I'm definitely satisfied with the low temps AND low decibels. Even while gaming with headphones, I had to crank the reference blower to 70% and higher to prevent the GPU from throttling down. Just like with my 980 Ti, slapping on the hybrid cooler was a fantastic improvement.


----------



## KedarWolf

Quote:


> Originally Posted by *mtbiker033*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I have Mastergel Maker.
> 
> It depends if I can get my phone setup to record as I'm doing it. Because it's so small and detailed I don't want to have to hold my phone as I do it.
> 
> At least I'll put step by step pictures as I do if I can't make a video.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> just to make sure, do you recommend using the FTW3 bios on a FE? I am using the hybrid cooler
Click to expand...

The best results I've had without the shunt mod are on the FTW3 BIOS. Peeps are saying that with the shunt mod the stock Nvidia BIOS is best, but until I get the stuff off the slow boat from China I won't know for sure.

eBay, really cheap, long wait.


----------



## mtbiker033

Quote:


> Originally Posted by *KedarWolf*
> 
> Best results no shunt mod I've had is FTW3 BIOS. Peeps saying with shunt mod stock Nvidia best though, but until I get the stuff from the slow boat from China I won't know for sure.
> 
> eBay, really cheap, long wait.


ok cool!! thank you!


----------



## mtbiker033

Quote:


> Originally Posted by *VESPA5*
> 
> My EVGA Hybrid Cooler finally arrived (talk about a company that takes advantage of demand by spiking the price up an additional $40 - smh). Took about a good hour to put on the new custom shroud (doesn't feel as cheap as my previous 980Ti Hybrid shroud did) as well as applying a new die casting module with pre-attached thermal pads (looks identical to the original though).
> 
> I get a decent 2015 MHz Core and 5530 MHz Clock while never going over 52C at 1440p. I thought I'd be replacing the stock fan that came with the radiator but amazingly enough, it's actually quiet at 75% or lower. I'm pretty content. Haven't really done any aggressive overclocking with it yet given that I have a lot more thermal headway now. Prior to receiving my hybrid cooler, I was running on the reference blower for a couple of weeks. I never got over how loud the reference blower could be regardless of how conservative of a fan curve I applied. I'm definitely satisfied with the low temps AND low decibels. Even while gaming with headphones, I had to crank the reference blower to 70% and higher to prevent the GPU from throttling down. Just like what I did with my 980Ti, slapping on the hybrid cooler was a fantastic improvement.


totally loving it here too!! nice work!

even with my SP120's at full 12v ~2200rpm is more quiet than the stock blower was! it's so weird taking off my headphones while gaming and it's still quiet lol


----------



## kevindd992002

Why don't I still see the EVGA GTX 1080Ti FTW3 in retail stores (Amazon, Newegg, B&H, etc.)? I thought it's supposed to be released already?


----------






## islandgam3r

Quote:


> Originally Posted by *kevindd992002*
> 
> Why don't I still see the EVGA GTX 1080Ti FTW3 in retail stores (Amazon, Newegg, B&H, etc.)? I thought it's supposed to be released already?


It's still listed on Newegg, just sold out. It was released 2 days ago and already sold out; boy, you sure can tell how many people were waiting for that beast. If I'm lucky, I'll get my Elite ordered this week from EVGA's site.









https://www.newegg.com/Product/Product.aspx?Item=N82E16814487338


----------



## Benny89

Quote:


> Originally Posted by *kevindd992002*
> 
> Why don't I still see the EVGA GTX 1080Ti FTW3 in retail stores (Amazon, Newegg, B&H, etc.)? I thought it's supposed to be released already?


Yes, I wish I could get one too, but 31.05 is the ship date for me.

So in the meantime I ordered 2 STRIX OCs, and maybe I will win the lottery with them.


----------



## mshagg

Quote:


> Originally Posted by *Rhadamanthys*
> 
> I did my best to put that thing back on, but my soldering job looks really bad and I'm afraid the capacitor isn't actually "fixed". So just in case, could you or anyone else please tell me what exactly "unstable" means here? That way, when testing the card, I could better find out if there's something wrong with it. What negative effects from a non-working capacitor am I possibly looking at here?
>
> - card will die soon
> - reduced fps under load at stock or OC
> - unstable core/mem speed or voltage
> - higher temps
> - anything else you can think of
>
> I did test the card briefly after resoldering, and in FS Ultra and Witcher 3 (the only benchmarks I did before ripping the cap) there was no immediately noticeable drop in performance compared to before.


You're not the first and you certainly won't be the last. A few people here have done something similar and haven't had any problems. Obviously it's not ideal, as an engineer put it there for a reason. The only option you have is to stress the hell out of the card and see how it goes. Maybe it will impact the longevity of the card, given another component is now picking up extra load... there's really no way of telling.

End of the day, it's not like you can warranty the thing, so carry on and keep your fingers crossed.


----------



## kolkoo

Quote:


> Originally Posted by *Rhadamanthys*
> 
> I did my best to put that thing back on, but my soldering job looks really bad and I'm afraid the capacitor isn't actually "fixed". So just in case, could you or anyone else please tell me what exactly "unstable" means here? That way, when testing the card, I could better find out if there's something wrong with it. What negative effects from a non-working capacitor am I possibly looking at here?
>
> - card will die soon
> - reduced fps under load at stock or OC
> - unstable core/mem speed or voltage
> - higher temps
> - anything else you can think of
>
> I did test the card briefly after resoldering, and in FS Ultra and Witcher 3 (the only benchmarks I did before ripping the cap) there was no immediately noticeable drop in performance compared to before.


Why don't you just take the card to somebody who has experience soldering SMD components with hot air? It will probably cost you next to nothing...


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> Yeah, I understand. But it's kinda funny how benchmark scores can change just by adjusting or changing settings. There's no real medium for us to use to compare and understand performance difference between cards unless someone comes out and says what settings they used.
> 
> If I was a reviewer or writing an article I would definitely mention the settings I used in all the cards I'm comparing to not mislead someone.


...and why reviewers, especially those with brand loyalties, are worthless.


----------



## Slackaveli

Quote:


> Originally Posted by *Clukos*
> 
> That chart includes only what Nvidia and the AIBs rate it at, not the actual power draw in the BIOS. They might say 250 for the Gaming X but the BIOS allows 280 at 100% and 330 at 117%. The FTW3 is 300 watts at 100%, the non-OC Strix is 250 watts at 100%.


The chart doesn't even include the Aorus offerings, either... :/


----------



## kevindd992002

Quote:


> Originally Posted by *islandgam3r*
> 
> It's still on Newegg, just sold out, it was released 2 days ago and already sold out, boy you sure can tell how much ppl were w8ting for that beast. If I'm lucky, I'll get my elite ordered this week from EVGA's site
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.newegg.com/Product/Product.aspx?Item=N82E16814487338


Bummer! Correct me if I'm wrong, but the only difference on the Elite version is the different plastic shroud covers, right?

Quote:


> Originally Posted by *Benny89*
> 
> Yes, I wish I could get one too but 31.05 is ship date for me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So in the meantime I ordered 2 STRIX OC and maybe I will win lottery with them.


***! I'll be in the US from May 21 to June 3, hoping that I'll be able to buy this card. Now I'm not sure if I can even buy it in time.

But do they all sell at the same $779.99 price point?


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> Yes, I wish I could get one too but 31.05 is ship date for me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So in the meantime I ordered 2 STRIX OC and maybe I will win lottery with them.


SC2 is available on newegg right now https://www.newegg.com/Product/Product.aspx?item=N82E16814487337


----------



## Alwrath

Quote:


> Originally Posted by *VESPA5*
> 
> My EVGA Hybrid Cooler finally arrived (talk about a company that takes advantage of demand by spiking the price up an additional $40 - smh). Took about a good hour to put on the new custom shroud (doesn't feel as cheap as my previous 980Ti Hybrid shroud did) as well as applying a new die casting module with pre-attached thermal pads (looks identical to the original though).
> 
> I get a decent 2015 MHz Core and 5530 MHz Clock while never going over 52C at 1440p. I thought I'd be replacing the stock fan that came with the radiator but amazingly enough, it's actually quiet at 75% or lower. I'm pretty content. Haven't really done any aggressive overclocking with it yet given that I have a lot more thermal headway now. Prior to receiving my hybrid cooler, I was running on the reference blower for a couple of weeks. I never got over how loud the reference blower could be regardless of how conservative of a fan curve I applied. I'm definitely satisfied with the low temps AND low decibels. Even while gaming with headphones, I had to crank the reference blower to 70% and higher to prevent the GPU from throttling down. Just like what I did with my 980Ti, slapping on the hybrid cooler was a fantastic improvement.


I'm getting my hybrid today. I'm so excited, can't wait to see how far past 1949 MHz I can go with this thing in Doom. Doom OpenGL reduced my OC by a bin, so apparently it's a good game to test OC stability at 4K. If you already had a 980 Ti hybrid cooler, why not just use that one for your 1080 Ti? Coulda saved you the $175 or so. I agree the price is a little high for my liking, but when I'm done with my 1080 Ti I will be using it on the new card I get in the future. I don't care if I have to use zipties. lol.


----------



## islandgam3r

Quote:


> Originally Posted by *kevindd992002*
> 
> Bummer! Correct me if I'm wrong but the only difference the Elite version has are the different plastic shroud covers, right?
> 
> ***! I'll be in the US from May 21 to June 3 hoping that I'll be able to buy this card. Now I'm not sure if I can even make buy it on time.
> 
> But do they all sell at the same $779.99 pricing point?


Yes, the face plates are colored, and yes, they're all at the $779.99 price point. The only time you pay more is if you get it straight from EVGA's site, because they charge tax and shipping costs.


----------



## Alwrath

Look what just showed up at my doorstep









Finally!


----------



## Alwrath

So it begins... this is my first AIO water cooled card! Will report OC's soon


----------



## KraxKill

Quote:


> Originally Posted by *Alwrath*
> 
> when im done with my 1080ti I will be using it on my new card I get in the future. I dont care if I have to use zipties. lol.


Some guy in a leather jacket just showed off Volta on a massive die with over 5000 cores, so the reference design we've all gotten used to may be changing. May require more than just zip ties to get it to work. Might need a bench grinder and deep pockets.


----------



## ViRuS2k

Quote:


> Originally Posted by *KedarWolf*
> 
> I have Mastergel Maker.
> 
> It depends if I can get my phone setup to record as I'm doing it. Because it's so small and detailed I don't want to have to hold my phone as I do it.
> 
> At least I'll put step by step pictures as I do if I can't make a video.


Step by step is better than nowt







hehe cheers....


----------



## kevindd992002

Quote:


> Originally Posted by *islandgam3r*
> 
> Yes the face plates are colored and yes they are 779.99 price point. Only time you pay more is if you get it str8 from EVGA site cause they charge tax and shipping costs


Ok. How does the FTW3 fare compared to the others anyway? I stopped reading this thread every day because it was starting to take a significant amount of time during my work, lol. So I don't have any updated news on it anymore.


----------



## seckzee

Will an EVGA Hybrid kit fit onto an FTW3?


----------



## koof513

Hey guys, my 1080 Ti FTW3 just came in the mail. I was about to start overclocking, but I can't seem to find a BIOS for the FTW3. Is there not one yet, or will the SC2 BIOS work? Thank you.


----------



## TheWizardMan

Quote:


> Originally Posted by *koof513*
> 
> Hey guys my 1080 ti ftw3 just came in the mail. I'm was about to start overclocking but I cant seem to find a bios for the ftw3. Is there not one yet or will the SC2 bios work? Thank you.


Huh? Your card will already have the FTW3 BIOS pre-flashed.


----------



## DADDYDC650

Just used my EVGA Step-Up to a 1080 Ti SC Black Edition. Time to retire my 1080.


----------



## koof513

So you're saying the FTW3 BIOS does not need to be flashed for overclocking?


----------



## TheWizardMan

Quote:


> Originally Posted by *koof513*
> 
> So your saying the ftw3 bios does not need to be flashed for overclocking?


Nope. You just need to download Precision X or Afterburner. I prefer Afterburner, but you'll lose functionality of the extra temp sensors and the LED if you use it.


----------



## koof513

Quote:


> Originally Posted by *TheWizardMan*
> 
> Nope. You just need to download Precision X or Afterburner. I prefer Afterburner, but you'll lose functionality of the extra temp sensors and the LED if you use it.


Yeah, I use Precision X, but since I got the new card I'm going to switch to Precision XOC. Makes spending the extra money a little nicer now that I don't have to flash the BIOS. And I have an extra BIOS : ) Win.


----------



## TheWizardMan

Quote:


> Originally Posted by *koof513*
> 
> yea I use precision x but since I got the new card I'm going to switch to precision xoc. Makes spending the extra money a little nicer now that I don't have to flash the bios. And I have an extra bios : ) Win.


Using the slave BIOS gives you a bit more power so should, theoretically, provide you with a more stable OC. Just keep that in mind.


----------



## koof513

Quote:


> Originally Posted by *TheWizardMan*
> 
> Using the slave BIOS gives you a bit more power so should, theoretically, provide you with a more stable OC. Just keep that in mind.


Thx. I just read that on another site too.


----------



## mtbiker033

Quote:


> Originally Posted by *Alwrath*
> 
> Im getting my hybrid today. Im so excited, cant wait to see how far past 1949 mhz I can go with this thing in Doom. Doom open gl reduced my oc by a bin so apparently its a good game to test for OC stability at 4K. If you already had a 980ti hybrid cooler, why not just use that one for your 1080ti? Coulda saved you the $175 or so. I agree the price is a little high for my liking but when im done with my 1080ti I will be using it on my new card I get in the future. I dont care if I have to use zipties. lol.


You should try running Doom in Vulkan; it performs much better. Go with TXAA 8X, the compute shaders, and Nightmare textures.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> SC2 is available on newegg right now https://www.newegg.com/Product/Product.aspx?item=N82E16814487337


SC2 is ugly







and it doesn't have dual BIOS, and I think it has a lower power limit.


----------



## kevindd992002

Quote:


> Originally Posted by *TheWizardMan*
> 
> Using the slave BIOS gives you a bit more power so should, theoretically, provide you with a more stable OC. Just keep that in mind.


How would using the slave BIOS give you more power?


----------



## TheWizardMan

Quote:


> Originally Posted by *kevindd992002*
> 
> How would using the slave BIOS gove you more power?


Flipping the switch selects the more aggressive BIOS, which has a 130% power target instead of a 120% one.


----------



## KedarWolf

Quote:


> Originally Posted by *koof513*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheWizardMan*
> 
> Nope. You just need to download Precision X or Afterburner. I prefer Afterburner, but you'll lose functionality of the extra temp sensors and the LED if you use it.
> 
> 
> 
> yea I use precision x but since I got the new card I'm going to switch to precision xoc. Makes spending the extra money a little nicer now that I don't have to flash the bios. And I have an extra bios : ) Win.

Can you switch to the slave higher power limit FTW3 BIOS and save it with GPU-Z and link it here as an attachment as a .zip file?

I want to make sure it's the same one I've been testing with.









Edit: In GPU-Z on the main tab you see the Arrow icon, click on that and it'll save it.


----------



## TheWizardMan

Quote:


> Originally Posted by *KedarWolf*
> 
> Can you switch to the slave higher power limit FTW3 BIOS and save it with GPU-Z and link it here as an attachment as a .zip file?
> 
> I want to make sure it's the same one I've been testing with.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: In GPU-Z on the main tab you see the Arrow icon, click on that and it'll save it.


Have we reached a consensus in here that this is the best BIOS to flash to?


----------



## KedarWolf

Quote:


> Originally Posted by *TheWizardMan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Can you switch to the slave higher power limit FTW3 BIOS and save it with GPU-Z and link it here as an attachment as a .zip file?
> 
> I want to make sure it's the same one I've been testing with.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: In GPU-Z on the main tab you see the Arrow icon, click on that and it'll save it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Have we reached a consensus in here that this is the best BIOS to flash to?

Seems to be, on cards with no shunt mod. I've tested it extensively and it repeatedly produces the highest benches. Got a 10900+ Time Spy on it last night.









Edit: Actually, I was very close to 11000; I tried a few tweaks and a different voltage curve to get 11000+ and never succeeded.


----------



## superV

what up






















Did a quick test: played BF1 for some rounds and it boosts to [email protected] 120Hz, with temps around 75-80C at an ambient of 22C.
I hope in the next few days to put it on water and mount my new 240Hz XL2540.
So far I'm impressed on air.
What do you think guys, will 2000 be reachable on water?


----------



## TheWizardMan

Quote:


> Originally Posted by *KedarWolf*
> 
> Seems to be on cards with no shunt mod. I've tested it extensively, it repeatedly producing the highest benches. Got a 10900+ Time Spy on it last night.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Actually i was very close to 11000 because I tried a few tweaks and different voltage curve to get 11000+ and never succeeded.


What were your clocks like?

Looking at all the 3dMark tests, it seems to me that with the FE cards, the highest scores have mid-2000s core clocks.


----------



## nrpeyton

Evening,

I've been away busy with work for a few days, come back and there's 400+ new posts.

What have I missed lol?

Nick


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Evening,
> 
> I've been away busy with work for a few days, come back and there's 400+ new posts.
> 
> What have I missed lol?
> 
> Nick


Well, Wizard has a horrible crush on Slim, but Slim is already seeing peyton and is not interested. Little does Slim know peyton has been seeing other 1080 Ti owners behind his back and will be crushed. But other than that, not much new.


----------



## nrpeyton

lol


----------



## Coopiklaani

Returned my flawed card today and finally got my best GTX 1080 Ti FE so far: 2126MHz core @1.093v and +525MHz mem.

So far I've had 4 GTX 1080 Tis:
1st, FE from NVIDIA, max core [email protected], +500 mem
2nd, FE from EVGA, max core [email protected], +300 mem
3rd, FE from ASUS, max core [email protected], +900 mem; it crashes benchmarks when the core temp is higher than 68C, no matter what frequency it's at







Returned and got my 4th card
4th, FE from ASUS, Max core [email protected] +525Mem


----------



## feznz

nrpeyton you player
Quote:


> Originally Posted by *Slackaveli*
> 
> and why reviewers , especially those with brand loyalty's, are worthless.


and the fact that bad reviews = no more freebies to test, ahem, take home


----------



## iamjanco

Add me to the list. I was going to use a pair of SLI'd 1080 EK-X cards in my build, but this just simplifies everything. I'll hold onto the MSI cards until after I've installed and tested the FTW3 in my test bench setup.


----------



## nrpeyton

Got my new 7700k
ASUS IX ROG APEX
3600 Trident Z C16

Waiting until next week when I'm on 7 days annual leave before I set it all up.

Fingers crossed my 1080 Ti FE's stability doesn't drop from 2100 MHz @ 1.093 V to sub-2.1 GHz at the same voltage with the new CPU.

Coming from an old FX-8350 (AMD) so it's very likely.

As long as it doesn't go below 2.0 GHz I'll be able to live with it.


----------



## ilikelobstastoo

Planning on trying out these Fujipoly pads. Going with 1.0mm to replace the 0.5mm ones and 1.5mm to replace the 1.0mm ones. Question is, how many would I need, including the RAM chips?

Forgot the link... https://www.amazon.com/gp/aw/d/B00ZSJR1ZK/ref=mp_s_a_1_10?ie=UTF8&qid=1494539313&sr=8-10&pi=AC_SX236_SY340_FMwebp_QL65&keywords=fujipoly+thermal+pad


----------



## Benny89

Still can't decide between the STRIX OC and the FTW3...









Please help me.....

FTW3 Pros:
- Dual BIOS
- 358W power limit on the second BIOS at max slider
- Looks awesome
- Sleek 2-slot card, no sag
- EVGA's support is the best on the planet

FTW3 Cons:
- Loudest fans of all the top dogs: STRIX, GAMING X, Aorus Xtreme
- Runs a little hotter on the GPU than the STRIX, but cooler than the GAMING X or Aorus
- Have to wait until 31.05 for shipping

STRIX OC Pros:
- Looks awesome
- Runs a little cooler than the FTW3
- Quietest AIB after the GAMING X
- Best GPU core temps on air
- Available for order now

STRIX OC Cons:
- Only 330W power limit
- Single BIOS
- ASUS support is nowhere near EVGA's

If you were in my place, which one would you pick? Let's say you love the looks of both of them.


----------



## Hulio225

Quote:


> Originally Posted by *Benny89*
> 
> Still Can't decide between STRIX OC and FTW3...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please help me.....
> 
> FTW3 Pros:
> - Dual Bios
> - 358W Power Limit on second BIOS at max slider
> - Looks awesome
> - 2 slot sleek card, no sag
> - EVGA best support on planet
> 
> FTW Cons:
> - Loudest fans from all top dogs: STRIX, GAMING X, Aorus Xtreme
> - Little hotter on GPU than STRIX, but cooler than GAMING X or Aorus
> - Have to wait to 31.05 for shipping.
> 
> STRIX OC Pros:
> - Looks Awesome
> - Runs little cooler than FTW3
> - After Gaming X quietest AIB
> - Best GPU core temps on air
> - Is available now for order
> 
> Cons:
> - Only 330W Power Limit
> - Single BIOS
> 
> If you were on my place- which one would you peak. Lets say you love the looks of both of them.
> - ASUS support is nowhere like EVGA


Haha, by the time you decide, Volta will be released









I personally would go with ASUS. I've had a couple of EVGA cards in my rig and in friends' rigs, and had problems with them more than once. Not saying it's generally like that, but that's my experience...
And I've never had issues with ASUS stuff, so for me at least, I'd go ASUS every day of the week.


----------



## nrpeyton

Quote:


> Originally Posted by *Benny89*
> 
> Still Can't decide between STRIX OC and FTW3...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please help me.....
> 
> FTW3 Pros:
> - Dual Bios
> - 358W Power Limit on second BIOS at max slider
> - Looks awesome
> - 2 slot sleek card, no sag
> - EVGA best support on planet
> 
> FTW Cons:
> - Loudest fans from all top dogs: STRIX, GAMING X, Aorus Xtreme
> - Little hotter on GPU than STRIX, but cooler than GAMING X or Aorus
> - Have to wait to 31.05 for shipping.
> 
> STRIX OC Pros:
> - Looks Awesome
> - Runs little cooler than FTW3
> - After Gaming X quietest AIB
> - Best GPU core temps on air
> - Is available now for order
> 
> Cons:
> - Only 330W Power Limit
> - Single BIOS
> 
> If you were on my place- which one would you peak. Lets say you love the looks of both of them.
> - ASUS support is nowhere like EVGA


I'd go with EVGA.

You can upgrade to water-cooling if you want in future without voiding warranty (which is often needed to get the most from these cards).

With anyone else it's not an option (unless you want to void your warranty).


----------



## Hulio225

Quote:


> Originally Posted by *nrpeyton*
> 
> I'd go with EVGA.
> 
> You can upgrade to water-cooling if you want in future without voiding warranty (which is often needed to get the most from these cards).


Depends where you live; you aren't voiding anything if you take an ASUS card apart and mount a waterblock.
In Germany, for example, you can do that without voiding the warranty.


----------



## nrpeyton

Quote:


> Originally Posted by *Hulio225*
> 
> Depends where you live, you aren't voiding anything if you take a asus card apart and mount a waterblock.
> In germany for example you can do that without voiding the warranty.


That might be "the law" in certain countries, but it's not ASUS policy. So you may still have a headache if it came up and you had to deal with them.

It's easy to fob someone off via email and take ages to reply, frustrating the hell out of you. And I doubt you're going to go to court over a GPU.


----------



## Alwrath

Quote:


> Originally Posted by *Benny89*
> 
> Still Can't decide between STRIX OC and FTW3...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please help me.....
> 
> FTW3 Pros:
> - Dual Bios
> - 358W Power Limit on second BIOS at max slider
> - Looks awesome
> - 2 slot sleek card, no sag
> - EVGA best support on planet
> 
> FTW Cons:
> - Loudest fans from all top dogs: STRIX, GAMING X, Aorus Xtreme
> - Little hotter on GPU than STRIX, but cooler than GAMING X or Aorus
> - Have to wait to 31.05 for shipping.
> 
> STRIX OC Pros:
> - Looks Awesome
> - Runs little cooler than FTW3
> - After Gaming X quietest AIB
> - Best GPU core temps on air
> - Is available now for order
> 
> Cons:
> - Only 330W Power Limit
> - Single BIOS
> 
> If you were on my place- which one would you peak. Lets say you love the looks of both of them.
> - ASUS support is nowhere like EVGA


I'd go with neither. Just get an EVGA Hybrid


----------



## Alwrath

Alright, so first Heaven loop, and it looks like I'm around 45C and good for 2000-2012 MHz at stock voltage. I'm definitely power limited. Flashed to the NVIDIA FE BIOS currently. Is the FTW3 BIOS proven to be better on water than the NVIDIA FE BIOS for an FE card?


----------



## KedarWolf

Quote:


> Originally Posted by *Alwrath*
> 
> Alright so first heaven loop and it looks like im around 45C and im good for 2000 mhz - 2012 mhz stock voltage. Im def power limited. Flashed to Nvidia FE bios currently. Is the FTW3 bios proven to be better on water than Nvidia FE bios for a FE card?


The FTW3 BIOS power limits a ton less than the FE BIOS and gets better benches at the same clocks as a result.


----------



## Alwrath

Quote:


> Originally Posted by *KedarWolf*
> 
> FTW3 BIOS power limits a ton less than FE BIOS and gets better benches at same clocks as the result.


sold


----------



## TWiST2k

Quote:


> Originally Posted by *TheWizardMan*
> 
> Nope. You just need to download Precision X or Afterburner. I prefer Afterburner, but you'll lose functionality of the extra temp sensors and the LED if you use it.


And controlling the fans properly. I've been trying to figure my way through this mess with my FTW3 and it kinda sucks haha.


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> FTW3 BIOS power limits a ton less than FE BIOS and gets better benches at same clocks as the result.


Same on air?

Think I might just flash right now.

Funny thing is, I heard someone saying the XOC was beating the FTW3 BIOS?

Or is the consensus that the XOC is a no-go? _(remember I've been away for a few days)._


----------



## nrpeyton

Quote:


> Originally Posted by *ilikelobstastoo*
> 
> Planing on trying out these fujipoly pads . Going with 1.0 to replace the. 5 and 1.5 to replace the 1.0 ones . Question is how many would I need? Including ram chips ?
> 
> Forgot link ...https://www.amazon.com/gp/aw/d/B00ZSJR1ZK/ref=mp_s_a_1_10?ie=UTF8&qid=1494539313&sr=8-10&pi=AC_SX236_SY340_FMwebp_QL65&keywords=fujipoly+thermal+pad


The extra compression should do the job. _(you'll see an improvement in temps)_

Anything soft and *2.7-3 W/mK* or over will get you comparable results. (they should all compress to 50% of their original height)

_I only say 2.7 because that's what I've seen on eBay._

Save your ££ / $$ / €€

You can get 10cm x 10cm for less than 5 $ / € / £, so you'll have loads left over and plenty to fiddle around with.

_The way it helps me to look at it is this:

Copper is 400 W/mK,

so going from a 3 W/mK pad to a 16 W/mK one barely makes a difference to the overall stack_
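To put some numbers on that, here's a quick conduction estimate (a sketch only; the 1 mm pad thickness, 1 cm² contact patch, and 2 mm copper slab are assumed illustration values, not measurements from any card):

```python
# Steady-state conduction resistance of a slab: R = t / (k * A),
# with t in metres, k in W/mK, A in m^2, and R in K/W.
def slab_resistance(k_w_per_mk, thickness_m, area_m2):
    return thickness_m / (k_w_per_mk * area_m2)

AREA = 1e-4  # assumed 1 cm^2 contact patch

pad_3  = slab_resistance(3,   1e-3, AREA)   # 1 mm pad at 3 W/mK
pad_16 = slab_resistance(16,  1e-3, AREA)   # 1 mm pad at 16 W/mK
copper = slab_resistance(400, 2e-3, AREA)   # 2 mm copper baseplate

print(f"3 W/mK pad:  {pad_3:.2f} K/W")   # 3.33 K/W
print(f"16 W/mK pad: {pad_16:.2f} K/W")  # 0.62 K/W
print(f"copper slab: {copper:.2f} K/W")  # 0.05 K/W
```

Whatever pad you pick, the copper contributes almost nothing; the pad dominates the stack, and since compressing a pad to half its thickness halves its resistance, compression matters as much as the headline W/mK rating.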


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Coopiklaani*
> 
> Returned my flawed card today and finally got my new best gtx 1080 ti FE so far. 2126MHz Vcore @1.093v and +525MHz Mem.
> 
> So far I've had 4 gtx 1080 Ti's
> 1st, FE from nVidia, Max core [email protected], +500 Mem
> 2nd, FE from EVGA, Max core [email protected], +300 Mem
> 3rd, FE from ASUS, Max core [email protected] +900 Mem. it crashes benchmarks when core temp is higher than 68C no matter what frequency it is at
> 
> 
> 
> 
> 
> 
> 
> Returned and got my 4th card
> 4th, FE from ASUS, Max core [email protected] +525Mem


I would never bother with any of that for 2% extra performance, probably even less than 2% since your memory is lower now.


----------



## Alwrath

How many watts is the FTW3 BIOS again? 358?


----------



## TWiST2k

Quote:


> Originally Posted by *Alwrath*
> 
> How many watts is FTW 3 bios again? 358?


I believe it's 355, but you can kick it to 358 with the command line.
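For anyone wondering where numbers like 355 and 358 come from: the power-target slider is just a percentage of the board's base power. A quick sketch (the 280 W base power for the FTW3 is my assumption; check your own card's limits with GPU-Z):

```python
# Power-target slider: watts = base power * percent / 100.
def target_watts(base_w, percent):
    return base_w * percent / 100.0

BASE_W = 280  # assumed FTW3 base power; verify with GPU-Z

print(target_watts(BASE_W, 127))     # 355.6 -> roughly the ~355 W slider max
print(round(358 / BASE_W * 100, 1))  # percent needed to reach 358 W
```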


----------



## Benny89

Quote:


> Originally Posted by *Alwrath*
> 
> Id go neither. Just get a EVGA hybrid


The Hybrid doesn't have an RGB LED, and its power limit is the same as the FE edition's. If it had RGB LEDs, however, I would take it and slap the FTW3 BIOS on it
















EDIT: has anyone tried to flash the FTW3 BIOS to another AIB card?

Also, I see many people saying that after flashing the BIOS they went to "install drivers". I thought we only had to disable the GPU before flashing, no uninstalling drivers? Or was I doing that wrong?


----------



## Alwrath

Hmm... the FTW3 BIOS seems to be getting lower bins/clocks than the FE BIOS. Any idea why? The power limit is at max...


----------



## Alwrath

I'm back to the 1949 MHz I had when air cooled. The FTW3 BIOS is a fail for my FE


----------



## kevindd992002

Quote:


> Originally Posted by *nrpeyton*
> 
> The extra compression should do the job. _(you'll see improvement in temps)_
> 
> Anything soft & *2.7 - 3* w/mk and over will get you comparable results. (they should all compress to 50% original height)
> 
> _I only say 2.7 because thats what I've seen on ebay._
> 
> Save your ££ / $$ / €€
> 
> You can get 10cm x 10cm for less than a 5 $ / € / £ so you'll have loads left over and plenty to fiddle around with.
> 
> _Way it helps me to look at is is this:
> 
> Copper is 400 w/MK.
> 
> so going from 3 w/MK to 16 w/MK will barely make 0.001 of a difference_


I couldn't agree more. I've personally used the Fujipoly XR-m pads; they're very hard to work with (especially the thin ones) and produced negligible temp differences.


----------



## Alwrath

Back on the original NVIDIA FE BIOS. Holds 2000 MHz most of the time. Voltage doesn't seem to do anything. Just ordered 2 Noctua 3000 RPM fans for the radiator. I'm done playing around. Will report what I get with the new fans in a push/pull setup soon


----------



## Alwrath

Quote:


> Originally Posted by *mtbiker033*
> 
> you should try running Doom in Vulkan, it performs much better, go with the TXAA 8X and the compute shaders, nightmare textures


I've played Doom in Vulkan at 4K, went back to OpenGL, and honestly I don't notice any difference on my 1080 Ti. I'm getting over 120 FPS at 60Hz with Fast Sync, so either way it's smooth as butter


----------



## RavageTheEarth

Quote:


> Originally Posted by *kevindd992002*
> 
> I couldn't agree more. I've personally used the Fujipoly Xr-m pads and they're very hard to work with (especially the thin ones) and produced negligible temp differences.


Yeah, I bought some Kryonaut and the 13W/mK Fujipoly Extreme pads in anticipation of the Aorus block release (which was delayed to the last week of the month). Those pads transfer great and are pretty easy to work with.


----------



## kevindd992002

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Yeah i bought some Kronaut and the 13w/mK Fujipoly Extreme pads in anticipation of the Aorus block release (which was delayed to the last week of the month
> 
> 
> 
> 
> 
> 
> 
> ). Those pads transfer great and are pretty easy to work with.


How much temp improvement did you notice?


----------



## Benny89

Did anyone try to flash the FTW3 BIOS on an AIB card?


----------



## RavageTheEarth

Quote:


> Originally Posted by *kevindd992002*
> 
> How much temp improvement did you notice?


Haven't gotten to use them yet because EK delayed the Aorus block from early May to the last week of May. I'm dying to get this thing under water. I can do 2088 on air with temps not going over 60, but I want to see what this card can do with the 1080mm of rad space I have. Going for high 30s under load, but that is wishful thinking. More realistically I'll see low to mid 40s, which in itself would be awesome.

I took this card apart and noticed that Gigabyte used really thick thermal pads, at least 2mm. I feel that if EK uses 1mm and 0.5mm pads on the upcoming block, us Aorus owners will see a nice decrease in VRM temps.


----------



## CoreyL4

Quote:


> Originally Posted by *Benny89*
> 
> Did anyone try to flash FTW3 on AIB card?


I did


----------



## Benny89

Quote:


> Originally Posted by *CoreyL4*
> 
> I did


How were your clocks and power limiting?


----------



## alucardis666

So I'm getting hard locks in games and the error "Application has been blocked for accessing graphics hardware" Is this a bad GPU overclock or is it my RAM timings being too tight? (14-11-11-11-11-11)

I can't seem to game longer than 5-10 minutes at a time before this occurs. And I've already tried clean installing Nvidia Drivers with DDU.


----------



## hotrod717

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Haven't gotten to use them yet because EK delayed the Aorus block from early May to the Kat week if May. I'm dying to get this thing underwater. I can do 2088 on air with temps not going over 60, but i want to see what this card can do with the 1080mm of rad space i have. Going for high 30's under load, but that is wishful thinking. More realistically I'll see low to mid 40's which in itself would be awesome.
> 
> I took this card apart and I noticed that gigabyte used ready thick thermal pads. At least 2mm. I feel that if EK uses 1mm and 0.5mm pads on the upcoming block, us Aorus owners will see a nice decrease in VRM temps.


A universal block is the way to go and lasts over many, many upgrades! If you don't care about "eye-catching fun", it's the better buy!

^ I would think RAM. Those are some odd timings.


----------



## CoreyL4

Quote:


> Originally Posted by *Benny89*
> 
> How were you clocks and power limiting?


Didn't push them higher than my 24/7 overclock. It doesn't power limit at 117%; it power limits around 120% now... not the full 127% though


----------



## KedarWolf

Broke 11000.







FTW3 BIOS, no shunt mod.



http://www.3dmark.com/3dm/19881540?


----------



## RavageTheEarth

Quote:


> Originally Posted by *hotrod717*
> 
> Universal is the way to go and lasts over many, many upgrades! If you dont care about "eye-catching fun', its the way to go!
> 
> ^ I would think ram. They are some odd timings.


I'm just not a fan of anything that doesn't have active VRM cooling. I like the full cover blocks because i get amazing cooling and it's only $150 which isn't a lot considering that i only upgrade every 2-3 years.


----------



## Hulio225

Quote:


> Originally Posted by *KedarWolf*
> 
> Broke 11000.
> 
> 
> 
> 
> 
> 
> 
> FTW3 BIOS, no shunt mod.
> 
> 
> 
> http://www.3dmark.com/3dm/19881540?


Hey Kedar, wrong thread; this is the 1080 Ti owners' thread. Why are you posting a link with a Time Spy result done on a Titan X?


----------



## done12many2

Quote:


> Originally Posted by *KedarWolf*
> 
> Broke 11000.
> 
> 
> 
> 
> 
> 
> 
> FTW3 BIOS, no shunt mod.
> 
> 
> 
> http://www.3dmark.com/3dm/19881540?


Maybe you could share that in the Titan X thread?


----------



## done12many2

Quote:


> Originally Posted by *Hulio225*
> 
> Hey Kedar wrong thread, this is 1080 ti owners thread, why are you posting a link with time spy result done on a titan x?


You sniped me.


----------



## Hulio225

Quote:


> Originally Posted by *done12many2*
> 
> You sniped me.


hehe yeah man xD


----------



## alucardis666

Quote:


> Originally Posted by *alucardis666*
> 
> So I'm getting hard locks in games and the error "Application has been blocked for accessing graphics hardware" Is this a bad GPU overclock or is it my RAM timings being too tight? (14-11-11-11-11-11)
> 
> I can't seem to game longer than 5-10 minutes at a time before this occurs. And I've already tried clean installing Nvidia Drivers with DDU.


Bump.


----------



## Hulio225

Quote:


> Originally Posted by *alucardis666*
> 
> Bump.
> 
> *RAM timings being too tight? (14-11-11-11-11-11)*


Be more specific, and the bold part isn't even a real thing...
I'm counting 11 five times lol

And the easiest way is to reduce the OC and check if the blocked message comes again; if yes, then loosen the timings and check again... it isn't that hard to check which of the two things is causing errors

edit: don't get me wrong, but instead of asking and waiting for speculation, you can try and check and be sure by yourself in a few minutes


----------



## KedarWolf

Quote:


> Originally Posted by *Hulio225*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Broke 11000.
> 
> 
> 
> 
> 
> 
> 
> FTW3 BIOS, no shunt mod.
> 
> 
> 
> http://www.3dmark.com/3dm/19881540?
> 
> 
> 
> 
> 
> Hey Kedar wrong thread, this is 1080 ti owners thread, why are you posting a link with time spy result done on a titan x?

I have an older Maxwell Titan X as dedicated PhysX and for some reason 3DMark detects it instead of my primary 1080 Ti.









Edit: If you look at the results, under secondary card you'll see my 1080 Ti; 3DMark has always been bugged like that for me.


----------



## DerComissar

Quote:


> Originally Posted by *Hulio225*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Broke 11000.
> 
> 
> 
> 
> 
> 
> 
> FTW3 BIOS, no shunt mod.
> 
> 
> 
> http://www.3dmark.com/3dm/19881540?
> 
> 
> 
> 
> 
> Hey Kedar wrong thread, this is 1080 ti owners thread, why are you posting a link with time spy result done on a titan x?

What amazes me is: how did he manage to flash an FTW3 BIOS to a TX?








Edit:
You do know I was just kidding, I've seen your "physX card" on previous test posting results, lol.


----------



## KedarWolf

Tried the hotfix NVIDIA driver; it crashes at the same spot in Time Spy every time.









Went back to the normal latest driver.


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> FTW3 BIOS power limits a ton less than FE BIOS and gets better benches at same clocks as the result.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Same on air?
> 
> Think I might just flash right now.
> 
> Funny thing is, I heard someone saying the XOC was beating the FTW3 BIOS?
> 
> Or is consensus for XOC a no-go? _(remember I've been away for a few days)._

Tried the XOC BIOS: very unstable, lower benches at the same clocks, and it will NOT get my clocks any higher with more voltage.


----------



## KedarWolf

Quote:


> Originally Posted by *done12many2*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Broke 11000.
> 
> 
> 
> 
> 
> 
> 
> FTW3 BIOS, no shunt mod.
> 
> 
> 
> http://www.3dmark.com/3dm/19881540?
> 
> 
> 
> 
> 
> Maybe you could share that in the Titan X thread?

http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/9420_20#post_26088481


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *done12many2*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Broke 11000.
> 
> 
> 
> 
> 
> 
> 
> FTW3 BIOS, no shunt mod.
> 
> 
> 
> http://www.3dmark.com/3dm/19881540?
> 
> 
> 
> 
> 
> Maybe you could share that in the Titan X thread?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/9420_20#post_26088481

My old Maxwell Titan X I use as a PhysX card until I sell it.



GM200, Maxwell.


----------



## KedarWolf

Quote:


> Originally Posted by *DerComissar*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Hulio225*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Broke 11000.
> 
> 
> 
> 
> 
> 
> 
> FTW3 BIOS, no shunt mod.
> 
> 
> 
> http://www.3dmark.com/3dm/19881540?
> 
> 
> 
> 
> 
> Hey Kedar wrong thread, this is 1080 ti owners thread, why are you posting a link with time spy result done on a titan x?
> 
> 
> What amazes me, is how did he manage to flash a FTW bios to a TX?
> 
> 
> 
> 
> 
> 
> 
> 
> Edit:
> You do know I was just kidding, I've seen your "physX card" on previous test posting results, lol.

My Superposition.


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> FTW3 BIOS power limits a ton less than FE BIOS and gets better benches at same clocks as the result.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Same on air?
> 
> Think I might just flash right now.
> 
> Funny thing is, I heard someone saying the XOC was beating the FTW3 BIOS?
> 
> Or is consensus for XOC a no-go? _(remember I've been away for a few days)._
> 
> 
> Tried the XOC, very unstable, lower benches at same clocks, will NOT get my clocks any higher with more voltage.

Okay, you guys are on to me, I dismantled my water cooling rig three times this last 20 minutes, changed out my Titan XP twice for my 1080 Ti and ran the benches, I'm so ashamed.


----------



## alucardis666

Quote:


> Originally Posted by *Hulio225*
> 
> be more specific and the bold part isn't even a real thing...
> im counting 5 times 11 lol
> 
> and the easiest way is to reduce oc and check if blocked message comes again, if yes than losen timings and check again... it isn't that hard to check which thing out of two is causing errors
> 
> edit: don't get me wrong, but instead of asking and waiting for speculations, you can try and check and be sure in a few by yourself


Ryzen reads 5 timings...

Also I seem to have fixed it by lowering my core offsets a little. Guess my OC wasn't as stable as I'd hoped.


----------



## Hulio225

Quote:


> Originally Posted by *alucardis666*
> 
> Ryzen reads 5 timings...
> 
> Also I seem to have fixed it by lowering my core offsets a little. Guess my OC wasn't as stable as I'd hoped.


Glad you figured it out.

Five timings, okay, but not 14-11-11-11-11-11; that is not a real thing^^
It should look more like 14-11-11-22 1T 180 or something


----------



## alucardis666

Quote:


> Originally Posted by *Hulio225*
> 
> Glad you figured it out.
> 
> 5 timings okay but not 14-11-11-11-11-11, that is not a real thing^^
> should look more like 14-11-11-22 1T 180 or something


Typo in a moment of frustration. My timings are 14-11-11-11-11-22-1T


----------



## Hulio225

Quote:


> Originally Posted by *alucardis666*
> 
> Typo in a moment of frustration My timings are 14-11-11-11-11-22-1T


still two 11s too many









but yeah everything good


----------



## KedarWolf

Quote:


> Originally Posted by *Hulio225*
> 
> Quote:
> 
> 
> 
> Originally Posted by *alucardis666*
> 
> Typo in a moment of frustration My timings are 14-11-11-11-11-22-1T
> 
> 
> 
> still two times 11 to much
> 
> 
> 
> 
> 
> 
> 
> 
> 
> but yeah everything good

One 11 too many; Ryzen has five primary timings before the 1T.


----------



## Hulio225

Quote:


> Originally Posted by *KedarWolf*
> 
> One times 11 too much, Ryzen has five primary timings before the 1T.


How is this possible? It's physically the same stick (DDR4), so it has to have the same timings...
i don't get it ^^

i think those are just the timings you can change in the BIOS, but that's not the convention for how we express timings, right?

because if i wrote down every timing i'm able to change on an Apex board i would need several rows here xD


----------



## pfinch

Hey guys,

i got an EVGA FE card via Step-Up... sadly it's hitting full power/voltage throttling








Already noticed that some of you flashed FTW3 BIOS to FE air cooled/no shunt cards.

Can someone please upload/link the FTW3 BIOS for me?









Thank you!


----------



## Hulio225

Quote:


> Originally Posted by *pfinch*
> 
> Hey guys,
> 
> i got an EVGA FE Card via Step-Up... Sadly full power/voltage throttling
> 
> 
> 
> 
> 
> 
> 
> 
> Already noticed that some of you flashed FTW3 BIOS to FE air cooled/no shunt cards.
> 
> Can someone please upload/link the FTW3 BIOS for me?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thank you!


It's all here, just look up the first page, or this thread... http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti


----------



## BoredErica

A longer test of avg frequency/etc on the 3 Unigine tests.



Every test begins with a 15 minute warm up period, followed by a 15 minute measurement period. Tested at fullscreen 1440p.
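For anyone wanting to reproduce this kind of test, here's a minimal sketch of the averaging step: discard the warm-up window and average only the measurement window. The function name and numbers are illustrative assumptions, not BoredErica's actual tooling (the samples would come from a logger such as GPU-Z or HWiNFO).

```python
# Hypothetical sketch: average GPU clock over a measurement window,
# discarding an initial warm-up period.

WARMUP_S = 15 * 60    # 15 minute warm-up, per the methodology above
MEASURE_S = 15 * 60   # 15 minute measurement period

def average_clock(samples, warmup_s=WARMUP_S, measure_s=MEASURE_S):
    """samples: list of (seconds_since_start, clock_mhz) tuples."""
    window = [mhz for t, mhz in samples
              if warmup_s <= t < warmup_s + measure_s]
    if not window:
        raise ValueError("no samples fall inside the measurement window")
    return sum(window) / len(window)

# Example: a card that sags slightly once it heats up.
samples = [(t, 2000 if t < 900 else 1987) for t in range(0, 1800, 5)]
print(round(average_clock(samples), 1))  # 1987.0
```

The same function works on any logged CSV once you parse it into (time, clock) pairs.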


----------



## feznz

Quote:


> Originally Posted by *Hulio225*
> 
> How is this possible, it is physically the same stick (ddr4) so it has to have the same timings...
> i don't get it ^^
> 
> i think those are just the timings you can change in the bios, but its not the convention how we express timings, right?
> 
> because if i would write every timing down which i am able to change on an apex board i would need several rows here xD


I am just as confused too, maybe tWR?
Maybe in the BIOS there are partial secondary and tertiary timing settings instead of using a memory tweak program
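One plausible resolution to the five-versus-four confusion, offered as an assumption rather than something confirmed in the thread: Ryzen boards expose tRCD as separate read (tRCDRD) and write (tRCDWR) values, so five primaries show up where Intel boards show four. A tiny sketch using the values quoted above:

```python
# Hypothetical model of the five Ryzen primary timings before the
# command rate; the timing names are standard JEDEC/AMD fields, the
# values are the ones from the thread.

ryzen_primaries = {"tCL": 14, "tRCDRD": 11, "tRCDWR": 11, "tRP": 11, "tRAS": 22}

def as_string(timings, command_rate="1T"):
    """Join the primaries in order, then append the command rate."""
    return "-".join(str(v) for v in timings.values()) + " " + command_rate

print(as_string(ryzen_primaries))  # 14-11-11-11-22 1T
```

That would make alucardis666's 14-11-11-11-22 1T a perfectly "real" five-primary set.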


----------



## BoredErica

OCCT has been updated to v4.5.


----------



## kolkoo

So elmor replied about the Asus XOC BIOS that he sees the same performance between the XOC and FE BIOS. Can any of you that tested these and saw perf differences reply with your exact versions? Maybe ASUS can fix the XOC and then we can all benefit from a nice-performing, no-power-limit BIOS









http://forum.hwbot.org/showthread.php?t=169488&page=3


----------



## kevindd992002

I don't see the EVGA GTX 1080 Ti FTW3 on Micro Center's website. Is there a possibility that it is just not yet listed but they do have actual stock?


----------



## kolkoo

Quote:


> Originally Posted by *kevindd992002*
> 
> I don't see the EVGA GTX 1080Ti FTW3 in Micro Center's website. Is there a possibility that it is just not yet listed but they do have actual stocks?


It's in stock on newegg as of right now https://www.newegg.com/Product/Product.aspx?Item=N82E16814487338&cm_re=ftw3-_-14-487-338-_-Product


----------



## ilikelobstastoo

Quote:


> Originally Posted by *nrpeyton*
> 
> The extra compression should do the job. _(you'll see improvement in temps)_
> 
> Anything soft & *2.7 - 3* w/mk and over will get you comparable results. (they should all compress to 50% original height)
> 
> _I only say 2.7 because thats what I've seen on ebay._
> 
> Save your ££ / $$ / €€
> 
> You can get 10cm x 10cm for less than a 5 $ / € / £ so you'll have loads left over and plenty to fiddle around with.
> 
> _Way it helps me to look at is is this:
> 
> Copper is 400 w/MK.
> 
> so going from 3 w/MK to 16 w/MK will barely make 0.001 of a difference_


Took your advice and saved some $, and ordered some Gelid ones for a good price. Got all 3 sizes I'm wanting for $20 USD, and will have a ton left over for testing. The Gelid ones are still 12 W/mK.

Also got some Kryonaut. Unfortunately in my haste I didn't include a drain in my loop, so I'll be adding one so I can more easily remove, re-paste, and change over the thermal pads on my EK block.
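For a rough sense of how much pad conductivity matters, here is a back-of-the-envelope conduction model (illustrative numbers only, not measurements). Note that in practice contact resistance at the pad surfaces often dominates, which is why upgrading from ~3 to ~12 W/mK tends to matter less than the raw conduction numbers suggest.

```python
# Conduction across a thermal pad: R = t / (k * A), delta_T = P * R.
# Contact resistance is ignored here, so real-world deltas are smaller.

def pad_delta_t(power_w, thickness_m, k_w_per_mk, area_m2):
    """Temperature rise across the pad for a given heat flow."""
    resistance = thickness_m / (k_w_per_mk * area_m2)  # K/W
    return power_w * resistance

# 0.5 mm compressed pad, 10 mm x 10 mm contact patch, 5 W through it:
for k in (3.0, 12.0, 16.0):
    print(k, "W/mK ->", round(pad_delta_t(5.0, 0.5e-3, k, 1e-4), 2), "C")
```

The compression advice above fits the same math: halving thickness halves the resistance regardless of which pad you buy.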


----------



## kevindd992002

Quote:


> Originally Posted by *kolkoo*
> 
> It's in stock on newegg as of right now https://www.newegg.com/Product/Product.aspx?Item=N82E16814487338&cm_re=ftw3-_-14-487-338-_-Product


Yes but those idiots don't accept international cc's







I'll be in CA this May 21 for a business trip and am planning to drop by a Micro Center if they have one in stock.


----------



## Alwrath

Hey guys, is there any chance we could unlock a 1080 Ti into a Titan XP? Maybe get a dual-BIOS card, flash a Titan XP stock NVIDIA BIOS onto it, and cross your fingers? If it fails you can just switch over to your other BIOS... do not even think about it on a card with only 1 BIOS.

What are your thoughts? Will it work?


----------



## niobium615

Quote:


> Originally Posted by *Alwrath*
> 
> Hey guys, is there any chance we could unlock a 1080ti into a Titan XP? Maybe get a dual bios card and flash a Titan Xp nvidia stock bios on it and cross your fingers? If it fails then you can just switch over to your other bios... do not even think about it on a card with only 1 bios.
> 
> What are your guys thoughts? will it work?


Nope, won't work. The memory controller and extra cores are fused off inside the die.


----------



## Alwrath

Quote:


> Originally Posted by *niobium615*
> 
> Nope, won't work. The memory controller and extra cores are cut off internally in the die.


Ok, gotcha


----------



## KingEngineRevUp

Quote:


> Originally Posted by *kevindd992002*
> 
> Yes but those idiots don't accept international cc's
> 
> 
> 
> 
> 
> 
> 
> I'll be in CA this May 21 for a business trip and am planning to drop by a Micro Center if they have one in stock.


Can't you check out with PayPal? You can also do in-store pickup at Newegg.


----------



## kevindd992002

Quote:


> Originally Posted by *SlimJ87D*
> 
> Can't you check out with Paypal? You can also do in store pickup at Newegg also.


I already tried that and they really are strict with payments that use cards from non-US countries. Oh wait, you can do in-store pickup at Newegg and pay on the spot? If that's the case, that'd be great!


----------



## Alwrath

Man, can't wait to get these Noctua fans on my EVGA Hybrid radiator. I'm guessing my temps will go from 45°C to somewhere around 38-39°C. If that's the case, I'll get up to 2025 MHz at stock voltage, I'll bet.


----------



## gmpotu

Two noob questions.
What does AIB and XOC stand for?

@Talon have you tried the FTW3 Bios on your Strix OC card? Curious to see if the results are better than the Strix OC bios.


----------



## Alwrath

I can't wait till Vega and Volta are released; I'm going to put another 1080 Ti in SLI when the price comes down, giggidy.


----------



## lilchronic

2076Mhz / 6240Mhz @ 1.081v XOC bios no shunt mod.
http://www.3dmark.com/3dm/19895595


----------



## Alwrath

Quote:


> Originally Posted by *lilchronic*
> 
> 2076Mhz / 6240Mhz FE bios no shunt mod.
> http://www.3dmark.com/3dm/19895595


----------



## lilchronic

Quote:


> Originally Posted by *Alwrath*


that was on XOC bios not FE.


----------



## BoredErica

Quote:


> Originally Posted by *gmpotu*
> 
> Two noob questions.
> What does AIB and XOC stand for?
> 
> @Talon have you tried the FTW3 Bios on your Strix OC card? Curious to see if the results are better than the Strix OC bios.


Think xoc is Strix OC. AIB is add in board, so like the non-FE MSI, Asus, EVGA cards etc.


----------



## lilchronic

Quote:


> Originally Posted by *Darkwizzie*
> 
> Think xoc is Strix OC. AIB is add in board, so like the non-FE MSI, Asus, EVGA cards etc.


XOC = Extreme Overclock ..... I think


----------



## gmpotu

I might try out the FTW3 BIOS tonight on my Strix OC and see how it goes. The Strix uses two 8-pin power cables, so the extra power limit from the FTW3 might be useful for me.

I'm on air cooling so pretty limited, and I'm a super noob at this stuff, so it's probably more for the fun of switching BIOSes for the first time. Plus I'm hoping this will fix my issue with my system not shutting down since I got this new card.

I'm assuming the FTW3 BIOS won't let me use the Asus Aura Sync software though, right?


----------



## TheWizardMan

Quote:


> Originally Posted by *gmpotu*
> 
> I might try out FTW3 bios tonight on my Strix OC and see how it goes. Strix uses two 8-pin power cables so the extra power from FTW3 might be useful for me.
> 
> I'm on Air cooling so pretty limited and I'm a super noob at this stuff so probably more for the fun of switching Bioses for the first time. Plus I'm hoping this will fix my issue with my system not shutting down side I got this new card.
> 
> I'm assuming FTW3 won't let me use the Asus Aura sync software though right?


Isn't Aura Sync run through the mobo anyway?


----------



## lilchronic

Quote:


> Originally Posted by *Alwrath*
> 
> I got trollllllllllllllllllled LOL


lol sorry.

Here was my highest with FE bios. 2063Mhz with power throttling causing core and voltage to drop to 2000Mhz and 0.963v


----------



## TheWizardMan

Quote:


> Originally Posted by *lilchronic*
> 
> lol sorry.
> 
> Here was my highest with FE bios. 2063Mhz with power throttling causing core and voltage to drop to 2000Mhz and 0.963v


The FTW bios definitely seems to fix power throttling, but the system becomes unstable in other aspects for me. Startup became a bit more sloppy, etc.


----------



## kolkoo

Quote:


> Originally Posted by *lilchronic*
> 
> 2076Mhz / 6240Mhz @ 1.081v XOC bios no shunt mod.
> http://www.3dmark.com/3dm/19895595


I get 11200 graphics score with FTW3 Bios on Time Spy and curve at

Mem @ 6220Mhz

1093 mV - 2100 Mhz

1075-1081 mV - 2088 Mhz

then 1 bin less until 993mV

And my card power throttles to 2037 at times and is at 2088 99% of the time, so it seems there's not much to gain with XOC... (I haven't touched any NVIDIA Control Panel options for 3DMark)


----------



## Alwrath

Quote:


> Originally Posted by *TheWizardMan*
> 
> The FTW bios definitely seems to fix power throttling, but the system becomes unstable in other aspects for me. Startup became a bit more sloppy, etc.


The FTW3 BIOS was terrible for my FE; it didn't work for me at all, even on water. I lost 4 bins clockspeed-wise and was at or below where I was when air cooled; sometimes I saw 1926 MHz. Ugh :(


----------



## wammos

Quote:


> Originally Posted by *Alwrath*
> 
> FTW3 bios was terrible for my FE, didnt work for me at all on water too. I lost 4 bins clockspeed wise and was at when I was air cooled or lower, sometimes I saw 1926 mhz. ugh : (


Same here the XOC has been best for my EVGA FE under water and no shunt.


----------



## KedarWolf

With the FTW3 BIOS I do lose a bin or two of maximum overclock, but benching I get about the same in Time Spy as with the Strix OC BIOS, even though I have the Strix higher at 2088 / 1.093 V and the FTW3 at 2062 / 1.075 V.

In Superposition, no matter how I adjust the Strix curve, I get 150 more points with the FTW3 at 2062.

For actual gaming though I'm using the Strix BIOS, as with the FTW3 I'm not fully stable at 2088 with G-Sync on.


----------



## TheWizardMan

My cards (both of them) boosted to 2000 Mhz with no offset on the FTW bios. Still happy to hit 2088 with no power throttling. I flashed back to the FE BIOS because I didn't like some of the FTW3 "side effects", but it was definitely better for power throttling.


----------



## Nico67

Just did a comparison of my Asus FE BIOS versus the NVIDIA FE BIOS, as they do show quite a bit of difference in a hex comparison.

Asus 2138/6180 - 10693

Nvidia 2138/6180 - 10692

So aside from the hex difference they perform identically, and I used identical curves. If anything the Asus seemed very marginally more stable and had fractionally higher min fps. The biggest difference I have noted across all the BIOSes is that FE BIOSes always have better min fps while other BIOSes always have better max fps. Min fps is what I'd rather have.

As for XOC, it does seem fractionally worse in performance, whereas if it's not power limiting it should be noticeably better.

XOC 2152/6180 - 10632

Note that XOC is not limiting at all, at least not visibly, whereas the FE BIOSes are both dropping to 2075-88 for large portions of their runs.


----------



## wammos

Quote:


> Originally Posted by *Nico67*
> 
> Just did a comparison of my Asus FE bios versus Nvidia FE bios, as they do show quite a bit of difference in hex comparison.
> 
> Asus 2138/6180 - 10693
> 
> 
> 
> Nvidia 2138/6180 - 10692
> 
> 
> 
> so aside from the hex difference they perform identically and I used identical curves. If anything the Asus seemed very marginally more stable and had fractionally higher min fps. The biggest difference I have noted in all the bioses, is FE bioses allways have better min fps while other bioses allways have better max fps. Min fps is what I'd rather have.
> 
> As for XOC, it does seem fractionally worse performance, where if its not power limiting it should be noticeably better.
> 
> XOC 2152/6180 - 10632
> 
> 
> 
> Note that XOC is not limiting at all, at least not visibly, were the FE bioses are both dropping to 2075-88 for large portions of their runs.


Nice! That's why I use the XOC: it doesn't throttle and is stable. I am willing to lose a few fps for stability.


----------



## buddatech

Well I _thought_ my 1080 Ti FE was stable @ 2088/11,962 at 1.093 V. Turns out true stability for gaming is 2063/11,800 at 1.093 V. Idle is 19-22°C, and the max load temp I've seen is 45°C with vsync on, 49-50°C with vsync off.





















How good/bad is my FE w/hybrid cooler compared to other 1080 Ti's??


----------



## feznz

Quote:


> Originally Posted by *gmpotu*
> 
> Two noob questions.
> What does AIB and XOC stand for?.


Quote:


> Originally Posted by *Darkwizzie*
> 
> Think xoc is Strix OC. AIB is add in board, so like the non-FE MSI, Asus, EVGA cards etc.


Actually, to be technical, I had to look it up a few days ago because it was the first time I had come across the AIB acronym.

Anything that plugs into a PCI slot is an AIB, so that's sound cards, GPUs, etc.; it has just loosely come to mean a board partner's version of the AMD or NVIDIA card.

Who am I to change the mainstream use of AIB now?


----------



## Hulio225

Quote:


> Originally Posted by *feznz*
> 
> actually to be technical I had to look it up a few days ago because it was the first time I had come across the AIB acronym
> 
> any thing that plugs into a PCI slot is AIB, so that is sound card, GPUs etc just it has loosely been accepted as the original manufacturer's version of the card ie AMD & NVidia has been applied as AIB
> 
> Who am I to change the mainstream use of AIB now?


I always thought AIB stands for "aftermarket individual board", which would make some sense at least


----------



## BoredErica

To test the BIOSes, can I just run at a high but stable overclock, run Valley for 10 minutes to get the card warmed up, then bench and just compare benches? If I have to find the max overclock, check stability for each BIOS, and then test performance, there goes all of my free time, lol.
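A quick compare like that only tells you something if the score gap exceeds run-to-run noise. Here's a hypothetical helper for that call; the 0.5% noise threshold is an assumption, not a measured figure, so tune it to your own repeat runs.

```python
# Treat two BIOSes as "different" only if their benchmark score gap
# exceeds typical run-to-run noise (threshold is an assumed 0.5%).

def meaningfully_different(score_a, score_b, noise_pct=0.5):
    """True if the gap between two scores exceeds the noise threshold."""
    gap_pct = abs(score_a - score_b) / max(score_a, score_b) * 100
    return gap_pct > noise_pct

# Nico67's numbers earlier in the thread:
print(meaningfully_different(10693, 10692))  # Asus FE vs Nvidia FE: False
print(meaningfully_different(10692, 10632))  # FE vs XOC: True
```

By this yardstick, Nico67's Asus-FE vs NVIDIA-FE gap (1 point) is pure noise, while the FE vs XOC gap (60 points, ~0.56%) just clears the bar.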


----------



## Benny89

Interesting Kraken G12 test results for VRAM temps.


----------



## ilikelobstastoo

Quote:


> Originally Posted by *Benny89*
> 
> Interesting Kraken G12 test results for VRAM temps.


yeah but can you believe anything that comes out of that JayzTwoCents dude's mouth? Out of the tech YouTubers he's by far the most biased, least trustworthy one!!


----------



## Luckbad

Just got in an EVGA 1080 Ti FTW3 to compare to my Zotac 1080 Ti Amp Extreme.

Preliminary testing shows that it's a bit weaker than the Zotac, both in the silicon lottery and in potential performance with the stock cooling. It doesn't cool as well as the 2.75-slot Zotac beast and gets louder. It also has a lower power limit.

If I max the fans out, pop it into K-Boost mode, and max out the voltage, I can get a stable 2025 core / 6003 memory with no throttling. I'll have to mess with things a fair amount more to discover the real potential of this specific card. It's not quite in "screw it, I'm boxing this up and selling it on eBay" territory.

I'd prefer to keep the EVGA over the Zotac, if only for future customer support.

Best Superposition score with it so far is 10250, which is a bit lower than I get with the Zotac.


----------



## Alwrath

Quote:


> Originally Posted by *Luckbad*
> 
> Just got in an EVGA 1080 Ti FTW3 to compare to my Zotac 1080 Ti Amp Extreme.
> 
> Preliminary testing shows that it's a bit weaker than the Zotac both in silicon lottery and potential performance with the stock cooling. It doesn't cool as well as the 2.75 slot Zotac beast and gets louder. It also has a lower power limit.
> 
> If I max the fans out and pop it into K-Boost mode and max out voltage I can get a stable 2025 core, 6003 memory with no throttling. I'll have to mess with things a fair amount more to discover the real potential of this specific card. It's not quite in "screw it, I'm boxing this up and selling this on eBay" territory.
> 
> I'd prefer to keep the EVGA over the Zotac if only for future customer support.
> 
> Best Superposition score within it so far is 10250, which is a bit lower than I get with the Zotac.


Sounds like you should keep the Zotac


----------



## Benny89

Quote:


> Originally Posted by *Luckbad*
> 
> Just got in an EVGA 1080 Ti FTW3 to compare to my Zotac 1080 Ti Amp Extreme.
> 
> Preliminary testing shows that it's a bit weaker than the Zotac both in silicon lottery and potential performance with the stock cooling. It doesn't cool as well as the 2.75 slot Zotac beast and gets louder. It also has a lower power limit.
> 
> If I max the fans out and pop it into K-Boost mode and max out voltage I can get a stable 2025 core, 6003 memory with no throttling. I'll have to mess with things a fair amount more to discover the real potential of this specific card. It's not quite in "screw it, I'm boxing this up and selling this on eBay" territory.
> 
> I'd prefer to keep the EVGA over the Zotac if only for future customer support.
> 
> Best Superposition score within it so far is 10250, which is a bit lower than I get with the Zotac.


Silicon lottery is silicon lottery. Jay's EVGA SC2 1080 Ti does a 2114 MHz max stable clock and runs 72°C with the stock fan maxed. Silicon lottery. My Strix OC is way cooler than that but can't dream of going above 2012.

But could you please kindly elaborate more about noise and temps, FTW3 vs your Zotac?

What I am interested in:
1. Cooling performance at stock: max temps reached after an hour or two on stock fans, and your noise impressions.
2. Noise and cooling at 40-55% fan speed.

If you could provide me with that info I would be much obliged!


----------



## Luckbad

Quote:


> Originally Posted by *Alwrath*
> 
> Sounds like you should keep the Zotac


That's where I'm leaning right now. EVGA seems to have confirmed via Twitter several days ago that there will be an FTW3 Hybrid, so I should probably just cut this card loose on eBay.

In terms of practical usage, I can get a stable 2062 without voltage changes on the Zotac while gaming and stay at 65 or below without letting the fans exceed 70%.

The FTW3 gets up to 68, the fans are considerably louder, and it throttles itself down to 2012 or even lower with no voltage changes.

The sound difference alone is enough to drive me away. The sound profile of the FTW3 is perfectly acceptable, it's just louder than the Zotac. Both the FTW3 and Zotac have a pleasant sound, unlike the Asus Strix OC.

Welp. Here's hoping the FTW3 Hybrid comes out within the next month or so, or it'll just be pointless to buy with Volta looming.


----------



## CoreyL4

Quote:


> Originally Posted by *ilikelobstastoo*
> 
> yeah but can you believe anything that comes out that jays2cents dudes mouth? out of the tech youtubers hes by far the most bias least trustworthy one!!


yeah, he's gotten a big head lately.


----------



## feznz

Quote:


> Originally Posted by *Hulio225*
> 
> I always thought AIB stands for "aftermarket individual board" what would make some sense at least


That makes perfect sense. Now either the internet lied to me or I was too lazy to look it up further, but FE is shorter, easier, and leaves no room for misinterpretation

Quote:


> Originally Posted by *Darkwizzie*
> 
> To test the bioses can I just run at a high but stable overclock, run Valley for 10 minutes to get it warmed up, then bench and just compare benches? If I have to find max overclock after checking for stability for each bios and then testing performance there goes all of my free time, lol.


I don't know, Valley was good for the old cards, being DX11; whenever I run most of these old benches I really only average 75% or less GPU core usage.
Honestly, stability is a hard one to actually test in a way that guarantees stability. I really don't worry too much about getting to the final stable-ish bin, rather I look at the FPS I gain.
If I lose 10% of FPS at a 60 FPS top clock, that is 54 FPS, and I cannot tell the difference unless I have an FPS counter that distracts me from the game. Simply put, I know when a game is unplayable and I know when a game is buttery smooth, and the minimum FPS that takes is debatable.

Unless it's for epeen benches, I game at 2038 MHz with memory at 5900 MHz, the card being an Asus Strix 1080 Ti under water. Considering how much more these cards cost compared to any other card, I've got to expect way above average performance and OC.

I think the best way to put it: anything over 2000 MHz is golden, 1900 MHz and above is awesome, 1800 MHz and above is okay for 40°C tropics, and 1700 MHz and above is getting into Nana territory.









The worst way to find out you have an unstable card: get 45 min into a round of Battlefield 1, almost at the end of an epic round, thinking you might be in the top 5 players or better, and ........then........... hang. Then you realise your OC is unstable and you don't even make the top 20 players. Story of my life.


----------



## Luckbad

Quote:


> Originally Posted by *Benny89*
> 
> Sillicon Lottery is Sillicon lottery. JayZ SC2 1080 Ti EVGA card is 2114mhz max stable clock and runs 72C on stock fan max. Sillicon Lottery. My OC STRIX is way cooler than that but can't dream of going above 2012.
> 
> But could you please kindly elaborate more about noise and temps FTW3 vs your ZOTAC?
> 
> What I am intersted is:
> 1. Cooling perforamance on stock- max temps reached in the same after hour-two on stick fans and your noise impressions.
> 2. Noise and cooling at fans speed 40-55%
> 
> If you could provide me with those info I would be much much oblidged!


The Zotac is ultra quiet until about 60% fan speed. In my super quiet PC (all Cougar CFD black fans that I had to work to get to the USA for intakes and exhaust, and Noctua AF14 industrial2000s at low RPMs on the radiator), it barely contributes to the sound. Even at 65-70% where I max it out, it's clearly audible but not bothersome.

The FTW3 is also rather quiet when idling (note that with both the EVGA and Zotac, I never let the fans turn off. I'd rather go cooler than quieter). It becomes clearly audible at 55%, and rather loud at 65%.

Both idle right around 30 C +/- a few degrees. The FTW3 doesn't have as much surface area for the heatsink as the Zotac and the fans aren't as big, so it throttles much sooner.


----------



## TahoeDust

What are you guys running to test thermal management? I am messing with my fan curve and want something that will really try to heat this thing up, something that will throw as much heat at it as a long gaming session. I am running the Superposition stress test right now. Do people still use FurMark?


----------



## Dasboogieman

Quote:


> Originally Posted by *TahoeDust*
> 
> What are you guys running to test thermal management. I am messing with my fan curve and want something to really try to heat this thing up. Something that will throw as much heat at it as a long gaming session. I am running the superposition stress test right now. Do people still use Furmark?


The Superposition 8K preset will heat the GPU easily


----------



## TahoeDust

Quote:


> Originally Posted by *Dasboogieman*
> 
> Superposition 8k will heat the gpu easily


Thanks. For some reason it will not run the stress test at anything above native resolution. It lets me select them and start them, but I can see by memory usage and FPS that it is not actually rendering at the selected resolution.

I did just run the 8K benchmark and it ran correctly, using the correct amount of VRAM. Max temp was 73°C and the score was 4391, which looks middle of the pack... not bad for an old 2700K Z68 system.


----------



## Slackaveli

Quote:


> Originally Posted by *feznz*
> 
> That makes perfect sense now the internet lied to me or I was too lazy to look up further but FE is shorter and easier and no misinterpretation
> I don't know valley was good for the old cards being DX11 when ever I run most of these old benches I really only average 75% or less on GPU core usage
> Honestly stability is a hard one for an actual test that guarantees stability I really don't worry too much on getting to the final stablish bin rather with the calculation of FPS I gain
> If I loose 10% on FPS @ 60FPS at top clock then that is 54FPS I cannot tell the difference unless I have a FPS counter that distracts me from the game, simply I know when a game is unplayable and I know when a game is buttery smooth which is debateable at what the actual FPS should be minimum
> 
> unless its for EPEEN benches then I game @ 2038Mhz with memory @ 5900Mhz card being Asus Strix 1080 Ti under water considering how much more these cards are compared to any other card then I got to expect way above average performance and OC
> 
> I think best way to say anything over 2000MHZ IS GOLDEN 1900Mhz and above is awesome 1800Mhz and above is Okay 1700Mhz and above is getting into Nana territory
> 
> 
> 
> 
> 
> 
> 
> 
> 
> worst way to tell you have an unstable card is get 45min into a round of Battlefield 1 almost at the end of an epic round thinking I might be in the top 5 players or better and ........then........... hang you then realise your OC is unstable and don't even make it top 20 players story of my life


Actually, Battlefield is my go-to for CPU stability, too. The game is going to crash on you if you aren't pretty much perfect.


----------



## Hulio225

Quote:


> Originally Posted by *TahoeDust*
> 
> Thanks. For some reason it will not run the stress test in anything above native. It lets me select them and start them, but I can see by memory usage and FPS it is not actually rendering at the selected resolution.
> 
> I did just run the 8k benchmark and it ran correctly using the correct amount of vram. Max temp was 73* and score 4391, which looks middle of the pack....not bad for an old 2700k z68 system.


8K is so GPU intensive, and therefore GPU bound, that the CPU you use makes almost no difference


----------



## Slackaveli

Quote:


> Originally Posted by *Slackaveli*
> 
> actually , Battlefield is my goto for cpu stability, too. game is going to crash you if you arent pretty much perfect.


Quote:


> Originally Posted by *TahoeDust*
> 
> Thanks. For some reason it will not run the stress test in anything above native. It lets me select them and start them, but I can see by memory usage and FPS it is not actually rendering at the selected resolution.
> 
> I did just run the 8k benchmark and it ran correctly using the correct amount of vram. Max temp was 73* and score 4391, which looks middle of the pack....not bad for an old 2700k z68 system.


That's the thing, 4K helps your CPU a TON. I see dudes all the time buy an expensive mobo, an obscene amount of hella expensive RAM, and a $350-$1800 i7, and then say, "I don't have enough money for a new monitor for now, gonna stick with my 1080p 144Hz monitor." And often enough they have the nerve to throw in an ignorant "4K isn't ready anyway," when they could have spent all that money on a badass new 4K monitor and wouldn't have even needed a CPU upgrade at all. Going to 4K IS a CPU upgrade.
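The logic behind that claim can be sketched with a toy frame-time model: each frame costs roughly max(CPU time, GPU time), and raising resolution inflates only the GPU term, so a CPU that bottlenecks at 1080p stops mattering at 4K. The numbers below are made up purely for illustration.

```python
# Toy model: per-frame cost is whichever of the CPU or GPU takes longer.
# Resolution scaling changes gpu_ms but leaves cpu_ms roughly constant.

def fps(cpu_ms, gpu_ms):
    """Frames per second when CPU and GPU work overlap each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

print(round(fps(8.0, 5.0)))   # 1080p: CPU-bound -> 125
print(round(fps(8.0, 14.0)))  # 4K: GPU-bound, CPU headroom irrelevant -> 71
```

In the GPU-bound case, a faster CPU (say 6 ms instead of 8 ms) changes nothing, which is the "4K is a CPU upgrade" argument in miniature.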


----------



## Alwrath

Quote:


> Originally Posted by *Slackaveli*
> 
> actually , Battlefield is my goto for cpu stability, too. game is going to crash you if you arent pretty much perfect.


Ryzen is stable here at 3965 MHz, GeForce stable at 2012, according to BF1


----------



## Hulio225

Quote:


> Originally Posted by *Alwrath*
> 
> Ryzen is stable here at *3965* mhz, geforce stable at 2012 according to BF1


It would literally drive me nuts to have this clock^^ i would do everything possible to get it to 4000, otherwise i would destroy the cpu^^


----------



## Alwrath

Quote:


> Originally Posted by *Hulio225*
> 
> It would literally drive me nuts to have this clock^^ i would do everythuing possible to get it to 4000, otherwise i would destroy the cpu^^


Have fun with that 1.43 voltage then


----------



## TahoeDust

I don't play at 4k, but I do play at ultrawide 3440x1440 100hz. This 1080ti is a beast...locks on 100fps in most everything I play.


----------



## Hulio225

Quote:


> Originally Posted by *Alwrath*
> 
> Have fun with that 1.43 voltage then


My cooling would be enough; i sometimes push 1.6 V through my i7 7700K and it never saw thermal throttling^^


----------



## Slackaveli

Quote:


> Originally Posted by *Alwrath*
> 
> Ryzen is stable here at 3965 mhz, geforce stable at 2012 according to BF1


nice, my man. Broadwell steady at 4.3, 1080 Ti stable at 2063, checking in.

As Hulio said, that clock would merc my OCD lol. You have proven yourself mentally sound by being able to run that clock, sir









I really should get a hybrid kit. I just hate the look of them whilst loving the look of my Aorus' cooler, and am not up for a full loop from scratch. Meh. Stuck at ~60°C when I want to be sub-55°C, really sub-49°C. Or, actually, I want sub-38°C under load. Arrg. It's never ending. Next spring I think I may actually build myself a chill box. That looks fun as hell. I really want to see -10°C on my OSD, for my OCD


----------



## Alwrath

Quote:


> Originally Posted by *Slackaveli*
> 
> nice, my man. Broadwell steady at 4.3, 1080Ti stable at 2063 , checking in.
> 
> I really should get a hybrid kit. I just hate the look of them whilest loving the look of my Aorus' cooler; and am not up for a full loop from scratch. meh. stuck at ~60c, when i want to be sub 55c. really sub 49c. Or, actuall, I want sub 38c under load. Arrg. It's never ending. Next spring I think I may actually build myself a chill box. That looks fun as hell. I really want to see -10c on my OSD for my ocd


What voltage for 2063? If you're stock then you should just get it on water ASAP LOL


----------



## Alwrath

Quote:


> Originally Posted by *Hulio225*
> 
> My cooling would be enough, i sometimes push 1.6V throu my i7 7700k and it never saw thermal throttling^^


My cooling is enough as well, I just dont want to go over 1.4


----------



## Hulio225

Quote:


> Originally Posted by *Alwrath*
> 
> My cooling is enough as well, I just dont want to go over 1.4


Fair enough...


----------



## Slackaveli

Quote:


> Originally Posted by *Alwrath*
> 
> What voltage for 2063? If your stock then you should just get it on water ASAP LOL


i can bench at 2101, dips to 2088, with max volt slider. i play BF1 at 2050-2063 at 1.063v. Pretty sure this dude would plug along happily at 2114 underwater. Definitely gameable at 2101 non-air.


----------



## Alwrath

Im sittin here playing snes emulator mega man x2, star fox, yeah I need that 4 GHZ so bad lol


----------



## Alwrath

Quote:


> Originally Posted by *Slackaveli*
> 
> i can bench at 2101, dips to 2088, with max volt slider. i play BF1 at 2050-2063 at 1.063v.


Get it on water and ill see you in the 2150 mhz club


----------



## Slackaveli

Quote:


> Originally Posted by *Alwrath*
> 
> Im sittin here playing snes emulator mega man x2, star fox, yeah I need that 4 GHZ so bad lol


You actually did right. It would be hard to accept at first, but temps and volts factor into my OCD, too. Although Ryzen is safe daily to 1.425, I'd want to see 1.399v, lol.
Quote:


> Originally Posted by *Alwrath*
> 
> Get it on water and ill see you in the 2150 mhz club


haha, you know how to tempt a man.


----------



## Alwrath

Quote:


> Originally Posted by *Slackaveli*
> 
> you actually did right. it would be hard to accept at first, but temps and volts factor into my ocd, too. Although ryzen is safe daily to 1.425, i'd want to see 1.399v, lol.
> haha, you know how to tempt a man.


Yeah only problem is my SOC voltage is 1.2, I will not give up my 3500 mhz ddr4 cl14 LOL


----------



## Slackaveli

Quote:


> Originally Posted by *Alwrath*
> 
> Yeah only problem is my SOC voltage is 1.2, I will not give up my 3500 mhz ddr4 cl14 LOL


oh, jah, i remember you couldnt get your SOC higher. Nothing wrong with 3.965 and 3500Mhz , beautiful rgb ram on an 8 core beast. Not at all.


----------



## Alwrath

Quote:


> Originally Posted by *Slackaveli*
> 
> oh, jah, i remember you couldnt get your SOC higher. Nothing wrong with 3.965 and 3500Mhz ram on an 8 core beast. Not at all.


Im chewing through games like a hot knife through butter. As im sure your rig will do as well till the end of time, and believe me, the end is coming


----------



## Slackaveli

Quote:


> Originally Posted by *Alwrath*
> 
> Im chewing through games like butter. As im sure you rig will do as well till the end of time, and believe me, the end is coming


of that i have no doubts, and yes Ms Eva3 does chew games up. You have the 8 cores, I have the magic cache, and we both have Tis over 2Ghz. Dont get any better.


----------



## Alwrath

Quote:


> Originally Posted by *Slackaveli*
> 
> of that i have no doubts, and yes Ms Eva3 does chew games up. You have the 8 cores, I have the magic cache, and we both have Tis over 2Ghz. Dont get any better.


I have 2 Noctua 3000 rpm fans coming for my geforce radiator, when I plug them in im gonna laugh like a school girl. ill be at 2050 mhz stock voltage probably. 38C range.


----------



## Alwrath

My new rig is like playing poker, I went "all in" literally.


----------



## Alwrath

Quote:


> Originally Posted by *Slackaveli*
> 
> of that i have no doubts, and yes Ms Eva3 does chew games up. You have the 8 cores, I have the magic cache, and we both have Tis over 2Ghz. Dont get any better.


I heard rumors about your broadwell destroyer, it seems the claims arent unfounded


----------



## UrGonnaDie

No matter what I do in MSI Afterburner I can't get the voltage to unlock







I'm on 4.3, I can't seem to find the beta 4.4


----------



## BoredErica

Quote:


> Originally Posted by *feznz*
> 
> ]
> That makes perfect sense now the internet lied to me or I was too lazy to look up further but FE is shorter and easier and no misinterpretation
> I don't know valley was good for the old cards being DX11 when ever I run most of these old benches I really only average 75% or less on GPU core usage
> Honestly stability is a hard one for an actual test that guarantees stability I really don't worry too much on getting to the final stablish bin rather with the calculation of FPS I gain
> If I lose 10% FPS at a 60FPS top clock, that's 54FPS; I cannot tell the difference unless I have an FPS counter that distracts me from the game. Simply, I know when a game is unplayable and I know when a game is buttery smooth, though what the actual minimum FPS should be is debatable
> 
> unless its for EPEEN benches then I game @ 2038Mhz with memory @ 5900Mhz card being Asus Strix 1080 Ti under water considering how much more these cards are compared to any other card then I got to expect way above average performance and OC
> 
> I think the best way to say it: anything over 2000MHz is golden, 1900MHz and above is awesome, 1800MHz and above is okay, 1700MHz and above is getting into Nana territory
> 
> 
> 
> 
> 
> 
> 
> 
> 
> worst way to tell you have an unstable card: get 45 min into a round of Battlefield 1, almost at the end of an epic round, thinking I might be in the top 5 players or better, and ........then........... a hang. You then realise your OC is unstable and don't even make the top 20 players. Story of my life


I don't understand how you're getting 75% usage. My pictures showed that Heaven caused lower clocks than Superposition.

I have in the past played Battlefield just to test stability.


----------



## velocityx

Witcher 3 is also quite good for testing GPU OC ;] but it's true about Battlefield, fire up either 4 or 1 and if some of your gear slips errors, Battlefield will find them all


----------



## Slackaveli

Quote:


> Originally Posted by *Alwrath*
> 
> I have 2 Noctua 3000 rpm fans coming for my geforce radiator, when I plug them in im gonna laugh like a school girl. ill be at 2050 mhz stock voltage probably. 38C range.


my fans! i love them. they wreck shop, man.


----------



## KedarWolf

Quote:


> Originally Posted by *UrGonnaDie*
> 
> no matter what i do in msi afterburner i cant get the voltage to unlock
> 
> 
> 
> 
> 
> 
> 
> im on 4.3 i cant seam to find the beta 4.4


Seems to be a problem with how Overclock.net handles links, but if you highlight the link, copy the text and paste it in your browser, it works.









http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta6.rar


----------



## kolkoo

Quote:


> Originally Posted by *Nico67*
> 
> Just did a comparison of my Asus FE bios versus Nvidia FE bios, as they do show quite a bit of difference in hex comparison.
> 
> Asus 2138/6180 - 10693
> 
> 
> 
> Nvidia 2138/6180 - 10692
> 
> 
> 
> so aside from the hex difference they perform identically, and I used identical curves. If anything the Asus seemed very marginally more stable and had fractionally higher min fps. The biggest difference I have noted in all the bioses is that FE bioses always have better min fps while other bioses always have better max fps. Min fps is what I'd rather have.
> 
> As for XOC, it does seem fractionally worse performance, where if its not power limiting it should be noticeably better.
> 
> XOC 2152/6180 - 10632
> 
> 
> 
> Note that XOC is not limiting at all, at least not visibly, whereas the FE bioses are both dropping to 2075-88 for large portions of their runs.


Hmm so this may debunk the theories that the Nvidia FE bios is better than the Asus and XOC bioses... I guess at the end of the day I gotta also test for myself

Edit: What about the XOC min fps? (just like you I also have the Asus FE - well not as golden as yours in clock, and my Asus FE Bios is quite different Hex-wise to nvidia FE)


----------



## feznz

Quote:


> Originally Posted by *Darkwizzie*
> 
> I don't understand how you're getting 75% usage. My pictures showed that Heaven caused lower clocks than Superposition.
> 
> I have in the past played Battlefield just to test stability.
> pls dun't quadruple post.


Quote:


> Originally Posted by *feznz*
> 
> feznz- i7 3770k @ 5Ghz / 2200Mhz - Asus 1080 Ti Strix - 2076MHz/5955MHz - 145.7FPS- 6094 score -1080p
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> feznz- i7 3770k @ 5Ghz / 2200Mhz - Asus 1080 Ti Strix - 2076MHz/5955MHz - 67.5FPS- 2824 score - 3840x1600p
> 
> 
> 
> heres my monitoring with both runs first 1080p then 1600p straight afterwards
> with the 1080p the GPU usage was hovering around the 75% mark while the 1600p GPU usage was constantly 99-100%
> if I could only get the 1080p to run the Gpu at 100% I should get 25% higher score or around estimated 180FPS anyone got some hot tips? I applied the basic tweaks on OP post.
> I not sure about the CPU it was hovering around 25% usage


Sorry, my bad, the above post explains it: 1080p just doesn't load the GPU up for me, but looking at the last few entries it could be that my 3770k CPU is not optimised.
What is your Valley score for 1080p? Probably doesn't matter as 1080p is dead to me

on another note I am seriously getting the upgrade itch I been looking at this board

https://www.mightyape.co.nz/product/asus-rog-maximus-viii-extreme-motherboard/23970393?utm_source=MailingList&utm_medium=email&utm_campaign=computers&utm_content=65584

Wait for Intel X299 or take a punt on an AMD X399 chipset board?
I set the upgrade bar at an 8-core with a guaranteed 4.8GHz OC on the cores. Spending money isn't the issue, it's just that the 3770k isn't really holding me back at 3840x1600

Maybe I will just go play some games and remind myself I have never seen 100% CPU usage in any game.
Anyone willing to tell me the most CPU-intensive current game, so I can test for myself and cure this itch?


----------



## Hulio225

Quote:


> Originally Posted by *kolkoo*
> 
> Hmm so this may debunk the Nvidia FE bios is better than asus and xoc bioses theories... I guess at the end of they day I gotta also test for myself
> 
> Edit: What about the XOC min fps?


Don't generalize and don't draw a conclusion from just two posted runs by one person... the sample size is too small.
Superposition is too long for comparing two BIOSes to each other if the differences are very, very small... you need to use something like
Time Spy game test 1 or 2 and do at least 5-10 runs each to see if differences exist.

If you let Superposition run 5 times with the same BIOS you will get results which differ more from each other than the screenshots he posted with 2 different BIOSes...
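Hulio's point about run-to-run variance can be sketched numerically. A minimal example (the scores below are made-up, not from the thread): only treat two BIOSes as different if the gap between their mean scores exceeds the run-to-run noise.

```python
from statistics import mean, stdev

# hypothetical Time Spy GT1 results for two BIOSes, 5 runs each
bios_a = [10693, 10671, 10702, 10655, 10688]
bios_b = [10692, 10660, 10699, 10648, 10684]

def differ(a, b):
    # crude check: does the gap between means exceed the run-to-run noise?
    gap = abs(mean(a) - mean(b))
    noise = max(stdev(a), stdev(b))
    return gap > noise

differ(bios_a, bios_b)  # False: a ~5 point gap is buried in ~19 points of run variance
```

With differences this small, a proper comparison needs many more runs (or a real significance test) before calling one BIOS faster.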


----------



## BoredErica

Quote:


> Originally Posted by *feznz*
> 
> sorry my bad the above post explains 1080p HD just don't load the GPU up for me in but looking at the last few entry's could be the fact of my 3770k CPU is not optimised
> what is you valley score for 1080HD, probably doesn't matter as 1080p is dead to me
> 
> on another note I am seriously getting the upgrade itch I been looking at this board
> 
> https://www.mightyape.co.nz/product/asus-rog-maximus-viii-extreme-motherboard/23970393?utm_source=MailingList&utm_medium=email&utm_campaign=computers&utm_content=65584
> 
> wait for intel X299 or take a punt on AMD 399 chipset board
> I set the upgrade bar at 8core guaranteed 4.8Ghz OC on core spending money isn't the issue just the 3770k isn't really holding me back at 3840x1600p
> 
> maybe I will just go play some games and remind myself I have never seen 100% CPU usage in any game
> Anyone willing to tell me the most CPU intensive game in current games? so I can test for myself to cure this itch


Hmm, I test at 1440p for everything.


----------



## KedarWolf

Can anyone with spare 47 Ohm 0805 resistors mail me a half dozen of them?

I have everything I need except those; even amazon.ca is like June until they arrive.









I'm in Canada and can reimburse you the shipping costs.


----------



## feznz

Quote:


> Originally Posted by *Darkwizzie*
> 
> Hmm, I test on 1440p for everything.


I just looked at your sig, must be old age. I went through all my benchmarks the other day and realised 3DMark06 was my first benchmark, and it reflects how far we have come since then. I think I was pulling 300FPS+ @ 60% GPU usage with this GTX 1080 Ti, where looking back my 8800GTS was maxing out at 100FPS. Hardly got this 1080 Ti off idle temp


----------



## Hulio225

Quote:


> Originally Posted by *feznz*
> 
> I just looked at your sig, must be old age. I went through all my benchmarks the other day and realised 3DMark06 was my first benchmark, and it reflects how far we have come since then. I think I was pulling 300FPS+ @ 60% GPU usage with this GTX 1080 Ti, where looking back my 8800GTS was maxing out at 100FPS. Hardly got this 1080 Ti off idle temp


Yeah, in 3DMark06 your CPU is the limiting factor, it can't deliver enough data to the GPU for higher FPS


----------



## ViRuS2k

Quote:


> Originally Posted by *KedarWolf*
> 
> Can anyone with spare 47 Ohm 0805 resistors mail me a half dozen of them?
> 
> I have everything I need, except those, even amazon.ca is like June until they arrive.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm in Canada and can reimburse you the shipping costs.


Bought mine here mate : http://www.ebay.co.uk/itm/300745229871?_trksid=p2060353.m2749.l2649&var=600050687744&ssPageName=STRK%3AMEBIDX%3AIT

they took like 1-2 days to arrive, 10 of them







just waiting until next friday before i can do anything lol as im off for 10 days








Not sure on the shipping costs though, damn these things are tiny,
and for me to ship them to you would take way longer than you ordering them from the link above...


----------



## 4lek

My first test.
Stock settings, no OC yet.

Glad to have this update


----------



## mfdoom7

I'm getting low PPD, help?







Is it a folding software bug? It has been running for about an hour or so.


----------



## Benny89

Quote:


> Originally Posted by *Luckbad*
> 
> The Zotac is ultra quiet until about 60% fan speed. In my super quiet PC (all Cougar CFD black fans that I had to work to get to the USA for intakes and exhaust, and Noctua AF14 industrial2000s at low RPMs on the radiator), it barely contributes to the sound. Even at 65-70% where I max it out, it's clearly audible but not bothersome.
> 
> The FTW3 is also rather quiet when idling (note that with both the EVGA and Zotac, I never let the fans turn off. I'd rather go cooler than quieter). It becomes clearly audible at 55%, and rather loud at 65%.
> 
> Both idle right around 30 C +/- a few degrees. The FTW3 doesn't have as much surface area for the heatsink as the Zotac and the fans aren't as big, so it throttles much sooner.


Nice mate, thanks. But what is the max temp they both reach on average after OC? Also, with a 358W TDP does the FTW3 really throttle?

Can you use Afterburner with the Zotac card? Because I heard you need some sort of special software from them for OC. Can you still use the voltage curve on the Zotac etc.? I really like the AB OSD and curve option (my goal is a card that can do at least 2000MHz stable at 1.000V so I can keep it there, so the voltage curve is important).

But at around 55% fan speed both cards are about the same in noise, right? And temps?

EDIT: Also, according to the TechPowerUp review the Zotac AMP Extreme is louder on stock fans than other AIBs by 2dB: https://www.techpowerup.com/reviews/Zotac/GeForce_GTX_1080_Ti_Amp_Extreme/36.html while achieving the same 69C temp


----------



## Nyrmo

I have one question...
I installed the S1080TIXOC BIOS on my Gigabyte GTX 1080 Ti Founders Edition and it works very well.
But I can't find the power consumption of the graphics card. Can I see this somewhere?

Sorry, my English is not really good ;-)


----------



## KedarWolf

Quote:


> Originally Posted by *Nyrmo*
> 
> I have one Question...
> i Installed the S1080TIXOC Bios on my Gigabyte GTX1080ti Founders Edition and it works very well.
> But i can´t find the power consumption of the Grafic Card. Can i see this somewere?`
> 
> Sorry, my english ist not really good ;-)


install afterburner... scroll to power in graph.... or gpu-z...


----------



## Nyrmo

Before I made the BIOS update, I had the Power option in the list.
Now it's gone.
Not in Afterburner and not in GPU-Z


----------



## ViRuS2k

Quote:


> Originally Posted by *Nyrmo*
> 
> I have one Question...
> i Installed the S1080TIXOC Bios on my Gigabyte GTX1080ti Founders Edition and it works very well.
> But i can´t find the power consumption of the Grafic Card. Can i see this somewere?`
> 
> Sorry, my english ist not really good ;-)


That's what's so mysterious about the XOC bios







keeps everything hidden so that no one can figure out why 2100MHz gives near enough the same performance as a different bios does at 2025MHz...
5% is lost in limbo somewhere....

though i believe without a shadow of a doubt that the power limit is completely removed from that bios.... everything else though is a mystery....

damn i wish there was a way to mod our own BIOSes







maxwell style lol


----------



## ViRuS2k

Quote:


> Originally Posted by *KedarWolf*
> 
> install afterburner... scroll to power in graph.... or gpu-z...


He is talking about TDP etc., XOC does not show that anymore......
It's completely normal, as the XOC bios has no TDP limit anymore i think lol


----------



## Dart06

I was able to hit 2065Mhz on my EVGA 1080Ti SC Black Edition. Gonna try to push it further.


----------



## Aganor

Hi all,
Since yesterday I'm a proud owner of an MSI 1080 Ti FE, and Monday I'll mount the EK WC block on it.
Until now I was able to do a +160 core and +600 memory OC with the voltage slider locked and +120 power limit.
I saw on the 1st page about using other BIOSes to unlock the voltage slider, but I don't understand the bios flashing, or whether it is safe / voids the warranty?

With the default cooling and 100% fan speed at all times my card is at 77ºC in the Heaven benchmark, but with my custom WC loop I can manage temps around 25ºC without throttling.

I truly love this beast vs my "old" water-cooled MSI 1080, but I wanted to squeeze the most out of it without bricking it.

My question is, why would I flash the bios? I understand more voltage means more room for OC, but I came to know that on Pascal it isn't that useful, or am I wrong?

PS: Sorry about the English, it's not my 1st language.


----------



## TWiST2k

Quote:


> Originally Posted by *KedarWolf*
> 
> Seems to be a problem with how overclock handles links but if you highlight the link and copy the text and paste in browser, it works.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta6.rar


There is a version 8 now. You can just change the URL to access it.


----------



## KedarWolf

Quote:


> Originally Posted by *TWiST2k*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Seems to be a problem with how overclock handles links but if you highlight the link and copy the text and paste in browser, it works.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta6.rar
> 
> 
> 
> There is a version 8 now. You can just change the URL to access it.
Click to expand...

Version 8 I tried, had a bug where every time I rebooted my 2050 core point changed to 2062, and version 7 someone said borked their Windows install, hence version 6 which works flawlessly for me.


----------



## Luckbad

Quote:


> Originally Posted by *Benny89*
> 
> Nice mate, thanks. But what is the max temp they both achieve on average after OC? Also woth 358W TDP does FTW3 really throttles?
> 
> Can you use Afterburner with Zotac card? Because I heard you needs some sort of special software from them for OC. Can you still use voltage curve on Zotac etc? I really like AB OSD and curve option (my goal is card that can do at least 2000mhz stable on 1.000V so I can keep it there so voltage curve is important).
> 
> But at around 55% fan speed both cards are rather the same in noise, right? And temps?
> 
> EDIT: Also, according to TDP review Zotac Extremem is loudest on stock fans than other AIBs by 2db : https://www.techpowerup.com/reviews/Zotac/GeForce_GTX_1080_Ti_Amp_Extreme/36.html while achieving same 69C temp


You can use Afterburner with the Zotac, yeah. You only really need FireStorm to control the lights and turn off fan freeze (so you can control fan speeds in Afterburner).

The fans are not loud at all if you make a custom profile. My card never exceeds 65 C in games even if I max the fans out at 60%, and they're pretty quiet at that speed. Max them at 100% and they can get loud for sure. With a proper fan curve tuned to keep temps to ~65, the Zotac is quieter than the FE, EVGA FTW3, and Asus STRIX OC that I've owned. The EVGA SC2 Hybrid can be quieter.

The FTW3 after OC gets to about 70 C vs 65 C on the Zotac (both with full power, no voltage boost and the same conditions) using a fan curve that doesn't exceed 65% (and it's louder than I'd like at that speed).

All of this information can be intuited based solely on the physical size of the cards. The FTW3 is a true 2 slot design. The Zotac takes up darn near 3 slots. Bigger cooling means better cooling if you don't screw it up, and that bears out in this case. The biggest disadvantages are louder fans to maintain temps, thermal throttling, or some combination thereof.

I absolutely want to end up with an EVGA. Their customer service is stellar.

At this time, I haven't found an EVGA card that's as good as the Zotac all around.

The final possibility is a good silicon lottery winner in a FTW3 Hybrid. That will be overbuilt, quiet, and cool.

Edit: And to answer the 55% fan speed question. Similar volume for both, Zotac is a good 5 C cooler.
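A "fan curve tuned to keep temps to ~65" like the one described above is easy to sketch. The curve points below are hypothetical, not Luckbad's actual profile; the logic is just linear interpolation between (temp, fan%) points, clamped at both ends.

```python
# hypothetical (temp_C, fan_%) points capping the fans at 60%
CURVE = [(30, 30), (50, 40), (60, 50), (65, 60)]

def fan_speed(temp_c):
    # linearly interpolate between curve points, clamped at both ends
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

fan_speed(55)  # 45.0: halfway between the 50C and 60C points
```

Afterburner and FireStorm do essentially this between the points you drag on their curve editors; the bigger Zotac heatsink just lets the same target temp be hit at a lower fan percentage.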


----------



## Coopiklaani

Quote:


> Originally Posted by *KedarWolf*
> 
> Version 8 I tried, had a bug where every time I rebooted my 2050 core point changed to 2062, and version 7 someone said borked their Windows install, hence version 6 which works flawlessly for me.


I tried and have no problem with beta 8. I guess your problem is more related to how you set your offsets. For instance, a card that boosts to 1999.5MHz at +0 offset will boost to 2037.5MHz with a +50 offset, and to 2050 with a +51 offset.
Also the F-V curve changes with temperature. When you reboot the system, the card is much colder and the F-V curve is much more aggressive.
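Those two offsets landing 12.5MHz apart come from Pascal's clock granularity: the GPU only runs at discrete boost bins roughly 12.5-13MHz apart, so a requested clock snaps down to the nearest bin. A rough sketch (the fixed 12.5MHz bin width is an approximation; real Pascal steps alternate between 12 and 13MHz):

```python
import math

BIN_MHZ = 12.5  # approximate Pascal boost-bin spacing

def snapped_clock(base_boost_mhz, offset_mhz):
    # requested clock snaps down to the nearest bin at or below base + offset
    return math.floor((base_boost_mhz + offset_mhz) / BIN_MHZ) * BIN_MHZ

snapped_clock(1999.5, 50)  # 2037.5, as in the post
snapped_clock(1999.5, 51)  # 2050.0 - one extra MHz of offset crosses the bin boundary
```

This is why an offset that "changes by itself" after a reboot can simply be the same curve re-expressed against a colder, more aggressive F-V baseline.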


----------



## KedarWolf

Quote:


> Originally Posted by *Coopiklaani*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Version 8 I tried, had a bug where every time I rebooted my 2050 core point changed to 2062, and version 7 someone said borked their Windows install, hence version 6 which works flawlessly for me.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I tried and have no problem with beta 8. I guess your problem is more related to how you set your offsets. For instance, a card that boosts to 1999.5MHz on +0 offset, with +50 offset, it will boost to 2037.5MHz and with +51 offset, it will boost to 2050.
> Also the F-V curve changes with temperature. When you reboot the system, the card is much colder and the F-V curve is much more aggressive.
Click to expand...

No, I had it on 2050 set, benched at 2050 just fine, rebooted, bench crashed, checked, and it had changed to 2062, this happened a half a dozen times before going back to beta 6, hasn't happened since.


----------



## KraxKill

Quote:


> Originally Posted by *kolkoo*
> 
> Hmm so this may debunk the Nvidia FE bios is better than asus and xoc bioses theories... I guess at the end of they day I gotta also test for myself
> 
> Edit: What about the XOC min fps? (just like you I also have the Asus FE - well not as golden as yours in clock, and my Asus FE Bios is quite different Hex-wise to nvidia FE)


You must be seeing something different than I am in his post. He posted worse performance at a higher clock, despite "no power throttling" on the XOC. If anything it confirms it's not performing as it should.

Asus 2138/6180 - 10693
Nvidia 2138/6180 - 10692
XOC *2152*/6180 - 10632
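If score scaled linearly with core clock (a first-order assumption; real scaling is usually a bit less than linear), the XOC run at 2152 should have beaten both FE runs rather than trailing them:

```python
def expected_score(ref_score, ref_clock, new_clock):
    # naive assumption: benchmark score scales linearly with core clock
    return ref_score * new_clock / ref_clock

exp = expected_score(10693, 2138, 2152)  # ~10763, using the Asus run as reference
shortfall = (exp - 10632) / exp          # XOC lands ~1.2% below even that naive target
```

So the XOC bios isn't just failing to gain from its extra 14MHz; it's losing ground it should have kept, which is the "5% lost in limbo" others have noticed.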


----------



## kolkoo

Quote:


> Originally Posted by *KraxKill*
> 
> You must be seeing something different than I am looking at his post. He posted worse performance at a higher clock. Despite "no power throttling" on the XOC. If anything it confirms it's not performing as it should.
> 
> Asus 2138/6180 - 10693
> Nvidia 2138/6180 - 10692
> XOC *2152*/6180 - 10632


Yeah, but there was also the theory that the Asus and EVGA FE bioses perform worse than the vanilla Nvidia FE bios. But yeah, I guess XOC performs worse, and you are probably better off testing with shorter benches as Hulio suggested


----------



## madmeatballs

Quote:


> Originally Posted by *KedarWolf*
> 
> Version 8 I tried, had a bug where every time I rebooted my 2050 core point changed to 2062, and version 7 someone said borked their Windows install, hence version 6 which works flawlessly for me.


This happens to me too, but it stopped after I kept adjusting my curve. I'm on AB beta 8; I just kept putting it back where it is supposed to be until it stayed, even after restarts.

I guess it has something to do with the grayed-out line on the VF curve graph. But really not sure lol







just a guess.


----------



## KedarWolf

Quote:


> Originally Posted by *Coopiklaani*
> 
> 
> Managed to achieve 415w+ TDP with the 3-resistor mod. Actually, 22Ohm or 47Ohm resistors are the best for this mod. 10Ohm is a little bit too small since nVidia changed their series resistor from 10Ohm to 20Ohm.
> 
> 
> I also experimented with two conductive paints, carbon based, and silver based.
> 
> 
> 
> 
> The carbon-based paint has around 20-30Ohm resistance over a 1mm gap. The silver one only has about 0.1-0.2Ohms


I'm getting the same carbon based paint in the diagram locally. And I ordered from eBay some more 47 Ohm resistors that will arrive in less than a week I'm sure. Not waiting until June for earlier order.

You think the carbon-based paint will hold the resistors on if I use non-conductive CPU thermal paste in-between? It has a very thin viscosity and will be easy to apply.









My card sits vertical.









Edit: Wait, is less resistance better so the silver is better?


----------



## Coopiklaani

Quote:


> Originally Posted by *Coopiklaani*
> 
> I tried and have no problem with beta 8. I guess your problem is more related to how you set your offsets. For instance, a card that boosts to 1999.5MHz on +0 offset, with +50 offset, it will boost to 2037.5MHz and with +51 offset, it will boost to 2050.


Quote:


> Originally Posted by *KedarWolf*
> 
> I'm getting the same carbon based paint in the diagram locally. And I ordered from eBay some more 47 Ohm resistors that will arrive in less than a week I'm sure. Not waiting until June for earlier order.
> 
> You think the carbon-based paint will hold the resistors on if I use non-conductive CPU thermal paste in-between? It has a very thin viscosity and will be easy to apply.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My card sits vertical.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Wait, is less resistance better so the silver is better?


Thermal glue is more than strong enough to hold those little resistors in place, so no problem. If you are using carbon-based conductive paint, you probably don't even need thermal glue to hold the resistor.

And, yes, less resistance = better :thumb:

But silver paint is much more water-like than glue-like, much harder to control and apply. Needs some skills.


----------



## KedarWolf

Quote:


> Originally Posted by *Coopiklaani*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Coopiklaani*
> 
> I tried and have no problem with beta 8. I guess your problem is more related to how you set your offsets. For instance, a card that boosts to 1999.5MHz on +0 offset, with +50 offset, it will boost to 2037.5MHz and with +51 offset, it will boost to 2050.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I'm getting the same carbon based paint in the diagram locally. And I ordered from eBay some more 47 Ohm resistors that will arrive in less than a week I'm sure. Not waiting until June for earlier order.
> 
> You think the carbon-based paint will hold the resistors on if I use non-conductive CPU thermal paste in-between? It has a very thin viscosity and will be easy to apply.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My card sits vertical.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Wait, is less resistance better so the silver is better?
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> Thermal glue is more than strong enough to hold those little resistors in place, so no problem. If you are using carbon-base conductive paint, you probably don't even need thermal glue to hold the resistor.
> 
> And, yes, less resistance = more better:thumb:
> 
> but silver-paint is much more water like than glue like, much harder to control and apply. need some skills.
Click to expand...

Hey, should I get carbon-based paint, or should I get a 50% silver Rear Window Defogger Repair Kit?

I'm thinking the silver is better, and me being monetarily challenged, the repair kit is quite inexpensive.









Edit: Carbon paint is cheap as well, any cons to not using silver?


----------



## Coopiklaani

Quote:


> Originally Posted by *KedarWolf*
> 
> Hey, should I get carbon based paint or I can get a 50% silver Rear Window Defogger Repair Kit.
> 
> I'm thinking the silver is better and me being monetarily challenged the repair kit is quite inexpensive.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Carbon paint is cheap as well, any cons to not using silver?


Silver paint is longer lasting and has less resistance, which means you can control the resistance of the final circuit more accurately, since the final resistance = resistance of the resistor + resistance of the conductive filler you use.
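The arithmetic behind picking resistor values can be sketched. Assuming the mod piggybacks the add-on resistor in parallel with the card's ~20 Ohm series sense resistor (the 20/47 Ohm values come from Coopiklaani's earlier post; the ~0.2 Ohm joint resistance is the silver-paint figure he measured, and how the sense circuit turns this into reported watts is simplified here):

```python
def parallel(r1, r2):
    # equivalent resistance of two resistors in parallel
    return r1 * r2 / (r1 + r2)

# add-on path: 47 Ohm resistor plus ~0.2 Ohm of silver-paint joints in series
r_addon = 47 + 0.2
r_eff = parallel(20, r_addon)   # ~14.05 Ohm effective series resistance
scale = 20 / r_eff              # ~1.42x: the card senses proportionally less, so it under-reads power
```

With carbon paint's 20-30 Ohm of joint resistance, the add-on path would be 67-77 Ohm instead of 47.2, so the effective resistance barely drops, which is why the low, predictable resistance of silver matters more than its price.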


----------



## KedarWolf

Quote:


> Originally Posted by *Coopiklaani*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Hey, should I get carbon based paint or I can get a 50% silver Rear Window Defogger Repair Kit.
> 
> I'm thinking the silver is better and me being monetarily challenged the repair kit is quite inexpensive.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Carbon paint is cheap as well, any cons to not using silver?
> 
> 
> 
> Silver paint is longer lasting and less resistance, which means you can control the resistance of the final circuit more accurately since the final resistance = resistance of the resistor + resistance of the conductive filler you use.
Click to expand...

Yeah, the self-drying silver epoxy it is then, it's very inexpensive. You don't know how much I Googled to find a cheap alternative.









Support payments are killer.


----------



## lilchronic

Quote:


> Originally Posted by *KedarWolf*
> 
> No, I had it on 2050 set, benched at 2050 just fine, rebooted, bench crashed, checked, and it had changed to 2062, this happened a half a dozen times before going back to beta 6, hasn't happened since.


Quote:


> Originally Posted by *madmeatballs*
> 
> This happens to me too but it stopped doing it after I kept on adjusting my curve. I'm on AB beta 8 I just kept on putting it back to where it is supposed to be until it stayed,even after restarts.
> 
> I guess it has something to do with the grayed out line on the vf curve graph. But really not sure lol
> 
> 
> 
> 
> 
> 
> 
> just a guess.


yeah that happens when you adjust the curve. It's happened to me on both beta 6 and 8


----------



## done12many2

Quote:


> Originally Posted by *Coopiklaani*
> 
> Silver paint is longer lasting and less resistance, which means you can control the resistance of the final circuit more accurately since the final resistance = resistance of the resistor + resistance of the conductive filler you use.


Can you link us up to the exact stuff that you used?


----------



## Coopiklaani

Quote:


> Originally Posted by *done12many2*
> 
> Can you link us up to the exact stuff that you used?


Quote:


> Originally Posted by *Coopiklaani*
> 
> I posted a brief instruction like, a thousand posts ago. Basically I used arctic thermal glue to secure the resistors and silver paint to make the connection.
> Links here:
> https://www.amazon.co.uk/Arctic-Silver-Thermal-Adhesive-ASTA-7G/dp/B01I76H3HQ/ref=sr_1_1?ie=UTF8&qid=1494340041&sr=8-1&keywords=arctic+thermal+glue
> https://www.amazon.co.uk/Conducting-Silver-conductive-repairing-heaters/dp/B000YJ2P2I/ref=sr_1_2?ie=UTF8&qid=1494340057&sr=8-2&keywords=conductive+paint
> 
> Or you can get away with this conductive glue to do both. but it has a huge resistance. (10-30ohm over a 1mm gap as I tested earlier)
> https://www.amazon.co.uk/PCB-Soldering-835-2699-Electric-Paint/dp/B00CSMDT8S/ref=sr_1_1?ie=UTF8&qid=1494340140&sr=8-1&keywords=conductive+paint
> 
> 
> 
> A link to my post here:
> http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/7160


----------



## done12many2

Quote:


> Originally Posted by *Coopiklaani*


Sorry about the laziness. I had already seen your original post. We'll say I'm having an off day. Nothing new.









To clarify, did you use the following between the electrically conductive points of the OEM cap and the add in resistor or as the glue in the middle?

https://www.amazon.co.uk/Arctic-Silver-Thermal-Adhesive-ASTA-7G/dp/B01I76H3HQ/ref=sr_1_1?ie=UTF8&qid=1494340041&sr=8-1&keywords=arctic+thermal+glue


----------



## ViRuS2k

Quote:


> Originally Posted by *KedarWolf*
> 
> Yeah, the self-drying silver epoxy it is then, it's very inexpensive. You don't know how much I Googled to find a cheap alternative.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Support payments are killer.


Where do I get that stuff?


----------



## Coopiklaani

Quote:


> Originally Posted by *done12many2*
> 
> Sorry about the laziness. I had already seen your original post. We'll say I'm having an off day. Nothing new.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> To clarify, did you use the following between the electrically conductive points of the OEM cap and the add in resistor or as the glue in the middle?
> 
> https://www.amazon.co.uk/Arctic-Silver-Thermal-Adhesive-ASTA-7G/dp/B01I76H3HQ/ref=sr_1_1?ie=UTF8&qid=1494340041&sr=8-1&keywords=arctic+thermal+glue


Yep, any non-conductive adhesive should do just fine. I used these because I had them on hand and they have the right properties: non-conductive, strong bond, non-corrosive, etc.


----------



## KedarWolf

Quote:


> Originally Posted by *ViRuS2k*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Yeah, the self-drying silver epoxy it is then, it's very inexpensive. You don't know how much I Googled to find a cheap alternative.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Support payments are killer.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Where do i get that stuff
Click to expand...

It was about $16 CAD locally with tax. Got it here in Toronto at Canadian Tire.

Edit: I googled the spec sheet, it's 40-50% silver and self drying epoxy, just what we need.









Second edit: You need to shake the bottle thoroughly and scrape the sides with the brush; I watched the video on how to use it.

https://www.amazon.com/gp/aw/d/B000ALBZJY/ref=mp_s_a_1_4?ie=UTF8&qid=1494702388&sr=8-4&pi=AC_SX236_SY340_FMwebp_QL65&keywords=permatex+rear


----------



## Luckbad

This is probably the most exciting unreleased 1080 Ti variant that I've seen:

ZOTAC GeForce GTX 1080 Ti ArcticStorm
https://www.zotac.com/product/graphics_card/zotac-geforce-gtx-1080-ti-arcticstorm

- Full waterblock (not an AIO) with standard G 1/4 fittings
- 16 + 2 power phases
- Dual 8-pin

It's basically the 1080 Ti Amp Extreme on water, though it's not really overclocked stock. Power consumption is listed as lower than the Amp Extreme as well... not sure if that means it won't be able to go to 384 W or what. Hopefully it can.


----------



## sudey

And one last question, if possible. As the owner of a GTX 1080 FTW, I now pay special attention to VRM and video memory temperatures. http://www.tomshardware.de/zotac-1080ti-leistungsaufnahme-temperaturen-benchmark,testberichte-242352-6.html That page shows the VRM and video memory temperatures. I know the GPU cooling is very good, but the other components not so much. So my question: if I game overclocked and set the fan speed to 75%, will the video memory and VRM temperatures be lower than on that page? And at what fan speed do you think those temperatures were measured? 45%, 55%, 60%?


----------



## Hopesolo

FIRE STRIKE ULTRA 1.1 SCORE 7 922

GTX 1080 Ti @2088/6264MHz - 5820K @4900MHz

http://www.3dmark.com/fs/12621536


----------



## Nico67

Quote:


> Originally Posted by *KraxKill*
> 
> You must be seeing something different than I am looking at his post. He posted worse performance at a higher clock. Despite "no power throttling" on the XOC. If anything it confirms it's not performing as it should.
> 
> Asus 2138/6180 - 10693
> Nvidia 2138/6180 - 10692
> XOC *2152*/6180 - 10632


Yeah XOC is worse, but scaling in GPU core is not great in SuPo, so I would expect around 10800~ if it was unlimited and the same performance. That's around 1.5% worse in SuPo at least, different benchmarks seem to respond differently to different bioses.

Quote:


> Originally Posted by *kolkoo*
> 
> Yeah but there was also the Asus and Evga FE bioses are performing worse than Vanilla Nvidia Fe bios. But yeah I guess Xoc performs worse and also you probably are better off testing with smaller benches as hulio suggested


Smaller benches can be swayed more easily, I would have thought, and SuPo is the best test I have ever seen when it comes to repeatable results. But as above, maybe 3DMark does show a difference between the bioses that is not as apparent with SuPo. It can't be that much of a difference, as every other bios I have tested is at least about 100 points worse in SuPo.
At any rate, until they release an XOC or 350W FE bios, I don't see any point in changing, as games don't really hit the power limit, at least not the ones I have been playing.


----------



## Benny89

Is there any way to keep the backplate LEDs when you mod the card with the EVGA kit or a Kraken G10? Like plugging them into the PSU or something? The header is still there and the only thing you have removed is the cooler. So will the backplate LEDs work after modding?


----------



## Hulio225

Quote:


> Originally Posted by *Benny89*
> 
> Is there any way to keep backplate LEDs when you mod it with EVGA KIT or Kraken G10? Like pluging it to PSU or something? Because header is still there and the only thing that you have removed is cooler. So will backplate LED work after moding?


If the backplate LEDs are still plugged in, yes; if not, then no.


----------



## Alex132

At the rate this is going, Vega and GTX20xx family will be out before I can get my hands on a 1080 Ti


----------



## Slackaveli

Quote:


> Originally Posted by *Alex132*
> 
> At the rate this is going, Vega and GTX20xx family will be out before I can get my hands on a 1080 Ti


whatcha waiting for?

And Vega is just a 1080 in red clothes.


----------



## Alex132

Quote:


> Originally Posted by *Slackaveli*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> At the rate this is going, Vega and GTX20xx family will be out before I can get my hands on a 1080 Ti
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> whatcha waiting for?
> 
> And Vega is just a 1080 in red clothes.
Click to expand...

Pretty much any decent non-FE card. The FTW3 is what I want, though. The Asus Strix and Strix OC are out of stock, I don't want Gigabyte's Aorus, and EVGA cards seem to be in extreme shortage.


----------



## Benny89

Quote:


> Originally Posted by *Alex132*
> 
> Pretty much any decent non-FE card. FTW3 is what I want though. The Asus Strix and Strix OC are out of stock, I don't want Gigabyte's Auros and any EVGA cards seem to be having an extreme shortage.


I know that pain. I want the FTW3 but that is 31.05 ship for me.

However, I just cancelled my 2x Strix OC order and ordered 2x Zotac AMP Extreme 1080 Tis. The higher power limit is what I want, and if I win the silicon lottery with one I will put it on a Kraken G12 with an H105 cooler anyway.


----------



## Alex132

Quote:


> Originally Posted by *Benny89*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Pretty much any decent non-FE card. FTW3 is what I want though. The Asus Strix and Strix OC are out of stock, I don't want Gigabyte's Auros and any EVGA cards seem to be having an extreme shortage.
> 
> 
> 
> I know that pain. I want FTW3 but that is 31.05 ship for me.
> 
> However I just cancel my 2x STRIX OC order and ordered 2x ZOTAC AMP EXTREME 1080 Tis. Higher power limit is what I want and If I win sillicon lottery with one I will put it on Kraker G12 with H105 cooler anyway.
Click to expand...

I could get the Zotac Amp Extreme or GALAX cards, but I really don't like the looks of them. I'd rather just wait and get what I want and know is good


----------



## Benny89

Quote:


> Originally Posted by *Alex132*
> 
> I could get the Zotac Amp Extreme or GALAX cards, but I really don't like the looks of them. I'd rather just wait and get what I want and know is good


I personally think the Zotac looks very sleek. It is big, but it has a nice simple design and very visible LEDs. Also, even though the card is heavy, I have read it has almost no sag if screwed in tight.

I considered the Zotac as my second choice because of its 384W power limit. I think the FTW3's 358W will also be enough, however I just can't wait 3 more weeks for a card, lol.

And so far it seems people really like the Zotac 1080 Ti.

I am even a little surprised so few people have them in this thread.


----------



## Alex132

Quote:


> Originally Posted by *Benny89*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> I could get the Zotac Amp Extreme or GALAX cards, but I really don't like the looks of them. I'd rather just wait and get what I want and know is good
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I personally think Zotac looks very sleek. It is big, but it has nice simple design and very visible LEDs. Also even though card is heavy I have read is had almost no sag if screwed tight.
> 
> I considered ZOTAC as my second choice because of 384W Power Limit. I think FTW3 358W will be also enough, however I just can't wait 3 more weeks for a card, lol.
> 
> And so far seems like people really like ZOTAC 1080 Ti.
> 
> I am even little supprised so few people have them in this thread.
Click to expand...

Probably because people stick to the bigger brands, and the extreme lack of advertising for the other brands probably doesn't help either.


----------



## TWiST2k

Quote:


> Originally Posted by *KedarWolf*
> 
> Version 8 I tried, had a bug where every time I rebooted my 2050 core point changed to 2062, and version 7 someone said borked their Windows install, hence version 6 which works flawlessly for me.


I read there were issues with Beta 7, but never heard of any issues with Beta 8, nor have I noticed any myself. Maybe I will roll back to Beta 6 for testing.


----------



## Slackaveli

Quote:


> Originally Posted by *Alex132*
> 
> Pretty much any decent non-FE card. FTW3 is what I want though. The Asus Strix and Strix OC are out of stock, I don't want Gigabyte's Auros and any EVGA cards seem to be having an extreme shortage.


Ah. The Aorus is awesome though. But I caught a golden chip, so it would be awesome regardless of brand.

Zotac Amp extreme would be my #2 in your shoes.


----------



## Luckbad

Anyone in this thread trying to offload a 1080 or 980 Ti with a waterblock? I'm trying to figure out the full water loop thing and will need a card to try it with.


----------



## Benny89

Quote:


> Originally Posted by *Luckbad*
> 
> Anyone in this thread trying to offload a 1080 or 980 Ti with a waterblock? I'm trying to figure out the full water loop thing and will need a card to try it with.


Luck, one more question for AMP- I heard there is some bug with fans in Afterburner and you have to use Firestrike to fix it? Did you have that too?


----------



## Luckbad

Quote:


> Originally Posted by *Benny89*
> 
> Luck, one more question for AMP- I heard there is some bug with fans in Afterburner and you have to use Firestrike to fix it? Did you have that too?


It's not a bug really, it's a feature that you can only disable in FireStorm.

I did a little visual guide for it somewhere in this thread, but basically.

Click Spectra
Click the inconspicuous down arrow in that new panel
Turn "Fan Stop Setting" Off

That will get rid of the 0% fan speed below 45 C thing and make it so you can actually control the fans.


----------



## xSociety

Just ordered the Asus Strix card. Can't wait to join you guys!









Is there anything I need to know really when it comes to overclocking this card? Have there been any worthwhile BIOS mods or anything yet?


----------



## KedarWolf

Quote:


> Originally Posted by *Luckbad*
> 
> Anyone in this thread trying to offload a 1080 or 980 Ti with a waterblock? I'm trying to figure out the full water loop thing and will need a card to try it with.


I have an older 12GB Maxwell Titan X, not an Xp. It has an EK nickel waterblock and a black EK backplate, with filled hoses fitted with quick-disconnect couplers. Selling it as I have my 1080 Ti.









http://www.overclock.net/t/1627837/older-asus-12gb-maxwell-titan-x-ek-nickel-waterblock-and-black-backplate-with-predator-qdcs/0_20


----------



## CoreyL4

How is the ftw3 bios vs the zotac bios?

I have the ftw3 bios on my gaming x and it still power limits at around 119% instead of 127%


----------



## RyviusRan

I have a Zotac AMP Extreme 1080ti and noticed that in Quantum Break GPU utilization is never close to max.
It rarely goes above 70% GPU utilization no matter what settings I choose.
I don't think my CPU is the issue as it's an i7 4770k at 4.0ghz.


----------



## feznz

Quote:


> Originally Posted by *RyviusRan*
> 
> I have a Zotac AMP Extreme 1080ti and noticed that in Quantum Break GPU utilization is never close to max.
> It rarely goes above 70% GPU utilization no matter what settings I choose.
> I don't think my CPU is the issue as it's an i7 4770k at 4.0ghz.


If you have a 1080p monitor, that's going to be the problem. Run a benchmark at a higher custom resolution and it will load up the card properly.
The benches below show the different runs and results on a low vs. a high resolution monitor.
Quote:


> Originally Posted by *feznz*
> 
> feznz- i7 3770k @ 5Ghz / 2200Mhz - Asus 1080 Ti Strix - 2076MHz/5955MHz - 145.7FPS- 6094 score -1080p
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> feznz- i7 3770k @ 5Ghz / 2200Mhz - Asus 1080 Ti Strix - 2076MHz/5955MHz - 67.5FPS- 2824 score - 3840x1600p
> 
> 
> 
> heres my monitoring with both runs first 1080p then 1600p straight afterwards
> with the 1080p the GPU usage was hovering around the 75% mark while the 1600p GPU usage was constantly 99-100%
> if I could only get the 1080p to run the Gpu at 100% I should get 25% higher score or around estimated 180FPS anyone got some hot tips? I applied the basic tweaks on OP post.
> I not sure about the CPU it was hovering around 25% usage


----------



## RyviusRan

Quote:


> Originally Posted by *feznz*
> 
> if you have a 1080p monitor that's gonna be the problem run a benchmark and do a custom resolution that will load up the card properly
> bench below show the different runs and results on low vs high resolution monitors


Nah I am running at 4k resolution with the upscale setting on. If I disable that setting it still has low GPU utilization but the game runs in the 20s for frame rate.


----------



## BoredErica

I ran Superposition many times at 1080p Extreme and no crash at +100 (even though I knew it was unstable). Ran Heaven 1440p at +90 to bench once and it crashed. *shrugs*

I just feel like even though Heaven and Valley are old, they are still good stress tests. Maybe Heaven moreso than Valley, Valley more for vram testing.

Superposition at 8k extreme will squash the clocks down really low, and I don't know if that compromises testing at all.

I will test the different BIOSes more closely once my custom loop is set up.


----------



## madmeatballs

We may have a new bios for the FE to try.

PNY's XLR8 1080 Ti is using a 6+8 pin.

PNY XLR8 1080 Ti


----------



## Nico67

Quote:


> Originally Posted by *madmeatballs*
> 
> We may have a new bios for the FE to try.
> 
> PNY's XLR8 1080 Ti it is using 6+8 Pin.
> 
> PNY XLR8 1080 Ti


Looks like a reference design, so I'm pretty sure it will be 300w max, but here's hoping


----------



## TheBoom

Quote:


> Originally Posted by *Luckbad*
> 
> It's not a bug really, it's a feature that you can only disable in FireStorm.
> 
> I did a little visual guide for it somewhere in this thread, but basically.
> 
> Click Spectra
> Click the inconspicuous down arrow in that new panel
> Turn "Fan Stop Setting" Off
> 
> That will get rid of the 0% fan speed below 45 C thing and make it so you can actually control the fans.


My card would not play nice with custom fan curves no matter what I did including disabling fan stop. It just stuck to the minimum of 30% no matter how low the temps went.

But the XOC bios fixed that.
Quote:


> Originally Posted by *Benny89*
> 
> I personally think Zotac looks very sleek. It is big, but it has nice simple design and very visible LEDs. Also even though card is heavy I have read is had almost no sag if screwed tight.
> 
> I considered ZOTAC as my second choice because of 384W Power Limit. I think FTW3 358W will be also enough, however I just can't wait 3 more weeks for a card, lol.
> 
> And so far seems like people really like ZOTAC 1080 Ti.
> 
> I am even little supprised so few people have them in this thread.


My Amp Extreme draws up to 450w now with the XOC bios. About the sag tho I can still see it. Very slight but its there. Don't think I'm gonna bother with a brace though.


----------



## Clukos

Quote:


> Originally Posted by *Darkwizzie*
> 
> I ran Superposition many times at 1080p Extreme and no crash at +100 (even though I knew it was unstable). Ran Heaven 1440p at +90 to bench once and it crashed. *shrugs*


Supo 1080p extreme isn't a good benchmark to test stability as it's not pushing the GPU too much (due to resolution). Running 4k or 8k supo is a better choice for stability than Heaven/Valley.


----------



## reflex75

Quote:


> Originally Posted by *TheBoom*
> 
> But the XOC bios fixed that.
> My Amp Extreme draws up to 450w now with the XOC bios. About the sag tho I can still see it. Very slight but its there. Don't think I'm gonna bother with a brace though.


What do you think about this XOC bios?
What are the differences?


----------



## Coopiklaani

Quote:


> Originally Posted by *Darkwizzie*
> 
> I ran Superposition many times at 1080p Extreme and no crash at +100 (even though I knew it was unstable). Ran Heaven 1440p at +90 to bench once and it crashed. *shrugs*
> 
> I just feel like even though Heaven and Valley are old, they are still good stress tests. Maybe Heaven moreso than Valley, Valley more for vram testing.
> 
> Superposition at 8k extreme will squash the clocks down really low, and I don't know if that compromises testing at all.
> 
> I will test the different BIOSes more closely once my custom loop is set up.


The best test for stability so far is FSU scene 1. If it passes a 1-hour loop of scene 1, you're golden.


----------



## TheBoom

Quote:


> Originally Posted by *reflex75*
> 
> What dou you think about this XOC bios?
> What are the differences?


Well the higher voltage definitely allows for higher clocks. The thing I don't like though is that you have to use the curve to "unlock" the higher voltages (1131-1200mv).

The bright side, if you don't like the curve like me, is that you can run your previous highest stable clock at the XOC bios's default voltage (1112-1125mv) without any perfcaps or power limiting, which should result in better performance. Also, my mem clocks increased by +100 effective and tested with higher scores/fps.

I did notice when playing with the curve that I was getting identical results at 2050-2113mhz as reported by some. Whether it is diminishing returns on the core clock or performance degradation from the XOC bios itself I cannot confirm however, since I'm too lazy to flash back to the stock bios lol.

Biggest difference is that I finally get to make the fans stop below 60C.

So I think I'm gonna stick with the XOC bios for now.

If you own the Amp Extreme and plan to flash to the XOC make sure you limit your fan speed to a lower percentage though. The RPM values are different from the Zotac bios which causes the fans to spin a lot faster and noisier than the default bios.


----------



## Sweetwater

This is my Strix with XOC @ 2063/ 6003, 1.075.

I think it's an ok score considering I'm on a 4690k haha


----------



## kolkoo

Quote:


> Originally Posted by *Sweetwater*
> 
> This is my Strix with XOC @ 2063/ 6003, 1.075.
> 
> I think it's an ok score considering I'm on a 4690k haha


It's a great score but on Superposition 4K CPU is pretty much irrelevant as all the heavy lifting is done on your GPU!


----------



## Sweetwater

I'll run the 1080p and 8k tomorrow. But for now it's late. Overclocking and F1 keeping me up while I have work early morning :'D


----------



## undercoverb0ss

With the new Forza Horizon 3 patch I can maintain 60 @4K Ultra everything AND 8xMSAA with my 1080ti. I caved and bought the 2 expansions since the performance is so good now.

Nice to be playing a game since I've just been tweaking benchmarks mostly since I've gotten the card


----------



## Dasboogieman

I just flashed the XOC BIOS on to my gigabyte Aorus. So far no problems but some interesting observations:

1. Supo benches are lower despite unlimited power, I used to get 10248 (2050mhz + 12000 VRAM) but now it only gets 10085 at best with +700 on the VRAM same clocks/voltage
2. Absolutely zero power limitations
3. My fans actually ran faster lol, I suspect there is an additional 800RPM or so left in the fans that Gigabyte didn't tap into, but I dropped 5 degrees lol. Either the fans got more stronk, or there's a GPU temp offset at work for this BIOS

All up, I'm 99% sure the performance delta people are noticing with the XOC BIOS is solely due to the VRAM timings being helluva lot more loose, presumably because frequency is king at LN2 cooling.


----------



## KedarWolf

I found a local company that is mailing me 47 Ohm resistors next week. I bought 0805 150W and 400W ones but will probably use the 400W.

It cost me about $5 for half a dozen of both with free shipping, and bought locally they should be here in a few days.

Will let you know how my card sits with resistor mod.


----------



## cluster71

I have a Zotac Amp Extreme. I've been playing Sniper Elite the last few days at 2063 MHz on stock voltage. The temperatures seemed OK, but not great, so I switched the thermal paste to liquid metal. The memory pads are 1.5 mm and I only had 1.0 mm at home, so I did not change them. Tested again and got about the same temps. It was not until I ran Superposition that I realized something was wrong. The card backs down at the power limit, but it's the temperature that triggers it. During the test the card sits at about 110% power and almost 80 degrees Celsius. When I disassembled it again, I saw that the GPU had not had full contact. Thinking back, it was the same with the stock thermal paste, but there was so much paste that it worked well enough.
The heatsink is spring-loaded, but the memory pads are too thick and the heatsink is very heavy, so the springs cannot push hard enough. I switched to 1.0 mm pads and nested double springs from a Gigabyte card into each other to get more force. In the same test, the card is now at around 40 degrees Celsius. I think more people have the same problem, since the design of these cards is similar.


----------



## TheBoom

Quote:


> Originally Posted by *Dasboogieman*
> 
> I just flashed the XOC BIOS on to my gigabyte Aorus. So far no problems but some interesting observations:
> 
> 1. Supo benches are lower despite unlimited power, I used to get 10248 (2050mhz + 12000 VRAM) but now it only gets 10085 at best with +700 on the VRAM same clocks/voltage
> 2. Absolutely zero power limitations
> 3. My fans actually ran faster lol, I suspect there is an additional 800RPM or so left in the fans that Gigabyte didn't tap in to but I dropped 5 degrees lol. Either the fans got more stronk, or theres a GPU temp offset at work for this BIOS
> 
> All up, I'm 99% sure the performance delta people are noticing with the XOC BIOS is solely due to the VRAM timings being helluva lot more loose, presumably because frequency is king at LN2 cooling.


Yup I think this is it. Though I feel in the end you still should get better performance in most games at 1440p or lower since they are not as vram dependent as Supo benches.

The higher core clocks you can achieve with the XOC should compensate for the looser memory timings as well as no power limiting and other perfcaps.

Of course I haven't done any real world testing between the two bioses so don't take my word for it. Maybe someone who has benched games can attest to this.

The fans are definitely running faster because of the RPM differences.


----------



## Slackaveli

Quote:


> Originally Posted by *feznz*


if you have a 1080p monitor that's gonna be the problem run a benchmark and do a custom resolution that will load up the card properly
bench below show the different runs and results on low vs high resolution monitors

I think it's RAM related. Notice how your RAM usage is the same yet your page file went up? When GPU util isn't maxed and the CPU isn't either, it has to be RAM, right? I get more or less the same result even on a 5775C; RAM at 2400 C10 and 2133 C9 give the same result. My guess is 35-38 GB/s of bandwidth isn't enough at that resolution. It needs to be 45 GB/s or more.
Quote:


> Originally Posted by *Sweetwater*
> 
> This is my Strix with XOC @ 2063/ 6003, 1.075.
> 
> I think it's an ok score considering I'm on a 4690k haha


Z97 board? Get a 5675C or 5775C and sell that 4690K. A drop-in performance boost for less than $100. Oh, and nice power savings to pay you back over time, too.


----------



## CoreyL4

Does a score of ~10090 in supo 4k at 2037/6123 sound right? I see a post above me got 10400 at 2063/6000. I'm gonna lower my mem overclock and see if it's any better


----------



## niobium615

Quote:


> Originally Posted by *KedarWolf*
> 
> I found a local company that is mailing me 47 Ohm resistors next week, I bought 0805 150W and 400W but will probably use the 400W.
> 
> It cost me about $5 for a half a dozen of both with free shipping and bought locally should be here in a few days,.
> 
> Will let you know how my card sits with resistor mod.


Do you mean 150mW and 400mW? Shouldn't make any difference which one you use, the current flowing through the resistor is minimal.


----------



## Coopiklaani

Quote:


> Originally Posted by *KedarWolf*
> 
> I found a local company that is mailing me 47 Ohm resistors next week, I bought 0805 150W and 400W but will probably use the 400W.
> 
> It cost me about $5 for a half a dozen of both with free shipping and bought locally should be here in a few days,.
> 
> Will let you know how my card sits with resistor mod.


It doesn't really matter what the resistors are rated for; the current through the resistor is very, very small, on the order of microamps.
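A quick sanity check of why the wattage rating is irrelevant here, as a sketch: with P = I²R, even a generous 100 µA (an assumed order of magnitude, not a measurement) through the 47 ohm resistor dissipates a fraction of a microwatt, vastly below even the smallest 0805 rating mentioned:

```python
# Power dissipated in the 47 ohm mod resistor at a micro-amp scale current.
# P = I^2 * R; the current value is an assumed order of magnitude.
def dissipation_watts(current_amps: float, resistance_ohms: float = 47.0) -> float:
    """Joule heating in the resistor for a given current."""
    return current_amps ** 2 * resistance_ohms

p = dissipation_watts(100e-6)  # assume ~100 microamps through the resistor
print(f"{p * 1e6:.3f} microwatts")  # far below even a 150 mW (0805) rating
```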


----------



## gmpotu

Quote:


> Originally Posted by *Nico67*
> 
> Yeah XOC is worse, but scaling in GPU core is not great in SuPo, so I would expect around 10800~ if it was unlimited and the same performance. That's around 1.5% worse in SuPo at least, different benchmarks seem to respond differently to different bioses.
> Smaller benchs can be swayed more easily I would have thought, and SuPo is the best test I have ever seen when comes to repeatable results. But as above, maybe 3DMark does show a difference in the bioses that is not as apparent with SuPo. It can't be that much of a difference, as every other bios I have tested is about 100 points worse in SuPo minimum.
> At any rate until they release an XOC or 350w FE bios, I don't see any point in changing, as games don't really hit the power limit, least not the ones I have been playing


I power limit in World of Warcraft in some areas with all settings maxed. Including SSAA


----------



## AMDeo

*MSI GTX 1080 Ti Armor & Prolimatech MK-26 & shunt mod (wire wound resistors)*

At PCB level, MSI GTX 1080 Ti Armor = MSI GTX 1080 Ti Gaming X.

GPU power supply has 8 phase (16 MOSFET's).

At BIOS level, they have the same power target (280W) and maximum power limit (330W).

https://www.techpowerup.com/vgabios/191501/msi-gtx1080ti-11264-170317-1

Armor: no backplate, no LED's...

Armor cooler is BAD!

Mine exceeded 80 deg. Celsius in some areas of Fallout 4 (2560x1440 with FPS limiter removed).
See newegg customers reviews:

https://www.newegg.com/Product/Product.aspx?Item=N82E16814137111

http://www.overclockersclub.com/reviews/msi_gtx_1080_ti_armor_11g/

I tried to improve not only GPU cooling, but VRM and memory cooling also...

In my book, a plain top plate is not the best cooling method for VRM (exception: blower style coolers).

By the way, *ASUS GTX 1080 Ti Strix owners using shunt mod, or XOC bios* *BE WARNED*:

VRM temperature will rise well above the values shown in these FLIR images >>

http://www.guru3d.com/articles_pages/asus_rog_strix_geforce_gtx_1080_ti_review,9.html

*For cooling mod* i used what I've already had:

1) Prolimatech MK-26 graphics card cooler...

http://www.prolimatech.com/en/products/detail.asp?id=1672&subid=1674#showtab

2) 2 x 120mm fans: Noiseblocker BlackSilentPRO PL-PS

http://proclockers.com/reviews/cooling/noiseblocker-nb-blacksilent-pro-pl-ps-120mm-fan-review?nopaging=1

3) Prolimatech PK-3 thermal compound (for GPU)

http://www.prolimatech.com/en/products/detail.asp?id=1582&subid=1585#showtab

4) 2 x Alphacool double-sided thermal adhesive pad 120x20x0.5mm (1W/mK)

https://www.alphacool.com/shop/waermeleitmittel/11851/alphacool-waermeleitklebepad-doppelseitig-120x20x0-5mm

5) Alphacool GPU Aluminium Heatsinks 15x15x15mm, 4g, black 10 pcs

https://www.alphacool.com/shop/grafikkartenkuehler/heatsinks/13663/alphacool-gpu-ram-aluminium-heatsinks-15x15mm-black-10-stk.

6) Arctic Silver AS5 thermal compound (for VRM area of top plate)

7) Some old copper heatsinks

8) Isopropyl alcohol

I have a Cooler Master HAF XB computer case (http://www.coolermaster.com/case/lan-box/haf-xb/), graphic card "sits" vertically and sag is out of question...

*For shunt mod* I used wire wrapping conductor (very shiny - silver plated, easy to solder).

https://en.wikipedia.org/wiki/Wire_wrap

I made 3 "resistors" consisting of 5 turns wound on a 2.5 mm diameter former and soldered them across the 5 milliohm resistors on the PCB.

Before the shunt mod I had 27% power (MSI Afterburner) in Windows 7 x64 SP1 with Google Chrome opened (2560x1440).

After the shunt mod I have 21% (same conditions)... Therefore the new maximum power limit should be (27/21)x330W = 1.285x330W = 424W
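That ratio-of-readings estimate can be reproduced directly; a minimal sketch, assuming (as above) that the reported power percentage scales linearly with the shunt resistance change:

```python
# Estimate the new effective power limit after a shunt mod from the change in
# the reported idle power reading: scale factor = reading_before / reading_after.
def effective_limit(reading_before_pct: float, reading_after_pct: float,
                    stock_limit_w: float) -> float:
    """New effective power limit implied by the drop in the reported reading."""
    return (reading_before_pct / reading_after_pct) * stock_limit_w

new_limit = effective_limit(27.0, 21.0, 330.0)
print(f"{new_limit:.0f} W")  # ~424 W, matching the estimate above
```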

The price of MSI GTX 1080 Ti Armor dropped at the Founders Edition level and I warmly recommend buying it ONLY IF you intend to change the stock cooler with a better one...

P.S. Motherboard: Asus Maximus Hero VI, CPU: i5 4670k, RAM: 24GB DDR3 GSkill Trident Z 2400MHz


----------



## bloodhawk

Quote:


> Originally Posted by *AMDeo*
> 
> *MSI GTX 1080 Ti Armor & Prolimatech MK-26 & shunt mod (wire wound resistors)*
> 
> At PCB level, MSI GTX 1080 Ti Armor = MSI GTX 1080 Ti Gaming X.
> 
> GPU power supply has 8 phase (16 MOSFET's).
> 
> At BIOS level, they have the same power target (280W) and maximum power limit (330W).
> 
> https://www.techpowerup.com/vgabios/191501/msi-gtx1080ti-11264-170317-1
> 
> Armor: no backplate, no LED's...
> 
> Armor cooler is BAD!
> 
> Mine exceeded 80 deg. Celsius in some areas of Fallout 4 (2560x1440 with FPS limiter removed).
> See newegg customers reviews:
> 
> https://www.newegg.com/Product/Product.aspx?Item=N82E16814137111
> 
> http://www.overclockersclub.com/reviews/msi_gtx_1080_ti_armor_11g/
> 
> I tried to improve not only GPU cooling, but VRM and memory cooling also...
> 
> In my book, a plain top plate is not the best cooling method for VRM (exception: blower style coolers).
> 
> By the way, *ASUS GTX 1080 Ti Strix owners using shunt mod, or XOC bios* *BE WARNED*:
> 
> VRM temperature will rise well above the values shown in these FLIR images >>
> 
> http://www.guru3d.com/articles_pages/asus_rog_strix_geforce_gtx_1080_ti_review,9.html
> 
> *For cooling mod* I used what I already had:
> 
> 1) Prolimatech MK-26 graphics card cooler...
> 
> http://www.prolimatech.com/en/products/detail.asp?id=1672&subid=1674#showtab
> 
> 2) 2 x 120mm fans: Noiseblocker BlackSilentPRO PL-PS
> 
> http://proclockers.com/reviews/cooling/noiseblocker-nb-blacksilent-pro-pl-ps-120mm-fan-review?nopaging=1
> 
> 3) Prolimatech PK-3 thermal compound (for GPU)
> 
> http://www.prolimatech.com/en/products/detail.asp?id=1582&subid=1585#showtab
> 
> 4) 2 x Alphacool double-sided thermal adhesive pad 120x20x0.5mm (1W/mK)
> 
> https://www.alphacool.com/shop/waermeleitmittel/11851/alphacool-waermeleitklebepad-doppelseitig-120x20x0-5mm
> 
> 5) Alphacool GPU Aluminium Heatsinks 15x15x15mm, 4g, black 10 pcs
> 
> https://www.alphacool.com/shop/grafikkartenkuehler/heatsinks/13663/alphacool-gpu-ram-aluminium-heatsinks-15x15mm-black-10-stk.
> 
> 6) Arctic Silver AS5 thermal compound (for VRM area of top plate)
> 
> 7) Some old copper heatsinks
> 
> 8) Isopropyl alcohol
> 
> I have a Cooler Master HAF XB computer case (http://www.coolermaster.com/case/lan-box/haf-xb/), graphic card "sits" vertically and sag is out of question...
> 
> *For shunt mod* I used wire wrapping conductor (very shiny - silver plated, easy to solder).
> 
> https://en.wikipedia.org/wiki/Wire_wrap
> 
> I made 3 "resistors", each consisting of 5 turns wound on a 2.5 mm diameter former, and soldered them across the 5 milliohm shunt resistors on the PCB.
> 
> Before the shunt mod I had 27% power (MSI Afterburner) in Windows 7 x64 SP1 with Google Chrome opened (2560x1440).
> 
> After the shunt mod I have 21% (same conditions)... Therefore the new maximum power limit should be (27/21)x330W = 1.285x330W = 424W
> 
> The price of the MSI GTX 1080 Ti Armor has dropped to Founders Edition level, and I warmly recommend buying it ONLY IF you intend to replace the stock cooler with a better one...
> 
> P.S. Motherboard: Asus Maximus Hero VI, CPU: i5 4670k, RAM: 24GB DDR3 GSkill Trident Z 2400MHz


Great post!

Also, I don't know if I posted this before or not, but one way to drop the VRM temps for custom water block users is to use FujiPoly/Gelid/Arctic 6 or 11 W/mK thermal pads.
I am using the Fujipoly ones (seriously expensive, not worth it unless on a custom waterblock), and they dropped my VRM temps by 15-20C (depending on where my thermal probe was positioned). The temperature drop on the VRMs won't be as drastic with air coolers, but it will definitely be better than before.


----------



## AMDeo

Instead of FujiPoly thermal pads (not available in my country) I recommend something reasonably priced:

http://www.aquatuning.co.uk/thermal-pads-und-paste/thermal-pads/19458/alphacool-eisschicht-waermeleitpad-11w/mk-120x20x0-5mm-2-stueck-sarcon-xr-he?c=2868

http://www.shop4pc.ro/pad-termic-alphacool-eisschicht-sarcon-xrhe-11wmk-05mm-120x20mm-2-bucati-p-44609.html

I haven't replaced the stock VRM pads with Eisschicht ones yet... Maybe after I make a backplate for my Armor...


----------



## KedarWolf

Quote:


> Originally Posted by *Coopiklaani*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I found a local company that is mailing me 47 Ohm resistors next week, I bought 0805 150W and 400W but will probably use the 400W.
> 
> It cost me about $5 for half a dozen of both with free shipping, and bought locally it should be here in a few days.
> 
> Will let you know how my card sits with resistor mod.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It doesn't really matter what the resistors are rated for. The current through the resistor is very, very small, on the order of micro amps
Click to expand...

Okay, thanks peeps.
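A quick sanity check on the "rating doesn't matter" point: with P = I²R and the micro-amp sense current mentioned in the reply above (the 10 µA figure below is an assumed illustration, not a measurement), dissipation in a 47 ohm resistor is orders of magnitude below even the smallest 0805 package rating:

```python
# Power dissipated in the mod resistor: P = I^2 * R.
# The micro-amp current level comes from the quoted reply; 10 uA is an
# assumed example value, not measured on an actual card.

r_ohm = 47.0
i_amp = 10e-6                  # assume 10 uA through the resistor
p_watt = i_amp ** 2 * r_ohm    # P = I^2 * R

print(f"dissipation: {p_watt * 1e9:.2f} nW")   # -> 4.70 nW
```

A few nanowatts against a rating measured in fractions of a watt: any 0805 part is massive overkill here.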


----------



## mcg75

Quote:


> Originally Posted by *AMDeo*
> 
> *MSI GTX 1080 Ti Armor & Prolimatech MK-26 & shunt mod (wire wound resistors)*
> 
> Mine exceeded 80 deg. Celsius in some areas of Fallout 4 (2560x1440 with FPS limiter removed).


Sorry for the off topic in such an informative post but you should be aware that Fallout 4 has a very high risk of breaking when the frame rate exceeds 60 fps.


----------



## gmpotu

Quote:


> Originally Posted by *TheBoom*
> 
> Well the higher voltage definitely allows for higher clocks. The thing I don't like though is that you have to use the curve to "unlock" the higher voltages (1131-1200mv).
> 
> The bright side if you don't like the curve like me is you can run your previous highest stable clock at the default voltage (1112-1125mv) of the XOC bios without any perfcaps and power limiting which should result in better performance. Also my mem clocks increased by +100 effective and were tested to get higher scores/fps.
> 
> I did notice when playing with the curve that I was getting identical results at 2050-2113mhz as reported by some. Whether it is diminishing returns on the core clock or performance degradation from the XOC bios itself I cannot confirm however, since I'm too lazy to flash back to the stock bios lol.
> 
> Biggest difference is that I finally get to make the fans stop below 60C.
> 
> So I think I'm gonna stick with the XOC bios for now.
> 
> If you own the Amp Extreme and plan to flash to the XOC make sure you limit your fan speed to a lower percentage though. The RPM values are different from the Zotac bios which causes the fans to spin a lot faster and noisier than the default bios.


Okay, what is XOC? Someone said it's the Asus Strix BIOS, but I have an Asus Strix card, and Afterburner with +100% voltage only lets me go to the 1.093v bin. How are you enabling the higher bins up to 1.200v?

Is XOC not actually the Asus Strix Bios?


----------



## nrpeyton

Evening guys,

I've just completed my upgrade to i7 7700k with new ASUS ROG APEX board (coming from old AMD FX platform).

Anyway I decided to do a complete reformat of C:\ for a fresh start.

I'm trying to *re-download MSI AB 4.4 beta 6* for fully unlocked control over my 1080Ti but the download link is broken on page 1 of this thread.

Anyone able to upload it for me please?

Google is a nightmare; I'm only getting hits for v4.3

Thanks!

Nick


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Evening guys,
> 
> I've just completed my upgrade to i7 7700k with new ASUS ROG APEX board (coming from old AMD FX platform).
> 
> Anyway I decided to do a complete reformat of C:\ for a fresh start.
> 
> I'm trying to *re-download MSI AB 4.4 beta 6* for fully unlocked control over my 1080Ti but the download link is broken on page 1 of this thread.
> 
> Anyone able to upload it for me please?
> 
> Google is a nightmare; I'm only getting hits for v4.3
> 
> Thanks!
> 
> Nick


The instructions say to highlight the link and copy and paste it into your browser.

It's a problem with how overclock.net handles links.


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> The instructions say to highlight the link and coy copy and paste into your browser.
> 
> It's a problem with how overclock.net handles links.


oops my apologies,

sorry..

got so much to re-install and download I'm kind of "skimming" everything because there's so much!

Got it, sorry for not paying more attention! lol


----------



## done12many2

Quote:


> Originally Posted by *nrpeyton*
> 
> oops my apologies,
> 
> sorry..
> 
> got so much to re-install and download I'm kind of "skimming" everything because there's so much!
> 
> Got it, sorry for not paying more attention! lol


Man, you're gonna love that little 7700k! Did you delid it? They are just very snappy chips!

I'm waiting on the parts to show up to finish the loop on my 5960x rig this week, but I already know that my 7700k rig will remain my daily / gaming rig.


----------



## OccamRazor

Quote:


> Originally Posted by *nrpeyton*
> 
> oops my apologies,
> 
> sorry..
> 
> got so much to re-install and download I'm kind of "skimming" everything because there's so much!
> 
> Got it, sorry for not paying more attention! lol


Hey Nick! How's it going? You finally let your AMD go...








Who else jumped ship besides us?








How are your FE clocks on air?

Cheers

Occamrazor


----------



## nrpeyton

Quote:


> Originally Posted by *OccamRazor*
> 
> Hey Nick! Hows it going? You finally let your AMD go...
> 
> 
> 
> 
> 
> 
> 
> 
> Who else jumped ship besides us?
> 
> 
> 
> 
> 
> 
> 
> 
> Hows your FE clocks on air?
> 
> Cheers
> 
> Occamrazor


Yup,

Finally took the plunge lol.

Nice to see you.

On air I *was* able to hit 2100 (and not 1MHz more) at 1.093v (and even hold that in The Witcher 3) without crashing, along with a memory O/C of +535 (confirmed error free).

However, as I was probably CPU bottlenecked before, I'm preparing for impending disappointment that with my new 7700k, my 1080Ti may suddenly not want to clock as high! lol

Still working away amongst the aftermath of a complete reformat at the minute. But as soon as I'm done re-installing everything I'll be back with an update -- and we'll see if the platform change makes a difference!

As long as I can still do 2050Mhz I will be able to live with it lol!

I will be adding it to my water loop next month; and seeing what it can do with the Chiller etc then too!

How are you getting on mate?

Nick

Quote:


> Originally Posted by *done12many2*
> 
> Man, you're gonna love that little 7700k! Did you delid it? They are just very snappy chips!
> 
> I'm waiting on the parts to show up to finish the loop on my 5960x rig this week, but I already know that my 7700k rig will remain my daily / gaming rig.


Yeah.

I'm already overwhelmed by how many more BIOS options there are, and how many more active sensors there are in HWiNFO64 (it seems support/compatibility is taken for granted here; on FX it was often thought of as a privilege lol).

It's a lot to take in.

I haven't delidded it yet, but I think I plan to, yes  There's a tool you can buy, isn't there?

And yeah that's kind of why I got it. I liked the idea of 4, very fast, very snappy cores 

I grabbed the 3600mhz Trident Z C16 too ;-)

Not had a chance to O/C anything yet, or even see what the little chip is capable of. In fact I'm still running at stock (even the Trident Z is still at 2133MHz just now lol).

Anyone with an ASUS board on an Intel chip know if *disabling unused PCI-E lanes* in BIOS / through jumpers/switches *nets you any extra stability?*

My mobo (the ROG APEX) is designed so the RAM slots and PCI-E 1 slot are closer to the CPU (it's even visually closer -- my EK Supremacy EVO CPU block is actually touching the side of my Trident Z stick, it's so close lol).
I know the board holds the world record for the fastest DDR4 speed ever recorded because of this (plus only having 2 DIMM slots, not 4). However, I've not read anything regarding GPU... but the PCI-E 1 slot does also look a lot closer.

If not; I'll be the first to test + report here, I guess


----------



## feznz

Quote:


> Originally Posted by *Slackaveli*
> 
> i think it's ram related. notice how your ram usage is the same yet your page file went up? i think when gpu util isnt maxed yet cpu isnt either, it has to be ram, right? I get the same more or less on even a 5775-c. ram at 2400 c-10, 2133 c-9 is the same result. 35-38gb/s bandwidth isnt enough at that resolution is my guess. needs to be 45gb/s or more.
> z97 board? get a 5675-c or 5775-c and sell that 4690k. drop in perf boost for less than $100. oh, and a nice power savings to pay u back over time, too.


Hey my friend









That's the problem: only a slight upgrade for gaming even with the 5960x. I'm still waiting for that game that will actually give my CPU a workout.
I honestly need some more system RAM, but I don't want to waste money on DDR3, and no current boards are appealing enough when the Intel X299 and AMD X399 chipsets are so close to release. Different scenario if I had an old AMD Bulldozer.
Only 2 distinct advantage scenarios: 1080p, or certain benches with CPU tests.

I don't render, and I have a high resolution monitor, so the immediate advantage is not clear unless I get a 1080p monitor and game @ 240Hz.

The morning frosts are here, so I am going to flash the XOC BIOS on the proper card (Strix); maybe there is a deficiency when it's flashed on a non-Strix card. We'll see if I can smash Superposition; I'm only lagging 300 points behind the leader on a 1080Ti, 4K of course.

I think if you've got to bench @ 1080p then hell, I might break out the old CRT


----------



## OccamRazor

Quote:


> Originally Posted by *nrpeyton*
> 
> Yup,
> 
> Finally took the plunge lol.
> 
> Nice to see you.
> 
> On air I *was* able to hit 2100 (and not 1mhz more) at 1.093v (and even hold that in The Witcher 3) without crashing. Along with a memory O/C of +535 (confirmed error free).
> 
> However as I was probably CPU bottlenecked before; I'm preparing for impending disappointment that with my new 7700k, my 1080Ti may suddenly not want to clock as high! lol
> 
> Still working away amongst the aftermath of a complete reformat at the minute. But as soon as I'm done re-installing everything I'll be back with an update -- and we'll see if the platform change makes a difference!
> 
> As long as I can still do 2050Mhz I will be able to live with it lol!
> 
> I will be adding it to my water loop next month; and seeing what it can do with the Chiller etc then too!
> 
> How are you getting on?


Fine my friend, thanks!
Just got my 1080 Ti Gaming X yesterday; it's doing 2000MHz out of the box without any adjustments. Ambient temp is 25°C, load temps are mid-50s! I'm having a good feeling about putting her underwater! I'm waiting for EK to release the block for the Gaming X to start "working" on her! ;-)

Cheers

Ed


----------



## CoreyL4

Does 10090 in Superposition 4K with 2037/6123, with a downclock to 2012, sound normal?


----------



## nrpeyton

Quote:


> Originally Posted by *OccamRazor*
> 
> Fine my friend, thanks!
> Just got my 1080ti gaming x yesterday, it's doing 2000mhz out of the box without any adjustments, ambient temp is 25°C, load temps are mid 50's! I'm having a good feeling about putting her underwater! I'm waiting for EK to release the block for the gaming x to start "working" on her! ;-)
> 
> Cheers
> 
> Ed


to start working on her...

hehe...

sounds good mate.

I got my Ti through the EVGA step-up programme, so I have a Founders Edition (quite a contrast to the 1080 Classy I had before).

But so far I'm actually quite happy with it.

If I raise my power limit to 120% I'll do 2000 MHz with a +150 in the main window (give or take a few bins, which are temp dependent of course).

Anyway it's time to find out if she'll still do 2100. Back soon


----------



## Sweetwater

Quote:


> Originally Posted by *Slackaveli*
> 
> z97 board? get a 5675-c or 5775-c and sell that 4690k. drop in perf boost for less than $100. oh, and a nice power savings to pay u back over time, too.


Yeah, the ol' Z97. Thing is, I have already been looking for alternatives to drop in, and being in Australia it's just not worth it. Our tech pricing is whack and our dollar for buying from overseas is pretty weak. For example, a 1080 Ti FE is 1100 while a partner board is 1200+.


----------



## nrpeyton

*Upgrading* from an *old* *AMD FX-8350* to the *latest i7 7700k*, ASUS ROG mobo + G.Skill 3600MHz C16 (£770) got me the *following gains* in The Witcher 3:

1440p - 14 FPS
(86 to 100)

4k - 5 FPS
(51 to 56)

Got to admit I expected a little more.

Memory is running with the 3600MHz XMP profile.

The i7 is running at stock (but at stock it boosts to 4500MHz on all cores consistently during gaming & doesn't drop).

Not tried to O/C it yet. I doubt it's worth it.

The 1080Ti FE is running at 2000 MHz, 988mv, with +535 memory.
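To put those gains in perspective, the percentage uplift at each resolution works out like this (a quick sketch using only the FPS numbers from the post):

```python
# Percentage FPS uplift from the FX-8350 -> 7700k platform upgrade,
# using the Witcher 3 numbers reported above (before, after).

results = {"1440p": (86, 100), "4K": (51, 56)}

for res, (before, after) in results.items():
    uplift = (after - before) / before * 100
    print(f"{res}: {before} -> {after} FPS (+{uplift:.1f}%)")
```

So roughly +16% at 1440p but only +10% at 4K, which is what you'd expect as the load shifts from CPU-bound toward GPU-bound at higher resolution.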


----------



## Benny89

Quote:


> Originally Posted by *nrpeyton*
> 
> Upgrading from an *old* AMD FX-8350 to the latest i7 7700k, ASUS ROG mobo + G.Skill 3600Mhz C16 (£770) got me the following gains on The Witcher 3:
> 
> 1440p - 14 FPS
> 4k - 5 FPS
> 
> Got to admit I expected a little more.
> 
> Memory is running with 3600Mhz XMP profile.
> 
> The i7 is running at stock (but at stock it's boosting to 4500mhz on all cores consistently during gaming & doesn't drop).


Yup, that is that. I had the same when I moved from a 4790k to a 5775C OCed to 4.2 (equivalent in games to a 5.0GHz 7700k). Only around 10-12 fps gains in heavy CPU games.

Truth is, the CPU performance difference is small, so I wasn't really surprised. As you shouldn't be







. Most games are 90% GPU dependent.

But hey, at least we don't have to upgrade from a 4-core for another 2 years at least. Next upgrade I am gonna go 6-core.

Still, in 1440p those 10-14 fps are nice to see


----------



## nrpeyton

Quote:


> Originally Posted by *Benny89*
> 
> Yup, that is that. I had the same when I moved from a 4790k to a 5775C OCed to 4.2 (equivalent in games to a 5.0GHz 7700k). Only around 10-12 fps gains in heavy CPU games.
> 
> Truth is, the CPU performance difference is small, so I wasn't really surprised. As you shouldn't be
> 
> 
> 
> 
> 
> 
> 
> . Most games are 90% GPU dependent.
> 
> But hey, at least we don't have to upgrade from a 4-core for another 2 years at least. Next upgrade I am gonna go 6-core.
> 
> Still, in 1440p those 10-14 fps are nice to see


Lol.

The i7 7700k is nearly 80% faster than an old AMD FX 8350 in single-core performance.

I knew it wasn't going to be revolutionary (I had my FX overclocked to within an inch of its life).

But considering the massive LEAP in platform, CPU performance, memory performance and motherboard -- I did expect _a bit_ more.

I'm not coming from a mid-range Intel setup, I'm coming from an *ancient* 8-core AMD on PCI-E 2.0 with 2133MHz DDR3 lol

Oh well, maybe other games will yield some better surprises for me


----------



## Benny89

Quote:


> Originally Posted by *nrpeyton*
> 
> Lol.
> 
> The i7 7700k is nearly 80% faster than an old AMD FX 8350 in single-core performance.
> 
> I knew it wasn't going to be revolutionary (I had my FX overclocked to within an inch of its life).
> 
> But considering the massive LEAP in platform, CPU performance, memory performance and motherboard -- I did expect _a bit_ more.
> 
> I'm not coming from a mid-range Intel setup, I'm coming from an *ancient* 8-core AMD on PCI-E 2.0 with 2133MHz DDR3 lol
> 
> Oh well, maybe other games will yield some better surprises for me


I get you, really







. In all CPU tests it is nearly 80% faster, but nowadays it is mostly the GPU that gives the significant boost.

However, you should see a big boost in BF1, RTS games, MMO games, games that spawn a lot of enemies (like Vermintide, for example), etc.

But in most AAA titles that rely on the GPU, well, it is what it is









But congrats on the new hardware anyway







. May it serve you proudly like your AMD did


----------



## Dasboogieman

Quote:


> Originally Posted by *Sweetwater*
> 
> Yeah the ol Z97. Thing is I have already been looking for alternatives to drop in and being in Australia it's just not worth it. Our tech pricing is whack and our dollar to buy from overseas is pretty weak. For example a 1080ti FE is 1100 while a partner board 1200+


5775cs don't exist in Australia, you're better off going to Kaby Lake at this rate.


----------



## nrpeyton

Lol,

OMG though.

3DMARK is a different story:

Old Firestrike Score: 13 896 _(better than 84% of all scores)_
New Firestrike Score: 22 003 _(better than 98% of all scores)_

Old physics test FPS: 28
New physics test FPS: 46

Old combined test FPS: *15*
New combined test FPS: *43*

_/\ now that's the kind of result I was expecting! lol_

And my GPU was only at 2000 MHz. And CPU at stock. So there's definitely a bit more in there









Time to "have at her" in the BIOS & MSI AB I think lol


----------



## done12many2

Quote:


> Originally Posted by *nrpeyton*
> 
> Lol,
> 
> OMG though.
> 
> 3DMARK is a different story:
> 
> Old Firestrike Score: 13 896
> New Firestrike Score: 22 003
> 
> Old physics test FPS: 28
> New physics test FPS: 46
> 
> Old combined test FPS: *15*
> New combined test FPS: *43*
> 
> _/\ now thats the kind of result I was expecting! lol_


Very nice. What clock speed are you running at?

Order the Delid-Die-Mate-2 and knock the top off that chip. Swap in some liquid metal and watch that clock speed go up a bit more with nice temps.









Single-threaded performance obviously increases to blistering speed at 5.4 to 5.5 GHz for benching.


----------



## nrpeyton

Quote:


> Originally Posted by *done12many2*
> 
> Very nice. What clock speed are you running at?
> 
> Order the Delid-Die-Mate-2 and knock the top off that chip. Swap in some liquid metal and watch that clock speed go up a bit more with nice temps.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Single-threaded performance obviously increases to blistering speed at 5.4 to 5.5 GHz for benching.


Stock just now.

The BIOS is a bit overwhelming. The only thing I've changed so far is the DDR4 speed, to the 3600MHz XMP profile.

Even on ASRock's flagship offering for AMD FX (their 14-phase Extreme9 board) there was literally only a quarter of the options I've got now lol!

I'll be "getting in about her" very shortly in BIOS.

And aye; I do plan to de-lid. (Using tool).

I'm on a full custom loop. So makes sense ;-)


----------



## done12many2

Quote:


> Originally Posted by *nrpeyton*
> 
> Stock just now.
> 
> The BIOS is a bit overwhelming. Only thing I've changed so far is the DDR4 speed to XMP profile 3600Mhz.
> 
> Even on ASROCK's flagship offering for AMD FX (their 14 phase Extreme9 board) there was literally only one quarter of the options I've got now lol!
> 
> I'll be "getting in about her" very shortly in BIOS.
> 
> And aye; I do plan to de-lid. (Using tool).
> 
> I'm on a full custom loop. So makes sense ;-)


You've got the best board there is for the job so have fun tweaking that chip and memory!


----------



## nrpeyton

Aye. The board is actually beautiful too.

I think it's time I see if she'll do 5.0GHz then.

What's the max safe 24/7 voltage on these?


----------



## done12many2

Quote:


> Originally Posted by *nrpeyton*
> 
> Aye.
> 
> I think it's time I seen if she'll do 5.0Ghz then.
> 
> What's max safe 24/7 voltage on these?


You'll probably start hitting some thermal limitations around 1.35v in heavy loads with a stock chip, even on a full loop. As far as "safe voltage" goes, that's completely personal discretion. I run an adaptive voltage that levels out to 1.44v with LLC factored in under load. The chip is delidded and running at 5.3 GHz daily. I've run Cinebench R15 at 1.6v / 5.6 GHz with very cold ambient.

See here for references.

http://www.overclock.net/t/1621347/kaby-lake-overclocking-guide-with-statistics/0_100


----------



## Coopiklaani

Has anyone tried stressing the card with the FSU Scene 1 loop? I can pass any test, including 4 hours of Witcher 3 gaming at 2138MHz on the core, but it crashes 15-30 mins into the FSU Scene 1 loop.


----------



## mtbiker033

This is with +191/+500; max temp was 44C.

Just broke into the top 30 on the Superposition thread! lol. #42 on the Unigine leaderboards; the old 4820k & RIVE BE are hanging in there!


----------



## Slackaveli

Quote:


> Originally Posted by *Sweetwater*
> 
> This is my Strix with XOC @ 2063/ 6003, 1.075.
> 
> I think it's an ok score considering I'm on a 4690k haha


Quote:


> Originally Posted by *feznz*
> 
> Hey my friend
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That's the problem: only a slight upgrade for gaming even with the 5960x. I'm still waiting for that game that will actually give my CPU a workout.
> I honestly need some more system RAM, but I don't want to waste money on DDR3, and no current boards are appealing enough when the Intel X299 and AMD X399 chipsets are so close to release. Different scenario if I had an old AMD Bulldozer.
> Only 2 distinct advantage scenarios: 1080p, or certain benches with CPU tests.
> 
> I don't render, and I have a high resolution monitor, so the immediate advantage is not clear unless I get a 1080p monitor and game @ 240Hz.
> 
> The morning frosts are here, so I am going to flash the XOC BIOS on the proper card (Strix); maybe there is a deficiency when it's flashed on a non-Strix card. We'll see if I can smash Superposition; I'm only lagging 300 points behind the leader on a 1080Ti, 4K of course.
> 
> I think if you've got to bench @ 1080p then hell, I might break out the old CRT


lol. true.

On the new cpus, here's the latest. http://wccftech.com/intel-skylake-x-core-i9-7920x-7900x-7820x-7800x-x299-leaked/


----------



## Slackaveli

Quote:


> Originally Posted by *Sweetwater*
> 
> Yeah the ol Z97. Thing is I have already been looking for alternatives to drop in and being in Australia it's just not worth it. Our tech pricing is whack and our dollar to buy from overseas is pretty weak. For example a 1080ti FE is 1100 while a partner board 1200+


I hear you. But (in American dollars) a 4690k still sells used for ~$180, and a 5775-C would be a pretty big jump from that. Here they are about $350, so yeah, that's $170. But $100 of that is for hyperthreading. The 5675-C is the same thing w/o hyperthreading and actually tends to overclock ~100MHz higher than the i7. It's a drop-in upgrade over a 4690k and it's $249 here. So that's about $70 for a platform's worth of gaming performance upgrade in a drop-in chip requiring nothing else except a BIOS update. It's a worthy plan for consideration. The following two quotes illustrate the difference: one is a $50 upgrade (Benny), the other a ~$800 upgrade. SAME perf. You are lucky to be on a Z97 mobo, man....
Quote:


> Originally Posted by *nrpeyton*
> 
> *Upgrading* from an *old* *AMD FX-8350* to the *latest i7 7700k*, ASUS ROG mobo + G.Skill 3600Mhz C16 (£770) got me the *following gains* on The Witcher 3:
> 
> 1440p - 14 FPS
> (86 to 100)
> 
> 4k - 5 FPS
> (51 to 56)
> 
> Got to admit I expected a little more.
> 
> Memory is running with 3600Mhz XMP profile.
> 
> The i7 is running at stock (but at stock it's boosting to 4500mhz on all cores consistently during gaming & doesn't drop).
> 
> Not tried to O/C it yet. I doubt it's worth it.
> 
> The 1080Ti FE is running at 2000 MHZ 988mv with a +535 memory.


Quote:


> Originally Posted by *Benny89*
> 
> Yup, that is that. I had same when I moved from 4790k to 5775C OCed to 4.2 (equivalent in games to 5.0Ghz 7700k). Only around 10-12 fps gains in heavy CPU games.
> 
> Truth is- CPUs performance difference is small, so I wasn't really supprised. As you shouldn't be
> 
> 
> 
> 
> 
> 
> 
> . Most games are 90% GPU depend.
> 
> Buy hey, at least we don't have to upgrade to another 4-core for another 2 years at least. Next upgrade I am gonna 6-core.
> 
> Still in 1440p those 10-14 fps is nice to see


EDIT: I did not know that. How unfortunate.
Quote:


> Originally Posted by *Dasboogieman*
> 
> 5775cs don't exist in Australia, you're better off going to Kaby Lake at this rate.


Quote:


> Originally Posted by *Coopiklaani*
> 
> Anyone tried to stress the card with FSU Scene 1 loop? I can pass any test, including witcher 3 4hr gaming at 2138MHz at the core but it crashes 15-30mins running the FSU scene 1 loop.



Yeah, man, that test is a hater. Just... no game is going to use that much tessellation. It heats up the VRAM (my theory) and eventually it crashes. I noticed around the 40 minute mark I got a bit of artifacting before I crashed. This was running at +425 VRAM, when I know from OCCT that I start erroring at +445 VRAM. Which tells me my VRAM heated up b/c of all that tessellation at 4K and crashed me.


----------



## gmpotu

Can someone tell me which BIOS is XOC and where to download it? I see the OP with the link to all the BIOS downloads, but which one is XOC? One person posted that it is the Asus Strix OC BIOS, but then people are talking about no power limiting and unlocking voltage up to 1.200v. My card is an Asus Strix 1080 Ti OC, so I would assume I already have that BIOS, but I only get 1.093v in AB with the voltage slider maxed, and my card maxes at the 120% slider running AB 4.3.

Just really confused by the posts I have been trying to keep up with.


----------



## done12many2

Quote:


> Originally Posted by *mtbiker033*
> 
> 
> 
> this is with +191/+500 max temp was 44C
> 
> just broke into the top 30 on the superposition thread! lol #42 on the unigine leaderboards, the old 4820k & RIVE BE hanging in there!


Nice work man.

My SLI is #6 in 1080p Extreme, #3 in 4k Optimized, and #4 in 8k Optimized with no shunt mod. I like being right in the middle of Titan Xp SLIs with my ghetto Xp SLI.


----------



## CoreyL4

Does a ~10090 Superposition 4K score with 2037/6123 seem low?

It downclocks to 2012-2025 during the run too.


----------



## mtbiker033

Quote:


> Originally Posted by *done12many2*
> 
> Nice work man.
> 
> My SLI is #6 in 1080p Extreme, #3 in 4k Optimized, and #4 in 8k Optimized with no shunt mod. I like being right in the middle of Titan Xp SLIs with my ghetto Xp SLI.


hell yeah! very nice!!


----------



## Hulio225

Quote:


> Originally Posted by *gmpotu*
> 
> Can someone tell me which BIOS is XOC and where to download it? I see the OP with the link to all the BIOS downloads, but which one is XOC? One person posted that it is the Asus Strix OC BIOS, but then people are talking about no power limiting and unlocking voltage up to 1.200v. My card is an Asus Strix 1080 Ti OC, so I would assume I already have that BIOS, but I only get 1.093v in AB with the voltage slider maxed, and my card maxes at the 120% slider running AB 4.3.
> 
> Just really confused by the posts I have been trying to keep up with.


It's not the Strix BIOS, man... it's an unofficial *X*treme *O*ver*c*locking (XOC) BIOS, found on hwbot: http://forum.hwbot.org/showthread.php?t=169488


----------



## KedarWolf

Quote:


> Originally Posted by *gmpotu*
> 
> Can someone tell me which BIOS is XOC and where to download it? I see the OP with the link to all the BIOS downloads, but which one is XOC? One person posted that it is the Asus Strix OC BIOS, but then people are talking about no power limiting and unlocking voltage up to 1.200v. My card is an Asus Strix 1080 Ti OC, so I would assume I already have that BIOS, but I only get 1.093v in AB with the voltage slider maxed, and my card maxes at the 120% slider running AB 4.3.
> 
> Just really confused by the posts I have been trying to keep up with.


Added XOC link to 'How To Flash A Different BIOS'.

It's also in the 'bestbioscollection.zip'


----------



## bladefrost

Hi, I got a Zotac GTX 1080 Ti AMP edition (not the Extreme).

My temps max out at 84C, even at 100% fan speed with an open-air case.

Boost clocks at max temp (84C) are in the 1650-1750MHz range.

I believe this is running worse than a Founders Edition card. Can anyone confirm? What boost clocks does the Founders Edition run at stock?


----------



## gmpotu

Thank you for the replies on the XOC, guys!


----------



## Hulio225

Quote:


> Originally Posted by *bladefrost*
> 
> Hi, I got a Zotac GTX 1080 Ti AMP edition (not the Extreme).
> 
> My temps max out at 84C, even at 100% fan speed with an open-air case.
> 
> Boost clocks at max temp (84C) are in the 1650-1750MHz range.
> 
> I believe this is running worse than a Founders Edition card. Can anyone confirm? What boost clocks does the Founders Edition run at stock?


Telling us how hot your card runs without mentioning your ambient temperature is worth close to nothing.

84° C would be pretty normal if your ambient is ~30° C, and hot if your ambient is ~20° C...

Just saying, you're not the only one; in fact most people throw temp numbers around and never mention ambient temps, which makes the numbers meaningless...
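
If you want to compare cards across rooms, compare the rise over ambient instead of the raw core temp. A quick sketch of the idea (my own little helper, nothing official):

```python
# Hypothetical helper: compare GPU temps by their rise over ambient,
# since the absolute core temp alone says little about the cooler.

def delta_t(core_c: float, ambient_c: float) -> float:
    """Temperature rise of the core over room ambient, in Kelvin."""
    return core_c - ambient_c

# Same 84C core reading, two different rooms:
hot_room = delta_t(84.0, 30.0)   # 54 K rise -> fairly normal on air
cool_room = delta_t(84.0, 20.0)  # 64 K rise -> running hot

print(hot_room, cool_room)
```

Same 84C on the sensor, very different verdicts on the cooler.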


----------



## bladefrost

Ambient Temps in my room is at 26C


----------



## Coopiklaani

Quote:


> Originally Posted by *AMDeo*
> 
> Instead of FujiPoly thermal pads (not avaible in my country) I recommend something reasonably priced:
> 
> http://www.aquatuning.co.uk/thermal-pads-und-paste/thermal-pads/19458/alphacool-eisschicht-waermeleitpad-11w/mk-120x20x0-5mm-2-stueck-sarcon-xr-he?c=2868
> 
> http://www.shop4pc.ro/pad-termic-alphacool-eisschicht-sarcon-xrhe-11wmk-05mm-120x20mm-2-bucati-p-44609.html
> 
> I haven't replaced the stock VRM pads yet with Eisschicht ones... Maybe after I shall make a backplate for my Armor...


The VRM IC that the FE cards and most other aftermarket cards use, the NTMFD4C85N, has a thermal conductivity of about 1.5 W/mK, so the limiting factor is almost entirely the IC package itself. Don't waste money on high-end thermal pads; I don't think you'll notice any difference between a 5 W/mK budget pad and a 12 W/mK premium one.
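
To put rough numbers on that, here's a back-of-envelope series-resistance sketch. The pad and package dimensions are my guesses for illustration (the ~1.5 W/mK package figure is from the post above, not a datasheet I've checked):

```python
# Rough series thermal-resistance sketch for a VRM IC under a pad.
# Assumed numbers: a 0.5 mm pad over a 6x6 mm package, with ~1 mm of
# package material at ~1.5 W/mK. Illustrative only.

def r_th(thickness_m: float, k_w_per_mk: float, area_m2: float) -> float:
    """Conduction resistance R = t / (k * A), in K/W."""
    return thickness_m / (k_w_per_mk * area_m2)

area = 6e-3 * 6e-3             # 6 mm x 6 mm contact patch
pkg = r_th(1.0e-3, 1.5, area)  # package material dominates the path
pad_cheap = r_th(0.5e-3, 5.0, area)
pad_fancy = r_th(0.5e-3, 12.0, area)

total_cheap = pkg + pad_cheap
total_fancy = pkg + pad_fancy
print(total_cheap, total_fancy)  # the package term dominates either way
```

With these numbers, swapping a 5 W/mK pad for a 12 W/mK one changes the total path resistance by under 10%, which is why the premium pads barely move VRM temps here.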


----------



## Hulio225

Quote:


> Originally Posted by *bladefrost*
> 
> Ambient Temps in my room is at 26C


In which scenario did you reach the 84° C: while playing games (which games?), in benchmarks, or even in Furmark?


----------



## bladefrost

Quote:


> Originally Posted by *Hulio225*
> 
> In which scenario did you reach the 84° C: while playing games (which games?), in benchmarks, or even in Furmark?


During gaming in Witcher 3, Rise of the Tomb Raider, and other demanding titles. In benchmarks, too, it always maxes out at 84C. I tried 100% fan speed and an open case, but it didn't help a bit. Strange. Maybe the heatsink is the problem. I also replaced the thermal paste, and it's still maxing at 84C.


----------



## Sweetwater

Quote:


> Originally Posted by *Slackaveli*
> 
> EDIT: I did not know that, How unfortunate.


Believe me, you guys' info on the 5775C is what made me take a look. It is a worthy upgrade. But yeah, Australia is a little island 20 years behind the rest of the world, haha. We miss out on a lot.


----------



## Slackaveli

Quote:


> Originally Posted by *Sweetwater*
> 
> Believe me your guys info on the 5775c is what made me have a look. It is a worthy upgrade. But yeah Australia is a little Island 20 years behind the rest of the world haha. We miss out on a lot.


well, we tried.

Australia is a beautiful place, though. Certainly on the list of places abroad in which I'd live.


----------



## Hulio225

Quote:


> Originally Posted by *bladefrost*
> 
> During gaming in Witcher 3, Rise of the Tomb Raider, and other demanding titles. In benchmarks, too, it always maxes out at 84C. I tried 100% fan speed and an open case, but it didn't help a bit. Strange. Maybe the heatsink is the problem. I also replaced the thermal paste, and it's still maxing at 84C.


Yeah, at 100% fan speed, even with 26° C ambient, that seems a bit too high, hmm.

Btw, 84° C is NVIDIA's temperature limit when the power target is at 100%; once the card reaches 84° C it keeps down-clocking itself to hold a max of 84° C... it seriously sounds like your heatsink has problems...
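
That limiter behaves roughly like this, sketched in Python. The bin size and clocks are illustrative; NVIDIA's real boost tables aren't public:

```python
# Minimal sketch of the Boost-style temperature limiter described above:
# the card sheds clock bins while the core sits at the temp target.
# TEMP_TARGET_C and BIN_MHZ are illustrative, not NVIDIA's actual tables.

TEMP_TARGET_C = 84.0
BIN_MHZ = 13  # Pascal adjusts clocks in roughly 13 MHz steps

def next_clock(current_mhz: float, core_temp_c: float) -> float:
    if core_temp_c >= TEMP_TARGET_C:
        return current_mhz - BIN_MHZ  # drop a bin to hold the target
    return current_mhz                # otherwise keep the boost clock

clock = 1750.0
for temp in (82.0, 84.0, 85.0, 83.0):
    clock = next_clock(clock, temp)
print(clock)  # 1724.0 after shedding two bins at the limit
```

That's why a card pinned at 84° C shows lower and lower boost clocks: the temp reading stays flat while the clock is what gives.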


----------



## bladefrost

Quote:


> Originally Posted by *Hulio225*
> 
> Yeah, at 100% fan speed, even with 26° C ambient, that seems a bit too high, hmm.
> 
> Btw, 84° C is NVIDIA's temperature limit when the power target is at 100%; once the card reaches 84° C it keeps down-clocking itself to hold a max of 84° C... it seriously sounds like your heatsink has problems...


Yeah, and I've read somewhere that this AMP Edition card, and even the 1080 version, hits these temps with this cooler. When I set the temp limit to 90C, it would reach 90C even at 100% fan speed, so ***? I wanna sell this card. I don't know if the manufacturer would accept a return, since every card of this model runs this hot, even worse than the Founders.


----------



## alucardis666

Question! I just got a Seasonic 850W PSU, as I've read about tons of people SLI-ing 1080s and 1080 Tis on 800W units, but when I try to run Heaven my system shuts off and reboots... What gives? A faulty unit, or just not enough power?!

This is the unit in question...

https://www.amazon.com/Seasonic-Flagship-TITANIUM-SSR-850TD-Titanium/dp/B01HXYRMME/ref=cm_cr_arp_d_product_top?ie=UTF8


----------



## kolkoo

Quote:


> Originally Posted by *alucardis666*
> 
> Question! I just got a Seasonic 850W PSU as I've read about tons of people SLi-in 1080s and 1080tis with 800w psu units, but when I try to run Heaven my system shuts off and reboots.. What gives? Faulty unit or just not enough power?!
> 
> This is the unit in question...
> 
> https://www.amazon.com/Seasonic-Flagship-TITANIUM-SSR-850TD-Titanium/dp/B01HXYRMME/ref=cm_cr_arp_d_product_top?ie=UTF8


I would read this if I were you http://www.overclock.net/t/1628560/sli-1080ti-causes-reboots-while-4k-gaming-solved-1000-watts-not-enough-for-oced-1080tis

and borrow a 1000 - 1200 W PSU from a friend to check out how it goes.

Or use a device to measure power draw on your PSU.


----------



## Dasboogieman

Quote:


> Originally Posted by *alucardis666*
> 
> Question! I just got a Seasonic 850W PSU as I've read about tons of people SLi-in 1080s and 1080tis with 800w psu units, but when I try to run Heaven my system shuts off and reboots.. What gives? Faulty unit or just not enough power?!
> 
> This is the unit in question...
> 
> https://www.amazon.com/Seasonic-Flagship-TITANIUM-SSR-850TD-Titanium/dp/B01HXYRMME/ref=cm_cr_arp_d_product_top?ie=UTF8


Always take what people say on the internet with a grain of salt.
There are too many factors at work and too many variables to account for ranging from simple to complex: e.g.

Simple:
1. make sure all cables are functional
2. make sure all cables are connected properly
3. make sure all the cable contacts are clean

Complex:
1. you are drawing more power than estimated
2. your PSU is faulty
3. 120V/220V/240V supply voltage: some units lose a bit of wattage depending on which input voltage they're optimized for

Then there's also the age-old explanation that people are talking out of their rear, vastly underestimating their power draw, or straight-up lying.


----------



## alucardis666

Quote:


> Originally Posted by *kolkoo*
> 
> I would read this if I were you http://www.overclock.net/t/1628560/sli-1080ti-causes-reboots-while-4k-gaming-solved-1000-watts-not-enough-for-oced-1080tis
> 
> and borrow a 1000 - 1200 W PSU from a friend to check out how it goes.
> 
> Or use a device to measure power draw on your PSU.


Quote:


> Originally Posted by *Dasboogieman*
> 
> Always take what people say on the internet with a grain of salt.
> There are too many factors at work and too many variables to account for ranging from simple to complex: e.g.
> 
> Simple:
> 1. make sure all cables are functional
> 2. make sure all cables are connected properly
> 3. make sure all the cable contacts are clean
> 
> complex:
> 1. you are drawing more power than estimated
> 2. your PSU is faulty
> 3. 120v-220v-240v supply voltage, some units lose a bit of wattage depending on what supply voltage it is optimized for.
> 
> Then theres also the age old explanation of people are talking out of their rear, or are vastly underestimating the power draw, or straight lies


Appreciate the advice.

I'm gonna get a bigger unit. I didn't have these issues with my old Ax1200 non i...


----------



## TheBoom

Quote:


> Originally Posted by *bladefrost*
> 
> Yeah, and I've read somewhere that this AMP Edition card, and even the 1080 version, hits these temps with this cooler. When I set the temp limit to 90C, it would reach 90C even at 100% fan speed, so ***? I wanna sell this card. I don't know if the manufacturer would accept a return, since every card of this model runs this hot, even worse than the Founders.


AMP editions have always had horrible cooling. My 970 constantly hit over 90C and died a year later; the VRMs got fried with a mild OC.

I would never advise anyone to get an AMP. The only other thing you could try is reapplying the TIM; I remember it improved my temps a little.

The AMP Extreme, on the other hand, is a totally different story.


----------



## alucardis666

Quote:


> Originally Posted by *alucardis666*
> 
> Appreciate the advice.
> 
> I'm gonna get a bigger unit. I didn't have these issues with my old Ax1200 non i...


I just bought this...

Should fix my issues.

http://www.bestbuy.com/site/evga-1600w-modular-power-supply-black/4598983.p?skuId=4598983


----------



## bladefrost

Quote:


> Originally Posted by *TheBoom*
> 
> Amp editions have always had horrible cooling. My 970 constantly hit over 90 C and died 1 year later. VRMs got fried with a mild OC.
> 
> I would never advise anyone to get an Amp. The only other thing you could try is reapplying the TIM. I remember it improved my temps a little.
> 
> The Amp Extreme is a totally different story on the other hand.


Did Zotac replace your 970? I wish I had gotten the AMP Extreme instead.


----------



## TheBoom

Quote:


> Originally Posted by *bladefrost*
> 
> Did Zotac replaced your 970? I wish I had gotten the Amp Extreme instead


They didn't have stock for it so they gave me the blower edition instead. Which was better anyway lol.


----------



## Clukos

Quote:


> Originally Posted by *alucardis666*
> 
> Question! I just got a Seasonic 850W PSU as I've read about tons of people SLi-in 1080s and 1080tis with 800w psu units, but when I try to run Heaven my system shuts off and reboots.. What gives? Faulty unit or just not enough power?!
> 
> This is the unit in question...
> 
> https://www.amazon.com/Seasonic-Flagship-TITANIUM-SSR-850TD-Titanium/dp/B01HXYRMME/ref=cm_cr_arp_d_product_top?ie=UTF8


Try undervolting the GPUs to 0.800 V to reduce the power draw and see if that works. If it does, it's definitely the PSU.
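
For a rough sense of why that test is informative: dynamic power scales roughly with f·V², so the voltage drop alone cuts the draw a lot, even before any clock reduction. Back-of-envelope, not a measurement:

```python
# Back-of-envelope check of why undervolting helps: dynamic power
# scales roughly with f * V^2, so dropping from ~1.093 V to 0.800 V
# cuts power sharply on its own. Illustrative only; real cards also
# lose some clock (and thus f_ratio < 1) at the lower voltage.

def rel_power(v_new: float, v_old: float, f_ratio: float = 1.0) -> float:
    """Power at the new operating point relative to the old one."""
    return f_ratio * (v_new / v_old) ** 2

print(round(rel_power(0.800, 1.093), 2))  # ~0.54 -> roughly half the dynamic power
```

If the reboots stop after cutting GPU draw nearly in half, the PSU (capacity or fault) is the prime suspect.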


----------



## bladefrost

Quote:


> Originally Posted by *TheBoom*
> 
> They didn't have stock for it so they gave me the blower edition instead. Which was better anyway lol.


Lol. Good for you. I will never buy an AMP Edition card again.


----------



## Benny89

My wife told me to just buy myself SLI and be done with it.

Well, you don't have to worry about extra fps at 1440p 144Hz if you have 1080 Ti SLI.

So that's what I am going to do. Waiting for my cards.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> My wife told me to just buy myself SLI and be done with it.
> 
> Well, you don't have to worry about extra fps at 1440p 144Hz if you have 1080 Ti SLI.
> 
> So that's what I am going to do. Waiting for my cards.


BS, you'll still be here Benny89 trying to oc not one but now two of them.


----------



## kolkoo

Quote:


> Originally Posted by *Benny89*
> 
> My wife told me to just buy myself SLI and be done with it.
> 
> Well, you don't have to worry about extra fps at 1440p 144Hz if you have 1080 Ti SLI.
> 
> So that's what I am going to do. Waiting for my cards.


I hope you are happy with your SLI and that the games you play support it properly.


----------



## BrainSplatter

Quote:


> Originally Posted by *alucardis666*
> 
> Question! I just got a Seasonic 850W PSU as I've read about tons of people SLi-in 1080s and 1080tis with 800w psu units, but when I try to run Heaven my system shuts off and reboots.. What gives? Faulty unit or just not enough power?!


I am running 2x FE 1080 Ti SLI + a 7700K on an 800W be quiet! Straight Power PSU with no problems so far. I did all the 3DMark benches and Superposition 4K/8K with an OC + maxed-out power target and the CPU @ [email protected] I was a bit nervous, but everything worked just fine. The system has 32GB RAM, 6 SSDs and 9 fans. Unfortunately, I can't measure power draw at the outlet because the battery in my Kill-A-Watt is dead.

I wouldn't recommend an 800W PSU for SLI 1080 Tis, but it can work, lol. An overclocked 6-core CPU might already push the system over the edge, though.

Regarding FPS gains after switching from an old AMD CPU to the 7700K: Witcher 3 is not really a good test. One of the best tests would be a Total War game like Attila or TW: Warhammer, since they're highly dependent on single-core CPU performance. Arma 3, I think, can also be pretty limited by a single core.


----------



## Benny89

Quote:


> Originally Posted by *SlimJ87D*
> 
> BS, you'll still be here Benny89 trying to oc not one but now two of them.


Of course I will try to OC, but you know, I won't be that obsessed about extra FPS with SLI.

My wife has had enough of boxes moving in and out and GPUs lying around the house....

Oh well, I'll report back when I get them.


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> Lol.
> 
> The i7 7700k is nearly 80% faster than an old AMD FX 8350 at single core.
> 
> I knew it wasn't going to be revolutionary (I had my FX overclocked to a an inch of it's life).
> 
> But considering the massive LEAP in platform, CPU performance, memory performance and motherboard -- I did expect _a bit_ more.
> 
> I'm not coming from a mid-range intel setup, I'm coming from an old *ancient* 8-core AMD on PCI-E 2.0 at 2133Mhz DDR3 lol
> 
> Oh well, maybe other games will yield some better surprises for me


I expected more too, much, much more.
A friend of mine made a similar upgrade, but to a 6700K. He is a bit different from us and simply does not believe in OCing, or in water in a computer.
I know one of his favourite games, Microsoft Flight Simulator, had huge gains.
Quote:


> Originally Posted by *alucardis666*
> 
> Question! I just got a Seasonic 850W PSU as I've read about tons of people SLi-in 1080s and 1080tis with 800w psu units, but when I try to run Heaven my system shuts off and reboots.. What gives? Faulty unit or just not enough power?!
> 
> This is the unit in question...
> 
> https://www.amazon.com/Seasonic-Flagship-TITANIUM-SSR-850TD-Titanium/dp/B01HXYRMME/ref=cm_cr_arp_d_product_top?ie=UTF8


I see you already have a PSU on order, but as a general rule of thumb:
automatically reboots = PSU
have to manually reboot = motherboard

And in all seriousness, yes, you can run on a 750W PSU, but in the long run it will degrade much faster because it is running close to 100% of (or in some cases more than) its rated output.

Nice choice on the PSU. I also got sick of PSU issues and bought the biggest reputable PSU available at the time, a 1500W Enermax.


----------



## BoredErica

Where you're testing in Witcher 3 matters too... It's not just a game, but also where in the game.


----------



## ottoore

Quote:


> Originally Posted by *AMDeo*
> 
> Instead of FujiPoly thermal pads (not avaible in my country) I recommend something reasonably priced:
> 
> http://www.aquatuning.co.uk/thermal-pads-und-paste/thermal-pads/19458/alphacool-eisschicht-waermeleitpad-11w/mk-120x20x0-5mm-2-stueck-sarcon-xr-he?c=2868
> 
> http://www.shop4pc.ro/pad-termic-alphacool-eisschicht-sarcon-xrhe-11wmk-05mm-120x20mm-2-bucati-p-44609.html
> 
> I haven't replaced the stock VRM pads yet with Eisschicht ones... Maybe after I shall make a backplate for my Armor...


Alphacool sells them, but Fujipoly is the OEM.
http://www.osco.uk.com/userfiles/download/Thermal%20Material/Discontinued%20Thermal%20Materials/SARCON_XR-E%20Data%20Sheet.pdf


----------



## VESPA5

Quote:


> Originally Posted by *bladefrost*
> 
> Hi, I got a Zotac GTX 1080 Ti Amp edition (not the xtreme)
> 
> My Temps are @84c max no even at 100% fan speed & open air case.
> 
> Boost clocks at max temp (84c) is around 1650-1750mhz range.
> 
> I believe this is running worst than a Founder's edition card. Can anyone confirm? What boost clocks are the Founder's edition running at stock?


Wow, really? If you're still hitting 84C (which is the thermal cap), your AMP Edition isn't really doing anything differently from the 1080 Ti FE reference blower. Is your case set up with enough intake and exhaust fans? The custom cooler basically disperses heat inside the case, and your setup may not be exhausting that heat properly. But wait, you said it's an open-air case. I dunno. If you're getting the same performance as the FE reference blower, it looks like all you're paying for is fewer decibels.


----------



## bladefrost

Quote:


> Originally Posted by *VESPA5*
> 
> Wow, really? If you're still hitting 84C (which is the thermal cap), you're "Amp Edition" isn't really doing anything more differently than the 1080 Ti FE reference blower. Is your case fan setup with enough intake and exhaust fans? The custom cooler is basically dispersing heat inside the case and your setup may not be exhausting that heat out of it properly.


Maybe it's even performing worse than the Founders. I tried opening the case side panel, and it still didn't help. The lowest boost clock I've seen during gaming at 84C is around 1630MHz. Maybe even slower than the Founders.


----------



## VESPA5

Quote:


> Originally Posted by *bladefrost*
> 
> Maybe this even performing worse than the founders. I tried opening case side panel & still didn't help. Lowest boost clock I've seen during gaming @84C is around 1630mhz. Maybe even slower than the Founders.


Oh wow. That is worse than the FE edition. Before I slapped on a hybrid cooler on my 1080Ti FE, I was getting around 2015MHz at roughly 70C (and that's with a very aggressive fan curve for the reference blower at 80% at a deafening decibel cost). Could you possibly get a refund/exchange? You might have a dud or a card that has the thermal paste applied to it improperly.


----------



## bladefrost

Quote:


> Originally Posted by *VESPA5*
> 
> Oh wow. That is worse than the FE edition. Before I slapped on a hybrid cooler on my 1080Ti FE, I was getting around 2015MHz at roughly 70C (and that's with a very aggressive fan curve for the reference blower at 80% at a deafening decibel cost). Could you possibly get a refund/exchange? You might have a dud or a card that has the thermal paste applied to it improperly.


I already replaced the thermal paste, and it didn't help. At 100% fan speed with the case side panel open, it still runs at 84C; the highest boost clock I've seen is around 1950MHz. Do you think the dealer would accept the card if I returned it?


----------



## mtbiker033

Quote:


> Originally Posted by *bladefrost*
> 
> I already replaced the thermal paste & it didn't help. At 100% fan speed with open case side panel it still runs at 84c. highest boost clock I've seen is around 1950mhz. Do you think the dealer would accept this if I return the card?


It's worth a try.

What are the ambient temps in your PC room?


----------



## bladefrost

Quote:


> Originally Posted by *mtbiker033*
> 
> it's worth a try
> 
> what is the ambient temps in your pc room?


My room is air-conditioned. 25C


----------



## mtbiker033

Quote:


> Originally Posted by *bladefrost*
> 
> My room is air-conditioned. 25C


well that's reasonable, I would try returning that card for sure


----------



## bladefrost

Quote:


> Originally Posted by *mtbiker033*
> 
> well that's reasonable, I would try returning that card for sure


Ok thank you for the advice. I'm not happy. I will try to return this.


----------



## RubyRP

After reading a lot of this thread, I threw myself in!

Bought a 1080 Ti FE from Asus (cheapest price) and started by flashing the vanilla NVIDIA firmware; the Asus one was kind of erratic.

Then I started using the EVGA Precision XOC software, where I put +150 on the core and +400 on the memory. The problem was some freezing in Batman: Arkham Knight (this game is very sensitive to OC). With values of +150/+300 it's perfect (1999MHz turbo core and 11600MHz memory, 119 max on the power target, 84 on the temp side), with a personal custom fan curve.

So I decided to uninstall it and try MSI Afterburner.

With the same values as in XOC, Batman and Unigine became very unstable: freezes, etc. Even after I closed MSI AB it was a mess! Lock-ups every 5 minutes of playing, even at stock values!!

So I uninstalled it and went back to EVGA XOC with my OC values, without any glitch. Working flawlessly.

The only odd thing I saw: the +120 power target, which XOC uses at its max (119 at peak), didn't go further than 112/113 with MSI AB, for an unknown reason, even though I entered the same values in both apps. Weird.

Batman was nearly unplayable with MSI AB, with a lot of micro-stuttering (2560x1440 was fine), but flawless in 4K with EVGA XOC...


----------



## KedarWolf

Quote:


> Originally Posted by *RubyRP*
> 
> After reading a lot of this thread, I threw myself in!
> 
> Bought a 1080 Ti FE from Asus (cheapest price) and started by flashing the vanilla NVIDIA firmware; the Asus one was kind of erratic.
> 
> Then I started using the EVGA Precision XOC software, where I put +150 on the core and +400 on the memory. The problem was some freezing in Batman: Arkham Knight (this game is very sensitive to OC). With values of +150/+300 it's perfect (1999MHz turbo core and 11600MHz memory, 119 max on the power target, 84 on the temp side), with a personal custom fan curve.
> 
> So I decided to uninstall it and try MSI Afterburner.
> 
> With the same values as in XOC, Batman and Unigine became very unstable: freezes, etc. Even after I closed MSI AB it was a mess! Lock-ups every 5 minutes of playing, even at stock values!!
> 
> So I uninstalled it and went back to EVGA XOC with my OC values, without any glitch. Working flawlessly.
> 
> The only odd thing I saw: the +120 power target, which XOC uses at its max (119 at peak), didn't go further than 112/113 with MSI AB, for an unknown reason, even though I entered the same values in both apps. Weird.
> 
> Batman was nearly unplayable with MSI AB, with a lot of micro-stuttering (2560x1440 was fine), but flawless in 4K with EVGA XOC...


There's an EVGA XOC BIOS? Can you link it here?

Or save it with GPU-Z and add it as a .zip as an attachment here?


----------



## Hulio225

Quote:


> Originally Posted by *KedarWolf*
> 
> There's an EVGA XOC BIOS? Can you link it here?
> 
> Or save it with GPU-Z and add it as a .zip as an attachment here?


Dude, he is referring to the EVGA Precision XOC tool, the counterpart to MSI Afterburner... not a BIOS.

Sadly.

Edit: This crap here: https://de.evga.com/precisionxoc/


----------



## KedarWolf

Quote:


> Originally Posted by *Hulio225*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> There's an EVGA XOC BIOS? Can you link it here?
> 
> Or save it with GPU-Z and add it as a .zip as an attachment here?
> 
> 
> 
> Dude he is referring to the EVGA OC Tool, the counterpart to MSI Afterburner... not an bios
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sadly
> 
> Edit: This crap here: https://de.evga.com/precisionxoc/

Can you crank up the voltages with the OC Tool if you flash the EVGA BIOS?


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Hulio225*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> There's an EVGA XOC BIOS? Can you link it here?
> 
> Or save it with GPU-Z and add it as a .zip as an attachment here?
> 
> 
> 
> Dude he is referring to the EVGA OC Tool, the counterpart to MSI Afterburner... not an bios
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sadly
> 
> Edit: This crap here: https://de.evga.com/precisionxoc/
> 
> 
> Can you crank up the voltages with the OC Tool if you flash the EVGA BIOS?

Edit: I ask because there used to be a tool for raising voltages with some cards.


----------



## Hulio225

Quote:


> Originally Posted by *KedarWolf*
> 
> Edit: I ask because there used to be a tool for raising voltages with some cards.


I don't think this tool differs from MSI AB in terms of what it can do... if you could set voltages higher than 1093mV with it, that would have been known for a long time.


----------



## Slackaveli

Quote:


> Originally Posted by *alucardis666*
> 
> Appreciate the advice.
> 
> I'm gonna get a bigger unit. I didn't have these issues with my old Ax1200 non i...


"I'm running a Corsair AX1200i, dual 1080 Tis, 6900K @ 4.4GHz @ 1.380v.

When running the Corsair Link software, I noticed that at full GPU load it will spike near or at 1000 watts of usage. I'm gonna upgrade to an AX1500i for safety, but it could be that you're reaching the limit of that PSU. Have you tried running with just one GPU at full load?

EDIT: Forgot to mention my 1080 Tis are OC'd to around 2050-2100.
Edited by KCDC - 4/21/17 at 12:49pm"

There are a bunch of posts like this, and a lot of people on a single Ti with 550W or 650W power supplies getting the same crashes. You can't overclock two Tis on a 1000W power supply without spiking to 1000W. I would recommend 1000W for a single card, and 1500W for SLI, if you overclock all your components. It's real easy to saturate an 850W PSU.
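
To make the rule of thumb concrete, here's a quick sizing sketch. The wattage figures are illustrative guesses based on the spikes discussed in this thread, not measurements, and the 75% ceiling is the headroom rule argued below, not a standard:

```python
# Hedged PSU-sizing sketch: sum worst-case component draw, then keep
# the PSU at or below ~75% of its rating for headroom. All numbers
# here are illustrative estimates, not measured values.

def recommended_psu(parts_w: dict[str, float], max_load: float = 0.75) -> float:
    """Minimum PSU rating so peak draw stays under max_load of it."""
    peak = sum(parts_w.values())
    return peak / max_load

sli_rig = {
    "1080 Ti #1 (OC spikes)": 400,
    "1080 Ti #2 (OC spikes)": 400,
    "CPU (OC)": 120,
    "board/RAM/drives/fans": 80,
}
print(round(recommended_psu(sli_rig)))  # 1333 -> shop in the 1200-1500 W range
```

Run the same numbers with one card and the answer lands around 800W, which is why the single-card crowd argues over 750W vs 1000W units.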


----------



## Dasboogieman

Quote:


> Originally Posted by *Slackaveli*
> 
> "I'm running a Corsair AX1200i, dual 1080 Tis, 6900K @ 4.4GHz @ 1.380v.
> 
> When running the Corsair Link software, I noticed that at full GPU load it will spike near or at 1000 watts of usage. I'm gonna upgrade to an AX1500i for safety, but it could be that you're reaching the limit of that PSU. Have you tried running with just one GPU at full load?
> 
> EDIT: Forgot to mention my 1080 Tis are OC'd to around 2050-2100.
> Edited by KCDC - 4/21/17 at 12:49pm"
> 
> There are a bunch of posts like this, and a lot of people on a single Ti with 550W or 650W power supplies getting the same crashes. You can't overclock two Tis on a 1000W power supply without spiking to 1000W. I would recommend 1000W for a single card, and 1500W for SLI, if you overclock all your components. It's real easy to saturate an 850W PSU.


What I also don't get: the price difference between a 750W PSU and a decent 1000W one is peanuts compared to the benefits. Why do people cheap out on the PSU?


----------



## Hulio225

Quote:


> Originally Posted by *Slackaveli*
> 
> "I'm running a Corsair AX1200i, dual 1080 Tis, 6900K @ 4.4GHz @ 1.380v.
> 
> When running the Corsair Link software, I noticed that at full GPU load it will spike near or at 1000 watts of usage. I'm gonna upgrade to an AX1500i for safety, but it could be that you're reaching the limit of that PSU. Have you tried running with just one GPU at full load?
> 
> EDIT: Forgot to mention my 1080 Tis are OC'd to around 2050-2100.
> Edited by KCDC - 4/21/17 at 12:49pm"
> 
> There are a bunch of posts like this, and a lot of people on a single Ti with 550W or 650W power supplies getting the same crashes. You can't overclock two Tis on a 1000W power supply without spiking to 1000W. I would recommend 1000W for a single card, and 1500W for SLI, if you overclock all your components. It's real easy to saturate an 850W PSU.


To be honest, recommending 1000W for a single-GPU config is over the top...

My 650W PSU is enough for a shunt-modded 1080 Ti and an i7 7700K, both decently OC'd... it depends how the PSU's power is distributed. If it has 650W on the 12V rail, like mine, it is fine! For safety and a bit of headroom I would recommend 750W, or at most 850W, for a single GPU... and for two-GPU setups 1200W will be enough too...


----------



## Slackaveli

Quote:


> Originally Posted by *BrainSplatter*
> 
> I am running 2x FE 1080ti SLI + 7700K on a 800W BeQuiet Straight Power PSU with no problems so far. I did all the 3DMark benches and superposition 4k/8k with OC + maxed out power target and the CPU @ [email protected] I was a bit nervous but everything worked just fine. System has 32GB RAM, 6xSSDs and 9 fans. Unfortunately, I can't measure power draw on the outlet because the battery in my Kill-A-Watt is dead.
> 
> I wouldn't recommend a 800W PSU for SLI 1080Ti but it can work, lol. An overclocked 6 core CPU might already push the system over the edge though.
> 
> Regarding FPS gains in games after switching from old AMD CPU to 7700K. Witcher 3 is not really a good test. One of the best tests would be a Total War game like Attila or TW-Warhammer since it's highly dependent on single CPU core performance. Arma 3 I think can also be pretty limited by a single core.


I am NOT sure how you are pulling that off! I know systems are different, but I am pulling spikes to 400W on the Aorus. So two of them max-clocked would spike to 800W by themselves, then another 120W for the CPU; that's 920W, NOT counting SSDs, DDR4, mobo, hard drive, fans, any lighting, and the PSU itself. It HAS to be spiking at 1000W+ with a fully overclocked system.

PSUs being rated at, say, 1000W means they can handle 1000W spikes, but they cannot handle 1000W constant. A PSU shouldn't be run at 80 or 90% load even if it can do it; it will degrade and cause problems eventually. Yeah, it's warrantied, but if you drive a PSU "until the wheels fall off" it takes your entire system with it. On a single Ti, 850W+ works, but I'd recommend 1000W. On an SLI setup... bruh. 1200W+ ONLY, imo. For longevity. For power overhead. Having power overhead is a critical part of system balance, imo. I'm on 1000W with a single Ti and people laugh and say "overkill" all the time. SLI owners on 1500W hear the same thing. But you don't see threads about "Crashing in Witcher 3..." from guys with power overhead....

> Regarding FPS gains in games after switching from old AMD CPU to 7700K. Witcher 3 is not really a good test. One of the best tests would be a Total War game like Attila or TW-Warhammer since it's highly dependent on single CPU core performance. Arma 3 I think can also be pretty limited by a single core.

YEP! I told him before he got off the FX to do an Arma 3 comparison. It has a bad CPU bind.


----------



## Hulio225

Quote:


> Originally Posted by *Slackaveli*
> 
> I am NOT sure how you are pulling that off! I know systems are different, but I am pulling spikes to 400W on the Aorus. So two of them max-clocked would spike to 800W by themselves, then another 120W for the CPU; that's 920W, NOT counting SSDs, DDR4, mobo, hard drive, fans, any lighting, and the PSU itself. It HAS to be spiking at 1000W+ with a fully overclocked system. PSUs being rated at, say, 1000W means they can handle 1000W spikes, but they cannot handle 1000W constant. A PSU shouldn't be run at 80 or 90% load even if it can do it; it will degrade and cause problems eventually. Yeah, it's warrantied, but if you drive a PSU "until the wheels fall off" it takes your entire system with it. On a single Ti, 850W+ works, but I'd recommend 1000W. On an SLI setup... bruh. 1200W+ ONLY, imo. For longevity. For power overhead. Having power overhead is a critical part of system balance, imo. I'm on 1000W with a single Ti and people laugh and say "overkill" all the time. SLI owners on 1500W hear the same thing. But you don't see threads about "Crashing in..." from guys with power overhead....
> 
> Regarding FPS gains in games after switching from old AMD CPU to 7700K. Witcher 3 is not really a good test. One of the best tests would be a Total War game like Attila or TW-Warhammer since it's highly dependent on single CPU core performance. Arma 3 I think can also be pretty limited by a single core. YEP! I told him before he got off the FX to do an Arma 3 comparison. It has a bad CPU bind.


You pretty much answered your own question... Spikes are just spikes; the GPU isn't pulling constant max power, and the CPU isn't fully loaded at 100%. That would only be the case if you ran BOINC and the like on both the GPU and CPU... but that's another story... And a PSU can be run at 80 or 90% load all the time; they are designed for that.


----------



## Slackaveli

Quote:


> Originally Posted by *Dasboogieman*
> 
> What I also don't get is the price difference between a 750W PSU and a decent 1000W is peanuts compared to the benefits. Why are people cheaping out on the PSU?


EX-ACT-LY!!! They NEVER save the $100 on hyperthreading by buying the i5. They never save the $100 by buying the "B" mobo. They never save the $100 by buying the off-brand ssd..... But they WILL skimp on a psu... the one component that will bork your WHOLE rig when it fails!!! That's dumb af. But when they try to shove that logic down the community's throat just so they can feel better about their purchase, it becomes more than just dumb. I try to refute it whenever I see it!


----------



## Slackaveli

Quote:


> Originally Posted by *Hulio225*
> 
> You pretty much answered your question by yourself... Spikes are pretty much spikes, neither the gpu is pulling constant max power nor the cpu is fully 100% loaded. this would only be the case if you do boinc and stuff on gpu and cpu... but thats another story... And PSU can be run at 80 or 90% load all the time, they are designed for that


yeah, not by me they won't. My pops taught me 20 years ago (he's an electrical engineer) never to run any type of power supply or generator at over 75% load. It degrades the part and risks the system. If it were three times the cost to go big I could understand the urge to go cheap on the psu... but I'd honestly rather go cheap on a mobo than a psu. I want 25% headroom.

I am sure a GOOD 850w for a single card would be "fine". 650w is just ASKING to nuke your system!
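If anyone wants to run their own numbers, here's a tiny Python sketch of that 25%-headroom rule (the component draws below are made-up examples, not measurements from my rig):

```python
def recommended_psu_watts(component_draws, max_load_fraction=0.75):
    """Size the PSU so the summed worst-case draw sits at or below
    max_load_fraction of the rated output (the 25% headroom rule)."""
    total_draw = sum(component_draws.values())
    return total_draw / max_load_fraction

# Example figures only: a Ti spiking to 400w, an OC'd quad core, and the rest.
draws = {"gpu": 400, "cpu": 120, "board_ram_drives_fans": 80}
print(recommended_psu_watts(draws))  # 600w of draw / 0.75 -> 800.0
```

Swap in your own spike numbers from HWiNFO or a wall meter; the point is sizing off worst-case draw, not idle.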


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> Of course I will try to OC, but you know
> 
> I won't be that obsessed about extra FPS with SLI
> 
> My wife has had enough of boxes going in and out and GPUs lying around the home....
> 
> Oh well, will report when I get them


You were obsessed with trying to gain 1% FPS. Trust me, you'll be doing it again but now with both cards.


----------



## TheBoom

Too much headroom means you're reducing the efficiency of your PSU, doesn't it?


----------



## Hulio225

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, not by me they won't. My pops taught me 20 years ago (he is an electrical engineer) to never run any type of power supply / generator at over 75% load. It degrades the part and risks the system. If it was three times the cost to go big I could understand the urge to go cheap on the psu... but I'd rather go cheap on a mobo tbh than a psu. I want 25% headroom.
> 
> I am sure a GOOD 850 for a single would be "fine". 650w is just ASKING to nuke your system!


What a coincidence, I have a degree in electronics myself... While I'm aware that 650W is somewhat skimpy, I carried it over from my previous system... but the degradation argument is a bit flawed; you could say the same about your gpu and cpu too, that if they're loaded all the time they degrade because they run warmer, bla bla... 80-90% load on a modern quality psu is no biggie, and I'm saying that as an electronics engineer ;-)


----------



## Slackaveli

Quote:


> Originally Posted by *Hulio225*
> 
> To be honest recommending 1000w for a single gpu config is over the top...
> 
> My 650W PSU is enough for a shunt modded 1080ti and a i7 7700k both decently oc'ed... it depends how the psu power is distributed, if it has 650W on the 12V rail, like mine it is fine! for safety and a bit headroom i would recommend a 750w or max 850w for single gpu... and for 2 gpu setups 1200W will be enough too...


mine's a 1000w single rail, but tbh... I'm mad at myself for not just getting the 1200 or 1500w Corsair. Because why NOT? It means you can go SLI at will and always have headroom, never stressing the psu. Oh well. But right now, if I threw another Ti in my rig... I'd not feel comfortable with only 1000w single rail.


----------



## Clukos

Quote:


> Originally Posted by *TheBoom*
> 
> Too much headroom means you're reducing the efficiency of your PSU isn't it?


Yes. And a 750-watt PSU is more than fine for a single Ti, just get a high-quality one.


----------



## Hulio225

Quote:


> Originally Posted by *Slackaveli*
> 
> mine's a 1000w single rail, but, tbh... i am mad for not just getting the 1200 or 1500w corsair. because, why NOT? It means you can go sli at will or always have headroom, never stress the psu. oh well. But, like right now if i threw another Ti in my rig.. I'd not feel comfortable with only 1000w single rail.


That is another story, i agree with that argument, why not if you can afford it sure...


----------



## Slackaveli

Quote:


> Originally Posted by *Hulio225*
> 
> What coincidence i have an degree in electronics myself... while i'm aware of that 650W is somewhat skimpy, i had this already from my previous system... but the argument of degradadtion etc is a bit flawed, you could say the same about your gpu and cpu too, if loaded all the time it degrades because its getting warmer bla bla... 80-90% load on a modern quality psu is not an biggy, and im saying that as an electronic engineer ;-)


I don't question your credentials. But I also KNOW you didn't just buy that psu for this particular setup you're running. Not 650w for a 7700k and 1080 Ti on overclock.net; not YOU. You just deemed it acceptable to keep using, and ONLY because it's single rail and top-notch quality. But if you were building a fresh system, I bet you'd buy an 850w+.







Quote:


> Originally Posted by *TheBoom*
> 
> Too much headroom means you're reducing the efficiency of your PSU isn't it?


yes, "too much" is when you're sitting under 50% load. And it depends on the psu: there are efficiency ratings at 50% and 75% load, and it varies, but Super Flower models are great at it. Like, really at their best at 50-75% load.
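For anyone who wants the curve rather than vibes: the published 80 Plus points can be linearly interpolated. Rough Python sketch below; the default points are the 80 Plus Gold 115V minimums (87/90/87% at 20/50/100% load), so swap in your own unit's reviewed numbers:

```python
def efficiency_at(load_fraction, points=((0.20, 0.87), (0.50, 0.90), (1.00, 0.87))):
    """Linear interpolation between published (load fraction, efficiency) points.
    Defaults are the 80 Plus Gold (115V) spec minimums, not a measured curve."""
    pts = sorted(points)
    if load_fraction <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if load_fraction <= x1:
            return y0 + (y1 - y0) * (load_fraction - x0) / (x1 - x0)
    return pts[-1][1]

print(round(efficiency_at(0.75), 3))  # halfway between 0.90 and 0.87 -> 0.885
```

Which is why 50-75% load is the sweet spot: you're near the peak of the curve on both defaults and reviewed units.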


----------



## TheBoom

Quote:


> Originally Posted by *Clukos*
> 
> Yes. And 750 watt PSU is more than fine for a single Ti, just get a high quality one.


I run an AX860i. I just remembered reading that lower loads usually mean less efficiency, and that's one of the reasons why I wouldn't go too big on headroom anyway.

Especially when I will never again go for SLI and Crossfire setups.


----------



## RubyRP

Quote:


> Originally Posted by *Hulio225*
> 
> I don't think this tool differs from MSi AB in terms of possibilities... if you would be able to set voltages higher than 1093mV with it, it would be known for a long time


Yep, it's pretty much the same... I read on the internet that most of the time XOC is not stable and MSI AB is! For me it's the opposite.

I messed with the voltages in AB with no luck. So back to a small OC on air with the EVGA XOC software.

Running the official NVIDIA firmware.


----------



## Slackaveli

Quote:


> Originally Posted by *TheBoom*
> 
> I run a AX860i. Just remembered reading that lower loads usually means less efficiency and that's one of the reasons why I wouldn't go too big on headroom anyway.
> 
> Especially when I will never again go for SLI and Crossfire setups.


that is true. If you KNOW you'll never again go SLI, or you have a single PCI-E mobo like an ITX, sure. But even then I'd recommend 1000w. Running at 75% isn't hurting efficiency and isn't risking a spike going over your rating. Doesn't mean you'd always crash, but it'd be "stressed". If you do a lot of benching, running hours of loops, etc., never skimp on the psu. An 860i is very high quality and over 850w, so yeah, that's a perfect choice in your situation. Unless Volta is a 900mm² die like the V6000 and runs at 400w+ lol. Even then you're good.

But SLI is set to make a comeback with 4k and 5k screens running 120Hz+, even as gpus get more efficient over time. And if DX12 (lulz) ever gets its stuff together with mGPU support.


----------



## mtbiker033

Quote:


> Originally Posted by *Slackaveli*
> 
> that this true. if you KNOW you'll never again go sli, or have a single pci-e mobo like an itx, sure. But even then I'd recommend 1000w. Running at 75% isn't hurting efficiency and isn't risking a spike going over your rating. Doesnt mean you'd crash always, but it's be "stressed". If you do a lot of benching, running hours of loops, etc, never skimp on psu. An 860i is very high quality and over 850w, so, yeah. that is a perfect choice in your situation. Unless Volta is a 900mm die like the v6000 is and runs at 400w+ lol. Even then you are good.
> 
> But, SLI is set to make a comeback with the 4k and 5k screens running 120Hz+ , even if gpu's are getting more efficient over time. And if DX12 (lulz) ever gets it's stuff together w/ mgpu support.


I agree, go big & high quality on the psu. It's the heart of the system, and you will definitely be happier in the long run, especially when OC'ing!

I also agree that as the new higher resolutions come in, SLI may see new relevance.


----------



## ViRuS2k

I would also agree, don't skimp on PSUs









I have a Super Flower 2000w PSU - 8Pack Edition. Best psu I have ever bought; throw anything at it.... massive single rail


----------



## Coopiklaani

My GPU draws about 425w under the Fire Strike Ultra (FSU) stress test as measured by HWiNFO64. With my 6800k @ 4.5GHz, 1.35v, the whole system draws about 650w from the wall. I'm using an 850w EVGA PSU, which is just right for the system imo.
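One caveat on wall numbers: a meter at the outlet reads AC input, and the rails only see that times efficiency. Quick sketch; the 90% figure is just my guess for a Gold-class unit around this load, not a measurement:

```python
def dc_load_watts(wall_watts, efficiency=0.90):
    """Convert AC draw measured at the wall into the DC load on the rails."""
    return wall_watts * efficiency

dc = dc_load_watts(650)               # ~585w of actual DC load
print(round(dc), round(dc / 850, 2))  # -> 585 0.69, i.e. ~69% load on an 850w unit
```

So the PSU is sitting a bit more comfortably than the raw wall reading suggests.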


----------



## Benny89

Quote:


> Originally Posted by *SlimJ87D*
> 
> You were obsessed with trying to gain 1% FPS. Trust me, you'll be doing it again but now with both cards.


Pls, let my mind at least be convinced that we achieved some solution









Since I wanna go SLI I am waiting for 2x FTW3. The 2-slot design will help temps and room.

BTW, does EVGA have an RGB SLI bridge? And which one (size) should I use on a Maximus VII Hero mobo?

Never had SLI before, so you know, wanna prepare.

I have a 1000W EVGA G2 PSU, and I read around the internet that 1000W is ok for SLI 1080 Tis. I believe so too, as my STRIX 1080 Ti didn't spike higher than 340/350W in games


----------



## Slackaveli

Quote:


> Originally Posted by *mtbiker033*
> 
> I agree, go big & high quality on the psu. It's the heart of the system and definitely you will be happier in the long run especially when OC'ing!
> 
> I also agree as the new higher resolutions are coming in SLI may see new relevance.

















Quote:


> Originally Posted by *ViRuS2k*
> 
> I would also agree, dont skimp on PSU`s
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have a Super flower 2000w PSU - 8Pack Edition, Best psu i have ever bought, throw anything at it.... massive single rail


yeah, that's a beast; I love Super Flower. Also, you could run two V6000s with that








Quote:


> Originally Posted by *Coopiklaani*
> 
> My GPU draws about 425w under FSU stress test as measured by HWinfo64. With my 6800k @4.5GHz, 1.35v, the whole system draws about 650w of power from the wall. I'm using a 850w EVGA PSU which is just right for the system imo.


jah. right at 75%


----------



## mtbiker033

Quote:


> Originally Posted by *Benny89*
> 
> Pls, let my mind at least be convinced that we achieved some solution
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Since I wanna go SLI I am waiting for 2x FTW3. 2 Slot design will help temps and room.
> 
> BTW. Does EVGA has RGB SLI Bridge? And which one (size) I should use on Maximus VII Hero MOBO?
> 
> Never had SLI before so you know, wanna prepare.
> 
> I have a 1000W EVGA G2 PSU, and I read around the internet that 1000W is ok for SLI 1080 Tis. I believe so too, as my STRIX 1080 Ti didn't spike higher than 340/350W in games


I have read posts from at least two different members who found that 1000W wasn't enough for 1080 Ti SLI. YMMV of course, but it's something to think about


----------



## Slackaveli

Quote:


> Originally Posted by *mtbiker033*
> 
> I have read posts from at least two different members that found that 1000W wasn't enough for 1080ti SLI. ymmv of course but it's something to thnk about


just read the last two pages, eh?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> Pls, let my mind at least be convinced that we achieved some solution
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Since I wanna go SLI I am waiting for 2x FTW3. 2 Slot design will help temps and room.
> 
> BTW. Does EVGA has RGB SLI Bridge? And which one (size) I should use on Maximus VII Hero MOBO?
> 
> Never had SLI before so you know, wanna prepare.
> 
> I have a 1000W EVGA G2 PSU, and I read around the internet that 1000W is ok for SLI 1080 Tis. I believe so too, as my STRIX 1080 Ti didn't spike higher than 340/350W in games


Does your case have a side fan on the panel? It can literally make a 10C difference for SLI AIB cards.

https://www.pugetsystems.com/labs/articles/Side-Panel-Fans-Are-They-Worth-It-102/



Look at the image with no side case fan: the cards are choking one another, recycling each other's hot air.


----------



## mtbiker033

Quote:


> Originally Posted by *Slackaveli*
> 
> just read the last two pages, eh?


exactly!!!


----------



## Benny89

Quote:


> Originally Posted by *mtbiker033*
> 
> I have read posts from at least two different members that found that 1000W wasn't enough for 1080ti SLI. ymmv of course but it's something to thnk about


We'll see. If not, I will invest in a 2000W PSU. I am dropping 1400USD on 2 GPUs... I don't really care at this point if I have to buy a PSU as well


----------



## Coopiklaani

Quote:


> Originally Posted by *mtbiker033*
> 
> I have read posts from at least two different members that found that 1000W wasn't enough for 1080ti SLI. ymmv of course but it's something to thnk about


It should be sufficient if you decide not to BIOS/shunt mod your cards. Modded cards can easily draw over 400w each: 1000 - 400x2 = 200w, so you are left with only 200w for the rest of the system
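That budget math as a throwaway sketch (the 400w-per-modded-card figure is from this thread, not a datasheet number):

```python
def watts_left_for_rest(psu_watts, per_gpu_watts, gpu_count):
    """Headroom left for CPU, board, drives and fans after the GPUs take their cut."""
    return psu_watts - per_gpu_watts * gpu_count

print(watts_left_for_rest(1000, 400, 2))  # 1000 - 400x2 -> 200
```

200w is thin once you add an overclocked CPU, pumps and fans, which is the whole point.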


----------



## kolkoo

Makes me wonder if higher power usage does something to the cards... because I can't explain to myself why hybrid cards like the MSI Sea Hawk would be released with standard FE power limits in the bios if temperature was the only reason to put the power limit there in the first place.


----------



## Coopiklaani

Quote:


> Originally Posted by *SlimJ87D*
> 
> Does your case have a side fan on the panel? It can literally make a 10C difference for SLI AIB cards.
> 
> https://www.pugetsystems.com/labs/articles/Side-Panel-Fans-Are-They-Worth-It-102/
> 
> 
> 
> Look at the image with no side case fan, the cards are choking one another with hot air, it becomes a cluster of recycled hot air.


109C GPU temps, the good old days when thermal throttling was not a thing.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> Pls, let my mind at least be convinced that we achieved some solution
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Since I wanna go SLI I am waiting for 2x FTW3. 2 Slot design will help temps and room.
> 
> BTW. Does EVGA has RGB SLI Bridge? And which one (size) I should use on Maximus VII Hero MOBO?
> 
> Never had SLI before so you know, wanna prepare.
> 
> I have a 1000W EVGA G2 PSU, and I read around the internet that 1000W is ok for SLI 1080 Tis. I believe so too, as my STRIX 1080 Ti didn't spike higher than 340/350W in games


I'd also read this too





Quote:


> Originally Posted by *frogger4*
> 
> If TL;DR, skip to the last paragraph, otherwise read it all!
> 
> I have set out to empirically test the hypothesis posed by Jollyriffic using my own computer (the middle one listed in my signature).
> 
> Relevant hardware:
> Chassis: Nzxt Vulcan - this is a small enclosure with a mostly mesh side panel
> Side panel fan: Nzxt Airflow 120mm, tested at a "low" and "high" speed. It is positioned on top of the psu, oriented right at the gpu on the graphics card.
> CPU: i5-3570K, 4.2 GHz
> GPU: Sapphire Radeon HD 7970, tested at 950MHz stock, 1100MHz OC - this is a dual fan, open air type cooler.
> 
> 
> One front 120mm intake, one top-rear 120mm exhaust, and one 120mm side intake or exhaust.
> 
> Procedure:
> Fix the GPU's fan at 75% for all tests. Record GPU temperature after ~5 minutes of 720p FurMark burn in test for the following configurations: No side fan, side air intake, and side air exhaust, and at "high" and "low" speeds side fan speeds for each, and at stock and overclock speeds for each. Additionally, do the same, but with Prime 95 blend test on 2 cores to create heat from the CPU as well, and record CPU temperatures.
> 
> Results:
> Idle temperatures were consistently 28C for the GPU and ~24C for the CPU.
> 
> Lower is better for all graphs.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Analysis and conclusions:
> It is clear that for this setup having the side fan exhaust did not help GPU temperatures, but rather the opposite appears to be true. In all tests, the configuration with a side intake yielded cooler GPU temperatures compared to no side fan and a side exhaust fan.
> Having a side fan improved CPU temperatures over no side fan, and a side exhaust shows just slightly lower CPU temperatures over an intake. I conjecture this is because hot air is removed from the GPU and does not get mixed into the CPU air, given the position of the CPU heatsink right where some air exhausts from the GPU.
> Also tested was the effect of fan speed on GPU temperature. With a side intake, the higher speed did not lower temperatures. With a side exhaust, the higher speed lowered GPU temperatures, but not as much as either speed with an intake. CPU temperatures generally improved with the higher fan speed in either direction.
> 
> Additional:
> I do not think my Nzxt Vulcan is very representative of most people's enclosures. Although these results conclusively favor a side intake for my rig, I don't think that will be true for everyone, and is worth experimenting with.


----------



## Benny89

Quote:


> Originally Posted by *SlimJ87D*
> 
> I'd also read this too


Good reading, but I would never mount a side fan despite those tests. I bought the FTW3 and my Enthoo Evolv Tempered Glass for looks. A side fan would destroy that!

But it is interesting, as I was expecting the opposite result, hah.


----------



## KedarWolf

Quote:


> Originally Posted by *ViRuS2k*
> 
> I would also agree, dont skimp on PSU`s
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have a Super flower 2000w PSU - 8Pack Edition, Best psu i have ever bought, throw anything at it.... massive single rail


I lubs my Corsair AX1500i; I actually ran four power-hungry older Maxwell Titan X's on it at one point.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> Good reading but I would never mount side fan despite of those tests
> 
> 
> 
> 
> 
> 
> 
> . I bought FTW3 and my Enthoo Evolv Tempered Glass for looks
> 
> 
> 
> 
> 
> 
> 
> . Side fan would destroy that!
> 
> But it is interesting as I was expecting opposite result, hah.


I'm not sure it's a good idea for you to get two FTW3s in SLI, just saying. Expect an extra 10C to 13C on both of them.

Here they compare a single card and a SLI setup.

https://www.hardocp.com/article/2014/12/23/phanteks_enthoo_evolv_micro_tower_case_review/6

You might be better off with reference 1080 Tis re-TIMed with Thermal Grizzly.


----------



## romanlegion13th

Would it be worth upgrading my Maxwell Titan X EVGA Hybrid for a 1080 Ti Hybrid? Or have I left it too long, and should I wait for the next card?

Playing in 4K at 60fps


----------



## Benny89

Quote:


> Originally Posted by *SlimJ87D*
> 
> I'm not sure if it's a good idea for you to get two FTW3 in SLI, just saying. Expect an extra 10C to 13C in temperature for the both of them.
> 
> Here they compare a single card and a SLI setup.
> 
> https://www.hardocp.com/article/2014/12/23/phanteks_enthoo_evolv_micro_tower_case_review/6
> 
> Might be better of with Reference 1080 Ti with a Re-TIM of thermal grizzly.


Well, if it's not, I will just return one.

I will re-TIM them anyway with Grizzly. The good thing about EVGA is that you can do almost everything with the card and not worry about your warranty.

Hope at least one will be a silicon win.

Besides, I just love 2-slot cards. I don't know why, but all those 2.5-slot and 3-slot cards just look ugly imo.

Personal preference of course....


----------



## Nico67

Quote:


> Originally Posted by *TheBoom*
> 
> Too much headroom means you're reducing the efficiency of your PSU isn't it?


Absolutely not, PSU efficiency usually peaks at around 50% load, but there are other, way more important factors:

1/ Voltage drop under load: the higher the load, the more your voltage rails will fall off. This is nowhere near as bad as it used to be; platinum- and titanium-rated PSUs are very good these days.

2/ Transient spikes / ripple voltage: how well the power supply copes with fast, large changes in load. Again, a lot better with modern PSUs, but if you have a gold or lower-rated PSU, then a bigger one is better.

Basically, if you have a good platinum or titanium unit, you can go to 75% quite easily, but if you have a gold psu then it's going to be cleaner and safer to stay at 50-75%.
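Those rules of thumb as a lookup table (the fractions are my guidance from above, not anything in the 80 Plus spec):

```python
# Rule-of-thumb comfortable load fraction by efficiency tier (poster's guidance).
SAFE_LOAD_FRACTION = {
    "titanium": 0.75,
    "platinum": 0.75,
    "gold": 0.625,  # mid-point of the "cleaner and safer at 50-75%" band
}

def max_sustained_watts(rated_watts, tier):
    """Comfortable continuous load for a PSU of the given efficiency tier."""
    return rated_watts * SAFE_LOAD_FRACTION[tier.lower()]

print(max_sustained_watts(1000, "platinum"))  # -> 750.0
```

So a good platinum 1000w unit comfortably carries ~750w continuous; a gold 850w closer to ~530w.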


----------



## KedarWolf

Quote:


> Originally Posted by *Slackaveli*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sweetwater*
> 
> Believe me your guys info on the 5775c is what made me have a look. It is a worthy upgrade. But yeah Australia is a little Island 20 years behind the rest of the world haha. We miss out on a lot.
> 
> 
> 
> well, we tried.
> 
> Australia is a beautiful place, though. Certainly on the list of places abroad in which I'd live.

Canada here is a lot like Oz, except Oz has 100 or so creatures that'll kill you quick.


----------



## CoreyL4

Does a ~10090 Superposition 4K score at 2037/6123 seem low?

It also downclocks to 2012-2025 during the run


----------



## gmpotu

Quote:


> Originally Posted by *CoreyL4*
> 
> Does ~10090 supo 4k score with 2037/6123 seem low?
> 
> It downclocks to 2012-2025 during the run too


From what I have seen people posting here, that doesn't look too low. Most people hitting 10500 are running 2100+ core on water and 6000 mem, if I recall correctly.
I'm not near as knowledgeable as the other guys here, but I know you've asked this a couple of times and wanted to give you some feedback just from what I have seen in this thread.


----------



## CoreyL4

Quote:


> Originally Posted by *gmpotu*
> 
> From what I have seen people posting here that doesn't look too low. Most people hitting 10500 are running at 2100+ core on water and 6000 mem if I recall correctly.
> I'm not near as knowledgeable as the other guys here but I know you'be asked this a couple of times and wanted to give you some feedback just from what I have seen in this thread.


thanks, i might go look at the supo thread too.


----------



## RavageTheEarth

Quote:


> Originally Posted by *Coopiklaani*
> 
> My GPU draws about 425w under FSU stress test as measured by HWinfo64. With my 6800k @4.5GHz, 1.35v, the whole system draws about 650w of power from the wall. I'm using a 850w EVGA PSU which is just right for the system imo.


That's interesting. I'm running 4.7GHz @ 1.38v on my 6700k, and even with my GPU voltage at 1.072v/1.081v for 2076, my absolute max power draw under load, with my GPU power limit jumping around in the 110s and 120s (Aorus regular, 125% power slider, 375w limit; I'm not even sure how to figure out how much wattage my GPU is drawing), is a MAX of 530w from the wall via my Kill-A-Watt's readings. This is during gaming in GPU-intensive games like ME:A and The Witcher 3, and benchmarks such as Superposition and Firestrike Ultra. I'm also running two MCP655s and 16 fans. I'm wondering how you are drawing that much more power than I am.

Actually, thinking about it, I'm not sure what the actual CPU voltage is, because while I have it set to 1.38v it drops under load. It's definitely above 1.3v though, because my chip takes 1.3v to get to 4.4GHz, so 4.7GHz must be higher. It's been forever since I've overclocked my CPU.
Quote:


> Originally Posted by *CoreyL4*
> 
> Does ~10090 supo 4k score with 2037/6123 seem low?
> 
> It downclocks to 2012-2025 during the run too


That sounds like a good run to me man. Anything over 10k is good.


----------



## RavageTheEarth

Quote:


> Originally Posted by *Hulio225*
> 
> To be honest recommending 1000w for a single gpu config is over the top...
> 
> My 650W PSU is enough for a shunt modded 1080ti and a i7 7700k both decently oc'ed... it depends how the psu power is distributed, if it has 650W on the 12V rail, like mine it is fine! for safety and a bit headroom i would recommend a 750w or max 850w for single gpu... and for 2 gpu setups 1200W will be enough too...


I'm still running my 1300w EVGA G2 Supernova from back when I was running a tri-fire setup. Agh the day I switched to Nvidia is the day I realized I was dealing with microstutter for the longest time. That was with my 980 Ti. Now we are onto the 1080 Ti and my mind is blown to smithereens once again.

EDIT: sorry for the double post I was going to copy this into the post above, but I instinctively pressed the submit button when I was done typing.


----------



## KingEngineRevUp

Not sure why someone would waste money on a fancy power supply. I've always used a potato to power all my computers. It's cheap and gets the job done, you can replace it once a year at a grocery store if you are scared of power degrading.



It also works with corsair link too if you connect a USB-mini to USB-10 pin to your mobo. This is what I use to measure all the 1080 Ti bios power draw in my test.


----------



## Coopiklaani

Quote:


> Originally Posted by *RavageTheEarth*
> 
> That's interesting. I'm running 4.7GHz @ 1.38v on my 6700k, and even with my GPU voltage at 1.072v/1.081v for 2076, my absolute max power draw under load, with my GPU power limit jumping around in the 110s and 120s (Aorus regular, 125% power slider, 375w limit; I'm not even sure how to figure out how much wattage my GPU is drawing), is a MAX of 530w from the wall via my Kill-A-Watt's readings. This is during gaming in GPU-intensive games like ME:A and The Witcher 3, and benchmarks such as Superposition and Firestrike Ultra. I'm also running two MCP655s and 16 fans. I'm wondering how you are drawing that much more power than I am. Actually, thinking about it, I'm not sure what the actual CPU voltage is, because while I have it set to 1.38v it drops under load. It's definitely above 1.3v though, because my chip takes 1.3v to get to 4.4GHz, so 4.7GHz must be higher. It's been forever since I've overclocked my CPU.
> That sounds like a good run to me man. Anything over 10k is good.


I'm using the XOC bios and running my GPU at 2152MHz @ 1.125v. It can draw up to 450w in FSU scene 1 and over 500w in Furmark NoAA.


----------



## RavageTheEarth

Quote:


> Originally Posted by *Coopiklaani*
> 
> I'm using the XOC bios and running my GPU at 2152MHz @ 1.125v. It can draw upto 450w in FSU scene 1 and over 500w in Furmark NoAA.


That's awesome. I'm guessing you are under water. What temps are you seeing at 1.125v? EK delayed the block for the Aorus to the last week of this month, so I'm just waiting for that before I can push the card much further.


----------



## Benny89

Since my STRIX never went above 70C: where do you see another downclock on the 1080 Ti? After 61-62C there is a drop of one bin due to temp. Is there another one before 80C?


----------



## Coopiklaani

Quote:


> Originally Posted by *RavageTheEarth*
> 
> That's awesome. I'm guessing you are under water. What temps are you seeing at 1.125v? Ek delayed the block for the Aorus to the last week of this month so I'm just waiting for that before I can push the card much further.


yep, I'm under water rocking an EK FE block. Absolute temps aren't very meaningful due to different coolant temps.
At 227LPH (or 1GPM) flow, I'm getting a 10C delta (GPU minus coolant) with a 300w GPU load, a 14C delta with a 400-450w load, and a 15C delta with a 500w load.
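Treating those numbers as an effective °C-per-watt for the block + TIM (I picked 425w to stand in for the 400-450w band), a quick sketch shows the thermal resistance staying roughly flat as load rises, which is what you'd hope for:

```python
def thermal_resistance(delta_c, load_w):
    """Effective thermal resistance in C per watt: temperature rise over load."""
    return delta_c / load_w

# (GPU load in watts, GPU-minus-coolant delta in C) as reported above
for watts, delta in [(300, 10), (425, 14), (500, 15)]:
    print(f"{watts}w: {thermal_resistance(delta, watts):.3f} C/W")
# all land around 0.030-0.033 C/W
```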


----------



## DerComissar

Quote:


> Originally Posted by *SlimJ87D*
> 
> Not sure why someone would waste money on a fancy power supply. I've always used a potato to power all my computers. It's cheap and gets the job done, you can replace it once a year at a grocery store if you are scared of power degrading.
> 
> 
> 
> It also works with corsair link too if you connect a USB-mini to USB-10 pin to your mobo. This is what I use to measure all the 1080 Ti bios power draw in my test.


Lol!

That post really sums up this "too much / too little" argument that keeps coming back like a disease in so many threads.

Rep+


----------



## Luckbad

Quote:


> Originally Posted by *CoreyL4*
> 
> Does ~10090 supo 4k score with 2037/6123 seem low?
> 
> It downclocks to 2012-2025 during the run too


Drop your ram speed one step at a time and see if your score goes up. With my memory clocked too high, I lose hundreds of points. It's stable, but worse.


----------



## CoreyL4

Quote:


> Originally Posted by *Luckbad*
> 
> Drop your ram speed one step at a time and see if your score goes up. With my memory clocked too high, I lose hundreds of points. It's stable, but worse.


What would you consider a step? 10mhz, 30mhz? Etc?


----------



## Luckbad

Quote:


> Originally Posted by *CoreyL4*
> 
> What would you consider a step? 10mhz, 30mhz? Etc?


Whatever it takes to make that number actually change. Probably around 10.


----------



## TheBoom

Quote:


> Originally Posted by *mtbiker033*
> 
> I agree, go big & high quality on the psu. It's the heart of the system and definitely you will be happier in the long run especially when OC'ing!
> 
> I also agree as the new higher resolutions are coming in SLI may see new relevance.


It's not so much about the relevance as the decade of poor optimisation and bugs in multi-GPU scenarios. DX12 was supposed to change that, but most game publishers are refusing to incorporate the mGPU aspect of it yet.

Until the day comes that at least 8-9 out of 10 games provide 100% scaling with SLI, I would still stay away from it and go for the single best GPU.

My very first rig was running 8800 GTS 512MB SLI, and my, was it disappointing lol.


----------



## done12many2

Quote:


> Originally Posted by *TheBoom*
> 
> It's not so much about the relevance than the decade of poor optimisations and bugs in multi gpu scenarios. DX 12 was supposed to change that but most game publishers are refusing to incorporate the mGpu aspect of it it yet.
> 
> Until the day comes that at least 8-9 out of 10 games provide 100% scaling with SLI I would still stay away from it and go for a single best GPU.
> 
> My very first rig was running a 8800 GTS 512MB SLI and my was it disappointing lol.


If you're in the position to do so, there's nothing wrong with having a second card on tap. Most of the stuff I play benefits nicely from it. In the situations where it doesn't, I simply disable SLI. Pretty easy stuff.


----------



## TheBoom

Quote:


> Originally Posted by *done12many2*
> 
> If you're in the position to do so, there's nothing wrong with having a second card on tap. Most of the stuff I play benefits nicely from it. In the situations where it doesn't, I simply disable SLI. Pretty easy stuff.


Most of the stuff I do doesn't, unfortunately. And I'm not gonna waste another 800 USD just to turn it off.


----------



## alucardis666

Quote:


> Originally Posted by *Clukos*
> 
> Try undervolting the GPUs to 0.800mv to reduce power draw and see if it works. Then it's definitely the PSU.


Quote:


> Originally Posted by *BrainSplatter*
> 
> I am running 2x FE 1080ti SLI + 7700K on a 800W BeQuiet Straight Power PSU with no problems so far. I did all the 3DMark benches and superposition 4k/8k with OC + maxed out power target and the CPU @ [email protected] I was a bit nervous but everything worked just fine. System has 32GB RAM, 6xSSDs and 9 fans. Unfortunately, I can't measure power draw on the outlet because the battery in my Kill-A-Watt is dead.
> 
> I wouldn't recommend a 800W PSU for SLI 1080Ti but it can work, lol. An overclocked 6 core CPU might already push the system over the edge though.
> 
> Regarding FPS gains in games after switching from old AMD CPU to 7700K. Witcher 3 is not really a good test. One of the best tests would be a Total War game like Attila or TW-Warhammer since it's highly dependent on single CPU core performance. Arma 3 I think can also be pretty limited by a single core.


Quote:


> Originally Posted by *feznz*
> 
> I expected more, too. Much, much more.
> A friend of mine made a similar upgrade, but to a 6700k. He is a bit different to us and simply does not believe in OCing or water in a computer.
> I know one of his fav games was Microsoft Flight Simulator, which had huge gains.
> I see you already got a PSU on order, but a general rule of thumb:
> automatically reboots = PSU
> have to manually reboot = motherboard
> 
> And in all seriousness, yes, you can run on a 750w PSU, but in the long run the PSU will degrade much faster, because it is running close to 100% of, or in some cases more than, its rated output.
> 
> Nice choice on PSU. I also got sick of PSU issues and got the biggest reputable PSU available at the time, a 1500w Enermax.


Quote:


> Originally Posted by *Slackaveli*
> 
> "I'm running a corsair ax1200i, dual 1080ti's, 6900k @ 4.4 GHz @ 1.380v
> 
> when running corsair link software, I noticed, at full gpu load, it will spike near or at 1000 watts usage. I'm gonna upgrade to ax1500i for safety, but it could be that you're reaching the limit of that PSU.. Have you tried running with just one GPU at full load?
> 
> EDIT: Forgot to mention my 1080ti's are OC'd to around 2050-2100
> Edited by KCDC - 4/21/17 at 12:49pm"
> 
> There are a bunch of posts like this, and a lot of people on a single Ti with 550 or 650w power supplies getting the same crashes. You can't overclock two Tis on a 1000w power supply without spiking to 1000w. I would recommend 1000w for single, 1500w for SLI, if you are an overclocker of all components. It's real easy to saturate an 850w psu.


Quote:


> Originally Posted by *Hulio225*
> 
> To be honest recommending 1000w for a single gpu config is over the top...
> 
> My 650W PSU is enough for a shunt-modded 1080ti and an i7 7700k, both decently OC'ed... it depends how the PSU's power is distributed; if it has 650W on the 12V rail like mine, it is fine! For safety and a bit of headroom I would recommend 750w, or at most 850w, for a single GPU... and for 2-GPU setups 1200W will be enough too...


Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, that's beast, I love Super Flower. Also, you could run two V6000s with that.
> 
> jah. right at 75%


Appreciate all the replies, guys. Idk, maybe this Seasonic is just a lemon. I did put the CPU and GPUs at stock and still get a shutdown within 30 seconds to 3 minutes of running Heaven. I'm worried that I'm degrading my components when it shuts off like this. :-(

I will have the PSU in hand Thursday and will see what happens with the EVGA 1600w. This Seasonic is going back for sure, though.


----------



## Rezox

I can verify first hand that 860 watts is too low for 2 x 1080 TI in SLI. Was crashing in games, no overclocking. It worked for a single card, even overclocked, but I had to move to 1000 watts to stabilize and support SLI.

1000 watts is fine, but if you have the room in your case, go 1200 to be safe. I have a 240mm rad that takes up space, so I can't fit a bigger PSU. I lucked out that EVGA just came out with a 1000w in 150mm.

Also, I game in 4K. I find that SLI support sucks with some of the latest games and isn't worth it with others. Stick to a single card and watercool it for OC. The games I run in single-card mode, like Battlefield 1, run very smooth at the highest settings.
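The rules of thumb traded back and forth in these posts can be sketched as a quick calculation. Every wattage figure below is a ballpark assumption (GPU TDP, CPU draw, margins), not a measurement from anyone's system:

```python
# Rough PSU sizing sketch for the configs discussed in this thread.
# All wattage figures are ballpark assumptions, not measurements.

def recommended_psu_watts(gpu_tdp_w, num_gpus, cpu_w, other_w=100,
                          oc_margin=1.2, load_target=0.75):
    """Estimate a comfortable PSU rating in watts.

    oc_margin   -- multiplier for overclocking headroom and transient spikes
    load_target -- keep sustained draw at ~75% of the PSU's rating
    """
    peak_draw = (gpu_tdp_w * num_gpus + cpu_w) * oc_margin + other_w
    return peak_draw / load_target

# Two OC'd 1080 Tis (~250 W TDP each) plus an OC'd HEDT CPU (~200 W):
print(round(recommended_psu_watts(250, 2, 200)))  # 1253 -> shop for ~1200 W+
# A single 1080 Ti with a mainstream quad core (~100 W):
print(round(recommended_psu_watts(250, 1, 100)))  # 693 -> a 750 W unit fits
```

Transient spikes can briefly exceed these sustained averages (as the ~1000 W Corsair Link reading quoted above shows), which is what the margin factor loosely approximates.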


----------



## alucardis666

Quote:


> Originally Posted by *Rezox*
> 
> I can verify first hand that 860 watts is too low for 2 x 1080 TI in SLI. Was crashing in games, no overclocking. It worked for a single card, even overclocked, but I had to move to 1000 watts to stabilize and support SLI.
> 
> 1000 watts is fine, but if you have the room in your case go 1200 to be safe. I have a 240mm rad that takes up space so can't get a bigger PSU. I lucked out that EVGA just came out with the 1000w in 150mm.
> 
> Also, I game in 4k. I find that support for SLI sucks with the latest games and isnt worth it with others. Stick to single card and watercool it for OC. The games I run like Battlefield 1 in single card mode run very smooth at highest settings.


Yea, I had a Corsair AX1200 and was fine, but the PSU was dying, so I read a lot, and people said a top-tier 850w was just fine. Those people are wrong... I'd say go with a minimum of 1000w for SLI, 1200w if you're overclocking components.

I have a 1600w EVGA PSU on order. Overkill, sure, but OC headroom and future cards shouldn't be an issue.
Btw, I was able to do a complete Heaven run @ 800mv locked in MSI Afterburner, no restarts, so it is for sure the PSU that is underpowering my system.
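For context on why that 800mv test run is so telling: dynamic power scales roughly with voltage squared times frequency, so an undervolt cuts draw sharply. A back-of-envelope sketch, with all numbers illustrative rather than measured:

```python
# Why dropping to 0.800v helps so much: dynamic power scales roughly
# with V^2 * f. Numbers below are illustrative assumptions.

def scaled_power(p0_watts, v0, f0_mhz, v, f_mhz):
    """Estimate draw at a new voltage/frequency point from a known one."""
    return p0_watts * (v / v0) ** 2 * (f_mhz / f0_mhz)

# A 1080 Ti pulling ~250 W at 1.062v / 1900 MHz, undervolted to 0.800v / 1700 MHz:
print(round(scaled_power(250, 1.062, 1900, 0.800, 1700)))  # ~127 W
```

Roughly halving per-card draw like this is easily enough to pull an SLI system back under a marginal PSU's limit, which is why the Heaven run passing at 800mv points at the PSU.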


----------



## pez

Put the Hybrid on my FE and temps are good. Have only been playing Overwatch as of late, but boost sticks to 1987 and temps stay under 60C. Gaming temps never seem to exceed 56 or 57C, but for some reason, sitting on the menu gets the GPU hotter than anything. CPU and GPU temps under 60C have been super nice.

However, my TXP was a better OC'er and I think I'll be transferring the Hybrid over to that card. The upside is the FE card will get re-pasted with some Kryonaut.

Not the final setup, but figured I'd show it for the time being.



Quote:


> Originally Posted by *Benny89*
> 
> SC2 is ugly and does not have 2 BIOSes and has a lower Power Limit, I think.


Weren't you considering an FTW3?


----------



## gmpotu

If any of you have time would you mind looking at the result I gathered from my overclock testing and help me to learn whatever insights I should be able to pull from the information?

Cool trick for TimeSpy if you are using the basic version: you can skip the demo by pressing ALT+Tab once the demo finishes loading, then cycling back to the TimeSpy application.

OCCT did not work for me as described in the OP. I had to also run the GPU-Z 3D Tool to put enough load on the card for testing. Once running both at the same time I was able to test my memory clocks. Also, OCCT would show no errors at clocks that would lock up on Heaven.

To save space in the thread I'm putting the scores and clock speeds in a spoiler tag below.

Assume max voltage / power / temp sliders for all tests.


Spoiler: Warning: Spoiler!



Asus Strix 1080 ti OC - with the bios that shipped with the card. (Max power limit 330W)

Ambient temps are around 22C
Air Cooled with the Strix Cooler
Case: 1 front intake, 2 bottom intake, 1 back exhaust, 1 top exhaust, and a side vent that had a fan, but it broke, so there's no fan on the side.

First I used OCCT and GPU-Z 3D Render to get a memory overclock of +400 (11800) without errors in OCCT.
(I let it run for about 5 minutes to heat up the card and kept the Fan at 100% )

Next I started running 4K Superposition with +100 core (2100 max) and the +400 memory and I backed it down one bin at a time until it stopped crashing. I landed at 2062 (100% Fan)

After this I went through a lot of different tests. Here are the results of the different tests based on the benchmarks used. Hopefully it's easy to see and read.

Max power draw reported by the display on my UPS is 469W - Corsair Link shows max

Time Spy Results
*Score 10312* - Clocks 2000/11000 - Fan 100% - Forgot to get the FPS on the graphics tests. - Drops (1987 @ 52C)
*Score 10472* - Clocks 2000/11800 - Fan 100% - GT1 67.05fps, GT2 61.00fps
*Score 10702* - Clocks 2062/11800 - Fan 50% - GT1 68.56fps, GT2 62.31fps - Drops (2025 @ 64C)
*Score 10695* - Clocks 2062/11800 - Fan 100% - GT1 68.43fps, GT2 62.34fps - Drops (2050 @ 50C and 2025 to PL)
-
4K SuperPosition
*Score 9666* - Clocks 2000/11000 - Fan 100% - FPS 57.16min 94.78max - Drops (1987 @ 53C)
*Score 9831* - Clocks 2000/11800 - Fan 100% - FPS 58.00min 96.00max - Drops (1962 @ 64C)
*Score 10024* - Clocks 2062/11800- Fan 50% - FPS 58.88min 98.45max - Drops (2012 @ 66C)
*Score 10047* - Clocks 2062/11800 - Fan 100% - FPS 59.40min 98.37max - Drops (2037 @ 52C)

Heaven 2560x1440
*Score 3040* - Clocks 2000/11000 - Fan 100% - FPS 120.7 - Drops (2000 @ 51C) No Drop or Perf Caps at all.
*Score 3112* - Clocks 2000/11800 - Fan 100% - FPS 123.5 - at 50% fan and 70C artifacts happen.
*Score 3096* - Clocks 2062/11800- Fan 50% - FPS 122.9 - No Drop on Heaven and no perf caps but @70C artifacts and then a driver crash.
*Score 3163* - Clocks 2062/11800 - Fan 100% - FPS 125.6 - Drops (2063 @ 51C @ 1.093v) No Drop or Perf Caps.

Does Heaven Benchmark not cause performance caps for you guys? I didn't get any, but I hit the power limit some in Superposition and TimeSpy. However, Heaven heated my card more than any other benchmark and caused artifacts at 50% fan speed.



I would appreciate any extra insight you guys might have. I'm pretty happy with what I have; it feels like a decent overclock so far.
So far I like the default curve on the Strix a lot. I tried messing with the curve before, and my results were not near as good as just adjusting the clock offsets.
I think as my card heats up, the memory gets too hot, and that's what is causing my artifacts at 70C.
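The "back it down one bin at a time" search described above is easy to sketch. `run_benchmark` here is a hypothetical stand-in for launching Superposition and watching for a crash or artifacts; the ~13 MHz bin size is Pascal's usual clock granularity:

```python
# Sketch of the one-bin-at-a-time core-clock search described above.
# run_benchmark is a hypothetical stand-in: it should return True when
# a run completes cleanly at the given offset.

PASCAL_BIN_MHZ = 13  # Pascal adjusts core clock in ~13 MHz steps

def find_stable_offset(start_offset_mhz, run_benchmark, min_offset_mhz=0):
    """Walk the core offset down one bin at a time until a run passes."""
    offset = start_offset_mhz
    while offset >= min_offset_mhz:
        if run_benchmark(offset):      # completed without crashing
            return offset
        offset -= PASCAL_BIN_MHZ       # drop one bin and retry
    return None                        # nothing stable in the range

# Fake benchmark: pretend anything above +65 MHz crashes.
stable = find_stable_offset(100, lambda off: off <= 65)
print(stable)  # 61
```

In practice each run takes minutes, so starting near a known-crashing offset (as gmpotu did with +100) keeps the number of iterations small.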


----------



## illypso

4790K overclocked at 4.8GHz 1.45v
2 x 1080ti, shunt modded, overclocked at 2050

Works perfectly on my EVGA 1000W G3.

I did not intend to SLI when I bought the power supply, so I would still suggest 1200w or more to be on the safe side (and this EVGA G3 is noisy as hell; I had to change the fan and plug it into my mobo to lower the RPM, but that's another story).

Now I have a 5775C, so it consumes less power, but I ran that 4790k for a month without issue.


----------



## KickAssCop

Can someone link me to 1080 ti SLi scaling review? Would love to see how good SLi is today.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KickAssCop*
> 
> Can someone link me to 1080 ti SLi scaling review? Would love to see how good SLi is today.


Here you go http://bfy.tw/BoTI


----------



## gmpotu

Quote:


> Originally Posted by *KickAssCop*
> 
> Can someone link me to 1080 ti SLi scaling review? Would love to see how good SLi is today.


http://lmgtfy.com/?q=1080+ti+sli+scaling

1st link on that site has 25 games reviewed.


----------



## Hulio225

Quote:


> Originally Posted by *gmpotu*
> 
> http://lmgtfy.com/?q=1080+ti+sli+scaling
> 
> 1st link on that site has 25 games reviewed.


Wow i glanced through the results, just wow... scaling is so bad in most games, that is disgusting to be honest...


----------



## octiny

Sniper Elite 4 with negative scaling
Battlefield 1 with negative scaling
Gears of War 4 with no support (now has nearly perfect mGPU support)
The Division with negative scaling
Wildlands with negative scaling

The list of BS goes on and on.

Even their Firestrike Ultra score is pathetic.

Did they use a potato as a 2nd GPU?

I get at least 90% scaling with all the games listed on there... besides the ones that don't support SLI, which is only 3 games on that entire list as of today.


----------



## OZrevhead

Hands up, anyone that bought an FTW3: can you tell us what it overclocks like? I know one guy that has one, and it only does 2025MHz flat out... bit disappointing...

Does anyone know if EVGA might release a Classified?


----------



## Dasboogieman

Quote:


> Originally Posted by *OZrevhead*
> 
> Hands up anyone that bought an FTW3, can you tell us what it overclocks like? I know one guy that has one and it only does 2025mhz flat out .... bit disappointing ....
> 
> Does anyone know if evga might release a classified?


Wait, 2025 is pretty good. Most people that got higher are either using first-batch, extremely high-quality FE cores or water cooling. There are a handful of AIB guys that get higher, but it's 80% silicon lottery; it has almost nothing to do with the AIB unless they bin the chips like the HOF (which costs as much as a Titan Xp).


----------



## pez

Quote:


> Originally Posted by *illypso*
> 
> 4790K overclock at 4.8ghz 1.45v
> 2 x 1080ti shunt mod overclock at 2050
> 
> Work perfectly on my EVGA 1000W G3
> 
> I did not intend to SLI when I bought the powersupply, so I would still suggest 1200w or more to be on the safe side (and this EVGA G3 is noisy as hell, I had to change the fan and plug it on my mobo to lower RPM but that's another story
> 
> 
> 
> 
> 
> 
> 
> )
> 
> Now I have a 5775C so it consume less power but I had that 4790k for a month without issue.


Why would you suggest an even more overkill PSU for SLI 1080Ti?


----------



## gmpotu

Quote:


> Originally Posted by *octiny*
> 
> Sniper Elite 4 with negative scaling
> Battlefield 1 with negative scaling
> Gears of War 4 with no support (now has nearly perfect mGPU support)
> The Division with negative scaling
> Wildlands with negative scaling
> 
> The list goes on.
> 
> Even their Firestrike Ultra score is pathetic.
> 
> Did they use a Potato as a 2nd GPU?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I get at least 90% scaling with all the games listed on there....besides the ones that don't support SLI which is only "3" games on that entire list as of today.


You're getting 90% scaling with your 1080ti SLI setup on the same games they were getting negative scaling on?

If so, they must have had some terrible cards or absolutely no cooling in their system at all. I can't imagine not having at least minimal scaling in most games. When I had my GTX 570s in SLI, I don't think I ever found a game that didn't give me at least 25% scaling.


----------



## phaseshift

what's the best gtx 1080 ti out right now?


----------



## pez

Those are all games that have been shown to have quite great scaling with SLI/mGPU setups. Sniper Elite 3 and 4 are golden samples of games with amazing SLI scaling.


----------



## Dasboogieman

Quote:


> Originally Posted by *gmpotu*
> 
> You're getting 90% scaling with your 1080ti SLI setup on the same games they were getting negative scaling on?
> 
> If so, they must have had some terrible cards or absolutely no cooling in their system at all. I can't imagine not having at least minimal scaling in most games. When I had my GTX 570s in SLI, I don't think I ever found a game that didn't give me at least 25% scaling.


You used to be able to force the 570s to do SFR mode to enforce compatibility in those tricky games that didn't do AFR. Still a decent 20% scaling albeit with needing Vsync to deal with SFR screen tearing.....ahh the bad old days.
Quote:


> Originally Posted by *phaseshift*
> 
> what's the best gtx 1080 ti out right now?


Depends, what are you looking for?
Outside of AIBs that pre-bin their chips, like the Galax HOF and EVGA Kingpin, most will OC the same; the thing that really makes a difference in performance is core + VRAM cooling competency. The lower the temps you can get, the faster your card is. A second consideration is power limits: some AIBs have larger built-in limits, like the Gigabyte Aorus (375W), FTW3 (358W), Zotac (375W+++), and Strix (unlimited with the XOC BIOS). Seeing as these BIOSes are locked down, the choice of pre-installed BIOS matters.

Then lastly there are other considerations, like your willingness to DIY, your intention to watercool or not, your SLI ambitions, noise, etc.


----------



## octiny

Quote:


> Originally Posted by *gmpotu*
> 
> You're getting 90% scaling with your 1080ti SLI setup on the same games they were getting negative scaling on?
> 
> If so, they must have had some terrible cards or absolutely no cooling in their system at all. I can't imagine not having at least minimal scaling in most games. When I had my GTX 570s in SLI, I don't think I ever found a game that didn't give me at least 25% scaling.


Yep. I almost spit out my drink when I glanced at the numbers.

Their numbers are totally bogus and not to mention outdated.

I'll run a bunch of benches later this week when I get time.

Quote:


> Originally Posted by *pez*
> 
> Those are all games that have been shown to have quite great scaling with SLI/mGPU setups. Sniper Elite 3 and 4 are a golden sample of a game that has quite amazing SLI scaling.


QFT

Edit:

Also just did a quick rundown on that list from my own personal experience.

Firestrike Ultra: Near perfect scaling
Timespy: Near perfect scaling
> Crysis 3: 95%-99% scaling
Metro Last Light: Near perfect scaling
GTA V: 95% scaling
The Witcher 3: 90%-95%
Fallout 4: 90%
Dirt Rally: Near perfect scaling
Far Cry Primal: 90%-95%
Infinite Warfare: Don't have
> Rainbow Six Siege: Near perfect scaling
Watch Dogs 2: 90%-95%
Resident Evil 7: Near perfect scaling
For Honor: Near perfect scaling
Ghost Recon: 90%-95%
Doom: No SLI
The Division: 90ish%
Hitman: No SLI
Rise of the Tomb Raider: Near perfect scaling
Ashes of the Singularity: Don't have
Total War: Warhammer: Don't have
> Deus Ex Mankind Divided: 90ish%
Gears of War 4: Near perfect scaling
Battlefield 1: 85%-95%
Sniper Elite 4: Near Perfect scaling

I'll run a bunch of benchmarks sometime this weekend.
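For anyone wanting to reproduce these percentages: scaling numbers like the ones above are usually computed as the second card's contribution relative to single-card FPS. The example FPS values below are illustrative only, not benchmark results from this thread:

```python
# How the SLI scaling percentages quoted in this thread are usually
# computed: the second GPU's contribution relative to a single card.
# Example FPS numbers are illustrative, not measurements.

def sli_scaling_percent(single_gpu_fps, dual_gpu_fps):
    """Return scaling of the second card: 100% means perfect doubling."""
    return (dual_gpu_fps - single_gpu_fps) / single_gpu_fps * 100

print(sli_scaling_percent(60.0, 114.0))  # 90.0 -> "90% scaling"
print(sli_scaling_percent(60.0, 55.0))   # negative -> worse than one card
```

"Negative scaling" in the reviews being criticized simply means the dual-GPU run produced fewer FPS than a single card, which makes this value drop below zero.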


----------



## gmpotu

Quote:


> Originally Posted by *Dasboogieman*
> 
> You used to be able to force the 570s to do SFR mode to enforce compatibility in those tricky games that didn't do AFR. Still a decent 20% scaling albeit with needing Vsync to deal with SFR screen tearing.....ahh the bad old days.


Lol those bad old days were just a couple of months ago for me. I jumped from two 570s up to the 1080TI STRIX in March


----------



## Hulio225

Quote:


> Originally Posted by *phaseshift*
> 
> what's the best gtx 1080 ti out right now?


Founders Edition on a custom water loop


----------



## feznz

I must be on the wrong website, potatoclock.net.

Where is the Overclock.net attitude, where more is still not enough?


----------



## Luckbad

Quote:


> Originally Posted by *OZrevhead*
> 
> Hands up anyone that bought an FTW3, can you tell us what it overclocks like? I know one guy that has one and it only does 2025mhz flat out .... bit disappointing ....
> 
> Does anyone know if evga might release a classified?


Silicon lottery is what it clocks like.

It's a completely overbuilt card with a fairly high power limit.

I've now owned about 6x 1080 Tis and it's the second best of the bunch that I've had. The Zotac Amp Extreme is the best of the various cards I tried.

But the EVGA has way better customer service, asynchronous fans, interesting temp readouts, much better resale value, and a waterblock from EK coming soonish.

I was going to sell my FTW3 on eBay and keep the Zotac because the Zotac gets higher core and memory clocks, but what that means in a practical sense is a Superposition average framerate of 76.67 (EVGA) vs 77.33 (Zotac). Or leave everything stock and drop ~4-5 FPS.

That said, I'm going to either jump on the FTW3 Hybrid or EVGA 1080 Ti Hydro Copper when those come out. I want to quiet things down a bit.


----------



## Hulio225

Quote:


> Originally Posted by *octiny*
> 
> Yep. I almost spit out my drink when I glanced at the numbers.
> 
> Their numbers are totally bogus and not to mention outdated.
> 
> I'll run a bunch of benches later this week when I get time.
> QFT
> 
> Edit:
> 
> Also just did a quick rundown on that list from my own personal experience.
> 
> Firestrike Ultra: Near perfect scaling
> Timespy: Near perfect scaling
> Crysis 3: 95-99 scaling
> Metro Last Light: Near perfect scaling
> GTA V- 95% scaling
> The Witcher 3- 90%-95%
> Fallout 4- 90%
> Dirt Rally- Near perfect scaling
> Far Cry Primal- 90%-95%
> Infinite Warfare- Don't have
> Watch Dogs 2- 90%-95%
> Resident Evil 7- Near perfect scaling
> For Honor- Near perfect scaling
> Ghost Recon- 90%-95%
> Doom- No SLI
> The Division- 90ish%
> Hitman- No SLI
> Rise of the Tomb Raider- Near perfect scaling
> Ashes of the Singularity- Don't have
> Total War: Warhammer- Don't have
> Deus Ex Mankind Divided- 90ish%
> Gears of War 4- Near perfect scaling
> Battlefield 1- 85%-95%
> Sniper Elite 4- Near Perfect scaling
> 
> I'll run a bunch of benchmarks sometime this weekend.


So the list in that link is crap; whoever made it had no clue how to choose SLI profiles or something like that. That's what you are saying, right?


----------



## octiny

Quote:


> Originally Posted by *Hulio225*
> 
> So that list in that link is crap, someone made it who had no clue how to choose SLI Profiles or something like that, thats what you are saying, right?


Yes. It's crap.

Pure. Utter. Crap. ::insert crap emoji here::

Keep in mind though, I play @ 4K. GPU utilization is going to be higher than say 1440P simply because there is less being put on the CPU.


----------



## BrainSplatter

Quote:


> Originally Posted by *octiny*
> 
> Sniper Elite 4 with negative scaling
> Their numbers are totally bogus and not to mention outdated.


QFT

Here are some numbers which are closer to my experience:
http://www.eurogamer.net/articles/digitalfoundry-2016-asus-strix-gtx-1070-1080-o8g-sli-review

The argument for SLI is simple: get it if you want FPS/resolution that the current highest-performing GPU can't deliver. But to get decent utility out of SLI, be prepared to:
a) wait for games to support SLI (some will never support it)
b) a new patch might break SLI again
c) fiddle with custom SLI profiles (for games which don't have official support or to improve official profile)
d) fiddle with settings which can cause framerate issues in SLI (e.g. temporal AA solutions)
e) Use a framerate limiter to reduce SLI micro-stuttering

Another problem with SLI is that PCIe limits can reduce your performance by up to 20% in some games. I am looking forward to the new AMD chipset with 44 PCIe lanes, since Intel limits those to the highest-end CPUs only.


----------



## Hulio225

Quote:


> Originally Posted by *octiny*
> 
> Yes. It's crap.
> 
> Pure. Utter. Crap. ::insert crap emoji here::
> 
> Keep in mind though, I play @ 4K. GPU utilization is going to be higher than say 1440P simply because there is less being put on the CPU.


Yah, I'm @ 4K too, so that's what I'm interested in in terms of scaling ^^ I was thinking about getting a second 1080 ti too, and after I saw that list I got scared ^^


----------



## feznz

Quote:


> Originally Posted by *Hulio225*
> 
> So that list in that link is crap, someone made it who had no clue how to choose SLI Profiles or something like that, thats what you are saying, right?


Agreed. Also, that review was a month and 4 drivers ago...


----------



## KickAssCop

Quote:


> Originally Posted by *phaseshift*
> 
> what's the best gtx 1080 ti out right now?


I would say it is all a wash. If you want decent temps, low noise, and air cooling, the Strix is the best IMHO.
If you want water, then the MSI Seahawk AIO is the best.

My experience with the MSI Gaming X was crap.


----------



## becks

Quote:


> Originally Posted by *octiny*
> 
> Yep. I almost spit out my drink when I glanced at the numbers.
> 
> Their numbers are totally bogus and not to mention outdated.
> 
> I'll run a bunch of benches later this week when I get time.
> QFT
> 
> Edit:
> 
> Also just did a quick rundown on that list from my own personal experience.
> 
> Firestrike Ultra: Near perfect scaling
> Timespy: Near perfect scaling
> Crysis 3: 95-99 scaling
> Metro Last Light: Near perfect scaling
> GTA V: 95% scaling
> The Witcher 3: 90%-95%
> Fallout 4: 90%
> Dirt Rally: Near perfect scaling
> Far Cry Primal: 90%-95%
> Infinite Warfare: Don't have
> Rainbow Six Seige: Near perfect scaling
> Watch Dogs 2: 90%-95%
> Resident Evil 7: Near perfect scaling
> For Honor: Near perfect scaling
> Ghost Recon: 90%-95%
> Doom: No SLI
> The Division: 90ish%
> Hitman: No SLI
> Rise of the Tomb Raider: Near perfect scaling
> Ashes of the Singularity: Don't have
> Total War: Warhammer: Don't have
> Dues Ex Mankind Divided: 90ish%
> Gears of War 4: Near perfect scaling
> Battlefield 1: 85%-95%
> Sniper Elite 4: Near Perfect scaling
> 
> I'll run a bunch of benchmarks sometime this weekend.


Add World of Warcraft with settings over 10 and View Distance at Max please


----------



## VESPA5

Quote:


> Originally Posted by *phaseshift*
> 
> what's the best gtx 1080 ti out right now?


I'd personally go with the 1080Ti FE and slap on custom watercooling, or drop $159.99 USD on EVGA's hybrid cooler. When the 980Ti came out, sure, there were fancy cards with 2-3 massive fans blowing onto huge heatsinks, but they never really came close to the temps of a custom watercooling setup or a hybrid AIO cooler while maintaining performance.

Granted, the 1080Ti FE cost me $699 on top of $159.99 for the EVGA hybrid cooler to slap onto it, but I breeze by with the GPU rarely going over 50C while overclocking it at a conservative 2015MHz. Nice!


----------



## kevindd992002

Quote:


> Originally Posted by *VESPA5*
> 
> I'd personally go with the 1080Ti FE and slap on custom watercooling or drop $159.99 USD on EVGA's hybrid cooler. When the 980Ti came out, sure there were fancy cards with 2-3 massive fans blowing onto huge radiators, but they never really came close to keeping temps down while maintaining performance with a custom watercooling setup or a hybrid AIO cooler.
> 
> Granted, the 1080Ti FE cost me $699 on top of $159.99 for the EVGA hybrid cooler to slap onto it, but I can breeze by with the GPU rarely going over 50C while overclocking it at a conservative 2015MHz. Nice!


But then you're still power-limited with the 1080Ti FE. My plan is to buy the EVGA GTX 1080Ti FTW3 and slap a waterblock on it.


----------



## Slackaveli

Quote:


> Originally Posted by *pez*
> 
> Put the Hybrid on my FE and temps are good. Have only been playing Overwatch as of late, but boost sticks to 1987 and temps stay under 60C. Gaming temps never seem to exceed 56 or 57C, but for some reason, sitting on the menu gets the GPU hotter than anything. CPU and GPU temps under 60C have been super nice.
> 
> However, my TXP was a better OC'er and I think I'll be transferring the Hybrid over to that card. The upside is the FE card will get re-pasted with some Kryonaut.
> 
> Not the final setup, but figured I'd show it for the time being.
> 
> Weren't you considering an FTW3?


dang, that's pretty dope. A LOT of power in such a small package!


----------



## pez

Quote:


> Originally Posted by *Slackaveli*
> 
> dang, that's pretty dope. A LOT of power in such a small package!


Indeed. The TXP will be back in, and I hope it to be the final form of this system for a while. Temps and noise are finally what I want now, too.

...I need another hobby.


----------



## Slackaveli

Quote:


> Originally Posted by *Dasboogieman*
> 
> You used to be able to force the 570s to do SFR mode to enforce compatibility in those tricky games that didn't do AFR. Still a decent 20% scaling albeit with needing Vsync to deal with SFR screen tearing.....ahh the bad old days.
> depends, what are you looking for?
> Outside of AIBs that pre-bin their chips like Galax HOF and EVGA Kingpin. Most will OC the same, the thing that really makes a difference in performance is core + VRAM cooling competency. The lower the temps you can get, the faster your card is. A second consideration is power limits, some AIBs have larger built-in limits like the Gigabyte Aorus (375W), FTW3 (358W), Zotac (375W+++), Strix (unlimited with the XOC BIOS). Seeing as how these Bioses are locked down, the choice of pre-installed BIOS matters.
> 
> Then lastly there are other considerations like your willingness to DIY, your intentions to watercool or not, your SLI ambitions, noise etc


Yes, yes, and yes. Although it hasn't worked out for a few unlucky ones, if you want the "best", just buy 3 and return 2 after a day of benching (please don't alter and then return the cards), then re-TIM and/or put it under water. The best card is whichever of the 4 with over-358w power limits you like the look of best, or that fits your loop best; then personally bin the best one.


----------



## Coopiklaani

Just realised GDDR5X has such a weird frequency step and performance linearity.


----------



## pez

Yep. It's a great thing to test several different OCs on your Pascal GPUs to see which is 1) the most stable and 2) the best performing. A bad or borderline bad VRAM OC can cause decreased performance even if it doesn't cause crashing.


----------



## Sweetwater

Well, here are my SuPo benches with my Strix on the XOC BIOS @ 2076 core, 6003 mem / 1.075v. Finally dialed in, and I don't think I could get any higher. Multiple hours in Witcher 3 and Andromeda, so I know for sure it's stable. Now to get back to gaming haha.
1080p Extreme


4K Optimized


8K Optimized


----------



## Slackaveli

Quote:


> Originally Posted by *octiny*
> 
> Yep. I almost spit out my drink when I glanced at the numbers.
> 
> Their numbers are totally bogus and not to mention outdated.
> 
> I'll run a bunch of benches later this week when I get time.
> QFT
> 
> Rainbow Six Siege: Near perfect scaling
> 
> .


Wait, R6 Siege finally has an SLI profile? When I had 980 Tis SLI'd, the last time I tried was probably last November or so (a full year after the game launched), and it was negative scaling.
Quote:


> Originally Posted by *Luckbad*
> 
> Silicon lottery is what it clocks like.
> 
> It's a completely overbuilt card with a fairly high power limit.
> 
> I've now owned about 6x 1080 Tis and it's the second best of the bunch that I've had. The Zotac Amp Extreme is the best of the various cards I tried.
> 
> But the EVGA has way better customer service, asynchronous fans, interesting temp readouts, much better resale value, and a waterblock from EK coming soonish.
> 
> I was going to sell my FTW3 on eBay and keep the Zotac because the Zotac gets higher core and memory clocks, but what that means in a practical sense is a Superposition average framerate of 76.67 (EVGA) vs 77.33 (Zotac). Or leave everything stock and drop ~4-5 FPS.
> 
> That said, I'm going to either jump on the FTW3 Hybrid or EVGA 1080 Ti Hydro Copper when those come out. I want to quiet things down a bit.


Hey, man, question: how much do you figure you have made/lost on all those re-sales? How's the market been to ya?
Quote:


> Originally Posted by *pez*
> 
> Indeed
> 
> 
> 
> 
> 
> 
> 
> . The TXP will be back in and I hope it to be the final form of this system for a while
> 
> 
> 
> 
> 
> 
> 
> . Temps and noise are finally what I want now, too.
> 
> ...I need another hobby.


my old hobby was modding my old BMW M3. since she is a scrap heap now, lol, i am free from that ******edly expensive "hobby". uggh. I miss muh girl....


----------



## Slackaveli

Quote:


> Originally Posted by *Coopiklaani*
> 
> Just realised GDDR5X has such a weird frequency step and performance linearity.


Yeah, man. Here's my theory: it's the timings. When you first hit the number that bumps you to the next bin, you are 'tight' at that frequency. As the offset goes up, it loosens, until the next jump, where it tightens again. Does that make sense? Or am I wrong, and when it first jumps it's loose, then it tightens until the next jump, where it loosens again? It's one or the other, of that I am sure. That's WHY the behavior is the way it is; we just need to determine which is tighter. i.e., I hit 5900MHz right at +396, and it kicks to 5940 right at +417. Which is the tighter setting, +398 or +415? Who has a 100% definitive answer to that question? I have to go to work in 5 minutes. Somebody test this, please. It doesn't have to be that frequency, just the theory: is it tighter or looser right after it hops to the next bin?
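The bin idea here can be made concrete with a toy quantizer. The thresholds below mirror the +396 -> 5900 MHz and +417 -> 5940 MHz example from the post; everything else (the base bin, the exact boundaries) is made up for illustration, not a spec:

```python
# Toy model of the offset-to-bin quantization discussed above: a continuous
# offset slider maps onto discrete effective clocks, so every offset inside
# one bin runs at the same frequency. Thresholds mirror the post's example
# (+396 -> 5900, +417 -> 5940); they are illustrative, not specs.

# (offset threshold in MHz, effective clock in MHz) -- ascending thresholds
BINS = [(0, 5860), (396, 5900), (417, 5940)]

def effective_mem_clock(offset_mhz):
    """Return the effective clock of the highest bin the offset reaches."""
    clock = BINS[0][1]
    for threshold, mhz in BINS:
        if offset_mhz >= threshold:
            clock = mhz
    return clock

def slack_within_bin(offset_mhz):
    """How far past the current bin's edge the offset sits."""
    reached = [t for t, _ in BINS if offset_mhz >= t]
    return offset_mhz - max(reached)

print(effective_mem_clock(400))  # 5900 -- same bin as +396
print(slack_within_bin(400))     # 4 -- offset MHz past the bin edge
```

In this model, +398 and +415 both land in the same 5900 MHz bin; the open question in the post is whether the effective timings differ depending on where inside the bin (near the lower or upper edge) the offset sits, which the model deliberately does not answer.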
Quote:


> Originally Posted by *Sweetwater*
> 
> Well here are my SuPo benches with my Strix on the XOC @ 2076 core, 6003mem / 1.075v. Finally dialed in and I don't think I could get any higher. Multiple hours in Witcher 3 and Andromeda so I know for sure it's well stable. Now to get back to gaming haha
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1080p Extreme
> 
> 
> 4K Optimized
> 
> 
> 8K Optimized


Gnarly. You've got it dialed in, no doubt. Basically where I am at myself so I feel pretty great about my card.


----------



## Luckbad

Quote:


> Originally Posted by *Slackaveli*
> 
> hey, man. question. How much do you figure you have made/lost on all those re-sales. How's the market been to ya?


I've lost about $200 in the card shuffle.

I'm only willing to do it because EVGA was too nice about the 1080 FTW situation. I did the $99 upgrade to FTW2, which allowed me to Step Up to a 1080 Ti for $20.

Since I lost less than I should have transitioning to the Ti, I figured I'd play the lottery a bit.

From my experience, it's harder to find a card that is stable above 2GHz than this thread makes it appear.


----------



## octiny

Quote:


> Originally Posted by *Slackaveli*
> 
> wait, R6 siege has a sli profile finally?


They've had it for a while now. There were a few driver bugs with it that got ironed out.


----------



## Slackaveli

Quote:


> Originally Posted by *Luckbad*
> 
> I've lost about $200 in the card shuffle.
> 
> I'm only willing to do it because EVGA was too nice about the 1080 FTW situation. I did the $99 upgrade to FTW2, which allowed me to Step Up to a 1080 Ti for $20.
> 
> Since I lost less than I should have transitioning to the Ti, I figured I'd play the lottery a bit.
> 
> From my experience, it's harder to find a card that is stable above 2GHz than this thread makes it appear.


not too bad at all. And I would tend to agree. I even had a day 1 FE, which is supposed to be golden, and it didn't hold 2GHz, although it was close. Luckily I pulled a winner on try two.
Quote:


> Originally Posted by *octiny*
> 
> They've had it for awhile now. There was a few driver bugs with it that got ironed out.


ah, that is nice to hear.


----------



## ViRuS2k

Hi guys, not long now until I do the shunt mod, resistor style










have a question though: i plan on buying Gelid Extreme thermal pads (12 W/mK)
will have EK waterblocks on. does anyone have a picture of the placement of the thermal pads? i'm not talking about EK's guide, i'm talking about every single place on a card that draws heat and that can be lowered with thermal pads....

also what thickness do i get, 0.5mm or 1mm, and where exactly do i put the 0.5 or 1mm pads for best efficiency on the block..

cheers.


----------



## Luckbad

BH Photo Video has the FTW3 in stock... ordered 2 to roll the dice again.

https://www.bhphotovideo.com/c/product/1330061-REG/evga_11g_p4_6696_kr_geforce_gtx_1080_ti.html


----------



## the9quad

Quote:


> Originally Posted by *Luckbad*
> 
> BH Photo Video has the FTW3 in stock... ordered 2 to roll the dice again.
> 
> https://www.bhphotovideo.com/c/product/1330061-REG/evga_11g_p4_6696_kr_geforce_gtx_1080_ti.html


Do you sell them when they don't work out or send them back as RMA screwing over everyone else in the process?


----------



## Luckbad

Quote:


> Originally Posted by *the9quad*
> 
> Do you sell them when they don't work out or send them back as RMA screwing over everyone else in the process?


Sell them, which is why I keep losing money.


----------



## IMI4tth3w

why are all these websites getting them in stock while us amazon pre-order holders are stuck waiting









i'd love to order from somewhere else but only amazon takes out of country credit cards and can split payments using amazon gift cards... (card is for my wife. her grandmother is paying for half of it and i'm paying the other half)


----------



## Slackaveli

Quote:


> Originally Posted by *Luckbad*
> 
> Sell them, which is why I keep losing money.


rep'd


----------



## Coopiklaani

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, man. here's my theory. it's the timings. when you first hit the number that bumps you to the next bin, you are 'tight' in that frequency. as the numbers go up, it's loosening, until the next jump, then it tightens again. Does that make sense? Or, am I wrong and when it first jumps it's loose, then tightens until the jump where it loosens again??... it's one or the other, of that i am sure. That's WHY the behavior is the way it is. Just need to determine which is tighter. ie, i hit 5900Mhz right at +396. it kicks to 5940 right at +417. which is the tighter setting, +398 or +415? Who has a 100% definitive answer to that question? I have to go to work in 5 minutes. somebody test this, please. Doesn't have to be that frequency, just the theory. is it tighter or looser right after it hops to the next bin?
> Gnarly. You've got it dialed in, no doubt. Basically where I am at myself so I feel pretty great about my card.


Hmm, I think it has more to do with the GPU mem multiplier. Just like the memory multiplier on the CPU, you don't get any arbitrary memory frequency you might wish for. My theory is, the vram works similarly to the system ram: you'll need a certain combo of uncore frequency and memory multiplier to achieve the desired vram frequency. Some combinations may result in reducing the uncore frequency, and thus lower performance at a higher memory clock.


----------



## gmpotu

Anyone have time to comment and give me some constructive feedback on this post?

http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/9720#post_26097394

Post 9737 in this thread if the link doesn't work.


----------



## Mrip541

I was getting screen flashing and drop-outs and couldn't fix it to save my life. I tried everything. I swapped back to my old 980ti and still had some of the same issues. Don't tell me I killed my board... Then I noticed the new rubber (neoprene?) mat I put under my desk the same day I installed the new card was creating a lot of static, and my cables were running across part of it. I pulled the thing out and it crackled angrily as I carried it down the hall. It was hard to believe. Boom. Screen flashes gone. tldr: I may or may not have rma'ed 2 cards because I had my system/cables sitting on a static-generating mat.


----------



## Slackaveli

Quote:


> Originally Posted by *Mrip541*
> 
> I was getting screen flashing and drop-outs and couldn't fix it to save my life. I tried everything. I swapped back to my old 980ti and still had some of the same issues. Don't tell me I killed my board... Then I noticed the new rubber (neoprene?) mat I put under my desk the same day I installed the new card was creating a lot of static, and my cables were running across part of it. I pulled the thing out and it crackled angrily as I carried it down the hall. It was hard to believe. Boom. Screen flashes gone. tldr: I may or may not have rma'ed 2 cards because I had my system/cables sitting on a static-generating mat.


lol, whoops. at least you didnt fry the mobo


----------



## feznz

Quote:


> Originally Posted by *gmpotu*
> 
> Anyone have time to comment and give me some constructive feedback on this post?
> 
> http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/9720#post_26097394
> 
> Post 9737 in this thread if the link doesn't work.


even in the free version of Time Spy there is a check box for the demo, at least there is in my free version.

Not sure what you are asking, but yes, your benches are in line with results at similar clocks.
And TBH I can run the same bench at the same clocks and get 10 different scores; I have yet to have 2 identical runs.


----------



## GnarlyCharlie

I've got the paid version of Fire Strike, but not unlocked Time Spy, and I'm pretty sure I can't skip the Time Spy demo - but I can skip the FS demos.


----------



## Nico67

Quote:


> Originally Posted by *Coopiklaani*
> 
> Just realised GDDR5X has such a weird frequency step and performance linearity.


Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, man. here's my theory. it's the timings. when you first hit the number that bumps you to the next bin, you are 'tight' in that frequency. as the numbers go up, it's loosening, until the next jump, then it tightens again. Does that make sense? Or, am I wrong and when it first jumps it's loose, then tightens until the jump where it loosens again??... it's one or the other, of that i am sure. That's WHY the behavior is the way it is. Just need to determine which is tighter. ie, i hit 5900Mhz right at +396. it kicks to 5940 right at +417. which is the tighter setting, +398 or +415? Who has a 100% definitive answer to that question? I have to go to work in 5 minutes. somebody test this, please. Doesn't have to be that frequency, just the theory. is it tighter or looser right after it hops to the next bin?


Quote:


> Originally Posted by *Coopiklaani*
> 
> Hmm, I think it has more to do with the GPU mem multiplier. Just like the memory multiplier on the CPU, you don't get any arbitrary memory frequency as you might wished to. My theory is, the vram works similarly to that of the system ram, you'll need a certain combo of uncore frequency and memory multiplier to achieve the desired vram frequency. Some combinations may result reducing the uncore frequency thus result in lower performance at higher memory clock.


I was getting the same sort of thing when testing memory clocks (high, next low, next high...), and someone suggested it's a bit of a bug if you don't close and restart benchmark programs between tests when changing mem clocks. Haven't really dug into it, but I have been closing and restarting SuPo between tests and you get much more consistent results.


----------



## KedarWolf

Quote:


> Originally Posted by *Nico67*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Coopiklaani*
> 
> Just realised GDDR5X has such a weird frequency step and performance linearity.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Slackaveli*
> 
> yeah, man. here's my theory. it's the timings. when you first hit the number that bumps you to the next bin, you are 'tight' in that frequency. as the numbers go up, it's loosening, until the next jump, then it tightens again. Does that make sense? Or, am I wrong and when it first jumps it's loose, then tightens until the jump where it loosens again??... it's one or the other, of that i am sure. That's WHY the behavior is the way it is. Just need to determine which is tighter. ie, i hit 5900Mhz right at +396. it kicks to 5940 right at +417. which is the tighter setting, +398 or +415? Who has a 100% definitive answer to that question? I have to go to work in 5 minutes. somebody test this, please. Doesn't have to be that frequency, just the theory. is it tighter or looser right after it hops to the next bin?
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Coopiklaani*
> 
> Hmm, I think it has more to do with the GPU mem multiplier. Just like the memory multiplier on the CPU, you don't get any arbitrary memory frequency as you might wished to. My theory is, the vram works similarly to that of the system ram, you'll need a certain combo of uncore frequency and memory multiplier to achieve the desired vram frequency. Some combinations may result reducing the uncore frequency thus result in lower performance at higher memory clock.
> 
> 
> I was get the same sort of thing when test memory clks, high, next low, next high... and someone suggested its a bit of a bug if you don't close and restart benchmark programs between tests if changing mem clks. Haven't really tested much into it, but I have been closing and restart SuPo between tests and you get much more consistent results.

Even worse, I change things up: sometimes benches crash, then after a reboot the bench is okay.









I think imma gonna reboot every time now. Pain in the butt for sure.


----------



## KraxKill

Quote:


> Originally Posted by *Coopiklaani*
> 
> Just realised GDDR5X has such a weird frequency step and performance linearity.


I've got the same thing going on. Check it out. Being in the wrong mem frequency bin can have a ~2% impact on your score.

This is Superposition 4K on a 4790K at 5.2Ghz 1.320v

2100MHz 1.063 +500 (12006MHz)
10710
10713

2100MHz 1.063 +510 (12028MHz)
10526
10525

2100MHz 1.063 +520 (12048MHz)
10716
10714

2100MHz 1.063 +530 (12078MHz)
10544
10540

2100MHz 1.063 +540 (12096MHz)
10732
10733

2100MHz 1.063 +550 (12108MHz)
10557
10556

2100MHz 1.063 +560 (12132MHz)
10751
10743

2100MHz 1.063 +570 (12150MHz)
10565
10572

2100MHz 1.063 +580 (12170MHz)
10761
10757

2100MHz 1.063 +590 (12190MHz)
10581
10578

2100MHz 1.063 +600 (12212MHz)
10768
10769

2100MHz 1.063 +610 (12236MHz)
10590
10591

2100MHz 1.063 +620 (12246MHz)
10777
10778

2100MHz 1.063 +630 (12264MHz)
10602
10604

2100MHz 1.063 +640 (12284MHz)
10783
10784

2100MHz 1.063 +650 (12312MHz)
10614
10618

2100MHz 1.063 +660 (12332MHz)
10799
10802
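
For anyone who wants to pick out the fast-bin offsets from a table like the one above without eyeballing it, here's a rough Python sketch using the first run of each offset pair. The helper name and the 100-point gap threshold are my own picks for illustration, not anything from a tool:

```python
# Offset -> Superposition 4K score, first run of each pair from the table above.
data = {
    500: 10710, 510: 10526, 520: 10716, 530: 10544,
    540: 10732, 550: 10557, 560: 10751, 570: 10565,
    580: 10761, 590: 10581, 600: 10768, 610: 10590,
    620: 10777, 630: 10602, 640: 10783, 650: 10614,
    660: 10799,
}

def good_offsets(scores, gap=100):
    """Return offsets whose score beats BOTH neighbours by `gap` points,
    i.e. the 'fast bin' settings worth keeping. Edge offsets (the first
    and last in the sweep) have only one neighbour and are skipped."""
    offs = sorted(scores)
    good = []
    for prev, cur, nxt in zip(offs, offs[1:], offs[2:]):
        if scores[cur] > scores[prev] + gap and scores[cur] > scores[nxt] + gap:
            good.append(cur)
    return good

print(good_offsets(data))  # every second offset lands in the fast bin
```

On this data it flags +520 through +640 in steps of 20, which matches the alternating high/low pattern in the scores.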


----------



## the9quad

Quote:


> Originally Posted by *Luckbad*
> 
> Sell them, which is why I keep losing money.


dang right you got some rep.


----------



## Unkzilla

Got a reasonable deal on a pair of 1080 Tis (EVGA FE); they were $100 off each. That was enough to pull the trigger. I was using an aftermarket 1080 before this.



Had to get the EVGA Sli bridge and a ram upgrade too while I was at it









Have read that SLI is a bit hit and miss at the moment but for my current game library it seems to be working very well. Metro LL Redux/ Crysis3/The Division/Far Cry Primal/For Honor/Ghost recon Wildlands is the current lineup and I am getting massive frame rates @ 4k on all of these games

Metro LL was especially surprising, getting 60fps average maxed out @ 4k with SSAA enabled


----------



## Nico67

Not sure if these have been seen in the wild before









msi-geforce-gtx-1080-ti-seahawk-ek-x-11gb

stuff not cheap here, but FE's are now under 1k


----------



## Coopiklaani

Quote:


> Originally Posted by *KraxKill*
> 
> I've got the same thing going on. Check it out. Being in the wrong mem frequency bin can have a ~2% impact on your score.
> 
> This is Superposition 4K on a 4790K at 5.2Ghz 1.320v
> 
> 2100MHz 1.063v +500
> 10710
> 10713
> 
> 2100MHz 1.063v +510
> 10526
> 10525
> 
> 2100MHz 1.063v +520
> 10716
> 10714
> 
> 2100MHz 1.063v +530
> 10544
> 10540
> 
> 2100MHz 1.063v +540
> 10732
> 10733
> 
> 2100MHz 1.063v +550
> 10557
> 10556
> 
> 2100MHz 1.063v +560
> 10751
> 10743
> 
> 2100MHz 1.063v +570
> 10565
> 10572
> 
> 2100MHz 1.063v +580
> 10761
> 10757
> 
> 2100MHz 1.063v +590
> 10581
> 10578
> 
> 2100MHz 1.063v +600
> 10768
> 10769
> 
> 2100MHz 1.063v +610
> 10590
> 10591
> 
> 2100MHz 1.063v +620
> 10777
> 10778
> 
> 2100MHz 1.063v +630
> 10602
> 10604
> 
> 2100MHz 1.063v +640
> 10783
> 10784
> 
> 2100MHz 1.063v +650
> 10614
> 10618
> 
> 2100MHz 1.063v +660
> 10799
> 10802


Nice work! But could you check the actual memory clock vs the set clock? Just like the core clocks, memory clocks don't follow a what-you-set-is-what-you-get law.


----------



## ilikelobstastoo

Quote:


> Originally Posted by *the9quad*
> 
> dang right you got some rep.


good to see there are still some people willing to spend their own money on this hobby!! I've read many posts of people dumping their card back onto the retailer simply because they can't achieve what others achieve on a benchmark! This doesn't screw nvidia, it screws the retailer, and probably contributes to increased prices and, most importantly, tighter *return policies* for people with "real" issues.


----------



## Nico67

Quote:


> Originally Posted by *Coopiklaani*
> 
> Nice work! But could you check the actual memory clock vs the set clock? Just like the core clocks, memory clocks do not follow what you set is what you get law


Below is what I was seeing while testing the XOC bios, sorry no graph









Quote:


> Originally Posted by *Nico67*
> 
> More XOC testing, not only is voltage a big factor (mine seems to like 1.075v), but ram speed is really picky and can change depending on your core clock.
> 
> 2126/6003 - 10384
> 2126/6026 - 10234
> 2126/6059 - 10411
> 2126/6085 - 10268
> 2126/6106 - 10432
> 2126/6132 - 10293
> 2126/6142 - 10452
> 2126/6147 - 10309
> 2126/6166 - 10311
> 2126/6177 - 10473 yet 2138/6177 - 10316
> 2126/6180 - 10318 yet 2138/6180 - 10477
> 2126/6210 - 10485 yet 2138/6210 - 10340
> 2126/6221 - 10337
> 2126/6237 - 10499 starting to see artifacts but score seemed to scale ok.
> 
> SuPo is not a very good indicator of core scaling, but it does go up a bit
> 
> 2152/6180 - 10487
> 
> also I could game at 2138/6180 at least with TSW which usually crashes with other bioses at FE clks 2101/6003.
> 
> Probably with a clean driver reinstall it may do better, and hopefully they bring out a better version of the XOC
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Has anybody queried the HWBOT guy as to the bios performance?


----------



## kevindd992002

I love B&H when they list the FTW3 with an expected arrival date of May 22 but shipped my order today with free 3-day shipping







So a little bit cheaper than Newegg.


----------



## KraxKill

Quote:


> Originally Posted by *Coopiklaani*
> 
> Nice work! But could you check the actual memory clock vs the set clock? Just like the core clocks, memory clocks do not follow what you set is what you get law


Sure, post edited.
http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/9780_30#post_26099568


----------



## IMI4tth3w

canceled my 1080ti ftw3 preorder from amazon today. evga is just way too late to the punch with this one.

the 1080ti strix OC card was not only in stock on amazon, but also has a water block available from ek. went ahead and ordered both. can't wait to see what this thing can do this weekend!


----------



## FuriousReload

I received my FTW3, and past 2025 MHz it locks up, so my FE was the winner by far. That being said, I love the way this FTW3 handles itself, and the temps are great! Time to sell one of them off.


----------



## Luckbad

Quote:


> Originally Posted by *KraxKill*
> 
> Sure, post edited.
> http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/9780_30#post_26099568


Are you using an FE on water?


----------



## Coopiklaani

The percentage controller allows you to effectively increase the voltage within predefined limits recommended by Nvidia. You can always set the slider to the max spec, however that would reduce the life span of your GPU to one year. - Tom Petersen

Is this guy serious? He's saying increasing the voltage from 1.05 to 1.093 will decrease the lifespan of the card from 5 years to one year.










Kill me


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> I've got the same thing going on. Check it out. Being in the wrong mem frequency bin can have a ~2% impact on your score.
> 
> This is Superposition 4K on a 4790K at 5.2Ghz 1.320v
> 
> 2100MHz 1.063 +500 (12006MHz)
> 10710
> 10713
> 
> 2100MHz 1.063 +510 (12028MHz)
> 10526
> 10525
> 
> 2100MHz 1.063 +520 (12048MHz)
> 10716
> 10714
> 
> 2100MHz 1.063 +530 (12078MHz)
> 10544
> 10540
> 
> 2100MHz 1.063 +540 (12096MHz)
> 10732
> 10733
> 
> 2100MHz 1.063 +550 (12108MHz)
> 10557
> 10556
> 
> 2100MHz 1.063 +560 (12132MHz)
> 10751
> 10743
> 
> 2100MHz 1.063 +570 (12150MHz)
> 10565
> 10572
> 
> 2100MHz 1.063 +580 (12170MHz)
> 10761
> 10757
> 
> 2100MHz 1.063 +590 (12190MHz)
> 10581
> 10578
> 
> 2100MHz 1.063 +600 (12212MHz)
> 10768
> 10769
> 
> 2100MHz 1.063 +610 (12236MHz)
> 10590
> 10591
> 
> 2100MHz 1.063 +620 (12246MHz)
> 10777
> 10778
> 
> 2100MHz 1.063 +630 (12264MHz)
> 10602
> 10604
> 
> 2100MHz 1.063 +640 (12284MHz)
> 10783
> 10784
> 
> 2100MHz 1.063 +650 (12312MHz)
> 10614
> 10618
> 
> 2100MHz 1.063 +660 (12332MHz)
> 10799
> 10802


so, you answered my question. i should go with +420
Quote:


> Originally Posted by *Coopiklaani*
> 
> The percentage controller allows you to effectively increase the voltage within predefined limits recommended by Nvidia. You can always set the slider to the max spec, however that would reduce the life span of your GPU to one year. - Tom Petersen
> 
> Is this guy serious? He's saying increasing the voltage from 1.05 to 1.093 will decrease the lifespan of the card from 5 year to one year.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Kill me


----------



## Slackaveli

Quote:


> Originally Posted by *Coopiklaani*
> 
> The percentage controller allows you to effectively increase the voltage within predefined limits recommended by Nvidia. You can always set the slider to the max spec, however that would reduce the life span of your GPU to one year. - Tom Petersen
> 
> Is this guy serious? He's saying increasing the voltage from 1.05 to 1.093 will decrease the lifespan of the card from 5 year to one year.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Kill me


he was serious but he didn't say exactly that. i think he was saying if we ran it at 1.2v or something. he did say the 0% to 100% bit, but he didn't say that the 3 bin bumps we get are what he was referring to when he said 100%. aka, he didn't mean 1.093v when he said 1 year. if so, good, we are all in warranty for 3 years lol. it's not true, though. the premise is, sure, and if you are on the XOC bios you may want to consider that. i think 1.08-1.09 is fine, tho. i mean, my card hit 1.063 at no bump. going to 1.093 is 0.03v. that can't possibly be shortening the life, right?


----------



## Dasboogieman

Quote:


> Originally Posted by *phaseshift*
> 
> what's the best gtx 1080 ti out right now?


Quote:


> Originally Posted by *Coopiklaani*
> 
> The percentage controller allows you to effectively increase the voltage within predefined limits recommended by Nvidia. You can always set the slider to the max spec, however that would reduce the life span of your GPU to one year. - Tom Petersen
> 
> Is this guy serious? He's saying increasing the voltage from 1.05 to 1.093 will decrease the lifespan of the card from 5 year to one year.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Kill me


If this were the case, the 1080 guys would already be reporting voltage deaths. Cmon, they're pushing 1.15V+ on their cards with the XOC BIOS.

I'd like to think the guy worded himself wrong by saying going to 1.093V reduces your card's lifespan *by* one year. This is more plausible considering these cards are engineered for something like 10+ years @ 80 degrees. So a voltage bump will reduce it to 9 years which is expected but also not relevant to OCN.


----------



## Coopiklaani

Quote:


> Originally Posted by *Dasboogieman*
> 
> If this were the case, the 1080 guys would already be reporting voltage deaths. Cmon, they're pushing 1.15V+ on their cards with the XOC BIOS.
> 
> I'd like to think the guy worded himself wrong by saying going to 1.093V reduces your card's lifespan *by* one year. This is more plausible considering these cards are engineered for something like 10+ years @ 80 degrees. So a voltage bump will reduce it to 9 years which is expected but also not relevant to OCN.







He did say stock voltage will make your card last for 5 years and maxing out the slider will reduce it down to one year.
But the truth is, the voltage slider for Pascal cards does NOT apply a voltage offset to the card; it increases the maximum voltage allowed, from 1.05v @0% to 1.093v @100%. However, it has zero effect if you are not playing with the curve, since the curve flattens out after 1.05v if you manage to keep the temperature under 50C.

Most people misunderstand the voltage slider on Pascal cards; it does not work the same way the Maxwell one does (applying a voltage offset), and it doesn't even help to stabilise the OC.
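
To make the distinction concrete, here's a toy Python model of the two slider behaviours. The function names and shape are mine, purely for illustration; the 1.05v/1.093v endpoints are the numbers from the post:

```python
# Toy model: Maxwell's slider shifts the whole V/F curve up by an offset,
# while Pascal's slider leaves the curve alone and only raises the cap
# on the maximum voltage the card is allowed to request.
MAX_V_0PCT, MAX_V_100PCT = 1.050, 1.093  # volts, per the post

def maxwell_voltage(curve_v, offset_v):
    """Maxwell-style: every point on the curve shifts up by the offset."""
    return curve_v + offset_v

def pascal_voltage(curve_v, slider_pct):
    """Pascal-style: the curve is untouched; only the allowed ceiling moves.
    If the stock curve never asks for more than 1.05v, moving the slider
    changes nothing."""
    cap = MAX_V_0PCT + (MAX_V_100PCT - MAX_V_0PCT) * slider_pct / 100
    return min(curve_v, cap)
```

So with a stock curve point of 1.05v, `pascal_voltage(1.05, 100)` still returns 1.05v; the slider only matters if you've edited the curve to request more.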


----------



## BoredErica

Quote:


> Originally Posted by *alucardis666*
> 
> Appreciate all the replies guys. Idk. Maybe this Seasonic is just a lemon. I did put cpu and gpus at stock and still get shut down within 30 seconds to 3 minutes of running Heaven. I'm worried that I'm degrading my components when it shuts off like it is. :-(
> 
> I will have the psu in hand Thursday and see what happens with that EVGA 1600w. This Seasonic is going back for sure though


What model of Seasonic?

Quote:


> Originally Posted by *Coopiklaani*
> 
> Just realised GDDR5X has such a weird frequency step and performance linearity.
> 
> 
> 
> 
> 
> 
> 
> 
> 
Interesting, I will test this too.


----------



## KraxKill

Quote:


> Originally Posted by *Coopiklaani*
> 
> 
> 
> 
> 
> 
> He did say stock voltage will make your card last for 5 years and maxing out the slider will reduce it down to one year.
> But the truth is, the voltage slider for pascal cards does NOT apply a voltage offset to the card, but increases the maximum voltage allowed, from 1.05v @0% to 1.093v @100%. However, it has zero effect if you are not playing with the curve since the curve flats out after 1.05v if you manage to keep the temperature under 50c.
> 
> most ppl misunderstand the voltage slider on pascal cards, it does not work the same way maxwell one does, applying voltage offset. it doesn't even help to stabilise the oc.


I couldn't care less what he has to say. He's the director of technical *marketing*. He says what he is paid to say.


----------



## jim2point0

So.... I got the FTW3. Lordy is this thing hard to keep cool. Running Unigine Heaven benchmark for a while now and....



Not sure if I just have really bad case airflow or what (never really had a problem with my G1 Gaming 980 Ti). This is only with a slight boost in the memory clock. The power temp is always super high. Fans spinning loud as ****. I don't know what a good temp is for the power/memory at all, but I figured what's hot for the GPU is hot for the memory/power. Don't know. I never had to worry about that before.


----------



## KickAssCop

69 C is GPU temp and you are thinking card is not running cool?
But yes, as always this FTW3 cooler is noisy as sin like previous EVGA ACX coolers or so I have read/seen videos.


----------



## Luckbad

Quote:


> Originally Posted by *jim2point0*
> 
> So.... I got the FTW3. Lordy is this thing hard to keep cool. Running Unigine Heaven benchmark for a while now and....
> 
> 
> 
> Not sure I just have really bad case airflow or what (never really had a problem with my G1 Gaming 980TI). This is only with a slight boost in the memory clock. The power temp is always super high. Fans spinning loud as ****. I don't know what a good temp is for the power\memory at all but I figured what's hot for the GPU is hot for the memory\power. Don't know. I never had to worry about that before.


I have a super quiet PC so I've set my curves to try to keep the fans at or below 60% speed.

That said, I've done a lot of work to optimize airflow in the case as well. I use an NZXT Grid+ fan controller that increases airflow primarily based on GPU temp to further combat temps and sound.

Basically I have 5 intakes, an exhaust, and a closed loop Kraken. All use Cougar fans that usually run at 20% idle (virtually dead silent). They crank up based on temps such that they don't really increase overall volume over and above the video card.

Long story short, at 60% fan speed on my FTW3 I can keep it to around 65C with the graphics card remaining the loudest component in my case.

Longer story shorter, I look forward to either adding a Hybrid shroud to this card (are those coming?), buying a FTW3 Hybrid, buying a Hydro Copper, or buying an EKWB waterblock.

It's not a quiet card because EVGA kept it to 2 slots when many manufacturers make you lose an additional slot.


----------



## jim2point0

Quote:


> Originally Posted by *KickAssCop*
> 
> 69 C is GPU temp and you are thinking card is not running cool?
> But yes, as always this FTW3 cooler is noisy as sin like previous EVGA ACX coolers or so I have read/seen videos.


Look at the power temp. In order to keep that from getting too high, the fan is spinning at almost 100%. Sounds like a jet engine. That's what I mean by not being able to keep it cool.

The GPU and memory seem fine, but the power temp ramps up so that fan goes freakin' nuts. Can anyone tell me what I should try and keep the power/memory at?

Also, why is Precision X such a pain to use? I really miss Afterburner. The UI is just ugly. Especially for the OSD. And if I so much as touch the fan speeds manually, I can't seem to get it to go back to automatic fan control.

Ugh. Gotta go to bed. Guess I'll play with it more tomorrow night. But so far it's rather disappointing.


----------



## TWiST2k

Quote:


> Originally Posted by *jim2point0*
> 
> So.... I got the FTW3. Lordy is this thing hard to keep cool. Running Unigine Heaven benchmark for a while now and....
> 
> 
> 
> Not sure I just have really bad case airflow or what (never really had a problem with my G1 Gaming 980TI). This is only with a slight boost in the memory clock. The power temp is always super high. Fans spinning loud as ****. I don't know what a good temp is for the power\memory at all but I figured what's hot for the GPU is hot for the memory\power. Don't know. I never had to worry about that before.


I have a Corsair carbide air 540 case and my FTW3 basically stays around 60c and the memory and power almost never make it up that high. The only time they did get up that far was when I was pushing more voltage with the XOC BIOS and that got me mostly worse performance at supposedly higher clock speeds. I do have a decently aggressive fan curve, but it looked like you did as well in your pic. I like the individual fan control, but I wish there was a way you could set them all to sync so you could use Afterburner on its own, since Precision X is still pretty atrocious.

I like the FTW3 but I just didn't get the best lottery this time around, and I was in the very first pre-order phase from EVGA. I should have just waited and got it from Amazon like I always do, but the hype got me. You win this round EVGA.

I will snag some pics of my temps and fan curve and post it. I also have some of my notes before I got lazy and was just testing on what I had learned.


----------



## gmpotu

Quote:


> Originally Posted by *Coopiklaani*
> 
> 
> 
> 
> 
> 
> He did say stock voltage will make your card last for 5 years and maxing out the slider will reduce it down to one year.
> But the truth is, the voltage slider for pascal cards does NOT apply a voltage offset to the card, but increases the maximum voltage allowed, from 1.05v @0% to 1.093v @100%. However, it has zero effect if you are not playing with the curve since the curve flats out after 1.05v if you manage to keep the temperature under 50c.
> 
> most ppl misunderstand the voltage slider on pascal cards, it does not work the same way maxwell one does, applying voltage offset. it doesn't even help to stabilise the oc.


I'm not sure that's the case with all AIB models. My Strix OC has the default curve going up to 1.075v before it flattens out, even though the max is 1.062v before upping the slider. Also, my card often will run @ 1.093v to stay stable when I'm running Heaven 2560x1440, even though the line begins at 1.075v.


----------



## jim2point0

I have the NZXT H440 case. All stock fans. I have a Kraken X52 AIO cooling the CPU and doing a fine job, so that's not really an issue.

But the GPU is really warming up the entire case if I let a benchmark run for 10 minutes. Not really sure what I can do about it with the case that I have.

Problem is I ordered this thing from Newegg so I think I'm stuck with it.


----------



## ViRuS2k

Quote:


> Originally Posted by *ViRuS2k*
> 
> Hi guys,, not long now until i do the shuntmod resister style
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Have a question though: I plan on buying Gelid Extreme thermal pads @ 12 W/mK.
> I will have EK waterblocks on. Does anyone have a picture of the placement of the thermal pads? I'm not talking about EK's guide, I'm talking about every single place on a card that draws heat and can be cooled better with thermal pads....
> 
> Also, what thickness do I get, 0.5mm or 1mm, and where exactly do I put the 0.5 or 1mm pads for best efficiency on the block?
> 
> cheers.


Bump ????? anyone know lol


----------



## pez

Quote:


> Originally Posted by *Luckbad*
> 
> I've lost about $200 in the card shuffle.
> 
> I'm only willing to do it because EVGA was too nice about the 1080 FTW situation. I did the $99 upgrade to FTW2, which allowed me to Step Up to a 1080 Ti for $20.
> 
> Since I lost less than I should have transitioning to the Ti, I figured I'd play the lottery a bit.
> 
> From my experience, it's harder to find a card that is stable above 2GHz than this thread makes it appear.


Quote:


> Originally Posted by *Slackaveli*


> not too bad at all. And I would tend to agree. I even had a day 1 FE, which is supposed to be golden, and it didn't hold 2GHz, although it was close. Luckily I pulled a winner on try two.

Ah, that is nice to hear.

Yeah, the only Ti I had that made it above 2GHz was the STRIX OC, and that is after getting the WB treatment. I think I was told that card boosts and sticks to 2088MHz on an EKWB. My FE Ti under a CLC gets to 1987 in some titles and 1974 in others. Ultimately, this is why I'm going back to the TXP. If I wanted to sit at an 'OK' OC, I'd just pop the SC BE back in. The TXP OC'ed past 2GHz even on the stock blower at like 70%, so the block should help it out a bit.
Quote:


> Originally Posted by *jim2point0*
> 
> Look at the power temp. In order to keep that from getting too high, the fan is spinning almost at 100%. Sounds like a jet engine. That's what I mean by not being able to keep it cool.
> 
> The GPU and memory seem fine but the power temp ramps up so that fan goes freakin nuts. Can anyone tell me what I should try to keep the power/memory at?
> 
> Also, why is Precision X such a pain to use? I really miss Afterburner. The UI is just ugly. Especially for the OSD. And if I so much as touch the fan speeds manually, I can't seem to get it to go back to automatic fan control.
> 
> Ugh. Gotta go to bed. Guess I'll play with it more tomorrow night. But so far it's rather disappointing.


EVGA is notorious for fairly aggressive fan profiles out of the box. 69C is a decent temp, but I'm not sure what your fan %/speed is actually peaking at. If the power temp is related to VRMs and Chokes, those can get much hotter without need for alarm. You could probably cut that second asynchronous fan (not sure how it works with the 3-fan ICX coolers) down a bit. However, with a normal 2-fan ICX, I was seeing similar temps at no more than 65% fan speed. It sounds like your case airflow could use some work as well.


----------



## alucardis666

Quote:


> Originally Posted by *Darkwizzie*
> 
> What model of Seasonic?


https://www.amazon.com/gp/product/B01HXYRMME/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1


----------



## kevindd992002

So no free Nvidia game bundles this time around for the EVGA GTX 1080Ti FTW3?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *jim2point0*
> 
> So.... I got the FTW3. Lordy is this thing hard to keep cool. Running Unigine Heaven benchmark for a while now and....
> 
> 
> 
> Not sure I just have really bad case airflow or what (never really had a problem with my G1 Gaming 980TI). This is only with a slight boost in the memory clock. The power temp is always super high. Fans spinning loud as ****. I don't know what a good temp is for the power\memory at all but I figured what's hot for the GPU is hot for the memory\power. Don't know. I never had to worry about that before.


Sounds like you should buy 5 more FTW3 cards, test them all and then only keep the best one and return the other 5.


----------



## Clukos

Quote:


> Originally Posted by *jim2point0*
> 
> So.... I got the FTW3. Lordy is this thing hard to keep cool. Running Unigine Heaven benchmark for a while now and....
> 
> 
> 
> Not sure I just have really bad case airflow or what (never really had a problem with my G1 Gaming 980TI). This is only with a slight boost in the memory clock. The power temp is always super high. Fans spinning loud as ****. I don't know what a good temp is for the power\memory at all but I figured what's hot for the GPU is hot for the memory\power. Don't know. I never had to worry about that before.


Undervolt the card if you want to see lower load temps. Try going for a stable 2000MHz core clock at 1.000v, or 1900MHz at 0.900v load voltage, and see if it improves (it should).
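As a rough intuition for why the undervolt helps: dynamic power scales roughly with f·V², so dropping from ~1.062v to 1.000v at a similar clock cuts power (and therefore heat) noticeably. A quick sketch with illustrative numbers, not measurements from any specific card:

```python
# Back-of-envelope dynamic power estimate: P is roughly proportional to f * V^2.
# The operating points below are illustrative, not measured values.

def relative_power(freq_mhz, volts, ref_freq_mhz, ref_volts):
    """Power of (freq, volts) relative to a reference operating point."""
    return (freq_mhz * volts**2) / (ref_freq_mhz * ref_volts**2)

# Stock-ish point: 2012 MHz @ 1.062 V; undervolted: 2000 MHz @ 1.000 V
ratio = relative_power(2000, 1.000, 2012, 1.062)
print(f"Undervolted point draws ~{ratio:.0%} of reference dynamic power")
```

Leakage and fan behaviour complicate the real picture, but the square term on voltage is why a small voltage drop buys a big temperature win.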
Quote:


> Originally Posted by *KraxKill*
> 
> I've got the same thing going on. Check it out. Being in the wrong mem frequency bin can have a ~2% impact on your score.
> 
> This is Superposition 4K on a 4790K at 5.2Ghz 1.320v


Same here,

+700 = Good bin
+715 = Bad bin
+730 = Good bin
+750 = Bad bin
+765 = Good bin
+780 = Bad bin
+800 = Artifacts









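If you want to map your own good/bad bins, the pattern above (a score dip at certain offsets) is easy to flag from a list of benchmark runs. The scores below are made up for illustration; plug in your own Superposition numbers:

```python
# Sketch: flag memory-offset "bad bins" as offsets whose benchmark score
# dips below both neighbours. Scores here are invented for illustration.

def bad_bins(results):
    """results: list of (mem_offset_mhz, score) tuples, sorted by offset."""
    flagged = []
    for i in range(1, len(results) - 1):
        off, score = results[i]
        if score < results[i - 1][1] and score < results[i + 1][1]:
            flagged.append(off)
    return flagged

runs = [(700, 7740), (715, 7600), (730, 7750), (750, 7590), (765, 7760)]
print(bad_bins(runs))  # → [715, 750]
```

The dips come from GDDR5X error handling kicking in at certain offsets, which is why the score is not monotonic in the offset.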


----------



## Coopiklaani

Quote:


> Originally Posted by *gmpotu*
> 
> I'm not sure that's the case with all AIB models. My Strix OC has the default curve going up to 1.075v before it flattens out, even though the max is 1.062v before upping the slider. Also, my card will often run @ 1.093v to stay stable when I'm running Heaven at 2560x1440, even though the line begins at 1.075v.


The stock curve is temperature dependent; it starts to flatten out @1.05v when the GPU is really cold (under 38c), and it can also curve all the way up to 1.093v if the GPU is hot (over 60c). You can check it by looking at the curve and re-applying the same frequency offset while benchmarking.
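As a toy model of that behaviour (the 38c/1.05v and 60c/1.093v endpoints come from the post above; the intermediate bin boundaries are invented purely for illustration):

```python
# Toy model of the temperature-dependent voltage cap on Pascal:
# flat at 1.050v when cold, stepping up to 1.093v when hot.
# Only the two endpoints come from the post; middle bins are made up.

def voltage_cap(temp_c):
    bins = [(38, 1.050), (45, 1.062), (52, 1.075), (60, 1.081)]
    for limit, volts in bins:
        if temp_c < limit:
            return volts
    return 1.093  # hot GPU: full voltage allowed

print(voltage_cap(35), voltage_cap(65))  # → 1.05 1.093
```

This is why the same frequency offset can land on different voltages run to run: the card re-bins as it warms up.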


----------



## fromthewatt

hi guys,
I have a 1080 Ti FE and want stable clocks at 1800MHz or so (1850MHz would be great).

I have done some benches and monitored my frequencies and temps etc.

All stock it runs ~1750MHz, will clock down to 1650MHz then back up to 1750-1800MHz, stock power limit @ 0.900-1.000v.

+150 is ~1965MHz (will clock down to 1900MHz), 120% power limit @ 1.030-1.060v
+175 is ~1999MHz (will downclock), 120% power limit @ 1.060v

All I want is a constant 1800MHz or 1850MHz on the core with no fluctuations.

What is the best way to achieve this?
I want a flat 1850MHz graph.


----------



## BrainSplatter

Quote:


> Originally Posted by *fromthewatt*
> 
> all i want is a constant 1800hz or 1850hz on the core with no fluctuations


Open the curve editor (Ctrl+F) in Afterburner and use shift-left-click to move the whole curve so that the 0.9v point has a clock of 1800MHz. Save that curve (because sometimes it can get messed up in the next editing steps). All voltage points after 0.9v have to be set to some lower clock than 1800MHz (usually it's enough to move the next 3 or 4 voltage points lower and click Apply, which will straighten the curve beyond 1800MHz). Save the profile, set the power target to 120%, etc.

0.9v for 1800MHz should work for most cards, I guess. For my 2 FE cards, I use 0.9v for 1825MHz, which is stable even when temperatures go above 80C. Depending on silicon lottery, you might also achieve 1850MHz or more at 0.9v. The default FE fan curve should be fine to prevent any temperature-related throttling (it also depends on case airflow, of course). If not, you might need to apply a more aggressive fan curve.


----------



## Benny89

FTW3 is definitely louder and warmer than my STRIX.

I have a Phanteks Enthoo Evolv ATX TG case and it is really not that good an airflow case: 3x 120mm 1400rpm fans intake front, one 140mm fan in the back, and 2x 120mm on the CPU AIO cooler up top.

The STRIX at 55% fan speed (the max fan speed I consider quiet on them) never breaks 68-69C. That is without any repasting.

Now if you watch some videos, at 55% speed the FTW3 is much louder than the STRIX, and I can bet it wouldn't hold 68-69C in my case.

In terms of the ratio of cooling, temps and noise, I think the STRIX is the best AIB right now.


----------



## becks

Can anyone give me the length of the PCB on the FE 1080 Ti please....


----------



## andrews2547

Quote:


> Originally Posted by *becks*
> 
> Anyone can give me the length of the PCB on FE 1080 ti please....


Length please. It says it right there.


----------



## becks

Quote:


> Originally Posted by *andrews2547*
> 
> Length please. It says it right there.


I know...I've done that









Pretty artistic... if you think about it... a naked pic with lines, it's art!


----------



## OZrevhead

Quote:


> Originally Posted by *Luckbad*
> 
> Silicon lottery is what it clocks like.
> 
> It's a completely overbuilt card with a fairly high power limit.
> 
> I've now owned about 6x 1080 Tis and it's the second best of the bunch that I've had. The Zotac Amp Extreme is the best of the various cards I tried.
> 
> But the EVGA has way better customer service, asynchronous fans, interesting temp readouts, much better resale value, and a waterblock from EK coming soonish.
> 
> I was going to sell my FTW3 on eBay and keep the Zotac because the Zotac gets higher core and memory clocks, but what that means in a practical sense is a Superposition average framerate of 76.67 (EVGA) vs 77.33 (Zotac). Or leave everything stock and drop ~4-5 FPS.
> 
> That said, I'm going to either jump on the FTW3 Hybrid or EVGA 1080 Ti Hydro Copper when those come out. I want to quiet things down a bit.


Thanks mate, what is the power limit on the FTW3? What peak clocks did your Amp Extreme and FTW3 get? Shame it's still down to silicon lottery, I was hoping the FTW3 might kick butt ....


----------



## pez

Quote:


> Originally Posted by *Benny89*
> 
> FTW3 is definitely louder and warmer than my STRIX.
> 
> I have a Phanteks Enthoo Evolv ATX TG case and it is really not that good an airflow case: 3x 120mm 1400rpm fans intake front, one 140mm fan in the back, and 2x 120mm on the CPU AIO cooler up top.
> 
> The STRIX at 55% fan speed (the max fan speed I consider quiet on them) never breaks 68-69C. That is without any repasting.
> 
> Now if you watch some videos, at 55% speed the FTW3 is much louder than the STRIX, and I can bet it wouldn't hold 68-69C in my case.
> 
> In terms of the ratio of cooling, temps and noise, I think the STRIX is the best AIB right now.


For your case*

The regular SC runs about the same in my case that's probably a quarter or less the size of your case.


----------



## Dasboogieman

Quote:


> Originally Posted by *OZrevhead*
> 
> Thanks mate, what is the power limit on FTW3? What peak clocks did your Amp Extreme and FTW3 get? Shame its still down to silicon lottery, I was hoping FTW3 might kick butt ....


FTW3 was 358W and the Amp X was like 384W.

Bear in mind these figures don't take into account VRM efficiency or additional component loading, i.e. 358W on the FTW3 might equal 384W on the Amp Extreme as far as power-throttling frequency is concerned.


----------



## mtbiker033

Quote:


> Originally Posted by *becks*
> 
> Anyone can give me the length of the PCB on FE 1080 ti please....


10.5 inches

http://vr-zone.com/articles/nvidia-geforce-gtx-1080-ti-founders-edition-review/123606.html


----------



## becks

Quote:


> Originally Posted by *mtbiker033*
> 
> 10.5 inches
> 
> http://vr-zone.com/articles/nvidia-geforce-gtx-1080-ti-founders-edition-review/123606.html


Thank you


----------



## mtbiker033

Quote:


> Originally Posted by *becks*
> 
> Thank you


you're very welcome!!


----------



## becks

I saw the official sizing and the figures from the various reviews, but wasn't sure if it was with the backplate or not, or for the PCB or the card itself; glad I got confirmation... I will have just enough space...
Cramming a 60mm thick radiator in the front of a Mini-ITX build


----------



## mtbiker033

Quote:


> Originally Posted by *becks*
> 
> Saw the official sizing and those ones from the various reviews but wasn't sure if it is with the plate or no or for the PCB or card itself, glad I got confirmation...will have just enough space...
> Cramming a 60 mm thick radiator in the front in a M-Itx build


oh I understand! would love to see pics of that when you get it all built!


----------



## Benny89

Quote:


> Originally Posted by *pez*
> 
> For your case*
> 
> The regular SC runs about the same in my case that's probably a quarter or less the size of your case.


Tests and reviews show that the FTW3 runs hotter at similar rpm and noise levels than the STRIX, so this is not only a matter of my case. There is a detailed review by Gamers Nexus on this. It is also one of the loudest cards of all the 1080 Ti AIBs; especially above 50% fan speed it gets a lot louder than, for example, the MSI, STRIX or Aorus.

I am not bashing EVGA or anything, to be clear, but reviews and tests clearly show that the 2-slot cooler design was achieved at some cost, and EVGA fans were always louder.

On a side note: if you go and check the EVGA 10-series official forum you will see a lot of users reporting problems with fans and high temps. These are probably isolated cases, but still, for 780 USD people are a little mad there.


----------



## becks

Quote:


> Originally Posted by *mtbiker033*
> 
> oh I understand! would love to see pics of that when you get it all built!


We're looking at probably August-September... that's why I haven't started a build log yet; some personal things got in the way, so there was no point in dragging a build log out for that long.

That plus I just had to RMA the motherboard and the CPU, so everything is tits up at the moment.

I will hold the trigger till I have every part, then start something like a 1-week build log with putting everything together, tests, sleeved cables etc....


----------



## jim2point0

Quote:


> Originally Posted by *SlimJ87D*
> 
> Sounds like you should buy 5 more FTW3 cards, test them all and then only keep the best one and return the other 5.


I don't have that kind of money. Nor do I think I could do that on Newegg. Should have waited to get one on Amazon. Really wish I got the Strix.


----------



## pez

Quote:


> Originally Posted by *Benny89*
> 
> Tests and reviews show that the FTW3 runs hotter at similar rpm and noise levels than the STRIX, so this is not only a matter of my case. There is a detailed review by Gamers Nexus on this. It is also one of the loudest cards of all the 1080 Ti AIBs; especially above 50% fan speed it gets a lot louder than, for example, the MSI, STRIX or Aorus.
> 
> I am not bashing EVGA or anything, to be clear, but reviews and tests clearly show that the 2-slot cooler design was achieved at some cost, and EVGA fans were always louder.
> 
> On a side note: if you go and check the EVGA 10-series official forum you will see a lot of users reporting problems with fans and high temps. These are probably isolated cases, but still, for 780 USD people are a little mad there.


Right. These are reviews done in open air, though. In a consumer system, everything is going to be dependent on a variety of factors. But yes, the cooler is going to be louder because the fans are higher RPM fans than the other coolers.


----------



## superV

guys, need your help please.

Low GPU usage issue at 1080p:
The GPU in BF1 won't go higher than 75% (60-75) with fps around 120/140, BF4 sits at 40% with fps 80/90, CS:GO sits at 30% with fps 200, with temps around 30/33C on custom water cooling.
I did everything I know: power savings to max perf, single monitor, prerendered frames to 1, etc.
Had this thing on Windows 7 and moved to Windows 10, same thing, and I even added another 16GB of RAM.
Did a Firestrike run and the scores look normal: http://www.3dmark.com/3dm/19962346

specs:
GA-Z97X-Gaming G1 WIFI-BK
4790K; at 5GHz same issue. The low GPU usage drops the work on the CPU and I get high CPU usage (75/85%), and if I disable HT it's even worse (cores work at 100%). The GPU simply refuses to work and drops the work onto the CPU.
RAM: 32GB G.Skill Trident Z DDR3 2400MHz (x4); same issue with XMP enabled.
SSD: 960 Pro 512GB
PSU: Corsair AX1500i
GPU: Zotac 1080 Ti FE, boosts to 1873MHz; even overclocked (+mem) same thing.

I even bought a 240Hz monitor


----------



## FuriousReload

To add to the FTW3 temp discussion, I am running mine with Precision XOC and the aggressive fan profile inside of a Corsair 570X, and my temps struggle to go above 58C; I have seen the card hit 59C for a second then drop. That was with running the SuPo stress test for an hour. I do plan on swapping the TIM to Kryonaut, maybe this weekend, but as it is now it still runs really well for me. I do love that I never see any PerfCap in GPU-Z other than Vrel; my FE was sensitive with PerfCap. To me, my FE was fast with a stable 2126MHz and +800 memory, but this FTW3 at 2025MHz is providing a better experience, so far.


----------



## pez

Quote:


> Originally Posted by *superV*
> 
> guys, need your help please.
> 
> Low GPU usage issue at 1080p:
> The GPU in BF1 won't go higher than 75% (60-75) with fps around 120/140, BF4 sits at 40% with fps 80/90, CS:GO sits at 30% with fps 200, with temps around 30/33C on custom water cooling.
> I did everything I know: power savings to max perf, single monitor, prerendered frames to 1, etc.
> Had this thing on Windows 7 and moved to Windows 10, same thing, and I even added another 16GB of RAM.
> Did a Firestrike run and the scores look normal: http://www.3dmark.com/3dm/19962346
> 
> specs:
> GA-Z97X-Gaming G1 WIFI-BK
> 4790K; at 5GHz same issue. The low GPU usage drops the work on the CPU and I get high CPU usage (75/85%), and if I disable HT it's even worse (cores work at 100%). The GPU simply refuses to work and drops the work onto the CPU.
> RAM: 32GB G.Skill Trident Z DDR3 2400MHz (x4); same issue with XMP enabled.
> SSD: 960 Pro 512GB
> PSU: Corsair AX1500i
> GPU: Zotac 1080 Ti FE, boosts to 1873MHz; even overclocked (+mem) same thing.
> 
> I even bought a 240Hz monitor


You have no caps (FPS) set, correct?

I kinda understand CS:GO doing that, but I'm not sure about BF1. Settings are maxed out on BF1?


----------



## Coopiklaani

Quote:


> Originally Posted by *superV*
> 
> guys, need your help please.
> 
> Low GPU usage issue at 1080p:
> The GPU in BF1 won't go higher than 75% (60-75) with fps around 120/140, BF4 sits at 40% with fps 80/90, CS:GO sits at 30% with fps 200, with temps around 30/33C on custom water cooling.
> I did everything I know: power savings to max perf, single monitor, prerendered frames to 1, etc.
> Had this thing on Windows 7 and moved to Windows 10, same thing, and I even added another 16GB of RAM.
> Did a Firestrike run and the scores look normal: http://www.3dmark.com/3dm/19962346
> 
> specs:
> GA-Z97X-Gaming G1 WIFI-BK
> 4790K; at 5GHz same issue. The low GPU usage drops the work on the CPU and I get high CPU usage (75/85%), and if I disable HT it's even worse (cores work at 100%). The GPU simply refuses to work and drops the work onto the CPU.
> RAM: 32GB G.Skill Trident Z DDR3 2400MHz (x4); same issue with XMP enabled.
> SSD: 960 Pro 512GB
> PSU: Corsair AX1500i
> GPU: Zotac 1080 Ti FE, boosts to 1873MHz; even overclocked (+mem) same thing.
> 
> I even bought a 240Hz monitor


Your CPU may be holding you back for 1080p gaming. You'll probably need a faster CPU and faster RAM to unchain your GTX 1080 Ti beast








http://www.3dmark.com/fs/12654270


----------



## superV

Yep, everything on ultra, no caps.

@Coopiklaani let's be serious, it's a 4790K @ 5GHz.
The GPU refuses to work and drops the load onto the CPU.


----------



## KraxKill

Quote:


> Originally Posted by *superV*
> 
> yep,everything on ultra,no caps.


Try supersampling set at 150% and see if your GPU usage goes up without an fps hit. You may just be CPU bound.

I hit 24k in Firestrike on my 4790K @ 5.2 and am CPU bound at 1080p. The core just can't direct the GPU to draw that many frames. You'd need to be running 10GHz to be at 240fps. Now, for a game like DOTA, an older title, or a racing game etc., that 240Hz really helps.

That's why I went with a 144Hz 1440p monitor until 4K 144Hz is released along with Volta and Skylake-X.
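The CPU-bound situation above is just frame-budget arithmetic: whichever of the CPU or GPU needs longer per frame sets the fps, and the other one idles. The per-frame times below are made-up illustrative numbers, not measurements:

```python
# Frame-budget arithmetic behind "CPU bound at 1080p": when the CPU needs
# longer per frame than the GPU, fps is capped by the CPU and the GPU idles.
# Frame times below are illustrative, not measured.

def fps_and_gpu_usage(cpu_ms_per_frame, gpu_ms_per_frame):
    frame_ms = max(cpu_ms_per_frame, gpu_ms_per_frame)
    fps = 1000.0 / frame_ms
    gpu_usage = gpu_ms_per_frame / frame_ms  # fraction of the frame GPU is busy
    return fps, gpu_usage

fps, usage = fps_and_gpu_usage(cpu_ms_per_frame=7.1, gpu_ms_per_frame=5.3)
print(f"~{fps:.0f} fps, GPU busy ~{usage:.0%}")
```

Note how ~7ms of CPU work per frame caps you around 140fps with the GPU sitting near 75% busy, which is roughly the BF1 symptom reported above; raising the resolution scale grows the GPU time until it becomes the limiter and usage climbs to ~100%.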


----------



## mtbiker033

Quote:


> Originally Posted by *KraxKill*
> 
> Try super sampling set at 150% and see if your gpu usage goes up without an fps hit. May just be CPU bound.
> 
> I hit 24k in Firestrike on my 4790k @ 5.2 and am CPU bound at 1080p. The core just can't direct the gpu to draw that many frames. You'd need to be running 10ghz to be at 240fps. Now a game like DOTA or a racing game etc that 240hz really helps.
> 
> That's why I went with a 144hz 1440p until 4k 144hz is released with Volta and Skylake X


Yes, I was going to suggest the same thing: turn up the resolution scale and put that GPU to work. A 1080 Ti is waaaay overkill for 1080p


----------



## Dasboogieman

Quote:


> Originally Posted by *superV*
> 
> guys, need your help please.
> 
> Low GPU usage issue at 1080p:
> The GPU in BF1 won't go higher than 75% (60-75) with fps around 120/140, BF4 sits at 40% with fps 80/90, CS:GO sits at 30% with fps 200, with temps around 30/33C on custom water cooling.
> I did everything I know: power savings to max perf, single monitor, prerendered frames to 1, etc.
> Had this thing on Windows 7 and moved to Windows 10, same thing, and I even added another 16GB of RAM.
> Did a Firestrike run and the scores look normal: http://www.3dmark.com/3dm/19962346
> 
> specs:
> GA-Z97X-Gaming G1 WIFI-BK
> 4790K; at 5GHz same issue. The low GPU usage drops the work on the CPU and I get high CPU usage (75/85%), and if I disable HT it's even worse (cores work at 100%). The GPU simply refuses to work and drops the work onto the CPU.
> RAM: 32GB G.Skill Trident Z DDR3 2400MHz (x4); same issue with XMP enabled.
> SSD: 960 Pro 512GB
> PSU: Corsair AX1500i
> GPU: Zotac 1080 Ti FE, boosts to 1873MHz; even overclocked (+mem) same thing.
> 
> I even bought a 240Hz monitor


This will sound crazy, but sell your 4790K and get a 5775C. It could easily get you another 10-20 frames.
What you are experiencing is that the CPU cannot push the workload to the GPU fast enough; you need to improve CPU throughput. IIRC BF1 is fully multithreaded, so a HEDT processor could theoretically improve your frames, but that's a full platform upgrade.
The 5775C upgrade is inexpensive (net cost of $20-$50) and it's simply drop-in.


----------



## Benny89

What are people's experiences so far with the 1080 Ti in SLI? Good/bad? How is the scaling? Do you have to play with G-Sync off? Thanks for all the answers.


----------



## superV

Quote:


> Originally Posted by *KraxKill*
> 
> Try super sampling set at 150% and see if your gpu usage goes up without an fps hit. May just be CPU bound.


ye it does
Quote:


> Originally Posted by *mtbiker033*
> 
> yes I was going to suggest the same thing, turn up the resolution scale and put that gpu to work, 1080ti is waaaay overkill for 1080p



Yes, the GPU usage is higher with DSR, but it should work at 1080p too.
Quote:


> Originally Posted by *Dasboogieman*
> 
> This will sound crazy but sell your 4790K and get a 5775c. Could easily get you another 10-20 frames.
> What you are experiencing is that the CPU cannot push the workload to the GPU fast enough. You need to improve CPU throughput. IIRC BF1 is fully multithreaded so a HEDT processor can theoretically improve your frames but thats a full platform upgrade.
> The 5775c upgrade is inexpensive (net cost of $20-$50) and its simply drop-in.


The CPU is fine at 5GHz; a 5775C won't clock that high and could give even worse performance. The GPU simply doesn't want to work well in games and drops the workload onto the CPU.


----------



## Dasboogieman

Quote:


> Originally Posted by *Dasboogieman*
> 
> This will sound crazy but sell your 4790K and get a 5775c. Could easily get you another 10-20 frames.


Quote:


> Originally Posted by *superV*
> 
> ye it does
> 
> yes the gpu usage is higher with dsr.but should work at 1080p too.
> the cpu is fine at 5ghz,with a 5775c won't clock that high and could get even worse performance .it simply doesn't want to work well in games and drops the workload on the cpu.


The 5775c doesn't get all its performance from raw clocks alone. It already matches a stock Devil's canyon at stock clocks because it gets the increased IPC from the Crystalwell cache. IIRC some testing on OCN showed a 4.5ghz 5775c can match or situationally exceed a 7700k at 5ghz at raw frame throughput.


----------



## Luckbad

Quote:


> Originally Posted by *jim2point0*
> 
> I don't have that kind of money. Nor do I think I could do that on Newegg. Should have waited to get one on Amazon. Really wish I got the Strix.


If it makes you feel better, I had a STRIX that couldn't break 1900 MHz and crashed at its stock OC when benching. I also really hate the sound profile of the STRIX. It's quiet but annoyingly pitched.

The Gigabyte is probably about as quiet and less annoying.

The best ratio of sound to performance I've had is the Zotac Amp Extreme. It can get very loud if you let it, but you can change it to be pretty quiet.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *jim2point0*
> 
> I don't have that kind of money. Nor do I think I could do that on Newegg. Should have waited to get one on Amazon. Really wish I got the Strix.


I was kidding.

If you just Re-TIM your card but use the spread method to apply the thinnest layer possible, you'll get much better temperatures.


----------



## KraxKill

Quote:


> Originally Posted by *Dasboogieman*
> 
> The 5775c doesn't get all its performance from raw clocks alone. It already matches a stock Devil's canyon at stock clocks because it gets the increased IPC from the Crystalwell cache. IIRC some testing on OCN showed a 4.5ghz 5775c can match or situationally exceed a 7700k at 5ghz.


He's on a 4790K at 5.0; the discrete-card gaming performance of a 7700K at the same clocks has not been shown to be much of an improvement. In fact, AnandTech showed a performance deficit in some instances on the 6700K.

http://www.anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation/10

This link is particularly telling because they test all the chips at a fixed clock.

And the 4790k is outperforming the 7700k here in some tests despite being at a lower frequency. The 4790k, 6700k, 7700k and the 5775c are all within 5% of each other. So there is no way he gains much if at all from switching.

http://www.anandtech.com/show/10968/the-intel-core-i7-7700k-91w-review-the-new-stock-performance-champion/6

If he goes through the trouble of getting a 5775C and by chance is able to get it to run at 4.5 (I would say 4.4 is more likely), then he'll get near the same fps as he would on his 4790K at 5.0.

There is no way I'd ditch my 4790k at 5.2ghz in the hopes of picking up a frame or two on the 5775C.

Waiting on Skylake X.


----------



## Benny89

Quote:


> Originally Posted by *superV*
> 
> ye it does
> 
> yes the gpu usage is higher with dsr.but should work at 1080p too.
> the cpu is fine at 5ghz,with a 5775c won't clock that high and could get even worse performance .it simply doesn't want to work well in games and drops the workload on the cpu.


Mate, I know that you think that; I was there 2 months ago. A 5775C OC'd to 4.2GHz is better than or equal to a 5.0GHz 7700K in games. Your 5.0 4790K will get destroyed by a 5775C.

Believe me, because I sold my 4790K @ 5.0GHz 2 months ago and got a 5775C, and I have a MASSIVE improvement in games.

I sold it, added 50 bucks, and now I enjoy even better performance than a 5.0GHz 7700K in games, which saves me from upgrading any further









For more info, please visit our Owners Thread: http://www.overclock.net/t/1583537/intel-broadwell-c-ownership-club

Here, catch:


----------



## Benny89

Quote:


> Originally Posted by *Dasboogieman*
> 
> The 5775c doesn't get all its performance from raw clocks alone. It already matches a stock Devil's canyon at stock clocks because it gets the increased IPC from the Crystalwell cache. IIRC some testing on OCN showed a 4.5ghz 5775c can match or situationally exceed a 7700k at 5ghz at raw frame throughput.


That is 4.2 to match and be better than 5.0Ghz 7700k


----------



## Dasboogieman

Quote:


> Originally Posted by *Benny89*
> 
> That is 4.2 to match and be better than 5.0Ghz 7700k


I knew I saw it somewhere. Yeah, 4.2 sounds about right; 4.5 should be doable with a delid + WC. Was there a reason why the 5775C couldn't clock very high? Was the Crystalwell cache tied to the core clock?
That being said, what we're discussing is peanuts considering he wants to push 200+ FPS at 1080p in BF1 specifically. Maybe only more cores can do that, but the 5775C is more consistent across more games.


----------



## superV

The CPU is not the issue. The issue is the GPU, which drops the workload onto the CPU.


----------



## dunbheagan

Quote:


> Originally Posted by *Coopiklaani*
> 
> 
> 
> 
> 
> 
> He did say stock voltage will make your card last for 5 years and maxing out the slider will reduce it down to one year.
> But the truth is, the voltage slider for Pascal cards does NOT apply a voltage offset to the card; it increases the maximum voltage allowed, from 1.05v @0% to 1.093v @100%. However, it has zero effect if you are not playing with the curve, since the curve flattens out after 1.05v if you manage to keep the temperature under 50c.
> 
> Most people misunderstand the voltage slider on Pascal cards; it does not work the same way the Maxwell one does (applying a voltage offset). It doesn't even help to stabilise the OC.


That is not completely right. If you turn the voltage slider above 100%, the card will go to 1.075V at higher temps instead of downclocking, even without manual curve editing.

edit: sorry, I missed your below-50-degrees remark; I think you are right then


----------



## Benny89

Quote:


> Originally Posted by *Dasboogieman*
> 
> I knew I saw it somewhere. Yeah, 4.2 sounds about right, 4.5 should be doable with a delid + WC. Was there a reason why the 5775c couldn't clock very high? was the Crystalwell cache tied to the core clock?
> That being said, what we're discussing is peanuts considering if he wants to push 200+ FPS 1080p in BF1 specifically. Maybe only more cores can do that but the 5775c is more consistent across more games.


Well, we are talking here about a 50-80 buck upgrade to 7700K 5.0GHz performance. And I haven't heard people complaining about a bottleneck on a 5.0 7700K.

I think it's a much better route for him than upgrading to a 6/8 core and changing the mobo plus RAM, or buying a less powerful GPU to reduce the bottleneck.

Just my suggestion, but I agree: a 1080 Ti at 1080p is overkill


----------



## Luckbad

NewEgg has the MSI Sea Hawk with an EK water block:

https://m.newegg.com/products/N82E16814137144


----------



## Luckbad

Quote:


> Originally Posted by *OZrevhead*
> 
> Thanks mate, what is the power limit on FTW3? What peak clocks did your Amp Extreme and FTW3 get? Shame its still down to silicon lottery, I was hoping FTW3 might kick butt ....


Zotac is 384 W and FTW3 is 358 W.

I can get the FTW3 to 2050 but it doesn't hold all the time. The Zotac holds at 2062 without fluctuation. With a bit of voltage it gains a bin, but this particular Zotac crashes after 2100.

I had a Zotac that could do 2150 or so but had pretty crappy ram. Couldn't get close to 12GHz but my current Zotac easily exceeds that.

My FTW3 can get 12GHz on memory.

If you want to buy one of them off me let me know. Going to list one on eBay in a couple days.


----------



## Agent-A01

Quote:


> Originally Posted by *Luckbad*
> 
> NewEgg has the MSI Sea Hawk with an EK water block:
> 
> https://m.newegg.com/products/N82E16814137144


Seems like Newegg is the only one carrying this..

Was hoping to find it on BHPhotovideo or somewhere else, because with tax + shipping it's near $920.


----------



## superV

Quote:


> Originally Posted by *Benny89*
> 
> Well, we are talking here about 50-80 bucks upgrade to 7700k 5.0Ghz performance. And I didn't hear people complaining about bottleneck on 5.0 7700k.
> 
> I think it much better route for him than upgrading to 6/8 core and changing MOBO plus RAM or buying less powerfull GPU to turn throttling down.
> 
> Just my suggestion, but I agree- 1080 ti on 1080p is overkill


I think I found the problem: I disabled the C-states and Turbo Boost and increased the VRM switching frequency, and now it runs at 92/95/97% GPU usage. But as soon as I lower the graphics settings to get more FPS, the usage drops back to 75-85%.

Haha, a 1080 Ti is "overkill" for 1080p.

Nice one, m8. It can't even max out a 240 Hz monitor, and yet it's overkill. I think this is NVIDIA's gimmick with 1080p to push people to higher resolutions, because as soon as I use DSR the usage jumps right up to 97%.


----------



## Delbane

I seem to be having an issue with voltage on my Zotac AMP! Extreme 1080 Ti.
When I turn the voltage up to 100% in Afterburner, the card gets unstable and keeps crashing. Does anyone know why this is happening?
(I haven't even touched any overclocking settings yet; all I did was bump the power limit to 120%.)

Thanks


----------



## Luckbad

Quote:


> Originally Posted by *Delbane*
> 
> I seem to be having an issue with voltage on my Zotac AMP ! EXTREME 1080 ti,
> when I turn up the voltage to 100% in afterburner, the card gets unstable and keep crashing, anyone would know why this is happening ?
> (I didn't even play with any overclocking settings yet) all I did is really just bump the power limit to 120%.
> 
> Thanks


What power supply do you have? The card can draw quite a bit of power, and a PSU that isn't big or stable enough will struggle.

If your PSU is adequate, have you tried Display Driver Uninstaller and a clean driver reinstall? Maybe reseat the card as well.


----------



## KraxKill

Quote:


> Originally Posted by *superV*
> 
> i think i found the problem.i disabled the c-states and turbo boost,increased the vrm switching frequency and now it works 92/95/97%.
> but as soon i lower the graphics to get more fps,the gpu usage lowers to 75/85%
> 
> haha 1080ti is overkill for 1080p
> 
> 
> 
> 
> 
> 
> 
> nice one m8.can't max out a 240hz monitor and is overkill.
> i think this is nvidia's gimmick regarding 1080p to make people jump to higher res,cuz as soon i use dsr it jumps quick to 97%.


You lack an understanding of how your CPU is limiting frame output.


----------



## mtbiker033

Quote:


> Originally Posted by *superV*
> 
> i think i found the problem.i disabled the c-states and turbo boost,increased the vrm switching frequency and now it works 92/95/97%.
> but as soon i lower the graphics to get more fps,the gpu usage lowers to 75/85%
> 
> haha 1080ti is overkill for 1080p
> 
> 
> 
> 
> 
> 
> 
> nice one m8.can't max out a 240hz monitor and is overkill.
> i think this is nvidia's gimmick regarding 1080p to make people jump to higher res,cuz as soon i use dsr it jumps quick to 97%.


Oh, 240 Hz, wow, I missed that part. I only have 144 Hz experience, and that's a different animal for sure; apparently a high refresh rate doesn't push the GPU the way a higher resolution does.


----------



## Dasboogieman

Quote:


> Originally Posted by *superV*
> 
> the cpu is not the issue.the issue is the gpu that drops the workload on the cpu.


Which is the result of one of three scenarios:
1. The CPU is struggling to supply work to the GPU.
2. Driver overhead is preventing full GPU utilization.
3. Poor GPU work scheduling / poor BF1 code.

All except number 3 can be addressed with more CPU power. Ergo, the CPU is likely the problem, since the owner is not satisfied with the number of frames he is getting.
Quote:


> Originally Posted by *superV*
> 
> i think i found the problem.i disabled the c-states and turbo boost,increased the vrm switching frequency and now it works 92/95/97%.
> but as soon i lower the graphics to get more fps,the gpu usage lowers to 75/85%
> 
> haha 1080ti is overkill for 1080p
> 
> 
> 
> 
> 
> 
> 
> nice one m8.can't max out a 240hz monitor and is overkill.
> i think this is nvidia's gimmick regarding 1080p to make people jump to higher res,cuz as soon i use dsr it jumps quick to 97%.


It's not a gimmick; you're misunderstanding how GPUs work.
Fundamentally, this is how a modern GPU works: the CPU prepares the workload for the GPU to process, and the GPU then outputs the result to the monitor.
It's a perpetual balancing act between CPU and GPU bottlenecking. A light graphics workload shifts the balance to the CPU; a heavy graphics workload shifts it to the GPU. Neither component is independent of the other as far as graphics workload is concerned.

With a 240 FPS target at 1080p, the workload is extremely light for the GPU, which can generate frames faster than the CPU can supply them. Thus, the speed of your CPU limits the output of your GPU. Likewise, if you set it to render at 4K, the workload shifts back to the GPU, which means the CPU waits on the GPU to finish before supplying more frames.

Which brings us back to the final conclusion: if you want more frames than you already have, you need a stronger CPU.
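The pipeline described above can be sketched as a toy model. To be clear, the millisecond figures below are made-up illustrations, not measurements from BF1 or anyone's actual system:

```python
def fps(cpu_ms, gpu_ms):
    """Every frame passes through both stages, so the slower stage sets the pace."""
    return 1000.0 / max(cpu_ms, gpu_ms)

def gpu_usage(cpu_ms, gpu_ms):
    """Fraction of each frame interval the GPU actually spends rendering."""
    return gpu_ms / max(cpu_ms, gpu_ms)

# 1080p: the GPU finishes each frame faster than the CPU can prepare the next one,
# so the game is CPU-bound and GPU usage sits well below 100%.
print(fps(cpu_ms=7.0, gpu_ms=3.0), gpu_usage(cpu_ms=7.0, gpu_ms=3.0))

# 4K (or DSR): per-frame GPU cost grows while the CPU cost stays the same,
# so the balance flips, FPS drops, and the GPU pegs at 100%.
print(fps(cpu_ms=7.0, gpu_ms=14.0), gpu_usage(cpu_ms=7.0, gpu_ms=14.0))
```

In the first case only a faster CPU raises the frame rate; lowering graphics settings just shrinks the GPU's share of the frame time and drops reported GPU usage even further, which matches the behavior being described in this thread.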


----------



## s1rrah

Another benchmark to aim our 1080 Tis at... LOL:

The Final Fantasy XIV: Stormblood benchmark

It's a 1.9 GB download!

http://na.finalfantasyxiv.com/benchmark/download/


----------



## Delbane

Quote:


> Originally Posted by *Delbane*
> 
> I seem to be having an issue with voltage on my Zotac AMP ! EXTREME 1080 ti,
> when I turn up the voltage to 100% in afterburner, the card gets unstable and keep crashing, anyone would know why this is happening ?
> (I didn't even play with any overclocking settings yet) all I did is really just bump the power limit to 120%.
> 
> Thanks


Quote:


> Originally Posted by *Luckbad*
> 
> What power supply do you have? It can draw quite a bit of power if it's not big or stable enough.
> 
> If your PSU is adequate, have you tried using Display Driver Uninstaller and reinstalling it? Maybe reseat the card.


I have a Corsair HX850; do you think it could be the issue? I have tried uninstalling and reinstalling my NVIDIA drivers and I'm still getting the same problem. Should I be worried about my card, or could something be wrong with my system?

Thanks


----------



## KraxKill

Quote:


> Originally Posted by *Delbane*
> 
> I have a corsair HX850 do you think it could be an issue ? I have tried to uninstall and reinstall my Nvidia drivers and i'm still getting the same issue, should I be worried about my card or something is wrong with my system ?
> 
> Thanks


First of all, where do you experience the crashing? You can run a simple test: loop Valley or Heaven in windowed mode with the card at its limit while running Prime95 at the same time. If the system crashes, or the GPU crashes as soon as Prime95 starts, it's likely your power supply.

Where is your CPU clocked?

Do you have some wattage to spare to test the power supply? I would find the TDP setting at which the card runs stable, then underclock/undervolt the CPU and see whether the card gains a few more percent of TDP headroom without crashing. Gaining TDP room won't definitively prove the power supply is at fault, but it would strongly suggest it.

One easy way to test this: if you have C-states enabled, go into the Windows power settings and set the "Balanced" profile's maximum processor state to 99% or lower. That disables Turbo Boost while the "Balanced" profile is selected, and you can take it down further if you want to reclaim even more wattage. You can also manually underclock the CPU in the BIOS, but that's a bit more invasive for a simple test.

If you run HWiNFO64 and note your CPU speed and package power during Prime95, you can apply the same CPU load during a Heaven loop and test for stability. If your rig takes a dive, you pretty much have to try another power supply to eliminate it as the cause.
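To put rough numbers on the wattage question, here is a back-of-the-envelope budget. Every figure below is an illustrative assumption, not a measurement of anyone's actual system:

```python
# Rough DC load estimate for a single-GPU overclocked rig.
# All wattages are assumptions for illustration only.
components = {
    "GPU at 120% power limit": 250 * 1.2,  # ~250 W board power * 1.2
    "overclocked quad-core CPU": 130,
    "motherboard, RAM, SSD, fans, pump": 60,
}

load = sum(components.values())
psu_rating = 850  # e.g. a Corsair HX850

print(f"estimated load: {load:.0f} W")
print(f"headroom:       {psu_rating - load:.0f} W")
print(f"PSU loading:    {load / psu_rating:.0%}")
```

On paper a quality 850 W unit has plenty of headroom for a build like this, which is why, when such a system only crashes at raised GPU voltage, brief transient power spikes or the card itself are more plausible suspects than total sustained wattage.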


----------



## OneCosmic

Quote:


> Originally Posted by *Clukos*
> 
> Undervolt the card if you want to see lower load temps. Try going for a stable 2000 core clock with 1.000mv or 1900 with 0.900mv load voltage and see if it improves (it should).
> Same here,
> 
> +700 = Good bin
> +715 = Bad bin
> +730 = Good bin
> +750 = Bad bin
> +765 = Good bin
> +780 = Bad bin
> +800 = Artifacts
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ...


I can't confirm this theory: no good or bad VRAM bins, and no sudden dips or peaks in Superposition 4K.


----------



## KraxKill

Quote:


> Originally Posted by *OneCosmic*
> 
> I can't confirm this theory, no good or bad VRAM bins, no sudden dips or tops in SuPo 4K.


How are you testing, and where is your data?


----------



## Delbane

Quote:


> Originally Posted by *KraxKill*
> 
> First of all, where do you experience crashing? You can do a simple test by running your card at the limit with Valley or Heaven in windowed loop mode and run Prime95 at the same time. If your system crashes or if your GPU crashes as you start Prime, it's likely your power supply.
> 
> Where is your CPU clocked at?
> 
> Do you have some wattage to spare to test your power supply? I would see at what TDP setting the card ran stable, and under clock/volt my CPU to determine if I gain a few more % TDP on the card without crashing. If you do, it's likely your power supply.
> 
> This will not eliminate your power supply as the cause, but it will suggest it may be if you gain more TDP room.
> 
> One easy way to test this, is
> 
> If you have C-States enabled, you can try this. Go into windows power settings and modify the "balanced" profile to 99% CPU speed or lower. This will disable turbo boost with the "balanced" profile selected, you can take it down further if you wish to gain even more wattage back. You can also manually under clock your CPU via the BIOS if you wish but its a bit more invasive for a simple test.
> 
> If you get HWinfo64 and note your current CPU speed and TDP during Prime95 you can use it to load your CPU during a Heaven loop to test for stability. If your rig takes a dive, you pretty much have to try another Power Supply to illiminate it as the cause.


I'm using a 4770K at 4.4 GHz, 1.26 V.
I just tried running Prime95 and Heaven 4.0 at the same time without any issues.
I wouldn't think a Corsair HX850 would have trouble delivering power for that?

When I increase the voltage on the GPU, I get a crash in Heaven 4.0 after only a few seconds.
I had GTX 670s in SLI before, and also a GTX 970 for a few months, and never had any power issues.
Unfortunately I don't have a power meter on hand to check the draw at the wall, but I'm guessing somewhere around 550 W with everything at full load.
Is anyone else experiencing the same issue when playing with voltage on a Zotac 1080 Ti AMP! Extreme?


----------



## KraxKill

Quote:


> Originally Posted by *Delbane*
> 
> i'm using a 4770k at 4.4GHz 1.26v
> I just tried running prime95 with heaven 4.0 without issues at the same time,
> I don't think a corsair HX850 would be having issues putting power to that ?
> 
> When I increase the voltage on the GPU I get a crash in Heaven 4.0 after only a few seconds,
> I had a SLI of gtx 670 before and I also had a gtx 970 for a few months and never had any power issues,
> I unfortunately don't have a multimeter on hands to check the power usage of my PSU but i'm guessing somewhere around 550W when everything is full load at 100%.
> Anyone else is experiencing the same issue when playing with voltage on a Zotac 1080 ti AMP! Extreme ?


That sucks, man. If it's not the power supply and not the drivers (try Display Driver Uninstaller), there's nowhere else to point the finger but the card. Is it happening anywhere besides Heaven?


----------



## Delbane

Quote:


> Originally Posted by *KraxKill*
> 
> That sucks man. If it's not the power supply, not the drivers (Try the driver uninstaller) there is nowhere else to really point the finger at but the card. Is it happening elsewhere besides Heaven?


Yeah, it's happening pretty much everywhere; 3DMark crashes after a few seconds too. As long as I don't touch the voltage, the card is fine.
I'm wondering if anyone else with the same card is having this issue...

On another note, I noticed that once my card gets past 50 degrees Celsius, the fan jumps to 100% for a few seconds to keep it under 50°C, which is very annoying. Is anyone else having the same problem, lol? Yes, I know... many problems... hahaha


----------



## Slackaveli

Quote:


> Originally Posted by *superV*
> 
> guys need your help please.
> 
> low gpu usage issue at 1080p:
> gpu on bf1 wont go higher that 75% 60/75 with fps arounf 120/140,bf4 sits at 40% with fps 80/90,cs:go sits at 30% with fps 200. with temps around 30/33 custom water cooling.
> did everything i knowower savings to max perf,single monitor,prerendered frames to 1 etc etc.
> had this thing on windows 7 and moved to windows 10 and same thing and even added another 16gb of ram.
> did a firestricke run and scores looks normal:http://www.3dmark.com/3dm/19962346
> 
> specs:
> GA-Z97X-Gaming G1 WIFI-BK
> 4790k,at 5ghz same issue,the low gpu usage drops the work on the cpu and i get a high cpu usage 75/85%, and if i disable ht it's even worse,cores it will work at 100%,its simply gpu refuses to work and drops the work on the cpu.
> ram:32gb-g skill trident z ddr3 16 gigs of 2400mhz x 4 same issue with xmp enabled.
> ssd: 960 pro 512gb
> psu:corsair ax1500i
> gpu:zotac 1080ti Fe boosts to 1873 and even overclocked + mem same thing.
> 
> i even bought a 240hz monitor


again... get the 5775-c


----------



## Slackaveli

Quote:


> Originally Posted by *superV*
> 
> ye it does
> 
> yes the gpu usage is higher with dsr.but should work at 1080p too.
> the cpu is fine at 5ghz,with a 5775c won't clock that high and could get even worse performance .it simply doesn't want to work well in games and drops the workload on the cpu.










LOL! Worse performance? naw, man...


----------



## superV

Quote:


> Originally Posted by *Dasboogieman*
> 
> Which is the result of one of three scenarios:
> 1. The CPU is struggling to supply work to the GPU
> 2: Driver overhead is preventing full GPU utilization.
> 3: Poor GPU work scheduling/poor BF1 code.
> 
> All except number 3 can be addressed with more CPU power. Ergo the CPU is likely the problem as the owner is not satisfied with the number of Frames he is getting.
> Its not a gimmick, you are misunderstanding how GPUs work.
> Fundamentally, this how a modern GPU works. The CPU prepares the workload for the GPU to process, the GPU then outputs the result to the monitor.
> Its a perpetual balancing curve between CPU-GPU bottlenecking. Light workload shifts the balance to the CPU, heavy graphics workload shifts the balance the GPU. Neither component is independent of each other as far as graphics workload is concerned.
> 
> With a 240FPS cap at 1080p, your workload is extremely lightweight for the GPU as it can generate frames faster than the CPU can supply it. Thus, the speed of your CPU limits the output of your GPU. Likewise if you set it to render 4K, the workload shifts back to the GPU which means the CPU waits on the GPU to finish before supplying more frames.
> 
> Which brings us back to the final conclusion, if you desire more frames then what you already have, you need a stronger CPU.


I'll play around more with the CPU power-saving settings and see what I can get out of it.
I'll also ask my friend to underclock his i7 7700K and compare the results.


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> He's on a 4790k at 5.0 the descrete card gaming performance of a 7700k at the the same clocks has not shown to be much of an improvement. In fact anandtech showed a performance deficit in some instances on the 6700k.
> 
> http://www.anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation/10
> 
> This link is particularly telling because they test all the chips at a fixed clock.
> 
> And the 4790k is outperforming the 7700k here in some tests despite being at a lower frequency. The 4790k, 6700k, 7700k and the 5775c are all within 5% of each other. So there is no way he gains much if at all from switching.
> 
> http://www.anandtech.com/show/10968/the-intel-core-i7-7700k-91w-review-the-new-stock-performance-champion/6
> 
> If he goes through the trouble of geting a 5775c and by chance is able to get it to run at 4.5. (I would say 4.4 is more likely) then he'll get near the same fps as he would on his 4790k at 5.0.
> 
> There is no way I'd ditch my 4790k at 5.2ghz in the hopes of picking up a frame or two on the 5775C.
> 
> Waiting on Skylake X.


you guys are in denial. it coo tho


----------



## KraxKill

Quote:


> Originally Posted by *Slackaveli*
> 
> you guys are in denial. it coo tho


Show me some benchies...


----------



## Slackaveli

Quote:


> Originally Posted by *superV*
> 
> i will play more with the cpu power savings and see what i can get out of it.
> well,i'll ask my friend to under clock his i7 7700k and see the results.


Dude, you have a Z97 board and aren't even looking at a 5775C? You are REALLY doing yourself a disservice, ESPECIALLY at 1080p. You say you want frames, but 10-30% extra FPS at 1080p for $50 doesn't interest you? I guess I'm done advising you, then.

Here is a bunch of them; the linked page shows a game with a 25% improvement...

He still won't get 240 Hz, though. Nobody does, except in CS:GO.
Quote:


> Originally Posted by *KraxKill*
> 
> Show me some benchies...


https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,38

here is a BF1 breakdown at 160 FPS, and this is NOT on a 1080 Ti, just a 1080:
https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,30

and finally, more BF1. Here the 5775C's MINIMUMS are 5 FPS higher than the MAXIMUMS on a 4790K... https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,10

and another; the theme here is that the 5775C's MINIMUMS beat the 4790K's MAXIMUMS... https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,17

and these are done at 3.3 GHz, man. 3.3! Mine OCs to 4.3 GHz easily...


----------



## IMI4tth3w

Here are some videos I found on YouTube. He's at 1440p, but it's a 5775C + 1080 Ti, and his videos show GPU/CPU usage.


----------



## KraxKill

Quote:


> Originally Posted by *Slackaveli*
> 
> dude. you have a z-97 board and arent even looking at a 5775-c? you are REALLY doing yourself a disservice. ESPECIALLY at 1080p. You say you want frames but 10-30 % extra fps at 1080p for $50 doesnt interest you? I guess I will be done advising you then.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here is a bunch of them. the linked page shows a game with 25% improvement ...
> 
> He still wont get 240Hz, tho. Nobody does except in cs:go
> https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,38
> 
> here is BF1 brakedown at 160fps, and this is NOT on a 1080ti , just a 1080
> https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,30
> 
> and, finally, more BF1. here the 5775-c's MINIMUMs are 5 fps higher than the MAXIMUMs on a 4790k... https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,10
> 
> and another. the theme here is this. 5775-c 's MINIMUMS beat 4790k's MAXIMUms,.... https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,17


Bro I want to see your scores.


----------



## KraxKill

Bro, I want to see your scores. Can you score 24K in Firestrike?


----------



## KraxKill

I've seen you post that before... info from one guy... it just doesn't line up with reality, and I don't believe anything reported by a single person.


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> Show me some benchies...


Quote:


> Originally Posted by *KraxKill*


Its advantages come in gaming, not so much in benchmarks. Clock for clock it is only 8% faster than a 4790K in benchmarks, so a 4.3 GHz 5775C ties a 4.8 GHz 4790K in benchmarks that don't utilize the cache. I'm telling you, though: I came from a 4.9 GHz 4790K, and EVERY game sees higher FPS. Most of the time the minimums beat my old max FPS. And the MOST striking thing about the Broadwell Destroyer is how smooth it plays; I didn't realize how much microstutter there was on the 4790K until I got the Broadwell-DT. It is BUTTER by comparison. I wouldn't trade for a 4790K if you gave me $300 to go with it, man. Honestly.
Quote:


> Originally Posted by *KraxKill*
> 
> I've seen you post that before...info by one guy....its just doesn't corroborate with reality and I don't believe anything reported by one guy.


C'mon, dude, have you even read the posts in the Broadwell owners' club? First-person testimonials and tons of benches. If you actually care, you'll look; I'm not linking all that data. It's a proven fact that the 5775C sits well atop the Z97 hierarchy. It is indisputable.


----------



## superV

Quote:


> Originally Posted by *Slackaveli*
> 
> dude. you have a z-97 board and arent even looking at a 5775-c? you are REALLY doing yourself a disservice. ESPECIALLY at 1080p. You say you want frames but 10-30 % extra fps at 1080p for $50 doesnt interest you? I guess I will be done advising you then.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here is a bunch of them. the linked page shows a game with 25% improvement ...
> 
> He still wont get 240Hz, tho. Nobody does except in cs:go
> https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,38
> 
> here is BF1 brakedown at 160fps, and this is NOT on a 1080ti , just a 1080
> https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,30
> 
> and, finally, more BF1. here the 5775-c's MINIMUMs are 5 fps higher than the MAXIMUMs on a 4790k... https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,10
> 
> and another. the theme here is this. 5775-c 's MINIMUMS beat 4790k's MAXIMUms,.... https://www.purepc.pl/procesory/broadwell_niszczyciel_test_core_i5_5675c_i_core_i7_5775c?page=0,17
> 
> and these are done at 3.3Ghz, man. 3.3!! Mine OC's to 4.3Ghz easily....


Are you crazy, man?? Here in Europe the 5775C is 410 euros and the i7 6800K is 401 euros.
But I'm not buying anything.
I need to test now, and I'll come back with some results.
Thanks to everyone for your input, much appreciated


----------



## KraxKill

Quote:


> Originally Posted by *Slackaveli*
> 
> it's advantages come in gaming, not as much in benches. clock for clock it is only 8% faster than a 4790k in benches. so, a 4.3Ghz 5775-c ties a 4.8Ghz 4790k in benches that dont utilize the cache. I'm telling you, though. I came from a 4.9Ghz 4790k. EVERY games sees higher fps. Most of the time the minimums beat my old max fps. And the MOST striking thing about a Broadwell Destroyer is how smooth it plays. i didnt realize how much microstutter there is on a 4790k until I got the Broadwell-DT. It is BUTTEr comparitively. I wouldnt trade you for a 4790k if you gave me $300 to go with it , man. Honestly.


I am interested... it's not like I'm short on change to make it happen; it just seems too good to be true. But people try to tell me that about liquid metal all the time, so maybe you're right... it is tempting.


----------



## Slackaveli

Quote:


> Originally Posted by *superV*
> 
> are you crazy man??here in europe 5775c is 410 euros, i7 6800k is 401 euros.
> but i'm not buying nothin.
> need to test now,and i'll come with some results.
> thanks to all for your input.much appreciated


Ummm... any reason why you wouldn't just sell your 4790K? And why do you think Broadwell costs more than a 6800K? Oh, because it destroys it... Also, the 5675C beats the 4790K too, so if it's about money, get the 5675C and keep the change from the 4790K resale.

Here's 170 FPS in Battlefield 4 on a GTX 1080, much higher than the 90 you mentioned:
https://www.purepc.pl/pamieci_ram/test_ddr3_vs_ddr4_jakie_pamieci_ram_wybrac_do_intel_skylake?page=0,11


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> I am interested...it's not like i'm short on change to make it happen, it just seem too good to be true. But people try to tell me that about liquid metal all the time. So maybe you're right....it is tempting.


I have no possible motive to steer you wrong, and the Broadwell owners' club has 100% satisfaction. 100%. There probably isn't another owners' club on this whole site that can claim that.

If you're on Z97 already, it's a no-brainer. They cost $350 in the USA, and you can get $300 out of a 4790K on eBay. Also, of the last dozen or so members of the owners' club (this month's new cats), all hit 4.3 GHz or better. One guy actually hit 4.7 (validation link), but that is crazy high. At 4.3 it already beats a 7700K and just mauls a 4790K. Obviously at 4K the gains would be much smaller, but the minimums are still way better and the max is higher too. That 16x cache advantage (128 MB of eDRAM vs. 8 MB of L3) is WAY too much for the 4790K to overcome, especially combined with an 8% IPC deficit; its extra 10 or even 20% in clocks isn't enough to make up all that.

All that said, I would recommend a new builder wait for the new chips next month. If you already have a Z97 system, however, it's kind of a no-brainer: one of those rare times when something seems too good to be true but isn't. That's why it's still more expensive than any other quad-core, and why, even though it's two years old, there are NEVER any used ones on eBay. That should tell you something: nobody gets rid of one for an expensive side-grade. Maybe when Skylake-X drops some will hit eBay, though I doubt many will.


----------



## Agent-A01

Quote:


> Originally Posted by *Slackaveli*
> 
> why do you think that Bradwell costs more than a 6800k?? oh, b/c it destroys it...


No, it doesn't.

I think you should stop with the nonsense already; nobody really cares.


----------



## Delbane

Here's what I scored
http://www.3dmark.com/3dm/19987456?


----------



## Slackaveli

Quote:


> Originally Posted by *Agent-A01*
> 
> No it doesn't.
> 
> Think you should stop with the nonsense already, nobody really cares.


Nobody cares? Maybe not the guy who asked for help and got a great response from several people, which he chose to ignore. But what about the five people from this thread who switched to the 5775C and thanked me profusely for it? What about the others asking for benches?
Probably a quarter of my REP came from pointing people to the 5775C. This discussion went on for three pages before I commented on it, and you want to call yourself checking me? Puh-lease, bruh. Diddle off.


----------



## Benny89

Quote:


> Originally Posted by *Agent-A01*
> 
> No it doesn't.
> 
> Think you should stop with the nonsense already, nobody really cares.


Lol mate, but it's the truth... As someone who switched from a 5.0 GHz 4790K to an OCed 5775C, I can confirm that thanks to Slackaveli I made the upgrade and can enjoy 7700K 5.0 GHz performance on socket 1150.

You can deny all you want, but game benchmarks show clearly what the 5775C can do.

I was where you are, so I was skeptical too. The eDRAM L4 cache is magic


----------



## reflex75

Quote:


> Originally Posted by *Delbane*
> 
> Anyone else is experiencing the same issue when playing with voltage on a Zotac 1080 ti AMP! Extreme ?


I also have this card, and it doesn't like higher voltage, which increases temperature and instability.
If you keep stock voltage and increase only the power limit, is it stable?
BTW, your score is fine.


----------



## Delbane

Quote:


> Originally Posted by *reflex75*
> 
> I also have this card and it doesn't like higher voltage which increase temperature and instability.
> If you keep stock voltage and increase only power, is it stable?
> Bty, you score is fine.


Yes, it's stable at max power (120%).

The card really doesn't go over 65 degrees while gaming, so temperature is not an issue for me. Have you tried overclocking it yet, and what results did you get?


----------



## johnfreeman

Quote:


> Originally Posted by *Slackaveli*
> 
> and why do you think that Bradwell costs more than a 6800k?? oh, b/c it destroys it...


Sure, mate - http://www.3dmark.com/compare/spy/1533019/spy/1771074#


----------



## Luckbad

Quote:


> Originally Posted by *johnfreeman*
> 
> Sure, mate - http://www.3dmark.com/compare/spy/1533019/spy/1771074#


----------



## Slackaveli

Quote:


> Originally Posted by *johnfreeman*
> 
> Sure, mate - http://www.3dmark.com/compare/spy/1533019/spy/1771074#


LOL!!! Oh boy, your 6-core CPU beats my 4-core in a benchmark! Too bad I'm a gamer, not really a benchmarker, lol. Show me a game where it beats a 5775C. There may be one, but I can think of several that actually utilize all cores where I still beat a 6800K (e.g. BF1). I'm quite certain I can claim a 90%+ win rate across the entire catalogue of games.
Quote:


> Originally Posted by *Luckbad*


LOL!! Best emote ever


----------



## Benny89

Quote:


> Originally Posted by *johnfreeman*
> 
> Sure, mate - http://www.3dmark.com/compare/spy/1533019/spy/1771074#


Man, we told you it would perform worse in benchmarks.

Throw us some game benches and then let's compare.

The eDRAM and L4 cache shine in games; this is not a productivity/benchmark CPU.

It's a pure gaming CPU, and I can tell you it's as good as a 5.0 GHz 7700K in games, and even better in some (BF1 or The Witcher 3, for example).

We told you: it's a gaming CPU, not a benchmark CPU. Throw us some FPS numbers here.


----------



## Slackaveli

Quote:


> Originally Posted by *Luckbad*


Quote:


> Originally Posted by *Benny89*
> 
> Man, in benchmarks we told you it will performa worse.
> 
> Throw us some game benches and let us then compare?
> 
> EDRAM and L4 Cache shines in games. This is not task/benchmak CPU.
> 
> Its pure gaming CPU. And I can tell you it is as good as 5.0 Ghz 7700k in games and even better in some (like in BF1 for example or Witcher 3).
> 
> We told you- it is gaming CPU, not benchmarks. Throw us some fps numbers here.


Well, I bet he's off googling, which warms my heart, because he could search from now until the end of time for a benchmark where a 6800K, with its mediocre single-core score and far smaller cache, beats the Broadwell gaming behemoth in ANY title.


----------



## Agent-A01

Quote:


> Originally Posted by *Slackaveli*
> 
> nobody cares? Maybe not the guy who asked for help and got a great response from several people, which he chose to ignore. But, what about the 5 people from this thread who switched to 5775-c and thanked me profusely for it? What about the others asking for benches?
> Probably 1/4 of my REP came from enlightening people to the 5775-c. This discussion went on for three pages before I commented on it and you wanna call yourself checking me? Puh-lease, bruh. diddle off


So even in cases where Broadwell-E (and even Haswell-E) destroys a 5 GHz 7700K in multithreaded games/apps, you're still going to spout drivel that the 5775C will destroy it?

You are full of **** and you know it, lol.
Quote:


> Originally Posted by *Benny89*
> 
> Lol mate but it is truth...


See above


----------



## lilchronic

The 5775C and 6800K are the same architecture. Single-core performance will be the same, and the 6800K will destroy it in multithreaded applications...
Case closed; stop with the nonsense.


----------



## Slackaveli

Quote:


> Originally Posted by *Agent-A01*
> 
> So you are telling me where Broadwell-e (and even haswell-e) destroys 5GHz 7700k in multithreaded games/apps you are still going to spout drivel that 5775 is going to destroy it?
> 
> You are full of **** and you know it lol.
> See above


I said nothing about apps. I said that in 90%+ of games a Broadwell-DT will beat a Broadwell-E, yes. Probably more like 95%, because I can't name a game where a Broadwell-E beats a Broadwell-DT off the top of my head. Enlighten me.
You also seem to be in denial about the fact that it beats a 7700K, despite the constant benches, pictures, and testimonials. The only people on your side of the argument are theorizing; they haven't bothered to try it for themselves.
The level of denial that keeps popping up on this subject weekly is astounding. I've never seen anything like it.


----------



## Agent-A01

Quote:


> Originally Posted by *Slackaveli*
> 
> i said nothing about apps. I said in 90%+ of games it will beat a Broadwell-e with a Broadwell-DT, yes.


That's still a far fetched and incorrect statement.

You cherry picked a couple game benchmarks where it has already shown majority of games don't even use the cache.

Any broadwell-e will destroy 5775 in highly threaded games.


----------



## Slackaveli

Quote:


> Originally Posted by *Agent-A01*
> 
> That's still a far fetched and incorrect statement.
> 
> You cherry picked a couple game benchmarks where it has already shown majority of games don't even use the cache.
> 
> Any broadwell-e will destroy 5775 in highly threaded games.



every single game uses the cache. Why would a game somehow NOT use cache???

also, any broadwell-e will get the floor wiped with them in 90% of games because of exactly the disqualifier you used in your statement. THEY DON'T use more than 4 cores 8 threads 90% of the time. And in some games, like BF1, they do. And guess what? 5775-c is the King of BF1 cpus. Just stop, dude. I am not telling everyone to go out and buy it. I am helping, WHEN ASKED, people who have just realized their 4790k is now CPU bound that they have a $50 solution that is up to the level of a top of the line 7700k. I would be doing them a disservice by not telling them about the 5775-c. So many uninformed fools have already convinced most that the 4790k is the top of the z97 hierarchy and to upgrade they must spend $700+. THAT IS PATENTLY FALSE, and I will continue to educate those who ask about the very best gaming cpu on the planet.


----------



## Benny89

Quote:


> Originally Posted by *Agent-A01*
> 
> That's still a far fetched and incorrect statement.
> 
> You cherry picked a couple game benchmarks where it has already shown majority of games don't even use the cache.
> 
> Any broadwell-e will destroy 5775 in highly threaded games.


Ok, mate. Please show me a game where that cache is not used? Because it seems I game more than you, cause I haven't met a game yet that does not use it.

I will just leave this here. A variety of games, all current AAA titles and widely used benchmark/test games. 1080p, tested with a GTX 1080 on board: Asus GTX 2025/10000:


----------



## Slackaveli

it's like some of you guys can't believe your eyes. Or, more than likely, are mad you didn't know about it earlier. And being as that is OBVIOUSLY the case, why get mad at me for telling others who haven't sold off their PRECIOUS z-97 mobo yet about this amazing piece of tech? The ONLY interesting chip Intel has put out in years that was actually inventive at all. I don't get the hate unless it is just pure buyer's remorse....


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> ^^^^^^^^^^^^^^^^^^^^^ AND THAT IS WITH IT AT 3.3GHZ. it easily OC's to 4.2+


That is OCed 4.2 vs OCed 5.0Ghz 7700k


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> That is OCed 4.2 vs OCed 5.0Ghz 7700k


ok, gotcha. it's from the other article. word. Just reks the 7700k. And the 4790k, psssht. roflstomped. Even the 5675-c 4 core i-5 reks the 4790k.


----------



## eXteR

Last 4 pages only talking about CPU X vs CPU Y.

I thought i was on 1080Ti thread.


----------



## KraxKill

Best case I've found for the 5775C......in a Ti Thread somehow. There is more there, but it's significant.



http://techreport.com/review/28751/intel-core-i7-6700k-skylake-processor-reviewed/6


----------



## superV

y'all should stop about these cpu wars. it's way too expensive 400 euros for that, will not buy it cuz i'm tired to drain my loop,did it twice this week for res change.
here the problem is that nvidia has the hands in the sack.it's simply unacceptable that the gpu performs like that at 1080p.u can say whatever you want,but needing a new cpu for using a 1080ti at 1080p.....something is wrong.


----------



## KraxKill

Quote:


> Originally Posted by *superV*
> 
> y'all should stop about these cpu wars. it's way too expensive 400 euros for that, will not buy it cuz i'm tired to drain my loop,did it twice this week for res change.
> here the problem is that nvidia has the hands in the sack.it's simply unacceptable that the gpu performs like that at 1080p.u can say whatever you want,but needing a new cpu for using a 1080ti at 1080p.....something is wrong.


Then you're misunderstanding the fundamentals of why GPU performance at 1080P cannot increase as there is no way to deliver frames that your CPU is not asking the GPU to draw.


----------



## Slackaveli

Quote:


> Originally Posted by *superV*
> 
> y'all should stop about these cpu wars. it's way too expensive 400 euros for that, will not buy it cuz i'm tired to drain my loop,did it twice this week for res change.
> here the problem is that nvidia has the hands in the sack.it's simply unacceptable that the gpu performs like that at 1080p.u can say whatever you want,but needing a new cpu for using a 1080ti at 1080p.....something is wrong.


man..... what is "wrong" is you are cpu bound. if you dont want a better cpu, get a better monitor then. you have a 4k rig on a 1080p monitor, that's what's REALLY wrong. There is no setup you can get to run AAA games at ultra 240Hz. NONE. But, closest you can get is 5775-c. and you already have the entire rig save the cpu... smh.


----------



## superV

Quote:


> Originally Posted by *KraxKill*
> 
> Then you're misunderstanding the fundamentals of why GPU performance at 1080P cannot increase as there is no way to deliver frames that your CPU is not asking the GPU to draw.


i'm sure it can be improved with a driver update.
Quote:


> Originally Posted by *Slackaveli*
> 
> man..... what is "wrong" is you are cpu bound. if you dont want a better cpu, get a better monitor then. you have

i want to play at 240 hz, much better than a high res.
everybody with their own


----------



## Slackaveli

I'm going to go ahead and pass my lightwork ^ on to the homies... Who wants to do the honors....


----------



## KraxKill

Quote:


> Originally Posted by *superV*
> 
> it can be improved with a driver update.


Not in any meaningful way. It can only increase with more frequency or higher IPC neither of which will happen for you with a driver update.


----------



## Benny89

Quote:


> Originally Posted by *superV*
> 
> i'm sure it can be improved with a driver update.


Mate, seriously, nobody here means any harm to you. But you are CPU bound. You have 3 ways to get rid of that:

1. Sell the 4790k and buy a 5775C. Your 4790k is golden so it will sell nicely, like mine did. 50-80 bucks and you have an upgrade. Maybe even less if you find a good buyer.
2. Buy yourself a 1440p 144Hz monitor to boost the resolution for the GPU and still have that juicy more-than-60Hz gameplay.
3. Go for a 4K 60Hz G-Sync monitor or Ultra Wide 1440p and forget about being CPU bound, because the resolution itself will force your GPU to go full power.

That is all really. You can't get around a CPU bound. This is a hardware limit. No drivers will help.
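The hardware limit described above can be sketched as a toy model (everything here is illustrative: the function and the fps numbers are made up, not measurements). The delivered frame rate is roughly the minimum of what the CPU can prepare and what the GPU can render, and raising the resolution only lowers the GPU side:

```python
# Toy model of a CPU/GPU bound pipeline (illustrative numbers only).
# The CPU prepares draw calls at a roughly resolution-independent rate;
# the GPU's rate falls with the number of pixels per frame.

def delivered_fps(cpu_fps, gpu_fps_at_1080p, res_scale=1.0):
    """Delivered fps is capped by the slower of the two stages.

    res_scale is a linear resolution multiplier (1.0 = 1080p),
    so pixel count (and GPU cost) grows with res_scale ** 2.
    """
    gpu_fps = gpu_fps_at_1080p / (res_scale ** 2)
    return min(cpu_fps, gpu_fps)

# A fast GPU at 1080p: CPU bound, GPU partly idle.
print(delivered_fps(cpu_fps=140, gpu_fps_at_1080p=240))  # -> 140
# Same rig at roughly 1440p: the GPU becomes the limiter instead.
print(delivered_fps(cpu_fps=140, gpu_fps_at_1080p=240, res_scale=1.33))
```

A driver update can shave some overhead off the CPU term, but it cannot lift the min() above it, which is why the options keep coming back to a faster CPU or a higher resolution.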


----------



## superV

Quote:


> Originally Posted by *Slackaveli*
> 
> man..... what is "wrong" is you are cpu bound. if you dont want a better cpu, get a better monitor then. you have a 4k rig on a 1080p monitor, that's what's REALLY wrong.


Quote:


> Originally Posted by *Slackaveli*
> 
> man..... what is "wrong" is you are cpu bound. if you dont want a better cpu, get a better monitor then. you have a 4k rig on a 1080p monitor, that's what's REALLY wrong. There is no setup you can get to run AAA games at ultra 240Hz. NONE. But, closest you can get is 5775-c. and you already have the entire rig save the cpu... smh.


busting 400 euros right now is stupid,intel will release new cpus from what i heard.


----------



## feznz

who cares, it's a bit like the PSU debate and the best CPU debate
no one will win

on best performance vs cost: there is one group where money comes from daddy's pocket, and there are others who know what money is actually worth because they earn it.


----------



## Benny89

Quote:


> Originally Posted by *superV*
> 
> busting 400 euros right now is stupid,intel will release new cpus from what i heard.


WHAT 400 eur?? If you sell your CPU you can add 50-80 bucks and have an upgrade.

So what if Intel releases new CPUs? Ask any 5.0Ghz 7700k user or 4.2 5775C user - do they need an upgrade? Of course not, not for a long time.

We are talking 80 bucks upgrade vs MOBO, RAM, CPU upgrade.


----------



## Clukos

Let's rename this thread to "Please consider buying a 5775C"


----------



## mtbiker033

If I had a compatible motherboard I would be getting a 5775c for sure!!


----------



## superV

ok ok i got it the intel i7 5775C = beast.but we should not worry bout buying new cpus.here the problem is that nvidia need to do something about.and i'm sure it can.


----------



## KedarWolf

Quote:


> Originally Posted by *superV*
> 
> ok ok i got it the intel i7 5775C = beast.but we should not worry bout buying new cpus.here the problem is that nvidia need to do something about.and i'm sure it can.


I only have a measly 5960x at 4.7GHZ , I feel sooooo, well, inadequate.


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> I only have a measly 5960x at 4.7GHZ , I feel sooooo, well, inadequate.


well, if you are cpu bound, we have no possible way out of it, because bigdaddy is beast. broadwell is the pitbull. 5960x is Big Daddy in the Big House.


----------



## kicurry

Yeah, in most games if you already have a z97 mobo then the 5775c is a no brainer upgrade. Does not really need fast memory or anything. As far as benchmarking in orange room it really flies.

http://www.3dmark.com/vrpor/110712

That is right up there in the hall of fame. No having to run the cpu at 5.2 with tons of heat.


----------



## superV

Quote:


> Originally Posted by *Slackaveli*
> 
> well, if you are cpu bound, we have no possible way out of it, because bigdaddy is beast. broadwell is the pitbull. 5960x is Big Daddy in the Big House.


are you sure about the i7 5775C?the gpu will work 100%?i could delid it and mount it naked like i'm with my 4790k and clock the hell of it.i could send back my 16gb ram and sell my 4790k,but i'm worried bout the future of ram needs cuz,now bf1 it takes like 8/9gb.


----------



## Slackaveli

Quote:


> Originally Posted by *kicurry*
> 
> Yeah, in most games if you already have a z97 mobo then the 5775c is a no brainer upgrade. Does not really need fast memory or anything. As far as benchmarking in orange room it really flies.
> 
> http://www.3dmark.com/vrpor/110712
> 
> That is right up there in the hall of fame. No having to run the cpu at 5.2 with tons of heat.


yep. #1 out of all 1080Ti owners is Daneracer on his 5775c and 1080Ti. Even beats a 6950x. And I beat him in timespy. I need to go take my #1 spot!!


----------



## Slackaveli

Quote:


> Originally Posted by *superV*
> 
> are you sure about the i7 5775C?the gpu will work 100%?i could delid it and mount it naked like i'm with my 4790k and clock the hell of it.i could send back my 16gb ram and sell my 4790k,but i'm worried bout the future of ram needs cuz,now bf1 it takes like 8/9gb.


sure. but no need to even delid. this is a power sipping chip. it only gives 8-10c from delidding.


----------



## superV

Quote:


> Originally Posted by *Slackaveli*
> 
> sure. but no need to even delid. this is a power sipping chip. it only gives 8-10c from delidding.


what about the ram ?and did u say that the gameplay is smoother ?


----------



## Slackaveli

Quote:


> Originally Posted by *superV*
> 
> what about the ram ?and did u say that the gameplay is smoother ?


the gameplay is much smoother to me. none of that infamous intel microstutter. But, to be clear, there will be times you won't get 100% gpu usage. But you will get much higher usage and fps. It's the best you can do for 1080p. But that 1080Ti is muay powerful for that rez.

same ram you are running, same mobo. just drop it in.


----------



## CptSpig

Quote:


> Originally Posted by *lilchronic*
> 
> 5775c and 6800k is the same architecture. Single core performance will be the same and the 6800K will destroy it in multi threaded applications....
> case closed stop with the nonsense


^^^^


----------



## CptSpig

Quote:


> Originally Posted by *kicurry*
> 
> Yeah, in most games if you already have a z97 mobo then the 5775c is a no brainer upgrade. Does not really need fast memory or anything. As far as benchmarking in orange room it really flies.
> 
> http://www.3dmark.com/vrpor/110712
> 
> That is right up there in the hall of fame. No having to run the cpu at 5.2 with tons of heat.


Dude he is number 90 on the hall of fame list? There are a bunch of i7 7700K scores above this dude.


----------



## KraxKill

I'll post it again....I could care less about the max fps the 5775C delivers. This is what has got me interested. I ditched SLI because of micro stutter. Running a single card made that significantly better. If this graphic is accurate, then i'm all over it. To me latency is what matters most and if this is accurate it wipes the floor with the other chips. EVEN IF it scores less. Keep benching but I need practical performance and utility out of this thing. If the 5775C asks for frames from my 1080Ti at a faster rate i'm all over it. I'm ordering tonight! LOL


----------



## CptSpig

Quote:


> Originally Posted by *Slackaveli*
> 
> yep. #1 out of all 1080Ti owners is Daneracer on his 5775c and 1080Ti. Even beats a 6950x. And I beat him in timespy. I need to go take my #1 spot!!


The link above is for a Titan X Pascal not a 1080ti and he is 45 on the list. Show us your Time Spy link and I will show you what a 6950x will do to your processor.


----------



## KraxKill

Quote:


> Originally Posted by *CptSpig*
> 
> The link above is for a Titan X Pascal not a 1080ti and he is 45 on the list. Show us your Time Spy link and I will show you what a 6950x will do to your processor.


In gaming? Please do...waiting........

Or are you talking about fictional scenarios which don't translate to reality?


----------



## Benny89

Quote:


> Originally Posted by *KraxKill*
> 
> In gaming? Please do...waiting........
> 
> Or are you talking about fictional scenarios which don't translate to reality?


Most just throw benches around but nobody wants to show real-life in-game performance.

I don't understand how one can be part of overclock.net and still think that benchmarks = gaming performance. LOL....

On the GPU side. I just did a Superposition (SUPO) 4K run on my old 980Ti, 1555 OC +450 memory. Scored a solid 6900.

But the funny thing was, the GPU was at 78C max in this test (fans on auto), downclocked only TWICE, only after 72C, staying rock solid at 1525 for 80% of the run. And the best bit - NOT EVEN ONCE did it hit any power limit during this test... and that is on +75 voltage.

Eh, Pascal, why can't you behave like that. I wish Pascals were more like Maxwells.


----------



## superV

Quote:


> Originally Posted by *KraxKill*
> 
> I'll post it again....I could care less about the max fps the 5775C delivers. This is what has got me interested. I ditched SLI because of micro stutter. Running a single card made that significantly better. If this graphic is accurate, then i'm all over it. To me latency is what matters most and if this is accurate it wipes the floor with the other chips. EVEN IF it scores less. Keep benching but I need practical performance and utility out of this thing. If the 5775C asks for frames from my 1080Ti at a faster rate i'm all over it. I'm ordering tonight! LOL


ok. i'm in too.
so if the 5775c is fastest cpu for gaming,even better than a 7700k which came out q1 17,means that should do great for a couple of years.right?


----------



## Benny89

Quote:


> Originally Posted by *superV*
> 
> ok. i'm in too.
> so if the 5775c is fastest cpu for gaming,even better than a 7700k which came out q1 17,means that should do great for a couple of years.right?


At least as much as a 5.0 Ghz 7700k. I would not upgrade from either of them till mainstream gaming 6-cores hit the market offering at least 20% more performance in games, not benchmarks.

Those are the best 4-cores you can get for gaming.


----------



## CptSpig

Quote:


> Originally Posted by *KraxKill*
> 
> In gaming? Please do...waiting........
> 
> Or are you talking about fictional scenarios which don't translate to reality?


Dude, are you serious? It is not only the processor that gives you high FPS in gaming, it's the whole package.

You pick the game. If I have it, I'll show you what a fast machine is all about!


----------



## KraxKill

Quote:


> Originally Posted by *CptSpig*
> 
> Dude are you serious? It is not only the processor that gives you high FPS in gaming it's the whole package.


Yeah...whats your point? I see none.


----------



## Benny89

Quote:


> Originally Posted by *KraxKill*
> 
> Yeah...whats your point? I see none.


Buyer's remorse, don't feed it, it will go away.

Getting back to GPU, can someone share with me his SLI experience so far with 1080Tis?


----------



## CptSpig

Quote:


> Originally Posted by *KraxKill*
> 
> Yeah...whats your point? I see none.


This is what you said: In gaming? Please do...waiting........

Or are you talking about fictional scenarios which don't translate to reality?

Looks to me like a challenge?


----------



## KraxKill

Quote:


> Originally Posted by *CptSpig*
> 
> This is what you said: In gaming? Please do...waiting........
> 
> Or are you talking about fictional scenarios which don't translate to reality?
> 
> Looks to me like a challenge?


I'm so sorry for you.


----------



## Benny89

Quote:


> Originally Posted by *CptSpig*
> 
> This is what you said: In gaming? Please do...waiting........
> 
> Or are you talking about fictional scenarios which don't translate to reality?
> 
> Looks to me like a challenge?


I feel like someone feels "too short" now. Take that to another thread, please.

Let's cut this subject, ok? It has gone waaaay away from the thread.

CUT.


----------



## KedarWolf

Quote:


> Originally Posted by *Benny89*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CptSpig*
> 
> This is what you said: In gaming? Please do...waiting........
> 
> Or are you talking about fictional scenarios which don't translate to reality?
> 
> Looks to me like a challenge?
> 
> 
> 
> I feel like someone feels "too short" now. Take that to other thread please.
> 
> Lets' cut this subject, ok? It has gone waaaay away from thread.
> 
> CUT.

Yes, please drop it, peeps, this is a 1080 Ti thread, not a 5775C thread.

Thank you.


----------



## superV

now i'm wondering if to drop the 240hz benq back(still in time) and to get a 1440p 165hz like yours @ Benny89.
how the gpu usage balance will be?


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> I'll post it again....I could care less about the max fps the 5775C delivers. This is what has got me interested. I ditched SLI because of micro stutter. Running a single card made that significantly better. If this graphic is accurate, then i'm all over it. To me latency is what matters most and if this is accurate it wipes the floor with the other chips. EVEN IF it scores less. Keep benching but I need practical performance and utility out of this thing. If the 5775C asks for frames from my 1080Ti at a faster rate i'm all over it. I'm ordering tonight! LOL


it's why we all end up so happy after we switch. b/c that microstutter is something you dont really sense completely until it's gone. And the butter smoothness feels so good..


----------



## Benny89

Quote:


> Originally Posted by *superV*
> 
> now i'm wondering if to drop the 240hz benq back(still in time) and to get a 1440p 165hz like yours @ Benny89.
> how the gpu usage balance will be?


Well, my 1080 Ti Strix in BF1 runs at around 97-99% usage. This is a BF1 thing. When I add a little scaling to the res it's 99% all the time pretty much. This is like an exception really. BF1 is really CPU dependent. I am hovering (depends on map) at around 134-143fps mostly with some settings adjustments.

All other games I play, like Witcher 3, MEA, RoTR, are 100% GPU usage all the time.

If you have a 1080 Ti then I would definitely suggest grabbing at least a 1440p 165Hz or Ultra Wide 1440p 100Hz monitor. 4K 60Hz too, but I can't go back to 60, especially now


----------



## Slackaveli

Quote:


> Originally Posted by *superV*
> 
> ok. i'm in too.
> so if the 5775c is fastest cpu for gaming,even better than a 7700k which came out q1 17,means that should do great for a couple of years.right?


yeah. it also means Intel be lyin and scammed a lot of people. but that's another story. word is out, now. a bunch have jumped on the broadwell train this qtr. Intel has probably heard about it at this point.


----------



## Benny89

Quote:


> Originally Posted by *superV*
> 
> now i'm wondering if to drop the 240hz benq back(still in time) and to get a 1440p 165hz like yours @ Benny89.
> how the gpu usage balance will be?


Also, before you do it - you can try to scale resolution in BF1. Try 150% and 200% res scale and then watch your GPU usage and fps count.

Still recommending leaving 1080p behind. When you go top card, well, it's time really
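For anyone trying the res-scale experiment: the scale in BF1 is applied per axis, so GPU load grows with its square. A quick sketch of the effective pixel counts at a 1080p base (the per-axis assumption matches how BF1's slider behaves, but double-check in your own game):

```python
# Effective render resolution for a given in-game resolution scale.
# Assumes the scale applies per axis (so pixel cost grows quadratically);
# numbers below are for a 1080p display.

def effective_resolution(width, height, scale_pct):
    w = round(width * scale_pct / 100)
    h = round(height * scale_pct / 100)
    return w, h, w * h

for pct in (100, 150, 200):
    w, h, pixels = effective_resolution(1920, 1080, pct)
    print(f"{pct}%: {w}x{h} = {pixels / 1e6:.1f} MP")
# 100%: 1920x1080 = 2.1 MP
# 150%: 2880x1620 = 4.7 MP
# 200%: 3840x2160 = 8.3 MP
```

At 200% the card is pushing 4x the pixels of native 1080p, which is usually enough to move the bottleneck from CPU to GPU on this class of hardware.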


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> Yes, please drop it, peeps, this is a 1080 Ti thread, not a 5775C thread.
> 
> Thank you.


you're right, but it isn't a true derail, as a lot have become cpu bound just when they got the 1080Ti (this thread) and this is the place to discuss solutions to all things related to the 1080 ti, right? How would they ever know to find the broadwell owner's thread? Nobody ever believes the truth until a 5 page discussion. this happens every 50 pages or so and always ends with about 3 more broadwells sold. I have the PMs to prove it.

But, yeah, I think at THIS point further discussion is pushing a derail so I'm done, too.

Congrats to the 2-3 of you taking the magic cache carpet ride.
Quote:


> Originally Posted by *superV*
> 
> now i'm wondering if to drop the 240hz benq back(still in time) and to get a 1440p 165hz like yours @ Benny89.
> how the gpu usage balance will be?


DO THIS! If you are in the return window, that is a very smart move.


----------



## Gattlin

Quote:


> Originally Posted by *Benny89*
> 
> At least as much as 5.0 Ghz 7700k. I would not upgrade from either of them till gaming mainstream 6 cores hit market that will offer at least 20% more performance in games, not benchmarks.
> 
> Those are best 4-cores you can get for gaming.


sIX CORES
Quote:


> Originally Posted by *Benny89*
> 
> At least as much as 5.0 Ghz 7700k. I would not upgrade from either of them till gaming mainstream 6cores hit market that will offer at least 20% more performance in games, not benchmarks.
> 
> Those are best 4-cores you can get for gaming.


Rite on this Ol boy can see the future and four core are yesterdays news. I won't even sniff a "4" core it's nothing but a detriment to the PC gaming society.


----------



## the9quad

Quote:


> Originally Posted by *Slackaveli*
> 
> you're right, but it isnt a true derail as a lot have become gpu bound just when they got the 1080Ti (this thread) and this is the place to discuss solutions to all things related to the 1080 ti, right? How would they ever know to find the broadwell owner's thread? Nobody every believes the truth until a 5 page discussion. this happens every 50 pages or so and always ends with about 3 more broadwells sold. I have the PMs to prove it.
> 
> But, yeah, I think at THIS point further discussion is pushing a derail so Im done, too.
> 
> Congrats to the 2-3 of you taking the magic cache carpet ride.
> DO THIS! If you are in the return window, that is a very smart move.


No, it is pretty much way past a thread derail. I have just been lurking, but am about to quit following the thread, period, because of this to be honest. You said your piece, we get it, now get back on topic or carry it to a more appropriate section please. Thanks.


----------



## CptSpig

Quote:


> Originally Posted by *Benny89*
> 
> I feel like someone feels "too short" now. Take that to other thread please.
> 
> Lets' cut this subject, ok? It has gone waaaay away from thread.
> 
> CUT.


You are absolutely right it needs to be dropped. Tell it to your 5775C friends to stop making outrageous comments. By the way I am average height.


----------



## BoredErica

Quote:


> Originally Posted by *BrainSplatter*
> 
> Get the curve editor (Ctrl-F) in Afterburner, use 'shift-left-click' to move the whole curve so that the 0.9v point has a clock of 1800Mhz. Save that curve (because sometimes it can get messed up in the next editing steps). All voltage points after the 0.9v have to be set to some lower clock than 1800Mhz (usually it's enough to move the next 3 or 4 voltage points lower and to click apply which will straighten the curve beyond 1800Mhz). Save profile, set power target to 120%, ... .
> 
> 0.9v for 1800Mhz should work for most of the cards I guess. For my 2 FE cards, I use 0.9v for 1825Mhz which is stable even when the temperatures go above 80C. Depending on silicon lottery, u might also achieve 1850 or more Mhz with 0.9v. The default FE fan curve should be fine to prevent any temperature related throttling (also depends on case air flow ofc, ...). If not, u might need to apply a more aggressive curve.


Still struggling to understand...

The best performance if I fine tune everything, that would look like a curve where every point on the graph has been stress tested to its max right?

Whereas if I clamped frequencies to one frequency under load, that would not be the highest performance?

I read about minimum frame rates possibly suffering due to power limits... would I possibly get higher min FPS with a fixed frequency?



For example right now on Valley the point on the graph would jump between those two points. I don't understand why it would jump back to the lower frequency point. Is it just temperature? It seems to hold constant 58-60C.

I'm having a hard time adjusting the graph since I don't know how the GPU will behave under load.

Also, lol @ Broadwell derail yet again.


----------



## KedarWolf

Quote:


> Originally Posted by *Slackaveli*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Yes, please drop it, peeps, this is a 1080 Ti thread, not a 5775C thread.
> 
> Thank you.
> 
> you're right, but it isnt a true derail as a lot have become gpu bound just when they got the 1080Ti (this thread) and this is the place to discuss solutions to all things related to the 1080 ti, right? How would they ever know to find the broadwell owner's thread? Nobody every believes the truth until a 5 page discussion. this happens every 50 pages or so and always ends with about 3 more broadwells sold. I have the PMs to prove it.
> 
> But, yeah, I think at THIS point further discussion is pushing a derail so Im done, too.
> 
> Congrats to the 2-3 of you taking the magic cache carpet ride.
> Quote:
> 
> 
> 
> Originally Posted by *superV*
> 
> now i'm wondering if to drop the 240hz benq back(still in time) and to get a 1440p 165hz like yours @ Benny89.
> how the gpu usage balance will be?
> 
> 
> DO THIS! If you are in the return window, that is a very smart move.

Listen guys and gals, last time i'm telling you before action is taken: start your own 5775C thread and take it there if you want, but pages and pages of talk NOT about our 1080 Ti's is NOT acceptable here.

Drop it, or I'll have mods start taking action regarding this issue.

Thank you for your consideration.


----------



## Dasboogieman

Quote:


> Originally Posted by *Agent-A01*
> 
> So you are telling me where Broadwell-e (and even haswell-e) destroys 5GHz 7700k in multithreaded games/apps you are still going to spout drivel that 5775 is going to destroy it?
> 
> You are full of **** and you know it lol.
> See above


I wish he was full of **** lol. I was skeptical at first too. After researching the L4 cache, it has a massive impact on the IPC. In fact, the only CPUs able to compete with the 5775c in multithreaded games are the 6+ cores.

If an app can utilize the L4 cache it can match the 7700k. All for a simple drop-in upgrade and maybe $50 out of pocket.

Ack, my post was late. Yes, no more talk of this.
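The L4 argument boils down to working-set size: the 5775C's 128 MB of eDRAM can hold hot data that spills out of a typical 8 MB L3 and would otherwise go to DRAM. A back-of-envelope sketch (the cache sizes are the published specs; the example working-set sizes are invented for illustration):

```python
# Back-of-envelope: which memory level services a hot working set.
# 8 MB L3 is typical for a quad-core i7; 128 MB is the 5775C's eDRAM L4.
# The example working-set sizes below are illustrative, not measurements.

L3_BYTES = 8 * 1024**2
L4_BYTES = 128 * 1024**2

def serviced_by(working_set_bytes):
    """Return the fastest level large enough to hold the working set."""
    if working_set_bytes <= L3_BYTES:
        return "L3"
    if working_set_bytes <= L4_BYTES:
        return "L4 (eDRAM)"
    return "DRAM"

for label, mb in [("small hot loop", 2), ("game sim state", 48), ("streaming assets", 512)]:
    print(f"{label}: {mb} MB -> {serviced_by(mb * 1024**2)}")
```

Workloads that land in that 8-128 MB window are where the 5775C pulls ahead; anything that fits in L3, or blows well past 128 MB, benefits much less, which squares with the mixed benchmark results in this thread.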


----------



## ttnuagmada

Quote:


> Originally Posted by *Luckbad*
> 
> NewEgg has the MSI Sea Hawk with an EK water block:
> 
> https://m.newegg.com/products/N82E16814137144


Maybe this explains why EK hasn't released their MSI blocks yet. I've been waiting for weeks.


----------



## ttnuagmada

What are the sources for those 5775C benchmarks? It does not mention the memory speed used on any of the CPUs, which is very important information.


----------



## Dasboogieman

Quote:


> Originally Posted by *ttnuagmada*
> 
> What are the sources for those 5775K benchmarks? It does not mention the memory speed used on any of the CPU's which is very important information.


That was part of my research. PM me for the details.


----------



## Nico67

Quote:


> Originally Posted by *Darkwizzie*
> 
> Still struggling to understand...
> 
> The best performance if I fine tune everything, that would look like a curve where every point on the graph has been stress tested to its max right?
> Whereas if I clamped frequencies to one frequency under load, that would not be the highest performance?
> 
> I read about minimum frame rates possibly suffering due to power limits... would I possibly get higher min FPS with a fixed frequency?
> 
> 
> 
> For example right now on Valley the point on the graph would jump between those two points. I don't understand why it would jump back to the lower frequency point. Is it just temperature? It seems to hold constant 58-60C.
> 
> I'm having a hard time adjusting the graph since I don't know how the GPU will behave under load.
> 
> Also, lol @ Broadwell derail yet again.


There doesn't seem to be a lot of point maximising bins much below whatever your minimum working frequency bin is. I found 4 bins below helped performance but after that no change. On your graph I would maximise down to 0.975v just to be safe. Currently I would say you are losing a bit of performance at 2050mhz.
Min frames are going to be worse with power limiting, as under load is when you need more power and are likely to be hitting min frames, but that is also when you are power limiting and dropping clock speed. Not a lot you can do but maximise the bins as above, so when it does drop it's not by much. And you are probably jumping due to power limiting.
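The curve edit described in the quoted post (raise the whole curve, then flatten every point at and above the target voltage) can be sketched as plain data manipulation. Afterburner has no scripting API for this, and the voltage/clock pairs below are invented, so this is only the shape of the edit:

```python
# Sketch of the Afterburner curve edit: clamp every point at or above
# the target voltage to the target clock so the GPU never boosts past it.
# Points are (volts, MHz) pairs; all values here are invented examples.

def flatten_curve(curve, target_v=0.900, target_mhz=1800):
    out = []
    for v, mhz in curve:
        if v >= target_v:
            out.append((v, target_mhz))   # flat line from the target voltage up
        else:
            out.append((v, mhz))          # lower bins left untouched
    return out

stock = [(0.800, 1650), (0.850, 1720), (0.900, 1770), (0.950, 1840), (1.000, 1900)]
print(flatten_curve(stock))
# [(0.8, 1650), (0.85, 1720), (0.9, 1800), (0.95, 1800), (1.0, 1800)]
```

The payoff Nico67 points at: with the bins just below the flat point also maximised, a power-limit dip costs only a few MHz instead of a large step down.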


----------



## CoreyL4

I have been away from this thread for a while. Any new bios come out?

I've yet to try the xoc bios.


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> Listen guys and gals, last time i'm telling you before action is taken, start your own 5775C thread and take it here if you want, but pages and pages of talk NOT about our 1080 Ti's is NOT acceptable here.
> 
> Drop it, or I'll have mods start taking action regarding this issue.
> 
> Thank you for your consideration.


yeah, well, i answered a question, directed at me specifically, three pages after the discussion started, that had nothing to do with me. But, it's cool. Ya'll peace out. See ya'll in the Volta owner's thread.
Quote:


> Originally Posted by *the9quad*
> 
> No it is pretty much way past a thread derail. I have just been lurking, but am about to quit following the thread period because of this to be honest. You said your peace, we get it, now get back on topic or carry it to a more appropriate section please. Thanks.


it's all yours, sir.

Quote:


> Originally Posted by *the9quad*
> 
> I have just been lurking, but am about to quit


i've spent countless hours here helping people, but by all means. i defer to you, sir.
Quote:


> Originally Posted by *Dasboogieman*
> 
> I wish he was full of **** lol. I was skeptical at first too. After researching the L4 cache, it has a massive impact on IPC. In fact, the only CPUs able to compete with the 5775C are the 6+ cores in multithreaded games.
> 
> If an app can utilize the L4 cache it can match the 7700k. All for a simple drop-in upgrade and maybe $50 out of pocket.
> 
> Ack my post was late. Yes no more talk of this.










rep'd


----------



## OZrevhead

Sorry for the 5775C comment, but if they are so good, why did they come and go relatively unnoticed?

Another part off topic: I am following the AB curve posts with great interest, as I want to apply them to my TXp after the water block goes on.

On topic: after I finish benching the TXp I will step back to a 1080 Ti. I have been waiting for the FTW3 to come out, but it seems to have been a disappointment in the end... so I will wait a bit and hopefully EVGA will drop a Classified and surprise us. I liked the days when Asus and MSI had competition for the Classified; I guess those days are gone...


----------



## Agent-A01

Quote:


> Originally Posted by *Dasboogieman*
> 
> I wish he was full of **** lol. I was skeptical at first too. After researching the L4 cache, it has a massive impact on IPC. In fact, the only CPUs able to compete with the 5775C are the 6+ cores in multithreaded games.
> 
> If an app can utilize the L4 cache it can match the 7700k. All for a simple drop-in upgrade and maybe $50 out of pocket.
> 
> Ack my post was late. Yes no more talk of this.


He is though.

Making blanket statements that the 5775C destroys Broadwell-E.

There are plenty of games where Haswell-E/Broadwell-E is much faster than the 7700k, but of course the 5775 will destroy it in those games too









Not every game benefits from cache anyway.
More cores are way more future-proof, especially for the upcoming wave of highly threaded DX12 games.

Back on topic.

Bought an FE Gigabyte Ti for $628 total.
Also bought an EK block, so both should arrive around Monday.


----------



## IMI4tth3w

Quote:


> Originally Posted by *johnfreeman*
> 
> Sure, mate - http://www.3dmark.com/compare/spy/1533019/spy/1771074#


while i am one of those who believes in the ability of the 5775c, that comparison might not be the best one to use. the gpus are at different speeds.

The 5775c shines in its ability to push higher fps at lower resolutions when paired with overkill GPUs. It has L4 cache, which essentially no other desktop CPU has, and it has a ton of it: 128MB of eDRAM sitting on the CPU package, acting like super fast RAM right next to the cores. Most CPUs have 3 levels of cache, but the 5775c has 4. Level 1 is the fastest but holds the least data; Level 2 is slower but holds a bit more; L3 is slower than L2 but holds even more; and L4 is the slowest of the four while holding the most. But even as the "slowest" cache, L4 is still WORLDS faster than the RAM on the motherboard, and the 5775c's ability to utilize that L4 cache is where it shines. There is no mystery to it: it has its own pool of very fast memory that no other CPU has. That is where its amazing performance in pushing out maximum fps in games comes from, and how it can rise above the newer 7700k.
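For anyone curious, the latency pyramid described above is easy to see with a pointer-chasing microbenchmark. This is a rough illustrative sketch (my own code, not vendor tooling); in Python the interpreter overhead blurs the steps, so a C version would show the L1/L2/L3/DRAM plateaus far more sharply:

```python
import array
import random
import time

def chase_latency_ns(n, hops=100_000):
    """Average time per dependent load while chasing a random cycle
    through an n-element index array. As n grows, the working set
    spills out of L1, then L2, then L3 into DRAM, and the per-load
    time steps up at each spill."""
    order = list(range(n))
    random.shuffle(order)
    nxt = array.array("l", [0] * n)
    for i in range(n):
        nxt[order[i]] = order[(i + 1) % n]  # random cyclic permutation
    j = 0
    t0 = time.perf_counter()
    for _ in range(hops):
        j = nxt[j]  # each load depends on the previous one
    t1 = time.perf_counter()
    return (t1 - t0) / hops * 1e9

if __name__ == "__main__":
    for kb in (16, 256, 4096, 8192):  # roughly L1 / L2 / L3 / beyond-L3 sized
        n = kb * 1024 // 8            # 8-byte entries
        print(f"{kb:5d} KiB working set: {chase_latency_ns(n):6.1f} ns/load")
```

On a 5775C you would expect an extra plateau between the L3-sized and DRAM-sized working sets, courtesy of the 128MB eDRAM.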


----------



## Slackaveli

Quote:


> Originally Posted by *IMI4tth3w*
> 
> while i am one of those who believes in the ability of the 5775c, that comparison might not be the best one to use. the gpus are at different speeds.
> 
> The 5775c shines in its ability to do higher fps at lower resolutions when paired up with overkill gpus. it has L4 cache which NO other cpu has. and it has a ton of it. L4 cache is like super duper fast RAM built into the cpu silicon. Most CPUs have 3 levels of cache but the 5775c has 4. Level 1 is the fastest, but holds the least amount of data. Level 2 is slower but holds a bit more. L3 is slower than L2 but holds even more data. With L4 being the slowest holding the most amount of data. But even with L4 being the "slowest" it is still WORLDS faster than the RAM on the motherboard. And the 5775c's ability to utilize that L4 cache is where it shines. There is no mystery to it. It has its own bit of super fast RAM built into the silicon that no other CPU has. This is where its amazing performance comes from.


WHICH is exactly the case with the guy who asked the damn question! 1080p on a z97 mobo with a 4790k, trying to push a 240Hz monitor with a 1080 Ti. Tell me, b/c you clearly know your stuff, what should he do? Because it's obvious. Even that guy decided to get one, and another long-time poster in this thread also decided to get one today. But some lurkers are tired of reading about it, so the rest of the people who come in here in his shoes can just dump $1500 on a PoS 6850k plus mobo and RAM instead, i guess, b/c that's what will be suggested. smh


----------



## IMI4tth3w

Quote:


> Originally Posted by *Slackaveli*
> 
> WHICH is exactly the case with the guy who asked the damn question! 1080p with a z97 mobo trying to push a 240Hz monitor with a 1080ti. tell me, b/c you clearly know your stuff, what should he do?


hard to keep track of people, but i believe you are the one trying to convince him to get the 5775c? and i agree completely with that.

I want to pick one up myself to swap out my 4790k, but the cheapest i can find is $395 at Newegg. Also i'd have a hard time selling my 4790k, as i delidded it and it's not the greatest clocker. And i'd rather get an 8-core/16-thread chip for x264 encoding; i'm at 1440p 144Hz, so there's no real need to go above 144 fps, and usually it won't be the cpu limiting me there.


----------



## nrpeyton

One-tenth of an orgasm:

The new 'Galax HoF (Hall of Fame): Limited Edition 1080 Ti'

*-Three 8-pin power connectors
-16+3 phase VRM
-Dual BIOS
-VRM cooled by separate smaller metal heat-sink (not the fans--interesting!)*
This would make it perfect for "universal" water-cooling blocks or LN2/Dry Ice benching


----------



## Slackaveli

Quote:


> Originally Posted by *IMI4tth3w*
> 
> hard to keep track of people but i believe you are the one trying to convince him to get the 5775c? and i agree completely with that.
> 
> I want to pick one up myself to swap out my 4790k but the cheapest i can find is $395 at newegg. also i'd have a hard time selling my 4790k as i delided it and its not the greatest clocker. also i'd rather get an 8 core 16 thread chip for x264 encoding as i'm at 1440p 144Hz so no need to really go above 144 and usually it wont' be the cpu limiting me there.


yeah, it was me.

have you checked B&H Photo? they were $359 last week. and the last delidded 4790k i saw on ebay went for $300-something. if you advertise the delid, sure, fewer will be interested, but others will understand that it's worth $50.


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> One-tenth of an orgasm:
> 
> The new 'Galax HoF (Hall of Fame): Limited Edition 1080 Ti'
> 
> *-Three 8-pin power connectors
> -16+3 phase VRM
> -Dual BIOS
> -VRM cooled by separate smaller metal heat-sink (not the fans--interesting!)*
> This would make it perfect for "universal" water-cooling blocks or LN2/Dry Ice benching


ikr, that thing is hot


----------



## nrpeyton

Quote:


> Originally Posted by *Slackaveli*
> 
> ikr, that thing is hot


Aye, what I find most interesting is that the VRM isn't cooled by the main heatsink/fans

but by a smaller metal heatsink of its own that looks like it's attached to the PCB _(i.e. when you remove the heatsink/fan assembly the VRM will be perfectly okay on its own)_


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> Aye, what I find most interesting.. is that the VRM isn't cooled by the main heatsink/fans


yeah, and the 3rd 8-pin to power the fans/lights.


----------



## nrpeyton

Quote:


> Originally Posted by *Slackaveli*
> 
> yeah, and the 3rd 8-pin to power the fans/lights.


I'd assume it's going to have a massive upper power target that will be difficult to breach.


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> I'd assume it's going to have a massive upper power target that will be difficult to breach.


i just heard that in my head in Scotty voice lol. I inserted ", Captain" at the end


----------



## nrpeyton

lol


----------



## Dasboogieman

If anybody wants more info on my research into the 5775C, please PM me. There is a lot to digest; the performance characteristics are not entirely straightforward.

Let's keep this thread on topic.


----------



## ViRuS2k

Well that was interesting. Just waded through 3-4 pages of i7 CPU talk, lol... i thought i'd lost the NVIDIA GTX 1080 Ti Owners Club thread.
Lots of people replying "back on topic" and then talking about i7 CPUs yet again...

Guess i must find somewhere else to look for 1080 Ti talk...


----------



## KedarWolf

Quote:


> Originally Posted by *ViRuS2k*
> 
> Well that was intresting, just baffled my way through 3-4 pages of i7 cpu talk lol think i lost the NVIDIA GTX 1080TI Owners Club Thread.....
> lots of people replying and saying back on topic after talking yet again about i7 cpus...
> 
> guess i must find somewhere else to look for 1080ti talk...


I've messaged a mod about this issue and am waiting for a reply.

I've asked more than once for people to start their own 5775C thread and take this discussion elsewhere.

It's going to be dealt with.


----------



## superV

Quote:


> Originally Posted by *ViRuS2k*
> 
> Well that was intresting, just baffled my way through 3-4 pages of i7 cpu talk lol think i lost the NVIDIA GTX 1080TI Owners Club Thread.....
> lots of people replying and saying back on topic after talking yet again about i7 cpus...
> 
> guess i must find somewhere else to look for 1080ti talk...


What are you crying about, dude? I had a problem, and thanks to this thread i found a solution. Still related to the GTX 1080 Ti: i found out that my cpu can't handle a 1080 Ti at 1080p.
There are other people complaining, or with buyer's remorse.
Still related to the 1080 Ti thread: i found out that a new cpu will push the 1080 Ti better without changing the platform.


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> Aye, what I find most interesting.. is that the VRM isn't cooled by the main heatsink/fans
> 
> But a smaller metal heatsink of it's own that looks like it's attached to the PCB _(I.E when you remove the heatsink/fan assembly the VRM will be perfectly okay on it's own)_


What I like with the HOF is moar lights, even on the rear, BUT the most exciting part is the BIOS, and that gets me really excited

I downloaded the high-res photos from the KFA2 site for some 1080 Ti pron




The Asus Strix likewise has a metal cover frame that really stiffens the card. I see the full-cover water blocks have just become available; not sure whether to save my money, as I have a universal W/B that has paid for itself over 3 sets of cards. Also, the Asus backplate is not compatible with the EK W/B.


----------



## pez

Quote:


> Originally Posted by *superV*
> 
> yep,everything on ultra,no caps.
> 
> @Coopiklaani let's be serious [email protected]
> gpu refuses to work and drops load on the cpu.


Quote:


> Originally Posted by *superV*
> 
> ye it does
> 
> yes the gpu usage is higher with dsr.but should work at 1080p too.
> the cpu is fine at 5ghz,with a 5775c won't clock that high and could get even worse performance .it simply doesn't want to work well in games and drops the workload on the cpu.


Quote:


> Originally Posted by *Benny89*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Mate, I know that you think that, I was there 2 months ago. 5775C Oced to 4.2 Ghz is better/equal to 5.0Ghz 7700k in games. Your 5.0 4790k will get destroyed by 5775C.
> 
> Belive, cause I sold my 4790k 5.0Ghz 2 months ago and got 5775C and I have MASSIVE improvement in games.
> 
> I sold it, added 50 bucks and now I enjoy having even better performance than 5.0Ghz 7700k in games, securing me from upgrading even further
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For more info, please visit our Owners Thread: http://www.overclock.net/t/1583537/intel-broadwell-c-ownership-club
> 
> Here, catch:


Quote:


> Originally Posted by *superV*
> 
> i think i found the problem.i disabled the c-states and turbo boost,increased the vrm switching frequency and now it works 92/95/97%.
> but as soon i lower the graphics to get more fps,the gpu usage lowers to 75/85%
> 
> haha 1080ti is overkill for 1080p
> 
> 
> 
> 
> 
> 
> 
> nice one m8.can't max out a 240hz monitor and is overkill.
> i think this is nvidia's gimmick regarding 1080p to make people jump to higher res,cuz as soon i use dsr it jumps quick to 97%.


Quote:


> Originally Posted by *Benny89*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Ok, mate. Please show me a games where that case is not used? Because it seems I game more than you cause I didn't meet an game yet that does not use it.
> 
> I will just leave it here. Various of games. All nowadays AAA games and bench/test games that are wildly used. 1080p, tested with 1080 GTX on board: Asus GTX 2025/10000:


Quote:


> Originally Posted by *Slackaveli*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> it's like some of you guys can't believe your eyes. Or, more than likely, are mad you didnt know about it earlier. And being as that is OBVIOUSLY the case, why get mad at me for telling others who havent sold off their PRECIOUS z-97 mobo yet about this amazing piece of tech? The ONLY interesting chip Intel has put out in years that was actually ingenuitive at all. I don't get the hate unless it is just pure buyer's remorse....


Quote:


> Originally Posted by *KraxKill*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Then you're misunderstanding the fundamentals of why GPU performance at 1080P cannot increase as there is no way to deliver frames that your CPU is not asking the GPU to draw.


Quote:


> Originally Posted by *Slackaveli*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> man..... what is "wrong" is you are cpu bound. if you dont want a better cpu, get a better monitor then. you have a 4k rig on a 1080p monitor, that's what's REALLY wrong. There is no setup you can get to run AAA games at ultra 240Hz. NONE. But, closest you can get is 5775-c. and you already have the entire rig save the cpu... smh.


I think ultimately it was ignored that he was initially seeing a problem that wasn't caused by which CPU he was running. Judging by the last non-spoilered post above, I assume he's fixed it somewhat.

Pascal is notoriously bad at scaling in some games, to the point that it doesn't matter what your specs are. Fallout 4 and a variety of older titles simply will not run correctly on this architecture, and this happens at various resolutions. If any of you are so inclined, please try Borderlands (OG) and let me know what your range of frames tends to be. So far one of the workarounds is simply to use DSR to compensate for the low GPU usage in those games.


----------



## superV

Can someone on a 7700k please downclock it to 4GHz, do a run in BF1 at 1080p, and take a look at the GPU usage?
Thanks.


----------



## Hulio225

Quote:


> Originally Posted by *superV*
> 
> can someone please if on 7700k to down clock it to 4 ghz and try a run on bf1 at 1080p and take a look at the gpu usage.
> thanks.


4.2GHz at 1080p with my 2100MHz 1080 Ti: GPU usage between 16-20%.

Edit: when i lock the frequency at 2100MHz it's mostly 15% all the time, and 43/44% with V-Sync turned off.


----------



## superV

Quote:


> Originally Posted by *Hulio225*
> 
> 4.2GHz and 1080p with my 2100MHz 1080ti gpu usage between 16-20%
> 
> Edit when i lock frequency at 2100MHz its most likely 15% all the time


i don't think that was the GPU usage. For me, a stock 4790k with RAM at 2400MHz stays at 58/60%, sometimes 71%, with V-Sync off. Mine was sitting at 30% in CS:GO because it's a lighter game, but that GPU usage looks weird.
Can you please check whether it's correct, and what frame rate you get?


----------



## Hulio225

Quote:


> Originally Posted by *superV*
> 
> i think that wasn't the gpu usage.me at stock 4790k will stay 58/60 sometimes 71% with vsync off ram at 2400mhz.mine was sitting at 30% on cs:go because is a lighter game,but that gpu usage is weird.
> can you check please if correct,and what frame rate u get?




Framerate locks at 200FPS; sometimes it dips below, like in the screenshot.

Edit:

I'll retest; i saw that graphics weren't set to ultra. Also, are you running DX12 on or off?


----------



## superV

Quote:


> Originally Posted by *Hulio225*
> 
> 
> 
> Framerate locks at 200FPS sometimes it dips below like in screenshot
> 
> Edit:
> 
> Ill retest i saw that graphics wasn't at ultra and what are you using DX 12 on or off?


i don't know which DirectX; i only changed from auto to ultra and that's it.
Thanks for testing. Post if you have news.


----------



## Hulio225

Quote:


> Originally Posted by *superV*
> 
> i don't know which direct x,i only changed to ultra from auto and that's it.
> thanks for testing.post if u have news.


DX12 off


As you can see, usage is somewhere between 85% and 96%, depending on what's going on in the game...
FPS are close to 200 all the time, with some dips when something blows up or so.

And this is how it looks when the CPU is at 5GHz:


----------



## 4lek

But... is that normal?

Around 50 degrees [liquid cooled] at stock frequency, just playing a game?

I mean, my old Titan Blacks hardly used to pass 35 under a stress test without overclocking.


----------



## Dasboogieman

Quote:


> Originally Posted by *4lek*
> 
> But... is that normal?
> 
> Around 50 degrees [liquid cooled] on stock frequency just playing a game?
> 
> I mean my old Titan Blacks hardly used to pass 35 under stress test without overclock.


Double check your mounting again and make sure your pump is working properly.


----------



## BoredErica

If going below my curve is power limiting, then what is going above my curve supposed to mean? I set 2050 and it goes to 2060... 2070...



EDIT

Went back to check curve, and it miraculously went up.

I swear, there's a ghost in my GPU!









I think I need to press 'apply' for the voltage curve BEFORE I close the curve menu otherwise it gets wicky wonky. WHO KNOWS?!


----------



## Hulio225

Quote:


> Originally Posted by *4lek*
> 
> But... is that normal?
> 
> Around 50 degrees [liquid cooled] on stock frequency just playing a game?
> 
> I mean my old Titan Blacks hardly used to pass 35 under stress test without overclock.


Without knowing your variables (radiator surface, fan setup, fan speeds, ambient temps, etc.), no one will be able to tell you whether your temps are within the range you should expect.


----------



## BrainSplatter

Quote:


> Originally Posted by *Darkwizzie*
> 
> I think I need to press 'apply' for the voltage curve BEFORE I close the curve menu otherwise it gets wicky wonky.


Pressing 'Apply' always has a chance of modifying your curve, whether you have the curve window open or not. It's just convenient to press Apply with the window open so you can see the modification immediately.

Why that modification happens, I have no clue. Applying a curve internally seems to be more involved than setting an exact frequency for each voltage point; there seems to be a more complex algorithm at work.


----------



## BoredErica

Quote:


> Originally Posted by *BrainSplatter*
> 
> Pressing 'Apply' always has the chance that it modifies your curve. Whether you have the curve window open or not doesn't matter. It's only convenient to press apply before so that u can see the modification immediately.
> 
> Why that modification happens, I have no clue. It just seems that applying a curve internally is not as simple as setting an exact frequency for a voltage point. There seems to be a more complex algorithm at work.


My Afterburner wants the curve to stay at 2050 or 2037 or 2062... It's not liking 2045.

Yeah, with the apply thing I was just attributing causality when there was none. In fact, better click apply while curve is up so I can see what's actually going on and being applied.


----------



## pez

Quote:


> Originally Posted by *Darkwizzie*
> 
> My Afterburner wants the curve to stay at 2050 or 2037 or 2062... It's not liking 2045.
> 
> Yeah, with the apply thing I was just attributing causality when there was none. In fact, better click apply while curve is up so I can see what's actually going on and being applied.


Bins are stepped in multiples of 12.5MHz. That would be why it moves to those other values.


----------



## Hulio225

Quote:


> Originally Posted by *Darkwizzie*
> 
> My Afterburner wants the curve to stay at 2050 or 2037 or 2062... It's not liking 2045.
> 
> Yeah, with the apply thing I was just attributing causality when there was none. In fact, better click apply while curve is up so I can see what's actually going on and being applied.


Exactly this:
Quote:


> Originally Posted by *pez*
> 
> Bins are stepped in multiples of 12.5. That would be why it moves to those other values.


The whole Boost 3.0 thing works in 12.5MHz bins; there is no way the curve will stick at 2045...
If you reach certain temperature points it will drop by exactly those bins too, from 2100.5 to 2088 for example, and so on.


----------



## Nico67

Quote:


> Originally Posted by *Darkwizzie*
> 
> My Afterburner wants the curve to stay at 2050 or 2037 or 2062... It's not liking 2045.
> 
> Yeah, with the apply thing I was just attributing causality when there was none. In fact, better click apply while curve is up so I can see what's actually going on and being applied.


13mhz increments, 2045 doesn't exist









I find it's always better to default the curve, press apply, then raise the highest bin first, then move down the bins after that and hit apply. If you pick frequencies fractionally above the fixed increments they tend to apply correctly; e.g. 2053 will set 2050 ok.

The curve you show above will draw less power but loses a ton of performance; you need at least one step down, and then the next 4 lower bins raised.

This is the curve I was running, but I probably should have had those last two bins raised as well











the last 4 bins give a good performance boost, like some invisible limiting is happening so optimising them helps.


----------



## Hulio225

Quote:


> Originally Posted by *Nico67*
> 
> *13mhz increments, 2045 does exist
> 
> 
> 
> 
> 
> 
> 
> *


Wrong, it's 12.5MHz increments and 2045 is not possible


----------



## Nico67

Quote:


> Originally Posted by *Hulio225*
> 
> Wrong, its 12.5MHz increments and 2045 is not possible


Typo







and yeah, strictly speaking it's 12.5, but AB shows them as 12 or 13


----------



## Hulio225

Quote:


> Originally Posted by *Nico67*
> 
> Typo
> 
> 
> 
> 
> 
> 
> 
> and yeah strictly speaking its 12.5, but AB shows them as 12 or 13


That's just rounding in the display.
That's why you have to set your curve to 2101 and press apply to get 2100.5MHz (shown as 2100 in AB); if you set the curve to 2100 and press apply it will drop to 2088, because 2100 is below 2100.5 and it always snaps down to the next bin.
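That snapping behaviour can be modelled in a few lines. The anchor value (2100.5) and the snap-down rule are assumptions inferred from the observations in this thread, not documented NVIDIA behaviour:

```python
import math

BIN_MHZ = 12.5       # Boost 3.0 bin size, per this thread's observations
ANCHOR_MHZ = 2100.5  # one known bin; the grid is ANCHOR_MHZ + k * 12.5

def applied_clock(requested_mhz):
    """Snap a requested clock DOWN to the nearest bin at or below it,
    mimicking the behaviour described above (2101 -> 2100.5,
    2100 -> 2088)."""
    k = math.floor((requested_mhz - ANCHOR_MHZ) / BIN_MHZ)
    return ANCHOR_MHZ + k * BIN_MHZ
```

So a request of 2101 lands exactly on the 2100.5 bin, while 2100 falls just under it and drops a whole bin to 2088.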


----------



## BoredErica

I think +465 is where I want to be.

Quote:


> Originally Posted by *Nico67*
> 
> 13mhz increments, 2045 doesn't exist
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I find it always better to default the curve, press apply, then raise the highest bin first, then move down the bins after that and hit apply, if you pick frequencies factional above the fxied increments they tend to apply correctly, like 2053 will set 2050 ok.
> 
> The curve you show above will draw less power but is loose a ton of performance, you need to have at least one step down and then next 4 lower bins raised.
> 
> This is the curve I was running, but I probably should have had those last two bins raised as well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> the last 4 bins give a good performance boost, like some invisible limiting is happening so optimising them helps.


By min working clock you mean the minimum working clock I would typically see when the clocks are fluctuating, right?

There's a particular place in TLR benchmark that tends to hit rock bottom clocks. That, and sometimes 4k extreme on Superposition.


----------



## Hulk1988

Hello everyone,

I am new here. I got my 1080 Ti Hall of Fame and want to share some results with you. If you have any questions or want any other results, just ask.




























With 100% fan speed the card gets no hotter than 56 degrees. With 50% (which I can no longer hear in a closed Define R5 case) the peak is 60 degrees.

Beautiful card. My first results are 2080/6200. After the Boost 3.0 drop it is stable at 2020/6200 at 59/60 degrees with 50% fan speed; better results with 100% fan speed.


----------



## 4lek

Quote:


> Originally Posted by *Dasboogieman*
> 
> Double check your mounting again and make sure your pump is working properly.


Tried with my bench pump too [it's a sanso pdh 054]; nothing really changes. Well, it does if i turn the chiller on too, but that's a different story









As for mounting, i did it this way:


Quote:


> Originally Posted by *Hulio225*
> 
> Without knowing your variables like radiator surface, fan setup, fan speeds, ambient temps etc. no one will be able to tell you if your temps are within normal range which you can expect or not.


I have 3 radiators:
2x Alphacool NexxxoS Monsta 420mm in the waterstation and 1 Alphacool NexxxoS ST30 240mm inside the case itself [between the cpu/mosfets and VGAs/SB].

As fans i use Noctua Redux NF-P14S, at full speed most of the time.

And today it's around 25 degrees here, talking about outdoor temperature.
My pc room is probably a little cooler, because of the house's structure.


----------



## Hulio225

Quote:


> Originally Posted by *Hulk1988*
> 
> Hello everyone,
> 
> I am new here and I got my 1080 TI Hall of Fame and want to share with you some results. If you have any questions or you want to have any other results, just write it down.
> 
> With 100% Fan Speed the card get no hotter than 56 degrees. With 50% ( Which I cannot hear anymore in a closed Define R5 case ) is the peak at 60 degrees.
> 
> Beautiful card. My first results are 2080/6200. After the drop because 3.0 Boost it is stable at 2020/6200 with 59/60 degrees and 50% Fan Speed. Better results with 100% Fanspeed.


Plz upload your bios with GPU-Z
Quote:


> Originally Posted by *4lek*
> 
> Tried with my bech pump too [it's a sanso pdh 054], nothing really changes. Werll it does if i turn the Chiller on too, but well, that's a dofferent story
> 
> 
> 
> 
> 
> 
> 
> 
> 
> About mounting, i did this way:
> 
> I have 3 radiators:
> 2x Alphacool NexxxoS Monsta 420mm in waterstation and 1 Alphacool NexxxoS ST30 240mm inside the case itself [betwheen the cpu/mosfets and VGAs/SB].
> 
> As fans i use Noctua Redux NF-P14S, full speed most of the time.
> 
> And tooday there are around 25 degrees here, talking about outdoor temperature.
> My pc room is probably a little bit cooler, cause of house structure.


Hmm, i have 2x 480 rads, and with fans at 1200rpm i'm hitting 42/43°C after looping Valley at 4K for 10-15 minutes;
ambient temps are at 23.5°C here in my room.
Quote:


> Originally Posted by *Darkwizzie*
> 
> 
> 
> 
> 
> 
> I think +465 is where I want to be.
> 
> By min working clock you mean the minimum working clock I would typically see when the clocks are fluctuating right?
> 
> There's a particular place in TLR benchmark that tends to hit rock bottom clocks. That, and sometimes 4k extreme on Superposition.


Funny how memory clocking behaves differently from card to card; i did a run with +470 and a run with +490, and i'm gaining points where you lost points


----------



## Hulk1988

Quote:


> Originally Posted by *Hulio225*
> 
> Plz upload your bios with GPU-Z


I will be on my PC in around 3 hours. I will upload it directly.


----------



## Slackaveli

Quote:


> Originally Posted by *superV*
> 
> what are you crying about dude?had a problem and thanks to this thread i found a solution.still related to the gtx 1080 ti.found out that my cpu can't handle a gtx 1080 ti at 1080p.
> there are other people complaining about or buyer's remorse.
> still related to the 1080 ti thread.found out that a new cpu will push better the gtx 1080ti without changing the platform.


exactly, and you're welcome, and glad I could help.

Bunch of triggered lil snowflakes in here.
Quote:


> Originally Posted by *Hulio225*
> 
> 
> 
> Framerate locks at 200FPS sometimes it dips below like in screenshot
> 
> Edit:
> 
> Ill retest i saw that graphics wasn't at ultra and what are you using DX 12 on or off?


Quote:


> Originally Posted by *superV*
> 
> i don't know which direct x,i only changed to ultra from auto and that's it.
> thanks for testing.post if u have news.


that's why you should do as we were saying yesterday and exchange that 1080p/240Hz monitor for a 1440p/165Hz one, as it is impossible to max out a 240Hz monitor in 90% of games. You can't hold 165 @ 1440p in most games either, but you'll be close in most and fully locked in others. All while looking way better, with much higher PPI, cleaner, sharper lines, and an overall much better experience.


----------



## kicurry

In the games I play it kills all those expensive chips. In Project Cars it is at least 20% faster. As a matter of fact, my 5675C is faster. Also, static benchmarks are not gaming. So again: if you have a Z97 board, you do not need an expensive chip. I sold my 5820k setup as it was too slow.


----------



## Slackaveli

Quote:


> Originally Posted by *kicurry*
> 
> In the games I play it kills all those expensive chips. In Project Cars it is at least 20% faster. As a matter of fact my 5675 is faster. Also Static benchmarks are not gaming. So again if you have a z97 you do not need an expensive chip. I sold my 5820k setup as it was too slow.
























I really appreciate you Project Cars guys/gals. Y'all are the ones who discovered the magic cache on Broadwell, and many of us profusely thank you for it!! But a bunch of others will hate y'all for it. It makes them hate their much more expensive chips. They don't have the gumption that you did to sell off that $1000 paperweight and get a gaming-oriented cpu. They game on workstations







Quote:


> Originally Posted by *Hulio225*
> 
> Exactly this:
> The whole Boost 3.0 thing is working in 12.5 MHz bins, there is no way the curve would stick at 2045...
> If you reach certain temperature points it will drop by those exactly bins too, from 2100.5 to 2088 for example, and so on


this is the whole curve thing in a nutshell. it's actually quite simple, really; took me a minute to figure it all out though. the changing after hitting apply, even when on the correct points, is what threw me off for a little while.


----------



## BrainSplatter

Quote:


> Originally Posted by *Slackaveli*
> 
> exactly, and you're welcome, and glad I could help.


You could still create a thread recommending the 5775C to existing Z97 owners. Then you could help even more people, and you could just refer to that thread from here.

I was pretty pissed when my Z87 board wasn't compatible with the 5775C since that would have been the ideal upgrade from my bad 4770K chip (would only do 4.4 Ghz even delidded).

The more complex the game gets the bigger the advantage of the 5775C over other CPUs due to the cache.
RTS (Total War) > Open World (GTA V) > MP shooter (BF1) > SP shooter

Also, with the 5775C, for me, it's not so much about the maximum framerate but about the minimum (or better the lowest 0.1% and 1% frame rates explained here: 



) since those decide whether the game feels smooth or not, and that's where it has the biggest advantage.
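For reference, the 1% / 0.1% low figures are commonly computed as the average FPS over the slowest 1% (or 0.1%) of frames in a frametime log. A minimal sketch of that calculation (my own, not taken from any particular tool):

```python
# Compute 1% / 0.1% low FPS from a list of frametimes in milliseconds.
# Definition used here: average FPS over the slowest N% of frames.
def low_fps(frametimes_ms, percent):
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, int(len(worst) * percent / 100))   # how many frames make up N%
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# Mostly 60 FPS (16.7 ms) with ten 30 FPS (33.3 ms) spikes:
frametimes = [16.7] * 990 + [33.3] * 10
print(round(1000 / (sum(frametimes) / len(frametimes)), 1))  # average FPS: 59.3
print(round(low_fps(frametimes, 1), 1))                      # 1% low FPS: 30.0
```

Note how the average barely moves while the 1% low collapses to 30: exactly why the lows track perceived smoothness better than the average does.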


----------



## Slackaveli

Quote:


> Originally Posted by *BrainSplatter*
> 
> You could still create a thread for recommending the 5775C to existing Z97 owners. Then u could help even more people and you could just refer to that thread from here.
> 
> I was pretty pissed when my Z87 board wasn't compatible with the 5775C since that would have been the ideal upgrade from my bad 4770K chip (would only do 4.4 Ghz even delidded).
> 
> The more complex the game gets the bigger the advantage of the 5775C over other CPUs due to the cache.
> RTS (Total War) > Open World (GTA V) > MP shooter (BF1) > SP shooter
> 
> Also, with the 5775C, for me, it's not so much about the maximum framerate but about the minimum (or better the lowest 0.1% and 1% frame rates explained here:
> 
> 
> 
> ) since those decide whether the game feels smooth or not and there it has the biggest advantage.


I may do that, because I'm actually getting bashed in here for helping people lol. It is MORE relevant in the Ti thread than ANY OTHER thread, because this is WHERE people realize they are CPU-bound. Shame that Z87 isn't compatible. And I am not mad at you, brother, but...

THESE guys could create a thread about the 7700K, which has been mentioned in this thread three times as much as the 5775C. While you are at it, I AM TIRED of reading about water cooling. GO MAKE ANOTHER THREAD (I'm not, really, just proving a point).

This whole banning of certain processors but not others from the conversation is beyond stupid. Also, if I don't get directly blamed every time the 5775C is mentioned, that'd be greeeaaattt. I brought it up the first time; since then, I only comment on it when others bring it up. So, that'd be great.


----------



## TheBoom

I see this thread hasn't been renamed yet.

Jokes aside, it seems elmor is getting identical performance from the FE and XOC BIOSes. I guess we won't see a new XOC BIOS with tightened memory timings anytime soon, then.


----------



## CptSpig

Quote:


> Originally Posted by *Slackaveli*
> 
> ikr, that thing is hot


Reminds me of the EVGA KingPin cards. Three eight-pin power connectors means lots of power headroom that is only good for LN2. I had two KingPins and they were not good overclockers on air or water.


----------



## mcbaes72

Quote:


> Originally Posted by *KedarWolf*
> 
> I've messaged a mod about this issue and am waiting for a reply.
> 
> I've asked more than once for people to start their own 5775C thread and take this discussion elsewhere.
> 
> It's going to be dealt with.


Thank you for doing this! The last several pages have been torture and I nearly unsubscribed. I want to read 1080 Ti info and benchmarks. It's finally getting back on topic.


----------



## superV

Quote:


> Originally Posted by *Slackaveli*
> 
> exactly, and you're welcome, and glad I could help.
> 
> Bunch of triggered lil snowflakes in here.
> 
> that's why you should do as we were saying yesterday and exchange that 1080p 240Hz monitor for a 1440p/165Hz, as it is impossible to max out a 240Hz monitor in 90% of games. Cant do it in most games all the way to 165 @ 1440p, either, but you'll be close in most and fully locked in others. All while looking way better with much higher PPI, cleaner, sharper lines, and an over all much better experience.


There will not be a 1440p monitor; they didn't want to swap it for an Acer 1440p 144Hz, so I decided to keep it. I still prefer the high refresh rate.
That said, I think my GTX 1080 Ti will perform perfectly with a 5775C, because right now my GPU usage at 1080p/240Hz reaches 88/92/95%; it's just missing that 5-8% of 7700K performance to run at 100% at 1080p.
I really hope the 5775C clocks to at least 4.2GHz, or it won't gain enough performance.
Quote:


> Originally Posted by *mcbaes72*
> 
> Thank you for doing this! Last several pages has been torture and I nearly unsubscribed. I want to read about 1080 Ti info and benchmarks. It's finally getting back on topic.


Dude, what's your problem? We're talking about the GTX 1080 Ti not performing well at 1080p. It could help other people, and it's still related to the card.


----------



## Dasboogieman

Quote:


> Originally Posted by *TheBoom*
> 
> I see this thread hasn't been renamed yet.
> 
> Jokes aside, it seems elmor is getting identical performance from FE and XOC bios. I guess we won't see a new XOC bios with tightened memory timings soon then.


I'm not convinced, unless it's a Strix-specific thing (i.e. the XOC BIOS causes a performance loss on all other AIBs), because I was getting a consistent 2-3% performance regression when I used it with my Aorus. Something else about the XOC BIOS is amiss; it's not natural for performance to be lower despite higher clocks and unlimited power.


----------



## dunbheagan

Quote:


> Originally Posted by *Darkwizzie*
> 
> If going below my curve is power limiting then what is going above my curve supposed to mean? I set 2050 and it goes 2060... 2070...
> 
> 
> 
> EDIT
> 
> Went back to check curve, and it miraculously went up.
> 
> I swear, there's a ghost in my GPU!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think I need to press 'apply' for the voltage curve BEFORE I close the curve menu otherwise it gets wicky wonky. WHO KNOWS?!


I was confused by that too, and it took me ages to realize that Afterburner always shows you the curve for your current GPU temperature. Open the curve at 30 degrees and you will get a different curve than at, say, 60 degrees.

So when you open the curve right after benching while your GPU is warm, edit it for a while, and press Apply after the GPU has cooled down, AB applies the "cooler" curve, which makes it jump from the values you set to one or two bins higher, as in this case.

Since realizing that, I always edit the curve at my typical gaming GPU temperature, which is around 40 degrees; I leave a FurMark window running at a low resolution for that. It makes it much easier to know exactly what results the curve will give you.


----------



## mcbaes72

Yeah, I'm not getting suckered into answering that question.

On-topic inquiry:
I'm looking into another 1080 Ti, the hybrid version this time, for another build. From my research, only MSI and EVGA make an AIO version. For those that have them:

1) How's the cooling performance?
2) Noise level?

TIA.


----------



## mtbiker033

Quote:


> Originally Posted by *dunbheagan*
> 
> I was confused by that too and it took me ages to realize, that afterburner always shows you the curve that is connected to your actual gpu temp. Open the curve at 30 degrees and you will get another curve than at say 60 degrees.
> 
> So when you open the curve directly after benching when your gpu is warm, then edit for a while and press apply when your gpu has cooled down, AB will apply the "cooler" curve, which makes it jump from the values you applied to one or two bins higher in this case.
> 
> Since i realized that i always edit the curve at my typical gaming gpu temp, which is around 40 degree. I leave a fur mark window at a low resolution running for that. Makes it easier to know exactly which results the curve gives you.


Nice catch!! +rep! That makes total sense and takes away some of the confusion I was having about the curves and performance.


----------



## BoredErica

Seems all the voltage curve adjustment interfaces are poopy right now. With the UI at 200%, the numbers get cut off; usable, but annoying. It'd be nice to switch from point to point via Tab or something, because I don't want to have to click on the tiny dot. I found it easier to use the arrow keys than to precisely drag the dot.


----------



## ELIAS-EH

Please, someone help me decide between the MSI Gaming X, Asus Strix, and EVGA FTW3 1080 Ti.
I want the best quality and performance; I don't care about noise or price.


----------



## Dasboogieman

Quote:


> Originally Posted by *ELIAS-EH*
> 
> Please someone help me to deside between msi gaming x, asus strix and evga ftw3 1080ti
> The best quality and performance. I don't care about noise and price


Asus Strix; it also happens to be the best one to water-cool, if cost is not a thing.


----------



## ELIAS-EH

Thank you for the fast reply. The Asus Strix can be water-cooled as well?


----------



## Benny89

Quote:


> Originally Posted by *ELIAS-EH*
> 
> Please someone help me to deside between msi gaming x, asus strix and evga ftw3 1080ti
> The best quality and performance. I don't care about noise and price


So far, from personal experience, I can tell you the best temps and quality are on the STRIX. Second I would get EVGA, purely for their support quality and higher power limit. MSI last.

Overall I think this round (1080 Tis) Asus made the best all-around air GPU: sleek, neutral design fitting any build, great temps, great noise, great build quality. I haven't seen many complaints about the STRIX apart from some coil whine, but I saw that under every 1080 Ti brand topic on the internet. There are a lot of complaints about EVGA fans not spinning or working, some other minor stuff, and the incompatibility with Afterburner (a big thing for some, as I am addicted to the AB OSD).

As I said, each has some pros and cons compared to the others, but overall I would definitely recommend the STRIX this round.


----------



## Hulk1988

https://www.file-upload.net/download-12505452/GP102_1080_TI_HOF.rar.html

1080 TI Hall of Fame Bios

BE CAREFUL! People are telling me this BIOS could brick your 1080 Ti.


----------



## ELIAS-EH

What an amazing forum. Thank you guys, I will buy the Strix.


----------



## ELIAS-EH

There's an issue with using Afterburner alongside the Asus tweak software, because I need to monitor the CPU temp in game.


----------



## TheBoom

Quote:


> Originally Posted by *Dasboogieman*
> 
> I'm not convinced unless its a specific Strix thing (i.e. the XOC causes performance loss in all other AIBs) because I was getting a consistent 2-3% performance regression when I used it with my Aorus. Something else about the XOC BIOS is amiss, its not natural that the performance is lower despite higher clocks and unlimited power.


I think it's the memory timings, as mentioned a few times. The default Strix BIOS probably has these looser timings as well; maybe that's why those coming from the Strix BIOS don't see the performance drop. It's too small to make a difference in games anyway, I think. After all, most people who buy these GPUs aren't going to be comparing benchmark scores between different BIOSes.









Quote:


> Originally Posted by *ELIAS-EH*
> 
> Please someone help me to deside between msi gaming x, asus strix and evga ftw3 1080ti
> The best quality and performance. I don't care about noise and price


Quote:


> Originally Posted by *Benny89*
> 
> So far I can tell from personal experience that best temps and quality is STRIX. Second I would get EVGA only because of their support quality and higher power limit. MSI last.
> 
> Overall I think this round (1080 Tis) Asus made best all around air GPU. Sleek, neutral design fitting any build, great temps, great noise, great build quality, and I haven't seen almost any complaints about STRIX apart form some coil whine, but I saw that under every 1080Ti brand topic over internet. A lot of complaints about EVGA fans not spinning or working or some some other minor stuff and int incompatibility with Afterburner (big thing for some, as I am addicted to AB OSD).
> 
> As I said each has some pros and cons compare to others, but overall I would definitelly recommend STRIX this round.


If you had the choice, you might want to consider the Zotac AMP Extreme too. A close second, or even first, depending on the price of both where you live. I haven't heard any complaints about it either. Too bad Zotac is still somewhat of an underdog in the market.


----------



## motivman

Finally got around to finishing my build. I was set back a few weeks because my EVGA 1080 Ti FE suddenly died out of the blue, so I had to wait for EVGA to send me a replacement. The replacement card is kind of a dud: max core clock is 2050, and the memory will not go above +400 without artifacting. My old card could do 2088 on the core and 800+ on the memory, which makes me kind of sad. My MSI card tops out at 2076 on the core and can also do 800+ on the memory. I'm wondering if I should sell the replacement EVGA card and buy another one that will hopefully overclock better. What do you guys think?

My System:

CPU: 5960X @ 4.5ghz
RAM: 32Gb EVGA 3200 (16-18-18-38 2T 1.35V)
PSU: Corsair RM1000i
Case: AIR 740
FANS: Corsair ML120 PWM
Cooling: EK supremacy Evo Cpu block and EK 1080ti GPU waterblock and backplate, Hardware Labs Nemesis GTS 240 (2x) and Nemesis GTS 360

Hey, at least I made the Firestrike hall of fame for SLI cards.









http://www.3dmark.com/fs/12650639



https://pcpartpicker.com/b/h8J8TW


----------



## Benny89

Guys, please tell me if I am doing the BIOS flash right.

1. Disable the GPU in Device Manager
2. Flash the BIOS from the command line
3. Reboot
4. Enable the GPU in Device Manager
5. Profit?

Am I missing a step? I've heard some people "install drivers" after a BIOS flash; should I do a clean driver install after the reboot?

How many people with a STRIX have tried XOC and seen an improvement with no more power limit? On my STRIX (now my wife's) I still get a perf cap on power, but maybe I flashed it wrong.

Please kindly assist with advice, thanks
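For comparison, the nvflash sequence people usually describe looks roughly like this. This is a sketch only: exact flags vary between nvflash versions, the filenames are placeholders, and flashing the wrong BIOS can brick the card.

```shell
# Sketch of a typical Pascal BIOS flash with nvflash; flags vary by version.
# Flash at your own risk -- always back up the stock BIOS first.
nvflash --save backup.rom   # 1. save the current BIOS to a file
nvflash --protectoff        # 2. disable the EEPROM write protection
nvflash -6 xoc.rom          # 3. flash; -6 acknowledges the PCI subsystem ID
                            #    mismatch when cross-flashing another vendor's BIOS
# 4. reboot, then DDU the display driver and do a clean reinstall
```

The disable/enable-in-Device-Manager dance mainly matters on systems where the driver holds the GPU busy; the backup step is the part you should never skip.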


----------



## Slackaveli

Quote:


> Originally Posted by *TheBoom*
> 
> I think its the memory timings as mentioned a few times. The default strix probably has these looser timings as well maybe that's why those going from the strix bios don't see the performance drop. Its too small to make a difference in games anyway I think. After all most people who do buy the GPUs are not going to be comparing scores between benchmarks between different bioses after all
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you had the choice you might want to consider the Zotac Amp Extreme too. A close second or even first depending on the price of both where you live. Haven't heard any complains about it either. Too bad Zotac is somewhat still an underdog in the market.


I agree, and would add that the best "cheap" one is the regular Aorus. Mine smokes most of the ones that are $70-90 more expensive. For $719, that's the one: 2088 core with whisper-quiet fans, or barely audible ones at 100% power and temps around 60C.

They are all pretty good this time. I was just underwhelmed by the Aorus Extreme being worse than my regular Aorus, the MSI Gaming X being meh on performance and temps, and the EVGA cards being six weeks late without as good a chance at a golden chip as the early samples. I think EVGA are the ones who lost the train-car load of chips to bandits. Where I read it, it wasn't identified whose they were, but they are so late to market AND they don't seem to have their allotment of center-die chips like everyone else, so it's got to be them. Unlucky. After the GDDR5X incidents on the 1080, the re-fitting, and lots of money spent on damage control, losing those chips and being last to market with underwhelming samples was the last thing EVGA needed.
Quote:


> Originally Posted by *motivman*
> 
> Finally got around to finishing my build. I was set back a few weeks because my EVGA 1080ti FE suddenly died out the blue, so had to wait for EVGA to send me a replacement. My replacement card from EVGA is kind of a dud. Max core clock is 2050 and memory will not go above +400 without artifacting. My old card could do 2088 on the core and 800+ on the memory. This makes me kind of sad. My MSI card has a limit of 2076 on the core and can also go 800+ on the memory. I am wondering if i should sell my replacement EVGA card and buy another one, that can hopefully overclock better. what do you guys think?
> 
> My System:
> 
> CPU: 5960X @ 4.5ghz
> RAM: 32Gb EVGA 3200 (16-18-18-38 2T 1.35V)
> PSU: Corsair RM1000i
> Case: AIR 740
> FANS: Corsair ML120 PWM
> Cooling: EK supremacy Evo Cpu block and EK 1080ti GPU waterblock and backplate, Hardware Labs Nemesis GTS 240 (2x) and Nemesis GTS 360
> 
> hey, at least i made the firestrike hall of fame for SLI cards
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/12650639


Gorgeous system, bro. On the card: I would buy the new one first, not sell and then try again. That 2050 card is actually well above average; your other two samples were just golden, so you got crazy lucky. I'd stand pat with the 2050, and here's why. Like I said, luck won't be on your side next time. And if the game supports SLI, two 1080 Tis under water at 2050MHz is amazing and won't ever fail to deliver. In the games where you are forced to run only one card, you have a golden sample to throw at it. You're already set, imo.

Sexy, sexy beast of a rig. Very well done, sir. I'm a little jealous of you guys who can even afford a rig like this. It's like a piece of art, man.
Quote:


> Originally Posted by *Benny89*
> 
> Guys, please tell if I am doing BIOS flashing right.
> 
> 1. Disable GPU in device manager
> 2. Flash BIOS using command
> 3. Reboot
> 4. Enable GPU in device manager
> 5. Profit?
> 
> Do I miss some step? I heard some people "install drivers" after BIOS flash- should I do clean drivers install after reboot?
> 
> How many people with STRIX tried XOC and saw improvement and no more power limit? Because on my STRIX (now my wifes) I still got perf-cap on power but maybe I flashed it wrong.
> 
> Please kindly assist with advice, thanks


You DDU the drivers and reinstall them.


----------



## wammos

Quote:


> Originally Posted by *Benny89*
> 
> Guys, please tell if I am doing BIOS flashing right.
> 
> 1. Disable GPU in device manager
> 2. Flash BIOS using command
> 3. Reboot
> 4. Enable GPU in device manager
> 5. Profit?
> 
> Do I miss some step? I heard some people "install drivers" after BIOS flash- should I do clean drivers install after reboot?
> 
> How many people with STRIX tried XOC and saw improvement and no more power limit? Because on my STRIX (now my wifes) I still got perf-cap on power but maybe I flashed it wrong.
> 
> Please kindly assist with advice, thanks


You don't have to enable it in Device Manager; it will be enabled after the reboot. I always use DDU to clean up the driver install, especially when trying different BIOSes. I don't have a STRIX, but my EVGA FE on the XOC BIOS has ZERO throttling.


----------



## TheBoom

Quote:


> Originally Posted by *Benny89*
> 
> Guys, please tell if I am doing BIOS flashing right.
> 
> 1. Disable GPU in device manager
> 2. Flash BIOS using command
> 3. Reboot
> 4. Enable GPU in device manager
> 5. Profit?
> 
> Do I miss some step? I heard some people "install drivers" after BIOS flash- should I do clean drivers install after reboot?
> 
> How many people with STRIX tried XOC and saw improvement and no more power limit? Because on my STRIX (now my wifes) I still got perf-cap on power but maybe I flashed it wrong.
> 
> Please kindly assist with advice, thanks


I also see the power limit on the XOC BIOS, along with VRel and VOp, at certain clock and voltage bins. At another bin I'll get no perf caps at all, even if it's a higher bin. I honestly don't know how GPU Boost 3.0 works in that regard.

Quote:


> Originally Posted by *motivman*
> 
> Finally got around to finishing my build. I was set back a few weeks because my EVGA 1080ti FE suddenly died out the blue, so had to wait for EVGA to send me a replacement. My replacement card from EVGA is kind of a dud. Max core clock is 2050 and memory will not go above +400 without artifacting. My old card could do 2088 on the core and 800+ on the memory. This makes me kind of sad. My MSI card has a limit of 2076 on the core and can also go 800+ on the memory. I am wondering if i should sell my replacement EVGA card and buy another one, that can hopefully overclock better. what do you guys think?
> 
> My System:
> 
> CPU: 5960X @ 4.5ghz
> RAM: 32Gb EVGA 3200 (16-18-18-38 2T 1.35V)
> PSU: Corsair RM1000i
> Case: AIR 740
> FANS: Corsair ML120 PWM
> Cooling: EK supremacy Evo Cpu block and EK 1080ti GPU waterblock and backplate, Hardware Labs Nemesis GTS 240 (2x) and Nemesis GTS 360
> 
> hey, at least i made the firestrike hall of fame for SLI cards
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/12650639
> 
> 
> 
> https://pcpartpicker.com/b/h8J8TW


2050 core and 11.8GHz memory is now considered a dud, huh? That means I, and a ton of others, have dud cards as well lol.

Gone are the days when a "dud" meant a card that didn't actually work at all out of the box.


----------



## toncij

Quote:


> Originally Posted by *wammos*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Benny89*
> 
> Guys, please tell if I am doing BIOS flashing right.
> 
> 1. Disable GPU in device manager
> 2. Flash BIOS using command
> 3. Reboot
> 4. Enable GPU in device manager
> 5. Profit?
> 
> Do I miss some step? I heard some people "install drivers" after BIOS flash- should I do clean drivers install after reboot?
> 
> How many people with STRIX tried XOC and saw improvement and no more power limit? Because on my STRIX (now my wifes) I still got perf-cap on power but maybe I flashed it wrong.
> 
> Please kindly assist with advice, thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You don't have to enable in device manager it will be enabled after reboot. I always use ddu to clean up the driver install especially trying different bioses etc. I don't have a STRIX but my EVGA FE on XOC has ZERO throttling.

You're using what BIOS on EVGA FE?


----------



## dunbheagan

Quote:


> Originally Posted by *motivman*
> 
> Finally got around to finishing my build. I was set back a few weeks because my EVGA 1080ti FE suddenly died out the blue, so had to wait for EVGA to send me a replacement. My replacement card from EVGA is kind of a dud. Max core clock is 2050 and memory will not go above +400 without artifacting. My old card could do 2088 on the core and 800+ on the memory. This makes me kind of sad. My MSI card has a limit of 2076 on the core and can also go 800+ on the memory. I am wondering if i should sell my replacement EVGA card and buy another one, that can hopefully overclock better. what do you guys think?
> 
> My System:
> 
> CPU: 5960X @ 4.5ghz
> RAM: 32Gb EVGA 3200 (16-18-18-38 2T 1.35V)
> PSU: Corsair RM1000i
> Case: AIR 740
> FANS: Corsair ML120 PWM
> Cooling: EK supremacy Evo Cpu block and EK 1080ti GPU waterblock and backplate, Hardware Labs Nemesis GTS 240 (2x) and Nemesis GTS 360
> 
> hey, at least i made the firestrike hall of fame for SLI cards
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/12650639
> 
> https://pcpartpicker.com/b/h8J8TW


Supersexy! Really, really nice build. Looks awesome!

I just finished the last of the work on my new build today too. It has some (slight) similarities with your build, although your rig has some more power, with more cores and a second Ti.

1080 Ti FE on EK waterblock
4790K on EK waterblock
32GB DDR3-2400 Corsair Vengeance
Corsair Air 540


----------



## Nico67

Quote:


> Originally Posted by *Darkwizzie*
> 
> 
> 
> 
> 
> 
> I think +465 is where I want to be.
> 
> By min working clock you mean the minimum working clock I would typically see when the clocks are fluctuating right?
> 
> There's a particular place in TLR benchmark that tends to hit rock bottom clocks. That, and sometimes 4k extreme on Superposition.


Yeah, the lowest frequency it drops to during a benchmark or gaming. My gaming curve is way different, as there's not as much load.









Memory sets in increments too: if you set +495 and +497, they will likely both end up locked to the same frequency in AB. I just keep adding a few MHz until it steps up to the next increment, applying each step to see if it changes, then run the test again.
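In other words, the offsets quantize to hardware steps, so nearby requests collapse onto the same applied clock. A toy illustration (the 12.5 MHz step size here is my assumption for the example; the real memory granularity may differ):

```python
# Toy illustration: Afterburner offsets quantize to hardware steps, so
# nearby requested offsets can map to the same applied value.
STEP = 12.5  # MHz, assumed step size for illustration only

def applied_offset(requested):
    """Snap a requested offset down to the step it will actually land on."""
    return (requested // STEP) * STEP

print(applied_offset(495), applied_offset(497))  # both land on 487.5 (same step)
print(applied_offset(500))                       # next step up: 500.0
```

That's why the "add a few MHz until it visibly steps up" method works: you're just hunting for the boundary of the next increment.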


----------



## Slackaveli

Quote:


> Originally Posted by *dunbheagan*
> 
> Supersexy! Really really nice build. Looks awesome!
> 
> I just finished the last work on my new build today too, it has some (slight) similarities with your build, although your rig has some more power with more cores and a second Ti.
> 
> 1080Ti FE on EK Waterblock
> 4790K on EK Waterblock
> 32Gb DDR3 2400 Corsair Vengance
> Corsair Air 540


I love how your RAM looks with those cables. I've never seen Corsair Vengeance RAM look so good.


----------



## mcbaes72

Quote:


> Originally Posted by *Benny89*
> 
> So far I can tell from personal experience that best temps and quality is STRIX. Second I would get EVGA only because of their support quality and higher power limit. MSI last.
> 
> Overall I think this round (1080 Tis) Asus made best all around air GPU. Sleek, neutral design fitting any build, great temps, great noise, great build quality, and I haven't seen almost any complaints about STRIX apart form some coil whine, but I saw that under every 1080Ti brand topic over internet. A lot of complaints about EVGA fans not spinning or working or some some other minor stuff and int incompatibility with Afterburner (big thing for some, as I am addicted to AB OSD).
> 
> As I said each has some pros and cons compare to others, but overall *I would definitelly recommend STRIX this round.*


I agree with this recommendation. I own the Strix and, in the short time I've had it, I'm very happy with it. The fans are mostly quiet, though they do become noticeable over the case fans above 60% speed.

To put the power in perspective: my old 1070 on High settings at 1440p/144 G-Sync would average 90 FPS in Shadow of Mordor with a mild-to-moderate OC (+100/+400).

The Ti on Ultra everything with similar OC settings, using 5.6GB of VRAM, still averages 99-100 FPS at 75% GPU usage!


----------



## Benny89

Quote:


> Originally Posted by *dunbheagan*
> 
> Supersexy! Really really nice build. Looks awesome!
> 
> I just finished the last work on my new build today too, it has some (slight) similarities with your build, although your rig has some more power with more cores and a second Ti.
> 
> 1080Ti FE on EK Waterblock
> 4790K on EK Waterblock
> 32Gb DDR3 2400 Corsair Vengance
> Corsair Air 540


I love the unique color scheme on that build. I'm just another "black-red" fanboy, so I can't do any other combo, but yours looks really great.

UPDATE: My EVGA FTW3 got pushed back from a 31.05 ship date to 10.06. I said screw it and canceled it.

Got a nice $50 discount from the local store I've bought from for years and ordered SLI STRIX OC. Also ordered the HB ROG Aura bridge. I should have the cards by the weekend, but the bridge probably not until next week; those HB ROG bridges are really hard to get.

I can finally afford SLI, so I said "do it" and went for my dream.

After I check their OC, I will flash both with the XOC BIOS and see what we can do.


----------



## rakesh27

Is it worth flashing the BIOS?

I already own an EVGA FE 1080 Ti with a Kraken G10 installed for hybrid water cooling: idle is 29C and load is about 55C, if that, sometimes.

On the core I mostly get 2114 or 2110, and the memory clocks high anyway.

What are the benefits of flashing another BIOS?

Does it sometimes unlock more power for the card? By that I mean removing the constraints set by the manufacturer so you can overclock further.

Or am I best leaving the card alone as it is?

Thanks all


----------



## Benny89

Quote:


> Originally Posted by *rakesh27*
> 
> Is it worth flashing bios ?
> 
> I already own EVGA FE 1080TI, with Kracken G10 installed on it hybrid water cooling, idle 29c and load i think its about 55c if that sometimes.
> 
> my GPU overclock i can get 2114 or 2110 mostly and memory is high anyways.
> 
> What are the benefits of flashing bios with another.
> 
> Does it sometimes unlock more power for the card, which i mean by it allows the constraints set by manufacture to be taken off so you can overclock further...
> 
> Or am i best leaving it alone with the card i have ?
> 
> Thanks all


Mate, with an OC like that you don't need to do anything; just enjoy and play games.


----------



## rakesh27

Thanks mate, I will enjoy the games. It's all about da games! He he.

Can't wait for the 4K 144Hz G-Sync monitors to fully launch; I hope they come down in price after 4-5 months.

I did notice that my Zotac 1080's temps were a little higher than my EVGA FE 1080 Ti's...

Quite a difference between two presumably similar cards, and both were hybrids.

Do you all think they will release a fully fledged 1080 Ti like they did with the Titan Xp?

This computer stuff can get annoying when new stuff gets released. I promised myself I wouldn't fall into this trap again.

Thanks all.


----------



## motivman

Quote:


> Originally Posted by *TheBoom*
> 
> I also see power limit on the XOC bios along with vrel and vop if I'm at certain clock and voltage bins. Another bin and I'll get no perfcaps at all even if it is a higher bin. I honestly don't know how GPU Boost 3 works with regards to that.
> 2050 core and 11.8ghz memory is now considered a dud huh. That means myself and a ton of others have dud cards as well lol.
> 
> Gone are the days where a dud refers to cards that don't actually work at all out of the box


I am jealous of those guys hitting 2100 stable. I want my 2100 too!!! lol


----------



## BrainSplatter

Quote:


> Originally Posted by *Benny89*
> 
> ordered SLI STRIX OC.


If you don't go water, be prepared for some temperature issues on the top card due to the size of the cooler, the limited spacing on your mobo, and the type of case you have. You can already start looking for a good compromise between low voltage and clock speed. I would also recommend using the framerate limiter in Afterburner to save some power and reduce SLI micro-stutter.









My 2x Zotac 980 Tis @ 1500MHz worked fabulously air-cooled in my previous build, with the 3-slot spacing of the motherboard and a 200mm side fan in the NZXT Phantom 630 case. When I used the same cards in my new build with 2-slot spacing and a glass panel, the top card ran 20 degrees hotter than before and I had to reduce voltage and core speed.

The 2x FE 1080 Tis are now easier to cool than the 2x 980 Tis with 2.5-slot coolers, because they exhaust the hot air out of the case. At 1825MHz and 0.9V they are also quieter than the 980 Tis in the same case. But the 2x 980 Tis in the large case with the side fan were much quieter than the 2x FE cards in the small case with the glass side panel. That's why I have all the water cooling equipment for them already lying around but haven't found the time/patience to install it yet.


----------



## Benny89

Quote:


> Originally Posted by *BrainSplatter*
> 
> If u don't go water, be prepared for some temperature issues on the top card due to the size of the cooler, the limited spacing on your mobo and the type of case u have. You can already prepare for finding a good compromise between low voltage and clock speed. I would also recommend to use the framerate limiter form Afterburner in order to save some power and to reduce SLI micro stutter
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My 2x Zotac 980TIs @ 1500Mhz worked fabulous air cooled in my previous build with the 3 slot spacing of the motherboard and a 200mm side fan in the NZXT 630 Phantom case. When I used the same cards in my new build with 2 slot spacing and glass panel, the top card ran 20 degrees hotter than before and I had to reduce voltage + core speed.
> 
> The 2xFE 1080Ti's are now easier to cool than the 2x980Ti's with 2.5 slot coolers because they exhaust the hot air. @1825Mhz with 0.9v they are also quieter than the 980Tis in the same case. But the 2x980Ti's in the large case with side fan were much quieter than the 2xFE cards in the small case with glass side panel. That's why I have all the water cooling equipment for them already lying around but haven't found the time/patience to install it yet


Yes, I am aware of that; I already looked around on the internet and I know I can expect 76-82C on my top card. So I am hoping to undervolt both of them to around 1.000V at 2000MHz, or 1.02V at most. That way I can reduce the heat at least a little.

And I know I can only dream of keeping them both above 2000 on air with such temps, but that's why I went SLI: I want that powarrrrrr!

I will go custom water loop with Volta. I already invested in 2x 1080 Ti planning to sell them for 2x 2080, and to sell those for 2x 2080 Ti. So next year I will finally make another investment, in a custom loop.


----------



## dunbheagan

Quote:


> Originally Posted by *Slackaveli*
> 
> i love how your ram looks with those cables. never seen corsair vengence ram looks so good.


Quote:


> Originally Posted by *Benny89*
> 
> I love that unique color sheme on this build. I am just another "black-red" fanboy so I just can't have another combo but yours look really great.


Thanks a lot, guys! I put some time and effort into my first watercooled build and now I am very happy with it.
Quote:


> Originally Posted by *motivman*
> 
> i am jealous of those guys that are hitting 2100 stable. I want my 2100 too!!! lol


I am totally with you, buddy. Numbers can be very sexy, but don't get too obsessed with them. In the best-case scenario the difference between gaming at 2050 or 2100MHz means 100 vs 102 fps; mostly the difference will be between 0 and 1 fps.

I pushed my watercooled/shunt-modded FE to 2100MHz (not 100% stable, but stable enough for most gaming and short benching runs), but actually I game at [email protected],7930V and enjoy (almost) zero coil whine, a temp delta of 3 degrees under load between GPU and coolant, and fans at 400rpm (at this speed even my crappy Corsair SP120 fans are silent). I know most people would not sacrifice 15% performance for lower temps and noise like I do in this case, but my point is that your max OC is (in my eyes) never a very good setting for long-term gaming and shouldn't be overrated.


----------



## feznz

Quote:


> Originally Posted by *Hulk1988*
> 
> Hello everyone,
> 
> I am new here and I got my 1080 TI Hall of Fame and want to share with you some results. If you have any questions or you want to have any other results, just write it down.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> With 100% Fan Speed the card get no hotter than 56 degrees. With 50% ( Which I cannot hear anymore in a closed Define R5 case ) is the peak at 60 degrees.
> 
> Beautiful card. My first results are 2080/6200. After the drop because 3.0 Boost it is stable at 2020/6200 with 59/60 degrees and 50% Fan Speed. Better results with 100% Fanspeed.


Nice!

Could you do us a favour and upload your BIOS with GPU-Z? Thanks. I looked at TechPowerUp and the HOF BIOS isn't there.

Quote:


> Originally Posted by *Slackaveli*
> 
> I may do that because I am actually getting bashed in here for helping people lol. It is MORE relevant in the Ti thread than ANY OTHER thread, b/c this is WHERE people realize they are cpu bound. Shame that z87 isnt compatible. And i am not mad at you, brother, but...
> .


I get it... I might have followed that advice, except I would have to change platform, and as far as I can tell the 5775C has never been available in NZ; as you know, the next line-up of platforms is only weeks away.
I'm thinking I need a refresh of mobo, water loop and OS, so... do it all in one go to save time.

Quote:


> Originally Posted by *dunbheagan*
> 
> I was confused by that too and it took me ages to realize, that afterburner always shows you the curve that is connected to your actual gpu temp. Open the curve at 30 degrees and you will get another curve than at say 60 degrees.
> 
> So when you open the curve directly after benching when your gpu is warm, then edit for a while and press apply when your gpu has cooled down, AB will apply the "cooler" curve, which makes it jump from the values you applied to one or two bins higher in this case.
> 
> Since i realized that i always edit the curve at my typical gaming gpu temp, which is around 40 degree. I leave a fur mark window at a low resolution running for that. Makes it easier to know exactly which results the curve gives you.


Interesting; I will have to investigate that on an accessory monitor while gaming.
Again, I am not a competitive bencher, so the loss of a bin or two means nothing while gaming.

Quote:


> Originally Posted by *motivman*
> 
> i am jealous of those guys that are hitting 2100 stable. I want my 2100 too!!! lol


Trust me, it didn't physically make any male appendages grow bigger; it's all in their head.


----------



## dunbheagan

Quote:


> Originally Posted by *feznz*
> 
> Trust me it didn't physically make any male appendages grow bigger
> 
> 
> 
> 
> 
> 
> 
> it all in their head.


Porn is also just in your head, it feels nice nevertheless...


----------



## Hulk1988

Quote:


> Originally Posted by *feznz*
> 
> Nice
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Could you do us a favour and upload your Bios with GPUZ Thanks I looked at Techpowerup The HOF bios isn't there.


Done.

Or you can already download it here: https://www.file-upload.net/download-12505452/GP102_1080_TI_HOF.rar.html

I have tested the temperature of the card some more. Here are the results:

Idle: 29-30C at 50% RPM in the case. I cannot hear the fans; it is pretty impressive! But the room temperature is 27C. Very hot here in Germany right now.

Load temperature:

40 minutes Heaven Benchmark at 50% RPM

- Max temperature 64C
- Stable at 1999.5/6200
- Max 84.7% TDP

40 minutes Heaven Benchmark at 100% RPM

- Max temperature 59C
- Stable at 2012/6200
- Max 84.7% TDP


----------



## nyk20z3

I am happy to join the club and it will be going in a Silverstone RVZ01-E.


----------



## OZrevhead

Quote:


> Originally Posted by *dunbheagan*
> 
> I was confused by that too and it took me ages to realize, that afterburner always shows you the curve that is connected to your actual gpu temp. Open the curve at 30 degrees and you will get another curve than at say 60 degrees.
> 
> So when you open the curve directly after benching when your gpu is warm, then edit for a while and press apply when your gpu has cooled down, AB will apply the "cooler" curve, which makes it jump from the values you applied to one or two bins higher in this case.
> 
> Since i realized that i always edit the curve at my typical gaming gpu temp, which is around 40 degree. I leave a fur mark window at a low resolution running for that. Makes it easier to know exactly which results the curve gives you.


Very useful info, thanks


----------



## nrpeyton

Quote:


> Originally Posted by *Hulk1988*
> 
> I will be on my PC in around 3 hours. I will upload it directly.


That would be fantastic.

I know there are a few people eager to try it, and no one has uploaded it yet.

Do you know how high your power target is (at maximum, with the power slider at full in MSI AB)?


----------



## KedarWolf

DON'T flash the HOF BIOS. It bricked my 1080 Ti FE, my system won't even boot with it in.

I'm thinking given the file share website it may even be malware designed to brick our cards.


----------



## niobium615

Quote:


> Originally Posted by *KedarWolf*
> 
> DON'T flash the HOF BIOS. It bricked my 1080 Ti FE, my system won't even boot with it in.
> 
> I'm thinking given the file share website it may even be malware designed to brick our cards.


That sucks. You haven't been able to flash it back with another GPU installed?


----------



## Dasboogieman

Quote:


> Originally Posted by *KedarWolf*
> 
> DON'T flash the HOF BIOS. It bricked my 1080 Ti FE, my system won't even boot with it in.
> 
> I'm thinking given the file share website it may even be malware designed to brick our cards.


Even if it wasn't malware, the HOF uses a completely different voltage controller. At least the XOC has the same VC as the FE cards and most AIBs.

Fingers crossed you can cross flash it back.


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> DON'T flash the HOF BIOS. It bricked my 1080 Ti FE, my system won't even boot with it in.
> 
> I'm thinking given the file share website it may even be malware designed to brick our cards.


damn, dude! uggh!


----------



## KedarWolf

Quote:


> Originally Posted by *niobium615*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> DON'T flash the HOF BIOS. It bricked my 1080 Ti FE, my system won't even boot with it in.
> 
> I'm thinking given the file share website it may even be malware designed to brick our cards.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That sucks. You haven't been able to flash it back with another GPU installed?
Click to expand...

I have another GPU in; it won't recognise the second GPU with the 1080 Ti in. No display, even if I only have a monitor hooked up to the secondary card.

I tried my second GPU in the primary PCI-E slot with the 1080 Ti in: black screen with cursor, no BIOS to see or boot into even.


----------



## TheBoom

Quote:


> Originally Posted by *KedarWolf*
> 
> I have another GPU in, it won't recognise the second GPU with the 1080 Ti in, no display, even if I only have a monitor hooked up to the secondary card.
> 
> Tried my second GPU in primary PCI-E slot with 1080 Ti in, black screen with cursor, no BIOS to see or boot into even.


Meaning you're not even able to flash the old BIOS back? How is that possible? Unless your second card is dead as well, maybe?


----------



## KedarWolf

Quote:


> Originally Posted by *TheBoom*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I have another GPU in, it won't recognise the second GPU with the 1080 Ti in, no display, even if I only have a monitor hooked up to the secondary card.
> 
> Tried my second GPU in primary PCI-E slot with 1080 Ti in, black screen with cursor, no BIOS to see or boot into even.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Meaning you're not even able to flash the old bios back? How is that possible. Unless your second card is dead as well maybe?
Click to expand...

No, my second card works without the 1080 Ti in.


----------



## Dasboogieman

Quote:


> Originally Posted by *KedarWolf*
> 
> No, my second card works without the 1080 Ti in.


If you do RMA it, it's very likely to be successful, because it looks like even the testing centre won't be able to tell that it has a modded BIOS.


----------



## seven7thirty30

Tweaked my NVIDIA GTX 1080 Ti FE and have a stable overclock at 2062MHz core and 5940MHz memory @ 1093mV. It runs well after a few hours of The Witcher 3, Fallout 4, and MEA. Benchmarks below. Pretty happy with it so far. I will probably back the core down to 2050MHz @ 1050mV for longevity.


----------



## ViRuS2k

Quote:


> Originally Posted by *KedarWolf*
> 
> No, my second card works without the 1080 Ti in.


You will probably need someone with a Z97 OC Force board, the one where you can flick a DIP switch to disable the card's PCI Express slot.
Boot into Windows with the integrated GPU, open nvflash, flick the DIP switch to re-enable the card, and force flash it. Then reboot the system.
If you try to open Device Manager after flicking the board's PCI Express switch it will probably freeze your system; reboot, and then it should work.


----------



## Hulk1988

Hey Guys,

Please be careful with the 1080 Ti HOF BIOS. I am getting messages that this BIOS could brick your card.

And to answer the question about the maximum wattage: the card can draw 525W with Power Target at +28%, which is the maximum.


----------



## Nico67

Quote:


> Originally Posted by *KedarWolf*
> 
> DON'T flash the HOF BIOS. It bricked my 1080 Ti FE, my system won't even boot with it in.
> 
> I'm thinking given the file share website it may even be malware designed to brick our cards.


Wow, sorry to hear that, but thanks for telling us; you might have just saved a few others from trying it. Hope you can figure out a way to reflash.


----------



## feznz

Quote:


> Originally Posted by *KedarWolf*
> 
> DON'T flash the HOF BIOS. It bricked my 1080 Ti FE, my system won't even boot with it in.
> 
> I'm thinking given the file share website it may even be malware designed to brick our cards.


I am so sorry to hear that.

Here is the best pic I can upload of the HOF bare PCB (had to shrink it to 12MP); I am not sure where the voltage controller is.



Quote:


> Originally Posted by *Hulk1988*
> 
> Done
> 
> Or you can download it earlier here: https://www.file-upload.net/download-12505452/GP102_1080_TI_HOF.rar.html
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I have tested more the temperature of the card. Here are the results:
> 
> Idle 29-30C and 50% RPM in the Case. I cannot hear the fans. It is pretty impressive! But the room temperature is 27C. Very hot here in Germany right now.
> 
> LOAD Temperature
> 
> 40 Minutes Heaven Benchmark with 50% RPM
> 
> - Max Temperature 64 C
> - Stable with 1999,5/6200
> - Max 84,7% TDP
> 
> 40 Minutes Heaven Benchmark with 100% RPM
> 
> - Max Temperature 59 C
> - Stable with 2012/6200
> - Max 84,7% TDP


Can you upload the BIOS with GPU-Z please? Even though the BIOS is encrypted, at least we can compare file sizes and possibly open it in Notepad to compare.
I highly doubt there is malware in the link you gave, but it's best we get a clean copy from a known-good card; still no HOF BIOS on TechPowerUp.

Quote:


> Originally Posted by *Dasboogieman*
> 
> If you do RMA it, its very likely to be successful because it looks like even the testing centre won't be able to tell that it has a modded BIOS.


I thought the BIOS was on a soldered-on chip like this (the pic was taken from the HWBot 1080 Ti Strix). There was mention of the missing 2nd BIOS chip and switch, so it should be technically possible to repair it by soldering in another pre-flashed BIOS chip.
As for warranty, good luck.
I would hope the repairer would have the basic skills to test for a corrupt BIOS.

U5 is missing; U3 is the BIOS chip, I assume.


Quote:


> Originally Posted by *dunbheagan*
> 
> Porn is also just in your head, it feels nice nevertheless...

Sorry, it's just that there have been some crazy antics going on around that magic 2100MHz milestone; even trying to put the actual gain into context has been fruitless.


----------



## ELIAS-EH

Can I use both MSI Afterburner and ASUS GPU Tweak II together?
GPU Tweak II to enable the OC profile,
and MSI Afterburner for the in-game OSD (FPS, GPU temp, CPU temp... using RTSS).


----------



## jimmytim

Quote:


> Originally Posted by *feznz*
> 
> HOF Bios


This one? (KFA2 = Galax)
https://www.techpowerup.com/vgabios/192115/192115


----------



## feznz

Quote:


> Originally Posted by *ELIAS-EH*
> 
> can I use both msi afterburner and asus gpu tweak 2 together ?
> asus gpu tweak 2 to enable OC profile
> and msi afterburner for OSD in game (FPS, GPU TEMP, CPU TEMP... using RTSS)


They will/can interfere with each other; best to use MSI Afterburner and just manually copy the OC gaming profile's values over and save them as a profile.
I have the Strix too and didn't bother with the software, even for RGB control; I like the default red flash as it reminds me of a heartbeat, but I guess it depends on your motif.
Quote:


> Originally Posted by *jimmytim*
> 
> this one? kfa=galax
> https://www.techpowerup.com/vgabios/192115/192115


Unfortunately that looks like the FE, though the 450W power limit looks pretty nice;
we are after the KFA2 HOF 1080 Ti.

edit:

How the heck did you find that? Even with the filters it never came up, but it must be the HOF; I was just having another look.
Just unverified.


----------



## KraxKill

Quote:


> Originally Posted by *KedarWolf*
> 
> No, my second card works without the 1080 Ti in.


Can you boot with your onboard GPU with the Ti inserted?


----------



## jimmytim

Quote:


> Originally Posted by *feznz*
> 
> how the heck did you find that even though the filters it never came up but it must be the HOF I was just having another look.
> Just unverified


just do this

https://www.techpowerup.com/vgabios/?architecture=Uploads&manufacturer=&model=GTX+1080+Ti&interface=&memType=&memSize=&since=


----------



## feznz

I didn't tick the unverified BIOS uploads checkbox.

Anyway, I compared both BIOS files in Notepad and, to me at least, they appear identical.

No malware, it would appear; more likely the voltage controllers are different.


----------



## KraxKill

Quote:


> Originally Posted by *KraxKill*
> 
> Can you boot with your onboard gpu with the Ti inserted?


Never mind, you're on X99, aren't you? There's got to be a way around that for you. You're probably failing the power-on self-test with the Ti because of a dumb BIOS. Anyone around you with an onboard GPU?


----------



## Hulk1988

I uploaded my 1080 Ti HOF BIOS via GPU-Z yesterday already, at around 18:30 German time.


----------



## feznz

Quote:


> Originally Posted by *Hulk1988*
> 
> I have uploaded my 1080 TI HOF Bios via GPU-Z yesterday already at around 18:30 oclock German time


Yes, @jimmytim found it for me; it's just yet to be verified. Thank you, though it looks to be useless for anyone other than a HOF owner: we assume it has a different voltage controller, which in turn means that when flashed there is no power to the card = bricked.
I can only think of one DIY fix, which would require a donor card for the BIOS chip to be salvaged and swapped over; no idea if you could get a replacement BIOS chip with the correct BIOS on it.

Anyway, 2080MHz on the core. I'd love to see what the HOF could do on water, though being such a limited-edition card only an AIO or universal waterblock would be available, which would destroy the aesthetics of the card; it is simply beautiful.

Have you tried to push it further?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> DON'T flash the HOF BIOS. It bricked my 1080 Ti FE, my system won't even boot with it in.
> 
> I'm thinking given the file share website it may even be malware designed to brick our cards.


Didn't you buy the card with a premium credit card? They fully cover damage and even theft for up to 90 days.

They'll buy you a new card.


----------



## Hulk1988

Quote:


> Originally Posted by *feznz*
> 
> have you tried to push it further?


I have tried to push it to the limits, but the results are not 100% conclusive since I didn't have much time yesterday.

I have tried everything *without* increasing the voltage so far.

But it looks stable at 2050/6200 with 70% RPM; it drops from 2086 to 2050 and is stable after that.
With 100% RPM it is stable at 2062/6200.

70% RPM is fine for me in a closed case; I can hear the fans a little bit.

But for 24/7 settings I will use my stable 2000/6200 at 50% RPM, which I cannot hear in the case.

More results incoming today.


----------



## Hulio225

Quote:


> Originally Posted by *Hulk1988*
> 
> I have tried to push to the limits. But the results are not 100% clear since I had not that much time yesterday.
> 
> I have tried everything *without* increasing the voltage right now.
> 
> *But it looks like stable with 2050/6200 and with 70% RPM. It drops from 2086 to 2050 and it is stable after it.
> With 100% RPM it is stable with 2062/6200*
> 
> 70% RPM is fine for me in a closed case. I can hear the fans a little bit.
> 
> But for 24/7 settings I will use my stable 2000/6200 with 50% RPM which I cannot hear in the case.
> 
> More results incoming today.


When I read sentences like the bold ones, I have the feeling we have a very different understanding of what stable means ;-)
If you mean the clocks not fluctuating and call that stable, that is a different "stable" from being stable under (any) load and not crashing after some time.

And "It drops from 2086 to 2050 and it is stable after it": it drops from *2088* to 2050.5, which is 3 Boost 3.0 bins (12.5MHz each); it's dropping because you surpassed three temperature bin borders. *Smart-aleck mode off*


----------



## Hulk1988

Quote:


> Originally Posted by *Hulio225*
> 
> When i read sentences like the bold ones, i have the feeling we have a very different understanding what stable means ;-)
> If you refer to clocks not fluctuating and calling that stable, that is a different "stable" to be stable under (any) load and not crashing after some time.
> 
> And "It drops from 2086 to 2050 and it is stable after it." it drops from *2088* to 2050.5, that are 3 Boost 3.0 bins (12.5MHz), its dropping because you you surpassed three temperature bin borders. *Klugscheißermode off*


Yes, I try to set the clock higher from the beginning to end up with a better stable clock, haha.

If I want a stable 2050 I have to account for 3 drops, so I have to set +37.5 on the clock, which means 2087.

Personally, an air-cooled card that is able to run 2050 "stable" is totally fine for me.

Even if I maxed the voltage and MAYBE managed to hit 2124, the clock would end up at around 2087, or maybe less because the card gets hotter. I think we all agree that this card with a water cooler would be perfect.

But I am not aiming for that. Air, max 59-60 degrees, 2050/6200 stable after the drops, and I am happy with the card.


----------



## Hulio225

Quote:


> Originally Posted by *Hulk1988*
> 
> Yes I try to make the clock higher from the beginning to have a better stable clock at the end
> 
> 
> 
> 
> 
> 
> 
> haha
> 
> If I want to have stable 2050 I have to calculate 3 drops. So I have to set the number to +37,5 Clock.. That means 2087.
> 
> Personally a Aircard which is able to run 2050 "stable" is totally fine for me.
> 
> Even If I would max the voltage and MAYBE I am able to hit the 2124, the clock would be at the end at around 2087. Or may be less because the card gets hotter. I think we all agree that this card with a water cooler would be perfect.
> 
> But I am not aiming that. Air - max 59-60 degrees - 2050/6200 stable after the drops and I am happy with the card.


It's 2050.5, so you have to set 2088; if you set 2087 it will max out at 2075.5, because it always jumps down to the lower bin. You have to set the exact MHz of a bin, or just above it for the bins ending in .5.

But a stable clock doesn't equal a stably operating card; you have to test for a longer period of time.
My card, an FE, has done 2088MHz under air too; after a couple of days I mounted a waterblock and shunt modded it.
With the voltage @ 1081mV it does 2113MHz, and with the voltage @ 1093mV and an ambient temperature under 10°C I can pull 2126MHz... but that's a golden card. 6300MHz VRAM is easy as well if temps go down even more^^
I'm really curious how well the HOF cards are binned^^
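For anyone fiddling with the offset maths, the bin snapping described above can be sketched roughly like this (an illustration only: the 12.5MHz step and the .5-ending bin values are taken from the numbers quoted in this thread, not from any NVIDIA spec, and `snap_to_bin` is a made-up helper, not anything in Afterburner):

```python
import math

BIN_MHZ = 12.5      # Boost 3.0 step size discussed above
BIN_OFFSET = 0.5    # observed bins end in .5 or .0 (e.g. 2050.5, 2088)

def snap_to_bin(requested_mhz: float) -> float:
    """Boost only runs at discrete bins and always falls to the
    next-lower one, so requesting 2087 lands on 2075.5, not 2088."""
    steps = math.floor((requested_mhz - BIN_OFFSET) / BIN_MHZ)
    return BIN_OFFSET + steps * BIN_MHZ

# snap_to_bin(2088) stays at 2088.0; three temperature-bin drops
# from there (3 * 12.5) gives exactly the 2050.5 quoted above.
```

So setting 2087 instead of 2088 really does cost you a whole bin, which is why the advice above is to hit the bin value exactly.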


----------



## Hulk1988

Quote:


> Originally Posted by *Hulio225*
> 
> Its 2050.5 so you have to set 2088, if you set 2087 it will max go up to 2075.5,
> it always jumps to the lower bin. You have to set the exact MHz bin or a bit over it in terms of the bin with .5 at the end.
> 
> But stable clock doesn't equal stable operating card, you have to test for a longer period of time.
> My card a FE has done the 2088MHz under air too, after couple days i mounted a waterblock and shunt modded it.
> With Voltage @ 1081mV its doing 2113 MHz and with voltage @ 1093mV and an ambient temperature under 10 ° C i can pull 2126MHz... but thats a golden card...
> I'm really curious how much / how good the HOF cards are binned^^


If you give me a little tutorial on how to check everything and push the card to the limits, I am willing to do that.


----------



## Hulio225

Quote:


> Originally Posted by *Hulk1988*
> 
> When you give me a little tutorial how I can check everything to push the card to the limits. I am willing to do that.


I would say just take one of the newer benchmarks, like Time Spy or Superposition, and run it a couple of times in a row, or play some demanding new games at 4K and check whether you crash or not; after that you know what's going on with your card.


----------



## feznz

Quote:


> Originally Posted by *Hulio225*
> 
> Its 2050.5 so you have to set 2088, if you set 2087 it will max go up to 2075.5,
> it always jumps to the lower bin. You have to set the exact MHz bin or a bit over it in terms of the bin with .5 at the end.
> 
> But stable clock doesn't equal stable operating card, you have to test for a longer period of time.
> My card a FE has done the 2088MHz under air too, after couple days i mounted a waterblock and shunt modded it.
> With Voltage @ 1081mV its doing 2113 MHz and with voltage @ 1093mV and an ambient temperature under 10 ° C i can pull 2126MHz... but thats a golden card... 6300 MHz Vram is easy as well, if temps go down even more^^
> I'm really curious how much / how good the HOF cards are binned^^


That's what I was wondering: whether 15 core phases + 3 memory phases give a slight edge. At least the 450W power limit has to be a huge edge;
even with the clocks @Hulk1988 is already pulling I would be happy.

As for stability, if you can fold for 2 months straight it is stable enough in my book; I'm not sure about frying something, especially at 2000MHz+.
I guess there is a reason for the stock FE clocks, which I am led to believe are the estimated clocks that would allow folding for 10 years continuously before failure.


----------



## UdoG

What about the power limit? What is the consequence if I have a card running at 2100MHz (Strix) but it sometimes runs into the power limit? The card will slow down (for a short time)?


----------



## Hulio225

Quote:


> Originally Posted by *UdoG*
> 
> What are about the power limit? What is the consequence, if I have a card running with 2100 MHz (Strix), but it runs sometimes in the power limit? Consequently, the card will slow down (for a short time)?


If your card hits the power limit it clocks down to stay under the limit. It's pretty much like the rev limiter in a car, and the clocks fluctuate below the limiter like the revs in the car^^
Quote:


> Originally Posted by *feznz*
> 
> That's what I was wondering if *15 Core phase + 3 memory phase give a slight edge* at least a 450w power limit has to huge edge
> even the clocks @Hulk1988 already pulling I would be happy
> 
> as for stability if you can fold for 2 months straight it is stable enough in my book not sure about frying something especially @ 2000Mhz+
> I guess there is a reason for stock FE clocks which I am led to believe is the estimated clocks that would allow folding for 10 years continuously before failure.


I don't think this will give any advantage, except a marketing advantage ;-P

Even the FE phases are over-engineered for the card, imho...


----------



## UdoG

Thanks. But it's temporary, right? Does it go back to the previous clock as soon as the PL is no longer hit?


----------



## Benny89

Quote:


> Originally Posted by *UdoG*
> 
> What are about the power limit? What is the consequence, if I have a card running with 2100 MHz (Strix), but it runs sometimes in the power limit? Consequently, the card will slow down (for a short time)?


You can try flashing the XOC BIOS to possibly avoid the power limit.


----------



## Hulk1988

That is a good thing about my card: max TDP is around 85% without even increasing the Power Target. I mean, you can get the HOF card for just 30 Euro more compared to the Strix OC; I would not even have to think about whether to buy the Strix OC or the HOF.


----------



## Hulio225

Quote:


> Originally Posted by *UdoG*
> 
> Thanks. But it's temporary - or? Does it goes back to the previous clock as soon as the PL is no longer hit?


Yes, if the power draw falls back under the limit it clocks up again; if it hits the limit again it downclocks again, and so on. This is what causes clock fluctuation, and there are tests showing that this behaviour is not great, because frame times can get worse and with them the gaming experience.
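The bounce against the limiter can be pictured as a toy control loop (purely illustrative: the real Boost 3.0 controller also weighs voltage, temperature and more, and `next_clock` is an invented name, not a driver API):

```python
BIN_MHZ = 12.5  # Boost moves in ~12.5 MHz bins

def next_clock(clock_mhz: float, power_w: float,
               power_limit_w: float, max_clock_mhz: float) -> float:
    """One tick of a rev-limiter-style loop: drop one bin while over
    the power limit, climb one bin back when there is headroom."""
    if power_w > power_limit_w:
        return clock_mhz - BIN_MHZ
    return min(clock_mhz + BIN_MHZ, max_clock_mhz)

# Oscillating just under the limit like this is exactly the clock
# fluctuation (and the frame-time wobble) described above.
```

This is also why capping the OC slightly below the point where the limiter kicks in gives steadier clocks than chasing the last bin.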


----------



## ELIAS-EH

Quote:


> Originally Posted by *feznz*
> 
> they will/can interfere with each other best to use MSI then just manually copy the OC gaming profile to MSI and save it to a profile
> I have the Strix too I didn't bother with the software even for RGB control I like the default red flash as it reminds me of a heartbeat but guess it depends on your motif
> Unfortunately that looks like the FE but 450w power limit looks pretty nice
> we are after the KFA2 HOF 1080Ti
> 
> edit;
> 
> how the heck did you find that even though the filters it never came up but it must be the HOF I was just having another look.
> Just unverified


You mean that I can export the OC Mode profile of the 1080 Ti Strix from ASUS GPU Tweak II, then uninstall GPU Tweak II,
then install MSI Afterburner and import this OC profile?
Thank you.


----------



## Benny89

Quote:


> Originally Posted by *Hulk1988*
> 
> That is a good thing on my card. Max TDP is around 85% without increasing the Power Target. I mean you can get the HOF card just for 30 Euro more comparing to the STRIX OC. I would not even think about if I would buy the STRIX OC or the HOF card then


Well, there are more things to consider for some. For example: the HOF is not available where I live. Second, I have a black-red setup and a white card would look ridiculous. Third, the HOF is the ugliest 1080 Ti I have seen so far, which is of course personal taste.

Besides, there is the ASUS XOC BIOS that removes the power limit, so anyone who really needs that can flash it.


----------



## Coopiklaani

http://www.3dmark.com/3dm/20016780

My best stable OC so far.

The FSU stress test is a b*tch to pass.


----------



## kolkoo

Quote:


> Originally Posted by *Coopiklaani*
> 
> http://www.3dmark.com/3dm/20016780
> 
> 
> 
> 
> 
> 
> 
> My best stable OC so far
> 
> FSU stress test is a b*tch to pass


What bios are you using for this?


----------



## Coopiklaani

Quote:


> Originally Posted by *kolkoo*
> 
> What bios are you using for this?


XOC. I think it is the best BIOS for the FE so far.


----------



## kolkoo

Quote:


> Originally Posted by *Coopiklaani*
> 
> XOC. I think it is the best BIOS for the FE so far.


Nice.

I'm still using the EVGA FTW3 BIOS; I might switch over to the XOC then. It disables one DisplayPort, right? Which one is it in this layout: [DP1] [DP2] [HDMI1] [DP3]?


----------



## Coopiklaani

Quote:


> Originally Posted by *kolkoo*
> 
> Nice
> 
> 
> 
> 
> 
> 
> 
> I'm still using EVGA FTW3 might switch over to the XOC then. It disables one display port right? Which one is it [ DP1] [DP2] [HDMI1] [DP3] in this layout?


Yep, it kills DP3; it effectively replaces it with HDMI:
[DVI]
[DP1] [DP2] [HDMI1] [HDMI2]


----------



## Hulio225

Quote:


> Originally Posted by *Coopiklaani*
> 
> Yep, it kills DP3, actually replaces it with HDMI.
> [DVI]
> [ DP1] [DP2] [HDMI1] [HDMI2]


There is no DVI output on the FE card, mate. It has 3 DisplayPorts and 1 HDMI...

And it just kills that port; the BIOS can't physically turn a DisplayPort into an HDMI port, haha.

The XOC may be the best BIOS for a non-shunt-modded FE card; for a shunt-modded FE card, the FE BIOS is better.


----------



## feznz

Quote:


> Originally Posted by *UdoG*
> 
> What are about the power limit? What is the consequence, if I have a card running with 2100 MHz (Strix), but it runs sometimes in the power limit? Consequently, the card will slow down (for a short time)?


Personally I set my OC to where it holds the clock, though if you lose a bin or two it is really no biggie; my 24/7 OC is 2038MHz/5900MHz under water.
Sustained clocks stop me from watching the clock while gaming; now I am constantly just looking at the FPS, which is also a distraction.

Quote:


> Originally Posted by *Hulio225*
> 
> Even the FE Phases are over-engineered for the card imho...










I could have got a FE and full cover waterblock for much less than the Strix just the pretty back plate light made me buy it
Bloody marketing








If this HOF isn't a guaranteed 2100MHz card, it will prove to some just how valuable a 2100MHz card is

Quote:


> Originally Posted by *ELIAS-EH*
> 
> you mean that I can export the OC Mode from asus tweak 2 of the 1080TI STRIX, then uninstall the asus tweak 2;
> then install the msi afterburner and import this OC Profile mode ?
> thank u


MSI Afterburner is the most popular and most user-friendly OC program in my opinion, and I have the 1080 Ti Strix too
Import? I'd just copy the values over manually. In all honesty, I think the worst Strix OC I've seen was 1950MHz/5800MHz, so just have a play with the OC; you might have a 2100MHz/6200MHz card. Play with the core, then the memory, never together, so you don't lose track. You'll know when you've hit the limit: generally you'll get an error message with an application crash, or a blank-screen hang.
OC mode is all good for noobs









Quote:


> Originally Posted by *Benny89*
> 
> Well, there are more things to consider for some
> 
> 
> 
> 
> 
> 
> 
> . For example: HOF is not available where I live. Second- I have a black-red setup and a white card would look ridiculous
> 
> 
> 
> 
> 
> 
> 
> . Third- HOF card is ugliest 1080Ti I have seen so far, which is of course personal taste
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Besides, there's an Asus XOC BIOS that removes the power limit, so anyone who really needs it can flash that.


Nice motif, same as mine
As for the HOF, to be a bit blunt: I think in an all-white build it would look awesome, though personally I'd have preferred a white backplate if I were doing an all-white build.
But nothing a can of black spray paint couldn't fix: just spray-bomb the whole lot, PCB and cooler cover


----------



## Slackaveli

Quote:


> Originally Posted by *jimmytim*
> 
> just do this
> 
> https://www.techpowerup.com/vgabios/?architecture=Uploads&manufacturer=&model=GTX+1080+Ti&interface=&memType=&memSize=&since=


that 'maxsun' looks like the one: PCB like the FE, higher power limit (350W)


----------



## DeathKill

So only this week did I receive my pre-ordered 1080 Ti FTW3, after a long wait. Overall I am very happy with the purchase: the aesthetics are great, the build quality is top notch (no GPU sag), and the lighting options are beautiful; they really stand out and will complement any build.

Using the card for the first time, I noticed a huge improvement in gaming (of course, coming from a 780) at 165Hz @ 1440p.

I've started playing with the clocks a bit. I can really only use EVGA Precision XOC, as that's what's designed to give full functionality on the FTW3. And I don't really have an issue with it, actually; I like it. A lot of people give it a bad rep, maybe because they don't want to get to grips with something that isn't MSI Afterburner. But the interface is good; Afterburner is just boring... real boring, and when you're spending hours upon hours in overclocking software tweaking your card, the interface helps.

That being said, afterburner is probably more feature rich.

TL;DR

Anyways, initial results on performance and overclocking.

I used the BIOS switch to get over to the more aggressive setup (MASTER -> SLAVE).

Turned the voltage all the way up to max, cranked the power slider to 127%, and set the temp target at 90C.

Put all the fans at 80% manually, which did produce a lot of noise; probably unnecessary as I already have great airflow in my case, but it's good practice.

Then proceeded to increase the offset by 25MHz at a time and test.

I got up to 2075 boost for a while using a +75 offset on the 1569 base, before multiple benchmarks caused crashing.

2063 seemed fairly stable, but wasn't consistent across multiple benchmarks.

Voltage peaks at 1080mV.

2050 was stable, but at times settled in the 2038 range as the more consistent boost clock maintained throughout benchmarks.

+335 on the memory so far, to go alongside the +50 offset on the 1569 base.

I know I can get more than this. It's either the chip or something isn't optimal; I need to look at the voltage/frequency graphs again, because my temps do not exceed 59C in benches and often hover around 50-55C in the less intense ones like Unigine Valley and EVGA's OC Scanner.

I feel 2075 is within reach, but I might only get 2063. My temps are very good, so I don't know what else is throttling it.
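Side note on why those reported clocks keep landing on the same values (2038, 2050, 2063, 2075): Pascal's GPU Boost moves in roughly 12.5 MHz bins, so any target clock gets snapped to the nearest bin. A minimal sketch; the bin size is an assumption from observed behavior, not anything official:

```python
# Sketch: snap a requested clock to the nearest GPU Boost bin.
# BIN_MHZ = 12.5 is an assumed granularity, inferred from the clock
# values people report in this thread (2038, 2050, 2063, 2075, ...).

BIN_MHZ = 12.5  # assumed bin granularity

def nearest_bin(clock_mhz: float) -> int:
    """Snap a target clock to the nearest boost bin."""
    idx = int(clock_mhz / BIN_MHZ + 0.5)   # round half up, avoids banker's rounding
    return int(idx * BIN_MHZ + 0.5)

for target in (2040, 2055, 2070):
    print(f"{target} MHz requested -> ~{nearest_bin(target)} MHz bin")
```

Which is why a +25MHz offset often only buys one ~13MHz step in the observed boost clock.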


----------



## Hulio225

Quote:


> Originally Posted by *DeathKill*
> 
> So only this week did I receive my pre-ordered 1080 Ti FTW3 after a long wait. Overall I am very happy with the purchase, the aesthetics are great, the built quality is top notch - no GPU sag. The lighting options are beautiful, they really stand out and will compliment any build.
> 
> And using the card for the first time I noticed a huge improvement in gaming (of course coming from a 780). 165Hz @1440p
> 
> I've started playing with the clocks a bit, can only really use EVGA Precision XOC as that is designed to give full functionality over the FTW3. And I don't really have an issue with it actually. I like it, a lot of people give it a bad rep maybe because they don't want to get to grips with something that isn't MSI After burner. But the interface is good, afterburner is just boring... real boring, and when you're spending hours upon hours using overclocking software to tweak you're card, then interface helps.
> 
> That being said, afterburner is probably more feature rich.
> 
> TL;DR
> 
> Anyways, initial results on performance and overclocking.
> 
> I used the bios switch to get me over to the more aggressive set up MASTER-> SLAVE.
> 
> Turns voltage all the up to max, cranked power slider to 127% and set temp target at 90c.
> 
> Put fans all at 80% manually, which did produce a lot of noise, probably unnecessary as I already have great airflow in my case, but it is good practice.
> 
> Then proceeded to increase offset by 25Mhz and test.
> 
> I got upto 2075 boost for a while using +75 offset on 1569 base, before multiple benchmarks cause crashing.
> 
> 2063 seemed fairly stable, but wasn't consistent under multiple benchmarks,
> 
> Voltage peaks at 1080mv
> 
> 2050 was stable, but at times settled in the 2038 range as the more consistent boost clock maintained throughout benchmarks.
> 
> 335 on the mems so far to go alongside +50 offset on 1569 base.
> 
> I know I can get more than this, it's either the chip, something isn't optimal, I need to looks at the voltage/frequency graphs again because my temps do not exceed 59 in benches, and often hover around 50-55 for the less intense benches like unigine valley and OC Scanner from EVGA.
> 
> I feel 2075 is within reach, but I might just only get 2063. My temps are very good, so I don't know what else is throttling it.


Your TL;DR has more charts than what you wrote before it lol

With air-cooled cards it's always the temperature that's throttling you, ALWAYS


----------



## Luckbad

@DeathKill

Not bad at all. The throttling is likely temperature. There's probably a graph in the OP that shows temp throttle points. They start at something like 41 C.

Afterburner can get you further in benches, as can KBoost in Precision, but I agree that Precision is plenty for day to day overclocking.


----------



## Hulio225

Quote:


> Originally Posted by *Luckbad*
> 
> @DeathKill
> 
> Not bad at all. The throttling is likely temperature. There's probably a graph in the OP that shows temp throttle points. They start at something like 41 C.
> 
> Afterburner can get you further in benches, as can KBoost in Precision, but I agree that Precision is plenty for day to day overclocking.


Too slow mate, sniped ya


----------



## BoredErica

So how would a person test BIOSes effectively? Run the same core clock/voltage curve and see which BIOS averages a higher clock at 100% fan speed?

Now some are saying the timings might be different, and that I have to test the VRAM too.


----------



## Hulio225

Quote:


> Originally Posted by *Darkwizzie*
> 
> So how would a person try to test bioses effectively? Run the same core clock/voltage curve and see which bios averages out a higher clock under 100% fan speed?
> 
> Now some are saying the timings might be different and I have to test the vram too.


Yeah, you check frequency vs frequency under the same conditions. Short, fast, repeatable tests are good for that, like a GPU test from Fire Strike, Time Spy or something similar; I would not use long tests like Valley or Superposition etc.
And you need quite a big sample size and a stripped-down bench operating system, otherwise the runs aren't consistent enough... a lot of people here test under conditions where I just think *face-palm*

I basically tested all the interesting ones on my shunt modded FE card...
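The A/B method described above (many short repeatable runs per BIOS under identical conditions, then compare means) boils down to a few lines. The score lists below are hypothetical, not measured results:

```python
# Sketch of the BIOS A/B comparison: repeat a short benchmark N times per
# BIOS, compare the means, and look at the spread to judge run-to-run noise.
# The scores are invented for illustration.
from statistics import mean, stdev

def summarize(name, scores):
    m, s = mean(scores), stdev(scores)
    print(f"{name}: mean={m:.1f} sd={s:.2f} n={len(scores)}")
    return m

bios_a = [7215, 7198, 7224, 7201, 7219, 7210]  # e.g. Fire Strike GPU test, BIOS A
bios_b = [7110, 7125, 7098, 7119, 7104, 7121]  # same test, BIOS B

ma = summarize("BIOS A", bios_a)
mb = summarize("BIOS B", bios_b)
print(f"A vs B: {100 * (ma - mb) / mb:+.2f}%")
```

If the difference between the means is smaller than the run-to-run spread, you need more runs before calling a winner.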


----------



## mcbaes72

Quote:


> Originally Posted by *mcbaes72*
> 
> On Topic Inquiry:
> I'm looking into another 1080 Ti, but hybrid version for another build. From my research, only MSI and EVGA makes an AIO for the GPU. For those that have them....
> 
> 1) How's cooling performance?
> 2) Noise level?
> 
> TIA.


Gamers Nexus video was very informative for those looking into hybrid GPUs.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Darkwizzie*
> 
> So how would a person try to test bioses effectively? Run the same core clock/voltage curve and see which bios averages out a higher clock under 100% fan speed?
> 
> Now some are saying the timings might be different and I have to test the vram too.


I would just take the time to do a search through this thread and read about the feedback on all the bios.

*Hulio's* posts are probably the most solid, as he's tested various BIOSes and posted his results pretty thoroughly.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *mcbaes72*
> 
> Gamers Nexus video was very informative for those looking into hybrid GPUs.


We're all way way past that lol.


----------



## BoredErica

Quote:


> Originally Posted by *SlimJ87D*
> 
> I would just take the time to do a search through this thread and read about the feedback on all the bios.
> 
> *Hulio's* post are probably the most solid as he'd tested various bios and posted his results pretty thoroughly.


I did.

Just wanted to do some testing of my own.


----------



## DeathKill

Quote:


> Originally Posted by *Luckbad*
> 
> @DeathKill
> 
> Not bad at all. The throttling is likely temperature. There's probably a graph in the OP that shows temp throttle points. They start at something like 41 C.
> 
> Afterburner can get you further in benches, as can KBoost in Precision, but I agree that Precision is plenty for day to day overclocking.


Thank you for the useful information, and what the hell? lol, throttling starts at 41C? Wow... how come?.. That would explain it all; I figured 65C and over is what made the GPU cautious.

Afterburner is the standard, but Precision is a bit more intuitive visually; I like looking at organised things when using it for extended periods lol.

I'll try KBoost today, cheers


----------



## Hulio225

Quote:


> Originally Posted by *DeathKill*
> 
> Thank you for the useful information, and what the hell? lol 41c and throttling starts wow.... how come/.. that would explain it all, I figure 65 and over make the GPU cautious.
> 
> Afterburner is the standard, but Prescision is a bit more intuitive visually, I like looking at organised things when using it for extended periods lol.
> 
> I'll try kboost today cheers


You clearly overlooked my post... it starts throttling earlier than that. I'll quote myself:
Quote:


> Originally Posted by *Hulio225*
> 
> Your TL;DR has more charts than what you wrote before it lol
> 
> With air cooled cards its always the temperature what is throttling, ALWAYS


----------



## DeathKill

Quote:


> Originally Posted by *Hulio225*
> 
> Your TL;DR has more charts than what you wrote before it lol
> 
> With air cooled cards its always the temperature what is throttling, ALWAYS


lol I totally agree, TL;DR *but here's even more to read*; just a lot to get off my chest, it's like Jerry Springer for me here.

That graph makes things a lot clearer for me; I guess that might be it for me then on air cooling. I should be happy with something stable and low-dB fans and continue enjoying the card. I'll make a few more tweaks... but if it throttles at such low temps, oh well lol, unless some custom BIOS comes along. I have a feeling some mod will give us more freedom? Soon?

There is a lot of power delivery tech on this card; can't let it all go to waste because of early throttling... it has three fans lol


----------



## DeathKill

Quote:


> Originally Posted by *Hulio225*
> 
> You clearly overlooked my post ... it starts earlier to throttle... Ill quote myself


lol I saw your post after I replied to the other poster, that's why







but thanks for graph


----------



## ninjames

For someone who is a complete noob but is having a ton of issues with his Aorus 1080 Ti: how is the EVGA FTW3 1080 Ti? I'm getting a ton of crashes with my Aorus even though I don't really do any overclocking or anything like that, and I'd spent 40-plus hours trying to determine the cause of my crashes until I found that a lot of people are having issues with the Aorus card.


----------



## AshramKnight

Look what finally came in the mail today. Now to test these out on my 1080ti Hybrid radiator in push/pull. Will report temps soon


----------



## KingEngineRevUp

Quote:


> Originally Posted by *AshramKnight*
> 
> Look what finally came in the mail today. Now to test these out on my 1080ti Hybrid radiator in push/pull. Will report temps soon


I had those, and they're ****ing loud.

They made a 5C to 8C difference but were like 50+ dBA blowing through the radiator.

I ended up switching over to a x61 and a 140mm radiator. Keeps temperatures at 45C.


----------



## AngryLobster

I don't really understand how these reviews are achieving the temperatures at the RPMs reported.

Guru3D's Strix review has it sitting at 66C with 1400RPM, but I can't even achieve that when undervolted to 0.950v at stock boost (which drops power consumption by 12ish%). My card requires 1700RPM and hovers around 70C.

If I don't undervolt, the card is even louder and hotter. This is all with a wide-open case with no side panels and fans blowing right at the card, in a pretty low-ambient room.

I want to switch to a quieter card, but quieter based on what? I'm afraid the same thing will happen if I buy the Gaming X, as it appears nobody really pushes these cards in reviews (I play at 4K) in a way that shows what the coolers are capable of.

G12 is last resort because I hate pump noise.
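For context on why that 0.950v undervolt saves power: dynamic GPU power scales roughly with frequency times voltage squared. A back-of-envelope sketch only; the stock voltage used here is an example, and real cards add static leakage, so measured savings (the "12ish%" above) come in lower than this first-order estimate:

```python
# First-order model: dynamic power ~ f * V^2. Ignores static/leakage power
# and boost-clock changes, so it overestimates real-world savings.
def dynamic_power_ratio(v_new, v_old, f_new=1.0, f_old=1.0):
    """Ratio of new to old dynamic power for given voltage/clock changes."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Example: undervolt from an assumed 1.062 V stock to 0.950 V at the same clock.
ratio = dynamic_power_ratio(0.950, 1.062)
print(f"estimated dynamic-power saving: {100 * (1 - ratio):.0f}%")
```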


----------



## BoredErica

For fan reviews try Thermalbench, and then check the sound signatures of the fans (you can get a rough idea on YouTube; YouTube is fine here because we're judging sound signature, not loudness).

From my testing Asus bios is slightly better than FTW3.

Quote:


> Originally Posted by *AngryLobster*
> 
> I don't really understand how these reviews are achieving the temperatures at the RPM's reported.
> 
> Guru3D's Strix review has it sitting at 66C with 1400RPM but I can't even achieve that when undervolted to 0.950v at stock boost (which drops power consumption by 12ish %). My card requires 1700RPM and hovers around 70C.
> 
> If I don't undervolt, the card is even louder and hotter. This is all with a wide open case with no side panels and just fans blowing at the card in a pretty low ambient temp room.
> 
> I want to switch to a quieter card but quieter based on what. I'm afraid the same thing will happen if I buy the Gaming X as it appears nobody is really pushing the cards in reviews (I play at 4K) to actually represent what the coolers are capable of.
> 
> G12 is last resort because I hate pump noise.


I never really heard the pumps on my x61 back when I had it. The fans were louder than the pump.


----------



## Slackaveli

Quote:


> Originally Posted by *ninjames*
> 
> For someone who is a complete noob but who is having a ton of issues with his Aorus 1080Ti, how is the EVGA FTW3 1080Ti? I'm getting a ton of crashes with my Aorus and I don't really do any overclocking or anything like that, and I've spent 40-plus hours trying to determine the cause of my crashes until I found that a lot of people are having issues with the Aorus card.


did you uninstall the Aorus software, get MSI Afterburner, and set up the curves posted in the Aorus Ti owners' thread?


----------



## Coopiklaani

Quote:


> Originally Posted by *Hulio225*
> 
> You clearly overlooked my post ... it starts earlier to throttle... Ill quote myself


Hmm, strange, my card doesn't care about temperature at all. Core clock is always dead on.








I'm on XOC bios btw.


----------



## Hulio225

Quote:


> Originally Posted by *Coopiklaani*
> 
> Hmm, strange, my card doesn't care about temperature at all. Core clock is always dead on.
> 
> 
> 
> 
> 
> 
> 
> 
> I'm on XOC bios btw.


Yeah, the XOC BIOS... I tested it yesterday again, and I noticed lower frames with higher voltage (which just produces more heat) at the same clocks in my testing. I even think the frequency readouts are wrong with that BIOS at higher temps: it always shows a constant clock while the performance goes down...
Something very weird is going on with that BIOS. It's a sub-zero BIOS; maybe that's why... idk

If you could provide average FPS from a short test to go with the graph you posted, it would be clear^^


----------



## Benny89

Quote:


> Originally Posted by *Coopiklaani*
> 
> Hmm, strange, my card doesn't care about temperature at all. Core clock is always dead on.
> 
> 
> 
> 
> 
> 
> 
> 
> I'm on XOC bios btw.


Could you please run a game bench (the Tomb Raider bench, the Far Cry bench, or any other GPU-heavy game bench) using both the XOC BIOS and the stock BIOS at the same clock, and compare the min, average and max FPS?

That would be great!


----------



## Coopiklaani

Quote:


> Originally Posted by *Hulio225*
> 
> Yeah XOC bios... i tested it yesterday again, and i noticed lower frames with higher voltage (just for producing more heat) and same clocks in my testing, i even think the frequency readouts are wrong with that bios with higher temps, it shows always a constant clock while the performance goes down...
> something very weird going on with that bios, its an sub zero bios maybe thats why... idk
> 
> if you could provide average fps from a short test relating to the graph you posted it would be clear^^


I passed the Time Spy stress test with 99.6% FPS stability, so I don't think it is really a temperature thing. I don't see any performance difference between the XOC BIOS, the FTW3 BIOS, a Palit BIOS or a Zotac BIOS when using only the frequency offset slider (the curve is another thing). But the FE BIOS is slightly more efficient at the same clock rate.
http://www.3dmark.com/tsst/48553


----------



## Hulio225

Quote:


> Originally Posted by *Coopiklaani*
> 
> I passed Timespy stress test with 99.6% FPS stability. So I don't think it is really a temperature thing. I don't see any performance difference between a XOC bios, a FTW3 bios a Palit bios or a Zotac bios using only the frequency offset slider (curve is another thing). But FE bios is slightly more efficient at the same clock rate.
> http://www.3dmark.com/tsst/48553


Okay









The difference from the FE BIOS is about 1.5%; in numbers it's ~73 avg FPS versus ~72 avg FPS @ 2100/6300. Previous tests and comparisons were flawed by the 3DMark bug I've mentioned several times.

If you start 3DMark with set clock speeds and change those clocks after you've started it, you get worse results; you have to change your clocks and restart 3DMark each time you change them. I have video material showing that... maybe I'll upload it and demonstrate this^^


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Coopiklaani*
> 
> Hmm, strange, my card doesn't care about temperature at all. Core clock is always dead on.
> 
> 
> 
> 
> 
> 
> 
> 
> I'm on XOC bios btw.


Your graph also needs to show voltage; temperature affects voltage bins.

If your card is set to do 2152MHz anywhere from 1.043v to 1.093v, then it'll obviously stay at 2152MHz. What voltage it is at is the question.


See above: if they had set, say, 2100MHz at 1.030v, then that red line would just be straight at 2100MHz.

By setting a voltage curve, you're overwriting the standard curve your card follows.
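The bin mechanism above can be sketched in a few lines: GPU Boost follows a voltage/frequency curve, and rising temperature steps the card down that curve one bin at a time. Every number here is hypothetical (the curve points and temperature thresholds are invented for illustration; the thread's throttle graph only suggests steps begin around 41C):

```python
# Sketch of temperature-driven bin dropping on a voltage/frequency curve.
# CURVE and TEMP_STEPS are made-up values, not read from a real card.
CURVE = [  # (voltage V, max clock MHz), ascending -- hypothetical points
    (1.000, 2012), (1.031, 2050), (1.062, 2100), (1.093, 2152),
]
TEMP_STEPS = [35, 41, 47, 53, 59]  # assumed thresholds; one bin lost per step

def bins_lost(temp_c):
    """How many bins the card has dropped at a given temperature."""
    return sum(1 for t in TEMP_STEPS if temp_c >= t)

def effective_point(temp_c):
    """The (voltage, clock) point the card actually runs at."""
    idx = max(0, len(CURVE) - 1 - bins_lost(temp_c))
    return CURVE[idx]

for t in (30, 45, 60):
    v, clk = effective_point(t)
    print(f"{t}C -> {clk} MHz @ {v:.3f} V")
```

This is also why a locked single-point curve shows a dead-flat clock readout: there is only one bin left to run at.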


----------



## Coopiklaani

Quote:


> Originally Posted by *SlimJ87D*
> 
> Your graph also needs to graph voltage. Temperature affects voltage bins.
> 
> If your card is set to do 2152 Mhz at 1.043v to 1.093v, then it'll obviously stay at 2152 Mhz. What voltage it is at is the question.
> 
> 
> 
> See above, if they had set lets say 2100 Mhz at 1.030v, then that red line would just be straight and at 2100 Mhz.


well, the XOC BIOS doesn't cap the voltage or the power, so the card can go as high as it wants until it crashes or bursts into flames.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Coopiklaani*
> 
> well XOC bios doesn't cap the voltage nor the power, so the card can go as high as it wants until it crashes or bursts into fire.


But it also scores lower in benchmarks while doing so


----------



## Benny89

Quote:


> Originally Posted by *SlimJ87D*
> 
> But it also scores lower in benchmarks while doing so


But how does it perform in actual games? Has anyone made any comparison, or does everybody just run benches and nobody actually plays with the new BIOS?

Bench score is not always game performance.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> But how does it perform in actuall games? Anyone make any comparsion or everybody just do benches and nobody actually play with new ?
> 
> Bnech score is not always game performance.


I've been asking for that for a while now.

That said, a benchmark like Heaven is quite consistent at reflecting game performance; it's still a great predictor of actual gameplay.

Benny, I thought you were going to get SLI and not have to worry about 1% more performance anymore


----------



## Coopiklaani

30mins of witcher 3 with XOC bios. Not a single drop or increase in voltage.


----------



## Hulio225

Quote:


> Originally Posted by *Coopiklaani*
> 
> 
> 
> 30mins of witcher 3 with XOC bios. Not a single drop or increase in voltage.


Yeah, it's a decent gaming BIOS for cards without a shunt mod or other power-limit mod, period.


----------



## Coopiklaani

Quote:


> Originally Posted by *Hulio225*
> 
> Yeah its a decent bios for not shunt modded or other method power limit modded cards for gaming, period.


I did the shunt mod but am still using XOC. The shunt mod can't remove the "normalised TDP" power limit of the FE BIOS: max power is still capped to about 350-375W on the FE BIOS, even with the shunt mod.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Coopiklaani*
> 
> I did shunt mod but still am using XOC. Shunt mod cannot remove "normalised TDP" power limit of the FE bios. The max power is still capped to about 350w to 375w on a FE bios, even with the shunt mod.


Can you name ANY games that run into the normalized TDP? Because *Witcher 3, Ghost Recon Wildlands and For Honor do not*, and they're some of the most power-hungry games out there.


----------



## ElectroGeek007

Picked up a new toy to replace my 980 Ti, and holy cow this thing is fast (and huge)!


----------



## AshramKnight

Went from 45C gaming to 35C with the Noctua 3000RPM fans in push/pull. Amazing. 2025MHz now at 1.087v, 2012MHz at 1.075v, and 2000MHz at stock 1.050v. I'm definitely happy with these results. The fans are noticeable noise-wise, but my case drowns it out for the most part, plus I game with headphones, so all good











Side note: yes, that is a third fan installed at the bottom of my CPU radiator. I used the EVGA fan that came in my Hybrid kit; it lowered temps by a few C
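Those three stable points (2000MHz @ 1.050v, 2012 @ 1.075, 2025 @ 1.087) are enough to eyeball a voltage/frequency curve by linear interpolation. A rough sketch only; silicon varies card to card, so treat any interpolated voltage as a starting guess, not a stability guarantee:

```python
# Linear interpolation between reported stable (clock, voltage) points to
# guess the voltage an in-between clock might need. Points are from the
# post above; the result is an estimate, not a validated setting.
POINTS = [(2000, 1.050), (2012, 1.075), (2025, 1.087)]  # (MHz, V), ascending

def volts_for(clock_mhz):
    """Interpolate the voltage for a clock inside the measured range."""
    for (c0, v0), (c1, v1) in zip(POINTS, POINTS[1:]):
        if c0 <= clock_mhz <= c1:
            return v0 + (v1 - v0) * (clock_mhz - c0) / (c1 - c0)
    raise ValueError("outside the measured range")

print(f"~{volts_for(2006):.3f} V estimated for 2006 MHz")
```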


----------



## AngryLobster

A single one of those on a rad records 60+ dBA at full speed.

http://thermalbench.com/2015/03/04/noctua-nf-f12-industrialppc-3000-pwm-120mm-fan/3/

Those are definitely audible with headphones on, or even from a different room. I'm really not sure why that 25MHz matters in games, because it's imperceptible.

You'd be better off betting on the silicon lottery with a different card than taking this approach.

You are insane.
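For anyone weighing push/pull: sound levels add logarithmically, so a second identical fan adds about 3dB, not double the number. A quick sketch of plain acoustics, nothing card-specific:

```python
# Combine sound pressure levels in dBA: convert each level to power,
# sum, and convert back. Two identical 60 dBA fans -> ~63 dBA total.
import math

def combine_dba(*levels):
    """Sum multiple dBA levels into one overall level."""
    return 10 * math.log10(sum(10 ** (l / 10) for l in levels))

print(f"two 60 dBA fans together: {combine_dba(60, 60):.1f} dBA")
```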


----------



## AshramKnight

Quote:


> Originally Posted by *AngryLobster*
> 
> A single one of those on a rad records 60+ dBa at full speed.
> 
> http://thermalbench.com/2015/03/04/noctua-nf-f12-industrialppc-3000-pwm-120mm-fan/3/
> 
> Those are definitely audible with headphones on or even in a different room. I'm really not sure why that 25mhz matters in games because its imperceptible.
> 
> Better off betting on silicon lottery with a different card vs this approach.
> 
> You are insane.


Well, I am considering using my Step-Up to an FTW3 and seeing what I can get with it. Honestly, though, 2000-2025MHz for a $700 Founders Edition with no tax is pretty awesome. Yeah, the Hybrid and fans were a chunk of change, but I'll probably be using them on the next big GeForce that comes out, so money well spent. Being at 35C is a great place to be on a 1080 Ti; less voltage/MHz fluctuation in benchmarks/games for sure.

Air cooled I was at 65C and 1949-1974MHz; water cooled + fans, 35C and 2000-2025MHz.

The noise is noticeable, but like I said my case is huge and drowns out some of it.


----------



## Hulio225

Quote:


> Originally Posted by *AshramKnight*
> 
> Well, I am considering using my step up to a FTW3 and seeing what I can get with it. Honestly though 2000-2025 mhz for a $700 founders edition with no tax is pretty awsome. Yeah the Hybrid and fans were a chunk of change, but ill be using it on the next big Geforce that comes out probably, so money well spent. Being at 35C is a great place to be on a 1080ti, less voltage/mhz fluctuations in benchmarks/games for sure.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Air cooled I was at 65C 1949-1974 mhz, water cooled + fans 35C 2000-2025.
> 
> The noise is noticible, but like I said my case is huge, and drowns out some of it.


2025MHz rock stable here @ 1000mV







and not audible

Edit: But psst don't tell anyone, my custom loop is a second 1080 ti^^


----------



## Benny89

Quote:


> Originally Posted by *SlimJ87D*
> 
> I've been asking for that for awhile now.
> 
> But it's quite consistent for a benchmarks like Heaven Benchmark to reflect game performance.
> 
> Heaven is still a great benchmark to predict actual gameplay performance.
> 
> Benny, I thought you were going to get SLI and not have to worry about 1% more performance anymore


Yes, waiting for my cards. I actually got the ROG HB bridge faster than the cards lol

I just want to understand if the XOC BIOS really works in games, because one thing XOC did was disable thermal downclocking (at least on the Strix), so that might be useful for SLI.

But again, I want to see actual game benches (like the ROTR bench) with both BIOSes. That would tell us whether the FPS count is any different or not.

Can't wait for that SLI though ^^


----------



## gmpotu

Quote:


> Originally Posted by *Dasboogieman*
> 
> Which is the result of one of three scenarios:
> 1. The CPU is struggling to supply work to the GPU
> 2: Driver overhead is preventing full GPU utilization.
> 3: Poor GPU work scheduling/poor BF1 code.
> 
> All except number 3 can be addressed with more CPU power. Ergo the CPU is likely the problem as the owner is not satisfied with the number of Frames he is getting.
> Its not a gimmick, you are misunderstanding how GPUs work.
> Fundamentally, this how a modern GPU works. The CPU prepares the workload for the GPU to process, the GPU then outputs the result to the monitor.
> Its a perpetual balancing curve between CPU-GPU bottlenecking. Light workload shifts the balance to the CPU, heavy graphics workload shifts the balance the GPU. Neither component is independent of each other as far as graphics workload is concerned.
> 
> With a 240FPS cap at 1080p, your workload is extremely lightweight for the GPU as it can generate frames faster than the CPU can supply it. Thus, the speed of your CPU limits the output of your GPU. Likewise if you set it to render 4K, the workload shifts back to the GPU which means the CPU waits on the GPU to finish before supplying more frames.
> 
> Which brings us back to the final conclusion, if you desire more frames then what you already have, you need a stronger CPU.


Great explanation. Thank you for taking the time to go into detail on this. Very insightful.
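Dasboogieman's balancing act reduces to one line: delivered frame rate is capped by whichever stage is slower at producing frames (plus any FPS cap). A minimal sketch with illustrative numbers only:

```python
# Toy model of the CPU/GPU pipeline balance: the slower stage sets the pace,
# and an FPS cap clamps the result. All numbers are illustrative.
def delivered_fps(cpu_fps, gpu_fps, cap=None):
    """FPS you actually see given each stage's standalone throughput."""
    fps = min(cpu_fps, gpu_fps)
    return min(fps, cap) if cap is not None else fps

print(delivered_fps(cpu_fps=180, gpu_fps=300, cap=240))  # light 1080p load: CPU-bound
print(delivered_fps(cpu_fps=180, gpu_fps=70))            # heavy 4K load: GPU-bound
```

Which is why a faster CPU helps at 1080p/240 but does almost nothing at 4K.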


----------



## gmpotu

Quote:


> Originally Posted by *s1rrah*
> 
> Another benchmark to aim our 1080 ti's at ... LOL:
> 
> Final Fantasy Storm Blood Benchmark
> 
> It's 1.9gb!
> 
> http://na.finalfantasyxiv.com/benchmark/download/


Interested to see what scores you guys get vs the scores you're getting in Superposition and Time Spy


----------



## gmpotu

Quote:


> Originally Posted by *KedarWolf*
> 
> I only have a measly 5960x at 4.7GHZ , I feel sooooo, well, inadequate.


I feel you, with my 3770K @ 4.2GHz turbo boost.
Haven't really tried to OC much.

WoW uses 100% GPU with max settings and SSAA @ 1080p, and I get about 60fps.

FFXIV uses 100% GPU in dungeons and raids, and I get like 140fps on my 1080p 60Hz monitor.

After reading all these CPU-war discussions, it sounds like I should just forget upgrading my CPU at all and go for a 21:9 ultrawide 1440p


----------



## KingEngineRevUp

Quote:


> Originally Posted by *AshramKnight*
> 
> Went from 45C gaming to 35C with the Noctua 3000rpm fans in push pull. Amazing. 2025 mhz now with 1.087 voltage, 2012 mhz with 1.075 voltage, and 2000 1.050 stock voltage. Im def happy with these results. The fans are noticible in noise but my case drowns out the noise for the most part, plus I game with headphones so all good
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Side note : yes that is a third fan installed at the bottom of my cpu radiator, used the EVGA fan that came in my Hybrid kit, lowered temps by a few C


You must be deaf lol, 60+dba is super loud even with headphones.


----------



## AshramKnight

Quote:


> Originally Posted by *SlimJ87D*
> 
> You must be deaf lol, 60+dba is super loud even with headphones.


Honestly it's not that bad. My Radeon 4870 at 75% fan speed was louder than this; at 100% it sounded like a jet engine. Heck, the fan I have in my room is louder than my computer, so it really doesn't matter much. My case contains noise pretty well, plus I've got gaming headphones.


----------



## feznz

Quote:


> Originally Posted by *AngryLobster*
> 
> I don't really understand how these reviews are achieving the temperatures at the RPM's reported.
> 
> Guru3D's Strix review has it sitting at 66C with 1400RPM but I can't even achieve that when undervolted to 0.950v at stock boost (which drops power consumption by 12ish %). My card requires 1700RPM and hovers around 70C.
> 
> If I don't undervolt, the card is even louder and hotter. This is all with a wide open case with no side panels and just fans blowing at the card in a pretty low ambient temp room.
> 
> I want to switch to a quieter card but quieter based on what. I'm afraid the same thing will happen if I buy the Gaming X as it appears nobody is really pushing the cards in reviews (I play at 4K) to actually represent what the coolers are capable of.
> 
> G12 is last resort because I hate pump noise.


It really depends on your ambient temperature and the case airflow. I'm not sure, but some do test on an open test bench like this,

or possibly with the air con turned right down to 18°C

Quote:


> Originally Posted by *Benny89*
> 
> Yes, waiting for my cards. Actually got ROG HB bridge faster than them lol
> 
> 
> 
> 
> 
> 
> 
> .
> 
> I just want to understand if XOC Bios is really really working in games. Because one thing that XOC did was disabling thermal downclocking (at least on STRIX) so that might be usefull for SLI.
> 
> But again, I want to see actually game benches (like ROTR bench) with both BIOSes. That would tell us if FPS count is any different or not.
> 
> Can't wait for that SLI thought ^^


No thermal throttling? Hmm, are you looking at turning your top card into a frying pan?








That's actually why I got a Maximus Extreme motherboard: four slots of single-slot spacing with SLI. Even then I still had residual heat from the bottom card with a 200mm and a 140mm fan blowing on the cards, so I gave up and switched to water. It's water only for me for SLI,
or go the FE blower style if you insist on staying on air.


----------



## mtbiker033

I'm using SP-120s on my hybrid, and even at full RPM (2200) they are quieter than the stock blower fan was. I usually set them to 1500rpm when I game and they are damn near silent!









and I am using an open test bench!!


----------



## Coopiklaani

My EK Vardar F1 fans with 1150max RPM really struggle to keep the water in the loop cool with my GPU piping over 350w of heat.


----------



## Alex132

Mine handle my 295X2's heat fine in the single-rad AIO it came with, at around 400-700rpm.


----------



## Luckbad

Finally found a golden FTW3. I'll fully exercise it later tonight, but it does well over 2000MHz and hits at least 6000 on the RAM.


----------



## ninjames

Quote:


> Originally Posted by *Luckbad*
> 
> Finally found a golden FTW3. Will fully exercise it later tonight but it does well over 2000MHz and hits at least 6000 on ram.


What does golden mean? Is there a problem with some of those FTW3 cards? I was considering buying one after my Aorus gave me a bunch of problems.


----------



## Slackaveli

Quote:


> Originally Posted by *KraxKill*
> 
> So....I've done some preliminary testing but this thing is miles ahead of my 4790k @ 5.2ghz. @ 1.390v.
> 
> With the settings I'm running, Arma 3 Altis benchmark hits 103 fps @ 4.2ghz @ 1.35v vs just 93fps on my 4790k on what could be considered a one of a kind chip.
> 
> I need more time to test the behavior of this chip elsewhere to properly select a keeper but to my surprise Slackaveli was not kidding. This thing really does fly. It's 10% faster than my 4790k at 5.2ghz while running just 4.2 ghz and I'm sure there is more room in there.
> 
> I feel these things are going to be sold out tomorrow....


----------



## Luckbad

Quote:


> Originally Posted by *ninjames*
> 
> What does golden mean? Is there a problem with some of those FTW3 cards? I was considering buying one after my Aorus gave me a bunch of problems.


Nope. All cards are part of the silicon lottery. My first was good but not amazing, and this one seems to be outstanding. We'll see if I can resist breaking the seal on the other I ordered at the same time.

I bought about 8-9 total 1080 Tis to find the best potential overclocker.

Found an amazing Zotac Amp Extreme, but nobody seems to be making a waterblock for it, so I've been hunting for a great FTW3 and found one on the second try.


----------



## ninjames

Quote:


> Originally Posted by *Luckbad*
> 
> Nope. All cards are part of the silicon lottery. My first was good but not amazing, this one seems to be outstanding. We'll see if I can resist breaking the seal on the other I ordered at the same time.
> 
> I bought about 8-9 total 1080 Tis to find the best potential overclocker.
> 
> Found an amazing Zotac Amp Extreme, but nobody seems to be making a waterblock for it, so I've been hunting for a great FTW3 and found one on the second try.


Oh OK. So what's the word on that card specifically, for less crazy users like me who just want a card that isn't a headache to deal with and doesn't have to be babysat, like the Aorus I just returned.


----------



## Slackaveli

Quote:


> Originally Posted by *ninjames*
> 
> Oh OK. So what's the word on that card specifically for less crazy users like me who just want a card that isn't a headache to deal with and has to be babysit, like my Aorus I just returned.


glad you were able to return it. It was a bad card and really isn't representative of the Aorus experience in general. But I don't blame you for wanting to try a different one. FTW3 or Strix.


----------



## Benny89

Quote:


> Originally Posted by *ninjames*
> 
> Oh OK. So what's the word on that card specifically for less crazy users like me who just want a card that isn't a headache to deal with and has to be babysit, like my Aorus I just returned.


Like totally no problems? STRIX. The FTW3 is good, but it has some problems with its fans, it's louder, and you can't set up a custom fan curve in Afterburner on it. You are pretty much forced to use Precision X, which is not as good as AB.

Honestly, if you want to just plug card in, do some small OC via sliders and enjoy your time- STRIX in my opinion.


----------



## ninjames

Quote:


> Originally Posted by *Slackaveli*
> 
> glad you were able to return it. It was a bad card. It really isnt representative of the Aorus experience in general. But I dont blame you for wanting to try a different one. FTW3 or Strix.


Yeah, I'm not going to, like... hate the brand now or anything. I thought it ran really well. But the pool of cards from the Xtreme batch that seem to be duds is a little more of a headache than I'm willing to deal with right now. Again, as someone who is far, far more casual than most of the users on this forum! It doesn't help that at the same time, the PC I had before this one, which I shipped to my brother, arrived without powering on, so I had to troubleshoot my Aorus and my old rig across the country, and then my laptop's Windows Store also broke so I had to restore a backup.

Computers, for me, have been terrible for the past week!


----------



## Luckbad

Quote:


> Originally Posted by *ninjames*
> 
> Oh OK. So what's the word on that card specifically for less crazy users like me who just want a card that isn't a headache to deal with and has to be babysit, like my Aorus I just returned.


While you are beholden to Precision XOC for the fan curve on the EVGA FTW3, I always recommend EVGA cards to people because of the stellar customer service.

I personally didn't like the sound of the Asus Strix OC. The Aorus was fine. Zotac Amp Extreme is amazing if you're going to adjust your fans and move on (runs quiet with good temps).

The FTW3 isn't as quiet as the larger multi-slot cards. That said, I was able to change my fans to 50% max and set the target temp to 68 C and it works great.

If you're okay with a huge card but want to be guaranteed outstanding performance, ping me in PM and I can sell you my Zotac instead of eBaying it or give you more info.


----------



## AshramKnight

Update: Doom at 4K is solid at 2012MHz, stock voltage. No fluctuations at all. Before, with the stock EVGA fan on my Hybrid, I was getting 1987MHz. Nice change


----------



## CoreyL4

Quote:


> Originally Posted by *Benny89*
> 
> Like totally no problems? STRIX. FTW3 is good, but it has some problems with fans, its louder, you can't set up custom fan curve in Afterburner on it. You are pretty much forced to use Precision X, which is not as good as AB.
> 
> Honestly, if you want to just plug card in, do some small OC via sliders and enjoy your time- STRIX in my opinion.


When you are talking about the FTW3, do you mean the actual card and not the BIOS? I have the FTW3 BIOS on my Gaming X and can set a custom fan curve in AB.


----------



## Coopiklaani

Quote:


> Originally Posted by *Luckbad*
> 
> While you are beholden to Precision XOC for the fan curve on the EVGA FTW3, I always recommend EVGA cards to people because of the stellar customer service.
> 
> I personally didn't like the sound of the Asus Strix OC. The Aorus was fine. Zotac Amp Extreme is amazing if you're going to adjust your fans and move on (runs quiet with good temps).
> 
> The FTW3 isn't as quiet as the larger multi-slot cards. That said, I was able to change my fans to 50% max and set the target temp to 68 C and it works great.
> 
> If you're okay with a huge card but want to be guaranteed outstanding performance, ping me in PM and I can sell you my Zotac instead of eBaying it or give you more info.


What do you think of GALAX GTX 1080 Ti HOF Limited Edition? It boosts to 2064MHz out of the box, must be a binned GPU. Maybe the only binned SKU besides the KNP?


----------



## Slackaveli

Quote:


> Originally Posted by *ninjames*
> 
> Yeah I'm not going to like ... hate the brand now or anything. I thought it ran really well. But the pool of cards out there that seem to be duds from the Xtreme batch is a little more of a headache than I'm willing to deal with right now. Again, as someone who is far, far more casual than most of the users on this forum! It doesn't help that at the same time, the PC I had before this, I shipped to my brother and it arrived without powering on so I had to troubleshoot my Aorus, my old rig across the country, and my laptop's Windows Store also broke so I had to restore a backup.
> 
> Computers, for me, have been terrible for the past week!


damn, that's unfortunate. i'd probably go with Benny's advice on this one. He's owned them all at this point.
Quote:


> Originally Posted by *Benny89*
> 
> Like totally no problems? STRIX. FTW3 is good, but it has some problems with fans, its louder, you can't set up custom fan curve in Afterburner on it. You are pretty much forced to use Precision X, which is not as good as AB.
> 
> Honestly, if you want to just plug card in, do some small OC via sliders and enjoy your time- STRIX in my opinion.


or I'd buy LuckBad's Zotac golden guaranteed chip.
Quote:


> Originally Posted by *Luckbad*
> 
> While you are beholden to Precision XOC for the fan curve on the EVGA FTW3, I always recommend EVGA cards to people because of the stellar customer service.
> 
> I personally didn't like the sound of the Asus Strix OC. The Aorus was fine. Zotac Amp Extreme is amazing if you're going to adjust your fans and move on (runs quiet with good temps).
> 
> The FTW3 isn't as quiet as the larger multi-slot cards. That said, I was able to change my fans to 50% max and set the target temp to 68 C and it works great.
> 
> If you're okay with a huge card but want to be guaranteed outstanding performance, ping me in PM and I can sell you my Zotac instead of eBaying it or give you more info.


Anybody hear if @Kedarwolf was able to revert his BIOS?


----------



## ninjames

Quote:


> Originally Posted by *Luckbad*
> 
> While you are beholden to Precision XOC for the fan curve on the EVGA FTW3, I always recommend EVGA cards to people because of the stellar customer service.
> 
> I personally didn't like the sound of the Asus Strix OC. The Aorus was fine. Zotac Amp Extreme is amazing if you're going to adjust your fans and move on (runs quiet with good temps).
> 
> The FTW3 isn't as quiet as the larger multi-slot cards. That said, I was able to change my fans to 50% max and set the target temp to 68 C and it works great.
> 
> If you're okay with a huge card but want to be guaranteed outstanding performance, ping me in PM and I can sell you my Zotac instead of eBaying it or give you more info.


Thanks for all the advice. Fan noise doesn't bother me though, like at all. I'm in a room where I always have a window A/C unit running right behind me. And I'd take you up on the Zotac but my refund for my Aorus is in the form of Newegg credit. I had an EVGA 770 a few years ago and that served me very well. I think maybe I'll go with the FTW3 -- weren't people saying the Precision software update fixed the fan issues?


----------



## Luckbad

Quote:


> Originally Posted by *Coopiklaani*
> 
> What do you think of GALAX GTX 1080 Ti HOF Limited Edition? It boosts to 2064MHz out of the box, must be a binned GPU. Maybe the only binned SKU besides the KNP?


They don't guarantee that boost out of the box. Might be binned, though. I've had two GALAX HOF cards and loved them.


----------



## Coopiklaani

Quote:


> Originally Posted by *Luckbad*
> 
> They don't guarantee that boost out of the box. Might be binned, though. I've had two GALAX HOF cards and loved them.


Not the ordinary GALAX HOF, but the Limited Edition. They come with a 1645MHz base clock, the highest stock clock on the market, higher than most cards can ever reach.


----------



## Luckbad

Quote:


> Originally Posted by *Coopiklaani*
> 
> Not the ordinary Galax HOF, but the limited edition. They come with 1645MHz base clock, highest stock clock on the market, higher than most of the cards can ever reach.


That is the same stock clock as the Zotac Amp Extreme. https://www.techpowerup.com/gpudb/2877/geforce-gtx-1080-ti

The Zotac I have boosts out of the box to 2037.5 MHz with no changes. 2064 is bonkers good and I wouldn't expect that out of almost any card. I've seen exactly two cards boost stock over 2 GHz, and both were Zotac Amp Extremes. Looks like GALAX is getting in on it too!


----------



## Luckbad

As you might have anticipated, after finding one golden sample EVGA 1080 Ti FTW3 that I could clock over 2 GHz core and 12 GHz (aka 6 GHz... depends on what number you prefer) memory with little effort (no voltage addition)...

I opened the second FTW3 that came in the same box. Bought them both at the same time from BH.

The second FTW3 is ALSO golden sample!

Lightning strikes twice!

Gad, I don't know which is better either. I already have this one going over 2 GHz / 12 GHz without issue.

It's going to take all weekend to determine which card is better (frankly, they might be identical and from the same manufacturing run, so there might not be a better one). I'll be selling one of them and keeping the other to put on water once EKWB starts selling their waterblocks.
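The "12 GHz (aka 6 GHz)" above is the same memory speed counted two ways: monitoring tools report a clock around 6000 MHz, while the marketed effective data rate is double that. A trivial converter, just my own sketch of that relationship, not from any vendor tool:

```python
def effective_gbps(reported_mhz):
    """Effective GDDR5X data rate in Gbps from the tool-reported clock."""
    # effective (marketed) rate is double the clock Afterburner-style tools show
    return reported_mhz * 2 / 1000

print(effective_gbps(5505))  # stock 1080 Ti: ~11 Gbps
print(effective_gbps(6000))  # "6 GHz" on the slider = the "12 GHz" figure
```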


----------



## ttnuagmada

Quote:


> Originally Posted by *AshramKnight*
> 
> Went from 45C gaming to 35C with the Noctua 3000rpm fans in push-pull. Amazing. 2025MHz now at 1.087V, 2012MHz at 1.075V, and 2000 at 1.050 stock voltage. I'm definitely happy with these results. The fans are noticeable noise-wise, but my case drowns it out for the most part, plus I game with headphones, so all good
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Side note : yes that is a third fan installed at the bottom of my cpu radiator, used the EVGA fan that came in my Hybrid kit, lowered temps by a few C


I had some of the 2k RPM industrials, and at max RPM even those were pretty loud. I couldn't do it.


----------



## Nico67

Quote:


> Originally Posted by *Nico67*
> 
> 13mhz increments, 2045 doesn't exist
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I find it always better to default the curve, press apply, then raise the highest bin first, then move down the bins after that and hit apply. If you pick frequencies fractionally above the fixed increments they tend to apply correctly, like 2053 will set 2050 OK.
> 
> The curve you show above will draw less power but loses a ton of performance; you need at least one step down and then the next 4 lower bins raised.
> 
> This is the curve I was running, but I probably should have had those last two bins raised as well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> the last 4 bins give a good performance boost, like some invisible limiting is happening so optimising them helps.


So I retuned the curve to add those extra points and got 10795. I played with the memory too, and I get good scaling as long as I restart the benchmark program after a memory adjustment.



this is with the Asus FE BIOS, and it's dropping to 2063 due to limiting from the extra performance load, but 3 bins below that seems enough. It's so much easier to set the XOC BIOS, but the performance is just worse, and the better you tune FE, the worse XOC looks again
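Nico67's point that "2045 doesn't exist" can be sketched as snapping to the nearest bin. The ~13 MHz step and the 2050 anchor are assumptions pulled from values in this thread, not an NVIDIA spec, and the helper is purely illustrative:

```python
def nearest_bin(target_mhz, anchor_mhz=2050, step_mhz=13):
    """Snap a requested clock to the nearest ~13 MHz Pascal bin."""
    steps = round((target_mhz - anchor_mhz) / step_mhz)
    return anchor_mhz + steps * step_mhz

print(nearest_bin(2045))  # 2045 "doesn't exist" -> 2050
print(nearest_bin(2053))  # "2053 will set 2050 ok" -> 2050
```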


----------



## AngryLobster

Quote:


> Originally Posted by *feznz*
> 
> It really depends on your ambient temperature and the case airflow. I'm not sure, but some reviewers test on an open bench like this,
> 
> or possibly with the air con turned right down to 18°C.
> No thermal throttling? Hmm, are you looking at turning your top card into a frying pan?
> 
> 
> 
> 
> 
> 
> 
> 
> Actually why I got a Maximus Extreme Motherboard to have 4 single slots spacing with SLI, still had residual heat from bottom card even 200mm and 140mm fan blowing on the cards, gave up switched to water only for me for SLI.
> or go the FE blow style if you insist on staying on air.


I have my case wide open with no panels in a 17c room. It's functionally a test bench with fans.

Load Tomb Raider or the new Prey at 4K and at stock the fans ramp up to 1700RPM, which is too loud for my taste. Other games like Hitman keep the card much quieter, with results comparable to reviews.

I'm convinced the reviewers are just not pushing the cards hard enough, because 1400RPM at 66C is just out of the realm of possibility based on my testing.

I'd love if another STRIX owner can chime in. I'm about to throw my Morpheus on it but the warranty sticker scares me given how terrible Asus RMA is.


----------



## Coopiklaani

Quote:


> Originally Posted by *Luckbad*
> 
> As you might have anticipated, after finding one golden sample EVGA 1080 Ti FTW3 that I could clock over 2 GHz core and 12 GHz (aka 6 GHz... depends on what number you prefer) memory with little effort (no voltage addition)...
> 
> I opened the second FTW3 that came in the same box. Bought them both at the same time from BH.
> 
> The second FTW3 is ALSO golden sample!
> 
> Lightning strikes twice!
> 
> Gad, I don't know which is better either. I already have this one going over 2 GHz / 12 GHz without issue.
> 
> It's going to take all weekend to determine which card is better (frankly, they might be identical and from the same manufacturing run, so there might not be a better one). I'll be selling one of them and keeping the other to put on water once EKWB starts selling their waterblocks.


Most cards can go over 2000MHz easily, right? I've had 5 FE cards so far, and all of them can do over 2038MHz using only the core offset slider.

My best two cards so far are:
2152MHz core and +525 memory, and
2114MHz core and +850 memory.
Both manage to pass the Fire Strike Ultra and Time Spy stress tests. I ended up keeping the first one. Not sure if I made the right choice.







They scored very similarly in benchmarks and games.
http://www.3dmark.com/tsst/48553
http://www.3dmark.com/tsst/47064
http://www.3dmark.com/fsst/441911
http://www.3dmark.com/fsst/431230


----------



## gmpotu

Quote:


> Originally Posted by *Slackaveli*
> 
> I really appreciate you Project Cars guys/gals. Y'all are the ones who discovered the magic cache on Broadwell, and many of us profusely thank you for it!! But a bunch of others will hate y'all for it. It makes them hate their much more expensive chips. They don't have the gumption that you did to sell off that $1000 paperweight and get a gaming-oriented CPU. They game on workstations
> 
> 
> 
> 
> 
> 
> 
> 
> this is the whole curve thing in a nutshell. it is actually quite simple, really. took me a minute to figure it all out though. the changing after hitting apply, even when on the correct points, is what threw me off for a little while.


On air cooling:

The changing after hitting apply happens to me when I open the curve screen at a low temp, say below 38C, and then edit the curve while a 3D application (such as a game) is running in the background and the card has heated past one of the thermal limits, at roughly 40C, 52C, 60C, and 65C on air.

As you approach these temps you may notice your card drop a frequency bin, from 2101 down to 2088, down to 2062, down to 2050.

If you are editing your curve and press apply, it will behave differently at the different temp intervals most of the time, even if you aren't hitting any power limits.

Heaven Benchmark is the only case where I've seen this not hold, for some reason. In Heaven the card just seems to ignore throttling: it either heats up, gets super hot, and doesn't crash, or it crashes.

ps. Super jelly of you guys with your good memory clocks. I can only max at +400 on my card.
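The threshold behaviour gmpotu describes can be modelled as a simple lookup. The thresholds and bin values are his observations on air, not official limits; the final bin value is my extrapolation by one more step, and the helper itself is just an illustration:

```python
def throttled_clock(temp_c, bins=(2101, 2088, 2062, 2050, 2037),
                    thresholds=(40, 52, 60, 65)):
    """Drop one frequency bin per temperature threshold crossed."""
    drops = sum(1 for t in thresholds if temp_c >= t)
    return bins[min(drops, len(bins) - 1)]

print(throttled_clock(35))  # below every limit -> 2101
print(throttled_clock(55))  # past 40C and 52C  -> 2062
```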


----------



## alucardis666

Quote:


> Originally Posted by *alucardis666*
> 
> Seasonic installed and running great, no more hum!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for all the help everyone!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *EDIT:* Fired up Heaven 2x. And 2x my system powered off... Is 850 not enough? never had this happen with the AX1200...
> 
> *EDIT 2:* I just bought this...
> 
> Should fix my issues.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.bestbuy.com/site/evga-1600w-modular-power-supply-black/4598983.p?skuId=4598983


Well, I got my EVGA 1600W P2 PSU installed. All issues resolved: no hum, quieter than the Seasonic in Eco mode, and NO issues with black screens or system shutdowns/restarts.


----------



## nrpeyton

0.981v and _still_ consuming 295W+ in The Witcher 3 on the Skellige Isles (4K Ultra).

And the card is barely able to maintain a clock speed of 1900 at that voltage _(power limited)._

I just upgraded my entire system, CPU, Mobo & DDR4. So the water-block went on hold.

Have to admit, after the performance boost I saw, I even began to think I could get away without water and the shunt mod.

But after playing The Witcher 3 today, I can't stress enough how wrong I was.

This card really is crying out for the shunt mod. I'm just going to have to be patient: 2 weeks to go until I can afford to order my block.


----------



## feznz

Quote:


> Originally Posted by *AngryLobster*
> 
> I have my case wide open with no panels in a 17c room. It's functionally a test bench with fans.
> 
> Load Tomb Raider or the new Prey at 4K and stock the fans ramp up to 1700RPM and are too loud for my taste. Other games like Hitman keep the card much quieter with results comparable to reviews.
> 
> I'm convinced they are just not pushing the cards hard enough because 1400RPM at 66c is just out of the realm of possibility from my testing.
> 
> I'd love if another STRIX owner can chime in. I'm about to throw my Morpheus on it but the warranty sticker scares me given how terrible Asus RMA is.


http://www.overclock.net/t/1258253/how-to-put-your-rig-in-your-sig/0_20

First, I think *everyone* should read the above and fill out your rig, as it helps a lot when you ask for help.

I have the Strix but I didn't bother testing the factory cooler; straight under water for me.
As for the tamper stickers, they don't apply in my country, because NZ consumer law allows for maintenance and improvements as long as you don't compromise the integrity of the card.
In other words, I can replace the TIM without voiding the warranty, BUT you will sometimes have to argue the point with a rogue reseller.

It is possible to take the screw out without damaging the tamper sticker by using small pliers to grip the side of the screw head to undo it.
Or clean the sticker off completely and ask them for undeniable proof that it was ever there.


----------



## Luckbad

Quote:


> Originally Posted by *Coopiklaani*
> 
> most cards can go over 2000MHz easily right? I've had 5 FE cards so far, all of them can do over 2038MHz using only the core offset slider.
> 
> My best two cards so far are
> 2152MHz core and +525 Memory, and
> 2114MHz core and +850 Memory
> Both manage to pass FSU stress test and TS stress test. I ended up keeping the first one. Not sure if I made the right choice.
> 
> 
> 
> 
> 
> 
> 
> They scored very similar in benchmarks and games.
> http://www.3dmark.com/tsst/48553
> http://www.3dmark.com/tsst/47064
> http://www.3dmark.com/fsst/441911
> http://www.3dmark.com/fsst/431230


Most of the cards I've had can go over 2000 MHz on air, but few of them can hold over that point in gaming without aggressive fan profiles.

I've now had slightly less than 50% of the 1080 Tis I've purchased hold over 2GHz easily in practical situations. I've only had one that could reasonably hold over 2100, and that one had crappy memory.

None of the cards I've had can reliably do over +600 MHz offset on memory in stress tests without artifacting at some point.

The two FTW3 cards I just acquired won't throttle below 2 GHz when playing games, which has been rare from what I've seen (without voltage).

I'm hoping whichever one I keep can do 2100 MHz without fluctuating once I get it on water.


----------



## TheBoom

Had some time to play with the curve on the XOC BIOS, though I'm not even sure I should bother.

On the default BIOS I could get 2037 stable at 1.075-1.093v, but with this, even at 1.2v, 2062 is not stable.

Seems like the additional voltage doesn't really do much for the OC. Or maybe the chip is at its limits already.

Edit: Never mind. Upon further testing, [email protected] 1.131v is stable but not [email protected] 1.2v. It seems there is a voltage threshold past which the card becomes unstable simply from too much voltage.
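TheBoom's next objective, pin the voltage and find the highest stable clock, is a textbook bisection. A hedged sketch: `is_stable` is a hypothetical callback standing in for "apply the curve point and run a stress loop until it passes or crashes":

```python
def max_stable_clock(lo, hi, is_stable, step=13):
    """Binary-search the highest clock (MHz) passing is_stable(),
    stopping once the bracket is within one ~13 MHz bin."""
    while hi - lo > step:
        mid = (lo + hi) // 2
        if is_stable(mid):
            lo = mid   # mid passed: the limit is at or above mid
        else:
            hi = mid   # mid crashed: the limit is below mid
    return lo

# Example with a fake stability boundary at 2063 MHz:
best = max_stable_clock(1900, 2200, lambda mhz: mhz <= 2063)
print(best)  # lands within one bin below the boundary
```

Each probe halves the search range, so a 300 MHz bracket needs only a handful of stress runs instead of stepping 13 MHz at a time.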


----------



## Hulio225

Quote:


> Originally Posted by *TheBoom*
> 
> Had some time to play with curve on the XOC, though I'm not even sure I should bother.
> 
> Default bios I could get 2037 stable at 1.075-1.093v, but with this even at 1.2v 2062 is not stable.
> 
> Seems like the additional voltage doesn't really do much for the OC. Or maybe the chip is at its limits already.
> 
> Edit : Nevermind. Upon further testing [email protected] 1.131v is stable but not [email protected] 1.2v. It seems there is a certain voltage threshold before the card gets unstable simply due to too high voltage.


1.2V @ 2062MHz: think about what that means... it means more heat even at a lower clock. It may simply be that the higher temperature causes higher leakage and cross currents, and therefore the GPU miscalculates and you get crashes


----------



## dunbheagan

I have been testing some undervolting in the last days and wanted to report back my results.

All tests were made with [email protected] and a [email protected] I have a shunt-modded FE under water. Power draw is measured for the whole system with a separate energy meter at the wall. I found that [email protected] maxed out is the most power-hungry test, even more than Superposition 4K and FSU; that's why I used Doom for the comparison.

[email protected]/1.093V
Power draw: 460W
TDP: 95%
FPS 1% low: 77.5

[email protected]/0.793V
Power draw: 275W
TDP: 55%
FPS 1% low: 68.0

The difference between max OC and max undervolt is 185W in this case.
Power draw of the system without the GPU is approx. 100W, so it is 175W on the card against 360W on the card.

The 20% higher clock converts into 14% higher performance, which is OK imo, because memory is the same in both cases.

For me, the undervolt settings are much more appealing, because:
- +100% power draw only gives +14% performance
- coil whine changes from slightly annoying to barely audible (open case, no headphones)
- fans spin at 400rpm instead of 1000rpm
- GPU is at 44° instead of 50° (water temp in both cases 40°)

I think it is nice to see that the 1080Ti can be efficient as f*** if you want it to be. At 175W it is still the fastest card on the market (leaving aside the Titan Xp), and it still beats every overclocked [email protected]+.

What is your max clock at 800mV guys?

And are there any 1080Ti cards with absolutely no coil whine on the market? I think I got quite lucky with my card having only a little coil whine, but it is still the loudest part during gaming, which annoys me a little. I wouldn't hear it if I closed my case or used headphones, but I don't want to.
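The "+100% power for +14% performance" claim can be checked directly from the wall-meter numbers above. The 100 W rest-of-system figure is dunbheagan's; the helper is just my arithmetic sketch of his method:

```python
def gpu_fps_per_watt(wall_w, rest_of_system_w, fps_1pct_low):
    """FPS-per-GPU-watt: wall power minus the rest of the system."""
    return fps_1pct_low / (wall_w - rest_of_system_w)

oc = gpu_fps_per_watt(460, 100, 77.5)   # max OC:      77.5 / 360 W
uv = gpu_fps_per_watt(275, 100, 68.0)   # undervolted: 68.0 / 175 W
print(round(uv / oc, 2))  # the undervolt is ~1.8x more efficient per watt
```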


----------



## TheBoom

Quote:


> Originally Posted by *Hulio225*
> 
> 1.2V @ 2062MHz: Think what that means... it means more heat even at lower clock, it simply can be the higher temperature which is causing higher leakage current and cross flow current therefor the gpu can miscalculate stuff and you get crashes


Something like that, but perhaps not exactly temperature related. I found that the highest voltage I can reach before this happens is 1150mv, regardless of clocks. The temperatures are identical, or maybe 1C apart, between 1150 and 1200mv. So it's most likely the increased current itself causing the crashes.

So the next objective would be to find the highest clock for 1150mv.


----------



## Hulio225

Quote:


> Originally Posted by *TheBoom*
> 
> Something like that but not exactly temperature related perhaps. I found that the highest voltage I can get to before this happens is 1150mv (regardless of clocks). The temperatures are identical or maybe 1C apart between 1150 and 1200mv. So its most likely the increased current itself causing the crashes.
> 
> So the next objective would be to find the highest clock for 1150mv.


It still can be temperature related; the temp sensor isn't everywhere on the die, so it's kinda hard to tell.

Normally, if you go to higher clocks you need more voltage, because some transistors are not switching fast enough, and more voltage makes the gates open and close faster. If at some point the voltage is too high, a transistor can conduct when it shouldn't, and that causes failures too... But whatever the reason, it's good that you figured out you can go higher with lower voltage








Quote:


> Originally Posted by *dunbheagan*
> 
> I have been testing some undervolting in the last days and wanted to report back my results.
> 
> All tests were made with [email protected] and a [email protected] I have a shunt modded FE under water. Power draw is measured for the whole system with a separate energy meter on the wall. I found that [email protected] maxed out is the most power drawing test, even more than Supo 4K and FSU, thats why i used Doom for comparision.
> 
> [email protected]/1,093V
> Power draw: 460W
> TDP: 95%
> FPS 1% Low: 77,5
> 
> [email protected]/0,793V
> Power draw: 275W
> TDP: 55%
> FPS 1% Low: 68,0
> 
> The difference between mac OC and max undervolt is 185W in this case.
> Power draw of the system without GPU is appr. 100W, so it is 175W on the card against 360W on the card.
> 
> The 20% higher clock converts into 14% higher performance, which is ok imo, because memory is the same in both cases.
> 
> For me, the undervolt settings are much more appealing, because:
> -+100% Power draw only give +14% performance
> -coil whine changes from slightly annoying to bearly audible(open case, no headphones)
> -fans spin at 400rpm instead at 1000rpm
> -GPU is at 44° instead at 50° (water temp in both cases 40°)
> 
> I think it is nice to see, that the 1080Ti can be efficient as f***, if you want to. At 175W it is still the fastest card on the market(let aside the Titan XP) and it still beats every overclocked [email protected]+.
> 
> What is your max clock at 800mV guys?
> 
> And are there any 1080Ti with absoluty no coil whine on the market? I think i got quite lucky with my card having only little coil whine, but still it is the loudest part during gaming, which annoys me a little. I wouldn't hear it, if i would close my case or use headphones, but i don't want to.


@800mV, first quick test: 1747MHz (looped the Fire Strike Ultra stress test 3 times)

Mine has coil whine too if I do something that produces unrealistically high FPS; other than that, in a closed case, it's not really audible.


----------



## Dasboogieman

Quote:


> Originally Posted by *dunbheagan*
> 
> I have been testing some undervolting in the last days and wanted to report back my results.
> 
> All tests were made with Doom @ 4K (nightmare settings) and a 4K@60Hz monitor. I have a shunt modded FE under water. Power draw is measured for the whole system with a separate energy meter at the wall. I found that Doom @ 4K maxed out is the most power-hungry test, even more than Superposition 4K and FSU; that's why I used Doom for the comparison.
> 
> 2100MHz @ 1.093V
> Power draw: 460W
> TDP: 95%
> FPS 1% low: 77.5
> 
> 1746MHz @ 0.793V
> Power draw: 275W
> TDP: 55%
> FPS 1% low: 68.0
> 
> The difference between max OC and max undervolt is 185W in this case.
> Power draw of the system without the GPU is approx. 100W, so it is 175W on the card against 360W on the card.
> 
> The 20% higher clock converts into 14% higher performance, which is OK imo, because memory is the same in both cases.
> 
> For me, the undervolt settings are much more appealing, because:
> - +100% power draw only gives +14% performance
> - coil whine changes from slightly annoying to barely audible (open case, no headphones)
> - fans spin at 400rpm instead of 1000rpm
> - the GPU is at 44° instead of 50° (water temp in both cases 40°)
> 
> I think it is nice to see that the 1080Ti can be efficient as f*** if you want it to be. At 175W it is still the fastest card on the market (leaving aside the Titan XP) and it still beats every overclocked 1080 @ 2.1GHz+.
> 
> What is your max clock at 800mV, guys?
> 
> And are there any 1080Ti with absolutely no coil whine on the market? I think I got quite lucky with my card having only little coil whine, but it is still the loudest part during gaming, which annoys me a little. I wouldn't hear it if I closed my case or used headphones, but I don't want to.


There are some rare cards with exceptionally expensive low-noise inductors, but I cannot recall who uses them atm. IIRC ASUS and maayyyybe MSI might still use them, but I cannot be sure. Btw, they are noise-reduced, not noiseless; they still whine, just at a less audible frequency.

The best way to deal with coil whine is to pad down the coils with something, e.g. thermal pads with pressure from the HSF, hot glue around the inductors, Sekisui thermal tape to brace all the coils together, etc.

I am actually baffled why questions about coil whine keep coming up time and time again. Wikipedia explains the phenomenon quite well: all inductors make noise, it's just inherent to the physics of how they work. The noise they make is not indicative of their quality (barring comparison of conventional coils to special-purpose low-noise ones) or even their state of function (i.e. they are functioning as designed). The real questions should be: how much noise? Is it audible? And what has been done to mitigate it?


----------



## TheBoom

Quote:


> Originally Posted by *dunbheagan*
> 
> I have been testing some undervolting in the last days and wanted to report back my results.
> 
> All tests were made with Doom @ 4K (nightmare settings) and a 4K@60Hz monitor. I have a shunt modded FE under water. Power draw is measured for the whole system with a separate energy meter at the wall. I found that Doom @ 4K maxed out is the most power-hungry test, even more than Superposition 4K and FSU; that's why I used Doom for the comparison.
> 
> 2100MHz @ 1.093V
> Power draw: 460W
> TDP: 95%
> FPS 1% low: 77.5
> 
> 1746MHz @ 0.793V
> Power draw: 275W
> TDP: 55%
> FPS 1% low: 68.0
> 
> The difference between max OC and max undervolt is 185W in this case.
> Power draw of the system without the GPU is approx. 100W, so it is 175W on the card against 360W on the card.
> 
> The 20% higher clock converts into 14% higher performance, which is OK imo, because memory is the same in both cases.
> 
> For me, the undervolt settings are much more appealing, because:
> - +100% power draw only gives +14% performance
> - coil whine changes from slightly annoying to barely audible (open case, no headphones)
> - fans spin at 400rpm instead of 1000rpm
> - the GPU is at 44° instead of 50° (water temp in both cases 40°)
> 
> I think it is nice to see that the 1080Ti can be efficient as f*** if you want it to be. At 175W it is still the fastest card on the market (leaving aside the Titan XP) and it still beats every overclocked 1080 @ 2.1GHz+.
> 
> What is your max clock at 800mV, guys?
> 
> And are there any 1080Ti with absolutely no coil whine on the market? I think I got quite lucky with my card having only little coil whine, but it is still the loudest part during gaming, which annoys me a little. I wouldn't hear it if I closed my case or used headphones, but I don't want to.


I don't know man, but 10 fps is significant to me and I would take that over less power used and less heat any day. Especially if it's 10 fps at 4K. But to each his own I guess.

My card doesn't seem to have any audible coil whine though. Even at 2000 fps in menus.


----------



## dunbheagan

Quote:


> Originally Posted by *Dasboogieman*
> 
> There are some rare cares with exceptionally expensive low-noise inductors but I cannot recall who has atm. IIRC ASUS and maayyyybe MSI might still use them but I cannot be sure. btw they are noise-reduced, not noiseless, they still whine except at a less audible frequency.
> 
> The best way to deal with coil whine is to pad down the coils with something e.g. Thermal pads with pressure from the HSF, Hot glue around the inductors, Seksui Thermal tape to brace all the coils together etc etc.
> 
> I am actually baffled why the questions about coil whine keep coming up time and time again. Wikipedia explains the phenomenon quite well, all inductors will make noise, its just inherent to the physics of how they work. The noise they make is not indicative of their quality (barring comparison of conventional coils to special purpose low-noise ones) or even their state of function (i.e. functioning as designed). The real questions should be how much noise?, is it audible? and what has been done to mitigate it?


Thanks, I know it is normal and not a sign of malfunction. Still, it annoys me.

If I decide to disassemble my waterblock to pad down the coils, wouldn't hot glue around the coils lead to possible heat problems? Btw, the coils we are talking about are the little grey boxes labeled R22, right? If you had a link to some photos and/or a tutorial, I would be very thankful!


----------



## Dasboogieman

Quote:


> Originally Posted by *dunbheagan*
> 
> Thanks, I know it is normal and not a sign of malfunction. Still, it annoys me.
> 
> If I decide to disassemble my waterblock to pad down the coils, wouldn't hot glue around the coils lead to possible heat problems? Btw, the coils we are talking about are the little grey boxes labeled R22, right? If you had a link to some photos and/or a tutorial, I would be very thankful!




That's exactly how PowerColor did it at the factory: those grey rectangles with a C on them. Just a small amount would do.

If you are squeamish:
The easiest and cheapest mod is the Sekisui thermal tape one. I did this for my R9 290 Sapphire Tri-X.
1. Get some Sekisui tape the length of the distance between the inductors. Cut the tape lengthwise in two.
2. Place the tape over the top and the sides of the inductors, but leave the white paper backing on.
3. Reassemble the waterblock. The principle of this mod is that you brace all the coils against each other, so each one's vibrations get dampened by the collective mass.
The biggest disadvantage of this mod is that the dampening effect is not the best, and it may cause your inductors to heat up a bit more (due to being underneath the waterblock with no airflow). On the flip side, it's extremely reversible and costs less than $2 to do.

The 2nd easiest mod is the thermal pad mod.
This requires a bit of precision and 2mm-3mm padding (you can stack pads on top of each other too), depending on your waterblock's clearance over the coils.
1. Just cut a long strip of pad and place it on top of the inductors; make sure there is significant pressure from the block going down onto the coils (enough to make a 0.5mm or so indent in the pad).
2. Re-assemble the block.
This method has decent dampening but requires some precision. Not enough thermal pad compression results in more whine; too much thermal pad may result in your block not seating right. The biggest advantage of this method is that you also fix the thermal issues of the inductors by conducting their heat through the pads and into your cooling solution.

The hot glue method is the most fiddly, but if you do it properly you can still clean it up for warranty.
Basically, just hot glue around each coil. Not a lot of explanation needed here. IIRC this is the best method of dampening the coils, as you provide a large surface area of bracing around each coil.


----------



## Hulio225

Quote:


> Originally Posted by *dunbheagan*
> 
> Thanks, I know it is normal and not a sign of malfunction. Still, it annoys me.
> 
> If I decide to disassemble my waterblock to pad down the coils, wouldn't hot glue around the coils lead to possible heat problems? Btw, the coils we are talking about are the little grey boxes labeled R22, right? If you had a link to some photos and/or a tutorial, I would be very thankful!


R stands for Resistor








L for coils (inductors)
C for capacitors

The C on the rectangles from Boogie is just the manufacturer's logo; don't confuse it with the C designator I meant for capacitors.

The rectangles Dasboogieman is referring to look like this on the inside.


----------



## dunbheagan

Quote:


> Originally Posted by *Dasboogieman*
> 
> 
> 
> That's exactly how PowerColor did it at the factory: those grey rectangles with a C on them.


Very helpful, +rep

Is it better to use something that hardens or something like silicone?

Quote:


> Originally Posted by *Hulio225*
> 
> R stands for resistors
> 
> L for coils (inductors)
> C for capacitors
> 
> The C on the rectangles from Boogie is just the manufacturer's logo; don't confuse it with the C designator I meant for capacitors.
> 
> The rectangles Dasboogieman is referring to look like this on the inside.


I am confused. Aren't the R22 parts in this photo the coils?
http://media.gamersnexus.net/media/k2/items/cache/77b366dd254a43618bcbedaf86ab4031_XL.jpg


----------



## Dasboogieman

Quote:


> Originally Posted by *dunbheagan*
> 
> Very helpful, +rep
> 
> Is it better to use something that hardens or something like silicone?
> I am confused. Aren't the R22 parts in this photo the coils?
> http://media.gamersnexus.net/media/k2/items/cache/77b366dd254a43618bcbedaf86ab4031_XL.jpg


As long as it's not super glue; you don't want chemical bonding with the PCB, otherwise you can't easily remove it.

I've heard from some friends that sticky wax from a dental lab is pretty good for this.


----------



## IMI4tth3w

Quote:


> Originally Posted by *dunbheagan*
> 
> Very helpful, +rep
> 
> Is it better to use something that hardens or something like silicone?
> I am confused. Aren't the R22 parts in this photo the coils?
> http://media.gamersnexus.net/media/k2/items/cache/77b366dd254a43618bcbedaf86ab4031_XL.jpg


The R22 on the coils is the part's value marking. To identify the component type, you need to look at what is printed on the PCB: when doing the PCB design, each location is given a number and a component type, so R15 would be "Resistor #15". It's like a parking spot for components on the PCB. Although on modern graphics cards the labels often aren't printed on the PCB.

If you want more info on the R22 inductor, here's a datasheet with some info about the part numbers: http://www.niccomp.com/products/catalog/npim_c.pdf
I believe they are 0.22 µH (220 nH) power inductors; in these markings the R is the decimal point, read in microhenries.
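A quick note on reading the marking itself: on chip-style power inductors the "R" acts as the decimal point and the value is read in microhenries, so "R22" means 0.22 µH. A tiny illustrative helper (the function name is ours, not from any library, and it only handles the R-style markings discussed here):

```python
# Decoding chip-style inductor markings, where "R" marks the decimal point
# and the value is read in microhenries: "R22" -> 0.22 uH, "1R0" -> 1.0 uH.

def marking_to_uh(marking: str) -> float:
    """Convert an R-style inductor marking like 'R22' or '1R5' to microhenries."""
    return float(marking.replace("R", "."))

print(marking_to_uh("R22"))  # 0.22
print(marking_to_uh("1R0"))  # 1.0
```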


----------



## Hulio225

Quote:


> Originally Posted by *dunbheagan*
> 
> Very helpful, +rep
> 
> Is it better to use something that hardens or something like silicone?
> I am confused. Aren't the R22 parts in this photo the coils?
> http://media.gamersnexus.net/media/k2/items/cache/77b366dd254a43618bcbedaf86ab4031_XL.jpg


Check my own high-res photo and gimme the +rep, mate

Open the link in a new tab and zoom in; the white labels on the PCB are the important ones ;-)


----------



## dunbheagan

These are the parts we are talking about, right?


----------



## Hulio225

Quote:


> Originally Posted by *dunbheagan*
> 
> These are the parts we are talking about, right?


Yeah, check my post just before you asked that: open the photo in a new tab, zoom in, read what's on them, and check the white labels on the PCB like IMI4tth3w and I suggested.


----------



## Dasboogieman

Quote:


> Originally Posted by *dunbheagan*
> 
> These are the parts we are talking about, right?


Yup, standard issue Founders Edition R22 inductors. The hot glue mod might be difficult on those coils because of the tight spacing between them and the multitude of SMDs around them.

The pad mod and the Sekisui mod should work.


----------



## dunbheagan

Quote:


> Originally Posted by *IMI4tth3w*
> 
> The R22 on the coils is the part's value marking. To identify the component type, you need to look at what is printed on the PCB: when doing the PCB design, each location is given a number and a component type, so R15 would be "Resistor #15". It's like a parking spot for components on the PCB. Although on modern graphics cards the labels often aren't printed on the PCB.
> 
> If you want more info on the R22 inductor, here's a datasheet with some info about the part numbers: http://www.niccomp.com/products/catalog/npim_c.pdf
> I believe they are 0.22 µH (220 nH) power inductors.


Quote:


> Originally Posted by *Hulio225*
> 
> Check my own high-res photo and gimme the +rep, mate
> 
> Open the link in a new tab and zoom in; the white labels on the PCB are the important ones ;-)


THANKS!

2x +rep


----------



## Hulio225

Quote:


> Originally Posted by *TheBoom*
> 
> I don't know man, *but 10 fps is significant to me* and I would take that over less power used and less heat any day. *Especially if it's 10 fps at 4K*. But to each his own I guess.


Quote:


> Originally Posted by *dunbheagan*
> 
> I have been testing some undervolting in the last days and wanted to report back my results.
> 
> 2100MHz @ 1.093V
> Power draw: 460W
> TDP: 95%
> *FPS 1% low: 77.5*
> 
> 1746MHz @ 0.793V
> Power draw: 275W
> TDP: 55%
> *FPS 1% low: 68.0*


I have to make something clear here ;-P

The 1% lows are still 68.0 FPS (the average frame rate over the slowest 1% of frames); if that is at 4K and your monitor has a 60Hz refresh rate, you are more than fine. In other words, I could say every MHz on your GPU and every watt used beyond that is a waste of energy and not good for our planet ;-)
I think the best way is to use different profiles for different games and use whichever makes the most sense, pretty simple.
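For anyone unsure what a "1% low" actually is: benchmark tools typically report the average frame rate over the slowest 1% of frames, computed from per-frame times. A small sketch with made-up frame times (the helper name is ours, and real tools differ slightly in how they bin frames):

```python
# Computing a "1% low" the way benchmark tools usually do: average the
# slowest 1% of frames, here from per-frame times in milliseconds.

def one_percent_low(frametimes_ms):
    """Return the 1% low FPS: mean FPS over the slowest 1% of frames."""
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                 # slowest 1%, at least one frame
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# 300 mostly-fast frames (11 ms ~ 90 fps) with a few slow ones mixed in
frames = [11.0] * 297 + [16.0, 18.0, 20.0]
print(f"avg fps:    {1000.0 * len(frames) / sum(frames):.1f}")
print(f"1% low fps: {one_percent_low(frames):.1f}")
```

The average barely moves with a few slow frames, while the 1% low drops sharply, which is why it is the more useful stutter metric.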


----------



## dunbheagan

Quote:


> Originally Posted by *TheBoom*
> 
> I don't know man but 10 fps is significant to me and I would take that over less power used and heat anyday. Especially if it's 10 fps at 4K. But to each his own I guess.
> 
> My card doesn't seem to have any audible coil whine though. Even at 2000 fps in menus.


Yeah, you are right, 10fps can make a significant difference at 4K if you are under 50fps. But I chose one of the most demanding scenes of Doom and still get smooth minimum fps at both settings, so I chose the settings with less noise. I don't feel a difference between 70fps and 80fps on a G-Sync monitor. And we are talking about 1% low fps here, not average. In most cases in Doom at 4K nightmare, 1746MHz means around 90fps and 2100MHz means around 100fps, which is even less noticeable.

Quote:


> Originally Posted by *Hulio225*
> 
> I have to make something clear here ;-P
> 
> The 1% lows are still 68.0 FPS (the average frame rate over the slowest 1% of frames); if that is at 4K and your monitor has a 60Hz refresh rate, you are more than fine. In other words, I could say every MHz on your GPU and every watt used beyond that is a waste of energy and not good for our planet ;-)
> I think the best way is to use different profiles for different games and use whichever makes the most sense, pretty simple.


Exactly! That's what I meant.

I am playing Telltale games and older games at settings like a fixed 1100MHz for that reason.


----------



## OZrevhead

Sorry guys, I have a waterblock ready to go on; can anyone answer these for me?

Can someone confirm for me whether Liquid Ultra and Conductonaut both work for performing a shunt mod? This stuff seems reluctant to spread, so is it really going to drip (vertical GPU)? What if a piece of plastic or something was applied over the CLU so it has something to hold on to? Does heat make it more likely to drip or run?


----------



## Dasboogieman

Quote:


> Originally Posted by *OZrevhead*
> 
> Sorry guys, I have a waterblock ready to go on; can anyone answer these for me?
> 
> Can someone confirm for me whether Liquid Ultra and Conductonaut both work for performing a shunt mod? This stuff seems reluctant to spread, so is it really going to drip (vertical GPU)? What if a piece of plastic or something was applied over the CLU so it has something to hold on to? Does heat make it more likely to drip or run?


It's pretty impervious to heat. If you are serious about shunt modding, you may wanna consult KedarWolf. He has a superior, easier and longer-lasting method than the CLU shunt mod.


----------



## Hulio225

Quote:


> Originally Posted by *Dasboogieman*
> 
> It's pretty impervious to heat. If you are serious about shunt modding, you may wanna consult KedarWolf. He has a superior, *easier* and longer-lasting method than the CLU shunt mod.


It's the *better* method for sure, but easier? I would say nope... applying some CLU or LM to 3 relatively big shunt resistors is as easy as it gets imho.
The other method involves much smaller SMD resistors which you need to attach to very small SMD capacitors.
Better, yes; easier, I don't think so tbh...

But doable for sure, it just needs more precision.


----------



## TheBoom

Quote:


> Originally Posted by *Hulio225*
> 
> I have to make here something clear ;-P
> 
> 1% Lows still 68,0 FPS (1% of the lowest measured average FPS) if that is on 4K and your monitor has a refresh rate of 60Hz you are more than fine. In other words i could say every MHz on your GPU and every Wattage used beyond that is waste of energy and not good for our planet ;-)
> I think the best way is to use different profiles for different games and use that which makes the most sense, pretty simple.


Quote:


> Originally Posted by *dunbheagan*
> 
> Yeah, you are right, 10fps can make a significant difference at 4K, if you are under 50fps. But i chose one of most demanding scenes of Doom and still get smooth min fps at both settings, so i choose the settings with less noise. I dont feel a difference between 70fps and 80fps on a gsync monitor. And we are talking about 1% Low fps here, not average. And in most cases in Doom 4K nightmare 1746MHz means around 90fps and 2100MHz means around 100fps, which is even less noticable.
> Exactly! Thats what I meant.
> 
> Im am playing Telltale games and older games on settings like 1100MHz fixed for that reason.


Yeah, I guess it really depends on your setup and preferences. I have a 1440p G-Sync monitor at 165Hz, so I will take as much as I can get up to 165 fps.

I feel 1% low is more important than average since it tells you the extent of a frame drop or stutter. Also, minimum fps is bound to go up by a significant margin with an OC, which is the primary reason why I even bother. I mean, let's be honest, the 1080Ti rarely has a problem keeping up average fps even at stock boost speeds, except maybe in demanding 4K titles.

Another thing is For Honor. With settings maxed out at 1440p with SSAA, my fps averages 55-60. 14% reduced performance would make the difference between playable and not for me.

It's really to each his own I guess. I wouldn't be building a powerhouse of a PC if I were that concerned about power wastage lol.


----------



## Hulio225

Quote:


> Originally Posted by *TheBoom*
> 
> Yeah I guess it really depends on your setup and preferences. I have a 1440p Gsync at 165hz so I will take as much as I can get until 165 fps.
> 
> I feel 1% low is more important then average since it tells you the extent of a frame drop or stutter. Also minimum fps is bound to go up by a significant margin with a OC which is the primary reason why I even bother. I mean lets be honest the 1080ti rarely has a problem keeping up average fps even at stock boost speeds except maybe in demanding 4k titles.
> 
> Another thing is For Honour. With settings maxed out at 1440p with SSAA my fps averages 55-60. 14% reduced performance would make the difference between playable and not for me.
> 
> Its really to each his own I guess. I wouldn't be building a powerhouse of a PC if I were that concerned about power wastage lol.


That's another story; my example was specifically for Doom @ 4K (3840*2160). And for that specific example it's power wastage and not necessary, period.

0.1% and 1% lows tell you more than average fps, but that is nothing new... And a simple min fps doesn't tell you **** either, cause it's possible to get a one-time stutter which means nothing, so yeah, 0.1% and 1% lows are the way to go...


----------



## TheBoom

Quote:


> Originally Posted by *Hulio225*
> 
> That's another story; my example was specifically for Doom @ 4K (3840*2160). And for that specific example it's power wastage and not necessary, period.
> 
> 0.1% and 1% lows tell you more than average fps, but that is nothing new... And a simple min fps doesn't tell you **** either, cause it's possible to get a one-time stutter which means nothing, so yeah, 0.1% and 1% lows are the way to go...


Precisely. I wish I had the time and patience to configure different OC profiles for every game, but nope. Especially when you have to fiddle with the curve for every profile.

Also, I wasn't disputing the importance of 1% and 0.1% lows; I was emphasising that it makes a difference when they are that far apart between OC profiles.


----------



## Hulio225

Quote:


> Originally Posted by *TheBoom*
> 
> Precisely. I wish I had the time and patience to configure different OC profiles for every game, but nope. Especially when you have to fiddle with the curve for every profile.
> 
> Also, I wasn't disputing the importance of 1% and 0.1% lows; I was emphasising that it makes a difference when they are that far apart between OC profiles.


Yeah, the curve fiddling sucks the most


----------



## dunbheagan

Quote:


> Originally Posted by *TheBoom*
> 
> Yeah, I guess it really depends on your setup and preferences. I have a 1440p G-Sync monitor at 165Hz, so I will take as much as I can get up to 165 fps.
> 
> I feel 1% low is more important than average since it tells you the extent of a frame drop or stutter. Also, minimum fps is bound to go up by a significant margin with an OC, which is the primary reason why I even bother. I mean, let's be honest, the 1080Ti rarely has a problem keeping up average fps even at stock boost speeds, except maybe in demanding 4K titles.
> 
> Another thing is For Honor. With settings maxed out at 1440p with SSAA, my fps averages 55-60. 14% reduced performance would make the difference between playable and not for me.
> 
> It's really to each his own I guess. I wouldn't be building a powerhouse of a PC if I were that concerned about power wastage lol.


I am on a 144Hz G-Sync monitor, but I don't really feel a difference between 140fps and 80fps. I feel a big difference whether G-Sync is ON or OFF, but I wouldn't need the 144Hz on a G-Sync monitor; a 100Hz G-Sync monitor would be as good as 144Hz for me. I only see the advantage of 144Hz when I totally disable G-Sync or V-Sync, because you get less screen tearing with a higher refresh rate. But I always play with G-Sync ON, so I don't really use the 144Hz. I sometimes even limit the fps at 90, although I could get more.

I am not concerned about power wastage either; I only game two hours a week. But I am sensitive about noise, since I do not play with headphones. Less power draw means less coil whine and lower fan rpm in my case, and that's a real advantage for me.


----------



## ninjames

Hey, since y'all helped me before, I'll ask a very noobish question again. I have the EVGA 650W G2, which has VGA1 and VGA2 on the back of it. This is what I used to hook up my Aorus 1080Ti Xtreme. I returned that card and am buying an EVGA FTW3. In another forum, somebody said this to me about my PSU and that card:

"The psu has 20a per pcie cable . Theres 4 port for pcie on the psu but 2 ports share same power .So he would have to plug pcie into vga 1 and vga4 to get full 20amp from each . 20amp + 20amp = 40amp and multiplied by 12 v would be 480w available .If the evga uses 280 that leads 200w free on the 12v rails.
I think if the psu is working ok it should be ok for the card."

Is this person mistaken? Since I only see VGA1 and VGA2 on the back of my PSU: https://images10.newegg.com/NeweggImage/ProductImageCompressAll1280/17-438-054-14.jpg

Does that mean this PSU won't deliver enough amps over those two ports because they are shared, or was that incorrect? Max power wise, 650W should be enough as my build comes in around 480W altogether. Or should I be getting a PSU that has VGA1, VGA2, VGA3 and VGA4, and if so, why? And if so, do I plug one into VGA1 and one into VGA4?

Small words for stupid people, please!


----------



## Hulio225

Quote:


> Originally Posted by *ninjames*
> 
> Hey, since y'all helped me before, I'll ask a very noobish question again. I have the EVGA 650W G2, which has VGA1 and VGA2 on the back of it. This is what I used to hook up my Aorus 1080Ti Xtreme. I returned that card and am buying an EVGA FTW3. In another forum, somebody said this to me about my PSU and that card:
> 
> "The psu has 20a per pcie cable . Theres 4 port for pcie on the psu but 2 ports share same power .So he would have to plug pcie into vga 1 and vga4 to get full 20amp from each . 20amp + 20amp = 40amp and multiplied by 12 v would be 480w available .If the evga uses 280 that leads 200w free on the 12v rails.
> I think if the psu is working ok it should be ok for the card."
> 
> Is this person mistaken? Since I only see VGA1 and VGA2 on the back of my PSU: https://images10.newegg.com/NeweggImage/ProductImageCompressAll1280/17-438-054-14.jpg
> 
> Does that mean this PSU won't deliver enough amps over those two ports because they are shared, or was that incorrect? Max power wise, 650W should be enough as my build comes in around 480W altogether. Or should I be getting a PSU that has VGA1, VGA2, VGA3 and VGA4, and if so, why? And if so, do I plug one into VGA1 and one into VGA4?
> 
> Small words for stupid people, please!


If this here is your PSU:

http://www.evga.com/Products/Product.aspx?pn=220-G2-0650-Y1

you have nothing to worry about.


----------



## ninjames

Quote:


> Originally Posted by *Hulio225*
> 
> If this here is your PSU:
> 
> http://www.evga.com/Products/Product.aspx?pn=220-G2-0650-Y1
> 
> you have nothing to worry about.


That is indeed my PSU. Thanks very much.


----------



## Hulio225

Quote:


> Originally Posted by *ninjames*
> 
> That is indeed my PSU. Thanks very much.


It's single rail, 54.1A on +12V... go back to the other forum and tell the guy he needs to research better
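The arithmetic behind that: a single +12V rail at 54.1A gives roughly 649W for the whole 12V side, so the per-cable 20A split described in the other forum does not apply here. A quick back-of-the-envelope check (the 280W card figure is the rough number from the quoted post, and the 200W for the rest of the system is an assumption for illustration):

```python
# Power-budget check for a single-rail PSU like the EVGA 650 G2:
# both VGA ports draw from the same +12V rail, so the rail rating
# is what matters, not a per-cable amp split.

rail_amps = 54.1
rail_watts = rail_amps * 12.0          # ~649 W available on +12V

card_watts = 280                       # rough FTW3 board power from the quoted post
rest_of_system = 200                   # assumed CPU + rest of build (illustrative)

headroom = rail_watts - card_watts - rest_of_system
print(f"+12V capacity: {rail_watts:.0f} W, headroom: {headroom:.0f} W")
```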


----------



## Slackaveli

Quote:


> Originally Posted by *TheBoom*
> 
> I don't know man, but 10 fps is significant to me and I would take that over less power used and less heat any day. Especially if it's 10 fps at 4K. But to each his own I guess.
> 
> My card doesn't seem to have any audible coil whine though. Even at 2000 fps in menus.


I mean, if his monitor actually displayed those fps, yeah. But look, we are talking about 60Hz 4K here, and he is getting 68 fps at 0.793V; that's quite low. My question is this: if I am running full bore at 2088MHz @ 1.093V but with the frame limiter on at 60fps, will I be drawing fewer watts than he is at 68 fps @ 0.793V? Because if so, as I suspect, wouldn't it be easier just to run full bore but capped?
Quote:


> Originally Posted by *dunbheagan*
> 
> I have been testing some undervolting in the last days and wanted to report back my results.
> 
> All tests were made with Doom @ 4K (nightmare settings) and a 4K@60Hz monitor. I have a shunt modded FE under water. Power draw is measured for the whole system with a separate energy meter at the wall. I found that Doom @ 4K maxed out is the most power-hungry test, even more than Superposition 4K and FSU; that's why I used Doom for the comparison.
> 
> 2100MHz @ 1.093V
> Power draw: 460W
> TDP: 95%
> FPS 1% low: 77.5
> 
> 1746MHz @ 0.793V
> Power draw: 275W
> TDP: 55%
> FPS 1% low: 68.0
> 
> The difference between max OC and max undervolt is 185W in this case.
> Power draw of the system without the GPU is approx. 100W, so it is 175W on the card against 360W on the card.
> 
> The 20% higher clock converts into 14% higher performance, which is OK imo, because memory is the same in both cases.
> 
> For me, the undervolt settings are much more appealing, because:
> - +100% power draw only gives +14% performance
> - coil whine changes from slightly annoying to barely audible (open case, no headphones)
> - fans spin at 400rpm instead of 1000rpm
> - the GPU is at 44° instead of 50° (water temp in both cases 40°)
> 
> I think it is nice to see that the 1080Ti can be efficient as f*** if you want it to be. At 175W it is still the fastest card on the market (leaving aside the Titan XP) and it still beats every overclocked 1080 @ 2.1GHz+.
> 
> What is your max clock at 800mV, guys?
> 
> And are there any 1080Ti with absolutely no coil whine on the market? I think I got quite lucky with my card having only little coil whine, but it is still the loudest part during gaming, which annoys me a little. I wouldn't hear it if I closed my case or used headphones, but I don't want to.


----------



## Hulio225

Quote:


> Originally Posted by *Slackaveli*
> 
> I mean, if his monitor actually displayed those fps, yeah. But look, we are talking about 60Hz 4K here, and he is getting 68 fps at 0.793V; that's quite low. My question is this: if I am running full bore at 2088MHz @ 1.093V but with the frame limiter on at 60fps, will I be drawing fewer watts than he is at 68 fps @ 0.793V? Because if so, as I suspect, wouldn't it be easier just to run full bore but capped?


You are right. It will downclock and undervolt itself just to hold 60 FPS.

I have V-Sync enabled and, as you can see, the card clocks itself down and uses minimum voltage.


----------



## ELIAS-EH

Guys, hello.
I have a Thermaltake Water 3.0 Extreme cooler that I used for my CPU.
Can I install it on my Strix 1080Ti? And how?
I have an NZXT Phantom 820 and there is a place for it at the bottom.


----------



## Slackaveli

Quote:


> Originally Posted by *Hulio225*
> 
> You are right. It will downclock and undervolt itself just to hold 60 FPS.
> 
> I have V-Sync enabled and, as you can see, the card clocks itself down and uses minimum voltage.


OK, good. I am ALL about max perf, but it really bothers my OCD to think of wasting a bunch of power on unseen fps! Drives me bonkers. And truthfully, it makes a big difference; it's about like leaving all the lights on in your house all day. I used to run uncapped when I was a noob back in the day, and after I started running with the frame rate limiter on, I actually noticed a ~$25 lower electric bill. Not scientific, true, but it's real. About the same ~$25 I saved when I switched all the lightbulbs in the house to modern low-power bulbs.
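As a rough sanity check on that kind of saving (all numbers below are assumptions for illustration, not figures from the thread): the kWh saved by capping fps depend on the wattage saved, the hours played, and the local electricity rate.

```python
# Rough electricity-cost estimate for capping fps.
# Assumed numbers: ~200 W saved, 4 h/day of gaming, $0.13/kWh.

watts_saved = 200
hours_per_day = 4
rate_per_kwh = 0.13

kwh_per_month = watts_saved / 1000 * hours_per_day * 30
monthly_savings = kwh_per_month * rate_per_kwh
print(f"{kwh_per_month:.0f} kWh/month -> ${monthly_savings:.2f} saved")
```

With those assumptions it comes out to a few dollars a month; heavier use, a bigger wattage delta, or higher rates scale it up linearly.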


----------



## KingEngineRevUp

Quote:


> Originally Posted by *ELIAS-EH*
> 
> Guys, hello.
> I have a Thermaltake Water 3.0 Extreme cooler that I used for my CPU.
> Can I install it on my Strix 1080Ti? And how?


Yes you can. Use Google, that's how.


----------



## Hulio225

Quote:


> Originally Posted by *Slackaveli*
> 
> OK, good. I am ALL about max perf, but it really bothers my OCD to think of wasting a bunch of power on unseen fps! Drives me bonkers. And truthfully, it makes a big difference; it's about like leaving all the lights on in your house all day. I used to run uncapped when I was a noob back in the day, and after I started running with the frame rate limiter on, I actually noticed a ~$25 lower electric bill. Not scientific, true, but it's real. About the same ~$25 I saved when I switched all the lightbulbs in the house to modern low-power bulbs.


I have to correct myself: this only worked in windowed mode. When I go back to fullscreen mode I stick at 2100MHz/1.1V even with V-Sync... when I disable it I'm at 90+ fps... so underclocking and undervolting would save some power.

I'll have to try a limiter instead of V-Sync, because V-Sync does not work like a limiter.

Edit: The limiter doesn't help either... I guess under-clocking and under-volting is the only solution.


----------



## Slackaveli

Quote:


> Originally Posted by *Hulio225*
> 
> I have to correct myself. This just worked in windowed mode; when I go back to fullscreen mode I'm stuck at 2100 MHz / 1.1 V even with v-sync... When I disable it I'm at 90+ fps, so under-clocking and under-volting would save some power.
> 
> Have to try a limiter instead of v-sync, because v-sync does not work like a limiter.
> 
> Edit: The limiter doesn't help either... I guess under-clocking and under-volting is the only solution.


If you have power draw enabled on the OSD through HWiNFO64 you can check in real time. I'll do some tests; I hadn't really given max undervolting much thought. Gonna ponder that a bit.

btw, just noticed your rig's name, lol. nice.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Slackaveli*
> 
> ok, good. I am ALL about max perf, but it really bothers my OCD to think of wasting a bunch of power on unseen fps! Drives me bonkers. And, truthfully, it makes a real difference; it's about like leaving all the lights on in your house all day. I used to run uncapped when I was a noob back in the day, and after I started running with a frame rate limiter on, my electric bill was about $25 lower. Not scientific, true, but it's real. About the same ~$25 I saved when I switched all the lightbulbs in the house to modern low-power bulbs.


I think using a frame limiter and adaptive or optimal power would save electricity to the same effect.

Dark Souls 3 only uses 15%-25% of my gpu power this way.


----------



## Hulio225

Quote:


> Originally Posted by *Slackaveli*
> 
> If you have power draw enabled on the OSD through HWiNFO64 you can check in real time. I'll do some tests; I hadn't really given max undervolting much thought. Gonna ponder that a bit.
> 
> *btw, just noticed your rig's name, lol. nice*.


Haha, finally someone got that pun^^

Yeah i know about the power draw in OSD

I down-clocked my card to 1650 @ 0.793 V and with V-Sync enabled I have a constant 60 FPS and it's drawing 150 watts (+~20-25% because of the shunt mod); before, at 2100 MHz, it pulled 350 or something (+~20-25%).

That's 200 watts less, and the gaming experience is the same at 4K ultra Star Wars Battlefront. That's huge; I could power a full village in Africa with light, not even joking here^^


----------



## nrpeyton

It's sometimes hard to believe how power hungry these are.

400 watts + (gaming at stock voltage limits).

I have a 1/2 horse power Water Chiller that consumes less energy. And the Chiller is HUGE.

Then you think about all that power _(even more)_ bursting through something the size of a coin.

I wonder how long it would actually last _(in hours/days)_ if you allowed it to run continuously consuming 450 watts.

Last night I gamed for over 5 hours, and the GPU was literally heating my room. _(and that was only 300 watts)_
During that game, my i7 7700k barely hit 50 watts


----------



## Hulio225

Quote:


> Originally Posted by *SlimJ87D*
> 
> *I think using a frame limiter and adaptive or optimal power would save electricity to the same effect.*
> 
> Dark Souls 3 only uses 15%-25% of my gpu power this way.


I limited frames with Riva Tuner, went to driver settings, chose adaptive and restarted the rig... still at 2100 MHz. Don't know, maybe because of the XOC bios which I'm testing in games atm.

But whatever method works, it's worth doing, because why would someone burn his own money, even if it's not that much, and more importantly, why waste energy?
There is a difference between using a lot of energy like we do anyway and wasting it... just my two cents.
That would be like driving a car in 5th gear at the rev limiter instead of shifting into 7th gear^^


----------



## dunbheagan

Quote:


> Originally Posted by *Slackaveli*
> 
> i mean, if his monitor actually displayed those fps, yeah. but look, we are talking about 60Hz 4k here, and he is getting 66 fps at 0.793v. that's quite low. My question is this: if I am running full bore at 2088 @ 1.093v but with the frame limiter on at 60fps, will I be running fewer watts than he is at 66 fps @ 0.793v? Because if so, as I suspect, wouldn't it be easier just to run full bore but capped?


You got me wrong, i am not at 60Hz 4K, i am at 144Hz 4K GSync, so my fps=Hz

Quote:


> Originally Posted by *Hulio225*
> 
> You are right. It will downclock and undervolt itself just to hold 60 FPS
> 
> i have v-sync enabled and as you can see the card clocks itself down and uses min voltage


Concerning the power draw effects of frame limiting and undervolting, you are not completely wrong but not completely right either. Frame limiting and undervolting will not have the same results on your power draw, although in both cases you will of course draw less power.

Did some quick testing:
2100MHz no frame limit
fps 103
watts 440

2100MHz frame limit
fps 60
watts 320

1700MHz no frame limit
fps 90
watts 275

1700MHz frame limit
fps 60
watts 235

When you limit your fps, your card will not lower voltage in every case; it didn't in my test above. And an undervolted card will draw much less power at the same fps than a card that is not undervolted.
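One way to read those four runs is watts per frame delivered. A tiny sketch using the numbers quoted above (whole-system draw, so the CPU baseline is included in every figure):

```python
# dunbheagan's four test runs from the post above: (fps, whole-system watts).
runs = {
    "2100MHz uncapped": (103, 440),
    "2100MHz capped":   (60, 320),
    "1700MHz uncapped": (90, 275),
    "1700MHz capped":   (60, 235),
}

# Watts per frame actually displayed -- lower means more efficient.
for name, (fps, watts) in runs.items():
    print(f"{name}: {watts / fps:.2f} W/fps")
```

Capping alone makes W/fps worse (5.33 vs 4.27) because the system's fixed overhead is spread over fewer frames, while the undervolt lowers W/fps at either frame rate, which matches the conclusion that the two techniques don't have the same effect on power draw.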


----------



## Hulio225

Quote:


> Originally Posted by *dunbheagan*
> 
> You got me wrong, i am not at 60Hz 4K, i am at 144Hz 4K GSync, so my fps=Hz
> Concerning the power draw effects of frame limiting and undervolting, you are not completely wrong but not completely right either. Frame limiting and undervolting will not have the same results on your power draw, although in both cases you will of course draw less power.
> 
> Did some quick testing:
> 2100MHz uncapped
> fps 103
> watts 440
> 
> 2100MHz capped
> fps 60
> watts 320
> 
> 1700MHz uncapped
> fps 90
> watts 275
> 
> 1700MHz capped
> fps 60
> watts 235


Yeah, I came to the same conclusion; the first post with that screenshot was too hasty a conclusion. After that I posted other results.

But I know for sure that I will downclock my card to 1650 @ 0.793 V for Battlefront, for example... that's 200 watts less electrical power and ~200 watts less heat too.

At least as long as I'm sticking to that 4K 60Hz monitor... I want another one, a 21:9, but not sure atm^^

Edit: is there a way to go lower than 0.793 V? The curve annoyingly ends there, but the card should be able to go lower since it does it by itself at idle.


----------



## dunbheagan

Quote:


> Originally Posted by *nrpeyton*
> 
> It's sometimes hard to believe how power hungry these are.
> 
> 400 watts + (gaming at stock voltage limits).
> 
> I have a 1/2 horse power Water Chiller that consumes less energy. And the Chiller is HUGE.
> 
> Then you think about all that power _(even more)_ bursting through something the size of a coin.
> 
> I wonder how long it would actually last _(in hours/days)_ if you allowed it to run continuously consuming 450 watts.
> 
> Last night I gamed for over 5 hours, and the GPU was literally heating my room. _(and that was only 300 watts)_
> During that game, my i7 7700k barely hit 50 watts


The power draw I write about is for the whole system, but still, I get your point. 300+ watts on a GPU is a lot. Imagine the heat three or four 100 W light bulbs create.

My 4790K sits at around 50 W in gaming too. Since I realized that my GPU uses 6x the power of my CPU, I stopped bothering about undervolting my CPU just to save 10 or 20 W. Undervolting your GPU has a significantly higher impact on the power draw of your system!


----------



## dunbheagan

A very good example of the card clocking down is The Walking Dead from Telltale. The game caps itself at 60 fps and is of course not very demanding. Even at the highest possible settings, like 5K, there is no way the framerate drops below those 60 fps, and still the card will run itself at something like 1300 MHz / 700 mV. My whole-system power draw while gaming TWD is 160 W, so the workload on the GPU has a big impact on your power draw. If you reduce that workload by limiting your fps, you will reduce power draw, and eventually your card will downclock by itself, but only if the workload is far below what the card can deliver. In more demanding games, a manual undervolt will give you better results on power draw.

Quote:


> Originally Posted by *Hulio225*
> 
> Edit: is there a way to go lower than 0.793 V? The curve annoyingly ends there, but the card should be able to go lower since it does it by itself at idle.


I haven't found one yet, but I would be interested if it's possible too!


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Hulio225*
> 
> I limited frames with Riva Tuner, went to driver settings, chose adaptive and restarted the rig... still at 2100 MHz. Don't know, maybe because of the XOC bios which I'm testing in games atm.
> 
> But whatever method works, it's worth doing, because why would someone burn his own money, even if it's not that much, and more importantly, why waste energy?
> There is a difference between using a lot of energy like we do anyway and wasting it... just my two cents.
> That would be like driving a car in 5th gear at the rev limiter instead of shifting into 7th gear^^


Yeah, the MHz will be up there, but the GPU usage and power will be way, way down.

In my screenshot it's at 20% GPU usage and 49% power usage.

It won't draw as little power as undervolting, but it's nice because whenever the game does have a lot going on, it can tap into that extra horsepower if needed.

Undervolting and capping isn't adaptive. I think frame limiting + adaptive or optimal power is the best middle ground, because your GPU will lower the amount of power it draws for the frames you limit it to and tap into more power for highly intensive GPU-usage moments.

Also, optimizing the voltage curve for the highest clock per bin would save you even more money.


----------



## KingEngineRevUp

If I cap my system to 60 FPS:

Undervolting and capping will give me moments where I might go below 60 FPS in games like Deus Ex: Mankind Divided.

If I leave power on adaptive or optimal, then I'll get power savings, but if things get intense on screen my GPU usage and power will go up to keep me at 60 FPS.


----------



## ALSTER868

So there is no difference between turning V-sync on and capping the framerate in Riva? The effect on power draw and gaming experience is going to be the same, right?
I remember someone saying that he caps his fps at 59. What does that do compared to 60?
I'm still in doubt which one I should use...


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> It's sometimes hard to believe how power hungry these are.
> 
> 400 watts + (gaming at stock voltage limits).
> 
> I have a 1/2 horse power Water Chiller that consumes less energy. And the Chiller is HUGE.
> 
> Then you think about all that power _(even more)_ bursting through something the size of a coin.
> 
> I wonder how long it would actually last _(in hours/days)_ if you allowed it to run continuously consuming 450 watts.
> 
> Last night I gamed for over 5 hours, and the GPU was literally heating my room. _(and that was only 300 watts)_
> During that game, my i7 7700k barely hit 50 watts


Quote:


> Originally Posted by *Hulio225*
> 
> Haha finally some one got that pun^^
> 
> Yeah i know about the power draw in OSD
> 
> I down-clocked my card to 1650 @ 0.793 V and with V-Sync enabled I have a constant 60 FPS and it's drawing 150 watts (+~20-25% because of the shunt mod); before, at 2100 MHz, it pulled 350 or something (+~20-25%).
> 
> That's 200 watts less, and the gaming experience is the same at 4K ultra Star Wars Battlefront. That's huge; I could power a full village in Africa with light, not even joking here^^


UnderVolt.net


----------



## Slackaveli

Quote:


> Originally Posted by *SlimJ87D*
> 
> If I cap my system to 60 FPS:
> 
> Undervolting and capping will give me moments where I might go below 60 FPS in games like Deus Ex: Mankind Divided.
> 
> If I leave power on adaptive or optimal, then I'll get power savings, but if things get intense on screen my GPU usage and power will go up to keep me at 60 FPS.


That's how I've been doing it. But I have made a new profile at 0.793v at 1700mhz for the really light games I play in 4k, which mixed w/ adaptive and FRL should be sipping power.

@Alster, the 59 fps cap is b/c it has less input lag than a 60 fps cap does. It's also a ~1.7% power saving over 60 fps.

@DunbHeagan you have a 4k/144Hz already? or play on a 1440/144Hz w/ dsr?


----------



## dunbheagan

Quote:


> Originally Posted by *Slackaveli*
> 
> @DunbHeagan you have a 4k/144Hz already? or play on a 1440/144Hz w/ dsr?


I am on a 1440/144Hz monitor, but play at 4K via DSR. It is a Dell S2716DG, very nice monitor by the way.


----------



## BrainSplatter

Quote:


> Originally Posted by *ALSTER868*
> 
> So there is no difference between turning the V-sync on and capping the framerates in Riva?


The framerate lock in Afterburner is more effective than a simple V-Sync. Don't ask me about the details but I think it's because Afterburner limits the frames earlier in the rendering pipeline. There are also comparisons between different framerate limiters regarding lag and other aspects. Afterburner is essentially the best I think.

Framerate limiting also has the nice effect of reducing micro stutters, especially for SLI systems.


----------



## Rhadamanthys

Been building a new system with 1080 Ti SLI on EK water blocks. I have an issue with my top card: it's not recognised in BIOS and there's no display output. The bottom one works fine. The card in question was working fine on air in my other system. Did something go wrong when I put the block on it? I didn't notice any screw-up.

Could anyone please help me troubleshoot? Is there anything I can do? I tried reseating the two cards (they're connected via the 3-slot terminal), to no avail.

So far I can rule out:
- top PCIe 16x slot; I just put a 980 Ti on air in it with no issues
- PSU. It's completely new, and while there have been reports about 1000W not being enough for SLI, I don't think that would apply at boot. 1000W should be totally sufficient to at least get me into the BIOS.
- PSU cables. Swapped these out, no dice.

Specs: Asus Crosshair VI Hero (latest BIOS), Ryzen 1600X, G.Skill DDR4 3600MHz


----------



## ALSTER868

Quote:


> Originally Posted by *BrainSplatter*
> 
> The framerate lock in Afterburner is more effective than a simple V-Sync. Don't ask me about the details but I think it's because Afterburner limits the frames earlier in the rendering pipeline. There are also comparisons between different framerate limiters regarding lag and other aspects. Afterburner is essentially the best I think.
> 
> Framerate limiting also has the nice effect of reducing micro stutters, especially for SLI systems.


Thank you. What about the earlier statement here that it should be locked in Riva at 59 fps, not 60? Does that make any sense?


----------



## Coopiklaani

Quote:


> Originally Posted by *BrainSplatter*
> 
> The framerate lock in Afterburner is more effective than a simple V-Sync. Don't ask me about the details but I think it's because Afterburner limits the frames earlier in the rendering pipeline. There are also comparisons between different framerate limiters regarding lag and other aspects. Afterburner is essentially the best I think.
> 
> Framerate limiting also has the nice effect of reducing micro stutters, especially for SLI systems.


But framerate limiting does not solve the screen tearing problem. Frames are not synced to the monitor's refresh cycle even if they are generated at the same rate; you can end up having constant screen tearing right at the center of the screen.


----------



## Hulio225

Quote:


> Originally Posted by *Coopiklaani*
> 
> But framerate limiting does not solve the screen tearing problem. Frames are not synced to the monitor's refresh cycle even if they are generated at the same rate; you can end up having constant screen tearing right at the center of the screen.


Yeah, that's true; that's why I turned on v-sync and downclocked my card to the borderline where I had a constant 60 fps in SW Battlefront.


----------



## Slackaveli

Quote:


> Originally Posted by *dunbheagan*
> 
> I am on a 1440/144Hz monitor, but play at 4K via DSR. It is a Dell S2716DG, very nice monitor by the way.


i have the same one. does g-sync work in dsr?


----------



## Slackaveli

Quote:


> Originally Posted by *Coopiklaani*
> 
> But framerate limiting does not solve the screen tearing problem. Frames are not synced to the monitor's refresh cycle even if they are generated at the same rate; you can end up having constant screen tearing right at the center of the screen.


that's why you should run the framerate limiter at 59 (or 142) with Fast Sync on (or v-sync, but that adds input lag) if on a non-G-Sync monitor.


----------



## dunbheagan

Quote:


> Originally Posted by *Slackaveli*
> 
> i have the same one. does g-sync work in dsr?


Yeah, no problem.


----------



## Coopiklaani

Quote:


> Originally Posted by *Slackaveli*
> 
> that's why you should run the framerate limiter at 59 (or 142) with Fast Sync on (or v-sync, but that adds input lag) if on a non-G-Sync monitor.


but Fast Sync is very stuttery and choppy; with the framerate limited to 59 and Fast Sync on, you basically skip 1 frame every second.


----------



## Slackaveli

Quote:


> Originally Posted by *Coopiklaani*
> 
> but Fast Sync is very stuttery and choppy; with the framerate limited to 59 and Fast Sync on, you basically skip 1 frame every second.


Seems buttery to me, but I'm on Broadwell so it's not apples to apples. So, you would suggest just locking at 60, if I'm following you?


----------



## OneCosmic

Quote:


> Originally Posted by *Coopiklaani*
> 
> but Fast Sync is very stuttery and choppy; with the framerate limited to 59 and Fast Sync on, you basically skip 1 frame every second.


Why would you limit to 59 fps? That's utter nonsense; basically you want FPS as high as possible to get better latency for aiming.


----------



## BranField

anyone here with the strix or ftw able to tell me how long the pcb is? all the measurements online that i can see are total, inclusive of the cooler that overhangs a bit. i have 11.5 inches before my front rad. At the moment it is a toss up between the ftw and the strix.


----------



## IMI4tth3w

Quote:


> Originally Posted by *BranField*
> 
> anyone here with the strix or ftw able to tell me how long the pcb is? all the measurements online that i can see are total, inclusive of the cooler that overhangs a bit. i have 11.5 inches before my front rad. At the moment it is a toss up between the ftw and the strix.


i'm still waiting for my card to come in, but my waterblock just came in and appears to be just about the entire length of the card
https://www.ekwb.com/shop/ek-fc-1080-gtx-ti-strix-nickel

the waterblock measures just a hair over 11 inches. hopefully that helps.


----------



## fat4l

Guys.
I know there's a guy who made a "guide" on how to set the curve properly for increased performance. Any link please? Thanks.
He was adding +100 MHz on the core, then fiddling with the curve...


----------



## Coopiklaani

Some extensive tests on the XOC bios vs the FE bios (no shunt mod). I'll later add some results of FE bios with shunt mod.

And indeed XOC bios performs a lot worse on Superposition benchmark, but it seems OK on 3Dmark?


----------



## BranField

Quote:


> Originally Posted by *IMI4tth3w*
> 
> i'm still waiting for my card to come in, but my waterblock just came in and appears to be just about the entire length of the card
> https://www.ekwb.com/shop/ek-fc-1080-gtx-ti-strix-nickel
> 
> the waterblock measures just a hair over 11 inches. hopefully that helps.


Sounds like it will just about fit, thanks. Would you mind measuring total length when you receive the card and get the block installed?


----------



## Slackaveli

Quote:


> Originally Posted by *OneCosmic*
> 
> Why would you limit to 59 fps? That's utter nonsense; basically you want FPS as high as possible to get better latency for aiming.


It has to do with frametimes; less input lag. There's a very in-depth article about it that has been posted in this thread multiple times.
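The arithmetic behind the 59-vs-60 debate is easy to sketch. Note a few lines of code can't demonstrate the input-lag mechanism itself (a render queue that stays drained when you cap just under the refresh rate); this only shows the numbers:

```python
# Frametime and frame-count arithmetic for a cap just below a 60 Hz refresh.

def frametime_ms(fps: float) -> float:
    """Time budget per frame at a given frame rate."""
    return 1000.0 / fps

print(f"{frametime_ms(60):.2f} ms at 60 fps")  # prints "16.67 ms at 60 fps"
print(f"{frametime_ms(59):.2f} ms at 59 fps")  # prints "16.95 ms at 59 fps"

# A 59 fps cap renders 1 - 59/60, i.e. about 1.7%, fewer frames than 60 fps;
# the point of the cap is keeping the render queue drained, not the savings.
print(f"{(1 - 59/60) * 100:.1f}% fewer frames")  # prints "1.7% fewer frames"
```

So the power saving from 59 vs 60 is under two percent; the latency argument, not the electricity, is the reason people pick 59.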


----------



## KedarWolf

Quote:


> Originally Posted by *fat4l*
> 
> Guys.
> I know theres a guy that make a "guide" of how to set the curve properly for increased perfomance. Any link please ? Thanks
> Like he was adding + 100mhz on the core, then fiddling with the curve ....


http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20


So what's the scoop? Did you manage to fix your card? If not, can you file a claim with your CC company?


----------



## nrpeyton

The end of an *era*, tonight; for me. lol

After upgrading to the 1080Ti I felt it was my duty to FINALLY finish the Witcher 3 (which I started playing in 2015).

Apparently I have 85hrs of gameplay. That's about 45 minutes a week for 2 years lol.

What the hell am I meant to do now, lol

Honest, can't believe I finally completed it.

Went out in style too. I began the game on a GTX 980 & AMD FX CPU, then upgraded to GTX 980 SLI, then continued on the GTX 1080, and finally completed it on the 1080 Ti & 7700K at 70 FPS (some scenes) at 4K Ultra.


----------



## Slackaveli

wondered the same
Quote:


> Originally Posted by *SlimJ87D*
> 
> So what's the scoop? Did you manage to fix your card? If not, can you file a claim with your CC company?


Quote:


> Originally Posted by *nrpeyton*
> 
> The end of an *era*, tonight; for me. lol
> 
> After upgrading to the 1080Ti I felt it was my duty to FINALLY finish the Witcher 3 (which I started playing in 2015).
> 
> Apparently I have 85hrs of gameplay. That's about 45 minutes a week for 2 years lol.
> 
> What the hell am I meant to do now, lol
> 
> Honest, can't believe I finally completed it.
> 
> Went out in style too. I began the game on a GTX 980 & AMD FX CPU, then upgraded to GTX 980 SLI, then continued on the GTX 1080, and finally completed it on the 1080 Ti & 7700K at 70 FPS (some scenes) at 4K Ultra.


pretty cool, man. i still havent finished. began that game on a 4690k and a 980. i dont even have that rig anymore. had a 4790k and a couple 980ti's since then, then a 1080Ti on the 5775-c, still plodding along on that game from time to time. began trying to play that in 4k originally. was like 20 fps. now, over 60 all the time with spikes to 80's . crazy stuff. So smooth. Very well made game. I used to think it was a poorly made game. Wrong. I just had boo-boo gpu's and stuttering cpu's.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *nrpeyton*
> 
> The end of an *era*, tonight; for me. lol
> 
> After upgrading to the 1080Ti I felt it was my duty to FINALLY finish the Witcher 3 (which I started playing in 2015).
> 
> Apparently I have 85hrs of gameplay. That's about 45 minutes a week for 2 years lol.
> 
> What the hell am I meant to do now, lol
> 
> Honest, can't believe I finally completed it.
> 
> Went out in style too. I began the game on a GTX 980 & AMD FX CPU, then upgraded to GTX 980 SLI, then continued on the GTX 1080, and finally completed it on the 1080 Ti & 7700K at 70 FPS (some scenes) at 4K Ultra.


Fantastic game, I beat it twice and the expansions


----------



## nrpeyton

Quote:


> Originally Posted by *Slackaveli*
> 
> wondered the same
> 
> pretty cool, man. i still havent finished. began that game on a 4690k and a 980. i dont even have that rig anymore. had a 4790k and a couple 980ti's since then, then a 1080Ti on the 5775-c, still plodding along on that game from time to time. began trying to play that in 4k originally. was like 20 fps. now, over 60 all the time with spikes to 80's . crazy stuff. So smooth. Very well made game. I used to think it was a poorly made game. Wrong. I just had boo-boo gpu's and stuttering cpu's.


Quote:


> Originally Posted by *SlimJ87D*
> 
> Fantastic game, I beat it twice and the expansions


Indeed lol.

I've used it to "evaluate" every GPU I've owned (as well as other components).

It's amazing how far we've come in just two years. _(I also remember trying 4K two years ago, like you, and getting 20 FPS lol)._

I wonder what's going to happen now... because I don't think 8K will catch on quite as fast. Some ppl are still gaming at 1080p, and reviews all still focus on 1080p.

The 1080 Ti is, in my eyes, the first true single-GPU 4K experience. _(65 FPS)_

Interesting times ahead.


----------



## feznz

Quote:


> Originally Posted by *BranField*
> 
> anyone here with the strix or ftw able to tell me how long the pcb is? all the measurements online that i can see are total, inclusive of the cooler that overhangs a bit. i have 11.5 inches before my front rad. At the moment it is a toss up between the ftw and the strix.


----------



## SolidSnakex9

I received my EVGA GTX 1080 Ti Hybrid yesterday. I am getting great idle temps, but when I set a fan curve, or try to force 100% fans, nothing happens in MSI Afterburner. It seems MSI doesn't support the card yet? I can't stand using EVGA Precision... it's night and day compared to MSI Afterburner. Should I maybe connect my rad fan straight to my motherboard and control it that way? Thoughts?


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20
> 
> 
> 
> So what's the scoop? Did you manage to fix your card? If not, can you file a claim with your CC company?

RMA time, system won't even boot with card in, second card in or not.


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> Indeed lol.
> 
> I've used it to "evaluate" every GPU I've owned (as well as other components).
> 
> It's amazing how far we've come in just two years. _(I also remember trying 4K two years ago, like you, and getting 20 FPS lol)._
> 
> I wonder what's going to happen now... because I don't think 8K will catch on quite as fast. Some ppl are still gaming at 1080p, and reviews all still focus on 1080p.
> 
> The 1080 Ti is, in my eyes, the first true single-GPU 4K experience. _(65 FPS)_
> 
> Interesting times ahead.


i can envision a return to sli for some and the hdr 4k 144Hz era will begin. Provided, imo, an actual push to implement it in dx12 (mGPU). I myself am looking forward to ultrawide 4k or 5k with 100+Hz. I think that will truly be a sweet spot and should be a very usable monitor even now in sli Ti's/titans, but a single Volta Ti should be able to push 85-150 fps depending on the game in widescreen 4k, and in g-sync and HDR, that will be amazing.

But, what we REALLY need are a couple groundbreaking, defining, EPIC games to come out. Something fresh and outrageously beautiful and addictive that can really push us forward again.
Quote:


> Originally Posted by *SolidSnakex9*
> 
> I received my EVGA GTX 1080 Ti Hybrid yesterday. I am getting great idle temps, but when I set a fan curve, or try to force 100% fans, nothing happens in MSI Afterburner. It seems MSI doesn't support the card yet? I can't stand using EVGA Precision... it's night and day compared to MSI Afterburner. Should I maybe connect my rad fan straight to my motherboard and control it that way? Thoughts?


Yes, the connect-to-mobo option works, at least for setting fans manually.

I'm w/ you on Afterburner. I am addicted to RivaTuner too, so using other software is no bueno for me.


----------



## nrpeyton

aye that's true...

something like how "Crysis" was thought of when it first launched....

I think that'll be difficult to duplicate though...

because what made Crysis stand out was the way the vegetation came alive (the way the trees moved in the jungle) and the way the light trickled off the leaves....

everything since then.. has just improved upon that (and allowed it to be seen at higher resolutions)...

but that "wow" factor.. I don't know.. i'm not sure what they could do to bring that back the way Crysis did...


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> RMA time, system won't even boot with card in, second card in or not.


I'm sorry to hear that, man. If they reject your RMA, get your CC involved.


----------



## nrpeyton

What happened to KedarWolf's card?

Anyone know the original post number? I'll look back and read it myself _(can't help being interested in this one)_.


----------



## Coopiklaani

To whom it may concern, XOC bios vs FE bios vs FE bios + shunt at the same core clock.

NO DIFFERENCE BETWEEN XOC and FE bios in 3Dmark. FE performs better in superposition.


----------



## gmpotu

Quote:


> Originally Posted by *Coopiklaani*
> 
> 
> 
> Some extensive tests on the XOC bios vs the FE bios (no shunt mod). I'll later add some results of FE bios with shunt mod.
> 
> And indeed XOC bios performs a lot worse on Superposition benchmark, but it seems OK on 3Dmark?


From your chart it looks like XOC wins over FE at every bench including SUPO when looking at the 2100mhz.


----------



## Slackaveli

Quote:


> Originally Posted by *Coopiklaani*
> 
> To whom it may concern, XOC bios vs FE bios vs FE bios + shunt at the same core clock.
> 
> NO DIFFERENCE BETWEEN XOC and FE bios in 3Dmark. FE performs better in superposition.


what about games, though? Consider this post from @KraxKill:
Quote:


> Originally Posted by *KraxKill*
> 
> Here is something funny for those that use benchmarks to measure performance vs using the games you play.
> 
> *I went form 93fps to 110fps in my ARMA3 benchmark. Performance I can visually and empirically quantify as a 17.3% improvement.*
> 
> There is also this. If your metric for gauging performance and time is constrained to watching that chick fight that demon dragon thing or walk through a futuristic museum with a loupe, then by all means upgrading to a 5775C may not be for you. If you play games however you may want to explore it
> 
> 4790K @ 5.2GHz - 2000 CL9 1T - 1080Ti @ 2100MHz, 12.2Gbps GDDR5X
> 
> 
> 5775C @ 4.3GHz - 2000 CL9 1T - 1080Ti @ 2100MHz, 12.2Gbps GDDR5X


as you can see, the EXACT same supo score, yet a vast difference in game. Point being, the XOC bios might rock, or suck, in games, and that isn't really determinable from supo scores alone.


----------



## gmpotu

Quote:


> Originally Posted by *Coopiklaani*
> 
> To whom it may concern, XOC bios vs FE bios vs FE bios + shunt at the same core clock.
> 
> NO DIFFERENCE BETWEEN XOC and FE bios in 3Dmark. FE performs better in superposition.


Just saw this. Looks like...

FE+Shunt > XOC > FE No shunt.

Question would be where does XOC + Shunt sit? I would assume between FE + Shunt and XOC from what others have reported.


----------



## Coopiklaani

Quote:


> Originally Posted by *gmpotu*
> 
> From your chart it looks like XOC wins over FE at every bench including SUPO when looking at the 2100mhz.


Yeah, this one is the FE bios without the shunt mod; the card was power-limited very badly in most of the benchmarks.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> RMA time, system won't even boot with card in, second card in or not.
> 
> I'm sorry to hear that, man. If they reject your RMA, get your CC involved.

I can get a motherboard with PCIe dip switches for $300 plus tax from NewEgg, and if I'm going to take advantage of their return policy I could send it back after using it.

To be honest it's a really good motherboard and $100 off right now; if I do that, I might just keep it.

https://www.msi.com/Motherboard/X99A-XPOWER-AC.html#hero-overview

I'd just need to borrow the money, I'm so broke.


----------



## Coopiklaani

Quote:


> Originally Posted by *Slackaveli*
> 
> what about games , though? Consider this post from @KraxKill
> as you can see, the EXACT same supo score, yet a vast difference in game. Point being, the XOC bios might rock, or suck, in games, and that isnt really determinable from just supo scores.


I didn't optimise the settings for the benchmark, all default settings. Both BIOSes were tested using the same curve shape. Gaming has too many variables; I can try to run some games that have built-in benchmarks, but it is really hard to tell from gameplay. ~1-2 fps difference max in most cases, I suppose.


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> DON'T flash the HOF BIOS. It bricked my 1080 Ti FE, my system won't even boot with it in.
> 
> I'm thinking given the file share website it may even be malware designed to brick our cards.


Hmm to be honest.. when I think back to 1080 (non TI's) this actually makes sense a little bit...

It's hard to explain but I'll try...

When I was looking for BIOS's to flash my 1080 Classy none of them seemed to work (none of them bricked the classy 1080) but the power reporting was so farrrr off the card often acted like it was stuck in safe-mode (even though it wasn't)..

Then one day I came across the HOF BIOS.. and it worked perfectly...

Meanwhile owners of FE's/STRIX/FTW's were all flashing each other's BIOSes and having great fun.. but me.. with my Classy... the only other BIOS that seemed to be compatible was the HOF.

I later found out both cards also worked with each other's voltage tools etc too..

Anyway the HOF and the Classy are fundamentally more different from other cards than those cards are from each other.. they're kind of in a league of their own (i.e. they use a totally different class of voltage controller, and power monitoring is different)...

It may just be that there are too many differences between a HOF 1080 Ti and a 1080 Ti FE for it to function. (the HOF has a 16+3 phase VRM).. _but there is no such thing as a true 16-phase controller.. so they are probably using an 8-phase with doublers. There is only *one* manufacturer that makes that kind of VRM package._

There are instructions somewhere that explain how you can perform a "blind flash" that may fix the problem.. but it's one of those things that would probably take a whole night to research. (and a bit of trial + error).

What about enabling the IGPU multi-monitor in BIOS? It lets you use both IGPU and GPU simultaneously on two screens. They work together in tandem. It's a long-shot.. but just something that came to my head just now.. not sure if it will work. But an idea nonetheless.


----------



## Slackaveli

Yeah, which I guess could be important if it's the last 2 or 3 fps you need at 4K to hit 60fps or something. But with these cards we are OVER 60 fps, so be careful not to get caught in the trap of chasing a couple frames beyond 60 on a 60Hz monitor.


----------



## Coopiklaani

Quote:


> Originally Posted by *gmpotu*
> 
> Just saw this. Looks like...
> 
> FE+Shunt > XOC > FE No shunt.
> 
> Question would be where does XOC + Shunt sit? I would assume between FE + Shunt and XOC from what others have reported.


XOC doesn't need a shunt mod cos it has no power limit whatsoever..

For people who don't want to hard-mod their cards, XOC is the best.
I can also go higher with the XOC BIOS, since it allows more voltage to compensate for the performance loss in some cases.. but FE + shunt mod may be my everyday setup from now on.


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> DON'T flash the HOF BIOS. It bricked my 1080 Ti FE, my system won't even boot with it in.
> 
> I'm thinking given the file share website it may even be malware designed to brick our cards.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hmm to be honest.. when I think back to 1080 (non TI's) this actually makes sense a little bit...
> 
> It's hard to explain but I'll try...
> 
> When I was looking for BIOS's to flash my 1080 Classy none of them seemed to work (none of them bricked the classy 1080) but the power reporting was so farrrr off the card often acted like it was stuck in safe-mode (even though it wasn't)..
> 
> Then one day I came across the HOF BIOS.. and it worked perfectly...
> 
> Meanwhile owners of FE's/STRIX/FTW's were all flashing eachothers BIOS's and having great fun.. but me.. with my Classy... the only other BIOS that seemed to be compatible was the HOF.
> 
> I later found out both cards also worked with eachothers voltage tools etc too..
> 
> Anyway the HOF and the Classy are fundamentally more different than other cards are different from each other.. they're kind of in a league of their own (I.E. they use a totally different league of voltage controller and power monitoring is different)...
> 
> It may just be that there are too many differences between a HOF 1080ti and a 1080Ti FE for it to function.
> 
> There are instructions somewhere that explain how you can perform a "blind flash" that may fix the problem.. but it's one of those things that would probably take a whole night to research.
Click to expand...

My PC won't even boot with it in, not even a BIOS screen.


----------



## Coopiklaani

Quote:


> Originally Posted by *KedarWolf*
> 
> My PC won't even boot with it in, not even a BIOS screen.


Have you tried to boot from another card and to flash the bios back?


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> My PC won't even boot with it in, not even a BIOS screen.


Install two cards. Have an older card in PCI-E 1. Boot from it.

Then flash PCI-E lane 2 (your 1080Ti FE)?

Or do you have a PC in the house with iGPU?

What about enabling iGPU multi-monitor in the BIOS? It lets you use both the iGPU and GPU simultaneously on two screens (or one), booting from the iGPU. They work together in tandem.

Here's a guide on how to flash a bricked GPU:
http://www.overclock.net/t/593427/how-to-unbrick-your-bricked-graphics-card-fix-a-failed-bios-flash


----------



## gmpotu

Quote:


> Originally Posted by *Coopiklaani*
> 
> XOC doesn't need a shunt mod cos it has no power limit what so ever..
> 
> For ppl don't want to hard mod their cards, XOC is the best.
> I can also go higher with the XOC bios since it allows more voltage to compensate for the performance loss in some of the cases..But FE + shunt mod may be my everyday setup from now on.


Is the XOC dangerous to use then? With no power limit you could potentially burn up your card as the card keeps drawing power to try to remain stable right?

I would assume with XOC you don't get any downclocks because the card just keeps pulling more and more power? Are there still downclocks from the temp limiting bins @37,42,52,60,65?
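For what it's worth, those temperature bins behave like a simple threshold lookup: each threshold crossed drops the boost clock by one bin. A minimal sketch, using the thresholds quoted above and assuming a ~13 MHz-per-bin step (the step size is illustrative, not a measured value):

```python
import bisect

# Thresholds (deg C) from the post above; the 13 MHz-per-bin step
# is an assumed, illustrative value.
TEMP_BINS_C = [37, 42, 52, 60, 65]
MHZ_PER_BIN = 13

def boost_offset_mhz(core_temp_c: float) -> int:
    """Clock reduction applied once each temperature threshold is crossed."""
    bins_crossed = bisect.bisect_right(TEMP_BINS_C, core_temp_c)
    return -MHZ_PER_BIN * bins_crossed

print(boost_offset_mhz(35))  # 0: below every threshold
print(boost_offset_mhz(43))  # -26: crossed 37C and 42C
```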


----------



## nrpeyton

It's not just volts that can shorten a card's life. It's current + heat.

It really depends how long you intend to keep the card.

If you don't intend to replace your GPU for 5-10 years, then I'd advise against it.

I may even be understating that a bit, too.

It's one of those things I'd never do with someone else's card, but I'd gladly do with my own. (obviously with caution and monitoring it with respect).

Also, the stock FE BIOS is scoring (in benchmarks) just as well as (if not better than) the XOC.

The shunt mod is the card's true calling, in my eyes. But then it may still not last as long as it would if you were only drawing 300 watts.

It's also subjective.. some people just tinker and run the odd benchmark... others will game for 7hrs a day, 4 days a week.
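Since the shunt mod keeps coming up: it works by soldering a second resistor in parallel with each current-sense shunt, which lowers the voltage the controller sees and makes it under-report power. A rough sketch of the math, assuming a 5 mOhm stock shunt and an equal-value mod resistor (both values are illustrative, not measured from these cards):

```python
def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

def reported_power(actual_w: float, r_stock: float, r_mod: float) -> float:
    """Power the controller *thinks* is drawn after a shunt mod.

    The controller still divides the sensed voltage by r_stock, but the
    real sense resistance is now parallel(r_stock, r_mod), so the
    reading scales down by that ratio."""
    return actual_w * parallel(r_stock, r_mod) / r_stock

# Illustrative: an equal resistor across a 5 mOhm shunt halves the
# reading, so a real 400 W draw reports as ~200 W.
print(reported_power(400, 0.005, 0.005))
```

Which is exactly why a modded card sails past its power limit: the limiter acts on the scaled-down reading, not the real draw.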


----------



## BoredErica

Maybe we should have a median/avg calculation for the values in the chart?
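That would be straightforward with Python's statistics module; the clock values below are made up purely to show the calculation:

```python
import statistics

# Hypothetical max-stable core clocks (MHz); placeholder data only.
clocks_mhz = [1999, 2025, 2050, 2062, 2088, 2100, 2126]

print("mean  :", round(statistics.mean(clocks_mhz), 1))  # 2064.3
print("median:", statistics.median(clocks_mhz))          # 2062
```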


----------



## nrpeyton

I feel I have to find a new game now that takes advantage of my 1080Ti.. while still having a decent storyline.

But I quite like strategy too. (kind of sick of RPGs.. they drag on too long.. and look how long it took me to complete the Witcher 3).

Any suggestions?


----------



## Coopiklaani

Quote:


> Originally Posted by *gmpotu*
> 
> Is the XOC dangerous to use then? With no power limit you could potentially burn up your card as the card keeps drawing power to try to remain stable right?
> 
> I would assume with XOC you don't get any down clocks because the card just keeps pulling more and more power? Are there still down locks from the temp limiting bins @37,42,52,60,65?


Watercooling is pretty much a must with XOC bios, it can draw up to 500w or more if you really stress it. As long as you can keep your core under 83C and vrm under 90C, you are then very safe.


----------



## Slackaveli

Have you played Stellaris? It's a fun space strategy game, 4X type.


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> I feel I have to find a new game now; that takes advantage of my 1080Ti.. while still having a decent story-line.
> 
> But I quite like strategy too. (kind of sick of RPG's.. they drag on too long.. and look how long it took me to complete the Witcher 3).
> 
> Any suggestions?


strategy game + stress GPU = Ashes of the Singularity


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> My PC won't even boot with it in, not even a BIOS screen.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Install two cards. Have an older card in PCI-E 1. Boot from it.
> 
> Then flash PCI-E lane 2 (your 1080Ti FE)?
> 
> Or do you have a PC in the house with iGPU?
> 
> What about enabling the IGPU multi-monitor in BIOS? It lets you use both IGPU and GPU simultaneously on two screens. They work together in tandem? _(boot from iGPU) or on one screen_
> 
> Here's a guide on how to flash a bricked GPU:
> http://www.overclock.net/t/593427/how-to-unbrick-your-bricked-graphics-card-fix-a-failed-bios-flash
Click to expand...

I don't have a motherboard with an iGPU, and if the PC won't boot with the card in at all (even with the second card in: no BIOS screen, nothing), then I can't flash it.

Can't even boot into a boot disk with it in; the PC won't even start to BIOS or boot from a boot disk as the main device.

Good news is I talked with my boss, so I can get that motherboard with the PCIe dip switches and borrow the money.

I can boot with the PCIe lane disabled, enable it while in Windows, and force-flash the BIOS.









Edit: Or have nvflash on the DOS boot disk with the BIOS I'm flashing, boot with the PCIe lane disabled, enable the lane after booting into the boot disk, problem solved.

Either way I hope it works; it's a good card, I run it 24/7 at 2062 core, 6147 memory.


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> What happened to KedarWolf's card?
> 
> anyone know the original post number-- and i'll look back read myself? _(can't help being interested in this one)_?


I flashed the HOF BIOS from a sketchy file-share site; it bricked my card.

I'm even wondering if it was malware.


----------



## Dasboogieman

Quote:


> Originally Posted by *Slackaveli*
> 
> have you played Stellaris? it's a fun space stategy game, 3X type


Also a massive CPU hog endgame.


----------



## nrpeyton

Thanks.

I'll check out both games.


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> flashed the HOF BIOS here from a sketchy file share site, bricked my card.
> 
> I'm even wondering if it was malware.


Well you certainly took one for the team. (And got the warning out before a handful of people tried flashing HOF).

I know you'll get it fixed.

Will definitely make an excellent article when you do. (If you figure it out and publish how you did it -- it could be invaluable for some people)









It's never happened to me. But I have to admit, we probably all get "the fear" every time we flash. I know I do lol. But I don't believe anything up-to-date has been published for a while. (i.e. that guide I linked is probably years old). So I probably speak for a few of us here when I say we're eagerly awaiting the outcome on this one.

Also, it could have happened to anyone. Look how _eager_ we all were to try out the XOC the second it was released. The ones (like me) who tried it could just as easily have been in the exact same situation.

I feel for ya like. Do really hope you get it sorted.

--

By the way; is it just me or is the "ad" on this page causing it to bug out? My screen keeps scrolling itself.. it's not happening on any other sites.


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> flashed the HOF BIOS here from a sketchy file share site, bricked my card.
> 
> I'm even wondering if it was malware.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well you certainly took one for the team. (And got the warning out before a handful of people tried flashing HOF).
> 
> I know you'll get it fixed.
> 
> Will definitely make an excellent article when you do. (If you figure it out and publish how you done it -- it could be invaluable for some people)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's never happened to me. But I have to admit, we probably all get "the fear" everytime we flash. I know I do lol. But I don't believe there's anything up-to-date been published for a while. (I.E. that link to that guide I found was probably years old). So I probably speak for a few of us here.. when I say eagerly awaiting the outcome on this one.
> 
> --
> 
> By the way; is it just me or is the "ad" on this page causing it to bug out? My screen keeps scrolling it's self.. not happening on any other sites.
Click to expand...

I got it fixed!!









Found out I could get into the PC BIOS with Fast Boot and CSM disabled. Windows would crash after I put in my PIN at boot. nvflash64 wouldn't run from a DOS boot disk.

So I booted into Windows with my secondary card in, no 1080 Ti. In the Recovery options in Windows Settings I did the reboot into Advanced Startup, and when it booted to the Advanced Startup screen I chose Boot Into A Windows Command Prompt.

When it was restarting before it booted into the command prompt I unplugged my power cord, put in 1080 Ti, plugged power cord back in.

I booted into a Windows command prompt, changed directory into my nvflash directory, flashed FTW3 BIOS, booted into Windows, did a DDU driver uninstall and reinstall, fixed!!









I'm so glad I got it fixed, it's a decent card.


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> I got it fixed!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Found out I could get into PC BIOS with Fast Boot and CSM disabled. Windows would crash after putting my PIN when it booted up. nvflash64 wouldn't run from a DOS boot disk.
> 
> So I booted into Windows with my secondary card in, no 1080 Ti. In the Recovery options in Windows Settings I did the reboot into Advanced Startup, when it booted in the Advanced Startup screen I choose Boot Into A Windows Command Prompt.
> 
> When it was restarting before it booted into the command prompt I unplugged my power cord, put in 1080 Ti, plugged power cord back in.
> 
> I booted into a Windows command prompt, changed directory into my nvflash directory, flashed FTW3 BIOS, booted into Windows, did a DDU driver uninstall and reinstall, fixed!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm so glad I got it fixed, it's a decent card.


Thank god for that. Were u getting any signal to the monitor at all to begin with?

(Just trying to get my head around how u fixed it -- in case i ever need to one day).

Look how eager we all were the second the XOC came out.. could happen to any one of us. And I know I must have tried about 20 different BIOSes on my Classy in the last round _(although I only tried 2/3 this time)._


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I got it fixed!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Found out I could get into PC BIOS with Fast Boot and CSM disabled. Windows would crash after putting my PIN when it booted up. nvflash64 wouldn't run from a DOS boot disk.
> 
> So I booted into Windows with my secondary card in, no 1080 Ti. In the Recovery options in Windows Settings I did the reboot into Advanced Startup, when it booted in the Advanced Startup screen I choose Boot Into A Windows Command Prompt.
> 
> When it was restarting before it booted into the command prompt I unplugged my power cord, put in 1080 Ti, plugged power cord back in.
> 
> I booted into a Windows command prompt, changed directory into my nvflash directory, flashed FTW3 BIOS, booted into Windows, did a DDU driver uninstall and reinstall, fixed!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm so glad I got it fixed, it's a decent card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thank god for that. Were u getting any signal to the monitor at all to begin with?
> 
> (Just trying to get my head around how u fixed it -- in case i ever need to one day).
> 
> Look how eager we all were the second the XOC came out.. could happen to any one of us. And I know I must of tried about 20 different BIOS's on my Classy in the last round _(although I only tried 2/3 this time)._
Click to expand...

The PC wouldn't even boot, not even into BIOS: black screen, even with the second card in alongside the 1080 Ti. It would only work once I took out the 1080 Ti.









The only way I could boot into BIOS was with CSM and Fast Boot disabled. Even then the PC would freeze booting into Windows after I put my password in.

The workaround I figured out was the only fix.


----------



## feznz

Awesome news there. I thought this was going to be the first GPU that was actually, truly bricked. I knew you could blind-flash, but without a booting PC it was going to be impossible.


----------



## dunbheagan

I took some better photos of my recently finished 1080 Ti build:



If you are interested in more photos, I created a gallery here:

http://www.overclock.net/g/a/1630543/my-build


----------



## UdoG

Any info about an upcoming water block for the HOF ?


----------



## ViRuS2k

NIce one, you fixed it









now get back to doing the shunt resistor mod so I can follow your guide lol


----------



## Nico67

Quote:


> Originally Posted by *KedarWolf*
> 
> I got it fixed!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Found out I could get into PC BIOS with Fast Boot and CSM disabled. Windows would crash after putting my PIN when it booted up. nvflash64 wouldn't run from a DOS boot disk.
> 
> So I booted into Windows with my secondary card in, no 1080 Ti. In the Recovery options in Windows Settings I did the reboot into Advanced Startup, when it booted in the Advanced Startup screen I choose Boot Into A Windows Command Prompt.
> 
> When it was restarting before it booted into the command prompt I unplugged my power cord, put in 1080 Ti, plugged power cord back in.
> 
> I booted into a Windows command prompt, changed directory into my nvflash directory, flashed FTW3 BIOS, booted into Windows, did a DDU driver uninstall and reinstall, fixed!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm so glad I got it fixed, it's a decent card.


Grats, must be something to do with the UEFI BIOS on the video card. Luckily those options fixed the booting issue for you.


----------



## mechwarrior

Quote:


> Originally Posted by *KedarWolf*
> 
> I got it fixed!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Found out I could get into PC BIOS with Fast Boot and CSM disabled. Windows would crash after putting my PIN when it booted up. nvflash64 wouldn't run from a DOS boot disk.
> 
> So I booted into Windows with my secondary card in, no 1080 Ti. In the Recovery options in Windows Settings I did the reboot into Advanced Startup, when it booted in the Advanced Startup screen I choose Boot Into A Windows Command Prompt.
> 
> When it was restarting before it booted into the command prompt I unplugged my power cord, put in 1080 Ti, plugged power cord back in.
> 
> I booted into a Windows command prompt, changed directory into my nvflash directory, flashed FTW3 BIOS, booted into Windows, did a DDU driver uninstall and reinstall, fixed!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm so glad I got it fixed, it's a decent card.


Great news, really glad you were able to get your card back to working order. Thanks again for stepping up and taking over this thread.


----------



## Nico67

Quote:


> Originally Posted by *Coopiklaani*
> 
> I didn't optimise the settings for the benchmark, all default settings. Both bios were tested using the same curve shape. Gaming has too many variables, I can try to run some games that have build-in benchmarks, but it is really hard to tell from the gameplay. ~1-2fps different max in most of the cases I suppose.


Not to disparage your testing, but you really have to spend a massive amount of time optimising your curve with the FE, whereas with the XOC you pretty much just set the frequency you want (plus about 3 bins below it). I have only tested SuPo 4K so that's all I can compare, but FE is way ahead of XOC. I think the shunt mod would bring my scores up about 100 points, but it depends how much your card is limiting, and they're all different.

XOC - 10615
FE - 10795 (that's up from 10693 just with more curve tweaking)

My feeling is the XOC is pretty much an unlocked Strix BIOS, and some people found that better performing; unfortunately not me.









What does interest me is why raising curve points below the working frequency has such a huge effect on performance. I mean with the XOC I should just raise 1.075v to 2101 and that should be it, yet raising the three voltage bins below that is worth about 130 points. The only thing I could come up with is that different bins might affect different clocks, like say working-frequency bin -2 is the XBAR frequency or something? Would seem unlikely, but I can't think of much else, as a locked frequency should be just that.


----------



## Hulio225

Quote:


> Originally Posted by *Nico67*
> 
> Not to disparage your testing, but you really have to spent a massive amount of time optimising your curve with the FE, and pretty much just set the frequency (plus about 3 bins below) you want with the XOC. I have only tested SuPo 4K so that's all I can compare, but FE is way ahead of XOC. I think the shunt mod would bring my scores up about 100 points but it depends how much your card is limiting and there all different.
> 
> XOC - 10615
> FE - 10795 (that's up from 10693 just with more curve tweaking)
> 
> My feeling is the XOC is pretty much a unlocked Strix Bios, and some people found that better performing, unfortunately not me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What does interest me is why raise curve points below the working frequency has such a huge effect on performance? I mean with the XOC, I should just raise 1.075v to 2101 and that should be it. However raising three voltage bins below that is worth about 130 points. The only thing I could come up with is that different bins might effect different clocks? Like say working frequency bin -2 is the XBAR frequency or something? would seem unlikely but can't think of much else as a locked frequency should be just that.


It is indeed an unlocked Strix BIOS, it's based on the Strix BIOS...
I'm testing the XOC BIOS once again and I will return to FE once again^^. I was testing it in games, but for me it's not worth it, especially if you have a power-limit mod on your card...


----------



## pantsoftime

Quote:


> Originally Posted by *Slackaveli*
> 
> that 'maxsun' looks like the one. pcb like the FE, higher power limit (350)


Has anyone tried the maxsun BIOS on an FE?


----------



## Slackaveli

Quote:


> Originally Posted by *Dasboogieman*
> 
> Also a massive CPU hog endgame.


even the Broadwell gets choked at the end


----------



## kolkoo

Ah FTW3 Bios has master -> slave switch? So perhaps that's exactly why when we flash it to our FE cards they only draw
Quote:


> Originally Posted by *pantsoftime*
> 
> Has anyone tried the maxsun BIOS on an FE?


I think it's the same as the Palit GameRock BIOS by the looks of it: same company, similar card, similar TDP. I think the reviews were in favor of the FTW3 even when power limited.


----------



## Slackaveli

Quote:


> Originally Posted by *pantsoftime*
> 
> Has anyone tried the maxsun BIOS on an FE?


That's the one I said for them to try as well. They haven't, in this thread at least; I haven't looked around the web or anything. I'm not going to try it, my BIOS is good for me. But on paper that Maxsun is THE one. It's the same PCB and power delivery as an FE; the only difference is higher clocks and the 355W (iirc) power limit. I guess it slipped through the cracks because the brand doesn't sound good. Eagerly waiting to be told we were right on it, though. On paper = gold.


----------



## lanofsong

Hey GTX 1080 Ti owners,

We are having our monthly Foldathon from Monday 22nd - Wednesday 24th - 12noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

May 2017 Foldathon

To get started:

1.Get a passkey (allows for speed bonus) - need a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2.Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## gmpotu

@kedarwolf can you add a link to the DDU uninstall tool to the OP?

I've never used it before and would like to check it out. I normally just choose clean install when updating NVIDIA drivers.


----------



## Cobra652

Rma?? Or is it normal??
Under load above 2ghz, starting with "tktktk" sound, then zzZZZZZZZZZZZZ....
(its not the fans, im using fans always on)


----------



## kolkoo

Quote:


> Originally Posted by *Slackaveli*
> 
> that's the one i said for them to try as well. they havent in this thread at least, havent looked around the web or anything. im not going to try it, my bios is good for me. But, on paper, that maxsun is THE one. It's the same pcb and power delivery as an FE, only difference is higher clocks and the 355W(iirc) power limit. I guess they let it slip thru the cracks b/c it doesnt sound good. Eagerly waiting to be told we were right on it, though. On paper= gold.


As I mentioned in my previous post, this is 99.9% the same BIOS as the Palit GameRock one - https://www.techpowerup.com/vgabios/191499/palit-gtx1080ti-11264-170331 - and that one was tested and found to perform worse than the FTW3 BIOS even though it power limits less.


----------



## TheBoom

Quote:


> Originally Posted by *Cobra652*
> 
> Rma?? Or is it normal??
> Under load above 2ghz, starting with "tktktk" sound, then zzZZZZZZZZZZZZ....
> (its not the fans, im using fans always on)


Coil whine. Don't think you will get an RMA for that. Return if possible, since it seems quite loud. It's up to you, really.


----------



## Hulio225

Quote:


> Originally Posted by *Slackaveli*
> 
> that's the one i said for them to try as well. they havent in this thread at least, havent looked around the web or anything. im not going to try it, my bios is good for me. But, on paper, that maxsun is THE one. It's the same pcb and power delivery as an FE, only difference is higher clocks and the 355W(iirc) power limit. I guess they let it slip thru the cracks b/c it doesnt sound good. Eagerly waiting to be told we were right on it, though. On paper= gold.


Show me where it has the same PCB plz... as far as I can tell it's the same PCB as the Palit GameRock Premium, and that's a bit different to the FE PCB.

Quote:


> Originally Posted by *kolkoo*
> 
> As I mentioned in my previous post this is 99.9% the same bios as the Palit GameRock one - https://www.techpowerup.com/vgabios/191499/palit-gtx1080ti-11264-170331 and this one was tested and found to perform worse than FTW3 bios even if it power limits less.


Edit:

I can definitely say the BIOS is a bit different:


----------



## Slackaveli

Could be mistaken; I stopped paying as much attention to it when I got rid of my FE for the Aorus. I'd defer to you on that one.


----------



## Hulio225

Quote:


> Originally Posted by *Slackaveli*
> 
> could be mistaken, i stopped paying as much attention to it when i got rid of my FE for the Aorus. I would defer to you on that one.


I'm giving it a shot right now. It looks like an older version of the GameRock Premium BIOS, going by the version number... if it's older and optimized closer to an FE card/BIOS it could be good... we will see in a few.


----------



## kolkoo

Quote:


> Originally Posted by *Hulio225*
> 
> im giving it a shot right now, it looks like a older version of the gamerock premium bios, reading the version number of the bios... if its older and closer optimized to a FE card/bios it could be good... we will see in a few


Keep us posted


----------



## superV

Quote:


> Originally Posted by *Cobra652*
> 
> Rma?? Or is it normal??
> Under load above 2ghz, starting with "tktktk" sound, then zzZZZZZZZZZZZZ....
> (its not the fans, im using fans always on)


What PSU do you have? Those cables look old. Try to use a proper PCIe cable from your PSU, not molex extensions.
I'm disappointed, because I wanted that card so bad, but it cost 956 euros, so I bought an FE for 740 and put it on water. In the end all the cards reach 2000/2050, so it's not a big deal, but if you buy a custom card you buy it for better components and a longer lifespan, to avoid things like this happening. Seeing that, it's a shame.
They should focus on fixing problems like those, not on over-9000 power delivery phases.


----------



## Slackaveli

Quote:


> Originally Posted by *Hulio225*
> 
> im giving it a shot right now, it looks like a older version of the gamerock premium bios, reading the version number of the bios... if its older and closer optimized to a FE card/bios it could be good... we will see in a few


Curious to see. I remember when looking at them all concluding that it looked the most intriguing, but I don't remember why I concluded that lol.


----------



## c0nsistent

Well I cleaned out my Antec 620 radiator and put on 2 Yate Loon high-speed fans, and my temps never break 64C under sustained load. I was hitting 71C with lower-speed generic fans, so blowing the dust out and replacing the fans did help a bit, and it even increased my Superposition score by 150 points.

If anyone is wondering whether it's possible to zip-tie on a 120mm AIO cooler, it's 100% doable and works fine. The thermals aren't as good as a proper mount of course, but even with a shim in between I'm at 64C max.


----------



## Hulio225

Quote:


> Originally Posted by *c0nsistent*
> 
> Well I cleaned out my Antec 620 radiator and put 2 Yate Loon high speed fans and my temps never break 64C sustained load. I was hitting 71C with lower speed generic fans but blowing the dust out and replacing the fans did help a bit, and it even increased my Superposition score by 150 points.
> 
> *If anyone is wondering if its possible to zip tie a 120mm AIO cooler*, it's 100% doable and works fine. The thermals aren't as good as a better solution of course, but even with a shim in between, I'm at 64C max.


This is how it was always done back in the days... zip tie 4tw









Edit: The Maxsun BIOS is performing worse than FE...
even with the powerbat and the power limit set to 350W I don't see the card going much over 300W. My card is shunt modded though... don't know what is going on there, but I ran several Time Spy Game Test 1 runs and compared to my database of FE avg fps, and it was always below that...


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Coopiklaani*
> 
> To whom it may concern, XOC bios vs FE bios vs FE bios + shunt at the same core clock.
> 
> NO DIFFERENCE BETWEEN XOC and FE bios in 3Dmark. FE performs better in superposition.


I wish someone would do this with actual gaming benchmarks. The *Heaven* benchmark would have been great for that.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> PC wouldn't even boot, not even into BIOS, black screen, even with the second card in alongside the 1080 Ti. It would only work once I took out the 1080 Ti.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Only way I could boot into BIOS was to disable CSM and Fast Boot. Even then the PC would freeze booting into Windows after putting my password in.
> 
> The workaround I figured out was the only fix.


I'm not sure if you want to try this, but I have an idea that might help you. A while back when I had an XPS 8700, it would not boot or post with the 900 series cards because the BIOS for the mobo was not updated, and it wasn't for months.

One guy got around it like this: he put his computer in sleep mode, flipped off the power, put the card in, and then woke it up. The card was detected and enabled.

You might be able to do this, and as long as you're wearing ESD gloves or are really careful, nothing should happen. But it's still risky. That guy didn't look like he took any ESD measures and nothing happened to him lol.

It's worth a shot. Watch the video carefully.


----------



## visata

I'm building a Ryzen based PC. What's the best 1080Ti to buy? I currently picked this one:
*EVGA GeForce GTX 1080Ti Founders Edition*


----------



## BranField

Quote:


> Originally Posted by *feznz*


Thanks, +rep. Looks like it's 287mm, so it should be able to fit in 11.5in (292mm).


----------



## Slackaveli

Quote:


> Originally Posted by *superV*
> 
> What PSU do you have? Those cables look old. Try to use a proper PCIe cable from your PSU, and not molex extensions.
> I'm disappointed, because I wanted that card so bad, but it cost 956 euros, so I bought an FE for 740 and put it on water. In the end all the cards reach 2000/2050, so it's not a big deal, but if you buy a custom card, you buy it for better components and a longer lifespan, to avoid things like that happening. So seeing that, it's a shame.
> They should focus on fixing problems like those, and not on over-9000 power delivery phases.


This. Can't believe another one slipped through without a proper PSU.


----------



## visata

Quote:


> Originally Posted by *visata*
> 
> I'm building a Ryzen based PC. What's the best 1080Ti to buy? I currently picked this one:
> *EVGA GeForce GTX 1080Ti Founders Edition*


Or maybe this one *ASUS ROG STRIX GeForce GTX 1080 TI 11GB VR Ready*?


----------



## Slackaveli

Quote:


> Originally Posted by *visata*
> 
> Or maybe this one *ASUS ROG STRIX GeForce GTX 1080 TI 11GB VR Ready*?


yep.


----------



## visata

Quote:


> Originally Posted by *Slackaveli*
> 
> yep/


I noticed you own an Aorus card. Should I get the Gigabyte card instead of the Asus (the Aorus is cheaper here!)?
*Gigabyte GeForce GTX 1080 Ti XTREME Edition, 11GB*

It might sound obvious what to get for the best performance, but I'd have to go through 1k pages of this thread







I would like to overclock to 2000MHz later. Will I require custom cooling for that?


----------



## Slackaveli

Quote:


> Originally Posted by *visata*
> 
> I noticed you own Aorus card. Should I get Gigabyte card instead of Asus (aorus is cheaper here!)?
> *Gigabyte GeForce GTX 1080 Ti XTREME Edition, 11GB
> *
> 
> It might sound obvious what to get for the best performance but I have to go through 1k pages of this thread
> 
> 
> 
> 
> 
> 
> 
> I would like to overclock to 2000Mhz later. Will I require a custom cooling for that?


I LOVE mine, it's a very nice card. I think Aorus/Asus/Zotac/EVGA are all great.
The Aorus runs cool and quiet, and all of them hit about 2GHz+.


----------



## ViRuS2k

My 2x Gigabyte FEs hit 2050 on the core in SLI, so I'm pretty happy, though I'm still doing the resistor mod







it will be happening this week









In preparation for this I bought all this stuff, which is ready and waiting:

3x Gelid Solutions GP Extreme Thermal Pad (80 x 40 x 1.0 mm), 12 W/mK
3x Gelid Solutions GP Extreme Thermal Pad (80 x 40 x 0.5 mm), 12 W/mK

Wasn't sure what height I needed lol, hope it's enough for 2x cards....

10x SMD 0805 Resistors 47 Ohms
1x Thermal Grizzly Conductonaut Liquid Metal Thermal Paste - 1g
1x 33+ Scotch Super Electrical Tape, Vinyl, 19 mm x 20 m - Black
1x Thermal Grizzly Liquid Metal Applicator - 3 Pieces
1x Conducting Silver conductive silver paint 3 grams
1x Arctic Silver Thermal Adhesive/Glue/Epoxy 7g (2x 3.5g Tubes)
1x MG Chemicals Translucent Silicone Grease, 85 ml Tube
1x Starbrite Liquid Tape Wire Coating BLACK
1x Coollaboratory Liquid Metal CLU

I'm not exactly sure, but the stuff I gathered was for the normal method of putting liquid metal across the shunts. I'm going to do the resistor method instead, as it is much cleaner
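
For anyone curious about the math behind these mods: the liquid-metal variant works by putting a low-resistance bridge in parallel with the current-sense shunt, so the monitoring chip under-reads the real power draw. A rough sketch of that parallel-resistance math (all values here are illustrative assumptions, not measured from a real card; the 47 Ohm resistor variant works on the sense lines instead and follows different math):

```python
# Parallel-resistance math behind the liquid-metal shunt mod.
# All numbers are illustrative assumptions, not measurements.

def parallel(r1: float, r2: float) -> float:
    """Effective resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

shunt = 0.005    # assumed 5 mOhm current-sense shunt
bridge = 0.002   # assumed resistance of the liquid-metal bridge

r_eff = parallel(shunt, bridge)

# The monitoring chip still assumes the original shunt value, so the
# reported power scales down by r_eff / shunt relative to the real draw.
real_power = 300.0  # watts actually drawn (assumed)
reported = real_power * (r_eff / shunt)

print(f"{r_eff * 1000:.3f} mOhm effective, ~{reported:.0f} W reported")
```

With these made-up numbers the card would report roughly 86W while actually pulling 300W, which is why shunt-modded cards sail past their power limit.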


----------



## feznz

Quote:


> Originally Posted by *lanofsong*
> 
> Hey GTX 1080 Ti owners,
> 
> We are having our monthly Foldathon from Monday 22nd - Wednesday 24th - 12noon EST.
> 
> Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.
> 
> May 2017 Foldathon
> 
> To get started:
> 
> 1.Get a passkey (allows for speed bonus) - need a valid email address
> http://fah-web.stanford.edu/cgi-bin/getpasskey.py
> 
> 2.Download the folding program:
> http://folding.stanford.edu/
> 
> Enter your folding name (mine is the same as my OCN name)
> 
> Enter your passkey
> 
> Enter Team OCN number - 37726
> 
> later
> 
> lanofsong


Should be the perfect way to test for stability and do mankind a favour at the same time.

As the old saying warns: I know it's stable, but can it fold?









There is a reason for the 250W NVIDIA stock power limit; this is arguably the most stressful test you could put a GPU through


----------



## Luckbad

When you put on a waterblock, do you generally have to use the backplate?

I'm wondering because the Zotac 1080 Amp Extreme PCB is very similar to the 1080 Ti Amp Extreme, and there's a Bitspower waterblock for the former. The backplate wouldn't be quite right, though.

I have these two excellent EVGA FTW3 cards, but I'm probably just going to keep the Zotac. It's still better, and the itch to keep the fastest card is there.


----------



## lanofsong

Quote:


> Originally Posted by *feznz*
> 
> Should be the perfect way to test for stability and do man kind a favour at the same time
> 
> I would warn the old saying I know it's stable but can it fold
> 
> 
> 
> 
> 
> 
> 
> 
> 
> there is a reason for the 250w NVidia stock power draw this is probably arguable most stressful test you could put a GPU though


Man, are you in for a treat with the 1080 Ti - 1 million+ points per day, sometimes brushing up close to 1.5 million PPD - crazy numbers


----------



## Hulio225

Quote:


> Originally Posted by *Luckbad*
> 
> When you put on a waterblock, *do you generally have to use the backplate*?
> 
> I'm wondering because the Zotac 1080 Amp Extreme PCB is very similar to the 1080 Ti Amp Extreme, and there's a Bitspower waterblock for the former. The backplate wouldn't be quite right, though.
> 
> I have these two excellent EVGA FTW3 cards, but I'm probably just going to keep the Zotac. It's still better, and the itch is there to keep the fastest card.


The backplate is insignificant in every way; it can just save your GPU if the CPU loop starts to leak^^


----------



## KedarWolf

Quote:


> Originally Posted by *gmpotu*
> 
> @kedarwolf can you add link to the DDU uninstall tool to the OP?
> 
> I've never use that before and would like to check it out. I normally just choose clean install when updating nvidia drivers.


Added to 'How To Flash A BIOS' part.


----------



## seven7thirty30

Love this GPU! Stable "shunt mod" 2088 core / 5940 mem @ 1.093V


----------



## nrpeyton

Any HOF owners noticed more stability at higher switching frequencies?


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> Any HOF owners noticed more stability at higher switching frequencies?


Minimized ripple will do that.


----------



## Nico67

Quote:


> Originally Posted by *seven7thirty30*
> 
> Love this GPU! Stable "shunt mod" 2088 Core/5940 MEM @ 1.093


Scores pretty nice, but min fps is quite low. What card and bios is that?


----------



## seven7thirty30

NVIDIA Founders Edition w/stock BIOS and "resistor mod". That Superposition score was at 4K. Getting 95-110 FPS in Witcher 3 with maxed settings at 2K (Acer XB271HU).


----------



## nrpeyton

Looks good mate. Glad you're enjoying it.

Considering the price we've all paid for these GPUs, I think finding a game you actually really enjoy is *very important.*

"How our KILLER GPUs added to our gaming experience" <---- not discussed enough here?
_(I mean, look at what a lot of people are still getting by with, and still having a great time)!_

We talk a lot about benchmarks but often forget the real reason we spent so much money: so we could have an absolutely phenomenally amazing _(and cutting edge)_ gaming experience.

Having an entire night devoted to gaming can be quite relaxing.

Also, I have HWiNFO64 and RivaTuner set up so I can still monitor everything as I play.

Sometimes we spend so much time performing benchmarks and pushing for that last 2% that, trust me, you forget why you bought it. *I have a new appreciation* for my whole setup since I completed The Witcher 3 yesterday.

Anyone else agree / Any thoughts?


----------



## Nico67

Quote:


> Originally Posted by *seven7thirty30*
> 
> NVIDIA Founders Edition w/stock BIOS and "resistor mod". That Superposition score was at 4K. Getting 95-110 FPS in Witcher 3 with maxed settings at 2K (Acer XB271HU).


I get 64 fps min at similar scores, but that could be more CPU oriented? Should be good with the FE bios though.


----------



## OneCosmic

Quote:


> Originally Posted by *seven7thirty30*
> 
> Love this GPU! Stable "shunt mod" 2088 Core/5940 MEM @ 1.093


Stock driver settings or any adjustments?


----------



## Dasboogieman

Quote:


> Originally Posted by *nrpeyton*
> 
> Any HOF owners noticed more stability at higher switching frequencies?


It's because you need less Vdroop at higher switching frequencies. Wait, you can change the VRM switching frequency on the HOF?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Nico67*
> 
> I get 64 fps min at similar scores, but could be more CPU orientated? should be good with FE bios though.


I saw lower minimums with no CPU OC, so it can definitely make a difference.

What CPU are you using?


----------



## nrpeyton

Quote:


> Originally Posted by *Dasboogieman*
> 
> Its because you need less Vdroop at higher switching frequencies. Wait, you can change the VRM switching frequencies on the HOF?


You could on the 1080 (non-Ti).

Not sure if everything's been released yet for its big brother, but I can't see why it wouldn't be. So even if it's not, it certainly won't be far away.


----------



## seven7thirty30

Quote:


> Originally Posted by *OneCosmic*
> Stock driver settings or any adjustments?


Used my Global Driver settings: 381.89

Single Display Performance Mode
Power: Prefer Maximum Performance
V-sync OFF


----------



## Coopiklaani

My super compact ITX build. Completed a crazy 8hr long FSU stress test with [email protected] on FE BIOS with Shunt mod. Best 24/7 config for me so far


----------



## c0nsistent

Well the XOC BIOS is a LOT hotter than I was expecting. I think I'm going back to the stock BIOS for now until I can improve the cooling.


----------



## Zarotu

When is the next batch of free games coming in? I am interested in purchasing one, but can and will wait until a free game is bundled with it. For Honor didn't seem like a good idea and the ASUS Strix OC version was out of stock.


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> My super compact ITX build.


Can't believe you got 3 radiators into that, lol


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> Can't believe you got 3 radiators into that, lol


I had to, 120+240+280 radiators. Power modded 1080Ti + 6800k @4.5GHz = a loooooot of heat.


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> I had to, 120+240+280 radiators. Power modded 1080Ti + 6800k @4.5GHz = a loooooot of heat.


Aye, I can imagine.

Every INCH (top, front and back) of my own 1080 Ti FE _(which is still on air)_ was too hot to touch for more than a few seconds. The only bits I could actually touch were the PCI-E power cables. _(During a long gaming session @ 75% fan)._


----------



## Nico67

Quote:


> Originally Posted by *SlimJ87D*
> 
> I saw lower minimums with no cpu oc so it can definitely make a difference.
> 
> What cpu aren't you using?


6700K @ 4.7GHz, nothing really spectacular


----------



## dunbheagan

Quote:


> Originally Posted by *Coopiklaani*
> 
> My super compact ITX build. Completed a crazy 8hr long FSU stress test with [email protected] on FE BIOS with Shunt mod. Best 24/7 config for me so far


Nice! I think after 8h FSU you can call it stable









Power draw of your system should be close to 500W at load; that's a lot of heat in such a tiny case. At what rpm do your fans spin during benchmarks/gaming? How about the noise?


----------



## IMI4tth3w

Just got my wife's 1080 Ti Strix OC card today!!

Only briefly messed with it, but it boosts to 1999MHz right out of the box, which seems pretty typical of all 1080 Tis. When it hits 50°C the boost clock does drop 12.5MHz, which I believe is also fairly typical, but it's pretty rock solid at that frequency.

Did some test runs with the Heaven benchmark at 2100MHz with no voltage added and fans on max, and it'll get about halfway through before crashing. Haven't messed with the voltage much more; I'm just going to wait until I get the waterblock and some CLU on the power resistors before trying more overclocks. Crossing fingers this one is a good one, but so far it seems pretty decent!


----------



## KedarWolf

Quote:


> Originally Posted by *dunbheagan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Coopiklaani*
> 
> My super compact ITX build. Completed a crazy 8hr long FSU stress test with [email protected] on FE BIOS with Shunt mod. Best 24/7 config for me so far
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nice! I think after 8h FSU you can call it stable
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Power draw of your system should be close to 500W at load, thats a lot of heat in such a tiny case. At which rpm do your fans spin during benchmarks/gaming? How's about the noise?

I found that without the shunt mod, Fire Strike Ultra stays at lower points on the voltage curve, so you're not even testing your highest clocks.

Until I get my 47 Ohm resistors (Tuesday according to tracking info) I run Time Spy tests 1 and 2 in a loop: test 2 jumps up and down the major clock points on my curve, and test 1 stays pretty much on my highest point. A good stability test for the entire curve, I find.









Or if you just want to test the highest point, I find Heaven works; it'll crash or drop frame rates some time in if I'm the least bit unstable.


----------



## Coopiklaani

Quote:


> Originally Posted by *dunbheagan*
> 
> Nice! I think after 8h FSU you can call it stable
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Power draw of your system should be close to 500W at load, thats a lot of heat in such a tiny case. At which rpm do your fans spin during benchmarks/gaming? How's about the noise?


All my fans are the ultra-low-noise variant; they max out at 1150 to 1200 rpm, and they really did max out during the stress test.


----------



## Coopiklaani

Quote:


> Originally Posted by *KedarWolf*
> 
> I found the no shunt mod Fire Strike Ultra stays at lower points on the voltage curve so you're not even testing your highest clocks.
> 
> Until I get my 47 Ohm resistors (Tuesday according to tracking info) I run Time Spy tests 1 and 2 in a loop, test 2 jumps up and down my major clock points on my curve and test 1 stays pretty much on my highest point, good stability test the entire curve I find.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Or if you just want to test the highest point I find Heaven works, it'll crash and lower frame rates some time in if I'm the least bit unstable.


FSU is extremely power hungry; the first few seconds can draw 370W to 400W. FSU is actually really good for FE bios + shunt mod: it tests the highest bin, and 1 or 2 bins below as well.


----------



## dallas1990

I'm finishing my new rig by ordering either the Aorus GTX 1080 Ti Xtreme or the Zotac GTX 1080 Ti AMP Extreme. I do plan on running two of them, but I'm just ordering one tonight for now. It's time to trade in my two R9 290X 8GB cards for a new dual-card setup. How much more performance does the Zotac have over the Aorus 1080 Ti? And anyone that has these cards, let me know what you're getting out of them please.

TY


----------



## Slackaveli

Quote:


> Originally Posted by *dallas1990*
> 
> im finishing my new rig, by ordering either the aorus gtx 1080ti extreme or the zotac gtx 1080ti amp extreme. i do plan on running 2 of them but im just ordering one tonight for now. its time to trade in my 2 r9 290x 8gb cards for a new dual card setup. how much more performance is the zotac over the aorus 1080ti? and anyone that has these cards let me know what your getting out of them please
> 
> 
> 
> 
> 
> 
> 
> TY


they are basically the same.


----------



## dallas1990

Ah ok, so I guess it boils down to looks then. Though when I look at the specs closer, it's a 13MHz difference out of the box.


----------



## Luckbad

Quote:


> Originally Posted by *dallas1990*
> 
> im finishing my new rig, by ordering either the aorus gtx 1080ti extreme or the zotac gtx 1080ti amp extreme. i do plan on running 2 of them but im just ordering one tonight for now. its time to trade in my 2 r9 290x 8gb cards for a new dual card setup. how much more performance is the zotac over the aorus 1080ti? and anyone that has these cards let me know what your getting out of them please
> 
> 
> 
> 
> 
> 
> 
> TY


They're not a whole lot different. The Zotac can give you a bit more power, but you're more limited by voltage, temperature, and silicon lottery with the 1080 Ti once you pass 350W or so of power.

I've had most of the current crop of top-of-the-line 1080 Tis and like the Zotac Amp Extreme the most on air. Its cooling is bonkers and you can keep it very quiet with a decent fan profile while still maintaining solid temps. The Aorus comes really close and I'd have likely kept either a Zotac or Aorus based on silicon lottery.

I personally dislike the Asus Strix OC because the fans are an annoying pitch to me. They're quiet, but I hear them independently of the other 13 fans in my system that are all a particular sound.

The FTW3 is the most interesting of the cards for its temperature sensors and independent fan control. If you want to go SLI with only air cooling, this is your best option in my opinion. They're actually 2 slot cards, while most of the other flagships are much closer to 3 slots. A bit louder than the Zotac, Asus, and Gigabyte/Aorus because of the slimmer profile.

Aaaand I'll take that back. If you don't want to do a custom water cooling loop but want to SLI, get a pair of hybrids. The EVGA SC2 Hybrid is really solid and has lots of temperature sensors. Presumably the MSI/Corsair hybrid is good as well.

Am I missing any? MSI. That's the only flagship manufacturer I haven't tried because I got mad about some warranty crap with them a few years back and haven't purchased anything from them since.


----------



## Benny89

Can you shunt mod ASUS STRIX also? Anyone tried that or could provide photos with instruction how to do this? Thanks!


----------



## Nico67

Quote:


> Originally Posted by *Benny89*
> 
> Can you shunt mod ASUS STRIX also? Anyone tried that or could provide photos with instruction how to do this? Thanks!


Just use the XOC bios, it would accomplish the same thing but eliminate any throttling and allow more voltage.


----------



## Hulio225

Quote:


> Originally Posted by *Coopiklaani*
> 
> To whom it may concern, XOC bios vs FE bios vs FE bios + shunt at the same core clock.
> 
> *NO DIFFERENCE BETWEEN XOC and FE bios in 3Dmark*. FE performs better in superposition.


I have to return to that bold part here.
Your conclusion and your tests completely contradict what I can show and prove^^ somehow it makes no sense.
How many Time Spy runs have you done? If it was just one on each bios, and the scores don't represent the median of at least 3-5 runs each, your results could be flawed, because the CPU points easily fluctuate 200-250 points each run.
That's why I always used Time Spy game test 1: I ran it 10 times on each bios, took the median, and compared the median average fps of each bios against the other.
And I can show that, with exactly the same testing conditions, I'm losing 1 average fps with the XOC bios in Time Spy game test 1.

So my questions are: what were your VRAM clocks, and which card exactly do you have (is it the Strix or the FE)?

I'm just trying to figure out why people come to the conclusion that those bioses perform the same in 3DMark, because I and other fellas here with FE cards have shown more than once that this is not the case.
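
The run-it-many-times-and-take-the-median idea can be sketched like this (the fps numbers are invented placeholders, not real results; the point is that one noisy run can't skew a median comparison the way it skews a single-run comparison):

```python
# Compare two bioses by the median of repeated runs instead of one run.
# fps values below are invented placeholders for illustration only.
from statistics import median

fe_runs  = [131.2, 130.8, 131.5, 130.9, 131.1]  # e.g. Time Spy GT1 avg fps, FE bios
xoc_runs = [130.1, 130.4, 129.9, 130.2, 130.0]  # same card and clocks, XOC bios

fe_med, xoc_med = median(fe_runs), median(xoc_runs)
print(f"FE {fe_med} fps vs XOC {xoc_med} fps, delta {fe_med - xoc_med:.1f} fps")
```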


----------



## Hulio225

Quote:


> Originally Posted by *Benny89*
> 
> Can you shunt mod ASUS STRIX also? Anyone tried that or could provide photos with instruction how to do this? Thanks!


This:
Quote:


> Originally Posted by *Nico67*
> 
> Just use the XOC bios, it would accomplish the same thing but eliminate any throttling and allow more voltage.


Other than that, you can shunt mod every graphics card. Just look at the PCB: if there are shunts, you can mod them. Shunt resistors differ significantly in size and appearance from other SMD resistors, so they are no problem to spot.



Naked Strix PCB: there you go, I marked the shunts with a red circle... very easy, those shunts are a lot bigger than normal SMD resistors


----------



## Benny89

BTW, how do you OC SLI in AB? When I set up SLI inside the case, will AB detect the cards automatically, or do I need to do something?

Also, how do I flash the BIOS in SLI?


----------



## Hulio225

Quote:


> Originally Posted by *Benny89*
> 
> BTW how do you OC SLI in AB? When I setup SLI inside case, AB will detect them on auto or I need to do something?
> 
> *Also- how to flash BIOS on SLI?*


Just go to the "How To Flash A BIOS" thread from KedarWolf; he has put so much work into the first post, it explains everything....


----------



## BrainSplatter

Quote:


> Originally Posted by *Benny89*
> 
> BTW how do you OC SLI in AB? When I setup SLI inside case, AB will detect them on auto or I need to do something?


AB will detect both cards, but u will lose your previous single-GPU profiles I think. In AB u can then choose to use the same OC settings for both cards or to adjust each card separately (might be useful if both cards have different voltage requirements and u want to squeeze out the last bit of performance).

And regarding flashing, u can choose which card to flash on the command line. Ofc, KedarWolf's thread explains everything nicely as pointed out above.


----------



## Hulio225

Quote:


> Originally Posted by *BrainSplatter*
> 
> AB will detect both cards but u will loose your previous single GPU profiles I think. In AB u can then choose to use the same OC settings for both cards or to adjust each card separately (might be useful if both cards have different voltage requirements and u went to squeeze out the last bit of performance).
> 
> And regarding flashing, u can choose the card to flash on the command line during the flashing. Ofc, KedarWolfs thread explains everything nicely as pointed out above.


If you lose profiles, just open the old profile file with a text editor and copy pasta it into the new profile file ...


----------



## BrainSplatter

Quote:


> Originally Posted by *Hulio225*
> 
> If you lose profiles just open the old profile-file with a text editor and copy pasta it into the new profile file ...


Thanks, good idea. The same probably applies when switching BIOS versions with a different card model ID.


----------



## Hulio225

Quote:


> Originally Posted by *BrainSplatter*
> 
> Thanks, good idea. The same probably applies when switching BIOS versions with a different card model ID.


Indeed it does


----------



## ELIAS-EH

Thank u all for your recommendations. I bought the Strix 1080 Ti; my 6900K @ 4.3GHz is waiting for this beast.


----------



## Benny89

Quote:


> Originally Posted by *BrainSplatter*
> 
> AB will detect both cards but u will loose your previous single GPU profiles I think. In AB u can then choose to use the same OC settings for both cards or to adjust each card separately (might be useful if both cards have different voltage requirements and u went to squeeze out the last bit of performance).
> 
> And regarding flashing, u can choose the card to flash on the command line during the flashing. Ofc, KedarWolfs thread explains everything nicely as pointed out above.


You can OC each card separately? Ooo! I always thought they had to be the same, as in some SLI videos the clocks were always identical.
Quote:


> Originally Posted by *Hulio225*
> 
> This:
> Other than that you can shunt mod every graphics card, just look at the PCB if there are shunts you can shunt mod them, shunt resistors differ in size and optics significantly from other smd resistors so no problem to detect that.
> 
> 
> 
> Naked Strix PCB: there you go i marked the shunts with a red circle... very easy those shunts are a lot bigger than normal smd resistors


BIG THANKS! +Rep. I will try the shunt mod after I test how both cards OC in SLI.


----------



## Hulio225

Quote:


> Originally Posted by *Benny89*
> 
> You can OC each card separetely? Ooo! I always thought they should be the same as I saw in some SLI videos that clocks were always identical.
> BIG THANKS! +Rep. I will try shunt mod after I will test how both cards OC on SLI.


But I would try the XOC bios first, since it's based on the Strix bios


----------



## BrainSplatter

Quote:


> Originally Posted by *Benny89*
> 
> You can OC each card separetely? Ooo! I always thought they should be the same as I saw in some SLI videos that clocks were always identical.


Usually in SLI, u want identical clocks and memory speeds in order to avoid frame pacing issues, since each card usually renders alternate frames. Having different OC profiles is mostly useful when u want to optimize the voltage curve and your SLI cards have different voltage requirements. It's also useful when u have cards with different base clocks (e.g. one FE card and one AIB card).

And finally, it can be useful to have a separate profile for your main card for games which don't support SLI. If on air, in non-SLI mode u can often clock your single card a little higher due to the lower temperature. Also, u usually want to use the faster card as the main card. As a result u can create a non-SLI profile with clock speeds which would not work on the second card (as long as u only play non-SLI games it wouldn't matter to apply that profile to the 2nd card, but if u forget to switch back for SLI mode, that would crash your game).


----------



## Coopiklaani

Quote:


> Originally Posted by *Hulio225*
> 
> I have to return to that bold part here.
> Your conclusion and your tests completely speak against what i can show and prove^^ somehow it makes no sense.
> How many Time Spy runs have you done, if just one on each bios and the scores don't represent the median out of at least 3-5 runs each, your results could be flawed cause cpu points are fluctuating 200-250 points each run easily.
> Thats why i always used Time Spy game test 1, ran it 10 times on each bios created the median and compared average median fps for each bios with each other.
> And i can show with exactly the same testing conditions i'm loosing 1 average fps with the XOC bios in time spy game test 1.
> 
> So my questions are here, what was your Vram clocks, and what card do you exactly have (is it the Strix or the FE)
> 
> Im just tryin to figure out why people come to the conclusion that those bioses perform the same in 3D Mark, cause me and other fellas here with FE cards have shown more than once that this is not the case.


I took the highest graphics score out of three runs for each benchmark. My card is an FE card and is running +500 on the vram.


----------



## Benny89

Quote:


> Originally Posted by *Hulio225*
> 
> But i would try the XOC bios first, since its based on the strix bios


Should I use DDU before I flash my BIOS, or after the post-flash reboot, then install drivers and reboot again?

Which is better?


----------



## Hulio225

Quote:


> Originally Posted by *Coopiklaani*
> 
> I took the highest graphics score out of three runs for each benchmark. My card is an FE card and is running +500 on the vram.


Now it's getting annoying, because I can show you that the XOC bios will definitely score lower average fps than the FE bios in Time Spy at 2101MHz | +500 mem on an FE card
Quote:


> Originally Posted by *Benny89*
> 
> Should I use DDU before I flash my BIOs or after I reboot after flashing and then install drivers and reboot again?
> 
> What is better?


You flash the bios, then you start DDU; it will restart into Windows safe mode, clean everything up, and restart into normal Windows, and then you install fresh drivers.

And don't forget to open the command prompt as admin when flashing the bios
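
The flash step itself can be sketched like this (a hypothetical helper, not from the thread; the `nvflash64` executable name, the `-6` subsystem-ID override, and `--index=` for picking a card in an SLI setup are the commonly used options and may differ between nvflash versions):

```python
# Sketch of the flash step only; DDU and the reboots happen outside this.
# Flag names are assumptions based on common nvflash usage.
import subprocess

def flash(rom_path: str, index: int = 0, dry_run: bool = True):
    # --index= selects which GPU to flash (relevant for SLI),
    # -6 overrides the PCI subsystem-ID mismatch check.
    cmd = ["nvflash64", f"--index={index}", "-6", rom_path]
    if dry_run:
        # Just show the command instead of flashing.
        return " ".join(cmd)
    # Must be run from an elevated (admin) prompt.
    subprocess.run(cmd, check=True)

print(flash("xoc.rom", index=1))
```

Run DDU and reinstall drivers after the flash, as described above; the script only covers the nvflash call itself.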


----------



## Coopiklaani

Quote:


> Originally Posted by *Hulio225*
> 
> Now its getting annoying, because i can show you that the XOC bios will definitely score lower average fps than the FE bios in time spy with 2101MHz | +500 Mem and a FE card
> You flash bios on it and then you start ddu, it will restart into windows save mode it will clean all up and restart into normal windows and than you install fresh drivers.
> 
> And don't forget to open command promt as admin for flashing bios


Well, is it possible it's because I didn't run DDU after the bios flash?
http://www.3dmark.com/spy/1788173 TS FE bios + shunt
http://www.3dmark.com/spy/1787236 TS FE bios no shunt
http://www.3dmark.com/spy/1787105 TS XOC bios no shunt


----------



## Besty

Quote:


> Originally Posted by *nrpeyton*
> 
> Any HOF owners noticed more stability at higher switching frequencies?


I got my HOF today. I am waiting on Bitspower to release their block and on the Galax XOC bios, but I need to plug the HOF in to make sure it's not DOA. I will give it a test run tonight once the air temps have dropped.

My FE on warm liquid benches at 212x with a custom curve and no modded bios or hardware. I got the lucky dip with that one. I doubt the HOF will better that with its stock bios on air, but I am open-minded.

Galax did a 2.2GHz demo on air with the non-Ti 1080 HOF at Computex last year, so I live in hope.


----------



## Benny89

Who is on a Strix, has flashed XOC, and sees the same performance at the same clocks? It was originally a Strix bios, so there shouldn't be a performance decrease.


----------



## Hulio225

Quote:


> Originally Posted by *Coopiklaani*
> 
> well is it possible that I didn't run DDU after bios flash?
> http://www.3dmark.com/spy/1788173 TS FE bios + shunt
> http://www.3dmark.com/spy/1787236 TS FE bios no shunt
> http://www.3dmark.com/spy/1787105 TS XOC bios no shunt


This could be the problem; when I didn't DDU after flashing another bios, I noticed lower fps in my testing until I DDU'ed.

Edit:

One comparison is missing from your three tests: TS XOC with shunt. If you then started to get lower scores, we would know that shunt + XOC is causing the issue, assuming your tests aren't flawed by the missing DDU


----------



## Benny89

Quote:


> Originally Posted by *Hulio225*
> 
> This could be the problem, since if you don't ddu after flashing another bios, i noticed less fps in my testing until i ddu'ed.
> 
> Edit:
> 
> One comparison is missing in your three tests and this would be TS XOC with shunt, and if you would start to get lower scores then, we would know that shunt + XOC is causing that issue, if your tests aren't flawed related to not DDU


So right after I finish flashing BIOS - start DDU, wipe, reboot, clean install, right?

Do you disable GPU before flashing in Device Manager?


----------



## Coopiklaani

Quote:


> Originally Posted by *Hulio225*
> 
> This could be the problem, since if you don't ddu after flashing another bios, i noticed less fps in my testing until i ddu'ed.
> 
> Edit:
> 
> One comparison is missing in your three tests and this would be TS XOC with shunt, and if you would start to get lower scores then, we would know that shunt + XOC is causing that issue, if your tests aren't flawed related to not DDU


I personally don't think the shunt mod will make any difference on a card running the XOC bios, since XOC already has the power limit fully lifted.


----------



## fisher6

I've been using the XOC bios for weeks now, and even though it gives slightly lower scores in benchmarks, it never throttles for me while gaming. I've been playing ME:A for hours and the clock/memory never moved. The game is smoother overall.


----------



## Hulio225

Quote:


> Originally Posted by *Benny89*
> 
> So right after I finish flashing BIOS - start DDU, wipe, reboot, clean install, right?
> 
> Do you disable GPU before flashing in Device Manager?


Yep, that's the process. No need to disable the GPU; new versions of NVFlash do that automatically.
Quote:


> Originally Posted by *fisher6*
> 
> I've been using the XOC bios for weeks now and even though it gives slightly lower scores in benchmarks it never throttles for me while gaming, Been playing ME:A for hours and the clock/memory never moved. Game is smoother overall.


Yep, compared to a card with no shunt mods, XOC wins easily.
And in games like ME:A even a shunt-modded card is still power limited...
There are scenarios where it is better, no doubt, nothing new here.


----------



## Benny89

Quote:


> Originally Posted by *fisher6*
> 
> I've been using the XOC bios for weeks now and even though it gives slightly lower scores in benchmarks it never throttles for me while gaming, Been playing ME:A for hours and the clock/memory never moved. Game is smoother overall.


May I make a kind request? Could you run some game benchmarks (like the ROTR bench, Far Cry Primal bench, etc.) at the same clock/mem with the stock BIOS vs the XOC BIOS? I would like to know if XOC really gives you better average/min/max fps than stock.

Because I know XOC can give lower bench scores, I wonder if fps in games will increase from the constant clock speed.

I don't see the point of using the XOC BIOS if it does not give better performance in games.

Thank you in advance.


----------



## Hulio225

I bet XOC will beat my FE bios under normal everyday-use conditions, even on my shunt-modded card.
When my case is fully closed and my fans are barely spinning, temps can reach 42/43 °C on the GPU -> it down-clocks by 2 bins. In addition, the standard shunt mod is not enough for every game, so there are further downclocks...

The performance differences on my everyday Windows installation on my main NVMe drive are a lot smaller than on my bench OS.

I mean, my bench OS gives me a whole 3.5% more performance compared to my regular Windows....

To be honest, I have always made my comparisons on my bench OS, never on my regular daily Windows...

Funny thing: I ran Time Spy test 1 twice on the FE bios a couple of minutes ago and got 69.4 FPS, and on the run after I got 69.48 FPS (2101 | +500).
Then I flashed XOC again, DDU'ed, and ran two game test 1 runs again under regular Windows with the case closed, like I would play my games daily...
got 69.80 and 69.90 FPS.

I would say under normal everyday-use conditions it can be better for sure, but somehow under strict testing conditions, like on my bench OS with an open case and fans at maximum, it scores worse...
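For what it's worth, the median-of-runs comparison used here is easy to script. The numbers below are the FE/XOC figures from this post; the helper name is made up:

```python
from statistics import median

def pct_diff(runs_a, runs_b):
    """Percent change of median FPS going from config A to config B."""
    ma, mb = median(runs_a), median(runs_b)
    return (mb - ma) / ma * 100.0

fe_runs = [69.4, 69.48]    # FE bios, Time Spy game test 1
xoc_runs = [69.80, 69.90]  # XOC bios, same conditions
print(f"{pct_diff(fe_runs, xoc_runs):+.2f}%")  # +0.59%
```

With differences this small, more runs per config would be needed before calling either bios faster.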


----------



## fisher6

Quote:


> Originally Posted by *Benny89*
> 
> May I have a kind request? Could you run some game benchmark (like ROTR bench, Far Cry Primal bench etc.) on same clock/mem with stock BIOS vs XOC Bios? I would like to know if XOC will really give you more average/min/max fps than stock.
> 
> Becasue I know that XOC can give lower bench score so I wonder if fps in games will see increase from constant clock speed.
> 
> Since I don't see a point of using XOC Bios if it does not give better performance in games.
> 
> Thank you in advance.


I'm not able to answer whether it gives better performance in gaming; all I know is that if there is a difference, it's likely minimal. I get 80-90 fps in ME:A at 3440x1440. The good thing about the XOC bios is that it totally eliminates thermal and power throttling for me. That said, if there is any game with a benchmark you would like me to run, I can try. I don't have FC Primal or ROTR.

Quote:


> Originally Posted by *Hulio225*
> 
> Yep, thats the process, no need to disable gpu, new versions of NVFlash is doing that automatic.
> Yep in comparison to card with no Shunt mods XOC has it easy to win.
> And in games like ME:A even a shunt modded card is limiting af...
> There are scenarios where it is better no doubt and nothing new here.


Yup, I love seeing the clock stuck at whatever I set it to. The only downside is temps have gone up by a few degrees due to the increased voltage/power draw but I'm under water so it doesn't really matter.


----------



## the9quad

Quote:


> Originally Posted by *fisher6*
> 
> I'm not able to answer whether it gives better performance in gaming, all i know is if there is a difference then it's likely minimal. I get 80-90 fps in ME:A at 3440x1440. The good thing about the XOC bios is that it totally eliminates thermal and power throttling for me. That said, if there is any game that I has a benchmark you would like me to run I can try. i don't have FC Primal or ROTR.
> Yup, I love seeing the clock stuck at whatever I set it to. The only downside is temps have gone up by a few degrees due to the increased voltage/power draw but I'm under water so it doesn't really matter.


Can you fire up ME:A, turn on the OSD, and just take a 20-second clip of the title screen? I would like to see the 80-90 FPS at 3440x1440 on ultra settings in ME:A. I realize my CPU is a little old, but I am getting around 60 fps on that title screen with all settings turned up. So if what you are doing helps that much, i.e. a 20-30 fps increase, I might give it a try if possible.


----------



## Hulio225

Quote:


> Originally Posted by *the9quad*
> 
> Can you fire up ME:A, and turn on the OSD, and just take a 20 second clip of the title screen? I would like to see the 80-90 FPS at 3440x1440 on ultra settings in ME:A.


3440x1440 is about 5 million pixels; 3840x2160 is about 8.3 million pixels.

I get like 40 FPS on the title screen at 4K, so if we scale by pixel count: 40 FPS * (8.3/5) ≈ 66

He should get approximately 66 FPS @ 3440x1440 if his card is at 2100 | +500 ^^
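That back-of-envelope scaling can be scripted; with exact pixel counts (rather than rounded millions) the estimate lands around 67 FPS. The function below is just that one division, assuming a purely GPU-bound, pixel-proportional workload:

```python
# Rough FPS estimate when changing resolution, assuming the workload is
# purely GPU-bound and scales inversely with pixel count (a crude model;
# real games rarely scale perfectly).
def scale_fps(fps, from_res, to_res):
    from_px = from_res[0] * from_res[1]
    to_px = to_res[0] * to_res[1]
    return fps * from_px / to_px

# ~40 FPS on the title screen at 4K -> estimate for 3440x1440
estimate = scale_fps(40, (3840, 2160), (3440, 1440))
print(round(estimate))  # 67
```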


----------



## Coopiklaani

Quote:


> Originally Posted by *Hulio225*
> 
> Yep, thats the process, no need to disable gpu, new versions of NVFlash is doing that automatic.
> Yep in comparison to card with no Shunt mods XOC has it easy to win.
> And in games like ME:A even a shunt modded card is limiting af...
> There are scenarios where it is better no doubt and nothing new here.


Quote:


> Originally Posted by *Hulio225*
> 
> This could be the problem, since if you don't ddu after flashing another bios, i noticed less fps in my testing until i ddu'ed.
> 
> Edit:
> 
> One comparison is missing in your three tests and this would be TS XOC with shunt, and if you would start to get lower scores then, we would know that shunt + XOC is causing that issue, if your tests aren't flawed related to not DDU


I'll try to run some gaming benchmarks later today. I don't have ME, but I'll test the following games using their built-in benchmarks (all at 2560x1440):
For Honor
GR: wildlands
FFIV benchmark
RotTR
Civilization VI

with the following mod options:
XOC + shunt mod
FE + shunt mod
FE no shunt mod
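That plan works out to a 15-run matrix; a throwaway sketch of the pairings:

```python
from itertools import product

games = ["For Honor", "GR: wildlands", "FFIV benchmark", "RotTR", "Civilization VI"]
configs = ["XOC + shunt mod", "FE + shunt mod", "FE no shunt mod"]

# every game/config pairing that needs a benchmark pass
runs = list(product(games, configs))
print(len(runs))  # 15
```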


----------



## UdoG

Quote:


> Originally Posted by *fisher6*
> 
> I've been using the XOC bios for weeks now and even though it gives slightly lower scores in benchmarks it never throttles for me while gaming, Been playing ME:A for hours and the clock/memory never moved. Game is smoother overall.


So you just install the XOC bios and don't change anything within Afterburner?


----------



## Benny89

Quote:


> Originally Posted by *Coopiklaani*
> 
> I'll try to run some gaming benchmarks later today. I don't have ME, but I'll test following games using their build-in benchmarks (all in 2560x1440):
> For Honor
> GR: wildlands
> FFIV benchmark
> RotTR
> Civilization VI
> 
> with following mod options:
> XOC + shunt mod
> FE + shunt mod
> FE no shunt mod


Please also include XOC without shunt mod vs FE with shunt mod. And remember the DDU; I wonder if that will do the trick.

Thanks for your time. +Rep


----------



## KedarWolf

Quote:


> Originally Posted by *Benny89*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Hulio225*
> 
> This could be the problem, since if you don't ddu after flashing another bios, i noticed less fps in my testing until i ddu'ed.
> 
> Edit:
> 
> One comparison is missing in your three tests and this would be TS XOC with shunt, and if you would start to get lower scores then, we would know that shunt + XOC is causing that issue, if your tests aren't flawed related to not DDU
> 
> 
> 
> So right after I finish flashing BIOS - start DDU, wipe, reboot, clean install, right?
> 
> Do you disable GPU before flashing in Device Manager?
Click to expand...

With newest versions of nvflash I've never had to disable the GPU, it does it automatically.

I know of one person though who claimed the BIOS never flashed until he did disable it in device manager.

See here.

http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti/0_20


----------



## Benny89

Quote:


> Originally Posted by *KedarWolf*
> 
> With newest versions of nvflash I've never had to disable the GPU, it does it automatically.
> 
> I know of one person though who claimed the BIOS never flashed until he did disable it in device manager.
> 
> See here.
> 
> http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti/0_20


But disabling the GPU before flashing, just in caaase, should not cause any problems, right?

Also, when I use the XOC bios I should ignore the following:

"_Important note: After flashing your BIOS or reinstalling your video card driver unzip the file below and right click it, Edit it using NotePad, change the '-pl 3**' to your max TDP on your card, Save As the file, choose 'All Files', not '.txt', save the powerlimit.bat file, than right click it and 'Run As Adminstrator_' "

As the XOC bios does not have a power limit, there is no max TDP to be set.

I am asking because last time I flashed the XOC bios to my STRIX I was still getting power-limit perf caps and the clock was still downclocking in Heaven, so I think I did something wrong. I didn't DDU after the flash, so it might be that!
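For reference, the "-pl" switch in that quoted note matches nvidia-smi's power-limit flag, so a powerlimit.bat of that kind presumably looks something like this. This is a guess at the file's contents, not the actual file, and the 300 W value is a placeholder for your card's max TDP:

```shell
:: powerlimit.bat (sketch) -- run as Administrator
:: -pl sets the board power limit in watts; replace 300 with your card's max TDP
nvidia-smi -pl 300
```

On the XOC bios the power limit is already lifted, which is why the file isn't needed there.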


----------



## Hulio225

Quote:


> Originally Posted by *Benny89*
> 
> But disabling GPU before flashing "just in caaase"
> 
> 
> 
> 
> 
> 
> 
> should not cause any problems, right?
> 
> Also, when I use XOC BIos I should ignore the following:
> 
> "_Important note: After flashing your BIOS or reinstalling your video card driver unzip the file below and right click it, Edit it using NotePad, change the '-pl 3**' to your max TDP on your card, Save As the file, choose 'All Files', not '.txt', save the powerlimit.bat file, than right click it and 'Run As Adminstrator_' "
> 
> As XOC Bios does not have power limit there no max TDP to be set.
> 
> I am asking because last time I flashed XOC Bios to my STRIX I was still getting power limit perfs and clock was still downclocking in Heaven so I am thinking I have done something wrong. I didn't do DDU after flash so it might be that!


No, it won't cause anything but wasted time, because the program does it by itself.

And you don't need that power-limit file.

And last time I bet you did something wrong; I guess you didn't flash the bios at all because you didn't run the command prompt as admin, or your card was faulty or something.


----------



## chef1702

Or some other weird stuff happened. I never DDU'ed, nor did I run the command prompt as admin. Just hold Shift while you are in the nvflash folder, right-click on free space and click "open command prompt here". Flash the XOC bios and restart twice. On the first reboot the card gets "installed" (the display goes on and off) and no Nvidia Control Panel is available. After the second reboot all is fine and the Nvidia Panel is there.

Running XOC on an FE without any problems. No performance decrease -> in my case / 1987 MHz <- (don't wanna start the XOC performance discussion again). You know the bios changed when you have to set your settings in Afterburner again, because the program thinks you got a new GPU after the bios change (not sure about Strix cards, because you go from a Strix bios to the XOC Strix bios).


----------



## the9quad

Quote:


> Originally Posted by *Hulio225*
> 
> This resolution is about 5 million pixels; 3840x2160 is about 8.3 million.
> 
> I get like 40 FPS on the title screen, so if we scale by pixel count: 40 FPS * (8.3/5) ≈ 66
> 
> He should get approximately 66 FPS @ 3440x1440 if his card is at 2100 | +500 ^^


That's what I thought, no way 80-90 fps...


----------



## KedarWolf

Quote:


> Originally Posted by *Benny89*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> With newest versions of nvflash I've never had to disable the GPU, it does it automatically.
> 
> I know of one person though who claimed the BIOS never flashed until he did disable it in device manager.
> 
> See here.
> 
> http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti/0_20
> 
> 
> 
> But disabling GPU before flashing "just in caaase"
> 
> 
> 
> 
> 
> 
> 
> should not cause any problems, right?
> 
> Also, when I use XOC BIos I should ignore the following:
> 
> "_Important note: After flashing your BIOS or reinstalling your video card driver unzip the file below and right click it, Edit it using NotePad, change the '-pl 3**' to your max TDP on your card, Save As the file, choose 'All Files', not '.txt', save the powerlimit.bat file, than right click it and 'Run As Adminstrator_' "
> 
> As XOC Bios does not have power limit there no max TDP to be set.
> 
> I am asking because last time I flashed XOC Bios to my STRIX I was still getting power limit perfs and clock was still downclocking in Heaven so I am thinking I have done something wrong. I didn't do DDU after flash so it might be that!
Click to expand...

I have two different GPUs in my PC, and when I disabled my 1080 Ti the screen would freeze. Most people shouldn't have trouble though.


----------



## KedarWolf

Quote:


> Originally Posted by *Hulio225*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Benny89*
> 
> But disabling GPU before flashing "just in caaase"
> 
> 
> 
> 
> 
> 
> 
> should not cause any problems, right?
> 
> Also, when I use XOC BIos I should ignore the following:
> 
> "_Important note: After flashing your BIOS or reinstalling your video card driver unzip the file below and right click it, Edit it using NotePad, change the '-pl 3**' to your max TDP on your card, Save As the file, choose 'All Files', not '.txt', save the powerlimit.bat file, than right click it and 'Run As Adminstrator_' "
> 
> As XOC Bios does not have power limit there no max TDP to be set.
> 
> I am asking because last time I flashed XOC Bios to my STRIX I was still getting power limit perfs and clock was still downclocking in Heaven so I am thinking I have done something wrong. I didn't do DDU after flash so it might be that!
> 
> 
> 
> No it won't cause anything but wasting time because the program is doing it by itself
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And you don't need that power-limit file.
> 
> And last time i bet you did something wrong, i guess you haven't flashed the bios at all because you haven't ran command prompt as admin or your card was faulty or so.
Click to expand...

Yeah, with the XOC BIOS you don't need the power limit file.


----------



## Benny89

Quote:


> Originally Posted by *chef1702*
> 
> Or some other weird stuff happend. I never DDUed nor did I ran the command promt as admin. Just hold Shift while you are in the nvflash folder, right-click on a free space and click "open command promt here". Flash the XOC Bios and restart twice. First reboot the card gets "installed" (display goes on and off) and no Nvidia Control Panel is available. Second reboot all is fine, Nvidia Panel is there.


I didn't have that. After the restart I had everything, including my Afterburner settings... I will have to try again.

But it flashed... to some point. I had a greyed-out Power Limit slider in AB and there was no thermal throttling before 62C...

Well, waiting for my SLI STRIX to arrive. Got the ROG HB Bridge today ^^. So happy I managed to get it!


----------



## herpderpsky

hey guys, here are my results, all on boost, idle and load


----------



## Hulio225

just a pic


----------



## feznz

Quote:


> Originally Posted by *Hulio225*
> 
> I have to return to that bold part here.
> Your conclusion and your tests completely contradict what I can show and prove^^ somehow it makes no sense.
> How many Time Spy runs have you done? If just one on each bios, and the scores don't represent the median of at least 3-5 runs each, your results could be flawed, because CPU points easily fluctuate 200-250 points each run.
> That's why I always used Time Spy game test 1, ran it 10 times on each bios, took the median, and compared the median average fps for each bios.
> And I can show, under exactly the same testing conditions, that I'm losing 1 average fps with the XOC bios in Time Spy game test 1.
> 
> So my questions here are: what were your VRAM clocks, and which card exactly do you have (the Strix or the FE)?
> 
> I'm just trying to figure out why people conclude that those bioses perform the same in 3DMark, because me and other fellas here with FE cards have shown more than once that this is not the case.


Quote:


> Originally Posted by *Coopiklaani*
> 
> To whom it may concern, XOC bios vs FE bios vs FE bios + shunt at the same core clock.
> 
> NO DIFFERENCE BETWEEN XOC and FE bios in 3Dmark. FE performs better in superposition.


Yawn. It would be good if someone found an extra 2-3% from a bios flash at the same clocks, when most people are finding differences within the margin of error.
It seems to be only the higher power limit that makes for slightly higher sustained clocks.

Seems to me it's up to an extra 150W of power draw for a 5% gain, if you guys think it's worth it........
I already went through a 2-month RMA process with MSI years ago, so lesson learnt; based on my location in NZ, some people would be angry after a week.

If I had known the process time I might have tried to re-ball the GPU myself.


----------



## Coopiklaani

XOC or not, shunt mod or not, performance isn't very different. It's mostly just margin of error imo.


----------



## Benny89

Quote:


> Originally Posted by *Coopiklaani*
> 
> 
> XOC or not, shunt mod or not, performances aren't very different. It's more just margin of error imo.


Nice! Thanks. For me as a STRIX owner this is important: with SLI on air I will gladly get rid of the power and thermal limits. I just wanted to know it won't decrease performance. Thank you very much for your time.

Can't wait to load XOC on those STRIX in SLI. The days are getting looonger.

+Rep of course


----------



## gmpotu

Planning to try out XOC tonight if I have time. Anyone on air using XOC have data on what temps you're reaching with 50% or 75% fan speeds, and your ambient temps?

I want to try 2101 with a 75% fan but I don't want to burn up my card on air.


----------



## dallas1990

How big of a PSU should I have for 2 Zotac 1080 Ti AMP Extremes? I have a 1000-watt EVGA SuperNOVA G2. My mobo is an MSI Godlike Carbon with 16 GB RAM and an Intel 6800K.


----------



## TheScorpio32

Hello guys, I just bought a GTX 1080 Ti Strix (non-OC). I was actually super worried that Asus might bin their cards.. but man, it was a long day and I had already spent hours searching around, so I just didn't bother grabbing the OC.. Anyways, I tested it with the OC profile and it boosted up to 1911. Is that promising, or normal?
I mean I didn't even get to tinker with it manually yet; had a long day, maybe tomorrow.


----------



## Benny89

Quote:


> Originally Posted by *dallas1990*
> 
> How big of psu I should have for 2 Zotac 1080ti extremes. I have a 1000 watt evga supernova g2. My mobo is msi godlike carbon with 16 gb ram and Intel 6800k.


The 1000 G2 should be ok if you don't shunt mod your cards. I also have the EVGA 1000 G2 and am awaiting my 2 STRIX cards.
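A rough budget for that build, with placeholder wattages: the ~330 W per-card figure is an assumed worst case for a raised-limit AMP Extreme, and 200 W for the 6800K platform is a guess:

```python
# Back-of-envelope PSU sizing for a 2x 1080 Ti + 6800K system.
# All wattages are assumptions, not measured figures.
gpu_watts = 2 * 330      # two cards at a raised power limit
platform_watts = 200     # CPU, board, RAM, drives, fans
total = gpu_watts + platform_watts
headroom = 1000 - total  # against a 1000 W unit
print(total, headroom)   # 860 140
```

Shunt mods would push the GPU figure well past this, which is why they change the PSU math.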
Quote:


> Originally Posted by *TheScorpio32*
> 
> Hello guys, Ive just bought a GTX 1080 ti Strix (Non-OC), I was actually super worried that Asus might Bin their card.. but man it was a long day and I already spent hours searching around so just didnt bother grabbing the OC.. Anyways, I tested it with the OC profile and it boosted up to 1911, is that promising? or normal?
> I mean i didnt even get to tinker with it manually yet, had a long day maybe tomorrow


1911 on a non-OC Strix seems ok. The STRIX OC usually hits around 1967-2000 depending on your silicon lottery ticket... Push the power limit to max and start adding little by little to the core. Find your stable core OC first, before you start OCing your memory.


----------



## dallas1990

Ok, so I'm good to go then. I'm keeping them air-cooled and not going to overclock them. Never been good at it lol. Fried an R9 260 doing that lol, so I just stick with the out-of-the-box speed.


----------



## Coopiklaani

Quote:


> Originally Posted by *dallas1990*
> 
> How big of psu I should have for 2 Zotac 1080ti extremes. I have a 1000 watt evga supernova g2. My mobo is msi godlike carbon with 16 gb ram and Intel 6800k.


I don't think ASUS bins their cards. I'm rocking an Asus FE card and it hits 2101MHz using the stock bios.


----------



## cg4200

So I grabbed a Gigabyte FE for 666.56 new yesterday, could not pass it up.. sold my Titan Pascal (2016) a little while ago so as not to lose much..
I needed something to wait on while team red figures out how to launch a graphics card... ugh
So I was reading through some of the posts, wondering if someone will help me out a little..
I see the XOC bios; what is the verdict? Is it better for FE cards, or no better?
I am gonna test my card tomorrow first, but then I will put my old Titan P waterblock on..
Also, what about the EVGA FTW3 bios, any good?
Or should I just do what I did to my old card and shunt it up? Thanks for the advice.


----------



## jase78

Quote:


> Originally Posted by *Benny89*
> 
> Nice! Thanks. For me as STRIX owner this is important as with SLI on air I will gladly get rid of power and thermal limit. Just wanted to know if it won't decrease performance
> 
> 
> 
> 
> 
> 
> 
> . Thank you very much for your time.
> 
> Can't wait to load XOC on those STRIX SLI. Days are getting looonger
> 
> 
> 
> 
> 
> 
> 
> 
> 
> +Rep of course


Strix SLI?? I'm positive I remember you gave your wife the Strix and finally settled on FTW3 SLI??

On another completely unrelated note... I wonder why companies like Newegg keep changing their return policies.


----------



## MunneY

Quote:


> Originally Posted by *nrpeyton*
> 
> Can't believe you got 3 radiators into that, lol


Really?





http://imgur.com/tw6ya


----------



## gmpotu

Question about flashing Bios before I jump into this.

1) I downloaded the XOC bios from the OneDrive link in the OP's post, but it is a .rar file. Do I need to unpack it to get the .rom?
2) If I go back to my default bios, do I need to run powerlimit.bat for the rest of the time I own the card, or do I just run it once to change the power limit settings back to default? The wording at http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti/0_30 makes it seem like once you flash a new bios, or flash back to your original, this .bat file will be needed forever.


----------



## Dasboogieman

Quote:


> Originally Posted by *dallas1990*
> 
> Ok so I'm good to go then. I'm keeping them air cooled and not going to overclock them. Never been good at it lol. Fried a r9 260 doing that lol so I just stick with the speed out of the box.


In all fairness, it's really, really hard to kill a GTX 1080 Ti (or any modern Pascal card for that matter) nowadays without hard mods.

Some of the cards you see in this thread can even survive coolant spills on the PCB. My current Aorus card is particularly tough: it survived a straight liquid metal spill, some of which got under the BGA contacts on the VRAM, and it is still running fine.

AMD cards have traditionally let you tweak very freely in software, so it's easy to push them too far.


----------



## gmpotu

What does the -6 do in this line?
nvflash64 -6 S1080TIXOC.rom


----------



## KickAssCop

-6 is to force the flash: it lets nvflash proceed when the new bios's PCI subsystem ID doesn't match your board, after asking you to confirm the mismatch.
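For context, a typical flash session from the guides linked in this thread looks roughly like this. Filenames are examples, and the exact flag set can vary by nvflash version:

```shell
# from an elevated command prompt, in the nvflash folder
nvflash64 --save backup.rom      # back up the current bios first
nvflash64 --protectoff           # disable the EEPROM write protect
nvflash64 -6 S1080TIXOC.rom      # flash, overriding the ID mismatch check
```

Keeping the backup .rom means you can always flash back to stock the same way.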


----------



## gmpotu

So I took the XOC challenge today and found similar results. My tests were not very extensive at all, so probably not the best, since I only did one run of each benchmark with each bios, and these results are well within the margin of error I'm sure. Here is what I found though...


My settings... idle temps @ 38C with the fan at 0%, room temp about 22C.


Interesting takeaways:
I couldn't run stable @ 2062MHz during Superposition 4K. It would crash even at 1.100v until I dropped it down to 2050MHz core.
If I increased the fan to 100% full time, without using the fan profile, I was able to pass Superposition 4K with my highest score ever of 10080.
These cards really like cool temps. I never went above 57C in any tests; all averaged right about 57C and sat there with 82% fan speed.

I tried 2101 with 100% fan, but obviously that crashed on me as well.

I haven't tried increasing the voltage beyond 1.100v yet.

*Edit* Looks like with XOC the voltage moves on its own beyond 1.100v. It just jumped to 1.111v in WoW for me.

I like the XOC bios, but I think I may switch back to the STRIX OC bios and be happy with my curve. Maybe go for a lower-profile one.


----------



## Dasboogieman

Quote:


> Originally Posted by *gmpotu*
> 
> So I took the XOC challenge today and found similar results. My tests were not very extensive at all so probably not the best since I only did one run of each benchmark with each bios and these results are well within the margin of error I'm sure. Here is what I found though...
> 
> 
> My settings... Idle temps @38C with the fan at 0% room temp about 22C
> 
> 
> Interesting takeaways:
> I couldn't run stable @2062mhz during Supo 4k. It would crash even at 1.100v until I dropped it down to 2050mhz core.
> If I increased the fan to 100% full time without using the fan profile I was able to pass Supo4k with my highest score ever of 10080.
> These cards really like cool temps. I never went above 57C in any tests. All averaged right about 57C and sat there with 82% fan speed.
> 
> I tried 2101 with 100% fan but obviously that crashed on me as well.
> 
> I haven't tried increasing voltage beyond 1.100v yet.
> 
> *Edit* looks like in XOC the voltage moves on its own beyond 1.100v. It just jumped to 1111v in WoW for me.
> 
> I like the XOC bios but I think I may switch back to STRIX OC and be happy with my curve. Maybe go for a lower profile.


This is exactly the information we need. I'm not surprised the XOC BIOS doesn't regress the Strix cards it was designed for, because when I flashed the XOC BIOS on my Aorus, I actually saw a performance regression: the XOC BIOS would give 1-2% lower performance across the board despite unlimited power and higher clocks.

Now I am fairly certain VRAM straps are involved. The VRAM timings in the XOC bios are simply either too loose or poorly calibrated for the VRAM traces used on other boards. GDDR5X needs to be retrained every so often, and I guess the default memory-training routines from Asus are not well suited to the board traces of other cards.

I mean, when I was messing with my DDR4 tertiary timings, a few loose timings on the physical training side could easily swing the effective performance of the DRAM subsystem by 1-2%.


----------



## gmpotu

Quote:


> Originally Posted by *Dasboogieman*
> 
> This is exactly the information we need, I'm not surprised the XOC BIOS doesn't regress the Strix cards it was designed for. Because when I flashed the XOC BIOS on my Aorus, I actually see a performance regression. Like the XOC BIOS would give 1-2% lower performance across the board despite unlimited power and higher clocks.
> 
> Now I am certain that VRAM straps are involved. The VRAM timings on the Strix + XOC is simply either too loose or poorly calibrated for the VRAM traces used on the other boards. This is because GDDR5x needs to be trained every so often and I guess the default memory training routines for Asus are not well suited to the Mobo traces of other boards.
> 
> I mean when I was messing with my DDR4 tertiary timings, the difference of a few loose timings on the physical training side can easily swing the effective performance of the DRAM subsystem by 1-2%.


The memory timings are a bit over my head. I'm just diving into overclocking again and very behind on the details.

So with XOC, is it basically like pressing L and locking a voltage/frequency bin on the curve in other bioses? My card never throttles with this bios; it just goes up in voltage. So should I be trying to find the max voltage that a bin goes up to, and set that voltage as the starting voltage for the clock I want? (For example, if 2062 starts at 1.075v but by the end goes up to 1.100v, should I just set it to 1.100v?)


----------



## Luckbad

Bitspower has indicated via an email reply that Zotac 1080 Ti Amp Extreme waterblocks are coming after Computex. I didn't get a date, but since they're coming, I'm keeping my golden Zotac.


----------



## Nico67

Quote:


> Originally Posted by *gmpotu*
> 
> The memory timings is a bit over my head. I'm just diving into overclocking again and very behind on the details.
> 
> So with XOC is it basically like pressing L and locking a voltage/frequency bin on the curve in other Bioses? My card never throttles with this Bios and it just goes up in voltage. So should I be trying to find the max voltage that a bin goes up to and set that voltage as the starting voltage for the clock I want? (Example If 2062 starts at 1.075v but by the end goes up to 1.100v should I just set it to 1.100v?


If you set a point and the curve is flat from there up, it really shouldn't go up in voltage. However, there were some posts indicating that the curve may change at different temps, so it's possible that while it doesn't downclock, it moves to a curve that has a higher voltage at that point; something they wouldn't need to factor in for a subzero bios.

Really, your performance should be a lot better with XOC, since you're not power limited anymore, so any game or benchmark that was limiting should be better. Things I found with XOC that you couldn't tell with a standard bios were:

1/ Voltage affects performance, so if you run a single raised frequency at different voltages you can optimise your performance. 1.075v worked best for me.

2/ Raising the next 3 lower bins as high as you can, but at least one clock bin below the optimised bin from above, made a huge improvement. I was able to run, say, 2088 @ 1.043, 2088 @ 1.050, 2088 @ 1.062 and my optimised bin 2101 @ 1.075.
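The bin-raising idea in 1/ and 2/ can be sketched as a dictionary transform over the voltage/frequency curve. The starting curve points below are invented; only the 2088/2101 target bins mirror the post:

```python
# Sketch of editing a V/F curve the way the post describes: raise the few
# bins below the target, then pin the optimised top bin.
# Keys are voltages in V, values are clocks in MHz; the base curve is made up.
curve = {1.031: 2050, 1.043: 2063, 1.050: 2076, 1.062: 2076, 1.075: 2088}

def apply_bins(curve, bins):
    """Return a copy of the curve with the given voltage points re-pinned."""
    out = dict(curve)
    out.update(bins)
    return out

tuned = apply_bins(curve, {1.043: 2088, 1.050: 2088, 1.062: 2088, 1.075: 2101})
print(tuned[1.075], tuned[1.043])  # 2101 2088
```

In Afterburner you would do the same thing by dragging those four curve points by hand; this just makes the shape of the edit explicit.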


----------



## lilchronic

I find that when locking the curve to a certain voltage point/clock, the performance is lower than if you clock with just the slider and offset, even with power throttling.

For instance, my card clocks up to 1911MHz with the power and voltage sliders maxed out, and in GT1 of Firestrike I get 143 FPS every time.
If I lock voltage to .925v and 1911MHz I get 138 FPS. I even went up to 1.025v to see if that would help, but it was still the same.


----------



## Benny89

Quote:


> Originally Posted by *jase78*
> 
> strix sli?? positive i remember you gave your wife strix and finally settled on ftw3 sli??
> 
> on another completely unrelated note... I wonder why companies like newegg keep changing their return policies


I gave my STRX to my wife (well, not yet, waiting for two cards). My FTW3 preorder was moved to 05.06.2017 so I said- screw it EVGA will your always "late at the party" policy, canceled my pre-order and ordered 2 more STRIX cards (got nice 50USD discount for each of them).

And here I am, waiting for them


----------



## TheScorpio32

So my non-OC edition STRIX was able to reach 2012MHz. Well, it starts off at 2025, then stabilizes at 2012MHz. I didn't even touch any voltage sliders or memory yet, just primarily focused on the core. Sounds promising?
30,312 graphics score in Firestrike (the regular one)


----------



## Nico67

Quote:


> Originally Posted by *TheScorpio32*
> 
> So my Non-OC edition STRIX was able to reach 2012Mhz , well it starts off 2025 then stabilize at 2012Mhz, Didn't even touch any Voltage sliders or memory yet, just primarily focusing on the Core, Sounds Promising?
> 30,312 Graphics Score in Firestrike (Regular one)


Quality in terms of what it can reach out of the box is not so easy to quantify. A lot depends on the curve it is running (not sure how much they vary even among similar models), but also on how much heat your particular card generates. So if 2025 is the max frequency on the curve and it drops one bin to 2012 due to temp downclocking, then it's probably pretty good.
You then need to look at what it can do with 120% power / 90c temp while raising the clocks, to see how far you can go without crashing, as that seems to be more of a quality metric. Then power limiting becomes a factor in what frequency you can really sustain, which is also where cooling comes in.
The voltage slider doesn't actually raise voltage; it just allows the card to use extra voltage if the curve continues to rise into bins higher than about 1.050v. So you can only control voltage via the curve, and just turn it on/off via the slider.
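For reference, the one-bin drop mentioned above (2025 to 2012) falls out of the roughly 13.5 MHz step Pascal uses between boost bins. A tiny sketch; the step size and rounding behaviour are assumptions, not an NVIDIA spec:

```python
# Assumed ~13.5 MHz Pascal bin step, rounded to the clocks monitoring tools
# report; each temperature threshold crossed drops the card one bin.
def bin_clocks(top=2025.0, step=13.5, bins=4):
    """Clock after 0..bins-1 one-bin temperature downclocks."""
    return [round(top - i * step) for i in range(bins)]

print(bin_clocks())  # [2025, 2012, 1998, 1984]
```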


----------



## Aganor

Hi guys, i just installed my EK waterblock on my 1080ti FE but i used the upper thread as the Outlet port instead of the Inlet as the manual indicates.
On my past rig i did the same but i never noticed if it was reversed. Does it make any difference temp wise?

On the CPU i know we have to respect it but for the GPU i dont know if thats mandatory?


----------



## Nico67

Quote:


> Originally Posted by *Aganor*
> 
> Hi guys, i just installed my EK waterblock on my 1080ti FE but i used the upper thread as the Outlet port instead of the Inlet as the manual indicates.
> On my past rig i did the same but i never noticed if it was reversed. Does it make any difference temp wise?
> 
> On the CPU i know we have to respect it but for the GPU i dont know if thats mandatory?


It may do, but it's not likely to be that bad. Definitely not mandatory at any rate; it's not like the water doesn't flow or anything


----------



## Dasboogieman

Quote:


> Originally Posted by *Aganor*
> 
> Hi guys, i just installed my EK waterblock on my 1080ti FE but i used the upper thread as the Outlet port instead of the Inlet as the manual indicates.
> On my past rig i did the same but i never noticed if it was reversed. Does it make any difference temp wise?
> 
> On the CPU i know we have to respect it but for the GPU i dont know if thats mandatory?


Theoretically, you should stick to the factory-advised inlet -> outlet. The block is designed to vent the head pressure from the left thread (nearer the I/O ports) down vertically over the central fins, where it spreads laterally and then makes its way toward the outlet. Going in reverse, as you have done, may give you a large flow-restriction penalty, as there will be massive turbulence from the water going through the fins and up through the inlet, leading to suboptimal temperatures.

All that being said, if your pump is powerful enough it shouldn't matter, but I don't see a reason to run the loop outside of recommended.


----------



## Coopiklaani

Quote:


> Originally Posted by *MunneY*
> 
> Really?
> 
> 
> 
> 
> 
> http://imgur.com/tw6ya


Nice build. But I think there's too much restriction on the front intake; it will produce negative pressure.


----------



## Aganor

Quote:


> Originally Posted by *Dasboogieman*
> 
> Theoretically you should stick to the factory advised inlet->outlet. This is because the way that block is designed is to vent the head pressure from the left thread (nearer towards the I/O ports) down vertically over the central fins where it spreads laterally then makes its way towards the exit outlet. Going in the reverse as you have done may give you a large flow restriction penalty as there will be massive turbulence from the water going through the fins and up through the inlet thus leading to suboptimal temperatures.
> 
> All that being said, if your pump is powerful enough it shouldn't matter but I don't see the reason why the loop needs to be run outside of recommended.


The system is already in the rig. I have a D5 pump, so it's decent, but the reason the loop is like that is that I was so tired when I mounted it last night; by the time I noticed, everything was mounted with liquid inside lol
But I'll make the effort and redo everything


----------



## cg4200

Quote:


> Originally Posted by *gmpotu*
> 
> Question about flashing Bios before I jump into this.
> 
> 1) I downloaded the XOC bios from that onedrive link in the post from the OP but it is a .rar file. Do I need to unpack that to get the .rom?
> 2) If I go back to my default bios do i need the powerlimit.bat to run for the rest of the time I own the card or do I just run it one time to change the power limit settings back to default? The wording on the http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti/0_30 makes it seem like once you flash a new bios or back to your original this .bat file will be needed forever.


I am wondering the same thing after flashing back to my bios.
Do I need to run powerlimit.bat one time, or every time?
Just wondering: if I do a fresh install of Windows, will I have to remember powerlimit.bat too? Thanks, I have not flashed since the Maxwell Titan.


----------



## pez

Quote:


> Originally Posted by *mcbaes72*
> 
> Yeah, I'm not getting suckered into answering that question.
> 
> On Topic Inquiry:
> I'm looking into another 1080 Ti, but hybrid version for another build. From my research, only MSI and EVGA makes an AIO for the GPU. For those that have them....
> 
> 1) How's cooling performance?
> 2) Noise level?
> 
> TIA.


Noise and temps are going to be ahead of pretty much anything short of a custom loop, especially if you use a different fan than stock.
Quote:


> Originally Posted by *AngryLobster*
> 
> I have my case wide open with no panels in a 17c room. It's functionally a test bench with fans.
> 
> Load Tomb Raider or the new Prey at 4K and stock the fans ramp up to 1700RPM and are too loud for my taste. Other games like Hitman keep the card much quieter with results comparable to reviews.
> 
> I'm convinced they are just not pushing the cards hard enough because 1400RPM at 66c is just out of the realm of possibility from my testing.
> 
> I'd love if another STRIX owner can chime in. I'm about to throw my Morpheus on it but the warranty sticker scares me given how terrible Asus RMA is.


You're in a room that's 62F/17C? What's the rest of your setup like? Even in open air, a GPU can recirculate hot air.
Quote:


> Originally Posted by *Dasboogieman*
> 
> There are some rare cares with exceptionally expensive low-noise inductors but I cannot recall who has atm. IIRC ASUS and maayyyybe MSI might still use them but I cannot be sure. btw they are noise-reduced, not noiseless, they still whine except at a less audible frequency.
> 
> The best way to deal with coil whine is to pad down the coils with something e.g. Thermal pads with pressure from the HSF, Hot glue around the inductors, Seksui Thermal tape to brace all the coils together etc etc.
> 
> I am actually baffled why the questions about coil whine keep coming up time and time again. Wikipedia explains the phenomenon quite well, all inductors will make noise, its just inherent to the physics of how they work. The noise they make is not indicative of their quality (barring comparison of conventional coils to special purpose low-noise ones) or even their state of function (i.e. functioning as designed). The real questions should be how much noise?, is it audible? and what has been done to mitigate it?


My Ti FE has significant coil whine, whereas my TXP has none whatsoever (or at least none that can be heard over fan noise and ambient noise).
Quote:


> Originally Posted by *nrpeyton*
> 
> The end of an *era*, tonight; for me. lol
> 
> After upgrading to the 1080Ti I felt it was my duty to FINALLY finish the Witcher 3 (which I started playing in 2015).
> 
> Apparently I have 85hrs of gameplay. That's about 45 minutes a week for 2 years lol.
> 
> What the hell am I meant to do now, lol
> 
> Honest, can't believe I finally completed it.
> 
> Went out in style too, I begun the game on a GTX 980 & AMD FX CPU, then upgraded to GTX 980 SLI, then continued on the GTX 1080. And finally completed it on the 1080Ti & 7700k at 70 FPS (some scenes) at 4k Ultra.


This sounds like my journey with Doom, though in a much shorter span of time. I think my process went something like 970 SLI > 1080 SLI > TXP > TXp. I actually finished it on the TXp, so I guess I at least got to see what it was like with a bang before I returned it.


----------



## KedarWolf

Quote:


> Originally Posted by *cg4200*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gmpotu*
> 
> Question about flashing Bios before I jump into this.
> 
> 1) I downloaded the XOC bios from that onedrive link in the post from the OP but it is a .rar file. Do I need to unpack that to get the .rom?
> 2) If I go back to my default bios do i need the powerlimit.bat to run for the rest of the time I own the card or do I just run it one time to change the power limit settings back to default? The wording on the http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti/0_30 makes it seem like once you flash a new bios or back to your original this .bat file will be needed forever.
> 
> 
> 
> I am wondering the same thing after flashing back to my bios ..
> Do I need to set the powerlimit.bat one time? or every time?
> just wondering if I do fresh install of windows will I have to remember to powerlimit.bat also? Thanks have not flashed since titan Maxwell..

Depends on the BIOS. With the Strix OC BIOS, once it's set, it stays.

Palit BIOS you need to do it every boot. Others I'm not sure.

And you should do it every time you flash a BIOS or update the Nvidia driver.

Also you can use Windows Task Scheduler and have it run 60 seconds after every boot at highest privileges. I say 60 seconds because it is just enough time for driver etc. to fully initialize before applying the power limit.
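Something like this from an elevated prompt would set that up. This is a sketch: the task name and .bat path are placeholders, and /DELAY uses mmmm:ss format:

```shell
schtasks /Create /TN "GPUPowerLimit" ^
  /TR "C:\nvflash\powerlimit.bat" ^
  /SC ONLOGON /DELAY 0001:00 /RL HIGHEST
```

/RL HIGHEST runs the task with highest privileges, which the .bat needs to touch the driver's power limit.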

XOC BIOS no need to use .bat file.


----------



## pfinch

I noticed that I'm still getting power limits all the time with the Strix OC BIOS (EVGA FE card)... already used DDU


----------



## Hulio225

Quote:


> Originally Posted by *pfinch*
> 
> I noticed that i'm stil getting Power-Limits all the time with StrixOC (EVGA FE Card) ... already used DDU


You need the XOC Bios to get rid of the power limits...
*X*treme *O*ver*c*locking


----------



## BoredErica

Quote:


> Originally Posted by *Dasboogieman*
> 
> Now I am certain that VRAM straps are involved. The VRAM timings on the Strix + XOC is simply either too loose or poorly calibrated for the VRAM traces used on the other boards. This is because GDDR5x needs to be trained every so often and I guess the default memory training routines for Asus are not well suited to the Mobo traces of other boards.


I ran tests more than once and saw a small improvement with XOC over FTW3 on EVGA SC Black (Founders). You're probably over-generalizing.


----------



## pfinch

Quote:


> Originally Posted by *Hulio225*
> 
> You need the XOC Bios to get rid of the power limits...
> *X*treme *O*ver*c*locking


Oh sorry, I flashed XOC :-( ... my fault

MSI Afterburner Beta 8 immediately logs a power limit.

Power and temp target are greyed out, so it has to be XOC


----------



## mcbaes72

Quote:


> Originally Posted by *pez*
> 
> Noise and temps are going to be pretty much ahead of anything short of a custom loop should you use a different fan than stock.


That's what I was hoping for. Thanks for replying.


----------



## ALSTER868

Guys, how much does the XOC bios raise the temps on a custom air-cooled card?
Say, on a regular Strix compared to its stock bios.


----------



## gmpotu

Quote:


> Originally Posted by *Nico67*
> 
> If you set a point and its flat from there up, it really shouldn't go up in voltage. However there were some posts that indicated that a curve may change at different temps, so its possible that while it doesn't downclock, it could move to a curve that has the voltage higher at that point. Something they wouldn't need to factor in for a subzero bios.
> 
> Really your performance should be a lot better with the XOC as your not limiting anymore, so any game or benchmark that was limiting should be better. Things I found with XOC, that you couldn't tell with a standard bios where,
> 
> 1/ voltage effects performance, so if you run a single raised frequency at different voltages you could optimise your performance. 1.075 worked best for me.
> 
> 2/ raise the next 3 lower bins as high as you can but at least one frequency clock below the optimised bin from above, made a huge improvement so I was able to run say 2088 @ 1.043, 2088 @ 1.050, 2088 @ 1.062 and my optimised bin 2101 @ 1.075.


Thanks, I'll try playing around with this when I get home tonight. Last night I only had time for a quick comparison: one run of each benchmark on each BIOS.

I'll try editing my curve to optimize performance, and then I'll see if I can push higher than 2062 with more voltage. I was able to game in WoW for basic non-raid content last night at [email protected] without issue.

My card would probably do fine with the FTW3 BIOS, as it rarely goes above 330w, and when it does it barely goes over. The highest draw I have been able to capture with the power cmd is around 335w, even though I know it sometimes spikes a little higher.

My system usually sits around 400w on the 12v rail, with max spikes of 424w total on the 12v rail. Not sure what else is involved in that 12v rail power draw. The iGPU is enabled in BIOS but nothing is connected, so I would think it would not use much if any power. Other than that I have a typical setup with a bunch of system fans and molex fans, as well as an LED light bar that uses molex power. Two SATA drives, two optical SATA drives, and my 3770k @ 4.2GHz.


----------



## gmpotu

Quote:


> Originally Posted by *ALSTER868*
> 
> Guys, how much does the XOC bios raise the temps? On custom air cooled card.


My temps didn't really go up much at all; my fan profile pretty much keeps them the same. What I noticed is that where STRIX OC was downclocking due to temp or power throttling, I will now just be unstable without the downclock and the system will crash. So I have to raise my voltage curve (assuming; will test tonight to confirm) in order to stay stable.

Basically I see the card reach 57C and 82% fan speed and it kinda just hovers there regardless of what settings I am using. I'm sure my voltage testing tonight will give more insight, and I may see the temps jump slightly, or more likely I just won't be able to push much more than 2062 on air and stay stable.

The STRIX OC doesn't have good cooling over the VRMs, and I think that is what is causing me to crash all the time. Although core temp is fine at 57, my guess is those VRMs are cooking at like 80+ if not more. I don't have a thermal camera, so I can't tell.


----------



## Hulio225

Quote:


> Originally Posted by *gmpotu*
> 
> The STRIX OC doesn't have good cooling over the VRMs and I think that is what is causing me to crash all the time. Although core temp is fine at 57 my guess is those VRMs are cooking at like 80+ if not more. I don't have a thermal camera so I can't tell.


Your crashes are definitely not VRM related! What do the voltage regulators have to do with that @ stock voltage? It's simply your chip not being stable at this frequency, temperature and voltage...


----------



## ALSTER868

Quote:


> Originally Posted by *gmpotu*
> 
> My temps didn't really go up much at all my fan profile pretty much keeps it the same. What I noticed is that where STRIX OC was down clocking due to temp throttling or power throttling I now will just be unstable without the down clock from temp throttling and the system will crash. So I have to up my voltage curve (assuming, will test tonight to confirm) in order to stay stable.
> 
> Basically I see the card reach 57C and 82% fan speed and it kind just hovers there regardless of what settings I am using. In sure my voltage testing tonight will give more insight and I may see the temps jump slightly or more likely I just won't be able to push much more than 2062 on AIR and stay stable.
> 
> The STRIX OC doesn't have good cooling over the VRMs and I think that is what is causing me to crash all the time. Although core temp is fine at 57 my guess is those VRMs are cooking at like 80+ if not more. I don't have a thermal camera so I can't tell.


So, I assume it's safe to try this XOC on a stock cooled Strix, good


----------



## Hulio225

Quote:


> Originally Posted by *ALSTER868*
> 
> So, I assume it's safe to try this XOC on a stock cooled Strix, good


Yeah, why wouldn't it be? I mean, as long as you're not raising the voltage to 1.5V and trying to cool that with an air cooler, nothing will happen...


----------



## Benny89

Quote:


> Originally Posted by *pfinch*
> 
> Oh sorry, i flashed XOC :-( ... my fault
> 
> MSI Afterburner Beta 8 immediately logs Powerlimit.
> 
> Power and Temptarget are greyedout so it has to be XOC


Wait, you flashed XOC and you still have power limits? But does it only SHOW the power limit in AB, or does it downclock your card? Because that might be an OSD bug or something.

Or you flashed it wrong somehow.

I had the same case with my STRIX when I flashed XOC: I was still getting power limit perfcaps.


----------



## Agent-A01

I tried XOC vs FE @ 2113 / 1.062v and saw a slight drop in the Superposition benchmark.

Games were the same though; tested Overwatch and Black Ops 3.
Don't care about one benchmark being slower if games see no reduction.


----------



## KedarWolf

Question for anyone having done the shunt mod.

I'm getting my 47 Ohm resistors later today. Should be about the same as the CLU shunt mod.

After I do it, will the extra voltage going to the card increase my overclock capabilities?

I'm running 2062 core at 1.062v and 6147 memory, stress-testing stable with Fire Strike Ultra and Time Spy stress tests.


----------



## Dasboogieman

Quote:


> Originally Posted by *KedarWolf*
> 
> Question for anyone having done the shunt mod.
> 
> I'm getting my 47 Ohm resistors later today. Should be about the same as the CLU shunt mod.
> 
> After I do it, will the extra voltage going to the card increase my overclock capabilities?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm running 2062 core at 1.062v and 6147 memory stress testing stable with Fire Strike Ultra and Time Spy stress tests.


For what it's worth: my Aorus card is only stable at 2050mhz [email protected], 2037 @ 1.081 and 2013 @ 1V


----------



## gmpotu

Quote:


> Originally Posted by *Hulio225*
> 
> Your crashes are definitely not VRM related! What does the voltage regulators have to do with that @ stock voltage? Its simply your chip being not stable at this frequency, temperature and voltage...


I am a total novice when it comes to overclocking, so I was probably completely wrong with that statement, given your comment. My understanding was that the VRMs helped with the stability of your memory overclock, and my thought was that although my card doesn't go above 57C, the VRMs are probably getting way past 57C, which would be causing my memory overclock to become unstable and crash the system. It sounds like that is totally wrong, though. Would you mind explaining it a little bit so I will have a better understanding?


----------



## gmpotu

Quote:


> Originally Posted by *Benny89*
> 
> Wait, you flashed XOC and you still have power limits? But does it only SHOW power limit in AB or does it downclock your card? Because that might be OSD bug or something.
> 
> Or you flashed it wrong somehow.
> 
> I had same case with my STRIX when I flashed XOC- I was still getting power limit perfs.


I had no performance caps in GPU-Z in any of the benchmarks I ran after flashing XOC, whereas with the STRIX OC I would get pwr caps, and after downclocking because of thermal throttling or power limiting I'd also see vrel caps when the benchmark smoothed back out.

If you guys are still seeing power caps, I would guess that something went wrong with your BIOS flash. My new BIOS had the same number but ended in .58 after successfully flashing XOC, whereas the STRIX OC ended in .28, I think. I'd have to double-check the exact number, but the BIOS version did change after flashing.

Also, I did not run the powerlimit.bat file, and I am running AB 4.3.0, not the beta version.


----------



## seven7thirty30

Quote:


> Originally Posted by *KedarWolf*
> 
> Question for anyone having done the shunt mod.
> 
> I'm getting my 47 Ohm resistors later today. Should be about the same as the CLU shunt mod.
> 
> After I do it, will the extra voltage going to the card increase my overclock capabilities?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm running 2062 core at 1.062v and 6147 memory stress testing stable with Fire Strike Ultra and Time Spy stress tests.


My initial thought is that the "resistor mod" should be more stable than the "CLU mod", since the resistance varies based on how much CLU is applied, whether it moves over time, etc. It would be difficult to tell how much resistance the CLU is providing. I started with CLU but went with resistors because my 2101 core clock was unstable in games, and it felt better than having loose CLU in my rig. It was just a hunch then, but after I used resistors it leveled out. So far, knock on wood, my OC hasn't failed in games. I don't think the change from CLU to resistors will yield a higher overclock; it didn't for me. We're still subject to the silicon lottery.


----------



## Hulio225

Quote:


> Originally Posted by *gmpotu*
> 
> I am a total novice when it comes to overclocking so I was probably completely wrong with that statement given your comment. My understanding was that the VRMs helped with the stability of your memory overclock and my thought was that although my card doesn't go above 57C these VRMs are probably getting way passed 57C which would
> Be causing my memory overclock to become unstable and crash the system. It sounds like that is totally wrong though. Would you mind explaining it a little bit so I will have a better understanding?


The VRM (Voltage Regulator Module) provides the voltage your card needs to run. Those on the Strix card have a 300kHz switching frequency and are rated at 60 amperes per phase @ 125°C... 10 phases on the Strix card, so the VRM can deliver up to 600A of current at 125°C max.
There is no way you are even close to those numbers, so they are running way below what they are capable of.
Like I said, your chip is limiting further OC, definitely not the VRMs.
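A back-of-envelope check of that headroom. The core voltage and board power here are rough assumptions from this thread, and in reality not all board power goes through the core phases:

```python
# Rough per-phase load if the whole card's power went through the core VRM.
# 330 W draw and ~1.05 V core are assumptions from posts in this thread.
def per_phase_amps(card_watts, core_volts, phases):
    return (card_watts / core_volts) / phases

amps = per_phase_amps(card_watts=330, core_volts=1.05, phases=10)
print(f"{amps:.1f} A per phase vs. a 60 A rating")  # ~31.4 A, about half
```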

Quote:


> Originally Posted by *KedarWolf*
> 
> Question for anyone having done the shunt mod.
> 
> I'm getting my 47 Ohm resistors later today. Should be about the same as the CLU shunt mod.
> 
> After I do it, *will the extra voltage* going to the card increase my overclock capabilities?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm running 2062 core at 1.062v and 6147 memory stress testing stable with Fire Strike Ultra and Time Spy stress tests.


There is no extra voltage going through your card after the power limit mod; it's current.
And it will not expand any overclocking capabilities; it simply allows the card to draw more power, therefore it will not downclock itself because it's hitting the power limit.

Pretty much that:
Quote:


> Originally Posted by *seven7thirty30*
> 
> My initial thought is that the "resistor mod" should be more stable than the "CLU mod" since the resistance varies based on how much CLU is applied and whether it moves over time, etc. It would be difficult to tell how much the CLU is providing. I started with CLU, but went with resistors because my 2101 core clock was unstable in games, and I felt better about having loose CLU in my RIG. It was just a hunch then, but after I used resistors it leveled out. So far "knock on wood" my OC hasn't failed in games. I don't think the change from CLU to resistors will yield a higher overclock. It didn't for me. We're still subject to the "silicon lottery".
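The usual explanation of why either shunt mod raises the effective limit: the controller infers current from the voltage drop across a shunt of assumed resistance, so lowering the effective resistance makes it under-report power. A numeric sketch; the resistances and limits here are illustrative, not the 1080 Ti's real values:

```python
# The controller assumes r_stock; if the modded effective resistance is
# lower, the sensed drop shrinks and reported power falls proportionally.
def reported_watts(actual_watts, r_stock, r_effective):
    return actual_watts * (r_effective / r_stock)

# Illustrative: halving the effective shunt makes a real 360 W draw report
# as 180 W, so a 300 W limit never trips.
print(reported_watts(360, r_stock=0.005, r_effective=0.0025))
```

This also shows why the mod changes what the card *reports*, not what the silicon can do: clocks and voltage stability are untouched.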


----------



## lilchronic

So anyone else test this?
Quote:


> Originally Posted by *lilchronic*
> 
> I find when locking the curve to a certain voltage point / Clock the performance is lower then if you clock with just the slider and offset, even with power throttling.
> 
> For instance my card clocks up to 1911Mhz with power and voltage slider maxed out and in GT1 of firestrike and i get 143 FPS every time.
> If i lock voltage to .925v and 1911Mhz i get 138FPS. I even went up to 1.025v to see if that would help but it was still the same.


http://www.3dmark.com/compare/fs/12704552/fs/12704593
with the curve locked at 1911MHz,

vs. 1911MHz with some power limiting down to 1898MHz for a couple of seconds


----------



## superV

Flashed my FE with the XOC, and so far I can reach 2060 without adding voltage. The problem is that in Afterburner the voltage control is locked, even if I copy the card's info into the file. Is that normal?


----------



## Benny89

Quote:


> Originally Posted by *superV*
> 
> flashed my fe with the XOC,and so far i can reach 2060 without adding voltage.the problem is that in afterburner the voltage control is locked even if i copy card's info in the file.it's normal ?


It is normal, since your card is now detected as a different one. If you are not using the beta AB, you have to do the "voltage slider unlock" trick once again.

This confirms that I definitely did something wrong while flashing the BIOS, as my voltage was still on but power was greyed out. Seems like XOC was flashed like... partially?

Damn, and I thought flashing was easy...


----------



## KedarWolf

Quote:


> Originally Posted by *seven7thirty30*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Question for anyone having done the shunt mod.
> 
> I'm getting my 47 Ohm resistors later today. Should be about the same as the CLU shunt mod.
> 
> After I do it, will the extra voltage going to the card increase my overclock capabilities?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm running 2062 core at 1.062v and 6147 memory stress testing stable with Fire Strike Ultra and Time Spy stress tests.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My initial thought is that the "resistor mod" should be more stable than the "CLU mod" since the resistance varies based on how much CLU is applied and whether it moves over time, etc. It would be difficult to tell how much the CLU is providing. I started with CLU, but went with resistors because my 2101 core clock was unstable in games, and it felt better than having loose CLU in my RIG. It was just a hunch then, but after I used resistors it leveled out. So far "knock on wood" my OC hasn't failed in games. I don't think the change from CLU to resistors will yield a higher overclock. It didn't for me. We're still subject to the "silicon lottery".

I'm not using CLU or any mod. So will the resistor mod help my overclocks at all?


----------



## KedarWolf

Quote:


> Originally Posted by *Hulio225*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gmpotu*
> 
> I am a total novice when it comes to overclocking so I was probably completely wrong with that statement given your comment. My understanding was that the VRMs helped with the stability of your memory overclock and my thought was that although my card doesn't go above 57C these VRMs are probably getting way passed 57C which would
> Be causing my memory overclock to become unstable and crash the system. It sounds like that is totally wrong though. Would you mind explaining it a little bit so I will have a better understanding?
> 
> 
> 
> VRM (Voltage Regulator Module) is providing the Voltage your card needs to run, those on the Strix card have 300kHz switching frequency and are rated with 60 Ampere per VRM @ 125 ° C ... 10 Phases on the Strix card, so 10 VRMs which can deliver up to 600A of current at 125 ° C max.
> There is no way you are even close to that numbers, so those are running way below what they are capable of.
> Like i said your chip is limiting further OC, definitely not the VRMs.
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Question for anyone having done the shunt mod.
> 
> I'm getting my 47 Ohm resistors later today. Should be about the same as the CLU shunt mod.
> 
> After I do it, *will the extra voltage* going to the card increase my overclock capabilities?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm running 2062 core at 1.062v and 6147 memory stress testing stable with Fire Strike Ultra and Time Spy stress tests.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> There is no extra voltage going through your card after the power limit mod, its current.
> And it will not expand any overclocking capabilities, it simply allows to draw more power therefor it will not down-clock itself because its hitting power limit.
> 
> Pretty much that:
> Quote:
> 
> 
> 
> Originally Posted by *seven7thirty30*
> 
> My initial thought is that the "resistor mod" should be more stable than the "CLU mod" since the resistance varies based on how much CLU is applied and whether it moves over time, etc. It would be difficult to tell how much the CLU is providing. I started with CLU, but went with resistors because my 2101 core clock was unstable in games, and I felt better about having loose CLU in my RIG. It was just a hunch then, but after I used resistors it leveled out. So far "knock on wood" my OC hasn't failed in games. I don't think the change from CLU to resistors will yield a higher overclock. It didn't for me. We're still subject to the "silicon lottery".
> 

Never mind my question. But wouldn't max voltages stay higher with no or less power limiting? Thus more stable, higher clocks? I hear of people getting 400W to the card; logically it should help higher clocks.


----------



## seven7thirty30

KedarWolf:

My 1080Ti is not "game" stable at anything higher than 2101 @ 1.093V. I tried 1.1V and beyond at higher clocks, but the chip just wouldn't budge. That's just the reality of it. Your card may be better than mine, and I hope it is. Just don't beat yourself up if the "resistor mod" doesn't go the way you want it to.


----------



## KedarWolf

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Question for anyone having done the shunt mod.
> 
> I'm getting my 47 Ohm resistors later today. Should be about the same as the CLU shunt mod.
> 
> After I do it, will the extra voltage going to the card increase my overclock capabilities?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm running 2062 core at 1.062v and 6147 memory stress testing stable with Fire Strike Ultra and Time Spy stress tests.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For what it's worth, my Aorus card is only stable at 2050MHz core @ 1.093V, 2037 @ 1.081 and 2013 @ 1V
Click to expand...

Do you use a shunt mod?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Question for anyone having done the shunt mod.
> 
> I'm getting my 47 Ohm resistors later today. Should be about the same as the CLU shunt mod.
> 
> After I do it, will the extra voltage going to the card increase my overclock capabilities?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm running 2062 core at 1.062v and 6147 memory stress testing stable with Fire Strike Ultra and Time Spy stress tests.


You'll be able to run 1.093v more consistently, except in Time Spy and Superposition 8K.

Your card will produce more heat, about 75 more watts of it. But you're on water so it should be fine.

You fixed your card?

At 1.075v I can game at 2114 MHz, but in return my fans have to spin at 2000 rpm to stay under 50C.

I've since gone to 2100 MHz at 1.062v and my fans don't go above 1500 rpm, which is nicer.

The shunt mod isn't going to let you OC higher, it just lets you sustain a voltage.

Whatever you're able to run at 1.093v, you'll be able to just stay there.
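The "about 75 more watts of heat" estimate is roughly what first-order CMOS dynamic-power scaling predicts, where power grows linearly with clock and quadratically with voltage. A sketch (the 250 W baseline and the clock/voltage pairs are hypothetical examples, not measured figures):

```python
def scaled_power(p_base, f_base, v_base, f_new, v_new):
    """First-order dynamic power estimate: P scales as f * V^2."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# Hypothetical 250 W baseline at 2012 MHz / 1.000 V,
# pushed to 2100 MHz / 1.093 V:
extra = scaled_power(250, 2012, 1.000, 2100, 1.093) - 250
print(round(extra))
```

It's only a rule of thumb (leakage also rises with temperature), but it shows why a modest voltage bump costs a disproportionate amount of heat.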


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Question for anyone having done the shunt mod.
> 
> I'm getting my 47 Ohm resistors later today. Should be about the same as the CLU shunt mod.
> 
> After I do it, will the extra voltage going to the card increase my overclock capabilities?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm running 2062 core at 1.062v and 6147 memory stress testing stable with Fire Strike Ultra and Time Spy stress tests.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You'll be able to run 1.093v more consistently, except in Time Spy and Superposition 8K.
> 
> Your card will produce more heat, about 75 more watts of it. But you're on water so it should be fine.
> 
> You fixed your card?
> 
> At 1.075v I can game at 2114 MHz, but in return my fans have to spin at 2000 rpm to stay under 50C.
> 
> I've since gone to 2100 MHz at 1.062v and my fans don't go above 1500 rpm, which is nicer.
> 
> The shunt mod isn't going to let you OC higher, it just lets you sustain a voltage.
> 
> Whatever you're able to run at 1.093v, you'll be able to just stay there.
Click to expand...

Yes, I fixed my card, long detailed post somewhere how I did it, took three days.


----------



## Caos

How does the 1080 Ti FTW3 behave temperature-wise? Is it hotter than the Strix?

I sold my Strix because I love the design of the FTW3


----------



## the9quad

Quote:


> Originally Posted by *Caos*
> 
> How does the 1080 Ti FTW3 behave temperature-wise? Is it hotter than the Strix?
> 
> I sold my Strix because I love the design of the FTW3


I had a FTW3 on pre-order and cancelled the day before release because I heard they are loud. I too liked the way they looked, but I am not about to put up with a loud card after dealing with 3 reference 290Xs for the last few years.


----------



## Benny89

Quote:


> Originally Posted by *Caos*
> 
> How does the 1080 Ti FTW3 behave temperature-wise? Is it hotter than the Strix?
> 
> I sold my Strix because I love the design of the FTW3


According to GamersNexus' detailed review (you can find it on YouTube), the Strix is both cooler and quieter than the FTW3. The FTW3 is the second-coolest card, but it is the loudest of all the AIBs and at some points is as loud as the FE card.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> Yes, I fixed my card, long detailed post somewhere how I did it, took three days.


I can't find the post anywhere. What'd you have to do to fix it?


----------



## dunbheagan

Quote:


> Originally Posted by *lilchronic*
> 
> So anyone else test this?
> http://www.3dmark.com/compare/fs/12704552/fs/12704593
> with curve 1911Mhz locked
> 
> 1911Mhz with some power limiting to 1898Mhz for a couple seconds


Funny, I experienced the same when I did some undervolting: I noticed bad performance at 1900MHz with low voltage, around 925mV. I had about 5fps less in Doom than I should have, and scored about 9300 points in Superposition 4K. With the same clock at stock voltage I get 9700 points in Superposition 4K and more fps in Doom. I tried this at various clocks but could only replicate the behavior a couple of times, around 1900MHz. When I tried to investigate further yesterday, I couldn't replicate it at all; I got 9700 points with both normal and low voltage. Really strange. Is it possible that the GPU has error correction similar to the memory's? So it won't instantly crash at too low a voltage but becomes slower, because it does some calculations twice? I'm just guessing. But like I said, I couldn't find this behavior at clocks other than 1900. I'm running 1746MHz at 793mV just fine and get higher performance per MHz than at any higher clock and voltage.


----------



## KedarWolf

Quote:


> Originally Posted by *SlimJ87D*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Yes, I fixed my card, long detailed post somewhere how I did it, took three days.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can't find the post anywhere. What'd you have to do to fix it?
Click to expand...

I got it fixed!!









Found out I could get into the PC BIOS with Fast Boot and CSM disabled. Windows would crash after I entered my PIN when it booted up. nvflash64 wouldn't run from a DOS boot disk.

So I booted into Windows with my secondary card in, no 1080 Ti. In the Recovery options in Windows Settings I did the reboot into Advanced Startup, and when it booted to the Advanced Startup screen I chose Boot Into A Windows Command Prompt.

While it was restarting, before it booted into the command prompt, I unplugged my power cord, put in the 1080 Ti, and plugged the power cord back in.

I booted into a Windows command prompt, changed directory into my nvflash directory, flashed the FTW3 BIOS, booted into Windows, did a DDU driver uninstall and reinstall, fixed!!









I'm so glad I got it fixed, it's a decent card.


----------



## feznz

Quote:


> Originally Posted by *Hulio225*
> 
> VRM (Voltage Regulator Module) is providing the voltage your card needs to run. Those on the Strix card have a 300kHz switching frequency and are rated at 60 amperes per VRM @ 125°C ... 10 phases on the Strix card, so 10 VRMs, which can deliver up to 600A of current at 125°C max.
> There is no way you are even close to those numbers, so they are running way below what they are capable of.
> Like I said, your chip is limiting further OC, definitely not the VRMs.
> There is no extra voltage going through your card after the power limit mod, it's current.
> And it will not expand any overclocking capabilities; it simply allows the card to draw more power, therefore it will not down-clock itself because it's hitting the power limit.
> 
> Pretty much that:


I see your REP going up fast, and your answers are pretty much bang on; you must be old school or know how to use Google.

BTW, when I did the potentiometer hard-wire volt mod on my old GTX 770s, I accidentally hit 1.5 volts and simply had an instant crash, but I wouldn't recommend doing that too often.









I was surprised the XOC voltage tool was released to the general public; I was thinking the Asus hotwire was going to be mandatory with the XOC BIOS.
I'm seeing some real hype going on in the red camp. Maybe the green camp is worried, like back in the old days when AMD was competitive... it's getting NVIDIA to start caring a little again.

As for the shunt mod, agreed, you will get the same OC capabilities as the XOC BIOS to a certain extent.
It really depends on whether you are hitting the power limit on the standard BIOS, but really you should have flashed the FTW3 BIOS with the highest power limit of 375W IIRC.

One other thing: residual heat from the VRMs will/can heat up the memory on a Strix, reducing the memory OC. Then again, I am on a universal water block with no fans directly cooling the VRMs and am still getting a 5900MHz 24/7 OC on memory.
But then again I'm not too worried about losing 0.01 FPS; I might win a chocolate fish in the recreational benching forums here.


----------



## Caos

Quote:


> Originally Posted by *the9quad*
> 
> I had a FTW3 on pre-order and cancelled the day before release because i heard they are loud. I too liked the way they looked, but I am not about to have a loud card after dealing with 3 reference 290xs for the last few years.


Quote:


> Originally Posted by *Benny89*
> 
> According to detailed review of GamerNexus (you can find on YouTube) Strix is both cooler and quieter than FTW3. FTW3 is second coolest card but is loudest out of all AIBs and at some point is as loud as FE card.


OK thanks, so did I make a bad play selling my Strix? I'm hesitant to buy the FTW3, although I love its aesthetics.


----------



## KedarWolf

I have my 47 Ohm 0805 resistors but I think I'm going to wait for the weekend to do the mod.









They are so small I don't think I can record with my phone unless I think up some kinda stand I can use to hold it in place.









If I can't make a video I'll do step by step pictures though.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> I got it fixed!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Found out I could get into the PC BIOS with Fast Boot and CSM disabled. Windows would crash after I entered my PIN when it booted up. nvflash64 wouldn't run from a DOS boot disk.
> 
> So I booted into Windows with my secondary card in, no 1080 Ti. In the Recovery options in Windows Settings I did the reboot into Advanced Startup, and when it booted to the Advanced Startup screen I chose Boot Into A Windows Command Prompt.
> 
> While it was restarting, before it booted into the command prompt, I unplugged my power cord, put in the 1080 Ti, and plugged the power cord back in.
> 
> I booted into a Windows command prompt, changed directory into my nvflash directory, flashed the FTW3 BIOS, booted into Windows, did a DDU driver uninstall and reinstall, fixed!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm so glad I got it fixed, it's a decent card.


I posted a video for you to watch, not sure if you saw it but it had similar steps.

Happy you got it fixed.


----------



## Benny89

Quote:


> Originally Posted by *Caos*
> 
> OK thanks, so did I make a bad play selling my Strix? I'm hesitant to buy the FTW3, although I love its aesthetics.


I can't answer that question for you. Why did you sell it? Performance? Was it a bad overclocker?

Why do you want the FTW3? Better looks, or bigger LEDs? (That's a serious question.)

I wanted SLI FTW3 for its 2-slot design, but overall I think the STRIX is the best 1080 Ti this round. I went for SLI STRIX instead because they moved my ship date on the FTW3 again and that pissed me off.

Single card- definitely STRIX this time. Possibly the ZOTAC AMP Extreme as my second choice.

The STRIX is quiet and cool, has access to the XOC BIOS to remove the power limit completely, and has nice neutral colors and a design fitting all builds. It also already has waterblocks.

But if you want the FTW3 because you like its look better- go for it, it is a personal preference. I prefer the STRIX design.


----------



## gmpotu

Quote:


> Originally Posted by *Benny89*
> 
> Wait, you flashed XOC and you still have power limits? But does it only SHOW a power limit in AB, or does it downclock your card? Because that might be an OSD bug or something.
> 
> Or you flashed it wrong somehow.
> 
> I had the same case with my STRIX when I flashed XOC- I was still getting power limit perf caps.


With the STRIX OC BIOS you don't have to redo anything in AB, because XOC is based on that BIOS. Mine is working fine without having to redo the voltage trick for AB.

Also, if you are on the last voltage bin (1.200v) without using the Voltage Tool to go above 1.200v, you should be seeing the rainbow in GPU-Z, I think. I don't have any perf caps with XOC except at 1.200v, because that's the limit with the voltage slider at +100% and the limit of AB curve editing.


----------



## orion933

Hello, I have an issue with my MSI 1080 Ti Gaming X.

It was perfect for two weeks (+100 clock / +300 memory / 117%).

And now in benchmarks the TDP seems to be unstable:
it starts OK, but after a while it drops and my fps drop too.
I had to reduce the memory clock to +100,
and when it happens (TDP drop) I need to restart my PC, otherwise it stays low...

Can someone help me figure out what could have happened?

Temps are OK (68° under heavy load).

i7 4790k oc 4.5
corsair RM1000w
z97 sabertooth mark1
16g ddr3 2400mhz
win10


----------



## cg4200

Quote:


> Originally Posted by *KedarWolf*
> 
> Depends on the BIOS. Strix OC after it's set once, it stays.
> 
> Palit BIOS you need to do it every boot. Others I'm not sure.
> 
> And you should do it every time you flash a BIOS or update the Nvidia driver.
> 
> Also you can use Windows Task Scheduler and have it run 60 seconds after every boot at highest privileges. I say 60 seconds because it is just enough time for driver etc. to fully initialize before applying the power limit.
> 
> XOC BIOS no need to use .bat file.


Thanks
Also, I see you have a Gigabyte FE; that's what I just grabbed.
What's your max OC?
I boost in GPU-Z to 1898 with no voltage and 1911 with max voltage before OC. With +155 on the core I am at 2062 and will pass Fire Strike Ultra, fan at 90%, 61 degrees max; any more on the core crashes. Thanks again.


----------



## cg4200

Hey, I have a quick question. I bought a Gigabyte FE 1080 Ti yesterday from a place in NY I won't mention, not accusing...
But I was wondering whether anyone else had shrink wrap on their card? Mine did not, and it kind of looked like the box had been opened; no round sticker seals on the ends either.
I know when I bought my Titan (Pascal) I had to open it.
Wondering if I got someone's return? Thanks


----------



## gmpotu

Have to say thank you to everyone here for all of the knowledge and help you guys provide to us noobs.
This is my highest score yet at 10096 with XOC 2075.5 @ 1.125v and didn't have a single hiccup.

It was a big help leveling out the three bins below, and it also seems these cards are definitely very finicky about which voltages they like.
2075.5 @ 1.200v the card does not like, even though temps stay at 59C, but at 1.125v it's happy.

Sadly my card is complete trash when it comes to memory overclocking, with only +400. I really wanted to hit that magic 6000 you guys are getting to.

About to play some games now and see how this OC holds up but when I get done I'll be pushing for 2088 to see if I can reach that.

Thanks again for all of the great posts and information and the guides in the OP and the quick and detailed responses from everyone.


----------



## Caos

Quote:


> Originally Posted by *Benny89*
> 
> I can't answer that question for you. Why did you sell it? Performance? Was it a bad overclocker?
> 
> Why do you want the FTW3? Better looks, or bigger LEDs? (That's a serious question.)
> 
> I wanted SLI FTW3 for its 2-slot design, but overall I think the STRIX is the best 1080 Ti this round. I went for SLI STRIX instead because they moved my ship date on the FTW3 again and that pissed me off.
> 
> Single card- definitely STRIX this time. Possibly the ZOTAC AMP Extreme as my second choice.
> 
> The STRIX is quiet and cool, has access to the XOC BIOS to remove the power limit completely, and has nice neutral colors and a design fitting all builds. It also already has waterblocks.
> 
> But if you want the FTW3 because you like its look better- go for it, it is a personal preference. I prefer the STRIX design.


Thank you Benny89 for the answer. I definitely like the design of the FTW3; I sold my Strix because I liked the FTW3's design. My Strix OC maxed at 2012MHz, but the temperatures were amazing. Right now I'm hesitating over whether to buy a new Strix or an FTW3. Difficult decision :( ... I also see the Poseidon, but I won't put in the loop, and it seems to be hotter than the Strix.

I'm very undecided


----------



## TronZy

Just picked up a founders today and got my EK block in, going to hook it up tomorrow


----------



## Nico67

Quote:


> Originally Posted by *lilchronic*
> 
> So anyone else test this?
> http://www.3dmark.com/compare/fs/12704552/fs/12704593
> with curve 1911Mhz locked
> 
> 1911Mhz with some power limiting to 1898Mhz for a couple seconds


Your issue is that the locked one is a "ramp"; optimise the 3 bins below it and you will gain a ton of performance.



Do it like the dark-green line but at your frequencies; currently yours is like the yellow line.
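The curve edit Nico67 describes amounts to raising the bins just below the top one so the curve steps up instead of ramping. A toy model of that edit (the voltage/frequency pairs are illustrative, not from any particular card):

```python
# A V/F curve as (voltage, MHz) points, like Afterburner's curve editor.
curve = [(1.031, 1987), (1.043, 2000), (1.050, 2012),
         (1.062, 2025), (1.075, 2037)]

def raise_lower_bins(curve, target_mhz, step_mhz=13):
    """Flatten the 'ramp': set every bin below the top one to sit a
    single ~13 MHz clock step under the optimised top bin."""
    out = []
    for i, (v, _) in enumerate(curve):
        if i == len(curve) - 1:
            out.append((v, target_mhz))             # optimised top bin
        else:
            out.append((v, target_mhz - step_mhz))  # one clock step below
    return out

print(raise_lower_bins(curve, 2101))
```

With a 2101 MHz target this reproduces the shape from Nico67's earlier example (2088 @ 1.043/1.050/1.062, 2101 @ 1.075): brief dips into a lower voltage bin then cost only one clock step instead of a long slide down the ramp.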


----------



## KedarWolf

Quote:


> Originally Posted by *cg4200*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Depends on the BIOS. Strix OC after it's set once, it stays.
> 
> Palit BIOS you need to do it every boot. Others I'm not sure.
> 
> And you should do it every time you flash a BIOS or update the Nvidia driver.
> 
> Also you can use Windows Task Scheduler and have it run 60 seconds after every boot at highest privileges. I say 60 seconds because it is just enough time for driver etc. to fully initialize before applying the power limit.
> 
> XOC BIOS no need to use .bat file.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks
> Also, I see you have a Gigabyte FE; that's what I just grabbed.
> What's your max OC?
> I boost in GPU-Z to 1898 with no voltage and 1911 with max voltage before OC. With +155 on the core I am at 2062 and will pass Fire Strike Ultra, fan at 90%, 61 degrees max; any more on the core crashes. Thanks again.
Click to expand...

Hey,

On XOC BIOS I'm at 2050 core, 6177 memory, 1.063v.









I find I can do 2062 core on Strix OC BIOS but only 6147 memory at 1.063v.









I think the XOC BIOS has looser memory timings, which is why I can go higher.









I haven't tried to go higher on memory on the XOC BIOS, but I'm getting over 10850 in Time Spy with what it's at now, and it's stable in both the Time Spy and Fire Strike stress tests as is.









I'm on water though and stay at 50C and under both stress tests.


----------



## KedarWolf

Quote:


> Originally Posted by *cg4200*
> 
> Hey, I have a quick question. I bought a Gigabyte FE 1080 Ti yesterday from a place in NY I won't mention, not accusing...
> But I was wondering whether anyone else had shrink wrap on their card? Mine did not, and it kind of looked like the box had been opened; no round sticker seals on the ends either.
> I know when I bought my Titan (Pascal) I had to open it.
> Wondering if I got someone's return? Thanks


My Gigabyte FE had no shrink wrap or round stickers on the ends either, so I think you're fine.


----------



## gmpotu

Quote:


> Originally Posted by *Nico67*
> 
> Your issue is that the locked one is a "ramp"; optimise the 3 bins below it and you will gain a ton of performance.
> 
> 
> 
> Do it like the dark-green line but at your frequencies; currently yours is like the yellow line.


This helped me a lot. I'm now able to run 2075 @ 1.125v with the XOC bios. No perf caps while gaming all night and no crashes or slowdowns. My highest bench scores also coming from this. Thanks again Nico67.

ps. Just installed HWINFO64 this program has so much information it's crazy.


----------



## KedarWolf

Quote:


> Originally Posted by *gmpotu*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nico67*
> 
> Your issue is that the locked one is a "ramp"; optimise the 3 bins below it and you will gain a ton of performance.
> 
> 
> 
> Do it like the dark-green line but at your frequencies; currently yours is like the yellow line.
> 
> 
> 
> This helped me a lot. I'm now able to run 2075 @ 1.125v with the XOC bios. No perf caps while gaming all night and no crashes or slowdowns. My highest bench scores also coming from this. Thanks again Nico67.
> 
> ps. Just installed HWINFO64 this program has so much information it's crazy.
Click to expand...

Here is XOC BIOS at 2050 core, 6210 memory at 1.062v, as high as I've tried the memory with XOC BIOS.

It's also Fire Strike Ultra and Time Spy stress tests stable.


----------



## paskowitz

Just thought I'd share this...


----------



## KedarWolf

Quote:


> Originally Posted by *paskowitz*
> 
> Just thought I'd share this...


What exactly are you sharing? Stock pic of an Evga card, no cooler on it?


----------



## TahoeDust

Since we are sharing...


----------



## paskowitz

Quote:


> Originally Posted by *KedarWolf*
> 
> What exactly are you sharing? Stock pic of an Evga card, no cooler on it?


Nah. Just the next KPE card. The layout is interesting.


----------



## dallas1990

Just ordered one Zotac 1080 Ti AMP Extreme, and the HB SLI bridge. In a couple of weeks I'll order the second one. Can't wait till Thursday.


----------



## jamexman

Quote:


> Originally Posted by *octiny*
> 
> Yep. I almost spit out my drink when I glanced at the numbers.
> 
> Their numbers are totally bogus and not to mention outdated.
> 
> I'll run a bunch of benches later this week when I get time.
> QFT
> 
> Edit:
> 
> Also just did a quick rundown on that list from my own personal experience.
> 
> Firestrike Ultra: Near perfect scaling
> Timespy: Near perfect scaling
> Crysis 3: 95-99 scaling
> Metro Last Light: Near perfect scaling
> GTA V: 95% scaling
> The Witcher 3: 90%-95%
> Fallout 4: 90%
> Dirt Rally: Near perfect scaling
> Far Cry Primal: 90%-95%
> Infinite Warfare: Don't have
> Rainbow Six Siege: Near perfect scaling
> Watch Dogs 2: 90%-95%
> Resident Evil 7: Near perfect scaling
> For Honor: Near perfect scaling
> Ghost Recon: 90%-95%
> Doom: No SLI
> The Division: 90ish%
> Hitman: No SLI
> Rise of the Tomb Raider: Near perfect scaling
> Ashes of the Singularity: Don't have
> Total War: Warhammer: Don't have
> Deus Ex Mankind Divided: 90ish%
> Gears of War 4: Near perfect scaling
> Battlefield 1: 85%-95%
> Sniper Elite 4: Near Perfect scaling
> 
> I'll run a bunch of benchmarks sometime this weekend.


Rainbow Six perfect scaling? Are you sure? I get worse performance than with a single card, and on every site that benchmarked SLI in this game the scaling was negative....


----------



## octiny

Quote:


> Originally Posted by *jamexman*
> 
> Rainbow Six perfect scaling? Are you sure? I get worse performance than with a single card, and on every site that benchmarked SLI in this game the scaling was negative....


Yup. Any benchmarks you see for the game are outdated, with a non-functioning SLI profile; it took a long time for them to fix it. Some people still seem to have issues on certain configurations, and others don't, it seems. There are a bunch of workarounds either way.

Both of my rigs are completely stripped right now as I'm upgrading/changing a bunch of parts.

I'll re-run it when I'm back in service!


----------



## jamexman

Quote:


> Originally Posted by *octiny*
> 
> Yup. Any benchmarks you see for the game are outdated, with a non-functioning SLI profile; it took a long time for them to fix it. Some people still seem to have issues on certain configurations, and others don't, it seems. There are a bunch of workarounds either way.
> 
> Both of my rigs are completely stripped right now as I'm upgrading/changing a bunch of parts.
> 
> I'll re-run it when I'm back in service!


Please do and share, because I'm on the latest drivers and scaling is worse with sli on. Playing on ultra wide 3440x1440 gsync monitor.


----------



## feznz

Quote:


> Originally Posted by *KedarWolf*
> 
> What exactly are you sharing? Stock pic of an Evga card, no cooler on it?


Since we are sharing, here's the new world record from K|NGP|N.

On LN2: 2569MHz core. Anyone rocking over 2100MHz core on air MUST be ecstatic.

@nrpeyton note the RAM: 4000MHz 19-21-21-28 underclocked to 3300MHz 13-13-13-28 with tight timings


----------



## Hulio225

Quote:


> Originally Posted by *feznz*
> 
> Since we are sharing, here's the new world record from K|NGP|N.
> 
> On LN2: 2569MHz core. Anyone rocking over 2100MHz core on air MUST be ecstatic.
> 
> @nrpeyton *note the RAM: 4000MHz 19-21-21-28 underclocked to 3300MHz 13-13-13-28 with tight timings*


The internal memory controller of the 6950X is not capable of faster and tighter settings. Peyton has a 7700K; he can OC his 3600C16 RAM to 4000 12-12-12-28 1T if he limits the RAM used by the OS to 4 GB in MSConfig for benching.









Tight timings and high speed yield a lot of physics score in Time Spy.


----------



## feznz

Quote:


> Originally Posted by *Hulio225*
> 
> The internal memory controller of the 6950X is not capable of faster and tighter settings. Peyton has a 7700K; he can OC his 3600C16 RAM to 4000 12-12-12-28 1T if he limits the RAM used by the OS to 4 GB in MSConfig for benching.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Tight timings and high speed yield a lot of physics score in Time Spy.


Looking around, there's not really much info on the 6950X and memory. I did see a few running @ 3400MHz, but I guess it's a compromise when binning between core clock and the best IMC, knowing the core is going to yield more points than a good IMC.
Oh, the life, when you can bin 6950Xs.

As for those B-die, there seem to be a lot of DOA reports, even with the 3600MHz bin.

http://ocxtreme.net/viewtopic.php?f=8&t=9

edit:
BTW, I did forget about the weaker IMC on the 6950X, so +1 for pointing that out.
I was tempted again the other day, looking at a 6950X "show room demo" at about $1500 USD, which is cheap considering the premium, import tax, etc.
I just have to remind myself that higher memory bandwidth is only going to make a difference @ 1080p.


----------



## macwin2012

Hey guys,

Zotac GTX 1080 Ti AMP Edition or GTX 1080 Ti AMP Extreme?

I only have the 2 choices above, since they provide a 5-year warranty and better resale value.

The 1080 Ti AMP Extreme is around $100 more.

So which should I go for?


----------



## Benny89

Quote:


> Originally Posted by *macwin2012*
> 
> Hey guys,
> 
> Zotac GTX 1080 Ti AMP Edition or GTX 1080 Ti AMP Extreme?
> 
> I only have the 2 choices above, since they provide a 5-year warranty and better resale value.
> 
> The 1080 Ti AMP Extreme is around $100 more.
> 
> So which should I go for?


If you can spend the extra $100, go for the Extreme. Better build quality, better fans, better temps.


----------



## orion933

Quote:


> Originally Posted by *orion933*
> 
> Hello, I have an issue with my MSI 1080 Ti Gaming X.
> 
> It was perfect for two weeks (+100 clock / +300 memory / 117%).
> 
> And now in benchmarks the TDP seems to be unstable:
> it starts OK, but after a while it drops and my fps drop too.
> I had to reduce the memory clock to +100,
> and when it happens (TDP drop) I need to restart my PC, otherwise it stays low...
> 
> Can someone help me figure out what could have happened?
> 
> Temps are OK (68° under heavy load)
> 
> i7 4790k oc 4.5
> corsair RM1000w
> z97 sabertooth mark1
> 16g ddr3 2400mhz
> win10


Can someone help, please?


----------



## Dasboogieman

Quote:


> Originally Posted by *orion933*
> 
> Can someone help, please?


You're gonna have to be more specific. What kind of loads? what kind of games? how long? what driver updates have you done? what settings?


----------



## ALSTER868

Has anyone tried the new 382.33 drivers?


----------



## orion933

Quote:


> Originally Posted by *Dasboogieman*
> 
> You're gonna have to be more specific. What kind of loads? what kind of games? how long? what driver updates have you done? what settings?


I use the 3DMark Fire Strike Ultra stress test for testing my OC, like 3 or 4 runs of 20 loops in a row.
I'm on the latest driver, 382.33.
There is something strange with temps too: before, the card never went over 63°C, and now it's more like 70°C.
I didn't touch anything, like I said.
Here in France the weather is warmer, so that may be the cause... idk


----------



## Dasboogieman

Quote:


> Originally Posted by *orion933*
> 
> I use the 3DMark Fire Strike Ultra stress test for testing my OC, like 3 or 4 runs of 20 loops in a row.
> I'm on the latest driver, 382.33.
> There is something strange with temps too: before, the card never went over 63°C, and now it's more like 70°C.
> I didn't touch anything, like I said.
> Here in France the weather is warmer, so that may be the cause... idk


Your ambient temperature has a bigger impact on your temperatures than anything else. Every degree outside your case will more or less be added directly, or even multiplicatively, to your core temperatures. And since GP102 is extremely sensitive to temperature, you can easily lose 13-26MHz from a 5-degree temperature increase alone.

Secondly, FireStrike ultra always power gates the GPU. The test is extremely heavy on the core+VRAM so it makes sense you are power throttling. With higher core temperatures, it is entirely possible you are throttling more often in Firestrike.

With all that is said, we're talking performance variations of 1-2% at most, it shouldn't affect your gaming at all.

Finally, there is also a possibility MSI did a really crap thermal paste job on your GPU. If you don't want to disassemble, get a Phillips screwdriver and tighten the 4 GPU screws a bit and see if that helps. Otherwise, change out the thermal paste for better temperatures.
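The temperature effect above is easy to estimate, because GPU Boost 3.0 sheds the core clock in ~13 MHz steps as temperature crosses thresholds. The 5°C-per-bin spacing used here is an assumption for illustration, not a documented constant:

```python
def boost_clock_loss(temp_rise_c, bin_mhz=13, degrees_per_bin=5):
    """Rough clock loss from a temperature rise, assuming GPU Boost
    drops one ~13 MHz bin per (hypothetical) 5 degree step."""
    bins = temp_rise_c // degrees_per_bin
    return bins * bin_mhz

print(boost_clock_loss(5))    # one bin lost
print(boost_clock_loss(10))   # two bins lost
```

So a warm summer room alone can plausibly account for a bin or two of lost boost clock, exactly the 13-26 MHz range mentioned above.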


----------



## orion933

Quote:


> Originally Posted by *Dasboogieman*
> 
> Your ambient temperature has a bigger impact on your temperatures than anything else. Every degree outside your case will more or less be added directly, or even multiplicatively, to your core temperatures. And since GP102 is extremely sensitive to temperature, you can easily lose 13-26MHz from a 5-degree temperature increase alone.
> 
> Secondly, FireStrike ultra always power gates the GPU. The test is extremely heavy on the core+VRAM so it makes sense you are power throttling. With higher core temperatures, it is entirely possible you are throttling more often in Firestrike.
> 
> With all that is said, we're talking performance variations of 1-2% at most, it shouldn't affect your gaming at all.
> 
> Finally, there is also a possibility MSI did a really crap thermal paste job on your GPU. If you don't want to disassemble, get a Phillips screwdriver and tighten the 4 GPU screws a bit and see if that helps. Otherwise, change out the thermal paste for better temperatures.


Thank you, I just did a few loops, and with +100 memory and +100 core there are no more issues for now.
But I don't understand why, when the throttling happens, I have to reboot my PC to recover my frames?


----------



## TheBoom

Quote:


> Originally Posted by *Nico67*
> 
> If you set a point and it's flat from there up, it really shouldn't go up in voltage. However, there were some posts indicating that a curve may change at different temps, so it's possible that while it doesn't downclock, it could move to a curve that has the voltage higher at that point. Something they wouldn't need to factor in for a subzero BIOS.
> 
> Really, your performance should be a lot better with the XOC as you're not limiting anymore, so any game or benchmark that was limiting should be better. Things I found with XOC that you couldn't tell with a standard BIOS were:
> 
> 1/ voltage affects performance, so if you run a single raised frequency at different voltages you can optimise your performance. 1.075 worked best for me.
> 
> 2/ raise the next 3 lower bins as high as you can, but at least one frequency clock below the optimised bin from above. This made a huge improvement, so I was able to run say 2088 @ 1.043, 2088 @ 1.050, 2088 @ 1.062 and my optimised bin 2101 @ 1.075.


I did this and somehow "bricked" my overclock. I was stable at 2088 @ 1.31 previously, tested for hours in gaming. Then I increased the previous 3 bins to 2075 like you said, and it crashed within a few minutes. So I thought heck, it just isn't stable for some reason, and I went back to my previous curve, and even that isn't stable anymore. What's going on lol. It's almost as if doing that changed the timings on the memory or something else, but I'll have to do more testing now.


----------



## Dasboogieman

Quote:


> Originally Posted by *orion933*
> 
> Thank you, I just did a few loops, and with +100 memory and +100 core there's no more issue for now.
> But I don't understand why, when the throttling happens, I have to reboot my PC to recover my frames?


Don't worry about the reboot; there are 2 mechanics that can explain what you are experiencing. Your GPU is power throttling: the higher VRAM clocks do eat into your TDP power budget, so it's possible you recovered some TDP to keep the core from throttling when you downclocked the VRAM. The other way you can deal with this is to increase your power target via MSI Afterburner. From memory, the Gaming X has a fairly low stock TDP allowance, so I'm not fully sure how far you are willing to go to remove the power throttling.

The 2nd mechanic: GDDR5X has an error compensation system. This means that if the VRAM bus link is unstable, but not enough to crash the machine, it will resend the data. You could be losing performance at high mem clocks because data is being resent due to instability. Now, we know that the current Micron GDDR5X can easily clock +300MHz without issue, and the GPU IMC can handle even higher.
This means either your card's PCB may have weak memory bus/memory trace engineering that is preventing the core from accurately syncing with the VRAM, or one of your VRAM modules is borderline faulty/overheating.
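The replay effect described above is easy to see in a toy model. This is purely illustrative — the numbers, and the simple "each retried transfer costs one extra transfer" assumption, are made up for the example, not measured GDDR5X behaviour:

```python
# Toy model of GDDR5X error-replay overhead: raising the memory clock
# increases raw bandwidth, but if the link becomes marginal, a fraction
# of transfers fails CRC and must be resent, eating the gain back.

def effective_bandwidth(base_clock_mhz, offset_mhz, retry_rate):
    """Raw bandwidth scales with clock; each retried transfer is sent twice.

    retry_rate is the fraction of transfers that are replayed
    (a hypothetical number, not something the driver reports directly).
    """
    raw = base_clock_mhz + offset_mhz          # proxy for raw bandwidth
    return raw / (1.0 + retry_rate)            # retries consume bus cycles

stable = effective_bandwidth(5500, 300, 0.0)     # +300 MHz, error-free
marginal = effective_bandwidth(5500, 500, 0.10)  # +500 MHz, 10% replayed

# The higher offset can end up effectively slower than the stable one:
print(stable, marginal)
```

This is why a memory overclock can "pass" without crashing yet lose FPS: the link survives, but the replays cost throughput.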


----------



## Delbane

I have a Zotac AMP Extreme 1080 Ti and I'm having this issue where, whenever the card hits 50 degrees, the fans kick in to 100% for a second or two just to keep it under 50 degrees. I find this pretty annoying (noisy). Anyone else experiencing the same issue? How would I go about fixing that?
I'm using MSI Afterburner and have set a custom fan curve where the fans shouldn't start before 55 degrees, so I really don't understand what's happening...

Thanks


----------



## KedarWolf

XOC BIOS really good for memory overclocks. One bin less on core though. 1.063v


----------



## Aganor

Room temp at 23ºc, after using my EK Waterblock on the GPU, system in sig, i have run Heaven with max 46ºc. Good temp or should be lower?

Used
Arctic Silver 4 on it (cross spread mode)


----------



## Siigari

Man I just realized I posted in the wrong thread.

I just got an EVGA 1080 Ti FTW3 and am excited to receive it tomorrow


----------



## mcbaes72

Quote:


> Originally Posted by *Siigari*
> 
> Man I just realized I posted in the wrong thread.
> 
> I just got an EVGA 1080 Ti FTW3 and am excited to receive it tomorrow


Congrats in advance! Please post your impressions of the FTW3 (e.g. OC settings, temps (ambient and under load), and noise level). I'm sure it'd be helpful to those who are interested in the FTW3, too.


----------



## Spirti

Quote:


> Originally Posted by *Aganor*
> 
> Room temp at 23ºc, after using my EK Waterblock on the GPU, system in sig, i have run Heaven with max 46ºc. Good temp or should be lower?
> 
> Used
> Arctic Silver 4 on it (cross spread mode)


I have the same temp so i guess it should be ok)


----------



## Spirti

Quote:


> Originally Posted by *Aganor*
> 
> Room temp at 23ºc, after using my EK Waterblock on the GPU, system in sig, i have run Heaven with max 46ºc. Good temp or should be lower?
> 
> Used
> Arctic Silver 4 on it (cross spread mode)


I have 45 with the same waterblock, I'm pretty happy with it.


----------



## Spirti

Sorry for double post.


----------



## Aganor

Quote:


> Originally Posted by *Spirti*
> 
> I have 45 with the same waterblock, I'm pretty happy with it.


I got sad when i saw the logo didnt have any led on it. It must be the Titan Xp block that has the leds lol


----------



## Spirti

Quote:


> Originally Posted by *Aganor*
> 
> I got sad when i saw the logo didnt have any led on it. It must be the Titan Xp block that has the leds lol


I got Titan X Pascal waterblock with no leds as well. I guess only waterblocks for custom cards support lighting.


----------



## orion933

Quote:


> Originally Posted by *Dasboogieman*
> 
> Don't worry about the reboot; there are 2 mechanics that can explain what you are experiencing. Your GPU is power throttling: the higher VRAM clocks do eat into your TDP power budget, so it's possible you recovered some TDP to keep the core from throttling when you downclocked the VRAM. The other way you can deal with this is to increase your power target via MSI Afterburner. From memory, the Gaming X has a fairly low stock TDP allowance, so I'm not fully sure how far you are willing to go to remove the power throttling.
> 
> The 2nd mechanic: GDDR5X has an error compensation system. This means that if the VRAM bus link is unstable, but not enough to crash the machine, it will resend the data. You could be losing performance at high mem clocks because data is being resent due to instability. Now, we know that the current Micron GDDR5X can easily clock +300MHz without issue, and the GPU IMC can handle even higher.
> This means either your card's PCB may have weak memory bus/memory trace engineering that is preventing the core from accurately syncing with the VRAM, or one of your VRAM modules is borderline faulty/overheating.


OK, I see.
But for now I don't know whether the card is going bad or it's just the warm weather...


----------



## Agent-A01

Quote:


> Originally Posted by *Aganor*
> 
> Room temp at 23ºc, after using my EK Waterblock on the GPU, system in sig, i have run Heaven with max 46ºc. Good temp or should be lower?
> 
> Used
> Arctic Silver 4 on it (cross spread mode)


Holy **** that is a ton of paste.

AS5 is crap paste not to mention it is capacitive.

With that much paste I wouldn't be surprised if it eventually spreads to those resistors around the core..
When that happens you will get some fireworks

My max temp on FE with EK block is 37c


----------



## JimmyMo

Quote:


> Originally Posted by *Delbane*
> 
> I have a Zotac AMP Extreme 1080 Ti and I'm having this issue where, whenever the card hits 50 degrees, the fans kick in to 100% for a second or two just to keep it under 50 degrees. I find this pretty annoying (noisy). Anyone else experiencing the same issue? How would I go about fixing that?
> I'm using MSI Afterburner and have set a custom fan curve where the fans shouldn't start before 55 degrees, so I really don't understand what's happening...
> 
> Thanks


I got you, fam -- it drove me nuts with my ZOTAC AMP Extreme 1080 Ti, too, until I figured it out.

You need to use ZOTAC's "FireStorm" program, and TURN OFF the "Fan Stop Setting."

They make it hard to find, it's under the "SPECTRA" section, and then tap the down-arrow (circled, images below) to see the option. Turn it to "OFF!"

Now you will have complete control of your fans!
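For anyone curious why the stock behaviour "blips" like that: a controller that toggles the fans at a single threshold, with no hysteresis, will oscillate around that threshold. A rough sketch of the idea (all temperatures and duty values here are hypothetical, not ZOTAC's actual firmware values):

```python
# Why the fans "blip": a controller that reacts the instant a threshold is
# crossed will slam the fans on at the trigger temp, cool the card just
# below it, stop, and repeat. Separate on/off temperatures (hysteresis)
# prevent that oscillation. All numbers below are made up for illustration.

def fan_speed(temp_c, fan_on, fan_on_temp=55, fan_off_temp=45):
    """Return (new_fan_on_state, duty_percent) with hysteresis."""
    if temp_c >= fan_on_temp:
        fan_on = True
    elif temp_c <= fan_off_temp:
        fan_on = False          # only stop once well below the trigger
    duty = min(100, 30 + 2 * (temp_c - fan_on_temp)) if fan_on else 0
    return fan_on, duty

state = False
for t in [40, 50, 56, 52, 48, 44]:   # a temperature ramp up and back down
    state, duty = fan_speed(t, state)
    print(t, state, duty)
```

Once triggered, the fans stay on through the whole 45-55C band instead of cycling on and off at a single degree boundary.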










-jm


----------



## Aganor

Quote:


> Originally Posted by *Agent-A01*
> 
> Holy **** that is a ton of paste.
> 
> AS5 is crap paste not to mention it is capacitive.
> 
> With that much paste I wouldn't be surprised if it eventually spreads to those resistors around the core..
> When that happens you will get some fireworks
> 
> My max temp on FE with EK block is 37c


It's the AS4, the non-conductive one.
If it were conductive, it would have already fried the last card I had.
Maybe my temps could be lower, but it would be a real pain to change it :/


----------



## Agent-A01

Quote:


> Originally Posted by *Aganor*
> 
> It's the AS4, the non-conductive one.
> If it were conductive, it would have already fried the last card I had.
> Maybe my temps could be lower, but it would be a real pain to change it :/


No such thing as AS4, AKA Arctic Silver 4.

Maybe you are referring to Arctic Cooling MX-4?

What is your rad/fan setup? Is the CPU in the loop?


----------



## Aganor

Quote:


> Originally Posted by *Agent-A01*
> 
> No such thing as AS4, AKA Arctic Silver 4.
> 
> Maybe you are referring to Arctic Cooling MX-4?
> 
> What is your rad/fan setup? Is the CPU in the loop?


Sorry, that's the one I have, Arctic Cooling MX-4, my bad.

My rig is:

Custom WC Loop @ MSI Monoblock RGB + EK-FC1080 Ti + EK 360XP + 240PE + 120XE RADS + GELID Slim Blue Slim 120mm x5 + Noctua 140mm + EK Furious Vardar FF5 120mm x2 + EK-DBAY D5 MX + Xylem D5 PWM Pump + EK-FB MSI Z270 GAMING Monoblock

res/pump->280rad->120rad->gpu->cpu->360rad

both 280 and 120mm have push/pull and 360 only pull (waiting for new fans to push)


----------



## Agent-A01

What are your ambient temps?
We have a similar config, but as I said, I see ~37C max temp @ 1.063v.

At 1.093 it will hit ~39C.

Nothing wrong with MX-4. Maybe 2-4C worse than the best paste, at worst.


----------



## Aganor

Quote:


> Originally Posted by *Agent-A01*
> 
> What are your ambient temps?
> We have similar config but as i said i see 37c~ max temp @ 1.063v.
> 
> At 1.093 is will hit 39c~.


About 25ºC, and with fans at max. I remember my Gaming X 1080 running cooler, and I didn't even unlock the voltage slider on this one.


----------



## TheBoom

OK, so I've confirmed it with some testing in FSU.

Following Nico's method of increasing the 3 voltage bins to the same clock below the clock and voltage you want to be at results in instability for me, even though it's still running at the same bin on load.

If I drop back to the old ramp curve at the same EXACT clock and voltage bin, it's stable.

Not sure why, but maybe that also has something to do with why you get better performance doing it. GPU Boost 3 and the curve are really weird in the way they work.


----------



## Agent-A01

Quote:


> Originally Posted by *Aganor*
> 
> About 25ºC, and with fans at max. I remember my Gaming X 1080 running cooler, and I didn't even unlock the voltage slider on this one.


CPU max temp?

77F = 25C; my ambient is 73F, which is about 23C.

46C is most likely due to poor contact or bad flow rate/non-optimal flow setup.

Fan setup is good, better than mine, as I run 1200rpm NF-F12 push on a 360 EK rad, push/pull 1200rpm Noctua 14 Redux on a 280mm Monsta 80mm-thick rad, with a dinky XSPC AX120 and Corsair AF120 fan.
Quote:


> Originally Posted by *TheBoom*
> 
> Ok so I've confirmed it with some testing in FSU.
> 
> Following Nico's method of increasing the 3 voltage bins to the same clock below the clock and voltage you want to be at results in instability for me. Even though it's still running at the same bin on load.
> 
> If I drop back to the old ramp curve at the same EXACT clock and voltage bin its stable.
> 
> Not sure why but maybe that's also something to do with why you get better performance doing that. GPU Boost 3 and the curve is really weird in the way it works.


Can you screenshot the unstable and stable before/after curve for future reference?

What about performance using the 'optimized' curve?
Any noticeable changes?


----------



## Aganor

Quote:


> Originally Posted by *Agent-A01*
> 
> CPU max temp?
> 
> 77f = 25c, my ambient is 73 which is like 23c.
> 
> 46c is most likely due to poor contact or bad flow-rate/non-optimal flow setup
> 
> Fan setup is good, better than mine as i run 1200rpm NF-f12 push on 360 EK rad, push pull 1200 rpm noctua 14 redux on 280mm monsta 80mm thick rad with a dinky xspc AX120 and corsair af120 fan.


CPU max temp is 70ºC, but I made a failed delid attempt; running Heaven it does not reach more than 55ºC while the GPU is at 46ºC.
Dunno how to fix the poor contact, as the screws were already tightened to the max. Could be the excessive paste, as you pointed out.
I do not have any 90º turns and my pump is a pretty decent one. Could it be because I'm running at +150 core and +500 memory and the card is at a higher voltage even if the slider is locked? Using Heaven, the power limit reaches 120%.

Edit: I tested with my room temp hotter than usual since I used my dehumidifier... Maybe it wasn't 25ºC after all; will test today and report.


----------



## Agent-A01

I am running 2101MHz @ 1.062 with the XOC BIOS (aka no power limiting), so mine should be hotter.

Did you check that you are using the optimal inlet/outlet for each component? I.e. monoblock, GPU block, each of the radiators, etc.?

A D5 pump will also have a recommended inlet/outlet.

If all that's good, then maybe you have air trapped in the loop.


----------



## Aganor

Quote:


> Originally Posted by *Agent-A01*
> 
> I am running 2101mhz @ 1.062 with XOC bios(aka no power limiting) so mine should be hotter.
> 
> Did you check that you are using the optimal inlet outlet for each component? I.E. monoblock, gpu block, each of the radiators etc?
> 
> D5 pump will also have a recommended inlet outlet
> 
> If all that's good than maybe you have air trapped in the loop.


Initially I mounted it backwards, but I have corrected the inlet on the pump/reservoir and the GPU as well.
Now everything is on the right holes, but the radiators are unknown; I wasn't even aware the rads had any preference, as the fins seem uniform.

About the air, it's pretty possible, as I tested minutes after filling the system haha.
I even see some trapped bubbles on the CPU block and on the reservoir acrylic window on the bay.
I left the filling hole open today; will check tonight how it is.

Even so, isn't the thermal throttling at 50ºC?


----------



## Spirti

Quote:


> Originally Posted by *Agent-A01*
> 
> 46c is most likely due to poor contact or bad flow-rate/non-optimal flow setup


Jesus, 46 is not 86)
Everything below 50 is great imo

I have 45 on GPU and 65 on CPU (5820k at 4 GHz) at max load with only one EK 360 SE rad push and I'm pretty happy with it.


----------



## Aganor

Quote:


> Originally Posted by *Spirti*
> 
> Jesus, 46 is not 86)
> Everything below 50 is great imo
> 
> I have 45 on GPU and 65 on CPU (5820k at 4 GHz) at max load with only one EK 360 SE rad push and I'm pretty happy with it.


So I have really bad temps even though I've got a **** load of rads lol

I'm not removing anything due to this; too much of a pain in the arse to even unplug a single thing.
Well, I have to do it anyway; my CPU is badly delidded, and that will mock me every day.


----------



## Spirti

Quote:


> Originally Posted by *Aganor*
> 
> So i have really bad temps since i got a **** load of rads lol


I have an open frame case though.


----------



## Agent-A01

Quote:


> Originally Posted by *Aganor*
> 
> Initially i mounted it backwards but have corrected the inlet on the pump/reservoir and the GPU as well.
> Now everything is on the right holes but the radiators its unknown, i wasnt even aware the Rads had any preference on it as the fins seems uniform.
> 
> About the air its pretty possible as i tested minutes after filling the system haha
> i even see some trapped bubbles on the cpu block and on the reservoir acrylic window on the bay.
> I left the filling hole opened today, will check tonight how it is.
> 
> Even so, isnt the thermal throttling at 50ºc?


First throttling point should be at 60C, with a few more after that.

You may try just leaving the power plugged into the PC and rotating it several times to remove air pockets.

And yes, some rads have a preferred inlet/outlet, although I did not check if that was the case for yours.

Quote:


> Originally Posted by *Spirti*
> 
> Jesus, 46 is not 86)
> Everything below 50 is great imo
> 
> I have 45 on GPU and 65 on CPU (5820k at 4 GHz) at max load with only one EK 360 SE rad push and I'm pretty happy with it.


What's your point?

His setup is 10c higher despite having better rad setup than me.

Meaning it's not optimal.
Nobody said 46c is dangerous.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Agent-A01*
> 
> First throttling point should be at 60C, with a few more after that.
> 
> You may try just leaving the power plugged into the PC and rotating it several times to remove air pockets.
> 
> And yes, some rads have a preferred inlet/outlet, although I did not check if that was the case for yours.
> What's your point?
> 
> His setup is 10c higher despite having better rad setup than me.
> 
> Meaning it's not optimal.
> Nobody said 46c is dangerous.


First throttling point isn't 50C.


----------



## Spirti

Quote:


> Originally Posted by *Agent-A01*
> 
> What's your point?
> 
> His setup is 10c higher despite having better rad setup than me.
> 
> Meaning it's not optimal.
> Nobody said 46c is dangerous.


My point is that 46C isn't worth that much attention.
I would've been worried if it were 66C, but given that he overclocks his GPU, 46C is more than OK, as it seems to me.


----------



## Agent-A01

Quote:


> Originally Posted by *Spirti*
> 
> My point is that 46C isn't worth that much attention.
> I would've been worried if it were 66C, but given that he overclocks his GPU, 46C is more than OK, as it seems to me.


Well therein lies the difference between you and other people, as i would give attention to it.


----------



## Spirti

Quote:


> Originally Posted by *Agent-A01*
> 
> Well therein lies the difference between you and other people, as i would give attention to it.


Ok you won)


----------



## lilchronic

I have 2 HWLabs Black Ice 360 rads in push/pull, and on the XOC BIOS I have temps up to 50°C. Ambient temp is 26-27°C.

With the FE BIOS, temps were around 45°C.


----------



## orion933

Hello again!

Can someone advise me on a good bench/stress test to be sure everything is OK with my card?
I can pass Superposition, 3DMark Fire Strike Ultra, and Heaven 4.0, but my OC can't pass the Fire Strike Ultra stress test for some reason (card at 70°C max during the stress test).

Thank you!


----------



## Hulio225

Talking and discussing temperatures of components without mentioning ambient temperatures is an amateur mistake, boys!
Please, for the sake of the silicon gods, always mention your ambient temp...
46°C is good if your ambient is 30°C, for example... just saying.

The first thermal throttling starts at 36-37°C if you are not on the XOC BIOS.

Quote:


> Originally Posted by *orion933*
> 
> Hello again!
> 
> Can someone advise me on a good bench/stress test to be sure everything is OK with my card?
> I can pass Superposition, 3DMark Fire Strike Ultra, and Heaven 4.0, but my OC can't pass the Fire Strike Ultra stress test for some reason (card at 70°C max during the stress test).
> 
> Thank you!


Fire Strike Ultra is very stressful and the stress test is very long, so your card gets warmer and warmer; the resistance of your GPU rises, current decreases, some transistors aren't switching fast enough anymore, for example; some calculations fail; you crash.
That's one, and most likely the easiest, explanation I can give you right now.
The benches you used are okay, and most games will not be as stressful as FSU. If you play the games you like and never crash, your OC is stable for those purposes, pretty easy ;-)


----------



## orion933

Quote:


> Originally Posted by *Hulio225*
> 
> Fire Strike Ultra is very stressful and the stress test is very long, so your card gets warmer and warmer, the resistance of your GPU rises, current decreases, some transistors aren't switching fast enough anymore for example, some calculations fail, you crash.
> That's one and most-likely the easiest explanation i can give you right now.
> The benches you used are okay, and most games will not be so stressful like FSU, if you play your games you like and you never crash, your OC is stable for those purposes, pretty easy ;-)


OK, but is it OK not to pass it, or do I need to adjust my OC?
What about your card?


----------



## Hulio225

Quote:


> Originally Posted by *orion933*
> 
> OK, but is it OK not to pass it, or do I need to adjust my OC?
> What about your card?


Don't compare other cards to mine ^^ I have a very good custom loop and a very good card, which runs over 2100; it can go up to 2176 on the core on cold days with the XOC BIOS, 2126 on the FE BIOS on cold days, and up to +850 on the VRAM, depending on application and temperatures...

It is okay to not pass it if you are okay with not passing it, to be honest.
If your games run with that OC and just FSU doesn't, you are okay; otherwise you will notice it, and then you have to adjust it.
A component just needs to be as stable as you need it to be, imho.


----------



## dunbheagan

To add a thought to the GPU temp discussion: my FE under water maxes out at 36C under full load. Oh no, wait, it maxes out at 50C.

36C: GPU undervolted, fans 1300rpm
50C: GPU at 1.093V, fans 500rpm

What I want to say is, it is pointless to compare temps to quantify whether they are too high or too low. There are too many variables in a custom loop:

Waterblock
Radiator size and thickness
Fan type
Fan speed
GPU voltage
GPU load/application
Thermal paste
Ambient temp

I am pretty sure I still forgot some...
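One way to make numbers in this thread comparable at all is to quote delta over ambient, and roughly delta per watt, instead of absolute temps. A back-of-the-envelope sketch (the wattage figures are guesses for illustration, not measured values):

```python
# Compare loop temps as delta-T over ambient rather than absolute numbers.
# The power figures used below are illustrative guesses, not measurements.

def delta_t(gpu_temp_c, ambient_c):
    """Temperature rise of the GPU above room temperature."""
    return gpu_temp_c - ambient_c

def c_per_watt(gpu_temp_c, ambient_c, watts):
    """Rough thermal resistance of the whole loop for that load."""
    return delta_t(gpu_temp_c, ambient_c) / watts

# Undervolted vs. 1.093 V from the post above (wattages assumed):
low  = c_per_watt(36, 23, 180)   # undervolted, fans 1300 rpm
high = c_per_watt(50, 23, 300)   # 1.093 V, fans 500 rpm

print(round(low, 3), round(high, 3))  # same loop, very different numbers
```

Two loops with the same C/W are performing the same, even if one reads 10C hotter because the room, the voltage, or the fan speed differ.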


----------



## Slackaveli

Quote:


> Originally Posted by *orion933*
> 
> OK, but is it OK not to pass it, or do I need to adjust my OC?
> What about your card?


Maybe lower your mems a tad. That's the most likely culprit if you pass Heaven but heat up and fail FS Ultra.


----------



## lilchronic

Quote:


> Originally Posted by *Slackaveli*
> 
> Maybe lower your mems a tad. That's the most likely culprit if you pass Heaven but heat up and fail FS Ultra.


Earlier he said he only put +100 on the mem, so I'm pretty sure he can run that no problem. More than likely it's the core that is failing. With high mem overclocks you start to see artifacts... at least on my card.


----------



## Slackaveli

Quote:


> Originally Posted by *lilchronic*
> 
> Earlier he said he only put +100 on the mem, so I'm pretty sure he can run that no problem. More than likely it's the core that is failing. With high mem overclocks you start to see artifacts... at least on my card.


Ah, I missed that post.


----------



## Nico67

Quote:


> Originally Posted by *TheBoom*
> 
> I did this and somehow "bricked" my overclock. I was stable at 2088 @ 1.31 previously, tested for hours in gaming. Then I increased the previous 3 bins to 2075 like you said and it crashed within a few minutes. So I thought heck, it just isn't stable for some reason, and I went back to my previous curve, and even that isn't stable anymore. What's going on lol. It's almost as if doing that changed the timings on the memory or something else, but I'll have to do more testing now.


It crashes because it's actually putting a bigger load on the GPU due to higher performance. To be honest, as I stated in an earlier post, I have no idea why it works, unless it's changing one of the internal clocks, like XBAR, which changes memory performance/timings. Although changing the curve back should fix it.
Quote:


> Originally Posted by *TheBoom*
> 
> Ok so I've confirmed it with some testing in FSU.
> 
> Following Nico's method of increasing the 3 voltage bins to the same clock below the clock and voltage you want to be at results in instability for me. Even though it's still running at the same bin on load.
> 
> If I drop back to the old ramp curve at the same EXACT clock and voltage bin its stable.
> 
> Not sure why but maybe that's also something to do with why you get better performance doing that. GPU Boost 3 and the curve is really weird in the way it works.


You may not be able to set all 3 points to the same clock; they have to be at whatever frequency you could run if you underclocked to that bin.
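Nico's method amounts to editing a voltage-indexed frequency table. A rough sketch of the idea (every voltage and clock value here is illustrative, not from a real card; the real GPU Boost 3 curve lives in the driver and moves in roughly 13 MHz steps):

```python
# Sketch of editing a GPU Boost 3 style voltage/frequency curve:
# pick an optimised top bin, then lift the 3 bins below it as high as
# they will run, at least one clock step under the top bin's frequency.
# All voltages/frequencies below are made up for illustration.

STEP = 13  # approximate Pascal clock-bin step in MHz

curve = {  # voltage (V) -> frequency (MHz), stock-ish shape
    1.031: 2012, 1.043: 2025, 1.050: 2038, 1.062: 2050, 1.075: 2063,
}

def raise_lower_bins(curve, top_v, top_mhz, lower_mhz):
    """Set the target bin, then lift the 3 bins below it to lower_mhz,
    which must stay at least one step under the target frequency."""
    assert lower_mhz <= top_mhz - STEP
    volts = sorted(curve)
    tuned = dict(curve)
    tuned[top_v] = top_mhz
    for v in [v for v in volts if v < top_v][-3:]:
        tuned[v] = lower_mhz
    return tuned

# Mirrors the earlier example: 2088 @ 1.043/1.050/1.062, 2101 @ 1.075
tuned = raise_lower_bins(curve, top_v=1.075, top_mhz=2101, lower_mhz=2088)
print(tuned)
```

The assert enforces the "at least one clock below the optimised bin" rule from the original description.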


----------



## donkidonki

Well today I swapped out my old EVGA GTX780ti's (with EK waterblocks) for a pair of MSI GTX1080 Seahawk X EK's.

All went well but Jesus! Those MSI's are huge! Half again the size of the old cards. They only just fit in my case. Was a bit of a surprise was all.









(Sorry, thought I was on the 1080 thread)


----------



## Tristanguy1224

So I just got an Asus STRIX 1080ti OC and I started reading the XOC thing. Does this mean I can just flash that BIOS and remove power limits like I had on my old 970s?


----------



## gmpotu

Quote:


> Originally Posted by *Delbane*
> 
> I have a Zotac AMP Extreme 1080 Ti and I'm having this issue where, whenever the card hits 50 degrees, the fans kick in to 100% for a second or two just to keep it under 50 degrees. I find this pretty annoying (noisy). Anyone else experiencing the same issue? How would I go about fixing that?
> I'm using MSI Afterburner and have set a custom fan curve where the fans shouldn't start before 55 degrees, so I really don't understand what's happening...


Once you set the fan profile in Afterburner, you have to enable it by clicking on the gear. You should see a red box around the fan settings, and the word "auto" should also be red. Then your fan profile should work.

Thanks
Quote:


> Originally Posted by *ALSTER868*
> 
> Has anyone tried the new 382.33 drivers?


I like this driver a lot. Got some of my best clocks on this driver.


----------



## orion933

Quote:


> Originally Posted by *lilchronic*
> 
> Earlier he said he only put +100 on the mem, so I'm pretty sure he can run that no problem. More than likely it's the core that is failing. With high mem overclocks you start to see artifacts... at least on my card.


Here's what happened during the FSU stress test:

+90 core
+117% TDP (can't go higher)
+300 mem

I launch the test, and at some point my TDP starts to drop from 110+ to ~80, and my FPS drops also.

Should I try to lower the core, the memory, or the TDP?
Thanks!


----------



## blurp

Quote:


> Originally Posted by *dunbheagan*
> 
> To add a thought to the GPU temp discussion: my FE under water maxes out at 36C under full load. Oh no, wait, it maxes out at 50C.
> 
> 36C: GPU undervolted, fans 1300rpm
> 50C: GPU at 1.093V, fans 500rpm
> 
> What I want to say is, it is pointless to compare temps to quantify whether they are too high or too low. There are too many variables in a custom loop:
> 
> Waterblock
> Radiator size and thickness
> Fan type
> Fan speed
> GPU voltage
> GPU load/application
> Thermal paste
> Ambient temp
> 
> I am pretty sure I still forgot some...


For me, what you posted about your temps according to voltage and fan speed is very useful. I agree there are many other important variables, as you stated, but it gives me an idea of roughly what temps I should expect.

As for myself, I rarely get up to 42C (mostly around 38-40C) during any game or bench, with the GPU clocked at 2050 at 1.062V. Fans (x8) around 700rpm, water cooled (rads: 480 + 240 + 240). Ambient: 23C. Thermal paste: CLU.


----------



## gmpotu

Quote:


> Originally Posted by *Tristanguy1224*
> 
> So I just got an Asus STRIX 1080ti OC and I started reading the XOC thing. Does this mean I can just flash that BIOS and remove power limits like I had on my old 970s?


Yep. No power throttling and no thermal throttling.

BIG WARNING though. I feel like I should put this in a separate post, probably...

If you are using OCCT to test a memory overclock, *DO NOT* disable the frame limiter by setting it to 0. I did this, and my 12V rail, which had never passed 496W, instantly shot up to 576W within one second. If I hadn't been watching it, I probably would have fried my card, so just be careful.


----------



## lilchronic

Quote:


> Originally Posted by *orion933*
> 
> Here's what happened during the FSU stress test:
> 
> +90 core
> +117% TDP (can't go higher)
> +300 mem
> 
> I launch the test, and at some point my TDP starts to drop from 110+ to ~80, and my FPS drops also.
> 
> Should I try to lower the core, the memory, or the TDP?
> Thanks!


Try lower memory clocks and see if that fixes it. The only card I had that problem on was a 780 Ti; if you overclocked the memory too far, the frame rate went to crap. Haven't had that once on Maxwell or Pascal cards.


----------



## c0nsistent

So what's the consensus on the XOC BIOS ******ing memory timings and rendering itself null and void with regard to FE cards? I gave it a quick test and it didn't throttle much at all, but the heat generated prompted me to start fabricating a custom loop for my zip-tie setup instead of an AIO cooler. I'll be using a Raystorm CPU block and a 240mm rad the next time I try XOC. I'm curious how things are going for you guys with actual waterblocks.


----------



## gmpotu

When you guys are talking about undervolting your card, do you mean you are just setting your highest bin at a low voltage, like [email protected], and then setting everything behind it to the same clocks, like [email protected] all the way up to [email protected]?
(Just want to make sure I am understanding correctly and there is not another meaning for undervolting.)

Separate thought:
Now that I have this new 1080, do you guys have any opinions on a monitor? I need to upgrade from 1080p.
I think I like 3440x1440 21:9 UW the best, but I haven't seen a lot of 4K in the store.

Anyone have advice on a good monitor, and have you guys heard if there are any good ones I should wait on for 2017? Money doesn't matter much for this purchase; I've already decided I'm going to spend it.


----------



## Nico67

Quote:


> Originally Posted by *gmpotu*
> 
> When you guys are taking about under volting your card do you mean you are just setting your highest bin at a low voltage like [email protected] and then just setting everything else behind it to the same clocks like [email protected] all the way up to [email protected]?
> (Just want to make sure I am understanding correctly and there is not another meaning for under volting)
> 
> Separate thought.
> Now that I have this new 1080 do you guys have any opinions on a monitor. I need to upgrade from 1080p.
> I think I like 3440x1440p 21:9 UW the best but I haven't seen a lot of 4K in the store.
> 
> Anyone have advice on a good monitor and have you guys heard if there are any good ones that I should wait on for 2017? Money doesn't matter much for this purchase. I've already decided I'm going to spend it.


For the undervolting, yes, that's exactly what people mean. It's running at less than the original 1.050v default.

As for monitors, a good 27" 144Hz 1440p or 34" 144Hz 1440p (you'll have to wait for the latter) is a nice way to go. Really low response time is ideal. I think I actually liked my Asus PG278Q better than my current X34, although the X34's colours are nicer and IPS looks better; the PG278Q still has that snappy feel to it.


----------



## ir88ed

I recently upgraded from my 980 Tis to a pair of 1080 Ti FEs. I am under a full loop with a fair amount of rad to handle the heat and am ready to push the cards a bit.

Will the XOC BIOS work on an EVGA FE? Sorry if this has already been discussed many times, but I couldn't tell from the link in the OP, and I am getting into this thread pretty late. I would like to open up the max TDP within the BIOS.


----------



## gmpotu

Quote:


> Originally Posted by *ir88ed*
> 
> I recently upgraded from my 980ti's to a pair of 1080ti FE's. I am under a full loop with a fair amount of rad to handle the heat and am ready to push the cards a bit.
> 
> Will the XOC bios work on an EVGA FE? Sorry if this has already been discussed many times already, but I couldn't tell from the link in the OP, and I am getting into this thread pretty late. I would like to open up the max TDP I can within the bios.


Yes, the XOC BIOS should work on your card. It was designed for the Strix OC, so you won't see as much benefit as you would on a Strix, but it should still help if you are not using a resistor or shunt mod.


----------



## gmpotu

Quote:


> Originally Posted by *Nico67*
> 
> For the undervolting, yes that's exactly people mean. Its running at less than the original 1.050v default.
> 
> As for monitors, a good 27" 144hz 1440p or 34" 144hz 1440p (you'll have to wait for the later) is a nice way to go. Really low response time is idea, I think I actually liked my Asus PG278Q better than my current X34, although the colours are nicer and IPS looks better. It still does have that snappy feel to it.


Should I worry about response times if I mainly play MMORPGs like WoW and FFXIV? I rarely ever play FPS games. I think I definitely want a 34".


----------



## Nico67

Quote:


> Originally Posted by *gmpotu*
> 
> Should I worry about response times if I mainly play MMORPGs like WoW and FFXIV? I rarely ever play FPS games. I think I definitely want a 34".


It's more important in action-oriented combat, but it just looks so much smoother when you pan your view from side to side. There are a few 3440x1440 34" monitors with 144Hz coming in both VA and IPS panels later this year. They need good response times to achieve 144Hz, so they are probably worth waiting for.


----------



## gmpotu

Thanks for the info, Nico.

Back on topic for overclocking 1080 TI.

I have two strange things happening that I noticed today.

XOC bios on STRIX OC card

Ambient Temp 22C / +440 memory

1) I was trying to find my lowest bin for 2050mhz. I started with 1.050v, and what's weird is the voltage jumps up to 1.100v and stays there during the test. It never drops back below 1.100v, and I don't get why it doesn't just sit at 1.050v. I tried to lock (L) the voltage bin to the [email protected] and another strange thing happened: I was throttled down to [email protected] when I hit 52C, and it stayed there throughout the test. Remember, I'm on the XOC bios, so there shouldn't be any power or thermal throttling. When I was testing 2062mhz and 2075mhz the voltage didn't jump above the lowest bin set; it would either crash or pass at that lowest bin.

2) I decided to try something else, because I thought maybe [email protected] just wasn't stable and that's why it was throttling. So I tested [email protected], locked or unlocked, and it also throttled down to [email protected] when I hit 52C. It did not, however, jump to 1.100v. Then I set it to [email protected] to test something else and, as expected, no throttling at all.

I don't know why, but with the XOC bios any voltage below 1.100v seems to throttle for me regardless of clock speed, while any voltage of 1.100v+ does not throttle regardless of temps or power.


----------



## Tristanguy1224

Quote:


> Originally Posted by *gmpotu*
> 
> Thanks for the info NIco.
> 
> Back on topic for overclocking 1080 TI.
> 
> I have two strange things happening that I noticed today.
> 
> XOC bios on STRIX OC card
> 
> Ambient Temp 22C / +440 memory
> 
> 1) I was trying to find my lowest bin for 2050mhz. I started with 1.050v and what's weird is the voltage jumps up to 1.100v and stays there during the test. It never drops back below 1.100v but I don't get why It doesn't just sit at 1.050v. I tried to lock (L) the voltage bin to the [email protected] and another strange thing happened. I was throttled down to [email protected] when I hit 52C which remained throughout the test. Remember I'm on the XOC bios so there shouldn't be any power or thermal throttling. When I was testing 2062mhz and 2075mhz the voltage didn't jump above the lowest bin set, it would either crash or pass at that lowest bin.
> 
> 2) I decided to try something else because I thought maybe [email protected] was just not stable and that's why it was throttling. So I decided to test [email protected] locked or unlocked and it also throttled down to [email protected] when I hit 52C. It did not however jump to 1.100v. So then I set it to [email protected] to test something else and as expected no throttling at all.
> 
> I don't know why but any voltage below 1.100v regardless of clock speed seems to be throttling for me with XOC bios and any voltage 1.100v+ does not throttle for me with this bios regardless of temps or power.


I'm not sure what's up with the XOC BIOS either, as on stock I was hitting 2088 stable with no issues, but after flashing XOC I was crashing. I then flashed the EVGA FTW3 BIOS and am stable again. This is on an Asus STRIX OC card, by the way, which makes me wonder: even though there's no power limit on this BIOS, is it stable enough to be worth running?


----------



## Nico67

Quote:


> Originally Posted by *gmpotu*
> 
> Thanks for the info NIco.
> 
> Back on topic for overclocking 1080 TI.
> 
> I have two strange things happening that I noticed today.
> 
> XOC bios on STRIX OC card
> 
> Ambient Temp 22C / +440 memory
> 
> 1) I was trying to find my lowest bin for 2050mhz. I started with 1.050v and what's weird is the voltage jumps up to 1.100v and stays there during the test. It never drops back below 1.100v but I don't get why It doesn't just sit at 1.050v. I tried to lock (L) the voltage bin to the [email protected] and another strange thing happened. I was throttled down to [email protected] when I hit 52C which remained throughout the test. Remember I'm on the XOC bios so there shouldn't be any power or thermal throttling. When I was testing 2062mhz and 2075mhz the voltage didn't jump above the lowest bin set, it would either crash or pass at that lowest bin.
> 
> 2) I decided to try something else because I thought maybe [email protected] was just not stable and that's why it was throttling. So I decided to test [email protected] locked or unlocked and it also throttled down to [email protected] when I hit 52C. It did not however jump to 1.100v. So then I set it to [email protected] to test something else and as expected no throttling at all.
> 
> I don't know why but any voltage below 1.100v regardless of clock speed seems to be throttling for me with XOC bios and any voltage 1.100v+ does not throttle for me with this bios regardless of temps or power.


I have not seen anything like that, but I would suspect that when the card is at, say, 2050 it wants 1.100v to be stable. Could be some sort of VID table, or just part of Boost 3.0's overclocking routine? You may have to test through different clocks to see what sort of results you get. Cooling the GPU down heaps may help also.

Quote:


> Originally Posted by *Tristanguy1224*
> 
> I'm not sure what's up with the XOC BIOS either as on stock I was hitting 2088 stable no issues. But after flashing XOC I was crashing. I then flashed the EVGA FTW3 BIOS and am stable again. This is on an Asus STRIX OC card by the way which makes me wonder if while there's no power limit on this BIOS if it is stable enough to be worth running...


Were you power/temp limiting on the Strix bios? You may never have run at 2088 long enough to know if it was stable. If it spends most of its time clocked down to, say, 2050, it might be stable there. If it crashes, it needs more voltage; sometimes that's just as far as you can go.


----------



## becks

Apparently there are 106 GeForce GTX Founders Edition cards on the market....
So if one were to go about buying one and putting it under water... which one?

I know in theory they are all the same, but... blower edition? Is that still considered a Founders Edition?
Which one would give me the best chances of getting 2000+ on core and 12k on memory?
I don't really want to shunt mod / resistor mod / bios flash... will keep that only as a last resort...


----------



## AsterFenix

Hi guys,

Can anyone tell me which backplate is compatible with the Swiftech waterblock, aside from Swiftech's own backplate? I like their block, but the backplate looks horrible. The original backplate isn't compatible with the waterblock. I'm thinking about getting the EK backplate on sale, but I'm afraid it may not fit either. Has anyone tried this before?


----------



## becks

Quote:


> Originally Posted by *AsterFenix*
> 
> Hi guys,
> 
> Can anyone tell me which backplate is compatible with the swiftech waterblock, aside from swiftech's own backplate? I like their block but the backplate looks horrible. The original backplate isn't compatible with the watblock. I'm thinking about getting the EK backplate on sale but afraid it may not fit as well. Anyone tried this before?


How come it doesn't fit? It uses the same holes....


----------



## TheBoom

Quote:


> Originally Posted by *Nico67*
> 
> It crashes because it's actually putting a bigger load on the GPU due to higher performance. To be honest, as I stated in an earlier post, I have no idea why it works, unless it is changing one of the internal clocks like XBAR, which changes memory performance/timings. Although changing the curve back should fix it.
> You may not be able to do all 3 points at the same clock; they have to be at whatever frequency you could run if you underclocked to that bin.


If the 3 points are not on the same clock will it still improve performance? Or should I try and use an even higher voltage for the current bin so I can get the previous 3 clocks stable at the lower voltage bins?

Or should I try dropping memory OC instead since it might be actually changing the timings?

Apologies for the questions, but this is quite interesting to me. I already noticed that my FSU graphics score went up about 100-150 just from raising the previous bin by one step.


----------



## KedarWolf

In MSI Afterburner, for example, both +691 memory and +699 memory show 6210 MHz.

But in Heaven, +691 is 6196 MHz and +699 is 6204 MHz.

I'm thinking Heaven gives a better representation of your actual memory clocks than Afterburner does.

Any thoughts?


----------



## KedarWolf

Quote:


> Originally Posted by *Aganor*
> 
> Room temp at 23ºc, after using my EK Waterblock on the GPU, system in sig, i have run Heaven with max 46ºc. Good temp or should be lower?
> 
> Used
> Arctic Silver 4 on it (cross spread mode)


I have a 5960X CPU at 1.273V, a dedicated PhysX card, and an EK waterblock and backplate in my loop with one 360 rad. With fans at 80% (I use a headset) and the pump at 100% (I don't even hear it, actually; quiet pump), after running Heaven for 30 minutes or so with ambient temps around 23C, my temps hover around 42-43C.

So 45-46C with lower pump and fan speeds is reasonable.


----------



## lilchronic

Quote:


> Originally Posted by *TheBoom*
> 
> If the 3 points are not on the same clock will it still improve performance? Or should I try and use an even higher voltage for the current bin so I can get the previous 3 clocks stable at the lower voltage bins?
> 
> Or should I try dropping memory OC instead since it might be actually changing the timings?
> 
> Apologies for the questions but this is quite interesting to me. I already noticed that my FSU graphics score went up about 100-150 just by increasing the previous 1 bin.


Yes.

For me to run 1911MHz @ 0.925v I need to adjust the 0.925v bin to +165. For the 3 voltage bins under 0.925 I just add +165 as well.


----------



## Nico67

Quote:


> Originally Posted by *TheBoom*
> 
> If the 3 points are not on the same clock will it still improve performance? Or should I try and use an even higher voltage for the current bin so I can get the previous 3 clocks stable at the lower voltage bins?
> 
> Or should I try dropping memory OC instead since it might be actually changing the timings?
> 
> Apologies for the questions but this is quite interesting to me. I already noticed that my FSU graphics score went up about 100-150 just by increasing the previous 1 bin.


Yes, getting the three points as high as they can handle is as good as you can get; they just have to be lower than the max so it doesn't run at a lower voltage.
I would recommend trying your max clocks at different voltages, as it made quite a difference for me as well. My scores improved all the way up to 1.075v but then started dropping again, so that's why my max clock is at 1.075v. My gaming curve was able to use the next 3 points at the same clock because they could do that anyway, so I can luckily run 2101 max and 2088 for all three bins below that. When playing The Division I do limit very occasionally, so I made another step down of 3 points after that as well. It's fine for everything else I play.
So I would find your optimal voltage and then work the bin settings after that.
Questions are fine, I find this stuff interesting too







It's fun finding a few tricks, but I have to give props to Coopiklaani for finding the "cliff" vs "curve" phenomenon. It still doesn't quite make sense, but it works.

EDIT: So as not to cause confusion, the strikethrough above is because I am back on the FE bios. The 3-point thing applies to any bios; it's just that you have to do it from your minimum working clock with other bioses, and from your chosen clock with XOC, as it doesn't limit.
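
The three-bin shaping described above can be sketched in code. This is illustrative only: the bin voltages and stock clocks below are invented numbers, and a real curve editor like Afterburner works on per-point offsets rather than absolute values.

```python
# A rough sketch of the curve shaping discussed above: pick a max clock at a
# chosen voltage, then hold the three bins directly below it one step lower
# so a single bin drop doesn't fall off a "cliff".

def shape_curve(stock_bins, max_voltage, max_clock, step_down):
    """Return {voltage: clock} with max_clock at max_voltage, max_clock -
    step_down on the three bins just below it, and stock clocks elsewhere."""
    below = sorted(v for v in stock_bins if v < max_voltage)
    held = set(below[-3:])  # the three bins directly under the max
    curve = {}
    for voltage, stock_clock in stock_bins.items():
        if voltage == max_voltage:
            curve[voltage] = max_clock
        elif voltage in held:
            curve[voltage] = max_clock - step_down
        else:
            curve[voltage] = stock_clock
    return curve

# Example mirroring Nico67's numbers: 2101 max at 1.075v, 2088 on the three
# bins below. The stock clocks here are made up for illustration.
stock = {1.031: 2000, 1.043: 2012, 1.050: 2025, 1.062: 2050, 1.075: 2063}
gaming_curve = shape_curve(stock, 1.075, 2101, 13)
```

The point of holding only the three bins below the max is that anything further down is rarely reached, so flattening more bins gains nothing, as noted above.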


----------



## Nico67

Quote:


> Originally Posted by *KedarWolf*
> 
> In MSI Afterburner for example, both +691 memory and +699 memory says it's at 6210 MHZ.
> 
> But in Heaven +691 is 6196 MHZ and +699 is 6204 MHZ.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm thinking Heaven gives an actual better representation of your memory clocks than Afterburner does.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any thoughts?


AB is better, as memory increments come in defined steps (6180, 6210, 6221, 6237), so just add a few MHz at a time until AB shows the memory clock bump up a step. That's easy to see if you have something like the Superposition program open but not running; just restart the program before the next test. I would say Heaven is just calculating from the AB setting.
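
The stepping behaviour described here could be sketched as follows. The step values are just the ones quoted in the post; real GDDR5X steppings vary by card and bios, so treat the table as an assumption.

```python
# Snap a requested memory clock down to the card's discrete steps, which is
# (roughly) why two different Afterburner offsets can land on different
# effective clocks while reporting the same number.
MEM_STEPS = [6180, 6210, 6221, 6237]  # values quoted in the post, not a spec

def effective_mem_clock(requested_mhz):
    """Highest defined step at or below the request, or the lowest step."""
    eligible = [s for s in MEM_STEPS if s <= requested_mhz]
    return max(eligible) if eligible else MEM_STEPS[0]
```

Under this model, requests of 6211 through 6220 all run at 6210; only once the request crosses 6221 does the effective clock bump up a step.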


----------



## pez

Quote:


> Originally Posted by *gmpotu*
> 
> When you guys are taking about under volting your card do you mean you are just setting your highest bin at a low voltage like [email protected] and then just setting everything else behind it to the same clocks like [email protected] all the way up to [email protected]?
> (Just want to make sure I am understanding correctly and there is not another meaning for under volting)
> 
> Separate thought.
> Now that I have this new 1080 do you guys have any opinions on a monitor. I need to upgrade from 1080p.
> I think I like 3440x1440p 21:9 UW the best but I haven't seen a lot of 4K in the store.
> 
> Anyone have advice on a good monitor and have you guys heard if there are any good ones that I should wait on for 2017? Money doesn't matter much for this purchase. I've already decided I'm going to spend it.


If you do decide on 21:9 3440x1440, you'll be very happy with the performance of the Ti/TXP/TXp.


----------



## Benny89

Quote:


> Originally Posted by *gmpotu*
> 
> Thanks for the info NIco.
> 
> Back on topic for overclocking 1080 TI.
> 
> I have two strange things happening that I noticed today.
> 
> XOC bios on STRIX OC card
> 
> Ambient Temp 22C / +440 memory
> 
> 1) I was trying to find my lowest bin for 2050mhz. I started with 1.050v and what's weird is the voltage jumps up to 1.100v and stays there during the test. It never drops back below 1.100v but I don't get why It doesn't just sit at 1.050v. I tried to lock (L) the voltage bin to the [email protected] and another strange thing happened. I was throttled down to [email protected] when I hit 52C which remained throughout the test. Remember I'm on the XOC bios so there shouldn't be any power or thermal throttling. When I was testing 2062mhz and 2075mhz the voltage didn't jump above the lowest bin set, it would either crash or pass at that lowest bin.
> 
> 2) I decided to try something else because I thought maybe [email protected] was just not stable and that's why it was throttling. So I decided to test [email protected] locked or unlocked and it also throttled down to [email protected] when I hit 52C. It did not however jump to 1.100v. So then I set it to [email protected] to test something else and as expected no throttling at all.
> 
> I don't know why but any voltage below 1.100v regardless of clock speed seems to be throttling for me with XOC bios and any voltage 1.100v+ does not throttle for me with this bios regardless of temps or power.


Well, it was a BIOS designed for LN2 runs, so maybe this is just how it works? I'd guess 1.100v is the absolute minimum voltage someone with an LN2 setup would even look at.

Did you try a flat curve? Get your 2050 at, say, 1.08v and flatten every further bin after it to the same level, so the curve is a straight line from 1.08v to the end. Does it still jump to 1.100v?

Did you try running 2050 at that 1.100v? Did it throttle/downclock at all? Test your temps at 1.100v.


----------



## Aganor

I'm considering using the XOC bios on my MSI FE card, but I've never flashed a GPU BIOS, so I don't understand the instructions on the 1st page.

I downloaded the S1080TIXOC+NVFLASH from the 1st page link, but what do I do next? Just run nvflash.exe?

Isn't there a risk of permanently bricking the card?


----------



## Eco28

Quote:


> Originally Posted by *Aganor*
> 
> Room temp at 23ºc, after using my EK Waterblock on the GPU, system in sig, i have run Heaven with max 46ºc. Good temp or should be lower?
> 
> Used
> Arctic Silver 4 on it (cross spread mode)


It is a great temp, man, but there could be some room for improvement. I have the very same EK waterblock and a custom loop that also cools my 7700K @ 4.8GHz. I am using:
- 240mm x 60mm-thick (pull) and 360mm x 45mm-thick (push) Alphacool NexXxoS rads with Corsair's SP120 Quiet Edition fans barely moving at 800rpm.
- My thermal paste, however, is Thermal Grizzly Kryonaut, which I highly recommend. I was generous with the application on both GPU and CPU.

I did some lazy OC on the card just to touch 2GHz on the GPU and 6000MHz on the mems. I have yet to see more than 39ºC on the card, while the CPU is crushing me with 80-90 degrees :/ Room temp is hovering around 25 degrees Celsius.

Edit: some typos.


----------



## feznz

Quote:


> Originally Posted by *Aganor*
> 
> I'm considering using the XOC bios on my MSI FE card but i NEVER flashed a GPU BIOS so i do not understand the instructions on the 1st page.
> 
> I downloaded the S1080TIXOC+NVFLASH from the 1st page link but what to do next? Just run the nvflash.exe?
> 
> Inst there a risk to permanetly brick the card?


It has been tested on the FE, so no real chance of bricking.
There are hundreds of guides to flashing... probably best to find one you can follow.
I believe the general consensus is that the FTW3 or Asus bios has been better than XOC; 1.093v is getting up there for 24/7 running.


----------



## ELIAS-EH

Next week, the Thermal Grizzly Conductonaut and the Cooler Master MasterLiquid Pro 280 will arrive.
I will use the liquid metal between my CPU IHS (6900K) and the waterblock; Thermal Grizzly's support team recommended I spread the liquid metal on both the CPU IHS and the cooler.

Anything other than liquid metal is a bottleneck for me.

MX-4: thermal conductivity 8.5 W/(m*K)
Conductonaut: thermal conductivity 73 W/(m*K)


----------



## Aganor

Quote:


> Originally Posted by *Eco28*
> 
> It is a great temp man but there could be some room for improvement. I have the very same EK Waterblock and custom loop also cooling my 7700k @4,8GHz. I am using:
> - 240x 60mm thick (pull) and 360 x45mm thick (push) Alphacool NexXxoS rads with Corsair's SP120 quiet edition fans barely moving at 800rpm.
> - My termal paste is however Thermal Grizzly Kryonaut which I highly recommend. I was generous with application on both GPU and CPU
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I did some lazy OC on the card just to touch 2Ghz on GPU and 6000Mhz on mems. I have yet to see more than 39ºc on the card while CPU is crushing me with 80-90 degrees :/ Room temp is hovering around 25 degrees celsius
> 
> Edit: some typos.


How do you measure room temp?
I know the system temp is 28ºC; I do not have a thermometer in my room lol


----------



## Aganor

Quote:


> Originally Posted by *ELIAS-EH*
> 
> next week, the Thermal Grizzly Conductonaut with the cooler master liquid pro 280 will arrive.
> I will use the liquid metal between my CPU IHS (6900K) and the water block, thermal Grizzly support team recommend me to spread the liquid metal on both CPU IHS and cooler.
> 
> anything other than liquid metal is considered as bottleneck for me.
> 
> MX-4 : Thermal Conductivity 8.5 W/(m*K)
> Conductonaut : Thermal Conductivity 73 W/(m*K)


So you're saying my MX-4 paste is better as toothpaste than as thermal paste?


----------



## orion933

Quote:


> Originally Posted by *lilchronic*
> 
> Try lower memory clocks see if that fixes it. Only card i had that problem on was a 780Ti, if you overclocked memory to far the frame rate goes to crap. Haven't had the once on maxwell or pascal card's.


OK, so I reduced to +80 core clock and no more issues for now.

+80 core
+300 mem
17% TDP

Should I try to raise core and lower memory to get a better framerate?
thx!


----------



## ELIAS-EH

Quote:


> Originally Posted by *Aganor*
> 
> So you´r saying my mx-4 paste is better as a tooth paste than thermal paste?


hahaha nice comment

I am saying that liquid metal is king.
numbers talk : 8.5 W/mK vs 73 W/mK


----------



## Dasboogieman

Quote:


> Originally Posted by *ELIAS-EH*
> 
> hahaha nice comment
> 
> I am saying that liquid metal is king.
> numbers talk : 8.5 W/mK vs 73 W/mK


Numbers also lie. Conductivity isn't everything. In fact, the biggest factor that influences thermal transfer is contact intimacy, aka mounting pressure + surface wetting. You could use cow poop, and as long as contact is really tight, thermal transfer is improved.

And more specifically, liquid metal is not king. Phase-change thermal interface material is king, because it combines liquid metal's conductivity with perfect surface intimacy.
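
The point about contact versus conductivity can be made with a back-of-the-envelope conduction model. This is a sketch only: it ignores contact resistance proper, and the bond-line thicknesses and contact area below are assumed numbers, not measurements.

```python
# Interface thermal resistance R = t / (k * A): bond-line thickness t matters
# as much as conductivity k, which is why mounting pressure and wetting count.

def interface_resistance(thickness_m, conductivity_w_mk, area_m2):
    """Thermal resistance (K/W) of a uniform interface layer."""
    return thickness_m / (conductivity_w_mk * area_m2)

AREA = 20e-6  # ~20 mm^2 of contact area, assumed

# MX-4 (8.5 W/mK) at an assumed ~50 um bond line vs
# Conductonaut (73 W/mK) at an assumed ~20 um (liquid metal wets thinner).
r_paste = interface_resistance(50e-6, 8.5, AREA)
r_liquid_metal = interface_resistance(20e-6, 73.0, AREA)

# Squeezing the same paste down to ~15 um closes much of the gap:
r_paste_thin = interface_resistance(15e-6, 8.5, AREA)
```

Under these assumptions the liquid metal wins on both counts, but note how much a thinner, better-wetted layer alone recovers; the headline W/mK figure is only part of the story.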


----------



## T0B5T3R

use Thermal Grizzly Kryonaut (12,5 W/mK) or Hydronaut (11,8 W/mK)


----------



## Benny89

Maybe I should apply liquid metal to my CPU AIO cooler?

BACK TO 1080Ti!

*Anyone on AIR STRIX OC that is using XOC Bios and sees no power throttling on bins lower than 1.100V*? On air pls. Thanks.


----------



## Aganor

Quote:


> Originally Posted by *T0B5T3R*
> 
> use Thermal Grizzly Kryonaut (12,5 W/mK) or Hydronaut (11,8 W/mK)


Already ordered it, will use it on both blocks... argh, the pain of undoing all those tubes again


----------



## Sweetwater

Quote:


> Originally Posted by *Benny89*
> 
> *Anyone on AIR STRIX OC that is using XOC Bios and sees no power throttling on bins lower than 1.100V*? On air pls. Thanks.


You rang? How may I help









Though I have had the voltage go up to 1.1 (by itself) even when it's set at 2000 at 1.0. What I have figured out is: make sure the secondary, non-bold voltage line is smooth, with no sharp drops and dips. Also, set the overclock and adjust the voltage while the card is at load temperature.

There are certain ways I can adjust the voltage curve where GPU-Z says VRel and VOp/Pwr; however, I don't believe it is being power limited, as I'll still get 10300+ in Superposition. Games are fine too, so I think it's just an anomaly.

Since doing all that, I haven't had the voltage creep up unexpectedly.


----------



## ELIAS-EH

Quote:


> Originally Posted by *Dasboogieman*
> 
> Numbers also lie. Conductivity isn't everything. In fact, the biggest factor that influences thermal transfer is contact intimacy aka mounting pressure + surface wetting. You can use cow poop and as long as contact is really tight, thermal transfer is improved.
> 
> And more specifically, liquid metal is not king. Phase change thermal interface is king, this is because it combines liquid metal's conductivity with perfect surface intimacy.


This is why the support team recommended applying liquid metal on both surfaces (the CPU IHS and the waterblock), in order to have perfect contact between them.



----------



## TheBoom

So I managed to get 2088 @ 1.143v stable and then 2050 at the 3 voltage bins below that, but I get a lower score than with a ramp followed by a cliff, if you know what I mean. Meaning 2075 at the first bin after 2088 (1.131v) gives me a higher score than normalizing all 3 bins before [email protected] to 2050 instead.

Edit: Never mind, I think it was margin of error between the scores.

By the way do you guys see power limiting on the XOC in FS Ultra?
Quote:


> Originally Posted by *Nico67*
> 
> Yes get the three points as high as they can handle is as good as you can get, they just have to be lower than the max so it doesn't run at a lower voltage.
> I would recommend trying your max clocks at different voltages as it made quite a difference for me as well. My scores improved all the way upto 1.075v, but then started dropping again, so that's why my max clock is at 1.075v. My gaming curve was able to use the next 3 points the same clk because they could do that anyway, so I can luckily run 2101 max and 2088 for all three bins below that. When playing The Division I do limit very occasionally so I made another step down of 3points after that as well. It fine for everything else I play.
> So I would find your optimal voltage and then work the bin settings after that.
> Questions are fine, I find this stuff interesting too
> 
> 
> 
> 
> 
> 
> 
> its fun finding a few tricks, but I have to give props to Coopiklaani for finding the "cliff" vs "curve" phenomena. Still does make sense, but it works


----------



## Dasboogieman

Quote:


> Originally Posted by *ELIAS-EH*
> 
> This is why support team recommended me to apply liquid metal on both surfaces (on CPU IHS and water block), in order to have perfect contact between them.
> 
> Capture.JPG 82k .JPG file


One thing I would caution against, straight from the delid guys, is using liquid metal between the IHS and the block. The liquid metal causes mild corrosion of the IHS, which is juuust enough to fade off the Intel laser etchings. This will definitely void your warranty, because for some reason Intel doesn't give a damn if you delid your chip (some of the delid guys got RMAs lmao), but they give a lot of a damn if you rub the laser etchings off.


----------



## done12many2

Quote:


> Originally Posted by *Dasboogieman*
> 
> One thing I would caution against using liquid metal for the IHS + block straight from the Delid guys. The Liquid metal causes mild corrosion of the IHS which is juuust enough to fade off the intel laser etchings. This will definitely void your warranty because for some reason, Intel doesn't give a damn if you delid your chip (some of the Delid guys got RMAs lmao) but they give a lot of a damn if you rub them laser etchings off.


Metal polish such as Never Dull or Brasso will take the staining right off without removing the labeling. The first time I removed LM, I got aggressive trying to remove the staining with other methods, and it ruined the labeling on the IHS. Now I just wipe it off with Never Dull and it looks like a brand new chip every time.


----------



## Benny89

Quote:


> Originally Posted by *Sweetwater*
> 
> You rang? How may I help
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Though I have had the voltage go up to 1.1 (by itself) even when it's set at 2000 at 1.0. What I have figured out is make sure the secondary non bold voltage line is smooth and no sharp drops and dips. Also set the overclock and voltage adjust when the card is under load temperature.
> 
> There are certain ways I can adjust the voltage curve where it says vrel and vop/ pwr in gpuz however I don't believe it is being power limited as I'll still get 10300+ in superposition. Games are fine too so I think it's just an anomaly.
> 
> Since doing all that I haven't had the voltage creep up unexpectedly


Can you post your curve so I know what you mean? I don't understand what you mean by the "non-bold voltage line". So you ran your card at 2000 at 1.1v? What are your temps?

"Also set the overclock and voltage adjust when the card is under load temperature" - what do you mean by that?

Thanks for the help, I will need it in the coming days. My SLI STRIX cards are coming tomorrow


----------



## mtbiker033

I just flashed the FTW3 bios on my hybrid cooled FE

enabled power limit up to 127%

idle clock is up to 240mhz from 160mhz

With just the power limit and temp sliders maxed, I scored a couple hundred more points in Superposition 4K versus stock, but the core clock seemed to fluctuate a lot; temps were actually a few degrees cooler as well.

In a game, still with just the power limit & temp sliders maxed, it boosted to 2000MHz vs. 1911MHz stock and ran a few degrees hotter, but no big deal. I didn't notice much difference in the one game I tried (Shadow Warrior 2).

I did try to push my highest clocks that I used before with Superposition 4K and the benchmark crashed to blackscreen (had to kill task):

full 100% voltage slider, 127% power, and full temp slider, +178 core and +500 memory

I need to work out a good curve OC

updated:

With +500 mem and full voltage, power, and temp sliders, with my curve maxing out at 2168 at 1050mV and the rest of the bins at that clock, I almost beat 10k in Superposition 4K:



I really wanted to crack 10k but I'm not sure it's possible


----------



## seven7thirty30

Quote:


> Originally Posted by *mtbiker033*
> 
> I just flashed the FTW3 bios on my hybrid cooled FE
> 
> enabled power limit up to 127%
> 
> idle clock is up to 240mhz from 160mhz
> 
> with just maxing out the power limit slider and temp slider I did score a coupe hundred more points in superposition 4K versus stock but the core clock seemed to fluctuate allot, temps were actually a few degrees cooler as well
> 
> in a game, still with just power limit & temp sliders maxed, it boosted to 2000mhz vs. 1911mhz stock, ran a few deg hotter but no big deal, I didn't notice much difference in the one game I tried (Shadow Warrior 2).
> 
> I did try to push my highest clocks that I used before with Superposition 4K and the benchmark crashed to blackscreen (had to kill task):
> 
> full 100% voltage slider, 127% power, and full temp slider, +178 core and +500 memory
> 
> I need to work out a good curve OC
> 
> updated:
> 
> +500 mem, full voltage, power, temp sliders with my curve maxxing out at 2168 at 1050mV and then across the rest of the bins at that clock I almost beat 10k in Superposition 4K:
> 
> 
> 
> I really wanted to crack 10k but I'm not sure it's possible


I wouldn't say that 2168 @ 1.05v is "impossible", but it appears your card is incapable of holding it; it looks like it's throttling down. Try setting the curve to 2050 @ 1.05v and work your way up from there. I hit 10600 and some change at 2088 @ 1.093v in Supo 4K.


----------



## gmpotu

Quote:


> Originally Posted by *KedarWolf*
> 
> I have a 5960X CPU at 1.273V, a dedicated PhysX card and an EK waterblock and backplate in my loop with one 360 rad. With fans at 80% (I use a headset) and pump at 100% (I don't even hear it actually, quiet pump) after running Heaven for 30 minutes or so with ambient temps around 23C my temps hover around 42-43C.
> 
> So 45-46C with lower pump and fans speeds is reasonable.


Is your dedicated PhysX card also a 1080 Ti in SLI?
I'm asking because, if it's just another discrete card in your system, I have a GTX 570 I could use for PhysX. Not sure if it would be a good idea to run PhysX off a card that old versus running everything off my 1080 Ti, or if it is even possible.


----------



## mtbiker033

Quote:


> Originally Posted by *seven7thirty30*
> 
> I wouldn't say that 2168 @ 1.05mV is "impossible", but it appears that your card is incapable of holding it. It looks like it's throttling down. Try setting the curve to 2050 @ 1.05mV and work your way up from there. I hit 10600 and some change at 2088 @ 1.093mV in Supo 4K.


thanks for the suggestion!


----------



## KraxKill

When I was testing with the XOC Bios, I found that the voltage would be correct during benching. Then I loaded Arma 3 and watched the voltage creep above 1.1 despite having limited the curve to 1.063v. Anybody see this? Fix?

Can somebody check with their favorite game of choice? Thx


----------



## KedarWolf

Quote:


> Originally Posted by *gmpotu*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I have a 5960X CPU at 1.273V, a dedicated PhysX card and an EK waterblock and backplate in my loop with one 360 rad. With fans at 80% (I use a headset) and pump at 100% (I don't even hear it actually, quiet pump) after running Heaven for 30 minutes or so with ambient temps around 23C my temps hover around 42-43C.
> 
> So 45-46C with lower pump and fans speeds is reasonable.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is your dedicated physx card also a 1080TI in SLI?
> I'm asking because of its just another discrete card in your system I have a GTX 570 I could use for physx. Not sure if it would be a good idea to run physx off a card that old vs running it everything off my 1080TI or if it is even possible.
Click to expand...

A GTX 570 would make a great PhysX card, but it will only work with games that actually use PhysX, and that list isn't very long.

In games that do use PhysX, it offloads the PhysX processing to the second card, leaving your primary card free for all the rendering.

You don't need a high-end card as your PhysX card, either.


----------



## AsterFenix

The screws are different sizes. The Swiftech ones are slightly bigger and won't pass through the original backplate holes.


----------



## lilchronic

Quote:


> Originally Posted by *KraxKill*
> 
> When I was testing with the XOC Bios, I found that the voltage would be correct during benching. Then I loaded Arma 3 and watched the voltage creep above 1.1 despite having limited the curve to 1.063v. Anybody see this? Fix?
> 
> Can somebody check with favorite game of choice. Thx


Yeah, with the XOC BIOS I can't keep the voltage locked to a specific voltage point either.


----------



## Nico67

Quote:


> Originally Posted by *TheBoom*
> 
> So I managed to get 2088 @ 1143 stable, and then 2050 at the 3 voltage bins below that, but I get a lower score than with a ramp followed by a cliff, if you know what I mean. Meaning 2075 at the first bin after 2088 (1.131v) gives me a higher score than, say, normalizing all 3 bins before [email protected] to 2050 instead.
> 
> Edit : Nevermind I think it was a margin of error between the scores.
> 
> By the way do you guys see power limiting on the XOC in FS Ultra?


It won't do 2075 / 2062 / 2050 for the three bins? And yeah, the first one makes the biggest difference; after three there was no noticeable gain.
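All this bin juggling follows from Pascal's clock granularity: GPU Boost 3.0 moves the core clock in discrete steps ("bins") of roughly 13 MHz, which is why 2088, 2075, 2062 and 2050 behave as separate points on the curve. A minimal sketch of snapping a target clock onto that grid; the 13 MHz step and the 2088 MHz reference point are assumptions for illustration, since the real steps vary slightly (12-13 MHz):

```python
# Snap a requested clock onto an assumed ~13 MHz Pascal clock grid.
BIN_MHZ = 13          # assumed step size; real hardware steps vary slightly
REF_MHZ = 2088        # a known bin from the discussion above

def snap_to_bin(target_mhz: float) -> int:
    """Nearest achievable clock on the assumed 13 MHz grid."""
    steps = round((target_mhz - REF_MHZ) / BIN_MHZ)
    return REF_MHZ + steps * BIN_MHZ

print(snap_to_bin(2075))  # one bin below 2088
print(snap_to_bin(2070))  # snaps to the nearest bin
```

Anything you dial in between two bins ends up on one of them, which is why two slightly different offsets can produce identical observed clocks.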


----------



## feznz

Quote:


> Originally Posted by *Benny89*
> 
> Thanks for the help, I will need it in coming days. My SLI STRIX are coming tomorrow


I am happy for you, just in time for the weekend!

Hope the binning gods are on your side, LOL. I did some estimating and I think mine will need 1.17 V+ to get 2100 MHz stable. I'm not really prepared to do that even on water for 24/7 running, as I'll have to get 4+ years out of this card.

Quote:


> Originally Posted by *gmpotu*
> 
> Is your dedicated physx card also a 1080TI in SLI?
> I'm asking because if it's just another discrete card in your system, I have a GTX 570 I could use for PhysX. Not sure if it would be a good idea to run PhysX off a card that old versus running everything off my 1080 Ti, or if it's even possible.







I had to revisit that question. My conclusion from the video: PhysX rocks if you play Batman games; apart from that... it fills your empty slots and makes your computer look nicer, fuller.

I have a GTX 770, more for running 2 accessory monitors.

I guess PhysX is a great marketing tool to sell low-end GPUs or keep people in the green camp.


----------



## Benny89

Sooo, anyone know how to fix the 1.1 V issue in the XOC BIOS? From what I've heard here, people lock the voltage to a certain bin but it goes to 1.1 V anyway with the XOC BIOS on the STRIX OC. Someone here said he knows how to avoid that and the perf caps below 1.1 V, but he is offline, I think.


----------



## gmpotu

Quote:


> Originally Posted by *Benny89*
> 
> Well, it was a BIOS designed for LN2 runs, so maybe this is how it works? 1.100 V is, I guess, the absolute minimum where someone with an LN2 setup would even start looking at voltage.
> 
> Did you try a flat curve? Get your 2050 at, let's say, 1.08 and flatten every bin after it at the same level, so it's a straight line from 1.08 to the end of the curve. Does it still jump to 1.100 V?
> 
> Did you try running 2050 at this 1.100 V? Did it throttle/downclock at all? Test your temps at 1.100 V.


[email protected] flat line across jumps to 1.100v
[email protected] flat line across jumps to 1.100v
[email protected] flat line across stays at 1.100v (no throttle)


----------



## mcg75

*Just a reminder here everyone.

As this is the 1080 Ti owner's club, please try to keep the topic as close to that as possible.

Some infrequent off topic banter is fine but 4 pages recently arguing about what cpu to get doesn't belong in here.

Thanks.
*


----------



## Benny89

Quote:


> Originally Posted by *gmpotu*
> 
> [email protected] flat line across jumps to 1.100v
> [email protected] flat line across jumps to 1.100v
> [email protected] flat line across stays at 1.100v (no throttle)


Yup, seems like the XOC BIOS forces you to 1.1 V. Which is not a dangerous voltage or anything, but it would still be nice to find a fix so it stays at its bin.


----------



## gmpotu

Quote:


> Originally Posted by *Sweetwater*
> 
> You rang? How may I help
> 
> Though I have had the voltage go up to 1.1 (by itself) even when it's set at 2000 at 1.0. What I have figured out is make sure the secondary non bold voltage line is smooth and no sharp drops and dips. Also set the overclock and voltage adjust when the card is under load temperature.
> 
> There are certain ways I can adjust the voltage curve where it says vrel and vop/ pwr in gpuz however I don't believe it is being power limited as I'll still get 10300+ in superposition. Games are fine too so I think it's just an anomaly.
> 
> Since doing all that I haven't had the voltage creep up unexpectedly


What is the "secondary non-bold voltage line"?


----------



## chef1702

Quote:


> Originally Posted by *Benny89*
> Yup, seems like XOC BIOS force you to go 1.1V


Quote:


> Originally Posted by *gmpotu*
> [email protected] flat line across jumps to 1.100v
> [email protected] flat line across jumps to 1.100v
> [email protected] flat line across stays at 1.100v (no throttle)


Must be an "XOC BIOS on Strix cards" problem. On my FE the voltage curve still works as it should. No voltage pinning, just a flat curve like we all learned.

I've been gaming on the XOC for a couple of weeks now at 1.012 V and 1987 MHz in different games like ME:A, Nier and The Surge, and none of them messed with the curve. Not even ME:A, which pulls 360+ watts sometimes... the card stays on the curve.

Oh, and yes, GPU-Z shows two voltage messages, but that's correct: the curve does limit the max voltage, so it shows VRel and VOp.


----------



## Sweetwater

Quote:


> Originally Posted by *gmpotu*
> 
> What is the "secondary non-bold voltage line"?


I'm just off to work, but I'll explain when I'm home. Roughly 6 hours

I'll do a detailed post of what I mean with some screenshots to show you guys


----------



## RavageTheEarth

Us Aorus owners are finally joining you FE guys under water!


I can't wait until temp is no longer an issue. Plus I can finally get my reservoir back in the case and get my side panel back on. I had to unscrew it and hang it out of the computer to get this monstrous card to even fit and it still barely fits. Can't wait until 1 slot goodness! It sucks that EK delayed this a month, but I'm damn happy it was finally announced. Now my CPU can finally share some of this 1020mm of rad space. I'm way too excited I had to order the $50 UPS Urgent tracking to get it here to the USA ASAP. Customs better not play no games!!


----------



## Benny89

Quote:


> Originally Posted by *Sweetwater*
> 
> I'm just off to work, but I'll explain when I'm home. Roughly 6 hours
> 
> I'll do a detailed post of what I mean with some screenshots to show you guys


Thanks. It is strange that a STRIX card would have problems with its own maker's BIOS while FE cards do not have issues with the XOC :/


----------



## Coopiklaani

Quote:


> Originally Posted by *gmpotu*
> 
> [email protected] flat line across jumps to 1.100v
> [email protected] flat line across jumps to 1.100v
> [email protected] flat line across stays at 1.100v (no throttle)


You need to re-calibrate your curve at different temperatures. Try applying the same core offset multiple times while running FurMark and see how the curve changes its shape.


----------



## gmpotu

Check this out...
I remember reading about what Coopiklaani is talking about when this thread first started, 7,000+ posts ago.

I watched my curve while running FurMark and you're right. I can see it changing shape so that the voltage bins in my curve were actually the lowest bins at those frequencies on the new, temperature-based curve.

So here is what I did.

Edit and save curve at 27C
Apply Curve at 27C
-Results-
[email protected]
[email protected] ~42C
[email protected] ~48C
[email protected] ~51C
-Score- 10087

Edit and Save the same curve shape at 56C
Apply curve at 42C
-Results-
[email protected] caps show vRel,vOP
[email protected] ~48C no caps
-Score- 10012

Apply the same saved profile at 27C
-Results-
[email protected] caps
[email protected] ~48C no caps
[email protected] ~52C no caps
-Score- 9981 (I might have had an extra application on here. Not sure why this score is so low.)

Edit and Save curve at 58C
Apply at 58C then stop furmark and run SP4K asap

-Results-
Benchmark started with temp at about 47C
[email protected] caps
[email protected] ~48C almost immediately
[email protected] ~55C
-Score~ 10009

On that last test I started seeing slight artifacts and then by scene 16/17 there were tons of artifacts like rainbow colors as if I was in the Matrix or something. So I'm sure I need to come down on my memory. My poor card can't OC the memory for crap. It's only at +440 right now.


----------



## gmpotu

Is there a way to make the Voltage/Frequency curve editor not be "Always On Top"?


----------



## nrpeyton

Played the first 6 missions of 'Ashes of the Singularity' but can't seem to get into it.

Too hard to distinguish individual units (as you have to zoom out to see the battlefield), so everything just looks the *same*.

Besides needing AA support against air attacks, you can throw anything into the mix (a bit like an "unintelligent brawl"). Hence it feels too much like an arcade game?

Any other suggestions? What's everyone else playing?
_I need something that justifies spending so much on a GPU. But having completed The Witcher 3, I also need a break from that genre._

_P.S.

My last mission (where you have to hold out long enough until a Turinium generator reaches critical mass) was my favourite so far, as it's the only one I actually got defeated on (so there was actually a decent challenge). Am I just not giving it a fair chance?_


----------



## JimmyMo

Quote:


> Originally Posted by *JimmyMo*
> 
> Think I answered my own question, if any ZOTAC owners that have replaced their TIM can confirm - -
> 
> New article reviewing the Amp! Extreme came out (https://www.techpowerup.com/reviews/Zotac/GeForce_GTX_1080_Ti_Amp_Extreme/)
> 
> There are some nice teardown pictures in that article.
> 
> Looks like 5 screws, all from the backside, unclip the fan power plug, and I am in? That simple?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Circled in red, here - -


*OK, here are the re-pasting results for my ZOTAC Amp Extreme 1080 Ti.
*

Very surprising - - it's story time!

...But, only after the results and pictures, of course, which is what everyone wants to see - -

Teardown first. Like I said above, only five screws, and here's the location of the two cables to disconnect as you are taking it apart - -





Aaaaand pull it off to reveal...whoa, yuck, THAT doesn't look like full TIM coverage for the OEM paste !





Cleaned everything up with isopropyl alcohol AND I took the extra step to lap the copper heat sink with lapping paste - -



Before lapping, note the fine lines in the copper from the manufacturing process - -



And, after lapping, the lines have been smoothed and the surface is mirror-like (I just focused on the middle of the heatsink, where it makes contact with the chip).

SHINY!



And, the current meme-TIM of choice, "Teh Grizz-o-Naut" - -



I went for the "slather and (somewhat) evenly spread" method that is recommended - -



Slapped it all back together, and started running tests!

*TL;DR (if you don't want to read more below):

Lapping the heatsink and a Kryonaut re-paste didn't have an effect at first; apparently the new TIM needed to cure with heat?

Before AND immediately after the re-paste: 46-48 C over ambient.

Now: 29 C over ambient, after a week of gaming, including one single "hot session" where the card hit 90 C, due to me forgetting to turn the card fans up to gaming speeds.*

BTW - - I tested the card before doing the re-paste, and recorded ambient room temperature in all test cases.

Ran my case fans at the "Gaming" profile, so lots of airflow, but not max RPM, just my gaming speed.

ZOTAC card was set to 50% fan and STOCK settings for all runs.

Used Heaven, and ran for approximately 30 minutes each time, until I saw the temps really level off and stay in one place.

TEMPS BEFORE RE-PASTE:

19.28 C AMBIENT (11:09 AM 5-12-2017)
66 C MAX CARD TEMP
46.72 C OVER AMBIENT

TEMPS IMMEDIATELY AFTER RE-PASTE

20 C AMBIENT (12:53 PM 5-12-2017)
68 C MAX CARD TEMP
48 C OVER AMBIENT

19.83 C AMBIENT (2:02 PM 5-12-2017)
68 C MAX CARD TEMP
48.17 C OVER AMBIENT

"Higher temps?!? Well, that sucks!" I thought. "Like some people say, re-pasting doesn't really matter, even compared to ZOTAC's bad original TIM application!"

BUT WAIT...it gets better.

STORY TIME !

I gave up on tinkering, and dove back into gaming for a week. I checked temps occasionally, casually and it always looked the same.

Just got "Lords of the Fallen" ("Dark Souls-Lite!") and have been really enjoying it on the triple-wide monitor setup. They programmed it to support NVIDIA Surround out of the box, so I was happy.

Fired the game up one time over the last week AND FORGOT to increase my fan speed on my card before I began gaming!

I am playing, and start getting graphics hiccups, tab out to check the Afterburner settings, and see the card is at 90 C !!

Whoops, all fans to 100%, cool that puppy down! No damage done, card is fine.

BUT, here's the punchline...which I was really surprised by, and tested it many times over many days to verify.

TEMPS AFTER EXCEEDING WARP 9(0 C)

17.17 C AMBIENT (8:07 PM 5-21-2017)
47 C MAX CARD TEMP
29.83 C OVER AMBIENT

17.67 C AMBIENT (8:40 PM 5-21-2017)
47 C MAX CARD TEMP
29.3 C OVER AMBIENT

###

17.72 C AMBIENT (7:10 AM 5-22-2017)
47 C MAX CARD TEMP
29.28 C OVER AMBIENT

###

17.78 C AMBIENT (2:21 PM 5-22-2017)
47 C MAX CARD TEMP
29.22 C OVER AMBIENT

###

20.39 C AMBIENT (5:48 PM 5-25-2017)
50 C MAX CARD TEMP
29.61 C OVER AMBIENT

Hope that all helps someone, happy to share!
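Since ambient varied between roughly 17 and 20 C across these runs, the delta-over-ambient figures are the numbers worth comparing, not the absolute card temps. A minimal helper reproducing that arithmetic (the function name is mine, not from any monitoring tool):

```python
# Reproduce the delta-over-ambient arithmetic used in the results above.

def delta_over_ambient(card_temp_c: float, ambient_c: float) -> float:
    """Temperature rise of the card above room ambient, in degrees C."""
    return round(card_temp_c - ambient_c, 2)

# Numbers from the log above: before the re-paste vs. after the TIM cured.
print(delta_over_ambient(66, 19.28))   # pre re-paste: 46.72 C over ambient
print(delta_over_ambient(47, 17.72))   # after curing: 29.28 C over ambient
```

Logging deltas this way is what makes runs taken on different days (and at different room temperatures) directly comparable.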


----------



## Siigari

Gorgeous card. The thing is half the size of my Define XL R2. Sheesh.

I haven't really gotten to play with it much yet, but VR is *so smooth* oh my life.

Here are a few pics. 100% lazy and didn't take a bunch before I put the door back on.


----------



## macwin2012

Hello,

I got a GTX 1080 Ti AMP Extreme. After doing heavy testing yesterday I found my sweet spot.

*Core Clock*: 2063 MHz boost, between 60-70C

*Memory Clock*: 6147 MHz

*Voltage*: 1.075 V

*NVIDIA Drivers*: 382.33

*NVIDIA Driver settings*: G-Sync off, Power: Adaptive, rest is default

Here is my score on the Heaven benchmark:



Is this an okay score? I don't think I can push more unless I do some kind of voltage mod or liquid cooling, which I don't plan to do anytime soon.


----------



## Siigari

Just benched it with stock clocks.


Not gonna lie, kinda disappointed. How can I make that better?


----------



## Nico67

Quote:


> Originally Posted by *gmpotu*
> 
> Check this out...
> I remember reading about this when this thread first started back like 7000+ post ago where coopiklaani is talking about.
> 
> I watched my curve while running furmark and you're right. I can see it changing the shape to where those voltage bins for my curve were actually the lowest bin at that frequency for the new curve based on temp.
> 
> So here is what I did.
> 
> Edit and save curve at 27C
> Apply Curve at 27C
> -Results-
> [email protected]
> [email protected] ~42C
> [email protected] ~48C
> [email protected] ~51C
> -Score- 10087
> 
> Edit and Save the same curve shape at 56C
> Apply curve at 42C
> -Results-
> [email protected] caps show vRel,vOP
> [email protected] ~48C no caps
> -Score- 10012
> 
> Apply the same saved profile at 27C
> -Results-
> [email protected] caps
> [email protected] ~48C no caps
> [email protected] ~52C no caps
> -Score- 9981 (I might have had an extra application on here. Not sure why this score is so low.)
> 
> Edit and Save curve at 58C
> Apply at 58C then stop furmark and run SP4K asap
> 
> -Results-
> Benchmark started with temp at about 47C
> [email protected] caps
> [email protected] ~48C almost immediately
> [email protected] ~55C
> -Score~ 10009
> 
> On that last test I started seeing slight artifacts and then by scene 16/17 there were tons of artifacts like rainbow colors as if I was in the Matrix or something. So I'm sure I need to come down on my memory. My poor card can't OC the memory for crap. It's only at +440 right now.


Wow, that is strange behaviour for an unlocked BIOS. It seems the only true "temp limiting" is at 90C (not applicable to the XOC BIOS), but there are different temp profiles at the GPU Boost 3 temperature graph points shown previously, so it's not strictly downclocking due to a limit, but rather switching the boost curve to a new profile that you haven't tuned?
Really not something that would ever be considered an issue for an XOC BIOS aimed at LN2 users. If anything it puts real emphasis on keeping temps as low, and probably as consistent, as possible. It also seems best to set your curve at your working temp, or at least within the same step range of the graph as your gaming temp.

EDIT: added a graph showing a possible theory and illustrating some of the effects seen above.



If the temp ranges between the green lines are different profiles, each using slight changes to the curve in some cases, it could explain what some people have been seeing. P1, P2, P3... are the different temp-profile regions, and each could use its own curve.
Note that where the voltage stays the same over two consecutive profiles the clock is likely to drop, but if the voltage increases the clock can stay the same.
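The theory above can be sketched as a lookup table: one curve profile per temperature region, switched at the breakpoints rather than "throttled". All region bounds and per-region clock offsets below are made-up illustration values, not read from any BIOS:

```python
# Toy model of per-temperature-region boost profiles (P1, P2, ...).
TEMP_PROFILES = [         # (upper bound in C, clock offset vs. cold curve)
    (37, 0),              # P1: cold, curve applies as saved
    (48, -13),            # P2: one bin down
    (59, -26),            # P3: two bins down
    (90, -39),            # P4: hot, three bins down
]

def active_profile(temp_c):
    """Return (profile name, clock offset in MHz) for a core temperature."""
    for i, (upper, offset) in enumerate(TEMP_PROFILES, start=1):
        if temp_c < upper:
            return f"P{i}", offset
    return "Tlimit", None  # 90 C+: the hard temperature limit kicks in

print(active_profile(30))   # cold: saved curve applies unchanged
print(active_profile(52))   # mid-range: same curve shape, shifted down
```

Under this model, saving the curve at your steady-state gaming temperature simply means you tune the profile that will actually be active, which matches the advice above.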


----------



## Benny89

Quote:


> Originally Posted by *gmpotu*
> 
> Check this out...
> I remember reading about this when this thread first started back like 7000+ post ago where coopiklaani is talking about.
> 
> I watched my curve while running furmark and you're right. I can see it changing the shape to where those voltage bins for my curve were actually the lowest bin at that frequency for the new curve based on temp.
> 
> So here is what I did.
> 
> Edit and save curve at 27C
> Apply Curve at 27C
> -Results-
> [email protected]
> [email protected] ~42C
> [email protected] ~48C
> [email protected] ~51C
> -Score- 10087
> 
> Edit and Save the same curve shape at 56C
> Apply curve at 42C
> -Results-
> [email protected] caps show vRel,vOP
> [email protected] ~48C no caps
> -Score- 10012
> 
> Apply the same saved profile at 27C
> -Results-
> [email protected] caps
> [email protected] ~48C no caps
> [email protected] ~52C no caps
> -Score- 9981 (I might have had an extra application on here. Not sure why this score is so low.)
> 
> Edit and Save curve at 58C
> Apply at 58C then stop furmark and run SP4K asap
> 
> -Results-
> Benchmark started with temp at about 47C
> [email protected] caps
> [email protected] ~48C almost immediately
> [email protected] ~55C
> -Score~ 10009
> 
> On that last test I started seeing slight artifacts and then by scene 16/17 there were tons of artifacts like rainbow colors as if I was in the Matrix or something. So I'm sure I need to come down on my memory. My poor card can't OC the memory for crap. It's only at +440 right now.


So let's say my card reaches 66C at 50% fan speed (which is what I'm running on a single STRIX).

So I should run a game, let it reach that temp and then apply my curve? While the game is still running? And it should not downclock? So if I apply the curve when I have 2050 at 1.09, it will stay there?

Seems like a real pain doing that every time... I thought this was supposed to be an unlocked BIOS...


----------



## TWiST2k

Looks like Beta 9 of MSI Afterburner 4.4.0 is out.

http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta9.rar

Note: You may have to copy the link and paste it in a new tab to load properly.


----------



## TheBoom

Quote:


> Originally Posted by *Nico67*
> 
> It won't do 2075 / 2062 / 2050 for the three bins? and yeah the first one makes the biggest difference, and after three there was no noticeable gain.


Yeah, unfortunately not. I've set the next 3 bins to 2062, 2050 and 2050 instead. I don't know why 1311 does 2088 but 1125 cannot do 2075. Either way, for now this gives me the highest score in a custom run of FSU. I can actually set all 3 bins to 2062, but for some reason that gives me a lower score than the earlier curve.

Do you get the highest score with the highest clock possible at each of the 3 next bins, or have you tried actually lowering them to see if that's better?

Quote:


> Originally Posted by *JimmyMo*
> 
> *OK, here are the re-pasting results for my ZOTAC Amp Extreme 1080 Ti.
> *
> 
> *[full re-paste write-up snipped; quoted in full in JimmyMo's post above]*


Hi, thanks for the extremely long and detailed post. I'd like to ask, however: were there any warranty stickers on any part of the assembly that might void the warranty if it's taken apart?


----------



## dallas1990

Had to redo my loop, or at least the res, so I could make room for the 2nd 1080 Ti AMP Extreme, lol. Also did some clearance checks so that when the 2nd one goes in it won't catch on anything. So far I love the 1080 Ti, but I haven't had time to really push it. Will try some 4K gaming later today; Wildlands and Andromeda seem like a good choice for seeing where it stacks up.


----------



## Sweetwater

Quote:


> Originally Posted by *Benny89*
> 
> Thanks. It is strange that STRIX card would have problem with its own BIOS while FE cards do not have issues with XOC :/


Wow, forgive me, I fell asleep straight after work haha. Anywho, this is what I mean.


This is my daily running clocks and voltage: nice and cool with no odd voltage movement at all. I first set these settings while I had Witcher 3 running, to load the GPU to mid-50s temps (I'd recommend doing this; I find it won't move around then). I saved it as a profile and it'll always apply when running a 3D application. However, if I change the clocks on the fly and then switch back to the profile, it won't be the correct settings. It weirds out and goes to 1.093 V+ for some reason. A restart will fix it.


The second picture here is essentially an anomaly, I reckon, as I don't think there should be any perfcaps. Same core/memory, same target voltage, yet the previous voltage points are higher and you can see more of the thin curve compared to my daily profile.



The last picture here is *maybe* when weird stuff can happen and voltage will go up to 1.1: even though the bold line is rather steady, you are able to manipulate it so the thin line simulates your local mountain range. Also, if I leave my core clock at +0 and don't adjust the voltage curve, games like Andromeda/Witcher 3 will get the voltage to 1.1 V within a few minutes.
Quote:


> Originally Posted by *Benny89*
> 
> So I should run game, let it reach this temp and then apply my curve? When the game is still running? And it should not downclock? So if I apply curve when I have 2050 on 1.09 it will stay there?


This is what I did. Let the game reach your target average temp, apply the curve you want, and then save it as a profile to apply the overclock at system startup. It *should* then run and not change when you game.


----------



## Coopiklaani

Quote:


> Originally Posted by *JimmyMo*
> 
> *OK, here are the re-pasting results for my ZOTAC Amp Extreme 1080 Ti.
> *
> 
> *[full re-paste write-up snipped; quoted in full in JimmyMo's post above]*


DAT STOCK THERMAL PASTE APPLICATION. It didn't even cover the entire die! It could have killed the GPU!


----------



## Coopiklaani

Quote:


> Originally Posted by *TWiST2k*
> 
> Looks like Beta 9 of MSI Afterburner 4.4.0 is out.
> 
> http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta9.rar
> 
> Note: You may have to copy the link and paste it in a new tab to load properly.


change log?


----------



## ViRuS2k

Quote:


> Originally Posted by *Coopiklaani*
> 
> change log?


MSI AB 4.4.0 beta 9 is available:

http://office.guru3d.com/afterburner...up440Beta9.rar

New version introduces new Quad Overvoltage(tm) architecture support for upcoming custom design MSI graphics cards, which means that you'll be able to control up to 4 independent voltages on some hi-end MSI graphics cards (core voltage, memory voltage and 2 auxiliary voltages).
New version also includes updated RTSS build, but changes in it are related to documentation and context help only.


----------



## ViRuS2k

Well, one more day and I will be shunt-modded.

Question, guys: currently there is Thermal Grizzly non-metal thermal paste on my GPUs. Will there be enough of a difference with Grizzly Conductonaut liquid metal to justify doing it? I know idle temps probably won't improve, but what about load temps? When I did liquid metal on my CPU I gained about 15-20C on load temps, but I'm guessing GPUs are different.... hell, even 10C better would help, or am I looking at 5C better realistically?

KedarWolf, mate, are you doing the resistor mod this weekend? Because that guide would be of great help....

Am I correct in saying I put non-conductive paste in the center and liquid metal on the 2 ends, then slap it down and wait for it to dry? Not sure what to use to stick it down; I'm guessing the Arctic Silver epoxy stuff I use on the ends, right?

Gonna have to get me some cocktail sticks lol


----------



## Aganor

After installing the voltage slider in MSI Afterburner, I noticed the max voltage came to 1.081v at the peak power limit of 121% on the stock FE BIOS. Is that too high?
I'm able to go +180 core offset with the watercooling. For a stock BIOS, is that good? Temps max at 47°C in the Heaven benchmark.


----------



## TheBoom

Quote:


> Originally Posted by *ViRuS2k*
> 
> MSI AB 4.4.0 beta 9 is available:
> 
> http://office.guru3d.com/afterburner...up440Beta9.rar
> 
> New version introduces new Quad Overvoltage(tm) architecture support for upcoming custom design MSI graphics cards, which means that you'll be able to control up to 4 independent voltages on some hi-end MSI graphics cards (core voltage, memory voltage and 2 auxiliary voltages).
> New version also includes updated RTSS build, but changes in it are related to documentation and context help only.


Wasn't this feature enabled in earlier builds already?

I remember my GTX 760 Hawk had this "feature". I honestly think it's more of a gimmick, and it did nothing for the overclock itself.

Should also mention that changing one of the auxiliary voltages fried the card twice, and I had to RMA it twice. They were asking at least a 30% premium over the basic models for that gimmick.


----------



## Coopiklaani

Quote:


> Originally Posted by *ViRuS2k*
> 
> MSI AB 4.4.0 beta 9 is available:
> 
> http://office.guru3d.com/afterburner...up440Beta9.rar
> 
> New version introduces new Quad Overvoltage(tm) architecture support for upcoming custom design MSI graphics cards, which means that you'll be able to control up to 4 independent voltages on some hi-end MSI graphics cards (core voltage, memory voltage and 2 auxiliary voltages).
> New version also includes updated RTSS build, but changes in it are related to documentation and context help only.


Thanks!
I really wish they could implement a voltage offset slider for Pascal cards so we don't need to play with this crazy V/F curve.


----------



## Villain512

Is there a way to modify BIOS settings, such as by using nvflash?
I only want to increase the default and boost frequencies without using MSI Afterburner.


----------



## TheBoom

Quote:


> Originally Posted by *Villain512*
> 
> Is there way to modify bios settings, such as using nvflash?
> I only want to increase default and boost frequency without using MSI Afterburner.


No. There is no BIOS editor for Pascal, and there probably never will be. You can try flashing other BIOSes to your card, but they won't increase default clocks by much either.


----------



## Nico67

Quote:


> Originally Posted by *TheBoom*
> 
> Yeah unfortunately not. I've stuck the next 3 bins to 2062, 2050 and 2050 instead. I don't know why 1311 does 2088 but 1125 cannot do 2075. Either way, for now this gives me the highest score in a custom run of FSU. I can actually stick all 3 bins to 2062, but for some reason that gives me a lower score than the earlier curve.
> 
> Do you get the highest score with the highest clock possible at each of the 3 next bins or have you tried actually lowering them to see if that's better?


2138, with 2126/2114/2101, gave me 10611, and increasing the last bin to 2114, so I had 2138 with 2126/2114/2114, gave me 10615. That's as far as I tested; then I did 4 points and got the same 10615.


----------



## Benny89

Installed my SLI 1080 Ti STRIX OC but I'm having a problem. Even on stock, SLI crashes in 3D applications (Heaven, for example), the whole PC reboots, and when it comes back to the desktop it looks like Safe Mode. Installed the GPUs correctly, the better one is on top, bought and installed the ROG ASUS HB bridge, used DDU and did a fresh install of new drivers. Control Panel sees SLI, no problem.

However, the system keeps crashing.

EDIT: Also, after rebooting the system does not seem to detect the GPUs now. Safe Mode all the time (iGPU, I think). I keep rebooting but it does not help.

Any solution?


----------



## BrainSplatter

Quote:


> Originally Posted by *Benny89*
> 
> However system keeps crashing..


Sorry to hear! Which kind of crash? Blue screen?
It seems you tested each individual card and had no problem, right?


----------



## Benny89

Quote:


> Originally Posted by *BrainSplatter*
> 
> Sorry to hear! Which kind of crash? Blue screen?
> It seems u tested each individual card and had no problem, right?


No. Crash to a black screen -> reboot into Safe Mode, and it does not detect the drivers/cards (can't get out of Safe Mode even after a few reboots).

I tested each card running Heaven and had no problems. Just checked their max stable clocks quickly.

EDIT: Removed one card, and everything is working OK.

But why does SLI keep crashing?


----------



## BrainSplatter

Do you get proper card info for both from GPU-Z and Afterburner? Can you control the 2nd card with Afterburner (e.g. spin up the fans manually)?


----------



## Benny89

Quote:


> Originally Posted by *BrainSplatter*
> 
> Do u get proper card info for both from GPU-Z and Afterburner? Can u control 2nd card with Afterburner (e.g. spin up fans manually)?


In AB I had the normal view, and all OC/fan settings I applied were applied to both cards.

There is only one fan slider and it applies to both cards.


----------



## BrainSplatter

Quote:


> Originally Posted by *Benny89*
> 
> There is only one fan "slider" and it applies for both cards.


You can configure AB to apply settings individually for each GPU (after choosing that option, there is a small drop-down field on the main screen to select the current GPU). When you turn up the fans manually, does the 2nd card also spin up? I assume your crashes happen instantly, before showing any 3D screen. And I would check with GPU-Z whether you get proper info about your 2nd card.

Also, can you disable SLI in the NVIDIA control panel and run a single card?


----------



## Benny89

Quote:


> Originally Posted by *BrainSplatter*
> 
> U can configure AB to apply settings individually for each GPU (after choosing that option, there is a small drop down field on main screen to select current GPU). When u turn up fans manually, the 2nd card also spins up? I assume your crashes are instantly before showing any 3D screen. And I would check with GPU-Z whether you get proper info about your 2nd card.
> 
> Also, can u disable SLI in NVIDIA control panel and run a single card?


It does not crash instantly. It runs for about 40 seconds to 1 minute and then crashes.

Yes, I could disable SLI in the Nvidia control panel. The OSD was also correctly showing the second GPU, its clocks, fan speed, power, etc.

Usage in Heaven was around 98% on one card and 95-97% on the second card.


----------



## BrainSplatter

Quote:


> Originally Posted by *Benny89*
> 
> It does not crash instantly. It runs for about 40 secs - 1 min and crash.


And AB is showing normal values for both cards (clock, utilization, temperature, ..)?


----------



## Benny89

Quote:


> Originally Posted by *BrainSplatter*
> 
> And AB is showing normal values for both cards (clock, utilization, temperature, ..)?


See above, I edited the post. In short: yes.


----------



## BrainSplatter

Do you use the extra PCIe power connector on your motherboard?


----------



## JimmyMo

Quote:


> Originally Posted by *TheBoom*
> 
> Hi, thanks for the extremely long and detailed post. I would like to ask however, if there were any warranty stickers on any part of the assembly that may void the warranty if taken


No stickers, seals, runes, or words of binding; however, by lapping the heat sink, I may have made it obvious that I was in there!
Quote:


> Originally Posted by *Coopiklaani*
> 
> DAT STOCK THERMAL PASTE APPLICATION. It didn't even cover the entire die! It could have kill the GPU!


Srsly! I was surprised. I wonder if the TIM sublimated and moved after installation? Poor coverage like that would have likely been noticed during production.
Quote:


> Originally Posted by *macwin2012*
> 
> Is this okay score ? I don't think i can push more unless i do some kind of Voltage Mod or Liquid cooling which i don't plan to do anytime soon .


Looks great to me! It's OK if you think it's OK! I pushed mine up there, too, but for now I've settled back to 2025 for gaming, and next will do some undervolting to get temps down even further. I value temps and lower fan speeds over a handful of FPS, personally.


----------



## Luckbad

Quote:


> Originally Posted by *Benny89*
> 
> See in above, I edited post. In short- yes


How many watts is your power supply? Black screen/reboot sounds like you're running out of power.


----------



## Benny89

Quote:


> Originally Posted by *Luckbad*
> 
> How many watts is your power supply? Black screen/reboot sounds like you're running out of power.


1000W. That is what I think too. Oh well, there goes the SLI dream.

But I got lucky with one card. So far I've only played with the voltage curve a little, but the card does 2088 at 1.09 no problem and runs in Heaven at 2076 until 61C. At 62C it drops to 2063 and stays there for the last few scenes in Heaven. Memory OC +500 no problem. Fan speed 55% and it does not go above 62C.

I think I finally got a nice card.

I wonder if I could push it to 2100....


----------



## mtbiker033

Quote:


> Originally Posted by *Benny89*
> 
> Installed my SLI 1080 Ti STRIX OC but I'm having a problem. Even on stock, SLI crashes in 3D applications (Heaven, for example), the whole PC reboots, and when it comes back to the desktop it looks like Safe Mode. Installed the GPUs correctly, the better one is on top, bought and installed the ROG ASUS HB bridge, used DDU and did a fresh install of new drivers. Control Panel sees SLI, no problem.
> 
> However, the system keeps crashing.
> 
> EDIT: Also, after rebooting the system does not seem to detect the GPUs now. Safe Mode all the time (iGPU, I think). I keep rebooting but it does not help.
> 
> Any solution?


Dude, it could be that your power supply can't handle it.


----------



## TWiST2k

I currently have 3 x FTW3 cards and I am having really mixed results with all of them. First, having to use Precision X because of the 3 fans sucks, and I have tried Precision X together with Afterburner as well as Precision X alone. I will get great scores in Superposition and then crashes in Heaven Extreme. Then I will get almost totally different results with the exact same settings, as well as really high scores at lower clocks than everyone else claims.

My regular 1080 FTW was a bit finicky to get dialed in, but this has been way more illogical and inconsistent at times. Even when I experimented with the XOC BIOS I was getting higher stable clocks, which produced much lower results, go figure.

I am just using the stock cooler and never really go much over 60C, if at all, on GPU, power or memory. I have my FTW3 in my 6700K @ 4.8 system, with 32GB RAM and an M.SATA SSD.

2 of these 3 FTW3s are going back sooner rather than later, so I gotta figure out who the winner of the silicon lottery here is gonna be.


----------



## Benny89

Quote:


> Originally Posted by *TWiST2k*
> 
> I currently have 3 x FTW3 cards and I am having such mixed results with all of them. First, using precision X because of the 3 fans sucks and I have tried with precision X and afterburner together as well as precision X alone. I will have great scores in superposition and then crashes in heaven extreme. Then I will get almost totally different results with the exact same settings. As well as getting really high scores with lower clocks then everyone else claims.
> 
> My regular 1080 FTW was a bit finicky to get dialed in, but this has been way more illogical at times and inconsistent. Even when I experimented with the XOC BIOS I was getting higher stables clocks, which produced much lower results, go figure.
> 
> I am just using the stock cooler and never really go much over 60c, if at all, on GPU, power or memory. I have my FTW3 in my 6700k @ 4.8 system, with 32GB RAM and M.SATA SSD.
> 
> 2 of these 3 FTW3s are going back sooner then later, so I gotta figure out who the winner of the silicon lottery here is gonna be.


That was one of the reasons I decided to go with the STRIX in the end instead of the FTW3 (and of course the constant moving of my ship date lol). Having to play with Precision X is just another pain among the other pains those Pascals give me, so no.


----------



## TheBoom

Quote:


> Originally Posted by *Nico67*
> 
> 2138, with 2126/ 2114/ 2101 gave me 10611 and increasing the last bin to 2114 so I had 2138, with 2126 / 2114 / 2114 gave me 10615. That's as far as I tested, then did 4 points and got the same 10615


That seems within the margin of error. So maybe the 2nd and 3rd bins aren't as important?


----------



## TheBoom

Quote:


> Originally Posted by *JimmyMo*
> 
> No stickers, seals, runes, or words of binding - - however, by lapping the heat sink, I may have made it obvious that I was in there!
> 
> Looks great to me! It's OK if you think it's OK! I pushed mine up there, too, but for now I've settled back to 2025 for gaming, and next will do some undervolting to get temps down even further. I value temps and lower fan speeds over a handful of FPS, personally.


Thanks for the info + rep. Now I can reapply the TIM on mine with peace of mind. By the way, do you know what the rectangular sticker on the back plate is for? I initially thought that was the seal.


----------



## JimmyMo

Quote:


> Originally Posted by *TheBoom*
> 
> Thanks for the info + rep. Now I can reapply the TIM on mine with piece of mind. By the way do you know what the rectangular sticker on the back plate is for? I initially thought that was the seal.


You're welcome, and thank YOU!

Do you mean this sticker, labeled 'N471' on my unit?



Not sure, it's not the serial number, so I am guessing that it's a production run tracking number, or something similar.

What number does yours have on it?

-jm


----------



## Benny89

Could someone tell me what kind of heatsinks I should use if I want to install a Kraken G12 + AIO on my STRIX? I have no idea how tall, long, etc. they should be.

And should I only get heatsinks for the VRAM, or for something else too? Thanks


----------



## Dasboogieman

Quote:


> Originally Posted by *Benny89*
> 
> Could someone tell me what kind of heatsinks I should use if I want to install Kraken G12 + AIO on my STRIX? I have no idea how high, long etc they should be.
> 
> And I should only get heatsinks for VRAM or for seomething else? Thanks


IMHO, the Kraken G12 system is complete trash. I used the G10 system a while back with 290s. The VRM cooling is mediocre: even when I upgraded the fans to high RPM plus a custom heatsink CNC-milled specifically for the VRMs plus Fujipoly pads, the temps would easily shoot to 90C+ (AMD has thermistors on the VRM array for some reason). The VRAM also gets really hot when OCed, so I was getting artifacts at the slightest OC.

With that out of the way: get the heatsinks as tall as possible. You need every scrap of cooling you can get for the VRAM, because GDDR5X is very sensitive to operating temperatures. I'm not too concerned about the VRM, because ASUS already used top-tier 60A power stages rated for 125C, which also happen to be something like 91% efficient, but it will get uncomfortably hot. Leaving the black heatplate on the VRMs is a must; you might be able to thermal-glue (the Arctic Silver epoxy) some 5mm or taller sinks, because the clearance with the fan should be quite high.

If you can still return the AIO equipment, I recommend just getting a Strix EK block for your card and an affordable 120mm rad kit, and setting up a custom loop just for the GPU. You'd be much happier, trust me; the G12 system is really flawed.
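To put rough numbers on that VRM point: with the quoted ~91% efficiency, a back-of-envelope estimate of the heat the power stages dump into the heatplate looks like this (the 300W card draw and 10-phase count are my assumptions for illustration, not measured figures):

```python
# Back-of-envelope VRM heat estimate. The ~91% efficiency is quoted above;
# the 300 W card draw and 10 phases are assumptions for illustration.
card_power_w = 300
vrm_efficiency = 0.91
phases = 10

vrm_loss_w = card_power_w * (1 - vrm_efficiency)  # heat dissipated by the VRM
print(f"Total VRM heat: {vrm_loss_w:.0f} W ({vrm_loss_w / phases:.1f} W per phase)")
```

Even ~27W spread over the array is easily enough to make an uncooled heatplate uncomfortably hot, which is why leaving it in place matters.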


----------



## gmpotu

Quote:


> Originally Posted by *Benny89*
> 
> So lets say my card reach out to 66C on 50% fan speed (this is what I am running on single STRIX).
> 
> So I should run game, let it reach this temp and then apply my curve? When the game is still running? And it should not downclock? So if I apply curve when I have 2050 on 1.09 it will stay there?
> 
> Seems like a real pain doing it every time... I thought it was supposed to be unlocked BIOS...


Yes, this seems to be the way to do it for consistency.

You have to choose not only when to edit and save your curve, but also when to apply it when you load your games.

Your system may run [email protected]@66C just fine, but if you apply these clocks when you have an idle temp of like 27C then your curve will not be the same shape, and when you launch your game it may start you out at [email protected] Then you could end up crashing if your card is not stable, or you could see it just throttle down to 2062 and then 2050 as your temps rise and your curve fluctuates.

I think from what I saw it is best to do this:
1) Edit and save the curve at idle temp
2) Apply the curve at idle temp
3) See how high the voltage jumps on its own as temps increase
4) Go back to idle temp
5) Edit and save the curve, changing the voltage to the bin that the voltage jumped to in the test
6) Then you should be able to have these settings apply on startup and stay stable regardless of temp
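The throttling behaviour behind steps 3 and 5 can be sketched as a toy model: Pascal's GPU Boost drops the effective clock roughly one ~13 MHz bin at a time as temperature thresholds are crossed. The threshold values below are illustrative placeholders, not NVIDIA's actual Boost 3.0 tables:

```python
# Toy model of Pascal GPU Boost temperature down-binning (illustrative only;
# the real thresholds are driver-controlled and vary per card).
BIN_MHZ = 13

def boosted_clock(applied_mhz: int, temp_c: float, thresholds=(38, 46, 54, 62)) -> int:
    """Drop one ~13 MHz bin for each temperature threshold crossed."""
    bins_dropped = sum(1 for t in thresholds if temp_c >= t)
    return applied_mhz - bins_dropped * BIN_MHZ

# A curve applied at 2088 MHz while cool ends up lower once the card heats up,
# which is why applying the curve at idle temp gives misleading clocks.
for temp in (30, 45, 55, 65):
    print(f"{temp} C -> {boosted_clock(2088, temp)} MHz")
```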


----------



## ninjames

Well, after some advice from folks here and in the Aorus 1080 Ti thread, I returned that card about a week ago after persistent crashing issues, having ruled out my RAM, SSD and HDD as the cause and read about some of the quality-control issues and bad cards that came out of the Xtreme line from Aorus. I loved the card when it worked.

Newegg received my return yesterday and processed my refund (in store credit) right away. So I bought the EVGA 1080 Ti FTW3. I don't do any overclocking or messing with voltages. I plan to install Precision XOC, give it all updates, and leave it where it is. Three-day shipping is what I paid for, but a weekend and a holiday screwed me, so I'm probably not getting it until Wednesday. Hoping for fewer issues with this one.


----------



## gmpotu

Quote:


> Originally Posted by *Benny89*
> 
> 1000W. That is what I think too. Oh well, there goes the SLI dream.
> 
> But I got lucky with one card. So far I've only played with the voltage curve a little, but the card does 2088 at 1.09 no problem and runs in Heaven at 2076 until 61C. At 62C it drops to 2063 and stays there for the last few scenes in Heaven. Memory OC +500 no problem. Fan speed 55% and it does not go above 62C.
> 
> I think I finally got a nice card.
> 
> I wonder if I could push it to 2100....


Dude, I'm so jealous. Getting those clocks and that memory on AIR, without the XOC BIOS, is amazing.

Your power supply is probably the issue, like everyone else is saying. My total system power with one card spikes at 460W on the 12v rail and averages 400W+, so with two cards you are probably close to 750-800W, which is pushing the upper limit of your 1000W PSU.

You can do some tests.
Do you have an extra 4-pin power connector on your mobo that you can run a cable to from your PSU? Some mobos have this.

1) PCI-E slot one: test best clocks with each card and see if either crashes (make sure it is stable for a while)
2) PCI-E slot two: test best clocks with each card and see if either crashes
3) Same thing, but with both cards, and see if it still crashes in SLI. If so, it's probably power, I would think, or maybe something faulty with your SLI bridge (though if the cards are showing up in AB etc., it probably isn't the bridge).
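The PSU arithmetic above, laid out explicitly; the second-card figure is an assumption (an extra OCed 1080 Ti at roughly 300W), not a measurement:

```python
# Rough SLI power-headroom check. The 1000 W PSU and 460 W single-card spike
# are from the discussion above; the 300 W second-card draw is an assumption.
psu_watts = 1000
single_card_system_peak_w = 460
second_card_peak_w = 300

sli_peak_w = single_card_system_peak_w + second_card_peak_w
headroom_w = psu_watts - sli_peak_w
print(f"Estimated SLI peak: {sli_peak_w} W "
      f"({100 * sli_peak_w / psu_watts:.0f}% of PSU), headroom: {headroom_w} W")
```

That puts estimated transient spikes within striking distance of the PSU's rating, which fits the black-screen-under-load symptom.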


----------



## Siigari

Wow you guys are modding your cards?

Should I get into this?


----------



## Benny89

So far so good. Playing ULTRA modded ME:A at 2063/6000. 60% fan speed keeps it around 63C.
Quote:


> Originally Posted by *Dasboogieman*
> 
> IMHO, the Kraken G12 system is complete trash. I used the G10 system a while back with 290s. The VRM cooling is mediocre: even when I upgraded the fans to high RPM plus a custom heatsink CNC-milled specifically for the VRMs plus Fujipoly pads, the temps would easily shoot to 90C+ (AMD has thermistors on the VRM array for some reason). The VRAM also gets really hot when OCed, so I was getting artifacts at the slightest OC.
> 
> With that out of the way: get the heatsinks as tall as possible. You need every scrap of cooling you can get for the VRAM, because GDDR5X is very sensitive to operating temperatures. I'm not too concerned about the VRM, because ASUS already used top-tier 60A power stages rated for 125C, which also happen to be something like 91% efficient, but it will get uncomfortably hot. Leaving the black heatplate on the VRMs is a must; you might be able to thermal-glue (the Arctic Silver epoxy) some 5mm or taller sinks, because the clearance with the fan should be quite high.
> 
> If you can still return the AIO equipment, I recommend just getting a Strix EK block for your card and an affordable 120mm rad kit, and setting up a custom loop just for the GPU. You'd be much happier, trust me; the G12 system is really flawed.


Well, on YT you can find people that used it (the G10 and G12) on their FE cards and are happy with the results. But of course you are right that a water loop is better. No doubt. But it costs a lot more...

Although having a small loop for just your GPU seems really nice. It would look strange, but hey, at least it's not "yet another same-looking loop". I will consider that.


----------



## djriful

Hi!

Kepler to Pascal!





Afterburner 4.4.0 Beta 9



Green line = 4.4.0 Beta 9

Blue line = 4.3 Final

It fixed the FPS dips in the bench test.


----------



## MMoshi

So I tried to "undervolt" my FTW3.
Best I got for now is 1936mhz @ 962mv, which drops at 65C to 1924mhz @ 950mv. Temps are stable at 68C with 45% on fans.

This does not seem amazing to me. Or is it?


----------



## Benny89

Quote:


> Originally Posted by *MMoshi*
> 
> So I tried to "undervolt" my FTW3.
> Best I got for now is 1936mhz @ 962mv, which drops at 65C to 1924mhz @ 950mv. Temps are stable at 68C with 45% on fans.
> 
> This does not seem amazing to me. Or is it?


That is a really high temp for 962mv.... My STRIX at 45% fan is 68-69C at 1.075v (stock fan). At 55% it's 65C max at 1.075v.

Are you using Precision X? I know the FTW3 fans can be bugged in Afterburner. Maybe one is not spinning, or is spinning slowly?


----------



## djriful

Yeah, highly recommended to get the 4.4.0 Beta 9 now. Fan is working.


----------



## Hulio225

Quote:


> Originally Posted by *Benny89*
> 
> That is a really high temp for 962mv.... My STRIX at 45% fan is 68-69C at 1.075v (stock fan). At 55% it's 65C max at 1.075v.
> 
> Are you using Precision X? I know the FTW3 fans can be bugged in Afterburner. Maybe one is not spinning, or is spinning slowly?


You don't know how high his ambient temperature is... you can't know at this point if it's high or normal...

I really hate it when people talk about temps without mentioning their ambient temps...


----------



## djriful

I'm waiting for the EK waterblock to come in stock... so I can put my loop together again and enjoy low temps, with CPU+GPU+MOSFET at 40C and under.


----------



## DerComissar

Quote:


> Originally Posted by *Benny89*
> 
> So far so good. Playing ULTRA modded ME:A at 2063/6000. 60% Fan speed to keep it around 63.
> Quote:
> 
> 
> 
> Originally Posted by *Dasboogieman*
> 
> IMHO, the Kraken G12 system is complete trash. I used the G10 system a while back with 290s. The VRM cooling is mediocre: even when I upgraded the fans to high RPM plus a custom heatsink CNC-milled specifically for the VRMs plus Fujipoly pads, the temps would easily shoot to 90C+ (AMD has thermistors on the VRM array for some reason). The VRAM also gets really hot when OCed, so I was getting artifacts at the slightest OC.
> 
> With that out of the way: get the heatsinks as tall as possible. You need every scrap of cooling you can get for the VRAM, because GDDR5X is very sensitive to operating temperatures. I'm not too concerned about the VRM, because ASUS already used top-tier 60A power stages rated for 125C, which also happen to be something like 91% efficient, but it will get uncomfortably hot. Leaving the black heatplate on the VRMs is a must; you might be able to thermal-glue (the Arctic Silver epoxy) some 5mm or taller sinks, because the clearance with the fan should be quite high.
> 
> If you can still return the AIO equipment, I recommend just getting a Strix EK block for your card and an affordable 120mm rad kit, and setting up a custom loop just for the GPU. You'd be much happier, trust me; the G12 system is really flawed.
> 
> Well, on YT you can find people that used it (the G10 and G12) on their FE cards and are happy with the results. But of course you are right that a water loop is better. No doubt. But it costs a lot more...
> 
> Although having a small loop for just your GPU seems really nice. It would look strange, but hey, at least it's not "yet another same-looking loop". I will consider that.

I certainly agree that would be a good consideration.

Also you may have seen this post:
http://www.overclock.net/t/1627390/official-nvidia-titan-xp-owners-club/1660#post_26120518

Even though that was a rare occurrence, it is something to be aware of.
The quality of some AIO kits is not up to that of a proper waterblock and loop setup.

If you're going to be running SLI, all the better to be using a good loop and blocks, for the best temps as well.

Besides, I'd like to see you have a nice loop in your system!


----------



## tagaxxl

Hey guys, I just got the 1080 Ti AMP Extreme edition and the fan keeps revving up randomly while browsing the internet or something. I have MSI Afterburner 4.4.0 Beta 9 at the moment.


----------



## Luckbad

Quote:


> Originally Posted by *tagaxxl*
> 
> Hey guys, I just got the 1080 Ti AMP Extreme edition and the fan keeps revving up randomly while browsing the internet or something. I have MSI Afterburner 4.4.0 Beta 9 at the moment.


Download FireStorm, go to Spectra, click the down arrow to the right of the LED settings, turn off fan freeze.

That'll let you control fan speed in Afterburner without the revving.


----------



## feznz

Quote:


> Originally Posted by *Benny89*
> 
> Installed my SLI 1080 Ti STRIX OC but I'm having a problem. Even on stock, SLI crashes in 3D applications (Heaven, for example), the whole PC reboots, and when it comes back to the desktop it looks like Safe Mode. Installed the GPUs correctly, the better one is on top, bought and installed the ROG ASUS HB bridge, used DDU and did a fresh install of new drivers. Control Panel sees SLI, no problem.
> 
> However, the system keeps crashing.
> 
> EDIT: Also, after rebooting the system does not seem to detect the GPUs now. Safe Mode all the time (iGPU, I think). I keep rebooting but it does not help.
> 
> Any solution?


Damn!!!! On the brighter side, you got a better card.

I never had a problem similar to that. All I can suggest is grab your wife's PSU, do the green-wire bridge, and run one card with her PSU; that way you can eliminate the PSU as the problem.


----------



## Benny89

Quote:


> Originally Posted by *DerComissar*
> 
> I certainly agree that would be a good consideration.
> 
> If you're going to be running SLI, all the better to be using a good loop and blocks, for the best temps. as well.
> 
> Besides, I'd like to see you have a nice loop in your system!


Why is that?

I don't know if I will run SLI now. I don't really want to go through the hassle of replacing the PSU, again placing and connecting all those cables in my case, etc. Too much pain.

I got a nice card, will still be tweaking the curve on it and see where it can land at max.

Then I will think about a block.

Funny fact: I tried to push it to 2088 in ME:A but the game freezes after about 40 minutes to an hour. What is interesting to me is that this is the first time I've seen a game freeze, no crashing, no driver crash, nothing. Just a freeze. Never had that, it was always a crash. Interesting.


----------



## MMoshi

Quote:


> Originally Posted by *Benny89*
> 
> That is a really high temp for 962mv.... My STRIX at 45% fan is 68-69C at 1.075v (stock fan). At 55% it's 65C max at 1.075v.
> 
> Are you using Precision X? I know the FTW3 fans can be bugged in Afterburner. Maybe one is not spinning, or is spinning slowly?


Well, my ambient is about 24C, which is higher than average. And my case (Enthoo EVOLV ATX) has mediocre airflow.
I was using Precision X but it's very buggy, so now I'm using Afterburner. Will check the fans, thanks.

Quote:


> Originally Posted by *Hulio225*
> 
> You don't know how high his ambient temperature is... you can't know at this point if its high or normal...
> 
> I really hate when people talk about temps without mentioning their ambient temps...


Indeed, I should have included my ambient temp; it is about 24-25C.


----------



## AngryLobster

To the poster above: you don't need any additional heatsinks for the VRAM/VRM when attaching a G12 to the STRIX. It uses a baseplate to cool those components, so the G12 and its VRM fan sit above it.

It has a doubled 5-phase VRM, so VRM temperatures with the existing base plate and 92mm fan will not be an issue. The VRAM will naturally run at lower temperatures as well, since the AIO will reduce PCB temps, with the GPU never going north of 55C.


----------



## ilikelobstastoo

Sorry, but this drives me crazy!! I see people buying cards left and right, talking about SLI and such ("my 5th card is .005% better than my 6th"), yet claiming a custom loop is too expensive!!! That just makes no sense at all to me. I guarantee you can have a custom loop for less than another 1080 Ti.

If you want to save a few bucks you can make a loop like this for about $400 or so and just use one card:

Purchase a high-quality expandable AIO like a Swiftech 240X or similar: $150 (USD)

Purchase an EK block (optional: backplate): $130 (USD)

A radiator like a Hardware Labs Black Ice GTS 280: $65 (USD)

Some PrimoChill soft tubing, clamps, and coolant like Mayhems X1: $45 (USD)

Use some static-pressure fans you have laying around, or get a couple: $25-50 (USD)
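For what it's worth, the list does land around the claimed figure; a quick tally (prices as quoted above, fans as the $25-50 range):

```python
# Tally the custom-loop parts list quoted above (USD).
parts = {
    "Swiftech 240X (or similar) expandable AIO": 150,
    "EK block (optional backplate extra)": 130,
    "Hardware Labs Black Ice GTS 280 radiator": 65,
    "PrimoChill tubing, clamps, Mayhems X1 coolant": 45,
}
fixed = sum(parts.values())
print(f"Fixed parts: ${fixed}, with fans: ${fixed + 25}-${fixed + 50}")
```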


----------



## DerComissar

Quote:


> Originally Posted by *Benny89*
> 
> Quote:
> 
> Originally Posted by *DerComissar*
> 
> I certainly agree that would be a good consideration.
> 
> If you're going to be running SLI, all the better to be using a good loop and blocks, for the best temps as well.
> 
> Besides, I'd like to see you have a nice loop in your system!
> 
> Why is that?
> 
> I don't know if I will run SLI now. I don't really want to go through the hassle of replacing the PSU, again placing and connecting all those cables in my case, etc. Too much pain.
> 
> I got a nice card, will still be tweaking the curve on it and see where it can land at max.
> 
> Then I will think about a block.
> 
> Funny fact: I tried to push it to 2088 in ME:A but the game freezes after about 40 minutes to an hour. What is interesting to me is that this is the first time I've seen a game freeze, no crashing, no driver crash, nothing. Just a freeze. Never had that, it was always a crash. Interesting.

All the better.

The money saved by not going SLI could go towards the block and loop!

I quit SLI after having two 780Ti's, it worked pretty well back then, but things are different these days.
One 1080Ti (or, cough*TXp*cough) stomps that 780Ti setup anyhow.

The EVGA G2 1000 should be working great with one Strix now; idk why you're still getting a game freeze. Driver issue? Does DDU need to be run again?
Am I grasping at straws, lol.

Certainly seems to be a good card though.


----------



## gmpotu

Oh my goodness boys....

I hit [email protected] in SP4K !!! When I started this adventure I was only able to do [email protected] and it would downclock because of throttling all the way to 2025. I never tried XOC BIOS when I was starting out. From the help of all of you guys and all of the knowledge shared here, trial and error, and XOC Bios I've been able to improve so much. Thanks guys. This is a great community.

-Score 10173- Max 60C with a max 478W on the 12v rail according to Corsair Link. My fan curve topping at 90% (Loud but I use headphones so not too bad) System

I thought maybe this was a fluke but nope. Multiple runs no issues.

Will post pics of my curve later. It seems very apparent that the thin light curve matters a lot too, as it is now showing a nice smooth hill. I don't know how to manipulate the thin-line curve other than by trial and error, but trying out different settings helped me hone in on my clock/voltage combo.

1.143v does not work and 1.162v does not work. 1.150v is the sweet spot and runs every time I try it.

Going to do a WoW raid and see if it holds up for the night and then maybe try to push that 2100 mark on the last remaining voltage bins I have free.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Benny89*
> 
> Why is that?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *I don't know if I will run SLI now.* Don't really want to go with a hassle of replacing PSU, again placing and connecting all that cables in my case etc. Too much pain.
> 
> I got nice card, will be still tweaking curve on it and see where it can land at max.
> 
> Then I will think about block.
> 
> Funny fact, I tried to push it to 2088 in ME:A but game freezes after about 40 min - hour. What is interesting for me is that this is first time I see game freeze, no crashing, no driver crash, nothing. Just freeze. Never had that, it was always crash. Interesting.


Awesome! One of the main reasons I have remained subscribed to this thread is Benny Watch, a show that follows Benny89 and his journey to seek maximum overclocks!


----------



## feznz

Quote:


> Originally Posted by *ilikelobstastoo*
> 
> sorry but this drives me crazy!! i see people that are buying cards left and right , talking about sli and such "my 5th card is .005% better than my 6th" but yet claiming a custom loop is too expensive!!! that just makes no sense at all to me. i guarantee you can have a custom loop for less than another 1080 ti.
> 
> If you want to save a few bucks you can make a loop like this for about 400 or so and just use one card.
> 
> purchase high quality expandable aio like a swiftech 240x or similar................................................ $150(usd)
> 
> purchase ek block (optional. backplate) ................................................................................................... $130(usd)
> 
> radiator like a hardwarelabs 280gts black ice......................................................................................... $65(usd)
> 
> purchase some primochill soft tubing, clamps, and some liquid like mayhems x1....................... $45(usd)
> 
> you can either use some static pressure fans you have laying around or get a couple .......................$25-50(usd)


Especially when it is a one-off purchase, not a case of throwing the whole loop out and starting again every time you get a new GPU.








Everyone has their own way of doing things. I haven't spent anything on watercooling in about 4 years, apart from a few dollars in coolant, because I've been using universal blocks for the GPUs.
Seriously thinking of SLIing; one card is just not quite enough power IMHO. OC = 5% gain, SLI = a minimum of 50%, maybe up to 95% gain with good scaling.
I don't care for people's opinions when it comes to SLI; I've personally never had issues. Apart from listening to others' opinions on PSUs: I went all out, got a 1500W, and never looked back.

As for the PSU, I always say: where is the minimum power draw recommendation for a PSU? There isn't one. If I want to draw 50W off a 1500W PSU, I can; it will just last a hell of a lot longer than drawing 750W off a 750W PSU.
People bag my universal blocks. All I've got to say is my backside is ugly, trust me....... but it gets the job done.








As for VRAM cooling, it's overrated; I had bare memory chips last time and never had a problem, and at least I have a frame covering them this time.
And VRMs, well, yes they need cooling. But if you push your card to the limit for gaming, you're going to have some roasting VRMs. Then again, they're rated for 105°C, and drawing 400W+ at max OC vs 275W for a mild one I think speaks for itself.

I would rather have 2 lazily OC'd cards vs 1 maxed out at 1.25v, maybe 2150MHz for gaming if you're lucky.

Guess the older I get, the more I have to talk to myself to get some great advice.


----------



## dunbheagan

Hey guys, has anybody found a way to undervolt below 793mV yet? Is there any way to apply settings like 1200MHz/700mV? I can apply 1000MHz/800mV or 1746MHz/793mV with Afterburner, but nothing below that voltage. I would really like to see the Afterburner curve go down to 650mV.

I found out that the program SetStablePowerState.exe locks the card at the base clock, which should have a voltage below 800mV, I think around 780mV. But it is Win10-only, and I would prefer a method that lets me choose clocks/voltage.

https://developer.nvidia.com/setstablepowerstateexe-%20disabling%20-gpu-boost-windows-10-getting-more-deterministic-timestamp-queries

Any ideas?


----------



## Benny89

Quote:


> Originally Posted by *SlimJ87D*
> 
> Awesome! One of the main reasons I have remained subscribed to this thread is Benny Watch, a show that follows Benny89 and his journey to seek maximum overclocks!


LOL







. Well, I wanted that SLI bad but if I can't run them even on stock







...


----------



## s1rrah

Quote:


> Originally Posted by *Benny89*
> 
> Could someone tell me what kind of heatsinks I should use if I want to install Kraken G12 + AIO on my STRIX? I have no idea how high, long etc they should be.
> 
> And I should only get heatsinks for VRAM or for seomething else? Thanks


I don't have a Kraken cooler ...

Mine is a MSI Seahawk X ... I used these Enzotech sinks on the VRAM heatspreader ... I like the Enzotech sinks ... they make low profile and regular ... I went with the regular and there is plenty of room to still put the shroud on:

...



...



...

They work great ... the air coming out of the back of the card, pushed by the standard "blower" fan, reads at 110F+ ... same as the air temp coming out of the radiator fan (which is set to exhaust) ...

Best of luck ...

joel


----------



## hypespazm

Do you guys think a 3930K bottlenecks the 1080 Ti? I have one and am curious.


----------



## TWiST2k

Quote:


> Originally Posted by *gmpotu*
> 
> Oh my goodness boys....
> 
> I hit [email protected] in SP4K !!! When I started this adventure I was only able to do [email protected] and it would downclock because of throttling all the way to 2025. I never tried XOC BIOS when I was starting out. From the help of all of you guys and all of the knowledge shared here, trial and error, and XOC Bios I've been able to improve so much. Thanks guys. This is a great community.
> 
> -Score 10173- Max 60C with a max 478W on the 12v rail according to Corsair Link. My fan curve topping at 90% (Loud but I use headphones so not too bad) System
> 
> I thought maybe this was a fluke but nope. Multiple runs no issues.
> 
> Will post picks of my curve later. It seems very apparent that that thin-light curve matters a lot too as it is now showing a nice smooth hill. I don't know how to manipulate the thin line curve really other than trial and error but trying out different settings helped me to hone in on my clock / voltage combo.
> 
> 1.143v does not work and 1.162v does not work. 1.150v is the sweet spot and runs every time I try it.
> 
> Going to do a WoW raid and see if it holds up for the night and then maybe try to push that 2100 mark on the last remaining voltage bins I have free.


Awesome, congrats man! What I find strange is the inconsistency in results. On one of the FTW3s I have right now, I get basically a solid 2012 the entire run on Superposition 4K and get around 10184, but you're rocking around 2088 and getting a similar score. Have you tried Heaven Extreme? Most of the time, if I even dare to think I am stable based on Superposition, I run Heaven Extreme for like 2 mins and realize I was dreaming lol.


----------



## Aph0ticShield

Could anyone help me with a small issue?









I've got a 1080TI FE Hybrid. I can't use the XOC bios because I'm on a triple monitor set up, so I'm on the FTW3 bios.

I set the max power draw to 358W, but the card is only drawing ~305W tops. Card never goes above 1.01V either. Temps are fine. Never more than 65c on furmark. Voltage is set to +100. GPU-Z shows perfcap reason as POWER.

Any tips would be appreciated!


----------



## gmpotu

Quote:


> Originally Posted by *TWiST2k*
> 
> Awesome congrats man! What I find strange is the inconsistency in results. On one of the FTW3s I have right now, I get basically a solid 2012 the entire run on superposition 4k and get around 10184. But your rocking around 2088 and getting a similar score. Have you tried heaven extreme? Most of the time if I even dare to think I am stable based on superposition, I run heave extreme for like 2 mins and realize I was dreaming lol.


I'm on a [email protected] for my CPU I would assume that's the big difference in our scores in SP4K.

Here is the picture to go along with my post from earlier.




Spoiler: Previous Post



Oh my goodness boys....

I hit [email protected] in SP4K !!! When I started this adventure I was only able to do [email protected] and it would downclock because of throttling all the way to 2025. I never tried XOC BIOS when I was starting out. From the help of all of you guys and all of the knowledge shared here, trial and error, and XOC Bios I've been able to improve so much. Thanks guys. This is a great community.

-Score 10173- Max 60C with a max 478W on the 12v rail according to Corsair Link. My fan curve topping at 90% (Loud but I use headphones so not too bad) System

I thought maybe this was a fluke but nope. Multiple runs no issues.

Will post picks of my curve later. It seems very apparent that that thin-light curve matters a lot too as it is now showing a nice smooth hill. I don't know how to manipulate the thin line curve really other than trial and error but trying out different settings helped me to hone in on my clock / voltage combo.

1.143v does not work and 1.162v does not work. 1.150v is the sweet spot and runs every time I try it.

Going to do a WoW raid and see if it holds up for the night and then maybe try to push that 2100 mark on the last remaining voltage bins I have free.


----------



## Nico67

Quote:


> Originally Posted by *Aph0ticShield*
> 
> Could anyone help me with a small issue?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've got a 1080TI FE Hybrid. I can't use the XOC bios because I'm on a triple monitor set up, so I'm on the FTW3 bios.
> 
> I set the max power draw to 358W, but the card is only drawing ~305W tops. Card never goes above 1.01V either. Temps are fine. Never more than 65c on furmark. Voltage is set to +100. GPU-Z shows perfcap reason as POWER.
> 
> Any tips would be appreciated!


Sounds about right; 305-314W is all I would expect to see. You'll never get the full 358W unlimited; it's just that you might get a slightly better score if you're lucky. As for 1.01v: do you mean 1.093 or 1.012? You should be able to see up to 1.050v as long as the curve allows it.


----------



## djriful

Quote:


> Originally Posted by *hypespazm*
> 
> would you guys think that a 3930K bottlenecks the 1080TI I have one and am curious


I am in the 3DMark 95% score range, so not much of a bottleneck, but it depends on the games also. 4.7GHz.


----------



## Aph0ticShield

Quote:


> Originally Posted by *Nico67*
> 
> Sounds about right, 305-314w is all I would expect to see. You'll never get the full 358w unlimited, its just you might get a slightly better score if your lucky. As for 1.01v? do you mean 1.093 or 1.012, as you should be able to see up to 1.050v as long as the curve allows it.


Thanks for the response!

Yes, 1.012V is the max voltage it will crank out due to the power limit (Using Furmark though, so take that with a grain of salt)

I don't quite get it. Because when the limit is set to 280W it averages 278W, 290W it averages 288W, 300W averages 298W, 310W averages 300W, 320W averages 295W.

What limit is it hitting if I'm setting the power limit so high? Clearly something else is getting in the way...


----------



## KedarWolf

Quote:


> Originally Posted by *Aph0ticShield*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nico67*
> 
> Sounds about right, 305-314w is all I would expect to see. You'll never get the full 358w unlimited, its just you might get a slightly better score if your lucky. As for 1.01v? do you mean 1.093 or 1.012, as you should be able to see up to 1.050v as long as the curve allows it.
> 
> 
> 
> Thanks for the response!
> 
> Yes, 1.012V is the max voltage it will crank out due to the power limit (Using Furmark though, so take that with a grain of salt)
> 
> I don't quite get it. Because when the limit is set to 280W it averages 278W, 290W it averages 288W, 300W averages 298W, 310W averages 300W, 320W averages 295W.
> 
> What limit is it hitting if I'm setting the power limit so high? Clearly something else is getting in the way...
Click to expand...

When I ran custom Time Spy settings I was getting up to 350W on Strix OC BIOS and it would hit 330+ often.

These are the settings I used. No G-Sync though.


----------



## KedarWolf

This is XOC BIOS at 2088 1.100v no shunt mod Fire Strike Ultra stress test.

XOCPowerLimit.txt 87k .txt file


Power draw upwards of 390w.

If you're using XOC BIOS higher than 1.100v you might want to run in admin command prompt.

"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power -l 1

(quotation marks included).

I think you want to keep it drawing under 400w.
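If you'd rather not eyeball the raw `-q -d power` dump, the same reading can be polled and checked against that 400W ceiling. A minimal sketch, assuming `nvidia-smi` is on the PATH; the helper names here are mine, not part of any NVIDIA tool (`power.draw` is a documented `--query-gpu` field):

```python
import subprocess

def parse_power_w(text: str) -> float:
    """Parse one nvidia-smi CSV power reading, e.g. '389.52' -> 389.52 watts."""
    return float(text.strip())

def read_board_power() -> float:
    """Query the current board power draw of GPU 0 via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_power_w(out.splitlines()[0])

# Usage on a machine with the card installed: poll read_board_power() in a
# loop during the benchmark and back off the curve if it climbs past ~400 W.
```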


----------



## ViRuS2k

Quote:


> Originally Posted by *KedarWolf*
> 
> This is XOC BIOS at 2088 1.100v no shunt mod Fire Strike Ultra stress test.
> 
> XOCPowerLimit.txt 87k .txt file
> 
> 
> Power draw upwards of 390w.
> 
> If you're using XOC BIOS higher than 1.100v you might want to run in admin command prompt.
> 
> "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power -l 1
> 
> with the beginning quotation.
> 
> I think you want to keep it drawing under 400w.


I'm doing the shunt resistor mod today








and adding CLU to my GPUs .....

Guess you're not doing it this weekend...
I could really use that guide lol


----------



## KedarWolf

Quote:


> Originally Posted by *ViRuS2k*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> This is XOC BIOS at 2088 1.100v no shunt mod Fire Strike Ultra stress test.
> 
> XOCPowerLimit.txt 87k .txt file
> 
> 
> Power draw upwards of 390w.
> 
> If you're using XOC BIOS higher than 1.100v you might want to run in admin command prompt.
> 
> "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power -l 1
> 
> with the beginning quotation.
> 
> I think you want to keep it drawing under 400w.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Im doing the shunt resister mod today
> 
> 
> 
> 
> 
> 
> 
> 
> and adding clu to my gpus .....
> 
> guess your not doing it this weekend...
> really could use that guide lol
Click to expand...

You want to do CLU OR resistor mod, not both.









I'm doing resistor mod tomorrow or Sunday at the latest. With step by step guide.
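While waiting on a proper guide, the electrical idea behind the shunt mod is just two resistors in parallel: whatever conductive material you stack on top of the stock shunt lowers the resistance the card senses, so it under-reports current and therefore power. A rough sketch of the arithmetic (the 5 mΩ stock value is a commonly quoted figure for these cards, not something measured here):

```python
def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def reported_power_scale(stock_shunt: float, modded_shunt: float) -> float:
    """The card reads current from the voltage drop across the shunt, so its
    power reading scales by modded/stock resistance for the same true load."""
    return modded_shunt / stock_shunt

stock = 0.005                     # 5 mOhm stock shunt (assumed value)
modded = parallel(stock, stock)   # stacking an identical shunt on top
print(round(modded, 6))                          # 0.0025: card senses half the current
print(round(reported_power_scale(stock, modded), 6))  # 0.5: a true 400 W reads as ~200 W
```

This is also why the mod defeats the power limit entirely rather than raising it by a known amount: the card's telemetry is simply wrong afterwards, so real draw has to be checked at the wall.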


----------



## Aph0ticShield

Quote:


> Originally Posted by *KedarWolf*
> 
> When I ran custom Time Spy settings I was getting up to 350W on Strix OC BIOS and it would hit 330+ often.
> 
> These are the settings I used. No G-Sync though.


Weird. I even reinstalled the BIOS and driver again! Definitely still getting power-limited at 300W.


----------



## Nico67

Quote:


> Originally Posted by *Aph0ticShield*
> 
> Thanks for the response!
> 
> Yes, 1.012V is the max voltage it will crank out due to the power limit (Using Furmark though, so take that with a grain of salt)
> 
> I don't quite get it. Because when the limit is set to 280W it averages 278W, 290W it averages 288W, 300W averages 298W, 310W averages 300W, 320W averages 295W.
> 
> What limit is it hitting if I'm setting the power limit so high? Clearly something else is getting in the way...


It's called normalized TDP, and while, as Kedar said, you might draw more power, the amount varies from card to card. More importantly, the card is always limiting itself, trying to maintain this normalized TDP.

Quote:


> Originally Posted by *Aph0ticShield*
> 
> Weird. I even reinstalled the bios and driver again! Definitely still getting powerlimited at 300W.


That's what I always found with every BIOS other than XOC. Nobody has ever shown more power being drawn without the power limit being triggered, other than on XOC. I believe if they did, it would be around the 305W mark.
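The slider arithmetic behind that normalized-TDP behaviour can be sketched like this. A hedged illustration only: the 250 W default and 120% ceiling are assumptions about FE-style defaults, not verified specs for any particular BIOS:

```python
def power_limit_watts(default_tdp_w: float, slider_percent: float) -> float:
    """Convert a power-slider percentage into the watt cap the card enforces."""
    return default_tdp_w * slider_percent / 100.0

# Illustrative numbers (assumed, not measured): a 250 W default TDP with the
# slider maxed at 120% caps the board at 300 W, no matter what a flashed
# BIOS nominally advertises (e.g. 358 W).
print(power_limit_watts(250, 120))   # 300.0
```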


----------



## Aph0ticShield

Quote:


> Originally Posted by *Nico67*
> 
> Its called normalized TDP, and while as Kedar said you might draw more power, the amount varies from card to card. More importantly its always limited and trying to maintain this Normalized TDP.
> that's what I always found with every bios other than XOC. Nobody has ever shown more power being drawn with no power limit triggered other than XOC. I believe if they did it would be around the 305w mark.


So does the actual FTW3 go up to 358W proper?

Yeah, the XOC bios works scary awesome. Was hitting 380W+ for a minute before I realized how much it was pumping out.







Too bad it's only dual displayport. Who makes a card with only two displayport modules?


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KedarWolf*
> 
> When I ran custom Time Spy settings I was getting up to 350W on Strix OC BIOS and it would hit 330+ often.
> 
> These are the settings I used. No G-Sync though.


I don't think people want to have to run custom settings to access more power. I think they want it when gaming and normal use.


----------



## tagaxxl

Quote:


> Originally Posted by *Luckbad*
> 
> Download FireStorm, go to Spectra, click the down arrow to the right of the LED settings, turn off fan freeze.
> 
> That'll let you control fan speed without revving in Afterburner.


Can't find that option.


----------



## Luckbad

Quote:


> Originally Posted by *tagaxxl*
> 
> cant find that option


Fan Freeze Setting


----------



## asdkj1740

ASUS ROG Poseidon GTX 1080 Ti Platinum Edition (Air cooling ONLY) Review
http://yujihw.com/review/asus-rog-poseidon-gtx-1080-ti-platinum-edition

The stock air cooler really sucks.
The last-gen 980 Ti version has three copper heatpipes going into the middle of the fins; this time the new design has only two heatpipes running beneath the fins.


----------



## KedarWolf




----------



## KingEngineRevUp

Quote:


> Originally Posted by *asdkj1740*
> 
> ASUS ROG Poseidon GTX 1080 Ti Platinum Edition (Air cooling ONLY) Review
> http://yujihw.com/review/asus-rog-poseidon-gtx-1080-ti-platinum-edition
> 
> the stock air cooler really sucks
> the last gen 980ti version has three copper heatpipes go into the middle of the fins, this time the new design has only two heatpipes going beneath the fins.


I don't know why anyone would buy it for air cooling.


----------



## OZrevhead

Guys, does anyone have any information on when we might see a Ti Classified and Kingpin for sale?


----------



## ViRuS2k

Quote:


> Originally Posted by *KedarWolf*
> 
> You want to do CLU OR resistor mod, not both.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm doing resistor mod tomorrow or Sunday at the latest. With step by step guide.


It will be the resistor mod that I do








that's why I need a guide









I will be using thermal paste in the middle of the resistor and Arctic Silver metal epoxy paste on the sides







I'd still like a guide to follow lol


----------



## feznz

Quote:


> Originally Posted by *asdkj1740*
> 
> ASUS ROG Poseidon GTX 1080 Ti Platinum Edition (Air cooling ONLY) Review
> http://yujihw.com/review/asus-rog-poseidon-gtx-1080-ti-platinum-edition
> 
> the stock air cooler really sucks
> the last gen 980ti version has three copper heatpipes go into the middle of the fins, this time the new design has only two heatpipes going beneath the fins.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ]


I can't believe how long these have taken to be released/developed. It's essentially the same PCB, so really the cooler itself is the only new part, and that could have been designed a long time ago. I would have got one over the Strix if they had been available at the time. The pricing isn't too bad either: almost identical to buying an EK full-cover block plus the GPU.


----------



## Aganor

Yesterday I did a Heaven benchmark marathon of about 1h and then 30 minutes of Witcher 3 gameplay.
My GPU temp went up to 50ºC and CPU to 60ºC.
The air coming off the radiators was really warm, so it seems everything is installed correctly.

My room temperature is still unknown; I don't know if the System Temp is a good measure, since it goes up to 30ºC and starts at about 28ºC.

Also, my card uses up to 1.093v @ 2050MHz.

I managed to get a max offset of +175 core and +735 mem.
On the stock cooler it was +160 core and +600 mem.

I was expecting more from the core, but at +180 it freezes the system when running Heaven.

Any thoughts?


----------



## smonkie

I've noticed that after several runs of Heaven, my MSI 1080 Ti gets noticeably worse results even when the GPU temp doesn't rise above 60º (100% fan). Could it be due to other components' temps, or is it just some kind of software or even hardware bug?


----------



## Aganor

Quote:


> Originally Posted by *smonkie*
> 
> I've noticed that after several runs of Heaven my MSI 1080Ti gets noticiable worse results even when GPU Temp doesn't rise above 60º (100% fan). Could it be due to other components' temps, or is it just any kind of software or even hardware bug?


I notice this as well; it seems to be something in Heaven, as the 2nd pass is worse every time.


----------



## Unnatural

Quote:


> Originally Posted by *ViRuS2k*
> 
> will be using thermal paste in the middle of the resister and artic silver metal epoxy paste on the sides
> 
> 
> 
> 
> 
> 
> 
> still would like a guide to follow lol


May I ask which product, exactly? The only adhesives listed on Arctic Silver's site have low to no electrical conductivity.


----------



## ViRuS2k

Quote:


> Originally Posted by *Unnatural*
> 
> May I ask which product, exactly? The only adhesives listed on Arctic Silver's site have low to none electrical conductivity


https://www.amazon.co.uk/gp/product/B01I76H3HQ/ref=oh_aui_detailpage_o04_s00?ie=UTF8&psc=1

Arctic Silver Thermal Adhesive/Glue/Epoxy 7g (2x 3.5g Tubes) (ASTA-7G) Artic

Made with 99.8% pure micronized silver. 62% to 65% silver content by weight. Superior thermal performance.
-40C to >150C (Bond strength is slightly weakened at temperatures below 0C due to crystallization.)
With its unique high-density filling of micronized silver and enhanced thermally conductive ceramic particles, Arctic Silver 5 provides a new level of performance and stability.
Arctic Silver Thermal Adhesive is a permanent adhesive. Components you attach with Arctic Silver Thermal Adhesive will stay attached forever.


----------



## Unnatural

Quote:


> Originally Posted by *ViRuS2k*
> 
> https://www.amazon.co.uk/gp/product/B01I76H3HQ/ref=oh_aui_detailpage_o04_s00?ie=UTF8&psc=1
> 
> Arctic Silver Thermal Adhesive/Glue/Epoxy 7g (2x 3.5g Tubes) (ASTA-7G) Artic


Product description:
Negligible Electrical Conductivity: Arctic Silver Thermal Adhesive was formulated to conduct heat, not electricity.
NOTE: Even though Arctic Silver Thermal Adhesive is specifically engineered for high electrical resistance, it should be kept away from electrical traces, pins, and leads.
The cured adhesive is slightly capacitive and could potentially cause problems if it bridged two close-proximity electrical paths.

I'm afraid it's not the right stuff.


----------



## Hulio225

Quote:


> Originally Posted by *ViRuS2k*
> 
> https://www.amazon.co.uk/gp/product/B01I76H3HQ/ref=oh_aui_detailpage_o04_s00?ie=UTF8&psc=1
> 
> Arctic Silver Thermal Adhesive/Glue/Epoxy 7g (2x 3.5g Tubes) (ASTA-7G) Artic
> 
> Made with 99.8% pure micronized silver. 62% to 65% silver content by weight. Superior thermal performance.
> 40C to >150C (Bond strength is slightly weakened at temperatures below 0C due to crystallization.)
> With its unique high-density filling of micronized silver and enhanced thermally conductive ceramic particles, Arctic Silver 5 provides a new level of performance and stability.
> Arctic Silver Thermal Adhesive is a permanent adhesive. Components you attach with Arctic Silver Thermal Adhesive will stay attached forever.


http://www.arcticsilver.com/arctic_silver_thermal_adhesive.htm

-> Negligible Electrical Conductivity:
Arctic Silver Thermal Adhesive was formulated to conduct heat, not electricity. NOTE: Even though Arctic Silver Thermal Adhesive is specifically engineered for high electrical resistance, it should be kept away from electrical traces, pins, and leads. The cured adhesive is slightly capacitive and could potentially cause problems if it bridged two close-proximity electrical paths.

TBH, I would use something else... this might not have enough conductivity.


----------



## ViRuS2k

Quote:


> Originally Posted by *Unnatural*
> 
> Product description:
> Negligible Electrical Conductivity: Arctic Silver Thermal Adhesive was formulated to conduct heat, not electricity.
> NOTE: Even though Arctic Silver Thermal Adhesive is specifically engineered for high electrical resistance, it should be kept away from electrical traces, pins, and leads.
> The cured adhesive is slightly capacitive and could potentially cause problems if it bridged two close-proximity electrical paths.
> 
> I'm afraid it's not the right stuff.


I'm doing the resistor mod, so all I need is something to hold the IC in place....
To be honest, at this rate I probably won't lol. Need a guide.

the stuff i currently have is :

Thermal Grizzly Conductonaut Liquid Metal Thermal Paste - 1g
Conducting Silver conductive silver paint 3 grams Ideal for repairing boards, window alarm loops and window heaters
Arctic Silver Thermal Adhesive/Glue/Epoxy 7g (2x 3.5g Tubes) (ASTA-7G) Artic
MG Chemicals Translucent Silicone Grease, 85 ml Tube
Starbrite Liquid Tape Wire Coating BLACK
MX4 thermal paste
Coollab CLU metal paste

To be honest, there is not much space to play with lol. So if I put the liquid metal on the sides and the non-conductive thermal paste in the middle, then I'm not really sure what I'm going to use to make the IC stick lol

----

NOTE: It's possible I could put Arctic epoxy in the center of the IC and liquid metal on the corners, but I'm not sure that would hold a strong enough bond and not fall off. Hmmm


----------



## Unnatural

Quote:


> Originally Posted by *ViRuS2k*
> 
> To be honest there is not much space to play with lol so if i put the liquid metal on the sides and the none electric thermal paste in the middle, then im not really sure what im going to use to make the ic stick lol


You could use the Arctic adhesive in the center (assuming it can be spread in a very thin layer) and the conductive paint on the sides, then.
Or the other way around: some conductive adhesive, like the kind for repairing a car's defogger, on the sides, and normal non-conductive thermal paste in the center.
There are a couple of good posts about this mod a few pages back (also quoted in the shunt mod thread).

EDIT: lol, I just noticed you were quoted on those same posts


----------



## Nico67

Quote:


> Originally Posted by *Aph0ticShield*
> 
> So does the actual FTW3 go up to 358W proper?
> 
> Yeah, the XOC bios works scary awesome. Was hitting 380W+ for a minute before I realized how much it was pumping out.
> 
> 
> 
> 
> 
> 
> 
> Too bad it's only dual displayport. Who makes a card with only two displayport modules?


Yeah, the actual EVGA card will go up to 358W due to its circuitry, whereas the FE can run that BIOS but has hardware differences that seem to limit its ability to use the full amount.

Quote:


> Originally Posted by *smonkie*
> 
> I've noticed that after several runs of Heaven my MSI 1080Ti gets noticiable worse results even when GPU Temp doesn't rise above 60º (100% fan). Could it be due to other components' temps, or is it just any kind of software or even hardware bug?


Could be due to heat soak: you may be starting in, or spending more time in, different curves. If, as some are saying, different temp bands run their own curves, then you may have to optimise the curve at each running temp. That would be particularly painful to achieve, but not entirely impossible.

What I think the next step in testing needs to be is saving pics of the curve at set temp intervals, then modifying each curve and applying it. I would assume it doesn't change the previous curves, and you could build up a fully temp-optimised profile in AB. It would be a more prevalent issue on air cooling, as the idle-to-load delta spans many more temp steps, compared to water cooling which spans only a few; that would explain why some people aren't seeing these issues.


----------



## macwin2012

Hi ,

What Nvidia Drivers are you guys using ?

It seems new drivers 382.33 are unstable in Games with Overclock .


----------



## smonkie

Quote:


> Originally Posted by *Nico67*
> 
> Yeah the actual EVGA card would go up to 358w due to its circuitry, where the FE can run it but has hardware differences that seem to impact its ability to use the full amount.
> Could be due to heat soak, and you are starting or spending more time in different curves. If as some are saying different temp bands run their own curves then you may have to optimise curves at different running temps. That would be particularly painful to achieve, but not entirely impossible.
> 
> What I think the next step in testing needs to be is saving pics of curves at set temp intervals, then modifying each curve and applying it. I would assume it doesn't change the previous curve and you could build up a fully temp optimised profile in AB. It would be a more prevalent issue on air cooling as the idle to load delta would be over many more temp steps, compared to water cooling which is only over a few steps and would explain why some aren't seeing some of these issues.


That makes sense, because I use air cooling only. I didn't have this issue in previous GPU generations though. It sucks.


----------



## PasK1234Xw

Quote:


> Originally Posted by *macwin2012*
> 
> Hi ,
> 
> What Nvidia Drivers are you guys using ?
> 
> It seems new drivers 382.33 are unstable in Games with Overclock .


Just because your card isn't stable doesn't make this a driver issue


----------



## macwin2012

Quote:


> Originally Posted by *PasK1234Xw*
> 
> Just because your card not stable don't make this a driver issue


Edit: I found the issue. Never, ever use NVIDIA G-Sync in "Fullscreen + Windowed" mode.

I switched to G-Sync "Fullscreen only" and it fixed the issue.

For benchmarking I turn off G-Sync completely.


----------



## cluster71

How do you handle the summer? It's over 30°C in Sweden now, and tomorrow will be warmer. Do you downclock, or do you have air conditioning?


----------



## hypespazm

Quote:


> Originally Posted by *djriful*
> 
> I am in the 3DMark 95% score range. So no bottleneck much but depends on the games also. 4.7Ghz


Yeah, I ran 3DMark and had the same result, and I'm at 4.6GHz.

Thanks man, that gives me confidence. What are people getting on 1080 Tis as far as overclocks? I haven't overclocked seriously since the 780! Are people getting overclocks as aggressive?

I remember going from 900MHz to 1250MHz... My 1080 Ti FTW does 1911MHz out of the box


----------



## Hulio225

Quote:


> Originally Posted by *hypespazm*
> 
> Yeah I ran 3dmark and had the same result. as well as I am at 4.6Ghz.
> 
> 
> 
> 
> 
> 
> 
> thanks man give me confidence.. what are people getting on 1080 TI's as far as overclock's I have really overclocked seriously since the 780! are people getting as aggressive overclocks?
> 
> I remember going from 900Mhz to 1250Mhz... My 1080 ti FTW is making *1911 Mhz out of the box*


1911 is the standard GPU Boost 3.0 clock out of the box; every non-factory-OC'ed 1080 Ti will do that.

Edit: In terms of bottlenecking, every CPU will bottleneck the 1080 Ti at resolutions up to 1080p. Everything above that gets less and less bottlenecked, to the point where the CPU makes almost no difference, as long as it is a decent CPU from the last decade.


----------



## macwin2012

My GTX 1080 Ti AMP Extreme's fan randomly spins up to 100% every 4-5 minutes. Is there a way to fix this?


----------



## Luckbad

@macwin2012
Quote:


> Originally Posted by *Luckbad*
> 
> Okay, finally got this thing figured out. There's a hard-to-find arrow that brings you to a place where you can disable the 0% fan speed functionality.
> 
> You can only do this in Zotac Firestorm, which you'll need for this as well as changing or turning off the LEDs on the physical card.
> 
> To solve the issue
> 1) Click "Spectra"
> 2) Click the little down arrow on the right side of that UI
> 3) Disable the Fan Stop Setting (aka FREEZE Fan Stop)
> 
> Now, you can manipulate the fan speeds in your preferred software without getting the blast of 100% fan speed every time the card exceeds 45 C.


----------



## cluster71

As described earlier: install Firestorm, select Spectra, click the arrow to the right, and set Fan Stop to OFF. Then you can use MSI Afterburner if you want.


----------



## macwin2012

Quote:


> Originally Posted by *Luckbad*
> 
> @macwin2012


Thank you so much !!!

That option was kept so hidden


----------



## c0ld

Anyone have a link for the latest Afterburner? I had 4.4.0 Beta 6 last, I believe. Is there a newer one?


----------



## cluster71

For me, Firestorm did not work properly at first. Every time I started Firestorm, the layout changed. Sometimes I could access the Spectra menu, sometimes not. Sometimes there was no LED control at all. It was as if I did not have full permissions to control the graphics card. I reset the GPU BIOS and it worked once. I changed the security settings for the program and was in contact with Zotac support. Some of the problems appear to be due to the fact that the program is not digitally signed.


----------



## cluster71

Paste the link into the browser's address bar: http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta9.rar


----------



## c0ld

Quote:


> Originally Posted by *cluster71*
> 
> Insert the link in the search window. http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta9.rar


Thank you!


----------



## KedarWolf

Quote:


> Originally Posted by *c0ld*
> 
> Anyone link for the latest Afterburner. I had 4.40 beta 6 last I believe. Is there a newer one?


It's in the OP too, under the unlocked voltage slider section and the section on how to flash a different BIOS.


----------



## ELIAS-EH

Guys, a question please.
What is the equivalent of the OC Mode profile from ASUS GPU Tweak II for the Strix 1080 Ti when set in MSI Afterburner? I want to use only Afterburner.
Does OC Mode in GPU Tweak II equal +25 core and +90 memory in MSI Afterburner? And what about the power limit in %?
And do I need to touch the voltage? I don't want to OC beyond the OC Mode preset in the BIOS.


----------



## Benny89

So far, very pleased with this Strix. 2063 in games, no problem. Ambient temp is 26C and the card holds 64C in ME:A, no problem.

Can do 2088 below 50C.

Very nice chip, but it is all about temps, sadly. And summer is coming.


----------



## TheBoom

Quote:


> Originally Posted by *JimmyMo*
> 
> You're welcome, and thank YOU!
> 
> Do you mean this sticker, labeled 'N471' on my unit?
> 
> 
> 
> Not sure, it's not the serial number, so I am guessing that it's a production run tracking number, or something similar.
> 
> What number does yours have on it?
> 
> -jm


No actually I saw a pic on Guru3d's review of it and there was a sticker under the backplate on the rear of the card itself. I was wondering if that was a warranty seal.
Quote:


> Originally Posted by *Benny89*
> 
> So far so good. Playing ULTRA modded ME:A at 2063/6000. 60% Fan speed to keep it around 63.
> Well, on YT you can find people that used it (G10 and G12) on their FE cards and are happy with results. But of course you are right that water loop is better. No doubt. But it costs a lot more...
> 
> Although having small loop for just your GPU seems really nice
> 
> 
> 
> 
> 
> 
> 
> . It would look strange but hey, at least no "yet another same looking loop"
> 
> 
> 
> 
> 
> 
> 
> . I will consider that.


Where do you get mods for ME:A? Nexus?


----------



## Benny89

Quote:


> Originally Posted by *TheBoom*
> 
> No actually I saw a pic on Guru3d's review of it and there was a sticker under the backplate on the rear of the card itself. I was wondering if that was a warranty seal.
> Where do you get mods for ME:A? Nexus?


Yes, Nexus. I recommend this one: *Cinematic Realism Lighting Overhaul with Depth-of-Field*. It has a small bug with the cursor, but the game looks SO much more beautiful than the vanilla game. Even more than on an HDR TV. I mean, who needs an HDR monitor if you can get HDR effects in-game.

It impacts performance heavily, eating up to 20 fps. But it's worth it, imo.


----------



## TheBoom

Quote:


> Originally Posted by *Benny89*
> 
> Yes, Nexus. I recommend this one: *Cinematic Realism Lighting Overhaul with Depth-of-Field* It has small bug with cursor but game looks so MUCH more beautiful than normal game. Even more than on HDR TV. I mean who need HDR monitor if you can make HDR effects in game
> 
> 
> 
> 
> 
> 
> 
> .
> 
> It impacts heavy on performance, even eating up to 20 fps. But its worth imo.


Yes I was looking at that, but the depth of field seems a bit too much? Unless you can disable that and just keep the cinematic lighting.

Thanks anyway, repped.


----------



## Benny89

I love the sleek design of this card. Not too many LEDs in your face, not too much fancy design: nice and elegant.


----------



## feznz

Quote:


> Originally Posted by *ELIAS-EH*
> 
> Guys question please
> What is the profile of oc mode in asus tweak 2 for strix 1080ti to be use in msi afterburner
> I want to use only afterburner
> Means that the oc mode in asus tweak 2 is +25 core and + 90 on the memory in msi afterburner? And what about power limit in %?
> and should i need to touch the voltage? I don't want to oc beyond oc mode in asus tweak 2 present in the bios


That will be an easy OC, i.e. every single Strix will do it.
But yes, you are right: you don't need to touch the voltage, and the power limit can be slid up to 120%, which will prevent any power-limit caps at +25/+90.
My Strix will sit around 112% with the core @ 2038 while gaming, effectively +100 core / +4000 mem, but I am on water too.


----------



## TheBoom

Quote:


> Originally Posted by *feznz*
> 
> that will be a easy OC ie every single Strix will do that,
> but yes you are right you don't need to touch the voltage and the power limit could be slid up to 120% that will prevent any performance caps and +25/+90
> My strix will sit around 112% with core @ 2038 while gaming effectively +100core +4000mem but I am on water too


Wow +4000 mem. Teach me master







.


----------



## dunbheagan

Quote:


> Originally Posted by *dunbheagan*
> 
> Hey Guys, anybody already found a way to undervolt below 793mV? Is there any way to apply settings like 1200MHz/700mV or else? I can apply settings 1000MHz/800mV or 1746MHz/793mV with Afterburner, but nothing below that voltage. I would really like to see the afterburner curve go down to 650mV.
> 
> I found out, that the programm SetStablePowerState.exe locks the card at the base clock, which should have a voltage below 800mV, i think about 780mV. But it is Win10 only, and i would prefere a method which lets me chose clocks/voltage.
> 
> https://developer.nvidia.com/setstablepowerstateexe-%20disabling%20-gpu-boost-windows-10-getting-more-deterministic-timestamp-queries
> 
> Any ideas?


Sorry for the repost, but does anybody have an idea how to undervolt below 800mV?


----------



## cluster71

I do not know what the limit is. I ran a Superposition test at a solid 2114 core / 11800 mem at 1.093V, with the 7700K at 5300MHz, for a 10147 score. Does not feel overwhelming.


----------



## KedarWolf

Quote:


> Originally Posted by *ViRuS2k*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> This is XOC BIOS at 2088 1.100v no shunt mod Fire Strike Ultra stress test.
> 
> XOCPowerLimit.txt 87k .txt file
> 
> 
> Power draw upwards of 390w.
> 
> If you're using XOC BIOS higher than 1.100v you might want to run in admin command prompt.
> 
> "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power -l 1
> 
> with the beginning quotation.
> 
> I think you want to keep it drawing under 400w.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Im doing the shunt resister mod today
> 
> 
> 
> 
> 
> 
> 
> 
> and adding clu to my gpus .....
> 
> guess your not doing it this weekend...
> really could use that guide lol
Click to expand...

I tried doing the resistor mod.

I don't have the hand/eye coordination to do it; they are really small and hard to put on. The defogger silver epoxy dries so quickly.
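As an aside, the `nvidia-smi -q -d power` advice quoted above can be scripted so you don't have to eyeball the readout. A minimal sketch, assuming the typical "Power Draw : NNN.NN W" line format (verify against your own driver's actual output before relying on it):

```shell
#!/bin/sh
# Sketch: pull the wattage out of an `nvidia-smi -q -d power` report and
# flag draws over the ~400 W guideline mentioned above.
# The sample line below stands in for real nvidia-smi output (assumption).
sample='        Power Draw                  : 389.73 W'

# Split on ": ", take the value field, then drop the trailing "W".
watts=$(printf '%s\n' "$sample" | awk -F': ' '/Power Draw/ {print $2}' | awk '{print $1}')
echo "Current draw: ${watts} W"

# Numeric comparison against the 400 W guideline.
if awk -v w="$watts" 'BEGIN { exit !(w > 400) }'; then
    echo "Over 400 W!"
else
    echo "Within limit"
fi
```

On a live system you would replace `sample` with the output of `"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power` (or plain `nvidia-smi` on Linux); the parsing is the same.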


----------



## ViRuS2k

Quote:


> Originally Posted by *KedarWolf*
> 
> i tried doing the resistor mod.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't hand the hand/eye coordination to do it, they are really small and hard to put on. The defogger silver epoxy dries so quick.


I guess you will be doing the CLU method then







I told you those things are like little midgets lol


----------



## KedarWolf

Quote:


> Originally Posted by *ViRuS2k*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> i tried doing the resistor mod.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't hand the hand/eye coordination to do it, they are really small and hard to put on. The defogger silver epoxy dries so quick.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I guess you will be doing the CLU method then
> 
> 
> 
> 
> 
> 
> 
> I told you those things are like little midgets lol
Click to expand...

I think I'm just going to stick with the XOC BIOS.

I got the second-highest single-card 1080 Ti Fire Strike score with it on overclock.net, I think; only a 6950X is beating me out.

I have a 5960X.

Edit: My card sits vertically, which is why.


----------



## gmpotu

Quote:


> Originally Posted by *Benny89*
> 
> I love sleek design of this card. Not too many LEDs in your face, not too much fancy design, nice and elegant:


Wow your system looks so nice Benny. I really like the white lighting shining on the red.


----------



## Benny89

Quote:


> Originally Posted by *gmpotu*
> 
> Wow your system looks so nice Benny. I really like the white lighting shining on the red.


Thanks! Appreciate it! It looks better in real life, but my crappy smartphone camera can only do so much...

I love white light in a red-black setup, since it makes the red stand out more and much more visible.

Can someone tell me if this is good combo for small GPU water loop?

https://www.ekwb.com/shop/ek-res-x3-110 + https://www.ekwb.com/shop/ek-xtop-revo-d5-pwm-incl-pump + https://www.ekwb.com/shop/ek-coolstream-ce-140-single ?


----------



## Warrior1986

Hopefully someone can help me figure out what I'm doing wrong. I just installed my EVGA 1080 Ti FE card with the EKWB block and it's running flawlessly. Wanted to start overclocking as the amount of thermal headroom I have now is seriously absurd.

I followed the instructions to a T for Afterburner, and no matter what I do the voltage will not go above 1.043V; the slider doesn't seem to be doing anything. I'm sure this is why I'm blocked from going any further with my OC. I thought the voltage cap was 1.093V?


----------



## Nico67

Quote:


> Originally Posted by *Benny89*
> 
> Thanks! Appreciated it! It looks better in real live, but my crappy smartphone camera can only do that much...
> 
> I love white light in red-black setup since it gets that red stand out more and make it much more visible.
> 
> Can someone tell me if this is good combo for small GPU water loop?
> 
> https://www.ekwb.com/shop/ek-res-x3-110 + https://www.ekwb.com/shop/ek-xtop-revo-d5-pwm-incl-pump + https://www.ekwb.com/shop/ek-coolstream-ce-140-single ?


Depends how much room you have, but I'd go for a 240 rad and a pump/res combo to keep things tidier. Possibly even grab a Swiftech 240X, which has the pump and rad with a res built in, and just not use the CPU block. Newer kits may need disassembling and new hose, but it's a tidy way to do it









http://www.swiftech.com/h220x2.aspx#tab3


----------



## Coopiklaani

Quote:


> Originally Posted by *Warrior1986*
> 
> Hopefully someone can help me figure out what I'm doing wrong. I just installed my EVGA 1080 Ti FE card with the EKWB block and it's running flawlessly. Wanted to start overclocking as the amount of thermal headroom I have now is seriously absurd.
> 
> I followed the instructions to the T for Afterburner, and no matter what I do it seems the voltage will not go above 1.043mV. The slider doesn't seem to be doing anything. I'm sure this is why I'm getting blocked in going any further with my OC. I thought the voltage cap was 1.09mV?


You need to tune the F-V curve manually. You would need 80C+ on the GPU to get 1.093V without playing with the curve.


----------



## ilikelobstastoo

Quote:


> Originally Posted by *Nico67*
> 
> Depends how much room you have, but I go for a 240 rad and a pump res combo to keep things tidier. possible even grab a swiftech 240x which has pump / rad with res built in combo and just not use the CPU block. Newer kits may need disassembling and new hose, but its a tidy way to do it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.swiftech.com/h220x2.aspx#tab3


Yeah, I'd definitely suggest the Swiftech AIO route. Very easy to expand to your GPU. I just don't like their latest model after the 240X; imho the thing is too gaudy-looking now, but that comes down to personal preference. You can still maybe find the older 240X on eBay or something. My 240X has been running for like 3 years without a single issue.


----------



## feznz

Quote:


> Originally Posted by *TheBoom*
> 
> Wow +4000 mem. Teach me master
> 
> 
> 
> 
> 
> 
> 
> .


First you cool the pot with LN2, then cool the pot further with liquid He; you just have to be careful to "warm the pot up" with LN2 to avoid cracking the pot.
Or have a morning coffee to avoid the morning jitters








Quote:


> Originally Posted by *dunbheagan*
> 
> Sorry for the repost, but has anybody an idea how to undervolt below 800mV?


Have to ask why...

Quote:


> Originally Posted by *cluster71*
> 
> I do not know what limits. Run a test on Superposition on solid 2114 core 11800 mem 1.093 v 7700K 5300 MHz with 10147 score. Does not feel overwhelming.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> ]


Did you apply the tweaks detailed here? Be sure to submit. I might do a suicide run later, when the weather permits, and check out the XOC BIOS for myself, now that we have a best base score to try to beat:

http://www.overclock.net/t/1627767/top-30-unigine-superposition-benchmark/0_20

Quote:


> Originally Posted by *KedarWolf*
> 
> i tried doing the resistor mod.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't hand the hand/eye coordination to do it, they are really small and hard to put on. The defogger silver epoxy dries so quick.


I was going to suggest the pencil mod; I believe 2B was the preferred grade, or the really soft one, to reduce the resistance across these tiny resistors.
There was the other shunt mod too, quite a few pages back, with a 5W(?) resistor wired across the shunt, but that would involve soldering.

Quote:


> Originally Posted by *Coopiklaani*
> 
> You need tune the F-V curve manually. You need 80C+ on the GPU to get 1.09v without playing with the curve.


Actually, I am lazy and use the slider; I get 1.093V. I'm not sure what's up there; then again, it should be hitting 1.062V stock.


----------



## Robilar

I picked up the Zotac AMP Extreme today. Massive card, but so far quite impressive. Glad I wasn't planning SLI, as there is no way that would fit.

http://s1201.photobucket.com/user/RobilarOCN/media/20170527_174317_zpsz7xisbb5.jpg.html


----------



## feznz

Quote:


> Originally Posted by *Benny89*
> 
> Thanks! Appreciated it! It looks better in real live, but my crappy smartphone camera can only do that much...
> 
> I love white light in red-black setup since it gets that red stand out more and make it much more visible.
> 
> Can someone tell me if this is good combo for small GPU water loop?
> 
> https://www.ekwb.com/shop/ek-res-x3-110 + https://www.ekwb.com/shop/ek-xtop-revo-d5-pwm-incl-pump + https://www.ekwb.com/shop/ek-coolstream-ce-140-single ?


Trust me, you will never look back with water: amazing temps with next to no noise.

Just make the choice and go the whole way.

Best to ask for advice here:
http://www.overclock.net/t/584302/ocn-water-cooling-club-and-picture-gallery/105000_20#post_26123570
Quote:


> Originally Posted by *Robilar*
> 
> I picked up the zotac AMP extreme today. Massive card but so far quite impressive. Glad i wasn't planning sli as there is no way that would fit.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s1201.photobucket.com/user/RobilarOCN/media/20170527_174317_zpsz7xisbb5.jpg.html
Click to expand...

Is that a completely new setup? It all looks so shiny and new.....
I am impressed with the 1080 Ti's abilities, but after having this card for a while I still feel it is slightly underpowered for 3K gaming; it just doesn't feel so fast any more.
So SLI, here I come, in the near future... Vega, please force a huge price drop; on paper I see it should be 20-30% faster than the 1080 Ti


----------



## Nico67

Just for interest's sake, I did some temp-vs-curve pics; I can't go any higher than 29C idle due to the water chiller.

21C to 25C shows a change, but 25C to 29C stays the same. There's definitely something in tuning all the temp-point curves you use / range across while gaming / benching.


----------



## Robilar

Quote:


> Originally Posted by *feznz*
> 
> Trust me you will never look back with water amazing temps with next to no noise
> 
> just make to choice and go the whole way
> 
> best ask advice from here
> http://www.overclock.net/t/584302/ocn-water-cooling-club-and-picture-gallery/105000_20#post_26123570
> *Is that a completely new setup? all looks so shiny and new.....*
> I am impressed with the 1080Ti abilities but after having this card for a while I still feel it is slightly underpowered for 3K gaming just feels not so fast any more
> so SLI here I come in the near future... Vega please make a huge price drop I see on paper should be 20-30% faster than 1080Ti


Not new, just very carefully maintained: case, cooler, and power supply are about 1 1/2 years old; motherboard, RAM, and CPU about six months; GPU as of yesterday.


----------



## DerComissar

Quote:


> Originally Posted by *Benny89*
> 
> I love sleek design of this card. Not too many LEDs in your face, not too much fancy design, nice and elegant:


Quote:


> Originally Posted by *Benny89*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gmpotu*
> 
> Wow your system looks so nice Benny. I really like the white lighting shining on the red.
> 
> 
> 
> Thanks! Appreciated it! It looks better in real live, but my crappy smartphone camera can only do that much...
> 
> I love white light in red-black setup since it gets that red stand out more and make it much more visible.
> 
> Can someone tell me if this is good combo for small GPU water loop?
> 
> https://www.ekwb.com/shop/ek-res-x3-110 + https://www.ekwb.com/shop/ek-xtop-revo-d5-pwm-incl-pump + https://www.ekwb.com/shop/ek-coolstream-ce-140-single ?
Click to expand...

That is a really nice build, looking good!









Imo your selection of an EK ResX3-110 would be a good fit for the Phanteks Evolv case. The D5 is a great pump, along with the Revo top. It would perform well for the gpu block now, and even a future cpu block upgrade.

That 140 rad is a tad small though, considering that your case can support a 240, 280, or even a 360 rad up top, as long as you don't go for a "Monsta" super-thick rad. I prefer using either 240 or 360 rads because they use 120mm fans, which tend to be available in better static pressure versions than the 140mm fans. These are very good 120mm static pressure fans:
https://www.dazmode.com/store/product/scythe_gentle_typhoon_1450rpm_58cfm_30db_fan/

EK also has their Vardar fans, which are also rated well for static pressure.

This is a good video review of watercooling options for your case:





They show some good options for the pump, res., and rads in your case.

I'd really recommend putting a good full-cover block on that Strix. (hint
https://www.ekwb.com/news/ek-releasing-full-cover-water-blocks-asus-rog-strix-geforce-gtx-1080-ti/









Edit:
I know, take it to the watercooling club thread, right?








But it is for the benefit of that lovely 1080Ti Strix, and to keep the thermal throttling to a minimum, lol.


----------



## Hulk1988

I have a question.

I got a 1080 Ti GameRock Premium for 574 dollars. It is new.

Should I just replace the 2 fans with 2 new 120mm fans, or are there any waterblock/hybrid solutions for this?

Thank you


----------



## KickAssCop

Are fans noisy on that card?


----------



## Hulk1988

Oh yes, they are loud.

The Strix, for example, is 33dB; the GameRock is 38dB.

If you look at this video:

he just removed the fans (4 screws) and replaced them with silent fans. Done!


----------



## dallas1990

Quote:


> Originally Posted by *Robilar*
> 
> I picked up the zotac AMP extreme today. Massive card but so far quite impressive. Glad i wasn't planning sli as there is no way that would fit.
> 
> http://s1201.photobucket.com/user/RobilarOCN/media/20170527_174317_zpsz7xisbb5.jpg.html


I got the same card, but I'm planning on SLI. I love how it stays in the 50s while gaming at 4K lol, and surprisingly it's pretty quiet.


----------



## OccamRazor

Hi guys! What's your 4K fps in Prey?

Cheers

Occamrazor


----------



## superV

Quote:


> Originally Posted by *mcg75*
> 
> *Just a reminder here everyone.
> 
> As this is the 1080 Ti owner's club, please try to keep the topic as close to that as possible.
> 
> Some infrequent off topic banter is fine but 4 pages recently arguing about what cpu to get doesn't belong in here.
> 
> Thanks.
> *


as usual

worse than youtube heroes


----------



## Benny89

Quote:


> Originally Posted by *DerComissar*
> 
> That is a really nice build, looking good!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Imo your selection of an EK ResX3-110 would be a good fit for the Phanteks Evolv case. The D5 is a great pump, along with the Revo top. It would perform well for the gpu block now, and even a future cpu block upgrade.
> 
> That 140 rad is a tad small though, considering that your case can support a 240, 280, or even a 360 rad up top, as long as you don't go for a "Monsta" super-thick rad. I prefer using either 240 or 360 rads because they use 120mm fans, which tend to be available in better static pressure versions than the 140mm fans. These are very good 120mm static pressure fans:
> https://www.dazmode.com/store/product/scythe_gentle_typhoon_1450rpm_58cfm_30db_fan/
> 
> EK also has their Vardar fans, which are also rated well for static pressure.
> 
> This is a good video review of watercooling options for your case:
> 
> 
> 
> 
> 
> They show some good options for the pump, res., and rads in your case.
> 
> I'd really recommend putting a good full-cover block on that Strix. (hint
> https://www.ekwb.com/news/ek-releasing-full-cover-water-blocks-asus-rog-strix-geforce-gtx-1080-ti/
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit:
> I know, take it to the watercooling club thread, right?
> 
> 
> 
> 
> 
> 
> 
> 
> But it is for the benefit of that lovely 1080Ti Strix, and to keep the thermal throttling to a minimum, lol.


Thanks! And thanks for the tips. I wanted to try a 140mm rad first because I know even an AIO cooler can keep a 1080 Ti below 60C easily, so I thought that with a 140mm rad, res + pump, and a proper water block I could snag maybe 50-55C and, you know... see if I like that water cooling stuff at all.

But maybe I should go all in... I don't know


----------



## TheBoom

Quote:


> Originally Posted by *Robilar*
> 
> I picked up the zotac AMP extreme today. Massive card but so far quite impressive. Glad i wasn't planning sli as there is no way that would fit.
> 
> http://s1201.photobucket.com/user/RobilarOCN/media/20170527_174317_zpsz7xisbb5.jpg.html


Nice. Are those ML140 Pro fans? I see you have a similar set up to mine, except for the X99 chipset.

How are your CPU load temps though? Mine went up almost 10C after installing the Amp Extreme. Thinking of reordering the airflow but don't know where to start.


----------



## TWiST2k

Looks like Afterburner 4.4.0 Beta 10 is out. Paste link in new tab for download.

http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta10.rar


----------



## nostra

This question I'm sure has been asked, but I couldn't find it :S
I'm planning on buying a 1080 Ti and watercooling it. Is there any reason not to get the FE version (the cheapest option)? Is it going to overclock worse than an EVGA FTW3, for example?


----------



## Benny89

Quote:


> Originally Posted by *nostra*
> 
> this question iam sure have been asked but i couldent find it :S
> iam planning on buying a 1080TI and watercool it. is there any reason not get the FE version (the cheapest option) is it going to overclock worste than a EVGA FTW3 for exampel?


Silicon lottery. Besides, you can flash the FTW3 BIOS to an FE card, or the XOC BIOS. If I were going to water cool, I would 100% grab an FE, BUT it all comes down to chip quality... sadly.


----------



## Robilar

Quote:


> Originally Posted by *TheBoom*
> 
> Nice. Are those ML140 Pro fans? I see you have a similar set up to mine, except for the X99 chipset.
> 
> How are your CPU load temps though? Mine went up almost 10C after installing the Amp Extreme. Thinking of reordering the airflow but don't know where to start.


Good eyes, they are the ML Pro series.

My CPU temps were not affected at all by the change, but I have good case airflow and a really good low-voltage 6800K.


----------



## ViRuS2k

MSI AB 4.4.0 beta 10 is available:

• Improved 5-channel thermal monitoring module architecture provides support for up to 20 independent thermal sensors per GPU (up to 5 independent GPU, up to 5 independent PCB, up to 5 independent memory and up to 5 independent VRM temperature sensors) on future custom design MSI graphics cards
• Added NCT7802Y thermal sensors support to provide compatibility with future custom design MSI graphics cards
• Improved hardware database format. New database subsections support provides more compact database definition for multiple graphics card models sharing similar hardware calibration info
• New cached I2C device detection algorithm improves application startup time on the systems with multichannel voltage controllers or multichannel thermal sensors
• Added experimental interleaved hardware polling mode, aimed to reduce hardware polling time on the systems with multiple polled I2C devices. When interleaved polling is enabled, just a part of hardware monitoring data sources is being polled on each hardware polling period, so it takes multiple periods to refresh all monitoring data sources. Power users may enable interleaved hardware polling mode via the configuration file if necessary
• Improved third party voltage control mode functionality. Now third party hardware database can also include extended thermal sensors calibration and mapping info for third party custom design graphics cards

iCX thermal sensors are in third party database now, so selecting third party voltage control mode in "General" tab should unlock access to them in hardware monitoring module.


----------



## s1rrah

Quote:


> Originally Posted by *Sinistur*
> 
> On average I am 52-58c depending on the game....


I laugh at your air cooling weakness. My 1080 Ti never breaks 45C at load. MWUHAHAHAHAHAAAAA!!!!


----------



## TheBoom

Quote:


> Originally Posted by *Robilar*
> 
> Good eyes
> 
> 
> 
> 
> 
> 
> 
> they are the ml pro series.
> 
> My cpu temps were not affected at all by the change but i have good case airflow and a really good low voltage 6800k.


I have the same ML Pro fans with an H115 and a 6700K at 1.25V / 4.5GHz, but I'm reaching 85C, and sometimes higher, in ME:A. How many exhaust and intake fans do you have?

The only exhaust I have is the MLs on the radiator. Maybe I need to change the rear fan from intake to exhaust.


----------



## ritchington

*e*

aaaa


----------



## buildzoid

I think I have a power mod that should get around the normalized power limit thing.

Here's a screenshot of the GTX 960 I tested it on


Still need to test it on my GTX 1070 dual


----------



## ViRuS2k

Quote:


> Originally Posted by *buildzoid*
> 
> I think I have a power mod that should get around the normalized power limit thing.
> 
> Here's a screenshot of the GTX 960 I tested it on
> 
> 
> Still need to test it on my GTX 1070 dual


Looking good ;D

No normalized TDP or power-limit TDP being registered....
All you need now is the temp limiting removed lol, so that the core stays the same no matter the temp... probably a good thing for watercooled peeps.
Though if your mod works like the picture on the 1080 Ti cards, then we won't need the XOC BIOS anymore and we can use our FE BIOSes.


----------



## buildzoid

Quote:


> Originally Posted by *ViRuS2k*
> 
> Looking good ;D
> 
> no normalized TDP or power limit TDP being registered....
> all you need now is temp limiting removed lol so that core stays the same no matter the temp
> 
> 
> 
> 
> 
> 
> 
> probably a good thing though for watercooled peeps.....
> though if your mod works like the picture for the 1080ti cards then we wont be needing XOC bios anymore and we can use our FE bios`s


IDK how the temp circuit works and that's assuming I can get access to it. The next thing on the chopping block on my GTX 1070 is the software voltage controls.


----------



## Dasboogieman

Quote:


> Originally Posted by *ViRuS2k*
> 
> Did you just join the forum to come in this thread and just say water cooling is pathetic for pascal cards ....
> 
> Please go away......
> 
> I gather you dont run SLI 1080TI`s @2050/6066 and get 100% load temps @49-53c
> with this soldering mini heat wave we are getting here with water idle temps of 29.9c ....
> 
> On a serious note* you dont know what your talking about...


49-53C to what end? <1% performance gains? Anyone who thinks WC will net them massive performance gains vs the cost applied is delusional. Selling the 1080ti and upgrading to a Titan Xp may actually yield greater performance per dollar lol.

Watercooling for noise control, that is a different beast. I can get behind that, can't wait for my Aorus blocks to arrive.


----------



## djriful

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ViRuS2k*
> 
> Did you just join the forum to come in this thread and just say water cooling is pathetic for pascal cards ....
> 
> Please go away......
> 
> I gather you dont run SLI 1080TI`s @2050/6066 and get 100% load temps @49-53c
> with this soldering mini heat wave we are getting here with water idle temps of 29.9c ....
> 
> On a serious note* you dont know what your talking about...
> 
> 
> 
> 49-53C to what end? <1% performance gains? Anyone who thinks WC will net them massive performance gains vs the cost applied is delusional. Selling the 1080ti and upgrading to a Titan Xp may actually yield greater performance per dollar lol.
> 
> Watercooling for noise control, that is a different beast. I can get behind that, can't wait for my Aorus blocks to arrive.
Click to expand...

Mmhhh noise controls...

Let's see: my old Titan clocked at 1250MHz @ 1.3v would sit at 90C on air, but on water (with a block covering the GDDR5 memory too) I never broke 40C on load.







An AIO is inferior compared to a custom loop with multiple rads and all fans at low rpm.

I wonder if I should drop the 1080 Ti under water... looking to clock 2300MHz with a voltage unlock.

Idle is 20C for both CPU and GPU on my custom loop, with 19C ambient. Water is still feasible in hot ambient temps; for Canada, water on a 1080 Ti is maybe pointless except to cut noise. Going to wait for a price drop on water blocks.

Btw to all new posters, welcome to ovvveerclloockk forums.

For Pascal, I think future cards will be harder to overclock due to the die size. Tiny surface to dissipate heat.


----------



## AndreyATGB

Hey guys, could someone with the Amp Extreme measure how long it is exactly? Says 325mm on the website but reviews with rulers show that it's around 310mm. I went through 3 Tis so far and apart from the FE, I returned them due to coil whine.. I plan on trying the amp extreme next as it has a high tdp and the largest air cooler on it. I'd try one of the hybrids but I'm certain idle pump noise will drive me mad (I can hear when my hdd spins up, so I doubt the pump makes less noise than that).


----------



## Caustin

I was wondering if anyone had any ideas on how to decrease the temps I am getting on my SLI EVGA 1080 Tis with the EVGA 1080 Ti hybrid kit installed on both. I was getting 62C on the top card and 58C on the bottom card, but after a few lengthy cutscenes in The Witcher 3, my top card hit 69C and my bottom card hit 62C with the original ambient temp around 72F (about 22C). I was fine with the first set of temps, even though it was a little warmer than anticipated, but 69C with a hybrid kit is way too much. I will also note that I have done nothing to the voltage of either card.

I'm fairly confident that both blocks are making good contact with their respective chips, and a good amount of Thermal Grizzly Kryonaut was applied to both. I know the layout of my case is not the most efficient when it comes to cooling, but I can't think of any other way with my case. I realize the top card is much warmer due to it being an exhaust, and my bottom card is warmer due to the radiator being situated below the pump. I was still hoping someone could shoot some ideas my way. Thanks in advance.


----------



## KedarWolf

Quote:


> Originally Posted by *TWiST2k*
> 
> Looks like Afterburner 4.4.0 Beta 10 is out. Paste link in new tab for download.
> 
> http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta10.rar


Updated OP and threads to Beta10.


----------



## ViRuS2k

Quote:


> Originally Posted by *Caustin*
> 
> I was wondering if anyone had any ideas as how to decrease the temps I am getting on my SLI EVGA 1080ti with the EVGA 1080Ti hybrid kit installed on both. I was getting 62c on the top card and 58c on the bottom card,but after a few lengthy cutscenes in the Witcher 3, my top card hit 69c and my bottom card hit 62c with the original ambient temp around 72f. I was fine with the first set of temps, even though it was a little warmer than anticipated but 69c with a hybrid kit is way too much. I will also note that I have done nothing to the voltage of either card.
> 
> I'm fairly confident that both blocks are making good contact with their respective chips and a good amount of thermal grizzly kryonaut was applied to both. I know the layout of my case is not the most efficient when it comes to cooling but I cant think of any other way with my case. I realize the top card is much warmer due to it being an exhaust and my bottom card is warmer due to the radiator being situated below the pump. I was still hoping someone could shoot some ideas my way. Thanks in advance.


If you're overclocking with the XOC BIOS, at default the cards pump out 1999 core @ 1.093v or even 1.100v; keep an eye on that, as it seems to be a bug with the FEs and the XOC BIOS.....

And if you're overclocking on that BIOS, again keep an eye on the voltage, as you have to use a curve instead of the default or it will pump out too much voltage... and heat the card up more...

If you're on the FE BIOS, then again use a curve and set a lower voltage for the desired overclock, say 0.93v @ 1900 and 1.00v @ 2000.

Or it could just be bad contact on the die.... or poor airflow, or bad ambient temps from the summer weather and heatwaves in Britain at the minute.


----------



## djriful

Quote:


> Originally Posted by *Caustin*
> 
> I was wondering if anyone had any ideas as how to decrease the temps I am getting on my SLI EVGA 1080ti with the EVGA 1080Ti hybrid kit installed on both. I was getting 62c on the top card and 58c on the bottom card,but after a few lengthy cutscenes in the Witcher 3, my top card hit 69c and my bottom card hit 62c with the original ambient temp around 72f. I was fine with the first set of temps, even though it was a little warmer than anticipated but 69c with a hybrid kit is way too much. I will also note that I have done nothing to the voltage of either card.
> 
> I'm fairly confident that both blocks are making good contact with their respective chips and a good amount of thermal grizzly kryonaut was applied to both. I know the layout of my case is not the most efficient when it comes to cooling but I cant think of any other way with my case. I realize the top card is much warmer due to it being an exhaust and my bottom card is warmer due to the radiator being situated below the pump. I was still hoping someone could shoot some ideas my way. Thanks in advance.


There is little you can do... a 120x120 radiator can dissipate heat for up to about 200w, which is why some say a 360 radiator for 3 components. I'd actually recommend 1.5 rads per component that's rated at 200w on load.

If you can mod those pipes to fit a bigger 360 or 480 rad, you will get better temps.

My system has a 240 + 360 for GPU / CPU / MOSFET and is able to keep temps under 40C at load with 19C ambient.


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> I love sleek design of this card. Not too many LEDs in your face, not too much fancy design, nice and elegant:


such a beautiful rig. Best cpu for gaming. Best gpu for gaming. Beautiful case, design, and build quality is top notch. That's a work of art and one of the best gaming rigs on Earth. Very much like mine, almost all the same components. #BeastMode
Quote:


> Originally Posted by *cluster71*
> 
> I do not know what limits. Run a test on Superposition on solid 2114 core 11800 mem 1.093 v 7700K 5300 MHz with 10147 score. Does not feel overwhelming.
> 
> 
> Spoiler: Warning: Spoiler!


you just need to do the NCP 'tricks': every setting with performance in mind, which basically means 'off' on most settings. That setup will hit ~10,500+


----------



## Robilar

Quote:


> Originally Posted by *TheBoom*
> 
> I have the same ML Pro fans with a H115 and a 6700k at 1.25v 4.5ghz but i'm reaching 85C and sometimes higher in ME:A. How many exhaust and intake fans do you have?
> 
> The only exhaust I have are those MLs on the radiator. Maybe I need to change the rear fan from intake to exhaust.


I am running a different CPU than you. If I were at 4.5, temps would be in excess of what the H100i could handle. I run my 6800K at 4GHz at 1.21 vcore; max temps over 50 passes of IBT are in the mid 50s. I could likely hit 4.2 or 4.3 with reasonable temps, but the games I play are not really CPU dependent.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> such a beautiful rig. Best cpu for gaming. Best gpu for gaming. Beautiful case, design, and build quality is top notch. That's a work of art and one of the best gaming rigs on Earth. Very much like mine, almost all the same components. #BeastMode
> you just need to do the NCP 'tricks" every setting with performance in mind, which basically means "off" on most settings. that setup will hit ~10,500+


Thanks Slack







. That nasty combo is awesome.

Anyone else noticed that when you set your curve with "apply on system start" and then take a look at it at system start, it has been changed by AB and is not the same as you made it? Bins are moved; the curve is never applied the same as I made it. Kind of a pain having to set the bins up all the time to be in the same places...


----------



## ViRuS2k

Quote:


> Originally Posted by *Benny89*
> 
> Thanks Slack
> 
> 
> 
> 
> 
> 
> 
> . That nasty combo is awesome.
> 
> Anyone else noticed that when you have your curve and you set it "apply on system start" and then take a look at your curve at system start- it is changed by AB and not same as you made it. Bins are moved. Curve is never applied same as I made it. Kind of pain having to set it up all the time (bins) to be in same places...


There is a knack to doing it correctly.

different temps call for different curve ramps









just make sure you're blasting a heavy load on the GPU for about 30 minutes, then do your curve.







with the game still open,
then once done, save....

then once you apply that curve at idle before you game, it will always be at what you set it to once warmed back up


----------



## ALSTER868

Quote:


> Originally Posted by *ViRuS2k*
> 
> There is a nack to doing it correctly.
> 
> different temps call for different curve ramps
> 
> 
> 
> 
> 
> 
> 
> 
> 
> just make sure your blasting a heavy load on the GPU for about 30 minutes then do your curve.
> 
> 
> 
> 
> 
> 
> 
> with the game still open
> then once done save....
> 
> then once you apply that curve at idle before you game it will always be at what you set it to once warmed back up


I was trying to do so multiple times - setting my curve under load at different temps and saving a profile. If I fire up a 3D app with a previously saved profile, the clock jumps 2 or even 3 bins up and only comes down to what it was set to after warmup. The higher the temp at which I set a curve, the higher it then jumps after applying it. Are there any other solutions to avoid this?


----------



## KedarWolf

Quote:


> Originally Posted by *Sinistur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ViRuS2k*
> 
> Did you just join the forum to come in this thread and just say water cooling is pathetic for pascal cards .....
> 
> 
> 
> It really doesn't matter why i joined.
> Quote:
> 
> 
> 
> Originally Posted by *ViRuS2k*
> 
> Please go away......
> 
> Click to expand...
> 
> I know... i know... the truth is hard to handle but I wont be going anywhere.
> Quote:
> 
> 
> 
> Originally Posted by *ViRuS2k*
> 
> I gather you dont run SLI 1080TI`s @2050/6066 and get 100% load temps @49-53c
> 
> Click to expand...
> 
> Given the current state of sli I know better than to throw my money away on it.... Man, this is even worst than I imagined.....
> 
> Quote:
> 
> 
> 
> Originally Posted by *ViRuS2k*
> 
> On a serious note* you dont know what your talking about...
> 
> Click to expand...
> 
> Actually I do and you simply don't like it... Oh well.
Click to expand...

Lots of benefits to water cooling.

Lower temps on load, thus higher bins. It's pretty much known that you lose bins as the temp goes up; it's not just thermal throttling, your card will also gain a bin or two at lower operating temps.

The higher the temps, the more chance you lose a bin.

For example, if my card gets to say 51-52C on the XOC BIOS, Fire Strike Ultra will crash at 2088 @ 1.100v and the test will not pass; the framerates get messed up.










If I stay under 49C it runs just fine.
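The bin behavior described here can be roughed out numerically. The ~13MHz bin size matches what Pascal owners report; the temperature thresholds below are illustrative assumptions, not values published by Nvidia:

```python
# Illustrative model of Pascal's temperature-based bin dropping.
# The 13 MHz bin size is the commonly reported one; the first threshold
# and the degrees-per-bin step vary per card and BIOS, so treat these
# constants as assumptions for illustration only.
BIN_MHZ = 13
FIRST_THRESHOLD_C = 37   # assumed temp where the first bin drops
STEP_C = 5               # assumed degrees per additional lost bin

def effective_clock(base_clock_mhz, temp_c):
    """Estimate the boost clock after temperature bin drops."""
    if temp_c <= FIRST_THRESHOLD_C:
        return base_clock_mhz
    bins_lost = (temp_c - FIRST_THRESHOLD_C) // STEP_C + 1
    return base_clock_mhz - bins_lost * BIN_MHZ

print(effective_clock(2088, 35))  # cool card keeps its clock -> 2088
print(effective_clock(2088, 52))  # -> 2036, several bins down
```

Under these assumptions, keeping a card under ~49C rather than letting it hit the low 50s is worth a bin or two, consistent with the Fire Strike behavior described above.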


----------



## ViRuS2k

Quote:


> Originally Posted by *ALSTER868*
> 
> I was trying to do so multiple times - to set my curve under load at different temps and save a profile. If I fire up a 3D app with a previously saved profile, then the clock jumps 2 or even 3 bins up and comes to what is was set only after warmup. The higher is the temp at which I set a curve, the higher it then jumps after applying it. Are there any other solutions to avoid this?


Well mate, that's the only way I do it, as I have tried the other way.....

At load is best, though, as it will apply the correct saved profile when the system restarts and applies the profile.
What happens when the game starts is that you'll be at a low temp, so you might gain a few bins until it hits your load temps, then it will drop down to the correct saved curve bin...

If you do it the other way around and set the curve at idle before load, you will end up with the correct clock and voltage at low temps, then it will ramp down bins as it makes its way up to your load temps.


----------



## KraxKill

Quote:


> Originally Posted by *Benny89*
> 
> Thanks Slack
> 
> 
> 
> 
> 
> 
> 
> . That nasty combo is awesome.
> 
> Anyone else noticed that when you have your curve and you set it "apply on system start" and then take a look at your curve at system start- it is changed by AB and not same as you made it. Bins are moved. Curve is never applied same as I made it. Kind of pain having to set it up all the time (bins) to be in same places...


Yeah, Afterburner is all sorts of weird for me. It even behaves odd loading profiles. Beta6. I guess I can try Beta 10. Loving the Ti still.


----------



## TheBoom

Quote:


> Originally Posted by *ALSTER868*
> 
> I was trying to do so multiple times - to set my curve under load at different temps and save a profile. If I fire up a 3D app with a previously saved profile, then the clock jumps 2 or even 3 bins up and comes to what is was set only after warmup. The higher is the temp at which I set a curve, the higher it then jumps after applying it. Are there any other solutions to avoid this?


Has something to do with the bins before the one you are using IMO.

Just found this out using ME:A alt tabbed and adjusted the curve at my max temp bin which is 75-80C.

Increasing the bins before the "active bin" stops the card from going higher than what you set for your highest clock. It's a little complicated and hard to describe but I hope you get what I'm trying to say.

When I had the bin before my highest clock of 2088 at 2050 it would ramp up to 2100 when I clicked apply. Changing that bin to 2062 without touching the 2088 one made sure the card didn't go past 2088.

For anyone else wondering, I found this is the easiest way to do it: run ME:A or any other intensive 3D app alt-tabbed that gets your card to its highest load temperatures, then fix the curve at that point. With the curve set at idle and at max temp, everything else in between should behave properly.

As for bins changing on startup I just made sure I readjusted the curve when it happened and applied, after which I deleted and saved the profile multiple times. Now my bins are always correct on startup. I also deleted all other profiles but I'm not sure if that has any effect on the issue.

*Edit: This is on XOC bios.*
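The clamping trick described here can be sketched in code. A minimal model, assuming a curve is just a list of (millivolts, MHz) points as in Afterburner's editor; the headroom value mirrors the 2062-vs-2088 example above:

```python
# Sketch of the curve fix described above: keep the bins before your top
# point low enough that the card can't boost past the clock you actually
# want. A curve is modeled as a list of (millivolts, MHz) points, like
# Afterburner's editor shows. Values are illustrative.
MAX_CLOCK = 2088
HEADROOM = 26  # keep earlier bins at least two 13 MHz bins below the max

def clamp_curve(curve, max_clock=MAX_CLOCK, headroom=HEADROOM):
    """Return a curve where only the final point reaches max_clock."""
    clamped = []
    for i, (mv, mhz) in enumerate(curve):
        if i < len(curve) - 1:
            mhz = min(mhz, max_clock - headroom)  # pull earlier bins down
        else:
            mhz = min(mhz, max_clock)             # cap the top bin itself
        clamped.append((mv, mhz))
    return clamped

curve = [(1000, 2000), (1050, 2080), (1093, 2088)]
print(clamp_curve(curve))  # -> [(1000, 2000), (1050, 2062), (1093, 2088)]
```

The second point getting pulled from 2080 down to 2062 is exactly the adjustment described above that stopped the card overshooting to 2100 on apply.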


----------



## TheBoom

Quote:


> Originally Posted by *Robilar*
> 
> I am running a different CPU than you. If I were at 4.5 temps would be in excess of what the H100i could handle. I run my 6800k at 4Ghz at 1.21 vcore. Temps max 50 passes of IBT are mid 50's. I could likely hit 4.2 or 4.3 with reasonable temps but the games I play are not really CPU dependant.


Which is weird since mine should put out less heat. Also the CPU at max by itself rarely goes over 80C even in Prime95 but when the GPU is at full load it rises by 5-8C.

I really need to either change some fans or reorder the intake-exhaust configuration I have now.


----------



## ALSTER868

Quote:


> Originally Posted by *TheBoom*
> 
> Has something to do with the bins before the one you are using IMO.
> 
> Just found this out using ME:A alt tabbed and adjusted the curve at my max temp bin which is 75-80C.
> 
> Increasing the bins before the "active bin" stops the card from going higher than what you set for your highest clock. It's a little complicated and hard to describe but I hope you get what I'm trying to say.
> 
> When I had the bin before my highest clock of 2088 at 2050 it would ramp up to 2100 when I clicked apply. Changing that bin to 2062 without touching the 2088 one made sure the card didn't go past 2088.


The best way for me so far seems to be setting up the curve at 45-48C. If I want, for instance, 2050, then I set it to 2063, and at 49-53C it comes down one bin to my desired 2050, and there it stays.
Anyway, after a restart, if I apply this 2063 profile it jumps to 2076 until 42-44C, then 2063 and 2050, to end up with







Crazy, huh? )


----------



## TheBoom

Quote:


> Originally Posted by *ALSTER868*
> 
> The best way for me for far seems to set up the curve at 45-48C. If I want, for instance, 2050 then I set it to 2063 and it comes down at 49-53C one bin to my desired 2050 and so it stays.
> Anyway, after restart if I apply this 2063 profile, it jumps to 2076 intil 42-44C then 2063 and 2050 to end up with
> 
> 
> 
> 
> 
> 
> 
> Crazy, huh? )


Eh sorry, I should have mentioned that I'm on XOC bios. Forgot it behaves very differently from stock bios throttling/curve.

Mine pretty much stays at 2088 and then drops to 2075 at 80C and that's about it.

Edit: Page 1080







. 1080-ception


----------



## ALSTER868

So if I flash the XOC, then all my headache with this curve jumping all around will end?


----------



## KedarWolf

Quote:


> Originally Posted by *ALSTER868*
> 
> So if I flash the XOC, then all my headache with this curve jumping all around will end?


I find on XOC that at some bins the clock will still jump up a few bins, even if your curve is flat from your top bin.









It's the same on pretty much every BIOS; it's just that this one won't lower bins from power throttling, is all.


----------



## Benny89

Didn't try it on MAX (ambient temp is high in my country now, case closed, 100% fans and just my standard curve).

2038 lowest, 2050-63 on average during the test, 2076 (1.07v, top of my curve) highest in the first 2 scenes. Max 57C. Of course power draw was the most limiting factor, but the clocks are great!

1080 Ti STRIX

Score:



I could push it to 2100 at top bin and let it go to 2075-88 in bench but I don't want to do whole curve again







wanted to just try it on my gaming curve.

Those curves are really annoying on Pascals..


----------



## feznz

Quote:


> Originally Posted by *KraxKill*
> 
> Yeah, Afterburner is all sorts of weird for me. It even behaves odd loading profiles. Beta6. I guess I can try Beta 10. Loving the Ti still.










Awesome score, new highest for OC.net with a 1080 Ti









As far as I can remember, anyway; please correct me if I am wrong. I love this bench; it simply puts the system into perspective for an overall gaming experience, without those pesky physics marketing benches.

And they say water cooling is a waste of time ...... wait till they learn about a chill box......








Wish I had the time to play with a chill box; I just have to wait for a frosty morning.


----------



## Nico67

Quote:


> Originally Posted by *KraxKill*
> 
> Yeah, Afterburner is all sorts of weird for me. It even behaves odd loading profiles. Beta6. I guess I can try Beta 10. Loving the Ti still.


Nice score







, what clocks are you running for that? Assume it's the FE BIOS plus shunt mod too.


----------



## KraxKill

Quote:


> Originally Posted by *Nico67*
> 
> Nice score
> 
> 
> 
> 
> 
> 
> 
> , what clks you running for that. Assume its FE bios shunt mod too.


Thanks. It's a 5775C at 4.5ghz and a shunt modded Ti at 2100. Ram is 2133CL9. Yes the FE bios.

Looking for some 2400cl9 ram to break 11K


----------



## buildzoid

So I tested the power mod on the 1070. The normalized power limit seems to be both affected and unaffected at the same time: at idle it matches the GPU power draw, but under load it goes back up. I did not, however, see any throttling during testing. Due to my lack of experience with Nvidia cards, I don't think my test results are conclusive, and I still need to do more testing.

I suspect the normalized power draw is either being calculated, or there is another power monitor chip on the card that I can't find.
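For what it's worth, the usual community understanding is that "normalized power" is just the measured board draw expressed as a percentage of the BIOS power target, which would explain monitoring readings disagreeing with reality after a shunt mod. A sketch of that arithmetic, with all wattages illustrative:

```python
# "Normalized power" in monitoring tools is generally understood to be
# measured board draw as a percentage of the BIOS power target. A shunt
# mod halves the current the sense circuit *sees*, which is why readings
# can disagree with the real draw. All numbers below are illustrative.
def normalized_power(measured_watts, tdp_watts):
    """Power reading as a percent of the BIOS power target."""
    return 100.0 * measured_watts / tdp_watts

TDP = 250         # stock 1080 Ti FE power target
real_draw = 300   # what the card actually pulls in this example

# With a typical shunt mod, the sense resistors report roughly half:
reported_draw = real_draw / 2

print(normalized_power(real_draw, TDP))      # -> 120.0, would throttle
print(normalized_power(reported_draw, TDP))  # -> 60.0, no throttle seen
```

If the normalized figure on the modded 1070 still climbs under load, that would be consistent with it being computed elsewhere rather than read from the modded shunts, as suspected above.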


----------



## KedarWolf

Quote:


> Originally Posted by *Nico67*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KraxKill*
> 
> Yeah, Afterburner is all sorts of weird for me. It even behaves odd loading profiles. Beta6. I guess I can try Beta 10. Loving the Ti still.
> 
> 
> 
> 
> 
> Nice score
> 
> 
> 
> 
> 
> 
> 
> , what clks you running for that. Assume its FE bios shunt mod too.
Click to expand...

Never mind.


----------



## Hulio225

Quote:


> Originally Posted by *buildzoid*
> 
> So I tested the power mod on the 1070. The normalized power limit both seems to be affected and unaffected at the same time. Idle it matches the GPU power draw but under load it goes back up. I did not however see any throttling during testing. However due to my lack of expirience with Nvidia cards I don't think my test results are conclusive and I still need to do more testing.
> 
> I suspect the Normalized power draw is either being calculated or there is another power monitor chip on the card that I can't find.


Good job, keep us posted


----------



## done12many2

Quote:


> Originally Posted by *buildzoid*
> 
> So I tested the power mod on the 1070. The normalized power limit both seems to be affected and unaffected at the same time. Idle it matches the GPU power draw but under load it goes back up. I did not however see any throttling during testing. However due to my lack of expirience with Nvidia cards I don't think my test results are conclusive and I still need to do more testing.
> 
> I suspect the Normalized power draw is either being calculated or there is another power monitor chip on the card that I can't find.


Nice video on the Pascal mod. Confused the crap out of me, but I think I can grasp it after I watch it a few more times with some beer in me.


----------



## done12many2

Dupe


----------



## Nico67

Quote:


> Originally Posted by *KraxKill*
> 
> Thanks. It's a 5775C at 4.5ghz and a shunt modded Ti at 2100. Ram is 2133CL9. Yes the FE bios.
> 
> Looking for some 2400cl9 ram to break 11K


Wow, that's pretty good. I would have thought you would be up around 2138; how about the mem speed? Still hopeful that we might see an improved XOC BIOS, as I don't think the current one could touch that.


----------



## KraxKill

Quote:


> Originally Posted by *Nico67*
> 
> Wow that's pretty good, I would have thought you would be up around 2138, how about the mem speed? Still hopeful that we might see an improved XOC bios, as I don't think the current one could touch that


The mem is at 12.2GHz (+600 on the offset). I can run it at 2151 but the score doesn't improve. The max FPS goes up a bit, but the card begins to hit the hardcoded "normalized TDP" even with the shunt mod, so my scores are more consistent at 2100 @ 1.063v, since it's not power throttling there much at all, as opposed to 2151 @ 1.093v where it's bouncing off the rev limiter a lot more.
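For anyone puzzling over the offsets: a minimal sketch of how +600 in Afterburner lands at ~12.2GHz effective, assuming AB shows the 1080 Ti's memory at ~5505MHz stock and the effective GDDR5X data rate is double that reading (this is how the tools report it, not a spec-sheet figure):

```python
# How a +600 Afterburner offset becomes "12.2 GHz" effective memory.
# Assumptions: AB reports the 1080 Ti memory clock as ~5505 MHz stock,
# and the effective GDDR5X data rate is twice that reported clock.
STOCK_AB_CLOCK_MHZ = 5505

def effective_mem_mhz(offset_mhz):
    """Effective memory data rate in MHz for a given Afterburner offset."""
    return 2 * (STOCK_AB_CLOCK_MHZ + offset_mhz)

print(effective_mem_mhz(0))    # -> 11010, the stock "11 Gbps"
print(effective_mem_mhz(600))  # -> 12210, the ~12.2 GHz mentioned above
```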


----------



## TK421

Planning to pick up a 1080 Ti, but who do you guys think bins the best silicon for overclocking?

Heard that EVGA does the best binning, then maybe the Nvidia Founders, since they're cut close to the center of the silicon wafer?

Also do we have an xoc bios?


----------



## KedarWolf

Quote:


> Originally Posted by *TK421*
> 
> Planning to pick up a 1080ti, but who do you guys bins the best silicon for overclock?
> 
> Heard that evga does best binning, then maybe the nvidia founders since they're cut close to the center of the silicon wafer?
> 
> Also do we have an xoc bios?


The XOC BIOS is in the OP under "How To Flash A Different BIOS", or here.

http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti/0_20


----------



## cluster71

I flashed the XOC BIOS on my Zotac Extreme and ran 8-10 benchmarks. It can run at 1.1v without any problems; it has been 46C max. I moved my PC into the basement a few days ago, where it's 18-19C, and the graphics card came up a level; it is quite stable at 2126 now. What do you use for a DP cable? I need 4m to the basement and am using the Oehlbach Impact Plus (http://www.oehlbach.com/en/detail/index/sArticle/1458) because I get dropouts on the signal with a standard cable that can handle 60Hz in 4K. There are not many cables to choose from that can handle DP v1.3, at least not here. I'll continue with Sniper Elite.


----------



## TK421

Quote:


> Originally Posted by *KedarWolf*
> 
> XOC BIOS in Op under How To Flash A Different BIOS or here.
> 
> http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti/0_20


oh it's an asus

going to buy an asus then


----------



## KedarWolf

Quote:


> Originally Posted by *cluster71*
> 
> I Flash XOC Bios in my Zotac Extreme and run 8-10 benchmarks. Can run on 1.1 v without any problems. It has been max 46 degrees C. But I moved my PC into the basement a few days ago. It's 18-19 degrees C there. The graphics card came up a level, it is quite stable in 2126 now. What do you use for DP cable? I need 4 m to the basement using Oehlbach Impact Plus http://www.oehlbach.com/en/detail/index/sArticle/1458 Gets dropouts on the signal with standard cable that can handle 60 Hz in 4K. There are not many cables to choose from which can handle DP v1.3, not here. I'll continue with Sniper Elite


I bought a cheap 3 metre 4K display port cable from Amazon, had dropouts in signal as well.


----------



## TK421

Quote:


> Originally Posted by *KedarWolf*
> 
> I bought a cheap 3 metre 4K display port cable from Amazon, had dropouts in signal as well.


is amazon basics a good brand for displayport?


----------



## KedarWolf

Quote:


> Originally Posted by *TK421*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I bought a cheap 3 metre 4K display port cable from Amazon, had dropouts in signal as well.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> is amazon basics a good brand for displayport?
Click to expand...

Bad reviews with flickering.

This was in a review:

"Save yourself the time and money and pick up the GearIT Gold Plated DisplayPort to DisplayPort Cable 10 Feet - 4K Resolution Ready (DP to DP Cable) Black if you need a longer DisplayPort cable. It worked for me without issues (2560x1440 @ 144Hz and 150Hz)."


----------



## TWiST2k

Quote:


> Originally Posted by *KedarWolf*
> 
> Bad reviews with flickering.
> 
> This in a review.
> 
> Save yourself the time and money and pick up GearIT Gold Plated DisplayPort to DisplayPort Cable 10 Feet - 4K Resolution Ready (DP to DP Cable) Black if you need a longer DisplayPort Cable. It worked for me without issues (2560x1440 @144hz and 150hz)


I have some Amazon ones and some of the gold ones and honestly they are all the same IMO. I think with any product, sometimes you get good ones and sometimes you get bad ones. But on both of my 144Hz G-Sync displays I have used them both at times and not had any issues.


----------



## FastEddieNYC

I finally decided to retire my 290X Crossfire setup and switch to the EVGA 1080 Ti SC Black. I was satisfied with Crossfire, but the 4K performance of the Ti is superior. My last Nvidia card was a GTX 780, so I need to take a crash course on how to overclock Pascal with a curve. I have a waterblock on the way, so I will wait to dial it in, but I did a quick test at +100 core / +500 mem.


Overall the Ti is a beast of a card. The only question I have is why did Nvidia enforce a ridiculous power limit on an enthusiast card? I understand it on a mainstream card, but the large majority of people that buy a 1080 Ti are knowledgeable about hardware and know how far to push the card safely.


----------



## TK421

Quote:


> Originally Posted by *KedarWolf*
> 
> Bad reviews with flickering.
> 
> This in a review.
> 
> Save yourself the time and money and pick up GearIT Gold Plated DisplayPort to DisplayPort Cable 10 Feet - 4K Resolution Ready (DP to DP Cable) Black if you need a longer DisplayPort Cable. It worked for me without issues (2560x1440 @144hz and 150hz)


Quote:


> Originally Posted by *TWiST2k*
> 
> I have some Amazon ones and some of the gold ones and honestly they are all the same IMO. I think with any product, sometimes you get good ones and sometimes you get bad ones. But on both of my 144Hz G-Sync displays I have used them both at times and not had any issues.


I'll try a few and check.

Currently I have a DisplayPort cable from the parts bin at work; it might be an HP or Dell part, but it works well up to 4K at the moment. It is quite short, though.


----------



## KedarWolf

Quote:


> Originally Posted by *TK421*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Bad reviews with flickering.
> 
> This in a review.
> 
> Save yourself the time and money and pick up GearIT Gold Plated DisplayPort to DisplayPort Cable 10 Feet - 4K Resolution Ready (DP to DP Cable) Black if you need a longer DisplayPort Cable. It worked for me without issues (2560x1440 @144hz and 150hz)
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *TWiST2k*
> 
> I have some Amazon ones and some of the gold ones and honestly they are all the same IMO. I think with any product, sometimes you get good ones and sometimes you get bad ones. But on both of my 144Hz G-Sync displays I have used them both at times and not had any issues.
> 
> Click to expand...
> 
> I'll try a few and check.
> 
> Currently I have a displayport cable from parts bin at work, might be HP or Dell part but it works well up to 4K at the moment. But is quite short.
Click to expand...

Short displayport cables seem to have fewer issues, it's the longer ones that flicker etc.
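Some rough arithmetic on why longer cables struggle: uncompressed 4K60 sits uncomfortably close to what DP 1.2 can carry, so marginal cables drop out. Blanking overhead is ignored below, so the real requirement is somewhat higher still:

```python
# Why 4K60 stresses DisplayPort cables: uncompressed video bandwidth vs.
# what DP 1.2 (HBR2, 4 lanes) can carry after 8b/10b line coding.
# Blanking overhead is ignored, so the true requirement is a bit higher.
def video_gbps(width, height, hz, bits_per_pixel=24):
    """Raw pixel bandwidth in Gbps for a given mode."""
    return width * height * hz * bits_per_pixel / 1e9

# 4 lanes x 5.4 Gbps per lane x 0.8 (8b/10b coding) = 17.28 Gbps payload
HBR2_PAYLOAD_GBPS = 4 * 5.4 * 0.8

need = video_gbps(3840, 2160, 60)
print(round(need, 2))            # -> 11.94 Gbps before blanking
print(need < HBR2_PAYLOAD_GBPS)  # -> True, but without a huge margin
```

With that little margin, the extra attenuation of a 3-4m run is enough to push a mediocre cable into dropouts, which matches the experiences above.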


----------



## IMI4tth3w

jealous of all these 10,500+ superposition scores!!

I'm getting around 10,1xx from my 4770K @ 4.5GHz + 1080ti Strix @ 2088Mhz Core/+350 Mem (i haven't reached memory instability but saw zero performance improvement all the way up to +500 so i just stopped)

GPU seems to always be stuck at 1.062V no matter what i do with the voltage slider (i should probably not be using asus ai tweak as i thought it was a little odd the voltage slider is a % and not in mV) so 2088MHz core is as far as she'll go.

Both have ek water blocks keeping them nice and cool so i'll eventually get more creative with the voltage and look into getting in the 2100MHz+ 1080ti club
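For anyone else sweeping memory offsets the same way, a throwaway script along these lines makes the flat-scaling point easy to spot. `run_benchmark` here is a hypothetical stub with made-up numbers, standing in for an actual benchmark pass (e.g. launching Superposition and parsing the score):

```python
# Sketch of a memory-offset sweep. run_benchmark is a stand-in for a real
# benchmark run; the numbers below are invented to mimic the usual pattern:
# scaling flattens out, then error correction starts costing you score.
def run_benchmark(mem_offset_mhz):
    base = 10100
    gain = min(mem_offset_mhz, 350) * 0.5        # gains stop around +350 (toy model)
    penalty = max(0, mem_offset_mhz - 450) * 2   # retries past +450 hurt the score
    return base + gain - penalty

def sweep(offsets):
    """Benchmark each offset and return all results plus the best one."""
    results = [(off, run_benchmark(off)) for off in offsets]
    best = max(results, key=lambda r: r[1])
    return results, best

if __name__ == "__main__":
    results, best = sweep(range(0, 601, 50))
    for off, score in results:
        print(f"+{off} MHz -> {score:.0f}")
    print(f"best offset: +{best[0]} MHz")
```

With a real benchmark call swapped in, the printout shows exactly where extra memory offset stops buying you anything.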


----------



## TK421

Quote:


> Originally Posted by *KedarWolf*
> 
> Short displayport cables seem to have fewer issues, it's the longer ones that flicker etc.


ah

what's the average memory OC on a 1080 Ti before it gets negative scaling?


----------



## buellersdayoff

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sinistur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ViRuS2k*
> 
> Did you just join the forum to come in this thread and just say water cooling is pathetic for pascal cards .....
> 
> 
> 
> It really doesn't matter why i joined.
> Quote:
> 
> 
> 
> Originally Posted by *ViRuS2k*
> 
> Please go away......
> 
> Click to expand...
> 
> I know... i know... the truth is hard to handle but I wont be going anywhere.
> Quote:
> 
> 
> 
> Originally Posted by *ViRuS2k*
> 
> I gather you dont run SLI 1080TI`s @2050/6066 and get 100% load temps @49-53c
> 
> Click to expand...
> 
> Given the current state of sli I know better than to throw my money away on it.... Man, this is even worst than I imagined.....
> 
> Quote:
> 
> 
> 
> Originally Posted by *ViRuS2k*
> 
> On a serious note* you dont know what your talking about...
> 
> Click to expand...
> 
> Actually I do and you simply don't like it... Oh well.
> 
> Click to expand...
> 
> Lots of benefits to water cooling.
> 
> Lower temps on load thus higher bins. Pretty much known you lose bins the higher the temp goes, not just from thermal throttling but also your card just will gain a bin or two with lower operating temps.
> 
> The higher the temps, the more chance you lose a bin.
> 
> For example, if my card gets to say 51-52C on XOC BIOS Fire Strike Ultra will crash at 2088 1.100v and test will not pass, the framerates get messed up.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If I stay under 49C it runs just fine.
Click to expand...

A bin is 13MHz, right? Water cooling might be helpful for benchmark scores, but for games the difference between 1950MHz and 2050MHz, 100MHz fluctuating or throttling, will mean nothing...
I think my previous post was deleted? I've ditched the 980ti (I'll update my rig builder later) AND water cooling the gpu, because there really is no need. Not for gaming, and not without the mods that net you another 100MHz and a couple of hundred points on the scoreboard...


----------



## KingEngineRevUp

Quote:


> Originally Posted by *buellersdayoff*
> 
> A bin is 13mhz right? Water cooling might be helpful for benchmark scores, but for games a difference between 1950mhz and 2050mhz, 100mhz fluctuating or throttling will mean nothing in games...
> I think my previous post was deleted? I've ditched the 980ti (I'll update my rig builder later) AND water cooling the gpu because there really is no need. Not for gaming, or without the mods that net you another 100mhz and a couple of hundred points on the score board...


Running at 2101MHz is 7.75% over 1950MHz, and 2114MHz is 8.4% greater. That can be 6 to 10 fps extra. At 4K that can be the major difference between 50-52 fps and a solid 60 fps.
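Quick sanity check of those numbers, under the rough (and usually optimistic) assumption that fps scales linearly with core clock:

```python
def pct_gain(new_clock, old_clock):
    """Percentage clock increase."""
    return (new_clock / old_clock - 1) * 100

def scaled_fps(fps, old_clock, new_clock):
    """Linear-scaling estimate of fps at a higher core clock (best case)."""
    return fps * new_clock / old_clock

print(round(pct_gain(2101, 1950), 2))        # ~7.74
print(round(pct_gain(2114, 1950), 2))        # ~8.41
print(round(scaled_fps(52, 1950, 2114), 1))  # 52 fps -> ~56.4 fps, best case
```

In practice fps scales a bit worse than linearly, so treat these as upper bounds.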


----------



## Benny89

Quote:


> Originally Posted by *buellersdayoff*
> 
> A bin is 13mhz right? Water cooling might be helpful for benchmark scores, but for games a difference between 1950mhz and 2050mhz, 100mhz fluctuating or throttling will mean nothing in games...
> I think my previous post was deleted? I've ditched the 980ti (I'll update my rig builder later) AND water cooling the gpu because there really is no need. Not for gaming, or without the mods that net you another 100mhz and a couple of hundred points on the score board...


It makes almost no difference at 1440p or 1080p. Between a 1987 clock and my 24/7 2063 clock, the difference at 1440p is 2-4 fps. That is nothing if your game runs at 80-100 fps or 136-144 fps.

But for 4K 60Hz that may mean the difference between 57 fps and 60-61 fps, which is A LOT since 60 fps is the minimum we all want.


----------



## ALSTER868

I don't think that 57 and 60-61 fps in 4k make a clearly distinguishable difference either. I would say there is none.


----------



## buellersdayoff

There are reasons for water cooling, it's just not as important this time around. Faster cpu and ram will also help with minimums, if your budget is unrestricted then why not, go for it ☺


----------



## ALSTER868

Quote:


> Originally Posted by *buellersdayoff*
> 
> There are reasons for water cooling, it's just not as important this time around. Faster cpu and ram will also help with minimums, if your budget is unrestricted then why not, go for it ☺


Well, this sounds more like an excuse for yourself not to go for WC ''this round''.








Were there many more reasons in the ''previous round''? What extra benefits would I have got on my air cooled 980Ti, [email protected] bencheable and [email protected]/7, if I went for water? Except temps maybe, but they didn't bother me


----------



## buellersdayoff

Quote:


> Originally Posted by *ALSTER868*
> 
> Quote:
> 
> 
> 
> Originally Posted by *buellersdayoff*
> 
> There are reasons for water cooling, it's just not as important this time around. Faster cpu and ram will also help with minimums, if your budget is unrestricted then why not, go for it ☺
> 
> 
> 
> Well, this sound more like an excuse for yourself not to go for WC ''this round''.
> 
> 
> 
> 
> 
> 
> 
> 
> Were there much more reasons in the ''previous round''? What would I have got over this on my air cooled 980Ti [email protected] bencheable and [email protected]/7 if I went for water? Except temps maybe, but they didn't bother me
Click to expand...

Maxwell was also debatable, my 980ti clocked about the same as yours but with the unlocked bios and water cooling I could go further, not much so probably not worth it except for the COOL factor (pun intended), and yes it is my excuse for not bothering with it this time. I have all the gear needed except for the gpu block...


----------



## ALSTER868

Even with an unlocked bios and on water on my former 980Ti I would have got probably 50 extra MHz (that's optimistic) which would result in a gain next to.. nothing.
As for Pascal IMO there are more reasons to go WC, though it's my personal opinion.


----------



## Dasboogieman

Quote:


> Originally Posted by *ALSTER868*
> 
> Even with an unlocked bios and on water on my former 980Ti I would have got probably 50 extra MHz (that's optimistic) which would result in a gain next to.. nothing.
> As for Pascal IMO there are more reasons to go WC, though it's my personal opinion.


Hawaii had a lot of reasons to be watercooled. The VRM array alone demanded top tier watercooling just to push the clocks that WC can afford.
I remember we were comparing all the waterblocks and VRM temps were more important than any other consideration. IIRC Alphacool was top dog then, as they also are now for VRM temps, cuz of the active backplate. The difference was that back then, if your VRMs ran <50 degrees you could push more voltage, but now it's mostly cosmetic.


----------



## fisher6

The biggest reason to water cool for me is noise. I can set all my fans to run at 1000rpm and never touch them again. Doesn't matter if I'm gaming or just browsing, the pc is always dead silent


----------



## Hulio225

Quote:


> Originally Posted by *ALSTER868*
> 
> Even with an unlocked bios and on water on my former 980Ti I would have got probably 50 extra MHz (that's optimistic) which would result in a gain next to.. nothing.
> As for Pascal IMO there are more reasons to go WC, though it's my personal opinion.


Not a direct message to you, just in general, cause some argued here that it's not giving performance, or less than 1%, and so on...
While this is absolutely true, there are other reasons to do it: noise for example, aesthetics, the pleasure of building such systems, etc.

Don't know why some people think they have the right to call someone out for wasting his money, that is simply overbearing.
Everyone can do what he wants with his money, for whatever reasons he may have...

Questioning the sense of performance gains, no matter how small they are, on a forum called "overclock.net" makes even less sense in my humble opinion. Especially when there are enough people here who do benchmarking more than playing games and compete on HWBot, f.e.
Or who simply chase the absolute maximum out of their components, just for fun.

While it is okay to have your own opinion, it's questionable to bring a completely opposite mindset to a whole community and try to attack or argue with those people... that also makes no sense...

Just my 2 cents


----------



## ALSTER868

Quote:


> Originally Posted by *Dasboogieman*
> 
> Hawaii had a lot of reasons to be watercooled. The VRM array alone demanded top tier watercooling just to push the clocks that WC can afford.
> I remember we were comparing all the waterblocks and the VRM temps was more important than any other consideration. IIRC Alphacool was top dog then as they also are now for VRM temps cuz of the active backplate. The difference was back then if your VRMs ran <50 degrees you could push more voltage but now its mostly cosmetic.


Yep, water for Hawaii was a must, no question about it. Or an alternative air cooler like an Arctic at the very least.
But in the case of the last couple of generations of NVIDIA cards it's questionable.


----------



## Dasboogieman

Quote:


> Originally Posted by *fisher6*
> 
> The biggest reason to water cool for me is noise. I can set my all my fans to run at 1000rpm and never touch them again. Doesn't matter if I'm gaming or just browsing, the pc is always dead silent


I'm doing it because it's Lego for adults. Purely for the fun of it.


----------



## ALSTER868

Quote:


> Originally Posted by *Hulio225*
> 
> Not a direct message to you, just in general cause some argued here that its not giving performance etc or less than 1 % and so on...
> While this is absolutely true, there are other reasons to do it, noise for example, aesthetics, pleasure of building such systems etc etc.
> 
> Don't know why some people think they have the right to call someone out that he wasted his money, that is simply overbearing.
> Everyone can do what he wants with his money for whatever reasons he may have...
> 
> Discussing the sense of performance gains, no matter how small they are, on an forum called "overclock.net" seems to me like even less sense in my humble opinion. Especially when there are enough people here which do benchmarking more than playing games and competing on HWBot f.e.
> Or simply for the sake of gaining the absolute maximum out of their components, just for fun.
> 
> While it is okay to have your own opinion, its is questionable to have a whole opposite mindset in regard of stuff to a whole community and try to attack or argue with those people... that also makes no sense...
> 
> Just my 2 Cents


Absolutely agree with you mate, actually I do understand the motives and incentives of those building a water cooled machine and was in no way arguing with or even criticizing it. I am just like everybody here with the same hobby, otherwise why would I be here at all? 
I was trying to judge from the practical side of things, so to say..


----------



## Benny89

Quote:


> Originally Posted by *Hulio225*
> 
> Not a direct message to you, just in general cause some argued here that its not giving performance etc or less than 1 % and so on...
> While this is absolutely true, there are other reasons to do it, noise for example, aesthetics, pleasure of building such systems etc etc.
> 
> Don't know why some people think they have the right to call someone out that he wasted his money, that is simply overbearing.
> Everyone can do what he wants with his money for whatever reasons he may have...
> 
> Discussing the sense of performance gains, no matter how small they are, on an forum called "overclock.net" seems to me like even less sense in my humble opinion. Especially when there are enough people here which do benchmarking more than playing games and competing on HWBot f.e.
> Or simply for the sake of gaining the absolute maximum out of their components, just for fun.
> 
> While it is okay to have your own opinion, its is questionable to have a whole opposite mindset in regard of stuff to a whole community and try to attack or argue with those people... that also makes no sense...
> 
> Just my 2 Cents


This so much. We all here basically "waste" our money, because we could all easily "play" any game we want at 1/3 of the price we paid. Yet we don't do it. We buy premium cases, premium GPUs, monitors, speakers, LEDs, fans etc. Because it is a hobby.

I will be honest- I have more fun building and rebuilding my system than I have playing on it









My only personal concern against water cooling is that I do a lot of stuff inside my case from time to time. Cleaning, changing parts, changing fans, replacing, adding etc. A full loop would be a pain in the guts because of it. However, since my build is now finished, I am considering watercooling it.


----------



## Hulio225

Quote:


> Originally Posted by *ALSTER868*
> 
> Absolutely agree with you mate, actually I do understand the motives and incentives of those building a water cooled machine and was in no way arguing or even criticizing it. I am just like everybody here with the same hobby otherwise why should I be here at all?
> I was trying to judge from the practical side of things so to say..


Like I said, it wasn't directed at you personally.

There was another guy who was flaming a bit and calling people out for wasting their money and whatnot...
A narrow-minded, arrogant guy imho... But yeah, most peeps here aren't like him









Edit:
Quote:


> Originally Posted by *Benny89*
> 
> This so much. We all here basicelly "waste" our money, because we could all "play" any game we want at 1/3rd of a price we paid easly. Yet we don't do it. We buy premium cases, premium GPUs, monitors, speakers, LEDs, fans etc. Because it is a hobby.
> 
> I will be honest- *I have more fun building and rebuilding my system* than I have playing on it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *My only personal concern against water cooling is that I do a lot of stuff inside my case from time to time.* Cleaning, changing some parts, changing some fans, replacing, adding etc. Full loop would be pain in the gutts because of it. However since my build is now finished I am considering watercooling it.


If you have more fun building and rebuilding, a watercooling loop provides exactly that







There is always some work if you change something, and as long as it's flexible tubing with some extra length between components, it's pretty comfy to work with.


----------



## KedarWolf

Quote:


> Originally Posted by *Benny89*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Hulio225*
> 
> Not a direct message to you, just in general cause some argued here that its not giving performance etc or less than 1 % and so on...
> While this is absolutely true, there are other reasons to do it, noise for example, aesthetics, pleasure of building such systems etc etc.
> 
> Don't know why some people think they have the right to call someone out that he wasted his money, that is simply overbearing.
> Everyone can do what he wants with his money for whatever reasons he may have...
> 
> Discussing the sense of performance gains, no matter how small they are, on an forum called "overclock.net" seems to me like even less sense in my humble opinion. Especially when there are enough people here which do benchmarking more than playing games and competing on HWBot f.e.
> Or simply for the sake of gaining the absolute maximum out of their components, just for fun.
> 
> While it is okay to have your own opinion, its is questionable to have a whole opposite mindset in regard of stuff to a whole community and try to attack or argue with those people... that also makes no sense...
> 
> Just my 2 Cents
> 
> 
> 
> This so much. We all here basicelly "waste" our money, because we could all "play" any game we want at 1/3rd of a price we paid easly. Yet we don't do it. We buy premium cases, premium GPUs, monitors, speakers, LEDs, fans etc. Because it is a hobby.
> 
> I will be honest- I have more fun building and rebuilding my system than I have playing on it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My only personal concern against water cooling is that I do a lot of stuff inside my case from time to time. Cleaning, changing some parts, changing some fans, replacing, adding etc. Full loop would be pain in the gutts because of it. However since my build is now finished I am considering watercooling it.
Click to expand...

I use an EK Predator 360, everything is modularly connected with quick disconnects and easy to remove.









And EK is coming out with the modular MLC line in a few months, the same principle.









A Predator 360 is basically a custom loop in an AIO.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *buellersdayoff*
> 
> There are reasons for water cooling, it's just not as important this time around. Faster cpu and ram will also help with minimums, if your budget is unrestricted then why not, go for it ☺


You understand you're on overclock.net, right? You're not on underclock.net


----------



## hypespazm

Coming from Kepler to Pascal, it seems there is not that much overclocking to be done


----------



## Aganor

Quote:


> Originally Posted by *fisher6*
> 
> The biggest reason to water cool for me is noise. I can set my all my fans to run at 1000rpm and never touch them again. Doesn't matter if I'm gaming or just browsing, the pc is always dead silent


Water cooling my 1080 Ti got me about 15% more OC, and it runs A LOT cooler and A LOT quieter than stock. I don't even need to ramp the fans to max; at half speed I get the same temps, with no noise at all.


----------



## pantsoftime

So nobody was ever able to get that reviewer BIOS for the Inno3d iChill X3?


----------



## cluster71

Quote:


> Originally Posted by *IMI4tth3w*
> 
> jealous of all these 10,500+ superposition scores!!
> 
> I'm getting around 10,1xx from my 4770K @ 4.5GHz + 1080ti Strix @ 2088Mhz Core/+350 Mem (i haven't reached memory instability but saw zero performance improvement all the way up to +500 so i just stopped)
> 
> GPU seems to always be stuck at 1.062V no matter what i do with the voltage slider (i should probably not be using asus ai tweak as i thought it was a little odd the voltage slider is a % and not in mV) so 2088MHz core is as far as she'll go.
> 
> Both have ek water blocks keeping them nice and cool so i'll eventually get more creative with the voltage and look into getting in the 2100MHz+ 1080ti club


There is more to it than just tweaking. I'm trying to understand why my Superposition results are so bad, 9700-10180. I see many 7700K Z270i owners with problems. I see people with older PCs, with less CPU and GPU performance, that have better results. The card does not seem to load properly; it sits at low power draw in DirectX. In the OpenGL donut example it goes up to high power. I'm going to test KedarWolf's power tweak. Maybe it is the Z270i chipset itself, or that I have 2 M.2 SSDs.


----------



## Hulio225

Quote:


> Originally Posted by *cluster71*
> 
> There is more to it than just tweaking. I'm trying to understand why my superposition results are so bad 9700-10180. I see many 7700K Z270i owners with problems. I see people with older PCs with less CPU and GPU performance that have better results. The PC does not seem to load properly, it is low power outage in directx. In the opegl donut ex it goes up in high power. I'm going to test KendarWolf's power tweak. Whether it is the Z270i chipset itself or that I have 2 M.2 SSD.


Have you set the driver settings correctly, with performance everywhere etc.? In addition, a bench OS stripped of services and stuff helps a lot, and OS settings on max performance mode too. I can pretty much squeeze 500 points out with just the mentioned. I'm also on an i7 7700K and Z270, mate, and I'm getting 10800ish scores.


I'm pretty sure with some more tweaking I could get a bit more

Edit: ah... you are on Z270*i*, but I can't imagine that this is the issue; I guess it's some OS and/or driver settings you are missing


----------



## fisher6

EVGA GTX 1080Ti Kingpin Edition coming:






Ships with a clock of 2025 on air, which is amazing according to Jay









Just wanna try the bios it comes with.

EDIT:

Teardown by GN:


----------



## Benny89

Quote:


> Originally Posted by *fisher6*
> 
> EVGA GTX 1080Ti Kingpin Edition coming:
> 
> 
> 
> 
> 
> 
> Ships with clock at 2025 on air which is amazing according to Jay
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just wanna try the bios it comes with.
> 
> EDIT:
> 
> Teardown by GN:


Binned cards. Sadly the same loud fans.

If you have the money, it's a good way to ensure you get a golden card.


----------



## paskowitz

Quote:


> Originally Posted by *Benny89*
> 
> Binned cards. Sadly same loud fans.
> 
> If you have money, good way to ensure you get golden card.


I think the glass is more half empty... it's a good way to ensure you don't get a stinker of a card. I doubt these cards will hit 2200MHz... maybe on water with a lucky chip. Maybe people with chillers/sub-ambient.


----------



## FastEddieNYC

Quote:


> Originally Posted by *fisher6*
> 
> EVGA GTX 1080Ti Kingpin Edition coming:
> 
> 
> 
> 
> 
> 
> Ships with clock at 2025 on air which is amazing according to Jay
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just wanna try the bios it comes with.
> 
> EDIT:
> 
> Teardown by GN:


JayzTwoCents also posted an interview with K|NGP|N. It is stated that the board will have an XOC unlocked bios, either on the card or available to flash. It will be interesting to see how it compares to the Asus bios.


----------



## Hulio225

Quote:


> Originally Posted by *paskowitz*
> 
> I think the glass is more half empty... good way to ensure you don't get a stinker of a card. I doubt these cards will hit *2200Mhz... maybe on water with a lucky chip.* Maybe people with chillers/sub ambient.


Nope, I bet my a... off you will have to be lucky to hit even 2100MHz on water under normal ambient temps!! Like you said, with sub-zero temps, yes, maybe, but not normal; there is no way. I've been following this thread for a very, very long time and I know how good "good" cards look and how "common" they are.

Edit: It would be nice to do a poll and see how many peeps are capable of 2050MHz+ stable, like an FSU stress test or something...


----------



## Benny89

So my card can do 2100 on the 1.09V bin, but there is no way I can keep that due to temps. It drops down to 1.075V, where I crash if I touch 64/65C (it works fine below that no problem, tested with 100% fan speed).

So I settled for now on 1.06V for 2063 with 65%-70% fan speed depending on ambient temps; it keeps the card between 59-62C and games no problem. 60% fan speed would be enough if summer hadn't finally arrived here...

Can't wait to finally get myself a remote AC in my small gaming room.... Summer is hitting hard now (yesterday it was 28C in my room...)
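For what it's worth, that bin-dropping behaviour can be modelled roughly like this; the threshold temps and the ~13MHz bin size are just illustrative guesses at how GPU Boost 3.0 behaves, not documented values:

```python
BIN_MHZ = 13  # approximate size of one GPU Boost 3.0 clock bin

def boost_clock(base_boost_mhz, temp_c, thresholds=(38, 46, 54, 60, 64)):
    """Estimate the sustained clock: drop one ~13MHz bin for each
    temperature threshold crossed. Thresholds here are guesses."""
    bins_lost = sum(1 for t in thresholds if temp_c >= t)
    return base_boost_mhz - bins_lost * BIN_MHZ

# A card that boosts to 2100MHz cold sheds bins as it heats up:
for temp in (35, 50, 63, 66):
    print(temp, boost_clock(2100, temp))
```

It's crude, but it shows why keeping the card under a given temperature ceiling is worth a bin or two of sustained clock.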


----------



## Slackaveli

Quote:


> Originally Posted by *Benny89*
> 
> So my card can do 2100 on 1.09V bin but no way I can keep that due to temps. It drops down to 1.075 where I crash if I touch 64/5C (works fine below that no problem, tested with 100% fan speed).
> 
> So I settled for now on 1.06V for 2063 with 65%-70% fan speed depending on ambient temps and it keeps it between 59-62C and game no problem. 60% fan speed would be enough if there was not summer here finally...
> 
> Can't wait to finally get my self remote AC in my small gaming room.... Summer is hitting hard now (yesterday was 28C in my room...)


My exact perf and settings on the Aorus. 2063 is a nice spot to settle in @ ~59C.


----------



## Benny89

Quote:


> Originally Posted by *Slackaveli*
> 
> my exact perf and settings on aorus. 2063 is a nice spot to settle in @ ~59c.


Yea, I really want to put that card under water now


----------



## Coopiklaani

Quote:


> Originally Posted by *fisher6*
> 
> EVGA GTX 1080Ti Kingpin Edition coming:
> 
> 
> 
> 
> 
> 
> Ships with clock at 2025 on air which is amazing according to Jay
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just wanna try the bios it comes with.
> 
> EDIT:
> 
> Teardown by GN:


It doesn't ship with a 2025 stock clock. EVGA guarantees a 2025 core OC, but it ships with a core clock much lower than 2025MHz.
Maybe EVGA is being too conservative; most FE cards can hit 2025 on air no problem.


----------



## paskowitz

Quote:


> Originally Posted by *Coopiklaani*
> 
> It doesn't ship with 2025 stock clock. EVGA guarantees 2025 core OC, but ships with core clock much lower than 2025MHz.
> Maybe EVGA is too conservative, most FE cards can hit 2025 on air no problem.


#Marketing


----------



## TK421

kingpin card huh

wonder how many MHz you get for the price hike... normal users never use LN2 anyway


----------



## feznz

I couldn't comment on how much OC headroom I gained on water. All I can say is I gained sustained clocks with little audible noise; I would almost go as far as to say maybe 13MHz or less....
Every now and then us boys get together for LAN night, and when I hear cards on stock coolers I know I could never go back. *Water For Life*
As for max clocks, looking at Futuremark results etc., I would have assumed there'd be a few around 2200MHz, but instead there are a few around 2150MHz and then a huge jump to 2300MHz+; that's the separation between water/air and LN2/DICE.


----------



## OMGWTFISTHIS

Can you guys find the 1080 Ti Hybrid cooler in stock? It was in stock for like 30 minutes yesterday and I missed it at BHPhotoVideo. :\

PN: 400-HY-5188-B1


----------



## nrpeyton

Just water-cooling a FE won't do a tremendous amount for your O/C on its own.

The beauty of water-cooling seems to be the fact it opens up the possibility of doing the power mod.


----------



## OMGWTFISTHIS

I mostly wanted the Hybrid to bring that damn 86c max down to about half. It's like I have my own personal desk heater in the middle of May.


----------



## FastEddieNYC

Quote:


> Originally Posted by *nrpeyton*
> 
> Just water-cooling a FE won't do a tumendous amount for your O/C on it's own.
> 
> The beauty of water-cooing seems to be the fact it opens up the possibility of doing the power mod.


That is what I am doing. I already shunt modded my EVGA SC Black card and it's working great. I can run [email protected] at 80% TDP with no downclock, but I am waiting for my waterblock to arrive before pushing it further. Water cooling the VRMs is just as important as the core when using the shunt mod/XOC bios that pushes 400+ watts.


----------



## Luckbad

Quote:


> Originally Posted by *OMG***ISTHIS*
> 
> I mostly wanted the Hybrid to bring that damn 86c max down to about half. It's like I have my own personal desk heater in the middle of May.


The same amount of heat will still come out; it'll just be concentrated wherever you point the radiator/fans.


----------



## TK421

Quote:


> Originally Posted by *OMG***ISTHIS*
> 
> Can you guys find the 1080 Ti Hybrid cooler in stock? It was in stock for like 30 minutes yesterday and missed it from BHPhotoVideo. :\
> 
> PN: 400-HY-5188-B1


maxwell era cooler works, no difference


----------



## AngryLobster

Question for those of you who have either RMA'd a Hybrid cooler or bought one outright. I sent one to EVGA for an RMA replacement in its original EVGA cardboard box (AIO only, no shroud or accessories, as instructed). I received a brand new shrink-wrapped replacement in a black/dark brown EVGA branded box that shares the same model # with what I sent back (for a GTX 1080 that I re-purposed for my 1080 Ti).

Have any of you guys received a Hybrid cooler in this new packaging? It's considerably smaller than the Hybrid boxes I've seen. Is it the complete kit with shroud and hardware? I don't want to open it if it is, as I'd rather just sell it.


----------



## KedarWolf

Quote:


> Originally Posted by *AngryLobster*
> 
> Question for those of you guys who have either RMA'd a Hybrid cooler or bought one outright. I sent one to EVGA for a RMA replacement in it's original EVGA cardboard box (AIO only, no shroud or accessories as instructed). I received a brand new shrink wrapped replacement in a black/dark brown EVGA branded box that shares the same model # with what I sent back (for a GTX 1080 that I re-purposed for my 1080 Ti).
> 
> Have any of you guys received a Hybrid cooler in this new packaging? It's considerably smaller than the Hybrid boxes I've seen. Is it the complete kit with shroud and hardware? I don't want to open it if it is as I'd rather just sell it.


Do you mean a 1080 Hybrid kit or a 1080 Ti Hybrid kit?

The video shows the box size of a full 1080 Hybrid kit; skip ahead to 3:25.


----------



## AngryLobster

Yeah, the video shows the normal cardboard box they all come in. My replacement came in a shrink-wrapped EVGA dark brown box that's 15% smaller. I'm guessing it's what they do for Hybrid cooler RMAs, just sending you the AIO.


----------



## utparatrooper

Quote:


> Originally Posted by *KedarWolf*
> 
> Short displayport cables seem to have fewer issues, it's the longer ones that flicker etc.


As I understand it, there is a DP standard and most generic DP cables don't meet it; some kind of pin-out difference is, I think, the primary cause of flickering, especially on high refresh rate monitors. 4K, in particular, is limited to cables of 3 meters.

I buy the Accell cables on Amazon. Cables that meet the standard have the DP symbol imprinted on the cable, which I believe is some kind of licensing thing.

However, some people do have good luck with the unbranded/non-spec cables.


----------



## TK421

Quote:


> Originally Posted by *utparatrooper*
> 
> As I understand it, there is a DP standard and most generic DP cables do not meet regarding some kind of pin out standard which I think is the primary cause of flickering, esp. on high refresh rate monitors. 4K, in particular, is limited to cables at 3 meters.
> 
> I buy the Accell cables on Amazon. Cables that meet the standard has the DP symbol imprinted on the cable, which I believe is some kind of licensing thing.
> 
> However, some do have good luck with the unbranded/non-spec cables.


is amazonbasics not a licensed/standard quality?


----------



## utparatrooper

Quote:


> Originally Posted by *TK421*
> 
> is amazonbasics not a licensed/standard quality?


I don't quite remember, but I learned of this specific standard when I was researching why some cables were more expensive than others (looking for a legitimate reason).

DisplayPort also has a website that lists cables that meet its standards; ones that don't cannot use the DP imprint on the cable.

The Q&A on Amazon for the Accell cables sheds more light on this.

Again, I vaguely recall the pins/wiring are different between spec and non-spec cables.


----------



## gmpotu

Quote:


> Originally Posted by *KraxKill*
> 
> Thanks. It's a 5775C at 4.5ghz and a shunt modded Ti at 2100. Ram is 2133CL9. Yes the FE bios.
> 
> Looking for some 2400cl9 ram to break 11K


How does system RAM tie into the scores for SP4K and your overclock?
My system RAM is some cheap Samsung value RAM, DDR3 @1648MHz CL11 2T


----------



## KedarWolf

Quote:


> Originally Posted by *TK421*
> 
> Quote:
> 
> 
> 
> Originally Posted by *utparatrooper*
> 
> As I understand it, there is a DP standard and most generic DP cables do not meet regarding some kind of pin out standard which I think is the primary cause of flickering, esp. on high refresh rate monitors. 4K, in particular, is limited to cables at 3 meters.
> 
> I buy the Accell cables on Amazon. Cables that meet the standard has the DP symbol imprinted on the cable, which I believe is some kind of licensing thing.
> 
> However, some do have good luck with the unbranded/non-spec cables.
> 
> 
> 
> is amazonbasics not a licensed/standard quality?
Click to expand...

My problem was solved when I put the OD setting on my Acer monitor from Extreme to Normal.


----------



## gmpotu

Finally caught up on the 100 posts I missed this weekend.
I picked up Final Fantasy Brave Exvius on my iPhone and now I'm not even using my GTX 1080 Ti.
I've spent 3 days straight on this iOS game.

Still keeping up with the posts when I can though.


----------



## KedarWolf

I NEED A HUGE amount of money to buy another 1080 Ti.


----------



## Benny89

Lol, Dishonored 2 is a strangely optimized game. It almost never uses 99% GPU, often 95%-97%, and many times even below that. Funny thing is it doesn't use the CPU cores either: out of 4 cores, two sit at 5-11% usage and two at 34-40%.

Everything on Ultra and the game can't keep a stable 120 fps even with its quite old-looking graphics.

Really badly optimized game.


----------



## Dasboogieman

After one friggin day, the surgery was successful. The Aorus + CPU are now together in one loop. Also got rid of the weedy 70L/hr Alphacool pump and got a proper D5.

GPU temps are now <40C pretty much at all times (unless the CPU is also fully loaded). As expected for 16nm FinFET, 2100MHz is now stable along with 12500MHz VRAM, and I also gained about 25-30W of TDP headroom (partly from the removed fans and partly from colder temps for everything). Funny thing is, my Superposition score has gone down, but it's probably a driver thing. Too tired to care now.


----------



## ViRuS2k

Be interested to know who posts that KINGPIN bios first.

Apart from the power limit being removed, they also said the temp limits have been removed as well, so no more bin jumping due to temps lol.

I can't wait to try it out... hope it's FE compatible...
Hope it also removes the normalized TDP.

Though the card ships with the normal FE bios and a 2025MHz core clock bios, as well as a 3rd XOC bios.
Please someone release those BIOSes.


----------



## Benny89

Quote:


> Originally Posted by *ViRuS2k*
> 
> Be intrested to know who posts that KINGPIN bios first
> 
> 
> 
> 
> 
> 
> 
> 
> apart from the power limit removed, they also said the temp limits have been removed aswell. so no more bin jumping due to temps lol
> 
> i cant wait to try it out...... hope its FE compatible....
> hope it also removes normilized tdp
> 
> 
> 
> 
> 
> 
> 
> 
> 
> though the card ships with normal fe bios and a 2025core clock bios aswell as a 3rd XOC bios
> please someone release those bios`s


But you know that removing the temp limit won't do anything for core stability. If a certain temp is reached at a given clock + voltage bin, it will crash anyway. Removing the temp limit will only really benefit people with water cooling, as they can skip one bin; on air you will have to keep the temp in check anyway so the core won't crash.

Besides, no BIOS will fix the silicon lottery.
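For illustration, the bin-jumping works roughly like this: GPU Boost drops the clock by one ~13MHz bin as the core climbs through temperature thresholds. A toy sketch (the 5C step and 40C base below are assumptions for illustration; real thresholds vary per card and BIOS):

```python
# Toy model of GPU Boost thermal bin-dropping.
# BIN_MHZ is the Pascal clock bin size; STEP_C and BASE_TEMP_C are
# assumed values for illustration, not measured card behavior.

BIN_MHZ = 13          # one Pascal clock bin
STEP_C = 5            # assumed temp interval per dropped bin
BASE_TEMP_C = 40      # assumed temp below which no bins are dropped

def effective_clock(set_clock_mhz, temp_c):
    bins_dropped = max(0, (temp_c - BASE_TEMP_C) // STEP_C)
    return set_clock_mhz - bins_dropped * BIN_MHZ

print(effective_clock(2100, 38))  # watercooled: no bins lost -> 2100
print(effective_clock(2100, 65))  # on air: several bins lost -> 2035
```

Which is why, even with the limit removed, an air cooler still has to keep temps down to hold the top bin.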


----------



## ALSTER868

Doesn't the XOC bios remove temp limits just as well? What's the difference between this Kingpin bios and the XOC, I'm wondering?


----------



## TheBoom

Quote:


> Originally Posted by *OMG***ISTHIS*
> 
> I mostly wanted the Hybrid to bring that damn 86c max down to about half. It's like I have my own personal desk heater in the middle of May.


You know that the higher efficiency of watercooling means your room will get even hotter than it would on air, right?

Bringing down the temperature of your core/card doesn't mean lower room temps. It's usually the opposite, since more heat is being dumped out now.
Quote:


> Originally Posted by *ALSTER868*
> 
> Doesn't XOC bios remove temp limits just as well? What is the difference between this kingpin bios and XOC I'm wondering?


Yes, it removes the soft temp limits, but if your components are getting too hot at a certain voltage the card will crash nonetheless. It's kind of a hard limit that you can't do much about, other than full-block watercooling.

My card cannot go over 1150mv regardless of clocks. Too much heat being put out by the VRMs, I suppose.


----------



## Dasboogieman

Buildzoid may have found the answer to why the XOC BIOS performs worse than the stock FE.





TL;DR: Pascal may have an on-chip power management controller, similar to Fiji, that cannot be overridden. This same circuit may also be responsible for the normalized TDP that the shunt-mod guys have had to deal with.


----------



## Benny89

Quote:


> Originally Posted by *Dasboogieman*
> 
> TLDR, Pascal may have an on chip power management controller similar to Fiji that cannot be overriden. This same circuit may also be responsible for the normalized TDP that the shunt guys have had to deal with.


Bastards! So many things to screw the overclocking community with Pascal: locked BIOS, locked voltage, power controllers, etc. Like the average user would ever bother with OCing their cards...


----------



## TheBoom

Quote:


> Originally Posted by *Dasboogieman*
> 
> TLDR, Pascal may have an on chip power management controller similar to Fiji that cannot be overriden. This same circuit may also be responsible for the normalized TDP that the shunt guys have had to deal with.


I think for us the memory clocks also have an impact since GDDR5X. But how many of you actually get better performance at stock clocks compared to 2050?

Or does this only apply to shunt and other resistor mods?


----------



## Dasboogieman

Quote:


> Originally Posted by *TheBoom*
> 
> I think for us the memory clocks also have an impact since GDDR5X. But how many of you actually get better performance at stock clocks compared to 2050?
> 
> Or does this only apply to shunt and other resistor mods?


I dunno, probably more relevant at higher voltages, considering the chip could be potentially doing its own BIOS level or lower management of its power which we can't even see.


----------



## Coopiklaani

Quote:


> Originally Posted by *Dasboogieman*
> 
> Buildzoid may have found the answer to why the XOC BIOS performs worse than the stock FE.
> 
> 
> 
> 
> 
> TLDR, Pascal may have an on chip power management controller similar to Fiji that cannot be overriden. This same circuit may also be responsible for the normalized TDP that the shunt guys have had to deal with.


I think it explains why a "cliff"-shaped curve performs worse than a smooth one.

Basically, the voltage and frequency read by software are the GPU's desired voltage and frequency bin. However, due to vdroop, the GPU usually gets less voltage than it asks for, and is thus forced to run at a lower bin.

Say in a cliff-shaped F-V curve you set the top bin at 1.093v: when the GPU asks the VRM for 1.093v, it actually gets less than 1.093v due to vdroop and is forced all the way down to 2000MHz. But GPU-Z only reports the frequency the GPU "wants" to operate at, not the frequency it actually operates at.

On the other hand, with a smooth F-V curve the same thing happens: when the GPU asks for 1.093v, it gets less than 1.093v, maybe say 1.088v, so the GPU only has to run one bin lower, at 2088MHz.
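A toy model of that bin selection under vdroop (the curves and the 5mV droop below are illustrative numbers, not from any real card): the GPU falls back to the highest bin whose minimum voltage the VRM actually delivers.

```python
# Toy model of GPU Boost bin selection under vdroop (illustrative only).

def actual_clock(curve, requested_v, vdroop_v):
    """curve: list of (min_voltage, clock_mhz) bins, ascending.
    The GPU requests requested_v, but after vdroop only
    (requested_v - vdroop_v) arrives, so it falls back to the
    highest bin whose minimum voltage is still satisfied."""
    delivered = requested_v - vdroop_v
    eligible = [clk for v, clk in curve if v <= delivered]
    return max(eligible) if eligible else min(clk for _, clk in curve)

# "Cliff" curve: one big jump straight to the top clock.
cliff = [(0.800, 2000), (1.093, 2100)]
# Smooth curve: intermediate bins with small voltage steps.
smooth = [(0.800, 2000), (1.081, 2063), (1.088, 2088), (1.093, 2100)]

droop = 0.005  # assume 5 mV of vdroop under load
print(actual_clock(cliff, 1.093, droop))   # falls all the way to 2000
print(actual_clock(smooth, 1.093, droop))  # drops only one bin, to 2088
```

Software would still report the requested 2100MHz bin in both cases, which matches the GPU-Z observation above.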


----------



## TheBoom

Quote:


> Originally Posted by *Dasboogieman*
> 
> I dunno, probably more relevant at higher voltages, considering the chip could be potentially doing its own BIOS level or lower management of its power which we can't even see.


So I just did a quick test on the XOC using stock clocks and memory at 11.8ghz.

Stock settings result in a voltage of 1112mv without touching the curve. Clocks boost to 1999. Test 1 fps: 65.12.

Using the curve, I locked voltage to 1031mv at the same 1999 core clock. Test 1 fps: 65.31.

So I don't think this is the reason for the XOC scoring lower than the FE bios. There must be something else affecting it, likely the memory timings as initially suggested.

Last test at my normal gaming curve. 1131mv 2088 with same 11.8ghz mem. Test 1 fps - 66.47.

So +88 on the core gives 1 fps extra. Whether this is performance degradation or simply diminishing returns is the question.


----------



## Coopiklaani

Quote:


> Originally Posted by *TheBoom*
> 
> So I just did a quick test on the XOC using stock clocks and memory at 11.8ghz.
> 
> Stock settings results in voltage of 1112mv without touching the curve. Clocks boost to 1999. Test 1 fps - 65.12.
> 
> Using curve locked voltage to 1031mv at same 1999 core clock. Test 1 fps - 65.31.
> 
> So I don't think this is the reason for XOC scoring lower than FE bios. There must be something else affecting it likely the memory timings as initially suggested.


I think Pascal is doing something similar to what Maxwell used to do, but much worse. In Maxwell, each clock bin is assigned a range of operating voltage, a minimum and a maximum. My theory is that the XOC bios changed the voltage ranges, so it requires a higher minimum voltage to run the same clock bin. Due to the vdroop problem I mentioned in the previous post, that may cause the GPU to run 1 or 2 bins lower than on the FE bios.


----------



## TheBoom

Quote:


> Originally Posted by *Coopiklaani*
> 
> I think pascal is doing something similar to what maxwell used to do, but much worse. In maxwell, each clock bin is assigned to a range of operating voltage, the minimum and the maximum. My theory is, XOC changed the voltage ranges, so it requires more minimum voltage to run the same clock bin. Due to the vdroop problem I mentioned in the previous post, it may cause GPU to run 1 or 2bins lower than FE bios.


You are probably right, but I don't think it's an XOC issue; more likely the stock Asus Strix bios has the same voltage ranges.

So if we did the curve properly with the smooth ramp, or at least the 3 bins previous to the "active bin", we should be getting the right performance technically shouldn't we? Since we are locking a range of clocks to their voltage bins.

Edit: Unless of course the clocks and voltages reported are false.


----------



## Coopiklaani

Quote:


> Originally Posted by *TheBoom*
> 
> You are probably right, but I don't think its an XOC issue, more likely the stock Asus Strix bios has the same voltage ranges.
> 
> So if we did the curve properly with the smooth ramp, or at least the 3 bins previous to the "active bin", we should be getting the right performance technically shouldn't we? Since we are locking a range of clocks to their voltage bins.
> 
> Edit: Unless of course the clocks and voltages reported are false.


The XOC and FE BIOSes perform the same in Time Spy and Firestrike given the same curve, as I tested earlier. XOC does perform worse in SP 4k and 8k for some reason. I think it may be due to vdroop, since SP4k is much more power hungry than the 3DMarks.


----------



## TheBoom

Quote:


> Originally Posted by *Coopiklaani*
> 
> XOC and FE bios' perform the same in Timespy and firestrike given the same curve as I tested earlier. XOC does perform worse in SP 4k and 8k for some reason. I think it may due to vdroop since SP4k is much more power hungry than 3dmarks.


So maybe it's a power delivery issue as opposed to voltages then? The guy in the video only demonstrated how the voltage got locked at certain clocks; he didn't show the actual power consumption and whether there was a difference.

This makes me curious though. Would I be better off on my stock Amp Extreme bios if I'm gaming at 4k or using DSR in games?


----------



## Coopiklaani

Quote:


> Originally Posted by *TheBoom*
> 
> So maybe it's a power delivery issue as apposed to voltages then? The guy in the video only demonstrated how the voltage got locked at certain clocks but he didn't show the actual power consumption and whether there was a difference.
> 
> This makes me curious though. Would I be better off on my stock Amp Extreme bios if I'm gaming at 4k or DSR in games.


Power delivery does struggle when using the XOC bios, as I can hear coil whine at voltages higher than 1.1v. It still needs more investigation I suppose... GPU Boost 3.0 is a messy, convoluted system that integrates voltage, power, frequency and temperature. No way you can change one without affecting the others.


----------



## Hulio225

Quote:


> Originally Posted by *Coopiklaani*
> 
> *power delivery does struggle when using the XOC bios as I can hear coil whine at voltage higher than 1.1v*. It still needs more investigation I suppose..GPU boost 3.0 is a messy convoluted system that integrates voltage, power, frequency and temperature.. No way you can change one without affecting others..


Coil whine isn't an indicator for any struggle...


----------



## Coopiklaani

Quote:


> Originally Posted by *Hulio225*
> 
> Coil whine isn't an indicator for any struggle...


Sorry, I meant it's a sign of higher power/current draw. My card only starts to coil whine when power exceeds 400w.

I can never hit 400w with the FE bios, even with the shunt mod. But with XOC, it's an easy target.


----------



## Hulio225

Quote:


> Originally Posted by *Coopiklaani*
> 
> Sorry, I meant it's a sign for higher power/current draw. My card only starts to coil whine when power exceeds 400w.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can never hit 400w with the FE bios, even with the shunt mod. But with XOC, it is an easy target.


Yep, in Mass Effect Andromeda the freaking title screen is pulling 420 watts+ on my shunt-modded card with the XOC bios. Considering that my monitored power draw reads somewhere 20-25% low, we are speaking of ~500W for the GPU in that particular title... that is humongous^^
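Quick back-of-the-envelope math on that (the 20-25% under-read is the figure from the post above; the exact factor depends on the particular shunt mod):

```python
# Correct an under-reported power figure on a shunt-modded card.
# under_read_frac is how far below the true draw the software reads
# (assumed 20-25% here, per the post; depends on the actual mod).

def true_power(reported_w, under_read_frac):
    return reported_w * (1.0 + under_read_frac)

print(round(true_power(420, 0.20)))  # ~504W, i.e. the ~500W ballpark
print(round(true_power(420, 0.25)))  # ~525W at the upper end
```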


----------



## Coopiklaani

Quote:


> Originally Posted by *Hulio225*
> 
> Yep, in Mass Effect Andromeda the freaking Title Screen is pulling 420 Watts+ on my shunt modded card and XOC bios, considering in that my monitored Powerdraw is somewhere 20-25% lower... we are speaking of 500W for the GPU in that particular title... that is humongous^^


Well, I guess pulling 450w+ out of a 300w VRM is really pushing it. It may really struggle if not under a full-cover water block.


----------



## Hulio225

Quote:


> Originally Posted by *Coopiklaani*
> 
> Well I guess pulling 450w+ output a 300w vrm is really pushing it. It may really struggle if not under a fullcover water block.


The VRM on the FE is quite good, to be honest. At 400 A and 1.1 V, which equals 440 W of output, the VRM is dissipating something like ~40 W of heat due to switching losses, assuming the switching frequency is at the 600 kHz maximum the voltage controller is capable of, and that is manageable.
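To sanity-check those numbers: 440W out with ~40W of heat implies a conversion efficiency of a bit under 92%. A quick sketch (the efficiency here is an assumed figure, not a datasheet spec):

```python
# Rough VRM loss estimate: heat dissipated in the conversion stage
# for a given output power and an assumed efficiency. 440W out at
# ~91.6% efficiency implies ~40W of switching/conduction losses,
# matching the ballpark in the post above.

def vrm_losses(current_a, vcore_v, efficiency):
    p_out = current_a * vcore_v   # power delivered to the GPU core
    p_in = p_out / efficiency     # power drawn from the 12V rail
    return p_out, p_in - p_out    # (output watts, heat in watts)

out_w, heat_w = vrm_losses(400, 1.1, 0.916)
print(round(out_w), round(heat_w))  # 440 and ~40
```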


----------



## TheBoom

Yeah, ME:A is crazy even at 1440p. HWMonitor shows my Amp Extreme frequently hitting 420w with a max of 440w, and I'm not even shunt modded.

Question is whether this is actually reducing performance compared to stock bios with lower power draw.


----------



## Hulio225

Quote:


> Originally Posted by *TheBoom*
> 
> Yeah ME:A is crazy even at 1440p. HWM shows the my amp extreme frequently hitting a max of 420w with a max of 440w and I'm not even shunt modded.
> 
> Question is whether this is actually reducing performance compared to stock bios with lower power draw.


Can't tell you, haven't benched performance differences in ME:A... but tbh this power draw for that crappy looking game is horrible...


----------



## TheBoom

Quote:


> Originally Posted by *Hulio225*
> 
> Can't tell you, haven't benched performance differences in ME:A... but tbh this power draw for that crappy looking game is horrible...


Agreed. Also the CPU usage -.-.

Well at least with reshade its much better looking and playable.


----------



## ir88ed

Quote:


> Originally Posted by *TheBoom*
> 
> Agreed. Also the CPU usage -.-.
> 
> Well at least with reshade its much better looking and playable.


What reshade did you do? Was it a nexus mod? I have been playing Prey and haven't touched ME:A for a while.


----------



## Benny89

Quote:


> Originally Posted by *ir88ed*
> 
> What reshade did you do? Was it a nexus mod? I have been playing Prey and haven't touched ME:A for a while.


Best one on Nexus is Cinematic Realism Lighting Overhaul with Depth-of-Field. I tested all of them. This one will BLOW YOU AWAY on planets. It also has an OSD menu where you can tweak and turn on/off all features to make it look best for you. It makes the game look HDR without the need for an HDR monitor. It's so good, and finally all the characters' skin looks normal since they don't lose light...

But it eats your fps. Just so you know. A 1080 Ti can tank that easily, though.

And yes, ME:A is power hungry, BUT ONLY in the menu and on the Tempest (which is the fault of the horrible optimization there). On the Tempest the power draw can be high enough to downclock my card from its stable 2063 to 1911!!

However, anywhere else (on planets, the Nexus, during missions, etc.) the power draw is normal; I never dropped below 2063 and the max TDP I saw there was 110%.


----------



## feznz

Quote:


> Originally Posted by *TheBoom*
> 
> So maybe it's a power delivery issue as apposed to voltages then? The guy in the video only demonstrated how the voltage got locked at certain clocks but he didn't show the actual power consumption and whether there was a difference.
> 
> This makes me curious though. Would I be better off on my stock Amp Extreme bios if I'm gaming at 4k or DSR in games.


http://hexus.net/tech/news/graphics/103561-kingpin-pushes-nvidia-geforce-gtx-1080-ti-past-25ghz-ln2/

I saw this before I got my Strix. I didn't like the aesthetics of the FE, even though I could have got an FE with a full-cover block for $100 less than the Strix.

Silicon lottery: true 2100MHz+ cards are out there, but like a good gambler, you only hear the ones yelling "I got a winner!" at the top of their lungs.


----------



## KedarWolf

Top 1080 Ti Time Spy on OC.net.









And yes, it's rendered on my 1080 Ti; my old Maxwell Titan X is detected by 3DMark as my primary card because it's in my second PCI-E slot as a dedicated PhysX card.


----------



## Hulio225

Quote:


> Originally Posted by *KedarWolf*
> 
> Top 1080 Ti Time Spy on OC.net.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And yes, it's rendered on my 1080 Ti, my old Maxwell Titan X is detected by 3DMark as my primary card, it's in my second PCI-E slot as a dedicated PhysX card.


Total points maybe... but the GPU score is far from the top spot:









http://www.3dmark.com/spy/1729047


----------



## DerComissar

Quote:


> Originally Posted by *KedarWolf*
> 
> Top 1080 Ti Time Spy on OC.net.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And yes, it's rendered on my 1080 Ti, my old Maxwell Titan X is detected by 3DMark as my primary card, it's in my second PCI-E slot as a dedicated PhysX card.


Well I'm impressed by that overall 11K+ score regardless!


----------



## RavageTheEarth

Quote:


> Originally Posted by *Benny89*
> 
> Best one on Nexus is Cinematic Realism Lighting Overhaul with Depth-of-Field. I tested all of them. This one will BLOW YOU OUT on planets. It also has OSD menu where you can tweak, turn on/off all featueres to make it best looking for you. It made game HDR without need of having HDR monitor. It soo good and finally all characters skin look normally since they don't loose light...
> 
> But it eats your fps
> 
> 
> 
> 
> 
> 
> 
> Just so you know. But 1080Ti can tank that easly.
> 
> And yes ME:A is power hungry BUT ONLY in Menu and on Tempest (which is the fault of horrible optimalization there). On Tempest Power Draw can be high enough to downclock my card from its 2063 stable to 1911!!
> 
> However anywhere else- on planets, Nexus, during missons etc. The power draw is normal and I never droped below 2063 and max TDP there I saw was 110%.


I don't have any issues with the menu, but I do get drops all the way down into the 40s sometimes on the Tempest, depending on where I am. I'm playing on a 1440p G-Sync 144Hz monitor, so it drives me nuts. I get good performance on planets, usually around 90-120 FPS with everything maxed out except HDR. Great game besides the technical problems.


----------



## KenjiS

Well, my 980 Ti died last night in the midst of Mass Effect Andromeda.

While it's under warranty, I decided to heck with it and placed an order for an Asus Strix 1080 Ti. Figure I'll sell the 980 Ti when I get it back from warranty service.


----------



## feznz

Quote:


> Originally Posted by *Hulio225*
> 
> Total points maybe... gpu score far away from top spot
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/spy/1729047


I swear Intel owns Futuremark... or at least sponsors them.
Then again, I've seen people upgrade from a bronze to a gold PSU genuinely believing they're going to get performance gains.




----------



## IMI4tth3w

Greetings everyone!

Asus Strix OC 1080ti with EK water block owner here.

Just did the XOC bios flash on this thing to see how much further I can take it. Unfortunately, it seems that no matter what I do, I cannot get this thing stable at 2100MHz. It is rock solid stable at 2088MHz @ 1.043V (no need for the XOC bios there...), but I've taken the voltage as far as 1.162V trying to get a stable 2100MHz run and she's just not having it.

Now the even stranger thing is that before I flashed the XOC bios, I had a freak run through Superposition at 2100MHz and I think 1.062V, and got a score of 10,5xx on 4K Optimized. But I've never been able to get a complete run at 2100MHz again at any voltage. I even tried some 2113MHz runs at random voltages just to see if my card didn't like that frequency or something LOL.

Is it just the hard Pascal silicon-lottery wall I'm hitting? I just find it odd that I can get 2088MHz at such low voltages. You'd think it'd reach the next bin easily with a bit more voltage... Pascal is so strange.


----------



## Nico67

Quote:


> Originally Posted by *IMI4tth3w*
> 
> Greetings everyone!
> 
> Asus Strix OC 1080ti with EK water block owner here.
> 
> Just did the XOC bios flash on this thing to see how much further i can take it.. Unfortunately it seems that no matter what i do, i cannot get this thing stable at 2100MHz. This thing is rock solid stable at 2088MHz @ 1.043V (no need for XOC bios here..) but i've taken the voltage as far as 1.162V to try and get a 2100MHz stable run but she's just not having it.
> 
> Now the even stranger thing is that before i did the XOC bios, I had a freak run through of Superposition at 2100MHz and i think 1.062V, Got a score of 10,5xx on 4k optimized. But i've never been able to get a complete run through again at 2100MHz at any voltage. I even tried some 2113Mhz runs at some random voltages just to see if my card didn't like that frequency or something LOL
> 
> Is it just the hard pascal silicon lottery wall i'm hitting? i just find it odd that i can get 2088MHz at such low voltages. You'd think it'd be able to get to the next bin easily with a bit more voltage.. pascal is so strange.


Nah, it seems it just gets to a point and then there's no stability after that. You may get some less demanding things to work, but stable game clocks are usually even lower than what benchmarks will pass at.


----------



## Nico67

Quote:


> Originally Posted by *Dasboogieman*
> 
> Buildzoid may have found the answer to why the XOC BIOS performs worse than the stock FE.
> 
> 
> 
> 
> 
> TLDR, Pascal may have an on chip power management controller similar to Fiji that cannot be overriden. This same circuit may also be responsible for the normalized TDP that the shunt guys have had to deal with.


That was interesting to watch, and it certainly shares a lot of the anomalies noticed in posts here. I noticed vastly improving performance with voltage at the same clock, but I think testing below the 800mv minimum was going to cause issues. The most interesting part, however, was the notion that the reported clocks may not be the actual working clocks, or that downclocking of sub-clocks could be a thing. I would totally agree that holds some weight, and even XOC doesn't seem to get around it.
If the GPU does measure normalized TDP, then I don't think it reads it internally, or AIB cards couldn't get past it, so maybe it reads it via an external resistor that could be varied?

At any rate, these cards are a lot more OC-unfriendly than previous generations, although just cooling them and bumping the power limit to max does help. Just not as much fun.


----------



## Streetdragon

So, I need a bit of help.

I'm fed up with AMD. My Crossfire R9 390 setup is buggy and not stable: Heaven runs OK, but 3DMark and Prey crash.
I don't want to wait 3 or more months to get a watercooled Vega.

Now my question:
What is the best 1080 Ti to get for watercooling with a Heatkiller block? Reference, EVGA FTW, or some other card? I want to clock it to hell and back xD

Maybe I should add I need 1x HDMI, 1x DVI, 1x DP.


----------



## OZrevhead

Similar to the above but I am going to start using a chiller and lean on a Ti hard, which Ti would be best for me? Flash something with a HOF bios? Is there any benefit to buying an actual Ti HOF? They are listed here (none in stock) for AUD1379 ....


----------



## Benny89

Quote:


> Originally Posted by *IMI4tth3w*
> 
> Greetings everyone!
> 
> Asus Strix OC 1080ti with EK water block owner here.
> 
> Just did the XOC bios flash on this thing to see how much further i can take it.. Unfortunately it seems that no matter what i do, i cannot get this thing stable at 2100MHz. This thing is rock solid stable at 2088MHz @ 1.043V (no need for XOC bios here..) but i've taken the voltage as far as 1.162V to try and get a 2100MHz stable run but she's just not having it.
> 
> Now the even stranger thing is that before i did the XOC bios, I had a freak run through of Superposition at 2100MHz and i think 1.062V, Got a score of 10,5xx on 4k optimized. But i've never been able to get a complete run through again at 2100MHz at any voltage. I even tried some 2113Mhz runs at some random voltages just to see if my card didn't like that frequency or something LOL
> 
> Is it just the hard pascal silicon lottery wall i'm hitting? i just find it odd that i can get 2088MHz at such low voltages. You'd think it'd be able to get to the next bin easily with a bit more voltage.. pascal is so strange.


You are hitting a frequency stability wall. For example, my card can't do 2088 at 1.04v like yours, but it can do 2100 at 1.09v. So your card can do better clocks at the lower bins than mine, but mine can be stable at higher clocks overall.

This is the silicon lottery. You are a winner too, as getting 2088 at 1.043v is great! Your temps will be a little lower thanks to that.

I need 1.08v for 2088.

My card is very consistent with OC: 2038 at 1.04, 2050 at 1.05, 2063 at 1.06, etc. At least those numbers are easy to remember lol!


----------



## Besty

Looking at the 3DMark leaderboard for single GPUs, Duck san and Rbuass came fairly close to beating Kingpin with the 1080 Ti Hall of Fame.

Looks like the GPU tops out at 2.5-2.6GHz under LN2 with this card.

I assume they are using an as-yet-unseen 1080 Ti HOF XOC bios which has not been released into the wild.

The 1080 Ti HOF Bitspower block was shown at Computex. Has anyone got any insider news about when it will be released?


----------



## Dasboogieman

Quote:


> Originally Posted by *Streetdragon*
> 
> Soo i need a bit help
> 
> im pissed from AMD. My Corssfire R9 390 is buggy and not stable. Heaven runs ok but 3Dmark or Prey crashes.
> I dont wanna wait 3 or more month to get a watercooled vega:
> 
> now my question:
> What is the best 1080TI to get for waterooling with a Heatkiller cooler? Referenz or EVGA FTW or some ther cards? I wanna clock it to hell and back xD
> 
> Maybe i should Add i Need 1 HDMI 1 DVI 1 dp


ASUS Strix + the EK waterblock. The ASUS card is the only one so far that works with the fully unlocked XOC BIOS with no performance degradation, combined with top tier voltage regulation and highly efficient MOSFETs (less heat for your loop to deal with). Only real drawback is the exorbitant price of entry.

If you don't intend to go XOC, Aorus + EK waterblock. You get the benefits of a built-in 375W TDP limit but Voltage regulation is total crap (i.e. high VDROOP).

IIRC Zotac might also have a waterblock and that one will give you 384W TDP but even worse quality VRMs (higher heat output) with better drivers (i.e. better voltage regulation than the Aorus, maybe slightly worse than the ASUS).

If you want cost effective, any EVGA FE card will do.
Quote:


> Originally Posted by *IMI4tth3w*
> 
> Greetings everyone!
> 
> Asus Strix OC 1080ti with EK water block owner here.
> 
> Just did the XOC bios flash on this thing to see how much further i can take it.. Unfortunately it seems that no matter what i do, i cannot get this thing stable at 2100MHz. This thing is rock solid stable at 2088MHz @ 1.043V (no need for XOC bios here..) but i've taken the voltage as far as 1.162V to try and get a 2100MHz stable run but she's just not having it.
> 
> Now the even stranger thing is that before i did the XOC bios, I had a freak run through of Superposition at 2100MHz and i think 1.062V, Got a score of 10,5xx on 4k optimized. But i've never been able to get a complete run through again at 2100MHz at any voltage. I even tried some 2113Mhz runs at some random voltages just to see if my card didn't like that frequency or something LOL
> 
> Is it just the hard pascal silicon lottery wall i'm hitting? i just find it odd that i can get 2088MHz at such low voltages. You'd think it'd be able to get to the next bin easily with a bit more voltage.. pascal is so strange.







Buildzoid does a better job of demonstrating this quirk than any explanation I can give.


----------



## Streetdragon

Wow, nice answer ^^

Then it would be the Aorus vs the Founders (the Strix is way too expensive).
Does the Aorus have any benefits for overclocking vs the Founders?

----------



## KedarWolf

Quote:


> Originally Posted by *Streetdragon*
> 
> wow nice answer^
> then it would be Aorus vs founders. (strix is way to expensive)
> has the Aorus any benefits in terms of overclocking vs the founders?


The only benefit would be the higher power limit (of the Aorus Extreme, I think it is). You can flash the Extreme BIOS onto a regular Aorus, though, and get the same benefits.

I'm pretty sure I got my cards right here, on my phone.


----------



## Dasboogieman

Quote:


> Originally Posted by *Streetdragon*
> 
> wow nice answer^^
> 
> then it would be Aorus vs founders. (strix is way to expensive)
> has the Aorus any benefits in terms of overclocking vs the founders?


The TDP is much higher; depending on your loop efficiency, 375W will virtually never throttle (e.g. mine, which has a 10C delta-T). The FE can technically get a higher TDP if you are willing to do the shunt/cap hard mod or flash the FTW BIOS (which gives you 358W), but it's not quite the same.


----------



## Dasboogieman

Quote:


> Originally Posted by *KedarWolf*
> 
> The only benefit would be the higher power limit of, the Aorus Extreme I think it is. You can flash the Extreme BIOS on a regular Aorus though and get the same benefits.
> 
> I'm pretty sure I got my cards right here, on my phone.


Both regular and extreme are the same BIOS wise, the extreme has higher factory OC.


----------



## KedarWolf

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> The only benefit would be the higher power limit of, the Aorus Extreme I think it is. You can flash the Extreme BIOS on a regular Aorus though and get the same benefits.
> 
> I'm pretty sure I got my cards right here, on my phone.
> 
> 
> 
> Both regular and extreme are the same BIOS wise, the extreme has higher factory OC.
Click to expand...

I thought the Extreme had a higher power limit but I could be wrong.

Edit: Checked, both have a 375w power limit.


----------



## KedarWolf

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Streetdragon*
> 
> wow nice answer^^
> 
> then it would be Aorus vs founders. (strix is way to expensive)
> has the Aorus any benefits in terms of overclocking vs the founders?
> 
> 
> 
> The TDP is much higher, depending on your loop efficiency, 375W will virtually never throttle (e.g mine which has a 10 delta T). The FE can technically get higher TDP if you are willing to do the shunt/cap hard mod or flash the FTW BIOS (gives you 358W) but its not quite the same.
Click to expand...

The XOC BIOS is an option too; I run that as my daily driver, no shunt mod.


----------



## Streetdragon

In my loop is a 5930K clocked to 4.6-4.7 GHz at 1.3 V, with one 480mm rad and two 240mm rads. I think I have enough radiator space so far.
Would the lower TDP of the Founders hold me back when overclocking?


----------



## Dasboogieman

Quote:


> Originally Posted by *KedarWolf*
> 
> I thought the Extreme had a higher power limit but I could be wrong.


No, they're definitely the same. The bigger issue, and the reason I recommended the regular Aorus, is there's some funky binning anomaly with the Extremes that is making them all pretty much OC like crap. The Aorus guys think there was some mix-up in manufacturing (so Extreme chips got put into the regular Aorus and vice versa). This is because all the regular Aorus cards are OCing great, but the Extremes are crashing at stock or very mild OCs.


----------



## dansi

I have trouble getting mine to run at 1.062 V. It's always stuck at 1.05 V even though my temps are in the mid-50s C.

Only once, after I set +40% voltage, did it run stable at 1.075 V, but after a restart it got stuck back at 1.05 V.


----------



## Coopiklaani

Quote:


> Originally Posted by *Hulio225*
> 
> The VRM on the FE is quite good to be honest: at 400 A of current at 1.1 V, which equals 440 W, the VRM is dissipating something like ~40 W of heat due to switching losses, assuming the switching frequency is at the maximum of 600 kHz that the voltage controller is capable of, and this is manageable


Quote:


> Originally Posted by *dansi*
> 
> I have trouble getting mine run at 1.062v . Always stuck at 1.05v even though my temps are mid-50cs.
> 
> Only once after i make +40% voltage, it ran at 1.075v stable, but after restart, it went stuck back at 1.05v


You need to tune the curve. The core frequency offset slider will only give you 1.05/1.063 V at low temperatures, even if you have set your voltage slider to 100%.
And the voltage slider is NOT a voltage offset slider; it is a voltage cap slider. 100% allows 1.093 V max vcore; 0% is 1.05/1.063 V.
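A toy model of what that cap does (my own sketch of the behavior described above, not anything from NVIDIA or MSI documentation; the linear mapping between slider percentage and cap is an assumption):

```python
# Toy model of the Afterburner "voltage" slider on Pascal, per the
# description above: it raises a *cap* on vcore, not an offset.
# 0% leaves the cap at the stock 1.063 V bin; 100% allows 1.093 V.

STOCK_CAP = 1.063   # V, stock maximum boost bin
MAX_CAP   = 1.093   # V, cap with the slider at 100%

def vcore_cap(slider_pct: float) -> float:
    """Allowed vcore ceiling for a slider setting (0-100), assuming
    a linear mapping (unverified assumption)."""
    frac = max(0.0, min(slider_pct, 100.0)) / 100.0
    return STOCK_CAP + frac * (MAX_CAP - STOCK_CAP)

def applied_vcore(requested: float, slider_pct: float) -> float:
    """The card clamps any requested voltage to the slider's cap."""
    return min(requested, vcore_cap(slider_pct))

print(applied_vcore(1.093, 0))    # stuck at the stock cap despite the request
print(applied_vcore(1.093, 100))  # full 1.093 V allowed
```

If the curve asks for a bin above the cap, the card simply won't go there, which would explain why the slider has to be raised before the 1.08/1.09 bins become reachable.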


----------



## Benny89

Quote:


> Originally Posted by *dansi*
> 
> I have trouble getting mine run at 1.062v . Always stuck at 1.05v even though my temps are mid-50cs.
> 
> Only once after i make +40% voltage, it ran at 1.075v stable, but after restart, it went stuck back at 1.05v


Hi, my card on the stock voltage slider could not jump onto the 1.08 and 1.09 bins of my voltage curve. It simply skipped them and started at 1.07.

I had to increase my voltage slider to 100% for it to start from those bins.

Now I am running the voltage slider at +76% and it starts properly from 1.08 V and drops to 1.063 V.

Just my experience.


----------



## jim2point0

Hey fellas.

I went through 2 EVGA FTW3s before deciding they ran too hot/loud in my system. So I swapped to the Asus Strix.

It's much cooler and quieter, but it doesn't clock as high. I get 1987mhz max. It only goes down from there.

I don't know how I feel about swapping it for maybe 50-60mhz more on the core. Also, the memory does OC higher. Not enough to make up the difference but here's how they score in Superposition:

EVGA FTW3


Asus Strix


Is it worth it to swap the Strix for another in hopes of maybe getting a card that clocks higher? Would it really make much difference in games?


----------



## Benny89

Quote:


> Originally Posted by *jim2point0*
> 
> Is it worth it to swap the Strix for another in hopes of maybe getting a card that clocks higher? Would it really make much difference in games?


You did well getting that STRIX. Sadly, the silicon lottery applies to every card.

The difference between 1996 and 2063 MHz is 1-2 fps in my case. 1996 is my normal boost on the STRIX; 2063 is my 24/7 overclock.

Performance-wise? Not worth it. Unless you are like me and you can't be happy until you know you have a golden chip inside your case, but that is a mental illness.

Not worth it from a logical point of view. But then stop reading this thread. You will see a ton of people bragging about their 2100, 2050+ GPUs and you may start to feel bad because of it.

If you were in the EU and within the return window, I would order one or two more and keep the best one.

But that is just me.


----------



## jim2point0

I am in the US, but I can return this to Amazon and just cite the minor coil whine as being annoying to my ears. Or something.

I'm not all that worried though if the differences won't really matter in real-world gaming. I only ran the benchmarks for fun. I'm not looking to top the charts.

Does the XOC bios help much?


----------



## Slackaveli

Quote:


> Originally Posted by *KedarWolf*
> 
> The only benefit would be the higher power limit of, the Aorus Extreme I think it is. You can flash the Extreme BIOS on a regular Aorus though and get the same benefits.
> 
> I'm pretty sure I got my cards right here, on my phone.


Both of the Aorus cards have the 375 W limit.


----------



## Slackaveli

Quote:


> Originally Posted by *dansi*
> 
> I have trouble getting mine run at 1.062v . Always stuck at 1.05v even though my temps are mid-50cs.
> 
> Only once after i make +40% voltage, it ran at 1.075v stable, but after restart, it went stuck back at 1.05v


You need +51 on the voltage slider to hold it.


----------



## ninjames

I just got my replacement card, which is a FTW3. Some people in this thread mentioned updating the card to fix fan spinning issues using XOC. I have XOC installed - how do I check for updates on my card?


----------



## nrpeyton

Anyone interested in overclocking a Pascal GPU should watch this video.

This video shows that when we tweak the voltage, our Pascal cards either a) aren't always actually running at the clock speed reported in Windows, or b) the card is doing some other crazy power stuff, i.e. not utilizing all shaders, etc.

This could go quite far toward explaining why people aren't getting the scores they should when, for example, using the XOC BIOS _(i.e. scoring less even though the clock speed is higher)._

A very, very interesting watch.

It seems there may be something on the PCB monitoring power efficiency that no hardware or software mod can override-- and when that power is increased _(due to more voltage)_ the card shuts down shaders (or downclocks) to stay within TDP. However, that downclocking may not be reported in Windows.

And when we say TDP, we don't _always_ mean power limit. There can be a difference between Power Limit and TDP.

The video explains it better than I can. If you've got 30 minutes, I'd highly recommend investing them in this video. It's a huge eye-opener. (And it finally gives us a semi-official explanation for the abnormalities we have all been experiencing when trying to OC this architecture.)
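The claim above can be sketched as a toy model. Everything here is hypothetical (the coefficient, the TDP target, and the f·V² power approximation are invented purely to illustrate how a reported clock and the work actually delivered could diverge):

```python
# Speculative sketch of the "Pascal problem" described above: the card
# keeps reporting its boost clock, but if its (voltage-dependent) power
# estimate exceeds an internal TDP target, effective throughput quietly
# drops. All numbers are made up for illustration.

TDP_TARGET_W = 250.0

def estimated_power(clock_mhz: float, vcore: float) -> float:
    # Dynamic power scales roughly with f * V^2 (classic CMOS approximation).
    return 0.11 * clock_mhz * vcore ** 2

def effective_throughput(clock_mhz: float, vcore: float) -> float:
    """Reported clock stays at clock_mhz, but the work done scales down
    whenever the power estimate is over the TDP target."""
    power = estimated_power(clock_mhz, vcore)
    if power <= TDP_TARGET_W:
        return clock_mhz
    return clock_mhz * (TDP_TARGET_W / power)  # silent internal throttle

# A higher reported clock at higher voltage can end up doing *less* work:
print(effective_throughput(2000, 1.043))
print(effective_throughput(2100, 1.093))
```

In this model the 2100 MHz run delivers less effective work than the 2000 MHz run, which matches the "higher clock, lower score" reports, even though Windows would show 2100 MHz in both cases.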


----------



## IMI4tth3w

So I spent wayyyy too long last night benching and testing my custom watercooled Strix. Some REALLY weird stuff happened.

I made a post about 15 posts ago about hitting a hard wall at 2100 MHz. Well, last night the power surged (the computer is fine, it just turned off and back on) and now I'm getting stable runs at 2100 and 2113!!

I also watched that video about "the Pascal problem," so I decided to start benching overclocks with only the voltage adjusted. And sure enough, adding more voltage at the same clocks gets you a few more points. There's a 100-point difference between my 2088 overclock at 1.04 V and at 1.09 V (1.09 scores higher).

I have all this data written down; I just wanted to do further testing. What's interesting is I feel like I was still getting higher scores before I flashed the XOC BIOS, which is odd.

I also need to try out more memory frequencies. I got a nearly 200-point gain going from 6003 MHz to 6026 MHz on the memory...

I'll make some nice pretty graphs in the near future.

Also, one last question: with the XOC BIOS, is the shunt mod even needed, since it totally gets rid of power limiting?


----------



## Slackaveli

Quote:


> Originally Posted by *nrpeyton*
> 
> Anyone interested in overclocking a Pascal GPU should watch this video.
> 
> This video shows that when we tweak the voltage. Our Pascal cards either a) aren't always actually running at the clockspeed reported in windows or b) the card is doing some other crazy power stuff I.E. not utilizing all shaders etc.
> 
> This could go quite far to explain why people aren't getting the scores they should be when for example using the XOC BIOS. _(I.E scoring less even though the clockspeed is higher)._
> 
> A very, very interesting watch.
> 
> 
> 
> 
> 
> It seems there may be something on the PCB that is monitoring power efficiency that no hardware of software mod can override-- and when that power is increased (due to more voltage)n the card shuts down shaders (or downclocks) to stay within TDP. However that downlocking may not be reporting in Windows.
> 
> And when we say TDP, we don't _always_ mean power limit. There can be a difference between Power Limit and TDP.
> 
> The video explains it better than me. If you've got 30 minutes I'd highly recommend investing them in this video. It's a huge eye opener. (And finally gives us some semi-official explanation to the abnormalities we have all been experiencing when trying to O/C).


wow. Fiji 2.0

more volts + less frequency = highest score. smfh


----------



## ELIAS-EH

Not worth OCing the Strix, for example, above the factory OC mode, max 2-3 fps?!?!


----------



## KraxKill

Quote:


> Originally Posted by *Slackaveli*
> 
> wow . fuji 2.0
> 
> more volts + less frequency = highest score. smfh


Hard to take that guy seriously with that hair, but he has a point. I've been trying to pinpoint the behavior for a while now, but can't with any sort of consistency. I think there is a hard-coded TDP boost multiplier on the GPU, and the only way to bypass it is to manually feed the voltage to the chip.

It would appear that he has done that, but there has to be a way that Vince has done it differently, as he's able to clock to 2400+ on LN2, and I doubt he's doing it at just 1.093 V or without bypassing the (what I'm calling "normalized") TDP -- until somebody can suggest otherwise.


----------



## nrpeyton

Quote:


> Originally Posted by *KraxKill*
> 
> Hard to take that guy seriously with that hair, but he has a point. I've been trying to pinpoint the behavior for a while now, but can't with any sort of consistency. I think there is a hard coded TDP boost multiplier on the GPU and the only way to bypass it is to manually feed the voltage to the chip.
> 
> It would appear that he has done that, but it there has to be a way that Vince has done it that is different as he's able to clock to 2400+ on LN2 and I doubt he's doing it at just 1.093v. or without bypassing the (what I'm calling normalized TDP") until somebody can suggest otherwise.


You have to get past the hair. He's actually got a very popular (and famous) channel called "Actually Hardcore Overclocking," and he's known as "Buildzoid" online. He did all the PCB teardowns of the 1080s for Gamers Nexus, and he works very closely with Steve Burke of Gamers Nexus on many projects. You'll notice he's had a lot of mentions from Steve recently too.

I'd highly recommend subscribing to his channel. Anyone who spends a lot of time here (particularly in this thread) will probably find his videos very interesting and insightful.

But I do agree.. the hair could improve lol.

The fact so many people DO take him seriously despite the hair (and his age) does go some way to showing how good he is ;-)

Also, it was an EVGA staff member _(moderator)_ on the EVGA forums who originally linked his video.


----------



## buellersdayoff

Quote:


> Originally Posted by *KraxKill*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Slackaveli*
> 
> wow . fuji 2.0
> 
> more volts + less frequency = highest score. smfh
> 
> 
> 
> Hard to take that guy seriously with that hair, but he has a point. I've been trying to pinpoint the behavior for a while now, but can't with any sort of consistency. I think there is a hard coded TDP boost multiplier on the GPU and the only way to bypass it is to manually feed the voltage to the chip.
> 
> It would appear that he has done that, but it there has to be a way that Vince has done it that is different as he's able to clock to 2400+ on LN2 and I doubt he's doing it at just 1.093v. or without bypassing the (what I'm calling normalized TDP") until somebody can suggest otherwise.
Click to expand...

Have you seen what Kingpin has soldered to the 1080 Ti? Watch Jayz's YouTube video with Kingpin and his offsider; it's a lot more than Buildzoid.


----------



## RadActiveLobstr

Quote:


> Originally Posted by *nrpeyton*
> 
> You have to get past the hair. He's actually got a very popular (and famous) channel called "actually hardcore overclocking". And he's known as "buildzoid" online. He done all the PCB tear downs for all the 1080's for Gamers Nexus and he works very closely with Steve Burke from Gamers Nexus on many projects. You'll notice he's had a lot of mentions from Steve recently too.
> 
> I'd highly recommend subscribing to his channel. Anyone who spends a lot of time here (particularly on this thread) will probably find his videos very interesting and insightful
> 
> But I do agree.. the hair could improve lol.
> 
> The fact so many people DO take him seriously despite the hair (and his age), does go some way to showing how good he is ;-)


His hair is horrible, Steve's hair is glorious.

They balance each other out.


----------



## Luckbad

The Zotac 1080 Ti ArcticStorm is available at Newegg and showing as a preorder at a couple of other retailers.


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> Anyone interested in overclocking a Pascal GPU should watch this video.
> 
> This video shows that when we tweak the voltage. Our Pascal cards either a) aren't always actually running at the clockspeed reported in windows or b) the card is doing some other crazy power stuff I.E. not utilizing all shaders etc.
> 
> This could go quite far to explain why people aren't getting the scores they should be when for example using the XOC BIOS. _(I.E scoring less even though the clockspeed is higher)._
> 
> A very, very interesting watch.
> 
> 
> 
> 
> 
> It seems there may be something on the PCB that is monitoring power efficiency that no hardware or software mod can override-- and when that power is increased _(due to more voltage)_ the card shuts down shaders (or downclocks) to stay within TDP. However that downlocking may not be reporting in Windows.
> 
> And when we say TDP, we don't _always_ mean power limit. There can be a difference between Power Limit and TDP.
> 
> The video explains it better than me. If you've got 30 minutes I'd highly recommend investing them in this video. It's a huge eye opener. (And finally gives us some semi-official explanation to the abnormalities we have all been experiencing when trying to O/C this arcitecture).







He's live-streaming a 1070 on LN2 atm. Worth watching.


----------



## TheBoom

Quote:


> Originally Posted by *ir88ed*
> 
> What reshade did you do? Was it a nexus mod? I have been playing Prey and haven't touched ME:A for a while.


Quote:


> Originally Posted by *Benny89*
> 
> The best one on Nexus is Cinematic Realism Lighting Overhaul with Depth-of-Field. I tested all of them. This one will BLOW YOU AWAY on planets. It also has an OSD menu where you can tweak and turn on/off all features to make it look best for you. It made the game HDR without the need for an HDR monitor. It's so good, and finally all the characters' skin looks normal since they don't lose light...
> 
> But it eats your fps.
> 
> Just so you know. But a 1080 Ti can tank that easily.
> 
> And yes, ME:A is power hungry, BUT ONLY in the menu and on the Tempest (which is the fault of horrible optimization there). On the Tempest the power draw can be high enough to downclock my card from its stable 2063 to 1911!!
> 
> However, anywhere else -- on planets, the Nexus, during missions, etc. -- the power draw is normal; I never dropped below 2063 and the max TDP I saw there was 110%.


Yes, that's the one I downloaded, but it's heavily modded from his preset. I just found his adjustments too drastic, and the orange lights from the solar heaters on Voeld were showing as white. I spent about an hour tweaking, mainly making everything more subtle, but it's worth it.

Also disabled RBM (Reflective Bump Mapping) as well as DOF and Borders. Played around with RBM for a while, and apart from being the biggest fps hog (5-10 fps) it was also artifacting on dirt and ground, which kinda spoils the experience for me.
Quote:


> Originally Posted by *RavageTheEarth*
> 
> I don't have any issues with the menu, but I do get drops all the way down into the 40's sometimes on The Tempest depending on where I am. I'm playing on a 1440p G-Sync 144hz monitor so it drives me nuts, I get good performance on planets usually around 90-120 FPS with everything maxed out except no HDR. Great game besides the technical problems.


I'm curious though: how do you have drops to 40 fps, even on the Tempest? I only saw that happen on my 1070. The minimum I've seen with the 1080 Ti so far is about 60+ fps.

I have a similar monitor, the XB271HU, and I get what you mean about annoying fps drops. G-Sync doesn't really help with input lag and stutters. Also, there's no way you can play with HDR, since it's not a software implementation but rather a hardware feature that only works with actual HDR TVs and monitors.

This is just a guess, but are you running a compressed or full framebuffer? It does nothing for non-HDR displays but tanks fps if you have it set to full. That may be the cause of your frame drops.


----------



## Slackaveli

Quote:


> Originally Posted by *Coopiklaani*
> 
> 
> 
> 
> 
> 
> He's live streaming 1070 on LN2 atm. worth watching.


Cool stream. Did you see he's doing the curve in one horizontal line across all points? Like, 2100 @ 0.8 V all the way to 1.1 V. That's his "trick" for perf.

Buildzoid is awesome. He is the perfect case of "don't judge a book by its cover". I've been watching his PCB breakdowns for a while.


----------



## Coopiklaani

Quote:


> Originally Posted by *Slackaveli*
> 
> cool stream. do you see he's doing the curve in one horizontal line across all points? like, 2100 @ .8v all the way to 1.1v


He hard modded the voltage controller. So the GPU thinks it runs at 0.8v, but it actually runs at 1.2v or higher.


----------



## Slackaveli

Quote:


> Originally Posted by *Coopiklaani*
> 
> He hard modded the voltage controller. So the GPU thinks it runs at 0.8v, but it actually runs at 1.2v or higher.


ah, ok. he's hard volted.

that -65c, tho


----------



## Benny89

Lol, I didn't know how much a VRAM OC adds to TDP and temps...

So I was running Witcher 3 at my 2063/6000. Of course I crashed at 62C with artifacts. TDP was around 99-108% just running around in Blood and Wine.

OK, Witcher 3 never liked my memory OC, so I lowered it from 6000 to 5800.

And wow. Not only did TDP go to 89-94%, but temps dropped from 60-61 to 57-58 at max. I just played a solid 3 hours at 2063, no perf caps.

Waiting for the money from my second STRIX, and then I am going to water-cool this card.


----------



## Aganor

My best score until now, maxed at 2050 MHz at 1.093 V. If I clock higher, some benches are OK and in others it crashes.


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> 
> 
> 
> 
> 
> He's live streaming 1070 on LN2 atm. worth watching.


Nice one.

I'm watching it now.

Shame he's not got a camera on that multimeter.

Has anyone actually logged in there with an account? Fancy suggesting it (since it's live)? I'm only viewing as a guest.

Edit:
He's at 1.38 V now on a 1070 at 2450 MHz (live) at -140C.

The 1080 Ti _(the full GPU)_ can handle voltage even better. I can imagine doing this on my Ti at 1.4 V now.

The new Kingpin edition allows voltage control through USB, apparently.

I'd also love to know what his power draw is sitting at just now.


----------



## Nico67

Quote:


> Originally Posted by *IMI4tth3w*
> 
> So I spent wayyyy too long last night benching and testing my custom watercooled strix. Some REALLY weird stuff happened.
> 
> I made a post about 15 posts ago about hitting a hard wall at 2100MHz. Well last night the power surged (computer is fine, just turned off and back on) and now i'm getting stable runs at 2100 and 2113!!
> 
> I also watched that video about "the pascal problem" so i decided to start benching overclocks with only the voltage adjusted. And sure enough, adding more voltage at the same clocks gets you a few more points. There's a 100 point difference between my 2088 overclock at 1.04v and 1.09v. (1.09 scores higher)
> 
> I have all this data written down i just wanted to do further testing. What's interesting is i feel like I was still getting higher scores before i did the XOC bios which is odd.
> 
> I also need to try out more memory frequencies. I got a nearly 200 point gain going from 6003MHz to 6026MHz for the memory speed...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i'll make some nice pretty graphs in the near future
> 
> Also one last question, with the XOC bios, is the shunt mod even needed since it totally gets rid of power limiting?


I saw voltage scaling up to 1.075, then it started dropping off again up to 1.093.

For memory, you need to close the benchmark program after each memory adjustment, or you see 200-point jumps every couple of adjustments.

Quote:


> Originally Posted by *KraxKill*
> 
> Hard to take that guy seriously with that hair, but he has a point. I've been trying to pinpoint the behavior for a while now, but can't with any sort of consistency. I think there is a hard coded TDP boost multiplier on the GPU and the only way to bypass it is to manually feed the voltage to the chip.
> 
> It would appear that he has done that, but it there has to be a way that Vince has done it that is different as he's able to clock to 2400+ on LN2 and I doubt he's doing it at just 1.093v. or without bypassing the (what I'm calling normalized TDP") until somebody can suggest otherwise.


I think the normalized TDP must vary between AIB cards, or they would all lock to 300 W like the FE. The shunt mod seems to shift it up quite a bit, but I am wondering if only doing two of the three shunts might have a different effect than doing all three, or than the resistor mod?
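For anyone wondering why the shunt mod shifts the limit, the rough math goes like this (my sketch; the 5 mOhm shunt value is an assumption based on what's commonly cited for these cards, not a measured figure):

```python
# Rough math behind the shunt mod discussed above: the card computes
# power from the voltage drop across its current-sense shunt resistors.
# Soldering an equal-value resistor on top of a shunt halves the
# effective resistance, so the controller reads roughly half the real
# current on that rail. Shunt value is assumed (5 mOhm commonly cited).

def parallel(r1_mohm: float, r2_mohm: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return (r1_mohm * r2_mohm) / (r1_mohm + r2_mohm)

STOCK_SHUNT = 5.0                     # mOhm, assumed stock value
MODDED = parallel(STOCK_SHUNT, 5.0)   # 2.5 mOhm after stacking an equal shunt

def reported_power(real_power_w: float) -> float:
    # The sense voltage (and thus reported power) scales with shunt resistance.
    return real_power_w * (MODDED / STOCK_SHUNT)

print(reported_power(400.0))  # the controller thinks ~200 W is being drawn
```

Modding only two of the three shunts would under-report only those rails, which is presumably why it could behave differently from modding all three.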


----------



## ir88ed

Quote:


> Originally Posted by *gmpotu*
> 
> Yes, XOC should work on your card. XOC was designed for the STRIX OC so you won't notice as much benefit as if you were on STRIX but it should still help you improve if you are not using resistor or shunt mod.


I flashed the XOC BIOS onto my cards, and the flash appeared to go normally. On reboot, Device Manager showed only one of my cards enabled, and I wasn't able to enable the other. I reflashed my original BIOS and all is back to normal. Maybe my EVGA FE just doesn't like the Strix BIOS? Any other BIOS suggestions?

I have seen the card hit 2100 with just tweaking in PrecisionX, so I am not sure I am really going to get anything additional anyway. Plus, seeing the "Pascal problem" vid above, I am guessing that an even higher clock speed isn't necessarily going to buy me more fps.


----------



## Nico67

Quote:


> Originally Posted by *ir88ed*
> 
> I flashed the XOC bios onto my cards, flash appeared to go normally. Reboot, and the device manager only showed one of my cards enabled. Wasn't able to enable the card. Reflashed my original bios, and all is back to normal. Maybe my EVGA FE just doesn't like the Strix bios? Any other bios suggestions?
> 
> I have seen the card hit 2100 with just tweaking in precisionX, so I am not sure I am really going to get anything additional, though. Plus seeing the "pascal problem" vid above, I am guessing that even higher clock speed isn't going to buy me more fps, necessarily.


Did you flash both cards?


----------



## ir88ed

Yup. Did both index=0 and index=1. Followed the instructions on first post exactly, even downloaded that version of NVflash.


----------



## KedarWolf

Quote:


> Originally Posted by *ir88ed*
> 
> Yup. Did both index=0 and index=1. Followed the instructions on first post exactly, even downloaded that version of NVflash.


Try disabling both cards in Device Manager first. Nvflash is supposed to do that automatically, but I know of at least one person who had issues with that. After you flash, reboot, check Device Manager, and if they are still disabled, enable them and reboot again.

Reboot RIGHT AFTER you flash, though; don't enable them before rebooting.
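For reference, the flash sequence for a two-card setup looks roughly like this. This is a sketch from memory, not a verified procedure: confirm the exact flag names against `nvflash64 --help` for your version before flashing, because a bad flash can brick a card.

```shell
# Run from an elevated command prompt, with backups taken first.
nvflash64 --list                              # confirm which index is which card
nvflash64 --index=0 --save card0_backup.rom   # back up card 0's BIOS
nvflash64 --index=1 --save card1_backup.rom   # back up card 1's BIOS

# -6 overrides the PCI subsystem-ID mismatch prompt you hit when
# cross-flashing (e.g. the Strix XOC BIOS onto a non-Strix card).
nvflash64 --index=0 -6 xoc.rom
nvflash64 --index=1 -6 xoc.rom

# Reboot immediately after flashing, then re-enable the GPUs in
# Device Manager if they come back disabled.
```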


----------



## Nico67

Quote:


> Originally Posted by *ir88ed*
> 
> Yup. Did both index=0 and index=1. Followed the instructions on first post exactly, even downloaded that version of NVflash.


Strange. I know I often have to manually enable the card after a reboot, but not always. I've never done SLI though, so I don't know of any particular tricks that might help.


----------



## ir88ed

Quote:


> Originally Posted by *KedarWolf*
> 
> Try disabling both cards in Control Panel first. Nvflash is supposed to do that automatically but I know of at least one person that had issues with that. After you flash, reboot, check Control Panel, if they are still disabled enable them, reboot again.
> 
> Reboot RIGHT AFTER you flash though, don't enable them before rebooting.


Yeah, that was what I did. I have flashed bioses numerous times on my 980ti's and 780's before that, so I am reasonably familiar with the process. Just wanted to make sure that there weren't any known incompatibilities with the EVGA FE and the XOC bios. I will redownload a fresh copy and give it another shot. Thanks guys.


----------



## KenjiS

Got mine today...

Wow, it's SILENT, even under load.

I didn't touch a THING; I just installed it, ran DDU, installed the latest drivers, and ran Time Spy.

http://www.3dmark.com/spy/1842052

2025 boost and 11,088 memory right out of the box with no tinkering on the Asus Strix...

Am I crazy, or is that actually really solid out of the box?

*edit* OK, correction: I just looked in MSI Afterburner and it had +56 on core and +45 on memory applied... some factory-loaded overclock, but still, I didn't touch it myself. All I did was push the power limit to 120%.

Downside? I had to pull a hard drive out and move it to fit the card in my case... Oh well.. Looks darn cool.


----------



## Amuro

My first bench for the MSI Gaming X 1080 Ti, air-cooled, no curve OC. Anything higher than +50/+500 will crash/artifact.


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Anyone interested in overclocking a Pascal GPU should watch this video.
> 
> This video shows that when we tweak the voltage. Our Pascal cards either a) aren't always actually running at the clockspeed reported in windows or b) the card is doing some other crazy power stuff I.E. not utilizing all shaders etc.
> 
> This could go quite far to explain why people aren't getting the scores they should be when for example using the XOC BIOS. _(I.E scoring less even though the clockspeed is higher)._
> 
> A very, very interesting watch.
> 
> 
> 
> 
> 
> It seems there may be something on the PCB that is monitoring power efficiency that no hardware or software mod can override-- and when that power is increased _(due to more voltage)_ the card shuts down shaders (or downclocks) to stay within TDP. However that downlocking may not be reporting in Windows.
> 
> And when we say TDP, we don't _always_ mean power limit. There can be a difference between Power Limit and TDP.
> 
> The video explains it better than me. If you've got 30 minutes I'd highly recommend investing them in this video. It's a huge eye opener. (And finally gives us some semi-official explanation to the abnormalities we have all been experiencing when trying to O/C this arcitecture).


My testing on the XOC BIOS. I only skimmed through the video, though; I never watched the entire thing. I heard him say lower clocks at higher voltages scale better, I think, but I couldn't reproduce that.

2025 @ 1.050 V: 68.32 FPS
2025 @ 1.150 V: 68.03 FPS
2100 @ 1.100 V: 68.96 FPS
2100 @ 1.150 V: 69.09 FPS

Lastly I did 1999 @ 1.175 V: 67.90 FPS


----------



## Nico67

Quote:


> Originally Posted by *KedarWolf*
> 
> My testing on XOC BIOS. I only skimmed through the video though, never watched the entire thing. I heard him say lower clocks at higher voltages scales better I think, but I couldn't reproduce that.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2025 @ 1.05V 68.32FPS
> 
> 
> 
> 2025 @ 1.150v 68.03 FPS
> 
> 
> 
> 2100 @ 1.1v 68.96 FPS
> 
> 
> 
> 2100 @ 1.15v 69.09 FPS.
> 
> Lastly I did 1999 @ 1.175v 67.90 FPS


I think he was talking about 0.850 V versus 1.043 V. It does work if you scale, say, 2000 MHz up from 0.950 to 1.075 in 0.025 increments, but I found it drops off again after that.


----------



## KenjiS

Better picture



This thing is huge lol


----------



## IMI4tth3w

This voltage curve thing is driving me insane. It really just makes no sense...

If you use the voltage curve and just drag one dot -- let's say the dot at 1.063 V -- up to 2088 MHz and click apply, it will automatically raise all the dots after it. OK, great, awesome. So I do a run at 2088 MHz @ 1.063 V and I get a score.

Now here's where things get weird. I put it back to stock, and instead of using the voltage curve, I use a core offset of +95, which also gets me to 2088 MHz. When I run the benchmark it picks the same 1.063 V as before, yet this run scores 250 points higher.

Comparing the two voltage curves: when I do a +95 offset, it adjusts every single point on the curve upward, whereas the manual single-point adjustment only moves the higher-voltage points up and leaves all the lower-voltage points at their lower speeds.

So does raising the lower-voltage points of the curve improve your OC?

This voltage curve is so odd..
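The two edits described above can be sketched like this (my reconstruction of the Afterburner UI behavior from the post, not MSI documentation; the stock curve values are invented):

```python
# Sketch of the two Afterburner curve edits: a global core offset vs.
# dragging a single point. The curve maps voltage (V) -> frequency (MHz).
# Stock values below are made up for illustration.

stock_curve = {0.900: 1898, 0.950: 1936, 1.000: 1974, 1.063: 1993, 1.093: 2012}

def offset_all(curve, mhz):
    """The core-clock offset slider: shifts every point up by `mhz`."""
    return {v: f + mhz for v, f in curve.items()}

def drag_point(curve, point_v, target_mhz):
    """Dragging one dot: that point is raised, and every higher-voltage
    point is flattened up to at least the same frequency; points below
    it keep their stock (lower) clocks."""
    return {v: (max(f, target_mhz) if v >= point_v else f)
            for v, f in curve.items()}

a = offset_all(stock_curve, 95)           # +95 offset: whole curve moves up
b = drag_point(stock_curve, 1.063, 2088)  # single-point drag: bottom unchanged

# Both land at 2088 MHz @ 1.063 V, but the low-voltage bins differ:
print(a[0.900], b[0.900])
```

So the benchmark can run "2088 @ 1.063" either way, but any moment the card dips to a lower voltage bin, the offset curve keeps it faster, which is one plausible explanation for the 250-point gap.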


----------



## TWiST2k

Quote:


> Originally Posted by *KenjiS*
> 
> Got mine today...
> 
> Wow, its SILENT even under load
> 
> Didnt touch a THING just installed it, ran DDU, installed latest drivers and ran Time Spy
> 
> http://www.3dmark.com/spy/1842052
> 
> 2025 Boost and 11,088 Memory right out of the box with no tinkering on the Asus Strix...
> 
> Am i crazy or is that actually really solid out of the box?
> 
> *edit* ok correction, just looked in MSI afterburner and it had +56 on core and +45 on memory applied... Some factory loaded overclock but still, I didnt touch it myself. all I did was push the Power Limit to 120%
> 
> Downside? I had to pull a hard drive out and move it to fit it in my case... Oh well.. Looks darn cool


Sounds like you got a decent one! I am pretty sure your Afterburner had some settings saved from a previous point. I usually reset my OC apps to defaults as well when putting in a different card.


----------



## KenjiS

Quote:


> Originally Posted by *TWiST2k*
> 
> Sounds like you got a decent one! I am pretty sure your Afterburner had some settings saved from some previous point. I usually reset my OC apps to defaults as well when putting in a different card.


Nope, completely clean installed everything..

I think it might have been from when I loaded Asus' app briefly, because I was under the mistaken impression that's where the RGB LED controls were... I just prefer Afterburner (why is it nobody else has an app that's as nice?!)

Side note: it stinks that the VIII series motherboards don't have complete Aura Sync functionality




I'm also not sure I'm going to bother tweaking the OC anymore; I just don't think I can squeeze much more out of this card, from what I've seen others achieve on air


----------



## TWiST2k

Quote:


> Originally Posted by *KenjiS*
> 
> Nope, completely clean installed everything..
> 
> I think it might have been from when i loaded on Asus' app briefly because i was under the mistaken understanding thats where the RGB LED controls were...I just prefer Afterburner (Why is it nobody else has an app that is as nice?!)
> 
> Side note, stinks that the VIII series motherboards dont have complete Aura Sync functionality
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Im also not sure im going to bother tweaking anymore on the OC, i just dont think i can squeeze much out of this card from what ive seen others achieve on air


Afterburner is worlds beyond everybody else for sure! My only complaint about the FTW3 is the lack of fan support in Afterburner. The individual fans are cool and all, but lacking proper control outside of Precision X is disappointing. I was thinking of getting a GPU fan header splitter and experimenting with that. I find my fan curve has the second and third fans set higher than the primary GPU fan as it is haha.


----------



## KenjiS

Quote:


> Originally Posted by *TWiST2k*
> 
> Afterburner is worlds beyond everybody else for sure! My only complaint about the FTW3 is the lack of fan support in Afterburner. The individual fans are cool and all, but lacking proper control outside of Precision X is disappointing. I was thinking of getting a GPU fan header splitter and experimenting with that. I find my fan curve has the second and third fans set higher than the primary GPU fan as it is haha.


I only didn't get an EVGA because I was VERY unhappy with my 980 Ti's demise..

Getting it fixed under warranty and selling it tho


----------



## IMI4tth3w

Hey guys, not sure if this is allowed, but I'm streaming some overclocking if anyone wants to pop in and make some suggestions


----------



## Nico67

Quote:


> Originally Posted by *IMI4tth3w*
> 
> this voltage curve thing is driving me insane. It really just makes no sense...
> 
> If you use the voltage curve and just drag one dot, lets say the dot at 1.063V and drag it up to 2088Mhz then click apply, it will automatically increase all the dots after it. Okay great awesome. So i do a run at 2088MHz @ 1.063V and i get a score.
> 
> Now here's where things get weird. I put it back to stock and instead of using voltage curve, i use the core offset of +95 which gets me to 2088MHz. When i run the benchmark it picks the voltage of 1.063 (same as before) yet this run will score 250 points higher.
> 
> Comparing the two voltage curves, when i do a +95 offset, it adjusts every single point on the curve up. Where as the 1st manual single point adjustment to the voltage curve which only moves all voltage points higher up and leaves all the lower voltage points at lower speeds.
> 
> So does increasing the lower frequency voltage points increase your OC?
> 
> This voltage curve is so odd..


Two issues:

1/ If you downclock due to limiting after raising just one bin with the curve, then you are going to drop down to the default curve and have a terrible score. You need to raise bins as far down as you limit, too. So in your case it might downclock as far as 1.000V, and you might be able to raise that to 2000MHz, plus 1.012V to 2013MHz...

2/ You actually need to raise at least 3 more bins past that minimum downclock bin, so 0.987, 0.975, 0.963 (not sure of the exact voltage bins) as well. Not sure why, but this is possibly something like what Buildzoid was experiencing: just weird behaviour, and possibly incorrectly reported clocks or non-monitored sub-clocks downclocking. It does give a solid boost doing this, however


----------



## toncij

Anyone using Thermal Grizzly Conductonaut? Or any other liquid metal?


----------



## IMI4tth3w

Quote:


> Originally Posted by *Nico67*
> 
> Two issues,
> 
> 1/ if you downclock due to limiting after just raising one bin with the curve, then you are going to drop down to the default curve and have a terrible score. You need to raise bins as far as you limit too. So in your case it might down clock as far as 1.000v and you might be able to raise that to 2000mhz, plus 1.012 to 2013...
> 
> 2/ you actually need to raise at least 3 more bins past that min downclock bin, so 0.987, 0.975, 0.963 (not sure of exact volt bins) as well. Not sure why, but this is possible something like what Buildzoid was experiencing, just weird behaviour and possible incorrectly reported clks or non monitored sub clks downclocking. It does give a solid boost doing this however


I'm actually streaming my overclocking right now, and there's literally no way to better explain it other than some sort of voltage curve magic.

I'm now 2126MHz rock-solid stable at 1.075V, but the way I got there and set my voltage curve makes a HUGE difference in performance/stability.

I started out at a +95 core offset and then proceeded to move some of the voltage curve points almost seemingly at random to find 2100 stable.. then, using the same curve, adjusted it slightly for 2113 stable.. then again for 2126 stable.. and finally I'm starting to have trouble with 2136, getting only 1 stable run through of Superposition...

There's like no rhyme or reason to this LOL, but oh well, I guess it's a good way to waste a lot of time overclocking this thing


----------



## TWiST2k

Quote:


> Originally Posted by *IMI4tth3w*
> 
> i'm actually streaming my overclocking right now and just literally no way to better explain it other than some sort of voltage curve magic.
> 
> i'm now 2126MHz rock solid stable at 1.075v but the way i got there and set my voltage curve makes a HUGE difference in performance/stability.
> 
> I started out at +95 core offset and then proceeded to move some of the voltage curve points at almost seemingly random to find 2100 stable.. then using same curve adjust it slightly for 2113 stable.. then using same curve adjust it for 2126 stable.. and finally i'm starting to have trouble with 2136 getting only 1 stable run through of superposition...
> 
> there's like no rhyme or reason to this LOL but oh well i guess its a good way to waste a lot of time overclocking this thing


I have been watching as well; it's pretty fun to see another person's results in real time for comparisons.


----------



## IMI4tth3w

Quote:


> Originally Posted by *TWiST2k*
> 
> I have been watching as well, its pretty fun to see another persons results in real time for comparisons.


Thanks for watching man!









But I finally got a full crash LOL, so I guess that's a good place to stop. Going to do a couple runs without all the background nonsense with some of the higher overclocks and see what scores pop up...

incoming edit with scores....

2126MHz/6066MHz @ 1.075v = 10249 (Superposition) (9788 was the score I got while streaming)

now check this...

2126MHz/6026MHz @ 1.075v = 10405


----------



## Benny89

Quote:


> Originally Posted by *IMI4tth3w*
> 
> Thanks for watching man!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But i finally just full crashed LOL so i guess a good place to stop. Going to do a couple runs without all the background nonsense with some of the higher overclocks and see what scores pop up...
> 
> incoming edit with scores....
> 
> 2126Mhz/6066MHz @1.075v = 10249 (Superposition) (9788 was the score i got while streaming)
> 
> now check this...
> 
> 2126MHz/6026MHz @ 1.075v = 10405


Try a flat 6000MHz for comparison, please.


----------



## KenjiS

Just fired up the game that literally killed my 980 Ti, Mass Effect Andromeda, and WOW

now getting 80-120fps, rock solid. What a huge difference @_@ It's cooler and quieter than my old 980 Ti on top of it...

I seriously wasn't expecting to feel such a big improvement... especially not in Lightroom


----------



## Amuro

Not bad for an old CPU, board and RAM. Same clocks as my last post.
MSI GTX 1080 Ti Gaming X, air cooled, non-curve OC of +50/+500 with the max voltage and PL sliders; beyond 50/500 it will crash or show artifacts.


----------



## Benny89

Does anyone have a problem with Superposition 4K stuttering when going from one scene to another? I have it in like 3 scenes during the bench. When I fire the bench again it goes smoothly from scene to scene, but when I run it for the first time there are 3 particular scenes where the transition is not smooth (and it drops to like 21 min fps) until I restart the bench and it is fixed....

LOL.

Of course I never have such a thing in games etc. Only in SUPO 4K.


----------



## DStealth

Writing the offset is useless... different cards act differently with it, not even to mention voltage, temps, etc...
Write the actual values during the benchmark...
In Heaven my best run with old drivers and a GTX [email protected]/12000 was 168fps, so your score seems fine for clocks in the ~2GHz range
Edit: Lol, you can take the screenshot directly with F12 and find it inside C:\Users\username\Heaven\screenshots... no need to use a phone or camera


----------



## IMI4tth3w

Quote:


> Originally Posted by *Benny89*
> 
> Try flat 6000 mhz for comparsion please.


2126MHz/6003MHz @ 1.075v = 10402


----------



## Amuro

Quote:


> Originally Posted by *DStealth*
> 
> Writing offset is useless...As different cards acts differently with it...not even to mention Voltage, Temps, etc...
> Write the actual values during benchmark...
> In Heaven my best run with old drivers and GTX [email protected]/12000 was 168pfs , so your score seem fine for the clocks in ~2ghz range
> Edit: Lol you can take the screenshot directly with F12 and find it inside Folder= C:\Users\username\Heaven\screenshots...no need to use a phone or camera


I'll try, 'cause F12 gives me a blank screen in Paint ☺


----------



## IMI4tth3w

Just got a Time Spy run in. 2126 crashed, I think during graphics test 2, so I went to my 2113 OC and did this

http://www.3dmark.com/spy/1843187

#1 score for 4770K + 1080 Ti







although there are probably not a lot of people running this combo. Graphics score was 10862, so I'm going to look at what some of the other high graphics scores are. So far it seems pretty decent


----------



## Nico67

Quote:


> Originally Posted by *IMI4tth3w*
> 
> Thanks for watching man!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But i finally just full crashed LOL so i guess a good place to stop. Going to do a couple runs without all the background nonsense with some of the higher overclocks and see what scores pop up...
> 
> incoming edit with scores....
> 
> 2126Mhz/6066MHz @1.075v = 10249 (Superposition) (9788 was the score i got while streaming)
> 
> now check this...
> 
> 2126MHz/6026MHz @ 1.075v = 10405


Yeah, memory behaves weird, but I just took AB as the real figure and bumped the offset a few times til AB ticked over to the next step. I would hit apply, close and restart SuPo, then run it, and I got proper mem scaling all the way up to 6236 and it was still good, but I started seeing artifacts. I just run 6180 now and it seems happy.

Clocking these things and trying to get them to behave is a game in itself







Keep playing though, as you'll find something, and there's another 100 points.


----------



## Benny89

Will the LEDs on my STRIX backplate still work if I put an EK waterblock on the card but keep the backplate?


----------



## Aganor

Hey guys, yesterday after all the benchmark marathons I decided to (finally) play Witcher 3 again. After about 15 minutes or so the game froze, took some minutes to crash to desktop, and a "GPU lost" error appeared in Afterburner.

I think it may be OC related, as I pushed +735 on memory, but in Heaven and Superposition it is stable.

I have seen a user blaming memory OC for his Witcher 3 crashes. Is that common?

I was so happy with this OC


----------



## Benny89

Quote:


> Originally Posted by *Aganor*
> 
> Hey guys, yesterday after all the benchmarks marathons i decided to (finally) play witcher 3 again. After about 15 minutes or so the game freezed and took some minutes to crash to desktop and GPU lost to appear on afterburner.
> 
> I think it may be OC related as i oushed +735 on memory but in Heaven and Superposition it is stable.
> 
> I have seen an user blaming Memory OC for his Witcher 3 crashes. Is it usual?
> 
> I was so happy with this OC


Witcher 3 is known for being super sensitive to OC. I ran my OC in 4K Sup, Heaven, Time Spy, Dishonored 2, ME:A, ROTR and everything was great.

Started Witcher 3 - 30 min crash. Had to lower the memory OC and keep temps even more in check.

Witcher 3 is very sensitive to OC. Especially memory OC.


----------



## Aganor

Quote:


> Originally Posted by *Benny89*
> 
> Witcher 3 is known for being super senstivie to OC. I ran my OC in 4K Sup, Heaven, Time Spy, Dishonored 2, ME:A, ROTR and everything was great.
> 
> Started Witcher 3- 30 min crash. Had to lower memory OC and keep temps even more in check.
> 
> Witcher 3 is very sensitive to OC. Especially memory OC.


I hear you. I was loving the 120 fps average; I even set everything to High instead of Ultra to have smoother gameplay. I'll have to test with my old +600 mem OC; hopefully I won't see much of an fps drop.


----------



## Benny89

Quote:


> Originally Posted by *Aganor*
> 
> I hear you. I was loving the 120 fps average, even used everything in High, instead of Ultra to have a smoother gameplay but will have to test with my old +600 MemOC, hopefully wont see much of a fps drop.


Memory OC in games will give you almost no performance anyway, at least not with such a small difference. I see no difference in fps in Witcher 3 between a 5800 and a 6000MHz OC.

Benches are different from games. Same as synthetic tests for CPUs vs actual gaming performance.


----------



## Luckbad

Quote:


> Originally Posted by *IMI4tth3w*
> 
> Thanks for watching man!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But i finally just full crashed LOL so i guess a good place to stop. Going to do a couple runs without all the background nonsense with some of the higher overclocks and see what scores pop up...
> 
> incoming edit with scores....
> 
> 2126Mhz/6066MHz @1.075v = 10249 (Superposition) (9788 was the score i got while streaming)
> 
> now check this...
> 
> 2126MHz/6026MHz @ 1.075v = 10405


Word! Great score. It was fun watching someone else OC their card.


----------



## jim2point0

Quote:


> Originally Posted by *Benny89*
> 
> Memory OC in games will give you almost no performacne anyway. At least not with such small differene. I see no difference in fps in Witcher 3 between 5800 and 6000mhz OC.
> 
> Benches are different from games. Same like syntetic tests for CPU vs actuall gaming performance.


This is how I'm coping with my Strix that only manages 9950 in superposition. Can't even break 10000 with it. But it probably performs nearly the same as a card that's doing 10200. Right? Right.....?


----------



## Benny89

Quote:


> Originally Posted by *jim2point0*
> 
> This is how I'm coping with my Strix that only manages 9950 in superposition. Can't even break 10000 with it. But it probably performs nearly the same as a card that's doing 10200. Right? Right.....?


Yes, you will see close to zero difference in games, really. Benches are not games, and there are various tricks we use to bump those bench scores as high as possible.

In games, as some people have shown, it can even be a matter of a 0.80 fps difference....

So no worries, just enjoy your card and play games


----------



## dunbheagan

Quote:


> Originally Posted by *jim2point0*
> 
> This is how I'm coping with my Strix that only manages 9950 in superposition. Can't even break 10000 with it. But it probably performs nearly the same as a card that's doing 10200. Right? Right.....?


Absolutely. 9950 or 10200 does not change your gaming experience AT ALL. It is a 1 fps difference, or maybe two at high framerates. Not worth thinking about; you are totally fine. Enjoy your games, this 10k mark is just a psychologically effective number...


----------



## Aganor

Quote:


> Originally Posted by *jim2point0*
> 
> This is how I'm coping with my Strix that only manages 9950 in superposition. Can't even break 10000 with it. But it probably performs nearly the same as a card that's doing 10200. Right? Right.....?


It's easy to get lost in the bench world and not enjoy our card until it's ready to be sold again.

Just enjoy the beast you have; games will run smooth as hell with it, especially on a G-Sync display. It's hard to see the difference from 90 to 120 fps anyway.

I went back to an old save game in Witcher, in the swamps, and my frames went from 130 to 90 in seconds and I didn't even notice unless I looked at Fraps. Maybe I'll turn HairWorks on again; Geralt's beard gets strange without HairWorks. Worth the 15 fps lost haha


----------



## nrpeyton

I'm ordering my new waterblock today. _(for 1080Ti FE)_

Is there any difference in temps between the:
-EK Titan X Pascal
&
-EK 1080Ti FE blocks

(as you know both fit).

The Titan block is cheaper, so if temps are the same I'll buy the Titan one.


----------



## TheWizardMan

Quote:


> Originally Posted by *nrpeyton*
> 
> I'm ordering my new waterblock today.
> 
> Anyone know if there's any difference in temps between the:
> -EK Titan X Pascal
> &
> -EK 1080Ti FE blocks
> 
> (as you know both fit 1080Ti FE).


Everything I have seen seems to say they run the same. My Ti blocks run in the mid-to-low 20s while idling and tend to sit just below or just over 40 under normal gameplay. Intense gameplay will bring them up to mid- to high 40s.

*This is an SLI setup, so things get a bit hotter.


----------



## cluster71

Those of you using the XOC BIOS: what level do you have in terms of "game stable", I mean 4 hours of hard gaming? It is usually not the same as benchmark level. I've been at 2126 core at 1.100V, but I may need to give the Zotac Extreme 1.112V.


----------



## Benny89

So I am right now planning a water loop for my rig.

Now, a question for first-time water cooling: do you recommend soft tubing or hard tubing? I will want hard tubes in the future anyway, but I wonder if I shouldn't start with soft ones at first.

Opinions?

Also, how should the loop go? From pump->GPU->radiator->CPU->radiator->pump? Is that OK?


----------



## TheWizardMan

Quote:


> Originally Posted by *Benny89*
> 
> So I am right now planning water loop for my RIG.
> 
> Now question- for first time water cooling- do you recommend soft tubing or to go hard tubing? I will want hard tubes in future anyway, but I wonder if I should not start with soft ones at first.
> 
> Opinions?
> 
> Also how should loop go? From pump->GPU->radiator->CPU->radiator->pump? Is that ok?


I went hard tubing for my first build, but it was a pain and took a lot of patience. If you want hard tubing and are willing to put in the time, effort and money, then do it.

As far as loop order goes, the only thing that actually matters is that the reservoir comes before the pump; everything else will only make minor differences in temps (and maybe zero difference after the loop stabilizes). That being said, most people suggest you go CPU before GPU, so if you can, maybe go res-pump-CPU-rad-GPU-rad. Having a rad between the pump and the first component could help minimally as well, as the pump dumps a small amount of heat into the loop too.


----------



## Aganor

Yesterday I got up to 51°C in Witcher. I think my ambient temp was near 29°C; at least the case sensors were showing 29°C at idle.

Quite hot!!


----------



## TheWizardMan

Quote:


> Originally Posted by *Aganor*
> 
> Yesterday i got up to 51ºc in witcher. i think my ambient temp was near 29ºc. At least case sensors where marking 29ºc on idle.
> 
> Quite hot!!


Yeah, I got up to like 70 playing Overwatch with 200% render on a 4K display. I don't suggest doing that.


----------



## Aganor

Quote:


> Originally Posted by *TheWizardMan*
> 
> Yea, I got up to like 70 playing overwatch with 200% render on a 4K display. Don't suggest doing that.


in Water?!


----------



## TheWizardMan

Quote:


> Originally Posted by *Aganor*
> 
> in Water?!


Yep. But the fans on my 240 mm rad had stopped working and I didn't realize it right away.


----------



## Benny89

Quote:


> Originally Posted by *TheWizardMan*
> 
> I went hard tubing for my first build, but it was a pain and took a lot of patience. If you want hard tubing and are willing to put in the time, effort and money, then do it.
> 
> As far as loop order goes, the only thing that actually matters is that the reservoir comes before the pump, everything else will only make minor differences in temps (and maybe 0 difference after the loop stabilizes). That being said most people suggest you go CPU before GPU, so if you can maybe go res-pump-CPU-rad-GPU-rad. Having a rad between the pump and the first component could help minimally as well, as the pump will have a minor tdp as well.


Thanks, but why CPU first?

I am planning it this way. I know it's ugly, but this is just a concept











Looks OK?


----------



## TheWizardMan

Quote:


> Originally Posted by *Benny89*
> 
> Thanks but why CPU first?
> 
> I am planning it this way. I know its ugly but this is just a concept
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks ok?


Hitting the CPU first will give you the best OC results, because your CPU, more than your GPU, will benefit from the slightly colder water. That being said, the real-world difference is so negligible that you should be fine with this path. If you really wanted to hit your CPU first, just run that same loop the other way.


----------



## TheBoom

Quote:


> Originally Posted by *cluster71*
> 
> You who use with XOC bios. What level do you have in terms of "Game Stable" I mean 4 hours of hard gaming. It is usually not the same as Benchmark level. I've been at core 2126 1.100v but may need to give 1.112v to Zotac Extreme.


I've found the FSU stress test will tell you within 5 minutes whether you are rock-solid stable or not. You got a really good card there, by the way. Mine does 2088 at 1.131V on the XOC BIOS.
Quote:


> Originally Posted by *TheWizardMan*
> 
> Hitting the CPU first will give you the best OC results because your CPU, more than your GPU, will benefit from the slightly colder water. That being said, the real world difference is so negligible that you should be fine with this path. If you really wanted to hit your cpu first, just do that same loop going the other way.


Maybe for the first 5 minutes? After that, the water in your loop will reach an equilibrium, so it really won't matter which component comes first in the loop.


----------



## Agent-A01

Quote:


> Originally Posted by *TheWizardMan*
> 
> Hitting the CPU first will give you the best OC results because your CPU, more than your GPU, will benefit from the slightly colder water. That being said, the real world difference is so negligible that you should be fine with this path. If you really wanted to hit your cpu first, just do that same loop going the other way.


It makes zero difference.

Water will hit an equilibrium throughout the loop.

Does not matter if you went RES > CPU > GPU > GPU > RAD either.
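A rough heat-balance number backs this up. The sketch below is my own back-of-envelope estimate (the wattage and flow-rate figures are assumptions, not from anyone's loop): it computes how much the coolant warms while crossing a single block, which bounds how much loop order can possibly matter.

```python
def delta_t_c(power_w, flow_l_per_h):
    """Steady-state temperature rise of water crossing one component:
    dT = P / (mass_flow * specific_heat)."""
    c_water = 4.18                               # J/(g*K)
    mass_flow_g_s = flow_l_per_h * 1000 / 3600   # ~1 g per mL of water
    return power_w / (mass_flow_g_s * c_water)

print(round(delta_t_c(250, 200), 2))  # 250 W GPU at 200 L/h -> ~1.08 C
print(round(delta_t_c(250, 60), 2))   # same GPU at a slow 60 L/h -> ~3.59 C
```

At typical pump flow rates the water only picks up about a degree across even a hot GPU, so whichever block comes "first" sees water that is, at most, a degree or so cooler.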


----------



## dunbheagan

Quote:


> Originally Posted by *Benny89*
> 
> Thanks but why CPU first?
> 
> I am planning it this way. I know its ugly but this is just a concept
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks ok?


Yeah, it looks OK. You can't really go wrong with the order (if you have res and pump in one piece); water temps within a running loop are nearly the same at every spot. In an average loop (pump at 60L/h, 500ml coolant) it takes about 30 seconds for the coolant to complete one full circuit. If you have your pump higher, at 100 or 200L/h, it can be even under 10s for the water to go around. If I were you I would go for soft tubing and just pick the order which looks best in your PC case.

Just for information, the order I have in my loop:
Res - Pump - 360 Rad - 1080Ti - 4790K - 240 Rad - Res



The Res+Pump is in the back of the case.

Just don't forget to build in a way to easily drain your loop, like a ball valve at the lowest point. It will make your life MUCH easier later!
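The circulation-time figures above check out with simple arithmetic; a quick sketch:

```python
def circuit_time_s(coolant_ml, flow_l_per_h):
    """Seconds for the coolant volume to make one full pass around the loop."""
    flow_ml_per_s = flow_l_per_h * 1000 / 3600
    return coolant_ml / flow_ml_per_s

print(round(circuit_time_s(500, 60), 1))   # 500 mL at 60 L/h  -> 30.0 s
print(round(circuit_time_s(500, 200), 1))  # 500 mL at 200 L/h -> 9.0 s
```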


----------



## TheWizardMan

Quote:


> Originally Posted by *Agent-A01*
> 
> It makes zero difference.
> 
> Water will hit an equilibrium throughout the loop.
> 
> Does not matter if you went RES > CPU > GPU > GPU > RAD either.


This is a point of pretty significant debate in the community, and people frequently demonstrate minor differences. But as I've said, repeatedly, it will make very little difference and will likely be negligible in real-world situations.


----------



## Benny89

Thanks for all the suggestions. I have a plan now to get good components and finally put that 1080 Ti under well-deserved water









EDIT: Does tubing size matter? Like the diameter? Is 10/13mm OK, or do I need bigger?


----------



## Nico67

Quote:


> Originally Posted by *Benny89*
> 
> Thanks but why CPU first?
> 
> I am planning it this way. I know its ugly but this is just a concept
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks ok?


It usually comes down to whatever is easiest for routing the tubing. Common theory is pump->rad->CPU->rad->GPU->res->pump, although rads together would be fine. A rad before a block ensures the coolest possible point, but there may only be a fraction of a degree in it.
One thing that would help is pushing cold outside air through the top rad, then pulling hot internal air out the back of the case; that could make a degree or two of difference. Not so easy to accomplish, but similar to taking the side off the case.


----------



## TheWizardMan

Quote:


> Originally Posted by *Nico67*
> 
> It usually comes down to what ever is easy to pump the tubing. Common theory is pump-> rad->cpu->rad->gpu->res->pump, although rads together would be fine. Rad before ensures coolest possible point, but may only be a fraction of a degree in it.
> One thing that would help is pushing cold outside air thru the top rad, Then pull hot internal air out the back of the case, that could make a degree or two difference. Not so easy to accomplish, but similar to taking the side off the case.


^I agree that fan placement may be the biggest problem with this setup. It could have some "dead zones" of air turbulence, and it's going to be an equal-pressure setup when you really should be aiming for positive pressure.


----------



## TheBoom

Is it possible to damage a thermal pad such that it affects overclocking?

I was replacing the TIM on my AMP Extreme and accidentally got some paste on the pad that goes on one of the VRAM chips. Tried to get it out, but some of the pad got scratched off instead. Kinda like when you scratch paper when it's wet, if you know what I mean. It's not even that deep, and I honestly thought it wouldn't make a difference.

But for some reason, even though my temps went down, even at lower fan speeds, my previous overclock isn't stable anymore?

The only other explanation I can think of is that the lower fan speeds are causing higher temps in other areas, but I'm kinda paranoid about the pad being damaged lol.


----------



## Somasonic

Hi all, I'm getting a bit sick of the coil whine from my Asus Strix OC. It's not awful but I've built my system to be quiet and I can hear a constant buzzing when gaming (I don't use headphones). My supplier has said I can return the card, so I'm hoping I can switch to another brand and am wondering if there are any 1080 Ti cards that are known to have no coil whine and a quiet fan? Thanks heaps for any guidance


----------



## TheBoom

Quote:


> Originally Posted by *Somasonic*
> 
> Hi all, I'm getting a bit sick of the coil whine from my Asus Strix OC. It's not awful but I've built my system to be quiet and I can hear a constant buzzing when gaming (I don't use headphones). My supplier has said I can return the card, so I'm hoping I can switch to another brand and am wondering if there are any 1080 Ti cards that are known to have no coil whine and a quiet fan? Thanks heaps for any guidance


Coil whine is really kind of a luck thing. But my AMP Extreme doesn't have any audible whine and fans are about on par with the strix. Strix has the quietest fans if nothing has changed since the recent reviews.

Gaming X and Aorus also have quiet fans but not sure about coil whine on those.


----------



## dallas1990

I wish there was a waterblock for the Zotac 1080 Ti AMP Extreme. Idk of one already watercooled.


----------



## dunbheagan

Quote:


> Originally Posted by *Somasonic*
> 
> Hi all, I'm getting a bit sick of the coil whine from my Asus Strix OC. It's not awful but I've built my system to be quiet and I can hear a constant buzzing when gaming (I don't use headphones). My supplier has said I can return the card, so I'm hoping I can switch to another brand and am wondering if there are any 1080 Ti cards that are known to have no coil whine and a quiet fan? Thanks heaps for any guidance


You need a lot of luck to get a 1080Ti without coil whine. Maybe your next card will be worse.

If you can sacrifice a little performance, try undervolting your card. Mine has medium coil whine, but I mostly game at [email protected],793V, where I have no audible coil whine anymore, even with an open case. This costs me about 10% performance but has other advantages too: less heat, less power draw and lower case fan rpm. All in all my system is much quieter, cooler and more efficient. For me it is a very good trade-off for the 10% less performance. Maybe this is a solution for you too.
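For anyone curious how an undervolt that costs ~10% performance can help so much with noise and heat: dynamic power scales roughly with frequency times voltage squared. The clocks in the sketch below are hypothetical, chosen only to illustrate the math; the 0.793V figure is the only number taken from the post above.

```python
def rel_power(f_new, v_new, f_old, v_old):
    """Rough dynamic-power ratio: P ~ f * V^2 (ignores static leakage)."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Hypothetical example: ~1721 MHz @ 0.793 V vs ~1911 MHz @ 1.062 V stock.
print(round(rel_power(1721, 0.793, 1911, 1.062), 2))  # ~0.5: half the power
```

Under these assumed clocks, giving up ~10% frequency at the much lower voltage roughly halves the card's dynamic power draw, which is consistent with the big drop in heat and fan noise described.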


----------



## IMI4tth3w

I think I may have found a way for you guys to get better overclocks using the XOC BIOS. I spent a LOT of time messing with the voltage curve last night.

What you do is first find your highest stable bin using the offset (do not touch the voltage curve). For me, I'm stable at +95 but unstable at +105 (2088/2100 respectively).

Once I found my stable offset, I adjust one of the voltage points up to 2100. For me, I'm 2100 stable at 1.062V this way (while I'm not stable doing the +105 offset). This may take a few tries, and you might even need to do something like a +98 offset and then adjust the 2100 voltage point. Try different voltages (lower and higher) and see what scores better/is stable.

Now that I'm stable at 2100, I move on to 2113. (It's probably a good idea to save the curve once you have good stability/performance at 2100; sometimes it is nearly impossible to recreate a voltage curve.)
Using the same voltage curve, adjust the next voltage point (1.075V for me) to 2113. Sometimes you may have to move it a couple of times, as the point will be graphically at the same level as 2100, but if you look at the offset of that point, it will be lower than the one at 1.063V for 2100.

The real trick is to get just the right voltage curve. Start with the highest stable offset you can do (and experiment with adding/subtracting a couple from it; it will affect some points on the curve that can influence stability/performance).

For those of you who are unhappy with your scores or overclock, don't fret, as you may just need to work on your voltage curve.
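The procedure above can be sketched as a simple loop. `stable()` here is a stand-in for actually running your benchmark by hand, and the bins and clocks are made up; this is just the shape of the method, not a tool:

```python
def tune_curve(base_curve, stable, step=13):
    """Walk the voltage bins one at a time, as described above: try bumping
    each bin by one ~13 MHz step, keep the bump only if the resulting
    curve still passes the stability check, otherwise revert."""
    curve = dict(base_curve)
    for v in sorted(curve):
        trial = dict(curve)
        trial[v] = curve[v] + step
        if stable(trial):
            curve = trial
    return curve

# Toy stability model: this imaginary card can't hold more than 2113 MHz.
base = {1.050: 2063, 1.063: 2088, 1.075: 2100}
print(tune_curve(base, lambda c: max(c.values()) <= 2113))
# {1.05: 2076, 1.063: 2101, 1.075: 2113}
```

In practice the "stability check" is a full benchmark pass per tweak, which is why this eats an evening, but the per-bin approach is what separates it from a blanket core offset.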


----------



## Somasonic

Quote:


> Originally Posted by *TheBoom*
> 
> Coil whine is really kind of a luck thing. But my AMP Extreme doesn't have any audible whine and fans are about on par with the strix. Strix has the quietest fans if nothing has changed since the recent reviews.
> 
> Gaming X and Aorus also have quiet fans but not sure about coil whine on those.


Quote:


> Originally Posted by *dunbheagan*
> 
> You need a lot of luck to get a 1080Ti without coil whine. Maybe your next card will be worse.
> 
> If you can sacrifice a little performance, try undervolting your card. Mine has medium coil whine, but I mostly game undervolted at 0.793 V, where I no longer have any audible coil whine, even with an open case. This costs me about 10% performance but has other advantages too: less heat, lower power draw and lower case fan RPM. All in all my system is much quieter, cooler and more efficient. For me it is a very good trade-off for the 10% performance loss. Maybe this is a solution for you too.


Thanks for that, that's pretty much what I was expecting to hear. I'm happy to try undervolting, but it's something I haven't played around with before; what sort of values do you think I should be going for?


----------



## Benny89

Quote:


> Originally Posted by *Somasonic*
> 
> Hi all, I'm getting a bit sick of the coil whine from my Asus Strix OC. It's not awful but I've built my system to be quiet and I can hear a constant buzzing when gaming (I don't use headphones). My supplier has said I can return the card, so I'm hoping I can switch to another brand and am wondering if there are any 1080 Ti cards that are known to have no coil whine and a quiet fan? Thanks heaps for any guidance


Had a total of 5 1080 Ti STRIX OC cards so far and none had coil whine. I must be extremely lucky


----------



## nrpeyton

*Compatible EKWB blocks for 1080Ti FE*

1080Ti block

*Notice the WIDER 'memory step'*

Titan X Pascal block (also compatible)


Last month I *submitted a support ticket to EKWB* asking which block gave the best temps. They said the difference is negligible, but they still *recommended the Titan block* (for my 1080Ti FE).

To me, every last degree counts!

The Titan block is also now cheaper!

BUT -- the plot thickens!
If you look at the two pictures above, you'll notice the 'step' for the memory is actually wider on the 1080Ti block.

Does anyone know if the Titan block covers EVERY INCH of all memory modules on the 1080Ti PCB?


----------



## TheBoom

Wow, this sucks. FSU doesn't crash with a lower memory OC. Turns out that little bit of damage on the thermal pad made a difference after all.


----------



## Benny89

Quote:


> Originally Posted by *Nico67*
> 
> It usually comes down to whatever is easiest to plumb the tubing. The common order is pump -> rad -> CPU -> rad -> GPU -> res -> pump, although both rads together would be fine. A rad before the GPU ensures the coolest possible point, but there may only be a fraction of a degree in it.
> One thing that would help is pushing cold outside air through the top rad, then pulling hot internal air out the back of the case; that could make a degree or two of difference. Not so easy to accomplish, but similar to taking the side off the case.


Thanks for the suggestions. Sadly the top rad can only have fans pushing air from the case through the rad, as anything else would require case modding, which I am not ready for.

Well, it will still be better than air or an AIO, so...



----------



## cg4200

Hey, the picture is not the best (it's dark in here), but my thermal pad goes a hair past the VRAM, so I would say it's flush.
I have a few rads on the loop with a 6850K at 4.6; Fire Strike Ultra does not go over 37C, and it has been 70F-ish here in New England this week, running tests with VRAM at +700 and core at 2062.
If you can live with it, fine; not saying you have to, but my 1080 Ti does not bother me.


----------



## cg4200

That's crappy. If it were me and I had no new thermal pads, I would take it off, clean the VRAM and apply a thin layer of your best thermal paste on top of the pad wherever you damaged it.
I used to do all my VRAM, but I change cards a lot so I no longer do it every time; I was eating up some Thermal Grizzly.


----------



## Somasonic

Quote:


> Originally Posted by *Benny89*
> 
> Had total of 5 1080 ti STRIX OC thus far and none had coil whine. I must be extreme lucky


To be fair, I game at 1440p 60Hz and I didn't get coil whine unless I used DSR to play at 4K, which I was doing just for fun, so it didn't bother me. However, I just started Dying Light and the coil whine is constant at 1440p and just not acceptable to me. I've tried all the usual tricks like VSYNC, FPS caps, etc., so I'm looking for other tips or a card that won't whine (which sounds like a gamble rather than anything with any degree of certainty attached to it). Also, I'm hoping to go ultrawide later in the year, and I'm guessing that with the card working harder the whine will just get more prevalent.


----------



## Benny89

Is it OK to put a little thermal paste on the thermal pads before placing them on the VRAM when you install a water block?


----------



## ezveedub

Quote:


> Originally Posted by *Benny89*
> 
> Is it ok to put a little thermal paste on thermal pads before placing them on VRAMs when you install water block?


Used to do that in the past to hold them in place or when a pad cracked and it was being reused. Never had any issues.

Sent from my iPhone using Tapatalk Pro


----------



## IMI4tth3w

Quote:


> Originally Posted by *Benny89*
> 
> Is it ok to put a little thermal paste on thermal pads before placing them on VRAMs when you install water block?


A small dab of non-conductive thermal paste should be just fine.


----------



## TheBoom

Quote:


> Originally Posted by *cg4200*
> 
> That's crappy.. If it was me and I had no new thermal pads I would take off clean vram and apply thin layer of your best thermal paste or top of pad where ever you damaged it..
> I used to do all my vram but I change cards a lot so I no longer do it every time and I was eating up some thermal grizzly..


Does this look like it could affect temperatures? I think I'm gonna apply some paste nonetheless.


----------



## Benny89

Quote:


> Originally Posted by *IMI4tth3w*
> 
> a small tab of non conductive thermal paste should be just fine


Will that make any difference in temps? Using Grizzly Kryo?

EDIT: *Who else makes good Radiators besides EK? What brand would you recommend?*


----------



## cluster71

I would not put thermal paste on the pad. I did it on my FE when I did not have pads at home. The next time I took the card apart it looked like crap. It took almost a whole night to clean the entire card.


----------



## cluster71

I wrote earlier that I had high temperatures on the Zotac, and earlier with the FE, even though I changed the paste. I think the green padding is too thick so the spring load cannot really squeeze it, so I picked a 1.0 mm pad instead of the green one, which seems to be 1.5 mm, and I have had low temperatures since then. I also doubled up the springs, one inside the other.


----------



## cg4200

Quote:


> Originally Posted by *TheBoom*
> 
> Does this look like it could affect temperatures? I think I'm gonna apply some paste nonetheless.


I would say that if you had it running stable on memory and now it's not, at the same ambient temp, maybe the RAM is getting hotter.
I have never been sloppy when applying thermal paste on VRAM; with a thin layer, when I peel off the thermal pad, a Q-tip and a couple of minutes gets it all clean.
Some older EKWB manuals actually tell you to apply it; it helps keep the thermal pads from moving when mounting the water block.
Anyhow, if you're worried, I would just do the thermal pad that looks gouged up in the pic: apply paste directly on the pad, then take the corner of an old credit card and smooth it over until about flush.
Good luck


----------



## IMI4tth3w

Quote:


> Originally Posted by *TheBoom*
> 
> Does this look like it could affect temperatures? I think I'm gonna apply some paste nonetheless.


That is not going to make any difference in temps. As long as it's touching, it should be fine; thermal pads are much more forgiving.


----------



## TheBoom

Quote:


> Originally Posted by *cluster71*
> 
> I wrote earlier that I had high temperatures on Zotac and earlier with FE even though I changed the paste. I think the green padding is too thick so the spring load can not really squeeze, so I picked 1.0 mm pad instead of the green which seems to be 1.5 mm and I have had low temperatures since then. I also took double springs in each other


I thought of changing the pads, but won't that void the warranty? I mean, it'll be obvious that you opened up the card and changed the pads.


----------



## TheBoom

Quote:


> Originally Posted by *IMI4tth3w*
> 
> that is not going to make any difference in temps. as long as its touching it it should be fine. thermal pads are much more forgiving.


Thanks man. I did what you said and applied a small layer of paste on the VRAM anyway. For some reason it's stable in FSU at the same OC now. In the end it could just be Pascal being Pascal for all I know: stable one moment and not the next.


----------



## dallas1990

@Benny I use Alphacool because I didn't want to mix metals, so I went with them since they use copper.


----------



## cluster71

My opinion is: if you are worried about warranty, have loans on the items or share the PC with your family (so it's not your own), then you'll never get started with extreme overclocking or modification. I do not care if things break. I have spent a lot of money on electronics over the years. Some things I ruin, some turn out perfect, some stay unchanged.


----------



## jura11

Quote:


> Originally Posted by *nrpeyton*
> 
> *Compatible EKWB blocks for 1080Ti FE*
> 
> 1080Ti block
> 
> *Notice the WIDER 'memory step'*
> 
> Titan X Pascal block (also compatible)
> 
> 
> Last month I *submitted a support ticket to EKWB* asking which block gave the best temps. They said the difference is negligible, but they still *recommended the Titan block* (for my 1080Ti FE).
> To me; every last degree counts!
> 
> The titan block is also now cheaper!
> 
> BUT -- the plot thickens!
> If you look at the two pictures above; you'll notice the 'step' for the memory is actually wider on the 1080Ti block.
> 
> Does anyone know if the Titan block covers EVERY INCH of all memory modules on the 1080Ti PCB?


Hi there

I can speak for myself: running an EK-FC Titan X Pascal block on my EVGA GTX 1080 Ti, my temps under load in games are 36-38C max. I've seen higher only once (42C), when ambient temps were in the high 20s (29C) and I was running my usual OC. In the same loop I also have two GTX 1080s running 2164MHz and 2100MHz, with 3 rads (360mm 60mm thick, 240mm 60mm thick and 120mm 30mm thick).

Here is screenshot from Aquasuite and IRAY rendering



and here are a few of my OC settings, still on the stock EVGA FE BIOS:

2088MHz at 1.05v +425MHz on VRAM



2113MHz at 1.06v +500MHz on VRAM



2116MHz at 1.07v +450MHz on VRAM



2126MHz at 1.07v +450MHz on VRAM



2138MHz at 1.09v +450MHz on VRAM



Hope this helps

Thanks,Jura


----------



## Dasboogieman

Quote:


> Originally Posted by *nrpeyton*
> 
> *Compatible EKWB blocks for 1080Ti FE*
> 
> 1080Ti block
> 
> *Notice the WIDER 'memory step'*
> 
> Titan X Pascal block (also compatible)
> 
> 
> Last month I *submitted a support ticket to EKWB* asking which block gave the best temps. They said the difference is negligible, but they still *recommended the Titan block* (for my 1080Ti FE).
> To me; every last degree counts!
> 
> The titan block is also now cheaper!
> 
> BUT -- the plot thickens!
> If you look at the two pictures above; you'll notice the 'step' for the memory is actually wider on the 1080Ti block.
> 
> Does anyone know if the Titan block covers EVERY INCH of all memory modules on the 1080Ti PCB?


They probably recommended the Titan X block because its stock is being cleared now.
Just get the 1080 Ti block if you are unsure; that way, in the infinitesimally small chance anything goes wrong, you can blame EKWB and they can't turn around and say "you shouldn't have used a Titan X block with a 1080 Ti".
As for the temp difference, it is well within the margin of error. Your TIM application alone probably has more impact than which block, lol.


----------



## done12many2

Quote:


> Originally Posted by *Hulio225*
> 
> Total points maybe... gpu score far away from top spot
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/spy/1729047


Agreed.

Time Spy, FireStrike and the like are horrible ways to compare GPU performance unless you look at graphics score alone. The overall score means nothing for GPU performance comparison because CPUs like the 5960x and 6950x score better overall due to physics scores.
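To see why the overall number misleads, here is a toy weighted-harmonic-mean score in the 3DMark style. The 0.85/0.15 weights and the sub-scores are illustrative assumptions, not Futuremark's published values:

```python
# Toy 3DMark-style overall score: a weighted harmonic mean of the
# graphics and CPU sub-scores. Weights here are made up for
# illustration, NOT Futuremark's actual constants.

def overall(graphics, cpu, w_gpu=0.85, w_cpu=0.15):
    return 1.0 / (w_gpu / graphics + w_cpu / cpu)

same_gpu = 10000                       # identical graphics score
print(overall(same_gpu, 5000))         # paired with a modest CPU
print(overall(same_gpu, 9000))         # paired with a HEDT CPU
```

With the exact same graphics score, the rig with the stronger CPU posts a visibly higher overall number, which is why only the graphics sub-score is a fair GPU comparison.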

Quote:


> Originally Posted by *nrpeyton*
> 
> I'm ordering my new waterblock today. _(for 1080Ti FE)_
> 
> Is there any difference in temps between the:
> -EK Titan X Pascal
> &
> -EK 1080Ti FE blocks
> 
> (as you know both fit).
> 
> The Titan block is cheaper. So if temps are the same-- I'll buy the titan one.


I would seriously doubt that either is better. The 1080 Ti block comes with the new single-slot I/O bracket; not sure about the Titan X Pascal block, but they didn't originally. As mentioned, if EK said to get the Titan X Pascal, they probably had more of them on hand. I'd imagine 1080 Ti blocks are selling as fast as they can make them. Single slot is sexy.


----------



## alucardis666

Anyone know the GPU power fix to avoid the black screen / loss of signal issue (the screen will lose video for 1-3 seconds at random) when connected to a 4K screen @60Hz? I've had this with every 10-series card for the past few months, in SLI and single-card configs, and it's EXTREMELY annoying.









Thanks!


----------



## cluster71

It may be that your DisplayPort cable cannot hold the signal. You may need to change to a DP 1.3 cable.


----------



## cluster71

I changed mine because I was losing the signal regularly.


----------



## alucardis666

Quote:


> Originally Posted by *cluster71*
> 
> It may be your Display Port cable that may not be able to hold the signal. You may need to change to DP v 1.3


Quote:


> Originally Posted by *cluster71*
> 
> I have changed myself because I lost contact regularly


So I changed from Optimal Power to Adaptive in the NVIDIA Control Panel and haven't had one black screen issue yet!









Also I'm connected via HDMI. 4k TV user here.


----------



## FastEddieNYC

Quote:


> Originally Posted by *Benny89*
> 
> Will that make any difference in temps? Using Grizzly Kryo?
> 
> EDIT: *Who else makes good Radiators besides EK? What brand would you recommend?*


Alphacool and Hardware Labs make some of the best radiators you can buy.


----------



## mouacyk

Quote:


> Originally Posted by *Benny89*
> 
> Will that make any difference in temps? Using Grizzly Kryo?
> 
> EDIT: *Who else makes good Radiators besides EK? What brand would you recommend?*


http://www.xtremerigs.net/categories/reviews/water-cooling/radiators/


----------



## TheWizardMan

Quote:


> Originally Posted by *FastEddieNYC*
> 
> Alphacool and Hardware Labs make some of the best radiators you can buy.


I agree. It also depends on what you want out of a radiator. For instance, if you have a narrow rad you may want greater fin density (but this will have other issues such as flow restriction). Likewise, if you want a silent system, you need to look at thicker rads with lower fin densities. I have 3 Hardware Labs rads in my system and I like them quite a bit. Cleaning them was quite easy and there really wasn't much residue in the rads.


----------



## dunbheagan

Quote:


> Originally Posted by *Somasonic*
> 
> Thanks for that, that's pretty much what I was expecting to hear. I'm happy to try undervolting but it's something I haven't played around with before, what sort of values do you think I should be going for?


Quote:


> Originally Posted by *Somasonic*
> 
> Hi all, I'm getting a bit sick of the coil whine from my Asus Strix OC. It's not awful but I've built my system to be quiet and I can hear a constant buzzing when gaming (I don't use headphones). My supplier has said I can return the card, so I'm hoping I can switch to another brand and am wondering if there are any 1080 Ti cards that are known to have no coil whine and a quiet fan? Thanks heaps for any guidance


To give it a quick try, just open the curve editor with strg+l in Afterburner, select the lowest voltage point at 800mV and hit L, then Apply, to lock this point. Your card should now operate at a fixed 800mV. Try some gaming to see if this fixes your coil whine issue. If it helps, you can invest a little more time to find the highest stable clock at 800mV (should be between 1700-1800MHz) and make a totally flat curve at this clock straight from 800mV to 1200mV.
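The earlier "roughly 10% performance for a much quieter, cooler card" trade-off lines up with the usual rule of thumb that dynamic power scales with frequency times voltage squared. A rough sketch of the estimate, with assumed clocks (the 2000/1775 MHz figures are illustrative, not measured):

```python
# Rough dynamic-power estimate: P ~ f * V^2 (a rule of thumb that
# ignores leakage and board power, so treat it as an upper bound
# on the savings). Clock values are assumed for illustration.

def relative_power(f_mhz, v, f0_mhz, v0):
    """Power at (f_mhz, v) relative to a baseline (f0_mhz, v0)."""
    return (f_mhz / f0_mhz) * (v / v0) ** 2

# Baseline ~2000 MHz at 1.062 V vs an undervolt ~1775 MHz at 0.800 V
ratio = relative_power(1775, 0.800, 2000, 1.062)
print(f"~{(1 - ratio) * 100:.0f}% less dynamic power "
      f"for ~{(1 - 1775 / 2000) * 100:.0f}% less clock")
```

Roughly half the switching power for about a tenth of the clock, which is why locking a low voltage point tames coil whine, heat and fan noise so effectively.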


----------



## TWiST2k

Quote:


> Originally Posted by *dunbheagan*
> 
> To give it a quick try, just open the curve editor with strg+l in afterburner, select the lowest voltage point at 800mV at hit L then apply to lock this point. Your card shoud now operate at fixed 800mV. Try some gaming, if this fixes your coil whine issue. If it helps, you can invest a little more time to find the highest stable clock at 800mV(should be between 1700-1800MHz) and make a totally flat curve at this clock straight from 800mV to 1200mV.


What exactly is strg+l? Did you mean ctrl+l? And isn't it ctrl+F to open the curve?


----------



## KedarWolf

Quote:


> Originally Posted by *dunbheagan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Somasonic*
> 
> Thanks for that, that's pretty much what I was expecting to hear. I'm happy to try undervolting but it's something I haven't played around with before, what sort of values do you think I should be going for?
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Somasonic*
> 
> Hi all, I'm getting a bit sick of the coil whine from my Asus Strix OC. It's not awful but I've built my system to be quiet and I can hear a constant buzzing when gaming (I don't use headphones). My supplier has said I can return the card, so I'm hoping I can switch to another brand and am wondering if there are any 1080 Ti cards that are known to have no coil whine and a quiet fan? Thanks heaps for any guidance
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> To give it a quick try, just open the curve editor with strg+l in afterburner, select the lowest voltage point at 800mV at hit L then apply to lock this point. Your card shoud now operate at fixed 800mV. Try some gaming, if this fixes your coil whine issue. If it helps, you can invest a little more time to find the highest stable clock at 800mV(should be between 1700-1800MHz) and make a totally flat curve at this clock straight from 800mV to 1200mV.
Click to expand...

www.underclock.net This website is great!!


----------



## dunbheagan

Quote:


> Originally Posted by *TWiST2k*
> 
> What exactly is strg+I ? Did you mean ctrl+I ?And isn't it ctrl-F to open the curve?


Yeah, you are right, it is ctrl+F to open the curve, my fault.

PS: Strg is the Ctrl key on my German keyboard layout


----------



## TWiST2k

Quote:


> Originally Posted by *dunbheagan*
> 
> Yeah, you are right, it is ctrl-F to open the curve, my fault.
> 
> PS: strg is the ctrl cap on my german keyboard layout


That's cool! I learn something new everyday! Thank you for the clarification.


----------



## Dasboogieman

Quote:


> Originally Posted by *Somasonic*
> 
> Hi all, I'm getting a bit sick of the coil whine from my Asus Strix OC. It's not awful but I've built my system to be quiet and I can hear a constant buzzing when gaming (I don't use headphones). My supplier has said I can return the card, so I'm hoping I can switch to another brand and am wondering if there are any 1080 Ti cards that are known to have no coil whine and a quiet fan? Thanks heaps for any guidance


You wanna hear a big revelation?
All cards have coil whine! Anyone who claims otherwise is either lying, deaf, or lucky that their system doesn't have audible resonance.
It's just the way the inductors work. ASUS used some pretty high-end (I'm pretty sure noise-reduced) coils in your Strix. Chances are you will get whine from another card too; there simply isn't another manufacturer that uses better coils, except maybe the KingPin (and those are optimized for efficiency rather than low noise).
There is something in your AC mains power delivery, your PSU, or your motherboard's power draw that is creating electrical resonance (or noise) which happens to coincide with an audible frequency.

If you want the most reliable way to deal with coil whine, you have to open up your Strix.
1. If you are feeling brave, apply a decent amount of craft hot glue around each coil, like PowerColor has done on the PCB photo below. This dampens the side-to-side vibration of the coils and makes the whine much less audible.


2. If you are feeling less brave, apply a reasonably thick thermal pad on top of the inductors and make sure the heatsink presses down on it pretty firmly. This applies downward pressure on the coils and will also dampen the vibration.

YMMV with each method, but you will get a definite reduction in noise.


----------



## Luckbad

Zotac 1080 Ti ArcticStorm in the leak test phase.


----------



## reflex75

Quote:


> Originally Posted by *Luckbad*
> 
> Zotac 1080 Ti ArcticStorm in the leak test phase.


Nice








Interesting to compare to the air-cooled Zotac AMP Extreme!


----------



## dunbheagan

Quote:


> Originally Posted by *reflex75*
> 
> Nice
> 
> 
> 
> 
> 
> 
> 
> 
> Interesting to compare to the air Zotac Amp Extreme !


I am very interested too in your OC results including temps etc of this card! Keep us updated.


----------



## mouacyk

Quote:


> Originally Posted by *Luckbad*
> 
> Zotac 1080 Ti ArcticStorm in the leak test phase.


Do share. Thanks.


----------



## Eco28

Quote:


> Originally Posted by *Aganor*
> 
> How do you measure room temp?
> I know the system temp is 28ºc, i do not have a termometer on my room lol


Nothing special. I have a cheap room thermometer; you can get one from Amazon or any home store, and any will do. It isn't perfect, but at least you get a rough idea of whether you are closer to the usual 22 or a warm 28ºC. I have a 2-monitor setup and a 55" TV above them. All 3 displays plus the rig working for hours gets my room to 28ºC easily, and that really impacts GPU/CPU temps, forcing me to turn up the fans on the radiators to keep things cool.


----------



## Benny89

When you water cool, is push/pull on a rad better than just push? Has anyone actually seen a difference in a water loop?


----------



## jura11

Quote:


> Originally Posted by *Benny89*
> 
> When you water cool- is push pull on rad better than just push? Anyone saw actually difference with water loop?


Hi there

Really depends on the radiator used and its thickness.

On a 30mm radiator I wouldn't do push/pull, but on a 60mm radiator I would.

I'm running push/pull on my XE360 (360mm, 60mm thick) and on my Mayhems Havoc 240mm (60mm thick), but not on the XSPC 120mm.




Hope this helps

Thanks,Jura


----------



## Benny89

Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> Really depends on radiator used and thickness of radiator
> 
> On 30mm radiator I wouldn't do push pull, but on 60mm radiator I would do push pull configuration
> 
> I'm running on my XE360 360mm 60mm push pull and Mayhems Havoc 240nm 60mm thick radiator I'm running push pull configuration and on XSPC 120mm I'm not running push pull configuration
> 
> 
> 
> 
> Hope this helps
> 
> Thanks,Jura


Thanks

The rad would be 38mm or 54mm thick. Would running push/pull be good there? Front-mounted, 240mm.


----------



## Coopiklaani

Quote:


> Originally Posted by *Benny89*
> 
> When you water cool- is push pull on rad better than just push? Anyone saw actually difference with water loop?


Push pull will always give you better temps, but at the cost of more noise since you are running twice the number of fans.


----------



## TheWizardMan

Quote:


> Originally Posted by *Benny89*
> 
> Thanks
> 
> Rad would be 38mm or 54mm. Would running push-pull be good there? Front mounter, 240 mm.


Push/pull will help on any thickness of rad. If you do some Google searches you will find results that show the difference. Push/pull is especially helpful for thicker rads and rads with higher fin densities. It also helps people looking to run a quieter system, because the fans can run at a slower speed. At 38mm-60mm it would definitely help.


----------



## TheWizardMan

Quote:


> Originally Posted by *Coopiklaani*
> 
> Push pull will always give you better temps, but at the cost of more noise since you are running twice the number of fans.


This isn't necessarily true at all. You can run a push/pull set up at lower rpms than a push or pull setup, which will reduce noise.
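A toy model makes this concrete. Assuming a linear fan pressure-flow curve, a quadratic radiator resistance, series fans adding pressure, and the usual fan affinity laws (all simplifications; the numbers are made up), a push/pull pair at 80% rpm can match a single fan at 100%:

```python
import math

# Toy model for push/pull vs a single fan on a restrictive radiator.
# Assumptions (all hypothetical): fan curve is linear,
# P = Pmax*(1 - Q/Qmax); rad resistance is quadratic, P = k*Q^2;
# series fans add pressure; Qmax scales with rpm, Pmax with rpm^2.

def airflow(p_max, q_max, k=0.002):
    """Flow (CFM) where the fan curve meets the rad resistance:
    k*Q^2 + (p_max/q_max)*Q - p_max = 0, positive root."""
    b = p_max / q_max
    return (-b + math.sqrt(b * b + 4 * k * p_max)) / (2 * k)

P_MAX, Q_MAX = 2.0, 60.0               # one fan at 100% rpm

single_100 = airflow(P_MAX, Q_MAX)

r = 0.8                                 # push/pull pair at 80% rpm
pp_80 = airflow(2 * P_MAX * r**2, Q_MAX * r)

print(f"single @100%:   {single_100:.1f} CFM")
print(f"push/pull @80%: {pp_80:.1f} CFM")
```

In this toy model the pair at 80% rpm pushes slightly more air through the rad than the single fan flat out, while running two fans well below their noisy top speed.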


----------



## jura11

Quote:


> Originally Posted by *Benny89*
> 
> Thanks
> 
> Rad would be 38mm or 54mm. Would running push-pull be good there? Front mounter, 240 mm.


As above, it depends on fin density (FPI); if the radiator has a higher FPI, something like 16 FPI, then I would suggest going the push/pull route.

Get good fans like the Phanteks PH-F120MP and you should be OK there.

Hope this helps

Thanks, Jura


----------



## tomxlr8

Can I use the XOC custom bios to overclock MSI GeForce GTX 1080 Ti Seahawk EK X 11GB ?
Do I need to use a custom bios to do a great OC on it?


----------



## Dasboogieman

Quote:


> Originally Posted by *tomxlr8*
> 
> Can I use the XOC custom bios to overclock MSI GeForce GTX 1080 Ti Seahawk EK X 11GB ?
> Do I need to use a custom bios to do a great OC on it?


The cold temperatures of watercooling alone should allow your card to OC close to 2100MHz. The XOC BIOS will give you very little, because going above 2100MHz is hugely luck-dependent. Many cards glitch out at 2126MHz regardless of volts unless you go subzero with the cooling.


----------



## TheBoom

Quote:


> Originally Posted by *Dasboogieman*
> 
> The cold temperatures of watercooling alone should allow your card to OC close to 2100mhz. The XOC will give you very little because to go above 2100mhz is hugely luck dependent. Many cards glitch out at 2126mhz regardless of volts unless you go subzero with the cooling.


Unless the card is being temperature limited. Mine can't go above 1143mV before crashing due to temps in other areas of the card. I think full-block watercooling will help quite a bit with that.


----------



## KraxKill

Quote:


> Originally Posted by *TheWizardMan*
> 
> This isn't necessarily true at all. You can run a push/pull set up at lower rpms than a push or pull setup, which will reduce noise.


I agree. When I ran an NZXT X41 in my setup, I went from a single loud NZXT fan to a push/pull Noctua setup. It lowered my temps and noise at the same time. The Noctuas individually push less air, but in push/pull they move more than the single fan did, while remaining quiet.


----------



## Luckbad

Quick update on the Zotac 1080 Ti ArcticStorm buildout.

No leaks! Excellent news.









Also great news is that 1 hour of Heaven on loop yields temps of 37 °C max. This block is outstanding.

Info on the watercooling setup:

GPU/Waterblock: Zotac 1080 Ti ArcticStorm
Pump: EK-XRES 140 Revo D5 PWM (incl. pump)
Radiator: Hardware Labs Black Ice 420mm SR2 Multiport
Fans: 3x Cougar CFD140
Fittings: Bitspower G1/4" Black Matte Compression Fitting CC3 Ultimate for ID 3/8" OD 5/8"
Tubing: EK-Tube ZMT 3/8in. ID X 5/8in. OD - Matte Black
Mount: EK-UNI Pump Bracket (140mm Fan) Vertical
Coolant: Mayhems XT-1 Nuke Coolant Concentrate - Clear - 100ml
The radiator had virtually nothing to clean out of it, which is a massive change from the last time I watercooled with an open loop (10 years ago). Super high quality rad and complete overkill, which is the way I like it.

The pump is running silently at a constant 45% speed. Fans are silent at 40% and never go above 50% with my current settings (still essentially silent unless your room is dead quiet).

I do NOT have my CPU hooked up at the same time. I couldn't afford the second radiator at the moment, so I opted to keep my NZXT Kraken going on the CPU until the next time I upgrade.

No time to properly push the card before I head to work, but it seemed initially stable at 2100MHz locked to 1.093.

EDIT: Couldn't resist. Did a quick offset-based run with the power limit maxed: 2062MHz core, 12000MHz memory, rock solid in SuPo, scoring 10300 with none of the usual benching optimizations.


----------



## TheBoom

Quote:


> Originally Posted by *Luckbad*
> 
> Quick update on the Zotac 1080 Ti ArcticStorm buildout.
> 
> No leaks! Excellent news.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also great news is that 1 hour of Heaven on loop yields temps of 37 °C max. This block is outstanding.
> 
> Info on the watercooling setup:
> 
> GPU/Waterblock: Zotac 1080 Ti ArcticStorm
> Pump: EK-XRES 140 Revo D5 PWM (incl. pump)
> Radiator: Hardware Labs Black Ice 420mm SR2 Multiport
> Fans: 3x Cougar CFD140
> Fittings: Bitspower G1/4" Black Matte Compression Fitting CC3 Ultimate for ID 3/8" OD 5/8"
> Tubing: EK-Tube ZMT 3/8in. ID X 5/8in. OD - Matte Black
> Mount: EK-UNI Pump Bracket (140mm Fan) Vertical
> Coolant: Mayhems XT-1 Nuke Coolant Concentrate - Clear - 100ml
> The radiator had virtually nothing to clean out of it, which is a massive change from the last time I watercooled with an open loop (10 years ago). Super high quality rad and complete overkill, which is the way I like it.
> 
> The pump is running silently at a constant 45% speed. Fans are silent at 40% and never go above 50% with my current settings (still essentially silent unless your room is dead quiet).
> 
> I do NOT have my CPU hooked up at the same time. I couldn't afford the second radiator at the moment, so I opted to keep my NZXT Kraken going on the CPU until the next time I upgrade.
> 
> Not time to properly push the card before I head to work but it seemed initially stable at 2100MHz locked to 1.093.
> 
> EDIT: Couldn't resist. Did a quick offset-based run with the power limit maxed. 2062MHz core, 12000MHz memory, rock solid in SuPo, 10300 with none of the usual benching optimizations.


Nice. But isn't a 420 rad enough for CPU cooling as well?


----------



## Luckbad

Quote:


> Originally Posted by *TheBoom*
> 
> Nice. But isn't a 420 rad enough for CPU cooling as well?


Yeah, technically, but I prefer overkill and my NZXT Kraken works very well by itself on the CPU. I'd see higher temps on the GPU for no improvement on the CPU just for the sake of getting it in the loop.


----------



## Dasboogieman

Quote:


> Originally Posted by *TheBoom*
> 
> Nice. But isn't a 420 rad enough for CPU cooling as well?


Depends on the delta T desired. It seems small-die chips like the 7700K need ridiculous rad capacity to get below 60 even delidded. My loop only saw a 5 degree drop (75 -> 70) going from 360 to 600. As such, I don't think there's such a thing as overkill. Once I get the new case, I'm expecting another 5 degree drop going from 600 to 1200.

Either that, or I need a waterblock with really, really dense fins suitable for small dies.
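That diminishing return is what a simple thermal-resistance model predicts once the block-side resistance dominates. Everything below is an illustrative fit (assumed ~100 W package power and made-up resistance constants), not measured data:

```python
# Toy thermal-resistance model for the diminishing-returns
# observation above. All constants are illustrative assumptions:
# ~100 W package power, fixed block+TIM+die resistance R_BLOCK,
# and radiator resistance inversely proportional to rad length.

P_WATTS = 100.0
R_BLOCK = 0.325      # K/W: die + TIM + block (the dominant term)
C_RAD = 45.0         # K*mm/W: rad resistance = C_RAD / length

def delta_t(rad_mm):
    """CPU temperature rise over ambient for a given rad length."""
    return P_WATTS * (R_BLOCK + C_RAD / rad_mm)

for mm in (360, 600, 1200):
    print(f"{mm:>5} mm rad: +{delta_t(mm):.1f} C over ambient")
```

With these numbers, 360 -> 600 mm buys 5 C but 600 -> 1200 mm buys less than 4 C: once the fixed block resistance dominates, doubling the radiator barely moves the die temperature, which is exactly the argument for a denser-finned block on small dies.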


----------



## kolkoo

Quote:


> Originally Posted by *Luckbad*
> 
> Yeah, technically, but I prefer overkill and my NZXT Kraken works very well by itself on the CPU. I'd see higher temps on the GPU for no improvement on the CPU just for the sake of getting it in the loop.


What are your ambient temps?


----------



## KenjiS

Currently tinkering with my Strix since I felt up to it today. I can't seem to replicate my initial score; my CPU score is a bit inconsistent atm.

Initial run, 8996 (10,219 Graphics, 5362 CPU)
http://www.3dmark.com/spy/1842052

Latest run, 8965 (10,282 Graphics, 5196 CPU)
http://www.3dmark.com/3dm/20261779?

Not only that, the clock speed reporting is majorly busted in the new runs... I'm not sure exactly what I'm running at right now.

The initial run was using the Asus GPU Tweak software; the new runs are me trying to replicate its settings in MSI Afterburner...


----------



## Luckbad

Quote:


> Originally Posted by *kolkoo*
> 
> What are your ambient temps?


23-24C


----------



## KenjiS

http://www.3dmark.com/3dm/20262251?

About as good as it will get.

Noticed my boost clock is about 2012-2025, mem is 11,088, stock voltage.

Pushed it to 2038 core and it crashed on the second graphics test.

Under 70 degrees under load and about 30dB at my seating position, 36 if I shove my meter RIGHT against the case.

I think I'll take the quiet/cool; it runs several degrees cooler than my 980 Ti and is significantly quieter...


----------



## jprovido

Just got my 1080 Ti Strix. Lost the silicon lottery: the maximum stable clock I can get is 2012MHz.


----------



## RavageTheEarth

Quote:


> Originally Posted by *jprovido*
> 
> 
> 
> just got my 1080 Ti Strix. lost with the silicon lottery. maximum stable I can get it is 2012mhz.


Crank those fans! These cards need to be as cool as possible.


----------



## the9quad

The thing I like about this card is I don't have to crank the fans to get decent gaming performance. I would rather it get a little warm, be 2-5 fps slower, and enjoy the silence. I still can't believe how silent this card is, very unexpected to be honest, and I'm glad I cancelled the FTW3 order.


----------



## Benny89

What tube size is best for water cooling (soft tubing)? Does it matter for flow and temps generally?


----------



## Dasboogieman

Quote:


> Originally Posted by *Benny89*
> 
> What tube size is the best for water cooling? Soft tube size? Does it matter for flow and temps and generally?


Yes, it does matter, but not by a measurable amount unless you are running many metres of the stuff. I personally really liked the 11mm internal diameter EK EPDM tubing. Nice, thick, secure fit over 12mm barbs, and it's quite flexible.


----------



## jprovido

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Crank those fans! These cards need to be as cool as possible.


It overclocked a little better when temps were down, but it's too loud for my liking. I'm gonna stick with the 2020MHz overclock.


----------



## dallas1990

Quote:


> Originally Posted by *Benny89*
> 
> What tube size is the best for water cooling? Soft tube size? Does it matter for flow and temps and generally?


I use 3/8" ID, 5/8" OD PrimoChill tubing. It works great and you can get some tight bends without it kinking.


----------



## jim2point0

Quote:


> Originally Posted by *jprovido*
> 
> just got my 1080 Ti Strix. lost with the silicon lottery. maximum stable I can get it is 2012mhz.


My max is 1978mhz. I don't wanna hear it


----------



## Luckbad

Prior to ordering the Zotac 1080 Ti ArcticStorm, I'd bought a 980 Ti K|NGP|N from someone with the intent to watercool it and just hang tight until the next generation. Got the ArcticStorm and it's a beast, so I don't need the 980 Ti. He didn't know what the ASIC of the 980 Ti was, which I assumed was code for 72% since that's the lowest they sold them at... but this is over 80% ASIC!

Frick. I installed it for a bit to give it a quick check, and it seems at least as good as most 1080s. Two 8-pins and a 6-pin for power give it bonkers potential. Beautiful card, too. Originally cost $1050 and is super rare.

Now I have an $800 card that kinda kicks it in the junk and I don't really know what to do with the K|NGP|N.


----------



## jprovido

Quote:


> Originally Posted by *jim2point0*
> 
> My max is 1978mhz. I don't wanna hear it


thank you for making me feel a lot better lol


----------



## bmgjet

So I got sick of waiting for Vega; I've had my tax return for a month now and needed to spend it, so I got a 1080 Ti.
https://www.techpowerup.com/gpuz/dczwm

I've only been able to get it to 2GHz on the core, and then it basically sits at the 120% power limit the whole time even though temps are in the mid-50s.
Looking around, the shunt mod seems to be the only way to increase it.

What's currently the best way to do this mod?
Conductive paste, or are there stick-on resistors out for these cards yet, like for the 9XX series?
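For what it's worth, the trick behind the shunt mod is just Ohm's law: lowering the effective resistance of the current-sense shunt makes the controller under-read power. A quick sketch with assumed values (the 5mΩ stock figure is a typical guess, not confirmed for this card):

```python
# Shunt-mod arithmetic: the card senses current via the voltage drop across
# a tiny shunt resistor. Lower the effective resistance (e.g. by stacking a
# second resistor in parallel) and the sensed power shrinks proportionally.

def parallel(r1, r2):
    """Combined resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def reported_power(actual_watts, r_stock, r_effective):
    """Sensed drop scales with resistance, so reported power does too."""
    return actual_watts * (r_effective / r_stock)

r_stock = 0.005                    # assumed 5 mOhm stock shunt
r_mod = parallel(r_stock, 0.005)   # stack an identical resistor on top
print(reported_power(300, r_stock, r_mod))  # a real 300 W now reads as roughly half
```

So a real 300W draw reports as about 150W and the card thinks it has twice the headroom. Obviously at your own risk, and the resistor values here are illustrative only.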


----------



## DADDYDC650

I'm in boys!

Got my EVGA 1080 Ti SC Black Edition yesterday. Have it running at +70 (2Ghz+) Core and +500 memory. No reason to push it further so I won't bother. Happy camper so far!


----------



## Spirti

Hello everyone. I've read a lot about freezes during video playback on Pascal, but I have a slightly different problem.

It randomly hangs completely and reboots when I watch videos on YouTube. It happens only once a day (after the reboot everything is fine). If I don't use YouTube on the first run, everything is fine. All stress tests, games, etc. can run for 24 hours flawlessly.

Of course I tried all the usual stuff with hardware acceleration, reinstalled the OS (Win 8.1 and Win 10), latest and previous drivers, etc.
Any help appreciated, thanks.


----------



## bmgjet

Quote:


> Originally Posted by *DADDYDC650*
> 
> I'm in boys!
> 
> Got my EVGA 1080 Ti SC Black Edition yesterday. Have it running at +70 (2Ghz+) Core and +500 memory. No reason to push it further so I won't bother. Happy camper so far!


Same core as mine. My memory only did +350 though, then benchmark scores started going down.


----------



## DADDYDC650

Quote:


> Originally Posted by *bmgjet*
> 
> Same core as mine, My memory only did +350 tho then benchmark scores started going down.


I'll have to bench the memory then. My core could go higher, but I like round numbers and 2GHz sounds great.


----------



## bmgjet

Quote:


> Originally Posted by *DADDYDC650*
> 
> I'll have to bench the memory then. My core could go higher but I like round numbers and 2Ghz sounds great.


Mine does 2012MHz but then sits at the power limit, so I stuck to +70, which is 2GHz.
I tried a BIOS mod, but it won't go over the 119% power limit, so I'll be doing the shunt mod.


----------



## itssladenlol

Just got an EVGA 1080 Ti SC2 Hybrid, put two Gentle Typhoons on it, and that baby is running at [email protected],000.
6hrs into Heaven, max temp is like 46 degrees.
Noise-wise it's like my PC isn't turned on at all; I can't hear anything.


----------



## feznz

Quote:


> Originally Posted by *Spirti*
> 
> Hello everyone, I read a lot about freezes during video playback on Pascal, but I have a slightly different problem.
> 
> It randomly hangs completely and reboots when I watch videos on YouTube. It happens only once a day (after reboot everything is fine). If i don't use YouTube on the first run - everything is fine. All stress tests, games, etc. can run for 24 hours flawlessly.
> 
> Of course I tried all that stuff with hardware acceleration, reinstalled OS (win 8.1 and win 10), latest and previous drivers, etc.
> Any help appreciated, thanks.


I am having similar problems: BSODs during web browsing, yet super stable during benches and gaming. I decided I will get an M.2 drive tomorrow and put on a genuine copy of Windows (cost $14 USD).
I had a look at the error codes and solutions; it seems I have corrupted the OS with OC benches. I think my CPU is a little degraded too, as it takes a wee bit more voltage to stay stable @ 5GHz, and the OS was installed 4-5 years ago.
I was wondering if it is a KMS-related problem... so I'm going to do a super clean install and only use the MS-approved drivers. I will know in a day if the BSODs are gone.

Error code: nvlddmkm driver stopped responding


----------



## Benny89

Any difference (apart from color) between EK-CryoFuel Blood Red Premix and Mayhems Pre-Mix X1 Blood Red? Is one better than the other?

Also, I have discovered that Witcher 3 just does not like voltage bins above 1.05v (I have tested 5 1080 Tis in it). Clocks don't matter that much for it, but running above 1.06v will increase your chance of crashing in this game, especially when breaking 61-62C. If you are on water you will be fine on higher bins because of temps.

On air, however, running at 1.05v or 1.04v seems to give you stability regardless of your temperature.

I don't know why Witcher 3 reacts so differently from EVERY other game. I've tested my OC in almost 15 different AAA games and none had any problems with 1.06v and above. Only Witcher 3.

But well, it is fun testing it too


----------



## Spirti

Quote:


> Originally Posted by *feznz*
> 
> I am having similar problems BSOD during web browsing yet super stable during benches and gaming. I decided that I will get an M.2 drive tomorrow and put a genuine copy of windows cost $14 USD
> had a look at the error codes and solutions seems I have corrupted the OS with OCing benches I think my CPU is a little degraded too it takes a wee bit more voltage to keep stable @ 5Ghz OS was installed 4-5 years ago.
> I was wondering if it is a KMS related problem......so going to do a super clean install and only use the MS approved drivers I will know in a day if the BSOD are gone.
> 
> error code; nvlddmkm driver stopped reponding


Both my Windows installs are legit, no KMS or anything, so my problem is not the OS for sure. I had 980s in SLI before and everything was fine; it started with the 1080 Ti. Overclocking is not the problem either, since my 1080 Ti is stock and the behavior is the same with the CPU overclocked or stock. The reboot only happens once, after the first boot and YouTube, which is what annoys me the most. I'll have to live with it, I think.


----------



## Coopiklaani

Quote:


> Originally Posted by *Benny89*
> 
> Any difference (apart from color) between EK-CryoFuel Blood Red Premix and Mayhems Pre-Mix X1 Blood Red ? Is one better than the other?
> 
> Also I have discovered that Witcher 3 just do not like bins (I have tested 5 1080Tis there) above 1.05v. Clocks does not matter that much for it. But running clocks above 1.06v will increase your chance to crash in this game espeially when breaking 61-62C. If you are on water you will be fine on higher bins because of tems.
> 
> However on air, running clock on 1.05 or 1.04 seems to give you stability regarding of your temperature.
> 
> I don't know why Witcher 3 reacts so different than EVERY other game. Tested OC in almost 15 different AAA games and none had any problems with clocks 1.06v and above. Only Witcher 3.
> 
> But well, it is fun testing it too


My go-to stability test for the GTX 1080 Ti is the FSU stress test, or FSU scene 2 looping for over 2 hours. It is so far the best indicator of instability.


----------



## ViRuS2k

Running both my Gigabyte FE 1080 Tis at 2050/12GHz with no issues at all.

Though I have found the best test for stability is running PlayerUnknown's Battlegrounds and keeping the game on the main menu. It will use 99% GPU on the menu, and the game will crash to desktop if you're unstable; it likes to find the very tiniest weaknesses in any GPU.
Once you have your overclocks where you want them, if you pass the PUBG menu test for 1 hour without an issue or crash, consider yourself 1000% stable. It's like Witcher 3 for finding unstable GPUs, but this game is much better and quicker at finding the failure.


----------



## Aganor

1h of Witcher 3 with a new profile of 40% pump and fan speed, and the GPU reached 52°C, dropping to the next throttle bin...
In an attempt to lower the noise, I got worse temps. I'll try decreasing fan speeds while keeping the pump at 60% to check for improvements.

I'll also replace the 2x 140mm Corsair push fans on the front of the case with 2x Noctua NF-F12 120mm fans, and add 3x slim-profile fans as pull on the top of the case.

52°C watercooled is so freaking high: delta T is +14°C over idle after gaming, and +24°C overall from idle temps...


----------



## KraxKill

I see a lot of senseless BIOS flashing. Many hoping different means better. I'm pretty sure that these cards are hardware TDP limited in terms of performance regardless of the BIOS you flash.

A different BIOS will help lift the 'software' controlled TDP ceiling only. Shunt mods will do a bit better in removing all of it and you don't have to hop off your board specific BIOS. This is why the FE scores better with jacked shunts than it does on the XOC.

I did some more testing with the XOC and my scores were slightly lower again vs. the generic FE BIOS on my NVIDIA Founders Ti. About a 50pt difference in Superposition 4K.

I also found that there is a sweet spot for these cards in terms of mem and core clocks if you stay just under the hard coded TDP limit.

Make no mistake, the XOC BIOS does not remove the hardware TDP restriction and still falls victim to power throttling. Don't let GPU-Z fool you: it shows a nice clean graph, but the card is actually power throttling in the background, potentially sooner than stock depending on the volts you choose to feed it. Additional voltage is simply bled off as heat and lost potential.

Eventually, if your card can run high enough (2100+), you will run into a situation where increasing volts and clock no longer improves bench performance. The card will just pull power and run flat when it runs out of juice and performance will actually decrease.

Basically if your card can boost to around 2138 you'll find that regardless of the BIOS you use, performance will flatline there without supplemental voltage via hard modding the card. You can run a higher core clock and lower mem or higher mem and lower core the score will not improve.

The GPU will only draw the amount of power available to it, so best performance comes from using all available TDP without going over. Increasing volts unnecessarily will just eat more and more into your TDP.

For most, I suspect the fastest performance is between 2100-2138 on the core at the lowest possible volts and the Gddr5x clocked to a point where it just taps the remaining power left.

One may be able to run 2100 +600ram and score the same or better than 2138 + 500.

You can play with different clocks all you want, but ultimately the combination of volts, core + mem clock that utilizes all available TDP will score the best.
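To put rough numbers on the "extra volts eat your TDP" point: dynamic power scales roughly with f·V², so a small voltage bump costs disproportionate watts. A sketch with made-up baseline figures (230W at 1923MHz/1.0v is an assumption for illustration, not a measurement):

```python
# First-order CMOS scaling: dynamic power ~ frequency * voltage^2.
# Baseline numbers below are assumptions for illustration only.

def est_power(freq_mhz, volts, base_freq=1923, base_volts=1.0, base_watts=230):
    return base_watts * (freq_mhz / base_freq) * (volts / base_volts) ** 2

TDP = 300  # hard power cap in watts (assumed)

for freq, volts in ((2100, 1.050), (2138, 1.093)):
    power = est_power(freq, volts)
    verdict = "throttles" if power > TDP else "fits"
    print(f"{freq} MHz @ {volts:.3f} V -> {power:.0f} W ({verdict})")
```

With these numbers the 2100MHz/1.050v combo stays under the cap while 2138MHz/1.093v blows through it, which is exactly why the lowest stable voltage at your target clock wins.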


----------



## TheBoom

Quote:


> Originally Posted by *Coopiklaani*
> 
> My go to stability test for gtx 1080ti is FSU stress test or FSU scene 2 looping over 2 hours. It is so far the best indicator for instability.


Exactly what I was saying. FSU seems to be by far the best stress test period.
Quote:


> Originally Posted by *Spirti*
> 
> My both Windows are legit with no kms or other. So my problem is not OS for sure, I had 980s in SLI before and everything was fine. Started with 1080ti. Overclocking is not the problem as well since I have my 1080 ti stock and cpu overclocked/stock behavior is the same. The reboot is only happening once after the 1st boot and youtube which pisses me the most. Will have to live with it i think)


I've been seeing this pop up a lot recently. A friend of mine has had the same issue for a while on his R9290X. I think its something to do with mobo and/or ethernet drivers.

He tried swapping out to a 970 and still had the same issue so it can't be GPU related most probably.

If I were you I would try updating all chipset drivers first and if that doesn't work try to get your hands on another mobo to test.


----------



## Aganor

Quote:


> Originally Posted by *KraxKill*
> 
> I see a lot of senseless BIOS flashing. Many hoping different means better. I'm pretty sure that these cards are hardware TDP limited in terms of performance regardless of the BIOS you flash.
> 
> A different BIOS will help lift the 'software' controlled TDP ceiling only. Shunt mods will do a bit better in removing all of it and you don't have to hop off your board specific BIOS. This is why the FE scores better with jacked shunts than it does on the XOC.
> 
> I did some more testing with the XOC and my scores were slightly lower again vs the generic FE bios on my Nvidia Founders Ti. About a 50pt difference in Superposition4K
> 
> I also found that there is a sweet spot for these cards in terms of mem and core clocks if you stay just under the hard coded TDP limit.
> 
> Make no mistake, the XOC Bios does not remove the hardware TDP restriction and falls victim to pwr. Don't let GPUZ fool you. It's a nice clean graph in GPU-Z, but the card is actually power throttling in the background and potentially sooner than stock depending on the volts you choose to feed it. Additional voltage is simply bled off as heat and lost potential.
> 
> Eventually, if your card can run high enough (2100+), you will run into a situation where increasing volts and clock no longer improves bench performance. The card will just pull power and run flat when it runs out of juice and performance will actually decrease.
> 
> Basically if your card can boost to around 2138 you'll find that regardless of the BIOS you use, performance will flatline there without supplemental voltage via hard modding the card. You can run a higher core clock and lower mem or higher mem and lower core the score will not improve.
> 
> The gpu will only draw x amount of power available to it so best performance is to use all TDP available without going over. Increasing volts unecesseeily will just eat more and more into your TDP.
> 
> For most, I suspect the fastest performance is between 2100-2138 on the core at the lowest possible volts and the Gddr5x clocked to a point where it just taps the remaining power left.
> 
> One may be able to run 2100 +600ram and score the same or better than 2138 + 500.
> 
> You can play with different clocks all you want, but ultimately the combination of volts, core + mem clock that utilizes all available TDP will score the best.


So if I only move the voltage slider to max, giving 1.093v, I shouldn't provoke TDP throttling, right?
My card at 1.093v can do a max of 2050MHz, and at stock voltage (1.062v) a max of 2025MHz.


----------



## Spirti

Quote:


> Originally Posted by *TheBoom*
> 
> Exactly what I was saying. FSU seems to be by far the best stress test period.
> I've been seeing this pop up a lot recently. A friend of mine has had the same issue for a while on his R9290X. I think its something to do with mobo and/or ethernet drivers.
> 
> He tried swapping out to a 970 and still had the same issue so it can't be GPU related most probably.
> 
> If I were you I would try updating all chipset drivers first and if that doesn't work try to get your hands on another mobo to test.


Like I said - everything was perfect on the same motherboard with 980 sli. It started right after I upgraded to 1080ti.
Thanks anyway.


----------



## KraxKill

Quote:


> Originally Posted by *Aganor*
> 
> So if i only move the voltage slider to max, giving +1.093v, i shoundlt provoke the TDP throttling right?
> My card with 1.093v can go max 2050mhz and with the stock voltage (1.062v) can to max 2025mhz.


Not quite. Depending on the game, benchmark, and the total load on the card, it may very well be TDP throttling at those volts. A few TDP power events are perfectly fine and will still score a bit higher than a completely blank PWR chart, since higher-binned frequencies can still be selected while the TDP load is lowest. I'd say stick around 2 bins higher than your max non-throttling clock.

If you care most about gaming performance, the best test to run is without a doubt Superposition 4K for performance testing and Heaven for stability testing. TimeSpy, Firestrike, and Valley tend to be easier to pass, with Valley being the easiest. All will draw differing amounts of power.

I found Superposition 4K to be the most realistic in simulating a typical "gaming" workload. It also marginalizes the impact of the CPU compared to other benches and allows for more repeatable scoring, and thus easier comparison testing.

At 1.093 you are most likely throttling in quite a few places. Only you can tell by logging with GPU-Z, but I would not worry about sporadic power events. What you want is the highest average clock and the longest time spent at the higher clocks. If you log GPU-Z at the lowest polling interval and graph that data, or use FRAPS and the FRAPS data viewer, you can determine which combination of voltage/clock gives you the highest "area under the curve". THAT is your best overall performance setting for "gaming".

And if you're like most and play one game at a time, you can theoretically have different settings for each game. A game drawing less power may allow for better performance at higher clocks, while a game that is TDP-heavy may benefit from a lower clock and voltage to increase time spent at the top bin, even where that top bin is lower.
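If anyone wants to automate the "area under the curve" comparison, here's a minimal sketch. It assumes you've used GPU-Z's log-to-file option and that the clock column is named as below; check your own log's header, since the exact column name may differ:

```python
import csv
from statistics import mean

def average_clock(log_path, column="GPU Clock [MHz]"):
    """Mean logged core clock: the area under the clock/time curve divided
    by duration, assuming evenly spaced samples."""
    with open(log_path, newline="") as f:
        return mean(float(row[column]) for row in csv.DictReader(f))

# Compare two logged profiles; the higher average clock is the better
# overall setting, even if its peak bin is lower:
# print(average_clock("run_1050mv.csv"), average_clock("run_1093mv.csv"))
```

The filenames in the comment are placeholders; the point is just that the profile with the higher mean wins, per the area-under-the-curve argument.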


----------



## IMI4tth3w

Quote:


> Originally Posted by *KraxKill*
> 
> I see a lot of senseless BIOS flashing. Many hoping different means better. I'm pretty sure that these cards are hardware TDP limited in terms of performance regardless of the BIOS you flash.
> 
> A different BIOS will help lift the 'software' controlled TDP ceiling only. Shunt mods will do a bit better in removing all of it and you don't have to hop off your board specific BIOS. This is why the FE scores better with jacked shunts than it does on the XOC.
> 
> I did some more testing with the XOC and my scores were slightly lower again vs the generic FE bios on my Nvidia Founders Ti. About a 50pt difference in Superposition4K
> 
> I also found that there is a sweet spot for these cards in terms of mem and core clocks if you stay just under the hard coded TDP limit.
> 
> Make no mistake, the XOC Bios does not remove the hardware TDP restriction and falls victim to pwr. Don't let GPUZ fool you. It's a nice clean graph in GPU-Z, but the card is actually power throttling in the background and potentially sooner than stock depending on the volts you choose to feed it. Additional voltage is simply bled off as heat and lost potential.
> 
> Eventually, if your card can run high enough (2100+), you will run into a situation where increasing volts and clock no longer improves bench performance. The card will just pull power and run flat when it runs out of juice and performance will actually decrease.
> 
> Basically if your card can boost to around 2138 you'll find that regardless of the BIOS you use, performance will flatline there without supplemental voltage via hard modding the card. You can run a higher core clock and lower mem or higher mem and lower core the score will not improve.
> 
> The gpu will only draw x amount of power available to it so best performance is to use all TDP available without going over. Increasing volts unecesseeily will just eat more and more into your TDP.
> 
> For most, I suspect the fastest performance is between 2100-2138 on the core at the lowest possible volts and the Gddr5x clocked to a point where it just taps the remaining power left.
> 
> One may be able to run 2100 +600ram and score the same or better than 2138 + 500.
> 
> You can play with different clocks all you want, but ultimately the combination of volts, core + mem clock that utilizes all available TDP will score the best.


I have probably spent 10 hours overclocking and benching with this thing, and this is pretty much spot on.

Also, what I have found is that adjusting the voltage curve in different ways seems to play with the built-in hardware TDP restriction (or maybe something else is happening, but I am definitely able to get higher clock speeds and gain performance if I adjust the voltage curve one way vs. another).

This was my experience with a 1080 Ti Strix + XOC BIOS. The XOC did allow me to get much higher overclocks by getting rid of the 330W ceiling in the Strix's original BIOS.

My best 100% stable clock was 2113 @ 1.063, with 2126/1.075 and 2136/1.081 being stable one run and not the next in FSU/Time Spy. But the score difference between the three was very, very small, so I just didn't bother and stuck with 2113.

It seems like anything over 1.063v and something in the GPU chip itself starts to get unhappy...


----------



## nrpeyton

*Finally ordered a new EK water-block for my 1080 Ti FE!*

*It was on clearance at Overclockers, so I only paid 79.99 for it brand new!*

It was also their last one (1 in stock).

If I'd got the 1080 Ti block it would have cost about 100+.


----------



## KraxKill

Quote:


> Originally Posted by *IMI4tth3w*
> 
> I have probably spent 10 hours of overclocking and benching with this thing, i this is pretty much spot on.
> 
> Also what i have found is that adjusting the voltage curve in different ways seems to play with the built in hardware TDP restriction. (or maybe something else is happening. but i am definitely able to get higher clock speeds and gain performance if i adjust the voltage curve one way vs another)
> 
> This was my experience with a 1080ti strix + xoc bios. The xoc did allow me to get much higher overclocks by getting rid of the 330W ceiling in the original bios of the 1080ti strix
> My best 100% stable clock was 2113 @ 1.063 with 2126/1.075 and 2136/1.081 being stable one run and not stable the next in FSU/Time Spy. But the score difference was very very small between the 3 so i just didn't bother and stuck with 2113.
> 
> It seems like anything over 1.063 and something in the GPU chip itself is starting to not get happy..


Yeah, the curve is a bit odd at times.

Here is where I see a lot of confusion with "the curve" in this thread: people talking ramp vs. cliff and describing different performance with different curve settings. I eventually solved this for myself by simply finding my desired top bin, and adjusting the bins immediately below it to run at the lowest possible voltage they are stable at.

If you take this approach, the lower bins will not hurt performance when selected by unnecessarily dropping additional bins to TDP, running unneeded voltage where a lower-voltage bin would have sufficed.

So what people often describe as an "unpredictable" curve is actually the lower bins being sub-optimal in terms of TDP usage. On one curve the lower bins may waste more or less power, and thus performance in power-intensive scenarios will vary. If you pin down the lowest voltages for your subordinate bins, you'll see the highest possible performance.

Example:

Your top bin is 2100 @ 1.081v. OK, great, but what about the lower bins?

If you can run 2088 at 1.063, for example, DO IT. There is no reason to waste the power locked into the 1.074 bin there. What you gain is even more "time" being able to absorb TDP before the card drops another bin. That = more area under the curve and more performance.

Continuing this process all the way down the curve will ultimately allow for the most potential, though obviously only the 5 or so bins below your top will really benefit from this attention. Of course, you must now make sure all your lower bins are actually stable at the lower voltages you select, but there is no gain without struggle.
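To see why shaving the subordinate bins matters in watts, here's a quick f·V² sketch. Both curves and the constant are hypothetical numbers, just to show the shape of the saving:

```python
# How much TDP headroom per-bin voltage tuning frees up, using a
# first-order power model (power ~ k * freq * V^2, with k picked
# arbitrarily so the top bin lands near 275 W).

def est_power(freq_mhz, volts, k=0.112):
    return k * freq_mhz * volts ** 2

default_curve = {2100: 1.081, 2088: 1.074, 2076: 1.063}  # hypothetical stock bins
tuned_curve   = {2100: 1.081, 2088: 1.063, 2076: 1.050}  # lowest stable volts found

for clock in default_curve:
    saved = est_power(clock, default_curve[clock]) - est_power(clock, tuned_curve[clock])
    print(f"{clock} MHz: {saved:.1f} W of TDP headroom freed")
```

A few watts per bin doesn't sound like much, but it's exactly the margin that keeps the card from dropping one more bin when a power-heavy scene hits.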


----------



## sperson1

Me just playing around a little with my Strix OC


----------



## RadActiveLobstr

Doing the same as above but with my FTW3 (forgot to take a picture of the results screen so here is the uploaded one)


----------



## lilchronic

These cards throttle from TDP at stock speeds. Timespy will throttle my card down to .963v in certain spots of the bench, so even at .963v this card is hitting its power limit (300W). If people are running more than, say, .95v, I'm positive they will run into power throttling at some point...

For gaming I just run 1v @ 2GHz / 12GHz and I never hit the TDP in the games I play. Well, I only really play PUBG right now.

Here's a capture of my curve.

I started at 1v and +164, then set every voltage bin below that to +164, which gives me the best performance. For me, 3 or 4 bins below was still giving lower numbers until I set all the lower bins to the same +164.


----------



## nrpeyton

Quote:


> Originally Posted by *lilchronic*
> 
> These cards throttle from TDP @ stock speed's. Timespy will throttle my card down to .963v in certain spots of the bench. So even @ .963v this card is hitting it's Power limit (300W). So if people are running more than say .95v im positive they will run into power throttling at some point ....
> 
> For gaming i just run 1v @ 2ghz / 12Ghz and i never hit the TDP in the games i play. well i only really play PUBG right now.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here's a capture of my curve.
> 
> Started at 1v and +164 then set every voltage bin below that to +164 witch gives me the best performance. For me 3 or 4 bins below was still giving lower numbers till i set all lower bins to the same +164.


In Witcher 3 at 4K ultra there are some scenes where I need to be at .89v to stop throttling @ 300W _(although FPS is still over 60, so I can't really complain)_.

And my waterblock has now been ordered, so the power mod is coming up soon too! As soon as the power mod is complete, I'm confident I'll be overkill even for 4K!


----------



## jura11

Quote:


> Originally Posted by *Aganor*
> 
> 1H of witcher 3 with a new profile of 40% Pump and Fan's speed and gpu reached 52ºc, going to the next throtlle...
> In an attempt to lower the noise, i got worse temps. Will try to decrease fan speeds and maintain the pump in 60% to check improvements.
> 
> Will also replase my x2 140mm Corsair Push fans on the front case and use x2 Noctua NF-F12 120mm fans instead and also add x3 slim profile fans as pull on the top case.
> 
> 52ºc WC is so freaking high, Delta T +14ºc than on idle after gaming and Delta T + 24ºc overall idle temps...


That's a bit high there.
How many radiators are you running?
And what case do you have?

I suspect you will at least need better fans to keep the delta lower, or to run the fans at faster speeds.

Have a look at the Thermalright TY-143 SQ or TY-147A SQ, the Phanteks PH-F140MP, or the BeQuiet Silent Wings 3; for 120mm fans I would also get the Phanteks PH-F120MP.

Hope this helps

Thanks, Jura


----------



## feznz

Quote:


> Originally Posted by *Spirti*
> 
> My both Windows are legit with no kms or other. So my problem is not OS for sure, I had 980s in SLI before and everything was fine. Started with 1080ti. Overclocking is not the problem as well since I have my 1080 ti stock and cpu overclocked/stock behavior is the same. The reboot is only happening once after the 1st boot and youtube which pisses me the most. Will have to live with it i think)


Exactly. The problems started as soon as I put the 1080 Ti in... the previous system with GTX 770 SLI was super stable, no problems whatsoever.
But I want to solve it; it's annoying knowing you can crash at any moment while browsing. Good excuse to get an M.2 drive and do a clean install, and I'll *let Windows choose all the Microsoft-approved drivers*.
Interestingly, my friend who is an IT technician for the NZ army told me why they do that on Army communications systems: Microsoft gets plenty of memory-dump reports, so they know which drivers are super stable, and probably only approve 1 in 20-odd NVIDIA WHQL drivers.


----------



## AngryLobster

Man this 2nd FE I got is terrible. It requires 1.031v for 1962mhz and that's borderline. I can't even do 1900mhz at 0.931v.

My original card does 1962mhz with 0.975. This is just bad.


----------



## Benny89

Uff, so I have blocks, radiators, pump, res, fluid, and tubes ordered. All EK.

All that is left is figuring out what fittings, angled adapters, etc. I will need, and how many of each, which has been the hardest part so far....

But almost ready


----------



## DrFreeman35

Alright, so I'm considering my options. Not sure what I want to do yet, so I thought I would come here for some opinions. Looking for some advice, and I know it can be two-sided.

I'm looking to upgrade to a Ti, or so I think. I want to get away from SLI, and have the opportunity to upgrade through step-up with EVGA.

I currently have 2 1080 FTW2, which do very good in what I play and the resolution I play at. (3440x1440) and (2560x1440) currently. I'm looking to upgrade to the 4k 120hz+ when available later this year. (Yes, I know I may have to SLI the Ti to get close to 120+ in some games.... which could be a possibility later) $ is not an issue.

I can step-up to the GTX 1080Ti SC Black Edition from my FTW2's for around $50.

Would this be a good choice? I am not intending to WC it just yet, and have thought about buying a Strix Ti or a FTW3 for Air Cooling purposes, and selling my 2 1080's to make some $ back. Any input welcome, TIA.


----------



## jura11

Quote:


> Originally Posted by *Benny89*
> 
> Uff, so I have the blocks, radiators, pump, res, fluid and tubes ordered. All EK.
> 
> All that's left is figuring out what fittings, angled adapters etc. I'll need, and how many of each, which has been the hardest part so far...
> 
> But almost ready


Hi there

Personally I prefer XSPC, EK or Barrow fittings. Which is best really depends; I've used all of them with no leaks. The only fittings that ever leaked on me were Alphacool. Regarding tubing, I went through a few: PrimoChill LRT, which started yellowing after 2-3 months, then EK DuraClear, which also started yellowing after a few months, and right now I'm using Mayhems UV White.

Size of tubing and fittings really depends. I like 13/10, but I'll be going with 16/10 next, with EK ZMT tubing.

Hope this helps and best of luck









Thanks,Jura


----------



## sperson1

Quote:


> Originally Posted by *DrFreeman35*
> 
> Alright, so I'm considering my options. I'm not sure what I want to do yet, so I thought I'd come here for opinions. Looking for advice, and I know it can go either way.
> 
> I'm looking to upgrade to a Ti, or so I think. I want to get away from SLI, and I have the opportunity to upgrade through EVGA's Step-Up.
> 
> I currently have 2 1080 FTW2s, which do very well in what I play and at the resolutions I play at (3440x1440 and 2560x1440 currently). I'm looking to upgrade to 4K 120Hz+ when available later this year. (Yes, I know I may have to SLI the Ti to get close to 120+ in some games... which could be a possibility later.) $ is not an issue.
> 
> I can step up to the GTX 1080 Ti SC Black Edition from my FTW2s for around $50.
> 
> Would this be a good choice? I'm not intending to watercool it just yet, and I've thought about buying a Strix Ti or a FTW3 for air-cooling purposes and selling my 2 1080s to make some $ back. Any input welcome, TIA.


Get the SC2


----------



## DrFreeman35

Quote:


> Originally Posted by *sperson1*
> 
> Get the SC2


Any reason to pick that over the FTW3? I've also looked at the Strix Ti OC. They are all around the same price range if I don't upgrade. The upgrade path is not to an SC2, well, technically not the "SC2"; the SC Black Edition just doesn't have the 9 thermal sensors. Is that the one you're referring to?


----------



## sperson1

Quote:


> Originally Posted by *DrFreeman35*
> 
> Any reason to pick that over the FTW3? I've also looked at the Strix Ti OC as well. They are all around the same price range if I don't upgrade. The upgrade path is not to a SC2, well technically not the "SC2". SC Black Edition just doesn't have the 9 thermal sensors. Is that the one you're referring to?


I did not know there was an upgrade path, so I assumed you were trying to save money. I mean, get what you want; they will all perform great. I have a Strix OC myself and love it.


----------



## Benny89

Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> Personally I prefer XSPC, EK or Barrow fittings. Which is best really depends; I've used all of them with no leaks. The only fittings that ever leaked on me were Alphacool. Regarding tubing, I went through a few: PrimoChill LRT, which started yellowing after 2-3 months, then EK DuraClear, which also started yellowing after a few months, and right now I'm using Mayhems UV White.
> 
> Size of tubing and fittings really depends. I like 13/10, but I'll be going with 16/10 next, with EK ZMT tubing.
> 
> Hope this helps and best of luck
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks,Jura


Why 16/10 over 13/10? I ordered 13/10 soft tubes, since most people were saying they are fine. Should I get 16/10 instead? Thanks!


----------



## DrFreeman35

Quote:


> Originally Posted by *sperson1*
> 
> I did not know there was an upgrade path, so I assumed you were trying to save money. I mean, get what you want; they will all perform great. I have a Strix OC myself and love it.


Ok, thanks, I'll keep that in mind. I only have the option to go to the SC Black Edition for another 40 days. I'm tempted to just upgrade since $50 is so cheap, but I don't know how easily I can sell my old ones. I may grab the Strix or FTW3 and put my old ones to work in a folding rig. Anyway, I'd appreciate any other input. If not, thanks for the help.


----------



## jura11

Quote:


> Originally Posted by *Benny89*
> 
> Why 16/10 over 13/10? I ordered 13/10 soft tubes, since most people were saying they are fine. Should I get 16/10 instead? Thanks!


Hi there

13/10 looks small, or smaller, inside a bigger case, and I will be redoing my loop again later on.

But 13/10 is OK; I'm running it as well without a single issue. I just want to try the 16/10 EK ZMT.

You will be just fine with 13/10 there. What pump will you be running?

Hope this helps

Thanks, Jura


----------



## AngryLobster

1860 @ 0.931v is not even stable. Something has gotta be wrong with this or it is the worst 1080 Ti on earth.


----------



## Benny89

Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> 13/10 looks small, or smaller, inside a bigger case, and I will be redoing my loop again later on.
> 
> But 13/10 is OK; I'm running it as well without a single issue. I just want to try the 16/10 EK ZMT.
> 
> You will be just fine with 13/10 there. What pump will you be running?
> 
> Hope this helps
> 
> Thanks, Jura


This pump+res: https://www.ekwb.com/shop/ek-xres-140-revo-d5-pwm-incl-sl-pump


----------



## jura11

Quote:


> Originally Posted by *Benny89*
> 
> This pump+res: https://www.ekwb.com/shop/ek-xres-140-revo-d5-pwm-incl-sl-pump


Hi there

That's a very nice combo; you will be more than OK. A friend of mine runs this combo with 3 GPUs without a single issue.

Hope this helps

Thanks, Jura


----------



## bmgjet

Tried the XOC BIOS on my EVGA SC Black.
It got rid of the card sitting at the power limit 90% of the time; before, it would sit at 2000-2012MHz, power-limit down to 1980MHz, then boost back up, over and over.
Now it sits at a steady 2025MHz, but strangely there's no increase in performance, so that BIOS is slower clock for clock.

EVGA bios 2012mhz with PL throttling (1.062-1.050v)
http://www.3dmark.com/3dm11/12206693

Code:



P28 245 with NVIDIA GeForce GTX 1080 Ti(1x) and Intel Core i7-5820K Processor

Graphics Score
    41 846

Physics Score
    14 634

Combined Score
    13 829

XOC Bios 2025mhz no throttling (constant 1.070v)
http://www.3dmark.com/3dm11/12207180

Code:



P28 208 with NVIDIA GeForce GTX 1080 Ti(1x) and Intel Core i7-5820K Processor

Graphics Score
    41 848

Physics Score
    14 638

Combined Score
    13 734

What sort of voltage are people happy to push?
With a custom fan curve it's getting up to 58C at 1.070v.
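The clock-for-clock gap between the two runs above can be sanity-checked with a quick script. The "points per MHz" metric here is just a rough heuristic (it assumes the graphics score scales roughly linearly with sustained core clock), not anything 3DMark defines:

```python
# Rough clock-for-clock comparison of the two 3DMark11 runs posted above.
# Linear score-vs-clock scaling is only a first-order approximation.

def score_per_mhz(graphics_score, sustained_mhz):
    """Graphics points earned per MHz of sustained core clock."""
    return graphics_score / sustained_mhz

evga = score_per_mhz(41846, 2012)  # EVGA BIOS run, throttling around 2012MHz
xoc = score_per_mhz(41848, 2025)   # XOC BIOS run, steady 2025MHz

# Clock the XOC BIOS "wastes": how much lower it could clock at EVGA-BIOS
# efficiency and still score the same.
deficit_mhz = 41848 / evga - 2025
print(f"EVGA: {evga:.2f} pts/MHz, XOC: {xoc:.2f} pts/MHz")
print(f"XOC needs roughly {abs(deficit_mhz):.0f} MHz extra clock to break even")
```

By this estimate the XOC BIOS needs on the order of 13MHz of extra clock just to break even, consistent with the clock-for-clock disadvantage described above.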


----------



## Nico67

Quote:


> Originally Posted by *AngryLobster*
> 
> 1860 @ 0.931v is not even stable. Something has gotta be wrong with this or it is the worst 1080 Ti on earth.


What do you mean by stable? If it's crashing, then your voltage is too low. If it's downclocking, then it's hitting a limiter and you need to look at that.


----------



## dansi

Is this a Boost 3.0 bug with Superposition?

When I benchmark 4K Optimized vs 1080p Extreme, my 1080 Ti runs about 5-6 bins lower.

Clocks displayed:
-4K Optimized: around 1921-1986
-1080p Extreme: around 2037-1986


----------



## lilchronic

Quote:


> Originally Posted by *dansi*
> 
> Is this a Boost 3.0 bug with Superposition?
> 
> When I benchmark 4K Optimized vs 1080p Extreme, my 1080 Ti runs about 5-6 bins lower.
> 
> Clocks displayed:
> -4K Optimized: around 1921-1986
> -1080p Extreme: around 2037-1986


You are power limited, which is causing your voltage and core clocks to throttle. 4K is a lot more power-demanding than 1080p.
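A first-order way to see why Boost sheds voltage and clock together when power-limited: dynamic power scales roughly as P ∝ f·V², so even one voltage step down saves a disproportionate amount of power. The sketch below uses example V/F pairs quoted elsewhere in this thread as reference points, and it ignores leakage, memory power, and workload utilization, so treat the numbers as illustrative only:

```python
# First-order CMOS dynamic-power sketch: P ~ f * V^2.
# Reference point (1962MHz @ 1.031v) is an example from this thread,
# not a universal 1080 Ti figure.

def relative_power(f_mhz, v, f_ref=1962, v_ref=1.031):
    """Dynamic power relative to the reference V/F point."""
    return (f_mhz / f_ref) * (v / v_ref) ** 2

# Dropping from 1.031v @ 1962MHz to 0.950v @ 1860MHz cuts dynamic power
# by roughly a fifth in this approximation.
saving = 1 - relative_power(1860, 0.950)
print(f"~{saving:.0%} less dynamic power")
```

The same relation explains the undervolting appeal in this thread: most of the saving comes from the squared voltage term, not the modest clock drop.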


----------



## AngryLobster

Quote:


> Originally Posted by *Nico67*
> 
> What do you mean by stable? if its crashing, then your voltage is too low. If its downclocking then its hitting a limiter and you need to look at that.


It's crashing out of games. This is a terrible card. 1962MHz requires 1.031v, and it seems that undervolting anything beyond the default curve's numbers causes a crash. I mean, 1860MHz needs 0.950v, which just doesn't make sense to me, because both my other cards will do around 80-100MHz more at that same voltage.


----------



## Nico67

Quote:


> Originally Posted by *AngryLobster*
> 
> It's crashing out of games. This is a terrible card. 1962mhz requires 1.031v and it seems that undervolting at anything beyond the default curves #'s causes a crash. I mean 1860mhz needs 0.950v which just doesn't make sense to me because both my other cards will do around 80-100mhz more at that same voltage.


The default voltage is 1.050v, and while I appreciate that some people like to undervolt, I don't think you can complain if the card doesn't like it.

Some cards have a higher VID and probably need to run at it to be close to reasonable. Unfortunately it's the luck of the draw; as long as it boosts to 1580 or whatever it is, the card is fine, just not great.

Are you trying to run tri-SLI undervolted or something?


----------



## Spirti

Quote:


> Originally Posted by *feznz*
> 
> Exactly, my problems started as soon as I put the 1080 Ti in. The previous system, GTX 770 SLI, was super stable, no problems whatsoever.
> But I want to solve it; it's annoying knowing you can crash at any moment while browsing. It's a good excuse to get an M.2 drive and do a clean install, and I'll let *Windows choose all the Microsoft-approved drivers*.
> Interestingly, my friend, an IT technician for the NZ army, told me why they do that in their communications systems: Microsoft gets plenty of memory-dump reports, so they know which drivers are super stable, and they probably only approve about 1 in 20 of the WHQL NVIDIA drivers.


Well, then I think I'll have to wait for the next drivers and so on.
I hardly think it's a driver problem, though, since it only crashes after I turn the PC on and watch 10-15 mins of YouTube.


----------



## AngryLobster

Quote:


> Originally Posted by *Nico67*
> 
> The default voltage is 1.050v, and while I appreciate that some people like to undervolt, I don't think you can complain if the card doesn't like it.
> 
> Some cards have a higher VID and probably need to run at it to be close to reasonable. Unfortunately it's the luck of the draw; as long as it boosts to 1580 or whatever it is, the card is fine, just not great.
> 
> Are you trying to run tri-SLI undervolted or something?


No, I'm trying to keep temperatures in check for an ITX build. I've had my hands on almost a dozen 1080 Tis and have never seen anything like this. Some cards wouldn't even OC to 1975, but they'd all cooperate really well with undervolting. 1900MHz at around 0.900v is what I'm used to seeing with these cards, and this one misses that by a huge margin.


----------



## DerComissar

Quote:


> Originally Posted by *nrpeyton*
> 
> *Finally ordered new EK water-block for my 1080Ti FE !!!*
> 
> 
> 
> Spoiler: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> *It was on clearance at overclockers... so only paid 79.99 for it brand new!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *
> It was also their last one (1 in stock)
> 
> If I'd got the 1080Ti block it would have cost about 100.


Glad to see you finally ordered your block!

Even sweeter, is that you got a really good deal on it.









Quote:


> Originally Posted by *Benny89*
> 
> Uff, so I have blocks and raditators, pump, res, fliud and tubes ordered. All EK.
> 
> All that is left is me trying to figure out what fittings, angles adapters etc I will need and how many which so far hardest thing here....
> 
> But almost ready


Fantastic!

I'm glad you've decided to take the "plunge" into real watercooling!
I think you've made some excellent choices.
For the fittings, did you go with black (painted) or silver (chrome and or nickel plated)?
Soft tubing?
Imo soft tubing is best for a first time build, unless you are a masochist, lol.

Go over your planned loop several times, to determine how many straight compression fittings, angled swivel fittings, spacers, etc. that you will need.
Also extra plugs, and extension fittings, although EK includes some with their blocks, and reservoirs/pumps.
Have a look at some other builds in the watercooling club to get some inspiration!

EK fittings are good, so are Barrow. Bitspower fittings are excellent, but damn expensive.

I prefer to have a few extra fittings, certainly better than not enough!
But remember, no matter how many you order, you will always wind up needing something else!

And get a drain valve too, really comes in handy for draining the loop.


----------



## Luckbad

It's been a decade since I properly watercooled (no AIO, closed loop, hybrid stuff).

Having picked up the Zotac ArcticStorm and a nice loop setup, there is no going back.

I'm a fan of quiet computing and have long had a quiet setup when not gaming and a fairly quiet setup when gaming.

Now when gaming, my system stays silent. It's addictive. The ambient noise and fan in my son's room (with a wall between us) are louder than my PC even when heavily overclocked.

I've managed 10496 on Superposition 4K and might push until I get 10500, but even at slightly worse core speed (haven't hit 2100 stable), this does better than my Amp Extreme.


----------



## DerComissar

Quote:


> Originally Posted by *Luckbad*
> 
> It's been a decade since I properly watercooled (no AIO, closed loop, hybrid stuff).
> 
> Having picked up the Zotac ArcticStorm and a nice loop setup, there is no going back.
> 
> I'm a fan of quiet computing and have long had a quiet setup when not gaming and a fairly quiet setup when gaming.
> 
> Now when gaming, my system stays silent. It's addictive. The ambient noise and fan in my son's room (with a wall between us) are louder than my PC even when heavily overclocked.
> 
> I've managed 10496 on Superposition 4K and might push until I get 10500, but even at slightly worse core speed (haven't hit 2100 stable), this does better than my Amp Extreme.


Yeah, that's a nice looking block on the Zotac.

And great temps.!









I've had two Zotac cards in the past, an old 480, then a reference Zotac 780Ti. (still have that one, with a block on it.)
Both were excellent cards.
The ArcticStorm 1080Ti is certainly their best card to date.

Idk who makes the block for it, not sure if it's EK, perhaps Bitspower?
The backplate looks good too, with the cutouts for the power boost modules!


----------



## Luckbad

I'm not sure who makes the ArcticStorm waterblock. Could be Bitspower since they're the only manufacturer I'm aware of who is releasing a block for the Amp Extreme. Could be Thermaltake since they worked with Zotac on the 1080 Anniversary Edition. EKWB isn't going to offer a block for the Amp Extreme, so I'm guessing it's not them.


----------



## dansi

Quote:


> Originally Posted by *lilchronic*
> 
> You are power limited, which is causing your voltage and core clocks to throttle. 4K is a lot more power-demanding than 1080p.


Which is strange, because I monitor GPU-Z and the TDP jumps between 95-120%, most of the time around 114-117%. Only about 5% of the time does it show >120%.

The PerfCap reason, I'd eyeball it at 55% 'idle' and 45% 'pwr' during scenes 1-7,

and then mostly 'vrel' or 'vrel+pwr' after those, with sporadic 'pwr' and 'idle'.

I tested 4K Optimized at QHD, windowed.

Granted, I'm only using the stock FE BIOS with the 120% TDP limit.


----------



## dansi

Oh my, Superposition seems weird. I tried the same QHD windowed res with Extreme shaders, and compared to 4K Optimized I got:

-much lower fps
-blurry textures
-100% GPU load
-scenes 1-7: TDP around 110-117%, BUT no PerfCap reasons! I can run at 2025 with 1.062v through them
-scenes 7-17: some 'pwr', 'idle' and 'vrel', but I maintained higher clocks on average
-much lower scores!

So somehow Extreme shaders vs 4K Optimized gives:
-higher average clocks
-lower TDP consumption
-higher GPU load


----------



## KedarWolf

So, I take off my water block and backplate, check all the pad placements, reapply the thermal paste on the block, put it all back on, and start up my PC to check temps.

Clocks are staying at idle speeds while running the Time Spy stress test: core 240MHz, memory 405MHz.









I take the card out of the pci-e slot, put it back in, check power cables, everything is good.

Same problem.









As I'm taking the card out to pull the block off, I see that the DisplayPort cable is plugged into my secondary PhysX card. I move it to my 1080 Ti, and everything is back to normal.









Good thing this ID 10T error never made me take the block off again.


----------



## DerComissar

Quote:


> Originally Posted by *KedarWolf*
> 
> So, I take off my water block and backplate, check all the pad placements, reapply the thermal paste on the block, put it all back on, and start up my PC to check temps.
> 
> Clocks are staying at idle speeds while running the Time Spy stress test: core 240MHz, memory 405MHz.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I take the card out of the pci-e slot, put it back in, check power cables, everything is good.
> 
> Same problem.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm taking out the card to take the block off, see that the display port is plugged into my secondary PhysX card, put it on my 1080 Ti, back to normal.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Good thing this ID 10T error never made me take the block off again.


Lol!


----------



## KedarWolf

For stability, the XOC BIOS can't be beat.

I recently got 99.6% in the Time Spy stress test, but getting 99% on Fire Strike Ultra is pretty much unheard of.


----------



## dansi

I'm tempted to try the XOC BIOS.
What does it give you?

I know my 1080 Ti is at the short end of the silicon lottery, but if the XOC BIOS really does unlock the power limits, at least I can run constant clocks!


----------



## bmgjet

Quote:


> Originally Posted by *dansi*
> 
> I'm tempted to try the XOC BIOS.
> What does it give you?
> 
> I know my 1080 Ti is at the short end of the silicon lottery, but if the XOC BIOS really does unlock the power limits, at least I can run constant clocks!


It made a good difference on my EVGA SC Black.
But if you have a bad chip you'll need to underclock it straight away, since in stock form the XOC BIOS on my card was trying to run 2068MHz at 1.15v with a very low fan speed.
I put a custom fan curve on, 100% at 55C so it stays under 60C, and dropped the voltage and frequency to 2050MHz @ 1.086v.

There is about a 15MHz clock-for-clock disadvantage with the XOC BIOS in 3DMark11, though, so if you don't gain at least 15MHz more overclock you'll get a lower score.


----------



## BoredErica

Quote:


> Originally Posted by *Dasboogieman*
> 
> Buildzoid may have found the answer to why the XOC BIOS performs worse than the stock FE.
> 
> 
> 
> 
> 
> TLDR, Pascal may have an on chip power management controller similar to Fiji that cannot be overriden. This same circuit may also be responsible for the normalized TDP that the shunt guys have had to deal with.


But... I got higher performance with the XOC than with the FE BIOS...

I'll retest with more trials and different tests next time to be sure... maybe under water instead.


----------



## Aganor

Quote:


> Originally Posted by *jura11*
> 
> That's bit high there
> How many radiators are you running?
> And what case do you have?
> 
> I would suspect you will need better fans at least to keep delta lower or run fans in faster speeds
> 
> Please have look on Thermalright TY-143 SQ or TY-147ASQ or Phanteks PH-F140MP, BeQuiet Silent Wings 3 and for 120mm fans I would get as well Phanteks PH-F120MP
> 
> Hope this helps
> 
> Thanks,Jura


Check my sig, it's my system atm.

It was high because room temp was really warm, but I still don't have a thermometer.
Yesterday I ran 3 hours of Superposition with the windows open on a cool night, and temps reached a max of 50°C.

I also upped the speed of the fans and pump.


----------



## Benny89

Quote:


> Originally Posted by *DerComissar*
> 
> Fantastic!
> 
> I'm glad you've decided to take the "plunge" into real watercooling!
> I think you've made some excellent choices.
> For the fittings, did you go with black (painted) or silver (chrome and or nickel plated)?
> Soft tubing?
> Imo soft tubing is best for a first time build, unless you are a masochist, lol.
> 
> Go over your planned loop several times, to determine how many straight compression fittings, angled swivel fittings, spacers, etc. that you will need.
> Also extra plugs, and extension fittings, although EK includes some with their blocks, and reservoirs/pumps.
> Have a look at some other builds in the watercooling club to get some inspiration!
> 
> EK fittings are good, so are Barrow. Bitspower fittings are excellent, but damn expensive.
> 
> I prefer to have a few extra fittings, certainly better than not enough!
> But remember, no matter how many you order, you will always wind up needing something else!
> 
> And get a drain valve too, really comes in handy for draining the loop.


Well, here's the thing. I chose 10/13 soft tubes and these fittings: https://www.ekwb.com/shop/ek-acf-fitting-10-13mm-black-2

Now I wonder: do they fit into these adapters? https://www.ekwb.com/shop/fittings/adapter-fittings/2-45 and https://www.ekwb.com/shop/ek-af-angled-90-g1-4-black

And what do you use to connect one adapter to the other? The same fittings I mentioned above?

These fittings give me a headache...


----------



## jura11

Quote:


> Originally Posted by *Benny89*
> 
> Well, here is the thing. I chose 10/13 soft tubes and those fittings: https://www.ekwb.com/shop/ek-acf-fitting-10-13mm-black-2
> 
> Now I wonder- do they fit into those adapters? : https://www.ekwb.com/shop/fittings/adapter-fittings/2-45 and https://www.ekwb.com/shop/ek-af-angled-90-g1-4-black ?
> 
> And what do you need to connect one adapter to the other? Same fitting as I mentioned above?
> 
> Those fittings give me headache...


Yes, those 13/10 fittings will fit in those 45° or 90° adapters; the 45°/90° piece is a G1/4 male-to-female adapter, so it should be easy to fit.

Have a look at my build; I've used similar ones on the bottom radiator and on the bottom GPU.



Hope this helps

Thanks,Jura


----------



## jura11

Quote:


> Originally Posted by *Aganor*
> 
> Check my sig, it's my system atm.
> 
> It was high because room temp was really warm, but I still don't have a thermometer.
> Yesterday I ran 3 hours of Superposition with the windows open on a cool night, and temps reached a max of 50°C.
> 
> I also upped the speed of the fans and pump.


Hi there

Sorry, I've been on my phone and couldn't see your signature.

The 750D is not a bad case, but it still suffers from poor airflow.

Your radiator setup is very similar to mine: I'm running a 360mm 60mm-thick at the top, a 240mm 60mm-thick at the bottom, and a 120mm 30mm-thick radiator at the back.

In hot weather like we've had over here, with ambient temperatures at 29°C, I've seen a highest load temperature of 42°C (idle was 27°C) with fans running at 1200RPM, and the water delta has been 6-8°C.

I would suggest getting better fans, which can and should lower temps a bit. With some radiators you really need to run a push-pull configuration and faster fan speeds, but if the ambient temperature is high, nothing will help short of a water chiller or good A/C.

Hope this helps

Thanks,Jura
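For anyone wanting to ballpark their own water delta against numbers like the ones above, a sketch like this works. The watts-per-degree-per-radiator figure is a loose rule-of-thumb assumption, not a measured spec, so plug in values for your own rads and fan speeds:

```python
# Back-of-envelope coolant-to-ambient delta for a custom loop.
# ASSUMPTION: roughly 10 W/°C per 120mm of thick radiator at ~1200RPM.
# Real dissipation depends heavily on rad thickness, fin density and fans.

def water_delta(heat_watts, rad_capacity_w_per_c):
    """Steady-state water temperature above ambient for a given heat load."""
    return heat_watts / rad_capacity_w_per_c

# 360mm + 240mm + 120mm of radiator, as in the loop described above.
capacity = (360 + 240 + 120) / 120 * 10   # ~60 W/°C (assumed)
delta = water_delta(350, capacity)        # ~350 W for an OC'd 1080 Ti + CPU
print(f"~{delta:.1f} °C water delta over ambient")
```

With these assumptions the estimate lands in the same ballpark as the 6-8°C delta reported above, which is about all a model this crude can promise.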


----------



## DerComissar

Quote:


> Originally Posted by *Benny89*
> 
> Well, here is the thing. I chose 10/13 soft tubes and those fittings: https://www.ekwb.com/shop/ek-acf-fitting-10-13mm-black-2
> 
> Now I wonder- do they fit into those adapters? : https://www.ekwb.com/shop/fittings/adapter-fittings/2-45 and https://www.ekwb.com/shop/ek-af-angled-90-g1-4-black ?
> 
> And what do you need to connect one adapter to the other? Same fitting as I mentioned above?
> 
> Those fittings give me headache...


The EK 10/13 compressions you chose should work well.

As Jura mentioned, they will all fit together with the adapters you linked.
You'll be able to determine where you need to use angle adapters, and which type, as you begin to fit everything together.

Which version of the EK 1080Ti Strix block did you order, acetal or acrylic?


----------



## TheBoom

This is a bit off topic, but can anyone tell me if fan CFM is linear with the speed percentage? I just went from 4 intake / 2 exhaust to 3/3 to help with all the heat the 1080 Ti is putting out. The 2 exhaust fans were actually the radiator fans for my CPU cooling, and load temps were going up quite a bit after I installed the 1080 Ti.

Now temps are a bit better with the additional exhaust fan, but my exhaust fans move more air than my intakes. I'm trying my best not to create negative pressure, so I've set the exhaust fans to lower percentages/RPM, but I was wondering if airflow is linear relative to the max CFM in the specs.


----------



## jura11

Quote:


> Originally Posted by *TheBoom*
> 
> This is a bit off topic but can anyone tell me if fan CFM is linear at percentages? I just went from 4 intake 2 exhaust to 3-3 to help with all the heat the 1080ti is putting out. The 2 exhaust were actually the radiator fans on my CPU cooling and load temps were going up quite a bit after I installed the 1080ti.
> 
> Now temps are abit better with the additional exhaust fan but my exhaust fans are of higher airflow then my intakes. Trying my best not to create negative pressure so I've set the exhaust to lower percentages/rpm but was wondering if it's linear compared to its max cfm as on specs


Hi there

Try to remove empty PCI_E slot covers which should help with escaping hot air from GPU

Regarding the fans you should try bringing more cold air to case, you can try experiment with opening case window if temps will drop which in yours case I would suspect this would help to lower temps on CPU and GPU

What intake fans are you using or do you have? And what fans do you have on AIO

Hope this helps

Thanks,Jura


----------



## Dasboogieman

Quote:


> Originally Posted by *TheBoom*
> 
> This is a bit off topic but can anyone tell me if fan CFM is linear at percentages? I just went from 4 intake 2 exhaust to 3-3 to help with all the heat the 1080ti is putting out. The 2 exhaust were actually the radiator fans on my CPU cooling and load temps were going up quite a bit after I installed the 1080ti.
> 
> Now temps are abit better with the additional exhaust fan but my exhaust fans are of higher airflow then my intakes. Trying my best not to create negative pressure so I've set the exhaust to lower percentages/rpm but was wondering if it's linear compared to its max cfm as on specs


The max CFM advertised assumes no obstructions and maximum RPM. I think you are very likely already at positive overall pressure with 3+3, because those radiator fans deliver much lower CFM than rated, even at max power, due to the obstruction.


----------



## TheBoom

Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> Try to remove empty PCI_E slot covers which should help with escaping hot air from GPU
> 
> Regarding the fans you should try bringing more cold air to case, you can try experiment with opening case window if temps will drop which in yours case I would suspect this would help to lower temps on CPU and GPU
> 
> What intake fans are you using or do you have? And what fans do you have on AIO
> 
> Hope this helps
> 
> Thanks,Jura


I don't really want to run an open case, for noise and dust reasons. My temps are better now with 3/3, but I don't want dust coming in due to negative pressure.
Quote:


> Originally Posted by *Dasboogieman*
> 
> The Max CFM advertised assumes no obstructions and maximum RPM. I think you are very likely to already be at positive overall pressure (with 3+3) because those radiator fans are at a much much lower CFM than rated, even at max power due to the obstruction.


Yeah, that's what I thought, but my intake fans are also filtered, so they won't be bringing in as much air.

The intakes are 2 SP140 LEDs at max, with one JetFlo 120 at the bottom running at only 40% (it's noisy).

Exhaust is 1 BitFenix Spectre 140 LED at the rear running at 50%, plus the 2 ML140 Pro LEDs on the radiator, on a fan curve maxing at 80% at 85C CPU temp.

That's why I'm a little worried about negative pressure: the ML140s have much higher CFM and static pressure than the SP140s.


----------



## Dasboogieman

Quote:


> Originally Posted by *TheBoom*
> 
> Don't really want to run an open case for noise and dust concerns. My temps are better now with 3-3 but I don't want dust coming in due to negative pressure.
> Yeah that's what I thought but my intake fans are also filtered so they won't be bringing in much air.
> 
> The intakes are 2 SP140 Leds at max with one Jetflo 120 at the bottom running at 40% only (its noisy).
> 
> Exhaust is 1 bitfenix spectre 140 led at the rear running at 50% and the 2 on the radiator are ML140 Pro leds at a fan curve with max 80% at 85c CPU temp.
> 
> That's why I'm a little worried about negative pressure the ML140s are much higher cfm and static pressure compared to the SP140s.


OK, the math there makes my head hurt. I'd say just don't worry about negative vs. positive pressure; dust will get in regardless of what you do.


----------



## TheBoom

Quote:


> Originally Posted by *Dasboogieman*
> 
> OK the math there makes my head hurt. I'd say just don't worry about negative pressure vs positive, dust will get in regardless of what you do.


Yeah, I know. I did the math, though, and if CFM is linear across RPM, then it should be just about positive-to-neutral pressure.

Actually, it makes a lot of difference: where I live, the dust in the air is crazy. One week without filters and you can see the build-up through the window.
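Assuming CFM really is roughly linear with RPM (a reasonable first-order assumption for axial fans), the pressure balance can be sketched like this. All the rated-CFM and obstruction-derating numbers below are placeholders, not datasheet values, so substitute your own fans' specs before trusting the verdict:

```python
# Back-of-envelope case pressure check, assuming airflow scales linearly
# with fan speed. Rated CFM and derating factors are ASSUMED example
# values, not manufacturer specs.

def effective_cfm(rated_cfm, duty_pct, obstruction=1.0):
    """Airflow at a given duty cycle, derated for filters/radiators (0..1)."""
    return rated_cfm * (duty_pct / 100) * obstruction

intake = (
    effective_cfm(65, 100, obstruction=0.7)    # SP140 #1, behind a filter
    + effective_cfm(65, 100, obstruction=0.7)  # SP140 #2, behind a filter
    + effective_cfm(95, 40, obstruction=0.7)   # JetFlo 120 at 40%, filtered
)
exhaust = (
    effective_cfm(60, 50)                      # rear Spectre 140 at 50%
    + effective_cfm(97, 80, obstruction=0.6)   # ML140 Pro #1 on radiator
    + effective_cfm(97, 80, obstruction=0.6)   # ML140 Pro #2 on radiator
)
print(f"intake {intake:.0f} CFM vs exhaust {exhaust:.0f} CFM -> "
      f"{'positive' if intake > exhaust else 'negative'} pressure")
```

With these made-up figures the balance comes out close to neutral, which matches the "just about positive to neutral" conclusion above; the real answer hinges on how hard the filters and the radiator actually derate each fan.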


----------



## DADDYDC650

Quote:


> Originally Posted by *bmgjet*
> 
> It made a good difference on my EVGA SC Black.
> But if you have a bad chip you'll need to underclock it straight away, since in stock form the XOC BIOS on my card was trying to run 2068MHz at 1.15v with a very low fan speed.
> I put a custom fan curve on, 100% at 55C so it stays under 60C, and dropped the voltage and frequency to 2050MHz @ 1.086v.
> 
> There is about a 15MHz clock-for-clock disadvantage with the XOC BIOS in 3DMark11, though, so if you don't gain at least 15MHz more overclock you'll get a lower score.


I believe we have the exact same card, the EVGA GTX 1080 Ti SC Black Edition? Do you recommend I flash the XOC BIOS, or should I stick with stock?


----------



## phaseshift

Quote:


> Originally Posted by *Dasboogieman*
> 
> You used to be able to force the 570s to do SFR mode to enforce compatibility in those tricky games that didn't do AFR. Still a decent 20% scaling albeit with needing Vsync to deal with SFR screen tearing.....ahh the bad old days.
> depends, what are you looking for?
> Outside of AIBs that pre-bin their chips like Galax HOF and EVGA Kingpin. Most will OC the same, the thing that really makes a difference in performance is core + VRAM cooling competency. The lower the temps you can get, the faster your card is. A second consideration is power limits, some AIBs have larger built-in limits like the Gigabyte Aorus (375W), FTW3 (358W), Zotac (375W+++), Strix (unlimited with the XOC BIOS). Seeing as how these Bioses are locked down, the choice of pre-installed BIOS matters.
> 
> Then lastly there are other considerations like your willingness to DIY, your intentions to watercool or not, your SLI ambitions, noise etc


Quote:


> Originally Posted by *Hulio225*
> 
> Founders Edition @ Custom Waterloop


Quote:


> Originally Posted by *KickAssCop*
> 
> I would say it is all a wash. If you want decent temps, no noise and air cooling, STRIX is the best IMHO.
> If you want water then MSI Seahawk AIO is the best.
> 
> My experience with MSI Gaming X was crap.


Quote:


> Originally Posted by *VESPA5*
> 
> I'd personally go with the 1080Ti FE and slap on custom watercooling, or drop $159.99 USD on EVGA's hybrid cooler. When the 980Ti came out, sure, there were fancy cards with 2-3 massive fans blowing onto huge radiators, but they never really came close to the temps of a custom watercooling setup or a hybrid AIO cooler while maintaining performance.
> 
> Granted, the 1080Ti FE cost me $699 on top of $159.99 for the EVGA hybrid cooler to slap onto it, but I can breeze by with the GPU rarely going over 50C while overclocking it at a conservative 2015MHz. Nice!


Okay, so I'd like to SLI in the future and also have the option to do watercooling (but not a necessity). Right now I'm stuck between the EVGA FTW3, Asus Strix, Gigabyte Aorus Xtreme Edition, and MSI Gaming X. What are your thoughts? I also just recently bought an Asus ROG PG279Q, so I'll be gaming at 1440p.


----------



## RavageTheEarth

Quote:


> Originally Posted by *phaseshift*
> 
> Okay, so I'd like to SLI in the future and also have the option to do watercooling (but not a necessity). Right now I'm stuck between the EVGA FTW3, Asus Strix, Gigabyte Aorus Xtreme Edition, and MSI Gaming X. What are your thoughts? I also just recently bought an Asus ROG PG279Q, so I'll be gaming at 1440p.


I'm biased, but I would suggest buying the regular Aorus instead of the Xtreme. They both have the 375W limit, and as the OP of the Aorus Owners Club I can say the regular Aorus cards seem to be clocking better than the Xtremes for some reason. I'm finally on water and am running 2100/6000 at 1.083v for everyday gaming. I can go higher on the core, but anything higher requires 1.093v. Here is Superposition 4K at 2126/6003.



On air I could run 2076 @ 1.093v.


----------



## Benny89

Quote:


> Originally Posted by *DerComissar*
> 
> The EK 10/13 compressions you chose should work well.
> 
> As Jura mentioned, they will all fit together with the adapters you linked.
> You'll be able to determine where you need to use angle adapters, and which type, as you begin to fit everything together.
> 
> Which version of the EK 1080Ti Strix block did you order, acetal or acrylic?


Nice. Thanks. I ordered this one: https://www.ekwb.com/shop/ek-fc-1080-gtx-ti-strix-acetal-nickel


----------



## KedarWolf

This on the XOC BIOS.


----------



## Luckbad

An interesting quirk of the Zotac ArcticStorm BIOS is that its default power limit is 360W, as opposed to 384W for the AMP Extreme.

However, you can manually set it to 432W:

"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -pl 432
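If you want to reapply that limit without retyping the command (it resets on reboot), a small script works. A hedged sketch in Python: the nvidia-smi path is just the default driver install location from the command above, and the script only builds and prints the command unless you uncomment the run line.

```python
# Sketch of scripting the power-limit command above. The path is the
# default NVSMI install location; -pl sets the board power limit in
# watts (needs admin rights, and the card must allow that limit).
import subprocess

NVSMI = r"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe"

def power_limit_cmd(watts: int) -> list[str]:
    """Build the nvidia-smi command for a given power limit in watts."""
    return [NVSMI, "-pl", str(watts)]

cmd = power_limit_cmd(432)
print(" ".join(cmd))
# To actually apply it (run from an elevated prompt), uncomment:
# subprocess.run(cmd, check=True)
```

Dropping something like this into a .bat or scheduled task is the usual way people keep the raised limit across reboots.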


----------



## KedarWolf

Quote:


> Originally Posted by *Luckbad*
> 
> An interesting quirk of the Zotac ArcticStorm BIOS is that its default power limit is 360W, as opposed to 384W for the AMP Extreme.
> 
> However, you can manually set it to 432W:
> 
> "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -pl 432


Can you save the BIOS with GPU-Z, zip it, and add it as an attachment here?


----------



## jura11

Quote:


> Originally Posted by *TheBoom*
> 
> Don't really want to run an open case for noise and dust concerns. My temps are better now with 3-3 but I don't want dust coming in due to negative pressure.
> Yeah that's what I thought but my intake fans are also filtered so they won't be bringing in much air.
> 
> The intakes are 2 SP140 LEDs at max, with one Jetflo 120 at the bottom running at only 40% (it's noisy).
> 
> Exhaust is 1 Bitfenix Spectre 140 LED at the rear running at 50%, and the 2 on the radiator are ML140 Pro LEDs on a fan curve maxing at 80% at 85°C CPU temp.
> 
> That's why I'm a little worried about negative pressure; the ML140s are much higher CFM and static pressure compared to the SP140s.


Hi there

I would personally go with different fans first. Have a look at the Phanteks PH-F140MP; on the bottom I would add a Phanteks PH-F120MP, and as exhaust another Phanteks PH-F140MP. These fans should do a much better job than the Corsair SP140 or SP120 fans, or even the Bitfenix, etc.

Then there are the BeQuiet Silent Wings 3, which are very good fans, although expensive compared to the Phanteks.

Others I like are the Thermalright TY-143SQ or TY-147ASQ, or round fans like the TY-143 or TY-147A, which are very nice fans, very quiet up to 1300RPM, and comparable to the NF-A15 fans that work so well on the NH-D15.

The CFM ratings on most fans are rubbish; I'd rather look at Thermalbench.com for fan reviews.

Hope this helps

Thanks, Jura


----------



## nrpeyton

Who's got a Pascal Titan X EK block installed on their 1080Ti FE?

Did you follow the instructions in the EK manual (the instructions for the Titan), or change the pad sizes to accommodate the 1080Ti?


----------



## jura11

Quote:


> Originally Posted by *nrpeyton*
> 
> Who's got a Pascal Titan X EK block installed on their 1080Ti FE?
> 
> Did you follow the instructions in the EK manual (the instructions for the Titan), or change the pad sizes to accommodate the 1080Ti?


Hi there

I followed the EK instructions for the Titan X when applying the thermal pads on my GTX1080Ti.

Hope this helps

Thanks, Jura


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> Who's got a Pascal Titan X EK block installed on their 1080Ti FE?
> 
> Did you follow the instructions in the EK manual (the instructions for the Titan), or change the pad sizes to accommodate the 1080Ti?


I had one installed on my FE. What about the thermal pads? The 1080Ti FE and Titan X Pascal are the same size anyway.


----------



## phaseshift

Quote:


> Originally Posted by *RavageTheEarth*
> 
> I'm biased, but I would suggest buying the regular Aorus instead of the Xtreme. They both have the 375W limit, and as the OP of the Aorus Owners Club I can say the regular Aorus cards seem to be clocking better than the Xtremes for some reason. I'm finally on water and am running 2100/6000 at 1.083v for everyday gaming. I can go higher on the core, but anything higher requires 1.093v. Here is Superposition 4K at 2126/6003.
> 
> 
> 
> On air i could run 2076 @ 1.093v.


What are your thoughts on the other cards such as Strix, Gaming X or the FTW3?


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> I had one installed on my FE. What's about the thermal pads? 1080Ti FE and Titan X Pascal are the same size anyways.


So the heights of the MOSFETs on the VRM are exactly the same? (i.e. Nvidia specified the exact same VRM package for the 1080Ti as they used on their Titan?)

I thought the 1080Ti had an upgraded (better) power delivery system than the Titan. _<-- so they may be shorter/taller?_

The EK configurator shows my 1080 Classified as compatible with the 780Ti Classy block but to get best temps you have to change the pad sizes (in other words you *don't* follow instructions in manual when it comes to pads).

Just interested to know if anyones already tested / compared.


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> So the heights of the mosfets on the VRM are exactly the same? (I.E. nvidia specified the exact same VRM package for the 1080Ti as they used on their Titan)?
> 
> I thought the 1080Ti had an upgraded (better) power delivery system than the Titan.


Titan Xp, Titan X, and GeForce GTX 1080 Ti all share the same layout. But whereas 1080 Ti's PCB is fully populated, Titan Xp and Titan X utilize co-packaged high- and low-side MOSFETs from ON Semiconductor (NTMFD4C85N).


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> Titan Xp, Titan X, and GeForce GTX 1080 Ti all share the same layout. But whereas 1080 Ti's PCB is fully populated, Titan Xp and Titan X utilize co-packaged high- and low-side MOSFETs from ON Semiconductor (NTMFD4C85N).


So it may be worth testing with different pad sizes and comparing temps (as VRM *is* different)?


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> So it may be worth testing with different pad sizes and comparing temps (as VRM *is* different)?


It's the same VRM PCB layout, and the VRM section on the EK Titan block already covers all the VRM components, including the positions that differ between the cards.


Also, they are the same in the manual:
GTX 1080Ti:


Titan X


----------



## TheBoom

Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> I would personally go with different fans as first, have look on Phanteks PH-F140MP and on bottom I would add Phanteks PH-F120MP and as exhaust Phanteks PH-F140MP, these fans should do much better job than Corsair SP-140 or SP-120 fans or even Bitfenix etc
> 
> Then here are BeQuiet Silent Wings 3 which are very good fans although expensive over Phanteks fans
> 
> Other ones which I like are Thermalright TY-143SQ or TY-147ASQ or round fans like are TY-143 or TY-147A which are very nice fans and very quiet up to 1300RPM and they're comparable to NF-A15 fans which works very well on NH-D15
> 
> These CFM ratings on most of fans are rubbish and personally would rather have look on Thermalbench.com for fans reviews
> 
> Hope this helps
> 
> Thanks, Jura


Yeah, that would be my next course of action: change the intake fans. I bought these SP140s and the Jetflo years ago, when they were the best options for LED fans. Also, the SPs are quiet at 100%.

Thanks for the suggestions, but none of those are LED and I wanna keep the aesthetics for now. I'll probably go with the same ML140s for the intakes as well.

http://thermalbench.com/2016/07/12/corsair-ml120-pro-120-mm-fan/3/


----------



## Luckbad

Zotac ArcticStorm BIOS should be on TechPowerUp soon, but it's also attached here.

GP102_arcticstorm.zip 155k .zip file


----------



## jura11

Quote:


> Originally Posted by *TheBoom*
> 
> Yeah that would be my next course of action, to change the intake fans. I bought these SP140s and the Jetflo years ago so they were the best options for LED fans then. Also the SPs are quiet at 100%.
> 
> Thanks for the suggestions but none of those are LEDs and I wanna keep the aesthetics for now. I might probably go with the same ML140s for intakes as well.
> 
> http://thermalbench.com/2016/07/12/corsair-ml120-pro-120-mm-fan/3/


Hi there

The Corsair ML series seems OK; they look better than the old SP or Quiet Edition, which I hated as they were too noisy for me.

A few people have had issues with them, but it all depends.

If LED is important to you, then the ML are probably the best ones. I ran the Phanteks PH-F120SP for a while, which is a blue LED fan and was pretty quiet.

For the bottom fans, I would suggest turning them to intake rather than exhaust and removing the PCI slot covers. This can make a difference when you are running an air-cooled GPU; on a water-cooled GPU it doesn't make much difference. With air-cooled cards, like when I ran a Zotac GTX1080 AMP Edition and an EVGA Titan X SC Maxwell with a Raijintek Morpheus II air cooler, I never saw the top card go above 72-76°C with a 2088MHz OC and a 65% fan profile, and the bottom Titan X SC Maxwell ran at most 42-45°C under heavy rendering load.

Hope this helps

Thanks, Jura


----------



## feznz

Quote:


> Originally Posted by *phaseshift*
> 
> What are your thoughts on the other cards such as Strix, Gaming X or the FTW3?


Aesthetics: looking at the OP, they all average about the same OC, so it comes down to the best-looking card. It's a silicon lottery; even the FEs are winning.
Asus has been mentioned a few times as possibly the quietest... I am Republic of Gamers all the way, whether it be AMD, Intel or NVidia.

http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/0_20


----------



## TheBoom

Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> The Corsair ML series seems OK; they look better than the old SP or Quiet Edition, which I hated as they were too noisy for me.
> 
> A few people have had issues with them, but it all depends.
> 
> If LED is important to you, then the ML are probably the best ones. I ran the Phanteks PH-F120SP for a while, which is a blue LED fan and was pretty quiet.
> 
> For the bottom fans, I would suggest turning them to intake rather than exhaust and removing the PCI slot covers. This can make a difference when you are running an air-cooled GPU; on a water-cooled GPU it doesn't make much difference. With air-cooled cards, like when I ran a Zotac GTX1080 AMP Edition and an EVGA Titan X SC Maxwell with a Raijintek Morpheus II air cooler, I never saw the top card go above 72-76°C with a 2088MHz OC and a 65% fan profile, and the bottom Titan X SC Maxwell ran at most 42-45°C under heavy rendering load.
> 
> Hope this helps
> 
> Thanks,Jura


Can't seem to find any Phanteks fans in my country. Are you sure the ML Pros are quieter than the SPs? Mine are louder at 100% compared to the SPs, but of course they have much higher CFM and static pressure.

My bottom is already an intake, the other exhaust is at the rear of the case (which was the one i switched from intake to exhaust).


----------



## jura11

Quote:


> Originally Posted by *TheBoom*
> 
> Can't seem to find any phanteks fans in my country. Are you sure that the ML Pros are quieter than the SPs? Mine are louder at 100% compared to the SPs but of course they are a lot higher cfm and static pressure.
> 
> My bottom is already an intake, the other exhaust is at the rear of the case (which was the one i switched from intake to exhaust).


Hi there

The ML series are 400-2400RPM and the SP are, I think, 1440RPM fans, which is why I think the ML are louder. Try running both fans at the same RPM; I would suspect the ML should be a bit quieter, but you should test them on the same thing, i.e. a case or radiator, and compare them.

You can try turning the exhaust fan (Bitfenix) to intake, which can bring cold air into your case; in some cases this helps, mainly in cases with an AIO/CLC.

Shame Phanteks aren't available in your country; I really like those fans, and they're a lot cheaper than the Corsair fans.

The SP are 1300RPM and the MP are 1800RPM fans; I recommend the MP on a radiator or as case intake fans.

Hope this helps

Thanks, Jura


----------



## TheBoom

Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> The ML series are 400-2400RPM and the SP are, I think, 1440RPM fans, which is why I think the ML are louder. Try running both fans at the same RPM; I would suspect the ML should be a bit quieter, but you should test them on the same thing, i.e. a case or radiator, and compare them.
> 
> You can try turning the exhaust fan (Bitfenix) to intake, which can bring cold air into your case; in some cases this helps, mainly in cases with an AIO/CLC.
> 
> Shame Phanteks aren't available in your country; I really like those fans, and they're a lot cheaper than the Corsair fans.
> 
> The SP are 1300RPM and the MP are 1800RPM fans; I recommend the MP on a radiator or as case intake fans.
> 
> Hope this helps
> 
> Thanks,Jura


The Bitfenix was intake, but a lot of hot air was getting trapped inside, so I switched it to exhaust. Temps are better now, but it might be negative pressure, which was my concern all along. I'll probably change the SPs to ML Pros. Thanks for all the input anyway, appreciated.


----------



## jura11

Quote:


> Originally Posted by *TheBoom*
> 
> The bitfenix was intake but a lot of hot air was getting trapped inside so I switched it to exhaust. Temps are better now but might be negative pressure which was my concern all along. Will probably change the SPs to ML Pros, but thanks for all the input anyway. Appreciated.


Hi there

If hot air was getting trapped with the Bitfenix turned to intake, then you should try running the radiator fans a bit faster and optimising around that.

When you run the top radiator fans slow and the exhaust fan as intake, you will have trapped air that heats up inside the case; you can try playing with the PCIe slot covers to see if that lets the hot air escape.

But if your temps are better now, I wouldn't change it. When I ran air-cooled cards plus an NH-D15, I optimised the airflow around the case and my components; with a Corsair H100i my temps were pretty high, so I went back to air and have been very happy.

Best of luck with the build.

Thanks, Jura


----------



## Coopiklaani

Just realised the efficiency completely goes out the window at 350W+.


----------



## TheBoom

Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> If hot air was getting trapped with the Bitfenix turned to intake, then you should try running the radiator fans a bit faster and optimising around that.
> 
> When you run the top radiator fans slow and the exhaust fan as intake, you will have trapped air that heats up inside the case; you can try playing with the PCIe slot covers to see if that lets the hot air escape.
> 
> But if your temps are better now, I wouldn't change it. When I ran air-cooled cards plus an NH-D15, I optimised the airflow around the case and my components; with a Corsair H100i my temps were pretty high, so I went back to air and have been very happy.
> 
> Best of luck with the build.
> 
> Thanks, Jura


They were already on a curve hitting 100% at 80°C, but my case has a top cover that isn't great for airflow, even with magnets propping it up.


----------



## feznz

Quote:


> Originally Posted by *Spirti*
> 
> Well, then I think I'll have to wait for the next drivers and so on))
> I hardly think that it's a driver problem since it crashes only after I turn the pc on and 10-15 mins of YouTube)


I was thinking maybe it's because a voltage bump is needed at idle. I noticed that my Asus Strix idles @ 0.65v; it would be nice to bump the voltage up to 0.7v, but the MSI Voltage/Frequency editor won't allow editing the voltage below 1000MHz, and my card idles @ 240MHz.









I am still trying to get my new PCIe M.2 drive set up as a bootable drive. I think I am almost there; I know it is possible with a mobo BIOS edit, which would rule out a corrupt OS for me.


----------



## keikei




----------



## Luckbad

I must admit, I'm becoming a Zotac fanboy. Of the 10+ 1080 Tis I tried, the Zotacs were the best. They are doing innovative things with power and mini-1080 Tis that we aren't seeing from other manufacturers. EVGA is the only other company that seems like it brought something to the table with the ICX thermal readings and independent fan control.


----------



## Hiikeri

Quote:


> Originally Posted by *t1337dude*
> 
> Does anyone have that chart that shows games that max out 8Gb Vram, 11Gb, 12Gb etc? I remember it just showed examples of a game or two that used a lot of VRAM.


I have ~10GB VRAM usage in Squad (the picture doesn't show the max VRAM usage; it was a little higher at times).


On Killing Floor 2 @1440, some maps also hit ~9+GB VRAM usage.

Both load textures fully into VRAM (which decreases map-loading lag/stuttering, even with G-Sync): Squad straight from the graphics settings, KF2 through tweaking the *.ini.


----------



## alucardis666

Anyone got a link to latest MSI Afterburner beta? I'm wondering if AB is the cause for the GPU power crash/black screen issues...

I'm on 4.4.0 Beta 6


----------



## DerComissar

Quote:


> Originally Posted by *Benny89*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DerComissar*
> 
> Which version of the EK 1080Ti Strix block did you order, acetal or acrylic?
> 
> 
> 
> Nice. Thanks. I ordered this one: https://www.ekwb.com/shop/ek-fc-1080-gtx-ti-strix-acetal-nickel
Click to expand...

That's a good-looking block.









Imo, acetal is a safer choice than acrylic for durability; it's less likely to suffer from cracking issues.
Having said that, most of my EK blocks were acrylic and had no issues, but acetal gives you that extra insurance.

You have a lot of work ahead getting the loop done, but when you see your GPU load temps on water, you'll know it was all worth it.


----------



## KedarWolf

Quote:


> Originally Posted by *alucardis666*
> 
> Anyone got a link to latest MSI Afterburner beta? I'm wondering if AB is the cause for the GPU power crash/black screen issues...
> 
> I'm on 4.4.0 Beta 6


The link is in the OP, in "How To Flash A BIOS".


----------



## alucardis666

Quote:


> Originally Posted by *KedarWolf*
> 
> The link is in the OP, in "How To Flash A BIOS".


Link doesn't work. :-(


----------



## KedarWolf

Quote:


> Originally Posted by *alucardis666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> The link is in the OP, in "How To Flash A BIOS".
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Link doesn't work. :-(
Click to expand...

I'm pretty sure it says you need to copy the text of the link and paste in a browser. Trouble with how overclock.net handles links.


----------



## alucardis666

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm pretty sure it says you need to copy the text of the link and paste in a browser. Trouble with how overclock.net handles links.


Yea... I'm dumb.









thanks!


----------



## Dasboogieman

Quote:


> Originally Posted by *DerComissar*
> 
> That's a good-looking block.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Imo, Acetal is a safer choice than acrylic for durability, it's less likely to suffer from cracking issues.
> Having said that, most of my EK blocks were acrylic and had no issues, but Acetal gives you that extra insurance.
> 
> You have a lot of work ahead getting the loop done, but when you see your gpu load temps. on water, you'll know it was all worth it.


I agree, I don't trust plexi or acrylic at all. In fact, I even consider acetal a necessary evil, because few blocks offer the choice of all-metal construction.


----------



## alucardis666

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm pretty sure it says you need to copy the text of the link and paste in a browser. Trouble with how overclock.net handles links.


Quote:


> Originally Posted by *alucardis666*
> 
> Yea... I'm dumb.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> thanks!


Well, the new AB beta didn't make a lick of difference; however, going into NVCP and changing it to RGB Full instead of 8-bit 444 seems to have remedied the issue.









Colors are a little more muted and I'm sure I may notice a bit more banding, but the random black screens every so often were REALLY getting increasingly annoying.









Really hoping HDMI 2.1 or some serious driver improvements from Nvidia remedy this. Or maybe I'll cave and pick up a new 4k 144hz monitor or something... idk yet.


----------



## demon09

Quote:


> Originally Posted by *alucardis666*
> 
> Well new AB beta didn't make a lick of difference, however going into NVCP and changing it to RGB full instead of 8-Bit 444 seems to have remedied this issue.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Colors are a little more muted and I'm sure I may notice a bit more banding, but the random constant black screens every so often REALLY were getting increasingly more annoying.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Really hoping HDMI 2.1 or some serious driver improvements from Nvidia remedy this. Or maybe I'll cave and pick up a new 4k 144hz monitor or something... idk yet.


What all did you change? There's a setting for RGB, a separate one for Full, and a separate one for 10-bit. I have been plagued with the random black-screen flash also.


----------



## KedarWolf

Quote:


> Originally Posted by *alucardis666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I'm pretty sure it says you need to copy the text of the link and paste in a browser. Trouble with how overclock.net handles links.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *alucardis666*
> 
> Yea... I'm dumb.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> thanks!
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> Well new AB beta didn't make a lick of difference, however going into NVCP and changing it to RGB full instead of 8-Bit 444 seems to have remedied this issue.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Colors are a little more muted and I'm sure I may notice a bit more banding, but the random constant black screens every so often REALLY were getting increasingly more annoying.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Really hoping HDMI 2.1 or some serious driver improvements from Nvidia remedy this. Or maybe I'll cave and pick up a new 4k 144hz monitor or something... idk yet.
Click to expand...

I get black screens if I use my cheap 3-metre DisplayPort cord; with my two-metre quality one I have no issues. I think the trick is to get a certified DisplayPort cord, one with the DisplayPort symbol on it.


----------



## alucardis666

Quote:


> Originally Posted by *demon09*
> 
> what all did you change?? As there is a setting for rgb and separate one for full and a separate one for 10bit . I have been plagued with the random black screen flash also


*GOOD*



*BAD*


----------



## alucardis666

Quote:


> Originally Posted by *KedarWolf*
> 
> I get black screens if I use my cheap 3-metre DisplayPort cord; with my two-metre quality one I have no issues. I think the trick is to get a certified DisplayPort cord, one with the DisplayPort symbol on it.


Except with my 4K HDR Samsung TV... My only option is HDMI...









*I'm also using a $100 AudioQuest HDMI cable,* so I don't think that's the issue; *I've also tried other HDMI cables, both more and less expensive.*


----------



## dansi

When I had a 980Ti, I did suffer black screens while playing BF4. It turned out I'd pushed my VRAM too far; I dropped the OC and there were no more black screens.

If you're suffering and it's not because of an OC, perhaps change your video cables or even your monitor. I've never had screen-out problems with Nvidia or AMD stock GPUs at default clocks.


----------



## demon09

Quote:


> Originally Posted by *alucardis666*
> 
> *GOOD*
> 
> 
> 
> *BAD*


Hmm, odd. I have been running RGB Full the whole time and still had the random flicker. I just moved to the 382.33 driver, as I was on 378.92; hoping they stay away this time.


----------



## dansi

Quote:


> Originally Posted by *dansi*
> 
> Oh my, Superposition seems weird. I tried the same QHD windowed res with Extreme shaders, and here's what I got versus 4K Optimized:
> 
> -much lower fps
> -blurry textures
> -100% gpu load
> -scenes 1-7 TDP around 110-117%, BUT the perfcap reasons are none! I can run at 2025 with 1.062v through them
> -scenes 7-17 have some 'pwr' and 'idle' and 'vrel', but I maintained higher clocks on average
> -much lower scores!
> 
> Somehow Extreme shaders vs 4K Optimized gives:
> -higher average clocks
> -lower TDP consumption
> -higher gpu load


Quoting myself: anyone else have more findings on how Boost 3.0 works IRL?


----------



## alucardis666

Quote:


> Originally Posted by *demon09*
> 
> hmm odd I have been running rgb full the whole time and still had the random flicker.i just moved to 382.33 driver as I was on 378.92. hoping they stay away this time


Good luck!









I will gladly take the minor hit in color fidelity to not have to deal with my screen randomly going black 100x a day for 1-5 seconds... **EXTREMELY ANNOYING FOR VIDEO WATCHING AND GAMING** Nvidia really needs to get their $H!T together.

*I've had this issue with 1080s, 1080Tis, and even a 2017 TXp...*







Idk if it's a bug with Pascal GPUs, the HDMI 2.0 cable not being quick enough, the way my TV reads the data from the card, or my OC... BUT!!! I also never tried changing from 444 to RGB Full till now, and it seems to have cleared it up so far: 30 minutes with no god-forsaken black screens!


----------



## demon09

Quote:


> Originally Posted by *alucardis666*
> 
> Good luck!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will gladly take the minor hit in color fidelity to not have to deal with my screen going black... Nvidia really needs to get their $H!T together. I've had this issue with 1080s, 1080Tis, and even a TXp...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Never tried changing from 444 to RGB full till now, and it seems to have cleared it up so far... 30 minutes with no god forsaken black screens!


Lol, I feel you. I just used DDU to wipe my drivers and then installed the newest driver. I also removed the hidden monitors in Device Manager that Windows had made, in case that helps. The flicker is annoying af and I was ready to try anything. I'm on a DisplayPort connection, so I don't think it's purely an HDMI problem. Fingers crossed mine stays away.


----------



## alucardis666

Quote:


> Originally Posted by *demon09*
> 
> Lol, I feel you. I just used DDU to wipe my drivers and then installed the newest driver. I also removed the hidden monitors in Device Manager that Windows had made, in case that helps. The flicker is annoying af and I was ready to try anything


Yea... I was debating going full AMD with VEGA on the horizon and getting a FreeSync gaming monitor just because of this issue... idk. We'll see. Maybe I still will.


----------



## KedarWolf

Quote:


> Originally Posted by *alucardis666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I get black screens if I use my cheap 3-metre display port cord. My two-metre quality one I have no issues. I think the trick is to get a certified display port cord, one with the display port symbol on it.
> 
> 
> 
> Except with my 4K HDR Samsung TV... My only option is HDMI...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *I'm also using a $100 Auidoquest HDMI cable...* so I don't think that's the issue, *I've also tried other HDMI's both more and less expensive.*
Click to expand...

https://linustechtips.com/main/topic/632318-4k-60hz-flickering/?page=2


----------



## demon09

Quote:


> Originally Posted by *alucardis666*
> 
> Yea... Was debating going full AMD with VEGA on the horizon and getting a freesync gaming monitor just because of the issue... idk. We'll see. Maybe I still will.


x.x I definitely won't; the 1080Ti was costly enough. If the issue keeps up (as it doesn't seem to be happening to everyone), I will get EVGA to send me a replacement. My current 4K monitor has FreeSync, lol, which sadly goes unused, but G-Sync monitors are way too expensive.


----------



## alucardis666

Quote:


> Originally Posted by *demon09*
> 
> x.x I definitely won't; the 1080Ti was costly enough. If the issue keeps up (and it doesn't seem to be happening to everyone), I will get EVGA to send me a replacement


As I mentioned, multiple cards, same issue. It's not the card.









If VEGA is as good as the Ti or better, I will make the jump to try it, and I'm sure it'll undercut Nvidia by $100-200 per card too.








Quote:


> Originally Posted by *KedarWolf*
> 
> https://linustechtips.com/main/topic/632318-4k-60hz-flickering/?page=2


Hmmm... I did change my HDMI black level to auto and just re-enabled 8bit 444 in NVCP. We'll see if it makes any difference.









Thanks KedarWolf!


----------



## KedarWolf

@Luckbad uploaded his Zotac 432W power-limit BIOS for me; this is it here. The only issue I have is that 3DMark says it's an unrecognised card.

GP102_arcticstorm.zip 155k .zip file


Got my best Time Spy score ever with it, though, after running my powerlimit.bat!

powerlimit.zip 0k .zip file


----------



## Somasonic

Quote:


> Originally Posted by *dunbheagan*
> 
> To give it a quick try, just open the curve editor with Ctrl+L in Afterburner, select the lowest voltage point at 800mV and hit L, then Apply, to lock that point. Your card should now operate at a fixed 800mV. Try some gaming and see if this fixes your coil whine issue. If it helps, you can invest a little more time to find the highest stable clock at 800mV (it should be between 1700-1800MHz) and make a totally flat curve at that clock, straight from 800mV to 1200mV.


Thanks, I'll try this shortly.
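The flat-curve trick in the quote above can be sketched numerically: find the highest stable clock at the locked voltage, then give every point at or above that voltage the same clock, so the card never requests more voltage to boost higher. A toy illustration; the voltage/clock numbers are made up, not read from a real card or from Afterburner:

```python
# Toy model of a "flat" voltage/frequency curve: every point at or above
# the locked voltage gets the locked clock. Values are illustrative only.

def flatten_curve(curve, lock_mv, lock_mhz):
    """Set the clock of every (mV, MHz) point at/above lock_mv to lock_mhz."""
    return [(mv, lock_mhz if mv >= lock_mv else mhz) for mv, mhz in curve]

# Made-up stock curve: (millivolts, MHz)
stock = [(700, 1500), (800, 1700), (1000, 1900), (1200, 2000)]
flat = flatten_curve(stock, lock_mv=800, lock_mhz=1750)
print(flat)  # → [(700, 1500), (800, 1750), (1000, 1750), (1200, 1750)]
```

That flat tail is why the card sits at a fixed low voltage under load, which is what quiets the coils (and cuts power draw).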

Quote:


> Originally Posted by *Dasboogieman*
> 
> You wanna hear a big revelation?
> All cards have coil whine!! Anyone who claims otherwise is either lying, deaf, or lucky that their system doesn't have audible resonance.
> It's just the way the inductors work. ASUS used some pretty high-end (I'm pretty sure noise-reduced) coils in your Strix. Chances are you'd get whine from another card too; there simply isn't another manufacturer that uses better coils, except maybe the KingPin (and those would be optimized for efficiency instead of low noise).
> There is something in either your AC mains power delivery, your PSU delivery, or your mobo power draw that is creating electrical resonance (or noise) in the power that happens to coincide with an audible frequency.
> 
> If you want the most reliable way to deal with coil whine, you have to open up your Strix card.
> 1. If you are feeling brave, apply a decent amount of craft hot glue around each coil, like how PowerColor has done it on the PCB photo below. This dampens the side-to-side vibration of the coils and thus makes the whine much less audible.
> 
> 
> 2. If you are feeling less brave, apply a reasonably thick piece of thermal pad on top of the inductors and make sure the heatsink is pressing down pretty firmly on it. This applies downward pressure on the coils and will also dampen the vibration.
> 
> YMMV with each method, but you will get a definite reduction in noise.


Thanks for this, it makes me less inclined to try swapping the card for a better one. I think I'll give the thermal pad a go at some point; I'm in no hurry to void my warranty with hot glue.









Quote:


> Originally Posted by *KedarWolf*
> 
> www.underclock.net This website is great!!


----------



## Spirti

Quote:


> Originally Posted by *feznz*
> 
> I was thinking maybe its because a voltage bump needed for idle I noticed that my Asus Strix idles @ 0.65v would be nice to bump the voltage up to 0.7v but MSI Voltage/Frequency editor won't allow to edit voltage below 1000Mhz where my card idles @ 240Mhz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am still trying to get my new PCIe M.2 drive as a bootable drive I think I am almost there I know it is possible with a Mobo bios edit to eliminate for me a corrupt OS


Update: I think I made a mistake in my previous posts. I had never tried gaming right after the first boot; today I did a first boot, left it idle for 10-15 mins, and experienced the same problem. Also, if I do a reboot right after the first boot, everything is fine.

Looks like it's definitely an issue with power/voltage, and I will live with it.

Thanks for your answers.


----------



## lilchronic

Quote:


> Originally Posted by *KedarWolf*
> 
> @Luckbad Uploaded his Zotac 432W power limit BIOS for me, this is it here. The only issue I have is 3DMark says it's an unrecognised card.
> 
> GP102_arcticstorm.zip 155k .zip file
> 
> 
> Got my best Time Spy ever with it though after running my powerlimit.bat!
> 
> powerlimit.zip 0k .zip file


How does the powerlimit.bat work?
Well, I ran it as admin and it said it worked, but there was no extra power; my card was still throttling down to 0.963v, same as the FE BIOS at 120% PT.

...it's an EVGA FE edition, for anyone wondering.


----------



## DADDYDC650

Quote:


> Originally Posted by *alucardis666*
> 
> Good luck!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will gladly take the minor hit in color fideltiy to not have to deal with my screen going black 100x a day randomly for 1-5 seconds... **EXTREMELY ANNOYING FOR VIDEO WATCHING AND GAMING** Nvidia really need to get their $H!T together.
> 
> *I've had this issue with 1080's 1080Tis and even a 2017 TXp...*
> 
> 
> 
> 
> 
> 
> 
> Idk if it's a bug with Pascal GPUs, the HDMI 2.0 cable not being quick enough, the way my TV reads the data from the card, or my OC... BUT!!! ...I also never tried changing from 444 to RGB full till now, and it seems to have cleared it up so far... 30 minutes with no god forsaken black screens!


I've got a Samsung 49KS8000 and an EVGA GTX 1080 Ti SC Black Edition. No issues at all. Our TVs are practically the same, so it's weird that you're having issues, although I've heard of others having the same problem as well. Your TV just might be defective. Have you posted over at avsforums?


----------



## feznz

Quote:


> Originally Posted by *Spirti*
> 
> Update - I think I made a mistake in my previous posts - I think I've never tried gaming right after the 1st boot, today I did a 1st boot and left it idle for 10-15 mins and experienced the same problem. Also, if I do a reboot right after 1st boot - everything is fine)
> 
> Looks like it's definitely an issue with power/voltage and I will live with it)
> 
> Thanks for your answers.


Good news: it's been 4 hours and it's running sweet, so it was a corrupt OS. I think after 4-odd driver updates some drivers were worse than others. Hope you get your problems sorted.

I gave up on making the PCIe M.2 drive bootable; I'll probably try to get some expert advice, since I know it's possible, or wait for the 16-core Threadripper. WHOA, Time Spy, eat my physics score, LOL. But as for gaming I am imagining 5-7% CPU usage.

I might try to get some help here. I really wanted a smaller drive for the OS and to use the old OS SSD for games; after watching the page file while loading off the same drive, it didn't make sense to have games on the OS drive.

Edit:
One other time I had a problem similar to yours and fixed it by replacing the motherboard... but that mobo did it from day 1. It would run sweet as a nut at stock clocks, but it had more of a double-post issue; in the end it got to me and I had to sell the board, as I couldn't get warranty service for it since it was OK at stock clocks.


----------



## Benny89

Has anyone here tried to flash, for example, a Zotac BIOS to an Asus Strix card or another brand's card? Not FE.


----------



## Aganor

Is there any way to turn off the thermal throttling of the stock BIOS? If that's my only goal, does the XOC BIOS suffice?


----------



## Nico67

Quote:


> Originally Posted by *KedarWolf*
> 
> @Luckbad Uploaded his Zotac 432W power limit BIOS for me, this is it here. The only issue I have is 3DMark says it's an unrecognised card.
> 
> GP102_arcticstorm.zip 155k .zip file
> 
> 
> Got my best Time Spy ever with it though after running my powerlimit.bat!
> 
> powerlimit.zip 0k .zip file


Did it make any difference in GPU-Z? Are you still getting power limiting? You would kinda hope not at 432W.


----------



## reflex75

Are there any statistics about the frequency range or average clock speed for this GP102 core?


----------



## max99

Hey Guys









What do I have to enter in the powerlimit.bat for the max PL? I use a GTX 1080 Ti Aorus.
Sorry for my bad English.


----------



## Dasboogieman

Quote:


> Originally Posted by *reflex75*
> 
> Are there any statistics about the range frequency or average clock speed for this GP102 core ?


The median seems to be 2000MHz at stock voltage. 2100MHz seems to be the 95th percentile for air cooling. Most watercooled cards can hit 2100MHz, though it's hard to pinpoint the frequency due to the small sample size.
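To make the percentile talk concrete, here's a quick sketch of how those summary numbers would fall out of a sample. The clock list is made up for illustration, not real survey data:

```python
import statistics

# Hypothetical sample of max stable core clocks (MHz); illustrative only.
clocks = [1911, 1936, 1949, 1962, 1975, 1987, 2000, 2000, 2000, 2000,
          2000, 2000, 2012, 2025, 2038, 2050, 2062, 2075, 2088, 2100]

median = statistics.median(clocks)
share_2100 = sum(c >= 2100 for c in clocks) / len(clocks)

print(median)          # 2000.0 -> the "median seems to be 2000MHz" claim
print(1 - share_2100)  # 0.95   -> 2100MHz sits at the 95th percentile here
```

With a sample this small the watercooled numbers would bounce around a lot, which is the "hard to pinpoint" caveat above.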


----------



## reflex75

Quote:


> Originally Posted by *Dasboogieman*
> 
> The median seems to be 2000mhz at stock voltage. 2100mhz seems to be the upper 95% percentile for aircooling. Most watercooled cards can hit 2100mhz though its hard to pinpoint the frequency due to small sample size.


Thanks! 2000MHz seems high for a median, no?
And what about the coming EVGA Kingpin, with cores binned for at least 2025MHz?
Is it worth it? Can it make a difference?


----------



## Dasboogieman

Quote:


> Originally Posted by *reflex75*
> 
> Thanks! 2000mhz seems high for median no?
> And what about the coming EVGA Kingpin with binned core at least 2025mhz?
> Is it worth it? Can it make a difference?


That's the funny thing: the population of cards that can go beyond 2000MHz drops off considerably fast with every 13MHz increment. 2025MHz on stock voltage is actually not that common (well, at least not so common that you'd bother binning for it).

It's more like: if it's guaranteed 2025MHz on stock voltage on air, it has an 80%+ chance to do at least 2100MHz on water, and much higher on LN2.
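As a side note on those "13MHz increments": Pascal boost clocks move in fixed steps (the step is variously quoted as 12.5 or 13MHz, and tools round the readings), which is why everyone in this thread reports the same handful of numbers. A sketch assuming a 12.5MHz step and a 2000MHz anchor bin (both example values):

```python
STEP_MHZ = 12.5    # assumed bin width; often rounded and quoted as "13MHz"
BASE_MHZ = 2000.0  # example anchor bin

def boost_bins(n, base=BASE_MHZ, step=STEP_MHZ):
    """First n boost bins at and above `base`."""
    return [base + i * step for i in range(n)]

print(boost_bins(9))
# [2000.0, 2012.5, 2025.0, 2037.5, 2050.0, 2062.5, 2075.0, 2087.5, 2100.0]
# Rounded readings match the 2012/2025/2038/2050/2062/2088/2100 clocks
# people keep quoting in this thread.
```

This also explains the "kissed the next bin" crashes: the card never actually runs between bins, it jumps a whole step.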


----------



## Luckbad

Possible insight from someone who has owned way too many 1080 Ti cards (me):

The majority of cards can reach 2000MHz on the core. Maybe not always sustain it under load with thermal throttling etc, but they can generally get there. I've never had one that couldn't get into the 1900+ territory, so it's pretty safe to say that if you can't get past 1900, your chip is terrible.

I have only had 2 cards that can hit over 2100MHz, both on air. Most cards seem to have a wall at 2100 where they simply crash, regardless of temperature. My new ArcticStorm card can't do 2100 on the core despite being in the low 30s under normal load, but it's actually a reasonable chip (better bench scores than any other card I've owned).

I also had two EVGA SC2 Hybrids. One couldn't even do 2000, the other could barely sustain greater than 2000 in gaming.

An interesting thing with overclocking the 1080 Ti is that many chips seem to just crap out at a particular frequency regardless of temperature. With one of my Amp Extremes, it was totally fine at 2100. Trying to adjust it to stay at that frequency after thermal throttling didn't work, though. As soon as it kissed the next bin, even at 30 C, the card crashed. If I let the card heat up to 65 THEN overclocked it such that it stuck at 2100, no problems at all.

It's reasonable to say that most cards can get to 2000-2025 on core, but not sustain it after thermal throttling and such kick in. I am highly doubtful that most cards can sustain 2100+ even on water, as the wall starts being hit and you need a particularly good chip.

That said, memory overclocking is super useful on the 1080 Ti. 2062 core/6123 mem yields me better FPS and benches than 2088 core/6106 mem. I had an Amp Extreme that did 2143 (think that's the right number, more than 2140 and less than 2150 whatever it was) on AIR. But the ram only went up to ~5800. That card was significantly slower for benches and FPS than the Amp Extreme I still have at my place that does 2062/~6106 as its day-to-day frequency (can hit 2100 with voltage).

I think with the 1080 Ti, we've basically reached the limits of the chips this generation. Almost every aftermarket card can get to similar speeds, and it's really just up to the silicon lottery. If you can't exceed 2000MHz ever without crashing, your chip is slightly below average. If you can hit 2100MHz ever without crashing, your chip is well above average. Memory overclocking is actually more interesting to me this generation than core.

With the ArcticStorm, I've settled in on my day-to-day overclock being 2050/6123 with no voltage. That sees no throttling ever and great performance, and it's a round number unlike the 2062 I can also do with the card.
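For anyone curious why the memory clock moves benches so much, here's rough napkin math on bandwidth, assuming the 1080 Ti's 352-bit bus and that Afterburner reports half the effective GDDR5X data rate (stock reads ~5505MHz for the card's 11 GT/s rating; this reporting convention is an assumption about how your tool of choice displays it):

```python
BUS_WIDTH_BITS = 352  # GTX 1080 Ti memory bus

def bandwidth_gb_s(reported_mhz):
    """Approximate memory bandwidth from the clock Afterburner shows.

    Assumes the tool reports half the effective data rate
    (~5505 MHz shown at the stock 11 GT/s).
    """
    effective_mt_s = reported_mhz * 2
    return effective_mt_s * (BUS_WIDTH_BITS / 8) / 1000

print(round(bandwidth_gb_s(5505)))  # 484 -> ~484 GB/s at stock
print(round(bandwidth_gb_s(6123)))  # 539 -> ~539 GB/s at the 6123 OC above
```

That's roughly an 11% bandwidth bump from the memory OC, versus the ~1-2% you get from one or two extra core bins, which matches the FPS behaviour I described.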


----------



## Slackaveli

Quote:


> Originally Posted by *Luckbad*
> 
> Possible insight from someone who has owned way too many 1080 Ti cards (me):
> 
> 
> 
> 
> 
> The majority of cards can reach 2000MHz on the core. Maybe not always sustain it under load with thermal throttling etc, but they can generally get there. I've never had one that couldn't get into the 1900+ territory, so it's pretty safe to say that if you can't get past 1900, your chip is terrible.
> 
> I have only had 2 cards that can hit over 2100MHz, both on air. Most cards seem to have a wall at 2100 where they simply crash, regardless of temperature. My new ArcticStorm card can't do 2100 on the core despite being in the low 30s under normal load, but it's actually a reasonable chip (better bench scores than any other card I've owned).
> 
> I also had two EVGA SC2 Hybrids. One couldn't even do 2000, the other could barely sustain greater than 2000 in gaming.
> 
> An interesting thing with overclocking the 1080 Ti is that many chips seem to just crap out at a particular frequency regardless of temperature. With one of my Amp Extremes, it was totally fine at 2100. Trying to adjust it to stay at that frequency after thermal throttling didn't work, though. As soon as it kissed the next bin, even at 30 C, the card crashed. If I let the card heat up to 65 THEN overclocked it such that it stuck at 2100, no problems at all.
> 
> It's reasonable to say that most cards can get to 2000-2025 on core, but not sustain it after thermal throttling and such kick in. I am highly doubtful that most cards can sustain 2100+ even on water, as the wall starts being hit and you need a particularly good chip.
> 
> That said, memory overclocking is super useful on the 1080 Ti. 2062 core/6123 mem yields me better FPS and benches than 2088 core/6106 mem. I had an Amp Extreme that did 2143 (think that's the right number, more than 2140 and less than 2150 whatever it was) on AIR. But the ram only went up to ~5800. That card was significantly slower for benches and FPS than the Amp Extreme I still have at my place that does 2062/~6106 as its day-to-day frequency (can hit 2100 with voltage).
> 
> I think with the 1080 Ti, we've basically reached the limits of the chips this generation. Almost every aftermarket card can get to similar speeds, and it's really just up to the silicon lottery. If you can't exceed 2000MHz ever without crashing, your chip is slightly below average. If you can hit 2100MHz ever without crashing, your chip is well above average. Memory overclocking is actually more interesting to me this generation than core.
> 
> With the ArcticStorm, I've settled in on my day-to-day overclock being 2050/6123 with no voltage. That sees no throttling ever and great performance, and it's a round number unlike the 2062 I can also do with the card.


I concur with all of this. I have a really good Aorus that loves 2088; it runs along great at that frequency but crashes at 2100 after a while. It just prefers 2088 in all scenarios. But the memory... mine artifacts before 6000, so for long gaming sessions I have to run 5900. I can run 5940, but after an hour or so I guess the vram heats up a bit and it crashes. There are definitely two lotteries going on, and while I won the core lottery, I certainly caught an average-at-best vram. If I had caught 6100 memory on this dude it'd be a monster. I am cool with it as the difference is minimal, but I will definitely be binning at least 4 cards when it's Volta time.


----------



## Dasboogieman

Mine is binned extremely strangely. It was only 2037MHz stable under air cooling, but ever since I went under water, it's now 2100MHz stable at 1.062v. I can benchmark at 2126MHz, but that's definitely not long-term stable.


----------



## Some Random Guy

My Strix OC seems to be doing OK with its EK block. I'm building my voltage curve right now using MSI Afterburner to lock voltages. So far, I'm stable at 2114 with 1050mv for benchmarks and some gaming, but I haven't run any sessions over an hour. I've only worked down to the 1012mv step so far.

As a question though, what does the voltage slider actually do on Boost 3.0? Does it allow you access to higher voltage steps or does it increase the voltage at every point along your curve (to increase stability of whatever step is holding you back by throwing more voltage at them all)? I haven't bothered to unlock mine in Afterburner yet, but I can rebuild my curve in GPU Tweak II if I want and use their voltage slider. The new % thing has me a little confused. The only time I see the slider mentioned, they say "just turn it all the way up", but for me that will probably be counterproductive since I'm hitting power limits with lower voltages that are forcing downclocks. Stability isn't my problem yet, but it could be later. Honestly even if I have to drop a frequency step or two at each point along my curve, I'll be quite happy with what I have.

I guess what I'm saying is my best guess is the voltage slider increases voltage in all the bins to stabilize frequency all along the curve that gets built when you only use the sliders. However, when you use a custom curve it seems useless if that is correct. If someone has a good link to give me a better explanation, I would appreciate it.


----------



## BoredErica

It's sort of confusing what "2000mhz" even means. 2000MHz at what load and what temps? I can get the frequency way down on Superposition 4K Extreme if I really wanted to, or use Skyrim to cook the books.

I will be water cooling this month, so I am interested to see if there is a benefit in overclock.


----------



## KraxKill

The frequency is only 50% of the puzzle. TDP is the other 50%


----------



## nrpeyton

One thing I've noticed more regularly with the *10 series* (PASCAL) is that we see *artefacts* a lot *less*. _(In fact the only way I see any artefacts at all these days is via memory O/C'ing).
_
On my GTX 980 I would artefact as I approached the upper threshold on the core too. With *PASCAL* it's always a *straight CRASH!*

Anyone have a different experience?

Quote:


> Originally Posted by *Darkwizzie*
> 
> It's sort of confusing what "2000mhz" even means. 2000mhz at what and what temps? I can get the frequency way down on Superposition 4k Extreme if I really wanted to or use Skyrim to cook the books.
> 
> I will be water cooling this month, so I am interested to see if there is a benefit in overclock.


The only benefit I see to water-cooling is this:

-Less noise

-On Founders Edition 50Mhz more stability

-On aftermarket cards with GOOD heatsink/fan assembly-- 25Mhz more stability

-Most important-- On cards with lower power targets _(Founders Edition)_ it makes 'power mod' */* 'shunt mod' *possible*! _(preventing throttling)_

-On 1080Ti _(not younger 1080's)_ it allows *small* volt mods to actually yield a little extra clock speed _(albeit not much)_ but every little helps!

I found a guy at the EVGA forums who works with metal (as a machinist). He's able to design his own water blocks. If I had the money I'd have him design me an *actively* water cooled backplate for core, memory & VRM!


----------



## jase78

Hello, I would greatly appreciate it if you guys could highlight which exact changes in the Nvidia control panel seem to net the best bench scores. I've got G-Sync off and prefer maximum performance, plus single display performance mode, but what else seems to help?


----------



## feznz

Quote:


> Originally Posted by *Dasboogieman*
> 
> The median seems to be 2000mhz at stock voltage. 2100mhz seems to be the upper 95% percentile for aircooling. Most watercooled cards can hit 2100mhz though its hard to pinpoint the frequency due to small sample size.


I must have lost the lottery big time... I can only get 2038MHz stable on water at stock voltage. I had a little play, and with 1.093v I can sit at about 2063MHz.

Could be something to do with my location; if I were in the tropics rather than down towards Antarctica, maybe luck would be on my side.
Quote:


> Originally Posted by *jase78*
> 
> hello,would greatly appreciate if you guys could highlight what exact changes to nvidia control panel seem to net the best bench scores. ive got gsync off and prefer max perf, single display perf mode, but what else seems to help?


Here, the first post has the Nvidia tweaks section; the same applies to every program you want to tweak for performance, and be sure to submit a result:









http://www.overclock.net/t/1627767/top-30-unigine-superposition-benchmark/0_20


----------



## Timechange01

Count me in!


----------



## BoredErica

Quote:


> Originally Posted by *nrpeyton*
> 
> One thing I've noticed more regularly with the *10 series* (PASCAL) is that we see *artefacts* a lot *less*. (In fact the only way I see any artefacts at all these days is via memory O/C'ing).
> 
> On my GTX 980 I would artefact as I approached the upper threshold on the core too. With *PASCAL* it's always a *straight CRASH!*
> 
> Anyone have a difference experience?
> 
> The only benefit I see to water-cooling is this:
> 
> -Less noise
> 
> -On Founders Edition 50Mhz more stability
> 
> -On aftermarket cards with GOOD heatsink/fan assembly-- 25Mhz more stability
> 
> -Most important-- On cards with lower power targets (Founders Edition) it makes 'power mod' */* 'shunt mod' *possible*! (preventing throttling)
> 
> -On 1080Ti (not younger 1080's) it allows *small* volt mods to actually yield a little extra clock speed (albeit not much) but every little helps!
> 
> I found a guy at the EVGA forums who works with metal. (As a machinest). He's able to design his own water blocks. If I had the money I'd have him design me an *actively* water cooled backplate for core, memory & VRM!


Active backplate exists though. http://thermalbench.com/2017/04/26/aqua-computer-kryographics-pascal-1080/5/


----------



## feznz

Quote:


> Originally Posted by *Luckbad*
> 
> Possible insight from someone who has owned way too many 1080 Ti cards (me):
> 
> The majority of cards can reach 2000MHz on the core. Maybe not always sustain it under load with thermal throttling etc, but they can generally get there. I've never had one that couldn't get into the 1900+ territory, so it's pretty safe to say that if you can't get past 1900, your chip is terrible.
> 
> I have only had 2 cards that can hit over 2100MHz, both on air. Most cards seem to have a wall at 2100 where they simply crash, regardless of temperature. My new ArcticStorm card can't do 2100 on the core despite being in the low 30s under normal load, but it's actually a reasonable chip (better bench scores than any other card I've owned).
> 
> I also had two EVGA SC2 Hybrids. One couldn't even do 2000, the other could barely sustain greater than 2000 in gaming.
> 
> An interesting thing with overclocking the 1080 Ti is that many chips seem to just crap out at a particular frequency regardless of temperature. With one of my Amp Extremes, it was totally fine at 2100. Trying to adjust it to stay at that frequency after thermal throttling didn't work, though. As soon as it kissed the next bin, even at 30 C, the card crashed. If I let the card heat up to 65 THEN overclocked it such that it stuck at 2100, no problems at all.
> 
> It's reasonable to say that most cards can get to 2000-2025 on core, but not sustain it after thermal throttling and such kick in. I am highly doubtful that most cards can sustain 2100+ even on water, as the wall starts being hit and you need a particularly good chip.
> 
> That said, memory overclocking is super useful on the 1080 Ti. 2062 core/6123 mem yields me better FPS and benches than 2088 core/6106 mem. I had an Amp Extreme that did 2143 (think that's the right number, more than 2140 and less than 2150 whatever it was) on AIR. But the ram only went up to ~5800. That card was significantly slower for benches and FPS than the Amp Extreme I still have at my place that does 2062/~6106 as its day-to-day frequency (can hit 2100 with voltage).
> 
> I think with the 1080 Ti, we've basically reached the limits of the chips this generation. Almost every aftermarket card can get to similar speeds, and it's really just up to the silicon lottery. If you can't exceed 2000MHz ever without crashing, your chip is slightly below average. If you can hit 2100MHz ever without crashing, your chip is well above average. Memory overclocking is actually more interesting to me this generation than core.
> 
> With the ArcticStorm, I've settled in on my day-to-day overclock being 2050/6123 with no voltage. That sees no throttling ever and great performance, and it's a round number unlike the 2062 I can also do with the card.


I am trying to recall, you had 10+ 1080 Tis, right?
Anyway, +1 for a nice detailed post where I'm not left imagining that everyone is winning the lottery.


----------



## Luckbad

Quote:


> Originally Posted by *feznz*
> 
> I am trying to recall you had 10+ 1080Tis right?
> anyway for a nice detailed post where I am not imagining that everyone is winning the lottery +1


I've kinda lost track. I think I'm at 10+ now though, yeah.


----------



## kfxsti

Sign me up woohoo


I hope MSI does not void warranty on cooler removal. The TIM on this baby is about to get swapped.


----------



## nrpeyton

Quote:


> Originally Posted by *Darkwizzie*
> 
> Active backplate exists though. http://thermalbench.com/2017/04/26/aqua-computer-kryographics-pascal-1080/5/


Ahh yes... the Kryographics one. It's kind of "indirectly" active (I did look into it before; here's a pic):



Also, their thermal performance (with both the water-block & the backplate above) only matches the performance of an EK solution.

But imagine if we could just buy our EK blocks as standard, and then, instead of a normal passive EK backplate, opt for one that's completely active on the back, i.e. fins at the back of the GPU die and water flowing directly behind the memory & VRM, plus some other water routes that branch out to keep the entire backplate cool too. (In other words, the backplate would be like an entire GPU water-block.)

(The GPU would be literally SURROUNDED by water, attacking the heat from 2 sides.)

On my old CPU, I was able to get 15°C lower CPU temps by having a 3000 RPM industrial fan blowing at the back of the CPU socket. Now imagine that was water instead, on the back of your GPU die.

We've all "felt" how warm the backplates get, even when water-cooling, so cooling that whole backplate down to ambient level would definitely do something to improve temps on all components. _(So you could even say you're attacking on 3 sides!)_

It wouldn't be cheap. But it could be an elite option for the hard-core water coolers trying to shave off every last degree C.

Quote:


> Originally Posted by *kfxsti*
> 
> Sign me up woohoo
> 
> 
> I hope MSI does not void warranty on cooler removal. The TIM on this baby is about to get swapped.


What are you swapping it with, Kryonaut?

Would be interested to know how you get on.

Also, they can't void the warranty, because they can't prove you took it apart.


----------



## nrpeyton

Wait a minute.

What if I used the active backplate (pictured above) alongside my EK water-block, *instead* of with its Kryographics water-block brother?

Then I'd have the benefit of EK on the front, and an actively cooled Kryographics backplate on the back!

I've done a bit more digging, and Kryographics have now also released an active backplate for the 1080 Ti. _(Last time I looked it didn't exist.) See below:_



Anyone see any compatibility issues?

This could be as good as it's going to get!


----------



## TheBoom

So apparently Zotac has a new Amp Extreme Core Edition 1080 Ti. I honestly can't tell what the difference is, though, apart from it having a lower boost clock than the regular Amp Extreme. It would be interesting to see if it is a binned chip, since prices are exactly the same.


----------



## Dasboogieman

Quote:


> Originally Posted by *nrpeyton*
> 
> Wait a minute.
> 
> What if I used the active backplate (pictured above) alongside my EK water-block. *Instead* of with it's Kryographics water-block brother.
> 
> Then I'd have the benefit to EK on the front, and an actively cooled Kryographics backplate on the back!
> 
> I done a bit more digging -- and Krygraphics have now also released an active backplate for the 1080Ti. _(last time I looked it never existed) See below:_
> 
> 
> 
> Anyone see any compatability issues?
> 
> This could be as good as it's going to get !!


Ooooooh, those were really good in the Hawaii era at keeping the VRMs cold. It was like a 15-20 degree difference versus no active backplate, and it made all the difference for high-voltage overclocks. The impact for the GTX 1080 Ti is dubious though, especially on the custom models, because we aren't as badly limited by VRM temps as Hawaii was.


----------



## Nico67

Quote:


> Originally Posted by *nrpeyton*
> 
> Wait a minute.
> 
> What if I used the active backplate (pictured above) alongside my EK water-block. *Instead* of with it's Kryographics water-block brother.
> 
> Then I'd have the benefit to EK on the front, and an actively cooled Kryographics backplate on the back!
> 
> I done a bit more digging -- and Krygraphics have now also released an active backplate for the 1080Ti. _(last time I looked it never existed) See below:_
> 
> 
> 
> Anyone see any compatability issues?
> 
> This could be as good as it's going to get !!


The terminal on the top needs to line up with the removed terminal holes on the EK block, so the screw holes and water channels need to line up.


----------



## kfxsti

I have Kryonaut and GC Extreme on hand, lol, always have these two on hand.
I prefer Gelid on the GPUs and Kryonaut on the CPUs.


----------



## Dasboogieman

Quote:


> Originally Posted by *nrpeyton*
> 
> One thing I've noticed more regularly with the *10 series* (PASCAL) is that we see *artefacts* a lot *less*. _(In fact the only way I see any artefacts at all these days is via memory O/C'ing).
> _
> On my GTX 980 I would artefact as I approached the upper threshold on the core too. With *PASCAL* it's always a *straight CRASH!*
> 
> Anyone have a difference experience?
> 
> The only benefit I see to water-cooling is this:
> 
> -Less noise
> 
> -On Founders Edition 50Mhz more stability
> 
> -On aftermarket cards with GOOD heatsink/fan assembly-- 25Mhz more stability
> 
> -Most important-- On cards with lower power targets _(Founders Edition)_ it makes 'power mod' */* 'shunt mod' *possible*! _(preventing throttling)_
> 
> -On 1080Ti _(not younger 1080's)_ it allows *small* volt mods to actually yield a little extra clock speed _(albeit not much)_ but every little helps!
> 
> I found a guy at the EVGA forums who works with metal. (As a machinest). He's able to design his own water blocks. If I had the money I'd have him design me an *actively* water cooled backplate for core, memory & VRM!


Actually, don't forget that going to water drops the power draw by about 30W due to the sheer temperature difference alone, since leakage current falls as the silicon runs cooler.


----------



## jura11

Quote:


> Originally Posted by *kfxsti*
> 
> I have kryonaut and GC extreme on hand. Lol always have these two on hand.
> I prefer Gelid on the GPUs and kryo on the CPUs.


In my case I use Noctua NT-H1 on the GPU and Kryonaut on the CPU. I tried Kryonaut on my GPUs as well, but my temps were around 2-3°C higher than with Noctua NT-H1.

Hope this helps

Thanks,Jura


----------



## kfxsti

Thanks, but I'm picky about how I use them, especially with my laptops. Gelid only on them, lol.


----------



## TheBoom

How much difference in temps do you actually see with kryonaut over the regular pastes?


----------



## Dasboogieman

Quote:


> Originally Posted by *TheBoom*
> 
> How much difference in temps do you actually see with kryonaut over the regular pastes?


Phobya HeGrease (AKA Gelid Extreme) was the previous champion, so I think Kryonaut was about 1-2 degrees better, mounting dependent.
Liquid metal will gain you maybe 2 degrees on top of Kryonaut, and phase-change TIM will give another 2 degrees on top of liquid metal.


----------



## TheBoom

Quote:


> Originally Posted by *Dasboogieman*
> 
> Phobya HeGrease (AKA Gelid Extreme) was the previous champion so I think Kryo was about 1-2 degrees better? mounting dependent.
> Liquid metal will gain you maybe 2 degrees on top of Kryo and Phase change TIM will give another 2 degrees on top of liquid metal.


I must have been out of the TIM loop for a while now; I still use MX-4, lol.


----------



## Dasboogieman

Quote:


> Originally Posted by *TheBoom*
> 
> Must have been out of the TIM loop for a while now, I still use MX4 lol.


I can't find the article anymore, but I think HeGrease was better than MX-4 by about 3-4 degrees.


----------



## alucardis666

Quote:


> Originally Posted by *DADDYDC650*
> 
> I've got a Samsung 49KS8000 and an EVGA GTX 1080 TI SC Black Edition. No issues at all. Our TV's are practically the same so weird that you're having issues. Although I've heard of others having the same problem as well. Your TV just might be defective. Have you posted over at avsforums?


It only happens with the PC. All other sources are fine, so... idk, lol. Since I changed the HDMI black level option to auto, the issue has gone away.


----------



## jase78

Quote:


> Originally Posted by *feznz*
> 
> I must have lost the lottery big time...... I can only get 2038Mhz stable on water stock voltage I had a little play and with 1.093v I can sit about 2063Mhz
> 
> Could be something to do with my location if I were in the tropics compared to Antarctica maybe luck might be on my side.
> Here, in the first post, there is the NVidia tweaks section; the same applies to every program you want to tweak up the performance, and be sure to submit a result.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1627767/top-30-unigine-superposition-benchmark/0_20


I had most of those selected already except for texture filter quality. Funny thing is, I went back to the 381.65 driver from the latest and my scores all went back up! Another thing I've noticed is that when GPU-Z is open I will crash with the same curve?? This is the absolute best I've done at this point.


----------



## DerComissar

Quote:


> Originally Posted by *Nico67*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nrpeyton*
> 
> Wait a minute.
> 
> What if I used the active backplate (pictured above) alongside my EK water-block, *instead* of with its Kryographics water-block brother?
> 
> Then I'd have the benefit of the EK on the front, and an actively cooled Kryographics backplate on the back!
> 
> I've done a bit more digging, and Kryographics have now also released an active backplate for the 1080 Ti. _(Last time I looked it didn't exist.) See below:_
> 
> 
> 
> Anyone see any compatibility issues?
> 
> This could be as good as it's going to get !!
> 
> 
> 
> The terminal on the top needs to line up with the removed terminal holes on the EK, so the screw holes and water channels need to line up.
Click to expand...

That's the catch, as Nico67 pointed out.

You could buy an AC active backplate for experimenting with, but that's if the terminal could be made to work with the EK block.

The idea you mentioned, having a fully functional block on the back of the PCB rather than a simple backplate, certainly would increase the thermal performance considerably.

I imagine that would easily be double the cost of a conventional 1080Ti block though! Ouch!


----------



## TheBoom

Quote:


> Originally Posted by *Dasboogieman*
> 
> I can't find the article anymore but I think HeGrease was better than MX-4 by about 3-4 degrees


If you mean Gelid Extreme, I've had that, but somehow I always got worse performance compared to MX-4.


----------



## dansi

Quote:


> Originally Posted by *feznz*
> 
> I must have lost the lottery big time...... I can only get 2038Mhz stable on water stock voltage I had a little play and with 1.093v I can sit about 2063Mhz


I have a noob 1080 Ti too, cooled by a hybrid kit. I can game Witcher 3 stable at [email protected] with no perfcap, but once I tried 2067 it would flop in the Superposition extreme bench. Strangely, 2067 managed to complete 4K optimised, but the scores were the same as at 2025.

I can't even get it to run at 1.093v; it starts at 1.062v and then drops to 1.05v.

I saw it run at 1.075v and 1.093v once, but I can't replicate that again. My 1080 Ti FE seems to have a mind of its own.


----------



## bmgjet

After a week of playing with my EVGA 1080 Ti SC B, I've come to find voltage doesn't help at all. If anything it makes things worse in some cases.

PUBG seems to be the hardest game on the card, strangely.
I've based my overclock on what's stable in it; going over 1.081v makes it less stable at the same and lower clocks.
I have mine set to 2038 MHz @ 1.081v under 45C.
Then it downclocks to 2025 @ 1.073v at 46-55C,
then to 2012 @ 1.062v at 55-62C.
Sometimes it downclocks to 2000 @ 1.050v above 62C, but then it comes back down to 60C quickly.

BF1 is completely different.
It will hold 2050 MHz @ 1.092v all day and only drops to 2038 MHz @ 1.081v very rarely when it hits 62C, then temps drop back down to the high 50s very quickly.

BF4 didn't move from 2050 MHz through an hour-long match of Metro and never went over 55C.
The rest of my games are the same as BF4.

What's annoying is it will pass every stress test and benchmark at 2050 MHz @ 1.092v but will crash in PUBG after sitting on the menu screen for 3 minutes. It made me start looking at my CPU overclock thinking that was the cause, but nope, it's the graphics card.
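The temperature-stepped behaviour above is GPU Boost 3.0 dropping one voltage/frequency bin per temperature band. As a rough sketch, the reported bins can be expressed as a simple lookup (these thresholds and values are specific to this one card, not a general rule):

```python
# Sketch of the temperature -> (clock, voltage) bins reported above.
# Thresholds and values are one card's observed behaviour, not a spec.
BINS = [
    (45, 2038, 1.081),   # up to 45 C
    (55, 2025, 1.073),   # 46-55 C
    (62, 2012, 1.062),   # 56-62 C
    (999, 2000, 1.050),  # above 62 C
]

def boost_state(temp_c):
    """Return the (core MHz, core volts) bin for a given GPU temperature."""
    for limit, clock, volts in BINS:
        if temp_c <= limit:
            return clock, volts
    return BINS[-1][1:]

print(boost_state(40))  # (2038, 1.081)
print(boost_state(60))  # (2012, 1.062)
```

Each step is one ~13 MHz boost bin, which is why the clocks land on 2038/2025/2012/2000.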


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> Wait a minute.
> 
> What if I used the active backplate (pictured above) alongside my EK water-block, *instead* of with its Kryographics water-block brother?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Then I'd have the benefit of the EK on the front, and an actively cooled Kryographics backplate on the back!
> 
> I've done a bit more digging, and Kryographics have now also released an active backplate for the 1080 Ti. _(Last time I looked it didn't exist.) See below:_
> 
> 
> 
> 
> 
> Anyone see any compatibility issues?
> 
> This could be as good as it's going to get !!


If active backplate cooling is needed, why not get a universal GPU or even CPU waterblock? You would just need longer bolts.

Quote:


> Originally Posted by *jase78*
> 
> I had most of those selected already except for texture filter quality. Funny thing is, I went back to the 381.65 driver from the latest and my scores all went back up! Another thing I've noticed is that when GPU-Z is open I will crash with the same curve?? This is the absolute best I've done at this point.


Yes, the all-important driver. If you get into benching, you will find yourself testing every driver. My favourite, 320.18, was a killer benching driver; it was so good that some people complained it killed their cards, but I got all my best scores off it. That was with my old GTX 770 SLI setup.

http://www.overclock.net/t/1399104/mc-nvidia-320-18-whql-display-driver-is-damaging-gpus/0_20


----------



## TheBoom

Quote:


> Originally Posted by *bmgjet*
> 
> After all week playing with my EVGA 1080ti SC B, Iv come to find voltage doesnt help at all. If anything it makes it worse in some cases.
> 
> Pubg seems to be the hardest game on the card strangely.
> Have based my over clock off whats stable on that, Going over 1.081v makes it less stable at the same and lower clock.
> I have mine set to 2038mhz @ 1.081v under 45c
> Then downclocks to 2025 @ 1.073v at 46-55c
> Then downclocks to 2012 @ 1.062V at 55-62c
> Some times it downclocks to 2000 @ 1.050v above 62c but then comes back down to 60c quickly.
> 
> BF1 is completely different.
> It will hold 2050mhz @ 1.092v all day and only drops to 2038mhz @1.081v very rarely when it hits 62c then temps drop back down to high 50s very quickly.
> 
> BF4 didnt move from 2050mhz with a hour match of metro and never went over 55c.
> Rest of my games are the same as BF4.
> 
> Whats annoying is it will pass every stress test and benchmark with 2050mhz @ 1.092v but will crash in pubg after sitting on menu screen for 3mins, Made me start to look at my CPU overclock thinking it was that doing it but nope. Its the GFX card.


Yeah, I've found the same. My card wouldn't be stable at anything over 1.143v, so I got it to 2088 core and 5.9 mem at 1143mv.

Then I reapplied TIM and suddenly even 1131mv is too much. So now it's back to the drawing board, or should I say curve, trying to get 2050 stable at 1112-1125mv.

What's funny is that at 2088 the highest score I got in the FSU graphics tests was 4680. With 2037 it's 4623. That's only about a 1% improvement.
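As a sanity check on those numbers: going from 2037 to 2088 is about a 2.5% clock bump for roughly a 1.2% score gain, so the score returns less than half the clock increase.

```python
# Compare relative clock gain to relative Fire Strike Ultra graphics score
# gain (numbers from the post above).
clock_lo, clock_hi = 2037, 2088
score_lo, score_hi = 4623, 4680

clock_gain = (clock_hi - clock_lo) / clock_lo * 100
score_gain = (score_hi - score_lo) / score_lo * 100

print(f"clock +{clock_gain:.1f}%, score +{score_gain:.1f}%")  # clock +2.5%, score +1.2%
```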


----------



## feznz

Quote:


> Originally Posted by *TheBoom*
> 
> Yeah I've found the same. My card wouldn't be stable at anything over 1.143v. So I got it to 2088 core and 5.9 mem at 1143mv.
> 
> Then I reapplied TIM and suddenly even 1131mv is too much. So now it's back to the drawing board or should I say curve trying to get 2050 stable at 1112-1125mv.
> 
> What's funny is that at 2088 the highest score I got with FSU graphics tests was 4680. With 2037 it's 4623. That's not even a 1% improvement.


Temp is KEY. Honestly, what do you get at 1.062v? I get almost nothing more by going to 1.092v, and I am on water.
In fact a lot of people are reporting better results from undervolting the card, i.e. benching at 1.00v.
I've got a feeling you are just adding a whole lot of unnecessary heat with too much voltage; Pascal doesn't respond well to voltage unless you've got LN2, of course.


----------



## TheBoom

Quote:


> Originally Posted by *feznz*
> 
> Temp is KEY honestly what do you get with 1.062v? I honestly get almost nothing more by going to 1.092v and I am on water
> In fact a lot of people are reporting better results with undervolting the card i.e bench at 1.00v
> I got a feeling you are just adding a whole lot of unnecessary heat with too much voltage, Pascal doesn't respond well with voltage unless you got LN2 of course.


2025 at 1.062v, if I remember correctly. Better results with undervolting, huh?

I mean, I know you get lower temps, but the last time I saw someone post his undervolting results he had lower performance, although he was okay with it.


----------



## feznz

Quote:


> Originally Posted by *TheBoom*
> 
> 2025 at 1.062v if I remember correctly. Better results with undervolting huh.
> 
> I mean I know you get lower temps but last I saw someone post his results he had lower performance although he was okay with it.


It depends; each card responds differently, and it depends what you are aiming for: absolute peak performance with the knowledge you are teetering on a crash at any second, or rock-solid stability where nothing you can throw at it will crash it.
#are you a bencher or gamer

I aim for 24/7 rock-solid stable. I might throw 1.2v at my card when the weather hits -4°C; I'd have to get up for the 3am bench buzz, but that's the only time I would worry about getting every last FPS. Then again, depending on the bench, I might need to get a Threadripper on the ball.


----------



## blazarbot

I got my Ti FTW3 (the Elite, lol) last Friday but haven't had time to play with it yet.

Once I stick it in (that sounded perverted, sorry haha) I'm sure I'll come to this topic for some tips (not that we can do THAT much overclocking on air anyway).


----------



## TheBoom

Quote:


> Originally Posted by *feznz*
> 
> depends each card responds differently but depends what you are aiming for absolute peak performance with the knowledge you are tittering on crashing at any second or rock solid stable nothing you can throw at it will crash it.
> #are you a bencher or gamer
> 
> I aim for 24/7 rock solid stable I might throw 1.2v at my card when weather hits -4°C will have to get up for the 3am bench buzz but it is the only reason I would worry about getting every last FPS but then again depending on the bench I might have to get a thread ripper on the ball


I would aim for the absolute best within 5% of peak performance, if you know what I mean. If going lower on clock/voltage doesn't reduce my performance by more than 5%, I'll take it.

The person who was posting his undervolting results was showing about a 10-15% reduction in performance, and that's a bit too much IMO.


----------



## smonkie

Hi, I have an MSI 1080 Ti Gaming X which works fine, but I have an OC issue which is driving me nuts:

- If I run Superposition without Afterburner, everything is fine: 6-7 runs in a row with ~5800 points each.

- If I run Superposition with Afterburner running (stock), everything is also fine: 6-7 runs in a row with ~5800 points each.

- If I run Superposition with Afterburner running (+100 voltage / +100 core), I usually get a nice score (6200), but in the runs after that my fps drop below 25, and no matter what I do (reverting to stock, closing Afterburner, letting the card cool down), the results will be the same in the next runs unless I restart the computer. It's very strange, because in these runs GPU use is also at 99-100%, and core voltage and speed are also right. But the fps are abysmally low.

It seems like a software problem, but I've tried different driver versions, and even a clean Windows 10 install, with no luck. It's so damn annoying. Any ideas?


----------



## blazarbot

Now that you mention Superposition, I need to contact EVGA to get the code for the full version.
It was an offer with their 1080 Ti.

I haven't tried that benchmark yet, but it looks really neat.


----------



## clarifiante

I just want to put this out there for any SLI owners who are racking their brains over why their core clocks are not reaching the target they've set in the voltage curves: for whatever reason, one of your cards will always clock to a lower voltage, and both cards will clock down together.

I apologise if it's already been covered, but I could not find anything on the matter.

This assumes you're using MSI AB and have unlocked the voltage curve using the settings in the OP. Also, I did this on my Aorus 1080 Ti Extreme, so I'm not sure whether it is the same on all cards.

You need to set up curves individually for EACH of your cards, so uncheck that stupid "Synchronize settings for similar processors" option in Settings; it's right at the top, below the drop-down menu where you can select each of your cards.

Your core clock target MUST be the same across both cards, as must your mem OC,
BUT
you can have different voltage targets. For example, I am able to get both my cards to 2063, but GPU1 reaches it at 1.05V while GPU2 can reach it at 1.038V.

It is a bit troublesome, because this means you have to apply your OC profile to EACH of your cards, meaning you have to go into settings and hit apply for each card,
BUT
it works and you get the sexy extra FPS.

May the FPS be ever in our favour.


----------



## ELIAS-EH

Finally, now I am one of you guys! Thank god.


----------



## Aganor

Quote:


> Originally Posted by *ELIAS-EH*
> 
> 
> 
> Finally, now i am one of you guys ?Thx god


You need to convert to the WaterCooling gods before a full pledge


----------



## smonkie

This is the Afterburner report when my fps in Superposition begin to fall below 10:










As you can see, GPU use remains at 100%, but power % drops dramatically. I don't have a clue what's going on.









I'm leaning towards a hardware problem of some kind, maybe temperature-related, but it seems weird that the problem won't go away unless I restart.


----------



## KedarWolf

Quote:


> Originally Posted by *smonkie*
> 
> Hi, I have an MSI 1080Ti Gaming X which works fine but I have an issue with OC which is driving me nuts:
> 
> - If I run Superposition without Afterburner, everything is fine: 6, 7 runs in a row with ~5800 points in each.
> 
> - If I run Superposition with Afterburner running (stock), everything is also fine: 6, 7 runs in a row with ~5800 points in each.
> 
> - If I run Superposition with Afterburner running (+100 Voltage/ +100 Core), I usually have a nice score (6200), but in the runs after that my fps drop below 25 and no matter what I do (reverting to stock, closing Afterburner, letting the card cool down),the results will be the same in the next runs unless I restart the computer. It's very strange, because in these runs, GPU use is also at 99-100%, and core voltage and speed is also right. But fps are abismal low.
> 
> It seems like a software problem, but I've tried with different driver versions, and even with a clean Windows 10 instal with no luck. It's so damn annoying. Any ideas?


Means your overclock is unstable and driver is crashing even though it doesn't say it has. Usually it's core that is the issue, not memory.

Try lowering core.


----------



## smonkie

Quote:


> Originally Posted by *KedarWolf*
> 
> Means your overclock is unstable and driver is crashing even though it doesn't say it has. Usually it's core that is the issue, not memory.
> 
> Try lowering core.


You, sir, you are my personal hero.

It just seemed weird that the driver didn't crash and there weren't even any artifacts. Now everything seems to make sense.


----------



## Dasboogieman

Quote:


> Originally Posted by *smonkie*
> 
> You, sir, you are my personal hero.
> 
> It just seemed weird that driver didn't crash, or even no artifacts were present. Now everything seems to make sense.


Pascal is like that; you don't get artifacting like in previous generations anymore, just performance regressions or straight crashes from instability.


----------



## Ghosteri7

Hello, I just got an Asus 1080 Ti Strix OC. (I am a n00b.)

Unfortunately it has coil whine. I have a couple of questions, if someone would like to help me out!

1. My card boosts to 1987 MHz for a bit, then stays at 1949 MHz all the time with the fan at 47% and temps around 67-69C
(OC profile, GPU Tweak II), while playing Mafia 3 at 1080p, 144 Hz (I know, I still need a new monitor). Is that normal, average, or great?

2. I have a non-OC'd 4790K; sometimes in Mafia 3 I have seen frozen-like moments (2 seconds), then everything is fine, and the CPU goes to 70%. Is that a bottleneck?

3. Most importantly, can someone tell me good numbers to try in GPU Tweak II without touching voltages? I tried:
GPU boost clock = +70
Memory clock = +200

I ran Valley for a bit and everything was fine; the GPU started at 2037 MHz then stayed at 2025 MHz. Does that mean this GPU is good, or just normal?

Here is my Valley score from the first run after I installed it, with the normal OC profile:

http://i.share.pho.to/7316a3bb_o.png

Is that considered good or average?

Thanks in advance for any responses!! Don't forget I am a complete n00b!! Be kind.


----------



## Coopiklaani

How do you guys get 108xx on the superposition 4k test? The best I can get is somewhere between 10450-10500. But I get 11000+ on the Timespy graphics score.

I'm rocking a FE card with 2100MHz at the core and 6026.4MHz at the ram.


----------



## Benny89

Quote:


> Originally Posted by *Ghosteri7*
> 
> Hello, I just got asus 1080 ti strix oc. ( I am a n00b )
> 
> Unfortunately it has coil whine
> 
> 
> 
> 
> 
> 
> 
> , I have a couple of questions if someone will like to help me out!
> 
> 1. My card boost to 1987 mhz for a bit then stays at 1949 mhz all the time with fan at 47% , temps around 67-69
> (oc profile gpu tweak 2) , while playing mafia 3 at 1080p ,144 hz (I know , I still need new monitor). is that normal, average or great?
> 
> 2. I have a 4790k non oc, sometimes in mafia 3 I have seen some frozen like moments ( 2 seconds) then everything is fine , cpu goes to 70% , is that a bottleneck?
> 
> 3. Most importantly, can someone tell me good numbers to try in GpuTweak 2 without voltages? I tried:
> GPU Boost clock = +70
> Memory Clock = +200
> 
> I run valley for a bit and everything was fine, gpu started at 2037 mhz then stayed at 2025 mhz, does that mean this gpu is good or just normal?
> 
> Here is my valley score I did first time I installed with normal oc profile:
> 
> http://i.share.pho.to/7316a3bb_o.png
> 
> Is that consider good or average?
> 
> Thanks in advance for any responses!! don't forget I am a complete n00b!! be kind


1. Yes, it's normal. 1987 is your out-of-box boost with Nvidia GPU Boost 3.0. Then your card starts to thermal throttle and downclocks itself to whatever core-clock-to-temperature ratio Boost 3.0 is happy with.

With an OC it's just the same: your clock boosts higher, like that 2037, then thermal throttling kicks in and it downclocks itself.

2037 is very good. Any card that can do above 2000 is very good, so no worries.

2. Yes, probably. 1080p at 144 Hz is very CPU-demanding, and your card is also way overkill for 1080p. You don't need to OC your card; at this resolution it will be limited by the CPU anyway. You need to use DSR or buy a 1440p or 4K monitor, IMO.

3. Each chip/card is different, so go slowly: +13 MHz at a time until you find your max core clock. Then start doing +50 on memory, and then run both at their max together and see if everything is OK.

Use Heaven or Fire Strike; Valley is a little too old.
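The step-by-step search above can be sketched as a simple test ladder. Pascal's boost table moves in 13 MHz bins, hence the +13 increments; this is a hypothetical plan generator, not any tool's actual API:

```python
# Build an overclock test ladder: core offsets in 13 MHz steps
# (Pascal boost bins), then memory offsets in 50 MHz steps.
# The maximums here are arbitrary illustration values.
def oc_ladder(core_max=104, mem_max=400):
    core_steps = list(range(13, core_max + 1, 13))
    mem_steps = list(range(50, mem_max + 1, 50))
    return core_steps, mem_steps

core, mem = oc_ladder()
print(core)  # [13, 26, 39, 52, 65, 78, 91, 104]
print(mem)   # [50, 100, 150, 200, 250, 300, 350, 400]
```

You would test each core offset in turn (bench, check stability), stop at the last stable one, then repeat for memory, and finally run both together.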


----------



## Coopiklaani

Quote:


> Originally Posted by *Ghosteri7*
> 
> Hello, I just got asus 1080 ti strix oc. ( I am a n00b )
> 
> Unfortunately it has coil whine
> 
> 
> 
> 
> 
> 
> 
> , I have a couple of questions if someone will like to help me out!
> 
> 1. My card boost to 1987 mhz for a bit then stays at 1949 mhz all the time with fan at 47% , temps around 67-69
> (oc profile gpu tweak 2) , while playing mafia 3 at 1080p ,144 hz (I know , I still need new monitor). is that normal, average or great?
> 
> 2. I have a 4790k non oc, sometimes in mafia 3 I have seen some frozen like moments ( 2 seconds) then everything is fine , cpu goes to 70% , is that a bottleneck?
> 
> 3. Most importantly, can someone tell me good numbers to try in GpuTweak 2 without voltages? I tried:
> GPU Boost clock = +70
> Memory Clock = +200
> 
> I run valley for a bit and everything was fine, gpu started at 2037 mhz then stayed at 2025 mhz, does that mean this gpu is good or just normal?
> 
> Here is my valley score I did first time I installed with normal oc profile:
> 
> http://i.share.pho.to/7316a3bb_o.png
> 
> Is that consider good or average?
> 
> Thanks in advance for any responses!! don't forget I am a complete n00b!! be kind


You may see better results/higher OC if you can keep your card under 50C at all times.

Again, use MSI afterburner and spend some time on the curve. It is the best way to squeeze every last bit of performance out of your card.


----------



## Ghosteri7

Thanks a lot, Benny89, for that detailed information; I will follow it.

I only did a bit of Valley at 2037, and it was really at 2025 after. I didn't touch voltages.
I need to read and learn a bit more for that.

Thanks, Coopiklaani, for that info. I was confused about how GPU temps and speeds were
acting up; that explains it!! I thought 62C was when throttling starts!

About the coil whine: should I just keep it, or gamble on returning it and getting one without it?
Do people usually get one without coil whine after sending a card back, or is it just luck?

Thanks!!


----------



## nrpeyton

Quote:


> Originally Posted by *Dasboogieman*
> 
> Actually don't forget, going water drops the power draw by 30W due to sheer temperature difference alone.


Can you elaborate on that please?

Interesting....

Quote:


> Originally Posted by *Nico67*
> 
> The terminal on the top needs to line up with the removed terminal holes on the EK, so the screw holes and water channels need to line up.


Damn.

I was hoping it would simply connect via tubing and normal fitments.

I.e. instead of going:

RADIATOR -> GPU BLOCK -> CPU BLOCK -> PUMP/RESERVOIR

go:

RADIATOR -> GPU BLOCK -> GPU BACKPLATE -> CPU BLOCK -> PUMP/RESERVOIR

(connect it as a separate component/block).

So that's not possible?


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> Can you elaborate on that please?
> 
> Interesting....
> 
> Damn.
> 
> I was hoping it would simply connect via tubing and normal fitments.
> 
> I.E instead of going:
> 
> RADIATOR ...> GPU BLOCK ....................................> CPU BLOCK ....> PUMP/RESERVOIR
> 
> go:
> 
> RADIATOR ...> GPU BLOCK ....> GPU BACKPLATE ....> CPU BLOCK ....> PUMP/RESERVOIR
> 
> (connect it as a separate component/block).
> 
> So that's not possible?


I don't think that backplate is compatible with the EK GPU block. Why do you need such a backplate? The EK backplate should do just fine. I actually have an EK Titan X backplate, compatible with your waterblock, for sale.







I can sell it to you with good discount if you want.


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> I don't think that backplate is compatible with the EK GPU block. Why do you need such a backplate? The EK backplate should do just fine. I actually have an EK Titan X compatible with your waterblock for sale.
> 
> 
> 
> 
> 
> 
> 
> I can sell it to you with good discount if you want.


Yeah, maybe.

My block just arrived. YEAH!!!!! So excited!! lol

Here, I just took a picture: _(obviously I never ordered a backplate, so yes, I'd be interested in that @Coopiklaani )_

P.S. The block is *tiny* lol.

And it weighs so much less compared to my 1080 Classy block. _(Correction: 780 Ti Classy block, which was compatible.)_


----------



## tagaxxl

Hey guys, I got a Zotac 1080 Ti AMP Extreme edition, and while I watch videos on YouTube using Waterfox/Chrome it freezes for a bit, and sometimes I get a blue screen with this error:

I did a clean install of the drivers using Display Driver Uninstaller and still get the same error.


----------



## nrpeyton

Now... I'm not sure whether to do the shunt mod before I install the block (to save me the hassle of taking everything apart again),
*or*
if I should just wait and see how I enjoy the cooler temps first.

Any last-minute advice regarding the stock FE cooler/heatsink? I heard they're a pain to get back on again. How did you guys do it? Take notes or a diagram for future reference of what goes where with the screws, etc.?
Or did you all just go for it?

My 1080 Classy was easy... but the FEs, on the other hand?


----------



## KedarWolf

Quote:


> Originally Posted by *tagaxxl*
> 
> hey guys ..i got zotac 1080 ti amp extreme edition and while i watch videos on youtube using waterfox/chrome it freeze for bit and sometimes i get blue screen with this error
> 
> 
> 
> i did a clean install the drivers using display driver uninstaller and still get the same error


Definitely a video card issue.

Check your overclock settings, scale them down if you are overclocked.

Also Google that error message, you may get some suggestions on what may help if you're not overclocked.

If it continues, may be RMA time.


----------



## jim2point0

Strix owners.... did you all get the Strix OC Edition? Or are people having luck with the base model?

I figured the OC Edition might be binned for better overclocks and might have a better chance at winning the silicon lottery.

Problem is, I want to replace my Strix but it's out of stock on Amazon; they only have the $750 version in stock.
Quote:


> Originally Posted by *Coopiklaani*
> 
> How do you guys get 108xx on the superposition 4k test? The best I can get is somewhere between 10450-10500. But I get 11000+ on the Timespy graphics score.


I'd ask the same question about people who can somehow get 10500. I get a whopping 9950.


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Now... I'm not sure whether to do the shunt mod before I install the block (to save me the hassle of taking everything apart again).
> *Or*
> if I should just wait and see how I enjoy the cooler temps first.
> 
> Any last minute advice regarding the stock FE cooler/heatsink.... I heard there're a pain to go back on again. How did you guys do it? Take notes/diagram for future refrence of "what goes where" with screws e.t.c?
> Or did you guys all just go for it?
> 
> My 1080 classy was easy.. but the FE's on the other hand?


If you're not benching, then for everyday gaming use just install the XOC BIOS. I got 99.7% stable in the Time Spy stress test and 99% stable in the Fire Strike Ultra stress test with it; it's my daily driver.

No power limits or temp limits with it.


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> If you're not benching and for every day gaming use, just install the XOC BIOS, got 99.7% stable in Time Spy stress test and 99% stable in Firestrike Ultra stress test with it, my daily driver.
> 
> No power limits or temp limits with it.


I kind of had my heart set on the shunt mod.

Even though it's a "physical mod" in a sense, it still feels "cleaner" to me, as I'm not compromising on the BIOS, i.e. I'll be using the normal/best BIOS for my card (just with a massively increased power limit).

However....

I've not even had a chance to see what water will do for the card on its own, so I'll take your advice for now and hold off on the power mod _(shunt mod)_.

I suppose that can be something to look forward to on another night.


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> Now... I'm not sure whether to do the shunt mod before I install the block (to save me the hassle of taking everything apart again).
> *Or*
> if I should just wait and see how I enjoy the cooler temps first.


Performance-wise, it is quite predictable: at 350W GPU load and 1 GPM flow, you can expect an 11-12°C GPU-to-water delta. Re-seating the waterblock means you'll need to buy a new set of thermal pads for the best performance; thermal pads deform after installation, deformed pads don't make good contact with the components, and they perform worse if you re-seat the waterblock without replacing them.
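Those figures imply a block thermal resistance of roughly 11.5°C / 350 W ≈ 0.033°C/W, which lets you estimate the delta at other loads. A rough sketch, assuming the delta scales linearly with power at a fixed flow rate:

```python
# Estimate GPU-to-water delta from the figures above:
# ~11-12 C at 350 W and 1 GPM implies ~0.033 C/W block resistance.
R_BLOCK = 11.5 / 350  # degC per watt, rough midpoint

def gpu_water_delta(power_w):
    """Predicted GPU-to-water temperature delta at a given board power."""
    return power_w * R_BLOCK

print(f"{gpu_water_delta(250):.1f} C")  # ~8.2 C at 250 W
print(f"{gpu_water_delta(350):.1f} C")  # ~11.5 C at 350 W
```

Note this only covers the block; the water-to-ambient delta depends on radiator capacity and is separate.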


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> Performance wise, it is quite predictable, under 350w GPU load and 1GPM flow, you expect to get 11-12C water-GPU delta C. Resitting the waterblock means you'll need to buy a new set of thermal pads for the best performance. Thermal pads will deform after the installation. deformed themal pads do not have good contacts to the components. they perform worse if you decide to resit the waterblock without replacing them.


hmm that's true I suppose, good point indeed.

I'll have something to eat and think it over, then.

Anyway in the meantime; I've updated my signature to include the new block ;-)


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> hmm that's true I suppose, good point indeed.
> 
> I'll have something to eat and think it over, then.
> 
> Anyway in the meantime; I've updated my signature to include the new block ;-)


Nice.
Here's a shot of my shunt mod with silver paint and my EK block.
You will definitely need a good pair of tweezers and good hand-eye coordination to do this mod with those tiny 0805 resistors.


----------



## Coopiklaani

repo, deleted


----------



## ELIAS-EH

Guys, does anyone here have a 6900K? If so, at what voltage is it stable at 4.3 GHz?
Thank you.


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> Nice,
> Here's a shot of my shunt mod with silver paint and my EK block.
> You will definitely need a good pair of tweezers and good eye-hand coordination to do this mod with those tiny 0805 resistors.


So liquid metal here _(or silver paint, but I have liquid metal, so liquid metal)_, here & here? (The 3 red lines?)

*Or not?*

Also, can you remind me of the benefit of this method over the traditional shunt mod, please? I remember it was something to do with being able to calculate the new actual power draw?

And a higher power ceiling than the shunt mod too?


----------



## Coopiklaani

Quote:


> Originally Posted by *ELIAS-EH*
> 
> Guys, did someone here has a 6900k? If yes, at which voltage it is stable at 4.3 ghz ?
> Thank u


I'm rocking a 6850K at 1.35v, 4.5 GHz. It all depends on the silicon lottery, but as a rule of thumb, staying under 1.4v for 24/7 use is usually fine.


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> So liquid metal here, here & here? (3 red lines)?
> 
> *Or not?*
> 
> 
> 
> Also can you remind me the benefit to this method over the traditional shunt mod again please? I remember it was something to do with being able to calculate the new actual power draw??


Pros:
- You can accurately measure the power draw with a multiplier.
- It's much more stable, as the resistors in this mod carry virtually zero current, whereas in the traditional shunt mod the liquid metal carries half of your card's current. You may eventually end up like this as the liquid metal slowly reacts with and corrodes the solder.
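For anyone unfamiliar with the maths behind that multiplier: lowering the effective sense resistance by a known ratio scales the card's power reading down by the same ratio, so the real draw is the reported value times a fixed factor. A sketch of the general idea (the 5 mΩ values are illustrative assumptions; the exact multiplier for the 47 Ω resistor variant depends on the sense-circuit components):

```python
# General idea behind a controlled shunt mod: lowering the effective sense
# resistance by a known ratio scales the card's power reading by that ratio,
# so actual power = reported power * (R_original / R_effective).
# The resistance values below are illustrative, not any card's real parts.

def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

R_SHUNT = 0.005  # original sense resistance (ohms), assumed
R_ADDED = 0.005  # resistance placed in parallel, assumed
r_eff = parallel(R_SHUNT, R_ADDED)
multiplier = R_SHUNT / r_eff  # card now reports 1/multiplier of real power

reported_w = 150
actual_w = reported_w * multiplier
print(f"multiplier {multiplier:.2f}, actual {actual_w:.0f} W")  # multiplier 2.00, actual 300 W
```

Because the added resistance is a known, fixed value, the skew is deterministic, which is exactly the "controlled manner" advantage over smearing liquid metal across the shunt.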


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> Pros.
> Being able to accurately measure the power draw with a mulpilier
> Much more stable as the resistors in this mod take virtually 0 current whereas in the traditional shunt mod, the liquid metal takes half of the current for your card. You may finally result in this as the liquid metal slowly reacts and corrupts the solder.


That's a picture of your old card where the resistor fell off after the shunt mod, isn't it?

Was I right about the three lines in my last comment?

And if I was, can I use liquid metal instead of silver paint?


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> That's a picture of your old card when the resistor fell off after shunt mod isn't it?
> 
> Was I right about the three lines in my last comment?
> 
> And If I was-- can I use liquid metal instead of silver paint?


no.. you're supposed to put 3x 47 ohm resistors over them, not liquid metal. The idea is to lower the resistance over those caps in a controlled manner, so you know exactly how much you have skewed the power measurement.
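
A rough sketch of the multiplier arithmetic behind this. The series sense-resistance value below is a placeholder assumption (measure your own board), not a confirmed 1080 Ti FE value; the point is only that with known resistors the divider ratio, and therefore the real-power multiplier, is calculable:

```python
# Hedged sketch: how a known resistor across the sense-filter cap skews
# the power reading by a calculable factor. ALL resistor values here are
# placeholder assumptions, NOT confirmed 1080 Ti FE values.

R_series = 10.0   # assumed series resistance in each sense line, ohms
R_added  = 47.0   # the 0805 resistor placed across the filter cap, ohms

# The added resistor forms a voltage divider with the two series
# sense-line resistors, so the monitor only sees a fraction of the
# true shunt voltage:
fraction = R_added / (R_added + 2 * R_series)

# To recover the real draw, multiply what the card reports:
multiplier = 1 / fraction

reported_watts = 250.0  # e.g. the card claims it is sitting at its 250 W limit
true_watts = reported_watts * multiplier
print(f"sensed fraction: {fraction:.3f}")
print(f"real-power multiplier: {multiplier:.3f}")
print(f"card reports {reported_watts:.0f} W -> actually ~{true_watts:.0f} W")
```

With liquid metal instead, the parallel resistance is whatever the blob happens to be, so no such multiplier can be computed — which is the whole point of using fixed resistors.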


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> no.. you suppose to get 3x47ohm resistors over them, not liquid metal. The idea is to lower the resistance over those caps in a controlled manner, so you know exactly how much you have skewed the power measurement.


So paint the top of the resistors with Liquid Metal or Silver paint
Then sit the 47 ohm resistor on top of it? _(The LM/silver paint will hold it in place)_?

*Like this?*

*Top*

..........................
.47 ohm resistor.
.........................
~~~~~~~~~~~~~~~
~Silver Paint / Liquid Metal~
~~~~~~~~~~~~~~~
__________________
0 8 0 5 . R e s i s t o r

*Bottom*


----------



## lilchronic

Quote:


> Originally Posted by *nrpeyton*
> 
> So paint the top of the resistors with Liquid Metal or Silver paint
> Then sit the 47 ohm resistor on top of it? _(The LM/silver paint will hold it in place)_?
> 
> *Like this?*
> 
> *Top*
> 
> ..........................
> .47 ohm resistor.
> .........................
> ~~~~~~~~~~~~~~~
> ~Silver Paint / Liquid Metal~
> ~~~~~~~~~~~~~~~
> __________________
> 0 8 0 5 . R e s i s t o r
> 
> *Bottom*


Nope


----------



## nrpeyton

Quote:


> Originally Posted by *lilchronic*
> 
> Nope


K can you link me to the original post then? I believe it was last month when I read something about this?


----------



## Coopiklaani

Quote:


> Originally Posted by *Coopiklaani*
> 
> no.. you suppose to get 3x47ohm resistors over them, not liquid metal.


Quote:


> Originally Posted by *nrpeyton*
> 
> So paint the top of the resistors with Liquid Metal or Silver paint
> Then sit the 47 ohm resistor on top of it? _(The LM/silver paint will hold it in place)_?
> 
> *Like this?*
> 
> *Top*
> 
> ..........................
> .47 ohm resistor.
> .........................
> ~~~~~~~~~~~~~~~
> ~Silver Paint / Liquid Metal~
> ~~~~~~~~~~~~~~~
> __________________
> 0 8 0 5 . R e s i s t o r
> 
> *Bottom*


Like this
Quote:


> Originally Posted by *Coopiklaani*
> 
> no.. you suppose to get 3x47ohm resistors over them, not liquid metal.


Quote:


> Originally Posted by *Coopiklaani*
> 
> I posted a brief instruction like, a thousand posts ago. Basically I used arctic thermal glue to secure the resistors and silver paint to make the connection.
> Links here:
> https://www.amazon.co.uk/Arctic-Silver-Thermal-Adhesive-ASTA-7G/dp/B01I76H3HQ/ref=sr_1_1?ie=UTF8&qid=1494340041&sr=8-1&keywords=arctic+thermal+glue
> https://www.amazon.co.uk/Conducting-Silver-conductive-repairing-heaters/dp/B000YJ2P2I/ref=sr_1_2?ie=UTF8&qid=1494340057&sr=8-2&keywords=conductive+paint
> 
> Or you can get away with this conductive glue to do both. but it has a huge resistance. (10-30ohm over a 1mm gap as I tested earlier)
> https://www.amazon.co.uk/PCB-Soldering-835-2699-Electric-Paint/dp/B00CSMDT8S/ref=sr_1_1?ie=UTF8&qid=1494340140&sr=8-1&keywords=conductive+paint
> 
> 
> 
> A link to my post here:
> http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/7160


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> Like this


That picture makes more sense. rep+

Where are the 47 ohm resistors on your photo?


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> That's a picture of your old card when the resistor fell off after shunt mod isn't it?
> 
> Was I right about the three lines in my last comment?
> 
> And If I was-- can I use liquid metal instead of silver paint?


I was wondering about the old-school pencil mod across those resistors.
The inverted ATX case is looking more attractive by the day; since heat rises into the PCB, GPUs are technically mounted upside-down.
Quote:


> Originally Posted by *TheBoom*
> 
> I would aim for the absolute best within 5% performance if you know what I mean. If going lower than a certain clock/voltage doesn't reduce my performance more than 5% I'll take it.
> 
> The person who was posting his results with undervolting was showing about 10-15% reduction in performance and that's a bit too much IMO.


That depends on your card; for some people it works and some cards just don't like it. But it is more about helping with thermal throttling over longer gaming periods: the more heat accumulates in the area surrounding the chip, the more likely you are to thermal throttle.
Quote:


> Originally Posted by *tagaxxl*
> 
> hey guys ..i got zotac 1080 ti amp extreme edition and while i watch videos on youtube using waterfox/chrome it freeze for bit and sometimes i get blue screen with this error
> 
> 
> 
> i did a clean install the drivers using display driver uninstaller and still get the same error


I had the same issue. It was a corrupt OS; I did a clean install of Windows and the problem was solved.
Funny thing was, this problem never arose until I put my 1080 Ti in.


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> That picture makes more sense. rep+
> 
> Where are the 47 ohm resistors on your photo?


Before, there are 3 caps


Put an 0805 47 ohm resistor on top of each cap and secure it with thermal glue (or any non-conductive glue)


Finally, apply some silver paint at both ends of each resistor to make the connection


Tools used:
silver paint and thermal glue



Schematics:


Structure:


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> Before, there are 3 caps
> 
> 
> Put a 0805 47ohm resistor on top of each cap, secure them with thermal glue (or any nonconductive glue)
> 
> 
> Finally, apply some silver paint at both ends of the resistor to make the connection
> 
> 
> Tools used:
> silver paint and thermal glue
> 
> 
> 
> Schematics:
> 
> 
> Structure:


Excellent, that all makes perfect sense now. rep+

I'll need to order the parts.

I've found most of them on ebay already.

Anyway it doesn't even look like I'll be able to put the new block on tonight.

I can't find a screwdriver small enough for the miniature screws on the backplate. What the hell was Nvidia thinking..

I do own some VERY small screwdrivers, but omg these things are so small it's ridiculous. lol

Never had this problem with EVGA.

Will have to visit a shop on Thursday. (I don't see why EK can't include something in the bag.. it only has to be a bit of metal able to get into the screws on the backplate -- even something cheap that needs to be gripped with something else, to save money.. but SOMETHING.. it would have made life so much easier.) It almost seems amazing (in a bad way) that I can't fit my block tonight because of this.


----------



## Coopiklaani

Quote:


> Originally Posted by *nrpeyton*
> 
> Excellent, that all makes perfect sense now. rep+
> 
> I'll need to order the parts.
> 
> I've found most of them on ebay already.
> 
> Anyway it doesn't even look like I'll be able to put the new block on tonight.
> 
> I can't find a screw driver small enough for miniature screws on the backplate. What the hell were nvidia thinking..
> 
> I do own some VERY small screw drivers. but omg these things are so small it's ridiculous. lol
> 
> never had this problem with EVGA.
> 
> Will have to visit a shop on Thursday. (don't see why EK can't include something in the bag.. only has to be a bit of metal able to get into the screws on backplate-- even something cheap that needs to be gripped with something else to save money.. but SOMETHING.. would have made life so much easier). almost seems amazing (negatively) that I can't do my block tonight due to this.


don't forget those hexkeys/hexscrews underneath the tiny screws.


----------



## nrpeyton

Quote:


> Originally Posted by *Coopiklaani*
> 
> don't forget those hexkeys/hexscrews underneath the tiny screws.


nvidia.......................................................


----------



## CptSpig

Quote:


> Originally Posted by *Coopiklaani*
> 
> don't forget those hexkeys/hexscrews underneath the tiny screws.


4mm Socket! Do not use pliers; you will damage the board.


----------



## nrpeyton

Quote:


> Originally Posted by *CptSpig*
> 
> 4mm Socket! Do not use pliers you will damage the board.


I just *tried a 1.4mm precision* screwdriver which I found in a "mini screwdriver kit" in the loft.

And it's *STILL too big* for the miniature screws on the back plate. (1080Ti FE).

Anyone know what size it actually takes? So I know what to ask for from Shop or maybe Ebay or a friend tomorrow.

Edit: It may have been a 2mm I tried (there's one missing from the kit).


----------



## CptSpig

Quote:


> Originally Posted by *nrpeyton*
> 
> I just *tried a 1.4mm precision* screwdriver which I found in a "mini screwdriver kit" in the loft.
> 
> And it's *STILL too big* for the miniature screws on the back plate. (1080Ti FE).
> 
> Anyone know what size it actually takes? So I know what to ask for from Shop or maybe Ebay or a friend tomorrow.
> 
> Edit: It may have been a 2mm I tried (there's one missing from the kit).


The 4mm socket I was talking about is under the backplate. For the small screws I just used an eyeglass screwdriver and it worked like a champ.


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CptSpig*
> 
> 4mm Socket! Do not use pliers you will damage the board.
> 
> 
> I just *tried a 1.4mm precision* screwdriver which I found in a "mini screwdriver kit" in the loft.
> 
> And it's *STILL too big* for the miniature screws on the back plate. (1080Ti FE).
> 
> Anyone know what size it actually takes? So I know what to ask for from Shop or maybe Ebay or a friend tomorrow.
> 
> Edit: It may have been a 2mm I tried (there's one missing from the kit).

https://goo.gl/images/CEjH8j has everything you need, I used this with my EK waterblock and backplate.

Better link.

https://m.canadiantire.ca/products/productDetail/0573624P?storeClearance=false&selectedSku=0573624&isNonOnline=false&isAutomotive=false&isBarcodeValue=false&serverName=ATLAS


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> https://goo.gl/images/CEjH8j has everything you need, I used this with my EK waterblock and backplate.
> 
> Better link.
> 
> https://m.canadiantire.ca/products/productDetail/0573624P?storeClearance=false&selectedSku=0573624&isNonOnline=false&isAutomotive=false&isBarcodeValue=false&serverName=ATLAS


That first link is exactly what I have (if you view the image then click 'visit page').

1.4mm is the first one then it goes up to 2mm.

So it must be the 1.4mm that is missing from my kit.

Damnit. I'll just grab a new one from Ebay. Hopefully have it by Thursday which is my day off.


----------



## jura11

Quote:


> Originally Posted by *nrpeyton*
> 
> I just *tried a 1.4mm precision* screwdriver which I found in a "mini screwdriver kit" in the loft.
> 
> And it's *STILL too big* for the miniature screws on the back plate. (1080Ti FE).
> 
> Anyone know what size it actually takes? So I know what to ask for from Shop or maybe Ebay or a friend tomorrow.
> 
> Edit: It may have been a 2mm I tried (there's one missing from the kit).


Hi there

This kit I bought and worked for me

http://www.toolstation.com/shop/p22626?gclid=CjwKEAjwpdnJBRC4hcTFtc6fwEkSJABw

Hope this helps

Thanks, Jura


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> https://goo.gl/images/CEjH8j has everything you need, I used this with my EK waterblock and backplate.
> 
> Better link.
> 
> https://m.canadiantire.ca/products/productDetail/0573624P?storeClearance=false&selectedSku=0573624&isNonOnline=false&isAutomotive=false&isBarcodeValue=false&serverName=ATLAS
> 
> 
> 
> That first link is exactly what I have. (if you view the image then click 'visit page'
> 
> 1.4mm is the first one then it goes up to 2mm.
> 
> So it must be the 1.4mm that is missing from my kit.
> 
> Damnit. I'll just grab a new one from Ebay. Hopefully have it by Thursday which is my day off.

Second link, first one redirects wrong, I'm on my phone

It has the 4mm socket too.


----------



## FUZZFrrek

Hi there fellas! I'll be a new member of the club soon enough! I just bought a FTW3!! I'll keep you posted on how it fares on the OC, temps and noise front.


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> https://goo.gl/images/CEjH8j has everything you need, I used this with my EK waterblock and backplate.
> 
> Better link.
> 
> https://m.canadiantire.ca/products/productDetail/0573624P?storeClearance=false&selectedSku=0573624&isNonOnline=false&isAutomotive=false&isBarcodeValue=false&serverName=ATLAS
> 
> 
> 
> That first link is exactly what I have. (if you view the image then click 'visit page'
> 
> 1.4mm is the first one then it goes up to 2mm.
> 
> So it must be the 1.4mm that is missing from my kit.
> 
> Damnit. I'll just grab a new one from Ebay. Hopefully have it by Thursday which is my day off.
> 
> 
> Second link, first one redirects wrong, I'm on my phone
> 
> It has the 4mm socket too.

And the hex heads for the waterblock.


----------



## nrpeyton

Quote:


> Originally Posted by *FUZZFrrek*
> 
> Hi there fella! I'll be a new member of the Club soon enough! I just bought a FTW3!! I'll keep you in touch on how does it face on the OC, temps and noise side.


Welcome to the club.

Look forward to seeing how you get on.

And thanks to everyone here for their replies, this evening.


----------



## TheBoom

Does anyone have the post or guide for tweaking Nvidia settings for 3D Mark?

Would be good if we had it pinned on the OP.


----------



## richiec77

Quote:


> Originally Posted by *TheBoom*
> 
> Does anyone have the post or guide for tweaking Nvidia settings for 3D Mark?
> 
> Would be good if we had it pinned on the OP.


Guru3D has one.

http://www.guru3d.com/articles-pages/geforce-gtx-1080-overclocking-guide-with-afterburner-4-3,1.html

Also, use GPU-Z along with AB when doing memory OC. Just noticed this possible time saver about 30 min ago. When adjusting memory up in small increments, you'll make a change from say 340-345 and see nothing happens. If you look at GPU-Z sensor page, you'll see it hasn't changed the memory clock at all. Adjust 1-2 MHz up or down to find the actual change point.

GDDR5 and GDDR5X memory both appear to act like this, and it's been shown elsewhere in a graph that there are discrete memory steps. It's not linear at all.

There also appears to be a bit of correlation between even and odd memory frequencies. E.g. 1511 was a downclock, but 1520 gained. It's a 4x ratio between GPU-Z and AB: 1522 in GPU-Z is 6088 in AB.

Might need to investigate this more. Could be something like the memory preferring bumps of 10 or 20 rather than smaller increments.
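
The clock bookkeeping above can be sketched as a couple of helpers. The 4x GPU-Z-to-Afterburner ratio comes straight from the 1522 -> 6088 example; the 8x "effective" GDDR5X data rate is my own assumption:

```python
# Helpers for the GPU-Z <-> Afterburner memory-clock bookkeeping
# described above. The 4x ratio matches the forum example
# (1522 MHz in GPU-Z -> 6088 MHz shown in Afterburner); the 8x
# "effective" GDDR5X rate is an assumption on my part.

def gpuz_to_afterburner(gpuz_mhz):
    """Afterburner displays 4x the GPU-Z memory clock on these cards."""
    return gpuz_mhz * 4

def gpuz_to_effective(gpuz_mhz):
    """GDDR5X transfers 8 bits per clock per pin (assumed), so the
    marketing 'effective' rate is 8x the GPU-Z clock."""
    return gpuz_mhz * 8

print(gpuz_to_afterburner(1522))  # matches the 6088 seen in AB
print(gpuz_to_effective(1376))    # stock clock -> the rated ~11 Gbps
```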


----------



## TheBoom

Quote:


> Originally Posted by *richiec77*
> 
> Guru3D has one.
> 
> http://www.guru3d.com/articles-pages/geforce-gtx-1080-overclocking-guide-with-afterburner-4-3,1.html
> 
> Also, use GPU-Z along with AB when doing memory OC. Just noticed this possible time saver about 30 min ago. When adjusting memory up in small increments, you'll make a change from say 340-345 and see nothing happens. If you look at GPU-Z sensor page, you'll see it hasn't changed the memory clock at all. Adjust 1-2 MHz up or down to find the actual change point.
> 
> GDDR5 and 5x memory appear to both act like this and it's been shown somewhere else in a graph that's there is Memory steps. It's not linear at all.
> 
> There also appears to be a little bit of correlation between even memory freq vs odd. Aka 1511 was a downclock, but 1520 gained. It's a 4x ratio between GPU-Z and AB. 1522 is 6088 in AB.
> 
> Might need to investigate this more. Could be something like the memory likes 10 or 20 bumps vs smaller gains.


I think you misunderstood me. I meant the tweaking of Nvidia Control Panel 3D Program settings for 3DMark specifically which would give higher scores regardless of overclock.

Also, about the memory overclock: it's been discussed a bit in this thread. The memory clock on the left side of AB should show the correct clock when you bump the memory clock up or down within a particular step. And yes, the timings change as you change the memory clock itself; that's why you sometimes see lower clocks giving better performance. You need to find the highest stable memory OC for your GPU and then work both downwards and upwards in small increments until you find the highest scores in your preferred benchmark of choice. Though I would say it's easier to tell the difference for memory specifically with 4K or 8K benchmarks.
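
That search procedure can be sketched as a simplified downward sweep from the stability ceiling. `set_mem_offset` and `run_benchmark` are hypothetical placeholders for whatever tooling you actually drive (e.g. Afterburner profiles plus a scripted benchmark run); only the search logic itself is shown:

```python
# Hedged sketch of the memory-OC search described above. set_mem_offset()
# and run_benchmark() are hypothetical stand-ins for your real tooling.

def find_best_offset(set_mem_offset, run_benchmark, max_stable, step=15):
    """Step down from the highest stable offset in small increments and
    keep whichever setting scores best. Because timings shift with the
    clock, a lower offset can genuinely outscore a higher one."""
    best_offset, best_score = None, float("-inf")
    for offset in range(max_stable, max_stable - 10 * step, -step):
        set_mem_offset(offset)      # apply the candidate memory offset
        score = run_benchmark()     # run the chosen bench and read the score
        if score > best_score:
            best_offset, best_score = offset, score
    return best_offset, best_score
```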


----------



## Robilar

Really loving my 1080ti. Runs bf1 at 120hz on the new monitor without a single dip. Haven't even bothered overclocking it. Quiet and powerful.

I may finally have a setup that won't require tweaking or serious overclocking any time soon.


----------



## wkdsean2014

I just got my iChill X4 two days ago. I've been overclocking and tinkering with it since then and finally found a setting I'm happy with. This card runs really quietly, even with fans at 100%. Best thing I ever did, upgrading from a 295X2.


----------



## ELIAS-EH

Yes, with a G-Sync monitor, capping fps at 120 in BF1 is amazing. I love straight lines in perfoverlay.


----------



## Lefty23

Quote:


> Originally Posted by *TheBoom*
> 
> Does anyone have the post or guide for tweaking Nvidia settings for 3D Mark?
> 
> Would be good if we had it pinned on the OP.


You can find a NVCP tweaks section in the OP of some of the benchmarking threads here on OCN . For example:
http://www.overclock.net/t/1627767/top-30-unigine-superposition-benchmark/0_100#post_26006658
or
http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0/0_100#post_19292884

Even though these threads are for Unigine benchmarks the tweaks should apply for 3D Mark as well.
At the very least I would change the following on the profile of the program you are interested in:
>Multi-display/mixed-GPU acceleration: change to Single display performance mode
>Power management mode: change to Prefer maximum performance
>Texture filtering - Quality: change to High performance
>Vertical sync: change to Off

You could try changing some of the other settings too but, I have not tried them so I'm not sure what impact - if any - they would have.

Quote:


> Originally Posted by *Coopiklaani*
> 
> How do you guys get 108xx on the superposition 4k test? The best I can get is somewhere between 10450-10500. But I get 11000+ on the Timespy graphics score.
> 
> I'm rocking a FE card with 2100MHz at the core and 6026.4MHz at the ram.


That is a very nice card.

I have uninstalled all the benchmarking apps a few weeks ago when I realized that I spent 90% of my (limited) spare time running benchmarks on my "gaming PC".
I have kept a lot of results though and I was getting similar scores at around the same clocks:
Timespy @2100/6003 (graphics score 11102): http://www.3dmark.com/spy/1479227
SuPo 4K Opt @2100/6075: 10501

As I have mentioned earlier in this thread SuPo in my experience really likes mem OC.
My highest score was 10785 with the card @2114/6277 and the 4 tweaks I mentioned in my reply to TheBoom above.
I could run the core up to 2139 in 4K SuPo but I was losing headroom in the memory OC, resulting in worse overall scores (high 106xx - low 107xx).
In all these runs max GPU core temp was ~37C.

In order to go higher I would have to perform a h/w mod (shunt or resistor), or find a Bios with higher power limit that works well with my card (still on EVGA FE Bios since it gives me the best perf from the ones I have tried).
The one 108xx score I remember in this thread was from KraxKill who is using a shunt modded card and let's say "exotic" cooling. For his 108xx ++ run his GPU core temp was -9C min to 8C max in his chillbox.
Also Hulio225 has very high scores in SuPo 4K (don't remember if he "broke" 10800) but I think he is also shunt modded and has a very good WC loop.
I think both of them run high memory OC too, but maybe they have seen a different behavior than the one I have observed.


----------



## ilikelobstastoo

Quote:


> Originally Posted by *Lefty23*
> 
> You can find a NVCP tweaks section in the OP of some of the benchmarking threads here on OCN . For example:
> http://www.overclock.net/t/1627767/top-30-unigine-superposition-benchmark/0_100#post_26006658
> or
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0/0_100#post_19292884
> 
> Even though these threads are for Unigine benchmarks the tweaks should apply for 3D Mark as well.
> At the very least I would change the following on the profile of the program you are interested in:
> >Muti-display/mixed-GPU acceleration change to Single display performance mode
> >Power management mode change to Prefer maximum performance
> >Texture filtering - Quality change to High performance
> >Vertical sync changed to off
> 
> You could try changing some of the other settings too but, I have not tried them so I'm not sure what impact - if any - they would have.
> That is a very nice card.
> 
> I have uninstalled all the benchmarking apps a few weeks ago when I realized that I spent 90% of my (limited) spare time running benchmarks on my "gaming PC".
> I have kept a lot of results though and I was getting similar scores at around the same clocks:
> Timespy @2100/6003 (graphics score 11102): http://www.3dmark.com/spy/1479227
> SuPo 4K Opt @2100/6075: 10501
> 
> As I have mentioned earlier in this thread SuPo in my experience really likes mem OC.
> My highest score was 10785 with the card @2114/6277 and the 4 tweaks I mentioned in my reply to TheBoom above.
> I could run the core up to 2139 in 4K SuPo but I was losing headroom in the memory OC resulting in worst overall scores (high 106xx - low 107xx).
> In all these runs max GPU core temp was ~37C.
> 
> In order to go higher I would have to perform a h/w mod (shunt or resistor), or find a Bios with higher power limit that works well with my card (still on EVGA FE Bios since it gives me the best perf from the ones I have tried).
> The one 108xx score I remember in this thread was from KraxKill who is using a shunt modded card and let's say "exotic" cooling. For his 108xx ++ run his GPU core temp was -9C min to 8C max in his chillbox.
> Also Hulio225 has very high scores in SuPo 4K (don't remember if he "broke" 10800) but I think he is also shunt modded and has a very good WC loop.
> I think both of them run high memory OC too, but maybe they have seen a different behavior than the one I have observed.


I'm jealous AF of that score and that clock speed. I can hit 6100 memory but can only touch 2100 a few times during the SuPo 4K bench, I think scenes 7 through 9 maybe. Temps max around 36. Max score is 10460 or so.

Do you have any pics of the curve that gets you that score? Not that my FE could run it, but just curious. Also, do you remember which driver you were on at the time? Because I've noticed less performance with the last couple.


----------



## Lefty23

Quote:


> Originally Posted by *ilikelobstastoo*
> 
> Do you have any pics of the curve that gets you that score ? Not that my fe could run it but just curious. Also do u remeber which driver you were on at the time ? Cause I've noticed less perf. With laSt couple.


This run was on the 382.19 driver. I got similar results on 381.89 driver (10784).
As mentioned earlier I have uninstalled all benchmarking apps so I haven't tried the latest drivers.

Check the spoiler below for screen shot with the curve


Spoiler: Warning: Spoiler!





Right click & open in new tab. In case it is not visible from the screenshot:
[email protected]
[email protected]
[email protected]
[email protected]
[email protected]
[email protected]
[email protected]
one core clock bin for one voltage bin below that.



This is as high as my memory would go (+775 in AfterBurner). I start seeing artifacts around +760 and crash above +780.
As you can see I hit the power limit a lot so most of the time I'm at 2063/2088 on the core clock.
What I found from multiple runs is that I would get better scores this way. I know other people claim to get better scores by optimizing the curve around the clock at which they stay for most of the run (say 2075/2088 in this case).
For me even those brief moments I spent at 2100-2113 help to get higher scores.


----------



## lilchronic

Quote:


> Originally Posted by *Lefty23*
> 
> This run was on the 382.19 driver. I got similar results on 381.89 driver (10784).
> As mentioned earlier I have uninstalled all benchmarking apps so I haven't tried the latest drivers.
> 
> Check the spoiler below for screen shot with the curve
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> Right click & open in new tab. In case it is not visible from the screenshot:
> [email protected]
> [email protected]
> [email protected]
> [email protected]
> [email protected]
> [email protected]
> [email protected]
> one core clock bin for one voltage bin below that.
> 
> 
> 
> This is as high as my memory would go (+775 in AfterBurner). I start seeing artifacts around +760 and crash above +780.
> As you can see I hit the power limit a lot so most of the time I'm at 2063/2088 on the core clock.
> What I found from multiple runs is that I would get a better scores this way. I know other people claim to get better scores by optimizing the curve around the clock at which they stay for most of the run (say 2075/2088 in this case).
> For me even those brief moments I spent at 2100-2113 help to get higher scores.


Sounds similar to my card, well, the mem OC does. At +780 I start to see artifacts.


----------



## KraxKill

Quote:


> Originally Posted by *Lefty23*
> 
> This run was on the 382.19 driver. I got similar results on 381.89 driver (10784).
> As mentioned earlier I have uninstalled all benchmarking apps so I haven't tried the latest drivers.
> 
> Check the spoiler below for screen shot with the curve
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> Right click & open in new tab. In case it is not visible from the screenshot:
> [email protected]
> [email protected]
> [email protected]
> [email protected]
> [email protected]
> [email protected]
> [email protected]
> one core clock bin for one voltage bin below that.
> 
> 
> 
> This is as high as my memory would go (+775 in AfterBurner). I start seeing artifacts around +760 and crash above +780.
> As you can see I hit the power limit a lot so most of the time I'm at 2063/2088 on the core clock.
> What I found from multiple runs is that I would get a better scores this way. I know other people claim to get better scores by optimizing the curve around the clock at which they stay for most of the run (say 2075/2088 in this case).
> For me even those brief moments I spent at 2100-2113 help to get higher scores.


Awesome score man!! That Haswell is showing some strength! What Ram are you running?

My best on my 4790K @ 5.2ghz 2133CL9 Ti 2100mhz mem @ 6106 Shunted


Same exact setup on a 5775C @ 4.5ghz 2133CL9 2100mhz mem @ 6106 Shunted


----------



## nrpeyton

For tools to remove FE cooler + backplate; visited two hardware shops on High Street at lunch break.


Only 1 shop had 1.4mm screwdriver for backplate.
*Neither* shop had 4mm hex for back of PCB. _(They both had normal hex but not the hollow ones to grip the outside of the screws)._
So for the hex I was forced to use pliers. To protect the PCB (in case the pliers slip) I used electrical tape around the screws.

Total nightmare! Is there anyone who works in the industry here who can argue why these obscure screws are *necessary*?

The only thing Nvidia achieved with this stunt is wasting *my* time.



Anyway on a happier note, 7 down--7 to go.. this is painstaking with the pliers. lol

_P.S. I've written to EK with feedback suggesting they include a cheap set of the 1.4mm screwdriver + 4mm hollow hex for reference design blocks.
The EK rep promised to pass this onto the correct person._


----------



## Lefty23

Quote:


> Originally Posted by *lilchronic*
> 
> Sounds similar to my card well the mem oc does. +780 i start to see artifact's
> .


Yeah, about a month ago there was a discussion about how to use OCCT to find where your memory OC begins giving errors.
Following that I found out that my memory is stable up to +746; after that I get errors. I already kind of knew that, because that was the memOC (+750) where I would start getting worse results in 3DMark benches.
However in Superposition I get higher scores up to the point I crash.
Quote:


> Originally Posted by *KraxKill*
> 
> What Ram are you running?


I'm using 2x8GB G-Skill F3-2400C10 and running the default XMP profile (only changed command rate 2T->1T)
RAM is the only thing I never felt comfortable OCing

Screenshot from the same run with GPU-Z & CPU-Z:


Spoiler: Warning: Spoiler!






Quote:


> Originally Posted by *nrpeyton*
> 
> _P.S. I've written to EK with feedback suggesting they include a cheap set of the 1.4mm screwdriver + 4mm hollow hex for reference design blocks.
> The EK rep promised to pass this onto the correct person._


I wish they would too. I had to go to 6 different stores before I found the 4mm hex and even then I had to buy a whole set of ~20 pieces I'll never use.
But, as long as they are selling them in their webshop I'm not sure they will:
https://www.ekwb.com/shop/hex-socket-4mm
Let us know if you hear back from them.

Also, in the off chance this would help you....
Regarding your question about how to handle reassembly of the stock cooler if it is ever needed:
What I always do is put the thing back together right after the disassembly, but without the PCB obviously.
So if I ever have to put the whole thing back together after a year or two, I know where everything goes


Spoiler: Warning: Spoiler!




my 980ti & 1080ti stock cooler and backplate put back together:


----------



## KraxKill

I own one of these. Handles most of my electronics related tool needs.



https://www.newegg.com/Product/Product.aspx?Item=9SIAB934987162&ignorebbr=1&nm_mc=KNC-GoogleMKP-PC&cm_mmc=KNC-GoogleMKP-PC-_-pla-_-Tools+-+Hand+Screwdrivers+%26+Nutdrivers-_-9SIAB934987162&gclid=CJLs_9DIrNQCFcZgfgoddqEDkA&gclsrc=aw.ds


----------



## Lefty23

Yeah I've seen these, they're really nice.
Unfortunately it's not available in my country.
I couldn't get a set of really small screwdrivers or small hex sockets in a hardware store.
I had to buy from a store for miniature car/plane modelling (I'm not sure if that's what they're called, but hopefully you get the idea) at a crazy price


----------



## Nico67

EK actually sells the 4mm hex on their website as an accessory, but I picked one up at Bunnings here as a single item for about 4 dollars. They actually had two different ones on the shelf, long and short. I left the plastic rack tag on the back of it, as it makes a nice handle for easy turning without requiring any other tools







They also sell jewellers screwdriver kits for about 10 dollars.
It's kinda funny that we can get some stuff really easily here but other items not at all, and PC gear costs a fortune due to "shipping" costs, or so the retailers say


----------



## nrpeyton

In the short term (at least) I'm just going to do a traditional shunt mod.

(I already have the tools for it here, and I'm eager to get the waterblock on and see what she can really do. lol.)

After the eBay parts arrive, and after I've tested how easily the silver paint removes, I will definitely do Coopiklaani's power mod later (also because I like the idea of being able to calculate the new power draw properly using a multiplier).

However, in the *short term* I'll do a traditional shunt mod first... so this is how it looks so far:



And the actual photograph (sorry, my camera is damaged, hence the poor quality). _That's also why I used the diagram above._


Just waiting on it drying now 

Will be painting Thermal Grizzly Conductonaut Liquid Metal across the top of those 3 resistors very soon.
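(Side note for anyone trying to estimate the effect: the liquid metal acts as a second resistance in parallel with the stock shunt, so the controller under-reads current, and the true power is the reported value scaled by the resistance ratio. A rough Python sketch of that arithmetic; the 5 mΩ stock shunt and 2 mΩ liquid-metal bridge values here are assumptions for illustration, not measurements from my card:)

```python
# Rough estimate of how a shunt mod changes reported power.
# Assumed values: stock shunt ~5 mOhm, liquid-metal bridge ~2 mOhm.

def parallel(r1, r2):
    """Effective resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

R_STOCK = 0.005   # ohms; typical shunt value (assumption)
R_BRIDGE = 0.002  # ohms; guessed liquid-metal path (assumption)

r_eff = parallel(R_STOCK, R_BRIDGE)

# The controller infers current from the voltage drop assuming R_STOCK,
# but the real current is drop / r_eff, so:
#   true power = reported power * (R_STOCK / r_eff)
multiplier = R_STOCK / r_eff

print(round(r_eff * 1000, 3))  # effective shunt in mOhm
print(round(multiplier, 2))    # scale factor for reported power
```

With those assumed numbers the card would report only about 1/3.5 of its true draw; measure your own shunt and bridge before trusting any multiplier.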

Quote:


> Originally Posted by *Lefty23*
> 
> Yeah, I've seen these; they're really nice.
> Unfortunately they're not available in my country.
> I couldn't find a set of really small screwdrivers or small hex sockets in a hardware store.
> I had to buy from a store for miniature car/plane modelling (I'm not sure if that's what they're called, but hopefully you get the idea) at a crazy price.


It really wouldn't cost a lot to include two little correctly shaped pieces of cheap metal with the EK kit. Even something basic that has to be gripped with something else (like pliers) to keep costs down. I can't see it adding more than a dollar to the cost of the block.

It really should be included.


----------



## KraxKill

What are the chances that a Ti would run on a Titan XP bios? Could the hardcoded Titan XP TDP actually be higher when flashed to a Ti? The Ti is missing a GDDR5X chip, a few ROPs, and a DVI port, but ultimately the cards are fairly similar. Do they sport the same voltage controller?

I'm suggesting that if there is a hardcoded limit beyond what the shunt mod controls, there really is no way to extract more from these cards without directly feeding the card via a bypass. Otherwise, once we reach that limit, like I have, you can add clock and subtract mem or vice versa and your score is flat or lower, since there really is no additional power being consumed by the card. Ultimately the area under the curve does not increase; you wind up robbing Peter to pay Paul and going nowhere in terms of performance. The wall meters are nice, but they are not the full story because the card could just be bleeding a bunch of what's showing as heat.

The shunt mod gets you around the BIOS handicapping but not around Nvidia's Great Wall of power usage. The XOC bios is nice, but it simply selects voltages for current the card doesn't actually deliver. Ideas?


----------



## SauronTheGreat

Guys, I have a question. I recently bought a Zotac AMP Extreme Core Edition. The difference between the AMP Extreme Core Edition and the AMP Extreme is that the AMP Extreme has a 38 MHz higher clock than the Core Edition; the rest is literally the same, even the boxes are the same with just Core stickers on them. So far the only difference I've found is in the BIOS. When I apply the +38 MHz clock in the OC software, the driver crashes in Unigine Heaven and Valley, which suggests the TDP in the BIOS is different too. Can someone help? Is there any way I can flash a different BIOS, or somehow change the TDP? Below are the TechPowerUp links for both cards' stock clocks. Please help me out; I hope I haven't caused any confusion.

Zotac Amp extreme 1080ti : https://www.techpowerup.com/gpudb/b4356/zotac-gtx-1080-ti-amp-extreme

Zotac Amp extreme core edition 1080ti : https://www.techpowerup.com/gpudb/b4589/zotac-gtx-1080-ti-amp-extreme-core-edition


----------



## reflex75

Quote:


> Originally Posted by *SauronTheGreat*
> 
> Guys, I have a question. I recently bought a Zotac AMP Extreme Core Edition. The difference between the AMP Extreme Core Edition and the AMP Extreme is that the AMP Extreme has a 38 MHz higher clock than the Core Edition; the rest is literally the same, even the boxes are the same with just Core stickers on them. So far the only difference I've found is in the BIOS. When I apply the +38 MHz clock in the OC software, the driver crashes in Unigine Heaven and Valley, which suggests the TDP in the BIOS is different too. Can someone help? Is there any way I can flash a different BIOS, or somehow change the TDP? Below are the TechPowerUp links for both cards' stock clocks. Please help me out; I hope I haven't caused any confusion.
> 
> Zotac Amp extreme 1080ti : https://www.techpowerup.com/gpudb/b4356/zotac-gtx-1080-ti-amp-extreme
> 
> Zotac Amp extreme core edition 1080ti : https://www.techpowerup.com/gpudb/b4589/zotac-gtx-1080-ti-amp-extreme-core-edition


Zotac has probably created this new Core Edition to recycle the chips that failed to sustain the 1759 MHz of the AMP Extreme.
So it's hopeless; flashing the AMP Extreme bios won't change your GPU's intrinsic capability.


----------



## SauronTheGreat

Quote:


> Originally Posted by *reflex75*
> 
> Zotac has probably created this new Core Edition to recycle the chips that failed to sustain the 1759 MHz of the AMP Extreme.
> So it's hopeless; flashing the AMP Extreme bios won't change your GPU's intrinsic capability.


I understand, and it makes sense. But is there any harm in flashing the AMP Extreme BIOS onto it?


----------



## nrpeyton

Well, I've had a setback (with my shunt mod).

The Conductonaut liquid metal wouldn't "spread" over the resistor.

It seems to want to stay in the shape it came out of the bottle. So if it came out in a blob (or ball shape) it will stay that way.. I felt like I was chasing it around at one stage.. then got it onto the Liquid Electrical Tape surrounding the resistor.. then it broke in half and another bit ended up on the bloody PCB, which was very difficult to remove.

It's such a weird texture.. something I've never seen before. It reminds me of the Terminator film.

Anyway, I can't see any way of covering the resistor from left to right along the top without using a huge blob (big enough to cover the whole resistor), but then it's going to overhang off the resistor onto the Liquid Tape.. and I'm scared of such a big blob moving around of its own accord.

Anyway, I had to peel the entire bit of Liquid Electrical Tape off the first resistor I tried to apply it on. So I'm now going to have to paint Liquid Elec Tape on and wait for it to dry again. Bloody nightmare.

Any suggestions? How did you guys do it? Seriously, it will not "spread" or "paint". Does CLU *behave* differently?

Edit:
What if I warmed it up? It might make it easier to work with.


----------



## KraxKill

Quote:


> Originally Posted by *nrpeyton*
> 
> Well, I've had a setback (with my shunt mod).
> 
> The Conductonaut liquid metal wouldn't "spread" over the resistor.
> 
> It seems to want to stay in the shape it came out of the bottle. So if it came out in a blob (or ball shape) it will stay that way.. I felt like I was chasing it around at one stage.. then got it onto the Liquid Electrical Tape surrounding the resistor.. then it broke in half and another bit ended up on the bloody PCB, which was very difficult to remove.
> 
> It's such a weird texture.. something I've never seen before. It reminds me of the Terminator film.
> 
> Anyway, I can't see any way of covering the resistor without using a huge blob (big enough to cover the whole resistor), but then it's going to overhang off the resistor onto the Liquid Tape.. and I'm scared it will move of its own accord.
> 
> Anyway, I had to peel the entire bit of Liquid Electrical Tape off the first resistor I tried to apply it on. So I'm now going to have to paint Liquid Elec Tape on and wait for it to dry again. Bloody nightmare.
> 
> Any suggestions? How did you guys do it? Seriously, it will not "spread" or "paint". Does CLU *behave* differently?


Did yours come with the nylon q-tips?

What I do to begin working with it is add a drop of it to the q-tip and smear that small ball against a little piece of unneeded copper. You could probably use a copper coin or something like that. This will (a) give you practice with how it spreads, and (b) get the tip of the q-tip stained with liquid metal. You can then take some of that stained residue and rub the metal portion of the shunt resistor with the end of the q-tip, so that the shunt ends get some residue on them and the liquid metal ball you put on has something to attach to with surface tension. From there, I use a toothpick to pull and tug at the ball to get it to stretch across the shunt. It's not a paste, that's for sure. It's like mercury or any other liquid metal. You're not going to "spread" it; you sort of pull it across the surface of the shunt. Getting a bit of the residue on the shunt via the q-tip helps "prime" the surface to pull at the liquid metal a bit. It's kind of like spreading a drop of water across a coin. Good luck.

Alcohol gets it off pretty easy.


----------



## nrpeyton

Quote:


> Originally Posted by *KraxKill*
> 
> Did yours come with the nylon q-tips?
> 
> What I do to begin working with it is add a drop of it to the q-tip and smear that small ball against a little piece of unneeded copper. You could probably use a copper coin or something like that. This will (a) give you practice with how it spreads, and (b) get the tip of the q-tip stained with liquid metal. You can then take some of that stained residue and rub the metal portion of the shunt resistor with the end of the q-tip, so that the shunt ends get some residue on them and the liquid metal ball you put on has something to attach to with surface tension. From there, I use a toothpick to pull and tug at the ball to get it to stretch across the shunt. It's not a paste, that's for sure. It's like mercury or any other liquid metal. You're not going to "spread" it; you sort of pull it across the surface of the shunt. Getting a bit of the residue on the shunt via the q-tip helps "prime" the surface to pull at the liquid metal a bit. It's kind of like spreading a drop of water across a coin. Good luck.
> 
> Alcohol gets it off pretty easy.


That was actually incredibly helpful.

Glad I'm not alone.. I thought it was just me! lol

Thanks again, rep+









And yeah, it makes perfect sense.. liquid metal seems to love liquid metal.. so getting some residue on it will help keep it there.. nice one..

_(For those who haven't done the shunt mod: it's nothing like spreading it on a big heatsink. The fact that what you're working with is so small is what makes it tricky in a completely different way.)_

Quote:


> Originally Posted by *KraxKill*
> 
> What are the chances that a Ti would run on a Titan XP bios? Could the hardcoded Titan XP TDP actually be higher when flashed to a Ti? The Ti is missing a GDDR5X chip, a few ROPs, and a DVI port, but ultimately the cards are fairly similar. Do they sport the same voltage controller?
> 
> I'm suggesting that if there is a hardcoded limit beyond what the shunt mod controls, there really is no way to extract more from these cards without directly feeding the card via a bypass. Otherwise, once we reach that limit, like I have, you can add clock and subtract mem or vice versa and your score is flat or lower, since there really is no additional power being consumed by the card. Ultimately the area under the curve does not increase; you wind up robbing Peter to pay Paul and going nowhere in terms of performance. The wall meters are nice, but they are not the full story because the card could just be bleeding a bunch of what's showing as heat.
> 
> The shunt mod gets you around the BIOS handicapping but not around Nvidia's Great Wall of power usage. The XOC bios is nice, but it simply selects voltages for current the card doesn't actually deliver. Ideas?


I think the hard-coded Titan probably features a smaller TDP for safety reasons.

The VRM on the Titan isn't anywhere near as good as the 1080 Ti FE's. We got a huge VRM upgrade this round with the Ti compared to Nvidia's normal offering.


----------



## Mad Pistol

Ordered an MSI GTX 1080 Ti Gaming X. Should be here tomorrow.


----------



## nrpeyton

deleted, double post, soz


----------



## DerComissar

Quote:


> Originally Posted by *KraxKill*
> 
> What are the chances that a Ti would run on a Titan XP bios? Could the hardcoded Titan XP TDP actually be higher when flashed to a Ti? The Ti is missing a GDDR5X chip, a few ROPs, and a DVI port, but ultimately the cards are fairly similar. Do they sport the same voltage controller?
> 
> I'm suggesting that if there is a hardcoded limit beyond what the shunt mod controls, there really is no way to extract more from these cards without directly feeding the card via a bypass. Otherwise, once we reach that limit, like I have, you can add clock and subtract mem or vice versa and your score is flat or lower, since there really is no additional power being consumed by the card. Ultimately the area under the curve does not increase; you wind up robbing Peter to pay Paul and going nowhere in terms of performance. The wall meters are nice, but they are not the full story because the card could just be bleeding a bunch of what's showing as heat.
> 
> The shunt mod gets you around the BIOS handicapping but not around Nvidia's Great Wall of power usage. The XOC bios is nice, but it simply selects voltages for current the card doesn't actually deliver. Ideas?


I doubt it would work, due to the GPU mismatch.

In the reverse situation, MrTOOSHORT tried to flash his TXP with a TXp bios, and also some 1080Ti bios, and it didn't work.
http://www.overclock.net/t/1606562/official-nvidia-titan-x-pascal-owners-thread/7260#post_26019527


----------



## nrpeyton

Okay Liquid Metal applied:



I used more on the top two resistors (as I want the PCI-E power cables taking the brunt of the extra power).

Only used a very thin layer (and not completely covering the resistor, but _*JUST*_ making it to the metal edges on top). _I want a little extra power through the PCI-E lane, but not too much._
The electromagnetic field/interference from the extra power on the PCI-E lane (so close to data transfer) could cost me a few MHz of stability.

Time to screw the block down!

P.S

Wow, the top of the die looks huge compared to my old 1080 ;-)
....Time for some good old Kryonaut. _(I did consider using LM between the block and die too, for another 3-4 °C, but I can't be bothered dealing with the discolouration it would cause.)_


----------



## KedarWolf

Zotac Arctic Storm BIOS really good for Time Spy.


----------



## TheBoom

Quote:


> Originally Posted by *Lefty23*
> 
> You can find a NVCP tweaks section in the OP of some of the benchmarking threads here on OCN . For example:
> http://www.overclock.net/t/1627767/top-30-unigine-superposition-benchmark/0_100#post_26006658
> or
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0/0_100#post_19292884
> 
> Even though these threads are for Unigine benchmarks the tweaks should apply for 3D Mark as well.
> At the very least I would change the following on the profile of the program you are interested in:
> >Muti-display/mixed-GPU acceleration change to Single display performance mode
> >Multi-display/mixed-GPU acceleration change to Single display performance mode
> >Power management mode change to Prefer maximum performance
> >Texture filtering - Quality change to High performance
> >Vertical sync changed to off
> 
> You could try changing some of the other settings too but, I have not tried them so I'm not sure what impact - if any - they would have.
> That is a very nice card.
> 
> I have uninstalled all the benchmarking apps a few weeks ago when I realized that I spent 90% of my (limited) spare time running benchmarks on my "gaming PC".
> I have kept a lot of results though and I was getting similar scores at around the same clocks:
> Timespy @2100/6003 (graphics score 11102): http://www.3dmark.com/spy/1479227
> SuPo 4K Opt @2100/6075: 10501
> 
> As I have mentioned earlier in this thread SuPo in my experience really likes mem OC.
> My highest score was 10785 with the card @2114/6277 and the 4 tweaks I mentioned in my reply to TheBoom above.
> I could run the core up to 2139 in 4K SuPo but I was losing headroom in the memory OC, resulting in worse overall scores (high 106xx - low 107xx).
> In all these runs max GPU core temp was ~37C.


Thanks for the links, although for some reason the tweaks didn't improve my score. +Rep.
Quote:


> Originally Posted by *SauronTheGreat*
> 
> Guys, I have a question. I recently bought a Zotac AMP Extreme Core Edition. The difference between the AMP Extreme Core Edition and the AMP Extreme is that the AMP Extreme has a 38 MHz higher clock than the Core Edition; the rest is literally the same, even the boxes are the same with just Core stickers on them. So far the only difference I've found is in the BIOS. When I apply the +38 MHz clock in the OC software, the driver crashes in Unigine Heaven and Valley, which suggests the TDP in the BIOS is different too. Can someone help? Is there any way I can flash a different BIOS, or somehow change the TDP? Below are the TechPowerUp links for both cards' stock clocks. Please help me out; I hope I haven't caused any confusion.
> 
> Zotac Amp extreme 1080ti : https://www.techpowerup.com/gpudb/b4356/zotac-gtx-1080-ti-amp-extreme
> 
> Zotac Amp extreme core edition 1080ti : https://www.techpowerup.com/gpudb/b4589/zotac-gtx-1080-ti-amp-extreme-core-edition


I was actually wondering what the differences are, thanks for the info. I have the Amp Extreme. I always thought the core edition would actually be better.

Anyway, if you really want higher clocks then you might want to try the XOC bios. It allows higher voltages so you may be able to get slightly higher clocks out of your card.


----------



## KedarWolf

Catzilla results anyone?


----------



## SauronTheGreat

Quote:


> Originally Posted by *TheBoom*
> 
> Thanks for the links although for some reason the tweaks didn't improve my score anyway. + Rep.
> I was actually wondering what the differences are, thanks for the info. I have the Amp Extreme. I always thought the core edition would actually be better.
> 
> Anyway, if you really want higher clocks then you might want to try the XOC bios. It allows higher voltages so you may be able to get slightly higher clocks out of your card.


What is XOC bios ?


----------



## TheBoom

Quote:


> Originally Posted by *SauronTheGreat*
> 
> What is XOC bios ?


It's an Asus Strix bios with no perf or temp limits, and voltage unlocked to 1.2v.


----------



## Lefty23

Quote:


> Originally Posted by *nrpeyton*
> 
> It really should be included.


Oh yeah, I'm 100% with you on that one.
When I paid 125 Euro (175 incl Tax+Shipping) for a full block and backplate from their webshop I wasn't expecting to have to drive around town in order to find a hex socket.
It would be nice of them to include these with blocks compatible with reference cards.
Quote:


> Originally Posted by *TheBoom*
> 
> Thanks for the links although for some reason the tweaks didn't improve my score anyway. + Rep.


Sorry it didn't help. I know in my case it got me from ~10630 up to ~10780 in 4K SuPo, and I thought it would have a similar impact in 3DMark.
Truth is, since 3DMark benchmarks are very dependent on the CPU, I haven't tinkered with them as much. I got up past 112xx in graphics score in Timespy, but overall was 9500 due to 4c8t and an older (Z97) chipset.
Quote:


> Originally Posted by *SauronTheGreat*
> 
> What is XOC bios ?


It is a bios for LN2 runs with the ASUS 1080 Ti ROG Strix GPU. It removes temp/power limits and allows voltage up to 1.2v.
A lot of us have cross-flashed it onto our cards with mixed results.
In my case I was getting worse results in the Superposition benchmark and slightly better results in 3DMark (at higher core clock speeds).
You can read a bit more here:
http://forum.hwbot.org/showthread.php?t=169488
http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/8600_100#post_26075655
http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/8600_100#post_26075937
and search this thread for results from different users (some of them went into great detail in their testing) and to see if someone with your card has already tried it.

If you do decide to try it out, make sure you have some sort of monitoring software (e.g. Afterburner, HWiNFO64) to keep an eye on what is happening.
Also, in my case I got better (more stable) results from this bios using the curve to OC (behavior was unpredictable when using just the sliders).

fwiw, I'm back on the original bios for everyday use, since I don't hit any temp/power limits in the games I play with my gaming OC profile.


----------



## BoredErica

Quote:


> Originally Posted by *nrpeyton*
> 
> For tools to remove FE cooler + backplate; visited two hardware shops on High Street at lunch break.
> 
> 
> Only 1 shop had 1.4mm screwdriver for backplate.
> *Neither* shop had 4mm hex for back of PCB. (They both had normal hex but not the hollow ones to grip the outside of the screws).
> So for hex, forced to use pliers. To protect the PCB (when pliers slip) used Electric Tape (around screws),
> 
> Total nightmare! Is there anyone who works in the industry here--who can argue why these obscure screws are *necessary*?
> 
> The only thing Nvidia achieved with this stunt is wasting *my* time.
> 
> Anyway on a happier note, 7 down--7 to go.. this is painstaking with the pliers. lol
> 
> P.S. I've written to EK with feedback suggesting they include a cheap set of the 1.4mm screwdriver + 4mm hollow hex for reference design blocks.
> The EK rep promised to pass this onto the correct person.


Turns out I have the hollow 4mm in a toolkit I bought to work on my car. If anybody else is in the same boat, check that toolkit.


----------



## Amuro

Sharing my non-curve OC on my MSI Gaming X.


----------



## SirBubby

Picked up a 1080 Ti Founders Edition the other day. My first Nvidia card in 15 years, and the first time I've messed with overclocking Pascal. I used MSI Afterburner for years but switched to EVGA PXOC and have been messing with GPU Boost 3.0. My settings so far...

Power Limit: 120%
Temp: 90c
Core: +175
Mem: 0 (haven't messed with this yet)
Voltage: +12%

With these settings I'll bounce between 2025 MHz @ 1.062 V and 2012 MHz @ 1.050 V.

My question: if I increase the voltage any more, I hit the power limit. Is this bad? I know hitting the power limit will decrease clock speeds, but does it also mess up frametimes and cause games to stutter? Should I be happy with these settings and leave them as-is, or increase the voltage and core and hit the power limit a bunch?

Thanks
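(For context on what the slider actually means: the FE's 100% power target is the published 250 W spec, and 120% is its maximum, so the percentages map to watts roughly like this; just a trivial sketch of the arithmetic, assuming the stock 250 W base:)

```python
# Map a power-limit slider percentage to an approximate wattage,
# assuming the 1080 Ti FE's published 250 W default power target.
BASE_TDP_W = 250  # FE default power target (published spec)

def slider_to_watts(percent):
    """Approximate board power budget for a given slider setting."""
    return BASE_TDP_W * percent / 100.0

print(slider_to_watts(120))  # the FE's 120% max -> 300.0 W
```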


----------



## Blotto80

So cooling has a huge effect on memory clock with the Ti. On the stock midplate/blower with an EVGA Hybrid, the highest I could clock the mem was 5900 (+400); any higher would lock Heaven up. Started playing with clocks last night under my Heatkiller IV block and I could get past 6100 (+600) no problem.


----------



## Ghosteri7

I finally pushed my 1080 Ti Strix OC with an overclock.

I did these settings:

boost clock = +100
memory = +300
power target = 120

The card showed 2062 MHz but only for a few seconds, then stayed at 2050 MHz. I did a +90 boost clock before and it was also at 2050 MHz. All this is without touching the voltages, as I don't know much about that yet.

The only bad thing is that to keep the card in the 50C+ range at 2050 MHz I have to have the fans at 100%.









I'm guessing all Strix OC cards can do 2050 MHz? Or can some Strixes not do this?

Has anyone kept their card even though it has coil whine? Or is it better to send it back and hope to get one without it? (But you could get a worse one too.) Hmm, opinions please!!


----------



## Aganor

I'm starting to wonder if my FE is not a great chip after all.
For an FE card, is +175 core clock a good number?
I expected more. I can go +210 and run Superposition 4K, but sometimes it crashes and can't handle even 5 seconds of gaming, hence the +175 I normally use.


----------



## Aganor

Quote:


> Originally Posted by *SirBubby*
> 
> Picked up a 1080 Ti Founders Edition the other day. My first Nvidia card in 15 years. This is the first time I've messed with overclocking pascal. I used MSI Afterburner for years but switched to EVGA PXOC and have been messing with GPU Boost 3.0. My settings so far...
> 
> Power Limit: 120%
> Temp: 90c
> Core: +175
> Mem: 0 (haven't messed with this yet)
> Voltage: +12%
> 
> With these settings I'll bounce between 2025 MHz @ 1.062 V and 2012 MHz @ 1.050 V.
> 
> My question: if I increase the voltage any more, I hit the power limit. Is this bad? I know hitting the power limit will decrease clock speeds, but does it also mess up frametimes and cause games to stutter? Should I be happy with these settings and leave them as-is, or increase the voltage and core and hit the power limit a bunch?
> 
> Thanks


Did you notice any lock-ups in Heaven with this +175 core OC?

This is the max I can do as well, but I think it's not completely stable.
I tested max voltage and the core goes up a little, but power never gets past 120% usage (rarely a spike at 123%).


----------



## SirBubby

Quote:


> Originally Posted by *Aganor*
> 
> Did you notice any lock-ups in Heaven with this +175 core OC?
> 
> This is the max I can do as well, but I think it's not completely stable.
> I tested max voltage and the core goes up a little, but power never gets past 120% usage (rarely a spike at 123%).


No lock-ups in Heaven or Battlefield 1 after 15 min. I need to stress test more. PUBG is a good test, as that game will murder your hardware.

My power usage hangs around 105-110% but also spikes past 120% if I go higher than +12% voltage. That was my question: are the spikes bad, or should I be OK with spikes over 120% and increase the voltage?


----------



## TheBoom

Quote:


> Originally Posted by *Aganor*
> 
> I'm starting to wonder if my FE is not a great chip after all.
> For an FE card, is +175 core clock a good number?
> I expected more. I can go +210 and run Superposition 4K, but sometimes it crashes and can't handle even 5 seconds of gaming, hence the +175 I normally use.


You should just use the actual core clock numbers instead of saying +175 or +xxx, because every card is different in boost clocks and offsets. We wouldn't know what actual core clock you are getting if you say +175, so we can't tell you if it's good or not.

That being said, anything over 2ghz is considered good and anything over 2100 is considered more or less a silicon lottery winner.
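(To illustrate why the offset alone is ambiguous; the boost clocks in this sketch are made-up examples, not measurements from any real card:)

```python
# Why "+175" alone tells us nothing: the resulting clock depends on the
# card's own boost behaviour. Both example boost clocks are hypothetical.

def resulting_clock(observed_boost_mhz, offset_mhz):
    """Very rough: clock the card settles at with a given offset applied."""
    return observed_boost_mhz + offset_mhz

# Two hypothetical cards, same +175 offset, very different outcomes:
card_a = resulting_clock(1850, 175)  # a weaker-boosting card
card_b = resulting_clock(1950, 175)  # a stronger-boosting card
print(card_a, card_b)
```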

Quote:


> Originally Posted by *SirBubby*
> 
> No lock ups in Heaven or Battlefield 1 after 15min. I need to stress test more. PUBG is a good test as that game will murder your hardware.
> 
> My power usage hangs around 105-110% but also spikes past 120% if I go higher than +12% voltage. That was my question: are the spikes bad, or should I be OK with spikes over 120% and increase the voltage?


The consensus is that as long as you don't get perfcaps in GPU-Z you are good; however, running into a perfcap like the power limit *may* impact performance.

If you do increase voltage you won't be able to get higher stable clocks anyway if you are indeed running into powerlimits.

As to whether or not it affects frametimes and smoothness I can't say. Honestly I haven't seen anyone do a test specifically for powerlimits or perfcaps and show if there is actually an effect.


----------



## TheBoom

Quote:


> Originally Posted by *Ghosteri7*
> 
> I finally pushed my 1080 Ti Strix OC with an overclock.
> 
> I did these settings:
> 
> boost clock = +100
> memory = +300
> power target = 120
> 
> The card showed 2062 MHz but only for a few seconds, then stayed at 2050 MHz. I did a +90 boost clock before and it was also at 2050 MHz. All this is without touching the voltages, as I don't know much about that yet.
> 
> The only bad thing is that to keep the card in the 50C+ range at 2050 MHz I have to have the fans at 100%.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm guessing all Strix OC cards can do 2050 MHz? Or can some Strixes not do this?
> 
> Has anyone kept their card even though it has coil whine? Or is it better to send it back and hope to get one without it? (But you could get a worse one too.) Hmm, opinions please!!


2050 is considered a pretty good overclock, and not many cards, Strix included, can do it.

I suggest you find the best overclock that keeps your fans silent. Even 2000 is good enough; 50 MHz is nothing. Maybe 1 fps at 4K and 2-3 at 1440p.

Also, you will never get a card without coil whine. Unless it's really loud or audible through your case, it's normal.


----------



## Aganor

Quote:


> Originally Posted by *TheBoom*
> 
> You should just use the actual core clock numbers instead of saying +175 or +xxx because every card is different in boost clocks and offsets. We wouldn't know what actual core clock you are getting if you say +175 and we can't tell you if its good or not.
> 
> That being said, anything over 2ghz is considered good and anything over 2100 is considered more or less a silicon lottery winner.
> The consensus is that as long as you don't get perfcaps in GPU-Z you are good; however, running into a perfcap like the power limit *may* impact performance.
> 
> If you do increase voltage you won't be able to get higher stable clocks anyway if you are indeed running into powerlimits.
> 
> As to whether or not it affects frametimes and smoothness I can't say. Honestly I haven't seen anyone do a test specifically for powerlimits or perfcaps and show if there is actually an effect.


In my case, the core goes up to 2086 but very rarely, and only until the card reaches more than 41°C.
After 30 minutes of Witcher 3 at 100% GPU usage, the card bounces between 1986 and 2015 constantly with said +175 offset.
So yeah, I wish it was better lol.
To put it in perspective though: I have a secondary PC being built for my nephew with a GTS 450 in it, and with a little OC I got it to a max of 1 GHz. It's wonderful to see how Pascal clocks!


----------



## Aganor

Quote:


> Originally Posted by *nrpeyton*
> 
> Okay Liquid Metal applied:
> 
> 
> 
> I used more on the top two resistors (as I want the PCI-E power cables taking the brunt of the extra power).
> 
> Only used a very thin layer (and not completely covering the resistor, but _*JUST*_ making it to the metal edges on top). _I want a little extra power through the PCI-E lane, but not too much._
> The electromagnetic field/interference from the extra power on the PCI-E lane (so close to data transfer) could cost me a few MHz of stability.
> 
> Time to screw the block down!
> 
> P.S
> 
> Wow, the top of the die looks huge compared to my old 1080 ;-)
> ....Time for some good old Kryonaut. _(I did consider using LM between the block and die too, for another 3-4 °C, but I can't be bothered dealing with the discolouration it would cause.)_


I used thermal pads on the small chips that don't have them in your pic.
In the EK manual I noticed they weren't required there, but I added them to give the chips more heat dissipation. Maybe I shouldn't have done that?


----------



## OneCosmic

Quote:


> Originally Posted by *KedarWolf*
> 
> Zotac Arctic Storm BIOS really good for Time Spy.


Can you show an MSI AB hardware monitor graph with GPU clocks and power limit? I'm curious how much it throttles with this BIOS.


----------



## jase78

Quote:


> Originally Posted by *nrpeyton*
> 
> For tools to remove FE cooler + backplate; visited two hardware shops on High Street at lunch break.
> 
> 
> Only 1 shop had 1.4mm screwdriver for backplate.
> *Neither* shop had 4mm hex for back of PCB. _(They both had normal hex but not the hollow ones to grip the outside of the screws)._
> So for hex, forced to use pliers. To protect the PCB (when pliers slip) used Electric Tape (around screws),
> 
> Total nightmare! Is there anyone who works in the industry here--who can argue why these obscure screws are *necessary*?
> 
> The only thing Nvidia achieved with this stunt is wasting *my* time.
> 
> 
> 
> Anyway on a happier note, 7 down--7 to go.. this is painstaking with the pliers. lol
> 
> _P.S. I've written to EK with feedback suggesting they include a cheap set of the 1.4mm screwdriver + 4mm hollow hex for reference design blocks.
> The EK rep promised to pass this onto the correct person._


I just grabbed a 4mm socket out of my 1/4" drive socket set and used it. But yeah, I agree it's a pretty small size!

For the backplate I used a cheap precision screwdriver set.


----------



## cluster71

A question: what does it take to hit 2200 MHz on the core? I've been running 2100/11800 at 1.093v lately with the XOC BIOS on my Zotac Extreme. It's also "game stable" at 2124 at 1.100v when I play Sniper Elite 4. I wonder what's required for 2200 -- 1.200v, or more? Right now I'm running 2152 at 1.150v in Windows, but I'm not worried about the memory for now.


----------



## cluster71

My PC started to become unresponsive and "laggy," with some sound problems. I checked LatencyMon and latencies were up in the couple-of-thousand range. I never pinned down the exact cause, but I think the NVMe drivers were a contributing factor (I have 2 M.2 SSDs). I reinstalled Windows and now the PC feels "snappy" again.


----------



## Ghosteri7

Quote:


> Originally Posted by *TheBoom*
> 
> 2050 is considered a pretty good overclock and not many cards including strix can do it.
> 
> I suggest you find the best overclock to keep your fans silent. Even 2000 is good enough. 50 MHz is nothing. Maybe 1 fps at 4k and 2-3 in 1440p.
> 
> Also you will never get a card without coil whine. Unless it's really loud or audible through your case it's normal.


Thanks for the information!

With the OC profile in GPU Tweak II the card always starts at 1987 MHz, but only for a few seconds; once the temps start to go up it settles down to:

1949 MHz, stable throughout the whole game (I think all Strix cards do this speed with the OC profile)

67C to 69C

Fan at 47%

The coil whine is there, and I heard it a bit more in Valley and Heaven; when I quit the application the coil goes nuts for a few seconds.
When I quit games it's just the same normal coil noise as at the beginning. So weird!

I have seen a few people saying that their Strix has no coil whine -- someone even made a video of it -- though I see a lot more reports of people *having* the coil whine, for sure!

I just bought it from Newegg and I think they can replace it with a brand-new one? If I go through Asus instead, I think they may send me a refurbished one. Does anyone know how they each handle it? After your explanation maybe I will decide to keep it; not fully sure yet though!


----------



## KedarWolf

Quote:


> Originally Posted by *OneCosmic*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Zotac Arctic Storm BIOS really good for Time Spy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can you show MSI AB hardware monitor graph with GPU clocks and power limit? I am curious how much does it throttle with this BIOS.

You can enable a 432W power limit, but yeah, I'll do the graph in a few hours when I get home and show how it limits.


----------



## ilikelobstastoo

Can anybody tell me what resistance we're aiming for on top of the shunts? I'm considering just soldering a resistor across them to make a more precise/permanent connection.
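For reference, the math behind the mod is just parallel resistance: whatever you add across a shunt lowers its effective value, and the card's power reading scales down by the same ratio. A minimal sketch, assuming 5 mΩ stock shunts (often reported for these boards -- verify the markings on your own card before soldering anything):

```python
# Hedged sketch: effect of paralleling a resistor across a power-sense shunt.
# The 0.005 ohm (R005) stock value below is an assumption; check your PCB.

def parallel(r1, r2):
    """Effective resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def reported_power(actual_watts, r_stock, r_effective):
    """The controller still assumes r_stock, so its power reading
    scales by r_effective / r_stock."""
    return actual_watts * (r_effective / r_stock)

r_stock = 0.005                       # 5 mOhm stock shunt (assumption)
r_added = 0.005                       # soldering an equal resistor across it
r_eff = parallel(r_stock, r_added)    # roughly 0.0025 ohm

print(r_eff)
print(reported_power(300, r_stock, r_eff))  # a real 300 W reads as ~150 W
```

So soldering an equal-value resistor across each shunt halves the effective resistance, and a real 300 W draw reads as roughly 150 W to the power limiter.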


----------



## RavageTheEarth

So does the XOC BIOS that you guys are talking about mean the BIOS from the COLORFUL iGame GTX 1080 Ti Vulcan X OC? That's the only thing that comes up when I google that. I want to try flashing that BIOS to my Gigabyte Aorus so I can get above 1.093v/2026.


----------



## KedarWolf

Quote:


> Originally Posted by *RavageTheEarth*
> 
> So does the XOC BIOS that you guys are talking about mean the BIOS from the COLORFUL iGame GTX 1080 Ti Vulcan X OC? That's the only thing that comes up when I google that. I want to try flashing that BIOS to my Gigabyte Aorus so I can get above 1.093v/2026.


The XOC BIOS is in the OP, in the 'How To Flash A BIOS' section.


----------



## KedarWolf

Quote:


> Originally Posted by *OneCosmic*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Zotac Arctic Storm BIOS really good for Time Spy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can you show MSI AB hardware monitor graph with GPU clocks and power limit? I am curious how much does it throttle with this BIOS.

_Attachment: GPU-ZSensorLog.txt (51k .txt file)_


This is my graph and log, plus my voltage curve, which is key to higher benches.





The reason I have it staggered like that is that each point sits at the highest clock I can run at the lowest voltage: 2088 at 1.093v, 2075 at 1.081v, 2062 at 1.075v, 2050 at 1.043v, 2025 at 0.993v, etc. So when it does power limit and drops to a lower voltage point, it's still at the highest core speed that voltage point allows.
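The staggered-curve idea can be sketched as a simple lookup: each voltage point carries the highest clock known stable there, so a power-limit event that forces a lower voltage bin still lands on the best clock for that bin. A toy illustration (the points mirror the numbers in the post; every card's stable points will differ):

```python
# Hedged sketch of a "staggered" voltage/frequency curve.
# Numbers are taken from the post above; they are NOT universal values.

curve = {           # volts -> MHz: highest clock stable at each voltage
    1.093: 2088,
    1.081: 2075,
    1.075: 2062,
    1.043: 2050,
    0.993: 2025,
}

def clock_after_power_limit(allowed_v):
    """When the power limiter forces a lower voltage bin, the card lands on
    the highest curve point at or below that voltage."""
    usable = [v for v in curve if v <= allowed_v]
    return curve[max(usable)] if usable else None

print(clock_after_power_limit(1.093))  # 2088 (no throttle)
print(clock_after_power_limit(1.05))   # 2050 (limited to the 1.043 V point)
```

The point of the staggering is the second case: even when throttled, the card runs the best clock its reduced voltage allows instead of an arbitrary lower bin.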


----------



## nrpeyton

Last night I made a really stupid mistake.

One I thought was absolutely impossible to make.

Well, hell knows. But I did it.

Forgot to screw the plugs into the unused ports on the new GPU waterblock. (It was 5am -- I'd been up all night & was tired).

Switched the system on (not just the pump), and you get the picture.

Water shot out of the port, soaking the VRM and CPU area on the motherboard. (Luckily the CPU is covered by an EK Supremacy block).

I dived for the off switch the moment I realised!

Dried everything I could see with paper towels.

There were some bits I could see but couldn't reach; 13 hours later they looked dry. I left the central heating on in the house (rest of the night & all day today).

There was no visible damage, and I never heard any pops or strange noises.

Surprisingly the GPU itself looked dry (except the PCI-E slot, which is now dry also).

I read that distilled water isn't conductive. (Nor is the coolant).

I hoped there'd been no damage.

The accident happened at 5am. It's 11:40pm now.

Since 6pm I've had the CPU & motherboard in the oven at 60C, the 1080 Ti & DDR4 on the radiator, and I've been rebuilding the system.

I certainly won't ever make this mistake again! But it sure proves it can happen to anyone!

When I had my old card I was always taking the block off (to perfect the pad arrangement, re-paste, do an extreme cooling session, etc.). I did it so many times, and the plugs never had to be touched after the first time!
Last night I must have been on auto-pilot!

Anyway -- what should have been a first-time shunt mod plus a simple waterblock install turned into an 18-hour nightmare.

I am glad to say *I have JUST (this second) powered my system on.* And thank god.







I think I just hit my ceiling with a relieved sense of excitement!








Can't believe it's working. Can't believe I've not destroyed £1500 worth of computer equipment!

Not the end of the story yet though! \/ \/

I'm on the iGPU just now. Thought I'd better test without the 1080 Ti first, as the accident obviously happened before I'd tested whether the shunt mod was working.

I'll be back in just a minute -- to a) let you know if my 1080 Ti is water-damaged (broken), b) tell you whether my shunt mod worked, or c) maybe a combination of both.

Anyway, fingers crossed! & brb


----------



## seven7thirty30

I did the same thing once. I bathed all the components in alcohol then dried it all in the oven on very low heat. I was horrified, but everything worked. Now I triple check everything when I put a loop back together.


----------



## nrpeyton

Quote:


> Originally Posted by *seven7thirty30*
> 
> I did the same thing once. I bathed all the components in alcohol then dried it all in the oven on very low heat. I was horrified, but everything worked. Now I triple check everything when I put a loop back together.


When it powered on and booted into Windows a minute ago it felt like I'd just won the lottery. I literally jumped out of my seat in euphoria lol.

Anyway time to test the new shunt modded 1080 Ti

Fingers crossed again!

brb


----------



## seven7thirty30

Quote:


> Originally Posted by *nrpeyton*
> 
> When it powered on and booted into windows a minute ago it felt like I just won the lottery. I literally jumped out my seat in euphoria lol.


I felt it too. If you have problems try bathing the components in alcohol then oven drying them (lowest temp possible, leave it cracked open). I don't know what you use for coolant, but the alcohol will break up the additives that could leave a thin film behind.


----------



## methadon36

I took the plunge and got an EVGA 1080 Ti SC Black Edition GAMING (11G-P4-6393-KR) yesterday. Also picked up the Titan XP EK copper water block and backplate from PerformancePCs. The card and block will arrive tomorrow, and the backplate comes Saturday.







Now I just need to sell off the 980ti and its waterblock and pay down the $900+ spent on this!


----------



## nrpeyton

yeeehaaa!!

shunt mod appears to be working









Witcher 3 at 4k.
2088 MHZ 1.093v

*Only 211 watts lol*

On the same scene before the shunt mod I was at 1833Mhz 900mv (0.9v) and still drawing 300 watts (so major power throttling).











Not even tried to push the card faster yet.

Temps are also 35C better.

Although they still look quite high -- I only have one 360mm radiator for the 1080 Ti and 7700K.

And another one:

This time:

*2100 Mhz @ 1.093v* and +500 mem

*Only 194 watts, lol*

_*right click & 'open new tab' to view full size*_



_(Temp also looks better because the coolant hasn't had time to heat up)._

And quick dirty run: (Timespy)
http://www.3dmark.com/3dm/20373906

Don't know how good that is. CPU was at stock too. But clock never dropped below 2100 Mhz.

Goodbye power throttling ;-)
Hello performance.

My new max O/C is 2126MHz (on water with the new waterblock).
I also got a few seconds at 2138MHz.
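One caveat worth keeping in mind with those wattage figures: after a shunt mod the card under-reports power by roughly the same factor the shunt resistance was reduced, so the real draw is the reading divided by that factor. A rough sketch, assuming the liquid metal halved the effective shunt value (the true factor with LM stacked on top is unknown):

```python
# Hedged sketch: estimating true power draw from a post-shunt-mod reading.
# The 0.5 scale factor (shunt resistance halved) is an assumption, not a
# measured value -- liquid metal on top of a shunt gives an unknown ratio.

def actual_power(reported_watts, scale=0.5):
    """scale = r_effective / r_stock; 0.5 if the shunt value was halved."""
    return reported_watts / scale

print(actual_power(211))   # 422.0 -- i.e. ~422 W real draw if shunts halved
```

So a "211 W" reading at 2088 MHz / 1.093v likely corresponds to something on the order of 400 W actually flowing through the card, which is why the temps still look high on a single 360mm radiator.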


----------



## jase78

Can't wait to try the shunt mod and hopefully get this card stable above 2100. Not sure if anyone saw my post earlier, but I was asking about the target resistance in ohms we're looking for across the shunts for the mod to work properly?

Best I've been able to do on Time Spy so far without it: http://www.3dmark.com/spy/1869428 -- does this seem average, or below?


----------



## jase78

Quote:


> Originally Posted by *nrpeyton*
> 
> yeeehaaa!!
> 
> shunt mod appears to be working
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Witcher 3 at 4k.
> 2088 MHZ 1.093v
> 
> *Only 211 watts lol*
> 
> On the same scene before the shunt mod I was at 1833Mhz 900mv (0.9v) and still drawing 300 watts (so major power throttling).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not even tried to push the card faster yet.
> 
> Temps are also 35c better too.
> 
> Although they do still look quite high. I only have one 360mm radiator for 1080 Ti and 7700k
> 
> And another one:
> 
> This time:
> 
> *2100 Mhz @ 1.093v* and +500 mem
> 
> *Only 194 watts, lol*
> 
> _*right click & 'open new tab' to view full size*_
> 
> 
> 
> _(Temp also looks better because the coolant hasn't had time to heat up)._
> 
> And quick dirty run: (Timespy)
> http://www.3dmark.com/3dm/20373906
> 
> Don't know how good that is. CPU was at stock too. But clock never dropped below 2100 Mhz.
> 
> Goood bye power throttling ;-)
> Hello performance.
> 
> My new max O/C is 2126Mhz. (on water with new waterblock)
> I also got a few seconds at 2138Mhz.


Glad your card is still alright!! Can't wait to try the shunt mod myself!!


----------



## dansi

Which is better?

2012-2025mhz with no perfcap

2037-2050 with vrel?

The latter gave me higher scores in benchmarks.


----------



## KedarWolf

Quote:


> Originally Posted by *dansi*
> 
> Which is better?
> 
> 2012-2025mhz with no perfcap
> 
> 2037-2050 with vrel?
> 
> The later gave me higher scores in benchmark


I'd go 2012-2025 for just gaming; no throttling helps frame-rate stability, I think.

Actually, test the games you play at higher clocks.

Diablo 3 with settings maxed out in Nvidia Control Panel does not throttle even at 2088 1.093V.


----------



## c0nsistent

So I've been testing with the XOC BIOS, but the only way I'm getting the card to not go above a certain voltage is by using Shift + L to lock it in place. If I set a flat voltage curve the card will still ramp above that voltage regardless. Has anyone else experienced this with the XOC BIOS? I was trying to keep it at 1.05v since I'm on AIO watercooling and @ 1.093v the temps go over 80C rather quickly. I'm waiting on some fittings in the mail and I'll be slapping a Raystorm CPU block with a full pump/radiator that I had laying around onto the card instead of using this AIO Antec 620 setup that I have.

Does anyone use Shift + L to lock the clock/voltage, or is it useless since the card will still drop bins either way? It seems in SuPo 4K, even with the XOC BIOS, the card drops bins as if it's TDP or temp throttling. Wasn't that BIOS supposed to eliminate that? Or is that only possible with a shunt mod?


----------



## cluster71

When I cross 1.100v the core holds its max speed solid and doesn't regulate down in Windows. 40 minutes of Sniper Elite (as usual) at the Nvidia-recommended settings + 8x supersampling on my 34" Acer, at 2139/11600 1.125v on the air-cooled Zotac. When I finished it started to stutter a bit. Maybe it needs more power.


----------



## cluster71

Yes, mine does the same. When voltage is set to 0%, the card will run up to 1.093v as long as the curve keeps increasing; it does not hold back. When I enter 100% voltage, it's possible to plot curve points up to 1.2v.


----------



## DStealth

Just replaced my RMA'd Gaming X with a Palit GameRock... So far so good -- the memory on this card is great. An out-of-the-box run at 2076/2063 core with 12300 memory returned a 32369 graphics score in a Fire Strike GPU run, WOW

http://www.3dmark.com/3dm/20375751

Edit: SuPo 4k is pushing the boundaries of this BIOS...even [email protected]









----------



## Robilar

Interesting thread. I bought mine to play games though









I used to do quite a bit of benchmarking but it somehow lost its allure.

Maybe because it got so easy to overclock PC components.


----------



## DerComissar

Quote:


> Originally Posted by *Robilar*
> 
> Interesting thread. I bought mine to play games though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I used to do quite a bit of benchmarking but it somehow lost its allure,
> 
> Maybe because it got so easy to overclock pc components.


Well, enjoy those games then!


----------



## cluster71

I did a lot of benchmarking and tweaking with previous computers, but just got annoyed at myself for wasting time. With this PC I only play games and make small adjustments from day to day until it's done.


----------



## KedarWolf

Quote:


> Originally Posted by *DStealth*
> 
> Just replaced my RMA'd GamingX with Palit Gamerock ... So far so good the memory of this cards is great ...just out of the box run 2076_2063/12300 responded in 32369 GPU FS run WOW
> 
> http://www.3dmark.com/3dm/20375751


You got a really good card. After a dozen runs on several different BIOSes, this is the best I could do.


----------



## DStealth

Thanks! Keep in mind this is stock blower-cooled, stock BIOS, no extra voltage, and I still haven't found the limit of the memory OC... The cooler is way worse than the Gaming X's, but this card is good anyway.










Edit: Looks like Valley likes the memory very much.


----------



## Mad Pistol

Just got an MSI 1080 Ti Gaming X today to replace my current GTX 1070 FE SLI setup.

So, in terms of absolute framerates, it's a slight downgrade. However, coupling the 1080 Ti with a 1440P 144hz Gsync monitor is nirvana. Everything (and I mean EVERYTHING) is smooth, responsive, and without the subtle stutters associated with SLI.

So yea... definitely an upgraded experience.


----------



## nrpeyton

That's better: (2nd run since shunt mod -- _changed some settings)_

http://www.3dmark.com/3dm/20378778

9694
GFX: 10801
CPU: 6134



Seems rock solid at 2126 Mhz. (& no throttling)

Anything above begins to get very tricky.

Not had a chance to revisit max memory O/C yet since going water & shunt.

Now I just need to get over 10k


----------



## Dasboogieman

Quote:


> Originally Posted by *Mad Pistol*
> 
> Just got an MSI 1080 Ti Gaming X today to replace my current GTX 1070 FE SLI setup.
> 
> So, in terms of absolute framerates, it's a slight downgrade. However, coupling the 1080 Ti with a 1440P 144hz Gsync monitor is nirvana. Everything (and I mean EVERYTHING) is smooth, responsive, and without the subtle stutters associated with SLI.
> 
> So yea... definitely an upgraded experience.


Sometimes it's easy for us enthusiasts to forget that the reason we chase higher frame rates is to get smoother gameplay. That's one of the reasons I gave up my dual-GPU days in favour of the best single GPU you can get.


----------



## nrpeyton

Quote:


> Originally Posted by *Dasboogieman*
> 
> Sometimes, its easy for us enthusiasts to forget the reason we chase higher frame rates in order to get smoother gameplay. One of the reasons I gave up on my dual GPU days in favour of the best single GPU you can get.


Agreed.

I haven't SLI'd since my twin 980's and I've _never_ looked back.


----------



## TheBoom

Quote:


> Originally Posted by *nrpeyton*
> 
> Agreed.
> 
> I haven't SLI'd since my twin 980's and I've _never_ looked back.


Lol I haven't SLI'd since my twin 8800GTS's..


----------



## OneCosmic

Quote:


> Originally Posted by *KedarWolf*
> 
> You got a really good card. after a dozen runs on several different BIOS's this is the best I could do.


Is this still your original Gigabyte FE card? Because I know it wouldn't boot after you flashed the HOF BIOS.


----------



## Coree

Ordered a 1080 Ti w/ a blower fan for 610€ -- good price? I have an Accelero III lying around, but no VRM heatsinks :l
https://www.scan.co.uk/images/infopages/1080Ti_zotac/blower-topimage.png
Does anybody know whether these blower-style non-reference cards are as quiet as the Founders cards?


----------



## fisher6

Quote:


> Originally Posted by *c0nsistent*
> 
> So I've been testing with the XOC BIOS, but the only way I'm getting the card to not go above a certain voltage is by using Shift + L to lock it in place. If I set a flat voltage curve the card will still ramp above that voltage regardless. Has anyone else experienced this with the XOC BIOS? I was trying to keep it at 1.05v since I'm on AIO watercooling and @ 1.093v the temps go over 80C rather quickly. I'm waiting on some fittings in the mail and I'll be slapping a Raystorm CPU block with a full pump/radiator that I had laying around onto the card instead of using this AIO Antec 620 setup that I have.
> 
> Does anyone use Shift + L to lock the clock/voltage or is it useless as the card will still drop bins either way. It seems In SuPo 4K even with the XOC BIOS the card will drop bins as if its TDP throttlling or temp throttling. Wasn't that BIOS supposed to eliminate that? Or is it only possible with a shunt mod?


I have the exact same issue. I'm using the XOC BIOS as my daily driver and haven't added any voltage with the slider in AB, but I also can't keep the voltage from going beyond a certain value unless I use the lock hotkey. It doesn't bother me since temps are fine and I get no throttling at all, but I have an EK waterblock.


----------



## dallas1990

I would assume so, since by the looks of them it's just a redesigned shroud. But I really don't know, since I don't own an Aero-cooled card.


----------



## feznz

Quote:


> Originally Posted by *tagaxxl*
> 
> hey guys ..i got zotac 1080 ti amp extreme edition and while i watch videos on youtube using waterfox/chrome it freeze for bit and sometimes i get blue screen with this error
> 
> 
> 
> i did a clean install the drivers using display driver uninstaller and still get the same error


So after testing for a few days after a clean install, everything was running smooth as butter --
until tonight, about an hour after installing MSI AB 4.4 beta 10, I got my first BSOD. So it looks like MSI AB is the culprit.


----------



## KedarWolf

Quote:


> Originally Posted by *OneCosmic*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> You got a really good card. after a dozen runs on several different BIOS's this is the best I could do.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is this still your original Gigabyte FE card? Because i know that you had no boot with it after flashing HOF BIOS.

Yeah, it's my original. Took three days to fix it.


----------



## Lefty23

Quote:


> Originally Posted by *feznz*
> 
> So after testing for a few days after a clean install everything running smooth as butter
> until tonight about an hour after installing MSI AB 4.4 beta10 I get the first BSOD so looks like it is MSI AB is the culprit


Sorry to hear that. Do you install the Rivatuner Statistics Server when you install AB?
I would try uninstalling it to see if anything changes.
RTSS has caused instability (OS crashes) for me in the past (~1 year ago).
Probably not your problem but you might consider trying?

Also when I tried the beta 6 build I was getting sporadic game/driver crashes so I went back to 4.3.0 "stable" build with no issues in the last couple weeks.
As I'm pretty sure you know, all you have to do is edit a file to get voltage control.

edit: I'm not saying AB/RTSS beta builds are bad, just that it might clash with something else in your system, as it seems to do in mine.


----------



## SirBubby

Quote:


> Originally Posted by *TheBoom*
> 
> You should just use the actual core clock numbers instead of saying +175 or +xxx because every card is different in boost clocks and offsets. We wouldn't know what actual core clock you are getting if you say +175 and we can't tell you if its good or not.
> 
> That being said, anything over 2ghz is considered good and anything over 2100 is considered more or less a silicon lottery winner.
> The consensus is that as long you don't get perfcaps in GPU-Z then you are good, however running into a perfcap like powerlimit *may* impact performance.
> 
> If you do increase voltage you won't be able to get higher stable clocks anyway if you are indeed running into powerlimits.


I kinda figured.
Quote:


> Originally Posted by *SirBubby*
> 
> No lock ups in Heaven or Battlefield 1 after 15min. I need to stress test more. PUBG is a good test as that game will murder your hardware.
> 
> My power usage hangd around 105-110% but also spikes past 120% if I go higher than +12% voltage. That was my question is the spikes bad or should I be ok with spikes over 120% and increase the voltage?


Tested more last night: my previous clocks were stable in Heaven and BF1 but not PUBG -- I crashed within the first 2 minutes. I lowered the voltage to 0% and was still hitting the power limit. I got it stable at +150 on the core, and it settled at 1987MHz @ 1050mv; sometimes it drops to 1973MHz @ 1043mv. I'm pretty happy with that. Very tempted to water-cool it and try the shunt mod.


----------



## ALSTER868

Quote:


> Originally Posted by *Coree*
> 
> Ordered a 1080 TI w/ a Blower fan for 610€, good price? I have a Accelero III laying around, but no VRM heatsinks :l
> https://www.scan.co.uk/images/infopages/1080Ti_zotac/blower-topimage.png
> Does anybody know are these blower style non ref ones as quiet as the founders cards?


I'd find some cheap heatsinks for the VRM and VRAM so you can use your ''old'' Accelero III; there are heaps of them around on eBay and so on. Its performance is still phenomenal with the 1080 Ti, plus noise is next to zero -- hardly noticeable even at 100% fans to my hearing. A far better choice than listening to the FE blower whine, imo.
10-15 loops of [email protected] extreme, 2025/[email protected] - 55C max at 23C ambient.


----------



## Frosty288

Guys! I have a 1080Ti!

I have never owned a top-model Nvidia card before -- I've always had the model below it -- so I just retired my 970. I think before that it was the 670? Or 470? It's been so long...


----------



## Mad Pistol

Quote:


> Originally Posted by *Dasboogieman*
> 
> Sometimes, its easy for us enthusiasts to forget the reason we chase higher frame rates in order to get smoother gameplay. One of the reasons I gave up on my dual GPU days in favour of the best single GPU you can get.


Agreed. You would think I learned my lesson from an SLI GTX 780 setup, but I went ahead with a 1070 SLI setup.

Definitely a believer in a single powerful GPU again.


----------



## Mad Pistol

Quote:


> Originally Posted by *Frosty288*
> 
> Guys! I have a 1080Ti!
> 
> I have never owned a top model NVidia card before, I've always had the model below it, so I just retired my 970. I think before that, was the 670? Or 470? It's been so long...


Welcome to the club bro. Drinks are in the fridge, and ultra details/high frame rates are always on order.


----------



## Blotto80

I'm pretty impressed with the Heatkiller IV block. I've been mining with Nicehash since 5pm last night at 2000mhz/.95v and it's sitting at 37 degrees. That's on 360 and 240 rads with a 5930k and a reasonable fan curve.


----------



## Heruur

I recently purchased a 1080 Ti for my Shuttle XPC; my concern is with temperatures. It has been running between 86-89C (~70-80% fan speed) due to the ambient temp in the house (the last few summer days have been really hot). Is this safe, or do I need to worry about long-term damage to the GPU?


----------



## TheBoom

Quote:


> Originally Posted by *Heruur*
> 
> I recently purchased a 1080Ti for my Shuttle XPC; my concern is with temperatures. It has been running between 86-89C (~70-80% fan speed) due to ambient temp in house (last few summer days have been really hot). Is this safe or do I need to worry about long term damage to gpu


Technically anything below 90C is fine, but of course extra heat will always reduce lifespan. We're not talking a 50% reduction, though -- maybe 10-20%, which wouldn't be enough to make a difference unless you plan on keeping it for 6-8 years or more.

Try undervolting and keeping it at slightly lower clocks if you're really concerned about temps.


----------



## kolkoo

Quote:


> Originally Posted by *KedarWolf*
> 
> Yeah, is my original. Took three days to fix it.


So which BIOS are you using now to get your best results -- XOC or the Arctic Storm one?


----------



## kevindd992002

Now that Kaby Lake-X is out, what would be the best CPU to pair with the 1080 Ti at 1440p? The 7740X, or is it better to spend $50 more on the 6-core/12-thread 7800X?


----------



## Mad Pistol

Quote:


> Originally Posted by *kevindd992002*
> 
> Now that the Kaby Lake X is out, what would be the best CPU to wait for that can go along with the 1080Ti at 1440p? The 7740X or better yet spend $50 more to get a 6/12 7800X?


If you're getting a CPU with more than 4 cores, go with Ryzen and/or Threadripper. Intel's new X299 platform seems like a waste of space.


----------



## DADDYDC650

Quote:


> Originally Posted by *kevindd992002*
> 
> Now that the Kaby Lake X is out, what would be the best CPU to wait for that can go along with the 1080Ti at 1440p? The 7740X or better yet spend $50 more to get a 6/12 7800X?


Why would anyone spend money on x299?


----------



## Coree

The whole X299 platform is a mess. Why is there a 4-core option with the exact same features as normal Kaby Lake?


----------



## Hulio225

Quote:


> Originally Posted by *Coree*
> 
> The whole X299 is a mess, *why is there a 4 core option which has the exact same features as the normal Kaby Lake*.


Because it is the normal Kaby Lake, just on a different socket! Other than that, yes, it's a mess -- a big one!


----------



## Enzarch

Well, I haven't owned Nvidia since a pair of 8800 GTs, but here I am, coming off a pair of modded 290s to a Strix 1080 Ti.










This is quite the cakewalk for my loop, since it was designed around a system that could easily pull 1kW from the wall.

I tried the XOC BIOS and saw the ~5% degradation in clock-for-clock performance, so it is now shunt modded instead.

I *seem* to have a good chip, but I cannot achieve the scores you guys do (unless you guys are using driver tricks).

Time Spy: the most I can seem to get is ~10500 graphics score @ 2138/6100MHz (fresh Windows, tuned for max performance, not highest clock). Judging by the scores I see here, shouldn't I be getting roughly 11000?
Same story in Superposition: I can barely break 10000 points (4K Optimized) when it looks like 10500+ should be the norm at these clocks.
Am I missing something really dumb, or is there something awry? (No throttling.)

Again, I haven't been on the green side in a while, so it's all a tiny bit alien to me.

Rig is a Ryzen 1700X @ 4GHz & DDR4-2666 CL14.


----------



## feznz

Quote:


> Originally Posted by *Lefty23*
> 
> Sorry to hear that. Do you install the Rivatuner Statistics Server when you install AB?
> I would try unistalling it to see if anything changes.
> RTSS has caused instability (OS crashes) for me in the past (~1 year ago).
> Probably not your problem but you might consider trying?
> 
> Also when I tried the beta 6 build I was getting sporadic game/driver crashes so I went back to 4.3.0 "stable" build with no issues in the last couple weeks.
> As I'm pretty sure you know, all you have to do is edit a file to get voltage control.
> 
> edit: I'm not saying AB/RTSS beta builds are bad, just that it might clash with something else in your system, as it seems to do in mine.


That's a thoughtful suggestion, so +1.

TBH I'm not worried about voltage control; my personal finding is that I might net maybe 1% or so, so there's really no point going from 2038MHz/1.062v to 2062MHz/1.092v -- unless of course I go the XOC BIOS route. I'd still rather get X399 and go SLI, or I might stay with my current 3770K; still undecided.

On the brighter side, a clean install after 5 years got rid of all the bloatware, and there were a few other niggles I couldn't solve, so overall I'm happy to have done a clean install -- I'd just been avoiding it.
The Administrative event log is looking better too, though it's still logging plenty.
Quote:


> Originally Posted by *Enzarch*
> 
> Well, it hasnt been since a pair of 8800-GT's that I have owned nVidia; But here I am coming off a pair of modded 290's to a Strix 1080Ti
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This is quite the cakewalk for my loop since it was designed around a system that could easily pull 1KW from the wall
> 
> I tried the XOC BIOS and saw the 5% degradation in clock for clock performance so it is now shunt modded instead.
> 
> I *seem* to have a good chip, but I cannot achieve the scores you guys do (unless you guys are using driver tricks)
> 
> Timespy GPU score: The most i can seem to get is ~10500 graphics score @ 2138 / 6100MHz (fresh Windows, tuned for max performance, not highest clock) Judging by the scores I see here, should I not be getting roughly 11000?
> Same story in Super Position, I can barely break 10000 (4k optimized) points when it looks like 10500+ should be the norm for these high clocks.
> Am I missing something really dumb, or is there something awry? (No throttling)
> 
> Again, I haven't been to the green side in a while, all a tiny bit alien to me.
> 
> Rig is Ryzen 1700x @ 4Ghz & DDR4 2666 CL14


That's a huge jump in performance -- like me coming from GTX 770 SLI.

I could almost guarantee the performance mods are being used; who wouldn't use them, when it's a legitimate performance gain for a few clicks of the mouse? The guide to the driver mods is in the first post here:

http://www.overclock.net/t/1627767/top-30-unigine-superposition-benchmark/0_20


----------



## phenom01

Kinda off-subject and probably deserves its own thread, but did anyone see what Nvidia's stock is doing?

http://www.cnbc.com/2017/06/09/wall-street-analysts-keep-one-upping-each-other-over-nvidia.html

It's the highest-performing stock in the S&P 500, and expected to go even higher by, well... pretty much everyone. It has tripled in the last year and is supposedly going higher still.

Someone else can make a post about it in the news section; I'm too lazy to do it.









----------



## nrpeyton

Quote:


> Originally Posted by *Lefty23*
> 
> Sorry to hear that. Do you install the Rivatuner Statistics Server when you install AB?
> I would try uninstalling it to see if anything changes.
> RTSS has caused instability (OS crashes) for me in the past (~1 year ago).
> Probably not your problem but you might consider trying?
> 
> Also when I tried the beta 6 build I was getting sporadic game/driver crashes so I went back to 4.3.0 "stable" build with no issues in the last couple weeks.
> As I'm pretty sure you know, all you have to do is edit a file to get voltage control.
> 
> edit: I'm not saying AB/RTSS beta builds are bad, just that it might clash with something else in your system, as it seems to do in mine.


Riva doesn't ever cause me any crashes or Windows errors *after* my game has loaded; but it *does stop some games from starting up*, i.e. both The Witcher 3 and now StarCraft II. I have to start the game *first*, minimise to Windows, then load Riva, after which they work together perfectly.

However, if Riva is running when I try to launch, the game will either crash back to the desktop within seconds of double-clicking its shortcut, or will return an error message within seconds (before the PC even goes into full-screen mode).

I'm using version 7.0.0 beta 15.

I think it is causing some sort of conflict with DirectX.

P.S. I'm also on fresh windows install.


----------



## cluster71

I had some frame error or something: red dialogs came up on the Windows desktop, and Afterburner 4.4.0 Beta 10 crashed and exited after that. That was when I had high latency times a week ago. Since I got my LatencyMon numbers down, there has been no problem.


----------



## Robilar

Quote:


> Originally Posted by *Mad Pistol*
> 
> Just got an MSI 1080 Ti Gaming X today to replace my current GTX 1070 FE SLI setup.
> 
> So, in terms of absolute framerates, it's a slight downgrade. However, coupling the 1080 Ti with a 1440P 144hz Gsync monitor is nirvana. Everything (and I mean EVERYTHING) is smooth, responsive, and without the subtle stutters associated with SLI.
> 
> So yea... definitely an upgraded experience.


Been there... SLI: I had many sets and was never completely problem-free. Once I went single card, pretty much every issue abated.

G-Sync is great as well. I'm on my 3rd G-Sync monitor, and for gaming I would never go back.


----------



## nrpeyton

Anyone with an 'i7 7700k' and a Founders Edition Ti have 3dmark Timespy scores they could post please?

I just beat my own best score: http://www.3dmark.com/3dm/20393399

New Best:

*Overall:*
9859

*Graphics:*
11070

*CPU:*
6088

_(Better than 96% of all results).
_


----------



## methadon36

Just got my EVGA 1080 Ti SC Black Edition and EK water block today. Just installed it and flushed the loop. The backplate got delayed till Monday but I couldn't wait lol. Just ran BF1 as a test and it all works; temps stay at 44-47°C. Won't bench till Sunday or Monday night when I have time.


----------



## blurp

Quote:


> Originally Posted by *nrpeyton*
> 
> Anyone with an 'i7 7700k' and a Founders Edition Ti have 3dmark Timespy scores they could post please?
> 
> I just beat my own best score: http://www.3dmark.com/3dm/20393399
> 
> New Best:
> 
> *Overall:*
> 9859
> 
> *Graphics:*
> 11070
> 
> *CPU:*
> 6088
> 
> _(Better than 96% of all results).
> _


Same specs as you. GPU @ 2100 with a lot of power throttling, CPU @ 4.8. Just did a run for fun (no crazy optimization).

http://www.3dmark.com/3dm/20393857?

Score : 9779
GPU : 10993
CPU: 6016


----------



## nrpeyton

Quote:


> Originally Posted by *blurp*
> 
> Same specs as you. GPU @2100 with a lot of power throttling. CPU @ 4.8.Just did a run for fun (no crazy optimization)
> 
> http://www.3dmark.com/3dm/20393857?
> 
> Score : 9779
> GPU : 10993
> CPU: 6016


Thanks 

Just wanted a comparison to see how I was doing 

No crazy optimisation either, but I did follow the advice on page 1 _(opening page of this thread)_ regarding Nvidia control panel settings for Timespy.

I also shunt modded the card the other night, so power throttling isn't wayyy overboard like it normally is on a FE. (It certainly doesn't feel that way anyway-- and it sure did before the mod). _Saying that-- I really wish I'd taken the time to get some baselines before I did the mod. I would have appreciated the mod more as a result._

Quote:


> Originally Posted by *methadon36*
> 
> Just got my EVGA 1080TI SC Black Edition and EK Water block today, Just installed it and flushed the loop. Back Plate got delayed till monday but I couldn't wait lol . Just ran bf1 as a test and it all works temps stay at 44-47c. Won't bench till sunday or monday night when I have time.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/content/type/61/id/3055523/width/500/height/1000


Looks good.

We have the same block I think. (Looks the same anyway) Copper plexi...

My temps are pretty much exactly the same too... how much rad space you got?
I am only on a single 360 _(medium thickness)_.

I'll tell you all something though, and this is probably most relevant to the water-coolers amongst us: the 3 GDDR5X memory chips closest to the VRM are getting *very* hot on my card.

I noticed during my EK block install that the part of the VRM that was covered by the stock FE cooler isn't cooled by the EK block. (EK decided we shouldn't place a thermal pad there.) But there was definitely a thermal pad there on air. I don't know what those components are called.. but it's the row in between those memory chips and the VRM.

I've marked an arrow to show what I'm referring to below:



I don't have a backplate either. So when I place my finger behind that part on the PCB while running a bench it's too hot to touch for more than 1-2 seconds.

On the other hand.. when I touch directly behind the MOSFETS (the part of the VRM which does have the biggest thermal pad) it's certainly HOT but I *am* able to keep my finger there for 5/6 seconds or more. I mean having that thermal pad there is definitely helping.

I can't help feeling that if there was also one where I've marked the arrows.. we'd definitely see better temps on those memory chips.

In contrast.. the 4 GDDR5X chips on the I/O side don't even feel hot at all... at most they're only *warm*.

Now my memory O/C has got better since going water.... so memory is definitely temperature sensitive and definitely rewards you for keeping them cool.

I feel like an email to EK support coming on.. _(I want to understand their thought process behind this decision)._

Nvidia certainly knew what they were doing when they removed the one smack bang in the middle, on the VRM side.

I'll let you all know how I get on.

If they're unhelpful I may just do some trial + error and experiment myself to find the right height pads for that row. (so they make contact with copper block).


----------



## methadon36

I have a 360 and a 120 rad also.. My 6700k is running at 4.9ghz. so I wanted the extra rad for just that..

Yea when I put my thermal pads on it only covered the chokes.. and the PCB behind them and not that part you are referring to


----------



## DStealth

Quote:


> Originally Posted by *Enzarch*
> 
> I *seem* to have a good chip, but I cannot achieve the scores you guys do (unless you guys are using driver tricks)
> 
> Timespy GPU score: The most i can seem to get is ~10500 graphics score @ 2138 / 6100MHz (fresh Windows, tuned for max performance, not highest clock) Judging by the scores I see here, should I not be getting roughly 11000?
> Same story in Super Position, I can barely break 10000 (4k optimized) points when it looks like 10500+ should be the norm for these high clocks.
> Am I missing something really dumb, or is there something awry? (No throttling)


Something is definitely off with these scores. Try reducing the core overclock (past a point it can lower scores) and then try lowering the Vmem, as it has ECC and some memory straps are 2-3% better than others...
11k TS GPU scores are in the range of 2050 core with good memory (12 Gbps+), and as for 4K SuPo, 10k is reachable above 2 GHz anytime... Also check your NV panel settings; options like Vsync/Fast Sync/G-Sync will greatly reduce the score too.
Good luck and keep us updated; your card seems like a serious contender.









Edit: Throttling from 2063-2050-2038 in the last test as temperatures go near 70°C on this small cooler... still awaiting an EK-Thermosphere to add it to the loop.
http://www.3dmark.com/spy/1886516 *11139 GPU*
Tried XOC with a stable 2076 run, but the score was a little lower, 111xx.


----------



## jase78

Quote:


> Originally Posted by *nrpeyton*
> 
> Thanks
> 
> Just wanted a comparison to see how I was doing
> 
> No crazy optimisation either, but I did follow the advice on page 1 _(opening page of this thread)_ regarding Nvida control panel settings for Timespy.
> 
> I also shunt modded the card the other night, so power throttling isn't wayyy overboard like it normally is on a FE. (It certainly doesn't feel that way anyway-- and it sure did before the mod). _Saying that-- really wish I'd taken the time to take some baselines before I done the mod. I would have appreciated the mod more as a result._
> 
> Looks good.
> 
> We have the same block I think. (Looks the same anyway) Copper plexi...
> 
> My temps are pretty much exactly the same too... how much rad space you got?
> I am only on a single 360 _(medium thickness)_.
> 
> Tell you all something though.. and this probably is more relevant to the water coolers amongst us. But the 3 GDDR5X memory chips closest to the VRM are getting *very* hot on my card.
> 
> I noticed during my EK block install-- that part of the VRM that was covered by the stock FE cooler isn't cooled by the EK block. (EK decided we shouldn't place a thermal pad). But there was definitely a thermal pad there on air. Don't know what they're called.. but it's the row inbetween the those memory chips and the VRM.
> 
> I've marked an arrow to show what I'm referring to below:
> 
> 
> 
> I don't have a backplate either. So when I place my finger behind that part on the PCB while running a bench it's too hot to touch for more than 1-2 seconds.
> 
> On the other hand.. when I touch directly behind the MOSFETS (the part of the VRM which does have the biggest thermal pad) it's certainly HOT but I *am* able to keep my finger there for 5/6 seconds or more. I mean having that thermal pad there is definitely helping.
> 
> I can't help feel that if there was also one where I've marked the arrows.. that we'd defintely see better temps on those memory chips.
> 
> In contrast.. the 4 GDDR5X chips on the I/O side don't even feel hot at all... at most they're only *warm*.
> 
> Now my memory O/C has got better since going water.... so memory is definitely temperature sensitive and definitely rewards you for keeping them cool.
> 
> I feel like an email to EK support coming on.. _(I want to understand their thought process behind this decision)._
> 
> Nvidia certainly knew what they were doing when they removed the one smack bang in the middle, on the VRM side.
> 
> I'll let you all know how I get on.
> 
> If they're unhelpful I may just do some trial + error and experiment myself to find the right height pads for that row. (so they make contact with copper block).


I'm curious to hear your results if you decide to do some temp readings with different thermal pad thicknesses. As far as those chips go, though, I went ahead and put a thermal pad strip on them just like NV had.


----------



## smithsrt8

Quote:


> Originally Posted by *DStealth*
> 
> Something is definitely off with these scores. Try reducing core overclock/it can lower scores at some point/ and then try lowering the Vmem as it has ECC and some straps are much better as 2-3% than others...
> 11k TS GPU are in the range of 2050 core and good memory 12+ and as for the 4k SuPo 10k are reachable over 2ghz anytime...Also go and see your NV panel settings, some options as Vsync/Fast sync/Gsync will reduce the score greatly also.
> Good luck and keep us updated, your card seem like a serious contender
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Throttling from 2063-2050-2038 at the last test as temperatures goes near 70* on this small cooler...still awaiting EK-Thermosphere to add it to the loop
> http://www.3dmark.com/spy/1886516 *11139 GPU*
> Tried XOC with stable 2076 run but the score was a little less 111xx


I can't wait to get my "advanced replacement" card from Gigabyte... this card seems like a dud anyway. It won't go past PCIe x4 (I think there are some broken caps on the bottom side), and I cannot clock more than 2050-2055 before it starts to crash. My best score in Timespy is 9703 (10771 graphics score).


----------



## kevindd992002

Quote:


> Originally Posted by *Mad Pistol*
> 
> If you're getting a CPU with more than 4 cores, go with Ryzen and/or Threadripper. Intel's new X299 platform seems like a waste of space.


I'm kinda biased towards Intel as I am with Nvidia. What do you mean by "waste of space"?

Also, how is the X299 a "big mess"?

Quote:


> Originally Posted by *DADDYDC650*
> 
> Why would anyone spend money on x299?


I'm not sure, which is why I asked here in this thread.


----------



## TheBoom

Quote:


> Originally Posted by *kevindd992002*
> 
> I'm kinda biased towards Intel as I am with Nvidia. What do you mean by "waste of space"?
> 
> Also, how is the X299 a "big mess"?
> 
> I'm not sure which is why I asked here in this thread?


Marketing gone wrong. You might wanna watch JayzTwoCents' or Linus's video on why it's a big "mess".

If you really want to stick with Intel, as I do, you should just wait for Coffee Lake instead, which is rumoured for Q3-Q4.


----------



## DADDYDC650

Quote:


> Originally Posted by *kevindd992002*
> 
> I'm kinda biased towards Intel as I am with Nvidia. What do you mean by "waste of space"?
> 
> Also, how is the X299 a "big mess"?
> 
> I'm not sure which is why I asked here in this thread?


x399 is where it's at bud. Thread rip dat @ss.


----------



## DStealth

Quote:


> Originally Posted by *DStealth*
> 
> Thanks; keep in mind this is stock blower-cooled, stock BIOS, no extra voltage, and I still haven't found the limit of the memory OC... The cooler is way worse than the Gaming X, but anyway, this card is good.


Still on the stock blower... this card is a beast: 32500 GPU score with +700 memory.









http://www.3dmark.com/3dm/20397629

Edit: New driver *32522*

http://www.3dmark.com/3dm/20398752


----------



## kevindd992002

Quote:


> Originally Posted by *TheBoom*
> 
> Marketing gone wrong. You might wanna watch Jayztwocentz's or Linus's video on why it's a big "mess".
> 
> If you really want to stick with Intel as do I, you should just wait for Coffeelake instead which is rumoured for Q3-Q4.


Quote:


> Originally Posted by *DADDYDC650*
> 
> x399 is where it's at bud. Thread rip dat @ss.


Ok, not good then  I'm practically building a new rig and I'm coming from an i7-2600K. Waiting for Q3-Q4 is really hard.


----------



## Hulio225

Quote:


> Originally Posted by *nrpeyton*
> 
> Anyone with an 'i7 7700k' and a Founders Edition Ti have 3dmark Timespy scores they could post please?
> 
> I just beat my own best score: http://www.3dmark.com/3dm/20393399
> 
> New Best:
> 
> *Overall:*
> 9859
> 
> *Graphics:*
> 11070
> 
> *CPU:*
> 6088
> 
> _(Better than 96% of all results).
> _


*Overall:*
10412

*Graphics:*
11376

*CPU:*
7036

http://www.3dmark.com/spy/1677008


----------



## feznz

Quote:


> Originally Posted by *kevindd992002*
> 
> Ok, not good then
> 
> 
> 
> 
> 
> 
> 
> I'm practically building a new rig and I'm coming from an i7-2600K. Waiting for Q3-Q4 is really hard.


We can do it.
I think I can, I think I can, choo choo!
I'm holding out too; I've been looking a lot, especially when there have been half-price sales.
Looking at people's bench results does actually help the urge.
https://www.overclock3d.net/news/cpu_mainboard/msi_and_g_skill_achieve_world_record_memory_speeds_of_5_5ghz/2

To keep this on topic and not devolve into a CPU thread again, here are some of my bench results: max clock with stock voltage.
I see my Firestrike run reports 2100 MHz, but I'm pretty sure I ran most of the test @ 2063 MHz; major throttling going on there. My card is only good for 2038 MHz 24/7.

http://www.3dmark.com/spy/1511440
http://www.3dmark.com/fs/12341867


----------



## The Storm

Well, I took the plunge, sold off my 3 AMD R9 290Xs and got a new 1080 Ti. I went with the EVGA SC Black since I needed the DVI port, and put on the EK water block. Anyway, I also didn't agree with EK's thermal pad placement and put pads everywhere the stock cooler had them. After an hour of BF4 I didn't get above 34°C; mind you, my loop is set up to cool 3 290Xs. I was also able to reuse the factory backplate: just drill the holes in the backplate 1 size bigger and it works perfectly.


----------



## Mad Pistol

Quote:


> Originally Posted by *kevindd992002*
> 
> Ok, not good then
> 
> 
> 
> 
> 
> 
> 
> I'm practically building a new rig and I'm coming from an i7-2600K. Waiting for Q3-Q4 is really hard.


If you absolutely have to upgrade now and you want to stick with Intel, an i7-7700k + Z270 board is your path. As everyone else has said, X299 doesn't seem like a good platform. If you absolutely want an HEDT setup, AMD's X399/Threadripper setup is what you want to wait for.


----------



## dansi

Anyone have this prob with AB overclock on the curve?

I would set it perfectly, stable, no perfcap at [email protected] super stable and all. All voltage points beyond 1.05v i capped at 2025.

On the next cold boot, the AB curve somehow jumped to [email protected] and instability struck!

***! AB is not saving my curve properly.


----------



## ViRuS2k

Quote:


> Originally Posted by *dansi*
> 
> Anyone have this prob with AB overclock on the curve?
> 
> I would set it perfectly, stable, no perfcap at [email protected] super stable and all. All voltage points beyond 1.05v i capped at 2025.
> 
> On the next cold boot, AB curved somehow jumped to [email protected] and instability struck!
> 
> ***! AB is not saving my curve properly.


It does it for me also...
You have to get around it by heating your GPU up to its usual load temp first.

Then set your curve. Say you're stable at 2025:
set the curve @ 2025 while the card is hot, then save the profile.

Then once your card cools down, it will boot up @ 2025 or 2050 at the voltage you set; once the card heats back up to around the temp it was at when you set the curve, it will drop a bin and stay at 2025.

It's the only way around it.
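For anyone unclear on what the curve cap is supposed to do, here's a toy model; the voltage/clock points below are invented for illustration, not read from a real card:

```python
# Toy model of the Afterburner voltage/frequency curve cap: every point
# at or above a chosen voltage is clamped to one clock, so the card
# can't boost past it. The curve points are made up for illustration.

curve = {0.950: 1911, 1.000: 1974, 1.050: 2025, 1.062: 2050, 1.093: 2088}

def cap_curve(curve, v_limit, clock_cap):
    """Clamp every point at or above v_limit to clock_cap (MHz)."""
    return {v: min(c, clock_cap) if v >= v_limit else c
            for v, c in curve.items()}

capped = cap_curve(curve, 1.050, 2025)
print(capped)  # points at 1.050 V and above are pinned to 2025 MHz
```

The complaint above is that AB re-applies this clamp inconsistently across cold boots, not that the clamp itself is wrong.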


----------



## dansi

Quote:


> Originally Posted by *ViRuS2k*
> 
> It does it for me also.....
> you have to get around it by heating up your GPU to its limit temp....
> 
> then set your curve SAY your stable at 2025....
> set your curve under heat @ 2025 then save profile....
> 
> then once your card cools down it will boot up @ 2025 or 2050 @ your voltage that you set then once your card heats up to the same temp or around it at when you set the curve it will lower the bin and stay at 2025.....
> 
> its the only way around it.


not working.









On reboot, the clocks have gone messy.


----------



## nrpeyton

Quote:


> Originally Posted by *jase78*
> 
> am curious to here your results if you decide to do some temp readings with diff. thermal pad thicknesses. As far as those chips go though i went ahead and put thermal pad strip on them just like NV had.


Did you check it's making contact with the block?

Quote:


> Originally Posted by *Hulio225*
> 
> *Overall:*
> 10412
> 
> *Graphics:*
> 11376
> 
> *CPU:*
> 7036
> 
> http://www.3dmark.com/spy/1677008


I checked your Timespy validation/score out.

Looks good.

But I couldn't help also noticing your core clock was only 1,592 MHz on your 1080 Ti.

I'd be very interested to know what settings you are using in the control panel or if you have any other advice/recommendations.

Come on, it's always best to share lol ;-)

P.S. if you can hit that score at that core clock, imagine what you'd be hitting at 2100 MHz.

I did also notice your CPU was overclocked to 5.25 GHz-- but your graphics score was still ahead of mine too.

Quote:


> Originally Posted by *The Storm*
> 
> Well I took the plunge and sold off my 3 AMD R9-290x's and got a new 1080ti. I went with the EVGA SC black, I needed the DVI port and put on the EK water block. Any way, I also didn't agree with EK's thermal pad placement and put pads everywhere the stock cooler had them. After an hour of BF4 I didn't get above 34°c, mind you mine is setup to cool 3 290x's. I was also able to use the factory back plate, just drill the holes in the backplate 1 size bigger and it works perfectly.


Interesting.. you must have a lot of radiator space then. Those are nice temps.

Personally I don't care for the back plate... I think it just locks heat in!

I don't even recall there being any thermal pads on it at stock!

However certainly from an aesthetic point of view then yeah by all means 

Partly due to the direct-die design, *modern Nvidia GPUs are good at matching coolant temp:*

At idle my 1080 used to match it to within 1-2 degrees. Once I knew how many degrees above coolant temp a workload would put my GPU, I could always use the GPU temp to validate my coolant temp. A heavy workload was usually 10°C over; Furmark 12-13°C over.

I've not had a chance to test this on my Ti yet, but I'm guessing that due to the increased die size it will be exactly the same. (In other words, yes it's a bigger chip-- but a larger die can also dissipate more heat.)
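The rule of thumb above fits in a couple of lines; the deltas are the observations quoted here, not universal figures:

```python
# Sketch of the rule of thumb: GPU die temp tracks coolant temp plus a
# fairly constant workload-dependent delta. Deltas below are one
# poster's observations on a 1080 FE, not measured universals.

DELTA_C = {"idle": 2, "gaming": 10, "furmark": 13}

def estimated_coolant_temp(gpu_temp_c, workload):
    """Back out coolant temp from GPU temp for a known workload."""
    return gpu_temp_c - DELTA_C[workload]

print(estimated_coolant_temp(44, "gaming"))  # 44 C under load -> 34 C coolant
```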


----------



## Benny89

Ordered an EK waterblock and backplate for ma STRIX.

Soon this bad boy will be swimming


----------



## The Storm

Quote:


> Originally Posted by *nrpeyton*
> 
> Interesting.. you must have a lot of radiator space then. Those are nice temps.
> 
> Personally I don't care for the back plate... I think it just locks heat in!
> 
> I don't even recall there being any thermal pads on it at stock!
> 
> However certainly from an aesthetic point of view then yeah by all means 


I have a lot of radiator space: 2 RX 480mm, 1 RX 360mm, 1 EX 240mm.


----------



## FastEddieNYC

First I want to thank everyone that contributed to this thread. It really helped me get a handle on Pascal and how to shunt mod the card. My last Nvidia card was a GTX 780.
Quote:


> Originally Posted by *The Storm*
> 
> Well I took the plunge and sold off my 3 AMD R9-290x's and got a new 1080ti. I went with the EVGA SC black, I needed the DVI port and put on the EK water block. Any way, I also didn't agree with EK's thermal pad placement and put pads everywhere the stock cooler had them. After an hour of BF4 I didn't get above 34°c, mind you mine is setup to cool 3 290x's. I was also able to use the factory back plate, just drill the holes in the backplate 1 size bigger and it works perfectly.


I just did the same. I was running 290X Crossfire for the last few years and got tired of all the issues, so I purchased the EVGA 1080 Ti SC Black. After reading some reviews I decided to go with the Kryographics water block and backplate. It uses direct-contact memory cooling (with TIM), and the backplate is used as a large heat sink to cool the backside of the VRMs.


The pic is not upside down; I have the tray reversed so the block is facing up. Why have a nice WB and not be able to see it? Plus, gravity helps keep the liquid metal on the shunts while I test and run benchmarks.
I am currently bleeding the system and putting everything back together. I am really excited to see what the card can do when load temps are under 36°C. Time to crank up the AC.


----------



## Enzarch

Quote:


> Originally Posted by *Enzarch*
> 
> I tried the XOC BIOS and saw the 5% degradation in clock for clock performance so it is now shunt modded instead.
> 
> I *seem* to have a good chip, but I cannot achieve the scores you guys do (unless you guys are using driver tricks)
> 
> Timespy GPU score: The most i can seem to get is ~10500 graphics score @ 2138 / 6100MHz (fresh Windows, tuned for max performance, not highest clock) Judging by the scores I see here, should I not be getting roughly 11000?
> Same story in Super Position, I can barely break 10000 (4k optimized) points when it looks like 10500+ should be the norm for these high clocks.
> Am I missing something really dumb, or is there something awry? (No throttling)
> 
> Again, I haven't been to the green side in a while, all a tiny bit alien to me.
> 
> Rig is Ryzen 1700x @ 4Ghz & DDR4 2666 CL14


Quote:


> Originally Posted by *DStealth*
> 
> Something is definitely off with these scores. Try reducing core overclock/it can lower scores at some point/ and then try lowering the Vmem as it has ECC and some straps are much better as 2-3% than others...
> 11k TS GPU are in the range of 2050 core and good memory 12+ and as for the 4k SuPo 10k are reachable over 2ghz anytime...Also go and see your NV panel settings, some options as Vsync/Fast sync/Gsync will reduce the score greatly also.
> Good luck and keep us updated, your card seem like a serious contender


I have rigorously tested and retested starting from zero, separately and together; these are the highest-performing clocks. (Core clocks make little difference beyond 2100, though.)

The Timespy driver tweaks netted me roughly 200 graphics points, but 10700 is still a bit off considering these clocks.
That seems to leave me with only one potential culprit: maybe the Ryzen platform is somehow holding the graphics subsystem back a percent or two? All of the scores I am comparing against are on Intel.


----------



## nrpeyton

Resistors just ordered for upgrade to Coopiklaani's power mod.

Still got some other stuff to get too-- but at least this gets the ball rolling 
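For anyone following along, the arithmetic behind shunt/resistor power mods is just parallel resistance. A hedged sketch, assuming a 5 mΩ stock shunt (an assumption for illustration; check the marking on your own card, e.g. "R005", before modding anything):

```python
# Effect of stacking a resistor in parallel with a current-sense shunt.
# The 5 mOhm values are assumed for illustration, not verified specs.

def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

stock_shunt = 0.005   # 5 mOhm stock shunt (assumption)
added       = 0.005   # 5 mOhm resistor stacked on top

effective = parallel(stock_shunt, added)   # halves the resistance
scale     = stock_shunt / effective        # factor power is under-read by

print(f"effective shunt: {effective * 1000:.2f} mOhm")
print(f"card under-reports power by a factor of {scale:.1f}")
# Same sense voltage, half the resistance: the controller sees half the
# real current, so the effective power limit roughly doubles.
```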



Quote:


> Originally Posted by *Benny89*
> 
> Ordered EK waterblock and blackplate for ma STRIX.
> 
> Soon this bad boy will be swimming


lol


----------



## sid4975

hey guys, I need some help. I just got a recertified X34 to go with the Zotac 1080 Ti I got 2 weeks ago, and hooked the monitor up last night. I clocked it to 100 Hz and tried out 2 of my games, Witcher 3 and Far Cry 4; both crashed within 10-15 mins of playing. I was running my 2K monitor with my 1080 Ti fine before I switched last night, though I didn't really play anything for more than a few minutes on the old 2K because I was waiting for the X34.

Anyway, I'm trying to figure out whether it's the 1080 Ti or the recertified X34 that's causing the games to freeze. Can anyone tell me what programs or tests I can use to narrow it down? I only have a few days to return the monitor; the card I can RMA for a few years, I believe.

The card doesn't seem to ever really rev up; I never really hear the fans, and it never goes above 65 degrees that I've seen. If I'm pushing it on the X34 at ultra, shouldn't the card be maxed?

Any info is appreciated; time is of the essence. Also made a post in the X34 forum.

thanx!


----------



## sid4975

ok, so I just attempted the 3DMark 11 bench test. It didn't even make it through the 1st one before it crashed; I was running it on the X34 at 3440x1440 at 100 Hz. Here is the error.

Workload work failed with error message: eva::d3d11::rendering::scene_renderer::render(): draw_unshadowed_illumination_task for thread 0: File: device_context.cpp
Line: 515
Function: struct ID3D11CommandList *__cdecl eva::d3d11::deferred_device_context::do_finish_command_list(bool)

Expression: native()->FinishCommandList( restore_deferred_context_state, &result): DX11 call failed.

Device hung due to badly formed commands.
DXGI_ERROR_DEVICE_HUNG: ID3D11DeviceContext::FinishCommandList:


----------



## Hulio225

Quote:


> Originally Posted by *nrpeyton*
> 
> | checked your timespy validation/score out.
> 
> Looks good.
> 
> But I couldn't help also noticing your core clock was only 1,592 MHz on your 1080 Ti.
> 
> I'd be very interested to know what settings you are using in the control panel or if you have any other advice/recommendations.
> 
> Come on, it's always best to share lol ;-)
> 
> P.S. if you can hit that score at that core clock, imagine what you'd be hitting at 2100 Mhz.
> 
> I did also notice your CPU was overclocked to 5.25Ghz-- but your graphics score was still ahead of mine too.


The core clock shown in Time Spy is wrong if you use the curve and disable hardware monitoring...

This was at 2113 on the core....


----------



## ReFFrs

Quote:


> Originally Posted by *sid4975*
> 
> I just hooked the monitor up last night. I clocked it to100hz and tried out 2 of my games, witcher 3 and far cry 4. both games crashed within 10-15 mins of playing. I was running my 2k monitor with my 1080ti fine before i switched last night. I didnt really play anthing for more then a few mins on my old 2k though cuz i was waiting for the x34.


What is the exact model of your card? Is it the AMP Extreme or some other version?

Plug in your old monitor and launch some 4K tests, like 3DMark Firestrike Ultra; it will render in 4K regardless of the monitor's native resolution, so you can check whether the card is stable with your old monitor.


----------



## sid4975

It is the Zotac 1080 Ti AMP Extreme, and ok, that was my next idea. I did run the benchmarks on the 2K, but it never really pushed the card. It seems like it's crashing at 64-65 degrees.


----------



## ReFFrs

Quote:


> Originally Posted by *sid4975*
> 
> it is the zotac 1080 ti amp extreme. and ok, that was my next idea. i did run the benchmarks on the 2k but it never really pushed the card. it seems like its crashing at 64-65 degrees


Is your PSU powerful enough to run this card? It has higher power consumption, especially at heavy load.

I have had the AMP Extreme for 2 days already, and it can run 4K benchmarks like Firestrike Ultra and a manually configured Time Spy 4K loop for more than an hour without any crashes at 1987-2000 MHz, with the card's power limit set to the maximum 120% (384W).

My PSU is Corsair AX1200i (1200W)


----------



## Benny89

Quote:


> Originally Posted by *nrpeyton*
> 
> lol


Why "lol"?


----------



## nrpeyton

Quote:


> Originally Posted by *Hulio225*
> 
> The core clock shown in time spy is wrong if you use curve and disable hardware monitoring...
> 
> this was at 2113 Core....


I see. Thanks for clearing that up.









Is there anything else you've done to help push that score up besides your hefty CPU overclock?


----------



## sid4975

I have the Corsair HX850 (850 watt). That should be enough, right? It's never let me down before. I came from a 980 Ti.


----------



## Hulio225

Quote:


> Originally Posted by *nrpeyton*
> 
> I see. Thanks for clearing that up.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is there anything else you've done to help push that score up besides your hefty CPU overclock?


CPU overclock has close to no impact on the GPU score...
Other than that: shunt mod, a hefty VRAM OC, and a stripped-down bench OS.


----------



## Hulio225

Quote:


> Originally Posted by *sid4975*
> 
> i have the corsair hx850 watt. that should b enough right? its never let me down b4. i came from a 980 ti


Yes, it's single rail... 70 amps.
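A rough sanity check of that headroom; the non-GPU numbers below are loose estimates for illustration, not measurements:

```python
# Back-of-envelope headroom check for an 850 W unit. All figures are
# rough estimates; adjust for your own parts.

gpu_max = 384   # AMP Extreme at its 120% power limit (per the post above)
cpu_max = 150   # generous ceiling for an overclocked quad-core i7
rest    = 75    # motherboard, RAM, drives, fans, pump

total  = gpu_max + cpu_max + rest
psu_w  = 850
rail_a = 70     # HX850's single 12 V rail rating

print(f"~{total} W peak, {total / psu_w:.0%} of the PSU")
print(f"~{total / 12:.0f} A on the 12 V rail of {rail_a} A available")
```

Even with everything maxed, that's comfortably inside an HX850, so raw capacity is unlikely to be the crash cause here.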


----------



## buellersdayoff

Quote:


> Originally Posted by *sid4975*
> 
> i have the corsair hx850 watt. that should b enough right? its never let me down b4. i came from a 980 ti


Things to try...

Use two separate 8-pin PCIe cables from the power supply. Bump up the voltage a little as well as the power target, or drop the power target back, to see if you can avoid the crash. Have you tried a clean driver install using DDU? Also, in Device Manager, find your display device and update the driver; it will change the device from generic to Acer X34.


----------



## dunbheagan

Played around with Supo 4K again, got my highest bench today:



Not too bad









[email protected]/1,075V
[email protected]
[email protected],6GHz


----------



## cluster71

Quote:


> Originally Posted by *sid4975*
> 
> hey guys I need some help, just got a recertified x34 to go with my zotac 1080 ti i got 2 weeks ago. I just hooked the monitor up last night. I clocked it to100hz and tried out 2 of my games, witcher 3 and far cry 4. both games crashed within 10-15 mins of playing. I was running my 2k monitor with my 1080ti fine before i switched last night. I didnt really play anthing for more then a few mins on my old 2k though cuz i was waiting for the x34.
> 
> anyway im trying to figure out if its the 1080ti or the recertified x34 thats causing the games to freeze? Can anyone tell me what programs or tests i can use to narrow it down? i only have a few days to return the monitor, the card i can rma for a few years i believe.
> 
> the card doesnt seem to ever really rev up, i never really here the fans and it never goes above 65 degrees that i've seen. if im pushing it on the x34 with ultra shouldnt the card be maxxed?
> 
> any info is appreciated time is of the essence. also made post in x34 forum
> 
> thanx!


What are you using for a DP cable? My X34 gets dropouts over 75 Hz with a standard 4K 60 Hz DP 1.2 cable. You may need a DP 1.3-rated one. I have one and have no problem reaching 100 Hz.
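For what it's worth, rough math says 3440x1440 @ 100 Hz should fit within DP 1.2 (HBR2) bandwidth, so dropouts usually point at a marginal cable rather than a hard spec limit. Sketch, assuming 8-bit color and a generous ~20% blanking overhead:

```python
# Does 3440x1440 @ 100 Hz fit in DP 1.2?  Assumes 8-bit RGB (24 bpp) and a
# generous ~20% blanking overhead; real reduced-blanking timings need less.
h, v, hz, bpp = 3440, 1440, 100, 24
required_gbps = h * v * hz * 1.2 * bpp / 1e9
hbr2_effective = 4 * 5.4 * 0.8  # 4 lanes x 5.4 Gb/s, minus 8b/10b coding
print(f"needed ~{required_gbps:.1f} Gb/s, DP 1.2 (HBR2) carries ~{hbr2_effective:.2f} Gb/s")
print("fits:", required_gbps < hbr2_effective)
```

It's close enough to the limit that a cheap cable with poor margins can drop out at 100 Hz while being fine at 60 Hz.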


----------



## nrpeyton

Quote:


> Originally Posted by *Hulio225*
> 
> CPU overclock has close to none impact on the gpu score...
> other than that, shunt mod hefty vram oc and a stripped down bench os


Aye, that makes sense now, then...

Your GFX score was about 200 points ahead of mine.. and your memory must be overclocked to about +800 to be reporting 1575 MHz.

My memory O/C maxes out at +575. Before I went water it barely did +500.

If only there were an easy way to add a little VRAM voltage.









Pity, because my last GDDR5X GPU did +875. Or +915 at the right temperature.

I could even get through a bench at +1000 _(albeit with severe artefacts)._

Have you changed anything in the Nvidia control panel? I seem to do best with pre-rendered frames at 4.
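For anyone converting that reported memory clock into a data rate: on GDDR5X the effective rate per pin is roughly the reported clock times 8 (that's the convention tools like GPU-Z use; stock ~1375.5 MHz works out to ~11 Gbps). Quick sketch:

```python
# GDDR5X: effective data rate per pin = reported memory clock x 8
# (stock 1375.5 MHz -> ~11 Gbps/pin; the x8 is the GPU-Z reporting convention).
def effective_gbps(reported_mhz):
    return reported_mhz * 8 / 1000

print(effective_gbps(1375.5))  # stock, ~11.0 Gbps/pin
print(effective_gbps(1575.0))  # the overclock above, ~12.6 Gbps/pin
# Total bandwidth over the 1080 Ti's 352-bit bus:
print(f"{effective_gbps(1575.0) * 352 / 8:.0f} GB/s")  # ~554 GB/s
```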


----------



## sid4975

Alright, so I ruled out the monitor by hooking my 2K monitor back up and running 3DMark 11; the card crashes at exactly 65 degrees every time. I already called Newegg, which would've given me a replacement, but the cards are all out of stock, of course. So now I've got to send it back for store credit and wait till it comes back in stock. Meanwhile, I've already eBayed my 980 Ti, lol, so the desktop is gonna be a paperweight for a few weeks.

Also, I don't really mess with voltage and stuff usually, but if I did want to try an increase to see if that changes anything, how much should I push it up?

Also, do you guys think I should go the Newegg way, get the credit, and wait for a new one (or get a different card), or should I go through Zotac's replacement process? I don't know what's smarter; I'm afraid Zotac will give me a refurb, and I don't want a refurb for something I paid $800 for last month.


----------



## buellersdayoff

Quote:


> Originally Posted by *sid4975*
> 
> alright so I ruled out the monitor by hooking my 2k monitor back up and running 3dmark 11, the card crashes at exactly 65 degrees everytime, I already called newegg which would've given me a replacement but the cards are all out of stock of course. so now I gotta send it back for store credit and wait till it comes bac in, meanwhile I already ebay'ed my 980ti lol so the desktop is gonna b a paperweight for a few weeks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> also I don't really mess with voltage and stuff usually but if I did want to try an increase to see if that changes anything how much should I push it up?
> 
> also do you guys think I should go the newegg way get the credit and wait for a new one or get a different card, or should I go through zotac's replacement process? I don't know whats smarter I'm afraid zotac will give me a refurb and I don't want a refurb for something I paid 800$ for last month


You can safely max the voltage slider (100%), but try it @ 40 then 70 first to see how it goes.


----------



## ReFFrs

Quote:


> Originally Posted by *sid4975*
> 
> also do you guys think I should go the newegg way get the credit and wait for a new one or get a different card, or should I go through zotac's replacement process? I don't know whats smarter I'm afraid zotac will give me a refurb and I don't want a refurb for something I paid 800$ for last month


Actually, the 1080 Ti AMP Extreme (when it works) is a very good card compared to other vendors'. You should try getting a replacement and hope it won't have any problems; if the replacement misbehaves too, the problem is somewhere else in your PC, PSU, etc.


----------



## nrpeyton

Quote:


> Originally Posted by *sid4975*
> 
> alright so I ruled out the monitor by hooking my 2k monitor back up and running 3dmark 11, the card crashes at exactly 65 degrees everytime, I already called newegg which would've given me a replacement but the cards are all out of stock of course. so now I gotta send it back for store credit and wait till it comes bac in, meanwhile I already ebay'ed my 980ti lol so the desktop is gonna b a paperweight for a few weeks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> also I don't really mess with voltage and stuff usually but if I did want to try an increase to see if that changes anything how much should I push it up?
> 
> also do you guys think I should go the newegg way get the credit and wait for a new one or get a different card, or should I go through zotac's replacement process? I don't know whats smarter I'm afraid zotac will give me a refurb and I don't want a refurb for something I paid 800$ for last month


You could also try taking 26 MHz off the core clock.

So -26 core (as it goes up in 13 MHz increments).

You wouldn't notice the difference in-game, and it could make the card 100% stable.

If you return the card instead, it will likely be repackaged and re-sold as "down-tuned", usually with a saving of about 50 dollars passed to the buyer.

You may also have been lucky on your GDDR5X memory. You could make up for the slight performance decrease on the core by overclocking the memory. You don't need to touch voltage to O/C memory, so it's essentially risk free.

Start with +400 and go from there. If it crashes, back it off.. but it's more likely you'll get a bit more out of it.. I've seen +1000 on some lucky cards. Most do about +500.

_+500 memory and -26 core would still give you more FPS than you'd get at stock (as if you had no O/C app installed at all)._
_
You could even save the profile, then set it to launch automatically at boot-up, so you don't need to worry about it again._
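Since the core moves in ~13 MHz bins, an offset only takes effect in 13 MHz steps. A tiny sketch of how a requested offset snaps (my own illustration, not any tool's actual code):

```python
# Pascal applies core offsets in ~13 MHz bins, so a requested offset only
# takes effect in 13 MHz steps.
BIN_MHZ = 13

def snap_offset(requested_mhz):
    # truncate toward zero to the bin the offset actually lands in
    return int(requested_mhz / BIN_MHZ) * BIN_MHZ

print(snap_offset(-26))  # -26: exactly two bins down
print(snap_offset(-20))  # -13: partial bins don't apply
print(snap_offset(30))   # 26
```

That's why -26 is the natural "one notch safer than -13" choice: it lines up exactly with two bins.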


----------



## feznz

Quote:


> Originally Posted by *Lefty23*
> 
> Sorry to hear that. Do you install the Rivatuner Statistics Server when you install AB?
> I would try unistalling it to see if anything changes.
> RTSS has caused instability (OS crashes) for me in the past (~1 year ago).
> Probably not your problem but you might consider trying?
> 
> Also when I tried the beta 6 build I was getting sporadic game/driver crashes so I went back to 4.3.0 "stable" build with no issues in the last couple weeks.
> As I'm pretty sure you know, all you have to do is edit a file to get voltage control.
> 
> edit: I'm not saying AB/RTSS beta builds are bad, just that it might clash with something else in your system, as it seems to do in mine.


lol, I give up on MSI AB. I went and installed the 4.3.0 "stable" build and get BSODs even with that; this one just took 14 hours from boot. Could be RivaTuner, I'm not sure, but no MSI AB = no BSOD.


----------



## bmgjet

Quote:


> Originally Posted by *sid4975*
> 
> alright so I ruled out the monitor by hooking my 2k monitor back up and running 3dmark 11, the card crashes at exactly 65 degrees everytime, I already called newegg which would've given me a replacement but the cards are all out of stock of course. so now I gotta send it back for store credit and wait till it comes bac in, meanwhile I already ebay'ed my 980ti lol so the desktop is gonna b a paperweight for a few weeks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> also I don't really mess with voltage and stuff usually but if I did want to try an increase to see if that changes anything how much should I push it up?
> 
> also do you guys think I should go the newegg way get the credit and wait for a new one or get a different card, or should I go through zotac's replacement process? I don't know whats smarter I'm afraid zotac will give me a refurb and I don't want a refurb for something I paid 800$ for last month


My card crashes at 62C, but only in PUBG.
Let me know if you find a fix for your problem; I had to go to a waterblock to fix mine. Now that it stays at 42-48C it's fine.


----------



## FastEddieNYC

After some testing I'm quite happy with my SC Black card and kryographics block. So far my best is [email protected] and +625 (12264) mem. On air my max was +510 and [email protected] Pascal really likes to be kept cool.


I'm currently #11 on the leaderboard. Now I need to find the best stable OC for The Witcher 3 so I can finally play Blood and Wine at 4K.


----------



## Amuro

Since my monitor is only a 2560x1080 60 Hz ultrawide, and I only need a minimum of 60 fps on ultra settings with no scaling/DSR, my 1080 Ti Gaming X power limit is set to 60%, but OC'd +50/+500. It maxes out at a power draw of 140 watts with a Firestrike graphics score of 24500, roughly equivalent to an OC'd 1080. Very power efficient compared to the stock 100% PL, which draws 200+ watts just to maintain 60 fps at that resolution. I really need to upgrade my monitor, but I love my LG ultrawide IPS, plus it was cheap.


----------



## ReFFrs

Quote:


> Originally Posted by *bmgjet*
> 
> My card crashes at 62C but only in PUBG,
> Let me know if you find a fix for your problem, I had to go waterblock to fix mine. Now that it stays on 42-48C its fine.


But your card is an EVGA 1080 Ti SC, right? And his is a Zotac AMP Extreme. Different cards, yet they crash at almost the same temps. Very weird.


----------



## KedarWolf

Welp, I was reseating my CPU and bent a few pins on the CPU socket; now I'm only seeing 24 of 32 GB of RAM across two kits. That's the bad news.

The good news: my boss is lending me money for a motherboard, and it's an upgrade from what I had.

https://www.asus.com/ca-en/Motherboards/ROG-STRIX-X99-GAMING/

I'm getting a $500 CAD motherboard from newegg.ca open-box for just over $350, so I'm quite happy.


----------



## buellersdayoff

Quote:


> Originally Posted by *ReFFrs*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bmgjet*
> 
> My card crashes at 62C but only in PUBG,
> Let me know if you find a fix for your problem, I had to go waterblock to fix mine. Now that it stays on 42-48C its fine.
> 
> 
> 
> But your card is EVGA 1080ti SC, right? And his card is Zotac Amp Extreme. Different cards and still crash at almost the same temps. Very weird.

Seems like temp throttling drops the voltage enough to crash @ high frequency, so add voltage, or add cooling, water or faster fan speeds


----------



## TheBoom

Quote:


> Originally Posted by *sid4975*
> 
> alright so I ruled out the monitor by hooking my 2k monitor back up and running 3dmark 11, the card crashes at exactly 65 degrees everytime, I already called newegg which would've given me a replacement but the cards are all out of stock of course. so now I gotta send it back for store credit and wait till it comes bac in, meanwhile I already ebay'ed my 980ti lol so the desktop is gonna b a paperweight for a few weeks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> also I don't really mess with voltage and stuff usually but if I did want to try an increase to see if that changes anything how much should I push it up?
> 
> also do you guys think I should go the newegg way get the credit and wait for a new one or get a different card, or should I go through zotac's replacement process? I don't know whats smarter I'm afraid zotac will give me a refurb and I don't want a refurb for something I paid 800$ for last month


Quote:


> Originally Posted by *buellersdayoff*
> 
> Seems like temp throttling drops the voltage enough to crash @ high frequency, so add voltage, or add cooling, water or faster fan speeds


Actually, I'm 99% sure it's temps, although not the core temps themselves. Likely VRAM or VRMs, but possibly some other components.

I own an AMP Extreme and faced the same issue at higher voltages. This is of course with the XOC BIOS at much higher voltages: anything above 1143mv was an instant crash with temps at 79C.

What's worse, after I reapplied TIM and core temps went down, I lowered fan speeds by 10% on the curve and it got worse. Now anything over 1120mv crashes. So it's definitely heat related: the better TIM application is transferring more heat to the cooler, probably affecting heat absorption from the other components, and the reduced fan speed (even though the core temp is still lower) is dissipating less heat across the entire cooler.

I know it sounds a little complicated, but the gist of it is I would suggest the opposite: actually lower the voltage instead of increasing it, and test. Of course the card shouldn't be crashing even at 1093mv on the stock BIOS, so it's definitely a dud, but if you don't want to go through the process of returning it, that's the next best solution.


----------



## Benny89

Quote:


> Originally Posted by *buellersdayoff*
> 
> Seems like temp throttling drops the voltage enough to crash @ high frequency, so add voltage, or add cooling, water or faster fan speeds


I can confirm that. My Strix at 2063 MHz / 1.062v also crashes if temps reach 62-63C, so I am keeping it at 58-59C with 70% fan speed to prevent that. I have water parts coming next weekend, so I hope to finally get that 2100 on it.

I think this is just the chip/VRAM thermal limit at higher voltage bins combined with higher frequency. So far I've had five 1080 Tis and all behave the same. Water cooling is a must if you want to squeeze 100% out of your card.


----------



## seven7thirty30

I have about ninety 47-ohm resistors that are wasting away. Free to anyone that needs them. Shoot me a PM if you're interested. 4 per person limit.


----------



## ilikelobstastoo

Quote:


> Originally Posted by *nrpeyton*
> 
> Did you check it's making contact with the block?


No, I haven't removed the block again. I need to order some more 1mm pads, and then I will check.


----------



## cluster71

The common opinion here seems to be that memory is rarely the cause of instability. My experience with the Zotac Extreme, and some previous cards I've had, is that memory is the cause of instability in longer game sessions. I suppose it depends on the design. On the Zotac, memory cooling is coupled with the GPU cooling, so the memory temperature ends up almost the same as the GPU temperature. On the Strix, the memory is separate, with air blowing over an aluminium profile. Certainly a big difference with waterblocks too.


----------



## dunbheagan

Hey guys,

For your interest, Superposition 4K barely seems to be influenced by your CPU at all, even with a 1080 Ti maxed out.

Would anybody have expected that you can get over 10800 points in Supo 4K with a [email protected]?

I did some research, underclocking my 4790K; here are my results:

All benches: [email protected]/6202

[email protected]:

4600MHz - 10887 Points
2900MHz - 10869 Points
2300MHz - 10865 Points
1900MHz - 10837 Points

Even at 1900MHz I still get over 10800 points, although min fps slightly starts to drop below 2900MHz. But 2900MHz or 4600MHz: absolutely no difference.
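Quick sanity check on those numbers: across the whole tested clock range the spread is under half a percent.

```python
# Spread of the Superposition 4K scores posted above across CPU clocks.
scores = {4600: 10887, 2900: 10869, 2300: 10865, 1900: 10837}
best = max(scores.values())
for mhz in sorted(scores, reverse=True):
    delta = 100 * (best - scores[mhz]) / best
    print(f"{mhz} MHz: {scores[mhz]} points ({delta:.2f}% off the best)")
```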


----------



## feznz

Quote:


> Originally Posted by *dunbheagan*
> 
> Hey guys,
> 
> for your interest, Superposition 4K barely seems to be influenced by your CPU at all, even with a 1080Ti maxed out.
> 
> Or would have anybody expected that you can get over 10800 Points in Supo 4K with a [email protected]?
> 
> I did some research, underclocking my 4790K, here are my results:
> 
> All benches: [email protected]/6202
> 
> [email protected]:
> 
> 4600MHz - 10887 Points
> 2900MHz - 10869 Points
> 2300MHz - 10865 Points
> 1900MHz - 10837 Points
> 
> Even at 1900MHz i still get over 10800 Points, although min fps are slightly starting to go down below 2900MHz. But 2900MHz or 4600MHz, absolutly no difference.
> 
> 


Nice findings; my suspicions were correct.

I believe the only time CPU/RAM will influence the score is at 1080p low settings, where FPS is going to exceed 160+ and the RAM simply does not have enough bandwidth to deliver the information to the CPU.
Anyway, +1


----------



## The Storm

Quote:


> Originally Posted by *KedarWolf*
> 
> Welp, I was resetting my CPU, bent a few pins on CPU socket, only seeing 24 of 32GB of RAM on two kits. The bad news.
> 
> The good news. My boss is lending me money for a motherboard and it's an upgrade from what I had.
> 
> https://www.asus.com/ca-en/Motherboards/ROG-STRIX-X99-GAMING/
> 
> And I'm getting A $500 CAD motherboard from newegg.ca open box for just over $350 so I'm quite happy.


I've had bent pins before and have been able to carefully bend them back into position, are you not able to do so?


----------



## nrpeyton

Quote:


> Originally Posted by *The Storm*
> 
> I've had bent pins before and have been able to carefully bend them back into position, are you not able to do so?


I bent pins on my old 32nm AMD FX-8350. I was able to straighten them out using both a pair of reading glasses _*AND*_ a magnifying glass. lol. And it was still extremely difficult; I only _JUST_ managed it (barely).

However, I've seen how much smaller the pins are on my new Z270 14nm Intel-based motherboard. I doubt I could even see them well enough to manipulate them, even with a visual aid.

On a modern architecture that is more than 2x as dense, they are just far too small _*and*_ far too many.
You would need a microscope and something mechanical; the hand just isn't steady enough.


----------



## dansi

I changed the Nvidia CP image quality to "performance" and I finally got above 10k in Supo.

Kind of a cheat, but I guess most of you do so too?


----------



## KedarWolf

Quote:


> Originally Posted by *dansi*
> 
> I changed nvidia cp image quality ti performance and i got above 10k for supo finally.
> 
> Kinda cheat but I guess most of u do so too?


The Nvidia settings to use are in the OP.


----------



## KedarWolf

Quote:


> Originally Posted by *The Storm*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Welp, I was resetting my CPU, bent a few pins on CPU socket, only seeing 24 of 32GB of RAM on two kits. The bad news.
> 
> The good news. My boss is lending me money for a motherboard and it's an upgrade from what I had.
> 
> https://www.asus.com/ca-en/Motherboards/ROG-STRIX-X99-GAMING/
> 
> And I'm getting A $500 CAD motherboard from newegg.ca open box for just over $350 so I'm quite happy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've had bent pins before and have been able to carefully bend them back into position, are you not able to do so?

My hand/eye coordination is terrible; I couldn't even do the resistor mod right.









How exactly would I try to bend them back?


----------



## bmgjet

So I did a re-seat on my GPU block to do the shunt mod.
Well, it turned into a nightmare, since one of the screws into the EK block snapped its head off and I had to use pliers to undo the rest of it.




You can't really see it in the picture, but it looks like it snapped where there was an air pocket inside the screw.


----------



## kevindd992002

How do I test for CPU bottlenecking when using an i7-2600K with a 1080Ti at 1080p?


----------



## lilchronic

Quote:


> Originally Posted by *kevindd992002*
> 
> How do I test for CPU bottlenecking when using an i7-2600K with a 1080Ti at 1080p?


Monitor each individual CPU core and watch GPU usage.


----------



## kevindd992002

Quote:


> Originally Posted by *lilchronic*
> 
> monitor each individual cpu core and watch gpu usage.


And if I got at least one core that's pegged at 100% usage while GPU is below 100%, then that's CPU bottlenecking already?


----------



## BrainSplatter

Quote:


> Originally Posted by *kevindd992002*
> 
> And if I got at least one core that's pegged at 100% usage while GPU is below 100%, then that's CPU bottlenecking already?


In most games, yes, since only a few of them are able to take really good advantage of multiple cores. Also, on DX11 it might even be the GPU drivers / DirectX 11 itself limiting CPU multithreading performance, due to the 'draw call bottleneck' (just google it).

And with the GPU < 100%, there is a bottleneck somewhere. Sometimes it can be RAM bandwidth/latency, but usually it's the CPU.
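A toy version of that rule of thumb, for anyone scripting their logging. The 95% core / 90% GPU thresholds are arbitrary assumptions, and real tools would sample over time rather than look at one snapshot:

```python
# Toy heuristic: call it CPU-bound when at least one core is pegged while
# the GPU sits below full utilisation. Thresholds are arbitrary assumptions.
def looks_cpu_bound(per_core_usage, gpu_usage,
                    core_pegged=95.0, gpu_full=90.0):
    return max(per_core_usage) >= core_pegged and gpu_usage < gpu_full

print(looks_cpu_bound([100, 40, 35, 30], 70))  # True: one core pegged, GPU starved
print(looks_cpu_bound([60, 55, 50, 45], 99))   # False: the GPU is the limit
```

In practice you'd feed it per-core readings from something like HWiNFO or Afterburner's monitoring graphs.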


----------



## TK421

Does the 1080 Ti Strix XOC BIOS work with the MSI Gaming X card?


----------



## ffodi

Quote:


> Originally Posted by *TK421*
> 
> 1080TI STRIX XOC bios work with msi gamingX card?


Yep.


----------



## TK421

Quote:


> Originally Posted by *ffodi*
> 
> Yep.


can you test the vpll control and vmem control?


----------



## nrpeyton

*-*Core overclocked to 2100 -> 2113 using curve:
(1093: 2113
1081: 2100
1075: 2087 etc....)

*-*Shunt mod completed on 3 shunts. Card now also under water!

*-*Memory overclocked to +575 (also tried lowering to +550 which results in _slightly_ lower score.... any higher = errors in OCCT)

*-*CPU (i7 7700k) overclocked to 4900 Mhz _(stable)_

*-*RAM overclocked to 4000 Mhz @ 18, 19, 19, 39 _(passed memtest)_

*-*Entered the settings from OP into Nvidia control panel _(even tried different pre-rendered frames settings)_

*-*Tried running windows power management in "High Performance" mode.

*-* *Fairly* fresh install of Windows. No bull**** installed. Just the basic extras that come with my mobo, hwinfo64 & geforce experience which I always close. I also close all unwanted apps from bottom right of screen!

But I still can't get over 9800-9850 in Time Spy.

It's really beginning to get me down.









So frustrated. What am I doing wrong :-(


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> *-*Core overclocked to 2100 -> 2113 using curve:
> 
> 
> 
> 
> 
> (1093: 2113
> 1081: 2100
> 1075: 2087 etc....)
> 
> *-*Shunt mod completed on 3 shunts. Card now also under water!
> 
> *-*Memory overclocked to +575 (also tried lowering to +550 which results in _slightly_ lower score.... any higher = errors in OCCT)
> 
> *-*CPU (i7 7700k) overclocked to 4900 Mhz _(stable)_
> 
> *-*RAM overclocked to 4000 Mhz @ 18, 19, 19, 39 _(passed memtest)_
> 
> *-*Entered the settings from OP into Nvidia control panel _(even tried different pre-rendered frames settings)_
> 
> *-*Tried running windows power management in "High Performance" mode.
> 
> *-* *Fairly* fresh install of Windows. No bull**** installed. Just the basic extras that come with my mobo, hwinfo64 & geforce experience which I always close. I also close all unwanted apps from bottom right of screen!
> 
> 
> 
> But I still can't get over 9800 to 9850 on Timespy
> 
> It's really beginning to get me down
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So frustrated. What am I doing wrong :-(


What's your actual graphics score alone?
I am sure you already know this, but slightly unstable at higher clocks will score less than very stable at lower clocks.
Try 3200MHz with super-tight RAM timings like 15-15-15-38. I honestly can't remember what those B-dies can do, but I found tight timings will yield more, especially @ 1080p.


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> *-*Core overclocked to 2100 -> 2113 using curve:
> (1093: 2113
> 1081: 2100
> 1075: 2087 etc....)
> 
> *-*Shunt mod completed on 3 shunts. Card now also under water!
> 
> *-*Memory overclocked to +575 (also tried lowering to +550 which results in _slightly_ lower score.... any higher = errors in OCCT)
> 
> *-*CPU (i7 7700k) overclocked to 4900 Mhz _(stable)_
> 
> *-*RAM overclocked to 4000 Mhz @ 18, 19, 19, 39 _(passed memtest)_
> 
> *-*Entered the settings from OP into Nvidia control panel _(even tried different pre-rendered frames settings)_
> 
> *-*Tried running windows power management in "High Performance" mode.
> 
> *-* *Fairly* fresh install of Windows. No bull**** installed. Just the basic extras that come with my mobo, hwinfo64 & geforce experience which I always close. I also close all unwanted apps from bottom right of screen!
> 
> But I still can't get over 9800 to 9850 on Timespy
> 
> It's really beginning to get me down
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So frustrated. What am I doing wrong :-(


Yes, you should compare graphics scores between other's benchmarks. +1 to the advice.


----------



## smithsrt8

Quote:


> Originally Posted by *nrpeyton*
> 
> *-*Core overclocked to 2100 -> 2113 using curve:
> (1093: 2113
> 1081: 2100
> 1075: 2087 etc....)
> 
> *-*Shunt mod completed on 3 shunts. Card now also under water!
> 
> *-*Memory overclocked to +575 (also tried lowering to +550 which results in _slightly_ lower score.... any higher = errors in OCCT)
> 
> *-*CPU (i7 7700k) overclocked to 4900 Mhz _(stable)_
> 
> *-*RAM overclocked to 4000 Mhz @ 18, 19, 19, 39 _(passed memtest)_
> 
> *-*Entered the settings from OP into Nvidia control panel _(even tried different pre-rendered frames settings)_
> 
> *-*Tried running windows power management in "High Performance" mode.
> 
> *-* *Fairly* fresh install of Windows. No bull**** installed. Just the basic extras that come with my mobo, hwinfo64 & geforce experience which I always close. I also close all unwanted apps from bottom right of screen!
> 
> But I still can't get over 9800 to 9850 on Timespy
> 
> It's really beginning to get me down
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So frustrated. What am I doing wrong :-(


See how your graphics score compares to mine... keep in mind my card only works in x4 (not x16), I am overclocked to 2063 on my 1080 Ti (Gigabyte Aorus non-extreme), and I am @ a stable 5GHz on a 7700k.


At least this might help you see if it is the CPU or the GPU.


----------



## c0nsistent

So how many of you are undervolting? So far I'm at 0.975v @ 2000MHz, and I'm really starting to consider leaving my card at that for good, as the difference is literally 5% even if I make it to 2100MHz, and that's only with perfect scaling.
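Rough CMOS scaling (P ∝ V²·f) backs that up. Treat this as an upper bound on the savings, since real GPU power also has static (leakage) components, and the 1.075v @ 2100 reference point is just an assumed comparison:

```python
# Rough dynamic-power scaling (P ~ V^2 * f) comparing 0.975 V @ 2000 MHz
# against a hypothetical 1.075 V @ 2100 MHz operating point.
def rel_power(volts, mhz, ref_volts=1.075, ref_mhz=2100):
    return (volts / ref_volts) ** 2 * (mhz / ref_mhz)

perf_loss = 1 - 2000 / 2100
power_saved = 1 - rel_power(0.975, 2000)
print(f"~{perf_loss * 100:.1f}% less clock for ~{power_saved * 100:.0f}% less dynamic power")
```

Giving up ~5% clock for on the order of 20% less heat is a pretty good trade for a 24/7 profile.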


----------



## roberta507

Finally have a card that works correctly.
RMA'ed a Zotac 1080 Ti AMP Extreme for an EVGA 1080 Ti FTW.
Better headroom and fan control; 2050 solid.


----------



## blurp

Quote:


> Originally Posted by *c0nsistent*
> 
> So how many of you are undervolting? So far I'm at .975v @ 2000mhz and I'm really starting to consider leaving my card at that for good, as the difference is literally 5% even if I make it to 2100mhz, and that's only with perfect scaling.


I'm on a curve with 2000 at 1.0v. Did not try lower voltage. My card is 2100 stable @ 1.075 if I remember. I agree with you that the 5% isn't making a big difference for 24/7 use.


----------



## lilchronic

Quote:


> Originally Posted by *smithsrt8*
> 
> See how your graphics score compares to mine...keep in mind my card only works in x4 (not x16) and I am overclocked to 2063 on my 1080 ti (GIgabyte Aorus non-extreme) and I am @ a stable 5ghz on a 7700k
> 
> 
> At least this might help you see if it is the CPU or the GPU


What? Why are you running PCIe x4?
You should get GPU-Z and run the render test to see what it says next to the bus interface.


----------



## smithsrt8

Quote:


> Originally Posted by *lilchronic*
> 
> What? why are you running a pci-e x4?
> You should get GPU-Z and run the render test to see what it say's next to the bus interface?


The card is broken (something wrong internally or broken caps). I just got my new card today from Gigabyte (advance replacement RMA). I already tried the GPU-Z render test... it goes from x4 1.1 to x4 3.0... and I tried 3 motherboards...


Now just counting down the time until I can go home, install the new card, and finally enjoy x16 greatness... although basketball is on tonight... decisions, decisions!


----------



## KraxKill

5775C - 4.3Ghz - 2133CL9 - shunted 1080Ti FE @ 2100mhz GDDR5x @ 6126mhz


----------



## nrpeyton

Graphics score is usually just breaking 11,000


----------



## TK421

Quote:


> Originally Posted by *TK421*
> 
> 1080TI STRIX XOC bios work with msi gamingX card?


Quote:


> Originally Posted by *ffodi*
> 
> Yep.


Quote:


> Originally Posted by *TK421*
> 
> can you test the vpll control and vmem control?


Any thoughts on the VPLL and vmem control?


----------



## nrpeyton

http://www.3dmark.com/3dm/20444441

*Overall:* 9804
*Graphics:* 11030
*CPU:* 6017

Just revisited all the settings from the OP. This time I applied them BOTH for the Time Spy workload .exe _*and*_ globally.

It netted me no extra points _(not surprising)_, but it does validate that I had them entered correctly before.

The difference between:

-default with "prefers maximum performance"
and
- *ALL* settings from the OP

= 100 points (_overall_)

Is anyone getting more than 100 points with the OP settings?

There must be something I'm missing....


----------



## smithsrt8

Quote:


> Originally Posted by *nrpeyton*
> 
> http://www.3dmark.com/3dm/20444441
> 
> *Overall:* 9804
> *Graphics:* 11030
> *CPU:* 6017
> 
> Just revisted all the settings from the OP. This time I applied them BOTH for the timesply workload .exe _*and*_ globally.
> 
> It netted me no extra points. _(not surprising)._ But it does validate that I had them entered correctly before.
> 
> The difference between:
> 
> default with "prefers maximum performance"
> and
> *ALL* settings from the OP
> 
> = 100 points (_overall_)
> 
> Is anyone getting more than 100 points with the OP settings?
> 
> There must be something I'm missing....


Strange... my CPU score is over 300 more than yours for only 100 more MHz... you may not have a stable OC... I think your card is right on, though...


----------



## nrpeyton

Quote:


> Originally Posted by *smithsrt8*
> 
> Strange...my CPU score is over 300 more than you for only 100 more mhz...you may not have a stable OC...I think your card is right on though...


Okay I'll switch to 5Ghz and report back.

Back in 5


----------



## nrpeyton

Right....

I'm back. (after raising my CPU to 5.0 Ghz from 4.9 Ghz)

It has now given me my new best score (of all time):

http://www.3dmark.com/3dm/20444709

*Overall:* 9880 _(up 76 points)_
*Graphics:* 11025 _(no change)_
*CPU:* 6220 _(up 203 points)_

I had to run at 1.5v for 5 Ghz so the CPU thermal throttled, but only a *little*.

But I'm still falling short compared to what the rest of you are all scoring. 10k ++









P.S. this is an *emergency only o/c* for me, due to the high voltage. I'd never run at this all the time.. it's stable enough.. but I reserve it for when extreme measures are necessary.


----------



## smithsrt8

Quote:


> Originally Posted by *nrpeyton*
> 
> Right....
> 
> I'm back. (after raising my CPU to 5.0 Ghz from 4.9 Ghz)
> 
> It has now given me my new best score (of all time):
> 
> http://www.3dmark.com/3dm/20444709
> 
> *Overall:* 9880 _(up 76 points)_
> *Graphics:* 11025 _(no change)_
> *CPU:* 6220 _(up 203 points)_
> 
> But I'm still falling short compared to what the rest of you are all scoring. 10k ++


Strange... I am still 100 points more on CPU. Tonight (after the game) I will rerun it with my new card in and report back...

I will probably then tear everything down again to install all of my new cooling stuff (EK pump/blocks etc.).


----------



## nrpeyton

Quote:


> Originally Posted by *smithsrt8*
> 
> Strange...I am still 100 points more on CPU...tonight (after the game) I will rerun it tonight after I have my new card in and report back...


I think the 100 CPU points are maybe explained by a tiny bit of thermal throttling, due to having to run at 1.5v.

Sorry for the ninja edits above.. I've edited my last post.

*****This is an *emergency only o/c* for me, due to the high voltage. I'd never run at this all the time.. it's stable enough.. but I reserve it for when extreme measures are necessary.









Still don't see why I'm not breaking 10k, despite seeing some of you hitting nearly 11k on the same hardware... with the same settings?


----------



## smithsrt8

Quote:


> Originally Posted by *nrpeyton*
> 
> I think the 100 CPU points are maybe explained by a tiny little thermal throttling due to having to run at 1.5v
> 
> Sorry for the ninja edits above.. I've edited my last post
> 
> *****this is an *emergency only o/c* for me.. due to the high voltage. I'd never run at this all the time.. it's stable enough.. but I reserve it only for when extreme measures are necessary.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Still don't see why I'm not breaking 10k, despite seeing some of you hitting nearly 11k on the same hardware... with the same settings?


You have to run 1.5v for 5GHz? Good lord...I would think you could tweak it a bit better with the Apex...I know it varies with motherboards (for 5GHz I have used 3 different mobos with the same exact 7700K: the Gigabyte Gaming SOC was 1.385, the Asus Code IX was 1.39, and my EVGA is 1.36)...


----------



## nrpeyton

Quote:


> Originally Posted by *smithsrt8*
> 
> You have to run 1.5v for 5GHz? Good lord...I would think you could tweak it a bit better with the Apex...I know it varies with motherboards (for 5GHz I have used 3 different mobos with the same exact 7700K: the Gigabyte Gaming SOC was 1.385, the Asus Code IX was 1.39, and my EVGA is 1.36)...


That's my thoughts exactly... I thought that because I spent nearly £300 on a top-of-the-line ASUS board that I'd be laughing.. but unfortunately that doesn't seem to be the case....

For 4.9 Ghz I need 1.35-1.392v

For 5Ghz I need 1.455v (which shows in windows as 1.5v at *near idle*). At heavy load I have a healthy v.droop to around 1.45v +.

I run at 4.9 Ghz daily.

The chip isn't delidded. But I have an EK Supremacy EVO on it. Temps are right for my setup.

I've tried all different combinations of LLC. I've even tried increasing the VRM switching frequency to maximum. But it netted me no extra gains.

Anyway I just ran a Superposition: _(all the same settings I used in timespy)_



How does that compare to what you guys are getting?

Edit: I'm going to try updating to latest driver.

Edit 2: Latest driver seems to have lost me about 150 points in timespy and about 100 points in Superposition









_I do have a water chiller which I've not connected up yet.. but the chiller is only for eking out the last few hundred points using a higher CPU & GPU overclock. But, after I have everything else right!



Before I connect it up (and it's a BEAST of a machine) lol... I need to get *everything else right first*. Or it kind of defeats the purpose._

This is the first time -- in my life... I have *ever* owned a truly competitive machine. (The latest i7 CPU and the latest Nvidia GPU, 4000MHz DDR4 & a top ASUS mobo). This won't last long. And tick tock.. the clock is ticking... before long I'll have lost this once-in-a-lifetime investment forever. And I can't see myself upgrading my CPU for another 5 years.


----------



## Dasboogieman

Quote:


> Originally Posted by *nrpeyton*
> 
> That's my thoughts exactly... I thought that because I spent nearly £300 on a top-of-the-line ASUS board that I'd be laughing.. but unfortunately that doesn't seem to be the case....
> 
> For 4.9 Ghz I need 1.35-1.392v
> 
> For 5Ghz I need 1.455v (which shows in windows as 1.5v at *near idle*). At heavy load I have a healthy v.droop to around 1.45v +.
> 
> I run at 4.9 Ghz daily.
> 
> The chip isn't delidded. But I have an EK Supremacy EVO on it. Temps are right for my setup.
> 
> I've tried all different combinations of LLC. I've even tried increasing the VRM switching frequency to maximum. But it netted me no extra gains.
> 
> Anyway I just ran a Superposition: _(all the same settings I used in timespy)_
> 
> 
> 
> How does that compare to what you guys are getting?
> 
> Edit: I'm going to try updating to latest driver.
> 
> Edit 2: Latest driver seems to have lost me about 150 points in timespy and about 100 points in Superposition
> 
> 
> 
> 
> 
> 
> 
> 
> 
> _I do have a water chiller which I've not connected up yet.. but the chiller is only for eking out the last few hundred points using a higher CPU & GPU overclock. But, after I have everything else right!
> 
> Before I connect it up (and it's a BEAST of a machine) lol... I need to get *everything else right first*. Or it kind of defeats the purpose._
> 
> This is the first time -- in my life... I have ever owned a truly competitive machine. (The latest i7 CPU and the latest Nvidia GPU). This won't last long. And tick tock.. the clock is ticking... before long I'll have lost this once-in-a-lifetime investment forever. And I can't see myself upgrading my CPU for another 5 years.
> 
> I'm desperate to reach the same scores as the top guys here-- before the next wave of hardware is released.


The biggest bang for buck is getting your CPU delidded. No other upgrade or money you can throw at the system will net you the same benefit-vs-dollar gains (or even gains, period). The temperature drop is 15 degrees _minimum_. This is such a big difference that, with powerful cooling that can pull the chip below 65 degrees, your 5GHz is likely to become possible. My chip could only do 4.8GHz @ 1.35V before delidding, and this was with watercooling on a 360mm rad. Now I can do 5GHz @ 1.325V completely stable, just from the temperature drop alone.

If you are squeamish about it, you can get Silicon Lottery to do it for you. They do a really professional + warrantied service for about $50 IIRC.

If you're chasing benchmarks, all of this has to be done. If you want to try something on the software side, try a fresh install of Windows on an image on a separate partition from your usual 24/7 image. I hear the Supo benchers are getting a few hundred extra points just from using a lean, fresh Windows install.


----------



## nrpeyton

Quote:


> Originally Posted by *Dasboogieman*
> 
> The biggest bang for buck is getting your CPU delidded. No other upgrade or money you can throw at the system will net you the same benefit-vs-dollar gains (or even gains, period). The temperature drop is 15 degrees _minimum_. This is such a big difference that, with powerful cooling that can pull the chip below 65 degrees, your 5GHz is likely to become possible. My chip could only do 4.8GHz @ 1.35V before delidding, and this was with watercooling on a 360mm rad. Now I can do 5GHz @ 1.325V completely stable, just from the temperature drop alone.
> 
> If you are squeamish about it, you can get Silicon Lottery to do it for you. They do a really professional + warrantied service for about $50 IIRC.


Wow, wait a minute.

You gained 200 Mhz at the same voltage with your de-lid?

Sir, you may just have sold me.


----------



## Dasboogieman

Quote:


> Originally Posted by *nrpeyton*
> 
> Wow, wait a minute.
> 
> You gained 200 Mhz at the same voltage with your de-lid?
> 
> Sir, you may just have sold me.


It makes sense; it's even on the ASUS ROG website (Raja's Kaby Lake OC guide) and it's in line with my observations with FinFET (the response to cooling applies equally to GPUs and CPUs). I mean, if you can get it stable with 1.45V then I have a gut feeling 1.4V or maybe 1.35V at 5GHz may be possible for you, even less if you get that chiller, provided the chip gets below that magic 65 degree threshold. My 3770K was the same, could only do 4.5GHz at 1.35V on an NH-D14 cooler; I delidded it and now it does 4.7GHz at the same voltage with super low operating temps.

We enthusiasts have always known more cooling improves OC headroom and stability. However, for FinFET chips this is even more apparent (IIRC they started noticing it back in Ivy Bridge) because the chips get leakier much faster with voltage compared to the old planar Sandy Bridge era chips. This means the chips heat up quadratically, much faster than their 32nm cousins at higher voltages; conversely, cooling them vastly reduces their leakage (much faster and more potently than planar), netting better stability and greater longevity.

The other characteristic of FinFET is that it's also more resistant to degradation than planar, so it can typically sustain higher voltages provided operating temps are low. I knew of many 32nm Sandy chips that degraded over a couple of years at 1.4V+ whereas Ivy didn't tend to degrade as fast/badly, although they did tend to die rather abruptly if something internal was pushed too far.

Extrapolating to GP102, I have noticed my GTX 1080ti is now completely stable with zero power throttling at 2100mhz at <40C whereas before at 60C I could only do 2037mhz with Supo/FSU throttling. Temperature matters with FinFet.
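The leakage behaviour described above is often modelled as roughly exponential in junction temperature. A minimal sketch of that idea, assuming a made-up coefficient `k` purely for illustration (not a measured GP102 or Kaby Lake value):

```python
import math

def leakage_scale(temp_c, ref_temp_c=25.0, k=0.035):
    """Relative leakage power vs. a reference temperature.

    Leakage current grows roughly exponentially with junction
    temperature; k (per degree C) is an illustrative coefficient,
    not a value measured on any real chip.
    """
    return math.exp(k * (temp_c - ref_temp_c))

# Under this toy model, cooling a chip from 60C to 40C cuts
# leakage power roughly in half:
hot, cool = leakage_scale(60), leakage_scale(40)
print(f"60C vs 40C leakage ratio: {hot / cool:.2f}x")
```

This is only a sketch of why "cooler = less power = less throttling" compounds: lower leakage means less heat, which means less leakage again.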


----------



## cluster71

I tried to get a good result on my 7700k
http://www.3dmark.com/spy/1904318


----------



## cluster71

I had to increase the voltage on everything more than usual to pass


----------



## cluster71

I have 5200 on the core and 4700 for AVX/cache; it normally runs 1.392v but I had to increase to 1.440v now to be completely stable. I have two 7700Ks; both can do 5300 without problems, but I delid, and max temp is 60 degrees Celsius when the PC is in the basement at 18 C


----------



## cluster71

I have the der8auer one; I had to wait 2 months
on delivery. It is absolutely perfect. Worth the money


----------



## nrpeyton

Quote:


> Originally Posted by *Dasboogieman*
> 
> It makes sense; it's even on the ASUS ROG website (Raja's Kaby Lake OC guide) and it's in line with my observations with FinFET (the response to cooling applies equally to GPUs and CPUs). I mean, if you can get it stable with 1.45V then I have a gut feeling 1.4V or maybe 1.35V at 5GHz may be possible for you, even less if you get that chiller, provided the chip gets below that magic 65 degree threshold. My 3770K was the same, could only do 4.5GHz at 1.35V on an NH-D14 cooler; I delidded it and now it does 4.7GHz at the same voltage with super low operating temps.
> 
> We enthusiasts have always known more cooling improves OC headroom and stability. However, for FinFET chips this is even more apparent (IIRC they started noticing it back in Ivy Bridge) because the chips get leakier much faster with voltage compared to the old planar Sandy Bridge era chips. This means the chips heat up quadratically, much faster than their 32nm cousins at higher voltages; conversely, cooling them vastly reduces their leakage (much faster and more potently than planar), netting better stability and greater longevity.
> 
> The other characteristic of FinFET is that it's also more resistant to degradation than planar, so it can typically sustain higher voltages provided operating temps are low. I knew of many 32nm Sandy chips that degraded over a couple of years at 1.4V+ whereas Ivy didn't tend to degrade as fast/badly, although they did tend to die rather abruptly if something internal was pushed too far.
> 
> Extrapolating to GP102, I have noticed my GTX 1080ti is now completely stable with zero power throttling at 2100mhz at <40C whereas before at 60C I could only do 2037mhz with Supo/FSU throttling. Temperature matters with FinFet.


Rep +

That was very informative.

So what you're basically also saying (as this part interested me greatly) is that at higher temps they leak more energy. Meaning they use more power.

So in other words, getting it cooler not only allows more frequency for lesser voltage-- but it actually consumes less power, too? Reducing throttling?

Excellent post 

Pity I couldn't de-lid my Ti lol.







_(only joking of course, I know it's already direct DIE)._

P.S.
I've got a new humidity & dew point meter in the post.
As soon as it arrives I'll be able to safely fire up the chiller.


----------



## cluster71

I have mostly overclocked X79 before, a 3970X. I read this guide when I switched to the 7700K: http://edgeup.asus.com/2017/01/31/kaby-lake-overclocking-guide/ I know we're deviating from the subject now, the GTX 1080 Ti


----------



## nrpeyton

Quote:


> Originally Posted by *cluster71*
> 
> I have mostly overclocked X79 before, a 3970X. I read this guide when I switched to the 7700K: http://edgeup.asus.com/2017/01/31/kaby-lake-overclocking-guide/ I know we're deviating from the subject now, the GTX 1080 Ti


I hear ya...

To steer the conversation back in the right direction then..

I can't wait to see how much further the Chiller will let me push the 1080 Ti ;-)

Hopefully if I can get a low humidity night. About 5c would be perfect.... that will allow me to run the 1080Ti at about 15c - 17c (reported in afterburner/hwinfo64 at load).









I've not had a chance to experiment with it since I traded my Classy 1080.

Lowest I ever got the Classy 1080 to, was -14c (using chiller -- but the time I spent insulating it was a nightmare lol). Obviously at those temps the dew point meter is completely irrelevant. It all comes down to insulation ;-)
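For anyone else running a chiller near ambient: the number a dew point meter reports can be approximated from temperature and relative humidity with the standard Magnus formula, which is handy for knowing how low the loop can safely go before condensation. A minimal sketch (the `a`/`b` values are the commonly published Magnus coefficients for roughly -45C to 60C; the helper name is my own):

```python
import math

def dew_point_c(temp_c, rel_humidity_pct, a=17.62, b=243.12):
    """Approximate dew point (C) via the Magnus formula.

    Accurate to a fraction of a degree over normal room
    conditions; condensation risk starts once a cooled surface
    drops below this temperature.
    """
    gamma = math.log(rel_humidity_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# A 20C room at 50% RH: keep the chilled coolant above ~9C
# to stay condensation-free.
print(f"dew point: {dew_point_c(20, 50):.1f} C")
```

Below the dew point it really is all insulation, as said above; the formula only tells you where that regime starts.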

Insulating my Classy 1080 a few months ago:


----------



## cluster71




----------



## jase78

On average, how many points would you guys say you gained in Supo 4K from the shunt mod? And what was your max before and after?


----------



## cluster71

Sorry, I do not know. I'm not much for benchmarks, more for "game stable": I play and increase performance gradually until it's good. But a benchmark is a great way to see if you're on track right away


----------



## cluster71

My Zotac is completely stable at 2100 at 1.093v
2126 at 1.100-1.125v
2139 at 1.162v hehe

Definitely a turning point at 2126/12000


----------



## Dasboogieman

Quote:


> Originally Posted by *nrpeyton*
> 
> Rep +
> 
> That was very informative.
> 
> So what you're basically also saying (as this part interested me greatly) is that at higher temps they leak more energy. Meaning they use more power.
> 
> So in other words, getting it cooler not only allows more frequency for lesser voltage-- but it actually consumes less power, too? Reducing throttling?
> 
> Excellent post
> 
> Pity I couldn't de-lid my Ti lol.
> 
> 
> 
> 
> 
> 
> 
> _(only joking of course, I know it's already direct DIE)._
> 
> P.S.
> I've got a new humidity & dew point meter in the post.
> As soon as it arrives I'll be able to safely fire up the chiller.


Yup, the old triple value from extra cooling, power, stability and longevity.
My memory is hazy, but most of the FinFET research was done back in the Ivy era (we had graphs for everything: Temp vs Clocks, Power vs Clocks, Voltage vs Temp, Voltage vs Clocks, Voltage vs Power), and I think there were a few very well known temperature stability breakpoints for 22nm FinFET (for the life of me I cannot find the Temp vs Clockspeed graph any more):
80C, 65C, 40C, 25C, 10C and -10C IIRC. These breakpoints seem to map reasonably well to the GTX 1080 Ti +/-5C as well, so you can expect higher clocks, though you might need the XOC BIOS to utilize the 25C-and-below points.


----------



## JunkaDK

Hello all,

Just joined the 1080 Ti club yesterday, when I got my EVGA GTX 1080 Ti FTW3.

I put it in my living room PC to serve as a VR and gaming platform.

Yesterday I was testing how well it would overclock, and it did not go as planned!

I am only able to add +25MHz to the core. Above that it crashes. Is it just a bad silicon lottery draw?

I'm testing using FS and SP

My PC setup is this

Case: Fractal Design Node 304
2 x 92mm Noctua intake fans
Noctua CPU cooler with 2 x 140mm fans
Mobo: Biostar X370GTN mini-ITX board for Ryzen
CPU: Ryzen 5 1600 @ 3.9GHz
RAM: 16GB Corsair LPX 3200MHz (running 2933MHz)
GPU: EVGA GTX 1080 Ti FTW3
SSD: Samsung 960 Evo 500GB
PSU: EVGA Supernova 750 G3 (running two separate cables from the PSU to the GPU, no daisy-chain)

When I look at the power % it fluctuates a lot.. why is that? I thought it would be at max all the time?

I also tried adjusting the voltage slider, but without any effect. I'm using EVGA Precision X to tweak.. should I be using Afterburner?

Also, can anyone give me any other pointers? I haven't really tested mem OC yet..

Picture here: 

(this picture was taken while running the Heaven benchmark)

Any other info you might need, let me know :)

Any help is much appreciated!


----------



## DStealth

Quote:


> Originally Posted by *cluster71*
> 
> My Zotac is completely stable at 2100 at 1.093v
> 2126 at 1.100-1.125v
> 2139 at 1.162v hehe
> Definitely a turning point at 2126/12000


And seeing the performance from your benchmark, it's on par with 2050MHz cards (or even lower) despite such a high clock... try lowering it, and try your original BIOS rather than the XOC one








Here's how these GPU's are behaving ... boring video but informative...


----------



## bmgjet

Have had the water block on for a few days now and had time to test temp vs clocks, since it has a lot better temp control than on air.
I'll use PUBG as my test game since it crashes the easiest, and it was crashing even at stock boost clocks when it hit 62c.

Here are my findings for the point of crash. Voltage doesn't seem to matter and only makes it run hotter. The only clock that needed a voltage bump over 1.061 (which is the stock boost voltage on my card) was 2099MHz, which needed 1.075 or it was an instant crash. Beyond that I couldn't test any further, since with any more voltage I couldn't keep it under 40c.
Temp C Clock MHZ
62 1987
58 2025
52 2037
50 2050
48 2061
44 2073
40 2099

The most surprising thing I found out of this is that if I keep the core temp under 40C I can run 2050MHz on 1.025v, but as soon as it hits 42c it will crash.
Which has got me wondering if the temp-to-clock relationship changes with voltage.


----------



## Dasboogieman

Quote:


> Originally Posted by *bmgjet*
> 
> Have had the water block on for a few days now and had time to test temp vs clocks, since it has a lot better temp control than on air.
> I'll use PUBG as my test game since it crashes the easiest, and it was crashing even at stock boost clocks when it hit 62c.
> 
> Here are my findings for the point of crash. Voltage doesn't seem to matter and only makes it run hotter. The only clock that needed a voltage bump over 1.061 (which is the stock boost voltage on my card) was 2099MHz, which needed 1.075 or it was an instant crash. Beyond that I couldn't test any further, since with any more voltage I couldn't keep it under 40c.
> Temp C Clock MHZ
> 62 1987
> 58 2025
> 52 2037
> 50 2050
> 48 2061
> 44 2073
> 40 2099
> 
> The most surprising thing I found out of this is that if I keep the core temp under 40C I can run 2050MHz on 1.025v, but as soon as it hits 42c it will crash.
> Which has got me wondering if the temp-to-clock relationship changes with voltage.


This is fascinating and valuable data. I observed massive stability increases too from 40C downwards.
I don't fully know if it's the cold alone, the internal power circuitry giving more voltage in response to cold, or both, that is responsible for the stability.


----------



## bmgjet

Over the last 2 hours I've been trying out what I suspected: that different voltages have different temp-to-MHz points.

So far I've been able to try 1.000v and 1.025v through those temp ranges.
I plan to do a full spreadsheet in my spare time. But here is what I have so far.

Code:


Temp C    Clock MHZ

1.061v
62           1987
58           2025
52           2037
50           2050
48           2061
44           2073
40           2073

1.025v
62           1987
58           1987
52           1999
50           1999
48           2012
44           2025
40           2050
38           2050
36           2050

1.000v           
58           1939
52           1963
50           1987
48           1999
44           2012
40           2025
38           2037
36           2050
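A conservative way to read the tables above: for the current voltage, take the highest clock measured stable at an equal-or-hotter temperature than you're seeing now. A sketch of that lookup using bmgjet's posted data (the `pick_clock` helper is my own, not anything from the driver's boost table):

```python
# bmgjet's crash-point data from the tables above, as
# {voltage: [(temp_c, max_stable_mhz), ...]}, hottest entry first.
STABLE = {
    1.061: [(62, 1987), (58, 2025), (52, 2037), (50, 2050),
            (48, 2061), (44, 2073), (40, 2073)],
    1.025: [(62, 1987), (58, 1987), (52, 1999), (50, 1999),
            (48, 2012), (44, 2025), (40, 2050), (38, 2050), (36, 2050)],
    1.000: [(58, 1939), (52, 1963), (50, 1987), (48, 1999),
            (44, 2012), (40, 2025), (38, 2037), (36, 2050)],
}

def pick_clock(voltage, temp_c):
    """Highest clock measured stable at a temperature >= temp_c,
    or None if temp_c is hotter than anything measured."""
    best = None
    for measured_temp, mhz in STABLE[voltage]:
        if measured_temp >= temp_c:
            # Entries are hottest-first, so this ends on the
            # coolest measurement still at-or-above temp_c.
            best = mhz
    return best

# At 1.025v and 41C, the safe bet is the 44C figure (2025MHz),
# not the 2050MHz that was only measured at 40C and below:
print(pick_clock(1.025, 41))
```

Stepping to the hotter neighbour rather than interpolating matches the observation that the card crashes almost immediately once it warms past a measured point.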


----------



## cluster71

DStealth: Hmm, I know how the clock/temperature compensation in GPU Boost works, but if it's like he thinks, and some functions (shaders) don't kick in, then the card doesn't pull that high a load and can run a high core clock at low voltage without crashing. You're just fooling the benchmark program; the image quality will be lower. I'm not interested in a high score if it doesn't measure correctly. I have to verify it during gaming. Image quality is most important, otherwise it's uninteresting. I use the graphics card for games, not to compete for high scores.


----------



## nrpeyton

Quote:


> Originally Posted by *cluster71*
> 
> I tried to get a good result on my 7700k
> http://www.3dmark.com/spy/1904318


Looks like our graphics scores are pretty much exactly the same. Yours is a *little* higher. But your core clock was 1 bin higher too. So makes sense.

Have you used OCCT to find your maximum memory overclock? I noticed you're at +500.

You grabbed an extra 200 points overall (up compared to mine) on your CPU--which was overclocked to 5.2 Ghz. (compared to my maximum of 5.0 Ghz) So again.. makes sense....

Still don't see how some people here are beating both of us though with scores at nearly 11,000 on the same hardware. With either the same--or even *slightly* lower core clocks.

Quote:


> Originally Posted by *bmgjet*
> 
> Over the last 2 hours I've been trying out what I suspected: that different voltages have different temp-to-MHz points.
> 
> So far I've been able to try 1.000v and 1.025v through those temp ranges.
> I plan to do a full spreadsheet in my spare time. But here is what I have so far.
> 
> Code:
> 
> Temp C    Clock MHZ
> 
> 1.061v
> 62           1987
> 58           2025
> 52           2037
> 50           2050
> 48           2061
> 44           2073
> 40           2073
> 
> 1.025v
> 62           1987
> 58           1987
> 52           1999
> 50           1999
> 48           2012
> 44           2025
> 40           2050
> 38           2050
> 36           2050
> 
> 1.000v
> 58           1939
> 52           1963
> 50           1987
> 48           1999
> 44           2012
> 40           2025
> 38           2037
> 36           2050


That looks like a pattern forming, showing lower temps enabling faster clock speeds*?*

Quote:


> Originally Posted by *DStealth*
> 
> And seeing the performance from your benchmark, it's on par with 2050MHz cards (or even lower) despite such a high clock... try lowering it, and try your original BIOS rather than the XOC one
> 
> 
> 
> 
> 
> 
> 
> 
> Here's how these GPU's are behaving ... boring video but informative...
> 
> 
> 
> 
> 
> 
> 
> 
> *
> 
> 
> 
> *


In the beginning of that *video*... at around *1:55* he mentions how:

"the capacitor mods on this card are very helpful--they do make it possible to run the memory at over +700".

Anyone know what he means by that? is he talking 'voltage' or 'power'?

Quote:


> Originally Posted by *jase78*
> 
> On average, how many points would you guys say you gained in Supo 4K from the shunt mod? And what was your max before and after?


I really wish I'd taken a baseline first.

Kicking myself relentlessly for this mistake now-- since I did my shunt.

What I *can* say, however-- if this is any use at all-- is that my shunt gave me (on average, and this IS an estimation only) about 25% extra power.

I found a screenshot I took before the shunt mod which showed my real-time power draw at 234 watts at 0.981v at 1999MHz, at 53 FPS.

In a similar scene (both in The Witcher 3), where I was also doing 53 FPS at 0.981v and 1999MHz, with the shunt mod I *report* a draw of about 180 watts.

My new resistors arrived today for Coop's resistor mod. However, I've still got a few other things to order too (like new pads). Until then -- I've added a 1.25x multiplier in hwinfo64, so I can roughly monitor my new correct power draw. Where I'd normally be drawing 300 watts.. I'm now drawing about 375.

Anyway, hoping someone else can answer your original question better than I can - as I'm very interested in the answer too.... if no one comes up with an answer for this... I may do a bit of research myself and see what I can find. But it won't be for a few weeks yet.

So yeah.. really kicking myself for not taking the time to take proper baselines.

Nick
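For what it's worth, a 1.25x multiplier falls straight out of the resistor maths for a piggyback shunt mod: a resistor soldered in parallel with the stock current-sense shunt lowers the effective resistance, so the controller (and hwinfo64) under-reads by exactly that ratio. A sketch with purely illustrative resistor values, not any specific card's:

```python
def shunt_correction(r_stock_mohm, r_piggyback_mohm):
    """Correction factor for a piggyback shunt mod.

    The parallel combination r_eff replaces the stock shunt, so
    the sensed voltage (and thus reported power) is scaled by
    r_eff / r_stock; multiply readings by the inverse to recover
    the real draw.
    """
    r_eff = (r_stock_mohm * r_piggyback_mohm) / (r_stock_mohm + r_piggyback_mohm)
    return r_stock_mohm / r_eff

# A piggyback resistor 4x the stock shunt's value gives a 1.25x
# multiplier (illustrative 5 mOhm stock / 20 mOhm piggyback):
factor = shunt_correction(5.0, 20.0)
print(f"correction: {factor:.2f}x -> 300 W reported = {300 * factor:.0f} W actual")
```

So the exact multiplier to set in the monitoring tool depends only on the two resistor values used in the mod.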


----------



## nrpeyton

deleted.


----------



## cluster71

nrpeyton: No, I have not run OCCT for the memory. I just chanced it. I have not systematically benchmarked and tested to get the most out of the CPU, GPU, RAM or GPU VRAM. I game, and work upwards.

But something is not right with the 7700K/Z270 when it comes to benchmark scores. There are some individuals who manage to get pretty good results, but most are much lower than older CPUs and platforms. Perhaps a conflict between manufacturers of motherboards, RAM, and M.2 drives in some combinations. There are more PCIe lanes than before, maybe some problems there. You almost want to try and get the old 3770K going, put in the Zotac and see. It's a little worrying for the new CPU series. Perhaps going back to Xeon processors again


----------



## cluster71

It is not always clear if we're speaking of continuous power or dynamic power (stored in capacitors, which can be very high). Hwinfo64 shows GPU power bigger than I've ever measured the whole PC pulling from the wall


----------



## dunbheagan

I finally beat the 10900 border in Supo 4K:



[email protected]/6237
[email protected]
32GB DDR3 2400 CL11


----------



## nrpeyton

Quote:


> Originally Posted by *dunbheagan*
> 
> I finally beat the 10900 border in Supo 4K:
> 
> 
> 
> [email protected]/6237
> [email protected]
> 32GB DDR3 2400 CL11


nice one

Any advice for the lower scoring peasants amongst us _(like me)_ ? lol


----------



## Dasboogieman

Quote:


> Originally Posted by *nrpeyton*
> 
> nice one
> 
> Any advice for the lower scoring peasants amongst us _(like me)_ ? lol


That broadwell chip coupled with an extremely potent memory subsystem is a good start.
He's also got a fresh windows install.


----------



## KraxKill

Quote:


> Originally Posted by *Dasboogieman*
> 
> That broadwell chip coupled with an extremely potent memory subsystem is a good start.
> He's also got a fresh windows install.


That 5775C is such a dark horse CPU. The Z97 platform is performing quite well for gaming. Even the older 4790K SKUs are hitting around 10.8k

I'm so glad I picked up a 5775C; Intel will have to come up with a 10nm chip for me to give up on the Z97 chipset.

I'm on Windows 10, but am hitting right at ~10.9k as well on 2133 CL9

That DDR3 is still performing quite well; add to that the 128MB of L4 @ 40ns and it's hard to beat.


----------



## dallas1990

Wish I never saw those E3 videos, 'cause I want Anthem so bad now. But it comes out next year (as of now, I might add), and here I am thinking: is my Zotac 1080 Ti AMP Extreme enough, or should I get a second one? Or wait on Nvidia and see what magic they come up with next lol. These thoughts are on the minds of a majority of us, I'm sure.


----------



## dunbheagan

Quote:


> Originally Posted by *nrpeyton*
> 
> nice one
> 
> Any advice for the lower scoring peasants amongst us _(like me)_ ? lol


I do not have a fresh Windows install; I am running a 2-year-old installation of Win7 Pro. But what helped me software-wise for the last points were the "classics":

-Turning all Nvidia Settings to "performance"
-Closing ALL programs from the taskbar before benching
-No Gsync
-No background picture

This seemed to give me at least 100 more points, maybe even a little more.


----------



## nrpeyton

Quote:


> Originally Posted by *dunbheagan*
> 
> I do not have a fresh Windows install; I am running a 2-year-old installation of Win7 Pro. But what helped me software-wise for the last points were the "classics":
> 
> -Turning all Nvidia Settings to "performance"
> -Closing ALL programs from the taskbar before benching
> -No Gsync
> -No background picture
> 
> This seemed to give me at least 100 more points, maybe even a little more.


I did the same (i.e. followed the advice from the OP) and it also netted me about 100 points. _(no G-Sync, and performance mode on any setting that allows it)_
But I still seem stuck at 9800. But most of the time I struggle to hit that. And my *best ever* was 9880. Which was in part due to getting the 7700k up to 5Ghz at an insanely silly voltage. (which I returned to 4.9 Ghz as soon as that bench had completed).

RAM also at 4000 Mhz with decent timings (18, 19, 19, 39).

Decent memory O/C of 575.

And core overclocked to 2100 (or slightly above) using curve.

Nothing hungry in background. Lucky if my idle CPU load is even 1%.

So how some of you guys are nearly 500-800 points ahead of me I just don't know.

I've spent another 12 hours at it since my rant here yesterday.. and I'm still no further forward.

*Anyone know if purchasing the Time Spy upgrade gives an edge?* (I know it allows you to disable the demo -- which for me annihilates my temps -- because by the time the actual tests begin my coolant is already about 38-40c, meaning the GPU reaches about 50c - 52c.)

I have the Advanced Edition, but unfortunately it was purchased before July 14, 2016-- meaning I would have to purchase Time Spy separately. Which, at the moment, I really can't afford.

I don't expect much from disabling the demo. Maybe only 1 or 2 bins.

But I'm interested to know if there's maybe another *secret "incentive"* in there-- which Futuremark don't like to talk about; to *"encourage" people to buy the upgrade*. Anyone noticed anything?


----------



## KraxKill

Quote:


> Originally Posted by *nrpeyton*
> 
> I did the same (i.e. followed the advice from the OP) and it also netted me about 100 points. _(no G-Sync, and performance mode on any setting that allows it)_
> But I still seem stuck at 9800. But most of the time I struggle to hit that. And my *best ever* was 9880. Which was in part due to getting the 7700k up to 5Ghz at an insanely silly voltage. (which I returned to 4.9 Ghz as soon as that bench had completed).
> 
> RAM also at 4000 Mhz with decent timings (18, 19, 19, 39).
> 
> Decent memory O/C of 575.
> 
> And core overclocked to 2100 (or slightly above) using curve.
> 
> Nothing hungry in background. Lucky if my idle CPU load is even 1%.
> 
> So how some of you guys are nearly 500-800 points ahead of me I just don't know.
> 
> I've spent another 12 hours at it since my rant here yesterday.. and I'm still no further forward.
> 
> *Anyone know if purchasing the Time Spy upgrade gives an edge?* (I know it allows you to disable the demo -- which for me annihilates my temps -- because by the time the actual tests begin my coolant is already about 38-40c, meaning the GPU reaches about 50c - 52c.)
> 
> I have the Advanced Edition, but unfortunately it was purchased before July 14, 2016-- meaning I would have to purchase Time Spy separately. Which, at the moment, I really can't afford.
> 
> I don't expect much from disabling the demo. Maybe only 1 or 2 bins.
> 
> But I'm interested to know if there's maybe another *secret "incentive"* in there-- which Futuremark don't like to talk about; to *"encourage" people to buy the upgrade*. Anyone noticed anything?


Careful with 3DMark. It is an easily misleading benchmark because it combines the CPU and GPU scores, plus a "combined" score, in a manner people will never see in practice. This has people drawing conclusions and buying workstation chips to game on, or vice versa.

The variables associated with 3DMark don't paint an accurate picture, especially when people post "totals".

3DMark is great up to a point, especially for stability testing of the entire system, but for "gaming" performance testing, Unigine Valley, Heaven, and now SP4K were and still are the best in terms of correlating to reality. Unigine marginalizes the effect of the CPU the way a game workload does and puts the focus on frame delivery. A good CPU will still push a few, but nonetheless more, frames on otherwise identical systems. This paints a more realistic picture of practical gaming scenarios. 3DMark tends to simulate stuff you will likely never see in practice. When will you ever (how often, if ever?) ask the CPU to render fractals exclusively? 3DMark is a picture of a hypothetical workload that is not encountered often, if ever, in reality.


----------



## PachAz

I don't own a 1080 Ti yet but I have ordered this one:



Which is a KFA2 factory-overclocked card with a reference PCB.

And I have ordered this water block as well, plus a backplate:


Aqua Computer kryoGRAPHICS


----------



## TK421

How much performance are we looking at when comparing a 1080 to a 1080 Ti?

Some open-box ones at Microcenter for a good price


----------



## Aganor

How can I undervolt my card? If I use MSI Afterburner, I can't go below 0.


----------



## KedarWolf

Quote:


> Originally Posted by *Aganor*
> 
> How can i undervolt my card? If i use msi afterburner, i cannot pass from 0


See the video on how to do a custom voltage curve, but you'd cap it at, say, 0.885v and 1900MHz core or something like that.

http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20
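For anyone wondering why such an aggressive undervolt is worth the effort on a power-limited card: dynamic (switching) power scales roughly with frequency times voltage squared. A rough sketch, comparing the 0.885v/1900MHz point above against an assumed 1.062v/1911MHz stock boost point:

```python
# Rough model: switching power ~ f * V^2. The stock point (1911MHz @ 1.062v)
# is an assumed typical FE boost state, not a measured value.

def rel_power(freq_mhz: float, volts: float) -> float:
    return freq_mhz * volts ** 2

stock = rel_power(1911, 1.062)
undervolt = rel_power(1900, 0.885)
savings = 1 - undervolt / stock
print(f"~{savings:.0%} less switching power for ~0.6% less clock")
```

That freed-up power budget is what lets the card hold its clocks instead of throttling against the TDP limit.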


----------



## KedarWolf

On my way to pick up my new motherboard.











https://www.asus.com/ca-en/Motherboards/ROG-STRIX-X99-GAMING/


----------



## smithsrt8

Quote:


> Originally Posted by *Aganor*
> 
> How can i undervolt my card? If i use msi afterburner, i cannot pass from 0


20-30fps more in most games...which in many cases is a ton...


----------



## smithsrt8

Quote:


> Originally Posted by *KedarWolf*
> 
> On my way to pick up my new motherboard.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.asus.com/ca-en/Motherboards/ROG-STRIX-X99-GAMING/


Nice choice...I considered the z270 board for a bit...


----------



## Aganor

Quote:


> Originally Posted by *KedarWolf*
> 
> See the video on how to do a custom voltage curve but you'd top it say a .885v 1900 Core or something like that.
> 
> http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20


Already done the afterburner mod but forgot to touch the graph, thanks


----------



## FastEddieNYC

Quote:


> Originally Posted by *PachAz*
> 
> I don't own a 1080 Ti yet but I have ordered this one:
> 
> 
> 
> Which is a KFA2 factory clocked card with reference PCB.
> 
> And I have ordered this as well and a back plate:
> 
> 
> Aquacomputer kryptographics


I'm using the same block and getting great results with it. At a 21C ambient temp and 25C water temp the core runs at 34C under load [email protected] I have found that Pascal is very temp sensitive. At 40C it will crash at that setting.
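The temperature sensitivity follows from how GPU Boost steps the clock down in fixed bins as the core warms up. A sketch of that behaviour; the ~13MHz bin size, the 5C step and the high-30s starting point are assumptions based on commonly observed behaviour, not an NVIDIA spec:

```python
# Sketch of GPU Boost style thermal bin-dropping on Pascal.
# BIN_MHZ, STEP_C and FIRST_DROP_C are assumed/observed values.

BIN_MHZ = 13
STEP_C = 5
FIRST_DROP_C = 37

def boost_clock(cold_boost_mhz: int, temp_c: float) -> int:
    """Boost clock after thermal bin drops at a given core temperature."""
    if temp_c < FIRST_DROP_C:
        return cold_boost_mhz
    bins_lost = int((temp_c - FIRST_DROP_C) // STEP_C) + 1
    return cold_boost_mhz - bins_lost * BIN_MHZ

for temp in (30, 40, 50, 60):
    print(temp, boost_clock(2076, temp))
```

On a fixed curve point the clock can't drop, but the silicon still gets less stable as it warms, which is why a setting that holds at 34C can crash at 40C.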


----------



## jase78

Getting better! I've gotten my Superposition 4K score up a bit. Just curious what the shunt mod will do for me. Gonna order some liquid metal paste today. Are you guys going with Liquid Metal Ultra? I thought I remembered reading somewhere in this thread that it was easier to spread on shunt resistors compared to Grizzly.



This has become so addictive!! This is the first card I've invested more time benching than gaming on!


----------



## Quadrider10

So out of Asus, evga, and gigabyte, which cards have been found to run the coolest and highest clocks?


----------



## TK421

Quote:


> Originally Posted by *jase78*
> 
> getting better! ive gotten my supo 4k score up a bit. Just curious what shunt mod will do for me. gonna order some liquid metal paste today. are you guys going with liquid metal ultra ? thought i remember reading somewhere in this thread that it was easier to spread on shunt resistors compared to grizzly.
> 
> 
> 
> This has become so addictive!! This is the first card that i've invested more time benching than i have gaming!


Grizzly is better.

Some people say that CLU eats the resistor solder.


----------



## ilikelobstastoo

Quote:


> Originally Posted by *TK421*
> 
> grizzly is better
> 
> some people say that clu eats the resistor solder


Why would you put it on the solder? Just short the resistor itself, correct?


----------



## bmgjet

Quote:


> Originally Posted by *ilikelobstastoo*
> 
> Why would u put it on the solder? Just short the resistor itself correct?


You're not shorting the resistor, you're changing its value.
The liquid metal needs to be touching each solder pad on top of the resistor.
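For the curious, the arithmetic behind "changing its value": the controller computes current as the voltage drop across the shunt divided by the resistance it assumes the shunt has. A parallel liquid-metal path lowers the effective resistance, so it under-reads. The 5 mOhm stock value is typical for these cards; the bridge resistance is an assumed figure for illustration:

```python
# Parallel-resistance sketch of the shunt mod. R_LM (the liquid-metal
# bridge) is an assumed value; real bridges vary a lot with application.

def parallel(r1: float, r2: float) -> float:
    return r1 * r2 / (r1 + r2)

R_SHUNT = 0.005   # 5 mOhm, typical stock shunt
R_LM = 0.010      # assumed liquid-metal bridge resistance

r_eff = parallel(R_SHUNT, R_LM)
print(f"effective shunt: {r_eff * 1000:.2f} mOhm")
print(f"card now reports ~{r_eff / R_SHUNT:.0%} of the real power")
```

Push the effective resistance too low and the reported power becomes implausibly small, which is when the card drops into its failsafe mode.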


----------



## KedarWolf

Quote:


> Originally Posted by *ilikelobstastoo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TK421*
> 
> grizzly is better
> 
> some people say that clu eats the resistor solder
> 
> 
> 
> Why would u put it on the solder? Just short the resistor itself correct?
Click to expand...

Be careful, there are two different mods that basically do the same thing.

For the shunt mod you use CLU; the resistor mod uses 47 Ohm resistors and a totally different method.

CLU is ONLY for the shunts. If you use CLU on the three resistors, you'll brick your card.

Search this thread, you'll find both methods.

Actually, I recall I've been corrected: the 47 Ohm resistors go on three capacitors, not on resistors.


----------



## KraxKill

Played a bit more with the voltage curve today and finally broke 10900. I can probably lean out my Win10 a bit more but I doubt it will help. I don't think 11K is possible without additional power mods beyond what the shunt mod can offer. The hardcoded power limit is still well in effect.



5775C - 4.3Ghz - 2133CL9 - shunted 1080Ti FE @ 2113mhz GDDR5x @ 6126mhz


----------



## FastEddieNYC

Quote:


> Originally Posted by *TK421*
> 
> grizzly is better
> 
> some people say that clu eats the resistor solder


I'm using CLU with no issues. Both products contain gallium and tin; the Grizzly just has them in a different ratio and may have more indium. As long as you keep the liquid metal on top of the shunt and don't coat the solder joint, the chance of causing damage is minimized.

For anyone who wants to try the liquid metal shunt mod: I found that if you take very fine sandpaper (1500+) and lightly hit each end of the shunt to make some micro-abrasions, it helps the TIM wet and adhere to the surface.


----------



## cluster71

A bit strange. Since I bought the Zotac Amp E, the power compensation does not seem to work. When power usage begins to approach 110-120% (2100MHz, 1.093v) with the original BIOS, the card cannot back down without crashing or freezing instead. I had to keep the core a bit under that, at 2062MHz 1.063v (100-105%), for it to work. I thought the card was simply unable to pass 2100, but when I switched to the XOC BIOS the card is totally stable at 2100MHz, 1.093v (I played for several days on it), even stable at higher frequencies with higher voltage. In the afternoon I went back to the original BIOS, since going over 2100 is not really practical considering the power draw and heat. But when I started playing at 2100MHz 1.093v as before, it crashed ELITE 4 within 2 minutes, several times. As before, the card cannot back the clock frequency down without a freeze; I have to back off. Either there is something wrong with my card or the original BIOS is badly programmed. I installed XOC again and have now played 2 hours at 2100/12000, 1.093v.


----------



## KenjiS

So guys any way i can get that Destiny 2 code you think since I JUST got my 1080 Ti?


----------



## TK421

Quote:


> Originally Posted by *FastEddieNYC*
> 
> I'm using CLU with no issues. Both products contain Gallium and Tin. The Grizzly just has it in different ratio and may have more Indium. As long as you keep the liquid metal on the top of the shunt and not coat the solder joint the chance of causing damage is minimized.
> 
> For anyone that wants to try the liquid metal shunt mod I found that if you take very fine sandpaper(1500+) and lightly hit each end of the shunt to make some micro abrasions it will increase the surface tension for the TIM.


ahh ok

does anyone know the screwdriver bit size/type you need to have to open the nvidia stock cooler?


----------



## dunbheagan

Quote:


> Originally Posted by *KraxKill*
> 
> Played a bit more with the voltage curve today and and finally broke 10900. I can probably lean out my Win10 a bit more but I doubt it will help. I don't think 11K is possible without additional power mods beyond what the shunt mod can offer. The hardcoded power limit is still well in effect.
> 
> 
> 
> 5775C - 4.3Ghz - 2133CL9 - shunted 1080Ti FE @ 2113mhz GDDR5x @ 6126mhz


Congrats, decent score!


----------



## nrpeyton

Quote:


> Originally Posted by *TK421*
> 
> ahh ok
> 
> does anyone know the screwdriver bit size/type you need to have to open the nvidia stock cooler?


1.4mm precision screwdriver. I had to buy a little precision screwdriver set especially for the task. _But it was only 5 bux. It has 1.0, 1.4, 2.0, 2.4, 3.0 & 3.5mm in the set. (If that helps you find what you're looking for.)



_
And then underneath you also need a 4mm hex *socket*. But don't make the same mistake I did.

It's the *socket*, not a normal hex.

I.E. the middle is hollow.

Instead of the hex, you could use pliers.. but be careful.. very dangerous if they slip... I used pliers myself -- but I covered the area around every screw with thick tape to protect the solder points and other components on the PCB for when the pliers slip.. and they *will slip*: _(it was painstaking using pliers)_

EK manual also says you can use pliers. But gives you a stern warning about being careful.



*Something very odd is happening:*



*Only 68% GPU utilisation*
..._*scratches head*_


----------



## ilikelobstastoo

The solder pad is on the board. A typical SMD resistor has metal ends and is soldered to the pads. So no, you short the resistor, not the solder. Yes, you want a small amount of resistance to prevent a failsafe, but it is still a short nonetheless.


----------



## dunbheagan

Quote:


> Originally Posted by *nrpeyton*
> 
> *Something very odd is happening:*
> 
> 
> 
> *Only 68% GPU utilisation*
> ..._*scratches head*_


Looks like you have a frame limit at 40fps? Check RivaTuner, Framerate Limit.
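A frame cap would explain the number exactly: utilisation roughly equals the capped frame rate divided by what the card could render uncapped. A quick sanity check (the uncapped figure is an assumption chosen to match the screenshot):

```python
# If a 40fps limiter is active and the card could do ~59fps uncapped,
# the GPU is busy for only part of each frame interval.

def expected_utilisation(capped_fps: float, uncapped_fps: float) -> float:
    return min(1.0, capped_fps / uncapped_fps)

print(f"{expected_utilisation(40, 59):.0%}")
```

40/59 lands right on the 68% shown, so checking RivaTuner's framerate limit is the right first step.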


----------



## TK421

Quote:


> Originally Posted by *nrpeyton*
> 
> 1.4mm precision screw driver. I had to buy a little precision screw driver set especially for the task. _But it was only 5 bux. It has 1.0, 1.4, 2.0, 2.4, 3.0 & 3.5mm in the set. (If that helps you to find what you're looking for).
> 
> 
> 
> _
> And then underneath you also need a 4mm hex *socket*. But don't make the same mistake I did.
> 
> It's the *socket*, not a normal hex.
> 
> I.E. the middle is hollow.
> 
> Instead of the hex, you could use pliers.. but be careful.. very dangerous if they slip... I used pliers myself -- but I covered the area around every screw with thick tape to protect the solder points and other components on the PCB for when the pliers slip.. and they *will slip*: _(it was painstaking using pliers)_
> 
> EK manual also says you can use pliers. But gives you a stern warning about being careful.
> 
> 
> 
> *Something very odd is happening:*
> 
> 
> 
> *Only 68% GPU utilisation*
> ..._*scratches head*_


so what kind of hex socket driver is the correct one?


----------



## bmgjet

Too much of a resistance drop on the shunts will put the card into failsafe mode.

Lol, I removed my stock cooler using a fine-tip knife since the screws are smaller than my smallest screwdriver.


----------



## nrpeyton

Quote:


> Originally Posted by *TK421*
> 
> so what kind of hex socket driver is the correct one?


----------



## FastEddieNYC

Quote:


> Originally Posted by *TK421*
> 
> ahh ok
> 
> does anyone know the screwdriver bit size/type you need to have to open the nvidia stock cooler?


For the screws either a #00 or #000 works and you need a 4mm socket.


----------



## cluster71

nrpeyton, You'll throw it out with the garbage soon


----------



## kevindd992002

Is the EK WB for the EVGA GTX 1080 Ti FTW3 the only WB available for this specific GPU? Is it already released?


----------



## bmgjet

Quote:


> Originally Posted by *kevindd992002*
> 
> Is the EK WB for EVGA GYX 1080Ti FTW3 the only WB available for this specific GPU? Is it already released?


Only full cover block.
Not released yet. It was meant to be released on the 17th, but it looks like it's been delayed until next month.


----------



## kevindd992002

Quote:


> Originally Posted by *bmgjet*
> 
> Only full cover block.
> Not released yet. Was ment to be released on 17th but it looks like its been delayed until next month.


That's a bummer! I thought it was meant to be released end of May.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *nrpeyton*


You can use a 4mm hex with a few layers of tape. The screws are only finger tight anyway, and should be.


No one should be turning their wrist when tightening these screws. Just fingers.


----------



## evensen007

Happy to say I just joined the club. After suffering through terrible recent x-fire scaling and driver issues, I finally upgraded from my pair of AMD 290's. I got an MSI Founder's Edition and threw an XSPC full cover block on it. Right out of the gate, I got +150 without messing with voltage, and my temps haven't hit higher than 45 during any gaming. The turbo freq sits around 2070 or so, and I haven't even played too much with it!

Add that to the HUGE improvement in my gaming quality of life (sick of x-fire microstutter, negative scaling, hiccups) and I am so happy to be 'Home'. I've been all AMD since my last nVidia card, a GeForce 8800 GTS in 2007! After that I went AMD (Lightning 580's xfire, 7970's xfire, 290's xfire). It was good for a while when xfire scaling was solid, but now BF1 still doesn't even work properly with xfire. I tried to hold out for Vega, but was seriously let down by the repeated c*&^$-teases and no release of real info. Sorry AMD, you made me go green and I am so happy I did!


----------



## ilikelobstastoo

Best bet is to simply use the proper tools for the job . Most 1/4 inch socket sets include 4 ,5 ,5.5 mm sockets these days


----------



## Aganor

Quote:


> Originally Posted by *evensen007*
> 
> Happy to say I just joined the club. After suffering through terrible recent x-fire scaling and driver issues, I finally upgraded from my pair of AMD 290's. I got an MSI Founder's Edition and threw an XSPC full cover block on it. Right out of the gate, I got +150 without messing with voltage and my temps haven't hit higher than 45 during any gaming. The turbo freq sits around 2070 or so, and I haven't even played to much with it!.
> 
> Add that to the HUGE improvement in my gaming quality of life (sick of x-fire microstutter, negative scaling, hiccups) and I am so happy to be 'Home'. I've been all AMD since my last nVidia card was a GeForce4 8800 GTS in 2007! After that I went AMD (Lightning 580's xfire, 7970's xfire, 290's xfire). It was good for a while when xfire scaling was solid, but now BF1 still doesn't even work properly with xfire. I tried to hold out for Vega, but was seriously let down by the repeated c*&^$-teases and no release of real info. Sorry AMD, you made me go green and I am so happy I did!


Welcome back home









I got fed up with the Radeon 5970 dual GPU back in 2010 with the same issues as you. After that, only Nvidia here.

You got a nice chip there, but you'll find that messing with voltages will worsen your scores most of the time. Pascal has an internal TDP limiter that we cannot mess with.
I have +175 on core, (mostly) stable.


----------



## ZealotKi11er

I do not have much experience with Nvidia GPUs. What does a game crashing to desktop indicate about system stability?


----------



## eXteR

I received a replacement from nVidia today for my Founders 1080 Ti.

I got the RMA last week because the GeForce GTX LED stopped working.

The new one is better. It can go to 2100MHz, and does 2012MHz @ 1.0v.

My first one maxed out at 2037MHz.

I'm really happy with the replacement unit. It was brand new, not refurbished.


----------



## KraxKill

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I do not have much experience with Nvidia GPUs. What sign does the game crashing to destop show for system stability?


More info?


----------



## ZealotKi11er

Quote:


> Originally Posted by *KraxKill*
> 
> More info?


Both the 1080 and 1080 Ti crash to desktop after playing for a bit. I played Metro Last Light and SW:BF. It can't be the CPU, because that usually shuts the PC off or you get an .exe error. With my AMD card, if a game was not stable it would crash with an .exe error, or the drivers would reset, or the game would hang.


----------



## nrpeyton

Quote:


> Originally Posted by *evensen007*
> 
> Happy to say I just joined the club. After suffering through terrible recent x-fire scaling and driver issues, I finally upgraded from my pair of AMD 290's. I got an MSI Founder's Edition and threw an XSPC full cover block on it. Right out of the gate, I got +150 without messing with voltage and my temps haven't hit higher than 45 during any gaming. The turbo freq sits around 2070 or so, and I haven't even played to much with it!.
> 
> Add that to the HUGE improvement in my gaming quality of life (sick of x-fire microstutter, negative scaling, hiccups) and I am so happy to be 'Home'. I've been all AMD since my last nVidia card was a GeForce4 8800 GTS in 2007! After that I went AMD (Lightning 580's xfire, 7970's xfire, 290's xfire). It was good for a while when xfire scaling was solid, but now BF1 still doesn't even work properly with xfire. I tried to hold out for Vega, but was seriously let down by the repeated c*&^$-teases and no release of real info. Sorry AMD, you made me go green and I am so happy I did!


Welcome to the club.

Glad you're happy with your new card.

You'll find a wealth of information on here. But from the sounds of it--if you're coming from AMD-- then the performance & quality upgrade you're already experiencing is probably all you need. (certainly for a little while at least) 

Overclocking will help you edge out a few extra FPS-- but nothing game changing..

Most of us here overclock for fun, and, well; certainly for myself -- I like to see how far I can push something ;-)

However if you do get the 'bug for overclocking' then yeah... there's _*massive*_ wealth of information here. And loads of people; who all enjoy helping 

What games are you playing?

Quote:


> Originally Posted by *eXteR*
> 
> I recieved today a nVidia replacement for my founders 1080Ti.
> 
> Got RMA last week, because Geforce GTX led stopped working.
> 
> New one i better. Can go 2100Mhz and got 2012Mhz @ 1.0v
> 
> My first one max was 2037Mhz.
> 
> I'm really happy with that replacement unit. Was a brand new, not a refurbished.


Glad you had a pleasant RMA experience  Was it direct from Nvidia or one of the partners?


----------



## Dasboogieman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Both 1080 and 1080 Ti crash into desktop after playing for a bit. I played Metro Last Light and SW:BF. Cant be the CPU because that usually shuts the PC off or you get .exe problem. With my AMD card if game was not stable it would crash with .exe or drivers would reset or game would hang.


Pascal tends to straight up crash to desktop (but rarely forces a reboot; usually a driver reset at worst) upon signs of instability. Some games like FFXIV or PUBG are so sensitive to instability that they crash to desktop within minutes with a fatal DirectX error (but no driver crash).
AMD, IIRC, was able to chug along fine at borderline stability for ages with massive artifacting but no crash. The stability has to be really out of whack to force an AMD card to crash. That's why OCs were a PITA to verify with AMD; you never really knew if you were truly stable or not. I used to verify an OC with weeks of Folding@Home (some instabilities only showed up after 12+ days of protein folding, with rounding errors resulting in declined WUs).

Pascal is simple: it either works or it doesn't. However, this is also a double-edged sword. It's really hard to distinguish between buggy software/driver issues and a hardware crash with NVIDIA.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dasboogieman*
> 
> Pascal tends to straight up crash to desktop (but rarely forcing a reboot, usually driver reset at worst) upon signs of instability. Some games like FFXIV or PUBG are so sensitive to instability that they crash to desktop within minutes with a fatal directX error (but no driver crash).
> AMD IIRC was able to chug along fine at borderline stability for ages with massive artifacting but no crash. The stability has to be really out of whack to force an AMD card to crash. Thats why OCs were a PITA to verify with AMD, you never really knew if you were truly stable or not, I used to verify an OC on weeks of [email protected] (some only showed up after 12+ days of protein folding with rounding errors resulting in declined WUs).
> 
> Pascal is a simple it either works or it doesnt. However, this is also a double edged sword. Its really hard to distinguish between buggy software/driver issues and a hardware crash with NVIDIA.


I am testing my system for stability with Prime95 right now since I changed the MB to ITX. The problem is that my 1080 crashed even at stock, and that was with a custom fan curve, so it was not running hot. Also, my 290X would crash Heroes of the Storm as soon as it was not stable, but the 1080 and 1080 Ti have not had a single crash there.


----------



## Dasboogieman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am testing my system for stability with Prime 95 right now since I did change the MB to ITX. Problem is that my 1080 crashed even at stock and that was with custom fan curve so it was not running hot. Also my 290X would crash Heros of the Storm as soon as it was not stable but 1080 and 1080 Ti not a single crash there.


Try the FFXIV benchmark. Once you've downloaded it, go to the NVIDIA control panel and disable Vsync. If you can run the looped bench on max settings for longer than 4hrs, it's stable.

Also double check all your cables are firmly connected. Just in case you're crashing because of power delivery.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *ilikelobstastoo*
> 
> Best bet is to simply use the proper tools for the job . Most 1/4 inch socket sets include 4 ,5 ,5.5 mm sockets these days


Yeah, even though I had a 4mm socket as pictured, it didn't fit perfectly and I didn't want the screws to look tampered with. Even a corner being nicked. So the tape came in use.


----------



## eXteR

Quote:


> Originally Posted by *nrpeyton*
> 
> Glad you had a pleasant RMA experience  Was it direct from Nvidia or one of the partners?


Direct from nVidia.


----------



## KraxKill

I'm done tinkering with the card for the time being until somebody figures out a clear cut method to more power beyond what the shunt mod offers.

The XOC BIOS *does not increase the headroom at all* for already-shunted cards. All BIOSes seem to be limited by the Nvidia hardcoded power limit. You may see a higher frequency on the XOC, but your actual score and performance is likely to be lower or not increase, especially if the card is shunted already. Running additional voltage will just make the card bounce against the hardcoded TDP that much harder, lowering performance.

Another thing I noticed. *If you experience a hard crash* (Win10, Afterburner beta 10), not a soft driver crash/reset but a harder crash: sometimes I found that Afterburner would handicap the card by a couple of bins regardless of the applied setting, and not allow a saved profile to clock the card up to its actual set frequency. It's like it would add a negative offset. A reboot would not fix this for me.

*To remedy* I would have to delete and uninstall Afterburner along with the existing Afterburner profiles in the program files. Shutdown and restart the system and reinstall Afterburner from scratch. It would then again clock appropriately.

You can test if this condition is in effect by resetting Afterburner settings, maxing the voltage, TDP and temp sliders and note that your stock max boost is below where it would be. For me it would be in the 18xxs vs 1911 with the three top control sliders maxed out. It would stay at 18xx until I uninstall Afterburner, restart the system and reinstall Afterburner. Bam back at 1911 with 0 offset. I found this behavior pretty odd actually.

After reinstall, the system would then again OC correctly through any soft crashes. Force another hard crash, and BAM! it takes on a negative offset that won't go away until Afterburner reinstall.

Maybe there is a simpler way around this fault. Just something I ran into so thought I'd share.
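A simpler variant of the remedy might be to clear just the saved profiles rather than reinstalling. A sketch of that idea, untested against Afterburner itself; the install path in the example is the usual default and is an assumption, and the app must be closed before running anything like this:

```python
# Move Afterburner's saved profile files aside so it rebuilds clean
# defaults on next launch. Hypothetical helper for illustration only;
# always back up and close the app first.
import pathlib
import shutil

def clear_profiles(profile_dir: pathlib.Path, backup_dir: pathlib.Path) -> int:
    """Move every *.cfg profile out of profile_dir; return how many moved."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    moved = 0
    for cfg in profile_dir.glob("*.cfg"):
        shutil.move(str(cfg), str(backup_dir / cfg.name))
        moved += 1
    return moved

# Example (assumed default install location; adjust for your system):
# clear_profiles(
#     pathlib.Path(r"C:\Program Files (x86)\MSI Afterburner\Profiles"),
#     pathlib.Path(r"C:\AB_profile_backup"),
# )
```

If the negative-offset state lives in those profile files, this would get you back to a clean slate without the full uninstall/reinstall cycle.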


----------



## KingEngineRevUp

Quote:


> Originally Posted by *KraxKill*
> 
> I'm done tinkering with the card for the time being until somebody figures out a clear cut method to more power beyond what the shunt mod offers.
> 
> The XOC Bios *does not increase the headroom at all*l for already shunted cards. All BIOS seem to be limited by the Nvidia hard coded power limit. You may see a higher frequency on the XOC, but your actual score and performance is likely to be lower or not increase especially if the card is shunted already. Running additional voltage will just make the card bounce against the hardcoded TDP that much harder lowering performance.
> 
> Another thing I noticed. *If you experience a hard crash* (win10, Afterburner beta10). Not a soft driver crash/rest but a harder crash. Sometimes, I found that Afterburner would handicap the card regardless of applied setting by a couple of bins and not allow a saved profile to clock the card up to its actual set frequency.. It's like it would add a negative offset. A reboot would not fix this for me.
> 
> *To remedy* I would have to delete and uninstall Afterburner along with the existing Afterburner profiles in the program files. Shutdown and restart the system and reinstall Afterburner from scratch. It would then again clock appropriately.
> 
> You can test if this condition is in effect by resetting Afterburner settings, maxing the voltage, TDP and temp sliders and note that your stock max boost is below where it would be. For me it would be in the 18xxs vs 1911 with the three top control sliders maxed out. It would stay at 18xx until I uninstall Afterburner, restart the system and reinstall Afterburner. Bam back at 1911 with 0 offset. I found this behavior pretty odd actually.
> 
> After reinstall, the system would then again OC correctly through any soft crashes. Force another hard crash, and BAM! it takes on a negative offset that won't go away until Afterburner reinstall.
> 
> Maybe there is a simpler way around this fault. Just something I ran into so thought I'd share.


Thanks for all your hard work in this thread. Now it's time to enjoy gaming, lol.

It felt good to me to just let this go and play video games and I've been gaming ever since.


----------



## evensen007

Quote:


> Originally Posted by *nrpeyton*
> 
> Welcome to the club.
> 
> Glad you're happy with your new card.
> 
> You'll find a wealth of information on here. But from the sounds of it--if you're coming from AMD-- then the performance & quality upgrade you're already experiencing is probably all you need. (certainly for a little while at least)
> 
> Overclocking will help you edge out a few extra FPS-- but nothing game changing..
> 
> Most of us here overclock for fun, and, well; certainly for myself -- I like to see how far I can push something ;-)
> 
> However if you do get the 'bug for overclocking' then yeah... there's _*massive*_ wealth of information here. And loads of people; who all enjoy helping
> 
> What games are you playing?


Thanks for the warm welcome! After fiddling with trying to overclock 290's for years, I'm mostly out of the game now. And like you said, the performance boost was so large that I don't even really feel the need to. Adding 150 to the core was just something I figured I'd try and it worked so that's that!

Right now I am playing Battlefield 1. There are some games I'd like to try that I haven't had time for like the new Mass Effect and Witcher 3. Hell, I'd even like to re-boot Skyrim and add all the crazy mods now that my PC will handle it!


----------



## ZealotKi11er

Quote:


> Originally Posted by *evensen007*
> 
> Thanks for the warm welcome! After fiddling with trying to overclock 290's for years, I'm mostly out of the game now. And like you said, the performance boost was so large that I don't even really feel the need to. Adding 150 to the core was just something I figured I'd try and it worked so that's that!
> 
> Right now I am playing Battlefield 1. There are some games I'd like to try that I haven't had time for like the new Mass Effect and Witcher 3. Hell, I'd even like to re-boot Skyrim and add all the crazy mods now that my PC will handle it!


For the FE, the overclock helps because of the power target: +150 means it will run a given clock using less voltage, so lower power. It's very different from AMD overclocking.


----------



## smithsrt8

Quote:


> Originally Posted by *SlimJ87D*
> 
> Thanks for all your hard work in this thread. Now it's time to enjoy gaming, lol.
> 
> It felt good to me to just let this go and play video games and I've been gaming ever since.


I know what you mean... I am driving myself crazy trying to get high scores in benchmarks, and I keep forgetting that, honestly... is there a huge difference between 90fps on ultra settings and 120fps on ultra settings (not that you can really gain 30fps in most games with tuning) when you have a good G-Sync monitor? Not really... maybe if you really, really look... but if you can get past that minuscule difference and just enjoy silky smooth gameplay, it seems to make most problems go away.


----------



## AllGamer

F! newegg!

They lost my package









Picked up 2 nice 1080 Ti Seahawks and now they are lost in the mail. Going through the nightmare of fighting with the shipper, Newegg and every damn resource I can find to get my package, or my money back.










https://www.newegg.ca/Product/Product.aspx?Item=N82E16814137144&cm_re=GTX_1080_Ti-_-14-137-144-_-Product

Newegg says 7 to 14 business days for their investigation, and that's a "maybe".

I'm thinking I should just call the credit card company and have them do a chargeback instead.

But still... that version is sold out everywhere; I can't even order again even if I charge back via CC.

Bastards! I'm sure it was the mailman that took it, because the tracking says delivered, but there is nothing in the mailbox.


----------



## ZealotKi11er

Quote:


> Originally Posted by *AllGamer*
> 
> F! newegg!
> 
> They lost my package
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Picked up 2 nice 1080Ti Seahawk and now they are lost in mail, going through the nightmare of fighting with the Shipper, New Egg and every damn resource I can find to get my package, or my money back
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.newegg.ca/Product/Product.aspx?Item=N82E16814137144&cm_re=GTX_1080_Ti-_-14-137-144-_-Product
> 
> Newegg says 7 to 14 business day
> 
> 
> 
> 
> 
> 
> 
> for their investigation, and that's a "maybe"
> 
> I'm thinking I should just call the Credit Card company and have them charge back instead.
> 
> but still... that version is sold out everywhere, can't even order again even If I charge back via CC.
> 
> Bastards! I'm sure it was the mail man guy that took it.
> 
> because the tracking says delivered, but there is nothing in the mail box.


They know who delivered it. He will get fired.


----------



## Aganor

Getting a bugcheck from the watchdog, 0x00000133 (DPC_WATCHDOG_VIOLATION).
Maybe the OC isn't stable?
It happened while mining in the NiceHash app.

I'm mining at 100% power limit and stock voltage, 1949MHz @ 1000mv.


----------



## Agent-A01

Quote:


> Originally Posted by *AllGamer*
> 
> F! newegg!
> 
> They lost my package
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Picked up 2 nice 1080Ti Seahawk and now they are lost in mail, going through the nightmare of fighting with the Shipper, New Egg and every damn resource I can find to get my package, or my money back
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.newegg.ca/Product/Product.aspx?Item=N82E16814137144&cm_re=GTX_1080_Ti-_-14-137-144-_-Product
> 
> Newegg says 7 to 14 business day
> 
> 
> 
> 
> 
> 
> 
> for their investigation, and that's a "maybe"
> 
> I'm thinking I should just call the Credit Card company and have them charge back instead.
> 
> but still... that version is sold out everywhere, can't even order again even If I charge back via CC.
> 
> Bastards! I'm sure it was the mail man guy that took it.
> 
> because the tracking says delivered, but there is nothing in the mail box.


That sucks..
If they can't find the package and decide to refund you, I would just go a different route.

Honestly, you could save hundreds by buying a reference card, or even the Gigabyte OC that's always on sale for around 600 USD, plus ~120 USD for a block.

You will save a lot of money in the end.


----------



## AllGamer

Quote:


> Originally Posted by *Agent-A01*
> 
> That sucks..
> If they can't find the package and decide to refund you I would just go a different route.
> 
> honestly you should save hundreds and buy reference or even gigabyte OC that are always on sale for around 600 usd... + 120 USD for blocks.
> 
> You will save a lot of money in the end.


Yeah, I know it's cheaper to go Reference + Block, but I actually like that MSI Dragon EK block







I don't mind paying extra for that.

I already have 2 GTX 1080 (non-Ti) SeaHawks from last year; the Ti version is an upgrade to replace the 2 old ones.

{Build Log} Upside Down S8 project

I was so looking forward to installing it today, only to find the mailbox empty, while tracking claims it was delivered yesterday at lunchtime.

I've got family in the house all day; no one ever rang the doorbell, and there was no delivery note in the mailbox.

With the NewEgg / MSI logos all over the box, I'm sure the mail guy knew what it was and ran with it.

This is how it looked when the older 2x GTX 1080 (non-Ti) SeaHawks came in:
http://www.overclock.net/t/1608897/build-log-upside-down-s8-project/0_50#post_25450083


----------



## szeged

probably been asked a billion times already but....

for 4k is the 1080ti worth it with recent games?

been MIA from computer tech for about 14 months now so I have no idea how it's doing. Still using my Maxwell Titan Xs.

worth upgrading or keep waiting for 60+ fps 4k?


----------



## feznz

Quote:


> Originally Posted by *szeged*
> 
> probably been asked a billion times already but....
> 
> for 4k is the 1080ti worth it with recent games?
> 
> been MIA from computer tech for about a 14 months now so i have no idea how its doing. still using my maxwell Titan Xs.
> 
> worth upgrading or keep waiting for 60+ fps 4k?


Are you happy with what you have? Are there major shortfalls, i.e. non-SLI support or micro stutter?

I have a 3K monitor; IMHO it isn't quite a genuine 3K max-settings card, pretty damn close though. I am seriously thinking of going SLI, but I've got to sort out a new motherboard first, which is going to be X299 or X399


----------



## KraxKill

Quote:


> Originally Posted by *feznz*
> 
> Are you happy with what you have? Are there major shortfalls, i.e. non-SLI support or micro stutter?
> 
> I have a 3K monitor; IMHO it isn't quite a genuine 3K max-settings card, pretty damn close though. I am seriously thinking of going SLI, but I've got to sort out a new motherboard first, which is going to be X299 or X399


Put it this way, SLI bridge bandwidth has not increased significantly over the last few years. Faster cards only exacerbate this issue. To add to the bandwidth limitations, there are simply physical latency characteristics of the SLI interface that deal with frame timing and delivery that are impossible to remedy with current SLI bridging latencies.

SLI with ideal scaling in a hypothetical game, when it works, will get you from say 60fps on a single gpu to 100fps with sli, but that 100fps will perform visually the way 75fps performs on a single card, a place you could easily get to without the headaches of SLI profiles and the associated frame judder.

IMO, you are better off spending the money on better cooling and getting the most out of a single card instead. If the Ti is not enough wait for Volta.

Sure, in the above scenario, you could water cool both cards and maybe squeeze 115fps out of it giving you the visual performance of say an 85fps single GPU system in terms of fluid performance but given the above, I find SLI very cost prohibitive even if affordable.

SLI is great for synthetic, scientific workloads, but for gaming I would stick with a single GPU. I will turn down settings before I deal with microstutter ever again. Back in the day, SLI was great, because delivering a frame at all superseded delivering the frame on time. I think now that we're able to push 60-100fps on a single card frame timing deserves more focus than simply rendering more frames.

The above is my opinion and opinions differ. Sometimes pictures and graphs help.







Much more in this thread and many more like it.
https://forums.geforce.com/default/topic/946036/sli/gtx-1080-sli-stutter-at-144hz-even-with-new-hb-sli-bridge/
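To put rough numbers on the frame-delivery point above: a fraps-style counter averages frame times, but perceived smoothness tracks the worst frames. The frame times and the percentile cutoff below are made-up illustrative values, not measurements:

```python
def effective_fps(frame_times_ms):
    # Perceived smoothness tracks the worst frames, not the average:
    # use the (roughly) 99th-percentile frame time as the effective one.
    worst = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99) - 1]
    return 1000.0 / worst

# Single GPU at a steady 60 fps: every frame ~16.7 ms
single = [1000.0 / 60] * 100

# AFR SLI averaging "100 fps", but alternating short/long frames (microstutter)
sli = [6.0, 14.0] * 50   # mean 10 ms, so a counter reports 100 fps

print(round(effective_fps(single)))  # 60
print(round(effective_fps(sli)))     # 71: the "100 fps" feels like ~70
```

Same idea as the frame-time graphs in the linked thread: the average goes up, the delivery consistency does not.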


----------



## ZealotKi11er

Quote:


> Originally Posted by *szeged*
> 
> probably been asked a billion times already but....
> 
> for 4k is the 1080ti worth it with recent games?
> 
> been MIA from computer tech for about a 14 months now so i have no idea how its doing. still using my maxwell Titan Xs.
> 
> worth upgrading or keep waiting for 60+ fps 4k?


Had the 1080, which is faster than a Titan X, but it was not enough for 4K. Got the 1080 Ti and really it's pretty much there. You can always wait, but it will be another 6 months for a faster card.


----------



## nrpeyton

Quote:


> Originally Posted by *AllGamer*
> 
> Yeah, I know it's cheaper to go Reference + Block, but I actually like that MSI Dragon EK block
> 
> 
> 
> 
> 
> 
> 
> I don't mind paying extra for that.
> 
> I already have 2 GTX 1080 (non Ti) SeaHawk from last year, the Ti version is an upgrade to replace the 2 old ones.
> 
> {Build Log} Upside Down S8 project
> 
> I was soo looking forward to install it today, only to find out the mail box was empty, while it claims to have been delivered yesterday at lunch time.
> 
> I've got family in the house all day, no one ever rang the door bell, and no delivery note on the mail box.
> 
> With the NewEgg / MSI logos all over the box, I'm sure the Mail guy knew what it was and ran with it.
> 
> This is how it looks like when the older 2x GTX 1080 (non-Ti) SeaHawks came in
> http://www.overclock.net/t/1608897/build-log-upside-down-s8-project/0_50#post_25450083


I'm astonished they actually brand the boxes with company logos etc.

Just asking for trouble.

It would be much better if they kept the boxes plain and anonymous to remove any chance of opportunists.

Hope you get it sorted out though. I'd be livid.


----------



## szeged

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Had the 1080 which is faster than Titan X but it was not enough for 4K. Got the 1080 Ti and really its pretty much there. You can always wait but it will take another 6 months for faster card.


Thanks, pretty much what I was looking for.

I'll wait. I'd rather have all the way there instead of almost there lol.

Cooling is not an issue, neither is SLI.

After quad Titan X I won't be making that mistake again...


----------



## ZealotKi11er

Quote:


> Originally Posted by *szeged*
> 
> thanks, pretty much what i was looking for.
> 
> ill wait. Id rather have all the way there instead of almost there lol.
> 
> cooling is not an issue neither is SLI.
> 
> After quad titan x i wont be making that mistake again...


The only problem is that games get more demanding, and 4K will never be easy to master with a single GPU. New GPUs will just make sure that older games can be played at 60+ fps.


----------



## Quadrider10

So I'm debating between 3 different cards here. Which one has had the best recorded performance and coolest temps?

Asus 1080ti Strix (non poseidon)
EVGA 1080ti SC2
Aorus 1080ti (non-extreme)

https://www.newegg.com/Product/Product.aspx?Item=9SIA24G5PY3917
https://www.newegg.com/Product/Product.aspx?Item=N82E16814487337
https://www.newegg.com/Product/Product.aspx?Item=N82E16814125954

I'm leaning toward EVGA, then Asus, then Aorus...


----------



## ZealotKi11er

Quote:


> Originally Posted by *Quadrider10*
> 
> so im debating between 3 different cards here. which one has had the best recorded performance and coolest temps?
> 
> Asus 1080ti Strix (non poseidon)
> EVGA 1080ti SC2
> Aorus 1080ti (non-extreme)
> 
> https://www.newegg.com/Product/Product.aspx?Item=9SIA24G5PY3917
> https://www.newegg.com/Product/Product.aspx?Item=N82E16814487337
> https://www.newegg.com/Product/Product.aspx?Item=N82E16814125954
> 
> Im leaning toward EVGA then Asus then Aorus...


Considering the price difference Aorus.


----------



## Quadrider10

Yea, the EVGA goes better with my build, but what it seems like is that all these cards are hitting pretty much the same clock speeds and temps, give or take a few MHz/degrees and silicon lottery.

What about modding these cards? Which one would be best?


----------



## mechwarrior

Has anyone tried the Poseidon BIOS on FE cards???


----------



## SupernovaBE

Is there a difference between 1080 Ti cards for overclocking if it's watercooled?

I need to pick a card:

ASUS ROG GeForce GTX 1080 Ti Strix Gaming 11G ( 799 euro )
GIGABYTE GeForce GTX 1080 Ti Gaming OC 11G ( 819 euro )
MSI GeForce GTX 1080 Ti Gaming X 11G ( 849 euro )
GIGABYTE AORUS GeForce GTX 1080 Ti 11G ( 849 euro )

I'm leaning toward the MSI.
I will do a volt mod; I just need a card that can deliver the power.

Can you peeps help me?

I'm replacing my 1070 Gaming X SLI with 1 single 1080 Ti.

I can get 750 euro locally for my 2 air-cooled cards, good deal I think?


----------



## TheBoom

Quote:


> Originally Posted by *SupernovaBE*
> 
> Is there a difference in 1080Ti cards for overclocking if its watercooled ?
> 
> I need to pick a card
> 
> ASUS ROG GeForce GTX 1080 Ti Strix Gaming 11G ( 799 euro )
> GIGABYTE GeForce GTX 1080 Ti Gaming OC 11G ( 819 euro )
> MSI GeForce GTX 1080 Ti Gaming X 11G ( 849 euro )
> GIGABYTE AORUS GeForce GTX 1080 Ti 11G ( 849 euro )
> 
> Im leaning to the MSI.
> I wil do a voltmod, i just need a card that can deliver the power.
> 
> Can you peeps help me ?
> 
> Im replacing my 1070 gaming X in SLI to 1 single 1080Ti.
> 
> I can get 750 euro local for my 2 cards on aircooling, good deal i think?


Volt mod or shunt mod? Shunt increases power but if you don't want to mod then the highest TDP card out of all you listed is the Aorus.

And if you are going to do the shunt mod you might as well get a founders edition I think.


----------



## SupernovaBE

Quote:


> Originally Posted by *TheBoom*
> 
> Volt mod or shunt mod? Shunt increases power but if you don't want to mod then the highest TDP card out of all you listed is the Aorus.
> 
> And if you are going to do the shunt mod you might as well get a founders edition I think.


Yeah, shunt mod, sorry, if necessary of course.

For the resale value and noise in the first weeks I would not go FE.

Are there better cards that have no need for a shunt mod? Cards that I don't have listed.

Ty!


----------



## feznz

Quote:


> Originally Posted by *KraxKill*
> 
> Put it this way, SLI bridge bandwidth has not increased significantly over the last few years. Faster cards only exacerbate this issue. To add to the bandwidth limitations, there are simply physical latency characteristics of the SLI interface that deal with frame timing and delivery that are impossible to remedy with current SLI bridging latencies.
> 
> SLI with ideal scaling in a hypothetical game, when it works, will get you from say 60fps on a single gpu to 100fps with sli but that 100fps will perform visually the way 75fps performs on a single card. A place you could easily get to without the headaches of SLI profiles, and associated frame judder.
> 
> IMO, you are better off spending the money on better cooling and getting the most out of a single card instead. If the Ti is not enough wait for Volta.
> 
> Sure, in the above scenario, you could water cool both cards and maybe squeeze 115fps out of it giving you the visual performance of say an 85fps single GPU system in terms of fluid performance but given the above, I find SLI very cost prohibitive even if affordable.
> 
> SLI is great for synthetic, scientific workloads, but for gaming I would stick with a single GPU. I will turn down settings before I deal with microstutter ever again. Back in the day, SLI was great, because delivering a frame at all superseded delivering the frame on time. I think now that we're able to push 60-100fps on a single card frame timing deserves more focus than simply rendering more frames.
> 
> *The above is my opinion and opinions differ. Sometimes pictures and graphs help.*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Much more in this thread and many more like it.
> https://forums.geforce.com/default/topic/946036/sli/gtx-1080-sli-stutter-at-144hz-even-with-new-hb-sli-bridge/


I'm not sure what your point is.

To sum it up, IMHO a 1080 Ti is far from a single-card 4K solution.
szeged is coming from an SLI setup, I believe; though he will be impressed, as I am, with the power of a 1080 Ti, the need to go SLI to max out 4K @ 60 FPS will still be there.
I would always recommend a single card over SLI, but a 1080 Ti is probably only a true 2K max-settings card.
Then again, it comes down to what settings someone is willing to settle for in game.


----------



## Aganor

Quote:


> Originally Posted by *KedarWolf*
> 
> See the video on how to do a custom voltage curve but you'd top it say a .885v 1900 Core or something like that.
> 
> http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20


Silly question, but I did not see this in the video you made: if I mess with the voltage curve, will it apply to the profile I selected?


----------



## cluster71

I sat reading 12 hours straight yesterday, trying to understand why the 7700K on Z270 performs so badly in the benchmarks here. There are discussions around the world with similar results. In 1080p tests the 7700K is usually at the top, and overclocking the processor helps. But at 4K resolution the 7700K is usually last, and it's often worse with overclocking.
Some suspect latency problems with DDR4 memory, but in tests the 7700K has the lowest latency, and so does mine: 38-39 ns with my 4133 memory, so that does not seem to be the problem. And when you compare true latency, most memory kits are on the same level.
I have seen tests of a 5960X with only 2 cores / 4 threads active at 4.3 GHz, and that was enough to keep a 1080 Ti fully loaded. And here in this thread, someone clocked a previous-generation CPU down to 1.9 GHz and got the same result in Superposition.

I feel like the CPU, GPU and RAM are okay, but the system chokes. Z270 nominally offers 24 PCIe lanes, split depending on your demand and configuration. But when you start reading about HSIO multiplexing, you realize that only 16 PCIe lanes are usable at any one time, and if you have Gigabit networking, that number drops to 15. I'm now sitting and reading the Intel 200 Series chipset document, all 286 pages. From what I read yesterday, several people suggested that motherboard manufacturers will find it difficult to solve this generation's problems now that more traffic is moving to the PCIe bus.

I do not think there is any point in tuning a 1080 Ti to chase 11000 scores in Superposition on Z270, because the problem is not there. And it is not certain the problem can be solved.

I'll keep using the hardware and reading on (I guess this creates discussion, but that's the point)
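Side note on the latency numbers: the 38-39 ns figure is the full round trip through the memory controller, while the CAS component alone can be back-computed from a kit's rating (CL x 2000 / MT/s), which is one reason most fast kits land in roughly the same place. A quick sketch with a couple of example kit ratings:

```python
def cas_ns(transfer_rate_mts, cl):
    # One memory clock = 2 transfers (DDR), so the clock period in ns
    # is 2000 / (MT/s); CAS latency in ns is CL clock periods.
    return cl * 2000.0 / transfer_rate_mts

print(round(cas_ns(4133, 19), 1))  # DDR4-4133 CL19: ~9.2 ns
print(round(cas_ns(3200, 16), 1))  # DDR4-3200 CL16: 10.0 ns
```

The remaining ~30 ns of the measured round trip comes from the fabric, system agent and controller, which kit speed barely touches.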


----------



## The Storm

Quote:


> Originally Posted by *szeged*
> 
> thanks, pretty much what i was looking for.
> 
> ill wait. Id rather have all the way there instead of almost there lol.
> 
> cooling is not an issue neither is SLI.
> 
> After quad titan x i wont be making that mistake again...


Coming from 3 R9-290x's, I kept thinking....I will wait, then the next cards hit, nah I will wait, I finally decided all I am doing is waiting for the next thing and not getting anything. There will always be the next big thing around the corner. I pulled the trigger on a new 1080ti and put a water block on it, sold the 3 290x's and couldn't be happier. I am glad that I am done waiting.


----------



## KraxKill

Quote:


> Originally Posted by *cluster71*
> 
> I sat reading 12 hour straight yesterday and trying to understand why the 7700k Z270 performs so badly in the benchmark here. There are discussions around the world, and similar results. When looking at 1080p tests, 7700K is usually the top, and it helps to overclock the processor. But when it comes to 4K resolution, 7700k is usually the last and it's usually worse with overclocking.
> Some believe that latency problems with DDR4 memory, but in tests, 7700K has the lowest latency and I have it with my memories. I have 38-39 ns 4133 mem on my so it does not seem to be the problem. And when you talk about True Latensy, most memories are on the same level.
> I have seen tests 5960x where you only activate 2 cores 4 threads at 4.3 GHz and that's enough for 1080Ti to go full. And here on this thread, someone who clocked down its CPU of previous model to 1.9 GHz and got the same result in Superposition.
> 
> I feel like CPU GPU Ram is okay but that the system chokes. There are now 24 available pcie lines Z270 to be split depending on your demand or configuration. But when you start reading about HSIO Multiplexing, you realize that only 16 Pcie lines are available at any time, and if you have Gigabit networks, that's the number 15. I'm sitting now and reading the Intel 200 Series Chipset document 286 pages. But what I read yesterday, several suggested that motherboards manufacturers will find it difficult to solve the problems of this generation, now that more is being transferred to the pcie bus.
> 
> I do not think there's any idea to trim 1080 Ti and chase 11000 scores in Superposition with the Z270, because the problem is not there. It is not certain that problems can be solved.
> 
> I use the stuff and read on (I guess this creates discussions, but that's the point)


Good post. Lots of info for me to follow up on in my own research. You have explored a lot in trying to figure out what it is that separates the chips on the Ti under a gaming workload.

As you suggest, I think that memory latencies along with the supporting subsystems could still be the main issue, where the underlying architecture is the main limiter.

As you pointed out, fast DDR4 is around 40 ns, but this is a bit misleading. Actual processing latency differs based on the size of the data set and the miss rate from cache to RAM.

A large data set (like a 2.5K-8K frame in a game or benchmark) will overwhelm the L1, L2 and L3 caches on all but one CPU. When this happens, the data flows through the system agent, then the memory controller, and only then to that 'fast' DDR4. The way I understand it, this eats CPU cycles in dealing with the system agent and memory controller.

The 5775C, for example, is capable of driving the 1080 Ti to 10900+ in Superposition 4K. It has 128 MB of L4 at ~40 ns, and because it's not constantly dealing with the system agent or the memory controller, there are advantages here. For one, it can handle the majority of the workload within the L1-L4 cache space. On the 5775C, the L4 is effectively acting like a slower L3, but it's still faster than having to swap to RAM through the memory subsystem and eating up clock ticks.

Then when you consider that the PCIe lane the GPU is on is also sharing time and competing for system agent time, the consistency of those low DDR4 latencies and the bandwidth behind the memory controller becomes problematic in latency-dependent applications like frame timing.

This is partly why many VR system builders are now snapping up the 5775C for the Z97 platform.

There is more info in the 5775c owners thread. Feel free to pop in. Plenty of folks there exploring this.
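The cache argument above can be put in rough numbers with the standard average-access-time formula; the hit rates and latencies below are hypothetical placeholders, just to show how a larger working set (more misses) drags effective latency toward DRAM speed:

```python
def avg_access_ns(hit_rate, cache_ns, dram_ns):
    # Average memory access time: hits served from cache, misses go
    # through the system agent / memory controller out to DRAM.
    return hit_rate * cache_ns + (1 - hit_rate) * dram_ns

# Hypothetical numbers: ~12 ns cache hit vs ~70 ns full DRAM round trip
print(round(avg_access_ns(0.95, 12, 70), 1))  # small working set: 14.9 ns
print(round(avg_access_ns(0.60, 12, 70), 1))  # large frame data: 35.2 ns
```

A big L4 behind L3 raises the effective hit rate for frame-sized data sets, which is the 5775C's whole trick.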


----------



## cluster71

Thanks, I'll read on and keep testing


----------



## ZealotKi11er

Quote:


> Originally Posted by *The Storm*
> 
> Coming from 3 R9-290x's, I kept thinking....I will wait, then the next cards hit, nah I will wait, I finally decided all I am doing is waiting for the next thing and not getting anything. There will always be the next big thing around the corner. I pulled the trigger on a new 1080ti and put a water block on it, sold the 3 290x's and couldn't be happier. I am glad that I am done waiting.


The GTX 1080 Ti was the right card to upgrade to from a 290X.


----------



## TheBoom

Quote:


> Originally Posted by *SupernovaBE*
> 
> Yeah, shunt mod, sorry, if necessary of course.
> 
> For the resale value and noise in the first weeks I would not go FE.
> 
> Are there better cards that have no need for a shunt mod? Cards that I don't have listed.
> 
> Ty!


Cards with the highest TDPs out of the box are the Aorus, Zotac AMP Extreme and Galax HOF.

The XOC bios is somewhat an alternative to the shunt mod. At the end of the day power limits are only one part of the equation.

If you happen to win the silicon lottery and get limited by power, then you can either shunt mod or use the XOC bios.

Honestly I wouldn't recommend either unless you are just shooting for highest possible clocks without much care for performance.

Even then your focus should be more on finding the best cooling for the card, probably watercooling as it has proven to have the biggest impact time and again for the 1080ti and probably all pascal cards.
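For context on what the shunt mod actually does: the card estimates current from the voltage drop across a shunt resistor of known value, so lowering the effective resistance (e.g. liquid metal in parallel with the shunt) makes it under-read its own power draw and hit its limit later. A sketch with made-up resistor values:

```python
def reported_power_w(actual_power_w, r_stock_mohm, r_modded_mohm):
    # The controller computes current as V_drop / R_assumed using the
    # stock shunt value; if the real resistance is lower, the measured
    # drop shrinks and power is under-reported by the same ratio.
    return actual_power_w * r_modded_mohm / r_stock_mohm

# Hypothetical numbers: 5 mOhm stock shunt, liquid metal halving it
print(reported_power_w(300, 5.0, 2.5))  # card "sees" 150.0 W while drawing 300 W
```

Which is also why a shunted card's software power readings become meaningless: they are off by exactly the resistance ratio.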


----------



## dallas1990

Quote:


> Originally Posted by *ZealotKi11er*
> 
> GTX 1080 Ti was right card to upgrade from 290X.


That's what I did, upgraded from two 8 GB 290Xs. They were really good cards but they love the power lol.


----------



## GraphicsWhore

Quote:


> Originally Posted by *szeged*
> 
> thanks, pretty much what i was looking for.
> 
> ill wait. Id rather have all the way there instead of almost there lol.
> 
> cooling is not an issue neither is SLI.
> 
> After quad titan x i wont be making that mistake again...


I'm compelled to point out that you will never be "all the way there." We all know that games become progressively more demanding. Meanwhile, GPUs don't improve by incredible leaps and bounds over the previous, top model.

By the time a GPU comes out that runs all current games in 4k/60fps, games releasing at the same time or shortly after that GPU may be out of reach for 4k and high FPS (depending on settings).

I just don't see the point in that sense.


----------



## cluster71

I have previously had quad SLI (2x GTX 590, 2x GTX 690) and 4-way SLI (4x GTX Titan v1). It was always fiddly, though it improved in recent years with drivers. But it was sad when many games did not support SLI. Sometimes you can force an SLI configuration, but it usually gets worse. Sad to run on one GPU and let the others tag along...
This time it became one card, and I'll change more often instead.


----------



## nrpeyton

Quote:


> Originally Posted by *Quadrider10*
> 
> yea, the evga goes better with my build. but what it seems like is that all these cards are hitting pretty much the same clock speeds and temps... give or take a few MHz/degrees, and silicone lottery.
> 
> what about modding these cards? which one would be best?


If you plan on modding -- I'd definitely go with EVGA.

You're allowed to water-cool. You're allowed to remove the stock heatsink. And you're allowed to re-paste. All while keeping your warranty intact. It's EVGA warranty policy. More relaxed than most.

So if you plan on doing a power mod that you can remove afterwards (i.e. liquid metal on the shunts), or adding resistors to capacitors using a little silver paint, then you're good. In other words, as long as you can return the card to its original condition, you won't be left out to dry if a manufacturing flaw pops up further down the line.

Quote:


> Originally Posted by *feznz*
> 
> I'm not sure what your point is.
> 
> To sum it up, IMHO a 1080 Ti is far from a single-card 4K solution.
> szeged is coming from an SLI setup, I believe; though he will be impressed, as I am, with the power of a 1080 Ti, the need to go SLI to max out 4K @ 60 FPS will still be there.
> I would always recommend a single card over SLI, but a 1080 Ti is probably only a true 2K max-settings card.
> Then again, it comes down to what settings someone is willing to settle for in game.


sorry I disagree.... everything I've played so far works beautifully at 4k Ultra.

-Witcher 3.........65-75 FPS
-Ashes of the Singularity.......CPU dependent, but I've seen wayyy more than needed for an RTS
And now:
-Gears of War 4.........the first game to feature *native* 4K textures & lighting, and also the biggest download of any game I've ever witnessed (at 101 GB). 55-75 FPS

Gears of War 4 also has the heaviest *advertised* system requirements I've ever seen. They actually recommend a minimum of a GTX 1080 for *ideal* gameplay.

It also taxes my card really well. Even at a voltage of only 1.013v @ 1987 MHz, my 1080 Ti is still pulling 300+ watts continuously, with spikes as high as 370 watts.

For me the 1080 Ti is truly the first solo 4K-ultra card.

I'd be interested to know of any games that are an exception. (If they even exist)?
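Those 300+ W readings line up roughly with the usual dynamic-power rule of thumb, P is proportional to V^2 * f (leakage ignored); a quick sketch, where the second operating point is a hypothetical example rather than a measurement:

```python
def scaled_power(p0_w, v0, f0_mhz, v1, f1_mhz):
    # Dynamic power scales roughly as C * V^2 * f, so scale a known
    # operating point by the voltage ratio squared and frequency ratio.
    return p0_w * (v1 / v0) ** 2 * (f1_mhz / f0_mhz)

# If the card draws ~300 W at 1.013 V / 1987 MHz, estimate the draw
# at a hypothetical 1.062 V / 2050 MHz operating point:
print(round(scaled_power(300, 1.013, 1987, 1.062, 2050)))  # ~340
```

Which is why a modest undervolt at near-stock clocks sheds a disproportionate amount of heat.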


----------



## cluster71

I have 3440x1440 G-Sync, it feels okay. I game a lot of Sniper Elite 4, Ultra + 8x supersampling, but it is not that fast a game


----------



## ZealotKi11er

Quote:


> Originally Posted by *nrpeyton*
> 
> If you plan on modding -- I'd definitely go with EVGA.
> 
> You're allowed to water-cool. You're allowed to remove the stock heatsink. And you're allowed to re-paste. All while keeping your warranty intact. It's EVGA warranty policy. More relaxed than most.
> 
> So if you plan on doing a power mod that you can remove afterwards (I.E. liquid metal on the shunts). Or adding resistors to capacitors using a little silver paint. Then you're good. In other words-- as long as you can return the card to it's original condition. Then you won't be left out to dry if a manufacturing flaw pops up further down the line.
> 
> sorry I disagree.... everything I've played so far works beautifully at 4k Ultra.
> 
> -Witcher 3.........65-75 FPS
> -Ashes of Singularity.......CPU dependant but I've seen wayyy more than needed for RTS
> And now:
> -Gears of War 4.........the first game to feature *native* 4k textures & lighting; and also the biggest download of any game I've ever witnessed (at 101 GB). 55-75 FPS
> 
> Gears of War 4 also has the hardest *advertised* CPU system requirements I've ever seen.. They actually recommend a minimum of a GTX 1080 for *ideal* gameplay.
> 
> It also taxes my card really well too.. Even at a voltage of only 1.013v @ 1987 Mhz my 1080Ti is still pulling 300 ++ watts continiously with spikes as high as 370 watts.
> 
> For me the 1080 Ti is truly the first 4k ultra, solo card.
> 
> I'd be interested to know of any games that are an exception. (If they even exist)?


I am playing Metro LL and it's not really 4K ready. I drop to the 30s sometimes and it feels very laggy.


----------



## Dasboogieman

Quote:


> Originally Posted by *KraxKill*
> 
> Good post. Lots of info for me to follow up on in my own research. You have explored a lot in trying to figure out what it is that separtates the chips on the Ti under a gaming workload.
> 
> As you suggest, I think that memory latencies along with the supporting subsystems could still be the main issue where underlying architecture is the main limiter.
> 
> As you pointed out fast DDR4 is around 40ns but this is a bit misleading. Actual processing latency is different based on the size of the data set and the miss rate from cache to ram.
> 
> A large data set (like a 2.5k-8k frame in a game or benchmark) will overwhelm L1-L2 and L3 cache on all but one CPU. When this happens the data flows through the system agent, then the memory controller and only then to that 'fast' DDR4. The way I understand it, is this eats CPU cycles in dealing with the system agent and mem controller.
> 
> The 5775c for example is capable of driving the 1080ti to 10900+ in SuperP4k has 128mb of L4 at ~40ns but because it's not dealing with the system agent or the memory controller. There are advantages here. For one, it can handle the majority of the workload within the L1-L4 cache space. On the 5775C, the L4 is effectively acting like a slower L3 but still faster than having to swap to RAM through the memory sub system and eating up clock ticks.
> 
> Then when you consider that the the PCIe lane the GPU is in is also sharing time and competing for agent time the consistency of those low DDR4 latencies and bandwidth behind the memory controller become problematic in latency dependent applications like frame timing.
> 
> This is partly why many VR system builders are now snapping up the 5775c for the z97 platform.
> 
> There is more info in the 5775c owners thread. Feel free to pop in. Plenty of folks there exploring this.


The 5775C's L4 cache also allows additive bandwidth with the DDR3 to feed the cores. This is effectively 100 GB/s+ of bandwidth at sub-40 ns latency, provided the data resides in both the L4 and the DDR3 concurrently, allowing the kind of IPC the 7700K will have trouble matching. No DDR4 system, except maybe Skylake-X, can match these numbers.

However, this huge demand is a good thing. It sends a clear signal to both Intel and AMD that there is significant demand for lower-core-count but eDRAM-assisted CPUs. The problem is that these CPUs would eat into their higher-profit-margin units due to sheer performance.


----------



## nrpeyton

Quote:


> Originally Posted by *cluster71*
> 
> I sat reading 12 hour straight yesterday and trying to understand why the 7700k Z270 performs so badly in the benchmark here. There are discussions around the world, and similar results. When looking at 1080p tests, 7700K is usually the top, and it helps to overclock the processor. But when it comes to 4K resolution, 7700k is usually the last and it's usually worse with overclocking.
> Some believe that latency problems with DDR4 memory, but in tests, 7700K has the lowest latency and I have it with my memories. I have 38-39 ns 4133 mem on my so it does not seem to be the problem. And when you talk about True Latensy, most memories are on the same level.
> I have seen tests 5960x where you only activate 2 cores 4 threads at 4.3 GHz and that's enough for 1080Ti to go full. And here on this thread, someone who clocked down its CPU of previous model to 1.9 GHz and got the same result in Superposition.
> 
> I feel like CPU GPU Ram is okay but that the system chokes. There are now 24 available pcie lines Z270 to be split depending on your demand or configuration. But when you start reading about HSIO Multiplexing, you realize that only 16 Pcie lines are available at any time, and if you have Gigabit networks, that's the number 15. I'm sitting now and reading the Intel 200 Series Chipset document 286 pages. But what I read yesterday, several suggested that motherboards manufacturers will find it difficult to solve the problems of this generation, now that more is being transferred to the pcie bus.
> 
> I do not think there's any idea to trim 1080 Ti and chase 11000 scores in Superposition with the Z270, because the problem is not there. It is not certain that problems can be solved.
> 
> I use the stuff and read on (I guess this creates discussions, but that's the point)
> 
> 
> Spoiler: Warning: Spoiler!


Interesting.. how would you sum all that up then? I mean as a i7 7700k owner is there anything I can do?

I don't believe I am using any other PCIe lanes. The only things connected to the system are mouse, keyboard, GPU & wired LAN _(so not even so much as a wireless card)_, and my SSDs are connected via SATA along with my HDD.
My memory is already running at 4000 MHz 18-19-19-39, with a daily OC of 4.9 GHz (or 5 GHz for benches).


----------



## Dasboogieman

Quote:


> Originally Posted by *nrpeyton*
> 
> Interesting.. how would you sum all that up then? I mean, as an i7 7700K owner, is there anything I can do?
> 
> I don't believe I am using any other PCIe lanes. The only things connected to the system are mouse, keyboard, GPU & wired LAN. _(so not even so much as a wireless card)_ And my SSDs are connected via SATA along with my HDD.
> My memory is already running at 4000 MHz 18-19-19-39, with a daily OC of 4.9 GHz (or 5 GHz for benches).


Cache is king for raw frame times, either get a 5775c or a HEDT HCC CPU that has massive amounts of cache.
As a 7700K owner, get the fastest DDR4 you can get, make sure you have the FCLK at 1000mhz and pray lol.


----------



## nrpeyton

Quote:


> Originally Posted by *Dasboogieman*
> 
> Cache is king for raw frame times, either get a 5775c or a HEDT HCC CPU that has massive amounts of cache.
> As a 7700K owner, get the fastest DDR4 you can get, make sure you have the FCLK at 1000mhz and pray lol.


System Agent is at 1000 MHz and DDR4 is at 4000 MHz CL18. Running at 1.4v lol _(overclocked from 3600 MHz)_

The only thing I've not overclocked is the cache, as I heard 100 MHz on the core gives more than 1000 on the cache. So little point.


----------



## Dasboogieman

Quote:


> Originally Posted by *nrpeyton*
> 
> System Agent is at 1000 MHz and DDR4 is at 4000 MHz CL18. Running at 1.4v lol


Then GG, I'm out of ideas lol. The 5775c is good but not good enough for me to downgrade from Z270. I'm willing to trade slightly less gaming CPU competency for more up to date platform relevancy.


----------



## cluster71

I do not know, I have not pinpointed the problem. Ryzen might have satisfied me otherwise, but I have two M.2 SSDs. I wonder what happens if I disable one, because I cannot access it; it is on the underside of the MB.

How does "high performance" affect energy savings? Are PCIe lanes reserved all the time when you force PCIe 3.0 in the BIOS, or is that out of our control? The system probably takes care of it, otherwise it would have locked up. And it does change between 3.0 and 1.1 in GPU-Z.


----------



## cluster71

MSI Afterburner draws some CPU resources, especially if you log a lot. I have been testing "interleaved hardware polling mode" for a few days. From the changelog:

Added experimental interleaved hardware polling mode, aimed to reduce hardware polling time on the systems with multiple polled I2C devices. When interleaved polling is enabled, just a part of hardware monitoring data sources is being polled on each hardware polling period, so it takes multiple periods to refresh all monitoring data sources. Power users may enable interleaved hardware polling mode via the configuration file if necessary

MSIAfterburner.cfg


----------



## KraxKill

Quote:


> Originally Posted by *nrpeyton*
> 
> System Agent is at 1000 Mhz and DDR4 is at 4000 Mhz CS 18. Running at 1.4v lol _(overclocked from 3600 Mhz)_
> 
> Only thing I've not overclocked is the cache. As I heard 100 Mhz on the core gives more than 1000 on the cache. So little point.


> I don't have your system but use the same methodology on mine. If I were in your position, I would explore how tightening DDR4 timings while lowering DDR4 frequency (lower bandwidth) affects your "gaming" performance (Superposition 4K). Ideally you want the tightest timings you can run while maintaining sufficient bandwidth for a given workload by keeping a reasonable DDR4 frequency. Too much bandwidth is a waste if it's killing timings, and vice versa.

You may also want to try FRAPS along with FRAPS bench viewer to measure your frame-times at tight timing with lower bandwidth vs loose timing and high bandwidth. The sweet spot is probably somewhere in the middle.


----------



## KraxKill

This is DDR3, but it shows the relationship between speed vs latency and how the two combine in terms of performance. This table will change depending on the dataset size of the workload involved. Each game will have a different dataset size; each "bench" will have its own. So when configuring a system for crunching through h264 encodes you may prefer bandwidth, but when running games you may prefer latency, provided satisfactory bandwidth exists. Good luck!


----------



## cluster71

Will try it later. Installed Thaiphoon and Memtest Pro. Would like to avoid trial and error and have a clue about what timings to start with.


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> sorry I disagree.... everything I've played so far works beautifully at 4K Ultra.
> 
> -Witcher 3.........65-75 FPS
> -Ashes of the Singularity.......CPU dependent, but I've seen way more than needed for an RTS
> And now:
> -Gears of War 4.........the first game to feature *native* 4K textures & lighting, and also the biggest download of any game I've ever witnessed (at 101 GB). 55-75 FPS
> 
> Gears of War 4 also has the highest *advertised* system requirements I've ever seen.. They actually recommend a minimum of a GTX 1080 for *ideal* gameplay.
> 
> It also taxes my card really well too.. Even at a voltage of only 1.013v @ 1987 MHz my 1080 Ti is still pulling 300+ watts continuously, with spikes as high as 370 watts.
> 
> For me the 1080 Ti is truly the first 4K-ultra solo card.
> 
> I'd be interested to know of any games that are an exception. (If they even exist)?


I found Witcher 3 not very demanding compared to Rise of the Tomb Raider.
With SMAA it's OK, but as soon as you max the settings with SMAA x2 or SMAA x4, ROTTR is playable but not enjoyable.
GTA 5 with ALL settings maxed out I get 18 FPS.

Again, I also find it hard to believe some people get so obsessed with getting the last 25 MHz out of their GPU for..... gaming; if it were for benching I could understand.
As I said, it comes down to what settings you are willing to settle for. GTA 5 is not noticeably different with everything maxed, whereas in ROTTR I do notice the difference between SMAA and SMAA x2.
These are my findings on a 3840x1600 (roughly "3K") monitor.

Then again, in Superposition I find it hard to distinguish the visual difference between 4K optimized and 1080p extreme.


----------



## cluster71

Quote:


> Originally Posted by *KraxKill*
> 
> I don't have your system but use the same methodology on mine. If I was in your position, I would explore how tightening DDR4 timings by lowering DDR4 frequency (lower bandwidth) affects your "gaming" performance (Superposition 4K). Ideally you want the tightest timings you can run while maintaining sufficient bandwidth for a given workload by keeping a reasonable DDR4 frequency. Too much bandwidth is a waste if it's killing timing and vis a versa.
> 
> You may also want to try FRAPS along with FRAPS bench viewer to measure your frame-times at tight timing with lower bandwidth vs loose timing and high bandwidth. The sweet spot is probably somewhere in the middle.


I think I found FRAPS bench viewer: https://sourceforge.net/u/raffriff42/profile/
I do not know what the other one looks like.

Found it


----------



## ocCuS

So yesterday I got my 1080 ti Hall of Fame and today I had a little time to test the card. Btw this is the "normal" 1080 ti Hall of Fame, not the extra cherry picked limited edition / 8pack approved HOF card.



it comes with the following specs:
GPU base: 1733MHz
Boost: 1873MHz
(900€ btw)

Out of the box the card boosts under load to 1987mhz (51°C / fan speed 45%).

So today I pushed the card as far as possible, wanting to find the numbers where all benchmarks are 100% stable (not the max for every benchmark individually):

(fan speed during testing always at 100% / power limit 128% / core voltage +100mV):

GPU Clock: +88mhz --> 2075mhz max
Mem Clock: +539mhz --> 6048mhz max
Temps: 58-60°C



With these settings I got the following scores:

Superposition:
1080p: 6139 points
4k optimized: 10122 points
8k optimized: 4627 points

TimeSpy: 9754
FireStrike Extreme: 13546
FireStrike: 22308

Heaven:
1080p extreme: 3808 (151.2 fps)
1440p extreme: 2460 (97.6 fps)

Valley:
1080p extreme: 5279 (126.2 fps)
1440p extreme: 3917 (93.6 fps)

(Rest of my system: 5820k @4.4ghz, 32GB ddr4 ram @2400mhz, 950pro m.2 SSD)

On air I think 2075 MHz / 6048 MHz are the 100% stable numbers in every benchmark.
When I increase the GPU clock, most benchmarks will crash, and when I increase the memory, I get artifacts.
So over the next few days I will play a little with the curves and try to get 2088 or even 2101 MHz stable in a few benchmarks.


----------



## KraxKill

Quote:


> Originally Posted by *cluster71*
> 
> I found FRAPS bench viewer i think https://sourceforge.net/u/raffriff42/profile/
> I do not know what the other one looks like


I think that's it. Looks a bit different from when I downloaded it.

FRAPS is FRAPS, and the viewer is just so you can run FRAPS logs through it easily.


----------



## ZealotKi11er

Quote:


> Originally Posted by *feznz*
> 
> I found Witcher 3 not very demanding compared to Rise of the Tomb Raider.
> With SMAA it's OK, but as soon as you max the settings with SMAA x2 or SMAA x4, ROTTR is playable but not enjoyable.
> GTA 5 with ALL settings maxed out I get 18 FPS.
> 
> Again, I also find it hard to believe some people get so obsessed with getting the last 25 MHz out of their GPU for..... gaming; if it were for benching I could understand.
> As I said, it comes down to what settings you are willing to settle for. GTA 5 is not noticeably different with everything maxed, whereas in ROTTR I do notice the difference between SMAA and SMAA x2.
> These are my findings on a 3840x1600 (roughly "3K") monitor.
> 
> Then again, in Superposition I find it hard to distinguish the visual difference between 4K optimized and 1080p extreme.


I also do not notice much of a difference. 25 MHz is tiny when you are at 2000 MHz+. Even 100 MHz there is like 50 MHz at 1 GHz, which gives you maybe 2-3% more fps; that's nothing and you will never notice it.


----------



## methadon36

Just added the EK backplate to my EVGA SC Black Edition. I like the copper block for this card; I prefer it over plated.


----------



## nrpeyton

Quote:


> Originally Posted by *KraxKill*
> 
> This is DDR3, but it shows the relationship between speed vs latency and how the two combine in terms of performance. This table will change depending on the workload involved in terms of dataset size. Each game will have a different data set size, each "bench" will have it's own. So when configuring a system for crunching though h264 encodes you may prefer bandwidth but when processing games you may prefer latency provided satisfactory bandwidth exists. Good luck!


By following the equation on that table I would have an effective latency of "9" with my current DDR4 settings.

*Speed x 0.5 = clock (MHz)*
4000 x 0.5 = 2000

*tCAS / clock x 1000 = ns*
18 / 2000 x 1000 = 9

9 is only *yellow* on the table.. so maybe you're right.. I'll possibly have a look at aiming for something a bit lower.
May even just see if I can reduce the tCAS to 16. Maybe another little voltage boost to the RAM.

It's only at 1.4v just now, and the absolute highest DDR4 is allowed to go for XMP certification is 1.5v. _So I've still got 0.1v to play with._
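That arithmetic is easy to sanity-check with a couple of lines (a minimal sketch, assuming only that DDR transfers twice per clock, so the real clock in MHz is half the MT/s rating; the function name is mine):

```python
def true_latency_ns(speed_mts: float, tcas: int) -> float:
    """Effective CAS latency in nanoseconds.

    DDR transfers twice per clock, so the real clock in MHz is
    half the MT/s rating; latency is CAS cycles divided by clock.
    """
    clock_mhz = speed_mts * 0.5      # e.g. 4000 MT/s -> 2000 MHz
    return tcas / clock_mhz * 1000   # cycles / MHz -> nanoseconds

print(true_latency_ns(4000, 18))  # current setting: 9.0 ns
print(true_latency_ns(4000, 16))  # dropping to CL16 would give 8.0 ns
```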

*Random question for everyone:*

Anyone bothered with VR?









Is it worth it? The cost to set it up is equivalent to buying another 1080 Ti

Quote:


> Originally Posted by *methadon36*
> 
> Just added the EK backplate to my EVGA Black edition SC. I like the copper block for this card, i prefer it over plated.
> 
> 
> Spoiler: Warning: Spoiler!


Your system looks nice awesome. Very clean and tidy + some nice lighting going on there, too 

Did the backplate do anything for temps? _(I've still not bothered with backplate yet)_

While gaming, I just noticed in HWINFO64 I'm still getting a live temperature reading for *"*CPU GT Cores (Graphics)*"*.

Is that normal -- or do I need to _physically_ disable IGPU in BIOS to free up some cache or other resources? e.t.c _(forgive me if this is a total newb question -- but I only came across to intel after 10 years of AMD a few months ago -- and I'm still brainstorming why my scores are so low)._
Obviously I don't have any cables plugged into it. Everything is plugged into my 1080Ti.


----------



## bmgjet

Quote:


> Originally Posted by *methadon36*
> 
> Just added the EK backplate to my EVGA Black edition SC. I like the copper block for this card, i prefer it over plated.
> 
> 
> Spoiler: Warning: Spoiler!


I just re-used my stock back plate on the SC Black.
Gets quite a bit of warmth to it.


----------



## methadon36

Quote:


> Originally Posted by *nrpeyton*
> 
> By following the equation on that table I would have an effective latency of "9" with my current DDR4 settings.
> 
> *Speed x 0.5 = clock (MHz)*
> 4000 x 0.5 = 2000
> 
> *tCAS / clock x 1000 = ns*
> 18 / 2000 x 1000 = 9
> 
> 9 is only *yellow* on the table.. so maybe you're right.. I'll possibly have a look at aiming for something a bit lower.
> May even just see if I can reduce the tCAS to 16. Maybe another little voltage boost to the RAM.
> 
> It's only at 1.4v just now, and the absolute highest DDR4 is allowed to go for XMP certification is 1.5v. _So I've still got 0.1v to play with._
> 
> *Random question for everyone:*
> 
> Anyone bothered with VR?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is it worth it? The cost to set it up is equivalent to buying another 1080 Ti
> 
> Your system looks nice awesome. Very clean and tidy + some nice lighting going on there, too
> 
> Did the backplate do anything for temps? _(I've still not bothered with backplate yet)_
> 

Thank you! I didn't see any improvement in temps. The backplate has four spots where thermal pads are used. I did it more for looks and to protect the PCB. I still need to freshen up the wire management and deal with dust. I just tinted the side window to keep the lighting to a minimum at night when I'm sleeping.


----------



## dansi

My findings on core temp:
-Below 45°C, it will boost to your top bin +1. Say you set your OC to flat-line at 2025 in AB; the 1080 Ti will boost to 2037.

-46-55°C, it will boost to your set core at 2025, and the voltage will go +1 bin; say you start the flat line from 2025@1.050v, this temp range will boost the voltage to 1.062v.

-56-64°C, this is the range where the core and voltage run at the flat-line point, 2025@1.050v. Providing there are no power limits, you want to keep temps at 64°C max.
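dansi's temperature ranges boil down to a small lookup. A sketch of the behavior he describes (the ranges and the ~12 MHz bin step are his observations on his card, not an official GPU Boost spec; the function name is mine):

```python
def boost_behavior(temp_c: float, flat_clock: int = 2025) -> str:
    """Summarize dansi's observed GPU Boost behavior vs. core temp.

    2037 is flat_clock plus one ~12 MHz boost bin; the temperature
    ranges are observations from one card, not a spec.
    """
    if temp_c <= 45:
        return f"boosts one bin above target: {flat_clock + 12} MHz"
    if temp_c <= 55:
        return f"holds {flat_clock} MHz with voltage one bin higher"
    if temp_c <= 64:
        return f"runs the flat-line point: {flat_clock} MHz"
    return "above 64C: thermal/power limits start pulling bins"

print(boost_behavior(40))  # boosts one bin above target: 2037 MHz
```

As bmgjet's reply below shows, the exact breakpoints vary from card to card.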


----------



## bmgjet

Quote:


> Originally Posted by *dansi*
> 
> My findings on core temp:
> -Below 45°C, it will boost to your top bin +1. Say you set your OC to flat-line at 2025 in AB; the 1080 Ti will boost to 2037.
> 
> -46-55°C, it will boost to your set core at 2025, and the voltage will go +1 bin; say you start the flat line from 2025@1.050v, this temp range will boost the voltage to 1.062v.
> 
> -56-64°C, this is the range where the core and voltage run at the flat-line point, 2025@1.050v. Providing there are no power limits, you want to keep temps at 64°C max.


Mine only boosts above the set core if I'm under 25°C (which is the first start-up of a game, for about a minute).
From 25-32°C it holds where it's set.
Above 32°C it moves up a voltage bin until it reaches 38°C, where it drops its first clock bin.
From there on it drops a bin every 4°C.


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> *Random question for everyone:*
> 
> Anyone bothered with VR?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is it worth it? The cost to set it up is equivalent to buying another 1080 Ti


Too immature; it kind of reminds me of playing the old Wii. Very basic graphics; they basically need to nail 10x the pixels per inch currently available.
But it is very cool if you can get past the pixelated graphics. VERY immersive.

Personally I wouldn't buy one, but it's a great toy to show off if you had money to burn.

Did you get around to trying lower latencies in Timespy to see if you get a little boost?
Regarding KraxKill's graph, where is the source? It should explain the reasoning behind it.
I remember a big argument a long time back in the Valley thread where a guy was accused of cheating; it took me a long time to work out where the "secret" system boost was coming from. In short, he was not cheating.

I guess if I have an obsession, it is maxing every last setting in a game to know that no visual eye candy is lost.
Quote:


> Originally Posted by *KraxKill*
> 
> This is DDR3, but it shows the relationship between speed vs latency and how the two combine in terms of performance. This table will change depending on the workload involved in terms of dataset size. Each game will have a different data set size, each "bench" will have it's own. So when configuring a system for crunching though h264 encodes you may prefer bandwidth but when processing games you may prefer latency provided satisfactory bandwidth exists. Good luck!


----------



## Streetdragon

I could get a Palit Founders 1080 Ti for €627 new.... don't know if I should pull the trigger.... Heatkiller block on it.

The shunt mod only gives a bit of headroom on the power limit, right? To get higher stable clocks.

Or the AORUS for €719..

The OP could use a list of the cards and the waterblocks for them.. maybe a list of the best card+waterblock for the €/$.


----------



## Mrip541

After having issues with 2 Gigabyte cards, I switched it up and ordered an EVGA. I forgot to update my shipping address and the card was sent to my old apartment across town. The super said he didn't remember seeing any package for me but would look around. I'm preparing to flip out and think the universe is trying to tell me something.


----------



## Mad Pistol

*sigh*

The Destiny 2 bundle hit just after I purchased the card, and all Amazon would do is give me a $15 promotional credit.









I understand that I could return the card and repurchase it, but honestly, I just don't feel like going through the hassle. I guess I'll just have to pay full price on Destiny 2.


----------



## TheBoom

Quote:


> Originally Posted by *Mad Pistol*
> 
> *sigh*
> 
> The Destiny 2 bundle hit just after I purchased the card, and all Amazon would do is give me a $15 promotional credit.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I understand that I could return the card and repurchase it, but honestly, I just don't feel like going through the hassle. I guess I'll just have to pay full price on Destiny 2.


Well I think I would swap For Honour for Destiny 2 if I had the chance lol.


----------



## Coopiklaani

Quote:


> Originally Posted by *TheBoom*
> 
> Well I think I would swap For Honour for Destiny 2 if I had the chance lol.


I redeemed GR Wildlands since I already had For Honour. Played it for 2 hrs on the first day, never touched it since.








I really wish I could swap it for Destiny 2 Early access.


----------



## cluster71

Yesterday we discussed the 7700K on Z270 a little. KraxKill suggested testing with high RAM clocks and then lower clocks with tightened timings, then giving them some workload and measuring with FRAPS if possible.

I have G.Skill F4-4266C19D-16GTZR [4266MHz 19-19-19-39-2N 1.40v]. They are not stable on my MB, not even with 1.5v. It's like a wall at 4273; the slightest bus swing is enough. Maybe it's the odd-ratio modes in the BIOS. Several people have the same problem. I do not see them as "true" 4266 memory, because everything else in the series runs at 1.35v. These have an XMP profile of 4133 with a x103.2 clock at 1.40v, like handpicked 4133 memory with a tweaked profile. I usually run them at 4133 17-18-18-38 CR2.

I fiddled for a couple of hours yesterday with different timings: 4234 19-19-19-39 CR2 / 4133 17-18-18-38 CR2 / 3000 13-13-13-28 CR1.
I ran Superposition 4K, Timespy, and tried a little with FRAPS. But the graphics score in Timespy (10900-11000) and Superposition (10000-10200) was as before, and the frametime tests I did were almost identical.

Had I tested 1080p maybe I would have seen something, but that's not interesting. I gave up then because I'm not well.
I've stayed on 3000 13-13-13-28 CR1, where they are completely stable with Memtest Pro, not a single error. I'll game some tonight and see if I notice any improvement.


----------



## TheBoom

Quote:


> Originally Posted by *cluster71*
> 
> Yesterday we discussed the 7700K on Z270 a little. KraxKill suggested testing with high RAM clocks and then lower clocks with tightened timings, then giving them some workload and measuring with FRAPS if possible.
> 
> I have G.Skill F4-4266C19D-16GTZR [4266MHz 19-19-19-39-2N 1.40v]. They are not stable on my MB, not even with 1.5v. It's like a wall at 4273; the slightest bus swing is enough. Maybe it's the odd-ratio modes in the BIOS. Several people have the same problem. I do not see them as "true" 4266 memory, because everything else in the series runs at 1.35v. These have an XMP profile of 4133 with a x103.2 clock at 1.40v, like handpicked 4133 memory with a tweaked profile. I usually run them at 4133 17-18-18-38 CR2.
> 
> I fiddled for a couple of hours yesterday with different timings: 4234 19-19-19-39 CR2 / 4133 17-18-18-38 CR2 / 3000 13-13-13-28 CR1.
> I ran Superposition 4K, Timespy, and tried a little with FRAPS. But the graphics score in Timespy (10900-11000) and Superposition (10000-10200) was as before, and the frametime tests I did were almost identical.
> 
> Had I tested 1080p maybe I would have seen something, but that's not interesting. I gave up then because I'm not well.
> I've stayed on 3000 13-13-13-28 CR1, where they are completely stable with Memtest Pro, not a single error. I'll game some tonight and see if I notice any improvement.


That's weird, though: your 3000 CL13 memory has a higher latency than my 3200 CL16. That doesn't make sense unless you loosened a lot of secondary or tertiary timings.


----------



## cluster71

No, I did not have time for that many fine adjustments yesterday. The sub-timings are on AUTO and scale down with the primaries. I may follow up on it if I'm going to stay at this level.


----------



## KraxKill

Quote:


> Originally Posted by *cluster71*
> 
> Yesterday we discussed the 7700K on Z270 a little. KraxKill suggested testing with high RAM clocks and then lower clocks with tightened timings, then giving them some workload and measuring with FRAPS if possible.
> 
> I have G.Skill F4-4266C19D-16GTZR [4266MHz 19-19-19-39-2N 1.40v]. They are not stable on my MB, not even with 1.5v. It's like a wall at 4273; the slightest bus swing is enough. Maybe it's the odd-ratio modes in the BIOS. Several people have the same problem. I do not see them as "true" 4266 memory, because everything else in the series runs at 1.35v. These have an XMP profile of 4133 with a x103.2 clock at 1.40v, like handpicked 4133 memory with a tweaked profile. I usually run them at 4133 17-18-18-38 CR2.
> 
> I fiddled for a couple of hours yesterday with different timings: 4234 19-19-19-39 CR2 / 4133 17-18-18-38 CR2 / 3000 13-13-13-28 CR1.
> I ran Superposition 4K, Timespy, and tried a little with FRAPS. But the graphics score in Timespy (10900-11000) and Superposition (10000-10200) was as before, and the frametime tests I did were almost identical.
> 
> Had I tested 1080p maybe I would have seen something, but that's not interesting. I gave up then because I'm not well.
> I've stayed on 3000 13-13-13-28 CR1, where they are completely stable with Memtest Pro, not a single error. I'll game some tonight and see if I notice any improvement.


This is the simplest way to estimate frequency vs latency performance.

True latency (ns) = 2000 x CL / frequency (MT/s)

Your performance will be greatest where that latency is lowest.

Some numbers you posted:

19 @ 4234 = 8.97ns
17 @ 4132 = 8.23ns
13 @ 3000 = 8.67ns

Your 4132 CL17 is theoretically the fastest of the three combinations you posted.

I'm suggesting going lower on frequency and continuing to tighten timings to see if you can find a stable combination that results in the lowest latency. Provided that occurs at a frequency where bandwidth is still sufficient for the current workload, THAT will be your fastest setting.

For example, if you can run 3200 at CL13, that would be
13 @ 3200, or 8.13ns. Faster than all three combinations you posted.

Or 4000 CL16: 16 @ 4000 = 8.0ns.

That would be faster still.

Simply selecting a lower CAS number is not enough; the combination has to result in the lowest overall latency in order to be effective.
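The selection rule above can be sketched in a few lines: compute true latency for each stable combination and keep the lowest, bandwidth permitting (a minimal sketch using the values from these posts; the helper name is mine):

```python
# (MT/s, CL) combinations mentioned in the posts above
candidates = [(4234, 19), (4132, 17), (3000, 13), (3200, 13), (4000, 16)]

def latency_ns(mts: int, cl: int) -> float:
    # true latency: CL cycles at half the MT/s data rate (DDR)
    return 2000.0 * cl / mts

for mts, cl in candidates:
    print(f"{mts} CL{cl}: {latency_ns(mts, cl):.2f} ns")

# lowest-latency combination wins, provided bandwidth is still sufficient
best = min(candidates, key=lambda c: latency_ns(*c))
print("pick:", best)  # (4000, 16) at 8.00 ns
```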


----------



## cluster71

I used Thaiphoon Burner as a guide to recalculate quickly; from there you can squeeze it a bit.


----------



## cluster71

Okay, I'm aiming for 4000 CL16 tonight.


----------



## Aganor

Hey guys, is there any advantage to using Windows 10 vs W7 with a 1080 Ti?
I'm getting fed up with W10 since I always loved W7.
I moved to W10 when I had lots of latency problems back on my 1080, but after all, the problem wasn't the operating system, so I'm thinking of using my Windows 7 CD again.


----------



## AllGamer

Quote:


> Originally Posted by *Aganor*
> 
> Hey guys, is there any advantage to using Windows 10 vs W7 with a 1080 Ti?
> I'm getting fed up with W10 since I always loved W7.
> I moved to W10 when I had lots of latency problems back on my 1080, but after all, the problem wasn't the operating system, so I'm thinking of using my Windows 7 CD again.


+1

LOL







even my wife (also a gamer) is pissed at Win10, all those damn updates screwing things up.

I've been formatting and re-installing Win7 on many of my machines. (Home Theater, Game PC1, Game PC2, Son's Gaming PC, Space PC, Visitor's PC, and the many laptops we have.)

Win10 drivers are usually backwards compatible with Win7

GTX 1080 (non-Ti) worked fine in Win7

and from Nvidia point of view the GTX 1080 ti is officially supported as per driver page http://www.geforce.com/drivers/results/119912

Version 382.53 - WHQL
Release Date Fri Jun 09, 2017
Operating System *Windows 7 64-bit*
Windows 8.1 64-bit
Windows 8 64-bit
Language English (US)
File Size 385.52 MB

Even the new Rig I'm building it's going to be on Win7, and yes going to be running 2x MSI GTX 1080 ti EK X Seahawk edition (shipping mishap, waiting for new shipment )


----------



## FUZZFrrek

Hi! I have a noob question. In GPU-Z I only have VRel as the PerfCap reason. Does it mean I have not enough or too much voltage on the slider? I am trying to find the right amount but nothing changes the situation.


----------



## Aganor

I had DPC issues with the 1080, but it turned out to be my old rig's fault; by the time I figured it out, I already had Windows 10.

I'm soooo damn inclined to go back to W7!!!


----------



## AllGamer

Quote:


> Originally Posted by *Aganor*
> 
> I had DPC issues with the 1080, but it turned out to be my old rig's fault; by the time I figured it out, I already had Windows 10.
> 
> I'm soooo damn inclined to go back to W7!!!


The only reason to be on Win10 is to play the new games that are DirectX 12 only.

All DX11 games and older, I'd rather play in Win7.

My main OS is Linux.

So, you can always set up multi-boot and run whatever you like the most, and only keep Win10 for the games you play on DX12.... of which I have yet to see any that pique my interest.

All of the games I play are on DX11 or waaaay older.

I've set up multi-boot with an OS corresponding to each DX version.

Games run better in their native environment.

There are many GOG games for Win9x and XP that run horribly on newer OSes.


----------



## FUZZFrrek

Nvm. I found out how to remove any PerfCap. Actually, in any benchmark and in games running at 1080p and 1440p, I am able to get the core to 2050 stable (under 59°C) at the voltage limit. At 4K I get a power PerfCap. I have an EVGA FTW3.


----------



## cluster71

All calm now. One and a half hours of gameplay at 2100/11800.


----------



## chubalz

1. Will a GTX 1080 Ti + R5 1600 cause any problems for our eyes at 1080p on a 60Hz TV?

2. Is the screen tearing annoying?

3. Is it OK to play even if my FPS is over 120 on a 60Hz TV?

4. Does enabling vsync in the game settings work even though my TV doesn't have G-Sync?


----------



## Mad Pistol

Quote:


> Originally Posted by *chubalz*
> 
> 1. Will a GTX 1080 Ti + R5 1600 cause any problems for our eyes at 1080p on a 60Hz TV?
> 
> 2. Is the screen tearing annoying?
> 
> 3. Is it OK to play even if my FPS is over 120 on a 60Hz TV?
> 
> 4. Does enabling vsync in the game settings work even though my TV doesn't have G-Sync?


1. No problem, just smooth gameplay (1080 Ti is overkill for 1080p).

2. Screen tearing affects everyone differently. After using gsync, I hate screen tearing like the plague. Before using gsync, I didn't care.

3. Yes, in fact, it's recommended.

4. Vsync will work fine. Just bear in mind that Vsync will add latency to gameplay and make it feel sluggish at times.


----------



## chubalz

Thanks, Mad Pistol.

Last question.

What can you say about my situation, using a GTX 1080 Ti running 1080p on a 60Hz TV?

For you, is it a go signal?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Mad Pistol*
> 
> 1. No problem, just smooth gameplay (1080 Ti is overkill for 1080p).
> 
> 2. Screen tearing affects everyone differently. After using gsync, I hate screen tearing like the plague. Before using gsync, I didn't care.
> 
> 3. Yes, in fact, it's recommended.
> 
> 4. Vsync will work fine. Just bear in mind that Vsync will add latency to gameplay and make it feel sluggish at times.


Everyone hates screen tearing, but not every game handles it the same. Also, I've gotten used to it in most of them, but if someone points it out, I too will get annoyed.


----------



## Aganor

Quote:


> Originally Posted by *Mad Pistol*
> 
> 1. No problem, just smooth gameplay (1080 Ti is overkill for 1080p).
> 
> 2. Screen tearing affects everyone differently. After using gsync, I hate screen tearing like the plague. Before using gsync, I didn't care.
> 
> 3. Yes, in fact, it's recommended.
> 
> 4. Vsync will work fine. Just bear in mind that Vsync will add latency to gameplay and make it feel sluggish at times.


So true #2








Having 2 rigs, the lower one doesn't have G-Sync, and even benching in Heaven makes me nervous lol


----------



## cluster71

I always used V-sync before G-Sync. I cannot take screen tearing because of my nerve damage, nerve pain, chronic migraine and chronic cluster headache. My nervous system is always overloaded and runs high. I hear and see things that others do not notice. An old CRT TV is like a stroboscope to me; I can see the 50 Hz refresh without a problem.


----------



## cluster71

Just make sure you have hardware good enough for the task so the framerate never goes below 60 Hz and drops to 30. Then it works well with V-sync.


----------



## chubalz

Actually my TV is a Sony X800D 4K TV.

http://www.sony.com.ph/electronics/televisions/x8000d-series/specifications

Looking at the specs, it doesn't support 1440p resolution, only 1080p and 3840x2160. Am I wrong?

I even tried my G4560's iGPU (HD 610) plugged into the Sony TV via HDMI; there was no 1440p resolution, only 1080p and 3840x2160.

I was planning on playing at 1080p only with the GTX 1080 Ti, because I've seen reviews where the 1080 Ti doesn't reach 60 fps in the majority of games on ultra settings.

Is 30 fps at 4K on ultra settings smooth and playable?


----------



## Mad Pistol

Quote:


> Originally Posted by *chubalz*
> 
> Thanks, Mad Pistol.
> 
> Last question.
> 
> What can you say about my situation, using a GTX 1080 Ti running 1080p on a 60Hz TV?
> 
> For you, is it a go signal?


I mean, it will run just fine. I just think it's complete and utter overkill. If you're only going to play at 1080p, a GTX 1070, even though it's roughly half as fast, will still be overkill.

If you're only going to be playing @ 1080p, save yourself a few bucks and get a GTX 1070.
Quote:


> Originally Posted by *chubalz*
> 
> Actually my TV is a Sony X800D 4K TV.
> 
> http://www.sony.com.ph/electronics/televisions/x8000d-series/specifications
> 
> Looking at the specs, it doesn't support 1440p resolution, only 1080p and 3840x2160. Am I wrong?
> 
> I even tried my G4560's iGPU (HD 610) plugged into the Sony TV via HDMI; there was no 1440p resolution, only 1080p and 3840x2160.
> 
> *I was planning on playing at 1080p only with the GTX 1080 Ti, because I've seen reviews where the 1080 Ti doesn't reach 60 fps in the majority of games on ultra settings.*
> 
> Is 30 fps at 4K on ultra settings smooth and playable?


Ummmmmm... a GTX 1080 Ti is a 4k/ 60fps card. I'm not sure where you read this, but even a GTX 1070 would be able to max out virtually every game on the market @ 1080p and still get well over 60 fps.

To put it in perspective, I have a 2560x1440 144hz Gsync monitor. The 1080 Ti can EASILY run virtually every game @ 2560x1440 maxed out and never drop below 60 fps. In games like Battlefield 1, I'm averaging 100+ fps, and I can even run it @ 4k and still get well over 60 fps in multiplayer.

Again, a GTX 1080 Ti is a 4k/ 60 fps card.

One more thing... you can probably set up a custom resolution in the Nvidia Control Panel and add 2560x1440 manually.


----------



## kfxsti

Finally got around to repasting my baby!! Temps dropped around 4°C at idle. Haven't had a chance to load her up yet; will post back with those results. Also noticed 2 caps didn't have thermal pads and added some to them. All seems well though.
I don't know if I'm the only one... but mother of God, do they have to apply this much paste? Lol


----------



## ELIAS-EH

Hello

Can someone tell me the maximum power that each fan header on the 1080 Ti Strix can handle?
Can I put two 4-pin fans on one fan header on the 1080 Ti Strix?

Thank u


----------



## TheBoom

Quote:


> Originally Posted by *chubalz*
> 
> Actually, my TV is a Sony X800D 4K TV.
> 
> http://www.sony.com.ph/electronics/televisions/x8000d-series/specifications
> 
> Looking at the specs, it doesn't support 1440p, only 1080p and 3840x2160. Am I wrong?
> 
> I even tried my G4560's iGPU (HD 610) plugged into the Sony TV over HDMI; there was no 1440p resolution, only 1080p and 3840x2160.
> 
> I was planning to run the GTX 1080 Ti at 1080p only, because I've seen reviews where the 1080 Ti doesn't reach 60 fps in the majority of games on ultra settings.
> 
> Is 30 fps at 4K resolution on ultra settings smooth and playable?


I think you can force it to run at 1440p in the Nvidia Control Panel.

It's just weird that a 4K TV can't accept 1440p.


----------



## chubalz

I hope it supports 1440p.

The GTX 1070 is so much more expensive in our country compared to the GTX 1080 Ti FE.

The Palit GTX 1080 Ti Founders Edition 11GB is cheaper here compared to the Gigabyte GTX 1080 Ti Founders Edition 11GB.

Do all GTX 1080 Ti FE cards have the same quality? The same blower materials?


----------



## SupernovaBE

Got myself a 1080 Ti (MSI Gaming X) to swap out my two 1070 Gaming Xs.
A small downgrade it seems, but I got 850 euro for the two cards, so I'm happy.

(New is about 920 euro at the cheapest shop, and MSI is doing a 40 euro cashback on the cards.)

Now to save up for number 2.

Getting water on it soon, plus a shunt mod; I'm running into the power limiter all the time.
It will do 2060 MHz on air for now, and 12.2 GHz on the mem.

Liquid metal on the GPU, or just Arctic Silver?


----------



## Coopiklaani

Since the UK is on fire atm, I decided to undervolt my GTX 1080 Ti today. To my surprise, its frequency scales very linearly with voltage, even over 1.093 V. Power is another story, since efficiency goes out the window when you go over 1.093 V.








Here's my 4hr FSU scene 1 stable frequency vs voltage table:

[email protected]
[email protected] (1.33mV per 1 MHz increment)
[email protected] (1.32mv per 1 MHz increment)
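For anyone who wants to do the same math on their own card: a quick sketch of turning a stable-point table like the one above into mV-per-MHz slopes. The frequency/voltage pairs below are hypothetical placeholders (the entries above were mangled by the forum's email obfuscation and can't be recovered), so treat this as the method, not Coopiklaani's data.

```python
# Compute the voltage cost of each extra MHz between adjacent stable points.
# The (freq_mhz, voltage_v) points are HYPOTHETICAL examples, not real data.
points = [(1823, 0.950), (1911, 1.062), (1949, 1.093)]

def mv_per_mhz(p1, p2):
    """Slope between two stable points, in millivolts per MHz."""
    (f1, v1), (f2, v2) = p1, p2
    return (v2 - v1) * 1000 / (f2 - f1)

slopes = [mv_per_mhz(points[i], points[i + 1]) for i in range(len(points) - 1)]
print([round(s, 2) for s in slopes])
```

A rising slope near the top of the table is the "efficiency goes out the window" region: each extra MHz starts costing noticeably more voltage (and power scales with V² on top of that).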


----------



## asdkj1740

Quote:


> Originally Posted by *Coopiklaani*
> 
> SInce UK is on fire atm, I decided to under volt my gtx 1080ti today. To my surprise, its frequency scales very linearly with the voltage even when it's over 1.093v. Power is another story, since efficiency goes out of the window when you go over 1.093v.
> 
> 
> 
> 
> 
> 
> 
> 
> Here's my 4hr FSU scene 1 stable frequency vs voltage table:
> 
> [email protected]
> [email protected] (1.33mV per 1 MHz increment)
> [email protected] (1.32mv per 1 MHz increment)


not funny, at all.


----------



## GraphicsWhore

Quote:


> Originally Posted by *SupernovaBE*
> 
> Got myself a 1080 TI ( msi gaming X )
> To swap out my 2 1070 gamin x
> A small downgrade it seems, but, i got 850 euro for the 2 cards, im happy
> 
> 
> 
> 
> 
> 
> 
> ( new is like 920 @ cheapest shop, and msi is doing a 40 euro cashback on the cards.
> 
> Now save up for number 2
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Getting water on it soon + shunt mod. im running on the powerlimiter all the time.
> It will do 2060mhz on air for now.
> and 12.2ghz on the mem.
> 
> liquid metal on the gpu or just artic silver ?


I wouldn't use liquid metal. Regular TIM is fine.

I also wouldn't waste your time and money on a SLI setup. I guarantee it'll be more trouble than it's worth.


----------



## dallas1990

Got a new TV, a 4K 43" Hisense 43H7C with HDR. The picture is great for the price, $380 at my local Best Buy. But my only issue is that some of my games are running at 45 fps: Ghost Recon Wildlands, Crysis 3, and one more I can't remember the name of. Wondering if it's my TV or my Zotac AMP Extreme 1080 Ti. My CPU is a 6800K OC'd to 4.0 GHz. I probably should ask this somewhere else, but since it could possibly be my GPU, I figured it would be OK lol. Oh, and I do have an 18 Gbps HDMI cable, I'm playing these games at 4K, and I'm plugged into the 60 Hz port. Just wondering if that's normal or if I fudged something up.


----------



## ZealotKi11er

Quote:


> Originally Posted by *dallas1990*
> 
> Got a new TV, a 4K 43" Hisense 43H7C with HDR. The picture is great for the price, $380 at my local Best Buy. But my only issue is that some of my games are running at 45 fps: Ghost Recon Wildlands, Crysis 3, and one more I can't remember the name of. Wondering if it's my TV or my Zotac AMP Extreme 1080 Ti. My CPU is a 6800K OC'd to 4.0 GHz. I probably should ask this somewhere else, but since it could possibly be my GPU, I figured it would be OK lol. Oh, and I do have an 18 Gbps HDMI cable, I'm playing these games at 4K, and I'm plugged into the 60 Hz port. Just wondering if that's normal or if I fudged something up.


It's normal. You've got to lower some settings.


----------



## keikei

Quote:


> Originally Posted by *dallas1990*
> 
> Got a new TV, a 4K 43" Hisense 43H7C with HDR. The picture is great for the price, $380 at my local Best Buy. But my only issue is that some of my games are running at 45 fps: Ghost Recon Wildlands, Crysis 3, and one more I can't remember the name of. Wondering if it's my TV or my Zotac AMP Extreme 1080 Ti. My CPU is a 6800K OC'd to 4.0 GHz. I probably should ask this somewhere else, but since it could possibly be my GPU, I figured it would be OK lol. Oh, and I do have an 18 Gbps HDMI cable, I'm playing these games at 4K, and I'm plugged into the 60 Hz port. Just wondering if that's normal or if I fudged something up.


Drop or turn off post-processing and AA; you may not need them running @ 4K. You'll get more frames doing so.


----------



## Aganor

Guys, I'm trying to find the best power/performance ratio right now. I saw a user post his info on this subject, but I don't know where it is.
What's the lowest voltage you guys managed to get for your max core frequency?
I started this due to mining Zcash on NiceHash, but now I think that instead of pushing for the max, I want to push for the minimum, to save energy and generate less heat.
Right now I'm testing 1898 MHz @ 931 mV.
It already failed after 1 h of mining at 925 mV and the driver crashed.
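If anyone wants to automate that hunt instead of stepping down by hand, the usual approach is a binary search over voltage. A minimal sketch, where `is_stable` is a hypothetical stand-in for a real stress test (e.g. an hour of mining at that voltage), not anything a tool provides:

```python
# Binary-search the lowest stable voltage (in mV) for a fixed core clock.
# `is_stable` is a HYPOTHETICAL callback standing in for a real stress test.
def lowest_stable_mv(is_stable, lo=800, hi=1093, step=6):
    """Return the lowest voltage in [lo, hi] (to within `step` mV) that
    still passes `is_stable`, or None if even `hi` fails."""
    if not is_stable(hi):
        return None
    while hi - lo > step:
        mid = (lo + hi) // 2
        if is_stable(mid):
            hi = mid          # stable: try lower
        else:
            lo = mid          # crashed: back off upward
    return hi

# Mock run: pretend the card is stable at >= 931 mV for the tested clock.
print(lowest_stable_mv(lambda mv: mv >= 931))
```

The 6 mV step matches the granularity Pascal cards expose in Afterburner's curve editor, give or take; the 800-1093 mV bounds are just illustrative.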


----------



## feznz

Quote:


> Originally Posted by *ELIAS-EH*
> 
> Hello
> 
> Can someone tell me the maximum power that each fan header on 1080ti strix can handle?
> Can i put two 4 pin fans on 1 fan header on the 1080ti strix?
> 
> Thank u


I asked the same question on the Asus support forums and got no reply.
It seems pretty silly not to specify the exact maximum wattage; the spec sheet only rates each header for one standard case fan.
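Since Asus only rates it as "one standard case fan", here's the back-of-the-envelope check you'd do before putting two fans on a splitter. The 1.0 A @ 12 V header budget and the fan currents below are assumed, illustrative values, not the Strix's actual (unpublished) limit:

```python
# Check a set of fans against an assumed header budget.
# The 1.0 A @ 12 V rating and the fan currents are ASSUMPTIONS for
# illustration; Asus does not publish the Strix header's real limit.
HEADER_BUDGET_W = 12.0 * 1.0   # assumed "one standard case fan" header

def fits(fan_currents_a, budget_w=HEADER_BUDGET_W, volts=12.0):
    """True if the combined fan draw stays within the header budget."""
    return sum(i * volts for i in fan_currents_a) <= budget_w

print(fits([0.25, 0.25]))  # two typical 0.25 A case fans on a splitter
print(fits([0.60, 0.60]))  # two high-current fans
```

One caveat the arithmetic hides: fans pull a startup surge above their label current, so you'd want real headroom, not an exact fit.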


----------



## methadon36

Quote:


> Originally Posted by *Mad Pistol*
> 
> *sigh*
> 
> The Destiny 2 bundle hit just after I purchased the card, and all Amazon would do is give me a $15 promotional credit.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I understand that I could return the card and repurchase it, but honestly, I just don't feel like going through the hassle. I guess I'll just have to pay full price on Destiny 2.


I purchased my EVGA 1080 Ti on Newegg right before the bundle deal; they said nope on the Destiny 2.


----------



## c0nsistent

Guys I need help ASAP!

I accidentally spilled Phobya liquid metal onto my GPU die, and it ran down between the metal shroud around the die and the PCB. I sucked out as much as I could with the syringe, but I'm not powering it on until I'm confident I can clean it out. People are saying to use alcohol and a toothbrush? What do you all recommend? I really, really don't want to power this thing on and see sparks and smoke.

Thanks!


----------



## Dasboogieman

Quote:


> Originally Posted by *c0nsistent*
> 
> Guys I need help ASAP!
> 
> I accidentally spilled Phobya liquid metal onto my GPU die and it ran down between the metal shroud around the die and the pcb. I sucked as much out with the syringe as I can but I'm not powering it on until I'm confident I can clean it out. People are saying use alcohol and a toothbrush? What do you all recommend. I really really don't want to power this thing on and see sparks and smoke.
> 
> Thanks!


Just flood the region with isopropyl and suction some more. Blast some canned air too for good measure. What I used was some ultra-fine steel wool or alfoil, because that absorbs the LM; just be careful not to damage the tiny caps. If it makes you feel better, I actually spilled my LM on the PCB itself and some of it ran underneath the VRAM BGA lol. If anything, my VRAM OCs improved.


----------



## c0nsistent

Quote:


> Originally Posted by *Dasboogieman*
> 
> Just flood the region with Isopropol and suction some more. Blast some canned air too for good measure. What I used was some ultra fine steel wool or alfoil because that absorbs the LM, just be careful not to damage the tiny caps. If it makes you feel better, I actually spilled my LM on the PCB itself and some of it ran underneath the VRAM BGA lol. If anything, my VRAM OCs improved.


So it would be safe to literally flush it with high-percentage alcohol by pouring it onto the die repeatedly? I assume I would have to let it dry for 2+ days afterwards. My concern was mainly doing damage to the board, but at this point it's already past the point of no return.

To speed up the drying process, should I blow-dry the board?



I see what you mean now... Basically pour a little at a time and try to suction it out. Will do. Tomorrow morning I'll give this a shot, and I'll post my success or failure the following day after it dries.


----------



## ilikelobstastoo

2 days for rubbing alcohol to dry!? Lol


----------



## ZealotKi11er

Quote:


> Originally Posted by *c0nsistent*
> 
> So it would be safe to literally flush it with high % alcohol by pouring it onto the die repeatedly? I assume I would have to let it dry for 2+ days afterwards, but my concerns were mainly doing damage to the board, but at this point its already past the point of no return.
> 
> To speed up the drying process, should I blow dry the board?
> 
> 
> 
> I see what you mean now... Basically pour a little at a time and try to suction it out. Will do. Tomorrow morning I'll be giving this a shot and I'll post my success or failure the following day after it dries.


Take the card apart and clean it properly.


----------



## DerComissar

Quote:


> Originally Posted by *chubalz*
> 
> I hope it support 1440p.
> 
> Gtx 1070 here is so much expensive in our country compare to gtx 1080ti FE.
> 
> Palit GTX 1080 Ti Founders Edition 11GB is cheaper here compare to Gigabyte GTX 1080 Ti Founders Edition 11GB
> 
> Do all Gtx 1080 Ti FE has same quality? same materials of blower?


Same circus, different clowns.


----------



## Dasboogieman

Quote:


> Originally Posted by *c0nsistent*
> 
> So it would be safe to literally flush it with high % alcohol by pouring it onto the die repeatedly? I assume I would have to let it dry for 2+ days afterwards, but my concerns were mainly doing damage to the board, but at this point its already past the point of no return.
> 
> To speed up the drying process, should I blow dry the board?
> 
> 
> 
> I see what you mean now... Basically pour a little at a time and try to suction it out. Will do. Tomorrow morning I'll be giving this a shot and I'll post my success or failure the following day after it dries.


The trick with liquid metal is to clean it up ASAP; the longer you leave it exposed to air, the more it oxidizes, which makes it harder to remove.


----------



## smonkie

Quote:


> Originally Posted by *Dasboogieman*
> 
> the trick with liquid metal is do it ASAP, the longer you leave it exposed to air, the more it oxidizes which makes it harder to remove.


As far as I know, liquid metal pastes are alloys of different metals with a high gallium content. And gallium barely reacts in the presence of air.


----------



## SupernovaBE

Quote:


> Originally Posted by *GraphicsWhore*
> 
> I wouldn't use liquid metal. Regular TIM is fine.
> 
> I also wouldn't waste your time and money on a SLI setup. I guarantee it'll be more trouble than it's worth.


Yeah, the problem is I have a 900D and it looks empty now.

Maybe I'll just wait and see if the new gen brings better SLI scaling.
The scaling in my games is good though (my 1070 SLI setup did well).


----------



## Dasboogieman

Quote:


> Originally Posted by *smonkie*
> 
> As far as I know, liquid metal pastes are alloys of different metals but with a greater Gallium quantity in them. And Gallium does not react in presence of air.


Yes, but are you willing to gamble a $1000+ GPU on that?


----------



## kfxsti

Have you guys seen the MSI lightning 1080 ti yet? Sweet baby Jesus


----------



## TheBoom

Quote:


> Originally Posted by *kfxsti*
> 
> Have you guys seen the MSI lightning 1080 ti yet? Sweet baby Jesus


What about it?


----------



## ilikelobstastoo

Quote:


> Originally Posted by *kfxsti*
> 
> Have you guys seen the MSI lightning 1080 ti yet? Sweet baby Jesus


According to MSI AB, that's exactly what my FE is. Lol


----------



## ZealotKi11er

Does anyone know if the 1080 Hybrid cooler fits the 1080 Ti?


----------



## Bitbit017

Hi guys!

I need some help!
I have several Gigabyte GTX 1080 Ti 11GB (Gaming OC) cards and I use them for mining!
I have this problem: when it starts mining, the memory frequency drops from 11000 MHz to 10000 MHz; the card automatically selects the profile with 5000 MHz (shown in the picture).
In every other application (gaming, benchmarking, etc.) the card selects the first profile and the memory runs at 11000 MHz.


http://imgur.com/yveJh


Does anybody know why? Do you have a solution?

Thank you!
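For what it's worth, on Pascal cards CUDA/compute workloads typically run in the P2 power state, which uses a reduced memory clock by design, and that matches the symptom described. A minimal sketch for spotting it by parsing `nvidia-smi --query-gpu=pstate,clocks.mem --format=csv,noheader` output while the miner is running (the sample line below is made-up output, not real data):

```python
# Parse one CSV line from:
#   nvidia-smi --query-gpu=pstate,clocks.mem --format=csv,noheader
# to see whether the card sits in the P2 compute state (lower mem clock).
def parse_gpu_state(csv_line):
    """Return (pstate, mem_clock_mhz) from one line of nvidia-smi CSV output."""
    pstate, mem = [field.strip() for field in csv_line.split(",")]
    return pstate, int(mem.split()[0])   # e.g. "5005 MHz" -> 5005

sample = "P2, 5005 MHz"                  # hypothetical line while mining
state, mem_mhz = parse_gpu_state(sample)
print(state, mem_mhz)
```

If it reports P2 while mining and P0 in games, the clock drop is driver behavior rather than a faulty card.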


----------



## Psykoking

So, since you guys were talking about liquid metal, I thought I'd share my results going from the factory TIM to Thermal Grizzly Conductonaut on my Zotac AMP Extreme (on air). Idle temps didn't change, but I went from 68°C under full load at 2062 MHz to 57°C. So for me it was quite worth the change. Ambient temps ~26°C.


----------



## mcg75

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Does anyone know if 1080 Hybrid cooler for 1080 Ti?


If it's the EVGA hybrid, yes it fits just fine.


----------



## c0nsistent

Cleaned it with 91% alcohol and a paintbrush, as well as a coffee filter. Waited 6 hours, powered it on. The GeForce logo lights up, then goes dark after a few seconds. No display output, and none of the 100% fan speed I usually associate with a dead GPU. Then I decided to bake it for 10 mins @ 375°F. Same thing afterwards. Now I've just soaked it in 91% alcohol for 10 mins and I'm letting it dry for 24 hours. After this I'll just be putting it up for parts on eBay and saving for another.

Be careful with liquid metal TIM, guys. I wasn't even putting the TIM on the GPU die itself; I was merely putting it between the waterblock and my copper shim, and it leaked out between the GPU and PCB, which is the worst place it could have leaked.

It's just a turn of bad luck I guess. Now I'm back to using an AMD 6450 for a few weeks.

Just to make sure: I noticed that some of the hex screws have colored bands on them. If I didn't put them back in the corresponding spots, would that cause this issue? I figured it wouldn't matter.


----------



## max883

Evga GTX 1080 Ti SC in my Xbox One


----------



## lanofsong

Hello GTX 1080Ti owners,

We are having our monthly Foldathon from Monday 19th - Wednesday 21st - 12noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come *sign up* and fold with us - see attached link.

June 2017 Foldathon

To get started:

1. Get a passkey (allows for a speed bonus) - needs a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

3. Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter the Team OCN number - 37726

later
lanofsong


----------



## kfxsti

Quote:


> Originally Posted by *TheBoom*
> 
> What about it?


Three 8-pins lol. And pure sexiness lol


----------



## GraphicsWhore

Quote:


> Originally Posted by *c0nsistent*
> 
> Cleaned it with 91% alcohol and a paintbrush, as well as a coffee filter. Waited 6 hours, powered it on. The GeForce logo lights up, then goes dark after a few seconds. No display output, and none of the 100% fan speed I usually associate with a dead GPU. Then I decided to bake it for 10 mins @ 375°F. Same thing afterwards. Now I've just soaked it in 91% alcohol for 10 mins and I'm letting it dry for 24 hours. After this I'll just be putting it up for parts on eBay and saving for another.
> 
> Be careful with liquid metal TIM, guys. I wasn't even putting the TIM on the GPU die itself; I was merely putting it between the waterblock and my copper shim, and it leaked out between the GPU and PCB, which is the worst place it could have leaked.
> 
> It's just a turn of bad luck I guess. Now I'm back to using an AMD 6450 for a few weeks.
> 
> Just to make sure: I noticed that some of the hex screws have colored bands on them. If I didn't put them back in the corresponding spots, would that cause this issue? I figured it wouldn't matter.


You baked your 1080Ti?

Bruh...

Sounds like it was done by then anyway. I'd let it sit for a few days and try.

Are you talking about the washers? I don't even think missing one would cause that.


----------



## c0nsistent

Quote:


> Originally Posted by *GraphicsWhore*
> 
> You baked your 1080Ti?
> 
> Bruh...
> 
> Sounds like it was done by then anyway. I'd let it sit for a few days and try.
> 
> Are you talking about the washers? I don't even think missing one would cause that.


Yeah, but it was stripped down of course. It's a well-known tactic as far as reflowing goes. I figured it was already dead at that point, so perhaps it was worth trying, and it could possibly dry out the rest of the liquid if that was the problem.

Now I'm giving it 24 hours, and I'll try it again tomorrow before I list it on eBay for parts for $200 or so.


----------



## KedarWolf

Quote:


> Originally Posted by *kfxsti*
> 
> Have you guys seen the MSI lightning 1080 ti yet? Sweet baby Jesus


I'd strongly advise against anyone flashing this BIOS on their FE when it becomes available.

When I flashed the HOF BIOS I bricked my card; it's three eight-pin power too, I think.









Took me almost four days to get it flashed back and working again.


----------



## madmeatballs

Is PNY's XLR8 bios out yet?


----------



## KedarWolf

Quote:


> Originally Posted by *madmeatballs*
> 
> Is PNY's XLR8 bios out yet?


Haven't seen it yet, no.

If anyone has an XLR8, can they save the BIOS with GPU-Z, put it in a ZIP file, and link it here as an attachment?


----------



## DerComissar

Quote:


> Originally Posted by *c0nsistent*
> 
> Cleaned it with 91% alcohol and a paintbrush, as well as a coffee filter. Waited 6 hours, powered it on. The GeForce logo lights up, then goes dark after a few seconds. No display output, and none of the 100% fan speed I usually associate with a dead GPU. Then I decided to bake it for 10 mins @ 375°F. Same thing afterwards. Now I've just soaked it in 91% alcohol for 10 mins and I'm letting it dry for 24 hours. After this I'll just be putting it up for parts on eBay and saving for another.
> 
> Be careful with liquid metal TIM, guys. I wasn't even putting the TIM on the GPU die itself; I was merely putting it between the waterblock and my copper shim, and it leaked out between the GPU and PCB, which is the worst place it could have leaked.
> 
> It's just a turn of bad luck I guess. Now I'm back to using an AMD 6450 for a few weeks.
> 
> *Just to make sure: I noticed that some of the hex screws have colored bands on them. If I didn't put them back in the corresponding spots, would that cause this issue? I figured it wouldn't matter.*


The colored "bands", usually blue, are blue Loctite.

This is applied at the factory when assembling the cards, to keep the screws from loosening.


----------



## Rollergold

Well, after running my GTX 680 for 5 years (4 under water),

I finally retired it yesterday and slotted in the new hotness.

This also marks my first non-reference card. From the 7800 GT until just yesterday, all I used were reference-design cards.

I will put it on water after EK releases the block for it, as the QDCs show.

But on air I was able to get this out of Unigine's Superposition benchmark:


----------



## rjeftw

Decided I am finally going to pull the trigger and snag a 1080 Ti; I just can't decide which company to go with. Debating between the Aorus cards, the EVGA SC2/FTW3, and the MSI Gaming model... So many choices. I currently have an MSI 980 Ti Gaming with a G10 and H50, and I'm heavily considering water cooling the new card in the future... Any recommendations?

Thanks!


----------



## Dasboogieman

Quote:


> Originally Posted by *rjeftw*
> 
> Decided I am finally going to pull the trigger and snag a 1080Ti; just can't decide what company to go with. Debating on the either of the Aorus cards, EVGA SC2/FTW3, MSI Gaming model... So many choices; Currently have a MSI 980Ti Gaming with G10 & H50. Heavily considering water cooling in the future with the card... Any recommendations?
> 
> Thanks!


If you're gonna watercool, you're limited to the FE, Aorus, Strix, Gaming or FTW3. All of them are fine, but there are some minor quirks:

1. The FE has the best waterblock compatibility and BIOS flexibility.
2. The Strix has unhindered operation with the XOC BIOS for overvolting.
3. The Aorus has one of the highest non-XOC BIOS power limits of the watercoolable AIB cards, which means very, very minimal power throttling.
4. The FTW3 has a decent power limit ceiling, but also hands-down the best monitoring: it has thermocouples that can measure VRM and VRAM temps, which no other card can do.


----------



## Streetdragon

Sorry AMD, but too late is too late... what a shame.



I hope I get lucky with the silicon for once in my life... but as long as I can get 2000 out of it...

Can I mod the BIOS somehow to get a higher power limit? A shunt mod is not an option because of my X9 case, the liquid metal and so on. Or can I tune it another way, without soldering etc.?


----------



## KedarWolf

Quote:


> Originally Posted by *Streetdragon*
> 
> Sorry AMD but to late is to late..... what a shame
> 
> 
> 
> I hope i get lucky with the silicon ones in my life... but as long i can get 2000 out of it...
> 
> Can i mod the bios somehow to get a higher powerlimit? Shunt mod is no option because of my X9 case and the liquid metal and so on. or can i tune it in anouther way without solder etc


You can try the resistor mod; it works in vertical cases.

Search for it in this thread, but it's basically attaching 47 Ohm resistors across three capacitors on the card, which does the same thing as the shunt mod.









Edit: Or flash XOC BIOS, removes power limit.


----------



## Streetdragon

Quote:


> Originally Posted by *KedarWolf*
> 
> You can try the resistor mod, it works on vertical cases.
> 
> Search about it in this thread, but it's properly attaching 47 Ohm resistors on three capacitors on the card that does the same as the shunt mod.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Or flash XOC BIOS, removes power limit.


I think I will play around with the BIOS, just to be safe.

This BIOS? https://onedrive.live.com/?authkey=%21AJwt7bJPL03qgw0&cid=FABF1EAAEEBFA9D9&id=FABF1EAAEEBFA9D9%21154264&parId=FABF1EAAEEBFA9D9%21148675&action=locate


----------



## KedarWolf

Quote:


> Originally Posted by *Streetdragon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> You can try the resistor mod, it works on vertical cases.
> 
> Search about it in this thread, but it's properly attaching 47 Ohm resistors on three capacitors on the card that does the same as the shunt mod.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Or flash XOC BIOS, removes power limit.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i think i will play around with the bios. just to be safe in OP.
> 
> This bios ? https://onedrive.live.com/?authkey=%21AJwt7bJPL03qgw0&cid=FABF1EAAEEBFA9D9&id=FABF1EAAEEBFA9D9%21154264&parId=FABF1EAAEEBFA9D9%21148675&action=locate

Yes, if that's the one in OP in How To Flash A Different BIOS section.


----------



## Streetdragon

Quote:


> Originally Posted by *KedarWolf*
> 
> Yes, if that's the one in OP in How To Flash A Different BIOS section.


Interesting^^ just flashing a BIOS from a custom-PCB card onto the Founders. xD

Maybe I'll try the shunt mod: isolate it first with hot glue, apply the metal, and seal it with more glue. The metal won't move if I have the card vertical, or will it?


----------



## dansi

I doubt flashing the BIOS makes a difference, especially for FE cards. Anecdotally, I've noticed Nvidia hardcoded Pascal GPUs pretty much perfectly to the quality of the die. Has anyone turned a 2 GHz dog into a 2.1 GHz winner?


----------



## Aganor

Quote:


> Originally Posted by *dansi*
> 
> I doubt flashing bios makes a diff, especially for FE cards. Anecdotally, i noticed Nvidia had hardcoded Pascal gpus perfectly to the quality of the die. Has anyone turned a 2ghz dog to a 2.1ghz winner?


I tried to run Heaven at 2100 MHz, but it never got stable even at 1093 mV; it's stable at 2030 or so.


----------



## MrTOOSHORT

New Asus Strix 1080TI:




EK block and back plate incoming, in the mail. Will try the XOC bios when the block is installed.


----------



## smithsrt8

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> New Asus Strix 1080TI:
> 
> 
> 
> 
> EK block and back plate incoming, in the mail. Will try the XOC bios when the block is installed.


How much of a difference do you see between the Titan X you had and the 1080 Ti? I'm always curious to see real-world differences rather than the magazine spin... I went from a 970 laptop to a 1070 laptop to my current system, so I saw huge differences.


----------



## MrTOOSHORT

Still have the Titan X (Pascal); it seems like the same card as the 1080 Ti. Not much difference, but I only got the 1080 Ti yesterday.


----------



## stangflyer

Picked up a Zotac AMP Extreme 1080TI! 

Took out these.


----------



## B3L13V3R

I can haz a joyn to clubz...?

A week ago (has changed a little already due to adding water-cooled Aquaero and new fluid):



During post-build testing:



1080 Ti's with the old mobo before she said goodbye after three and a half years:


----------



## Dasboogieman

Quote:


> Originally Posted by *dansi*
> 
> I doubt flashing bios makes a diff, especially for FE cards. Anecdotally, i noticed Nvidia had hardcoded Pascal gpus perfectly to the quality of the die. Has anyone turned a 2ghz dog to a 2.1ghz winner?


I turned my 2025-2037 mediocre unit into a 2100mhz chip provided it stays under 35C.


----------



## smithsrt8

Quote:


> Originally Posted by *Dasboogieman*
> 
> I turned my 2025-2037 mediocre unit into a 2100mhz chip provided it stays under 35C.


I have noticed that just about all these cards will hit crazy performance levels when cold. The problem is that most water loops (in 20-25°C ambient rooms) will only cool the card to about 48-54°C under somewhat hard use. My house is a bit warmer right now, and I have yet to see anything over 55°C, but the best I can squeeze out of this card (a replacement from Gigabyte, since my last card was stuck at PCIe x4) is 2055 @ 1081 mV with the curve. But then again, for gaming it's just fine; I'm not looking to break any benchmark records.


----------



## mafia97160

Hello. My score with my Founders Edition on the XOC BIOS: 2190 MHz @ 1.200 V, memory 6107 MHz.
https://www.hostingpics.net/viewer.php?id=340406Screenshot1.png
http://www.3dmark.com/fs/12891722


----------



## dunbheagan

Quote:


> Originally Posted by *mafia97160*
> 
> hello..my score with my Founders Edition bios XOC 2190 MHz 1.200mv memory 6107 MHz
> https://www.hostingpics.net/viewer.php?id=340406Screenshot1.png
> http://www.3dmark.com/fs/12891722


Wow, 2190 MHz core, decent! What's your Superposition 4K score with these settings? You should exceed 11k points, I guess? What's your cooling?


----------



## stangflyer

Hi everyone! Going to change out the thermal paste on my new Zotac AMP Extreme 1080 Ti. The paste I am going to use is:

GELID Solutions GC-Extreme Thermal Compound. It came with my ASRock Z77 Formula OC motherboard. Will this be good to use? It is 4 years old but never opened. I know there are better options, but when this stuff came out it was pretty good.

Thoughts...


----------



## smithsrt8

I would think it would still be good... and yes, Gelid Extreme is good stuff. The upper-end compounds (like Thermal Grizzly/Gelid) are solid for cooling...


----------



## stangflyer

In regards to Firestorm, the OC software for Zotac cards: can I run AB at the same time as Firestorm? I assume I need Firestorm open to control the RGB lights and to make sure the 100%-fan on/off setting stays off? I have FS installed. Can I open AB, or do I have to close FS first? I have always heard that you are not supposed to have multiple OC programs running at the same time.

If I have FS installed, have set up the RGB lights the way I want them, and have turned off the 100% fan kicking in, can I then reboot and it will stay that way, even if I do NOT have it run on startup? Then I could just open AB.

*How do the people with an Amp Extreme use FS and AB at the same time??*

Thanks


----------



## Psykoking

Quote:


> Originally Posted by *stangflyer*
> 
> In regards to Firestorm the OC software for Zotac cards. Can I run AB at the same time as Firestorm? I assume I need firestorm open to control the rgb lights and to make sure the 100% fan on/off stays off? I have FS installed. Can I open AB or do I have to close FS first? I have always heard that you are not supposed to have multiple oc software on at the same time.
> 
> If I have FS installed and have setup the rgb lights the way I want them and also turned off the 100% fan turning on can I then reboot and it will stay that way? Even if I do NOT have it to be on at startup? Then I can just open AB.
> 
> *How do the people with an Amp Extreme use FS and AB at the same time??*
> 
> Thanks


Well, if you only need Firestorm for the RGB settings, you won't have problems, since the settings are stored on the card. If you want to use AB for OCing, make sure not to use Firestorm at the same time; I tried it and ended up crashing. So basically, just put in your settings for the fancy RGB lighting in Firestorm and make sure to untick "start with Windows"; then you won't have any problems.

Greetings


----------



## nrpeyton

yeeehaa!!

my new temps ;-)



Ambient Temp (in room): 28c

Coolant Temp in loop _(and dew point)_: 13c

GPU Idle Temp: 14/15c

Load Temp: about to find out


----------



## ZealotKi11er

Quote:


> Originally Posted by *mcg75*
> 
> If it's the EVGA hybrid, yes it fits just fine.


Thanks. Should I bother modding the plate or just use the AIO using stock plate?


----------



## c0nsistent

Card is ruined. Only displaying artifacts when it boots up... well, it was fun while it lasted and I've learned an expensive lesson.

Now I'll be waiting until Vega's release before buying either another 1080 Ti or Vega.


----------



## smithsrt8

Quote:


> Originally Posted by *c0nsistent*
> 
> Card is ruined. Only displaying artifacts when it boots up... well, it was fun while it lasted and I've learned an expensive lesson.
> 
> Now I'll be waiting until Vega's release before buying either another 1080 Ti or Vega.


Ouch...sorry to hear!


----------



## jase78

Quote:


> Originally Posted by *c0nsistent*
> 
> Card is ruined. Only displaying artifacts when it boots up... well, it was fun while it lasted and I've learned an expensive lesson.
> 
> Now I'll be waiting until Vega's release before buying either another 1080 Ti or Vega.


that sux man!! i would be livid! you gonna sell the card? if so how much?


----------



## Wag

I picked up a cheap Asus 1080 Ti turbo off Newegg and while it overclocks decently for a blower-style fan (174/425) adjusting the voltage doesn't seem to accomplish anything (yes, I unlocked it on Afterburner and even tried Asus' Tweak II software). Why is this? I've never had a videocard where adjusting the voltage didn't allow for a more stable overclock.

It's an interesting card. The only modified blower-style 1080 Ti. The fan is kind of cheap but obviously it works. Should be able to water cool it if I decide to do it later.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Wag*
> 
> I picked up a cheap Asus 1080 Ti turbo off Newegg and while it overclocks decently for a blower-style fan (174/425) adjusting the voltage doesn't seem to accomplish anything (yes, I unlocked it on Afterburner and even tried Asus' Tweak II software). Why is this? I've never had a videocard where adjusting the voltage didn't allow for a more stable overclock.
> 
> It's an interesting card. The only modified blower-style 1080 Ti. The fan is kind of cheap but obviously it works. Should be able to water cool it if I decide to do it later.


Voltage does nothing for these cards. It's all about power and temps. And by setting +174 on the curve, you are telling the card to use less voltage at a given clock (undervolting).


----------



## KraxKill

Quote:


> Originally Posted by *c0nsistent*
> 
> Card is ruined. Only displaying artifacts when it boots up... well, it was fun while it lasted and I've learned an expensive lesson.
> 
> Now I'll be waiting until Vega's release before buying either another 1080 Ti or Vega.


Are you sure? I would submerge the card in acetone (agitate the container a bit) and let it dry for 48h before giving up. I find it strange that it won't boot after removing the LM; there must be, or have been, some still left in there. You've now tried to power on the card, so there may be irreversible damage done, but it's worth a shot in case you simply didn't get all of it. If you got some underneath the mid plate, and even if you're not sure, you should remove the mid plate, get the PCB naked, and dip it to make sure you're not still shorting something. If you have a little Pyrex container or something similar, it may help to lightly use an old toothbrush to give the card a quick, gentle scrub around the socket while it's bathing. I wouldn't keep it in there all day, but I would not hesitate to submerge the card for a couple of minutes. Good luck.


----------



## c0nsistent

Quote:


> Originally Posted by *KraxKill*
> 
> Are you sure? I would submerge the card in acetone (agitate the container a bit) and let it dry for 48h before giving up. I find it strange that you won't boot after removing the LM. There must be/was some still left in there. You've now tried to power on the card so there may be irreversible damage done, but it's worth a shot just in case you simply didn't get all of it. If you got some underneath the mid plate, and even if you're not sure, you should remove the mid plate and get the PCB naked and dipped to make sure you're not simply shorting something still. If you have a little pyrex container or something similar, it may help to lightly use an old toothbrush to quickly give the card a light scrub around the socket while it's bathing. I wouldn't keep it in there all day, but would not hesitate to submerge the card for a couple of min. Good luck.


I'll give that a shot before getting rid of it. Typically with a 100% dead GPU you'll get sparks/smoke/something that tells you it went poof, but I never got that with this card. It boots up and then artifacts, instead of the 100% fan speed/power-off that most GPUs do when they're toasted.


----------



## Wag

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Voltage does nothing for these cards. Its all about power and temps. But setting +174 you are telling the card to use less voltage (undervolting).


I didn't know that. I'm coming from a 980 Ti so I'm used to adjusting the voltage. 174/425 @ 72C under load is pretty good considering it has a cheap fan.


----------



## mouacyk

Quote:


> Originally Posted by *Wag*
> 
> I didn't know that. I'm coming from a 980 Ti so I'm used to adjusting the voltage. 174/425 @ 72C under load is pretty good considering it has a cheap fan.


I wouldn't say nothing -- it's something for benchers. For my card with 71.5 ASIC, going from stock volt of 1.1875v to 1.274v allowed going from ~1442MHz to 1480 for >4 hours of bf1 stability and 1519MHz for benches.


----------



## RavageTheEarth

Quote:


> Originally Posted by *B3L13V3R*
> 
> I can haz a joyn to clubz...?
> 
> A week ago (has changed a little already due to adding water-cooled Aquaero and new fluid):
> 
> 
> 
> During post-build testing:
> 
> 
> 
> 1080 Ti's with the old mobo before she said goodbye after three and a half years:


Looks great. What case is that?


----------



## done12many2

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Looks great. What case is that?


Looks like a Caselabs SM8 with a mix of black and white parts. Very nice.


----------



## FUZZFrrek

That's about right!


----------



## B3L13V3R

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Looks great. What case is that?


Quote:


> Originally Posted by *done12many2*
> 
> Looks like a Caselabs SM8 with a mix of black and white parts. Very nice.


Thank you both! Indeed it is an SM8 with back interior / white panels.





Always a WIP of course. This forum keeps me learning new things always.


----------



## rjeftw

Quote:


> Originally Posted by *Dasboogieman*
> 
> If you're gonna watercool, you're stuck with the FE, Aorus, Strix, Gaming or FTW3. All of which are fine but there are some minor quirks.
> 
> 1. The FE has the best waterblock compatibility and BIOS flexibility
> 2. The Strix has unhindered operation with the XOC BIOS for overvolting
> 3. the Aorus has one of the highest non-XOC BIOS Power Limits of the watercoolable AIB cards, which means very very minimal power throttling.
> 4. The FTW3 has a decent Power limit ceiling but also hands down the best monitoring. It has thermocouples that can measure VRM + VRAM temps which no other card can do.


Ended up ordering an Aorus Extreme today from the EGG. Looking forward to this upgrade!


----------



## dansi

Quote:


> Originally Posted by *Dasboogieman*
> 
> I turned my 2025-2037 mediocre unit into a 2100mhz chip provided it stays under 35C.


I doubt you were able to tune anything. The nature of Pascal is that the card determines which die-quality level it falls into and runs at Nvidia's hardcoded speeds for that level. Keeping temps as low as possible enables higher boost bins; adding voltage and testing BIOSes has no impact.

Below 35c, I think your card will boost 2-3 bins above your set max clocks.
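As a rough illustration of the temperature-binning behavior described above, here is a minimal sketch. The ~13 MHz bin size and the temperature thresholds are commonly reported approximations, not Nvidia-published numbers:

```python
# Illustrative model of GPU Boost 3.0 temperature binning (NOT official numbers):
# assume the card sheds one ~13 MHz boost bin for every 5 C step above ~37 C.
BIN_MHZ = 13          # approximate size of one Pascal boost bin
STEP_C = 5            # assumed temperature step per bin
BASE_TEMP_C = 37      # assumed first throttle threshold

def effective_clock(max_boost_mhz: int, temp_c: int) -> int:
    """Estimate the sustained clock at a given core temperature."""
    if temp_c <= BASE_TEMP_C:
        return max_boost_mhz
    bins_lost = (temp_c - BASE_TEMP_C) // STEP_C + 1
    return max_boost_mhz - bins_lost * BIN_MHZ

print(effective_clock(2037, 30))  # cool card keeps its top bin -> 2037
print(effective_clock(2037, 55))  # warmer card drops several bins -> 1985
```

This is why water and chillers "tune" Pascal so well: the curve of bins is fixed, and temperature decides where on it you sit.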


----------



## nrpeyton

My card is vertical. And I have Liquid Metal on 3 shunts. And the LM stays where it's meant to. It hasn't moved at all.

Liquid Metal loves Liquid Metal (i.e. it sticks to itself).

It should only move if it's physically agitated.

I do have it surrounded with a little black electrical tape just in case -- the liquid kind, otherwise known as Liquid Electrical Tape, which dries into normal tape and can be peeled off like normal tape.

In other words, you don't necessarily need to keep your card horizontal to do a Shunt mod.
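For anyone wondering why liquid metal on the shunts raises the effective power limit: the card computes power from the sense voltage across each shunt divided by an assumed shunt resistance, so lowering the effective resistance makes it under-report draw. A sketch of the arithmetic (the 5 milliohm stock value is a common assumption for these boards, not a measured figure):

```python
# Shunt-mod arithmetic sketch. The 5 milliohm shunt value is a commonly
# assumed figure for Pascal boards, not measured from this card.
R_STOCK = 0.005      # ohms, resistance the power controller assumes
R_MODDED = 0.0025    # ohms, effective resistance with liquid metal bridging

def reported_power(true_power_w: float) -> float:
    """Power the controller reports after the mod.

    The sense voltage scales with the real (modded) resistance, but the
    controller still divides by the stock value, so readings shrink by
    the ratio R_MODDED / R_STOCK.
    """
    return true_power_w * (R_MODDED / R_STOCK)

# With the effective resistance halved, a true 300 W draw reads as 150 W,
# so the firmware power limit stops being the constraint.
print(reported_power(300.0))  # -> 150.0
```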


----------



## Dasboogieman

Quote:


> Originally Posted by *dansi*
> 
> I doubt you were able to tune anything. The nature of Pascal is to find out which levels your die quality is at and run it at Nvidia hardcoded speeds for that level. Keeping temps as low as possible enable higher boost bin. Adding voltages and testing bios have no impact.
> 
> At below 35c, i think your card will boost 2-3 bins above your set max clocks.


Yes, there was tuning involved, just in a different way. You can't simply let it boost 2-3 bins by itself, because that would either crash outright or fail to reach the optimal speed for a given temperature. The trick was tweaking the secondary and tertiary boost bins: at low temps the card gets very overzealous with the boosting if left to its own devices.

There are also strange stability anomalies at low temperatures and 2088MHz+ speeds. Certain bins won't crash outright, but will stutter, soft crash, or regress in performance. This has to be balanced against temperature, because as little as 4-5c can make a big difference.


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> My card is vertical. And I have Liquid Metal on 3 shunts. And the LM stays where it's meant to. It hasn't moved at all.
> 
> Liquid Metal loves Liquid Metal (I.E. it sticks to it's self).
> 
> It should only move if it's physically agitated.
> 
> I do have it surrounded in a little Black Electrical Tape just in case. (the liquid kind -- otherwise known as Liquid Electrical Tape) which dries into normal tape and can be peeled off like normal tape).
> 
> In other words, you don't necessarily need to keep your card horizontal to do a Shunt mod.


What liquid metal did you use?


----------



## mcg75

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Thanks. Should I bother modding the plate or just use the AIO using stock plate?


You don't have to mod anything to make it work.

I used Steve's guide from Gamers Nexus to put it on a 1080. Swapped it to my 1080 Ti a few months ago, no issues.

I did have to remove everything from the 1080 Ti board to make it work. There is one screw underneath that the 1080 didn't have and without it coming out, the wiring for power was very hard to route.


----------



## mechwarrior

Anyone tried the new AB 4.4 beta 10? It says it supports Boost 3.0?


----------



## KedarWolf

Quote:


> Originally Posted by *mechwarrior*
> 
> anyone tried the new AB 4.4 beta 10? says it supports boost 3??


Been using beta 10 since it was released, no issues. It solves some cards' fan issues as well.


----------



## ZealotKi11er

Quote:


> Originally Posted by *mcg75*
> 
> You don't have to mod anything to make it work.
> 
> I used Steve's guide from Gamers Nexus to put it on a 1080. Swapped it to my 1080 Ti a few months ago, no issues.
> 
> I did have to remove everything from the 1080 Ti board to make it work. There is one screw underneath that the 1080 didn't have and without it coming out, the wiring for power was very hard to route.


Yeah I looked at GN mod. I was trying to see if I should bother with the VRM/RAM cooler to make it look cleaner. Also what kind of temps are you getting?


----------



## dansi

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Thanks. Should I bother modding the plate or just use the AIO using stock plate?


Use the AIO, leave the stock plate as it is, and use fine tweezers to unplug/plug the power cable. A 90-minute job at most.


----------



## DarkrReign2049

It looks like the EVGA FTW3 EK waterblock is up for pre-order if anyone has been waiting. Shipping next Tuesday the 27th.

https://www.ekwb.com/news/ek-releasing-full-cover-water-blocks-evga-geforce-gtx-1080-ti-ftw3/


----------



## kevindd992002

Quote:


> Originally Posted by *DarkrReign2049*
> 
> It looks like the EVGA FTW3 EK waterblock is up for pre-order if anyone has been waiting. Shipping next Tuesday the 27th.
> 
> https://www.ekwb.com/news/ek-releasing-full-cover-water-blocks-evga-geforce-gtx-1080-ti-ftw3/


Does the EK Webshop ever go on sale for WB's?


----------



## Benny89

Mah STRIX will be swimming this weekend







. Finally. Nervous as this is my first water loop but excited as little brat!


----------



## vm007

Hello all,

Newbie here with same stuff like most.

Gtx 1080ti founder's edition
i7 7700k liquid cooled (stock)
850w psu
16gb ram 2400 mhz

500gb SSD and 2 tb storage.

Basically Alienware R6 with Dell U3415W monitor.

Looking to overclock soon and starting with reading number 1 thread will report back.

Edit: come to think of it, I'm maxing out my monitor's fps in The Witcher 3 and MEA, so I won't overclock, lol. No point; just cooling, perhaps. I've noticed the Founders Edition runs hotter than others.


----------



## TheBoom

Quote:


> Originally Posted by *kfxsti*
> 
> 3 8pins lol. And pure sexiness lol


Sadly it doesn't do much for Pascal, unless you have an exclusive unlocked BIOS and LN2 cooling.
Quote:


> Originally Posted by *c0nsistent*
> 
> Card is ruined. Only displaying artifacts when it boots up... well, it was fun while it lasted and I've learned an expensive lesson.
> 
> Now I'll be waiting until Vega's release before buying either another 1080 Ti or Vega.


Sorry to hear that. I was about to order some Conductonaut, but your post made me change my mind. I don't think it's worth the risk for a few degrees less.
Quote:


> Originally Posted by *vm007*
> 
> Hello all,
> 
> Newbie here with same stuff like most.
> 
> Gtx 1080ti founder's edition
> i7 7700k liquid cooled (stock)
> 850w psu
> 16gb ram 2400 mhz
> 
> 500gb SSD and 2 tb storage.
> 
> Basically Alienware R6 with Dell U3415W monitor.
> 
> Looking to overclock soon and starting with reading number 1 thread will report back.
> 
> Edit: come to think of it, I'm maxing out monitor fps on my witcher 3 and MEA - I won't overclock lol. No point, just cooling perhaps. I've noticed founder edition runs hotter than others


At 1440p 60fps, a 1070 would mostly have done the trick. So far the most demanding game I have at hand is ME:A, and the 1080 Ti does not disappoint at 165Hz.


----------



## TheBoom

Double.


----------



## DarkrReign2049

Quote:


> Originally Posted by *kevindd992002*
> 
> Does the EK Webshop ever go on sale for WB's?


If they do I have not seen any recently.


----------



## vm007

Quote:


> Originally Posted by *TheBoom*
> 
> Sadly doesn't do much for pascal. Unless you have an exclusive unlocked bios and LN2 cooling.
> Sorry to hear that. I was about to order some conductonaut but your post made me change my mind. Don't think it's worth the risk for a few degrees less.
> At 1440p 60fps a 1070 would have mostly done the trick. So far the most demanding game I have at hand is ME:A and the 1080 Ti does not disappoint at 165hz.


I can overclock the monitor to 75Hz, but the issue with 4K was that it just doesn't let me read comfortably. I was going to get 4K, but after numerous reviews and whatnot I ended up with this.

I'm so grateful I didn't get 4K; it would've been a hassle for me, and as it is, videos on YouTube and whatnot don't play well either. I'd have to use an extension, but that widens people as well, lol.

I read a lot and use the screen a LOT; the continuous squinting would've given me a headache.


----------



## mcg75

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah I looked at GN mod. I was trying to see if I should bother with the VRM/RAM cooler to make it look cleaner. Also what kind of temps are you getting?


Ran Heaven for 30 minutes and max temp was 40c.

Gaming usually less since it's not usually pegged at 99% usage.


----------



## ilikelobstastoo

Quote:


> Originally Posted by *nrpeyton*
> 
> My card is vertical. And I have Liquid Metal on 3 shunts. And the LM stays where it's meant to. It hasn't moved at all.
> 
> Liquid Metal loves Liquid Metal (I.E. it sticks to it's self).
> 
> It should only move if it's physically agitated.
> 
> I do have it surrounded in a little Black Electrical Tape just in case. (the liquid kind -- otherwise known as Liquid Electrical Tape) which dries into normal tape and can be peeled off like normal tape).
> 
> In other words, you don't necessarily need to keep your card horizontal to do a Shunt mod.


Any new scores, Superposition 4K in particular, since going sub-ambient?

Also, I was hoping you would do some testing on thermal pad thicknesses and temps like you did with the 1080, so I can copy, lol.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Benny89*
> 
> Mah STRIX will be swimming this weekend
> 
> 
> 
> 
> 
> 
> 
> . Finally. Nervous as this is my first water loop but excited as little brat!


Congrats, and don't be nervous. It's really not too bad.

Mine will be under a block too this weekend!


----------



## smithsrt8

Quote:


> Originally Posted by *Benny89*
> 
> Mah STRIX will be swimming this weekend
> 
> 
> 
> 
> 
> 
> 
> . Finally. Nervous as this is my first water loop but excited as little brat!


Don't forget to install the plugs into the ports you are not using on your GPU block... more than one person has made that mistake... and it's also on YouTube...


----------



## kevindd992002

So I tried playing GOW4 at 1080p with an EVGA GTX 1080 Ti FTW3. The CPU utilization maxes out at around 85% average, and GPU utilization is also just around 85%. Does that mean there is no bottleneck from either the CPU or the GPU?


----------



## nrpeyton

Quote:


> Originally Posted by *kevindd992002*
> 
> So I tried playing GOW4 at 1080p with an EVGA GTX 1080Ti FTW3. THE CPU utilization maxes out to around 85% average but GPU utilization is just around 85%. Does that mean there is no bottleneck from both CPU and GPU?


What about "Max CPU Thread Usage"?

If one or more threads are pegged at 100% while others are only at 10-20%, that could also be why you're only seeing 85% CPU usage, i.e. the threads the game is actually using are maxed out while the others sit doing nothing.

Another culprit could be the frame rate limiter being left on in RivaTuner _(don't know if you use RivaTuner, but many ppl here do)._
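The check described above can be done on paper from the per-thread numbers a tool like HWMonitor shows. A minimal sketch of the logic (the 95% threshold is an arbitrary illustrative cutoff):

```python
# Given per-thread usage percentages (e.g. read off HWMonitor), decide
# whether a modest overall average is hiding a saturated game thread.
# The 95% "pegged" threshold is an illustrative choice, not a standard.
def diagnose(thread_usage):
    overall = sum(thread_usage) / len(thread_usage)
    pegged = [i for i, u in enumerate(thread_usage) if u >= 95.0]
    if pegged:
        return f"overall {overall:.0f}% but thread(s) {pegged} pegged: CPU-bound on those threads"
    return f"overall {overall:.0f}%, no thread pegged: likely not CPU thread-bound"

# Eight threads, one maxed: the average looks fine, but thread 0 is the limit.
print(diagnose([100, 40, 35, 30, 25, 20, 15, 10]))
print(diagnose([85, 80, 82, 78, 84, 80, 79, 81]))
```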


----------



## nrpeyton

Quote:


> Originally Posted by *ilikelobstastoo*
> 
> Anything new scores supo 4k in particular since going sub ambient ?
> 
> Also was hoping you would do some testing on thermal pad thicknesses and temps like u did with 1080 so I can copy lol.


Just hooked the chiller up yesterday/today.

Reworking my CPU overclock first. Then planning to begin looking at the GPU.

Not sure if it's going to be quite as glorious as my adventures on my last card (due to no solution for voltage on FE).

But I'll certainly be doing the pad testing. (as you mentioned).









Due to having _really_ lost the silicon lottery on my 7700k, this is taking a lot longer than I'd hoped. I'm currently at 1.450v and stress testing as I type. I've been slowly working my way up from 1.35v, trying to get a stable 5GHz on a non-delidded chip, but with only 14c coolant I'd have expected to have been doing much better than this.

Omg: just failed again, this time at 18 of 60 minutes, at 1.455v with 14c coolant. CPU temps weren't even above 65c. Argh, I'm about to give up. Seriously, this chip is such a disappointment. I can get 4.9 easily too, _even_ without the chiller.


----------



## keikei

Quote:


> Originally Posted by *kevindd992002*
> 
> So I tried playing GOW4 at 1080p with an EVGA GTX 1080Ti FTW3. THE CPU utilization maxes out to around 85% average but GPU utilization is just around 85%. Does that mean there is no bottleneck from both CPU and GPU?


Silly question, but did you uncap the frame settings?


----------



## 12Cores

I just upgraded to a 1080 Ti from a crossfire 390X setup. I was expecting a sidegrade at best and a whole lot less heat, but this thing is legitimately faster than my old crossfire setup. It's almost like magic: running at 2,050MHz at 40 degrees under water while mauling AAA games at 4K is plain crazy. This is a whole lot of card, more than I was expecting for sure.


----------



## vm007

People are mining with the 1080 Ti. I wonder how long a card would last at 99% usage with the fan at 85%, overclocked, with temps between 72-78c, running 24/7 at full boost speeds and more.

Lol, that'd be a good test instead of a benchmark, using a miner -- although there's no video load and very little memory load.


----------



## vm007

Just wondering, it's off topic, but on the i7 7700k, are you using Intel XTU?

I cannot get it to stabilize either; I keep messing up one thing or another.

I have an Alienware R6 with liquid cooling and, so far, zero overclocking done on the CPU, because I keep messing it up and they don't provide any utility to do it.

The BIOS has a level 2 overclock option at 4.7GHz and 1250mV, but when I run Prime95 I keep getting thermally throttled.


----------



## MaXGTS

My upgrade path the last few years:

EVGA GTX 275 x 2 (SLI) - 9/9/2009
EVGA GTX 570 x 2 (SLI) - 11/30/2011
EVGA GTX 980Ti FTW - 10/8/2015
EVGA GTX 1080Ti SC2 - 6/5/2017

This 1080Ti is a beast of a card. Incredible performance. It's a perfect match with my Dell S2716DG monitor (1440p/144Hz/GSync).


----------



## numpsy

Mine looks very similar when all red; I change the color depending on my mood.










OH and I have the same monitor!


----------



## smithsrt8

Quote:


> Originally Posted by *nrpeyton*
> 
> Just hooked the chiller up yesterday/today.
> 
> Reworking my CPU overclock first. Then planning to begin looking at the GPU.
> 
> Not sure if it's going to be quite as glorious as my adventures on my last card (due to no solution for voltage on FE).
> 
> But I'll certainly be doing the pad testing. (as you mentioned).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Due to having _really_ lost the silicon lottery on my 7700k this is taking a lot longer than I'd hoped for. I'm currently at 1.450v and stress testing as I type. Been slowly working my way up from 1.35v trying to get a stable 5Ghz on a non-delidded chip. But with only 14c coolant temp I'd have expected to of been doing much better than this.
> 
> Omg: Just failed again. This time at 18 of 60 minutes. At 1.455v at 14c coolant temp. CPU temps weren't even above 65c. Argh I'm about to give up. Seriously. This chip is such a disappointment. I can get 4.9 easily too. _Even_ without the Chiller.


You really need to delid it... if for some reason it doesn't work, hit me up. I have my "old" (2 months old) 7700k, delidded (5GHz @ 1.35-1.36v stable), that I will sell you; I have a Silicon Lottery chip that I just purchased. The only thing with this processor is that I need to reapply the TIM, because my temps were touching the high 70s on water.


----------



## smithsrt8

I like your guys' taste...I am rocking Dual S2716DG's


----------



## TheBoom

Quote:


> Originally Posted by *smithsrt8*
> 
> You really need to delid it...for some reason if it doesnt work hit me up...I have my "old" (2 months old) 7700k delidded (5ghz @ 1.35-1.36 stable that I will sell you...I have a Silicon Lottery that I just purchased...the only thing on this processor is I need to reapply TIM because my temps were touching high 70's on water


Yeah, delidding makes a whole lot of difference. I thought I had an average 6700k until I delidded it last week.

Now that temps are out of the way, this thing actually does 4.9 at 1.39v and 5.0 at 1.46v. And that's with cache at -100 and mem at 3240.


----------



## Derko1

Just curious if anyone went from 1080 SLI to 1080 Ti SLI and thinks it's worth it? Money isn't a problem; I'm just curious if the upgrade is considerable.


----------



## crazyxelite

I also killed my Palit 1080 Ti with liquid metal. I did my 7700k fine, but the GPU died, I think: the LEDs turn on but after 5 seconds turn off. I'd only had it for a week.


----------



## DrFreeman35

I'm wondering if anyone has had experience with the EVGA SC Black Edition? I have put both my upgrade requests in, since I couldn't resist at $50/card. Coming from SLI FTW2's, I'm not sure if I should SLI again or just sell one of the Ti's. I play at 3440x1440, btw.


----------



## TheBoom

Quote:


> Originally Posted by *crazyxelite*
> 
> I also killed my palit 1080 ti with liquid metal, I did on my 7700k fine but the GPU died I think. the LEDs turn on but after 5 sec turns off. I just had it for a week.


How much temp reduction are you getting on the CPU? I have LM between the die and the IHS but not sure if I should put it on the surface itself.


----------



## crazyxelite

I was hitting 75c with a 240mm rad and two Noctua fans at 80% speed; now it's hitting around 67c at 50% speed. Very good, I think. I did put some outside the die as well, but now that my GPU has died I'm a bit scared about my CPU. I would say use normal TIM.

Edit: forgot to mention I'm at 5GHz at 1.385v. It's the best my chip can do; I wanted better, but hey, it could be worse.


----------



## AsusFan30

Can I join?


----------



## Aganor

Quote:


> Originally Posted by *AsusFan30*
> 
> Can I join?


Hopefully you won't need to remove the CPU for the time being. It would be such a pain.


----------



## CptSpig

Quote:


> Originally Posted by *AsusFan30*
> 
> Can I join?


Rockin the MSI!


----------



## GraphicsWhore

Quote:


> Originally Posted by *TheBoom*
> 
> How much temp reduction are you getting on the CPU? I have LM between the die and the IHS but not sure if I should put it on the surface itself.


My reduction was 10 or so degrees at load (on air; I moved to water shortly after).

You do not need LM on the IHS. Use regular TIM.


----------



## GanGstaOne

Quote:


> Originally Posted by *crazyxelite*
> 
> I also killed my palit 1080 ti with liquid metal, I did on my 7700k fine but the GPU died I think. the LEDs turn on but after 5 sec turns off. I just had it for a week.


Did you try removing the LM from the card to see if it runs? My friend had the same problem, and when I removed all the LM from the GPU the card was fine. That was almost a year ago, and that 1080 is still working.


----------



## kevindd992002

Quote:


> Originally Posted by *nrpeyton*
> 
> What about "Max CPU Thread Usage"?
> 
> If one or more threads are pegged at 100% while others are only at 10-20% that could also be why you're only seeing 85% CPU usage. I.E. the threads the game is actually using are maxed out at 100%. And others are sitting doing nothing.
> 
> Another culript could be if the 'frame rate limiter' has been left on in Riva Tuner. (don't know if you use Riva Tuner, but many ppl here do).


CPU thread utilization was what I was monitoring, for the 8 CPU threads displayed in HWMonitor. None of them were at 100%; 85% is the max I've seen among all of the threads.

And no, I don't use RivaTuner and the in-game frame limiter is set to unlimited.

Quote:


> Originally Posted by *keikei*
> 
> Silly question, but did you uncap the frame settings?


Same answer as above 

@ All

Can you install EVGA Precision XOC and MSI Afterburner at the same time on one system without causing any conflicts? I prefer MSI Afterburner over EVGA Precision XOC, but only Precision displays the additional temp sensors of the EVGA GTX 1080 Ti FTW3, so my plan is to keep it installed (in case I ever need to check the other sensors or change LED colors) but use MSI Afterburner as my main OC software.


----------



## luan87us

What's up, everyone? I just sold my Asus Strix 1070 OC edition to upgrade to a 1080 Ti. However, I'm faced with many options and would like to hear more about them from the experienced users in this thread, to maybe help me decide which 1080 Ti to go with. I'm currently looking at the Aorus Extreme, EVGA SC Black Edition, EVGA SC2, and maybe the Asus Strix OC edition.

I am planning to put it under a water loop, but that won't be for a while (at least a year or two, when I need to upgrade my CPU cooling). I most likely will not be manually overclocking the card, as I don't normally care for benchmark numbers as long as the card can run the games I play; plus, I don't have much experience with overclocking graphics cards. I mostly play competitive FPS games, but would like to run BF1 and some upcoming AAA titles in 4K on my 4K 60Hz monitor. I've been reading a lot of reviews of the cards above, but would love to hear some more personal experiences from you guys, especially about the coil whine issues some of these may have (Asus Strix, EVGA SC2).

Thanks in advance!

PS: ATM I'm leaning toward the EVGA SC Black edition because of the value/performance and EVGA customer support. The Aorus Extreme is a very close second, but I had bad luck with a Gigabyte graphics card in the past.


----------



## Rollergold

If you want to water cool, EK makes blocks for all of your picks except the SC2, though there are reports that the Aorus Extreme PCB may have had a revision that makes the EK block incompatible:
http://www.overclock.net/t/1627238/gigabyte-aorus-geforce-gtx-1080-ti-owners-thread/1440

Also, since you want to water cool: EVGA still honors warranties even if you remove the cooler, something I don't think Asus and Gigabyte offer, if that matters to you.

The SC Black Edition is, I think, the best choice you noted: a Founders Edition PCB with an aftermarket cooler. If you want more bling, EVGA's FTW3 card has RGB lighting, more temp monitoring points, and more and better power delivery parts, and EK's block for it will be shipping next week on the 27th.


----------



## luan87us

Quote:


> Originally Posted by *Rollergold*
> 
> If you want to water cool EK makes blocks for all of your picks execpt sc 2 but there mentions that Aorus Extreme pcb might be had a change to it making the EK block incompatible
> http://www.overclock.net/t/1627238/gigabyte-aorus-geforce-gtx-1080-ti-owners-thread/1440
> 
> Also since you want to water cool Evga still honors warranties even if remove the cooler something I don't think asus and gigabyte offer if that matters to you.
> 
> The Sc black edition I think is the best choice you noted founders edition pcb aftermarket cooler. If you want more bling evga's FTW3 card has rgb lighting more temp monitoring points more and better power delivery parts and eks block for the card will be shipping next week on the 27th


Thank for the advice. I'm not 100% on the water cool yet and even then it won't be for another year or later before I will do it. I did see the stuff about the Aorus extreme having fitment issue. Good to know EVGA will still honor warranty with 3rd party cooling solution. That might actually just made a lot of impact on my decision. Plus EVGA step up program sounds amazing for future upgrades.


----------



## crazyxelite

Quote:


> Originally Posted by *GanGstaOne*
> 
> did you try removing the LM from the card to see if it runs cause my friend had same problem and when i removed all the LM from the GPU the card was fine this was almost a year ago and that 1080 is still working


I already sent it back; I hope I get refunded. I think I'm going to order the Gigabyte this time. Thanks anyway.


----------



## roberta507

Quote:


> Originally Posted by *luan87us*
> 
> Thank for the advice. I'm not 100% on the water cool yet and even then it won't be for another year or later before I will do it. I did see the stuff about the Aorus extreme having fitment issue. Good to know EVGA will still honor warranty with 3rd party cooling solution. That might actually just made a lot of impact on my decision. Plus EVGA step up program sounds amazing for future upgrades.


I went back to EVGA after returning a different third-party card.
A customer service rep shared some numbers on the return percentages of the major brands they sold.
EVGA had the lowest return rate, so I could surmise better quality controls are in place.


----------



## nrpeyton

I _wanted_ to do some *sub-ambient benching* tonight on the *1080 Ti* (as I now have the *Water Chiller* reconnected to my loop).

But unfortunately it's been raining heavily all afternoon here, so it's very humid (i.e. the dew point is like 17°C, which is horrible for condensation). I'm also not feeling very well.

Hopefully tomorrow is a better day. If I can get a day where the dew point is about 5°C again, that would be absolutely perfect.

Wonder if I can get to 2.2GHz.









*Today*


*Ideal Condition would be:*


Not played with it since my last card _(1080 Classified)._

Has anyone watched the 'Gamers Nexus' VRM & PCB analysis of the new 1080 Ti Kingpin card*??*






What do you guys think*?* The thought of owning this has me frothing at the mouth lol.

*Most interestingly, it features digital voltage controllers that can be controlled via software, for core, memory & even LLC settings* (watch the video for more info).


----------



## kevindd992002

@nrpeyton

Any comments on my reply above, mate?


----------



## ScottFern

I just bought a 1080 Ti reference card off another forum and I am worried about fan noise. Has anyone with a reference card had luck with just limiting the fan speed with a custom fan curve and call it a day? I am running 3440x1440 and think the card isn't going to need every last bit of clock at this resolution.


----------



## nrpeyton

Quote:


> Originally Posted by *kevindd992002*
> 
> @nrpeyton
> 
> Any comments on my reply above, mate?


Okay -- I also play Gears of War 4.

But I play at 4k.

Let me switch to 1080p and see what happens -- then I'll report back (and let you know if I have a similar problem).

Give me 5 mins bro. brb


----------



## nrpeyton

Quote:


> Originally Posted by *kevindd992002*
> 
> @nrpeyton
> 
> Any comments on my reply above, mate?


Okay, I think I *might* have found your problem.

V-SYNC *Enabled*:

right click and 'open new tab' to view *FULL SIZE*


V-SYNC *Disabled*:

right click and 'open new tab' to view *FULL SIZE*


Notice how, with v-sync enabled, my GPU is only at 34% and CPU usage only at 19%?

But with v-sync disabled (and frames at 180+), the GPU is pegged at 97% and the CPU at nearly 50%, with max single-thread utilization of 63%.

If you right click the screenshots and select 'view in new tab' you'll be able to see them at full size and you'll be able to make out the CPU/GPU statistics showing in real time (when I took the screenshots).

Could v-sync be your problem? Or refresh rate? Or maybe G-Sync? etc.

We've already ruled out RivaTuner's max frame rate setting, so I can't think what else would be causing it.

*So in answer to your original question:* I wouldn't say it's a bottleneck (especially if you have v-sync etc. enabled). It's just the fact that the GPU & CPU both don't need all their power to push out the frames required for 1080p.

*Another way you could check* is to download GPU-Z and watch the 'PerfCap Reason' field during gameplay. If it says "utilisation", that would also confirm my hypothesis.
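The effect above can be sanity-checked with quick arithmetic: under a frame cap, utilization is roughly the capped frame rate divided by what the card could deliver uncapped. A minimal sketch (the function name is mine and the 180 fps figure is from my run above; treat it as an illustration, not a profiler):

```python
def capped_utilization(uncapped_fps: float, cap_fps: float) -> float:
    """Rough GPU utilization when v-sync (or a limiter) caps the frame rate.

    The GPU needs 1/uncapped_fps seconds of work per frame but gets a
    budget of 1/cap_fps seconds, so utilization ~ cap_fps / uncapped_fps.
    """
    if cap_fps >= uncapped_fps:
        return 1.0  # cap is above what the card can deliver: fully loaded
    return cap_fps / uncapped_fps

# A card pushing ~180 fps uncapped, v-synced to a 60 Hz display:
print(round(capped_utilization(180, 60), 2))  # 0.33 -> roughly a third loaded
```

Which lines up with the ~34% GPU usage in the v-sync-on screenshot.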


----------



## DerComissar

Quote:


> Originally Posted by *nrpeyton*
> 
> 
> 
> Spoiler: Spoiler!
> 
> 
> 
> I _wanted_ to do some *sub-ambient benching* tonight on the *1080 Ti* (as I now have the *Water Chiller* reconnected to my loop).
> 
> But unfortunately it's been raining heavily all afternoon here so it's very humid. (I.E. the dew point is like 17 degrees C which is horrible for condensation). I'm also not feeling very well..
> 
> Hopefully have a better day tomorrow. If I can get a day where the dew point is about 5c again that would be absolutely perfect
> 
> Wonder if I can get to 2.2Ghz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Today*
> 
> 
> *Ideal Condition would be:*
> 
> 
> Not played with it since my last card _(1080 Classified)._
> 
> 
> 
> Anyone watched the 'Gamers Nexus' VRM & PCB analysis for the new 1080TI Kingpin card*??*
> 
> 
> 
> 
> 
> 
> What do you guys think*?* The thought of owning this has me frothing at the mouth lol.
> 
> *Most interestingly -- it features digital voltage controllers that can be controlled via software!! for core, memory & even LLC settings
> 
> 
> 
> 
> 
> 
> 
> * (watch the video for more info)


I *Think* that you're the kinda guy that could utilize a KPE, and have a lot of fun with it.

I had a 980Ti KPE, I recall being told by one forum member here that it was a waste of money if not running it sub-zero.
I don't totally agree with that logic regarding the KPE, Lightning, etc., but certainly the KPE is made for extreme cooling.

Even if you don't bolt an LN2 pot to it, you could still enjoy running it with your chiller setup.

So do continue to froth at the mouth, lol.


----------



## nrpeyton

Quote:


> Originally Posted by *DerComissar*
> 
> I *Think* that you're the kinda guy that could utilize a KPE, and have a lot of fun with it.
> 
> I had a 980Ti KPE, I recall being told by one forum member here that it was a waste of money if not running it sub-zero.
> I don't totally agree with that logic regarding the KPE, Lightning, etc., but certainly the KPE is made for extreme cooling.
> 
> Even if you don't bolt an LN2 pot to it, you could still enjoy running it with your chiller setup.
> 
> So do continue to froth at the mouth, lol.


lol 

Temperature difference between my coolant and GPU core is *11°C* (while drawing about 300 watts on the GPU at 99% utilisation).

Anyone seeing less than that? _(If so I'd be interested in your method)._
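For anyone comparing numbers: dividing the coolant-to-core delta by the power draw gives the block + TIM's effective thermal resistance, which is a fairer comparison than raw delta T when cards draw different wattages. A quick sketch using my figures above:

```python
def thermal_resistance(delta_t_c: float, power_w: float) -> float:
    """Effective block + TIM thermal resistance in degrees C per watt."""
    return delta_t_c / power_w

# 11 C coolant-to-core delta while drawing ~300 W:
print(round(thermal_resistance(11, 300), 4))  # 0.0367 C/W
```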


----------



## kevindd992002

Quote:


> Spoiler: Warning: Spoiler!
> 
> 
> 
> Originally Posted by *nrpeyton*
> 
> Okay, I think I *might* have found your problem.
> 
> V-SYNC *Enabled*:
> 
> right click and 'open new tab' to view *FULL SIZE*
> 
> 
> V-SYNC *Disabled*:
> 
> right click and 'open new tab' to view *FULL SIZE*
> 
> 
> Notice how, with v-sync enabled, my GPU is only at 34% and CPU usage only at 19%?
> 
> But with v-sync disabled (and frames at 180+), the GPU is pegged at 97% and the CPU at nearly 50%, with max single-thread utilization of 63%.
> 
> If you right click the screenshots and select 'view in new tab' you'll be able to see them at full size and you'll be able to make out the CPU/GPU statistics showing in real time (when I took the screenshots).
> 
> Could v-sync be your problem? Or refresh rate? Or maybe G-Sync? etc.
> 
> We've already ruled out RivaTuner's max frame rate setting, so I can't think what else would be causing it.
> 
> *So in answer to your original question:* I wouldn't say it's a bottleneck (especially if you have v-sync etc. enabled). It's just the fact that the GPU & CPU both don't need all their power to push out the frames required for 1080p.
> 
> *Another way you could check* is to download GPU-Z and watch the 'PerfCap Reason' field during gameplay. If it says "utilisation", that would also confirm my hypothesis.


I should've mentioned: V-sync is totally OFF, as I understand what it will do to utilization if it's turned ON and the framerate gets past the refresh rate.

What do you use to display OSD information in GOW4? I'll monitor perfcap with GPU-Z to check.


----------



## Lefty23

Quote:


> Originally Posted by *kevindd992002*
> 
> @ All
> 
> Can you install EVGA Precision XOC and MSI Afterburner at the same time on one system without causing any conflicts? I prefer MSI Afterburner over EVGA Precision XOC, but only Precision displays the additional temp sensors of the EVGA GTX 1080 Ti FTW3, so my plan is to keep it installed (in case I ever need to check the temps of the other sensors or change LED colors) but use MSI Afterburner as my main OC software.


I never had 2 OC apps running at the same time, so I can't answer that part of the question.
I have read multiple times that it's not a very good idea though...
Having it installed but not running shouldn't cause any problems.

Another option for you, for the monitoring part, would be HWiNFO64. I remember reading that they plan on adding support for monitoring the new EVGA GPU temps/fans. You may want to give that a try when it comes out and see if it works for you.
Then, if you ever want to change LED colors, just run Precision XOC, make the changes, and shut it down afterwards.

edit:
https://www.hwinfo.com/news.php
if you scroll down to "upcoming changes" section it mentions:
"Added monitoring of EVGA iCX additional GPU temperatures and fans."
There is also a pre-release if you want to try it.


----------



## nrpeyton

Quote:


> Originally Posted by *kevindd992002*
> 
> I should've mentioned: V-sync is totally OFF, as I understand what it will do to utilization if it's turned ON and the framerate gets past the refresh rate.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What do you use to display OSD information in GOW4? I'll monitor perfcap with GPU-Z to check.


'HWINFO64' and 'RivaTuner Statistic Server'

HWiNFO64 has all the sensors; you then configure which ones "show in on-screen display" in HWiNFO64's settings.

The OSD function in HWiNFO64 *passes the sensor data* to RivaTuner (then in RivaTuner you can configure the size/colour/position of the OSD in-game).



From left to right I have mine set up like this:
Line 1) GPU Core Speed, GPU Power Draw _(in watts)_, then GPU voltage
Line 2) GPU Temp, GPU Utilisation
Line 3) CPU Core Temp, Max CPU Thread Usage & Max CPU Usage
Line 4) CPU v.core, CPU power draw _(in watts)_
Line 5) FPS


----------



## feznz

Quote:


> Originally Posted by *kevindd992002*
> 
> CPU thread utilization was the one I was monitoring for the 8 CPU threads displayed in HWMonitor. None of them were at 100%. The 85% is the max I've seen among all of the threads.
> 
> And no, I don't use RivaTuner and the in-game frame limiter is set to unlimited.
> 
> Same answer as above
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @ All
> 
> Can you install EVGA Precision XOC and MSI Afterburner at the same time on one system without causing any conflicts? I prefer MSI Afterburner over EVGA Precision XOC, but only Precision displays the additional temp sensors of the EVGA GTX 1080 Ti FTW3, so my plan is to keep it installed (in case I ever need to check the temps of the other sensors or change LED colors) but use MSI Afterburner as my main OC software.


Probably temp-related throttling. I spent about 9 months in the Philippines; I almost made it home from the neighbor's house one night on the Tanduay, and it was a nice sleep in the garden. I guess the temp never dropped below 27°C.
But I'm guessing you are in an air-conditioned room. We'd probably need some screenshots of hardware info to make an accurate diagnosis.

Quote:


> Originally Posted by *roberta507*
> 
> I went back to EVGA after returning a different 3rd party card.
> A customer service rep shared some numbers on the return percentages of the major brands they sold.
> EVGA had the lowest return rate, so I could surmise they have better quality controls in place.


As any well-trained salesperson would do, they're just doing their job correctly.


----------



## Streetdragon

Bought a 1080 Ti for 663€. The price now is 702€. Lucky me xD


----------



## nrpeyton

Quote:


> Originally Posted by *Streetdragon*
> 
> Bought 1080Ti for 663€. Price now is 702€ Lucky me xD


Welcome.









Which card did you grab? FE?


----------



## Streetdragon

The Palit FE "NEB108T019LCF".
Don't know why this one is cheaper than all the other FEs, but... who cares^^ FEs are all the same as far as I know xD


----------



## Dasboogieman

Just wanted to share my recent Phanteks Primo Ultimate build
2x 480mm, 1x 360mm, 1x 240mm, 1x 120mm, 2 D5 pumps in series



More rads, more pumps, less noise.
The 1080ti now never breaks 30C on load.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dasboogieman*
> 
> Just wanted to share my recent Phanteks Primo Ultimate build
> 2x480mm, 1x360, 1x 240, 1x120mm, 2 D5 Pumps in series
> 
> 
> 
> More rads, more pumps, less noise.
> The 1080ti now never breaks 30C on load.


Can I ask why?


----------



## DerComissar

Quote:


> Originally Posted by *Dasboogieman*
> 
> Just wanted to share my recent Phanteks Primo Ultimate build
> 2x480mm, 1x360, 1x 240, 1x120mm, 2 D5 Pumps in series
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> More rads, more pumps, less noise.
> The 1080ti now never breaks 30C on load.


Yeah!

That's what having a good loop is all about.


----------



## ir88ed

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Can I ask why?


[pun]Cool![\pun]
I notice thermal throttling on my cards at pretty low temps. What frequency do your cards clock themselves to during gaming? I have been kicking around the idea of doing a sub-ambient run just to see how far the card will go. Are you running the stock bios?


----------



## Dasboogieman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Can I ask why?


Because I can. There were so many mount points, I couldn't resist filling them all in just to see what would happen.


----------



## ir88ed

Quote:


> Originally Posted by *Dasboogieman*
> 
> Just wanted to share my recent Phanteks Primo Ultimate build
> 2x480mm, 1x360, 1x 240, 1x120mm, 2 D5 Pumps in series
> 
> 
> 
> More rads, more pumps, less noise.
> The 1080ti now never breaks 30C on load.


[pun]Cool![\pun]
I notice thermal throttling on my cards at pretty low temps. What frequency do your cards clock themselves to during gaming? I have been kicking around the idea of doing a sub-ambient run just to see how far the card will go. Are you running the stock bios?

edit: quoted the wrong post
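The throttling at "pretty low temps" is GPU Boost 3.0 shaving clock bins as the core warms. A toy model of that behaviour (Pascal's ~13 MHz bin size is widely reported; the 37°C starting point and 5°C spacing here are assumptions for illustration, not measured values):

```python
def boosted_clock(base_boost_mhz: float, temp_c: float,
                  start_c: float = 37.0, step_c: float = 5.0,
                  bin_mhz: float = 13.0) -> float:
    """Toy model: Boost 3.0 drops one ~13 MHz bin per temperature step."""
    if temp_c <= start_c:
        return base_boost_mhz
    bins = int((temp_c - start_c) // step_c) + 1
    return base_boost_mhz - bins * bin_mhz

print(boosted_clock(2000, 30))  # 2000: below the first threshold
print(boosted_clock(2000, 60))  # 1935.0: five bins (65 MHz) shaved off
```

Which is why a watercooled card that never leaves the 30s holds its top bins.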


----------



## TheBoom

Quote:


> Originally Posted by *Streetdragon*
> 
> The Palit FE "NEB108T019LCF"
> Dont now why this one is cheaper then all other FE but... who cares^^ Fe are all the same like i know xD


Marketing. Palit is still relatively new in the market and their preferred method of gaining a bigger market share is lowest prices all around. I remember during a sale in my country their 1080 could be had for the price of a regular 1070.


----------



## Dasboogieman

Quote:


> Originally Posted by *ir88ed*
> 
> [pun]Cool![\pun]
> I notice thermal throttling on my cards at pretty low temps. What frequency do your cards clock themselves to during gaming? I have been kicking around the idea of doing a sub-ambient run just to see how far the card will go. Are you running the stock bios?
> 
> edit: quoted the wrong post


Unfortunately this doesn't go sub-ambient; it maintains roughly a 10°C delta T with ambient, as I don't have any active heat-pump devices. Despite the massive surface area, I feel the loop isn't actually all that efficient yet, so more optimisation needs to be done (I can probably squeeze another 2-3°C of delta T out of it once I clean up the loop). I'm currently running the stock Gigabyte Aorus BIOS. At stock, it boosts straight to 2000-2012MHz and stays there. I can run 2100MHz completely stable now with no power throttling (due to the extremely low temps).
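A rough way to reason about that ~10°C water-to-ambient delta: at a given fan speed, each radiator sheds roughly so many watts per degree the water sits above ambient, and those capacities add. A sketch with hypothetical coefficients picked for near-silent fans (real values vary hugely with fan speed and fin density):

```python
def loop_delta_t(heat_load_w: float, rad_w_per_c: list[float]) -> float:
    """Water-above-ambient delta: total heat over summed radiator capacity."""
    return heat_load_w / sum(rad_w_per_c)

# ~450 W of CPU+GPU heat into the five rads, with made-up per-radiator
# coefficients proportional to their area (W per degree C):
rads = [12.0, 12.0, 9.0, 6.0, 3.0]  # 2x 480, 1x 360, 1x 240, 1x 120
print(round(loop_delta_t(450, rads), 1))  # 10.7 degrees over ambient
```

Cleaning up the loop or raising fan speed bumps those coefficients, which is where the extra 2-3°C would come from.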


----------



## Dasboogieman

Well, it's pretty cold here atm; we're enjoying 19-20°C ambients.


----------



## smithsrt8

Quote:


> Originally Posted by *Dasboogieman*
> 
> Well its pretty cold here atm, we're enjoying 19-20C ambients


I wish... The room my computer is in (front of my house, upstairs) is usually the warmest. Even with my AC cranked it's usually about 80-83°F (about 27-29°C), because that room just doesn't cool down much.


----------



## GraphicsWhore

Quote:


> Originally Posted by *Dasboogieman*
> 
> Just wanted to share my recent Phanteks Primo Ultimate build
> 2x480mm, 1x360, 1x 240, 1x120mm, 2 D5 Pumps in series
> 
> 
> More rads, more pumps, less noise.
> The 1080ti now never breaks 30C on load.


Whooaa, alright. Not even sure what's happening here but with that many fans this thing has to be fairly audible for a custom loop. What RPM are those fans running at?

Also, how are you testing "load"? 30C sounds unlikely.

Edit: Oh now I see in your most recent post you said you're at 19 ambient.


----------



## lilchronic

Yeah those are some very good temps for a 1080Ti.

I run 2x 360 rads in push/pull, and the max temp I have ever seen is 52°C at 27-28°C ambient.
My brother, on the other hand, has a MORA 1080 rad and a 360 rad, with temps hitting 50°C.


----------



## dunbheagan

Quote:


> Originally Posted by *nrpeyton*
> 
> Temperature difference between my coolant and GPU core is *11c* (while drawing about 300 watts on the GPU at 99% utilisation).
> 
> Anyone seeing less than that? _(If so I'd be interested in your method)._


I get very good delta T results from using Thermal Grizzly Conductonaut between the GPU and the EK waterblock. I have a delta T between coolant and GPU of only 2°C at undervolted settings ([email protected],800V) and 6°C at overclocked settings ([email protected],05V). This is for a heated-up card at full load, e.g. FSU, SUPO 4K or DOOM. I am really impressed by how well the liquid metal works. But a few guys here have killed their cards with liquid metal, so you have to decide if it's worth the risk. If you decide to go for LM on the GPU, you should definitely insulate the small parts near the GPU first.


----------



## TheBoom

Quote:


> Originally Posted by *dunbheagan*
> 
> I get very good delta T results from using Thermal Grizzly Conductonaut between the GPU and the EK waterblock. I have a delta T between coolant and GPU of only 2°C at undervolted settings ([email protected],800V) and 6°C at overclocked settings ([email protected],05V). This is for a heated-up card at full load, e.g. FSU, SUPO 4K or DOOM. I am really impressed by how well the liquid metal works. But a few guys here have killed their cards with liquid metal, so you have to decide if it's worth the risk. If you decide to go for LM on the GPU, you should definitely insulate the small parts near the GPU first.


Would it be beneficial or advisable for air cooling?

Also, once applied, does it actually seep out from the die itself?


----------



## nrpeyton

Quote:


> Originally Posted by *dunbheagan*
> 
> I get very good delta T results from using Thermal Grizzly Conductonaut between the GPU and the EK waterblock. I have a delta T between coolant and GPU of only 2°C at undervolted settings ([email protected],800V) and 6°C at overclocked settings ([email protected],05V). This is for a heated-up card at full load, e.g. FSU, SUPO 4K or DOOM. I am really impressed by how well the liquid metal works. But a few guys here have killed their cards with liquid metal, so you have to decide if it's worth the risk. If you decide to go for LM on the GPU, you should definitely insulate the small parts near the GPU first.


Does it cause any discolouration or corrosion of the block or GPU DIE?


----------



## ALSTER868

Is the Gigabyte Aorus 1080 Ti non-Extreme cross-flashable? Can we flash it with any BIOS, except ones like the HOF?


----------



## nrpeyton

Quote:


> Originally Posted by *nrpeyton*
> 
> Does it cause any discolouration or corrosion of the block or GPU DIE?


Insulation is the easy bit lol.

But trying to spread the stuff... lol.

I'd go ahead and do it right now -- only problem is I'd need to order some new pads first.


----------



## ALSTER868

Quote:


> Originally Posted by *nrpeyton*
> 
> Does it cause any discolouration or corrosion of the block or GPU DIE?


It is going to corrode the engraving on the GPU, if that's of any concern to you.


----------



## kevindd992002

Quote:


> Spoiler: Warning: Spoiler!
> 
> 
> 
> Originally Posted by *nrpeyton*
> 
> 'HWINFO64' and 'RivaTuner Statistic Server'
> 
> hwinfo64 has all the sensors, then you can configure which ones "show in on-screen-display" in HWINFO64 settings.
> 
> The OSD function in HWINFO64 *passes the sensor* data to RivaTuner. (then in RivaTuner you can configure the size/colour/position of the OSD in-game).
> 
> 
> 
> From left to right I have mine set up like this:
> Line 1) GPU Core Speed, GPU Power Draw (in watts), then GPU voltage
> Line 2) GPU Temp, GPU Utilisation
> Line 3) CPU Core Temp, Max CPU Thread Usage & Max CPU Usage
> Line 4) CPU v.core, CPU power draw (in watts)
> Line 5) FPS


Ok, I copied your settings exactly and I see the same in-game. My max CPU thread usage is 83%, but my max GPU utilization is only around 70%. My temps max out at 73°C, as I'm still waiting for some parts for my new watercooling build, so this card is still running its stock heatsink.

Any other ideas?

Quote:


> Originally Posted by *feznz*
> 
> Probably temp-related throttling. I spent about 9 months in the Philippines; I almost made it home from the neighbor's house one night on the Tanduay, and it was a nice sleep in the garden. I guess the temp never dropped below 27°C.
> But I'm guessing you are in an air-conditioned room. We'd probably need some screenshots of hardware info to make an accurate diagnosis.
> 
> As any well-trained salesperson would do, they're just doing their job correctly.


Not really. Like I said above, it maxes out at 73°C, and that is not considered super hot for a GPU. If anything, it will throttle the clocks, but the card should still be close to 100% utilization. So I'm not sure what to do now.


----------



## nrpeyton

Quote:


> Originally Posted by *kevindd992002*
> 
> Ok, I copied your settings exactly and I see the same in-game. My max CPU thread usage is 83%, but my max GPU utilization is only around 70%. My temps max out at 73°C, as I'm still waiting for some parts for my new watercooling build, so this card is still running its stock heatsink.
> 
> Any other ideas?
> 
> Not really. Like I said above, it maxes out at 73°C, and that is not considered super hot for a GPU. If anything, it will throttle the clocks, but the card should still be close to 100% utilization. So I'm not sure what to do now.


1) What FPS are you getting, and can you post an in-game screenshot please?

2) Also do you have "prefer maximum performance" set under power management in the nvidia control panel?

3) And is your power slider in MSI Afterburner maxed out?

4) If you could take a small video or a few screenshots _(even using your mobile phone or a screen capture app if need be)_ of GPU-z while game is running. _(on sensor screen/tab)_

5) Do you have all graphics settings in-game set to Ultra?

6) Also could you run this batch file please? (It forces the driver to maximise your power target)

powerlimit.zip 0k .zip file

(You may have to edit the .bat file using Notepad first -- set the power limit to the maximum for the driver of your card.) If you're not sure what that is, let me know what card you have and I'll find out for you. If you have an FE then you don't need to edit it, as it's already set up for an FE _(at 300 watts)._
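For reference, the zip's contents aren't shown here, but nvidia-smi (which ships with the NVIDIA driver) is the usual way such a batch file sets the power target. A hedged sketch of the equivalent commands; 300 W is the FE maximum, adjust for your card, and note that raising the limit needs admin/root rights:

```shell
TARGET_WATTS=300  # FE maximum (120% of the 250 W default); edit for your card

if command -v nvidia-smi >/dev/null 2>&1; then
    # Show the default, current and maximum enforceable power limits.
    nvidia-smi -q -d POWER
    # Raise the enforced limit (the driver clamps it to the VBIOS max).
    nvidia-smi -pl "$TARGET_WATTS"
else
    echo "nvidia-smi not found; skipping"
fi
```

Sliding the power limit in Afterburner sets the same target through the driver, so the two shouldn't fight each other.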


----------



## ZealotKi11er

Just installed the EVGA Hybrid cooler on my 1080 Ti. I tried to mod the stock plate that came with the 1080, but it was too damn hard. While installing the block, one of the screw mounts came loose; it did not inspire confidence in the quality. Temps are good so far: from 75°C down to 50°C.


----------



## nrpeyton

Quote:


> Originally Posted by *ALSTER868*
> 
> It is going to corrode the engravement on the GPU if that's of any concern to you


Hmmmm do you have any photos?

Also -- while we're on the topic of corrosion.. has anyone removed Liquid Metal from their shunts (after shunt mod) to see if there is any corrosion to the top of the shunt resistors?


----------



## dunbheagan

Quote:


> Originally Posted by *TheBoom*
> 
> Would it be beneficial or advisable for aircooling?
> 
> Also once applied does it actually seep out of the die itself?


I haven't tried it with air cooling, but it should lower your temps by a few degrees on air too. I'm not sure it's worth it on air, though: 5°C less doesn't make the same difference as it does on water, because you generally run higher temps anyway, and you have to disassemble your cooler just for the LM. When you go waterblock you have to disassemble the cooler anyway, so applying LM is almost no extra work, just a little risk.

Liquid metal can seep out from the die because of the pressure applied to the waterblock or heatspreader (if you use it on a CPU). I say that from my own experience: I disassembled the heatspreader of my old delidded 4790K yesterday, and the liquid metal had been squeezed out at the borders of the die. There were several small drops which almost reached the small contacts near the die. If I hadn't insulated them beforehand, my CPU could have been dead. So there certainly is a risk; I would recommend never using LM without insulating everything nearby.
Quote:


> Originally Posted by *nrpeyton*
> 
> Does it cause any discolouration or corrosion of the block or GPU DIE?


I can't tell for certain, because I haven't disassembled the block again yet. And I hope I never have to.







But it did cause some heavy discolouration on the heatspreader of my 4790K and some light discolouration on my CPU waterblock within a few weeks, which wouldn't come off even with acetone. The die of the CPU looked as new after cleaning.

This stuff has great cooling capabilities, but also some disadvantages.

This is my old 4790K after a few weeks with LM on the die and heatspreader:

The die looks as new; the heatspreader is heavily stained. I prepared it with Kryonaut to sell it. Got a very decent price on eBay for it, 280€, which is only 40€ less than I paid for it new three years ago...


----------



## FUZZFrrek

Is it a good idea to apply some electrical tape around the GPU die to insulate the contacts around it?

I have an FTW3 and I want to try LM on it, but I'm still not sure if it's a good idea.


----------



## ZealotKi11er

Quote:


> Originally Posted by *FUZZFrrek*
> 
> Is it a good idea to apply some electrical tape around the GPU die to isolate the contacts around it?
> 
> I have a FTW3 and it want to try LM on it so, i am still not sure if it's a good idea.


I tried it on my FE. Liquid Metal does not leak out. It is safe as long as you put enough to cover the die and nothing more. I do not think its worth the risk though.


----------



## KraxKill

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I tried it on my FE. Liquid Metal does not leak out. It is safe as long as you put enough to cover the die and nothing more. I do not think its worth the risk though.


Do you use liquid metal?

It absolutely can and does leak out, even when applied minimally. With liquid metal you need enough to spread, otherwise it won't pull across the surface you're spreading it on and will instead form its own droplet. Microdroplets, about the size of a poppy seed to a sesame seed, absolutely do form at the edges of the surfaces being pressed together. If they don't, you are unlikely to have used the full surface area of the contact patch; and if you have, then a bit most surely does squeeze out, as even the thinnest layer required to actually spread is thick enough to cause minor bleeding at the edges.

Been using LM since before it was a fad...


----------



## ZealotKi11er

Quote:


> Originally Posted by *KraxKill*
> 
> Do you use liquid metal?
> 
> It absolutely can and does leak out, even when applied minimally. With liquid metal you need enough to spread, otherwise it won't pull across the surface you're spreading it on and will instead form its own droplet. Microdroplets, about the size of a poppy seed to a sesame seed, absolutely do form at the edges of the surfaces being pressed together. If they don't, you are unlikely to have used the full surface area of the contact patch; and if you have, then a bit most surely does squeeze out, as even the thinnest layer required to actually spread is thick enough to cause minor bleeding at the edges.
> 
> Been using LM since before it was a fad...


My 3770K has had it since 2012 and it has not moved 1 mm. I used it on my 1080 Ti for 2 weeks, and even the spots on the side that I had touched with my application tool were still exactly where I left them. You need a lot of it for it to leak out. Also, most GPUs are mounted horizontally, so it's even less likely.


----------



## dunbheagan

Quote:


> Originally Posted by *KraxKill*
> 
> Do you use liquid metal?
> 
> It absolutely can and does leak out, even when applied minimally. With liquid metal you need enough to spread, otherwise it won't pull across the surface you're spreading it on and will instead form its own droplet. Microdroplets, about the size of a poppy seed to a sesame seed, absolutely do form at the edges of the surfaces being pressed together. If they don't, you are unlikely to have used the full surface area of the contact patch; and if you have, then a bit most surely does squeeze out, as even the thinnest layer required to actually spread is thick enough to cause minor bleeding at the edges.
> 
> Been using LM since before it was a fad...


100% agreed. I saw exactly the same thing myself yesterday when I removed the heatspreader of my 4790K, and I used just enough LM to spread it over the surface. LM will spill out at the edges just like any paste when pressure is applied; not as much, because you use less, but it will.


----------



## dunbheagan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> My 3770K has had it since 2012 and has not move 1 mm. I used it on my 1080 Ti for 2 weeks and even the stuff on the side that I had touched with my application tool still where identical to where I left them. You need a lot of it to leak out. Also most GPUs are horizontal so its even less likely.


I think you got it wrong. We do not mean that it leaks out of the heatspreader; that would really be too much. But it can be squeezed over the edges of the die, making contact with the small parts near the die while still under the heatspreader.


----------



## FastEddieNYC

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I tried it on my FE. Liquid Metal does not leak out. It is safe as long as you put enough to cover the die and nothing more. I do not think its worth the risk though.


A small amount does leak out on the edge. I'm using CLP on the core of my 1080 Ti and used liquid electrical tape around the die. You can also use clear nail polish. LM will give you the best temps but requires additional precautions to reduce the possibility of electrical shorts.


----------



## ZealotKi11er

I am having some trouble overclocking my FE card. It is fine at 100% power, but at 120% power it crashes with a +150 offset. I am under a 2GHz OC at this point.


----------



## nrpeyton

Quote:


> Originally Posted by *KraxKill*
> 
> Do you use liquid metal?
> 
> It absolutely can and does leak out, even when applied minimally. With liquid metal you need enough to spread, otherwise it won't pull across the surface you're spreading it on and will instead form its own droplet. Microdroplets, about the size of a poppy seed to a sesame seed, absolutely do form at the edges of the surfaces being pressed together. If they don't, you are unlikely to have used the full surface area of the contact patch; and if you have, then a bit most surely does squeeze out, as even the thinnest layer required to actually spread is thick enough to cause minor bleeding at the edges.
> 
> Been using LM since before it was a fad...


Insulation is easy: just paint Liquid Electrical Tape around the edges. (Do a nice thick layer and it will be easier to peel off -- it comes back off again just like tape.)

_right click & 'open new tab' for FULL SIZE_

My main concern would be invalidation of the warranty due to discolouration or loss of the engraving on top of the GPU die.

It is an intriguing subject though. Imagine your CPU were direct-die (no heatspreader) like modern Nvidia GPUs, and then imagine liquid metal between block and CPU die too.
_I wonder if we'd see the same temperature differentials that we see on GPUs (between water & core temp)._

Also, I've noticed now that I've reconnected the chiller, my component temps are much more sensitive to pump speed than they ever were before. By running my pump at 50% I'm getting nearly 10°C less on the GPU core. I never experienced that before, even on my 1080 Classy with the exact same setup. The new EK blocks must respond better to more "turbulent" water, i.e. the more turbulent the flow, the lower the temps.

My EK pump is also capable of running at 24V (a lot of people don't realise this). Shame I couldn't get a 24V output from Molex; it would be like double the power and RPM.


----------



## kevindd992002

Quote:


> Originally Posted by *nrpeyton*
> 
> 1) What FPS are you getting and can u post a screenshot of in-game? plz
> 
> 2) Also do you have "prefer maximum performance" set under power management in the nvidia control panel?
> 
> 3) And is your power slider in MSI Afterburner maxed out?
> 
> 4) If you could take a small video or a few screenshots _(even using your mobile phone or a screen capture app if need be)_ of GPU-z while game is running. _(on sensor screen/tab)_
> 
> 5) Do you have all graphics settings in-game set to Ultra?
> 
> 6) Also could you run this batch file please? (It forces the driver to maximise your power target)
> 
> powerlimit.zip 0k .zip file
> 
> (you may have to edit the .batch file using notepad first -- set the power limit to the maximum for the driver of your card). If you're not sure what that is.. let me know what card you have and I'll find out for you. If you have a FE then you don't need to edit it. As it's already set up for a FE _(at 300 watts)._


1.) I'm getting around 100-144 fps depending on the scene.

2.) The global setting in NVCP is currently set at its default of "Optimal power". But does that matter if my Win10 power profile is set to High Performance anyway?

3.) No, the power slider in MSI Afterburner is currently set to 100%, but I'll slide it all the way to 117% now and check again. I have an EVGA GTX 1080 Ti FTW3, so I can also switch to the slave BIOS for a higher power limit (other than this, does the slave BIOS differ from the master BIOS in any way?).

4.) Ok, will do. I forgot to mention though that perfcap is showing "Vrel" while GOW4 is running.

5.) Yes, all Ultra.

6.) Is there a difference in running the batch file and letting nvidia-smi.exe do the power target change compared to sliding the power limit to the max in MSI AB?
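For reference, the batch file presumably just wraps nvidia-smi's power-limit switch; a minimal sketch, assuming the 300 W FE target mentioned earlier (run from an elevated prompt; the driver rejects values outside the board's allowed range):

```shell
# Show the current, default, and enforced power limits for the card
nvidia-smi -q -d POWER

# Request a 300 W power limit (300 W is the FE figure mentioned above --
# adjust to your own card's allowed range)
nvidia-smi -pl 300
```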


----------



## Dasboogieman

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Whooaa, alright. Not even sure what's happening here but with that many fans this thing has to be fairly audible for a custom loop. What RPM are those fans running at?
> 
> Also, how are you testing "load"? 30C sounds unlikely.
> 
> Edit: Oh now I see in your most recent post you said you're at 19 ambient.


The 10°C delta T is actually with the fans at maximum (2200 RPM), which is definitely very, very loud.
The load was full CPU + GPU (Superposition 4K + Prime95). The GPU actually stays at 28 degrees if the load is GPU-only.
My fans usually run at 800 RPM or 1200 RPM, which maintains a slightly better delta T (around 12-15°C or so) than my previous HAF 932 with 360 mm + 240 mm @ 2200 RPM.
Quote:


> Originally Posted by *TheBoom*
> 
> Would it be beneficial or advisable for aircooling?
> 
> Also once applied does it actually seep out of the die itself?


It depends on how much you apply. Back when I used my Aorus stock cooler, I applied it to the copper plate + GPU. It definitely seeps, but it doesn't move either unless disturbed. Actually, the most dangerous times are when you are placing and removing the cooler. While it's on, unless your GPU is vibrating like hell, it doesn't move.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> I tried it on my FE. Liquid Metal does not leak out. It is safe as long as you put enough to cover the die and nothing more. I do not think its worth the risk though.


It was super dangerous to remove for me; the shock of the HSF removal actually caused it to drip onto my PCB when I was installing the waterblock.
Quote:


> Originally Posted by *nrpeyton*
> 
> Insulation is easy. Just paint Liquid Electrical Tape around the edges. (Do a nice thick layer and it will make it easier to peel off -- it comes back off again just like tape).
> 
> 
> My main concern would be invalidation of warranty due to discolouration or loss of engravement on the top of the GPU DIE.
> 
> It is an intriguing subject though. Imagine your CPU was direct-DIE (no heatspreader) like modern Nvidia GPU's. Then imagine liquid metal between block and CPU-DIE. Too.
> _I wonder if we'd see the same temperature differentials that we see on GPU's (between water & core temp)._
> 
> Also I've noticed now that I've reconnected the Chiller. My component temps are much more sensitive to pump speed than they ever were before. By running my pump at 50% I'm getting nearly 10 degrees C less on the GPU Core. Never experienced that before -- even on my 1080 Classy using the exact same setup. The new EK blocks must be better at responding to more "turbulent" water. I.E the more turbulent the lower the temps.
> 
> My EK pump is also capable of running at 24v (a lot of people don't realise this). Shame I couldn't get a 24v output from molex. It would be like double the power + RPM.


Nail hardener can work in a pinch too. The stuff can survive a ridiculous amount of heat and mechanical pressure. Plus it's easily removed with nail polish remover.


----------



## nrpeyton

Quote:


> Originally Posted by *kevindd992002*
> 
> 1.) I'm getting around 100-144 fps depending on the scene.
> 
> 2.) The global setting in NVCP is currently set at its default of "Optimal power". But does that matter if my Win10 power profile is set to High Performance anyway?
> 
> 3.) No, the power slider in MSI Afterburner is currently set to 100% but I'll slide it all the way to 117% now and check again. I have an EVGA GTX 1080Ti FTW3 so I can also switch to the slave BIOS to have more power limit (other than this, does the slave BIOS have any difference compared to the master BIOS?).
> 
> 4.) Ok, will do. I forgot to mention though that perfcap is showing "Vrel" while GOW4 is running.
> 
> 5.) Yes, all Ultra.
> 
> 6.) Is there a difference in running the batch file and letting nvidia-smi.exe do the power target change compared to sliding the power limit to the max in MSI AB?


*1) What is the make/model of your monitor and what's its max refresh rate? Do you have G-Sync or FreeSync enabled?*
If it's a 144 Hz 1080p monitor, then your card won't need all its power to generate 144 FPS at only 1080p.

2) The Windows power management and the Nvidia power management are two completely separate entities. You're best setting it to "prefer maximum performance" in Nvidia control panel.

3) Definitely a good thing to try -- it will allow the driver to push the card harder.

4) "Vrel" -- that's normal. It just means GPU-Z thinks the card might need more voltage before the driver can safely push the card's frequency up higher.

5) Understood.

6) 99% of the time, no. But there were some instances in the past where people reported it wasn't working correctly, and this was a work-around. (It was just something I thought was maybe worth trying.)

Unless there's some sort of frame limiting (FPS limiting) going on it's normal for your CPU not to be pegged to the max. But the GPU should still be above 95%.

Also something maybe worth considering: Gears of War 4 originally came out for the Xbox and was limited to 30 FPS in single player and 60 FPS in multiplayer.

I'm not sure if it was completely re-written for Windows or just ported across and then patched to remove the FPS limiter, etc.
It may just be that the game doesn't see a need to run any higher than 144 FPS (which, by the way, is the max refresh rate on many monitors).
At 1080p the card _would_ only need about 75% utilisation to run 144 FPS.

*Another thing to try: change the resolution scaling.*

I believe there are two ways you can do that: *1) render at a higher resolution & downscale to 1080p for higher image quality*. Since the GPU still has to work as hard as it would for, say, 4K, the frames should max out around 60 FPS and you should be at 96%+ utilisation.

Try doing it in-game in the video options under resolution scaling.
or
*2)* In the Nvidia control panel there's an option called "Dynamic Super Resolution". It's the same idea -- you just set it up in Windows, not in your game. You could set a resolution a bit higher (say 3840 x 2160) or something roughly close to that, then run the game. If the GPU then suddenly starts running at 96%+ utilisation, you'll know the answer.









Remember - a 1080Ti _really_ is overkill for 1080p. Even at Ultra.









Even when I'm not overclocked. I still get a smooth 60 FPS in almost all scenes at 4k Ultra. Using a native 4k monitor. 1080p only requires about 25% of that processing power for the same FPS.

With "Optimal power" selected in the Nvidia control panel, the driver has probably realised you're already over 100 FPS and doesn't need to push more frames. (It will try to conserve power where it can.)

I honestly don't think it's a bottleneck you're seeing.
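For what it's worth, the ~25% figure above follows from raw pixel counts; a quick sketch (a rough first-order estimate, since real workloads don't scale perfectly linearly in pixels):

```python
# Pixel-count ratio between 1080p and 4K: why 1080p needs roughly a quarter
# of the shading work of 4K at the same frame rate.
uhd = 3840 * 2160  # 8,294,400 pixels
fhd = 1920 * 1080  # 2,073,600 pixels
print(fhd / uhd)   # 0.25
```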


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dasboogieman*
> 
> The 10 delta T is actually with the fans at maximum (2200RPM), which is definitely very very loud.
> The loading was full CPU + GPU (Supo 4K + Prime95). The GPU actually stays at 28 degrees if the loading is GPU only.
> My fans usually are at 800RPM or 1200RPM which maintains slightly better delta T (around 12C-15C or so) than my previous HAF932 360mm +240mm @ 2200RPM.
> Depends on how much you put, back when I used my Aorus stock cooler, I applied it to the Copper plate + GPU. It definitely seeps but it doesn't move either unless disturbed. Actually, the most dangerous times for this is when you are placing and removing the cooler. While its on, unless ur GPU is vibrating like hell, it doesn't move.
> It was super dangerous to remove for me, the shock of the HSF removal actually caused it to drip on my PCB when I was installing the WB
> Women's Nail hardener can work in a pinch too. The stuff can survive a ridiculous amount of heat and mechanical pressure. Plus its easily removed with nail polish remover.


Yeah removal was super hard. Alcohol works well. The removal process is what made me stop using it.


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> Insulation is easy. Just paint Liquid Electrical Tape around the edges. (Do a nice thick layer and it will make it easier to peel off -- it comes back off again just like tape).
> 
> 
> My main concern would be invalidation of warranty due to discolouration or loss of engravement on the top of the GPU DIE.
> 
> It is an intriguing subject though. Imagine your CPU was direct-DIE (no heatspreader) like modern Nvidia GPU's. Then imagine liquid metal between block and CPU-DIE. Too.
> _I wonder if we'd see the same temperature differentials that we see on GPU's (between water & core temp)._
> 
> Also I've noticed now that I've reconnected the Chiller. My component temps are much more sensitive to pump speed than they ever were before. By running my pump at 50% I'm getting nearly 10 degrees C less on the GPU Core. Never experienced that before -- even on my 1080 Classy using the exact same setup. The new EK blocks must be better at responding to more "turbulent" water. I.E the more turbulent the lower the temps.
> 
> My EK pump is also capable of running at 24v (a lot of people don't realise this). Shame I couldn't get a 24v output from molex. It would be like double the power + RPM.


http://www.performance-pcs.com/ppcs-12v-to-24v-dc-d5-laing-pump-supercharger.html

https://martinsliquidlab.wordpress.com/2011/03/05/koolance-pmp-450s-d5-strong-pump/

Some pretty conclusive evidence there


----------



## ALSTER868

Quote:


> Originally Posted by *nrpeyton*
> 
> Hmmmm do you have any photos?
> 
> Also -- while we're on the topic of corrosion.. has anyone removed Liquid Metal from their shunts (after shunt mod) to see if there is any corrosion to the top of the shunt resistors?


I wanted to use LM on my GPU too, but changed my mind after a friend of mine used Conductonaut on his GPU and got corrosion of the inscriptions after a couple of months. I don't have a photo of it, unfortunately. BTW, the difference between Kryonaut and Conductonaut is only about 2°C when used on a GPU, so it's not worth the risk IMO. LM is only advisable underneath the CPU lid; in all other cases use only paste, as LM brings more headache than benefit, at least to me.


----------



## Enzarch

Quote:


> Originally Posted by *nrpeyton*
> 
> Temperature difference between my coolant and GPU core is *11c* (while drawing about 300 watts on the GPU at 99% utilisation).
> 
> Anyone seeing less than that? _(If so I'd be interested in your method)._


6° Δ for mine with traditional TIM (GC Extreme), Using core only universal block (EK VGA Supremacy) @ 1.094V no TDP cap

A properly finished coldplate surface and careful TIM application helps for those last few degrees. A bit silly overkill though.
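As a side note, delta-T numbers quoted at different power draws can be compared by normalising them to a thermal resistance in °C per watt; a quick sketch, where the 300 W and 250 W draws are assumptions based on figures mentioned in this thread:

```python
# Convert a core-to-coolant delta T into a thermal resistance (degC per watt)
# so results at different power draws can be compared directly.
def thermal_resistance(delta_t_c: float, power_w: float) -> float:
    return delta_t_c / power_w

# ~11 C delta at ~300 W vs ~6 C delta at an assumed ~250 W
print(round(thermal_resistance(11.0, 300.0), 4))  # 0.0367
print(round(thermal_resistance(6.0, 250.0), 4))   # 0.024
```

A lower number means a better block mount and TIM application, independent of how hard the card happens to be loaded.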


----------



## Streetdragon

I think I will use Kryonaut and not LM... the card was too expensive to risk it with LM... but just for the thrill xd


----------



## KedarWolf

Quote:


> Originally Posted by *Streetdragon*
> 
> I think I will use Kryonaut and not LM... the card was too expensive to risk it with LM... but just for the thrill xd


MasterGel Maker is the new go-to thermal compound. It spreads more easily than Kryonaut and performs just a bit better.









http://www.coolermaster.com/cooling/thermal-compound/mastergel-maker/

https://play3r.net/reviews/cooling/cooler-master-master-gel-maker-nano-thermal-paste-performance/


----------



## Streetdragon

Could I insulate the components around the die with normal TIM instead? I already did that on my two R9 390s and had no problems, but I don't know if it's really as safe as liquid tape (don't want to buy that too...).


----------



## nrpeyton

Quote:


> Originally Posted by *Enzarch*
> 
> 6° Δ for mine with traditional TIM (GC Extreme), Using core only universal block (EK VGA Supremacy) @ 1.094V no TDP cap
> 
> A properly finished coldplate surface and careful TIM application helps for those last few degrees. A bit silly overkill though.


Wow, that's a fantastic result. Only 6°C.

I can do 6°C too... but only when the card is drawing about half its max TDP, of course. lol

How are you managing 6°C?

Did you make any modifications to your block?

Also interested in why you're using a universal block -- how do you cool the VRMs? *(Would love to see a picture.)*
_(Do you think the difference could also be down to the VRM not heating the block up, so all the cooling power is focused solely on the GPU?)_

Very interested....









Thanks,

Nick


----------



## TheBoom

Thanks for all the input. I think I'm going to stick with my MX4 in that case; the reward seems too small for the risk of using LM.

I was just hoping to get temps down by about 5-6°C, since that's the next boost bin for my card.


----------



## Streetdragon

I will upload a pic tomorrow after I've applied the LM, if I don't forget it ^^


----------



## GraphicsWhore

Quote:


> Originally Posted by *KraxKill*
> 
> Do you use liquid metal?
> 
> It absolutely can and does leak out even when applied minimally, and with liquid metal you need enough to spread, otherwise it won't pull across the surface you're spreading it on and will instead form its own droplet. Micro-droplets about the size of a poppy seed to a sesame seed absolutely do form at the edges of the surfaces being pressed together. If they don't, you are unlikely to have used the full surface area of the contact patch, and if you have, then a bit most surely does squeeze out, as even the thinnest layer required to actually spread is thick enough to cause minor bleeding at the edges.
> 
> Been using LM since before it was a fad...


Yeah, the micro-droplets are the real pain in the ass. If they form near a spot you don't want them in and you try to wipe them up, you normally just end up moving them around, and if one is not careful they'll end up like the guy in this thread who recently killed his 1080 Ti.

I used LM on my 7700K after delid, but that's a lot easier, with much more room for error.


----------



## kevindd992002

Quote:


> Originally Posted by *nrpeyton*
> 
> *1) What is the make/model of your monitor and whats it's max refresh rate? Do you have g-sync or free-sync enabled.*?
> If it's a '144HZ 1080p monitor'.. Then your card won't need all it's power to generate 144FPS at only 1080p.
> 
> 2) The Windows power management and the Nvidia power management are two completely separate entities. You're best setting it to "prefer maximum performance" in Nvidia control panel.
> 
> 3)Definitely a good thing to try -- it will allow the driver to push the card harder.
> 
> 4) "vrel" -- That's normal - Just means GPU-Z thinks the card might need more voltage before the driver can safely push the cards frequency up higher.
> 
> 5) Understood.
> 
> 6) 99% of the time. NO. But there were some instances in the past where people were reporting it wasn't working correctly -- and this was a work-around. (Was just something I thought was maybe worth trying).
> 
> Unless there's some sort of frame limiting (FPS limiting) going on it's normal for your CPU not to be pegged to the max. But the GPU should still be above 95%.
> 
> Also something maybe worth considering: Gears of War 4 originally came out for the XBOX and was originally limited to 30 FPS in single player and 60 FPS in multiplayer.
> 
> I'm not sure if it was completely re-written properly for Windows or just ported across then patched to remove the FPS limiter E.T.C
> But it may just be the fact that the game doesn't see a need to run the FPS any higher than 144 FPS. (which by the way is the max refresh rate on many monitors).
> At 1080p the card would only need about 75% utilisation to run 1080p at 144 FPS.
> 
> *Another thing to try is: Change the resolution scaling.*
> 
> I believe there's two ways you can do that: *1)* *render at a higher resolution & downscale to 1080p) for higher image quality*. But since the GPU still has to work as hard as it would for say.... 4k.. then the frames should max out about 60FPS and you should be at 96% + utilisation.
> 
> Try doing it in-game in video-options under resolution scaling.
> or
> *2)* In Nvidia Geforce experience there's an option called "Dynamic Super Resolution". It's the same idea -- just you set it up in Windows -- not your game. You could set a resolution at something a bit higher (say 3840 x 2160) or something roughly close to that. Then run the game. If the GPU then suddenly starts running at 96+% utilisation then you'll now know the answer
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Remember - a 1080Ti really is overkill for 1080p. Even at Ultra.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Even when I'm not overclocked. I still get a smooth 60 FPS in almost all scenes at 4k Ultra. Using a native 4k monitor. 1080p only requires about 25% of that processing power for the same FPS.
> 
> With "optimal power" ticked in the Nvidia control panel.. the driver has probably realised you're already over 100 FPS and doesn't need to push more frames. (It will try to conserve power where it can).
> 
> I honestly don't think it's a bottleneck you're seeing.


1.) I have the AOC G2460PG, it's a 144Hz 1080p monitor, and yes G-Sync is enabled. I understand that I don't need much power to generate 144fps @ 1080p but then again the problem is that I'm not limiting myself/graphics card/game (vsync) to that framerate.

2.) I've been through a couple of Nvidia cards (my last was a 1070) and I never bothered to use "prefer maximum performance", as I read that the card doesn't idle down when that is set. What's your take on this?

3.) Yeah, I should've turned the power slider all the way up before anything else.

4.) Ok, good to know. What screen capture app do you use to do in-game captures or even record a video?

5.) N/A

6.) Ok, I'll keep that in mind but I'll have to try one step at a time so that we'll know what caused the problem when it gets solved.

I know that I generally just need 144 FPS for a 144 Hz monitor, but I'm merely doing some testing here. I want to know how high an FPS my card can reach in GOW4 @ 1080p before my CPU bottlenecks it. GOW4 being poorly ported is worth looking into, but then I don't understand the point of the "unlimited" framerate option in the settings.

7.) Regarding DSR, yes, I'm familiar with it, and it's actually what I was thinking of doing as the next step. I tried the in-game version of it before with my 1070 and it works just fine.

I know that a 1080 Ti is overkill for a 1080p monitor, but like I said, I'm merely testing this. I'm just waiting for a deal on the Dell S2716DG and then I'll go ahead and buy that for 1440p.

EDIT:

Just did some additional testing runs and here's my observation:

1.) It's definitely not power related. I've switched to the slave GPU BIOS and ramped up the slider in MSI Afterburner to 125% max. GPU-Z shows only 70% TDP while GOW4 is running.

2.) It's also not the NVCP power setting. I tried "prefer maximum performance" and the results are the same.

3.) And then I tried DSR. I've set the in-game resolution to 3840x2160 and the GPU load just soared to 99%! So what the heck? I'm stuck at just 100fps at 1080p even though the card can do more?


----------



## buellersdayoff

Quote:


> Originally Posted by *kevindd992002*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nrpeyton*
> 
> *1) What is the make/model of your monitor and whats it's max refresh rate? Do you have g-sync or free-sync enabled.*?
> If it's a '144HZ 1080p monitor'.. Then your card won't need all it's power to generate 144FPS at only 1080p.
> 
> 2) The Windows power management and the Nvidia power management are two completely separate entities. You're best setting it to "prefer maximum performance" in Nvidia control panel.
> 
> 3)Definitely a good thing to try -- it will allow the driver to push the card harder.
> 
> 4) "vrel" -- That's normal - Just means GPU-Z thinks the card might need more voltage before the driver can safely push the cards frequency up higher.
> 
> 5) Understood.
> 
> 6) 99% of the time. NO. But there were some instances in the past where people were reporting it wasn't working correctly -- and this was a work-around. (Was just something I thought was maybe worth trying).
> 
> Unless there's some sort of frame limiting (FPS limiting) going on it's normal for your CPU not to be pegged to the max. But the GPU should still be above 95%.
> 
> Also something maybe worth considering: Gears of War 4 originally came out for the XBOX and was originally limited to 30 FPS in single player and 60 FPS in multiplayer.
> 
> I'm not sure if it was completely re-written properly for Windows or just ported across then patched to remove the FPS limiter E.T.C
> But it may just be the fact that the game doesn't see a need to run the FPS any higher than 144 FPS. (which by the way is the max refresh rate on many monitors).
> At 1080p the card would only need about 75% utilisation to run 1080p at 144 FPS.
> 
> *Another thing to try is: Change the resolution scaling.*
> 
> I believe there's two ways you can do that: *1)* *render at a higher resolution & downscale to 1080p) for higher image quality*. But since the GPU still has to work as hard as it would for say.... 4k.. then the frames should max out about 60FPS and you should be at 96% + utilisation.
> 
> Try doing it in-game in video-options under resolution scaling.
> or
> *2)* In Nvidia Geforce experience there's an option called "Dynamic Super Resolution". It's the same idea -- just you set it up in Windows -- not your game. You could set a resolution at something a bit higher (say 3840 x 2160) or something roughly close to that. Then run the game. If the GPU then suddenly starts running at 96+% utilisation then you'll now know the answer
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Remember - a 1080Ti really is overkill for 1080p. Even at Ultra.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Even when I'm not overclocked. I still get a smooth 60 FPS in almost all scenes at 4k Ultra. Using a native 4k monitor. 1080p only requires about 25% of that processing power for the same FPS.
> 
> With "optimal power" ticked in the Nvidia control panel.. the driver has probably realised you're already over 100 FPS and doesn't need to push more frames. (It will try to conserve power where it can).
> 
> I honestly don't think it's a bottleneck you're seeing.
> 
> 
> 
> 1.) I have the AOC G2460PG, it's a 144Hz 1080p monitor, and yes G-Sync is enabled. I understand that I don't need much power to generate 144fps @ 1080p but then again the problem is that I'm not limiting myself/graphics card/game (vsync) to that framerate.
> 
> 2.) I've been with a couple of Nvidia cards (my last was a 1070) and I never bother to use "prefer maximum performance" as I read that the card doesn't 'idle' when that is set. What's your take on this?
> 
> 3.) Yeah, I should've turned the power slider all the way before anything.
> 
> 4.) Ok, good to know. What screen capture app do you use to do in-game captures or even record a video?
> 
> 5.) N/A
> 
> 6.) Ok, I'll keep that in mind but I'll have to try one step at a time so that we'll know what caused the problem when it gets solved.
> 
> I know that I generally just need 144fps for a 144Hz monitor anyway but then I'm merely just doing some testing here. I want to know how high of an fps can my card reach with GOW4 @ 1080p before my CPU bottlenecks it. GOW4 being poorly ported is worth looking into but then I don't understand the use of the "unlimited" framerate limit in the settings.
> 
> 7.) Regarding DSR, yes I'm familiar with it and is actually what I was thinking on doing as the next step. I've tried the in-game version of DSR before with my 1070 and it works just fine.
> 
> I know that a 1080Ti is overkill for a 1080p monitor but like I said I'm merely testing this. I'll just waiting for a deal for the Dell S2716DG and I'll go ahead and buy that for 1440p.
> 
> EDIT:
> 
> Just did some additional testing runs and here's my observation:
> 
> 1.) It's definitely not power related. I've switched to the slave GPU BIOS and ramped up the slider in MSI Afterburner to 125% max. GPU-Z shows only 70% TDP while GOW4 is running.
> 
> 2.) It's also not the NVCP power setting. I tried "prefer maximum performance" and the results are the same.
> 
> 3.) And then I tried DSR. I've set the in-game resolution to 3840x2160 and the GPU load just soared to 99%! So what the heck? I'm stuck at just 100fps at 1080p even though the card can do more?

You can set maximum performance per game, instead of globally. Try Fast Sync. What's capping the frame rate @ 100 FPS -- the game?


----------



## kevindd992002

Quote:


> Originally Posted by *buellersdayoff*
> 
> You can set maximum performance per game, instead of global. Try fast sync. What's capping the frame rate @ 100 fps, the game?


Yes, and that's what I did -- I set it just for GOW4. Fast Sync wouldn't do anything in my situation; VSync off is what I like, so that there's no cap. Yes, it looks like it's the game @ 1080p, but I'm not certain.


----------



## sid4975

Hey guys, I posted here about 2 weeks back having problems with a Zotac 1080 Ti AMP Extreme. I returned the first card to Newegg for a refund and re-ordered it from jet.com, but they sent the wrong card, so I had to send that back. I then re-ordered the card from Newegg when it came back in stock and finally got it today, after 2 weeks with no card. As soon as I hooked this bad boy up and ran my first benchmark: same crash, only way faster! I'll never buy another Zotac product again in my life. I don't know what card to get now, since I'm stuck without the old 980 Ti I sold. Are any of these aftermarket 1080 Tis stable? I see a lot of negative reviews for the Gigabyte Aorus version too, saying the cards aren't stable for ****e. This is just ridiculous; I have almost $2000 out in cards and refunds that don't work.


----------



## TheBoom

Quote:


> Originally Posted by *sid4975*
> 
> Hey guys, I posted here about 2 weeks back having problems with a Zotac 1080 Ti AMP Extreme. I returned the first card to Newegg for a refund and re-ordered it from jet.com, but they sent the wrong card, so I had to send that back. I then re-ordered the card from Newegg when it came back in stock and finally got it today, after 2 weeks with no card. As soon as I hooked this bad boy up and ran my first benchmark: same crash, only way faster! I'll never buy another Zotac product again in my life. I don't know what card to get now, since I'm stuck without the old 980 Ti I sold. Are any of these aftermarket 1080 Tis stable? I see a lot of negative reviews for the Gigabyte Aorus version too, saying the cards aren't stable for ****e. This is just ridiculous; I have almost $2000 out in cards and refunds that don't work.


That's some damn bad luck. Maybe the universe is trying to get you to buy a Titan Xp lol.

I've had two Amp extremes and both were fine.

Is everything else in your system working properly? PSU?


----------



## Agent-A01

Quote:


> Originally Posted by *sid4975*
> 
> Hey guys, I posted here about 2 weeks back having problems with a Zotac 1080 Ti AMP Extreme. I returned the first card to Newegg for a refund and re-ordered it from jet.com, but they sent the wrong card, so I had to send that back. I then re-ordered the card from Newegg when it came back in stock and finally got it today, after 2 weeks with no card. As soon as I hooked this bad boy up and ran my first benchmark: same crash, only way faster! I'll never buy another Zotac product again in my life. I don't know what card to get now, since I'm stuck without the old 980 Ti I sold. Are any of these aftermarket 1080 Tis stable? I see a lot of negative reviews for the Gigabyte Aorus version too, saying the cards aren't stable for ****e. This is just ridiculous; I have almost $2000 out in cards and refunds that don't work.


The chance of getting 2 bad cards in a row is very unlikely..

How do you know it's not your system?


----------



## GraphicsWhore

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am having some trouble overclocking my FE card. It is fine 100% power but 120% power it crashes with +150 OC. I am under 2GHz OC at this point.


Why do you think you should be getting a stable +150? That's not common at all especially for a FE. What's your memory at?

Quote:


> Originally Posted by *sid4975*
> 
> Hey guys, I posted here about 2 weeks back having problems with a Zotac 1080 Ti AMP Extreme. I returned the first card to Newegg for a refund and re-ordered it from jet.com, but they sent the wrong card, so I had to send that back. I then re-ordered the card from Newegg when it came back in stock and finally got it today, after 2 weeks with no card. As soon as I hooked this bad boy up and ran my first benchmark: same crash, only way faster! I'll never buy another Zotac product again in my life. I don't know what card to get now, since I'm stuck without the old 980 Ti I sold. Are any of these aftermarket 1080 Tis stable? I see a lot of negative reviews for the Gigabyte Aorus version too, saying the cards aren't stable for ****e. This is just ridiculous; I have almost $2000 out in cards and refunds that don't work.


Man that sucks but I find it hard to believe two cards are bad. Are you sure it's not another part of the system? Are you able to test it somewhere else?


----------



## sid4975

No, I have nowhere else to test it. But what else would it be? It's basically doing the same thing as the last one, crashing around 65 degrees. I tried it on my 2K monitor and my X34 monitor, same thing. My 980 Ti never had any problems. What else could it be? Would Event Viewer tell me anything?


----------



## KraxKill

Quote:


> Originally Posted by *sid4975*
> 
> no i have nowhere else to test it. but what else would it be? its basically doing the same thing as the last one crashing around 65 degrees. i tried it on my 2k monitor and my x34 monitor and same thing. my 980ti never had any problems. what else could it be? would event viewer tell me anything?


I would first log to a file with GPU Z (increase the polling interval) run it stock and see what conditions are actually needed for it to crash... is it crashing due to temps or tdp. From there you'll have a better idea of what's going on.

Is your system overclocked? There is no guarantee that what was stable on your 980 Ti is still stable with the 1080 Ti. I would remove your CPU overclock if you have one, just to eliminate the system as a possible cause.
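Building on that logging suggestion: a throwaway script along these lines can pull out the last few samples before a crash, so you can see at a glance whether it died on temperature or TDP. This is a rough sketch; the column headers used below are assumptions, so match them to whatever your actual GPU-Z log contains.

```python
import csv
import io

# Fake sample rows in the rough shape GPU-Z writes (comma-separated, one
# row per polling interval). Point this at your real log file instead.
SAMPLE_LOG = """\
Date , GPU Clock [MHz] , GPU Temperature [C] , Power Consumption [% TDP]
2017-10-01 12:00:00 , 1949.0 , 55.0 , 80.0
2017-10-01 12:00:01 , 2012.0 , 60.0 , 95.0
2017-10-01 12:00:02 , 2050.0 , 65.0 , 118.0
"""

def tail_readings(log_text, n=3):
    """Return the last n logged samples; a crash sits right after them."""
    reader = csv.DictReader(io.StringIO(log_text), skipinitialspace=True)
    # Strip the padding GPU-Z puts around both headers and values.
    rows = [{k.strip(): v.strip() for k, v in row.items()} for row in reader]
    return rows[-n:]

for sample in tail_readings(SAMPLE_LOG):
    print(sample["GPU Clock [MHz]"], sample["GPU Temperature [C]"],
          sample["Power Consumption [% TDP]"])
```

If the clock and % TDP spike together right before the log stops, power delivery is suspect; if it dies pinned at the same temperature every time, look at thermal throttling instead.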


----------



## buellersdayoff

Quote:


> Originally Posted by *sid4975*
> 
> no i have nowhere else to test it. but what else would it be? its basically doing the same thing as the last one crashing around 65 degrees. i tried it on my 2k monitor and my x34 monitor and same thing. my 980ti never had any problems. what else could it be? would event viewer tell me anything?


Pretty sure I suggested two pcie cables from the power supply (not a single cable and daisy chain connection) to gpu. Can't remember if you confirmed doing that or not.
Also, 65C is around a voltage throttle point, so try lowering your clock speed (a slight negative offset) or raising the voltage slider. Be sure to run the whole system stock as mentioned.


----------



## sid4975

So I lowered the GPU clock -20MHz to 1739 and it seems to be stable. It's supposed to be stable at 1759MHz out of the box. This is ridiculous. I ran the Firestrike 4K stress loop 20 times at 1739MHz and it ran fine; the temp stayed at 66 degrees. When I try the stress test at 1759MHz it crashes before even finishing one loop.

So what are my options: send the card back again, or use it underclocked? Then when I go to sell it in a year or two I have to say, hey, the card doesn't run as advertised, make sure you downclock it 20MHz lol.

And no, I don't have anything overclocked in my system, just the regular Asus BIOS settings, and I have two PCIe cables going to the card.


----------



## KraxKill

Quote:


> Originally Posted by *sid4975*
> 
> so i lowered the gpu clock -20mhz to 1739 and it seems to be stable. its suppose to be stable at 1759mhz out of the box. this is ridiculous. i ran firestrike 4k stress loop 20 times at 1739mhz and it ran fine the temp stayed at 66 degrees. when i try the stress test at 1759mhz it crashes before even finishing 1 loop.
> 
> so what are my options send the card back again or use it underclocked. Then when i goto sell it in a year or two i have to say hey card doesn't run as advertised make sure u downclock it 20mhz lol.
> 
> and no i don't have anything overclocked in my system just regular asus bios settings, and i have two pcie cables going to the card


Are you doing as suggested? Downclocking the card simply lowers your TDP; if it's a power supply issue you'd have no clue. You need to take the computer back to stock clocks and then bench to see at what TDP the card crashes. When you find this, you can lower your core clock and increase voltage to verify whether the added TDP crashes the card. Then you'd know if there is a power delivery issue, whether from your power supply (likely, given issues with two cards) or if the card is actually at fault.

Otherwise, you need to find a friend with a PC and test the card on a different system.


----------



## sid4975

I'm sorry, how exactly do I go about doing this? I have GPU-Z, Afterburner, and 3DMark. You want me to set the card back to factory, run the stress test, and when it crashes look at the GPU-Z log? Can I upload it here? Because I have no idea what I'm looking for; I don't even know what TDP stands for.


----------



## sid4975

OK, I made the GPU-Z log and I see big jumps in the power numbers right where it crashes, but I don't know what it means. Here is the txt log, can anyone tell me what's up?

GPU-ZSensorLog.txt 120k .txt file


----------



## buellersdayoff

Quote:


> Originally Posted by *sid4975*
> 
> so i lowered the gpu clock -20mhz to 1739 and it seems to be stable. its suppose to be stable at 1759mhz out of the box. this is ridiculous. i ran firestrike 4k stress loop 20 times at 1739mhz and it ran fine the temp stayed at 66 degrees. when i try the stress test at 1759mhz it crashes before even finishing 1 loop.
> 
> so what are my options send the card back again or use it underclocked. Then when i goto sell it in a year or two i have to say hey card doesn't run as advertised make sure u downclock it 20mhz lol.
> 
> and no i don't have anything overclocked in my system just regular asus bios settings, and i have two pcie cables going to the card


Is that the actual clock speed though? Use MSI Afterburner or GPU-Z to check. It's possible with GPU Boost 3.0 that the actual clock speed is 2000+ and the chip is not stable at your BIOS's defaults (or something like that). Bump up the voltage; it won't hurt the card or your warranty.
You could also check out TechPowerUp's GPU BIOS database to see if there is a different BIOS version for your card, and maybe even contact Zotac support. If they've had a few cards like that, then they'll most likely have a BIOS revision.


----------



## buellersdayoff

The log confirms your card is hitting at least 2050MHz, so a negative offset to 2030 is not bad; bump up the voltage and you'll probably get stable.


----------



## sid4975

OK, wait a minute, so where it says GPU speed, the max it should go is 1759MHz? Because mine goes way above that, 1900-2000MHz, plus it only freezes around 2050MHz it seems. So the card's actually running faster than it's supposed to? I posted the GPU-Z log above with the crash close to the bottom, at stock. Can you tell me what I'm supposed to do?

Why is the card going up to 2050MHz if it's set to 1759MHz? And how much, by percent, should I raise the power? I can only do it in Zotac's tool; Afterburner won't unlock GPU voltage for me.


----------



## buellersdayoff

Quote:


> Originally Posted by *sid4975*
> 
> ok wait a minute so where it says gpu speed the max it should go is 1759mhz? cuz ya mine goes way above that 1900-2000mhz plus it only freezes around 2050mhz it seems. so the cards actually running faster then its suppose too? I posted the gpu-z thing above with the crash close to the bottom at stock. can u tell me what I'm suppose to do?
> 
> why is the card going up to 2050mhz if its set to 1759mhz? and how much by percent should i raise the power? i can only do it in zotac. afterburner wont unlock gpu voltage for me


That's GPU Boost 3.0; Google will give you more info on that. It seems your card/BIOS allows boosting to 2000+ but doesn't have enough voltage for the chip. Try bumping voltage up +40, +70 or +100. There is a beta version of MSI Afterburner, I think it's 4.4 Beta 10?


----------



## KraxKill

Quote:


> Originally Posted by *sid4975*
> 
> ok wait a minute so where it says gpu speed the max it should go is 1759mhz? cuz ya mine goes way above that 1900-2000mhz plus it only freezes around 2050mhz it seems. so the cards actually running faster then its suppose too? I posted the gpu-z thing above with the crash close to the bottom at stock. can u tell me what I'm suppose to do?
> 
> why is the card going up to 2050mhz if its set to 1759mhz? and how much by percent should i raise the power? i can only do it in zotac. afterburner wont unlock gpu voltage for me


Yeah, your card is boosting way beyond where you thought it was boosting, so that's why you are crashing. Something is odd with either your offset or Afterburner. Either way, you simply lack voltage for the frequency being attempted.

Considering that your prior card probably had similar issues, you may want to look towards software as the cause. I would uninstall Afterburner, delete the Afterburner install directory and the enclosed profiles, and then reboot. Download MSI Afterburner 4.4 Beta 10 (linked somewhere in this thread) and try again. If it's still boosting over 2K at default, you simply need to bring that down or add voltage.

Can you post a screenshot of the afterburner settings you're attempting? Was the above run you posted done with the defaults applied?
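To picture what that boost offset is doing, here is a toy model (the numbers are invented, not from any real BIOS) of how a flat core offset shifts every point of the GPU Boost voltage/frequency curve, so the top voltage bin ends up requesting more MHz than the silicon can actually hold:

```python
# Each entry is (voltage in mV, boost clock in MHz). The real curve lives
# in the card's BIOS and is edited via Afterburner's CTRL+F curve editor;
# these values are made up purely for illustration.
STOCK_CURVE = [(950, 1850), (1000, 1911), (1050, 1974), (1062, 1999)]

def apply_offset(curve, offset_mhz):
    """A flat offset moves every point of the curve by the same amount."""
    return [(mv, mhz + offset_mhz) for mv, mhz in curve]

# A +50 MHz offset pushes the 1062 mV bin to 2049 MHz without giving it
# any extra voltage, while a -20 MHz offset pulls it back under 1980.
for mv, mhz in apply_offset(STOCK_CURVE, 50):
    print(f"{mv} mV -> {mhz} MHz")
```

That is why a small negative core offset "fixes" a card like this: it drags the top bin back to a frequency that is stable at the stock voltage, whereas unlocking extra voltage lets the same bin keep its higher clock.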


----------



## sid4975

The gpuz.txt I posted was stock. It crashes at stock because I guess it's going to 2050MHz plus like you said. So there's nothing wrong with the card or the one before? The one I had before was probably better because it took a few minutes to crash lol.

I tried the voltage; I have to use Zotac Firestorm because Afterburner doesn't have voltage unlocked. I turned it from 10% all the way to 100% and it crashes no matter what. The only way it is stable is if I subtract -20MHz from the core. It made it through 20 runs of the 3DMark 4K stress test, which I don't know if that means it's totally stable, but it made it.


----------



## KraxKill

Quote:


> Originally Posted by *sid4975*
> 
> the gpuz.txt I posted was stock. it crashes at stock cuz I guess its going too 2050mhz plus like u said. so theres nothing wrong with the card or the one before? which the 1 I had b4 was probly better cuz it took a few mins to crash lol.
> 
> I tried doing the voltage I have to use the zotac firestorm cuz afterburner doesn't have voltage unlocked. I turned it from 10% all way too 100% it crashes no matter what. only way it is stable is if I subtract -20mhz from the core. it made it thru 20 runs of 3dmark 4k stress, which I don't know if that means it totally stable but it made it.


Pretty sure you should read through a bit of this thread and about Afterburner volt curve edits. Get Beta 10, unlock your voltage so you can stabilize those higher frequencies, or just use a negative offset like you are and be done with it.


----------



## sid4975

OK, but there's nothing wrong with the card itself then? I read about GPU Boost on Reddit and it says the card itself does it, so if the card is boosting itself, shouldn't it be stable to where it boosts?


----------



## KraxKill

Quote:


> Originally Posted by *sid4975*
> 
> ok but theres nothing wrong with the card itself then? I read the gpu boost on redit and it says the card itself does it so if the card is boosting itself shouldn't it be stable to where it boosts too?


There is nothing wrong with your card....

As mentioned above, a piece of software must be preventing your card from accessing the higher voltage bins dictated in your BIOS, OR it's adding a boost offset and pushing your card beyond what it can achieve on the stock voltage curve. You are at 1.050v; you can actually run all the way to 1.093v on the stock BIOS with unlocked voltage.

You've installed Afterburner, a program which is not perfect for every card out there. You've discovered, by posting your GPU-Z log, that Afterburner is offsetting your card at stock... this is why you're boosting so high on low voltage.

You need to uninstall Afterburner, delete the Afterburner profiles by deleting the install folder manually, and reinstall the Nvidia driver. THEN, before you install Afterburner again, run a test and post your actual STOCK GPU-Z log.


----------



## sid4975

OK, doing that now. I had Afterburner installed, but I was using Firestorm, the Zotac version. I'm uninstalling both for now to see what happens after a clean install.


----------



## buellersdayoff

Quote:


> Originally Posted by *sid4975*
> 
> ok but theres nothing wrong with the card itself then? I read the gpu boost on redit and it says the card itself does it so if the card is boosting itself shouldn't it be stable to where it boosts too?


I personally think it's the BIOS Zotac ships with the card; 2050MHz is just about the limit for most 1080 Tis and that AMP Extreme BIOS allows it. Go back through the last few posts, it seems you are overlooking some important tips (Zotac support, custom voltage curve in the Afterburner beta). Also bump up your fan speed to keep it under 65.
I don't think it is a fault with your card, just Zotac's overzealous boost clocks for an average chip. My card crashes at 2088, but because my BIOS boost clock is lower I have to manually overclock to those levels and fine-tune my voltage curve to get stable in all games/benches; only then will it boost to (a fluctuating) 2012-2050MHz.


----------



## Enzarch

Quote:


> Originally Posted by *nrpeyton*
> 
> Wow that's a fantastic result. Only 6c.
> 
> I can do 6c too.. but only when the card is drawing like half it's max TDP of course. lol
> 
> How are you managing 6c?
> 
> Did you make any modifications to your block?
> 
> Also interested in why you're using a universal block--how do you cool the VRM's? *(would love to see a picture)?
> *
> _(Do you think the difference could also be something to do with the fact the VRM isn't heating the block up-- so all the cooling power is directly soley on the GPU)?_
> 
> Very interested....
> 
> Thanks,
> 
> Nick


Block mods are mostly aesthetic (shaved/polished acrylic), but the coldplate surface has been trued and smoothed as well.
I use universal blocks because I don't like the idea of buying new blocks for each card, and also I like the challenge.

In this case, I bought the Strix card specifically to make VRM/VRAM cooling simpler and reversible, as it uses a full cover plate. Additional heatsinks were added as well as an OEM Intel fan, which is probably overkill in this case, but is my standard solution.

My speculation is that it has less to do with the other components adding heat and more to do with the fact that, since it doesn't have to make contact with the whole card, the contact pressure on the GPU is much improved. Also, flow and turbulence should be better.

As requested


----------



## ZealotKi11er

After much testing I got +143/+500 with my card. With 100% power playing BF1 the core is in the low 19XXs, while with 120% power it's in the high 19XXs. Really though, with an AIO this would stay over 2GHz.


----------



## KCDC

Apologies if this was already posted, but it appears you can now have graphs in your OSD with the latest afterburner beta 11 ... finally...

https://videocardz.com/70473/msi-afterburner-4-4-0-beta-11-will-now-plot-graphs


----------



## SultanOfWalmart

I just picked up an EVGA 1080 Ti FE along with a Titan X(p) EK water block, which I understand is compatible with the 1080 Ti. I am in the process of installing the block and noticed something that I feel I need clarification on before I continue.

According to the instructions that came with the block, and verified against those available on the EK website, the thermal pads are to be placed in the below illustrated manner:

However, after removing the OEM blower, there are additional components which have thermal pads on them... (RED is covered in the EK instructions; GREEN is NOT covered in the EK instructions but is covered under the OEM blower):

Some of the components on the far right are not even covered by the block, only plexi. My question is: do the components outlined in green need to be in contact with the block?

Hopefully that makes sense.


----------



## ZealotKi11er

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> I just picked up an EVGA 1080ti FE along with a Titan X(p) EK water-block, which I understand is compatible with the 1080 ti. I am in the process of installing the block and noticed something that I feel I need clarification on before I continue.
> 
> According to the instructions that came with the block and verified by those available from EK website, the thermal pads are to be placed in the below illustrated manner:
> \
> 
> However, after removing the OEM, blower, there are additional components which have thermal pads on them...(RED is covered in EK instructions GREEN is NOT covered in EK instruction but is covered under the OEM blower):
> 
> 
> 
> Some of the components on the far right are not even covered by the block only plexi. My question is, do the components outlined in green not be in contact with the block?
> 
> Hopefully that makes sense.


Most likely not a problem. Contact EK just to make sure.


----------



## cluster71

Quote:


> Originally Posted by *sid4975*
> 
> ok doing that now, and I had afterburner installed, but I was using firestorm the zotac version. I'm uninstalling both for now and see what happens after a clean install


I see you discussed Zotac. I can mention my experiences...
I have the Zotac Extreme. With Firestorm I could not change the LED, voltage, or fan, or save any profile. It was as if I had no permission. The program changed its appearance every time I started it. It appeared to be because the program was not digitally signed.
I gave it full authorization and most of it worked. But I never managed to get Firestorm to run over 1.081v. I was in touch with support, and I flashed the BIOS. It was the same.

I tested instead with EVGA's software and then Afterburner, but you need to correct the voltage/clock curve to increase the voltage to 1.093v, and with XOC I can go up to 1.2v. There is nothing wrong with the card; it is the software.
I was annoyed by the LED and the pausing fans. I fixed this with Firestorm, then uninstalled Firestorm and now run with Afterburner. The card remembers the settings for the LED and pausing fans.


----------



## cluster71

Quote:


> Originally Posted by *KCDC*
> 
> Apologies if this was already posted, but it appears you can now have graphs in your OSD with the latest afterburner beta 11 ... finally...
> 
> https://videocardz.com/70473/msi-afterburner-4-4-0-beta-11-will-now-plot-graphs


Thank you for the information about the new version


----------



## KedarWolf

Quote:


> Originally Posted by *KCDC*
> 
> Apologies if this was already posted, but it appears you can now have graphs in your OSD with the latest afterburner beta 11 ... finally...
> 
> https://videocardz.com/70473/msi-afterburner-4-4-0-beta-11-will-now-plot-graphs


I'll update the thread with Beta 11 in an hour when I get home from work; it's a bit of a hassle on my phone.


----------



## sid4975

cluster, can you tell me how you have your settings? It seems you're right, mine is at the same 1.08v, so what exactly do I change? Can you maybe show me a screenshot of yours?


----------



## Falkentyne

That link doesn't work. It links to: http://www.guru3d.com/files/index.html


----------



## cluster71

In MSI Afterburner, press CTRL+F and lift the curve to 1.093v. You must put the voltage slider at 100% as well. (Select a point, press CTRL+L, and hit Apply to lock a clock.) Run once with GPU-Z so the card boosts and passes the Firestorm lock; then you can do whatever you want with the curve.


----------



## cluster71

http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta11.rar

Paste it into your browser.


----------



## sid4975

So I tried messing with everything in Afterburner as much as I could understand. I turned the voltages/curves up, down, and everything in between. Everything crashes instantly. The only way to make the card stable is to run it at -15MHz core; anything else is just crash after crash within seconds. At least the last card I sent back that was doing the same thing lasted a few minutes before the crashes. Oh well, I guess I'll just make do. Hopefully they'll release some kind of BIOS update that fixes these issues. It seems like a bunch of people are having this problem if you check Newegg, and it's probably almost all related to GPU Boost.


----------



## bmgjet

Quote:


> Originally Posted by *sid4975*
> 
> so I tried messing with everything in afterburner as much as I could understand. I turned voltages/curves up down and everything between. everything crashes instantly. the only way to make the card stable is to run it at -15mhz core. anything else just crash after crash within seconds. atleast the last card I sent bac that was doin same thing lasted a few mins b4 the crashes. O well I guess Ill just make do. hopefully they'll release some kind've bios update that fixes these issues. it seems like a bunch of people are having this problem if u check newegg and its probably almost all related to the gpu boost.


What temp is it hitting?
Mine wasn't stable at stock boost clocks if it hit 62C+, which was disappointing since stock boost was only 1987MHz.
I had to use a custom fan profile so the fan came on at 100% at 50C; it would hover around 58C in games and it was fine.
I put a water block on it and now it's happy to run at 2050MHz as long as it stays under 48C.


----------



## sid4975

It fluctuates, but the highest I get is around 1999MHz and the highest the voltage goes is 1062mv or something, though usually lower. It never goes above 66 degrees. If I try to change anything, like I said, it doesn't even get a chance to get up to 66 degrees; it crashes almost instantly.







don't know if I should attempt 1 last rma or try a different card. Or just deal with it.


----------



## cluster71

I do not know what to say. Zotac seems to have a problem with the assembly. My card was also not good from the start. It quickly went up to 80C and crashed. I ran at 2062 at 1.062v for it to work. I swapped the paste (liquid metal) and padding. Since then, 2100 at 1.093v without any problems. I have just changed the IRQ memory intervals for the shared PCIe port, and then tested. Running now for 1.5 hours at 2126/11800, 1.125v, 39C with a 20C ambient; more voltage than I usually use for that clock, and I run aircooled, slightly modified.


----------



## roberta507

I never like to make comments about one brand vs another, but I RMA'd my Zotac after upgrading from the 980 Ti AMP Extreme to their 1080 Ti variant.
It was like a brick that ran at its default specs and that was it.
No headroom, fan control, or LED lights, and bad software.
Out of the box 2050 and that was it.
Talked to them about the problem; it's poor quality control.


----------



## buellersdayoff

Quote:


> Originally Posted by *sid4975*
> 
> it fluctuates but the highest I get is around 1999mhz and the highest voltage goes is 1062mv or something but usually lower. it never goes above 66 degrees. if I try to change anything like I said it doesn't even get a chance to get up to 66 degrees it crashes almost instantly.
> 
> 
> 
> 
> 
> 
> 
> don't know if I should attempt 1 last rma or try a different card. Or just deal with it.


You could possibly flash the amp bios (not extreme) so that it doesn't boost as high, but I'd be tempted to replace the card with something else after going through all that.


----------



## kevindd992002

Quote:


> Originally Posted by *kevindd992002*
> 
> 1.) I have the AOC G2460PG, it's a 144Hz 1080p monitor, and yes G-Sync is enabled. I understand that I don't need much power to generate 144fps @ 1080p but then again the problem is that I'm not limiting myself/graphics card/game (vsync) to that framerate.
> 
> 2.) I've been with a couple of Nvidia cards (my last was a 1070) and I never bother to use "prefer maximum performance" as I read that the card doesn't 'idle' when that is set. What's your take on this?
> 
> 3.) Yeah, I should've turned the power slider all the way before anything.
> 
> 4.) Ok, good to know. What screen capture app do you use to do in-game captures or even record a video?
> 
> 5.) N/A
> 
> 6.) Ok, I'll keep that in mind but I'll have to try one step at a time so that we'll know what caused the problem when it gets solved.
> 
> I know that I generally just need 144fps for a 144Hz monitor anyway but then I'm merely just doing some testing here. I want to know how high of an fps can my card reach with GOW4 @ 1080p before my CPU bottlenecks it. GOW4 being poorly ported is worth looking into but then I don't understand the use of the "unlimited" framerate limit in the settings.
> 
> 7.) Regarding DSR, yes I'm familiar with it and is actually what I was thinking on doing as the next step. I've tried the in-game version of DSR before with my 1070 and it works just fine.
> 
> I know that a 1080Ti is overkill for a 1080p monitor but like I said I'm merely testing this. I'll just waiting for a deal for the Dell S2716DG and I'll go ahead and buy that for 1140p.
> 
> EDIT:
> 
> Just did some additional testing runs and here's my observation:
> 
> 1.) It's definitely not power related. I've switched to the slave GPU BIOS and ramped up the slider in MSI Afterburner to 125% max. GPU-Z shows only 70% TDP while GOW4 is running.
> 
> 2.) It's also not the NVCP power setting. I tried "prefer maximum performance" and the results are the same.
> 
> 3.) And then I tried DSR. I've set the in-game resolution to 3840x2160 and the GPU load just soared to 99%! So what the heck? I'm stuck at just 100fps at 1080p even though the card can do more?


Can anyone give their thoughts on this?


----------



## dansi

Quote:


> Originally Posted by *roberta507*
> 
> I never like to make comments about one brand vs another but I RMA'd my Zotac after upgrading from the 980Ti Amp Extreme to their 1080Ti variant
> It was like a brick that ran at its default specs and that was it
> No head room,fan control,led lights,and bad software
> Out of box 2050 and that was it
> Talked to them about the problem and its poor quality control


2050 is like mid-way between golden and average.

I don't see a problem.

Nature of Pascal: no tweaking can help once you've found the equilibrium.

I guess if modders can access Pascal's temp lock and allow us to change how much boost clock to apply for each temp range, then we can get higher specs.

Otherwise we're dependent on our cooling apparatus. The cooler you can get, the higher the boost bins from the equilibrium clocks.


----------



## nrpeyton

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> I just picked up an EVGA 1080ti FE along with a Titan X(p) EK water-block, which I understand is compatible with the 1080 ti. I am in the process of installing the block and noticed something that I feel I need clarification on before I continue.
> 
> According to the instructions that came with the block and verified by those available from EK website, the thermal pads are to be placed in the below illustrated manner:
> \
> 
> However, after removing the OEM, blower, there are additional components which have thermal pads on them...(RED is covered in EK instructions GREEN is NOT covered in EK instruction but is covered under the OEM blower):
> 
> 
> 
> Some of the components on the far right are not even covered by the block only plexi. My question is, do the components outlined in green not be in contact with the block?
> 
> Hopefully that makes sense.


Answer to your question: no, they don't. And if you look at my experiment below, you'll find they're also the HOTTEST part of the card, which affects memory overclocking, as the GDDR5X memory chips closest to the VRM are running 15c hotter than those on the other side.

Missing Thermal Pads on *ALL* EK 1080Ti FE blocks
--- Memory overclocking affected ---

I noticed *EXACTLY* the same thing when I installed the 1080TI FE with a Pascal block:

So I did some *temperature testing* last night (using temp probes attached to my fan controller) and here are the results:


For FULL SIZE right click picture & 'open new tab'


Temperature probes were obviously attached directly behind the components on the card (on the back of the PCB)

Coolant Temperature was: 19.5c _(using Water Chiller)_
Ambient Temp in room: 31c

*GPU Core Temp:* 35c
*Power Draw During Test:* 350 watts
*Core Frequency:* 2126 MHz
*Memory:* +575 (or 6075)

The blocks are definitely adequately cooled. But *only* adequately.

The main design flaw _(not with the block itself, but with the instructions in the MANUAL and the supplied thermal pads)_ is the missing pads on the line of capacitors between the memory chips (no. 1) and the smaller of the two VRM pads (no. 2).

The only reason this section is even as cool as it is -- is due to the thermal padding on the components adjacent to it. (You'd be surprised at how incredibly *thermally conductive a PCB is*).

Most importantly, the MAIN thing this impacts isn't the stability or lifespan of the VRM components (they are rated at up to 115c), but *your memory overclock stability.*

You can clearly see that the closer to the VRM a component is, the hotter it is. I.e. on the I/O side of the card the memory is only 5c above ambient room temperature, whereas the memory chips closest to the VRM are at a whopping 49c (nearly 15c above even the core temp).

My next project is to ADD THE MISSING THERMAL PADS.

This should help significantly reduce the temperatures of the memory chips closest to the VRM. (If we could get ALL the GDDR5X memory chips to the same temp as the ones on the I/O side, the memory would definitely overclock faster.) I actually PROVED this to be the case on my last card (my 1080 Classy), where I was able to overclock with complete stability up to +925. I could even complete a bench at +1000 (albeit with some artefacts). I'm not talking rubbish here guys; in fact, here's an article at EVGA.com where I was awarded a ribbon for my work researching a better pad arrangement on my previous card _(versus the stock instructions in the manual):_
https://forums.evga.com/Must-Have-changes-to-EK-manual-RE-pad-sizes-gt-For-780TI-Classy-Block-gt-1080-Classy-m2598540.aspx

@SultanOfWalmart You mentioned contacting EK. I plan on doing the same. I will post their response here. If you get a response first can you PM me in case I miss your post? I promise to do the same.
Of course I will obviously also post the results here.

P.S.
Nvidia were smart: notice how the 12th (missing) memory chip position is on the VRM side. This makes sense, as it increases the chances of still getting a good memory overclock. _(The impact from a sub-par "silicon lottery" memory chip would at least be reduced if it were running colder.)_
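For reference, the spread in those probe readings works out as follows (plain arithmetic on the figures quoted above; the I/O-side memory is taken as roughly 36c, i.e. about 5c over the 31c ambient):

```python
# Temperature rise of each probe over the 19.5 C coolant, using the
# figures from the post above (I/O-side memory approximated as 36 C).
COOLANT_C = 19.5
probes = {
    "GPU core": 35.0,
    "memory, I/O side": 36.0,
    "memory, VRM side": 49.0,
}
for name, temp in probes.items():
    print(f"{name}: +{temp - COOLANT_C:.1f} C over coolant")

# The VRM-side chips sit 13 C above the I/O-side ones; that delta is
# what adding the missing pads should shrink.
gap = probes["memory, VRM side"] - probes["memory, I/O side"]
print(f"VRM-side vs I/O-side memory gap: {gap:.1f} C")
```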


----------



## GraphicsWhore

Played with voltage curve and I seem to be solid at 2088. Mem is at 6003. Ran SP4k



Playing ME:A, FH3 and Watch Dogs 2 - so far so good.


----------



## buellersdayoff

Quote:


> Originally Posted by *kevindd992002*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kevindd992002*
> 
> 1.) I have the AOC G2460PG, it's a 144Hz 1080p monitor, and yes G-Sync is enabled. I understand that I don't need much power to generate 144fps @ 1080p but then again the problem is that I'm not limiting myself/graphics card/game (vsync) to that framerate.
> 
> 2.) I've been with a couple of Nvidia cards (my last was a 1070) and I never bother to use "prefer maximum performance" as I read that the card doesn't 'idle' when that is set. What's your take on this?
> 
> 3.) Yeah, I should've turned the power slider all the way before anything.
> 
> 4.) Ok, good to know. What screen capture app do you use to do in-game captures or even record a video?
> 
> 5.) N/A
> 
> 6.) Ok, I'll keep that in mind but I'll have to try one step at a time so that we'll know what caused the problem when it gets solved.
> 
> I know that I generally just need 144fps for a 144Hz monitor anyway but then I'm merely just doing some testing here. I want to know how high of an fps can my card reach with GOW4 @ 1080p before my CPU bottlenecks it. GOW4 being poorly ported is worth looking into but then I don't understand the use of the "unlimited" framerate limit in the settings.
> 
> 7.) Regarding DSR, yes I'm familiar with it and is actually what I was thinking on doing as the next step. I've tried the in-game version of DSR before with my 1070 and it works just fine.
> 
> I know that a 1080Ti is overkill for a 1080p monitor but like I said I'm merely testing this. I'll just waiting for a deal for the Dell S2716DG and I'll go ahead and buy that for 1140p.
> 
> EDIT:
> 
> Just did some additional testing runs and here's my observation:
> 
> 1.) It's definitely not power related. I've switched to the slave GPU BIOS and ramped up the slider in MSI Afterburner to 125% max. GPU-Z shows only 70% TDP while GOW4 is running.
> 
> 2.) It's also not the NVCP power setting. I tried "prefer maximum performance" and the results are the same.
> 
> 3.) And then I tried DSR. I've set the in-game resolution to 3840x2160 and the GPU load just soared to 99%! So what the heck? I'm stuck at just 100fps at 1080p even though the card can do more?
> 
> 
> 
> Can anyone give their thoughts on this?
Click to expand...

If your game is capped @100fps then it seems your GPU only needs ~70% power to do so. That's why it bumps up to 99% with a higher resolution. If you can remove the frame cap your problem will be solved. Or just enjoy it the way it is.


----------



## kevindd992002

Quote:


> Originally Posted by *buellersdayoff*
> 
> If your game is capped @100fs then it seems your gpu only ~70% power to do so. That's why it it bumps up to 99%with higher resolution. If you can remove the frame cap your problem will be solved. Or just enjoy it the way it is.


I know, but like I said it is not capped at all. Is GOW4 known to be capped? FWIW, I only get around 300fps in CSGO too.


----------



## Aganor

Quote:


> Originally Posted by *nrpeyton*
> 
> Answer to your question: no they don't - and if you look at my experiment below.. you'll find they're also the HOTTEST part of the card. Which affects memory overclocking. As the GDDR5X memory chips closest to the VRM are running 15c hotter than those on the other side.
> 
> Missing Thermal Pads on *ALL* EK 1080Ti FE blocks
> --- Memory overclocking affected ---
> 
> I noticed *EXACTLY* the same thing when I installed the 1080TI FE with a Pascal block:
> 
> So I done some *temperature testing* (using temp probes attached to my fan controller) and here are the results:
> 
> 
> For FULL SIZE right click picture & 'open new tab'
> 
> 
> Temperature probes were obviously attached directly behind the components on the card (on the back of the PCB)
> 
> Coolant Temperature was: 19.5c
> Ambient Temp in room: 31c
> 
> GPU Core Temp: 35c
> Power Draw During Test: 350 watts
> Core Frequency: 2126 MHz
> Memory: +575 (or 6075)
> 
> The blocks are definitely adequately cooled. But *only* adequately.
> 
> The main design flaw is the missing pads on the line of capacitors between the memory chips (no.1) and the 2nd largest thermal pad (no.2).
> 
> The only reason this section is even as cool as it is -- is due to the thermal padding on the components adjacent to it. (You'd be surprised at how incredibly *thermally conductive a PCB is*).
> 
> Most importantly - the MAIN thing this impacts isn't stability or lifespan of VRM components (as they are rated at up to 115c). But *Your memory overclock stability.*
> 
> You can clearly see that the closer to the VRM a component is -- the hotter it is.... (I.E on the I/O side of the card the memory is only 5c above ambient room temperature. Where as the memory chips closest to the VRM are at a whopping 49c. (nearly 15c above even the core temp).
> 
> My next project: is to ADD THE MISSING THERMAL PADS.
> 
> This should help significantly reduce the temperatures of the memory chips closest to the VRM. (If we could get ALL GDDR5X memory chips at the same temp as ones on the I/O side then memory would definitely overclock faster. I actually PROVED this to be the case on my last card (my 1080 Classy). Where I was able to overclock with complete stability up to +925. I could even complete a bench at +1000 (albeit with some artefacts). I'm not talking rubbish here guys; in fact ere's an article at EVGA.com where I was awarded a ribbon for my work into researching a better pad arrangement on my previous card _(from stock instructions in manual):_
> https://forums.evga.com/Must-Have-changes-to-EK-manual-RE-pad-sizes-gt-For-780TI-Classy-Block-gt-1080-Classy-m2598540.aspx
> 
> @SultanOfWalmart You mentioned contacting EK. I plan on doing the same. I will post their response here. If you get a response first can you PM me in case I miss your post? I promise to do the same.
> Of course I will obviously also post the results here.
> 
> P.S.
> Nvidia were smart--notice how the 12th (missing) memory chip is on the VRM side. This makes sense. As it increases the chances of still getting a good memory overclock. _(As impact from a sub-par "silicon lottery" chip would at least perform better if it were colder.
> _


I noticed the same thing when I installed the 1080TI FE with a Pascal block:


----------



## SultanOfWalmart

I think what I'll do is take the block off while I still have the card out of the loop and add some pads that I have left over from the earlier install.


----------



## Coree

Got my 1080 TI blower, and I have an Accelero III lying around. The problem is that I don't want to glue heatsinks onto my VRM/VRAM, because if I need to RMA my card it's impossible to get them off.
I also have an old 780 DirectCU II backplate; would it help dissipate heat if I add thick thermal pads to it? It may fit my 1080 TI.


----------



## velocityx

I knew about it before I disassembled my Ti, and when I did my installation I cut off a bit of a thermal pad from the top piece next to it and put it on this line that didn't have one. With this mod it overclocks the same, but I can't speak for the extremes here. I just put +500 on the memory to bring it to 12GHz. It overclocked like that on the stock cooler and still overclocks like that on water.

EK manuals have a few errors in them. One is the thermal pad on the block we're talking about here; the second is the Supremacy EVO CPU backplate installation orientation image in the manual. It's upside down. I almost destroyed my mobo trying hard to follow the instructions until I went online and found a how-to on some website which told me to do it the way I thought it should be from the beginning.

Also the GPU block and backplate: why doesn't the manual say that if you're installing the backplate, skip screwing in screws nr. 1, 2, 3, 4, 5, 6 and go directly to the backplate installation? I found it really annoying when at 1 AM I'm just trying to put it together and go to sleep, and later I find I have to unscrew the screws I just screwed in to put on the backplate.









----------



## nrpeyton

Quote:


> Originally Posted by *kevindd992002*
> 
> I know but like I said it is not capped or whatsoever. Is GOW4 known to be capped? FWIW, I only get around 300fps in CSGO too.


If G-Sync is enabled on a 144Hz monitor, then your framerate won't exceed 144 FPS.

And because the GPU doesn't need to be fully utilised to push 144 FPS at only 1080p, that's why you're seeing *less* than 97% GPU utilisation.

If you turned on "Dynamic Super Resolution" to force the game to render at a higher resolution (before downscaling the image to 1080p), you'd see less FPS and more GPU utilisation.

Alternatively -- turn off G-Sync. Your FPS will go through the roof, probably 200+, and your GPU's utilisation will increase to 97-100% too.


----------



## Aganor

Quote:


> Originally Posted by *Coree*
> 
> Got my 1080 TI blower, and I have a Accelero III laying around. The problem is that I don't want to glue heatsinks on my VRM/VRAM, because if I need to RMA my card it's impossible to get them out.
> I have also an old 780 DirectCU II backplate, would it help dissipating heat if I add thick thermal pads on them? It may fit my 1080 TI


My card gets hot on the backplate, so it's always a good thing.


----------



## nrpeyton

Quote:


> Originally Posted by *Enzarch*
> 
> Block mods are mostly aesthetic (shaved/ polished acrylic) but the coldplate surface has been trued and smoothed as well.
> I use universal blocks because I dont like the idea of buying new blocks for each card, and also I like the challenge.
> 
> In this case, I bought the Strix card specifically to make VRM/VRAM cooling simpler/reversible as it uses a full cover plate. Additional heatsinks were added as
> well as a OEM Intel fan, which I probably overkill in this case, but is my standard solution.
> 
> My speculation is that it has less to do with the other components adding heat and more to do with that fact that since it doesnt have to make contact with the whole card;
> the contact pressure on the GPU is much improved. Also flow and turbulence should be better.
> 
> As requested


beautiful ;-)

Do you have fans pointing at those heatsinks you added? And what temperature is your *1)* VRM & *2)* memory? I like this idea too, and I also love to play around with an open PCB when possible.

Anyway, this is the 1st time I've seen this done properly here. So definitely worth a rep+

Please share any more information you have. I'm hungry lol.








(I'm specifically interested in the temperature of your GDDR5X chips closest to the VRM.)

While we're on the subject, 3rd question: what's your maximum stable memory overclock (i.e. before errors begin appearing in OCCT's GPU stress test tab)?


----------



## kevindd992002

Quote:


> Originally Posted by *nrpeyton*
> 
> If g-sync is enabled on a 144hz monitor, Then your refresh rate won't exceed 144 FPS.
> 
> And because the GPU doesn't need to be fully utilised to push 144 FPS at only 1080p; that's why you're seeing less than 97% GPU Utilisation.
> 
> If you turned on "Dynamic Super Resolution" to force the game to render at a higher resolution (before downscaling the imagine to 1080p) you'd see less FPS and more GPU Utilisation.
> 
> Alternatively -- turn off g-sync. You're FPS will go through the roof. Probably 200+ and your GPU's utilisation will also increase to 97-100% too.


I beg to differ. I think you're talking about G-Sync + V-Sync. With that setting, G-Sync activates when fps is below the refresh rate, and V-Sync caps it at the refresh rate.

When G-Sync is on and V-Sync is off (my setting), G-Sync activates when fps is below the refresh rate, with no cap above it.

But for the sake of argument, I will try to disable G-Sync altogether and will let you know.


----------



## nrpeyton

Quote:


> Originally Posted by *kevindd992002*
> 
> I beg to differ. I think you're talking about the gsync + vsync. With this setting, gsync will activate when fps is below the refresh rate, and vsync caps it at the refresh rate.
> 
> When gsync is on and vsync is off (my setting), gsync will activate when fps is below the refresh rate, with no cap above it.
> 
> But for the sake of argument, I will try to disable gsync altogether and will let you know.


Turn *off* both.

Either through the NVIDIA control panel or in-game, or both. But definitely turn both off. If you need help with that, give me a shout and I'll guide you through it.


----------



## nrpeyton

Quote:


> Originally Posted by *velocityx*
> 
> I knew about it before I dissasembled my Ti and when I did my installation, what I did was I cut off a bit of a thermal pad from the top piece next to it and put it on this line that didnt have one. With this mod it overclocks the same but can't speak for the extremes here. I just put it +500 on mem to bring it to 12ghz. It did overclock like that on stock cooler and still overclocks like that on water.
> 
> Ek manuals have a few errors with them, one is the thermal pad on the block thing we talk about here, second one is the supremacy evo cpu backplate installation orientation image in the manual. it's upside down. I almost destroyed my mobo trying hard to follow the instruction until I went online and found a how to on some website which told me to do it like I thought it should be from the beginning.
> 
> Also the gpu block and backplate. Why doesnt the manual say that if you're installing the backplate. skip screwing scews nr 1,2,3,4,5,6 and go directly to backplate installation. I found it really annoying when at 1 am in the morning i'm trying to just put it together and go to sleep and later I find I have to unscrew the screws I just screwed in to put on the backplate.
> 
> 
> 
> 
> 
> 
> 
> ,


That's good bro.

Missing Thermal Pads on *EK* 1080TI FE blocks .....c o n t i n u e d

*Original post:* http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/11910#post_26181737

Did you check that the extra pad was making contact with the block, though? If not, it could act as more of a hindrance (helping trap heat against those capacitors).

The trick I used was this:

Apply a thin layer (or dab) of thermal grease to the top of each thermal pad, including the GPU core. Then lightly place the block down on top of the card (without screwing it down).

Lift it off, then:
If residue from the thermal paste appears on ALL the components -- including all GDDR5X memory chips (and the GPU-core section on the block), the VRM, and any extra VRM areas you decided to pad (beyond the minimum requirement in the EK manual) -- then you're good to go!

Sometimes it can be painstaking - sometimes one pad affects the contact of others adjacent to it. On my Classified I ended up changing the entire arrangement, but by the end it was perfect. So by the time I screwed the block down, ALL thermal pads were getting the best possible compression, thus delivering the best possible heat transfer.
When I tested, I managed to successfully reduce the temp on the GDDR5X memory chips closest to the VRM by nearly 20c.


----------



## nrpeyton

Quote:


> Originally Posted by *Coree*
> 
> The problem is that I don't want to glue heatsinks on my VRM/VRAM, because if I need to RMA my card it's impossible to get them out.
> I have also an old 780 DirectCU II backplate, would it help dissipating heat if I add thick thermal pads on them? It may fit my 1080 TI


You just need to find the right glue (and apply a minimal amount).

*Most importantly* - these were *easily* removed when I had to put the original heatsink back on -- before trading the card in to buy my 1080 Ti.

The trick with the self-adhesive ones is to stick them to something else first. It reduces the stickiness. Do it a few times (obviously not too many), until you find the right level of adhesion: not so strong it can't be removed, not so weak it falls off when the card heats up.


----------



## TheBoom

Quote:


> Originally Posted by *sid4975*
> 
> it fluctuates but the highest I get is around 1999mhz and the highest voltage goes is 1062mv or something but usually lower. it never goes above 66 degrees. if I try to change anything like I said it doesn't even get a chance to get up to 66 degrees it crashes almost instantly.
> 
> 
> 
> 
> 
> 
> 
> don't know if I should attempt 1 last rma or try a different card. Or just deal with it.


Quote:


> Originally Posted by *roberta507*
> 
> I never like to make comments about one brand vs another but I RMA'd my Zotac after upgrading from the 980Ti Amp Extreme to their 1080Ti variant
> It was like a brick that ran at its default specs and that was it
> No head room,fan control,led lights,and bad software
> Out of box 2050 and that was it
> Talked to them about the problem and its poor quality control


Bad stock BIOS. I'm running XOC and have no issues. It might be worth a try for you. Proper fan control, more headroom.

Also, 2050 on the core is not even guaranteed for most 1080 Tis, so that's not even an issue.

Their QC is a little behind brands like Asus, but still better than some of the brands out there.

I'd suggest flashing the XOC and seeing how it goes for you. Default voltage with XOC is 1.12-1.25v, so your card shouldn't crash at the stock boost. Also no thermal and power limits.


----------



## velocityx

I agree with the poster above. If it was 1950 out of the box I would be upset about it, but 2050? Jesus, I have 2050 rock steady and I have my Ti on water...


----------



## Coree

Could someone help? For some reason my stock 1080 TI blower doesn't want to throttle during BF1!

Clocks stay at 1863 MHz no matter what. I set my temp limit to 85C, and the temps continue to rise above 95C unless I put the fan to 70%, then it goes <85C. (The GPU should reduce clockspeed AUTOMATICALLY when hitting the temp limit.)

But in Unigine Heaven, my GPU throttles and temps are within the limits (85C). BF1 doesn't do this? Why does my GPU bypass the temp limits even on stock settings? Doesn't the GPU know how to throttle down in BF1? What is going on? I have the newest drivers installed.


----------



## kevindd992002

Quote:


> Originally Posted by *nrpeyton*
> 
> Turn *off* both.
> 
> Either through nvidia control panel or in-game. Or both, But definitely turn both off. If you need help to do that-- give me a shout and I'll guide you through


I will, since the purpose of my testing is to have the max fps anyway.

But at regular fps (below the refresh rate), G-Sync shouldn't cap anything.

If you read the gsync section in the blurbuster website, it fully explains how gsync works and how it won't give you any framerate cap.


----------



## pantsoftime

Has anyone ever tried shunt modding using conductive tape? Copper conductive tape (and maybe certain aluminum ones) seem to have the right resistance levels.


----------



## Streetdragon

I have installed the EK block on my GPU with LM (forgot the photo, sorry).

Idle temps are the same as the water temp, but at load it jumps from 23° to 48-52°, not cool -.-

Or is this normal?

What do you think? I have no idea if this is good. Got crashes at higher clock speeds.


----------



## dunbheagan

Quote:


> Originally Posted by *Streetdragon*
> 
> have installed the ek block on my gpu wioth LM()forgot photo sorry)
> 
> idletemps are saame like water but at load it jumps form 23° to 48-52° not cool-.-
> 
> or is this normal?
> 
> 
> what do you think? i have no idea if this is good. god crash at higher clock speed


What's your water temp at load? With LM and an EK waterblock, the delta T between coolant and GPU should not be higher than 10 degrees. I have the same configuration and my highest delta T is 6 degrees at full load on an overclocked card. And what's your flow rate?
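
For anyone wanting to sanity-check their own loop, here's a rough sketch of the math behind these delta T figures. The card wattage and flow rate below are assumed round numbers for illustration, not values taken from anyone's post.

```python
# Back-of-envelope check on coolant delta T across the GPU block.
# All inputs are assumed illustrative values, not measured data.

P_WATTS = 350.0          # assumed GPU heat load under full load
FLOW_LPM = 1.0           # assumed loop flow rate, litres per minute
C_WATER = 4186.0         # specific heat of water, J/(kg*K)

mdot = FLOW_LPM / 60.0   # mass flow in kg/s (1 L of water ~ 1 kg)

# Temperature rise of the coolant as it passes through the block
coolant_rise = P_WATTS / (mdot * C_WATER)

print(f"coolant warms ~{coolant_rise:.1f} K crossing the block")
```

With good LM contact, the core-over-coolant delta only adds a few degrees on top of that, which is how a total delta under 10 degrees is achievable.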


----------



## Streetdragon

The water was at 27° at full load at that time.

Flow rate I don't know, but pump at full speed or at 50% makes no difference.


----------



## kevindd992002

Quote:


> Originally Posted by *kevindd992002*
> 
> I will, since the purpose of my testing is to have the max fps anyway
> 
> 
> 
> 
> 
> 
> 
> But in regular fps


@nrpeyton

As expected, gsync off and on did absolutely no difference. Like I said, gsync is NOT vsync and does not cap any framerates. What's next?


----------



## KedarWolf

Quote:


> Originally Posted by *kevindd992002*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kevindd992002*
> 
> I will, since the purpose of my testing is to have the max fps anyway
> 
> 
> 
> 
> 
> 
> 
> But in regular fps
> 
> 
> 
> @nrpeyton
> 
> As expected, gsync off and on did absolutely no difference. Like I said, gsync is NOT vsync and does not cap any framerates. What's next?
Click to expand...

You need to cap your frame rate just below your refresh rate with G-Sync though or G-Sync is not working. Or have Vsync on as well.


----------



## kevindd992002

Quote:


> Originally Posted by *KedarWolf*
> 
> You need to cap your frame rate just below your refresh rate with G-Sync though or G-Sync is not working. Or have Vsync on as well.


Yes, I understand. 3 fps below the refresh rate is the recommendation from Blur Busters.

But let's forget about G-Sync for now. It shouldn't affect the framerates AT ALL. V-Sync does, and that's turned off for GOW4, but it still seems to be capped to just around 100fps in my case.

I just tested Overwatch at 1080p and I'm getting the expected results! 220ish fps or so and 99% GPU load while keeping vsync off.

So that tells me the problem is GOW4. But then I also experience low GPU load in CSGO so that game is also not optimized for high framerates?

In CSGO, I get 88% CPU max thread usage and only 30% GPU load!
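
A toy model of why a CPU-bound game shows low GPU load: the GPU sits idle for part of every frame while it waits on the main thread. The per-frame timings below are made-up numbers loosely patterned on the GOW4 figures above, purely for illustration.

```python
# Toy model: frame time is set by whichever of CPU or GPU is slower.
# Timings below are assumed illustrative values, not measurements.

cpu_ms_per_frame = 10.0   # assumed main-thread cost -> caps fps at 100
gpu_ms_per_frame = 7.0    # assumed GPU render cost at 1080p

frame_ms = max(cpu_ms_per_frame, gpu_ms_per_frame)
fps = 1000.0 / frame_ms
gpu_util = gpu_ms_per_frame / frame_ms   # fraction of the frame GPU is busy

print(f"fps: {fps:.0f}, GPU load: {gpu_util:.0%}")
```

Raising the resolution (e.g. via DSR) inflates the GPU cost per frame until it becomes the slower side, which is why utilisation jumps to 99% even though fps drops.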


----------



## Aganor

Quote:


> Originally Posted by *kevindd992002*
> 
> @nrpeyton
> 
> As expected, gsync off and on did absolutely no difference. Like I said, gsync is NOT vsync and does not cap any framerates. What's next?


Use RivaTuner from MSI AB, works like a charm


----------



## cluster71

Is there anyone who has done the shunt mod on a Zotac Extreme or Gigabyte Aorus? I don't recall seeing one, but I can search the forum myself of course.
My PC pulls around 470w total at most (Z270, 7700K @ 5.2 GHz). HWiNFO64 has shown upwards of 400w for the GPU at most. I run the XOC BIOS but have not hit the hardware restriction. There's about ~400w available on the Extreme I guess. Or is it 450w, as in the old AMD Uber mode? I might need to go over 2165 to hit it?

Anyone who has experienced it, or should I drop the idea?


----------



## KedarWolf

For anyone interested, this Zotac Amp Extreme 384W BIOS works with our FEs.

Run the powerlimit.bat as admin after you install it, though.

Oh, and watch the memory offset. This BIOS's default memory clock is higher, so to run 6210 memory for benching I only used a +608 offset.

The offset needs to be set lower than with most BIOSes, or the memory will be too high and your system will likely crash.
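
The offset arithmetic can be sketched like this. The base clocks below are assumptions for illustration only, not values read out of either BIOS:

```python
# Why the same target memory clock needs a smaller offset on a BIOS
# with a higher default memory clock. Base clocks here are assumed.

FE_BASE_MEM = 5505       # MHz, assumed FE default memory clock
ZOTAC_BASE_MEM = 5602    # MHz, assumed higher Zotac AMP default

target = 6210            # MHz benching target from the post

print(f"FE offset needed:    +{target - FE_BASE_MEM}")
print(f"Zotac offset needed: +{target - ZOTAC_BASE_MEM}")
```

Same target, smaller offset: carry your FE offset over unchanged and the memory lands higher than you intended, which is the crash scenario described above.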

Decent Time Spy score.

Zotac.GTX1080Ti.11264.170327.zip 156k .zip file


powerlimit.zip 0k .zip file


----------



## kevindd992002

Quote:


> Originally Posted by *Aganor*
> 
> Use RivaTuner from MSI AB, works like a charm


Yes, I use RTSS for the OSD, but I'm testing out the full capability of my 1080 Ti with GOW4 now, so I don't intend to cap framerates.


----------



## cluster71

Yes, the graphics score in Time Spy is interesting. Superposition seems less interesting, as it is possible to cheat it in several ways, as shown here on the forum in the other threads.


----------



## Streetdragon

Quote:


> Originally Posted by *dunbheagan*
> 
> Whats your watertemp at load? With LM and EK waterblock the delta T between coolant and gpu should not be higher than 10 degree. I got the same configuration and my highest delta T is 6 degree at full load on a overclocked card. And whats your flow rate?


Found the problem... used too little LM... I thought less is better than more, but.... kek. Now I've used way more of it on the GPU than on my CPU.

Disassembled the block in the case, over the motherboard, still connected to the loop. Never again.

edit: ahhh, now I understand the power throttling... nice xD


----------



## lilchronic

Quote:


> Originally Posted by *Streetdragon*
> 
> idletemps are saame like water but at load it jumps form 23° to 48-52° not cool-.-
> 
> or is this normal?
> 
> what do you think? i have no idea if this is good. god crash at higher clock speed

Quote:


> Originally Posted by *lilchronic*
> 
> Yeah those are some very good temps for a 1080Ti.
> 
> I run 2x 360 rads in push pull and the max temp i have ever seen is 52c ambient temps 27-28c
> My brother on the other hand has MORA 1080 rad and a 360 rad with temps hitting 50c


I think you're fine.


----------



## mafia97160

All my benches: FE XOC BIOS, 2190 MHz core @ 1.200 V, 6107 MHz memory.
[email protected] 4.850 GHz, 1.375 vcore
Maximus VII Ranger bios 3003
HyperX Predator @ 2477 MHz



http://www.3dmark.com/fs/12891722

http://www.3dmark.com/3dm/20521701

http://www.3dmark.com/3dm/20521768

http://www.3dmark.com/spy/1926333


----------



## homingmystic

If anyone has installed the MSI 1080Ti EK waterblock, did you have any trouble installing the original backplate? As I cannot get the screws to thread, they seem too short to pass through the backplate < PCB < waterblock standoffs. Any insight on this would be great! Thanks


----------



## KraxKill

Quote:


> Originally Posted by *mafia97160*
> 
> my all bench FE bios XOC 2190 MHz 1.200 mv 6107 MHz memory..
> [email protected] 4.850 GHz 1.375 vcore
> Maximus VII Ranger bios 3003
> HyperX Predator @ 2477 MHz


Great setup... good score! Great clock...

I see you're using the XOC BIOS... have you shunt modified? I suspect that you could score higher in Superposition 4K with a lower clock and lower voltage, because unless you've manually bypassed NVIDIA's TDP limits (the XOC BIOS does not actually do that), you're leaving some performance on the table by running a voltage at which it is much easier to hit the power limit. The highest Supo benches (10.8-10.9K) coming in seem to be on Tis running around 2100-2126MHz at around 1.063v. With your card clocking to 2190 you have room, and you could explore that route and probably pick up quite a bit on the SP4K score.


----------



## Streetdragon

Quote:


> Originally Posted by *homingmystic*
> 
> If anyone has installed the MSI 1080Ti EK waterblock, did you have any trouble installing the original backplate? As I cannot get the screws to thread, they seem too short to pass through the backplate < PCB < waterblock standoffs. Any insight on this would be great! Thanks


The stock backplate is not compatible with the block.

NVIDIA's boost clocking is the worst I have ever seen...... without power throttling it would be easier. Or at least artifacts or so instead of crashes.


----------



## homingmystic

Quote:


> Originally Posted by *Streetdragon*
> 
> the stock blackplate is not campatible with the block.
> 
> nvidia boost clocking is the worst i have ever seen...... without power throtteling it would be easyer. or just artefacts or so instant of carshes


It states on their news page and store front that it is compatible. Very weird that they support it, but the provided screws are not long enough.

"The factory backplate of the MSI® GeForce® GTX 1080 Ti Gaming 11G and MSI® GeForce® GTX 1080 Ti Gaming X 11G graphics cards is compatible with this water block."

Source:
https://www.ekwb.com/news/ek-releases-msi-gtx-1080-ti-tf6-full-cover-water-block/


----------



## smithsrt8

I have an interesting problem and am wondering if anyone else has dealt with this. I am running SLI with an Aorus (non-extreme) and an FTW3, and they are different widths. My Aorus is water cooled and I am waiting for the FTW3 block to be released (it ships next week). Other than using a single flex bridge, how do I hook up the two with an SLI bridge? Also, when I get the block for the FTW3, how will I run the tubing? Has anyone gone SLI with 2 different cards of different widths? If so, how did you do it?


----------



## feznz

Quote:


> Originally Posted by *KedarWolf*
> 
> For anyone interested, this Zotac Amp Extreme 384W BIOS works with our FE's
> 
> Run the powerlimit.bat as admin after you install it though.
> 
> Oh, and watch the memory offset. It's higher so to run 6210 memory for benching I only used a +608 offset.
> 
> It needs to be set lower than most BIOS's or memory will be too high and your system likely will crash.
> 
> Decent Time Spy score.
> 
> Zotac.GTX1080Ti.11264.170327.zip 156k .zip file
> 
> 
> powerlimit.zip 0k .zip file


I cannot help but wonder what your actual core clocks are. I was having a compare and we have almost identical graphics scores. Great score though; half the work is done for me, comparing a Strix to an FE.
http://www.3dmark.com/spy/1511440 - 2063MHz core and I think about 12000MHz mem
Quote:


> Originally Posted by *Streetdragon*
> 
> have installed the ek block on my gpu wioth LM()forgot photo sorry)
> 
> idletemps are saame like water but at load it jumps form 23° to 48-52° not cool-.-
> 
> or is this normal?
> 
> 
> what do you think? i have no idea if this is good. god crash at higher clock speed


Could be just the TIM you used or a poor waterblock mount.
Your ambient temps must be around 20°C to be topping 27°C in your water loop.


----------



## crazyxelite

Hi, I got the normal version. Here's a pic inside my Fractal Nano S, nearly no space. I locked the card fan at 65% with the power slider at 150; it boosts to around 1900, temps reached 86c, and bad coil whine, worse than the Palit and EVGA.


----------



## GraphicsWhore

Quote:


> Originally Posted by *crazyxelite*
> 
> Hi I got the normal version here a pic inside my fractal nano s nearly no space. I locked the card fan at 65 power slider at 150, it boost to around 1900 temps reached 86c and bad coil wine worse than palit and evga.


That's hot. How does it look on stock clocks/settings?


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> For anyone interested, this Zotac Amp Extreme 384W BIOS works with our FE's
> 
> Run the powerlimit.bat as admin after you install it though.
> 
> Oh, and watch the memory offset. It's higher so to run 6210 memory for benching I only used a +608 offset.
> 
> It needs to be set lower than most BIOS's or memory will be too high and your system likely will crash.
> 
> Decent Time Spy score.
> 
> Zotac.GTX1080Ti.11264.170327.zip 156k .zip file
> 
> 
> powerlimit.zip 0k .zip file


Is it the same problem with the XOC; a loss of about 5% from scores when running at max overclock(s)?

Quote:


> Originally Posted by *Streetdragon*
> 
> the stock blackplate is not campatible with the block.
> 
> nvidia boost clocking is the worst i have ever seen...... without power throtteling it would be easyer. or just artefacts or so instant of carshes


"_1) the stock blackplate is not campatible with the block:_ "
Would ordering the same width of screws, only longer, help?

"_2)without power throttling it would be easier:_"
Have you considered jumping on the bandwagon and doing the Shunt Mod?

Quote:


> Originally Posted by *mafia97160*
> 
> my all bench FE bios XOC 2190 MHz 1.200 mv 6107 MHz memory..
> [email protected] 4.850 GHz 1.375 vcore
> Maximus VII Ranger bios 3003
> HyperX Predator @ 2477 MHz
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/12891722
> 
> http://www.3dmark.com/3dm/20521701
> 
> http://www.3dmark.com/3dm/20521768
> 
> 
> 
> http://www.3dmark.com/spy/1926333


Shouldn't the shunt mod yield a 5% better result vs the XOC BIOS?

Quote:


> Originally Posted by *pantsoftime*
> 
> Has anyone ever tried shunt modding using conductive tape? Copper conductive tape (and maybe certain aluminum ones) seem to have the right resistance levels.


No. I've never heard of this being tried.

But it certainly sounds VERY interesting. And sounds like it would be MUCH easier to apply than Liquid Metal (and less risky).

Do you have a link to the product?

And are you planning on doing this yourself? It's certainly something I'd like to try.

Rep+ for the idea


----------



## crazyxelite

Quote:


> Originally Posted by *GraphicsWhore*
> 
> That's hot. How does it look on stock clocks/settings?


Sure is. The EVGA and Palit sit around 75c.


----------



## bmgjet

Conductive tape won't work, since you need a certain resistance to trick the shunt.
LM and conductive thermal compound are right in this range, 5-10% lower resistance than the shunt.
Conductive tape, solder, or a bit of wire all have near-zero resistance, so the card goes into fail-safe mode and locks the idle clock.

Just tried the AMP BIOS and it doesn't have that 5% performance loss like the XOC BIOS. But it has some big drawbacks for me:
You need to run the power limit bat each time you restart Windows.
Stock boost isn't stable; it goes to 2050MHz at 1.062V, which crashes my card. I need to run -12 on the core to get it stable.
It does weird stuff in games, where it will drop to 1.000v at 1569MHz, then overshoot the boost up by 24MHz and drop again, then repeat, which makes game fps not very stable.

For me the XOC BIOS seems the best, since it allows an extra 8% overclock over stock, so it ends up slightly faster.
Waiting on some LM to try the shunt mod on the stock BIOS. If I can lower that power limit reading by 8%, I have a feeling it will beat the XOC BIOS.
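
The resistance argument here can be sanity-checked with a quick parallel-resistor calculation. Both resistor values below are assumed round numbers for illustration, not measured from a real card:

```python
# Why the shunt mod fools the power sensor: the mod adds a parallel
# resistance across the stock shunt, lowering what the card senses.
# Both values below are assumed round numbers for illustration.

def parallel(r1, r2):
    """Effective resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

R_SHUNT = 0.005    # ohms, assumed stock 5 mOhm shunt
R_MOD = 0.045      # ohms, assumed liquid-metal path across it

r_eff = parallel(R_SHUNT, R_MOD)
print(f"effective shunt: {r_eff * 1000:.2f} mOhm")
print(f"card now reports {r_eff / R_SHUNT:.0%} of true power")

# A near-zero path (tape, solder, wire) collapses the sensed
# resistance toward zero, so the reading is nonsense and the card
# falls back to its fail-safe idle clock.
print(f"with tape: {parallel(R_SHUNT, 1e-7) * 1000:.5f} mOhm")
```

A 10% drop in sensed resistance means the card under-reads power by 10%, effectively raising the power limit by the same fraction; a near-short reads as roughly zero power, which matches the fail-safe behaviour described above.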


----------



## nrpeyton

Quote:


> Originally Posted by *bmgjet*
> 
> Conductive tape won't work, since you need a certain resistance to trick the shunt.
> LM and conductive thermal compound are right in the range that gives 5-10% lower resistance than the shunt itself.
> Conductive tape, solder, or a bit of wire all have near-zero resistance, so the card goes into fail-safe mode and locks to idle clocks.
> 
> Just tried the AMP BIOS, and it doesn't have that 5% performance loss like the XOC BIOS, but it has some big drawbacks for me:
> You need to run the power-limit .bat each time you restart Windows.
> Stock boost isn't stable; it goes to 2050 MHz at 1.062V, which crashes my card. I need to run -12 on the core to get it stable.
> It does weird stuff in games, where it drops to 1.000V at 1569 MHz, then overshoots the boost by 24 MHz, drops again, and repeats, which makes game FPS not very stable.
> 
> For me the XOC BIOS seems the best, since it allows an extra 8% overclock over stock, so it ends up slightly faster.
> Waiting on some LM to try the shunt mod on the stock BIOS. If I can lower the reported power by 8%, I have a feeling it will beat the XOC BIOS.


How many shunts are you planning on doing:

1, 2, or 3?


----------



## bmgjet

Two of them: the 8-pin and 6-pin PCI-E power shunts.


----------



## wsarahan

Hi guys, how are you?

I had a 1080 SLI rig, but I'm done with SLI; too many problems with games that aren't optimized for it. So I sold both cards and bought a 1080 Ti, this one:

http://www.gigabyte.us/Graphics-Card/GV-N108TGAMING-OC-11G#kf

I can see some differences when overclocking this card compared to the 1080. First, I can't reach the same 2038 MHz on the core that I had with the 1080; I get to about 2000 MHz on this one. Second, and most annoying for me (this didn't happen with the 1080 SLI rig):

If a game uses, for example, 70% of the card, the core sits at 2050 MHz, but if the game jumps to 99% card usage the core drops to 1990 MHz or so at the same temps! Why does this happen? Is it normal?

I'm using a 4770K @ 4.6 GHz and a PG278Q 2K G-Sync monitor.

Thanks for helping


----------



## bmgjet

Quote:


> Originally Posted by *wsarahan*
> 
> Hi guys, how are you?
> 
> I had a 1080 SLI rig, but I'm done with SLI; too many problems with games that aren't optimized for it. So I sold both cards and bought a 1080 Ti, this one:
> 
> http://www.gigabyte.us/Graphics-Card/GV-N108TGAMING-OC-11G#kf
> 
> I can see some differences when overclocking this card compared to the 1080. First, I can't reach the same 2038 MHz on the core that I had with the 1080; I get to about 2000 MHz on this one. Second, and most annoying for me (this didn't happen with the 1080 SLI rig):
> 
> If a game uses, for example, 70% of the card, the core sits at 2050 MHz, but if the game jumps to 99% card usage the core drops to 1990 MHz or so at the same temps! Why does this happen? Is it normal?
> 
> I'm using a 4770K @ 4.6 GHz and a PG278Q 2K G-Sync monitor.
> 
> Thanks for helping


That's normal.
It's probably power limited, so at 70% usage it boosts up unrestricted,
while at 99% usage it boosts until it hits the power limit, then has to downclock to stay under it.

The 1080 Ti is quite strange in that less voltage gets you more overclock on most BIOSes, since they hit the power limit too easily.
Flashing the XOC BIOS means you don't have to worry about the power limit, but it's about 5% slower clock-for-clock, so if you don't get more than a 5% better overclock it's wasted.


----------



## wsarahan

Quote:


> Originally Posted by *bmgjet*
> 
> That's normal.
> It's probably power limited, so at 70% usage it boosts up unrestricted,
> while at 99% usage it boosts until it hits the power limit, then has to downclock to stay under it.
> 
> The 1080 Ti is quite strange in that less voltage gets you more overclock on most BIOSes, since they hit the power limit too easily.
> Flashing the XOC BIOS means you don't have to worry about the power limit, but it's about 5% slower clock-for-clock, so if you don't get more than a 5% better overclock it's wasted.


Thanks

One more question: can I change the voltage in MSI Afterburner 4.4 Beta 11? Does it work? If so, can I add 100% more?

And which voltage-control option should I select?

Reference design

Standard MSI

Extended MSI

or

Third party?

I really thought I would get a better OC from one card like this than from my 1080 SLI rig; a little disappointed.


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> For anyone interested, this Zotac Amp Extreme 384W BIOS works with our FEs.
> 
> Run powerlimit.bat as admin after you install it, though.
> 
> Oh, and watch the memory offset. Its scale is higher, so to run 6210 MHz memory for benching I only used a +608 offset.
> 
> It needs to be set lower than on most BIOSes or the memory will clock too high and your system will likely crash.
> 
> Decent Time Spy score.
> 
> Zotac.GTX1080Ti.11264.170327.zip 156k .zip file
> 
> 
> powerlimit.zip 0k .zip file


Exact same curve with the Arctic Storm BIOS.

It annoys me to no end that it's not a BIOS recognised by 3DMark, though, and I get that validation warning.









GP102_arcticstorm.zip 156k .zip file


powerlimit.zip 0k .zip file


----------



## KedarWolf

Quote:


> Originally Posted by *wsarahan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bmgjet*
> 
> Thats normal.
> Its probably power limited so at 70% usage it boost up unlimited.
> Where at 99% usage it boosts until it hits powerlimit then has to downclock to maintain that.
> 
> The 1080ti is quite strange in less voltage gets you more overclocking on most bios since they hit the power limit too easily.
> Flashing the XOC bios you dont have to worry about power limit. But it is about 5% slower clock for clock so if you dont get more then 5% better overclock its wasted.
> 
> 
> 
> Thanks
> 
> One more question: can I change the voltage in MSI Afterburner 4.4 Beta 11? Does it work? If so, can I add 100% more?
> 
> And which voltage-control option should I select?
> 
> Reference design
> 
> Standard MSI
> 
> Extended MSI
> 
> or
> 
> Third party?
> 
> I really thought I would get a better OC from one card like this than from my 1080 SLI rig; a little disappointed.
Click to expand...

Third party.


----------



## KedarWolf

Can someone with a Zotac Arctic Storm Special Edition 1080 Ti save their BIOS with GPU-Z and attach it here as a .zip file?


----------



## wsarahan

Quote:


> Originally Posted by *KedarWolf*
> 
> Third party.


Thanks

But does this really help?

I'm getting about 60-65°C in BF1, everything on ultra at 2K, with my card at +120 core and +500 mem.

That gives me about 2025/2012 MHz at those temps without touching the voltage. Is this a good result for a 1080 Ti OC?

Should I try touching the vcore and going to something like +130, or am I good where I am now?

Thanks

EDIT: More tests here, and I'm 100% sure now: when the card reaches 99% usage the core clock takes a big drop, no matter the temp.

Example: 60°C at 80% usage = 2063 MHz core; 60°C at 99% usage = 2000 MHz core...

How can I fix it?


----------



## jprovido

Returned my Strix because it was heating up the inside of my case too much (and didn't overclock that well) and replaced it with an EVGA GTX 1080 Ti SC2 Hybrid. This card is a hundred times better! It overclocks to 2050 MHz without any voltage increase and runs super cool (and exhausts heat straight out of the case, too!). I didn't know hybrid cards were this good.


----------



## Dasboogieman

Quote:


> Originally Posted by *jprovido*
> 
> Returned my Strix because it was heating up the inside of my case too much (and didn't overclock that well) and replaced it with an EVGA GTX 1080 Ti SC2 Hybrid. This card is a hundred times better! It overclocks to 2050 MHz without any voltage increase and runs super cool (and exhausts heat straight out of the case, too!). I didn't know hybrid cards were this good.


Wait till you see what these cards can do with a full-coverage block and 30°C operating temps. Your VRAM OC goes through the roof, and suddenly your card is 2100 MHz stable at 1.093V.


----------



## AstroSky

Hey guys. I got a GTX 1080 Ti Hybrid edition from EVGA. I can't seem to run +120 on the core no matter what. I want to reach 2100 MHz; it runs, then crashes.

I've heard of the voltage-curve stuff. I have no clue how to use it, and even after watching videos I'm still not getting it.

People say to go to manual mode, I think, select the highest voltage you can use without crashing, and put it where you want? From my understanding, my CPU is faster than the one in the top Ryzen 1700 result on 3DMark Time Spy; the only thing keeping me from the top score is my GTX 1080 Ti versus his. I need to get this thing to 2100 MHz to reach my goal.

Would someone mind helping me?


----------



## dansi

Quote:


> Originally Posted by *AstroSky*
> 
> Hey guys. I got a GTX 1080 Ti Hybrid edition from EVGA. I can't seem to run +120 on the core no matter what. I want to reach 2100 MHz; it runs, then crashes.
> 
> I've heard of the voltage-curve stuff. I have no clue how to use it, and even after watching videos I'm still not getting it.
> 
> People say to go to manual mode, I think, select the highest voltage you can use without crashing, and put it where you want? From my understanding, my CPU is faster than the one in the top Ryzen 1700 result on 3DMark Time Spy; the only thing keeping me from the top score is my GTX 1080 Ti versus his. I need to get this thing to 2100 MHz to reach my goal.
> 
> Would someone mind helping me?


In the past, you could up the voltage, buy a GPU with oversized power delivery, and hope for the best.

With Pascal Boost 3.0, Nvidia has programmed everything at the ROM/die level. The only way to attain the maximum possible clocks from your GPU is to get temps below 35°C, IIRC.

The voltage curve only helps to smooth the transition between clock bins, i.e. you want a gradual curve; otherwise you get hit by a frametime penalty, resulting in stutters.

Say your GPU is only capable of, and has been programmed for, 2050 MHz in the 56-65°C range. Then the voltage curve is only for flat-lining the later bins (2067 MHz and up), so that the GPU won't boost to 2067 MHz at 56-65°C, preventing a CTD.
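The clock-bin behaviour described above can be put into a tiny sketch. The 13 MHz bin size and the 1569 MHz base clock are taken from numbers quoted in this thread, not from any Nvidia spec, so treat them as assumptions:

```python
BIN_MHZ = 13  # Pascal boost bin granularity, per the -13/-26/-39 MHz drops seen in-thread

def snap_to_bin(clock_mhz, base_mhz=1569):
    """Snap a target clock to the nearest boost bin relative to a base clock.
    base_mhz = 1569 is just an example value from earlier in the thread."""
    bins = round((clock_mhz - base_mhz) / BIN_MHZ)
    return base_mhz + bins * BIN_MHZ

def throttled_clock(clock_mhz, bins_dropped):
    """Each thermal/power throttle event drops the clock by whole bins."""
    return clock_mhz - bins_dropped * BIN_MHZ

print(snap_to_bin(2000))          # 1998: nearest bin to a 2000 MHz target
print(throttled_clock(2000, 3))   # 1961: three bins down
```

This is why observed clocks hop in fixed steps rather than sliding smoothly: the GPU can only sit on a bin, and temperature/power events push it down a whole bin at a time.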


----------



## AstroSky

OK, at +120, after taking off the extra 100% voltage, I get 2100 MHz. No clue why; I figured extra voltage makes things more stable .-. So now I want to "smooth" this out to make it super stable.


----------



## nrpeyton

Quote:


> Originally Posted by *bmgjet*
> 
> 2 of them, 8 and 6 pin pci-e power shunts.


Why not the bottom one?


----------



## nrpeyton

Quote:


> Originally Posted by *AstroSky*
> 
> OK, at +120, after taking off the extra 100% voltage, I get 2100 MHz. No clue why; I figured extra voltage makes things more stable .-. So now I want to "smooth" this out to make it super stable.


More voltage *will* allow you to run a higher core clock, up to a point, but from 1.075V you'll notice diminishing returns, and beyond roughly 1.2V more voltage won't help and can even cause instability (unless you can get temps down; even then, you'd need to lose 50°C just for an extra 100 MHz).

There's also an Nvidia-imposed voltage limit of 1.093V.

To get past 1.093V you need: *1)* a 1080 Ti Kingpin _(soon to be released)_, *2)* a 1080 Ti HOF, *3)* the XOC BIOS installed, or *4)* a physically modified PCB.

Out of the box at stock _(set to defaults)_, my card boosts to roughly 1872 MHz at 1.05V.

Instead, I often undervolt it to 1.0V with a 1999 MHz overclock, with complete stability.
By doing so, the card draws less power, minimising throttling. _(You could say it's smoother, as the core frequency isn't jumping around trying to stay under an Nvidia BIOS-imposed power or temp limit.)_


----------



## AstroSky

nvm, it's still not stable. I need to somehow get close. I still don't understand the voltage-curve stuff; could someone show an image?


----------



## KedarWolf

Quote:


> Originally Posted by *AstroSky*
> 
> nvm, it's still not stable. I need to somehow get close. I still don't understand the voltage-curve stuff; could someone show an image?


There's a video on page 1 on how to do the custom voltage curve part.


----------



## AstroSky

OK, 1.081V is my max voltage, as it crashes after it sits there at 1.081. What does that tell you?


----------



## AstroSky

And this is with +115 on the core and +100 on the mem (I can go higher on the mem, but I'm keeping it low to see how far I can take the core first).


----------



## bmgjet

Quote:


> Originally Posted by *AstroSky*
> 
> And this is with +115 on the core and +100 on the mem (I can go higher on the mem, but I'm keeping it low to see how far I can take the core first).


Less voltage is best, since overclock stability is tied to temps and the power limit, not voltage.
With a 120% limit (300W):

If you're over 60°C, there's no point going over 1.062V.
If you're over 50°C, there's no point going over 1.075V.
You really need to be under 40°C to use 1.093V+.

Otherwise it's just going to hit the power limit hard enough that it crashes from dropping too much voltage, if you're only using a core offset.
A custom curve will stop it crashing, but it will lose more performance than a lower clock that wasn't hitting the power limit at all.
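A rough illustration of why shaving voltage buys headroom under a fixed 300 W cap: dynamic power scales roughly with V²·f. This is a first-order approximation only, not a real GPU power model, and the reference point (1.062V at 2000 MHz) is just an example from this discussion:

```python
def rel_power(v, f, v_ref=1.062, f_ref=2000):
    """Relative dynamic power vs. a reference operating point.
    Uses the common first-order approximation P ~ V^2 * f."""
    return (v / v_ref) ** 2 * (f / f_ref)

# Running 2000 MHz at 1.093 V instead of 1.062 V costs ~6% more power
# for zero extra clock -- budget that's then gone before the 300 W cap:
print(round(rel_power(1.093, 2000), 3))   # ~1.059

# Dropping to 1.000 V at the same clock frees over 10% of the budget:
print(round(rel_power(1.000, 2000), 3))   # ~0.887
```

Since the voltage term is squared, a small voltage cut saves more power than the clock it costs, which is why undervolting often benches faster on a power-limited card.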


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *AstroSky*
> 
> nvm its still not stable. need to some how get close. i still did not understand the voltage curve stuff unless someone can show a image?
> 
> 
> 
> There's a video on page 1 on how to do s custom voltage curve part.
Click to expand...

You just want your top voltage point, 1.093V, at 2100 MHz or whatever you can get it to.


----------



## nrpeyton

Quote:


> Originally Posted by *AstroSky*
> 
> and this is with a 115 on the core and 100 on the mem (i can go higher on mem but i go lower to see how far i can take core first.








Hit "Apply" in the main window after any adjustments made in the Curve Editor for them to take effect. _(Sorry, in my screenshot I hadn't.)_

The 1.0V (1000 millivolts) point is just one example; you can do anything you want.
But it's usually a good place to start.

You can also select any point in the editor and hold Ctrl & press L to *"lock" onto a specific voltage/frequency.* A yellow line will appear when successful. Remember to still hit Apply in the main window!

You can use the up/down arrow keys on your keyboard for *precise tuning.*
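Conceptually, the lock flattens the curve: every point above the locked voltage is clamped to the locked frequency so the card never boosts past it. A rough sketch treating the curve as a list of (voltage, frequency) points; this illustrates the idea, not Afterburner's actual internals:

```python
def flatten_curve(points, lock_v, lock_f):
    """Mimic what 'locking' a point in a V/F curve editor does:
    every point at or above lock_v is capped at lock_f, so the card
    never boosts past that frequency.  points: list of (volts, MHz)."""
    return [(v, min(f, lock_f)) if v >= lock_v else (v, f)
            for v, f in points]

# Hypothetical curve points for illustration:
curve = [(0.950, 1900), (1.000, 1999), (1.050, 2050), (1.093, 2100)]
locked = flatten_curve(curve, 1.000, 1999)
print(locked)   # points at/above 1.000 V are all capped at 1999 MHz
```

The lower points stay untouched, so light loads still downvolt normally; only the top of the curve is pinned.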


----------



## buellersdayoff

Quote:


> Originally Posted by *jprovido*
> 
> Returned my strix because it was heating up inside my case too much (and doesn't overclock that well) with an EVGA GTX 1080 Ti SC2 Hybrid. this card is a hundred times better! overclocks to 2050mhz without any voltage increases and runs super cool! (exhausts heat straight out of the case too!) I didn't know hybrid cards are this good


Nice build!


----------



## Quadrider10

So does the XOC bios allow for no power limit?


----------



## Streetdragon

Yep, the power limit is no problem then. But as written above: the clock-for-clock speed is 5% lower.


----------



## AstroSky

Like this? Setting it this way causes my core clock to go to 2100-something, but the software reads it wrong lol


----------



## KedarWolf

Quote:


> Originally Posted by *AstroSky*
> 
> Like this? Setting it this way causes my core clock to go to 2100-something, but the software reads it wrong lol




Like this, but if you want max voltage, have the highest point at 1.093V (watch temps).

Then a gradual drop from the highest voltage point.


----------



## AstroSky

Is this a good score?

It says I got it running at 2133 MHz on the core. The max temp I saw was 43°C.

(updated for better viewing)


----------



## AstroSky

Huh... odd... seems I'm getting HORRIBLE scores 0-0


----------



## KedarWolf

Quote:


> Originally Posted by *AstroSky*
> 
> Is this a good score?
> 
> It says I got it running at 2133 MHz on the core. The max temp I saw was 43°C.
> 
> (updated for better viewing)


Your core is too high and the driver is crashing; frame rates will drop drastically when it does, and most of the time you don't get an error message.

This is my Heaven Extreme.




Oh wait, you're using custom settings. Choose the Extreme preset, not fullscreen.


----------



## AstroSky

Well, I see you're running at 1600x900. I'll run at that too and see if I get similar scores.


----------



## AstroSky

Alright, at your exact settings I get nearly HALF of your "max" FPS.


----------



## KedarWolf

Quote:


> Originally Posted by *AstroSky*
> 
> Well, I see you're running at 1600x900. I'll run at that too and see if I get similar scores.


Just choose the Extreme preset.


----------



## AstroSky

I know my CPU is not the problem, as it's equivalent to a 6950X now that I have it overclocked to 4.02 GHz with 3600 MHz DDR4 running at full speed.


----------



## KedarWolf

Quote:


> Originally Posted by *AstroSky*
> 
> Alright, at your exact settings I get nearly HALF of your "max" FPS.


Reboot; the driver crashed. Then, afterwards, lower the core to 2088 MHz or under at 1.093V. Frame rates will be messed up after a driver crash until you reboot. Here are my Nvidia settings.


----------



## Quadrider10

Quote:


> Originally Posted by *Quadrider10*
> 
> So does the XOC bios allow for no power limit?


So you end up losing performance...? That sucks.


----------



## AstroSky

It's better now, still not where you are though lol .-. Score is 4661 and the core is 2075 MHz.


----------



## KedarWolf

Quote:


> Originally Posted by *AstroSky*
> 
> It's better now, still not where you are though lol .-. Score is 4661 and the core is 2075 MHz.


My GPU memory is OC'd too; you might need that.


----------



## KedarWolf

Quote:


> Originally Posted by *AstroSky*
> 
> It's better now, still not where you are though lol .-. Score is 4661 and the core is 2075 MHz.


Your memory settings might be too high. Heaven does NOT work well if your memory is maxed out.

This is me at 2062 MHz core, 1.075V, and 6156 MHz (+650) memory. It did better than at +667 memory.



And if your driver crashed, do a DDU uninstall and a clean reinstall of the drivers.

I used the 378.78 Nvidia drivers, as they are great for benching with Heaven.


----------



## homingmystic

Quote:


> Originally Posted by *nrpeyton*
> 
> Is it the same problem with the XOC; a loss of about 5% from scores when running at max overclock(s)?
> 
> "_1) the stock backplate is not compatible with the block:_ "
> Would ordering the same width of screws, only longer, help?
> 
> "_2)without power throttling it would be easier:_"
> Have you considered jumping on the bandwagon and doing the Shunt Mod?
> 
> Shouldn't the shunt mod yield a 5% better result vs the XOC BIOS?
> 
> No. I've never heard of this being tried.
> 
> But it certainly sounds VERY interesting. And sounds like it would be MUCH easier to apply than Liquid Metal (and less risky).
> 
> Do you have a link to the product?
> 
> And are you planning on doing this yourself? It's certainly something I'd like to try.
> 
> Rep+ for the idea


I think the width of the screw heads needs to be slightly smaller so they sit flush within the stock backplate, as the provided screws are wider than the stock ones. The length only needs an extra half millimetre to be able to thread, and if it can sit flush in the backplate it might not need to be longer.

I have attached a picture of the two screws. The reason I can't use the originals is that the screw thread isn't wide enough to thread onto the waterblock.


----------



## pantsoftime

Quote:


> Originally Posted by *nrpeyton*
> 
> No. I've never heard of this being tried.
> But it certainly sounds VERY interesting. And sounds like it would be MUCH easier to apply than Liquid Metal (and less risky).
> Do you have a link to the product?
> And are you planning on doing this yourself? It's certainly something I'd like to try.


Quote:


> Originally Posted by *bmgjet*
> 
> Conductive tape wont work since you need a certain resistance to trick the shunt.
> LM and Condictive thermal compound are right in this range for 5-10% lower resistance then the shunt is.
> Conductive tape, solder, bit of wire all have near 0 resistance so the card goes into fail safe mode and locks idle clock.


According to the table on this site, there are some conductive tapes with a Z-axis resistance (i.e., through the adhesive) in the 5 milliohm range, which is on par with the current-sense resistors in this design. It seems a properly selected tape could potentially be used. I'm not sure what the tolerance is or how well it would stick, but it might be worth trying.

I have some of the copper stuff and I might at least see how well it sticks. I haven't tried any of this, but I'm curious whether anyone else has.
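Worth noting what a tape in the same resistance range as the shunt itself would do: the parallel combination halves the effective resistance, a far bigger drop than the 5-10% quoted for LM earlier in the thread, so it might land in fail-safe territory. A quick check (5 mΩ for both values is assumed from the discussion above, not measured):

```python
def parallel(r1, r2):
    """Resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

R_SHUNT = 0.005  # 5 mOhm sense resistor (assumed typical value)
R_TAPE  = 0.005  # tape Z-axis resistance from the table mentioned above

r_eff = parallel(R_SHUNT, R_TAPE)
print(round(r_eff, 4))            # 0.0025: effective resistance halves
print(round(r_eff / R_SHUNT, 2))  # 0.5: the card would report only half
                                  # of its true power draw
```

A tape with a higher Z-axis resistance (tens of milliohms) would give a gentler drop, closer to what the LM mod achieves.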


----------



## wsarahan

Guys, can the card reaching the power limit in Heaven (2K, Ultra, for example), even with the slider at 120%, be related to the CPU?

I'm using the 1080 Ti with a 4770K @ 4.6 GHz.

Thanks

Sent from my iPhone using Tapatalk


----------



## Unnatural

Guys, are there any possible long-term issues with running a modded card (LM on the shunt resistors) at stock?
I'd love to experiment with OC in the future, but for now my priority is to finish my hard-tubing loop, so a "set-and-forget" solution would be best.


----------



## Dasboogieman

Quote:


> Originally Posted by *Unnatural*
> 
> Guys, are there any possible long-term issues with running a modded card (LM on the shunt resistors) at stock?
> I'd love to experiment with OC in the future, but for now my priority is to finish my hard-tubing loop, so a "set-and-forget" solution would be best.


The biggest danger (aside from the usual issues with LM dripping) is that the LM can eat the solder of the shunt resistor (this seems to be highly random and unpredictable), eventually forcing you to resolder it.


----------



## AstroSky

My score after doing a clean install is 4667, with a +100 core overclock running at 2076 MHz.

No matter what I do I get lower scores, and mine is watercooled and as chilly as ever.


----------



## AstroSky

Temps are not the problem. Nvidia settings are not the problem. Driver-related? Maybe; I mean, it's the newest driver that just came out, and it increased my games' smoothness, while older drivers make it worse. I can't seem to fix my benchmark scores; they just come out lower than everyone else's. The power supply is not the issue. The motherboard is not the issue. Plenty of cooling. I just don't know what it could be.


----------



## AstroSky

Alright, apparently I keep hitting my power limit.

I assume this could be the whole problem. Might the XOC BIOS be a good fix? What does it do exactly, and what issues might I face with it?


----------



## KraxKill

Quote:


> Originally Posted by *Quadrider10*
> 
> So does the XOC bios allow for no power limit?


*NO IT DOES NOT*.

People are confusing the XOC BIOS (and other BIOSes) with the hard-locked Nvidia TDP, which cannot be modified without a bypass.

A different BIOS (like the XOC) will unlock (or raise) the firmware-directed software TDP limits, but *it will not allow the card to exceed the Nvidia hard-locked TDP.*

Every Ti at ~2100 MHz and 1.074V will begin to bump against the hard-coded TDP. And you can do as you please: you can clock above 2100 MHz all you want, but the additional voltage needed will simply be eaten up by the limiter. The card will run a higher clock, but it's fool's gold; actual performance will not increase and may actually decrease.

Nothing but power mods (i.e., the shunts) will allow the card to draw more current than what's available at ~2100 MHz / 1.063V.

The best performance on every Ti, for most, will be around 2100-2126 MHz and 1.050-1.074V.

Above 2100 MHz and 1.063V, additional performance will NOT come without additional power, regardless of what GPU-Z is or isn't showing.

The XOC removes the power reporting, but it DOES NOT remove the limit. It only raises the software TDP limit up to the Nvidia-forced limit.


----------



## ZealotKi11er

Quote:


> Originally Posted by *KraxKill*
> 
> *NO IT DOES NOT*.
> 
> People are confusing the XOC BIOS (and other BIOSes) with the hard-locked Nvidia TDP, which cannot be modified without a bypass.
> 
> A different BIOS (like the XOC) will unlock (or raise) the firmware-directed software TDP limits, but *it will not allow the card to exceed the Nvidia hard-locked TDP.*
> 
> Every Ti at ~2100 MHz and 1.074V will begin to bump against the hard-coded TDP. And you can do as you please: you can clock above 2100 MHz all you want, but the additional voltage needed will simply be eaten up by the limiter. The card will run a higher clock, but it's fool's gold; actual performance will not increase and may actually decrease.
> 
> Nothing but power mods (i.e., the shunts) will allow the card to draw more current.
> 
> The best performance on every Ti, for most, will be around 2100-2126 MHz and 1.050-1.074V.
> 
> Above 2100 MHz and 1.063V, additional performance will NOT come without additional power, regardless of what GPU-Z is or isn't showing.
> 
> The XOC removes the power reporting, but it DOES NOT remove the limit. It only slightly raises the software TDP limit of a stock card.


What solution do you have for someone like me, whose card doesn't even hit 2 GHz with +143 and 120% power when playing BF1?


----------



## KraxKill

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What solution do you have for someone like me, whose card doesn't even hit 2 GHz with +143 and 120% power when playing BF1?


What does your card hit, and at what voltage?


----------



## AstroSky

I just can't seem to add even +100 on the core without issues.


----------



## cluster71

Heaven is probably not the best benchmark for evaluation. It's too old; it does not use multiple threads even when they're enabled in the Nvidia control panel.
On my system it runs one core at 100% and a second at 90% and doesn't care about the rest. My result is on a 7700K.


----------



## ZealotKi11er

Quote:


> Originally Posted by *KraxKill*
> 
> What does your card hit, and at what voltage?


I haven't looked at voltages, but with +13 MHz more OC, games would crash. Playing BF1 I'm hitting 2000 MHz with drops of -13, -26 and -39 MHz; the average clock is 1974 MHz.

At 100% power, which I run my card at 24/7, it's 1885/1898/1911 MHz @ 0.931-0.943V (power limited).

At 120% I'm at 1974/1987 MHz @ 1.012V (power limited).


----------



## cluster71

Yes, I saw it on my FE earlier. When you load it really heavily (Donut bench), it backs down to ~1800-1900 MHz at ~0.9V, as you describe. Is it mining 24/7?


----------



## KraxKill

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I haven't looked at voltages, but with +13 MHz more OC, games would crash. Playing BF1 I'm hitting 2000 MHz with drops of -13, -26 and -39 MHz; the average clock is 1974 MHz.
> 
> At 100% power, which I run my card at 24/7, it's 1885/1898/1911 MHz @ 0.931-0.943V (power limited).
> 
> At 120% I'm at 1974/1987 MHz @ 1.012V (power limited).


What are your temps?

In general, your best performance in terms of frame delivery will be where the clock is not jumping around much.

I think that for most people the "undervolting" approach to overclocking is best.

You want to see, for example, how low a voltage you can run at 1900 MHz. What about at 1950? Etc. Keeping the voltage low leaves more TDP room and less clock hopping.

2000 MHz at around 1.0V is a great spot. Temps have to be within reason, of course, because those will also make the clock hop.

ULTIMATELY you want to run as low a voltage as you can for any given clock, and the spot where the ratio of clock speed to voltage is most favorable is where the best performance will be found, for any card out there.

1900 @ 0.9xx
2000 @ 1.0
2100 @ 1.063

Something like that.

The goal is to find the lowest voltage you can run at any one particular clock.

I would start low at 1900, pull voltage until it's not stable, and record the performance there. Then bump the clock a bin at a time and again find the lowest voltage required.

As you do this you will almost certainly see that as the clock goes up, the voltage required starts to increase exponentially, eating up all of the TDP room your card is allowed.

Basically, you want as high a clock as you can run while minimizing power-limit events by not feeding it excessive voltage.
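The search procedure above can be sketched as a loop. `is_stable` is a stand-in for a real stress test (a Heaven loop, a game session, etc.), the 0.006 V step roughly matches one voltage increment, and the toy model at the bottom is invented purely to exercise the code:

```python
def lowest_stable_voltage(clock_mhz, is_stable, v_max=1.093, v_min=0.800,
                          step=0.006):
    """Walk the voltage down one step at a time from the cap and return the
    lowest value that still passes is_stable(clock, volts).  Returns None if
    even v_max isn't stable at that clock (i.e. the clock needs more voltage
    than the card allows)."""
    v = v_max
    last_good = None
    while v >= v_min:
        if is_stable(clock_mhz, v):
            last_good = round(v, 3)
            v -= step
        else:
            break
    return last_good

# Toy stability model just to exercise the search -- NOT real card data:
fake_card = lambda f, v: v >= 0.80 + (f - 1800) * 0.001

for f in (1900, 2000, 2100):
    print(f, lowest_stable_voltage(f, fake_card))
```

In the toy model 2100 MHz returns `None`, mirroring the point above: past a certain clock the voltage required exceeds what the card will give you, so chasing it is wasted effort.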


----------



## AstroSky

Hey, do people see my replies? Just wondering.


----------



## KraxKill

Quote:


> Originally Posted by *AstroSky*
> 
> hey do people see my replies just wondering?


Post your GPUz log (log to file option) of a bench run.


----------



## ZealotKi11er

Quote:


> Originally Posted by *KraxKill*
> 
> What are your temps?
> 
> In general, your best performance in terms of frame delivery will be where the clock is not jumping around much.
> 
> I think that for most people the "undervolting" approach to overclocking is best.
> 
> You want to see, for example, how low a voltage you can run at 1900 MHz. What about at 1950? Etc. Keeping the voltage low leaves more TDP room and less clock hopping.
> 
> 2000 MHz at around 1.0V is a great spot. Temps have to be within reason, of course, because those will also make the clock hop.
> 
> ULTIMATELY you want to run as low a voltage as you can for any given clock, and the spot where the ratio of clock speed to voltage is most favorable is where the best performance will be found, for any card out there.
> 
> 1900 @ 0.9xx
> 2000 @ 1.0
> 2100 @ 1.063
> 
> Something like that.
> 
> The goal is to find the lowest voltage you can run at any one particular clock.
> 
> I would start low at 1900, pull voltage until it's not stable, and record the performance there. Then bump the clock a bin at a time and again find the lowest voltage required.
> 
> As you do this you will almost certainly see that as the clock goes up, the voltage required starts to increase exponentially, eating up all of the TDP room your card is allowed.
> 
> Basically, you want as high a clock as you can run while minimizing power-limit events by not feeding it excessive voltage.


Temps are in the 50-60°C range. My problem is: if I'm already power limited using close to 1V for 2 GHz, how am I supposed to get anything more? Running more voltage will not work, and with less voltage it will crash.


----------



## buellersdayoff

Quote:


> Originally Posted by *AstroSky*
> 
> My score after doing a clean install is 4667, with a +100 core overclock running at 2076 MHz.
> 
> No matter what I do I get lower scores, and mine is watercooled and as chilly as ever.


What's your score in Superposition 4K? If you can get ~10000, then you're good.


----------



## KraxKill

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Temps are in the 50-60°C range. My problem is: if I'm already power limited using close to 1V for 2 GHz, how am I supposed to get anything more? Running more voltage will not work, and with less voltage it will crash.


Well, you certainly should not be power limited there. You could try a different BIOS with a higher limit. Can you do a Superposition 4K or Firestrike Ultra run and post your GPU-Z log?

Hitting the power limit at 1.0V is odd.


----------



## AstroSky

Hey, when you ask for a GPU-Z log, do you want me to run a benchmark while it logs? And where is the GPU-Z log?


----------



## AstroSky

nvm.

This is my log without an overclock, from a quick run of the Heaven benchmark:

GPU-ZSensorLognoneoverclock 42k .txt file


And this is my log with an overclock (+100 MHz on the core), power and temp limits all the way up:

GPU-ZSensorLogwithoverclockof100mhz 61k .txt file
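For anyone who wants to eyeball these logs quickly instead of scrolling a spreadsheet: a GPU-Z sensor log is just comma-separated text, so a few lines of Python can pull the peaks out. The column names below are assumptions based on typical GPU-Z logs and may differ by version:

```python
import csv
import io

def summarize_gpuz_log(text):
    """Return (peak core clock MHz, peak VDDC V) from a GPU-Z sensor log.
    Column headers ('GPU Core Clock [MHz]', 'VDDC [V]') are assumed names;
    check your own log's header row and adjust if needed."""
    reader = csv.DictReader(io.StringIO(text), skipinitialspace=True)
    clocks, volts = [], []
    for row in reader:
        # GPU-Z pads cells with spaces; strip keys and values before parsing.
        row = {k.strip(): v.strip() for k, v in row.items() if k}
        try:
            clocks.append(float(row["GPU Core Clock [MHz]"]))
            volts.append(float(row["VDDC [V]"]))
        except (KeyError, ValueError):
            continue  # skip repeated header lines or blank rows
    return max(clocks), max(volts)

# Tiny fabricated sample in the log's shape, just to demo the parser:
sample = """Date , GPU Core Clock [MHz] , VDDC [V]
2017-09-01 12:00:00 , 1961.0 , 1.0500
2017-09-01 12:00:01 , 2050.0 , 1.0620
"""
print(summarize_gpuz_log(sample))   # (2050.0, 1.062)
```

Point it at the attached .txt files and the peak clock/voltage (the numbers discussed below) fall right out.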


----------



## Essenbe

I have an EVGA FTW3, and running Heaven it is voltage limited at stock. I have asked a few friends who have 10xx cards to try it too; every one of them is voltage limited. Even at stock settings you cannot add enough voltage to stop it being voltage limited.


----------



## KraxKill

Quote:


> Originally Posted by *AstroSky*
> 
> nvm.
> 
> This is my log without an overclock, from a quick run of the Heaven benchmark:
> 
> GPU-ZSensorLognoneoverclock 42k .txt file
> 
> 
> And this is my log with an overclock (+100 MHz on the core), power and temp limits all the way up:
> 
> GPU-ZSensorLogwithoverclockof100mhz 61k .txt file


Did you even look at your own files to draw your own conclusions? I'm not trying to be a dick; I'm really trying to help you. But I'm inclined to teach you to fish rather than hand you the fish.

Take a look at both files and see what you see. Look at your core clocks during the bench run, and the voltages.

You're boosting to 1961 MHz in your first file and are at 2050 MHz in your second file (+100), and you're using 1.050 to 1.063V to get there.

Could it be that you didn't recognize that your actual core clock is a lot higher than you thought? That's why I pointed you at the log.


----------



## PimpSkyline

*Why is an ASUS ROG Strix Ti going for over $1,100? Is there something special about it?* Thanks


----------



## KedarWolf

Quote:


> Originally Posted by *PimpSkyline*
> 
> *Why is an ASUS ROG Strix Ti going for over $1,100? Is there something special about it?* Thanks


https://www.amazon.com/ROG-STRIX-GTX1080TI-O11G-GAMING-GeForce-DisplayPort-Overclocked-Graphics/dp/B06XXZBPHZ


----------



## AstroSky

No, I saw. But I'm clueless on overclocking. +100 on the core implies +100 on the core, so from that I assumed I'm going to 2050, as it says, which is what I want. Are you telling me 100 is too much, then? I'm not following.


----------



## AstroSky

Or are you saying that voltage is not enough for that clock? Would going to a higher voltage be better? It tends to just stick at that voltage for the overclock unless I mess with the voltage curve, which allows me to go higher in voltage but is not as stable, for obvious reasons. I did find that at 1.032v I can go higher, but the FPS is MUCH lower at that voltage, though it's definitely more stable.


----------



## KedarWolf

When gaming rather than benching, I find that in Diablo 3 I can absolutely max out the Nvidia control panel settings, enable G-Sync, and with it enabled get zero power limiting at 4K.


----------



## KraxKill

Quote:


> Originally Posted by *AstroSky*
> 
> no i saw. but im clueless on overclocking. 100 on the core implies 100 on the core. So from that i can assume im going to 2050 as it says. Which i want. Are you telling me 100 is to much then? Im not following.


Your card is already overclocked from the factory. You're boosting to 1950 with just your TDP, voltage and temp sliders in Afterburner.

The reality for most 1080 Tis is below.

Poor cards can do 1950 or so. You are above that.
Good cards can do 2000-2100. You are there if you bump your voltage a bit, but you're gonna get more power-limit events. (You have 1.074, 1.081 and 1.093 still available to you.)
Great cards can do 2100+ at 1.063-1.092.

A +100 offset puts your card at 2050, and your card is trying to run that at 1.063v according to your logs. It seems that for your card 1.063v is not enough for 2050. You could try to stabilize this by adding more voltage via the vcurve in Afterburner (you'll have to learn how to do this), or simply run a lower offset (+50 or +75) to get the maximum out of the 1.063v you're currently feeding it.

However, I bet that if you run about 2000-2025 or so and drop your voltage from 1.063 to 1.031 or so, you'll find better performance, because power will not be an issue. Your temps will also be better.

Good luck.


----------



## AstroSky

I GET IT NOW!! I did 1.082v at 2050 and now it's stable. I made a very smooth curve, stepping down about -10 MHz each time, except at the lower voltages (which I don't care about).

And now the FPS is almost 10 FPS higher.


----------



## AstroSky

Well, maybe I overestimated just a bit, lol; it's not that much better. In fact scores are kinda worse, even though it's stable.

I do notice the power jumping around a bit, from 78 to 105; maybe I could fix that.


----------



## KraxKill

Quote:


> Originally Posted by *AstroSky*
> 
> I GET IT NOW!! I did 1.082 at 2050 and now its stable. i made a very smooth curve at about -10 mhz each time it steps down except at lower voltages (that i dont care about )
> 
> and now the fps is almost 10 fps more


There you go...enjoy. Awesome!

I still think you should consider running whatever clock you can achieve between 1.0-1.063v, because that will dictate temperature and max power usage and give you more consistent performance. Higher voltages may give you "temporary" higher clocks in instances where the card is not actually drawing a lot of current, but when power demand is high the card will downclock anyway.

The sweet spot is surely where you can maintain the highest clock you can at the lowest voltage you can, while staying away from too much power draw.

Your card seems to want to do 2025 at about 1.04-1.063.....try that.

Make sure to log your attempts to make sure that what you're setting is what you're getting. Afterburner gets tricky especially when playing with the curve. You can always set defaults and try again if it gets weird on you.

Good luck.


----------



## ZealotKi11er

Here is my Log running SuperPosition4K

GPU-ZSensorLog.txt 64k .txt file


----------



## Coopiklaani

I did some tests on clock-voltage-power efficiency for casual mining. The conclusion, no surprise at all: efficiency goes out the window at the highest voltage. 1.000v is a very good compromise.
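To put numbers on "efficiency goes out the window", here's the kind of arithmetic involved; the voltages, hashrates and wattages below are made-up illustrative figures, not my measurements:

```python
def efficiency_khs_per_watt(hashrate_mhs, board_watts):
    """Mining efficiency: kilohashes per second, per watt drawn at the board."""
    return hashrate_mhs / board_watts * 1000

# Hypothetical operating points: (core voltage, hashrate MH/s, board watts)
points = [(0.900, 31.5, 200), (1.000, 32.5, 240), (1.093, 33.0, 300)]
for volts, mhs, watts in points:
    print(f"{volts:.3f}v: {efficiency_khs_per_watt(mhs, watts):.0f} kH/s per watt")
```

The pattern is the point: hashrate barely moves with voltage while the wattage climbs steeply, which is why the top of the voltage curve is such a bad deal for mining.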


----------



## KraxKill

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Here is my Log running SuperPosition4K
> 
> GPU-ZSensorLog.txt 64k .txt file


That looks really messed up....

Can you post a screenshot of your Afterburner settings for this? Is this stock? If so, you should uninstall Afterburner, delete the Afterburner profile folder and re-install your Nvidia driver. You should not be triggering TDP there, and you are, by a large margin and at very low voltages, so something is up.

Do the above and post your stock GPU-Z log again; something is not right there for sure.


----------



## Enzarch

Quote:


> Originally Posted by *nrpeyton*
> 
> beautiful ;-)
> 
> Do you have fans pointing at those heat sinks you added. And what temperature is your *1)* VRM & *2)* Memory? I like this idea too. And also love to play around with an open-pcb when possible.
> 
> Anyway this is the 1st time I've seen this done properly here. So definitely worth a rep+
> 
> Please enlighten me any more information you'd like to share,. I'm hungry lol.
> 
> 
> 
> 
> 
> 
> 
> 
> (I'm specifically interested in the temperature of your GDDR5X chips closest to the VRM.
> 
> While we're on the subject, 3rd question) What's your maximum stable memory overclocking (I.E. before errors begin appearing in OCCT's GPU stress test tab)?


If you look again, you can see I have an old Intel OEM fan attached, blowing on the VRM area, though this is probably unnecessary, considering stock is just the bare plate being blown through by the HSF.

Sorry, I don't have any real temps for you; I didn't want to disturb the thermal pads just to get some probes in there. I did do some preliminary tests on the bare plate using an IR temp gun, and never saw the VRM area exceed ~80C @ 1.094v with no direct airflow. As such I have little concern with ancillary temps in its current config (particularly those RAM chips you are referring to, since they get overflow from the fan).

For my card anyway, I found that OCCT was not valuable in finding max memory clocks, as it allowed much higher clocks without errors than TimeSpy or Heaven would tolerate. It is currently clocked at 6100, which may leave another ~100MHz on the table.


----------



## AstroSky

Why does my power usage seem to jump all over the place in the benchmark? Maybe I'm crashing? It doesn't say my clock went down, though.


----------



## KraxKill

Quote:


> Originally Posted by *AstroSky*
> 
> why dose my powerusage seem to jump all over the place on the benchmark? Maybe im crashing? it does not say my clock went down though


Power usage varies with the workload.


----------



## lilchronic

Quote:


> Originally Posted by *KraxKill*
> 
> That looks really messed up....
> 
> Can you post a screenshot of your Afterburner settings for this? Is this stock? If so you should uninstall afterburner, delete the afterburner profile folder and re-install your nvidia driver. You should not be triggering TDP there and you are by a large margin and at very low voltages so something is up.
> 
> Do the above and post your stock gpu z log again.. something is not right there for sure.


What looks wrong?
The card looks like it boosts to 2025MHz but is being power and temp throttled.


----------



## KraxKill

Quote:


> Originally Posted by *lilchronic*
> 
> What looks wrong?
> Card looks like it boost's to 2025Mhz but is being power and temp throttled.


He's hitting 120% TDP at 0.975v, which is odd. That's why I asked him to post his Afterburner settings, so we can see what exactly is going on.


----------



## PimpSkyline

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PimpSkyline*
> 
> *Why is a ASUS ROG Strix Ti going for over $1,100? Is there something special about it?* Thanks
> 
> 
> 
> https://www.amazon.com/ROG-STRIX-GTX1080TI-O11G-GAMING-GeForce-DisplayPort-Overclocked-Graphics/dp/B06XXZBPHZ
Click to expand...

Yeah, it's $809 on there, but on eBay there are bids for over $1,100... Why? Does it OC very well? Or is supply very limited?


----------



## lilchronic

Quote:


> Originally Posted by *KraxKill*
> 
> He's hitting 120% TDP at 0.975v which is odd. Thats why I asked him to post his afterburner settings so we can see what exactly is going on.


Not odd at all. These cards can hit TDP @ .95v depending on the load a certain application puts on them.
You can see in this vid my card throttling down to .993v to keep under the power limit, and that's with just the stock boost clock of 1911MHz and no mem OC. With a memory OC the card will throttle even more; I have seen down to .95v with a +500 mem OC.


----------



## KraxKill

Quote:


> Originally Posted by *lilchronic*
> 
> Not odd at all. These cards can hit TDP @ .95v depending on the load the certain application put's on it.
> You can see in this vid my card throttling down to .993v to keep under the power limit and that with just stock boost clock of 1911Mhz and no mem oc. With a memory oc the card will throttle even more. I have seen down to.95v with a +500 mem oc


His card spends 90% of the time below 1v...you don't find this odd?


----------



## cluster71

Quote:


> Originally Posted by *Coopiklaani*
> 
> I did some tests regarding to clock-voltage-power-efficiency for causal mining. Conclusion is, not surprised at all, the efficiency goes out of the window at the highest voltage. 1.000v is a very good compromise.


Thank you. I have not mined for several years, but I wondered if I should start again on my PC, and maybe think about something bigger, an ASIC. As a hobby it is not easy anymore.
It's all about efficiency, unless you can steal the electricity.


----------



## AstroSky

I never understand why my watercooled GPU can't seem to get scores comparable to someone who is air cooled .-.


----------



## wsarahan

Quote:


> Originally Posted by *KraxKill*
> 
> His card spends 90% of the time below 1v...you don't find this odd?


Considering our friend's power limit issue, I ran a Heaven test here with this OC, which is stable for me:



When the card reaches the power limit the voltage is somewhere around 1.013 to 1.021; when the load drops a little, to around 80%, the voltage bumps up to 1.063.

Is there any way to increase this voltage, to bypass the power limit a little when the card reaches 100%?


----------



## Coopiklaani

Quote:


> Originally Posted by *lilchronic*
> 
> Not odd at all. These cards can hit TDP @ .95v depending on the load the certain application put's on it.
> You can see in this vid my card throttling down to .993v to keep under the power limit and that with just stock boost clock of 1911Mhz and no mem oc. With a memory oc the card will throttle even more. I have seen down to.95v with a +500 mem oc


My card draws about 275W @2038MHz @1.000v @+500 mem in Timespy Scene 1 (TS stress test) on average. It can easily hit the 300W TDP limit here and there. Dropping the voltage to 0.925v and the clock to 1949MHz gives an average power draw of 235W.
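Those two measured points line up reasonably well with the usual first-order dynamic power model, P proportional to f*V^2. A sketch under that assumption (it ignores static leakage and memory power, so it underestimates a bit):

```python
def scale_power(p0_watts, f0_mhz, v0, f1_mhz, v1):
    """First-order dynamic power estimate: power scales with clock * voltage^2."""
    return p0_watts * (f1_mhz / f0_mhz) * (v1 / v0) ** 2

# Start from the measured 275W at 2038MHz / 1.000v, predict 1949MHz / 0.925v
estimate = scale_power(275, 2038, 1.000, 1949, 0.925)
print(f"predicted {estimate:.0f}W vs 235W measured")
```

The model predicts roughly 225W against the measured 235W; the gap is about the share of board power that doesn't scale with core clock and voltage.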


----------



## ZealotKi11er

Quote:


> Originally Posted by *KraxKill*
> 
> That looks really messed up....
> 
> Can you post a screenshot of your Afterburner settings for this? Is this stock? If so you should uninstall afterburner, delete the afterburner profile folder and re-install your nvidia driver. You should not be triggering TDP there and you are by a large margin and at very low voltages so something is up.
> 
> Do the above and post your stock gpu z log again.. something is not right there for sure.


Yeah, I did that: removed MSI AB and did a fresh driver install. I had a 1080 before this card. I even tried EVGA's tool and still get the same clock speeds and voltage. Could it be that the AIO pump is using too much power?


----------



## Ali3n77

I found this .....
Can anyone try it?


----------



## KraxKill

Quote:


> Originally Posted by *wsarahan*
> 
> Considering our friend power limit issue i run a heaven test here with this OC stable for me:
> 
> 
> 
> When the card reaches the power limit the voltage is something about a.1.013 to 1.021 , when the load goes down a little like 80% the voltage bump to 1.063
> 
> Is there any way to increase this voltage to bypass a little the power limit when the card reaches the 100%?


You can run a different BIOS with a higher or unlocked TDP to give you more room. Kedar has a nice thread going on it.

http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti/0_30


----------



## AstroSky

Like, I did not just go out and spend 800 dollars to get a GPU that is worse in some cases than someone's Founders Edition. Sure, mine is factory overclocked, but clock for clock, copying the same core clock and settings, I get worse scores by at least 300 points or more.


----------



## cluster71

Quote:


> Originally Posted by *wsarahan*
> 
> Considering our friend power limit issue i run a heaven test here with this OC stable for me:
> 
> 
> 
> When the card reaches the power limit the voltage is something about a.1.013 to 1.021 , when the load goes down a little like 80% the voltage bump to 1.063
> 
> Is there any way to increase this voltage to bypass a little the power limit when the card reaches the 100%?


There has been a lot of thought and testing in this thread; many of your questions have already been answered if you skim through from the beginning.


----------



## cluster71

Sorry, the wrong person


----------



## wsarahan

Quote:


> Originally Posted by *KraxKill*
> 
> You can run a different bios with a higher or unlocked tdp to give you more room. Kedar has a nice thread going on it.
> 
> http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti/0_30


Thanks

Does making an OC curve help as well? I never did it before; my curve looks like this now:



How should I set this curve?


----------



## Coopiklaani

Quote:


> Originally Posted by *Ali3n77*
> 
> I found this .....
> No1 can try ?


It doesn't and will not work with any Pascal cards. The voltage slider on Pascal cards doesn't represent a voltage offset, but a voltage cap; e.g. a 100% slider value gives you a voltage cap of 1.093v, 0% gives you 1.063v. And the max cap is locked by the GPU BIOS.
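If the slider interpolates linearly between those two endpoints (an assumption on my part; only the 0% and 100% values above are known), the mapping is simply:

```python
def pascal_voltage_cap(slider_pct):
    """Map Afterburner's voltage slider (0-100%) to the Pascal voltage cap in volts.
    Endpoints from observation: 0% -> 1.063v, 100% -> 1.093v.
    Linear interpolation in between is an assumption, not documented behavior."""
    v_min, v_max = 1.063, 1.093
    return v_min + (v_max - v_min) * slider_pct / 100

print(round(pascal_voltage_cap(0), 3))    # 1.063
print(round(pascal_voltage_cap(100), 3))  # 1.093
```

Either way the point stands: the slider only raises a ceiling, and the BIOS-locked maximum is the hard limit.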


----------



## Coopiklaani

Quote:


> Originally Posted by *AstroSky*
> 
> like i did not just go out and spend 800 dollars to get a gpu that is worse in some cases than someone who got a founders edition. Sure mine is factory overclocked but at the same time CLOCk For Clock. copying the same core and settings. i get worse scores by at least 300 points or more


It is all about the silicon lottery. An FE card is no worse than any aftermarket unbinned card; you basically pay for the cooler design.


----------



## ZealotKi11er

Quote:


> Originally Posted by *lilchronic*
> 
> Not odd at all. These cards can hit TDP @ .95v depending on the load the certain application put's on it.
> You can see in this vid my card throttling down to .993v to keep under the power limit and that with just stock boost clock of 1911Mhz and no mem oc. With a memory oc the card will throttle even more. I have seen down to.95v with a +500 mem oc


Yeah. I noticed that too. +500 in memory brought my vCore down from 1.05 to 1.0v.


----------



## KraxKill

Quote:


> Originally Posted by *AstroSky*
> 
> like i did not just go out and spend 800 dollars to get a gpu that is worse in some cases than someone who got a founders edition. Sure mine is factory overclocked but at the same time CLOCk For Clock. copying the same core and settings. i get worse scores by at least 300 points or more


300 points is within the margin introduced by your OS, your Nvidia driver settings, and your CPU and RAM speed. That makes others' scores nowhere near the "clock for clock" comparison you're making; it's apples and oranges. Your only point of reference is your system. What others achieve on a similar setup can be vastly different.

I bet the scores you refer to could be achieved with your card in the rigs that achieved them.

If it's news to you that all 1080 Tis run the same GPU... sorry.


----------



## AstroSky

Flashed the XOC BIOS. Can't edit power and temp targets anymore. FPS is worse. No clue how to fix it on this BIOS.


----------



## KraxKill

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah I did that and removed MSI AB and new driver Install. I had 1080 before this card. Tried even eVGA and still same clock speeds and voltage. Could it be that AIO pump is using too much power ?


What? Is the pump running off the GPU fan header? I always thought about trying that, but when I ran an AIO I ran the pump off the motherboard and simply did not use the GPU fan header for anything (NZXT X41 AIO controlled via CAM software). I'm on a full block now, of course.

I know the regular fan draws only a few watts, but maybe an AIO pulls quite a bit more?


----------



## Ali3n77

Quote:


> Originally Posted by *Coopiklaani*
> 
> It doesn't and will not work with any pascal cards. The voltage slider of pascal cards doesn't represent a voltage offset, but a voltage cap. e.g. 100% slider value gives you a voltage cap of 1.093v, 0% gives you 1.063v. And the max cap is locked by the GPU bios.


Sigh ok

Thnx mate


----------



## ZealotKi11er

Quote:


> Originally Posted by *KraxKill*
> 
> What? is the pump running off the GPU fan header? I always thought about trying that, but I when I ran and AIO I ran the pump off of the motherboard.


Yes. The pump + GPU fan + rad fan are all on the same header.


----------



## KraxKill

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yes. Both the Pump + GPU fan + RAD fan are on the same header.


I know the GPU fan can draw a few watts and eat into the TDP margin that way for sure, but I'm not sure how much wattage your pump is drawing in addition to the fan. It may be worth a try to run it off the mobo header.


----------



## ZealotKi11er

Quote:


> Originally Posted by *KraxKill*
> 
> I know the GPU fan can draw a few W and eat into the TDP margin that way for sure, but I'm not sure how much wattage your pump is drawing in addition to the fan. May be forth a try to run it off the mobo header.


Yeah, the AIO can't run off anything else. Based on the rating, they can draw up to 18W with two fans. Not sure if that's enough to matter. It really sucks that these GPUs are TDP limited.


----------



## KedarWolf

Quote:


> Originally Posted by *AstroSky*
> 
> flased the XOC bios. cant edit power and temp targets anymore. Fps is worse. No clue how to fix it in this bios.


The power limit and temp limit are removed; that's why you can't adjust them. If your card is under water, you may be able to offset the lower FPS with higher memory and core clocks at increased voltages on a custom curve.


----------



## nrpeyton

Quote:


> Originally Posted by *pantsoftime*
> 
> According to the table on this site there are some conductive tapes with a Z-axis resistance (i.e. through the adhesive) in the 5 milliohm range which is on par with the current sense resistors in the design. It would seem like a properly selected tape could potentially be used. I'm not sure what the tolerance is or how well it would stick but it seems like it might be worth trying.
> 
> I have some of the copper stuff and I might at least see how well it sticks. I haven't tried any of this but I'm curious if anyone else has.


I've been trying to figure out the electrical resistance of liquid metal, as something slightly lower would be perfect (but obviously not low enough to put the card into safe mode).

Couldn't find the information, as it's not advertised.


----------



## lilchronic

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah. I noticed that too. +500 in memory brought my vCore down from 1.05 to 1.0v.


Yeah, a memory overclock will draw more power, making the core voltage/core clock throttle more to stay within the TDP.


----------



## AstroSky

Other than the XOC BIOS, what's another pretty good BIOS to try?


----------



## Psykoking

Air-cooled AMP Extreme on the stock BIOS (2062 / 6023 MHz). What do you guys think?


----------



## wammos

Hey Kedar, what BIOS are you using as your daily driver? I am behind on BIOSes and wanted to see if there is something better than XOC.


----------



## AstroSky

I don't see the point in these BIOSes. Tried XOC, tried AMP Extreme. They don't seem to make a difference, tbh.


----------



## smithsrt8

Quote:


> Originally Posted by *Psykoking*
> 
> aircooled Amp Extreme on Stock Bios (2062 / 6023 MHz) - what do you guys think?


That is strong... what voltage are you stable at?


----------



## kevindd992002

Quote:


> Originally Posted by *Essenbe*
> 
> I have an EVGA FTW 3 and running Heaven it is Voltage limited at stock. I have asked a few friends who have 10xx cards to try it too. Every one of them is Voltage limited. Even at stock settings you cannot add enough voltage to make it not voltage limited.


I thought voltage limits are supposed to be the same for all 1080Ti cards? Or am I missing something?


----------



## Psykoking

Quote:


> Originally Posted by *smithsrt8*


1.081V


----------



## 12Cores

Running an MSI 1080 Ti Gaming X under water, and I cannot get the card game-stable above 2050 on the core even with a custom curve. Overclocking results below:

2038/11800 - 1.081v stable
2050/11800 - 1.093v stable (lower performance)
2063/11800 - 1.093v bench stable
2076/11800 - 1.093v stable in a few benches, not game stable
2088/11800 - 1.093v semi-stable in a few benches when all the stars align
2101/11800 - 1.093v unstable

While I was hoping for a higher overclock with this card, it's still insanely powerful. The last time a card made my jaw drop was the 7970 running at 1225/1750 at 1080p back in the day.


----------



## AstroSky

What are your scores in Time Spy?


----------



## Psykoking

Quote:


> Originally Posted by *AstroSky*
> 
> what is your scores on time spy?


Currently downloading... In the meantime I played with my settings again.


And no, I'm not sitting in a fridge or something; ambient temps are ~25°C.


----------



## ZealotKi11er

Everyone is killing my score. Do aftermarket cards have more power?


----------



## Psykoking

Quote:


> Originally Posted by *AstroSky*
> 
> what is your scores on time spy?



There you go


----------



## cluster71

The graphics score is where it should be, 10400-11000.


----------



## Psykoking

Ah yeah, since it might be interesting for other AMP Extreme owners: Alphacool will release a compatible water block for the AMP Extreme in the near future; the card has been measured and blocks are already in mass production. I thought about building a custom loop, but for now my temps are OK, although I might change the thermal pads and add some to the black plate to cool down the VRMs a bit.


----------



## ZealotKi11er

Should I bother with a BIOS flash? My 1080 Ti with a hybrid cooler is only getting 9.8K in SuperPosition 4K.


----------



## cluster71

It is a good idea to change the paste and pads. There seems to be poor quality control during assembly; many cards need them replaced.


----------



## ZealotKi11er

Quote:


> Originally Posted by *cluster71*
> 
> It is a good idea to change the paste and pads. There seems to be poor control during assembly, many cards need to be replaced.


The ones around the memory and VRM are fine, but there are some white pads that are poorly placed, like a last-minute effort.


----------



## cluster71

I had an Asus FE first and now a Zotac Extreme; both were poor and it affected the temperature a lot.


----------



## Psykoking

Quote:


> Originally Posted by *cluster71*
> 
> It is a good idea to change the paste and pads. There seems to be poor control during assembly, many cards need to be replaced.


That's why I replaced the stock TIM with liquid metal, which gave me an 11°C drop.


----------



## cluster71

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Should I bother with BIOS flash? My 1080 Ti with hybrid cooler only getting 9.8K on SuperP 4K.


What motherboard and CPU do you have? I have a Z270 and a 7700K in this one. Many have trouble getting high scores. I had two M.2 SSDs earlier and pulled one out; this PCIe multiplexing seems restrictive, 15 PCIe lanes at the same time on Z270. I had ~9800 before, very irregular, and it has increased to a steady 10250 in Superposition since. I also had high CPU usage during copying and very high disk usage during RAM testing; I keep an eye on AIDA64's device resources.
Compare with someone who has similar conditions, or compare with another program.


----------



## ZealotKi11er

Quote:


> Originally Posted by *cluster71*
> 
> I had Asus FE first and now Zotac Extreme both were poor and affected the temperature a lot


What pads did you use as replacements?
Quote:


> Originally Posted by *cluster71*
> 
> What do you have for MB CPU? I have the Z270 7700K in this. Many have trouble getting high scores. I had 2 M.2 SSD earlier, pull out one, this pcie multiplexing seems restricting 15 pcie lines at the same time. I had ~ 9800 before and increased to 10200 since in Superposition. Keeping an eye on AIDA64..Compare with someone who has similar conditions or compare with another program


My score is simply down to a low clock. It's only getting 19xx MHz for most of the run. It's an ITX motherboard, so only the GPU is in it.


----------



## cluster71

I took what I could get hold of, which is not the worst (Phobya Ultra thermal pads, 5 W/mK).


----------



## Psykoking

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The ones around the memory and vrm are fine but there are some while pads that are poorly placed like a last minute effort.


Well, Zotac's TIM was the worst I've seen so far in terms of temps. The VRM cooling is also insufficient, since the VRMs get to about 90-92°C without manual OCing. That's why I plan to also replace the thermal pads with 14 W/mK ones from Alphacool. I tried to compare the stock pads with what I could find on the internet, and it seems they only have a conductivity of about 2.7 W/mK...


----------



## cluster71

I went with 1 mm pads on both of my cards. The originals seem to be 1.5 mm, and the spring loading does not seem to press properly. I think that's why many people have high temps.


----------



## AstroSky

Mine is only 10266 in Time Spy :/


----------



## AstroSky

So replacing the thermal paste and pads and such would help me overclock better?


----------



## ZealotKi11er

Quote:


> Originally Posted by *AstroSky*
> 
> so replacing thermal paste and pads and such would help better overclocking?


No. The only things that help are lower temps and more power.


----------



## AstroSky

damn lol


----------



## Psykoking

Quote:


> Originally Posted by *ZealotKi11er*
> 
> No. Only thing that helps is lower temps and higher power.


It depends. Since GPU Boost 3.0 uses thermal step-back to adjust itself, it "can" help. From what I've noticed, the first step back occurs at around 46-47°C core temp.
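A rough sketch of that step-back behaviour; the ~13MHz bin size and 5°C interval here are assumptions for illustration, not published numbers, and only the ~47°C first step matches what I actually observed:

```python
def boosted_clock(max_boost_mhz, core_temp_c,
                  first_step_c=47, step_interval_c=5, bin_mhz=13):
    """Illustrative GPU Boost 3.0 thermal step-back: drop one ~13MHz bin at the
    first threshold, then one more per interval above it (assumed values)."""
    if core_temp_c < first_step_c:
        return max_boost_mhz
    bins_dropped = 1 + (core_temp_c - first_step_c) // step_interval_c
    return max_boost_mhz - bins_dropped * bin_mhz

print(boosted_clock(2062, 40))  # 2062, below the first threshold
print(boosted_clock(2062, 52))  # 2036, two bins down
```

Which is why better paste and pads "can" buy you a bin or two even without more power headroom: keeping the core below each threshold keeps the bins from dropping.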


----------



## KedarWolf

Quote:


> Originally Posted by *wammos*
> 
> hey Kedar, what bios you using as your daily driver? I am behind on bios and wanted to see if there is something better than XOC.


For my daily driver for gaming I just use the XOC BIOS.

But I got an 11166 Time Spy score on the Arctic Extreme BIOS, which is pretty much unheard of for a 1080 Ti.

Over 11000 graphics score on a mild overclock.

It has a 432W power limit.

You need to run the powerlimit.bat once after installing the BIOS.

GP102_arcticstorm.zip 156k .zip file


powerlimit.zip 0k .zip file


----------



## Psykoking

Quote:


> Originally Posted by *AstroSky*
> 
> mine is only 10266 :/ in time spy


Might depend on the core/memory clock ratio. When I test my settings I use a low-resolution Furmark setting with 8xMSAA (1600x900) so I can see if it makes a difference; some combinations do yield 1-2 FPS more.


----------



## feznz

I am lost for words, but apparently this was a Newegg RMA claim


----------



## Rollergold

Quote:


> Originally Posted by *feznz*
> 
> I am lost for words but apparenty a Newegg RMA claim


Just... how the hell did that person get the GPU core off the card?


----------



## bmgjet

Quote:


> Originally Posted by *Rollergold*
> 
> 
> 
> 
> 
> 
> 
> 
> just
> 
> 
> 
> 
> 
> 
> 
> how the hell did that person get the GPU core off the card


Probably ran it without the cooler seated properly.
A few seconds like that and the whole chip pops off.


----------



## AsusFan30

Quote:


> Originally Posted by *Dasboogieman*
> 
> Just wanted to share my recent Phanteks Primo Ultimate build
> 2x480mm, 1x360, 1x 240, 1x120mm, 2 D5 Pumps in series
> 
> 
> 
> More rads, more pumps, less noise.
> The 1080ti now never breaks 30C on load.


I guess you don't care how your PC looks. That is embarrassing!


----------



## Dasboogieman

Quote:


> Originally Posted by *AsusFan30*
> 
> H
> I guess you don't care how your PC looks. That is embarrassing!


Yup, don't give no craps. People who truly care can already see past the mess; I don't need to impress bystanders. I needed it to work reliably for years without dramas, which is why I went with chemical-resistant EPDM tubing and minimal brittle materials wherever possible. Critical areas were clamped down with metal clamps, so unless the hose tears, it's not popping out.

It's actually a bit cleaner now after I've gone through and optimised the loop.


----------



## Streetdragon

Embarrassing? ***? It's a nice loop. THAT is watercooling, not all this blinking/shiny poop. You should look at the screen while gaming, not into the case!

Why is w.t.f censored? xD OK then, never mind


----------



## TheBoom

Quote:


> Originally Posted by *KedarWolf*
> 
> For my daily driver for gaming I just use the XOC BIOS.
> 
> But I got a 11166 Time Spy on the Arctic Extreme BIOS which is pretty much unheard of for a 1080 Ti.
> 
> Over 11000 graphics score on a mild overclock.
> 
> It has a 432W power limit.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You need to run the powerlimit.bat once after installing the BIOS.
> 
> GP102_arcticstorm.zip 156k .zip file
> 
> 
> powerlimit.zip 0k .zip file


The AMP Extreme BIOS that you posted a few pages back, with a power limit of 384W: is it a new revision, or the stock BIOS that came with the first batch?


----------



## KedarWolf

Quote:


> Originally Posted by *TheBoom*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> For my daily driver for gaming I just use the XOC BIOS.
> 
> But I got a 11166 Time Spy on the Arctic Extreme BIOS which is pretty much unheard of for a 1080 Ti.
> 
> Over 11000 graphics score on a mild overclock.
> 
> It has a 432W power limit.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You need to run the powerlimit.bat once after installing the BIOS.
> 
> GP102_arcticstorm.zip 156k .zip file
> 
> 
> powerlimit.zip 0k .zip file
> 
> 
> 
> 
> The Amp Extreme bios that you posted a few pages back with a power limit of 384w, is it a new revision or the stock bios that came with the first batch?
Click to expand...

I believe it's a new revision. The original BIOS did not work with our FEs, and this one was just added to the TechPowerUp database.


----------



## TheBoom

Quote:


> Originally Posted by *KedarWolf*
> 
> I believe it's a new revision. Original BIOS did not work with our FE's and it was just added to the TechPowerUp database.


Thanks. My stock BIOS was buggy and fan control did not work. I may give this a try. Currently using XOC as my daily driver.


----------



## kolkoo

Quote:


> Originally Posted by *KedarWolf*
> 
> For my daily driver for gaming I just use the XOC BIOS.
> 
> But I got a 11166 Time Spy on the Arctic Extreme BIOS which is pretty much unheard of for a 1080 Ti.
> 
> Over 11000 graphics score on a mild overclock.
> 
> It has a 432W power limit.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You need to run the powerlimit.bat once after installing the BIOS.
> 
> GP102_arcticstorm.zip 156k .zip file
> 
> 
> powerlimit.zip 0k .zip file


Quote:


> Originally Posted by *KedarWolf*
> 
> I believe it's a new revision. Original BIOS did not work with our FE's and it was just added to the TechPowerUp database.


So if the Arctic Extreme gives you better performance than the XOC and has a 432W power limit, is there any reason to still use the XOC? I've only seen my power draw go up to 500W when running the OCCT GPU test at 1000 fps.


----------



## KedarWolf

Quote:


> Originally Posted by *kolkoo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> For my daily driver for gaming I just use the XOC BIOS.
> 
> But I got a 11166 Time Spy on the Arctic Extreme BIOS which is pretty much unheard of for a 1080 Ti.
> 
> Over 11000 graphics score on a mild overclock.
> 
> It has a 432W power limit.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You need to run the powerlimit.bat once after installing the BIOS.
> 
> GP102_arcticstorm.zip 156k .zip file
> 
> 
> powerlimit.zip 0k .zip file
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I believe it's a new revision. Original BIOS did not work with our FE's and it was just added to the TechPowerUp database.
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> So if the arctic extreme has better perf for you than the XOC and 432W power limit any reason to still use XOC? I've seen my power limit go up to 500W only when running OCCT GPU Test at 1000 fps
Click to expand...

In Diablo 3, my main game, the Arctic Storm BIOS power-limits and the clocks bounce a few bins up and down. I'd rather have no power limiting, with the XOC BIOS.


----------



## TheBoom

Quote:


> Originally Posted by *kolkoo*
> 
> So if the arctic extreme has better perf for you than the XOC and 432W power limit any reason to still use XOC? I've seen my power limit go up to 500W only when running OCCT GPU Test at 1000 fps


Careful mate, you're gonna fry that card with OCCT at that fps.


----------



## kolkoo

Quote:


> Originally Posted by *TheBoom*
> 
> Careful mate you gonna fry that card with OCCT at that fps.


Yeah, it was only for a short while, to test for coil whine.







No worries, the card is fine. It held steady at 43-44°C for a while and had only started to climb to 46°C when I stopped it.


----------



## kolkoo

Quote:


> Originally Posted by *KedarWolf*
> 
> In Diablo 3 Arctic Storm BIOS power limits a few bins up and down In Diablo 3, my main game. Would rather no power limiting with XOC BIOS.


Is that at 1440p or 4k?


----------



## KedarWolf

Quote:


> Originally Posted by *kolkoo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> In Diablo 3 Arctic Storm BIOS power limits a few bins up and down In Diablo 3, my main game. Would rather no power limiting with XOC BIOS.
> 
> 
> 
> Is that at 1440p or 4k?
Click to expand...

4k with control panel 3D settings maxed out and g-sync enabled.


----------



## kolkoo

Quote:


> Originally Posted by *KedarWolf*
> 
> 4k with control panel 3D settings maxed out and g-sync enabled.


I see, that makes sense then. For people at 1080p or 1440p the Arctic Storm may be worthwhile.
I'm using the XOC myself, first at 1080p and recently with a 1440p monitor, and I'm happy with the performance.


----------



## TheBoom

Just tested the Arctic Storm BIOS and I'm being power throttled at a max power draw of 353W. I already ran powerlimit.bat as admin, but I'm still getting limited about 130 watts lower?


----------



## Coopiklaani

Quote:


> Originally Posted by *AstroSky*
> 
> mine is only 10266 :/ in time spy


Total score, or graphics score?
A 10266 total score is normal, since the total is CPU-dependent.

http://www.3dmark.com/spy/1877297

FE card scores about 11000 in graphics score if you have a good sample.


----------



## TheBoom

So after testing my Amp Extreme with three BIOSes — the revised Amp Extreme BIOS, the Arctic Storm BIOS, and the XOC — I consistently get better scores with the XOC.

The Arctic Storm comes close, within 100 points in Supo 4K, but for some reason the power limit is stuck at 100% even after running the powerlimit.bat file. So there's some heavy power throttling going on, possibly affecting the score.

Think I'm gonna stick with the XOC for daily use in this case.


----------



## ZealotKi11er

Quote:


> Originally Posted by *KedarWolf*
> 
> For my daily driver for gaming I just use the XOC BIOS.
> 
> But I got a 11166 Time Spy on the Arctic Extreme BIOS which is pretty much unheard of for a 1080 Ti.
> 
> Over 11000 graphics score on a mild overclock.
> 
> It has a 432W power limit.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You need to run the powerlimit.bat once after installing the BIOS.
> 
> GP102_arcticstorm.zip 156k .zip file
> 
> 
> powerlimit.zip 0k .zip file


Does the XOC BIOS help an FE card at all?


----------



## SultanOfWalmart

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Does XOC Bios help FE card at all?


It did help hold off on hitting the power limit. But it was acting strange, so I went back to my stock BIOS; for example, while running the "nvidia-smi.exe -q -d power -l 1" command it was not able to read any of the values like it did on the stock BIOS, and the whole PC would power cycle now and again. I am actually in the process of going back and trying it again, as I'm thinking it may have been a bad flash.

Give it a shot. Worst case, you go back to stock BIOS.


----------



## ZealotKi11er

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> It did help hold off on hitting the power limit. But it was acting strange and I went back to my stock bios; for example while running the "nvidia-smi.exe -q -d power -l 1" command it was not able to read any of the values like it did on the stock bios and the whole PC should power cycle now and again. I am actually in the process of going back and trying it again as I'm thinking it may have been a bad flash.
> 
> Give it a shot. Worst case, you go back to stock BIOS.


How safe is it to flash? I've done AMD card flashes in the hundreds. Can you do a blind flash with an Nvidia GPU?


----------



## KedarWolf

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> For my daily driver for gaming I just use the XOC BIOS.
> 
> But I got a 11166 Time Spy on the Arctic Extreme BIOS which is pretty much unheard of for a 1080 Ti.
> 
> Over 11000 graphics score on a mild overclock.
> 
> It has a 432W power limit.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You need to run the powerlimit.bat once after installing the BIOS.
> 
> GP102_arcticstorm.zip 156k .zip file
> 
> 
> powerlimit.zip 0k .zip file
> 
> 
> 
> 
> Does XOC Bios help FE card at all?
Click to expand...

It helps because it stops the power limiting and keeps frame rates stable in games, etc.









The clocks are not jumping up and down; they stay at the clock you set.


----------



## SultanOfWalmart

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Does XOC Bios help FE card at all?


Quote:


> Originally Posted by *KedarWolf*
> 
> It helps because it stops power limiting and helps keep stable frame rates in games etc.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Clocks are not jumping up and down but staying stable at the clock you set it at.


Does yours respond to the "nvidia-smi.exe -q -d power -l 1" command after flashing? Where did you download it from?


----------



## KedarWolf

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Does XOC Bios help FE card at all?
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> It helps because it stops power limiting and helps keep stable frame rates in games etc.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Clocks are not jumping up and down but staying stable at the clock you set it at.
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> Does yours respond to the "nvidia-smi.exe" -q -d power -l 1" command after flashing? Where did you download it from?
Click to expand...

Yes, I believe it does respond to that command.

It's in the OP, first page, in the How To Flash A Different BIOS section — at the bottom of that forum post if you scroll down.

I think it's a Google Drive or OneDrive link, or something like that.


----------



## ZealotKi11er

How does one know if the BIOS flash has taken effect? GPU-Z still shows the same as the original BIOS.


----------



## Streetdragon

A question about the XOC BIOS from me:

When I open the MSI Afterburner voltage curve, can I pick, let's say, 1.063V for the first highest clock so it stays there and won't go to a higher voltage?

I don't want to push the voltage too high; I just want stable clocks at a fixed "low" voltage.


----------



## KedarWolf

Quote:


> Originally Posted by *ZealotKi11er*
> 
> How does one know if the BIOS flash has take effect? GPU-Z still same as the original BIOS.


When you flash, the screen may go blank. Hit the 'y' key twice and the screen will come back on.

If the CMD window says 'You need to reboot for changes to take effect.' when the screen comes back, you know the BIOS has flashed.









Even though nvflash is supposed to disable your card automatically, I know of a few people who had to disable it in Device Manager first to get the BIOS to flash.


----------



## SultanOfWalmart

Quote:


> Originally Posted by *ZealotKi11er*
> 
> How safe is to flash? I have done AMD cards flashes in 100s. Can you do blind flash with Nvidia GPU?


So here is my experience with BIOS flashing, primarily the XOC BIOS. Now, I will preface this by saying this may be just an isolated case due to my hardware/software config, or even mystical PC gremlins.

That being said, I've tried 3 different XOC BIOS' downloaded from 3 different locations:

1. XOC Bios from HWBOT on the first page of OP - flashed fine, removed power-limit, however would not respond to nvidia-smi.exe commands as the card was not fully recognized. In MSI Afterburner power/temp limit sliders grayed out.

2. XOC Bios from the "BestBIOScollection.zip" - flashed fine, removed power-limit, however would not respond to nvidia-smi.exe commands as the card was not fully recognized AND caused the PC to randomly power cycle...which is pretty dangerous when you're flashing different BIOS' on a card with a single BIOS chip. MSI Afterburner power/temp limit sliders grayed out.

3. XOC bios from TechPOWERUP database (my current bios) - flashed fine, fully recognized by nvidia-smi.exe, solid so far and MSI Afterburner temp/power sliders back to full adjustment.

This is just my experience.


----------



## TheBoom

Quote:


> Originally Posted by *KedarWolf*
> 
> When you flash the screen may go blank. You hit the 'y' key twice, screen will come back on.
> 
> If it says, 'You need to reboot for changes to take effect.' in the CMD window when screen comes back on you know BIOS has flashed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Even though nvflash is supposed to disable your card automatically I know of a few people that had to disable it in device manager first to get the BIOS to flash.


If you use the regular nvflash then it shouldn't disable the gpu. The auto flash tool will do it though.


----------



## ZealotKi11er

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> So here is my experience with BIOS flashing, primarily the XOS bios. Now, I will preface this by saying this may be just an isolated case due to hardware/software config or even mystical PC gremlins.
> 
> That being said, I've tried 3 different XOC BIOS' downloaded from 3 different locations:
> 
> 1. XOC Bios from HWBOT on the first page of OP - flashed fine, removed power-limit, however would not respond to nvidia-smi.exe commands as the card was not fully recognized. In MSI Afterburner power/temp limit sliders grayed out.
> 
> 2. XOC Bios from the "BestBIOScollection.zip" - flashed fine, removed power-limit, however would not respond to nvidia-smi.exe commands as the card was not fully recognized AND caused the PC to randomly power cycle...which is pretty dangerous when you're flashing different BIOS' on a card with a single BIOS chip. MSI Afterburner power/temp limit sliders grayed out.
> 
> 3. XOC bios from TechPOWERUP database (my current bios) - flashed fine, fully recognized by nvidia-smi.exe, solid so far and MSI Afterburner temp/power sliders back to full adjustment.
> 
> This is just my experience.


It removes the power limit, but HWMonitor still shows only 300W. Still can't go over 1V.


----------



## KedarWolf

Quote:


> Originally Posted by *Streetdragon*
> 
> a question for the XOC bios from me:
> 
> when i open MSI-afterburner -> Voltagecurve i can choose, lets say 1,063V to the first highest clock and it stays there and wont go to a higher voltage.
> 
> dont wanna push to high voltage etc, just wanna get stable clocks with a fixed "low" voltage


With the XOC BIOS, if you flatten the curve at 1.063V, say — a horizontal line to the right of that voltage point — it shouldn't go above that voltage.

But with Boost 3.0, sometimes the card still does its own thing and may go a bin or two higher at some voltage points.
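The flatten-the-curve trick described above can be sketched in Python. The `flatten_curve` helper and the voltage/clock pairs below are purely illustrative, not real card data:

```python
# Sketch of Afterburner's "flatten the curve" trick: every point at or above
# the chosen voltage is clamped to the clock set at that voltage, so the card
# has no higher voltage point left to boost into.

def flatten_curve(curve, v_cap):
    """Clamp all points at voltage >= v_cap to the clock set at v_cap."""
    cap_clock = max(clk for v, clk in curve if v <= v_cap)
    return [(v, clk if v < v_cap else cap_clock) for v, clk in curve]

# (voltage, clock MHz) pairs, roughly the shape of a Pascal boost table
curve = [(0.950, 1923), (1.000, 1974), (1.050, 2025), (1.063, 2050), (1.093, 2088)]

flat = flatten_curve(curve, 1.063)
print(flat)  # every point from 1.063V upward now sits at 2050 MHz
```

As KedarWolf notes, Boost 3.0 can still nudge the real card a bin above the flattened line; the sketch only shows the intended shape.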


----------



## SultanOfWalmart

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Removes power limit but with HWMonitor still only getting 300W. Still cant go over 1v.


Here is a Superposition 4K run I just did at 2050. According to HWMonitor, at its highest point it pulled 328W. Still hitting the power limit. Going to play around with the curve some more.


----------



## ZealotKi11er

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> Here is a Superpos. 4k run I just did at 2050. According to HWmonitor at its highest point, it pulled 328w. Still hitting the Pwr limit. Going to play around with the curve some more.


I installed the FTW3 BIOS. My score did go up. I am staying above 1V and 2GHz for most of the run. In MSI AB the power slider goes to 127%. The GPU-Z log says it's hitting 100-110% during the benchmark.


----------



## SultanOfWalmart

Is it letting you pull more than 300w?

Sup4k seems to really hit the power draw hard.


----------



## TheBoom

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> Is it letting you pull more than 300w?
> 
> Sup4k seems to really hit the power draw hard.


Lol, try 8K then. 420 watts is normal for 8K Supo with the XOC BIOS.


----------



## SultanOfWalmart

Quote:


> Originally Posted by *TheBoom*
> 
> Lol try 8k then. 420watts is normal for 8k supo with XOC bios.


With a shunt?


----------



## ZealotKi11er

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> Is it letting you pull more than 300w?
> 
> Sup4k seems to really hit the power draw hard.


Still only 312W, which is in line with a 127% power target. 7% is only about 17W extra. Can you link me your XOC BIOS?
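For reference, the arithmetic behind those slider numbers, assuming the FE's 250W default TDP (the figure its stock BIOS reports):

```python
# The Afterburner power slider is a percentage of the BIOS default TDP,
# so the watt figures quoted in this post work out as:

def power_limit_watts(default_tdp_w, slider_percent):
    return default_tdp_w * slider_percent / 100.0

print(power_limit_watts(250, 127))                              # 317.5 W at 127%
print(power_limit_watts(250, 127) - power_limit_watts(250, 120))  # 17.5 W for those extra 7 points
```

Which matches the observed ~312W draw and the "7% is like 17W" estimate above.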


----------



## Heruur

What's the max temp limit for these cards? It was like 115°F this weekend and my FE card was running at 89-91°C. I have the power/temp limit set to 120%/90°C in Afterburner, but the card wasn't downclocking. I'm particularly worried about component damage to the card. Fan speed was at ~85%.


----------



## SultanOfWalmart

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Still only 312W which in line with 127 power target. 7% is like 17w extra. Can you link me your XOC bios.


https://www.techpowerup.com/vgabios/190941/asus-gtx1080ti-11264-170314


----------



## KedarWolf

Quote:


> Originally Posted by *TheBoom*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> When you flash the screen may go blank. You hit the 'y' key twice, screen will come back on.
> 
> If it says, 'You need to reboot for changes to take effect.' in the CMD window when screen comes back on you know BIOS has flashed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Even though nvflash is supposed to disable your card automatically I know of a few people that had to disable it in device manager first to get the BIOS to flash.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you use the regular nvflash then it shouldn't disable the gpu. The auto flash tool will do it though.
Click to expand...

The one on the first page, linked as an attachment .zip file, works without disabling the GPU.









In the 'How To Flash A Different BIOS' part.


----------



## KedarWolf

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SultanOfWalmart*
> 
> Is it letting you pull more than 300w?
> 
> Sup4k seems to really hit the power draw hard.
> 
> 
> 
> Still only 312W which in line with 127 power target. 7% is like 17w extra. Can you link me your XOC bios.
Click to expand...

http://forum.hwbot.org/showthread.php?t=169488

XOC BIOS at bottom of forum post, OneDrive link I think.









https://onedrive.live.com/?authkey=%21AJwt7bJPL03qgw0&cid=FABF1EAAEEBFA9D9&id=FABF1EAAEEBFA9D9%21154264&parId=FABF1EAAEEBFA9D9%21148675&action=locate direct link.


----------



## TheBoom

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> With a shunt?


No shunt but with an Amp Extreme.


----------



## ZealotKi11er

I do not think I have gained much power, but it seems with the ASUS and FTW3 BIOSes I can OC +50MHz higher. Not sure what the deal is; still getting the same volts as before. Going to try that BIOS again — the first time, I did not press 'y' twice.


----------



## DerComissar

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *AsusFan30*
> 
> H
> I guess you don't care how your PC looks. That is embarrassing!
> 
> 
> 
> Yup don't give no craps. People that truly care already can see past the mess, I don't need to impress bystanders. I needed it reliably work for years without dramas which is why I went with chemical resistant EPDM tubing and minimal brittle materials wherever possible. Critical areas were metallic clamped down so unless the hose tears, its not popping out.
> 
> Its actually a bit cleaner now after I've gone and optimised the loop.
Click to expand...

That's a beautiful loop!

And an excellent build.
Too much emphasis from some here on having everything "perfect" looking.
Having a good, functional watercooling setup is what counts.

And that certainly is.


----------



## KraxKill

Quote:


> Originally Posted by *DerComissar*
> 
> That's a beautiful loop!
> 
> And an excellent build.
> Too much emphasis from some here on having everything "perfect" looking.
> Having a good, functional watercooling setup is what counts.
> 
> And that certainly is.


That's a fine setup he's got... people should keep the aesthetics opinions to themselves. Most are all show, no go.


----------



## Psykoking

OK, so it's totally about the core/memory ratio... again on the stock BIOS (2062/6039). The best scoring combinations will pull the most power... I've tested this several times now using different settings.


----------



## KraxKill

Quote:


> Originally Posted by *Psykoking*
> 
> 
> ok so it's totally about core/memory ratio... again on stock bios (2062/6039). Best scoring combinations will pull the most power.... I've tested this serveral times now using different settings


Exactly. It's core, memory, and the available TDP for both of them.

If your core is eating all your available TDP during a bench, there is none left for the memory; and if you go balls to the wall with your memory, you're going to rob yourself of a bin on the core, and the power limit will eat your score more than it already does.

It's all about maximizing the area under the curve. In power-intensive benches like Supo 4K, lower clocks with less power draw will sometimes actually score higher than higher clocks with constant power throttling.


----------



## cluster71

Quote:


> Originally Posted by *Streetdragon*
> 
> embarrassing? ***? its a nice loop. THAT is watercooling. not all this blinking/shiny poop. You should look on the screen while gaming and not in the case!
> 
> why is w.t.f censored xD ok than not










I had a good-looking case before, but I tweak all the time, so that wasn't a good idea. My latest PCs have been test benches instead; I've used two of them in recent years. My current mITX was intended for the TV room, but it has been moved to the basement as a test bench. I'll look for a new one. I don't see the PC because it's in the basement while I'm in the office above. No one sees the shiny parts.


----------



## ZealotKi11er

So with the XOC BIOS I am locked at 2064MHz at 1.062V. The card is pulling 371W.


----------



## cluster71

I'm testing the Arctic Storm 432W BIOS, running Superposition in the morning, but I'm not near the power limit — not even with the Zotac Amp Extreme 384W BIOS. When I push it hard I can run 2112/11800 at 1.093V. Are you close to the power limit?

Here's a Superposition run, on the Zotac 384W I think. It does not seem to load the card as hard as the donut test.


----------



## Streetdragon

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So with XCO bios I am locked 2064MHz at 1.062. Card is pulling 371W.


That's a lot of voltage there.

Other question: is 1.093V safe even with XOC? I mean, it shouldn't go crazy with the watts then, or should it?

Wattage peaked at 320W, but mostly it was around 300-310W.
2050/6003 at 1.049V.


----------



## cluster71

I do not know whether what we see is constant real power or power dynamically buffered by the capacitors, because the values in HWiNFO64 and a wall meter or test instruments differ.

To get anywhere close, feed the graphics card from a secondary power supply, measure only the dual 8-pin connectors, guess about 76W on the PCIe slot (as on the earlier Titan), and compensate for power supply efficiency. I've done it before; maybe I'll run a test on this.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Streetdragon*
> 
> thats a lot of voltage there.
> 
> other question: are 1,093V save even with xoc? i mean it shouldnt go crazy with the Watts than or?
> 
> 
> watt peaced @ 320W. but mostly it was around 300-310watts..
> 2050/6003 1,049V


I was just testing it. I am going to set up a custom curve for a 2025MHz OC at ~1V. The problem is that 2015MHz @ 1V works fine, but after 50°C it dropped to 2012MHz. I'm now trying to get 2037MHz at 1V so it drops to 2025 once the card is warm.

Also, for people who know more about these cards: does the fan on the FE cooler get its power straight from the PCIe slot, so its load isn't reported against the GPU, or is it?
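Boost 3.0 sheds clock bins as temperature thresholds are crossed, which is what the 2037-to-2025 plan above exploits. A rough sketch — the 12MHz bin size is chosen to match the numbers in the post, and the thresholds are illustrative, not the card's actual table:

```python
BIN_MHZ = 12  # approximate Pascal boost step; chosen to match 2037 -> 2025

def warm_clock(cold_clock_mhz, temp_c, thresholds=(50, 60)):
    """Drop one boost bin for each temperature threshold crossed."""
    bins_lost = sum(1 for t in thresholds if temp_c >= t)
    return cold_clock_mhz - bins_lost * BIN_MHZ

print(warm_clock(2037, 55))  # 2025 once the card passes 50 °C
print(warm_clock(2037, 40))  # 2037 while it stays cool
```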


----------



## KedarWolf

Quote:


> Originally Posted by *cluster71*
> 
> I do not know if there is constant real power we see or dynamically stored in capacitors, because the values in HWinfo64 and wall gauge or test instruments show different.


I think the nvidia-smi option is the best way, to be honest.
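Since several posts here lean on `nvidia-smi -q -d power` to check a flash, here's a small sketch of pulling the power-draw number out of that report. The `sample_output` text only mimics the report's format; on a real system you'd capture the output with `subprocess` instead:

```python
import re

# Text mimicking the layout of `nvidia-smi -q -d power` (illustrative values).
sample_output = """
    Power Readings
        Power Draw                  : 312.45 W
        Power Limit                 : 317.50 W
"""

def parse_power_draw(text):
    """Return the reported power draw in watts, or None if absent."""
    m = re.search(r"Power Draw\s*:\s*([\d.]+)\s*W", text)
    return float(m.group(1)) if m else None

print(parse_power_draw(sample_output))  # 312.45
```

On a BIOS the tool can't fully read (as SultanOfWalmart saw), the field is missing and the parser returns None.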


----------



## ZealotKi11er

Man, I am loving overclocking with an unlocked TDP — just like overclocking an AMD GPU now. The only small thing that remains is the drop of a bin with temperature. I have two curves now: 0.9V @ 1885MHz and 1.012V @ 2037MHz.


----------



## GraphicsWhore

Quote:


> Originally Posted by *Streetdragon*
> 
> embarrassing? ***? its a nice loop. THAT is watercooling. not all this blinking/shiny poop. You should look on the screen while gaming and not in the case!
> 
> why is w.t.f censored xD ok than not


I personally don't care whether people choose to have things inside organized and clean but saying you should be looking at the screen and not in the case is silly, and claiming that "THAT is watercooling" given the reasonable standards that custom loops have come to adhere to is even more absurd. Nevermind that it makes it easier to work in the case and modify things when needed, it can have an effect on functionality/efficiency, and, yeah, it looks better. Aesthetics are a significant part of custom loops and custom builds in general; that's not even debatable.

So THAT is an outlier. Again, no issues with it personally but if you saw someone spend a bunch of money on a sports car and the inside was just trashed all the time you wouldn't say, "Now THAT is a true luxury sports car. You should be looking out the windshield and not inside anyway."

I'm tempted to try that Arctic Extreme BIOS on my FE. Looks solid.


----------



## ReFFrs

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> 3. XOC bios from TechPOWERUP database (my current bios) - flashed fine, fully recognized by nvidia-smi.exe, solid so far and MSI Afterburner temp/power sliders back to full adjustment.


Quote:


> Originally Posted by *SultanOfWalmart*
> 
> https://www.techpowerup.com/vgabios/190941/asus-gtx1080ti-11264-170314


This is NOT the XOC BIOS. It's the stock BIOS of the Asus Strix 1080 Ti OC Edition.

XOC bios doesn't allow you to change power limit in MSI Afterburner because it doesn't have any limits at all.


----------



## KraxKill

Quote:


> Originally Posted by *GraphicsWhore*
> 
> I personally don't care whether people choose to have things inside organized and clean but saying you should be looking at the screen and not in the case is silly, and claiming that "THAT is watercooling" given the reasonable standards that custom loops have come to adhere to is even more absurd. Nevermind that it makes it easier to work in the case and modify things when needed, it can have an effect on functionality/efficiency, and, yeah, it looks better. Aesthetics are a significant part of custom loops and custom builds in general; that's not even debatable.
> 
> So THAT is an outlier. Again, no issues with it personally but if you saw someone spend a bunch of money on a sports car and the inside was just trashed all the time you wouldn't say, "Now THAT is a true luxury sports car. You should be looking out the windshield and not inside anyway."
> 
> I'm tempted to try that Arctic Extreme BIOS on my FE. Looks solid.


If it's faster than your car...it looks better bro...

You're all show no go...we get it.


----------



## SultanOfWalmart

Quote:


> Originally Posted by *ReFFrs*
> 
> This is NOT XOC BIOS. It's standard bios of Asus Strix 1080 Ti OC Edition.
> 
> XOC bios doesn't allow you to change power limit in MSI Afterburner because it doesn't have any limits at all.


Hmm, I was under the impression that XOC was short for the StriX OC Edition BIOS... but that would explain why I can manipulate the power limit sliders. Either way, so far I'm having better results with this BIOS than I did with the XOC from the other two sources.


----------



## ZealotKi11er

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> Hmm, I was under the impression that XOC was short for StriX OC edition bios.... But that would explain why I can manipulate the sliders for power limit. Either way, so far I'm having better results with this bios than I did with XOC from the other two sources.


XOC is much better. I tried it and it never drops because of power.

Also, I noticed that when I'm mining and open up a game, the GPU drops to 1569MHz and 150W. Does anyone know why this happens?


----------



## Heruur

Any idea if running a 1080Ti at 90C will cause long term issues?


----------



## TheBoom

Anyone tried Arctic Storm bios and locked to default power even after running the powerlimit.bat file?

Quote:


> Originally Posted by *Heruur*
> 
> Any idea if running a 1080Ti at 90C will cause long term issues?


Probably highly reduced lifespan.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Heruur*
> 
> Any idea if running a 1080Ti at 90C will cause long term issues?


Problem with that is other parts might run even hotter.


----------



## KedarWolf

Definitely want to be under 80C on air at all times, ideally 65C or under.


----------



## Heruur

Yeah, that was my concern. GDDR5X has a 95°C max temp limit, and I'm not sure how hot the VRAM gets when the GPU runs that high.


----------



## GraphicsWhore

Quote:


> Originally Posted by *KraxKill*
> 
> If it's faster than your car...it looks better bro...
> 
> You're all show no go...we get it.


My go is great but that really has nothing to do with my response.
Quote:


> Originally Posted by *Heruur*
> 
> Yeah that was my concern, GDDR5X has a 95C max temp limit, Im not sure if how hot the vram gets when the gpu runs that high.


How are you reaching 90?


----------



## KedarWolf

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KraxKill*
> 
> If it's faster than your car...it looks better bro...
> 
> You're all show no go...we get it.
> 
> 
> 
> My go is great but that really has nothing to do with my response.
Click to expand...

We ALL know if any car has like 12 chrome exhaust pipes it has to run much faster!!

On my phone, pipes autocorrected to puppies, totally different concept I think.


----------



## Heruur

Quote:


> Originally Posted by *GraphicsWhore*
> 
> How are you reaching 90?


I have the card housed in an SFF PC (Shuttle SZ170R8); normally temps are around 85-88°C with the power limit set to 120% and the temp limit set to 90°C.

https://i.redd.it/8hy22ktumnoy.jpg


----------



## GraphicsWhore

Quote:


> Originally Posted by *KedarWolf*
> 
> We ALL know if any car has like 12 chrome exhaust pipes it has to run much faster!!
> 
> On my phone, pipes autocorrected to puppies, totally different concept I think.


*Google searches 'chrome exhaust puppies'"*

Disappointed.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Heruur*
> 
> I have the card housed in a SFF PC (Shuttle SZ170R8), normally temps are around 85-88C with power limit set to 120% and temp limit set to 90C.
> 
> https://i.redd.it/8hy22ktumnoy.jpg


You do not want to run the reference cooler near 120%. Stick to 100%.


----------



## smithsrt8

Quote:


> Originally Posted by *GraphicsWhore*
> 
> *Google searches 'chrome exhaust puppies'"*
> 
> Disappointed.


404 Error...no puppies found

Don't forget, any car with stickers advertising what you "have" (or sometimes don't have) is automatically faster... even more added HP with yellow stickers...

Looks are all subjective... I would rather have my system run like Usain Bolt than like Kate Upton in heels... although I know which one I would rather look at.


----------



## KedarWolf

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> We ALL know if any car has like 12 chrome exhaust pipes it has to run much faster!!
> 
> On my phone, pipes autocorrected to puppies, totally different concept I think.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Google searches 'chrome exhaust puppies'"*
> 
> Disappointed.
Click to expand...

https://s-media-cache-ak0.pinimg.com/originals/5d/59/eb/5d59eb843755129af3a25d1be60cff10.jpg


----------



## KraxKill

Just got into water-cooling. Now need to add fancy LED's to really get all I can out of her..


----------



## GraphicsWhore

Quote:


> Originally Posted by *smithsrt8*
> 
> Looks are all subjective...I would rather have my system run like Usain Bolt than Kate Upton in heels...although I know which one I would rather look at


Well, I think there's always room for objectivity even in looks, but 100% agreed with the second part. I mean, preferably it'd be a Kate Upton in heels that runs as fast as Bolt, but maybe I'm being unrealistic...
Quote:


> Originally Posted by *KedarWolf*
> 
> https://s-media-cache-ak0.pinimg.com/originals/5d/59/eb/5d59eb843755129af3a25d1be60cff10.jpg


That's exactly what I was looking for! +rep


----------



## smithsrt8

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *KraxKill*
> 
> Just got into water-cooling. Now need to add fancy LED's to really get all I can out of her..






I know what you mean... I need to finish up my build at work... I added LEDs to my keyboard, although when I type I can't see anything because it's too bright.


----------



## KedarWolf

Quote:


> Originally Posted by *smithsrt8*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *KraxKill*
> 
> Just got into water-cooling. Now need to add fancy LED's to really get all I can out of her..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I know what you mean...I need to finish up my build at work...I have added LED's to my keyboard...although when I type I can't see anything because its too bright
Click to expand...

I tried to add LED's to my cat but she just scratched and hissed and hasn't meowed to me for a week..


----------



## Psykoking

Quote:


> Originally Posted by *KedarWolf*
> 
> I tried to add LED's to my cat but she just scratched and hissed and hasn't meowed to me for a week..


man, i wish i had a cat


----------



## chibi

Has anyone tried mining with the 1080 Ti? I see mention of mining every now and then and wouldn't mind putting mine to use


----------



## Psykoking

Satisfying and not at the same time


----------



## KedarWolf

Quote:


> Originally Posted by *Psykoking*
> 
> 
> Satisfying and not at the same time


When I got my 11166 Time Spy I was like "What the heck hacks did I use to do this?"









Arctic Storm BIOS OP. Was still only 11137 graphics score though, so CPU did a ton of it all.









http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/11340_20#post_26148238


----------



## smithsrt8

You guys and your fancy 6+ core chips... when I was single-GPU the best I managed was 9737 (10756 graphics / 6336 CPU).

I gave up and got another 1080 Ti... I figured that would solve everything.


----------



## Heruur

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You do not want to run reference cooler near 120%. Stick to 100%.


So I should sell the card and go aftermarket if I want to OC the card? If I stick to 100%, frame rate and frame times are all over the place.


----------



## smithsrt8

Quote:


> Originally Posted by *Heruur*
> 
> So I should sell the card and go aftermarket if I want to OC the card? If I stick to 100%; frame rate and frame times are all over the place.


I don't see a huge problem if you OC the card as long as you are under water... otherwise, if you plan on having the thing cranked up 24/7 on air, definitely look at getting a different card or put the system on water.


----------



## Psykoking

Quote:


> Originally Posted by *smithsrt8*
> 
> You guys and your fancy 6+ core chips...when I was single gpu that best I managed was 9737 (10756 graphics/6336cpu)
> 
> I gave up and got another 1080ti...I figured that would solve everything


Quote:


> Originally Posted by *KedarWolf*
> 
> When I got my 11166 Time Spy I was like "What the heck hacks did I use to do this?"
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Arctic Storm BIOS OP. Was still only 11137 graphics score though, so CPU did a ton of it all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/11340_20#post_26148238


xD yeah I can imagine.... however I could OC those 6 cores as hard as possible and wouldn't get anywhere near your score ^^
Quote:


> Originally Posted by *smithsrt8*
> 
> You guys and your fancy 6+ core chips...when I was single gpu that best I managed was 9737 (10756 graphics/6336cpu)
> 
> I gave up and got another 1080ti...I figured that would solve everything


*cries in student*


----------



## TheBoom

Quote:


> Originally Posted by *KedarWolf*
> 
> When I got my 11166 Time Spy I was like "What the heck hacks did I use to do this?"
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Arctic Storm BIOS OP. Was still only 11137 graphics score though, so CPU did a ton of it all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/11340_20#post_26148238


Does your card work properly with the arctic storm bios? I'm stuck at ~350 watts with it.


----------



## KedarWolf

Quote:


> Originally Posted by *TheBoom*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> When I got my 11166 Time Spy I was like "What the heck hacks did I use to do this?"
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Arctic Storm BIOS OP. Was still only 11137 graphics score though, so CPU did a ton of it all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/11340_20#post_26148238
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Does your card work properly with the arctic storm bios? I'm stuck at ~350 watts with it.
Click to expand...

Did you run the powerlimit.bat as admin?

powerlimit.zip 0k .zip file


Edit: Then run Time Spy with these settings, Test 2. Monitor power draw with nvidia-smi in a command prompt.


----------



## feznz

I couldn't find voltage monitoring for my 1080 Ti in the monitoring section of MSI AB beta 10. I have ticked 'enable voltage monitoring' and the third-party option.
On my GTX 770 I am able to monitor the voltage, but not on the 1080 Ti. I quadruple-checked the list and the 1080 Ti isn't there; not sure if it is just me or if I need to reinstall AB.
Quote:


> Originally Posted by *Psykoking*
> 
> xD yeah I can imagine.... however I could OC those 6 cores as hard as possible and wouldn't get anywhere near your score ^^
> *cries in student*


*cries in two children and a wife*
Sometimes it's more about priorities and bang for buck, or what you'll get the most fun out of for your money.
I am a little jealous of the 5960X... maybe more than just a little, 100%.


----------



## KedarWolf

Quote:


> Originally Posted by *feznz*
> 
> I couldn't find voltage monitoring for my 1080 Ti in the monitoring section of MSI AB beta 10. I have ticked 'enable voltage monitoring' and the third-party option.
> On my GTX 770 I am able to monitor the voltage, but not on the 1080 Ti. I quadruple-checked the list and the 1080 Ti isn't there; not sure if it is just me or if I need to reinstall AB.
> Quote:
> 
> 
> 
> Originally Posted by *Psykoking*
> 
> xD yeah I can imagine.... however I could OC those 6 cores as hard as possible and wouldn't get anywhere near your score ^^
> *cries in student*
> 
> 
> 
> *cries in two children and a wife*
> Sometimes it's more about priorities and bang for buck, or what you'll get the most fun out of for your money.
> I am a little jealous of the 5960X... maybe more than just a little, 100%.
Click to expand...

"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power -l 1

Run that in an admin command prompt while running the benchmark, with the quotation marks at the start included.

If you want it to report every five seconds instead, change the 1 to 5.
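For anyone who wants to log those readings rather than just watch them scroll by, here is a minimal sketch (not KedarWolf's exact workflow) of pulling the wattage numbers out of the text that `nvidia-smi -q -d power` prints. The sample text is a hypothetical mock of the report layout, which can vary slightly between driver versions:

```python
import re

# Hypothetical sketch: extract "Power Draw" values (in watts) from the
# text that `nvidia-smi -q -d power` prints, so they can be logged.
def parse_power_draw(smi_output: str) -> list:
    """Return every 'Power Draw : N W' reading found, as floats."""
    return [float(w) for w in re.findall(r"Power Draw\s*:\s*([\d.]+)\s*W", smi_output)]

# Mock of the report format (an assumption; the real field layout may differ).
sample = """
    Power Readings
        Power Draw                  : 287.45 W
        Power Limit                 : 300.00 W
"""

print(parse_power_draw(sample))  # [287.45]
```

In practice you would pipe the live `nvidia-smi` output into this instead of the mock string.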


----------



## GraphicsWhore

Quote:


> Originally Posted by *KedarWolf*
> 
> When I got my 11166 Time Spy I was like "What the heck hacks did I use to do this?"
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Arctic Storm BIOS OP. Was still only 11137 graphics score though, so CPU did a ton of it all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/11340_20#post_26148238


So you said your daily driver BIOS is the XOC? Any reason you don't stay on the Arctic?

My 980Ti was flashed and it worked out quite well. Now that I know my ceiling on this current setup I wouldn't mind seeing what a custom BIOS can do.


----------



## KedarWolf

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> When I got my 11166 Time Spy I was like "What the heck hacks did I use to do this?"
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Arctic Storm BIOS OP. Was still only 11137 graphics score though, so CPU did a ton of it all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/11340_20#post_26148238
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So you said your daily driver BIOS is the XOC? Any reason you don't stay on the Arctic?
> 
> My 980Ti was flashed and it worked out quite well. Now that I know my ceiling on this current setup I wouldn't mind seeing what a custom BIOS can do.
Click to expand...

In the game I play the most Arctic BIOS jumps up and down a few bins where XOC stays stable.

Because I use G-Sync and have my frame rates capped the constant core speed is better for myself.


----------



## GraphicsWhore

Quote:


> Originally Posted by *KedarWolf*
> 
> In the game I play the most Arctic BIOS jumps up and down a few bins where XOC stays stable.
> 
> Because I use G-Sync and have my frame rates capped the constant core speed is better for myself.


Got it, thanks. I take it your use of the XOC BIOS is an endorsement. Which other BIOS versions have you tried on your FE, if any?


----------



## buellersdayoff

Quote:


> Originally Posted by *Heruur*
> 
> Any idea if running a 1080Ti at 90C will cause long term issues?


Everyone seems to be overreacting a little. From NVIDIA: "NVIDIA GPUs are designed to operate reliably up to their maximum specified operating temperature. This maximum temperature varies by GPU, but is generally in the 105C range (refer to the nvidia.com product page for individual GPU specifications)."
http://nvidia.custhelp.com/app/answers/detail/a_id/2752
Before anyone says that page is 7 years old, it would have been updated if it needed to be.
I have a reference GTX 780 in my daughter's machine, still gaming strong, and it has always been overclocked since I first used it. Imagine how many more still are. Its default temp target is 83C and it had a 3-year warranty to back it up. I personally wouldn't want it going over 90, though.
https://forums.geforce.com/default/topic/1002225/1080ti-max-temp/?offset=2


----------



## KedarWolf

Quote:


> Originally Posted by *buellersdayoff*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Heruur*
> 
> Any idea if running a 1080Ti at 90C will cause long term issues?
> 
> 
> 
> Everyone seems to be overreacting a little. From NVIDIA: "NVIDIA GPUs are designed to operate reliably up to their maximum specified operating temperature. This maximum temperature varies by GPU, but is generally in the 105C range (refer to the nvidia.com product page for individual GPU specifications)."
> http://nvidia.custhelp.com/app/answers/detail/a_id/2752
> Before anyone says that page is 7 years old, it would have been updated if it needed to be.
> I have a reference GTX 780 in my daughter's machine, still gaming strong, and it has always been overclocked since I first used it. Imagine how many more still are. Its default temp target is 83C and it had a 3-year warranty to back it up. I personally wouldn't want it going over 90, though.
> https://forums.geforce.com/default/topic/1002225/1080ti-max-temp/?offset=2
Click to expand...

It's been pretty much a given in forums since my 680s that the max operating temp you don't want to exceed for regular use is 80C on the core.

You need to take into consideration that the VRMs etc. can get much hotter than the core, which is likely why that's recommended by most in the know.


----------



## ZealotKi11er

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Got it, thanks. I take it your use of the XOC BIOS is an endorsement. Which other BIOS versions have you tried on your FE, if any?


For an FE, if you are under water or a hybrid cooler, use XOC. By far the best BIOS. You can set the card to max out at a certain voltage and it will stay there and not move. I would love for someone to do a frame-time analysis of a 1080 Ti with no TDP limit versus one that is stock. Average performance is not that different, but the consistency is. With the stock BIOS I was jumping between 1912-1978MHz at 120% power. With XOC I can go 2075MHz, but I decided to be safe and go for 2025 @ 1.012v, which is ~300W.
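As a rough sanity check on figures like that ~300W, dynamic power scales roughly with frequency times voltage squared. A hedged back-of-the-envelope sketch, anchored to the 2025MHz @ 1.012v / 300W data point quoted above (the scaling law is approximate and the 1.09v figure for 2100MHz is an assumption):

```python
# Rough sketch: estimate GPU power from the approximate P ~ f * V^2
# scaling law, anchored to the ~300 W @ 2025 MHz / 1.012 V point above.
# Real cards deviate (leakage, temperature, workload), so treat this as
# ballpark only.
def estimate_power_w(freq_mhz, volts, f_ref=2025.0, v_ref=1.012, p_ref=300.0):
    return p_ref * (freq_mhz / f_ref) * (volts / v_ref) ** 2

# Pushing to 2100 MHz at an assumed ~1.09 V lands well above 350 W,
# i.e. deep into shunt-mod / XOC territory on a reference board.
print(round(estimate_power_w(2100, 1.093)))
```

Which is why a 75MHz bump can be the difference between fitting under the stock power limit and slamming into it.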


----------



## buellersdayoff

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *buellersdayoff*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Heruur*
> 
> Any idea if running a 1080Ti at 90C will cause long term issues?
> 
> 
> 
> Everyone seems to be overreacting a little. From NVIDIA: "NVIDIA GPUs are designed to operate reliably up to their maximum specified operating temperature. This maximum temperature varies by GPU, but is generally in the 105C range (refer to the nvidia.com product page for individual GPU specifications)."
> http://nvidia.custhelp.com/app/answers/detail/a_id/2752
> Before anyone says that page is 7 years old, it would have been updated if it needed to be.
> I have a reference GTX 780 in my daughter's machine, still gaming strong, and it has always been overclocked since I first used it. Imagine how many more still are. Its default temp target is 83C and it had a 3-year warranty to back it up. I personally wouldn't want it going over 90, though.
> https://forums.geforce.com/default/topic/1002225/1080ti-max-temp/?offset=2
> 
> Click to expand...
> 
> It's been pretty much a given in forums since my 680s that the max operating temp you don't want to exceed for regular use is 80C on the core.
> 
> You need to take into consideration that the VRMs etc. can get much hotter than the core, which is likely why that's recommended by most in the know.
Click to expand...

80? How many reference cards do you think are out there in workstations with multi-GPU configurations, OEM PCs, mITX and HTPC builds, professional simulators, still running at the default 80-85? I hear what you are saying and I'm not trying to discredit you. I'm just showing facts so that Heruur isn't scared into buying anything they don't need (want, maybe)... Peace.


----------



## kevindd992002

Quote:


> Originally Posted by *kevindd992002*
> 
> I thought voltage limits are supposed to be the same for all 1080Ti cards? Or am I missing something?


BUMP! on my question.


----------



## AstroSky

So XOC would be best for gaming, since it stays at the set clock and is smoother, with no jumping around? Please link the correct XOC BIOS and I'll try it again; I was just looking at bench scores before.


----------



## PimpSkyline

*Is there a way to increase TDP? That's all I want. No XOC BIOS, no Volt mods. Just more TDP, i want a stable and firm 2088 to 2100, not a bonfire in my PC.*


----------



## KedarWolf

Quote:


> Originally Posted by *AstroSky*
> 
> So XOC would be best for gaming, since it stays at the set clock and is smoother, with no jumping around? Please link the correct XOC BIOS and I'll try it again; I was just looking at bench scores before.


http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/12120_20#post_26186011


----------



## ZealotKi11er

Quote:


> Originally Posted by *PimpSkyline*
> 
> *Is there a way to increase TDP? That's all I want. No XOC BIOS, no Volt mods. Just more TDP, i want a stable and firm 2088 to 2100, not a bonfire in my PC.*


That is what XOC does. If you know your card can do 2100, then set it to that voltage and that clock, and it will use as much power as it needs to stay there.


----------



## KraxKill

So the CPU is not a mega multi core behemoth but the GPU score feels solid.

11333 with the Ti @ 2114 - FE - Stock Bios - Shunts


http://www.3dmark.com/spy/1978913


----------



## ZealotKi11er

Quote:


> Originally Posted by *KraxKill*
> 
> So the CPU is not a mega multi core behemoth but the GPU score feels solid.
> 
> 11333 with the Ti @ 2114 - FE - Stock Bios - Shunts
> 
> 
> http://www.3dmark.com/spy/1978913


Why use shunts and not flash XOC bios?


----------



## lilchronic

XOC Bios
2063Mhz / 6210Mhz Graphics score: 11126
http://www.3dmark.com/3dm/20064138


----------



## KraxKill

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Why use shunts and not flash XOC bios?


Because the XOC BIOS is 5% slower when flashed to shunt-modded FEs, and likely others.

Speculation runs wild as to why this could be, but the FE BIOS runs best on my FE. The shunts do their job on the power limit up to Nvidia's hard-coded cap, which I've yet to figure out how to easily bypass without major surgery.


----------



## ZealotKi11er

Quote:


> Originally Posted by *KraxKill*
> 
> Because the XOC BIOS is 5% slower when flashed to shunt-modded FEs, and likely others.
> 
> Speculation runs wild as to why this could be, but the FE BIOS runs best on my FE. The shunts do their job on the power limit up to Nvidia's hard-coded cap, which I've yet to figure out how to easily bypass without major surgery.


5% seems like a lot. Do you have any data on this?


----------



## cluster71

I get the same performance with the Zotac Extreme bios 2100 as XOC 2126. If I want to clock higher I have to choose XOC so I can give the card more voltage


----------



## ZealotKi11er

Quote:


> Originally Posted by *cluster71*
> 
> I get the same performance with the Zotac Extreme bios 2100 as XOC 2126. If I want to clock higher I have to choose XOC so I can give the card more voltage


Is Zotac your default BIOS? I am hearing the FE has the best BIOS of all the 1080 Tis. I did notice with the FE BIOS my OC was 50MHz lower.


----------



## cluster71

I have the 1645 variant https://www.zotac.com/se/product/graphics_card/zotac-geforce-gtx-1080-ti-amp-extreme#overview https://www.techpowerup.com/vgabios/192147/zotac-gtx1080ti-11264-170327

I do not know if there are any physical differences between the AMP Extreme and the AMP Extreme Core Edition, or if it's just the BIOS.


----------



## cluster71

Yes, the FE seems to be the best, as with previous graphics cards as well. Don't know if Nvidia has something to do with it.


----------



## cluster71

A new BIOS was released for my motherboard. I do not know if it was for this or https://www.techpowerup.com/234667/critical-flaw-in-hyperthreading-discovered-in-skylake-and-kaby-lake-cpus


----------



## KraxKill

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 5% seems like a lot. Do you have any data on this?


Go back towards page 1, it's been talked about and talked about again.

When it first came out, people did a bunch of testing; it helps non-shunt-modded cards.

On the stock BIOS, a properly shunt-modified card will simply bump up against the Nvidia hard limit (not the BIOS TDP limit, which is bypassed by the shunts), while on a non-shunt-modified card the XOC is able to unlock more TDP headroom than any other BIOS, at the expense of 5% performance, which is still faster than using the stock BIOS.

In other words, it's a small handicap that is worth it if you're not already bypassing the TDP limit via the shunts, as the XOC ultimately gains a bit more than it loses. On a shunt-modded FE, for example, the XOC BIOS is simply inferior to the stock BIOS for all but the most extreme LN2 applications, which also require additional power mods. For everything else, a shunt-modified FE on the stock BIOS will outscore the XOC.
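For readers who haven't followed the shunt-mod discussion: the card's controller infers current from the voltage drop across tiny shunt resistors, so lowering the effective shunt resistance (e.g. stacking a second resistor in parallel) makes the controller under-report power and therefore stop throttling. A hedged sketch of that arithmetic, with purely illustrative resistor values:

```python
# Sketch of why a shunt mod raises the effective power ceiling: the
# controller computes power from V_shunt / R_assumed, so if the real
# resistance is lowered, the reported power shrinks proportionally.
# Resistor values below are illustrative, not the 1080 Ti's actual parts.

def parallel(r1, r2):
    """Effective resistance of two resistors wired in parallel."""
    return (r1 * r2) / (r1 + r2)

def reported_power(actual_w, r_stock, r_modded):
    """Power the controller *thinks* is being drawn after the mod."""
    return actual_w * (r_modded / r_stock)

r_stock = 0.005                       # ohms (illustrative)
r_mod = parallel(r_stock, r_stock)    # identical shunt stacked on top
print(reported_power(400.0, r_stock, r_mod))  # 400 W real reads as 200.0 W
```

Halving the resistance halves the reading, which is why a shunt-modded card never hits the BIOS TDP limit and instead runs into Nvidia's separate hard cap.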


----------



## GraphicsWhore

Flashed to XOC, rebooted, DDU, rebooted, reinstalled drivers. Machine POSTs but monitor stays asleep. Everything looking normal in Device Manager. Re-flashed, re-installed drivers, set mobo BIOS to default, etc.

Swap from DP to HDMI and now I boot normally.

Any ideas? Found some people with this issue on some 1080Tis but none related to flashing and the fixes are like replace DP cable or unplug monitor power until you're in Windows then plug it in. Nothing could have happened to this cable in the past few mins and even if the monitor thing works I'm not going to do that every time I boot.

I play my PS4 on this Dell U3415 as well and only one HDMI port, so I'm going to want to figure this out to avoid having to swap HDMI cables. For now going to play with curves and run some benchmarks.


----------



## KraxKill

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Flashed to XOC, rebooted, DDU, rebooted, reinstalled drivers. Machine POSTs but monitor stays asleep. Everything looking normal in Device Manager. Re-flashed, re-installed drivers, set mobo BIOS to default, etc.
> 
> Swap from DP to HDMI and now I boot normally.
> 
> Any ideas? Found some people with this issue on some 1080Tis but none related to flashing and the fixes are like replace DP cable or unplug monitor power until you're in Windows then plug it in. Nothing could have happened to this cable in the past few mins and even if the monitor thing works I'm not going to do that every time I boot.
> 
> I play my PS4 on this Dell U3415 as well and only one HDMI port, so I'm going to want to figure this out to avoid having to swap HDMI cables. For now going to play with curves and run some benchmarks.


You're on the FE and probably just lost your top DisplayPort while on the XOC? Did you really not see this when you googled it?

Here you go http://bit.ly/2ub01om

First link, first post 2nd paragraph.


----------



## AstroSky

Flashed the XOC one. Scores are better, FPS is more even, and it seems stable. Thing is, I've noticed that in order to get even pretty average scores I have to overclock EXTRA: 2088 on the core to get good FPS on this card. Did a UserBenchmark run and it went from 126 percent to 132 percent. It's still practically under other people, and I even saw one dude get 156 percent with his clock at my standard boost of 1999 on the core.


----------



## AstroSky

The other dude had the same CPU and I got better CPU scores.


----------



## AstroSky

With the stock BIOS I get even worse scores.


----------



## PimpSkyline

*So....* Short of XOC, a shunt mod is in my future. Great. Wish someone could crack the Pascal VBIOS so modding the BIOS is as easy as on Maxwell and Kepler.


----------



## ZealotKi11er

Quote:


> Originally Posted by *PimpSkyline*
> 
> *So....* Short of XOC, a shunt mod is in my future. Great. Wish someone could crack the Pascal VBIOS so modding the BIOS is as easy as on Maxwell and Kepler.


It's been a year now. I do not think it's going to happen.


----------



## SultanOfWalmart

Quote:


> Originally Posted by *AstroSky*
> 
> Flashed the XOC one. Scores are better, FPS is more even, and it seems stable. Thing is, I've noticed that in order to get even pretty average scores I have to overclock EXTRA: 2088 on the core to get good FPS on this card. Did a UserBenchmark run and it went from 126 percent to 132 percent. It's still practically under other people, and I even saw one dude get 156 percent with his clock at my standard boost of 1999 on the core.


Not all cards are created equal, and the only guarantee is that a card will run at advertised speeds; anything extra is just icing.


----------



## GraphicsWhore

Quote:


> Originally Posted by *KraxKill*
> 
> You're on the FE and probably just lost your top display port while on the XOC? Did you really not see this when you google it?
> 
> Here you go http://bit.ly/2ub01om
> 
> First link, first post 2nd paragraph.


No, I only did a couple of quick searches before starting to mess with benches, but after one of my reboots I tried another cable in another DP port. It worked, so I put the original cable in that port and I'm good. Thanks.

So with the XOC BIOS I ran a few SP4K passes and this was the best, at 2088/6003, 1.062v, max temp 46C.



No more time tonight but will run 3DMark and play with AB some more tomorrow.


----------



## kingofsorrow

Hello,
Since the recent upgrade to 4K, I feel the need for a second 1080 Ti. Is there a point in buying one now, or should I just wait a couple of months for the new GPUs from Nvidia or AMD?
When is the 1080 Ti replacement expected?
Thank you.


----------



## Aganor

Quote:


> Originally Posted by *kingofsorrow*
> 
> Hello,
> Since the recent upgrade to 4K, I feel the need for a second 1080 Ti. Is there a point in buying one now, or should I just wait a couple of months for the new GPUs from Nvidia or AMD?
> When is the 1080 Ti replacement expected?
> Thank you.


The problem with SLI is the unreliability in a lot of titles and the constant micro-stutter. If that does not bother you, you can buy another, or wait until the beginning of next year for a possible mini-Volta.


----------



## kingofsorrow

Quote:


> Originally Posted by *Aganor*
> 
> The problem with SLI is the unreliability in a lot of titles and the constant micro-stutter. If that does not bother you, you can buy another, or wait until the beginning of next year for a possible mini-Volta.


Thank you. Is the microstutter 1080 Ti specific, or is that general SLI stuff?
I am not really sensitive to SLI microstutter; never really noticed it. Maybe because of the two Nvidia NF200 chips on the motherboard that control the PCIe operation.


----------



## MrTOOSHORT

EK block and back plate came in yesterday, but I work all week. So I'll install it on the Strix 1080ti this weekend. Cost me $350 CAD with duty and taxes. Unreal.


----------



## kingofsorrow

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> EK block and back plate came in yesterday, but I work all week. So I'll install it on the Strix 1080ti this weekend. Cost me $350 CAD with duty and taxes. Unreal.


Yeah, the water blocks and backplates are crazy expensive. And to think that you will have to sell it in a few months...


----------



## MrTOOSHORT

Quote:


> Originally Posted by *kingofsorrow*
> 
> Yeah, the water blocks and backplates are crazy expensive. And to think that you will have to sell it in a few months...


No, it's for my 11-year-old's computer. I will keep this card for a couple of years.


----------



## Aganor

Quote:


> Originally Posted by *kingofsorrow*
> 
> Thank you. Is the microstutter 1080ti specific or is that the general sli stuff?
> I am not really sensitive to sli microstutter. Never really noticed it. Maybe because of the two NVidia nf200 chips on the motherboard that control the pcie operation.


The stutter is due to SLI.
I thought I was OK with it, but back in the day of the dual-chip Radeon 5970 I got so tired of it that I changed to a less powerful but single-chip Nvidia card lol


----------



## kingofsorrow

Quote:


> Originally Posted by *Aganor*
> 
> The stutter is due to SLI.
> I thought i was ok with it but back in the day of the Radeon 5970 dual chip i got so tired of it that i changed to a less powerful but single chip nvidia card lol


I see. Thanks.


----------



## GraphicsWhore

Quote:


> Originally Posted by *kingofsorrow*
> 
> Hello,
> Since the recent upgrade to 4K, I feel the need for a second 1080 Ti. Is there a point in buying one now, or should I just wait a couple of months for the new GPUs from Nvidia or AMD?
> When is the 1080 Ti replacement expected?
> Thank you.


Nobody knows. Probably early next year for the consumer Volta cards.

You could get a couple of Titan Xps. Do you have $2400 burning a hole in your pocket?


----------



## ZealotKi11er

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> EK block and back plate came in yesterday, but I work all week. So I'll install it on the Strix 1080ti this weekend. Cost me $350 CAD with duty and taxes. Unreal.


My 290X block with no backplate was like 200 CAD, back before the CAD dropped. Even the AIO cost me $170. Way too expensive. My H100 GTX was only $80 CAD. They are milking the GPU guys.


----------



## gmpotu

Got a GFX card question and an unrelated question.

My fan profile is set to come on at 50% speed @ 40C, but GPU-Z at idle shows the system turning the fan ON/OFF/ON/OFF because of the temp.
Is this okay, or should I set a gradual increase instead, e.g. a ramp from 0-50% speed between 40C and 43C, rather than the cliff to 50% at 40C I have now?

My other question: does anyone know of a monitor being released this year with these specs?
IPS panel, 144Hz, 21:9, 3440x1440, G-Sync

I know Acer has one that's a VA panel.
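One way to picture the gradual ramp gmpotu is describing: a small sketch of a fan curve that climbs linearly to 50% over the 40-43C band, plus a hysteresis band so the fan doesn't cycle on/off right at the threshold. The 40C/43C/50% figures come from the post above; the floor and turn-off temperature are assumed values:

```python
# Sketch of a gradual fan curve with hysteresis. Above t_on the speed
# ramps linearly from `floor` to `ceiling` over `span` degrees; once on,
# the fan keeps spinning at `floor` until temp drops below t_off, which
# avoids the ON/OFF/ON/OFF cycling seen at a hard 40C cliff.
def target_fan_pct(temp_c, fan_on, t_on=40.0, t_off=38.0, span=3.0,
                   floor=30.0, ceiling=50.0):
    if temp_c < (t_off if fan_on else t_on):
        return 0.0
    frac = min(max((temp_c - t_on) / span, 0.0), 1.0)
    return floor + frac * (ceiling - floor)

print(target_fan_pct(39.0, fan_on=False))  # 0.0  (still off below 40C)
print(target_fan_pct(41.5, fan_on=True))   # 40.0 (halfway up the ramp)
print(target_fan_pct(39.0, fan_on=True))   # 30.0 (hysteresis: stays on)
```

Afterburner's custom fan curve can approximate the ramp directly; the hysteresis part is what stops the idle-temperature flapping GPU-Z was showing.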


----------



## cluster71

It's up to you whether you want fans that pause or whether you find that annoying. I do not know if it's possible to run the fans all the time on Asus cards, but I would guess so. I have the fans shut off for normal use, since passive cooling is enough, and running while gaming. I have the fan RPM logged in Afterburner instead of the output signal, so I know when the fans actually run.


----------



## cluster71

I have used IPS monitors for many years. I play faster on IPS than VA because it's easier to understand what's happening when the colors are better, even if the response time is higher on IPS. And G-Sync is a must now.


----------



## jediblr

Quote:


> Originally Posted by *gmpotu*
> 
> Got a GFX card questions and an unrelated question.
> 
> My fan profile is set to come on 50% speed @ 40C but my GPU-Z on idle shows the system turning the fan ON/OFF/ON/OFF because the temp.
> Is this okay or should I set a gradual increase from 0-50% between a couple of degrees like an incline from 0-50% speed from 40C-43C instead of the cliff at 50% speed at 40C I have now?
> 
> My other question anyone know of a monitor being released this year with these specs?
> IPS panel, 144hz, 21:9, 3440x1440, G-Sync
> 
> I know Acer has one that's a VA panel.


*Acer Predator X34A* is one close to your spec.


----------



## feznz

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> EK block and back plate came in yesterday, but I work all week. So I'll install it on the Strix 1080ti this weekend. Cost me $350 CAD with duty and taxes. Unreal.


I looked into the EK water block + backplate, but I just couldn't replace the pretty illuminated ROG logo with the EK one. I did think about getting longer screws and drilling out the threads in the ROG backplate... but that would make it impossible to put the original cooler back on.

Quote:


> Originally Posted by *kingofsorrow*
> 
> Hello,
> Since the recent upgrade to 4K, I feel the need for a second 1080 Ti. Is there a point in buying one now, or should I just wait a couple of months for the new GPUs from Nvidia or AMD?
> When is the 1080 Ti replacement expected?
> Thank you.


Even then, a true 4K single-card solution will IMHO need twice the power of a 1080 Ti... bite the bullet now, or go SLI or CF with the next gen.

Remember, SLI is a good solution for keeping FPS above 70+; below that is when most of the problems arise.
That said, this is my first ever non-SLI rig where a single card has been completely satisfactory, until you wind the last settings right up, like SMAA for 3K gaming.


----------



## TheBoom

Quote:


> Originally Posted by *KedarWolf*
> 
> Did you run the powerlimit.bat as admin?
> 
> powerlimit.zip 0k .zip file
> 
> 
> Edit: Then run Time Spy with these settings, Test 2. Monitor power draw with nvidia-smi in a command prompt.


Yeah I did. I double-checked by running it again; it was already at 432 watts.

Are you saying the HWiNFO power usage info is wrong? It's been accurate for every other BIOS I've used, so I don't see why it would be wrong with just the Arctic Storm BIOS.


----------



## Psykoking

XOC finally lifted me above 11k in graphics score in TimeSpy








Now i can finally sleep peacefully again


----------



## Quadrider10

I'm guessing that the XOC BIOS works on all 1080 Ti cards linked here:

http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti


----------



## AstroSky

^^^ Your score makes me feel better. Not because you get less than me, but because I see I'm not the only one getting pretty close to that in Time Spy, with or without the XOC BIOS.


----------



## AstroSky

Actually, now that I think about it, Psykoking, you got a worse CPU score than me. Your GPU is about 2 fps more than mine, but what is holding you back is the CPU. That sucks big time for me. My CPU scores 30 fps on the CPU test and I'm running a Ryzen 1700... damn


----------



## KedarWolf

Quote:


> Originally Posted by *TheBoom*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Did you run the powerlimit.bat as admin?
> 
> powerlimit.zip 0k .zip file
> 
> 
> Edit: Then run Time Spy with these settings, Test 2. Monitor power draw with nvidia-smi in a command prompt.
> 
> 
> 
> 
> 
> Yeah I did. Double checked by running it again it was already at 432 watts.
> 
> Are you saying that the hwinfo power usage info is wrong? It's accurate for every other bios I've used though don't see why it should be wrong with just the arctic storm bios.
Click to expand...

With nvidia-smi I was getting 280-350W. Not sure if it's a reporting issue or not, but this BIOS consistently power-limits less and performs better than any other BIOS I tried; well, except the XOC BIOS, which does not power-limit at all.


----------



## AstroSky

KedarWolf, about the shunt mod: would it improve my GPU scores if I went through with it?


----------



## KedarWolf

Quote:


> Originally Posted by *AstroSky*
> 
> KedarWolf shunt mod. Would it improve my gpus scores if i went through with it?


Yes, you would have the benefit of next to no power limiting with the best performing FE BIOS.


----------



## Psykoking

Quote:


> Originally Posted by *AstroSky*
> 
> actually now that i think about Psykoking you got worse cpu score than me. Your gpu is about 2 fps more than mine but what is holding you back is cpu. That sucks big time for me. My cpu scores 30 fps on the cpu score stuff and im running ryzen 1700.....damn


Well as I said before, I lack two cores to compete with you guys ^^


----------



## luan87us

Bit the bullet on the EVGA SC2 from the Newegg sale last week. Just got the card today and god, this card blew my mind, even though I expected such performance. Without any manual overclocking it was able to boost up to 1987 MHz while running the Heaven benchmark, and temps never broke 72C. Very happy.


----------



## rockn

Is the ASUS Strix still the best card for watercooling due to having a higher TDP/voltage limit compared to the FE and other brands' cards?

Or with the XOC tools out now, are they all the same?


----------



## mrsus

I got a Gigabyte 1080 Ti Gaming OC 11G that can do 2113 MHz on water. Is it compatible with the XOC BIOS?


----------



## Agent-A01

Quote:


> Originally Posted by *mrsus*
> 
> I got a Gigabyte 1080 Ti Gaming OC 11G that can do 2113 mhz on water, is it compatible with XOC bios ?


No, do not flash XOC on that card.
XOC only works with close to reference cards


----------



## Dasboogieman

Quote:


> Originally Posted by *Agent-A01*
> 
> No, do not flash XOC on that card.
> XOC only works with close to reference cards


XOC worked fine on my Aorus


----------



## MrBeer

Does the XOC BIOS work with the 1080 Ti EVGA SC Black? I am at 1212 but hitting the power target wall.


----------



## Dasboogieman

The ArcticStorm BIOS is beyond godlike for my 1080ti Aorus


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dasboogieman*
> 
> The ArcticStorm BIOS is beyond godlike for my 1080ti Aorus


What clocks is it running there?


----------



## Dasboogieman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What clocks is it running there?


Same 2113 MHz core / 12666 MHz VRAM I run for benching; this BIOS just doesn't power throttle at all. Unlike the XOC, this has no perf penalty.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dasboogieman*
> 
> Same 2113mhz 12666vram i run for benching, this BIOS just doesnt power throttle at all. Unlike the XOC, this has no perf penalty.


Do you have a link? The performance with XOC for me was not very good for the clocks.

Edit: I tried the BIOS but it does not help. TDP gets increased, but instead of using the extra power it sits at 8x%.


----------



## Dasboogieman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Do you have link? The performance with XOC for me was not very good for the clocks.
> 
> Edit: I tried the BIOS but does not help. TDP gets increased but instead of using the extra power it sits a 8X%.


 GP102_arcticstorm1.zip 156k .zip file


This was the exact one I used. It's not entirely all dandy though: I'm getting coil whine now under 3D loads. That means more current is definitely going through the inductors. That, or the BIOS is overclocking my MOSFETs.


----------



## TheBoom

Quote:


> Originally Posted by *KedarWolf*
> 
> In nvidia-tmi I was getting 280-350W. Not sure if it's a reporting issue or not but this BIOS consistently power limits less and performs better than any other BIOS I tried, well,except XOC BIOS does not power limit at all.


Yeah, isn't that well below the BIOS's actual power limit, if it's 350W max usage?

I get constant power limiting in GPU-Z and I can see the clocks jumping up and down (mostly down though) in 4K SuPo. I don't get why it does this at 350 watts even though the power limit is 432 watts.

Unless of course you are shunt modded then maybe that explains why you are not getting power limited as much.

Is there a revision of this bios as well? I saw someone post another arctic storm bios that looks different a page back.

Edit : Nevermind I meant a post back lol ^^


----------



## TheBoom

Double.


----------



## Iceman2733

For the guys running two 1080ti what is your power draw? I have read a lot of conflicting information and I need to see if some of you can get me some correct information if possible.

I recently picked up 2x 1080 Ti FTW3s

My PC is as below:
Intel i7-6850k @ 4.4GHz 1.340v
Asus Rampage V Edition 10
Corsair Dominator Platinum 16gig 3200mhz
2x EVGA 1080 FTW in SLI EK Blocked
Samsung 950 PRO M.2 NVME
Samsung 850
WD Black 1tb
EVGA 1000 P2
Dual loops with D5 pumps
13 fans

I am trying to figure out if my 1000W PSU will be able to handle all of this; I don't think it will. However, with all the wonders of mining, all the freaking PSUs above 1000W are sold out everywhere.
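As a back-of-the-envelope sanity check, summing rough per-component draws shows why a 1000W unit is workable but tight. Every wattage below is a ballpark assumption, not a measurement:

```python
# Rough PSU headroom estimate; all wattages are ballpark assumptions.
draws_w = {
    "2x 1080 Ti (stock power limit)": 2 * 250,
    "i7-6850K @ 4.4 GHz":             180,
    "motherboard + RAM":               60,
    "2x D5 pumps":                     2 * 23,
    "13 fans (~2 W each)":             13 * 2,
    "drives + misc":                   30,
}

total = sum(draws_w.values())
psu_w = 1000
print(f"Estimated sustained draw: {total} W "
      f"({100 * total / psu_w:.0f}% of a {psu_w} W PSU)")
```

Raised power limits and transient spikes eat into the remaining margin, which is why opinions split on whether 1000W "cuts it close".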


----------



## Dasboogieman

Quote:


> Originally Posted by *Iceman2733*
> 
> For the guys running two 1080ti what is your power draw? I have read a lot of conflicting information and I need to see if some of you can get me some correct information if possible.
> 
> I recently picked up 2ea 1080ti FTW3
> 
> My PC is as belowIntel i7-6850k @ 4.4ghz 1.340v
> Asus Rampage V Edition 10
> Corsair Dominator Platinum 16gig 3200mhz
> 2x EVGA 1080 FTW in SLI EK Blocked
> Samsung 950 PRO M.2 NVME
> Samsung 850
> WD Black 1tb
> EVGA 1000 P2
> Dual loops with D5 pumps
> 13 fans
> 
> I am trying to figure out if my 1000watt PSU will be able to handle all of this, I don't think it will be able too. However with all the wonders of mining all the freaking PSU above 1000 are all sold out everywhere.


For me, easily 400W spikes, with about 350W sustained under heavy 4K loading. Your system is going to cut it close with 1000W.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Iceman2733*
> 
> For the guys running two 1080ti what is your power draw? I have read a lot of conflicting information and I need to see if some of you can get me some correct information if possible.
> 
> I recently picked up 2ea 1080ti FTW3
> 
> My PC is as belowIntel i7-6850k @ 4.4ghz 1.340v
> Asus Rampage V Edition 10
> Corsair Dominator Platinum 16gig 3200mhz
> 2x EVGA 1080 FTW in SLI EK Blocked
> Samsung 950 PRO M.2 NVME
> Samsung 850
> WD Black 1tb
> EVGA 1000 P2
> Dual loops with D5 pumps
> 13 fans
> 
> I am trying to figure out if my 1000watt PSU will be able to handle all of this, I don't think it will be able too. However with all the wonders of mining all the freaking PSU above 1000 are all sold out everywhere.


You should be more than fine with 1000W.


----------



## smithsrt8

EVGA is a good power supply...

I will give you my story... I had an EVGA 750W P2. I purchased a 1000W Be Quiet Dark Power Pro 11 (from this website, from another member) and was waiting for it to come in... my second card showed up last Friday. I couldn't wait to install the card after reading online that if I lowered the power limit in MSI AB I could run both until the other power supply arrived. I have a Kill A Watt, so I tested out the theory: with both cards @ 90%, running Heaven and Prime95 at the same time (no overclocks on anything, even stock on CPU), I was pulling 640ish watts (on a 750W P2, that is cutting it really close). My power supply showed up Monday. I installed it, cranked my CPU back to 5GHz @ 1.36v, and overclocked the video cards in SLI. The max I have seen has been 820W, and that is running Heaven and Prime95 maxed. I am running 11 fans... 2 LED strips... Aquaero 6 XT... USB hub... 1 mechanical HD (2 M.2s), so that gives you an idea of power draw.


----------



## BrainSplatter

Quote:


> Originally Posted by *Iceman2733*
> 
> I am trying to figure out if my 1000watt PSU will be able to handle all of this, I don't think it will be able too.


Should be OK as long as you are not trying to overvolt the GPUs like crazy, which is a bit pointless with Pascal anyway.
I am cutting it very close with an 800W BeQuiet PSU, but I have been running a [email protected] + 1080 Ti FE SLI with it for 3 months now. Haven't had a problem yet, also not when benchmarking with a +2GHz GPU and 5.1GHz [email protected] The rest of the system is [email protected], 6x SSDs, 7 fans, 2 LED strips, and USB stuff.


----------



## Ali3n77

Max voltage on the Arctic BIOS? Thanks.


----------



## jprovido

Quote:


> Originally Posted by *Iceman2733*
> 
> For the guys running two 1080ti what is your power draw? I have read a lot of conflicting information and I need to see if some of you can get me some correct information if possible.
> 
> I recently picked up 2ea 1080ti FTW3
> 
> My PC is as belowIntel i7-6850k @ 4.4ghz 1.340v
> Asus Rampage V Edition 10
> Corsair Dominator Platinum 16gig 3200mhz
> 2x EVGA 1080 FTW in SLI EK Blocked
> Samsung 950 PRO M.2 NVME
> Samsung 850
> WD Black 1tb
> EVGA 1000 P2
> Dual loops with D5 pumps
> 13 fans
> 
> I am trying to figure out if my 1000watt PSU will be able to handle all of this, I don't think it will be able too. However with all the wonders of mining all the freaking PSU above 1000 are all sold out everywhere.


I wouldn't even worry about running two 1080 Tis on a 750W PSU, tbh.


----------



## KedarWolf

There is something wrong with my loop. I was getting 48C max on a 360 rad with my CPU and 1080 Ti.

Now I'm getting over 60C. I think when I installed the 1080 Ti EK block it wasn't cleaned properly and some gunk got into the GPU block.

It's an EK Predator 360 AIO with a prefilled Titan X Pascal block.


----------



## KCDC

Quote:


> Originally Posted by *Iceman2733*
> 
> For the guys running two 1080ti what is your power draw? I have read a lot of conflicting information and I need to see if some of you can get me some correct information if possible.
> 
> I recently picked up 2ea 1080ti FTW3
> 
> My PC is as belowIntel i7-6850k @ 4.4ghz 1.340v
> Asus Rampage V Edition 10
> Corsair Dominator Platinum 16gig 3200mhz
> 2x EVGA 1080 FTW in SLI EK Blocked
> Samsung 950 PRO M.2 NVME
> Samsung 850
> WD Black 1tb
> EVGA 1000 P2
> Dual loops with D5 pumps
> 13 fans
> 
> I am trying to figure out if my 1000watt PSU will be able to handle all of this, I don't think it will be able too. However with all the wonders of mining all the freaking PSU above 1000 are all sold out everywhere.


My system is fairly similar to yours. When monitoring psu usage through Link, my spikes would go as high as 920 to 960w on the stock FE bios when benching or gaming. What's the PSU? Is it truly rated for 1k or does it just say it on the side? You should be fine, but it is cutting it close.


----------



## KCDC

Quote:


> Originally Posted by *KedarWolf*
> 
> There is something wrong with my loop. I was getting 48C max on a 360 RAD with a CPU and my 1080 Ti.
> 
> Now I'm getting over 60C. I think when I installed the 1080 Ti EK block it wasn't cleaned properly and some gunk got in the GPU block.
> 
> It's a EK Predator 360 AIO with a prefilled Titan X Pascal block.


That sounds like trapped pressure to me, though I'm not sure how you'd take care of that with a closed loop system... Did you contact EK over it? They might have a solution to slightly bleed the system.


----------



## Iceman2733

Quote:


> Originally Posted by *KCDC*
> 
> My system is fairly similar to yours. When monitoring psu usage through Link, my spikes would go as high as 920 to 960w on the stock FE bios when benching or gaming. What's the PSU? Is it truly rated for 1k or does it just say it on the side? You should be fine, but it is cutting it close.


my PSU is an EVGA 1000 P2, I have been trying to find myself a bigger one but all the miners and resellers are buying up everything.


----------



## smithsrt8

Quote:


> Originally Posted by *KedarWolf*
> 
> There is something wrong with my loop. I was getting 48C max on a 360 RAD with a CPU and my 1080 Ti.
> 
> Now I'm getting over 60C. I think when I installed the 1080 Ti EK block it wasn't cleaned properly and some gunk got in the GPU block.
> 
> It's a EK Predator 360 AIO with a prefilled Titan X Pascal block.


Did it just start happening? Did you have the water setup before the block? What's your flow rate?


----------



## KCDC

Quote:


> Originally Posted by *Iceman2733*
> 
> my PSU is an EVGA 1000 P2, I have been trying to find myself a bigger one but all the miners and resellers are buying up everything.


Yeah I hear that, been trying to RMA an ax1500i but Corsair is out of stock for a month. Amazon seems to have some EVGA 1600s in stock.


----------



## smithsrt8

I would trust a Kill A Watt over the Link system... I know that many times the manufacturers are very conservative with their numbers, and in many cases you're not really pulling as many watts as Link would say. I would like someone with a Corsair (with Link) and a Kill A Watt to compare the two numbers...

Like I said, with maxed-out sliders in AB and 1.36v on a 91W 7700K... a 6950k will obviously pull more, but I don't see it pulling another 100W. I would think you are good with a quality 1000W PSU. You will probably hit 920W at the very, very max, but it won't be constant, so I would say go for it!

During gaming you will be at 600-700W; during mining/folding you will hit the high 800s.


----------



## KedarWolf

Quote:


> Originally Posted by *KCDC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> There is something wrong with my loop. I was getting 48C max on a 360 RAD with a CPU and my 1080 Ti.
> 
> Now I'm getting over 60C. I think when I installed the 1080 Ti EK block it wasn't cleaned properly and some gunk got in the GPU block.
> 
> It's a EK Predator 360 AIO with a prefilled Titan X Pascal block.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That sounds like trapped pressure to me, though I'm not sure how you'd take care of that with a closed loop system... Did you contact EK over it? They might have a solution to slightly bleed the system.

I unplugged everything but the power cord, rotated the case in every direction possible, shaking it as I did, tapped the hoses as well, and rotated it 90 degrees in every direction. Still the same trouble. I know someone had gunk in their EK 1080 Ti block; I'm thinking that's it.









And I topped off the fluid as well.









An EK Predator 360 is basically a custom loop that's expandable: it has ports you can unscrew to add fluid and top it off, a 360 rad, and an attached DDC pump; it just has a built-in reservoir is all.


----------



## buellersdayoff

Quote:


> Originally Posted by *KedarWolf*
> 
> There is something wrong with my loop. I was getting 48C max on a 360 RAD with a CPU and my 1080 Ti.
> 
> Now I'm getting over 60C. I think when I installed the 1080 Ti EK block it wasn't cleaned properly and some gunk got in the GPU block.
> 
> It's a EK Predator 360 AIO with a prefilled Titan X Pascal block.


I posted a similar issue with my 980 Ti in the EK club. The cause, as far as I can tell (I found it when I removed that GPU to install the 1080 Ti), was the bolts on the block coming loose, allowing the jet plate to shift and some coolant to bypass the fins in the block. I'd recommend at least checking the tension on those bolts, and at worst opening the block to re-seat the jet plate. In hindsight I should have checked the tension before installing the block, but hey, it had a quality control sticker on there saying it was all good lol. One of my bolts had actually come out, and luckily it never leaked.
http://www.overclock.net/t/993624/ek-club/19710#post_26123934


----------



## KCDC

Quote:


> Originally Posted by *KedarWolf*
> 
> I unplugged everything but the power cord, rotated the case in every direction possible shaking it as I did, tapped the hoses as I did as well.,rotated it 90 degrees in every direction. Still same trouble. I know someone had gunk in their EK 1080 Ti block, I'm thinking that's it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And I topped off the fluid as well.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> An EK Predator 360 is basically a custom loop that's expandable, it has inlets you can unscrew, add fluid, top it off, 360 rad, attached DDC pump, just has a built in reservoir is all.


Can you totally empty and refill it?

When I flush my system, I use my little Metro DataVac which does both suck and blow. Blow through the system to get all of the fluid out and then refill or clean or whatever.

Might be a hassle with your setup, I know they use QDC fittings, might be worth a shot if you can find a way.


----------



## KedarWolf

Quote:


> Originally Posted by *smithsrt8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> There is something wrong with my loop. I was getting 48C max on a 360 RAD with a CPU and my 1080 Ti.
> 
> Now I'm getting over 60C. I think when I installed the 1080 Ti EK block it wasn't cleaned properly and some gunk got in the GPU block.
> 
> It's a EK Predator 360 AIO with a prefilled Titan X Pascal block.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Did it just start happening? Did you have the water setup before the block? Whats your Flow rate?

I took the block off, reapplied the thermal paste, and put it back on; now temps are below 50C at 75% pump speed.









I think somehow the block wasn't completely flush with the GPU die, not sure why.


----------



## cluster71

I've been irritated that my Zotac E has only been stable with the memory clocked to 11800; beyond that, smeary image and freeze. I would like to increase the memory voltage but I do not know how to do that in hardware on the Zotac, so I'm trying to lower the temperature instead. I swapped the pads. The old ones were very dry; they had probably been sitting in the shop for a long time. I took the Phobya Thermal Pad XT 1mm, 7 W/mK that I had at home and ordered new ones. I also removed most of the rubber support over the VRM heatsink for better cooling.

Looped Heaven in the afternoon to find the memory limit and check temps. Stabilized at 31-37C for the memory and 41C on the VRM with an IR meter, 20C ambient.
The GPU memory is now stable to 12110. Satisfied.

The Zotac E is now stable at 2126/12110 1.093v on the Arctic BIOS.

A new BIOS for my motherboard was released yesterday, so it was possible to get the PC stable at 103.2 bus speed, where the XMP-4272 profile is intended. CPU stable at 5470 now during CPU/GPU benchmarks, 24/7 setting. Changed the voltage; maybe I can lower it a bit.

A bit bored with this PC, it feels limited, but it's a mITX and supposed to be something better than a PS4. Perhaps the X299 Skylake-X variant in the fall. Does anyone have any tips about motherboards?

http://www.3dmark.com/spy/1988629


----------



## KCDC

Quote:


> Originally Posted by *KedarWolf*
> 
> I took the block off, reapplied the thermal paste, put it back on, now temps below 50C at 75% pump speed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think somehow the block wasn't completely flush with the GPU die, not sure why.


I'm sure you also know to use way more paste on the GPU than you normally do on a CPU, right? I almost made that mistake when I did my first build.


----------



## KedarWolf

Quote:


> Originally Posted by *KCDC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I took the block off, reapplied the thermal paste, put it back on, now temps below 50C at 75% pump speed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think somehow the block wasn't completely flush with the GPU die, not sure why.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm sure you also know to use way more paste on the GPU than you normally do on a CPU, right? I almost made that mistake when I did my first build.

A full loop of the Fire Strike Ultra stress test never went higher than 50C, which is a few C better than before the trouble, when it would hit 52-53C.


----------



## Amuro

I know this is overclock.net, but here it goes: ATM, Vega won't beat this efficiency-to-performance ratio.


----------



## smithsrt8

Quote:


> Originally Posted by *KedarWolf*
> 
> Full loop of Fire Strike Ultra stress test, never went higher than 50C, which is a few C better than before the trouble, would hit 52-53C.


That seems about right. On Fire Strike I usually don't go higher than the high 40s or 50 either...


----------



## Dasboogieman

OK, update on the ArcticStorm BIOS (on my Aorus). Voltage regulation seems to have gone to crap. At least I think that's what it is, because 2100 MHz is now no longer stable (i.e. 12 hrs of Witcher 3 at 4K); 2088 seems OK for the time being.
Two theories:
1. Vdroop is ridiculous on the ArcticStorm BIOS, which in turn is not entirely compatible with the Aorus VRMs, which run faster and tighter.
2. Gigabyte's BIOS was lying: when it said 2100 MHz in GPU-Z, internally it was something else entirely.

The ArcticStorm still outperforms Gigabyte's BIOS by a significant margin. I'm keeping an eye on the VRMs because the inductors are now starting to whine, which means more current is being pulled through the assembly. Theoretically, both Zotac and Gigabyte should have similar MOSFET switching frequencies, so the regulation should be similar.

Finally, my power readings are waaaay off. I just realised Gigabyte factory shunt modded their cards; the shunts are all 002, so this could be what is spoofing the power limit. Because other BIOSes are unaware of this, the power reading is super inaccurate.

EDIT: OK, so adjusting for Gigabyte's factory 002 shunts, my power draw in The Witcher 3 is 407W, so this thing definitely works.

EDIT: Turns out Gigabyte's BIOS has the ridiculous Vdroop; the Aorus cards are actually boosting 1 bin below the max 1.093v, because 2088 MHz on the ArcticStorm = 2100 MHz on the Aorus.
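On the shunt arithmetic: a power controller computes current as V_sense / R_assumed, so if the BIOS assumes reference shunts but lower-resistance parts are fitted, the reported power is low by the resistance ratio. A sketch of the correction, taking the reference value as 5 mΩ (R005) against the factory-fitted 2 mΩ (R002) parts; the 5 mΩ reference is my assumption for illustration:

```python
# Correct a power reading taken through mismatched current-sense shunts.
# The controller computes I = V_sense / R_assumed, so with lower-resistance
# shunts fitted, the reported power is low by R_actual / R_assumed.
def true_power_w(reported_w, r_actual_mohm=2.0, r_assumed_mohm=5.0):
    return reported_w * r_assumed_mohm / r_actual_mohm

# e.g. a reported ~163 W would correspond to roughly 407 W actually drawn
print(true_power_w(162.8))
```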


----------



## KedarWolf

This is the XOC BIOS at 2088/6210; I run 2088/6177 as my regular OC with it, both at 1.100v.


----------



## ZealotKi11er

Quote:


> Originally Posted by *KedarWolf*
> 
> This is XOC BIOS at 2088 6210, I run 2088 6177 as my regular OC with it, both at 1.100v


Is the XOC BIOS faking something? Because I OCed my card to 2100 and the score did not change from my 2025 MHz runs.


----------



## Dasboogieman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Is XOC bios faking something because I Oced my card to 2100 and core did not change from 2025 MHz runs.


That is the 5% performance regression you are seeing. The only cards which don't experience this are the Strix models, which the BIOS was originally designed for. The performance number is what reports the truth; your monitored clock speed doesn't mean squat, because there are internal regulation modules on die which we cannot control.

I just found out my stock Gigabyte BIOS was also lying to me. Turns out it wasn't really running 2100 MHz all this time (despite what it said in GPU-Z); it was boosting 1 bin lower to account for massive built-in LLC. My ArcticStorm BIOS is now boosting to the correct 2088 MHz frequency at the cost of coil whine (presumably because the AS BIOS has much less LLC).


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> This is XOC BIOS at 2088 6210, I run 2088 6177 as my regular OC with it, both at 1.100v


If you're running the Steam version of 3DMark, download the standalone version from 3DMark.com. XOC BIOS.

It seems to do better in benches than the Steam version.









Mind you, the difference may be that I'm using my signature CPU settings etc., as the previous run was at 4.6GHz CPU, 4.3GHz cache while I was troubleshooting my card running too hot. :/


----------



## KCDC

Quote:


> Originally Posted by *KedarWolf*
> 
> If you're running the Steam version of 3DMark, download the standalone version from 3DMark.com. XOC BIOS.
> 
> it seems to do better in benches that Steam version.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Mind you the difference may be I'm using my signature CPU settings etc. as previous was at 4.6GHZ CPU, 4.3GHZ cache as I was trouble shooting my card running too hot. :/


Damn you, Steam... I didn't know this..

I've allocated some time tonight to figure out this voltage curve situation finally. On the XOC Bios. Questions may arise.


----------



## Streetdragon

Went back to the stock BIOS. 2012 MHz core / 6003 MHz VRAM at 1V. Nice and stable, around 250-270 watts. Works like a charm. For 50 MHz more, around 50 watts more (for me) is not really worth it.

I also tuned my backplate a bit. With a bit of airflow it is cold to the touch while benching/gaming. Dropped the core temp around 1°; don't know why, maybe within margin of error... but better cool than sorry ^^

But I can't reach the high bench scores from you guys... don't know why, I'm far away from the power limit. Maybe I'll try to get more out of the memory; I didn't try going higher because over 6000 was a nice start ^^


----------



## shadow85

I am getting lower performance with any driver after 378.92 on the Windows 10 Creators Update.

I have 2x ASUS GTX 1080 Ti Strix OC and I am only managing about 40-53 fps in Mass Effect Andromeda @ 4K with any newer driver; currently I am on 382.53. That is about the same as when I had non-Ti Strix OCs in SLI. AB is reporting that I am only getting 50-60% GPU usage from each card.

If I change back to 378.92 I am back at a constant 60 fps and 99% usage on both cards.

Anyone else having these sorts of issues, and is there any way to fix it?


----------



## OneCosmic

Is Zotac Arctic Storm BIOS working on FE cards?


----------



## ZealotKi11er

Quote:


> Originally Posted by *OneCosmic*
> 
> Is Zotac Arctic Storm BIOS working on FE cards?


It did nothing for me. With an FE card, just stick to stock. I tried a lot, and really, 2-3% is not worth the hassle.


----------



## lilchronic

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I did nothing for me. With FE card just stick to stock. I tried a lot and really 2-3% is not worth the haste.


or the extra amount of heat.


----------



## cluster71

For me on the Zotac AMP Extreme, neither the Zotac E 384W nor the Zotac Arctic 432W BIOS seems to be restrictive. And I run "Powerlimit.bat" 384 432.

During Time Spy yesterday at 2126/12110 1.093v on the Arctic BIOS, the entire PC pulled 432W from the wall continuously; at 92% efficiency that is 397W real. Afterburner shows 91% power (TDP 360 × 0.91 = 328W for the GPU, believable, and close to what TechPowerUp reported).
If I run a CPU bench test, the entire PC pulls 132W (121W real). The CPU load is not 100% during a GPU benchmark.
The highest I recorded was 468W (430W real), when the PC was crashing/freezing.
These are 120V numbers; 230V is usually one percent better.

The only 1000W Gold PSU at 15cm that I found:
http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story&reid=494
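The wall-to-DC arithmetic above generalizes: a Kill A Watt reads AC at the wall, and multiplying by the PSU's efficiency at that load gives the DC watts actually delivered to the components. A tiny sketch using the 92% figure quoted above:

```python
# Convert a wall (AC) power reading into the DC watts the PSU delivers.
# The 0.92 efficiency is the figure quoted above; real efficiency varies
# with load, so treat this as an estimate.
def dc_watts(wall_w, efficiency=0.92):
    return wall_w * efficiency

for wall in (432, 132, 468):   # the wall readings quoted above
    print(f"{wall} W at the wall -> {dc_watts(wall):.0f} W DC")
```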


----------



## Psykoking

new drivers are out: http://www.nvidia.com/download/driverResults.aspx/120603/de


----------



## AstroSky

Not seeing this new driver for windows 10 64 bit English :/


----------



## smithsrt8

http://www.nvidia.com/download/driverResults.aspx/120486/en-us

Here you go


----------



## Psykoking

I'm now getting nearly the same results with the max possible OC on the stock BIOS (top) and the max possible OC on the XOC BIOS (bottom).


----------



## Juub

OK, gotta ask: how in the hell do some of you guys manage 1900 MHz on air with an FE? My base clock is 1480 and I can only increase it to +125 (1605 MHz) before the card locks up and crashes. I don't modify the voltage; would that allow for higher sustained clocks?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Juub*
> 
> Ok gotta ask, how in the hell do some of you guys manage 1900Mhz on air with a FE? My base clock is 1480 and I can only increase it to +125(1605Mhz) before the card locks up and crash. I don't modify the voltage. Would it allow for higher sustained clocks?


That's the 1900 MHz boost clock. Look at the GPU-Z sensors tab while loading your GPU to see your actual boost clocks.


----------



## Juub

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> 1900MHz boost clock. Look at gpuz sensor tab while loading your gpu to see your actual boost clocks.


Boost clock is 1707 MHz. Still nowhere near the 1900+ MHz I see on here. Maybe I got a dud, like always.

The more I think about it, the more I think my issues are related to my PSU or motherboard. In the span of 5 years I've gone through a 670, a pair of Windforce 7950s, a 970, a pair of 970s, a 7850, a pair of EVGA 980 SCs, a pair of 1080 FEs, an RX 480, a 980 Matrix Gold, a 980 Ti Strix, and a 1080 Ti FE, and aside from the 980 Matrix Gold and 980 Ti Strix they were all below-average overclockers.


----------



## cluster71

A thought.

As you know, high current requires thick cables to limit resistive losses and heat. That's why voltage is transformed up to 400 kV before it is delivered toward households, for example; otherwise the cables would get so thick they could not be hung. Car stereos run at 12V like PCs do, and they use thick cables to prevent voltage drop and deliver the power.

Inside a PC you are bound to standards created a long time ago that feel obsolete now with high-performance graphics cards. My PSU is solid at 12.288V even under load.
I was going to measure during a benchmark whether the graphics card still sees 12.288V, or whether the voltage drops toward 11.8-11.5V. When the GPU pulls up to 400-450W, outside the standard, it is possible. So two weeks ago I connected a 4x PCIe 8-pin adapter, and there are extra capacitors on the PSU and cables for the GPU/CPU.

I have not measured yet, but now the graphics card does not seem to pull so much power, and it does not seem to be an advantage to push XOC. And the Zotac Extreme has a quite oversized power delivery, so maybe it's not a problem. I'll measure and report back later, but...

With an FE, especially shunt modded, voltage drop might be a problem. The VRM on the graphics card is probably trying to keep the voltage to the GPU up in any case, but it becomes handicapped.

You may get coil whine and instability? You may want to try other cables and measure. And I hope you do not use those daisy-chained PCIe cables with the second connector on a loop, where 3 wires are connected instead of 2 + sense on the 6-pin (most manufacturers seem to go beyond the recommendation).

Does anyone know what extreme overclockers do: do they just press on with higher voltage instead, or does the cold do the trick? Many overvolted their power supplies if you look back ten years or so. Now there are many protection features, so it's not that easy.
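The voltage-drop worry above can be put in numbers: at 12V a 300W card pulls 25A, so by V = I × R even a few tens of milliohms of cable and connector resistance cost a noticeable fraction of the rail. A minimal sketch (the 10 mΩ round-trip loop resistance is an illustrative assumption, not a measurement):

```python
# Ohm's-law estimate of 12V-rail sag across the PCIe power cabling.
# The 10 milliohm round-trip loop resistance is an illustrative
# assumption, not a measured value.
def v_drop(load_w, rail_v=12.0, loop_resistance_ohm=0.010):
    current_a = load_w / rail_v              # P = V * I  ->  I = P / V
    return current_a * loop_resistance_ohm   # V = I * R

for load in (300, 450):
    print(f"{load} W -> {load / 12:.1f} A -> {v_drop(load) * 1000:.0f} mV of sag")
```

Halving the loop resistance (thicker wire, or a second cable sharing the load) halves the sag, which is one argument for separate cables per 8-pin rather than daisy-chained pigtails.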


----------



## smithsrt8

Interesting thing about voltage/watts I saw last night. I was messing around with my cards: I had the XOC BIOS on both cards and core voltage boost @ 50% in AB. While running Time Spy, in the second scene my watts (showing on my Kill A Watt) hit 1047!! I started freaking out a bit because I only have a 1000W power supply (albeit a Be Quiet, so it's a good one), but the cards just chugged along. I ended up getting a 15757 in Time Spy.

Anyone with any kind of monitoring notice stuff like this?

I did notice afterwards, though, that when I tried to update my drivers to the latest ones (that were released last night) it kept failing. I had to reinstall Windows (I tried DDU on the drivers and everything, even safe mode) to get the new drivers to take.


----------



## pattmarsons

Quote:


> Originally Posted by *shadow85*
> 
> I am getting lower performance with any drivers after 378.92 in Windows Creators.
> 
> I have 2x ASUS GTX 1080 Ti STRIX OC and I am only managing about 40-53 fps in Mass effect Andromeda @ 4K with any newer drivers, currently I am on 382.53 That is about the same as when I had non-Ti STRIX OC in SLi. AB is reporting that I am only getting 50-60% GPU usage from each card.
> 
> If i change back to 378.92 I am back at 60fps constant and 99% usage on both cards.
> 
> Anyone else having these sort of issues, is there anyway to fix it?


Are you using MSI Afterburner or the EVGA overclocking utility? I found that for some reason the EVGA software throttled my cards. Is this with no aftermarket software running?


----------



## cluster71

Good, I'll check the drivers later...


----------



## ilikelobstastoo

Quote:


> Originally Posted by *shadow85*
> 
> I am getting lower performance with any drivers after 378.92 in Windows Creators.
> 
> I have 2x ASUS GTX 1080 Ti STRIX OC and I am only managing about 40-53 fps in Mass effect Andromeda @ 4K with any newer drivers, currently I am on 382.53 That is about the same as when I had non-Ti STRIX OC in SLi. AB is reporting that I am only getting 50-60% GPU usage from each card.
> 
> If i change back to 378.92 I am back at 60fps constant and 99% usage on both cards.
> 
> Anyone else having these sort of issues, is there anyway to fix it?


I have noticed exactly the same issue. All of a sudden my scores were low (low 10,000s in SuPo 4K) and I couldn't maintain the same curves while benching. I kept updating drivers to no avail. Finally I experimented with going back to older drivers, and when I got back to 378.92, like you, suddenly my old curve would run and pass and my scores went back up to around 10,650 or so in SuPo 4K.

I haven't tried the latest drivers that came out a day or so ago, though. Have you?


----------



## ocCuS

So,
I tested my 1080 Ti Hall of Fame (the BIOS is below for you to test) and I'm a little confused.
This was my last Superposition run (1080p Extreme):

When I increase the core clock, the benchmark / driver crashes. The voltage was at 1.0810 V, temps at 50°C, and the power consumption always stayed between 32-38%.

Is there a way through software to add more voltage / raise the power limit to get a higher core clock?


----------



## KedarWolf

Quote:


> Originally Posted by *ocCuS*
> 
> So,
> I tested my 1080 ti Hall of Fame (Bios for you to test is following) and I´m a little confused.
> So this was my last Superposition run (1080 extreme):
> 
> 
> 
> When I increase the core clock, the benchmark / driver will crash. The voltage was at 1.0810 V, temps at 50°C and the power consumption was always between 32-38%.
> 
> Is there a way through software to put more voltage / increase the power consumption to get a higher core clock?


I'd definitely RMA that card; if you're not getting 2000 on the core, then something is wrong with it for sure.


----------



## lilchronic

Quote:


> Originally Posted by *ocCuS*
> 
> So,
> I tested my 1080 ti Hall of Fame (Bios for you to test is following) and I´m a little confused.
> So this was my last Superposition run (1080 extreme):
> 
> 
> 
> When I increase the core clock, the benchmark / driver will crash. The voltage was at 1.0810 V, temps at 50°C and the power consumption was always between 32-38%.
> 
> Is there a way through software to put more voltage / increase the power consumption to get a higher core clock?


On the Sensors page of GPU-Z, click the GPU clock reading so it says max GPU clock.


----------



## ocCuS

Quote:


> Originally Posted by *KedarWolf*
> 
> I'd definitely RMA that card, if you're not getting 2000 core than something is wrong with it for sure.


Quote:


> Originally Posted by *lilchronic*
> 
> On the sensor page of gpu-z click the gpu clock so it say's max gpu clock.


I'm even more confused now after a little more testing.
The core is max stable around 2050 (MSI Afterburner) / 2062 (GPU-Z) during benchmarks. I haven't gotten 2077 or 2088 MHz stable...
But the memory on this card is insane. It is maxed out at +1000 (12992 MHz) with no problems.

But the card still sits at around 40-45% power consumption during benchmarks... I don't know why.


----------



## nyk20z3

Finally -

http://www.guru3d.com/articles_pages/msi_geforce_gtx_1080_ti_lightning_z_review,1.html


----------



## ocCuS

Tried to upload my 1080 Ti Hall of Fame BIOS here as an attachment, but I always get "AJAX response unable to be parsed as valid JSON object."
Does anyone know what this means?


----------



## Psykoking

Quote:


> Originally Posted by *ilikelobstastoo*
> 
> I have noticed exactly the same issue. All the sudden my scores were low (low 10's in SuPo 4k) and I couldnt maintain the same curves while benching . I kept updating drivers to no avail. Finally I experimented with going back to older drivers and when I got back to 378.92 like you, suddenly my old curve would run and pass and my scores went back up to around 10,650 or so in SuPo 4k.
> 
> I haven't tried the latest drivers that just came out a day or so ago though have you?


Might be because of the drivers.
Quote:


> Originally Posted by *ocCuS*
> 
> So,
> I tested my 1080 ti Hall of Fame (Bios for you to test is following) and I´m a little confused.
> So this was my last Superposition run (1080 extreme):
> 
> 
> 
> When I increase the core clock, the benchmark / driver will crash. The voltage was at 1.0810 V, temps at 50°C and the power consumption was always between 32-38%.
> 
> Is there a way through software to put more voltage / increase the power consumption to get a higher core clock?


Quote:


> Originally Posted by *ocCuS*
> 
> Tried to upload my 1080ti Hall of Fame Bios here with an attachement but getting always "AJAX response unable to be parsed as valid JSON object."
> Does anyone know what this means?


Do you have any script blockers or anti-virus software running? They might be blocking that request.


----------



## ocCuS

Quote:


> Originally Posted by *Psykoking*
> 
> Do you have any script blockers, anti-virus software running? They might try to block that query.


Tried with or without anti-virus, adblock and different browsers and I tried it from my laptop...Everytime the same message ("AJAX response unable to be parsed as valid JSON object."). Will try it tomorrow from work...weird


----------



## Psykoking

Quote:


> Originally Posted by *ocCuS*
> 
> Tried with or without anti-virus, adblock and different browsers and I tried it from my laptop...Everytime the same message ("AJAX response unable to be parsed as valid JSON object."). Will try it tomorrow from work...weird


Have you tried uploading it to the GPU-Z database?



Just hit this button.


----------



## ocCuS

Quote:


> Originally Posted by *Psykoking*
> 
> Have you tried uploading it to the GPU-Z database?
> 
> 
> 
> Just hit this button.


Yeah, that's how I saved the BIOS to my PC. And even from my smartphone I couldn't upload it and got the same message. Weird af

But thanks for your help!

Uploading to the online database worked fine.

Here is the link to the 1080 ti Hall of Fame Bios from Galax / KFA²:

https://www.techpowerup.com/vgabios/193134/193134

Enjoy


----------



## Psykoking

Quote:


> Originally Posted by *ocCuS*
> 
> Yeah, this way I saved the bios on my PC. And even on my smart phone I couldn´t upload it and got the same message. Weird af
> 
> Uploading to the online database worked fine.
> 
> Here is the link to the 1080 ti Hall of Fame Bios from Galax:
> 
> https://www.techpowerup.com/vgabios/193134/193134
> 
> Enjoy


Thank you very much, appreciate it


----------



## cluster71

Quote:


> Originally Posted by *ocCuS*
> 
> I´m even more confused now after a little more testing.
> Core is max stable around 2050 (msi afterburner) / 2062 (gpu-z) during benchmarks. Haven´t got an 2077 or 2088 mhz stable....
> But the memory on this card is insane. It is maxed out to +1000 (12992mhz) with no problems.
> 
> But the card is still during benchmarks around 40-45% power consumption...which I don´t know why


I evaluated the memory and the core separately. Run Heaven or something in a loop and increase the memory until you find its limit. Then do the core with the memory left at default, and when you're finished, try to combine both into something good...


----------



## Psykoking

So far no output from either of the DPs...


----------



## cluster71

Quote:


> Originally Posted by *Psykoking*
> 
> So far no output from either of the DPs...


Did you try it on the Zotac Extreme?


----------



## Psykoking

Yep, glad I still have my 290Xs lying around, so I can use one to revert the flash...


----------



## KedarWolf

Quote:


> Originally Posted by *ocCuS*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Psykoking*
> 
> Do you have any script blockers, anti-virus software running? They might try to block that query.
> 
> 
> 
> Tried with or without anti-virus, adblock and different browsers and I tried it from my laptop...Everytime the same message ("AJAX response unable to be parsed as valid JSON object."). Will try it tomorrow from work...weird
Click to expand...

You need to put it in a .zip file with WinRAR.


----------



## cluster71

Quote:


> Originally Posted by *Psykoking*
> 
> Yep, glad i still have my 290xs lying around, so i can use them to revert the flash...


Hehe, sometimes it's not enough to just reboot. You might need to cut the power and then boot to make it right.


----------



## Psykoking

Quote:


> Originally Posted by *cluster71*
> 
> Hehe sometimes it's not enough to just reboot. You might need to cut the power and boot to make it right


Tried it, didn't help though...


----------



## cluster71

Does it not want to start at all? Or is it the DP that is not working?


----------



## Psykoking

Both DPs have no signal output, and atm I can't try HDMI or DVI-D because I have no cables lying around...


----------



## cluster71

Can you use the CPU's integrated graphics, if you have any, or another graphics card? Start the Zotac without any PCIe power cables so it boots in safe mode, I think. I did that with one of my Titans before.


----------



## Psykoking

Quote:


> Originally Posted by *cluster71*
> 
> Can you use the CPU graphics circuit if you have any or another graphics card. Start without any pcie cables in Zotac so it boot in safe mode i think. Did so with one of my Titan's before


Well, if only Intel had decided to give the 5930K onboard graphics. Found an HDMI cable: still no output. Flashed again to make sure the flash itself wasn't the problem... still no output.


----------



## Psykoking

Will revert back to Stock using the 290x...


----------



## cluster71




----------



## KedarWolf

Quote:


> Originally Posted by *ocCuS*
> 
> Quote:
> Quote:
> 
> 
> 
> Originally Posted by *Psykoking*
> 
> Have you tried uploading it to the GPU-Z database?
> 
> 
> 
> Just hit this button.
> 
> 
> 
> Yeah, this way I saved the bios on my PC. And even on my smart phone I couldn´t upload it and got the same message. Weird af
> 
> But thanks for your help!
> 
> Uploading to the online database worked fine.
> 
> Here is the link to the 1080 ti Hall of Fame Bios from Galax / KFA²:
> 
> https://www.techpowerup.com/vgabios/193134/193134
> 
> Enjoy
Click to expand...

DON'T flash this HOF BIOS on anything but a HOF card!

Your PC will not boot if you do, and if you don't have an iGPU or a second video card, you won't be able to flash it back.

It's tricky too: I had to load motherboard BIOS defaults and DDU-uninstall the Nvidia driver with the second card in and my 1080 Ti out before I could boot with my 1080 Ti in and flash the XOC BIOS back onto it.


----------



## ZealotKi11er

Man, looking at the Guru3D review, going from a stock 1080 Ti to a 1080 Ti Lightning + OC gives only 15% extra performance. Seems very low.


----------



## cluster71

Yes, sometimes they get good results and sometimes they are very low. I'm surprised they cannot do more with the memory when they have the extra voltage controls.

2062 on the Zotac and 2050 on the MSI. I'm convinced they could push them a bit further if they spent more time.


----------



## lilchronic

Quote:


> Originally Posted by *ocCuS*
> 
> I´m even more confused now after a little more testing.
> Core is max stable around 2050 (msi afterburner) / 2062 (gpu-z) during benchmarks. Haven´t got an 2077 or 2088 mhz stable....
> But the memory on this card is insane. It is maxed out to +1000 (12992mhz) with no problems.
> 
> *But the card is still during benchmarks around 40-45% power consumption...which I don´t know why
> *


What card is that? And have you flashed a different BIOS, or are you using a secondary BIOS?

The Galax HOF card is made for extreme cooling and probably has a secondary BIOS with an unlocked power target.


----------



## Dasboogieman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Man looking at the Guru3D review going from Stock 1080 Ti to 1080 Ti Lightning + OC only 15% extra performance. Seems very low.


Eh, that tells you how good Boost 3.0 is.


----------



## cluster71

Galax HOF: 450W limit, 3x 8-pin PCIe power. If he can't overclock over 1.093v, then he doesn't have much use for the high limit.

I have the Zotac Extreme with the Zotac Arctic BIOS: 450W (58.2% TDP in GPU-Z), 2126/12000 @ 1.093v


----------



## lilchronic

Quote:


> Originally Posted by *cluster71*
> 
> Galaxy Hof 450w Limit 3x8 pcie power. *If he can not overclock over 1.093v, then he does not have much use of the high limit.*
> 
> I have Zotac Extreme with Zotac Arctic bios 450w 58.2% TDP GPU-Z, 2126/12000 1.093v


So what? At least it's better than an FE card that throttles @ 300W. I have to run my card @ 2000MHz / 0.993v to stay away from power throttling. At least he'll be able to clock higher and actually take advantage of the full 1.093v without hitting that darn power limit.

Nice card

Can you pass any benchmarks at those speeds, or just run the GPU-Z render for 2 seconds?


----------



## cluster71

It's no problem, but the advertising makes it sound as if there's a lot of headroom to take advantage of.


----------



## cluster71

If you look back a few pages you can see some benchmarks. I've played a couple of hours now. Stable at 2126/12110, with the 7700K CPU at 5470MHz and RAM at 4266MHz. I may be able to pull it up to 5500/4300 without any problems. CPU temp is 48C when I'm gaming.


----------



## Psykoking

God damn, if I could finally find where that stupid VBIOS checksum is stored, we could finally edit the power limit...
Well, I guess I will tell you what I found so far.
The values for the power limit are unencrypted 32-bit unsigned ints, which can be found at addresses 0x00030E70 to 0x00030E73 (50% power limit), 0x00030E74 to 0x00030E77 (100% power limit), and finally 0x00030E78 to 0x00030E7B (120% power limit). [Values vary slightly from vendor to vendor; for example, the MSI Gaming X starts at 0x00030E83.]

The easiest way to find them is:

- open Hex Workshop
- press CTRL+F
- select 32-bit Unsigned Long
- enter your max power limit (for example 384W, with 000 appended -> 384000)
- hit enter

These values are stored at nearly the same addresses on all cards (even the XOC). However, somewhere in the XOC BIOS there is a switch that prevents the card from pulling those limits from the BIOS. So what we now need to do is:

- find the checksum, to be able to flash an edited VBIOS (or ask JoeDirt to compile the newest nvflash with certificate bypass)
- find the value that switches off the power limit (I think I have already found it, but since I can't flash an edited BIOS atm, I can't test it)
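The hex search described above can be sketched in a few lines of Python. This is only an illustration of the technique, not a flashing tool: the offsets and the milliwatt scale (384W -> 384000) come from the post, while the buffer below is fabricated purely so the example is self-contained (a real dump would come from GPU-Z or nvflash).

```python
# Sketch of the power-limit search: scan a VBIOS dump for consecutive
# little-endian 32-bit unsigned ints holding the limits in milliwatts.
# The 'fake' buffer plants the triple at the address quoted in the post.
import struct

def find_u32(blob: bytes, value: int) -> list:
    """Return every offset where `value` appears as a little-endian u32."""
    needle = struct.pack("<I", value)
    hits, start = [], 0
    while (pos := blob.find(needle, start)) != -1:
        hits.append(pos)
        start = pos + 1
    return hits

def read_limit_triple(blob: bytes, offset_of_max: int) -> tuple:
    """Read the 50% / 100% / 120% triple, assuming the 120% (max) value
    sits last, as at the quoted addresses 0x30E70..0x30E7B."""
    return struct.unpack_from("<III", blob, offset_of_max - 8)

# Fabricated 'BIOS': zeros, then a 160W / 320W / 384W triple at 0x30E70.
fake = bytes(0x00030E70) + struct.pack("<III", 160000, 320000, 384000)
hits = find_u32(fake, 384000)
print([hex(h) for h in hits])            # where the 120% limit was found
print(read_limit_triple(fake, hits[0]))  # (50%, 100%, 120%) in mW
```

Searching for the largest (120%) value first, as the post suggests, avoids false hits: small round numbers like 160000 are more likely to occur by chance elsewhere in the image.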


----------



## ocCuS

Quote:


> Originally Posted by *cluster71*
> 
> I had evaluated the memory and the core for itself. Run Heaven or something in the loop and increase memory until you find the limit. Then take the core and let the memory be on the default until you finish, then try to combine both to something good ...


Quote:


> Originally Posted by *KedarWolf*
> 
> You need to put it in a .zip file with winrar.


Will try both later today. Thanks

And yeah, given the power design of this card, I think standard 1080 Ti cards will have problems with this BIOS. So beware of flashing your card with it if your card only has a single BIOS.

Still, the card is a little strange. Last week I sent my first HoF back (max was 2088 core / 6100 MHz memory in Valley), the main reason being that I had ordered it 2 days after the Destiny 2 bundle.
Now this card seems to max out at 2075 MHz core in Valley, but like I said, the memory I can push to the limit: +1000, to 6494 MHz (temps between 50-55°C).
The voltage limit is at 1.093v, but the power consumption never goes over 50% (same with last week's card). And from what I saw, this card has no second BIOS, so I don't know how to push the card further (more voltage / power consumption)...


----------



## Dasboogieman

Quote:


> Originally Posted by *ocCuS*
> 
> Will try both later this day. Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Still the card is a little strange. Last week I send my first HoF back (max was 2088 core / 6100 mhz memory in valley) with the main reason I ordered it 2 days after the destiny 2 bundle.
> Now the card seems to max out at 2075 mhz core in valley but like I said the memory I can push to the limit to +1000 to 6494 mhz.
> Voltage limit is at 1.093v but the power consumption never goes over 50% (same with the card last week). And from what I saw, this card has no second BIOS, so I don´t know how to push the card more (more voltage / power consumption)...


Get more cooling by putting it under water. That card will do 2126MHz easily at <40°C, maybe even 6600MHz VRAM.


----------



## cg4200

After 5 1080 Tis I finally got one that will pass Firestrike Ultra at 2100 on the core. Sad...
Most would do 2050 on the core stable; one would not even render in GPU-Z at 2025, yikes!
My first Titan Pascal did 2126 core / +700 mem, still in the top 20. Wish I had never sold that card, regrets...
Anyway, I have all my cards shunted and all are reference cards.
My first question: if I flash a BIOS, don't like it, and flash back to stock, do I have to run the power-limit .bat every time I start my computer?
If I reload Windows I would not have to run the power-limit .bat, correct?
I do not want to go from 2100 to 2126 by flashing a new BIOS only to see no performance gain; the XOC seems like it's not the best choice if not on an ASUS.
Has anyone tried pony's XLR BIOS? Or have a link to it?
Or a suggestion for the best BIOS for a shunted FE? Thanks


----------



## nrpeyton

Quote:


> Originally Posted by *kevindd992002*
> 
> 1.) I have the AOC G2460PG, it's a 144Hz 1080p monitor, and yes G-Sync is enabled. I understand that I don't need much power to generate 144fps @ 1080p but then again the problem is that I'm not limiting myself/graphics card/game (vsync) to that framerate.
> 
> 2.) I've been with a couple of Nvidia cards (my last was a 1070) and I never bother to use "prefer maximum performance" as I read that the card doesn't 'idle' when that is set. What's your take on this?
> 
> 3.) Yeah, I should've turned the power slider all the way before anything.
> 
> 4.) Ok, good to know. What screen capture app do you use to do in-game captures or even record a video?
> 
> 5.) N/A
> 
> 6.) Ok, I'll keep that in mind but I'll have to try one step at a time so that we'll know what caused the problem when it gets solved.
> 
> I know that I generally just need 144fps for a 144Hz monitor anyway but then I'm merely just doing some testing here. I want to know how high of an fps can my card reach with GOW4 @ 1080p before my CPU bottlenecks it. GOW4 being poorly ported is worth looking into but then I don't understand the use of the "unlimited" framerate limit in the settings.
> 
> 7.) Regarding DSR, yes I'm familiar with it and is actually what I was thinking on doing as the next step. I've tried the in-game version of DSR before with my 1070 and it works just fine.
> 
> I know that a 1080Ti is overkill for a 1080p monitor but like I said I'm merely testing this. I'll just waiting for a deal for the Dell S2716DG and I'll go ahead and buy that for 1140p.
> 
> EDIT:
> 
> Just did some additional testing runs and here's my observation:
> 
> 1.) It's definitely not power related. I've switched to the slave GPU BIOS and ramped up the slider in MSI Afterburner to 125% max. GPU-Z shows only 70% TDP while GOW4 is running.
> 
> 2.) It's also not the NVCP power setting. I tried "prefer maximum performance" and the results are the same.
> 
> 3.) And then I tried DSR. I've set the in-game resolution to 3840x2160 and the GPU load just soared to 99%! So what the heck? I'm stuck at just 100fps at 1080p even though the card can do more?


That makes sense then.

Remember, G-SYNC forces the monitor to refresh at the FPS the game is rendering at. And it also works both ways, in the sense that the driver knows 144 FPS is your monitor's maximum refresh rate. So it won't try to render more than 144 FPS while G-SYNC is enabled, as that would put it OUT of sync.

So if your monitor refreshes at a maximum of 144 Hz, you're never going to see FPS beyond 144 while G-Sync is enabled. At 1080p your card is powerful enough to do _*more*_ than 144 FPS. That's why you're only seeing below 97% utilisation, and also why, when you use DSR, your FPS goes down and your utilisation goes up.

It's not a bug. And nothing is wrong. Completely normal behaviour.

Hope I was some help-- and you understand it all now ;-)

And:
It's been a while since I've done screen capturing. But you can get a fully unlocked version of FRAPS at a torrent site. And I think for in-game captures these days it's actually possible to use GeForce Experience; they've added the screen-capture ability recently, I think.


----------



## nrpeyton

Quote:


> Originally Posted by *Psykoking*
> 
> god damn if I could finally find where that stupid ass VBIOS checksum is stored, we could finally edit powerlimit......
> Well, I guess I will tell you what I found so far.
> The values for the power limit are non-encrypted 32 bit unsigned INTs, which can be found at addresses 0x00030E70 to 0x00030E73 (50% power limit), 0x00030E74 to 0x00030E77 (100% power limit) and finally 0x00030E78 to 0x00030E7B (120% power limit). [values vary slightly from vendor to vendor for example MSI Gaming X starts at 0x00030E83]
> 
> 
> 
> The easiest way to find them is:
> 
> open Hex Workshop
> CTRL+F
> select 32 bit Unsigned Long
> enter your max power limit (for example 384W and add 000 -> 384000)
> hit enter
> These values are for all cards stored at nearly the same addresses (even for XOC). However, somwhere in the XOC BIOS there is a switch to prevent the card from pulling those limits from the BIOS. So what we now need to do is:
> 
> Find the checksum the be able to flash edited VBIOS (or ask JoeDirt to compile to newest nvflash with certificate bypass)
> Find the value that switches off the power limit (I do think I already found them but since I cant flash an edited BIOS atm I cant test it)


I think JoeDirt already did his bit... it was released ages ago. It's just that we never got the BIOS editor, so it's been sitting there doing nothing.

It should be at the TechPowerUp website.

I know this for a fact, because I accidentally tried to use the certificate-bypass version when I first tried to flash my old 1080's BIOS last year... and it never worked, because the BIOS I was trying to flash was the XOC (which was officially signed by ASUS, i.e. it had a certificate). That version only works (and only needs to be used) with BIOSes that we've edited ourselves.

If you've successfully edited that file in your post... then you could have that BIOS flashed now (as soon as you get back online).

Let us know if it works... coz if it does then by god.. omg.. you've just cracked it lol!


----------



## Psykoking

Quote:


> Originally Posted by *nrpeyton*
> 
> I think Joedirt already done his bit.. it was released ages ago.. it's just we never got the BIOS editor so it's been there doing nothing for ages.
> 
> Should be at the tech-powerup website.
> 
> I know this for a fact. Because I accidently tried to use the certificates bypassed version when I first tried to flash my old 1080 BIOS last year.. and it never worked because the BIOS I was trying to flash was the XOC (which was officially signed by ASUS I.E it had a certificate).. And that version only works (and only needs to be used) by BIOS's that we've edited ourselves.
> 
> If you've successfully edited that file in your post.. Then you could have that BIOS flashed now. (as soon as you get back online).
> 
> Let us know if it works.. coz if it does then by god.. omg.. you've just cracked it lol!


Well, I already tried to flash using the existing nvflash versions from Joe, but I can't get it to recognize my card...
After I get my last two exams done on Wednesday, I will look into this further.


----------



## nrpeyton

Quote:


> Originally Posted by *Psykoking*
> 
> Well I already tried to flash using the existing nvflash versions from Joe, but I cant get it to recognize my card....
> After I got my last two exams done on Wednesday I will look into this further.


Okay will be good to know how you get on ;-)

Quote:


> Originally Posted by *cg4200*
> 
> After 5 1080tis I get one that will pass firestrike ultra at 2100 on core sad....
> Most would do 2050 on core stable one would not even render on gpu-z at 2025 yikes !
> my first titan pascal got 2126 core + 700 mem still in top 20 wish I never sold that card regrets>>>
> Any way I have all cards shunted and all are reference cards
> .
> My first question is if I flash a bios and don't like it flash back to stock do I have to run power limit bat every time I start my computer??
> If I reload windows I would not have to do power limit bat correct???
> I do not want to go from 2100 to 2126 by flashing new bios but see no performance gain xoc seems if not asus not best choice>
> Has anyone tried pony's xlr bios? or have a link to it?
> or have suggestion for the best bios for shunted fe ??thanks


You can just move the power slider in MSI Afterburner or Precision X. As soon as you hit apply, you've undone what running the .bat file did. No need to reboot.


----------



## cluster71

Quote:


> Originally Posted by *Dasboogieman*
> 
> Get more cooling by putting it under water. That card will do 2126mhz easy with <40C, maybe even 6600mhz VRAM.


I agree. If you do not want to take the card apart, you can lower the temperature in the room, or move the PC to another location and test. I have worked a lot with my card; it was hard to get over 2062/11500 at the beginning. I tested different BIOSes, but it works when I keep the temperature below 40C (I also changed the paste and pads, but that is part of lowering the temperature).


----------



## kevindd992002

Quote:


> Originally Posted by *nrpeyton*
> 
> That makes sense then.
> 
> Remember G-SYNC forces the monitor to refresh at the FPS the game is rendering at. At it also works both ways--in the sense the driver knows 144 FPS is your maximum monitor refresh rate. So it won't try to render more than 144 FPS while G-SYNC is enabled. As that would put it OUT of sync.
> 
> So if your monitor refreshes at a maximum of 144 FPS then you're never going to see FPS beyond 144 while g-sync is enabled. At 1080p your card is powerful enough to do _*more*_ than 144 FPS. So that's why you're only seeing below 97% utilisation. And also why-- when you use DSR you're FPS goes down and your utilisation goes up.
> 
> It's not a bug. And nothing wrong. Completely normal behaviour.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hope I was some help-- and you understand it all now ;-)
> 
> And:
> It's been a while since I've done screen capturing. But you can get a fully unlocked version of FRAPS at a torrent site. And I think for in-game captures these days-- it's actually possible to use GeForce Experience. They've added the screen-capture ability recently I think.


Again, G-Sync will not cap your FPS to your refresh rate; that is V-Sync. As proof, I play Overwatch and CS:GO with G-Sync enabled and average around 200+ fps (a framerate greater than my monitor's refresh rate).

Do you have a G-Sync monitor yourself? You can bring this discussion to the Blur Busters forum and you'll understand what I'm saying.

The problem here is that the CPU is bottlenecking the GPU. I've read about this, and it seems that with some games, 88% CPU usage can already be considered a bottleneck.


----------



## ocCuS

Quote:


> Originally Posted by *cluster71*
> 
> I agree if you can lower the temperature in the room, or move the PC to another location and test. If you do not want to pick apart the card. I have worked a lot with my card. Hard to get over 2062/11500 at the beginning. Tested different bios, but it works when I keep the temperature below 40c (Has changed paste and pad but it is a part of lowering the temperature)


Thanks for the suggestion. I replaced the existing thermal paste with Thermal Grizzly Kryonaut and it went from around 60 to the low 50s under max load. Just bought Thermal Grizzly Conductonaut (liquid metal) and will test it in the coming days.


----------



## Psykoking

Quote:


> Originally Posted by *ocCuS*
> 
> Thanks for the suggestion. I replaced the existing thermalpaste to the grizzly kryonaut and it went from around 60 to the low 50s under max load. Just bought the Grizzly Conductonaut (liquid metal) and will test this these days.


Can really recommend it; I've got it on my AMP Extreme. When applying, make sure not to use too much: cover the die and the heatspreader on the cooler so surface tension can do its work before you put it back together. Be careful to remove every last drop of thermal paste before applying the LM, and do not, under any circumstances, let the LM touch anything made of aluminum. After spreading, use a cotton tip and go from the edge to the center (also on the cooler). This causes the LM to form a small, barely visible drop in the center, which helps prevent air bubbles from getting trapped inside, since they will be pressed out to the sides when you put everything back together.


----------



## lilchronic

Quote:


> Originally Posted by *nrpeyton*
> 
> I think Joedirt already done his bit.. it was released ages ago.. it's just we never got the BIOS editor so it's been there doing nothing for ages.
> 
> Should be at the tech-powerup website.
> 
> I know this for a fact. Because I accidently tried to use the certificates bypassed version when I first tried to flash my old 1080 BIOS last year.. and it never worked because the BIOS I was trying to flash was the XOC (which was officially signed by ASUS I.E it had a certificate).. And that version only works (and only needs to be used) by BIOS's that we've edited ourselves.
> 
> If you've successfully edited that file in your post.. Then you could have that BIOS flashed now. (as soon as you get back online).
> 
> Let us know if it works.. coz if it does then by god.. omg.. you've just cracked it lol!


The problem has always been not being able to flash an edited BIOS.
Quote:


> Originally Posted by *kevindd992002*
> 
> Again, gsync will not cap your fps to your refresh rate. That is vsync. As a proof, I play overwatch and csgo with gsync enabled and average at around 200+ fps (a framerate greater than my monitor's refresh rate).
> 
> Do you have a gsync monitor yourself? You can bring this discussion to the blurbusters forum and you'll understand what I'm saying.
> 
> The problem here is that the CPU is bottlenecking the GPU. I've read about this and it seems that with some games, 88% CPU usage can already be considered a bottleneck.


G-Sync is not active when you are above the refresh rate of your monitor. So if you have a 144Hz / 165Hz panel but are running above that frame rate, then G-Sync is off.

Also, the CPU could be @ 25% usage and still bottleneck: total CPU usage could be 25% with one core maxed out @ 90-100%.
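That effect is easy to see in numbers. A toy sketch (the per-core figures are invented for the example; on a live system something like psutil's `cpu_percent(percpu=True)` could supply real ones):

```python
# Toy numbers showing how average CPU usage can look low while one core
# (e.g. a game's main thread) is pegged. The per-core sample is invented
# for illustration.
per_core = [98.0, 12.0, 8.0, 6.0, 5.0, 7.0, 4.0, 10.0]  # hypothetical 8-thread sample

overall = sum(per_core) / len(per_core)  # the headline figure Task Manager shows
hottest = max(per_core)                  # the core that actually limits frame rate

print(f"overall CPU usage: {overall:.2f}%")  # 18.75% -> looks idle
print(f"hottest core:      {hottest:.1f}%")  # 98.0% -> the real bottleneck
```

The takeaway: always check per-core (or per-thread) load before ruling out a CPU bottleneck from the overall usage number.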


----------



## kevindd992002

Quote:


> Originally Posted by *lilchronic*
> 
> The problem has always been not being able to flash a edited bios.
> gsync is not on when you are above the refresh rate of you monitor. So if you have 144hz / 165Hz but are running above the frame rate then gsync is not on.
> 
> Also the cpu could be @ 25% usage and still bottleneck. Entire usage of cpu could be 25% but one core maxed out @90 -100%.


Exactly! This is what I've been telling nrpeyton since a few posts back.

And the 88% value I'm referring to is actually max thread usage, not total CPU usage.


----------



## Falkentyne

There already is a Pascal BIOS editor out, but you need a hardware programmer because NVflash won't flash the edited files. When someone makes an NVflash bypass (or hell, even a THIRD-PARTY PROGRAM LIKE FLASHROM through Linux), they're going to get a LOT of donations...

https://github.com/LaneLyng/MobilePascalTDPTweaker


----------



## KedarWolf

Afterburner beta 12 is out, updated OP to that. Remember to highlight the link, copy it, then paste to browser.

http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta12.rar


----------



## Falkentyne

Anyone tested using the -j parameter in NVflash?


----------



## Coopiklaani

Quote:


> Originally Posted by *kevindd992002*
> 
> Exactly! This is what I've been telling nrpeyton since a few posts back.
> 
> And the 88% value that I'm referring to is actually max thread usage not cpu usage.


A good example is some of the MMORPGs, like the one I'm playing at the moment.
Terrible FPS, low CPU and GPU usage. It turns out it can only utilise 3 threads max.


----------



## kevindd992002

Quote:


> Originally Posted by *Coopiklaani*
> 
> A good example is some of the MMORPGs, like the one I'm playing at the moment.
> Terrible FPS, low CPU and GPU usage. It turns out it can only utilise 3 threads max.


Right, I understand your case because one or more of the threads is maxed out. In my situation, NO THREAD is maxed out. Max thread usage is 88% and yet I still see only 70% GPU usage in GOW4.


----------



## cluster71

Quote:


> Originally Posted by *kevindd992002*
> 
> Right, I understand your case because one or
> more of the threads is maxed out. In my situation, NO THREAD is maxed out. Max thread usage is 88% and yet I still see only 70% GPU usage in GOW4.


Sometimes it can be worked around. I have a license for CPUBalance Pro https://bitsum.com/portfolio/cpubalance/ This PC doesn't have that many cores or threads (8), but you can redistribute load away from the overloaded threads, and sometimes it works well. There is a lot to read there if you haven't already.

And you may have already tried forcing "Threaded Optimization" ON in the Nvidia control panel.


----------



## PimpSkyline

Quote:


> Originally Posted by *KedarWolf*
> 
> Afterburner beta 12 is out, updated OP to that. Remember to highlight the link, copy it, then paste to browser.
> 
> http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta12.rar


Might wait till 4.4.0 _Final_ instead of Beta 12. Any objections?


----------



## Nico67

Quote:


> Originally Posted by *Psykoking*
> 
> Well I already tried to flash using the existing nvflash versions from Joe, but I cant get it to recognize my card....
> After I got my last two exams done on Wednesday I will look into this further.


Tried the same a few months back; we need a cert-bypassed version of a newer nvflash that recognises the flash ROMs on 1080 Tis.


----------



## feznz

I've got to give beta 12 a whirl, just because I can and because I've still been getting BSODs with beta 11.

I was playing with the voltages for the first time: I just adjusted the voltage slider and noticed the core clock goes from 2050MHz at stock volts to 2075MHz with nothing but that change.
I've been using offset rather than curve adjustment, as I just don't seem to gain anything from the curve.


----------



## Falkentyne

Quote:


> Originally Posted by *Nico67*
> 
> Tried the same a few months back, need a cert bypassed version of a newer nvflash that recognises the Flash roms on 1080Ti's


I'll ask again because no one replied last time:
has anyone tried the -J parameter for NVflash?


----------



## KedarWolf

Quote:


> Originally Posted by *Falkentyne*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nico67*
> 
> Tried the same a few months back, need a cert bypassed version of a newer nvflash that recognises the Flash roms on 1080Ti's
> 
> 
> 
> I'll ask again because no one replied last time:
> has anyone tried the -J parameter for NVflash?
Click to expand...

I have no idea what the -j parameter does, and Googling it finds nothing.

Never mind. I have no idea what it would do.

https://www.techinferno.com/index.php?/forums/topic/1687-guide-nvidia-vbios-flashing/


----------



## kevindd992002

Quote:


> Originally Posted by *cluster71*
> 
> Sometimes it's a bit of a fix. I have a license on CPUBalance Pro https://bitsum.com/portfolio/cpubalance/ On this PC there are not so many cores or threads (8). But you can distribute the load to other threads on those who are overloaded, sometimes it works well. There is a lot to read if you have not already done so.
> 
> And that "threaded optimization" in the nVidia control panel might have already tried to force ON


I don't know about that; I don't want to pay an extra fee to fix the problem, but I get where you're coming from.

I'm building a new rig with a 7700K anyway, so I'm expecting this to be solved. I was merely experimenting to see how a CPU bottleneck behaves.


----------



## Falkentyne

It's not listed in the command list, but it apparently "removes" all protections.
No one has tested whether it does anything for the certificate issue on Pascal (I bet not).

It might only have to do with write protection or something. The switch was buried in some other Nvidia thread on another forum.
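For anyone poking at these switches, it's worth backing up the current vBIOS before experimenting. Here's a hedged sketch of wrapping the command line in Python: `--save` is nvflash's standard backup switch and `-6` its usual ID-mismatch override, while `-j` is the undocumented switch discussed here, so treat its effect as unknown. `build_nvflash_cmd` is my own helper, not part of any tool:

```python
# Sketch: assemble nvflash command lines (and back up the vBIOS first).
# The -j switch is undocumented; it's included only as an opt-in experiment.
import subprocess

def build_nvflash_cmd(rom, save=False, override_mismatch=False, try_j=False):
    cmd = ["nvflash"]
    if try_j:
        cmd.append("-j")      # undocumented switch; effect unverified
    if override_mismatch:
        cmd.append("-6")      # override PCI subsystem ID mismatch
    if save:
        cmd.append("--save")  # dump the current vBIOS instead of flashing
    cmd.append(rom)
    return cmd

def run(cmd, dry_run=True):
    # Print by default so nothing gets flashed accidentally.
    print(" ".join(cmd))
    if not dry_run:
        subprocess.run(cmd, check=True)

run(build_nvflash_cmd("backup.rom", save=True))  # always back up first
run(build_nvflash_cmd("mod.rom", override_mismatch=True, try_j=True))
```

Obviously none of this gets past the certificate check on Pascal; it just keeps the backup step from being forgotten.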


----------



## KedarWolf

Quote:


> Originally Posted by *Falkentyne*
> 
> It's not listed in the command list but it apparently "removes" all protections.
> No one has tested whether it does anything for the certificate issue on Pascal (i bet not).
> 
> Might only have to do with write protected or something. The switch was buried in some other Nvidia thread on another forum.


It is in that command list, but with a pretty obscure description of what it does.


----------



## Cookybiscuit

Anyone tried replacing the thermal paste on an Aorus card? My temps and noise levels are great, especially after an undervolt, but I'd like to bring them down even further if possible. I'd just be using Arctic Silver as I don't want any trouble with warranty; it looks like it's just four screws holding the card to the cooler on these. I figure if I can delid and liquid-ultra a CPU without breaking it, this should be a piece of piss.

Worth it?


----------



## cluster71

Quote:


> Originally Posted by *kevindd992002*
> 
> I don't know about this but I don't want to pay an extra fee to fix the problem but I get where you're coming from.
> 
> I'm building a new rig with a 7700K anyway so I'm expecting this to be solved. I was merely experimenting how a CPU bottleneck is


When I started, I had 2 M.2 SSDs on my Z270 and I ran into many strange limitations: large variation in benchmark scores, high CPU load when copying downloads, high latencies, stability issues, monitor dropouts, etc. I read a lot about PCIe lane HSIO multiplexing. I tried to redistribute some resources with MSI_util, but how resources are distributed depends on the BIOS programming. Shared motherboard resources create problems.

I removed one M.2 SSD to free up resources and reassigned them with MSI_util, and that worked well. There has since been a new BIOS release that also works well; I don't know if they have solved all the PCIe resource problems yet.

Keep an eye on, and read up about, your motherboard.

The 7700K is very easy to overclock, especially if you delid it and gain a little margin on temperature.


----------



## cluster71

Quote:


> Originally Posted by *Cookybiscuit*
> 
> Anyone tried replacing the thermal paste on a Aorus card? My temps and noise levels are great, especially after an undervolt, but I'd like to bring them down even further if possible. I'd just be using Arctic Silver as I don't want any trouble with warranty, looks like it's just four screws holding the card to the cooler on these. I figure if I can delid and liquid ultra a CPU without breaking it this should be a piece of piss.
> 
> Worth it?


Most graphics cards seem to be quite poorly mounted; there is often no good contact between GPU and heatsink. So I think you should disassemble the card and check. If you then switch to better paste and pads, it is definitely worth it. I don't know how it works with the warranty (there may be a sticker), or whether you care.


----------



## kevindd992002

Quote:


> Originally Posted by *cluster71*
> 
> When I started, I had 2 M.2 SSDs for my Z270 and I experienced many strange limitations in the system. Large variation in benchmark score, high CPU load at download copy, high latencies, stability issues, monitor dropouts etc. I read a lot about Pcie lane HSIO multiplexing. Tried to redistribute some memory resources with MSI_util, but the distribution of resources depends on bios programming. I have shared MB resources, it creates problems.
> 
> I removed one M.2 SSD to free up resources and reassign with MSI_util. It worked well. Now there has been a new bios release that works well. Do not know if they solved some of the problems with pcie resources yet.
> 
> Keep an eye and read about your motherboard.
> 
> The 7700K is very easy to overclock, especially if you're Delid and get a little margin on the temperature


I'm still using a 2600K and waiting for my Z270 platform to arrive.


----------



## cluster71

Do you mean X299, or have you ordered something that's on its way home?


----------



## kevindd992002

Quote:


> Originally Posted by *cluster71*
> 
> Do you mean X299, or you've ordered something that's on it's way home?


No, I mean Z270. I'm going the 7700K route.


----------



## cluster71

Okay, I have food on the stove.

I usually run new games all the time, but in recent months I've only played Sniper Elite 3 and 4. I installed Sniper 3 yesterday (not Elite), played the prologue, and configured the PC. Hopefully it will be a sniper marathon tonight. There has been some bad criticism of Sniper 3; I hope the worst bugs are solved.


----------



## Unnatural

Quote:


> Originally Posted by *nrpeyton*
> 
> My next project: is to ADD THE MISSING THERMAL PADS.
> 
> This should help significantly reduce the temperatures of the memory chips closest to the VRM. (If we could get ALL GDDR5X memory chips at the same temp as ones on the I/O side then memory would definitely overclock faster. I actually PROVED this to be the case on my last card (my 1080 Classy). Where I was able to overclock with complete stability up to +925. I could even complete a bench at +1000 (albeit with some artefacts). I'm not talking rubbish here guys; in fact ere's an article at EVGA.com where I was awarded a ribbon for my work into researching a better pad arrangement on my previous card _(from stock instructions in manual):_
> https://forums.evga.com/Must-Have-changes-to-EK-manual-RE-pad-sizes-gt-For-780TI-Classy-Block-gt-1080-Classy-m2598540.aspx
> [/I]


Just to be clear: the link you posted here is just for reference; it doesn't mean those instructions apply to the 1080 Ti too, right? My apologies, I'm a bit drowsy right now, and I was actually about to order an assortment of various-thickness pads before realizing something wasn't quite right.


----------



## Nico67

Quote:


> Originally Posted by *Falkentyne*
> 
> I'll ask again because no one replied last time:
> has anyone tried the -J parameter for NVflash?


Haven't tried that, but I've been waiting to see if I get any response in Joe Dirt's thread about an updated bypass nvflash.


----------



## ocCuS

A little update on my 1080 ti Hall of Fame:

I replaced the thermal compound again. Now the card has liquid metal. Temperature results are the following:

61°C under full load with stock thermal compound
54°C under full load with the kryonaut thermal compound
49°C under full load with the liquid metal
(card on air, fans at 100% in benchmarks)

Voltage limit is at 1.093V, and on the memory I get +1000MHz completely stable (6496MHz).

The core results are the following:
stock thermal compound: 1062MHz max stable in every benchmark
Kryonaut: 1075MHz max stable in every benchmark
liquid metal: 1088MHz max stable in every benchmark

Power consumption still never goes over 50%, so I think only water and lower temperatures will take this card to and over 2100MHz.

And I still can't upload the BIOS directly here as an attachment because of the "AJAX response unable to be parsed as valid JSON object" error. I get this message on every machine when I try to upload the BIOS alone or in a .rar, with antivirus on and off.
And for those who didn't see it, here is the link to download the Hall of Fame BIOS:

(Beware of flashing it to your card. Power limit is at 128 and the card has a 12+3-phase VRM and three 8-pin PCIe power connectors. Some cards already had problems with this bios!)

https://www.techpowerup.com/vgabios/193134/193134


----------



## KedarWolf

Quote:


> Originally Posted by *ocCuS*
> 
> A little update on my 1080 ti Hall of Fame:
> 
> I replaced the thermal compound again. Now the card has liquid metal. Temperature results are the following:
> 
> 61°C under full load with stock thermal compound
> 54°C under full load with the kryonaut thermal compound
> 49°C under full load with the liquid metal
> (card on air, fans at 100% in benchmarks)
> 
> Voltage limit is at 1.093V and on the memory I get the +1000mhz completely stable (6496mhz).
> 
> The core results are the following:
> stock thermal compound: 1062 mhz max stable in every benchmark
> kryonaut: 1075 mhz max stable in every benchmark
> liquid metal: 1088mhz max stable in every benchmark
> 
> Power consumption still never goes over 50%, so I think only water and lower temperatures will take this card to and over 2100mhz.
> 
> And still I can´t upload the bios directly here via attachement because of the "AJAX response unable to be parsed as valid JSON object". On every mashine I get this message when I try to upload the bios alone or in .rar, with anti virus on and off.
> And for those who didn´t saw it, here is the link to download the hall of fame bios:
> 
> (Beware of flashing it to your card. Power limit is at 128 and the card has a 12+3-phase VRM and three 8-pin PCIe power connectors. Some cards already had problems with this bios!)
> 
> https://www.techpowerup.com/vgabios/193134/193134


That BIOS won't work on any card but HOF cards; different voltage regulators. It bricked my card: I had to pop in my spare GPU and flash it back, and the PC wouldn't even boot with just the FE in after flashing the HOF BIOS.


----------



## TheBoom

Quote:


> Originally Posted by *nrpeyton*
> 
> Okay will be good to know how you get on ;-)
> 
> You can just move the power slider on MSI Afterburner or Precision X. Soon as you hit apply; you've undid what you did by running the .batch file. No need to reboot


So that's why the Arctic Storm BIOS kept getting throttled at 352W on my card. I didn't know you shouldn't move the power limit slider after running the power limit .bat file.

Quote:


> Originally Posted by *kevindd992002*
> 
> Again, gsync will not cap your fps to your refresh rate. That is vsync. As a proof, I play overwatch and csgo with gsync enabled and average at around 200+ fps (a framerate greater than my monitor's refresh rate).
> 
> Do you have a gsync monitor yourself? You can bring this discussion to the blurbusters forum and you'll understand what I'm saying.
> 
> The problem here is that the CPU is bottlenecking the GPU. I've read about this and it seems that with some games, 88% CPU usage can already be considered a bottleneck.


The thing is, with gsync enabled, once you exceed your monitor's refresh rate it forces vsync on even if you have it disabled. I spent hours trying to figure out why I couldn't get past 160-170 fps in Overwatch, and it turns out I had to completely disable gsync in the control panel. Or use ULMB or Fast Sync instead. Wouldn't hurt to try that.

I don't know about other games, but the bottleneck for Overwatch at high fps is memory speed. It might apply to other games at extremely high fps as well. Even at 3200MHz I occasionally get bottlenecked to 250 fps with a frame limit of 299 fps: 90% CPU usage on a 6700K with no thread maxed out, and GPU usage at 80-90%.

It's highly likely either the forced sync or memory holding you back, since you don't have any thread maxed out.


----------



## GraphicsWhore

Gave it 1.093V and I think I'm about maxed out here at 2101/6003. This is a water-cooled EVGA FE with EK TXP block and backplate. Previous high temp was 46C; this took it to over 50.



Separate link for the pic, since the numbers are hard to read when uploaded to OC.net: https://abload.de/img/2wtsqe.jpg

I know I've seen moderately higher scores with lower voltage but can't complain with 11k+ on graphics.


----------



## cluster71

Yes, this thread is basically about the GTX 1080 Ti; we shouldn't focus too much on the rest. Good score, it works as it should.


----------



## kevindd992002

Quote:


> Originally Posted by *TheBoom*
> 
> The thing is with gsync enabled once you exceed your monitors refreshrate it forces vsync on even if you have it disabled. I spent hours trying to figure out why I couldn't get past 160-170 fps in overwatch and it turns out I had to completely disable g sync in control panel. Or use ULMB or Fast Sync instead. Wouldn't hurt to try that.
> 
> I don't know about other games but the bottleneck for Overwatch at high fps is memory speeds. It might apply to other games at extremely high fps as well. Even at 3200 mhz I occcasionally get bottlenecked to 250 fps with a frame limit of 299 fps. 90% CPU usage on a 6700k with no thread being maxed out and GPU usage from 80-90%.
> 
> Highly likely it's either the forced sync or memory holding you back since you don't have any thread maxed out.


If that's the case, then there's something wrong with your gsync implementation. Gsync will not force vsync on even if it's disabled. Like I said in my previous post, I can get 200+ fps on my 144Hz gsync monitor in Overwatch. And I already tried disabling gsync completely as a test for my problematic GOW4 game to no avail (it does nothing).

I agree that it could potentially be either the memory or CPU holding me back but definitely not gsync.


----------



## TheBoom

Quote:


> Originally Posted by *kevindd992002*
> 
> If that's the case, then there's something wrong with your gsync implementation. Gsync will not force vsync on even if it's disabled. Like I said in my previous post, I can get 200+ fps on my 144Hz gsync monitor in Overwatch. And I already tried disabling gsync completely as a test for my problematic GOW4 game to no avail (it does nothing).
> 
> I agree that it could potentially be either the memory or CPU holding me back but definitely not gsync.


Are you saying when it's enabled, or even when it's disabled? Did you force-disable vsync in nvp or nvcp?

If we're on the same drivers, which I assume we are, then it shouldn't be the case. Either way, you do know that once you exceed your monitor's refresh rate gsync is disabled, right?

I've tested this a few times and it's always the case. I don't know if it's an Overwatch issue or a driver-side bug, but it always happens without fail when I install new drivers, and I have to disable gsync in nvp again.

I don't mind though; there's virtually no tearing or stuttering at 299 fps.

Edit: there's also the possibility of being bottlenecked by cache.


----------



## Slackaveli

Quote:


> Originally Posted by *Cookybiscuit*
> 
> Anyone tried replacing the thermal paste on a Aorus card? My temps and noise levels are great, especially after an undervolt, but I'd like to bring them down even further if possible. I'd just be using Arctic Silver as I don't want any trouble with warranty, looks like it's just four screws holding the card to the cooler on these. I figure if I can delid and liquid ultra a CPU without breaking it this should be a piece of piss.
> 
> Worth it?


Yes, I improved 7-8°C with Thermal Grizzly.


----------



## ocCuS

Quote:


> Originally Posted by *Cookybiscuit*
> 
> Anyone tried replacing the thermal paste on a Aorus card? My temps and noise levels are great, especially after an undervolt, but I'd like to bring them down even further if possible. I'd just be using Arctic Silver as I don't want any trouble with warranty, looks like it's just four screws holding the card to the cooler on these. I figure if I can delid and liquid ultra a CPU without breaking it this should be a piece of piss.
> 
> Worth it?


I'd say it's definitely worth it, too.
Stock, my card was 61°C under full load. With Grizzly Kryonaut it went down to 54°C, and now with liquid metal to 49°C. A 12°C drop is definitely worth a thermal compound change. These cards love lower temperatures, and you'll see improved clock speeds.


----------



## TheBoom

Is there a completely safe way to apply LM on the die and the cooler without the possibility of bricking the card?

I'm really considering shipping some LM from the US to apply on the CPU and GPU, but there have already been at least 2 people here who killed their cards doing LM. Kinda scary, to be honest, on a $1200 card.


----------



## kevindd992002

Quote:


> Originally Posted by *TheBoom*
> 
> Are you saying if its disabled or even if its disabled? Did you force disable vsync in nvp or nvcp?
> 
> If we're on the same drivers which I assume we are then it shouldn't be the case. Either way you do know that once you exceed your monitor's refresh rate gsync is disabled right?
> 
> I've tested this a few times and its always the case. I don't know if its a Overwatch issue or driver side bug but it always happens without fail when I install new drivers and I have to disable gsync in nvp again.
> 
> Don't mind though there's virtually no tearing or stuttering at 299 fps.
> 
> Edit : There's also the possibility of being bottlenecked by cache.


I meant "gsync will not force vsync to turn on if it's disabled". Vsync is disabled in both nvcp and in-game.

Yeah, I don't think it's the drivers. Yes, gsync is useless when fps > Hz. But that's beyond my point. My issue is the bottlenecking of my 1080Ti even when the CPU max thread usage is just 88%. Then @nrpeyton mentioned about gsync which led to this conversation. My point with gsync is that it doesn't affect your framerates in anyway.

When gsync was first announced, the first driver that supported it (forgot the specific version) did not allow the users to disable vsync. So it looked like that when gsync is enabled, the framerates are capped to the monitor's refresh rate. Then Nvidia, with the succeeding drivers, allowed everyone to disable vsync while gsync is in operation. This means that gsync will work when fps < Hz but will be disabled otherwise.

Your case is not normal then. I'll try to upload a video showing you what I have in the next couple of days.

Yeah, either way I'll be upgrading soon anyway so a little patience from my side will go a long way
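To summarise the behaviour being debated, here's a toy model in Python (illustrative only; this is not how the driver is implemented, and the function is my own invention):

```python
# Toy model of the gsync/vsync interaction described above: gsync only
# matters below the refresh rate; above it, the outcome depends on vsync
# (or an explicit fps limiter), not on gsync.

def displayed_fps(render_fps, refresh_hz, gsync=True, vsync=False, fps_cap=None):
    fps = render_fps if fps_cap is None else min(render_fps, fps_cap)
    if gsync and fps <= refresh_hz:
        return fps          # monitor follows the GPU: no cap, no tearing
    if vsync:
        return refresh_hz   # vsync is what caps at the refresh rate
    return fps              # uncapped: tearing possible above refresh

print(displayed_fps(200, 144, gsync=True, vsync=False))  # 200: gsync doesn't cap
print(displayed_fps(200, 144, gsync=True, vsync=True))   # 144: that's vsync
print(displayed_fps(100, 144, gsync=True, vsync=False))  # 100: inside gsync range
```

In this model gsync never caps anything; only vsync or an explicit limiter does, which matches the behaviour described above.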


----------



## Psykoking

Quote:


> Originally Posted by *TheBoom*
> 
> Is there a completely safe way to apply LM on the die and the cooler without the possibility of bricking the card?
> 
> I'm really considering shipping some LM from the US to apply on the CPU and GPU but there have already been at least 2 people here whom killed their cards doing LM. Kinda scary to be honest on a $1200 card.


It depends on what you mean by completely safe. LM is conductive, but unless you use too much or drop it on the PCB it is relatively safe. You should always be careful when taking the card apart after you've applied it, to prevent droplets dripping off. But that's usually not an issue if, again, you have not used too much of it.


----------



## ALSTER868

Quote:


> Originally Posted by *TheBoom*
> 
> Is there a completely safe way to apply LM on the die and the cooler without the possibility of bricking the card?
> 
> I'm really considering shipping some LM from the US to apply on the CPU and GPU but there have already been at least 2 people here whom killed their cards doing LM. Kinda scary to be honest on a $1200 card.


The only safe way is to be very cautious when applying the LM to the die, in order to avoid spilling it onto the PCB or the SMDs around the die itself. Better to isolate the small components (SMDs) before applying LM, and try to make the layer as thin as possible.
But I would stick to Kryonaut, and I did.


----------



## propeldragon

I got my 1080 Ti about a week ago; I had a 980 Ti previously. I am getting fps drops in every game I play (Rainbow Six Siege, GTA V, Witcher 3, Titanfall 2). I play at 1440p 60Hz. Sometimes it drops to 59, other times it goes down to 40. I'm running an i7 3820 at 4.3GHz, which shouldn't be bottlenecking; according to my usage, it's not. It does it at stock clocks too. I'm not sure what's going on. I should add that when the fps drops, I can see on the graph that my power drops to 28% and my GPU usage to 1%. I find that odd. Here's a graph of me playing Witcher 3 for a minute.

gtx1080problem.JPG 98k .JPG file


----------



## smithsrt8

Quote:


> Originally Posted by *propeldragon*
> 
> I got my 1080 ti about a week or so ago and I had the 980 ti previously. I am getting fps drops in every game I play (rainbow six siege, gtav, witcher 3, titanfall 2). I play at 1440p 60hz. It will drop to 59 sometimes and others it will go down to 40. I'm running a i7 3820 4.3ghz which shouldnt be bottlenecking, according to my usage its not. It will do it at stock clocks. I'm not sure whats going on? I should add when it drops fps I can see on the graph my power drops to 28% and my gpu usage to 1. I find that odd. Heres a picture of a graph of me playing witcher 3 for a minute.
> 
> gtx1080problem.JPG 98k .JPG file


If I had to guess, I would check your power supply. It could be that it's starting to fail and the card swap is exposing the symptoms. But I know 980s can draw some power too.


----------



## jprovido

I can never get my GTX 1080 Ti SC2 Hybrid stable even at 2000MHz. I'm at 1950MHz right now at stock voltage, because anything higher causes random crashes in PUBG. Any tips? Even with max voltage I'm having trouble with stability. My temps are great and never go above 50°C.


----------



## Benny89

Finished my first water loop.

Result: 2 hours of playing Witcher 3 at 1.6V, 2063MHz core with 6009MHz memory - max 38-39°C.

Ambient temp 21C.



Now I will try to chase that 2100


----------



## buellersdayoff

Quote:


> Originally Posted by *TheBoom*
> 
> Is there a completely safe way to apply LM on the die and the cooler without the possibility of bricking the card?
> 
> I'm really considering shipping some LM from the US to apply on the CPU and GPU but there have already been at least 2 people here whom killed their cards doing LM. Kinda scary to be honest on a $1200 card.


Just replaced stock paste with kryonaut and gave me approx. 4c, I still have CLU left over from my cpu delid and I'm not going to use it for a couple of extra degrees. I game not bench though and I don't believe the risk is worth it


----------



## buellersdayoff

Quote:


> Originally Posted by *Benny89*
> 
> Finished my first water loop.
> 
> Resulta: 2 hours of playing Witcher 3 at 1.6V 2063 with 6009Mhz memory - max 38-39C.
> 
> Ambient temp 21C.
> 
> 
> 
> Now I will try to chase that 2100


Very tidy, sweet build!


----------



## s1rrah

Quote:


> Originally Posted by *jprovido*
> 
> I can never get my gtx 1080 Ti SC2 Hybrid stable even at 2000mhz. I'm at 1950mhz right now stock voltage just because it's causing random crashes on pubg. any tips? even with max voltage I'm having trouble with stability. my temps are great never goes above 50+ degrees.


My MSI isn't a world-class overclocker either ... the best I can get is 2025MHz stable with a curve voltage setting ...

That being said?

Take it from me: don't obsess (unless you're just into that, which is also okay) ... 1950MHz vs 2000+MHz isn't even going to make a 1/2 FPS difference ... hell, even 1950MHz vs 2100+MHz isn't really going to make any noticeable, perceptible in-game difference ...

Love my MSI Seahawk at 2025MHz ... single-card 1440p gaming has never been so good ... it amazes me every time I play Witcher 3 or any other heavy title ...


----------



## DerComissar

Quote:


> Originally Posted by *Benny89*
> 
> Finished my first water loop.
> 
> Resulta: 2 hours of playing Witcher 3 at 1.6V 2063 with 6009Mhz memory - max 38-39C.
> 
> Ambient temp 21C.
> 
> 
> 
> Now I will try to chase that 2100


Yeah! GPU load temps in the 30s!

The loop and the build look great.
I figure you should get some nice low temperatures on the 5775C as well!


----------



## TheBoom

Quote:


> Originally Posted by *Psykoking*
> 
> depends what you understand under completely safe. LM is conductive and unless you use too much or drop it on the pcb it is relatively safe. You should always be carefull when taking the card apart after you've applied it, to prevent droplets dripping off. However this is usually not the case if you, again, have not used too much of it.


Quote:


> Originally Posted by *ALSTER868*
> 
> The only safe way is to be very cautious when applying the LM to the die in order to avoid spilling it onto the PCB or SMDs around the die itself. Better isolate the small components (SMDs) before applying LM, and try to make the layer as thin as possible.
> But I would be sticking to Kryonaut, and I did


What I'm more worried about is the part where you have to apply it to the heatsink or copper block as well. If you get it slightly off, or covering more than the die itself, when you put it back on the card it might touch the SMDs or the PCB itself?

It would be easier if you could apply it like regular paste, just on the die itself.


----------



## TheBoom

Quote:


> Originally Posted by *kevindd992002*
> 
> I meant "gsync will not force vsync to turn on if it's disabled". Vsync is disabled in both nvcp and in-game.
> 
> Yeah, I don't think it's the drivers. Yes, gsync is useless when fps > Hz. But that's beyond my point. My issue is the bottlenecking of my 1080Ti even when the CPU max thread usage is just 88%. Then @nrpeyton mentioned about gsync which led to this conversation. My point with gsync is that it doesn't affect your framerates in anyway.
> 
> When gsync was first announced, the first driver that supported it (forgot the specific version) did not allow the users to disable vsync. So it looked like that when gsync is enabled, the framerates are capped to the monitor's refresh rate. Then Nvidia, with the succeeding drivers, allowed everyone to disable vsync while gsync is in operation. This means that gsync will work when fps < Hz but will be disabled otherwise.
> 
> Your case is not normal then. I'll try to upload a video showing you what I have in the next couple of days.
> 
> Yeah, either way I'll be upgrading soon anyway so a little patience from my side will go a long way


No, what I meant was: when you have gsync enabled and fps exceeds the refresh rate, vsync will kick in? Or is that no longer the case? Or do you have to force-disable vsync for this not to happen?


----------



## kevindd992002

Quote:


> Originally Posted by *TheBoom*
> 
> No what I meant was when you have gsync enabled and when fps exceeds rf, vsync will kick in? Or is that no longer the case? Or do you have to force disable vsync for this not to happen.


Vsync will kick in if it is enabled either in the in-game settings or in NVCP (global or program-specific settings).

It's really simple: treat gsync as separate from vsync and all the confusion will go away.

Gsync doesn't do anything to frame rates. It is a monitor technology that varies the refresh rate. This is exactly why the hardware is embedded in the monitor itself: it only cares about matching the refresh rate to the fps, not the other way around.


----------



## TheBoom

Quote:


> Originally Posted by *kevindd992002*
> 
> Vsync will kick in if it is enabled in either the in-game settings or nvcp (global or program-specific settings).
> 
> It's really simple. Treat gsync as separate from gsync and all confusions will go away
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Gsync will not do anything with framerates. It is a monitor technology that varies rf. This is exactly the reason why the hardware is embedded in the monitor themselves because it only cares about matching rf to fps, not the other way around.


Lol, I know exactly how it works. But there are many claims going around that with gsync enabled and fps > refresh rate, vsync is automatically enabled even if you have it (vsync) disabled in-game.


----------



## kevindd992002

Quote:


> Originally Posted by *TheBoom*
> 
> Lol I know exactly how it works. But there are many claims going around that with gsync enabled and fps>rf vsync is automatically enabled even if you have it (vsync) disabled in game.


Then better direct them to blurbusters for clarification. I never read about those situations myself. Gsync works as expected in my case and in the case of the developers doing extensive tests in blurbusters.


----------



## 12Cores

This is my first Nvidia card. Is there some way to turn off GPU Boost 3.0 or force the card to run at max performance all the time? Secondly, does enabling "Prefer maximum performance" do anything?


----------



## ZealotKi11er

Quote:


> Originally Posted by *12Cores*
> 
> This is my first Nvidia card. Is there some way to turn off GPU Boost 3.0 or force the card to run at max performance all the time? Secondly, does enabling "Prefer maximum performance" do anything?


You are stuck with GPU Boost 3.0.
I do not think it does much but better to have it on.


----------



## propeldragon

Quote:


> Originally Posted by *smithsrt8*
> 
> If I had to guess I would check your power supply. It could be its starting to fail and the switch is causing the signs? But I know 980's can draw some power too.


I have another power supply sitting around; I'll try that and see if anything changes. Also, would swapping the PCIe cables around do anything?
Thanks


----------



## MrBeer

Just an update: I got two EVGA GTX 1080 Ti SC Black cards with the Hybrid water cooler, running in SLI. It looks like I can do 2012MHz core and 5808MHz memory at stock voltage; if I increase the voltage I get crashing. My temps are 40-48C on the top card and 35-40C on the bottom card at 100% load; most games sit around the 43C mark.


----------



## TheBoom

Quote:


> Originally Posted by *kevindd992002*
> 
> Then better direct them to blurbusters for clarification. I never read about those situations myself. Gsync works as expected in my case and in the case of the developers doing extensive tests in blurbusters.


https://forums.blurbusters.com/viewtopic.php?f=5&t=52

Read the quoted post by Mark.

This is what I meant: not exactly vsync turning on, but gsync capping frames to the monitor's refresh rate the same way vsync does.


----------



## kevindd992002

Quote:


> Originally Posted by *TheBoom*
> 
> https://forums.blurbusters.com/viewtopic.php?f=5&t=52
> 
> Read the quoted post by Mark.
> 
> This is what I meant. Not exactly vsync turning on but gsync capping frames to monitor refresh the same way vsync does.


Can you please direct me to the specific post you're referring to? I can't find a post by anyone named "Mark".


----------



## Coree

Hello guys, I'm gonna show you the ugliest 1080 Ti here, with a nice ghetto mod that doesn't void your blower card's warranty. It reduced my temps by 20C and the noise is very quiet: before it was 85C with a 1900MHz core, now it's 65C with a 1949MHz core. The plus side is that you can keep the stock VRM + VRAM heatplate!

I had an Accelero III with broken fans, so I replaced them with 3 new ones. The second thing, which is a must, is a 25x25mm copper shim, 2mm thick.
I had to bend the Accelero a bit because the stock cooler's fins interfered with it (right corner of the PCB).
My card is a 4-slot behemoth now!


----------



## itssladenlol

Quote:


> Originally Posted by *jprovido*
> 
> I can never get my gtx 1080 Ti SC2 Hybrid stable even at 2000mhz. I'm at 1950mhz right now stock voltage just because it's causing random crashes on pubg. any tips? even with max voltage I'm having trouble with stability. my temps are great never goes above 50+ degrees.


I undervolted my 1080 Ti SC2 Hybrid to 1.000V @ 2026MHz and it runs rock-stable through 10-hour PUBG sessions.
Seems you have lost the silicon lottery.


----------



## ttnuagmada

Quote:


> Originally Posted by *Iceman2733*
> 
> For the guys running two 1080ti what is your power draw? I have read a lot of conflicting information and I need to see if some of you can get me some correct information if possible.
> 
> I recently picked up 2ea 1080ti FTW3
> 
> My PC is as belowIntel i7-6850k @ 4.4ghz 1.340v
> Asus Rampage V Edition 10
> Corsair Dominator Platinum 16gig 3200mhz
> 2x EVGA 1080 FTW in SLI EK Blocked
> Samsung 950 PRO M.2 NVME
> Samsung 850
> WD Black 1tb
> EVGA 1000 P2
> Dual loops with D5 pumps
> 13 fans
> 
> I am trying to figure out if my 1000watt PSU will be able to handle all of this, I don't think it will be able too. However with all the wonders of mining all the freaking PSU above 1000 are all sold out everywhere.


Sorry for the late response: I've been running a [email protected]@1.4v and two 1080 Tis. With a solid GPU OC (2050/6000-ish) my system pulls a little over 900W at the wall. I would think an OC'd HEDT CPU plus two cards would probably be pushing it even on my AX1200i, and definitely with a 1000W unit.

I've been debating going with a Skylake-X, but I think i'll get a 1500w unit if i do.
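For anyone doing the same PSU math, here's a back-of-the-envelope sketch. The per-component wattages and the 0.92 efficiency figure below are illustrative assumptions, not measurements from this thread; check your PSU's own efficiency curve:

```python
def wall_draw_watts(dc_load_w, efficiency=0.92):
    """Convert an estimated DC load into at-the-wall draw for a given
    PSU efficiency (0.92 is a rough mid-load figure for a high-end
    unit; an assumption, not a spec)."""
    return dc_load_w / efficiency

# Hypothetical budget for an OC'd HEDT CPU plus two OC'd 1080 Tis:
cpu, gpus, rest = 200, 2 * 300, 100   # watts DC, illustrative only
dc_total = cpu + gpus + rest
print(dc_total)                          # 900
print(round(wall_draw_watts(dc_total)))  # 978
```

Note the distinction: the PSU's rating (e.g. 1000W) is DC output capacity, while a wall meter shows the higher AC draw, so ~900W at the wall is roughly 830W of DC load against the rating.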


----------



## MrTOOSHORT

1080TI Strix under an EK block installed this past weekend:



*http://www.3dmark.com/3dm/20856003*


----------



## TheBoom

Quote:


> Originally Posted by *kevindd992002*
> 
> Can you please direct me to the specific post you're referring to? I can't find a post by anyone named "Mark".


The 2nd poster quoted the post by Mark Rejhon.


----------



## kevindd992002

Quote:


> Originally Posted by *TheBoom*
> 
> 2nd poster quoted the post by Mark Rejhon.


What Mark was explaining there is mostly about input lag. What he means by "gsync framerate cap" is that gsync only works up to the refresh rate. If you read it again, he's mostly comparing the input lag between gsync+vsync on and gsync+vsync off, and he's talking about framerates greater than the refresh rate even when gsync is enabled.

I'm sorry I can't explain it in a more technical level like they do but maybe let's bring over this thread:

http://forums.blurbusters.com/viewtopic.php?f=5&t=3441

We're already too much off-topic here.


----------



## TheBoom

Quote:


> Originally Posted by *kevindd992002*
> 
> What Mark was explaining there is mostly about input lag. What he means by "gsync framerate cap" is that gsync only works up to the refresh rate. If you read it again, he's mostly comparing the input lag between gsync+vsync on and gsync+vsync off, and he's talking about framerates greater than the refresh rate even when gsync is enabled.
> 
> I'm sorry I can't explain it in a more technical level like they do but maybe let's bring over this thread:
> 
> http://forums.blurbusters.com/viewtopic.php?f=5&t=3441
> 
> We're already too much off-topic here.


Well it's still kinda on topic since its Nvidia and driver/performance related. Anyway I think you missed this part. Which is what I was referring to.

"Above 144fps, the input lag diverges between GSYNC and VSYNC OFF only because GSYNC has a *144fps framerate cap*. Once you hit 144fps, it has to finally start waiting for the monitor to finish the previous refresh (much like waiting for VSYNC), so now it behaves like 144fps=144Hz VSYNC ON."


----------



## KedarWolf

Quote:


> Originally Posted by *TheBoom*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kevindd992002*
> 
> What Mark was explaining there is mostly about input lag. What he means by "gsync framerate cap" is that gsync only works up to the refresh rate. If you read it again, he's mostly comparing the input lag between gsync+vsync on and gsync+vsync off, and he's talking about framerates greater than the refresh rate even when gsync is enabled.
> 
> I'm sorry I can't explain it in a more technical level like they do but maybe let's bring over this thread:
> 
> http://forums.blurbusters.com/viewtopic.php?f=5&t=3441
> 
> We're already too much off-topic here.
> 
> 
> 
> Well it's still kinda on topic since its Nvidia and driver/performance related. Anyway I think you missed this part. Which is what I was referring to.
> 
> "Above 144fps, the input lag diverges between GSYNC and VSYNC OFF only because GSYNC has a *144fps framerate cap*. Once you hit 144fps, it has to finally start waiting for the monitor to finish the previous refresh (much like waiting for VSYNC), so now it behaves like 144fps=144Hz VSYNC ON."
Click to expand...

Stuff off topic is okay if it doesn't drag on for long periods of time and is somehow 1080 Ti related.

However, this has dragged on for a bit. Could you take it to the G-Sync thread as suggested, please?









Thanks.


----------



## kevindd992002

Quote:


> Originally Posted by *TheBoom*
> 
> Well it's still kinda on topic since its Nvidia and driver/performance related. Anyway I think you missed this part. Which is what I was referring to.
> 
> "Above 144fps, the input lag diverges between GSYNC and VSYNC OFF only because GSYNC has a *144fps framerate cap*. Once you hit 144fps, it has to finally start waiting for the monitor to finish the previous refresh (much like waiting for VSYNC), so now it behaves like 144fps=144Hz VSYNC ON."


No, I didn't miss that. That's exactly what I was referring to when I said "what he means by '*gsync framerate cap*' is that gsync only works up to the refresh rate." He's basically explaining how input lag behaves when your fps=Hz: it behaves as if VSYNC were ON. *He is not saying that GSYNC caps the fps to the refresh rate.* You can directly ask him if you don't believe me. Let me ping him.

EDIT: I agree that this is becoming too much off-topic in this thread. On second thought, please post in the thread that I've linked to so that developers there can explain to you better. I don't know what more to say to you regarding this topic.


----------



## cluster71

Excuse me
I have not read in detail your discussions about G-sync. Everyone has their own view.
But stumbled upon these pictures yesterday, which I think sums up quite well. Do not remember the source, but maybe you can figure it out.
(Maybe that was from here https://forums.geforce.com/default/topic/965224/gsync-not-working-100-/?offset=117)

It was consistent with my own experiences in recent years. G-sync ON + V-sync ON (nVidia Control Panel(Higher priority)) + Limit Game if required.

I've always thought it's pointless to let the Graphics Card go 100% and >50% of the pictures are thrown away if you run a game that does not require that much, then it's better with a limit on the top.




LEDs can be fun, but doesn't it start to get to be too much?
http://www.guru3d.com/news-story/thermaltake-launches-toughpower-irgb-plus-1250w-titanium.html


----------



## sew333

Sometimes when I launch a benchmark there is no thermal throttling like usual. Temps stay at 83-84C but GPU-Z doesn't report thermal throttling and the clocks are higher; only the PWR limit flag shows in GPU-Z. Any ideas, is this normal? More often, when I run a game or benchmark at 83-84C I do see thermal throttling and lower clocks.

But sometimes it's 83-84C and no throttling report from GPU-Z.

This is rare. Any ideas? I haven't installed EVGA Precision or Afterburner, so no temp target or anything like that has been set. All stock. GTX 1080 Ti FE.


----------



## cluster71

Pascal's temperature compensation is difficult to fully control. Try to get the temp down to 40-50C if you can.



I'm holding 40C on air, but my PC is in the basement at 18-20C.
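Pascal's temperature compensation can be roughly approximated as shedding one clock bin every few degrees above a threshold. The numbers below (40C threshold, 5C steps, ~13MHz bins) are illustrative assumptions, not vendor specs, and actual behaviour varies per card and BIOS:

```python
def boost_clock_mhz(base_boost, temp_c, start_c=40, step_c=5, bin_mhz=13):
    """Rough model of Pascal temperature compensation: the boost clock
    drops by one ~13 MHz bin for every few degrees above a threshold.
    All parameters here are illustrative guesses, not measured values."""
    if temp_c <= start_c:
        return base_boost
    bins = (temp_c - start_c) // step_c
    return base_boost - bins * bin_mhz

assert boost_clock_mhz(2000, 40) == 2000            # cool card keeps full boost
assert boost_clock_mhz(2000, 60) == 2000 - 4 * 13   # 4 bins shed by 60C
```

This is why chasing 40-50C pays off: under this model a card at 40C holds every bin that an 80C card has already given back.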


----------



## kevindd992002

Quote:


> Originally Posted by *cluster71*
> 
> Excuse me
> I have not read in detail your discussions about G-sync. Everyone has their own view.
> But stumbled upon these pictures yesterday, which I think sums up quite well. Do not remember the source, but maybe you can figure it out.
> (Maybe that was from here https://forums.geforce.com/default/topic/965224/gsync-not-working-100-/?offset=117)
> 
> It was consistent with my own experiences in recent years. G-sync ON + V-sync ON (nVidia Control Panel(Higher priority)) + Limit Game if required.
> 
> I've always thought it's pointless to let the Graphics Card go 100% and >50% of the pictures are thrown away if you run a game that does not require that much, then it's better with a limit on the top.
> 
> 
> 
> 
> LEDs can be fun, but doesn't it start to get to be too much?
> http://www.guru3d.com/news-story/thermaltake-launches-toughpower-irgb-plus-1250w-titanium.html


That diagram is from blurbusters and kinda explains it all.


----------



## cluster71

Okay, I'll read it then. I'll be changing a few things on my PC, disconnecting soon...


----------



## cluster71

Quote:


> Originally Posted by *kevindd992002*
> 
> That diagram is from blurbusters and kinda explains it all.


Was from the same person jorimt


----------



## TheBoom

Quote:


> Originally Posted by *kevindd992002*
> 
> No, I didn't miss that. That's exactly what I was referring to when I said "what he means by '*gsync framerate cap*' is that gsync only works up to the refresh rate." He's basically explaining how input lag behaves when your fps=Hz: it behaves as if VSYNC were ON. *He is not saying that GSYNC caps the fps to the refresh rate.* You can directly ask him if you don't believe me. Let me ping him.
> 
> EDIT: I agree that this is becoming too much off-topic in this thread. On second thought, please post in the thread that I've linked to so that developers there can explain to you better. I don't know what more to say to you regarding this topic.


Never mind. I can consistently reproduce this issue in Overwatch if I have regular Gsync on, as opposed to ULMB or completely disabled. Maybe a bug with my drivers, maybe not. You have your opinion and I'll have mine; let's end it here.

Quote:


> Originally Posted by *KedarWolf*
> 
> Stuff off topic is okay if it doesn't drag on for long periods of time and is somehow 1080 Ti related.
> 
> However, this has dragged on for a bit. Could you take it to the G-Sync thread as suggested, please?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks.


Back on topic: Kedar, does your card with the Arctic Storm BIOS ever go near 120% TDP? Mine starts to throttle at a little over 100%. With 8K Supo I've seen it hit 400W max, but that's with an insane amount of throttling and still below 105% TDP.

This powerlimit.bat file is a bit confusing. Someone says you shouldn't touch the power slider after running it? Is that correct?


----------



## kevindd992002

Quote:


> Originally Posted by *cluster71*
> 
> Was from the same person jorimt


Right. And the thread in my above post that I've linked to is his thread


----------



## cluster71

Quote:


> Originally Posted by *TheBoom*
> 
> Nevermind. I can constantly reproduce this issue in Overwatch if I have regular Gsync on, instead of ULMB or completely disabled. Maybe a bug with my drivers maybe not. You have your opinion and I'll have mine. Let's end it here.
> Back on topic, Kedar does your card with the Arctic Storm bios ever go near 120% TDP? Mine keeps starting to throttle at a little over 100%. With 8K Supo I've seen it go to 400w max but that's with an insane amount of throttling and still below 105% TDP.
> 
> This powerlimit.bat file is a bit confusing. Someone says you shouldn't touch the power slider after running it? Is that correct?


I was hoping for 400-450W or more at 1.2V. That's why I connected 4 PCIe cables + adapters and extra capacitors to avoid voltage drops (and took measurements during benchmarks), as I mentioned earlier. But the average is 320-330W, often lower, sometimes spiking higher: 89-91% power in AB with the Arctic BIOS.

Must be something in the nVidia drivers holding it back. I searched yesterday for more commands to enter. I'm thinking about removing the extra cables now; they don't seem to be needed.


----------



## cluster71

Quote:


> Originally Posted by *kevindd992002*
> 
> Right. And the thread in my above post that I've linked to is his thread


Saw it, checking


----------



## TheBoom

Quote:


> Originally Posted by *cluster71*
> 
> I was hoping for 400-450W or more at 1.2V. That's why I connected 4 PCIe cables + adapters and extra capacitors to avoid voltage drops (and took measurements during benchmarks), as I mentioned earlier. But the average is 320-330W, often lower, sometimes spiking higher: 89-91% power in AB with the Arctic BIOS.
> 
> Must be something in the nVidia drivers holding it back. I searched yesterday for more commands to enter. I'm thinking about removing the extra cables now; they don't seem to be needed.


How are you getting 1.2v though? It isn't an unlocked bios like the XOC right?


----------



## cluster71

Yes, I've tested with almost every one, but with the XOC BIOS you don't see power draw; I'm using HWiNFO64. Same result...


----------



## cluster71

Yesterday I run XOC run 1.2v and switched to Extreme bios after, about the same power use. Keep Extreme or Arctic for now until find some new info


----------



## TheBoom

Quote:


> Originally Posted by *cluster71*
> 
> Yes I've tested with everyone almost, but in XOC you do not see power, using HWinfo64. Same result ...


If you're talking about the XOC BIOS, then I have no issues with the power limit. It has gone up to 445 watts in Mass Effect Andromeda at 1.143V.

I'm talking specifically about the Arctic Storm BIOS and the bat file that's supposed to set its power limit to 432 watts. It doesn't seem to be working: the highest I've seen is 394 watts in 8K Supo, and I keep getting throttled from 350 watts onwards.


----------



## cluster71

I cannot get continuously high power use; I tested powerlimit.bat.


----------



## cluster71

With more games, or just Mass Effect Andromeda? Some bug, I think.
Something was mentioned on nVidia's side, I believe.


----------



## cluster71

I was on https://developer.nvidia.com/ yesterday; I've had a login since earlier. I'm trying to understand how the power limit works in the drivers.

Disconnecting for some fixes.


----------



## KedarWolf

Quote:


> Originally Posted by *TheBoom*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kevindd992002*
> 
> No, I didn't miss that. That's exactly what I was referring to when I said "what he means by '*gsync framerate cap*' is that gsync only works up to the refresh rate." He's basically explaining how input lag behaves when your fps=Hz: it behaves as if VSYNC were ON. *He is not saying that GSYNC caps the fps to the refresh rate.* You can directly ask him if you don't believe me. Let me ping him.
> 
> EDIT: I agree that this is becoming too much off-topic in this thread. On second thought, please post in the thread that I've linked to so that developers there can explain to you better. I don't know what more to say to you regarding this topic.
> 
> 
> 
> Nevermind. I can constantly reproduce this issue in Overwatch if I have regular Gsync on, instead of ULMB or completely disabled. Maybe a bug with my drivers maybe not. You have your opinion and I'll have mine. Let's end it here.
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Stuff off topic is okay if it doesn't drag on for long periods of time and is somehow 1080 Ti related.
> 
> However, this has dragged on for a bit. Could you take it to the G-Sync thread as suggested, please?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks.
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> Back on topic, Kedar does your card with the Arctic Storm bios ever go near 120% TDP? Mine keeps starting to throttle at a little over 100%. With 8K Supo I've seen it go to 400w max but that's with an insane amount of throttling and still below 105% TDP.
> 
> This powerlimit.bat file is a bit confusing. Someone says you shouldn't touch the power slider after running it? Is that correct?
Click to expand...

You need to edit powerlimit.bat in Notepad to make sure it's set to -pl 432 for the Arctic Storm BIOS, run it, AND max out your power limit slider (120%, I think it is).

And yes, it does show a power limit at around 100%, but I think the power monitoring is wrong on this BIOS and it's actually pulling the voltages and wattage it should, from what other people have said.

How you can tell the 432 is working, no matter what it reports: it power limits and downclocks somewhat less than other BIOSes do.

I got an 11166 in Time Spy with it, which is pretty much unheard of with a 1080 Ti. I think it was 11136 for the graphics score alone.
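For reference, if powerlimit.bat is the common nvidia-smi wrapper (an assumption on my part; I haven't inspected the exact file being passed around), its core would be a single call, run from an elevated prompt:

```bat
@echo off
rem Sketch of what such a powerlimit.bat might contain (assumed to wrap
rem nvidia-smi; the actual file in circulation may differ). -pl sets the
rem driver-enforced power limit in watts; it requires admin rights and
rem only accepts values within the min/max range the BIOS advertises.
rem Query the allowed range first with: nvidia-smi -q -d POWER
nvidia-smi -pl 432
```

That range check would also explain why the BIOS matters: nvidia-smi rejects values outside what the BIOS's power limit table allows, so 432W only sticks on a BIOS built for it.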


----------



## cluster71

I've had several Zotac graphics cards, but not an Extreme before. Has anyone had the "Zotac GeForce GTX 980 AMP! Omega"?
Nobody seems to know what the USB port on the Zotac 1080 Ti AMP Extreme is for. A service port, or something like this? http://techreport.com/review/27176/geforce-gtx-980-cards-from-gigabyte-and-zotac-reviewed

"Via this connection, the OC+ feature monitors some key variables, including the 12V line from the PCIe slot, the 12V line from the PCIe power connectors, GPU current draw, and memory voltage. Beyond monitoring, OC+ also allows control over the card's memory voltage"

_
http://forum.hwbot.org/showthread.php?t=169488
Thinking about what he meant by the second point



_
Restored. No beauty but that was never my intention. Less cables at least


----------



## Benny89

Playing at 2088 at 1.75v now. It never breaks 40C; 38-39C mostly in W3.

Funny how much power limit headroom was freed up by removing the fans: normally in W3 at 2063 I'd see it spike to 114-118%, but now I rarely see 102-103% playing it.

Happy with my water cooling


----------



## ZealotKi11er

Quote:


> Originally Posted by *Benny89*
> 
> Playing at 2088 at 1.75v now. Never breaks 40C. 38-39 mostly in W3.
> 
> Funny how much Power Limit was released with removed fans. Normally in W3 with 2063 I was seeing it spike to like 114%-118%. Now I rarely see 102/3% playing it.
> 
> Happy with my water cooling


1.075v. Big difference.


----------



## Benny89

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 1.075v. Big difference.


Well, I am slowly testing new speeds with new curves. Tomorrow I will test 1.8 and 1.9.


----------



## TheBoom

Quote:


> Originally Posted by *KedarWolf*
> 
> You need to edit powerlimit.bat in Notepad to make sure it's set to -pl 432 for the Arctic Storm BIOS, run it, AND max out your power limit slider (120%, I think it is).
> 
> And yes, it does show a power limit at around 100%, but I think the power monitoring is wrong on this BIOS and it's actually pulling the voltages and wattage it should, from what other people have said.
> 
> How you can tell the 432 is working, no matter what it reports: it power limits and downclocks somewhat less than other BIOSes do.
> 
> I got an 11166 in Time Spy with it, which is pretty much unheard of with a 1080 Ti. I think it was 11136 for the graphics score alone.


Even if it's reporting voltages and wattages wrong, I'm getting lower clocks (more throttling) with both the offset and curve methods, and my 4K and 8K Supo runs both score lower than with the XOC BIOS and about the same as the Amp Extreme BIOS, consistently.

So even if GPU-Z is reporting the Pwr perfcap wrongly and HWiNFO's power usage is wrong, it's apparent that something is throttling it hard, and that's leading to worse scores for me.

Maybe it really depends on the card at hand. Could be that an FE with the shunt mod benefits more from this BIOS than my Amp Extreme.


----------



## KedarWolf

Quote:


> Originally Posted by *TheBoom*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> You need to edit powerlimit.bat in Notepad to make sure it's set to -pl 432 for the Arctic Storm BIOS, run it, AND max out your power limit slider (120%, I think it is).
> 
> And yes, it does show a power limit at around 100%, but I think the power monitoring is wrong on this BIOS and it's actually pulling the voltages and wattage it should, from what other people have said.
> 
> How you can tell the 432 is working, no matter what it reports: it power limits and downclocks somewhat less than other BIOSes do.
> 
> I got an 11166 in Time Spy with it, which is pretty much unheard of with a 1080 Ti. I think it was 11136 for the graphics score alone.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Even if it's reporting voltages and wattages wrong, I'm getting lower clocks (more throttling) with both the offset and curve methods, and my 4K and 8K Supo runs both score lower than with the XOC BIOS and about the same as the Amp Extreme BIOS, consistently.
> 
> So even if GPU-Z is reporting the Pwr perfcap wrongly and HWiNFO's power usage is wrong, it's apparent that something is throttling it hard, and that's leading to worse scores for me.
> 
> Maybe it really depends on the card at hand. Could be that an FE with the shunt mod benefits more from this BIOS than my Amp Extreme.
Click to expand...

I'm on a Gigabyte FE with no shunt mod, but yes, XOC is pretty much on par with this BIOS for me.

I run XOC as my daily driver; I've just never gotten above, say, 11050 overall / 11038 graphics in Time Spy with it, but that's pretty close to the Arctic Storm BIOS.


----------



## 12Cores

While this card is insanely fast, I can't help but wonder how much performance is being left on the table due to the 1.093V limit. I'm pretty confident I'd easily be over 2.1GHz if they allowed a higher voltage ceiling; 1.25V would have been nice, especially for those of us under water.


----------



## kevindd992002

I'm looking for DDR4 RAM to match with my 1080Ti and 7700K. My choices just boil down to either of these G.Skill Trident Z's:

3200CL14 ($182.99)
3200CL15 ($172.99)
3600CL15 ($220.99)
3600CL16 ($196.99)

Which one is the most practical to get, considering I'll be overclocking everything (CPU, GPU, RAM)?


----------



## MrTOOSHORT

Check the memory thread


----------



## kfxsti

Guys. I have an odd question..
Do you think a 4.9ghz delidded i5-6600k will bottleneck my 1080ti ? I have one sat aside from my upgrade to the the 7700k BUTtttt
My 2 7700ks crapped the bed due to a mother board issue. And it's going to a little bit before I'll be able to grab any sort of i7 replacement (ive got to have some radiation ) . But I was able to replace the bad Mobo.
Sorry for.the derail.. just trying to keep my ducks in a row right now..


----------



## Dasboogieman

Quote:


> Originally Posted by *kfxsti*
> 
> Guys. I have an odd question..
> Do you think a 4.9ghz delidded i5-6600k will bottleneck my 1080ti ? I have one sat aside from my upgrade to the the 7700k BUTtttt
> My 2 7700ks crapped the bed due to a mother board issue. And it's going to a little bit before I'll be able to grab any sort of i7 replacement (ive got to have some radiation ) . But I was able to replace the bad Mobo.
> Sorry for.the derail.. just trying to keep my ducks in a row right now..


What games will you be playing? What frame rates are you targeting? What is your monitor setup?


----------



## ZealotKi11er

I was testing the 1080 Ti in BF1 and for some reason I am getting lower fps than I did with the 290X when using 1080p Low. Is this some Nvidia problem? I thought Nvidia had much better CPU overhead. I used to get 140+ but now I am close to 120 fps.


----------



## KedarWolf

Quote:


> Originally Posted by *12Cores*
> 
> While this card is insanely fast I cannot help but wonder how much performance is being left on the table due to the 1.093v limit. I am pretty confident that I would easily be over 2.1ghz or faster if they allowed a higher voltage ceiling, 1.25v would have been nice especially for those of us under water.


You mean like the 1.2v XOC BIOS?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *kfxsti*
> 
> Guys. I have an odd question..
> Do you think a 4.9ghz delidded i5-6600k will bottleneck my 1080ti ? I have one sat aside from my upgrade to the the 7700k BUTtttt
> My 2 7700ks crapped the bed due to a mother board issue. And it's going to a little bit before I'll be able to grab any sort of i7 replacement (ive got to have some radiation ) . But I was able to replace the bad Mobo.
> Sorry for.the derail.. just trying to keep my ducks in a row right now..


The only bottleneck is the 4 threads versus the 8 of your old 7700Ks. The 6600K is a nice placeholder until you get another CPU.


----------



## Dasboogieman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I was testing the 1080 Ti in BF1 and for some reason I am getting lower fps than I did with the 290X when using 1080p Low. Is this some Nvidia problem? I thought Nvidia had much better CPU overhead. I used to get 140+ but now I am close to 120 fps.


Double check that your NVIDIA driver settings are set to Maximum Performance. That looks like auto throttle at 120fps to conserve power.

Also, the NVIDIA driver tuning tends to discard a lot of frames compared to the AMD counterpart because NVIDIA typically targets frame consistency over raw frame rates.


----------



## trekxtrider

Finally got the upgrade from the 980ti.


----------



## KedarWolf

I tried every revision of the Gigabyte Aorus/Gaming Extreme BIOS with a 375W power limit. Having had luck with a few Zotac Amp Extreme BIOSes that worked on my FE, I hoped these would too, but none of them work right.









Back to XOC BIOS.


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Coree*
> 
> Hello guys, I'm gonna show you the ugliest 1080 Ti here, with a nice ghetto mod that doesn't void your blower card's warranty. It reduced my temps by 20C and the noise is very quiet: before it was 85C with a 1900MHz core, now it's 65C with a 1949MHz core. The plus side is that you can keep the stock VRM + VRAM heatplate!
> 
> I had an Accelero III with broken fans, so I replaced them with 3 new ones. The second thing, which is a must, is a 25x25mm copper shim, 2mm thick.
> I had to bend the Accelero a bit because the stock cooler's fins interfered with it (right corner of the PCB).
> My card is a 4-slot behemoth now!


Ewww... Yuck.


----------



## kfxsti

Quote:


> Originally Posted by *Dasboogieman*
> 
> What games will you be playing? What frame rates are you targeting? What is your monitor setup?


Warframe, The Division, BF4, and BF1 for the most part. The monitor is a 144Hz 1080p panel, and I use Nvidia's DSR feature to normally run 1440p or higher.
Sorry for the delay; I closed my eyes on the couch for a minute and just woke up, lol.


----------



## kfxsti

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> The only bottleneck is the 4 threads versus the 8 of your old 7700Ks. The 6600K is a nice placeholder until you get another CPU.


Awesome. As long as it holds up for a few weeks, lol.


----------



## kevindd992002

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Check the memory thread


I was assuming my post was on-topic, as it has to do with the RAM that's best for the 1080 Ti.


----------



## PimpSkyline

Quote:


> Originally Posted by *cluster71*
> 
> I've had several Zotac graphics cards, but not Extreme earlier. Is anyone who has had "Zotac's GeForce GTX 980 AMP! Omega"?
> Nobody seems to know what the USB port on the Zotac 1080Ti Amp Extreme is for anything. A service port or something like this http://techreport.com/review/27176/geforce-gtx-980-cards-from-gigabyte-and-zotac-reviewed
> 
> "Via this connection, the OC+ feature monitors some key variables, including the 12V line from the PCIe slot, the 12V line from the PCIe power connectors, GPU current draw, and memory voltage. Beyond monitoring, OC+ also allows control over the card's memory voltage"
> 
> _
> http://forum.hwbot.org/showthread.php?t=169488
> Thinking about what he meant by the second point
> 
> 
> 
> _
> Restored. No beauty but that was never my intention. Less cables at least


I have a Zotac GTX 980 Ti AMP! OMEGA, what did you wanna know? (Best card i have ever owned btw)


----------



## lilchronic

Quote:


> Originally Posted by *kevindd992002*
> 
> I was assuming that my post is on-topic as it has to do with the RAM that's best for the 1080Ti.


RAM is not going to make your 1080 Ti itself perform any better; however, it will increase your CPU performance.
Just go with whatever fits your budget.
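This can be summed up with a simple bottleneck model: the delivered frame rate is the lower of the GPU-side and CPU-side ceilings, and faster RAM only raises the CPU-side one. A sketch with made-up numbers (all fps figures below are illustrative, not benchmarks):

```python
def delivered_fps(gpu_fps, cpu_fps):
    """Whichever side finishes its per-frame work last sets the frame
    rate. Faster RAM helps only by raising the CPU-side ceiling."""
    return min(gpu_fps, cpu_fps)

# GPU-bound (e.g. 4K maxed): a RAM-assisted CPU ceiling changes nothing.
assert delivered_fps(gpu_fps=60, cpu_fps=140) == 60
assert delivered_fps(gpu_fps=60, cpu_fps=160) == 60
# CPU-bound (e.g. 1080p low): the higher CPU ceiling shows up directly.
assert delivered_fps(gpu_fps=250, cpu_fps=140) == 140
assert delivered_fps(gpu_fps=250, cpu_fps=160) == 160
```

This also reconciles the two positions in the thread: a 1080 Ti gains "a few fps" from faster RAM exactly when it's fast enough to push the game into CPU-bound territory.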


----------



## Sgang

Anyone here with a Zotac 1080 Ti AMP Extreme Core edition?
I don't know if my performance is adequate.
I ran Unigine Valley Extreme HD and my score was 4600-something, at the bottom of the 1080Ti list and just above the 1080.
My CPU is a Ryzen 1800X.


----------



## kevindd992002

Quote:


> Originally Posted by *lilchronic*
> 
> Ram is not going to make your 1080Ti perform any better. However It will increase your CPU performance.
> Just go with what ever fits your budget.


I'm reading a lot of articles/blogs saying that faster RAM will get high-end GPUs (like the 1080Ti) a few more fps. The idea seems to be that RAM speed is somewhat significant when you have a high-end GPU, but much less so for mainstream GPUs.


----------



## becks

Quote:


> Originally Posted by *kevindd992002*
> 
> I'm reading a lot of articles/blogs saying that RAM will make high-end GPU's (like the 1080Ti) perform a few fps more. It's more like RAM speeds are somewhat significant if you have a high-end GPU but the other way around for mainstream GPU's.


You don't really understand the whole process, it seems.
Pairing the same 1080 Ti with an i5 or an i7 will produce the same phenomenon: you get a "few" more FPS on the i7.
Game performance and FPS are not tied 100% to the GPU; the CPU, RAM, disk, motherboard (lanes), etc. all add their part to the end result.

Faster RAM increases CPU-to-RAM performance, or even disk performance (when used as a cache), but does nothing for the GPU, which has its own RAM.


----------



## Sgang

I agree, especially if you are on a Ryzen platform.
I tested my 1080Ti with 2133MHz memory and then with 2667MHz CL15 and gained 7-8 average fps in some games (1800X).


----------



## Psykoking

Quote:


> Originally Posted by *becks*
> 
> You don't really understand the whole process it seems....
> Pairing the same 1080 ti with a i5 or i7 will produce the same phenomen and you have a "few" more FPS on the i7..
> Game performance and FPS is not tied 100% to the GPU...CPU, Ram, Disk, MB (lanes) etc. etc. all add up they'r part to the end result.
> 
> Having better ram increases CPU - RAM performance or even Disk performance (when used as cache) but do not produce anything in regards to the GPU witch also has its own Ram..


This is not necessarily true, since the CPU also has to do computational work that is needed to generate a frame, for instance tracking the position of other players in a multiplayer game. The CPU handles all the network-related work and stores that data in RAM, and the GPU can access it via direct memory access, leaving the CPU free to perform other calculations. This means RAM with higher bandwidth allows that data transfer to be performed faster.
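To put some (purely hypothetical) numbers on the argument above: the frame rate is limited by whichever stage, CPU or GPU, takes longer per frame, so trimming CPU-side time (which faster RAM can do) raises fps even though the GPU is untouched. A minimal sketch, with made-up millisecond figures:

```python
# Rough illustration with hypothetical numbers: the slower of the CPU
# and GPU stages limits the achievable frame rate.

def frame_budget_ms(target_fps: float) -> float:
    """Time available to prepare one frame at a given target frame rate."""
    return 1000.0 / target_fps

def achievable_fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frame rate is bounded by the longer of the two per-frame stages."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# At 144 fps the per-frame budget is about 6.9 ms.
print(round(frame_budget_ms(144), 1))   # 6.9

# If faster RAM trims CPU-side work from 8 ms to 7 ms while the GPU
# needs only 5 ms per frame, fps rises with no GPU change at all.
print(round(achievable_fps(8.0, 5.0)))  # 125
print(round(achievable_fps(7.0, 5.0)))  # 143
```

The same arithmetic also explains why a GPU-bound game (GPU time > CPU time) sees no benefit from faster RAM.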


----------



## becks

Quote:


> Originally Posted by *Psykoking*
> 
> This is not necessarily true, since the CPU also has to do computational work on games which are needed to generate a frame. For instance the positon of other players in a multiplayer game. The CPU has to handle all the network related stuff. The CPU stores this data in the RAM and the GPU can access it via direct memory access leaving the CPU free to perform other calculations. This means RAM with a higher bandwidth allows data transfer to be performed faster.


And how is this not necessarily true?
The latency difference between 3200 and 3800 RAM is so minimal it does not make even 3 FPS of difference!
For proof, YouTube, the OCN forums, and the LTT forums are full of arguments over RAM overclocking: is it worth it or not?

More so, to make it simpler: if the CPU is left free to perform other calculations, why is the CPU the main and only bottleneck on GPU performance?

I stand by what I said, which is: game performance and FPS *are not tied 100% to the GPU*; the CPU, RAM, disk, motherboard (lanes), etc. all add their part to the end result.

And the whole thing started with his question, which was: which RAM is best for the 1080Ti?


----------



## Psykoking

Quote:


> Originally Posted by *becks*
> 
> And how is this not necessarily true ?
> Latency differences between 3200 or 3800 ram is so minimal it does not make 3 FPS difference!
> For proof Youtube, OCN forums and LTT forums are full of arguing topics around ram OC...is it worth it or not!
> 
> More-so to make it more simple: if CPU is left free to perform other calculations why is it the CPU the main and only Bottleneck factor of a GPU performance ?
> 
> I stand by what I said and that is: Game performance and FPS *is not tied 100% to the GPU.*..CPU, Ram, Disk, MB (lanes) etc. etc. all add up they'r part to the end result.
> 
> And the whole thing started with his question which was: RAM that's best for the 1080Ti ?


Of course it does nothing for the GPU's performance itself, but it has an impact on overall game performance. I don't have to look anything up in a forum; I'm a student in this field. Yes, the CPU is the main bottleneck, because it does all the other calculations the GPU is not responsible for, like AI, distance calculation, etc., and if it fails to deliver that data in time, frames will drop.
And of course it's overall performance that it comes down to in the end, but to answer which RAM would be best, you can simply answer, as always: the one with the highest bandwidth and lowest latency. What I intended to say was that it can make a difference, but you won't know unless you either calculate it (which is highly unlikely) or try it out.
Also, the 7-8 fps gain stated above is more likely to come from faulty or wrongly placed memory sticks, or unsupported memory, than anything else...


----------



## propeldragon

Changed my power supply and I'm still getting power drops and GPU usage drop spikes. Like 40% power and 1% usage; fps will drop to around 38. Very odd.


----------



## TheBoom

Quote:


> Originally Posted by *Sgang*
> 
> Anyone here with a 1080 ti Zotac AMP extreme core edition ?
> i don't know if my performance are adeguate
> i ran Unigine Valley Extreme HD and my score was 4600something, at the bottom of the 1080ti list, and just above the 1080.
> My CPU is a Ryzen 1800x


The Core edition gets the worst-binned chips of the AMP Extreme series. Being at the bottom of the 1080Ti list makes sense, but it should still perform much better than a 1080.

You could try overclocking it as far as it will go and perhaps get a nice boost in performance.

The other thing is that Ryzen's IPC might be holding you back a bit, and Unigine Valley is old, so I suppose it's also not optimized for Ryzen.

Quote:


> Originally Posted by *becks*
> 
> You don't really understand the whole process it seems....
> Pairing the same 1080 ti with a i5 or i7 will produce the same phenomen and you have a "few" more FPS on the i7..
> Game performance and FPS is not tied 100% to the GPU...CPU, Ram, Disk, MB (lanes) etc. etc. all add up they'r part to the end result.
> 
> Having better ram increases CPU - RAM performance or even Disk performance (when used as cache) but do not produce anything in regards to the GPU witch also has its own Ram..


I agree this is a weird question to ask in this thread, or even in general. It's like asking which car would go best with your apartment, lol.


----------



## kevindd992002

OK, I must admit that I'm not well-versed in the GPU/CPU/RAM field, which is why I asked the question. That's what I understood from the articles I've read: they keep saying that the difference between RAM speeds is more evident when you have a high-end CPU and GPU.


----------



## pez

Quote:


> Originally Posted by *kevindd992002*
> 
> Ok. I must admit that I'm not well-versed in the GPU, CPU, RAM field which is why I asked the question. That's what I understood from the articles I've read. They keep saying that the difference in RAM speeds is more evident when you have a high-end CPU and GPU.


CPUs tend to be the bottleneck, and when using faster RAM you're essentially releasing some of that bottleneck; that's where the performance gain comes from.


----------



## TheBoom

Quote:


> Originally Posted by *pez*
> 
> CPUs tend to be the bottleneck and when using faster RAM, you're essentially releasing some of that bottleneck and that's where the performance is coming from.


^^

Overwatch seems to be the only highly RAM-dependent game I can think of, period. Going from 2400MHz to 3200MHz RAM makes a 100 fps difference.


----------



## Dasboogieman

Quote:


> Originally Posted by *becks*
> 
> And how is this not necessarily true ?
> Latency differences between 3200 or 3800 ram is so minimal it does not make 3 FPS difference!
> For proof Youtube, OCN forums and LTT forums are full of arguing topics around ram OC...is it worth it or not!
> 
> More-so to make it more simple: if CPU is left free to perform other calculations why is it the CPU the main and only Bottleneck factor of a GPU performance ?
> 
> I stand by what I said and that is: Game performance and FPS *is not tied 100% to the GPU.*..CPU, Ram, Disk, MB (lanes) etc. etc. all add up they'r part to the end result.
> 
> And the whole thing started with his question which was: RAM that's best for the 1080Ti ?


It does make a difference, but only if the RAM is actually faster (as in actually lower latency, not just higher MHz), and only if the CPU was playing a large role in performance to begin with (e.g. GTA V). You see this manifested in the extreme between stock 2133MHz and pedestrian 3200MHz. The reason you see minimal difference between 3200MHz and 3800MHz comes down to the tuning of the timings: loose 3800MHz is equal to or worse than tight 3200MHz.

Or compare another extreme case, the 4790K and the 5775C. If both are clocked similarly (the 4790K maybe 3-5% higher to account for Broadwell's IPC) with equal RAM, the 5775C would still come out ahead due to the sheer performance of its L4 cache versus DDR3 systems.

TL;DR: you only get a speedup from RAM in games if the RAM is actually faster; if it just sports a high MHz number with crap timings, you won't see it.


----------



## pez

Quote:


> Originally Posted by *TheBoom*
> 
> ^^
> 
> Overwatch seems to be the only highly ram dependant game I can think of period. 2400-3200 mhz ram makes a 100 fps difference.


I did notice a performance increase going from a 4770K + 2133MHz DDR3 to a 7700K + 3000MHz DDR4, but I didn't do nearly enough testing or comparison to say what made the biggest difference: IPC increases, RAM speeds, etc. I believe GTA V and The Witcher 3 have been said to benefit really well from high-speed RAM.


----------



## Psykoking

Quote:


> Originally Posted by *Dasboogieman*
> 
> It does make a difference, but only if the RAM was actually faster (as in actually lower latency and not just mhz) and only if the CPU was playing a large role in performance to begin with (e.g. GTA5). You see this manifested as the extreme between stock 2133mhz and pedestrian 3200mhz. The reason you see minimal difference between 3200mhz and 3800mhz is due to the tuning of the timings. Loose 3800 is equal or worse than tight 3200mhz.
> 
> Or you compare another extreme case, the 4790K and the 5775c. If both are clocked similar (with the 4790K maybe 3-5% higher to account for broadwell's IPC) with equal RAM, the 5775c would still come out ahead due to the sheer performance of its L4 cache vs DDR3 systems.
> 
> TLDR: you only get a speedup in RAM performance in games if the RAM was actually faster, if it just sports a high MHz number but crap timings, you won't see it.


I second that.


----------



## KraxKill

Quote:


> Originally Posted by *Dasboogieman*
> 
> It does make a difference, but only if the RAM was actually faster (as in actually lower latency and not just mhz) and only if the CPU was playing a large role in performance to begin with (e.g. GTA5). You see this manifested as the extreme between stock 2133mhz and pedestrian 3200mhz. The reason you see minimal difference between 3200mhz and 3800mhz is due to the tuning of the timings. Loose 3800 is equal or worse than tight 3200mhz.
> 
> Or you compare another extreme case, the 4790K and the 5775c. If both are clocked similar (with the 4790K maybe 3-5% higher to account for broadwell's IPC) with equal RAM, the 5775c would still come out ahead due to the sheer performance of its L4 cache vs DDR3 systems.
> 
> TLDR: you only get a speedup in RAM performance in games if the RAM was actually faster, if it just sports a high MHz number but crap timings, you won't see it.


Exactly on point!

It's the combination of latency and frequency that determines the overall result.

It is also true that different programs respond differently to both: large data sets may prefer bandwidth, while small data sets may prefer latency.

Ultimately, the easiest way to estimate the performance of the RAM you intend to run is simply to

*DIVIDE the advertised LATENCY BY THE advertised FREQUENCY.* The combination resulting in the lowest value will be the fastest.

Again, this is situation-dependent to an extent.

The 5775C, for example, has additive L4 + DDR3 bandwidth, so focusing on latency at the expense of frequency can provide better results in situations where sufficient bandwidth already exists.

On other chips, like the 4790K, 6700K, and 7700K, you want both tight latency and sufficient bandwidth.

The main point is: don't be fooled by high-frequency RAM. It may give you a bunch more bandwidth, but if it's more bandwidth than you need, you're giving up latency and incurring additional input lag by running loose RAM timings to support the high frequency.

I've posted this graph before. It's a performance index of various RAM frequencies vs. RAM CAS latencies.
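The divide-latency-by-frequency rule above can be turned into actual nanoseconds: first-word latency is CAS cycles divided by the data rate in MT/s, times 2000 (since DDR transfers twice per clock). A quick sketch, using common retail kit specs purely as examples:

```python
# Effective (first-word) latency per the rule of thumb above:
# lower nanoseconds = faster response, regardless of the MHz on the box.

def true_latency_ns(cas: int, data_rate_mts: int) -> float:
    """CAS cycles / (data rate in MT/s) * 2000 = latency in nanoseconds."""
    return cas / data_rate_mts * 2000.0

# Example retail kit specs (illustrative, not recommendations).
kits = {
    "DDR4-3200 CL14": (14, 3200),
    "DDR4-3600 CL15": (15, 3600),
    "DDR4-3800 CL18": (18, 3800),
}

for name, (cl, rate) in kits.items():
    print(f"{name}: {true_latency_ns(cl, rate):.2f} ns")
# DDR4-3200 CL14: 8.75 ns
# DDR4-3600 CL15: 8.33 ns   <- lowest value, fastest by this metric
# DDR4-3800 CL18: 9.47 ns   <- higher frequency, but loose timings lose
```

Note how the 3800 kit, despite the biggest MHz number, comes out slowest by this metric; that's the "don't be fooled by high-frequency RAM" point in practice.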


----------



## smithsrt8

Weird post, but I'm not sure if anyone in here would have any interest, so I figured I would post in this thread before I go to the for-sale section. I have two 1080Tis: one Aorus (non-Extreme) and one EVGA FTW3. I have water blocks for both (the Aorus is already under water) and I was going to install the other block when I get back from vacation next week. What bothers me is that I cannot run the hard SLI bridge because the cards are different widths, and the non-matching look bothers me too; the waterblocks are different even though they are both EKWB. So I wanted to see if anyone would be interested in a swap: either my Aorus with the block already installed (I have all the factory heatsink/fans) for your EVGA FTW3, or my FTW3 with the block (not installed) for your Aorus 1080Ti (block or no block, it doesn't matter to me).

Anyone interested?


----------



## GRABibus

I would be very interested in this one:

http://www.gigabyte.fr/Graphics-Card/GV-N108TAORUSX-W-11GD#kf

Does anyone already own it?


----------



## Buzzard1

I bought a Gigabyte Aorus Extreme 1080Ti a week ago for $749 at Fry's. The card had been bought and returned as open-box. The overclocks were terrible, with the GPU maxing out at 1974MHz stable (which is probably the reason it was returned). I have been waiting a week and a half for the Gigabyte Extreme to come back in stock, and I am not 100% sure I should keep waiting. The only reasons I want the Gigabyte one are the 150% power limit it allows for overclocking and the two HDMI ports. Now, I called Fry's today and they told me they would let me trade my Gigabyte Extreme (paid $749) at the same price for the EVGA FTW3 1080Ti (MSRP $779). Should I jump on this deal or wait for the Gigabyte to come in? The FTW3 is the only one they have new in stock.

Thank you


----------



## Iceman2733

Quote:


> Originally Posted by *KCDC*
> 
> My system is fairly similar to yours. When monitoring psu usage through Link, my spikes would go as high as 920 to 960w on the stock FE bios when benching or gaming. What's the PSU? Is it truly rated for 1k or does it just say it on the side? You should be fine, but it is cutting it close.


I wanted to quote this and update: I finally got around to putting my two EVGA 1080Ti FTW3s into my system, and I do not see how people are running these cards with less than a 1000W PSU. With my CPU at [email protected] and the GPUs at stock clocks, I was seeing spikes of 940W in Time Spy. That is without overclocking the GPUs at all, which is crazy; it seemed to hover most of the time in the 840-860W range. Unless this APC's output reading is way off, I don't see how people are running two 1080Ti FTW3s without a 1kW PSU. I ended up finding a 1200 P2, so I have a little headroom, and I'm just going to put my 1kW up for sale.


----------



## smithsrt8

Quote:


> Originally Posted by *Iceman2733*
> 
> I wanted to quote this and update, I finally got around to putting my two Evga 1080ti FTW3 into my system, I do not see how people are running these cards with less than 1000watt PSU. With my CPU at [email protected] and the GPU's at stock clocks in Time Spy I was seeing spikes of 940watts. That is without O/Cing the GPUs at all this is crazy, it seemed to hoover most of the time in the mid 840-860 range. Unless this APC output is way off I don't see how people are running two 1080ti FTW3 without 1k PSU. I ended up finding me a 1200 P2 so I have a little head room and gonna just put my 1k up for sale.


With no overclock on the CPU or GPUs I was seeing 660-690W on a 750W unit (I was waiting for my new PSU to show up). As soon as the 1000W PSU arrived I applied my normal OCs on everything, and I have seen about 890W at the absolute max, although I did see a crazy spike past 1000W on the second part of Time Spy; I don't think it was correct. I am using a Kill A Watt.

I also have a 7700K @ 1.36V, an FTW3, and an Aorus 1080Ti (non-Extreme).


----------



## kevindd992002

Quote:


> Originally Posted by *KraxKill*
> 
> Exactly on point!
> 
> It's the combination of latency + frequency that determines the overall result.
> 
> It is also true that different programs will respond differently to both. Large data sets may prefer bandwidth while small datasets may prefer latency.
> 
> Ultimately the easiest way to estimate the performance of the ram you intend to run is to simply
> 
> *DIVIDE the advertised LATENCY BY THE advertised FREQUENCY.* The combination resulting in the lowest value will be the fastest.
> 
> Again, this is situation dependent to an extent.
> 
> The 5775c for example has additive L4+DDR3 bandwidth so focus on latency at the expense of frequency can provide better results in situation where sufficient bandwidth already exists.
> 
> On other chips like 4790k, 6700k, 7700k you want both tight latency and sufficient bandwidth.
> 
> The main point, is don't be fooled by high frequency ram. It may give you a bunch more bandwidth, but if it's more bandwidth than you need you're giving up latency and incurring additional input lag by running loose ram timings to support the high ram frequency.
> 
> I've posted this graph before. It's a performance index of various ram frequencies vs ram CAS latencies.


So if you had to recommend one, which of 3200C14 and 3600C15 would be ideal for a Z270 system with a 7700K and a 1080Ti?


----------



## DStealth

Quote:


> Originally Posted by *KraxKill*
> 
> It's the combination of latency + frequency that determines the overall result.
> I've posted this graph before. It's a performance index of various ram frequencies vs ram CAS latencies.


No, it's not. Try to understand what frequency and latency actually do and rethink that ridiculous table. There are applications that have to address memory intensively, where latency brings better performance, and some where only the transfer rate matters.
To clarify the situation: it's like using a transfer-rate/latency formula to evaluate HDD/SSD performance; there are huge variations depending on usage, i.e. small vs. big files, queue depth, and so on.


----------



## Dasboogieman

Quote:


> Originally Posted by *kevindd992002*
> 
> So if you were to recommend, which between the 3200C14 and 3600C15 would be ideal for a Z270 system using a 7700K and a 1080Ti?


The 3600 CL15 is faster, assuming your CPU's IMC can even handle it.


----------



## kevindd992002

Quote:


> Originally Posted by *Dasboogieman*
> 
> The 3600 cl15 is faster, assuming ur cpu imc can even handle it.


Is 3600 cl15 not something easily achievable by the 7700K's IMC?


----------



## pez

Quote:


> Originally Posted by *kevindd992002*
> 
> Is 3600 cl15 not something easily achievable by the 7700K's IMC?


The biggest difference would be between your current DDR3 and at least DDR4-3000. Go with whatever has the lowest latency and the cheapest cost; C15 and C16 DDR4 are going to perform really similarly within that close a speed rating.


----------



## lilchronic

Quote:


> Originally Posted by *kevindd992002*
> 
> Is 3600 cl15 not something easily achievable by the 7700K's IMC?


Yeah, it will run 3600MHz just fine. 3600MHz CL15 or CL16 kits are the best to get.


----------



## Dasboogieman

Quote:


> Originally Posted by *kevindd992002*
> 
> Is 3600 cl15 not something easily achievable by the 7700K's IMC?


It will run, but it's not guaranteed. I'd say about 60% of 7700Ks can do it with four single-rank DIMMs, and maybe 90% can do it with two single-rank DIMMs. It's obviously harder depending on your max core OC, operating temps, how many DIMMs are in play (also dual- vs. single-rank DIMMs), and motherboard quality/make/model.


----------



## becks

Quote:


> Originally Posted by *kevindd992002*
> 
> Is 3600 cl15 not something easily achievable by the 7700K's IMC?


Not 100% guaranteed, and a big factor is also whether you go 2x 8GB, 2x 16GB, or 4x 2GB.
As suggested before, check the memory thread and pop your questions there; that's where the RAM people hang out.


----------



## lilchronic

Here are the stability results of my 3600MHz CL16 kit overclocked up to 4133MHz CL18.
3866MHz CL16 is the sweet spot for me @ 1.45V.

4133MHz CL18


4000MHz CL17


4000MHz CL16, loose sub-timings


3866MHz CL16, tight timings


----------



## KraxKill

Quote:


> Originally Posted by *DStealth*
> 
> No it's not...try to understand what frequency and latency do and rethink this ridiculous table...There are application have to intensively address the memory where latency will bring better performance and some where only the transfer matters...
> Just to clear you the situation it's like doing in HDD/SSD calculation transfer rate/latency formula to evaluate the performance ...there are quite huge variations in regards of the usage...i.e small vs big files...queue depth and so on


What are you even talking about? Did you just selectively read my post and miss the bit on large vs small data sets and the fact that it's an "estimate"?

Move along...


----------



## blurp

Please stay on topic : GTX 1080ti.


----------



## PimpSkyline

Hey, does anyone who has a WB on their Ti have the stock cooler they could give or sell me? I need an FE-compatible cooler. Thanks.


----------



## khemist

https://imageshack.com/i/pnQmoVNbj
https://imageshack.com/i/pn2qiKHOj
https://imageshack.com/i/ponZGKGij
https://imageshack.com/i/pmkIbEYej

Sold my FE and got this since i'm back on air cooling.


----------



## jediblr

Quote:


> Originally Posted by *khemist*
> 
> Sold my FE and got this since i'm back on air cooling.


why? FE on water > any air card


----------



## khemist

I'm on air cooling in a new case and this is cooler and quieter than the FE on air.


----------



## ZealotKi11er

Quote:


> Originally Posted by *jediblr*
> 
> why? FE on water > any air card


Is it? I am not on water, but I do have an AIO, and I do not reach the same level of performance as AIB cards unless you do the shunt mod.


----------



## jediblr

I see the photos of the white case: superb!
And why is FE on water > all? Because I never see it go above 42C (running all tests and all new games, with ambient at 25-27C), and it holds 2063MHz with +0 core voltage at 40C.


----------



## Buzzard1

I just bought an EVGA FTW3 1080Ti and I am getting some unusual overclocking results. While my core is only stable around 2000MHz, my VRAM is stable at +960 or more. I am getting a 10280 score in Superposition. I would like some advice on what to do to get that core up. Also, is this card worth keeping with these results? Maybe try a different BIOS with a higher TDP? Would love to hear any feedback. Thank you.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Buzzard1*
> 
> I just bought a EVGA FTW3 1080ti and I am getting some unique overclocking results. While my core is only stable around 2000 mhz, my vram is stable at 960+. I am getting a 10280 score on Super position. I would like some advice on what to do to get that core up. Also, is this card worth keeping with these results? Maybe try a different bios with a higher TDP? Would love to hear any feed back. Thank you.


It seems its already using 1.06v for 2012MHz. You can try to OC with curve to get a bit more volts.


----------



## Buzzard1

I am used to Afterburner, but I need to use Precision X now with the FTW3, lol. I keep running OC Scanner and it crashes like a madman; I can't get more than a third of the way through. I called EVGA and they just told me to manually set my offsets, and that OC Scanner is FUBAR atm.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Buzzard1*
> 
> Is there a good tutorial for that? I am used to Afterburner but I need to use percision X now with the FTW3


OP


----------



## MrBeer

FYI, the Afterburner beta now supports all EVGA cards with the ICX tech.

this is the new Beta

http://www.guru3d.com/files-details/msi-afterburner-beta-download.html

• Improved 5-channel thermal monitoring module architecture provides support for up to 20 independent thermal sensors per GPU (up to 5 independent GPU, up to 5 independent PCB, up to 5 independent memory and up to 5 independent VRM temperature sensors) on future custom design MSI graphics cards
• Added NCT7802Y thermal sensors support to provide compatibility with future custom design MSI graphics cards
• Improved hardware database format. New database subsections support provides more compact database definition for multiple graphics card models sharing similar hardware calibration info
• New cached I2C device detection algorithm improves application startup time on the systems with multichannel voltage controllers or multichannel thermal sensors
• Added experimental interleaved hardware polling mode, aimed to reduce hardware polling time on the systems with multiple polled I2C devices. When interleaved polling is enabled, just a part of hardware monitoring data sources is being polled on each hardware polling period, so it takes multiple periods to refresh all monitoring data sources. Power users may enable interleaved hardware polling mode via the configuration file if necessary
• Improved third party voltage control mode functionality. Now third party hardware database can also include extended thermal sensors calibration and mapping info for third party custom design graphics cards

How to get it to work
iCX thermal sensors are in third party database now, so selecting third party voltage control mode in "General" tab should unlock access to them in hardware monitoring module.

http://forums.guru3d.com/showthread.php?t=412822&page=17

This is not my work; please go to the forum with any questions. Credit: Alexey Nicolaychuk, aka Unwinder, the RivaTuner creator.

Thanks


----------



## ZealotKi11er

MSI AB needs a redesign; too much info is not presented cleanly.


----------



## GraphicsWhore

Quote:


> Originally Posted by *Buzzard1*
> 
> I just bought a EVGA FTW3 1080ti and I am getting some unique overclocking results. While my core is only stable around 2000 mhz, my vram is stable at 960+. I am getting a 10280 score on Super position. I would like some advice on what to do to get that core up. Also, is this card worth keeping with these results? Maybe try a different bios with a higher TDP? Would love to hear any feed back. Thank you.


Yeah that is a nice mem OC but I'd see what you can get on core at (presumably) the expense of memory.

I got 10322 with +500 mem (6003) and 2088 core. That's a FE with XOC BIOS under water.

Point being you may absolutely be able to get a higher score with lower mem OC.


----------



## KedarWolf

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Buzzard1*
> 
> I just bought a EVGA FTW3 1080ti and I am getting some unique overclocking results. While my core is only stable around 2000 mhz, my vram is stable at 960+. I am getting a 10280 score on Super position. I would like some advice on what to do to get that core up. Also, is this card worth keeping with these results? Maybe try a different bios with a higher TDP? Would love to hear any feed back. Thank you.
> 
> 
> 
> Yeah that is a nice mem OC but I'd see what you can get on core at (presumably) the expense of memory.
> 
> I got 10322 with +500 mem (6003) and 2088 core. That's a FE with XOC BIOS under water.
> 
> Point being you may absolutely be able to get a higher score with lower mem OC.
Click to expand...

I run 2088 core, 6177 memory 24/7 on the XOC BIOS and I tested Fire Strike Ultra stress test and Time Spy stress test stable today.


----------



## GraphicsWhore

Quote:


> Originally Posted by *KedarWolf*
> 
> I run 2088 core, 6177 memory 24/7 on the XOC BIOS and I tested Fire Strike Ultra stress test and Time Spy stress test stable today.


That's awesome. FS Ultra craps the bed for me above 2088/6026. What voltage are you at?


----------



## Buzzard1

OK, I flashed it to the XOC BIOS (86.02.39.00.58). I can't change the TDP limit now in Precision X. Is it already maxed out, or must I run that .bat file every time I boot up? I'm trying to follow the instructions here:

http://forum.hwbot.org/showthread.php?t=169488

Thank you for the tips guys


----------



## ZealotKi11er

Quote:


> Originally Posted by *Buzzard1*
> 
> OK I flashed it to the XOC bios. I cant change the TDP limit now in percicsion X. Is it already maxed out or must I run that .bat file every time I boot up? Trying to follow these instructions here.
> 
> http://forum.hwbot.org/showthread.php?t=169488
> 
> Thank you for the tips guys


There is no limit. That is why you have no slider.


----------



## Buzzard1

OK, well, I will play with this XOC BIOS some more. My results so far are not getting me much better final benchmark scores. I plan on buying another used GTX 1080Ti down the road if the Ethereum market crashes, lol. I am unsure whether my poorly clocking GPU is outweighed by my high-overclocking VRAM enough to make this card worth keeping, versus taking one more chance on another card to swap out. I know Fry's will tell me, after the next one, tough luck.

Would you guys say this is an OK or good card to keep? I am worried that if I try my luck one last time I may not be happy with the results. Basically 2000MHz stable with a +960 mem OC. I wish someone would take the power limit off EVGA's BIOS; my memory doesn't OC as well with the XOC BIOS, but it's still in the +800s.

P.S. I get 10280 every time in Superposition.


----------



## Dasboogieman

Quote:


> Originally Posted by *Buzzard1*
> 
> OK well I will play with this XOC bios some more. My results so far are not really getting me much better results on final benchmark scores. I plan on buying another gtx 1080ti used down the road if the Ethereum Market crashes. lol I am unsure if my crappy clocking GPU is out weighed by my high overclocking ram to make this card worth keeping and taking one more chance on another card to swap out. I know frys will tell me after the next one, tough.
> 
> Would you guys say this is ok, or good card to keep? I am woried that if I try my luck one last time I may not be happy with the results of it. Basically 2000 stable with 960 mem OC. Wish some one would take the power limit off evgas bios. My memory doesnt OC as well with the XOC bios but still in the 800s.
> 
> P.S. I get 10280 all the time on superposion


I vote keep, your card's VRAM speed alone will outweigh the crap core speeds.


----------



## Streetdragon

Do higher mem clocks = more watts? How big is the difference between 6003 and 6200? Maybe 20 W more?

And keep it. You could get a card that won't even do 2000 on the core...


----------



## KedarWolf

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I run 2088 core, 6177 memory 24/7 on the XOC BIOS and I tested Fire Strike Ultra stress test and Time Spy stress test stable today.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That's awesome. FS Ultra craps the bed for me above 2088/6026. What voltage are you at?
Click to expand...

1.1v.


----------



## Dasboogieman

Quote:


> Originally Posted by *Streetdragon*
> 
> Do higher mem clocks = more watts? How big is the difference between 6003 and 6200? Maybe 20 W more?
> 
> And keep it. You could get a card that won't even do 2000 on the core...


When you increase the frequency of anything, the wattage goes up in proportion to that frequency increase. GDDR5X (the whole array) runs something like 30 W at stock (going by the FE VRM phase capacities), so overclocking from 5500 to 6300 would theoretically increase power draw to about 34 W. This assumes no adaptive voltage shenanigans are happening.
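To put that back-of-the-envelope math in code form (a rough sketch; the 30 W baseline and the linear-with-frequency assumption come straight from the post above, and real draw also depends on voltage, so treat it as ballpark only):

```python
# Rough GDDR5X power-scaling estimate: power grows in proportion to clock,
# assuming voltage stays fixed (no adaptive-voltage shenanigans).

def mem_power_estimate(base_watts, base_clock, target_clock):
    """Scale a baseline power figure linearly with memory clock."""
    return base_watts * target_clock / base_clock

# ~30 W for the whole array at 5500 MHz effective, overclocked to 6300 MHz:
print(round(mem_power_estimate(30.0, 5500, 6300), 1))  # prints 34.4
```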


----------



## schoolofmonkey

I'm just about to order a GTX1080ti and was wondering what people's opinions are about the MSI GTX 1080Ti SEA HAWK X?

I was looking at it because I can at least keep the card cool and vent the heat out the back, seeing as I have a top-mounted AIO for the CPU.


----------



## Buzzard1

Good point, but this card doesn't do much better even at stock memory. Maybe I am wrong about this, but I don't see anyone so far running at my memory speeds perfectly stable. If only the GPU was just as good.


----------



## Dasboogieman

Quote:


> Originally Posted by *Buzzard1*
> 
> Good point, but this card doesn't do much better even at stock memory. Maybe I am wrong about this, but I don't see anyone so far running at my memory speeds perfectly stable. If only the GPU was just as good.


I've been seeing that late-model GTX 1080 Tis seem to OC their VRAM much more easily than the initial batches. I suspect Micron is ironing out the manufacturing, so more chips are making it into the Titan Xp bins; almost all the Titan Xps can do +800/+900 VRAM.

You won't see many of us in this thread getting good VRAM OCs because most of us have 1st or 2nd batch models.


----------



## GRABibus

Quote:


> Originally Posted by *schoolofmonkey*
> 
> I'm just about to order a GTX1080ti and was wondering what people's opinions are about the MSI GTX 1080Ti SEA HAWK X?
> 
> I was looking at it because I can at least keep the card cool and vent the heat out the back, seeing as I have a top-mounted AIO for the CPU.


I have tried this card.
It was maybe a "non-good-clocker" one, because I was unable to go over 2000MHz with it...


----------



## schoolofmonkey

Quote:


> Originally Posted by *GRABibus*
> 
> I have tried this card.
> It was maybe a "non-good-clocker" one, because I was unable to go over 2000MHz with it...


Good to know. I'm replacing my GTX 1080 Strix; couldn't fault that card, it would do over 2000MHz, but being on air it wouldn't stay there.

Because I'm getting a 7820X, I wanted as little hot air in the case as possible, so that looked like the best bet, though some reviews on Newegg mentioned coil whine.


----------



## Sweetwater

Oosh a new PB for my Strix oc running the XOC bios









2114MHz @ 1.093V with 6003 mem, not bad for air only with a max temp of 56°C


----------



## dansi

How do you all increase and maintain voltages? Even if I put 1.093V on the curve, the GPU will auto-adjust down under load.


----------



## KedarWolf

Quote:


> Originally Posted by *dansi*
> 
> How do you all increase and maintain voltages? Even if I put 1.093V on the curve, the GPU will auto-adjust down under load.


That's GPU Boost 3.0. You're either hitting the power limit or downclocking due to temps.

Try the XOC BIOS for no power limit under 'How To Flash A Different BIOS'.


----------



## PimpSkyline

Anybody?

Does anyone have an aftermarket cooler (MSI/EVGA/Zotac etc.) for an FE 1080 Ti, from a card that went to water? I need something better than the stupid blower, thanks.


----------



## Dasboogieman

Quote:


> Originally Posted by *PimpSkyline*
> 
> Anybody?
> 
> Does anyone have an aftermarket cooler (MSI/EVGA/Zotac etc.) for an FE 1080 Ti, from a card that went to water? I need something better than the stupid blower, thanks.


Try the Accelero IV cooler. It cools the VRAM + VRM via the backside.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Sweetwater*
> 
> 
> 
> Oosh a new PB for my Strix oc running the XOC bios
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2114MHz @ 1.093V with 6003 mem, not bad for air only with a max temp of 56°C


What is the power usage like?


----------



## Sweetwater

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What is the power usage like?


I don't have a power meter unfortunately; is there any software you'd like me to try? I'll run it through and give you the readings.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Sweetwater*
> 
> I don't have a power meter unfortunately, is there a software you'd like me to try? I'll run it through and give you the readings


HWINFO


----------



## cluster71

If anyone is interested: the NVIDIA System Management Interface (nvidia-smi) is a command-line utility, and KedarWolf uses one of its commands in his "powerlimit.bat".

The commands available can change with the driver version and can be checked with `nvidia-smi.exe -h`. Several commands only work under Linux.

http://developer.download.nvidia.com/compute/DCGM/docs/nvidia-smi-367.38.pdf
https://developer.nvidia.com/nvidia-system-management-interface


----------



## Sweetwater

Quote:


> Originally Posted by *ZealotKi11er*
> 
> HWINFO


OK, after 3 runs the average was 370W during the Superposition benchmark.

Edit: ran a couple at 2000MHz and 0.993V; the power draw was 294W.


----------



## KedarWolf

"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power -l 1

Run that in an admin command prompt WITH the quotation marks and you'll get an accurate power reading.









You can change the 1 to 3, or whatever you want, to have it report every three seconds instead of every second, for example.
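If you'd rather log the numbers than watch them scroll by, nvidia-smi also has a CSV query mode that's easy to parse. A minimal Python sketch: the `--query-gpu`/`--format` flags are part of nvidia-smi's documented interface, but the install path below is the same assumption as the command above, so adjust it to your system:

```python
# Read GPU power draw (watts) via nvidia-smi's CSV query interface.
import subprocess

NVSMI = r"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe"

def parse_power(output):
    """Parse '--format=csv,noheader,nounits' output: one float per GPU line."""
    return [float(line) for line in output.strip().splitlines() if line.strip()]

def read_power():
    out = subprocess.check_output(
        [NVSMI, "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        text=True)
    return parse_power(out)

# What the parser does with a sample two-GPU reading:
print(parse_power("245.31\n180.02\n"))  # prints [245.31, 180.02]
```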


----------



## DStealth

Quote:


> Originally Posted by *KraxKill*
> 
> What are you even talking about? Did you just selectively read my post and miss the bit on large vs small data sets and the fact that it's an "estimate"?
> 
> Move along...


An estimate of what? It's an unrealistic formula that no real-world performance scenario could get close to...

Edit: Moving along... Ignorance is not an excuse, just FYI.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Sweetwater*
> 
> OK, after 3 runs the average was 370W during the Superposition benchmark.
> 
> Edit: ran a couple at 2000MHz and 0.993V; the power draw was 294W.


Yeah, I had the same power draw. 80W for 3-4% more performance is not worth it.
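For anyone curious how lopsided that trade really is, here is the arithmetic using the two wattages Sweetwater reported:

```python
# Extra power cost of the 1.093 V profile vs the 2000 MHz / 0.993 V one.
def extra_power_pct(high_w, low_w):
    """Percentage increase in power draw going from low_w to high_w."""
    return (high_w - low_w) / low_w * 100

# ~26% more power for roughly 3-4% more performance:
print(round(extra_power_pct(370, 294), 1))  # prints 25.9
```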


----------



## lilchronic

Quote:


> Originally Posted by *PimpSkyline*
> 
> Anybody?
> 
> Does anyone have an aftermarket cooler (MSI/EVGA/Zotac etc.) for an FE 1080 Ti, from a card that went to water? I need something better than the stupid blower, thanks.


Sell the cooler and lose the warranty? Why would anyone do that??? Just get an Arctic Accelero for the GPU.


----------



## GanGstaOne

Hi people, can anyone help? HWiNFO64 and GPU-Z show me a Performance Limit of "GPU Utilization" at all times, even at idle, on my Aorus 1080 Ti. I tested with my old Gigabyte 1080 G1 and I don't have this problem.
Why is it showing the GPU utilization limit even at idle and in every game, no matter if I overclock or stay stock? I even capped the max boost clock to 1771MHz, same thing.

EDIT: only in the FurMark stress test does the GPU utilization limit sometimes go away.


----------



## Benny89

Quote:


> Originally Posted by *jediblr*
> 
> why? FE on water > any air card


LOL.

Silicon lottery >>>>>>> FE/AIB, any brand, any cooling.

Doesn't matter if you put an FE or non-FE under water. The silicon lottery is all that matters.


----------



## MunneY

Okay,

So I need some opinions here. I'm about at my wits' end.

Yesterday I swapped my 7900X and X299 Taichi into my main system with my two 1080 Tis. I'd run the 7900X with a Fury X on the test bed with zero issues in any gaming or rendering situation. When I switched to the 1080 Tis in my main system and loaded up the only game I play right now (PlayerUnknown's Battlegrounds), I immediately noticed I was getting serious frame-rate issues. I'm talking 90 to 15 and back with no consistency. I tabbed out to check EVGA XOC and this is what I see.



I have tried everything I can think of: GPU driver wipes, rolling back GPU drivers, enabling/disabling SLI, removing a GPU and swapping cards, resetting to stock motherboard settings. Tested with a known stable OC on the CPU and no GPU clock, and then with no OC anywhere. I even switched from Windows 8.1 to a Windows 10 install that I have used before. The only thing I haven't done is completely reformat one of my drives and try a clean Windows install (absolute last resort).

Thoughts? Suggestions? What am I missing here!?


----------



## Psykoking

Quote:


> Originally Posted by *MunneY*
> 
> Okay,
> 
> So I need some opinions here. I'm about at my wits' end.
> 
> Yesterday I swapped my 7900X and X299 Taichi into my main system with my two 1080 Tis. I'd run the 7900X with a Fury X on the test bed with zero issues in any gaming or rendering situation. When I switched to the 1080 Tis in my main system and loaded up the only game I play right now (PlayerUnknown's Battlegrounds), I immediately noticed I was getting serious frame-rate issues. I'm talking 90 to 15 and back with no consistency. I tabbed out to check EVGA XOC and this is what I see.
> 
> 
> 
> I have tried everything I can think of: GPU driver wipes, rolling back GPU drivers, enabling/disabling SLI, removing a GPU and swapping cards, resetting to stock motherboard settings. Tested with a known stable OC on the CPU and no GPU clock, and then with no OC anywhere. I even switched from Windows 8.1 to a Windows 10 install that I have used before. The only thing I haven't done is completely reformat one of my drives and try a clean Windows install (absolute last resort).
> 
> Thoughts? Suggestions? What am I missing here!?


Are both GPUs at the same voltage? Nvidia mentioned in their latest driver release notes that there have been issues with SLI when cards run at different voltage levels.
Can't see the pics on my phone...


----------



## MunneY

Quote:


> Originally Posted by *Psykoking*
> 
> Are both GPUs at the same voltage? Nvidia mentioned in their latest driver release notes that there have been issues with SLI when cards run at different voltage levels.
> Can't see the pics on my phone...


It's not a voltage problem. It happens with SLI off and only one card in the system.


----------



## PimpSkyline

Quote:


> Originally Posted by *lilchronic*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PimpSkyline*
> 
> Anybody?
> 
> Does anyone have an aftermarket cooler (MSI/EVGA/Zotac etc.) for an FE 1080 Ti, from a card that went to water? I need something better than the stupid blower, thanks.
> 
> 
> 
> Sell the cooler and lose the warranty? Why would anyone do that??? Just get an Arctic Accelero for the GPU.
Click to expand...

Got a point, didn't think about that. I will go look for other options.


----------



## Buzzard1

I think you guys are right. I am gonna keep this card and maybe get a water block for it. I wonder if I will lose my VRAM cooling or if it will get better with a water block or hybrid system. Looking at these leaderboards, I have the highest-clocking stable RAM on there. There are only 2 that are slightly faster, and they are Titan Xp cards. Thank you all for giving me your opinions last night.







I was just unsure. I am only ranked 42, but I feel like if I was on water I could do some serious a$$ kicking.

2017-07-07.png 481k .png file


----------



## Benny89

Been running Witcher 3 today for 4.5 hours on Ultra at 1440p at *2101MHz core clock at 1.09V and 6100MHz memory on my water loop*









Ambient temp: 22/23C. Mostly 22.

Results: Max GPU temp was 42C in some spikes, most of the time 41C. Max CPU: 60C. A nice solid 90-110 fps, occasionally higher.

I am super pleased with moving to water cooling







. My card at 2101 did not throttle even once at anything. The max power limit is never reached with the STRIX's 330W, which is totally enough for a 1080 Ti; Witcher 3 was so far the game pulling the most power. Right now on water I don't see it exceeding 104-106%.

I am finally totally satisfied with my card







. Can play now at 2101 24/7. I could probably go higher on the XOC BIOS, but I don't need to.


----------



## Sweetwater

Quote:


> Originally Posted by *KedarWolf*
> 
> "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power -l 1
> 
> Run that in a admin command prompt WITH the quotation mark and you'll get an accurate power reading.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You can change the 1 to 3 or whatever you want to have it report every three seconds instead of every second for example.


Using the cmd prompt it reported a 368W max average after a couple of runs. Seems in line with HWiNFO.

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah. Had same power draw. 80W for 3-4% more performance is not worth it.


Pretty much my thoughts too, which is why I run 2000 core @ 0.993V daily. Though on these cold nights without a heater, a long gaming session at 1.093V is nice.


----------



## GraphicsWhore

Quote:


> Originally Posted by *Sweetwater*
> 
> Using the cmd prompt it reported a 368W max average after a couple of runs. Seems in line with HWiNFO.
> Pretty much my thoughts too, which is why I run 2000 core @ 0.993V daily. Though on these cold nights without a heater, a long gaming session at 1.093V is nice.


Damn man where do you live at? Southern Hemisphere obviously but how cold is it there now?


----------



## Sweetwater

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Damn man where do you live at? Southern Hemisphere obviously but how cold is it there now?


Western Australia; temps all week have been 15°C, and 5°C during the night. I have to enjoy it while it lasts, as come summer it'll be mid-40s haha.


----------



## Dasboogieman

Quote:


> Originally Posted by *Sweetwater*
> 
> Western Australia; temps all week have been 15°C, and 5°C during the night. I have to enjoy it while it lasts, as come summer it'll be mid-40s haha.


My rig is folding at night to stop the coolant from freezing over.


----------



## dunbheagan

Quote:


> Originally Posted by *Buzzard1*
> 
> OK well, I will play with this XOC BIOS some more. My results so far are not getting me much better final benchmark scores. I plan on buying another GTX 1080 Ti used down the road if the Ethereum market crashes. lol. I am unsure if my crappy-clocking GPU is outweighed by my high-overclocking RAM enough to make this card worth keeping, versus taking one more chance on another card to swap out. I know Fry's will tell me "tough" after the next one.
> 
> Would you guys say this is an OK, or good, card to keep? I am worried that if I try my luck one last time I may not be happy with the results. Basically 2000 stable with a 960 mem OC. Wish someone would take the power limit off EVGA's BIOS. My memory doesn't OC as well with the XOC BIOS, but it's still in the 800s.
> 
> P.S. I get 10280 every time in Superposition


You should keep the card. A 960 mem OC is awesome and at least 5% above a good average. This will make up for the 50MHz "missing" on the core, which is only 2.5%.

Really, almost nobody reaches 2100 stable, so don't be upset about 2025 instead of 2075, especially if you have such fantastic memory. You have a very good card!


----------



## Sweetwater

Quote:


> Originally Posted by *Dasboogieman*
> 
> My rig is folding at night to stop the coolant from freezing over.


Time to fill the loop with vodka, then when it comes to draining it.. Hic


----------



## KedarWolf

Quote:


> Originally Posted by *Benny89*
> 
> Been running Witcher 3 today for 4.5 hours on Ultra at 1440p at *2101MHz core clock at 1.09V and 6100MHz memory on my water loop*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ambient temp: 22/23C. Mostly 22.
> 
> Results: Max GPU temp was 42C in some spikes, most of the time 41C. Max CPU: 60C. A nice solid 90-110 fps, occasionally higher.
> 
> I am super pleased with moving to water cooling
> 
> 
> 
> 
> 
> 
> 
> . My card at 2101 did not throttle even once at anything. The max power limit is never reached with the STRIX's 330W, which is totally enough for a 1080 Ti; Witcher 3 was so far the game pulling the most power. Right now on water I don't see it exceeding 104-106%.
> 
> I am finally totally satisfied with my card
> 
> 
> 
> 
> 
> 
> 
> . Can play now at 2101 24/7. I could probably go higher on the XOC BIOS, but I don't need to.


oops, wrong post.


----------



## Dasboogieman

Quote:


> Originally Posted by *Sweetwater*
> 
> Time to fill the loop with vodka, then when it comes to draining it.. Hic


Pls, I accidentally got a mouthful of CryoFuel already when I was draining the loop. It actually tastes surprisingly minty with a tinge of alcohol.


----------



## Aganor

Quote:


> Originally Posted by *Dasboogieman*
> 
> Pls, I accidentally got a mouthful of CryoFuel already when I was draining the loop. It actually tastes surprisingly minty with a tinge of alcohol.


I got my mouthful of the EK variant. Tasted like citrus juice with a burning sensation on the tongue lol.


----------



## KedarWolf

This is my Time Spy at 2088/6177, my 24/7 gaming settings; Fire Strike Ultra and Time Spy stress-test stable on the XOC BIOS at 1.1V.


----------



## KedarWolf

Don't flash the MSI Lightning BIOS in the TechPowerUp database. It'll stop your PC from booting, and unless you have a spare video card or an iGPU you won't be able to flash it back and have a working PC.


----------



## Stiltz85

Guess you can add me to the club; my ASUS GTX 1080 Ti Poseidon arrived today. Keeping it air-cooled until I get my new case.
10068 Time Spy score after some light overclocking: http://www.3dmark.com/3dm/20945265?

A comparison to my old GTX 980, this thing is a beast!


----------



## KCDC

Quote:


> Originally Posted by *KedarWolf*
> 
> Don't flash the MSI Lightning BIOS in the TechPowerUp database. It'll stop your PC from booting, and unless you have a spare video card or an iGPU you won't be able to flash it back and have a working PC.


make a note in the bios forum OP! thanks dude


----------



## Clukos

Quote:


> Originally Posted by *KedarWolf*
> 
> Don't flash the MSI Lightning BIOS in the TechPowerUp database. It'll stop your PC from booting, and unless you have a spare video card or an iGPU you won't be able to flash it back and have a working PC.


Is it the BIOS, or an incompatibility with your Strix? I'm tempted to try that on my Gaming X.


----------



## KedarWolf

Quote:


> Originally Posted by *Clukos*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Don't flash the MSI Lightning BIOS in the TechPowerUp database. It'll stop your PC from booting, and unless you have a spare video card or an iGPU you won't be able to flash it back and have a working PC.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is it the BIOS, or an incompatibility with your Strix? I'm tempted to try that on my Gaming X.
Click to expand...

It's like the HOF BIOS, a totally different voltage controller; it will only work with the Lightning X and maybe the HOF cards.


----------



## Dasboogieman

Quote:


> Originally Posted by *Clukos*
> 
> Is it the BIOS, or an incompatibility with your Strix? I'm tempted to try that on my Gaming X.


Why would you want to? The Strix already has access to the best BIOS (the XOC) without performance limitations.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Stiltz85*
> 
> Guess you can add me to the club, My Asus GTX 1080 Ti Poseidon arrived today. keeping it air cooled until I get my new case.
> 10068 Time Spy score after some light overclocking: http://www.3dmark.com/3dm/20945265?
> 
> A comparison to my old GTX 980, this thing is a beast!


Nice upgrade.


----------



## navjack27

I'm joining the club


I might do some overclocking, but... why? It seems to max out on its own. Although, this being my first Pascal-based card, GPU Boost still seems to do that thing where, if what you're doing doesn't seem to require much oomph, it won't clock itself up. I'd like that to not be a thing.

Edit: have a firestrike run http://www.3dmark.com/fs/13066380


----------



## cluster71

Quote:


> Originally Posted by *KedarWolf*
> 
> Don't flash the MSI Lightning BIOS in the TechPowerUp database. It'll stop your PC from booting, and unless you have a spare video card or an iGPU you won't be able to flash it back and have a working PC.


I guess it's best to stay away from BIOSes belonging to graphics cards with more than two PCIe 8-pin power connectors.

I'm just a bit annoyed that I can't unlock memory voltage on the Zotac Extreme.


----------



## KedarWolf

Quote:


> Originally Posted by *cluster71*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Don't flash the MSI Lightning BIOS in the TechPowerUp database. It'll stop your PC from booting, and unless you have a spare video card or an iGPU you won't be able to flash it back and have a working PC.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I guess it's best to stay away from BIOSes belonging to graphics cards with more than two PCIe 8-pin power connectors.
> 
> I'm just a bit annoyed that I can't unlock memory voltage on the Zotac Extreme.
Click to expand...

Yes, that's correct.


----------



## cluster71

I just tested this, I do not know if it has any effect anymore? Only ATI ?


----------



## ZealotKi11er

Quote:


> Originally Posted by *cluster71*
> 
> I just tested this, I do not know if it has any effect anymore?


I am pretty sure that was for AMD GPUs back in the day and does nothing now.


----------



## ELIAS-EH

Hello all

Can I connect two PWM fans to a single fan header on my 1080 Ti Strix, or only one fan per header?
Each fan needs 0.38A max.
Thank you


----------



## cluster71

I have two 140mm fans. Noctua NF-A14 iPPC-2000 140mm. Just took a couple in the pile. 0.18 Amp apiece.


----------



## ELIAS-EH

Do you think the fan header on the 1080 Ti Strix is rated the same as the ASUS board fan headers, 1A?


----------



## cluster71

Zotac Extreme has three of these. In the same slot.


----------



## YamiJustin

Hey, I got the EVGA 1080 Ti SC2 Hydro Copper the other day. I have it installed in my nice custom loop with two quad radiators. Anyhow, I am new at getting stable overclocks. So far I've been using Valley and Fire Strike to test, although I have a copy of the Superposition benchmark as well. The problem is that I can't seem to determine what my first 'wall' is. My goal is at least a 2.1GHz clock. I seem to have it stable at a max clock of 2088, I think, but that's when running Fire Strike Ultra. If I go higher the stress test fails (saying I cancelled the test), and my only real feedback is GPU-Z, which says PWR is the performance cap. If that's the case, adding more voltage in MSI Afterburner won't help, right? Do I really need an unlocked BIOS to get around this? At max load my GPU is at about 35-40°C. I just want some advice on how to move forward. Get a custom BIOS with no voltage or power limit? If so, what then? Increase power by 10%? Increase voltage in 5mV intervals? If heat is not my issue, what is?

Thanks! - a noob


----------



## cluster71

I have the Zotac AMP Extreme and have tested many BIOSes, including the XOC BIOS without the software limit. You still have a hardware limit if you do not shunt mod. Not many run over 2100 because you don't gain much after that. I've run 2200 at 1.2V with my card several times, but the performance does not get much better than at 2100.

I have now returned to the original 384W BIOS. It works well. The big gain I've realized is temperature. I have changed the paste, pads and fans on my card (air-cooled). If I hold my card under 40 degrees Celsius, the card is completely stable at 2126 core / 12110 mem at 1.093V. I've been playing Sniper 3 for three hours right now at full clock speed. The card has been at 35°C the whole time (the PC is in the basement, so I don't hear fans etc).

If you keep the card cool, it's no problem to trim it, but it may take some time. If you run water, you have a big advantage.


----------



## cluster71

You can get help here on the thread. I ran benchmark programs continuously and increased the memory frequency until I found the limit, without touching the core. Then I took the core and did the same until I found the limit. After that, try combining them and see what works for your card.

Benchmark programs usually run in short bursts, so you don't know if the card can handle the speed continuously. You can run benchmarks all the time, but I prefer to play and see if it's stable. Increase a bit and continue. I'm getting bored with all the tweaking; it feels like you're just wasting a lot of time and then you're back where you were in the beginning.
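The one-variable-at-a-time search described above can be sketched as a small loop. `is_stable` here is a hypothetical stand-in for whatever test you actually run (a benchmark pass or an hour of gaming), so treat this as the shape of the procedure rather than a tool:

```python
# Step one offset (memory OR core, never both at once) until the stability
# test fails, then report the last offset that passed.

def find_max_offset(is_stable, start=0, step=25, limit=1500):
    """Return the highest offset (MHz) that still passes is_stable."""
    best = start
    offset = start + step
    while offset <= limit and is_stable(offset):
        best = offset
        offset += step
    return best

# Pretend this card's memory fails above +960 MHz:
print(find_max_offset(lambda off: off <= 960, step=20))  # prints 960
```

Run it once for memory with the core at stock, once for core with the memory at stock, then verify the combination separately, since the two limits interact.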


----------



## Sgang

Hi @cluster71, with your default settings what's your score in Valley, Fire Strike or Time Spy?

I have a 1080 Ti AMP Extreme Core Edition, and I was thinking of flashing the Extreme BIOS (do you know where I can find it?







) The default is already at 1607MHz, but I've seen a noticeable gain in game performance from clocking the GPU a little higher.


----------



## cluster71

I don't know right now, but check "my rig" in my profile.

Time Spy 11129 graphics score. If you end up above 11000, it works as it should.


----------



## cluster71

https://www.techpowerup.com/vgabios/

You have many to choose from. I have no idea if there is any hardware difference between the Extreme and the Core Edition, or if it's just the BIOS.

The Zotac Extreme and Zotac Arctic ones feel okay.

https://www.techpowerup.com/vgabios/192501/192501
https://www.techpowerup.com/vgabios/192147/zotac-gtx1080ti-11264-170327


----------



## nexus35

Quote:


> Originally Posted by *cluster71*
> 
> I don't know right now, but check "my rig" in my profile.
> 
> Time Spy 11129 graphics score. If you end up above 11000, it works as it should.


10304 for me.

I saw your post, but I must be missing something.
Can I send you the link to my bench?


----------



## cluster71

Okay, 10000 is good; you have to fight for the last bit.


----------



## nexus35

http://www.3dmark.com/spy/2009198

What can I improve?
RAM timings? Push the CPU?

I think the big scores are related to big CPUs, am I wrong?


----------



## cluster71

Yes, many benchmarks are affected by the CPU and system performance. They are not always fair; it's the gaming performance you want to check. So you can check Fraps while playing as well.


----------



## nexus35

Thank you for your devotion. I'm going to bed,
thank you


----------



## cluster71

Check Windows to see if everything is optimized? Is any power limit showing up in MSI Afterburner? Try increasing the core to 2100?

It may take some time to analyze. It took me a month to optimize my PC and everything.

Okay


----------



## KedarWolf

*As well, in Power Options in Control Panel, set your Maximum Frequency above your max CPU speed.

Put your Minimum Processor State at 100% for benching.

Also, under PCI Express, turn Link State Power Management off.*

The above was added to the 'How To Maximize Your Memory Overclock' section.
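The same Power Options tweaks can be scripted with powercfg from an elevated prompt. This dry-run sketch only prints the commands it would run; the setting aliases (`PROCTHROTTLEMIN`, `ASPM`) are my assumption of the standard ones, so check them with `powercfg /aliases` on your machine before running anything for real:

```python
# Build the powercfg commands for the tweaks above (Windows only;
# dry run: the commands are printed, not executed).

def powercfg_commands(min_proc_state=100):
    return [
        # Minimum processor state -> 100% for benching
        f"powercfg /setacvalueindex scheme_current sub_processor PROCTHROTTLEMIN {min_proc_state}",
        # PCI Express Link State Power Management -> Off (value 0)
        "powercfg /setacvalueindex scheme_current sub_pciexpress ASPM 0",
        # Re-apply the active scheme so the changes take effect
        "powercfg /setactive scheme_current",
    ]

for cmd in powercfg_commands():
    print(cmd)
```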









XOC BIOS at 2088 Core, 6220 memory.


----------



## propeldragon

Quote:


> Originally Posted by *MunneY*
> 
> Its not a voltage problem. It happens with SLI off and only 1 card in the system


Did you ever figure out what was wrong? I had a similar issue. I think I fixed it, but I haven't been playing too many games to verify, and I don't do benchmarks.


----------



## Sgang

I was stuck at a 9873 total score (1080 Ti Zotac AMP Extreme Core, x39 Ryzen 1800X), then I raised the maximum frequency to 4500 and my score improved to 9955.

What other things can I change?


----------



## ottoore

http://www.overclock.net/t/1606562/official-nvidia-titan-x-pascal-owners-thread/5580_30#post_25743105


----------



## Quadrider10

So how hot do these cards normally get? I've got an EVGA SC2 that's hitting about 73°C average. Do you guys think changing the thermal paste for better stuff, or using liquid metal, would be worth it?


----------



## FUZZFrrek

Quote:


> Originally Posted by *Quadrider10*
> 
> So how hot do these cards normally get? I've got an EVGA SC2 that's hitting about 73°C average. Do you guys think changing the thermal paste for better stuff, or using liquid metal, would be worth it?


Just by changing the fan curve to a more aggressive one, the temps will drop significantly. I have an FTW3 on MX-2 paste, and under full load I run at just under 56-57°C. I'll put some CLU on when it arrives tomorrow to see if there are any tangible temp drops.


----------



## Quadrider10

The fan speeds are ramping to 60% on their own. I just took my side cover off and the temps are only 1°C different, so it's not an issue with my case... Idle temps are a bit warm too, around 40-50°C, but that's with 2 case fans blowing directly onto the card's heatsink. Also, I had a GTX 1070 in here prior and that thing never hit 70°C, and idle was around 30°C.


----------



## FUZZFrrek

Quote:


> Originally Posted by *Quadrider10*
> 
> the fan speeds are ramping to 60% on their own.


I cranked the GPU fan all the way to 85% to make sure it stays cool. I don't mind the sound, so...

Also, if you have some spare fans and headers on your motherboard, you can put a fan on the card to either push or pull some air on/off it. That can easily reduce the temps by up to 4-5°C! An easy DIY ghetto mod.


----------



## jleslie246

I finally did some benching on my 1080 Ti to see if it's a good one. I have not pushed it to crash yet, and I have done no BIOS mods. All I did was tweak EVGA Precision X settings. Let me know if this is decent. I have a small OC on my 6700K as well (4.6GHz).

FireStrike score was 21990



I'd appreciate any tips to push it further. Thanks.


----------



## Quadrider10

Quote:


> Originally Posted by *jleslie246*
> 
> I finally did some benching on my 1080 Ti to see if it's a good one. I have not pushed it to crash yet, and I have done no BIOS mods. All I did was tweak EVGA Precision X settings. Let me know if this is decent. I have a small OC on my 6700K as well (4.6GHz).
> 
> FireStrike score was 21990
> 
> 
> 
> I'd appreciate any tips to push it further. Thanks.


What are your temps like?


----------



## jleslie246

Quote:


> Originally Posted by *Quadrider10*
> 
> What are your temps like?


35-36c under load


----------



## FUZZFrrek

Quote:


> Originally Posted by *jleslie246*
> 
> 35-36c under load


Are you on water? These are some crazy low temps, man!


----------



## sew333

Hi all! First off, I have no OC utilities installed, nothing like Afterburner or Precision. The only programs are Fraps, GPU-Z, CPU-Z and Steam, so I'm not manually tweaking the card. Stock settings, fan on auto.

I had a strange issue. Normally my GTX 1080 Ti FE (stock, reference) hits 83-84°C max; most of the time the GPU sits at 83-84°C in heavy gaming with the fan at 49% on auto.

I tried some benchmarks like 3DMark and Heaven, and the GPU quickly reached 85-86°C at 60% auto fan speed.

At first I thought there was something wrong with my cooling, but everything is fine, no dust.

After that I started 3DMark at 1080p and the GPU reached 85-86°C again. I decided to watch everything on the GPU with GPU-Z, especially the clocks. Here's the strange thing: the GPU was not downclocking. Normally my card downclocks when it reaches 83-84°C to maintain the temp target.

I rebooted my PC and everything was normal: 83°C max temp, and the GPU downclocked as it should to maintain the temp target.

Now my question is: what in the world was that? The boost clock is controlled by the driver, isn't it? Maybe the driver went nuts. I reinstalled it using DDU and performed a clean install just to be sure.
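For reference, the throttle check I was doing by eye in GPU-Z can be sketched as a small log parser. This is only a sketch: the `nvidia-smi` query fields and the 84°C/1850MHz thresholds are assumptions, and the sample line below is canned so the parsing logic runs without a GPU.

```python
# Sketch: parse output in the format of
#   nvidia-smi --query-gpu=temperature.gpu,clocks.gr,power.draw --format=csv,noheader,nounits
# and flag samples that sit above the temp target without downclocking.
import csv
import io

def parse_gpu_log(csv_text):
    rows = []
    for line in csv.reader(io.StringIO(csv_text)):
        temp, clock, power = (field.strip() for field in line)
        rows.append({"temp_c": int(temp), "clock_mhz": int(clock),
                     "power_w": float(power)})
    return rows

# Canned sample so this runs without a GPU:
sample = "86, 1885, 248.3\n83, 1823, 239.9\n"
log = parse_gpu_log(sample)

# Samples over the 84C target that did NOT drop below ~1850 MHz:
suspicious = [r for r in log if r["temp_c"] > 84 and r["clock_mhz"] > 1850]
```

A real run would just pipe `nvidia-smi`'s CSV into the same parser once a second and log anything flagged.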

System info:
6700K @ Stock
16GB DDR4
Asus Z170-P Motherboard
Corsair 750 RM
Gtx 1080 TI FE stock

Thank you in advance!









Apologies for my English; it isn't my native language. Thank you.


----------



## Simkin

If noise is not an issue while gaming, I strongly suggest setting up a custom fan profile. I'm using Afterburner myself and my FE card hits a little over 60°C in BF4. You also have the option to set up the OSD, which is a really nice feature to have.

Sounds strange that it's not downclocking at that temp. My FE downclocks at 60°C to 1999MHz.

Sent from my D6503 via Tapatalk


----------



## sew333

Sorry, I meant 85-86°C and not downclocking. After a restart it stabilized again at 83-84°C with throttling.


----------



## jleslie246

Quote:


> Originally Posted by *FUZZFrrek*
> 
> Are you on water .? This is some crazy low temps man!


Yes. Custom water loop


----------



## KeroBeron

Hi Guys,

I'm rather new to this whole overclocking business and GPU Boost 3.0. I've tried Google but haven't really found any real solutions.

I notice that I can add +150MHz (boost offset?) to my core, which allows me to reach 1974MHz. Anything beyond that and I start crashing.

Unfortunately, half the time I can't stay at 1974MHz, purely because my voltage is hovering around 850 to 889mV (core clock 1830MHz). Occasionally (15-20% of the time) it shows 912-937mV, allowing the 1974MHz clock at 100% power limit.

I don't really get why it's undervolting. It's not as if it doesn't have enough power; otherwise it wouldn't run at 1974MHz 20% of the time. My temperatures are around 63 degrees with the fan fixed at 65%.

Any attempts to change the voltage or force a constant voltage have been in vain in MSI Afterburner/Precision/NVIDIA Inspector; they simply do nothing when I adjust the slider.

Can someone give me a clue as to why it seems to be randomly undervolting, or perhaps more accurately, using the minimum voltage possible for my overclock settings?


----------



## ALSTER868

Quote:


> Originally Posted by *KeroBeron*
> 
> Hi Guys,
> 
> I'm rather new to this whole overclocking business and Turbo boost 3. I've tried google search but haven't really found any real solutions.
> 
> I notice that I can add +150Mhz (+Boost?) to my core which allows me to reach 1974Mhz. Anything after that and I start crashing.
> 
> Unfortunately, half the time I won't be able to remain at 1974Mhz purely because my voltage is hovering around 850 to 889mV (CC - 1830Mhz). Occasionally (15-20% of the time) show 912-937mV allowing the 1974 clock @ 100 power limit.
> 
> I don't really get why it's undervolting. It's not as if it doesn't have enough power otherwise, it wouldn't run 20% of the time. My temperatures are running around 63 degrees when I have the fan switched at 65%.
> 
> Any attempts to change voltage/force a constant voltage have been in vain in MSIBurner/Precision/NVidiaInspector. They simply do nothing when I adjust the scale.
> 
> Can someone give me a clue as to why it seems to be randomly undervolting or perhaps more accurately using the minimal amount of voltage possible for my overclock settings.


First, raise the power limit to the maximum if you haven't already.


----------



## KeroBeron

Raising the power limit lets me reach higher voltages. Thanks!

Are my numbers of 1911MHz vs 1830MHz just freak accidents, though? It seems odd that I could get both under the same conditions (power limit, temp limit, etc.).

I can only assume this is related to boost doing its thing in the background?


----------



## zGunBLADEz

Guys, a question: do the 1080 and 1080 Ti Founders cards use the same VRM plate (the one that goes on top)? I'm asking because the regular reference water blocks fit both versions of the reference PCB (1080/1080 Ti).

I'm planning to buy a 1080 Ti. I have a 1080 that I had to modify to fit a universal EK block on it, and I'm wondering if the VRM plates are the same so I can just swap the plate from my 1080 onto the 1080 Ti.


----------



## ZealotKi11er

Quote:


> Originally Posted by *zGunBLADEz*
> 
> guys question 1080 and 1080 ti founders is the same vrm plate? that goes intop?
> im asking this because the regular reference water blocks fits both versions of reference pcb 1080/1080ti
> 
> Im planning to buy a 1080ti i have a 1080 which i have to modified to put an universal ek block on it and im wondering if the vrm plates are the same so i can just swap ot the plate from my 1080 to the 1080ti.


They are closely matched. I had to make some changes to make it fit to 1080 Ti but nothing huge. You might need more thermal pads though.


----------



## zGunBLADEz

Quote:


> Originally Posted by *ZealotKi11er*
> 
> They are closely matched. I had to make some changes to make it fit to 1080 Ti but nothing huge. You might need more thermal pads though.


What exactly did you do, other than the thermal pads?




That's what I did to my 1080.

But if they can use the same plate just by adding thermal pads, that's no biggie lol,

because I'd just swap plates and sell the 1080 with the 1080 Ti's front VRM plate...

The cut I made doesn't impede putting the cooler back on or affect cooling performance at all; as you can see there are no chips there, but you know how people are...

The card will do 2101/550 at a mere 1.000V all day long lol, she loves low voltages


----------



## ALSTER868

Quote:


> Originally Posted by *KeroBeron*
> 
> When raising the power limit, I'm able to get higher voltages. Thanks!
> 
> Are my numbers of 1911Mhz vs 1830Mhz a result of freak accidents though? It seems odd that I could theoretically get both under the same conditions (power limit, temp limit etc)
> 
> I can only assume this is related to boost going on in the background?


Boost behaves differently depending on voltage, temps and power draw. If the card is pulling more power than it's set to, it will drop clocks and voltage to stay within its limits; same thing with temps.
If you want to keep clocks high, first raise the power limit and temp limit, and do your best to keep your temps down.
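As a toy model of that priority order (not NVIDIA's actual algorithm; the ~13MHz bin size, the limits and the base clock here are all assumptions), boost keeps shaving bins until every limiter is satisfied:

```python
def boost_clock(requested_mhz, temp_c, power_w,
                temp_limit_c=84, power_limit_w=250, step_mhz=13):
    """Toy GPU Boost model: drop one ~13 MHz bin per degree over the temp
    limit and one per ~5 W over the power limit, never below base clock."""
    clock = requested_mhz
    if temp_c > temp_limit_c:
        clock -= step_mhz * (temp_c - temp_limit_c)
    if power_w > power_limit_w:
        clock -= step_mhz * int((power_w - power_limit_w) // 5 + 1)
    return max(clock, 1480)  # assumed 1080 Ti FE base clock

boost_clock(1974, 80, 240)   # under both limits: holds 1974
boost_clock(1974, 87, 240)   # 3C over the temp target: drops to 1935
```

The point of the sketch is just that clocks, temps and power are coupled: raising the temp and power limits widens the region where the first branch never fires.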


----------



## sew333

I think my 1080 Ti FE is faulty or something. After a PC reboot I ran Time Spy looped. After a few minutes the card was loud as hell; the fan went to 62% on auto. Temps hit 87°C and there was no clock throttling, the clocks stayed high. Can anybody confirm whether this is card-related?


----------



## ZealotKi11er

Quote:


> Originally Posted by *sew333*
> 
> I think my 1080 Ti Fe is faulty or something. After pc reboot i run Time Spy looped. After few minutes card was loud as hell, fan goes to 62% on auto. Temps go to 87C and there was not clock throttling,clocks was high . Somebody can confirm its related to card or ?


Had that once. My card hit 93C for some reason. I think a driver clean fixed it?


----------



## sew333

I've installed the newest drivers now. Hope it helps. Running Time Spy looped now.


----------



## Quadrider10

So after swapping the thermal paste on the SC2 I'm still getting the same temps of 73-75°C under load. The only thing that obviously reduces temps is raising the fan speed.

What programs out there can adjust both fan speeds on this card besides Precision?

Also, what about the XOC BIOS? Will both fans be controlled as one fan in Afterburner?


----------



## 2ndLastJedi

Quote:


> Originally Posted by *zGunBLADEz*
> 
> what exactly you did other than the thermal pads??
> 
> 
> 
> 
> Thats what i did to my 1080
> 
> But if they can use the same plate just adding thermal pads thats no biggie lol
> 
> because i just swap plates and sell the 1080 with the 1080ti front vrm plate ...
> 
> The cut i did, dont impede putting the cooler back no problems or affect any cooling performance at all as you see theres no chips there, but you know how people are...
> 
> The card will do 2101/550 with a mere 1.000v all day along lol she loves low voltages


First thing to do is quit smoking, that'll gain you 20 yrs.


----------



## kaosstar

I've been mining with four 1080 Tis for a couple of months. One thing I've noticed is the high variability in hashrates between cards: even at the same settings, they vary by up to 5% between cards. This is a bit unusual compared to previous generations of cards I've mined with. I hear it has more to do with the memory silicon lottery and the memory intensity of the latest mining algorithms. I'm wondering if GDDR5X varies so much due to its newness?

Anyway, now that mining is heading toward unprofitability, I'm considering selling two and SLIing the other two.

How's SLI these days, in general or compared to CrossFire? I'm pretty experienced with CrossFire, but the Pascal cards are the first NVIDIA cards I've owned since the 6xx series.


----------



## ZealotKi11er

Quote:


> Originally Posted by *kaosstar*
> 
> I've been mining with 4 1080tis for a couple months. One thing I've noticed is the high variability in hashrates between cards. Even at the same settings, they exhibit up to 5% variability between cards. This is a bit unusual compared to previous generations of cards I've mined with. I hear it has more to do with the memory silicon lottery, and memory intensity of the latest mining algorithms. I"m wondering if GDDR5X is varying so much due to the "newness"?
> 
> Anyway, now that mining is heading toward unprofitibility, I'm considering selling 2 and SLIing 2.
> 
> How's SLI these days, in general or compared to Xfire? I'm pretty experienced with Xfire, but the Pascal cards are the first Nvidia cards I've owned since the 6XX series.


Same as CFX.


----------



## kaosstar

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Same as CFX.


Maybe I ought to just stick with one.


----------



## ZealotKi11er

Quote:


> Originally Posted by *kaosstar*
> 
> Maybe I ought to just stick with one.


If you are not playing 4K stick with 1.


----------



## KedarWolf

Quote:


> Originally Posted by *kaosstar*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Same as CFX.
> 
> 
> 
> Maybe I ought to just stick with one.

Or buy a 4K 144Hz G-Sync screen and go two cards.









http://wccftech.com/g-sync-hdr-monitors-available-q2-2017/


----------



## KCDC

SLI helps me a ton at 7680x1440 @ 60fps, for the games that support it.

SLI for high res or max FPS, or multi cards for gpu rendering. I love them with Redshift, they scream with that render engine.


----------



## Amuro

Can someone explain this result? Thanks.



I've done it 5x with the same result, but last week I only got 20k-21k, with the same OC and underclock.


----------



## Asus11

Guess I'm an owner... didn't think I would be, but the price was too good.


----------



## sew333

Hi. If I may ask again: my GTX 1080 Ti FE is doing weird things. Sometimes temps go above the stock temp target, up to 87-88°C with the fan at 60% on auto, GPU-Z doesn't show thermal throttling, and the clocks stay high. Any ideas why this is happening? I'm not tweaking the card, everything is stock, no OC utilities installed. The card is doing this on its own.

Tested on the 382.05 and 378.69 drivers.


----------



## PimpSkyline

Quote:


> Originally Posted by *Asus11*
> 
> guess im a owner.. didn't think I would be but the price was too good


----------



## Bolis

Hi guys,

I recently sold my 1070 AMP Extreme to a miner in order to buy a 1080 Ti Strix. I'm liking the speed, but the power limit is bizarre. I undervolted my card to run 2000MHz @ 1V together with a +650 memory OC. Would it be worthwhile to flash the XOC BIOS to remove the limit and allow higher voltage? Given the single BIOS the card has, and my warranty, I don't want to take risks in flashing. If the higher voltage (1.2V) could increase performance by, say, 10%, I'd consider it worthwhile. Any advice?


----------



## PimpSkyline

Hey guys, is there a GPU 4-pin to 3-pin adapter? Like, to have the GPU fan run off my mobo instead of the GPU itself?

A quick Google search just turned up stuff about hooking more fans to the GPU, but I want the opposite.

Thanks

https://www.moddiy.com/products/Mini-4%252dPin-GPU-to-4%252dPin-Fan-Adapter.html

This ^^ But is there a place I can buy 2 or 3 for less than $20-$30?


----------



## 2ndLastJedi

Quote:


> Originally Posted by *PimpSkyline*
> 
> Hey guys, is there a GPU 4-Pin to 3-Pin adapter? Like have the GPU fan run off of my Mobo instead of the GPU itself?
> 
> A quick google search just had stuff about hooking more fans to the GPU, but i want the opposite.
> 
> Thanks
> 
> https://www.moddiy.com/products/Mini-4%252dPin-GPU-to-4%252dPin-Fan-Adapter.html
> 
> This ^^ But is there a place i can buy 2 or 3 for less than $20-$30?


Why would you want to take control of the GPU fans? Not being smart, just genuinely wondering. Does it allow more power for the GPU or something?


----------



## Dasboogieman

Quote:


> Originally Posted by *Bolis*
> 
> Hi guys,
> 
> I recently sold my 1070 AMP Extreme to a miner in order to buy a 1080ti Strix. Im liking the speed, but the power limit is bizarre. Onderclocked my card to put out 2000mhz @ 1v together with a +650 memory oc. Would it be worthwhile to flash to XOC for driver limit removal & higher voltage? Hence the single bios the card has and my warrenty I don't want to take risks in flashing. If the higher volts (1.2v) can increase performance by say 10% I would consider it worthwhile. Any advice?


Normally no, but the XOC BIOS was designed for the Strix. Get some good cooling and watch the performance fly.
Quote:


> Originally Posted by *2ndLastJedi*
> 
> Why would you want to take control of GPU fans? Not being smart just truly wandering? Does it allow more power for the GPU or something?


Actually yes, but it's only 5-10W total at best.
I actually gained a tiny bit of extra TDP headroom when I went to water, partially because of the colder temps and partially because I no longer have to power three fans and an LED plate.


----------



## PimpSkyline

Quote:


> Originally Posted by *2ndLastJedi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PimpSkyline*
> 
> Hey guys, is there a GPU 4-Pin to 3-Pin adapter? Like have the GPU fan run off of my Mobo instead of the GPU itself?
> 
> A quick google search just had stuff about hooking more fans to the GPU, but i want the opposite.
> 
> Thanks
> 
> https://www.moddiy.com/products/Mini-4%252dPin-GPU-to-4%252dPin-Fan-Adapter.html
> 
> This ^^ But is there a place i can buy 2 or 3 for less than $20-$30?
> 
> 
> 
> Why would you want to take control of GPU fans? Not being smart just truly wandering? Does it allow more power for the GPU or something?

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bolis*
> 
> Hi guys,
> 
> I recently sold my 1070 AMP Extreme to a miner in order to buy a 1080ti Strix. Im liking the speed, but the power limit is bizarre. Onderclocked my card to put out 2000mhz @ 1v together with a +650 memory oc. Would it be worthwhile to flash to XOC for driver limit removal & higher voltage? Hence the single bios the card has and my warrenty I don't want to take risks in flashing. If the higher volts (1.2v) can increase performance by say 10% I would consider it worthwhile. Any advice?
> 
> 
> 
> Normally no, but the XOC BIOS was designed for the Strix. Get some good cooling and watch the performance fly.
> Quote:
> 
> 
> 
> Originally Posted by *2ndLastJedi*
> 
> Why would you want to take control of GPU fans? Not being smart just truly wandering? Does it allow more power for the GPU or something?
> 
> 
> Actually yes but its only 5-10W total at best.
> I actually gained a tiny extra bit of TDP when I went to water. Partially because of the colder temps, partially because I don't have to power 3 fans and an LED plate anymore.

Actually, yes, it will help with TDP issues. I noticed on my FE card that going from 23% to 100% fan is about an 11-14% TDP increase (4% to 18% TDP in the chart). So I'm thinking that by putting the fan on my mobo I'll gain 10-12% TDP headroom at the top end under full load, so less TDP throttling. Plus I'm going to re-TIM the GPU for hopefully a 1-3°C drop in temps.
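Taking those reported percentages at face value, the back-of-envelope version looks like this (the 250W figure is the FE's stock power target; the 4% and 18% numbers are from my chart above, so this is only as good as those readings):

```python
BOARD_POWER_W = 250                    # 1080 Ti FE stock power target
fan_full_w = 0.18 * BOARD_POWER_W      # ~18% of TDP reported at 100% fan
fan_idle_w = 0.04 * BOARD_POWER_W      # ~4% of TDP reported at 23% fan
headroom_w = fan_full_w - fan_idle_w   # watts freed by moving the fan off-card
headroom_pct = 100 * headroom_w / BOARD_POWER_W
```

That works out to roughly 35W, about 14% of the power budget in the worst case, which is in the same ballpark as the 10-12% above.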


----------



## ZealotKi11er

Quote:


> Originally Posted by *PimpSkyline*
> 
> Actually yes it will help TDP issues. I noticed on my FE card, that 23% to 100% fan is about 11-14% TDP increase or 4% to 18% TDP in the chart. So i am thinking putting the fan on my Mobo, i will gain 10-12% TDP headroom at the top end or Full Load, so less TDP throttling. Plus I'm gonna re-TIM the GPU for hopefully a 1-3C drop in temps.


Why would the fan be using the GPU VRM? GPU runs at different voltage than the fan.


----------



## Dasboogieman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Why would the fan be using the GPU VRM? GPU runs at different voltage than the fan.


It depends on the wiring of each card. The ideal for fans is to have the power drawn directly from the PCIe slot/8-pin before going through the shunt, with only PWM traces to the controller.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dasboogieman*
> 
> Depends on the wiring of each card. The ideal for fans is to have the Power drawn directly from the PCIe/8pin slot before going through the shunt with only PWM traces to the controller.


It would be interesting for someone to test the FE card. I run the Hybrid kit, so both the fan and pump run off the fan header.


----------



## PimpSkyline

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PimpSkyline*
> 
> Actually yes it will help TDP issues. I noticed on my FE card, that 23% to 100% fan is about 11-14% TDP increase or 4% to 18% TDP in the chart. So i am thinking putting the fan on my Mobo, i will gain 10-12% TDP headroom at the top end or Full Load, so less TDP throttling. Plus I'm gonna re-TIM the GPU for hopefully a 1-3C drop in temps.
> 
> 
> 
> Why would the fan be using the GPU VRM? GPU runs at different voltage than the fan.

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Why would the fan be using the GPU VRM? GPU runs at different voltage than the fan.
> 
> 
> 
> Depends on the wiring of each card. The ideal for fans is to have the Power drawn directly from the PCIe/8pin slot before going through the shunt with only PWM traces to the controller.

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dasboogieman*
> 
> Depends on the wiring of each card. The ideal for fans is to have the Power drawn directly from the PCIe/8pin slot before going through the shunt with only PWM traces to the controller.
> 
> 
> 
> Would be interest for someone to test the FE card. I run the Hybrid kit so both fan and pump are from the fan wire.











The GPU fan, on the FE anyway, is for some reason counted against the GPU TDP limit, so basically the whole card stays below the TDP (core/mem/fan/LEDs/etc.).

I found the wire and ordered it. Damn thing cost $12.99 including S/H for a 4-inch wire and 2 connectors. Geez lol. Will follow up when I get it.

Would you guys like to see what the difference is then?


----------



## becks

Quote:


> Originally Posted by *PimpSkyline*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The GPU Fan for some reason, on the FE anyways, is taking into account with the GPU TDP limits, so basically the Whole card will stay below the TDP. (Core/Mem/Fan/LED's/etc)
> 
> I found the wire, and ordered it. Dam thing cost $12.99 including S/H for a 4 inch wire and 2 connectors. Geez lol Will follow up when I get it.
> 
> Would you guys like to see then, what the diff is?


I think you'll get it cheaper/better (maybe even sleeved) if you go to this club on OCN and ask whether someone can make it for you...


----------



## sew333

Sometimes the temp limit stops working on my card, and temps go as high as 87°C with no throttling. Any ideas? The card is a 1080 Ti FE, stock, no OC, no programs installed. Rebooting the PC cures it.

I've tried the 378.92 and 382 drivers.


----------



## Bolis

Quote:


> Originally Posted by *Dasboogieman*
> 
> Normally no, but the XOC BIOS was designed for the Strix. Get some good cooling and watch the performance fly.


My temps are at 48°C under full bench/gaming load thanks to a custom fan profile sloping up to 55%. I also have two 120mm intakes blowing under the card for fresh air.

I'm gonna give it a go then. 1.2V is the limit for air/water, right? I read that the hard-coded NVIDIA power limit is still in the way. Is that correct?


----------



## Dasboogieman

Quote:


> Originally Posted by *Bolis*
> 
> My temps are at 48C full bench/gaming load due to custom fan profile slope moving to 55%. Also have two 120mm intake blowing under de card for fresh air.
> 
> I'm gonna give it a go then. 1.2v is the limit for air/water right? I read that the hard coded nvidia power limit is still in the way. Is thay correct?


Yup, the hardcoded limit is still there. You can actually observe it: it's the Normalized TDP reading in HWiNFO. From some testing I've done and analysis of Buildzoid's data, you can somewhat control the hardware TDP in the BIOS.

For example, my GTX Aorus is running the Arctic Storm BIOS. My card uses 002 shunts from the factory, and the AS BIOS doesn't take this into account, unlike the stock BIOS. My stock F4 BIOS generally has the TDP and Normalized TDP within 1W of each other, with both calibrated to 150%.

However, on the AS BIOS I see 2.5 times less reported TDP draw (essentially my PCB power controller is under-reporting power, because it never goes above 50%), but the Normalized TDP still shows the correct power limit at 120%. Judging by temps and power readings, it's definitely drawing close to 400W+.

TL;DR: the hardware power controller seems to be influenced by the BIOS, so it's not entirely independent. Which makes sense: how else would it know what to do if it didn't receive instructions from somewhere?
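The under-reporting factor is just Ohm's law: the controller measures the voltage across the shunt and divides by whatever resistance the BIOS tells it to assume. A sketch (the resistor values are illustrative, chosen to reproduce the ~2.5x figure, not measured from any actual card):

```python
def reported_power(actual_power_w, real_shunt_mohm, assumed_shunt_mohm):
    """Sense voltage is V = I * R_real, but the controller computes
    I = V / R_assumed, so reported power scales by R_real / R_assumed."""
    return actual_power_w * (real_shunt_mohm / assumed_shunt_mohm)

reported_power(400, 2.0, 5.0)  # a 400 W draw reads back as 160 W
```

So a BIOS that assumes a 2.5x larger shunt than the board actually has will show 2.5x less power, which is exactly the pattern above.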


----------



## Evangelion

Has anyone tried undervolting their Founder's Edition 1080 Ti? Thinking about picking one up and putting it in an NCASE M1 and I wanted to know if anyone has tried undervolting their card to reduce the noise and temperature.


----------



## EdBald

del


----------



## zGunBLADEz

Does anybody know the size in mm (or can measure) of the square on the mem/VRM plate where the heatsink opening goes on the Founders card?

It looks bigger than the 1080's, but I'm not 100% sure.


----------



## ZealotKi11er

Quote:


> Originally Posted by *PimpSkyline*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The GPU Fan for some reason, on the FE anyways, is taking into account with the GPU TDP limits, so basically the Whole card will stay below the TDP. (Core/Mem/Fan/LED's/etc)
> 
> I found the wire, and ordered it. Dam thing cost $12.99 including S/H for a 4 inch wire and 2 connectors. Geez lol Will follow up when I get it.
> 
> Would you guys like to see then, what the diff is?


Test Superposition 4K and see what the boost clocks are, and also how high the core goes.


----------



## zGunBLADEz

Never mind, the EK Supremacy VGA is a no-go, but the Koolance 220 came to the rescue.


----------



## PimpSkyline

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PimpSkyline*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The GPU Fan for some reason, on the FE anyways, is taking into account with the GPU TDP limits, so basically the Whole card will stay below the TDP. (Core/Mem/Fan/LED's/etc)
> 
> I found the wire, and ordered it. Dam thing cost $12.99 including S/H for a 4 inch wire and 2 connectors. Geez lol Will follow up when I get it.
> 
> Would you guys like to see then, what the diff is?
> 
> 
> 
> Test Superposition 4K and see how much the Boost Clock are. See also how high the core goes.

I will. Just waiting on the parts.


----------



## Alex132

Just placed an order for an FTW3. Only 2 came into stock and they sold out within minutes here.


----------



## zGunBLADEz

That Koolance block works better than the EK Supremacy, my god... Not even my 1080 had temps like these at 1.00V.

I used Kryonaut this time instead of PK-1, though I doubt that alone made such a dramatic change.


----------



## ZealotKi11er

Quote:


> Originally Posted by *zGunBLADEz*
> 
> 
> 
> That koolance block works better than ek supremacy my god ... Not even my 1080 have those temps with 1.00V .
> 
> I used kryonaut this time instead of pk1 doubt that would made that dramatic change like that


Those are insane temps lol. I don't even get that at idle. What are your ambient temps?


----------



## zGunBLADEz

84°F ambient inside; case temp 34°C.


----------



## Dasboogieman

Quote:


> Originally Posted by *zGunBLADEz*
> 
> 
> 
> That koolance block works better than ek supremacy my god ... Not even my 1080 have those temps with 1.00V .
> 
> I used kryonaut this time instead of pk1 doubt that would made that dramatic change like that


The EK block isn't that bad, though. It depends on the rad + pumping power.



Btw, that TDP number is misleading. Because the AS BIOS under-reports the TDP, it's roughly 350W actually being drawn.

Accounting for my 20°C ambient, the delta-T is very similar to the Koolance block's.


----------



## zGunBLADEz

I don't know if it was a bad mount, which I doubt, because I got very nice temps on that EK, especially last summer.

I'm now getting the same temps in summer, on a higher-TDP card, that I used to get in winter on the EK. I don't even want to see this Ti in winter lol.

But that Koolance 220 with Kryonaut is giving me way better temps on a higher-TDP card in the same loop config.

EK Supremacy VGA + Prolimatech PK-1 vs Koolance 220 + Kryonaut.


----------



## ZealotKi11er

Quote:


> Originally Posted by *zGunBLADEz*
> 
> 
> 
> I dont know if it was bad mount which i doubt because i got very nice temps on that ek specially on summer last year.
> 
> I'm getting the same temps as I usually got on winter on the ek now been summer on a higher tdp card, I dont even want to see now that ti on winter lol.
> 
> But that koolance 220 with kryonaut is giving me way better temps in a higher tdp card in the same loop config.
> 
> EK Supremacy VGA + Prolimatech PK1 vs Koolance 220 + Kryonaut.


Oh, you're using a "CPU" block. No wonder the temps.


----------



## zGunBLADEz

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Oh yo using a "CPU" block. No wonder the temps.


Yeah, that's why I was asking about the size for the block lol.


----------



## ZealotKi11er

Quote:


> Originally Posted by *zGunBLADEz*
> 
> 
> yeah thats why i was asking about the size for the block lol


Did you do shunt mod or something?


----------



## zGunBLADEz

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Oh yo using a "CPU" block. No wonder the temps.


Quote:


> Originally Posted by *ZealotKi11er*
> 
> Did you do shunt mod or something?


No, but I played a lot with the 1080, and I mentioned this in the 1080 forums and nobody paid attention lol. I don't hit the PL because of this. My clocks never throttled on my 1080 for the whole year I've had it, and she got good usage, let me tell you; not even once did she throttle.

Throwing more volts at Pascal is no good, so when you're overclocking you need to find the lowest voltage possible at the max stable overclock. It's pretty easy to find the wall right away lol.

That's exactly what I did here: turn on the PC, run Heaven, make sure temps are good, then jump to 2101 as my goal. It took me like 5 tries to find the stable voltage for 2101 lol.
My 1080 is 100% stable at 1.00V; she doesn't even get close to the PL anywhere lol.

This card seems to like 1.025V. I started at 1.00V and went up in the Afterburner curve editor, but she has more power available to feed her, so I don't mind the little bump in volts. She's been totally stable, looping Heaven for almost 4 hours now.
Pascal is no fun to overclock if you ask me; it's pretty easy once you know this, and you'll find the limits of your card right away.

The idea is to keep the power draw as low as possible at the highest clock you can get, at the lowest voltage possible, so the card runs without hitting the PL. After that it's the silicon lottery. You'll eventually hit the PL if you touch anything else lol. My perfcap readout is as clean as a modded BIOS's lol.

Same concept as a modded BIOS for preventing throttling and high TDP. In this case we don't have mods to raise the TDP, so we have to play NVIDIA's game with boost and the card's PL/TDP.
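That "lowest stable voltage at the target clock" search can be sketched as a loop. The stability check here is a stand-in (in practice that step is you looping Heaven and watching for a crash), and the starting point and step size are just Afterburner-curve-sized guesses:

```python
def lowest_stable_voltage(target_mhz, is_stable,
                          v_start=1.093, v_min=0.800, step=0.025):
    """Walk the voltage down from v_start until target_mhz stops being
    stable, then return the last voltage that passed (None if none did)."""
    v = v_start
    best = None
    while v >= v_min and is_stable(target_mhz, v):
        best = v
        v = round(v - step, 3)
    return best

# Hypothetical card that holds 2101 MHz down to 1.000 V:
card = lambda mhz, v: v >= 1.000
lowest_stable_voltage(2101, card)  # -> 1.018 with these step sizes
```

Smaller steps get you closer to the true wall; the point is you converge on the sweet spot in a handful of tries instead of chasing volts upward.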


----------



## ZealotKi11er

Quote:


> Originally Posted by *zGunBLADEz*
> 
> No, but i played alot with the 1080 and i commented this on the 1080 forums and nobody payed attention to it lol.. I dont hit PL because of this..My clocks never throttled on my 1080 for the whole year i have it and she got good usage let me tell you not even once she throttled.
> 
> Throwing more volts at pascal is no good, so when you are overclocking you need to find the lowest voltage possible at the max stable overclocks is pretty easy to find the wall right away lol..
> 
> Thats exactly what i did here, turn on pc run heaven make sure temps are good then jumped to 2101 as my goal, it took me like 5 tries? to find out the stable voltage for 2101 lol....
> my 1080 is 100% stable on 1.00V she dont even get close to PL no where lol..
> 
> This card seems to like 1.025V i started at 1.00V went up in the afterburner curve editor.. but she have more power to feed it so i dont mind the little bump on volts, shes been totally stable been looping heaven for almost 4 hrs now..
> Pascal is no fun to overclock if you ask me is pretty easy once you know this and you will find the limits of your card right away..
> 
> The idea is to get the pl as low as possible on the highest clock you can get at the lowest voltage possible to have it running with no hitting pl. After that is sillicon lottery which any lol. You eventually will hit pl if you touch anything else lol. My perfcap is clean as a modded bios lol.
> 
> Same concept as modded bios to prevent throttling and high tdp. In this case we dont have mods to raise tdp so we have to play nvidias game with boost and the card pl/tdp


That seems like very low voltage for 2100MHz. Maybe it's those super low temps.


----------



## Quadrider10

What are you guys' temps on air?


----------



## 2ndLastJedi

Quote:


> Originally Posted by *Quadrider10*
> 
> What are your guys temps on air?


68c with custom curve to 55% @2050-2038 and 6004/12008 on Galax GTX 1080 ti EXOC


----------



## Sweetwater

Quote:


> Originally Posted by *Quadrider10*
> 
> What are your guys temps on air?


52°c @ 100% fan (20°c room temp), 2114mhz core 6003mhz memory and 1.093 voltage.


----------



## Quadrider10

I wish these companies had a toggle for different fan curves, like silent, normal and aggressive. I hate how I can set my fans to 60% and temps stay at a cool 58-60°C, but on auto they shoot up to 75°C and the core drops to 1900MHz. It's annoying. Does anyone know of a program that sets the fan speed on startup and doesn't need to keep running under Windows?


----------



## zGunBLADEz

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Seems like very low voltage for 2100 MHz. Maybe its those super low temps.


It's the same as the 1080: it didn't matter how many volts I threw at her, she didn't like it. When I started using low voltages, that's when she started responding well. No PL, no perfcaps, no crashes, no nothing. She'll do 2114 but will hit the PL and drop to 2101, depending on how demanding a given application is.

Yes, the temps help with the boost/throttling behavior embedded in the BIOS.

The problem starts when you find that sweet spot and want to squeeze out more: you're already in the PL zone because of the TDP, so squeezing any further is useless, you'll throttle.

So you have two things to keep in mind: the thermal wall/throttling and the PL wall.

I'd do the shunt mod if it guaranteed me something like 200MHz more, stable, but I don't think it's worth it for less than 100MHz that isn't even stable. The way this chip is made, it will crash or be unstable, and it's no way in hell a 24/7 clock, so to me it's not worth it.
I can make a few profiles for different benches that don't require high TDP and get a better score on the fly, watching the TDP while benching, rather than trying to finish a run and hoping for the best.

What I found interesting was that when I first booted, she was boosting to 1900MHz at stock, same as my 1080. That told me this card could do 2101 no problem. Pascal is so easy to read lol.


----------



## Bolis

Quote:


> Originally Posted by *Quadrider10*
> 
> What are your guys temps on air?


Quote:


> Originally Posted by *Dasboogieman*
> 
> Yup, the hardcoded limit is still there. You can actually observe it: it's your Normalized TDP reading in HWiNFO. From some testing I've done and analysis of Buildzoid's data, you can somewhat control the hardware TDP from the BIOS.
> 
> For example, my Aorus is using the Arctic Storm BIOS. My card uses 002 shunts from the factory, and the AS BIOS doesn't take this into account, unlike the stock BIOS. My stock F4 BIOS generally has the TDP and Normalized TDP within 1 W of each other, with both calibrated to 150%.
> 
> However, on my AS BIOS the reported TDP is about 2.5 times lower (essentially my PCB power controller is under-reporting power, because it never goes above 50%), but the Normalized TDP still shows the correct power limit at 120%. Judging by temps and power readings, it's definitely drawing close to 400 W+.
> 
> TLDR: the hardware power controller seems to be influenced by the BIOS, so it's not entirely independent, which makes sense; how else would it know what to do if it doesn't receive instructions from somewhere?


Thanks for elaborating. I did the XOC flash on my 1080 Ti Strix. Man, does she not like voltage! I tried 1.15 V and couldn't get near a 2101 clock.
The best result with a voltage OC was 2076 MHz @ 1.093 V. Anything above that doesn't yield any improvement.

I went on with some additional testing and settled my OC at 2012 MHz core with only 0.993 V. Combined with a +650 memory OC, I get a 31090 Fire Strike score. No throttle whatsoever thanks to the XOC BIOS and my low temps. During the Fire Strike test I see a max of 44°C using a custom fan profile (50% at this temp) and stock paste. Pretty good for air, right? ^^
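As a sanity check on the 2.5x under-reporting quoted above: if the firmware computes current from the shunt voltage drop while assuming a different resistance than what is actually on the PCB, the reported power scales by the ratio of the two resistances. A quick Python sketch (the 5 mOhm "assumed" value is a guess chosen to match the quoted 2.5x figure, not a confirmed spec):

```python
# Why a BIOS calibrated for different shunt resistors under-reports power.
# The controller measures the voltage drop across a shunt and computes
# current as I = V_drop / R_assumed. If the real shunts are smaller than
# the firmware assumes, reported power shrinks by the resistance ratio.

def reported_power(actual_watts, r_actual_mohm, r_assumed_mohm):
    """Power the controller reports when its shunt calibration is wrong."""
    return actual_watts * (r_actual_mohm / r_assumed_mohm)

actual = 400.0  # watts the card really draws (per the quoted estimate)
shown = reported_power(actual, 2.0, 5.0)  # 2 mOhm parts, 5 mOhm assumed
print(round(shown, 1), round(actual / shown, 2))  # -> 160.0 2.5
```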


----------



## zGunBLADEz

Wow, this memory sure likes watts. Guaranteed PL once you touch her lol


----------



## Evangelion

Hi everyone. I picked up an EVGA GTX 1080 Ti at Micro Center today and I have some concerns. Out of the box, the card hits 80-84°C in less demanding games like Dota 2, and it thermal throttles like crazy in games like The Witcher 3 (it hit about 1300 MHz). I tried reapplying the thermal paste and that didn't help at all. Afterwards, I tried undervolting it through MSI Afterburner (it's stable at 1835 MHz at 0.8620 V) and even at lower volts it still ends up thermal throttling. This is with everything running at stock and no overclocks. I've also made sure to run it in my system without any of the panels (I'm using an NCASE M1), and I have the fans inside my case set to intake so the card gets fresh air.

I've owned many cards with blower-style coolers over the past couple of years (my last one was a 780 Ti) and I know they usually run hot and noisy, but I think something might be wrong with my card. Do I have a crappy 1080 Ti?


----------



## Quadrider10

What program are you guys using to lower the voltage? In MSI AB the only option I have is to add voltage.


----------



## zGunBLADEz

Afterburner: try the curve editor to lock the voltage. Look at my picture a few posts back and you'll get an idea of how to lock it.

There's a little icon to the left of the core slider that opens the curve editor.
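To put the curve-editor trick in data form: locking a voltage amounts to flattening every V/F point from the target voltage upward to one frequency, so boost never has a reason to raise the voltage further. A rough Python sketch with invented curve points (a real card's curve has far more points):

```python
# The Afterburner voltage-lock trick as data: clamp every V/F point at or
# above the chosen voltage to the same frequency. Illustrative points only.

vf_curve = [(0.900, 1911), (0.950, 1974), (1.000, 2025),
            (1.025, 2050), (1.050, 2076), (1.093, 2101)]

def lock_voltage(curve, v_lock, f_target):
    """Flatten all points from v_lock upward to f_target MHz."""
    return [(v, f_target if v >= v_lock else f) for v, f in curve]

locked = lock_voltage(vf_curve, 1.000, 2101)
# Everything from 1.000 V upward now reads the same frequency:
print(locked[2:])  # -> [(1.0, 2101), (1.025, 2101), (1.05, 2101), (1.093, 2101)]
```

Since the flattened region offers no extra clock for extra voltage, boost settles at the lock point, which is why this avoids the PL at high clocks.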


----------



## zGunBLADEz

Not too shabby for a Ryzen lol @ 2126/6003.


I had no luck with the XOC BIOS on the 1080 (higher clocks, lower score), but on this 1080 Ti it's a godsend: 2126/6003 with no perf lost.


----------



## ilikelobstastoo

Quote:


> Originally Posted by *zGunBLADEz*
> 
> Not too shabby for a ryzen lol @ 2126/6003
> 
> 
> i didnt have no luck on the xoc bios for the 1080 higher clocks lower score, but in this 1080ti is a god sent bios 2126/6003 no perf lost


Have you tried some of the tweaks in the NVIDIA control panel outlined on the first page of this thread? I can get 10600 in Superposition 4K with just under 2100 MHz clocks. I would imagine you could score higher. This is with memory at +500 on an FE though.


----------



## zGunBLADEz

Quote:


> Originally Posted by *ilikelobstastoo*
> 
> Have you tried some of the tweaks in nvidia control panel outlined on the first page of this thread ? I can get 10600 supo 4k with just under 2100 clocks. I would imagine you could score higher. This is with memory +500 on fe though.


What CPU though?
I'm having problems with benchmarks staying consistent on this Ryzen. I'm thinking of going back to using AMD Ryzen Master since I'm having random perf issues under it and can't pin down the culprit.


----------



## Bolis

Quote:


> Originally Posted by *Quadrider10*
> 
> what program are you guys using to lower the voltage? in MSI AB the only options i have are to add voltage.


Ctrl+F in the Afterburner main window. Welcome to proper GPU tweaking.


----------



## ZealotKi11er

Quote:


> Originally Posted by *zGunBLADEz*
> 
> Not too shabby for a ryzen lol @ 2126/6003
> 
> 
> i didnt have no luck on the xoc bios for the 1080 higher clocks lower score, but in this 1080ti is a god sent bios 2126/6003 no perf lost


3% better score than I get with a throttling card that sits under 2 GHz most of the time. Yes, memory will take a lot of watts.


----------



## Streetdragon

Little question about my memory clock. ATM it runs at 5400 MHz with +400 MHz set in MSI AB.
Before, the same +400 MHz gave 6003 MHz.

Why is it clocking 600 MHz lower? Is it a DPM state or something?

From time to time it clocks back up to 6003... what?


----------



## zGunBLADEz

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 3% better score than I get with a throttling card that sits most of the time under 2GHz. Yes memory will take a lot of watts.


There's something fishy with my Ryzen setup; that card will do 11k+ on my 4790K setup, but I'm not switching it.
http://www.overclock.net/t/1627767/top-30-unigine-superposition-benchmark/390#post_26216125

My 4790K does 5 GHz with DDR3 @ 2600.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Streetdragon*
> 
> little question to my memory clock. ATM it clocks at 5400 with +400Mhz in MSI-AB
> Befor it was with +400Mhz at 6003Mhz.
> 
> Why is it clocking 600Mhz lower? is it a DPM stat or so?
> 
> form time to time it clocks back to 6003... what


Are you using it for mining?


----------



## Streetdragon

Just gaming. ATM just Blade and Soul, but even with the Heaven benchmark it stays at 5400.

Tried an MSI AB reset and all that.


----------



## zGunBLADEz

Man, this sucks. I got that score, right? Then I kept benching and it was under 10k for like 4-5 tests, so I removed the GPU drivers and everything went back to normal.

And I know it's not the GPU or the overclocks..

EDIT: I guess DDU came in handy after flashing BIOSes. First time I've had that issue.

Yeah, definitely this card is an 11k+ card on my 4790K.


----------



## Streetdragon

The default RAM clock is 1376 MHz and the boosted clock is 1476, but the reading is 1350 in GPU-Z.

And it is running at 1350 most of the time.


----------



## lilchronic

Quote:


> Originally Posted by *zGunBLADEz*
> 
> what cpu tho??
> im having problems with benchmarks staying consistent on this ryzen im thinking to go back using amd master as im having random perf issues under him cant point out where is the culprit


Quote:


> Originally Posted by *zGunBLADEz*
> 
> man this sucks, i got that score right? i keep benching is been under 10k like 4-5 tests so i removed the gpu drivers and everything went back and did
> 
> 
> and i know its not the gpu or the overclocks..
> 
> 
> 
> EDIT: i guess after flashing bioses ddu came in handy first time i have that issue
> 
> 
> 
> Yeah definitley this card is a 11k+ on my 4790K


Your CPU is not going to make much of a difference in this test. It's all GPU bound, especially at 4k.


----------



## zGunBLADEz

Quote:


> Originally Posted by *lilchronic*
> 
> Your CPU is not going to make much of a difference in this test. It's all GPU bound, especially at 4k.


My 4790K @ 5 GHz with DDR3 2600 will; I'm just not going to swap builds lol.

I tried the default BIOS first to make sure I wasn't getting the same crap I got on my 1080 with the XOC BIOS, which was less perf for more clocks... that one was obvious: 8-pin only, so not enough power to feed the clocks..

She's scaling linearly with each increment, still testing..

Like I said, there's something fishy with my Ryzen setup. At least I ironed out the perf differences on that particular bench.


----------



## Lefty23

Quote:


> Originally Posted by *lilchronic*
> 
> Your CPU is not going to make much of a difference in this test. It's all GPU bound, especially at 4k.


Yeah, about a month ago dunbheagan did an interesting test that shows the CPU doesn't make much difference:
http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/11470#post_26154877

If zGunBLADEz actually tests with the 4790K, it will be interesting to see if a different architecture does.


----------



## zGunBLADEz

Quote:


> Originally Posted by *Lefty23*
> 
> Yeah, about a month ago dunbheagan did an interesting test that shows CPU doesn't make much difference
> http://www.overclock.net/t/1624521/nvidia-gtx-1080-ti-owners-club/11470#post_26154877


The question is what RAM he was running those tests with.

Because my ADATA is rated @ 2400, she does 2600, and I get 500-800 point boosts depending on the RAM timings in different benchmarks.

I get the same results anywhere from 3 GHz to 5 GHz with the RAM at stock; once I start messing with RAM settings, the perf differences start showing up.


----------



## Lefty23

Quote:


> Originally Posted by *zGunBLADEz*
> 
> Question is what ram he was running those tests at?
> 
> Because my adata is rated @ 2400 she do 2600 and i get 500-800 point boosts depending the ram timings in different benchmarks


Sorry, not sure. Did a quick search in his posts but didn't find anything. I did find this though, so maybe 4x8 @ 2400:
"In the result you referenced above I'm running 2x8 @2400MHz CL10 12-12-31 1T"
In any case, as I said above, it would be interesting to see any results from you.
Though judging by the temps in your run you are watercooled, so it might be a hassle.


----------



## zGunBLADEz

I'm having issues with this Ryzen/mobo combo and can't pinpoint the culprit, but that's why I like being an enthusiast lol. Troubleshoot, troubleshoot..
At least I'm certain I'm seeing the gains from the clocks; I just need to find out what's what, if ever lol.

This is a cheap mobo, but I didn't have a choice for that SFF build. Compare it to my ASUS Maximus VII Gene ROG, so there's that too...


----------



## Cookybiscuit

What kind of results have people been getting from replacing the thermal compound on these?

Just replaced mine with Noctua NT-H1. Before, running Valley at a manual fan setting of 70%, I left it for about half an hour and it hit a max of 73°C, bouncing between 72-73. After, under the same testing parameters, it hit a max of 69°C, bouncing between 68-69. Does that seem about right for a non-liquid-metal replacement? Some people have reported that swapping the paste did absolutely nothing, while others report huge drops. Maybe it's luck of the draw; mine did have too much paste on it, but not an insane amount. I also don't have any way of measuring ambients, but it only took me 45 minutes to do, so I doubt there was a massive change.

I was just happy I didn't break anything. I hate pulling out those headers; I always feel like they'll break.


----------



## lilchronic

This benchmark uses one single core. You can see for yourself: just run 1080p with everything on low and you will see CPU usage at 90-100% on one core while GPU usage never breaks 70%.

1080p
Shaders: low
Textures: low
DOF: disabled
Motion blur: disabled


----------



## Svaniis

2114 core 12312 memory, pretty good for stock bios.


----------



## Streetdragon

After 5 restarts the memory clock is back to normal. lol


----------



## Lefty23

Quote:


> Originally Posted by *zGunBLADEz*
> 
> Im having issues with this ryzen/mobo combo i cant pinpoint out the culprit but thats why i like been an enthusiast lol troubleshoot troubleshoot..
> At least im certain im seeing the gains with the clocks i just need to find out whats what if ever lol.
> 
> This is a cheap mobo but didnt have no choice on that sff, compare it to my asus vii gene rog so that too...


I was almost at your place (2 weeks ago I had everything in the cart - just didn't pull the trigger).

Try to get your GPU memory as high as possible (had mine at 6277 which is the highest I can get). I only gained ~40 points per GPU core clock bin when pushing to the limit.
I believe Superposition likes memory bandwidth.
At least with my card (EVGA FE) I always get higher scores in Superposition the higher the memOC is. Even if I have to sacrifice a core clock bin or two.


----------



## zGunBLADEz

See, I went back to the original BIOS, thinking of the fake-clocks issue (higher clocks, lower scores) like when I was on the 1080 and like some other users reported with the XOC BIOS, and I lost performance.. 200-300 points. As a matter of fact, the GPU/BIOS is not the issue.


----------



## zGunBLADEz

Quote:


> Originally Posted by *lilchronic*
> 
> This benchmark uses one single core. you can see for your self, just run 1080p with everything on low and you will see CPU usage at 90-100% on one core and gpu usage never break 's 70% usage.
> 
> 1080p
> Shaders: low
> Textures: low
> DOF; disabled
> Motionblur; disaled


I wish that were the problem, bro. I'm talking about my CPU/mobo per se; there's an issue, and I guarantee you that at the clocks this GPU is running, it's an 11k+ card on my 4790K build... I can see the XOC BIOS gains vs stock, unlike on the 1080 where you saw people doing 2250/2300 and getting better scores with the stock BIOS..
Quote:


> Originally Posted by *Lefty23*
> 
> I was almost at your place (2 weeks ago I had everything in the cart - just didn't pull the trigger).
> 
> Try to get your GPU memory as high as possible (had mine at 6277 which is the highest I can get). I only gained ~40 points per GPU core clock bin when pushing to the limit.
> I believe Superposition likes memory bandwidth.
> At least with my card (EVGA FE) I always get higher scores in Superposition the higher the memOC is. Even if I have to sacrifice a core clock bin or two.


How many points do you get for every, let's say, +100 on mem in that bench?


----------



## FastEddieNYC

Quote:


> Originally Posted by *Evangelion*
> 
> Hi everyone. I picked up an EVGA GTX 1080 Ti at Micro Center today and I have some concerns. Out of the box, the card hits 80-84c on less demanding games like Dota 2, and it thermal throttles like crazy in games like The Witcher 3 (it hit about 1300mhz). I tried reapplying the thermal paste and that didn't help at all. Afterwards, I tried undervolting it through MSI Afterburner (its stable at 1835mhz at 0.8620 mv) and even at lower volts its still ends up thermal throttling. This is with everything running at stock and no overclocks. I've also made sure to run in in my system without any of the panels (I'm using an NCASE M1) and I also have fans inside my case set to intake so the card gets fresh air.
> 
> I've owned many cards with blower style coolers over the past couple of years (my last one was a 780 Ti) and I know they usually run hot and noisy, but I think something might be wrong with my card. Do I have a crappy 1080 Ti?


What is your ambient temp? All Founders Edition cards run hot compared to partner coolers. You need to set a more aggressive fan profile, and you should seriously consider going with a better cooler to get more consistent clocks from the card.


----------



## Evangelion

The ambient temperature where my computer is located is around 20°C. I upgraded from a 1080 Strix (sold my 1070 Strix for it) and temps were fine with it when it was undervolted, but it was dumping a ton of heat into my case because of the cooler. Even at stock, the 1080 Strix was running at around 80°C and was really loud; that's why I decided to try undervolting it. It turned out that the 1080 Strix had a busted cooler, and I exchanged it for another one that ran at 70°C out of the box without undervolting. I then undervolted that card and got much better temps and noise under load.

Fast forward to today. I found an open-box 1080 Ti at Micro Center and decided to pick it up because it had the blower-style cooler and Micro Center had it marked down to $630. I wanted to try the Founders Edition card because I heard they were better for mini-ITX cases, and that's why I'm wondering if I have a busted card or something. I got the idea from reading posts and watching a lot of videos on undervolting the 1080 Ti Founders Edition, including from other NCASE M1 owners, and undervolting helped reduce their temps and fan speed. So either I'm delusional in my expectations, or the YouTube tutorials and the people who posted their results in different threads are exaggerating.


----------



## Quadrider10

These cards just run hot... Unfortunately. Annoying.


----------



## Evangelion

Yeah I've noticed.







To be fair, I knew the card was going to run hot. I just didn't expect my card to thermal throttle like crazy. I might go back to the Strix 1080. It's a bit of a bummer considering the price I got this 1080 Ti for.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Quadrider10*
> 
> These cards just run hot... Unfortunately. Annoying.


Heat is not a problem. Power is.


----------



## Iceman2733

I know this doesn't quite belong here, but I figured I would ask the people who own the cards. I am having this problem across a fresh install of Windows and two different sets of 1080 Ti cards: when loading into 4K Optimized Superposition, does anyone else's screen go red (where you would normally see the loading screen)? It runs the benchmark perfectly; it just shows a red screen during loading.


----------



## joder

http://www.3dmark.com/spy/2051897

Nice KedarWolf


----------



## zGunBLADEz

I don't have problems with heat lol. As a matter of fact, this card runs cooler than my 1080 on the same loop for some weird reason, with a discontinued GPU block, vs my 1080 at 1.00 V on a newer block.


----------



## Quadrider10

Well yeah, on water these cards are beasts, especially with the XOC BIOS. But I mean, I have my card at stock with liquid metal paste and it hits 75°C on the stock profile. This is the EVGA SC2 card. If I limit the voltage to 1.050-1.034 V, temps only drop to 70°C. On top of that, if I use Precision XOC running the fan curve I created, temps are good, but the freaking program makes my computer stutter and freeze. It always has, for years.


----------



## Evangelion

Does anyone know why I can't lower the voltage on the last point? I didn't have this problem when undervolting my 1080 Strix (non-Ti).

Could this be the reason I'm still getting higher temps even when lowering the voltage everywhere else? The card is stable at this voltage, but MSI Afterburner won't let me lower the last point; it always goes back up after clicking Apply. Like I mentioned before, I'm using an EVGA GTX 1080 Ti Founders Edition card.


----------



## Lefty23

Quote:


> Originally Posted by *zGunBLADEz*
> 
> how many points you get for every lets say 100+ on mem in that bench?


... different timezones
I would say ~80 points per +100 on memory.

PMed you
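Taking that ~80 points per +100 MHz at face value, the expected gain is a one-line extrapolation (one user's observation on one card, not a benchmark model):

```python
# Back-of-envelope from the "~80 Superposition points per +100 MHz memory
# offset" estimate above. Purely a linear extrapolation, assumption only.

def est_gain(mem_offset_mhz, pts_per_100=80):
    """Estimated Superposition points gained from a memory offset."""
    return mem_offset_mhz / 100 * pts_per_100

print(est_gain(500))  # -> 400.0 points from a +500 offset
```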


----------



## Quadrider10

Anyone know if there is a way to control both fans on the EVGA cards using MSI AB?


----------



## zGunBLADEz

Quote:


> Originally Posted by *Quadrider10*
> 
> Well Yea, if on water, these cards are beasts. Especially with the XOC bios. But I mean, I have my card at stock with liquid metal paste and it hits 75C on stock profile. This is the EVGA SC2 card. If I limit voltage to 1.050-1.034V temps only drop to 70C. On top of that if I use precision XOC running the fan curve I create, temps are good, but the freaking program makes my computer stutter and freeze. It always has for years.


Afterburner? The stutter/freeze scenario without dropping clocks or crashing sounds like TDP starvation, or hitting the limit while maxed out hardware-wise; on the XOC BIOS it's one or the other.
Quote:


> Originally Posted by *Lefty23*
> 
> ... different timezones
> I would say ~80 points per +100 on memory.
> 
> PMed you


Got ya.


----------



## SultanOfWalmart

Quote:


> Originally Posted by *Quadrider10*
> 
> These cards just run hot... Unfortunately. Annoying.


Haha, coming from a 290X, the 1080 Ti is cool as a cucumber.

Mine runs 30-33°C under gaming load at 2050/1600, whereas my 290X was pushing 40-41°C.


----------



## MrBeer

Quote:


> Originally Posted by *Quadrider10*
> 
> anyone know if there is a way to control both fans on the evga cards using msi AB?


No, you cannot, and the author of the program said it is very low on the list of things to do. You will not see that fan control in Afterburner.


----------



## zGunBLADEz

Are they not attached to a single fan connector?

Or does Afterburner simply not support it?


----------



## FUZZFrrek

Quote:


> Originally Posted by *zGunBLADEz*
> 
> They are not attached to one single fan connector?
> 
> Or afterburner simple not support it?


I have the EVGA FTW3 and there are multiple fan headers. The fans are independent and controlled separately.


----------



## zGunBLADEz

Quote:


> Originally Posted by *FUZZFrrek*
> 
> I have the EVGA FTW3 and there are multiplie fan headers. The fans are independent and controled separately.


So not only did they go overkill on the VRM, they also went overkill in the fan department. Oh my, EVGA, why so much love XD

@EVGAJACOB
But you guys can't send me a freaking set of screws for one of my 1080s that I lost, even when I offered to pay for them. Talk about customer service..

That's why I bought a Zotac 1080 and an MSI 1080 Ti lol

Btw, I found out one of my issues was my own mistake lol

I didn't slide the image quality slider to performance XD


----------



## Alex132

Quote:


> Originally Posted by *zGunBLADEz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *FUZZFrrek*
> 
> I have the EVGA FTW3 and there are multiplie fan headers. The fans are independent and controled separately.
> 
> 
> 
> so not only they went overkill on the vrm they also went overkill on the fan dept as well oh my my evga why so much love XD
> 
> @EVGAJACOB
> But you guys cant send me a freaking set of screws for one of my 1080s that i lost even when i offer to pay for, talking about customer service dont you..
> 
> Thats why i bought a zotac 1080 and a msi 1080ti lol
> 
> Btw i found out one of my issues and my mistakes lol
> 
> didnt slide the slider to perf on image quality XD

@EVGA-JacobF

There ya go, he sometimes visits these parts


----------



## Quadrider10

Dang. Idk, with Precision X I've always had micro stutter anytime that program ran and set clocks or fans, on all my PCs. That's why I hated it. These cards are overbuilt, but they run hot unfortunately. That's probably due to the design of the cooler itself: it's very slim and the fans are small/slim. Their conservative fan profile doesn't help either. Personally, I would prefer a louder card with the trade-off of lower temps. There's no sense in buying a card with a beefy cooler just to have it perform almost identically to stock 1080 Tis. My core is hitting 75°C, and the VRMs are hitting 80+ along with the memory. And it's not my case, because if I open the side window the temps only drop 1-2°C. These cards have so much potential. I might look into a liquid solution.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Quadrider10*
> 
> Dang. Idk, with precision x I've always had micro stutter anytime that program ran and set clocks or fans with all my PCs. That's why I hated it. These cards are overbuilt but they run hot unfortunately. That's probably due to the design of the cooler it's self. It's very slim and the fans are small/slim. Also their conservative fan profile dosent help. Me personally would prefer a louder card with the trade off of temps. There's no sense in buying a card with a beefy cooler just to have it perform almost identical to stock 1080tis. My core is hitting 75c, vrms are hitting 80+ along with memory. And it's not my case cause if I open so window. The temps only drop 1-2c. The cards have so much potential. I might look into a liquid solution.


You say 1080 Ti FE, but how many volts is your card pumping? I know my 1080 Ti FE can't sustain more than 1.0 V because of power limits.


----------



## MrBeer

Quote:


> Originally Posted by *zGunBLADEz*
> 
> They are not attached to one single fan connector?
> 
> Or afterburner simple not support it?


Afterburner simply does not support it..
Quote:

> Originally Posted by *Unwinder*
> 
> Both AMD and NV drivers are providing API for controlling the primary fan only, adding support for secondary fan means that controller must be accessed directly at low level, which adds real jar of worms in compatibility area. It won't be in AB just because some vendor (regardless if it is MSI, EVGA or anybody else) decide to play marketing games with second "async" controller.

http://forums.guru3d.com/showthread.php?p=5429154#post5429154


----------



## zGunBLADEz

Quote:


> Originally Posted by *Quadrider10*
> 
> Dang. Idk, with precision x I've always had micro stutter anytime that program ran and set clocks or fans with all my PCs. That's why I hated it. These cards are overbuilt but they run hot unfortunately. That's probably due to the design of the cooler it's self. It's very slim and the fans are small/slim. Also their conservative fan profile dosent help. Me personally would prefer a louder card with the trade off of temps. There's no sense in buying a card with a beefy cooler just to have it perform almost identical to stock 1080tis. My core is hitting 75c, vrms are hitting 80+ along with memory. And it's not my case cause if I open so window. The temps only drop 1-2c. The cards have so much potential. I might look into a liquid solution.


I would say play the silicon lottery: buy the cheapest card, get an EVGA Hybrid kit, profit..

There are a few posts here which clearly prove that throwing volts at these cards just generates heat and isn't worth it..

My card needs 1.1 V for 2164 vs 1.025 V for 2101. What do you think my choice would be? 2164 looks nice and everything, but it's nothing to cry about from my point of view. Yeah, I got lucky, but 1.1 V is not something I'll be running for 63 more MHz, sorry lol.

You guys need to chase overclocks at lower voltages instead of pushing volts through these cards.

Just because you buy a custom AIB card, don't expect to get 40°C temps or lower and all of that paraphernalia.

I blame NVIDIA too for this. I remember the 1080 launch showing [email protected] or some crap like that, and when the card finally got out it was the throttling king.

So 2000 vs 2101, you are literally not gaining much, 2% at most in the best of cases?

The lower the voltage, the less chance of hitting the PL. No throttling. I prefer that to a blinking 2126 bouncing off the PL back and forth.
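A rough way to see the trade-off is first-order CMOS dynamic power scaling, P roughly proportional to f·V², plugged into the two operating points above. Leakage and temperature are ignored, so treat this as a ballpark, not a measurement:

```python
# Relative dynamic power at two operating points, P ~ f * V^2.
# Numbers are the 1.1 V / 2164 MHz vs 1.025 V / 2101 MHz points from the
# post above; this first-order model ignores leakage and temperature.

def rel_power(f_mhz, v):
    """Relative dynamic power (arbitrary units) for a clock/voltage pair."""
    return f_mhz * v ** 2

hi = rel_power(2164, 1.100)
lo = rel_power(2101, 1.025)
print(f"clock gain: {2164 / 2101 - 1:.1%}")  # -> clock gain: 3.0%
print(f"power cost: {hi / lo - 1:.1%}")      # -> power cost: 18.6%
```

Roughly 19% more power for 3% more clock, which is exactly the kind of trade that pushes a power-limited card straight into the PL.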


----------



## ToxicAdam

Quote:


> Originally Posted by *Quadrider10*
> 
> These cards just run hot... Unfortunately. Annoying.


My Gigabyte 1080 Ti Turbo never goes over 80°C with this fan curve. The fan is not that noisy because the curve is not that aggressive; all I wanted was for the card to stay below 80°C so there's no thermal throttling, which starts to happen at 84°C.


----------



## Quadrider10

I'm at 1.050 max voltage at 2000. It usually settles at 1947 (+60 on core) at 1.043 V.

I ended up just using MSI AB to control the fan curve and letting it run. Screw it. I'm gonna start playing with clocks again and make my own curve. But I can't add any voltage because it shoots up the temps, even at a max of 1.094 V.


----------



## Hulk1988

The Lightning Z card is just amazing!

Results with Memory Voltage +50

THIS MEMORY!!! 2088/6412.

Tested with 100% RPM and no LN2 Bios!


















I will test with the LN2 Bios today. My expectations are around 2100/6500 - 2112/6525


----------



## Quadrider10

I guess I got a bad overclocker. I can't even hit 2100 at 1.093v.


----------



## magbarn

Doing a new build. I still have EVGA's Hybrid 1080 cooler that I was previously using on a Titan X Maxwell. Should I get the 1080 Ti FE (with the risk of coil whine; my Titan X Maxwell had it and it was annoying at times) and slap the Hybrid on it, or get an EVGA SC2 (I'm limited to this AIB, as I have too many BB GC to ignore)? I have a well-ventilated HAF case...

Which one would give me the best and most stable OC with acceptable noise levels?


----------



## ZealotKi11er

Quote:


> Originally Posted by *magbarn*
> 
> Doing a new build. Still have EVGA's hybrid 1080 cooler that I was previously using on a Titan X Maxwell. Should I get the 1080ti FE (with the risk of coil whine - my Titan Maxwell had it and it was annoying at times) and slap the hybrid on it or get a EVGA SC2 (I'm limited to this AIB, as I have too many BB GC to ignore)? I have a well ventilated HAF case...\
> 
> Which one would give me the best and stable o/c with acceptable noise levels?


The 1080 Ti FE has zero coil whine. One of the improvements NVIDIA made was power delivery.


----------



## Simkin

Two 1080 Ti FE cards here and no coil whine.

Did not have any coil whine on my two reference 980 Tis either.


----------



## Luckbad

Been running the Zotac ArcticStorm with a baller water-cooling setup for a while now. Still running perfectly without issue.

Got some news that might require me to move my machine to something more compact than my 430mm Black Ice rad and discrete water setup.

How have the EVGA AIO solutions been faring? I still have my Fractal R5 that I could switch back to from my huge Core X9 case.


----------



## Quadrider10

How hot is vram and power delivery capable of getting?


----------



## FUZZFrrek

I just changed the TIM to CLU and so far I get the same results as the MX-2 I put on before that. I am running an EVGA FTW3.

I am currently at 2050 MHz (+52) and 6003 on VRAM. I am on the slave BIOS, with the power and voltage sliders maxed out. At 100% fan speed I am getting around 56-57°C while running Heaven.

So for my card, at least, it was not an improvement right off the bat.


----------



## FUZZFrrek

Quote:


> Originally Posted by *Quadrider10*
> 
> I guess I got a bad overclocker. I can't even hit 2100 at 1.093v.


I can't even hit 2077 at 1.093....


----------



## zGunBLADEz

Quote:


> Originally Posted by *Quadrider10*
> 
> How hot is vram and power delivery capable of getting?


According to the Gamers Nexus Hybrid review, both sit at around 70°C.


----------



## krizby

Quote:


> Originally Posted by *FUZZFrrek*
> 
> I can't even hit 2077 at 1.093....


Pascal lowers the clock by one ~13 MHz bin for every ~11°C increase starting at 35°C, I think, so your card could reach 2101 MHz under water.
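That rule of thumb is easy to turn into arithmetic. A small Python sketch (the 35°C base and ~11°C step are this thread's approximations; the exact thresholds vary by card and BIOS):

```python
# Estimate clock lost to Pascal's temperature-stepped boost bins:
# roughly one 13 MHz bin per 11 C above 35 C. Thread rule of thumb only;
# real cards step at slightly different thresholds.

def thermal_clock_loss(temp_c, base_c=35, step_c=11, bin_mhz=13):
    """MHz of boost clock lost to temperature at temp_c degrees C."""
    bins = max(0, (temp_c - base_c) // step_c)
    return bins * bin_mhz

# 57 C on air is two full 11 C steps above 35 C; ~30 C under water is none:
print(thermal_clock_loss(57))  # -> 26
print(thermal_clock_loss(30))  # -> 0
```

So a card stuck at 2077 MHz in the high 50s on air plausibly regains a couple of bins, i.e. the 2101 MHz range, once water keeps it near the first threshold.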


----------



## BoredErica

How do I remove the midplate for EVGA SC Black?


----------



## outofmyheadyo

So this was my previous Gigabyte Founders 1080 Ti, coil-whining like crazy. Is there any point in buying other 1080 Ti models that are Founders cards or based on the reference PCB, hoping to hit the jackpot with a card that is silent as a rock? Or should I only look at custom models? If I can hear the card, it's too loud; the rest of my system runs so silently I can't hear it when it's sitting next to me in an open case (fans never go above 500 RPM and the D5 Vario is at 2/5).


----------



## TheParisHeaton

All cards have this "feature".


----------



## outofmyheadyo

Well, during the last 2 or so years I have had about 5x 980 Ti, 3x 1070, 2x 1080, and 1x 1080 Ti, and all have had coil whine, which makes watercooling pointless.
You spend all that money on watercooling for silence and all, and you end up with a coil-whining PC that is unusable. I would have had better results staying on air and saving a truckload of money and headache.


----------



## FUZZFrrek

Quote:


> Originally Posted by *krizby*
> 
> Pascal will lower clock by 13mhz per 11 degrees C increase starting at 35c I think, so your card could reach 2101mhz when under water


As I wrote earlier, I can do 2050-2063 at 56-57C.


----------



## Nico67

Anybody having issues with VRAM speed dropping when going into a game? I am on 382.05 and playing around with Secret World Legends, and the speed drops from 6180 to 5693 during fullscreen gaming. It seems fine when I tried The Division; it stays at 6180.


----------



## buellersdayoff

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Well, during the last 2 or so years I have had about 5x 980ti, 3x 1070, 2x 1080, 1x1080ti all have had coilwhine, wich makes watercooling pointless.
> Spend all that money for watercooling and silence and all, you end up with is a coilwhining pc that is unusable, and I would have had better results staying on air saving a truckload of money and headache.


If every card you owned coil whined, then I'd start looking elsewhere in the system, like the power supply. Do they do it at stock clocks as well? What resolution, 1080p?


----------



## GraphicsWhore

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Well, during the last 2 or so years I have had about 5x 980ti, 3x 1070, 2x 1080, 1x1080ti all have had coilwhine, wich makes watercooling pointless.
> Spend all that money for watercooling and silence and all, you end up with is a coilwhining pc that is unusable, and I would have had better results staying on air saving a truckload of money and headache.


Edit: watched the video and that does indeed sound bad but 10 other cards with coil whine that bad? Sounds...implausible, though I guess not impossible.

Bad luck if so, but why does that make it unusable? If you want to push your card as hard on air as on water, or harder, the fans almost certainly have to be louder than any coil whine under the latter. Get some noise-canceling headphones, perhaps?


----------



## outofmyheadyo

What's the point of watercooling if I need to wear headphones? The whole point of watercooling for me is silence. What's the point of a fancy open-air case if I have to hide it under the table to muffle the coil whine? I have one 560 rad whose fans never go above 500rpm, and the D5 pump is at 2/5; anything higher and I can hear it. And then there are those GPUs which defeat the purpose of the entire thing, because the makers can't afford to spend 5€ on quality components on an 800€ GPU. That is just pathetic on their part. It's like they never owned a computer themselves.


----------



## Dasboogieman

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Whats the point of watercooling if i need to wear headphones, the whole point of watercooling for me is silence. Whats the point of a fancy openair case if I have to hide it under the table to muffle the coilwhine? I have one 560 rad and fans on it never go above 500rpm, d5 pump is @ 2/5 anything higher and I can hear it. And then there are those gpu-s wich defeat the purpose if the entire thing, because they cant afford to spend 5€ on quality components on a 800€ gpu, that is just pathetic on their part. It's like they never owned a computer themselves.


Sorry to break it to you, but it's extremely difficult to design high-current VRMs that don't whine. Even specialist low-noise inductors whine; they are just less audible.

Even two identical cards of the same design can have different whine characteristics and noise.

If you are really that particular about coil whine, go pad the inductors yourself to mitigate the vibration.


----------



## outofmyheadyo

And why don't they pad the inductors at the factory? To save 5 cents? And what happens to my warranty if I start messing with them myself? Some of these decisions by manufacturers make absolutely no sense. Hey, buy this overpriced 800€ GPU and do some homemade engineering on it to make it actually usable?
Hey guys, we have two options here: give everyone padded inductors and spend 5 cents per card? No no, let's give everyone coil whine instead as a free bonus...


----------



## becks

Quote:


> Originally Posted by *outofmyheadyo*


Quote:


> Here's a potential solution.
> 
> Overclock your card.
> 
> Overvolt it.
> 
> Do something that will change the amount of power the card draws.
> 
> Or replace your power supply with one of a different model.
> 
> The inductors in a graphics card are for the most part used in its VRM, or voltage regulation module. VRMs are a type of DC-DC switch-mode power supply, usually a synchronous multi-phase buck regulator. This means that it uses a bunch of "phases" in parallel, each consisting of a pair of transistors, an inductor, and a capacitor, to change the +12V input from your PSU to 1.xxV for your GPU to use.
> 
> In order to supply a different voltage or different current the transistors switch on and off at different frequencies. If the GPU demands more current, then that will cause a voltage drop across the inductor, which makes the transistors switch faster to keep the voltage where it needs to be. And the reverse. All this happens in microseconds.
> 
> An inductor stores electricity in a magnetic field. This magnetic field pushes and pulls on the ferrite core used in some inductors. When the voltage across the inductor changes, the magnetic field changes, and the inductor moves slightly. When the voltage changes very, very quickly (as is the case when switching transistors are changing its voltage thousands of times a second) the inductor vibrates. And that vibration, if it is between 20Hz and 20,000Hz, is audible to the human ear. Hence whine.
> 
> The frequency at which the switching transistors operates varies, but it's generally between 10,000Hz and 100,000Hz. If your GPU is whining, the transistors are switching at between 10,000 and 20,000Hz, or a primary harmonic of those frequencies. So if you want to stop the whining, you need to make the transistors operate at a different frequency.
> 
> So you change the amount of power drawn. Overclock and overvolt your GPU and the transistors will need to switch faster to provide power to the GPU, and if you're lucky you'll bump them from 17,000Hz to 24,000Hz. Underclocking may work as well. And suddenly your GPU will whine no more.
> 
> You could also use a different PSU that outputs a slightly higher or lower voltage on the +12V rail, which will also change the frequency the transistors need to work at.
> 
> It won't work for everyone, and may not work all the time either. But it may help reduce the whine or the amount of time you hear it. It worked with my new HD6950. At stock, it's a real whiner. Unlocked, overclocked, overvolted, power cap raised... not a peep.
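The audibility argument in that explanation comes down to whether a switching frequency lands in the human hearing range. A minimal sketch with illustrative numbers (the 17kHz/24kHz figures come from the quoted text; the rest is assumed):

```python
# Illustrative sketch of the audibility argument above: a VRM
# switching frequency (or a harmonic of it) is heard as whine only
# if it lands in the roughly 20 Hz - 20 kHz human hearing range.

AUDIBLE_LOW_HZ = 20
AUDIBLE_HIGH_HZ = 20_000

def is_audible(freq_hz):
    """True if a tone at freq_hz falls inside the audible band."""
    return AUDIBLE_LOW_HZ <= freq_hz <= AUDIBLE_HIGH_HZ

# The quoted example: 17 kHz whines; pushed to 24 kHz by a changed
# load it moves out of the audible band. 100 kHz never was audible.
for f in (17_000, 24_000, 100_000):
    print(f, "audible" if is_audible(f) else "inaudible")
```

This is why changing the load (overclock, undervolt, different PSU) can silence a card without touching the inductors: the switching frequency shifts, not the inductor itself.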


----------



## sew333

Hi. My GTX 1080 Ti FE is at stock. The card sometimes ignores the temp limit and goes above 84C to 85-86C at stock settings, without OC. At that moment GPU-Z shows VREL (not THRM), clocks are high, and the fan is at 60%. Help. I tried 3 different drivers and it's the same. Any ideas?


----------



## zGunBLADEz

This card, she is really something else..

Ok guys, I went back to the stock reference BIOS. I started playing Nvidia's game with the PL/throttling and such; today is especially hot, which is better for my testing..

I used 2 benchmarks which guarantee hitting the 120% PL, instead of stupid Furmark...

Superposition 4K ran with no throttling whatsoever, but it was pretty damn close to it lol, at the following 2050/[email protected]
*PL % last line/last number*





Then I ran the Firestrike Ultra stress test. In this one I had to lower the mem overclock to +550 as it was giving me artifacts; they get HOT in this test!!!! lol But that's good, that's what I want ...

2050/[email protected]
In this one she did throttle, but it wasn't that bad; it was briefly at 2030 and quickly back to 2050.

http://www.3dmark.com/3dm/21118936

So from there, I created 3 profiles for my stock reference BIOS:

2126/[email protected] where PL is not an issue (most games)
2050/[email protected] for close-to-PL situations
2050/[email protected] for extreme cases of hitting PL

and voilà, binning is done!!! Pretty easy.... Pascal is so predictable... As you see, what I said about the 1080 still stands: the highest clocks at the lowest voltage possible + silicon lottery ...

As you see, you will hit PL at anything higher than 0.981V on the stock reference 300W-TDP BIOS in those 2 tests...

Btw, the temperature throttling is separate from this... I have no problems with that...
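The lowest-voltage-for-the-clock approach dodges the power limit because dynamic power scales roughly with frequency times voltage squared. A rough sketch of that rule of thumb (the P ∝ f·V² relation is a simplification, and the voltages are illustrative ones taken from this thread, not measurements):

```python
# Rough dynamic-power scaling: P ~ f * V^2. Shows why shaving
# voltage buys more power-limit headroom than shaving clocks.

def relative_power(f1_mhz, v1, f2_mhz, v2):
    """Power at (f2, v2) relative to (f1, v1), using P ~ f * V^2."""
    return (f2_mhz / f1_mhz) * (v2 / v1) ** 2

# Dropping 1.062 V -> 0.981 V at the same 2050 MHz cuts estimated
# power by roughly 15%, i.e. ~15% of power-limit headroom back:
print(round(relative_power(2050, 1.062, 2050, 0.981), 3))  # -> 0.853
```

Under this model a voltage cut pays off quadratically while a clock cut only pays off linearly, which matches the "find the lowest stable voltage first" advice above.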


----------



## Hayashi

Just out of curiosity, does anyone know the release date of the Zotac 1080 Ti ArcticStorm, or possibly own one? Most retailers either don't have it, or the listing is a complete mess, like the one on Amazon.


----------



## KCDC

Planning on doing the shunt mod on both cards this weekend. Liquid electrical tape and conductonaut method.

Since I'll have plenty of both product left over, is it really worth it to use the conductonaut on the GPU or just stick with the kryonaut? The liquid metal is time consuming and dangerous to put on the GPU but if it will really make things that much better, then I'll do it.


----------



## zGunBLADEz

Mine has Kryonaut, and you've seen my temps XD


----------



## ZealotKi11er

Quote:


> Originally Posted by *KCDC*
> 
> Planning on doing the shunt mod on both cards this weekend. Liquid electrical tape and conductonaut method.
> 
> Since I'll have plenty of both product left over, is it really worth it to use the conductonaut on the GPU or just stick with the kryonaut? The liquid metal is time consuming and dangerous to put on the GPU but if it will really make things that much better, then I'll do it.


If you want to RMA, don't use liquid metal on the GPU. It will fog the GPU die, and anyone with experience can tell.


----------



## lilchronic

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If you want to RMA dont use Liquid Metal on the GPU. It will fog the GPU die and anyone with experience can tell.


Can tell what? That they used liquid metal thermal paste? It should not void your warranty.


----------



## KraxKill

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If you want to RMA dont use Liquid Metal on the GPU. It will fog the GPU die and anyone with experience can tell.


BS - Tell what? That you replaced their crappy paste with something better? Nothing wrong with that.


----------



## Iceman2733

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Well, during the last 2 or so years I have had about 5x 980ti, 3x 1070, 2x 1080, 1x1080ti all have had coilwhine, wich makes watercooling pointless.
> Spend all that money for watercooling and silence and all, you end up with is a coilwhining pc that is unusable, and I would have had better results staying on air saving a truckload of money and headache.


No offense, but this is a completely silly statement. I would say people who watercool for noise reduction are a very, very small % of the people who watercool. Most watercool for the reduction of temps on the CPU/GPU and for the benefits of those reduced temps. Also, yep, because it has coil whine it is completely unusable; can't game on it or benchmark or browse the forum, that coil whine stops all of that immediately. C'mon, don't go overly dramatic on us.

If your ears are that sensitive, I would wear ear plugs..... The people that come to this forum with the "omg the coil whine, it is so loud I can't hear my cat meow or my smoke detector going off" stuff are being silly; I would love to know what rock you guys live under that coil whine is just too much. It can only be heard under load; it isn't like it sits there all day long humming away. It is coil whine, not a top fuel dragster doing a burnout. If you have had that many GPUs whining, I agree I would change your PSU and go from there. After quite a bit of reading, I would agree they all do it to some extent, but a lot of factors influence how bad it is and what frequency they hum at.


----------



## Iceman2733

Quote:


> Originally Posted by *lilchronic*
> 
> can tell what? That they used liquid metal thermal paste? it should not void your warranty.


Quote:


> Originally Posted by *KraxKill*
> 
> BS - Tell what? That you replaced their crappy paste with something better? Nothing wrong with that.


You might want to check on that before you go off stating this stuff. Asus puts stickers on their cards to show if the card has been opened (placed right on the main 4 screws you'd remove for the heatsink); EVGA, on the FTW3, also has a sticker stuck between their two-piece backplate that covers a screw for this. I contacted ASUS about it with my Strix 1080 Ti cards to verify (asked about a waterblock install, and was told that if the card has been taken apart for any reason by me, the warranty is void; they even mentioned the sticker on the screw). If the sticker has been removed you have no warranty, and from the dealings I have had with MSI, I would say they MIGHT, and I use the term might, do the same. The only company I think you could get away with it on is EVGA. The reason they place the stickers there is to help them void our warranties. Call the manufacturer of your GPU and ask if changing the thermal compound will void the warranty.


----------



## KedarWolf

Quote:


> Originally Posted by *Iceman2733*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lilchronic*
> 
> can tell what? That they used liquid metal thermal paste? it should not void your warranty.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *KraxKill*
> 
> BS - Tell what? That you replaced their crappy paste with something better? Nothing wrong with that.
> 
> Click to expand...
> 
> You might want to check on that before you go off with stating this stuff. Asus has stickers on there cards to show if the card has been opened (placed right on the main 4 screws to remove heatsink), Evga also on the FTW3 had a sticker that stuck between there two piece backplate and covered a screw for this also. I have contacted ASUS about it with my Strix 1080ti cards to verify (asked about waterblock install, was told if the card has been taken apart for any reason by me the warranty is void and they even mentioned the sticker on the screw). If the sticker has been removed you have no warranty, and I would say from the dealings I have had with MSI i would say they MIGHT and i use the term might fall to this also. The only people I would think you could get a way with on is EVGA. That is the reason they place the stuff there is help them and void our warranties. Call the manufacturer to your GPU and ask if changing thermal compound will void warranty
Click to expand...

I don't think my Gigabyte FE had any stickers at all.


----------



## Dasboogieman

Quote:


> Originally Posted by *KedarWolf*
> 
> I don't think my Gigabyte FE had any stickers at all.


Gigabyte traditionally doesn't use those warranty stickers. That being said, those stickers don't override your statutory consumer rights anyway, so breaking one will still be OK.


----------



## KraxKill

Quote:


> Originally Posted by *Iceman2733*
> 
> You might want to check on that before you go off with stating this stuff. Asus has stickers on there cards to show if the card has been opened (placed right on the main 4 screws to remove heatsink), Evga also on the FTW3 had a sticker that stuck between there two piece backplate and covered a screw for this also. I have contacted ASUS about it with my Strix 1080ti cards to verify (asked about waterblock install, was told if the card has been taken apart for any reason by me the warranty is void and they even mentioned the sticker on the screw). If the sticker has been removed you have no warranty, and I would say from the dealings I have had with MSI i would say they MIGHT and i use the term might fall to this also. The only people I would think you could get a way with on is EVGA. That is the reason they place the stuff there is help them and void our warranties. Call the manufacturer to your GPU and ask if changing thermal compound will void warranty


I have RMA'ed several cards in the past. Water blocks, AIOs, etc. Stickers fall off. The stickers are there to protect the manufacturer in the event that you use improper tools or actually break your card through unprofessional physical damage. They are not there to keep users from using custom water blocks and coolers. That is just silly.


----------



## KedarWolf

Quote:


> Originally Posted by *KraxKill*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Iceman2733*
> 
> You might want to check on that before you go off with stating this stuff. Asus has stickers on there cards to show if the card has been opened (placed right on the main 4 screws to remove heatsink), Evga also on the FTW3 had a sticker that stuck between there two piece backplate and covered a screw for this also. I have contacted ASUS about it with my Strix 1080ti cards to verify (asked about waterblock install, was told if the card has been taken apart for any reason by me the warranty is void and they even mentioned the sticker on the screw). If the sticker has been removed you have no warranty, and I would say from the dealings I have had with MSI i would say they MIGHT and i use the term might fall to this also. The only people I would think you could get a way with on is EVGA. That is the reason they place the stuff there is help them and void our warranties. Call the manufacturer to your GPU and ask if changing thermal compound will void warranty
> 
> 
> 
> Have RMA'ed several cards in the past. Water blocks, AIO's etc. Stickers fall off. The stickers are there to protect the manufacturer in the event that if you accidentally use improper tools or actually break your card due to unprofessional physical damage. They are not there to keep users from using custom water blocks and coolers. That is just silly.
Click to expand...

I'm pretty sure it depends on your country's warranty laws as well. In some countries, I think, modifications can void warranties, while others have laws allowing them.


----------



## Iceman2733

Quote:


> Originally Posted by *KraxKill*
> 
> Have RMA'ed several cards in the past. Water blocks, AIO's etc. Stickers fall off. The stickers are there to protect the manufacturer in the event that if you accidentally use improper tools or actually break your card due to unprofessional physical damage. They are not there to keep users from using custom water blocks and coolers. That is just silly.


Asus's exact words: ANY modification that requires disassembling the card voids the warranty; he said, and I quote, "such as installing a water block" lol. As I stated, call your GPU manufacturer and inquire and see what they say; not trying to be a dick, just wondering exactly what they will say.

The warranty stickers are there so they can see whether we have opened the item and could have caused the issue a person is trying to claim warranty on; whether they actually check is another thing. I would want to ask how honest you were when you called: did you openly tell them you had installed custom water blocks?


----------



## SSJVegeta

My EVGA 1080 Ti SC Black Edition gets hot and noisy.

I'm replacing it with an Asus Strix or Zotac AMP Extreme.


----------



## 2ndLastJedi

Quote:


> Originally Posted by *zGunBLADEz*
> 
> No, but i played alot with the 1080 and i commented this on the 1080 forums and nobody payed attention to it lol.. I dont hit PL because of this..My clocks never throttled on my 1080 for the whole year i have it and she got good usage let me tell you not even once she throttled.
> 
> Throwing more volts at pascal is no good, so when you are overclocking you need to find the lowest voltage possible at the max stable overclocks is pretty easy to find the wall right away lol..
> 
> Thats exactly what i did here, turn on pc run heaven make sure temps are good then jumped to 2101 as my goal, it took me like 5 tries? to find out the stable voltage for 2101 lol....
> my 1080 is 100% stable on 1.00V she dont even get close to PL no where lol..
> 
> This card seems to like 1.025V i started at 1.00V went up in the afterburner curve editor.. but she have more power to feed it so i dont mind the little bump on volts, shes been totally stable been looping heaven for almost 4 hrs now..
> Pascal is no fun to overclock if you ask me is pretty easy once you know this and you will find the limits of your card right away..
> 
> The idea is to get the pl as low as possible on the highest clock you can get at the lowest voltage possible to have it running with no hitting pl. After that is sillicon lottery which any lol. You eventually will hit pl if you touch anything else lol. My perfcap is clean as a modded bios lol.
> 
> Same concept as modded bios to prevent throttling and high tdp. In this case we dont have mods to raise tdp so we have to play nvidias game with boost and the card pl/tdp


My Galax GTX 1080 Ti EXOC sits with VRel on constantly, with volts at 1.062 and the core sitting at 2038-2050. Will this get better if the voltage is raised? How do I raise voltage? Are there any good vids showing how?


----------



## becks

Quote:


> Originally Posted by *2ndLastJedi*


Hi and welcome to the club...

Check this link: Here for all the information you seek...

All credit goes to @KedarWolf


----------



## Alex132

My FTW3 came in the mail today


----------



## zGunBLADEz

Quote:


> Originally Posted by *2ndLastJedi*
> 
> My Galax GTX 1080 ti EXOC sits with VRel on constantly with volts at 1.062 and core sitting at 2038-2050 will this get better if voltage is raised, how do i raise voltage? Are there any good vids showing how to?


In my testing with both the 1080 and 1080 Ti, I would refrain from raising voltages and instead try to find the highest overclock you can achieve at the lowest voltage possible..

As you saw in my previous post, in Superposition 4K Optimized or the Firestrike Ultra stress test you will hit PL at anything higher than 0.981V on a 300W-TDP card, so in my view raising voltage is useless if you want to get the most out of your card while abiding by Nvidia's set parameters like boost/thermal throttle.

Those 2 tests are more realistic than using Furmark or an XOC BIOS. If you don't want to flash an XOC BIOS and whatnot, I recommend running Superposition 4K and the Firestrike Ultra GPU stress test and watching the PL %; after that you can create profiles like I did..

Of course the silicon lottery plays a role in what you get.

And *forget about the Nvidia perf cap reason*; just watch for stability, and a PL % close to 120% equals throttling.
My card even reports it at idle while running those kinds of low volts lol, so the cap reason is bogus; no matter what it says, it's not something to follow.



Compare scores and check consistency between runs as well. Forget about the perf cap reason; as long as it is stable in your tests and bench scores match and are consistent, you are golden.


----------



## stangflyer

Quote:


> Originally Posted by *SSJVegeta*
> 
> My EVGA 1080 Ti SC Black Edition gets hot and noisy.
> 
> I'm replacing it with a Asus Strix or Zotac Amp Extreme.


Before you buy the AMP Extreme look at this-

http://www.gamersnexus.net/hwreviews/2981-zotac-1080ti-amp-extreme-review

I have an Extreme and I love it. I added a 2mm thermal pad over the power-component heatsink so that it contacts the main heatsink, and I also changed out the TIM while I had it apart.


----------



## KCDC

Quote:


> Originally Posted by *Iceman2733*
> 
> Asus's exact words, ANY modifications that require disassembling of the card is a voided warranty, he said and I quote " Such as installing a water block" lol As I stated call your GPU manufacturer and inquire and see what they say, not trying to be a dick just wondering exactly what they will say.
> 
> The warranties stickers are there to see if we have opened the item and could have caused the issue that a person is trying to claim warranty on, now if they actually check is one thing. I would want to ask how honest were you when you call did you openly tell them you had installed custom water blocks?


I wonder, then, if they would be able to tell since there would be different thermal pads if I were to return it, not to mention the tiny little white ones that basically disintegrate when you take them off. My FE cards came directly from Nvidia. Pretty sure the only company that verbally states they don't care is EVGA. The rest treat it like usual, if you tamper with it, and they find out, they have more reason to void your warranty claim.

Might be a bad analogy, but I know with modding cars, they will still honor warranty claims if they can't prove your mods caused that specific failure. If they can trace the failure back to that mod, the entire warranty isn't voided, just that specific warranty claim.

That said, I'll just stick with kryonaut for the GPU die, haha.


----------



## Alex132

Great. My GPU is DoA. No matter the PCIE slot.

LEDs come on, fans spin up. But no video. Motherboard has a red LED by the PCIE slot - indicating it's not detecting video.

Maybe I am missing something though, any suggestions?


----------



## KCDC

Quote:


> Originally Posted by *Alex132*
> 
> Great. My GPU is DoA. No matter the PCIE slot.
> 
> LEDs come on, fans spin up. But no video. Motherboard has a red LED by the PCIE slot - indicating it's not detecting video.
> 
> Maybe I am missing something though, any suggestions?


Did you try different power cables into the card?


----------



## Alex132

To be honest, no. I just used the ones that were already in my R9 295X2. Straight swap if you will.

I can try different ones, but running off of my 295x2 now - so not sure that that is the issue.

edit- had to change my BIOS to not use UEFI boot - and it worked


----------



## vasyltheonly

So I'm having a dilemma. I can currently get the Zotac Extreme 1080ti for about $20 more than my blower 1080ti. What is people's opinion of the card and the brand? I know it runs a bit hotter than other equal 3 fan cards but I'm debating throwing a waterblock on it.


----------



## Agent-A01

Quote:


> Originally Posted by *SSJVegeta*
> 
> My EVGA 1080 Ti SC Black Edition gets hot and noisy.
> 
> I'm replacing it with a Asus Strix or Zotac Amp Extreme.


Repaste it with quality TIM.


----------



## Alex132

Has anyone had the issue where, upon installing drivers, the card refuses to recognize the display? I tried DDU and CCleaner to reinstall drivers, but no luck.

Oddly enough, my Apple Cinema Display works, but not my Dell U2713HM...


----------



## 2ndLastJedi

Quote:


> Originally Posted by *zGunBLADEz*
> 
> in my testings with both 1080 and 1080ti, i will retain on raising voltages and try to find the highest overclock you can achieve on the lowest voltage possible..
> 
> As you see in my previous post in superposition 4k optimized or firestrike ultra stress tests you will hit PL with anything higher than 0.981v on a 300wTDP card so in my point of view it will be useless if you want to have the most out of your card biding by nvidia set parameters like boost/thermal throttle.
> 
> Those 2 tests are more realistic speaking than using furmark or xoc bios. If you dont want to flash xoc bios and what not i recommend using superposition 4k and firestrike 4k gpu stress test and watch for PL % after that you can create profiles like i did..
> 
> As of course sillicon lottery plays a role in what you get.
> 
> and *forget about nvidia perf cap reason* just watch for stability and PL % close to 120% equals throttling
> my card even says that is in idle while running those kind of low volts lol, so the cap reason is bogus as it is no matter what it says its not something to follow.
> 
> 
> 
> Compare scores get consistency as well between runs. forget about the perf cap reason as long as it is stable in your tests and bench scores match and are consistent you are golden.


All seems kinda scary! I wish i could watch someone do it! Anyone live in Brisbane and want to give OC lesson, lol


----------



## Rollergold

Quote:


> Originally Posted by *Iceman2733*
> 
> Evga also on the FTW3 had a sticker that stuck between there two piece backplate and covered a screw for this also.


For anyone that owns an FTW3 card: I contacted EVGA and spoke to Gabe, and it looks like that small EVGA sticker can be broken without voiding the warranty.


----------



## kevindd992002

Did any EVGA GTX 1080Ti FTW3 owner here have the chance to install the EK waterblock with the factory EVGA backplate already and got the same issue as what was described in these posts?

http://www.overclock.net/t/1631796/evga-ftw3-gtx-1080-ti-owners-thread/80#post_26212610

http://www.overclock.net/t/1631796/evga-ftw3-gtx-1080-ti-owners-thread/200#post_26230119

Apparently, EK's support sucks and they don't want to acknowledge the issue.


----------



## FUZZFrrek

I have an EVGA FTW3 and I have weird issues with my card. Every time I try to set a game, or even my desktop, to a 1440p resolution, I get a weird flashing screen every second or so.

The other issue is that when I do manage to set that resolution, my sound device disables itself... I have to reset it each time by changing the bit rate.

The last issue is that when I launch the PerformanceTest 9 benchmark by PassMark, in the first DirectX 9 test, where the resolution is very low, the screen flashes black and back with some major artefacts.

Does anyone have these same issues? I tried DDU many times, reinstalling the sound drivers, trying different Nvidia drivers, and restoring my system to a state before I purchased the card. Nothing cures it. The card is connected to a UHD TV (Panasonic 64AX800) with an HDMI cable.

I need some help please!

EDIT: I think I found the culprit... It is either my HDMI cables (tried 2 of them) or the 4K HDMI 2.0 port on my TV.

I plugged the card into a 1080p monitor, upscaled it to 4K, and ran the tests. Bang! I was able to run every test at optimum performance (benchmarking and gaming at 1440p) without any issues.

I just ordered a DP cable to do more testing to see if that improves or evens out the results.


----------



## Alex132

Let me know how it goes; I'm also having issues getting display out after installing drivers on my 1440p monitor... but my 900p monitor works fine.


----------



## FUZZFrrek

Quote:


> Originally Posted by *Alex132*
> 
> Let me know how it goes, also having issues getting display out post installing drivers on my 1440p monitor... but my 900p monitor works fine


Have you tried upscaling to 1440p in the Nvidia control panel on your 900p monitor?


----------



## Alex132

Quote:


> Originally Posted by *FUZZFrrek*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Let me know how it goes, also having issues getting display out post installing drivers on my 1440p monitor... but my 900p monitor works fine
> 
> 
> Have you tried to upscaled at 1440p in Nvidia control panel on you 900p monitor?
Click to expand...

Haven't yet, at work now. Will do when I'm at home


----------



## TheBoom

I think some people here don't get what the perf caps actually mean. It's not saying your performance is limited at that particular boost clock; it's more telling you why it doesn't boost any higher.

So no matter what, unless you used a curve to fix the clocks to a specific bin, you are going to see perfcaps.

No need to get all angsty over it lol.


----------



## zGunBLADEz

Quote:


> Originally Posted by *TheBoom*
> 
> I think some people here don't get what the perfcaps actually mean. It's not saying your performance is limited at that particular boost clock but more of telling you why it doesn't boost any higher.
> 
> So no matter what, unless you used a curve to fix the clocks to a specific bin, you are going to see perfcaps.
> 
> No need to get all angsty over it lol.


It's just a tool that the Nvidia BIOS uses in conjunction with Boost 3.0, and it's not that reliable; as you can see, it's easy to manipulate if you know how.
I was just proving a point..

Anyway, like i said...

We have two things to deal with here:

Thermal throttling is the first one... better cooling, problem solved.
The hard power limit is the second, which you can work around if you stop pushing volts into the card. You will hit the PL hard if you push too much voltage. I know Pascal doesn't like volts; I saw it on the 1080, which is where I learned how it works, and the 1080 Ti behaves exactly the same. I already maxed out this card in less than 3 days of testing lol; it took me longer on the 1080 because I didn't know how it worked.


----------



## encrypted11

Would the kryographics block work with the reference founders edition backplate?


----------



## dallas1990

Quote:


> Originally Posted by *vasyltheonly*
> 
> So I'm having a dilemma. I can currently get the Zotac Extreme 1080ti for about $20 more than my blower 1080ti. What is people's opinion of the card and the brand? I know it runs a bit hotter than other equal 3 fan cards but I'm debating throwing a waterblock on it.


If you're going to watercool, don't buy the Zotac AMP Extreme. There are no waterblocks for it, at least to my knowledge. Just get a Founders card.


----------



## Alex132

Quote:


> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *FUZZFrrek*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Let me know how it goes, also having issues getting display out post installing drivers on my 1440p monitor... but my 900p monitor works fine
> 
> 
> Have you tried upscaling to 1440p in the Nvidia control panel on your 900p monitor?
> 
> Haven't yet, at work now. Will do when I'm at home

Fixed it with a DP cable btw


----------



## Psykoking

Quote:


> Originally Posted by *dallas1990*
> 
> If you're going to watercool, don't buy the Zotac AMP Extreme. There are no waterblocks for it, at least to my knowledge. Just get a Founders card.


Alphacool has one in production, but it will take some time until it's released. The release name will be GPX-N 1080(pro)-M18. At least that's what Mr. Janusch Kawetzki from AC told me.


----------



## KraxKill

LOL

Anybody try undervolting yet?

1080Ti 2012mhz - 0.993v
DDR5X - 6123mhz
5775C @ 4.2ghz


----------



## lilchronic

Quote:


> Originally Posted by *KraxKill*
> 
> LOL
> 
> Anybody try undervolting yet?
> 
> 1080Ti 2012mhz - 0.993v
> DDR5X - 6123mhz
> 5775C @ 4.2ghz


Been running 2000Mhz @ .981v for 3 months now.


----------



## sew333

Hi guys, quick question for you. My card from time to time hits perfcap reason VREL, even above the temp limit (84C), at 86C. The card is not overclocked, stock.
Is this normal? Card is a 1080 Ti FE.


----------



## zGunBLADEz

Quote:


> Originally Posted by *KraxKill*
> 
> LOL
> 
> Anybody try undervolting yet?
> 
> 1080Ti 2012mhz - 0.993v
> DDR5X - 6123mhz
> 5775C @ 4.2ghz


Nice to see we don't need to overvolt these cards lol.
I've been trying to say this all along since the 1080.


----------



## ELIAS-EH

Guys, I have some Conductonaut liquid metal left in the tube after using it on my 6900K. Do you recommend I open my 1080 Ti Strix and replace the stock TIM with the liquid metal? How much will it improve the temps, and is it dangerous? Is there any video tutorial? I am afraid to do it, please help. Can I put some electrical tape on the resistors around the GPU die and leave it there after reassembling the Strix? And do I need to replace the thermal pads on the MOSFETs? Thank you.


----------



## Quadrider10

Quote:


> Originally Posted by *ELIAS-EH*
> 
> Guys, I have some Conductonaut liquid metal left in the tube after using it on my 6900K. Do you recommend I open my 1080 Ti Strix and replace the stock TIM with the liquid metal? How much will it improve the temps, and is it dangerous? Is there any video tutorial? I am afraid to do it, please help. Can I put some electrical tape on the resistors around the GPU die and leave it there after reassembling the Strix? And do I need to replace the thermal pads on the MOSFETs? Thank you.


I tried it on my SC2 and it destroyed the cooler and discolored the die. And that was only on for 3 days.


----------



## zGunBLADEz

I use kryonaut myself or pk1


----------



## KingEngineRevUp

Quote:


> Originally Posted by *Quadrider10*
> 
> I tried it on my SC2 and it destroyed the cooler and discolored the die. And that was only on for 3 days.


Liquid metal technically attacks the surface; the gallium penetrates the top layer of the copper. But it doesn't affect performance, it just becomes ugly.


----------



## ELIAS-EH

Quote:


> Originally Posted by *Quadrider10*
> 
> I tried it on my SC2 and it destroyed the cooler and discolored the die. And that was only on for 3 days.


Did it damage your card!!!!!???
Liquid metal only destroys aluminium.
I will not open the Strix, not worth the risk.


----------



## lilchronic

4k Optimized
Stock bios
2000Mhz - 2038Mhz and voltage from .963v -1.025v


----------



## zGunBLADEz

Can you do one with the CPU at stock? I want to see something. Nice score btw.


----------



## ZealotKi11er

Quote:


> Originally Posted by *lilchronic*
> 
> 4k Optimized
> Stock bios
> 2000Mhz - 2038Mhz and voltage from .963v -1.025v


Default Nvidia Control Panel?


----------



## lilchronic

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Default Nvidia Control Panel?


no


----------



## zGunBLADEz

just cpu stock


----------



## lilchronic

Quote:


> Originally Posted by *zGunBLADEz*
> 
> just cpu stock


I don't run my CPU at stock speeds.


----------



## ToxicAdam

I will be getting ARCTIC Accelero Xtreme III aftermarket cooler next month for my 1080ti. Will it fit this Gigabyte card with no issue?


----------



## zGunBLADEz

Quote:


> Originally Posted by *lilchronic*
> 
> I don't run my CPU at stock speeds.


It's so easy to drop a multiplier lol.

I just want to see if it's worth benching this card on my 4790K.


----------



## cluster71

It does not help you. You cannot compare different systems (chipset and CPU model); you must compare with an equivalent system. I noticed that with my Z270 / i7 7700K system. There is so much else that can hold you back in a system (memory, resources, PCIe bus, etc.).


----------



## Quadrider10

Quote:


> Originally Posted by *ELIAS-EH*
> 
> Did it damage your card!!!!!???
> Liquid metal only destroys aluminium.
> I will not open the Strix, not worth the risk.


It ruined the surface of the heatsink and discolored the die. For the heatsink, I had to sand the abrasions off with a really fine Brillo pad to make it smooth again.


----------



## sammaz

4K Optimized
EVGA 1080ti SC2 Black Edition
Using Afterburner latest beta
+73 Voltage
+120 Power Limit
+90 Temp
+88 Core
+537 Memory
Very Aggressive Custom Fan profile keeping temps under 55

...on
6600K 4.5Ghz @1.33V
3200 Corsair LED Vengeance using XMP
Asus Strix z270i
Samsung 960 EVO 256G

I made the club !!! Yea!


----------



## Alex132

Guessing you flashed it to do 120%+ power limit? My FTW3 is 117% stock. And it sits around 1936-46mhz in games, so I'm happy and don't mind.


----------



## FUZZFrrek

Quote:


> Originally Posted by *Alex132*
> 
> Guessing you flashed it to do 120%+ power limit? My FTW3 is 117% stock. And it sits around 1936-46mhz in games, so I'm happy and don't mind.


Same here, I just OC for the benchmarks.


----------



## ZealotKi11er

I am running my card at 1860MHz at 900mV, and that seems to be the sweet spot for power and performance. I still have to test 2025MHz @ 1.0V, but power might not let me stay there all the time.


----------



## sammaz

No this is stock bios. My slider goes to 120% with the Black Edition SC2. Is this not normal?


----------



## iamjanco

It's not news, but some might find it worth a look. For now, I'd take it with a grain of salt, especially if you're not experiencing any issues:


https://www.reddit.com/r/6oubum/nvidia_confirms_stuttering_fps_drops_lag_problems/


----------



## navjack27

Heh, since getting this card and upgrading from a 980ti, I haven't noticed any of that stuff at all


----------



## ZealotKi11er

My only experience with the 1080 Ti is HoST at 1440p and Titanfall 2 at 4K. No problems.


----------



## iamjanco

If it's anything, it's probably a matter of YMMV depending on a whole slew of factors, from hardware combo, to software and OS in use, as well as what phase the moon is in. It'll be interesting to see how far nVidia runs with it.


----------



## Alex132

I have noticed some very odd stutters in a few games that I didn't have with my 295X2. But it's mostly in CoH2 in specific situations, and even twice in CS6. But eh, not bothered about it.

Gotta say, it's amazing how much VRAM games use these days. I thought I was fine with 4GB @ 1440p on my 295X2 - then I see PUBG use 7.8GB of VRAM........ lol.

Guess my CPU from 2011 might need upgrading finally


----------



## BoredErica

I noticed some stuttering in BF3. It's pretty minor though, and it's not enough to bother me. Maybe this is the reason. I didn't notice stuttering in other games. I don't recall this stutter on my 980ti.


----------



## milikitungi

I have an EVGA 1080 Ti SC2 HYBRID on which I have flashed the XOC BIOS. I can't get past 1.093 voltage. I did the Afterburner thing and it made no difference.

Am I doing something wrong? Do I have to hard mod it to go beyond the voltage limit?

I would appreciate it if you cleared up those questions, thanks.


----------



## Alex132

Does it really matter? To be honest, I don't see much of a point in overclocking this card. The heat+noise : performance ratio is horrible beyond around 1950MHz or so, which the card boosts to anyway.


----------



## KedarWolf

Quote:


> Originally Posted by *milikitungi*
> 
> I have an EVGA 1080 Ti SC2 HYBRID on which I have flashed the XOC BIOS. I can't get past 1.093 voltage. I did the Afterburner thing and it made no difference.
> 
> Am I doing something wrong? Do I have to hard mod it to go beyond the voltage limit?
> 
> I would appreciate it if you cleared up those questions, thanks.


Did you download the Afterburner beta, enable 'third party' in settings, slide the voltage slider to 100%?

Need to do all that to go above 1.093v.


And if you don't have a Power Limit slider it means the XOC BIOS flashed properly.


----------



## milikitungi

Quote:


> Originally Posted by *KedarWolf*
> 
> Did you download the Afterburner beta, enable 'third party' in settings, slide the voltage slider to 100%?
> 
> Need to do all that to go above 1.093v.
> 
> And if you don't have a Power Limit slider it means the XOC BIOS flashed properly.


I did everything else, but I do still have the power limit slider. I will flash it again then. So if I have the power limit slider, that means it didn't flash properly?


----------



## KedarWolf

Quote:


> Originally Posted by *milikitungi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Did you download the Afterburner beta, enable 'third party' in settings, slide the voltage slider to 100%?
> 
> Need to do all that to go above 1.093v.
> 
> And if you don't have a Power Limit slider it means the XOC BIOS flashed properly.
> 
> 
> 
> I did everything else but I do have the power limit slider still. I will flash it again then. So if I have the power limit slider means it didn't flash properly?

Yes, with the XOC BIOS, if the Power Limit slider isn't grayed out it means it never flashed right.

You can't adjust it if it has flashed right.


----------



## milikitungi

Quote:


> Originally Posted by *KedarWolf*
> 
> Yes, with XOC BIOS if Power Limit slider isn't grayed out it means it never flashed right.
> 
> Can't adjust it if it has flashed right.


I got it working now, but I think I got a very bad chip. I get 2100MHz stable at 1.2v.

I don't think the extra voltage is worth it. Gonna flash it back to stock.

Thanks anyways.


----------



## FUZZFrrek

A lot of people just want to crank it to the max possible OC, but I think with Pascal the way to go is undervolting the card. You get more stability with little to no power throttling.


----------



## JimmyMo

Quote:


> Originally Posted by *stangflyer*
> 
> Before you buy the AMP Extreme look at this-
> 
> http://www.gamersnexus.net/hwreviews/2981-zotac-1080ti-amp-extreme-review
> 
> I have an extreme and I love it- I added a 2mm thermal pad over the power component heatsink so that it contacted the main heatsink, I also changed out tim while I had it apart.


I'd like to do this same mod to my ZOTAC Ti.

Was 2mm enough to make good contact with the heatsink? Sounds like it was.

What brand of pad did you go with? FujiPoly, maybe, like this one? >> http://www.frozencpu.com or this one >> http://www.performance-pcs.com

While I'm in there, I should replace the other thermal pads, as they get less effective each time they get compressed and then pulled apart.

Did you determine what those other thicknesses are? Looks like there are two different thicknesses, maybe? (light blue and the white ones)



Any advice you can give would be very welcome!

I wonder what could be done to tackle the other issue GN identified - - that the backplate's "wrap" on the underside prevents good heat conductance...


----------



## lilchronic

Quote:


> Originally Posted by *milikitungi*
> 
> I got it working now, but I think I got a very bad chip. I get 2100MHz stable at 1.2v.
> 
> I don't think the extra voltage is worth it. Gonna flash it back to stock.
> 
> Thanks anyways.


Yeah, not bad. I can get 2138MHz pretty stable with 1.2v, but it is not worth it for the amount of power and heat it puts out.
My sweet spot is .981v @ 2000MHz on the stock BIOS.
Quote:


> Originally Posted by *FUZZFrrek*
> 
> A lot of people just want to crank it to the max possible OC, but I think with Pascal the way to go is undervolting the card. You get more stability with little to no power throttling.


Well, the XOC BIOS has no power limit.
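As a rough sanity check on why the 1.2v bin isn't worth it: to first order, dynamic power scales with clock times voltage squared, so the two operating points above can be compared with a quick back-of-the-envelope estimate. This is a simplification that ignores leakage and memory power; the clocks and voltages come from the posts above, not from measurement:

```python
# Rough dynamic-power estimate: P is approximately proportional to f * V^2.
# Ignores static leakage and memory power, so treat the result as a ballpark.
def relative_power(f_base_mhz: float, v_base: float,
                   f_oc_mhz: float, v_oc: float) -> float:
    """Estimated power of the OC point relative to the base point."""
    return (f_oc_mhz / f_base_mhz) * (v_oc / v_base) ** 2

# lilchronic's sweet spot vs. his max-voltage bin
ratio = relative_power(2000, 0.981, 2138, 1.2)
clock_gain = 2138 / 2000 - 1
print(f"~{ratio:.2f}x the power for {clock_gain:.1%} more core clock")
# prints roughly: ~1.60x the power for 6.9% more core clock
```

Around 60% more estimated power for under 7% more clock, which matches why most people here settle near 1v.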


----------



## KCDC

New Driver released today, 384.94:


https://www.reddit.com/r/6p8nrw/driver_38494_faqdiscussion_thread/

Anyone try it yet? I will tonight. Hope the stuttering I've been getting in D3 goes away.


----------



## KraxKill

Quote:


> Originally Posted by *KCDC*
> 
> New Driver released today, 384.94:
> 
> 
> https://www.reddit.com/r/6p8nrw/driver_38494_faqdiscussion_thread/


----------



## stangflyer

Quote:


> Originally Posted by *vasyltheonly*
> 
> So I'm having a dilemma. I can currently get the Zotac Extreme 1080ti for about $20 more than my blower 1080ti. What is people's opinion of the card and the brand? I know it runs a bit hotter than other equal 3 fan cards but I'm debating throwing a waterblock on it.


I have an AMP Extreme 1080 Ti and I really like it.
Quote:


> Originally Posted by *JimmyMo*
> 
> I'd like to do this same mod to my ZOTAC Ti.
> 
> Was 2mm enough to make good contact with the heatsink? Sounds like it was.
> 
> What brand of pad did you go with? FujiPoly, maybe, like this one? >> http://www.frozencpu.com or this one >> http://www.performance-pcs.com
> 
> While I'm in there, I should replace the other thermal pads, as they get less effective each time they get compressed and then pulled apart.
> 
> Did you determine what those other thicknesses are? Looks like there are two different thicknesses, maybe? (light blue and the white ones)
> 
> 
> 
> Any advice you can give would be very welcome!
> 
> I wonder what could be done to tackle the other issue GN identified - - that the backplate's "wrap" on the underside prevents good heat conductance...


The 2mm was perfect! I used Thermal Grizzly Minus Pad 8; got a 100x100 square from Amazon, cut it and applied it. I redid the TIM on the core also. I did not mess with the other pads, only the heatsink for the power components. I did a loop of Superposition and I was at 56C at 65% fan, 2114 on the core, +400 on memory. I have not tested for max memory yet.
75F ambient. I think the TIM needs to cure for a few days. I have to admit that the TIM from Zotac (whatever it is) was applied perfectly.

As far as the backplate, I am going to leave that alone. I have good cooling in my case. If I were to mod it, I would just cut 4 or 5 grooves, 1/8" wide by 3 inches, with a rounded Dremel tool, so it looks like the half by the I/O.


----------



## vasyltheonly

Quote:


> Originally Posted by *stangflyer*
> 
> I have an AMP Extreme 1080 Ti and I really like it.
> The 2mm was perfect! I used Thermal Grizzly Minus Pad 8; got a 100x100 square from Amazon, cut it and applied it. I redid the TIM on the core also. I did not mess with the other pads, only the heatsink for the power components. I did a loop of Superposition and I was at 56C at 65% fan, 2114 on the core, +400 on memory. I have not tested for max memory yet.
> 75F ambient. I think the TIM needs to cure for a few days. I have to admit that the TIM from Zotac (whatever it is) was applied perfectly.
> 
> As far as the backplate, I am going to leave that alone. I have good cooling in my case. If I were to mod it, I would just cut 4 or 5 grooves, 1/8" wide by 3 inches, with a rounded Dremel tool, so it looks like the half by the I/O.


I kinda went a different route. An open-box 1080 Ti AMP Edition went on sale for $638 and I bought the Microcenter warranty. I played around with it for a day and saw the awful temperatures of the card: 80+°C at load with the fan speed at 100%.

Out of anger and the desire for cool and quiet operation, I slapped on my G10 with a universal CPU block and CLU as the thermal paste. BEST DECISION EVER!!!! The card doesn't go above 45°C and I have a stable OC of 2050MHz core and 5950MHz on memory. I think Zotac really messed up on the cooling of these cards, but the fact that they did not have a warranty sticker on their cooler made it a bit better.


----------



## xcom-

Hello everyone

First time overclocking a GPU, so I'm looking for advice on where to begin with my EVGA GTX 1080 Ti FE, which I have just put a waterblock on. 650W PSU.

Are there any recommended settings I should use and should I be using the EVGA Precision software?

Any advice is greatly appreciated.

Many Thanks


----------



## JimmyMo

Quote:


> Originally Posted by *stangflyer*
> 
> I have an AMP Extreme 1080 Ti and I really like it.
> The 2mm was perfect! I used Thermal Grizzly Minus Pad 8; got a 100x100 square from Amazon, cut it and applied it. I redid the TIM on the core also. I did not mess with the other pads, only the heatsink for the power components. I did a loop of Superposition and I was at 56C at 65% fan, 2114 on the core, +400 on memory. I have not tested for max memory yet.
> 75F ambient. I think the TIM needs to cure for a few days. I have to admit that the TIM from Zotac (whatever it is) was applied perfectly.
> 
> As far as the backplate, I am going to leave that alone. I have good cooling in my case. If I were to mod it, I would just cut 4 or 5 grooves, 1/8" wide by 3 inches, with a rounded Dremel tool, so it looks like the half by the I/O.


Awesome, thanks so much for replying!

OK, I will get the 2mm pads.

Good idea on the back plate. I might leave mine alone, too, though. Don't want to start carving into the thing...!

Can anyone guess or know what these other two thermal pad thicknesses might be? Is there a standard thickness that most manufacturers use?

If I get them slightly TOO thick, like 1.5mm, will that be a problem if they are "squished?"



...Just found another Overclock.net discussion on pads, this fellow saw .5 and 1.5mm pads in his Zotac 1070 Mini, so maybe I will get those two sizes, and hope...

http://www.overclock.net/t/1622691/thermal-pad-thickness-for-gpu#post_25843485

Any other thoughts on that, anyone?


----------



## macwin2012

Got a new 1080 Ti. It boosts to 2025 out of the box in gaming.

Looking forward to OC in 2 weeks .


----------



## GraphicsWhore

Quote:


> Originally Posted by *xcom-*
> 
> Hello everyone
> 
> First time overclocking a GPU so looking for advice were to begin with my EVGA GTX 1080 TI FE, which I have just put a waterblock on. 650w PSU
> 
> Are there any recommended settings I should use and should I be using the EVGA Precision software?
> 
> Any advice is greatly appreciated.
> 
> Many Thanks


Have the same card on water.

Here's what I'd do:

1. Download AfterBurner
2. Unlock voltage control. Turn voltage, power and temp limits to max
3. Start with +130 on core, +500 on memory. Run FireStrike.
4. If stable, increase on both.
5. Once you've found sweet spot, take note of numbers (core max, memory max, temp max)
6. Backup current BIOS
7. Flash to XOC BIOS
8. Use custom voltage curve in AfterBurner to see if you can hit 2100mhz at 1.093 (and whatever overclock on memory you were stable with on stock BIOS). If you can, do some benchmarks and compare numbers to stock BIOS. If not, either increase voltage another step if you want to keep shooting for 2100 or lower to 2088.
9. Look for a good long-term overclock that gives you solid speeds but staying in upper 40's, very low 50's on max temp (I'm at 2088mhz on core, 6003 memory at 1.075 reaching 49 after extended gaming).

PM me if you need help. Sounds overwhelming but very easy to do.


----------



## xcom-

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Have the same card on water.
> 
> Here's what I'd do:
> 
> 1. Download AfterBurner
> 2. Unlock voltage control. Turn voltage, power and temp limits to max
> 3. Start with +130 on core, +500 on memory. Run FireStrike.
> 4. If stable, increase on both.
> 5. Once you've found sweet spot, take note of numbers (core max, memory max, temp max)
> 6. Backup current BIOS
> 7. Flash to XOC BIOS
> 8. Use custom voltage curve in AfterBurner to see if you can hit 2100mhz at 1.093 (and whatever overclock on memory you were stable with on stock BIOS). If you can, do some benchmarks and compare numbers to stock BIOS. If not, either increase voltage another step if you want to keep shooting for 2100 or lower to 2088.
> 9. Look for a good long-term overclock that gives you solid speeds but staying in upper 40's, very low 50's on max temp (I'm at 2088mhz on core, 6003 memory at 1.075 reaching 49 after extended gaming).
> 
> PM me if you need help. Sounds overwhelming but very easy to do.


I am amazed! About 400 points higher than stock


----------



## GnarlyCharlie

Run it with the same settings except "Fullscreen" instead of "Windowed" and you can compare score in the Benchmarking section.


----------



## jleslie246

Quote:


> Originally Posted by *macwin2012*
> 
> Got a new 1080ti , It boost to 2025 out of the box in gaming .
> 
> Looking forward to OC in 2 weeks .


Mine boosts like that also. Which one did you get? Mine is the EVGA 1080 Ti SC Black Edition. Now I'm on a custom water loop (EKWB, Grizzly Hydronaut) and don't go over 41C under heavy load with a 2100MHz OC. I love this card.


----------



## macwin2012

Quote:


> Originally Posted by *jleslie246*
> 
> Mine boosts like that also. Which one did you get? Mine is the EVGA 1080Ti SC Black Edition. Now I'm on a custom water loop, EKWB, grizzly hydronaut, and dont go over 41c under heavy load with a 2100MHz OC. I love this card.


I have the 1080 Ti AMP Extreme. Got it since they provide a 5-year warranty in our country; better resale value for the next upgrade.

Playing Mass Effect Andromeda ultrawide on the X34A. Temps are 68/67 under load, though my ambient temp is 28C. Seems good so far.

Will overclock soon; I want a fresh GPU to do some burn-in first, just something I do.


----------



## cluster71

Quote:


> Originally Posted by *macwin2012*
> 
> I have the 1080 Ti AMP Extreme. Got it since they provide a 5-year warranty in our country; better resale value for the next upgrade.
> 
> Playing Mass Effect Andromeda ultrawide on the X34A. Temps are 68/67 under load, though my ambient temp is 28C. Seems good so far.
> 
> Will overclock soon; I want a fresh GPU to do some burn-in first, just something I do.


If you are worried about the warranty, then you should not rebuild anything. I do not care so much myself; I'm building new projects all the time, giving them away when I no longer want them, then building a new project, and I don't care if I destroy something. I never take out loans on computers; if it's paid for, you do not have much to worry about.

Here are my experiences about Zotac Extreme.

I could play at 2062 core / 11600 mem at 1.062v in the beginning, but I could not run any benchmark because the card crashed immediately at high temps within a few seconds. I swapped the cooling paste (for liquid metal) and the pads. It got somewhat better, but far from good. I took it apart again and realized that the heatsink did not have good contact with the GPU. I felt like the stock pads (1.5mm) were so thick that the spring-loaded heatsink did not press properly.
The heatsink and the cast metal fan cover are very heavy, which makes matters worse, and liquid metal is not as forgiving as normal cooling paste; it should be under high pressure.

I used 1.0mm pads and doubled up the springs on the screws to get higher pressure. Later I removed the entire fan cover and put on two 140mm fans connected to the graphics card's fan control. I kept the backplate because I get a little support from it so that it does not break the PCIe connector. The card is free-standing.

Plastic may be the wrong word, because the backplate is a bent aluminum plate? I removed almost the entire rubber support over the VRM, with only two small pieces left. I see 31-37C on the mem modules, 41C on the VRM (with an IR gun) and 35-39C on the GPU normally during gaming and benchmarks. As long as I'm under 40C on the GPU, it's OK. I have been at 2126/12110 at 1.093v for weeks now.

Feels okay. Soon, a new project.


----------



## Streetdragon

2100 on the core sounds so nice....... but for me the jump from 2000 to 2100 is over 100 watts, going from 1V to 1.093V.. argh


----------



## ZealotKi11er

Quote:


> Originally Posted by *Streetdragon*
> 
> 2100 on the core sounds so nice....... but for me the jump from 2000 to 2100 is over 100 watts, going from 1V to 1.093V.. argh


It's only for benchmarks. There is no point in running more than 1v on this card for gaming. 2000 to 2100 is a 5% OC, which in games means 2-3% more fps for 20-30% more power.
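Those rough figures can be put into fps-per-watt terms with a tiny illustrative calculation. The baseline fps and wattage below are hypothetical placeholders chosen only for scale; only the 2-3% fps and 20-30% power numbers come from the posts above:

```python
# Illustrative perf-per-watt comparison for a 2000 -> 2100 MHz overclock,
# using the thread's rough figures: ~2.5% more fps for ~25% more power.
# base_fps and base_power_w are hypothetical, chosen only for scale.
base_fps, base_power_w = 100.0, 250.0
oc_fps = base_fps * 1.025         # midpoint of the quoted 2-3% fps gain
oc_power_w = base_power_w * 1.25  # midpoint of the quoted 20-30% power increase

base_eff = base_fps / base_power_w
oc_eff = oc_fps / oc_power_w
print(f"stock: {base_eff:.3f} fps/W, OC: {oc_eff:.3f} fps/W "
      f"({1 - oc_eff / base_eff:.0%} efficiency loss)")
```

Whatever the baseline, the ratio is the same: the last 100MHz costs roughly a fifth of the card's efficiency.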










----------



## JimmyMo

Quote:


> Originally Posted by *JimmyMo*
> 
> Awesome, thanks so much for replying!
> 
> OK, I will get the 2mm pads.
> 
> Good idea on the back plate. I might leave mine alone, too, though. Don't want to start carving into the thing...!
> 
> **Can anyone guess or know what these other two thermal pad thicknesses might be? Is there a standard thickness that most manufacturers use?
> 
> If I get them slightly TOO thick, like 1.5mm, will that be a problem if they are "squished?"
> 
> 
> 
> ...Just found another Overclock.net discussion on pads, this fellow saw .5 and 1.5mm pads in his Zotac 1070 Mini, so maybe I will get those two sizes, and hope...
> 
> http://www.overclock.net/t/1622691/thermal-pad-thickness-for-gpu#post_25843485
> 
> Any other thoughts on that, anyone?


Quote:


> Originally Posted by *cluster71*
> 
> Here are my experiences about Zotac Extreme.
> 
> I could play at 2062 core / 11600 mem at 1.062v in the beginning, but I could not run any benchmark because the card crashed immediately at high temps within a few seconds. I swapped the cooling paste (for liquid metal) and the pads. It got somewhat better, but far from good. *I took it apart again and realized that the heatsink did not have good contact with the GPU. I felt like the stock pads (1.5mm) were so thick that the spring-loaded heatsink did not press properly.*
> The heatsink and the cast metal fan cover are very heavy, which makes matters worse, and liquid metal is not as forgiving as normal cooling paste; it should be under high pressure.
> 
> *I used 1.0mm pads and doubled up the springs on the screws to get higher pressure.* Later I removed the entire fan cover and put on two 140mm fans connected to the graphics card's fan control. I kept the backplate because I get a little support from it so that it does not break the PCIe connector. The card is free-standing.


This was VERY helpful, thank you!

Sounds like ZOTAC's stock pad thickness is 1.5mm, but that it prevents good contact with the GPU and heatsink.

Based on that, I think I will get 2.0mm for the VRMs and remove that rubber strip, and then go with 1.0mm or even .5mm thermal pads for the area around the GPU and heatsink!

Since even with 1mm pads you had to add second tension springs to the GPU block mount, I think that .5mm pads sound like the way to go.

This photo I found at https://www.hardwarebbq.com shows a nice shot of the thermal pads - - they sure do look thick!


----------



## CoreyL4

Any new sweet bios updates or anything?


----------



## KedarWolf

Quote:


> Originally Posted by *CoreyL4*
> 
> Any new sweet bios updates or anything?


Nothing new but got this today on the Arctic Storm BIOS!

Peeps, use the stand-alone 3DMark, not the Steam version, you'll do a fair bit better benching.

You can get the CD keys from your 3DMark login on http://www.3dmark.com/search to use on the downloaded non-Steam version.

This is the fastest server I found to download it, took me two minutes.

https://www.techpowerup.com/download/futuremark-3dmark-timespy/

Click on Download and choose the 'DE-2 Server'.



http://www.3dmark.com/3dm/21263545


----------



## KedarWolf

Updated OP to latest Afterburner beta download direct from MSI, beta 12 when I checked.

http://download.msi.com/uti_exe/Afterburner_4.4.0.zip


----------



## Kriant

I can finally join the club. Made the jump to 1080 Tis in SLI.


----------



## TheBoom

Quote:


> Originally Posted by *KedarWolf*
> 
> Nothing new but got this today on the Arctic Storm BIOS!
> 
> Peeps, use the stand-alone 3DMark, not the Steam version, you'll do a fair bit better benching.
> 
> You can get the CD keys from your 3DMark login on http://www.3dmark.com/search to use on the downloaded non-Steam version.
> 
> This is the fastest server I found to download it, took me two minutes.
> 
> https://www.techpowerup.com/download/futuremark-3dmark-timespy/
> 
> Click on Download and choose the 'DE-2 Server'.
> 
> 
> 
> http://www.3dmark.com/3dm/21263545


How much difference are we talking? I don't see why the Steam version should perform worse, though.


----------



## KedarWolf

Quote:


> Originally Posted by *TheBoom*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Nothing new but got this today on the Arctic Storm BIOS!
> 
> Peeps, use the stand-alone 3DMark, not the Steam version, you'll do a fair bit better benching.
> 
> You can get the CD keys from your 3DMark login on http://www.3dmark.com/search to use on the downloaded non-Steam version.
> 
> This is the fastest server I found to download it, took me two minutes.
> 
> https://www.techpowerup.com/download/futuremark-3dmark-timespy/
> 
> Click on Download and choose the 'DE-2 Server'.
> 
> 
> 
> http://www.3dmark.com/3dm/21263545
> 
> 
> 
> How much difference are we talking? Don't see why the steam version should perform worse though.

I was getting 11136 or so on the Steam version; the highest I ever got on it was 11166, so this is a bit of an improvement.

Exact same Nvidia and Afterburner settings as in the Steam version; an earlier run was 11033 with Steam.


----------



## PimpSkyline

Quote:


> Originally Posted by *KedarWolf*
> 
> Updated OP to latest Afterburner beta download direct from MSI, beta 12 when I checked.
> 
> http://download.msi.com/uti_exe/Afterburner_4.4.0.zip


Awesome.

Any word on when the 4.4.0 FINAL is to be released? Really waiting on that to finally get rid of Precision, but still want my Pascal support.


----------



## Sgang

Quote:


> Originally Posted by *JimmyMo*
> 
> This was VERY helpful, thank you!
> 
> Sounds like ZOTAC's stock pad thickness is 1.5mm, but that it prevents good contact with the GPU and heatsink.
> 
> Based on that, I think I will get 2.0mm for the VRMs and remove that rubber strip, and then go with 1.0mm or even .5mm thermal pads for the area around the GPU and heatsink!
> 
> Since even with 1mm pads you had to add second tension springs to the GPU block mount, I think that .5mm pads sound like the way to go.
> 
> This photo I found at https://www.hardwarebbq.com shows a nice shot of the thermal pads - - they sure do look thick!


Can you please update us on your work?
I have a Zotac AMP Extreme Core Edition and would like to improve it; this seems like a good way to go...
Do you also have a suggestion on the firmware to flash? Would the Extreme firmware help me?


----------



## Alex132

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CoreyL4*
> 
> Any new sweet bios updates or anything?
> 
> 
> 
> Nothing new but got this today on the Arctic Storm BIOS!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Peeps, use the stand-alone 3DMark, not the Steam version, you'll do a fair bit better benching.
> 
> You can get the CD keys from your 3DMark login on http://www.3dmark.com/search to use on the downloaded non-Steam version.
> 
> This is the fastest server I found to download it, took me two minutes.
> 
> https://www.techpowerup.com/download/futuremark-3dmark-timespy/
> 
> Click on Download and choose the 'DE-2 Server'.
> 
> 
> 
> http://www.3dmark.com/3dm/21263545
Click to expand...

Gonna do this tonight, and test on my slave BIOS with the actual +127% power limit... been using the Master BIOS which only had +117%


----------



## JimmyMo

Quote:


> Originally Posted by *Sgang*
> 
> Can you please update us on your work?
> I have a Zotac AMP Extreme Core Edition and would like to improve it; this seems like a good way to go...
> Do you also have a suggestion on the firmware to flash? Would the Extreme firmware help me?


No advice on firmware, sorry, I'm just sticking with the Amp Extreme (non-Core) firmware that it came with.

Since you asked, here's an update on thermal pad replacement - -

Good pads are expensive!

I want to make sure that I only buy the exact amount of pads that I need, and I didn't want to take things apart here at home to get measurements.

So, Guru3D had a review with a nice photograph of the board that included a scale.



I'm going to use .5mm thermal pads everywhere, and then a 2mm thick pad to make contact between the VRM heatsink and the main heatsink.

I used that photo, drew some lines across it to get lengths, and came up with:

- Memory, 55mm x 15mm, and need 3 lengths = 165mm long, total

- VRMs, 85mm long, so need one 85mm length of .5 pad under black heatsink AND then another 85mm of 2mm pad on top

- Smaller heatsink to lower right, 20mm, and again, need one .5 length under black heatsink AND then another 20mm of 2mm pad on top

With all those lengths, I've decided these look best:



(I'll need TWO of these in order to have enough length for everything. The one used under the VRM heatsink will have to be from two pieces, as well)



(Just one of these needed)

Next up, ordering and getting it installed! Will probably be a few weeks...
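The totals above can be sanity-checked with a quick script. The millimeter figures are my photo-derived estimates from the Guru3D shot, not official measurements:

```python
# Sanity check of the pad-length estimates above (photo-derived, not official).
pieces_05mm = {
    "memory (3 strips x 55 mm)": 3 * 55,
    "VRM row under the black heatsink": 85,
    "small lower-right heatsink": 20,
}
pieces_2mm = {
    "VRM heatsink to main heatsink": 85,
    "small heatsink to main heatsink": 20,
}

total_05mm = sum(pieces_05mm.values())
total_2mm = sum(pieces_2mm.values())
print(total_05mm, "mm of 0.5 mm pad")  # 270 mm
print(total_2mm, "mm of 2 mm pad")     # 105 mm
```

Divide those totals by the strip length of whatever pad sheets you buy to see how many you need to order.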


----------



## Kriant

Quote:


> Originally Posted by *JimmyMo*
> 
> No advice on firmware, sorry, I'm just sticking with the Amp Extreme (non-Core) firmware that it came with.
> 
> Since you asked, here's an update on thermal pad replacement - -
> 
> Good pads are expensive!
> 
> I want to make sure that I only buy the exact amount of pads that I need, and I didn't want to take things apart here at home to get measurements.
> 
> So, Guru3D had a review with a nice photograph of the board that included a scale.
> 
> 
> 
> I'm going to use .5mm thermal pads everywhere, and then a 2mm thick pad to make contact between the VRM heatsink and the main heatsink.
> 
> I used that photo, drew some lines across it to get lengths, and came up with:
> 
> - Memory, 55mm x 15mm, and need 3 lengths = 165mm long, total
> 
> - VRMs, 85mm long, so need one 85mm length of .5 pad under black heatsink AND then another 85mm of 2mm pad on top
> 
> - Smaller heatsink to lower right, 20mm, and again, need one .5 length under black heatsink AND then another 20mm of 2mm pad on top
> 
> With all those lengths, I've decided these look best:
> 
> 
> 
> (I'll need TWO of these in order to have enough length for everything. The one used under the VRM heatsink will have to be from two pieces, as well)
> 
> 
> 
> (Just one of these needed)
> 
> Next up, ordering and getting it installed! Will probably be a few weeks...


Def. post your results once done. I'm contemplating doing the same, after watching the gamingnexus video on the Amp Extreme.


----------



## Sgang

Same here, I will do the same as you guys, so waiting for your pics!


----------



## cluster71

Quote:


> Originally Posted by *JimmyMo*
> 
> No advice on firmware, sorry, I'm just sticking with the Amp Extreme (non-Core) firmware that it came with.
> 
> Since you asked, here's an update on thermal pad replacement - -
> 
> Good pads are expensive!
> 
> I want to make sure that I only buy the exact amount of pads that I need, and I didn't want to take things apart here at home to get measurements.
> 
> So, Guru3D had a review with a nice photograph of the board that included a scale.
> 
> 
> 
> I'm going to use .5mm thermal pads everywhere, and then a 2mm thick pad to make contact between the VRM heatsink and the main heatsink.
> 
> I used that photo, drew some lines across it to get lengths, and came up with:
> 
> - Memory, 55mm x 15mm, and need 3 lengths = 165mm long, total
> 
> - VRMs, 85mm long, so need one 85mm length of .5 pad under black heatsink AND then another 85mm of 2mm pad on top
> 
> - Smaller heatsink to lower right, 20mm, and again, need one .5 length under black heatsink AND then another 20mm of 2mm pad on top
> 
> With all those lengths, I've decided these look best:
> 
> 
> 
> (I'll need TWO of these in order to have enough length for everything. The one used under the VRM heatsink will have to be from two pieces, as well)
> 
> 
> 
> (Just one of these needed)
> 
> Next up, ordering and getting it installed! Will probably be a few weeks...


I did not have many good-quality pads at home, so I cut the memory pads to module size instead of using entire strips.

On the GPU die itself I used Thermal Grizzly Conductonaut. Around the die I applied a thin layer of ordinary paste to avoid a short circuit. I do not know if it's really a risk; I did the same on my delidded CPU.


----------



## ZealotKi11er

Man these thermal pads are just too expensive.


----------



## JimmyMo

Quote:


> Originally Posted by *cluster71*
> 
> I did not have many good-quality pads at home, so I cut the memory pads to module size instead of using entire strips.


Oh! That's smart, thank you! I will do that, too, and save on the amount of pads I use! They are expensive, and I'd like to get the most mileage out of them that I can!


----------



## TheWizardMan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Man these thermal pads are just too expensive.


They also aren't necessary. I bought pricey pads because I don't have any budget constraints, but if you do, then don't buy thermal pads.


----------



## ZealotKi11er

Quote:


> Originally Posted by *TheWizardMan*
> 
> They also aren't necessary. I bought pricey pads because I don't have any budget constraints, but if you do, then don't buy thermal pads.


People underestimate the effect of good thermal pads. With my 290X, VRM1 temps dropped by 20C when I stopped using the EK stock thermal pads.


----------



## TheWizardMan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> People underestimate the effect of good thermal pads. With 290X my VRM1 temps dropped by 20C not using the EK stock thermal pads.


The actual impact of that tends to be much smaller in real-world scenarios, though. But I agree, which is why I bought them. That said, the EK stock pads are better than the Nvidia pads.


----------



## cluster71

The ones I got were the Phobya Thermal Pad XT 1mm, 7 W/mK (7.21 USD), and they work for me.

I never switched the VRM pads. I didn't feel it was necessary to access the GPU, and I removed the rubber strip to get better cooling.


----------



## cluster71

KFA2 / Galax GeForce GTX 1080 Ti HOF

http://www.guru3d.com/articles-pages/kfa2-geforce-gtx-1080-ti-hof-review,1.html

Nice if you have a white chassis and motherboard, but I'm just so disappointed with the BIOS/software holding you back. You have 450W available but the TDP is still 250W, so even if the card boosts up to 400W, it retracts to 280-300W under continuous load. It feels like having an F1 car but only being allowed to drive with a brick under the gas pedal.

I'm so ******* tired of Zotac: awful software, and very difficult or impossible to get waterblocks. No voltage control for the memory despite it being a top model. No information about hardware mods on the internet.

The hardware itself is very impressive, so the whole package could be so much better if they focused more on the other pieces.

I have been in touch with Zotac support several times. They just want to talk about installation issues in Windows. If you ask about hardware or software, they do not respond to the email at all.

Next time it will be full water, and it will be Asus or EVGA, where there is more information, more competition and more accessories.


----------



## KedarWolf

Anyone who needs Fujipoly 17.0 W/mK pads in Canada, do what I did.

Go to www.amazon.com and buy the pads from there by choosing the Other Sellers option when it says Prime does not ship to Canada.

I bought two sets of pads, including shipping, for less than $95 USD, which is about $120 CAD for both sets.

You may pay a bit of tax when it gets to you on your end, but it's still a big savings.

Each set of pads separately on Amazon.ca is about $120 CAD, so I cut the cost in half, as I'd pay tax on that $120 x 2 anyway.
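The savings math above, sketched out quickly. All figures are the rounded forum prices from this post, not live listings:

```python
# Sketch of the cross-border savings math (post's rounded prices, not live listings).
amazon_com_two_sets_usd = 95     # two sets shipped from amazon.com
usd_to_cad = 120 / 95            # exchange rate implied by the post
amazon_ca_per_set_cad = 120      # one set bought on amazon.ca

cost_com_cad = amazon_com_two_sets_usd * usd_to_cad  # ~120 CAD for two sets
cost_ca_cad = 2 * amazon_ca_per_set_cad              # 240 CAD for two sets
print(round(cost_ca_cad - cost_com_cad), "CAD saved before import tax")
```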


----------



## DerComissar

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Man these thermal pads are just too expensive.


Even worse for us in Canada, because our buck sucks, lol.

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheWizardMan*
> 
> They also aren't necessary. I bought pricey pads because I don't have any budget constraints, but if you do, then don't buy thermal pads.
> 
> 
> 
> People underestimate the effect of good thermal pads. With 290X my VRM1 temps dropped by 20C not using the EK stock thermal pads.

I agree.

And the stock pads included with most blocks are very low W/mK, in comparison to the Fujipoly variants.

Quote:


> Originally Posted by *KedarWolf*
> 
> Anyone need Fuji 17.0 pads in Canada do what I did.
> 
> Go to www.amazon.com, buy the pads from there by choosing Other Sellers option when it says Prime does not ship to Canada.
> 
> I bought two sets of pads including shipping for less than $95 USD which is about $120 CAD for both sets.
> 
> You may pay a bit of tax when it gets to you on your end but still a big savings.
> 
> Each set of pads separately on Amazon.ca is about $120 CAD so I cut the cost in half as I'd pay tax on that $120X2 anyways.


Pretty good price, and shipping discount.
I paid more than that for the Fujipoly 14 W/mk pads, buying them through ModMyMods.

I think I spent as much on three sets of pads as the damn block cost, lol.


----------



## cluster71

I was wrong, it's a plastic backplate on the Zotac. It felt like metal. Removed mine. Haven't installed the card yet.

5 W/mK pads on the memory modules this time.


----------



## cluster71

I do not know if removing the backplate does anything for overclocking, I have not tested yet, but it may affect stability either way. In any case, there is better airflow over the heatsink now.

I made a quick test. It seems to hold 40C, which was the target. Ambient temperature was higher than usual.


----------



## TheBoom

Quote:


> Originally Posted by *ZealotKi11er*
> 
> People underestimate the effect of good thermal pads. With 290X my VRM1 temps dropped by 20C not using the EK stock thermal pads.


At the end of the day the question is do these thermal pads dropping VRM temps by a load help overclocking/performance in anyway? I'm 99% sure that's a no lol. Given that VRMs are already rated for much higher temperatures I fail to see the point of doing this other than to simply ensure that the entirety of the components on the card runs cooler just for the sake of it.

It's not so much as underestimating the effect but mis-estimating how it affects overall performance and/or overclocking.


----------



## GreedyMuffin

Question. I've had a 1080 Ti under water since launch. I've been running 1950MHz at 0.900V so the card never hits its power limit. It's usually around 90 percent, +/- some, but has anyone else experimented with undervolting plus overclocking? I feel like the performance could be better if I used the "old school" way of overclocking.


----------



## Alex132

Score 12 198 with NVIDIA GeForce GTX 1080 Ti(1x) and Intel Core i5-2500K Processor
http://www.3dmark.com/fs/13237433

I wouldn't say I've got the best 1080 Ti. Can't get it past 2050MHz.









Oh well, not too bothered.


----------



## Psykoking

Quote:


> Originally Posted by *TheBoom*
> 
> At the end of the day the question is do these thermal pads dropping VRM temps by a load help overclocking/performance in anyway? I'm 99% sure that's a no lol. Given that VRMs are already rated for much higher temperatures I fail to see the point of doing this other than to simply ensure that the entirety of the components on the card runs cooler just for the sake of it.
> 
> It's not so much as underestimating the effect but mis-estimating how it affects overall performance and/or overclocking.


The thing with VRMs and temperature is that the hotter the VRMs get, the more ripple is produced on the voltage supplied to the chip. And the higher the fluctuation of that rippling voltage, the higher the chance the chip becomes unstable at higher frequencies, since the chip might be under load while the supplied voltage momentarily just isn't enough for it. In the case of Zotac (where the VRMs go up to over 90°C), it is likely you could win 1-2 bins if you manage to cool them properly.
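A rough way to see the effect: MOSFET on-resistance rises with temperature (typically around 0.3-0.5 %/°C), so the voltage droop across a hot phase at a given current is larger than across a cool one. This toy model uses generic FET numbers as assumptions, not measured Zotac values:

```python
# Toy model of VRM temperature vs. voltage droop. The 5 mOhm FET, the
# 0.4 %/°C tempco and the 30 A per-phase current are generic assumptions,
# not measurements of any 1080 Ti.
def rdson_at_temp(rdson_25c_mohm, temp_c, tempco_per_c=0.004):
    """Scale 25 C on-resistance by a linear temperature coefficient."""
    return rdson_25c_mohm * (1 + tempco_per_c * (temp_c - 25))

def droop_mv(current_a, rdson_mohm):
    """Voltage droop across one phase's FET at a given current (mA * mOhm = mV scale)."""
    return current_a * rdson_mohm

r_cool = rdson_at_temp(5.0, 60)  # same FET at 60 C
r_hot = rdson_at_temp(5.0, 95)   # and at 95 C
extra = droop_mv(30, r_hot) - droop_mv(30, r_cool)
print(round(extra, 1), "mV extra droop per phase when hot")  # 21.0
```

A couple of tens of millivolts of extra droop/ripple is in the same ballpark as one voltage bin on these cards, which is consistent with the 1-2 bin estimate above.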


----------



## Fatchicken

Ok guys, give it to me straight, what the hell is going on with my graphics card.

It's an Asus 1080 Ti Strix OC that I got a couple of months ago. I've been struggling to get decent performance out of it; it just isn't a great overclocker.
For the last few months I ran it at +100 voltage, 120% PL, +35 core and +460 mem. The card used to settle around 1989MHz core clock and stay there consistently; in GPU-Z it showed a constant VREL perfcap and that was it.
I tried everything to get it up to 2000MHz but it just wouldn't have it; these settings were the edge of its stability.

This is where the fun starts.
Yesterday I did some work on my rig: a new PSU (higher wattage and better efficiency), and I delidded my 6700K to squeeze an extra 0.2GHz out of it (stress tested for hours and stable).
Ever since, my card has been all over the place. Using the old settings it now clocks down to about 1923MHz, the core clocks are jumpy as hell, and it's hitting both the VREL and PWR perfcaps.
I've spent half a day playing around with the settings and found that +100 voltage, 120% PL, +45 core and +430 mem is stable now; the card actually settles around 2012MHz while playing games for some magical reason, something that was impossible before.
In heavy tasks like 4K Superposition the core clock is still extremely jumpy, constantly switching between 1975 and 2025MHz, and in the attached screenshot the perfcap graph looks weird as well.
The card is stable at the moment, the score I'm getting in Superposition is higher than before, and the Firestrike Ultra stress test gives me 98.7% framerate stability too.

Is any of this somewhat normal, or is GPU Boost just being weird again? I'm confused.


----------



## trickeh2k

Ugh... so I'm joining the 1080 ti club after rocking a 780 for three years but just found out that it seems like the waterblock won't be out for another few months...


----------



## sammaz

You are downclocking due to temps.

Set your fan to 100% and re-run the tests


----------



## ZealotKi11er

Quote:


> Originally Posted by *TheBoom*
> 
> At the end of the day the question is do these thermal pads dropping VRM temps by a load help overclocking/performance in anyway? I'm 99% sure that's a no lol. Given that VRMs are already rated for much higher temperatures I fail to see the point of doing this other than to simply ensure that the entirety of the components on the card runs cooler just for the sake of it.
> 
> It's not so much as underestimating the effect but mis-estimating how it affects overall performance and/or overclocking.


Well, in my case, running a Hybrid card, the VRM is still under air and the back gets hot. If I use a BIOS with a higher TDP I'm pulling 400W vs 250W. I'm pretty sure VRM temps matter there.


----------



## KedarWolf

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheBoom*
> 
> At the end of the day the question is do these thermal pads dropping VRM temps by a load help overclocking/performance in anyway? I'm 99% sure that's a no lol. Given that VRMs are already rated for much higher temperatures I fail to see the point of doing this other than to simply ensure that the entirety of the components on the card runs cooler just for the sake of it.
> 
> It's not so much as underestimating the effect but mis-estimating how it affects overall performance and/or overclocking.
> 
> 
> 
> Well in my case which I run a Hybrid card the VRM is still under air and the back gets hot. If I use a BIOS with higher TDP I am pulling 400W vs 250W. I am pretty sure the temps of VRM matter there.

Got some pads coming, .5 mm and 1.5 mm

https://www.amazon.com/Fujipoly-mod-smart-Extreme-Thermal/dp/B00ZSJQDYA/ref=sr_1_4?ie=UTF8&qid=1501373205&sr=8-4&keywords=fujipoly


----------



## ZealotKi11er

Quote:


> Originally Posted by *KedarWolf*
> 
> Got some pads coming, .5 mm and 1.5 mm
> 
> https://www.amazon.com/Fujipoly-mod-smart-Extreme-Thermal/dp/B00ZSJQDYA/ref=sr_1_4?ie=UTF8&qid=1501373205&sr=8-4&keywords=fujipoly


What do I need the .5mm for? I assume the 1.5mm is for the vRAM and VRM. Is the .5mm to replace those small messy pads?


----------



## KedarWolf

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Got some pads coming, .5 mm and 1.5 mm
> 
> https://www.amazon.com/Fujipoly-mod-smart-Extreme-Thermal/dp/B00ZSJQDYA/ref=sr_1_4?ie=UTF8&qid=1501373205&sr=8-4&keywords=fujipoly
> 
> 
> 
> What do I need the .5mm for? I assume 1.5 is the vRAM and VRM. Is .5 to replace those small messy pads.

I'm pretty sure you use the .5mm on the vram and the 1.5mm on the VRM but I may be wrong.


----------



## ZealotKi11er

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm pretty sure you use the .5mm on the vram and the 1.5mm on the VRM but I may be wrong.


The stuff Nvidia uses on the vRAM seems very thick. It's those sticky white pads.


----------



## KedarWolf

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I'm pretty sure you use the .5mm on the vram and the 1.5mm on the VRM but I may be wrong.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The stuff Nvidia uses on the vRAM seems very thick. It's those sticky white pads.

I read somewhere it's better to use 1.0 or .5 on the memory and 1.5 on the VRMs with EK blocks.


----------



## cluster71

Just check that there is no short circuit if the card is intended for 1.5mm and you use 0.5mm. I have not tested 0.5mm, but they can be difficult to handle, according to some people.


----------



## ZealotKi11er

Quote:


> Originally Posted by *cluster71*
> 
> Just check that there is no short circuit if the card is intended for 1.5mm and you take 0.5mm. I have not tested 0.5mm but they can be difficult to handle according to some people.


I mean if you put 0.5 and it does not make contact you can kill the card.


----------



## cluster71

Yes, both ways. If there is an air gap, or if the water block comes into contact with some components.


----------



## FastEddieNYC

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Well in my case which I run a Hybrid card the VRM is still under air and the back gets hot. If I use a BIOS with higher TDP I am pulling 400W vs 250W. I am pretty sure the temps of VRM matter there.


There is another major benefit to cooling the VRMs: the memory chips nearest to them also heat up, which can limit your stable memory OC. I went from a +525 to a +675 OC, Witcher 3 stable, by using the passive backplate for my Kryographics WB. It cools the backside of the VRMs, and I also added extra thermal pads on the backside of the memory chips. The backplate gets quite warm while gaming. I'm considering upgrading to the active backplate, or adding memory heatsinks to the backplate to help dissipate the heat.


----------



## ZealotKi11er

Quote:


> Originally Posted by *FastEddieNYC*
> 
> There is another major benefit to cooling the VRMs: the memory chips nearest to them also heat up, which can limit your stable memory OC. I went from a +525 to a +675 OC, Witcher 3 stable, by using the passive backplate for my Kryographics WB. It cools the backside of the VRMs, and I also added extra thermal pads on the backside of the memory chips. The backplate gets quite warm while gaming. I'm considering upgrading to the active backplate, or adding memory heatsinks to the backplate to help dissipate the heat.


How hot is it? If you can hold your hand on it without burning, it's probably 50-55C. If you can only hold it for 3-4 seconds, it's in the 6XC range. Otherwise you will feel it right away.


----------



## KCDC

I'm doing the shunt mod and a tear-down next weekend... two cards.

Since I have the time, I might as well get some Fujipolys for the entire card, including the inductors and such that Nvidia put the crappy white pads on.

I've read through y'all's posts, but is there an actual list of what's needed, pad-wise, including the little areas with the thin white ones? They're expensive and I don't want to buy too much or the wrong stuff.


----------



## FastEddieNYC

Quote:


> Originally Posted by *ZealotKi11er*
> 
> How hot is it? If you can hold your hand on it without burning, it's probably 50-55C. If you can only hold it for 3-4 seconds, it's in the 6XC range. Otherwise you will feel it right away.


It gets pretty warm compared to my core load temp of 35C. My guess is 45~50C. My card is shunt modded and I play The Witcher 3 (with HairWorks on) at [email protected] [email protected] and can play at [email protected] stable, so I'm pushing a good bit of current through the VRMs. I wanted to buy the SC2 with the extra temp monitoring so I'd know the VRM temps, but it is incompatible with full-cover water blocks.


----------



## NBrock

Hey everyone. I just swapped to an mATX board from my mITX because I am looking to get a second 1080 Ti, mainly for folding at home and whatever supports SLI.
Obviously my current PSU isn't enough for two, but I was wondering what kind of wattage you guys would recommend. The cards will be overclocked, and my 4790K is running 4.7 at ~1.31v.
Other items in the rig are 3 SSDs, 7 120mm fans (including the ones on the H100i), and the H100i.

Thanks


----------



## ZealotKi11er

Your current PSU is fine.


----------



## Youngtimer

Hello, I just modded my Palit Jetstream 1080 Ti with some new fans. The Palit/Gainward heatsink is pretty fine, but the fans can be improved a lot. I chose 2x120mm NB eLoops PWM (B12-P) with a range of 800-2000rpm because of their very high static pressure, but I think good Noctuas will also work. Together with an adapter and a Y-cable I can use the Palit's own PWM fan port and adjust the fans via MSI Afterburner, just like the 92mm stock fans.

I'm quite happy with the results inside my closed midi tower and would like to share some pictures/screenshots with you:


http://imgur.com/aOQsr


Greetings from Germany
Felix


----------



## GraphicsWhore

Decided to juice some more and with a bit of fine tuning in AB managed some new high graphics scores. This is at 2114/6039 and 1.093v. Several more runs after these screens in and hitting 50. Water-cooling's the best.


----------



## Sweetwater

Nice man!

I just now tried a few runs of SuPo and got this score with 2114/6075 and 1.093.

This is only on air cooling it would be interesting to see what the card could do on water with lower temps.


----------



## GraphicsWhore

Quote:


> Originally Posted by *Sweetwater*
> 
> Nice man!
> 
> I just now tried few runs of SuPo and got this score with 2114 / 6075 and 1.093.
> 
> This is only on air cooling it would be interesting to see what the card could do on water with lower temps.


Yup that's beast. I'd definitely try other cooling.


----------



## Fatchicken

I just flashed the XOC bios and my card is so much more stable now.
Would it be safe to run it like this 24/7? The temperatures are fine and the voltage seems to max out at 1.131V under load.


----------



## sdmf74

subbed


----------



## Streetdragon

Quote:


> Originally Posted by *Fatchicken*
> 
> I just flashed the XOC bios and my card is so much more stable now.
> Would it be safe to run it like this 24/7? The temperatures are fine and the voltage seems to max out at 1.131V under load.


For my taste it's too high. BUT

"If you can cool it, clock it!"

ATM I'm running 2025/6003 at 1V. 100MHz more or less is not worth it, in my opinion^^


----------



## Sgang

How would you judge these results?
It seems I reached the maximum before the Zotac AMP Extreme Core crashes.



I raised the memory and GPU clocks further, but with worse results (I've repeated the test several times).


----------



## cluster71

Throw away that garbage Firestorm. You have no voltage measurement, you cannot create or correct the clock/voltage curve, and the card only goes to 1.081v with Firestorm. I have the Extreme myself. Use Afterburner, so it's easier to get help.

And the graphics score is the same.


----------



## utparatrooper

Not sure how the image is coming through. But when I fold on my 2 Tis and a 6850, all with moderate overclocks, I get a peak readout of over 800 watts from my UPS. My UPS is 900 watts, and sometimes if I leave it folding unattended the peak usage will exceed the limit and the UPS will reset.

My experience only. Others may differ. I have an EVGA 1000 P2.

Edit:

This is in response/comment to NBrock's concern on the PSU.


----------



## cluster71

I do not know if the UPS is fast enough or can handle the load. Ideally, you should measure at the wall with a power meter during a benchmark to see what load you have, and then keep a margin of maybe 20%. So I guess the UPS needs to be bigger.
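The sizing rule above in one line: measured peak wall draw plus a safety margin. The 800 W figure is the peak readout reported earlier in the thread; the 20% margin is the rule of thumb, not a spec:

```python
# Back-of-envelope UPS sizing: measured peak wall draw plus a safety margin.
# 800 W peak and 20% margin are this thread's numbers, not a specification.
def required_ups_watts(measured_wall_w, margin=0.20):
    """Minimum UPS wattage rating for a given peak wall draw plus margin."""
    return measured_wall_w * (1 + margin)

peak_draw = 800  # W, peak readout while folding on two Tis and a 6850
print(required_ups_watts(peak_draw))  # 960.0 -> a 900 W UPS is undersized
```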


----------



## Neon Lights

Anyone know a way to get higher voltages than what standard Afterburner voltage control allows?


----------



## cluster71

The BIOS is usually the limit: 1.093v stock, and 1.2v with the XOC BIOS.


----------



## Neon Lights

Could I get higher performance (through higher voltage) with the XOC BIOS?


----------



## cluster71

It is not certain; it depends on the conditions, so try it yourself. Adjust the clock/voltage curve with Ctrl + F.


----------



## Neon Lights

What conditions would that be?

I have the 1080 Ti watercooled and shunt modded.


----------



## nyk20z3

Strix 1080 Ti OC Under Water -


----------



## Besty

FYI - with a 1080 Ti Hall of Fame I can set 1.3 volts using the Galax tool Xtreme Tuner Plus. As expected it crashes the GPU, as I cannot set a curve.

Does anyone know how to create a voltage curve, using any available software, that can use 1.3 volts? Voltage control is currently greyed out in Afterburner.


----------



## TheBoom

Quote:


> Originally Posted by *Neon Lights*
> 
> Could I get higher performance (through higher voltage) with the XOC BIOS?


I've seen some users here claim that shunt-modded cards do better with the stock FE BIOS rather than XOC. XOC seems to be the best option for those who don't want to shunt mod.

Won't hurt to try though; higher voltage will allow for higher stable clocks. I use XOC but run at lower clocks to reduce heat, and the gains from slightly higher clocks are not huge, at least with my card. Kind of like diminishing returns.


----------



## TheBoom

Quote:


> Originally Posted by *Psykoking*
> 
> The thing with VRMs and temperature is the hotter the VRMs get the more ripple is produced on the voltage curve that is supplied to the chip. And the higher the fluctuation of the rippling voltage the higher the chance is that the chip will become instable on higher frequencies since the chip might be under load but the supplied voltage is just not enough for it. In case of ZOTAC ( where VRMs go up to over 90°C ) it is likely that you might win 1-2 bins if you manage to cool them proper.


Would love it if someone did a proper test of this, though. I thought of it myself, but some guys over at Nexus claim that it will in no way affect overclocking performance. So I dropped the idea.

I have an Amp Extreme too anyway.

Edit: Before you ask why I don't test it myself, I really want to, but it's extremely hard to get pads of decent quality over here. Not to mention I'd probably have to stack two sets of 1.0mm pads, which wouldn't really help VRM temps much.


----------



## Psykoking

Quote:


> Originally Posted by *TheBoom*
> 
> Would love it if someone did a proper test with this though. I thought of it myself, but some guys over at Nexus claim that it will in no way affect overclocking performance. So I dropped the idea.
> 
> I have an Amp Extreme too anyway.
> 
> Edit : Before you ask why I don't test it myself, I really want to but its extremely hard to get pads of decent quality over here. Not to mention I'd probably have to stack two sets of 1.0mm pads which wouldn't really help VRM temps much.


Might look into this, since I already plan to put my Amp Extreme under water with a custom-made waterblock within the next few weeks. ATM, however, I'm too busy programming and constructing for a client.


----------



## TheBoom

Quote:


> Originally Posted by *Psykoking*
> 
> Might look into this since i already plan to set my amp extreme under water with a custom made waterblock within the next few weeks. Atm however I'm too busy programming and constructing for a client.


Thanks for the gesture, but this would be very different from simply replacing the rubber pads with thermal pads on the VRM heatsinks. Water itself would affect overclocking so much that it wouldn't be the same test.


----------



## KraxKill

I gotta say my card is no slouch. I can hit 2164 at 1.093, however the increase in performance is so marginal here that it's tough to justify. Moreover, I think that most will find their frame pacing (frametime, frame latency) is actually better and more consistent at lower clocks where the Card is not bouncing off the TDP with unneeded voltage while trying to sustain a clock frequency they then can't actually utilize.

I run my card at 2ghz at 1v. This gives me the same performance with better frame time as the exact same system at 2075 at 1.063v in Superposition4K because at 1v, TDP limits are hardly encountered.

Half the folks in this thread are actually shooting themselves in the foot by underutilizing their available TDP by feeding higher voltage. This simply eats into your TDP limit so you are robbing yourself of performance at one end (TDP) while attempting to gain it on the other (frequency).

I score 10500 in Super4K at 2ghz @ 1.000v. For folks on air, this is even more valid since heat is much easier to manage at lower voltage, and for those on water it can lead to better frame times and otherwise identical scores.
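The arithmetic behind this argument can be sanity-checked with the standard CMOS dynamic-power approximation (power roughly proportional to V² x f). A minimal sketch, using the two operating points from the post; the proportionality constant cancels in the ratio, and real Boost 3.0 behavior is of course more complicated:

```python
# Rough dynamic-power model for CMOS logic: P is roughly proportional
# to V^2 * f. The constant of proportionality cancels out when
# comparing two operating points, so only the ratio matters.

def relative_power(v, f, v_ref=1.000, f_ref=2000):
    """Power at (v volts, f MHz) relative to the reference point."""
    return (v / v_ref) ** 2 * (f / f_ref)

# Operating points from the post: 2000 MHz @ 1.000 V vs 2075 MHz @ 1.063 V.
ratio = relative_power(1.063, 2075)
print(f"2075 MHz @ 1.063 V draws ~{(ratio - 1) * 100:.0f}% more power "
      f"for a {(2075 / 2000 - 1) * 100:.1f}% clock increase")
```

Roughly 17% more power for under 4% more clock, which is the "robbing Peter to pay Paul" effect against a fixed TDP limit.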


----------



## coc_james

Subbed


----------



## Youngtimer

Quote:


> Originally Posted by *KraxKill*
> 
> ...
> I score 10500 in Super4K at 2ghz @ 1.000v. For folks on air, this is even more valid since heat is much easier to manage at lower voltage, and for those on water it can lead to better frame times and otherwise identical scores.


I've had the same experience: undervolt to 1.000 V, keep the temps low and the clock above 2000 MHz. Could you please share your custom "curve"?

Greetings


----------



## JimmyMo

Quote:


> Originally Posted by *TheBoom*
> 
> Would love it if someone did a proper test with this though. I thought of it myself, but some guys over at Nexus claim that it will in no way affect overclocking performance. So I dropped the idea.
> 
> I have an Amp Extreme too anyway.
> 
> Edit : Before you ask why I don't test it myself, I really want to but its extremely hard to get pads of decent quality over here. Not to mention I'd probably have to stack two sets of 1.0mm pads which wouldn't really help VRM temps much.


These reviews on Amazon provide some data points where various individuals saw significant temperature drops after replacing memory/VRM pads with Fujipoly Ultra Extreme XR-m Thermal Pads:

*REVIEWZZZZ on Amazon*



Not conclusive with graphs, etc., but definitely helps me to feel confident in the choice.

My Ultra Extreme pads came today, but I am still waiting for the 2mm pad to go between the VRM heatsink and the main heatsink. It should arrive tomorrow.

I will run some before/after tests and post them up.

Any preferences on tests to run?


----------



## ZealotKi11er

Quote:


> Originally Posted by *JimmyMo*
> 
> These reviews on Amazon provide some data points where various individuals saw significant temperature drops after replacing memory/VRM pads with Fujipoly Ultra Extreme XR-m Thermal Pads:
> 
> *REVIEWZZZZ on Amazon*
> 
> 
> 
> Not conclusive with graphs, etc., but definitely helps me to feel confident in the choice.
> 
> My Ultra Extreme pads came today, but I am still waiting for the 2mm pad to go between the VRM heatsink and the main heatsink. It should arrive tomorrow.
> 
> I will run some before/after tests and post them up.
> 
> Any preferences on tests to run?


Run Superposition 4K.


----------



## KedarWolf

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Got some pads coming, .5 mm and 1.5 mm
> 
> https://www.amazon.com/Fujipoly-mod-smart-Extreme-Thermal/dp/B00ZSJQDYA/ref=sr_1_4?ie=UTF8&qid=1501373205&sr=8-4&keywords=fujipoly
> 
> 
> 
> What do I need the .5mm for? I assume 1.5 is the vRAM and VRM. Is .5 to replace those small messy pads.

The company hasn't shipped yet; I contacted them asking if they can send me 1.0mm pads instead of the 0.5mm pads for the memory.


----------



## MrTOOSHORT

Asus Strix OC 1080 Ti on stock BIOS, running on X58:



*http://www.3dmark.com/3dm/21326626*


----------



## kevindd992002

Are Fujipoly pads really important when using EK waterblocks? I have some that I requested from Fujipoly directly when I was still in the electronics industry, but I don't think I have enough to replace all the thermal pads that come with the EK waterblock for the FTW3 once all my parts arrive.


----------



## Dasboogieman

Quote:


> Originally Posted by *kevindd992002*
> 
> Would Fujipoly pads really important when using EK waterblocks? I had some that I requested from Fujipoly directly when I was still in the electronics industry but I don't think I have enough to replace all the EK thermal pads that come with the EK waterblock for the FTW3 when all my parts arrive.


The gains are very incremental compared to the exorbitant cost. If you compare pads with thermal conductivity of 8 W/mK vs 17 W/mK, the real-world efficiency increase is actually only 30% or so.
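One way to see why the gain is smaller than the raw conductivity numbers suggest: the pad is only one resistance in a series thermal path. A back-of-envelope sketch, where the pad geometry and the rest-of-path resistance are assumed values, not measurements:

```python
# Back-of-envelope conduction model: a flat pad's thermal resistance is
# R = t / (k * A). The pad is only one element of a series path that
# also includes the heatsink and convection to air/water, so halving
# the pad's resistance does not halve the total.
# Assumed values: a 1.0 mm pad over a 10 mm x 10 mm part, and a made-up
# 2.0 K/W for the rest of the path.

def pad_resistance(thickness_m, k_w_per_mk, area_m2):
    """Conduction resistance of a flat pad in K/W."""
    return thickness_m / (k_w_per_mk * area_m2)

area = 0.010 * 0.010                       # 10 mm x 10 mm in m^2
stock = pad_resistance(0.001, 8.0, area)   # ~8 W/mK stock pad
fuji = pad_resistance(0.001, 17.0, area)   # ~17 W/mK Fujipoly pad
rest_of_path = 2.0                         # assumed heatsink + convection, K/W

for name, r_pad in (("stock", stock), ("fujipoly", fuji)):
    print(f"{name}: pad {r_pad:.2f} K/W, total path {r_pad + rest_of_path:.2f} K/W")
```

With these assumed numbers the pad itself improves by more than 2x, but the total path only improves by around 20%, which matches the "incremental gains" observation.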


----------



## AstroSky

I think my GTX 1080 Ti Hybrid is a bad bin. The reason I'm saying this is that this guy here




claims he is able to get his GTX 1080 Ti to a 13-second render time, and he told me to copy his settings to see if I get the same scores. I did. My render time is 37 seconds. Not very good at all. I've been here before talking about low fps on this card and I've had enough of it. I'm not exactly understanding why my scores are so low compared to other people. It's confusing. Temps are great. CPU is fine. Power is fine. I'm just not getting it and it's really sad.


----------



## Dasboogieman

Quote:


> Originally Posted by *AstroSky*
> 
> i think my gtx 1080 ti hyrbid is a bad bin. Reason im saying this is this guy here
> 
> 
> 
> 
> claims he is able to get his gtx 1080 ti to 13 secs on render time and he told me to copy his settings to see if i get the same scores. I did. My render time is 37 secs. Not very good at all. iv been here before talking about low fps on this card and iv had enough of it. Im not exactly understanding why my scores are so low compared to other people. Its confusing. Temps are great. Cpu is fine. Power is fine. im just not getting it and its really sad.


Check your drivers and the Blender version. These things matter for Blender rendering.

The difference between a well-binned 1080 Ti and a badly-binned one is not double. You see a 10% separation at best; the only thing that could cause such a massive difference (if it's not software related) is a card that is faulty, but not faulty enough to crash the render outright.


----------



## AstroSky

OK, just for a test then: what fps do people see on average in PlayerUnknown's Battlegrounds on ultra at 2560x1080? I see a consistent 60 fps, with the lowest drops at 46 fps and a high of 80, everything maxed out.


----------



## KraxKill

Quote:


> Originally Posted by *AstroSky*
> 
> Ok just for a test then. What of fps do people see with players unknown on ultra at 2560x1080p on avg. Me i see consistent 60 fps and the lowest drops at 46 fps and highest is 80 everything max out.


Why are you making it difficult for yourself and everyone else? Just run Superposition 4K and compare your score to others. Done. You should be somewhere between 9.8K and 11K.


----------



## Kriant

Quote:


> Originally Posted by *AstroSky*
> 
> Ok just for a test then. What of fps do people see with players unknown on ultra at 2560x1080p on avg. Me i see consistent 60 fps and the lowest drops at 46 fps and highest is 80 everything max out.


Run Superposition and Firestrike extreme. Compare to others. Profit.

P.S. I am getting about the same fps with one card, with dips here and there.


----------



## AstroSky

ok on it

4K Superposition score:

9781

Clearly really bad


----------



## Kriant

Quote:


> Originally Posted by *AstroSky*


Not really. If you haven't overclocked your card, it's within the norm - check the image history in this thread; there are some with lower scores and some with higher.


----------



## AstroSky

i overclocked it lol to 2050

surprisingly i play all my games at 2050 on the core too with no problems


----------



## KedarWolf

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *AstroSky*
> 
> i think my gtx 1080 ti hyrbid is a bad bin. Reason im saying this is this guy here
> 
> 
> 
> 
> claims he is able to get his gtx 1080 ti to 13 secs on render time and he told me to copy his settings to see if i get the same scores. I did. My render time is 37 secs. Not very good at all. iv been here before talking about low fps on this card and iv had enough of it. Im not exactly understanding why my scores are so low compared to other people. Its confusing. Temps are great. Cpu is fine. Power is fine. im just not getting it and its really sad.
> 
> 
> 
> Check your drivers and the Blender version. These things matter for Blender rendering.
> 
> The difference between a good binned 1080ti and a bad 1080ti is not double. You see a 10% separation at best, the only thing that can cause such a massive difference (if its not software related) is if the card was faulty but not faulty enough to crash the render outright.

Show me the version of Blender you ran and the settings, and I'll do a test run at 2075/6177.


----------



## ACallander

Mine on air overclocked:

Core: +144
Memory: +500

9479 on 4K Optimized.


----------



## hadesfactor

Quote:


> Originally Posted by *AstroSky*
> 
> i overclocked it lol to 2050


Take into consideration that not all chips will OC the same. My FE can run Superposition at 2075, but so far that's the limit without attempting the XOC BIOS; I hit power limits and just don't think my Ti can OC any further. I also have a hard time running 2025 at lower voltage, but that could just be a matter of tweaking the curve. Honestly, if you compare your score to the top 100 and you're within that range, you're good to go IMO, unless you're going for severe OCing. Your CPU also plays a part in your score. IMO, if you can play games stable at 2050 without crashing, that's good. I actually had to turn my OC down to about 1900 just to get Mass Effect: Andromeda to play without crashing the driver every 2 minutes lol, but Doom will run fully OC'd.

Also, and I know this has been touched on already, at a certain point OCing the memory can actually make your clocks/score worse, and everyone's board is different. I can get mine up to +625 before it starts dropping my FPS. It stays stable up to +710 before I get artifacts, but my FPS in Superposition drops by almost 15 on 4K Optimized.


----------



## GraphicsWhore

Quote:


> Originally Posted by *AstroSky*
> 
> i overclocked it lol to 2050
> 
> surprisingly i play all my games at 2050 on the core too with no problems


That's because your card is fine. It may not be the best overclocker, but keep in mind there are people who can't hit 2 GHz consistently. They're on air, but I think I even read of a guy here with an SC2 Hybrid who couldn't stay at 2 GHz. The point is you're hitting 2050 stable (and possibly higher; I don't know if you've tested) and that's a great OC.

CPU is also a factor.


----------



## KraxKill

Quote:


> Originally Posted by *AstroSky*
> 
> ok on it
> 
> 4k superpostion scores
> 
> 9781
> 
> Clearly really bad


That's not bad at all. First of all, your 2050 overclock alone is not enough to draw conclusions from. You'd have to post your GPU-Z log during a Superposition 4K run: a card running 2050 at a cool 36C is going to outscore one running 2050 at 58C.

You're also running an AMD CPU. I don't want to start a chip war here, but you should expect less in terms of outright frame delivery on AMD due to higher caching latencies under gaming-related workloads. Yes, you're testing the GPU here, but ultimately it's your CPU handing out the instructions, and there will be moments during the test where it can't hand out more than the card could deliver.

Because your card is running rather warm, you shouldn't expect the scores people are achieving with cool PCBs. There will be more clock hopping and power management handed out by Boost 3.0.

If I were you, I'd try undervolting. I'm not sure what voltage you're pushing to run 2050, but based on your temps, I bet it's hurting you more than it's helping.

Try undervolting the card to run at 2 GHz @ 1.000 V using a custom voltage curve, and complement it with a more aggressive fan curve to account for the lower temps lowering the fan speed.

I bet you'd see the same or better performance than you're seeing now, without all the wasted power.

The moral of the story: if you can't keep the card under 40C at full load, marginalizing the effects of Boost 3.0, there really is no need to feed it extra voltage. You're just unnecessarily chewing into the TDP and "robbing Peter to pay Paul".

Even those who can keep the card cool find rather minimal gains between 2000 and 2150+. The card's power management eats up a lot of the potential and you're left with useless gains.

I score 10.5K @ 2000 at 1.000 V, and 10.9K @ 2075 at 1.063 V. That's nearly 100 W of additional power and heat the cooling solution has to cope with for a rather insignificant 3.7% gain, which in gaming is more like 1.5%, if even that.

Good luck.


----------



## AstroSky

im watercooled on a custom loop though. Thats warm?


----------



## KedarWolf

Quote:


> Originally Posted by *AstroSky*
> 
> ok on it
> 
> 4k superpostion scores
> 
> 9781
> 
> Clearly really bad


Run 3DMark Time Spy and check your graphics score only with others here, not combined score.


----------



## Kriant

Question:

Does anyone know dimensions of Asus poseidon PCB? (PCB itself, not the whole card). I am trying to figure out the height difference between EVGA FTW3 pcb and asus Poseidon pcb.

Or is it a strix pcb with poseidon cooller slapped on it?


----------



## KraxKill

Quote:


> Originally Posted by *KedarWolf*
> 
> Run 3DMark Time Spy and check your graphics score only with others here, not combined score.


I agree 3DMark is misleading.

Supo 4K is basically that without the misleading combined result people keep posting to confuse themselves and others. Unigine already marginalizes the CPU, and while it doesn't eliminate the CPU variable (impossible to do), neither does 3DMark's graphics score.


----------



## KraxKill

Quote:


> Originally Posted by *AstroSky*
> 
> im watercooled on a custom loop though. Thats warm?


It is when you consider you're on a site full of overclocking nut jobs, myself included, who have cooling solutions that keep their cards at 40C (36C for me) and under at full load, even during rather warm ambients. A warmer piece of silicon has more voltage leakage, runs less efficiently and chews up more TDP. It's just physics.


----------



## AstroSky

I know for a fact my CPU is actually above average in terms of clock-for-clock speed; it can run 4.0 GHz, and 4.1, without too much of a voltage jump. RAM is plenty fast too, 3200. So it has to be the GPU.


----------



## ZealotKi11er

Quote:


> Originally Posted by *AstroSky*
> 
> I know for a fact my cpu is actually above avg in terms of clock for clock speed. can run 4.0 ghz and 4.1 with out too much of a voltage jump. Ram is plenty fast too 3200. So it has to be the gpu


Here is my card. Your card is fine. 2-3% difference is nothing to worry about.


----------



## AstroSky

is there a way i could get it closer to that score then?


----------



## ZealotKi11er

Quote:


> Originally Posted by *AstroSky*
> 
> is there a way i could get it closer to that score then?


You are only 2% slower. If you want to feel better, go to the Nvidia Control Panel, Adjust Image Settings, and change it to Performance.


----------



## KraxKill

Quote:


> Originally Posted by *AstroSky*
> 
> I know for a fact my cpu is actually above avg in terms of clock for clock speed. can run 4.0 ghz and 4.1 with out too much of a voltage jump. Ram is plenty fast too 3200. So it has to be the gpu


Consider the clock-for-clock % deficit of your CPU against those you're comparing your performance to, and then add that handicap % to your score. You're overlooking the basics and blaming your card for no reason.


http://www.sweclockers.com/test/23937-intel-core-i9-7900x-skylake-x/21#content


----------



## AstroSky

I think I need to work on fixing my temps, tbh. I'm noticing that it keeps binning lower and lower when it hits 50C or more. I need something more stable and cooler. I'm trying to undervolt the card to see what kind of temps I can get, and whether I can keep it at or under 50C at least. At least then I'd know temps aren't the issue, and I'm OK with slightly lower clocks to keep it under control.


----------



## hadesfactor

Quote:


> Originally Posted by *AstroSky*
> 
> I think i need to work on fixing my temps tbh. im noting that it keeps binning lower and lower when it hits 50 c or more. I need something more stable and cooler. Im trying to undevolt the card to see what kind of temps i can get and see if i can keep it under 50c at least or at 50. At least then i know temps are not the issue and im ok with slightly lower clocks to keep it undercontrol.


KedarWolf and others can tell you better than I can, but at 50C your card will temp-throttle. If I remember right, it starts throttling at 40C; I'm not sure of the exact amount it throttles, but if you want higher clocks you'd need to get your temps a little lower.


----------



## Kriant

Quote:


> Originally Posted by *AstroSky*
> 
> I think i need to work on fixing my temps tbh. im noting that it keeps binning lower and lower when it hits 50 c or more. I need something more stable and cooler. Im trying to undevolt the card to see what kind of temps i can get and see if i can keep it under 50c at least or at 50. At least then i know temps are not the issue and im ok with slightly lower clocks to keep it undercontrol.


I think you're overthinking this. (talking about performance, not cooling).

You have a decent score. Your CPU might be the bottleneck at 4k here, but it's just by a few points. Don't let that "Must...be...over...10...THOUSAND!!" mentality affect your daily user experience with your card.


----------



## Kriant

Side note:

Found an answer to my question about ASUS poseidon platinum's pcb - it's the same as strix.

Now... I need to figure out how to make my HB SLI bridge slightly bendable (by .8 cm).


----------



## Youngtimer

@AstroSky: Just have a look at the low-budget approach with my Palit 1080 Ti limited to 1.000 V, coupled with an ancient Sandy Bridge.


http://imgur.com/tOJMT

All of your games will run fine.


----------



## khemist

Quote:


> Originally Posted by *Kriant*
> 
> Side note:
> 
> Found an answer to my question about ASUS poseidon platinum's pcb - it's the same as strix.
> 
> Now....I need to figure out how to make my HB sli bridge slightly bendable
> 
> 
> 
> 
> 
> 
> 
> (by .8 cm)


Got a poseidon coming tomorrow myself.


----------



## Dasboogieman

Quote:


> Originally Posted by *AstroSky*
> 
> I think i need to work on fixing my temps tbh. im noting that it keeps binning lower and lower when it hits 50 c or more. I need something more stable and cooler. Im trying to undevolt the card to see what kind of temps i can get and see if i can keep it under 50c at least or at 50. At least then i know temps are not the issue and im ok with slightly lower clocks to keep it undercontrol.


Have you done all the software mods outlined by this thread? the NVSMI trick alone will stop the rampant downbinning.


----------



## AstroSky

NVSMI???


----------



## GraphicsWhore

Quote:


> Originally Posted by *AstroSky*
> 
> im watercooled on a custom loop though. Thats warm?


I may have missed it but what card do you have? Have you flashed a different BIOS?


----------



## AstroSky

GTX 1080 Ti Hybrid


----------



## ZealotKi11er

Quote:


> Originally Posted by *AstroSky*
> 
> gtx 1080 ti hyrbid


You can't get any better. You are power limited.


----------



## dansi

Supo 4K 97xx is a low score for 2050.
Mine only does 2025 max and I got 9950 +/-.

Ultimately we got average bins. Mine can't do 2050 in games without crashing.


----------



## ZealotKi11er

Quote:


> Originally Posted by *dansi*
> 
> Supo4k 97xx is a low score for 2050.
> Mine is only 2025 max and i got 9950+/-.
> 
> Ultimately we got average bin. Mine can't do 2050 in games without crashing..


He says it's 2050, but being a Hybrid card it will not hold that clock. I too scored 9999 with 2025.


----------



## KraxKill

*Less than a volt.*

1080Ti 2012mhz - 0.993v
DDR5X - 6123mhz
5775C @ 4.2ghz


----------



## AstroSky

how the heck are you only going up to 35c max load lol


----------



## ZealotKi11er

Quote:


> Originally Posted by *AstroSky*
> 
> how the heck are you only going up to 35c max load lol


He is running water cooled.


----------



## KraxKill

Quote:


> Originally Posted by *AstroSky*
> 
> how the heck are you only going up to 35c max load lol


Nothing special really,

I have a 65 W TDP CPU, I'm running about 100 W less than you through my Ti at 0.993 V, and I have a 280mm radiator on a PMP-600 pump with EK blocks. 4x 140mm Noctuas in push-pull cooling the rad and 3x 120mm Noctuas cooling the case. That's it.


----------



## ALSTER868

Quote:


> Originally Posted by *Dasboogieman*
> 
> Have you done all the software mods outlined by this thread? the NVSMI trick alone will stop the rampant downbinning.


What is NVSMI and what trick are you talking about?


----------



## KCDC

Seems like no matter what I try when editing the voltage curve, it never goes above 1911 MHz.

When I type in a preset for the curve, it then "unlocks" and goes higher.

Am I missing something?


----------



## johnwayne117

Hey guys, is there a difference between overclocking in Afterburner with just the sliders and doing it with the curve? Like, with the curve will you get a more stable OC?


----------



## ZealotKi11er

Quote:


> Originally Posted by *johnwayne117*
> 
> Hey guys, is there a difference when you overclock in afterburner just with sliders and do it with curve? Like with curve you will get more stable oc?


Not really. With the Curve you can limit the core clock and voltage which is slightly better if you are power limited.


----------



## johnwayne117

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Not really. With the Curve you can limit the core clock and voltage which is slightly better if you are power limited.


1. I read somewhere that with the curve lined up at 2050+, it forces the GPU not to drop frequency, and your 2050+ line will stay completely flat.
2. Is memory OC tied to core OC? Like, should I start with the memory and find the limit there, and then go for the core OC, or are they tied so you have to find a balance between them?


----------



## AstroSky

i am watercooled......custom.....360 rad with 6 fans in push pull.......with pump and gpu block and everything lol


----------



## Kriant

Can someone measure/ does someone know actual PCB dimensions of Asus Strix boards? (not the board + cooler)

Thanks in advance


----------



## hadesfactor

Quote:


> Originally Posted by *johnwayne117*
> 
> 1 I read somewhere with curve lined up @2050+ it forces gpu not drop frequency, and your @2050+ will be straight as something very heterosexual.
> 2 Do memory oc, tied with core oc? Like should i start with the memory, and find the limit there, and then go for the core oc, or they are tied and you have to find balance between them?


Normally you'd want to start with your core OC and then move to memory. They're not tied together the way it is with BCLK. I usually get my stable core OC, then push the memory until I either see artifacts or a drop in FPS; clocking your memory too far can actually drop your performance on Pascal.

As for the curve, you can use it to keep your clock more stable and stop it bouncing around at certain voltages. Keep in mind not all boards OC the same and you might not reach 2050, though you should; most people can hit that on water, but not everyone can go past it, so don't be bummed if your highest stable is 2025.
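The "flattened curve" people keep describing can be pictured as a clamp on the voltage/frequency table: every point at or above the chosen voltage gets the same target clock, so Boost has no reason to request more voltage to chase a higher bin. A toy sketch; the curve points are invented for illustration, and in practice you drag them in Afterburner's curve editor rather than edit a list:

```python
# Toy model of the "flattened curve" undervolt: every V/F point at or
# above the chosen voltage is clamped to one target clock, so the card
# stops asking for more voltage. Curve values are made up.

def flatten_curve(curve, v_target, f_target):
    """curve: list of (voltage, mhz) points in ascending voltage order."""
    return [(v, mhz if v < v_target else f_target) for v, mhz in curve]

stock = [(0.800, 1700), (0.900, 1850), (1.000, 1975),
         (1.063, 2050), (1.093, 2075)]
undervolted = flatten_curve(stock, 1.000, 2000)
print(undervolted)   # everything from 1.000 V upward now sits at 2000 MHz
```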


----------



## khemist

https://imageshack.com/i/poZpa0e1j
https://imageshack.com/i/podzwEWKj
https://imageshack.com/i/pm8S2h3hj
https://imageshack.com/i/pomJbLlTj
https://imageshack.com/i/pmUtlRTOj
https://imageshack.com/i/pnKrSXS0j

New card for me.


----------



## dcf-joe

I am interested in getting a 1080 Ti and since there are so many great custom designs out there, I am unsure which one to get. I am looking for one that is "unlocked" out of the box, i.e. power limits and such. I have proper cooling lined up for the card so that is not an issue.

So far, I have narrowed the list down to the Asus Strix and the Evga FTW being unlocked out of the box. Did I miss another good custom design? Which one should I get?


----------



## hadesfactor

Quote:


> Originally Posted by *dcf-joe*
> 
> I am interested in getting a 1080 Ti and since there are so many great custom designs out there, I am unsure which one to get. I am looking for one that is "unlocked" out of the box, i.e. power limits and such. I have proper cooling lined up for the card so that is not an issue.
> 
> So far, I have narrowed the list down to the Asus Strix and the Evga FTW being unlocked out of the box. Did I miss another good custom design? Which one should I get?


None of them are "locked". Certain third-party boards are guaranteed to hit a certain clock, and you pay for their custom PCB and cooling. They also have custom BIOSes that enable more power delivery. All of these things can be achieved on an FE edition. None of the boards are guaranteed to hit higher than 1800 MHz (I actually think it's more like 1700 MHz), and none of them are guaranteed to hit 2000 MHz+.

IMO, if you're going to water cool, just get the 1080 Ti FE and a good block and flash the BIOS with the XOC BIOS or another BIOS. Or, if you don't want to deal with all of that, get the Poseidon.


----------



## dcf-joe

Quote:


> Originally Posted by *hadesfactor*
> 
> None of them are "locked" certain 3rd party boards are guaranteed to hit a certain clock and you pay for their custom PCB and cooling. They also have custom Bios that enable more power to be delivered. All of these things can be achieved on an FE addition. None of the boards are guaranteed to hit higher then 1800mhz ( I actually think its more like 1700mhz) None of them are guaranteed to hit 2000mhz+
> 
> IMO if you are going to water cool just get the 1080 ti FE and a good block and flash the Bios with the XOC bios or another bios. Or if you don't wanna deal with all of that get the Poseidon


Thanks man! I guess I didn't do enough research as I didn't even know about the XOC bios.


----------



## Kriant

Quote:


> Originally Posted by *khemist*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> https://imageshack.com/i/poZpa0e1j
> https://imageshack.com/i/podzwEWKj
> https://imageshack.com/i/pm8S2h3hj
> https://imageshack.com/i/pomJbLlTj
> https://imageshack.com/i/pmUtlRTOj
> https://imageshack.com/i/pnKrSXS0j
> 
> New card for me.


N-n-n-n-n-n-n-ice. Double nice.

How tall is that PCB from bottom to sli bridge connectors?


----------



## kevindd992002

Which of these looks and performs better?

1) EVGA GTX 1080 Ti FTW3 with EK acrylic/nickel waterblock and factory EVGA backplate

2) EVGA GTX 1080 Ti FTW3 with EK acrylic/nickel waterblock and EK nickel backplate


----------



## khemist

Quote:


> Originally Posted by *Kriant*
> 
> 
> N-n-n-n-n-n-n-ice. Double nice.
> 
> How tall is that PCB from bottom to sli bridge connectors?


About 23cm.


----------



## hadesfactor

Looks are subjective. Performance-wise, the two setups you listed are identical apart from the backplate, which doesn't make a lot of difference in cooling. But if you're asking for a personal opinion, I like the EK block with the nickel backplate.


----------



## johnwayne117

Quote:


> Originally Posted by *hadesfactor*
> 
> Normally you would wanna start with your core OC then move to memory. Its not tied as in the way its with BCLK. I usually get my stable OC then push the memory untill I either see artifact or a drop in FPS...clocking you mem too far can actually drop your performance in Pascal. I don't know if it's as essential to do it that way with GPU's.
> 
> As for the curve, you can use it to keep you clock more stable and not bounce around at certain voltages...keep in mind not all boards OC the same and you might not reach 2050 but you should...most people can hit that on water cool but not everyone can go past that so don't be bummed if your highest stable is 2025


Thanks for the advice.
And now I'm starting to wonder... can pushing to the max, 1.093 V, be bad? And would finding something like 2000-2050 at less than 1 V be more rewarding? Like OCing and undervolting at the same time?


----------



## kevindd992002

Quote:


> Originally Posted by *hadesfactor*
> 
> looks is subjective...performs..you put the same things up besides the back plate which doesn't make a lot of difference in cooling but if you're asking personal opinion....I like the the EK block with Nickel back plate


Yeah. I'm just wondering if the extra holes in the EVGA backplate really have any effect, as they claim?


----------



## hadesfactor

Quote:


> Originally Posted by *johnwayne117*
> 
> Thanks for the advice.
> And i start wondering...so push to max what you will get and 1.093v...can be bad? And find something like 2000-2050 with less than 1v, will be more rewarding? Like oc and undervolting at same time?


1.093 V will not hurt your card, and yes, the best clock speed you can push with the least amount of voltage is always better; that's where the custom curve comes in. But as stated in earlier posts, not everyone's card will behave the same, and not everyone will be stable at 2000 with less than 1 V.


----------



## dcf-joe

So, if I purchase the EVGA FTW3 1080 Ti, can I still use the XOC BIOS on the card and use that XOC voltage utility?


----------



## KCDC

Quote:


> Originally Posted by *khemist*
> 
> https://imageshack.com/i/poZpa0e1j
> https://imageshack.com/i/podzwEWKj
> https://imageshack.com/i/pm8S2h3hj
> https://imageshack.com/i/pomJbLlTj
> https://imageshack.com/i/pmUtlRTOj
> https://imageshack.com/i/pnKrSXS0j
> 
> New card for me.


Wish I'd have held out for two of these instead of FE and EKWB. Looks so nice! Bench that thing!


----------



## khemist

Will do later, although I never knew about the infinity window thing on it; I wish it wasn't there.


----------



## Alex132

Quote:


> Originally Posted by *kevindd992002*
> 
> Which of these look and perform better?
> 
> 1) evga gtx 108pti ftw3 with ek acrylic/nickel waterblock and factory EVGA backplate
> 
> 2) evga gtx 1080ti fgw3 with ek acrylic/nickel waterblock and ek nickel backplate


Much of a muchness, I'd go for the EVGA backplate because nickel plating is kinda ugly.


----------



## hadesfactor

Quote:


> Originally Posted by *dcf-joe*
> 
> So, if I purchase the Evga FTW3 1080Ti, can I still use the XOC Bios on the card and use that XOC voltage utility?


That's best answered by someone else, but the FTW3 already has a higher power limit of 330 W, so you don't really need to flash it with another BIOS. The XOC BIOS and the XOC utility are two different things: Precision XOC is EVGA's overclocking utility, like MSI Afterburner, while the XOC BIOS is the Asus BIOS given to their extreme overclocking team for LN2 runs etc. And btw, IMO MSI Afterburner is better than Precision XOC.


----------



## KCDC

I don't remember if there's a solution for this, but I'm wondering if there's a way to keep the card from dropping a bin when I hit 50C on water. Is this inevitable? There's no way to tell it to wait until 60C?


----------



## ZealotKi11er

Quote:


> Originally Posted by *KCDC*
> 
> I don't remember if there's a solution for this, but I'm wondering if there's a way to keep the card from going back a bin when I hit 50c on water. Is this inevitable? There's no way to tell it to wait till 60c?


Nope. It's not a problem unless you keep the temp right where the switch happens.


----------



## hadesfactor

Quote:


> Originally Posted by *KCDC*
> 
> I don't remember if there's a solution for this, but I'm wondering if there's a way to keep the card from going back a bin when I hit 50c on water. Is this inevitable? There's no way to tell it to wait till 60c?


Nope, it will drop a bin around 40, 50, 60 and up, losing more bins the higher the temps go; that's the Pascal architecture (not exact temps, but yeah). What's your setup? My card doesn't break 40C under load; I think it hit 42C once in Superposition 4K.
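The down-binning behavior described above can be sketched in a few lines. This is only an illustrative model: the 13MHz bin size and the temperature thresholds below are approximations from community testing of Pascal's GPU Boost 3.0, not official NVIDIA numbers, and the exact thresholds vary per card.

```python
# Illustrative sketch of Pascal GPU Boost temperature down-binning.
# BIN_MHZ and TEMP_STEPS are assumed values, not official specs.

BIN_MHZ = 13  # one boost bin, approximately

# assumed temperatures (C) at which the card sheds one bin
TEMP_STEPS = [37, 47, 54, 60]

def effective_clock(base_boost_mhz: int, temp_c: float) -> int:
    """Return the boost clock after temperature-based down-binning."""
    bins_lost = sum(1 for t in TEMP_STEPS if temp_c >= t)
    return base_boost_mhz - bins_lost * BIN_MHZ

if __name__ == "__main__":
    for temp in (30, 42, 50, 62):
        print(temp, effective_clock(2050, temp))
```

This is why keeping the card in the 30s on water holds the full boost clock, while crossing each threshold quietly costs one bin.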


----------



## KCDC

Two Tis and a 6900K OC'd to 4.4 @ 1.380V.

The rest is in my sig. Two 420 rads (overkill), one D5 pump.

idle around 30-35C for the cards, high 20s for the CPU on the balanced power plan. At full tilt (1.093V) the cards will hit 55-60C. The room is a bit warm (LA summer), ambient is around 26-29C. Pretty sure my water sits around 34C; I don't have a probe installed currently.

I'm doing a teardown and shunt mod hopefully this weekend. will also include cleaning the blocks and new kryonaut for the GPUs. Who knows maybe I will see better temps. Using mayhems pastel white with UV dye. This current build has been up for a couple months now. Haven't opened and cleaned the monoblock since about March. The 1080ti blocks haven't been opened and cleaned proper yet. Planning to keep a 6 month regimen for a full clean. I know I don't have to but I enjoy it.


----------



## KCDC

Quote:


> Originally Posted by *khemist*
> 
> WIll do later, although i never knew about the infinity window thing on it, i wish it wasn't there.


Yeah, that infinity window placement is weird too. So out of place it looks gimmicky. Hopefully you can turn it off.


----------



## khemist

You can, but the logo on the backplate will be off also.


----------



## mouacyk

Quote:


> Originally Posted by *khemist*
> 
> You can, but the logo on the backplate will be off also.


Can I haz it... the card?


----------



## khemist

Sure! Just forward me your details.


----------



## hadesfactor

Quote:


> Originally Posted by *KCDC*
> 
> Two TI's and a 6900k OC'd to 4.4 @ 1.380V
> 
> rest is in my sig. Two 420 rads is overkill. one D5 pump
> 
> idle around 30c-35c for the cards, high 20s for the cpu when on balanced power, at full tilt 1.093mV the cards will hit 55-60C, room is a bit warm, LA summer, ambient is around 26-29c. pretty sure my water sits around 34c, don't have a probe installed currently.
> 
> I'm doing a teardown and shunt mod hopefully this weekend. will also include cleaning the blocks and new kryonaut for the GPUs. Who knows maybe I will see better temps. Using mayhems pastel white with UV dye. This current build has been up for a couple months now. Haven't opened and cleaned the monoblock since about March. The 1080ti blocks haven't been opened and cleaned proper yet. Planning to keep a 6 month regimen for a full clean. I know I don't have to but I enjoy it.


Almost the same setup, except I'm running a 6850K. That's the temp I need to get under control: OC'd to 4.3 @ 1.32V it sits at 30-35C during normal use and hits as high as 50C under load. I know that's well within spec, but I KNOW I can get it lower; I'm just not sure why it's higher than I'd like. AZ weather, so my ambient is about 28C, but my Tis are nice and cool, 40C tops under load at 2050MHz / 1.093V. I think I'm over-stressing about it, since my water-to-ambient delta doesn't get over 4.2C, which is great. I might tear down the CPU block and reapply TIM.


----------



## KCDC

Quote:


> Originally Posted by *hadesfactor*
> 
> Also the same set-up except Im running a 6850k.....that's the temp I need to get under control OC @ 4.3 @ 1.32 I have it at 30-35c during normal use and it hits as high as 50c under load.....I know this is well within specs but I KNOW I can get it lower Im just not sure why its higher then I like it to be....AZ weather so my ambient is about 28c but my TI's are nice and cool 40 tops under load at 2050 1.093v....I think I'm over stressing it since my Delta T water-Ambient doesn't get over 4.2c which is great I think I might tear down the CPU block and re apply TIM etc


I don't run my fans at full tilt either, which isn't a good combo with these rads since they perform best with fans above 1000rpm. Had I done that research before buying, I'd probably have gone for other rads. Still, my GPUs shouldn't get that high.

You're also using two pumps, so theoretically you should have fairly constant, low temps. Maybe I'm wrong there.


----------



## hadesfactor

Quote:


> Originally Posted by *KCDC*
> 
> I don't run my fans at full tilt either, which isn't a good combo with these rads since they perform their best with fans above 1000rpm. Had I done that research before buying, Id have probably gone for other rads. Still, my gpus shouldn't get that high.
> 
> You're also using two pumps so theoretically, you should have fairly constant and low temps. Maybe I'm wrong there.


Past a certain point flow really doesn't have much effect; as long as you have enough flow for turbulence, you're good. My GPU is getting great temps but my CPU isn't. I mean, it's not terrible: playing Mass Effect it's right around 45C and my GPU stays around 35C. What leads me to believe I might need to remount the block is that my delta is right on par: idle is 0-2C over and load is 4C over for the CPU. I think the problem is the block not pulling heat away as well as it should. I also have the monoblock, so it's moving the VRM heat too, but that shouldn't be the reason.

I'm running two dedicated loops, so the difference in heat means either my proc just runs hot or my TIM application is off.

I do recommend having water temp sensors; that way you can get the delta between ambient and water and set your fan curve to that. Generally on a higher-end loop your delta should be between 0-5C, so I'm right on par with that; I just don't like my actual CPU temps, lol.
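The suggestion above, driving fans from the water-to-ambient delta rather than from CPU/GPU temperature, can be sketched as a simple linear mapping. The 0-5C "healthy delta" band comes from the post; the duty-cycle endpoints here are illustrative assumptions you would tune for your own rads and fans.

```python
# Sketch: fan duty keyed to water-to-ambient delta (0..5C band from the
# post above). min_duty/max_duty are assumed example values, not tuned.

def fan_duty(water_c: float, ambient_c: float,
             min_duty: int = 35, max_duty: int = 100) -> int:
    """Map the water-ambient delta linearly onto a fan duty percentage."""
    delta = water_c - ambient_c
    # clamp into the 0-5C band a healthy high-end loop should stay within
    delta = max(0.0, min(5.0, delta))
    return round(min_duty + (max_duty - min_duty) * delta / 5.0)
```

The appeal of this approach is that water temperature moves slowly, so the fans ramp smoothly instead of chasing spiky core temperatures.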


----------



## mouacyk

Quote:


> Originally Posted by *khemist*
> 
> Sure!, just forward me your details.


No, seriously. Interested in knowing what kind of load temps this hybrid block is capable of. OCDrift showed 44C extrapolated in gaming mode with a 20C ambient; their actual was 51C load at 27C ambient. I'd say that's pretty darn good for a block meant for hybrid cooling, but I'd like to see what your experience is like.


----------



## khemist

I'll post once i get it on water.


----------



## KCDC

Quote:


> Originally Posted by *hadesfactor*
> 
> Flow at a certain point really doesnt have any effect as long as you get enough flow for turbulent flow your good....My GPU is getting great temps but my CPU isn't....I mean its not terrible when Im playing Mass Effect its right around 45c my GPU stays around 35C....but what leads me to believe I might need to reapply the block as my Delta is right on par...idle is 0-2c and load is 4c for CPU...I think the problem is the block not pulling the heat away as good as it should....also have the mono block so its also moving the VRM heat but that shouldn't be a reason.
> 
> Im running 2 dedicated loops which either means the difference in heat is the fact my proc is hot or maybe my tim application.
> 
> I do recommend having water temp sensors....that way you can get your delta between your ambient and water and set your curve to that....generally on a higher end loop your deltas should be between 0-5c so Im right on par with that just dont like my actual CPU temps lol


What's strange is the temps used to be lower, but I attributed that to the ambient going up in the room. We shall see if there's any blockage in the blocks. Also added a filter to each rad during my last flush. Think I may take those out this round. I also plan to add the temp probe back in.


----------



## hadesfactor

Quote:


> Originally Posted by *KCDC*
> 
> What's strange is the temps used to be lower, but I attributed that to the ambient going up in the room. We shall see if there's any blockage in the blocks. Also added a filter to each rad during my last flush. Think I may take those out this round. I also plan to add the temp probe back in.


Ambient definitely contributes to your temps. Also, those filters add restriction, so I would remove them. I use two water temp sensors in each loop, one on the rad inlet and one on the outlet (only a 1C difference because of equilibrium).

My issue is I soooooooooooooo don't wanna drain my loop and remount my block, HA!


----------



## KCDC

Quote:


> Originally Posted by *hadesfactor*
> 
> Ambient def will attribute to your temps...also those filters will add restriction so I would remove them....I use 2 water temp sensors in each loop...1 on the in of my rad and one on the out (only 1c difference because of equilibrium)
> 
> My issue is I soooooooooooooo don't wanna drain my loop and reapply my block HA!


The best thing I ever got for draining my loop, aside from a drain valve, is a MetroVac blower: connect the blower side to one of the hoses and let it run for 3 minutes, and it fully drains three blocks and two huge rads. Plus it works better than canned air on dust.

I no longer have a good place to put my probe, so I'm getting a normal thermal sensor and taping it to the acrylic on the back of the reservoir; should be close enough.


----------



## hadesfactor

Quote:


> Originally Posted by *KCDC*
> 
> Best thing I ever got to drain my loop aside from a drain valve is a metrovac/blower.. connect the blower side to one of the hoses and let it run for 3 minutes, fully drains three blocks and two huge rads. Plus works better than canned air on dust
> 
> I no longer have a good place to put my probe, so I'm getting a normal thermal sensor and taping it to the back of the reservoir on the acrylic, should be close enough temps.


That's a great idea, that's what I'm gonna try!

As far as taping a regular probe: why don't you look at the in-line sensors? They have G1/4" inner/outer threads, so all you have to do is plug one end into your rad and your fitting into the other and take a little bit off your tubing, and you'll get accurate readings of your water temp. I use temp stop plugs, but I'm using the Nexxxos XT45 so I have 7 ports lol

here's one for example

http://www.performance-pcs.com/phobya-temperature-sensor-in-outer-thread-g1-4-matte-black.html


----------



## KCDC

Quote:


> Originally Posted by *hadesfactor*
> 
> That's a great idea that's what im gonna try!
> 
> As far as taping the reg probe....why dont you look at the in-line sensors....they have g1/4" inner/outer so all you have to do is plug one end into your rad and your fitting into the other and just take a little bit off of your tubing....you'll get accurate readings of your water temp. I use temp stop plugs but im using the Nexxxos XT45 so I have 7 outlets lol
> 
> here's one for example
> 
> http://www.performance-pcs.com/phobya-temperature-sensor-in-outer-thread-g1-4-matte-black.html


Yeah I forgot about those. Thanks.

For the vac/blower, I use this one (there are cheaper versions):

https://www.amazon.com/bonus-DataVac-Model-MDV-1BA-Computer/dp/B06XSL9VTC/


----------



## hadesfactor

Quote:


> Originally Posted by *KCDC*
> 
> Yeah I forgot about those. Thanks.
> 
> For the vac/blower, I use this one (there are cheaper versions):
> 
> https://www.amazon.com/bonus-DataVac-Model-MDV-1BA-Computer/dp/B06XSL9VTC/


LOL I have the same one......exact same one lol....used to use it when I owned my own company


----------



## KCDC

Quote:


> Originally Posted by *hadesfactor*
> 
> LOL I have the same one......exact same one lol....used to use it when I owned my own company


It's loud but I love that dang thing.


----------



## NBrock

I am planning on getting a second 1080 Ti for my current rig. I'm going to hold off on liquid cooling until I decide what to do for the next system. Do you guys think the EVGA 1080 Ti SC Black Edition is going to breathe well enough? It's an mATX board, so no spacing between them. I was planning on getting the same card, since the one I have runs cool enough on air and overclocks pretty well... also aesthetics. I'd prefer the cards look the same in case I decide not to water cool.

Thanks for your thoughts!


----------



## hadesfactor

Quote:


> Originally Posted by *KCDC*
> 
> It's loud but I love that dang thing.


The worst is forgetting to empty the bag before I switch it around as a blower lol


----------



## kevindd992002

Quote:


> Originally Posted by *hadesfactor*
> 
> The worst is forgetting to empty the bag before I switch it around as a blower lol


I have this one:
https://www.amazon.com/gp/aw/d/B003BZCOKK/ref=mp_s_a_1_3?ie=UTF8&qid=1501635610&sr=8-3&pi=SL75_QL70&keywords=metrovac+datavac#immersive-view_1501635657306

Is it really safe to use this when draining?


----------



## becks

Yes, it is safe... most of us who have one use it, including myself.
Just don't let your pump spin too much; they don't like running dry, and you'll end up breaking it.


----------



## KCDC

Quote:


> Originally Posted by *becks*
> 
> Yes, it is safe...most of us who have it use it, including myself.
> Just don't let your pump spin to much...they don't like doing that while dry and you end up breaking it.


Yes agreed, should have said I do it from the top down.

Drain it with gravity, tilting the case until the topmost tube starts running clear; my top rad to GPU is where I start, never from the pump. I do my best to keep the pump lubed with liquid. Makes it harder when I have to deal with pump stuff, but that's part of the game.

EDIT: from here


----------



## 2ndLastJedi

So has anyone else had massive gains with 385.12?


----------



## Alex132

Quote:


> Originally Posted by *2ndLastJedi*
> 
> So has anyone else had massive gains with 385.12 ?


Are there any for the 1080 Ti?


__
https://www.reddit.com/r/6qtf9b/titan_x_pascal_benches_with_the_new_38512_driver/


----------



## 2ndLastJedi

Mine jumped to 9903 in Superposition 4K (2038MHz, no memory OC), up from 9788 after I rolled back to 384.94?


----------



## KraxKill

Quote:


> Originally Posted by *2ndLastJedi*
> 
> Mine jumped to 9903 with 2038MHz and no memory OC from 9788 after i rolled to 394.94 back in Superposition 4k?


Wait!? Are you saying you are faster when you rolled back to the old driver? Can you clarify. Thanks.


----------



## hadesfactor

Quote:


> Originally Posted by *Alex132*
> 
> Are there any for the 1080 Ti?
> 
> 
> __
> https://www.reddit.com/r/6qtf9b/titan_x_pascal_benches_with_the_new_38512_driver/


Yes, they released the beta two days ago, but I haven't tried it yet... has anyone here tried the beta on the Ti?


----------



## LuisG7

New 1080 Ti K|NGP|N OC guide:
http://forum.kingpincooling.com/showthread.php?t=3972


----------



## KraxKill

Quote:


> Originally Posted by *hadesfactor*
> 
> Yes they just released the beta 2 days ago except I haven't tried it yet....has anyone here tried the beta on the TI?


I'm testing it now, so far identical performance in benchmarks and games. Nothing new to report. Seems stable enough to keep installed for now.


----------



## hadesfactor

Quote:


> Originally Posted by *KraxKill*
> 
> I'm testing it now, so far identical performance in benchmarks and games. Nothing new to report. Seems stable enough to keep installed for now.


Nice! I'm wondering about stability at different clocks/voltages. I know firmware can only do so much when limited by hardware. Honestly I've been too wrapped up in playing games to tweak more. If I could get to 2100 stable it would be great; so far I can push 2075 stable in Superposition 4K, but it's not really "stable": sometimes it passes the whole bench, sometimes it doesn't, lol. But I think that's more to do with my learning curve with the curve, lol.


----------



## Ghosteri7

Hello,

just some n00b question here:

I have the Asus Strix OC; like most cards it goes to 1949MHz in the OC profile. I was just testing the OC and saw this behavior:

My testing settings: oc oc

Using GPU tweak II user profile1:

GPU Boost Clock = +55 (01738) (I guess that after the original OC?)
GPU voltage = 0
Memory Clock = +100 (11110)
Fan Speed = 60
Power Target = +20 (120)
GPU Temp Target = +6 (90)

It boosted to 2025MHz, then sat at 2012MHz; when temps reach 57C it drops to 1999MHz no matter what and stays there.

So my question is: if the temp is so low, why does it go down to 1999MHz? Is this the max this card can do?


----------



## hadesfactor

Quote:


> Originally Posted by *Ghosteri7*
> 
> Hello,
> 
> just some n00b question here:
> 
> I have asus strick oc, like most cards it goes to 1949 mhz in oc profile. I was just testing the oc and I saw this behavior:
> 
> My testing settings: oc oc
> 
> Using GPU tweak II user profile1:
> 
> GPU Boost Clock = +55 (01738) (I guess that after the original OC?)
> GPU voltage = 0
> Memory Clock = +100 (11110)
> Fan Speed = 60
> Power Target = +20 (120)
> GPU Temp Target = +6 (90)
> 
> It boosted to 2025 mhz, then was at 2012 mhz, when temps reach 57C it goes to 1999 mhz, not matter what and stays there.
> 
> So mine question is, if the temp is so low why does it go down to 1999 mhz? Is this the max this card can do?


How are you cooling the card, liquid or air? I don't use GPU Tweak; I use MSI Afterburner, which IMO is better for a few reasons.
How you cool it will directly impact performance, because the card will down-bin according to temps, and you'll hit power limits too if they're not maxed out. No matter what you do, your clock will be lower at 50C than at 40C or 30C; that's the Pascal architecture. "Low" temps means the 30s, and honestly, staying close to 2000MHz at 57C is pretty good.

MSI Afterburner will work on your card as well, and will let you unlock your voltage and adjust via the curve so you can maintain more stable clocks.


----------



## cluster71

I installed the 1080 Ti K|NGP|N BIOS on my Zotac Extreme just now. The PC does not want to boot; no signal on DP or HDMI. I've turned on the iGPU now. Sitting in the basement testing; otherwise I'll go back to the previous BIOS.









I thought Kraxkill tested the bios, but maybe he meant something else.


----------



## Psykoking

Quote:


> Originally Posted by *cluster71*
> 
> I installed 1080Ti KIngP|ng on Zotac Extreme just now. PC does not want to boot. No signal to DP or Hdmi. I turned on iGPU now. Sitting in the basement and testing, otherwise I will go to the previous Bios


Erm... it's stated in the forum post, that this BIOS is not suitable for any other card, not even EVGA ones, due to different hardware setup on the PCB.


----------



## Sgang

How can I unlock voltage in MSI Afterburner? It seems locked with my Zotac AMP Extreme.

Sent from my iPhone using Tapatalk


----------



## khemist

Here is the ticking noise on my poseidon card, sending it back for a refund.


----------



## hadesfactor

Quote:


> Originally Posted by *khemist*
> 
> Here is the ticking noise on my poseidon card, sending it back for a refund.


Ouch....it sounds like the fan is hitting the shroud or something


----------



## cluster71

Alive again. A bit tricky. The graphics card was lost and I could not reflash; it did not work with the iGPU in my mITX board. Moved the card to another PC and managed to restore the BIOS to the EVGA FTW3 SLAVE BIOS, which has a higher power limit on the PCIe 8-pin. Eating now, will re-assemble later if I don't find anything more. Bypassed the over-current protection yesterday.


----------



## KCDC

Since I'm doing the shunt mod this weekend, I was wondering if it's worth it to throw some thermal pads on the inductors and such that nVidia put those crappy white ones on. I have some leftover 1mm thick pads. Anyone do this?


----------



## KCDC

Quote:


> Originally Posted by *hadesfactor*
> 
> Ouch....it sounds like the fan is hitting the shroud or something


I was going to say the same. Sucks dude.


----------



## 2ndLastJedi

Quote:


> Originally Posted by *KraxKill*
> 
> Wait!? Are you saying you are faster when you rolled back to the old driver? Can you clarify. Thanks.


No, sorry, I had the memory OC at +460 and didn't realise! I've been doing all my testing with memory at stock, because I can't monitor memory temps and I'm concerned they may get too hot.
Once I put memory back to stock I get 9800 again.


----------



## KedarWolf

Quote:


> Originally Posted by *cluster71*
> 
> I installed 1080Ti KIngP|ng on Zotac Extreme just now. PC does not want to boot. No signal to DP or Hdmi. I turned on iGPU now. Sitting in the basement and testing, otherwise I will go to the previous Bios


Put the card in a PCIe slot with the iGPU enabled, boot the PC with the display attached to the iGPU, and use

'nvflash64 --list'

to find which is your GPU and likely you'll have to flash original BIOS with

'nvflash64 -6 --index=1 biosname.rom'

no quotations of course.









The 1080 Ti should be listed as GPU 1. Could be GPU 0 but I doubt it.









I've had great benches with the Arctic Storm BIOS. Run the powerlimit.bat as Admin after you install it.









powerlimitarcticstormbios.zip 156k .zip file


And yes, three power connector cards like the KingPin, HOF etc. will NOT work with other 1080 Ti's.


----------



## Ghosteri7

Quote:


> Originally Posted by *hadesfactor*
> 
> How are you cooling the card liquid or air? I don't use GPU Tweak I use MSI Afterburner which IMO is better for a few reasons.
> How you cool will directly impact your performance because your card will down bin according to your temps and you will hit power limits if not maxed out as well. No matter what you do your clock will be lower at 50c then it is at 40c and 30c its pascal architecture. Low temps is considered 30's and actually if its staying close to 2000mhz at 57c thats pretty good
> 
> MSI will work on your card as well and will let you unlock you voltage and also adjust via curve so can can maintain better stable clocks.


Oh, so I guess that's OK. I thought temps over 70C were where it starts dropping MHz, but then again I don't know much about it.

Thanks for the info. My card is just a normal Asus Strix OC, not watercooled.


----------



## cluster71

I tried "nvflash64 --list" and "nvflash64 -6 --index=1 biosname.rom" and some other tricks, but it did not work. It did not find any nVidia card at all, and could not force it either. The card is not identified as nVidia anymore when the K|NGP|N ROM is installed; it's his own version. I'm used to nvflash, have used it since the GTX 590. I read the help text to see the commands.

I booted without external power in another PC so it would start in safe mode, and then reflashed.

Fell asleep on the couch, will mount the card now (narcolepsy)


----------



## GraphicsWhore

Quote:


> Originally Posted by *khemist*
> 
> 
> 
> 
> 
> Here is the ticking noise on my poseidon card, sending it back for a refund.


Dude that BLOWS. What the hell even is that sound? Never heard anything like that.


----------



## sdmf74

Do you have the fans running? Sounds like one of the aluminum fins on the cooler hitting a fan blade or just loose and hitting something.


----------



## cluster71

I do not know quite how to interpret Pascal TDP Tweaker, but it's similar to Kepler BIOS Tweaker.
I know that Inno3D drew more power on the PCIe 6-pin than anyone else. I looked at some 1080 Ti BIOSes yesterday and noticed a deviation: all the BIOSes I saw had 100/130W and 204/218W. I don't know if the top section applies when you plug in the 6-pin and the lower one for the 8-pin? And then 75W from the motherboard if needed.

You have: 1. nVidia FE, 2. Zotac Extreme, 3. Zotac Arctic, 4. EVGA FTW3 slave, 5. K|NGP|N X BIOS.

On the Extreme and Arctic only the power limit is increased, not the per-connector power. On the FTW there is a 25W increase, to 230/243.8W. Similarly, KingPin's X BIOS is raised.

I installed the FTW BIOS yesterday but didn't test that much; the 358W power limit should be enough. Most BIOSes hover around 280-300W. Although the card can't hold a high continuous load, you get the advantage of a temporary boost in the power available when playing (the card then works against the TDP value).

https://www.techpowerup.com/vgabios/191833/evga-gtx1080ti-11264-170418


----------



## Iceman2733

Quote:


> Originally Posted by *NBrock*
> 
> I am planning on getting a second 1080ti for my current rig. I am going to hold off on liquid cooling until I decide on what to do as far as next system goes. Do you guys think the EVGA 1080 SC Black Edition is going to breath well enough? It's an MATX board so no spacing between them. I was planning on getting the same card since the one I have runs cool enough on air and overclocks pretty well...also aesthetics. I'd prefer if the cards were the same as far as looks in case I decide not to water cool.
> 
> Thanks for your thoughts!


Watch yourself running different cards in SLI. It's no big deal software-wise or performance-wise, but differences in physical size can mess with SLI bridge placement, and if you do go water it can mess with the placement of the inlet and outlet ports.


----------



## khemist

Quote:


> Originally Posted by *sdmf74*
> 
> Do you have the fans running? Sounds like one of the aluminum fins on the cooler hitting a fan blade or just loose and hitting something.


Yeah I did, and it only seems to happen at certain speeds, but I'm not messing with it; it's going straight back.

Also, I'm running the card vertical in an FT05 case and the noise doesn't happen when I lay the case horizontal. Anyway, I've got an FTW3 coming today, so I want a refund.


----------



## cluster71

Quote:


> Originally Posted by *Sgang*
> 
> How can i unlock voltage in msi afterburn? Seems locked with my zotac amp extreme
> 
> Inviato dal mio iPhone utilizzando Tapatalk


I assume you have ticked "Unlock voltage control" and "Unlock voltage monitoring" in Settings.
Then you can press Ctrl+F, or click the curve button on the front panel if you are using a newer skin.
When you see the clock/voltage curve, drag the points, or the entire curve, to the clock frequency and voltage you want.

If you want to raise the voltage above 1.063V, increase the voltage control on the front panel to 50%, which corresponds to 1.075V; then you have to raise that point in the curve for it to take effect, and hit Enter. Otherwise it will only increase voltage at the same frequency.

Leave the interface tooltips and hints on and read the dialogs when hovering over objects; there is very good info there, and likewise in the MSI Afterburner installation folder.


----------



## khemist

https://imageshack.com/i/pooxaR87j
https://imageshack.com/i/pnM9ZyBHj
https://imageshack.com/i/pmHntVbPj
https://imageshack.com/i/pmZjFnc8j

New one arrived.


----------



## cluster71

Any tests or opinions regarding the reversed blades? They have been in EVGA products for some time.


----------



## cluster71

I assume there is some benefit, otherwise they would not be used; maybe sound or air circulation.


----------



## Kriant

Quote:


> Originally Posted by *khemist*
> 
> Yeah i did and it only seems to happen at certain speeds but i'm not messing with it for a second it's going straight back.
> 
> Also i'm running the card vertical in a FT05 case and the noise doesn't happen when i lay the case horizontal, anyway i've got a FTW3 coming today so want a refund.


My Poseidon is coming today, hopefully it will be noise-free. The noise in the video sounds like fins are hitting something.

P.S. FTW3 is a good card:thumb: I'm loving mine


----------



## khemist

Yeah, i gave it a quick test and temps are around 70c with only 60% fan speed holding 2025 boost, very happy.


----------



## cluster71

Yes, there is always a balance between acceptable sound level and performance.


----------



## ZealotKi11er

Was testing the card with Witcher 3 at 4K. Stock out of the box I had 72 fps. With a memory OC and a curve OC I managed 75 fps, not even worth the trouble. This was an FE card; on a card that boosts higher at stock it would have been even more underwhelming. For reference, stock the card ran 1823MHz; the OC was a steady 2025MHz.
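The numbers above are worth a quick sanity check: the overclock raised the core by about 11% but the frame rate by only about 4%, which suggests the game is not purely core-bound at 4K. A minimal sketch of that comparison, using only the figures quoted in the post:

```python
# Quick scaling check for the Witcher 3 4K numbers above:
# 1823 -> 2025 MHz core, 72 -> 75 fps.

def pct_gain(before: float, after: float) -> float:
    """Percent improvement from 'before' to 'after'."""
    return (after - before) / before * 100

clock_gain = pct_gain(1823, 2025)  # roughly 11% more core clock
fps_gain = pct_gain(72, 75)        # roughly 4% more fps
scaling = fps_gain / clock_gain    # fraction of the OC visible as fps
```

A scaling value well under 1.0 is the quantitative version of "not even worth the trouble".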


----------



## Pandora's Box

My launch day MSI Gaming X 1080 Ti is still chugging along. Have had it undervolted and overclocked since day one. 1979Mhz locked at 0.983 volts.


----------



## cluster71

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Was testing the card with Witcher 3 at 4K. Stock out of the box I had 72 fps. With memory OC, and curve OC I managed 75 fps. Not even worth the trouble. This was FE card. If it was a card that boost higher at stock it would have been even more underwhelming. For reference stock the card was 1823MHz, OC was 2025MHz steady.


Obviously it's not worth the effort if you are purely after a performance gain. The easiest thing is to buy one or two extra cards and run SLI, or simply stick with the performance as it is.
You overclock just to see if it's possible, like a hobby. If you just wait 6 months there will be another card that is much faster than you could ever overclock your current card to be.

But in some areas it can pay off more. A CPU you can overclock 25-100%, and you notice that.


----------



## KedarWolf

I put Fujipoly 1.5mm, 17W/mK pads on my VRMs and Fujipoly 1.0mm, 17W/mK on my memory and a few other spots.

While running Time Spy stress test my temps hover around 43/44C and never went over 45C.









This on a highly overclocked 5960X and one 1080 Ti FE on a single 360 rad with a DDC pump.









Before the Fuji pads I was hitting 54-55C.









Really great result!









Oh, ambient temp is 27C now, was 36C before.
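Since the ambient dropped 9C between the two runs, the raw before/after GPU temps overstate what the pads contributed. Normalizing to temperature-over-ambient, using only the figures from this post, gives a fairer picture:

```python
# Ambient-normalized before/after for the Fujipoly pad swap, using the
# numbers in the post above (55C @ 36C ambient -> 44C @ 27C ambient).

def over_ambient(gpu_c: float, ambient_c: float) -> float:
    """GPU temperature expressed relative to room ambient."""
    return gpu_c - ambient_c

before = over_ambient(55, 36)   # stock pads: 19C over ambient
after = over_ambient(44, 27)    # Fujipoly pads: 17C over ambient
improvement = before - after    # ~2C attributable to the pads themselves
```

So roughly 2C of the 10-11C drop comes from the pads; the rest tracks the cooler room.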


----------



## coc_james

A little advice maybe? I have a Gigabyte Aorus 1080 Ti Extreme (with the built-in AIO) and I can't get past 1.063V. With that I can hit 2113/6014 with a max temp of 45C, and at those temps it never throttles in Superposition. I'm wondering how much better it could do if I pushed the volts higher... but I don't know how.


----------



## DerComissar

Quote:


> Originally Posted by *KedarWolf*
> 
> I put Fujipoly 1.5mm, 17W/mK pads on my VRMs and Fujipoly 1.0mm, 17W/mK on my memory and a few other spots.
> 
> While running Time Spy stress test my temps hover around 43/44C and never went over 45C.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This on a highly overclocked 5960x and one 1080 Ti FE on one 360 rad with a DCC pump.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Before the Fuji pads I was hitting 54-55C.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Really great result!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh, ambient temp is 27C now, was 36C before.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 


Great results with those pads!

It's interesting how well they contribute to lowering the core temps of the gpu.
One can only imagine how much lower your vrm and memory temps are now.

I envy you having the 17W/mK pads.
I had to settle for buying the 14W/mK version, I just couldn't justify paying the exorbitant cost difference for 17W/mK, but it's good to know you got a good deal on the primo version.

I figure the 14W/mK pads will do pretty well though, compared to the low-W/mK stock pads.

You're also doing great with one 360 rad.









Your ambient temp. dropped from 36C to 27C?
That must be a big relief, lol.


----------



## KCDC

Quote:


> Originally Posted by *KedarWolf*
> 
> I put Fujipoly 1.5mm, 17W/mK pads on my VRMs and Fujipoly 1.0mm, 17W/mK on my memory and a few other spots.
> 
> While running Time Spy stress test my temps hover around 43/44C and never went over 45C.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This on a highly overclocked 5960x and one 1080 Ti FE on one 360 rad with a DCC pump.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Before the Fuji pads I was hitting 54-55C.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Really great result!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh, ambient temp is 27C now, was 36C before.


That's awesome. You may already know, but these pads start to dry out and get fragile after some time. They will still work fine, but they aren't very friendly to being repositioned, as they can fall apart. That happened to my Fujipoly VRM strip for my monoblock, but it still retains its high heat transfer.

Sadly, I can't justify buying all of these for two cards, so I'm going to try the cheaper method of dabbing a tiny amount of thermal grease between the current pads and the VRMs and RAM, and see if that gives me anything.

If I may ask, where else did you put them? You mentioned putting some in "other areas"; would that be the inductors where the original white pads were?


----------



## KedarWolf

Quote:


> Originally Posted by *KCDC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I put Fujipoly 1.5mm, 17W/mK pads on my VRMs and Fujipoly 1.0mm, 17W/mK on my memory and a few other spots.
> 
> While running Time Spy stress test my temps hover around 43/44C and never went over 45C.
> 
> This on a highly overclocked 5960x and one 1080 Ti FE on one 360 rad with a DCC pump.
> 
> Before the Fuji pads I was hitting 54-55C.
> 
> Really great result!
> 
> Oh, ambient temp is 27C now, was 36C before.
> 
> That's awesome. You may already know, but these pads start to dry out and can be fragile after some time. They will still operate fine, but aren't very friendly to being repositioned as they can fall apart. Happened to my fujipoly vrm strip for my monoblock, but still retains it's high heat transfer.
> 
> Sadly, I can't justify buying all of these for 2 cards, so I'm going to try the cheaper method of dabbing a tiny amount of thermal grease between the current pads and the vrm and ram. See if that gives me anything.
> 
> If I may ask, where else did you put them? You mentioned putting some in "other areas", would it be the inductors where the original white pads were?

Only had enough pads where red is. Yellow still is stock.


----------



## KCDC

Quote:


> Originally Posted by *KedarWolf*
> 
> Only had enough pads where red is. Yellow still is stock.


Great, thanks!


----------



## Kriant

Got my Poseidon today. After a quick and dirty re-arrangement of my water loop (I am still amazed how fast I re-did the tubing, lol), I had it up and running: 41C under full load in the Superposition benchmark. And I've managed to pop the HB SLI bridge on it, so it's SLI'd with the FTW3.


----------



## KedarWolf

Beta Nvidia drivers and my Fujipoly pads.


----------



## KedarWolf

I always do better with the standalone 3DMark rather than the Steam version.









Over 11200 Time Spy with beta drivers.


----------



## 2ndLastJedi

Quote:


> Originally Posted by *KedarWolf*
> 
> Always do better with stand alone 3Dmark rather than Steam version.
> 
> Over 11200 Time Spy with beta drivers.


Wow, okay. Why doesn't the Steam version do as well? And how are the beta drivers with games or Superposition?


----------



## trickeh2k

Quote:


> Originally Posted by *2ndLastJedi*
> 
> Wow, okay, why Steam version doesn't do as well? How about the beta drivers with games or Superposition?


I'm no expert here, but if you want the highest scores possible you want to eliminate all background applications, run without an internet connection, etc. Hardcore benchmarkers run their tests on a clean installation of Windows with only the necessary drivers and 3DMark installed, I believe. I don't think it will have any major impact on GPU performance, but CPU performance will obviously take a hit when running the Steam version with other background applications running.


----------



## MrGreaseMonkkey

....FTW3 1987mhz/ 6212Mhz......

Sent from my iPhone using Tapatalk Pro


----------



## kfxsti

Guys, I have a quick question. My water block is coming in today for my Aorus Extreme. Will a 360 rad be enough for it and my delidded 7700K? I just pulled out the other 360 I was going to add to the loop, and it looks like one of my children has had a field day with it using a flat-head screwdriver... I can order another rad, but it will be a few weeks before I'm able.
Thanks for any help.


----------



## coc_james

Quote:


> Originally Posted by *coc_james*
> 
> A little advice maybe? I have a Gigabyte Aorus 1080 ti Extreme (w/ the built in AIO) and I can't get past 1.063volts. With that I can hit 2113/6014 w/ a max temp of 45deg Celsius. With those temps it never throttles down in Superposition. I'm wondering how much better it can do if I pushed the volts higher... but I don't know how.


No comments or advice on this?


----------



## Agent-A01

Quote:


> Originally Posted by *coc_james*
> 
> No comments or advice on this?


Use MSI afterburner curve OC to set what voltage you want.


----------



## Dasboogieman

Quote:


> Originally Posted by *kfxsti*
> 
> Guys I have a quick question. My water block is coming in today for my aorus extreme. Will a 360 rad be enough for it and my delidded 7700k ? I just pulled out the other 360 I was going to add to the loop and it looks like one of my children has had a field day with it using a flat top screw driver.... I can order another rad but it will be a few weeks before I'm able.
> Thanks for any help


It's definitely on the tight side; I wouldn't recommend it. Have at least another 240mm worth of rad to be sure, unless you're willing to run 2200RPM+ push/pull fans to get satisfactory temps.


----------



## khemist

Very good cooler on this, it's performing even better than i thought it would.


----------



## GreedyMuffin

Quote:


> Originally Posted by *Pandora's Box*
> 
> My launch day MSI Gaming X 1080 Ti is still chugging along. Have had it undervolted and overclocked since day one. 1979Mhz locked at 0.983 volts.


Okei, then I feel good about my 1950/0.900V 1080Ti. :-D

Still get up to 53*C under gaming... I have not cleaned the system since I installed that card, whoops..


----------



## kfxsti

It's here woohoo!!!


----------



## Unnatural

Quote:


> Originally Posted by *KedarWolf*
> 
> I put Fujipoly 1.5mm, 17W/mK pads on my VRMs and Fujipoly 1.0mm, 17W/mK on my memory and a few other spots.


I can only find those pads in 100x15x* cuts... how many pieces would you suggest getting? (Actually, I already have one 100x15x1 pad and two 100x15x0.5; I don't know if I could use the latter somehow.)


----------



## Sigtinius

Hey guys, I'm new to the forum. So glad I got my 1080TI SC after seeing Vega.
Anyways, I'm wondering if I should OC my 1080TI or leave it as is? Any suggestions?


----------



## hadesfactor

Quote:


> Originally Posted by *Sigtinius*
> 
> Hey guys, I'm new to the forum. So glad I got my 1080TI SC after seeing Vega.
> Anyways, I'm wondering if I should OC my 1080TI or leave it as is? Any suggestions?


It all depends what you want out of it. Most people on here like to push their cards as far as they can. If you don't care about that and just want good gaming, you can set a stable curve at a very minor OC to get a few extra FPS, but it's not really needed. You will increase performance by OCing your memory higher, but the biggest thing is to keep a stable, consistent clock without it jumping around on you. This is done with software like MSI Afterburner or Precision XOC, but I recommend Afterburner.


----------



## hadesfactor

@KedarWolf I wonder how much my temps might drop using Fuji pads. I'm already at pretty good temps with the EK pads on my FC block... I don't break 40C on load at 2075 (ambient for me is 28-29C).


----------



## Sigtinius

I'd like to OC but like you said, slightly. I want my card to last as long as possible but I would like to juice it for what it's worth. I have an ASRock Motherboard and an EVGA GPU, can I still use something like MSI Afterburner to OC? I'm a complete noob to OCing... Thanks!


----------



## Alex132

Is it the ICX model of 1080 Ti? I'd just personally leave it stock and let GPU Boost 3.0 work its magic. That's what I'm doing, and it sits around 1987-2012MHz in games.


----------



## hadesfactor

Quote:


> Originally Posted by *Sigtinius*
> 
> I'd like to OC but like you said, slightly. I want my card to last as long as possible but I would like to juice it for what it's worth. I have an ASRock Motherboard and an EVGA GPU, can I still use something like MSI Afterburner to OC? I'm a complete noob to OCing... Thanks!


Welcome, btw. It really comes down to cooling and whether you plan to watercool or stick with air. What will degrade your board is excessive heat and more voltage than you can unlock just using MSI. That said, yes, you can use MSI Afterburner on your board, no issue; I would download the latest beta. KedarWolf has a couple of good YouTube vids with a quick tutorial on curves; using a curve will get it stable and more consistent. Also, if you look at the first pages of this thread you will get some good info about OCing your card. Honestly, if you are staying with your stock cooler and just wanna give it a little more than what you got out of the box, you can get it close to 2000MHz without an issue. The biggest issue you will have is temps, as Pascal will down-bin after certain temperatures.


----------



## KCDC

@KedarWolf New MSI AB has been posted, beta 15.

mostly ATI fixes, though.

http://forums.guru3d.com/showpost.php?p=5458447&postcount=629


----------



## KedarWolf

Quote:


> Originally Posted by *KCDC*
> 
> @KedarWolf New MSI AB has been posted, beta 15.
> 
> mostly ATI fixes, though.
> 
> http://forums.guru3d.com/showpost.php?p=5458447&postcount=629


Thank you. +1 OP updated.


----------



## kfxsti

Ahhh yeaaaa. So... I went ahead and attached the block. Temptation got the better of me, even though I was told to wait until I could grab another rad.
After asking EKWB here on the forum whether the factory Aorus backplate would work on the water block, and being told no, I was able to put it on anyway. I had to make 4 screw shims using plastic washers and extra nuts that came in the hardware kit, but it's on there with absolutely no bend in the card or backplate.
Idle is 29C and load didn't peak above 34C. But it's not overclocked, just running stock. The CPU is maxing out at 54C with both loaded under Firestrike.


----------



## Sigtinius

Thanks!









I would be air cooling it (it's a stock EVGA SC Black edition) and would want to OC it just enough to juice it, but keep it stable and able to run as long as possible. I don't have any other means of cooling it besides the two fans that come built into the GPU. The boost clock says it's 1670MHz, so can you overclock past what the boost clock is rated at without heating/longevity issues?


----------



## 2ndLastJedi

You need to get MSI Afterburner or GPU-Z and check what boost is actually happening (closer to 2000MHz, I reckon). Then play whatever games you play and see if it performs as well as you expect. If you want a little more, add bit by bit to the core, then the memory, but the gains from OCing are negligible depending on your resolution and FPS requirements.


----------



## buellersdayoff

Quote:


> Originally Posted by *KedarWolf*
> 
> I put Fujipoly 1.5mm, 17W/mK pads on my VRMs and Fujipoly 1.0mm, 17W/mK on my memory and a few other spots.
> 
> While running Time Spy stress test my temps hover around 43/44C and never went over 45C.
> 
> This on a highly overclocked 5960x and one 1080 Ti FE on one 360 rad with a DCC pump.
> 
> Before the Fuji pads I was hitting 54-55C.
> 
> Really great result!
> 
> Oh, ambient temp is 27C now, was 36C before.


So before the fujipoly pads ambient was almost 10c higher? The pads themselves won't reduce temps that much...


----------



## kfxsti

Going from 6W/mK to 17W/mK would have to net some sort of big difference, I would think.
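As a rough sanity check on the W/mK difference, here is a one-dimensional Fourier-conduction sketch. The 30 W heat load and the pad dimensions are assumptions for illustration, not measurements from anyone's card:

```python
def pad_delta_t(power_w, k_w_per_mk, area_m2, thickness_m):
    """Temperature drop across a thermal pad, 1-D Fourier conduction:
    dT = Q * t / (k * A)."""
    return power_w * thickness_m / (k_w_per_mk * area_m2)

# Assumed numbers: 30 W through a 100 mm x 15 mm pad, 1.5 mm thick.
area = 0.100 * 0.015          # m^2
thickness = 1.5e-3            # m
for k in (6.0, 14.0, 17.0):   # W/mK: stock-ish pad vs the two Fujipoly grades
    print(f"{k:4.0f} W/mK -> {pad_delta_t(30, k, area, thickness):.1f} C across the pad")
```

The takeaway is that the drop across the pad itself scales as 1/k, so going from 6 to 17 W/mK shrinks only that one term; contact resistance at the two interfaces is unchanged, which is why real-world gains are smaller than the raw conductivity ratio suggests.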


----------



## buellersdayoff

Quote:


> Originally Posted by *kfxsti*
> 
> Going from 6w/mk to 17w/mk would have to net some sort of a big difference i would think


The drop in ambient temperature would be the main reason for 10c core temp drop, not the thermal pads on vram/vrm. Sure they'll help but seriously you guys should know this, right?


----------



## KedarWolf

Quote:


> Originally Posted by *buellersdayoff*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kfxsti*
> 
> Going from 6w/mk to 17w/mk would have to net some sort of a big difference i would think
> 
> 
> 
> The drop in ambient temperature would be the main reason for 10c core temp drop, not the thermal pads on vram/vrm. Sure they'll help but seriously you guys should know this, right?

When I said the ambient temps dropped, I meant my GPU at idle: 36C at idle down to 27C. My room temperature is the same as always; I have the A/C set all the time. The core drop is solely from the thermal pads.

My bad for confusing the terms.


----------



## Sigtinius

So if a GPU is rated for 1670Mhz in boost mode, it can automatically be running @ ~2000Mhz under load, or you mean to boost it up to about 2000Mhz and try that?
I was also wondering how much of an FPS gain I can get from OCing it this much - any ballpark idea?


----------



## 2ndLastJedi

Nvidia has this thing called Boost 3.0 that will boost the card as high as it can within its thermal/power/voltage limits. Mine has something like 1638MHz on the box but hit 1911MHz without an OC, and with +136 it now goes to 2088MHz before dropping to 2063MHz due to heat. That 16xx number the manufacturer prints is just the rated boost without Boost 3.0 headroom; anything above it is good, but getting into the 2100s is great. Overclocking gains depend on the game/resolution/settings, but I think a general rule is that whatever OC % you gain will give similar FPS % gains.
So a 5% OC will net you roughly 5% more FPS. Something like that.
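That rule of thumb (FPS scales roughly with core-clock percentage in GPU-bound scenarios) can be sketched in a few lines; the 1911/2063 MHz figures are just the clocks quoted above, and real gains are usually a bit lower:

```python
def estimated_fps(base_fps, base_clock_mhz, oc_clock_mhz):
    """Naive GPU-bound estimate: FPS scales linearly with core clock.
    Real gains are usually a bit lower (memory or CPU limits)."""
    return base_fps * (oc_clock_mhz / base_clock_mhz)

# Example: boosting from 1911 MHz to 2063 MHz (~8% core OC)
print(round(estimated_fps(60.0, 1911, 2063), 1))  # ~64.8 FPS
```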


----------



## pez

Quote:


> Originally Posted by *Sigtinius*
> 
> So if a GPU is rated for 1670Mhz in boost mode, it can automatically be running @ ~2000Mhz under load, or you mean to boost it up to about 2000Mhz and try that?
> I was also wondering how much of an FPS gain I can get from OCing it this much - any ballpark idea?


If you're looking to mildly OC and maintain your reliability for as long as possible, the way I would suggest doing it would be to find the max fan speed you can handle and then make that the top of your fan curve.

Pascal will boost up to 2000Mhz or so if it sits at the right temperature. Pascal scales based on thermals, so it makes it a near perfect architecture for what you're looking for.
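A fan curve like the one described, ramping up to a chosen maximum at a target temperature, can be sketched as a piecewise-linear function. The breakpoints below are illustrative, not Afterburner's defaults:

```python
def fan_speed(temp_c, points=((30, 30), (60, 70), (70, 85))):
    """Piecewise-linear fan curve over (temp_C, fan_%) breakpoints.
    Below the first point the fan holds the minimum; above the last,
    it holds the chosen maximum (the 'top of the curve' cap)."""
    if temp_c <= points[0][0]:
        return points[0][1]
    if temp_c >= points[-1][0]:
        return points[-1][1]
    for (t0, s0), (t1, s1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(45))  # halfway between the 30C/30% and 60C/70% breakpoints
```

The point of capping the last breakpoint at the loudest speed you can tolerate is that Pascal's boost bins are temperature-sensitive, so a curve that keeps the core cooler earlier holds higher clocks for longer.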


----------



## cluster71

I have tried to get my graphics card to draw as much power as possible. I am convinced there is a hardware limit of 300W, or that the card is current-limited. Kingpin refers to a "Total Power Limit" on his card, and the chip documentation points the same way.

When I ran Heaven while watching HWiNFO64 and my wall power meter, the GPU managed to pull 280-300W continuously with the Zotac Extreme. No shunt mod, so I think the readings are right.

The goal is that no limit interrupts boost, so I think I managed to disconnect the "Total Load Current Sense" feedback (hardware mod).

With the XOC BIOS at 2190MHz / 1.20V, the GPU pulls 330-335W continuously, only 30W more than before, which is not much. The XOC BIOS does not usually give the best scores, so I go with the FTW.

FTW BIOS: stable at 300-305W, 2112/6000 at 1.081V.

7700K, Zotac Extreme on air, 35-40C, 5 W/mK pads.



I would like to increase memory voltage, since memory speed counts. I need better tools.

*Clock it if you can*, especially if you have no temperature or noise problem.


----------



## KedarWolf

With beta drivers, clean Windows install, tons of O/S and services tweaking, Nvidia Control panel settings in OP.









http://www.3dmark.com/3dm/21407496


----------



## MrTOOSHORT

Great score! Catching up to my TX Pascal score.


----------



## 2ndLastJedi

Quote:


> Originally Posted by *KedarWolf*
> 
> With beta drivers, clean Windows install, tons of O/S and services tweaking, Nvidia Control panel settings in OP.
> 
> http://www.3dmark.com/3dm/21407496


The beta driver causes screen tearing in YouTube for me, and scores 80 less in Superposition 4K. And I'm only 1000 away from you in Time Spy, lol.


----------



## BoredErica

On air I was getting 2120-ish with fans at 100%; now on water with only two fans (literally, only 2 fans on my rads right now) I am getting ~2090 give or take, and VRAM is at +625, up from +470-ish.


----------



## Alex132

Quote:


> Originally Posted by *Darkwizzie*
> 
> On air I was getting 2120 ish with fans at 100%, now on water with only two fans (literally, on my rads only 2 fans right now) I am getting ~2090 give or take, and vram is on +625 from +470ish.


Pshhh I can do +900-950mhz on air on my VRAM. But only +75-78mhz core


----------



## Kriant

Played around more with my Poseidon (need to update my sig). At stock settings that card is just awesome. It boosts up to 2025 without any tweaks and runs 37-38C while gaming, dead silent. And I've managed to fit an HB SLI bridge between the Poseidon and the FTW3; a super tight fit, but it works.


----------



## BoredErica

Quote:


> Originally Posted by *Alex132*
> 
> Pshhh I can do +900-950mhz on air on my VRAM. But only +75-78mhz core


What in the what


----------



## GMcDougal

What settings is everyone running to get your oculus rift looking its best?


----------



## Alex132

Quote:


> Originally Posted by *Darkwizzie*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Pshhh I can do +900-950mhz on air on my VRAM. But only +75-78mhz core
> 
> What in the what

I might be able to get +1000MHz stable if it runs cooler / I throw more power at it. But +1000 crashes around the middle of Time Spy.

This is the first card I've had that can overclock memory decently. On my 5870 I could do a total of 20MHz more, on the 690 a whole extra 2MHz before it was unstable, and the 295X2 was around 30-40MHz more. However, all of them could do very good core OCs (I remember the 295X2 could do 1295MHz, can't remember the others tbh).

And... I'd much rather have a good core overclocker than memory.









I was seriously gaining like single digit point scores in FS from upping the memory insanely, so I didn't really bother.

e- I think this run was +75/800
http://www.3dmark.com/fs/13237433


----------



## Agent-A01

Quote:


> Originally Posted by *Alex132*
> 
> Pshhh I can do +900-950mhz on air on my VRAM. But only +75-78mhz core


What exactly are you using to test that?

All these people claim high VRAM OCs, but they test with a benchmark like Firestrike that only uses about 1GB of VRAM...

You need thorough testing with a game that will fill it all up.
Black Ops 3, for example, will easily use 11GB at 1440p+ with ultra textures.

Just because the first two VRAM chips clock high does not mean the rest will.


----------



## BoredErica

I can probably go higher than +625, but in my testing +635 netted lower performance in some instances. I see this as a sign of correctable errors in the memory: when there are fewer errors, performance is higher; when more errors have to be corrected, performance drops. Anything higher than +695 is definitely not stable.


----------



## Alex132

Quote:


> Originally Posted by *Agent-A01*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Pshhh I can do +900-950mhz on air on my VRAM. But only +75-78mhz core
> 
> What exactly are you using to test that?
> 
> All these people claim high VRAM OCs but test a benchmark like firestrike where it only uses 1gb of vram..
> 
> You need thorough testing with a game that will fill it all up.
> Black ops 3 for example will use 11gb easy at 1440P+ with ultra textures.
> 
> Just because the first 2 VRAM chips clock high does not mean the rest will

Firestrike for initial, then a game of PUBG (7-9GB VRAM) for stability. I don't really care about stability beyond benchmarks because that's the only time I overclock anyway. I run stock 24/7 otherwise.


----------



## tieberion

My Gigabyte GTX 980 was like that. I put a custom BIOS on it that unlocked everything; it was a cherry-picked G1 chip and went to 1702 on boost, 1505 24/7. But damn, you put a single extra MHz on the memory and it crashed every time.


----------



## dcf-joe

Just replaced my GTX 980 Gigabyte G1 with a BIOS mod, which ran at 1550MHz core with a pretty hefty memory overclock, with a GTX 1080 Ti EVGA FTW3 running the slave BIOS (higher power target). I kept the rest of my PC the same: basically a 2600K @ 4.8GHz and 8GB of DDR3 RAM.

First thing I did was run a comparison benchmark in Superposition and a comparison of GTA 5 using the in-game benchmarking tool.

Overall, obviously, the 1080 Ti is a performer. I did not mess with any OC, did not touch any fan curves, and did not touch the voltage. It is obvious that I need to at least mess with the fan curves, because the core clocks on my Ti were all over the place due to the stock fan curves letting the temps get too high before turning the fans up.



GTX 980 Superposition score:


Stock 1080 Ti FTW3 Superposition score: (About a 96% improvement)


GTA 5 In-Game Benchmark Tool Comparison: (About a 74% improvement)


----------



## PimpSkyline

Would Graphite be almost as good as Liquid Metal for the Shunt Mod? Any thoughts?


----------



## alucardis666

So... how much should I be looking to dump my 1080 TI's for?


----------



## dcf-joe

Quote:


> Originally Posted by *alucardis666*
> 
> So... how much should I be looking to dump my 1080 TI's for?


What are you going to replace them with?


----------



## Psykoking

Quote:


> Originally Posted by *PimpSkyline*
> 
> Would Graphite be almost as good as Liquid Metal for the Shunt Mod? Any thoughts?


Not really. Gallium (the main component of LM) has an electrical conductivity of about 7.14 * 10^6 S/m (siemens per meter), whereas graphite doesn't even reach half that, at about 3 * 10^6 S/m. Those values are for the pure forms, but considering that LM is 98% gallium, you can use them as a guideline.
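To put those conductivity figures in shunt-mod terms, here is a rough parallel-resistance sketch. The 5 mOhm sense-resistor value and the bridge geometry are assumptions for illustration, not measurements of the actual card:

```python
def bridge_resistance(sigma_s_per_m, length_m, area_m2):
    """R = L / (sigma * A) for a uniform conductive bridge painted over the shunt."""
    return length_m / (sigma_s_per_m * area_m2)

def effective_shunt(r_shunt, r_bridge):
    """Parallel combination; the controller then under-reads power
    by the ratio r_eff / r_shunt."""
    return r_shunt * r_bridge / (r_shunt + r_bridge)

R_SHUNT = 5e-3                   # assumed 5 mOhm sense resistor
L, A = 3e-3, 1e-3 * 0.05e-3      # assumed 3 mm bridge, 1 mm x 0.05 mm cross-section

for name, sigma in (("liquid metal", 7.14e6), ("graphite", 3e6)):
    r_eff = effective_shunt(R_SHUNT, bridge_resistance(sigma, L, A))
    print(f"{name}: reported power scaled by {r_eff / R_SHUNT:.2f}")
```

With this assumed geometry a graphite bridge still shifts the reading, but its lower conductivity means a proportionally smaller under-report, which matches the conductivity comparison above.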


----------



## ahnafakeef

Hello everyone. If it's not too much trouble, can someone please tell me if a single 1080ti will suffice for maxed out 4K/60FPS gaming? If not, I will probably look into getting two for SLi.

I haven't kept up with the recent developments of the gaming and GPU side of things. So please do mention any information that you think I might benefit from.

Thank you very much.


----------



## GraphicsWhore

Quote:


> Originally Posted by *ahnafakeef*
> 
> Hello everyone. If it's not too much trouble, can someone please tell me if a single 1080ti will suffice for maxed out 4K/60FPS gaming? If not, I will probably look into getting two for SLi.
> 
> I haven't kept up with the recent developments of the gaming and GPU side of things. So please do mention any information that you think I might benefit from.
> 
> Thank you very much.


Many games, yes.
Some newer games, no.

Other factors at play, obviously (rest of the system, game's optimization, etc).

SLI will also depend on how it behaves with specific titles.

Chasing 4K60 is fruitless IMO. Games will continue to get more complex and taxing.

Right now if it were me I'd get a Titan xP for $1200, get close to it, minimize inherent SLI issues by keeping a single card and save potentially hundreds.


----------



## ahnafakeef

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Many games, yes.
> Some newer games, no.
> 
> Other factors at play, obviously (rest of the system, game's optimization, etc).
> 
> SLI will also depend on how it behaves with specific titles.
> 
> Chasing 4K60 is fruitless IMO. Games will continue to get more complex and taxing.
> 
> Right now if it were me I'd get a Titan xP for $1200, get close to it, minimize inherent SLI issues by keeping a single card and save potentially hundreds.


From the few benchmarks that I've gone through, the Titan XP doesn't perform $500 better than the 1080Ti, if at all. What advantages would a Titan XP provide over a 1080Ti?

As for the rest of the system, does it matter if I pair the card(s) with a 3770K or a 7820X at resolutions as high as 4K? If yes, how much of a difference should I expect between the two CPUs?

Thank you.


----------



## GraphicsWhore

Quote:


> Originally Posted by *ahnafakeef*
> 
> From the few benchmarks that I've gone through, the Titan XP doesn't perform $500 better than the 1080Ti, if at all. What advantages would a Titan XP provide over a 1080Ti?
> 
> As for the rest of the system, does it matter if I pair the card(s) with a 3770K or a 7820X at resolutions as high as 4K? If yes, how much of a difference should I expect between the two CPUs?
> 
> Thank you.


I was talking about getting a T Xp vs two 1080Tis.

Depending on how CPU intensive the game is, you may or may not see noticeable differences at 4K, keeping in mind that CPU demand can decrease as resolution increases.

I'd probably stick with the 3770K, OC it as much as possible, introduce the new GPU/GPUs and see if it holds up.


----------



## ahnafakeef

Quote:


> Originally Posted by *GraphicsWhore*
> 
> I was talking about getting a T Xp vs two 1080Tis.
> 
> Depending on how CPU intensive the game is you may or may not see noticeable differences at 4k, keeping in in mind CPU demand can decrease as resolution increases.
> 
> I'd probably stick with the 3770K, OC it as much as possible, introduce the new GPU/GPUs and see if it holds up.


I'm not sure what I'm misunderstanding, but it makes even less sense to buy 1 Titan XP for $1200 when I can get 2 1080Tis for $1400. What am I missing? More specifically, why are you advocating the Titan XP instead of 1080Ti SLi?


----------



## TheBoom

Quote:


> Originally Posted by *ahnafakeef*
> 
> I'm not sure what I'm misunderstanding, but it makes even less sense to buy 1 Titan XP for $1200 when I can get 2 1080Tis for $1400. What am I missing? More specifically, why are you advocating the Titan XP instead of 1080Ti SLi?


Yeah I don't get it either lol. Why would anyone buy a Titan XP over a 1080ti unless they use it for deep learning or media development.

As for the question, an overclocked 1080ti can do 4K60 fps about 70% of the time in my experience.

You'd want SLI to be on the absolute safe side.


----------



## KedarWolf

Quote:


> Originally Posted by *TheBoom*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ahnafakeef*
> 
> I'm not sure what I'm misunderstanding, but it makes even less sense to buy 1 Titan XP for $1200 when I can get 2 1080Tis for $1400. What am I missing? More specifically, why are you advocating the Titan XP instead of 1080Ti SLi?
> 
> 
> 
> Yeah I don't get it either lol. Why would anyone buy a Titan XP over a 1080ti unless they use it for deep learning or media development.
> 
> As for the question, an overclocked 1080ti can do 4K60 fps about 70% of the time in my experience.
> 
> You'd want SLI to be on the absolute safe side.

Not the Titan XP (Titan X Pascal), which is about the same as a 1080 Ti; the Titan Xp is a step up from it.









http://gpu.userbenchmark.com/Compare/Nvidia-Titan-Xp-vs-Nvidia-GTX-1080-Ti/m265423vs3918





But yes, not a big step up.









http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30/0_20

Compare the single card time Spy's here.


----------



## ahnafakeef

Right. I've missed a lot in the last 8 months it seems.

So let me get this straight. Titan X Pascal and GTX 1080Ti use the same die (GP102?) and perform essentially the same in games at the same clock speeds (?). Titan Xp has the same performance per CUDA core as those two, but has a few more CUDA cores and a higher memory width (384-bit vs 352-bit) that gives it a performance boost of ~10% (?).

Please correct me if any of that is wrong.

And given the prices of these cards, their price-to-performance ratio, and their performance at 4K, it is more practical to get two 1080Tis in SLi, right?

Also, comparisons with the Titans aside, will two 1080Tis in SLi definitively max out everything at 4K with a locked FPS of 60? If the answer to that is yes, I think I may already have my decision.

Thank you for all your help. I really appreciate it.
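The bus-width arithmetic in that summary is easy to check directly. The 11 and 11.4 Gbps effective data rates below are the published GDDR5X speeds for the 1080 Ti and Titan Xp; treat them as assumptions if they don't match a specific model:

```python
def memory_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bus width (bits) / 8 * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(memory_bandwidth_gbs(352, 11.0))   # 1080 Ti  -> 484.0
print(memory_bandwidth_gbs(384, 11.4))   # Titan Xp -> ~547.2
```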


----------



## MrTOOSHORT

TX Pascal is 384-bit too.


----------



## ahnafakeef

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Tx pascal is 384bit too


Got it. And just to clarify, is the difference between Titan Xp and 1080Ti the same as it has always been in past generations with Titan and Ti cards - negligible, and almost never worth the premium in real world gaming scenarios?


----------



## GraphicsWhore

Quote:


> Originally Posted by *ahnafakeef*
> 
> I'm not sure what I'm misunderstanding, but it makes even less sense to buy 1 Titan XP for $1200 when I can get 2 1080Tis for $1400. What am I missing? More specifically, why are you advocating the Titan XP instead of 1080Ti SLi?


Not advocating, hence "if it were me." As I mentioned in my previous post, I personally find chasing 4K60 misguided. If I simply knew I wanted a crapload of GPU power and was willing to spend well over a grand, I'd get a T Xp, save a bit of money, avoid dealing with fickle-ass SLI, not have to worry about powering two of those cards, etc.

Totally just me.

Of course, since I already have a 1080Ti I'm not saying I'd neeever go SLI but at this point I'd guess an 11-series Ti on water will almost certainly be my next card.

So back to your question about 4K60: since we're already almost half-way through August, I'd wait on the 11-series if you can.


----------



## Dasboogieman

Quote:


> Originally Posted by *ahnafakeef*
> 
> Got it. And just to clarify, is the difference between Titan Xp and 1080Ti the same as it has always been in past generations with Titan and Ti cards - negligible, and almost never worth the premium in real world gaming scenarios?


Exactly that. The Titan Xp this time around has theoretically 14% more resources at a given clockspeed, but because of the severe power constraints and imperfect resource scaling, the actual performance difference is <10%.
It was slightly different with the 980 Ti vs the Titan Maxwell, where the 980 Ti came out ahead in most normal cases unless the Titan was heavily power-modded, since the 980 Ti's higher clocks within the TDP offset its resource disadvantage.

IMO the only 2 reasons you can justify a Titan as an average enthusiast is if:
1. you are competitively benchmarking, very few competitive benches push the cards so hard that the TDP becomes a real factor in performance so you can leverage the higher resource count to pull ahead score-wise.
2. you need the pro features of the Titan, which as of the recent drivers, is a massive speedup to professional software which isn't available to the Ti cards.

One thing I've learnt observing the Titan vs Ti battle over the years.
Unless you are strongly TDP constrained, increasing Clockspeed usually yields very consistent linear performance gains (unless you are VRAM bandwidth or interconnect bandwidth limited) compared to having a wider execution engine.
This means that the end result is the faster card is one that can best balance its clockspeed vs its execution engine size within its given TDP + Cooling.
So getting a narrower, hotter card such as the Ti doesn't necessarily mean it will be slower than the wider, colder Titan.
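That balance, execution width times sustained clock under a shared TDP, can be caricatured in a few lines. The core counts are the real GP102 configurations; the clocks and the wide-chip scaling factor are illustrative assumptions, not measured figures:

```python
def relative_perf(cores, clock_mhz, scaling=1.0):
    """Toy throughput model: perf ~ cores * clock, with an optional
    scaling factor (< 1.0) for imperfect wide-chip utilization."""
    return cores * clock_mhz * scaling

ti = relative_perf(3584, 2000)           # 1080 Ti: narrower, assumed higher sustained clock
titan = relative_perf(3840, 1900, 0.97)  # Titan Xp: wider, TDP-bound to a lower clock
print(f"Titan Xp vs 1080 Ti: {titan / ti:.2%}")
```

Under these assumed numbers the narrower, faster card comes out slightly ahead, which is exactly the "narrower but hotter isn't necessarily slower" point above.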


----------



## alanthecelt

Quote:


> Originally Posted by *ahnafakeef*
> 
> And given the prices of these cards, their price-to-performance ratio, and their performance at 4K, it is more practical to get two 1080Tis in SLi, right?
> 
> Also, comparisons with the Titans aside, will two 1080Tis in SLi definitively max out everything at 4K with a locked FPS of 60? If the answer to that is yes, I think I may already have my decision.
> 
> Thank you for all your help. I really appreciate it.


The games which could use the extra performance from SLI... have negligible gains.
For example, Ghost Recon Wildlands, which might sit in the upper-40s FPS (off the top of my head, on ultra), will gain like 3 FPS in SLI.

I've been in the same boat as you, and I have also weighed up the pros and cons of a Titan Xp vs Tis in SLI.
So:
I stopped chasing maxed-out 60FPS.
However, I am now on two monitors, a 4K and a lovely 1440 G-Sync ultrawide, and I found myself using the ultrawide more than I anticipated... the ultrawide will scale with any 11-series card when it becomes available.


----------



## krizby

Flashing a higher-TDP BIOS lets the 1080 Ti reach the Titan Xp at ~10% higher power draw anyway. As of now, a single 1080 Ti is probably enough for everything out there until 4K 144Hz G-Sync monitors come out.


----------



## alanthecelt

I am tempted to replace my FE and run something like a Gigabyte Waterforce Xtreme or Zotac ArcticStorm.
Do any of these offer higher power limits/potential? My FE is happy at 2025 all day long (on water).


----------



## OneCosmic

Quote:


> Originally Posted by *KedarWolf*
> 
> With beta drivers, clean Windows install, tons of O/S and services tweaking, Nvidia Control panel settings in OP.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/21407496


What OS and service tweaking did you use? I know that a lot of even pro overclockers use various tweaks to get a better score, which IMO is BS: what's the point of changing stock software parameters and behavior just to get a better score? Everyone could simply test at stock settings and leave it at that.


----------



## alanthecelt

Hmm, my FE and waterblock only just fit at 267mm... the Zotac ArcticStorm is 300mm apparently.
The MSI Seahawk is 277mm.

The Gigabyte card is 267mm, but I can't find whether it has any more potential.


----------



## dcf-joe

Quote:


> Originally Posted by *ahnafakeef*
> 
> Hello everyone. If it's not too much trouble, can someone please tell me if a single 1080ti will suffice for maxed out 4K/60FPS gaming? If not, I will probably look into getting two for SLi.
> 
> I haven't kept up with the recent developments of the gaming and GPU side of things. So please do mention any information that you think I might benefit from.
> 
> Thank you very much.


What I usually do to ascertain GPU performance is go to the latest GPU review from any reputable hardware review site and just look at the graphs for whatever card I am looking at because they are usually going to test multiple cards. I use Guru3D and their latest GPU review is a regular 1080, but all of the graphs for 4K contain FPS data from the 1080 Ti as well as the Titan X Pascal.

Personally, I still look at 1440P because I prefer high refresh rate gaming. You can see if the 4K framerates for recent games are up to snuff for you: http://www.guru3d.com/articles_pages/asus_geforce_gtx_1080_strix_oc_11_gbps_review,25.html
Quote:


> Originally Posted by *alanthecelt*
> 
> i am tempted at relocating my FE and running something like a gigabyte waterforce extreme, or zotac articstorm
> do any of these offer higher power limits/ potential? my fe is happy at 2025 all day long (on water)


I thought the EVGA FTW3 had the highest power limits?


----------



## Alex132

My FTW3 is 127%


----------



## KedarWolf

Quote:


> Originally Posted by *dcf-joe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ahnafakeef*
> 
> Hello everyone. If it's not too much trouble, can someone please tell me if a single 1080ti will suffice for maxed out 4K/60FPS gaming? If not, I will probably look into getting two for SLi.
> 
> I haven't kept up with the recent developments of the gaming and GPU side of things. So please do mention any information that you think I might benefit from.
> 
> Thank you very much.
> 
> 
> 
> What I usually do to ascertain GPU performance is go to the latest GPU review from any reputable hardware review site and just look at the graphs for whatever card I am looking at because they are usually going to test multiple cards. I use Guru3D and their latest GPU review is a regular 1080, but all of the graphs for 4K contain FPS data from the 1080 Ti as well as the Titan X Pascal.
> 
> Personally, I still look at 1440P because I prefer high refresh rate gaming. You can see if the 4K framerates for recent games are up to snuff for you: http://www.guru3d.com/articles_pages/asus_geforce_gtx_1080_strix_oc_11_gbps_review,25.html
> Quote:
> 
> 
> 
> Originally Posted by *alanthecelt*
> 
> i am tempted at relocating my FE and running something like a gigabyte waterforce extreme, or zotac articstorm
> do any of these offer higher power limits/ potential? my fe is happy at 2025 all day long (on water)
> 
> 
> I thought the EVGA FTW3 had the highest power limits?

I regret buying a 4K 60Hz G-Sync screen; really wish I'd gone with a high-refresh-rate 1440p G-Sync screen.


----------



## dcf-joe

Quote:


> Originally Posted by *Alex132*
> 
> My FTW3 is 127%


Mine too. However, I wish I knew what that 127% actually translated to. Card A might be 260 watts at 100% and Card B might be 275 watts at 100%. Card A lets you raise the limit to 130%, which delivers 338 watts, while Card B only allows 127% but still out-powers Card A at 349 watts.
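The point is easy to check with the hypothetical Card A / Card B wattages above (illustrative numbers from the post, not measured specs for any real board):

```python
# A slider percentage only means something relative to the card's base power
# target. The wattages below are the hypothetical Card A / Card B figures
# from the post, not measured values.

def max_power_w(base_w: float, limit_pct: float) -> float:
    """Convert a base power target and slider percentage into watts."""
    return base_w * limit_pct / 100

card_a = max_power_w(260, 130)  # 338.0 W
card_b = max_power_w(275, 127)  # 349.25 W

# The smaller percentage still wins on absolute watts:
print(card_a, card_b)
```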


----------



## Alex132

Quote:


> Originally Posted by *dcf-joe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> My FTW3 is 127%
> 
> 
> 
> Mine too. However, I wish I knew what that 127% actually translated to. Card A might be 260 watts at 100% and Card B might be 275 watts at 100%. Card A lets you overpower to 130% which delivers 338 watts, while Card B only allowing overpower to 127%, but outpowers Card A to 349 watts.

127% of the 350 watt limit on the FTW3 IIRC.


----------



## mouacyk

Quote:


> Originally Posted by *KedarWolf*
> 
> I regret buying a 4K 60Hz G-Sync screen; really wish I'd gone with a high-refresh-rate 1440p G-Sync screen.


Your card is unable to maintain >55fps? 60Hz turns out to be too low for your sensitive eyes? Both?

I've been off 60Hz for a year now. Can't go back unless it's third-person adventures.


----------



## s1rrah

Quote:


> Originally Posted by *KedarWolf*
> 
> I regret buying a 4K 60Hz G-Sync screen; really wish I'd gone with a high-refresh-rate 1440p G-Sync screen.


I'm with you on that; this is precisely why I haven't upgraded to 4K ... sure, 60hz is plenty playable, even fun ... but 1440p at 120fps+ in pretty much everything? Ahhhhh, ambrosia....

It will most likely be two more GPU generations before I move to a 4K screen. 1080 ti is the Unicorn(tm) of 1440p gaming ...


----------



## ahnafakeef

Thank you all very much for your inputs. I've gone through a few reviews, and although most of them show FPS of around 60 for a single 1080Ti in most games, I have two concerns:

1. The games reviewed are from last year, most of which I've already played. So I don't know how the newer titles such as Mass Effect Andromeda, Wolfenstein II, Dirt 4 etc will fare.

2. And more importantly, the FPS shown is the _average_ FPS, not the _minimum_ FPS. I want the minimum FPS in all titles to be above 60.

Those being the concerns, is it possible for a single 1080Ti to maintain a minimum of 60FPS in all titles currently out right now?

I am all for a single card setup, but only as long as it achieves the minimum 60 FPS at my required resolution. So to 4K gamers with a single 1080Ti, any insight on your experience with recent titles would be greatly appreciated.

Just in case it helps, I'm planning on pairing the card(s) with a (possibly overclocked) 7820X and 16GB of 3200/3400MHz DDR4 RAM.

Any recommendation/suggestion would help. Thank you.


----------



## hadesfactor

Quote:


> Originally Posted by *ahnafakeef*
> 
> Thank you all very much for your inputs. I've gone through a few reviews, and although most of them show FPS of around 60 for a single 1080Ti in most games, I have two concerns:
> 
> 1. The games reviewed are from last year, most of which I've already played. So I don't know how the newer titles such as Mass Effect Andromeda, Wolfenstein II, Dirt 4 etc will fare.
> 
> 2. And more importantly, the FPS shown is the _average_ FPS, not the _minimum_ FPS. I want the minimum FPS in all titles to be above 60.
> 
> Those being the concerns, is it possible for a single 1080Ti to maintain a minimum of 60FPS in all titles currently out right now?
> 
> I am all for a single card setup, but only as long as it achieves the minimum 60 FPS at my required resolution. So to 4K gamers with a single 1080Ti, any insight on your experience with recent titles would be greatly appreciated.
> 
> Just in case it helps, I'm planning on pairing the card(s) with a (possibly overclocked) 7820X and 16GB of 3200/3400MHz DDR4 RAM.
> 
> Any recommendation/suggestion would help. Thank you.


What you also have to take into consideration is that not all games will scale in SLI. There are ways to get your FPS up by changing certain settings to High instead of Ultra, or even Medium. As other people have already said, you'll go crazy trying to guarantee a minimum FPS at 4K, because games keep coming out requiring more and more horsepower, and trying to future-proof yourself with GPUs is futile. I can't attest to the Titan Xp, but I am very happy with my single Ti (which is very close to the Xp's performance in games). If you can afford SLI then go for it; pretty much the only thing that stops anyone here, besides current tech, is the cost.


----------



## ahnafakeef

Quote:


> Originally Posted by *hadesfactor*
> 
> What you also have to take into consideration is that not all games will scale in SLI. There are ways to get your FPS up by changing certain settings to High instead of Ultra, or even Medium. As other people have already said, you'll go crazy trying to guarantee a minimum FPS at 4K, because games keep coming out requiring more and more horsepower, and trying to future-proof yourself with GPUs is futile. I can't attest to the Titan Xp, but I am very happy with my single Ti (which is very close to the Xp's performance in games). If you can afford SLI then go for it; *pretty much the only thing that stops anyone here, besides current tech, is the cost.*


Exactly what I'm thinking about. I know that going SLi will pretty much ensure the 60FPS (minus the titles with SLi issues, which I'm assuming aren't too many), but I could also just lower a few settings for now and get a new card from the next generation.

What major titles should I be aware of that will not scale with SLi? Besides Wildlands, I mean.

I'm also considering getting just one first and see if I'm satisfied with the FPS achieved with the compromises in settings. If I'm not, I could always add another one later.

Do mention anything that you might think will help with the decision, or anything else that I am not taking into consideration.

Thank you.


----------



## Pandora's Box

Sli generally doesn't improve minimum fps all that much if at all. Asking for 60fps minimums at 4k is asking too much of a 1080ti. 1440p 165hz is where this card shines


----------



## KedarWolf

Quote:


> Originally Posted by *ahnafakeef*
> 
> Thank you all very much for your inputs. I've gone through a few reviews, and although most of them show FPS of around 60 for a single 1080Ti in most games, I have two concerns:
> 
> 1. The games reviewed are from last year, most of which I've already played. So I don't know how the newer titles such as Mass Effect Andromeda, Wolfenstein II, Dirt 4 etc will fare.
> 
> 2. And more importantly, the FPS shown is the _average_ FPS, not the _minimum_ FPS. I want the minimum FPS in all titles to be above 60.
> 
> Those being the concerns, is it possible for a single 1080Ti to maintain a minimum of 60FPS in all titles currently out right now?
> 
> I am all for a single card setup, but only as long as it achieves the minimum 60 FPS at my required resolution. So to 4K gamers with a single 1080Ti, any insight on your experience with recent titles would be greatly appreciated.
> 
> Just in case it helps, I'm planning on pairing the card(s) with a (possibly overclocked) 7820X and 16GB of 3200/3400MHz DDR4 RAM.
> 
> Any recommendation/suggestion would help. Thank you.


The trick with 4K to keep frame rates up: the pixel count is so high that you can have antialiasing completely off and still have incredible rendering. The difference between off and on is negligible, and if you are dipping below 60 FPS you can also try turning the detail settings in the game from, say, Ultra to Very High.


----------



## ahnafakeef

Quote:


> Originally Posted by *Pandora's Box*
> 
> Sli generally doesn't improve minimum fps all that much if at all. Asking for 60fps minimums at 4k is asking too much of a 1080ti. 1440p 165hz is where this card shines


Quote:


> Originally Posted by *KedarWolf*
> 
> The trick with 4K to keep frame rates up: the pixel count is so high that you can have antialiasing completely off and still have incredible rendering. The difference between off and on is negligible, and if you are dipping below 60 FPS you can also try turning the detail settings in the game from, say, Ultra to Very High.


All of which raises the question: will I see a real difference in performance going from a Titan X Maxwell SLi setup to a single 1080Ti?


----------



## hadesfactor

Quote:


> Originally Posted by *ahnafakeef*
> 
> Exactly what I'm thinking about. I know that going SLi will pretty much ensure the 60FPS (minus the ones with SLi issues, which I'm assuming is not too many), but I could also just lower a few settings for now and get a new card from the next generation.
> 
> What major titles should I be aware of that will not scale with SLi? Besides Wildlands, I mean.
> 
> I'm also considering getting just one first and see if I'm satisfied with the FPS achieved with the compromises in settings. If I'm not, I could always add another one later.
> 
> Do mention anything that you might think will help with the decision, or anything else that I am not taking into consideration.
> 
> Thank you.


Exactly what Pandora's Box said. 4K gaming is in its infancy right now, and just as KedarWolf and I said, settings will make a huge difference. Here is a link to a post about games working with SLI; you'll pretty much hear the exact same things we just said here. Nvidia pulled back on SLI a while ago, mostly 3- and 4-way, and developers don't really concentrate on SLI scaling and support right now; in fact, some games will give you a negative effect.

https://hardforum.com/threads/name-games-that-dont-work-with-sli.1930282/


----------



## ahnafakeef

Quote:


> Originally Posted by *hadesfactor*
> 
> Exactly what Pandora's Box said. 4K gaming is in its infancy right now, and just as KedarWolf and I said, settings will make a huge difference. Here is a link to a post about games working with SLI; you'll pretty much hear the exact same things we just said here. Nvidia pulled back on SLI a while ago, mostly 3- and 4-way, and developers don't really concentrate on SLI scaling and support right now; in fact, some games will give you a negative effect.
> 
> https://hardforum.com/threads/name-games-that-dont-work-with-sli.1930282/


Yeah, I'm starting to lean more and more towards a single 1080Ti, at least to begin with.

Is there any word/estimate on when the next line of GeForce cards might be released?

Would someone make PSU recommendations as well, please? Would a CM V850 suffice for a 7900X + 1080Ti SLi system when overclocked?

Thank you.


----------



## 12Cores

Since day one I have been running my card with Fast Sync enabled, Prefer Maximum Performance, OC 2163/11800, and FPS capped at 120. I game on a 60Hz 40" 4K TV. Today I enabled V-Sync & Optimal Power, and to my surprise the card is running all my games under 1600MHz at sub-40C temps, locked at 60Hz! This is silly; you shouldn't be able to get this kind of performance with these temps and clocks.


----------



## Psykoking

Quote:


> Originally Posted by *ahnafakeef*
> 
> Would someone please make PSU recommendations as well, please? Would a CM V850 suffice for a 7900X+1080Ti SLi system when overclocked?
> 
> Thank you.

Depends on how hard you want to overclock. An 850W PSU will only suffice from stock clocks up to a very slight OC. Also look at recent benches of the 7900X; it seems that without delidding you won't even be able to cool it sufficiently. For reference, see Tom's Hardware's bench: http://www.tomshardware.com/reviews/intel-core-i9-7900x-skylake-x,5092-11.html. If it were me looking for a PSU, I would get at least a 1000W unit, because the 7900X in Prime95 pulls ~230W (at stock) and two 1080 Tis (at stock) pull around 300W each on a torture test, which already adds up to ~830W at stock clocks. Of course you can argue that it's very unlikely to hit those numbers in normal use, but if you want a system that's stable 24/7 regardless of load, you should take this into consideration.
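The sizing above, as a quick sum (the component draws are the stock torture-test figures quoted in the post; the 25% headroom target is my own assumption, not a hard rule):

```python
# Worst-case PSU sizing from the post's stock torture-test draws:
# 7900X ~230 W, each 1080 Ti ~300 W. The 25% headroom is an assumed
# margin for transients and PSU efficiency, not a measured requirement.

def psu_recommendation(loads_w, headroom=0.25):
    """Return (summed peak draw, suggested minimum PSU rating)."""
    peak = sum(loads_w)
    return peak, peak * (1 + headroom)

peak, recommended = psu_recommendation([230, 300, 300])
print(f"peak ~{peak} W -> look for a >= {recommended:.0f} W PSU")
```

Which lands right on the "at least 1000W" advice; overclocking both CPU and GPUs would push the peak figure higher still.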


----------



## KickAssCop

Get an EVGA 1300 W G2 or 1200 W P2 and never worry about power again.


----------



## Streetdragon

Quote:


> Originally Posted by *KickAssCop*
> 
> Get an EVGA 1300 W G2 or 1200 W P2 and never worry about power again.


That. Yeah, it's overkill, BUT while gaming at full tilt without any sync (100% load) the whole PC can pull up to 500W, maybe with some higher spikes, so the PSU can work near its maximum efficiency.

Correct me if I'm wrong ^^

BTW, which monitor is good for 1440p / over 100Hz? I don't need G-Sync; just a good one for gaming that won't cost me my last pair of underpants. ATM I have a 1440p/60Hz FreeSync that I can't overclock, and I want to use it as my second screen only after that.


----------



## alanthecelt

picked up a cheap Strix 1080ti OC
I'm not sure it will fit in my current build.. yet....
my FE on water sits rock solid at 2025 all day long, i understand the Strix have the higher power limit, so kinda wondering if it is worth blocking up and putting in place of my FE, i get that any gains will be marginal at best


----------



## KickAssCop

First run it w/ air cooling, maybe? lol.


----------



## alanthecelt

would be an absolute nightmare to run it air only
i will make sure it boots obviously and go from there...


----------



## hadesfactor

Quote:


> Originally Posted by *alanthecelt*
> 
> picked up a cheap Strix 1080ti OC
> I'm not sure it will fit in my current build.. yet....
> my FE on water sits rock solid at 2025 all day long, i understand the Strix have the higher power limit, so kinda wondering if it is worth blocking up and putting in place of my FE, i get that any gains will be marginal at best


You could flash your FE bios and get a better power limit


----------



## alanthecelt

Quote:


> Originally Posted by *hadesfactor*
> 
> You could flash your FE bios and get a better power limit


yup, i could have
but the Strix card is engineered for it to start with.. so in my mind... makes more sense...


----------



## hadesfactor

Quote:


> Originally Posted by *alanthecelt*
> 
> yup, i could have
> but the Strix card is engineered for it to start with.. so in my mind... makes more sense...


If you're already water cooling I would IMO put a block on it


----------



## stangflyer

Quote:


> Originally Posted by *alanthecelt*
> 
> yup, i could have
> but the Strix card is engineered for it to start with.. so in my mind... makes more sense...


So if you go from 2025 to 2100 you get 1 fps extra. Is it worth all that work? Benching maybe but gaming no.


----------



## alanthecelt

^^ yup i agree... not so worried about cost as my man maths economics has it covered lol
but also gives me the opportunity to put a nicer looking block on


----------



## GraphicsWhore

Quote:


> Originally Posted by *alanthecelt*
> 
> yup, i could have
> but the Strix card is engineered for it to start with.. so in my mind... makes more sense...


Water-cooled FEs have responded very well to this BIOS. You will almost certainly get little to no gains using the Strix.


----------



## alanthecelt

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Water-cooled FEs have responded very well to this BIOS. You will almost certainly get little to no gains using the Strix.


And that's the answer I hoped for, for my own sanity.
Thanks.
So I could toy with flashing a higher-power BIOS and go from there.
Seems strange to add extra power connectors and have it be fairly unnecessary.


----------



## GraphicsWhore

Quote:


> Originally Posted by *alanthecelt*
> 
> And that's the answer i hoped for my own sanity
> thanks
> so i could toy with flashing a higher power bios and go from there
> seems strange to add an extra 2 pin power and it be fairly unnecessary


XOC and ArcticStorm seem to have the biggest effect. Remember with the XOC you don't need to run the power limit batch file. Just flash and go. I assume you're acquainted with the "How to flash a BIOS on your 1080Ti" thread. Everything you need, including link to XOC BIOS, is there.

Let us know how it goes.


----------



## KedarWolf

Afterburner beta 16 link here, voltage slider works out of the box.

*Highlight the link text, then copy and paste into the browser, link is broken if you don't!*

http://office.guru3d.com/afterburner/MSIAfterburnerSetup440Beta16.rar


----------



## webhito

So, I finally got enough money to do a little extra spending and picked up a second 1080 Ti. For some reason, sometimes when logging into Windows my SLI profile is disabled, and re-enabling it doesn't always work: the screen flickers, it shows as selected, then resets to disabled again. Anyone else have this issue? Mind you, it worked fine on my Sabertooth X99 board, but since side-grading to an X99 WS it's been doing this to me.

Using the latest drivers and older ones prove to do the same.


----------



## GraphicsWhore

Noticed this discrepancy between Heaven and AB readings.

Memory of 6039 vs 6040 - whatever - but 2088 vs 2076 on core seems a relatively wide gap.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Noticed this discrepancy between Heaven and AB readings.
> 
> Memory of 6039 vs 6040 - whatever - but 2088 vs 2076 on core seems a relatively wide gap.


No big deal. One tool polled 2088MHz for that split second while the other read 2076MHz the next. It's a single bin between the two.


----------



## TheBoom

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Noticed this discrepancy between Heaven and AB readings.
> 
> Memory of 6039 vs 6040 - whatever - but 2088 vs 2076 on core seems a relatively wide gap.


Heaven has always been reporting wrong values for me since the beginning. And I mean since the days of the 4xx GTX cards.

Just use the readings reported by GPU-Z or MSI AB and you'll never go wrong.


----------



## johnwayne117

And what to do first, memory or core? And how do you guys find the sweetspot?


----------



## TheBoom

Quote:


> Originally Posted by *johnwayne117*
> 
> And what to do first, memory or core? And how do you guys find the sweetspot?


Easier to do memory first. Start with +500MHz, then go down in increments of 50, and then 30-20, to the point where it starts artifacting or giving you lower scores in benches. Then just go with the highest memory clock that gave you the best results without any artifacts. Some people like to fine-tune down to 5 or even 1MHz on memory, but I personally don't care for it and don't think it makes the slightest difference.

Then, with that memory overclock set, you can start on the core overclock. There are two ways to do this: one is the old-school way with the slider, the other is the curve. I suggest you look at the first page of this thread for more info on that.

The sweet spot is also subjective. One approach is chasing the highest clock/score without any crashes. There is an entirely different goal with Pascal cards, though, if you choose to follow it: finding your highest overclock at the lowest possible voltage (preferably around 1.000v). That way the card uses less power and gives you almost identical scores/FPS in games or benches. That can only be done with the curve, however.

You can expect 2000-2050MHz on the core and +500 to +1000 on memory for the majority of the 1080 Tis out there.
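The memory step-down above can be sketched as a coarse-then-fine search. The stability check here is a stand-in lambda (in practice you'd run a bench and watch for artifacts or score regressions); the start offset and step sizes mirror the post, and the helper name is my own.

```python
# Coarse-to-fine memory-offset search, sketched after the procedure in the
# post. `is_stable` is a placeholder for "run a benchmark, no artifacts,
# no score regression" - here it is just a lambda for illustration.

def find_memory_offset(is_stable, start=500, coarse=50, fine=20):
    """Walk down from `start` in big steps, then creep back up in small ones."""
    offset = start
    while offset > 0 and not is_stable(offset):  # drop by 50 until stable
        offset -= coarse
    while is_stable(offset + fine):              # then inch back up by 20
        offset += fine
    return offset

# Hypothetical card that artifacts above +430 MHz:
print(find_memory_offset(lambda mhz: mhz <= 430))  # 420
```

The coarse pass overshoots downward (500 → 450 → 400), then the fine pass climbs back to the highest stable step, which is the "increments of 50 and then 30-20" idea in loop form.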


----------



## Youngtimer

Quote:


> Originally Posted by *TheBoom*
> 
> ...You can expect 2000-2050mhz on the core and +500 to 1000 on memory for majority of the 1080tis out there.


Yeah, it should look roughly like this.

(attachment: _max.jpg)


----------



## GraphicsWhore

Installed beta drivers (most recent from July 31st) and ran FireStrike hitting a new high for total and graphics score. Not sure if drivers actually helped but it was a decent bump.

For current AB profile (2114/6039), previous graphics high was 32,296. Previous total high was 23,305.


----------



## DerComissar

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Installed beta drivers (most recent from July 31st) and ran FireStrike hitting a new high for total and graphics score. Not sure if drivers actually helped but it was a decent bump.
> 
> For current AB profile (2114/6039), previous graphics high was 32,296. Previous total high was 23,305.


385.12?

I've been interested to see if any 1080 Ti owners had good results with this driver.
Good to see it works well for you.

It's been a good driver for my TXp: higher Superposition scoring, and a bit smoother in GTA V.


----------



## GraphicsWhore

Quote:


> Originally Posted by *DerComissar*
> 
> 385.12?
> 
> I've been interested to see if any 1080 Ti owners had good results with this driver.
> Good to see it works well for you.
> 
> It's been a good driver for my TXp, higher Supo scoring, and a bit smoother for GTA V.


Correct - 385.12

Also got a tiny bump in TimeSpy.


----------



## KedarWolf

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DerComissar*
> 
> 385.12?
> 
> I've been interested to see if any 1080 Ti owners had good results with this driver.
> Good to see it works well for you.
> 
> It's been a good driver for my TXp, higher Supo scoring, and a bit smoother for GTA V.
> 
> 
> 
> Correct - 385.12
> 
> Also got a tiny bump in TimeSpy.

That's the driver I got the 11298 Time Spy with.









http://www.3dmark.com/spy/2173470


----------



## GraphicsWhore

Quote:


> Originally Posted by *KedarWolf*
> 
> That's the driver I got the 11298 Time Spy with.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/spy/2173470


That was on the ArcticStorm BIOS, right?

I was annoyed I was 1 off 11200 graphics and I'd love to catch up to you but if you recall I had posted that when I flashed the ArcticStorm, my memory clocks topped out at 800Mhz lol. I'll need to give it another try.


----------



## KedarWolf

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> That's the driver I got the 11298 Time Spy with.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/spy/2173470
> 
> 
> 
> That was on the ArcticStorm BIOS, right?
> 
> I was annoyed I was 1 off 11200 graphics and I'd love to catch up to you but if you recall I had posted that when I flashed the ArcticStorm, my memory clocks topped out at 800Mhz lol. I'll need to give it another try.

Yeah, Arctic Storm BIOS.









That was on a clean install of Windows, fully optimized: unneeded services disabled, unneeded Windows features disabled, all Windows sounds and animations off, no backgrounds... like 75 tweaks on a clean install. I'd make a tweak guide, but "ain't nobody got time for dat".









Also, I changed PhysX from my video card to the CPU, and that seems to free up some GPU resources somehow.


----------



## dansi

I flashed the FTW3 358W BIOS to my FE and got 127% now, but I'm still seeing the PWR limit flag when I run the FF14 Stormblood bench, you know, during the opening scene of the bench.

Any other tests to assure the new bios is working as it should?


----------



## Alex132

Quote:


> Originally Posted by *dansi*
> 
> I flashed ftw3 358w bios to my FE, got 127% now, but im still seeing pwr limit when i run ff14 stormblood bench, you know during the starting scene of the bench.
> 
> Any other tests to assure the new bios is working as it should?


I think GPU-Z can tell you.


----------



## erso44

Hi,

I am purchasing a 1080Ti but which manufacturer do you recommend?

After purchasing I will go for water again


----------



## GreedyMuffin

Evga, MSI, Gigabyte.


----------



## ZealotKi11er

Has anyone experienced some games crashing to desktop? I had been using AMD cards for a very long time, and since switching to Nvidia I am having too many problems with games crashing. First it was Metro LL and now Wolfenstein: The New Order. I managed to get Metro LL to finish, but Wolfenstein: The New Order I can't play for more than 60 seconds.


----------



## MrGreaseMonkkey

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Anyone experience some games crashing to desktop? I have been using AMD cards for very long time and since switching to Nvidia I am having too many problems with games crashing. First it was Metro LL and now Wolfenstein: The New Order. I managed to get Metro LL to finish but Wolfenstein: The New Order I cant play more than 60s.


Download Display Driver Uninstaller and run it; it will ask to uninstall in safe mode, click yes. Once booted into safe mode, uninstall the Nvidia drivers, and the AMD drivers if you have not already. Reboot and reinstall the Nvidia drivers. What are your system specs?

Sent from my iPhone using Tapatalk Pro


----------



## RebelHell

New to the 1080 Ti owner's club. I just submitted my validation form. I actually bought the first 1080 Ti last month along with an i7 7700K and Z270 board. Microcenter had a second open-box unit for $100 off this week, so I had to pick it up too. I stuck with the Founders Edition as I have water-cooling plans. Two of these in SLI, even on air, are a huge upgrade from my old 780 Ti. So, any issues running these things on two x8 PCIe 3.0 slots? They seem to be doing okay, but I have nothing to compare against. I've been thinking of switching to X299 or X399 in the near future; not sure how much of a difference it will make. Holding out for more benchmarks.


----------



## ZealotKi11er

Quote:


> Originally Posted by *RebelHell*
> 
> New to the 1080 ti owner's club. I just submitted my validation form. I actually bought the first 1080 ti last month along with an i7 7700k and Z270 board. Microcenter had a second open box for $100 off this week so I had to pick it up too. I stuck with the Founder's Edition as I have water cooling plans. Two of these in SLI even on air is a huge upgrade to my old 780 ti. So, any issues running these things on 2 8x PCIe 3.0 slots? They seem to be doing okay but I have nothing to compare it to. I've been thinking of switching to X299 or X399 in the near future. Not sure how much of a difference it will make. Holding out for more benchmarks.


As far as gaming goes, you are fine with a 7700K and x8 PCIe 3.0.


----------



## RebelHell

Quote:


> Originally Posted by *ZealotKi11er*
> 
> As far as gaming goes you are fine with 7700K and x8 PCIE 3.0.


That's pretty much what I figured. I was kinda hoping for a reason to drop 1K on a CPU though.


----------



## Rainmaker91

Quote:


> Originally Posted by *RebelHell*
> 
> That's pretty much what I figured. I was kinda hoping for a reason to drop 1K on a CPU though.


Unless you actually run a ton of multi-threaded workloads, your 7700K will be more than plenty. As for X399 and X299: if the only reason to upgrade is PCIe lanes, then an X399 board with the Threadripper 1900X would likely be the best buy (64 lanes at $550 is a no-brainer in my eyes, as long as you can deal with slightly lower clock speeds). That is, unless you actually need 10 cores or more while still running at high clocks, but the use case for that is limited (considering most applications that can use 10 cores effectively can also use 16, which would make the 1950X the best buy).


----------



## 2ndLastJedi

Quote:


> Originally Posted by *RebelHell*
> 
> That's pretty much what I figured. I was kinda hoping for a reason to drop 1K on a CPU though.


A good reason would be so I can have your 7700K








Do you live in Brisbane?


----------



## KCDC

Quote:


> Originally Posted by *erso44*
> 
> Hi,
> 
> I am purchasing a 1080Ti but which manufacturer do you recommend?
> 
> After purchasing I will go for water again


If you're going to put a waterblock on it, use EVGA. They won't void warranty claims if you take the stock cooler off. AFAIK, they're the only company that doesn't care about modding the card. You can go with another company and roll the dice when something happens (like I did, due to impatience), but keep in mind you may not get an RMA handled.


----------



## RebelHell

Quote:


> Originally Posted by *2ndLastJedi*
> 
> A good reason would be so i can have your 7700k
> 
> 
> 
> 
> 
> 
> 
> 
> Do you live in Brisbane?


No, but very close. Ohio.

Seriously though, if I do switch to a HEDT platform, I'll probably use this board and CPU to build a new computer for my sister. She's still running an i7-930 that I built for her years ago. We've upgraded her GPU and put an SSD in it since, but it could still use replacing.


----------



## RebelHell

Quote:


> Originally Posted by *KCDC*
> 
> If you're going to put a waterblock on it, use EVGA. They won't void warranty claims if you take the stock cooler off. AFAIK, they're the only company that doesn't care about modding the card. You can go with another company and roll the dice when something happens (like I did, due to impatience), but keep in mind you may not get an RMA handled.


Asus is another good one. They replaced my 780 Ti 33 months into my 36-month warranty period. The only things they asked were that there be no physical damage and that I reinstall the original cooler before shipping it to them. They even had a fast turnaround.


----------



## KCDC

Quote:


> Originally Posted by *RebelHell*
> 
> Asus is another good one. They replaced my 780 ti 33 months into my 36 month warranty period. The only thing they asked was that there be no physical damage and I had to reinstall the original cooler before shipping it to them. Even had a fast turn around.


Really? Of all the companies, that's pretty surprising. Good to know!


----------



## GreedyMuffin

Quote:


> Originally Posted by *KCDC*
> 
> Really? Of all the companies, that's pretty surprising. Good to know!


That is definitely not the norm. I asked them this when I was shopping for a 980 back in the day. *No way in hell* would they replace the card.




----------



## RebelHell

Quote:


> Originally Posted by *GreedyMuffin*
> 
> That is def. not the norm. Asked them this when I was shopping for a 980 back in the day. *No way in hell* they would replace the card.


I was trying to find a transcript of my RMA request. No luck so far. But I did locate the RMA instructions they sent me.


This was a generic form letter so I assume it's what they send everyone. Doesn't say the cooler can't be removed, just that it must be on for RMA.

EDIT:
Apparently the screenshot is completely illegible. It says...

Special Handling Instructions

Please review our product-specific handling instructions before proceeding to ensure if any associated accessories need to be sent in with your product to the repair facility for adequate testing and repair. You may be required to return accessories with your product. Do not include any other accessories, otherwise they will not be returned or replaced. Customers requesting accessories to be returned will be responsible for shipping cost. To view product-specific handling instructions for RMA on our website, click here.

**Instructions are valid for US and Canada RMA only**

Please read carefully for items that may be required to be sent in with your product.

Graphic Cards:

Please return with the original ASUS heat sink, otherwise it will be considered as physical damage


----------



## Rainmaker91

Quote:


> Originally Posted by *RebelHell*
> 
> I was trying to find a transcript of my RMA request. No luck so far. But I did locate the RMA instructions they sent me.
> 
> 
> This was a generic form letter so I assume it's what they send everyone. Doesn't say the cooler can't be removed, just that it must be on for RMA.
> 
> EDIT:
> Apparently the screenshot is completely illegible. It says...
> 
> Special Handling Instructions
> 
> Please review our product-specific handling instructions before proceeding to ensure if any associated accessories need to be sent in with your product to the repair facility for adequate testing and repair. You may be required to return accessories with your product. Do not include any other accessories, otherwise they will not be returned or replaced. Customers requesting accessories to be returned will be responsible for shipping cost. To view product-specific handling instructions for RMA on our website, click here.
> 
> **Instructions are valid for US and Canada RMA only**
> 
> Please read carefully for items that may be required to be sent in with your product.
> 
> Graphic Cards:
> 
> Please return with the original ASUS heat sink, otherwise it will be considered as physical damage


I actually looked into this specifically at one point, and as far as I could see there are two manufacturers that will still honour the warranty if the warranty seal is broken. The first and obvious one is EVGA, which all but encourages you to tinker with the cards (not really, but far more so than the rest). The main problem is that EVGA only supplies Nvidia cards, which is fine if that is what you want, but sucks otherwise. So I checked around, and the only company that does this on AMD cards, and presumably also on Nvidia cards, is MSI. No luck with Sapphire, Gigabyte, and most definitely not ASUS (they are in fact very clear that they won't honour the warranty).

I checked with quite a few others, but generally speaking they won't honour the warranty if the sticker on the screw is "broken" or removed. So... yeah.


----------



## KraxKill

Quote:


> Originally Posted by *Rainmaker91*
> 
> I actually specifically looked in to this at one point and as far as I could see there are 2 manufacturers that will still honour the warranty if the warranty seal is broken. The first and obvious one is EVGA, which almost encourages you to toy with the cards (not really, but far more so than the rest). The main problem with that is that EVGA only supplies Nvidia cards which is fine if that is what you want, but sucks otherwise. So I checked around and the only company that does this on AMD cards, and presumably also on Nvidia cards is MSI. No luck with Sapphire, Gigabyte and most definitely not ASUS (they are in fact very clear that they won't honour the warranty).
> 
> I checked with quite a few others, but generally speaking they won't honour the warranty if the sticker on the screw is "broken" or removed. So... yeah.


It doesn't mean they won't honor the warranty. It means they can choose not to honor the warranty. Having a broken sticker doesn't break the card.

Now, if they find a broken sticker, a bunch of stripped screws, and a bunch of oxidized water damage, then yeah, good luck.


----------



## Rainmaker91

Quote:


> Originally Posted by *KraxKill*
> 
> It doesn't mean they won't honor the warranty. It means they can choose not to honor the warranty. Having a broken sticker doesn't break the card.


I know, but I contacted the manufacturers and they specifically told me that they won't honour a warranty on a card that has a broken sticker. EVGA and MSI are the exception to this from what I could find out. Though your experience may vary based on the specific personnel that you get in contact with.


----------



## KraxKill

Quote:


> Originally Posted by *Rainmaker91*
> 
> I know, but I contacted the manufacturers and they specifically told me that they won't honour a warranty on a card that has a broken sticker. EVGA and MSI are the exception to this from what I could find out. Though your experience may vary based on the specific personnel that you get in contact with.


Well yeah. I'm surprised you got the answers you got from EVGA and MSI. Perhaps if you called them again you'd get a different answer? You're asking them a loaded question regardless of their policy, so I'm surprised they told you they would honor the warranty.

I think it's in every manufacturer's interest to make sure that their customers get a chance to enjoy their hardware as they see fit, within reason. They put up a lot of red tape to protect themselves from false warranty claims and to place themselves in a position to deny. It lets them set the right expectations and over-deliver, versus the other way around. It's not simply as clear cut as having the sticker intact or not. What if the sticker is missing entirely?

I'm just suggesting that if they determine the card to be dead due to "natural causes" they will most likely honor your claim regardless of the sticker, but they will never tell you this in a public forum. I don't blame them. What if you carefully peel off the sticker and re-stick it?

My point is, people are making too much fuss over the warranty sticker. If your intentions are sincere and the failure is not a fault of your own doing, the manufacturer will treat each case on its own merit.

So I would suggest choosing a card based on hardware specs and functionality for your purposes rather than the remote possibility of having to process a warranty claim.


----------



## Rainmaker91

Quote:


> Originally Posted by *KraxKill*
> 
> Well yeah. I'm surprised you got the answers you got from EVGA and MSI. Perhaps if you called them again you'd get a different answer? You're asking them a loaded question regardless of their policy, so I'm surprised they told you they would honor the warranty.
> 
> I think it's in every manufacturer's interest to make sure that their customers get a chance to enjoy their hardware as they see fit, within reason. They put up a lot of red tape to protect themselves from false warranty claims and to place themselves in a position to deny. It lets them set the right expectations and over-deliver, versus the other way around. It's not simply as clear cut as having the sticker intact or not. What if the sticker is missing entirely?
> 
> I'm just suggesting that if they determine the card to be dead due to "natural causes" they will most likely honor your claim regardless of the sticker, but they will never tell you this in a public forum. I don't blame them. What if you carefully peel off the sticker and re-stick it?
> 
> My point is, people are making too much fuss over the warranty sticker. If your intentions are sincere and the failure is not a fault of your own doing, the manufacturer will treat each case on its own merit.
> 
> So I would suggest choosing a card based on hardware specs and functionality for your purposes rather than the remote possibility of having to process a warranty claim.


I agree, but there are some manufacturers that will simply refuse outright no matter what. From what I could see of other people's experiences, Asus is among the worst on that particular point. Still, I don't really get why manufacturers trust you not to squash your $1000 CPU but won't let you mess around with an $800 video card. In any case, no one is stopping them from putting the stickers on the cards, but depending on the specific country (and thus the laws in place), they can't outright refuse to honour the warranty based only on that. Some manufacturers still do, simply because the laws are more difficult to enforce than they should be (the US is a good example, since it requires the user to sue the company if something goes wrong).


----------



## GreedyMuffin

Gigabyte does not have a sticker, neither does Zotac. That's why I added them to the list.


----------



## Rainmaker91

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Gigabyte does not have a sticker, neither does Zotac. That's why I added them to the list.


Except it seems to vary greatly. I have no clue about Zotac, but Gigabyte has cards both with and without the sticker (some say they have cards with stickers, but the majority I have seen are without). Either way, I'm not fond of stickers, as it means the company doesn't put any faith in the user. It's why I specifically bought EVGA this time around, even though I had to pay nearly $100 more for it than the cheapest option. Still, I would love to have a manufacturer that would be safe to buy from 100% of the time if I choose to go with AMD next time around.


----------



## GreedyMuffin

Quote:


> Originally Posted by *Rainmaker91*
> 
> Except it seems to vary greatly. I have no clue about Zotac, but Gigabyte has cards both with and without the sticker (some say they have cards with stickers, but the majority I have seen are without). Either way, I'm not fond of stickers, as it means the company doesn't put any faith in the user. It's why I specifically bought EVGA this time around, even though I had to pay nearly $100 more for it than the cheapest option. Still, I would love to have a manufacturer that would be safe to buy from 100% of the time if I choose to go with AMD next time around.


My 980 G1 did not, my Zotac 980 Ti did not, and neither does my 1080 Ti. But I don't know about older cards.


----------



## Rainmaker91

Quote:


> Originally Posted by *GreedyMuffin*
> 
> My 980 G1 did not, Zotac 980Ti did not, neither has my 1080Ti. But I don't know about older cards.


That's good to know at least.


----------



## LunaP

Hey guys, quick question. Just bought 2x 1080 Ti ICX's from EVGA, and wanted to know if anyone here has done a similar/same config to what I'm trying to do.

I'm coming from tri 980 Ti's and planning on keeping one while putting the two 1080 Ti's in SLI, so I can use the 980 Ti for additional monitors. Is this still possible w/ Nvidia cards?

The 980's were on blocks and I plan to do the same w/ the 1080's (using XSPC Razor blocks since I like the aesthetic).

Also, just to be sure (from what I've been reading in this thread): as long as I'm not doing HEAVY OCs with the cards, the ICX should be fine vs the FE, correct? Appreciate any info.


----------



## Rainmaker91

Quote:


> Originally Posted by *LunaP*
> 
> Hey guys quick question, Just bought 2x 1080ti ICX's from eVGA, and wanted to know if anyone here has done a similar/same config that I"m trying to do.
> 
> I'm coming from Tri 980ti's and planning on keeping 1 while putting the 2 1080ti's in SLI, so I can use the 980ti for additional monitors. Is this still possible w/ nvidia cards?
> 
> 980's were on blocks and plan to do the same w/ the 1080's ( using XSPC Razor blocks since I like the aesthetic).
> 
> Also just to be sure ( from what I've been reading on this thread ) as long as I'm not doing HEAVY OC's with the cards then ICX should be fine vs the FE correct? Appreciate any info.


Are you sure those blocks fit the ICX version? If it's the "SC Black Edition" then it will fit, but if it's the SC2 then most blocks won't fit.


----------



## LunaP

Quote:


> Originally Posted by *Rainmaker91*
> 
> You sure those blocks fit the ICX version? if it's the "SC Black edition" then it will fit, but if it's the SC2 then for most blocks it won't fit.


It doesn't say SC anywhere on the box, and I've no clue; I'm still researching, hence my previous question, since if it's not possible I'll probably just end up returning them anyway.


----------



## Rainmaker91

Quote:


> Originally Posted by *LunaP*
> 
> Doesn't say SC anywhere on the box, and I've no clue, I'm still researching, hence my previous question, since if its not possible I'll probably just end up returning them anyways.


I still have my box somewhere if you need to compare. I'll list the cards below, though keep in mind that EVGA's site for some reason has several pages showing the same card:

- EVGA GeForce GTX 1080 Ti Black Edition GAMING, EVGA GeForce GTX 1080 Ti SC Black Edition GAMING: these two should be the same card from what I can see, though one might just be clocked higher. They should fit any "reference" Titan block, but not necessarily a 1080 Ti block, as they have an extra DVI port.
- EVGA GeForce GTX 1080 Ti SC2 ELITE GAMING, EVGA GeForce GTX 1080 Ti SC2 GAMING, EVGA GeForce GTX 1080 Ti iCX GAMING: again, from what I can see these 3 cards should have the same PCB, but may be clocked differently. There are currently no full-cover blocks that fit them as far as I know. The reason they don't fit a reference Titan block is that they have an extra connector placed close to the vRAM.
- EVGA GeForce GTX 1080 Ti FTW3 DT GAMING, EVGA GeForce GTX 1080 Ti FTW3 GAMING, EVGA GeForce GTX 1080 Ti FTW3 ELITE GAMING: again, as far as I know these 3 are the same, but are far from a reference PCB. EKWB does however have full-cover blocks that will fit them.

Keep in mind that I could be way off with this, but from what I understand the "Black Edition" cards with the ICX cooler should be safe for a Titan X 2016 block.

As for the other question... I really have no clue, but I hope someone else will give you a sufficient answer.
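Condensing the groupings above into a quick lookup (just a sketch in Python; the categories are my reading of EVGA's lineup from this post, so verify the actual part number on EVGA's site before ordering anything):

```python
# Rough lookup of EVGA 1080 Ti variants -> full-cover waterblock situation,
# condensed from the groupings in this post. A starting point, not gospel.
BLOCK_COMPAT = {
    "Black Edition":     "Titan X Pascal (2016) reference blocks fit (extra DVI port)",
    "SC Black Edition":  "Titan X Pascal (2016) reference blocks fit (extra DVI port)",
    "SC2":               "no known full-cover block (extra connector near the vRAM)",
    "SC2 ELITE":         "no known full-cover block (extra connector near the vRAM)",
    "iCX":               "no known full-cover block (extra connector near the vRAM)",
    "FTW3":              "custom PCB; EKWB makes a dedicated FTW3 block",
    "FTW3 DT":           "custom PCB; EKWB makes a dedicated FTW3 block",
    "FTW3 ELITE":        "custom PCB; EKWB makes a dedicated FTW3 block",
}

def block_for(variant: str) -> str:
    """Return the block-compatibility note for a variant, or a fallback."""
    return BLOCK_COMPAT.get(variant, "unknown: check the part number on EVGA's site")

print(block_for("SC2"))
```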


----------



## LunaP

Quote:


> Originally Posted by *Rainmaker91*
> 
> I still have my box somewhere if you need to compare, I'll list the cards below though keep in mind that for some reason there are several pages showing the same card for some reason on EVGAs site:
> 
> 
> 
> 
> 
> 
> EVGA GeForce GTX 1080 Ti Black Edition GAMING, EVGA GeForce GTX 1080 Ti SC Black Edition GAMING. These two should be the same card from what I can see, though one might just be clocked higher, they should fit with any "reference" titan blocks, but not necessarily a 1080ti block as they have an extra DVI port.
> EVGA GeForce GTX 1080 Ti SC2 ELITE GAMING, EVGA GeForce GTX 1080 Ti SC2 GAMING, EVGA GeForce GTX 1080 Ti iCX GAMING. Again from what I can see these 3 cards should have the same PCB, but may be clocked differently. There are currently no full cover blocks that fit them as far as I know. The reason they don't fit a reference titan block is because they have an extra connector placed close to the vRAM.
> EVGA GeForce GTX 1080 Ti FTW3 DT GAMING, EVGA GeForce GTX 1080 Ti FTW3 GAMING, EVGA GeForce GTX 1080 Ti FTW3 ELITE GAMING. Again as far as I know these 3 are the same, but are far from a reference PCB. EKWB does however have full cover blocks that will fit them.
> Keep in mind that I could be way off with this, but from what I understand the "black edition" cards with the ICX cooler should be safe for a titan x 2016 block.
> 
> As for the other question... I really have no clue, but hope someone else will give you a sufficient answer
> 
> 


Mmmm, I can't find any info online about it either. Here are the ones I purchased (bad angle, I know; I'm at work atm, so I'll take another from home).



*Edit* Whoops, forgot I took a pic earlier when I was verifying with a friend which one they recommended.


----------



## Rainmaker91

Quote:


> Originally Posted by *LunaP*
> 
> mmmm I can't find any info online about it either. Here's the ones I purchased, (bad angle I know, I'm at work atm so I'll take another from home)


Looks to me like it's the one that won't fit; the "Black Edition" doesn't say ICX anywhere in its name, so that's just something to keep in mind. Check the specific part number against EVGA's site so you're sure. Keep in mind that they are still terrific cards, I just don't know of any full-cover blocks that will fit them.


----------



## LunaP

Quote:


> Originally Posted by *Rainmaker91*
> 
> Looks to me like it's the one that won't fit, the "Black edition" doesn't say ICX anywhere in it's name as well so that's just something to keep in mind. Check up the specific part number towards EVGAs site, that way you're sure. Keep in mind that they are still terrific cards, I just don't know of any full cover blocks that will fit them.


Ahhh you responded as I edited my post!


----------



## stdout

Quote:


> Originally Posted by *alanthecelt*
> 
> the games which could do with the extra performance from SLI... have negligible gains
> for example... ghost recon wildlands which might sit at upper 40fps (off top of head on ultra) will gain like 3 fps in sli
> 
> Ive been in the same boat as you, and i have also weighed up the pro's and cons of titan Xp and TI's in SLI
> so
> i stopped chasing maxxed out 60fps
> however i am now on two monitors. a 4k and a lovely 1440 gsync ultrawide, and found myself using the ultrawide more than i anticipated... the ultrawide will scale with any 11 series card when it becomes available


I have 2 1080 Ti FE's with the Poseidon BIOS and EK waterblocks. In Ghost Recon Wildlands, adding a second 1080 Ti almost doubles my fps with everything on custom (ultra plus the last bits): I do well over 100 FPS (last run was 107) at 2560x1440.

With 1 card it is 57 FPS.

I have not had one game glitch yet. Sometimes I play SC2, and that game only utilizes one card, but no errors. WoW uses both cards, with much improvement over 1 card but far from double (again, all settings on ultra): ~70 FPS to 110+ FPS.

I have never run SLI before but a lot of CrossFire. I have never had issues. Just my 5 cents.
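For what it's worth, plugging those FPS figures into a quick scaling calculation (the numbers are from my runs above, so your mileage will vary with settings and system):

```python
# Quick SLI scaling check from the FPS figures in this post.
def sli_scaling(single_fps: float, dual_fps: float) -> float:
    """Return the second card's effective contribution (1.0 = perfect scaling)."""
    return dual_fps / single_fps - 1.0

wildlands = sli_scaling(57, 107)   # Ghost Recon Wildlands: ~0.88, i.e. ~88% from card #2
wow = sli_scaling(70, 110)         # WoW: ~0.57, i.e. ~57% from card #2
print(f"Wildlands: +{wildlands:.0%}, WoW: +{wow:.0%}")
```

So Wildlands is close to ideal scaling here, while WoW gains a lot but is nowhere near double, which matches the "far from double" note above.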


----------



## Lefty23

Quote:


> Originally Posted by *LunaP*
> 
> mmmm I can't find any info online about it either. Here's the ones I purchased, (bad angle I know, I'm at work atm so I'll take another from home)
> 
> *Edit* Whoops forgot I took a pic prior when I was verifying with a friend on which they recommended.


Looks like it is this card:
EVGA 1080 TI SC2 (11G-P4-6593)

Unfortunately, EKWB doesn't have, and doesn't plan on making, a block for it.
Check here for the reason why the FE block doesn't fit (a fan header, according to their rep):
http://www.overclock.net/t/1629771/evga-1080-ti-sc2/0_100

Maybe you can use this info to check with other providers (XSPC, Swiftech, any others?) for compatibility.
Though I would think they'll have the same problem as EKWB, at least with their basic 1080 Ti block.


----------



## LunaP

Quote:


> Originally Posted by *Lefty23*
> 
> Looks like it is this card:
> EVGA 1080 TI SC2 (11G-P4-6593)
> 
> Unfortunately EKWB don't have or plan on making a block for it.
> Check here for reason why FE block doesn't fit (a fan header according to their rep):
> http://www.overclock.net/t/1629771/evga-1080-ti-sc2/0_100
> 
> Maybe you can use this info to check with other providers (XSPC, Swiftech, any other?) for compatibility.
> Though I would think they'll have the same problem as EKWB, at least on their basic 1080ti block.


You're 100% right. I just finished looking up the card and it is indeed https://www.evga.com/products/product.aspx?pn=11G-P4-6593-KR

I'd LIKE to stick w/ XSPC if possible, given my build and that I'll be keeping the waterblock on the 980 Ti to have a clean bridge going across; however, IF need be I can go w/ EK.

You guys mentioned the Black Edition WOULD take the block? If so, which exact block was it, if anyone has a link? I'm looking to order today, and I do see the store still has Blacks in stock; since I haven't removed mine from the packaging yet, they'll allow a return/exchange.


----------



## webhito

Yea, it seems the SC2 has no waterblock available; the Black seems to be identical to the FE, so it should be compatible.

http://www.overclock.net/t/1627564/waterblock-for-evga-gtx-1080-ti-sc-black-edition


----------



## Rainmaker91

Quote:


> Originally Posted by *LunaP*
> 
> You're 100% I just finished looking up the card it is indeed https://www.evga.com/products/product.aspx?pn=11G-P4-6593-KR
> 
> I'd LIKE to stick w/ XSPC if possible given my build and that I'll be keeping the waterblock on the 980ti to have a clean bridge going across however IF need be I can go w/ EK.
> 
> You guys mentioned the black edition WOULD take the block? If so which exact block was it if anyone has a link? I'm looking to order today and I do see the store still has blacks in stock, and since I haven't removed mine from the packaging yet they'll allow for a return/exchange.


Quote:


> Originally Posted by *webhito*
> 
> Yea, it seems the sc2 has no waterblock available, the black seems to be identical to the fe so it should be compatible.
> 
> http://www.overclock.net/t/1627564/waterblock-for-evga-gtx-1080-ti-sc-black-edition


Not entirely; it has the extra DVI port like the Titan. So just to clarify: Titan X Pascal blocks (not 2017 Titan Xp blocks) will fit. Blocks made specifically for the 1080 Ti or the 2017 Titan Xp will not fit. A good example would be to look at Watercool's blocks.
This block will NOT fit (the plate extends too far and interferes with the DVI):


While this block WILL fit (again, it has a cut-out for the DVI port):


The Black Edition is essentially a Titan X 2016 PCB with a single memory chip removed, so all Titan X 2016 blocks will fit without any real issues. The SC2 is essentially the same PCB as the Black Edition with one crucial difference: a fan/light/sensor connector that obstructs a reference block.
Here is a picture showing the specific connector that makes it incompatible:


----------



## Lefty23

Quote:


> Originally Posted by *LunaP*
> 
> You guys mentioned the black edition WOULD take the block? If so which exact block was it if anyone has a link? I'm looking to order today and I do see the store still has blacks in stock, and since I haven't removed mine from the packaging yet they'll allow for a return/exchange.


I have been using EK the past few years so not sure about other providers.

It looks like these two are supported:
https://www.ekwb.com/configurator/step1_complist?gpu_gpus=3882
https://www.ekwb.com/configurator/step1_complist?gpu_gpus=3784

You can use this to see if EKWB has a block available for a specific card:
https://www.ekwb.com/configurator/
Just do a quick search with the hardware you are interested in.

I think I saw in this thread (sometime in the last couple of weeks) a link where you could check for blocks available from all vendors, but I may be wrong.
I'm on my phone right now so I cannot really check.

edit: OK, back at my PC, and it seems I was wrong there ^
Went back to the threads I read and it was a reference to the EKWB configurator.
It was just to note that if the configurator says the card you are interested in uses a reference PCB, it will be easier to find a block from the different vendors.
Just like other members mentioned.


----------



## webhito

Quote:


> Originally Posted by *Rainmaker91*
> 
> Not entirely, it has the extra DVI port like the titan. So just to clarify, the Titan X Pascal (not the Titan Xp 2017 edition) will fit. Blocks made specifically for the 1080Ti or the 2017 Titan Xp will not fit. A good example would be to look at Watercool's blocks:
> This block will NOT fit (the plate extends to far and interferes with the DVI):
> 
> While this block WILL fit (again, it has a cut-out for the DVI port):
> 
> The black edition is essentially a Titan X 2016 PCB with a single memory chip removed so all Titan X 2016 blocks will fit without any real issues. The SC2 is essentially the same PCB as the Black edition with one crucial difference, a fan/light/sensor connector that obstructs a reference block.
> Here is a picture showing the specific connector that makes it incompatible:


Right, nice catch!


----------



## LunaP

Quote:


> Originally Posted by *Rainmaker91*
> 
> 
> 
> 
> 
> 
> Not entirely, it has the extra DVI port like the titan. So just to clarify, the Titan X Pascal (not the Titan Xp 2017 edition) will fit. Blocks made specifically for the 1080Ti or the 2017 Titan Xp will not fit. A good example would be to look at Watercool's blocks:
> This block will NOT fit (the plate extends to far and interferes with the DVI):
> 
> 
> While this block WILL fit (again, it has a cut-out for the DVI port):
> 
> 
> The black edition is essentially a Titan X 2016 PCB with a single memory chip removed so all Titan X 2016 blocks will fit without any real issues. The SC2 is essentially the same PCB as the Black edition with one crucial difference, a fan/light/sensor connector that obstructs a reference block.
> Here is a picture showing the specific connector that makes it incompatible:
> 
> 
> 
> ]


I wanna be sure given the cards I have now, since I know the shipping cutoff ends soon, so I'm hoping for any solid info before then so I can grab these today.

Hmm looking @ XSPC and they do have the Titan X block, looking @ their compatibility list here https://static1.squarespace.com/static/51998404e4b0ef02d1bd9c2c/t/55dbca78e4b0a59016e1d18a/1440467576184/compatibility+gtx+980ti.pdf

Here's the block http://www.xs-pc.com/waterblocks-gpu/razor-gtx-980-980-ti

If we're 100% on this, then I won't even have to buy new blocks, since I can just move the ones on my current 980 Ti's over to the 1080 Ti's lol.


----------



## Rainmaker91

Quote:


> Originally Posted by *LunaP*
> 
> You mentioned the SC2 in here so I'm almost curious if that would work w/o having to return mine, though if not then I'd have to go w/ the EK block, but I wanna be sure given the cards I have now. I know shipping cutoff ends soon so hoping for any solid info before then so I can grab these today.
> 
> Hmm looking @ XSPC and they do have the Titan X block, looking @ their compatibility list here [from the 9xx / Titan compatibility list].
> 
> Here's the block http://www.xs-pc.com/waterblocks-gpu/razor-gtx-980-980-ti


I don't think I quite got my point across: the SC2 won't fit any block that I know of, and the SC2 picture was added to show what makes it not work.

As for the "Black Edition" mentioned in post 13171: ANY block made for the original Titan X Pascal (not to be confused with the Titan Xp, which was released in 2017) will work with it. It doesn't matter if it's made by EKWB, Watercool, Alphacool, Aqua Computer, Swiftech, XSPC, Bitspower, Phanteks, or anyone else for that matter. If it fits the original Titan X Pascal from 2016, then it will fit the EVGA "Black Edition".


----------



## LunaP

Quote:


> Originally Posted by *Rainmaker91*
> 
> I don't quite think I got my point across. the SC2 won't fit any block that I know of, and the SC2 picture was added to show what makes it so that it doesn't work.
> .


Idk HOW you even caught that, as it was a split-second post/edit to remove that after I misread 1 line LOL, all within 3-5 seconds total.


----------



## Rainmaker91

Quote:


> Originally Posted by *LunaP*
> 
> Idk HOW you even caught that as it was a split second post/edit to remove that as I misread 1 line LOL all within 3-5 seconds total.


I'm just that fast







Also, I'm sort of bored, so I caught it quickly.

Still, I have no clue if the 980 Ti blocks are compatible with Pascal cards. I simply researched the specific EVGA cards recently, as I bought one last week. If they are, that's great, but I have my doubts. Either way, Nvidia is not making it easy to differentiate between their Titans when it comes to card generations, but do keep in mind that the blocks I know for sure will fit are the Titan X Pascal (released in 2016) blocks, not the Titan X Maxwell (released in 2015) blocks. I have no clue if they are interchangeable or not.


----------



## LunaP

Quote:


> Originally Posted by *Rainmaker91*
> 
> I'm just that fast
> 
> 
> 
> 
> 
> 
> 
> also I'm sort of bored, so I caught it quickly.
> 
> Still, I have no clue if the 980Ti blocks are compatible with Pascal cards. I simply researched around the specific EVGA cards recently as I bought one last week. If they are then that's great, but I have my doubts. Either way Nvidia is not making it easy to differentiate between their Titans when it comes to card generations, but do keep in mind that the blocks that I know for sure will fit is the Titan X Pascal (released in 2016) blocks, and not the Titan X Maxwell (released in 2015) blocks. I have no clue if they are interchangeable or not.


Ahh, that 9xx series is probably referring to the Maxwell one then.

The Nvidia GeForce GTX Titan X Pascal 12GB is the only model I see on the 1080/1080 Ti listing for compatibility on the XSPC site.

Either way, I have to return the 2 cards I got today and exchange them. So if I get the SC2 Black I gotta somehow know IF the XSPC will fit or not, and if not then it's EK for me, unless someone here recommends BP instead. Hoping to hear from other SC2 Black Edition owners which blocks they used. I also hear that some EK blocks prevent using the HB bridge since they don't leave room for it.


----------



## Rainmaker91

Quote:


> Originally Posted by *LunaP*
> 
> Ahh, that 9xx series is probably referring to the Maxwell one then.
> 
> The Nvidia GeForce GTX Titan X Pascal 12GB is the only model I see on the 1080/1080 Ti listing for compatibility on the XSPC site.
> 
> Either way, I have to return the 2 cards I got today and exchange them. So if I get the SC2 Black I gotta somehow know IF the XSPC will fit or not, and if not then it's EK for me, unless someone here recommends BP instead. Hoping to hear from other SC2 Black Edition owners which blocks they used. I also hear that some EK blocks prevent using the HB bridge since they don't leave room for it.


There is an SC version and a non-SC version; the only real difference is the clocks as far as I can see. The card that you currently have is the only SC2 card as far as I know, and as mentioned, it won't fit any of the blocks that I know of.

If you prefer XSPC then by all means get the Razor RGB - GTX 1080 / 1080 Ti or the Razor GTX 1080 / 1080 Ti with or without a Razor GTX 1080 / 1080 Ti Backplate.

If you want to go with a different company then that works as well, there is really no holding you back.

As for running the 980 Ti alongside the 1080 Tis... I honestly have no clue, but you should get that answer before making any decisions on that.


----------



## LunaP

Quote:


> Originally Posted by *Rainmaker91*
> 
> there is a SC version and a non SC version, only real difference is the clock as far as I can see. The card that you currently have is the only SC2 card as far as I know, and as mentioned, it won't fit any of the blocks that I know of.
> 
> If you prefer XSPC then by all means get the Razor RGB - GTX 1080 / 1080 Ti or the Razor GTX 1080 / 1080 Ti with or without a Razor GTX 1080 / 1080 Ti Backplate.
> 
> If you want to go with a different company then that works as well, there is really no holding you back.
> 
> As for running the 980Ti along side the 1080Tis... I honestly have no clue, but you should get that answer before making any decisions on that.


So this one https://www.evga.com/products/product.aspx?pn=11G-P4-6393-KR should be fine w/ the blocks you mentioned then correct?


----------



## Rainmaker91

Quote:


> Originally Posted by *LunaP*
> 
> So this one https://www.evga.com/products/product.aspx?pn=11G-P4-6393-KR should be fine w/ the blocks you mentioned then correct?


As far as I can see, it mentions it supports the Titan X Pascal and it does have the necessary cut-out for the DVI port.


----------



## LunaP

Quote:


> Originally Posted by *Rainmaker91*
> 
> As far as I can see, it mentions it supports the Titan X Pascal and it does have the necessary cut-out for the DVI port.


Do you guys have a diagram for what sized thermal pads to put where btw? Gonna go w/ fuji-poly.


----------



## Rainmaker91

Quote:


> Originally Posted by *LunaP*
> 
> Do you guys have a diagram for what sized thermal pads to put where btw? Gonna go w/ fuji-poly.


Depends on the specific block. It should be mentioned that you really don't need Fujipoly for it to perform well, but if you want it then just check what size pads come with the specific block you're after. For the most part it's 0.5mm and 1.0mm, so if you get those I think you will be fine.

Backplates do tend to use significantly thicker pads though.


----------



## GreedyMuffin

Are new pads much better tho?

My 1080 Ti is now running at 40-41°C under load at 1960/0.900V.

I can also run at 2050/1.025V, but that means more heat and 12% higher temps. Not worth it for 90MHz IMHO.

Anybody know when next-gen cards will come out?

I'm using the FE backplate on my EK block.

Loving this MO-RA3. Fans at 500-550 RPM under load and awesome temps. As quiet as you'll get. About 62°C max on the CPU (7800X at 4600). 69°C at 4800. Not tested higher; will delid and maybe push for 4900/5000MHz.


----------



## LunaP

Quote:


> Originally Posted by *Rainmaker91*
> 
> Depends on the specific block. It should be mentioned that you really don't need fujipoly for it to perform well, but if you want that then you just have to check what size pads come with the specific block you want. For the most part it's 0.5mm and 1.0mm so if you get those I think you will be fine.
> 
> Backplates does tend to use significantly thicker pads though.


I meant for non-documented ones. XSPC/EK usually have a set area to put them on, but since the Titan/6xx/7xx series I've noticed that people recommend a 0.5mm pad on some areas that aren't normally covered/mentioned by the block manufacturer but still help. Never had to put pads on the backplate though, just the front side, iirc. Though it's been a while.


----------



## ZealotKi11er

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Are new pads much better tho?
> 
> My 1080 Ti is now running at 40-41°C under load at 1960/0.900V.
> 
> I can also run at 2050/1.025V, but that means more heat and 12% higher temps. Not worth it for 90MHz IMHO.
> 
> Anybody know when next-gen cards will come out?
> 
> I'm using the FE backplate on my EK block.
> 
> Loving this MO-RA3. Fans at 500-550 RPM under load and awesome temps. As quiet as you'll get. About 62°C max on the CPU (7800X at 4600). 69°C at 4800. Not tested higher; will delid and maybe push for 4900/5000MHz.


That is a very good OC at 0.9V. I run 1860/0.9 and 2025/1.0.


----------



## PimpSkyline

If the Black Edition Ti is an FE PCB and you're going WB, I will buy one of the coolers off you. PM me, thx.


----------



## ZealotKi11er

Quote:


> Originally Posted by *PimpSkyline*
> 
> If the Black Edition Ti is an FE PCB and you're going WB, I will buy one of the coolers off you. PM me, thx.


I am pretty sure you still need the cooler for RMA so not worth it.


----------



## GraphicsWhore

Decided to mess around with undervolting again today, as earlier attempts were kind of half-assed and I was focused on OC'ing. I have to say I'm surprised/impressed.

0.993V
2012
6039

Starting with a relatively high min temp, so it's getting up there a bit, but I've been running various benchmarks throughout the evening. Scores are all down quite a bit, so I'm still not able to pull off the magic some of you have with your cards (keeping scores almost the same as your OC profiles), but I'm not complaining!


----------



## LunaP

Quote:


> Originally Posted by *Rainmaker91*
> 
> As far as I can see, it mentions it supports the Titan X Pascal and it does have the necessary cut-out for the DVI port.


Thank you btw for saving me $$$$, since once opened I can't return the GPUs lol. Just finished exchanging them at the store. It doesn't even say BLACK anywhere on the box unless you look up the model lol...



Putting in my order for the blocks now (https://www.amazon.com/Razor-COMBO-Water-Block-Backplate/dp/B06Y2G8PV9/ref=sr_1_1?ie=UTF8&qid=1502767889&sr=8-1&keywords=xspc+1080ti+waterblock). I'll report back.


----------



## KCDC

Ok, somewhere in these recent pages someone mentioned flashing the Poseidon BIOS to their cards and seeing a nice return on water. Is this confirmed, or is XOC still the one to use for FE cards on water and SLI?


----------



## ZealotKi11er

Quote:


> Originally Posted by *KCDC*
> 
> Ok, somewhere in these recent pages, someone mentioned flashing the Poseidon BIOS to their cards and saw a nice return on water. Is this confirmed, or is XOC still the one to use for FE cards on water and SLI?


For FE you want to stick with the FE BIOS. Other stuff gets you lower scores.


----------



## LunaP

Quote:


> Originally Posted by *ZealotKi11er*
> 
> For FE you want to stick with FE Bios. Other stuff gets you lower scores.


Any recommended BIOS swaps for the SC Black Edition?


----------



## GraphicsWhore

Quote:


> Originally Posted by *ZealotKi11er*
> 
> For FE you want to stick with FE Bios. Other stuff gets you lower scores.


Not the experience I and some others have had. XOC bumped my scores up a decent amount.


----------



## CR-Pascal

I'm looking for advice on which BIOS to flash on my card.

I have an EVGA FE mounted with an Arctic Accelero Xtreme III and two Noctua NF-P12 PWM fans running on the card's fan header. The card idles at around 36 degrees, but because of the minimum fan speed set in the FE BIOS, the fans run at around 1100 RPM, which is way too loud. It sits in an Ncase M1.

So basically I want to flash a different BIOS to give me full control of the fan speed. The goal is not overclocking but silence, so I don't need a higher power limit.

I thought about going with the Strix BIOS, but then read in a few places that there seem to be performance issues with it on FE cards. Are there any FE BIOSes that have full-range control of fan speed?


----------



## ZealotKi11er

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Not the experience I and some others have had. XOC bumped my scores up a decent amount.


I benched Superposition 4K. The FE BIOS got me the same score at 2025/1.000V as the XOC BIOS did at 2100/1.093V.


----------



## Alex132

Quote:


> Originally Posted by *CR-Pascal*
> 
> I'm looking for advice on which BIOS to flash on my card.
> 
> I have an EVGA FE edition mounted with an Arctic Accelero Extreme III and two Noctua NF-P12-PWM running on the cards fanheader. The card idles at around 36 degrees, but because of the minimum fan speed set in the FE BIOS, the fans are running at around 1100 RPM, which is way to loud. It sits in a Ncase M1.
> 
> So basically I want to flash a different bios to give me full control of the fan speed. The goal is not overclocking but silence, so I don't need a higher power limit.
> 
> I tought about going with the Strix BIOS, but then read a few places that there seems to be performance issues with this on FE cards. Are there any FE BIOS's that have full range control of fan speed?


What I did on my old graphics card was plug the fans into a mobo header and control them with SpeedFan. Would that be possible for you?
If the connector is different, it's possible to just remove the 4 pins and place them in a standard 4-pin fan header (I did this with my 5870 Accelero Xtreme).


----------



## CR-Pascal

Quote:


> Originally Posted by *Alex132*
> 
> What I did on my old graphics card was plug in the fans to a mobo header and control them with speedfan. Would that be possible for you?
> If the connector is different, it is possible to just remove the 4 pins and place them in a standard 4pin fan header (I did this with my 5870 accelero extreme).


I guess that is also an option. I wasn't sure how the graphics card would react if there were no fan attached, but the Noctua fans will plug right into the motherboard PWM header.


----------



## webhito

A few days ago I posted that I was having trouble with my SLI. For some reason activating it doesn't work on the first try; after 3 tries it will, and then it works without a single issue. However, if I reboot and deactivate it, it happens again, and it happens in any slot on the motherboard, be it 16x or 8x.
Leaving a video here for educational purposes lol, and an apology for the birdie at the end.

Troubleshooting-wise I have deactivated my antivirus, changed power plans, pretty much turned off all unwanted software running in the background, and changed driver versions. It's a fresh Windows install, as it's the third board I have received from Amazon already (the first two had dead RAM slots). I already reported this to Amazon and they are waiting for me to make a decision; I can have yet another one shipped to me or get a refund...


----------



## GraphicsWhore

Quote:


> Originally Posted by *webhito*
> 
> A few days ago I posted that I was having trouble with my sli, for some reason activating it doesn't work on the first try, after 3 times it will, and then works with not a single issue, however if I reboot and deactivate it happens again, it happens in any slot on the motherboard, be it 16x or 8x.
> Leaving a video here for educational purposes lol, an apology for the birdie at the end.
> 
> Troubleshooting wise I have deactivated my antivirus, changed power plans, pretty much turned off all unwanted software running in the background, changed driver versions. Its a fresh windows install as its the third board I have received from amazon already ( first two had dead ram slots ). I already reported this to amazon and they are waiting for me to make a decision, I can have yet another one shipped to me or get a refund...


I'd be more inclined to suspect one of the cards (or both if you're really unlucky) than anything else.

I assume the board would be replaced again with the same model? Maybe get a refund and try a completely different model/brand. If it's the same issue, it's gotta be GPU-related.


----------



## webhito

Quote:


> Originally Posted by *GraphicsWhore*
> 
> I'd be more inclined to suspect one of the cards (or both if you're really unlucky) than anything else.
> 
> I assume the board would be replaced again with the same model? Maybe get a refund and try a completely different model/brand. If same issue, gotta be GPU-related.


Forgot to mention, both cards work fine. I had a Sabertooth X99 and it was working perfectly fine, but yeah, I also have that option; I would probably grab a Rampage V Edition 10 as it's even cheaper now.


----------



## GraphicsWhore

Quote:


> Originally Posted by *webhito*
> 
> Forgot to mention, both cards work fine, I had a sabertooth x99 and it was working perfectly fine, but yea, I also have that option, would probably grab a rampage v edition 10 as its even cheaper now.


Now that you mention it, I do remember you saying both cards worked in SLI on the X99.

Yeah, a different board seems like the next logical step.

I assume you've tried beta drivers? Rolling back the driver? BIOS flashed? Don't recall all your previous posts.


----------



## webhito

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Now that you mention I do remember you saying both cards worked in SLI on the X99.
> 
> Yeah a different board seems like the next logical step.
> 
> I assume you've tried beta drivers? Rolling back driver? BIOS flashed?.Don't recall all your previous posts.


Yea, BIOS flashed, rolled back drivers. I hate beta drivers so I steer far away from those, but other than that yes, it's the only option I have left, or use the credit for a Threadripper and sell my 5960X lol. The only thing that stops me from doing that is that there are no air coolers available and I don't do water.


----------



## outofmyheadyo

What's wrong with the U14S TR4?


----------



## webhito

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Whats wrong with u14s Tr4?


Not available in Mexico.

Not a single air cooler is available here. Well, neither are the boards and processors, but those I can have Amazon import; the air cooler is nowhere to be found.


----------



## outofmyheadyo

If u can import the mb and cpu, why not the cooler?


----------



## webhito

Quote:


> Originally Posted by *outofmyheadyo*
> 
> If u can import the mb and cpu why not the cooler ?


Because it's not available to import. =P


----------



## Fediuld

Sorry if I'm hijacking the discussion, but for custom watercooling, which one do you believe would OC better: the EVGA GeForce GTX 1080 Ti FTW3 GAMING, the GeForce GTX 1080 Ti SC Black Edition GAMING ICX, or the Aorus 1080 Ti Xtreme? Or will all of them clock the same depending on how good the watercooling is, so it's better to buy the cheapest one?

Thanks


----------



## coc_james

@webhito

I had a similar issue. I resolved it by using DDU in safe mode, uninstalling Nvidia Inspector, uninstalling GeForce Experience, and reinstalling the DRIVER ONLY. Seems like Nvidia Inspector and GeForce Experience were breaking my SLI.


----------



## webhito

Quote:


> Originally Posted by *coc_james*
> 
> @webhito
> 
> I had a similar issue. I resolved it by using DDU in safe mode, uninstalling Nvidia inspector, uninstalling geforce experience and reinstalling the DRIVER ONLY. Seems like Nvidia Inspector and geforce experience were breaking my SLI.


Might as well give that a shot. I do not have Inspector or Experience installed; I only grab the driver and PhysX on my installs.

Nope, no dice. Did the DDU/fresh install and it changed nothing. I will just return the motherboard.


----------



## webhito

Quote:


> Originally Posted by *Fediuld*
> 
> I am sorry if high jacking the discussion, but for custom watercooling which one you believe would oc better? The EVGA GeForce GTX 1080 Ti FTW3 GAMING, GeForce GTX 1080Ti SC Black Edition GAMING ICX or the Aorus 1080Ti Xtreme? Or all of them will clock the same depending how good the watercooling is, and better buy the cheapest one?
> 
> Thanks


A big part I believe depends on the silicon lottery. My 1080 Ti Strix overclocked nicely, while neither of my SC2s can go over 2000 on the core or 400 on the memory. I am in no way heat-limited, as the cards are nowhere near 60°C.


----------



## Fediuld

Quote:


> Originally Posted by *webhito*
> 
> A big part I believe depends on silicon lottery, my 1080ti strix overclocked nicely, while neither of my sc2's can go over 2000 in core and 400 in memory. I am in no way heat limited as the cards are nowhere near 60c.


Last year I bought a GTX 1080 Armor (the cheapest 1080) and OC'd it with a waterblock to 2190 without failure, while temps never exceeded 35°C. If I was able to keep it below 32°C, it hit 2202. But that's difficult with one thick 360 rad. :/ This time around I have two thick 360mm rads in the new system (and case), hence I have no issue tbh running 2 loops: one on the GPU with a 360mm and the other on the CPU.

Hence my question







The FTW3 has the highest clocks out of the box, but given that all 1080 Tis are voltage-limited at 1.093V, it might not matter which one, yes?


----------



## TheRedViper

Voltage limitation only applies to mainstream cards; high-end ones like the Kingpin or the Galax allow you to use custom tools to OC above normal specs.


----------



## Fediuld

With the Kingpin not in stock, and
Quote:


> Originally Posted by *TheRedViper*
> 
> Voltage limitation only applies to mainstream cards, high-ends like the kingpin or the galax allows you to use custom tools to OC above normal specs.


thnx


----------



## CR-Pascal

Quote:


> Originally Posted by *Alex132*
> 
> What I did on my old graphics card was plug in the fans to a mobo header and control them with speedfan. Would that be possible for you?
> If the connector is different, it is possible to just remove the 4 pins and place them in a standard 4pin fan header (I did this with my 5870 accelero extreme).


So I tried this setup and it seems to work except for one thing: the GPU fans (the only ones controlled by SpeedFan) keep spinning up every few minutes or so, as if they are fed 100% power for just a second or two. Do you have any experience with this?


----------



## pcgaming247

Which 1080 Ti would look the best in the Core P3?
https://m.newegg.ca/products/N82E16811133313
I'm planning on getting this:
https://m.newegg.ca/products/N82E16814487338


----------



## Pandora's Box

Quote:


> Originally Posted by *pcgaming247*
> 
> what 1080 ti would look the best in tbe core p3
> https://m.newegg.ca/products/N82E16811133313
> im planning on getting this
> https://m.newegg.ca/products/N82E16814487338


I have the MSI Gaming X 1080 Ti in my P3.


----------



## pcgaming247

Quote:


> Originally Posted by *Pandora's Box*
> 
> I have the MSI Gaming X 1080 Ti in my P3.


looks very good
did u paint the case red?


----------



## Pandora's Box

Quote:


> Originally Posted by *pcgaming247*
> 
> looks very good
> did u paint the case red?


No, the case comes in red as an option. I also got the tempered glass upgrade kit. Looks so much better than the acrylic window that comes with the case.


----------



## pcgaming247

Quote:


> Originally Posted by *Pandora's Box*
> 
> No the case comes in red as an option. I also got the Tempered glass upgrade kit. Looks so much better than the acrylic window that comes with the case.


From the pictures, the tempered glass looks a bit too tinted.
Is there a clearer tempered glass available?


----------



## Pandora's Box

Quote:


> Originally Posted by *pcgaming247*
> 
> from the pictures of the tempered glass it looks a bit too tinted
> is there a clearer tempered glass available?


Not that I have seen.


----------



## webhito

Got an update: my SLI issue seems to be tied to CSM. I turn it off because of my G-Sync screen; with CSM on it flickers several times when POSTing, which is annoying. It only happens with 1080 Tis; mind you, both my Strix and SC2s do it, and the last 1080 card I had did not have this problem.
Turning CSM support on lets me enable SLI on the first try, whereas with it off I need to activate it 3 times. Ridiculous.


----------



## hadesfactor

Quote:


> Originally Posted by *webhito*
> 
> Got an update, my sli issue seems to be tied to CSM. I turn it off due to my gsync screen, it flickers when posting, several times and its annoying, it only happens with 1080 ti's, mind you both my strix and sc2's do it, the last 1080 card I had did not have this problem.
> Turning csm support on lets me enable on the first try, whereas with it off I need to activate it 3 times. Ridiculous.


Actually this problem was there even during the 6-series as well. Some people had the issue and some people didn't, but it was always tied to CSM. If you google "CSM Nvidia" I'm sure you'll find pages on it. I haven't run SLI in years, but maybe someone else can help out on this.


----------



## webhito

Quote:


> Originally Posted by *hadesfactor*
> 
> Actually this problem was even there during the 6 series as well....some people had the issue some people didn't but it was always tied to CSM......if you google CSM NVidia Im sure youll find pages on it. I havn't run sli in years but maybe someone else can help out on this.


Oh, well it's an ongoing issue then lol. I would have expected it to have been resolved by now.


----------



## blecap

I'm very new to OC but have been at it for the past few hours. I have the Zotac 1080 Ti 11GB blower card. I kept seeing "FE" and wasn't sure what it meant; I assume mine is the "FE"?

These are the programs I've installed: MSI AB, GPU-Z, and a benchmark program, Unigine Heaven. Just wondering how high I can OC this card? I tried searching on Google but there's very little detail on people OCing this particular card.

No fancy stuff on my GPU, everything is stock; the only tiny thing I did was remove a PCI slot cover behind my case below the GPU slot for a little extra room for the blower to exhaust the heat.

I'd love it if anyone could help me with OC values etc. for this particular card. Not sure what other information I can provide, so I'll wait for the experienced folks to help me!

I played the game Hellblade: Senua's Sacrifice for an hour or so. As I quit the game, it suddenly greyed out and hung/crashed; my temp was in the low 70s tho, and I did not touch the voltage. When I play Ghost Recon Wildlands, I'm pretty sure there wasn't an issue.

My fan curve is pretty straightforward. I set it this way: 40% at 40°C, then a straight ramp up to 85% at 70°C, holding 85% through 90°C.
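For anyone curious, here's that curve as a quick piecewise-linear sketch (just my own illustration, not tied to any specific fan-control tool):

```python
def fan_speed(temp_c):
    """Piecewise-linear fan curve: 40% at 40C, ramping to 85% at 70C,
    then flat at 85% up to 90C."""
    points = [(40, 40), (70, 85), (90, 85)]  # (temp in C, fan %)
    if temp_c <= points[0][0]:
        return points[0][1]
    if temp_c >= points[-1][0]:
        return points[-1][1]
    # Linear interpolation between the two neighboring points
    for (t0, s0), (t1, s1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(40))  # 40
print(fan_speed(55))  # 62.5, halfway up the ramp
print(fan_speed(80))  # 85
```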


Just some extra info, my computer specs:
i7-7700K 4.2GHz,
Acer Predator XB271HU 27",
G.Skill Ripjaws 16GB (2 x 8GB) 3000MHz,
3 Samsung SSDs + 2 HDDs

PS: I run my games at 1440p.


----------



## LunaP

Quote:


> Originally Posted by *webhito*
> 
> Oh, well its an ongoing issue then lol, I would have expected it to have been resolved by now.


Pascal was designed with UEFI in mind, and CSM is pretty much anti-UEFI..... so go fig |:


----------



## webhito

Quote:


> Originally Posted by *LunaP*
> 
> Pascal was designed with UEFI in mind, and CSM is pretty much anti-UEFI..... so go fig |:


Yes, and while I could always sell one of my GPUs to solve the issue, they look too damn sexy to do so. I still have a few days to decide what to do.


----------



## BoredErica

My SC Black was 2012-2025 on air (100% fan speed), and now I'm at 2088-2100 on a custom loop.


----------



## encrypted11

Well, for the people asking about the EVGA SC Black Edition, have you seen this?
https://forums.evga.com/m/tm.aspx?m=2628711&p=2

It has a DVI port rather than just the solder pad, however.


----------



## AstroSky

Sadly, temps keep me from my true overclocking potential. I'm in a custom loop and I think I messed up my watercooling setup. I can reach 2000 and 2050, but temps climb around that area and it goes back down to 1999. It might be a combination of not enough rad space and bad thermal paste application. So I'll invest in some LM paste and maybe one more copper rad.


----------



## hadesfactor

Quote:


> Originally Posted by *AstroSky*
> 
> Sadly, temps keep me from my true overclocking potential. I'm in a custom loop and I think I messed up my watercooling setup. I can reach 2000 and 2050, but temps climb around that area and it goes back down to 1999. It might be a combination of not enough rad space and bad thermal paste application. So I'll invest in some LM paste and maybe one more copper rad.


Don't go for LM, just get some Thermal Grizzly Kryonaut. What is your rad setup? Are you running a dual loop or a single? In a well-designed custom loop your GPU temps under load should be between 40-45°C (dependent on ambient as well).

Don't forget, TIM isn't used only or primarily for its thermal conductivity; it's used to remove the gap between the coldplate and the chip/IHS so that there are no pockets of air, and as a sort of leveler, since you'll never get both surfaces perfectly flat. That's why even between the top 10 TIMs there is only about a 0.5°C difference.

As for rad space, the way I judge how much I need is 120mm + (120mm per device), so if I have a CPU + GPU, at the very minimum I have a 360mm. Other factors come into play as well, like whether you are using a thinner rad such as a 30mm (then I add more rad space), and cooling in general: good fans, good placement, low ambient, etc. I kind of overdo it in my builds; I generally have a 480x45 just for my GPU and another for my monoblock/RAM.
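That rule of thumb can be sketched roughly like this (the function name and the thin-rad adjustment are just my own illustration of the rule, not an exact science):

```python
def min_rad_mm(devices, rad_thickness_mm=45):
    """Rough minimum radiator length: 120mm base + 120mm per cooled device.
    With thinner (~30mm) rads, budget an extra 120mm of surface."""
    total = 120 + 120 * devices
    if rad_thickness_mm <= 30:
        total += 120
    return total

print(min_rad_mm(2))      # CPU + GPU on ~45mm rads: 360mm minimum
print(min_rad_mm(2, 30))  # same loop on 30mm rads: 480mm
```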

There are so many factors.

Can you post a pic of your loop? Also, what are your temps: CPU, GPU, ambient, and water?


----------



## KraxKill

Maybe I'm the only one here that thinks like this, but having had my card up to 2164, I still think that the *sweet spot for these cards is around 2000MHz @ ~1.0V or under (I run 2012 @ 0.950)*. Lower temps, more TDP headroom to handle tough scenes. When I'm not gaming, I'm mining Ethereum at 38MH/s. Most of my games are configured to be CPU-limited, as I prefer frame rate and low latency over distracting eye candy on my 144Hz display. This means that I actually get the same performance at 2000MHz as I do at 2100MHz while consuming around 200W, compared to the power hog the card becomes at higher voltages and frequencies.

If you think in terms of efficiency, *these cards start robbing Peter to pay Paul over 1.000V.*

Remember these cards have only so much TDP to work with; increasing voltage simply makes you hit this limit faster, consuming a lot of the benefit you may have gained from the additional clock. You gain in easy scenes (where you don't really need it), while in tough scenes where you hit the TDP limit you're actually losing performance and incurring frame latency via clock hopping, in the places where you actually need the card to work hard and steady.

*2000MHz @ 1V or less. Set it and forget it*.
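To put rough numbers on that, using the usual first-order approximation that dynamic power scales with frequency times voltage squared (an estimate only, not measured card data):

```python
def relative_power(f_mhz, v, f_ref=2000.0, v_ref=0.95):
    """Approximate dynamic power relative to a 2000MHz @ 0.95V baseline,
    using P ~ f * V^2 (first-order estimate)."""
    return (f_mhz / f_ref) * (v / v_ref) ** 2

# ~2100MHz at the 1.093V cap: roughly 39% more power for ~5% more clock.
print(round(relative_power(2100, 1.093), 2))
```

That lopsided trade is why the last ~100MHz eats so much of the TDP budget and shows up as clock hopping in heavy scenes.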


----------



## blecap

Someone help me with my Zotac post


----------



## webhito

Quote:


> Originally Posted by *blecap*
> 
> Someone help me with my Zotac post


FE= Founders Edition, those cards run hot, I didn't keep mine as the only solution was to watercool it as it throttled even at stock.

Just saw the link you left, its not a founders, but pretty sure the only thing that is different is the shroud, not sure if it will be any better though.


----------



## hadesfactor

Quote:


> Originally Posted by *KraxKill*
> 
> Maybe I'm the only one here that thinks like this, but having had my card up to 2164, I still think that the *sweet spot for these cards is around 2000MHz @ ~1.0V or under (I run 2012 @ 0.950)*. Lower temps, more TDP headroom to handle tough scenes. When I'm not gaming, I'm mining Ethereum at 38MH/s. Most of my games are configured to be CPU-limited, as I prefer frame rate and low latency over distracting eye candy on my 144Hz display. This means that I actually get the same performance at 2000MHz as I do at 2100MHz while consuming around 200W, compared to the power hog the card becomes at higher voltages and frequencies.
> 
> If you think in terms of efficiency, *these cards start robbing Peter to pay Paul over 1.000V.*
> 
> Remember these cards have only so much TDP to work with; increasing voltage simply makes you hit this limit faster, consuming a lot of the benefit you may have gained from the additional clock. While in tough scenes where you hit the TDP limit, you're actually losing performance and incurring frame latency via clock hopping.
> 
> *2000MHz @ 1V or less. Set it and forget it*.


I agree, if you can get those clocks at the low voltage. Not all cards perform the same; for instance, my card can't do those speeds at that voltage. But I agree you want the most stable clocks at the lowest voltage you can get.


----------



## KraxKill

Quote:


> Originally Posted by *hadesfactor*
> 
> I agree, if you can get those clocks at the low voltage. Not all cards perform the same; for instance, my card can't do those speeds at that voltage. But I agree you want the most stable clocks at the lowest voltage you can get.


Where does your card hit at 1V? I'm not trying to assume things, but one thing to keep in mind is that when you lower the voltage, the card will run cooler and slow its fan, so you have to adjust the fan curve accordingly, making it more aggressive so that it spools the same way it would have at higher temps.


----------



## LunaP

Quote:


> Originally Posted by *KraxKill*
> 
> Maybe I'm the only one here that thinks like this, but having had my card up to 2164, I still think that the *sweet spot for these cards is around 2000MHz @ ~1.0V or under (I run 2012 @ 0.950)*. Lower temps, more TDP headroom to handle tough scenes. When I'm not gaming, I'm mining Ethereum at 38MH/s. Most of my games are configured to be CPU-limited, as I prefer frame rate and low latency over distracting eye candy on my 144Hz display. This means that I actually get the same performance at 2000MHz as I do at 2100MHz while consuming around 200W, compared to the power hog the card becomes at higher voltages and frequencies.
> 
> If you think in terms of efficiency, *these cards start robbing Peter to pay Paul over 1.000V.*
> 
> Remember these cards have only so much TDP to work with; increasing voltage simply makes you hit this limit faster, consuming a lot of the benefit you may have gained from the additional clock. You gain in easy scenes (where you don't really need it), while in tough scenes where you hit the TDP limit you're actually losing performance and incurring frame latency via clock hopping, in the places where you actually need the card to work hard and steady.
> 
> *2000MHz @ 1V or less. Set it and forget it*.


Can this get added to the OP?


----------



## johnwayne117

Hey guys, what temps are you getting with the Strix 1080 Ti when it's hot, like 33-37 Celsius (~95 Fahrenheit)? No AC.


----------



## hadesfactor

Quote:


> Originally Posted by *KraxKill*
> 
> Where does your card hit at 1v? I'm not trying to assume things, but one thing to keep in mind is that when you lower voltage, the card runs cooler and slows your fan, so you have to adjust the fan curve accordingly, making it more aggressive so that it spools up the same way it would have at higher temps.


Temps aren't a problem for me...running a custom dedicated loop just for the GPU...even pushing a constant 1.093v my card doesn't break 45C tops, and that's under stress....constant gaming is closer to 40C.

For me temps aren't the issue; also, my card (FE) can't push 2075 fully stable either...I can run it at 2062, but I'm also running into power limiting since I haven't gotten around to flashing the BIOS yet.

I'll have to check when I get home what my stable is at 1v, but IIRC my best clock at that voltage was 1987.


----------



## LunaP

I plan on putting my SC Blacks in SLI in a custom loop, and wondering if I leave them vanilla would I run into any issues w/ throttling that would warrant the need for a bios change or is it just better overall to do so regardless?


----------



## GreedyMuffin

Nope. Just don't overvoltage.


----------



## tieberion

My EVGA FE hits 2009 at 1V and 74c with fan on auto, thats without adding power, etc from MSI Afterburner. So I kinda just leave it alone, other than running the batch file after new drivers to cgange the thermal limit from 250 to 300.


----------



## Dasboogieman

Quote:


> Originally Posted by *AstroSky*
> 
> sadly temps keep me from my true overclocking potential. im in a custom loop and i think i ****ed up my watercooling setup, I can reach 2000 and 2050 but temps climb around that area and it goes back down to 1999. It might be a combination of not enough rad space and bad thermal paste application. So ill invest some LM paste and maybe one more copper rad.


Without some data we cannot say what needs improving.

There are 2 interfaces you have to be aware of if you want to gauge what improvement needs to be done.
1. Water to air Delta T
2. Core to Water Delta T

Number 1 tells you the efficiency of your watercooling loop, number 2 tells you the efficiency of your heat transfer to water.
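The two deltas can be read straight off three sensors (core, water, ambient). A minimal sketch with hypothetical readings:

```python
def loop_deltas(core_c, water_c, ambient_c):
    """Split total thermal headroom into the two interfaces described above."""
    return {
        "water_to_air": water_c - ambient_c,   # radiator / airflow efficiency
        "core_to_water": core_c - water_c,     # block mount / TIM efficiency
    }

# Hypothetical readings: 45C core, 33C water, 28C ambient.
print(loop_deltas(45, 33, 28))  # {'water_to_air': 5, 'core_to_water': 12}
```

A large water-to-air delta points at rad space or fans; a large core-to-water delta points at the block mount or paste application.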


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dasboogieman*
> 
> Without some data we cannot say what needs improving.
> 
> There are 2 interfaces you have to be aware of if you want to gauge what improvement needs to be done.
> 1. Water to air Delta T
> 2. Core to Water Delta T
> 
> Number 1 tells you the efficiency of your watercooling loop, number 2 tells you the efficiency of your heat transfer to water.


Some people do not understand how temps work. For example, my 1080 Ti idles at 33C and loads at 50C. My ambient is 28-30C most of the time. I could lower it 5-7C with a better loop instead of the AIO, and another 8-10C with a 20C ambient, and I would be in the low 30s under load.


----------



## pez

Quote:


> Originally Posted by *johnwayne117*
> 
> Hey guys, what temps are you getting with the Strix 1080 Ti when it's hot, like 33-37 Celsius (~95 Fahrenheit), with no AC?


If anything, the STRIX Ti is one of the better AIB implementations. So long as you have good airflow in your case, you shouldn't see an issue with that card.


----------



## johnwayne117

Quote:


> Originally Posted by *pez*
> 
> If anything, the STRIX Ti is one of the better AIB implementations. So long as you have good airflow in your case, you shouldn't see an issue with that card.


Yeah, but what temps are we talking, 75-85C? My airflow is kinda low...


----------



## pez

Quote:


> Originally Posted by *johnwayne117*
> 
> Yeah, but what temps are we talking, 75-85C? My airflow is kinda low...


I shoved a Strix Ti in my NCASE, and I'd say that's the worst-case scenario. The only airflow the card had was what the GPU fans were pulling through the bottom of the case. With 70% fan, the card was hitting somewhere around 75-80C.

I have some notes here at the very end of this album about temps...overall, given some airflow, the card should perform well. 70% fan isn't bad on these cards, but it was bad for me considering my case doesn't subdue noise at all.



http://imgur.com/PuHht


----------



## Streetdragon

Quote:


> Originally Posted by *hadesfactor*
> 
> Don't go for LM, just get some Thermal Grizzly Kryonaut....what is your rad setup, are you running a dual loop or a single? In a well-designed custom loop your GPU temps under load should be between 40-45C (dependent on ambient as well).
> 
> Don't forget, TIM isn't used only or primarily for its thermal conductivity; it's used to remove the gap between the cold plate and the chip/IHS so that there are no pockets of air, and as sort of a leveler, since you'll never get both surfaces perfectly flat. That's why even between the top 10 TIMs there is only like a .5C diff.
> 
> As for rad space, the way I judge how much space I need is 120mm + (120mm per device), so if I have a CPU + GPU, at the very minimum I have a 360mm. Other factors come into play as well; if I am using a thinner rad such as a 30mm, I add more rad space. Also cooling, i.e. good fans, good placement, low ambient, etc. I kind of overdo it in my builds; I generally have a 480x45 just for my GPU and another for my mono-block/RAM.
> 
> There are so many factors.
> 
> Can you post a pic of your loop? Also, what are your temps....CPU, GPU, ambient and water?


Why not LM? If you know how to work with LM then there are only benefits. My GPU runs after hours of gaming (VSYNC OFF / 100% load [email protected]) 5° over water -> [email protected]°. I don't think that Kryonaut can beat that in any way or get to the same result.
OK, the difference between LM and the best non-metal TIM is 2° with a good overclock, but better to have it than not.


----------



## Dasboogieman

Quote:


> Originally Posted by *Streetdragon*
> 
> Why not LM? If you know how to work with LM then there are only benefits. My GPU runs after hours of gaming (VSYNC OFF / 100% load [email protected]) 5° over water -> [email protected]°. I don't think that Kryonaut can beat that in any way or get to the same result.
> OK, the difference between LM and the best non-metal TIM is 2° with a good overclock, but better to have it than not.


Because LM can straight up kill your card without warning if even so much as a tiny drop goes where it isn't supposed to.

I really don't like how so many people push the benefits of LM without fully disclosing the risks, which are no joke.


----------



## Streetdragon

Quote:


> Originally Posted by *Dasboogieman*
> 
> Because LM can straight up kill your card without warning if even so much as a tiny drop goes where it isn't supposed to.
> 
> I really don't like how so many people push the benefits of LM without fully disclosing the risks, which are no joke.


I'll quote a bit from my post...
"*If you know how to work with LM* then there are only benefits"

Sure, if you don't prepare your card beforehand, you can run into big problems.
Insulating around the die is a must.


----------



## s1rrah

So on a whim, and because the game is new to me ... I decided to play my first hour of Hellblade: Senua's Sacrifice last night on my Samsung 4K TV. I was running my single 1080 Ti at 2020MHz on the core and 5000MHz (think that's right) on the memory ...

I tried the game on both its max setting of "Very High" ... and also "High", and generally I averaged about 50fps on either. Occasionally it would jump to 60fps, but in general it hung somewhere between 45fps and 55fps. It was definitely playable, and even though the JS9000 is one of the better 4K gaming TVs ... I just couldn't handle the FPS compared to the 100+ that my 1440p monitor gets.

So I ended up finishing the session on my 1440p monitor ...

I will say this, though ...

Hellblade looked staggeringly good on that 48" 4K TV ... I don't know if it's the panel, the whole "quantum dot" thing or what ... but man, was the TV a more brilliantly colorful, "contrasty" and just generally more "rich" and dynamic image than the Acer XBU... 1440p monitor; it took me at least 20 minutes to acclimate to the monitor's relatively "pale" colors after the Samsung TV ...

Fun experiment, but it reaffirmed the fact that I simply can't play at 50 to 60fps ...

The Hellblade: Senua game is outstanding, BTW ... gorgeous, great writing and just insane sound production ... highly recommend it ...


----------



## hadesfactor

Quote:


> Originally Posted by *Streetdragon*
> 
> Why not LM? If you know how to work with LM then there are only benefits. My GPU runs after hours of gaming (VSYNC OFF / 100% load [email protected]) 5° over water -> [email protected]°. I don't think that Kryonaut can beat that in any way or get to the same result.
> OK, the difference between LM and the best non-metal TIM is 2° with a good overclock, but better to have it than not.


My CPU loop doesn't break a 5C delta under stress, and normal gaming stays between 2.8 and 3.5C delta...my GPU doesn't go over a 3C delta after playing 10+ hrs and doesn't break 40C, and that's with a 30C ambient (I run my GPU at higher v)..in normal operation my GPU sits @ a .7C delta, and my GPU rad is even just in a pull config. All that with just Kryonaut, and not even Fuji pads.

Besides the facts that were already stated, his problem isn't going to go away with a 2C diff. He needs to tame his loop 1st and figure out why his temps are as hot as they are. Also, if his loop is optimized and running efficiently, 2C will make no difference in how his GPU runs...45 to 43C will make no diff; from 43 to 41 the card will run the same. LM can be dangerous if not used correctly, and the gains on a water-cooled GPU are not worth the risks IMO. If, after he gets his loop under control and gets a stable OC and good temps, he thinks that 2C will bring him below a throttle point, he can try LM, but you don't run before you walk.

@astrosky, as dasboogieman and I stated...we need to know your temps across your system. Do you have water temp sensors? If you don't, I suggest adding at least 1 to your loop. I like adding 1 to the in and out of my rad (only a 1C diff), but knowing the delta between your water and ambient is pretty crucial..if it's a high delta T, you know that your setup isn't removing heat from your water. If your water temps aren't rising in proportion to your chip temps, then your blocks aren't transferring heat from your chip to the water. Your water/chips will never be lower than your ambient air in a water-cooled loop (except with exotic cooling like phase change, LN2, geothermal, or a chiller), so ambient has a major impact on your overall temps.

It's been tested lots of times that TIM application does make a diff, but not as big as people think...if I can find the test that shows all the different types of application I'll post it...the tester even uses a smiley-face design, and there was barely a difference between the best and worst application. I would say look at your loop 1st; then, after that's nailed down, if you think you could shave off more, look at the other things.


----------



## GraphicsWhore

Quote:


> Originally Posted by *KraxKill*
> 
> Maybe I'm the only one here that thinks like this, but having had my card up to 2164, I still think that the *sweet spot for these cards is around 2000mhz @ ~1.0v or under (I run 2012 @ .950)*. Lower temps, more TDP headroom to handle tough scenes. When I'm not gaming, I'm mining Ethereum at 38mh/s. Most of my games are configured to be CPU limited, as I prefer frame rate and low latency over distracting eye candy on my 144hz display. This means that I actually get the same performance at 2000mhz as I do at 2100mhz while consuming around 200W, compared to the power hog the card becomes at higher voltages and frequencies.
> 
> If you think in terms of efficiency, *these cards start robbing Peter to pay Paul at over 1.000v*
> 
> Remember, these cards have only so much TDP to work with; increasing voltage simply makes you hit this limit faster, consuming a lot of the benefit you may have gained from the additional clock. You gain in easy scenes (where you don't really need it), while in tough scenes where you hit the TDP, you're actually losing performance and incurring frame latency via clock hopping in the places where you actually need the card to work hard and steady.
> 
> *2000mhz @ 1v or less. Set it and forget it*.


Throttling and TDP aren't issues for me but I have a new AB profile for 2012/6039 at .993v and while benchmarks are certainly lower (compared to highest gaming OC of 2126/6048 at 1.093v), I'll probably just use that as my new gaming OC; performance seems the same.


----------



## ZealotKi11er

Anyone played Wolfenstein: The New Order recently? The game keeps crashing to desktop. I even did a fresh Windows 10 install and it's still happening.


----------



## GraphicsWhore

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Anyone played Wolfenstein: The New Order recently? The game keeps crashing to desktop. I even did a fresh Windows 10 install and it's still happening.


Been playing the DLC and it's fine.


----------



## hadesfactor

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Anyone played Wolfenstein: The New Order recently? The game keeps crashing to desktop. I even did a fresh Windows 10 install and it's still happening.


In some games for me, playing with my card OC'd higher than 2000MHz causes a crash. I don't know if this will help, but try downclocking a little (if OC'd) and see if it still crashes. This issue for me was especially bad with ME Andromeda....couldn't play for more than 10 min when it actually loaded lol. Doom wasn't an issue.


----------



## ZealotKi11er

Quote:


> Originally Posted by *hadesfactor*
> 
> In some games for me, playing with my card OC'd higher than 2000MHz causes a crash. I don't know if this will help, but try downclocking a little (if OC'd) and see if it still crashes. This issue for me was especially bad with ME Andromeda....couldn't play for more than 10 min when it actually loaded lol. Doom wasn't an issue.


No problem with other games. I have tried stock and still happens.


----------



## Lefty23

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Anyone played Wolfenstein: The New Order recently? The game keeps crashing to desktop. I even did a fresh Windows 10 install and it's still happening.


Just tried it for ~40 mins (Gibraltar Bridge chapter) with no problems.
1440p, Ultra preset, latest NV driver, Win10 Pro (Creators Update not installed). Card using my summer (room temp 30C++) low-voltage OC profile 2000/[email protected]
Does it crash straight away or after long sessions?
If the latter, I could try to run it for a couple of hours during the weekend.

Btw, thanks for reminding me I should replay New Order & Old Blood in preparation for Colossus ... if only we didn't have to deal with the 60fps cap of id5.

Hopefully the new one is being developed in the Doom engine.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Lefty23*
> 
> Just tried it for ~40 mins (Gibraltar Bridge chapter) with no problems.
> 1440p, Ultra preset, latest NV driver, Win10 Pro (Creators Update not installed). Card using my summer (room temp 30C++) low-voltage OC profile 2000/[email protected]
> Does it crash straight away or after long sessions?
> If the latter, I could try to run it for a couple of hours during the weekend.
> 
> Btw, thanks for reminding me I should replay New Order & Old Blood in preparation for Colossus ... if only we didn't have to deal with the 60fps cap of id5.
> 
> Hopefully the new one is being developed in the Doom engine.


It crashes after playing 2-3 mins, sometimes 5 mins, sometimes 30s. Always when something big is about to happen in the game, like something jumping at you.
I've only played New Order so far.

The new games will use id6.


----------



## Lefty23

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It crashes after playing 2-3 mins, sometimes 5 mins, sometimes 30s. Always when something big is about to happen in the game, like something jumping at you.
> I've only played New Order so far.


I'll try some more during the weekend and update you if I have similar issues.
Quote:


> The new games will use id6.


Good news. Doom ran and looked great


----------



## coc_james

So my AORUS Extreme Waterforce won't volt past 1.063 in MSI AB. With the max volts at 1.063 my wall is 2113mhz in Superposition 1080p extreme. Score maxes at 6424. Max temps never exceed 45, and clocks don't fall. Wondering why I can't go to 1.096 volts. Is it BIOS limited or am I missing something? Yes, I tried a curve over shooting 2113, to like 2135 @ 1.096, no dice, won't go there.


----------



## Fediuld

Quote:


> Originally Posted by *coc_james*
> 
> So my AORUS Extreme Waterforce won't volt past 1.063 in MSI AB. With the max volts at 1.063 my wall is 2113mhz in Superposition 1080p extreme. Score maxes at 6424. Max temps never exceed 45, and clocks don't fall. Wondering why I can't go to 1.096 volts. Is it BIOS limited or am I missing something? Yes, I tried a curve over shooting 2113, to like 2135 @ 1.096, no dice, won't go there.


1.093v is the cap, not 1.096.


----------



## coc_james

Quote:


> Originally Posted by *Fediuld*
> 
> Quote:
> 
> 
> 
> Originally Posted by *coc_james*
> 
> So my AORUS Extreme Waterforce won't volt past 1.063 in MSI AB. With the max volts at 1.063 my wall is 2113mhz in Superposition 1080p extreme. Score maxes at 6424. Max temps never exceed 45, and clocks don't fall. Wondering why I can't go to 1.096 volts. Is it BIOS limited or am I missing something? Yes, I tried a curve over shooting 2113, to like 2135 @ 1.096, no dice, won't go there.
> 
> 
> 
> 1.093v is the cap not 1.096
Click to expand...

Either way, semantics aside, it won't go past 1.063.


----------



## GraphicsWhore

Quote:


> Originally Posted by *coc_james*
> 
> So my AORUS Extreme Waterforce won't volt past 1.063 in MSI AB. With the max volts at 1.063 my wall is 2113mhz in Superposition 1080p extreme. Score maxes at 6424. Max temps never exceed 45, and clocks don't fall. Wondering why I can't go to 1.096 volts. Is it BIOS limited or am I missing something? Yes, I tried a curve over shooting 2113, to like 2135 @ 1.096, no dice, won't go there.


You update to AB beta 16?

Try another OC program?

Maybe flash a different BIOS?


----------



## BoredErica

I did 5 tests of SuPo 8k optimized on FE vs XOC bios on SC Black (FE pcb). XOC won by a little bit.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Darkwizzie*
> 
> I did 5 tests of SuPo 8k optimized on FE vs XOC bios on SC Black (FE pcb). XOC won by a little bit.


I did a test with FE @ 2025 vs XOC at 2100, and XOC got a 1.2% higher score in 3DMark. Totally not worth it.


----------



## BoredErica

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I did a test with FE @ 2025 vs XOC at 2100, and XOC got a 1.2% higher score in 3DMark. Totally not worth it.


Totally worth it to me!


----------



## GraphicsWhore

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I did a test with FE @ 2025 vs XOC at 2100, and XOC got a 1.2% higher score in 3DMark. Totally not worth it.


Ok, you're getting 1.2% benefit but it's for like a couple-minutes worth of work including benchmark. I mean, unless you're a big shot whose every single minute counts that much...


----------



## ZealotKi11er

Quote:


> Originally Posted by *GraphicsWhore*
> 
> Ok, you're getting 1.2% benefit but it's for like a couple-minutes worth of work including benchmark. I mean, unless you're a big shot whose every single minute counts that much...


That's fine. The problem is that 2100 needs way more voltage, and I have no idea what's going on with the VRM on an FE (Hybrid) card.


----------



## coc_james

Not yet.

Tried the AORUS software; it was actually worse.

Don't know if that's an option due to the built-in AIO.


----------



## KedarWolf

Quote:


> Originally Posted by *KraxKill*
> 
> Maybe I'm the only one here that thinks like this, but having had my card up to 2164, I still think that the *sweet spot for these cards is around 2000mhz @ ~1.0v or under (I run 2012 @ .950)*. Lower temps, more TDP headroom to handle tough scenes. When I'm not gaming, I'm mining Ethereum at 38mh/s. Most of my games are configured to be CPU limited, as I prefer frame rate and low latency over distracting eye candy on my 144hz display. This means that I actually get the same performance at 2000mhz as I do at 2100mhz while consuming around 200W, compared to the power hog the card becomes at higher voltages and frequencies.
> 
> If you think in terms of efficiency, *these cards start robbing Peter to pay Paul at over 1.000v*
> 
> Remember, these cards have only so much TDP to work with; increasing voltage simply makes you hit this limit faster, consuming a lot of the benefit you may have gained from the additional clock. You gain in easy scenes (where you don't really need it), while in tough scenes where you hit the TDP, you're actually losing performance and incurring frame latency via clock hopping in the places where you actually need the card to work hard and steady.
> 
> *2000mhz @ 1v or less. Set it and forget it*.


Added tested at .993v info to OP.

+1 for the suggestion, it's in the Custom Voltage Curve section.









*Edit: I was testing power limiting with the Arctic Storm BIOS. I found that at 2025 core, 6177 memory at .993v my card does zero power limiting in Fire Strike Ultra, Time Spy and Superposition 4K.

Also, it won't power limit on the Fire Strike Ultra stress test. Every BIOS I've tested so far bears this out at .993v under water.

My temps on a 360 rad with a highly overclocked 5960X and my 1080 Ti never go above 43C on the GPU at these settings.
But a lot of that is from adding Fujipoly 17.0 W/mK thermal pads on the memory and VRMs and a few other key places.

Normally I run 2088 core, 6077 memory because I have a 4K G-Sync monitor and I cap games at 59 FPS to keep G-Sync enabled, and capped at 59 FPS it does no power limiting at 2088 at the highest graphics settings in games.

But if you run higher frame rates, or have a 144Hz monitor and run it at cap, you may want to consider your max clocks at .993v.

Under water this is great, and on air it will greatly reduce any temps and thermal throttling you may encounter as well.*


----------



## pez

Quote:


> Originally Posted by *s1rrah*
> 
> So on a whim, and because the game is new to me ... I decided to play my first hour of Hellblade: Senua's Sacrifice last night on my Samsung 4K TV. I was running my single 1080 Ti at 2020MHz on the core and 5000MHz (think that's right) on the memory ...
> 
> I tried the game on both its max setting of "Very High" ... and also "High", and generally I averaged about 50fps on either. Occasionally it would jump to 60fps, but in general it hung somewhere between 45fps and 55fps. It was definitely playable, and even though the JS9000 is one of the better 4K gaming TVs ... I just couldn't handle the FPS compared to the 100+ that my 1440p monitor gets.
> 
> So I ended up finishing the session on my 1440p monitor ...
> 
> I will say this, though ...
> 
> Hellblade looked staggeringly good on that 48" 4K TV ... I don't know if it's the panel, the whole "quantum dot" thing or what ... but man, was the TV a more brilliantly colorful, "contrasty" and just generally more "rich" and dynamic image than the Acer XBU... 1440p monitor; it took me at least 20 minutes to acclimate to the monitor's relatively "pale" colors after the Samsung TV ...
> 
> Fun experiment, but it reaffirmed the fact that I simply can't play at 50 to 60fps ...
> 
> The Hellblade: Senua game is outstanding, BTW ... gorgeous, great writing and just insane sound production ... highly recommend it ...


I was trying to get it to stream in 4K to my TV (PC is on wifi, so 4K streaming + my TV's bad implementation of AndroidTV = no bueno), but for the few seconds it ran, while standing still, I can say you're not wrong at all. Mine doesn't seem to be as crazy of a difference as yours, but the 4K itself is extremely noticeable. Also, the 'QLED' is probably way better than the IPS panel your monitor has.

Hellblade looks, runs, and plays beautifully, too!


----------



## Pandora's Box

Quote:


> Originally Posted by *KedarWolf*
> 
> Added tested at .993v info to OP.
> 
> +1 for the suggestion, it's in the Custom Voltage Curve section.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Edit: I was testing power limiting with the Arctic Storm BIOS. I found at 2025 core, 6177 memory at .993v my card does zero power limiting in Fire Strike Ultra, Time Spy and Superposition 4K.
> 
> Also, it won't power limit on the Fire Strike Ultra stress test. Every BIOS I've tested so far seems this is true at .993v under water.
> 
> My temps on a 360 RAD with a highly overclocked 5960x and my 1080 Ti never go above 43C on the GPU at these settings.
> But a lot of that is adding Fuji 17.0 W/mK thermal pads on the memory and VRMs and a few other key places.
> 
> Normally I run 2088 core, 6077 memory because I have a 4K G-Sync monitor and I cap games out at 59 FPS to keep G-Sync enabled and capped at 59 FPS it does no power limiting at 2088 at the highest graphics settings in games.
> 
> But if you run higher frame rates or have a 144Hz monitor and run it at cap you may want to consider your max clocks at .993v.
> 
> Under water this is great, and on air it will greatly reduce any temps and thermal throttling you may encounter as well.*


Undervolting and overclocking is definitely the way to go with the 1080 Ti, imo.


----------



## TK421

Quote:


> Originally Posted by *KedarWolf*
> 
> Added tested at .993v info to OP.
> 
> +1 for the suggestion, it's in the Custom Voltage Curve section.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Edit: I was testing power limiting with the Arctic Storm BIOS. I found at 2025 core, 6177 memory at .993v my card does zero power limiting in Fire Strike Ultra, Time Spy and Superposition 4K.
> 
> Also, it won't power limit on the Fire Strike Ultra stress test. Every BIOS I've tested so far seems this is true at .993v under water.
> 
> My temps on a 360 RAD with a highly overclocked 5960x and my 1080 Ti never go above 43C on the GPU at these settings.
> But a lot of that is adding Fuji 17.0 W/mK thermal pads on the memory and VRMs and a few other key places.
> 
> Normally I run 2088 core, 6077 memory because I have a 4K G-Sync monitor and I cap games out at 59 FPS to keep G-Sync enabled and capped at 59 FPS it does no power limiting at 2088 at the highest graphics settings in games.
> 
> But if you run higher frame rates or have a 144Hz monitor and run it at cap you may want to consider your max clocks at .993v.
> 
> Under water this is great, and on air it will greatly reduce any temps and thermal throttling you may encounter as well.*


Does using Fujipoly 17 on the VRM and memory affect core temp? I always thought the stock pads were okay...


----------



## OZrevhead

Does anyone have the correct tool for adjusting voltages on a Galax 1080Ti HOF? I have a tool but it doesn't seem to work (voltages don't change).


----------



## Pandora's Box

Decided to finally bite the bullet and go full custom watercooling for my new rig (see signature for specs).

Have this on the way from EKWB for the MSI Gaming X 1080 Ti


----------



## Kylar182

Had my 2 cards for a while. Under water on EK blocks, but I didn't know there were any BIOS mods for them. I have the original "NVIDIA" brand ones. I overclocked my Titan X Maxwells quite a lot last round. I'd like to see what I can do with these. Anyone know where I should start?


----------



## outofmyheadyo

Anyone with the Sea Hawk EK X 1080 Ti here? Does it hit the power limit, and what's the power limit on it? I suppose it uses the Gaming X PCB, like the other Sea Hawk EK X cards?


----------



## KedarWolf

Quote:


> Originally Posted by *TK421*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Added tested at .993v info to OP.
> 
> +1 for the suggestion, it's in the Custom Voltage Curve section.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Edit: I was testing power limiting with the Arctic Storm BIOS. I found at 2025 core, 6177 memory at .993v my card does zero power limiting in Fire Strike Ultra, Time Spy and Superposition 4K.
> 
> Also, it won't power limit on the Fire Strike Ultra stress test. Every BIOS I've tested so far seems this is true at .993v under water.
> 
> My temps on a 360 RAD with a highly overclocked 5960x and my 1080 Ti never go above 43C on the GPU at these settings.
> But a lot of that is adding Fuji 17.0 W/mK thermal pads on the memory and VRMs and a few other key places.
> 
> Normally I run 2088 core, 6077 memory because I have a 4K G-Sync monitor and I cap games out at 59 FPS to keep G-Sync enabled and capped at 59 FPS it does no power limiting at 2088 at the highest graphics settings in games.
> 
> But if you run higher frame rates or have a 144Hz monitor and run it at cap you may want to consider your max clocks at .993v.
> 
> Under water this is great, and on air it will greatly reduce any temps and thermal throttling you may encounter as well.*
> 
> 
> 
> Does using Fujipoly 17 on the VRM and memory affect core temp? I always thought the stock pads were okay...
Click to expand...

At 1.093v my max temps went down from 53C while stress testing to 45C max.


----------



## TK421

Quote:


> Originally Posted by *KedarWolf*
> 
> At 1.093v my max temps went down from 53C while stress testing to 45C max.


Maybe that's because you adjusted the pad height, causing the core contact to be improved?


----------



## KedarWolf

Quote:


> Originally Posted by *TK421*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> At 1.093v my max temps went down from 53C while stress testing to 45C max.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Maybe that's because you adjusted the pad height, causing the core contact to be improved?
Click to expand...

I used 1.0MM pads on memory, 1.5MM on VRMs.


----------



## Mooncheese

Quote:


> Originally Posted by *KedarWolf*
> 
> I used 1.0MM pads on memory, 1.5MM on VRMs.


Hi, I've been following this thread fairly closely for a while now. I currently have an FE card with an AIO attached and was contemplating flashing the Arctic Storm vbios and keeping the PT at 100% (360W). Did you see a performance decrease with the AS vbios compared to the FE vbios, and do you think I would be OK with 360W and the AIO cooling solution? I'm currently seeing wattage-starvation clock drops from 2062 down to 2000 MHz in Superposition, Fire Strike, and The Witcher 3 (3D Vision).

I'm reluctant to flash to AS since, from what I have gathered from different forum members here, they lost around 5% performance with XOC vs. FE, i.e. FE at 2025 MHz ≈ XOC at 2125 MHz, so it's basically a wash and now you have 400+ watts coursing through your card. I'm worried that the situation is more or less the same with the Arctic Storm vbios. I noticed you stated that it performed well in Time Spy, but what about games and other benchmarks vs. FE? I've looked, and I haven't seen any elaboration on this. We could really use a detailed performance comparison between AS and FE. If it's like XOC vs. FE, where sure, you eliminate power throttling, but in the end 2100 MHz yields the same FPS as 2000 MHz did with FE, what's the point?

Thanks for all the work.


----------



## LunaP

This sucks. Finally got around to testing the 1080 Tis yesterday and 1 is DOA, and since store policy is MFC after opening, I gotta wait on EVGA now..


----------



## BoredErica

Kedarwolf, what do you get in 1080p SuPo?


----------



## ZealotKi11er

I am trying to OC with the curve, but for some reason the last parts of the graph stick up. This is not a problem with the FE BIOS, but with the no-power-limit XOC BIOS it goes to those bins. Is there a trick to make it straight? I think with older drivers or older MSI AB it was fine.


----------



## hadesfactor

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am trying to OC with Cruve but for some reason the last parts of the graphs stick up. This is not a problem with FE bios but with no Power Limit XOC bios goes to those bins. Is there a trick to make it strait. I think with older drivers or older MSI AB it was fine.


Happens to me sometimes even on stock bios. Click on the main AB screen (not the curve part) and hit Ctrl+D to reset to defaults. It will reset everything including power, voltage, temp... then try again. Sometimes I get these really pronounced valleys and peaks if I play with the curve a lot and then hit Ctrl+D on just the curve screen.


----------



## hadesfactor

Also, try bringing them to a lower bin than you have your clock at; they should jump to match your line.


----------



## Lefty23

Quote:


> Originally Posted by *Mooncheese*
> 
> 
> If it's like XOC vs. FE, where sure you will eliminate power throttling but in the end 2100 MHz yields the same FPS as 2000 MHz did with FE, what's the point?


That's not entirely true. In my case, while benching (5 3DMark tests & 3 SuPo presets), XOC gave me 2.5% higher scores on average compared to the stock FE bios.
That is with the XOC bios limited to 400W max, based on readings from an APC UPS at the wall and the HWiNFO64 report.


Spoiler: Warning: Spoiler!



3DMark graphics scores
FS : 34036 (+2.2%) http://www.3dmark.com/compare/fs/12302925/fs/13103973
FSU: 8117 (+2.4%) http://www.3dmark.com/compare/fs/12427634/fs/13103590
FSX: 16589 (+3.0%) http://www.3dmark.com/compare/fs/12302171/fs/13103909
TS : 11541 (+2.8%) http://www.3dmark.com/compare/spy/1917118/spy/2058639
3DM11:46470 (+2.5%) http://www.3dmark.com/compare/3dm11/12259575/3dm11/12270119

Superposition (here & here)
UNS1080pExtr : 6629 (+1.18% compared to the highest on FE which was 6552)
UNS____4Kopt : 11031 (+1.99% compared to the highest on FE which was 10819)
UNS____8Kopt : 5104 (+3.32% compared to the highest on FE which was 4940)



Even at same clocks I got better scores, e.g. 4K SuPo, FE vs XOC bios:
- FE 2126/6277 @ 1.093 -- 10819 -- FPS min/avg/max: 64.34/80.93/106.75
- XOC 2126/6277 @ 1.112 -- 10988 -- FPS min/avg/max: 65.25/82.19/106.09
XOC scores 169 (+1.6%) points higher than stock bios while using a bit more volts and runs about 4C higher.
FE run is on newer NV driver.
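The +1.6% quoted above follows directly from the two raw scores; a quick sketch to recompute it (using only the numbers in this post):

```python
# Recompute the quoted gain from the two raw Superposition 4K scores above.
fe_score, xoc_score = 10819, 10988

gain_points = xoc_score - fe_score            # absolute score difference
gain_pct = 100 * gain_points / fe_score       # relative gain vs. the FE run

print(f"+{gain_points} points ({gain_pct:.1f}%)")  # → +169 points (1.6%)
```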


Spoiler: Warning: Spoiler!







If you are interested in finding the limits of your card and you are comfortable with cross-flashing the bios, the best thing you can do is try it for yourself and see if it helps your card.
Just make sure to use the curve to OC and keep an eye on temps/voltage.

FWIW, for 24/7 I'm using the stock bios and running 2000/[email protected], as it keeps the card in the mid/high 30s while running the rad fans at low rpm.
The difference between 2000 and 2100 is just a few fps anyway. Check here if you want, or even better try it yourself in a game you play.

Quote:


> Originally Posted by *hadesfactor*
> 
> Also, try bringing them to a lower bin then you have your clock at, they should jump to match your line


This always works for me


----------



## ZealotKi11er

Quote:


> Originally Posted by *hadesfactor*
> 
> Also, try bringing them to a lower bin then you have your clock at, they should jump to match your line


That is what I do but the last few stick up. I try to lower them manually but still stick up.


----------



## s1rrah

Quote:


> Originally Posted by *pez*
> 
> I was trying to get it to stream via 4K to my TV (PC is on wifi, so 4K streaming + my TV's bad implementation of AndroidTV = no bueno), but for the few seconds it ran and while standing still I could say you're not wrong at all. Mine doesn't seem to be as crazy of a difference as yours is, but the 4K itself is extremely noiceable. Also the 'QLED' is probably way better than the IPS panel your monitor has.
> 
> Hellblade looks, runs, and plays beautiful, too
> 
> 
> 
> 
> 
> 
> 
> !


As an update ... I did a bunch of reading and found out how to disable depth of field and motion blur ... after disabling those, on my 4K TV, I get a solid 60fps just about all the time ... *much* better ... but still can't pull myself away from that 144hz Acer 1440p screen ... but once again, when running those above mentioned hacks/tests ... I was just floored by the colors and contrast of the TV vs the monitor ...

But happy my 1080 Ti can do 4K at 60fps ... even if I'm not going to use it for that res ...

And yes ... Hellblade: Senua's Sacrifice is *incredible* ... it's become one of those games I don't want to play until I can set the room/environment just right and settle in for an hour or three ... I'm a huge fan of content/story/writing based games (think "Dear Esther," which I've played countless times) and Hellblade does it ... it's not without its hiccups but man does it make you care about the female lead ... props to the devs!


----------



## ZealotKi11er

Quote:


> Originally Posted by *Lefty23*
> 
> That's not entirely true. In my case, while benching (5 3DMark tests & 3 SuPo presets) XOC gave me 2.5% higher scores on average, when compared to stock FE bios.
> That is with the XOC bios limited at 400W max based on readings from APC UPS at the wall and HWiNFO64 report.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 3DMark graphics scores
> FS : 34036 (+2.2%) http://www.3dmark.com/compare/fs/12302925/fs/13103973
> FSU: 8117 (+2.4%) http://www.3dmark.com/compare/fs/12427634/fs/13103590
> FSX: 16589 (+3.0%) http://www.3dmark.com/compare/fs/12302171/fs/13103909
> TS : 11541 (+2.8%) http://www.3dmark.com/compare/spy/1917118/spy/2058639
> 3DM11:46470 (+2.5%) http://www.3dmark.com/compare/3dm11/12259575/3dm11/12270119
> 
> Superposition (here & here)
> UNS1080pExtr : 6629 (+1.18% compared to the highest on FE which was 6552)
> UNS____4Kopt : 11031 (+1.99% compared to the highest on FE which was 10819)
> UNS____8Kopt : 5104 (+3.32% compared to the highest on FE which was 4940)
> 
> 
> 
> Even at same clocks I got better scores, e.g. 4K SuPo, FE vs XOC bios:
> - FE 2126/6277 @ 1.093 -- 10819 -- FPS min/avg/max: 64.34/80.93/106.75
> - XOC 2126/6277 @ 1.112 -- 10988 -- FPS min/avg/max: 65.25/82.19/106.09
> XOC scores 169 (+1.6%) points higher than stock bios while using a bit more volts and runs about 4C higher.
> FE run is on newer NV driver.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> If you are interested to find the limits of your card and you are comfortable with cross-flashing bios the best thing you can do is try it for yourself and see if it helps your card.
> Just make sure to use the curve to OC and keep an eye on temps/voltage.
> 
> Fwiw for 24/7 I'm using the stock bios and running 2000/[email protected] as it keeps the card in mid/high 30s while running the rad fans at low rpm.
> The difference between 2000 and 2100 is just a few fps anyway. Check here if you want or even better try it yourself in a game you play.
> This always works for me


What is your secret? Your scores are about 10% higher than mine. My FS is 31.4K with 2100/6000; yours is like 34K.


----------



## BoredErica

6528 SuPo 1080p Extreme at 2138... Peeps be scoring higher than me with lower clocks. @_@


----------



## ZealotKi11er

Quote:


> Originally Posted by *Darkwizzie*
> 
> 6528 SuPo 1080p Extreme at 2138... Peeps be scoring higher than me with lower clocks. @_@


He is doing something we do not know. There is some secret going on.


----------



## Lefty23

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What is your secret. Your scores are 10% more than mine. My FS is 31.4K with 2100/6000. Yours like 34K.


No secret really. These were supposed to be benchmarking runs, so I'm using
- the NVCP tweaks,
- the high-performance power plan,
- shutting down as many programs/services as possible,
- setting the desktop background to a solid color and disabling everything under Colors in the personalisation settings in Windows.
Most of these are mentioned in some of the benchmarking threads here on OCN, but they are the same for both the runs on FE and XOC.

That was not the point of my post though. I'm just saying that, for benchmarking, the XOC bios allows some cards to be pushed a bit further.
That is not always the case though. I remember, when XOC first became available, at least two members with shunt-modded FE cards saw no improvement from using the XOC bios.

As I mentioned I'm on the stock bios and running undervolted for 24/7 usage.
Btw, that Firestrike run is at 2138/6280 @ 1.15v


----------



## BoredErica

Quote:


> Originally Posted by *Lefty23*
> 
> No secret really. These were supposed to be benchmarking runs so I'm using
> - the NVCP tweaks,
> - high performance power plan,
> - shutting down as many programs/services as possible,
> - setting desktop background to solid color and disable everything under colors in personalisation settings in windows
> Most of these are mentioned in some of the benchmarking threads here on OCN. But these are the same for both the runs on FE and XOC.
> 
> That was not the point of my post though. I'm just saying that - for benchmarking - the XOC allows some cards to be pushed a bit further.
> That is not always the case though. I remember, when XOC first became available, at least two members with shunt modded FE cards that had no improvements from using the XOC bios.


Yeah... But I'm wondering what the best things to do for benching are. I'm talking specifics, too: step by step, every detail. I spent a long time getting my score. :|


----------



## Lefty23

Quote:


> Originally Posted by *Darkwizzie*
> 
> Yeah... But I'm wondering what the best things to do for benching are. I'm talking about specifics, too. Step by step, every detail. I spent a long time to get my score. :|


What is unclear in the list above?
The only thing I see is the NVCP tweaks, but they have been mentioned so many times in this forum.
Anyway, I'm using these

Other than that: reboot before every run.
Once in Windows, open Task Manager and shut down everything I don't need.
Disable antivirus.
Run MSI AB and set my OC, then shut it down.
Run the benchmark, take a screenshot, reboot, and start over.


----------



## KedarWolf

Quote:


> Originally Posted by *Darkwizzie*
> 
> Kedarwolf, what do you get in 1080p SuPo?


1080p Extreme


----------



## Kokin

I'm looking for more performance to drive 3440x1440 and my Fury X isn't cutting it. Vega doesn't reach the performance levels I want, at least not yet, so I'm considering getting a 1080Ti + EK waterblock/backplate as my first-ever Nvidia GPU.

I don't see any Founders Editions any longer, but there are several choices for the AIB cards. Leaning towards the Gigabyte Aorus, but how are the other cards? There's the Asus Strix, MSI Gaming X and EVGA FTW3. As an AMD user, I've used GPUs from Sapphire, Gigabyte, MSI, and XFX, so I'm not partial towards any company.


----------



## hadesfactor

Quote:


> Originally Posted by *Kokin*
> 
> I'm looking for more performance to drive 3440x1440 and my Fury X isn't cutting it. Vega doesn't reach the performance levels I want, at least not yet, so I'm considering getting a 1080Ti + EK waterblock/backplate as my first-ever Nvidia GPU.
> 
> I don't see any Founders Editions any longer, but there are several choices for the AIB cards. Leaning towards the Gigabyte Aorus, but how are the other cards? There's the Asus Strix, MSI Gaming X and EVGA FTW3. As an AMD user, I've used GPUs from Sapphire, Gigabyte, MSI, and XFX, so I'm not partial towards any company.


Myself, I'm partial to EVGA. Not only are their cards solid, but they don't care at all if you water cool them; if you need to RMA the card, just put back the stock cooler and ship it, it's fine. If you're gonna water cool and OC, there are two schools of thought on it. Some say you'll hit a higher OC on an AIB card, and others (myself included) don't see the point in spending the extra money. Normally they're factory-clocked lower than you can push them anyway, and the silicon lottery always comes into play, although they do tend to be binned better. You will get a bios on those cards with a higher TDP, but you can also flash your FE. I would say if you can't get an FE, I like the FTW3 personally.

As I said though, some people here have pushed their FE to higher clocks while some AIBs weren't able to push much past 2050-2075. All cards should do 2000+, especially on water. I would find out which other companies let you water cool their products while honoring the RMA process, and choose among those.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Kokin*
> 
> I'm looking for more performance to drive 3440x1440 and my Fury X isn't cutting it. Vega doesn't reach the performance levels I want, at least not yet, so I'm considering getting a 1080Ti + EK waterblock/backplate as my first-ever Nvidia GPU.
> 
> I don't see any Founders Editions any longer, but there are several choices for the AIB cards. Leaning towards the Gigabyte Aorus, but how are the other cards? There's the Asus Strix, MSI Gaming X and EVGA FTW3. As an AMD user, I've used GPUs from Sapphire, Gigabyte, MSI, and XFX, so I'm not partial towards any company.


I have the Strix 1080 Ti with an EK block and backplate. Can't complain; no coil whine whatsoever. But I usually don't have GPUs that whine.

Asus is hit or miss with customer service and warranties, although my one RMA claim went OK (Canada). Not sure if Asus allows customers to remove the air cooler for a water block or not.


----------



## webhito

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> I have the strix 1080ti with an EK block and back plate. Can't complain. No coil whine whatsoever. But I usually don't have gpus that whine.
> 
> Asus is hit or miss with customer service and warranties, although my one RMA claim went ok(Canada). Not sure if Asus allows customers to remove the air cooler for a water block or not.


Nah, they don't; same as Gigabyte, AFAIK. Only EVGA and MSI, I believe?


----------



## KedarWolf

Quote:


> Originally Posted by *Darkwizzie*
> 
> Kedarwolf, what do you get in 1080p SuPo?


----------



## Mooncheese

Quote:


> Originally Posted by *Lefty23*
> 
> That's not entirely true. In my case, while benching (5 3DMark tests & 3 SuPo presets) XOC gave me 2.5% higher scores on average, when compared to stock FE bios.
> That is with the XOC bios limited at 400W max based on readings from APC UPS at the wall and HWiNFO64 report.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 3DMark graphics scores
> FS : 34036 (+2.2%) http://www.3dmark.com/compare/fs/12302925/fs/13103973
> FSU: 8117 (+2.4%) http://www.3dmark.com/compare/fs/12427634/fs/13103590
> FSX: 16589 (+3.0%) http://www.3dmark.com/compare/fs/12302171/fs/13103909
> TS : 11541 (+2.8%) http://www.3dmark.com/compare/spy/1917118/spy/2058639
> 3DM11:46470 (+2.5%) http://www.3dmark.com/compare/3dm11/12259575/3dm11/12270119
> 
> Superposition (here & here)
> UNS1080pExtr : 6629 (+1.18% compared to the highest on FE which was 6552)
> UNS____4Kopt : 11031 (+1.99% compared to the highest on FE which was 10819)
> UNS____8Kopt : 5104 (+3.32% compared to the highest on FE which was 4940)
> 
> 
> 
> Even at same clocks I got better scores, e.g. 4K SuPo, FE vs XOC bios:
> - FE 2126/6277 @ 1.093 -- 10819 -- FPS min/avg/max: 64.34/80.93/106.75
> - XOC 2126/6277 @ 1.112 -- 10988 -- FPS min/avg/max: 65.25/82.19/106.09
> XOC scores 169 (+1.6%) points higher than stock bios while using a bit more volts and runs about 4C higher.
> FE run is on newer NV driver.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> If you are interested to find the limits of your card and you are comfortable with cross-flashing bios the best thing you can do is try it for yourself and see if it helps your card.
> Just make sure to use the curve to OC and keep an eye on temps/voltage.
> 
> Fwiw for 24/7 I'm using the stock bios and running 2000/[email protected] as it keeps the card in mid/high 30s while running the rad fans at low rpm.
> The difference between 2000 and 2100 is just a few fps anyway. Check here if you want or even better try it yourself in a game you play.
> This always works for me


Are you shunted and comparing FE to XOC? Only getting a 2.5% performance improvement for another 100W that essentially air-cooled MOSFETs have to cope with does not seem prudent.

I'm actually not interested in XOC because (1) I only have an AIO attached to the core, so the best I can expect is vapor-chamber-level VRM cooling (at least according to the analysis Gamers Nexus did in their EVGA Hybrid kit review; I'm using an NZXT X41, which is 10C cooler and a lot quieter, having had both for comparison purposes), and (2) I have no control over PT whatsoever (I heard you can actually limit voltage now, but I'm not entirely sure).

Going back to my original comment, I'm interested to see some Arctic Storm vs. FE game and benchmark comparisons. XOC really seems to be an option only if one is completely under water, because you have no control over how many watts the VRMs/MOSFETs have to cope with.

With Arctic Storm I could keep PT at 100% and limit the wattage to 360W, but if I can expect a roughly 5% reduction in performance vis-a-vis the FE vbios then there is no point.

I'm also not keen on shunt modding, as it seems a lot could go wrong, and then there is no way at all to limit wattage consumption.


----------



## KedarWolf




----------



## ZealotKi11er

Quote:


> Originally Posted by *Mooncheese*
> 
> Are you shunted and comparing FE to XOC? Only getting 2.5% performance improvement with another 100W for essentially air cooled MOSFET's to cope with does not seem prudent.
> 
> I'm actually not interested in XOC because 1: I only have an AIO attached to the core, and I can expect vapor chamber VRM cooling performance (at least according to the analysis done by Gamers Nexus with their EVGA Hybrid kit review although I'm using an NZXT x41 which is 10C cooler and a lot quieter, I've had them both for comparison purposes) and 2: I have no control over PT whatsoever (I heard you can actually limit voltage now but I'm not entirely sure.)
> 
> Going back to my original comment, I'm interested to see some Arctic Storm and FE game and benchmark comparison. XOC seems to be really only an option if one is completely under water because of the not having control over how much watts the VRM / MOSFET's have to cope with issue.
> 
> With Arctic Storm I could keep PT at 100% and limit the wattage to 360W, but if I can expect a 5% reduction or so in performance vis a vis FE vbios then there is no point.
> 
> Kinda not interested in shunt modding as it seems that a lot could go wrong and then there is zero limit to controlling the wattage consumption?


You can control voltage with the curve so that it does not go over 1.093v, for example, but the card will still pull as much power as it wants at that voltage, which can vary from game to game. The only problem is trying to set up a profile for lower voltages, because the curve does not seem to work anymore, whether with the new MSI AB or new drivers; it bumps the last bins higher, making the XOC bios very dangerous.
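The "flatten the curve" approach described here can be sketched conceptually. A minimal illustration (all numbers are made up, and real curve editing happens in MSI Afterburner, not code): every voltage point at or above the cap gets the same frequency, so the card never boosts, or raises voltage, past that point.

```python
def flatten_curve(curve, v_max, f_target):
    """curve: {voltage: frequency in MHz}.
    Clamp every point at or above v_max to f_target, mimicking the
    flat voltage/frequency curve trick used in Afterburner."""
    return {v: (f_target if v >= v_max else f) for v, f in curve.items()}

# Illustrative stock-style curve (not from any real card).
curve = {0.90: 1800, 1.00: 1936, 1.093: 2062, 1.15: 2100, 1.20: 2126}
flat = flatten_curve(curve, v_max=1.093, f_target=2062)
print(flat)  # points at 1.093v and above all sit at 2062 MHz
```

The failure mode described in the post is exactly the bins above `v_max` not staying clamped, so the card boosts into them anyway.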


----------



## Mooncheese

Quote:


> Originally Posted by *KedarWolf*


OK, I'm at 10122 in 4K Optimized (same settings), and that's without whatever NVCP tricks might be in vogue at the moment; it also does 10,600 GPU in Time Spy. So we're talking what, a 4% increase here going from 300W to 432W?

50% increase in power consumption for 4% performance?

I mean, even if I was under water I wouldn't be doing this.

So basically just as with XOC, AS isn't as fast as FE watt for watt?

Sure, XOC at 2126 MHz and unlimited wattage(?) is going to be faster than the FE vbios at 2126 in Superposition; everyone who has run SuPo 4K on FE can see the clocks dip from 21xx down to 2000 MHz with only 300W available. But it doesn't dip with XOC and is ultimately only 2.5% faster (at 400W+)...

I don't know; I just don't want to waste my time with the AS bios. I would need to reinstall the display driver, and on my end there is more to that than usual: I have 3D Vision fixes that would need to be completely redone (The Witcher 3), a plethora of settings, and then making powercfg.bat. Then I'd have to hope (fingers crossed!) that flashing doesn't brick my card, only to end up with a 4% performance bump while needing to pump 432W through the card for said 4%, which I cannot handle since, again, I'm using the "Poor Man's Hybrid" approach:


https://www.reddit.com/r/5z91t7/mod_the_poor_mans_hybrid_h55_gtx_10xx_fe/

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You can control voltage with the curve so that it does not go over 1.093v, for example, but the card will still pull as much power as it wants at that voltage, which can vary from game to game. The only problem is trying to set up a profile for lower voltages, because the curve does not seem to work anymore, whether with the new MSI AB or new drivers; it bumps the last bins higher, making the XOC bios very dangerous.

Yeah, I'm not messing with XOC. If I were completely under water, MAYBE, but then there is the heat pumped into the room and my electricity bill to consider with a 400-500W card (for 2.5% more performance; roll-eyes emoticon).
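The trade-off being argued here is easy to put in numbers. A rough sketch, where the 300W/432W figures and the ~4% gain are taken from this discussion rather than measured, and the projected AS score is therefore an assumption:

```python
# Perf-per-watt comparison implied by the post above.
fe_watts, as_watts = 300, 432       # power limits under discussion
fe_score = 10122                    # quoted 4K Optimized score on the FE bios
as_score = round(fe_score * 1.04)   # hypothetical: the ~4% gain being debated

print(f"FE: {fe_score / fe_watts:.1f} points/W")  # → FE: 33.7 points/W
print(f"AS: {as_score / as_watts:.1f} points/W")  # → AS: 24.4 points/W
```

Under these assumptions the efficiency drops by roughly a quarter for a single-digit score gain, which is the point being made.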


----------



## ZealotKi11er

Quote:


> Originally Posted by *Mooncheese*
> 
> Ok, I'm at 10122 at 4K Optimized (same settings) and that's without whatever nvcp control panel tricks might be in vogue at the moment, it also does 10,600 GPU in Timespy. So were talking what, 4% increase here going from 300 to 432W?
> 
> 50% increase in power consumption for 4% performance?
> 
> I mean, even if I was under water I wouldn't be doing this.
> 
> So basically just as with XOC, AS isn't as fast as FE watt for watt?
> 
> Sure, XOC at 2126 Mhz and unlimited wattage(?) is going to be faster than FE vbios at 2126 in Superposition because everyone who has run Sup 4K with FE can see the clocks dip from 21XX to 2000MHz with only 300W, that it doesn't dip with XOC and ultimately is only 2.5% faster (at 400W+)...
> 
> I don't know, I just don't want to waste my time with the AS bios as I would need to reinstall the display driver, and on my end there is more to that than usual as I have 3D Vision fixes that would need to be completely redone (The Witcher 3), a plethora of settings, and then make powercfg.bat, and then hope (fingers crossed!) that flashing doesn't brick my card, only to end up with a 4% performance bump but needing to pump 432W through my card for said 4% which I cannot handle as again, I'm using the "Poor Man's Hybrid" approach:
> 
> 
> __
> https://www.reddit.com/r/5z91t7/mod_the_poor_mans_hybrid_h55_gtx_10xx_fe/
> Yeah I'm not messing with XOC. If I was completely under water MAYBE but then there is the heat pumped into the room and my electricity bill to consider with a 400-500W card (for 2.5% more performance, roll eyes emoticon).


I came to that conclusion too. After 2 GHz and 1v, Pascal overclocking is only for benchmarks and e-peen.


----------



## LunaP

Quote:


> Originally Posted by *hadesfactor*
> 
> Myself, Im partial to EVGA...not only are their cards solid, but they don't care at all if you water cool their cards....if you need to RMA the card, just put back the stock cooler and ship it's fine. If you're gonna water cool and OC, there's 2 schools of thought on it. Some say you'll hit higher OC on an AIB, and others (myself included) don't see the point in spending the extra money. Normally they are OC'ed less then you can push it anyways and silicon lottery always comes into play although they do tend to be binned better. You will have a Bios on those cards with a higher TDP but you can also flash your FE. I would say if you can't get an FE...I like the FTW3 personally.
> 
> As I said though, some people here have pushed their FE to higher clocks and some AIB's weren't able to push much passed 2050-2075....all cards should push 2000+ especially on water. I would find out what other companies let yuo water cool their products while honoring their RMA process and choose among those


Can second this. Bought 2x 1080 Tis and put the blocks on; one is dead. Called in, had the RMA approved within 5 minutes, and we're doing a cross-ship replacement, so I should have the card by Tuesday/Wednesday, hopefully. I do love the new XSPC blocks though. For now I'll just run 2x 980 Ti and 1x 1080 Ti since I still wanna test my Acer 4K 32".


----------



## Mooncheese

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I came the that conclusion too. After 2GHz and 1v Pascal overclocking is only for Benchmark and ePeen,


Yeah, my card doesn't do more than 1975 MHz at default voltage (I see 1.062v on the default FE vbios with voltage untouched), but with just +25 mV and a voltage curve it does another 100 MHz. It tends to hold 2062 MHz under 50C, and from 50 to 55C it will hold 2050 MHz unless the game induces wattage starvation, in which case the clocks dip down to 2000 MHz, occasionally 1987 MHz. Without the additional voltage the clocks still dip from 1975 down to 1939 MHz, so yeah, I'm losing ~50 MHz either way. Temps still don't exceed 55C with the AIO and usually hang around 48-53C with a relaxed fan algorithm set for low noise.

My point is, my card will not do 2000 MHz on an undervolt; at default voltage the best it will do is 1975 MHz, and power-starvation throttling drops the clocks ~50 MHz either way, whether that's from 2050 down to 2000 or from 1975 down to ~1930. So yeah, YMMV with the undervolting.

The difference between 1911 MHz (+0 core, +0 memory, 120 PT) and 2050 MHz is 8-9 FPS at 2560x1440 in The Witcher 3, which coincidentally is the cost of running the vastly superior SMAA via ReShade over the in-game AA. For me it isn't simply the difference between 100 and 108 FPS; it's 50 to 54-55 FPS, as I'm running the game in 3D Vision, and the game stutters at anything under say 55-57 FPS for whatever reason. With the OC I can bridge the gap between 55 and 60 FPS and alleviate stuttering in 90% of the game's environments, including and especially Novigrad, with the option of disabling ReShade on the fly for another 3-5 FPS in truly demanding areas.

For me it's not about measuring e-peen; I really need every ounce of performance I can wring out of this card, and I couldn't care less about a few hundred more points in some benchmark. Hence the need to break down and explicitly ask what kind of performance gains, if any, are to be had between the Arctic Storm vbios and the FE vbios before wasting my time flashing, reinstalling the display driver and all the various 3D Vision fixes and NVCP tweaks, and creating a powercfg.bat file, only to dial in the new vbios and find that either (a) the performance gains are a wash, as with the XOC vbios, or (b) my hybrid-cooled card wasn't meant to handle 360W and now I have to hope Nvidia doesn't notice that the card I'm sending them under warranty has a different vbios. Or, the more likely outcome: I'd be disappointed with it and have to revert to the default vbios and reinstall the display driver all over again.

If anyone else out there has played around with the AS vbios on their FE and wants to share their findings please do.

Edit:

Actually, it holds 2075 MHz @ 99% load in Titanfall 2, sustained, even up to 50C, which it generally doesn't exceed because PT, going by the graph of the last session, stays between 90 and 105%. 2075 MHz with only +25 mV is a big step up from 1975 MHz with no added voltage. It's only a few games that push PT high, such as TW3 in 3D Vision, where the clocks dip all the way down to 2000 MHz (occasionally; typically they dip to 2012-2025 MHz).


----------



## dVeLoPe

Hey friends, I have a standard EVGA 1080 and will be putting in my Step-Up soon for the big bad Ti Black Edition or whatever it's called.

I know that's like a very basic card; what should I expect OC-wise?


----------



## Dasboogieman

Quote:


> Originally Posted by *dVeLoPe*
> 
> hey friends i have a standard evga 1080 and will be putting my step up soon for the big bad Ti black edition or w.e its called.
> 
> i know thats like a very basic card what should i expect oc wise?


They're all the same chip quality, RNG-wise. What matters most is how high a TDP your bios allows and how cold you can get the chip.


----------



## WillG027

Quote:


> Originally Posted by *Mooncheese*
> 
> Yeah my card doesn't do more than 1975 MHz with default voltage (seeing 1.062v with default FE vbios and untouched voltage) and only +25 mv and a voltage curve and it does another 100 Mhz, and tends to hold 2062 MHz under 50C and from 50 to 55C it will hold 2050 MHz unless the game induces wattage starvation and then the clocks will dip down to 2000 MHz, occasionally 1987 MHz.
> 
> SNIP
> 
> If anyone else out there has played around with the AS vbios on their FE and wants to share their findings please do.
> 
> .


Your card is still performing better than mine.

My MSI Gaming X will only do 1940 out of the box at the default 1.063V, and crashes at anything 2000 or above. Max memory without losing speed is about +200, so not a great silicon-lottery draw.
I just decided to undervolt to 1V at 1936 and kept an aggressive fan profile; temps never stray from the 45-55C mark.

The difference between pushing it to max power draw at 2000-2025 and running 1936 undervolted is not noticeable in everyday use, and the card is significantly cooler and pulls way fewer watts.

A small OC with an undervolt to 1V is the way to go on these 1080 Tis, especially on air.


----------



## pcgaming247

Which 1080 Ti should you get if you're watercooling?
Will the VRMs and other differences make a difference?
So far, from my research, the EVGA FTW is the best.


----------



## Dasboogieman

Quote:


> Originally Posted by *pcgaming247*
> 
> which 1080ti to get if youre watercooling?
> will the vrms and other differrences make a differrence?
> so far from my research the evga ftw is the best


Under water, the only thing you should care about with the VRMs is their efficiency: high efficiency means less heat.

The best would be the ASUS Strix. Absolutely quality components across the board, and possibly the most efficient MOSFETs I've seen to date (the datasheet mentions 89-94% efficiency), which means much less heat dumped into your loop.

The Strix also has full access to the XOC bios without regressing performance, if you want a fully TDP- and voltage-unlocked card. YMMV if you use the XOC bios on any other card.


----------



## leonman44

Guys, I have a 980 Ti with a custom bios at 1560/8000, but it's hard to hit and maintain 180 fps (my monitor is 180 Hz 1080p). I am looking for an 80-100 fps gain to be safe; is this card going to make it? The MSI 1080 Ti Sea Hawk EK.
We are always talking about a fully OCed card.


----------



## Radox-0

Quote:


> Originally Posted by *leonman44*
> 
> Guys , i have a 980ti with custom bios 1560/8000 but its hard to hit and maintain 180 fps (my monitor is 180hz 1080p) , i am looking for a 80-100fps gain to be safe , is this card going to make it? 1080TI MSI sea hawk ek
> Always we are talking about a full oced card


How is your CPU usage? I would expect that at 1080p, or rather on any very high refresh rate panel, there is also a heavy load on the CPU. I would look at CPU usage on a per-core basis. If your CPU is fine and the bottleneck is at the GPU, then sure, depending on what game you play a 25-35% gain in FPS is feasible.
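A back-of-envelope projection of that estimate against the 180 fps target. The baselines are hypothetical, since the actual 980 Ti frame rates weren't posted:

```python
# Project a 25-35% GPU-side gain onto assumed 980 Ti baselines.
for base_fps in (100, 120):          # hypothetical current averages
    for gain in (0.25, 0.35):        # the feasible range quoted above
        projected = base_fps * (1 + gain)
        print(f"{base_fps} fps +{gain:.0%} -> ~{projected:.0f} fps")
```

On these assumed baselines even the optimistic case (~162 fps) lands short of a locked 180 fps, which is why checking where the bottleneck actually sits matters before upgrading.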


----------



## leonman44

Quote:


> Originally Posted by *Radox-0*
> 
> How is your CPU usage also? I would expect that at 1080p or rather any very high refresh rate panel, there is also a heavy load on the CPU also. I would look at CPU usage on a per core basis also. If your CPU is fine and if the boundry is at the GPU, then sure depending on what game you play a 25%-35% gain in FPS is feasible.


Sadly, I am aware of this, but I have an i7-5820K OCed at 4.5 GHz; I mean, if this CPU isn't enough, then what is? I chose the six cores back then so I could have more stable fps even if something was running in the background, and I think I was right. I don't think I will have any CPU bottleneck for years.


----------



## Lefty23

Quote:


> Originally Posted by *Mooncheese*
> 
> Going back to my original comment, I'm interested to see some Arctic Storm and FE game and benchmark comparison.


After reading all your posts I would say that the AS or FTW3 BIOS could have been good candidates for your use case.
Unfortunately, the only way to know for sure would have been to try it on your card, which is something you are not willing to do. I totally understand the reason behind this, as you explained it.

I have been following this thread since the beginning, and I have seen so many conflicting reports on BIOS crossflashing that the best thing I did was try for myself.
I went through the results I kept last month when I was testing the AS, and it seems with my card (EVGA FE, under a full waterblock, *not shunt modded*) it is about 1% faster than stock at the same clocks in synthetics.

Now, I do not own The Witcher 3, but I reinstalled the AS BIOS a couple of hours ago and ran the Metro LL in-game benchmark at 2062 core @ 1.093v (you didn't mention memory, or did I miss it?) to make sure the power limit is hit and the card downclocks.
I left the AC off (32-33C room temp) and ran the fans/pump low so that I got up to ~55C core temp under load.

These are averages from 3 runs using the FE BIOS:
1440p
- at 120% PL Min/Avg/Max 21.41/*72.67*/195.81
4K
- at 120% PL Min/Avg/Max 9.69/*30*/180.18

These are averages from 3 runs using the AS BIOS:
1440p
- at 100% PL Min/Avg/Max 20.12/72/222.02
- at 120% PL Min/Avg/Max 10.01/*72.67*/252.02
4K
- at 100% PL Min/Avg/Max 8.79/30/177.81
- at 120% PL Min/Avg/Max 8.8/*30*/187.62

So for Metro LL, for example, it doesn't look like it is worth the hassle.
Maybe someone who owns The Witcher 3 can do something similar for you. Not sure if that game has an in-game benchmark though; otherwise the results could be semi-random.

EDIT: OK, I was curious, so I also tested the other two games with built-in benchmarks that I have installed (Ghost Recon Wildlands & Rise of the Tomb Raider).
Same clocks as above, but for GR:W I had to add +500 on the memory in all runs in order to force it to hit the power limit.

It seems to improve performance for GR:W (+3.6%) but not so much for RoTR (+0.16%) at 1440p.


Spoiler: Warning: Spoiler!



Ghost Recon Wildlands
- Stock
120% PL FPS Min/Avg/Max 56.72/63.94/70.86
- AS
100% PL FPS Min/Avg/Max 57.26/65.78/74.85
120% PL FPS Min/Avg/Max 58.76/66.79/76.24

Rise of Tomb Raider
- Stock
120% PL Overall 67.03fps Mountain Peak 41.79/84.49/126.46 --- Syria 40.80/56.10/84.29 --- Geothermal Valley 40.96/58.66/72.19
- AS
100% PL Overall 66.70fps Mountain Peak 44.13/84.29/131.58 --- Syria 28.06/55.72/83.01 --- Geothermal Valley 40.70/58.24/72.74
120% PL Overall 67.14fps Mountain Peak 25.30/83.17/134.13 --- Syria 35.63/56.79/87.04 --- Geothermal Valley 43.50/59.75/74.93



For 4K, performance is improved for both GR:W (+2.9%) and RoTR (+3.6%).


Spoiler: Warning: Spoiler!



Ghost Recon Wildlands
- Stock
120% PL FPS Min/Avg/Max 34.00/40.83/45.91
- AS
100% PL FPS Min/Avg/Max 34.28/41.93/47.20
120% PL FPS Min/Avg/Max 35.43/42.03/48.04

Rise of Tomb Raider
- Stock
120% PL Overall 31.46fps Mountain Peak 11.28/39.74/97.79 --- Syria 12.80/27.56/52.15 --- Geothermal Valley 11.65/27.56/52.15
- AS
100% PL Overall 32.00fps Mountain Peak 13.33/40.41/61.55 --- Syria 18.92/26.13/37.29 --- Geothermal Valley 12.50/28.45/62.63
120% PL Overall 32.60fps Mountain Peak 19.76/41.05/98.17 --- Syria 13.04/28.79/42.72 --- Geothermal Valley 13.09/28.95/42.72



Seeing the anomaly at 1440p for RoTR, I retested with similar results (5 runs each).
I'm consistently getting that weird result (very low min, low avg, high max) for the Mountain Peak test @ 1440p when using 120% PL with the AS.
No idea what is going on there.
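
The percentage gains above can be reproduced from the averages. A minimal sketch: the numbers plugged in are the GR:W and RoTR 4K figures from this post, and the helper is just the standard relative-change formula.

```python
# Percent gain in average FPS between two BIOS runs.
# The inputs are the 4K averages from the post above.

def pct_gain(stock_avg: float, test_avg: float) -> float:
    """Relative FPS change, in percent."""
    return (test_avg / stock_avg - 1.0) * 100.0

grw_stock_120 = 40.83  # GR:W, stock BIOS, 120% PL, avg FPS at 4K
grw_as_120 = 42.03     # GR:W, Arctic Storm BIOS, 120% PL, avg FPS at 4K

print(f"GR:W 4K, AS vs stock at 120% PL: {pct_gain(grw_stock_120, grw_as_120):+.1f}%")
```

The same one-liner applied to the RoTR 4K overall scores (32.60 vs 31.46) gives the +3.6% quoted above.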


----------



## Maximization

hiya guys submitted info

http://www.3dmark.com/3dm/21647960


----------



## Frosty288

No matter what I do with the voltage slider in MSI Afterburner (I followed the guide on the front page), it looks like I'm stuck at 775 mV. How are you guys even getting over 1 V?


----------



## coc_james

Quote:


> Originally Posted by *Frosty288*
> 
> No matter what I do with the voltage slider (followed the guide on the front page) for MSI afterburner, it looks like i'm stuck at 775mv - how are you guys getting to even over 1v?


You sure that's not your idle volts? How are you checking your voltage?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Frosty288*
> 
> No matter what I do with the voltage slider (followed the guide on the front page) for MSI afterburner, it looks like i'm stuck at 775mv - how are you guys getting to even over 1v?


Unless your card is stuck in a non-boost state, you should have way more voltage. Use DDU to remove the driver, and if that fails, do a fresh Windows install.


----------



## Mooncheese

Quote:


> Originally Posted by *WillG027*
> 
> Your card is still performing better than mine.
> 
> MSI Gaming X will only do 1940 out of box at default 1.063V. Crashes anything 2000 or above. Max mem ( without losing speed is about +200, so not very in good silicon lottery )
> Just decided to under-volt to 1V at 1936 and kept an aggressive fan profile - temps never stray from the 45 - 55 degrees mark.
> 
> The difference between when I pushed it to max power draw at 2000 - 2025 and at 1936 under-volted are not noticeable in every day use and the card is significantly cooler and pulls way less watts.
> 
> Small OC and undervolt to 1V is the go for these 1080ti - especially on air.


Actually, if it is any better it isn't much faster. Last night while playing Batman: AK (2D, G-Sync, 90 FPS via edited .ini, + ReShade for SMAA and LumaSharpen) I got a black screen fairly quickly, within 15 minutes of gameplay, with the clocks at 2075 @ 1.075v (+400 MHz memory; any higher than this yields diminishing returns with my particular sample). So it was back to the drawing board. I upped the voltage to 1.093v, so now it does 2088 MHz at 1.093v, which was stable in Firestrike and Sup 4K, but I need to do the Fire Strike stress test this morning. Ultimately I may need to dial the peak core back to 2075 or even 2062 at 1.093v. On the curve the voltage drops to 1.062v at 2050 MHz. The crash occurred at 2075 MHz last night, so I know for certain that frequency needs more than an indicated 1.075v. Batman: AK doesn't even need an OC; core load is like 60% with everything maxed, with ReShade, at 90 FPS at 2560x1440 with GameWorks (in scenes with actual PhysX going on, the FPS does dip down to 60 or so with 99% load, though, so I was hoping that keeping the clocks high would offer some FPS there).

But yeah, it's completely luck of the draw with Big Pascal this time around, unfortunately. Not sure if EVGA is offering 75 and 80% ASIC for $ with Kingpin this time around, but yeah, they are a bit late to the game. I picked up my FE 5 minutes into the initial pre-order at the beginning of March, and now that NGreedia has pretty much said they aren't rushing Volta, I feel smug knowing I will have a top-tier card for that much longer. I like to get at least 2 years out of my cards; if the 1180 Ti doesn't release until mid 2019, I have a better chance of that now.

As far as undervolting: as I stated in my previous comment, I'm seeing wattage-starvation downclocking of around 50 MHz irrespective of voltage and of where the frequency sits. If I run default voltage at 1975 MHz, the clocks dip down to 1930 or so; if it's +37 on the voltage slider and 2075 MHz, they dip down to 2012. Temps don't exceed 55C max even with the overvolt. YMMV with undervolting.

Quote:


> Originally Posted by *Lefty23*
> 
> After reading all your posts I would say that the AS or FTW3 bios could have been good candidates for your use case.
> Unfortunatelly, the only way to know for sure would have been to try it in your card which is something you are not willing to do. I totaly understand the reason behind this as you explained it.
> 
> I have been following this thread since the beginning and I have seen many conflicting reports on bios crossflashing that the best thing I did was try for my self.
> I went through the results I kept last month when I was testing the AS and it seems with my card (EVGA FE, under full waterblock, *not shunt modded*) it is about 1% faster than stock at the same clocks in synthetics.
> 
> Now, I do not own the Witcher 3 but, I reinstalled the AS bios a couple of hours ago and run the Metro LL ingame benchmark at 2062 core @1.093v (you didn't mention memory or did I miss it?) to make sure the power limit is hit and the card downclocks.
> I left the AC off (32C-33C room temp) and run the fans/pump low so that I got up to ~55C core temp under load.
> 
> This is averages from 3 runs using the FE bios
> 1440p
> - at 120% PL Min/Avg/Max 21.41/*72.67*/195.81
> 4K
> - at 120% PL Min/Avg/Max 9.69/*30*/180.18
> 
> This is averages from 3 runs using the AS bios
> 1440p
> - at 100% PL Min/Avg/Max 20.12/72/222.02
> - at 120% PL Min/Avg/Max 10.01/*72.67*/252.02
> 4K
> - at 100% PL Min/Avg/Max 8.79/30/177.81
> - at 120% PL Min/Avg/Max 8.8/*30*/187.62
> 
> So for Metro LL for example it doesn't look like it is worth the hassle.
> Maybe someone who owns Witcher 3 can do something similar for you. Not sure if that game has an ingame benchmark though, otherwise the results could be semi-random.
> 
> EDIT: ok I was curious so I tested also the other two games with build-in benchmarks I have installed (Ghost Recon Wildlands & Rise of Tomb Raider).
> Same clocks as above but for GR:W I had to add in all runs 500 on the memory in order to force it to hit the power limit.
> 
> Seems to improve the performance for GR:W (+3.6%) but, not so much for RoTR (+0.16%) at 1440p
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Ghost Recon Wildlands
> - Stock
> 120% PL FPS Min/Avg/Max 56.72/63.94/70.86
> - AS
> 100% PL FPS Min/Avg/Max 57.26/65.78/74.85
> 120% PL FPS Min/Avg/Max 58.76/66.79/76.24
> 
> Rise of Tomb Raider
> - Stock
> 120% PL Overall 67.03fps Mountain Peak 41.79/84.49/126.46 --- Syria 40.80/56.10/84.29 --- Geothermal Valley 40.96/58.66/72.19
> - AS
> 100% PL Overall 66.70fps Mountain Peak 44.13/84.29/131.58 --- Syria 28.06/55.72/83.01 --- Geothermal Valley 40.70/58.24/72.74
> 120% PL Overall 67.14fps Mountain Peak 25.30/83.17/134.13 --- Syria 35.63/56.79/87.04 --- Geothermal Valley 43.50/59.75/74.93
> 
> 
> 
> For 4K, performance is improved for both GR:W (+2.9%) and RoTR (+3.6%).
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Ghost Recon Wildlands
> - Stock
> 120% PL FPS Min/Avg/Max 34.00/40.83/45.91
> - AS
> 100% PL FPS Min/Avg/Max 34.28/41.93/47.20
> 120% PL FPS Min/Avg/Max 35.43/42.03/48.04
> 
> Rise of Tomb Raider
> - Stock
> 120% PL Overall 31.46fps Mountain Peak 11.28/39.74/97.79 --- Syria 12.80/27.56/52.15 --- Geothermal Valley 11.65/27.56/52.15
> - AS
> 100% PL Overall 32.00fps Mountain Peak 13.33/40.41/61.55 --- Syria 18.92/26.13/37.29 --- Geothermal Valley 12.50/28.45/62.63
> 120% PL Overall 32.60fps Mountain Peak 19.76/41.05/98.17 --- Syria 13.04/28.79/42.72 --- Geothermal Valley 13.09/28.95/42.72
> 
> 
> 
> Seeing the anomaly at 1440p for RoTR I retested with similar results (5 runs on each).
> I'm consistently getting that weird result (very low min, low avg, high max) for the Mountain Peak test @1440p when using 120% PL with AS.
> No idea what is going on there.


Well this is pretty much exactly what I wanted to see so thank you and + rep!!!!

Peak FPS really went up in Metro LL, but it's really the averages that are of practical import here, and there seems to be no substantive difference between the AS and FE, unless the benchmarks are CPU-dependent and the averages are being heavily skewed by a CPU bottleneck. If we could completely eliminate the CPU from the comparison, we would have a much clearer picture, I think. But yeah, thanks Lefty. I think I'm just going to sit tight with the default FE vBIOS, as the only real gains in your data are peak FPS, and that's really only with the AS at 120% (432W), which I'm not sure my AIO-cooled card can handle.


----------



## Mooncheese

Gamers Nexus did a power-draw cost comparison between the RX Vega 56 and GTX 1070. Since the difference was around 100W, roughly what you can expect (or more) between an FE card and a variant (or an FE with a variant vBIOS) drawing 100W+ more at the wall, I figured some of you might be interested to see how much that 2.5-5% performance improvement costs on an annual basis, to say nothing of the environmental impact and the heat produced locally, hopefully not with an air-cooled card dumping said heat into your case to be distributed over your motherboard, memory, and other components:






Edit:

Found this in the comments below the video:

"i live in Spain and in summer we are usually at 35 - 40ºC ( 95-105 ºF) and 100w extra of heat in a room without AC makes you wanna die, i even have my gtx 1070 at 70-75% power limit﻿"
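
For anyone who wants to put a number on it, the annual-cost arithmetic is simple. A minimal sketch: the 100 W delta is the figure discussed above, while the hours per day and the price per kWh are hypothetical inputs you should replace with your own.

```python
# Annual electricity cost of extra GPU power draw.
# The 100 W delta is from the comparison above; the usage hours
# and price per kWh are hypothetical inputs -- substitute your own.

def annual_cost_usd(extra_watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    """Yearly cost of the extra draw, assuming daily gaming hours."""
    kwh_per_year = extra_watts / 1000.0 * hours_per_day * 365.0
    return kwh_per_year * usd_per_kwh

# 100 W extra, 4 hours of gaming a day, $0.12/kWh
print(f"${annual_cost_usd(100, 4, 0.12):.2f} per year")
```

At those example inputs the extra draw works out to 146 kWh a year; at higher rates like Australia's, scale accordingly.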


----------



## Mooncheese

Update:

I HAVE SEEN THE LIGHT.

So, in The Witcher 3 (3D Vision), outside of Novigrad in a wooded area where I could induce sustained 99% load, I went back and forth between an AB profile of +190 core / +400 memory at 1.093v (curve) and a profile of +88 core / +400 memory with no additional voltage, initially out of curiosity to see how high a clock would hold under wattage starvation at what would otherwise be 2088 MHz. To my initial dismay, the more aggressive profile was dipping all the way down, not to the 2000 MHz I could have sworn I saw earlier, but to 1936 and even 1911 MHz, bouncing between, say, 2025, 2038, 1962, 1936, and 1911 every other second, with the FPS going between 51 and 53 with the clock fluctuations, and with voltage around 1.050v above 2025 MHz and 1.043v at 1975 MHz. The situation was the same with the less aggressive profile, with nearly identical FPS, so out of curiosity I figured I would attempt to undervolt the card.

So I pulled the curve back up, increased the core clock at 1.000v to 1974, flattened everything beyond that point, set everything before it on a linear-looking curve, and fired TW3 back up, and boy was I surprised. The clocks now held steady at 1974 MHz: no fluctuations, no dipping down to 1936 and 1911, and no pointless bursts of 1.050v up to 2025 MHz. The FPS was now around 54-55 and holding steady instead of dipping to 52 and back up to 54, and the temps were also around 2-3C lower. To my surprise I didn't lose a bin over 50C; it held 1974 MHz all the way out to 53C, keeping that scene going for about 5 minutes to see if it could even hold 1974 MHz at 1.000v.

So at least in TW3 undervolting the card is the way to go, and presumably any other game or benchmark where wattage starvation is to be expected.
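
The curve edit described above can be sketched in code. This is a minimal, hypothetical model of an Afterburner-style voltage/frequency curve where every point at or above a chosen voltage is clamped to one frequency: the 1.000 V / 1974 MHz pair is from this post, but the sample curve points themselves are invented for illustration.

```python
# Hypothetical model of flattening an Afterburner-style V/F curve:
# every point at or above the cap voltage gets the cap frequency,
# so the card never requests more voltage in exchange for higher
# clocks. The 1.000 V / 1974 MHz pair is from the post above; the
# sample curve points are invented.

def flatten_curve(curve, cap_voltage, cap_mhz):
    """Return a new (voltage, MHz) curve clamped at cap_voltage."""
    return [(v, cap_mhz if v >= cap_voltage else min(mhz, cap_mhz))
            for v, mhz in curve]

sample = [(0.900, 1835), (0.950, 1911), (1.000, 1949),
          (1.050, 2012), (1.093, 2088)]

for v, mhz in flatten_curve(sample, 1.000, 1974):
    print(f"{v:.3f} V -> {mhz} MHz")
```

The net effect is what the post describes: the card sits at one fixed clock/voltage point under load instead of bouncing between bins.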

I'm going to do some stability testing to see if it holds but yeah, I am pleasantly surprised with this.

Kraxkill et al., I thought you were crazy, but yeah, you're right about undervolting.

I'm tempted to see what 2000 MHz at 1.012v looks like, but I need to ensure 1974 MHz is stable first. Maybe some of the instability is related to clocks and voltages spiking, and maybe it will hold 2000 MHz at 1.000v, where previously the perceived instability wasn't the product of insufficient voltage but of highly erratic clock and voltage swings.

Edit:

It just passed the Fire Strike stress test at 99.2%. Particularly noteworthy: the last time I ran this with the clocks at 2075 MHz and 1.075v, although it "passed" it had a score of 97%, and at the very end I got the Windows Aero warning that is indicative of an unstable OC ("Do you want to change the color scheme?"). Also, with the higher OC and voltage, the clocks were actually at 2025 and 2012 MHz because of wattage starvation, and the temps were 55C at the end of the stress test, whereas they were at 50C with 1.000v this time around. Oh, and it does 1987 MHz out to 49C, where it drops to 1974 and holds, presumably until the next temperature-bin drop. I'm tempted to try for 2000 MHz at 1.000v (at least until 49C) but want to ensure it's stable here first. Going to do a Sup 4K bench to see what kind of performance difference there is, but yeah, undervolting is totally the way to go with this card.

Update:

It's holding 2000 MHz at 1.000v. It just passed the Fire Strike stress test at 99.1%, then endured 20 minutes of Unigine Heaven, about an hour of Batman: AK, and some TW3 time to see what the clocks are doing.

It holds 2 GHz up to 49C where it drops to 1987 MHz. Max temps in Fire Strike Stress Test and 20 minutes of Heaven was 50C so it's about 4-5C cooler than with the overvolt.

It pulled a 14807 GPU score in Firestrike Extreme, compared to 15110 with an OC of +190 and 1.093v (the overvolt profile only managed 2062 MHz peak and fluctuated between 2025 and 2038 MHz in GPU Test 2; in GPU Test 1 it fluctuated between 1987 and 2000 MHz, but at much higher voltage). Temps were 43C with the undervolt and 46C with the overvolt. The fact that the performance disparity is solely owed to GPU Test 2 leads me to conclude that the undervolt is absolutely the way to go if the game or benchmark induces power starvation.

I'm tempted to try for 2012 or even 2025MHz with 1.000v but am going to wait a few days and see if it is 100% stable at 2000 MHz first.


----------



## Frosty288

Quote:


> Originally Posted by *coc_james*
> 
> You sure that's not your idle volts? How are you checking your voltage?


Yeah, sorry guys, that was idle voltage. I'm an idiot.


----------



## GRABibus

Got a GIGABYTE Gaming OC 11G today :

https://www.gigabyte.com/Graphics-Card/GV-N108TGAMING-OC-11G#kf

It is a very, very good clocker despite the cooling. It is a high-end air cooler with 3 fans, but of course 70°C is easily reached in benchmarks or games.
I set the fan to 100% and the max temp is 62°C at 24°C ambient.

This card can sustain 2088 MHz at 1.075V in Titanfall 2.
I can bench with some peaks at 2126 MHz...

This card is really good, and with better cooling it should improve further.









Bad point: it has no backplate.


----------



## Mooncheese

Quote:


> Originally Posted by *GRABibus*
> 
> Got a GIGABYTE Gaming OC 11G today :
> 
> https://www.gigabyte.com/Graphics-Card/GV-N108TGAMING-OC-11G#kf
> 
> It is a very very goood clocker, despite the cooling. It is high end air cooler with 3 fans, but of course, 70°C are easily reached in bench or games.
> i set the fan to 100% and max temp is 62°c at 24°C ambient.
> 
> This card can sustain 2088Mhz at 1.075V in Titan Fall 2.
> I can bench with some peaks at 2126MHz...
> 
> This card is really good and should be now better cooled to improve again
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Bad point : it has no back plate.


8+6 power, so you're probably in the same boat as I am with the FE. If you read my posts above, you will find that undervolting is the way to go...

Edit:

It's stable in Titanfall 2, and temps are a good 5C lower than previously (45C). If you're on air, you definitely want to try undervolting.

REALLY tempted to try 2012 MHz at 1.000v lol


----------



## TheRedViper

Quote:


> Originally Posted by *Mooncheese*
> 
> 8+6 power, so youre probably in the same boat as I am with FE, well if you read my posts above you will find that undervolting is the way to go.....
> 
> Edit:
> 
> It's stable in Titanfall 2, and temps are a good 5C lower than previously (45C). If youre on air you definitely want to try undervolting.


3x8 pins here ?


----------



## Alex132

Quote:


> Originally Posted by *TheRedViper*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mooncheese*
> 
> 8+6 power, so youre probably in the same boat as I am with FE, well if you read my posts above you will find that undervolting is the way to go.....
> 
> Edit:
> 
> It's stable in Titanfall 2, and temps are a good 5C lower than previously (45C). If youre on air you definitely want to try undervolting.
> 
> 
> 
> 3x8 pins here ?
Click to expand...

Anything more than 2x8 (or 6, depending on how you want to view it) is an utter waste for a 1080 Ti in any circumstance possible.


----------



## GraphicsWhore

Quote:


> Originally Posted by *Mooncheese*
> 
> Update:
> 
> I'm tempted to try for 2012 or even 2025MHz with 1.000v but am going to wait a few days and see if it is 100% stable at 2000 MHz first.


A few days? You have evidence 2Ghz is stable, so just run 2025. If a no-go, try 2012.

Seems like you could save yourself some time.


----------



## Mooncheese

Quote:


> Originally Posted by *Alex132*
> 
> Anything more than 2x8 (or 6, depending on how you want to view it) is an utter waste for a 1080 Ti in any circumstance possible.


Except LN2 and dry ice, and e-peen stroking, of course.

As a follow-up to my posts above, it held 2012 MHz throughout the Firestrike Extreme stress test and then Sup 4K Optimized at 1.000v, and was 4C lower than the overvoltage run (48C vs 52C), with a score of 9950. This is with G-Sync on; not sure if that is detrimental to Sup 4K as it is with Fire Strike.

Windows 7
378.78

Leaving it here for now. Really glad I tried undervolting before flashing a 400W vBIOS. All I wanted was to eliminate the dips down to 1936 and 1911 MHz; that I can do that, and with lower temps, is a win-win.


----------



## WillG027

Nice results Mooncheese !

Overall, when weighing the benefits, undervolting this particular beast is very advantageous.

Lower power draw, smoother gameplay (no power-limit oscillation), less stress on the card, and less heat dumped into your case and room. Win-win-win.

I'm experimenting with 1873 at 0.950V now...
(only at 60 Hz 1440p, so I don't require a max overclock)


----------



## Alex132

Quote:


> Originally Posted by *Mooncheese*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Anything more than 2x8 (or 6, depending on how you want to view it) is an utter waste for a 1080 Ti in any circumstance possible.
> 
> 
> 
> Except LN2 and dry ice, and Epeen stroking of course.
Click to expand...

No. It's useless even in LN2.


----------



## Amuro

^ I'm at 60% power limit, non-curve, +75 core / -502 (minus) memory = 150 watts GPU power draw, still easily beating Vegas.


----------



## iamjanco

Anyone running sli'd 1080ti's using riser cables?


----------



## Mooncheese

Update:

I wanted to compare the undervolt with the former overvolt, so I ran Firestrike Extreme at 2075 MHz with 1.075v (not even 1.093v, and this isn't even stable, but it passed the bench) and then with the undervolt profile of 2012 at 1.000v, and the results were within 1% of each other. With the overvolt profile, in GPU Test 1 the clocks were dipping all the way down to BELOW where the undervolted profile could hold (1987-2000 MHz vs. 2012 MHz):

http://www.3dmark.com/compare/fs/13403253/fs/13403635

This is just Firestrike, and it is nowhere near as demanding as The Witcher 3 in 3D Vision, which has such severe power starvation that the clocks don't simply dip to 1987-2000 MHz but to 1911-1936 MHz, so it's not really indicative of real-world performance.

To reiterate, it actually needs 1.093v to hold 2075 MHz. (I deleted the overvolt curve and didn't feel like making another one, so I simply set the voltage to +37 in AB, which isn't actually enough.) With the old overvolt curve, what happens in TW3 in 3D Vision is that it runs at 2075 MHz @ 1.093v at around 80% load, and then if the scene becomes demanding it power-throttles down to 1911-1936 MHz, so it's completely pointless: the additional voltage hikes the temps up from 49-50C to 55C, costing one bin of frequency when you're not at full load, and then you lose 60-75 MHz when there is power starvation.

Quote:


> Originally Posted by *GraphicsWhore*
> 
> A few days? You have evidence 2Ghz is stable, so just run 2025. If a no-go, try 2012.
> 
> Seems like you could save yourself some time.


I might. It does 2025 MHz off the bat under 39C, then dips down to 2012 until 49C. So far so good; I might push for 2025.
Quote:


> Originally Posted by *WillG027*
> 
> Nice results Mooncheese !
> 
> Overall, when weighing the benefits, undervolting on this particular beast, is very advantageous.
> 
> Lower power draw, smoother gameplay (no power limit oscillation), less stress on card and less heat dumped into your case and room. Win-win-win.
> 
> I'm experimenting with 1873 at .950 V now . . .
> (only at 60hz 1440p so don't require max overclock)


Why so low, are you on air? If you have an FE, I HIGHLY recommend adding an AIO. I'm not exaggerating: it is dead silent with the single 140mm fan at 50% RPM exhausting out of the case.


https://www.reddit.com/r/5z91t7/mod_the_poor_mans_hybrid_h55_gtx_10xx_fe/
Quote:


> Originally Posted by *Alex132*
> 
> No. It's useless even in LN2.


Damn...


----------



## TheRedViper

Quote:


> Originally Posted by *Alex132*
> 
> Anything more than 2x8 (or 6, depending on how you want to view it) is an utter waste for a 1080 Ti in any circumstance possible.


You seem to forget unlocked cards like the Galax and Kingpin. An 8-pin gives you a max of about 325W in a best-case scenario. Anything above 2 on water is excessive, but the Kingpin was seen pulling 1300W peak on LN2.
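
For context on the connector argument, the in-spec board-power ceiling can be tallied from the PCIe spec values (75 W slot, 75 W 6-pin, 150 W 8-pin). A minimal sketch; as noted above, unlocked cards on LN2 pull far beyond these spec numbers, so this is the rated budget, not a hard electrical limit.

```python
# In-spec power budget by connector layout (PCIe CEM spec values:
# 75 W slot, 75 W per 6-pin, 150 W per 8-pin). Real cards can and
# do exceed these under extreme overclocking, as noted above.

SPEC_WATTS = {"slot": 75, "6pin": 75, "8pin": 150}

def spec_budget(six_pins: int = 0, eight_pins: int = 0) -> int:
    """Total rated board power for a given connector layout."""
    return (SPEC_WATTS["slot"]
            + six_pins * SPEC_WATTS["6pin"]
            + eight_pins * SPEC_WATTS["8pin"])

print("FE (6+8):", spec_budget(six_pins=1, eight_pins=1), "W")   # 300 W
print("2x 8-pin:", spec_budget(eight_pins=2), "W")               # 375 W
print("3x 8-pin:", spec_budget(eight_pins=3), "W")               # 525 W
```

Which is why, on ambient cooling, 2x 8-pin already exceeds anything a 1080 Ti will sustain, per Alex132's point.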


----------



## WillG027

Quote:


> Originally Posted by *Mooncheese*
> 
> Update:
> 
> Why so low, are you on air? If you have FE I HIGHLY recommend adding an AIO. I'm not exaggerating, it is dead silent with the single 140mm fan at 50% RPM exhausting out of the case.
> 
> Peak temps of 50C with the undervolt, dead silent, all heat exhausted out of case, VRM area pretty much the same as with the vapor chamber:


Yep, on air, so it's limiting my max OC; plus, I don't have the best clocker.

The fact that the card is pretty average just doesn't make it worth it to pump extra power through it. The gain per extra watt is not worth it.
As it is, at 1V the card sees a max of 43-49 degrees C during a multi-hour session (ambient 18-22C), so watercooling is just not worth it in my situation.


----------



## evosamurai

Has anyone done Tin's uncorking mod? Looking for some peeps who've done it; I have some questions. I also soldered resistors on over the shunts.


----------



## Mooncheese

Quote:


> Originally Posted by *WillG027*
> 
> Yep, on air so it's limiting my max OC - plus don't have the best clocker.
> 
> The fact that the card is pretty average just doesn't make it worth it too pump extra power through it. The gains-to-extra power is not worth it.
> As it is, at 1V the card sees a max of 43 - 49 degrees C during a multiple hour session (ambient 18 - 22C) - so watercooling is just not worth it in my situation.


Wow 49C on air that is fantastic, what card exactly?


----------



## WillG027

Quote:


> Originally Posted by *Mooncheese*
> 
> Wow 49C on air that is fantastic, what card exactly?


MSI Gaming X
Most of the time the card is around 41-46C, with the occasional spike to 49C. (Winter in the southern hemisphere right now, though.)

I'm very impressed with the cooler. It is really quiet too; with the undervolt it holds that temp without ever going over 50-60% fan speed.


----------



## Pandora's Box

Quote:


> Originally Posted by *WillG027*
> 
> MSI Gaming X
> Most of the time the card is around 41-46 Deg C with the occassional spike to 49 C. (Winter in southern hemisphere right now tho)
> 
> I'm very impressed with the cooler. It is really quiet too - with the undervolt it cools at that temp without ever going over 50-60% fan speed too.


Loving my Gaming X too. Mine runs 2000 MHz at 1 volt. No throttling from temp or power. The card runs at 70-75C at 70% fan speed.


----------



## Mooncheese

Quote:


> Originally Posted by *WillG027*
> 
> MSI Gaming X
> Most of the time the card is around 41-46 Deg C with the occassional spike to 49 C. (Winter in southern hemisphere right now tho)
> 
> I'm very impressed with the cooler. It is really quiet too - with the undervolt it cools at that temp without ever going over 50-60% fan speed too.


Wow that is seriously impressive, you in Australia? Is electricity expensive down there?


----------



## Pandora's Box

Quote:


> Originally Posted by *Mooncheese*
> 
> Wow that is seriously impressive, you in Australia? Is electricity expensive down there?


https://en.m.wikipedia.org/wiki/Electricity_pricing#Global_comparison


----------



## WillG027

Quote:


> Originally Posted by *Mooncheese*
> 
> Wow that is seriously impressive, you in Australia? Is electricity expensive down there?


I have an mATX case with 120mm intakes on the bottom and front, plus a 120mm on the side feeding fresh air straight into the graphics card, then two 120mm exhausts at the top (positive pressure).
My 3570K at 1.25V, cooled by an NH-D14, maxes at 55-65C, so the case is quite cool all over.

Yep, in Australia.
And YES, power is very expensive, but so is everything else. It's one of the most expensive places in the world to live.

But it's worth it for the beautiful land/weather and our easygoing culture...


----------



## nuno_p

Is it possible to undervolt lower than 800mv?


----------



## GRABibus

Does anyone own the GIGABYTE Gaming OC 11G and has succeeded in flashing it with the XOC BIOS?

I have tried; no way.

"Protect off" worked.
"nvflash64 -6 XOC.rom" worked, as I saw in the command window that my EEPROM had been rewritten with the XOC BIOS.

Then, on reboot, I got VGA mode (as if I had uninstalled the drivers).

Thankfully, I could reflash with the Gigabyte BIOS.

Thanks for any feedback.


----------



## coc_james

Quote:


> Originally Posted by *nuno_p*
> 
> Is it possible to undervolt lower than 800mv?


I don't think so. I was looking into it as well and can't figure out how, which is strange, because sometimes my idle voltage drops to like 650 mV.


----------



## nuno_p

Quote:


> Originally Posted by *coc_james*
> 
> I don't think so. I was looking into it as well and can't figure out how. Which is strange because sometimes my idle voltage drops to like 650mv.


The spreadsheet on the first page shows many users with less than 800 mV, and I would like to know how they did it.


----------



## Mooncheese

Quote:


> Originally Posted by *Pandora's Box*
> 
> https://en.m.wikipedia.org/wiki/Electricity_pricing#Global_comparison


Quote:


> Originally Posted by *WillG027*
> 
> Have Matx case with 120mm inlet on bottom & front case + 120mm on side feeding fresh air straight into GCard - then two 120mm exhaust at top. (positive pressure)
> 3570k at 1.25V cooled by NH-D14 maxes at 55-65c. So case is quite cool all-over.
> 
> Yep in Australia.
> and YES Power is very expensive, but so is everything else. One of the most expensive places to live in the world.
> 
> But it's worth it for the beautiful Land/weather and our easy going culture. . .


Looks like it's about 50% more expensive than in North America, making the Gamers Nexus RX Vega vs. GTX 1070 power-cost comparison I posted above (and the 1080 Ti performance-tweaking approach) that much more relevant.

Undervolting is absolutely the way to go. XOC is relevant if you live in your parents' basement somewhere cool without a utility bill, have a full water block and adequate rad surface area, have zero concern about your carbon footprint (in hand with a belief that man hasn't altered the climate with industrial activity), and believe that 5% more performance is worth 50% more power.

To me it's pretty insane, but yeah, this is pretty much a microcosm of the world we live in, where man abandons all reason and sensibility in pursuit of that one extra dollar sign, or that single-digit performance metric that sets him above "the mass of men," at any cost necessary, including the continued survival of the species.


----------



## taem

How much louder is the FTW3 compared to the Asus Strix or MSI Gaming X? Just a little? Or a lot?

The FTW3 is the card I want, for the make, looks, and build. But noise is very, very important to me. So if the FTW3 is a little louder, then OK. But if it's a whole tier or two below the Strix and Gaming X in noise, then I have to rethink. And there's one guy in the card-specific thread who says it's the same as the FE for noise...


----------



## buellersdayoff

Quote:


> Originally Posted by *taem*
> 
> How much louder is the FTW3 compared to Asus Strix or MSI Gaming X? Just a little? Or a lot?
> 
> FTW3 is the card I want, for the make, looks, build. But noise is very very important to me. So if FTW3 is a little louder, then ok. But if it's in a whole tier or 2 below Strix and Gaming X in noise, then I have to rethink. And there's one guy in the card specific thread that it says it's the same as FE for noise...


http://www.gamersnexus.net/hwreviews/2900-evga-1080-ti-ftw3-review-vs-sc2-gaming-x-xtreme-aorus/page-2
Apparently they are noisier/louder than the others
http://www.gamersnexus.net/hwreviews/2905-asus-1080-ti-rog-strix-review-vs-ftw3-gaming-x/page-2


----------



## taem

Quote:


> Originally Posted by *buellersdayoff*
> 
> http://www.gamersnexus.net/hwreviews/2900-evga-1080-ti-ftw3-review-vs-sc2-gaming-x-xtreme-aorus/page-2
> Apparently they are noisier/louder than the others
> http://www.gamersnexus.net/hwreviews/2905-asus-1080-ti-rog-strix-review-vs-ftw3-gaming-x/page-2


Thanks for that, but I've looked at all the reviews. I know Strix is king for quiet, followed by Gaming X. I want to hear from folks here, their subjective opinions on how much louder the FTW3 is.


----------



## buellersdayoff

Quote:


> Originally Posted by *taem*
> 
> Quote:
> 
> 
> 
> Originally Posted by *buellersdayoff*
> 
> http://www.gamersnexus.net/hwreviews/2900-evga-1080-ti-ftw3-review-vs-sc2-gaming-x-xtreme-aorus/page-2
> Apparently they are noisier/louder than the others
> http://www.gamersnexus.net/hwreviews/2905-asus-1080-ti-rog-strix-review-vs-ftw3-gaming-x/page-2
> 
> 
> 
> Thanks for that, but I've looked at all the reviews. I know Strix is king for quiet, followed by Gaming X. I want to hear from folks here, their subjective opinions on how much louder the FTW3 is.

I do remember one or two people mentioning how much noisier they seemed over another. Can't remember which cards they were comparing and if it was here on this forum or not, did you search the thread?


----------



## rluker5

Hi, new to this club. Ordered my first 1080 Ti on RX Vega release day when I realized there was no way Vega could run 4K to my satisfaction. Ordered my second the next day: since I was ready to spend almost as much as two 1080 Tis hoping to beat one, I figured I might as well go all in and enjoy every pixel my 4K has to offer.







Now I am.
I already used the memory overclock tips on the OP and I'm almost disappointed over how little struggle it was. But not really. Also glad to hear that flashing is possible.


----------



## Amuro

My underclocked/undervolted/power-limited MSI GTX 1080 Ti Gaming X + underclocked/undervolted/power-limited 4790K + G.Skill Trident X 32GB 2400MHz CL10 DDR3 on an MSI Z97 Gaming 3, running Witcher 3 ultra (no HairWorks) on a 1080p AOC 144Hz G-Sync monitor. Super power efficient.


----------



## pez

Quote:


> Originally Posted by *s1rrah*
> 
> As an update ... I did a bunch of reading and found out how to disable depth of field and motion blur ... after disabling those, on my 4K TV, I get a solid 60fps just about all the time ... *much* better ... but still can't pull myself away from that 144hz Acer 1440p screen ... but once again, when running those above mentioned hacks/tests ... I was just floored by the colors and contrast of the TV vs the monitor ...
> 
> But happy my 1080 ti can do 4K at 60fps ... even if I'm not going to use it for that res ...
> 
> And yes ... Hellblade: Senua's Sacrifice is *incredible* ... it's become one of those games I don't want to play until I can set the room/environment just right and settle in for an hour or three ... I'm a huge fan of content/story/writing based games (think "Dear Esther", which I've played countless times) and Hellblade does it ... it's not without its hiccups but man does it make you care about the female lead ... props to the devs!


Hmmm I need to look up that hack. I'd like to keep the depth of field as I can handle the FPS drops occasionally with g-sync, but I definitely noticed the motion blur right away.

I think I might even check out Dear Esther next based on that.


----------



## Pandora's Box

Anyone else notice that Rise of the Tomb Raider seems to really punish the graphics card in terms of power usage and temperature? I just started playing the game yesterday and noticed my undervolted 1080 Ti was reaching 80C. This was at max in game settings, DX12, 1440P with 2xSSAA - not exactly easy to drive but I was impressed with the performance (60+fps). GPU was even hitting power throttling at 115% power usage, core clock speed was ranging from 1870 - 2000.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Pandora's Box*
> 
> Anyone else notice that Rise of the Tomb Raider seems to really punish the graphics card in terms of power usage and temperature? I just started playing the game yesterday and noticed my undervolted 1080 Ti was reaching 80C. This was at max in game settings, DX12, 1440P with 2xSSAA - not exactly easy to drive but I was impressed with the performance (60+fps). GPU was even hitting power throttling at 115% power usage, core clock speed was ranging from 1870 - 2000.


That is the problem when Pascal is power limited. You will hit a different power draw in different games, making the performance inconsistent. Jump to 4K and it's even worse.


----------



## Mooncheese

Quote:


> Originally Posted by *Pandora's Box*
> 
> Anyone else notice that Rise of the Tomb Raider seems to really punish the graphics card in terms of power usage and temperature? I just started playing the game yesterday and noticed my undervolted 1080 Ti was reaching 80C. This was at max in game settings, DX12, 1440P with 2xSSAA - not exactly easy to drive but I was impressed with the performance (60+fps). GPU was even hitting power throttling at 115% power usage, core clock speed was ranging from 1870 - 2000.


80C with the undervolt? Good lord, what's ambient like where you are? Still rocking the Silverstone Raven? I run ROTTR in 3D Vision and found that HBAO+ is extremely resource intensive; try running SSAO instead if you want more frames and/or want to get the temps down. I can run this game at 60 fps at 2560x1440 basically everywhere with only a few compromises (120 fps in 2D). If you want, I can tell you my exact settings when I get home.


----------



## Pandora's Box

Quote:


> Originally Posted by *Mooncheese*
> 
> 80c with the undervolt? Good lord, what's ambient like where you are? Still rocking the Silverstone Raven? I run ROTTR in 3D Vision and found that HBAO+ is extremely resource intensive, try running ssao instead if you want more frames and or want to get the temps down. I can run this game at 60 fps at 2560x1440p basically everywhere with only a few compromises (120 FPS 2D) if you want I can tell you my exact settings when I get home.


No, I'm running a Thermaltake P3 case now, basically open air (see my rig in my signature). 72F is the ambient temperature. Yeah, 80C at 1 volt, 70% fan speed. I'm pretty sure it's the 2xSSAA that's causing the temps to skyrocket. The game is basically running at 5160x2880 (2xSSAA doubles each axis of the resolution). I'm happy with the performance, just surprised at the temps and power throttling. Even Firestrike Extreme doesn't push it this hard.
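For scale, here is the supersampling arithmetic behind why 2xSSAA hits temps so hard: doubling each axis quadruples the shaded pixel count. This sketch uses standard 2560x1440 (the post's 5160x2880 implies a slightly wider base, but the multiplier is the same):

```python
# Per-axis 2x supersampling at 1440p: 4x the pixels of native.
base_w, base_h = 2560, 1440
scale = 2  # per-axis supersampling factor

ss_w, ss_h = base_w * scale, base_h * scale
multiplier = (ss_w * ss_h) / (base_w * base_h)

print(f"supersampled: {ss_w}x{ss_h}")          # 5120x2880
print(f"pixel multiplier: {multiplier:.0f}x")  # 4x
```

So "doubling the resolution" per axis is really a 4x shading load, which lines up with the card slamming into its power limit.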


----------



## 2ndLastJedi

Quote:


> Originally Posted by *Mooncheese*
> 
> 80c with the undervolt? Good lord, what's ambient like where you are? Still rocking the Silverstone Raven? I run ROTTR in 3D Vision and found that HBAO+ is extremely resource intensive, try running ssao instead if you want more frames and or want to get the temps down. I can run this game at 60 fps at 2560x1440p basically everywhere with only a few compromises (120 FPS 2D) if you want I can tell you my exact settings when I get home.


I'd like to hear what your ROTTR settings are!


----------



## Curseair

If anyone has a Black SC Gaming from EVGA, could you please open it up and show me where all the thermal pads go? I was an idiot and did not take pictures, and now I have a problem here:

http://www.overclock.net/t/1636717/problem-with-evga-1080-ti-black-sc-gaming-artifacts


----------



## Kokin

Quote:


> Originally Posted by *hadesfactor*
> 
> Myself, I'm partial to EVGA... not only are their cards solid, but they don't care at all if you water cool their cards... if you need to RMA the card, just put the stock cooler back on and ship it, it's fine. If you're gonna water cool and OC, there's 2 schools of thought on it. Some say you'll hit a higher OC on an AIB, and others (myself included) don't see the point in spending the extra money. Normally they are OC'ed less than you can push them anyway, and the silicon lottery always comes into play, although they do tend to be binned better. You will have a BIOS on those cards with a higher TDP, but you can also flash your FE. I would say if you can't get an FE... I like the FTW3 personally.
> 
> As I said though, some people here have pushed their FE to higher clocks and some AIBs weren't able to push much past 2050-2075... all cards should push 2000+, especially on water. I would find out what other companies let you water cool their products while honoring their RMA process and choose among those.


Quote:


> Originally Posted by *LunaP*
> 
> Can 2nd this, bought 2x 1080ti's put the blocks on 1 is dead, called in had the rma approved within 5 minutes, doing a cross ship replacement, should have the card by tuesday/wednesday hopefully. I do love the new XSPC blocks though. For now I'll just do 2x 980ti and 1x 1080ti since I still wanna test my acer 4k 32"


Thanks for the input. Just bought the FTW3 on Amazon and it came out to a spooky price tag:



Going to try out the card before I purchase the EK block, but it looks like I can use the factory back plate so I will probably keep it.


----------



## Rollergold

Quote:


> Originally Posted by *Kokin*
> 
> Thanks for the input. Just bought the FTW3 on Amazon and came out to a spooky price tag:
> 
> 
> 
> Going to try out the card before I purchase the EK block, but it looks like I can use the factory back plate so I will probably keep it.


Welcome to the club. Just keep in mind the backplate section of the FTW3 where the GPU core is has slightly smaller screw holes, and you will need to push the EK screws a little harder to drive them through on that side.


----------



## khemist

Quote:


> Originally Posted by *Rollergold*
> 
> Welcome to the club. Just keep in mind the backplate section of the FTW3 where the GPU core is has slightly smaller screw holes, and you will need to push the EK screws a little harder to drive them through on that side.


Good to know, should have mine tomorrow.


----------



## Kokin

Quote:


> Originally Posted by *Rollergold*
> 
> Welcome to the club. Just keep in mind the backplate section of the FTW3 where the GPU core is has slightly smaller screw holes, and you will need to push the EK screws a little harder to drive them through on that side.


Thank you, will keep that in mind when I get the block.


----------



## coc_james

1999MHz cap in SLI? Any guesses? Voltage unlocked. Running a curve that yields [email protected] 1.094 in AB for a single card.


----------



## KCDC

For me, messing with the voltage curve in any way causes my cards to never go near 2000, either caps at 1919 or in the 1980s. Once I go back to +150 with the slider, I get 2050+ again. I'm pretty fed up. Stock FE BIOS on SLI. I'll keep trying.


----------



## ZealotKi11er

Quote:


> Originally Posted by *KCDC*
> 
> For me, messing with the voltage curve in any way causes my cards to never go near 2000, either caps at 1919 or in the 1980s. Once I go back to +150 with the slider, I get 2050+ again. I'm pretty fed up. Stock FE BIOS on SLI. I'll keep trying.


The only reason to use curve is to limit the voltage and clock speed. If you want max performance the offset is still the best.


----------



## coc_james

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KCDC*
> 
> For me, messing with the voltage curve in any way causes my cards to never go near 2000, either caps at 1919 or in the 1980s. Once I go back to +150 with the slider, I get 2050+ again. I'm pretty fed up. Stock FE BIOS on SLI. I'll keep trying.
> 
> 
> 
> The only reason to use curve is to limit the voltage and clock speed. If you want max performance the offset is still the best.

My limited curve isn't working right for SLI either. In single-card it caps under 1 volt and peaks at 2056-ish MHz.

Should I completely delete AB and all profiles and start over?


----------



## KedarWolf

Quote:


> Originally Posted by *coc_james*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KCDC*
> 
> For me, messing with the voltage curve in any way causes my cards to never go near 2000, either caps at 1919 or in the 1980s. Once I go back to +150 with the slider, I get 2050+ again. I'm pretty fed up. Stock FE BIOS on SLI. I'll keep trying.
> 
> 
> 
> The only reason to use curve is to limit the voltage and clock speed. If you want max performance the offset is still the best.
> 
> 
> My limited curve isn't working right for SLI either. In single card it caps under 1volt and peaks at 2056ish mhz.
> 
> Should I completely delete AB and all profiles and start over?

You peeps need to unlink your cards in Afterburner and set the voltage curve for each one individually, my advice.









Uncheck that, select each card in the lower picture, do voltage curve for each one.


----------



## coc_james

@KedarWolf

I will try this tomorrow, thanks.


----------



## Ubernoobie

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Some people do not understand how temps work. For example my 1080 Ti idles at 33C and loads 50C. My ambient is 28C-30C most of the time. I could lower it 5-7C from better loop instead of AIO and 8-10C having 20C ambient and I would be low 30s on load.


What fans, and what RPM are they spinning at? Mine reaches around 55C with ML120s in push/pull at 1500 RPM with ambient around 25C.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Ubernoobie*
> 
> What fans and what rpm is it spinning at? Mine reaches around 55c with ml120s push pull at 1500rpm with ambient around 25c


My fans are at 80% (1850 RPM). I hit about 55C with 120% Power and 28C ambient.


----------



## KedarWolf

Quote:


> Originally Posted by *Ubernoobie*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Some people do not understand how temps work. For example my 1080 Ti idles at 33C and loads 50C. My ambient is 28C-30C most of the time. I could lower it 5-7C from better loop instead of AIO and 8-10C having 20C ambient and I would be low 30s on load.
> 
> 
> 
> What fans and what rpm is it spinning at? Mine reaches around 55c with ml120s push pull at 1500rpm with ambient around 25c

I run a 5960x at high overclock and relatively high voltages and my 1080 Ti on one 360 rad, with pump at 100% and fans push/pull at 80% (I always wear a headset).









Since I put Fujipoly 17W/mK pads on the memory, VRMs and a few other choice places, the core on the GPU went from 53C stress testing to 46C at 1.093v, and just over 40C if I go down to 0.993v at 2012 core.


----------



## weskeh

Just got myself one asus strix oc edition. 1987mhz on the core out of the box.


----------



## ZakZakXxX

https://www.3dmark.com/fs/13372639










GTX 1080 TI FTW3 hybrid


----------



## hadesfactor

Quote:


> Originally Posted by *Ubernoobie*
> 
> What fans and what rpm is it spinning at? Mine reaches around 55c with ml120s push pull at 1500rpm with ambient around 25c


I have 2 separate loops. My GPU loop has its own 480mm rad with 4x ML120 set to a curve based on the delta between ambient and water temps. My card doesn't break 39C during benching, and doesn't break 45C while gaming for 8+ hrs. The delta while gaming sits between 4.5-5C, and it's only set up in pull. My ambient is between 28-30C (live in the desert). During regular operation, web surfing etc., my curve has the GPU fans off, and they don't kick in until it hits a 1C delta. When gaming, the fans def ramp up (prob about 85%) but I don't mind, can't hear it over the game and TV lol
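The delta-based curve described above is easy to sketch in code: fans stay off until the water runs about 1C over ambient, then ramp toward a ceiling (~85% in the post) as the delta approaches ~5C. The exact thresholds and the linear ramp here are assumptions for illustration, not the poster's actual fan controller settings:

```python
# Minimal sketch of a water-to-ambient delta fan curve.
# Thresholds and the linear ramp shape are illustrative assumptions.

def fan_duty(water_c, ambient_c, on_delta=1.0, max_delta=5.0, max_duty=85):
    """Return fan duty cycle (%) from the water-to-ambient temperature delta."""
    delta = water_c - ambient_c
    if delta < on_delta:
        return 0            # idle: fans fully off below the kick-in delta
    if delta >= max_delta:
        return max_duty     # fully ramped at/above the max delta
    # linear ramp between the two thresholds
    span = (delta - on_delta) / (max_delta - on_delta)
    return round(span * max_duty)

print(fan_duty(28.5, 28.0))  # below the 1C kick-in delta -> fans off
print(fan_duty(33.0, 28.0))  # at the 5C max delta -> full duty
```

Keying the curve to the delta rather than absolute water temp is what keeps the fans silent at idle even in a 28-30C room.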


----------



## Mooncheese

Quote:


> Originally Posted by *Pandora's Box*
> 
> No I'm running a thermaltake p3 case now, basically open air (see my rig in my signature). 72f is the ambient temperature. Yeah 80c at 1volt, 70% fan speed. I'm pretty sure it's the 2xssaa that's causing the temps to skyrocket. Game is basically running at 5160x2880 (2xssaa is basically doubling the resolution). I'm happy with the performance, just surprised at the temps and power throttling. Even firestrike extreme doesn't push it this hard


I'm wondering if your thermal paste is either due for replacement or if it was a bad paste job to begin with, as with 72F ambient you should be much lower than 80C at 1.000v. For comparison, at the same ambient (72F = room temp), my FE with the original vapor chamber would sit at 70C with 70% fan speed, default voltage (indicating 1.062v), +137 core, +400 memory, 120% PT. You should be about 10C lower than that at default voltage and 70% fan speed, and 15C lower with the undervolt. The other member here from Australia is only hitting 49C, for example. I really think it's your thermal paste; you should be around 60C max with an undervolted MSI Gaming X at 70% RPM at room temperature.


----------



## Mooncheese

Quote:


> Originally Posted by *ZakZakXxX*
> 
> https://www.3dmark.com/fs/13372639
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GTX 1080 TI FTW3 hybrid


Nice CPU and combined scores, about 15% faster than my hexcore 4930K at 4.5 GHz (16.7k CPU, 8k combined). Not sure what the difference is, but yeah, it's faster.


----------



## weskeh

Quote:


> Originally Posted by *ZakZakXxX*
> 
> https://www.3dmark.com/fs/13372639
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GTX 1080 TI FTW3 hybrid


https://www.3dmark.com/3dm/21717145

my four core lost







lol


----------



## taem

How well does the MNPC gpu brace work on 1080 tis with plastic shrouds? I'm worried the plastic will bend in a little from the weight. MSI Gaming X is the specific card I have in mind.




Absurd price on this one but it's metal. Not so sure about the plastic version of this you can get on Amazon. Anyway I'm assuming this is something I can use going forward on any gpu.


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ubernoobie*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Some people do not understand how temps work. For example my 1080 Ti idles at 33C and loads 50C. My ambient is 28C-30C most of the time. I could lower it 5-7C from better loop instead of AIO and 8-10C having 20C ambient and I would be low 30s on load.
> 
> 
> 
> What fans and what rpm is it spinning at? Mine reaches around 55c with ml120s push pull at 1500rpm with ambient around 25c
> 
> 
> I run a 5960x at high overclock and relatively high voltages and my 1080 Ti on one 360 rad, with pump at 100% and fans push/pull at 80% (I always wear a headset).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Since I put Fujipoly 17W/mK pads on the memory, VRMs and a few other choice places core on the GPU went from 53C stress testing to 46C at 1.093v, just over 40C is I go down to .993 2012 core.

Actually, my ambient temp is pushing close to 80F now and I'm getting up to 45C stress testing.

When I had my A/C cranked and the ambient temp was around 65F, it was quite a bit lower.


----------



## hadesfactor

Quote:


> Originally Posted by *taem*
> 
> How well does the MNPC gpu brace work on 1080 tis with plastic shrouds? I'm worried the plastic will bend in a little from the weight. MSI Gaming X is the specific card I have in mind.
> 
> Absurd price on this one but it's metal. Not so sure about the plastic version of this you can get on Amazon. Anyway I'm assuming this is something I can use going forward on any gpu.


I have this one, don't need it really if you plan on WC. But to answer your question, it shouldn't harm your shroud. You can actually adjust the stopper. It's a pretty solid piece and if your Mobo doesn't have beefed up pcie slots and you have gpu sag, this will work out good


----------



## Maximization

I went vertical a long time ago, sag sucks


----------



## webhito

I use a cooler master vga holder, no sag whatsoever.
https://store.coolermaster.com/masteraccessory-universal-vga-holder-for-all-size-tower-chassis


----------



## LunaP

I just tighten both screws and no sag either, that + waterblocks lol.


----------



## KedarWolf

Quote:


> Originally Posted by *LunaP*
> 
> I just tighten both screws and no sag either, that + waterblocks lol.


I use a Thermaltake Core X9 case where the motherboard sits horizontal and cards vertical.


----------



## WillG027

I used a $1 metal drinking straw - which I bent to size and propped up the inside edge of the card, kept in place with a little blu-tac.

Works a treat and cheap.


----------



## iamjanco

I'll be using one of *these* in an SMA8 once I get the prototype sorted out.


----------



## hadesfactor

Quote:


> Originally Posted by *iamjanco*
> 
> I'll be using one of *these* in an SMA8 once I get the prototype sorted out.


There"s actually a good youtube video that shows how to make this plate as well...looks good btw


----------



## Mooncheese

There's literally no shortage of ways to hold up your GPU without having to spend $60 on a fabricated bracket, it's not even funny.

Fishing line?

I'm using an adjustable automotive bolt .

Lego's?



http://imgur.com/VQNyF


As an update to my undervolting adventures, 2012MHz wasn't stable in Titanfall 2 and neither was 2000MHz; 1987 has held so far, fingers crossed. Surprisingly, this is the game that is testing my OCs: I can play The Witcher 3 for hours on end without a crash at 2012MHz (yet to get one in TW3) with 90-99% load sustained, but TF2 after like 20 min and it's black-screen city. Rise of the Tomb Raider is another game that will test your OC, at least it does in 3D Vision.


----------



## taem

Quote:


> Originally Posted by *Mooncheese*
> 
> There's literally no shortage of ways to hold up your GPU without having to spend $60 on a fabricated bracket it's not even funny.
> 
> Fishing line?
> 
> I'm using an adjustable automotive bolt .
> 
> Lego's?


It's not that some of us lack imagination and we can't figure out how to hold up a gpu without paying $44 or whatever. It's that we don't want our systems to look ghetto. I mean there's literally no shortage of ways to hold up your pants either. But I still own belts. I don't tie twine around my waist or whatever.


----------



## iamjanco

Quote:


> Originally Posted by *hadesfactor*
> 
> There"s actually a good youtube video that shows how to make this plate as well...looks good btw


Thanks. I made some more changes to the artwork in that other post, which moves the pair of SLI'd cards higher up to give the mobo as much room as possible. If it all works out, it should line up fairly closely with the current IO panel layout, leaving the first two slots open as wiggle room. The bottom of the panel is designed to fit into the existing IO cover slots, while the top is designed to allow use of the thumbscrews for fastening. The only place I should have to do any drilling/cutting would be that left side closest to the window; and of course removing five slots' worth of IO bracing.

Btw, got a link for that video you mentioned?


----------



## webhito

Quote:


> Originally Posted by *taem*
> 
> It's not that some of us lack imagination and we can't figure out how to hold up a gpu without paying $44 or whatever. It's that we don't want our systems to look ghetto. I mean there's literally no shortage of ways to hold up your pants either. But I still own belts. I don't tie twine around my waist or whatever.


lol.


----------



## iamjanco

Quote:


> Originally Posted by *taem*
> 
> I mean there's literally no shortage of ways to hold up your pants either. But I still own belts. I don't tie twine around my waist or whatever.


Bungee cord also works. Somewhat innovative approach, if you ask me:


----------



## Mooncheese

Quote:


> Originally Posted by *taem*
> 
> It's not that some of us lack imagination and we can't figure out how to hold up a gpu without paying $44 or whatever. It's that we don't want our systems to look ghetto. I mean there's literally no shortage of ways to hold up your pants either. But I still own belts. I don't tie twine around my waist or whatever.


The bolt holding mine up looks kinda cool, no worse than the $45 bracket (mine cost $3.00), and even the Legos look cool. Some of us don't have $45 to piss into the wind. That $45 could be the difference between a $700 1180 Ti and the $750 variant that we actually want, or it could be 5 days' worth of food, or taking your GF out to dinner at a decent restaurant.

Not everyone who has a 1080 Ti is Mr. Moneybags.

The Lego approach is kinda hip AND it's a way of not identifying with materialistic values. Just because I like a decent framerate in my games or need the GPU power to run them in 3D Vision doesn't mean that I automatically fall into the group of morons who feels the need to upgrade their perfectly fine smartphone every year or wear designer clothing (Samsung Galaxy S4 from 2013 still going strong).

Heck, to this end, my entire computer from 2013 is still relatively futureproof and I will not be "upgrading" anything other than, say, the GPU until the CPU literally dies (with 1.362v this isn't happening anytime soon).

Windows 7
i7 4930k @ 4.5 GHz
RIVBE
32 GB Corsair Vengeance Memory @ 2133 MHz factory timings and voltage
Samsung 850 Evo
Corsair 850 RM
Asus ROG Swift PG278Q


----------



## taem

Quote:


> Originally Posted by *Mooncheese*
> 
> The bolt holding mine up looks kinda cool, no worse than the $45 bracket ($3.00) and even the lego's look cool. Some of us don't have $45 to piss into the wind. That $45 could be the difference between a $700 1180 Ti and the $750 variant that we actually want or it could be 5 days worth of food or taking your GF out to dinner at a decent restaurant.
> 
> Not everyone who has a 1080 Ti is Mr. Moneybags.
> 
> The Lego approach is kinda hip AND it's a way of not identifying with materialistic values. Just because I like a decent framerate in my games or need the GPU power to run them in 3D Vision doesn't mean that I automatically fall into the group of morons who feels the need to upgrade their perfectly fine smartphone every year or wear designer clothing (Samsung Galaxy S4 from 2013 still going strong).


I just find this so judgmental. Why is it "pissing money into the wind"? It's no different from RGB lighting, tempered glass, cases made of premium materials, sleeved cables, many backplates, etc. If you don't want to pay for aesthetics, that's cool, but why do you have to characterize it like that and imply that those who do are somehow deficient?

Same as calling folks who upgrade their phones all the time "morons." I also still use a GS4. But most folks I know upgrade every year or two. They are not morons, they just like to constantly upgrade the product that is more central to their lives than any other product they own. Nothing wrong with that. Nothing wrong with designer clothing either.

Also, I'd rather be materialistic than hip. Because I don't like drinking out of mason jars and stuff.


----------



## Mooncheese

Quote:


> Originally Posted by *taem*
> 
> I just find this so judgmental. Why is it "pissing money into the wind"? It's no different from rgb lightning, tempered glass, cases made of premium materials, sleeved cables, many backplates, etc. If you don't want to pay for aesthetics, that's cool, but why do you have to characterize it like that and imply that those who do are somehow deficient?
> 
> Same as calling folks who upgrade their phones all the time "morons." I also still use a GS4. But most folks I know upgrade every year or two. They are not morons, they just like to constantly upgrade the product that is more central to their lives than any other product they own. Nothing wrong with that. Nothing wrong with designer clothing either.
> 
> Also, I'd rather be materialistic than hip. Because I don't like drinking out of mason jars and stuff.


I'm drinking green tea out of a mason jar while reading this, identifying with materialist values has destroyed the planet, we are firmly in the 6th Mass Extinction and are probably on the cusp of NTHE (Near Term Human Extinction) precipitated by the Methane Clathrate Gun. Is it too late to stop being morons? Not sure, but after learning about the relationship between MY activity, MY values and the health of the biosphere, economic inequality, animal welfare, mental and physical health and many other interrelated issues I simply cannot continue to live in an unconscious manner. I'm pushing 40 though, when I was 20 I too was a moron. We all wake up eventually, I hope.


----------



## Kokin

Just get a case that holds the GPU in a vertical position. There are even some creative approaches to using a GPU vertically with PCI-E risers.


----------



## coc_james

Quote:


> Originally Posted by *Mooncheese*
> 
> Quote:
> 
> 
> 
> Originally Posted by *taem*
> 
> I just find this so judgmental. Why is it "pissing money into the wind"? It's no different from rgb lightning, tempered glass, cases made of premium materials, sleeved cables, many backplates, etc. If you don't want to pay for aesthetics, that's cool, but why do you have to characterize it like that and imply that those who do are somehow deficient?
> 
> Same as calling folks who upgrade their phones all the time "morons." I also still use a GS4. But most folks I know upgrade every year or two. They are not morons, they just like to constantly upgrade the product that is more central to their lives than any other product they own. Nothing wrong with that. Nothing wrong with designer clothing either.
> 
> Also, I'd rather be materialistic than hip. Because I don't like drinking out of mason jars and stuff.
> 
> 
> 
> I'm drinking green tea out of a mason jar while reading this, identifying with materialist values has destroyed the planet, we are firmly in the 6th Mass Extinction and are probably on the cusp of NTHE (Near Term Human Extinction) precipitated by the Methane Clathrate Gun. Is it too late to stop being morons? Not sure, but after learning about the relationship between MY activity, MY values and the health of the biosphere, economic inequality, animal welfare, mental and physical health and many other interrelated issues I simply cannot continue to live in an unconscious manner. I'm pushing 40 though, when I was 20 I too was a moron. We all wake up eventually, I hope.

Smoke another doob bruh!


----------



## 2ndLastJedi

Quote:


> Originally Posted by *Mooncheese*
> 
> The bolt holding mine up looks kinda cool, no worse than the $45 bracket ($3.00) and even the lego's look cool. Some of us don't have $45 to piss into the wind. That $45 could be the difference between a $700 1180 Ti and the $750 variant that we actually want or it could be 5 days worth of food or taking your GF out to dinner at a decent restaurant.
> 
> Not everyone who has a 1080 Ti is Mr. Moneybags.
> 
> The Lego approach is kinda hip AND it's a way of not identifying with materialistic values. Just because I like a decent framerate in my games or need the GPU power to run them in 3D Vision doesn't mean that I automatically fall into the group of morons who feels the need to upgrade their perfectly fine smartphone every year or wear designer clothing (Samsung Galaxy S4 from 2013 still going strong).
> 
> Heck to this end, my entire computer from 2013 is still relatively futureproof and I will not be "upgrading" anything other that say the GPU until the CPU literally dies (with 1.362v this isn't happening anytime soon).
> 
> Windows 7
> i7 4930k @ 4.5 GHz
> RIVBE
> 32 GB Corsair Vengeance Memory @ 2133 MHz factory timings and voltage
> Samsung 850 Evo
> Corsair 850 RM
> Asus ROG Swift PG278Q


$45 is a working week of food? Well bugger me, cheese on the moon is cheap! lol


----------



## buellersdayoff

Quote:


> Originally Posted by *Kokin*
> 
> Just get a case that holds the GPU in a vertical position. There are even some creative approaches to using a GPU vertically with PCI-E risers.


Hacked together my own; the GPU runs cooler since it isn't recirculating its own heat/air, and the PCH and M.2 SSD run cooler too


----------



## taem

Quote:


> Originally Posted by *Mooncheese*
> 
> I'm drinking green tea out of a mason jar while reading this, identifying with materialist values has destroyed the planet, we are firmly in the 6th Mass Extinction and are probably on the cusp of NTHE (Near Term Human Extinction) precipitated by the Methane Clathrate Gun. Is it too late to stop being morons? Not sure, but after learning about the relationship between MY activity, MY values and the health of the biosphere, economic inequality, animal welfare, mental and physical health and many other interrelated issues I simply cannot continue to live in an unconscious manner. I'm pushing 40 though, when I was 20 I too was a moron. We all wake up eventually, I hope.


I don't believe you're actually 40, based on the fact that you call people "morons" in internet discussions about gpu brackets.


----------



## Kokin

Quote:


> Originally Posted by *buellersdayoff*
> 
> Hacked my own, gpu runs cooler as it is not recirculating it's own heat/air and the pch & m.2 ssd run cooler too


Very nice! You did a great job making it look very clean.


----------



## kevindd992002

If I wanted to vertically mount my 1080 Ti, is the CM Vertical GPU Holder the best to go with? And which is the best PCI-E riser cable that I can buy?


----------



## Thoth420

Too late to jump on the 1080Ti train? I hear the next gen aren't coming out until some time next year. The Fury X is a bit long in the tooth and the small amount of VRAM is hurting the cause. I plan to swap to either a 4K display or more likely a 3440 x 1440 widescreen with the VRR for whatever card I am upgrading to. Seems like G Sync worked better than FreeSync as well in my experience.
Also what aftermarket AIBs are the most popular? I have been out of the GPU game for over a year. Cheers in advance!


----------



## ZealotKi11er

Quote:


> Originally Posted by *Thoth420*
> 
> Too late to jump on the 1080Ti train? I hear the next gen aren't coming out until some time next year. The Fury X is a bit long in the tooth and the small amount of VRAM is hurting the cause. I plan to swap to either a 4K display or more likely a 3440 x 1440 widescreen with the VRR for whatever card I am upgrading to. Seems like G Sync worked better than FreeSync as well in my experience.
> Also what aftermarket AIBs are the most popular? I have been out of the GPU game for over a year. Cheers in advance!


Not late at all. I have a Fury X and the 1080 Ti kills it.


----------



## KCDC

Quote:


> Originally Posted by *Thoth420*
> 
> Too late to jump on the 1080Ti train? I hear the next gen aren't coming out until some time next year. The Fury X is a bit long in the tooth and the small amount of VRAM is hurting the cause. I plan to swap to either a 4K display or more likely a 3440 x 1440 widescreen with the VRR for whatever card I am upgrading to. Seems like G Sync worked better than FreeSync as well in my experience.
> Also what aftermarket AIBs are the most popular? I have been out of the GPU game for over a year. Cheers in advance!


You kinda answered your own question, honestly. Still worth it in my eyes.

I run FE cards, but they are loud if they aren't watercooled. I believe the Strix cards are worth it, based on what I've read in this thread. I'm sure others will chime in with their faves.


----------



## WillG027

Functionality over form for me. Once the case cover goes on, I don't care what it looks like as long as it's neat and functional.

If I'm staring at my case instead of the monitor, then I'm using my computer wrong, LOL.

To each their own though; no harm if someone wants to dress up their rig.


----------



## LunaP

So the replacement card finally came from EVGA. The backplate had the same model number, but when I removed it there was a different model number underneath. They sent the wrong card, one with the non-reference PCB, so now we're fighting a battle over proving that it was their fault and not mine... ***


----------



## Thoth420

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Not late at all. I have a Fury X and 1080 Ti kills it.


Quote:


> Originally Posted by *KCDC*
> 
> You kinda answered your own question, honestly. Still worth it in my eyes.
> 
> I run FE cards, but are loud if they aren't watercooled. I believe the strix cards are worth it, based on what I've read in this thread. I'm sure others will chime in with their faves.


Cheers all. I hope I can find another popular one that isn't ASUS; I tend to have bad experiences with their products across the board, with the exception of monitors. I would only do the FE if I were sure I was going to be running another custom loop, but since I am moving cross-country soon that probably won't happen. My budget has to relax itself a bit for a while.


----------



## buellersdayoff

Quote:


> Originally Posted by *kevindd992002*
> 
> If I wanted to vertically mount my 108pTi, is the CM Vertical GPU Holder the best to go with? And which is the best pci-e riser cable thay I can buy?


I used the Thermaltake Premium (http://www.thermaltake.com/Chassis/Accessories_/_/C_00003013/TT_Premium_PCI_E_3_0_Extender_300mm/design.htm). It is reinforced at each end of the cable, since the most common complaint about the cheaper cables is damage at either end resulting in a short or a loose connection.


----------



## Rollergold

Quote:


> Originally Posted by *Thoth420*
> 
> Cheers all. I hope I can find another popular one that isn't ASUS. I tend to have bad experiences with their products across the board with exception to monitors. I would only do the FE if I am sure I am going to be running another custom loop but since I am moving cross country soon that would probably not happen. My budget has to relax itself a bit for a while.


The EVGA GTX 1080 Ti SC Black Edition is a nice all-rounder. It comes with a reference PCB and has a nice dual-fan open cooler, so you get a cooler and quieter running card than the FE cards, and since it has a reference PCB you have the best selection of blocks if you want to water-cool in the future.


----------



## Recidivism

I missed out on the Destiny 2 beta key because Best Buy was not an approved vendor for the deal. I wish I could return my card to Best Buy and order from Newegg just to get that key. I am super angry.


----------



## kevindd992002

Quote:


> Originally Posted by *buellersdayoff*
> 
> I used the Thermaltake premium http://www.thermaltake.com/Chassis/Accessories_/_/C_00003013/TT_Premium_PCI_E_3_0_Extender_300mm/design.htm it is reinforced at each end of the cable, since the most common complaint of the cheaper cables is damage at each end resulting in a short or loose connection


But I'm hearing a lot of bad experiences with the Thermaltake risers, so I'm kinda hesitant about it. How about EZDIY?

Another issue is that I'm reading that the risers with latches won't fit the Cooler Master vertical GPU mount. Any thoughts on this?


----------



## buellersdayoff

Quote:


> Originally Posted by *kevindd992002*
> 
> Quote:
> 
> 
> 
> Originally Posted by *buellersdayoff*
> 
> I used the Thermaltake premium http://www.thermaltake.com/Chassis/Accessories_/_/C_00003013/TT_Premium_PCI_E_3_0_Extender_300mm/design.htm it is reinforced at each end of the cable, since the most common complaint of the cheaper cables is damage at each end resulting in a short or loose connection
> 
> 
> 
> But I'm hearing a lot of bad experiences with the Thermaltake risers. So I'm kinda hesitant about it. How about EZDIY?
> 
> Another issue is that I'm reading that the risers with latches won't fit the Coolermaster vertical gpu mount. Any thoughts to this?
Click to expand...

I can't say anything about the mounts since I made my own. Not sure about the problems with the cables; Thermaltake has two different versions though. Again, I chose the reinforced Premium version. I tested with a cheap eBay cable first, being careful not to kink the cable at the ends, but I would not recommend them.


----------



## Thoth420

Quote:


> Originally Posted by *Rollergold*
> 
> The EVGA GTX 1080 TI SC Black Edition is a nice all rounder. It comes with a reference PCB and has a nice dual fan open cooler so you can have a cooler and quieter running card now then the FE cards and since it has a reference PCB you have the best selection of Blocks if you want to water cool in the future.


Thank you. I prefer EVGA when I buy Nvidia cards... or PSUs, even.


----------



## iamjanco

Before spending
Quote:


> Originally Posted by *buellersdayoff*
> 
> I can't say anything about the mounts since I made my own. Not sure about the problems with the cables, Thermaltake has two different versions though. Again I chose the reinforced PREMIUM version. I tested with an ebay cheap cable first, being careful not to kink the cable at the ends, but I will not recommend them.


That's likely the best way to go, testing first. A number of factors could influence the stability of a system using riser cables, malfunctioning/bad cables among them.


----------



## Kokin

Quote:


> Originally Posted by *Thoth420*
> 
> Cheers all. I hope I can find another popular one that isn't ASUS. I tend to have bad experiences with their products across the board with exception to monitors. I would only do the FE if I am sure I am going to be running another custom loop but since I am moving cross country soon that would probably not happen. My budget has to relax itself a bit for a while.


I was literally in the same place just a week ago. I have a Fury X and LG 34UC88 (3440x1440 60/75Hz) as well as a Qnix QX2710 (1440p 120Hz) and wanted something better to drive either one. I ended up choosing the EVGA 1080Ti FTW3 with plans to add the EK waterblock to it.

The card just came in today and I'm amazed at how beefy and quiet the stock cooler is. I'm actually questioning needing a block for this GPU, but then my custom loop with 2x 240mm rads would only be cooling my CPU lol.


----------



## 2ndLastJedi

I got the cheapest 1080 Ti I could find and it turns out to overclock as well as the big guys while staying quiet and cool.

Galax GTX 1080 Ti EXOC. Does 2088/6008 @ 70°C @ 55% fan.
Been absolutely stoked with it; it does 4K mostly maxed in everything I play.


----------



## TheRedViper

Quote:


> Originally Posted by *2ndLastJedi*
> 
> I got the cheapest 1080 Ti i could find and it turns out to overclock as good as the big guys and stays quiet and cool
> 
> 
> 
> 
> 
> 
> 
> 
> Galax GTX 1080 Ti EXOC . Does 2088/6008 @ 70C @ 55% fan .
> Been absolutely stoked with it , does 4k mostly max in everything i play .


It's a Galax; they have better PCBs.


----------



## Rainmaker91

Quote:


> Originally Posted by *2ndLastJedi*
> 
> I got the cheapest 1080 Ti i could find and it turns out to overclock as good as the big guys and stays quiet and cool
> 
> 
> 
> 
> 
> 
> 
> 
> Galax GTX 1080 Ti EXOC . Does 2088/6008 @ 70C @ 55% fan .
> Been absolutely stoked with it , does 4k mostly max in everything i play .


From what I have read (obviously I don't have a ton of first-hand knowledge, having only one card myself), the 1080 Tis are all relatively close in performance, with almost all of them managing to stay in the 2000-2100 MHz range for overclocks. It doesn't matter if it's the cheapest you can find or the most expensive one; they all perform more or less the same as long as they are cooled properly, assuming you're not BIOS flashing or doing more elaborate mods on the cards.


----------



## 2ndLastJedi

Buildzoid told me the EXOC has a reference PCB!


----------



## Minusorange

Joined the club today with Aorus Xtreme


----------



## Pandora's Box

Got a brother to keep my one card company. Still waiting on a fancier sli bridge to arrive


----------



## Kokin

Only got to try a few quick overclocks last night, but I was able to hit 2038-2050 on my FTW3: 127% PT, fans at 50% (~1500 RPM), temps around 65°C, trying both the stock 1.063V and the max 1.094V. Tried memory OCs up to 6003 but haven't found a stable point yet.


----------



## KedarWolf

Quote:


> Originally Posted by *Kokin*
> 
> Only got to try a few quick overclocks last night, but I was able to hit 2038-2050 on my FTW3. 127% PT, fans at 50% (~1500RPM), temps around 65C, tried both stock 1.063V and max 1.094V. Tried to do memory OCs up to 6003, but haven't found a stable point yet.


Try the secondary BIOS, it has a higher power limit.


----------



## lolredy

Should I buy the OC Strix or the FTW3? Price doesn't matter.


----------



## Kokin

Quote:


> Originally Posted by *KedarWolf*
> 
> Try the secondary BIOS, it has a higher power limit.


That's the first thing I did after I took it out of the box.

Master BIOS = 117% PT, Slave BIOS = 127% PT
Quote:


> Originally Posted by *lolredy*
> 
> should i buy oc strix or ftw3? price doesnt matter.


I chose the FTW3 since it has dual BIOS and the warranty is retained if you install a waterblock. The Strix has extra PWM/RGB headers if that's something you may need.


----------



## buellersdayoff

Quote:


> Originally Posted by *iamjanco*
> 
> Before spending
> Quote:
> 
> 
> 
> Originally Posted by *buellersdayoff*
> 
> I can't say anything about the mounts since I made my own. Not sure about the problems with the cables, Thermaltake has two different versions though. Again I chose the reinforced PREMIUM version. I tested with an ebay cheap cable first, being careful not to kink the cable at the ends, but I will not recommend them.
> 
> 
> 
> That's likely the best way to go, testing first. A number of factors could influence the stability of a system using riser cables, malfunctioning/bad cables among them.
Click to expand...

I recently put my GPU back in the motherboard (no riser) for testing. I had a strange anomaly which turned out to be MSI Afterburner. The Superposition score is only 30 points lower (out of ~10000, so a small percentage) with the same clocks when using the riser, but the card and nearby motherboard components run much cooler.
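For anyone curious how small that riser penalty actually is, a quick back-of-the-envelope using the poster's rough figures (the ~10000 baseline and 30-point drop are his numbers, not a measured benchmark):

```python
# Relative Superposition score loss from running the GPU on a riser,
# using the rough numbers reported above (~10000 baseline, 30 points lower).
baseline_score = 10000
riser_score = baseline_score - 30

delta_pct = (baseline_score - riser_score) / baseline_score * 100
print(f"Riser penalty: {delta_pct:.2f}%")  # about 0.30%
```

Well within run-to-run variance for this benchmark, which lines up with the "small %" remark.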


----------



## iamjanco

Quote:


> Originally Posted by *buellersdayoff*
> 
> I recently put my gpu back in the motherboard (no riser) for testing. I had a strange anomaly which turned out to be msi afterburner. Superposition score is only 30 points lower (out of ~10000, so small %) with the same clocks when using the riser but the card and nearby motherboard components run much cooler.


From the test results I've seen from others, it's my understanding that even properly functioning riser cards/cables can introduce a bit of latency. Typically, though, using them seems to have a negligible effect.


----------



## kevindd992002

Quote:


> Originally Posted by *buellersdayoff*
> 
> I recently put my gpu back in the motherboard (no riser) for testing. I had a strange anomaly which turned out to be msi afterburner. Superposition score is only 30 points lower (out of ~10000, so small %) with the same clocks when using the riser but the card and nearby motherboard components run much cooler.


Why would the card and nearby mobo components run cooler without the riser?


----------



## buellersdayoff

Quote:


> Originally Posted by *kevindd992002*
> 
> Quote:
> 
> 
> 
> Originally Posted by *buellersdayoff*
> 
> I recently put my gpu back in the motherboard (no riser) for testing. I had a strange anomaly which turned out to be msi afterburner. Superposition score is only 30 points lower (out of ~10000, so small %) with the same clocks when using the riser but the card and nearby motherboard components run much cooler.
> 
> 
> 
> Why would the card and nearby mobo components run cooler without the riser?
Click to expand...

No, it's the other way around: cooler with the riser, because the card is away from the motherboard.


----------



## kevindd992002

Quote:


> Originally Posted by *buellersdayoff*
> 
> No its the other way around, cooler with riser. Because the card is away from the motherboard.


Oops, you're right, sorry, I read that wrong. I'm still contemplating which is the best riser cable though; the 3M is just too expensive.


----------



## buellersdayoff

Quote:


> Originally Posted by *kevindd992002*
> 
> Quote:
> 
> 
> 
> Originally Posted by *buellersdayoff*
> 
> No its the other way around, cooler with riser. Because the card is away from the motherboard.
> 
> 
> 
> Oops, you're right, sorry I read that wrong. I'm still contemplating which is the best riser cable though. The 3M is just too expensive.
Click to expand...

I can only vouch for the one I'm using. I've read that the original Thermaltake cables and cheap no-name cables have had issues, so just don't cheap out. The issues I've heard of are the wires coming loose at the PCB section of the riser, causing a short or the cable simply not working. So whichever riser cable you get, you'll need to be careful in that area when bending it.


----------



## Thoth420

Quote:


> Originally Posted by *Kokin*
> 
> I was literally in the same place just a week ago. I have a Fury X and LG 34UC88 (3440x1440 60/75Hz) as well as a Qnix QX2710 (1440p 120Hz) and wanted something better to drive either one. I ended up choosing the EVGA 1080Ti FTW3 with plans to add the EK waterblock to it.
> 
> The card just came in today and I'm amazed at how beefy and quiet the stock cooler is. I'm actually questioning needing a block for this GPU, but then my custom loop with 2x 240mm rads would only be cooling my CPU lol.


Nice to hear that a lot of the non-reference EVGA cards fit a block. I would like to loop the new system, but later on down the road. I will secure the CPU and GPU blocks now and get to it later in the year when everything is known good and working. I have terrible luck, so I tend to wait a bit before doing anything rash.


----------



## kalidae

Hey guys, I picked up a Strix 1080 Ti, the non-OC model. I have not played with voltages and I'm not sure if I will... I probably will. But I'm pretty happy with this GPU. I have it stable with +160 MHz on the core and +600 on the memory. For benchmarking purposes the fans were at a constant 70% and my temps topped out at 46°C. Average core speed is around 2011-2020 MHz and the peak was 2063 MHz. It's a perfect match for my 4K Swift.


----------



## Maximization

My power supply shuts down if I push too hard.

I got 2088:
https://www.3dmark.com/spy/2250207

But my overall score went down with it; the score was higher at 2025:

https://www.3dmark.com/spy/2242338


----------



## Alexium

So can the XOC BIOS be flashed onto any FE card for a removed power limit? Or is that not how it works?


----------



## weskeh

Quote:


> Originally Posted by *Maximization*
> 
> my power supply shuts down if I push too hard.
> 
> I got 2088
> https://www.3dmark.com/spy/2250207
> 
> but my overall score went down with it. score was higher with 2025
> 
> https://www.3dmark.com/spy/2242338


Yep, Pascal is weird for OCing; there's too much going on at the hardware level.

Been pushing clocks yesterday, and although my core can get high, it does little to nothing.

Benched RotTR probably 20 times with different clocks/voltages. Stock performed best lol


----------



## Alexium

Quote:


> Originally Posted by *weskeh*
> 
> Been pushing clocks yesterday and although my core can get high it does litle to nothing.
> 
> Benched rottr prob 20 times with diff clocks/voltages. Stock performed best lol


Could that be due to power and/or thermal limit being hit, and subsequent throttling?


----------



## Cookybiscuit

Still no way to disable GPU Boost on these? It's annoying as hell.


----------



## weskeh

Quote:


> Originally Posted by *Alexium*
> 
> Could that be due to power and/or thermal limit being hit, and subsequent throttling?


Possibly. Temp stays below 60°C and I can keep the core steady at, say, 2060 MHz.

But I am hitting perfcap limits. Bench points go up in synthetics, but in in-game benches the FPS is worse.


----------



## TheBoom

So I just had the weirdest crash while in a game of Overwatch. Both screens went black and the GPU fans ramped up to max speed. Seems like some form of OCP was triggered, but I don't think it was from the PSU.

This is the first time this has happened, and it definitely isn't a regular driver/overclock crash.


----------



## 2ndLastJedi

I hope your profile pic isn't a sign of things to come!


----------



## TheBoom

Quote:


> Originally Posted by *2ndLastJedi*
> 
> I hope your profile pic isnt a sign of things to come !


Lol. Don't jinx it, mate. I just cleaned and updated drivers. Fingers crossed it doesn't happen again. Worst case, I might have to revert to the default BIOS and undervolt.

Too bad the component cooling on the AMP Extreme pretty much sucks, as reviewed by GN.


----------



## TheBoom

Turns out it was probably a driver crash after all. For some reason the latest drivers can crash in this manner?

Reverted AB to defaults and no crash.

Is there any BIOS, apart from the XOC and all of Zotac's BIOSes, that has a high power limit?

All the Zotac BIOSes have that fan bug for me where the fans won't stop even at idle and low temps.


----------



## Dasboogieman

Quote:


> Originally Posted by *Cookybiscuit*
> 
> Still no way to disable GPU Boost on these? It's annoying as hell.


It's kinda intrinsic to the Pascal architecture. We can't even modify the BIOS, let alone disable boost lol.


----------



## Falkentyne

Quote:


> Originally Posted by *Cookybiscuit*
> 
> Still no way to disable GPU Boost on these? It's annoying as hell.


You can, but it requires a hardware programmer. My 1070 sits firm at 1886 without moving, stable, except in Witcher 3 (1873 is stable in W3).
The clocks don't move at all.


----------



## coc_james

Quote:


> Originally Posted by *TheBoom*
> 
> Turns out it was probably a driver crash after all. For some reason the latest drIvers can crash in this manner?
> 
> Reverted AB to default and no crash.
> 
> Is there any bios apart from XOC and all of Zotac bioses that have a high power limit?
> 
> All the Zotac bioses have that fan bug for me where it won't stop the fans even at idle and low temps.


My AORUS can do 1.094V and up to a 150% power target.


----------



## TheBoom

Quote:


> Originally Posted by *coc_james*
> 
> My AORUS can do 1.094 and up to 150%.


Thanks, but 150% of what exactly? 375 watts total?


----------



## KraxKill

Quote:


> Originally Posted by *coc_james*
> 
> My AORUS can do 1.094 and up to 150%.


Nice that you can slide the slider all the way over there. But what is the card actually doing?


----------



## coc_james

Quote:


> Originally Posted by *TheBoom*
> 
> Quote:
> 
> 
> 
> Originally Posted by *coc_james*
> 
> My AORUS can do 1.094 and up to 150%.
> 
> 
> 
> Thanks but 150% of what exactly? 375 watts total?
Click to expand...

I've pulled 374 watts at my max clocks on the primary card.
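The power-target math is easy to sanity-check, assuming a 250 W base board power limit (the reference GTX 1080 Ti spec; AIB defaults vary, so treat that base figure as an assumption):

```python
# Power ceiling in watts for a given power-target slider percentage,
# assuming a 250 W base limit (reference GTX 1080 Ti spec; AIB BIOSes differ).
BASE_POWER_W = 250

def board_power(pt_percent):
    """Return the board power limit in watts for a power-target percentage."""
    return BASE_POWER_W * pt_percent / 100

print(board_power(150))  # 375.0 -- consistent with the ~374 W draw reported above
```

A 150% slider on a 250 W base works out to 375 W, which matches the observed peak almost exactly.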


----------



## coc_james

Quote:


> Originally Posted by *KraxKill*
> 
> Quote:
> 
> 
> 
> Originally Posted by *coc_james*
> 
> My AORUS can do 1.094 and up to 150%.
> 
> 
> 
> Nice you can slide the slider all the way over there. But what is the card actually doing?
Click to expand...

2113 MHz solid in SuPo @ 1.094V with the slider set to 150%. It's the one with the AIO, so temps never exceed 45 degrees; therefore no throttling.

Look on the SuPo leaderboards under 1080p Extreme, single card. Look for James7679; it's around 43rd or 48thish? You can see the numbers there.

Just grabbed 8th place in SLI yesterday.


----------



## KraxKill

Quote:


> Originally Posted by *coc_james*
> 
> 2113mhz solid in SuPo @1.094 w/ slider set to 150%. It's the one with the AIO so temps never exceed 45 degrees, therefore no throttling.


Post your GPU-Z. I call bull isht. You are most certainly getting TDP throttling at those volts on a GP102; there's no way around it without hard-modding the board.

If you're running the XOC it's just lying to you as far as the clock is concerned. It will still throttle in the background due to the hardcoded die TDP.


----------



## coc_james

Quote:


> Originally Posted by *KraxKill*
> 
> Quote:
> 
> 
> 
> Originally Posted by *coc_james*
> 
> 2113mhz solid in SuPo @1.094 w/ slider set to 150%. It's the one with the AIO so temps never exceed 45 degrees, therefore no throttling.
> 
> 
> 
> Post your GPUz. I call bull isht. You are most certainly getting TDP throttling at those volts on a GP102. No way around it without hard modding the board.
Click to expand...

Okay dude. Like I said, look at the SuPo leader boards.


----------



## TheBoom

That escalated quickly.

Thanks anyway I will look up the Aorus bios.


----------



## coc_james

Quote:


> Originally Posted by *TheBoom*
> 
> That escalated quickly.
> 
> Thanks anyway I will look up the Aorus bios.


I can pull it for you this weekend and put it in a Dropbox, if you want. It is an AIO card; not sure how that will affect the BIOS.


----------



## coc_james

Ahem. For the guy calling me a liar.

https://benchmark.unigine.com/results/rid_a08e437ecf384fa5b5838a60cd0cf61b


----------



## KraxKill

Quote:


> Originally Posted by *coc_james*
> 
> Ahem. For the guy calling me a liar.
> 
> https://benchmark.unigine.com/results/rid_a08e437ecf384fa5b5838a60cd0cf61b


Yeah, congratulations. SuPo 4K doesn't actually track your clock accurately.

What's your SuPo 4K score?


----------



## TheBoom

Quote:


> Originally Posted by *coc_james*
> 
> I can pull it for you this weekend and put in a Dropbox. If you want. It is an AIO card, no sure how that will affect the BIOS.


It's fine, I'll pull it from TechPowerUp. The AIO shouldn't affect the BIOS. Thanks anyway.


----------



## coc_james

Quote:


> Originally Posted by *KraxKill*
> 
> Quote:
> 
> 
> 
> Originally Posted by *coc_james*
> 
> Ahem. For the guy calling me a liar.
> 
> https://benchmark.unigine.com/results/rid_a08e437ecf384fa5b5838a60cd0cf61b
> 
> 
> 
> Yeah congratulations. supo4k doesn't actually track your clock accuratly.
> 
> What's your supo4k score?
Click to expand...

You're more than welcome to look at my scores on there. I have nothing to prove. It's there for you to find.


----------



## coc_james

Quote:


> Originally Posted by *TheBoom*
> 
> Quote:
> 
> 
> 
> Originally Posted by *coc_james*
> 
> I can pull it for you this weekend and put in a Dropbox. If you want. It is an AIO card, no sure how that will affect the BIOS.
> 
> 
> 
> It's fine I'll pull it from techpowerup. Shouldn't affect the bios. Thanks anyway.
Click to expand...

Rgr that.


----------



## KedarWolf

Quote:


> Originally Posted by *TheBoom*
> 
> Quote:
> 
> 
> 
> Originally Posted by *coc_james*
> 
> I can pull it for you this weekend and put in a Dropbox. If you want. It is an AIO card, no sure how that will affect the BIOS.
> 
> 
> 
> It's fine I'll pull it from techpowerup. Shouldn't affect the bios. Thanks anyway.
Click to expand...

If it's the Gigabyte AORUS BIOS, it won't work on any card but the AORUS and maybe the Zotac AMP.

I've tested it on my FE and the clocks are all messed up; it doesn't work right at all. I think they use a different voltage controller.


----------



## KraxKill

Quote:


> Originally Posted by *coc_james*
> 
> You're more than welcome to look at my scores on there. I have nothing to prove. It's there for you to find.


Considering you score 10,200 in SuPo 4K at 2100 and I score ~10,400 at 2000 @ 1V, you may want to actually climb off the high horse and learn something.

Your card is actually being TDP throttled, which is hurting your score.


----------



## KedarWolf

Quote:


> Originally Posted by *coc_james*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KraxKill*
> 
> Quote:
> 
> 
> 
> Originally Posted by *coc_james*
> 
> Ahem. For the guy calling me a liar.
> 
> https://benchmark.unigine.com/results/rid_a08e437ecf384fa5b5838a60cd0cf61b
> 
> 
> 
> Yeah congratulations. supo4k doesn't actually track your clock accuratly.
> 
> What's your supo4k score?
> 
> Click to expand...
> 
> You're more than welcome to look at my scores on there. I have nothing to prove. It's there for you to find.
Click to expand...

What you'd need to do is have Afterburner open on a second screen, or log your run in Afterburner, and see if the clocks jump around at all.

He's not disputing that you're getting those clocks, just that they stay stable at the top clock.

You could post the log here.
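If you do log a run, a tiny script can flag the dips for you. A minimal sketch, assuming you've already pulled the sampled core clocks out of the Afterburner log into a plain list of numbers; the function name, tolerance, and sample values below are made up for illustration:

```python
# Minimal sketch: flag clock dips in a series of sampled core clocks (MHz).
# Sample data here is fabricated; in practice the values would come from an
# MSI Afterburner hardware-monitoring log of a benchmark run.

def find_clock_dips(samples_mhz, target_mhz, tolerance_mhz=15):
    """Return (index, clock) pairs where the core dropped below
    target - tolerance, which usually points at power/thermal throttling."""
    floor = target_mhz - tolerance_mhz
    return [(i, mhz) for i, mhz in enumerate(samples_mhz) if mhz < floor]

samples = [2113, 2113, 2100, 2088, 2113, 2050, 2113]  # fabricated example
dips = find_clock_dips(samples, target_mhz=2113)
print(dips)  # [(3, 2088), (5, 2050)]
```

An empty result over a full run would back up the "solid at top clock" claim; any hits show exactly when the card backed off.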


----------



## coc_james

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *coc_james*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KraxKill*
> 
> Quote:
> 
> 
> 
> Originally Posted by *coc_james*
> 
> Ahem. For the guy calling me a liar.
> 
> https://benchmark.unigine.com/results/rid_a08e437ecf384fa5b5838a60cd0cf61b
> 
> 
> 
> Yeah congratulations. supo4k doesn't actually track your clock accuratly.
> 
> What's your supo4k score?
> 
> Click to expand...
> 
> You're more than welcome to look at my scores on there. I have nothing to prove. It's there for you to find.
> 
> Click to expand...
> 
> What you'd need to do is have Afterburrner open on a second screen or logging your run in Afterburner and see if clocks jump around at all.
> 
> He's not disputing your getting those clocks, just that they stay stable at the top clock.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You could post the log here.
Click to expand...

Look in the detailed view.

https://benchmark.unigine.com/results/rid_a08e437ecf384fa5b5838a60cd0cf61b


----------



## coc_james

Quote:


> Originally Posted by *KraxKill*
> 
> Quote:
> 
> 
> 
> Originally Posted by *coc_james*
> 
> You're more than welcome to look at my scores on there. I have nothing to prove. It's there for you to find.
> 
> 
> 
> Considering you score 10,200 in Supo 4k at 2100 and I score ~10400 at 2000 @ 1v you may want to actually climb off the high horse and learn something.
> 
> Your card is actually being TDP throttled, hurting your score.
Click to expand...

High horse? You came after me, remember that. You decided you wanted to be a little keyboard warrior. Sure, maybe I don't know everything, but if you want to actually teach someone something, don't start by acting like a douche.


----------



## KraxKill

Quote:


> Originally Posted by *coc_james*
> 
> Look in the detailed view.
> 
> https://benchmark.unigine.com/results/rid_a08e437ecf384fa5b5838a60cd0cf61b


Yeah, that detailed view is BS. It's flat for everyone lol.


----------



## coc_james

Quote:


> Originally Posted by *KraxKill*
> 
> Quote:
> 
> 
> 
> Originally Posted by *coc_james*
> 
> Look in the detailed view.
> 
> https://benchmark.unigine.com/results/rid_a08e437ecf384fa5b5838a60cd0cf61b
> 
> 
> 
> Yeah that detailed view is BS. It's flat for everyone lol.
Click to expand...

See, you actually taught me something. I've been on here doing a lot of learning when it comes to GPUs. I JUST got these cards within the last month and am still trying to figure some things out.


----------



## KedarWolf

Quote:


> Originally Posted by *coc_james*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KraxKill*
> 
> Quote:
> 
> 
> 
> Originally Posted by *coc_james*
> 
> You're more than welcome to look at my scores on there. I have nothing to prove. It's there for you to find.
> 
> 
> 
> Considering you score 10,200 in Supo 4k at 2100 and I score ~10400 at 2000 @ 1v you may want to actually climb off the high horse and learn something.
> 
> Your card is actually being TDP throttled, hurting your score.
> 
> Click to expand...
> 
> High horse? You came after me, remember that. You decided you wanted to be a little keyboard warrior. Sure, maybe I don't know everything, but if you want to actually teach someone something, dont start by acting like a douche.
Click to expand...

Not sure why you are getting such low scores. You might want to test at lower clocks and see if they improve. Some say increasing clocks can actually HURT your scores.

And I'd still like to see an Afterburner log with, say, a Superposition 4K Optimized run, to see how much you power limit; I'm not sure I fully trust Superposition.


----------



## coc_james

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *coc_james*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KraxKill*
> 
> Quote:
> 
> 
> 
> Originally Posted by *coc_james*
> 
> You're more than welcome to look at my scores on there. I have nothing to prove. It's there for you to find.
> 
> 
> 
> Considering you score 10,200 in Supo 4k at 2100 and I score ~10400 at 2000 @ 1v you may want to actually climb off the high horse and learn something.
> 
> Your card is actually being TDP throttled, hurting your score.
> 
> Click to expand...
> 
> High horse? You came after me, remember that. You decided you wanted to be a little keyboard warrior. Sure, maybe I don't know everything, but if you want to actually teach someone something, dont start by acting like a douche.
> 
> Click to expand...
> 
> Not sure why you are getting so low scores. Might want to test at lower clocks, see if they improve. Some say increasing clocks can actually HURT your scores.
> 
> And I'd still like to see an Afterburner log with say a Superposition 4K Optimized run see how much you power limit, not sure I fully trust Superposition.
Click to expand...

I don't mind being wrong. I can admit when I am wrong, even if out of ignorance. I don't know what I don't know. And we learn from failures, and from people with more knowledge generously sharing it.

I'm running the pair now and am trying to get the voltage working properly on them. You were helping me in the other thread. I still can't seem to get the voltage to work on the second card.


----------



## Dasboogieman

Quote:


> Originally Posted by *KraxKill*
> 
> Post your GPUz. I call bull isht. You are most certainly getting TDP throttling at those volts on a GP102. No way around it without hard modding the board.
> 
> If you're running the XOC it's just lying to you as far as the clock is concerned. It will still throttle in the background due to the hardcoded die TDP.


No, his story is pretty legit.

My power throttle has vanished completely with 4K workloads once I went under water: 2100MHz @ 1.093V. Mine is an Aorus 1080 Ti with a custom waterblock setup.
My temps are around 35C. The highest burst I've ever seen drawn is 145% power limit in 4K SuPo.

If I do 8K then I get some minor power throttling, even on the Arctic Storm BIOS.
Quote:


> Originally Posted by *coc_james*
> 
> I can pull it for you this weekend and put it in a Dropbox, if you want. It is an AIO card; not sure how that will affect the BIOS.


Colder temperatures will reduce your power draw, assuming voltage and clockspeed are the same. In fact, the correlation is very linear.

Your workloads + clockspeeds would be basically impossible with the typical 60C+ air-cooling setup; under water, they're very possible.
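That near-linear temperature-to-power relationship can be sketched as a toy model. The baseline temperature, baseline wattage, and watts-per-degree coefficient below are made-up illustrative numbers, not measurements from any card:

```python
# Toy model of the linear temp->power behavior described above: at a
# fixed voltage and clock, leakage makes power draw rise roughly
# linearly with die temperature. All coefficients are assumptions.

def estimated_power(temp_c, base_temp=35.0, base_watts=250.0, watts_per_c=1.5):
    """Estimate power draw (W) at a given die temperature (C)."""
    return base_watts + watts_per_c * (temp_c - base_temp)

print(estimated_power(35))  # 250.0 at the water-cooled baseline
print(estimated_power(65))  # 295.0 with a typical 60C+ air cooler
```

The point of the sketch is only the shape of the curve: cooling the die buys you real TDP headroom at the same V/f point.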


----------



## KraxKill

Quote:


> Originally Posted by *coc_james*
> 
> See, you actually taught me something. I've been on here doing a lot of learning when it comes to GPUs. I JUST got these cards within the month and am still trying to figure some things out.


Based on what you scored, if you lower your clock a bit and drop as much voltage as you can, you will actually score the same as or higher than you did already.

The limiter on these cards is the internal TDP (normalized TDP) or whatever you want to call it. The cards actually hit this limit even as low as 1V, especially if you're not shunt modded. This behavior is not really represented by GPU-Z and other monitors and has not really been described. The AIB cards with higher power limits simply remove the software TDP limits, as does the shunt mod, but the hardcoded on-die limits are still well in effect. These current limits need manual voltage hard mods to bypass.

If I may suggest: try 2000 at ~1.000V (run less if you can) and work up from there. You will be surprised by your score, and I bet you can beat what you've already set at somewhere between 1.000-1.050V, at whatever frequency you can maintain there. Probably ~2050 or so.

If anybody wants to test this, it's easy. Just set 2000mhz and start increasing voltage without increasing the clock. You will see your score go down as you add voltage.

In my search for mining efficiency, I've had my way with multiple cards now and have set up mining rigs. I've run my card as low as -17C in this search and have posted 10920 @ 2100MHz at 1.063V on my shunt-modded card. Despite this, I now run my Ti at 2000MHz @ 1V and outscore (10.4-10.5k) many that are clocked significantly higher.

The issue is that as the voltage climbs, the instances of TDP throttling (perhaps it should be called current throttling) increase, especially in SuPo 4K, eventually eating more of your score than the additional clock gains back.

Good luck.
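The sweep described above (hold the clock fixed, step the voltage, record the score at each point) is easy to keep organized with a few lines of Python. The helper is hypothetical and the scores below are made-up placeholders; you'd fill them in from your own Superposition runs:

```python
# Hypothetical bookkeeping for a fixed-clock voltage sweep. Record the
# benchmark score at each voltage step, then find the sweet spot.

def best_voltage(sweep):
    """Return the (voltage, score) pair with the highest score."""
    return max(sweep.items(), key=lambda kv: kv[1])

# voltage (V) -> Superposition 4K score; illustrative numbers only
sweep = {
    1.000: 10450,
    1.031: 10420,
    1.062: 10350,
    1.093: 10200,  # TDP throttling eats the score at higher voltage
}

volts, score = best_voltage(sweep)
print(f"Best efficiency point: {volts:.3f} V -> {score}")
```

If the claim above holds for your card, the table's scores fall as voltage rises, and `best_voltage` lands at the lowest stable voltage rather than the highest clock.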


----------



## coc_james

So, as I've been testing my min/max clocks (min being the highest stable at 1V), I've been launching SuPo 1080p Extreme; I watch the clocks, and as soon as I see them drop, I stop the test and change as needed. SuPo does show you temps and clocks while it's running. Is that not accurate? I assumed if the clocks weren't dropping in the SuPo OSD that they were solid.


----------



## Dasboogieman

Quote:


> Originally Posted by *coc_james*
> 
> So, as I've been testing my min/max clocks{min is highest at 1v}, I've been launching SuPo 1080p extreme, I watch the clocks and as soon as I see them drop, I stop the test and change as needed. SuPo does show you temps and clocks while it's running. Is that not accurate? I assumed if the clocks weren't dropping when showing in the OSD for SuPo that they were solid.


The only way to be truly sure of the clockspeed on Pascal is with an oscilloscope lol. The fluctuations happen very, very quickly.

If you want to be sure, open up HWiNFO64 and record the SuPo run. If your max power has hit 150%+ then your card throttled at some stage. This isn't as accurate as a real-time scope shot, but it's good enough for what we are interested in.
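That after-the-fact log check can be automated. A minimal sketch, assuming the run was exported as CSV with a power-limit percentage column; the column name "GPU Power [% TDP]" is an assumption here, since the actual sensor labels vary by monitoring tool and version:

```python
import csv
import io

# Scan a logged benchmark run and flag power throttling if the
# power-limit column ever crossed the threshold. Column name is an
# assumption; adjust it to match your log's header row.

def hit_power_limit(log_file, column="GPU Power [% TDP]", limit=150.0):
    reader = csv.DictReader(log_file)
    return any(float(row[column]) >= limit for row in reader)

# Illustrative log snippet, not real data
sample = io.StringIO(
    "Time,GPU Power [% TDP]\n"
    "0,120.5\n"
    "1,151.2\n"
    "2,118.0\n"
)
print(hit_power_limit(sample))  # True -> the card throttled at some point
```

A single row over the limit is enough to explain a clock dip that the in-run OSD never showed.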


----------



## KraxKill

Quote:


> Originally Posted by *coc_james*
> 
> So, as I've been testing my min/max clocks{min is highest at 1v}, I've been launching SuPo 1080p extreme, I watch the clocks and as soon as I see them drop, I stop the test and change as needed. SuPo does show you temps and clocks while it's running. Is that not accurate? I assumed if the clocks weren't dropping when showing in the OSD for SuPo that they were solid.


If you run the XOC BIOS, your clocks won't dip at all, regardless of voltage, in any of the reporting. You can only trust SuPo and GPU-Z so much. I wouldn't worry about throttling at all. Just watch what's important (your score) as you change voltage at any given clock. If the clock is not hopping and the temperature is constant but the score changes... there's clearly more to it than just what's being reported.


----------



## Alexium

Quote:


> Originally Posted by *KraxKill*
> 
> If you run the XOC Bios, your clocks won't dip at all regardless of voltage in any of the reporting.


So is it safe (in terms of not bricking the card) to flash the XOC BIOS onto any Founder's Edition 1080 Ti card? Or is that not how it works?


----------



## coc_james

Quote:


> Originally Posted by *KraxKill*
> 
> Quote:
> 
> 
> 
> Originally Posted by *coc_james*
> 
> So, as I've been testing my min/max clocks{min is highest at 1v}, I've been launching SuPo 1080p extreme, I watch the clocks and as soon as I see them drop, I stop the test and change as needed. SuPo does show you temps and clocks while it's running. Is that not accurate? I assumed if the clocks weren't dropping when showing in the OSD for SuPo that they were solid.
> 
> 
> 
> If you run the XOC Bios, your clocks won't dip at all regardless of voltage in any of the reporting. You can only trust Supo and GPUz so much. I wouldn't worry about throttling at all. Just watch what's important (your score) as you change voltage at any given clock. If the clock is not hopping, the temperature constant but the score changes....there's clrearly more to it than just what's being reported.
Click to expand...

Dont think I can run the XOC bios, this card is the one with the built in AIO.


----------



## KraxKill

Quote:


> Originally Posted by *Alexium*
> 
> So is it safe (in terms of not bricking the card) to flash the XOC BIOS onto any Founder's Edition 1080 Ti card? Or is that not how it works?


That's up to you to decide. I tested it on my FE without issue.

The XOC is basically the less efficient version of the shunt mod, with the added option (though useless without additional manual bypasses) of being able to run additional voltage bins beyond 1.093V.

I tested this extensively in the early days when the cards first launched.

This is a bit cryptic, but here it goes...

An FE card with a shunt mod outperforms an FE on the XOC BIOS (~3%) in clock-for-clock, volt-for-volt testing. This is because the stock FE BIOS is better optimized for the FE card.

However, *if you don't run a shunt mod*, the XOC will still allow higher performance at a given clock and voltage: despite a small handicap, it makes up for it by lifting the soft TDP limit (the BIOS limit) beyond the otherwise limiting FE (non-shunted) BIOS. However, running additional voltage (beyond 1.093V) on the XOC will not allow higher performance, as the voltage simply eats into the TDP pie.

In other words....

A shunt-modded FE on the stock BIOS is the best combination possible.

An XOC BIOS'ed FE (or other board or BIOS) is the next best combination: despite taking a small performance penalty for running a non-native BIOS, it still makes up for this by having a higher soft TDP compared to the vanilla cards. So, running the same voltage as on the native BIOS (a reasonable voltage that doesn't bump deeply into the internal GPU limits), the XOC will still outperform a stock card.

However, if you're after the XOC for more voltage in the hopes of achieving higher clocks, you're in for a surprise: the would-be gains are negated by the beyond-reasonable current draw at higher voltages, which the hardcoded limiters on the die compensate for.


----------



## Alexium

Quote:


> Originally Posted by *KraxKill*
> 
> This is a bit cryptic, but here it goes...


Thanks a lot! I think I understand. This also means, basically, that if I'm shopping for a Ti to put under a full-cover waterblock, there's no reason NOT to go for the FE.

Are there known cases of the XOC BIOS bricking an FE card?


----------



## Cookybiscuit

Quote:


> Originally Posted by *Falkentyne*
> 
> You can but requires a hardware programmer. My 1070 sits at 1886 firm without moving stable, except in witcher 3 (1873 is stable in W3).
> Clocks don't move at all.


Interesting, never heard of this. I assume it's complicated and probably not worth it for plebs like me who are just annoyed by clocks moving around for no reason?


----------



## coc_james

@KedarWolf @KraxKill

I couldn't figure out how to upload a single log file, and on this run I forgot to log anyway. My previous run was better, but again, I cannot figure out how to upload a single log. Also, I'm knocking on the door of my PSU limit of 850W. @2076, dropped to 2063 @ 1.094V.


----------



## coc_james

Here's voltage at 1000mv, max freq set to 2012, which dropped to 2000.


----------



## epidemic

The newly assembled rig.


----------



## kalidae

Hey guys, so I'm running the Strix non-OC edition. Is the Strix OC BIOS worth flashing over? Would I see much difference, if any? Also, which BIOS do you guys recommend? The highest score I have been able to get in Superposition 4K is 10040, and that's with +160 core and +600 memory.


----------



## KraxKill

Quote:


> Originally Posted by *coc_james*
> 
> Here's voltage at 1000mv, max freq set to 2012, which dropped to 2000.


That looks really good! Just 2.5% less performance at nearly 200W less power and heat going through the cards, your power supply, etc. Does your memory have more headroom at 2000 vs. when you run 2076? I found you can often bump the VRAM a bit when running a lower core clock and voltage. It may be worthwhile testing a few different VRAM frequencies here, as GDDR5X for me behaves a bit oddly, making the score jump as much as 200 points depending on whether you're in a bad vs. good memory bin. What I've seen was... +600 good, +610 bad, +620 good again, and so on. Not sure why this is, but there have been a few people in the thread who have posted about it. Finding a good mem bin can help. Or you may already be there.
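The trade-off above (~2.5% lower score for ~200W less draw) is easy to put in points-per-watt terms. The score and wattage figures in this sketch are illustrative assumptions, not measurements from the thread:

```python
# Back-of-the-envelope efficiency comparison for a high-voltage OC vs.
# an undervolted profile. All numbers are assumed for illustration.

def points_per_watt(score, watts):
    """Benchmark points delivered per watt of board power."""
    return score / watts

high = points_per_watt(10450, 550)  # e.g. 2076 MHz at high voltage (assumed)
low = points_per_watt(10190, 350)   # e.g. 2000 MHz @ 1.000 V (assumed)
print(f"{low / high:.2f}x the efficiency at the lower voltage")
```

With those assumed figures, the undervolted profile delivers roughly 1.5x the points per watt for a ~2.5% score penalty.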


----------



## coc_james

Quote:


> Originally Posted by *KraxKill*
> 
> Quote:
> 
> 
> 
> Originally Posted by *coc_james*
> 
> Here's voltage at 1000mv, max freq set to 2012, which dropped to 2000.
> 
> 
> 
> 
> That looks really good! Just 2.5% less performance at nearly 200w less power and heat going through the cards, your power supply, etc. Does your memory have more headroom at 2000 vs when you run 2076? I found you can often bump the vram a bit when running a lower core clock and voltage. It may be worthwhile testing a few different vram frequencies here as DDR5X for me behaves a bit odd, making the score jump as much as 200 points if you're in a bad vs good memory bin. What I've seen was... +600 good, +610 bad, +620 good again and so on. Not sure why this is, but there have been a few people in the thread that have posted about this. Finding a good mem bin can help. Or you may already be there..
Click to expand...

I'll give it a shot tomorrow. Am I using the power target right? Leave it cranked up even when using 1000mV? It seems like when I turn it down, the clocks fluctuate.


----------



## kevindd992002

What sort of supo 4K scores should I be getting with a 2700K cpu and a z68 platform?


----------



## Amuro

Hello masters, is there any other way to downclock GDDR5X past -502MHz? That's the limit in Afterburner. I'm on a 50% power limit and +75MHz core; I get higher performance/FPS when the memory is downclocked to -502MHz. If I increase it to, say, the default 0 or +500MHz, performance goes down together with the core clock. It seems that the memory is grabbing more power from the limited 50% PL (which is 125 watts max), increasing bandwidth but decreasing performance at the same time, because the core can no longer sustain a higher clock. Thank you.
The Witcher 3, Ultra, HairWorks on 8x + HBAO+, at 1080p



my gaming benchmark @ 50% powerlimit
https://www.youtube.com/channel/UCze_je-jMNwt-MJbnl_sNHw/videos?sort=dd&view=0&shelf_id=0


----------



## Dasboogieman

Quote:


> Originally Posted by *Amuro*
> 
> hello masters, is there any other way to downclock gdrr5x to past -502mhz thats the limit on afterburner. im on a 50% powerlimit and +75mhz core, i get higher performance/fps when the memory is downclock at -502mhz. if i increase it lets say deafault 0 or +500mhz, performance will go down together with the core clock. it seems that the memory is grabbing more power from the limited 50% pl (which is 125watts max.) increasing bandwith but decreasing performance at the same time because the core can no longer sustain a higher clock. thank you.
> 
> 
> 
> my gaming benchmark @ 50% powerlimit
> https://www.youtube.com/channel/UCze_je-jMNwt-MJbnl_sNHw/videos?sort=dd&view=0&shelf_id=0


I am confused: why is your power limit so low? Reducing your VRAM speeds past that point would tank your performance waaaay more than the core will. Something else is wrong with your setup.

What we've found is that increasing your VRAM very high and being conservative with core clocks is the best balance of performance to power. Your 1080 Ti should not be drawing only 120W of power; something is seriously wrong.


----------



## nuno_p

He limited the power limit on purpose to 50%.


----------



## Amuro

Quote:


> Originally Posted by *Dasboogieman*
> 
> I am confused, why is your power limit so low? Reducing your VRAM speeds past that point would tank your performance waaaay more than what the core will. Something else is wrong with your setup,
> 
> What we've found is increasing your VRAM very high and being conservative with Core Clocks is the best balance of performance to power. Your 1080ti should not be only drawing 120W of power, something is seriously wrong.


I'm done with overclocking my 1080 Ti; I'm now underclocking, undervolting, and power limiting to see how efficient this chip is in power-to-performance terms at the lowest level. You see, at 115-125 watts max power draw, it is almost at the same performance as an overclocked GTX 1080 that draws 180+ watts of power.


----------



## kevindd992002

Quote:


> Originally Posted by *kevindd992002*
> 
> What sort of supo 4K scores should I be getting with a 2700K cpu and a z68 platform?


Anyone?


----------



## Sweetwater

Quote:


> Originally Posted by *kevindd992002*
> 
> Anyone?


I'd say between 9800 and 10700


----------



## kevindd992002

Quote:


> Originally Posted by *Sweetwater*
> 
> I'd say between 9800 and 10700


And that's because of the old platform, even though it's not technically CPU-bottlenecked since the test is running at 4K?


----------



## Sweetwater

Yeah the CPU hardly impacts the score, at least from what I've seen of Intel. I'd hazard a guess there's a minimal difference between an i5 2500k and a 7700k. Margin of error stuff


----------



## TheBoom

Quote:


> Originally Posted by *KedarWolf*
> 
> If it's the Gigabyte Aorus BIOS it won't work on any Card but the Aorus and maybe the Zotac AMP.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've tested it on my FE and clocks all messed up, don't work right at all. I think they use a different voltage controller.


As it turns out, I have the Amp Extreme. The Arctic Storm BIOS is all sorts of weird for me. It also has the fan-stop bug.

If only someone figured out how to undervolt the XOC BIOS. I'm pretty sure it's the extra voltage causing the crashes. I can't get it below the default 1.13V no matter what.

Quote:


> Originally Posted by *Sweetwater*
> 
> Yeah the CPU hardly impacts the score, at least from what I've seen of Intel. I'd hazard a guess there's a minimal difference between an i5 2500k and a 7700k. Margin of error stuff


I actually did a few tests on my 6700K at 4.5GHz vs. 4.8GHz, and scores jumped from about 9900 to 10200. Not much, but they did. So I'd say it actually matters more than we think. I'd guess it's the single-thread performance affecting the scores somehow.
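For what it's worth, the scaling in those numbers works out like this (the clocks and scores are taken from the test just described; the arithmetic is the only thing added):

```python
# A 4.5 -> 4.8 GHz CPU bump (+6.7%) moved the Superposition score from
# ~9900 to ~10200 (+3.0%), so roughly half the clock gain showed up
# in the score.

def pct_gain(before, after):
    """Percentage change from before to after."""
    return (after - before) / before * 100

clock_gain = pct_gain(4.5, 4.8)
score_gain = pct_gain(9900, 10200)
print(f"clock +{clock_gain:.1f}%, score +{score_gain:.1f}%")
```

So the CPU isn't irrelevant at 4K, but it scales well below 1:1 with clock.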


----------



## ZealotKi11er

Quote:


> Originally Posted by *Amuro*
> 
> hello masters, is there any other way to downclock gdrr5x to past -502mhz thats the limit on afterburner. im on a 50% powerlimit and +75mhz core, i get higher performance/fps when the memory is downclock at -502mhz. if i increase it lets say deafault 0 or +500mhz, performance will go down together with the core clock. it seems that the memory is grabbing more power from the limited 50% pl (which is 125watts max.) increasing bandwith but decreasing performance at the same time because the core can no longer sustain a higher clock. thank you.
> the witcher 3 ultra hw on 8x + hbao+ at 1080p
> 
> 
> 
> my gaming benchmark @ 50% powerlimit
> https://www.youtube.com/channel/UCze_je-jMNwt-MJbnl_sNHw/videos?sort=dd&view=0&shelf_id=0


Yeah, G5X seems to draw a lot of power. If the 1080 Ti had HBM2 it would have been a monster.


----------



## KraxKill

Quote:


> Originally Posted by *coc_james*
> 
> I'll give it a shot tomorrow. Am I using the Power Target right? Leave it cranked up even when using 1000mv? Seems like when I turn it down the clocks fluctuate.


Yeah, leave the power target cranked all the way up; that way it will push the cards as hard as they can draw. If you lower it, it will just clip the power peaks off and start chopping performance by clock hopping in hard-to-render scenes. These cards are magic undervolted but become really inefficient at higher voltages.

See how much more room, if any, running 1V has allowed you on the memory side. The memory is also part of the TDP, so at higher voltages, where you're constantly near throttle, higher memory frequencies will often result in lower performance (this is why some people think it's error correction kicking in as memory frequencies rise). In reality, however, I suspect that the RAM is just hitting the TDP limits along with the core and pulling performance down. But at 1V you gain a lot of TDP headroom, so the memory may also have more room to run. Obviously it will depend on whether the VRAM can sustain higher frequencies, but it's worth a shot.


----------



## ZealotKi11er

Quote:


> Originally Posted by *KraxKill*
> 
> Yeah leave the power target cranked up all the way that way it will push the cards as hard as they can draw. If you lower it, it will just clip the power peaks off and start chopping performance by clock hopping in hard to render scenes. These cards are magic undervolted but become really inefficient at higher voltages.
> 
> See how much more room if any running 1v has allowed you on the memory side. The memory is also part of the TDP, so at higher voltages where you're constantly near throttle higher memory frequencies will often result in lower performance (this is why some people think it's error correction kicking in as memory frequencies rise.) In reality however I suspect that the ram is just hitting the TDP limits along with the core and pulling performance. But at 1v, you gain a lot of TDP headroom so the memory may also have more room to run. Obviously it will depend on if the vram can sustain higher frequencies, but it's worth a shot.


For me, even 120% can't keep 1V all the time. I think it has to do with the AIO pump sucking power.


----------



## KraxKill

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I think it has to do with AIO pump sucking power.


You could test this by running the AIO off of the motherboard headers. You'd need to set a fixed RPM, but you'd know. If it draws 20W, for example, that can make a difference for sure.


----------



## ZealotKi11er

Quote:


> Originally Posted by *KraxKill*
> 
> You could test this by running the AIO off of the motherboard headers. You'd need to set a fixed RPM but you'd know. If it draws 20w for example that can make a differences for sure.


I need to find some form of adapter, because it's not 3-pin.


----------



## TheBoom

Had the same crash even with the stock BIOS. What's worse, now the GPU is stuck at the base boost bin? Reinstalling drivers with the clean option didn't do anything.

I hope the card isn't bricked.


----------



## MrGreaseMonkkey

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Amuro*
> 
> hello masters, is there any other way to downclock gdrr5x to past -502mhz thats the limit on afterburner. im on a 50% powerlimit and +75mhz core, i get higher performance/fps when the memory is downclock at -502mhz. if i increase it lets say deafault 0 or +500mhz, performance will go down together with the core clock. it seems that the memory is grabbing more power from the limited 50% pl (which is 125watts max.) increasing bandwith but decreasing performance at the same time because the core can no longer sustain a higher clock. thank you.
> 
> 
> 
> my gaming benchmark @ 50% powerlimit
> https://www.youtube.com/channel/UCze_je-jMNwt-MJbnl_sNHw/videos?sort=dd&view=0&shelf_id=0
> 
> 
> 
> I am confused, why is your power limit so low? Reducing your VRAM speeds past that point would tank your performance waaaay more than what the core will. Something else is wrong with your setup,
> 
> What we've found is increasing your VRAM very high and being conservative with Core Clocks is the best balance of performance to power. Your 1080ti should not be only drawing 120W of power, something is seriously wrong.
Click to expand...

? This makes no sense. If you're going to limit the power to 50% on a 1080 Ti, just return the card and buy a GTX 1070.

Sent from my iPhone using Tapatalk Pro


----------



## Amuro

Quote:


> Originally Posted by *MrGreaseMonkkey*
> 
> ? This makes no sense, if youre going to limit the power to 50% on a 1080 ti just return the card and buy a Gtx 1070.
> 
> Sent from my iPhone using Tapatalk Pro


live fast die young brotha.


----------



## TheBoom

So while the card is stuck at 139MHz, the perfcap reason is constantly PWR. Nothing fixes this.

At least until I flashed back to the XOC BIOS. Now it boosts and behaves as normal.

Did I somehow convince my Amp Extreme that it's an Asus Strix now? Lol. This is getting really weird.


----------



## GRABibus

Hi,
I got my Gigabyte GTX 1080 Ti Gaming OC 11G several days ago and have been able to do some overclocking investigation.

https://www.gigabyte.com/Graphics-Card/GV-N108TGAMING-OC-11G#kf

I am really happy with this card.
I flashed it with EVGA FTW3 Bios version 86.02.39.01.90
It can bench TimeSpy at 2101MHz (graphics score: 11132 pts).

I could also set up a gaming overclock which is, right now, stable in BF1 (45 min of gameplay) and DOOM (15 min of gameplay).

2063MHz/6055MHz at 1.031V.

Here is my optimised V/F curve to reach this :



In MSI AB :
+99MHz on Core
+547MHz on GDDR5X
127% power limit
100% for voltage range

And tweaking to have 2088MHz at 1.031V, which makes 2063MHz at 1.031V in games.
This makes a base clock at 1669MHz.



I will test of course more times and on other games (Star Wars Battle Front 2015, Quake Champions, TitanFall 2, Black Ops 3 mainly) to see if I can consider it as stable for games.

PS: in Killing Floor 2, this OC crashes immediately...
In fact, in Killing Floor 2, any OC crashes immediately.
I think this is related to the game engine.


----------



## TheBoom

Figured out the problem.

The OCP in Corsair Link was being triggered, shutting down the PCIe rail and sending the GPU into failure.

Which was my initial suspicion after all. Oh well.

Disabling OCP for all PCI-e rails solved the problem, but I don't know if the card is really pulling that much amperage or the fault is on Corsair Link's end.

It might be that the recent CL update broke some reporting mechanism and is causing it to report wrong values, or it might be that the card is suddenly pulling that much power?!


----------



## coc_james

Quote:


> Originally Posted by *TheBoom*
> 
> Figured out the problem.
> 
> The OCP in Corsair Link was being triggered causing the PCI-e rail to short and send the GPU into failure.
> 
> Which was my initial suspicion after all. Oh well.
> 
> Disabling OCP for all PCI-e rails solved the problem, but I don't know if the card is really pulling that much amperage or the fault is on Corsair Link's end.
> 
> Might be the recent update for CL broke some reporting mechanic and is causing it to report wrong values or it might be that the card is suddenly pulling that much power?!


I'm running my RM850i on SLI Tis and a 5.1GHz 7700K. When it was in multi-rail mode I had issues; after switching it to single rail, all is well. That all being said, you should consider starting a conversation with Corsair and your GPU manufacturer. You most certainly should not be hitting OCP.


----------



## Thoth420

Quote:


> Originally Posted by *TheBoom*
> 
> Figured out the problem.
> 
> The OCP in Corsair Link was being triggered causing the PCI-e rail to short and send the GPU into failure.
> 
> Which was my initial suspicion after all. Oh well.
> 
> Disabling OCP for all PCI-e rails solved the problem, but I don't know if the card is really pulling that much amperage or the fault is on Corsair Link's end.
> 
> Might be the recent update for CL broke some reporting mechanic and is causing it to report wrong values or it might be that the card is suddenly pulling that much power?!


Another reason to never use Corsair Link. Added to the list.
So much janky software these days, which is why I run as lean as possible. Blaming it on the type of rail is ridiculous. Glad you sorted it; these cards are not cheap.


----------



## KedarWolf

Quote:


> Originally Posted by *Thoth420*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheBoom*
> 
> Figured out the problem.
> 
> The OCP in Corsair Link was being triggered causing the PCI-e rail to short and send the GPU into failure.
> 
> Which was my initial suspicion after all. Oh well.
> 
> Disabling OCP for all PCI-e rails solved the problem, but I don't know if the card is really pulling that much amperage or the fault is on Corsair Link's end.
> 
> Might be the recent update for CL broke some reporting mechanic and is causing it to report wrong values or it might be that the card is suddenly pulling that much power?!
> 
> 
> 
> Another reason to never use Corsair Link. Added to list
> So much jank software these days which is why I run lean as possible. Blaming it on the type of rail is ridiculous. Glad you sorted it these cards are not cheap.
Click to expand...

OCP may be enabled by default. If you need to install Corsair Link to check, Google Corsair Link 3; Corsair Link 4 is terrible and doesn't work right.

I tried to install it and check.


----------



## coc_james

Dbl post


----------



## taem

If temp/noise, especially noise, is the only consideration, what's the best model to get? For an air cooled build.

Btw, what is the deal with Galax? I love white PC parts, so I always browse Galax. In all these years, they've been sold out of every model every time I look. Why don't they make a few more? I would love to have an EXOC, but I'm not even going to try to get one; I know it's pointless.


----------



## dansi

So I dropped my vcore a notch to 1.031V and the clocks auto-drop a bin, yet I get consistently higher scores in SuPo.

Unless GPU-Z gets more access to Nvidia's sensors, I don't think 3rd-party monitoring software is too accurate at reading Pascal.


----------



## ZealotKi11er

Quote:


> Originally Posted by *dansi*
> 
> So i dropped my vcore a notch 1.031v and the clocks auto drop a bin, yet i get consistent higher scores in supo.
> 
> Unless gpuz gets more access to Nvidia sensors, i don't think 3rd party monitoring software is too accurate in reading pascal


The lower the volts, the higher the clocks will stay without hitting TDP, so you score higher. Also, I found the best way to OC Pascal is under load. Have something running to warm the card up to gaming temps and OC from there. You will always get the correct clock for your temp range. If you OC cold, the card might clock one bin (~13MHz) higher and then clock down.


----------



## sajuuk111

Hello, I have recently bought a 1080 Ti Seahawk X, and I am really happy with the temps I am getting: 50C under load at 1.1V with the XOC BIOS. However, sadly I can't clock my card higher than 2025MHz at any voltage; I tried from 1.093V all the way up to 1.2V, and 2037 yields a 100% crash in every instance. Since I can get 2GHz at 1.093V, I decided to leave the XOC BIOS. However, the card can really use over 300W, so I am looking for a nice BIOS that is compatible with mine.

I saw KedarWolf using an Arctic Storm BIOS, but I can't seem to find it anywhere. Could someone help me with it?


----------



## ZealotKi11er

Quote:


> Originally Posted by *sajuuk111*
> 
> Hello I have recently bought a 1080Ti Seahawk X, and i am really happy with the temps i am getting. 50C on load at 1.1V, with XOC bios. However, sadly i cant clock my card higher than 2025MHz under any voltage, i tried 1.093 to all the way up 1.2V. 2037 yields a 100% crash in every instance. Since i can get 2GHz on 1.093, i decided to leave XOC bios. However, card can really use over 300W so i am looking for a nice BIOS that is compatible with mine.
> 
> I saw KedarWolf using an ArcticStorm bios, but i cant seem to find it anywhere. Could someone help me with it?


Try lowering the voltage; you do not need it that high. Try 2GHz @ 1.025v.


----------



## sajuuk111

It took only 1 second of running Unigine Heaven to crash at 2GHz@1.025v.

I think I really got one of the worst samples possible; too bad these temps will go to waste in terms of clocks.

And it crashes as soon as it launches, at like 34C... I am really disappointed.


----------



## KedarWolf

Quote:


> Originally Posted by *sajuuk111*
> 
> Hello I have recently bought a 1080Ti Seahawk X, and i am really happy with the temps i am getting. 50C on load at 1.1V, with XOC bios. However, sadly i cant clock my card higher than 2025MHz under any voltage, i tried 1.093 to all the way up 1.2V. 2037 yields a 100% crash in every instance. Since i can get 2GHz on 1.093, i decided to leave XOC bios. However, card can really use over 300W so i am looking for a nice BIOS that is compatible with mine.
> 
> I saw KedarWolf using an ArcticStorm bios, but i cant seem to find it anywhere. Could someone help me with it?


After installing the BIOS and rebooting, run the powerlimit.bat as Admin.









powerlimitarcticstormbios.zip 156k .zip file


----------



## sajuuk111

Quote:


> Originally Posted by *KedarWolf*
> 
> After installing the BIOS and rebooting, run the powerlimit.bat as Admin.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> powerlimitarcticstormbios.zip 156k .zip file


Thank you so much for what you are doing. Do i need to force .bat running in every restart?


----------



## Pandora's Box

Quote:


> Originally Posted by *Pandora's Box*
> 
> Anyone else notice that Rise of the Tomb Raider seems to really punish the graphics card in terms of power usage and temperature? I just started playing the game yesterday and noticed my undervolted 1080 Ti was reaching 80C. This was at max in game settings, DX12, 1440P with 2xSSAA - not exactly easy to drive but I was impressed with the performance (60+fps). GPU was even hitting power throttling at 115% power usage, core clock speed was ranging from 1870 - 2000.


Quote:


> Originally Posted by *Mooncheese*
> 
> 80c with the undervolt? Good lord, what's ambient like where you are? Still rocking the Silverstone Raven? I run ROTTR in 3D Vision and found that HBAO+ is extremely resource intensive, try running ssao instead if you want more frames and or want to get the temps down. I can run this game at 60 fps at 2560x1440p basically everywhere with only a few compromises (120 FPS 2D) if you want I can tell you my exact settings when I get home.


Quote:


> Originally Posted by *Pandora's Box*
> 
> No I'm running a thermaltake p3 case now, basically open air (see my rig in my signature). 72f is the ambient temperature. Yeah 80c at 1volt, 70% fan speed. I'm pretty sure it's the 2xssaa that's causing the temps to skyrocket. Game is basically running at 5160x2880 (2xssaa is basically doubling the resolution). I'm happy with the performance, just surprised at the temps and power throttling. Even firestrike extreme doesn't push it this hard


So I finally had time tonight to remove the cooler and put some better thermal paste on the GPU. Removed whatever generic junk MSI was using (and a lot of it too, god damn lol) and replaced it with Thermal Grizzly Kryonaut. Just ran 3 back-to-back runs of the Rise of the Tomb Raider benchmark: GPU at 1.000 volts, running from 2025 to 2000 MHz at a max temp of 63C. I'm astonished at the drop from 80C to 63C just from changing the thermal paste; ambient temperature has not changed either, 72F. The crazy thing is I was running this benchmark at 3440x1440 max settings (though FXAA, not SSAA), while the previous runs where I was hitting 80C before the repaste were at 1440P.
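For scale (assuming the usual SSAA definition where each axis is doubled): going from 2560x1440 to a ~5120x2880 internal resolution is four times the pixels, which lines up with the temperature and power jump:

```python
# 2xSSAA that doubles both axes quadruples the shaded pixel count.
base = 2560 * 1440
ssaa = (2560 * 2) * (1440 * 2)
print(ssaa // base)  # -> 4
```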

While creating this post I decided to try 0.993 volts @ 2000MHz. The card ran locked at 2000MHz at 0.993 volts and hit 60C after 3 runs of the Rise of the Tomb Raider benchmark, a 4C drop from 1.000 volts.

Edit: Achieved a new high score in Firestrike Extreme:

GPU @ 2088 Mhz, Memory at 1514MHz

14857


----------



## KedarWolf

Quote:


> Originally Posted by *sajuuk111*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> After installing the BIOS and rebooting, run the powerlimit.bat as Admin.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> powerlimitarcticstormbios.zip 156k .zip file
> 
> 
> 
> 
> Thank you so much for what you are doing. Do i need to force .bat running in every restart?

Normally you don't, no, but after you do run it, reboot once and check. Some peeps have had to add the bat in task scheduler as Admin on boot.


----------



## DStealth

Just updated the CPU from [email protected] to [email protected]+GHz.
Running 24/7 settings, Physics jumped from 18k to 22k :drool: https://www.3dmark.com/3dm/21823998
These Skylake-X CPUs are impressive... they remind me of the first days of the Sandy Bridge era :thumb:... I thought benching at 5+GHz was a thing of the past.


----------



## Alexium

Quote:


> Originally Posted by *DStealth*
> 
> [email protected]+Ghz


Scalped?
I'm skeptical about Skylake-X. I don't own one (in no small part for this very reason), but I've read a ton of reviews / benchmarks. It often performs worse than Haswell-E. I think you would be better off with a 6900K. Although Haswell would probably stop around 4.8 GHz, not 5.0, so it's a moot point.


----------



## DStealth

Nope, with the IHS on... I was also skeptical, but after pushing it to the limits I was like wow.
Yesterday I played the whole day on it at 5050MHz 1.3v; not a single BSOD or game crash, in a 28-30C room with a 1080 Ti blowing hot air next to it... an impressive piece of silicon, I would say.
These are the settings with 1.3v. CPU-Z is not reading the Vcore voltage correctly, but rather the Input one.


----------



## Pandora's Box

@DStealth Be sure to raise your power limit (TDP) in the bios for the CPU. Default is 140Watts, you'll want to crank it to 400 or 450. Otherwise you'll throttle because of the power limit.


----------



## hteng

Hello, I've got an MSI 1080 Ti Gaming X. It's mainly used for gaming, but I want to use it for mining when I'm not using the PC. Currently when it's mining it draws about 280W from the wall; this includes the motherboard, CPU, HDD, fans, etc. I don't know how much the GPU itself is drawing, but it seems pretty high compared to what I read on the web. My mining hashrate is something like 31-32 MH/s, not so spectacular.

So how exactly do I undervolt/underclock/optimize power consumption? I'm using MSI Afterburner; do I follow what this page does for Nvidia cards? http://mylifegadgets.com/setup-ethmonitoring-miner-software/

Just set the power limit, temp limit and core clock and I'm done?
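One way to judge how far an undervolt is getting you is MH/s per watt at the GPU, not at the wall. A back-of-envelope estimate (the non-GPU system figure below is an assumption, not a measurement):

```python
# Rough mining-efficiency estimate from wall power. system_idle_w is a
# guess for the non-GPU draw (board, CPU, drives); swap in your own numbers.
wall_w = 280
system_idle_w = 70
gpu_w = wall_w - system_idle_w
hashrate_mhs = 31.5

print(round(hashrate_mhs / gpu_w, 3))  # ~0.15 MH/s per watt with these assumptions
```

Re-run the same calculation after dropping the power limit or voltage; if MH/s barely falls while GPU watts fall a lot, the ratio improves and the undervolt is paying off.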


----------



## KedarWolf

Quote:


> Originally Posted by *hteng*
> 
> hello, I've got a MSI 1080 TI Gaming X, it's mainly used for gaming but I want to use it for mining when im not using the PC. So currently when it's mining it draws about 280W from the wall, this includes the motherboard, cpu, HDD, fans..etc etc. I don't know how much the GPU itself is drawing but it seems to be pretty high compare to what i read on the web, my mining hash is something like 31-32 MH/s, not so spectacular.
> 
> So how exactly do I undervolt/underclock/optimize power consumption? I'm using MSI afterburner, do i follow what this page does for nvidia cards ? http://mylifegadgets.com/setup-ethmonitoring-miner-software/
> 
> just set the power limit, temp limit and core clock and I'm done?


http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20

Follow this and set voltages and clocks for what you're aiming for. I can do 2012 core, 6177 memory at .993v, but I have a decent card; some might not fare so well.


----------



## DStealth

Quote:


> Originally Posted by *Pandora's Box*
> 
> @DStealth Be sure to raise your power limit (TDP) in the bios for the CPU. Default is 140Watts, you'll want to crank it to 400 or 450. Otherwise you'll throttle because of the power limit.


All is set to 999








Judging from the results and thermal monitoring keeping it in the 60C area under load, there's zero throttling.


----------



## Pandora's Box

Quote:


> Originally Posted by *DStealth*
> 
> All is set to 999
> 
> 
> 
> 
> 
> 
> 
> 
> Judging from the results and thermal monitoring keeping it in 60* area load there's zero throttling


Nice, enjoy the CPU. Skylake-X is a beast...


----------



## TheBoom

Quote:


> Originally Posted by *coc_james*
> 
> I'm running my RM850I on sli ti's and a 5.1ghz 7700k. When it was in multi rail I had issues, after switching it to single rail, all is well. That all being said, you should consider initiating a conversation with Corsairand your GPU manufacturer. You most certainly should not be hitting OCP.


Yeah I've started a thread in the Corsair PSU forum. Hopefully someone will look into it.
Quote:


> Originally Posted by *Thoth420*
> 
> Another reason to never use Corsair Link. Added to list
> So much jank software these days which is why I run lean as possible. Blaming it on the type of rail is ridiculous. Glad you sorted it these cards are not cheap.


Quote:


> Originally Posted by *KedarWolf*
> 
> OCP may be enabled by default. If you need to install Corsair Link to check Google Corsair Link 3, Corsair Link 4 is terrible and don't work right.
> 
> I tried to install it and check.


I've been using it all this while (about 5 years) and never had any problems even with OCP on. It was after the recent update, 3 days ago I think, that this issue cropped up. They must have broken something in the update.

Nevertheless I'm running it with all OCPs off now, hopefully nothing happens before they fix it.


----------



## 7thNemesis

I've got a watercooled rig with an evga 1080 ti FE, heatkiller iv full cover waterblock. my inlet temps are either 19 degrees or 18 degrees (so far, and zero heat increase)
Anyways, I haven't tried any voltage unlocking and across all games, stress tests and benchmarking I have found that I can get 100% stable with 2101MHz GPU and 6156MHz memory. With my D5 running at its lowest speed of 1850 RPM, my peak sustained load temp is 29C and at 4800 RPM its 27C. Does this mean my "delta" is 8 - 10 degrees C? Is this any good, and is my card decent? I'm curious as this is my first foray into water cooling.
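For what it's worth, the delta people usually quote is GPU core temp minus coolant (inlet) temp, so with the numbers above:

```python
# Delta-T = GPU core temp minus coolant inlet temp (figures from the post above).
def delta_t(gpu_c, inlet_c):
    return gpu_c - inlet_c

print(delta_t(29, 19))  # 10 C at the D5's lowest speed
print(delta_t(27, 19))  # 8 C at 4800 RPM
```

An 8-10C delta on a full-cover block is generally considered healthy, so both the block mount and the card look fine.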


----------



## TheBoom

Quote:


> Originally Posted by *7thNemesis*
> 
> I've got a watercooled rig with an evga 1080 ti FE, heatkiller iv full cover waterblock. my inlet temps are either 19 degrees or 18 degrees (so far, and zero heat increase)
> Anyways, I haven't tried any voltage unlocking and across all games, stress tests and benchmarking I have found that I can get 100% stable with 2101MHz GPU and 6156MHz memory. With my D5 running at its lowest speed of 1850 RPM, my peak sustained load temp is 29C and at 4800 RPM its 27C. Does this mean my "delta" is 8 - 10 degrees C? Is this any good, and is my card decent? I'm curious as this is my first foray into water cooling.


That's really good. With those clocks you may be able to undervolt and get even more efficient performance to heat produced.


----------



## weskeh

Guys, is it possible that my PSU is starting to show its age (wattage)?

I seem to be the only one with a 1080 Ti that doesn't get his memory above +500MHz. +400 is fine though, and the core is happy above 2GHz as well; not sure how high, but 2050MHz has no issues.

My PSU is a Seasonic M12 620W

CPU [email protected] 4.8GHz + NZXT X61

4x4GB DDR3 2133MHz

No hardlocks or reboots.

Can bench ROTTR @ +500 mem OC, but it crashes in Time Spy.

I think the PSU is fine and that I have ****ty GDDR5X on my card..


----------



## Rainmaker91

Quote:


> Originally Posted by *weskeh*
> 
> Guys is it possible that my psu is starting to show is age(wattage)
> 
> I seem to be the only one with a 1080ti that doesnt get his memory above +500mhz. 400+ is fine though and the core is happy above 2ghz aswell. Not sure how high but 2050mhz has no issues.
> 
> My psu is seasonic m12 620w
> 
> Cpu [email protected] 4.8ghz + nzxt x61
> 
> 4x4gb ddr3 2133mhz
> 
> No hardlocks or reboots.
> 
> Can bench ROTTR @+500mem oc but crashes with timespy.
> 
> I think the psu is fine and that i have ****ty ddr5x on my card..


Considering the amount of power this card draws when overclocked, I wouldn't be surprised if you have some issues running a 620W PSU. For the most part it should be fine, but that CPU @ 4.8GHz is drawing a lot of power at the same time as the overclocked graphics card. Then add in anything else drawing power in the computer and you may very well be pulling 600W+ with that setup. I could be wrong, but I'd still say you don't have a lot of wiggle room with that PSU.
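A quick budget with assumed (worst-case-ish) figures shows why the headroom is thin:

```python
# All figures are rough assumptions for illustration, not measurements.
gpu_w = 300    # an overclocked 1080 Ti can spike this high
cpu_w = 150    # quad-core at 4.8 GHz under heavy load
rest_w = 60    # board, RAM, drives, fans, AIO pump
total_w = gpu_w + cpu_w + rest_w

print(total_w, 620 - total_w)  # 510 W used, ~110 W headroom on a 620 W unit
```

~110W of headroom sounds OK on paper, but transient spikes from the GPU can eat most of it, which is exactly when an aging unit trips.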


----------



## weskeh

Quote:


> Originally Posted by *Rainmaker91*
> 
> Cinsidering the amount of power this card draws when overclocked I wouldn't be surprised if you have some issues with running a 620w PSU. For the most part it should be fine, but that CPU @ 4.8 Ghz is drawing a lot of power at the same time as that graphics card is drawing a lot of power when overclocked. Then add in any other stuff that is drawing power in the computer and you may very well be drawing 600w+ with that setup. I could be wrong, but I'd still say that you don't really have a lot of wiggle room with that PSU.


Yeah, of course there is not much juice left on that PSU. I was just wondering if it could be holding any stable OCs back... guess it's very possible. Don't wanna throw money out the window either, so I might or might not look for another one.







the OC twitch is strong though lol


----------



## KedarWolf

Quote:


> Originally Posted by *7thNemesis*
> 
> I've got a watercooled rig with an evga 1080 ti FE, heatkiller iv full cover waterblock. my inlet temps are either 19 degrees or 18 degrees (so far, and zero heat increase)
> Anyways, I haven't tried any voltage unlocking and across all games, stress tests and benchmarking I have found that I can get 100% stable with 2101MHz GPU and 6156MHz memory. With my D5 running at its lowest speed of 1850 RPM, my peak sustained load temp is 29C and at 4800 RPM its 27C. Does this mean my "delta" is 8 - 10 degrees C? Is this any good, and is my card decent? I'm curious as this is my first foray into water cooling.


That is quite good, yes. Try running Fire Strike Ultra 4K stress test and see what it ramps up to before the test finishes.


----------



## Thoth420

Quote:


> Originally Posted by *TheBoom*
> 
> Yeah I've started a thread in the Corsair PSU forum. Hopefully someone will look into it.
> 
> I've been using it all this while (about 5 years) and never had any problems even with OCP on. It was after the recent update, 3 days ago I think, that this issue cropped up. They must have broken something in the update.
> 
> Nevertheless I'm running it with all OCPs off now, hopefully nothing happens before they fix it.


Just curious as to why you would risk it for a crappy piece of optional software. Do you really need it? I tend to take about ten minutes tops to configure anything that thing can do manually in BIOS etc and I am far from a super advanced user.


----------



## TheBoom

Quote:


> Originally Posted by *weskeh*
> 
> Guys is it possible that my psu is starting to show is age(wattage)
> 
> I seem to be the only one with a 1080ti that doesnt get his memory above +500mhz. 400+ is fine though and the core is happy above 2ghz aswell. Not sure how high but 2050mhz has no issues.
> 
> My psu is seasonic m12 620w
> 
> Cpu [email protected] 4.8ghz + nzxt x61
> 
> 4x4gb ddr3 2133mhz
> 
> No hardlocks or reboots.
> 
> Can bench ROTTR @+500mem oc but crashes with timespy.
> 
> I think the psu is fine and that i have ****ty ddr5x on my card..


Mine does +350 on the stock BIOS and +400 on XOC. So yeah, you're not alone. The problem with memory OC is you just need 1 out of 11 chips to be the weakest link, and those odds usually aren't great.

Quote:


> Originally Posted by *Thoth420*
> 
> Just curious as to why you would risk it for a crappy piece of optional software. Do you really need it? I tend to take about ten minutes tops to configure anything that thing can do manually in BIOS etc and I am far from a super advanced user.


Like I said I never had any issues with it thus far so I wasn't aware or under the impression that it was crappy software. I only use it for my H115i, never messed around with the PSU side of things but it's automatically included. The power usage measurement isn't too bad either although it isn't the most accurate.

And yeah I need it to set the fan curve for the fans on the radiator and set the LED colour on the water block itself. If you have some other third party software that can do that then please do recommend.


----------



## Sgang

Has anyone seen the Buildzoid video on the AMP Extreme edition who can explain in simple words what it means? Did we buy the worst card?

Sent from my iPhone using Tapatalk


----------



## Dasboogieman

Quote:


> Originally Posted by *Sgang*
> 
> Someone seen the buldzoid video on the amp extreme edition and can explain in simple words What does it means? We bought the worst card?
> 
> Sent from my iPhone using Tapatalk


It means that the VRM assembly on the AMP Extreme is very inefficient despite its admirable proportions. It outputs a lot of waste heat and takes up a lot of space (and has a potentially large number of parts that can fail, causing reliability issues).

It has 2 direct implications:
1. A lower overclock/TDP ceiling on Zotac cards, because the VRM can potentially overheat.
2. A lot of waste heat being dumped into the water in watercooled loops.

Actually, Buildzoid confirms what I've been suspecting for a while now: the TDP set by the manufacturer in the BIOS has a direct correlation to how efficient the VRM assembly is. Because NVIDIA measures the power consumed by the card before VRM switching losses and then correlates it to the normalized TDP (which measures power after VRM losses), each card has a different max TDP, with ultra-efficient ones like the FE and ASUS having relatively lower ceilings while ones like Zotac have ludicrous 432W ceilings.

Theoretically speaking, a card with an efficient VRM and a high (possibly cross-flashed) TDP will power throttle much less.
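The before/after-VRM distinction can be sketched numerically (the efficiency figures here are assumptions for illustration, not Buildzoid's measurements):

```python
# If the chip needs a given post-VRM power, the card's input-side draw is
# chip power divided by VRM efficiency. Efficiencies below are assumed.
def input_power_w(chip_w, vrm_eff):
    return chip_w / vrm_eff

print(round(input_power_w(250, 0.92), 1))  # 271.7 W at the input for an efficient VRM
print(round(input_power_w(250, 0.80), 1))  # 312.5 W for an inefficient one
```

So for the same power actually delivered to the GPU, the less efficient board needs a ~40W higher input-side ceiling, which is one plausible reading of why Zotac's BIOS limit is set so high.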


----------



## taem

Quote:


> Originally Posted by *Sgang*
> 
> Someone seen the buldzoid video on the amp extreme edition and can explain in simple words What does it means? We bought the worst card?
> 
> Sent from my iPhone using Tapatalk


He didn't say that at all. He said it's basically the same as other 1080 Tis with beefy coolers, but more expensive, and therefore falls behind some of the other cards in price/performance. The only real flaw he noted is that the thermal pads are not optimally applied.

It looks like a nice card for the higher max power and the 5-year warranty, which no other 1080 Ti has. I'm getting the Aorus for the 4-year warranty; it's a lot cheaper than the AMP Extreme.


----------



## smushroomed

I have a 1080ti fe with an aftermarket AIO cooler, what should I expect for a decent stable overclock?

What should I set my core voltage and power limit at?

+200 core and +400 mem?


----------



## Dasboogieman

Quote:


> Originally Posted by *smushroomed*
> 
> I have a 1080ti fe with an aftermarket AIO cooler, what should I expect for a decent stable overclock?
> 
> What should I set my core voltage and power limit at?
> 
> +200 core and +400 mem?"


Try for 2000MHz, +400 mem @ 1.06V. Start by finding your max clocks at that voltage, then increase the voltage and see if you can go higher.


----------



## KCDC

Anyone know why the AB beta 16 link is gone? Is it unstable or something?


----------



## Thoth420

Quote:


> Originally Posted by *TheBoom*
> 
> Like I said I never had any issues with it thus far so I wasn't aware or under the impression that it was crappy software. I only use it for my H115i, never messed around with the PSU side of things but it's automatically included. The power usage measurement isn't too bad either although it isn't the most accurate.
> 
> And yeah I need it to set the fan curve for the fans on the radiator and set the LED colour on the water block itself. If you have some other third party software that can do that then please do recommend.


OK, but why would you risk turning OCP off and just crossing your fingers? It seems imprudent: while you haven't had any issues before, CL has been reported to cause all types of issues, and frankly now you have had an issue with it. Reporting it is great and all, but expecting a fast fix is not realistic based on their history. I love Corsair hardware, but whoever they pay to write that software is a failure. Just my two cents, but I wouldn't ever risk a system that nice over some software when there are always other options in those regards.

Fans can be done in your mobo BIOS; you may need to move the connection to one of the headers as opposed to how it's set up now, but the BIOS is likely to never fail you (fan profiling is another thing CL has failed people on). The color of an LED is hardly worth risking your system over, at least to me, but that afaik has no other solution... I could be wrong.


----------



## 8051

Is it possible to underclock the memory using Precision X or Afterburner? Would this have the effect of reducing the power used by the card?


----------



## Dasboogieman

Quote:


> Originally Posted by *8051*
> 
> Is it possible to underclock the memory using Precision X or Afterburner? Would this have the effect of reducing the power used by the card?


Yes and it would have a minimal impact on the power.


----------



## MrSkim

Anyone know how long the heatsinks fin on the SC2 sticks out from the plate?

http://hexus.net/media/uploaded/2017/5/cf01870b-8a58-47cf-8423-62e424fb78b4.jpg


----------



## BoredErica

> Originally Posted by *Dasboogieman*
> 
> Actually, Buildzoid confirms what I've been suspecting for a while now. The TDP set by the manufacturers in the BIOS has a direct correlation to how efficient the VRM assembly is. Because NVIDIA measures the power consumed by the card before VRM switching losses then correlates it to the normalized TDP (which measures power after VRM losses), each card has a different main max TDP (with ultra efficient ones like FE and ASUS having relatively lower ceilings while ones like Zotac have ludicrous 432W ceilings).
> 
> Theoretically speaking, a card with an efficient VRM and a high (possibly cross-flashed) TDP will power throttle much less.


I thought XOC bios essentially has no limit?



> Originally Posted by *ZealotKi11er*
> 
> The lower the volts the higher the clocks will say without hitting TDP so you score higher.


In other words, never, ever increase voltage slider in Afterburner? It doesn't seem to do anything (the slider).


----------



## Dasboogieman

Quote:


> Originally Posted by *Darkwizzie*
> 
> I thought XOC bios essentially has no limit?
> In other words, never, ever increase voltage slider in Afterburner? It doesn't seem to do anything (the slider).


The slider in MSI Afterburner unlocks the 1.075, 1.081 and 1.093 voltage bins, which are inaccessible otherwise. You still have to use NVSMI and voltage points to force the card to run at the higher voltages, plus a suitable BIOS to allow the extra TDP.

The XOC BIOS doesn't seem to remove the intrinsic temperature throttling that occurs internally within the chip. The GPU still dynamically adjusts its internal operation; e.g. a GPU with XOC at 60C will score about 5% less than a GPU with XOC at 30C.

Hell, Buildzoid reports that the GPU can still monitor power consumption (the normalized TDP) even if you completely fabricate the power readings that go into the shunt monitor chip.
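The temperature behavior described above can be sketched as a toy model (the bin size and temperature step are rough community observations for Pascal, not an official spec):

```python
# Toy model of Pascal's temperature-dependent boost: assume roughly one
# 13 MHz bin is dropped per ~5 C above ~35 C. Numbers are illustrative.
def boost_at_temp(cold_clock_mhz, temp_c, start_c=35, step_c=5, bin_mhz=13):
    bins_dropped = max(0, (temp_c - start_c) // step_c)
    return cold_clock_mhz - bins_dropped * bin_mhz

print(boost_at_temp(2100, 30))  # 2100 -> no bins dropped when cold
print(boost_at_temp(2100, 60))  # 2035 -> five bins (~3%) lower when warm
```

This is why the same XOC-flashed card scores noticeably better at 30C than at 60C even with power limits out of the picture.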


----------



## Astral85

Hi 1080 Ti owners,

This is intended for EVGA 1080 Ti FTW3 owners. I am wondering if anyone else is experiencing high-pitched hums from their 1080 Ti. The PWR and MEM fans on my FTW3 both make a high-pitched hum in the 60-70% rpm range, literally like a loud vacuuming noise. It also hums at 30%, though less audibly, and the MEM fan is worst affected at all speeds. Another owner was able to capture the noise quite clearly in a video; interestingly, only the GPU fan is affected by the hum on their card.

I have already RMA'd my card, and received another 1080 Ti FTW3 with exactly the same issue. EVGA believe it is a resonance issue with my case, while I and other people I have spoken to with the same issue think it is a design flaw in the card. I'm just interested to hear from other owners about your experience, whether you are affected, etc., to soothe my frustration.


----------



## becks

Proud owner of a Gigabyte Aorus GTX 1080 Ti Waterforce WB Xtreme Edition starting today.
Had to change some PC components, so I am back to square one when it comes to CPU/RAM OC + GPU now.
Out of the box it does 1780-1820 (or something of that nature) and shoots up to 1970 in the OC profile for a short while. Temp never went over 40C (2x 240 rads in pull, cooling GPU + CPU) in short gaming sessions (Destiny 2, WoW, HotS, WoT).

Tried to up the voltage and power to max and set the core to 2005; ended up in an endless loop of crashes after OS load, as the profile was being applied at startup.
Will post some pictures later and will follow with more info in the upcoming weeks. My time is so limited now that I will be lucky if I get 2h/day in front of the PC, so expect OC-ing everything to take 2 weeks at least.

Are you guys using the free or paid versions of the benchmarks (Superposition, Valley)? Which one is best for determining OC stability and such?
I am coming from a GTX 260, so I have no clue what I am doing; any advice appreciated.

Could those crashes I mentioned earlier (after the OS load screen it freezes and a couple of seconds later the PC restarts) be because of the PSU? It's only 650W Gold rated (XFX XTR 650). The GPU has some LEDs on the power connector which, to my knowledge, should light up if power is insufficient... but that never happened.

EDIT:

Still waiting for confirmation, but I think the block on this card is EK-made.
Managed to change the in/out port with one of these: 
Struggled, but managed to do it and it works great...

The GPU has no warranty sticker, but some elements are fiddly: for example, the 3rd screw holding the in/out port (the middle one) is all transparent plastic, so it is close to impossible to remove without snapping it... luckily I managed it.
Also, the GPU came with a big syringe of thermal paste (labeled Aorus, so not really sure about quality), so I guess Gigabyte is OK with re-doing the TIM.

EDIT 2:

WB on this card is not EK....just got confirmation.


----------



## 8051

Quote:


> Originally Posted by *Dasboogieman*
> 
> Yes and it would have a minimal impact on the power.


Would slower memory reduce temps on an air cooled card?


----------



## Sgang

Hi guys, I've a strange request. Can someone help me verify the OC on my card by connecting via TeamViewer?

I seem to have strange behavior:
1) My card is an AMP Extreme Core edition (base clock 1607). I installed the AMP Extreme BIOS (which should be 1645 base clock), but I always see 1607 as the base clock when checking in 3DMark.

2) The best benchmark score for the card is 10347/10317 with +350MHz core, +155 memory (no extra voltage).
If I clock to +380/+165 it's stable, but I get a worse score.

Is this type of OC normal? Or do I have a bad sample?

Sent from my iPhone using Tapatalk


----------



## TheBoom

Quote:


> Originally Posted by *taem*
> 
> He didn't say that at all. He said it's basically the same as other 1080 ti's with beefy coolers, but is more expensive, and therefore falls behind some of the other cards in price/performance. The only real flaw he noted is that thermal pads are not optimally applied.
> 
> It looks like a nice card for the higher max power and 5 year warranty which no other 1080 ti has. I'm getting the Aorus for the 4 year warranty, it's a lot cheaper than Amp Extreme.


Guess it depends on where you live too. The AMP Extreme is the same price as the Aorus here and 100 bucks cheaper than the Strix; only the MSI Gaming X is cheaper here.

Quote:


> Originally Posted by *Thoth420*
> 
> OK but why would you risk turning OCP off and just crossing your fingers? Seems not prudent and while you haven't had any issues CL has been reported to cause all types of various issues and frankly now you have had an issue with it. Reporting it is great and all but expecting a fast fix is well not realistic at all based on their history. I love corsair hardware but whoever they pay to write that software is a failure. Just my two cents but I wouldn't ever risk a system that nice over some software there are always other options to choose in those regards.
> 
> Fans can be done on your mobo BIOS you may need to move the connection to one of the headers as opposed to how it set up now but the BIOS is likely to never fail you(that is another thing CL has failed people with ...fan profiling). The color of an LED is hardly worth risking your system over at least to me but that afaik has no other solution but I could be wrong.


You ask me why I use the software and at the same time why I turn off OCP? How would I turn it on or off without the software? The OCP is a digital one and cannot be changed without the software, if I'm not wrong.

It's just easier to set up the fans to respond to the loop and CPU temps in CL than to do it separately in the BIOS. Also, I don't use any of the motherboard fan headers or the Asus fan program, whatever it's called.

The CL software itself hasn't caused any other issues apart from the multi rail OCP being triggered after the last update. Reported to cause all types of issues, but not for my system? I could be wrong and it could be the GPU pulling too much current after all.

Will you stop playing a game you like cause many others have bugs with the game even if you have none? Don't need to blow this out of proportion mate. If I really hated the software or it was really that bad then I wouldn't bother buying a AX860i or H115i in the first place.

Heck, the CUE software has tons more bugs than CL.


----------



## becks

This is what I got "out of the box" (default values):

The water temp sensor broke, so the fans were all at the minimum 500 RPM.
2x 240 rads, both pull (1 in top, EK SE 30mm thick; 1 in front, EK XE 60mm thick)
CPU at 5.2 @ 1.41V... RAM at 3733 16-16-28-1T 1.395V
650W XFX XTR Gold PSU

The card stayed at 2025 for the first run.
By the 3rd run the card went up to 44C and dropped one clock bin to 2012, all of this at defaults and, as I said previously, with inadequate cooling, since the fans are supposed to ramp up and down according to water temp but the sensor broke.

















A beast of a GPU...
Has a slight coil whine... not the usual high-pitched noise but rather a rattle, like a fan full of dust... but there is no fan on it.









Here is pic of system....still ongoing construction, most of the tubes will be replaced just run out of acrylic at the moment

















The little black piece in the front is a card support I made of some scrap plastic I had around







it sits under the GPU end so it keeps it straight and nice.


----------



## 8051

I've read that the Pascal series doesn't use shims around the GPUs? Can this make it more dangerous to R&R the heatsink assembly (i.e. because of cracked cores)?


----------



## KraxKill

Overclocking may be DEAD! *Try undervolting.*

Core @ 1760MHz - GDDR5x @ 6125MHz @ *0.800v*


Core @ 1936MHz - GDDR5x @ 6125MHz @ *0.900v*


Core @ 2012MHz - GDDR5x @ 6125MHz @ *1.000v*


Core @ 2100MHz - GDDR5x @ 6125MHz @ *1.063v*
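
If you want a rough sense of why those undervolt points are attractive, a common first-order approximation is that dynamic power scales with frequency times voltage squared. This is a generic CMOS rule of thumb, not anything NVIDIA publishes (real cards also have static leakage and memory power), so treat the numbers as ballpark only:

```python
# Rough estimate of how much power each undervolt point draws relative
# to the 2100 MHz / 1.063 V point, assuming dynamic power ~ f * V^2
# (a first-order CMOS approximation; ignores leakage and memory power).

points = {
    "1760 MHz @ 0.800 V": (1760, 0.800),
    "1936 MHz @ 0.900 V": (1936, 0.900),
    "2012 MHz @ 1.000 V": (2012, 1.000),
    "2100 MHz @ 1.063 V": (2100, 1.063),
}

ref_f, ref_v = points["2100 MHz @ 1.063 V"]

def relative_power(f, v):
    """Estimated dynamic power relative to the 2100 MHz / 1.063 V point."""
    return (f * v**2) / (ref_f * ref_v**2)

for label, (f, v) in points.items():
    print(f"{label}: ~{relative_power(f, v):.0%} of reference power")
```

By this estimate the 1760 MHz / 0.800 V point draws under half the power of the 2100 MHz / 1.063 V point while keeping about 84% of the clock speed, which is why undervolted cards so rarely hit the power limit.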


----------



## ZealotKi11er

Quote:


> Originally Posted by *KraxKill*
> 
> Overclocking may be DEAD! *Try undervolting.*
> 
> Core @ 1760MHz - GDDR5x @ 6125MHz @ *0.800v*
> 
> 
> Core @ 1936MHz - GDDR5x @ 6125MHz @ *0.900v*
> 
> 
> Core @ 2012MHz - GDDR5x @ 6125MHz @ *1.000v*
> 
> 
> Core @ 2100MHz - GDDR5x @ 6125MHz @ *1.063v*


I ran my 1080 Ti at 1860 MHz / 0.9 V; I think it hit ~86% power. That way I never hit the power limit. I also run 2025 / 1 V, but at that it hits the limit from time to time.


----------



## KraxKill

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I ran my 1080 Ti at 1860 MHz / 0.9 V; I think it hit ~86% power. That way I never hit the power limit. I also run 2025 / 1 V, but at that it hits the limit from time to time.


And people always ask how to turn off boost 3.0.....HA!


----------



## 8051

Two questions:

1. Does 1080Ti performance scale w/memory speeds and if so by how much?

2. Does the power drawn by the fans on an air-cooled board count against TDP limits?


----------



## ZealotKi11er

Quote:


> Originally Posted by *8051*
> 
> Two questions:
> 
> 1. Does 1080Ti performance scale w/memory speeds and if so by how much?
> 
> 2. Does the power drawn by the fans on an air-cooled board count against TDP limits?


Basically, with every card, if you OC the core you need to OC the memory to keep the scaling linear. Overclocking just the memory does maybe 2-3%.

As for the power draw, I'm trying to figure that out too. I think it probably does count.
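
The "scale memory with core" rule of thumb above can be sketched as keeping the stock memory-to-core clock ratio. The reference clocks below are the 1080 Ti FE values (boost 1582 MHz, GDDR5X command clock 1376 MHz); the matching rule itself is just this thread's heuristic, not a vendor spec:

```python
# Given a core overclock, suggest a memory clock that preserves the stock
# memory-to-core clock ratio (heuristic from the discussion above, not a
# vendor rule). Clocks in MHz.
STOCK_CORE = 1582   # GTX 1080 Ti FE reference boost clock
STOCK_MEM = 1376    # GDDR5X command clock; tools often report 2x or 4x this

def matching_mem_clock(core_clock):
    """Memory clock that keeps the stock mem:core ratio for a given core OC."""
    return STOCK_MEM * core_clock / STOCK_CORE

print(round(matching_mem_clock(2000)))  # ~1740 MHz for a 2000 MHz core OC
```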


----------



## 8051

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Basically, with every card, if you OC the core you need to OC the memory to keep the scaling linear. Overclocking just the memory does maybe 2-3%.
> 
> As for the power draw, I'm trying to figure that out too. I think it probably does count.


I experimented a little with memory speeds on my 980 Ti; it seemed like it never made any difference in FPS from 1518 MHz all the way up to 2000 MHz, but my 980 Ti core couldn't overclock anywhere near 1500 MHz.


----------



## ZealotKi11er

Quote:


> Originally Posted by *8051*
> 
> I experimented a little with memory speeds on my 980 Ti; it seemed like it never made any difference in FPS from 1518 MHz all the way up to 2000 MHz, but my 980 Ti core couldn't overclock anywhere near 1500 MHz.


The 1080 Ti has 11 Gbps memory because it needs it.


----------



## taem

Do any of the 1080 Tis have VRM temp sensors that display in standard apps like GPU-Z?


----------



## josephimports

Quote:


> Originally Posted by *taem*
> 
> Do any of the 1080 tis have vrm temp sensors that display in standard apps like gpu z?


Yes, the EVGA FTW3.


Spoiler: Warning: Spoiler!


----------



## 8051

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The 1080 Ti has 11 Gbps memory because it needs it.


Aside from reducing latency, what's the point of a memory controller and memory IC's running significantly faster than the core clock? Wouldn't it just mean the memory controller is getting its data faster than the SM's (Streaming Multiprocessor) can use it? But I suppose it's better to have the memory controller waiting on the SM's than vice-versa.


----------



## Dasboogieman

Quote:


> Originally Posted by *8051*
> 
> Aside from reducing latency, what's the point of a memory controller and memory IC's running significantly faster than the core clock? Wouldn't it just mean the memory controller is getting its data faster than the SM's (Streaming Multiprocessor) can use it? But I suppose it's better to have the memory controller waiting on the SM's than vice-versa.


Because bandwidth requirements are not 1:1 proportional to IMC clockspeed.
The IMC receives the raw data from the VRAM, collates it, then pushes it to the L2 cache (which runs at core clockspeed), which then feeds the cores. Very rarely does data flow directly to the cores.

Let's not forget GDDR5X doesn't actually run at 11 Gbps as in 11 GHz. It's a QDR design: it transfers data on both the rising and falling edges of two synchronised clocks, so the actual command clockspeed is only something like 1.375 GHz (for 11 Gbps).
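
To sanity-check those numbers, here's the arithmetic for the 1080 Ti's memory subsystem (the 352-bit bus is the card's published spec; the clock derivation follows the QDR scheme described above):

```python
# Back-of-envelope check of the GDDR5X numbers above: 11 Gbps per pin on
# the 1080 Ti's 352-bit bus, and the clocks behind that data rate
# (QDR: 4 transfers per WCK cycle, WCK running at twice the CK command clock).
data_rate_gbps = 11.0      # per-pin data rate
bus_width_bits = 352       # GTX 1080 Ti memory bus width

bandwidth_gbs = data_rate_gbps * bus_width_bits / 8   # bits -> bytes
wck_ghz = data_rate_gbps / 4    # 4 transfers per WCK cycle -> 2.75 GHz
ck_ghz = wck_ghz / 2            # command clock -> 1.375 GHz

print(f"Peak bandwidth: {bandwidth_gbs:.0f} GB/s")  # 484 GB/s
print(f"WCK: {wck_ghz} GHz, CK: {ck_ghz} GHz")
```

So the 484 GB/s headline figure and the ~1.375 GHz command clock both fall out of the same 11 Gbps per-pin rate.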


----------



## ZealotKi11er

Quote:


> Originally Posted by *josephimports*
> 
> Yes, the EVGA FTW3.
> 
> 
> Spoiler: Warning: Spoiler!


I'm getting an FTW3 next time around for sure. I don't feel safe with the FE card not reporting VRM1 temps.


----------



## 8051

Quote:


> Originally Posted by *Dasboogieman*
> 
> Because bandwidth requirements are not 1:1 proportional to IMC clockspeed.
> The IMC receives the raw data from the VRAM, collates it, then pushes it to the L2 cache (which runs at core clockspeed), which then feeds the cores. Very rarely does data flow directly to the cores.
> 
> Let's not forget GDDR5X doesn't actually run at 11 Gbps as in 11 GHz. It's a QDR design: it transfers data on both the rising and falling edges of two synchronised clocks, so the actual command clockspeed is only something like 1.375 GHz (for 11 Gbps).


Wouldn't it have to be running at 2750 MHz to get to an effective 11 GHz?

I had thought that while GPU memory bandwidth made a difference in CUDA applications it didn't make much difference in gaming.


----------



## ZealotKi11er

Quote:


> Originally Posted by *8051*
> 
> Wouldn't it have to be running at 2750 MHz to get to an effective 11 GHz?
> 
> I had thought that while GPU memory bandwidth made a difference in CUDA applications it didn't make much difference in gaming.


The GPU processes data, and the memory is there to move data in and out very fast: it holds all the textures and frame info. Tools report G5X at 5.5 GHz on the 1080 Ti, which works out to an 11 Gbps data rate.


----------



## Alexium

Can someone please confirm that MSI Armor has the same PCB as Gaming / Gaming X, and is thus compatible with Gaming X fullcover waterblocks? I remember reading it somewhere, but can't find proof.

These are the best shots I could find, and it does appear to be the same: Armor / Gaming X


----------



## Rollergold

Quote:


> Originally Posted by *Alexium*
> 
> Can someone please confirm that MSI Armor has the same PCB as Gaming / Gaming X, and is thus compatible with Gaming X fullcover waterblocks? I remember reading it somewhere, but can't find proof.
> 
> These are the best shots I could find, and it does appear to be the same: Armor / Gaming X


I think it was Gamers Nexus that said the Armor was like the Gaming X.


----------



## Dasboogieman

Quote:


> Originally Posted by *8051*
> 
> Wouldn't it have to be running at 2750 MHz to get to an effective 11 GHz?
> 
> I had thought that while GPU memory bandwidth made a difference in CUDA applications it didn't make much difference in gaming.


Anandtech explains how GDDR5x works better than I can.

http://www.anandtech.com/show/9883/gddr5x-standard-jedec-new-gpu-memory-14-gbps

VRAM bandwidth plays a HUGE role in gaming performance. If you want a case example, look no further than the NVIDIA GT 640 vs the GTX 650. Identical cores, similar clocks, and the GTX 650 is leagues ahead only because it uses GDDR5 vs the DDR3 on the GT 640.

All modern GPU architectures (or should I say all GPUs) to date are inherently VRAM-bandwidth starved to some degree. The reason is that it is uneconomical for NVIDIA/AMD to design a card that has excess VRAM bandwidth relative to core power, due to power and monetary factors.

The performance gain in gaming from having more VRAM bandwidth is a curve; there is a point where you are literally doubling VRAM bandwidth only to eke out 1-2% gains. It is optimal for NVIDIA/AMD to hit a sweet spot where the bandwidth can be delivered as cheaply (in terms of power and monetary cost) and as easily as possible.


----------



## 8051

Quote:


> Originally Posted by *Dasboogieman*
> 
> Anandtech explains how GDDR5x works better than I can.
> 
> http://www.anandtech.com/show/9883/gddr5x-standard-jedec-new-gpu-memory-14-gbps
> 
> VRAM bandwidth plays a HUGE role in gaming performance. If you want a case example, look no further than the NVIDIA GT 640 vs the GTX 650. Identical cores, similar clocks, and the GTX 650 is leagues ahead only because it uses GDDR5 vs the DDR3 on the GT 640.
> 
> All modern GPU architectures (or should I say all GPUs) to date are inherently VRAM-bandwidth starved to some degree. The reason is that it is uneconomical for NVIDIA/AMD to design a card that has excess VRAM bandwidth relative to core power, due to power and monetary factors.
> 
> The performance gain in gaming from having more VRAM bandwidth is a curve; there is a point where you are literally doubling VRAM bandwidth only to eke out 1-2% gains. It is optimal for NVIDIA/AMD to hit a sweet spot where the bandwidth can be delivered as cheaply (in terms of power and monetary cost) and as easily as possible.


As I wrote before, I didn't see any difference in gaming benchmarks when running the memory on my 980 Ti at 1508 MHz vs. 2000 MHz. Aside from latency considerations, if the GPU core and cache are running slower than the memory controller, what's the point in having faster VRAM? That just means the memory controller will be waiting longer for the SMs to pick up the data they requested.

Does anyone have any benchmarks showing a performance delta when overclocking VRAM on a given video card (i.e. not between different video cards)?


----------



## Pandora's Box

Pretty sure that when you overclock the RAM it's just adjusting to looser timings to compensate, and there's also error correction kicking in. Pascal really takes the fun out of overclocking. No real-world gains past what an aftermarket card boosts to.


----------



## MaKeN

My card runs at +730 on memory, which is 63xx MHz, and the core runs at 2025... does that mean the memory is set too high?

Also guys, what's the best stress test to use for a GPU?


----------



## coc_james

Quote:


> Originally Posted by *MaKeN*
> 
> My card runs at +730 on memory, which is 63xx MHz, and the core runs at 2025... does that mean the memory is set too high?
> 
> Also guys, what's the best stress test to use for a GPU?


There are some differing opinions here. The first says to test your overclock with benchmarks like SuPo 4K and Firestrike Ultra, then run your preferred games; if you don't get artifacts or driver crashes, and your benchmark scores don't start to drop, then you're good. The second is to run FurMark and check for errors and temps.


----------



## Dasboogieman

Quote:


> Originally Posted by *8051*
> 
> As I wrote before I didn't see any difference in gaming benchmarks when running the memory on my 980Ti at 1508 MHz. vs. 2000 MHz. Aside from latency considerations, If the GPU core and cache are running slower than the memory controller what's the point in having faster VRAM? That just means the memory controller will be waiting longer for the SM's to pick up the data they requested.
> 
> Does anyone have any benchmarks showing a performance delta when overclocking VRAM on a given video card (i.e. not between different video cards)?


You are probably not seeing a difference due to VRAM straps. The VRAM array is using auto-programmed looser timings to compensate for the higher clockspeed so you are not actually increasing your bandwidth all that much.
If you check out the Hawaii forums, those guys see massive gains from overclocking VRAM because they can mod the straps in the BIOS.


----------



## Streetdragon

Quote:


> Originally Posted by *coc_james*
> 
> There are some differing opinions here. The first says to test your overclock with benchmarks like SuPo 4K and Firestrike Ultra, then run your preferred games; if you don't get artifacts or driver crashes, and your benchmark scores don't start to drop, then you're good. The second is to run FurMark and check for errors and temps.


Don't use FurMark... guys, please stop using it. You wouldn't drive your car at max engine speed (in the red zone).

To find the sweet spot for VRAM speed I used the Heaven benchmark and paused it (free walk) in windowed mode.

Then the FPS sits at a constant number. Then open Afterburner (or whatever) and move the memory-speed slider.
I had FPS gains up to +500 on the memory; after that the FPS didn't increase. At +600 the FPS started to drop, along with a little driver crash -> PC reboot.

So I run +500 on memory.

It's an easy way to find the sweet spot.

If you bought Superposition you can even do it with a better benchmark.

I think you can do the same with any game in windowed mode.
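
If you'd rather watch numbers than eyeball the FPS counter while you move the slider, a small script can log clocks and power alongside. This is a sketch assuming `nvidia-smi` is on your PATH; the query fields used are standard `nvidia-smi --query-gpu` fields:

```python
# Log GPU clocks, power and temperature once a second while you adjust the
# memory-speed slider in Afterburner. Assumes nvidia-smi is installed and
# on PATH (it ships with the NVIDIA driver).
import subprocess
import time

FIELDS = ["clocks.gr", "clocks.mem", "power.draw", "temperature.gpu"]

def parse_sample(line):
    """Parse one CSV line produced with --format=csv,noheader,nounits."""
    values = [v.strip() for v in line.split(",")]
    return dict(zip(FIELDS, values))

def sample_gpu():
    """Query the first GPU once and return the fields as a dict."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=" + ",".join(FIELDS),
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_sample(out.splitlines()[0])

def log_samples(count, interval_s=1.0):
    """Print `count` samples, one per `interval_s` seconds."""
    for _ in range(count):
        print(sample_gpu())
        time.sleep(interval_s)
```

Call `log_samples(60)` in a terminal while the benchmark sits paused in windowed mode; a clock that reads lower than what you set, or power pinned at the limit, shows up immediately in the log.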


----------



## coc_james

Quote:


> Originally Posted by *Streetdragon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *coc_james*
> 
> There are some differing opinions here. The first says to test your overclock with benchmarks like SuPo 4K and Firestrike Ultra, then run your preferred games; if you don't get artifacts or driver crashes, and your benchmark scores don't start to drop, then you're good. The second is to run FurMark and check for errors and temps.
> 
> 
> 
> Don't use FurMark... guys, please stop using it. You wouldn't drive your car at max engine speed (in the red zone).
> 
> To find the sweet spot for VRAM speed I used the Heaven benchmark and paused it (free walk) in windowed mode.
> 
> Then the FPS sits at a constant number. Then open Afterburner (or whatever) and move the memory-speed slider.
> I had FPS gains up to +500 on the memory; after that the FPS didn't increase. At +600 the FPS started to drop, along with a little driver crash -> PC reboot.
> 
> So I run +500 on memory.
> 
> It's an easy way to find the sweet spot.
> 
> If you bought Superposition you can even do it with a better benchmark.
> 
> I think you can do the same with any game in windowed mode.
Click to expand...

Lol, actually, yes, I redline shift every single pass on the 1/4 mile with my Mustang. That's not a great analogy.

I've not heard anyone recommend against using FurMark before. Do you have some proof, studies, etc. stating it's bad?

Just curious. I wouldn't think MSI would recommend using it if it was going to kill their cards. Seems like that would cost them a lot of money, via RMA.


----------



## KraxKill

Quote:


> Originally Posted by *coc_james*
> 
> Lol, actually, yes, I redline shift every single pass on the 1/4 mile with my Mustang. That's not a great analogy.
> 
> I've not heard someone recommend against using furmark. Do you have some proof, studies, etc stating it's bad?
> 
> Just curious. I wouldn't think MSI would recommend using it if it was going to kill their cards. Seems like that would cost them a lot of money, via RMA.


I view FurMark as an oversimplified, non-real-world stress test. You may shift your Mustang at redline, but you don't run it there for any extended period of time.

FurMark is great for testing supporting-hardware failures, like VRM failures, and for forcing intermittent issues into the open. It is otherwise useless and has been for ages now.

FurMark represents an unrealistic, non-varying load on the hardware, meaning an overclock may actually PASS the brutality of FurMark but fail under a varying load like Unigine Heaven, Firestrike, or any of the other real-world scenarios where the card is asked to do much more than just operate at "red line". You may hit redline in your Mustang, but your cylinder pressures are actually highest in the middle of your rpm band, at the height of your torque curve. If all you do is stress at redline, you have no idea of potential issues everywhere else along the way.

I don't think there is anything particularly damaging about Furmark as long as it's used within reason, but for me it's more of a problem validation tool, rather than a good stress test.

A program like Superposition or Firestrike will modulate the VRM and all supporting electronics as the workload changes. They cast a much wider net to spot potential issues. This much more closely represents real-world scenarios and stresses the card under more than just full throttle. The power spikes associated with going from light load to high load can actually be quite problematic, while running the card flat out at a constant power load may not trigger any errors.

If I had to choose the best stress test, from experience I would suggest Unigine Heaven. I still haven't found anything better. I find that if I can pass looping Heaven with all options maxed, then I can usually pass everything else.

FurMark, though, is an overly simplistic test that, although it does take your card to redline, fails to stress the card under the multitude of scenarios where overclocks can and do fail.


----------



## josephimports

Quote:


> Originally Posted by *taem*
> 
> Do any of the 1080 tis have vrm temp sensors that display in standard apps like gpu z?


Quote:


> Originally Posted by *ZealotKi11er*
> 
> I'm getting an FTW3 next time around for sure. I don't feel safe with the FE card not reporting VRM1 temps.


A few of the MSI Lightning sensors are read by HWinfo. All of the FTW3 sensors are visible.


Spoiler: Warning: Spoiler!


----------



## coc_james

Quote:


> Originally Posted by *KraxKill*
> 
> Quote:
> 
> 
> 
> Originally Posted by *coc_james*
> 
> Lol, actually, yes, I redline shift every single pass on the 1/4 mile with my Mustang. That's not a great analogy.
> 
> I've not heard someone recommend against using furmark. Do you have some proof, studies, etc stating it's bad?
> 
> Just curious. I wouldn't think MSI would recommend using it if it was going to kill their cards. Seems like that would cost them a lot of money, via RMA.
> 
> 
> 
> I view FurMark as an oversimplified, non-real-world stress test. You may shift your Mustang at redline, but you don't run it there for any extended period of time.
> 
> FurMark is great for testing supporting-hardware failures, like VRM failures, and for forcing intermittent issues into the open. It is otherwise useless and has been for ages now.
> 
> FurMark represents an unrealistic, non-varying load on the hardware, meaning an overclock may actually PASS the brutality of FurMark but fail under a varying load like Unigine Heaven, Firestrike, or any of the other real-world scenarios where the card is asked to do much more than just operate at "red line". You may hit redline in your Mustang, but your cylinder pressures are actually highest in the middle of your rpm band, at the height of your torque curve. If all you do is stress at redline, you have no idea of potential issues everywhere else along the way.
> 
> I don't think there is anything particularly damaging about FurMark as long as it's used within reason, but for me it's more of a problem-validation tool than a good stress test.
> 
> A program like Superposition or Firestrike will modulate the VRM and all supporting electronics as the workload changes. They cast a much wider net to spot potential issues. This much more closely represents real-world scenarios and stresses the card under more than just full throttle. The power spikes associated with going from light load to high load can actually be quite problematic, while running the card flat out at a constant power load may not trigger any errors.
> 
> If I had to choose the best stress test, from experience I would suggest Unigine Heaven. I still haven't found anything better. I find that if I can pass looping Heaven with all options maxed, then I can usually pass everything else.
> 
> FurMark, though, is an overly simplistic test that, although it does take your card to redline, fails to stress the card under the multitude of scenarios where overclocks can and do fail.
Click to expand...

Agreed, and that's why I mentioned using other programs as well. I'm in no way advocating for FurMark, but I'm not beating it up either. I believe it has a purpose; it's just better to run a gamut of tests that cover all aspects.

My inquiry was that of genuine curiosity.


----------



## Kev13Dd

I have an MSI 1080 Ti Gaming

I believe my GPU has coil whine, and it's completely dependent on my GPU memory clocks. I can make the coil whine worse or better by adjusting the memory frequency. Neither voltage tweaks nor core clocks have any impact; it's 100% based on memory clocks and GPU usage (above 50% it starts kicking in).

Anyone else ever seen this? Any indication if this is a bad card or an indication of a bad PSU? Thanks


----------



## Mooncheese

Quote:


> Originally Posted by *KraxKill*
> 
> Considering you score 10,200 in Supo 4k at 2100 and I score ~10400 at 2000 @ 1v you may want to actually climb off the high horse and learn something.
> 
> Your card is actually being TDP throttled, hurting your score.


Yeah, 10.2k isn't indicative of 2100 MHz with no throttling at 400 W at all; my FE just did 9950 in 4K Optimized at 2000 MHz with 1.000 V (300 W), 48 °C peak. 250 points for another 100 W: that's what exceeding a chip's sweet spot looks like. An analog would be my CPU, which will do 4.5 GHz at 1.361 V but needs more than 1.4 V to do 4.6 GHz. At some point you have to let common sense rule. 33% more wattage and 10% more voltage for 2.5% more performance? The power isn't coming from fairy dust. To me this is a microcosm of the insanity behind industrial "civilization", where we're never satisfied and need that extra 2.5% profit, whether that means automating entire sectors of the economy, purchasing and then promptly dismantling electric rail mass transit (GM, Firestone, and Standard Oil in Los Angeles in the 1930s), or denuding once biodiversity-rich ecosystems to plant monocrops to be force-fed to livestock because of our voracious appetite for meat. This is the sickness afflicting man, threatening our prosperity and continued survival (and when we get that 2.5% we are only momentarily satisfied, and instantly crave more, whatever the cost).

"The most pressing issue facing Man is his inability to sit quietly alone in a room with his thoughts" (a paraphrase of Blaise Pascal)


----------



## Leethal

How is this?

1750 MHz core
11400 MHz mem
Power limit 120%

I'm using MSI Afterburner.

Can I go higher than this, and should I play with the voltages?

I just want a decent, stable overclock.


----------



## NBrock

Quote:


> Originally Posted by *Leethal*
> 
> how is this?
> 
> 1750Mhz core
> 11400Mhz Mem
> power limit 120%
> 
> i'm using MSI afterburner
> 
> can i go higher than this and should i play with the voltages?
> 
> i just want a decent stable overclock


What temps are you hitting? I think most people's cards are boosting higher than 1750 core. Both my cards boosted to 1987 stock.


----------



## Leethal

Quote:


> Originally Posted by *NBrock*
> 
> What temps are you hitting? I think most people's cards are boosting higher than 1750 core. Both my cards boosted to 1987 stock.


max of like 68c

i'm going to try

power limit 120%
+150Mhz Core
+500Mhz Mem


----------



## taem

What's the best way to limit framerate with a 1080 Ti? I used to use Nvidia Inspector for this on my 970M laptop, but after a driver update it no longer works. I know I can use RTSS, but is there a superior or alternate method?

(I know it's better to use in game frame limit rather than external app. But most games do not have in game frame rate cap options.)


----------



## MaKeN

Quote:


> Originally Posted by *Streetdragon*
> 
> Don't use FurMark... guys, please stop using it. You wouldn't drive your car at max engine speed (in the red zone).
> 
> To find the sweet spot for VRAM speed I used the Heaven benchmark and paused it (free walk) in windowed mode.
> 
> Then the FPS sits at a constant number. Then open Afterburner (or whatever) and move the memory-speed slider.
> I had FPS gains up to +500 on the memory; after that the FPS didn't increase. At +600 the FPS started to drop, along with a little driver crash -> PC reboot.
> 
> So I run +500 on memory.
> 
> It's an easy way to find the sweet spot.
> 
> If you bought Superposition you can even do it with a better benchmark.
> 
> I think you can do the same with any game in windowed mode.


Very hard to do it this way... FPS in Heaven aren't constant; they jump from 140 to 2xx.

For some reason I get a better score in Superposition every time I raise my memory clock. Or does that have nothing to do with gaming?

I'll try opening GTA 5 in windowed mode and raising the memory there to see how the FPS changes.


----------



## Mooncheese

Following up on my last post, here's 4K Sup Opt at 2025 MHz core and 5900 MHz memory at 1.0 V, zero throttling:



And data for that run:



You DON'T need 400 W with this card; it responds best to an undervolt.

This OC is 100% stable in The Witcher 3 in 3D Vision with sustained 99-100% utilization, hours upon hours on end (yet to get a crash). I do need to turn the clocks down to 1974 MHz in Titanfall 2 (2D, G-Sync) at 1.0 V, for whatever reason, with far less PT going on. I don't get it, and I actually don't care, as I really only need the OC for TW3 in 3D Vision.

This OC is also 99.2% stable in the Firestrike Extreme stress test and 30 minutes of Heaven, but yeah, Titanfall 2 will test your OC for whatever reason.


----------



## Minusorange

I never want to OC a GPU ever again after trying to get the max out of my Aorus Xtreme; I ended up at 2000 MHz core / 1504 mem @ 1.025 V for my most stable in both Firestrike and Heaven.

I did manage 2037 core in Heaven @ max voltage, but in Firestrike it kept downclocking; 2012 @ 1.012 V worked great in Firestrike but kept crashing in Heaven.

The amount of restarts and tweaks with that hideous MHz/V curve is enough to put me off doing it again, lol. Fortunately my Ti will last me a good few years before I need to attempt OC'ing a new card.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Minusorange*
> 
> I never want to OC a GPU ever again after trying to get the max out of my Aorus Xtreme, ended up 2000mhz core/1504mem @1.025v for my most stable in both Firestrike and Heaven.
> 
> I did manage 2037 core in Heaven @ max voltage but in Firestrike it kept downclocking and 2012 @ 1.012v worked great in Firestrike but kept crashing in Heaven
> 
> The amount of restarts and tweaks with that hideous mhz/V curve is enough to put me off doing it again lol fortunately my Ti will last me a good few years before I need to attempt oc'ing a new card


That is Pascal for you. With a custom card I would not even bother. I have the FE, and that has room to OC. Most custom cards hit 19xx-20xx already, so there's no point in even trying to OC. The best you can do is let the power rip and increase fan speeds. If I were not power limited I would just do a simple offset OC.


----------



## Mooncheese

Quote:


> Originally Posted by *taem*
> 
> Best way to limit framerate with a 1080 ti? I used to use nVidia Inspector to do this with my 970m laptop and then after a driver update it no longer works. I know I can use RTSS, but is there a superior / alternate method?
> 
> (I know it's better to use in game frame limit rather than external app. But most games do not have in game frame rate cap options.)


You want to use AB/RTSS to cap your frames, as there is significantly less latency with it than with Nvidia Inspector.


----------



## Mooncheese

Quote:


> Originally Posted by *Minusorange*
> 
> I never want to OC a GPU ever again after trying to get the max out of my Aorus Xtreme, ended up 2000mhz core/1504mem @1.025v for my most stable in both Firestrike and Heaven.
> 
> I did manage 2037 core in Heaven @ max voltage but in Firestrike it kept downclocking and 2012 @ 1.012v worked great in Firestrike but kept crashing in Heaven
> 
> The amount of restarts and tweaks with that hideous mhz/V curve is enough to put me off doing it again lol fortunately my Ti will last me a good few years before I need to attempt oc'ing a new card


What temps? You lose 13 MHz at 39, 49, and 59 °C. If you can keep your temps under 50 °C instead of 70 °C, that's 26 MHz right there. The bin dropping is actually a built-in safety feature, as the processor requires more voltage the higher the temperature. If you undervolt, that's -5 °C right there. I would undervolt and add an AIO. Another mistake people make is going full-tilt with the memory overclocking, not understanding that beyond a certain frequency you start to see reduced performance. I get like 5 FPS more with the memory at +400 than at +500 or even +450 MHz. It's called error correction.

1. Bring the core temp down.
2. Undervolt.
3. Don't go willy nilly with the memory OC.

You can dial in your memory sweet spot by assigning different OC profiles to hotkeys and then with a demanding game open, alternate between say +400, +450, and +500 or more and see what it does to your FPS.

Try to use a static scene so you can eliminate any GPU load dynamism that is inherent to a constantly changing scene.
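
A tiny helper for comparing the FPS readings you collect at each memory offset might look like the sketch below (the sample numbers are made up for illustration; the point is that past the sweet spot, error correction drags the average down even though the clock is higher):

```python
# Pick the memory-offset sweet spot from FPS samples recorded at each
# offset. Sample numbers below are invented for illustration only.
from statistics import mean

def best_offset(samples):
    """Return (offset, mean_fps) for the offset with the highest mean FPS."""
    return max(((off, mean(fps)) for off, fps in samples.items()),
               key=lambda pair: pair[1])

samples = {
    400: [141, 143, 142],
    450: [144, 145, 143],
    500: [139, 140, 138],   # worse despite the higher clock: error correction
}

print(best_offset(samples))
```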


----------



## ZealotKi11er

Quote:


> Originally Posted by *Mooncheese*
> 
> What temps? You lose 13 MHz at 39, 49, and 59C. If you can keep your temps under 50C vs. 70C that's 26 MHz right there. The bin dropping is actually a built in safety feature, as the processor requires more voltage the higher the temperature. If you undervolt, that's 5C right there. I would undervolt and add an AIO. Another mistake people make is they go Full ****** with the memory overclocking, not understanding that beyond a certain frequency you start to see reduced performance. I get like 5 FPS more with the memory at +400 vs +500 or even +450 MHz. It's called error correction.
> 
> 1. Bring the core temp down.
> 2. Undervolt.
> 3. Don't go willy nilly with the memory OC.
> 
> You can dial in your memory sweet spot by assigning different OC profiles to hotkeys and then with a demanding game open, alternate between say +400, +450, and +500 or more and see what it does to your FPS.
> 
> Try to use a static scene so you can eliminate any GPU load dynamism that is inherent to a constantly changing scene.


I have tested +400 vs +500 and +500 was faster.


----------



## Mooncheese

Quote:


> Originally Posted by *Mooncheese*
> 
> What temps? You lose 13 MHz at 39, 49, and 59C. If you can keep your temps under 50C vs. 70C that's 26 MHz right there. The bin dropping is actually a built in safety feature, as the processor requires more voltage the higher the temperature. If you undervolt, that's -5C right there. I would undervolt and add an AIO. Another mistake people make is they go Full ****** with the memory overclocking, not understanding that beyond a certain frequency you start to see reduced performance. I get like 5 FPS more with the memory at +400 vs +500 or even +450 MHz. It's called error correction.
> 
> 1. Bring the core temp down.
> 2. Undervolt.
> 3. Don't go willy nilly with the memory OC.
> 
> You can dial in your memory sweet spot by assigning different OC profiles to hotkeys and then with a demanding game open, alternate between say +400, +450, and +500 or more and see what it does to your FPS.
> 
> Try to use a static scene so you can eliminate any GPU load dynamism that is inherent to a constantly changing scene.


Every card is different; while you certainly won the memory-chip lottery, I'm confident at this point that my GPU core is above average (it does 1911 out of the box simply by setting PT to 120%, and it will sustain 2025 MHz with a 1.0 V undervolt).

I'm not saying it's a golden sample, but I think it's above avg.

For reference I ordered my FE 5 minutes into pre-order day back in the beginning of March.

If you have just one memory chip that isn't as good as the others, you will hit error correction at lower frequencies. Some have dialed theirs in, lucked out, and get more performance at +500 or even +550+ MHz.


----------



## KedarWolf

Me at 2012 core, 6177 memory, XOC BIOS, .993v.











Edit: GPU Boost does ramp me up to 1.063 V during the run, though. I know adjusting voltage points sometimes helps. Going to try 1.000 V.

This is 2025 at 1.000 V with zero jumping in voltage; it stays at 1.000 V.

If your voltage jumps around while clocks stay the same, experiment with different voltage points; some jump, some don't.



Likely lost a point of stability because I had Twitch running in Chrome on my second screen the entire time.

Going to do 30 minutes of Heaven Extreme preset now.









30 minutes of Heaven, Extreme preset, 2025 core, 6177 memory, 1.000v with no jumping, XOC BIOS.











Time Spy and Superposition finish without crashing as well. Get 42-43C on GPU.


----------



## lilchronic

2012Mhz / 6210Mhz @1v


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Me at 2012 core, 6177 memory, XOC BIOS, .993v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: GPU Boost does ramp me up to 1.063v though in the run. I know if I adjust voltage points sometimes it helps. Going to try 1.000v.
> 
> This is 2025 1.000v with zero jumping in voltage, stays at 1.000v.
> 
> If your voltage jumps around with clocks staying the same experiment with different voltages points, some jump, some don't.
> 
> 
> 
> Likely lost the point in stability because I had Twitch in Chrome running on my second screen the entire time.
> 
> Going to do 30 minutes of Heaven Extreme preset now.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 30 minutes of Heaven, Extreme preset, 2025 core, 6177 memory, 1.000v with no jumping, XOC BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Time Spy and Superposition finish without crashing as well. Get 42-43C on GPU.


8K, High texture quality; uses close to 9GB of VRAM.


----------



## Ultracarpet

Just got my Zotac 1080 Ti Mini. Going to be moving to an mITX build soon. Pretty satisfied with the card: if I jack up the power limit and nothing else, it boosts to just over 1900 MHz. Runs a little hot, though; at max fan speed with the power-limit slider maxed like that, I was hitting the high 70s. Noise isn't as bad as I would've thought, though mind you, coming from an overclocked reference 480, my opinion may be skewed lol.

I haven't had an Nvidia card for a while (well, I had a 1070 briefly, but I didn't mess with it at all). Is undervolting a thing with these? Does it help with temperatures? Should I OC the memory, and how far does it usually go?


----------



## KedarWolf

Quote:


> Originally Posted by *lilchronic*
> 
> 2012Mhz / 6210Mhz @1v


Turn off G-Sync, bro, or sis.

If you're capping your frame rate at 60 FPS you're going to get really high stability scores.

You should be getting over 140 FPS on average, and turn V-Sync and Riva Tuner off too.

That score isn't valid with capped frame rates.









If you post the 3DMark link we can see if you're doing it properly.


----------



## KedarWolf

Quote:


> Originally Posted by *lilchronic*
> 
> 2012Mhz / 6210Mhz @1v


With G-Sync and V-Sync on, this isn't valid for the thread: my frame rate is capped at 60 FPS on my 4K G-Sync screen, so I get artificially high results even though 3DMark reports the run as valid.









Even with a higher-refresh-rate monitor, results with G-Sync on aren't valid here: your frame rates can't span their full range, which makes your stability look better than it really is.

Try with V-Sync, G-Sync, and RivaTuner completely off; then your results will accurately reflect what your stability really is.









https://www.3dmark.com/3dm/21874807?


----------



## lilchronic

Quote:


> Originally Posted by *KedarWolf*
> 
> Turn off G-Sync, bro, or sis.
> 
> If you're capping your frame rate at 60 FPS you're going to get really high stability scores.
> 
> You should be getting over 140 FPS on average, and turn V-Sync and Riva Tuner off too.
> 
> That score isn't valid with capped frame rates.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you post the 3DMark link we can see if you're doing it properly.


Quote:


> Originally Posted by *KedarWolf*
> 
> With G-Sync and V-Sync on, not valid for this thread as my frame rate is capped at 60 FPS on my 4K G-Sync screen and I get artificially high results even though 3DMark reports it as valid.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Even with a higher refresh rate monitor with G-Sync on you're going to get results not valid here as your frame rates do not have the full range of what they can be making your stability better than what it really is.
> 
> Try V-Sync, G-Sync and Riva Tuner completely off, then your results will accurately reflect what your stability really is.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.3dmark.com/3dm/21874807?


Yeah, I have G-Sync on, but I use Fast Sync, so it will go over my refresh rate of 144 Hz if I don't cap it. For the run above I disabled the cap; as you can see in the Afterburner monitor, max FPS was 181.


----------



## KedarWolf

Quote:


> Originally Posted by *lilchronic*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Turn off G-Sync, bro, or sis.
> 
> If you're capping your frame rate at 60 FPS you're going to get really high stability scores.
> 
> You should be getting over 140 FPS on average, and turn V-Sync and Riva Tuner off too.
> 
> That score isn't valid with capped frame rates.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you post the 3DMark link we can see if you're doing it properly.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> With G-Sync and V-Sync on, not valid for this thread as my frame rate is capped at 60 FPS on my 4K G-Sync screen and I get artificially high results even though 3DMark reports it as valid.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Even with a higher refresh rate monitor with G-Sync on you're going to get results not valid here as your frame rates do not have the full range of what they can be making your stability better than what it really is.
> 
> Try V-Sync, G-Sync and Riva Tuner completely off, then your results will accurately reflect what your stability really is.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.3dmark.com/3dm/21874807?
> 
> 
> 
> 
> Yeah i have G-sync on but i use fast sync so it will go over my refresh rate of 144hz if i don't cap it. For that run above i disabled the cap as you can see in afterburner monitor max fps was 181.

Fast Sync still affects your frame rate during the run; I just tested it.

The only accurate stability test is with V-Sync completely off, a fixed refresh rate, and any frame limiters like RivaTuner disabled.

With Fast Sync on you are still getting artificially high stability scores.









I'm sure if you do a run with the suggested settings your score will be somewhat lower.









And I would set it in the Global Settings for the purposes of the benchmark, not in the Nvidia Program Settings: the 3DMark start screen and the benchmark itself can have conflicting per-program settings, and the benchmark may not run at the settings you think you're using.


----------



## lilchronic

Quote:


> Originally Posted by *KedarWolf*
> 
> Fast Sync still affects how your frame rate is during the run, I just tested it.
> 
> The only proper stability test that is accurate is V-Sync completely off, Fixed Refresh Rate and any frame limiters like Riva Tuner completely off.
> 
> With Fast Sync on you are still getting artificially higher stability scores.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm sure if you do a run with the suggested settings your score will be somewhat lower.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And I would set it in the Global Settings for the purposes of the benchmark, not in the Nvidia Program Settings options as the 3DMark start screen and the actual 3DMark benchmark running itself can have conflicting settings and possibly cause the benchmark to not run at the settings you think you're using.


V-Sync is off here and I still get 99.6%?
https://www.3dmark.com/3dm/21875444
About to run it at default clocks and see what happens.

Edit: Default clocks 99.4%








https://www.3dmark.com/3dm/21875648


----------



## KedarWolf

Quote:


> Originally Posted by *lilchronic*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Fast Sync still affects how your frame rate is during the run, I just tested it.
> 
> The only proper stability test that is accurate is V-Sync completely off, Fixed Refresh Rate and any frame limiters like Riva Tuner completely off.
> 
> With Fast Sync on you are still getting artificially higher stability scores.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm sure if you do a run with the suggested settings your score will be somewhat lower.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And I would set it in the Global Settings for the purposes of the benchmark, not in the Nvidia Program Settings options as the 3DMark start screen and the actual 3DMark benchmark running itself can have conflicting settings and possibly cause the benchmark to not run at the settings you think you're using.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> V-Syns is off here and i still get 99.6% ?
> https://www.3dmark.com/3dm/21875444
> Bout to run it default and see what happens.
> 
> Edit: Default clocks 99.4%
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.3dmark.com/3dm/21875648

Damn, nice. What BIOS are you running right now?


----------



## lilchronic

Quote:


> Originally Posted by *KedarWolf*
> 
> Damn, nice. What BIOS are you running right now?


Just stock EVGA FE BIOS.


----------



## KedarWolf

Quote:


> Originally Posted by *lilchronic*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Damn, nice. What BIOS are you running right now?
> 
> 
> 
> Just stock EVGA FE BIOS.

This is with the EVGA FTW3 BIOS at 2037, 1.025v, 6142 memory.









Seems EVGA BIOSes are really stable. I tried one when I saw you're using EVGA.









https://www.3dmark.com/fsst/550541


----------



## TheBoom

Quote:


> Originally Posted by *KedarWolf*
> 
> Me at 2012 core, 6177 memory, XOC BIOS, .993v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: GPU Boost does ramp me up to 1.063v though in the run. I know if I adjust voltage points sometimes it helps. Going to try 1.000v.
> 
> This is 2025 1.000v with zero jumping in voltage, stays at 1.000v.
> 
> If your voltage jumps around with clocks staying the same experiment with different voltages points, some jump, some don't.
> 
> 
> 
> Likely lost the point in stability because I had Twitch in Chrome running on my second screen the entire time.
> 
> Going to do 30 minutes of Heaven Extreme preset now.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 30 minutes of Heaven, Extreme preset, 2025 core, 6177 memory, 1.000v with no jumping, XOC BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Time Spy and Superposition finish without crashing as well. Get 42-43C on GPU.


Mind sharing how you got the voltage to stay at 1.00v with the XOC bios? I thought it wasn't possible with XOC since it always reverts to 1.13v.


----------



## k4sh

Quote:


> Originally Posted by *Ultracarpet*
> 
> Just got my 1080ti zotac mini. Going to be moving to a mitx build soon. Pretty satisfied with the card, if i jack up the power limit and nothing else, it's boosting to just over 1900mhz... Runs a little hot, though; at max fan speed with the power limit slider maxed like it was, i was hitting high 70's. Noise isn't as bad as I would've thought; though mind you coming from a reference 480 that was overclocked, my opinion may be skewed lol.
> 
> I haven't had an NVidia card for a while (well i had a 1070 briefly but i didn't mess with it at all), is undervolting a thing with these? Does it help with temperatures? Should i OC the memory/how far does it go usually?


Just got mine a few weeks ago; under a waterblock it actually cannot exceed 2025 MHz.
I don't know whether a VRM cooled only by a 12 cm fan gets enough airflow, but an American reviewer showed that the onboard VRM components rank among the lowest-quality parts.
So they do not seem very efficient.
As many others in this forum have said, I'll try undervolting, since I've only recently started exploring overclocking on this card.
My current settings in MSI AB are:
+0% core voltage
+120 power
+110 core
+300 MHz VRAM

I will also see in a few days whether the copper heatsinks I recently ordered for the GPU VRM help in any way.


----------



## MaKeN

I've had copper heatsinks on the VRM of my R9 390.
They definitely help.


----------



## k4sh

Quote:


> Originally Posted by *MaKeN*
> 
> Ive had coper heatsinks on vrm on my r9 390 .
> They do help for sure


Had some on my R9 290, which I recently sold. They lowered the temps by approximately 15°C, which is fine.
Any temperature decrease on the VRM is worth taking, as it will work more efficiently.


----------



## Minusorange

Quote:


> Originally Posted by *Mooncheese*
> 
> What temps? You lose 13 MHz at 39, 49, and 59C. If you can keep your temps under 50C vs. 70C that's 26 MHz right there. The bin dropping is actually a built in safety feature, as the processor requires more voltage the higher the temperature. If you undervolt, that's -5C right there. I would undervolt and add an AIO. Another mistake people make is they go Full ****** with the memory overclocking, not understanding that beyond a certain frequency you start to see reduced performance. I get like 5 FPS more with the memory at +400 vs +500 or even +450 MHz. It's called error correction.
> 
> 1. Bring the core temp down.
> 2. Undervolt.
> 3. Don't go willy nilly with the memory OC.
> 
> You can dial in your memory sweet spot by assigning different OC profiles to hotkeys and then with a demanding game open, alternate between say +400, +450, and +500 or more and see what it does to your FPS.
> 
> Try to use a static scene so you can eliminate any GPU load dynamism that is inherent to a constantly changing scene.


Hitting 59°C.

This is my voltage curve



And my fan curve



My room is warm, so ambients are already high and not something I can escape unless I end up going with a custom water loop, but then I may as well have bought a reference PCB instead of an AIB card. Idle with a 28% fan is already 48°C.







So I have VERY little room.

The memory sweet spot seems to be +400, working out to 3007 MHz; any higher and I get artifacts, any lower and FPS starts to drop. The memory was actually easy and fairly enjoyable to tweak.
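The sweet-spot search described in the quoted post (hotkey profiles, compare FPS at each offset in a static scene) boils down to picking the offset with the best measured frame rate, since error correction erodes FPS past a point even without visible artifacts. A minimal sketch; the FPS numbers below are invented for illustration.

```python
# Sketch of the memory sweet-spot search: benchmark a static scene at
# several memory offsets and keep the one with the best average FPS.
# GDDR5X error correction means FPS can fall beyond a certain offset
# even before artifacts appear. The measurements below are made up.

def memory_sweet_spot(fps_by_offset: dict[int, float]) -> int:
    """Return the memory offset (MHz) with the highest measured FPS."""
    return max(fps_by_offset, key=fps_by_offset.get)

measurements = {
    350: 96.2,
    400: 98.9,   # best: past here, error correction eats the gains
    450: 97.1,
    500: 94.0,
}
print(memory_sweet_spot(measurements))  # -> 400
```

In practice you would fill the dictionary from real runs, one offset per hotkey profile, same scene each time.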


----------



## Ultracarpet

Quote:


> Originally Posted by *k4sh*
> 
> Just got mine a few weeks ago and under a waterblock it cannot exceed actually 2025 Mhz.
> Dunno if the VRM only cooled by a 12cm fan are cooled enough but an american reviewer showed that the VRM embedded are ranged in the lowest quality part.
> So they do not seem to be efficient.
> As many others said in that forum, i will try to go with undervolting as i've only explored recently the card overclocking.
> Actually my settings in MSI AB are :
> +0% core voltage
> +120 Power
> +110 core
> + 300 Mhz Vram
> 
> I will also see in a few days if the copper heatsinks i ordered recently for the GPU VRM will help in any way.


Yeah, I saw the video as well, but to be honest this is going to be in an mITX build, so I'm not too worried about max overclocks. I would just like it to run at stock frequencies at a decent temperature.


----------



## 8051

Quote:


> Originally Posted by *Ultracarpet*
> 
> Just got my 1080ti zotac mini. Going to be moving to a mitx build soon. Pretty satisfied with the card, if i jack up the power limit and nothing else, it's boosting to just over 1900mhz... Runs a little hot, though; at max fan speed with the power limit slider maxed like it was, i was hitting high 70's. Noise isn't as bad as I would've thought; though mind you coming from a reference 480 that was overclocked, my opinion may be skewed lol.
> 
> I haven't had an NVidia card for a while (well i had a 1070 briefly but i didn't mess with it at all), is undervolting a thing with these? Does it help with temperatures? Should i OC the memory/how far does it go usually?


$986 in the USA for a card w/sub-standard air cooling. How much are the liquid cooled 1080ti Zotac minis?


----------



## KedarWolf

Quote:


> Originally Posted by *TheBoom*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Me at 2012 core, 6177 memory, XOC BIOS, .993v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: GPU Boost does ramp me up to 1.063v though in the run. I know if I adjust voltage points sometimes it helps. Going to try 1.000v.
> 
> This is 2025 1.000v with zero jumping in voltage, stays at 1.000v.
> 
> If your voltage jumps around with clocks staying the same experiment with different voltages points, some jump, some don't.
> 
> 
> 
> Likely lost the point in stability because I had Twitch in Chrome running on my second screen the entire time.
> 
> Going to do 30 minutes of Heaven Extreme preset now.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 30 minutes of Heaven, Extreme preset, 2025 core, 6177 memory, 1.000v with no jumping, XOC BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Time Spy and Superposition finish without crashing as well. Get 42-43C on GPU.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Mind sharing how you got the voltage to stay at 1.00v with the XOC bios? I thought it wasn't possible with XOC since it always reverts to 1.13v.

You need to experiment with different voltage/core-clock points. For example, at 2012/.993v mine will jump to 1.063v, but at 2025/1.000v my card stays at 1.000v for the entire Fire Strike stress-test run.

So you need to find the voltage point/core clock that doesn't jump around for your card. Experiment!
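What stops the voltage from jumping is effectively a flattened frequency/voltage curve: every point at or above the chosen voltage is clamped to the target clock, so GPU Boost has no higher point to move to. A minimal sketch of that idea; the curve values and function name are illustrative, not taken from any tool.

```python
# Sketch of the "flat curve" lock being described: in an
# Afterburner-style frequency/voltage curve, clamp every point at or
# above the chosen voltage to the target clock, so GPU Boost has no
# higher-voltage point left to jump to. Curve values are invented.

def flatten_curve(curve: dict[float, int],
                  target_v: float, target_mhz: int) -> dict[float, int]:
    """Clamp all points at/above target_v to target_mhz; lower points untouched."""
    return {v: (target_mhz if v >= target_v else mhz)
            for v, mhz in curve.items()}

curve = {0.950: 1949, 1.000: 2025, 1.031: 2050, 1.063: 2076, 1.093: 2100}
locked = flatten_curve(curve, target_v=1.000, target_mhz=2025)
print(locked[1.063])  # -> 2025: no clock gain above 1.000v, so no jump
```

The per-card experimentation comes down to which (target_v, target_mhz) pair the silicon holds stably.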


----------



## Minusorange

Quote:


> Originally Posted by *KedarWolf*
> 
> You need to experiment with different voltage/core clock points. Like at 2012/.993v it'll jump to 1.063v but at 2025/1.000v my card will stay at 1.000 the entire Fire Strike stress test run.


Don't rely solely on Fire Strike, though.

For instance, in Fire Strike I can do [email protected] consistently without issue or drop, BUT in Heaven it causes a driver crash; conversely, I can do [email protected] without throttling in Heaven, but in Fire Strike it throttles down to below 2000.

The key is finding something that works in every bench and, more importantly, in your games, unless you're chasing your max potential and ignoring everyday stability.


----------



## Ultracarpet

Quote:


> Originally Posted by *8051*
> 
> $986 in the USA for a card w/sub-standard air cooling. How much are the liquid cooled 1080ti Zotac minis?


I got the card for $859 CAD (before tax). I don't actually know if Zotac released the watercooled version, nor do I think there are blocks for it. If the PCB review is anything to go by, though, there probably isn't much point in watercooling, as the card doesn't have much headroom at all.


----------



## kiario

My current voltage and clock settings are:
+40 on voltage
+90 on core
+500 on mem

This results in 2012 MHz core at 1.050v, at 70°C in gaming and the Heaven benchmark. Air-cooled MSI Gaming; default fan curve in AB, no custom one.

When temps go slightly above 70°C, which sometimes happens in Elite Dangerous, voltage goes to 1.062v and the clock drops to 2000 MHz.

But the clock stays at 2000 except in less intensive areas, where it rises to 2012.

If I lower the core voltage, Elite crashes after a while even though Heaven still runs fine.

Does this sound like a good clock and core voltage? Or could I get better results at a lower voltage with a custom voltage curve?

It seems to work fine in both games and benchmarks, so why change?


----------



## Sgang

ZOTAC 1080 Ti AMP Extreme owners:

Could it be that Zotac changed something in production after the bad reviews about the VRM rubber-spacer-instead-of-thermal-pad issue?

I've read in a lot of reviews about temperatures beyond 80°C, but mine seems stable at 69-70°C under full load after a lot of repeated benchmarks (Superposition, 3DMark, and FurMark), with +350 MHz over the base clock and +250 on the memory (no overvolt).

I was thinking of buying some thermal pads to place on the VRM.


----------



## 8051

Quote:


> Originally Posted by *Ultracarpet*
> 
> I got the card for $859 CAD (before tax). I don't actually know if zotac released the watercooled version, nor do i think there are blocks for it. If the pcb review is anything to go by, though; there probably isn't much point in watercooling as the card doesn't have very much headroom at all.


But why is this card so expensive? I can understand the watercooled Zotac Mini going for that much, but not the air-cooled variant.


----------



## Raity

Guys, can you please advise?
I have a very strange problem that I can't explain.

I have an EVGA 1080 Ti FTW3 and a Gigabyte 1080 Ti AORUS Waterforce (water cooled).
The water-cooled card is set to higher clocks by default and should perform better.
But on the exact same system, the scores are always better on the FTW3.
CPU and physics scores are the same.

*Here are the FTW3 results*

Time Spy - https://www.3dmark.com/spy/2279573 (8721)
Firestrike Ultra - https://www.3dmark.com/fs/13488927 (7124)

*Here are the AORUS (water cooled) results*

Time Spy - https://www.3dmark.com/3dm/21888469? (8528)
Firestrike Ultra - https://www.3dmark.com/fs/13506018 (6747)

Results differ significantly in favor of the FTW3 card despite its lower clocks.


----------



## Maximization

Guys, I need advice.
I have been moving the sliders up slowly over the past week and a half and found the limit on the GPU,
but on memory I'm maxed out on the slider.
Is there a way around this?


----------



## Minusorange

Quote:


> Originally Posted by *Maximization*
> 
> guys I need advise,
> Is there a way around this?


Get rid of GPU Tweak and use Afterburner instead


----------



## Maximization

OK!!!!


----------



## Mooncheese

Duplicate. Why this site saves partial posts that you didn't wish to complete and then inserts them into new posts is absolutely ******ed.


----------



## Mooncheese

Quote:


> Originally Posted by *Raity*
> 
> Guys, can you please advise.
> I have a very strage problem that I can't explain.
> 
> I have EVGA 1080ti FTW3 and Gigabyte 1080ti AORUS Waterforce (water cooled).
> The water cooled card set to higher clocks by default and should perform better.
> But on the absolutely same system, the scores is always better on FTW3.
> CPU and physics sore is the same.
> *
> Here is several FTW3 results*
> 
> Time Spy - https://www.3dmark.com/spy/2279573 (8721)
> Firestrike Ultra - https://www.3dmark.com/fs/13488927 (7124)
> *
> Here is several AORUS results (water cooled)*
> 
> Time Spy - https://www.3dmark.com/3dm/21888469? (8528)
> Firestrike Ultra - https://www.3dmark.com/fs/13506018 (6747)
> 
> Results differs significantly in favor of FTW3 card with lover clocks.


Do you have them both installed in the computer at the same time (for whatever reason; it's not as though you can SLI them)? And is the Gigabyte card in a PCIe slot running at 8x or 4x (vs. 16x, which is usually the upper-most slot and maybe one other on most boards)?

Running a 1080 Ti in a 4x PCIe slot would produce scores like that, I would imagine.
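One quick way to rule the slot out is `nvidia-smi --query-gpu=pcie.link.width.current,pcie.link.width.max --format=csv,noheader`, which reports the negotiated and maximum lane counts per GPU. A minimal sketch of checking that output; the sample text below is invented, not captured from a real system.

```python
# Sketch: check the PCIe link width a card is actually negotiating.
# `nvidia-smi --query-gpu=pcie.link.width.current,pcie.link.width.max
#             --format=csv,noheader`
# prints one line per GPU, e.g. "16, 16". The sample text is invented.

def parse_link_widths(csv_text: str) -> list[tuple[int, int]]:
    """Return (current, max) PCIe lane counts for each GPU line."""
    widths = []
    for line in csv_text.strip().splitlines():
        cur, mx = (int(field.strip()) for field in line.split(","))
        widths.append((cur, mx))
    return widths

sample = "8, 16\n16, 16"
for i, (cur, mx) in enumerate(parse_link_widths(sample)):
    if cur < mx:
        print(f"GPU {i}: running x{cur} of x{mx}: check the slot or riser")
```

Note that some cards also drop to a narrower link at idle, so check while the GPU is under load.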


----------



## coc_james

Quote:


> Originally Posted by *Mooncheese*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Raity*
> 
> Guys, can you please advise.
> I have a very strage problem that I can't explain.
> 
> I have EVGA 1080ti FTW3 and Gigabyte 1080ti AORUS Waterforce (water cooled).
> The water cooled card set to higher clocks by default and should perform better.
> But on the absolutely same system, the scores is always better on FTW3.
> CPU and physics sore is the same.
> *
> Here is several FTW3 results*
> 
> Time Spy - https://www.3dmark.com/spy/2279573 (8721)
> Firestrike Ultra - https://www.3dmark.com/fs/13488927 (7124)
> *
> Here is several AORUS results (water cooled)*
> 
> Time Spy - https://www.3dmark.com/3dm/21888469? (8528)
> Firestrike Ultra - https://www.3dmark.com/fs/13506018 (6747)
> 
> Results differs significantly in favor of FTW3 card with lover clocks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Do you have them both installed in the computer at the same time (for whatever reason, it's not as though you can SLI them) and is the Gigabyte card in a PCI-E lane that is 8x or 4x (vs. 16x that is usually the upper-most PCI-E slot and maybe one other with most boards)?
> 
> Running a 1080 Ti in a 4x PCIE lane would result in scores like that I would imagine.

Why couldn't they be run in SLI?


----------



## josephimports

SLI testing

Ryzen 1700X / Asus Crosshair Hero running at 8x/8x
https://www.3dmark.com/3dm/21849083

Intel 6700K / Asus Z170-WS running at 16x /16x
https://www.3dmark.com/3dm/21890547

10K graphics score difference?


----------



## Rainmaker91

Quote:


> Originally Posted by *josephimports*
> 
> SLI testing
> 
> Ryzen 1700X / Asus Crosshair Hero running at 8x/8x
> https://www.3dmark.com/3dm/21849083
> 
> Intel 6700K / Asus Z170-WS running at 16x /16x
> https://www.3dmark.com/3dm/21890547
> 
> 10K graphicss score difference?


Just FYI, that 6700K is not running 16x/16x; it's running 8x/8x, just like the Ryzen system. Now that I've cleared that up, keep in mind that Zen performs about the same as Broadwell clock for clock, Skylake is about 5% better than Broadwell clock for clock, and on top of that Skylake is clocked about 20% higher than Ryzen. So yes, you will see somewhat better scores on a 6700K than on a 1700X. That will stay true until you actually start utilizing the extra threads Ryzen 7 has available.

Edit: I have no clue why the graphics score is that different, though.


----------



## josephimports

Quote:


> Originally Posted by *Rainmaker91*
> 
> Just FIY, that 6700k is not running 16x/16x. It's running 8x/8x just like the ryzen system. Now that I have cleared that up, something to keep in mind is that Zen performs about the same as broadwell clock for clock. Skylake is about 5% better than Broadwell clock for clock, then there is the fact that Skylake is clocked about 20% higher than Ryzen is. So yeah, you will see some better scores on a 6700k than on a 1700x for sure. That will stay true until you actually start utilizing those extra threads that Ryzen 7 has available.
> 
> Edit: I have no clue about why the graphics score is that different though.


Thanks for chiming in. The cards are at 16x/16x. This board utilizes a PLX chip to make that happen.


Spoiler: Warning: Spoiler!


----------



## ZealotKi11er

Quote:


> Originally Posted by *josephimports*
> 
> Thanks for chiming in. The cards are at 16x/16x. This board utilizes a PLX chip to make that happen.
> 
> 
> Spoiler: Warning: Spoiler!


It does not matter. A PLX chip does nothing in terms of CPU-to-PCIe bandwidth; it's still 16x from the CPU.


----------



## Maximization

once you unlock these suckers...nice!!!!!


----------



## josephimports

Quote:


> Originally Posted by *Rainmaker91*
> 
> Just FIY, that 6700k is not running 16x/16x. It's running 8x/8x just like the ryzen system.


Quote:


> Originally Posted by *ZealotKi11er*
> 
> It does not matter. PLX does nothing in terms of CPU to PCIE. It still 16X.


Thanks for clearing that up.


----------



## KedarWolf

Our thread has now been tagged as the Official overclock.net 1080 Ti thread by our wonderful overclock.net mod @Arizonian


----------



## Ultracarpet

Quote:


> Originally Posted by *8051*
> 
> But why is this card so expensive? I can understand the watercooled Zotac mini going for that much, but not the air cooled variant.


It was a good $50 cheaper than any other variant, so I don't understand your question. Maybe you thought I was talking about USD; it would have been equivalent to about $689 USD. It was the cheapest 1080 Ti I could buy, and it also happened to be the one I wanted, since I'm doing an mITX build soon.


----------



## Raity

Quote:


> Originally Posted by *Raity*
> 
> Guys, can you please advise.
> I have a very strage problem that I can't explain.
> 
> I have EVGA 1080ti FTW3 and Gigabyte 1080ti AORUS Waterforce (water cooled).
> The water cooled card set to higher clocks by default and should perform better.
> But on the absolutely same system, the scores is always better on FTW3.
> CPU and physics sore is the same.
> *
> Here is several FTW3 results*
> 
> Time Spy - https://www.3dmark.com/spy/2279573 (8721)
> Firestrike Ultra - https://www.3dmark.com/fs/13488927 (7124)
> *
> Here is several AORUS results (water cooled)*
> 
> Time Spy - https://www.3dmark.com/3dm/21888469? (8528)
> Firestrike Ultra - https://www.3dmark.com/fs/13506018 (6747)
> 
> Results differs significantly in favor of FTW3 card with lover clocks.


Quote:


> Originally Posted by *Mooncheese*
> 
> Do you have them both installed in the computer at the same time (for whatever reason, it's not as though you can SLI them) and is the Gigabyte card in a PCI-E lane that is 8x or 4x (vs. 16x that is usually the upper-most PCI-E slot and maybe one other with most boards)?
> 
> Running a 1080 Ti in a 4x PCIE lane would result in scores like that I would imagine.


I'm running the video cards in exactly the same environment: x16 PCI Express, the same PCIe slot on the motherboard.
Is anyone else getting results as low as mine with a card clocked at 2000 MHz?
Could it be a damaged chip or something?


----------



## 8051

Quote:


> Originally Posted by *Raity*
> 
> Guys, can you please advise.
> I have a very strage problem that I can't explain.
> 
> I have EVGA 1080ti FTW3 and Gigabyte 1080ti AORUS Waterforce (water cooled).
> The water cooled card set to higher clocks by default and should perform better.
> But on the absolutely same system, the scores is always better on FTW3.
> CPU and physics sore is the same.
> *
> Here is several FTW3 results*
> 
> Time Spy - https://www.3dmark.com/spy/2279573 (8721)
> Firestrike Ultra - https://www.3dmark.com/fs/13488927 (7124)
> *
> Here is several AORUS results (water cooled)*
> 
> Time Spy - https://www.3dmark.com/3dm/21888469? (8528)
> Firestrike Ultra - https://www.3dmark.com/fs/13506018 (6747)
> 
> Results differs significantly in favor of FTW3 card with lover clocks.


This could be down to VBIOS tweaks done by EVGA. The Asus Gold 20th edition 980 Ti had the highest L2C/SYS/XBAR settings I've ever seen in any stock 980 Ti VBIOS; it was billed as the fastest out-of-the-box 980 Ti ever made.


----------



## 8051

Quote:


> Originally Posted by *Ultracarpet*
> 
> It was a good $50 cheaper than any other variant, so i don't understand your question. Maybe you thought i was talking about USD, it would have been equivalent to $689 USD. It was the cheapest 1080ti i could buy, and it also happened to be the one i wanted due to making a mitx build soon.


On Newegg right now, the highest-priced 1080 Tis are some Founders Editions (some priced at over a thousand bones!), followed by the Zotac 1080 Ti Mini at $987.


----------



## k4sh

So I did a few tests yesterday with my Zotac Mini, and I got through a one-hour loop of the Unigine Valley benchmark.
The bench ran successfully at 2050 MHz / 1031 mV; GPU power would occasionally reach 108%.

Then I tried TW3 in an area where GPU power fluctuated from 94% to 107%. The game crashed within a few minutes.
It's one of the most demanding games I've ever played, along with MEA. Fortunately, TW3 is fully playable at max settings and a lower frequency, at 60 FPS on a surround setup.
I'm starting to think the max TDP possible with this card is really limited. I hope the heatsinks I ordered will help, though I'm not sure.

Another conclusion is that undervolting really helped with getting through benchmarks.


----------



## Raity

Quote:


> Originally Posted by *Raity*
> 
> Guys, can you please advise.
> I have a very strage problem that I can't explain.
> 
> I have EVGA 1080ti FTW3 and Gigabyte 1080ti AORUS Waterforce (water cooled).
> The water cooled card set to higher clocks by default and should perform better.
> But on the absolutely same system, the scores is always better on FTW3.
> CPU and physics sore is the same.
> *
> Here is several FTW3 results*
> 
> Time Spy - https://www.3dmark.com/spy/2279573 (8721)
> Firestrike Ultra - https://www.3dmark.com/fs/13488927 (7124)
> *
> Here is several AORUS results (water cooled)*
> 
> Time Spy - https://www.3dmark.com/3dm/21888469? (8528)
> Firestrike Ultra - https://www.3dmark.com/fs/13506018 (6747)
> 
> Results differs significantly in favor of FTW3 card with lover clocks.


Quote:


> Originally Posted by *Raity*
> 
> I'm running video cards in absolutely the same environment, x16 PCI express, the same PCI port on the motherboard.
> Any one getting such a low results (as mine) with 2000Mhz clocked card?
> Can it be the damaged chip or something?


I did some testing. The FTW3 card was set to a 117% power limit, and it actually drew up to 117% power during all the tests.
The AORUS was set to a 150% (!) power limit in the software, but it only reaches 100-103% power during tests or any other graphics application.

I tried different software - MSI Afterburner, Gigabyte AORUS software, EVGA Precision XOC, Asus GPU Tweak - same results with any settings.
I don't know what's wrong with the AORUS power limit or how to fix it. Changing clock speeds in the software listed above works as intended, but changing the power limit only affects performance when you set it below 100% (like 50%, for example). Anything above 100% behaves as 100%.

Also, I tried to enable voltage control in MSI Afterburner; even with all the necessary boxes checked in the settings, and after a software and PC reboot, it does not allow changing the voltage - the slider is grayed out.

Any suggestions about the power limit problem with the AORUS?
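For reference, those power-limit percentages are relative to the card's base board power target, which the VBIOS defines. A rough sketch of the conversion, assuming a 250 W base (typical for a 1080 Ti; the real target varies per card):

```python
# Converts power-limit percentages to watts. The 250 W base board power is an
# assumption (typical for a 1080 Ti); each card's VBIOS sets the real target.
BASE_W = 250

def limit_watts(percent: float) -> float:
    return BASE_W * percent / 100.0

for pct in (100, 103, 117, 150):
    print(f"{pct:>3}% -> {limit_watts(pct):.1f} W")
```

So a card stuck at 100-103% is drawing roughly 250-258 W regardless of where the 150% slider sits.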


----------



## Maximization

Cool how-to video. You will have to configure the settings to unlock the AORUS, I think.


----------



## Raity

Quote:


> Originally Posted by *Maximization*
> 
> cool how to video, you will have to configure the settings to unlock AORUS I think


Dear Maximization, thank you!

But the locked voltage isn't the main problem - I don't plan to overvolt; it's just a side effect, in my opinion.
My main concern is the power limit, which isn't working as intended.

Or do you think voltage unlocking may help with the power limit and boost problem?


----------



## Alexium

I think I'm going to end up breaking the piggy bank and ordering a 1080 Ti Waterforce myself. Isn't there an XOC BIOS available for it?


----------



## Dasboogieman

Quote:


> Originally Posted by *Alexium*
> 
> I think I'm going to end up breaking the piggy bank and ordering a 1080 Ti Waterforce myself. Isn't there an XOC BIOS available for it?


No, but the Strix XOC BIOS works on it.


----------



## Minusorange

Quote:


> Originally Posted by *Raity*
> 
> Dear Maximization, thank you!
> 
> But the problem with the locked voltage is't the main - I don't plan to it's just a side effect in my opinion.
> My main concern is the power limit, that isn't working as intended.
> 
> Or you think that voltage unlocking may help with power limit and boost problem?


What does your voltage curve look like ?

On my Aorus I can only get stable at 2000 MHz at 1.025 V, BUT my power never goes above 100%. I can run a higher clock at a higher voltage, and power does go above 100%, but it makes the card hotter and results in downclocking due to the stupid heat regulation.


----------



## Alexium

Quote:


> Originally Posted by *Dasboogieman*
> 
> no but the Strix XOC BIOS works on it.


So effectively there is an XOC BIOS for it? I'm not sure why you've made that distinction in your post.


----------



## mgoldshteyn

I am running an SLI configuration of a pair of EVGA 1080 Ti SC2 Hybrid Gaming cards on Windows 7 with the latest nVidia drivers. I've run about 8 days of 24/7 compute on both cards and noticed that their ability to overclock has drastically gone down from the time I purchased them.

When I first got the cards, I could easily get 2,050 MHz stable, even at 1.050 V. Now, I can't even get 1,900 MHz on either card, regardless of voltage/power. Is it possible that by running at 120% power and max voltage (configured by setting K Boost on and raising the voltage slider all the way up using EVGA Precision XOC), I actually managed to damage the hardware in both cards?

Why else would the cards degrade so rapidly after only about 8 days of total compute time? I now get crashes just starting EVGA Precision XOC with K Boost turned on, even though it is defaulting to a 0 MHz offset, which results in something like 1,931 MHz / 1,961 MHz on my two cards, respectively. Adding voltage only results in the cards crashing sooner. Temperatures are great, both GPU, Memory, power regulation. Everything is staying below 60 C at load, with GPU temperatures much lower (e.g., 50 C).

I am testing 1,850 MHz with 1.000 V right now, since I can't even get the 1,900s to work at any voltage and higher voltages (e.g., >1.025 V) only cause crashes sooner. I have switched to MSI Afterburner, because I can't even get EVGA Precision XOC to come up without halting the OS. Afterburner comes up with the cards clocked to a locked 1,900 MHz, that I've set, so I at least have a chance to try new settings. But, running at 1,900 MHz at any reasonable voltage ultimately results in a system halt after anywhere between 30 mins to 4 hours.

I am wondering if one of the following two things has occurred:

1. Electromigration damaged the hardware of the cards. I know this sounds impossible after such a short period of 24/7 running and given that the max voltages were still bound by the 1.093 V cap, but the behavior I am seeing is very similar to the way my Pentium 4s behaved when overvolted/overclocked and having experienced electromigration (i.e., see: Sudden Northwood Death Syndrome (SNDS)).
2. The thermal compound that EVGA uses on their SC2 Hybrid Gaming cards has somehow worn down (but the temps are still low, so I doubt this happened).

If anybody has any ideas on what happened or what I can do to fix things, I am all ears.


----------



## Maximization

you might have shot your PSU, got a psu tester?


----------



## Ultracarpet

Quote:


> Originally Posted by *8051*
> 
> On newegg right now, the highest priced 1080Ti's are some founders editions (some are priced at over a thousand bones!) followed by the Zotac 1080Ti mini @ $987 US dollars.


Strange, much cheaper up here I guess. No way in hell I would spend 987 USD on this card lol.

Quote:


> Originally Posted by *k4sh*
> 
> So i've done a few test yesterday with my Zotac mini and i could achieve a 1 hour loop test with unigine valley benchmark.
> The bench ran successfully @ 2050 / 1031mV The GPU power would reach occasionaly 108%.
> 
> Then I've tried TW3 in an area where the GPU power fluctuated from 94% to 107%. Only a few minutes were necessary for the game to crash.
> This is one of the most demanding game i've ever played with MEA. Fortunately, TW3 is fully playable at max settings and lower frequence at 60 fps on a surround set.
> I'm starting to think that the max TDP possible with that card is really limited. I hope that my heatsinks ordered will help though i'm not sure.
> 
> Another conclusion is that undervolting really helped pass through benchs.


I haven't tested much with overclocking, nor do I think I will, as I don't think the cooling on the card is good enough for it. However, I managed to play ME:A for about 30 minutes without a crash at 1900 MHz / 0.925 V. I think this is pretty good, and it lowered my temps decently. It did crash at 0.9 V, so I upped it a few notches and it seems pretty stable. I'll have to try it across a bigger game spectrum, but it's encouraging.


----------



## k4sh

Quote:


> Originally Posted by *Ultracarpet*
> 
> Strange, much cheaper up here I guess. No way in hell I would spend 987 USD on this card lol.
> I haven't tested much with overclocking, nor do i think I will as I don't think the cooling is good enough on the card for it. However; i managed to play ME:A for about 30 minutes without a crash at 1900mhz 0.925v. I think this is pretty good, and it lowered my temps decently. It did crash at 0.9v so upped it a few notches and it seems pretty stable. I'll have to try it across a bigger game spectrum, but it's encouraging.


Well, that card can't handle MEA all maxed out with a triple-screen setup or a 4K screen - I mean while maintaining a constant 60 fps.
But if your in-game resolution is lower, the card running at 1900 MHz should handle it easily, at least to reach 60 fps.
If that's your case, why bother overclocking it at all? Your undervolting setting is the right way to go.


----------



## Streetdragon

I played a bit with my card.
2000/6003 with 0.975 V at 33-34°C max on the core
Stock bios Palit FE


I don't know how some can reach a 10000 score; it's miles away. And I don't think 50 MHz is the reason for it. Any ideas?

I was already running the PC in high performance mode.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Streetdragon*
> 
> I played a bit with my card.
> 2000/6003 with 0,975V at 33°-34° max on the core
> Stock bios Palit FE
> 
> 
> I dont know how some can reach 10000 score. its miles away. And i dont think that 50Mhz are the reason for it. Any ideas?
> 
> i was already running the pc in high performe mode


10000 vs 9807 is 2% difference lol.
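For anyone curious, that gap is easy to sanity-check; a quick sketch using the scores from the posts above:

```python
# Relative gap between two benchmark scores, as a percentage of the lower one.
def rel_diff_pct(a: float, b: float) -> float:
    return (a - b) / b * 100.0

print(round(rel_diff_pct(10000, 9807), 1))  # 2.0
```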


----------



## Streetdragon

but but but 10000 sounds way better


----------



## ZealotKi11er

Quote:


> Originally Posted by *Streetdragon*
> 
> but but but 1000 sounds way better


Ok, the secret is to go to Nvidia Control Panel > Adjust image settings with preview > Use my preference emphasizing > Performance.


----------



## KedarWolf

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Streetdragon*
> 
> but but but 1000 sounds way better
> 
> Ok the secret is go to Nvidia Control Panel > Adjust image settings with preview > Use my preference emphasizing > Performance.

Going to fix thread OP, the part with Nvidia Control Panel settings for benching got messed up.


----------



## KraxKill

Quote:


> Originally Posted by *Streetdragon*
> 
> I played a bit with my card.
> 2000/6003 with 0,975V at 33°-34° max on the core
> Stock bios Palit FE
> 
> 
> I dont know how some can reach 10000 score. its miles away. And i dont think that 50Mhz are the reason for it. Any ideas?
> 
> i was already running the pc in high performe mode


Your voltage curve is hurting you. You need to fix that cliff of a v-curve to be more like an actual curve. At that voltage (as low as it may seem), the card will still throttle in a few scenes.

To fix it, just raise the bins immediately below your top vbin, so that when the card downclocks to a lower bin it doesn't take such a large step down. With your current curve, when Boost 3.0 hits a power event, your card will hop multiple bins instead of just one.
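The bin-raising idea can be sketched in code. This is only an illustration of the shape of the fix - the voltages and clocks below are made-up example values, not settings for any real card:

```python
# Flattening a Boost 3.0 voltage/frequency "cliff" so a power-limit drop to
# the next bin costs one small step instead of a big one.
# All voltage/clock values here are hypothetical examples.

def smooth_below_top(curve: dict[float, int], step_mhz: int = 13) -> dict[float, int]:
    """Raise the bins below the top bin so each lower voltage point
    sits only `step_mhz` below the point above it. Bins are never lowered."""
    volts = sorted(curve)  # ascending voltage
    out = dict(curve)
    for hi, lo in zip(reversed(volts), list(reversed(volts))[1:]):
        out[lo] = max(out[lo], out[hi] - step_mhz)
    return out

# A "cliff" curve: big drop just under the 0.975 V top bin.
cliff = {0.950: 1899, 0.962: 1911, 0.975: 2000}
print(smooth_below_top(cliff))  # lower bins raised to within 13 MHz steps
```

With the smoothed curve, a one-bin drop costs ~13 MHz instead of ~90 MHz.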


----------



## Mooncheese

Quote:


> Originally Posted by *mgoldshteyn*
> 
> I am running an SLI configuration of a pair of EVGA 1080 Ti SC2 Hybrid Gaming cards on Windows 7 with the latest nVidia drivers. I've run about 8 days of 24/7 compute on both cards and noticed that their ability to overclock has drastically gone down from the time I purchased them.
> 
> When I first got the cards, I could easily get 2,050 MHz stable, even at 1.050 V. Now, I can't even get 1,900 MHz on either card, regardless of voltage/power. Is it possible that by running at 120% power and max voltage (configured by setting K Boost on and raising the voltage slider all the way up using EVGA Precision XOC), I actually managed to damage the hardware in both cards?
> 
> Why else would the cards degrade so rapidly after only about 8 days of total compute time? I now get crashes just starting EVGA Precision XOC with K Boost turned on, even though it is defaulting to a 0 MHz offset, which results in something like 1,931 MHz / 1,961 MHz on my two cards, respectively. Adding voltage only results in the cards crashing sooner. Temperatures are great, both GPU, Memory, power regulation. Everything is staying below 60 C at load, with GPU temperatures much lower (e.g., 50 C).
> 
> I am testing 1,850 MHz with 1.000 V right now, since I can't even get the 1,900s to work at any voltage and higher voltages (e.g., >1.025 V) only cause crashes sooner. I have switched to MSI Afterburner, because I can't even get EVGA Precision XOC to come up without halting the OS. Afterburner comes up with the cards clocked to a locked 1,900 MHz, that I've set, so I at least have a chance to try new settings. But, running at 1,900 MHz at any reasonable voltage ultimately results in a system halt after anywhere between 30 mins to 4 hours.
> 
> I am wondering if one of the following two things has occurred:
> 
> Electromigration damaged the hardware of the cards. I know this sounds impossible after such a short period of 24/7 running and given that the max voltages were still bound by the 1.093 V cap, but the behavior I am seeing is very similar to the way my Pentium 4s behaved when overvolted/overclocked and having experienced electromigration (i.e., See: Sudden Northwood Death Syndrome (SNDS)).
> The thermal compound that EVGA uses on their SC2 Hybrid Gaming cards has somehow worn down (but the temps are still low, so I doubt this happened).
> If anybody has any ideas on what happened or what I can do to fix things, I am all ears.


Youre not alone, when I first got my FE it would do 2037 MHz without touching the voltage (2050 MHz under 39C), then as time progressed, that became unstable, and then it was 2025 MHz, then that became unstable, then it was 2012 Mhz, AND THEN THAT BECAME UNSTABLE, and now it's somewhere between 1987 and 2000 MHz without touching the voltage curve, which is actually the reason I'm here in this forum, trying to recoup in vain some lost performance.

I don't know what it is, but yeah, this has been my experience not just with 1080 Ti but also 980 Ti. With 980 Ti I was overvolting a little, so I concluded that it was the voltage that was the culprit in the loss of stability at a given frequency. So this time around, I didn't touch the voltage, but yeah, now I've lost 37 Mhz off the top in terms of stability without touching the voltage (PT at 120%).

I am left to conclude that the silicon slowly loses its integrity in relation to load, usage, heat, voltage and time.

I mean, let's be real, we all know that cryptocurrency miners DESTROY their GPUs relatively quickly with no overclock to speak of. What's going on here? We are witnessing silicon degradation on a sped-up timescale, in relation to usage: load, frequency, heat, time and voltage.

So I believe the wisdom that increasing your voltage is detrimental to silicon integrity does hold up here; even though it may only be an infinitesimal amount (+25 mV), over time it compounds with the other factors contributing to silicon degradation.

Me? Well, I don't replace my smartphone more than once a decade; if my gaming proclivities entail purchasing a $700 GPU every 2-3 years, that's pretty much the cost of participating in PC Master Race 2560x1440 144 Hz G-Sync 2D or 3D Vision glory with most of the bells and whistles turned on and up. It's just a given at this point. I love my 1080 Ti, but I'm sure, if history is any indication, I'm going to love my 1180 Ti that much more (if it's 50% faster than an overclocked 980 Ti, or about as fast as 1080 Ti SLI at default clocks, I will be seriously blown away, again).
Quote:


> Originally Posted by *Streetdragon*
> 
> I played a bit with my card.
> 2000/6003 with 0,975V at 33°-34° max on the core
> Stock bios Palit FE
> 
> 
> I dont know how some can reach 10000 score. its miles away. And i dont think that 50Mhz are the reason for it. Any ideas?
> 
> i was already running the pc in high performe mode


Your card is rubbish, you should throw it in the trash and get a proper FE!

Just kidding - we need more data. Where is your PT% at? Is V-Sync on (turn it off)? Are you monitoring your frequency throughout the run, and does the core frequency dip? Your memory chips might be dealing with error correction at 6003 MHz; try reducing to 5900 MHz (this is my experience with my card - its sweet spot is +400 MHz, no more, no less).

Quote:


> Originally Posted by *Raity*
> 
> I made some testing - the FTW3 card was set to 117% power limit, and it worked at 117% speed during all the tests.
> AORUS set to 150%! power limit in the software, but it only reaches 100 - 103% speed during test and any other graphic application running.
> 
> I tried different software, like MSI Afterburner, Gigabyte AORUS Software, EVGA Precision XOC, Asus GPU Tweak - same results with any settings.
> I don't know what's wrong with AOURS power limit and how to fix it, changing the clock speeds in software listed above is working like intended, but changing power limit only affecting performance when you set it below 100% (like 50% for example). Everything above 100% working as 100%.
> 
> Also, I tried to enable voltage settings in MSI Afterburner, even after all necessary check marks set in the settings, after software and PC reboot - it does not allow changing voltage settings, scale is grayed out.
> 
> Any suggestions about the power limit problem with AORUS?


Yeah, after reading your initial reply to my PCI-E lane inquiry I was gearing up to ask you about PT - maybe you had PT set too low on the Aorus - but now that I've read your newer posts, something has come to mind.

If we flash a different vbios to a given card, it is recommended/required that we reinstall the display driver, for whatever reason. Physically switching between the FTW3 and the Aorus is basically the equivalent of flashing an Aorus vbios onto an FTW3, correct? So maybe you had the FTW3 installed first, installed the display driver with that card, and now THAT is limiting the PT of the Aorus to that of the original card.

To eliminate this possibility, I would try reinstalling the display driver with the Aorus installed - basically following the vbios flashing instructions here, but without flashing the vbios.


----------



## lilchronic

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Ok the secret is go to Nvidia Control Panel > Adjust image settings with preview > Use my preference emphasizing > Performance.


No tweaks here @ 2000 MHz / 6000 MHz at 1 V, and I get 10.1k.
Tweaks aren't everything; you've got to work on efficiency as well.


This was my top score with the XOC bios and a bunch of tweaks @ 2150 MHz / 6210 MHz, 1.175 V.


----------



## Streetdragon

Quote:


> Originally Posted by *KraxKill*
> 
> Your voltage curve is hurting you. You need to fix that cliff of a vcurve to be more like a curve. At that voltage (as low as it may seem,) the card will still throttle in a few scenes.
> 
> To fix, just raise the bins, immediately below your top vbin so that when the card down clocks to a lower bin it doesn't do it in such a large down step. With your current curve, in the event of a Boost 3.0 pwr event, you'r card will hop multiple bins instead of just one bin.


The card's power was between 90 and 110%, so I had 10% of room. But I will try it!
Quote:


> Originally Posted by *Mooncheese*
> 
> Just kidding, we need more data, where is your PT% at? Is V-Sync on (turn it off)? Are you monitoring your frequency throughout the run and does core freq. dip? Your memory chips might be dealing with error correction at 6GB, try reducing to 5900 MHz. (this is my experience with my card, it's sweet-spot is +400 Mhz, not more, no less).


I tuned my RAM; so far my sweet spot was +500. 50 MHz more gives no more performance, and lowering the MHz lowers the score too. So I think I'm lucky here ^^


----------



## KraxKill

Quote:


> Originally Posted by *lilchronic*
> 
> no tweaks here @ 2000Mhz / 6000Mhz 1v and i get 10.1k.
> Tweaks aren't everything got to work on that efficiency as well.
> 
> 
> This was my top score with xoc bios and a bunch of tweaks @ 2150Mhz / 6210Mhz. 1.175v


Agreed. With a good undervolt and a proper supporting voltage curve, these cards are magic. It's all about how you configure the curve to take advantage of your available TDP. It's possible to hit 10K with less than 0.9 volts, and people are shooting themselves in the foot by feeding in excessive voltage - it just makes them hit TDP that much sooner. Pascal is extremely efficient.

Core @ 1760MHz - GDDR5x @ 6125MHz @ *0.800v*


Core @ 1936MHz - GDDR5x @ 6125MHz @ *0.900v*


Core @ 2012MHz - GDDR5x @ 6125MHz @ *1.000v*


Core @ 2100MHz - GDDR5x @ 6125MHz @ *1.063v*
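Those four stable points sit close to a straight line. A quick least-squares fit - just a sanity check on the numbers above, not a tuning tool - gives the approximate MHz-per-volt scaling:

```python
# Least-squares line through the stable voltage/clock points listed above,
# to see how close the undervolt scaling is to linear.
points = [(0.800, 1760), (0.900, 1936), (1.000, 2012), (1.063, 2100)]

n = len(points)
sx = sum(v for v, _ in points)
sy = sum(f for _, f in points)
sxx = sum(v * v for v, _ in points)
sxy = sum(v * f for v, f in points)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

print(f"~{slope:.0f} MHz per volt")
est = slope * 0.950 + intercept
print(f"predicted stable clock at 0.950 V: ~{est:.0f} MHz")
```

The fit lands roughly in the 1200 MHz/V range, and the interpolated 0.950 V point falls between the measured 0.900 V and 1.000 V clocks, as you'd expect.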


----------



## Ultracarpet

Quote:


> Originally Posted by *k4sh*
> 
> Well, that card can't handle MEA all maxed out with a triple screen setup or with a 4K screen.
> I mean to maintain a 60 fps constantly.
> But if the resolution you use in game is lower, well the card running at 1900 Mhz should probably handle it easily. At least to reach the 60 fps.
> If that is your case why bother anyway to overclock it. So that your undervolting setting is the right way to go.


Yeah, I'm at 1440p; it hovered around 80-100 fps in the area I was in, with max settings, I believe.


----------



## Mooncheese

Quote:


> Originally Posted by *KraxKill*
> 
> Your voltage curve is hurting you. You need to fix that cliff of a vcurve to be more like a curve. At that voltage (as low as it may seem,) the card will still throttle in a few scenes.
> 
> To fix, just raise the bins, immediately below your top vbin so that when the card down clocks to a lower bin it doesn't do it in such a large down step. With your current curve, in the event of a Boost 3.0 pwr event, you'r card will hop multiple bins instead of just one bin.


Good observation / point.

Also when OP said "I'm already running the PC in high performance mode" I kinda cringed.

Are you talking about the High Performance power management profile in Windows? Because Superposition is NOT CPU intensive.

What I'm more worried about, though, is that you may have set your Global Nvidia Power Management to "Prefer Max Performance". You only want specific programs and games set to Prefer Max; otherwise you will be watching YouTube with 1.050 V and 2000 MHz on your GPU.

Edit:

Nice Sup 4k scores! I wish my card could benefit from more than +400 on the memory, I haven't tried since the undervolt but looking at your scores is making me want to.

Do you have any other tweaks for those scores?


----------



## mbm

Just got the MSI Seahawk... it runs very cool. Load temp is 47C and the fans are quiet. The only noise is coil whine. Damn - it's the first card where I've ever noticed that annoying sound.


----------



## Streetdragon

Quote:


> Originally Posted by *Mooncheese*
> 
> Good observation / point.
> 
> Also when OP said "I'm already running the PC in high performance mode" I kinda cringed.
> 
> Are you talking about the High Performance power management profile in Windows? Because Superposition is NOT CPU intensive.
> 
> What I'm more worried about though is that you set your Global Nvidia Power Management to "Prefer Max Performance". You only want specific programs and games set to Prefer Max, otherwise you will be watching Youtube with 1.050v and 2000 MHz on your GPU.
> 
> Do you have any other tweaks for those scores?


Sure, Windows power management. And every bit counts xD

Ok the secret is go to Nvidia Control Panel > Adjust image settings with preview > Use my preference emphasizing > Performance.

I used it and changed the lower bins to a higher clock.

Now I'm happy xD


----------



## KraxKill

Quote:


> Originally Posted by *Mooncheese*
> 
> Nice Sup 4k scores! I wish my card could benefit from more than +400 on the memory, I haven't tried since the undervolt but looking at your scores is making me want to.
> 
> Do you have any other tweaks for those scores?


No other tweaks, just the undervolt and performance settings in the NV control panel... the card stays cool at 36C or under.

I would suggest that if you're on air (and even if not), the best combination is an undervolt + overclock + a more aggressive fan curve. A lot of folks don't realize the full benefit of the undervolt because they keep their stock fan profile. But naturally, as the voltage goes down, the heat load goes down as well, and the fan spins slower. So to compensate, you need a more aggressive fan profile to drop the temps further and prevent clock hopping.
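The aggressive-fan-curve idea boils down to interpolating between (temperature, fan %) points. A minimal sketch - the points below are made-up examples, not a recommendation for any specific card:

```python
# A fan curve as linear interpolation between (temperature C, fan %) points.
# The CURVE points are hypothetical examples only.
CURVE = [(30, 35), (45, 60), (60, 85), (70, 100)]

def fan_percent(temp_c: float) -> float:
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # Linear blend between the two surrounding points.
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pinned at max above the last point

print(fan_percent(37.5))  # 47.5, halfway between the 30C and 45C points
```

Steepening the segments (or shifting the points left) is what makes the curve "aggressive": more fan sooner, so the undervolt's lower heat load actually turns into lower temps.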


----------



## Mooncheese

Quote:


> Originally Posted by *Streetdragon*
> 
> sure windows power managment. and every bit counts xD
> 
> Ok the secret is go to Nvidia Control Panel > Adjust image settings with preview > Use my preference emphasizing > Performance.
> 
> used it and changed lower bins to higher clock
> 
> 
> 
> now im happy xD


Actually, the CPU has next to zero bearing on Sup 4K - go ahead and try it yourself. I'm seeing 10% CPU utilization with my i7 4930K (hex-core).

Also, what you want to do in pursuit of component longevity is have both High Performance and Balanced power profiles on hotkey (for me it's Ctrl+Alt+1 and Ctrl+Alt+2) so that when you don't need the frequency AND VOLTAGE, like say you have Chrome with 30+ tabs open, you can reduce the voltage sustained by your CPU.

95% of the time my CPU is at 3.4 GHz and like .850v, it's only when I am playing a game that is CPU bound, say TW3 or GTA5, that I hit Ctrl+Alt+1 before firing it up. Titanfall 2 I don't even run in High Performance, there's no benefit.

But yeah, it's your CPU, do whatever you want with it; the silicon degradation formula of load, heat, voltage and time applies equally to your CPU and your GPU. If you're Mr. Moneybags who builds a $4k computer every other year, like many of the owners here seem to be, then who cares, I suppose.

Edit:

https://www.howtogeek.com/howto/windows-vista/create-a-shortcut-or-hotkey-to-switch-power-plans/
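The same plan switch can also be scripted rather than hot-keyed. A small sketch that builds the `powercfg` command; the GUIDs below are the stock Balanced / High performance identifiers that ship with Windows, so a customized install may use different ones (check `powercfg /list`):

```python
# Builds the Windows `powercfg` command that activates a power plan.
# GUIDs are the Windows default scheme identifiers (assumed unchanged).
PLAN_GUIDS = {
    "balanced": "381b4222-f694-41f0-9685-ff5bb260df2e",
    "high performance": "8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c",
}

def switch_plan_cmd(name: str) -> str:
    return f"powercfg /setactive {PLAN_GUIDS[name.lower()]}"

print(switch_plan_cmd("High Performance"))
# On Windows you would execute this via subprocess.run(cmd, shell=True).
```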


----------



## Mooncheese

Quote:


> Originally Posted by *KraxKill*
> 
> No other tweaks, just undervolt and performance settings in nvcpanel...card stays cool at 36C or under.
> 
> I would suggest that if you're on air (and even if not), that the best combination is an undervolt + overclock + a more aggressive fan curve. A lot of folks don't realize the full benefit of the undervolt because they keep their stock fan profile. But naturaly as the voltage goes down, the heat load goes down as well and the fan spins slower. So to compensate, one needs a more aggressive fan profile in order to drop the temps further and further prevent clock hopping.


I have an NZXT 140mm radiator on the GPU core and see 47C in Sup 4K and like 42-44C in Titanfall 2 (2D) and 50C in TW3 (3D Vision) with near sustained 99% load.

But you're at 35C - that's beyond impressive, so you're one bin higher than average among those with liquid cooling.


----------



## AshBorer

Picked this up at microcenter today


----------



## Streetdragon

Quote:


> Originally Posted by *Mooncheese*
> 
> Actually CPU has next-to-zero bearing in Sup 4k, go ahead and try it yourself. I'm seeing 10% CPU utilization with my i7 4930k (hex-core)?
> 
> Also, what you want to do in pursuit of component longevity is have both High Performance and Balanced power profiles on hotkey (for me it's Ctrl+Alt+1 and Ctrl+Alt+2) so that when you don't need the frequency AND VOLTAGE, like say you have Chrome with 30+ tabs open, you can reduce the voltage sustained by your CPU.
> 
> 95% of the time my CPU is at 3.4 GHz and like .850v, it's only when I am playing a game that is CPU bound, say TW3 or GTA5, that I hit Ctrl+Alt+1 before firing it up. Titanfall 2 I don't even run in High Performance, there's no benefit.
> 
> But yeah, your CPU, you do whatever you want, the silicon degradation formula of load, heat, voltage and time equally applies to your CPU as it does to your GPU. If youre Mr. Moneybags who builds a $4k computer every other year, like many of the owners here seem to be, then who cares I suppose.
> 
> Edit:
> 
> https://www.howtogeek.com/howto/windows-vista/create-a-shortcut-or-hotkey-to-switch-power-plans/


Please stop talking crap if you have no idea. Maybe I changed the power plan only for the bench? Even if the full speed isn't used in the benchmark, it's still better than the lowest clock.
And degradation won't happen at full speed with low temps and no load, but if you think it does, have a cookie.









Quote:


> Originally Posted by *AshBorer*
> 
> Picked this up at microcenter today


hold it/her tight!


----------



## nrpeyton

Evening.


----------



## OccamRazor

Quote:


> Originally Posted by *nrpeyton*
> 
> .Evening.


Hi Nick! How's things?
Your Ti treating you good?









Cheers

Occamrazor


----------



## nrpeyton

Hi OccamRazor,

Not bad mate.

I've been busy -- not been on in a while.

Aye it's still going good. Although I've not had much time to play with it lately lol.

Almost forgot I had it modified with the liquid metal. (shunt mod). It's been like that for months now and still going strong. Although I hope that'll remain true if I start a game up, lol.

The PC has been on 24/7 just idling for the last few months, so the card has been on... just it's never done anything taxing for ages...

How are you?


----------



## OccamRazor

Quote:


> Originally Posted by *nrpeyton*
> 
> Hi OccamRazor,
> 
> Not bad mate.
> 
> I've been busy -- not been on in a while.
> 
> Aye it's still going good. Although I've not had much time to play with it lately lol.
> 
> Almost forgot I had it modified with the liquid metal. (shunt mod). It's been like that for months now and still going strong. Although I hope that'll remain true if I start a game up, lol.
> 
> The PC has been on 24/7 just idling for the last few months, so the card has been on... just it's never done anything taxing for ages...
> 
> How are you?


Just fine thanks!








I manage to play a bit now and then. My Ti is under water with an EK block and a 360 rad with Vardar fans, and even with the high temps here (30C ambient) it hovers around 46C at full load @ 2126/6100 in some benches and games. Just finished Hellblade a couple of days ago, and it was a breeze through the new Unreal engine @ nearly 4K (3240x1920).


----------



## DerComissar

Quote:


> Originally Posted by *8051*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ultracarpet*
> 
> It was a good $50 cheaper than any other variant, so i don't understand your question. Maybe you thought i was talking about USD, it would have been equivalent to $689 USD. It was the cheapest 1080ti i could buy, and it also happened to be the one i wanted due to making a mitx build soon.
> 
> 
> 
> On newegg right now, the highest priced 1080Ti's are some founders editions (some are priced at over a thousand bones!) followed by the Zotac 1080Ti mini @ $987 US dollars.

And then there are the third-party seller prices at Newegg:
Newegg USA:
https://www.newegg.com/Product/Product.aspx?Item=9SIAE7U6215098

Newegg Canada:
https://www.newegg.ca/Product/Product.aspx?Item=9SIAEGZ6425655

I think there's a bit of price-gouging going on there, lol.


----------



## Mooncheese

Quote:


> Originally Posted by *Streetdragon*
> 
> pls stop crap talking, if you have no idea. Maybe i changed the powerplan only for the bench? Even if the full speed is not used in the benchmark its still better than the lowest clock.
> and degradation wont happen at full speed with low temp and no load, but if you think this happens than have a cookie
> 
> 
> 
> 
> 
> 
> 
> 
> hold it/her tight!


That's the way to repay friendly advice - with passive-aggressive hostility. If your GPU is at 1.093+ V with no load, 1.093 V is still 1.093 V, and yes, voltage is a factor in silicon degradation. You can keep your cookie, thanks.

And no, there is no difference between flogging your CPU and running it at default clocks in Superposition - go ahead and prove me wrong, I will wait right here.

Gotta love the interwebs.


----------



## TheBoom

Gotta love how there's a verbal war every 10 pages lol.

Humans.

That being said, I've experienced and talked about that "degradation" too. But that was on the XOC at much higher voltages. I think the driver updates play a part too, though.

I'm now just running the card at 2000-2012 MHz stable. It hasn't deteriorated yet, and I'm done tinkering with the curve, to be honest.


----------



## MaKeN

Has anyone who plays Mass Effect Andromeda noticed that the GPU heats up way faster and hotter than any of those benchmarks do?


----------



## Leethal

Quote:


> Originally Posted by *MaKeN*
> 
> Has anyone who plays Mass Effect Andromeda noticed that the GPU heats up way faster and hotter than any of those benchmarks do?


Ha! I was asking myself the same thing.

It brought my GPU's temp way higher than any other game or benchmark.


----------



## Blameless

Quote:


> Originally Posted by *Mooncheese*
> 
> If your GPU is at 1.093+v with no load, 1.093v is still 1.093v and yes voltage is a factor in silicon degradation, you can keep your cookie, thanks.


Voltage is one of several variables; current and temperature are just as notable when it comes to electromigration.

If the voltage is safe to use at high load for any significant period of time, it's safe at idle forever.


----------



## Thoth420

Hey all, I asked for advice probably over 100 pages back on which 1080 Ti to get and which monitor. Thanks again for all the help and suggestions. I decided on the Strix, as it has a block available and is quite well received. The panel I opted for is the Acer Predator 34-inch 21:9 100 Hz G-Sync widescreen.









Also getting the TT View 71 RGB and a Samsung 960 500GB, and swapping my build into that chassis. Vertical mounting, here I come! Going to run an AIO (TT Floe 240) for now, assess the card on air, then build a loop later on down the line. I am securing a block for the GPU now, though.

*Query: Is there a compatible backplate for the Strix once it has an EK block on it?*


----------



## Rollergold

Quote:


> Originally Posted by *Thoth420*
> 
> Hey all I asked for advice probably over a 100 pages back on what 1080Ti to get and what monitor. Thanks again for all the help and suggestions. I decided on the Strix as it has a block for it available and is quite well received. The panel I opted for is the Acer Predator 34 inch 21:9 100hz G-Sync widescreen.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also getting the TT View 71 RGB and a Samsung 960 500GB and swapping my build into that chassis. Vertical mounting here I come! Going to run an AIO(TT Floe 240) for now and assess the card on air then build a loop later on down the line. I am securing a block for the GPU now though.
> 
> *Query: Is there a compatible backplate for the Strix once it has an EK block on it?*


Only the EK backplate is compatible.


----------



## Thoth420

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Rollergold*
> 
> Only the EK backplate is compatible.






Damn, they don't look as cool, but OK, I guess I will have to opt for one of those again. Thanks.


----------



## k4sh

Quote:


> Originally Posted by *MaKeN*
> 
> Has anyone who plays Mass Effect Andromeda noticed that the GPU heats up way faster and hotter than any of those benchmarks do?


Depending on where the player is, I could see a rise in GPU load, especially aboard the Tempest spacecraft. But I never saw any temperature increase, as my GPU is under water.


----------



## 8051

Quote:


> Originally Posted by *DerComissar*
> 
> And then there are the third-party sellers at Newegg:
> Newegg USA:
> https://www.newegg.com/Product/Product.aspx?Item=9SIAE7U6215098
> 
> Newegg Canada:
> https://www.newegg.ca/Product/Product.aspx?Item=9SIAEGZ6425655
> 
> I think there's a bit of price-gouging going on there, lol.


Is this because of the miners? I hope miners have to pay extra power-usage fees for all the power they're wasting producing nothing for no one.


----------



## 8051

Quote:


> Originally Posted by *k4sh*
> 
> Depending on the player location i could see a rise in GPU load, especially in the tempest spacecraft. But i never saw any T° increase as my GPU is under water.


Why didn't you get the Zotac watercooled mini model? The announcement blurb for the Zotac 1080 Ti minis clearly indicated they were coming out with a watercooled model with a nice waterblock on it.


----------



## k4sh

Quote:


> Originally Posted by *8051*
> 
> Why didn't you get the Zotac watercooled mini model? The announcement blurb for the Zotac 1080Ti mini's clearly indicated they were coming out w/a watercooled model w/a nice
> waterblock on it.


Because I already own a custom water loop. Card after card, my universal waterblock has done a good job.
It's summer here and, while temperatures have risen recently, the GPU temp has never exceeded 43°C. Quite good.


----------



## LunaP

Had a question for you guys: I'm running 2x 1080 Tis and one 980 Ti. The 1080 Tis are SLI'd, while the 980 Ti is used for additional monitors.

Given that the length difference between the 1080 and the 980 is about 2-3 cm, the water-bridge locations are slightly off. Other than using 90-degree connectors and tubing, are there any flexible and/or rotatable connectors for offset bridges? I'm building my next rig and would like to make this an easy transition if possible. Any suggestions are much appreciated; the loop order is 1080 Ti -> 980 Ti -> 1080 Ti.


----------



## jaki

Hi guys,

I don't want to start a new thread so I thought going amongst you owners is a good idea.
My computer currently lacks a GPU because I sold my 1070 before I got a new one. I want to upgrade to a 1080 Ti and need a little help deciding.
My financial budget is about 760€ max.
So here is a list of cards I'm considering, but I need some insight into which to avoid and which to get:

https://geizhals.de/?cat=gra16_512&bpmax=760&v=e&hloc=at&hloc=de&filter=aktualisieren&sort=p&bl1_id=30&xf=11555_3%7E9810_7+10609+-+GTX+1080+Ti#gh_filterbox

I definitely want a 3-fan model because I might overclock in the future, and I assume 3 fans revving slower is a bit quieter. My computer is ultra silent during desktop use and I want to keep it like that. As long as there is no annoying fan noise, I don't really care how loud it's going to be while playing games.

Thanks in advance!


----------



## kevindd992002

Will the EVGA Powerlink be compatible with the EVGA GTX 1080Ti FTW3 if an EK waterblock/backplate is installed to it?


----------



## WillG027

Quote:


> Originally Posted by *jaki*
> 
> I definitely want a 3 fan model because I might overclock in the future and I assume 3 fans reving slower is a bit more quiet. My computer is ultra silent during desktop use so I want to keep it like that. As long as there is no annoying fan noise I don't really care how loud it's gonna be while playing games.


It's the opposite, actually.

Small-diameter fans have to spin faster to move the same amount of air as the larger fans on a two-fan config. (There are three of them, though, so total air moved can vary model to model.)

MSI and ASUS are apparently the quietest (I have the MSI and it is very quiet).


https://www.reddit.com/r/69cblh/quietest_1080ti/


----------



## TheRedViper

Quote:


> Originally Posted by *LunaP*
> 
> Had a quesiton for you guys, I"m running 2x 1080ti's and 1 980ti, the 1080ti's are SLI'd while the 980ti is used for additional monitors.
> 
> Given that the length diff between the 1080 and the 980 is about 2-3 cm the water bridge locations are slightly off, other than using 90 degree connectors and tubing are there any flexible/ and or rotatable connectors for offset bridges? I'm building my next rig and would like to have this an easy transition if possible. ANy suggestions are much appreciated, the configuration is 1080ti->980ti->1080ti for the loop.


You can always check out bitspower rotary snake fittings.


----------



## taem

Quote:


> Originally Posted by *jaki*
> 
> Hi guys,
> 
> I don't want to start a new thread so I thought going amongst you owners is a good idea.
> My computer is in lack of a GPU currently because I sold my 1070 before I got a new one. I want to upgrade to the 1080 TI and I need a little help in deciding.
> My financial budget is about 760€ max.
> So here is a list of Cards that are in question of buying but I need some insight which to avoid and which to get:
> 
> https://geizhals.de/?cat=gra16_512&bpmax=760&v=e&hloc=at&hloc=de&filter=aktualisieren&sort=p&bl1_id=30&xf=11555_3%7E9810_7+10609+-+GTX+1080+Ti#gh_filterbox
> 
> I definitely want a 3 fan model because I might overclock in the future and I assume 3 fans reving slower is a bit more quiet. My computer is ultra silent during desktop use so I want to keep it like that. As long as there is no annoying fan noise I don't really care how loud it's gonna be while playing games.
> 
> Thanks in advance!


MSI Gaming X is just terrific, hands down the best 1080 ti right now imho


----------



## jaki

Quote:


> Originally Posted by *taem*
> 
> MSI Gaming X is just terrific, hands down the best 1080 ti right now imho


There is no MSI Gaming X on that list; it only has two fans.

Also, what makes it the best?


----------



## Minusorange

Quote:


> Originally Posted by *jaki*
> 
> There is no MSI Gaming X on that list. It has two fans only
> 
> also what makes it the best?


https://www.caseking.de/msi-geforce-gtx-1080-ti-gaming-x-11g-11264-mb-gddr5x-gcmc-174.html

It's the coolest and quietest air-cooled card going.


----------



## Mooncheese

Quote:


> Originally Posted by *jaki*
> 
> Hi guys,
> 
> I don't want to start a new thread so I thought going amongst you owners is a good idea.
> My computer is in lack of a GPU currently because I sold my 1070 before I got a new one. I want to upgrade to the 1080 TI and I need a little help in deciding.
> My financial budget is about 760€ max.
> So here is a list of Cards that are in question of buying but I need some insight which to avoid and which to get:
> 
> https://geizhals.de/?cat=gra16_512&bpmax=760&v=e&hloc=at&hloc=de&filter=aktualisieren&sort=p&bl1_id=30&xf=11555_3%7E9810_7+10609+-+GTX+1080+Ti#gh_filterbox
> 
> I definitely want a 3 fan model because I might overclock in the future and I assume 3 fans reving slower is a bit more quiet. My computer is ultra silent during desktop use so I want to keep it like that. As long as there is no annoying fan noise I don't really care how loud it's gonna be while playing games.
> 
> Thanks in advance!


Save up some more and go with an AIO-cooled card, either MSI's Sea Hawk or the EVGA FTW3 with the AIO. It's the quietest you can go, and you're pushing the heat out of the chassis instead of dumping it in the case.

Save and do it right.


----------



## Battleneter

Quote:


> Originally Posted by *taem*
> 
> MSI Gaming X is just terrific, hands down the best 1080 ti right now imho


Apart from the fact it's fugly as hell, its cooler is outperformed by the Asus Strix.

I own an EVGA 1080 Ti SC2; there's really not much between any of these cards, but the Strix is the best if you had to pick a winner objectively. That said, the Strix is often priced $100+ above similar competitors, and there's just no way it's worth that.

The MSI Gaming X is a good card, but it's not the best.


----------



## jaki

The cheapest card on that list is the Zotac Extreme Core; I wonder why you guys didn't mention it, as it seems like a good contender to me.
I'm not going for a water-cooled card: too expensive, and I see no reason for it.


----------



## BaldMan

So I have been lurking around, listening in on the undervolting/voltage-curve discussion.

I have played around with my Galax 1080 Ti EXOC (basically an FE) under water, and ended up at 1.00 V @ 2000 MHz with a max temp of 47°C; at this voltage it still sometimes hits 103% power (I have the limit at 120%).

I can push 2075 @ 1.09 V, smashing into the 120% power limit with a max temp of 54°C, so the undervolt is a 7°C GPU reduction for 0 FPS loss.

Also helps keep the 4.8 GHz 7700K temps down.
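For anyone curious why the undervolt helps so much: dynamic power scales roughly as P ∝ C·V²·f, so a small voltage drop pays off quadratically. A back-of-the-envelope sketch of the two operating points above (this ignores static leakage and board power, so treat it as a rough estimate, not a measurement):

```python
# Rule-of-thumb dynamic power scaling: P ∝ C * V^2 * f
# Compare the undervolted point against the 1.09 V point.
v_uv, f_uv = 1.00, 2000   # undervolt: 1.00 V @ 2000 MHz
v_oc, f_oc = 1.09, 2075   # pushed:    1.09 V @ 2075 MHz

ratio = (v_oc / v_uv) ** 2 * (f_oc / f_uv)
print(f"estimated power ratio: {ratio:.2f}x")  # ~1.23x
```

About 23% more power for 75 MHz, which lines up with one point occasionally touching 103% of TDP while the other slams into the 120% limit.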


----------



## 8051

Quote:


> Originally Posted by *BaldMan*
> 
> So I have been lurking around, listening in on the Undervoltage/voltage curve discussion.
> 
> I have played around with my Galax 1080 ti EXOC (basically a FE) under water, and ended up with 1v @ 2000mhz with a max temp of 47deg C, with this voltage, it hits 103% sometimes (I have the limit at 120%)
> 
> I can push 2075 @ 1.09v smashing into the 120% power limit with a max temp of 54 deg, 7 deg GPU reduction for 0 FPS loss.
> 
> Also helps keep the 4.8gz 7700K temps down.


What's strange to me about your GPUz screenshot is that you're throwing three different perfcaps and it's not throttling your card.


----------



## Minusorange

Quote:


> Originally Posted by *jaki*
> 
> The cheapest card on that list is the Zotac Extreme Core. I wonder why you guys didn't mention it, seems like a good contender to me.
> I'm not going for a water cooled card, too expensive and I see no reason for it


I was originally looking at the Zotac as my go-to Ti, but after reading some reviews I decided against it, because it's the biggest card of them all and the cooling isn't very good when it comes to the VRMs.






Zotac 1080 Ti AMP Extreme Review: *Bigger, Not Better*

Out of the cards on your list I'd probably go for the Aorus


----------



## nrpeyton

This is a hypothetical question about how a GPU operates.

"If I purchase two identical GPUs and use one long-term (say one year), will it be identical in speed to the unused GPU? Will the number of clock cycles, latency of requests, etc. on the used GPU be less than that of the unused GPU?
A supporting argument may be that mechanical devices degrade over time. While a GPU has no moving parts (other than the external fan), it does have circuits that can be damaged by heat and voltage spikes. Let's say that after a year of intensive use the circuits degrade and fewer electrons can pass, since the pathway is narrower, etc.
Is this the nature of how a GPU operates, or is it simply working or broken, with no speed degradation in between?"

Debate \/ /\ ?


----------



## coc_james

Quote:


> Originally Posted by *nrpeyton*
> 
> This is a hypothetical question about how a GPU operates.
> 
> "If I purchase two identical GPU's, and use one long term (say one year), will it be identical in speed to the unused GPU? Will the number of clock cycles, latency of requests, etc on the used GPU be less than that of the unused GPU?
> A supporting argument may be that mechanical devices degrade over time, While a GPU has no moving parts (other than the external fan), it does have circuits that can be damaged by heat, and voltage spikes. Lets say that after a year of intensive use, the circuits degrade and fewer electrons can pass since the pathway is narrower, etc.
> Is this the nature of how a GPU operates, or is it simply working or broken, with no speed degradation in between?"
> 
> Debate \/ /\ ?


All electrical components degrade. Some degrade faster than others, for many reasons: quality of components, user influence, or environmental impacts. As they degrade, their efficiency goes down as well, and with a drop in efficiency you also get a decrease in performance. At some point the process accelerates, because the cooling components become less effective and the part overheats not just from stress but from the "wear and tear" itself. IMHO


----------



## Blameless

Quote:


> Originally Posted by *nrpeyton*
> 
> This is a hypothetical question about how a GPU operates.
> 
> "If I purchase two identical GPU's, and use one long term (say one year), will it be identical in speed to the unused GPU? Will the number of clock cycles, latency of requests, etc on the used GPU be less than that of the unused GPU?
> A supporting argument may be that mechanical devices degrade over time, While a GPU has no moving parts (other than the external fan), it does have circuits that can be damaged by heat, and voltage spikes. Lets say that after a year of intensive use, the circuits degrade and fewer electrons can pass since the pathway is narrower, etc.
> Is this the nature of how a GPU operates, or is it simply working or broken, with no speed degradation in between?"
> 
> Debate \/ /\ ?


Thermal cycling will cause mechanical damage to vias and solderballs while electromigration will be damaging ICs internally essentially any time they are receiving power (at a highly variable rate dependent on voltage, load/current, and temperature).

Without surpassing reference power, voltage, or thermal limits, under average workloads, a year's worth of use is not going to significantly degrade a part...going beyond spec and/or hammering the part with protracted loads, or subjecting it to extreme thermal cycling is an entirely different matter.
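To put rough numbers on the temperature part of that: electromigration lifetime is commonly modeled with Black's equation, MTTF = A·J⁻ⁿ·exp(Ea/kT). A toy sketch in Python, where A, n, and Ea are process-dependent fitting constants; the values below are purely illustrative, not measured figures for this silicon:

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def black_mttf(j, temp_k, a=1.0, n=2.0, ea=0.9):
    """Black's equation: median time to failure from electromigration.
    j: current density (arbitrary units), temp_k: junction temperature in K.
    a, n, ea are fitting constants -- the defaults here are illustrative."""
    return a * j ** (-n) * math.exp(ea / (BOLTZMANN_EV * temp_k))

# Same current density, two junction temperatures: relative lifetime
cool = black_mttf(j=1.0, temp_k=320)  # ~47 C
hot = black_mttf(j=1.0, temp_k=360)   # ~87 C
print(f"lifetime ratio cool/hot: {cool / hot:.0f}x")
```

With these example constants, a 40 K hotter junction costs well over an order of magnitude of modeled lifetime at the same current, which is why temperature matters as much as voltage in this argument.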


----------



## taem

Quote:


> Originally Posted by *Battleneter*
> 
> Apart from the fact its fugly as hell, its cooler is outperformed by the Asus Strix.
> 
> I own a EVGA 1080 Ti SC2, there really not much between any of these cards but the Strix is the best if you had to pick a winner objectively. That said the Strix is often priced around $100+ more than similar competitors, so just no ways its worth that.
> 
> MSI Gaming X is a good card but its not the best.


Are you talking about the long-haired Gamers Nexus guy's 40 dB-normalized test for 1080 Tis to say the Strix is best? That's just dB. The Gaming X's larger fans spin with a lower-pitched sound profile than the Strix's; at the same dB, the Strix is louder. Period. Sound profiles are largely subjective, but higher-pitched sounds are always more intrusive than lower-pitched ones. Have you heard the cards? I have. I shake my head at the folks who say the Strix is quieter. That's crazy. It's not even close; the Gaming X is soooo much quieter.

Anyway, I call the Gaming X the best because, at the major retailers (Amazon, Newegg, etc.), it's the cheapest of the good air-cooled cards, it's the quietest, and despite the looks, which I also hate, it's built like a tank. Yeah, I will gladly double down on calling it the best 1080 Ti. The Strix, right now, at the places I'm willing to buy a high-end GPU from, is more expensive than the Lightning Z I got.


----------



## Mooncheese

You're not going to get this level of frequency, temperature, acoustics and stability with an air-cooled variant. You're also not expelling the heat from the chassis but dumping it in your case, to be dealt with by your mobo (and Northbridge heatsink, etc.), RAM, and CPU cooling solution. Thinking of going SLI down the road? Not with two non-reference cards dumping 700 W of heat into your chassis; get ready for crippling throttling and 90°C. Been there, done that. The only reason the AIO design hasn't become more popular is that it's relatively new. It is better, hands down.

http://www.guru3d.com/articles_pages/msi_geforce_gtx_1080_sea_hawk_x_review,11.html

Edit:

If I was only now looking at picking up a Ti this would be the only card I would be looking at:

http://www.guru3d.com/news-story/evga-launches-geforce-gtx-1080-ti-ftw3-hybrid-graphics-card.html

If getting it meant spending another $100, well, so be it; I won't eat out for a month or two.

That's what's awesome about shopping for a Ti now: you have better options. When I got mine back at the beginning of March, it was FE, FE or FE. Sure, I've added a 140mm AIO for that silence and a 47°C load at 2025 MHz, but yeah, doing it now, it would be the FTW3 Hybrid. Sensors, proper VRM cooling, more power, great aesthetic design.


----------



## TucoPacifico

Aorus 1080 Ti non-Xtreme: 6260 in Superposition 1080p Extreme, playing at 2075/11954 on the original air cooling, 56°C when playing very GPU-intense stuff like Skyrim SE + 140 mods + Nyclix's ENB. Using MSI AB 4.4.0 beta 8. Are these normal results for this card, or am I very lucky in the lottery?


----------



## KedarWolf

Yeah, that's pretty good; this is what I get with my everyday OC settings.









That's with latest drivers though, I do better with older drivers for benching.











And with Nvidia tweaks in first post.


----------



## TheRedViper

Quote:


> Originally Posted by *KedarWolf*
> 
> Yeah, that's pretty good, this is what I get with my every day OC settings.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That's with latest drivers though, I do better with older drivers for benching.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And with Nvidia tweaks in first post.


What's that wallpaper?


----------



## KedarWolf

Quote:


> Originally Posted by *TheRedViper*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Yeah, that's pretty good, this is what I get with my every day OC settings.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That's with latest drivers though, I do better with older drivers for benching.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And with Nvidia tweaks in first post.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Whats that wallpaper
Click to expand...

3D werewolf wallpaper.


----------



## 8051

Quote:


> Originally Posted by *KedarWolf*
> 
> 3D werewolf wallpaper.


Where wolf? There wolf!


----------



## KedarWolf

Had to go back to driver 385.28; 385.41 causes terrible stuttering in Hellblade: Senua's Sacrifice. 385.28 is fine, though.









By the way, Hellblade is one of the very best games I've ever played, with the best storyline of any game I've played.


----------



## Kriant

I am sitting over here pondering: is there any point in watercooling my 1080 Ti FTW3? I have an Asus 1080 Ti Poseidon and an EVGA 1080 Ti FTW3 in SLI. I know the FTW3 is holding the Poseidon back a tad, but is it worth dropping $170 (with backplate) on a waterblock for the FTW3? I am not sure that going from 1936 MHz to 2050 MHz will net me substantial gains.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Kriant*
> 
> I am sitting over here pondering: is there any point in watercooling my 1080 Ti FTW3? I have an Asus 1080 Ti Poseidon and an EVGA 1080 Ti FTW3 in SLI. I know the FTW3 is holding the Poseidon back a tad, but is it worth dropping $170 (with backplate) on a waterblock for the FTW3? I am not sure that going from 1936 MHz to 2050 MHz will net me substantial gains.


In SLI even less.


----------



## Kriant

Quote:


> Originally Posted by *ZealotKi11er*
> 
> In SLI even less.


That's kinda what I am thinking. I am swapping my 1800X for a 1950X in a few weeks (I have the 1950X and Zenith sitting on the bench already, waiting for the Swiftech Apogee SKF-TR4 to arrive), so reworking my loop was going to give me the opportunity to add the second card to the loop as well, but yeah, I don't think I will see any benefit.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Kriant*
> 
> That's kinda what I am thinking. I am swapping my 1800x to 1950x in a few weeks (I have my 1950x and zenith sitting on the bench already, waiting for Swiftech Apogee SKF-TR4 to arrive) so re-working on my loop was going to provide me this opportunity to add the second card on the loop as well, but yeah, I don't think I will see any benefit.


If you want two cards in the loop, I would at least get similar cards.


----------



## Kriant

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If you want two cards in the loop, I would at least get similar cards.


I couldn't pass on the cheap Poseidon I managed to get, lol. They work well in SLI (the SLI bridge fits just fine), and Afterburner takes care of the rest. It's just a question of temperature and overclock.


----------



## Dry Bonez

Can someone explain this logic to me? I received my Aorus Extreme Waterforce 1080 Ti two days ago, and I noticed something just now while playing Rise of the Tomb Raider in 4K. My temps never went above 53°C, but out of curiosity I went ahead and touched the card, and I kid you not, I almost burned myself from the massive amount of heat coming off it. Do I have a defective card? I don't think a watercooled card is supposed to get so hot that I feel nearly burned.


----------



## taem

How indicative is Boost Engine for predicting overclock limit?

In other words, whatever clock you get at stock, how much, on average, are you able to add to that?


----------



## TheBoom

Quote:


> Originally Posted by *taem*
> 
> Are you talking about long haired gamer nexus guy's 40db normalized test for 1080 ti's to say Strix is best? That's just db. Gaming X's larger fans spin with a lower pitched sound profile than the Strix. At the same db, the Strix is louder. Period. Sound profiles are largely subjective, but higher pitched sounds are always more intrusive than lower pitched ones. Have you heard the cards? I have. I shake my head at the folks who say Strix is quieter. That's crazy. It's not even close, Gaming X is soooo much quieter.
> 
> Anyway I call Gaming X best because at the major retailers, Amazon Newegg etc, it's the cheapest of the good air cooled cards, it's the quietest, and despite the looks which I also hate, it's built like a tank. Yeah I will gladly double down on calling this the best 1080 ti. The Strix, right now at the places I'm willing to buy a high end gpu from, it's more expensive than the Lightning Z I got


I think you're missing a major factor here. GN did a normalized test, yes, but that in no way represents real-life performance.

The Strix is a much bigger card with a beefier cooler, meaning its fans wouldn't even have to ramp up to the same "sound output" as the Gaming X under the same load and TDP.

If you did have both cards at the same exact sound output (40 dB in this case), then obviously three smaller fans would make higher-pitched, or even more, noise than two larger ones.
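One detail worth adding to the dB debate: incoherent noise sources add in acoustic power, not in decibels, so fan count changes the total level in a predictable way. A quick sketch of the arithmetic (the per-fan levels here are made-up illustrative numbers, not measurements of either card):

```python
import math

def combine_db(levels):
    """Combine incoherent sound sources: sum the acoustic powers,
    then convert the total back to decibels."""
    return 10 * math.log10(sum(10 ** (level / 10) for level in levels))

# Three smaller fans vs two larger fans, each fan's level assumed:
print(round(combine_db([37, 37, 37]), 1))  # 3 sources add 10*log10(3) ~= 4.8 dB
print(round(combine_db([38, 38]), 1))      # 2 sources add 10*log10(2) ~= 3.0 dB
```

So three fans at 37 dB each end up slightly louder overall than two at 38 dB, even before pitch and sound profile, which this math says nothing about, enter the picture.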


----------



## Streetdragon

Quote:


> Originally Posted by *Dry Bonez*
> 
> Can someone explain this logic to me? I received my Aorus extreme Waterforce 1080ti 2 days ago and i noticed something just now, while playing rise of the tomb raider in 4k. My temps never went above 53c, but out of curiosity i went ahead and touched the card, and i kid you not i almost burned myself from the massive amount of heat coming out of it? Do i have a defective card? I dont think a Watercooled card is supposed to get that hot that i felt nearly burned.


I glued heatsinks onto the backplate and have a slow-spinning fan in front of them. That way I got the backplate much cooler.


----------



## Dry Bonez

Quote:


> Originally Posted by *Streetdragon*
> 
> i glued heatsinks on the backplate and have a slow spinning fan infront of them. That way i got my card much cooler on the backplate.


So, in other words, this is normal?


----------



## Streetdragon

Yep. For example, the VRM for the RAM has no direct contact with the waterblock, only with the backplate (EK waterblock), if I'm not wrong. The backplate absorbs the heat that the PCB gives off.
In addition, the backplate normally has contact with the GPU and GPU VRM areas too.

Backplate = heatsink.

With the heatsinks I glued onto my backplate, I just increased the surface area.


----------



## BaldMan

Quote:


> Originally Posted by *Dry Bonez*
> 
> Can someone explain this logic to me? I received my Aorus extreme Waterforce 1080ti 2 days ago and i noticed something just now, while playing rise of the tomb raider in 4k. My temps never went above 53c, but out of curiosity i went ahead and touched the card, and i kid you not i almost burned myself from the massive amount of heat coming out of it? Do i have a defective card? I dont think a Watercooled card is supposed to get that hot that i felt nearly burned.


Same here; it seems commonplace. As mentioned above, there is a massive amount of heat coming off the VRM.

I have a 92mm fan sitting on the card with a spacer to prop it up 3mm.

It has helped get the backplate load temps down, that's for sure.


----------



## Glottis

Quote:


> Originally Posted by *TheBoom*
> 
> I think you're missing a major factor here. GN did a normalized test yes, but that in no way would represent real life performance.
> 
> The Strix is a much bigger card with a beefier cooler meaning the fans wouldn't even have to ramp up to the same "sound output" as the Gaming X under the same load and TDP.
> 
> If you did have both fans at the same exact sound output (40db in this case) then obviously 3 smaller fans would make higher pitched or even more noise then 2 larger ones.


You are correct. Both cards are good and very close in cooling performance, but the Strix just has a slightly better cooler overall. A good indication is checking the idle temp for both cards (fans off): the Strix runs way cooler. I don't know why some people here are hell-bent on claiming the MSI Gaming is better, when all the major review sites and YouTube channels show the Strix is the cooler and quieter card.


----------



## TheBoom

Quote:


> Originally Posted by *Glottis*
> 
> You are correct. Both cards are good and very close in cooling performance, but Strix just has slightly better cooler overall. Good indication is when you check idle temp for both cards (fans off). Strix runs way cooler. I don't know why some people here are hellbent on claiming that MSI GAMING is better, when all major review sites and youtube channels show Strix is cooler and quieter card.


Human nature, I guess. In other words, "fanboyism". I own neither of the two cards, by the way, lol.


----------



## weskeh

Quote:


> Originally Posted by *Glottis*
> 
> You are correct. Both cards are good and very close in cooling performance, but Strix just has slightly better cooler overall. Good indication is when you check idle temp for both cards (fans off). Strix runs way cooler. I don't know why some people here are hellbent on claiming that MSI GAMING is better, when all major review sites and youtube channels show Strix is cooler and quieter card.


Those idle temps, geez, did they test them in the desert? Mine idles at 30°C and maxes at 60°C. Asus Strix OC.


----------



## webhito

Quote:


> Originally Posted by *weskeh*
> 
> Those idle temps geez, did they test them in the dessert? Mine idle at 30c maxes at 60c. Asus strix oc


Do you have a custom fan profile on yours? Or do you live somewhere with ambients <20°C? Otherwise those are the normal temperatures these cards run at, since the fans don't even spin until needed.


----------



## weskeh

Quote:


> Originally Posted by *webhito*
> 
> Do you have a custom fan profile on yours? Or do you live somewhere with ambients <20c? Otherwise those are the normal temperatures these cards work at since the fans don't even spin until needed.


I live in Belgium, so yeah, you can shed off some degrees. Ambient temp of about 22-25°C, and yes, a custom fan profile.


----------



## 8051

Quote:


> Originally Posted by *weskeh*
> 
> Those idle temps geez, did they test them in the dessert? Mine idle at 30c maxes at 60c. Asus strix oc


What kind of dessert? Baked Alaska?


----------



## Dry Bonez

Quote:


> Originally Posted by *BaldMan*
> 
> Same here, seems common place, as mentioned above there is a massive amount of heat coming out of the VRM.
> 
> I have a 92mm fan sitting on the card with a spacer to prop it up 3mm.
> 
> Have helped get the backplate load temps down, that's for sure.


Glad to know I'm not alone in this. Since you have the issue too, is this a reason to RMA?


----------



## Battleneter

Quote:


> Originally Posted by *Glottis*
> 
> You are correct. Both cards are good and very close in cooling performance, but Strix just has slightly better cooler overall. Good indication is when you check idle temp for both cards (fans off). Strix runs way cooler. I don't know why some people here are hellbent on claiming that MSI GAMING is better, when all major review sites and youtube channels show Strix is cooler and quieter card.


Agreed, these kinds of results have been replicated in other reviews, pretty much confirming the Strix as "technically" the best air-cooled card; I'm not even sure the Gaming X would rank #2 or #3 any more.

Some people here sticking up for the Strix don't actually own one (e.g. me); it's just about being accurate.


----------



## tarasis

Quote:


> Originally Posted by *Battleneter*
> 
> Agreed, these kinds of results have been replicated in other reviews, pretty much confirming the Strix as "technically" the best air-cooled card. I'm not even sure the Gaming X would rank #2 or #3 any more.
> 
> People here sticking up for the Strix don't actually own one (i.e. me); it's just being accurate.


I've been struggling to decide which card to get (particularly before the Destiny 2 deal ends tomorrow). I've seen some sites say the MSI is coolest/quietest, but others say the Strix OC. (For instance https://uk.hardware.info/category/5/graphics-cards/testresults )

-

I should say I'm aiming to replace my Gigabyte Gaming G1 GTX 970 for VR use and also improved 2560x1080 (or higher with DSR) performance.

I was looking to choose between the MSI Gaming X, Gigabyte Aorus Xtreme, and Asus Strix OC. Different sites rank those cards differently, and so do people on different forums (for instance the MSI seems to be pushed more on Reddit as the quietest).

Prices in Germany have shot up on them in the last week or two.


----------



## taem

Quote:


> Originally Posted by *TheBoom*
> 
> Human nature I guess. In other words "fanboyism". I own neither of the 2 cards by the way lol.


Neither do I lol. But going by dB benches on the net is not a good test for noise. I love the Gamers Nexus guy; when he said the Strix had dethroned the Gaming X I wanted the Strix, then I heard it compared to the Gaming X and went right back to the Gaming X. But then I got a different card altogether lol.


----------



## Minusorange

Quote:


> Originally Posted by *tarasis*
> 
> Prices in Germany have shot up on them in the last week or two.


A shortage of RAM is going to cause prices to rise.


----------



## Kriant

EK waterblock ordered. Caved in.


----------



## tarasis

Quote:


> Originally Posted by *Minusorange*
> 
> Shortage on RAM is going to cause prices to rise


There I was, hoping that prices would drop. Makes me kick myself for not thinking about doing this upgrade months ago.

Thanks.


----------



## pez

Quote:


> Originally Posted by *TheBoom*
> 
> I think you're missing a major factor here. GN did a normalized test yes, but that in no way would represent real life performance.
> 
> The Strix is a much bigger card with a beefier cooler meaning the fans wouldn't even have to ramp up to the same "sound output" as the Gaming X under the same load and TDP.
> 
> If you did have both cards' fans at the same exact sound output (40dB in this case), then obviously 3 smaller fans would make higher-pitched or even more noise than 2 larger ones.


Quote:


> Originally Posted by *taem*
> 
> Neither do I lol. But going by dB benches on the net is not a good test for noise. I love gamer nexus guy, when he said Strix has dethroned Gaming X I wanted Strix, then I heard it compared to Gaming X and went right back to Gaming X. But then I got a different card altogether lol.


I have to agree. The Strix card is fine below 40% from what I recall, but anything past that and the fans are distressingly annoying. The noise profile of the Strix has to be one of the worst I've heard on a $700+ card to date. Even the EVGA ICX coolers have a more pleasing noise profile.

The issue isn't so much that it's 3 smaller fans; it's that the fan motor is rather whiny and just not good. The saving grace of the card is the large heatsink, which generally doesn't need much fan speed in an ideal situation.


----------



## Blameless

I've been inured to all but the most extreme fan noise from running 290X Crossfire and an air cooled Fury in my gaming systems...even 100% fan speed on my Aorus GTX 1080 ti is perfectly tolerable to me, and the 70% I need to never dip below 2012MHz core while running 5940MHz memory is barely audible over ambient noise levels in a closed case.


----------



## Thoth420

Quote:


> Originally Posted by *Kriant*
> 
> That's kinda what I am thinking. I am swapping my 1800x to 1950x in a few weeks (I have my 1950x and zenith sitting on the bench already, waiting for Swiftech Apogee SKF-TR4 to arrive) so re-working on my loop was going to provide me this opportunity to add the second card on the loop as well, but yeah, I don't think I will see any benefit.


I am curious to see what your loop looks like with one card under water and the other on air. I agree that it would not be worth it to block the second card for performance; off the cuff, the only bonus would be aesthetic for the most part.
Quote:


> Originally Posted by *Kriant*
> 
> EK waterblock ordered. Caved in.


Noice, I can't wait to see the end result with the new CPU and block, as well as both cards in the loop.


----------



## Rangerscott

Wish they'd make a 1080 Ti with less RAM to drop the price. 11GB is overkill for me, but I would like me some of that core.

The top graphics cards are just getting crazy expensive.


----------



## Mooncheese

Quote:


> Originally Posted by *Rangerscott*
> 
> Wish theyd make a 1080ti with less ram to drop the price. 11gb is over kill for me but would like me some of that core.
> 
> The top graphic cards are just getting crazy expensive.


Seeing nearly 8GB utilized in Titanfall 2 and Rise of the Tomb Raider at 2560x1440 with "insane" textures, and we are not even 6 months into its life cycle. 4K uses even (way) more. Not everyone intends to stay at 1080p. 11GB may have been overkill 3 years ago, but games are only using more and more of it.


----------



## weskeh

Quote:


> Originally Posted by *Mooncheese*
> 
> Seeing nearly 8gb utilized in Titanfall 2 and Rise of the Tomb Raider at 2560x1440p with "insane" textures and we are not even 6 months into its life cycle. 4k uses even (way) more. Not everyone intends to stay at 1080p. 11gb may have been overkill 3 years ago but the games are only using more and more of it.


Even @ 1080p, Tomb Raider uses nearly 8GB with lots of eye candy.


----------



## k4sh

Quote:


> Originally Posted by *Mooncheese*
> 
> Seeing nearly 8gb utilized in Titanfall 2 and Rise of the Tomb Raider at 2560x1440p with "insane" textures and we are not even 6 months into its life cycle. 4k uses even (way) more. Not everyone intends to stay at 1080p. 11gb may have been overkill 3 years ago but the games are only using more and more of it.


Didn't know that ROTR comes with such textures. Is it already part of the game settings, or do you need to mod anything?
Would be very curious to see how it behaves with my card.


----------



## TheBoom

Quote:


> Originally Posted by *weskeh*
> 
> Even @ 1080p tomb raider user nearly 8gb with lots eyecandy.


ROTR is designed to use as much VRAM as is available. It loads textures into VRAM to minimize memory swaps and excessive reloading of the same data. The game will run the same on a card with less VRAM as long as the minimum requirements are met.
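
That opportunistic-caching behavior can be sketched as a VRAM-sized LRU cache. This is illustrative only: the budget, texture sizes, and eviction policy below are assumptions for the sketch, not ROTR's actual engine code.

```python
from collections import OrderedDict

class TextureCache:
    """Illustrative LRU texture cache that opportunistically fills a VRAM budget."""
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.used_mb = 0
        self.cache = OrderedDict()  # texture_id -> size_mb

    def request(self, texture_id, size_mb):
        if texture_id in self.cache:
            self.cache.move_to_end(texture_id)  # cache hit: mark recently used
            return
        # Evict least-recently-used textures only when the budget would overflow.
        while self.used_mb + size_mb > self.budget_mb and self.cache:
            _, evicted_mb = self.cache.popitem(last=False)
            self.used_mb -= evicted_mb
        self.cache[texture_id] = size_mb
        self.used_mb += size_mb

# The same workload "uses" whatever VRAM is available to it:
workload = [(i % 64, 100) for i in range(256)]  # 64 distinct 100MB textures, revisited
big, small = TextureCache(8000), TextureCache(4000)
for tid, mb in workload:
    big.request(tid, mb)
    small.request(tid, mb)
print(big.used_mb, small.used_mb)  # → 6400 4000
```

The point: reported VRAM "usage" tracks the budget, not the requirement, which is why an 11GB card can look nearly full while a smaller card runs the same settings fine.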

Quote:


> Originally Posted by *Blameless*
> 
> I've been inured to all but the most extreme fan noise from running 290X Crossfire and an air cooled Fury in my gaming systems...even 100% fan speed on my Aorus GTX 1080 ti is perfectly tolerable to me, and the 70% I need to never dip below 2012MHz core while running 5940MHz memory is barely audible over ambient noise levels in a closed case.


Lol, I have the same problem. I can't stand seeing the card run below 2000, and the first bin after 2000 is 2012MHz.


----------



## Glottis

Quote:


> Originally Posted by *Rangerscott*
> 
> Wish theyd make a 1080ti with less ram to drop the price. 11gb is over kill for me but would like me some of that core.
> 
> The top graphic cards are just getting crazy expensive.


Come on, get serious here. This is a high-end enthusiast part. You can't afford the current 1080 Ti price, but if it cost $50 less with slightly less VRAM you would suddenly be able to afford it? Or perhaps you are dreaming that an 8GB 1080 Ti would cost $300 or something?


----------



## dVeLoPe

Hey guys, I can step up to the EVGA 1080 Ti from a 1080. What performance increase should I see?

Is it worth getting a 1440p or 4K monitor with just 1x 1080 Ti?


----------



## TheBoom

Quote:


> Originally Posted by *dVeLoPe*
> 
> hey guys i can step up to the 1080ti by evga from a 1080 what perfomance increase should i see?
> 
> is it worth getting a 1440p or 4k monitor with just 1x 1080ti


I'd say roughly a 30% increase. With this card the sweet spot is really 1440p at 144-165Hz.


----------



## Dasboogieman

Quote:


> Originally Posted by *dVeLoPe*
> 
> hey guys i can step up to the 1080ti by evga from a 1080 what perfomance increase should i see?
> 
> is it worth getting a 1440p or 4k monitor with just 1x 1080ti


35% stock for stock


----------



## Blameless

Quote:


> Originally Posted by *Rangerscott*
> 
> Wish theyd make a 1080ti with less ram to drop the price. 11gb is over kill for me but would like me some of that core.


It might be possible to make a 5.5GiB 1080 Ti, but 5.5GiB is pretty lean for a part in this segment... probably not worthwhile for NVIDIA or AIBs to support and try to sell such an SKU.
Quote:


> Originally Posted by *Rangerscott*
> 
> The top graphic cards are just getting crazy expensive.












As far as value goes, the reduction in performance per dollar going from the GTX 1080 or Vega parts up to the 1080 Ti is as small as it's ever been in the history of GPUs.

I remember when going from the $250 part to the $500 part was a sub-20% performance increase. Now you get 30%+ going from $500 to $750.
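
That value point can be made concrete with back-of-the-envelope numbers. The relative performance figure below is the rough ~30% estimate quoted in this thread, not a measured benchmark:

```python
# Rough performance-per-dollar comparison; perf is indexed to the GTX 1080 = 100.
cards = {
    "GTX 1080":    {"price_usd": 500, "perf_index": 100},
    "GTX 1080 Ti": {"price_usd": 750, "perf_index": 132},  # ~30%+ faster
}

for name, c in cards.items():
    ratio = c["perf_index"] / c["price_usd"]
    print(f"{name}: {ratio:.3f} perf per dollar")

# 0.176 / 0.200 = 0.88: the Ti keeps ~88% of the 1080's perf-per-dollar
# despite costing 50% more -- a historically small flagship premium.
```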
Quote:


> Originally Posted by *dVeLoPe*
> 
> hey guys i can step up to the 1080ti by evga from a 1080 what perfomance increase should i see?
> 
> is it worth getting a 1440p or 4k monitor with just 1x 1080ti


A 1080 Ti is roughly 30% faster than a 1080.

Nearly everything will run extremely well at 1440p on a 1080 Ti, but there are some game/setting combinations that won't do well at 4K.


----------



## dVeLoPe

For instance, right now at 1080p my single 1080 with a 4GHz 5820K gets as low as 21 fps in Rise of the Tomb Raider maxed out.


----------



## stangflyer

Quote:


> Originally Posted by *Blameless*
> 
> It might be possible to make a 5.5GiB 1080 ti, but 5.5GiB is pretty lean for a part in this segment...probably not worth while for NVIDIA or AIBs to support and try to sell such an SKU.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As far as value goes, the reduction in performance per dollar going from the GTX 1080 or Vega parts to the 1080 ti is as small as it's ever been in the history of GPUs.
> 
> I remember when going from the 250 dollar to 500 dollar part was a sub-20% performance increase. Now you get 30%+ going from 500 to 750.
> A 1080 ti is roughly 30% faster than a 1080.
> 
> Nearly everything will run extremely well at 1440p on the 1080 ti, but there are some game/setting combinations that won''t do well at 4k.


I have had nine of the cards in that chart. Would have been more, but I also had a 9800 Pro, 4870, 5970, and 7950s in SLI.


----------



## Hl86

I don't know if it's an issue, but when I'm playing Watch Dogs 2 the power % sits around 80-85%. It's the only game where the power % is so low.


----------



## TheRedViper

Quote:


> Originally Posted by *Hl86*
> 
> I don´t know if its a issue, but when i´m playing Watch dogs 2 the power % lies around 80--85%. It´s the only game where power % is so low.


Ubisoft, that's why.


----------



## TheBoom

Quote:


> Originally Posted by *dVeLoPe*
> 
> for instance right now at 1080p my single 1080 with a 4ghz 5820k gets as low as 21 fps in rise of the tomb raider maxxed out


How is that possible? Unless you mean stutters? At 4K maxed out I'd say a low of 21fps is possible, but at 1080p?


----------



## KedarWolf

Quote:


> Originally Posted by *dVeLoPe*
> 
> for instance right now at 1080p my single 1080 with a 4ghz 5820k gets as low as 21 fps in rise of the tomb raider maxxed out


This is ROTTR at 1080p with settings fully maxed out, including Pure Hair.



DirectX 12 seems more consistent, but with lower top frame rates.


----------



## TheBoom

Guys, I need some help here. I've got a most baffling issue with my AMP Extreme. Or it may not be the card; I can't seem to identify it.

I posted here last week about this: basically a black screen, usually followed by the GPU fans ramping to 100%. If I restart, the clocks get stuck at the idle boost state for both core and memory. However, if I shut down and power on again it works normally, until another black screen happens, that is.

This black screen doesn't happen immediately when launching a game or 3D app; it can take hours. But once it happens, it keeps happening for a while after.

I borrowed an R9 290X to rule out the PSU, and it works fine for hours in game, apart from a few normal driver crashes, which I think are down to the factory overclock and high temps.

So I decided it had to be the GPU and brought it to the service center. They tested it and nothing happened. I have a feeling the testing wasn't long enough.

What do you guys think?


----------



## 8051

What was so great about the 8800 Ultra that it sold for $850 in 2007?


----------



## GRABibus

Quote:


> Originally Posted by *8051*
> 
> What was so great about the 8800 Ultra that it sold for $850 in 2007?


In 10 years, you will ask the same question:
What was so great about the GTX 1080 Ti that it sold for $850 in 2017?


----------



## GRABibus

So, I did a lot of game tests since I got my Gigabyte GTX 1080 Ti Gaming OC 11G: Star Wars Battlefront 2015, DOOM, BF1, TF2, Overwatch, Black Ops 3, CS:GO.
With my GTX 1080 Ti Gaming OC 11G on the EVGA FTW3 BIOS, I can play at 2063MHz/6055MHz at 1.05V with the following curve:

http://www.casimages.com/img.php?i=17090508253217369815256460.png

Bios EVGA FTW3 v86.02.39.01.90
+99MHz on Core
+547MHz on Memory
127% Power limit
100% voltage range
100% fan speed.
Flat curve 2088MHz from 1.05V to 1.2V

This card runs hot and the cooler is really not the best, so the power limit is reached; my curve gives 2050MHz to 2063MHz at 1.05V in game.
But stability is really good.
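
Those in-game clocks line up with how Pascal's GPU Boost steps the core in discrete bins of roughly 13MHz: a power-limit throttle that sheds two or three bins off a flat 2088MHz curve lands right in that 2050-2063MHz range. A minimal sketch, assuming the ~13MHz bin size (an approximation, not an NVIDIA-documented constant):

```python
BIN_MHZ = 13  # approximate Pascal boost step size (assumption)

def throttled_clock(curve_mhz, bins_dropped):
    """Clock after GPU Boost pulls back a whole number of bins from the V/F curve."""
    return curve_mhz - bins_dropped * BIN_MHZ

# A flat 2088MHz curve that hits the power limit and sheds a few bins:
for bins in range(4):
    print(bins, throttled_clock(2088, bins))  # 2088, 2075, 2062, 2049
```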

I will play 3 major games in the coming weeks: Destiny 2, COD WWII, and Star Wars.
I will check whether this overclock is stable with those games too.
I also need to confirm with Quake Champions, as I saw what looked like artifacts (to be confirmed).

I am really satisfied about this card.


----------



## Thoth420

Quote:


> Originally Posted by *TheBoom*
> 
> How is that possible? Unless you mean stutters? At 4k maxed out I'd say a low of 21fps is possible but at 1080p?


Overuse of AA, which can't all be necessary at 1080p unless it is a very large TV... it also sounds wrong off the cuff; my Fury X can do better at 1440p with everything maxed.


----------



## Blameless

Quote:


> Originally Posted by *8051*
> 
> What was so great about the 8800 Ultra that it sold for $850 in 2007?


It was faster than the 8800GTX, which was already over 600 dollars.
Quote:


> Originally Posted by *GRABibus*
> 
> Bios EVGA FTW3 v86.02.39.01.90


Why that particular BIOS?


----------



## Sgang

Hi everybody,
Is anyone using an ultrawide 3440x1440 monitor?
I'm experiencing stuttering and unexpected frame drops in The Witcher 3 (maxed out, less Hairworks) after installing my new monitor (Asus MX34V, a 100Hz UW monitor with no G-Sync). I'm coming from an Acer XB271HU (I still have it, just on another machine).

I've also tried capping the framerate to 60fps, but the drops, while smaller, are still present. Is the 1080 Ti not enough for this resolution? Is it the processor (1800X Ryzen at 3.9GHz with 2666MHz RAM), or is it just the monitor?

Thanks to anyone who replies.


----------



## GRABibus

Quote:


> Originally Posted by *Blameless*
> 
> Why that particular BIOS?


Because I read it gave good results, and it also increases the power limit.


----------



## TheRedViper

Quote:


> Originally Posted by *Sgang*
> 
> Hi Everybody,
> is someone using a UWide monitor 3440x1440?
> i'm experiencing stuttering and unexpected frame drops in The Witcher 3 (maxed out less hairworks) after installing my new monitor (Asus MX34V, no gsync 100hz UW Monitor). I'm coming from an ACER XB271Hu ( i still have it just on another machine)
> 
> I've also tried to fix the framerate to 60fps, but also if minor, the drops are still present. Is the 1080ti not enough for this resolution? Is the processor? 1800x Ryzen with 3.9ghz and 2666mhz ram, or is just the monitor?
> 
> Thanks to anyone will reply


My previous monitor was the XB271HU running on 1070 SLI, which I changed for an X34, still on 1070 SLI, but I'm currently building a new PC from scratch with a 1080 Ti and eventually more. I had no issues at all except in Wildlands, but that's just Ubisoft. I think G-Sync makes the difference here.


----------



## 8051

Quote:


> Originally Posted by *GRABibus*
> 
> In 10 years, you will ask the same question :
> What was so great about the GTX 1080 Ti that it sold for $850 in 2017?


But the 1080 Ti wasn't supposed to sell for $850; it was supposed to sell for $700.

Was the 8800 Ultra that much better than anything ATI had back then?


----------



## ZealotKi11er

Quote:


> Originally Posted by *8051*
> 
> But the 1080Ti wasn't supposed to sell for $850, it was supposed to sell for $700.
> 
> Was the 8800 Ultra that much better than anything ATI had back then?


The 8800 Ultra was like the Titan of its time. AMD had nothing then, similar to the situation now. The 1080 Ti is very well priced at $700.


----------



## neurotix

Getting two EVGA GTX 1080ti FTW3 tomorrow most likely, for SLI, so I'm subscribing.

Going through some very rocky times in my life right now, but once I overclock them and figure them out I will post my results etc.

Used AMD since 2009 when I first built Big Red; I'm finally making the switch to Nvidia because I'm sick of paying the same amount of money (i.e. a LOT of money) and having my investment depreciate faster, as well as having inferior performance.

One OC'ed 1080ti is probably stronger than my dual Fury setup so with two... well, I'll be able to save money for other things now and not have to worry about upgrading my GPUs for a while. I still have all my dual Fury benches on hwbot and will submit 1080ti benches there too eventually. Betting the SLI 1080ti's will be about 3x more powerful than what I have now.

Tax return time I'll be saving some money and putting it aside for an Ice Lake setup most likely (I know everyone likes Ryzen right now but I play a lot of older titles and would like the best single thread CPU performance on the market).

I heard you just OC Pascal by raising the power limit to the max and they'll basically auto-overclock. Anything else I need to know? Thanks guys.


----------



## Maximization

Quote:


> Originally Posted by *neurotix*
> 
> Getting two EVGA GTX 1080ti FTW3 tomorrow most likely, for SLI, so I'm subscribing.
> 
> Going through some very rocky times in my life right now, but once I overclock them and figure them out I will post my results etc.
> 
> Used AMD since 2009 when I first built Big Red, finally making the switch to Nvidia because I'm sick of paying the same amount of money (e.g. a LOT of money) and having my investment depreciate faster as well as having inferior performance.
> 
> One OC'ed 1080ti is probably stronger than my dual Fury setup so with two... well, I'll be able to save money for other things now and not have to worry about upgrading my GPUs for a while. I still have all my dual Fury benches on hwbot and will submit 1080ti benches there too eventually. Betting the SLI 1080ti's will be about 3x more powerful than what I have now.
> 
> Tax return time I'll be saving some money and putting it aside for an Ice Lake setup most likely (I know everyone likes Ryzen right now but I play a lot of older titles and would like the best single thread CPU performance on the market).
> 
> I heard you just OC Pascal by raising the power limit to the max and they'll basically auto-overclock. Anything else I need to know? Thanks guys.


I came from 2 Fury Xs with EK water blocks... this was my 4K score in Superposition:



This is my score with 2 1080 Tis, with still more room to overclock:



There is no going back, until PCIe 4.0 or whatever it will be.


----------



## sliq

So I bought an EVGA GTX 1080 Ti SC2 Hybrid a few weeks ago. My first one seemed great: it boosted up to 1987-1999MHz out of the box, and I didn't even try OC'ing. However, whenever the power/mem-side fan ran above 30%, there would be a clicking/grinding noise coming from it. Suffice it to say, I RMA'd.

The replacement has no clicking/grinding noise and even has a high memory OC; I got it to +774 (12,528MHz) stable with no artifacts. However, I barely see a 1fps difference vs. stock with that mem OC.

Also, there are 2 main issues:
1) It has slight coil whine, which I can hear if my system is idle and I don't have headphones on, and a loud pump (compared to the previous one and my Kraken X62, which has no audible pump sound).
2) This card does not reach high core frequencies at all. Boost 3.0 out of the box brings my core clock to 1911-1924MHz. My max stable overclock was 1974MHz, and that is even with +20% power and +100% voltage in EVGA Precision XOC (1.093V); I prefer not to touch the voltage.

So, should I try again or settle for this? I know for a fact some people were able to get their cards' core clocks to 2050-2100MHz stable at stock voltage. And it's not like my card is overheating or anything; temps across all sensors barely touch 50C, usually 47 on the GPU and 52 on the power/mem.


----------



## webhito

I also had a Fury X, and got a few 480s as well. I would love to have an all-AMD system, but they are lacking in the GPU department; that, and the FreeSync monitor I had was a pain... G-Sync has proved to be flawless: no more random disconnects, syncing issues, nada. I might have had bad luck with my screen, but for now I am stuck with the green team until I go 4K.


----------



## Blameless

Quote:


> Originally Posted by *8051*
> 
> But the 1080Ti wasn't supposed to sell for $850, it was supposed to sell for $700.


3rd party 1080 Ti's going for $750 are a pretty good deal for gaming, especially relative to the street prices of any remotely competitive products.
Quote:


> Originally Posted by *8051*
> 
> Was the 8800 Ultra that much better than anything ATI had back then?


The 8800 GTX and Ultra were nearly twice as fast as the X1950 XTX. The 2900 XT that came out after them didn't improve the situation... people compare Vega to the 2900 XT, but they are mistaken: Vega would need to be slower than Fury X at its current power levels to be comparable. Yes, the 2900 XT often lost to the X1950 XT.

ATI/AMD wouldn't have a part competitive at the very high-end again until the HD 5000 series.

The 8800 Ultra was still a rip-off vs. the 8800 GTX, of course. As Zealot mentions, it was the Titan of its day.
Quote:


> Originally Posted by *Maximization*
> 
> I came from 2 fury x with ek water blocks..


Quote:


> Originally Posted by *Maximization*
> 
> this is my score with 2 1080ti


Neither SLI nor CFX seem to be working there; those are the same scores you'd get with a single card.
Quote:


> Originally Posted by *sliq*
> 
> The replacement I got has no clicking/grinding noise, and even has a high memory OC. I got it to +774 (12,528MHz) stable with no artifacts. However, I barely see a 1fps difference vs. stock with that mem OC.


What tests are showing such a small improvement?

Most 4k tests with higher texture quality settings should be bandwidth limited on a 1080 Ti. How about Unigine Valley or Superposition?
Quote:


> Originally Posted by *sliq*
> 
> Also, there are 2 main issues:
> 1) Has slight whine coil which I can hear if my system is idle and I don't have headphones on, and a loud pump (compared to the previous one and my kraken x62 which has no audible pump sounds)
> 2) This card does not get high core frequencies at all.. Boost 3.0 out of the box brings my core clock to 1911-1924MHZ.. My MAX stable overclock was 1974. And that is even with +20% Power and +100% voltage on EVGA XOC (1.093) I prefer not to touch the voltage.
> 
> So, should I try again or settle with this? I know for a fact some people were able to get their cards core clock from 2050-2100mhz stable at stock voltage. And it's not like my card is overheating or anything. Temps across all sensors barely touch 50 C. Usually 47 on the gpu and 52 on the power/mem.


I'd be inclined to try again... there isn't much point in paying extra for a hybrid model if it's going to be louder and slower than half the purely air-cooled samples out there.


----------



## alton brown

Hi everyone! Hope everyone is doing well! My first time in this thread. I'm finally upgrading my GPU and need some opinions; it's a toss-up between the EVGA Hydro Copper 1080 Ti and the Gigabyte Aorus Xtreme Waterforce. Anyone have any input? I know both cards are very close in specs and construction. I've never used the Gigabyte brand before; I mainly use Asus and EVGA.


----------



## MaKeN

I had the same debate in my head, with the ROG Poseidon also on the list.
I most likely would have gone with the Poseidon, but it wasn't available back then (sold out).

So I went with the Waterforce... I'm really happy with the card: it handles OC well, temps are really good, and the VRMs are also good. It does look nice in person with the lights it has.
Well, all in all I guess the EVGA HC, Poseidon, and Waterforce are all good. At this point decide yourself what to buy; you can't go wrong with any of them. The Waterforce offers 4 years of warranty though, dunno about the rest.


----------



## Dry Bonez

Welcome, my friend. I would say go with the Gigabyte Waterforce. I just got this card about 4 days ago and I don't regret it, I tell ya, considering it pretty much beats out the competition in pretty much every benchmark. The only thing where, to my knowledge, the EVGA is better is aesthetics.


----------



## alton brown

Quote:


> Originally Posted by *MaKeN*
> 
> I had the same debate in my had , also rog poseidon to that list.
> I was most likely would go with poseidon, but it wasn't available back then( sold out).
> 
> So i went with waterforce... i really happy with the card, it handels OC well, temps are really good on it, vrms are also good. It does look nice in person with the lights it has.
> Well all in all i guess evga HC/poseidon/waterforce are all good . At this point decide yourself what to buy, you cant go wrong with any of them. Waterforce offers 4 years of warranty though, donno about rest.


I saw the 4 year warranty, that's good. Thanks for the info. I'm leaning towards the Gigabyte as well.


----------



## alton brown

Quote:


> Originally Posted by *Dry Bonez*
> 
> Welcome my friend. i would say go with the Gigabyte waterforce. i recently just got this card about 4 days ago and i dont regret it i tell ya. Considering it pretty much beats out the competition in terms of performance in pretty much every benchmark. The only thing to my knowledge where the Evga is better is aesthetics.


Thanks! What temps are you guys getting with this card?


----------



## alton brown

Can a couple of you guys let me know what products you're using in your custom loops with the GB Waterforce? Just so I can get an idea. I bought the EK Xtreme 240 kit, plus an additional EK slim 360 rad and a bunch of other goodies, and all Corsair HP SP fans.


----------



## MaKeN

I run it in the same loop as my 7700K, with one 120mm rad plus a 280mm rad; maximum temps are 47C, and I haven't seen more than that even on a really warm day.
But I'm rebuilding my PC today; I'll add a 360 rad with push/pull. Very interested to see how the temps change after that.
1080 Tis really love low temps.


----------



## Dry Bonez

Quote:


> Originally Posted by *alton brown*
> 
> Thanks! What temps are you guys getting with this card?


I've never seen my temps go above 53C, and that's at 4K resolution...

On a side note, I am scared for my life as I prepare for Hurricane Irma. I will be packing my PC parts in case I need to evacuate from Orlando, FL. Please keep us in your prayers, OCnet.


----------



## MaKeN

Damn, brother. I'm up in NYC, no warning for us, but I really feel sorry for you guys there in Florida. I heard it's a mandatory evacuation there, right?
Does your house happen to be uphill somewhere?


----------



## Sgang

Quote:


> Originally Posted by *TheRedViper*
> 
> My previous monitor was the xb271hu running on 1070 SLI, which I changed for x34 still on 1070 SLI, but im currently building a new pc from sratch with a 1080ti and eventually more. I had no issues at all except on wildlands but thats just ubisoft. I think gsync makes the difference here.


That was my doubt. I think UW with no G-Sync is not an option for gamers...


----------



## Dry Bonez

Quote:


> Originally Posted by *MaKeN*
> 
> Damn brother, im up in NYC no warrning for us, but i really feel sorry for you guys there in Florida, i heard its a mandatory evacuation there right?
> Is your house happens to be uphill somewhere?


Yes, there is definitely a mandatory evacuation, considering this storm is bigger than the state of Ohio and will cover the entire state of Florida. I am planning on driving to NYC since we have family there. All I care about is my family and my PC, tbh.


----------



## alton brown

Quote:


> Originally Posted by *Dry Bonez*
> 
> I never seen my temps go above 53c and thats at 4k resolution....
> 
> on a side note,i am scared for my life as i prepare for hurricane irma. I will be packing my pc parts in case i need to evacuate from orlando fl. Please keep us in prayers OCnet


Good luck to you, friend, and to all of you guys in the path! I'm in Rhode Island; I'm praying it doesn't swing up the east coast at full force.


----------



## alton brown

Thanks for the input fellas! Greatly appreciated!


----------



## TheRedViper

Quote:


> Originally Posted by *Sgang*
> 
> Was my doubt. I think UW with no Gsync it's not an option for gamers...


Well, it can work fine in stable games like BF1, Overwatch, etc. I just think G-Sync saves the day in games with massive fps-drop potential, such as The Witcher 3.


----------



## lilchronic

Quote:


> Originally Posted by *Dry Bonez*
> 
> I never seen my temps go above 53c and thats at 4k resolution....
> 
> on a side note,i am scared for my life as i prepare for hurricane irma. I will be packing my pc parts in case i need to evacuate from orlando fl. Please keep us in prayers OCnet


Quote:


> Originally Posted by *MaKeN*
> 
> Damn brother, im up in NYC no warrning for us, but i really feel sorry for you guys there in Florida, i heard its a mandatory evacuation there right?
> Is your house happens to be uphill somewhere?


Yeah, it's scary. Last year Hurricane Hermine (Cat 1) sent a tree through my house, and that was with 60MPH winds. Irma is a Cat 5, 185MPH, perfect-looking hurricane.

I have been through a lot of hurricanes here in Florida. If you guys remember Katrina, which went to New Orleans, it first went over Florida, and I was right in the eye of it in Boca Raton, FL.


----------



## MaKeN

Yeah, this thing is at the same power as Katrina at Cat 4, right?
I can't imagine those islands that got a direct hit by the eye at Cat 5.


----------



## alton brown

Quote:


> Originally Posted by *lilchronic*
> 
> Yeah it's scary. Last year we had hurricane hermine (CAT 1) Send a tree through my house and that was with 60MPH winds. Irma is a CAT 5 185MPH perfect looking hurricane.
> 
> 
> I have been through a lot of hurricanes here in Florida. If you guy's remember katrina that went to New Orleans but first went over florida and i was right in the eye of that hurricane in Boca Raton FL


Yeah, I got hit by the center once when I was younger: Hurricane Gloria in 1985. There is zero that can be done when these massive storms hit. I live on the water and have a "hurricane house", and I still **** my pants every time I see these storms and how destructive they can be.


----------



## lilchronic

Quote:


> Originally Posted by *alton brown*
> 
> Yea, I got hit once with the center when I was younger. 1985 hurricane Gloria. There are zero things that be done when these massive storms hit. I live on the water and have a " Hurricane House", I still **** my pants all the time when I see these storms and see how destructive they can be.


People thought it was over and came outside to start cutting up trees and moving debris. Then about 15 minutes later the winds picked back up within a minute or two. It was crazy, even though it was only a CAT 1 when it hit the east coast of FL.


----------



## Mooncheese

Quote:


> Originally Posted by *Dry Bonez*
> 
> yes, there is definitely a mandatory evacuation considering this storm is bigger than the state of Ohio and will cover the entire state of Florida. I am planning on driving to NYC since we have family there. All I care about is my family and my PC tbh.


This storm looks like no joke at all: 185 mph winds, and it's registering on seismographs. I would pack your PC in waterproof Rubbermaid tubs, and remember that if it gets wet it isn't the end of the world, as long as you thoroughly air dry all the components before applying power.

Seems like we are really coming to terms with 200 years of rapacious industrial activity and countless billions of metric tons of pollution.

https://robertscribbler.com/2017/09/05/this-is-the-climate-pattern-scientists-warned-us-about-wildfires-approach-8-million-acres-in-u-s-during-summer-of-extreme-western-heat-severe-eastern-storms/

Maybe we will have a collective awakening moment but probably not.

I can't believe Trump assigned a vocal, renowned climate change denier, Scott Pruitt, to head up the EPA. I mean really?

We are in COMPLETE denial.


----------



## MaKeN

What, even after salt water you can dry your PC parts and they're fine?

From my experience that won't work with a car's electronics...


----------



## Mooncheese

Quote:


> Originally Posted by *MaKeN*
> 
> What, even after salty water , you can dry your pc parts and its fine?
> 
> From my experience this wont work with the cars electronics...


Soak in rubbing alcohol, then air dry, it's worth a shot.


----------



## lilchronic

Quote:


> Originally Posted by *Mooncheese*
> 
> This storm looks like it's no joke at all, 185 mph winds and it's registering on seismographs. I would pack your pc in waterproof Rubbermaid tubs and remember if it gets wet it isn't the end of the world, if you thoroughly air dry all the components before applying power.
> 
> Seems like we are really coming to terms with 200 years of rapacious industrial activity and countless billions of metric tons of pollution.
> 
> https://robertscribbler.com/2017/09/05/this-is-the-climate-pattern-scientists-warned-us-about-wildfires-approach-8-million-acres-in-u-s-during-summer-of-extreme-western-heat-severe-eastern-storms/
> 
> Maybe we will have a collective awakening moment but probably not.
> 
> I can't believe Trump assigned a vocal, renowned climate change denier, Scott Pruitt, to head up the EPA. I mean really?
> 
> We are in COMPLETE denial.


185MPH sustained winds with gusts up to 225MPH; it's like a tornado the size of Ohio. It's taking roofs off and lifting some houses off their foundations.


----------



## ZealotKi11er

Now BF1 is crashing for me. Man, this 1080 Ti has been the worst experience ever.


----------



## MaKeN

What brand Ti is that?
Btw, sorry for not quoting; for some reason the quote button won't work on my iPhone on this forum.


----------



## ZealotKi11er

Quote:


> Originally Posted by *MaKeN*
> 
> What brand TI is that?
> Btw, sory for not quoting, for some reason on my iphone quote wont work on this forum


EVGA FE.


----------



## MaKeN

Heh, I heard only good stuff about it....


----------



## Streetdragon

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Now BF1 is crashing for me. Man this 1080 Ti has been the worse experience ever.


What kind of crash? Black screen and then back to desktop? Driver crash? Maybe too high a clock on the VRAM?


----------



## stangflyer

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Now BF1 is crashing for me. Man this 1080 Ti has been the worse experience ever.


Stock clocks??


----------



## ZealotKi11er

Quote:


> Originally Posted by *Streetdragon*
> 
> what kind of crash? Blackscreen and than back to desktop? Driver crash? Maybe to high clock on the vram?


Quote:


> Originally Posted by *stangflyer*
> 
> Stock clocks??


bf1.exe has stopped working. Game closes and back to desktop.

With and without OC.


----------



## TheBoom

Quote:


> Originally Posted by *ZealotKi11er*
> 
> bf1.exe has stopped working. Game closes and back to desktop.
> 
> With and without OC.


Already disabled the Origin in-game overlay?


----------



## ZealotKi11er

Quote:


> Originally Posted by *TheBoom*
> 
> Already disabled Origin in game overlay?


Yeah. I turned off the RT OSD, GeForce Experience overlay, Origin overlay, and Xbox recording. I had month-old drivers; with those I was not even able to start a match. With the new driver I can play 30s-2min and then crash.

Code:

Faulting application name: bf1.exe, version: 1.0.50.62815, time stamp: 0x59a0d13f
Faulting module name: bf1.exe, version: 1.0.50.62815, time stamp: 0x59a0d13f
Exception code: 0xc0000005
Fault offset: 0x00000000041b7794
Faulting process id: 0x18e8
Faulting application start time: 0x01d327f4b51a882d
Faulting application path: C:\Program Files (x86)\Origin Games\Battlefield 1\bf1.exe
Faulting module path: C:\Program Files (x86)\Origin Games\Battlefield 1\bf1.exe
Report Id: 56e1e859-3dcc-4214-b288-08f253c06c81
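
For what it's worth, exception code 0xc0000005 is a generic access violation (STATUS_ACCESS_VIOLATION), and the faulting module being bf1.exe itself rather than a driver DLL is consistent with the crash not being a plain display-driver failure. As a side note, log entries like the one above can be pulled apart with a throwaway script; this is just an illustrative parser of the pasted text, not anything BF1- or NVIDIA-specific, and it does not query the real event log:

```python
# Minimal parser for "Key: value" lines from a Windows Event Viewer crash entry.
# The log text below is copied from the post above.

LOG = """\
Faulting application name: bf1.exe, version: 1.0.50.62815, time stamp: 0x59a0d13f
Faulting module name: bf1.exe, version: 1.0.50.62815, time stamp: 0x59a0d13f
Exception code: 0xc0000005
Fault offset: 0x00000000041b7794
Faulting process id: 0x18e8
"""

def parse_event(text):
    """Split each 'Key: value' line into a dict entry (first ': ' wins)."""
    fields = {}
    for line in text.splitlines():
        key, sep, value = line.partition(": ")
        if sep:
            fields[key.strip()] = value.strip()
    return fields

fields = parse_event(LOG)
print(fields["Exception code"])  # -> 0xc0000005 (STATUS_ACCESS_VIOLATION)
```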


----------



## DADDYDC650

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Evga FE.


Have you tried repairing the game files?


----------



## ZealotKi11er

Quote:


> Originally Posted by *DADDYDC650*
> 
> Have you tried repairing the game files?


That is the first thing I did. I also tried DX12, borderless window, and different game settings.


----------



## Thoth420

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah. I turned off RT OSD, Geforce Experince Overlay, Origin Overlay, Xbox Recording. Had one month older drivers. With those I was not even able to start a match. With new driver I can play 30s-2mins and crash.
> 
> Code:
> 
> Faulting application name: bf1.exe, version: 1.0.50.62815, time stamp: 0x59a0d13f
> Faulting module name: bf1.exe, version: 1.0.50.62815, time stamp: 0x59a0d13f
> Exception code: 0xc0000005
> Fault offset: 0x00000000041b7794
> Faulting process id: 0x18e8
> Faulting application start time: 0x01d327f4b51a882d
> Faulting application path: C:\Program Files (x86)\Origin Games\Battlefield 1\bf1.exe
> Faulting module path: C:\Program Files (x86)\Origin Games\Battlefield 1\bf1.exe
> Report Id: 56e1e859-3dcc-4214-b288-08f253c06c81


CPU still overclocked at 4.6? I had to change my CPU OC for BF4 when I went from the 780 Ti to the Fury X. No clue why....


----------



## ZealotKi11er

Quote:


> Originally Posted by *Thoth420*
> 
> CPU still overclocked at 4.6? I had to change my CPU OC with my Fury X from the previous 780Ti configuration for BF4. No clue why....


I played an hour of BF4 with no problem. This OC has been stable for 5 years, and it was not at my CPU's limit; the CPU does 4.7GHz. Even my memory runs slower than rated, 2133 instead of 2400. BF1 ran fine with the 1080 and the 1080 Ti a while back, and I even tested with the Fury X in this system. My problem is either a game or driver change that causes some kind of hang and crash. I can feel the game slow down and stall in the last 1-2s before the crash.


----------



## neurotix

Quote:


> Originally Posted by *Maximization*
> 
> I came from 2 fury x with ek water blocks...this was my 4k score in superposition
> 
> 
> 
> this is my score with 2 1080ti, still more room to overclock too
> 
> 
> 
> there is no going back, until pcie 4 or whatever it will be


Sorry been very busy.

Even at stock on the stock BIOS (no OC) I get 45 more fps in Unigine Valley (4x AA, as per Valley thread rules) than my Furys at 5760x1080 surround: 80 for the Furys (unstably pushed to 1125MHz, the max for my cards) in Valley Eyefinity, and 125 for the 1080 Tis in Surround at stock. Yeah....


----------



## Leethal

What's a good starting point for an overclock?

120% Power
+100Mhz Core
+300Mhz Memory

???


----------



## KedarWolf

Quote:


> Originally Posted by *Leethal*
> 
> Whats a good starting point for a overclock?
> 
> 120% Power
> +100Mhz Core
> +300Mhz Memory
> 
> ???


Try between 2000-2025, maybe even 2038 on a great card, at 1.000v in Afterburner, with +500 memory. That's a good start.

At 1.000v you'll have little or no power limiting.


----------



## Leethal

Quote:


> Originally Posted by *KedarWolf*
> 
> Try between 2000-2025, maybe even 2038 on a great card at 1.000v in Afterburner and +500 memory, a good start
> 
> At 1.000v you'll have little or no power limiting.


So in Afterburner I just set the voltage to 1.000v and increase the core +150MHz and memory +500MHz?

I've tried this without the voltage change and it would crash. I was also using one PCIe cable with dual 6+2-pin connectors; I read online this can be an issue, so I ordered two individual PCIe cables.


----------



## KedarWolf

Quote:


> Originally Posted by *Leethal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Try between 2000-2025, maybe even 2038 on a great card at 1.000v in Afterburner and +500 memory, a good start
> 
> At 1.000v you'll have little or no power limiting.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> so in after burner i just set voltage to 1.000v and increase core +150mhz and memory +500Mhz?
> 
> ive tried this without voltage and it would crash, i was also using one PCIe with dual 6+2pin connectors. Read online this can be an issue. I ordered two individual PCIe cables.
Click to expand...

No: press Ctrl+F and set a custom voltage curve, sloping gradually down to the left of 1.000v and flat to the right of it. Start at 2000 with +500 memory; if it's stable, go up one step at a time.

Actually, leave your memory at stock and do the core first. Start at, say, 2000 or one or two bins lower, then raise it one bin at a time to get the highest core you can at 1.000v. Then try upping your memory +100 at a time until you get crashes or artifacts, and lower it 30 at a time until it's fully stable.

You can test stability by running the Heaven benchmark.
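
The search described above (core up one bin at a time until it fails, then memory +100 until it fails and back down 30 at a time) can be sketched as a pair of loops. `is_stable` here is a hypothetical stand-in for an actual Heaven or gaming run, and the 13MHz bin size follows Pascal's usual clock granularity; treat the whole thing as an illustration of the procedure, not a tool:

```python
PASCAL_BIN_MHZ = 13  # Pascal adjusts the core clock in ~13 MHz bins

def find_max_core(start_mhz, is_stable, bin_mhz=PASCAL_BIN_MHZ):
    """Raise the core one bin at a time until instability, then back off one bin.
    Assumes start_mhz itself is stable."""
    clock = start_mhz
    while is_stable(clock):
        clock += bin_mhz
    return clock - bin_mhz

def find_max_mem(start_offset, is_stable, up=100, down=30):
    """Raise the memory offset in +100 steps until it fails,
    then lower it in -30 steps until it's stable again."""
    offset = start_offset
    while is_stable(offset):
        offset += up
    while not is_stable(offset):
        offset -= down
    return offset

# Toy stand-in: pretend the card is stable up to 2037 MHz core / +580 memory.
core = find_max_core(2000, lambda mhz: mhz <= 2037)
mem = find_max_mem(0, lambda off: off <= 580)
print(core, mem)  # -> 2026 570
```

In practice each `is_stable` check is a manual benchmark or gaming session, so the loops are run by hand, but the stopping logic is the same.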


----------



## Leethal

Quote:


> Originally Posted by *KedarWolf*
> 
> No, CTRL F and set a custom voltage curve on a gradual curve down from 1.000v and a straight line to the right of 1.000v start at 2000, +500 memory, if it's stable go up one step at a time.
> 
> Actually, leave your memory at stock, do core first, start at say 2000 or one or two bins lower, then raise it one bin at a time get the highest core you can at 1.000v, then try upping your memory +100 at a time until you get crashes or artifacts, then lower it say 30 at a time until it's fully stable.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You can test stability by running the Heaven benchmark.


Sorry, but this is a bit confusing to me; I've never messed with a voltage curve like this.

I will have to look it up before I do it.


----------



## Blameless

Do note that the voltage curve is temperature dependent, so it can show a different offset depending on the temperature of the GPU when you set it vs. when you look at it under a different load.

Also, quite a few GP102s aren't going to be stable at 2000+ with 1v, and just as many will have issues with 12Gbps memory. Both are pretty temperature dependent, so picking a noise level you can tolerate, then figuring out how hot the card will get at that fan speed can help you find what temperature to focus testing around.
Quote:


> Originally Posted by *MaKeN*
> 
> What, even after salty water , you can dry your pc parts and its fine?


If it's not powered up, water won't do anything to it, unless it stays in contact long enough to corrode ferrous parts.

Out of the stuff you find in a typical PC, only mechanical drives are likely to be damaged by submersion alone.
Quote:


> Originally Posted by *Mooncheese*
> 
> Soak in rubbing alcohol, then air dry, it's worth a shot.


Strong solvents are usually a bad idea for submerging parts that have TIM on them.

A rinse with distilled water then drying them swiftly is usually the best bet.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> bf1.exe has stopped working. Game closes and back to desktop.
> 
> With and without OC.


I would revert your CPU and memory to stock settings, then test again to rule out any issues there. I discovered the card and its drivers loaded my CPU's PCI-E controller/uncore in ways my prior hardware didn't, and I had to adjust my input voltage and VCCSA on my BW-E setup.


----------



## Thoth420

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I played 1 hours of BF4 with no problem. Have this OC stable for 5 years and was not my CPU limit. CPU does 4.7GHz. Even my memory runs less 2133 instead of 2400. BF1 ran fine with 1080 and 1080 Ti a while back. Even tested with Fury X in this system. My problem is either a game or driver change that is causes some kind of hag and crashing. I can feel the game slows downs and stops in the last 1-2s before the crash.


Is that the only game that is crashing in your library? Have you tried other Battlefield titles?


----------



## KedarWolf

Quote:


> Originally Posted by *Leethal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> No, CTRL F and set a custom voltage curve on a gradual curve down from 1.000v and a straight line to the right of 1.000v start at 2000, +500 memory, if it's stable go up one step at a time.
> 
> Actually, leave your memory at stock, do core first, start at say 2000 or one or two bins lower, then raise it one bin at a time get the highest core you can at 1.000v, then try upping your memory +100 at a time until you get crashes or artifacts, then lower it say 30 at a time until it's fully stable.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You can test stability by running the Heaven benchmark.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sorry but this is abit confusing to me, never messed with a voltage curve like this.
> 
> i will have to look it up before i do it.
Click to expand...

See the video here.

http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20


----------



## Leethal

Quote:


> Originally Posted by *KedarWolf*
> 
> See the video here.
> 
> http://www.overclock.net/t/1627037/best-method-to-overclock-a-1080-ti-under-water-no-shunt-mod-good-under-air-too-lower-temps/0_20


You're the man!!


----------



## ZealotKi11er

Quote:


> Originally Posted by *Blameless*
> 
> Do note that the voltage curve is temperature dependent, so can read a different offset depending on the temperature of the GPU when you set it vs. when you look at it under a different load.
> 
> Also, quite a few GP102s aren't going to be stable at 2000+ with 1v, and just as many will have issues with 12Gbps memory. Both are pretty temperature dependent, so picking a noise level you can tolerate, then figuring out how hot the card will get at that fan speed can help you find what temperature to focus testing around.
> If it's not powered up, water won't do anything to it, unless it stays in contact long enough to corrode ferrous parts.
> 
> Out of the stuff you find in a typical PC, only mechanical drives are likely to be damaged by submersion alone.
> Strong solvents are usually a bad idea for submerging parts that have TIM on them.
> 
> A rinse with distilled water then drying them swiftly is usually the best bet.
> I would revert your CPU and memory to stock settings then test again to rule out any issues there. I discovered the card and it's drivers loaded my CPU's PCI-E controller/uncore in ways my prior hardware didn't and had to adjust my input voltage and VCCSA on my BW-E setup.


I put the CPU and RAM to stock and got no crash. Why is this happening now? Do you think all the problems I've had since I got the 1080 and 1080 Ti have been CPU related?


----------



## Ultracarpet

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Put the CPU and RAM to stock and no crash. Why is this happening now. You think all this time since I got 1080 and 1080 Ti and all the problems I have been have been CPU related?


I remember when I was playing Black Ops 3 a lot, I randomly started getting crashes out of nowhere... literally no other game or application was affected. I even stability tested my OC overnight: no errors or crashes. Reset the BIOS to defaults and voilà, no more crashes. Some games just poke at OCs differently, I guess.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Ultracarpet*
> 
> I remember when i was playing blackops3 lots, i randomly started getting crashes out of no where... literally no other game or application was affected. Even stability tested my OC overnight... no errors or crashes. Reset BIOS to defaults and voilà, no more crashes. Some games just poke at OC's differently i guess.


I have had AMD GPUs since I got this CPU and never had problems. Got the 1080 and was getting problems with SW:BF, and I blamed the GPU OC, but it would happen at stock clocks too. Got the 1080 Ti but didn't game much, and now this happens. Before this, Wolfenstein: The New Order would also crash to desktop but was completely fine with the Fury X. Maybe changing the MB from Gigabyte to ASRock requires more tuning. Maybe it's a sign I need to just get the new 8700K and call it a day.


----------



## KedarWolf

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Blameless*
> 
> Do note that the voltage curve is temperature dependent, so can read a different offset depending on the temperature of the GPU when you set it vs. when you look at it under a different load.
> 
> Also, quite a few GP102s aren't going to be stable at 2000+ with 1v, and just as many will have issues with 12Gbps memory. Both are pretty temperature dependent, so picking a noise level you can tolerate, then figuring out how hot the card will get at that fan speed can help you find what temperature to focus testing around.
> If it's not powered up, water won't do anything to it, unless it stays in contact long enough to corrode ferrous parts.
> 
> Out of the stuff you find in a typical PC, only mechanical drives are likely to be damaged by submersion alone.
> Strong solvents are usually a bad idea for submerging parts that have TIM on them.
> 
> A rinse with distilled water then drying them swiftly is usually the best bet.
> I would revert your CPU and memory to stock settings then test again to rule out any issues there. I discovered the card and it's drivers loaded my CPU's PCI-E controller/uncore in ways my prior hardware didn't and had to adjust my input voltage and VCCSA on my BW-E setup.
> 
> 
> 
> Put the CPU and RAM to stock and no crash. Why is this happening now. You think all this time since I got 1080 and 1080 Ti and all the problems I have been have been CPU related?
Click to expand...

Time to adjust your OC settings to get it game stable. CPUs can degrade a bit over time and may need a bit more voltage (core, CPU input, RAM, etc.) to be completely stable. Test with RealBench 2.54 for the CPU, the AIDA64 Extreme cache-only stress test for cache, and HCI Memtest or GSAT in Linux for memory.

GSAT in Linux is the easiest, fastest way to test memory. Takes an hour.

HCI Memtest can take four hours or more.

If you're interested then let me know, I'll PM you how to make a Linux USB with GSAT to boot from and run, no need to actually install Linux on your hard drive.









Edit: Have HWiNFO (sensors only) running while you stress test to watch temps; they can get insanely high when doing so. You want to keep your CPU and CPU package under 80C for sure on i7s etc.

GSAT temps will be fine.


----------



## ZealotKi11er

Quote:


> Originally Posted by *KedarWolf*
> 
> Time to adjust your OC settings to get it game stable. CPUs can degrade a bit over time and may need a bit more voltages, core, CPU Input, RAM etc. to be completely stable. Test with RealBench 2.54 for CPU, AIDA64 Extreme cache only stress test for cache, HCI Memtest or GSAT in Linux for memory.
> 
> GSAT in Linux is the easiest, fastest way to test memory. Takes an hour.
> 
> HCI Memtest can take four hours or more.
> 
> If you're interested then let me know, I'll PM you how to make a Linux USB with GSAT to boot from and run, no need to actually install Linux on your hard drive.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Have HWInfo sensors only running while you stress test to watch temps, that can get insanely high when doing so. You want to keep your CPU and CPU package under 80C for sure for i7s etc.
> 
> GSAT temps will be fine.


Technically the memory is running underclocked. It's a 2400MHz kit at 2133MHz; 2400MHz does not POST on this MB.


----------



## Thoth420

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Put the CPU and RAM to stock and no crash. Why is this happening now. You think all this time since I got 1080 and 1080 Ti and all the problems I have been have been CPU related?


I called it.

The nub called it. Glad you had some progress.
I can't answer the why, sadly, other than perhaps degradation and bad timing.


----------



## Mooncheese

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Put the CPU and RAM to stock and no crash. Why is this happening now. You think all this time since I got 1080 and 1080 Ti and all the problems I have been have been CPU related?


I'm caught up on your posts now. When I read your initial statement about crashing in BF1 I assumed it was a black-screen-related crash, but if it's a BSOD, unless the report implicates the display driver, it's likely your CPU or memory.

I use a program called BlueScreenView to try to figure out what the culprit is.

If it's a black screen that necessitates a hard shutdown, that is the display driver failing, usually because of an overzealous OC.

If you do get black screens, what can alleviate them most of the time is simply setting the game to "Prefer Maximum Performance" in NVCP, assuming you don't have some insanely overzealous OC. If the OC is right on the edge, like 13MHz away from being unstable, setting the game to "Prefer Max" will help a lot.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Mooncheese*
> 
> I'm caught up with your posts, when I read your initial statement about crashing in BF1 I assumed it was a black-screen related crash but if its a BSOD, unless the report implicates the display driver it's likely your CPU or memory.
> 
> I use a program called Bluescreeen View to try and figure out what the culprit is. .
> 
> If it's a black-screen that necessitates a hard shut-down, that is the display driver failing, usually because of an overzealous OC.
> 
> If you do get black-screens, what can alleviate them most of the time is simply setting the game to "Prefer Max Performance" in NVCP, assuming you don't have some insanely overzealous OC. If it's an OC that is right on the edge, like 13 MHz away from being unstable, setting the game to "Prefer Max" will help a lot.


Just the game crashing to desktop, nothing else. No BSOD or black screen.

Edit: Got it stable. Was using +165, now +180. Probably just the ASRock MB having a lot more voltage fluctuation.


----------



## Blameless

Been testing the VF vs. temperature curves on my 1080 Ti.

It appears that temperature offsets the core's VF slope by one clock step (12.5MHz) at the following temperature ranges:

?? - 43C
43C - 55C
55C - 68C
68C - 73C
73C - 78C
78C - ??

Those are rough figures as there are load duration and hysteresis factors I haven't figured out yet.

Anyway, if you set the voltage curve in MSI AB while the card is within one of these ranges, then look at the curve again when the card is hotter or cooler enough to put it in a different range, your slope will appear different. So, set your curve for the temps you expect to see or take into account the clock step differentials while doing so.
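
As an illustration of the effect (using the rough thresholds above as assumptions, and ignoring the load-duration and hysteresis factors), the clock you actually get can be modeled as the set clock minus one 12.5MHz step per temperature range crossed:

```python
import bisect

# Rough thresholds from the testing above; real boundaries vary per card
# and have hysteresis, so treat these as placeholders.
TEMP_THRESHOLDS_C = [43, 55, 68, 73, 78]
CLOCK_STEP_MHZ = 12.5

def effective_clock(set_clock_mhz, set_temp_c, run_temp_c):
    """Estimate the clock after temperature-bin offsets, relative to the
    temperature the curve was set at (one 12.5 MHz step per range crossed)."""
    bins_crossed = (bisect.bisect_left(TEMP_THRESHOLDS_C, run_temp_c)
                    - bisect.bisect_left(TEMP_THRESHOLDS_C, set_temp_c))
    return set_clock_mhz - bins_crossed * CLOCK_STEP_MHZ

# Curve set at 40C for 2000 MHz; card runs at 70C under load -> three ranges
# crossed, so roughly 37.5 MHz lower than what the curve shows.
print(effective_clock(2000, 40, 70))  # -> 1962.5
```

The same function also shows the reverse case: set the curve on a hot card and it will read higher once the card cools down.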
Quote:


> Originally Posted by *ZealotKi11er*
> 
> Put the CPU and RAM to stock and no crash. Why is this happening now. You think all this time since I got 1080 and 1080 Ti and all the problems I have been have been CPU related?


The GPU driver accesses the GPU over the PCI-E bus and the PCI-E controller is part of the CPU. The GPU driver also caches data to the system memory, and the IMC is part of the CPU. The CPU also decompresses assets and prepares frames for the GPU drivers.

It's almost impossible to change a major piece of hardware without running the risk of needing to reevaluate the CPU settings, unless you have significant stability margins in all areas, because everything goes through the CPU.


----------



## Ultracarpet

Quote:


> Originally Posted by *Blameless*
> 
> Been testing the VF vs. temperature curves on my 1080 Ti.
> 
> It appears that temperature offsets the core's VF slope by one clock step (12.5MHz) at the following temperature ranges:
> 
> ?? - 43C
> 43C - 55C
> 55C - 68C
> 68C - 73C
> 73C - 78C
> 78C - ??
> 
> Those are rough figures as there are load duration and hysteresis factors I haven't figured out yet.
> 
> Anyway, if you set the voltage curve in MSI AB while the card is within one of these ranges, then look at the curve again when the card is hotter or cooler enough to put it in a different range, your slope will appear different. So, set your curve for the temps you expect to see or take into account the clock step differentials while doing so.


Forgive me, I'm slow, lol. Are you saying to have the card loaded to that temp when applying the voltage/clock curve?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Blameless*
> 
> Been testing the VF vs. temperature curves on my 1080 Ti.
> 
> It appears that temperature offsets the core's VF slope by one clock step (12.5MHz) at the following temperature ranges:
> 
> ?? - 43C
> 43C - 55C
> 55C - 68C
> 68C - 73C
> 73C - 78C
> 78C - ??
> 
> Those are rough figures as there are load duration and hysteresis factors I haven't figured out yet.
> 
> Anyway, if you set the voltage curve in MSI AB while the card is within one of these ranges, then look at the curve again when the card is hotter or cooler enough to put it in a different range, your slope will appear different. So, set your curve for the temps you expect to see or take into account the clock step differentials while doing so.
> The GPU driver accesses the GPU over the PCI-E bus and the PCI-E controller is part of the CPU. The GPU driver also caches data to the system memory, and the IMC is part of the CPU. The CPU also decompresses assets and prepares frames for the GPU drivers.
> 
> It's almost impossible to change a major piece of hardware without running the risk of needing to reevaluate the CPU settings, unless you have significant stability margins in all areas, because everything goes through the CPU.


Could also be that 1080 Ti is a lot more powerful than 290X and Fury X.


----------



## lilchronic

Quote:


> Originally Posted by *Blameless*
> 
> Been testing the VF vs. temperature curves on my 1080 Ti.
> 
> It appears that temperature offsets the core's VF slope by one clock step (12.5MHz) at the following temperature ranges:
> 
> ?? - 43C
> 43C - 55C
> 55C - 68C
> 68C - 73C
> 73C - 78C
> 78C - ??
> 
> Those are rough figures as there are load duration and hysteresis factors I haven't figured out yet.
> 
> Anyway, if you set the voltage curve in MSI AB while the card is within one of these ranges, then look at the curve again when the card is hotter or cooler enough to put it in a different range, your slope will appear different. So, set your curve for the temps you expect to see or take into account the clock step differentials while doing so.
> The GPU driver accesses the GPU over the PCI-E bus and the PCI-E controller is part of the CPU. The GPU driver also caches data to the system memory, and the IMC is part of the CPU. The CPU also decompresses assets and prepares frames for the GPU drivers.
> 
> It's almost impossible to change a major piece of hardware without running the risk of needing to reevaluate the CPU settings, unless you have significant stability margins in all areas, because everything goes through the CPU.


My card throttles @ 37c - 38c.


----------



## Mooncheese

Quote:


> Originally Posted by *Blameless*
> 
> Been testing the VF vs. temperature curves on my 1080 Ti.
> 
> It appears that temperature offsets the core's VF slope by one clock step (12.5MHz) at the following temperature ranges:
> 
> ?? - 43C
> 43C - 55C
> 55C - 68C
> 68C - 73C
> 73C - 78C
> 78C - ??
> 
> Those are rough figures as there are load duration and hysteresis factors I haven't figured out yet.
> 
> Anyway, if you set the voltage curve in MSI AB while the card is within one of these ranges, then look at the curve again when the card is hotter or cooler enough to put it in a different range, your slope will appear different. So, set your curve for the temps you expect to see or take into account the clock step differentials while doing so.
> The GPU driver accesses the GPU over the PCI-E bus and the PCI-E controller is part of the CPU. The GPU driver also caches data to the system memory, and the IMC is part of the CPU. The CPU also decompresses assets and prepares frames for the GPU drivers.
> 
> It's almost impossible to change a major piece of hardware without running the risk of needing to reevaluate the CPU settings, unless you have significant stability margins in all areas, because everything goes through the CPU.


Great info, you know your stuff!

As far as thermal-related throttling goes, my experience with my FE has been that I lose 13MHz at 39, 49, 59 and 69C. Keeping the temp under 49C is good for 39MHz.


----------



## KedarWolf

Quote:


> Originally Posted by *Mooncheese*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Blameless*
> 
> Been testing the VF vs. temperature curves on my 1080 Ti.
> 
> It appears that temperature offsets the core's VF slope by one clock step (12.5MHz) at the following temperature ranges:
> 
> ?? - 43C
> 43C - 55C
> 55C - 68C
> 68C - 73C
> 73C - 78C
> 78C - ??
> 
> Those are rough figures as there are load duration and hysteresis factors I haven't figured out yet.
> 
> Anyway, if you set the voltage curve in MSI AB while the card is within one of these ranges, then look at the curve again when the card is hotter or cooler enough to put it in a different range, your slope will appear different. So, set your curve for the temps you expect to see or take into account the clock step differentials while doing so.
> The GPU driver accesses the GPU over the PCI-E bus and the PCI-E controller is part of the CPU. The GPU driver also caches data to the system memory, and the IMC is part of the CPU. The CPU also decompresses assets and prepares frames for the GPU drivers.
> 
> It's almost impossible to change a major piece of hardware without running the risk of needing to reevaluate the CPU settings, unless you have significant stability margins in all areas, because everything goes through the CPU.
> 
> 
> 
> Great info, you know your stuff!
> 
> As far as thermal related throttling my experience with my FE has been that I lose 13MHz at 39, 49, 59 and 69C. Keeping the temp under 49C is good for 39 MHz.
Click to expand...

Unless I use the XOC BIOS I lose a bin at 45C on my Gigabyte FE.


----------



## criminal

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Put the CPU and RAM to stock and no crash. Why is this happening now. You think all this time since I got 1080 and 1080 Ti and all the problems I have been have been CPU related?


It is so strange that a CPU overclock you have been running for years can be affected by upgrading the GPU, but I have run into that issue in the past too.


----------



## ZealotKi11er

Quote:


> Originally Posted by *criminal*
> 
> It is so strange that a CPU overclock you have been running for years can be affected by upgrading the GPU. I have run into that issue in the past.


I think it's this ASRock Z77E-ITX board. It's not as good as the Z77X-UD5H.


----------



## Agent-A01

Quote:


> Originally Posted by *criminal*
> 
> It is so strange that a CPU overclock you have been running for years can be affected by upgrading the GPU. I have run into that issue in the past.


Why is that strange?

With a faster GPU, the CPU and memory subsystem have to work harder.
More stress exposes instability sooner.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Agent-A01*
> 
> Why is that strange?
> 
> With a faster GPU, the CPU and memory subsystem have to work harder.
> More stress exposes instability sooner.


I do not think that is the case. I have had games push the CPU harder. In BF1 the CPU usage is only 50-55% because I play at 4K. When I played with my 290X at 1080p on Low, I was getting 140-160 fps and CPU usage was 90-95%.


----------



## Blameless

Quote:


> Originally Posted by *lilchronic*
> 
> My card throttles @ 37c - 38c.


Sounds about right. I can't test that low on my card, which is air cooled in a 26C room.
Quote:


> Originally Posted by *Ultracarpet*
> 
> Forgive me I'm slow lol, are you saying have the card loaded to that temp when applying the voltage/clock curve?


If I want to make sure the clocks I'm setting reflect what I'm getting at that load/temp, yes, I load the card to that temperature while I edit the curves.


----------



## lilchronic

Quote:


> Originally Posted by *Blameless*
> 
> Sounds about right. I can't test that low on my card which is air cooled and in a 26C room.
> If I want to make sure the clocks I'm setting reflect what I'm getting at that load/temp, yes, I load the card to that temperature while I edit the curves.


There also seems to be a change in the curve at temps below 22C or 23C, but it's hard to really test that with my ambient temps.

Yeah, at 38C, like clockwork, I get a bin drop.

I had to put my fans on the lowest speed possible to get it to 38C.


----------



## SkylineGTR34

Can I join the club too?









Got a 1080 Ti in August and have since put it through benchmarks and games, and hell, what a nice GPU. Some results below.


----------



## 8051

Quote:


> Originally Posted by *lilchronic*
> 
> There also seems to be a change in the curve at temps below 22C or 23C, but it's hard to really test that with my ambient temps.
> 
> Yeah, at 38C, like clockwork, I get a bin drop.
> 
> I had to put my fans on the lowest speed possible to get it to 38C.


So you have to keep your GPU below 101° F to keep it from dropping bins?

That is really restrictive.

I wonder if it'll drop a bin at any frequency? Or if it only drops at certain minimum frequency-voltage bins?


----------



## Blameless

Quote:


> Originally Posted by *8051*
> 
> So you have to keep your GPU below 101° F to keep it from dropping bins?
> 
> That is really restrictive.


There might be more scaling at even cooler temps. Has anyone actually looked at the firmware to see how many values are in there for the temperature steps?

Anyway, I wouldn't call this a restriction so much as free performance if you can keep the card cold. Since I'm using an air cooled card that I want to keep 24/7 stable at modest noise levels, I'm doing all of my testing and clock calibration around the high end of the bin that ends at ~68C (as the memory on my sample becomes unstable at 11880MT/s @ 70C). I can rely on getting 2037MHz core in most protracted load scenarios, but if my card is running colder for some reason I can see 2050, 2062, or even 2075MHz.


----------



## taem

Re benchmarking: how on earth do you guys get such high scores? I thought this card must not be right because my numbers were significantly off, but no, my numbers are in line with published review scores. It's you guys getting high scores, not me getting low scores. So what gives? Are there settings in the control panel I need to adjust?


----------



## ZealotKi11er

Quote:


> Originally Posted by *taem*
> 
> Re benchmarking, how on earth do you guys get such high scores? I thought this card must not be right because my numbers were significantly off, but no, my numbers are in line with posted review scores. It's you guys getting high scores, not me getting low scores. So what gives? Some settings in control panel I need to adjust?


Which scores in particular? Not all 1080 Tis are the same because of Boost 3.0. Some can hold their clocks, while cards like the FE will throttle from temperature and power limits.


----------



## Dasboogieman

Quote:


> Originally Posted by *lilchronic*
> 
> There also seems to be a change in the curve at temps below 22C or 23C, but it's hard to really test that with my ambient temps.
> 
> Yeah, at 38C, like clockwork, I get a bin drop.
> 
> I had to put my fans on the lowest speed possible to get it to 38C.


I get zero clockspeed changes until mine hits around 42C. I suspect there is a degree of ASIC variance at work here.


----------



## ZealotKi11er

Personally, I ignore the clock speed increases and decreases caused by temperature. My suggestion: find a clock speed your card is roughly stable at, load a benchmark, let the card heat up, and note the temperature range. For me the card hits ~48-55C, mostly ~52C. I then curve OC while the card is hot and running. The next time the card loads up to the same temps there are no funny clock speeds and you get the clock you set.


----------



## 2ndLastJedi

Quote:


> Originally Posted by *taem*
> 
> Re benchmarking, how on earth do you guys get such high scores? I thought this card must not be right because my numbers were significantly off, but no, my numbers are in line with posted review scores. It's you guys getting high scores, not me getting low scores. So what gives? Some settings in control panel I need to adjust?


Yeah, set the nVidia control panel like in the OP and you should see an increase in benchmark scores.


----------



## 8051

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Personally, I ignore the clock speed increases and decreases caused by temperature. My suggestion: find a clock speed your card is roughly stable at, load a benchmark, let the card heat up, and note the temperature range. For me the card hits ~48-55C, mostly ~52C. I then curve OC while the card is hot and running. The next time the card loads up to the same temps there are no funny clock speeds and you get the clock you set.


I switched back to my overclocked and VBIOS-modded 980 Ti in order to put some aftermarket cooling on the 1080 Ti. While the framerate isn't as high with the 980 Ti, it also seems to be less variable.


----------



## Blameless

Quote:


> Originally Posted by *Dasboogieman*
> 
> I suspect there is a degree of ASIC variance at work here.


Could be.

My bin shifts vary by a few degrees, so whatever is controlling the transitions seems to be fairly complex.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> Personally, I ignore the clock speed increases and decreases caused by temperature. My suggestion: find a clock speed your card is roughly stable at, load a benchmark, let the card heat up, and note the temperature range. For me the card hits ~48-55C, mostly ~52C. I then curve OC while the card is hot and running. The next time the card loads up to the same temps there are no funny clock speeds and you get the clock you set.


The custom offsets applied to the curve are the same regardless of temperature (i.e. if you set a +101MHz offset at the 1025mV point, that stays no matter the temp); it's just that the base curve moves.
Quote:


> Originally Posted by *taem*
> 
> Re benchmarking, how on earth do you guys get such high scores? I thought this card must not be right because my numbers were significantly off, but no, my numbers are in line with posted review scores. It's you guys getting high scores, not me getting low scores. So what gives? Some settings in control panel I need to adjust?


Most benchmarks are run with performance rather than quality driver settings, but a lot of reviews don't tune clocks at all either.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Blameless*
> 
> Could be.
> 
> My bin shifts vary by a few degrees, so whatever is controlling the transitions seems to be fairly complex.
> The custom offsets applied to the curve are the same regardless of temperature (i.e. if you set a +101MHz offset at the 1025mV point, that stays no matter the temp); it's just that the base curve moves.
> Most benchmarks are run with performance rather than quality driver settings, but a lot of reviews don't tune clocks at all either.


If I want 1860MHz at 0.9V, for example, and set it while the card is cold, it will be lower (1847MHz) once it heats up. If I set it while the card is hot, I get my card's actual load overclock.
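The interaction described here (a fixed user offset riding on a temperature-shifted base curve) can be sketched roughly as follows. The base curve below is a made-up linear stand-in; the real curve lives in the GPU firmware and is only observed through tools like Afterburner:

```python
# Sketch of how a fixed Afterburner curve offset interacts with the
# temperature-shifted base VF curve. The base curve here is hypothetical;
# only the *behavior* (offset constant, base shifting per temp bin) is
# taken from the discussion above.
CLOCK_STEP_MHZ = 13  # one boost bin (~12.5 MHz, as reported by tools)

def base_curve(voltage_mv: float, bins_dropped_by_temp: int) -> float:
    """Hypothetical stock VF point, shifted down one step per temp bin."""
    stock = 1500 + voltage_mv * 0.4  # made-up linear stand-in
    return stock - bins_dropped_by_temp * CLOCK_STEP_MHZ

def applied_clock(voltage_mv: float, user_offset_mhz: float,
                  bins_dropped_by_temp: int) -> float:
    """The user offset stays constant; only the base curve moves."""
    return base_curve(voltage_mv, bins_dropped_by_temp) + user_offset_mhz

cold = applied_clock(900, 101, 0)  # curve set while cold
hot = applied_clock(900, 101, 1)   # same offset after one temp bin drop
print(cold - hot)  # exactly one 13 MHz step; the offset itself is unchanged
```

This is why setting the curve at your expected load temperature, as suggested above, gives you the clock you actually see in games.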


----------



## Thoth420

Glad this happened now, before I migrated it to the new build (waiting on some parts and have a Micro Center road trip planned). The Z170 XPower board died right at the one-year mark... talk about planned obsolescence. I checked the reviews on Newegg and Amazon after it happened and found a bunch of the same issue, one after another after another...

Back to ASRock, or ugh, ASUS if there is something with very good reviews. I can't decide if I should sell the 6700K and go i9, or maybe Threadripper. I haven't even looked into CPUs and boards because that was one of the few things I was carrying over. Looks like it's a full rebuild aside from the PSU now.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Thoth420*
> 
> Glad this happened now before I migrated it to the new build(waiting on some parts and have a Micro Center road trip planned). The Z170 XPower board pulled its die right at the one year mark...talk about planned obsolescence. I checked the reviews on Newegg and Amazon after it happened to find a bunch of the same issue one after another after another...
> 
> Back to AsRock or ugh ASUS if there is something with very good reviews. I can't decide if I should sell the 6700k and go i9 or moreso Threadripper. I haven't even looked into CPUs and board because that was one of the few things I was carrying over. Looks like its a full rebuild aside the PSU now.


Do you need all the cores? What about waiting for the 8700K? Also, I personally recommend Gigabyte over ASRock and ASUS.


----------



## Thoth420

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Do you need all the cores? What about waiting for 8700K? Also i recommend Gigabyte over ASRock and ASUS personally.


I'd like to prepare in case games start to chomp down those extra cores. I don't plan on swapping anything but the GPU from this build in the next three years, or longer if I can help it. I also like to stream, so having more than 4 physical cores would be nice. I am not dead set on one of the higher-end CPUs, but the Z270 and X370 chipsets don't exactly impress me.

I have never had a good experience with Gigabyte, but I can say the same of ASUS for the most part. I have found ASRock boards to be essentially problem-free so far.

I can't wait until October for Coffee Lake, so that won't work. I don't let parts sit around, so I need to decide quickly.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Thoth420*
> 
> I'd like to prepare in case games start to chomp down those extra cores. I don't plan on swapping anything but the GPU from this build in the next three years or longer if I can help it. I also like to do streaming so having more than 4 physical cores would be nice. I am not dead set on one of the higher end CPUs but the Z270 and X370 chipsets don't exactly impress me.
> 
> I have never had good experience with Gigabyte but I can say the same with ASUS for the most part. I found AsRock boards to be essentially problem free so far.
> 
> I can't wait for October for Coffee Lake so that wont work. I don't let parts sit around so I need to decide quickly.


Both the i9 and especially TR will be too slow in IPC by the time games use more than 8 cores. Get a 1700. There is no point to all the PCIe lanes and quad-channel memory.


----------



## Thoth420

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Both i9 and especially TR will be too slow in IPC when games use more than 8 Cores. Get a 1700. There is no point to all the PCIE and Quad Channel.


Thanks for the advice much appreciated.


----------



## RickyOG90

Bought an FTW3 Hydro Copper on August 30th and have finally just started working on overclocking it. So far the GPU is great and I am really enjoying the extra VRAM, up from my 980 Ti Classified.


----------



## ZealotKi11er

Quote:


> Originally Posted by *RickyOG90*
> 
> Bought a ftw3 hydrocopper on august 30th and have finally just started to work on overclocking it. So far the gpu is great and I am really enjoying the extra vram up from my 980 ti classified.


Probably not much to OC, lol. Maybe +100 core and +500 memory, for about a 5% performance uplift.


----------



## Mooncheese

Quote:


> Originally Posted by *KedarWolf*
> 
> Unless I use the XOC BIOS I lose a bin at 45C on my Gigabyte FE.


Weird. Yeah, I lose 13MHz at 39, 49, 59 and 69C with nothing really going on in between.

Actually, I wonder if you're experiencing a bin drop that isn't related to temperature. I experimented with 2000MHz at 0.993V, down from 2025MHz at 1.000V, fully expecting it to throttle less in really demanding scenes in The Witcher 3 in 3D Vision (2560x1440). The opposite actually happened: there was more dipping and bin-dropping with less voltage, which led me to wonder if error correction was somehow a factor (not sure what else to say here; I know this is a phenomenon with memory overclocking). So yeah, it actually performs better at 1.000V vs. 0.993V, even with another 26MHz on the core.

I seem to remember you saying you were at 0.993V.


----------



## RickyOG90

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Probably not much to OC lol. Maybe +100 Core and +500 memory. About 5% performance uplift.


Yeah... I'm noticing that. Working on it now; I tried +100 on the core and it would crash Unigine Heaven within a few seconds. Oh well. Since it's water cooled, at my current +81 on the core and +50 on memory (for some reason it isn't liking me going beyond that...) with +75% on the voltage slider, it is boosting up to 2062, so I'm not too disappointed, haha.


----------



## TheBoom

Quote:


> Originally Posted by *Thoth420*
> 
> Glad this happened now before I migrated it to the new build(waiting on some parts and have a Micro Center road trip planned). The Z170 XPower board pulled its die right at the one year mark...talk about planned obsolescence. I checked the reviews on Newegg and Amazon after it happened to find a bunch of the same issue one after another after another...
> 
> Back to AsRock or ugh ASUS if there is something with very good reviews. I can't decide if I should sell the 6700k and go i9 or moreso Threadripper. I haven't even looked into CPUs and board because that was one of the few things I was carrying over. Looks like its a full rebuild aside the PSU now.


1 year?? Seems like their quality hasn't changed much in 10 years. That's why I specifically avoid MSI. You can't go wrong with ASUS, especially with mobos.

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Both i9 and especially TR will be too slow in IPC when games use more than 8 Cores. Get a 1700. There is no point to all the PCIE and Quad Channel.


Does the i9 actually have weaker IPC than the regular i7s and i5s? I thought they would at least be on par. Unless you're talking about raw clock speeds, of course.


----------



## CrazyElf

Quote:


> Originally Posted by *Blameless*
> 
> 3rd party 1080 Ti's going for $750 are a pretty good deal for gaming, especially relative to the street prices of any remotely competitive products.
> The 8800 GTX and Ultra were nearly twice as fast as the X1950XTX. The 2900XT that came out after them didn't improve the situation...people compare Vega to the 2900XT, but they are mistaken. Vega would need to be slower than Fury X at the same power levels it is now to be comparable. Yes, the 2900XT often lost to the X1950XT.
> 
> ATI/AMD wouldn't have a part competitive at the very high-end again until the HD 5000 series.
> 
> The 8800 Ultra was still a rip-off vs. the 8800 GTX, of course. As Zealot mentions, it was the Titan of its day.
> 
> Neither SLI nor CFX seem to be working there; those are the same scores you'd get with a single card.
> What tests are showing such a small improvement?
> 
> Most 4k tests with higher texture quality settings should be bandwidth limited on a 1080 Ti. How about Unigine Valley or Superposition?
> I'd be inclined to try again...there isn't much point in paying extra for a hybrid model if it's going to be louder and slower than half the purely air cooled samples out there.


You could always try doing the hybrid "mod" on your GPU.

Agree though with the idea that the 1080Ti is a surprisingly good value. If you think about it, going by MSRP:

GTX 1060 6GB is supposed to be $250 USD MSRP for 1280 shaders
GTX 1080Ti 11 GB is supposed to be $700 USD MSRP for 3584 shaders
700 / 250 = 2.8 and 3584 / 1280 = 2.8, so it's not all bad in that regard.
Premium cards usually do not have a linear scale in shaders per dollar. I guess you don't get a linear scale for VRAM, but 11GB should be adequate given the speed of the GPU. You are also getting GDDR5X rather than GDDR5.

Of course, prices are now above MSRP. The cheapest GTX 1060 6GB I could find in Canada is $360 CAD before taxes, about $295 USD at current exchange rates. The cheapest 1080Ti (a blower with a reference PCB) was about $882 CAD before taxes (about $721.39 USD). Ironically, that means shaders per dollar spent is actually better on the 1080Ti.

That's probably because GDDR5X isn't that great for mining, while the GTX 1060 is still decent for mining. I remember reading at one point, people were getting 25 MH/s on that GPU. The 3GB GTX 1060 versions are likely to be cheaper because as we end this Ethereum Epoch ... they don't have the necessary frame buffer to keep going. Of course if Ethereum does go Proof of Stake, this might shake up the GPU market too. Until the next cryptocurrency boom anyways.

The 1060 does seem to OC a bit more - they top out around 2150MHz, while the 1080Ti is limited to roughly 2050MHz. Then again, performance does not scale linearly with clocks - or, for that matter, shaders.
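The MSRP arithmetic above can be double-checked in a couple of lines (MSRP figures only; street prices, as noted, diverge):

```python
# Sanity check of the shaders-per-dollar comparison above, at MSRP.
cards = {
    "GTX 1060 6GB": {"price_usd": 250, "shaders": 1280},
    "GTX 1080 Ti": {"price_usd": 700, "shaders": 3584},
}

for name, c in cards.items():
    ratio = c["shaders"] / c["price_usd"]
    print(f"{name}: {ratio:.2f} shaders per USD")

# Both work out to 5.12 shaders per dollar: the price ratio
# (700 / 250 = 2.8) exactly matches the shader ratio (3584 / 1280 = 2.8),
# which is unusual for a flagship part.
```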



You can't compare the two graphs directly, but it is safe to assume the 1080Ti is about 35% faster than the 1080, at $700 USD.


I'd assume the 1060 is roughly in the same class as the RX 580.

I myself decided on the Lightning X. Quote from myself.
Quote:


> An MSI 1080TI Lightning X in Canada goes for $1026 CAD before taxes (about $839.17 USD)
> The cheapest 1080Ti MSI GTX 1080 TI AERO is $882 CAD before taxes (about $721.39 USD) - I think this is a reference PCB without a power limit
> A Gigabyte 1080Ti Aorus is $980 CAD before taxes (about $801.54 USD)
> The EVGA triple fan version, the FTW3, (( 11G-P4-6696-KR )) is $1000 CAD dead on ($817.90 USD)
> 
> On the high end:
> 
> 
> An MSI 1080TI Lightning Z in Canada goes for $1085 CAD before taxes (about $887.42 USD)
> A Zotac 1080Ti AMP Extreme goes for $1010 CAD (about $826.08 USD)
> A GIGABYTE AORUS Xtreme GeForce GTX 1080 Ti Waterforce (the one with the AIO) is about $1045 CAD (about $855 USD) - the full waterblock version is $1080 CAD (about $883.33 USD)
> The EVGA AIO version is $970 CAD dead on (around $794 USD)


The premium I think was worth it.

From your Aorus 1080Ti posts:
Quote:


> Originally Posted by *Blameless*
> 
> Got my Aorus (non-X) 1080 Ti in the other day and have been playing with it since. Chose this card because of price, availability, and the 375w advertised power limit without needing to flash 3rd party firmware.
> 
> My initial impressions are a bit mixed, especially with regard to the cooler.
> 
> I'm pretty impressed with the noise levels and mine doesn't seem to need to be taken apart or have the TIM replaced to perform well enough (though I did find it prudent to tighten down all the screws to increase mounting pressure). However, the build quality of the actual heatsink itself is pretty mediocre and there are some very questionable design decisions regarding the VRM cooling. Why the FETs would have the largest gap, needing the thickest thermal pads, while other VRM components like the chokes have a much smaller gap, is beyond me...it's backwards. Backplate is also completely worthless from a thermal performance perspective and the silly copper insert behind the GPU is the coolest (and least functional) part of it...purely decorative.
> 
> Flashed the part to F3P almost out of the box...didn't seem to change much, but I like to start with the most recent stable firmware.
> 
> Still dialing in an overclock. Spent the better part of yesterday messing with the dynamic clock/voltage curves in MSI AB, but gave up on them as the offets they were applying weren't always predictable...seemed to vary by two or three bins between reboot and at different loads, so I went back to the basic sliders.
> 
> Only real issue I'm having (beyond the unavoidable NVIDIA Pascal and driver annoyances) is that the card doesn't seem to actually be able to reach it's advertised power limit. If I set 150%/375w, I seem to still run into a power limit cap between 133% and 142% power.
> 
> Anyone else seeing this issue? I suspect it's just reacting to transients, but if others are able to peg power at or near 150%, I'll know something is wrong with my sample.


Nvidia has become super conservative with power draw, so it may very well be that Nvidia has specifically restricted the maximum power limit for tests like FurMark. I think that when the driver detects these programs, it restricts power draw. I would be interested to see if OCCT hits similar restrictions; I would recommend testing it.

I wonder if renaming the programs might bypass the restrictions. This is risky though, as I don't know whether there is a risk of damaging the GPU. I do know that historically, when people renamed FurMark, they could get around the downclock restrictions Nvidia set. It's likely the limit would sit at 150% if you did rename it.

Quote:


> Originally Posted by *Blameless*
> 
> Might be more scaling at even cooler temps. Maybe someone has actually looked at the firmware to see how many values are in there for the temperature steps?
> 
> Anyway, I wouldn't call this a restriction so much as free performance if you can keep the card cold. Since I'm using an air cooled card that I want to keep 24/7 stable at modest noise levels, I'm doing all of my testing and clock calibration around the high end of the bin that ends at ~68C (as the memory on my sample becomes unstable at 11880MT/s @ 70C). I can rely on getting 2037MHz core in most protracted load scenarios, but if my card is running colder for some reason I can see 2050, 2062, or even 2075MHz.


Water cooling does seem to get you a slightly more "stable" boost clock, but the gains in real clock speed are not that high, and certainly not enough to justify the price of a waterblock.

An even riskier option is the Shunt Mod: http://www.overclock.net/t/1608437/tutorial-power-target-limit-hardware-mod-shunt-mod-for-titan-x-and-many-other-nvidia-gpus/0_100

Quote:


> Originally Posted by *Blameless*
> 
> I've pushed north of 450w through Fermi and Hawaii parts that only have 8-pin + 6-pin connectors, without significant issues. Then again, I've fried a few HEDT motherboards and one R9 290 by melting the EPS-12v or 8-pin PCI-E connectors, with as low as 7A a conductor.
> 
> Anyway, you want to split whatever load you have as evenly as possible between rails so you don't risk hitting OCP during a load spike.
> 200A is 2400w at 12v. Try to push that through a PCI-E power cable and it will vaporize.
> 
> The ATX specs are quite conservative, and the 18AWG wire can handle quite a bit more, as can new, properly fit, connectors. However, I wouldn't ignore the guidelines, especially in the case of aging connectors that have seen dozens or hundreds of matings.
> 
> Every time I've seen a DC power cable fail anywhere near spec, without being obviously damaged, it was because of poor contact at the actual connector level, not because of the wire.


One test of a good design these days, I'm finding, is how much the card draws from the PCIe slot versus the 8-pin PCIe power cables.

This is the ideal situation: http://www.tomshardware.fr/articles/test-msi-gtx-1080-ti-lightning,2-2714-5.html

  

Seeing as how you have a 1080Ti Aorus, I did a bit of digging. It looks like Tom's has something for you too: http://www.tomshardware.com/reviews/aorus-gtx-1080-ti-xtreme-edition,5054-4.html


This is the "maximum overclock gaming power consumption test". Looks well within the 75W spec. If you wanted, SLI would be no problem or a second one for a dual use mining/gaming hybrid setup.

Many motherboards come with a supplemental power connector for the PCIe slots.



It might come in handy for people running setups like the RX 480 Crossfire, which exceeds PCIe spec.


----------



## taem

Quote:


> Originally Posted by *2ndLastJedi*
> 
> Yeah set nVidia control panel like in OP and you should see a increase in benchmark scores.


Turning off my framerate limiter really boosted my benchmarks. Lol, totally forgot about that. Funny thing is, I was still happy with the performance. What a beast of a GPU. Damn, I don't want to be disloyal, but I've never had this feeling with a GPU before, that this is enough power.


----------



## Thoth420

Quote:


> Originally Posted by *TheBoom*
> 
> 1 year?? Seems like their quality hasn't changed much in 10 years. That's why I specifically avoid MSI. You can't go wrong with Asus especially with mobos.


This was my first and last MSI board.

I have had some stability issues even at stock clocks on recent-series ASUS boards and chipsets, so I have mostly been opting for ASRock, which has given me none.

I am looking at the Skylake-X i7 7820X 8-core, which seems like enough for me, and as long as it can do over 4.0GHz somewhere, that is OK by me. Pairing it with the ASRock Taichi X299 was my plan, but I would certainly consider an ASUS board. Any recommendations?

*My second choice looks like the R7 1800X (or 1700, not sure) and the X370 ASUS Crosshair VI, which I built a friend's system on, so I have experience with it and it worked perfectly.*


----------



## ZealotKi11er

Quote:


> Originally Posted by *Thoth420*
> 
> This was my first and last MSI board.
> 
> I have had some issues with stability even at stock clocks on recent series and chipset ASUS boards so I have been opting for AsRock mostly which have had none.
> 
> I am looking at the Skylake X i7 7820K 8 core which seems enough for me and as long as it can do over 4.0 somewhere that is ok for me. Pairing it with the AsRock Taichi X299 was my plan but I would certainly consider an ASUS board. Any recommendations?
> 
> *My second choice looks like the R7 1800x (or 1700 not sure) and the X370 ASUS Crosshair VI which I built a friends system on so I have experience with that and it worked perfectly.*


If you've got the cash, get Intel. Still not sold on the Skylake-X stuff for gaming; they seem to be having some problems. The 1800X is also $350 if you're close to a Micro Center, which is a nice deal.


----------



## Thoth420

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If you got the cash get Intel. Still not sold on Skylake X stuff for gaming. They seem to be having some problems. 1800X is also $350 is you close to MC which is nice deal.


I am not exactly close, but it's about 3 hours out and I already plan on heading there in a few weeks.
I don't mind a quad-core CPU per se (though I read last night after our exchange that the 1080 Ti can handle the video load for streaming and capturing very well on its own), but the 7700K seems to run very hot and frankly I am just tired of only having a quad core. Some of it is frankly epeen.

I would like a bit more flexibility and overhead for the future. AMD coming back with viable options has made deciding very hard. I think by the time I get all my parts and make it to MC, Coffee Lake will probably be upon us, but I hate messing with first-batch, early-BIOS mobos... refresh series or not.

This is a rough decision in a build that a week ago seemed very cut and dried.

I cancelled my order on the Strix 1080 Ti for now as they seem to be finicky. I think it's either going to be an FE or the EVGA FTW3, with a heavy lean toward the FE for the option to block it after about 6 months of system assessment.


----------



## Blameless

Quote:


> Originally Posted by *CrazyElf*
> 
> I wonder if renaming the programs might bypass the restrictions. This is risky though, as I don't know whether there is risk to damaging the GPU. I do know that historically, when people renamed Furmark, they could get around the downclock restrictions that Nvidia set. It's likely though that it would be fixed at 150% if you did rename.


FurMark even used to include a second copy of itself to get around the primitive block in older drivers...they just called it Quake3.exe as that was a popular OGL app that NVIDIA would never throttle. Unfortunately, renaming the exe no longer seems to work.

I wouldn't expect it to be damaging as the limiter would still be in place and I can run other apps that easily hit it (Superposition 8k, for example) even with the 375w limit on my card/firmware.


----------



## xcom-

Hello everyone good evening.

After finally completing my build I am now looking at overclocking my EVGA GTX 1080 TI with EK Waterblock, however looking for some sound advice from you guys, as I really have no idea where to begin.

I am surprised at how few video guides there are on YouTube.

Can someone offer me any suggestions on how I can improve my current settings? Also, I am wondering: why are people using MSI Afterburner rather than the default software for the card, such as EVGA Precision?

Any advice would be greatly appreciated.

My current settings :

Core Clock +150
Memory Clock +550



And if anyone wants to see my build: http://imgur.com/R81PX


----------



## ZealotKi11er

Some Nvidia History. The good old days.


----------



## 2ndLastJedi

Quote:


> Originally Posted by *taem*
> 
> Turning off my framerate limiter really boosted my benchmarks. Lol. Totally forgot about that. Funny thing is I was still happy with the performance. What a beast of a gpu. Dam I don't want to be disloyal but I've never had that feeling with a gpu before, that this is enough power.


Framerate limiter?? Do you mean V-Sync?


----------



## Swolern

Quote:


> Originally Posted by *2ndLastJedi*
> 
> Framerate limiter ?? Do you mean V-Sync ?


A framerate limiter is a cap you set on the GPU; V-Sync is the monitor's limit.


----------



## Iceman2733

Quote:


> Originally Posted by *Thoth420*
> 
> I am not exactly close, but it's about 3 hours out and I already plan on heading there in a few weeks.
> I don't mind a quad core CPU per se (however, I read last night after our exchange that the 1080 Ti can handle the video load for streaming and capturing very well on its own), but the 7700K seems to run very hot and frankly I am just tired of only having a quad core. Some of it is frankly epeen.
> 
> I would like a bit more flexibility and overhead in the future. AMD coming back with viable options has made deciding very hard. I think by the time I get all my parts and get to go to MC, Coffee Lake will probably be upon us, but I hate messing with first-batch, early-BIOS mobos... refresh series or not.
> 
> This is a rough decision in a build that a week ago seemed very cut and dried.
> 
> I cancelled my order on the Strix 1080 Ti for now as they seem to be finicky. I think it's either going to be an FE or the EVGA FTW3 Gamer, with a heavy leaning toward the FE for the option to block it after about 6 months of system assessment.


Strix finicky? I am not sure what you're reading or who you're talking to, but that is far from the truth. I started with two FTW3s and took them back for two Strix OCs, and my good friend has two Strix OCs in his system as well. Mine are on water, his on air, and neither of us has experienced anything finicky about them. Compared to the FTW3 they are quieter and ran cooler on stock fan profiles. OC-wise they are right on par with every other Ti on the market: mine are at 2055, and on a nice cold-ambient evening in his house his average about 1963 (his ambient is horrible).


----------



## Thoth420

Quote:


> Originally Posted by *Iceman2733*
> 
> Strix finicky? I am not sure what you're reading or who you're talking to, but that is far from the truth. I started with two FTW3s and took them back for two Strix OCs, and my good friend has two Strix OCs in his system as well. Mine are on water, his on air, and neither of us has experienced anything finicky about them. Compared to the FTW3 they are quieter and ran cooler on stock fan profiles. OC-wise they are right on par with every other Ti on the market: mine are at 2055, and on a nice cold-ambient evening in his house his average about 1963 (his ambient is horrible).


User reviews; and if it does occur, I can't be bothered to deal with ASUS for an RMA on a GPU since I only run one. I would be dead in the water for who knows how long. Their RMA process is a nightmare, whereas EVGA offers an advance RMA service if something happens. I don't care about a bit of extra noise at all compared to peace of mind.


----------



## ZealotKi11er

Played about 2 hours of BF1; fully stable now. Thought I would test my 2025MHz/1.0v OC and noticed that the power target does not come close to 120%, mostly 100-105%. Looks like BF1 does not push the card that hard.


----------



## KedarWolf

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Played about 2 hours of BF1; fully stable now. Thought I would test my 2025MHz/1.0v OC and noticed that the power target does not come close to 120%, mostly 100-105%. Looks like BF1 does not push the card that hard.


In Hellblade: Senua's Sacrifice I can do 1.062v at 2062 core, 6142 memory, and it doesn't power limit during gameplay at the highest detail settings, with everything maxed in Nvidia Control Panel except Antialiasing - Transparency, which is on Multisample.

Well, except for cut scenes; to keep cut scenes from power limiting I dropped it down to 2037 at 1.025v.


----------



## ZealotKi11er

I have yet to figure out how much power the AIO uses. From what I am reading it is very low, 3-5W, and the stock fan probably not much more at 50%.


----------



## buellersdayoff

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I have yet to figure out how much power AIO uses. From what I am reading is very low 3-5W and the stock fan probably not much more at 50%.


Asetek pump is 2.1w, fans less than that


----------



## fisher6

Can the extra voltage from the XOC BIOS affect the card in the long term? My card is at 2000-2050 with 1.1v most of the time. I know I can cap the voltage with the curve, but most of the time I add nothing to the core.


----------



## Dasboogieman

Quote:


> Originally Posted by *fisher6*
> 
> Can the extra voltage from the XOC bios affect the card in the long term? My card is at 2000-2050 with 1.1v most of the time.


Yes, we just cannot tell you to what degree. The best data we have so far suggests 1.28v+ can degrade a 980 Ti within about 3 years of use. Since Pascal is a similar architecture on a smaller (16nm) process, I can surmise that about 1.18-1.2v on Pascal is equivalent to 1.28v on Maxwell.


----------



## pierluigi74

Hi guys, I just got an EVGA 1080 Ti FTW3. Can anyone tell me if I can modify the BIOS? Are there modded BIOSes ready to use? Thank you so much.


----------



## Dasboogieman

Quote:


> Originally Posted by *pierluigi74*
> 
> hi guys, i just got an EVGA 1080 ti ftw3. can anyone tell me if I can modify the bios? are there mod bios ready to use? thank you so much


Nope and nope. Pascal is completely locked down; the best you can do is flash the BIOS from another manufacturer.


----------



## pierluigi74

Oh no... my god!!
So which is the best BIOS I can use, with a high TDP, to avoid power throttling?


----------



## Dasboogieman

Quote:


> Originally Posted by *pierluigi74*
> 
> oh no .... my god !!
> and therefore, the best bios I can use with high tdp to avoid cuts?


Your current BIOS is already one of the best.

If you want to gamble, you can try the XOC BIOS but I'm not 100% sure you can get away with it on the FTW3. I'll have to scan the PCB to see.


----------



## Maximization

Still waiting for a new power supply to push mine further; the watt meter is showing over 1000 watts of system usage and my 850W was cutting out.


----------



## Blameless

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Played about 2 hours of BF1. Fully stable now. Though I would test my 2025MHz/1v OC and noticed that power target does not come close to 120%. Mostly 100-105%. Looks like BF1 does not push the card that hard.


I'm using Unigine Superposition 8K Optimized as my worst-case scenario, as it's the most demanding test, power-wise, that isn't artificially/prematurely throttled.


----------



## CYBER-SNIPA

Is anyone running the DP V1.3 or V1.4 cables? If so have you noticed any difference over a V1.2 cable???


----------



## Dasboogieman

Quote:


> Originally Posted by *CYBER-SNIPA*
> 
> Is anyone running the DP V1.3 or V1.4 cables? If so have you noticed any difference over a V1.2 cable???


DP doesn't care about cable version. It just cares about the port protocol support on either side.


----------



## CYBER-SNIPA

Quote:


> Originally Posted by *Dasboogieman*
> 
> DP doesn't care about cable version. It just cares about the port protocol support on either side.


Actually, you are incorrect. The newer DP cables (1.3 & 1.4) utilize HBR3 and have higher bandwidth, enabling a higher data rate. And yes, I should have been clearer and mentioned HBR3 in my original question. I am aware that the output and input ports need to be running the same version, but the cable also needs to support the bandwidth/data rate being asked of it. Therefore the cable and ports all need to be compatible, with different cables supporting different bandwidths as shown below.

V1.1 = 10.8Gbits/s
V1.2 = 17.28Gbits/s
V1.3 = 32.4Gbits/s
V1.4 = 32.4Gbits/s

SOURCE; https://en.wikipedia.org/wiki/DisplayPort.

HBR3 enables the user to run a higher-resolution screen coupled with a higher refresh rate. The 1080-series cards support DP V1.4, but without a compatible connecting cable your monitor will be unable to reach the higher resolutions and refresh rates. Hence why I asked the question.

DisplayPort High Bit Rate 3 (HBR3) [finally explained!]

mini DisplayPort to DisplayPort white cable
Approved on September 15, 2014, this new standard has replaced HBR2 in 2015. High Bit Rate 3 (HBR3) is the new standard used by the all new DisplayPort 1.3 video cards.

The data bandwidth represents the maximum information (in gigabits per second) each version of DisplayPort can send to the output. Because of this shared bandwidth, it is possible to connect many monitors of different make, model, resolution and refresh rate on each cable.

The total bandwidth of High Bit Rate 3 connections is 32.4 Gbit/s, the Video Data Rate being 25.92 Gbit/s.

Here are some of the known possible monitor resolutions:

One 7680 x 4320 (8K) at 60 frames per second (hertz)
One 5120 x 2880 (5K) at 60 frames per second (hertz)
One 4096 x 2160 (4K) up to 120 frames per second (hertz)
Up to two 3840 x 2160 (4K) up to 120 frames per second (hertz)
Up to three 2560 x 1600 up to 60 frames per second (hertz), or one monitor up to 120 hz
Up to four 1920 x 1200 up to 60 frames per second (hertz), or one monitor up to 240 hz
Up to four 1920 x 1080 up to 60 frames per second (hertz), or one monitor up to 240 hz
& all lower resolutions and the ones in between. There is a maximum number of monitors that can be connected to any card, so be sure to check in your owner's manual.
And a variety of mix and match resolutions like one 2560 x 1600 and two 1920 x 1080, etc, as long as the total resolution is below the total bandwidth available.

Source; http://multimonitorcomputer.com/solved/displayport-high-bit-rate-3-hbr3.php

So by swapping your DP cable for a V1.3 or V1.4 (to obtain higher bandwidth), that upgrade should in theory let you OC your monitor to a higher refresh rate? I am asking whether anyone has tried this and, if so, what your results were.
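For what it's worth, you can sanity-check which link rate a given mode needs from the figures quoted above. A rough estimate (24 bpp assumed, and the ~5% blanking overhead is my approximation; real CVT timings differ):

```python
# Effective video data rates quoted above, in Gbit/s
# (after 8b/10b coding for HBR/HBR2).
DATA_RATE_GBPS = {
    "HBR": 8.64,
    "HBR2": 17.28,   # DP 1.2-class link
    "HBR3": 25.92,   # DP 1.3/1.4-class link
}

def required_gbps(width, height, hz, bpp=24, overhead=1.05):
    """Approximate uncompressed bit rate for a video mode."""
    return width * height * hz * bpp * overhead / 1e9

def fits(width, height, hz, link):
    return required_gbps(width, height, hz) <= DATA_RATE_GBPS[link]

# 4K 60Hz needs ~12.5 Gbit/s (fits HBR2); 4K 120Hz needs ~25.1
# Gbit/s, which is what actually requires an HBR3 link.
```

By this estimate 4K 60Hz fits comfortably within HBR2, which matches the spec quote; it's 4K at 120Hz (or 5K/8K) that actually needs the HBR3 link rate.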


----------



## Psykoking

Quote:


> Originally Posted by *CYBER-SNIPA*
> 
> DP Cables are not all equal, the Wikipedia and Displayport pages clearly demonstrates this with different bandwidths etc. The article which you have quoted only refers to HBR2 and not the new HBR3 standard. Obviously you did not read that or understand that part???.


It seems like you've misunderstood something. While what is stated on the wiki page is of course correct, those are numbers for the DisplayPort interface, not the cable itself. The interface here consists of your graphics card as the sender, the cable as the transmission medium, and your monitor as the receiver. So the cable itself won't help you with bandwidth; what matters is that both sender and receiver support the newer DisplayPort versions. A graphics card that supports HBR3 paired with a monitor that only supports HBR2 will still result in the link using HBR2, because the two endpoints negotiate which version is used. The cable itself won't give you problems unless you try to connect to something several hundred meters away, where noise in the cable decreases the usable bandwidth.

VESA itself has a pretty nice PowerPoint presentation explaining DisplayPort. To keep it short I copied the page with the relevant diagram, as can be seen below.


For further information on this topic look up the Shannon-Hartley theorem.

Hope that helps clear it up.

Sources: http://www.vesa.org/wp-content/uploads/2011/01/ICCE-Presentation-on-VESA-DisplayPort.pdf
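Since Shannon-Hartley keeps coming up: channel capacity is C = B·log2(1 + S/N), so the same link supports less bandwidth as the received signal weakens relative to the noise. A toy illustration (the bandwidth and SNR figures are made up for the example, not DisplayPort numbers):

```python
import math

def capacity_gbps(bandwidth_ghz, snr_db):
    """Shannon-Hartley capacity C = B * log2(1 + S/N)."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_ghz * math.log2(1 + snr_linear)

# Same channel bandwidth, but a longer or noisier cable attenuates
# the signal at the receiver, cutting the achievable bit rate.
short_run = capacity_gbps(8.1, 20)  # healthy SNR at the receiver
long_run = capacity_gbps(8.1, 10)   # attenuated signal, same noise
```

This is the formal version of the point above: the cable doesn't "have" a DP version, but its attenuation and shielding set the SNR, and the SNR sets the usable bit rate.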


----------



## 8051

Is the percentage power limit in Precision X a reflection of what is in the VBIOS of the card, or a limit imposed by Precision X? The maximum I can set my percentage power limit to is 120% and I hit it constantly in some games, which results in PWR perfcaps and throttling.


----------



## navjack27

The percentage is BIOS-based.


----------



## ZealotKi11er

Quote:


> Originally Posted by *8051*
> 
> Is the percentage power limit in Precision X a reflection of what is in the VBIOS of the card, or a limit imposed by Precision X? The maximum I can set my percentage power limit to is 120% and I hit it constantly in some games, which results in PWR perfcaps and throttling.


It's the BIOS limit. If you are hitting it, lower the OC and use less voltage. There's no point setting a core clock that just bounces up and down.
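For reference, the slider is just a percentage of the BIOS default board power, so converting it to watts is trivial. A sketch assuming the 250 W Founders Edition default (board-partner BIOSes ship different defaults and caps; check your own card's BIOS with GPU-Z):

```python
# Power-target slider -> watt ceiling. The 250 W default is an
# assumption for a Founders Edition card; partner BIOSes differ.
def power_target_watts(default_w=250, percent=100):
    return default_w * percent / 100

# At the 120% cap the card may draw up to ~300 W before the driver
# reports a PWR perfcap and drops clocks.
ceiling = power_target_watts(250, 120)
```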


----------



## Dasboogieman

Quote:


> Originally Posted by *CYBER-SNIPA*
> 
> Actually you are incorrect. The newer version's of DP cable's (1.3 & 1.4) are utilizing HBR3 and have a higher bandwidth, thus enabling a higher bit rate of data transference. And yes I should have been clearer and mentioned HBR3 in my original question. I am aware that the output and input ports need to be running the same version, but the cable also needs to support the bandwidth/data rate being asked of it too. Therefore the cable and ports all need to be compatible, with different cables supporting different bandwidth's as shown below.
> 
> V1.1 = 10.8Gbits/s
> V1.2 = 17.28Gbits/s
> V1.3 = 32.4Gbits/s
> V1.4 = 32.4Gbits/s
> 
> SOURCE; https://en.wikipedia.org/wiki/DisplayPort.
> 
> HBR3 enables the user to run a higher resolution screen coupled with a higher refresh rate. The 1080 cards support DP V1.4 but without the compatible connecting cable being used, your monitor will be unable to support the higher res and refresh rates. So hence why I asked the question???
> 
> DisplayPort High Bit Rate 3 (HBR3) [finally explained!]
> 
> mini DisplayPort to DisplayPort white cable
> Approved on September 15, 2014, this new standard has replaced HBR2 in 2015. High Bit Rate 3 (HBR3) is the new standard used by the all new DisplayPort 1.3 video cards.
> 
> The data bandwidth represent the maximum information (in gigabytes per seconds) each version of DisplayPort can send to the output. Because of this shared bandwidth, it is possible to connect many monitors of different make, model, resolution and refresh rate on each cable.
> 
> The total bandwidth of High Bit Rate 3 connections is 32.4 Gbit/s, the Video Data Rate being 25.92 Gbit/s.
> 
> Here are some of the known possible monitor resolutions:
> 
> One 7680 x 4320 (8K) at 60 frames per second (hertz)
> One 5120 x 2880 (5K) at 60 frames per second (hertz)
> One 4096 x 2160 (4K) up to 120 frames per second (hertz)
> Up to two 3840 x 2160 (4K) up to 120 frames per second (hertz)
> Up to three 2560 x 1600 up to 60 frames per second (hertz), or one monitor up to 120 hz
> Up to four 1920 x 1200 up to 60 frames per second (hertz), or one monitor up to 240 hz
> Up to four 1920 x 1080 up to 60 frames per second (hertz), or one monitor up to 240 hz
> & all lower resolutions and the ones in between. There is a maximum number of monitors that can be connected to any card, so be sure to check in your owner's manual.
> And a variety of mix and match resolutions like one 2560 x 1600 and two 1920 x 1080, etc, as long as the total resolution is below the total bandwidth available.
> 
> Source; http://multimonitorcomputer.com/solved/displayport-high-bit-rate-3-hbr3.php
> 
> So by swapping your DP cable for a V1.3 or V1.4 (in order to obtain a higher bandwidth), that upgrade should in theory enable the user to OC your monitor to obtain a higher refresh rate? So I am asking if anyone has tried this and if so what were your results???


I'm going by the guidelines on the DP website.
https://www.displayport.org/cables/how-to-choose-a-displayport-cable-and-not-get-a-bad-one/

here is the quote
_Despite what you may read, there is no such thing as a DisplayPort 1.1 cable and DisplayPort 1.2 cable. A standard DisplayPort cable, including the so-call DisplayPort 1.1 cables, will work for any DisplayPort configuration including the new capabilities enabled by DisplayPort 1.2, including 4K and multi-stream capabilities. All standard DisplayPort cables support RBR, HBR (High Bit Rate), and HBR2 (High Bit Rate 2), which can support 4K at 60Hz, or up to four 1080p displays using multi-stream._


----------



## CYBER-SNIPA

Quote:


> Originally Posted by *Psykoking*
> 
> Seems like you understand something wrong, while of course what is stated on the wiki page is correct, those are numbers for the display port interface. This has nothing to do with the cable itself. The interface in this case consists of your graphics card as the sender, the cable as the transmission medium and your monitor as the receiver. So the cable itself wont help you with the bandwidth. What's important is that both sender and receiver support newer display port versions. Therefore having a gfx card that supports HBR 3 and a monitor only supporting HBR 2 will still result in the interface using HBR 2 because the modules (sender and receiver) decide which version is beeing used. The cable itself wont give you problems unless you try to connect to smth several hundred meters away -> noise in cable decreases bandwidth.


I understand that the cable is not controlling the bandwidth, as it is merely a conduit for the signal to be sent from one device to another. However if the cable is not capable of handling the bit rate of data transference, surely it will bottleneck the signal, causing the bandwidth to decrease to a lower bit rate, one which it can handle and deliver accordingly? Which is in agreement with what you said above.

Quote:


> Originally Posted by *Dasboogieman*
> 
> I'm going by the guidelines on the DP website.
> https://www.displayport.org/cables/how-to-choose-a-displayport-cable-and-not-get-a-bad-one/
> 
> here is the quote
> _Despite what you may read, there is no such thing as a DisplayPort 1.1 cable and DisplayPort 1.2 cable. A standard DisplayPort cable, including the so-call DisplayPort 1.1 cables, will work for any DisplayPort configuration including the new capabilities enabled by DisplayPort 1.2, including 4K and multi-stream capabilities. All standard DisplayPort cables support RBR, HBR (High Bit Rate), and HBR2 (High Bit Rate 2), which can support 4K at 60Hz, or up to four 1080p displays using multi-stream._


Those guidelines only refer to HBR2 and not the new HBR3 standard. Hence why I am asking those questions.


----------



## KedarWolf

Quote:


> Originally Posted by *CYBER-SNIPA*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Psykoking*
> 
> Seems like you understand something wrong, while of course what is stated on the wiki page is correct, those are numbers for the display port interface. This has nothing to do with the cable itself. The interface in this case consists of your graphics card as the sender, the cable as the transmission medium and your monitor as the receiver. So the cable itself wont help you with the bandwidth. What's important is that both sender and receiver support newer display port versions. Therefore having a gfx card that supports HBR 3 and a monitor only supporting HBR 2 will still result in the interface using HBR 2 because the modules (sender and receiver) decide which version is beeing used. The cable itself wont give you problems unless you try to connect to smth several hundred meters away -> noise in cable decreases bandwidth.
> 
> 
> 
> I understand that the cable is not controlling the bandwidth, as it is merely a conduit for the signal to be sent from one device to another. However if the cable is not capable of handling the bit rate of data transference, surely it will bottleneck the signal, causing the bandwidth to decrease to a lower bit rate, one which it can handle and deliver accordingly? Which is in agreement with what you said above.
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dasboogieman*
> 
> I'm going by the guidelines on the DP website.
> https://www.displayport.org/cables/how-to-choose-a-displayport-cable-and-not-get-a-bad-one/
> 
> here is the quote
> _Despite what you may read, there is no such thing as a DisplayPort 1.1 cable and DisplayPort 1.2 cable. A standard DisplayPort cable, including the so-call DisplayPort 1.1 cables, will work for any DisplayPort configuration including the new capabilities enabled by DisplayPort 1.2, including 4K and multi-stream capabilities. All standard DisplayPort cables support RBR, HBR (High Bit Rate), and HBR2 (High Bit Rate 2), which can support 4K at 60Hz, or up to four 1080p displays using multi-stream._
> 
> Click to expand...
> 
> Those guidelines only refer to HBR2 and not the new HBR3 standard. Hence why I am asking those questions.
Click to expand...

I know that if you're using a 3 metre DP cable at 4K 60Hz, not all cables are equal. I have not found one yet that doesn't cause artefacts, but it has been suggested I buy a 'Certified' cable to get it to work.

Using a 2 metre cable works fine.


----------



## Psykoking

Quote:


> Originally Posted by *CYBER-SNIPA*
> 
> I understand that the cable is not controlling the bandwidth, as it is merely a conduit for the signal to be sent from one device to another. However if the cable is not capable of handling the bit rate of data transference, surely it will bottleneck the signal, causing the bandwidth to decrease to a lower bit rate, one which it can handle and deliver accordingly? Which is in agreement with what you said above.
> Those guidelines only refer to HBR2 and not the new HBR3 standard. Hence why I am asking those questions.


Yeah, the guidelines are from 2011, but I took the diagram from them just to clarify how the interface works; I was too lazy to look up the newer ones.
The cable should always be capable of handling the bit rate. The problem with transmission length is that it is bound to signal strength (which decreases over length) and to noise from sources of interference. What you can basically derive from that: if you use cable A in an environment X with noise N, and cable B (which has better insulation against electromagnetic interference), you will achieve a longer transmission length at maximum bandwidth with cable B.
Quote:


> Originally Posted by *KedarWolf*
> 
> I know if you're using a 3 metre DP cable with 4K 60Hz, not all are equal. I have not found one yet that doesn't cause artefacts but it has been suggested I buy a 'Certified' cable to get it to work.
> 
> Using a 2 metre cable works fine.


Well, that makes sense; it's possible the signal power decreases just enough over that additional metre that the maximum bandwidth can no longer be achieved (again, Shannon-Hartley). Also worth noting that older fluorescent tubes are like cancer for cables with only moderate electromagnetic insulation, inflicting a lot of noise.


----------



## 8051

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It's the BIOS limit. If you are hitting it, lower the OC and use less voltage. There's no point setting a core clock that just bounces up and down.


Has anyone bricked a 1080ti by burning a VBIOS not designed for the specific card?


----------



## KedarWolf

Quote:


> Originally Posted by *8051*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> It's the BIOS limit. If you are hitting it, lower the OC and use less voltage. There's no point setting a core clock that just bounces up and down.
> 
> 
> 
> Has anyone bricked a 1080ti by burning a VBIOS not designed for the specific card?
Click to expand...

I bricked my FE flashing a HOF BIOS on it; it took two days to figure out how to flash it back. I thought I was screwed.


----------



## Mooncheese

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I have yet to figure out how much power AIO uses. From what I am reading is very low 3-5W and the stock fan probably not much more at 50%.


You can power the AIO pump (and the fan, depending on the model; this is an essential mod with EVGA's kit, because the fan speed is controlled by the pump and it's insanely loud and annoying) off the motherboard using a mini-PWM to PWM adapter. Also, you can now turn off the illumination on the FE with a standalone program to save a few more watts, and running the fan at 25% RPM saves another few. I wouldn't be surprised if I'm using 10W less with the AIO and all of the above.


----------



## RickyOG90

Quote:


> Originally Posted by *Mooncheese*
> 
> You can power the AIO pump (and the fan, depending on the model; this is an essential mod with EVGA's kit, because the fan speed is controlled by the pump and it's insanely loud and annoying) off the motherboard using a mini-PWM to PWM adapter. Also, you can now turn off the illumination on the FE with a standalone program to save a few more watts, and running the fan at 25% RPM saves another few. I wouldn't be surprised if I'm using 10W less with the AIO and all of the above.


Say what? I'm not using an AIO. I didn't buy a hybrid, I bought the Hydro Copper, which means it's a full waterblock haha


----------



## Mooncheese

Sorry about that, I don't know if the error was the site or mine.


----------



## TheBoom

*For anyone with an Amp Extreme: do not use, or even try, the XOC BIOS*.

I can confirm that the card somehow degrades at an accelerated rate with anything more than stock voltage. In fact, I now wonder how long the card would last even at stock voltage with the stock BIOS.

It would black-screen at high voltages. Over time I had to keep decreasing the slider in AB to stop the card from doing that, and I've had to gradually reduce the voltage until now, when even at 1.05v it will black-screen with the fans ramping up to max.

Better still, underclock the card so you won't have to make use of the 5-year warranty. Unfortunately I've had to, after about 4 months.


----------



## ZealotKi11er

Quote:


> Originally Posted by *TheBoom*
> 
> *For anyone with a Amp Extreme, try not to use or even try the XOC bios*.
> 
> Can confirm that the card somehow degrades at an accelerated rate with anything more than stock voltage. In fact I now wonder how long the card would last even at stock voltages with the stock bios.
> 
> It would blackscreen at high voltages. Over time I had to keep decreasing the slider on AB to stop the card from doing that. I've had to gradually decrease voltage until now even at 1.05v it will blackscreen with fans ramping up to max.
> 
> Better still, underclock the card so you won't have to make use of the 5 year warranty. Unfortunately I've had to, after like 4 months.


I would consider anything over 1.05v high voltage. Use 1.0-1.025v for a 24/7 OC with the curve. I am using 0.9v right now; there's no reason for the extra 10% performance for me at the moment, since the card is already very fast.


----------



## 2ndLastJedi

So my Galax EXOC at its stock 1.062v is not fine?


----------



## MrGreaseMonkkey

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheBoom*
> 
> *For anyone with a Amp Extreme, try not to use or even try the XOC bios*.
> 
> Can confirm that the card somehow degrades at an accelerated rate with anything more than stock voltage. In fact I now wonder how long the card would last even at stock voltages with the stock bios.
> 
> It would blackscreen at high voltages. Over time I had to keep decreasing the slider on AB to stop the card from doing that. I've had to gradually decrease voltage until now even at 1.05v it will blackscreen with fans ramping up to max.
> 
> Better still, underclock the card so you won't have to make use of the 5 year warranty. Unfortunately I've had to, after like 4 months.
> 
> 
> 
> I would consider anything over 1.05v high voltage. Use 1.0 - 1.025v for 24/7 OC with curve. I am using 0.9v right now. No reason for extra 10% performance for me right now since the card is very fast.
Click to expand...

I want all the performance, as I have an FTW3. My stock voltage is 1.062v with 1987 clocks @ 45C, with no overvolting/overclocking. I've had this card for less than 3 weeks and I got one black screen watching Netflix.

Sent from my iPhone using Tapatalk Pro


----------



## xTesla1856

I currently have one of eight Kingpin cards in Switzerland. To keep it or to flip it ?


----------



## ZealotKi11er

Quote:


> Originally Posted by *MrGreaseMonkkey*
> 
> I want all the performance as i have a ftw3. My stock voltage is 1.062v with 1987 clocks @ 45c. Not overvolting/clocking, Ive had this card for less than 3 weeks and i got 1 black screen watching Netflix.
> 
> Sent from my iPhone using Tapatalk Pro


Sure, but you do not need to run more than 1.0v on a 1080 Ti. You want this card to last at least 1-2 years before getting the next Ti. Do you really care about 2-3% more fps?


----------



## chibi

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I would consider anything over 1.05v high voltage. Use 1.0 - 1.025v for 24/7 OC with curve. I am using 0.9v right now. No reason for extra 10% performance for me right now since the card is very fast.


Can you please advise what core and memory clocks you're able to sustain at 0.9V? I'm looking to undervolt my 1080Ti for the mitx rig I'm building and was curious what results to expect. It'll be under full custom water cool with 2x 240 rads including the cpu.


----------



## MrGreaseMonkkey

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MrGreaseMonkkey*
> 
> I want all the performance as i have a ftw3. My stock voltage is 1.062v with 1987 clocks @ 45c. Not overvolting/clocking, Ive had this card for less than 3 weeks and i got 1 black screen watching Netflix.
> 
> Sent from my iPhone using Tapatalk Pro
> 
> 
> 
> Sure but you do not need to run more than 1.0v with 1080 Ti. You want this card to last at least 1-2 years before getting next Ti. You really care about 2-3% more fps?
Click to expand...

The card is stock @ 1.062v with no overvolting; this is the out-of-the-box configuration for this silicon. If it breaks I will RMA it. I wanted to wait for Volta, but this Pascal card is as powerful as two 980 Tis, so I jumped.

Sent from my iPhone using Tapatalk Pro


----------



## KedarWolf

Quote:


> Originally Posted by *chibi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> I would consider anything over 1.05v high voltage. Use 1.0 - 1.025v for 24/7 OC with curve. I am using 0.9v right now. No reason for extra 10% performance for me right now since the card is very fast.
> 
> 
> 
> Can you please advise what core and memory clocks you're able to sustain at 0.9V? I'm looking to undervolt my 1080Ti for the mitx rig I'm building and was curious what results to expect. It'll be under full custom water cool with 2x 240 rads including the cpu.
Click to expand...

I can do 1961 core, 6147 memory at 0.900v, but a third of the way through the Fire Strike Ultra stress test it drops to 1949 and stays there.


----------



## MaKeN

What's the highest clock you know of, or have heard of, from anyone at 1.0v?


----------



## chibi

Quote:


> Originally Posted by *KedarWolf*
> 
> I can do 1961 core, 6147 memory at .900v. But a third of the way through the Fire Strike Ultra stress test it drops to 1949 and stays there.


Thank you Kedar, I will play around with the voltage curve and see what I need to get 2000MHz core. Might have to bump that up to 1.0V.


----------



## KedarWolf

Quote:


> Originally Posted by *chibi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I can do 1961 core, 6147 memory at .900v. But a third of the way through the Fire Strike Ultra stress test it drops to 1949 and stays there.
> 
> 
> Thank you Kedar, I will play around with the voltage curve and see what I need to get 2000MHz core. Might have to bump that up to 1.0V.
Click to expand...

I can get 2012 at .993v.


----------



## jura11

Quote:


> Originally Posted by *MaKeN*
> 
> What's the highest clock you know of, or have heard of, from anyone at 1.0v?


The best I can do is 2025-2035MHz at 1.0v, but in Superposition 4K Optimized I'm hitting the power and voltage limits.

Hope this helps

Thanks, Jura


----------



## Blameless

Quote:


> Originally Posted by *MaKeN*
> 
> What's the highest clock you know of, or have heard of, from anyone at 1.0v?


Stable clocks at a given voltage are heavily temperature dependent.

Just going to water-level temps is good for 25-50MHz more vs. an air cooled part that runs ~20C warmer.


----------



## MaKeN

I see, thanks for replying.
Well, I'm stable at 2025 at 1.0v. Is it worth fiddling with that curve again for no real reason? But as you guys say it may even do 2050 in some cases, so I guess I'll try.
Temps-wise the card stays at 39C under load.


----------



## ZealotKi11er

I run 1860MHz at 0.9v and 2025MHz at 1.0v. These are not affected by temps; they always stay at these clocks. I could probably do 1900MHz at 0.9v but have had no time to test.
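Putting the points reported in this thread together: per card, the stable clock rises roughly linearly with voltage. A quick interpolation sketch using KedarWolf's numbers from earlier (0.900v/1961, 0.993v/2012, 1.025v/2037, 1.062v/2062; every die is different, so this only suggests a starting point for your own curve):

```python
import bisect

# (volts, stable core MHz) pairs reported earlier in the thread.
POINTS = [(0.900, 1961), (0.993, 2012), (1.025, 2037), (1.062, 2062)]

def estimate_clock(voltage):
    """Linearly interpolate a starting clock guess for a voltage."""
    volts = [v for v, _ in POINTS]
    clocks = [c for _, c in POINTS]
    if voltage <= volts[0]:
        return clocks[0]  # clamp below the lowest reported point
    if voltage >= volts[-1]:
        return clocks[-1]  # clamp above the highest reported point
    i = bisect.bisect_right(volts, voltage)
    v0, v1 = volts[i - 1], volts[i]
    c0, c1 = clocks[i - 1], clocks[i]
    return c0 + (c1 - c0) * (voltage - v0) / (v1 - v0)
```

Pascal actually steps in roughly 13 MHz bins and drops a bin as temperature rises, so round the estimate down and verify with a stress test.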


----------



## MaKeN

So I'm able to run a 4K benchmark at 2050 and 1.0v, but above 34C it drops a bin to 2038.
I tried setting 2062, so that with the drop it would be 2050 all the time, but the benchmark freezes.
What do I do if I want 2050 all the time at 1.0v? Heat the GPU up hard and then set the curve while it's still warm?


----------



## criminal

Quote:


> Originally Posted by *MaKeN*
> 
> So im able to run a 4k benchmarking at 2050 1.0v,
> But after 34c it drops a bin to 2038.
> I have tried to set 2062 so with a drop it would be 2050 all time , and benching freezes .
> What to do if i want 2050 all time at 1.0v? Heat the gpu hard and then set the curve till its still warm?


Keep the card below 34C


----------



## MaKeN

Hah, lol, impossible. Ok, thanks for the input, got it.

Nah... anyway, apparently it will crash in Heaven even at 2038 and 1.0v. So I guess 2025 is the spot.
I'll have to try that 2025 at 0.993v; I doubt it won't crash, but there's nothing to lose at this point.

Edit: so 2025 crashes at 0.993.
2012 works at 0.993.
But the temp difference is exactly the same, 39C at load.
I guess I'm keeping 2025 at 1.0v.


----------



## ZealotKi11er

Quote:


> Originally Posted by *MaKeN*
> 
> So im able to run a 4k benchmarking at 2050 1.0v,
> But after 34c it drops a bin to 2038.
> I have tried to set 2062 so with a drop it would be 2050 all time , and benching freezes .
> What to do if i want 2050 all time at 1.0v? Heat the gpu hard and then set the curve till its still warm?


Up the voltage a bit and set the curve with load on.


----------



## MaKeN

@ZealotKi11er thanks.
I can't do it. I've tried all the damn methods; that voltage slider won't unlock in MSI Afterburner. The only option is to install the EVGA software, where it is unlocked.
Very good info to know. And more time to spend.


----------



## Maximization

new psu unleashed me..hahah

https://www.3dmark.com/spy/2376253



wooo hoooo

https://www.3dmark.com/spy/2376347


----------



## AzimErebus

Hi all. Some interesting things with my SC2 Hybrid. I've tried all the different BIOSes. XOC gives me a bin or two higher at a lot of voltages, but that BIOS takes a performance hit overall while using substantially more wattage. Arctic Storm is pretty nice, and gives my boot-up a different, flashy picture, too.

Having an EVGA card, though, the FTW3 BIOS seems to be the best for me, as I keep all my sensors. The FTW3 BIOS is also very stable, seemingly more stable than even my stock BIOS. I run my card at 1.000v, 1999 core, 6048 memory. In Superposition 8K I spike at 318 watts once or twice, but stay mostly under 300. A few times the core hops between 1987 and 1999, but it settles on 1999 and stays there. If I lowered my memory I might be able to get down to .930v, as my card will run 1999 core at .930v in benchmarks. Heck, it will run lots of configurations between .993-1.025v and 1999-2050 in benchmarks. But alas, I get crashes in games at many of those settings, and rather than do more testing I'm just taking the easy route for now.

Heroes of the Storm is pretty sensitive; GTA V and The Witcher 3 as well. The highest I took the card was 1.125v at 2137 core, except for the one time the tail end of the voltage curve in XOC flipped up to 1.2v and I missed it, but just for a few seconds.









Anyway, something I hadn't realized: my blower fan stopped working at some point between all the flashes. My memory and VRM getting way too warm tipped me off. So I uninstalled Afterburner and installed EVGA Precision XOC. During installation, Precision XOC detected my fan, but the speed went up to the fan's actual full speed, not the card's default. It did not sound good; besides just being loud, the bearing rattled. But it was only for a few seconds, and then it was adjustable between about 20 percent and 0.

I tried to use Precision XOC, but I don't like it, so I uninstalled it. However, now my fan runs at the default speed at all times. I think it is 20 percent or so: silent, but moving air. Almost nothing can detect my fan now, though. Afterburner fan control doesn't do anything, and SpeedFan can't detect my GPU fan. GPU-Z does see a middle and a right fan; the middle fan is blank, and the right fan reads 3240 rpm, though it is not actually at that speed.

I'm happy with this setup, temps are all in line, but the fan situation is pretty interesting.

Also, people have mentioned in several places that the pump, LED, and fans drawing power from the card could impact the card's core/memory voltage. If I could get something like .930v at 2037 core by disconnecting all of them, I probably would. But I may sell the card at some point, so I don't want to take it apart without some substantiated info. If anyone has any info, give me a heads up, please?


----------



## Blameless

Final clocks on my Aorus are 2025/5940 with 1.031v.

This is stable all the way to 68C, which is where I put the temp limiter, while the fan curve is a straight line from 0% at 50C to 100% at 67C with a 2C hysteresis.

No limits hit except in Superposition 8k, which just barely touches the power limiter (at 375w).
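A curve like that is easy to sanity-check in a few lines. Below is a minimal sketch (my own code, not tied to any particular fan-control tool) of a linear 0-100% ramp between 50C and 67C with a 2C hysteresis band applied on the way back down:

```python
class FanCurve:
    """Linear fan ramp with hysteresis, mirroring the curve described above:
    0% at 50C rising to 100% at 67C, with a 2C hysteresis band."""

    def __init__(self, t_min=50.0, t_max=67.0, hysteresis=2.0):
        self.t_min = t_min
        self.t_max = t_max
        self.hysteresis = hysteresis
        self.last_speed = 0.0  # remember the previous output

    def speed(self, temp_c):
        # Interpolate linearly between the two endpoints, clamped to 0-100%.
        span = self.t_max - self.t_min
        target = max(0.0, min(1.0, (temp_c - self.t_min) / span)) * 100.0
        # Rising temps apply immediately; falling speeds are held until the
        # temperature has dropped past the hysteresis band.
        if target < self.last_speed:
            held = max(0.0, min(1.0, (temp_c + self.hysteresis - self.t_min) / span)) * 100.0
            target = max(target, min(self.last_speed, held))
        self.last_speed = target
        return target

curve = FanCurve()
print(curve.speed(58.5))  # halfway up the ramp -> 50.0
print(curve.speed(56.5))  # a 2C drop is still inside the band -> stays at 50.0
```

The hysteresis keeps the fan from hunting up and down when the temperature hovers around one point on the ramp.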
Quote:


> Originally Posted by *MaKeN*
> 
> So im able to run a 4k benchmarking at 2050 1.0v,
> But after 34c it drops a bin to 2038.
> I have tried to set 2062 so with a drop it would be 2050 all time , and benching freezes .
> What to do if i want 2050 all time at 1.0v? Heat the gpu hard and then set the curve till its still warm?


2062 just below 34C should be the same as setting 2050 at a modestly higher temperature.

It's quite possible your sample just isn't stable at 2050 with 1v.
Quote:


> Originally Posted by *MaKeN*
> 
> @ZealoKi11er thx .
> I cant do it , i have tried all the damn methods, that voltage slider wont unlock in msi afterburner, only thing is to install Evga software so its unlocked there.
> Very good info to know. And again time to spend


Which 1080 Ti are you using again?


----------



## MaKeN

Gigabyte Waterforce fanless...
Yes, it's not stable at 2050/2038 for now at 1.0v. It passes Superposition but not Heaven for about 20 minutes, and Andromeda would crash in about 5 minutes... 2025/6318 seems stable.

For some reason no benchmark/stress test takes the card above 39C, while Andromeda reaches 42C...
I guess it's because the game heats up the CPU as well, and it's in the same loop...
Do I have to run combined CPU+GPU tests for perfect stability?







Feels like it.
Or I could simply forget it and not bother squeezing the last drop of performance from the card by hunting down its final limit...
I'll stay at 2025 and 1.0v for now.
Thanks very much for the suggestions/input, I do appreciate it.


----------



## ZealotKi11er

Quote:


> Originally Posted by *MaKeN*
> 
> Gigabyte waterforce fanless....
> Yes its not stable at 2050/2038 for now at 1.0v .. passes the superposition but not heaven for like 20 min or that andromeda game would crash in about 5 min.....2025/6318 seems stable
> 
> For some reason no benchmark/stress test does more then 39c on the card , andromeda does 42c ...
> I guess its because it heats up the cpu also ,its in the same loop...
> Do i have to run cpu+gpu tests combinet for a perfect stability?
> 
> 
> 
> 
> 
> 
> 
> feels like it


I would suggest you keep memory at +500. If it does +650, test it to make sure it's actually gaining performance.


----------



## MaKeN

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I would suggest you keep memory at +500. If it does +650 test it to make sure its getting performance.


That's what I get in Superposition:
at +500: 9994 points
at +650: 10064 points
at +700: 10250 points
at +730: 10130 points
at +750: 1276 points

I did a PC restart after changing the memory MHz.

Feels like +750 is better for me.


----------



## ZealotKi11er

Quote:


> Originally Posted by *MaKeN*
> 
> thats what i get in superposition:
> at 500 9994 points
> 
> 
> at650 10064 points
> 
> 
> 
> at 700 10250 pts
> 
> 
> at 730 10130 pts
> 
> 
> at 750 1276points
> 
> 
> 
> i did a pc restart after changing the mhz for memory.
> 
> feels like 750 is better for me


Keep in mind memory also sucks a lot of power.


----------



## 2ndLastJedi

Quote:


> Originally Posted by *MaKeN*
> 
> thats what i get in superposition:
> at 500 9994 points
> 
> 
> at650 10064 points
> 
> 
> 
> at 700 10250 pts
> 
> 
> at 730 10130 pts
> 
> 
> at 750 1276points
> 
> 
> 
> i did a pc restart after changing the mhz for memory.
> 
> feels like 750 is better for me


So do you game with +750 memory? What core, memory, and voltage are your 24/7 gaming settings, and what temps? Is a high memory OC safe for long-term use? My Galax EXOC doesn't have monitoring on anything other than the core, so I'm a bit worried about memory temps.


----------



## nrpeyton

Nice 'n cold and fully stable











Extremely disappointed with my memory overclock though.

I should be able to do better than a measly +575 before errors start kicking in.

Anyone managing a +1000?


----------



## Dasboogieman

Quote:


> Originally Posted by *nrpeyton*
> 
> Nice 'n cold and fully stable
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Extremely disappointed with my memory overclock though.
> 
> I should be able to do better than a measley +575 before errors start kicking in.
> 
> Anyone managing a +1000?


Only the Titan Xp guys can do +1000 lol; all the GDDR5X going to those cards is from the higher-grade bins, which very rarely go to 1080 Tis.


----------



## nrpeyton

doesn't make sense to waste higher grade memory on the Titan. lol


----------



## Dasboogieman

Quote:


> Originally Posted by *nrpeyton*
> 
> doesn't make sense to waste higher grade memory on the Titan. lol


They have to; the Titan Xp has an 11.4 Gbps base VRAM data rate. The binning is there to ensure all the chips can meet stock clocks.

It's no surprise Titan Xps routinely get +800 overclocks or better.


----------



## lilchronic

They could have different timings so they clock higher. Similar to how the K|NGP|N edition cards have tighter memory timings, the Tis could be the cards with the tighter timings.


----------



## Sweetwater

Quote:


> Originally Posted by *nrpeyton*
> 
> Nice 'n cold and fully stable
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Extremely disappointed with my memory overclock though.
> 
> I should be able to do better than a measley +575 before errors start kicking in.
> 
> Anyone managing a +1000?


That's exactly what I get with my Strix on the XOC BIOS, but at 50C. Same core, same memory. If I up the memory to, say, +600 it starts a disco party; I assume because it gets too warm, as it takes about a minute to start flashing.


----------



## MaKeN

@2ndLastJedi
Yep, I game with +750 for now...
Core at 2025 at 1.0v; max temps in game 41C, water temps 35.5C.
I don't know about the VRM temps on the card... but with an IR gun pointed at the card's backplate it shows 53C (I'll have to retest that).


----------



## KedarWolf

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nrpeyton*
> 
> Nice 'n cold and fully stable
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Extremely disappointed with my memory overclock though.
> 
> I should be able to do better than a measley +575 before errors start kicking in.
> 
> Anyone managing a +1000?
> 
> 
> 
> Only the Titan Xp guys can do +1000 lol, all the gddr5x going to those cards are te higher grade bins which very rarely goes to 1080tis.
Click to expand...

My Gigabyte FE can do +699 for benching, but I run it at +642 24/7.
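For anyone mapping these offsets onto the absolute clocks quoted earlier in the thread: Afterburner reports the 1080 Ti's stock memory as 5505 MHz (11 Gbps effective, counting two transfers per reported cycle), so an offset like +642 lands exactly on the 6147 figure mentioned above. A quick sketch of the arithmetic (function names are mine):

```python
BASE_MHZ = 5505  # stock 1080 Ti memory clock as reported by Afterburner (11 Gbps effective)

def memory_clock(offset_mhz):
    """Afterburner-reported memory clock for a given offset."""
    return BASE_MHZ + offset_mhz

def effective_gbps(offset_mhz):
    """Effective data rate: the reported clock counts two transfers per cycle."""
    return 2 * memory_clock(offset_mhz) / 1000

print(memory_clock(642))              # -> 6147, matching the 24/7 clock above
print(round(effective_gbps(642), 2))  # -> 12.29 (Gbps effective)
```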


----------



## NeedMoreJuice

Hey, I've got a couple of questions about OC'ing the 1080 Ti. I'm asking because I'd get one and water-cool it later on, after I've made sure how everything will fit etc., but in the meantime I would still like to overclock it as close to the "limit" as possible:

I've seen many videos on OC'ing the 1080 Ti, and they all had in common that the voltage and power target were instantly cranked up. I get maxing out the power target, but why would you max out the voltage at the beginning of the OC, without seeing how much the card is capable of at the default voltage?

Also, any games or programs to test the OC? Valley in my experience is quite forgiving regarding the memory clock. The 3DMark Fire Strike Extreme stress test is either too picky or stresses the GPU too much... GTA 5, Shadow of Mordor, and Overwatch are quite okay for figuring out the memory clock; any more suggestions? Metro 2033 or Last Light perhaps? Heaven, I hear, is more of a CPU and less of a GPU benchmark compared to Valley. How is Superposition?

Last but not least, what is the average overclock of a 1080 Ti?

I'd appreciate your answers.


----------



## 8051

Quote:


> Originally Posted by *NeedMoreJuice*
> 
> Hey, I've got a couple of questions about OC'ing the 1080 Ti. I'm asking because I'd get one and watercool it later on, after I made sure how everything will fit etc., but in the meantime I would still like to overclock it as close to the "limit" as possible:
> 
> I've seen many videos regarding OC'ing the 1080 Ti and they all had in common that the voltage and power target were instantly cranked up. I get maxing out the power target, but why would you max. out the voltage in the beginning of the oc, without seeing of how much the card is capable of with the default voltage?
> 
> Also, any games or programs to test the oc? Valley is from my experience quite forgiving regarding the memory clock. 3DMark Firestrike Extreme / Stress test is either too picky or stresses out the gpu too much... GTA 5, Shadow of Mordor and Overwatch are quite okay figuring out the memory clock, any more suggestions? Metro 2033 or LL perhaps? Heavens I heard is rather more of a cpu and less of a gpu benchmark, compared to valley. How is superposition?
> 
> Last but not least, what is the average overclock of a 1080 Ti?
> 
> I'd appreciate your answers.


My ultimate test for stability is Dying Light. All my other benchmark programs (RE5 and Unigine Heaven) can loop forever, but for some bizarre reason Dying Light always cranks my power draw up to 120%.


----------



## Ultracarpet

Quote:


> Originally Posted by *NeedMoreJuice*
> 
> Hey, I've got a couple of questions about OC'ing the 1080 Ti. I'm asking because I'd get one and watercool it later on, after I made sure how everything will fit etc., but in the meantime I would still like to overclock it as close to the "limit" as possible:
> 
> I've seen many videos regarding OC'ing the 1080 Ti and they all had in common that the voltage and power target were instantly cranked up. I get maxing out the power target, but why would you max. out the voltage in the beginning of the oc, without seeing of how much the card is capable of with the default voltage?
> 
> Also, any games or programs to test the oc? Valley is from my experience quite forgiving regarding the memory clock. 3DMark Firestrike Extreme / Stress test is either too picky or stresses out the gpu too much... GTA 5, Shadow of Mordor and Overwatch are quite okay figuring out the memory clock, any more suggestions? Metro 2033 or LL perhaps? Heavens I heard is rather more of a cpu and less of a gpu benchmark, compared to valley. How is superposition?
> 
> Last but not least, what is the average overclock of a 1080 Ti?
> 
> I'd appreciate your answers.


Well, if you just want to know the max your card can achieve without worrying about anything else, like whether the chip will stay cool enough, you might as well crank the voltage and power limit to their max and increase the core clock until it's unstable. It's the easiest and fastest way to find out what your card can do.

If you are trying to achieve a different type of overclock, or something that needs to keep temps and power consumption in check, then yea, you might wanna just see what the card can do with stock voltage or even reduced voltage.

I haven't had my 1080 Ti for long, but it looks like the average max OC is somewhere between 2050-2150 on the core. For memory it seems to be +400-500. Also, I think these cards are a bit sensitive to temperature for stability, so you may see slightly better results under water than you're able to achieve on air, even at the same voltage.

I don't really know what to tell you about stress testing, I generally just play games and try to push the clock in small increments when I think I'm getting close to the limit.


----------



## Blameless

Quote:


> Originally Posted by *NeedMoreJuice*
> 
> Hey, I've got a couple of questions about OC'ing the 1080 Ti. I'm asking because I'd get one and watercool it later on, after I made sure how everything will fit etc., but in the meantime I would still like to overclock it as close to the "limit" as possible:
> 
> I've seen many videos regarding OC'ing the 1080 Ti and they all had in common that the voltage and power target were instantly cranked up. I get maxing out the power target, but why would you max. out the voltage in the beginning of the oc, without seeing of how much the card is capable of with the default voltage?
> 
> Also, any games or programs to test the oc? Valley is from my experience quite forgiving regarding the memory clock. 3DMark Firestrike Extreme / Stress test is either too picky or stresses out the gpu too much... GTA 5, Shadow of Mordor and Overwatch are quite okay figuring out the memory clock, any more suggestions? Metro 2033 or LL perhaps? Heavens I heard is rather more of a cpu and less of a gpu benchmark, compared to valley. How is superposition?
> 
> Last but not least, what is the average overclock of a 1080 Ti?
> 
> I'd appreciate your answers.


I always max out the power limit; there's really no reason not to. However, maxing out voltage is completely counterproductive on an unmodded part... you are going to hit either a power limit or a temperature limit before significantly more voltage becomes beneficial. If you aren't using the frequency/voltage curve editor for some reason, the voltage and clock offset sliders together can tune the clock speed with more control than the clock slider alone, but there is still almost no reason to max the voltage out.

As for benchmarks, Superposition 8k is great for pushing power limit, Timespy finds unstable GPU clocks faster than almost anything, and FFXIV Stormblood is good for general stability.

You can also use MemtestG80, running 6-8 instances, to quickly test memory stability. It's especially nice because it gives an objective error count and doesn't rely on one needing to see artifacts or get crashes before noticing instability. The downside is that as a CUDA app you need to start the test then increase the memory clock offset by another 500 to get gaming clocks, as the compute memory clock is 500 below the full load gaming API clocks.
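That last +500 adjustment is easy to get backwards, so here is the bookkeeping as a tiny sketch (the function name is mine; the 500 MHz gap between the CUDA compute clock and full gaming clocks is as described above):

```python
COMPUTE_GAP_MHZ = 500  # CUDA apps run the memory 500 MHz below full gaming clocks

def offset_while_testing(gaming_offset_mhz):
    """Offset to set AFTER launching MemtestG80 so that the memory under test
    runs at the same absolute clock it would reach in games at gaming_offset_mhz."""
    return gaming_offset_mhz + COMPUTE_GAP_MHZ

# To validate a +600 gaming overclock, raise the slider to +1100 with the test running:
print(offset_while_testing(600))  # -> 1100
```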
Quote:


> Originally Posted by *Ultracarpet*
> 
> Also, I think these cards are a bit sensitive to temperature for stability so you may see slightly better results under water than what you are able to achieve on air even at the same voltage.


They are hugely temperature sensitive and finding the temperature I can comfortably run, worst case scenario, is my first step when I begin to fine tune 24/7 stable clocks.


----------



## Ultracarpet

Quote:


> Originally Posted by *Blameless*
> 
> I always max out power limit, really no reason not to. However, maxing out voltage is completely counter productive on an unmodded part...*you are either going to hit a power limit* or a temperature limit before significantly more voltage becomes beneficial. If you aren't using the frequency/voltage curve editor for some reason, using the voltage and clock offset sliders can tune the clock speed with more control than just the clock slider, but there is still almost no reason to max it out.
> 
> As for benchmarks, Superposition 8k is great for pushing power limit, Timespy finds unstable GPU clocks faster than almost anything, and FFXIV Stormblood is good for general stability.
> 
> You can also use MemtestG80, running 6-8 instances, to quickly test memory stability.


Oh yea, forgot about that lol. I remember when I couldn't figure out for the life of me why my 290x was throttling even though the temps were fine.


----------



## GraphicsWhore

Quote:


> Originally Posted by *chibi*
> 
> Can you please advise what core and memory clocks you're able to sustain at 0.9V? I'm looking to undervolt my 1080Ti for the mitx rig I'm building and was curious what results to expect. It'll be under full custom water cool with 2x 240 rads including the cpu.


2012/6039 here at .993v

Water-cooled EVGA FE.

FYI, my benchmark scores take a dip as compared to my max OC (2126/6048 at 1.093v) but insignificant difference in real-world performance (i.e., gaming).


----------



## NeedMoreJuice

Quote:


> Originally Posted by *8051*
> 
> My ultimate test for stability is Dying Light. All my other benchmark programs (RE5 and Unigine Heaven) can loop forever, but for some bizarre reason Dying Light always cranks my used power percentage up to 120%.


I had the same problem when I modded the PT to 130%; that game is quite strange, but it has its uses.
Quote:


> Originally Posted by *Ultracarpet*
> 
> Well if you are wanting to just know what the max your card can achieve without worrying about anything else like if the chip will stay cool enough, you might as well just crank up the voltage and power limit to their max and increase the core clock until its unstable. It's the easiest and fastest way to finding out what your card can do.
> 
> If you are trying to achieve a different type of overclock, or something that needs to keep temps and power consumption in check, then yea, you might wanna just see what the card can do with stock voltage or even reduced voltage.
> 
> I haven't had my 1080ti for long, but it looks like for core clocks the average is somewhere in between 2050-2150 for max OC's. For memory it seems to be 400-500. Also, I think these cards are a bit sensitive to temperature for stability so you may see slightly better results under water than what you are able to achieve on air even at the same voltage.
> 
> I don't really know what to tell you about stress testing, I generally just play games and try to push the clock in small increments when I think I'm getting close to the limit.


What I forgot to ask: is the difference in core clock as significant as it was with Maxwell? By that I mean, is a 50MHz difference pretty meaningful, or rather meh?
Quote:


> Originally Posted by *Blameless*
> 
> I always max out power limit, really no reason not to. However, *maxing out voltage is completely counter productive on an unmodded part...you are either going to hit a power limit or a temperature limit before significantly more voltage becomes beneficial.* If you aren't using the frequency/voltage curve editor for some reason, using the voltage and clock offset sliders can tune the clock speed with more control than just the clock slider, but there is still almost no reason to max it out.
> 
> As for benchmarks, Superposition 8k is great for pushing power limit, Timespy finds unstable GPU clocks faster than almost anything, and FFXIV Stormblood is good for general stability.
> 
> You can also use MemtestG80, running 6-8 instances, to quickly test memory stability. It's especially nice because it gives an objective error count and doesn't rely on one needing to see artifacts or get crashes before noticing instability. The downside is that as a CUDA app you need to start the test then increase the memory clock offset by another 500 to get gaming clocks, as the compute memory clock is 500 below the full load gaming API clocks.
> They are hugely temperature sensitive and finding the temperature I can comfortably run, worst case scenario, is my first step when I begin to fine tune 24/7 stable clocks.


Exactly, so I didn't quite understand why you would just max it out instantly.

I looked at the voltage curve earlier; it definitely comes in handy and I'll give it a try. But, as mentioned, I first want to see the limits of the card before I invest extra work in it.

Regarding Time Spy, is the free version enough? Otherwise I'm going to get the upgrade.


----------



## Ultracarpet

Quote:


> Originally Posted by *NeedMoreJuice*
> 
> What I forgot to ask, is the difference in core clock as huge as it was with maxwell? By that I mean if 50mhz difference are more or less pretty neat or rather meh ?


Like in terms of how it translates to performance?

If so, I have found that the clocks do scale pretty well with performance, but in my mind there really isn't a point in chasing that last 50MHz or so if it comes at the cost of stability in some games and applications, since in the grand scheme it isn't that big an increase. 50MHz on top of 2000MHz is only 2.5%... unless you are benching or something, I wouldn't spend too much time on it.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Ultracarpet*
> 
> Like in terms of how it translates to performance?
> 
> If so, I have found that the clocks do scale pretty pretty well with performance, but there really isn't a point in my mind to chase that last 50mhz or so if it is at the cost of stability in some games and applications as in the grand scheme it isn't that big of an increase. Like, 50mhz ontop of 2000mhz is only 2.5%... unless you are benching or something, I wouldn't bother with spending too much time on it.


Going from 2025 to 2100 for me needs 0.1v more for an extra 3.75% clock bump, and the power does not scale linearly: roughly 80W extra.


----------



## NeedMoreJuice

Quote:


> Originally Posted by *Ultracarpet*
> 
> Like in terms of how it translates to performance?
> 
> If so, I have found that the clocks do scale pretty pretty well with performance, but there really isn't a point in my mind to chase that last 50mhz or so if it is at the cost of stability in some games and applications as in the grand scheme it isn't that big of an increase. Like, 50mhz ontop of 2000mhz is only 2.5%... unless you are benching or something, I wouldn't bother with spending too much time on it.


Ugh, yes, scaling! It was on the tip of my tongue.









I picked a pretty badly-overclocking 980 Ti that barely manages 1400 with +25mV, which hurt a bit, since cards at 1500MHz definitely did benefit from the extra 100MHz on Maxwell. While I knew that Pascal wouldn't scale as well as Maxwell, I wasn't too sure about the 1080 Ti, since things could have looked different given how close it is to the Titan.


----------



## Blameless

Quote:


> Originally Posted by *NeedMoreJuice*
> 
> Regarding timespy, is the free version of it enough? Else I'm going to get the upgrade.


The free version will work, just run it manually 3-5 times in rapid succession with the card a few degrees higher than you'd normally have it. It crashes or fails on flaky GPU clocks like almost nothing else.


----------



## ZealotKi11er

Quote:


> Originally Posted by *NeedMoreJuice*
> 
> Ugh, yes, scaling, I had it on the tip of my tongue
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I picked a pretty bad oc'ing 980 Ti that barely manages 1400 with +25mV, which quite hurt, seeing cards at 1500mhz definitely did benefit from 100mhz more on maxwell. While I knew that Pascal wouldn't scale as well as maxwell, I wasn't too sure about the 1080 Ti, since it could've looked different because it's close to the titan.


1500/1400 is ~7% more clock speed. You get about 5% more performance, which is not bad. 5% can be noticed, but if you are at 100 fps it's only 105 fps, and 60 fps becomes 63 fps. So from game to game, 3-5 fps. Compared to a stock 980 Ti, 1500/1200 is ~25% more clock speed and ~20% more performance. In games that's 100 fps to 120 fps, and 60 fps to 72 fps, which is quite noticeable. A 1080 Ti gets a similar increase if you compare a stock FE to an OC'ed 2GHz+ card, though probably more like ~15%. You notice it less unless you play at 4K.
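The arithmetic behind those estimates can be sketched in one line: performance follows clock with an efficiency factor below 1 (roughly 0.7-0.8 judging by the numbers above). A rough illustration, not a calibrated model:

```python
def estimated_fps(base_fps, clock_ratio, efficiency=0.8):
    """Estimate fps after a core overclock.

    clock_ratio: new_clock / old_clock
    efficiency:  fraction of the clock gain that shows up as fps
                 (~0.7-0.8 for these cards per the thread; an assumption)
    """
    return base_fps * (1 + efficiency * (clock_ratio - 1))

# Stock 980 Ti (~1200 MHz) vs a 1500 MHz overclock, starting from 60 fps:
print(round(estimated_fps(60, 1500 / 1200)))  # -> 72, matching the figures above

# Chasing the last bin, 2025 -> 2075 MHz at 100 fps:
print(round(estimated_fps(100, 2075 / 2025), 1))  # -> 102.0, only a ~2% gain
```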


----------



## NeedMoreJuice

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 1500/1400 ~ 7.5% more clock speed. You get about 5% more performance which is not bad. 5% can be noticed but if you are at 100 fps its only 105 fps and 60 fps is 63 fps. So from game to game 3-5 fps. Compared to stock 980 Ti, 1500/1200 ~ 25% more clock speed ~ 20% more performance. In games thats 100 to 120 fps and at 60 fps to 72 fps which is quite noticeable. 1080 Ti get similar increase if you compare stock FE to OCed 2GHz+ card. Probably more in line ~ 15% increase. You notice it less unless you play at 4K.


While I definitely don't think I'll have to worry with a 1080 Ti for a good while, I'm really greedy for every fps when I play on [email protected], depending on the game of course, for example in more competitive online games. Losing 1 to 2 fps to a smaller OC isn't the end of the world, but anything beyond that feels a little like wasted "free" performance you could have had :/


----------



## Djreversal

So would two of these be good for water blocks and the XOC BIOS?

PNY Blower Design 1080 Ti


----------



## Kronos8

Does anyone own a Gainward GTX 1080 Ti? Any model?
I was looking at the first page of this thread (the datasheet) and noticed no Gainward entry.
I also looked in the TechPowerUp BIOS database for info on Gainward GTX 1080 Ti BIOSes: also no entry.
Strange, as the Gainward 1080 models were very popular.
Is there a known problem with the 1080 Tis that I'm not aware of?
Thanks in advance for any answer.


----------



## MaKeN

Kronos8,
Why would you be interested in those?


----------



## rluker5

Quote:


> Originally Posted by *Maximization*
> 
> new psu unleashed me..hahah
> 
> https://www.3dmark.com/spy/2376253
> 
> 
> 
> wooo hoooo
> 
> https://www.3dmark.com/spy/2376347


Those are pretty far up on the 4960x list


----------



## rluker5

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I run 1860MHz for 0.9v and 2025MHz for 1.0v. These are not effect by temps. They always stay at these clocks. I could probably do 1900MHz with 0.9v but have had no time to test.


Your mentioning undervolting got me to try it. I had to control the voltage on my 780 Tis or else they would thermally throttle, even with a boatload of loud airflow, but my 1080 Tis were fine on temps and not throttling, so I hadn't gotten around to it.

Been playing Dishonored 2 (4K60, mostly maxed + 8x MSAA + SweetFX dithering), so I thought I would screenshot a comparison of stock against some heavy undervolting, where I just picked a little above the clocks at the end of the curve screen. Maybe I could get more out of 800mV; I've only tried one clock so far. No visible difference in gameplay: I was quietly rummaging around Stilton Manor, same areas, in the present, in both runs.
stock: 
undervolt: 

Glad Afterburner has the grids. It doesn't look like much of a difference for my cards, but going from around 187W to 125W (100% = 250W) apiece will keep my room a bunch cooler and the computer quieter.

Hopefully Afterburner gets easier to handle with voltage curves for SLI, since I could see using this a lot when I don't need all of the juice.
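That saving lines up loosely with the usual dynamic-power rule of thumb, P ∝ f·V². A sketch, with illustrative operating points that are assumptions rather than the exact curve settings used above:

```python
def scaled_power(p_old_w, f_old_mhz, v_old, f_new_mhz, v_new):
    """Dynamic-power rule of thumb: P scales with frequency and voltage squared.
    Real cards also have static and memory power, so this under-predicts
    the floor a bit."""
    return p_old_w * (f_new_mhz / f_old_mhz) * (v_new / v_old) ** 2

# Illustrative operating points (assumed, not the measured curve settings):
# ~1911 MHz at 1.050 V stock vs ~1700 MHz at 0.800 V undervolted.
print(round(scaled_power(187, 1911, 1.050, 1700, 0.800)))
# -> ~97 W of dynamic power; the measured ~125 W also includes static and memory draw
```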


----------



## Maximization

Quote:


> Originally Posted by *rluker5*
> 
> Those are pretty far up on the 4960x list


Yeah, thanks. There are 5 ahead of me globally. I will mess with the NVIDIA global settings over the weekend, but I think they are using water chillers; breaking 16000 needs it, I think.


----------



## Kronos8

Quote:


> Originally Posted by *MaKeN*
> 
> Kronos8
> Why would uou be interested in those?


I don't see any reason not to be interested in those.
As I mentioned, the 1080 models were very popular, with good reviews.
The Tis have dual BIOS, which makes flashing safer.
And of course price and availability.
Unless there is a known problem I'm not aware of.


----------



## Mooncheese

Quote:


> Originally Posted by *Maximization*
> 
> yeah thanks there is 5 ahead of my globally, I will mess with Nvidia global settings over the weekend, but I think they are using water chillers, breaking 16000 needs it I think.


Something is off with your CPU score at 4.6 GHz, here's my 4930k at 4.5 GHz:

https://www.3dmark.com/spy/1723784

Not sure if it's wattage starvation or what, but you should be scoring higher with 100 more MHz.


----------



## Maximization

Quote:


> Originally Posted by *Mooncheese*
> 
> Something is off with your CPU score at 4.6 GHz, here's my 4930k at 4.5 GHz:
> 
> https://www.3dmark.com/spy/1723784
> 
> Not sure if it's wattage starvation or what but yeah you should be higher with 100 more MHz.


I think it might be my 64 GB of RAM; it eats more resources, and keeping it stable is another thing.

RAM disks are fun, though... the performance hit I pay for them stinks.


----------



## webhito

What drivers are you folks currently using? I recently sidegraded to a CH6 and am now having weird stuttering/freezing on my system. I installed 382.75 and then updated to the latest ones; sadly, the latest drivers are giving me artifacts.

Think my issue is driver related, or could it be Ryzen?

Mind you, what happens is that when a cutscene comes into play it turns into a slideshow, and after that it does the same in certain parts of the game. Weird indeed; I don't think I have ever had this happen before.


----------



## ZealotKi11er

Quote:


> Originally Posted by *webhito*
> 
> What drivers are you folks currently using? I recently sidegraded to a CH6 and am now having weird stuttering/freezing on my system. I installed 382.75 and then updated to the latest ones; sadly, the latest drivers are giving me artifacts.
> 
> Think my issue is driver related, or could it be Ryzen?
> 
> Mind you, what happens is that when a cutscene comes into play it turns into a slideshow, and after that it does the same in certain parts of the game. Weird indeed; I don't think I have ever had this happen before.


Check for CPU stability.


----------



## webhito

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Check for CPU stability.


Can't really; my tool of choice has always been Afterburner, but for some reason it's not compatible with the latest Win 10 update, so I can't see the load on the cores.

Benchmarks work fine, IBT as well; the only place it does this is during games. And sadly, like I mentioned above, I can't check it with Afterburner, as RivaTuner pops up saying it's not compatible.

Edit:

Manually installed RivaTuner and now it works. CPU load is around 50% on one core and split out between the others, very low; GPU usage, however, is at 30% lol.

Went to BIOS, hit default, same thing.


----------



## Dry Bonez

Hey everyone, can someone please help me out, or at least tell me some of you are experiencing the same thing, so I know it isn't just me. I just went from a GTX 1080 to an Aorus Extreme Waterforce 1080 Ti, and ever since I have had this card my screen keeps flickering (blinking), and it's quite annoying. I am not able to pinpoint the determining factor. I have swapped out ports; I have yet to change my cable, but I doubt it's the issue since this never once happened with my 1080. I am guessing it's either a faulty GPU or a driver issue, and I am on the latest Nvidia driver as well. Please help, thanks in advance.


----------



## KedarWolf

Quote:


> Originally Posted by *Dry Bonez*
> 
> Hey everyone, can someone please help me out, or at least tell me some of you are experiencing the same thing, so I know it isn't just me. I just went from a GTX 1080 to an Aorus Extreme Waterforce 1080 Ti, and ever since I have had this card my screen keeps flickering (blinking), and it's quite annoying. I am not able to pinpoint the determining factor. I have swapped out ports; I have yet to change my cable, but I doubt it's the issue since this never once happened with my 1080. I am guessing it's either a faulty GPU or a driver issue, and I am on the latest Nvidia driver as well. Please help, thanks in advance.


I had that with a 3-metre display port cable, went back to 2 metres and all is well.

Try another display port cable, some have issues. Try to get a DP certified cable.


----------



## Dry Bonez

Quote:


> Originally Posted by *KedarWolf*
> 
> I had that with a 3-metre display port cable, went back to 2 metres and all is well.
> 
> Try another display port cable, some have issues. Try to get a DP certified cable.


Thanks a bundle for the fast reply, but I use a TV, a Samsung KS8000 to be exact, and as you know, it is HDMI only. My cable is 18 Gbps and fairly short, since I don't need it for long distance.


----------



## webhito

Yup, the slideshow happens in game; all cores get pegged to 100% and GPU usage drops to 10% or less. It's not very often, but it's extremely annoying.


----------



## KedarWolf

Quote:


> Originally Posted by *Dry Bonez*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I had that with a 3-metre display port cable, went back to 2 metres and all is well.
> 
> Try another display port cable, some have issues. Try to get a DP certified cable.
> 
> 
> 
> Thanks a bundle for the fast reply, but i use a TV, a Samsung KS8000 to be exact, and as you know, it is only HDMI. And my cable is 18gbps and it is fairly short since i dont need it for long distance.
Click to expand...

Oh, I see. There's some TV setting you need to change, but I don't recall what it is. I've heard others talk about Samsung TVs.

Google is your friend.


----------



## 8051

Quote:


> Originally Posted by *KedarWolf*
> 
> I had that with a 3-metre display port cable, went back to 2 metres and all is well.
> 
> Try another display port cable, some have issues. Try to get a DP certified cable.


KedarWolf I just wanted to say thanks for your 1080Ti VBIOS flashing guide.


----------



## KedarWolf

Quote:


> Originally Posted by *8051*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I had that with a 3-metre display port cable, went back to 2 metres and all is well.
> 
> Try another display port cable, some have issues. Try to get a DP certified cable.
> 
> 
> 
> KedarWolf I just wanted to say thanks for your 1080Ti VBIOS flashing guide.
Click to expand...


----------



## Dry Bonez

Quote:


> Originally Posted by *KedarWolf*


So yesterday I decided to switch HDMI ports on my GPU, and ever since I moved it to the other HDMI port it has not happened once! I also left the GPU and TV on all night, watching some good old fashioned Yu-Gi-Oh! lol. Should I still return my Aorus 1080 Ti Waterforce for a faulty HDMI port?


----------



## KedarWolf

Quote:


> Originally Posted by *Dry Bonez*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So yesterday I decided to switch HDMI ports on my GPU, and ever since I moved it to the other HDMI port it has not happened once! I also left the GPU and TV on all night, watching some good old fashioned Yu-Gi-Oh! lol. Should I still return my Aorus 1080 Ti Waterforce for a faulty HDMI port?
Click to expand...

See if the Waterforce BIOS is on the TechPowerUp link in the How To Flash A BIOS section and flash it; see if that fixes it.

But if your card is a good overclocker and you don't need the extra HDMI, keep it.

If it's a dud, RMA it for sure.

Here:

https://www.techpowerup.com/vgabios/193733/gigabyte-gtx1080ti-11264-170516


----------



## vf-

This is probably rare, but I'm not sure what's at fault. Anyone have an aluminium Apple 23" display hooked up to this card? I had it running for the past two hours and was gaming on it. I had to power it down and hook it back up to the Mac to adjust its brightness. Now it'll no longer give a picture, just three flashing lights, but it works connected to the Mac. I don't understand it.

The boot-up logo and everything displayed when it was connected to the PC, but now there's no picture, though it still works connected to the Mac. Not sure if it is a Z77 issue or an NVIDIA issue? It doesn't make sense that it worked before and now no longer does.


----------



## Clukos

Is the noise about Nvidia not performing as well in Dx12 as in Dx11 just bs? I just tried with latest drivers, latest Windows 10 update in The Division and I'm getting significantly better results with Dx12, especially in frametimes...

Dx11


Dx12


----------



## tarasis

Finally joining the GTX 1080 TI owners club.

I've ordered a Gainward Phoenix "Golden Sample" card for 729€. The card's GPU base clock is 1556 MHz, boosting to 1670 MHz. Memory is 5505 MHz.

It's not entirely the card I wanted, but the prices on the rest of the cards were crazy, upwards of 800€. I'm kicking myself a little for not snapping up the MSI Gaming X card when I saw it a few weeks ago for 770€, before the Destiny deal and the memory shortage depleted stock most places here and drove the prices to crazy levels. (I was also interested in the AIO models, but again their pricing is just too high atm.)

Currently a mixture of excited and not.

I tend to run at 2560x1080 (with an eye to moving up to a larger 21:9 res sooner rather than later, when I build a new PC for myself and give my old PC and monitor to my kids), but this card is mainly intended to improve my Vive VR experience over my GTX 970.


----------



## KedarWolf

Quote:


> Originally Posted by *Clukos*
> 
> Is the noise about Nvidia not performing as well in Dx12 as in Dx11 just bs? I just tried with latest drivers, latest Windows 10 update in The Division and I'm getting significantly better results with Dx12, especially in frametimes...
> 
> Dx11
> 
> 
> Dx12


Yeah, in the Rise of the Tomb Raider benchmark, frame rates jump all over in DX11; DirectX 12 at 4K is pretty much a solid 60 FPS across the board.


----------



## 8051

Quote:


> Originally Posted by *Clukos*
> 
> Is the noise about Nvidia not performing as well in Dx12 as in Dx11 just bs? I just tried with latest drivers, latest Windows 10 update in The Division and I'm getting significantly better results with Dx12, especially in frametimes...


Less than an 8% difference is significantly better?


----------



## Blameless

Quote:


> Originally Posted by *8051*
> 
> Less than an 8% difference is significantly better?


Look at the graph.


----------



## 8051

Quote:


> Originally Posted by *Blameless*
> 
> Look at the graph.


I didn't see that. That's noticeable.


----------



## nrpeyton

Any new BIOS recommendations for 1080ti FE?


----------



## ZealotKi11er

Quote:


> Originally Posted by *nrpeyton*
> 
> Any new BIOS recommendations for 1080ti FE?


Stock, and use a custom curve to OC. I would suggest 0.9V if you're using air cooling, at 1850-1900 MHz.
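If the curve idea is new to you, the logic of an undervolt/OC curve is simple to sketch: pick a target voltage point, raise its frequency, and flatten every higher-voltage point down to that same frequency so boost never requests more voltage. This is only an illustration of what the flattening does, not Afterburner's actual internals, and the (voltage, MHz) pairs below are made up:

```python
# Sketch of the "flattened" voltage/frequency curve used for an undervolt OC.
# These (voltage, MHz) pairs are illustrative placeholders, not real VBIOS data.
curve = [(0.80, 1700), (0.85, 1780), (0.90, 1850), (0.95, 1920),
         (1.00, 1975), (1.05, 2012), (1.093, 2050)]

def flatten_curve(curve, target_v, target_mhz):
    """Pin target_v to target_mhz and clamp every higher-voltage point
    to the same frequency, so boost never climbs past target_v."""
    out = []
    for v, mhz in curve:
        if v >= target_v:
            out.append((v, target_mhz))
        else:
            out.append((v, mhz))
    return out

undervolted = flatten_curve(curve, 0.90, 1900)  # e.g. 1900 MHz @ 0.9 V
```

In Afterburner you do the same thing by hand: drag the 0.9V point up to your target clock, then make sure no point to its right sits any higher.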


----------



## Djreversal

Hey guys, so I flashed the XOC BIOS. Got everything up and running, but with my settings in Afterburner I cannot get my card to go past 2050 MHz... Am I missing a step to allow for more voltage? Also, the power slider that goes to 120% doesn't show up; I just have the voltage 0-100 slider there. But nothing seems to help; once I hit 2050 it's just a dead stick. Here is my current bench:

https://www.3dmark.com/fs/13635674


----------



## ZealotKi11er

Quote:


> Originally Posted by *Djreversal*
> 
> Hey guys so I put in the XOC overclock Bios.. Got everything up and running, with my settings in Afterburner I can not get my system go past 2050mhz ... Am I missing a step to allow for more voltage?? Also my power slider that goes to 120% doesn't show up, I just have the voltage 0-100 thing there. But it doesn't seem to help anything once I hit 2050 its just a dead stick.. here is my current Bench
> 
> https://www.3dmark.com/fs/13635674


You have to do a curve OC and add more voltage.


----------



## Djreversal

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You have to do a curve OC and add more voltage.


Ah, ok, I'm going to have to look into that further... I'm sure I can squeeze a little more performance out of this thing; I'm not even breaking 40C currently.


----------



## KedarWolf

Quote:


> Originally Posted by *Djreversal*
> 
> Hey guys so I put in the XOC overclock Bios.. Got everything up and running, with my settings in Afterburner I can not get my system go past 2050mhz ... Am I missing a step to allow for more voltage?? Also my power slider that goes to 120% doesn't show up, I just have the voltage 0-100 thing there. But it doesn't seem to help anything once I hit 2050 its just a dead stick.. here is my current Bench
> 
> https://www.3dmark.com/fs/13635674






Are you under water?

If you are, use the XOC BIOS, and even then I wouldn't go higher than 1.15V using this method.

Try for, say, 2100-2133 at 1.125V, ideally with a full water block. And watch your temps.

Even on an AIO I wouldn't use the XOC BIOS, as you're pumping a ton more voltage into the VRMs etc., and they are only air cooled; just the GPU core is water cooled with an AIO.

On air, I wouldn't even think about it. On air, just stick with the stock BIOS and see what speeds you can get at, say, 1.000V. Try for 2000 to 2038 at 1.000V if you have a decent card, less if your card doesn't respond well.


----------



## Djreversal

Quote:


> Originally Posted by *KedarWolf*
> 
> 
> 
> 
> 
> Are you under water?
> 
> If you are, use XOC BIOS and I still wouldn't go higher than 1.15v using this method.
> 
> Try for say 2100-2133 at 1.125v ideally with a full water block.
> 
> And watch your temps.
> 
> Even on an AIO I wouldn't use XOC BIOS as you're pumping a ton more voltage into the VRMs etc. and they are only air cooled.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just the GPU core is water cooled with an AIO.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> On air, I wouldn't even think about it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> On air just stick with the stock BIOS and see what speeds you can get say at 1.000v. Try for 2000 to 2038 at 1.000v if you have a decent card.
> 
> Less if your card doesn't respond well.


No matter what I do following these instructions, I can't get the Power Limit % bar to become active. I have a full EK water block on a separate loop through a 560mm radiator, so it should hold temps OK. I'm editing the MSIAfterburner.oem2 file to match my subsys ID, and it's still not activating it; I even tried the wildcard one. No such luck.


----------



## Sweetwater

Quote:


> Originally Posted by *KedarWolf*
> 
> On air, I wouldn't even think about it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> On air just stick with the stock BIOS and see what speeds you can get say at 1.000v. Try for 2000 to 2038 at 1.000v if you have a decent card.


It depends how you use the XOC BIOS; it has a valid purpose for air-cooled cards. I can use 1.081V highly clocked and my GPU temps are 50-55C. I can also run 2025 @ 0.993V at 49C. On the stock Strix BIOS I'd still power throttle, so the XOC is perfect.

I agree, though, that pushing above 1.093V on air is likely playing with fire, and I dare not tempt the GPU overlords haha.


----------



## TheBoom

Quote:


> Originally Posted by *Djreversal*
> 
> No matter what i do following these instructions i cant get the power Limit bar to become active. I have a full EK water Block on a seperate loop threw a 560mm radiator. Should hold temps ok..
> Unfortunately no matter what i do i can no get the Power Limit % bar to become active on my card... im editing the MSIafterburner.oem2 file to what my subsys is .. and its still not activating it.. I even tried the wild card one.. No such luck.


What are you on about? The XOC bios greys out the power limit slider. You have unlimited power (on the software side only) with the XOC bios.


----------



## eivissa

I am facing a similar situation!

I have the Zotac Arctic Storm, but maybe not the best batch, as the most I could get out of it at 1.092V was a stable 2038 MHz; in some games like Battlefield it also ran into Power and Voltage limits and reduced to 2025 MHz. 12050 MHz on the RAM under all conditions.

Now I am running the XOC BIOS. In Afterburner the temp and power limit sliders have disappeared and voltage is set to 100%, but I am still struggling to go much higher on the clocks. It's now stable at 2050 MHz / 1.113V, and temps are nowhere near any limits with this awesome waterblock from Zotac. Still, after a while in Battlefield I get Power and Voltage Limit indications in RivaTuner and the card clocks itself down to 2038. Even if I raise the voltage one or two more steps, the card does the same and clocks down by 13 MHz under heavy load, especially in Battlefield and after the second Heaven run.

I can add 1.2V and crank it up to just below 2100 MHz, but it doesn't feel right or durable that way.

Anyone got an idea why I still get Power and Voltage Limit indications, although this shouldn't be a limit on the XOC BIOS?

My PSU is a high-quality Seasonic Gold 850W. Could I have messed up the power connection by feeding the 2x 8-pin sockets with only one cable to the GPU, or would the PC just crash if the GPU can't get enough watts from the PSU?


----------



## Djreversal

MSI Afterburner only shows me getting 1062mV; it just sits there the entire time in any benchmark at 2050 MHz, and the crosshairs on the voltage/frequency curve sit there as well.


----------



## Djreversal

When I run the PowerLimit BAT file they provided, it tells me changing the power management limit is not supported for my GPU, treats it as a warning, moves on, and says "All Done".


----------



## kabu

Hi all,

sorry if this has been asked already, but I couldn't find it:

Is there a way to lock the GTX 1080 Ti core clock frequency, core voltage, and memory clock frequency somehow?

I'm having trouble with one game (DCS World 2.1.1) where the game freezes after some time.
I can see in Afterburner that my FPS, GPU usage, power %, core clock, and memory clock all drop at the same time. Most likely it is just a software bug in the game, but I would like to fix my GPU core clock, voltage, and memory clock just to rule that out.

I'm on EKWB water cooling: MSI Z97 Gaming 5, i7-4790K, GTX 1080 Ti, 32GB Corsair 2400 RAM, 2x SSD.


----------



## TheBoom

Quote:


> Originally Posted by *eivissa*
> 
> I am facing a similar situation!
> 
> I have the Zotac Arctic Storm, but maybe not the best batch as the most I could get out of it on 1.092V was stable 2038Mhz, on some games like Battlefield also running into Power and Voltage limitations and reducing to 2025Mhz. 12050mhz on the RAM under all conditions.
> 
> Now I am running the XOC Bios, on afterburner the TEMP and POWER LIMIT sliders have disappeared, voltage is set to 100%, but I am still struggling to go much higher on the clocks. Its now stable at 2050Mhz / 1.113V and the temps are nowhere near any limits with this awesome waterblock from Zotac. Still after a while in Battlefield I get Power and Voltage Limit indications by RivaTuner and the card clocking itsself down to 2038. Even If I raise the voltage one or two more steps, the card does the same and clocks down by 13mhz under heavy load especially in Battlefield and after the second heaven run.
> 
> I can add 1.2V and crank it up to just below 2100Mhz, but doesnt feel right and durable that way.
> 
> Anyone got an idea why I still get Power and Voltage Limit indications although this shouldnt be a limit on the XOC Bios?
> 
> My PSU is high quality Seasonic Gold Standard with 850W. Could I have messed up the Power connection by routing 2x8Pin sockets with only one cable to the GPU or would the PC just crash if the GPU cant get enough Watts from the PSU?


There is a hardware power limit on the card that you cannot bypass no matter what. Also, sometimes there is a bug that happens only on certain voltage bins: if you use a specific clock bin it will show power perfcaps, but if you drop one or two bins you stop getting the indication.

The voltage limit indication is correct, because you are still limited by whatever you set the voltage to in the curve.
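Those 13 MHz back-offs people keep reporting (2050 dropping to 2038, etc.) are Pascal's clock bins: boost only moves in roughly 13 MHz steps, so throttling "one bin" always costs about 13 MHz. A toy illustration, assuming the bin size is exactly 13 MHz:

```python
# Pascal boost clocks move in ~13 MHz bins; dropping a bin under a
# perfcap always subtracts one step. Values are illustrative.
BIN_MHZ = 13

def drop_bins(clock_mhz: int, bins: int) -> int:
    """Clock after the card backs off the given number of bins."""
    return clock_mhz - bins * BIN_MHZ

print(drop_bins(2050, 1))  # 2037, one bin down
print(drop_bins(2050, 2))  # 2024, two bins down
```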

Quote:


> Originally Posted by *Djreversal*
> 
> when i run this Powerlimit BAT file they provided as well, it tells me changing power management limit is not supported for my GPU. Treating as a warning and moving on.
> 
> All Done


You do not need any powerlimit.bat for the XOC BIOS; there isn't even one. I'm not sure which BIOS's powerlimit.bat file you are trying to use. I think you have no clue what you're doing; better stop and do some research before you brick or damage your card.


----------



## TheADLA

Hey guys,

just sent my validation Info..

MSI GTX 1080 Ti Gaming X

Not very lucky in the silicon lottery (as usual, lol), but I am happy. Running her on an Acer 4K 60Hz G-Sync monitor, OC'd with Afterburner. For games, +85 on the core and +450 on the memory for now; she usually boosts up to 2037. No BIOS or power limit/voltage mod, but I replaced the stock thermal paste. Runs fine.


----------



## 8051

Quote:


> Originally Posted by *Sweetwater*
> 
> Depends how you use the XOC Bios, it has a valid purpose for air-cooled cards. I can use 1.081 highly clocked and my gpu temps is 50-55c. I can also run 2025 @ .993 49c. While on the strix stock bios I'd still power throttle so this XOC is perfect.
> 
> I agree though that if you push above 1.093 on air it is likely playing with fire and I dare not tempt the gpu overlords haha


I had the same problem w/the Gigabyte 1080Ti Gaming OC -- it would throttle badly at any overclock above 2000 MHz. The XOC VBIOS fixed that.

Is the XOC VBIOS on hwbot customized by someone other than Asus?


----------



## Djreversal

Quote:


> Originally Posted by *TheBoom*
> 
> There is a hardware power limit on the card that you cannot bypass no matter what. Also sometimes there is bug that happens only on certain bins of voltage where if you use a specific clock bin it will show power perfcaps but if you drop 1 bin or 2 you stop getting the indication.
> 
> Voltage limit indication is correct, because you are still limited by what you set the voltage in the curve to.
> You do not need to use any powerlimit.bat for the XOC bios. There isn't even one. I'm not sure which bios's powerlimit.bat file you are trying to use. I think you have no clue what you're doing. Better stop and do some research before you brick or damage your card.


I followed the thread; that file was there, and it said it could be used with the BIOS and tools I've been using. It didn't do anything when I ran it, so I figured it wasn't necessary. The curve does absolutely nothing; I cannot draw anything more than the 1060mV in Afterburner, and it will not go past 2050 MHz. At 2051 it crashes; at 2050 it's stable no matter what I do. I've adjusted the curve and it still won't pull any more voltage, and the cards aren't breaking 38C at 2050 MHz during testing.


----------



## Djreversal

The only other mod I guess I could do is the shunt mod, to get more voltage at the same %, but I'm not going to pull the cards apart for that, so I'll just leave them at the 2050 setting. Seems to do fine; I just felt I could have squeezed out some more in the benchmarks and moved up the ladder from number 5x in the hall of fame lol.


----------



## KedarWolf

Quote:


> Originally Posted by *Djreversal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheBoom*
> 
> There is a hardware power limit on the card that you cannot bypass no matter what. Also sometimes there is bug that happens only on certain bins of voltage where if you use a specific clock bin it will show power perfcaps but if you drop 1 bin or 2 you stop getting the indication.
> 
> Voltage limit indication is correct, because you are still limited by what you set the voltage in the curve to.
> You do not need to use any powerlimit.bat for the XOC bios. There isn't even one. I'm not sure which bios's powerlimit.bat file you are trying to use. I think you have no clue what you're doing. Better stop and do some research before you brick or damage your card.
> 
> 
> 
> I followed the thread and that file was there and said it could be used which was with the bios i used and the tools ive been useing.. it didnt do anything as i used it so i figured it wasnt necessary, the curve does absolutely nothing, i can not draw anything more the the 1060mv in afterburner and it will not go past 2050mhz.. 2051 and it crashes.. 2050 its stable no matter what i do. ive adjusted the curve and it still wont pull any more voltage, and the cards arent breaking 38C at 2050mhz during testing.
Click to expand...

Your voltage curve should look like this but you drag the higher voltage points, core speeds up higher with your mouse, then hit apply.

You really need to watch the video in the 'How To Do A Custom Voltage Curve' section.


----------



## Djreversal

Quote:


> Originally Posted by *KedarWolf*
> 
> Your voltage curve should look like this but you drag the higher voltage points, core speeds up higher with your mouse, then hit apply.
> 
> You really need to watch the video in the 'How To Do A Custom Voltage Curve' section.


I could put the curve up to +900 if I wanted to... the voltage draw never goes past 1060mV, so it never sees the higher clocks I've dragged on the curve. It just stops at 2050 MHz at 1060mV no matter what I do to the curve past that 1060mV point; it won't read the curve at a point it's not hitting. I've watched the video 50 times to see if I was missing anything.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Djreversal*
> 
> i could put the curve up to +900 if i want to.. the voltage draw never goes past 1060mv so it never sees the higher clocks that i have dragged on the curve.. it just stops at the 2050mhz number at 1060mv no matter what i do to the curve after that 1060mv point .. ive watched the video 50 times to see if i was missing anything.. it wont read the curve at a point its not hitting.


+900 does nothing. You have to drag the higher-voltage points up further and hit apply.


----------



## KedarWolf

Quote:


> Originally Posted by *Djreversal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Your voltage curve should look like this but you drag the higher voltage points, core speeds up higher with your mouse, then hit apply.
> 
> You really need to watch the video in the 'How To Do A Custom Voltage Curve' section.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i could put the curve up to +900 if i want to.. the voltage draw never goes past 1060mv so it never sees the higher clocks that i have dragged on the curve.. it just stops at the 2050mhz number at 1060mv no matter what i do to the curve after that 1060mv point .. ive watched the video 50 times to see if i was missing anything.. it wont read the curve at a point its not hitting.
Click to expand...

What we are saying is: hit Ctrl+F, click the voltage point you want on the curve, and drag it up.


----------



## Djreversal

Quote:


> Originally Posted by *KedarWolf*
> 
> What we are saying is CTRL F, click the mouse on the voltage line, the voltage point you want, and drag it up.


Yes, I have all the voltage points on the graph where I want them... if you notice the green dotted line, it will never go past 1060mV no matter what I do.

https://i.imgur.com/Ha3w7rl.png


----------



## nrpeyton

Shunt mod all the way, guys.

I've had my FE shunt modded for months, and the card sits vertical (side up) because the motherboard is lying open on my desk. The liquid metal has never moved and has never caused any issues.

I can draw nearly 400W and maintain 2113 MHz without throttling at 4K Ultra in the most graphically demanding games (e.g. Dragon Age: Inquisition, The Witcher 3, Gears of War 4, etc.).


----------



## ZealotKi11er

Quote:


> Originally Posted by *nrpeyton*
> 
> shunt mod all the way guys,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've had it shunt modded for months and card is sitting vertical (side up) and liquid metal has never moved -- and never caused any issues.
> 
> I can draw nearly 400w and maintain 2113Mhz without throttling at 4k Ultra in the most graphically demanding games.


And going from 2025 MHz to 2113 MHz is ~4.3% more core speed, which is maybe 2% more FPS, for 100W more power?
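Rough numbers to back that up; the assumption that FPS scales at about half the core-clock gain this far up the curve is a rule of thumb, not a measurement:

```python
# Back-of-envelope cost/benefit of the shunt-modded clocks quoted above.
base_mhz, modded_mhz = 2025, 2113

clock_gain = (modded_mhz - base_mhz) / base_mhz  # fractional core-clock gain
fps_gain_estimate = clock_gain * 0.5             # rule-of-thumb FPS scaling

print(round(clock_gain * 100, 1))         # 4.3 (% more core speed)
print(round(fps_gain_estimate * 100, 1))  # 2.2 (% more FPS, estimated)
```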


----------



## nrpeyton

Quote:


> Originally Posted by *ZealotKi11er*
> 
> And is going from 2025MHz to 2113MHz ~ 4.3% more core speed which is like 2% more fps for 100W more power?


In less graphically demanding games I run at 800mV (0.8V) and a lower clock speed.

In _more_ demanding games at 4K, when I'm seeing 30-40 FPS, I like to feel that it's overclocked to the knife edge, _just_ inside that last inch of stability. I like to know it is performing at its *maximum* capability, without being artificially and unnecessarily throttled!

Does it really make much difference: NO
Does it _*FEEL*_ like it does: YES

*lol*


----------



## ZealotKi11er

Quote:


> Originally Posted by *nrpeyton*
> 
> In less graphically demanding games I run at 800mv (0.8v) & lower clock speed.
> 
> In _more_ demanding games at 4k when I'm seeing 30-40 FPS I like to feel that it's overclocked to the knife edge just inside the last inch of stability. (I like to know it is performing at it's maximum possible capability (without being artificially and unnecessarily throttled)!


I have mine set to 1860/0.9V for all games. I have a 2025/1V profile which I use for BF1.


----------



## KedarWolf

Quote:


> Originally Posted by *Djreversal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> What we are saying is CTRL F, click the mouse on the voltage line, the voltage point you want, and drag it up.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> yes i have all the voltage points at the graph that i want.. if u notice the green dotted line.. that will never go past 1060mv no matter what i do.
> 
> https://i.imgur.com/Ha3w7rl.png
Click to expand...

Did you hit L or CTRL L and lock the point by mistake?

It should be a red vertical line if not locked.


----------



## Dasboogieman

Quote:


> Originally Posted by *Djreversal*
> 
> yes i have all the voltage points at the graph that i want.. if u notice the green dotted line.. that will never go past 1060mv no matter what i do.
> 
> https://i.imgur.com/Ha3w7rl.png


Have you run the NVSMI .bat file? It overrides the standard curve control and basically forces your card to always boost to the highest (i.e. 1.093V) bin.


----------



## Thoth420

I think I just built a behemoth system to game on, only to discover that I may indeed have epilepsy, or at the least a severe sensitivity to on-screen motion and light changes, since being taken off the meds (Xanax) I'd been on for the past 7 years. Any first-person game makes me feel very strange after even a few moments.

I had been playing games on my Switch since I got out of the hospital, but I don't own anything first-person on it. Oddly, Mario + Rabbids has lots of flashing graphical effects but doesn't seem to bother me at all, nor does Zelda, which also does. No seizures so far, which I am told I'm at risk for anyway, but it would be a shame if this issue doesn't subside.

The 1080 Ti on the Predator X34 is a monster, though.

Moral of the story: never take drugs... not even from your doctor.


----------



## KedarWolf

Quote:


> Originally Posted by *Djreversal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> What we are saying is CTRL F, click the mouse on the voltage line, the voltage point you want, and drag it up.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> yes i have all the voltage points at the graph that i want.. if u notice the green dotted line.. that will never go past 1060mv no matter what i do.
> 
> https://i.imgur.com/Ha3w7rl.png
Click to expand...

Try running this in an admin command prompt, with the quotation marks included:

"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -acp UNRESTRICTED


----------



## 8051

Is 1.094V a safe voltage for 1080Ti's? Or only for 1080Ti's under full coverage water blocks?

I used the XOC VBIOS on my air-cooled Gigabyte 1080 Ti, applied an offset to the frequency, and then noticed it had boosted the voltage up to 1.094V, even though I had the voltage slider all the way down at the bottom.


----------



## TheBoom

Quote:


> Originally Posted by *8051*
> 
> Is 1.094V a safe voltage for 1080Ti's? Or only for 1080Ti's under full coverage water blocks?
> 
> I used the XOC VBIOS on my gigabyte air-cooled 1080ti and I applied an offset to the frequency and then noticed it had boosted the voltage up to 1.094V! Even though I had the voltage slider all the way down at the bottom.


1.093V is perfectly safe. It's the maximum allowed voltage under default BIOSes.

The XOC BIOS allows up to 1.193V though. That's not safe for air-cooled cards.

If you want to use a lower voltage with the XOC you will need to use the curve, or it will always boost upwards of 1.093V. The highest I've seen it go by itself is 1.125V though.


----------



## Heruur

Shuttle XPC SFF build with 1080Ti SC2


----------



## 8051

Quote:


> Originally Posted by *Heruur*
> 
> Shuttle XPC SFF build with 1080Ti SC2


That is a compact build. Is that a custom PSU on the right?


----------



## Beagle Box

Quote:


> Originally Posted by *Thoth420*
> 
> I think I just built a behemoth system to game on mostly only to discover that I may indeed have epilepsy or at the least have a severe sensitivity to on screen motion and light change since I have been taken off the meds(xanax) I have been on for the past 7 years. Any first person game makes me feel very strange after even a few moments
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I had been playing games on my Switch since I got out of the hospital but I don't own anything on that which is first person. Oddly Mario + Rabbids has lots of flashing graphical effects but that doesn't seem to bother me at all nor does Zelda which also does. No seizures so far which I am told I am at risk for anyways but it would be a shame if this issue doesn't subside.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The 1080Ti on the Predator x34 is a monster though.
> 
> Moral of the story: never take drugs...not even from your doctor


You are experiencing simulator sickness. Your eyes are telling your body you are moving, but your inner ear says you are sitting still. Very common.

Sometimes enabling motion blur helps. Or learning to intentionally blur your eyes during horizontal panning.


----------



## CrazyElf

Quote:


> Originally Posted by *Blameless*
> 
> I always max out power limit, really no reason not to. However, maxing out voltage is completely counter productive on an unmodded part...you are either going to hit a power limit or a temperature limit before significantly more voltage becomes beneficial. If you aren't using the frequency/voltage curve editor for some reason, using the voltage and clock offset sliders can tune the clock speed with more control than just the clock slider, but there is still almost no reason to max it out.
> 
> As for benchmarks, Superposition 8k is great for pushing power limit, Timespy finds unstable GPU clocks faster than almost anything, and FFXIV Stormblood is good for general stability.
> 
> You can also use MemtestG80, running 6-8 instances, to quickly test memory stability. It's especially nice because it gives an objective error count and doesn't rely on one needing to see artifacts or get crashes before noticing instability. The downside is that as a CUDA app you need to start the test then increase the memory clock offset by another 500 to get gaming clocks, as the compute memory clock is 500 below the full load gaming API clocks.
> They are hugely temperature sensitive and finding the temperature I can comfortably run, worst case scenario, is my first step when I begin to fine tune 24/7 stable clocks.


Thanks +Rep

I'm wondering if, in cases where the cards are hitting the power limit, it is more productive to undervolt slightly.

If you think about it, if power limit is the bottleneck, we ought to be undervolting aggressively to see what the bare minimum for stability is.

There have been some interesting results:

https://www.reddit.com/r/614w1u/my_interesting_1080ti_undervolting_results/

A few of the MSI Lightning sensors are read by HWinfo. All of the FTW3 sensors are visible.


Spoiler: Warning: Spoiler!







Thanks + Rep.

I think there is something to EVGA's claim about monitoring - their ICX Suite.






I suspect that it won't be long before everyone copies it.

Quote:


> Originally Posted by *Beagle Box*
> 
> You are experiencing simulator sickness. Your eyes are telling your body you are moving, but your inner ear says you are sitting still. Very common.
> 
> Sometimes enabling motion blur helps. Or learning to intentionally blur your eyes during horizontal panning.


Yes, motion blur helps.

Some games have mods that might help: https://www.nexusmods.com/skyrimspecialedition/mods/2742/?

None of these are surefire solutions, but they can all help.


----------



## Thoth420

Quote:


> Originally Posted by *Beagle Box*
> 
> You are experiencing simulator sickness. Your eyes are telling your body you are moving, but your inner ear says you are sitting still. Very common.
> 
> Sometimes enabling motion blur helps. Or learning to intentionally blur your eyes during horizontal panning.


Thanks for the advice. It's a hard thing to adjust to, and I had blur settings disabled in all my games before, so I will try changing those.


----------



## Beagle Box

Quote:


> Originally Posted by *Thoth420*
> 
> Thanks for the advice it is a hard thing to adjust to and I had all my games with blur settings disabled prior so I will try to change those.


There are other 'tricks' as well. Some people are just fine if they have a small fan blowing air in their face. I've heard also that you can slightly rock your chair.

Some games are just bad, though, because the walking gait in first person is just too pronounced.

You can probably find more tips by just Googling 'simulator sickness'.

Good luck with it.


----------



## Thoth420

Quote:


> Originally Posted by *Beagle Box*
> 
> There are other 'tricks' as well. Some people are just fine if they have a small fan blowing air in their face. I've heard also that you can slightly rock your chair.
> 
> Some games are just bad, though, because the walking gait in first person is just too pronounced.
> 
> You can probably find more tips by just Googling 'simulator sickness.
> 
> Good luck with it.


Will do and cheers.


----------



## NeedMoreJuice

Wanted to get the EK CoolStream PE 120 Radiator and the EK-CryoFuel Premix, BUT the water block of the 1080 Ti Poseidon is made out of copper, while the radiator uses ALUMINIUM, copper and brass...

I read that copper and aluminium create galvanic corrosion, so... what are my copper 120mm radiator options... ? Or can I "safely" use that combo if I regularly (every 6 months) change the coolant?


----------



## ZealotKi11er

Quote:


> Originally Posted by *NeedMoreJuice*
> 
> Wanted to get the EK CoolStream PE 120 Radiator and the EK-CryoFuel Premix, BUT the water block of the 1080 Ti Poseidon is made out of copper, while the radiator uses ALUMINIUM, copper and brass...
> 
> I read that copper and aluminium create galvanic corrosion, so... what are my copper 120mm radiator options... ? Or can I "safely" use that combo if I regularly (every 6 months) change the coolant?


You are good. Brass will not corrode with a copper block. What you do not want is an aluminum core in the rad.


----------



## NeedMoreJuice

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You are good. Brass will not corrode with copper block. You do not want aluminum core for the RAD.


If I understood you right, the EK PE 120mm radiator is safe to use together with the copper block of the card, correct? (Because it does not have an aluminium core, even though it is made out of copper, brass and aluminium?)

Excuse me for asking the same question differently; it's just overwhelming and confusing, and I don't want to destroy the block of the graphics card.


----------



## ZealotKi11er

Quote:


> Originally Posted by *NeedMoreJuice*
> 
> If I understood you right the EK PE 120mm Radiator is safe to use together with the copper block of the card, correct? (Because it does not have an aluminium core, although it is being made out of copper, brass and aluminium?).
> 
> Excuse me for asking the same question differently, it's just overwhelming and confusing, and I don't want to destroy the block of the graphics card.


The only stuff you do not want to use is: EK-ALUSTREAM SE 240


----------



## NeedMoreJuice

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The only stuff you do not want to use is: EK-ALUSTREAM SE 240


All right, thank you


----------



## becks

Quote:


> Originally Posted by *Clukos*
> 
> Is the noise about Nvidia not performing as well in Dx12 as in Dx11 just bs? I just tried with latest drivers, latest Windows 10 update in The Division and I'm getting significantly better results with Dx12, especially in frametimes...
> 
> Dx11
> 
> 
> Dx12


What are you using for that overlay? Thanks.


----------



## Clukos

Quote:


> Originally Posted by *becks*
> 
> What are you using for that overlay ? Thanks.


MSI Afterburner + RTSS, latest beta (I think it should be 16 for afterburner and 27 for RTSS).


----------



## velocityx

Just like the poster above, I tried DX12 with BF1's new Tsar DLC and I get around 10-12 more fps at 1440p, all maxed out, compared to DX11. Didn't check the vanilla maps.


----------



## Blameless

Quote:


> Originally Posted by *CrazyElf*
> 
> I'm wondering if it is more productive in some cases where the cards are hitting the power limit to consider undervolting slightly.
> 
> If you think about it, if power limit is the bottleneck, we ought to be undervolting aggressively to see what the bare minimum for stability is.


I'm undervolting significantly vs. what my part asks for at the 2025MHz I have it set to at the temperatures I'm able to run. 1.031v is the lowest I go without Timespy failing intermittently. If I don't use a voltage curve and just set a clock offset, the card sets 1.062v or 1.075v to level off at that same 2025Mhz, which doesn't hit any power limits in lighter tests, but is not workable for maintaining stable clocks in anything really demanding.

In most uses, my sample, at 2025MHz core/5940 memory/1.031v, doesn't report any limiters active, ever. However, I can still prompt the power limiter (at 375w...250w TDP +50% slider on the F3P firmware for my Aorus) in some synthetic stress tests as well as see it touch on it in 8k Superposition (4k doesn't cut it).

With regard to memory stability, MemtestG80 is still proving faster at finding errors than about anything I've tried. However, since NVIDIA's drivers are falling back to a low power state in any CUDA app, one has to edit hidden profile settings to disable this 'feature'. I use NVIDIA Profile Inspector 2.1.3.10 for this. Makes things much more convenient than having to set a higher memory offset after starting the tests and having to remember to revert it before the tests stop.
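For anyone wondering what "using the frequency/voltage curve" to undervolt actually amounts to, here's a rough Python sketch of the idea. The numbers are hypothetical and the function is only an illustration of what the Afterburner curve editor effectively does when you drag one point up and everything to the right gets flattened; it is not Afterburner's actual code:

```python
def undervolt_curve(curve, target_mv, target_mhz):
    """Flatten a V/F curve for an undervolt.

    `curve` maps millivolts -> MHz. Every point at or above `target_mv`
    is clamped to `target_mhz`, so the card holds that clock without
    ever requesting more voltage. Illustrative sketch only, not real
    Afterburner/driver code.
    """
    return {mv: (target_mhz if mv >= target_mv else min(mhz, target_mhz))
            for mv, mhz in curve.items()}

# Hypothetical stock curve segment (mV -> MHz), not dumped from a real card.
stock = {1000: 1949, 1031: 1974, 1062: 2000, 1093: 2025}
flat = undervolt_curve(stock, target_mv=1031, target_mhz=2025)
print(flat)  # every point from 1031 mV upward now sits at 2025 MHz
```

The effect is the same as described above: the clock gets held at the lower voltage point instead of the 1.062-1.093V the stock curve would ask for.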


----------



## ZealotKi11er

Quote:


> Originally Posted by *Blameless*
> 
> I'm undervolting significantly vs. what my part asks for at the 2025MHz I have it set to at the temperatures I'm able to run. 1.031v is the lowest I go without Timespy failing intermittently. If I don't use a voltage curve and just set a clock offset, the card sets 1.062v or 1.075v to level off at that same 2025Mhz, which doesn't hit any power limits in lighter tests, but is not workable for maintaining stable clocks in anything really demanding.
> 
> In most uses, my sample, at 2025MHz core/5940 memory/1.031v, doesn't report any limiters active, ever. However, I can still prompt the power limiter (at 375w...250w TDP +50% slider on the F3P firmware for my Aorus) in some synthetic stress tests as well as see it touch on it in 8k Superposition (4k doesn't cut it).
> 
> With regard to memory stability, MemtestG80 is still proving faster at finding errors than about anything I've tried. However, since NVIDIA's drivers are falling back to a low power state in any CUDA app, one has to edit hidden profile settings to disable this 'feature'. I use NVIDIA Profile Inspector 2.1.3.10 for this. Makes things much more convenient than having to set a higher memory offset after starting the tests and having to remember to revert it before the tests stop.


Yeah no clue why for CUDA we always have to change the power state to get full memory speed.


----------



## Blameless

Probably something they stuffed in there for power/stability reasons, or to make Quadros look better in CUDA benchmarks vs. GeForce.


----------



## NeedMoreJuice

Can anyone explain to me what the heck is going on with this? (Everything else is copied from Jayztwocents.)

I need +110 on the CC to reach 2101MHz, while it drops to a constant 2088MHz.


Spoiler: Warning: Spoiler!







Here with +75 on the CC, as you can see, I am at 2063MHz. Why is that?


Spoiler: Warning: Spoiler!







Here the Jayz video 




*EDIT*

It went to 2114 with +110, then dropped to 2101 and then to 2088 after some minutes.


Spoiler: Warning: Spoiler!


----------



## ZealotKi11er

Quote:


> Originally Posted by *NeedMoreJuice*
> 
> Can anyone explain me what the heck is going on with this? (everything else is copied from Jayztwocents)
> 
> I need +110 on the CC to reach 2101mhz, while it drops to constant 2088Mhz
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Here with +75 on the CC, as you can see I am at 2063Mhz, why is that?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Here the Jayz video
> 
> 
> 
> 
> *EDIT*
> 
> It went to 2114 with +110 and dropped to 2101 and then to 2088 after some minutes.
> 
> 
> Spoiler: Warning: Spoiler!


It drops 13MHz every 10°C or so.


----------



## NeedMoreJuice

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It drops +13MHz every 10C or so.


No no, not the drop. I am aware that Nvidia has these strange 13MHz step increments. I mean that I get 2063MHz with +75, while Jay gets 2088MHz. His runs at 1974MHz by default, which would explain that, but why does mine run at 1963MHz no matter the temperature?

I mean, yes, it is the silicon lottery, but it's hella strange. Ufff... and until the watercooling parts arrive I don't know how to feel right now.


----------



## ZealotKi11er

Quote:


> Originally Posted by *NeedMoreJuice*
> 
> Nono, not the drop. I am aware that Nvidia has this strange +13 step increments. I mean that I get 2063Mhz with +75, while Jay gets 2088Mhz. His runs at default at 1974Mhz which would explain that, but why does mine run at 1963Mhz no matter the temperature.
> 
> I mean, yes, it is silicon lottery, but it's hella strange. Ufff... and until the watercooling parts arrive I don't know how to feel right now.


Nope, it's all temperatures. If both cards have the same boost clock, then both should hit the same clock speed at the same temp, if both are stable. For example, at 42C you can have a 2101MHz OC, but at 52C it's 2088MHz, and at 62C it's 2075MHz. Not sure if your card has different modes for different clock speeds.
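That stepping is easy to model. A tiny sketch using the rule-of-thumb numbers from this thread (roughly one 13MHz bin per 10°C; this is observed behaviour, not an official NVIDIA spec):

```python
def boost_at_temp(clock_at_base_mhz, base_temp_c, temp_c, step_mhz=13, bin_c=10):
    """Approximate GPU Boost thermal stepping: the effective clock drops
    one ~13 MHz bin for every ~10 C above the baseline temperature.
    Rule of thumb from this thread, not an official spec."""
    bins = max(0, (temp_c - base_temp_c) // bin_c)
    return clock_at_base_mhz - bins * step_mhz

print(boost_at_temp(2101, 42, 42))  # 2101
print(boost_at_temp(2101, 42, 52))  # 2088
print(boost_at_temp(2101, 42, 62))  # 2075
```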


----------



## NeedMoreJuice

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Nope. It all temperatures. If both cards have same Boost Clock than both should hit same clock speed at same temp if both are stable. For example at 42C you can have 2101MHz OC but at 52C its 2088MHz and 62C its 2075MHz. Not sure if your card has different modes for different clock speeds.


Hm... if it's all temperature based, that includes the idle base temp too, right?
With that in mind: before I tested this card I had another one in the system that idled at 29°C, while this one idles at 34°C. The first card did not like anything above +70; it hit 2088 but couldn't keep it, so it dropped to 2076. This one can hit 2114, although it drops to 2088.
So even though it's a new card, I think I will change the thermal paste; they seem to have messed it up if it runs 5°C worse. The OC profile, room temp and case fan speeds didn't change, so it can only be bad thermal paste. Thermal Grizzly Kryonaut made quite an impact on my friend's card and on my 980 Ti, even on air.


----------



## 8051

I've been using the Strix XOC VBIOS on my Gigabyte Gaming OC 1080 Ti (thanks Phobos223) and noticed that if I set a frequency with the frequency slider in Precision XOC, it'll keep the frequency but will change the voltage. For example, at +84MHz it'll start at 1.063V, then go up to 1.075V, 1.081V or 1.093V. Sometimes it'll even drop back down from 1.093V to 1.081V.


----------



## lilchronic

Quote:


> Originally Posted by *Blameless*
> 
> I'm undervolting significantly vs. what my part asks for at the 2025MHz I have it set to at the temperatures I'm able to run. 1.031v is the lowest I go without Timespy failing intermittently. If I don't use a voltage curve and just set a clock offset, the card sets 1.062v or 1.075v to level off at that same 2025Mhz, which doesn't hit any power limits in lighter tests, but is not workable for maintaining stable clocks in anything really demanding.
> 
> In most uses, my sample, at 2025MHz core/5940 memory/1.031v, doesn't report any limiters active, ever. However, I can still prompt the power limiter (at 375w...250w TDP +50% slider on the F3P firmware for my Aorus) in some synthetic stress tests as well as see it touch on it in 8k Superposition (4k doesn't cut it).
> 
> With regard to memory stability, MemtestG80 is still proving faster at finding errors than about anything I've tried. However, since NVIDIA's drivers are falling back to a low power state in any CUDA app, one has to edit hidden profile settings to disable this 'feature'. I use NVIDIA Profile Inspector 2.1.3.10 for this. Makes things much more convenient than having to set a higher memory offset after starting the tests and having to remember to revert it before the tests stop.


For MemtestG80 do you run default settings?


----------



## Blameless

Quote:


> Originally Posted by *lilchronic*
> 
> For MemtestG80 do you run default settings?


I run eight 1024MiB instances for 1000 loops if I want to be really confident of memory stability. Takes about 60-90 minutes.

"MemtestG80.exe 1024 1000"

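If you'd rather not launch the eight copies by hand, a small shell loop does it. This is a dry run by default, since the executable name/path is whatever you unpacked it to; point MEMTEST at your copy and uncomment the launch lines to actually run it:

```shell
#!/bin/sh
# Launch eight 1024 MiB MemtestG80 instances, 1000 loops each.
# Dry run: only prints the commands. Set MEMTEST to your actual
# executable and uncomment the launch/wait lines to really run it.
MEMTEST="./MemtestG80.exe"
INSTANCES=8
for i in $(seq 1 "$INSTANCES"); do
    echo "instance $i: $MEMTEST 1024 1000"
    # "$MEMTEST" 1024 1000 &
done
# wait
```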

----------



## nrpeyton

Quote:


> Originally Posted by *Thoth420*
> 
> I think I just built a behemoth system to game on mostly only to discover that I may indeed have epilepsy or at the least have a severe sensitivity to on screen motion and light change since I have been taken off the meds(xanax) I have been on for the past 7 years. Any first person game makes me feel very strange after even a few moments
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I had been playing games on my Switch since I got out of the hospital but I don't own anything on that which is first person. Oddly Mario + Rabbids has lots of flashing graphical effects but that doesn't seem to bother me at all nor does Zelda which also does. No seizures so far which I am told I am at risk for anyways but it would be a shame if this issue doesn't subside.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The 1080Ti on the Predator x34 is a monster though.
> 
> Moral of the story: never take drugs...not even from your doctor


Xanax

wow.
I know people who have missed entire days on those.

You're right. Don't go down that street. Very dangerous.


----------



## Thoth420

Quote:


> Originally Posted by *nrpeyton*
> 
> Xanax
> 
> wow.
> I know people who have missed entire days on those.
> 
> You're right. Don't go down that street. Very dangerous.


Yeah, I was on it for about 7 years and the stuff is no joke. I finally feel alive again.
I hope my vision goes back to normal soon, because gaming is kind of a major hobby, obviously... I've spent about 4 grand on this new hardware already.


----------



## Dasboogieman

Quote:


> Originally Posted by *Blameless*
> 
> I'm undervolting significantly vs. what my part asks for at the 2025MHz I have it set to at the temperatures I'm able to run. 1.031v is the lowest I go without Timespy failing intermittently. If I don't use a voltage curve and just set a clock offset, the card sets 1.062v or 1.075v to level off at that same 2025Mhz, which doesn't hit any power limits in lighter tests, but is not workable for maintaining stable clocks in anything really demanding.
> 
> In most uses, my sample, at 2025MHz core/5940 memory/1.031v, doesn't report any limiters active, ever. However, I can still prompt the power limiter (at 375w...250w TDP +50% slider on the F3P firmware for my Aorus) in some synthetic stress tests as well as see it touch on it in 8k Superposition (4k doesn't cut it).
> 
> With regard to memory stability, MemtestG80 is still proving faster at finding errors than about anything I've tried. However, since NVIDIA's drivers are falling back to a low power state in any CUDA app, one has to edit hidden profile settings to disable this 'feature'. I use NVIDIA Profile Inspector 2.1.3.10 for this. Makes things much more convenient than having to set a higher memory offset after starting the tests and having to remember to revert it before the tests stop.


Wait what settings did you modify to stop the VRAM downclock?


----------



## mksteez

Which brand is the one to buy right now?


----------



## Rollergold

Quote:


> Originally Posted by *mksteez*
> 
> Which brand is the one to buy right now?


EVGA, Asus, MSI and Gigabyte are all good brands. I'm partial to EVGA because of their watercooling-friendly warranty (no 'warranty void if removed' stickers over the screws holding on the cooler/backplate).


----------



## Dasboogieman

Quote:


> Originally Posted by *Rollergold*
> 
> EVGA, Asus MSI and Gigabyte are good brands. I'm partial to EVGA because of their Water-cooling friendly warranty (no warranty void if removed stickers over the screws holding on the cooler/backplate).


Gigabyte is also the same with respect to watercooling, no warranty void stickers. However, you will be riding the roulette of Gigabyte's legendary QC.


----------



## Blameless

Quote:


> Originally Posted by *Dasboogieman*
> 
> Wait what settings did you modify to stop the VRAM downclock?




Just make sure "CUDA - Force P2 State" is set to off in the global driver profile.


----------



## KedarWolf

Quote:


> Originally Posted by *Blameless*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dasboogieman*
> 
> Wait what settings did you modify to stop the VRAM downclock?
> 
> 
> 
> 
> 
> Just make sure "CUDA - Force P2 State" is set to off in the global driver profile.
Click to expand...

I downloaded that version of Inspector but it has no such setting.









Not using latest drivers though, could that be why?


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Blameless*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dasboogieman*
> 
> Wait what settings did you modify to stop the VRAM downclock?
> 
> 
> 
> 
> 
> Just make sure "CUDA - Force P2 State" is set to off in the global driver profile.
> 
> Click to expand...
> 
> I downloaded that version of Inspector but has no such setting.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not using latest drivers though, could that be why?
Click to expand...

Oh wait, ran it as Admin, problem solved.


----------



## Blameless

Quote:


> Originally Posted by *KedarWolf*
> 
> Oh wait, ran it as Admin, problem solved.


Didn't even think to check this. The admin account is the only account on my stripped down primary/gaming system (yes, I know it has risks).

Good to know though.


----------



## shalafi

Quote:


> Originally Posted by *Clukos*
> 
> MSI Afterburner + RTSS, latest beta (I think it should be 16 for afterburner and 27 for RTSS).


Does the beta give you the CPU temp reading, or do you still need to have HWINFO running and feeding that data to RTSS?


----------



## Dasboogieman

Quote:


> Originally Posted by *Blameless*
> 
> Didn't even think to check this. The admin account is the only account on my stripped down primary/gaming system (yes, I know it has risks).
> 
> Good to know though.


I just tested it, but I dunno how useful MemtestG80 is anymore. The GDDR5X ECC is extremely efficient: I can only force uncorrectable errors with VRAM speeds above 6408MHz (+900), but my go-to ultimate reliability test, FFXIV at 4K, will catch reliability issues as early as 6226MHz (+720) within 20 seconds.

I'm also using the GTX 970 VRAM Bandwidth test to find the ideal VRAM straps.


----------



## The Viper

New 1080ti owner...

Just out of curiosity: if you're not going for 2000+ core clocks, I'm assuming raising the power limits is pointless?

I'm running around 1900 core @ 0.900V.


----------



## Agent-A01

Quote:


> Originally Posted by *The Viper*
> 
> New 1080ti owner...
> 
> Just out of curiosity, if your not going for 2000+ core clocks, I'm assuming raising the power limits is pointless?
> 
> I'm running around 1900 core @.900


You'll still hit the power limit at those clocks.
Default is around ~250W.


----------



## Blameless

Quote:


> Originally Posted by *Dasboogieman*
> 
> I just tested it but I dunno how useful G80 Memtest is anymore. The GDDR5x ECC is extremely efficient. Like I can only force uncorrectable errors with VRAM speeds above 6408mhz (+900) but my go-to ultimate reliability test 4K FFXIV will catch reliability issues as early as 6226mhz (+720) within 20 seconds.
> 
> I'm also using the GTX 970 VRAM Bandwidth test to find the ideal VRAM straps.


Does consumer Pascal even have any ECC enabled? Querying the card with nvidia-smi seems to suggest not, but maybe it's not being reported correctly.

In my experience, eight instances (this is important, because it takes at least 4-6 to saturate the memory controller) of MemtestG80 will find unstable memory on my part much faster than any test I've tried. I don't have FFXIV itself, but I do use the FFXIV Stormblood benchmark in a loop to stability test... at ~65C, 6048 memory will produce errors in MemtestG80 inside 5-10 minutes, but I can loop the Stormblood bench at 4K for several hours before running into issues.

Temperature is also important. MemtestG80 will not pull as much power as graphically intensive tests, so pump/fan speeds need to be reduced to make sure the memory is hitting worst case temperatures.

Regardless, ensuring no errors are reported _and_ that performance is still scaling correctly with increased memory clocks should be a very strong indicator that the memory is stable and that any ECC, if present, isn't confusing things.
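For anyone who wants to check their own card, the query is just this (needs the driver's nvidia-smi on your PATH or an absolute path to it; consumer GeForce cards typically report N/A for both ECC modes):

```shell
nvidia-smi -q -d ECC
```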


----------



## ZealotKi11er

Quote:


> Originally Posted by *The Viper*
> 
> New 1080ti owner...
> 
> Just out of curiosity, if your not going for 2000+ core clocks, I'm assuming raising the power limits is pointless?
> 
> I'm running around 1900 core @.900


You want the power limit at 120% so that you never hit the 100% limit. At 0.9V it's hard to hit 250W, but it can happen.


----------



## 8051

Thanks to whoever suggested that a 1080 Ti may uncover instabilities in my CPU overclock; that seemed to be what my problem was in Dying Light -- but in no other game or benchmark. I still find it hard to believe, but I have noticed that CPU usage increases drastically over my 980 Ti when I'm getting my maximum framerates with the 1080 Ti.


----------



## TheRedViper

Just installed the block on one of the 5 Galax 1080 Ti 8Pack limited edition cards.


----------



## Dasboogieman

Quote:


> Originally Posted by *Blameless*
> 
> Does consumer pascal even have any ECC enabled? Querying the card with nvidia-SMI seems to suggest not, but maybe it's not being reported correctly.
> 
> In my experience, eight instances (this is important, because it takes at least 4-6 to saturate the memory controller) of MemtestG80 will find unstable memory on my part much faster than any test I've tried. I don't have FFIV, but I do use the FFXIV Stormblood benchmark in a loop to stability test...at ~65C, 6048 memory will produce errors in MemtestG80 inside 5-10 minutes, but I can loop Stormblood bench at 4k for several hours before running into issues.
> 
> Temperature is also important. MemtestG80 will not pull as much power as graphically intensive tests, so pump/fan speeds need to be reduced to make sure the memory is hitting worst case temperatures.
> 
> Regardless, ensuring no errors are reported _and_ that performance is still scaling correctly with increased memory clocks should be a very strong indicator that the memory is stable and that any ECC, if present, isn't confusing things.


There is definitely a sort of error compensation, but not true ECC. The bus can detect errors and re-transmit.

I'll try more instances of G80 and see. The FFXIV section I use is the ocean scene from Swiftperch. Just stand there and look at it; the GPU will crap itself if there is any instability.


----------



## 2ndLastJedi

Quote:


> Originally Posted by *shalafi*
> 
> Does the beta give you the CPU temp reading, or do you still need to have HWINFO running and feeding that data to RTSS?


I can't get CPU readings at all, even with HWiNFO running! Can someone please give a little help?


----------



## Blameless

Quote:


> Originally Posted by *Dasboogieman*
> 
> There is definitely a sort of Error compensation but not true ECC. Like the bus can detect errors and re-transmit.


Sounds rather like the memory EDC on Hawaii, though there was a counter for these that software could read.


----------



## Dasboogieman

Quote:


> Originally Posted by *Blameless*
> 
> Sounds rather like the memory EDC on Hawaii, though there was a counter for these that software could read.


Yeah, MemtestCL was able to pick this up on Hawaii, but I don't have the coding expertise to port it to CUDA.


----------



## Clukos

More Dx11 vs Dx12... 1440p in RotTR Very High preset in Geothermal Valley

GPU bound

Dx11


Dx12


CPU bound

Dx11


Dx12


Better all around; not sure if MS fixed DX12, or Nvidia's drivers, or whatever. DX12, when implemented correctly, performs better. DICE and Frostbite seem to be the worst offenders when it comes to DX12 performance; maybe something's up with their implementation.


----------



## velocityx

Quote:


> Originally Posted by *Clukos*
> 
> . Dice and Frostbite seem to be the worst offenders when it comes to Dx12 performance, maybe somethings up with their implementation.


I checked yesterday: DX12 on the vanilla BF1 maps is stuttery, but those new Tsar maps are very good in DX12. Makes me wonder if it has to do with particular map assets in vanilla BF1 not being made with DX12 in mind.


----------



## jaki

Hi,

finally got a 1080 Ti, namely the Gainward Phoenix Golden Sample.

I'm not interested in overclocking, so I'm currently undervolting it with MSI Afterburner.
Right now I have it running stable at 1924MHz, 0.931V, 68°C at 47% fan.
Can you tell from these values what to do next? Is it a good "Golden Sample"?









edit: Forgot to mention Unigine Heaven is running all the time right now.
edit2: The voltage curve in MSI Afterburner is a *****, it's so difficult to set up.


----------



## The Viper

Quote:


> Originally Posted by *jaki*
> 
> Hi,
> 
> finally got a 1080 TI namely the Gainward Phoenix Golden Sample.
> 
> I'm not interested in overlocking so I'm currently undervolting it with MSI Afterburner.
> Right now I have it running at 1924MHz, 0,931V, 68°C at 47% Fan stable.
> Can you tell from these values what to do next? Is it a good "Golden Sample"?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit: Forgot to mention Unigine Heaven is running all the time right now
> edit2: The voltage curve in MSI Afterburner is a ***** it's so difficult to set up


You're in the same boat as me; I'm not trying to push the clocks either, I'm only interested in undervolting. Right now I'm running about [email protected] and haven't pushed any lower than that.

I've found the voltage curve super simple to work with; it's probably the easiest overclocking/undervolting I've ever experienced. KedarWolf made a great video on how to set up a voltage curve.


----------



## NeedMoreJuice

I'm curious, since I've never thought of it before, but can the chip itself account for a temperature difference?

Both Poseidons (tested under the same conditions and ALONE in the case) differ by about 5°C.
I changed the paste on the hotter card and the temp barely dropped, about 1.5°C.

Is it simply the chip? (I had to screw, unscrew, and adjust the cooler a couple of times though, so I'm not sure if I lost 1°C or so; I may redo the paste application in a couple of days, just to be sure.)


----------



## Blameless

Quote:


> Originally Posted by *NeedMoreJuice*
> 
> I'm curious since I've never thought of it before, but can the chip itself be a factor for a different temperature?
> 
> Both Poseidons (tested under the same conditions and ALONE in the case) differ in about about 5~°C
> I've changed the paste of the hotter card and the temp barely dropped about 1.5°C
> 
> Is it simply the chip? (I had to screw and unscrew and adjust the cooler a couple of times though, so not sure if I lost 1°C or so, I may redo the application of the paste in a couple of days again, just to be sure).


The average leakage of an IC will vary from sample to sample and this affects the clocks that can be reached, the voltage needed, and the power consumption (which is also heat production) at given clocks/volts.

Even with proper matching of VIDs to leakage, some parts will just run hotter than others doing the same things.
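A toy model of this, if it helps (all coefficients are invented, nothing here is measured): total power is roughly static leakage plus dynamic switching power, and only the leakage term varies much between samples.

```python
# Toy model of why two identical-spec chips can run at different temps:
# total power ~= static leakage + dynamic switching power (~ C * f * V^2),
# and leakage varies chip to chip. Coefficients are made up.

def chip_power(freq_mhz, volts, leak_coeff, cap_coeff=1.0e-7):
    static = leak_coeff * volts                         # leakage grows with voltage
    dynamic = cap_coeff * freq_mhz * 1e6 * volts ** 2   # ~ C * f * V^2
    return static + dynamic

low_leak = chip_power(1924, 0.931, leak_coeff=20.0)
high_leak = chip_power(1924, 0.931, leak_coeff=45.0)
# Same clocks, same voltage, but the leakier sample dissipates more
# watts, so it runs hotter under an identical cooler.
```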


----------



## NeedMoreJuice

Quote:


> Originally Posted by *Blameless*
> 
> The average leakage of an IC will vary from sample to sample and this affects the clocks that can be reached, the voltage needed, and the power consumption (which is also heat production) at given clocks/volts.
> 
> Even with proper matching of VIDs to leakage, some parts will just run hotter than others doing the same things.


Ah, all right thanks


----------



## 8051

Quote:


> Originally Posted by *Blameless*
> 
> The average leakage of an IC will vary from sample to sample and this affects the clocks that can be reached, the voltage needed, and the power consumption (which is also heat production) at given clocks/volts.
> 
> Even with proper matching of VIDs to leakage, some parts will just run hotter than others doing the same things.


It doesn't seem to matter much; the difference between the best-overclocking 1080 Tis and the worst seems to be a matter of a few percent, and none of them overclock like the Maxwells or Keplers did.


----------



## KedarWolf

Quote:


> Originally Posted by *NeedMoreJuice*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Blameless*
> 
> The average leakage of an IC will vary from sample to sample and this affects the clocks that can be reached, the voltage needed, and the power consumption (which is also heat production) at given clocks/volts.
> 
> Even with proper matching of VIDs to leakage, some parts will just run hotter than others doing the same things.
> 
> 
> 
> Ah, all right thanks
Click to expand...

I think you mentioned two video cards.

If you are running two in SLI, the top card will always run hotter than the bottom card due to the limited space for the top card's fans and the heat generated by both cards.


----------



## NeedMoreJuice

Quote:


> Originally Posted by *KedarWolf*
> 
> I think you mention two video cards.
> 
> If you are running two in SLI the top card will always run hotter than the bottom card due to the limited space for the fans on the top card and the heat generated from both cards.


Quote:


> Originally Posted by *NeedMoreJuice*
> 
> Both Poseidons (tested under the same conditions and ALONE in the case) differ in about about 5~°C


^^


----------



## KedarWolf

Quote:


> Originally Posted by *NeedMoreJuice*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I think you mention two video cards.
> 
> If you are running two in SLI the top card will always run hotter than the bottom card due to the limited space for the fans on the top card and the heat generated from both cards.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *NeedMoreJuice*
> 
> Both Poseidons (tested under the same conditions and ALONE in the case) differ in about about 5~°C
> 
> Click to expand...
> 
> ^^
Click to expand...

Oh I see. I missed that.

Replacing the thermal paste on both would be an interesting experiment or did someone already suggest that?


----------



## NeedMoreJuice

Quote:


> Originally Posted by *KedarWolf*
> 
> Oh I see. I missed that.
> 
> Replacing the thermal paste on both would be an interesting experiment or did someone already suggest that?


Just wasted my whole day doing that; the screws were a pain to remove and reinstall, and the two side screws (the ones near the display connectors) shifted the PCB and backplate and vice versa. After a while I was able to get everything screwed back on.

The first time, the temp dropped about 1.5°C, from a constant 35°C to 33~34°C, although for some reason my OC only applied 2088 instead of 2114 MHz; don't know what that's about. The Thermal Grizzly Kryonaut seems to warm up instantly compared to the glue-like gunk Asus used.

The second paste application increased the temp to 36°C... The resistors around the chip got smudged and the paste spread right into the corner of the GPU. I only had 70% ethanol, so I guess I'm getting 99% isopropyl from the pharmacy tomorrow. Paper towel helped most in getting the resistors clean; Q-tips dropped too much fiber on them...

Is it safe to soak the chip and the resistors around it in isopropyl?


----------



## The Viper

Quote:


> Originally Posted by *KedarWolf*
> 
> Oh I see. I missed that.
> 
> Replacing the thermal paste on both would be an interesting experiment or did someone already suggest that?


LOL, you really gotta put those glasses back on.
post 14077


----------



## AllGamer

Hey guys, I'm having an annoying problem with the 3rd monitor not coming back from sleep mode.

The only way to get that last monitor back on is to turn the computer off and then back on.

It has been happening since I upgraded to driver 384.94.

I tried the newest version, 385.41; same problem.

It was fine before the new drivers, but I forgot which version I was on.

Is there perhaps another workaround, other than keeping the monitors on at all times in the Windows power settings?

If I do that it's fine, but I'd like them to go into idle/sleep mode when I'm not using my 4 monitors.


----------



## KedarWolf

Quote:


> Originally Posted by *NeedMoreJuice*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Oh I see. I missed that.
> 
> Replacing the thermal paste on both would be an interesting experiment or did someone already suggest that?
> 
> 
> 
> Just wasted my whole day doing that, the screws were a pain to unscrew and to screw back on, the two side screws (the ones that are near the display connections) moved the pcb and backblate and vice versa. After a while I was able to screw everything back on.
> 
> The first time it dropped about 1.5°C, from constant 35°C to 33~34°C - although for some reason my oc only applied 2088 instead of 2114Mhz, don't know what this is about. Seems like the thermal grizzly kryonaut gets instant warm compared to the glue-like gunk asus used.
> 
> The other paste application increased the temp to 36°C.... The resistors around the chip are smudged and the paste spread even in the corner of the gpu. I had only 70% ethanol, guess I'm going to get 99% isopropyl from the pharmacy tomorrow. Paper towel helped the most getting the resistors clean, Q-tips however dropped too much fiber on the resistors...
> 
> Is it safe to soak the chip and the resistors around it in isopropyl?
Click to expand...

99% isopropyl should be fine.









Just let it dry thoroughly afterwards, is all.


----------



## NeedMoreJuice

Quote:


> Originally Posted by *KedarWolf*
> 
> 99% isopropyl should be fine.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just let it dry thoroughly after is all.


I couldn't wait until tomorrow, so I redid everything again using 70% ethanol (Spiritus); now I'm waiting for everything that went beneath the chip to dry (some came out with an air can). Hopefully I can get at least the 1.5°C drop I had earlier.

Edit: Although I got the temp drop, my card won't boost above 2088 anymore; it used to boost to 2114 and then drop to 2101, but now it just hits 2088 and drops to 2076?? All I did was clean the resistors, chip, and heatsink, reapply the paste, and put the cooler back on... The pads stuck to the cooler so I didn't touch them at all, besides wiping the dust and paste off with a Q-tip and some ethanol. Uff... it doesn't make sense... *Never mind, GPU Tweak had turned the voltage off.. pff.*

By the way, is it bad that some paste touched the VRAM and thermal pads? (Only a little, which I tried to remove with a Q-tip and ethanol; I also removed some dust that landed on them while I was cleaning and reapplying the paste.) I noticed the pads are a little greasy, which is why I ask.


----------



## Maximization

Quote:


> Originally Posted by *NeedMoreJuice*
> 
> I couldn't wait until tomorrow so I redid everything again by using ethanol 70% (Spiritus), now I'm waiting that everything that went beneath the chip to dry (some came out by using an aircan). Hopefully I can get at least that 1.5°C drop I had earlier.
> 
> Edit: Although I had the temp drop, my card won't boost above 2088 anymore, it used to boost to 2114 and then drop to 2101, but now it just hits 2088 and drops to 2076 ?? All I did was cleaning the resistors, the chip and heatsink, reapplying the paste and putting the cooler back on... the pads stuck on the cooler so I didn't touch them at all, besides wiping with a q-tip and some ethanol the dust and paste off of it. The fudge. Uff... it doesn't make sense... *Nevermind, gpu tweak turned the voltage off.. pff.*
> 
> By the way, is it bad that some paste touched the vrams and thermal pads? (Only a little which I tried to remove with a q-tip and ethanol, removed also some dust that went on it while I was cleaning and re-applying the new paste). I saw that pads are a little greasy, so that's why I ask.


When I put water blocks on other cards, I sometimes put paste on the thermal pads so they conduct heat to the block better, in theory. If everything is electrically non-conductive, it will be fine.


----------



## NeedMoreJuice

Quote:


> Originally Posted by *Maximization*
> 
> when I put water blocks on other cards sometimes I put paste on the thermal pads so they get more heat conductance to the block in theory.. if everything is non electrical conducting it will be fine


Good, because none of the cards I've had came with thermal pads, besides the Matrix, and those were under a separate aluminium heatsink. And from the little there is about thermal pads on the internet, it's always "do not touch, because...", so it bothered me that my OCD kicked in and made me go over them with a Q-tip (and ethanol) to get the dust and paste string(s) off after taking the cooler off. Every couple of minutes I also "had" to go over the cooler and PCB with an air can. I can't stand the idea of little fibers or dust getting on the card.







...


----------



## SauronTheGreat

How come my 1080 Ti sits at 99% usage in gaming rather than 100%... is it something to be worried about?


----------



## k4sh

Quote:


> Originally Posted by *SauronTheGreat*
> 
> How come my 1080Ti is performing at 99% in gaming rather than 100% ... is it something to be worried about ?


No, nothing to worry about. That's absolutely fine.


----------



## KedarWolf

I'm pretty sure the memory clocks are higher on this RAM but dunno if they are just OCing it or it's actually different modules.









https://www.evga.com/articles/01149/evga-geforce-gtx-1080-ti-12ghz/
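Back-of-the-envelope, here's what the 11 → 12 Gbps bump is worth in peak bandwidth on the 1080 Ti's 352-bit bus (quick sketch; the per-pin rate and bus width are the reference-card figures, the rest is arithmetic):

```python
# Peak memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8.
# 352-bit bus is the standard GTX 1080 Ti configuration.

def mem_bandwidth_gbs(gbps_per_pin, bus_width_bits=352):
    """Peak memory bandwidth in GB/s."""
    return gbps_per_pin * bus_width_bits / 8

stock = mem_bandwidth_gbs(11)   # 484.0 GB/s, the reference spec
binned = mem_bandwidth_gbs(12)  # 528.0 GB/s
uplift = (binned - stock) / stock * 100  # ~9.1% more peak bandwidth
```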


----------



## ZealotKi11er

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm pretty sure the memory clocks are higher on this RAM but dunno if they are just OCing it or it's actually different modules.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.evga.com/articles/01149/evga-geforce-gtx-1080-ti-12ghz/


Maybe Titan Xp modules?


----------



## k4sh

So I've just received my VRAM and VRM sinks (Enzotech) today, and it's night and day.
Prior to this, the VRAM & VRM were only cooled by a fan.
Now I can pass a short test in TW3 that systematically failed before. VGPU is locked at 1031 mV and the GPU clock at 2038 MHz.
Better-cooled VRMs seem to have done it.
Now I have to figure out how to make these small sinks stick better, as one has already fallen off the VRM.
If anyone has an idea...


----------



## Beagle Box

Quote:


> Originally Posted by *k4sh*
> 
> So i've just received today my vram and vrm sinks (ENZOTECH) and it's night and day.
> Prior to this, VRAM & VRM were only cooled by a fan.
> Now i can pass a short test with TW3 that systematically failed. VGPU is locked at 1031 mV and GPU clock at 2038 MHz
> VRM better cooled seemed to have done it.
> Now i have to figure out how to make these small sinks stick better as already one has fallen from the VRM.
> If anyone has an idea ...


What? No photos?


----------



## k4sh

Quote:


> Originally Posted by *Beagle Box*
> 
> What? No photos?


Ehhh, sorry for the missing pictures, but my card is under water








These are the sinks I'm actually using.
https://www.amazon.fr/Enzotech-Mosfet-K%C3%BChler-MOS-C1-LE-passiv-Black/dp/B00P87G1NU


----------



## ZealotKi11er

Quote:


> Originally Posted by *k4sh*
> 
> Ehhh sorry for the missing pictures but my card is under water
> 
> 
> 
> 
> 
> 
> 
> 
> These are the sinks i'm actually using.
> https://www.amazon.fr/Enzotech-Mosfet-K%C3%BChler-MOS-C1-LE-passiv-Black/dp/B00P87G1NU


http://www.ebay.com/bhp/thermal-adhesive-tape


----------



## k4sh

Are they all effective? I mean, my sinks are mounted upside down. And what about the thickness?


----------



## Blameless

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm pretty sure the memory clocks are higher on this RAM but dunno if they are just OCing it or it's actually different modules.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.evga.com/articles/01149/evga-geforce-gtx-1080-ti-12ghz/


Better binning, custom timings, slightly more voltage, or any combination of the three could account for the difference, even if they are the same ICs.

Micron's newest D9VRN is rated for 12Gbps and it's possible they use this, but I believe even the Titan Xp uses 11Gbps rated D9VRL ICs.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Blameless*
> 
> Better binning, custom timings, slightly more voltage, or any combination of the three could account for the difference, even if they are the same ICs.
> 
> Micron's newest D9VRN is rated for 12Gbps and it's possible they use this, but I believe even the Titan Xp uses 11Gbps rated D9VRL ICs.


I'm thinking they changed the timings and increased the voltage. Most 1080 Tis can do 12 GHz, no problem.


----------



## Blameless

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Most 1080 Ti can do 12GHz no problem.


At the temperatures expected of an out-of-box fan profile on an air cooled part?


----------



## Dasboogieman

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm pretty sure the memory clocks are higher on this RAM but dunno if they are just OCing it or it's actually different modules.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.evga.com/articles/01149/evga-geforce-gtx-1080-ti-12ghz/


Until someone does a teardown we won't know for sure. My gut instinct is telling me these are new modules, because if Micron's manufacturing had matured enough that EVGA could bin 11 Gbps modules into mass-production 12 Gbps parts, Micron might as well release a new SKU; this is a high-volume part, unlike the Titan Xp.


----------



## Kana Chan

Micron GDDR5X MT58K256M321JA-120:A is 12 Gb/s
https://www.anandtech.com/show/11543/micron-discusses-gddr5x-gddr6-and-gddr5
"When Micron announced its GDDR5X memory in late 2015, it set two targets for data transfer rates: the initial target of 10 - 12 Gbps and the longer-term target of 16 Gbps. Initially, the company only supplied GDDR5X ICs validated at 10 and 11 Gbps, but this year the company also started to bin the chips for 12 Gbps. "


----------



## KedarWolf

Quote:


> Originally Posted by *Kana Chan*
> 
> Micron GDDR5X MT58K256M321JA-120:A is 12 Gb/s
> https://www.anandtech.com/show/11543/micron-discusses-gddr5x-gddr6-and-gddr5
> "When Micron announced its GDDR5X memory in late 2015, it set two targets for data transfer rates: the initial target of 10 - 12 Gbps and the longer-term target of 16 Gbps. Initially, the company only supplied GDDR5X ICs validated at 10 and 11 Gbps, but this year the company also started to bin the chips for 12 Gbps. "


Are they only binned to run at 12 Gbps, or are they a newer chip revision produced at 12 Gbps?

My card does well over 12 Gbps as it is.


----------



## 8051

I've had to underclock the memory on my 1080 Ti by 350 MHz to get the core to 1076 MHz in one game: Dying Light.

Bringing the memory up to full speed in this game (before it CTD'd) seemed to increase my GPU temps by 5°C.


----------



## NeedMoreJuice

Quote:


> Originally Posted by *8051*
> 
> I've had to underclock the memory on my 1080Ti by 350 MHz. to get the core to 1076 Mhz. in one game: Dying Light.
> 
> Bringing the memory up to full speed on this game (before it CTD'd) seemed to increase my GPU temps by 5°C.


Don't bother with Dying Light; the optimization is garbage. Vsync halves my fps, and I need to enable and disable it a couple of times until it works properly. With barely an OC on my 980 Ti, that game managed to exceed the modded power target of 130%. It doesn't make sense; there are far heavier games out there that don't behave that way.


----------



## Agent-A01

Quote:


> Originally Posted by *k4sh*
> 
> Are they all efficient ? I mean my sinks are upside down. And how about the thickness ?


Just use some sticky thermal paste like thermal grizzly with a dab of super glue on the corners


----------



## 8051

Quote:


> Originally Posted by *NeedMoreJuice*
> 
> Don't bother with Dying Light, the optimization is garbage. Vsync halfes my fps and I need to enable and disable it a couple of times until it works properly. With barely an oc on my 980 Ti that game managed to get over the modded PowerTarget of 130%, it doesn't make sense, there are way heavier games out there not behaving that way.


Dying Light is an open-world game, and I've also maxed the view distance by modding the config files.


----------



## k4sh

Quote:


> Originally Posted by *Agent-A01*
> 
> Just use some sticky thermal paste like thermal grizzly with a dab of super glue on the corners


Super glue? Huh? And what if I need to RMA the card?
I've done a test with a single sink stuck on with Arctic MX-4 paste.
I'll see by the end of the weekend what it's worth.


----------



## 8051

Quote:


> Originally Posted by *k4sh*
> 
> Super glue ? Huh ? And what if i need to rma the card ?
> I've done a test with a single sink stick with arctic mx-4 paste.
> Will see at the end of the week end what it is worth.


What kind of overclocks can you get out of your H2O cooled Zotac 1080Ti Mini?


----------



## k4sh

Quote:


> Originally Posted by *8051*
> 
> What kind of overclocks can you get out of your H2O cooled Zotac 1080Ti Mini?


Not a great one.
The Witcher 3 does its own thing: an Afterburner curve profile set at 2038 MHz, 1043 mV ends up at 2038 MHz with up to 1062 mV.
Stable after an hour at 100%-112% power (GPU at 99%).
This card is limited to 120% of the TDP, which is 300 W.
Depending on the game, I think I could probably reach 2050 MHz maximum, which is quite good for a GPU with roughly 12 billion transistors.
Except for a few very demanding games, I can play everything with all settings maxed out at 60 fps (triple-head setup).
The VRMs really need to be well cooled.


----------



## 8051

Quote:


> Originally Posted by *k4sh*
> 
> Not a great one.
> Witcher 3 does on its own. A curve AB profile stuck at 2038 mhz, 1043 mv ends at 2038 mhz up to 1062 mv.
> Stable after an hour at 100% - 112% power (GPU at 99%).
> This card is limited to 120 % of the TDP which is 300 W.
> Depending of the games i think i could probably reach 2050 mhz maximum which is quite good for a GPU that embeds approx 12 billions of transistors.
> Except few games very demanding i can play all games all settings maxed out at 60 fps (triple head setup).
> The VRM really need to be well cooled.


If you're hitting PWR perfcaps you should consider the Strix XOC VBIOS -- it made all the difference with my air-cooled 1080 Ti. It never throws PWR perfcaps anymore, and I managed to get up to 2076 MHz at anywhere from 1.063 to 1.094 volts (occasionally hitting 1.1).


----------



## Blameless

Quote:


> Originally Posted by *KedarWolf*
> 
> Are they only binned to run at 12 Gbps or are they a newer version chip at 12 Gbps in production.


The only difference is who does the sorting. A Micron 11 Gbps GDDR5X IC is built the same as a 10 or 12 Gbps one from the same line... Micron just sorts them by what it's willing to guarantee they're capable of.
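In other words, binning is just a sort: each die lands in the highest grade it validates at (toy sketch, invented test numbers):

```python
# Toy sketch of speed binning: dice off one production line get sorted
# into the highest rating they pass, so an "11 Gbps" part is simply one
# that validated at 11 but not at 12. Test values are made up.

BIN_GRADES = [12, 11, 10]  # Gbps ratings, best first

def assign_bin(max_stable_gbps):
    for grade in BIN_GRADES:
        if max_stable_gbps >= grade:
            return grade
    return None  # failed even the lowest bin -> scrapped

tested_dice = [12.6, 11.4, 10.2, 11.9, 13.1, 9.5]
bins = [assign_bin(d) for d in tested_dice]
# -> [12, 11, 10, 11, 12, None]; the 11.9 die is built the same as the
# 13.1 die, it just didn't validate at the next grade up.
```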


----------



## 8051

Quote:


> Originally Posted by *Blameless*
> 
> The only difference is who does the sorting. A micron 11Gbps GDDR5X IC is built the same as a 10 or 12Gbps one from the same line...Micron just sorts them by what it's willing to guarantee them capable of.


So do they actually test each IC? Or maybe one IC from each wafer?


----------



## Dasboogieman

Quote:


> Originally Posted by *8051*
> 
> So do they actually test each IC? Or maybe an IC of each wafer?


Generally, it is better if the bin came directly from Micron itself. Their test equipment is much "closer" to the die and their binning is generally more liberal. If Micron says an IC is binned for 12 Gbps, there is a very good chance a number of those ICs will overclock to 13 Gbps once implemented, but if the AIB bins the RAM, chances are there won't be much headroom left.


----------



## k4sh

Quote:


> Originally Posted by *8051*
> 
> If you're hitting PWR perfcaps you should consider the Strix XOC VBIOS -- it made all the diff. w/my air-cooled 1080Ti, it doesn't throw PWR perfcaps ever anymore and I managed to get up to 2076 MHz. at anywhere from 1.063 to 1.094 volts (occasionally hitting 1.1).


I'm not sure increasing the max TDP on such a card would be a good idea.
Considering that it's a tiny one and that the VRMs don't seem to be the highest quality on the market, I don't want to push it too far.


----------



## 8051

Quote:


> Originally Posted by *k4sh*
> 
> Not sure increasing the tdp max with such a card would be a good idea.
> Considering that it's a tiny one and that the vrm seem not to be the highest quality over the market, i don't want to push it too far.


Don't you have a full coverage water block? Wouldn't that keep the VRM's cool and safe?


----------



## k4sh

Quote:


> Originally Posted by *8051*
> 
> Don't you have a full coverage water block? Wouldn't that keep the VRM's cool and safe?


Universal block.
Sorry, but a full waterblock is more than a hundred euros here and can't be carried over between architecture generations.


----------



## ZealotKi11er

Quote:


> Originally Posted by *k4sh*
> 
> Universal block.
> Sorry but a full waterblock is more than a hundred euros here and cannot pass through architecture generations.


Yeah, but it's ideal to have full cover. For just the core I'd rather use an AIO.


----------



## Dry Bonez

Hey all, I need some help. I have a Gigabyte Aorus Waterforce 1080 Ti, and I've always run it in OC mode under Gigabyte's "Graphics Engine" program. Two days ago I decided to see what clocks I could get, so I used MSI AB and started with +50, which got me a solid top of 2076 MHz; then I went +100 and had issues running the Tomb Raider benchmark. Ever since then it has given me errors, so I downclocked and even went back to stock settings, but now when I boot up Rise of the Tomb Raider or any other game I'm getting about 40 fps, which is way below normal considering I've been getting 60 fps in every single game. Did I mess up my GPU?


----------



## Sweetwater

Quote:


> Originally Posted by *Dry Bonez*
> 
> Hey all, i need some help. so i have a gigabyte aorus waterfoce 1080ti GPU, and ive always ran it in OC mode under the "graphics engine" program by gigabyte... So2 days ago i decided to see what clocks i can get so i used MSI AB and started with +50 and i got a top of 2076mhz solid, then went +100 and gave me issues running the tomb raider benchmark, but ever since that, it has given me errors and so i downclocked and even went back to stock settings and now when i boot up Ris of the tomb raider or any other game, im getting about 40fps which is way below and considering ive been getting 60fps in every single game. Did i mess up my GPU?


I'd say DDU your drivers, but first I'd try uninstalling your GPU in Device Manager and restarting the PC.

I had this problem when I pushed the OC too far, and that fixed it up.


----------



## KedarWolf

Quote:


> Originally Posted by *Dry Bonez*
> 
> Hey all, i need some help. so i have a gigabyte aorus waterfoce 1080ti GPU, and ive always ran it in OC mode under the "graphics engine" program by gigabyte... So2 days ago i decided to see what clocks i can get so i used MSI AB and started with +50 and i got a top of 2076mhz solid, then went +100 and gave me issues running the tomb raider benchmark, but ever since that, it has given me errors and so i downclocked and even went back to stock settings and now when i boot up Ris of the tomb raider or any other game, im getting about 40fps which is way below and considering ive been getting 60fps in every single game. Did i mess up my GPU?


Try using DDU, then reinstalling Nvidia driver.


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dry Bonez*
> 
> Hey all, i need some help. so i have a gigabyte aorus waterfoce 1080ti GPU, and ive always ran it in OC mode under the "graphics engine" program by gigabyte... So2 days ago i decided to see what clocks i can get so i used MSI AB and started with +50 and i got a top of 2076mhz solid, then went +100 and gave me issues running the tomb raider benchmark, but ever since that, it has given me errors and so i downclocked and even went back to stock settings and now when i boot up Ris of the tomb raider or any other game, im getting about 40fps which is way below and considering ive been getting 60fps in every single game. Did i mess up my GPU?
> 
> 
> 
> Try using DDU, then reinstalling Nvidia driver.
Click to expand...

Edit: If you go into DDU's options when it says you're not in Safe Mode, you can enable an option to have DDU boot into Safe Mode for you, then restart it.









Hate when I do that, meant to edit my post, not quote myself.


----------



## webhito

So, after blaming my 1800X, it seems my GTX 1080 Ti may be the culprit behind my stuttering. How many here are having issues with weird stuttering/freezing on their cards?

Lots of complaints here if you want to take a look.

https://forums.geforce.com/default/topic/1004600/geforce-drivers/all-games-stuttering-with-fps-drops-since-windows-10-creators-update/1/


----------



## ZealotKi11er

Quote:


> Originally Posted by *webhito*
> 
> So, after blaming my 1800x, it seems that maybe my gtx 1080 ti is the culprit of my stuttering, how many here are having issues with weird stuttering/freezing with their cards?
> 
> Lots of complaints here if you want to take a look.
> 
> https://forums.geforce.com/default/topic/1004600/geforce-drivers/all-games-stuttering-with-fps-drops-since-windows-10-creators-update/1/


I had slowdowns and a crash in BF1, but it was either an unstable CPU OC or a game patch. Now it's all good. I had to add more voltage to the CPU.


----------



## webhito

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I had to slow downs and crash in BF1 but was and unstable CPU OC or game patch. Not its all good. I had to add more voltage to CPU.


On which build was that? Your Phenom or your 3770K? Or do you have a Ryzen build somewhere?

Reason I ask is that it doesn't seem to be limited to Ryzen, as folks on that forum are pointing to Nvidia/the Creators Update.

I did try installing the Anniversary Update as others suggested, but it changed nothing. I actually started swapping out all my hardware to try to find the culprit and at no point suspected it could be software related.

It doesn't happen with only one game; it's several: Enslaved, Fallout 4, Sniper Elite Zombie Trilogy. And only in full screen, I believe.

Not sure if I posted this here, but you can see the freeze at 1:06.


----------



## ZealotKi11er

Quote:


> Originally Posted by *webhito*
> 
> On which build was that? On your phenom or your 3770k? Or do you have a ryzen build somewhere?
> 
> Reason I ask is that it doesn't seem to be locked down to just ryzen as folks on that forum are pointing to nvidia/creators update.
> 
> I did try installing anniversary update as others suggested but it changed nothing. I actually started swapping out all my hardware to try to find the culprit and at no point even suspected it could be software related.
> 
> It doesn't only happen with 1 game, its with several, enslaved, fallout 4, sniper elite zombie trilogy. And only in full screen I believe.
> 
> Not sure if I posted this here, but you can check the freeze at 1:06.


Yeah had the same freeze in BF1.


----------



## webhito

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah had the same freeze in BF1.


On what system did it happen?


----------



## ZealotKi11er

Quote:


> Originally Posted by *webhito*
> 
> On what system did it happen?


My sig rig. I was in a tank, and the same lag happened for 3-4 s and then went back to normal.


----------



## webhito

Quote:


> Originally Posted by *ZealotKi11er*
> 
> My sig rig. I was in a tank and the same lag happened for 3-4 s and than back to normal.


Thanks! So yeah, it's definitely not limited to Ryzen.


----------



## ZealotKi11er

Quote:


> Originally Posted by *webhito*
> 
> Thanks! So yea, its definitely not locked down to ryzen.


1080 Ti pushes the CPU hard.


----------



## webhito

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 1080 Ti pushes the CPU hard.


Haha, yes it does.


----------



## 8051

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I had to slow downs and crash in BF1 but was and unstable CPU OC or game patch. Not its all good. I had to add more voltage to CPU.


I had to add voltage to my CPU OC for the same reasons except with a different game, Dying Light.


----------



## KedarWolf

Good news!!!

MSI Afterburner Beta 16 is working again on the Windows 10 Insider Preview Build 16296 (PC) RS3_Release Slow Ring release.

I made torrents of the x64 EN-US and EN-GB latest Windows 10 Pro Slow Ring release ISOs, with all the language packs and updates added through UUP, that'll fit on a 4GB USB key.

It fixes the game-stuttering problem Windows 10 1703 suffers from.

PM me for links if you want.


----------



## RMBR

Hello!
I bought an Apex with a 7900X and I'm waiting for it to arrive.
I have two 1080 Ti Strix OC cards in SLI.
A friend has told me that X299 is horrible for SLI.
Is it true?
Does anyone here have this setup and can tell me how it works in games?
Thank you very much.


----------



## ZealotKi11er

Quote:


> Originally Posted by *RMBR*
> 
> Hello!
> I bought an Apex with 7900x and I am waiting to arrive at my house.
> I have two 1080ti strix OC in sli.
> A friend has told me that X299 is horrible for sli.
> It is true?
> Anyone here have and can you tell me how it's working in games?
> Thank you very much.


SLI has nothing to do with the CPU or chipset. If he says it's horrible, he just means SLI is horrible, which is true.


----------



## RMBR

Quote:


> Originally Posted by *ZealotKi11er*
> 
> SLI has nothing to do with the CPU or Chip-set. If he says its horrible he just means SLI is horrible which is true.


But SLI works well in some games.
What I've read is that X299 is much worse than X99.


----------



## ZealotKi11er

Quote:


> Originally Posted by *RMBR*
> 
> But sli works well in some games.
> What I learned is that the X299 is much worse than the X99.


No. There is no difference.


----------



## 8051

Quote:


> Originally Posted by *ZealotKi11er*
> 
> SLI has nothing to do with the CPU or Chip-set. If he says its horrible he just means SLI is horrible which is true.


I've seen tests that indicate significant differences in SLI in some games over PCIe 3.0 x8/x16 configs vs PCIe 3.0 x16/x16 configs.


----------



## ZealotKi11er

Quote:


> Originally Posted by *8051*
> 
> I've seen tests that indicate significant differences in SLI in some games over PCIe 3.0 x8/x16 configs vs PCIe 3.0 x16/x16 configs.


Hell no. That is the least of your problems.


----------



## 8051

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Hell no. That is the least of your problems.


This was with Maxwell-based video cards, but the difference was MASSIVE in both average FPS and minimum FPS in BF1.


----------



## alton brown

Hi all! Just wanna say hello! I finally made it to the 1080 Ti class! I replaced my two 780s. Going to post some pics. It's on an X79 platform; older, but she'll do the job. I added a custom water loop (my first). I studied many, many hours and this is how she turned out. I hope you guys like it.

I do have a question. I have the Aorus Waterblock edition, what should I use to overclock? I always used AB up until this card arrived a week ago.

















----------



## 8051

Alton, how much do those nifty swivel fittings cost?


----------



## alton brown

Hmm, I think those were $9.99 from Performance-PC.com. Those fittings saved the day!


----------



## Luckbad

Long time since I've posted in this thread.

Anyone know what's up with 1080 Ti availability?

I was thinking of selling my Zotac 1080 Ti ArcticStorm and water cooling setup to recover some cash, but I'm hardly seeing any decent 1080 Tis on sale anywhere.


----------



## rluker5

Quote:


> Originally Posted by *RMBR*
> 
> But sli works well in some games.
> What I learned is that the X299 is much worse than the X99.


Maybe he was getting a case of hot VRMs, and a throttling mobo/CPU was making the GPUs look bad? There might be a little variance from game to game due to architecture.

Interesting claim, but good luck finding data on it. Even x16 vs x8 is pretty sparse. Here's one to add to the list: http://www.overclock.net/t/1616578/x99-6850k-4-5-vs-z170-6700k-4-8-w-titan-xp-sli-benchmarks-and-results

But if you are going for high res, the SLI 1080 Tis will be the bottleneck. My 4770K has an easier time at [email protected] than my Tis in new AAA games, and it isn't even half a 7900X.

Sorry I didn't have anything specific, but congrats on that rig.


----------



## 8051

Quote:


> Originally Posted by *alton brown*
> 
> hmm, I think those were 9.99 from Performance-PC.com. those fittings saved the day!


That's a nice looking build.


----------



## alton brown

Thanks, I brought back the corny anti-kink springs!


----------



## Dasboogieman

Quote:


> Originally Posted by *alton brown*
> 
> thanks, I brought back the corny anti kink springs!..


Kinky!


----------



## dVeLoPe

I have a 1080 and will have a 1080 Ti soon. What FPS increase should I expect at 1080p in games like BF1 and PUBG?


----------



## JedixJarf

Quote:


> Originally Posted by *dVeLoPe*
> 
> i have a 1080 and will have a 1080ti soon what fps increase should i expect playing @1080p games like bf1 and pubg


10% or so?

http://www.guru3d.com/articles_pages/msi_geforce_gtx_1080_ti_lightning_z_review,15.html


----------



## 8051

Quote:


> Originally Posted by *alton brown*
> 
> thanks, I brought back the corny anti kink springs!..


It's interesting, but some car radiator hoses have anti-kink springs too -- though they're usually on the inside.


----------



## Nervoize

Hit a 2164MHz stable overclock, but my memory doesn't seem to overclock well. 5953 is the max I can squeeze out of it, while my other cards did 6200 and 6400. https://www.techpowerup.com/gpuz/zsf8e I saw someone in the Google sheet running 1.27V; can I get that BIOS, or is that voltage really unsafe?


----------



## Dasboogieman

Quote:


> Originally Posted by *Nervoize*
> 
> Hit a 2164 stable overclock. But my memory doesn't seem to overclock well. 5953 is the max i can squeeze out of it, while my other cards did 6200 and 6400. https://www.techpowerup.com/gpuz/zsf8e I saw someone in the Google sheet having a volt of 1.27, can i get that BIOS or is that voltage really unsafe?


Unsafe for long periods of time. That is more of an extreme benchmark voltage.


----------



## alton brown

Quote:


> Originally Posted by *8051*
> 
> It's interesting but some car radiator hoses have anti-kink springs -- but they're usually on the inside.


They really weren't needed; the EK tubing made great bends without crimping. I was really impressed with the product. I figured I'd use the anti-kink coils because I chose not to use colored coolant.


----------



## Nervoize

Quote:


> Originally Posted by *Dasboogieman*
> 
> Unsafe for long periods of time. That is more of an extreme benchmark voltage.


Okie dokie, I'll try to squeeze a 2177MHz clock out of the 1200mV I currently have set as the voltage.


----------



## Streetdragon

Wow, how many watts is the card drawing with 1.2V and that clock? Close to 400? Or even more?


----------



## Nervoize

Quote:


> Originally Posted by *Streetdragon*
> 
> woow how much watts is the card drawing with 1.2V and that clock? close to 400? or even more?


I have no idea haha, I forgot the command to check that.

Edit: 400 watts.

https://image.prntscr.com/image/_2sqDTblRpadKwaVIhdscQ.png


----------



## 8051

Quote:


> Originally Posted by *Streetdragon*
> 
> woow how much watts is the card drawing with 1.2V and that clock? close to 400? or even more?


Will GPU-Z or HWiNFO give you that information, or is it just an approximation? 400 watts of power is crazy. How much of that power would be going to the memory?


----------



## ZealotKi11er

Quote:


> Originally Posted by *8051*
> 
> Will GPUz or Hardware Info give you that information? Or is it just an approximation? 400 Watts of power is crazy. How much of that power would be going to the memory?


50W


----------



## 8051

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 50W


Thanks ZealotKi11er, that doesn't seem bad at all relative to the 350 watts the GPU is consuming. I imagine the PCB and assorted support chips must be drawing some power too.

How much power could a VBIOS-modded Maxwell Titan X use?

There's a guy here with a modded Titan Z that he thinks pulls about 700 watts of power (unbelievable).


----------



## KedarWolf

Quote:


> Originally Posted by *8051*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> 50W
> 
> 
> 
> Thanks ZealotKi11er, that doesn't seem bad at all, relative to the 350 Watts the GPU is consuming. I imagine the PCB and assorted support chips must be drawing some power too.
> 
> How much power could a VBIOS modded Maxwell TitanX use?
> 
> There's a guy here w/a modded TitanZ that he thinks pulls about 700 Watts of power (unbelievable).
Click to expand...

Gigabyte 1080 TI FE, XOC BIOS.

1.200v Just under 500w.

1.200v.txt 9k .txt file


1.100v Just under 400w.

1.100v.txt 9k .txt file


.950v Under 300w.

.950v.txt 13k .txt file


Edit: This is with Fire Strike Ultra stress test.


----------



## 8051

Quote:


> Originally Posted by *KedarWolf*
> 
> Gigabyte 1080 TI FE, XOC BIOS.
> 
> 1.200v Just under 500w.
> 
> 1.200v.txt 9k .txt file
> 
> 
> 1.100v Just under 400w.
> 
> 1.100v.txt 9k .txt file
> 
> 
> .950v Under 300w.
> 
> .950v.txt 13k .txt file
> 
> 
> Edit: This is with Fire Strike Ultra stress test.


500 Watts for a single card. Is that just for the card or is that total system draw?

Who needs space heaters anymore?


----------



## KedarWolf

Quote:


> Originally Posted by *8051*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Gigabyte 1080 TI FE, XOC BIOS.
> 
> 1.200v Just under 500w.
> 
> 1.200v.txt 9k .txt file
> 
> 
> 1.100v Just under 400w.
> 
> 1.100v.txt 9k .txt file
> 
> 
> .950v Under 300w.
> 
> .950v.txt 13k .txt file
> 
> 
> Edit: This is with Fire Strike Ultra stress test.
> 
> 
> 
> 500 Watts for a single card. Is that just for the card or is that total system draw?
> 
> Who needs space heaters anymore?
Click to expand...

Just the card itself; that log is the nvidia-smi.exe power draw in wattage from only the card.
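For anyone wanting to reproduce that kind of log: nvidia-smi can sample board power with `--query-gpu=power.draw --format=csv,noheader`. Here's a small parser sketch for that output; the sample readings below are invented for illustration.

```python
# Sketch: summarize board-power samples from
#   nvidia-smi --query-gpu=power.draw --format=csv,noheader -l 1
# Real output is one reading per line, e.g. "387.45 W".
# The sample values here are made up for illustration.

def parse_power_samples(lines):
    """Return power draws in watts from nvidia-smi power.draw lines."""
    return [float(line.strip().rstrip(" W")) for line in lines]

samples = ["385.10 W", "398.72 W", "401.33 W"]
watts = parse_power_samples(samples)
print(f"peak: {max(watts)} W, avg: {sum(watts) / len(watts):.1f} W")
```

Piping a minute of logging through something like this gives you the peak and average card-only draw without needing a wall meter.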


----------



## 8051

Quote:


> Originally Posted by *KedarWolf*
> 
> Just the card itself, that log is the nvidia-smi.exe power draw in wattage from only the card.


Now that's a lot of bananas.


----------



## tongerks

Is it safe to apply liquid metal to our GPUs? I put on Coollaboratory Liquid Ultra.


----------



## Quadrider10

I tried it and it seemed to eat at the GPU die along with the block it was attached to; I would not suggest it.


----------



## tongerks

So I need to change it. I will use Kryonaut.


----------



## Quadrider10

I would suggest not using liquid metal on a GPU, especially with an aluminum heatsink.


----------



## ZealotKi11er

Quote:


> Originally Posted by *tongerks*
> 
> so i need to change it. i will use kryonaut


I used it on my 1080 for about 2 weeks and I do not recommend it. You might get 2-3C lower, but it is not worth it. It did stain the GPU die, which would probably void the warranty.


----------



## Radox-0

Thermal Grizzly Kryonaut is not liquid metal; that would be Thermal Grizzly Conductonaut.

Kryonaut itself is as easy to use as any other TIM.


----------



## The Viper

Guys, does anyone know how to calculate the load watts of the GTX 1080 Ti at a given voltage?


----------



## ZealotKi11er

Quote:


> Originally Posted by *The Viper*
> 
> Guys does anyone know how to calculate the load watts of the gtx 1080ti at a given voltage?


It does not really work like that. The voltage reported is only for the core, but the power in watts is for the entire card. Generally speaking, the power % should tell you the load, with 100% being 250W at the given voltage.
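As a rough sketch of that relationship (assuming the card's 100% reference is the 250W base board power limit of a reference 1080 Ti; AIB cards ship with different limits, so check your own BIOS):

```python
# Rough sketch: convert the power % reported by monitoring tools into watts.
# Assumes 100% corresponds to the card's base board power limit, which is
# 250W on a reference 1080 Ti -- an assumption; AIB BIOSes differ.

def power_percent_to_watts(power_percent: float, base_limit_w: float = 250.0) -> float:
    """Board power in watts implied by a reported power %."""
    return base_limit_w * power_percent / 100.0

print(power_percent_to_watts(100.0))  # 250.0 W at 100%
print(power_percent_to_watts(120.0))  # 300.0 W with the slider at 120%
```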


----------



## rluker5

Quote:


> Originally Posted by *The Viper*
> 
> Guys does anyone know how to calculate the load watts of the gtx 1080ti at a given voltage?


Also, Afterburner lets you put a multiplier on its graph, so if your BIOS's standard power limit is 250W, you can multiply by 2.5 and the number will read like current watts on the graph.


----------



## becks

I am currently running my 1080 Ti at 2050+ out of the box, with no AB or any other program tweaks... I wonder how that will change when I switch from 1080p to 1440p. Is that a factor to take into consideration?


----------



## 8051

Quote:


> Originally Posted by *becks*
> 
> I am currently running my 1080ti at 2050+ out of the box with no AB or any other program tweak...I wonder how that will change when i switch from 1080p to 1440p.. is that a factor to keep into consideration ?!


I'd imagine it would. I can overclock to 2101 in GTA IV, but that's the only game I can do so in, and I think it's only because the GPU load never goes above 75%.


----------



## nrpeyton

Quote:


> Originally Posted by *The Viper*
> 
> Guys does anyone know how to calculate the load watts of the gtx 1080ti at a given voltage?


You don't need to calculate it.

Download HWiNFO64 and it shows your power in watts in real time as you game or bench.

You can also download RivaTuner Statistics Server and set up HWiNFO64 to pass the info to RivaTuner (which displays the info in your game the same way FRAPS shows your framerate).

While I game I have it set up so I can always view the following in real time:
*GPU:* speed, utilisation, core voltage, power in watts & temperature
*CPU:* max thread usage & total CPU usage, core voltage, power in watts & temperature
*MISC:* FPS.


_Don't pay any attention to the FPS at the bottom (it's only that high because I have the game running in windowed mode). Also, GPU utilisation seems to be bugged for me just now, as it always shows 0%._


----------



## Ultracarpet

Quote:


> Originally Posted by *nrpeyton*
> 
> You don't need to calculate it.
> 
> Download hwinfo64 and it shows your power in watts in real time as you game or bench.
> 
> You can even also download Riva Tuner Statics Server. And set up hwinfo64 to pass the info to Riva Tuner (which displays the info in your game the same way FRAPS shows your framerate).
> 
> While I game I have it set up so I can always view the following in real time 24/7:
> *GPU:*
> Speed, Utilisation,Core Voltage, Power in Watts & Temperature
> *CPU:,
> * Max Thread Usage & Total CPU Usage, core voltage, power in watts & temperature
> *MISC:* FPS.
> 
> 
> _don't pay any attention to the FPS at the bottom (thats only so high while I have that game running in "windowed mode. Also GPU utilisation seems to be bugged out for me just now as it always shows 0%".)._


AFAIK the numbers provided through software are approximations and generally only take the actual processor into consideration rather than total board power... which may be all you're after (I didn't read the chain of replies before yours, lol).


----------



## pez

You'd honestly be better off buying one of those wall-socket power meters (I know there's probably an official name for them) and calculating it based on PSU efficiency and 0% load vs. XX% load.
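The wall-meter arithmetic looks something like this; the 0.90 efficiency figure is only an assumption (roughly an 80 Plus Gold unit at typical load), so substitute your PSU's real efficiency curve:

```python
# Hedged sketch of the wall-meter method: measure wall draw at idle and under
# load, then scale the delta by PSU efficiency to estimate the extra DC-side
# power the components are pulling. 0.90 efficiency is an assumption.

def dc_load_delta_watts(wall_idle_w: float, wall_load_w: float,
                        psu_efficiency: float = 0.90) -> float:
    """Estimated extra DC power drawn by components under load."""
    return (wall_load_w - wall_idle_w) * psu_efficiency

# Example: 120W at the wall idle, 520W gaming -> roughly 360W of extra DC draw.
print(dc_load_delta_watts(120, 520))
```

It's coarse (efficiency varies with load), but it catches cases like an undersized PSU tripping its protection.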


----------



## Maximization

Quote:


> Originally Posted by *pez*
> 
> You'd honestly be better off buying one of the wall socket power readers (I know there's probably an official/appropriate name for this) and calculating it based on PSU efficiency and 0% load vs XX% load.


Yes, that's true. My 850 watt PSU was shutting off at 1050 watts at the wall; that's how I found the problem.

something like this

https://www.amazon.com/gp/product/B00009MDBU/ref=oh_aui_detailpage_o01_s00?ie=UTF8&psc=1


----------



## pez

Yep, that was the exact one I had in mind. I usually use my UPS' reading to know power, but that's obviously a more costly piece of equipment... though IMO a necessary and underrated one.


----------



## tongerks

Thanks for the answers, will change it to thermal paste.


----------



## Sgang

I'm planning to change the thermal pads on the Zotac AMP Extreme 1080 Ti to Fujipoly Extreme. Has anyone done it yet? Suggestions?
My main purpose is to create a connection between the VRM and the big heatsink of the card, which actually isn't connected, as GamersNexus found in their review.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Sgang*
> 
> I'm planning to change the thermal pads of the Zotac AMP Extreme 1080ti, with fujypoly extreme, has anyone done it yet? suggestions?
> My main purpose is to create a connection between the VRM and the big heatsink of the card, that actually is not connected, as gamernexus has found in their review.


That would be a good idea. Cool VRM = Long GPU life.


----------



## zhrooms

Made a thread about my 1080 Ti FE artifacting, memory went bad it seems, http://www.overclock.net/t/1638968/gtx-1080-ti-fe-artifacts-bad-memory/

Quote:


> Originally Posted by *zhrooms*
> 
> Just thought I'd share this minute video, perhaps useful for some unlucky one like me in the future (help figure out the cause), I was playing a game for around half an hour, GPU usage was hovering around 70% and GPU temperature only at 43c.
> 
> I switched cables, reinstalled drivers, closed third-party gpu software, nothing would prevent the artifacts from appearing.
> 
> What did reduce the artifacts though, was lowering memory clock from 5500 MHz to 5000 MHz, did not entirely remove them, in specific games still some artifacts present, but by cutting the power limit too, forcing the card to clock down to around 1200 MHz on the Core, eliminated every artifact. As seen in the video, could run Core just over 2000MHz and 1081mV in Unigine Valley, 99% GPU Usage with no artifacts as long as memory ran 5000MHz.
> 
> Video


----------



## 8051

Quote:


> Originally Posted by *Sgang*
> 
> I'm planning to change the thermal pads of the Zotac AMP Extreme 1080ti, with fujypoly extreme, has anyone done it yet? suggestions?
> My main purpose is to create a connection between the VRM and the big heatsink of the card, that actually is not connected, as gamernexus has found in their review.


I'd think the frag tape you would have to use would be so thick as to make it less than useful.


----------



## ELIAS-EH

Hello all,
Please help me; I have no experience overclocking GPUs, especially Pascal cards (I only OC CPUs).
My GTX 1080 Ti Strix OC edition, max temp 57 degrees C, can go to 2025MHz and down to a stable 2012MHz with the voltage slider at 100% and the power limit at 120% in MSI Afterburner, which shows me a voltage locked to 1.092V, with no crash after Heaven benchmark and gaming testing (I am still testing).
Is NVIDIA locking that voltage because it is the max safe voltage that won't damage the card? Am I on the safe side? I usually upgrade my GPU every 3 years. The default OC of my card is 1987MHz at 1.062V, so this is a 2.9% increase in voltage.
Thank you so much.


----------



## ZealotKi11er

Quote:


> Originally Posted by *ELIAS-EH*
> 
> hello all
> please help me, i have no experience in OC GPU especially the pascal card. (only i do OC CPU)
> My gtx 1080ti strix OC edition, temp max 57 degree C, can go 2025 mhz and down to stable 2012 mhz with voyage slider at 100% and power limit 120% in MSI after burner which shows me a locked voltage to 1.092V and no crash after Heaven Benchmark and gaming testing and i am still testing.
> Is nvidia locked to that voltage because this is the max safe voltage to not damage the card? i am on safe side ? usually i upgrade my GPU every 3 years, the default OC of my card is 1987 mhz and 1.062V which is 2.9% increase in voltage.
> thank u so much.


Yes, that is the max safe voltage, but you do not need to have it that high. Just add core clock in 13MHz increments with 0% voltage and 120% power until you crash. Looking at your settings, try +52MHz and go from there.
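To illustrate the 13MHz stepping (the bin size and the snap-down rounding here are assumptions for illustration, not NVIDIA's published spec):

```python
# Illustration only: Pascal's boost clocks move in roughly 13MHz bins, so a
# requested core offset effectively lands on a whole number of bins. Both the
# bin size and this snap-down behaviour are assumptions for this sketch.

BIN_MHZ = 13

def snap_offset(offset_mhz: int) -> int:
    """Round a requested core offset down to a whole number of 13MHz bins."""
    return (offset_mhz // BIN_MHZ) * BIN_MHZ

# Stepping up in whole bins, as suggested above: +13, +26, +39, +52 ...
for step in range(1, 5):
    print(f"+{step * BIN_MHZ} MHz")
```

This is why +52MHz is a natural starting point: it's exactly four bins up from stock.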


----------



## 8051

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yes that is the MAX safe voltage but you do not need to have it that high. Just add clock to the core at 13MHz increments with 0% voltage and 120% power until you crash. Looking at your settings try + 52MHz and go from there.


So at a 100% voltage slider in Precision XOC you'll only get 1.093V? I swear I've seen it boost to 1.1 (with the Strix XOC VBIOS).


----------



## Mooncheese

Quote:


> Originally Posted by *8051*
> 
> If you're hitting PWR perfcaps you should consider the Strix XOC VBIOS -- it made all the diff. w/my air-cooled 1080Ti, it doesn't throw PWR perfcaps ever anymore and I managed to get up to 2076 MHz. at anywhere from 1.063 to 1.094 volts (occasionally hitting 1.1).


You can go the other direction and eliminate the perf caps with an undervolt. I'm stable at 2025MHz in The Witcher 3 at 1.000V and seldom see perf caps, whereas with default voltage the clocks would dip all the way down to 1936MHz in demanding scenes. Temps are 4-5C cooler, which for me equates to 13MHz, keeping the core under 49C. Additionally, the extra voltage is going to shorten the life of the silicon, and at 1.093V you're at what, 400W, for 50-75 more MHz? Doesn't seem sensible.


----------



## KedarWolf

So, revisiting BIOSes on a clean install of the latest Windows 10 Insider Preview Slow Ring, installed from a USB made from a Windows ISO.

Of all the BIOSes I tested on my standard Windows install, with the exact same voltage curve and clock/memory settings and the power limit correctly set by a .bat file, the Strix OC BIOS wins over the Arctic Storm BIOS by 100 points in Time Spy.









Now to test Heaven; the Insider Preview is supposed to fix the heavy stuttering in games, an issue that affects Heaven.









Edit: None of the other BIOSes I tested performed as well.


----------



## RevanCorana

Is getting a 1080 Ti still worth it? How far off do you think Volta is?


----------



## 8051

Quote:


> Originally Posted by *Mooncheese*
> 
> You can go the other direction and eliminate the perf caps with an undervolt. I'm stable at 2025 MHz in The Witcher 3 with 1.000v and seldom see perf caps whereas with default voltage the clocks would dip all the way down to 1936 MHz in demanding scenes. Temps are 4-5 C cooler, which for me equates to 13 MHz keeping the core under 49C. Additionally, the extra voltage is going to shorten the life of the silicon and at 1.093V youre at what 400W for 50-75 more MHz? Doesn't seem sensible.


I'm sure the extra voltage is going to shorten the life of the IC, but by how much? Chances are I won't even have this GPU 5 years from now.


----------



## ZealotKi11er

Quote:


> Originally Posted by *RevanCorana*
> 
> Is getting a 1080ti still worth it? How far off is Volta you think?


It depends. The 1080 Ti's replacement is probably 1.5 years away.


----------



## ELIAS-EH

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yes that is the MAX safe voltage but you do not need to have it that high. Just add clock to the core at 13MHz increments with 0% voltage and 120% power until you crash. Looking at your settings try + 52MHz and go from there.


Thanks for your reply.

Above +50MHz, Heaven benchmark crashes within 5 seconds.
At +50MHz it starts at 2025MHz and is stable at 2012MHz with a max temp of 52 degrees C (I increased the fan curve a bit).
But if I decrease the voltage slider below 100%, I also crash; the voltage fluctuates a lot: 1.062, 1.081, 1.07...
At 100% voltage it is 1.081V-1.092V without crashing @ 2012MHz.

Why is GPU OC murky like this, not clear-cut like CPU OC?









Anyway, I have two options: 1974MHz at 1.062V or 2012MHz at 1.092V.
Can you give me your advice on which one to choose, please? And how much FPS difference is there? Thank you all.
best regards


----------



## WillG027

Quote:


> Originally Posted by *ELIAS-EH*
> 
> Anyway, I have two options, 1974mhz at 1.062V or 2012mhz at 1.092V.
> Can you gives me your advice which one to choose please ? and how much FPS is the difference?, thank you all
> best regards


Might be worth trying 1911 at 1.0V .
Not much performance loss - but big cooling gains.


----------



## xTesla1856

Anyone have a quick crash course on undervolting using AB curve? I can't get it to work for the life of me. Thanks


----------



## encrypted11

Quote:


> Originally Posted by *Dasboogieman*
> 
> Until someone does a teardown we won't know for sure. My gut instinct is telling me these are new modules, because if Micron's manufacturing had matured enough that EVGA could bin the 11Gbps modules to pull out mass-production 12Gbps ones, Micron might as well release a new SKU; this is a high-volume part, unlike the Titan Xp.




Source: EVGA Jacob


----------



## Glottis

Quote:


> Originally Posted by *RevanCorana*
> 
> Is getting a 1080ti still worth it? How far off is Volta you think?


I keep thinking about this a lot. Buying a 1080 Ti right now is a bit odd, because soon we'll start seeing gaming Volta release rumors. I reckon we'll have a GTX 2070 that matches 1080 Ti performance no later than summer 2018. A GTX 2080 might launch even sooner than that.


----------



## WillG027

Quote:


> Originally Posted by *xTesla1856*
> 
> Anyone have a quick crash course on undervolting using AB curve? I can't get it to work for the life of me. Thanks


In the curve editor, drag your selected point (e.g. 1911MHz @ 1V) to those coordinates, then flatten out the rest of the curve so that every point thereafter is at 1911 as well.

The curve should rise roughly linearly up to your selected point and then be completely flat to the right of it.


----------



## Blameless

Quote:


> Originally Posted by *8051*
> 
> 400 Watts of power is crazy. How much of that power would be going to the memory?


I hit the 375w limit of the firmware on my Aorus at 1.031v in Superposition 8k at my 24/7 clocks.

I haven't bothered flashing to firmware with a higher limit since 375w coincides nicely with what the stock cooler on this part can remove.
Quote:


> Originally Posted by *8051*
> 
> Thanks ZealotKi11er, that doesn't seem bad at all, relative to the 350 Watts the GPU is consuming. I imagine the PCB and assorted support chips must be drawing some power too.


A good chunk (40-50w) is lost via the VRM and other assorted losses probably consume another 10-20w. You are looking at the GPU itself using 250-300w on a card that is pulling 400w.
Quote:


> Originally Posted by *8051*
> 
> There's a guy here w/a modded TitanZ that he thinks pulls about 700 Watts of power (unbelievable).


I wouldn't be surprised if this were the case, depending on clocks and load, of course.
Quote:


> Originally Posted by *Sgang*
> 
> I'm planning to change the thermal pads of the Zotac AMP Extreme 1080ti, with fujypoly extreme, has anyone done it yet? suggestions?
> My main purpose is to create a connection between the VRM and the big heatsink of the card, that actually is not connected, as gamernexus has found in their review.


I've used the Fujipoly 14 and 17w/mK pads and they are a massive pain to work with. They are very stiff, rather brittle, and barely compressible. Not worth the hassle, IMO.

If the gap is very large, use a copper spacer with paste on both sides to fill it. If the gap is small, use paste only.

For a medium sized gap, get a quality thermal pad that is still easy to work with...Arctic pads are a pretty solid combination of usability, price, and performance.

A comparison I found useful (will probably need a translator, if you don't speak German): https://www.hardwareluxx.de/community/f136/alphacool-eisschich-co-waermeleitpad-test-1093326.html


----------



## xTesla1856

Quote:


> Originally Posted by *WillG027*
> 
> In the curve editor drag your selected point (ex: 1911 @ 1V) to those co-ordinates - then flatten out the rest of the curve so that every point thereafter is at 1911 as well.
> 
> so should rise somewhat linearly, diagonally upto your selected point and then is completely flat to the right after the selected point.


Thanks man, I'll give it another try tonight. Should be fun, considering this Xp maxes out at 2050 with power at 120% and voltage at +100%.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Blameless*
> 
> I hit the 375w limit of the firmware on my Aorus at 1.031v in Superposition 8k at my 24/7 clocks.
> 
> I haven't bothered flashing to firmware with a higher limit since 375w coincides nicely with what the stock cooler on this part can remove.
> A good chunk (40-50w) is lost via the VRM and other assorted losses probably consume another 10-20w. You are looking at the GPU itself using 250-300w on a card that is pulling 400w.
> I wouldn't be surprised if this were the case, depending on clocks and load, of course.
> I've used the Fujipoly 14 and 17w/mK pads and they are a massive pain to work with. They are very stiff, rather brittle, and barely compressible. Not worth the hassle, IMO.
> 
> If the gap is very large, use a copper spacer with paste on both sides to fill it. If the gap is small, use paste only.
> 
> For a medium sized gap, get a quality thermal pad that is still easy to work with...Arctic pads are a pretty solid combination of usability, price, and performance.
> 
> A comparison I found useful (will probably need a translator, if you don't speak German): https://www.hardwareluxx.de/community/f136/alphacool-eisschich-co-waermeleitpad-test-1093326.html


I used the 11W/mK pads, and yes, they were very brittle and only usable once. What do you suggest for future pads?


----------



## ELIAS-EH

Is playing Overwatch for hours a good stability test?


----------



## becks

Quote:


> Originally Posted by *ELIAS-EH*
> 
> Overwatch for hours is a good stability test?


Not really; I found it to run just fine on my test-bench profile for hours.
WoW 25-man raids are a different thing (even LFR - Looking For Raid)... the Prime95 of my game library.


----------



## Ultracarpet

Quote:


> Originally Posted by *ELIAS-EH*
> 
> Overwatch for hours is a good stability test?


I find Overwatch to be a decent test of CPU and memory overclocks, but I dunno about GPU. Also, Black Ops 3 was REALLY sensitive to my CPU overclocks. I stressed my Devil's Canyon i7 with the video-encode method for a day and a half without errors, and yet couldn't play more than a few minutes of Black Ops without it crashing.


----------



## ELIAS-EH

For CPU OC, the ultimate stability test is OCCT large data set for 3 hours.


----------



## ELIAS-EH

Guys, are you kidding me?
The difference in FPS between 1974MHz and 2012MHz is only 1 FPS most of the time, sometimes 2 FPS? Tested in 5 games.
If I had known that, I wouldn't have wasted my time and yours.
Thank you anyway; back to 1974MHz.
Best regards


----------



## ZealotKi11er

Quote:


> Originally Posted by *ELIAS-EH*
> 
> guys are you Kidding me ?
> the difference in FPS between 1974 mhz and 2012 mhz is only 1 FPS most of the time, sometimes 2 FPS? tested in 5 games
> If i know that , i didn't wasted my time and your time.
> thank u anyway, back to 1974 mhz.
> best regards


Yeah, what did you expect? 2000 to 2100 is only a 5% OC, which is 2-3% faster. If a game is getting 100 FPS, it would be 103-105 FPS.
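The back-of-envelope math behind that estimate, with a loose (assumed) scaling factor to reflect that core clock gains rarely translate 1:1 into FPS:

```python
# Rough sketch of why a 5% core OC yields only a few FPS. The 0.5 scaling
# factor is a loose assumption (memory bandwidth and CPU limits eat into
# clock gains), not a measured constant.

def expected_fps(base_fps: float, old_clock: float, new_clock: float,
                 scaling: float = 0.5) -> float:
    """Estimate FPS after a core overclock with partial scaling."""
    clock_gain = new_clock / old_clock - 1.0
    return base_fps * (1.0 + clock_gain * scaling)

# 100 FPS at 2000MHz, overclocked to 2100MHz -> only a couple of FPS more.
print(round(expected_fps(100, 2000, 2100), 1))
```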


----------



## s1rrah

Quote:


> Originally Posted by *ELIAS-EH*
> 
> guys are you Kidding me ?
> the difference in FPS between 1974 mhz and 2012 mhz is only 1 FPS most of the time, sometimes 2 FPS? tested in 5 games
> If i know that , i didn't wasted my time and your time.
> thank u anyway, back to 1974 mhz.
> best regards


Ahhhhh... the voice of reason. I totally agree. I've been running at stock most of the time and I don't really notice any difference at all over 2050MHz... I do enjoy overclocking, though, very much, but only because I can't help but tinker and modify (everything), not so much for the minuscule performance gains. Bench competitions are fun too...


----------



## Ultracarpet

Quote:


> Originally Posted by *s1rrah*
> 
> Ahhhhh ... the voice of reason. I totally agree. I've been running at stock most the time and I don't really notice any difference at all over 2050mhz ... I do enjoy overclocking, though ... very much ... but only because I can't help but tinker and modify (everything) ... not so much for the minuscule performance gains. Bench competitions are fun too ...


These days cards are pushed pretty close to their max clocks anyway, thanks to stuff like GPU Boost... Honestly, undervolting gets me more excited than overclocking now, lol.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Ultracarpet*
> 
> These days cards are pushed pretty close to their max clocks anyway thanks to stuff like GPU boost... Honestly, undervolting gets me more excited than overclocking now lol.


Which is quite sad really. With FE you still have to overclock and under-volt though.


----------



## Blameless

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I used 11W and yes it was very brittle and only usable once. Why do you suggest for future pads?


My issue with the pads that advertise extreme performance was that they often weren't even usable once; cutting, placing, and then mounting the cooler on them was often enough to damage them to the point they weren't going to be at peak performance.

In cases where pads are really necessary (as opposed to filling gaps with copper sheet) I'm probably going to stick to the Arctic thermal pads (https://www.arctic.ac/us_en/thermal-pad.html) as they seem to be the best performing pads that can be used without major fuss or risk of damaging them during cutting/install.

I did recently try the Thermal Grizzly Minus Pad 8, and I wasn't particularly impressed with them...very similar in consistency to 6.0W/mK Fujipoly and stiffer than the Arctic for similar real world performance, while being much more expensive.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah. What did you expect? 2000 to 2100 is only 5% OC which is 2-3% faster. If games is getting 100 fps it would be 103-105 fps.


Memory is where the real gains are. Higher resolutions tend to be memory bandwidth limited on GP102 and many 1080 Ti have much more memory clock headroom than core clock headroom.

I run a ~3% core OC and an 8-9% memory OC that gets me 5-6% more performance in _Elite: Dangerous_ at 4k resolution with 4k custom texture/shadow map size as well.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Blameless*
> 
> My issue with the pads that advertise extreme performance was that they often weren't even usable once; cutting, placing, and then mounting the cooler on them was often enough to damage them to the point they weren't going to be at peak performance.
> 
> In cases where pads are really necessary (as opposed to filling gaps with copper sheet) I'm probably going to stick to the Arctic thermal pads (https://www.arctic.ac/us_en/thermal-pad.html) as they seem to be the best performing pads that can be used without major fuss or risk of damaging them during cutting/install.
> 
> I did recently try the Thermal Grizzly Minus Pad 8, and I wasn't particularly impressed with them...very similar in consistency to 6.0W/mK Fujipoly and stiffer than the Arctic for similar real world performance, while being much more expensive.
> Memory is where the real gains are. Higher resolutions tend to be memory bandwidth limited on GP102 and many 1080 Ti have much more memory clock headroom than core clock headroom.
> 
> I run a ~3% core OC and an 8-9% memory OC that gets me 5-6% more performance in _Elite: Dangerous_ at 4k resolution with 4k custom texture/shadow map size as well.


Fujipoly 11W/mK pads were a must for me with the 290X under water. The stock EK pads were garbage. I do not know why they charge so much for blocks now and include terrible pads. I was hitting 80C+ with stock pads and 55C with the 11W ones. Since my 1080 Ti is a Hybrid and the VRM portion is air cooled, I wanted to get good pads to replace the Nvidia stuff.


----------



## Blameless

Most OEM thermal pads are between 1.0 and 3.0W/mK (those light green, highly conformable/foamy pads are near the low end of this) and are often too thick on the stock and aftermarket coolers that include them. The difference in temps between a quality 5+W/mK pad of the correct thickness (thick enough that it's adequately compressed, but not so thick it's permanently deformed or develops cracks) and much higher thermal conductivity pads is usually pretty small, at least on anything you should be using pads on at all.

EK's pads are 3.5W/mK and very squishy...probably sufficient, but I wouldn't use them on the VRM either.
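The diminishing returns described above fall straight out of the conduction formula dT = P·t/(k·A). A rough sketch with made-up but plausible numbers (a 10 mm x 10 mm mosfet dissipating 3 W through a 1 mm pad):

```python
def pad_delta_t(power_w, thickness_m, k_w_per_mk, area_m2):
    """Temperature drop across a thermal pad: dT = P * t / (k * A)."""
    return power_w * thickness_m / (k_w_per_mk * area_m2)

area = 0.01 * 0.01  # 10 mm x 10 mm component footprint, in m^2
for k in (1.0, 3.5, 6.0, 17.0):
    dt = pad_delta_t(3.0, 0.001, k, area)
    print(f"{k:4.1f} W/mK -> {dt:5.1f} C across a 1 mm pad")
```

The jump from 1 to ~5 W/mK saves tens of degrees across the pad; going from 6 to 17 W/mK saves only a few, which is why correct thickness and compression matter more than exotic conductivity ratings.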


----------



## ZealotKi11er

Quote:


> Originally Posted by *Blameless*
> 
> Most OEM thermal pads are between 1.0 and 3.0W/mK (those light green, highly conformable/foamy pads are near the low end of this) and are often too thick on the stock and aftermarket coolers that include them. The difference in temps between a quality 5+W/mK pad of the correct thickness (thick enough that it's adequately compressed, but not so thick it's permanently deformed or develops cracks) and much higher thermal conductivity pads is usually pretty small, at least on anything you should be using pads on at all.
> 
> EK's pads are 3.5W/mK and very squishy...probably sufficient, but I wouldn't use them on the VRM either.


Oh ok. So the 6W is an upgrade. Yeah, I have a bunch of eBay pads for old cards, to change crusty pads lol. Using the good stuff for my better cards. Probably going to buy $50 worth of Arctic pads just to have.

Also, regarding the 1080 Ti FE PCB: the FE cooler has some weird pads for the voltage regulators, but the EK block does not cover them.


----------



## 8051

Blameless

Where do you get copper sheet to fill in gaps?


----------



## Blameless

Quote:


> Originally Posted by *8051*
> 
> Blameless
> 
> Where do you get copper sheet to fill in gaps?


You can order in bulk from metal suppliers or just go to an arts and crafts or hardware store for smaller amounts.

I try to use copper for any gap over 1mm in thickness (assuming I can make sure it won't short anything and that the surfaces aren't so irregular that a pad is absolutely needed), and just paste for anything less than 0.25mm.


----------



## Sgang

Regarding the thermal pads, I had a bunch of pads of different measures and sizes from Fujipoly for work, so that's why I will try with these. No personal preference... I will try the Arctic for the next project!









Here is my card; you can see the imperfections of the thermal system.





In these pictures the pads are visible... and also visible is the absurd rubber filler between the VRM and the heatsink.
Do you have any suggestions on where and how to apply? Do you think copper is a viable solution for that VRM layout? (There's a heatsink on the VRM too.)


----------



## ZealotKi11er

Quote:


> Originally Posted by *Sgang*
> 
> Regarding the thermal pads, i had a bunch of pads of different measures and size from fujypoly for work, so that's why i will try with these. No personal preference... i will try the Arctic for the next project!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> here is my card, and you can see the imperfections of the thermal system
> 
> 
> 
> 
> 
> in these pictures the pads are visible... and is also visible the absurde rubber filler between VRM and Heatsink
> do you have some suggestions on where and how to apply ? the copper do you think is a viable solution for that VRM conformation ? (there's an heatsink also on the VRM)


I would remove the rubber filler so it lets air through the heatsink and cools the smaller VRM heatsink.


----------



## Blameless

Yeah, I'd pull off that rubber strip to allow for more airflow to the VRM sink.

Those soft green thermal pads are probably junk and I'd replace the one under the independent VRM sink. The VRAM and memory VRM pads can be left alone though.


----------



## Sgang

Nice idea, so you think it's better to remove the rubber instead of replacing it with some thermal pads to make contact with the sink?
@blameless What do you mean by "The VRAM and memory VRM pads can be left alone though"? (Sorry, English is not my main language.)


----------



## KedarWolf

Quote:


> Originally Posted by *Sgang*
> 
> Nice idea, so you think it's better to remove the rubber, instead of replace it with some thermalpads and make contact with the sink?
> @blameless What you mean with "The VRAM and memory VRM pads can be left alone though" (sorry english is not my main language
> 
> 
> 
> 
> 
> 
> 
> )


If money is no object, check out these thermal pads at 17W/mK.

https://modmymods.com/alphacool-ice-thermal-pad-17w-mk-100x100x1-5mm-1011309.html

Edit: I went with 14W/mK for memory and VRMs and 11W/mK for everywhere else; I'm a poor guy.


----------



## GreedyMuffin

I can do 2075 at 1.050V, but I think I'll stick to 1949 at 0.900V. Temps are sub 40 at that voltage.


----------



## Sgang

Quote:


> Originally Posted by *KedarWolf*
> 
> If money is no object check out these thermal pads at 17W/mK.
> 
> https://modmymods.com/alphacool-ice-thermal-pad-17w-mk-100x100x1-5mm-1011309.html




$_$


----------



## Blameless

Quote:


> Originally Posted by *Sgang*
> 
> Nice idea, so you think it's better to remove the rubber, instead of replace it with some thermalpads and make contact with the sink?


Yes. You aren't going to get enough contact to justify the airflow obstruction with that design.
Quote:


> Originally Posted by *Sgang*
> 
> @blameless What you mean with "The VRAM and memory VRM pads can be left alone though" (sorry english is not my main language
> 
> 
> 
> 
> 
> 
> 
> )


You can leave the pads on the main heatsink, the ones that cool the GDDR5X, and smaller VRM area, alone. You might see some small improvement by replacing them, but given the lower thermal density of these parts, probably not enough to justify replacing them.


----------



## 8051

Quote:


> Originally Posted by *Sgang*
> 
> Regarding the thermal pads, i had a bunch of pads of different measures and size from fujypoly for work, so that's why i will try with these. No personal preference... i will try the Arctic for the next project!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> here is my card, and you can see the imperfections of the thermal system
> 
> 
> 
> 
> 
> in these pictures the pads are visible... and is also visible the absurde rubber filler between VRM and Heatsink
> do you have some suggestions on where and how to apply ? the copper do you think is a viable solution for that VRM conformation ? (there's an heatsink also on the VRM)


The rubber filler might be thermally conductive -- I think there are such things.


----------



## Blameless

Quote:


> Originally Posted by *8051*
> 
> The rubber filler might be thermally conductive -- I think there are such things.


Everything is thermally conductive, to one degree or another. I still think that rubber strip is just a rubber strip placed there to prevent the fins of the main heatsink and the VRM heatsink from damaging each other.

Regardless, with how much air flow it blocks, it wouldn't matter if it was pure silver and soldered to both heatsinks, it would probably harm performance of the VRM heatsink more than it would help.


----------



## s1rrah

Quote:


> Originally Posted by *Blameless*
> 
> You can order in bulk from metal suppliers or just go to an arts and crafts or hardware store for smaller amounts.
> 
> I try to use copper for any gap over 1mm in thickness (assuming I can make sure it won't short anything and that the surfaces aren't so irregular that a pad is absolutely needed), and just paste for anything less than 0.25mm.


I happen to have a bunch of these laying around from earlier this year when I was planning on re-pasting my Alienware 17 Pascal machine ...

...



...

Here's a link; stupid cheap too: https://www.amazon.com/gp/product/B00E5SMY0W/ref=oh_aui_search_detailpage?ie=UTF8&psc=1

I never got around to using them on the Alienware laptop as my temps have never caused throttling of either CPU or GPU on that (ridiculously great) machine ... so do you think these might work in place of thermal pads on an MSI Seahawk 1080 Ti? I would most likely use a thin layer of thermal paste on both sides of the shim ...

My only worry is that they would not have any "give" and that the block might not make good contact with the GPU die.

Great idea ... copper shims have *got* to be many times better than any thermal pad if contact is right ...


----------



## CrazyElf

Been busy at work, but has anyone else tried overclocking 1080 Tis in SLI? I left the voltage alone and I haven't passed 2025 MHz in SLI with both cards synchronized. Disabling the synchronization does not seem to push clocks past 2025 MHz either. The temperature of the top card was 71C, and this was on a hot day when I tested (room temps were likely in excess of 25C). Just tried, and +325 MHz seems to be the best they can do together in SLI.

I have not tried yet, but I might push the VRAM more, as that seems more likely to do something than trying to squeeze 25-50 MHz out of the core. If anything, undervolting the core seems like the best option. I wonder if it is worth overvolting the VRAM. I don't think it is worth overvolting the core. On that note, it seems like overvolting the core or the "shunt" mod turns a 300W card into a 400W card with single-digit performance gains (essentially the 1080 Ti equivalent of turning your RX 480 into a 580, with perhaps even worse gains from a performance-gained-versus-energy-used POV).

The main reason to get third-party cards is the cooler, and I guess having an overkill VRM helps ensure maximum VRM efficiency (as opposed to VRMs at higher temperatures). In any event, the premium isn't too big: $50 to $120 USD for a premium GPU like the Aorus or Lightning. The only other option is something like the Raijintek Morpheus, which easily costs as much and doesn't offer a VRM upgrade.

For those that need a guide for Memtest G80:
https://github.com/ihaque/memtestG80

I think that if you run 2 instances of CMD, and flag it, for example,

Instance 1:
MemtestG80.exe 1024 1000

Instance 2 of CMD (open up a new command line to do this):
MemtestG80.exe 1024 1000 --gpu 1 g1

This should allow you to run both at once in SLI (a useful trick if you want to spend less time checking for stability).
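To scale those two commands up to the several instances per GPU it takes to saturate the memory controller, a small launcher helper can generate every command line. This is a hypothetical sketch: the flag order follows the example above, so check `MemtestG80.exe --help` on your build before relying on it; launch each command in its own console (e.g. with `subprocess.Popen`).

```python
def memtest_commands(num_gpus, instances_per_gpu, mb=1024, iterations=1000,
                     exe="MemtestG80.exe"):
    """Build one MemtestG80 command line per instance, per GPU."""
    return [[exe, "--gpu", str(gpu), str(mb), str(iterations)]
            for gpu in range(num_gpus)
            for _ in range(instances_per_gpu)]

# Two 1080 Tis in SLI, eight instances each (enough to load the
# memory controller, per this thread):
for cmd in memtest_commands(2, 8):
    print(" ".join(cmd))
```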

Quote:


> Originally Posted by *Blameless*
> 
> Does consumer pascal even have any ECC enabled? Querying the card with nvidia-SMI seems to suggest not, but maybe it's not being reported correctly.
> 
> In my experience, eight instances (this is important, because it takes at least 4-6 to saturate the memory controller) of MemtestG80 will find unstable memory on my part much faster than any test I've tried. I don't have FFIV, but I do use the FFXIV Stormblood benchmark in a loop to stability test...at ~65C, 6048 memory will produce errors in MemtestG80 inside 5-10 minutes, but I can loop Stormblood bench at 4k for several hours before running into issues.
> 
> Temperature is also important. MemtestG80 will not pull as much power as graphically intensive tests, so pump/fan speeds need to be reduced to make sure the memory is hitting worst case temperatures.
> 
> Regardless, ensuring no errors are reported _and_ that performance is still scaling correctly with increased memory clocks should be a very strong indicator that the memory is stable and that any ECC, if present, isn't confusing things.


Interesting read for those who want to know about MemtestG80's error checking abilities:
http://www.cs.stanford.edu/people/ihaque/posters/sc-2009.pdf

Alongside HCI Memtest for DRAM, it may be one of the best error-finding tools in Windows.

ECC is disabled by default on consumer GPUs. If you were to run Nvidia's deviceQuery, you'd get something like the following:

https://devtalk.nvidia.com/default/topic/1000116/driver-for-gtx-1080-ti/

Quote:


> Device 0: "Graphics Device"
> CUDA Driver Version / Runtime Version 8.0 / 8.0
> CUDA Capability Major/Minor version number: 6.1
> Total amount of global memory: 11169 MBytes (11711938560 bytes)
> (28) Multiprocessors, (128) CUDA Cores/MP: 3584 CUDA Cores
> GPU Max Clock rate: 1582 MHz (1.58 GHz)
> Memory Clock rate: 5505 Mhz
> Memory Bus Width: 352-bit
> L2 Cache Size: 2883584 bytes
> Maximum Texture Dimension Size (x,y,z) 1D=(131072), 2D=(131072, 65536), 3D=(16384, 16384, 16384)
> Maximum Layered 1D Texture Size, (num) layers 1D=(32768), 2048 layers
> Maximum Layered 2D Texture Size, (num) layers 2D=(32768, 32768), 2048 layers
> Total amount of constant memory: 65536 bytes
> Total amount of shared memory per block: 49152 bytes
> Total number of registers available per block: 65536
> Warp size: 32
> Maximum number of threads per multiprocessor: 2048
> Maximum number of threads per block: 1024
> Max dimension size of a thread block (x,y,z): (1024, 1024, 64)
> Max dimension size of a grid size (x,y,z): (2147483647, 65535, 65535)
> Maximum memory pitch: 2147483647 bytes
> Texture alignment: 512 bytes
> Concurrent copy and kernel execution: Yes with 2 copy engine(s)
> Run time limit on kernels: Yes
> Integrated GPU sharing Host Memory: No
> Support host page-locked memory mapping: Yes
> Alignment requirement for Surfaces: Yes
> *Device has ECC support: Disabled*
> Device supports Unified Addressing (UVA): Yes
> Device PCI Domain ID / Bus ID / location ID: 0 / 3 / 0
> Compute Mode:
> < Default (multiple host threads can use ::cudaSetDevice() with device simultaneously) >
> 
> deviceQuery, CUDA Driver = CUDART, CUDA Driver Version = 8.0, CUDA Runtime Version = 8.0, NumDevs = 1, Device0 = Graphics Device
> Result = PASS
> [email protected]:~/NVIDIA_CUDA-8.0_Samples/1_Utilities/deviceQuery$


Only the Quadros have ECC and of course the Nvidia HBM cards.

Quote:


> Originally Posted by *Blameless*
> 
> 
> 
> Just make sure "CUDA - Force P2 State" is set to off in the global driver profile.


Yep. Fun tip: you get some modest gains (at the expense of power consumption) if you disable the P2 state in mining:

http://cryptomining-blog.com/tag/p2-power-state-in-compute/

I'm not sure it is great from a mining-gained-versus-power-consumption POV, though. GDDR5X as a whole seems pretty awful for mining.

Quote:


> Originally Posted by *RevanCorana*
> 
> Is getting a 1080ti still worth it? How far off is Volta you think?


I lean towards the yes for a few reasons.

We don't know what consumer Volta will look like. I strongly suspect, though, that we will see a GTX 1080 sized Volta (or at most GTX 980 sized, so around 400mm^2), and only shortly after that a larger Titan Volta. It will take another 8 months or so afterwards before the GTX 1180 Ti (or whatever they call it) comes out. So that might be well into 2019. The only thing that would change that is if AMD's Navi is as good as we hope; then that might force Nvidia's hand.

Quote:


> Originally Posted by *mgoldshteyn*
> 
> I am running an SLI configuration of a pair of EVGA 1080 Ti SC2 Hybrid Gaming cards on Windows 7 with the latest nVidia drivers. I've run about 8 days of 24/7 compute on both cards and noticed that their ability to overclock has drastically gone down from the time I purchased them.
> 
> When I first got the cards, I could easily get 2,050 MHz stable, even at 1.050 V. Now, I can't even get 1,900 MHz on either card, regardless of voltage/power. Is it possible that by running at 120% power and max voltage (configured by setting K Boost on and raising the voltage slider all the way up using EVGA Precision XOC), I actually managed to damage the hardware in both cards?
> 
> Why else would the cards degrade so rapidly after only about 8 days of total compute time? I now get crashes just starting EVGA Precision XOC with K Boost turned on, even though it is defaulting to a 0 MHz offset, which results in something like 1,931 MHz / 1,961 MHz on my two cards, respectively. Adding voltage only results in the cards crashing sooner. Temperatures are great, both GPU, Memory, power regulation. Everything is staying below 60 C at load, with GPU temperatures much lower (e.g., 50 C).
> 
> I am testing 1,850 MHz with 1.000 V right now, since I can't even get the 1,900s to work at any voltage and higher voltages (e.g., >1.025 V) only cause crashes sooner. I have switched to MSI Afterburner, because I can't even get EVGA Precision XOC to come up without halting the OS. Afterburner comes up with the cards clocked to a locked 1,900 MHz, that I've set, so I at least have a chance to try new settings. But, running at 1,900 MHz at any reasonable voltage ultimately results in a system halt after anywhere between 30 mins to 4 hours.
> 
> I am wondering if one of the following two things has occurred:
> 
> Electromigration damaged the hardware of the cards. I know this sounds impossible after such a short period of 24/7 running and given that the max voltages were still bound by the 1.093 V cap, but the behavior I am seeing is very similar to the way my Pentium 4s behaved when overvolted/overclocked and having experienced electromigration (i.e., See: Sudden Northwood Death Syndrome (SNDS)).
> The thermal compound that EVGA uses on their SC2 Hybrid Gaming cards has somehow worn down (but the temps are still low, so I doubt this happened).
> If anybody has any ideas on what happened or what I can do to fix things, I am all ears.


Sounds like electromigration to me. It's actually a risk with overvolting and the "shunt" mod, IMO.


----------



## KedarWolf

Quote:


> Originally Posted by *Radox-0*
> 
> thermal grizzly kryonaut is not liquid metal, that would be Thermal Grizzly Conductonaut.
> 
> Kyronaught itself is as easy as other TIM to use.


I prefer Mastergel Maker Nano over Kryonaut. Performs a bit better and is much easier to apply.

My new go-to paste.









http://www.enostech.com/mastergel-makel-paste-review/


----------



## KedarWolf

470 of 1,000 iterations of MemtestG80 at 1024 MB, 8 instances running (I'm running Twitch in the background in Chrome to keep me entertained), and zero errors at 6162 MHz memory, 2075 core. Jump one bin higher on memory and I get errors, though, and ambient temps are quite low right now.

At summer temps I'm pretty sure I'd settle at the 6142 I normally run my card at 24/7.


----------



## WillG027

Does anyone know if this 'MemtestG80' is known to work correctly on GDDR5X ?


----------



## Dasboogieman

Quote:


> Originally Posted by *WillG027*
> 
> Does anyone know if this 'MemtestG80' is known to work correctly on GDDR5X ?


It does, but YMMV. It didn't catch memory ECC resends on my GPU, because the point where mine throws detectable errors is beyond the VRAM stability and performance-loss inflection point.


----------



## Blameless

Quote:


> Originally Posted by *CrazyElf*
> 
> For those that need a guide for Memtest G80:
> https://github.com/ihaque/memtestG80
> 
> I think that if you run 2 instances of CMD, and flag it, for example,
> 
> Instance 1:
> MemtestG80.exe 1024 1000
> 
> Instance 2 of CMD (open up a new command line to do this):
> MemtestG80.exe 1024 1000 --gpu 1 g1
> 
> This should allow you to run both at once in SLI (a useful trick if you want to spend less time checking for stability).


Good tip for SLI users, though you'll want to run many more than two instances.

I normally run eight or nine on my 1080 Ti and have found it impossible to get full memory controller load with less than six.
Quote:


> Originally Posted by *CrazyElf*
> 
> Interesting read for those who want to know about MemtestG80's error checking abilities:
> http://www.cs.stanford.edu/people/ihaque/posters/sc-2009.pdf


Interesting slide, sums up the relevant memtestG80 dev comments nicely.
Quote:


> Originally Posted by *CrazyElf*
> 
> ECC is disabled by default on consumer GPUs. If you were to run Nvidia's deviceQuery, you'd get something like the following:
> 
> https://devtalk.nvidia.com/default/topic/1000116/driver-for-gtx-1080-ti/
> Only the Quadros have ECC and of course the Nvidia HBM cards.


nvidia-smi states as much as well.
Quote:


> Originally Posted by *CrazyElf*
> 
> you get some modest gains (at the expense of power consumption) if you disable the P2 state in mining


It's worth it. Hash rate goes up ~5% for a similar increase in power consumption.


----------



## Alexium

Does the FE 1080 Ti really not have a VRM temp sensor? Got my 1080 Ti 2 weeks ago, at first I just shrugged when I'd seen there's no VRM temp reading in GPU-Z, but now that I think about, it seems really odd.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Alexium*
> 
> Does the FE 1080 Ti really not have a VRM temp sensor? Got my 1080 Ti 2 weeks ago, at first I just shrugged when I'd seen there's no VRM temp reading in GPU-Z, but now that I think about, it seems really odd.


No VRM sensor. Really stupid move by Nvidia. I'd feel better if I knew how hot the VRMs run.


----------



## gerardfraser

Quote:


> Originally Posted by *Alexium*
> 
> Does the FE 1080 Ti really not have a VRM temp sensor? Got my 1080 Ti 2 weeks ago, at first I just shrugged when I'd seen there's no VRM temp reading in GPU-Z, but now that I think about, it seems really odd.


Nvidia has never had a VRM sensor.
However, there are a couple of board partner cards that have VRM sensors, such as the MSI Lightning and EVGA's new iCX range.


----------



## nrpeyton

Evening,

Anyone know how to manually *force* a *voltage* *lower* than *800mv?* (on a 1080 TI FE).

800mv (0.8v) is the lowest on the 'curve window' in MSI Afterburner!

Any help would be much appreciated!

Thanks,

Nick Peyton

Quote:


> Originally Posted by *s1rrah*
> 
> I happen to have a bunch of these laying around from earlier this year when I was planning on re pasting my Alienware 17 Pascal machine ...
> 
> ...
> 
> 
> 
> ...
> 
> Here's a link; stupid cheap too: https://www.amazon.com/gp/product/B00E5SMY0W/ref=oh_aui_search_detailpage?ie=UTF8&psc=1
> 
> I never got around to using them on the Alienware laptop as my temps have never caused throttling of either CPU or GPU on that (ridiculously great) machine ... so do you think these might work in place of thermal pads on a MSI Seahawk 1080 ti? I would most likely use a small layer of thermal paste on both sides of the shim ...
> 
> My only worry is that they would not have any "give" and that the CPU block might not make good contact with the GPU die
> 
> Great idea ... copper shims have *got* to be many times better than any thermal pad if contact is right...


I tried this (replaced the pads on my GDDR5X memory chips with these on my 1080 Classified last round) and temps weren't any better. In fact it created more problems than it's worth. Due to there being no *give*, you end up with a few memory chips having excellent temps while others are absolutely appalling. The pads are designed to absorb the difference in height between different components. Also, any good pad will compress to at least 50% of its original height, so a 0.5mm pad is really only a 0.25mm pad. It also just seems to "absorb" the whole GDDR5X chip surface better than a copper shim does. If we were talking about much bigger pad sizes then it might offer an improvement, but not at the heights we are working with. I've tried it. Numerous times.

_Even 3 W/mK ($2 for 10cm x 10cm) will do just as well as something 10x the price. You'd be *very* lucky to get a 1C difference by going to 20 W/mK._


----------



## ZealotKi11er

Quote:


> Originally Posted by *nrpeyton*
> 
> Evening,
> 
> Anyone know how to manually *force* a *voltage* *lower* than *800mv?* (on a 1080 TI FE).
> 
> 800mv (0.8v) is the lowest on the 'curve window' in MSI Afterburner!
> 
> Any help would be much appreciated!
> 
> Thanks,
> 
> Nick Peyton


I do not think it's possible. It's there for a reason. Lowering it too much might cause other parts to become unstable if they are dependent on vCore.


----------



## nrpeyton

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I do not think its possible. It there for a reason. Lowering it too much might cause other part to become unstable if they are dependent on vCore.


The reason I want to do it is to save energy. Even at the lowest possible voltage (800mV) and a 999MHz GPU clock, memory at -512, and with all in-game settings maxed out at ULTRA at 2560 x 1440, I *still* get 60 FPS consistently, 24/7.

The GPU is still consuming about 86 watts unnecessarily.

If I could lower that voltage to maybe 650mV, I would save even more energy.

See the screenshot below showing 60 FPS:


----------



## ZealotKi11er

Quote:


> Originally Posted by *nrpeyton*
> 
> Reason I want to do it is to save energy. Even at the lowest possible voltage (800mv) and 999Mhz GPU clock, memory -512 and then with all settings in-game maxed out at ULTRA on 2560 x 1440 I *still* get 60 FPS consistently 24 / 7.
> 
> GPU is still consuming about 100 watts unnecessarily.
> 
> If I could lower that voltage. To maybe 650mv I would save even more energy.
> 
> See screenshot below:


Sell it and buy a 1070. Also, you are doing it wrong: at 1GHz + 0.8V you are overvolting that core. Get the max clock speed you can at 0.8V or just a bit lower, and limit FPS.


----------



## nrpeyton

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Sell it and buy 1070. Also you are doing ti wrong. 1GHz + 0.8v you are overvlolting that core. Get the max clock speed with the GPU you can with 0.8v or just a bit lower and limit fps.


I reckon I could maintain 60 FPS on this game at that resolution easily at a lower voltage & lower clock speed. (_Even_ lower than 800mv and 999Mhz core).

When I'm playing at 4k on a more graphically demanding game I am doing the opposite. I have it overclocked to within an inch of its life. At 1.093v, +575 memory & 2113 Mhz with shunt mod allowing me to draw up to nearly 400w.

*But* when playing the game mentioned in my earlier post -- and when I'm also not using the 4k TV I'd rather save energy.

I've seen the card idle at lower than 800mv.. so there must be a way to force that lower voltage.

Is anyone aware of any overclocking apps (_like, but not,_ MSI Afterburner) that allow a voltage lower than 800mV on the voltage/frequency curve window?

I'm surprised we still don't have either a BIOS editor OR at least something that can talk to the voltage controller on these cards.
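As a rough illustration of what a sub-800mV floor would buy: dynamic power scales roughly with V²·f. This is a crude sketch that ignores static leakage and the fixed draw of fans and memory, so treat the result as optimistic, using the ~86 W / 0.800 V / 999 MHz figures from above:

```python
def scaled_power(base_power_w, v_old, v_new, f_old, f_new):
    """Crude CMOS dynamic-power estimate: P is proportional to V^2 * f."""
    return base_power_w * (v_new / v_old) ** 2 * (f_new / f_old)

# ~86 W at 0.800 V / 999 MHz; the same clock forced down to 0.650 V:
print(round(scaled_power(86, 0.800, 0.650, 999, 999)))  # -> 57
```

So a 650mV floor at the same clock would plausibly shave ~30 W off the card, which is why the 800mV limit in the curve editor is frustrating for this use case.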


----------



## Thoth420

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah had the same freeze in BF1.


I had the same problem on my sig rig; it was pinned down to dirty power in my house. It occurred through a UPS too. Worth investigating.


----------



## Blameless

Quote:


> Originally Posted by *nrpeyton*
> 
> I tried this (replaced pads on my GDDR5X memory chips with these on my 1080 Classified last round) and temps weren't any better.


Memory is often the lowest thermal density component on a video card that has any kind of thermal interface attaching it to the heatsink, which is why you won't see an improvement. Going from 3 to 20 W/mK doesn't mean much when you only have two watts of heat per square cm to move.
Quote:


> Originally Posted by *nrpeyton*
> 
> Due to there being no *give* you end up with a few memory chips having excellent temps while others absolutely appauling.


A thin layer of paste is the opposite of what works best when using copper foil or plates as a spacer.

I use a viscous TIM (Ceramique is my go to for applications like this...it's not conductive or capacitive at all and lasts forever) and put about as much on each mosfet as I would put on an entire GPU die. Any differences in height are mitigated by the paste's bondline (which is still much thinner than a fully compressed pad).


----------



## KedarWolf

New drivers really bad for benching.

I lost 600 points in Time Spy.


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> New drivers really bad for benching.
> 
> I lost 600 points in Time Spy.


Edit: Had an issue with my water pump I fixed. Affected my CPU score as it was throttling.

With the Strix OC BIOS I just got an 11201 graphics score and 11152 overall, which is about 100 points better than what I was doing with the Arctic Storm BIOS with the same voltage curve.


----------



## nuno_p

Quote:


> Originally Posted by *nrpeyton*
> 
> I reckon I could maintain 60 FPS on this game at that resolution easily at a lower voltage & lower clock speed. (_Even_ lower than 800mv and 999Mhz core).
> 
> When I'm playing at 4k on a more graphically demanding game I am doing the opposite. I have it overclocked to within an inch of its life. At 1.093v, +575 memory & 2113 Mhz with shunt mod allowing me to draw up to nearly 400w.
> 
> *But* when playing the game mentioned in my earlier post -- and when I'm also not using the 4k TV I'd rather save energy.
> 
> I've seen the card idle at lower than 800mv.. so there must be a way to force that lower voltage.
> 
> Is anyone aware of any overclocking apps (_like but not_ msi afterburner) that allows a lower voltage than 800mv on the voltage/frequency curve window.
> 
> I'm surprised we still don't have either a BIOS editor yet OR at least something that can talk to the voltage controller on these cards.


I want to know the same for my 1070 and 1060, but I haven't found anything yet.


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Sell it and buy a 1070. Also, you are doing it wrong. At 1GHz + 0.8v you are overvolting that core. Get the max clock speed you can out of the GPU at 0.8v or just a bit lower, and limit fps.
> 
> 
> 
> I reckon I could maintain 60 FPS on this game at that resolution easily at a lower voltage & lower clock speed. (_Even_ lower than 800mv and 999Mhz core).
> 
> When I'm playing at 4k on a more graphically demanding game I am doing the opposite. I have it overclocked to within an inch of its life. At 1.093v, +575 memory & 2113 Mhz with shunt mod allowing me to draw up to nearly 400w.
> 
> *But* when playing the game mentioned in my earlier post -- and when I'm also not using the 4k TV I'd rather save energy.
> 
> I've seen the card idle at lower than 800mv.. so there must be a way to force that lower voltage.
> 
> Is anyone aware of any overclocking apps (_like but not_ msi afterburner) that allows a lower voltage than 800mv on the voltage/frequency curve window.
> 
> I'm surprised we still don't have either a BIOS editor yet OR at least something that can talk to the voltage controller on these cards.
Click to expand...

Try Precision XOC and Google how to do a custom voltage curve with it. I don't know if it can go below 800mv; I just saw that it supports custom curves.

Edit: No, you can't. Afterburner actually gives more control.


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> Try Precision XOC and Google how to do a custom voltage curve with it but I don't know if you can, just saw you can do custom curves.
> 
> Edit: No, you can't. More control with Afterburner actually.


yup I tried Precision XOC too. No luck. :-(


----------



## s1rrah

Quote:


> Originally Posted by *nrpeyton*
> 
> I tried this (replaced pads on my GDDR5X memory chips with these on my 1080 Classified last round) and temps weren't any better. In fact it created more problems than it's worth. Due to there being no *give*, you end up with a few memory chips having excellent temps while others are absolutely appalling. The pads are designed to absorb the difference in height between different components. Also, any good pad will compress to at least 50% of its original height, so a 0.5mm pad is really only a 0.25mm pad. It also just seems to "absorb" the whole GDDR5X chip surface better than a copper shim would. If we were talking about much bigger pad sizes then it might offer an improvement, but not at the heights we are working with. I've tried it. Numerous times.
> 


Good info ... thanks ...

joel


----------



## nrpeyton

No probs m8.

Has anyone done any research into *best performing memory overclock ratios?*

For example, last round (non Ti) we found that any memory overclock ending in a 5 usually performed better than a memory overclock with a number ending in a 0.

I.E. see the graph I made below:



_Right click and select 'open new tab' on picture to view full size._

Just wondering if anyone has done any testing and noticed any patterns for the 1080Ti?


----------



## Beagle Box

Quote:


> Originally Posted by *nrpeyton*
> 
> 
> 
> Spoiler: Stuff
> 
> 
> 
> No probs m8.
> 
> Has anyone done any research into *best performing memory overclock ratios?*
> 
> 
> ...
> For example, last round (non Ti) we found that any memory overclock ending in a 5 usually performed better than a memory overclock with a number ending in a 0.
> ...
> 
> 
> Spoiler: More stuff
> 
> 
> 
> I.E. see the graph I made below:
> 
> 
> 
> _Right click and select 'open new tab' on picture to view full size._
> 
> Just wondering if anyone has done any testing and noticed any patterns for the 1080Ti?


Maybe that's true for Classifieds, but I think what you actually found was that the fastest memory speeds are found 50 ticks apart on the slider.

I've worked with two GTX 1080s (Zotac and MSI) and once I found a 'sweet spot' in memory speed, I'd find another at precisely +50 or -50 from that point. Neither card's sweet spots ended in '0' or '5'.


----------



## s1rrah

Quote:


> Originally Posted by *Beagle Box*
> 
> Maybe that's true for Classifieds, but I think what you actually found was that the fastest memory speeds are found 50 ticks apart on the slider.
> 
> I've worked with two GTX 1080s (Zotac and MSI) and once I found a 'sweet spot' in memory speed, I'd find another at precisely +50 or -50 from that point. Neither card's sweet spots ended in '0' or '5'.


Now ya'll just getting all "ancient aliens" and what not ... LOL ...

(sorry)


----------



## 8051

Quote:


> Originally Posted by *nrpeyton*
> 
> yup I tried Precision XOC too. No luck. :-(


I managed to get voltage control working w/the custom voltage curve in
Quote:


> Originally Posted by *s1rrah*
> 
> Now ya'll just getting all "ancient aliens" and what not ... LOL ...
> 
> (sorry)


We all know any frequency ending in 666 gives the best performance, because Nvidia didn't just accidentally become a dominant force in the computing world; there were blood sacrifices made.


----------



## nuno_p

Quote:


> Originally Posted by *8051*
> 
> *I managed to get voltage control working w/the custom voltage curve* in
> We all know any frequency ending in 666 gives the best performance, because Nvidia didn't just accidentally become a dominant force in the computing world, there was blood sacrifices made.


How?


----------



## KCDC

I've been wanting to upgrade the EK pads on my 1080 Tis.

Of course I want the Fujipoly 17s,

or the Alphacool 14s or 11s.

Ended up getting Thermal Grizzly Minus Pad 8s.

Still double the conductivity of the EK pads and a decent price: 8 W/mK.

Didn't break the bank on pads, with plenty left over.

Is it now ill-advised, or wasteful, to add a little thermal paste between the affected item and the pad? This was useful before... I have a ton of EK paste and Arctic Silver 5 sitting around I could use.

Looking to finally do all of this and shunt mod this weekend on both cards. Been putting it off for too long... Loop, rad and blocks need cleaning. My idles are too high.


----------



## Dasboogieman

Quote:


> Originally Posted by *KCDC*
> 
> I've been wanting to upgrade the EK pads on my 1080ti's
> 
> Of course I want the fuji poly 17s
> 
> Or the alpha cool 14s or 11s
> 
> Ended up getting Thermal Grizzly's minus pad 8s
> 
> Still double the conductivity of the EK pads and a decent price. 8W/Mk
> 
> Didn't break the bank on pads with plenty left over.
> 
> Is it now ill-advised, or wasteful, to add a little thermal paste between the affected item and the pad? This was useful before... Have a ton of ek paste and Arctic silver 5 sitting around I could use.
> 
> Looking to finally do all of this and shunt mod this weekend on both cards. Been putting it off for too long... Loop, rad and blocks need cleaning. My idles are too high.


The fewer interfaces you have, the better the thermal transfer. Unless the pads were of insufficient thickness, adding thermal paste is of dubious benefit. Just make sure you don't touch the pads with bare skin (use some gloves), as the oils on your skin will knock off a massive amount of thermal conductivity.

From my knowledge of GDDR5X heat output, I don't think the thermal conductivity of pads makes much of a difference unless the delta T is really big (i.e. the VRAM was already very hot). How much you can compress the pads (Z-height) matters a lot more, so make sure you can still screw the whole assembly down well.


----------



## KCDC

Quote:


> Originally Posted by *Dasboogieman*
> 
> The less interfaces you have, the better the thermal transfer. Unless the pads were insufficient thickness, thermal paste addition is of dubious benefit. Just make sure you don't touch the pads with bare skin (use some gloves) as the oils on your skin will knock off a massive amount of thermal conductivity.
> 
> From my knowledge of GDDR5x heat output. I don't think thermal conductivity of pads makes much of a difference unless the Delta T is really big (i.e. the VRAM was already very hot). How much you can compress the pads (Z-height) matters a lot more so make sure you can still screw the whole assembly down well.


Appreciate the prompt response. I'll have black nitrile gloves on at all times. Will also avoid physical contact. I'll save the paste for another build.









Also, I purchased the same thicknesses that EK requires, plus extra 0.5mm pads for the little areas where Nvidia had pads that EK didn't provide.


----------



## nrpeyton

Quote:


> Originally Posted by *Beagle Box*
> 
> Maybe that's true for Classifieds, but I think what you actually found was that the fastest memory speeds are found 50 ticks apart on the slider.
> 
> I've worked with two GTX 1080s (Zotac and MSI) and once I found a 'sweet spot' in memory speed, I'd find another at precisely +50 or -50 from that point. Neither card's sweet spots ended in '0' or '5'.


That makes sense too. I wonder why every +50 or -50 is stronger than the numbers in between. An overclock is still an overclock, so it shouldn't really matter. But for some reason it does.


----------



## 8051

Quote:


> Originally Posted by *nuno_p*
> 
> How?


What you do is set the curve so that the highest voltage is at the frequency you want. It'll boost up to that voltage and frequency and no farther, because you have the curve fall off at that point.

I don't even bother w/the custom voltage curve anymore though, I just add an offset to the frequency and extend the voltage range.
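The capped-curve trick described above (set the highest voltage at your target frequency so boost stops there) can be sketched as a plain data transform. The voltage/frequency points below are hypothetical illustration values, not anyone's actual curve:

```python
# Sketch: flatten an Afterburner-style V/F curve past a target voltage
# so the card boosts up to that point and no further.
def flatten_curve(curve, target_mv):
    # curve: list of (voltage_mv, freq_mhz) points, ascending by voltage
    cap = max(f for v, f in curve if v <= target_mv)
    # every point above the target gets clamped to the capped frequency
    return [(v, f if v <= target_mv else cap) for v, f in curve]

curve = [(800, 1500), (900, 1750), (1000, 1950), (1093, 2050)]
print(flatten_curve(curve, 900))
# -> [(800, 1500), (900, 1750), (1000, 1750), (1093, 1750)]
```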


----------



## Blameless

Quote:


> Originally Posted by *nrpeyton*
> 
> That makes sense too. I wonder why every +50 or -50 are stronger than numbers in-between. An overclock is still an overclock. So it shouldn't really matter. But for some reason it does.


Different timing straps, or a clock-crossing boundary that has a different latency at different memory multipliers.


----------



## WillG027

Interesting that clock numbers' endings affecting perceived performance gets a mention. I swear my 1080 Ti runs smoother at clock additions ending in 75:
+75, +175, +375, etc.
Pretty confident it's not placebo either...


----------



## Dasboogieman

Quote:


> Originally Posted by *WillG027*
> 
> Interesting that the clock number ending affecting perceived performance is mentioned - I swear my 1080ti runs smoother at clock additions ending in 75.
> +75, +175 , +375 etc.
> Pretty confident it's not placebo either. . .


You are not seeing things. What you are seeing is the effect of memory straps (i.e. latencies) on bandwidth.

If my clock speed is aligned with my strap (for me it's any factor of 10), the bandwidth difference is at least 50GB/s versus misaligned.

The catch here is that the straps get retrained at the beginning of every single Windows 10 boot (the IMC trains itself according to parameters like bus heat, VRAM heat, IMC heat, electrical factors, etc.). To find the aligned strap, I have to constantly re-test any VRAM OC, because the strap alignment is ±10 from the value of the last boot.

I just run the CUDA VRAM bandwidth test (originally designed to test the 970's 4GB array) and if it spits out 495GB/s then I know I'm aligned. Otherwise it shows 440GB/s.
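For anyone scripting this alignment check, the bandwidth tool's per-chunk output lines can be parsed and averaged with a few lines of Python. The 470 threshold below is an assumption; pick it between your own aligned (~495) and misaligned (~440) readings:

```python
# Sketch: parse the VRAM bandwidth tool's log and flag a misaligned
# strap by comparing the average DRAM bandwidth against a threshold.
import re

ALIGNED_THRESHOLD_GBPS = 470  # assumed cutoff; tune per card

def average_dram_bandwidth(log_text):
    # lines look like:
    # "DRAM-Bandwidth of Chunk no. 0 (0 MiByte to 128 MiByte):432.11 GByte/s"
    rates = [float(m) for m in re.findall(
        r"DRAM-Bandwidth of Chunk no\. \d+ \([^)]*\):([\d.]+) GByte/s",
        log_text)]
    return sum(rates) / len(rates) if rates else 0.0

log = """DRAM-Bandwidth of Chunk no. 0 (0 MiByte to 128 MiByte):432.11 GByte/s
DRAM-Bandwidth of Chunk no. 1 (128 MiByte to 256 MiByte):423.82 GByte/s"""
avg = average_dram_bandwidth(log)
print(round(avg, 2), "aligned" if avg >= ALIGNED_THRESHOLD_GBPS else "misaligned")
# -> 427.96 misaligned
```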


----------



## WillG027

Thanks for that ^^ DasBoogieman.

It's both interesting and useful.


----------



## Agent-A01

Quote:


> Originally Posted by *Dasboogieman*
> 
> You are not seeing things. What you are seeing is the effect of memory straps (i.e. latencies) on Bandwidth.
> 
> If my clockspeed is aligned with my strap (for me it's any factor of 10) the bandwidth difference is at least 50GB/s vs misaligned.
> 
> The catch here is the straps get retrained at the beginning of every single Windows 10 boot (the IMC trains itself according to parameters like bus heat, VRAM heat, IMC heat, electrical factors etc etc). To find the aligned strap, I have to constantly re-test any VRAM OC because the strap alignment is +-10 from the original value of last boot.
> 
> I just run the CUDA VRAM Bandwidth test (originally designed to test the 970 4gb array) and if it spits out 495GB/s then I know I'm aligned. Otherwise, it shows 440GB/s.


Can you go into more detail on this?

For example, if your clock speed is exactly 2000MHz and you have set the VRAM to exactly 8GHz (assuming 8000MHz is a possible strap), you will have perfect alignment with full VRAM speed.
If you set 2012MHz with 8GHz VRAM, your VRAM bandwidth drops down to 440GB/s.

Is that correct?


----------



## Dasboogieman

Quote:


> Originally Posted by *Agent-A01*
> 
> Can you go into more detail on this?
> 
> For example if your clock speed is exactly 2000mhz and you have set VRAM to exactly 8ghz(assuming 8000mhz is a possible strap) you will have perfect alignment with full VRAM speed.
> If you set 2012 mhz with 8ghz vram, your VRAM bandwidth drops down to 440gbs.
> 
> Is that correct?


Yes that is correct. Which side it swings to can vary between boots (I haven't done detailed testing but I suspect the main culprit for the repeated VRAM retraining is the operating temperature variances).

You can test it yourself with these two applications:

Use the NVIDIA Profile inspector to change the P2 CUDA VRAM speed throttle to force the VRAM to run at full speed in CUDA applications.
Then use the VRAM bandwidth test to parse the realtime bandwidth.

nvidiaProfileInspector.zip 129k .zip file


vRamBandWidthTest.zip 120k .zip file


----------



## Agent-A01

Quote:


> Originally Posted by *Dasboogieman*
> 
> Yes that is correct. Which side it swings to can vary between boots (I haven't done detailed testing but I suspect the main culprit for the repeated VRAM retraining is the operating temperature variances).
> 
> You can test it yourself with these two applications:
> 
> Use the NVIDIA Profile inspector to change the P2 CUDA VRAM speed throttle to force the VRAM to run at full speed in CUDA applications.
> Then use the VRAM bandwidth test to parse the realtime bandwidth.
> 
> nvidiaProfileInspector.zip 129k .zip file
> 
> 
> vRamBandWidthTest.zip 120k .zip file


So what VRAM clock are you running at?

I'm using +499 which equates to 460-470 GB/s.

I forced P2 off in cuda, VRAM shows max clock; core does not max out.


----------



## outofmyheadyo

Why would a 2070 beat a 1080 Ti? The 1080 was barely faster than the 980 Ti when it launched.
And we know how Nvidia likes to release the potato GPUs first and then, months later, release the good cards like the 1080 Ti, so they can of course milk everyone with Titans.


----------



## nrpeyton

Quote:


> Originally Posted by *Agent-A01*
> 
> So what VRAM clock are you running at?
> 
> I'm using +499 which equates to 460-470 GB/s.
> 
> I forced P2 off in cuda, VRAM shows max clock; core does not max out.


*You're getting 460-470 GB/s with a +499?*

With a +499 I'm *only* getting 387-407 GB/s,

despite the fact that with +575 _(my max stable VRAM overclock, tested for stability using OCCT)_ I get 393-411 GB/s. (The reason I mention this is that it IS still increasing. In fact it even keeps increasing _past_ the point of stability.)

I can't find the "P2 CUDA VRAM speed throttle" option when I download & open Nvidia Profile Inspector.


----------



## Dasboogieman

Quote:


> Originally Posted by *nrpeyton*
> 
> I can't find "P2 CUDA VRAM SPEED THROTTLE" option when I download & open 'nvidia Profile Inspector'.
> 
> With a +575. _(My max stable VRAM overclock - tested for stability using OCCT)_ I get approx. 411 GB/s.


Setting category 5, CUDA Force P2 state. Set this to OFF from the default of ON.
Quote:


> Originally Posted by *outofmyheadyo*
> 
> Why would a 2070 beat 1080ti,1080 was barely faster than 980ti when launched.
> And knowing how nvidia likes to release the potato GPU-s first and then months later release the good cards like 1080ti, so they can ofcourse milk everyone, with titans.


The 980 Ti was an anomaly. It was a rare combination of being able to overclock well above and beyond its original performance bracket and being BIOS-moddable to allow more voltage. The same cannot be done to the 1080 Ti, so I am confident the 2070 can match the 1080 Ti, or only lag very slightly (makes sense: you buy yesterday's flagship at a discount).

This "milking" you speak of is probably the only way new GPUs will be funded, lol. It is an engineering marvel that they can still deliver ~35% performance gains across generations on a yearly cadence. We as consumers have become a little entitled and have forgotten that these things are extremely costly and time-consuming to fabricate, in an era where shrinking the process node increases cost rather than decreasing it as in yesteryears. Considering Pascal cost $1 billion to develop, I think it is a perfectly reasonable approach for sustainable long-term growth.


----------



## Blameless

~470GiB/s out of a maximum theoretical of 484.

Real memory clock is 1485MHz (+433 in MSI AB, or 5940 QDR, or 11880MT/s), which is the highest whole-number memory multiplier (real memory clock divisible by the 27MHz reference clock) that's stable in all the tests I can find at the temp limit I have set (68C).

Core clock during the benchmark was 1569MHz due to the low load on the GPU itself.

http://cdn.overclock.net/8/8b/8bc1e5ce_1080tibandwidth.png
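The multiplier arithmetic in the post above is easy to script. A minimal sketch (the 8 transfers per real clock for GDDR5X and the 27MHz reference clock are taken from the post; "theoretical bandwidth" here is the raw bus figure, ignoring overhead):

```python
# Sketch: check whether a 1080 Ti real memory clock sits on a whole
# multiple of the 27 MHz reference clock, and compute the resulting
# data rate and raw theoretical bandwidth for the 352-bit bus.
REF_CLK_MHZ = 27
BUS_WIDTH_BITS = 352

def mem_clock_info(real_clock_mhz):
    mult = real_clock_mhz / REF_CLK_MHZ
    data_rate_mtps = real_clock_mhz * 8  # GDDR5X: 8 transfers per real clock
    bw_gbps = data_rate_mtps * 1e6 * BUS_WIDTH_BITS / 8 / 1e9
    return mult, data_rate_mtps, bw_gbps

mult, rate, bw = mem_clock_info(1485)
print(mult)          # -> 55.0 (whole multiplier, i.e. aligned)
print(rate)          # -> 11880 MT/s, matching the post
print(round(bw, 1))  # -> 522.7 GB/s raw theoretical
```

At the stock real clock the same formula lands near the bandwidth tool's reported 484.44 GB/s peak figure.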


----------



## Dasboogieman

Quote:


> Originally Posted by *Blameless*
> 
> ~470GiB/s out of a maximum theoretical of 484.
> 
> Real memory clock is 1485MHz (+433 in MSI AB, or 5940 QDR, or 11880MT/s), which is the highest whole number memory multiplier (real memory clock divisible by the 27MHz reference clock) that's stable in all tests I can find at the temp limit I have set (68C).
> 
> Core clock during the benchmark was 1569 due to the low load on the GPU itself.


That is good. If it were misaligned, you'd be seeing 440GB/s-ish, give or take 10.


----------



## nrpeyton

Quote:


> Originally Posted by *Dasboogieman*
> 
> Setting category 5, CUDA Force P2 state. Set this to OFF from the default of ON.


Nice one. That little option in Nvidia Profile Inspector just increased me to:

458-474 GB/s @ +499

&

464-481 GB/s @ +575

Thanks 

I still seem a little behind what you're getting though -- you're still beating my +575 even with your +499.

Increasing my core clock didn't help. _(In fact the result was actually slightly less.)_

What does that option actually do, anyway (apart from increase my VRAM bandwidth)?


----------



## Blameless

Quote:


> Originally Posted by *nrpeyton*
> 
> That little option in the Nvidia Profile Inspector just increased me to


Without it, anything that touches CUDA will kick the memory clock down to what it is at the next lower power state, which usually costs at least 10% of VRAM performance.

I first noticed this while gaming and recording with NVENC. Recording video kicked my memory clock from 5940 down to 5505 until I figured out what was happening and corrected it.


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent-A01*
> 
> So what VRAM clock are you running at?
> 
> I'm using +499 which equates to 460-470 GB/s.
> 
> I forced P2 off in cuda, VRAM shows max clock; core does not max out.
> 
> 
> 
> *You're getting 460 - 470 GB/s with a +499?*
> 
> With a +499 I'm *only* getting 387 GB/s - 407 GB/s
> 
> Despite fact that with a +575. _(My max stable VRAM overclock - tested for stability using OCCT)_ I get 393 - 411 GB/s (reason I mention this -- is it IS still increasing). In fact it even keeps increasing _past_ the point of stability.
> 
> I can't find "P2 CUDA VRAM SPEED THROTTLE" option when I download & open 'nvidia Profile Inspector'.
Click to expand...

Right click it, Run As Admin.


----------



## LunaP

Hey guys, troubleshooting some stuttering issues here. Even with G-Sync on I'm noticing shakiness when moving now and then, as if sync is going out of whack every few minutes, and it's driving me insane.

Tried multiple driver revisions ranging from 378 -> current.

2x 1080 Ti SLI using the EVGA HB bridge, ensured it's on, ran AIDA64 and got this multiple times.

This is the 3rd RMA from EVGA. Cards are the Black edition variants. No custom BIOS.

Any tests I can run to verify if it's the card? I don't wanna have to break my loop apart if I don't have to, but I know I'll have to after this. Looking for suggestions though.


----------



## 8051

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Why would a 2070 beat 1080ti,1080 was barely faster than 980ti when launched.
> And knowing how nvidia likes to release the potato GPU-s first and then months later release the good cards like 1080ti, so they can ofcourse milk everyone, with titans.


"A 1080 was barely faster than a 980Ti" But not a stock 980Ti running stock clocks.

1080Ti's don't overclock nearly as well as 980Ti's did either.


----------



## LunaP

N/m, figured it out: x16 for GPU 1 and x8 for GPU 2. Guess I gotta wait for my new board to get dual x16...


----------



## Blameless

Quote:


> Originally Posted by *LunaP*
> 
> N/m figured it out, 16x for gpu 1 and 8x for gpu2, guess I gotta wait for my new board to get dual x16....


Yes, PCI-E transfer speed is what the read/write scores test in that benchmark. 'Copy' is the actual VRAM performance.

Why isn't 16x/16x possible on your current board? Your CPU has the lanes and your board should have two 16x electrical slots.


----------



## LunaP

Quote:


> Originally Posted by *Blameless*
> 
> Yes, PCI-E transfer speed is what the read/write scores test in that benchmark. 'Copy' is the actual VRAM performance.
> 
> Why isn't 16x/16x possible on your current board? Your CPU has the lanes and your board should have two 16x electrical slots.


Only has one sadly; a busted pin on the outer lane area for PCIe slot 3, so it only runs at x8.

I'm going X299 soon once they're in stock. Also, apparently the stuttering was from having G-Sync off; now it's perfectly fluid with it on.


----------



## Dasboogieman

Quote:


> Originally Posted by *nrpeyton*
> 
> *You're getting 460 - 470 GB/s with a +499?*
> 
> With a +499 I'm *only* getting 387 GB/s - 407 GB/s
> 
> Despite fact that with a +575. _(My max stable VRAM overclock - tested for stability using OCCT)_ I get 393 - 411 GB/s (reason I mention this -- is it IS still increasing). In fact it even keeps increasing _past_ the point of stability.
> 
> I can't find "P2 CUDA VRAM SPEED THROTTLE" option when I download & open 'nvidia Profile Inspector'.


That means your 499 is out of alignment. Try 500, 505 or 510.

The actual bandwidth has a lot to do with the precision your VRAM traces were manufactured at (I expect a healthy amount of variance, as tiny variations can yield significant bandwidth differences) and the BIOS version. The F3P BIOS on my card has very optimized timings, so my VRAM is pretty much running at 100% of potential.


----------



## GreedyMuffin

Where can I download the program to test that sort of thing?


----------



## 8051

Quote:


> Originally Posted by *LunaP*
> 
> N/m figured it out, 16x for gpu 1 and 8x for gpu2, guess I gotta wait for my new board to get dual x16....


I had thought w/the high bandwidth bridge the number of lanes used in SLI wasn't supposed to be an issue anymore?


----------



## outofmyheadyo

Has anyone tried running a 1080 Ti with just a universal block for a long time? I might want to buy a 1080 Ti again, but the one I'm looking at has no full-cover block available.
And I'm not really interested in buying full-cover blocks one after another whenever I switch cards.


----------



## Nervoize

Quote:


> Originally Posted by *nrpeyton*
> 
> *You're getting 460 - 470 GB/s with a +499?*
> 
> With a +499 I'm *only* getting 387 GB/s - 407 GB/s
> 
> Despite fact that with a +575. _(My max stable VRAM overclock - tested for stability using OCCT)_ I get 393 - 411 GB/s (reason I mention this -- is it IS still increasing). In fact it even keeps increasing _past_ the point of stability.
> 
> I can't find "P2 CUDA VRAM SPEED THROTTLE" option when I download & open 'nvidia Profile Inspector'.


Weird, my +450 comes in at a fair 524.1GB/s.



Gotta admit, +450 is the highest I can do before insta-crashing. My memory overclocks very, very badly.







My core easily hits 2164MHz without throttling.


----------



## GreedyMuffin

Quote:


> Originally Posted by *Nervoize*
> 
> Weird, my +450 strikes down with a fair 524.1GB/s
> 
> 
> 
> Gotta admit, +450 is the highest i can do before insta crashing. My memory overclocks very very bad
> 
> 
> 
> 
> 
> 
> 
> My core easily hits 2164MHz without throttle


Where can I download this tool?

Sincerely


----------



## ZealotKi11er

Quote:


> Originally Posted by *Nervoize*
> 
> Weird, my +450 strikes down with a fair 524.1GB/s
> 
> 
> 
> Gotta admit, +450 is the highest i can do before insta crashing. My memory overclocks very very bad
> 
> 
> 
> 
> 
> 
> 
> My core easily hits 2164MHz without throttle


What card do you have that does 2164MHz so easily?


----------



## LunaP

Quote:


> Originally Posted by *8051*
> 
> I had thought w/the high bandwidth bridge the number of lanes used in SLI wasn't supposed to be an issue anymore?


The high-bandwidth bridge does what it's supposed to: it supports higher bandwidth when available. If one slot is at x16 and the other at x8, it's only going to run as fast as the slower link can transfer. NVLink could fix this, I just dunno if they're gonna. I get where you're coming from, but yeah: as a test I dropped my x16 down to 2.0 x16 and it ran at the same speed as the x8.


----------



## 8051

Quote:


> Originally Posted by *Nervoize*
> 
> Weird, my +450 strikes down with a fair 524.1GB/s
> 
> 
> 
> Gotta admit, +450 is the highest i can do before insta crashing. My memory overclocks very very bad
> 
> 
> 
> 
> 
> 
> 
> My core easily hits 2164MHz without throttle


A 1080Ti that can easily hit and maintain a reliable 2164MHz? I'd like to see that.


----------



## Nervoize

Quote:


> Originally Posted by *8051*
> 
> A 1080Ti that can easily hit and maintain a reliable 2164 MHz.? I'd like to see that.


Sure, no problem








https://benchmark.unigine.com/results/rid_5e9ebddbdc5a47348d7de0559d1287bf


Again, I saw a 1.27v BIOS for the 1080Ti. It's not recommended for 24/7 use, but I mostly game about 6-10 hrs a day, and of course it's not always running at 1.27v, but about 75% of the time. Still safe for daily use? Since I wanna break that 2200MHz wall.







I got a 1080Ti FTW3 with an EKWB nickel/plexi block on it, running the XOC BIOS as slave. I am currently pulling between 350 and 520W out of my graphics card, and I don't know if my 2x 8-pin can handle much more.


----------



## KedarWolf

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent-A01*
> 
> Can you go into more detail on this?
> 
> For example if your clock speed is exactly 2000mhz and you have set VRAM to exactly 8ghz(assuming 8000mhz is a possible strap) you will have perfect alignment with full VRAM speed.
> If you set 2012 mhz with 8ghz vram, your VRAM bandwidth drops down to 440gbs.
> 
> Is that correct?
> 
> 
> 
> Yes that is correct. Which side it swings to can vary between boots (I haven't done detailed testing but I suspect the main culprit for the repeated VRAM retraining is the operating temperature variances).
> 
> You can test it yourself with these two applications:
> 
> Use the NVIDIA Profile inspector to change the P2 CUDA VRAM speed throttle to force the VRAM to run at full speed in CUDA applications.
> Then use the VRAM bandwidth test to parse the realtime bandwidth.
> 
> nvidiaProfileInspector.zip 129k .zip file
> 
> 
> vRamBandWidthTest.zip 120k .zip file
Click to expand...

The bandwidth test is a good way to find out at which clock speeds you're losing bandwidth (like you said, when it's 'misaligned').









This is me locked at 2062 core, 6156 memory.

DRAM-Bandwidth of Chunk no. 11 (1408 MiByte to 1536 MiByte):484.08 GByte/s
DRAM-Bandwidth of Chunk no. 12 (1536 MiByte to 1664 MiByte):474.38 GByte/s
DRAM-Bandwidth of Chunk no. 13 (1664 MiByte to 1792 MiByte):486.47 GByte/s
DRAM-Bandwidth of Chunk no. 14 (1792 MiByte to 1920 MiByte):483.19 GByte/s
DRAM-Bandwidth of Chunk no. 15 (1920 MiByte to 2048 MiByte):481.29 GByte/s
DRAM-Bandwidth of Chunk no. 16 (2048 MiByte to 2176 MiByte):477.90 GByte/s
DRAM-Bandwidth of Chunk no. 17 (2176 MiByte to 2304 MiByte):484.20 GByte/s
DRAM-Bandwidth of Chunk no. 18 (2304 MiByte to 2432 MiByte):481.82 GByte/s
DRAM-Bandwidth of Chunk no. 19 (2432 MiByte to 2560 MiByte):486.77 GByte/s
DRAM-Bandwidth of Chunk no. 20 (2560 MiByte to 2688 MiByte):481.88 GByte/s

etc.

If I drop a bin or two it goes down to 450 or so; drop a few more bins and it's back up close to these results.


----------



## Nervoize

Quote:


> Originally Posted by *8051*
> 
> A 1080Ti that can easily hit and maintain a reliable 2164 MHz.? I'd like to see that.


Quote:


> Originally Posted by *KedarWolf*
> 
> The bandwidth test is a good way to find out what clock speeds you're losing bandwidth (like you said, it's 'misaligned')
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This is me locked at 2062 core, 6156 memory.
> 
> DRAM-Bandwidth of Chunk no. 11 (1408 MiByte to 1536 MiByte):484.08 GByte/s
> DRAM-Bandwidth of Chunk no. 12 (1536 MiByte to 1664 MiByte):474.38 GByte/s
> DRAM-Bandwidth of Chunk no. 13 (1664 MiByte to 1792 MiByte):486.47 GByte/s
> DRAM-Bandwidth of Chunk no. 14 (1792 MiByte to 1920 MiByte):483.19 GByte/s
> DRAM-Bandwidth of Chunk no. 15 (1920 MiByte to 2048 MiByte):481.29 GByte/s
> DRAM-Bandwidth of Chunk no. 16 (2048 MiByte to 2176 MiByte):477.90 GByte/s
> DRAM-Bandwidth of Chunk no. 17 (2176 MiByte to 2304 MiByte):484.20 GByte/s
> DRAM-Bandwidth of Chunk no. 18 (2304 MiByte to 2432 MiByte):481.82 GByte/s
> DRAM-Bandwidth of Chunk no. 19 (2432 MiByte to 2560 MiByte):486.77 GByte/s
> DRAM-Bandwidth of Chunk no. 20 (2560 MiByte to 2688 MiByte):481.88 GByte/s
> 
> etc.
> 
> If I drop a bin or two it goes down to 450 or so, drop a few more bins, back up close to these results.


Quite funny to see that a +50 clock increase reduces the speed by 30GB/s.







Best spot for me is +350 for 420GB/s max, lol. This program reports completely different GB/s than GPU-Z provides.


----------



## Maximization

Quote:


> Originally Posted by *Nervoize*
> 
> Quite funny to see that a +50 clock increase reduces the speed by 30GB/s
> 
> 
> 
> 
> 
> 
> 
> Best spot for me is +350 for 420GB/s max lol. This program is completely different than the GB/s GPU-Z provides.


You will get diminishing returns in an overclock... it's the physical limit of the system.


----------



## navjack27

Misaligned? Whatever it is, I'm just confused by this test and my results.

At 5508MHz memory I get this:

Code:

Nai's Benchmark, edited by VultureX
  Device: GeForce GTX 1080 Ti (11.00 GB)
  Memory Bus Width (bits): 352
  Peak Theoretical DRAM Bandwidth (GB/s): 484.440000

Allocating Memory . . .
Chunk Size: 128 MiByte
Allocated 24 Chunks
Allocated 3072 MiByte
Benchmarking DRAM
DRAM-Bandwidth of Chunk no. 0 (0 MiByte to 128 MiByte):432.11 GByte/s
DRAM-Bandwidth of Chunk no. 1 (128 MiByte to 256 MiByte):423.82 GByte/s
DRAM-Bandwidth of Chunk no. 2 (256 MiByte to 384 MiByte):423.82 GByte/s
DRAM-Bandwidth of Chunk no. 3 (384 MiByte to 512 MiByte):424.04 GByte/s
DRAM-Bandwidth of Chunk no. 4 (512 MiByte to 640 MiByte):426.81 GByte/s
DRAM-Bandwidth of Chunk no. 5 (640 MiByte to 768 MiByte):426.94 GByte/s
DRAM-Bandwidth of Chunk no. 6 (768 MiByte to 896 MiByte):422.54 GByte/s
DRAM-Bandwidth of Chunk no. 7 (896 MiByte to 1024 MiByte):423.30 GByte/s
DRAM-Bandwidth of Chunk no. 8 (1024 MiByte to 1152 MiByte):424.95 GByte/s
DRAM-Bandwidth of Chunk no. 9 (1152 MiByte to 1280 MiByte):426.85 GByte/s
DRAM-Bandwidth of Chunk no. 10 (1280 MiByte to 1408 MiByte):431.68 GByte/s
DRAM-Bandwidth of Chunk no. 11 (1408 MiByte to 1536 MiByte):427.03 GByte/s
DRAM-Bandwidth of Chunk no. 12 (1536 MiByte to 1664 MiByte):426.89 GByte/s
DRAM-Bandwidth of Chunk no. 13 (1664 MiByte to 1792 MiByte):430.73 GByte/s
DRAM-Bandwidth of Chunk no. 14 (1792 MiByte to 1920 MiByte):424.78 GByte/s
DRAM-Bandwidth of Chunk no. 15 (1920 MiByte to 2048 MiByte):424.50 GByte/s
DRAM-Bandwidth of Chunk no. 16 (2048 MiByte to 2176 MiByte):431.53 GByte/s
DRAM-Bandwidth of Chunk no. 17 (2176 MiByte to 2304 MiByte):431.82 GByte/s
DRAM-Bandwidth of Chunk no. 18 (2304 MiByte to 2432 MiByte):427.27 GByte/s
DRAM-Bandwidth of Chunk no. 19 (2432 MiByte to 2560 MiByte):423.31 GByte/s
DRAM-Bandwidth of Chunk no. 20 (2560 MiByte to 2688 MiByte):425.01 GByte/s
DRAM-Bandwidth of Chunk no. 21 (2688 MiByte to 2816 MiByte):424.69 GByte/s
DRAM-Bandwidth of Chunk no. 22 (2816 MiByte to 2944 MiByte):427.26 GByte/s
DRAM-Bandwidth of Chunk no. 23 (2944 MiByte to 3072 MiByte):429.84 GByte/s
Benchmarking L2-Cache
L2-Cache-Bandwidth of Chunk no. 0 (0 MiByte to 128 MiByte):1174.69 GByte/s
L2-Cache-Bandwidth of Chunk no. 1 (128 MiByte to 256 MiByte):1234.35 GByte/s
L2-Cache-Bandwidth of Chunk no. 2 (256 MiByte to 384 MiByte):1227.34 GByte/s
L2-Cache-Bandwidth of Chunk no. 3 (384 MiByte to 512 MiByte):1228.70 GByte/s
L2-Cache-Bandwidth of Chunk no. 4 (512 MiByte to 640 MiByte):1227.65 GByte/s
L2-Cache-Bandwidth of Chunk no. 5 (640 MiByte to 768 MiByte):1228.42 GByte/s
L2-Cache-Bandwidth of Chunk no. 6 (768 MiByte to 896 MiByte):1227.94 GByte/s
L2-Cache-Bandwidth of Chunk no. 7 (896 MiByte to 1024 MiByte):1229.26 GByte/s
L2-Cache-Bandwidth of Chunk no. 8 (1024 MiByte to 1152 MiByte):1228.34 GByte/s
L2-Cache-Bandwidth of Chunk no. 9 (1152 MiByte to 1280 MiByte):1229.41 GByte/s
L2-Cache-Bandwidth of Chunk no. 10 (1280 MiByte to 1408 MiByte):1258.05 GByte/s
L2-Cache-Bandwidth of Chunk no. 11 (1408 MiByte to 1536 MiByte):1287.97 GByte/s
L2-Cache-Bandwidth of Chunk no. 12 (1536 MiByte to 1664 MiByte):1286.95 GByte/s
L2-Cache-Bandwidth of Chunk no. 13 (1664 MiByte to 1792 MiByte):1287.71 GByte/s
L2-Cache-Bandwidth of Chunk no. 14 (1792 MiByte to 1920 MiByte):1285.86 GByte/s
L2-Cache-Bandwidth of Chunk no. 15 (1920 MiByte to 2048 MiByte):1288.98 GByte/s
L2-Cache-Bandwidth of Chunk no. 16 (2048 MiByte to 2176 MiByte):1288.89 GByte/s
L2-Cache-Bandwidth of Chunk no. 17 (2176 MiByte to 2304 MiByte):1289.06 GByte/s
L2-Cache-Bandwidth of Chunk no. 18 (2304 MiByte to 2432 MiByte):1289.06 GByte/s
L2-Cache-Bandwidth of Chunk no. 19 (2432 MiByte to 2560 MiByte):1287.71 GByte/s
L2-Cache-Bandwidth of Chunk no. 20 (2560 MiByte to 2688 MiByte):1286.87 GByte/s
L2-Cache-Bandwidth of Chunk no. 21 (2688 MiByte to 2816 MiByte):1287.54 GByte/s
L2-Cache-Bandwidth of Chunk no. 22 (2816 MiByte to 2944 MiByte):1286.70 GByte/s
L2-Cache-Bandwidth of Chunk no. 23 (2944 MiByte to 3072 MiByte):1283.51 GByte/s
Press any key to continue . . .

at 6003mhz i get this

Code:

Nai's Benchmark, edited by VultureX
  Device: GeForce GTX 1080 Ti (11.00 GB)
  Memory Bus Width (bits): 352
  Peak Theoretical DRAM Bandwidth (GB/s): 484.440000

Allocating Memory . . .
Chunk Size: 128 MiByte
Allocated 24 Chunks
Allocated 3072 MiByte
Benchmarking DRAM
DRAM-Bandwidth of Chunk no. 0 (0 MiByte to 128 MiByte):470.19 GByte/s
DRAM-Bandwidth of Chunk no. 1 (128 MiByte to 256 MiByte):463.10 GByte/s
DRAM-Bandwidth of Chunk no. 2 (256 MiByte to 384 MiByte):468.95 GByte/s
DRAM-Bandwidth of Chunk no. 3 (384 MiByte to 512 MiByte):465.84 GByte/s
DRAM-Bandwidth of Chunk no. 4 (512 MiByte to 640 MiByte):462.99 GByte/s
DRAM-Bandwidth of Chunk no. 5 (640 MiByte to 768 MiByte):469.18 GByte/s
DRAM-Bandwidth of Chunk no. 6 (768 MiByte to 896 MiByte):466.00 GByte/s
DRAM-Bandwidth of Chunk no. 7 (896 MiByte to 1024 MiByte):463.10 GByte/s
DRAM-Bandwidth of Chunk no. 8 (1024 MiByte to 1152 MiByte):469.51 GByte/s
DRAM-Bandwidth of Chunk no. 9 (1152 MiByte to 1280 MiByte):463.84 GByte/s
DRAM-Bandwidth of Chunk no. 10 (1280 MiByte to 1408 MiByte):471.18 GByte/s
DRAM-Bandwidth of Chunk no. 11 (1408 MiByte to 1536 MiByte):471.52 GByte/s
DRAM-Bandwidth of Chunk no. 12 (1536 MiByte to 1664 MiByte):463.79 GByte/s
DRAM-Bandwidth of Chunk no. 13 (1664 MiByte to 1792 MiByte):463.41 GByte/s
DRAM-Bandwidth of Chunk no. 14 (1792 MiByte to 1920 MiByte):464.08 GByte/s
DRAM-Bandwidth of Chunk no. 15 (1920 MiByte to 2048 MiByte):466.73 GByte/s
DRAM-Bandwidth of Chunk no. 16 (2048 MiByte to 2176 MiByte):464.25 GByte/s
DRAM-Bandwidth of Chunk no. 17 (2176 MiByte to 2304 MiByte):464.79 GByte/s
DRAM-Bandwidth of Chunk no. 18 (2304 MiByte to 2432 MiByte):465.23 GByte/s
DRAM-Bandwidth of Chunk no. 19 (2432 MiByte to 2560 MiByte):464.18 GByte/s
DRAM-Bandwidth of Chunk no. 20 (2560 MiByte to 2688 MiByte):464.73 GByte/s
DRAM-Bandwidth of Chunk no. 21 (2688 MiByte to 2816 MiByte):464.95 GByte/s
DRAM-Bandwidth of Chunk no. 22 (2816 MiByte to 2944 MiByte):465.68 GByte/s
DRAM-Bandwidth of Chunk no. 23 (2944 MiByte to 3072 MiByte):464.68 GByte/s
Benchmarking L2-Cache
L2-Cache-Bandwidth of Chunk no. 0 (0 MiByte to 128 MiByte):1254.36 GByte/s
L2-Cache-Bandwidth of Chunk no. 1 (128 MiByte to 256 MiByte):1254.12 GByte/s
L2-Cache-Bandwidth of Chunk no. 2 (256 MiByte to 384 MiByte):1254.12 GByte/s
L2-Cache-Bandwidth of Chunk no. 3 (384 MiByte to 512 MiByte):1255.00 GByte/s
L2-Cache-Bandwidth of Chunk no. 4 (512 MiByte to 640 MiByte):1253.00 GByte/s
L2-Cache-Bandwidth of Chunk no. 5 (640 MiByte to 768 MiByte):1270.32 GByte/s
L2-Cache-Bandwidth of Chunk no. 6 (768 MiByte to 896 MiByte):1288.20 GByte/s
L2-Cache-Bandwidth of Chunk no. 7 (896 MiByte to 1024 MiByte):1284.77 GByte/s
L2-Cache-Bandwidth of Chunk no. 8 (1024 MiByte to 1152 MiByte):1287.63 GByte/s
L2-Cache-Bandwidth of Chunk no. 9 (1152 MiByte to 1280 MiByte):1283.09 GByte/s
L2-Cache-Bandwidth of Chunk no. 10 (1280 MiByte to 1408 MiByte):1301.52 GByte/s
L2-Cache-Bandwidth of Chunk no. 11 (1408 MiByte to 1536 MiByte):1294.14 GByte/s
L2-Cache-Bandwidth of Chunk no. 12 (1536 MiByte to 1664 MiByte):1293.73 GByte/s
L2-Cache-Bandwidth of Chunk no. 13 (1664 MiByte to 1792 MiByte):1286.95 GByte/s
L2-Cache-Bandwidth of Chunk no. 14 (1792 MiByte to 1920 MiByte):1285.10 GByte/s
L2-Cache-Bandwidth of Chunk no. 15 (1920 MiByte to 2048 MiByte):1288.04 GByte/s
L2-Cache-Bandwidth of Chunk no. 16 (2048 MiByte to 2176 MiByte):1294.91 GByte/s
L2-Cache-Bandwidth of Chunk no. 17 (2176 MiByte to 2304 MiByte):1286.79 GByte/s
L2-Cache-Bandwidth of Chunk no. 18 (2304 MiByte to 2432 MiByte):1293.38 GByte/s
L2-Cache-Bandwidth of Chunk no. 19 (2432 MiByte to 2560 MiByte):1286.45 GByte/s
L2-Cache-Bandwidth of Chunk no. 20 (2560 MiByte to 2688 MiByte):1285.52 GByte/s
L2-Cache-Bandwidth of Chunk no. 21 (2688 MiByte to 2816 MiByte):1286.70 GByte/s
L2-Cache-Bandwidth of Chunk no. 22 (2816 MiByte to 2944 MiByte):1285.84 GByte/s
L2-Cache-Bandwidth of Chunk no. 23 (2944 MiByte to 3072 MiByte):1284.77 GByte/s
Press any key to continue . . .

before i obsess and stay up all night, someone show me that this correlates to a performance gain in benches and/or games
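Before anyone stays up all night: here's a quick sanity check (my own arithmetic, not from the thread) averaging the per-chunk DRAM numbers from the two runs above. The measured bandwidth scales almost one-for-one with the memory clock; whether that ~9% shows up in games depends entirely on how bandwidth-bound the workload is.

```python
# Per-chunk DRAM bandwidth (GB/s) copied from the two benchmark runs above.
run_5508 = [432.11, 423.82, 423.82, 424.04, 426.81, 426.94, 422.54, 423.30,
            424.95, 426.85, 431.68, 427.03, 426.89, 430.73, 424.78, 424.50,
            431.53, 431.82, 427.27, 423.31, 425.01, 424.69, 427.26, 429.84]
run_6003 = [470.19, 463.10, 468.95, 465.84, 462.99, 469.18, 466.00, 463.10,
            469.51, 463.84, 471.18, 471.52, 463.79, 463.41, 464.08, 466.73,
            464.25, 464.79, 465.23, 464.18, 464.73, 464.95, 465.68, 464.68]

avg_a = sum(run_5508) / len(run_5508)
avg_b = sum(run_6003) / len(run_6003)

print(f"5508 MHz average: {avg_a:.1f} GB/s")
print(f"6003 MHz average: {avg_b:.1f} GB/s")
print(f"measured bandwidth gain: {avg_b / avg_a - 1:.1%}")
print(f"memory clock gain:       {6003 / 5508 - 1:.1%}")
```

Both gains come out around 9%, so the second run looks healthy (no misalignment penalty eating the overclock).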


----------



## 8051

Quote:


> Originally Posted by *Nervoize*
> 
> Sure, no problem
> 
> 
> 
> 
> 
> 
> 
> 
> https://benchmark.unigine.com/results/rid_5e9ebddbdc5a47348d7de0559d1287bf
> 
> Again, i saw a 1.27v bios for the 1080Ti. It's not recommended for 24/7 use, but i mostly game about 6-10 hrs a day and ofc it's not always running at 1.27v, but about 75% of the time. Still safe for daily use? Since i wanna break that 2200MHz wall
> 
> 
> 
> 
> 
> 
> 
> I got a 1080Ti FTW3 with an EKWB nickel/plexi block on it running the XOC bios as slave. I am currently pulling between 350 and 520Watts out of my graphics card, and i don't know if my 2x8pin can handle much more.


That's an excellent overclock. Is that as far as it'll go? What's sad is that it's only a 10% overclock over the stock maximum boost of a 1080 Ti FE.
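For context, the 10% figure is just the ratio against what GPU Boost already delivers out of the box. A trivial sketch (the 2000 MHz stock-boost number is an assumption based on typical FE behavior, not a spec):

```python
stock_boost_mhz = 2000  # typical out-of-the-box GPU Boost 3.0 clock on a 1080 Ti FE (assumption)
oc_target_mhz = 2200    # the "2200 MHz wall" mentioned in the quoted post

headroom = oc_target_mhz / stock_boost_mhz - 1
print(f"manual OC headroom over stock boost: {headroom:.0%}")
```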


----------



## Dasboogieman

Quote:


> Originally Posted by *navjack27*
> 
> misaligned? whatever it is i'm just confused by this test and my results.
> 
> at 5508mhz memory i get this
> 
> Code:
> 
> Nai's Benchmark, edited by VultureX
> Device: GeForce GTX 1080 Ti (11.00 GB)
> Memory Bus Width (bits): 352
> Peak Theoretical DRAM Bandwidth (GB/s): 484.440000
> 
> Allocating Memory . . .
> Chunk Size: 128 MiByte
> Allocated 24 Chunks
> Allocated 3072 MiByte
> Benchmarking DRAM
> DRAM-Bandwidth of Chunk no. 0 (0 MiByte to 128 MiByte):432.11 GByte/s
> DRAM-Bandwidth of Chunk no. 1 (128 MiByte to 256 MiByte):423.82 GByte/s
> DRAM-Bandwidth of Chunk no. 2 (256 MiByte to 384 MiByte):423.82 GByte/s
> DRAM-Bandwidth of Chunk no. 3 (384 MiByte to 512 MiByte):424.04 GByte/s
> DRAM-Bandwidth of Chunk no. 4 (512 MiByte to 640 MiByte):426.81 GByte/s
> DRAM-Bandwidth of Chunk no. 5 (640 MiByte to 768 MiByte):426.94 GByte/s
> DRAM-Bandwidth of Chunk no. 6 (768 MiByte to 896 MiByte):422.54 GByte/s
> DRAM-Bandwidth of Chunk no. 7 (896 MiByte to 1024 MiByte):423.30 GByte/s
> DRAM-Bandwidth of Chunk no. 8 (1024 MiByte to 1152 MiByte):424.95 GByte/s
> DRAM-Bandwidth of Chunk no. 9 (1152 MiByte to 1280 MiByte):426.85 GByte/s
> DRAM-Bandwidth of Chunk no. 10 (1280 MiByte to 1408 MiByte):431.68 GByte/s
> DRAM-Bandwidth of Chunk no. 11 (1408 MiByte to 1536 MiByte):427.03 GByte/s
> DRAM-Bandwidth of Chunk no. 12 (1536 MiByte to 1664 MiByte):426.89 GByte/s
> DRAM-Bandwidth of Chunk no. 13 (1664 MiByte to 1792 MiByte):430.73 GByte/s
> DRAM-Bandwidth of Chunk no. 14 (1792 MiByte to 1920 MiByte):424.78 GByte/s
> DRAM-Bandwidth of Chunk no. 15 (1920 MiByte to 2048 MiByte):424.50 GByte/s
> DRAM-Bandwidth of Chunk no. 16 (2048 MiByte to 2176 MiByte):431.53 GByte/s
> DRAM-Bandwidth of Chunk no. 17 (2176 MiByte to 2304 MiByte):431.82 GByte/s
> DRAM-Bandwidth of Chunk no. 18 (2304 MiByte to 2432 MiByte):427.27 GByte/s
> DRAM-Bandwidth of Chunk no. 19 (2432 MiByte to 2560 MiByte):423.31 GByte/s
> DRAM-Bandwidth of Chunk no. 20 (2560 MiByte to 2688 MiByte):425.01 GByte/s
> DRAM-Bandwidth of Chunk no. 21 (2688 MiByte to 2816 MiByte):424.69 GByte/s
> DRAM-Bandwidth of Chunk no. 22 (2816 MiByte to 2944 MiByte):427.26 GByte/s
> DRAM-Bandwidth of Chunk no. 23 (2944 MiByte to 3072 MiByte):429.84 GByte/s
> Benchmarking L2-Cache
> L2-Cache-Bandwidth of Chunk no. 0 (0 MiByte to 128 MiByte):1174.69 GByte/s
> L2-Cache-Bandwidth of Chunk no. 1 (128 MiByte to 256 MiByte):1234.35 GByte/s
> L2-Cache-Bandwidth of Chunk no. 2 (256 MiByte to 384 MiByte):1227.34 GByte/s
> L2-Cache-Bandwidth of Chunk no. 3 (384 MiByte to 512 MiByte):1228.70 GByte/s
> L2-Cache-Bandwidth of Chunk no. 4 (512 MiByte to 640 MiByte):1227.65 GByte/s
> L2-Cache-Bandwidth of Chunk no. 5 (640 MiByte to 768 MiByte):1228.42 GByte/s
> L2-Cache-Bandwidth of Chunk no. 6 (768 MiByte to 896 MiByte):1227.94 GByte/s
> L2-Cache-Bandwidth of Chunk no. 7 (896 MiByte to 1024 MiByte):1229.26 GByte/s
> L2-Cache-Bandwidth of Chunk no. 8 (1024 MiByte to 1152 MiByte):1228.34 GByte/s
> L2-Cache-Bandwidth of Chunk no. 9 (1152 MiByte to 1280 MiByte):1229.41 GByte/s
> L2-Cache-Bandwidth of Chunk no. 10 (1280 MiByte to 1408 MiByte):1258.05 GByte/s
> L2-Cache-Bandwidth of Chunk no. 11 (1408 MiByte to 1536 MiByte):1287.97 GByte/s
> L2-Cache-Bandwidth of Chunk no. 12 (1536 MiByte to 1664 MiByte):1286.95 GByte/s
> L2-Cache-Bandwidth of Chunk no. 13 (1664 MiByte to 1792 MiByte):1287.71 GByte/s
> L2-Cache-Bandwidth of Chunk no. 14 (1792 MiByte to 1920 MiByte):1285.86 GByte/s
> L2-Cache-Bandwidth of Chunk no. 15 (1920 MiByte to 2048 MiByte):1288.98 GByte/s
> L2-Cache-Bandwidth of Chunk no. 16 (2048 MiByte to 2176 MiByte):1288.89 GByte/s
> L2-Cache-Bandwidth of Chunk no. 17 (2176 MiByte to 2304 MiByte):1289.06 GByte/s
> L2-Cache-Bandwidth of Chunk no. 18 (2304 MiByte to 2432 MiByte):1289.06 GByte/s
> L2-Cache-Bandwidth of Chunk no. 19 (2432 MiByte to 2560 MiByte):1287.71 GByte/s
> L2-Cache-Bandwidth of Chunk no. 20 (2560 MiByte to 2688 MiByte):1286.87 GByte/s
> L2-Cache-Bandwidth of Chunk no. 21 (2688 MiByte to 2816 MiByte):1287.54 GByte/s
> L2-Cache-Bandwidth of Chunk no. 22 (2816 MiByte to 2944 MiByte):1286.70 GByte/s
> L2-Cache-Bandwidth of Chunk no. 23 (2944 MiByte to 3072 MiByte):1283.51 GByte/s
> Press any key to continue . . .
> 
> at 6003mhz i get this
> 
> Code:
> 
> Nai's Benchmark, edited by VultureX
> Device: GeForce GTX 1080 Ti (11.00 GB)
> Memory Bus Width (bits): 352
> Peak Theoretical DRAM Bandwidth (GB/s): 484.440000
> 
> Allocating Memory . . .
> Chunk Size: 128 MiByte
> Allocated 24 Chunks
> Allocated 3072 MiByte
> Benchmarking DRAM
> DRAM-Bandwidth of Chunk no. 0 (0 MiByte to 128 MiByte):470.19 GByte/s
> DRAM-Bandwidth of Chunk no. 1 (128 MiByte to 256 MiByte):463.10 GByte/s
> DRAM-Bandwidth of Chunk no. 2 (256 MiByte to 384 MiByte):468.95 GByte/s
> DRAM-Bandwidth of Chunk no. 3 (384 MiByte to 512 MiByte):465.84 GByte/s
> DRAM-Bandwidth of Chunk no. 4 (512 MiByte to 640 MiByte):462.99 GByte/s
> DRAM-Bandwidth of Chunk no. 5 (640 MiByte to 768 MiByte):469.18 GByte/s
> DRAM-Bandwidth of Chunk no. 6 (768 MiByte to 896 MiByte):466.00 GByte/s
> DRAM-Bandwidth of Chunk no. 7 (896 MiByte to 1024 MiByte):463.10 GByte/s
> DRAM-Bandwidth of Chunk no. 8 (1024 MiByte to 1152 MiByte):469.51 GByte/s
> DRAM-Bandwidth of Chunk no. 9 (1152 MiByte to 1280 MiByte):463.84 GByte/s
> DRAM-Bandwidth of Chunk no. 10 (1280 MiByte to 1408 MiByte):471.18 GByte/s
> DRAM-Bandwidth of Chunk no. 11 (1408 MiByte to 1536 MiByte):471.52 GByte/s
> DRAM-Bandwidth of Chunk no. 12 (1536 MiByte to 1664 MiByte):463.79 GByte/s
> DRAM-Bandwidth of Chunk no. 13 (1664 MiByte to 1792 MiByte):463.41 GByte/s
> DRAM-Bandwidth of Chunk no. 14 (1792 MiByte to 1920 MiByte):464.08 GByte/s
> DRAM-Bandwidth of Chunk no. 15 (1920 MiByte to 2048 MiByte):466.73 GByte/s
> DRAM-Bandwidth of Chunk no. 16 (2048 MiByte to 2176 MiByte):464.25 GByte/s
> DRAM-Bandwidth of Chunk no. 17 (2176 MiByte to 2304 MiByte):464.79 GByte/s
> DRAM-Bandwidth of Chunk no. 18 (2304 MiByte to 2432 MiByte):465.23 GByte/s
> DRAM-Bandwidth of Chunk no. 19 (2432 MiByte to 2560 MiByte):464.18 GByte/s
> DRAM-Bandwidth of Chunk no. 20 (2560 MiByte to 2688 MiByte):464.73 GByte/s
> DRAM-Bandwidth of Chunk no. 21 (2688 MiByte to 2816 MiByte):464.95 GByte/s
> DRAM-Bandwidth of Chunk no. 22 (2816 MiByte to 2944 MiByte):465.68 GByte/s
> DRAM-Bandwidth of Chunk no. 23 (2944 MiByte to 3072 MiByte):464.68 GByte/s
> Benchmarking L2-Cache
> L2-Cache-Bandwidth of Chunk no. 0 (0 MiByte to 128 MiByte):1254.36 GByte/s
> L2-Cache-Bandwidth of Chunk no. 1 (128 MiByte to 256 MiByte):1254.12 GByte/s
> L2-Cache-Bandwidth of Chunk no. 2 (256 MiByte to 384 MiByte):1254.12 GByte/s
> L2-Cache-Bandwidth of Chunk no. 3 (384 MiByte to 512 MiByte):1255.00 GByte/s
> L2-Cache-Bandwidth of Chunk no. 4 (512 MiByte to 640 MiByte):1253.00 GByte/s
> L2-Cache-Bandwidth of Chunk no. 5 (640 MiByte to 768 MiByte):1270.32 GByte/s
> L2-Cache-Bandwidth of Chunk no. 6 (768 MiByte to 896 MiByte):1288.20 GByte/s
> L2-Cache-Bandwidth of Chunk no. 7 (896 MiByte to 1024 MiByte):1284.77 GByte/s
> L2-Cache-Bandwidth of Chunk no. 8 (1024 MiByte to 1152 MiByte):1287.63 GByte/s
> L2-Cache-Bandwidth of Chunk no. 9 (1152 MiByte to 1280 MiByte):1283.09 GByte/s
> L2-Cache-Bandwidth of Chunk no. 10 (1280 MiByte to 1408 MiByte):1301.52 GByte/s
> L2-Cache-Bandwidth of Chunk no. 11 (1408 MiByte to 1536 MiByte):1294.14 GByte/s
> L2-Cache-Bandwidth of Chunk no. 12 (1536 MiByte to 1664 MiByte):1293.73 GByte/s
> L2-Cache-Bandwidth of Chunk no. 13 (1664 MiByte to 1792 MiByte):1286.95 GByte/s
> L2-Cache-Bandwidth of Chunk no. 14 (1792 MiByte to 1920 MiByte):1285.10 GByte/s
> L2-Cache-Bandwidth of Chunk no. 15 (1920 MiByte to 2048 MiByte):1288.04 GByte/s
> L2-Cache-Bandwidth of Chunk no. 16 (2048 MiByte to 2176 MiByte):1294.91 GByte/s
> L2-Cache-Bandwidth of Chunk no. 17 (2176 MiByte to 2304 MiByte):1286.79 GByte/s
> L2-Cache-Bandwidth of Chunk no. 18 (2304 MiByte to 2432 MiByte):1293.38 GByte/s
> L2-Cache-Bandwidth of Chunk no. 19 (2432 MiByte to 2560 MiByte):1286.45 GByte/s
> L2-Cache-Bandwidth of Chunk no. 20 (2560 MiByte to 2688 MiByte):1285.52 GByte/s
> L2-Cache-Bandwidth of Chunk no. 21 (2688 MiByte to 2816 MiByte):1286.70 GByte/s
> L2-Cache-Bandwidth of Chunk no. 22 (2816 MiByte to 2944 MiByte):1285.84 GByte/s
> L2-Cache-Bandwidth of Chunk no. 23 (2944 MiByte to 3072 MiByte):1284.77 GByte/s
> Press any key to continue . . .
> 
> before i obsess and stay up all night someone show me that this correlates to performance gain in benches and or games


Your second results look aligned to me. The difference between aligned and misaligned is about 3-4% performance (Directly measurable with Superposition).


----------



## Blameless

Quote:


> Originally Posted by *LunaP*
> 
> High bandwidth bridge does what its supposed to , supports high bandwidth when available, if 1 slot is in x16 and the other is in x8 its only going to be able to run as fast as the card can process, Nvlink could be fixed just dunno if they're gonna, I get where your'e coming from but yeah, I dropped my x16 down to 2.0 x16 and it ran at the same speed as the x8 ( as a test )


The bandwidths offered by the bridge and the slot are independent of each other.
Quote:


> Originally Posted by *Nervoize*
> 
> This program is completely different than the GB/s GPU-Z provides.


GPU-Z isn't a benchmark; it only reports the theoretical maximum bandwidth of the memory interface by multiplying bus width by effective clock.
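To make that concrete: the number GPU-Z shows (and the "Peak Theoretical DRAM Bandwidth" line in the benchmark header) is just bus width times effective data rate. A minimal sketch:

```python
def theoretical_bandwidth_gbs(bus_width_bits: int, effective_mhz: float) -> float:
    """Bytes per effective clock (bus width / 8) times the effective data rate in MHz."""
    return bus_width_bits / 8 * effective_mhz / 1000.0

# 1080 Ti: 352-bit bus, ~11010 MHz effective GDDR5X data rate
print(theoretical_bandwidth_gbs(352, 11010))  # 484.44, matching the benchmark header
```

Real transfers always land below this ceiling; only a benchmark like Nai's actually measures achieved bandwidth.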


----------



## LunaP

Quote:


> Originally Posted by *Blameless*
> 
> The bandwidth offered by the the bridge and the slot are independent of each other..


Correct; his response, however, was about the slot, since the bridge bandwidth was already shown in the photos.


----------



## Nervoize

Quote:


> Originally Posted by *8051*
> 
> That's an excellent overclock. Is that as far as it'll go? What's sad is that is only a 10% overclock from the stock maximum boost of a 1080Ti FE.


GPU Boost 3.0 did most of the job by boosting to 2000 MHz. After adding a nice curve to it, I had a stable 2164 MHz overclock. But to lower the wattage and heat, I stick to a solid 2151 MHz core and 6003 MHz memory.

This is my Time Spy score by the way, 9914. https://www.3dmark.com/spy/2489795


----------



## ZealotKi11er

Quote:


> Originally Posted by *Nervoize*
> 
> GPU boost 3.0 did most of the job by boosting to 2000MHz. After adding a nice curve to it, i had a stable 2164MHz overclock. But to lower the wattage and heat, i stick at a solid 2151MHz core and 6003MHz memory.
> 
> This is my Time Spy score by the way, 9914. https://www.3dmark.com/spy/2489795


Is that 1.2v?


----------



## KedarWolf

Quote:


> Originally Posted by *Nervoize*
> 
> Quote:
> 
> 
> 
> Originally Posted by *8051*
> 
> That's an excellent overclock. Is that as far as it'll go? What's sad is that is only a 10% overclock from the stock maximum boost of a 1080Ti FE.
> 
> 
> 
> GPU boost 3.0 did most of the job by boosting to 2000MHz. After adding a nice curve to it, i had a stable 2164MHz overclock. But to lower the wattage and heat, i stick at a solid 2151MHz core and 6003MHz memory.
> 
> This is my Time Spy score by the way, 9914. https://www.3dmark.com/spy/2489795
Click to expand...

Graphics score seems a bit low.

I got 11289 with 2088 core, 6210 memory.

https://www.3dmark.com/spy/2173470

Could memory overclock and CPU type account for that much difference?

http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30/1200_20#post_26267692


----------



## KedarWolf

http://www.overclock.net/t/1406832/single-gpu-fire-strike-top-30/2020_20#post_26377605

#1 on overclock.net for a 1080 Ti.



Come on peeps, my card is only somewhat above average and I never really pushed it.

Challenging you all to beat it ASAP.









Edit: Validation warning is older beta drivers that do better for my benching.


----------



## navjack27

what do i have to beat, the overall or graphics score?

these are my two best

23469 overall
31686 graphics
2088mhz core
https://www.3dmark.com/fs/13141412

23354 overall
31366 graphics
2000mhz core (i doubt that is correct, must be a curve overclock)
https://www.3dmark.com/fs/13082423

i'll run an up-to-date benchmark in a lil bit here

EDIT: i just put all my benches on hwbot - http://hwbot.org/user/jack.mangano/

EDIT AGAIN:

sorry dude, i'm gettin close. your extra cores are no match for me
http://hwbot.org/submission/3671487_
https://www.3dmark.com/3dm/22527028

24053 overall
32844 graphics


----------



## Blameless

Yeah, getting a higher graphics score isn't that difficult, but besting the overall score will take another eight-plus core Intel CPU.


----------



## KedarWolf

Quote:


> Originally Posted by *navjack27*
> 
> what do i have to beat, the overall or graphics score?
> 
> these are my two best
> 
> 23469 overall
> 31686 graphics
> 2088mhz core
> https://www.3dmark.com/fs/13141412
> 
> 23354 overall
> 31366 graphics
> 2000mhz core (i doubt that is correct, must be a curve overclock)
> https://www.3dmark.com/fs/13082423
> 
> i'll run an up-to-date benchmark in a lil bit here
> 
> EDIT: i just put all my benches on hwbot - http://hwbot.org/user/jack.mangano/
> 
> EDIT AGAIN:
> 
> sorry dude, i'm gettin close. your extra cores are no match for me
> http://hwbot.org/submission/3671487_
> https://www.3dmark.com/3dm/22527028
> 
> 24053 overall
> 32844 graphics


Pushed some more, you're still getting a better graphics score than me though.


----------



## KraxKill

Quote:


> Originally Posted by *KedarWolf*
> 
> Pushed some more, you're still getting a better graphics score than me though.


I can't match the CPU score with a quad core...but the graphics score is sailing at 33k


https://www.3dmark.com/fs/12704696


----------



## Nervoize

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Is that 1.2v?


Correct








Quote:


> Originally Posted by *KedarWolf*
> 
> Graphics score seems a bit low.
> 
> I got 11289 with 2088 core, 6210 memory.
> 
> https://www.3dmark.com/spy/2173470
> 
> Could memory overclock and CPU type account for that much difference?
> 
> http://www.overclock.net/t/1606006/3dmark-time-spy-benchmark-top-30/1200_20#post_26267692


That is because my memory is not good at overclocking. I crash beyond +500. And yes, your CPU does increase the score a lot compared to mine.


----------



## nrpeyton

Quote:


> Originally Posted by *Nervoize*
> 
> Correct
> 
> 
> 
> 
> 
> 
> 
> 
> That is because of my memory is not good at overclocking. I crash beyond +500. And yes, your CPU does increase the points a lot compared to mine.


Are there any new options for voltage yet except the XOC? The XOC performs worse on an FE clock for clock, so any gains you get from the extra voltage are lost, and you end up with the same performance for higher voltage and power draw.


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Nervoize*
> 
> Correct
> 
> 
> 
> 
> 
> 
> 
> 
> That is because of my memory is not good at overclocking. I crash beyond +500. And yes, your CPU does increase the points a lot compared to mine.
> 
> 
> 
> Are there any new options for voltage yet except the XOC? As XOC peforms worse on a FE clock for clock. So any gains you get with the extra voltage are lost. And you end up with the same performance for higher voltage and power requirement.
Click to expand...

No new options, no, but I'm getting the best benches with the Strix OC BIOS, about 100 points better in Time Spy than the Arctic Storm, and I got an 11579 graphics score with it in Fire Strike this morning.

The Arctic Storm BIOS would crash the driver at the same clocks and score over 300 points less.









I tried every viable BIOS including XOC and Strix OC BIOS performed best of them all.









Likewise, with the FTW3 BIOS at the same clocks the driver wouldn't crash, but it scored 300 points less; same with the XOC.


----------



## nrpeyton

Well it looks like my Shunt Mod is still paying off.

6 months later. And the Shunt Mod is still going strong and never caused any problems.

*1080 Ti FE* overclocked at *2176 MHZ*, fully *stable*. Drawing 350 - *375 watts* (approx.) in OCCT's version of the Furmark Stress Test.

Just imagine what a bit more voltage could do on this chip!! lol

See screenshot below: _(right click & select 'open new tab' to view FULL SIZE)._





Anyone know if EVGA or Nvidia have released updated BIOS (firmware) for Founders Edition cards yet?


----------



## 8051

Quote:


> Originally Posted by *nrpeyton*
> 
> Well it looks like my Shunt Mod is still paying off.
> 
> 6 months later. And the Shunt Mod is still going strong and never caused any problems.
> 
> Just imagine what a bit more voltage could do on this chip!! lol
> 
> Anyone know if EVGA or Nvidia have released updated BIOS (firmware) for Founders Edition cards yet?


Is it just your EVGA card that allows for 1.27V? Or can most 1080Ti's get to this voltage?


----------



## nrpeyton

Quote:


> Originally Posted by *8051*
> 
> Is it just your EVGA card that allows for 1.27V? Or can most 1080Ti's get to this voltage?


Where are you reading 1.27v? _(i wish lol)._

Voltage = 1.093v

In-game screenshot at 4k Ultra (Mafia 3) below.

From left to right:
GPU Core Clock Speed _(1080Ti FE)_, Power Draw in Watts, Voltage
GPU Temp, GPU Utilisation, GPU video memory allocated
CPU Temp _(i7 7700k)_, CPU Max Single Thread Usage (at specific time), Total CPU Usage (avg. of all cores)
CPU V.core, CPU Power Draw in Watts.

/\ All in real-time.



Has anyone tried the new 1080 Ti Kingpin 'Extreme watercooling OC BIOS' (https://xdevs.com/doc/_PC_HW/EVGA/KPE180/bios/6798X390.rom) on an FE? It has a higher power target than the three BIOSes shipped with the card (the shipped BIOSes are changed through the BIOS switch). This is a custom BIOS that must be downloaded separately and flashed to the card; it's not available from the EVGA site, but it is available at Kingpincooling.com (or via the link above, which I copied from Kingpin's own site).


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *8051*
> 
> Is it just your EVGA card that allows for 1.27V? Or can most 1080Ti's get to this voltage?
> 
> 
> 
> Where are you reading 1.27v? _(i wish lol)._
> 
> Voltage = 1.093v
> 
> In-game screenshot at 4k Ultra (Mafia 3) below.
> 
> From left to right:
> GPU Core Clock Speed _(1080Ti FE)_, Power Draw in Watts, Voltage
> GPU Temp, GPU Utilisation, GPU video memory allocated
> CPU Temp _(i7 7700k)_, CPU Max Single Thread Usage (at specific time), Total CPU Usage (avg. of all cores)
> CPU V.core, CPU Power Draw in Watts.
> 
> /\ All in real-time.
> 
> 
> 
> Has anyone tried the new 1080 Ti Kingpin 'Extreme watercooling OC BIOS' ? (https://xdevs.com/doc/_PC_HW/EVGA/KPE180/bios/6798X390.rom) on a FE? It has higher power target than the 3 BIOS's shipped with the card (shipped BIOS's are changed through BIOS switch). This is custom BIOS and must be downloaded separately.and flashed to card (also not available from EVGA site). But available at Kingpincooling.com (or the link above I copied from kingpin's own site).
Click to expand...

BIOS bricked my FE, PC wouldn't boot.









Must be a three power connector card, similar cards do the same thing.

But I know what to do when it happens: BIOS defaults, pop in my spare PCI-E card, enable Fast Boot and CSM, pop my 1080 Ti in the second slot, boot up, flash the old BIOS back.









It's a pain in the butt, especially since I forgot to enable DVI-D on my monitor and I'm like 'what the heck, no display'.


----------



## ZealotKi11er

Got to love flat 99% GPU usage, flat 2025MHz Clock Speed.


----------



## 2ndLastJedi

Quote:


> Originally Posted by *KedarWolf*
> 
> BIOS bricked my FE, PC wouldn't boot.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Must be a three power connector card, similar cards do the same thing.
> 
> But I know what to do when it happens, BIOS defaults, pop in my spare pci-e card, enable Fast Boot and CSM, pop in my 1080 Ti in the second slot, boot up, flash old BIOS back.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's a pain in the butt, especially since I forgot to enable DVI-D on my monitor and I'm like 'what the heck, no display'.


Sounds scary!


----------



## 2ndLastJedi

Quote:


> Originally Posted by *nrpeyton*
> 
> Where are you reading 1.27v? _(i wish lol)._
> 
> Voltage = 1.093v
> 
> In-game screenshot at 4k Ultra (Mafia 3) below.
> 
> From left to right:
> GPU Core Clock Speed _(1080Ti FE)_, Power Draw in Watts, Voltage
> GPU Temp, GPU Utilisation, GPU video memory allocated
> CPU Temp _(i7 7700k)_, CPU Max Single Thread Usage (at specific time), Total CPU Usage (avg. of all cores)
> CPU V.core, CPU Power Draw in Watts.
> 
> /\ All in real-time.
> 
> 
> 
> Has anyone tried the new 1080 Ti Kingpin 'Extreme watercooling OC BIOS' ? (https://xdevs.com/doc/_PC_HW/EVGA/KPE180/bios/6798X390.rom) on a FE? It has higher power target than the 3 BIOS's shipped with the card (shipped BIOS's are changed through BIOS switch). This is custom BIOS and must be downloaded separately.and flashed to card (also not available from EVGA site). But available at Kingpincooling.com (or the link above I copied from kingpin's own site).


She's running a bit hot @24C, lol. Nice!


----------



## samoth777

Hello everybody. I have a 1070, a 4690k, a 1440p monitor with 144hz and gsync. I'm planning on upgrading to a 1080 ti. Will my setup cause any bottlenecks? I mainly play battlefield 1 and Witcher 3.


----------



## KedarWolf

Quote:


> Originally Posted by *samoth777*
> 
> Hello everybody. I have a 1070, a 4690k, a 1440p monitor with 144hz and gsync. I'm planning on upgrading to a 1080 ti. Will my setup cause any bottlenecks? I mainly play battlefield 1 and Witcher 3.


It'll be fine I'm sure.


----------



## GreedyMuffin

Yes. You will.

This is a Ryzen 1700 at 3850 versus a 7800X at 4700. My own results.


Spoiler: Warning: Spoiler!



I7 7800X, MSI X299 Tomahawk, 2x8GB Corsair Vengeance 3200 MHz, 1080Ti, AX1500I, Samsung 840 Pro, Windows 10

R7 1700, X370 Crosshair 6, 2x8GB Corsair Vengeance 3200 MHz, 1080Ti, AX1500I, Samsung 840 Pro, Windows 10

I7 7800X: Min FPS 114 - Max FPS 173 - Avg FPS 141 - 64 players Conquest on Soissons - Ultra preset - 1440P

R7 1700: MIN FPS 91 - Max FPS 174 - Avg FPS 115 - 64 players Conquest on Soissons - Ultra preset - 1440P

Ryzen at 3850/2933 on mem - GPU at 2050/+500

SK-X at 4700/3000 and 3200 on mem - GPU at 2050/+500



My old i7 7700K used to be pegged at 100% in most cases in Battlefield 1, even OCed to 5 GHz.


----------



## 2ndLastJedi

Quote:


> Originally Posted by *KedarWolf*
> 
> http://www.overclock.net/t/1406832/single-gpu-fire-strike-top-30/2020_20#post_26377605
> 
> #1 on overclock.net for a 1080 Ti.
> 
> 
> 
> Come on peeps, my card is only some above average and I never really pushed it.
> 
> Challenging you all to beat it ASAP.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Validation warning is older beta drivers that do better for my benching.


What driver do you find best for benchmarks ? Do you always run the current ones ?


----------



## Pyounpy-2

Hi, I have used SLI of Zotac 1080ti amp extreme.
Using bios mod (using XOC) and shunt mod for the power limit, I got the stable SLI system at 2227MHz and GPU voltage of 1.2V.

CPU:7700K @5.4GHz Vcore:1.408V, Mem: 4200MHz 15-17-17-35-1T
GPU: 2227MHz @ 1.2V, Video Mem: 6003MHz, SLI
Cooler: water cooling


----------



## 8051

Quote:


> Originally Posted by *samoth777*
> 
> Hello everybody. I have a 1070, a 4690k, a 1440p monitor with 144hz and gsync. I'm planning on upgrading to a 1080 ti. Will my setup cause any bottlenecks? I mainly play battlefield 1 and Witcher 3.


As long as you're not planning on playing GTA IV at >= 4K resolutions. Because it's a DX9 game, I believe only one core can communicate with the GPU: I always have one core of my 5820K maxed at 90% while my GPU idles along at 25%, and my framerate tanks to the sub-thirties.


----------



## samoth777

Ok thanks for the info guys. Was just worried that the 4690k was too slow.


----------



## 2ndLastJedi

Quote:


> Originally Posted by *8051*
> 
> As long as you're not planning on playing GTAIV at >= 4K resolutions, because it's a DX9 game I believe only one core can communicate w/the GPU and I always have one core of my 5820K maxed at 90% while my GPU idles along at 25% and my framerate tanks to the sub thirties.


I have a 6600k paired with a 1080 Ti and GTA V is perfect at 4k 60 with Vsync, maxed out (as is just about everything bar Watch Dogs 2, but that's Ubisoft) with AA x2. The 6600k is only one gen newer, and given how small Intel's generational leaps have been, these CPUs should perform pretty similarly.


----------



## Pyounpy-2

Quote:


> Originally Posted by *KedarWolf*
> 
> http://www.overclock.net/t/1406832/single-gpu-fire-strike-top-30/2020_20#post_26377605
> 
> #1 on overclock.net for a 1080 Ti.
> 
> 
> 
> Come on peeps, my card is only some above average and I never really pushed it.
> 
> Challenging you all to beat it ASAP.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Validation warning is older beta drivers that do better for my benching.


My result:

25084 overall
33404 graphics
2252.5 MHz core, voltage 1.2V
https://www.3dmark.com/fs/13804113


----------



## CrazyElf

Over the past week, I tried my luck at undervolting. Overall score is just over 17k. For my GPU score, I get just under 20k in TimeSpy at 2560x1440 with a bit of undervolting. Here was the best I could do:

 

I get scores all over the map right now, from high 18.8k to just under 19.9k and I think I got over 20k a couple of times. It's very inconsistent.

The above curve seems to be the best undervolt I can do in SLI. Undervolting more causes instability. Both increasing the curve for more aggressive scaling at lower voltages and decreasing it for lower clocks at lower voltages seem to cause a decrease in performance. I may also have the misfortune of having one decent undervolter and one card that lost the silicon lottery, which makes synchronizing the cards in SLI very difficult. On the other hand, another possibility is that the top card is running hot because it is sandwiched between the other 1080Ti and a CPU cooler in a north-south configuration. Maybe a CPU cooler with an east-west configuration could help.

The CPU score is weak, likely because of my low uncore, and I have a dead DIMM - I've left it as dual channel for now.


Spoiler: Configuration kind of looks like this, only with 1080Ti Lightnings now instead of 290X Lightnings





I got rid of the sound card in favor of an external DAC, but there's no denying that 5960X is starving for air. The top card is still 7C hotter.

The CPU cooler fan is right against the backplate of the 1080Ti, whereas before there was a bit of room. It looks like the 1080Ti Lightnings are a bit bigger than the 290X Lightnings.



The undervolting score is not quite the result that this person got:

https://www.reddit.com/r/6ejqja/psa_undervolting_your_gpu_can_give_you_a_free/

Core seems counterproductive given my voltage curve.

Anyone try pushing up AUX1 and AUX2? Both can be raised up to 30 mV. On AMD GPUs that is the PLL, so I would imagine it's similar for Nvidia GPUs. I'm not sure what the difference is between AUX1 and AUX2. I'm hoping for slightly better RAM clocks. At 4k, it does seem like we are facing a VRAM bottleneck - Nvidia may very well have gimped the 352-bit bus knowing this would happen. GPU1 seems to do +325 MHz and GPU2, if left alone, does +400 MHz with no extra voltage. Anything more fails Time Spy, although +500 can be stable in games.

Quote:


> Originally Posted by *Blameless*
> 
> Good tip for SLI users, though you'll want to run many more than two instances.
> 
> I normally run eight or nine on my 1080 Ti and have found it impossible to get full memory controller load with less than six.


Odd - Memtest G80 is finding errors at stock.

I wonder if there is an easier way to do this.

MemtestG80.exe 10240 1000
MemtestG80.exe 10240 1000 --gpu 1 g1

OR if this is a 32 bit application, then we are left with:

MemtestG80.exe 3072 1000
MemtestG80.exe 3072 1000 --gpu 1 g1

We could open 3-4 instances or with SLI 6-8 instances.
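If anyone wants to script that, here's a rough Python sketch that builds and launches the multi-instance runs described above. The `MemtestG80.exe` arguments are just the ones from this post (`<MiB> <iterations> [--gpu N]`); the instance counts are assumptions, adjust to taste.

```python
import subprocess

# Build the MemtestG80 command lines for several instances per GPU.
# Arguments follow the pattern above: <MiB to test> <iterations> [--gpu N].
def build_commands(gpus, per_gpu, mib=3072, iters=1000):
    cmds = []
    for gpu in gpus:
        for _ in range(per_gpu):
            cmd = ["MemtestG80.exe", str(mib), str(iters)]
            if gpu != 0:  # GPU 0 is the default, no flag needed
                cmd += ["--gpu", str(gpu)]
            cmds.append(cmd)
    return cmds

def launch_all(cmds):
    # Start every instance concurrently, then wait for all to finish.
    procs = [subprocess.Popen(c) for c in cmds]
    return [p.wait() for p in procs]

# SLI setup: 4 instances on each of two cards (8 total)
commands = build_commands(gpus=[0, 1], per_gpu=4)
```

Running `launch_all(commands)` would kick off all eight at once instead of opening them by hand.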

Quote:


> Originally Posted by *Blameless*
> 
> Interesting slide, sums up the relevant memtestG80 dev comments nicely.


What's really interesting is that, according to the Stanford data, Nvidia made progress on memory errors between the G80 and GT200. I wonder if they have continued to improve since, with Fermi, Kepler, Maxwell, and Pascal. I would suspect they must have, especially because they want to dominate the compute market.

Perhaps modern GDDR5X has fewer memory errors as well.

Quote:


> Originally Posted by *Blameless*
> 
> It's worth it. Hash rate goes up ~5% for a similar increase in power consumption.


Very interesting. +Rep

I did not know that hash rates went up 5% on a 1080Ti at a given power consumption level using P0. Strictly for mining, of course, GDDR5X remains noncompetitive versus the cheaper GTX 1060, 1070, and AMD's Polaris, but I guess it does help those not gaming.

I do agree with your hypothesis that Nvidia may be attempting to protect its high margin Quadro markets. I suppose there is also the matter that the Quadros offer ECC that keeps them distinguished, which is necessary where absolute accuracy is needed.

Quote:


> Originally Posted by *Blameless*
> 
> The bandwidth offered by the bridge and the slot are independent of each other.
> GPU-Z isn't a benchmark, it only reports the theoretical maximum bandwidth of the memory interface by multiplying bus width by effective clock.


Yep. This is actually an interesting situation - I'd love to see a test of x8/x8 vs x8/x16 vs x16/x16 SLI testing from a frame time POV. There have been claims that the x16/x16 does better, which is possible.

I wonder if there are any games that benefit from the HB SLI bridge with a 1080Ti. Tested with a 1080, there were games that did at 4k. Most did not, but a few did:
https://www.pcworld.com/article/3087524/hardware/tested-the-payoff-in-buying-nvidias-40-sli-hb-bridge.html

  

They must have tested 2 soft bridges versus 1 HB bridge. What we really need though is a frame time test between SLI soft vs HB SLI on the 1080Ti.

This is also interesting as well.
https://overclocking.guide/nvidia-hb-sli-bridge-technical-review/

IF 2x GTX 1080 can at times gain, I wonder if this is even more so for 2x 1080Ti, especially at 4k?

Edit:
Now here's the big claim that Nvidia made - their frame times are better with HB SLI.

https://www.geforce.com/whats-new/articles/geforce-gtx-1080-gaming-perfected



Independent testing shows this claim may have some merit.
https://www.computerbase.de/2016-09/nvidia-sli-hb-bridge-test/2/#diagramm-rise-of-the-tomb-raider-directx-11-frametimes-in-3840-2160

If true, then from Pascal onwards, people with SLI setups would be well advised to get a HB SLI bridge. _It is not for frame rates. It is for frame time variance._


----------



## mouacyk

Any new developments on TDP editing in BIOS?


----------



## nrpeyton

Quote:


> Originally Posted by *2ndLastJedi*
> 
> She's running a bit hot @24C, lol. Nice!


aye lol, it's amazing what a bit of cold can do for core clocks lol.

I can run the OCCT stress test at 2189 Mhz (without crashing) at that temp too, but errors start getting reported by the error detector.

Another 10c colder and I could probably manage 2202 Mhz


----------



## GreedyMuffin

My GPU used to be at 50-56°C, but now it maxes around 33-37°C after I upgraded to a MO-RA3 a few months ago. Is it possible that I could increase my OC now that the temp has dropped by 15-20°C? Like 50 MHz more? :hmm:


----------



## nrpeyton

Quote:


> Originally Posted by *GreedyMuffin*
> 
> My GPU used to be at 50-56°C, but now it maxes around 33-37°C after I upgraded to a MO-RA3 a few months ago. Is it possible that I could increase my OC now that the temp has dropped by 15-20°C? Like 50 MHz more? :hmm:


Yes definitely. 100%.

You should get _AT LEAST_ 2 MHz for every 1°C colder (roughly) on Pascal.

So 20°C colder = 40 MHz faster. _(Or, since it goes in bins of 13, 39 MHz.)_
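That rule of thumb is easy to sanity-check in code. A quick sketch (the 2 MHz/°C figure and the 13 MHz bin size are just the estimates from this post, not anything official):

```python
# Estimate the extra Pascal core clock gained from running colder:
# roughly 2 MHz per 1°C, snapped down to the 13 MHz bins the boost
# clock actually moves in.
def expected_gain_mhz(degrees_colder, mhz_per_c=2.0, bin_mhz=13):
    raw = degrees_colder * mhz_per_c
    return int(raw // bin_mhz) * bin_mhz

print(expected_gain_mhz(20))  # 40 MHz raw -> lands on the 39 MHz bin
```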


----------



## GreedyMuffin

I could do 1950 on 0.900V before.. Maybe I can hit 2000 mhz at 0.925V.


----------



## nrpeyton

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I could do 1950 on 0.900V before.. Maybe I can hit 2000 mhz at 0.925V.


Try it. And let us know ;-)

You may also get 50 MHz more on your memory, though memory isn't quite as responsive to cold. You won't know until you try. You should hopefully get at least an extra 25 MHz (which might get you to that sweet spot you were just missing). Once you find a memory sweet spot, the next one seems to come at every 50 MHz increment.

In an earlier post someone said this may be due to pre-programmed latency timings at different speeds.


----------



## mrsheepmasterdy

Hi, between the Zotac GTX 1080 Ti AMP Extreme and the Aorus GTX 1080 Ti Waterforce Extreme, which one would you recommend? I want the one with the best performance I can get without manually overclocking.


----------



## nrpeyton

Quote:


> Originally Posted by *mrsheepmasterdy*
> 
> Hi,between the zotac gtx 1080 ti amp extreme or the aorus gtx 1080 ti waterforce extreme ,wich one would you recommend? I want the one with the best performance i can get without manually overclocking


One way to find real, usable technical information on this _(it's how I'd do it, anyway)_: check the maximum power target for each GPU in the BIOS database at techpowerup.com and compare them until you find the highest.

Cards with higher power targets programmed by default won't throttle as much in power-hungry games.
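On a card you already have, you can read the current and maximum power limits straight from `nvidia-smi --query-gpu=name,power.limit,power.max_limit --format=csv` (those field names come from nvidia-smi's query interface). A small Python sketch to compare the headroom; the sample output below is made up purely for illustration:

```python
import csv, io

# Parse nvidia-smi CSV output and report how far each card's power
# limit can be raised above its current value.
def power_headroom(csv_text):
    rows = list(csv.reader(io.StringIO(csv_text)))
    headroom = {}
    for name, limit, max_limit in rows[1:]:  # skip the header row
        watts = lambda s: float(s.strip().split()[0])  # "250.00 W" -> 250.0
        headroom[name.strip()] = watts(max_limit) - watts(limit)
    return headroom

# Hypothetical output, just to show the shape of the data:
sample = ("name, power.limit [W], power.max_limit [W]\n"
          "GeForce GTX 1080 Ti, 250.00 W, 300.00 W\n")
print(power_headroom(sample))  # {'GeForce GTX 1080 Ti': 50.0}
```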

Interesting YouTube videos regarding liquid metal on GPUs:










In the comments, some guys are reporting 15c colder temps at load.


----------



## 8051

Quote:


> Originally Posted by *2ndLastJedi*
> 
> I have a 6600k paired with 1080 Ti and GTA V is perfect at 4k 60 with Vsync maxed (As is just about everything bar Watch Dogs 2 but thats UbiSoft) with AA x2 . I only assume that 6600k is only one gen newer and Intel leaps in performance means these CPU's should perform pretty similar .


The problem is with GTA IV and that it's a DX9 game.


----------



## LunaP

Quote:


> Originally Posted by *CrazyElf*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Over the past week, I tried my luck at undervolting. Overall score is just over 17k. For my GPU score, I get just under 20k in TimeSpy at 2560x1440 with a bit of undervolting. Here was the best I could do:
> 
> 
> 
> I get scores all over the map right now, from high 18.8k to just under 19.9k and I think I got over 20k a couple of times. It's very inconsistent.
> 
> The above curve seems to be the best undervolt I can do in SLI. Undervolting more causes instability. Both increasing the curve for more aggressive scaling at lower voltages and decreasing it for lower clocks at lower voltages seem to cause a decrease in performance. I may also have the misfortune of having one decent undervolter and one card that lost the silicon lottery, which makes synchronizing the cards in SLI very difficult. On the other hand, another possibility is that the top card is running hot because it is sandwiched between the other 1080Ti and a CPU cooler in a north-south configuration. Maybe a CPU cooler with an east-west configuration could help.
> 
> The CPU score is weak, likely because of my low uncore and I have a dead DIMM - I've left as dual channel for now.
> 
> 
> Spoiler: Configuration kind of looks like this, only with 1080Ti Lightnings now instead of 290X Lightnings
> 
> 
> 
> 
> 
> I got rid of the sound card in favor of an external DAC, but there's no denying that 5960X is starving for air. The top card is still 7C hotter.
> 
> The CPU cooler fan is right against the backplate in the 1080Ti, wheres before there was a bit of room. It looks like the 1080Ti Lightnings are a bit bigger than the 290X Lightnings.
> 
> 
> 
> The undervolting score is not quite the result that this person got:
> 
> https://www.reddit.com/r/6ejqja/psa_undervolting_your_gpu_can_give_you_a_free/
> 
> Core seems counterproductive given my voltage curve.
> 
> Anyone try pushing up the AUX1 and AUX2? Both can be raised up to 30 mV. On AMD GPUs, that is the PLL, so I would imagine that it would be similar for Nvidia GPUs. I'm not sure what the difference is between AUX1 and AUX2. I'm hoping for slightly better RAM clocks. At 4k, it does seem like we are facing a VRAM bottleneck - Nvidia may very well have gimped the 352 bit bus knowing that this would happen. GPU1 seems to do +325 MHz and GPU2, if left alone does +400 MHz with no voltage. Any more fails Timespy, although +500 can be stable in games.
> Odd - Memtest G80 is finding errors at stock.
> 
> I wonder if there is an easier way to do this.
> 
> MemtestG80.exe 10240 1000
> MemtestG80.exe 10240 1000 --gpu 1 g1
> 
> OR if this is a 32 bit application, then we are left with:
> 
> MemtestG80.exe 3072 1000
> MemtestG80.exe 3072 1000 --gpu 1 g1
> 
> We could open 3-4 instances or with SLI 6-8 instances.
> What's really interesting is that Nvidia according to the Stanford made progress in their memory errors between the G80 and GT200. I wonder if they continued to progress since with Fermi, Kepler, Maxwell, and Pascal respectively. I would suspect that they must have, especially because they want to dominate the compute market.
> 
> Perhaps modern GDDR5X has less memory errors as well.
> Very interesting. +Rep
> 
> I did not know that hash rates went up 5% on a 1080Ti at a given power consumption level using P0. Strictly for mining of course, GDDR5X remains noncompetitive versus the cheaper GTX 1060, 1070, and AMD's Polaris, but it does help I guess for those not gaming.
> 
> I do agree with your hypothesis that Nvidia may be attempting to protect its high margin Quadro markets. I suppose there is also the matter that the Quadros offer ECC that keeps them distinguished, which is necessary where absolute accuracy is needed.
> Yep. This is actually an interesting situation - I'd love to see a test of x8/x8 vs x8/x16 vs x16/x16 SLI testing from a frame time POV. There have been claims that the x16/x16 does better, which is possible.
> 
> I wonder if there are any games that benefit from the HB SLI bridge with a 1080Ti. Tested with a 1080, there were games that did at 4k. Most did not, but a few did:
> https://www.pcworld.com/article/3087524/hardware/tested-the-payoff-in-buying-nvidias-40-sli-hb-bridge.html
> 
> 
> 
> They must have tested 2 soft bridges versus 1 HB bridge. What we really need though is a frame time test between SLI soft vs HB SLI on the 1080Ti.
> 
> This is also interesting as well.
> https://overclocking.guide/nvidia-hb-sli-bridge-technical-review/
> 
> IF 2x GTX 1080 can at times gain, I wonder if this is even more so for 2x 1080Ti, especially at 4k?
> 
> Edit:
> Now here's the big claim that Nvidia made - their frame times are better with HB SLI.
> 
> https://www.geforce.com/whats-new/articles/geforce-gtx-1080-gaming-perfected
> 
> 
> 
> Independent testing shows this claim may have some merit.
> https://www.computerbase.de/2016-09/nvidia-sli-hb-bridge-test/2/#diagramm-rise-of-the-tomb-raider-directx-11-frametimes-in-3840-2160
> 
> If true, then from Pascal onwards, people with SLI setups would be well advised to get a HB SLI bridge. _It is not for frame rates. It is for frame time variance._


I can attest to the HB bridge. I didn't have a 0/1-slot bridge, so I went with a single ribbon, which made the Nvidia panel give me a warning. I noticed a small gain when adding a 2nd ribbon, and a slightly higher increase once my HB bridge arrived. I play at 4k SLI.


----------



## KedarWolf

Afterburner Beta 18 out, OP updated.


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> Afterburner Beta 18 out, OP updated.


Any significant changes?

I'm still using 4.4.0 Beta 6.

In any case -- thanks for update.


----------



## KedarWolf

I saw a Reddit post about undervolting to 800mv.

My card will do 1809 core/6014 memory at that voltage without Time Spy crashing.

Peeps who like undervolting might want to see what they can achieve at 800mv.


----------



## ZealotKi11er

Quote:


> Originally Posted by *KedarWolf*
> 
> I saw a Reddit about undervolting to 800mv.
> 
> My card will do 1809 core/ 6014 memory without Time Spy crashing.
> 
> Peeps who like undervolting might want to see what they can achieve at 800v.


Entering Winter Season so no reason to go under 900mv for me.


----------



## KedarWolf

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I saw a Reddit about undervolting to 800mv.
> 
> My card will do 1809 core/ 6014 memory without Time Spy crashing.
> 
> Peeps who like undervolting might want to see what they can achieve at 800v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Entering Winter Season so no reason to go under 900mv for me.

I'm pretty sure I could do 1987 at 900mv when I checked.


----------



## ZealotKi11er

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm pretty sure I can do 1987 at 900v when I checked.


I am at 1860MHz 900mv. Did not bother to go higher.


----------



## Spin Cykle

Simple upgrade question for everyone. I've got the funds for one option. What would you choose?

Current system:

3770k @ 4.7ghz
GTX 1070 w/ EK Block @ +150/+500
Samsung SSD's
Acer Predator 1080p 180HZ gsync
Custom WC loop
Option 1:

Keep the 1070 and upgrade to 8700k (CPU,MB,RAM)

Option 2:

Keep current 3770k (Z77 chipset) platform and upgrade to a 1080ti.

Sent from my iPhone using Tapatalk Pro


----------



## ZealotKi11er

1080 Ti easy. CPU upgrade is pointless lol.


----------



## KedarWolf

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I'm pretty sure I can do 1987 at 900v when I checked.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am at 1860MHz 900mv. Did not bother to go higher.

Nope, 1987 crashed Time Spy but 1961 core/ 6085 memory passed it.

I get about 11000 in Time Spy at 2076 core/ 6147 memory and get 10378 at 1961 core/ 6085 memory.









Not a huge difference for 24/7 gaming even if you're running 4K.
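For perspective, those numbers say the score tracks the core clock almost linearly. A quick sanity check on the figures quoted above:

```python
# Percent drop in Time Spy score vs percent drop in core clock,
# using the numbers from this post.
def pct_drop(high, low):
    return (high - low) / high * 100

score_drop = pct_drop(11000, 10378)  # ~5.7% lower score
clock_drop = pct_drop(2076, 1961)    # ~5.5% lower core clock
print(round(score_drop, 1), round(clock_drop, 1))
```

So a ~5.5% clock cut costs ~5.7% of the score, which is why the 24/7 difference feels small.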


----------



## nrpeyton

Quote:


> Originally Posted by *Spin Cykle*
> 
> Simple upgrade question for everyone. I've got the funds for one option. What would you choose?
> 
> Current system:
> 
> 3770k @ 4.7ghz
> GTX 1070 w/ EK Block @ +150/+500
> Samsung SSD's
> Acer Predator 1080p 180HZ gsync
> Custom WC loop
> Option 1:
> 
> Keep the 1070 and upgrade to 8700k (CPU,MB,RAM)
> 
> Option 2:
> 
> Keep current 3770k (Z77 chipset) platform and upgrade to a 1080ti.
> 
> Sent from my iPhone using Tapatalk Pro


I agree 100%. Go for the faster Graphics Card.

That CPU is absolutely fine and shouldn't bottleneck the card. And by the time the CPU does become a limitation, it will be time to upgrade to a bigger, higher-resolution monitor anyway, which takes the pressure back off the CPU and makes it viable again... lol... so really, it still has a good few years left in it.

100% go for the 1080 Ti.

You won't regret it.


----------



## ZealotKi11er

Quote:


> Originally Posted by *KedarWolf*
> 
> Nope, 1987 crashed Time Spy but 1961 core/ 6085 memory passed it.
> 
> I get about 11000 in Time Spy at 2076 core/ 6147 memory and get 10378 at 1961 core/ 6085 memory.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not a huge difference for 24/7 gaming even if you're running 4K.


Testing 1860 @ 875mv.


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Afterburner Beta 18 out, OP updated.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any significant changes?
> 
> I'm still using 4.4.0 Beta 6.
> 
> In any case -- thanks for update.

I've never read the release notes on it, but the developer is always improving it, so I'd upgrade.


----------



## mouacyk

Quote:


> Originally Posted by *Spin Cykle*
> 
> Simple upgrade question for everyone. I've got the funds for one option. What would you choose?
> 
> Current system:
> 
> 3770k @ 4.7ghz
> GTX 1070 w/ EK Block @ +150/+500
> Samsung SSD's
> Acer Predator 1080p 180HZ gsync
> Custom WC loop
> Option 1:
> 
> Keep the 1070 and upgrade to 8700k (CPU,MB,RAM)
> 
> Option 2:
> 
> Keep current 3770k (Z77 chipset) platform and upgrade to a 1080ti.
> 
> Sent from my iPhone using Tapatalk Pro


I was debating the same thing. Up to 50% more multi-core performance, of which almost none will be seen in games, or a definite 50% more frames?
Quote:


> Originally Posted by *ZealotKi11er*
> 
> 1080 Ti easy. CPU upgrade is pointless lol.


Exactly what I went with. Found an MSI Seahawk EK X for $799.


----------



## KedarWolf

Quote:


> Originally Posted by *2ndLastJedi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> http://www.overclock.net/t/1406832/single-gpu-fire-strike-top-30/2020_20#post_26377605
> 
> #1 on overclock.net for a 1080 Ti.
> 
> 
> 
> Come on peeps, my card is only some above average and I never really pushed it.
> 
> Challenging you all to beat it ASAP.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Validation warning is older beta drivers that do better for my benching.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What driver do you find best for benchmarks ? Do you always run the current ones ?

Got an 11298 Time Spy with 385.12; it consistently performs best in benches.


----------



## Dasboogieman

Quote:


> Originally Posted by *mouacyk*
> 
> I was debating the same thing. Up to 50% more multi-core performance, where almost none of it will be seen in games or definitely have 50% more frames?
> Exactly what I went with. Found an MSI Seahawk EK X for $799.


Well, if I were to get the 8700K, it would be primarily to eliminate frame drops when running games that saturate a hyperthreaded quad-core, as the 2 additional cores can handle Windows/background/multitasked stuff.


----------



## 8051

My 1080Ti has wreaked havoc on my overclocked system. I finally had to raise the PCH voltage to stabilize it, after too many hours wasted on red herrings.

I've never had a video card affect my system overclocks before.


----------



## 8051

Quote:


> Originally Posted by *Dasboogieman*
> 
> Well, if I were to get the 8700K, it would be primarily to eliminate frame drops when running games that saturate a hyperthreaded quad-core, as the 2 additional cores can handle Windows/background/multitasked stuff.


I'd figure games are going to use SMP more and more because of the architecture of the leading consoles, but a 4-core hyperthreaded CPU running at more than twice the speed of those consoles would, I'd imagine, more than make up for that.


----------



## Spin Cykle

Thanks for all the replies! I was definitely leaning towards the 1080Ti. And given I have an EK water block, it seems like the right upgrade.

The one area holding me back is frame time consistency and frame drops. It seems new CPU architecture minimizes the spikes/drops and smoothness. (Overall feel)

Does anyone have a recommendation for a reference-design card? My EK block only supports the ref design. I was thinking an EVGA SC Black Edition or the regular SC non-ICX would be my best bet!

Sent from my iPhone using Tapatalk Pro


----------



## RoBiK

Quote:


> Originally Posted by *Spin Cykle*
> 
> Thanks for all the replies! I was definitely leaning towards the 1080Ti. And given I have an EK water block, it seems like the right upgrade.
> 
> The one area holding me back is frame time consistency and frame drops. It seems new CPU architecture minimizes the spikes/drops and smoothness. (Overall feel)
> 
> Does anyone have a recommendation for a reference-design card? My EK block only supports the ref design. I was thinking an EVGA SC Black Edition or the regular SC non-ICX would be my best bet!
> 
> Sent from my iPhone using Tapatalk Pro


There is no difference in PCB between the different brands and models based on the reference design, only in the cooler design, which is irrelevant when watercooling. Just pick the cheapest one.


----------



## Asus11

I currently own a GTX 1080 Ti SeaHawk EK X... it coil whines, which is annoying tbh. Should I consider an RMA? Does anyone else have a Gaming X or similar that does the same?


----------



## 2ndLastJedi

Quote:


> Originally Posted by *Spin Cykle*
> 
> Thanks for all the replies! I was definitely leaning towards the 1080Ti. And given I have an EK water block, it seems like the right upgrade.
> 
> The one area holding me back is frame time consistency and frame drops. It seems new CPU architecture minimizes the spikes/drops and smoothness. (Overall feel)
> 
> Does anyone have a recommendation for a reference-design card? My EK block only supports the ref design. I was thinking an EVGA SC Black Edition or the regular SC non-ICX would be my best bet!
> 
> Sent from my iPhone using Tapatalk Pro


I have a Galax GTX 1080 Ti EXOC. It has a reference PCB and hits 30,000 in Fire Strike (https://www.3dmark.com/fs/13219699); best in Superposition was 9992, max core is 2088 and memory is 12008, and temp maxes at 75°C with the default curve. This is with an offset OC; I have only just started playing with the curve to undervolt, and it's sitting at 0.993V @ 2003 MHz.
Here in Australia it's the cheapest 1080 Ti.
Here's a Time Spy:
https://www.3dmark.com/spy/2058030


----------



## CYBER-SNIPA

Quote:


> Originally Posted by *Asus11*
> 
> currently own a GTX 1080 Ti SeaHawk EK X .. it coils whine which is annoying tbh should I consider RMA , does anyone else have a gaming X or similar that does the same?


Best to; if you ever want to sell it on, a non-whiny card will be easier to sell!!!


----------



## BanBoo

Hey guys, I've owned an Asus 1080Ti Poseidon for 2 weeks now.

Here's my system:

i7 4790k @ 4700Mhz (Corsair H110i)
Asus Maximus Hero VII
Asus ROG Poseidon 1080Ti @ 2000Mhz - No OC (280 & 140 Radiator)

The thing is, every time I OC the card to ~2150MHz, the fps start to drop at an uncertain point in games... the core freq stays the same but the utilisation goes up to 100% in almost every scene.
It feels almost like a driver crash, but without a screen flicker or anything.
Is that the way these cards deal with crashes? There are no artifacts at all...


----------



## MaKeN

Quote:


> Originally Posted by *KedarWolf*
> 
> I saw a Reddit about undervolting to 800mv.
> 
> My card will do 1809 core/ 6014 memory without Time Spy crashing.
> 
> Peeps who like undervolting might want to see what they can achieve at 800v.


Trying my best now... interested in what it can do at that voltage.

Edit: it would pass Superposition at:
1775/6318, 9k+ score.
Running Time Spy now.

I get an 8891 score in Time Spy...
Graphics score 9582

Edit:
Undervolting from 1.0V to 0.8V (2000MHz vs 1775MHz), results are the same 144 fps in game, with a 6°C drop in temps for now... Mass Effect Andromeda...
Interesting to test it more... if it really is a 6°C drop over time, I'll stay on 0.8V.


----------



## Asus11

Quote:


> Originally Posted by *CYBER-SNIPA*
> 
> Best too, if you ever want to sell on, a non-whiney card will be easier to sell!!!


I will contact Amazon


----------



## Sgang

Quote:


> Originally Posted by *Blameless*
> 
> Yes. You aren't going to get enough contact to justify the airflow obstruction with that design.
> You can leave the pads on the main heatsink, the ones that cool the GDDR5X, and smaller VRM area, alone. You might see some small improvement by replacing them, but given the lower thermal density of these parts, probably not enough to justify replacing them.


I changed the thermal pads to Fujipoly 17 and 13 on the VRM and mosfets of my Zotac AMP Extreme and found lower temps in the stress test, but not as much lower as I was expecting. I also simply removed the rubber from the main sink as suggested, and changed the thermal pads on the memory at first, but then reused the green ones (any problem with reusing them?) because the Fujipoly I used was less flexible and the heatsink had less contact with the GPU (the card touched 80 degrees in Time Spy).
Tomorrow I will post some logs.
Do you use a personalized fan curve that I can copy?


----------



## KedarWolf

Quote:


> Originally Posted by *MaKeN*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I saw a Reddit about undervolting to 800mv.
> 
> My card will do 1809 core/ 6014 memory without Time Spy crashing.
> 
> Peeps who like undervolting might want to see what they can achieve at 800v.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Trying my best now ... interesting what it can do at that voltage
> 
> Edit: would pass supervision at :
> 1775/6318 9k+ score.....
> Running time spy now.
> 
> I get 8891 score in time spy....
> Graphics score 9582
> 
> Edit:
> Undervolting from 1.0 to 0.8 2000 vs 17775 , reaoults are same 144 fps in game and a 6c drop in temps for now... mass efect andromeda...
> Interesting to test it more... if it really is that 6c drop overtime, ill stay on 0.8 v

Yeah, not much difference between 1809 core/6014 memory and 2072 core/6147 memory, only a few FPS even with Hellblade at 4K.









If you're not running benches, there's a big advantage to running at 800mv.









Edit: Fire Strike Ultra stress test passed at 1797 core/5977 memory at .800v.

And temps are 10C less than at 1.093V.


----------



## MaKeN

For sure!
I'm not benching... just gaming...
I'm really happy with the temp drop I'm seeing; I went from a max temp of 44°C to 37°C, water temps went down too, same as the CPU...
Definitely going to keep it at the lowest voltage possible.
In my case there are almost no fps drops at all... it's still going strong at 144fps.


----------



## KedarWolf

Quote:


> Originally Posted by *MaKeN*
> 
> For sure!
> Im not benching ... just gaming ...
> im really happy with the temps drop i see , i went from 44 max temp to 37c , water temps went down also, same as cpu ....
> definitely ill keep it at that lowest voltage possible.
> In my case there are almost no fps drop at all... its still going strong 144fps


In Fire Strike I get about a 32000 graphics score with the latest drivers; this run at 800mv gives a 27868 graphics score.


----------



## Blameless

Quote:


> Originally Posted by *Sgang*
> 
> Any problem with reusing them?


Reusing them isn't ideal, but the softer pads can generally be reused several times if you are careful with them.
Quote:


> Originally Posted by *Sgang*
> 
> Do you use a personalized fan curve that i can copy?


My fan curve is extremely simple:










Basically, I find the temperature the card idles at with my typical ambients and case airflow, then set the fan to turn on a few degrees above that. I then figure out the highest temperature the settings I intend to use are completely stable at, and put my temp limit 1C below this and my 100% fan point 2C below this.

Fan usually hovers around 70% while gaming with no vsync/uncapped FPS and is off at idle, but will ramp all the way to 90-100% in the heaviest stress tests that are practical to run.
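For anyone who wants to reproduce that rule of thumb, here's a quick sketch. The temperatures are made-up placeholders, not measurements from any particular card — plug in your own idle and max-stable temps:

```python
# Sketch of the fan-curve rule described above: fan turns on a few
# degrees above idle, the temp limit sits 1C below the max stable
# temperature, and the 100% fan point sits 2C below it.
def fan_curve_points(idle_temp, max_stable_temp):
    """Return (fan_on, temp_limit, full_speed) in degrees C."""
    fan_on = idle_temp + 3             # fan kicks in a few degrees above idle
    temp_limit = max_stable_temp - 1   # hard temperature limit
    full_speed = max_stable_temp - 2   # 100% fan point
    return fan_on, temp_limit, full_speed

print(fan_curve_points(35, 70))  # -> (38, 69, 68)
```

You'd then enter those three points by hand into Afterburner's custom fan curve.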


----------



## samoth777

Hi all. So I finally decided to upgrade my 1070 to a Gigabyte 1080 ti Aorus Xtreme.

I noticed a couple of issues:

1) Stuttering: This is very annoying. I've had no such stuttering issues when I was using my 1070. I play Battlefield 1, I notice how great the frames are, but every now and then I can see the frames dip significantly, sometimes from 100+ frames down to less than 50, and sometimes less. This is totally frustrating. It's making me regret my purchase.









2) Coil Whine: I didn't have any in my previous 1070 Palit Jetstream at all. I actually really don't mind coil whine cause from what I've heard it is very normal

3) A weird graphical glitch when I boot up windows: you know when you boot up, and it loads and you see a small spinning thing. For like a split second I see a weird glitchy line. Then back to normal. Not particularly a terrible issue, but it is very annoying.

Any solutions for issue number 1? I already did a DDU driver refresh and I'm currently using the latest drivers from Nvidia.

Thanks guys


----------



## becks

Hey everyone... been a long time since I've been here.
Had to move everything to a new address and so on...

Still rocking an i7 7700K + GTX 1080 Ti Waterforce WB Xtreme Edition 11G.
Recently I bought a new screen (Dell U2417HA) and I have this weird problem where I get no signal over DP until I'm in the OS... I can't get into the BIOS or see the POST screen.
I have to constantly switch to an old DVI screen if I want to change anything.

Is there any known fix for this?


----------



## Sgang

Quote:


> Originally Posted by *Blameless*
> 
> Reusing them isn't ideal, but the softer pads can generally be reused several times if you are careful with them.
> My fan curve is extremely simple:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Basically, I find the temperature the card idles at with my typical ambients and case airflow, then set the fan to turn on a few degrees above that. I then figure out the highest temperature the settings I intend to use are completely stable at, and put my temp limit 1C below this and my 100% fan point 2C below this.
> 
> Fan usually hovers around 70% while gaming with no vsync/uncapped FPS and is off at idle, but will ramp all the way to 90-100% in the heaviest stress tests that are practical to run.


I double-checked the temperatures this morning; nothing seems to have changed. At 2032 the temperature reaches 68°C (per the GPU-Z log), same as before the new thermal compound (Noctua NT-H1) on the core and the new thermal pads.







I have to say I don't have great ventilation in the case (InWin 101), just one exhaust fan very far from the GPU, as I'm waiting for new fans to come in stock.

It seems I have more overclock stability; could that be due to better heat dissipation from the VRM?


----------



## SauronTheGreat

Has anyone noticed that BF1 doesn't put full load on the GPU during gaming? The load on my 1080 Ti is between 75% and 95%... is it because the game is CPU intensive?


----------



## Blameless

Quote:


> Originally Posted by *Sgang*
> 
> I double checked the temperatures this morning, nothing seems changed. At 2032 the temperature reach 68° (indicated in GPUZ log) as before the new thermal compound (Noctua NTH1) on the core and the new thermal pads
> 
> 
> 
> 
> 
> 
> 
> I have to say i have not a great ventilation in the case (InWIN 101), just one exhaust fan very far from the VGA, as i'm waiting for a new stock of fans.


No VRM temp sensor?
Quote:


> Originally Posted by *Sgang*
> 
> Seems i have more stability in overclock, could be due to a better dissipation of the VRM?


Quite possible.


----------



## haavard

Quote:


> Originally Posted by *samoth777*
> 
> Hi all. So I finally decided to upgrade my 1070 to a Gigabyte 1080 ti Aorus Xtreme.
> 
> I noticed a couple of issues:
> 
> 1) Stuttering: This is very annoying. I've had no such stuttering issues when I was using my 1070. I play Battlefield 1, I notice how great the frames are, but every now and then I can see the frames dip significantly, sometimes from 100+ frames down to less than 50, and sometimes less. This is totally frustrating. It's making me regret my purchase.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2) Coil Whine: I didn't have any in my previous 1070 Palit Jetstream at all. I actually really don't mind coil whine cause from what I've heard it is very normal
> 
> 3) A weird graphical glitch when I boot up windows: you know when you boot up, and it loads and you see a small spinning thing. For like a split second I see a weird glitchy line. Then back to normal. Not particularly a terrible issue, but it is very annoying.
> 
> Any solutions for issue number 1? I already did a DDU driver refresh and I'm currently using the latest drivers from Nvidia.
> 
> Thanks guys


I have an MSI GTX1080Ti Gaming X 11GB

1) Have you checked the resolution scale in BF1? Mine was at 175% and I was playing in 1440p. This gave me fps all over the place. Set it to 100% to render in the same resolution as you have chosen as your output resolution.

2) Can't help you there, sorry.

3) I get this glitchy line as well, don't know what causes it. I just close my eyes for 10 seconds after booting and open them to see my Windows desktop


----------



## Sgang

Quote:


> Originally Posted by *Blameless*
> 
> No VRM temp sensor?
> Quite possible.


Unfortunately not. I have a Zotac AMP Extreme.

I also observed some strange behavior: yesterday my result was 10179, today it's 9979 after several tests, and temperatures are 3° higher, 65° maximum yesterday, 68° today.

https://1drv.ms/t/s!Asfgq8kJqus0zB7-CdxOE9KDxBQT I attached my GPU-Z sensor log, can you have a look at it?


----------



## mouacyk

Quote:


> Originally Posted by *samoth777*
> 
> Hi all. So I finally decided to upgrade my 1070 to a Gigabyte 1080 ti Aorus Xtreme.
> 
> I noticed a couple of issues:
> 
> 1) Stuttering: This is very annoying. I've had no such stuttering issues when I was using my 1070. I play Battlefield 1, I notice how great the frames are, but every now and then I can see the frames dip significantly, sometimes from 100+ frames down to less than 50, and sometimes less. This is totally frustrating. It's making me regret my purchase.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2) Coil Whine: I didn't have any in my previous 1070 Palit Jetstream at all. I actually really don't mind coil whine cause from what I've heard it is very normal
> 
> 3) A weird graphical glitch when I boot up windows: you know when you boot up, and it loads and you see a small spinning thing. For like a split second I see a weird glitchy line. Then back to normal. Not particularly a terrible issue, but it is very annoying.
> 
> Any solutions for issue number 1? I already did a DDU driver refresh and I'm currently using the latest drivers from Nvidia.
> 
> Thanks guys


Your CPU being quad-core without HT is a concern, even if overclocked to 4.5GHz. BF1 is heavily threaded and some maps will choke your processor, making it the bottleneck, resulting in dropped GPU usage and low swing of frame rates. Even on mine with HT, sometimes I see GPU usage drop as low as 85% on maps like Amiens and St Quentin Scar.
Quote:


> Originally Posted by *SauronTheGreat*
> 
> has anyone noticed that BF1 does not take the full load during gaming ? the load it occupies on my 1080ti is in between 75% to 95% ... is because its CPU intensive ?


Yes, it's really annoying. The same map behaves differently at different times. There is likely more going on in the engine than just CPU bottlenecks (like the quad-core limit above). I don't think it's too far-fetched that overall network latency (yours, the server's, and everyone else's on the map) also affects your (GPU) performance. Just think of how much physics is going on and multiply that by network latency factors.


----------



## samoth777

Quote:


> Originally Posted by *haavard*
> 
> I have an MSI GTX1080Ti Gaming X 11GB
> 
> 1) Have you checked the resolution scale in BF1? Mine was at 175% and I was playing in 1440p. This gave me fps all over the place. Set it to 100% to render in the same resolution as you have chosen as your output resolution.
> 
> 2) Can't help you there, sorry.
> 
> 3) I get this glitchy line as well, don't know what causes it. I just close my eyes for 10 seconds after booting and open them to see my Windows desktop


1) Yeah BF1 is at 100% render, so I guess that's not the issue

2) All good, I can live with it I suppose

3) LOL, I wonder what's causing it


----------



## samoth777

Quote:


> Originally Posted by *mouacyk*
> 
> Your CPU being quad-core without HT is a concern, even if overclocked to 4.5GHz. BF1 is heavily threaded and some maps will choke your processor, making it the bottleneck, resulting in dropped GPU usage and low swing of frame rates. Even on mine with HT, sometimes I see GPU usage drop as low as 85% on maps like Amiens and St Quentin Scar.
> Yes, it's really annoying. The same map behaves differently at different times. There likely is more going on in the engine than just CPU bottlenecks (like the quad-core limit above). I don't think it's too far fetched to think that overall network latency (your's, the server's, and everyone else on the map) also affects your (GPU) performance. Just think of how much physics is going on and multiply that by network latency factors.


Thanks for that. Indeed it does do this a lot in Amiens. Might look for a cheap second hand 4790k somewhere.


----------



## mouacyk

Quote:


> Originally Posted by *samoth777*
> 
> Thanks for that. Indeed it does do this a lot in Amiens. Might look for a cheap second hand 4790k somewhere.


Even a 4770K that overclocks to the same 4.5GHz is fine, and it may be found cheaper than the 4790K. Secondary upgrades that might help you are an SSD (loading and caching) and another 8GB of RAM, in case you like to have stuff running in the background (which you really shouldn't with just 4 cores).


----------



## latexyankee

Wondering if anyone can help me with my 1080 ti not downclocking when idle.

I have power set to adaptive and I'm running minimal apps in the background, like antivirus. I've used Process Explorer and can't find anything using the GPU. Usage is at 0%, but the core and memory run at full stock clocks (non-boost). I'm using Afterburner to monitor, and "force constant voltage" is not checked.

I'm pulling my hair out trying to figure out why.

thanks


----------



## d3v0

Generally looking into a new 1080 Ti, but with reports of prices rising, what can I expect to get, deal-wise, on a decent non-reference (i.e. better cooling/VRMs) GTX 1080 Ti? It's been a long time since I've bought a GPU! If I find a decent factory-OC/good-cooler edition new for, say, $700, is that a solid price?


----------



## keikei

Quote:


> Originally Posted by *d3v0*
> 
> Generally looking into a new 1080Ti but with reports of prices rising, what can I expect to get, deal wise on a decent non-reference (ie, better cooling/VRMs) GTX 1080ti? Its been a long time since Ive bought a GPU! If I find a decent factory OC/good cooler edition new for say, $700, is that a solid price?


Non-reference? You're looking at $750+.


----------



## d3v0

Quote:


> Originally Posted by *keikei*
> 
> Non-reference? You're looking at $750+.


Is that a recent trend? I remember seeing a Massdrop with one at ~$700 and I didn't purchase yet, but subsequently marked $680 as a great deal. I think since then every outlet has been saying memory costs are driving prices up.


----------



## keikei

I really haven't seen the prices budge too much. Reference cards were hitting $700 at launch, but also had a free game code. Newegg is still running the same 'deals' right now, I believe, for the non-reference cards. Blower cards may be harder to find now, though, if you're looking to save a few bucks. I still have mine running stock and no complaints.


----------



## mouacyk

SuperBiiz has the Gaming X + EK waterblock together for $799, just no free game. Still, I'll pay the $50 for a waterblock installed with warranty any day over a free-game deal. I didn't get very far in the original game anyway, way too darn repetitive.


----------



## KedarWolf

Quote:


> Originally Posted by *MaKeN*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I saw a Reddit about undervolting to 800mv.
> 
> My card will do 1809 core/ 6014 memory without Time Spy crashing.
> 
> Peeps who like undervolting might want to see what they can achieve at 800mV.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Trying my best now ... interesting what it can do at that voltage
> 
> Edit: would pass Superposition at:
> 1775/6318 9k+ score.....
> Running time spy now.
> 
> I get 8891 score in time spy....
> Graphics score 9582
> 
> Edit:
> Undervolting from 1.0 to 0.8, 2000 vs 1775: results are the same, 144 fps in game and a 6C drop in temps for now... Mass Effect Andromeda...
> Interesting to test it more... if it really is that 6C drop over time, I'll stay on 0.8 V
Click to expand...

I highly recommend to peeps here, if you're only gaming, not benching, to undervolt to .800v.

I get a Fire Strike 32050 graphics score at 2077 core/6210 memory, 1.093v, and I get a 28064 graphics score at 1797 core/5977 memory, .800v, and I'm Fire Strike Ultra stress test stable.









That's less than a 13% lower score, and my temps while stress testing drop 10C.
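The "less than 13%" figure checks out from the two scores quoted above:

```python
# Percentage drop between the two Fire Strike graphics scores above.
stock = 32050        # 2077 core / 6210 memory at 1.093 V
undervolted = 28064  # 1797 core / 5977 memory at 0.800 V
drop = 1 - undervolted / stock
print(f"{drop:.1%}")  # -> 12.4%
```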


----------



## mouacyk

Quote:


> Originally Posted by *KedarWolf*
> 
> I highly recommend to peeps here if you're only gaming, not benching, to undervolt to .800v.
> 
> I get a Fire Strike 32050 graphics score at 2077 core/ 6210 memory, 1.093v and I get a 28064 graphics score at 1797 core/5977 memory .800v and I'm Fire Strike Ultra stress test stable.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That's less than a 13% lower score and my temps while stress testing drop 10C.


How much did the power usage drop?
Saw a YouTube video of someone restricting their 1080 Ti to 125W and it was still beating a 980 Ti at stock (~225W).


----------



## KedarWolf

Quote:


> Originally Posted by *mouacyk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I highly recommend to peeps here if you're only gaming, not benching, to undervolt to .800v.
> 
> I get a Fire Strike 32050 graphics score at 2077 core/ 6210 memory, 1.093v and I get a 28064 graphics score at 1797 core/5977 memory .800v and I'm Fire Strike Ultra stress test stable.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That's less than a 13% lower score and my temps while stress testing drop 10C.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How much did the power usage drop?
> Saw a youtube video of someone restricting their 1080 Ti to 125W and it was still beating a 980 ti at stock (~225W).
Click to expand...

In HWInfo the GPU alone is pulling just under 200W.
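For a rough sense of the saving: the 1080 Ti FE's stock board power is 250W (that figure is the card's spec, not something measured in this thread), so against the ~200W HWiNFO reading:

```python
# Rough power saving from the 0.800 V undervolt, assuming the 250 W
# stock board power of the 1080 Ti FE (spec sheet, not measured here).
stock_power = 250.0        # W, FE board power spec
undervolted_power = 200.0  # W, the HWiNFO reading above
saved = stock_power - undervolted_power
print(f"~{saved:.0f} W saved ({saved / stock_power:.0%})")  # -> ~50 W saved (20%)
```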


----------



## d3v0

Yeah I should have just bought it when they were selling Destiny 2 with it!

Edit: is it worth going for a Hybrid version vs., say, the Gigabyte Aorus?


----------



## becks

Quote:


> Originally Posted by *becks*
> 
> Hey everyone..been a long time since I been here.
> Had to move everything to a new address and so on...
> 
> Still rocking a I7 7700K + Gtx 1080 ti Waterforce WB Xtreme Edition 11G .
> Recently I bought a new screen (dell u2417ha) and I have this weird problem where I get no signal over DP until I am in the OS... can't get in bios or see the post screen.
> Have to constantly switch to an old DVI screen if I want to change anything.
> 
> Is there any known fix to this ?


Any suggestions, guys? I've kinda tried everything... at least point me to where to start.
Is it a BIOS setting? An OS setting?


----------



## KedarWolf

Quote:


> Originally Posted by *becks*
> 
> Quote:
> 
> 
> 
> Originally Posted by *becks*
> 
> Hey everyone..been a long time since I been here.
> Had to move everything to a new address and so on...
> 
> Still rocking a I7 7700K + Gtx 1080 ti Waterforce WB Xtreme Edition 11G .
> Recently I bought a new screen (dell u2417ha) and I have this weird problem where I get no signal over DP until I am in the OS... can't get in bios or see the post screen.
> Have to constantly switch to an old DVI screen if I want to change anything.
> 
> Is there any known fix to this ?
> 
> 
> 
> Any suggestions guys ? kinda tried everything... at least point me out where to start from..
> Is it a bios setting ? OS setting ?
Click to expand...

Try enabling or disabling Fast Boot and/or CSM.

Edit: Wait, you're booting from DVI? Did you disable your onboard graphics?

I don't know of a 1080 Ti with DVI.


----------



## becks

Quote:


> Originally Posted by *KedarWolf*
> 
> Try enabling or disabling Fast Boot and/or CSM.
> 
> Edit: Wait, you're booting from DVI? Did you disable your onboard graphics?
> 
> Dunno a 1080 Ti with DVI.


It's an Aorus 1080 Ti Waterforce WB Xtreme... it has *DVI-D*... and DP... and HDMI... I want and have to use DP...
Can't use Fast Boot; I had to disable it in the BIOS and OS, as the PC shuts down on its first boot on its own... a strange bug that hasn't been addressed with a BIOS update yet (Maximus VIII Impact). I posted about it in the ASUS Z170 Motherboards Q&A thread a while ago; some other people had the same problem as well.


----------



## 8051

Quote:


> Originally Posted by *becks*
> 
> Any suggestions guys ? kinda tried everything... at least point me out where to start from..
> Is it a bios setting ? OS setting ?


It could be that your video card is outputting a resolution or refresh rate that is out of bounds for the monitor. I believe there is a BIOS setting for what resolution it will output to the screen.


----------



## Beagle Box

Quote:


> Originally Posted by *becks*
> 
> Hey everyone..been a long time since I been here.
> Had to move everything to a new address and so on...
> 
> Still rocking a I7 7700K + Gtx 1080 ti Waterforce WB Xtreme Edition 11G .
> Recently I bought a new screen (dell u2417ha) and I have this weird problem where I get no signal over DP until I am in the OS... can't get in bios or see the post screen.
> Have to constantly switch to an old DVI screen if I want to change anything.
> 
> Is there any known fix to this ?


Have you tried the old _brute force_ method?








Make sure you've saved your BIOS settings before trying it.

Cold booting and shutting down the computer a few times _with no monitor attached_ and then booting with a single DP cable attached to the chosen monitor has worked for me in the past.
Make sure the monitor has only that one cable plugged in and the monitor is set to that port.

It risks OS integrity unless you memorize the proper Windows shutdown keystroke sequence.

I can offer no explanation as to why it works.


----------



## KedarWolf

Afterburner Beta 19 available here.

https://www.guru3d.com/files-get/msi-afterburner-beta-download,32.html


----------



## Blameless

Quote:


> Originally Posted by *KedarWolf*
> 
> That's less than a 13% lower score and my temps while stress testing drop 10C.


I don't have 13% to spare in terms of performance.

I still drop below 60 fps at times in _Elite: Dangerous_ with custom settings at 4k. If I could get another 5% in performance to push those minimums up to the point they never fell below my recording frame rate, I would, even if it cost me another 50w. However, I'm at the limits of my part in both stable temperatures and power and would have to put the card under water to improve things tangibly.


----------



## pez

Quote:


> Originally Posted by *Blameless*
> 
> I don't have 13% to spare in terms of performance.
> 
> I still drop below 60 fps at times in _Elite: Dangerous_ with custom settings at 4k. If I could get another 5% in performance to push those minimums up to the point they never fell below my recording frame rate, I would, even if it cost me another 50w. However, I'm at the limits of my part in both stable temperatures and power and would have to put the card under water to improve things tangibly.


Just 5%? Sounds like the TXp is just what you need at the cost of 0W (just ignore the price tag).


----------



## Streetdragon

Quote:


> Originally Posted by *becks*
> 
> Any suggestions guys ? kinda tried everything... at least point me out where to start from..
> Is it a bios setting ? OS setting ?


I have the same problem with my rig. I have a second screen (also DP) and that one shows the BIOS etc.; my new one is dark until Windows reaches the login screen.


----------



## 8051

Quote:


> Originally Posted by *Streetdragon*
> 
> have the same problem with my rig.. have a second screen(also DP) and there it shows the Bios etc. my new one is dark till windows start in the login sreen


Could it be that one particular DP port is the only one that outputs on boot?


----------



## becks

I am in the "moving home" process right now and only have limited time at the PC in the afternoons/late nights, so I'll have to test everything that has been said here and report back.







Thank you everyone


----------



## stangflyer

Quote:


> Originally Posted by *Streetdragon*
> 
> have the same problem with my rig.. have a second screen(also DP) and there it shows the Bios etc. my new one is dark till windows start in the login sreen


There is a primary DP input on the GPU; I believe it's the one closest to the motherboard. Streetdragon: switch the inputs on the GPU. The input that shows the BIOS is the primary. If Windows switches, just change the main display in display settings in Windows.


----------



## becks

Fixed my problem... but I changed a couple of things at once, so I'm not sure which one of them was the "one"...
It's either going into the BIOS and switching the primary display output from Auto to PEG and/or CSM to Auto... or changing the input from Auto to mDP in the monitor's display menu.


----------



## encrypted11

Redacted for now


----------



## nrpeyton

Quote:


> Originally Posted by *latexyankee*
> 
> Wondering if anyone can help me with my 1080 ti not downclocking when idle.
> 
> I have power set to adaptive, I'm running minimal apps in the background like anti virus. I've used process explorer and cant find anything using the GPU. Usage is at 0% but the core and mem run at full stock clocks (non boost). I'm using Afterburner to monitor and force constant voltage not checked.
> 
> I'm pulling my hair out trying to figure out why.
> 
> thanks


I had the same problem.

A clean re-install of all Nvidia software is the only thing that fixed it for me (and you have to put a tick in the 'clean install' checkbox after downloading the latest Nvidia driver). Then reboot.


----------



## vf-

Quote:


> Originally Posted by *KedarWolf*
> 
> *I highly recommend to peeps here if you're only gaming*, not benching, *to undervolt to .800v.*
> 
> I get a Fire Strike 32050 graphics score at 2077 core/ 6210 memory, 1.093v and I get a 28064 graphics score at 1797 core/5977 memory .800v and I'm Fire Strike Ultra stress test stable.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That's less than a 13% lower score and my temps while stress testing drop 10C.


Oh... With MSI Afterburner or EVGA Precision?


----------



## erso44

Finally I got my MSI 1080 Ti Gaming









So here goes my first question: does anybody here already use VR?


----------



## MURDoctrine

Quote:


> Originally Posted by *erso44*
> 
> finally I got my msi 1080ti gaming
> 
> 
> 
> 
> 
> 
> 
> 
> 
> so here goes my first question: does somebody already uses VR?


Yes, I've got a Vive. I've used it on my 980 and my 1080 Ti.


----------



## LunaP

Quote:


> Originally Posted by *MURDoctrine*
> 
> Yes I've got a Vive. Have used it on my 980 and my 1080ti.


I had a question, since I've been debating the Vive, though whenever I go to the store it's always having issues: either I'm underneath everything so I can't see, or something else, and the rep can never figure out what the issue is and just shuts it off. The demo's been down for 6 months.

Would you recommend it, and for what type of things? Can it still be used for 2D? Like, does it have a way to just act as an over-the-head display unit? Or is it strictly for VR-ready games/applications?


----------



## KedarWolf

Quote:


> Originally Posted by *vf-*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> *I highly recommend to peeps here if you're only gaming*, not benching, *to undervolt to .800v.*
> 
> I get a Fire Strike 32050 graphics score at 2077 core/ 6210 memory, 1.093v and I get a 28064 graphics score at 1797 core/5977 memory .800v and I'm Fire Strike Ultra stress test stable.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That's less than a 13% lower score and my temps while stress testing drop 10C.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh... With MSI Afterburner or EVGA Precision?
Click to expand...

Afterburner.


----------



## demon09

100% MSI. EVGA messes up PXOC all the time; the only reason to go with EVGA's software is if you need to control the fans on an ICX card.


----------



## Alexium

Installed a Bykski full-cover waterblock on my FE card yesterday. Playing Witcher 3, I used to get 84-86C core temp and 1610-1680 actual boost clock after the card had warmed up. Now, before any OC whatsoever, it's 45-47C and a solid 1860-1880 boost clock (ambient is 21-22C). Setting the power limit to +20% did not seem to have any effect, so I assume it did not even hit 100% in this game.
I chose Witcher because I noticed that it causes the boost clock to drop drastically compared to GTA 5, Quantum Break or Mirror's Edge (none of which reliably produces 60 FPS in 4K, so it's not a case of the card being under-loaded).


----------



## mouacyk

Quote:


> Originally Posted by *Alexium*
> 
> Installed a Bykski full-cover waterblock on my FE card yesterday. Playing Witcher 3, I used to get 84-86 C core temp, and 1610-1680 actual boost clock after the card has warmed up. Now, before any OC whatsoever, it's 45-47 temp and solid 1860-1880 boost clock (ambient t is 21-22). Setting power limit to +20% did not seem to have any effect, so I assume it did not even hit 100% in this game.
> I've chosen Witcher because I noticed that it causes boost clock to drop drastically compared to GTA 5, Quantum Break or Mirror's Edge (none of which reliably produces 60 FPS in 4K so it's not a case of the card being under-loaded).


What kind of FPS difference did the clock bump net you?


----------



## Alexium

Quote:


> Originally Posted by *mouacyk*
> 
> What kind of FPS difference did the clock bump net you?


Unfortunately, Witcher 3 does not have a built-in benchmark nor any enthusiast-made benchmarking tools that I could find, but I do have MSI Afterburner monitoring on my Logitech G15 display, which means I can observe the FPS graph (not just instant values) in real time. And judging from that, the approx. 13% boost clock jump results in pretty much exactly the same FPS gain: where I used to see 44-47 FPS, I now see 50-55.

(This is in 4K with maxed out settings).
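Taking the midpoints of the ranges quoted above, the scaling really is close to 1:1 (rough numbers, eyeballed from an OSD graph, so treat this as a sanity check rather than a measurement):

```python
# Midpoint comparison of the boost-clock and FPS ranges quoted above.
old_clock = (1610 + 1680) / 2   # MHz, before the waterblock
new_clock = (1860 + 1880) / 2   # MHz, after
old_fps = (44 + 47) / 2
new_fps = (50 + 55) / 2
print(f"clock +{new_clock / old_clock - 1:.1%}, fps +{new_fps / old_fps - 1:.1%}")
# -> clock +13.7%, fps +15.4%
```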


----------



## d3v0

Quote:


> Originally Posted by *Alexium*
> 
> Unfortunately, Witcher 3 does not have a built-in benchmark nor any enthusiast-made benchmarking tools that I could find, but I do have MSI Afterburner monitoring on my Logitech G15 display, which means I can observe the FPS graph (not just instant values) in real time. And judging from that, the approx. 13% boost clock jump results in pretty much exactly the same FPS gain: where I used to see 44-47 FPS, I now see 50-55.
> 
> (This is in 4K with maxed out settings).


I run with a similar setup, and basically starting the game and doing a full gallop is a solid benchmark. Very consistent. Witcher 3 responds so well to additional graphics horsepower, but *even better* to CPU speed and RAM upgrades. Especially in Novigrad/Oxenfurt, etc.


----------



## Ultracarpet

I switched from Ryzen to Coffee Lake (going mini-ITX, didn't like the AM4 mini-ITX options)... Now my 1080 Ti won't hold the boost clocks I set on the curve in Afterburner. My FPS in Overwatch dropped from like 250 before to 120 now, and the GPU is maxing out at like 1500MHz on the core, down from 1900. I DDU-wiped and reinstalled a few times with no effect. A few games seemed to perform normally, like ME:A and GTAV. Black Ops 3 was stuck at around 60-70 fps with the GPU clock stuck at 1102...

I think I might try reinstalling Windows. I know it's a cardinal sin to just transfer an operating system over to a new build without reinstalling, but I have done it before without issue.

Before I do so, anyone have any input?

Oh, also, the i5-8400 that I got shows it's holding 3.8GHz across all 6 cores while gaming. There does seem to be unusually high CPU utilization for a game like Overwatch though... 70-80%.


----------



## Alexium

Quote:


> Originally Posted by *d3v0*
> 
> I run with a similar setup, and basically starting the game and doing a full gallop is a solid benchmark. very consistent.


I shall try that, thanks for the tip!

What I do is save in the appropriate location, then load and take notice of FPS without moving at all. Seems to be consistent, too, with the added convenience of immediately getting a single FPS value.
Quote:


> Originally Posted by *d3v0*
> 
> Especially in novigrad/oxenfurt, etc.


Well, so far it doesn't seem as if my R7 1700 is unable to load the card. But I'll see very soon for I have just received a water block for it and will finally OC.
On a side note, Novigrad does not seem to be particularly tough on the card. The lowest FPS I've seen yet was in the thirties and it was around White Orchard.


----------



## chibi

Quote:


> Originally Posted by *Ultracarpet*
> 
> I switched from ryzen to coffee lake (going mini itx build, didn't like am4 mini itx options)... Now my 1080ti won't hold it's boost clocks that I set on the curve in afterburner. My FPS in overwatch dropped from like 250 before to 120 now, and the GPU is maxing at like 1500mhz on the core down from 1900. I DDU wiped and reinstalled a few times with no effect. A few games seemed to perform normally like ME:A, as well as GTAV. Blackops 3 was stuck at around 60-70 fps with GPU clock stuck at 1102....
> 
> I think i might try reinstalling windows. I know it's a cardinal sin to just transfer over operating systems to new builds without reinstalling, but I have done it before without issue.
> 
> Before I do so, anyone have any input?
> 
> Oh, also, the i5-8400 that I got shows that it's holding 3.8ghz across all 6 cores while gaming. There does seem to be unusually high CPU utilization for a game like overwatch though... 70-80%.


Definitely reinstall windows!


----------



## Mooncheese

Quote:


> Originally Posted by *KedarWolf*
> 
> Fire Strike, I get about 32000 graphics with latest drivers, this is with 800mv, 27868 graphic score.


How is the latest driver performance wise? Any bugs?

I'm interested in it because I run Zelda: BOTW on Cemu and the latest Nvidia driver cuts memory usage down like 50%, although I do have 32 GB on tap so I'm not sure if this will help with performance. That and PhysX in Warframe doesn't work with 378.78.

I've been hesitant to update the display driver because I'm hearing of so many issues in Nvidia's driver feedback thread for each new driver. My philosophy: if it ain't broke, don't fix it.


----------



## Mooncheese

Quote:


> Originally Posted by *latexyankee*
> 
> Wondering if anyone can help me with my 1080 ti not downclocking when idle.
> 
> I have power set to adaptive, I'm running minimal apps in the background like anti virus. I've used process explorer and cant find anything using the GPU. Usage is at 0% but the core and mem run at full stock clocks (non boost). I'm using Afterburner to monitor and force constant voltage not checked.
> 
> I'm pulling my hair out trying to figure out why.
> 
> thanks


For me, with my Asus ROG Swift PG278Q, if I enable 3D Vision and don't switch back to G-Sync, my clocks sit at 1400MHz with usage indicating 25% simply rendering the desktop, and temps get about 5C higher than otherwise (26-27C vs 21-22C).

If you have the option of manually switching the refresh rate from 144 to 120 Hz, try that. You can also cap the refresh rate at 120 Hz via Control Panel > Appearance and Personalization > Display > Screen Resolution > Advanced Settings > Monitor > Monitor Settings > drop-down menu (Win 7).

Running the refresh rate at 144 Hz used to keep utilization higher with older drivers; maybe you're experiencing that.


----------



## Mooncheese

Quote:


> Originally Posted by *Dasboogieman*
> 
> Well if I were to gte the 8700K, it would be primarily to eliminate frame drops when running games that saturate a hyperthreaded quadcore as the 2 additional cores can handle windows/background/multitasked stuff.


I think it's time to support AMD. Without them, Intel would never have broken out of its profitable drip-drip cycle of iterative architectural evolution (i.e. the Apple approach), and the 8700K wouldn't even have been a thing.

The 8700K isn't actually that special, and you can get comparable performance for less with an 1800X.


----------



## Mooncheese

Quote:


> Originally Posted by *nrpeyton*
> 
> One way to find real usable technical information on this is: _(this is how I'd do it anyway)_ is check the maximum power target for each GPU on the GPU BIOS database at techpowerup.com (and compare them until you find the highest).
> 
> Cards with higher power targets programmed by default won't throttle as much in power hungry games.
> 
> Interesting YouTube videos regarding Liquid Metal on GPU:
> 
> In the comments, some guys are reporting 15C colder temps at load.


Liquid metal TIM on a GPU core with lands encircling it is extremely risky. Painter's tape mitigates that risk somewhat, but I'm still worried that some of the TIM might migrate off the core; if it does, it's night-night for your GPU. I've read quite a few cases of this on the web over the past few years. Also, in my experience on my CPU, it stains / melds with the cooling solution, in this instance a 3-pipe copper heatsink (AW M18x R2). That wouldn't have been an issue had it lasted as long as I expected compared to, say, IC Diamond or Gelid GC Extreme, but it only lasted maybe 5 months. So I had to take a knife and scrape it off the heatsink before going back to Gelid GC Extreme, which, although it runs maybe 4-5C hotter, doesn't need replacement as quickly.

That's been my personal, direct experience with it (Coolaboratory Liquid Ultra): there is a risk, and it doesn't last very long.


----------



## Blameless

If you are worried about liquid metal TIMs contacting surface mount components on the GPU substrate, you can just coat them with super glue or the like first.


----------



## RoBiK

Quote:


> Originally Posted by *Blameless*
> 
> If you are worried about liquid metal TIMs contacting surface mount components on the GPU substrate, you can just coat them with super glue or the like first.


transparent nail polish coating works just fine and is removable


----------



## NBrock

I used liquid electrical tape. It can be found at just about any hardware store and is pretty easy to remove if you need to. That's what I used on my 4790K.


----------



## Blameless

Quote:


> Originally Posted by *RoBiK*
> 
> transparent nail polish coating works just fine and is removable


The same stuff that will remove nail polish will also remove superglue.

Indeed, you'd be hard pressed to find a single-part, non-heat/UV cured adhesive that won't clean up with common solvents.


----------



## Shawn Shutt jr

Just picked up a 1080 Ti. Is applying new thermal paste hard? I'm getting 65C, but I thought about throwing some new Arctic Silver 5 on it to get lower temps.


----------



## Dasboogieman

Quote:


> Originally Posted by *Shawn Shutt jr*
> 
> just picked up a 1080 TI, is applying new thermal paste hard? im getting 65c but thought about throwing some new arctic silver 5 on it to get lower temps.


Depends on your card. EVGA and ASUS cards are pretty good out of the box already; MSI, Gigabyte, and Zotac seem to benefit the most from repasting. If you're going to re-paste with AS5, don't bother; use something like GC Extreme or Kryonaut to make it worth your while.


----------



## RoBiK

Quote:


> Originally Posted by *Dasboogieman*
> 
> If you're gonna re-paste with AS5 then don't bother, use something like GC Extreme or Kryonaut to make it worth your while.


why then not go all the way and use conductonaut?


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *Dasboogieman*
> 
> Depends on your card. EVGA + ASUS cards are pretty good out of the box already. MSI + GIgabyte + Zotac seem to benefit the most from repasting. If you're gonna re-paste with AS5 then don't bother, use something like GC Extreme or Kryonaut to make it worth your while.


its the EVGA geforce gtx 1080 TI FTW3

Quote:


> Originally Posted by *RoBiK*
> 
> why then not go all the way and use conductonaut?


I'll look into that. I don't really know what the best paste is, but I've always heard great things about Thermal Grizzly. I just wasn't sure how safe or smart using liquid metal is on a video card, since I've never really done anything to a video card's out-of-box cooling. Also, could you use liquid metal with an AIO? If I'm buying a tube, why not use it on everything, lol, go ham.


----------



## RoBiK

Quote:


> Originally Posted by *Shawn Shutt jr*
> 
> i dont really know what the best paste is but iv always heard great things about thermal grizzly bnt sure how safe or smart using liquid metal was on a video card since iv never really done anything to video cards out of box temp wise. also could you use liquid metal on a AIO? i might if i buying a tube why not use it on everything lol go ham


Thermal Grizzly TIMs are considered to be among the very best. Liquid metal TIMs are in a league of their own; ordinary thermal paste simply can't compete against liquid metal. The downside is that liquid metal requires more skill and preparation to apply properly, and it can cause damage if misapplied.
You can also use it with an AIO as long as the block's cold plate surface is made of nickel or copper. Do not use it on aluminum: liquid metal contains gallium, which reacts with aluminum.


----------



## KedarWolf

Quote:


> Originally Posted by *RoBiK*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Shawn Shutt jr*
> 
> i dont really know what the best paste is but iv always heard great things about thermal grizzly bnt sure how safe or smart using liquid metal was on a video card since iv never really done anything to video cards out of box temp wise. also could you use liquid metal on a AIO? i might if i buying a tube why not use it on everything lol go ham
> 
> 
> 
> thermal grizzly TIMs are considered to be among the very best. Liquid metal TIMs are in a league of their own when compared to thermal paste, thermal paste simply can't compete against liquid metal. The downside is that liquid metal requires more skill and preparation to apply properly and can cause damage if misapplied.
> You can also use an AIO as long as the block's cold plate surface is made of nickel or copper, do not use it on aluminum (liquid metal contains galium which reacts with aluminum).
Click to expand...

I get a bit better results with Mastergel Maker Nano than with Grizzly Kryonaut.

It's easier to apply, and it uses nano-diamond particles, which are somewhat better than normal diamond TIMs.

I strongly recommend it.


----------



## KedarWolf

Quote:


> Originally Posted by *Mooncheese*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Fire Strike, I get about 32000 graphics with latest drivers, this is with 800mv, 27868 graphic score.
> 
> 
> 
> 
> 
> How is the latest driver performance wise? Any bugs?
> 
> I'm interested in it because I run Zelda: BOTW on Cemu and the latest Nvidia driver cuts memory usage down like 50%, although I do have 32 GB on tap so I'm not sure if this will help with performance. That and PhysX in Warframe doesn't work with 378.78.
> 
> I've been hesitant to update the display driver because I'm hearing of so many issues in Nvidia's driver feedback thread for each new driver. My philosophy is if it aint broke don't fix it.
Click to expand...

Benching in DX12, the latest drivers do really well, so they're likely good for DX12 games as well.


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *RoBiK*
> 
> thermal grizzly TIMs are considered to be among the very best. Liquid metal TIMs are in a league of their own when compared to thermal paste, thermal paste simply can't compete against liquid metal. The downside is that liquid metal requires more skill and preparation to apply properly and can cause damage if misapplied.
> You can also use an AIO as long as the block's cold plate surface is made of nickel or copper, do not use it on aluminum (liquid metal contains galium which reacts with aluminum).


I've always used AS5, but over the years I've heard Thermal Grizzly is all the rage, lol. I'm willing to make the switch since I've run out of AS5 anyway. I'll most likely use regular TIM, not LM, until I test LM on my old 650 Ti (for the lulz).

Liquid metal sounds great, but also risky; I've heard things about it voiding the warranty, shorting out the GPU, and ruining the GPU heatsink.

I have a Corsair H110 280mm, but I've heard you can't use LM on copper as well.

Quote:


> Originally Posted by *KedarWolf*
> 
> I get a bit better results with Mastergel Maker Nano than Grizzly Kryonaut.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's easier to apply and it uses nano diamond particles which is somewhat better than normal diamond TIMs.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I strongly recommend it.


I'll check it out; never heard of that one, but I'll try anything lol. Do you use it on CPU and GPU?


----------



## Dasboogieman

Quote:


> Originally Posted by *RoBiK*
> 
> why then not go all the way and use conductonaut?


Because Conductonaut is not for the inexperienced. While it has decent gains over Kryonaut/GC Extreme, it has a real chance of killing your hardware if mishandled. I would never recommend it as a first port of call unless there is a need for maximum thermal transfer at all costs.


----------



## EDK-TheONE

The tooooooooooooooo long card has arrived now!


----------



## KedarWolf

Quote:


> Originally Posted by *Shawn Shutt jr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RoBiK*
> 
> thermal grizzly TIMs are considered to be among the very best. Liquid metal TIMs are in a league of their own when compared to thermal paste, thermal paste simply can't compete against liquid metal. The downside is that liquid metal requires more skill and preparation to apply properly and can cause damage if misapplied.
> You can also use an AIO as long as the block's cold plate surface is made of nickel or copper, do not use it on aluminum (liquid metal contains galium which reacts with aluminum).
> 
> 
> 
> iv always used AS5 but over the years iv heard thermal grizzly or riot lol im willing to make the switch because iv ran out of AS5 anyways, ill most likely use TIM not LM until i test LM on my old 650 TI (for the lulz)
> 
> liquid sounds great but also risky and iv heard things about it voiding warranty, shorting out GPU and ruining the GPU heatsink.
> 
> i have a Corsair H110 280mm, but if heard you cant use LM on copper as well.
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I get a bit better results with Mastergel Maker Nano than Grizzly Kryonaut.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's easier to apply and it uses nano diamond particles which is somewhat better than normal diamond TIMs.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I strongly recommend it.
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> ill check it out, never heard of that because but ill try anything lol do you use it on CPU and GPU?
Click to expand...

CPU and GPU, yes.

I get about 2C better results than Grizzly.


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Tooooooooooooooo long card is arrived now!


Holy **** that thing is huge!

That's a really nice looking card, dude.

Quote:


> Originally Posted by *KedarWolf*
> 
> CPU and GPU, yes.
> 
> I get about 2C better results than Grizzly.


I would be happy getting 60C under load on my card. It sits at about 67C with fans at 60%, and I'm happy with that, but a bit lower would be perfect! My 1070 never ran higher than 60C @ 2000MHz.


----------



## Ascii Aficionado

Received my 1080 this week.

Anyone here interested in Shadow of War for half off ?

I'd accept PUBG in a trade on Steam.

Not sure if I'm actually allowed to ask this here, although I do have some rep and positive trader feedback.


----------



## KedarWolf

First is Time Spy.

Second is Time Spy Extreme.


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *KedarWolf*
> 
> First is Time Spy.
> 
> Second is Time Spy Extreme.


Maybe I'm crazy, but how are you getting 30C @ 2126MHz?


----------



## KedarWolf

Quote:


> Originally Posted by *Shawn Shutt jr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> First is Time Spy.
> 
> Second is Time Spy Extreme.
> 
> 
> 
> 
> 
> 
> maybe im crazy but how are you getting 30c @ 2126mhz?
Click to expand...

Those are idle temps, custom water block at 1.2V; it gets to 50C or so while benching.

Edit: With 17 W/mK thermal pads on my memory and VRMs.


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *KedarWolf*
> 
> That's idle temps custom water block at 1.2v, gets to 50c or so while benching.
> 
> Edit: With 17 W/mK thermal pads on my memory and VRMs.



Do new thermal pads help that much? I don't think I'll go water-cooled with my 1080 Ti, but new paste with pads could be worth a shot lol. I'm getting 67-70C under load @ 2000MHz right now.


----------



## ZealotKi11er

Quote:


> Originally Posted by *KedarWolf*
> 
> That's idle temps custom water block at 1.2v, gets to 50c or so while benching.
> 
> Edit: With 17 W/mK thermal pads on my memory and VRMs.


1.2v with BIOS Power Limit or Shunt?


----------



## KedarWolf

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> That's idle temps custom water block at 1.2v, gets to 50c or so while benching.
> 
> Edit: With 17 W/mK thermal pads on my memory and VRMs.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1.2v with BIOS Power Limit or Shunt?
Click to expand...

XOC BIOS.


----------



## ZealotKi11er

Quote:


> Originally Posted by *KedarWolf*
> 
> XOC BIOS.


How much power is it pulling at 1.2v?


----------



## KedarWolf

Quote:


> Originally Posted by *Shawn Shutt jr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> That's idle temps custom water block at 1.2v, gets to 50c or so while benching.
> 
> Edit: With 17 W/mK thermal pads on my memory and VRMs.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> That's idle temps custom water block at 1.2v, gets to 50c or so while benching.
> 
> Edit: With 17 W/mK thermal pads on my memory and VRMs.
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> new thermals pads help that much? i dont think ill go watercooled with my 1080 ti but new paste with pads could be worth a shot lol im getting 67-70c underload @2000mhz right now.
Click to expand...

Might help some, yes.

www.amazon.com is the best place to get the pads cheap.

And I find the best paste is Mastergel Maker Nano, not Grizzly.


----------



## KedarWolf

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> XOC BIOS.
> 
> 
> 
> How much power is it pulling at 1.2v?
Click to expand...

I've seen it pull close to 500W.


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *KedarWolf*
> 
> Might help some, yes.
> 
> www.amazon.com best place to get the pads cheap.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And I find the best paste is Mastergel Maker Nano. Not Grizzly.


Does the brand or size of the thermal pads matter?

Like this?
https://www.amazon.com/ARCTIC-Thermal-Pad-1-0-Conductivity/dp/B00UYTTDO6/ref=sr_1_1?ie=UTF8&qid=1508055612&sr=8-1&keywords=thermal+pads+gpu


----------



## KedarWolf

Quote:


> Originally Posted by *Shawn Shutt jr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Might help some, yes.
> 
> www.amazon.com best place to get the pads cheap.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And I find the best paste is Mastergel Maker Nano. Not Grizzly.
> 
> 
> 
> does the brand or size of the thermal pads matter?
> 
> like this?
> https://www.amazon.com/ARCTIC-Thermal-Pad-1-0-Conductivity/dp/B00UYTTDO6/ref=sr_1_1?ie=UTF8&qid=1508055612&sr=8-1&keywords=thermal+pads+gpu
Click to expand...

Fujipoly or Alphacool 17 W/mK pads are best. Sarcon pads are easier to apply and more pliable, but I think they max out at 14 W/mK; they're cheaper, though.

Wear rubber gloves when applying them so the oil on your hands doesn't lower their thermal performance. Non-powdered rubber gloves.

To do just the R22 VRMs and the memory, you can get away with two of these: 1.5mm for the VRMs, 1.0mm for the memory.

https://www.amazon.com/Fujipoly-mod-smart-Extreme-Thermal/dp/B00ZSJQUT8/ref=sr_1_8?s=electronics&ie=UTF8&qid=1508058049&sr=1-8&keywords=Fujipoly

https://www.amazon.com/Fujipoly-mod-smart-Extreme-Thermal/dp/B00ZSJQLME/ref=sr_1_3?s=electronics&ie=UTF8&qid=1508058049&sr=1-3&keywords=Fujipoly

You'd need maybe three more of the 1.5mm pads to do the rest of the areas where thermal pads are recommended. For the memory (11 small rectangles around the left side of the GPU core), the VRMs (R22 chips), and the long area directly left of the R22 VRMs, I'd use 17 W/mK; the rest could likely get away with cheaper 14 W/mK or 11 W/mK pads. If money is no object, use 17 W/mK for everything for best results.

You'll want to cover the areas with pads like the picture below; the green rectangles are the size of the pads. You will need to cut the 60x50mm pad into a couple of long rectangles to cover the longer areas, as the pads are only 60mm (2.4 inches) long.

The 100x15mm pads are less than 4" long anyway and may not even cover the long row of VRMs, so you may as well just buy the 60x50 and cut it, I think. You get more pad for less money that way.

Hope this helps.
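The "more pad for less money" point is simple area arithmetic: a 60x50mm sheet has twice the area of a 100x15mm strip, and cutting it into 15mm-wide strips yields three 60mm pieces plus an offcut. A quick sanity check (a sketch of the math only; measure your own card before cutting):

```python
# Compare usable pad area: one 60x50mm sheet vs one 100x15mm strip
sheet_area = 60 * 50   # 3000 mm^2
strip_area = 100 * 15  # 1500 mm^2
print(sheet_area / strip_area)  # 2.0 -> one sheet ~= two long strips

# Cutting the 50mm side into 15mm-wide strips gives three 60x15mm
# pieces (plus a 60x5mm offcut), i.e. 180mm of 15mm-wide pad per sheet
strips_per_sheet = 50 // 15
print(strips_per_sheet)  # 3
```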


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *KedarWolf*
> 
> Fujipoly or Alphacool 17W/mk are best, Sarcon are easier to apply and more pliable but I think they max out at 14W/mk, cheaper though.
> 
> Wear rubber gloves when applying them so the oil on your hands doesn't lower the thermal abilities. Non powdered rubber gloves.
> 
> To do just the R22 VRMs and memory you can get away with two of these, 1.5mm for VRM, 1.0mm for memory.
> 
> https://www.amazon.com/Fujipoly-mod-smart-Extreme-Thermal/dp/B00ZSJQUT8/ref=sr_1_8?s=electronics&ie=UTF8&qid=1508058049&sr=1-8&keywords=Fujipoly
> 
> https://www.amazon.com/Fujipoly-mod-smart-Extreme-Thermal/dp/B00ZSJQLME/ref=sr_1_3?s=electronics&ie=UTF8&qid=1508058049&sr=1-3&keywords=Fujipoly
> 
> You'd need maybe three more of the 1.5mm pads to do the rest of the stuff recommended to do thermal pads on. The memory (11 small rectangles on the left side around the GPU core, the VRMs (R22 chips) and the long area just directly left of the R22 VRMs I'd do 17W/mk, the rest could likely get away with cheaper 14W/mk or 11W/mk pads. If money is no object do 17W/mk for everything for best results.
> 
> You'll want to do the areas with pads like the picture below. The green rectangles are the size of the pads. You will need to do the 60x50mm pad into a couple of long rectangles to cover the longer areas as the pads are only 60mm (2.4 inches long).
> 
> 
> 
> The 100x15mm pads are less than 4" long anyways and may not even cover the long row of VRMs etc. so may as well just buy the 60x50 and cut it I think. You get more pad for less money that way.
> 
> Hope this helps.


Wow, thanks for all the info! I'm going to look into this for sure. I really want to get my temps down to around the 50s or 60s, and this should help me get in the ballpark. Silly that we pay $700-$800 for a video card and have to replace parts, but heck, it happens lol.


----------



## lanofsong

Hey GTX 1080 Ti owners,

We are having our monthly Foldathon from Monday the 16th to Wednesday the 18th, starting at 12 noon EST.
Would you consider putting all that awesome GPU power to a good cause for those 2 days? If so, come *sign up* and fold with us - see the attached link.

October 2017 Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - you need a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## Alexium

Is there a method to overclocking these cards? I've read most of the "guides" that I could find, but they pretty much boil down to "move the clock and voltage sliders around until it stops crashing under load".

I have a Founder's Edition card, and the best I managed so far is 2012-2025 core clock at +30% voltage and +156 MHz boost clock. More voltage and it seems to be dropping clock (hello, power limit), more clock speed and it starts crashing from time to time.
Thing is, I've got it water cooled with a full-cover waterblock. Barely ever hits 50 C, typically hovers at 48. So the result sounds a bit low.

Also, I believe I had it stable at 2037-2050 MHz earlier today, tested in the same game. Then I tried pushing for a higher clock and moved the sliders around, and now I can't find those settings. Didn't write them down...


----------



## ZealotKi11er

Quote:


> Originally Posted by *Alexium*
> 
> Is there a method to overclocking these cards? I've read most of the "guides" that I could find, but they pretty much boil down to "move the clock and voltage sliders around until it stops crashing under load".
> 
> I have a Founder's Edition card, and the best I managed so far is 2012-2025 core clock at +30% voltage and +156 MHz boost clock. More voltage and it seems to be dropping clock (hello, power limit), more clock speed and it starts crashing from time to time.
> Thing is, I've got it water cooled with a full-cover waterblock. Barely ever hits 50 C, typically hovers at 48. So the result sounds a bit low.
> 
> Also, I believe I've had it stable at 2037-2050 MHz earlier today, tested in the same game. Then I tried pushing for higher clock, moved the sliders around. And now I can't find those settings. Didn't write them down


You are pretty much limited by power with the FE card. You want to find the highest clock you can hit at 1V; anything over 1V and you will hit the power limit. Don't even bother with the voltage slider in MSI AB, leave that at stock. Also look at the guide for curve OC. I, for example, run the card at 1860MHz / 0.875V and 2025MHz / 1V, with PL set to 120%.


----------



## Mooncheese

Quote:


> Originally Posted by *Alexium*
> 
> Is there a method to overclocking these cards? I've read most of the "guides" that I could find, but they pretty much boil down to "move the clock and voltage sliders around until it stops crashing under load".
> 
> I have a Founder's Edition card, and the best I managed so far is 2012-2025 core clock at +30% voltage and +156 MHz boost clock. More voltage and it seems to be dropping clock (hello, power limit), more clock speed and it starts crashing from time to time.
> Thing is, I've got it water cooled with a full-cover waterblock. Barely ever hits 50 C, typically hovers at 48. So the result sounds a bit low.
> 
> Also, I believe I've had it stable at 2037-2050 MHz earlier today, tested in the same game. Then I tried pushing for higher clock, moved the sliders around. And now I can't find those settings. Didn't write them down


You want to go in the other direction, actually: undervolt the card down to 1.000V and try to secure 2 GHz on the core. I'm at 2025 MHz and +400 MHz memory stable in The Witcher 3 (3D Vision), and 1987 MHz stable in other, 2D titles, at 1.000V. Dropping the voltage reduces the power needed for a given frequency, meaning there are fewer wattage-starvation-induced frequency dips. With default voltage (1.065V) the clocks in The Witcher 3 will drop from, say, 2038 MHz all the way down to 1936 MHz, whereas at 1.000V the dips are virtually non-existent, and when they do happen it's only down to 1975 MHz at most.

FE with "Poor Mans Hybrid" solution, an NZXT x41. Load temps don't exceed 50C with inaudible fan speed.

Open up the frequency / voltage curve in MSI AB, go back and get caught up, we were talking about this in depth months ago.
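The intuition behind undervolting can be sketched numerically: dynamic power scales roughly with V²·f, so dropping from 1.065V to 1.000V at a near-identical clock frees up on the order of 12% of the power budget, which is exactly the headroom the card otherwise claws back by dipping clocks. A rough illustration (the 250 W baseline is an assumed figure for an FE card at its power limit, not a measurement):

```python
def dynamic_power(base_watts, base_v, base_mhz, v, mhz):
    """Approximate dynamic power scaling: P is proportional to V^2 * f."""
    return base_watts * (v / base_v) ** 2 * (mhz / base_mhz)

# Assumed baseline: ~250 W at 1.065 V / 2038 MHz (illustrative only)
stock = dynamic_power(250, 1.065, 2038, 1.065, 2038)
undervolted = dynamic_power(250, 1.065, 2038, 1.000, 2025)
print(round(stock), round(undervolted))  # 250 219
```

So a ~6% voltage drop buys roughly 30 W of headroom at almost the same clock, which is why the undervolted curve holds its frequency where the stock curve throttles.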


----------



## Mooncheese

Quote:


> Originally Posted by *KedarWolf*
> 
> Benching in DX12 latest drivers do really well so likely good for DX12 games as well.


Not a fan of Windoze 10 NSA Surveillance Edition.

Waiting for Vulkan to demolish DX12, which is basically stillborn (Ashes of the Singularity, Battlefield 1, and Forza 7 support it? Golf clap.)

Vulkan supports every OS.

Windows 10 is a spectacular failure in my opinion. Why tech-literate people are using it is beyond me.

Advertisements embedded in the OS, forced updates, telemetry / spying, and no gain in performance outside of a handful of titles? No thanks.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Mooncheese*
> 
> Not a fan of Windoze 10 NSA Surveillance Edition.
> 
> Waiting for Vulkan to demolish DX12, which is basically dying stillborn (Ashes of Singularity, Battlefield 1 and Forza 7 support it? Golf clap.)
> 
> Vulkan supports all OS.
> 
> Windows 10 is a spectacular failure in my opinion. Why tech literate people are using it is beyond me.
> 
> Advertisements embedded in the OS. forced updates, telemetry / spying, for no gain in performance outside of a handful of titles, no thanks.


Because Windows 7 is so oldddddddd now. Try using Android 1.0 or iOS 3.


----------



## Alexium

Quote:


> Originally Posted by *ZealotKi11er*
> 
> look at the guide for curve OC. I for example run the card 1860MHz 0.875V and 2025MHz 1V. Set PL to 120%.


Thanks a bunch!
Quote:


> Originally Posted by *Mooncheese*
> 
> Dropping the voltage will reduce the power needed for a given frequency


I hear you.
I've noticed that when power-throttling, the card drops the voltage to 1.043V. So I've edited my curve to be flat beyond that point, and I'm searching for the highest core clock that The Witcher 3 won't lock up at (this game seems to be good for stability testing). So far the best I can do at 1.043V is 2025, and even then the game locked up once, so I need to go lower. Do I have an utterly terrible card?

I'm thinking of flashing the ASUS BIOS (the Strix one with the 330W power limit, not the XOC one). I know the card can do at least 2050 MHz with no crashes, just at a higher voltage, but in the heavier scenes it hits the power limit before reaching that voltage.
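Flattening the curve "beyond that point" just means clamping every point at or above the chosen voltage to one target frequency. A minimal sketch of that transform on a hypothetical voltage/frequency table (the points are made up for illustration, not Afterburner data):

```python
def flatten_curve(curve, v_cap, f_target):
    """Clamp all (voltage, MHz) points at or above v_cap to f_target,
    mimicking a flattened Afterburner V/F curve."""
    return [(v, f_target if v >= v_cap else f) for v, f in curve]

# Hypothetical curve points: (voltage, MHz)
curve = [(0.950, 1900), (1.000, 1975), (1.043, 2025), (1.093, 2063)]
print(flatten_curve(curve, 1.043, 2025))
# every point from 1.043 V upward now sits at 2025 MHz
```

With the curve flat past the throttle voltage, the card never requests a higher-frequency point it lacks the power budget to sustain.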


----------



## Mad Pistol

Trying something fun today. Playing Battlefield 1 @ 5120x2880 and 200% resolution scaling, all low settings.

Well...


----------



## OZrevhead

Guys, I was looking for a short GPU for an ITX build, but EK doesn't make water blocks for the short cards. Do you think I can keep temps and noise down on a Zotac 1080 Ti Mini with an undervolt? I love water cooling and would prefer water, but if I can get satisfactory results, that will do for this build.


----------



## MaKeN

People, quick question...
If I flashed an Asus BIOS onto a Gigabyte Waterforce card, would it work with the Asus Aura RGB effects from the mobo?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Mad Pistol*
> 
> Trying something fun today. Playing Battlefield 1 @ 5120x2880 and 200% resolution scaling, all low settings.
> 
> Well...


I play BF1 @ 4K Low with Mesh at Ultra. It's the best experience: 120+ fps, and you can spot enemies much better.


----------



## Ultracarpet

Quote:


> Originally Posted by *OZrevhead*
> 
> Guys, I was looking for a shorty GPU for an itx build but EK doesn't make water blocks for the shorty GPUs, do you think I can keep temps and noise down on a Zotac 1080Ti mini with undervolt? I love water cooling and would prefer water but if I can get satisfactory results then that will do for this build.


The undervolt does help a bit. What you're asking, though, really depends on the case and the airflow getting to the card. I run mine at 0.925V / 1900MHz; in GPU-bound games the fan gets up to around 60-70% with my fan curve, which maxes out at 80C. If I let it get hotter it could probably be quieter, but I don't really mind; that's just a personal preference. I think at complete stock the card was compared with the Founders Edition cooler and was about the same dB (not too loud).


----------



## demon09

When undervolting, watch the performance numbers. There are spots where, say, I have 0.975V at 1935MHz, and if I up it to 1.00V my performance goes up even though both were stable. You can teeter on the edge of stability and get better performance by, say, dropping your clocks to 1925MHz at 0.975V. It just takes some playing around with voltages vs. clocks to find what's stable for each card.


----------



## 8051

HWMonitor64 reports power usage for my overclocked (2076 MHz) 1080 Ti at 163 watts. I'm guessing that's not accurate?


----------



## KedarWolf

Quote:


> Originally Posted by *8051*
> 
> HWmonitor64 reports my overclocked (2076 MHz.) power usage for my 1080Ti at 163 Watts. I'm guessing that's not accurate?


"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power -l 1

Run that in an admin command prompt WITH the quotation marks and you'll get an accurate power reading.

You can change the 1 to a 3, or whatever you want, to have it report every three seconds instead of every second, for example.
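For logging rather than eyeballing, nvidia-smi also has a machine-readable query mode (`nvidia-smi --query-gpu=power.draw --format=csv,noheader`). A small sketch that parses that output into floats, shown against a canned sample string so it runs without a GPU (the sample values are illustrative):

```python
def parse_power_draw(csv_output):
    """Parse watt values from the output of
    `nvidia-smi --query-gpu=power.draw --format=csv,noheader`,
    which prints one line per GPU like '163.45 W'."""
    watts = []
    for line in csv_output.strip().splitlines():
        watts.append(float(line.replace("W", "").strip()))
    return watts

# Canned sample output; on a real system you'd capture this via subprocess
sample = "163.45 W\n"
print(parse_power_draw(sample))  # [163.45]
```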


----------



## Glottis

Quote:


> Originally Posted by *EDK-TheONE*
> 
> Tooooooooooooooo long card is arrived now!


Nice. I just checked the Guru3D review of that card; the cooler is great, but the dimensions are just crazy. This wouldn't even fit in my Define R5 unless I removed the HDD cages. I really hope 325mm-long cards don't become a thing next year when Volta is out.


----------



## Alexium

Quote:


> Originally Posted by *KedarWolf*
> 
> "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power -l 1
> 
> Run that in a admin command prompt WITH the quotation mark and you'll get an accurate power reading.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You can change the 1 to 3 or whatever you want to have it report every three seconds instead of every second for example.


Nice!
Will the wattage numbers stay accurate if I flash a BIOS that has a higher power limit?


----------



## KedarWolf

Quote:


> Originally Posted by *Alexium*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power -l 1
> 
> Run that in a admin command prompt WITH the quotation mark and you'll get an accurate power reading.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You can change the 1 to 3 or whatever you want to have it report every three seconds instead of every second for example.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nice!
> Will the wattage numbers stay accurate if I flash a BIOS that has a higher power limit?
Click to expand...

Yes, it should; it reads the figure directly from the card itself. It's the best way to check power draw short of hooking up a meter.


----------



## 8051

Quote:


> Originally Posted by *Glottis*
> 
> Nice I just checked Guru3D review of that card, cooler is great but dimensions are just crazy. This wouldn't even fit in my Define R5 unless I remove HDD cages. I really hope 325mm long cards don't become a thing next year when Volta is out.


My Zotac 980Ti AMP! Xtreme was 328mm long. When I put aftermarket fans on the huge heatsink, I could keep the GPU temps at or below 60°C even in the middle of a summer heat wave (>80°F ambient) while overclocked to 1455 MHz.


----------



## Alexium

What is Vop limit and how does it differ from Vrel? So far I don't understand the logic behind Vop, and Googling for the answer doesn't yield much.

They say:
"vRel = Reliability. Indicating performance is limited by reliability voltage.
VOp = Operating. Indicating performance is limited by max operating voltage."

But a) it does not explain what Vrel actually means, and b) it does not explain why I sometimes see Vrel during undervolting and not Vop.


----------



## SavantStrike

Quote:


> Originally Posted by *LunaP*
> 
> Had a question since I've been debating on the vive, though whenever I go to the store its always having issues either I'm underneath everything so can't see or something else, and the rep can never figure out what the issue is and just shuts it off. Demo's been down for 6 months.
> 
> Would you recommend it and for what type of things? Can it still be used for 2D? like does it have a means to just use itself as an over the head display unit? Or is it strictly only for VR ready games/applications?


You can use it for 2D with third-party apps like Bigscreen. The resolution isn't fantastic, so it's only impressive for VR gaming content. I wouldn't try doing productivity work on the current gen of HMDs.


----------



## 8051

Quote:


> Originally Posted by *Alexium*
> 
> What is Vop limit and how does it differ from Vrel? So far I don't understand the logic behind Vop, and Googling for the answer doesn't yield much.
> 
> They say:
> "vRel = Reliability. Indicating performance is limited by reliability voltage.
> VOp = Operating. Indicating performance is limited by max operating voltage."
> 
> But a) it does not explain what Vrel actually means, and b) it does not explain why I sometimes see Vrel during undervolting ant not Vop.


I thought Vrel referred to a mismatch between the voltage you're requesting at a given frequency and what the voltage table range indicates is supposed to be provided at that frequency.


----------



## Alexium

Quote:


> Originally Posted by *8051*
> 
> I thought Vrel referred to a mismatch between the voltage you're requesting at a given frequency and what the voltage table range indicates is supposed to be provided at that frequency.


So my custom V/F curve is checked against another table, and my values are discarded unless deemed sane? Did I get it right?


----------



## 8051

Quote:


> Originally Posted by *Alexium*
> 
> So my custom V/F curve is checked against another table, and my values are discarded unless deemed sane? Did I get it right?


My description of Vrel only applied to what I've seen editing Maxwell VBIOSes. I've never seen any PerfCaps with my 1080Ti since I flashed the XOC VBIOS.


----------



## KedarWolf

https://benchmark.unigine.com/leaderboards/superposition/1.0/4k-optimized/single-gpu/page-1

10869 4K Optimised Superposition, #12 on the leaderboards.


----------



## Shawn Shutt jr

Does our memory type matter a ton, and should we be using modded or flashed BIOSes on our cards?

Also, how accurate is GPU-Z when it comes to looking up your card? Sounds silly, but the lookup option pulled up an ASUS ROG Strix GTX 1080 Ti, not the EVGA FTW3 model I have lol


----------



## navjack27

Cuz you flashed a BIOS on it


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *navjack27*
> 
> Cuz you flashed a bios on it


I got mine at Micro Center clearance, so that could have been the issue. I don't even know how to flash a BIOS on a video card. I'm using the slave BIOS, so maybe I'll restart and check the master out.

Master is
https://www.techpowerup.com/gpudb/b4315/evga-gtx-1080-ti-ftw3-w-icx-cooler

Slave is
https://www.techpowerup.com/gpudb/b4306/asus-rog-strix-gtx-1080-ti-gaming-oc

How does someone reset the BIOS to default, and does it matter at all?


----------



## Shawn Shutt jr

Same card, different BIOSes, no OC. Should I keep the card or return it and get a brand new one?

https://www.3dmark.com/fs/13878670

https://www.3dmark.com/fs/13878629

Just for fun, the last few years of upgrading compared.

https://www.3dmark.com/compare/fs/13878670/fs/13878629/fs/13238322/fs/11341857


----------



## TheBoom

Quote:


> Originally Posted by *8051*
> 
> My Zotac 980Ti AMP! Xtreme was 328mm long. When I put aftermarket fans on the huge heatsink I could keep the GPU temps <= 60° C even in the middle of a summer heat wave (>80°F temps) and overclocked to 1455 MHz.


What aftermarket fans may I ask?


----------



## Dasboogieman

Quote:


> Originally Posted by *Alexium*
> 
> What is Vop limit and how does it differ from Vrel? So far I don't understand the logic behind Vop, and Googling for the answer doesn't yield much.
> 
> They say:
> "vRel = Reliability. Indicating performance is limited by reliability voltage.
> VOp = Operating. Indicating performance is limited by max operating voltage."
> 
> But a) it does not explain what Vrel actually means, and b) it does not explain why I sometimes see Vrel during undervolting ant not Vop.


VOP means the chip is at the highest voltage bin allowed in software.
VREL means the chip is operating at the highest voltage bin (but not necessarily the highest allowed) that holds the highest clockspeed.

E.g. if I set my chip to 2088 MHz at 1.093V, I will likely get either VREL or VOP (or both at once) because my chip is at the highest voltage allowed by the BIOS. If I set my chip to 2088 MHz at 1.064V, then I will get a VREL because that is my highest voltage bin that also has the highest clock. Now if I keep the first setting but allow 2100 MHz at 1.2V, then I will get a VREL again.

Typically, VOP rarely manifests, but it is functionally the same as VREL. It simply means you are at the highest allowed voltage bin.


----------



## Alexium

Quote:


> Originally Posted by *Dasboogieman*
> 
> VOP means the chip is at the highest voltage bin allowed in software.
> VREL means the chip is operating at the highest voltage bin (but not necessarily the highest allowed) that holds the highest clockspeed.


Thanks a lot, that explains it perfectly!
So when the F/V curve is smooth, I would typically see Vop, and when there's a spike, I'll see Vrel. For instance: 1.0V-1950 MHz, 1.01V-2000 MHz, 1.03V-1975 MHz. I would get Vrel at 1.01 because higher voltage is available, but there is no sense in switching to it due to lower clock at that bin.


----------



## 2ndLastJedi

Quote:


> Originally Posted by *Dasboogieman*
> 
> VOP means the chip is at the highest voltage bin allowed in software.
> VREL means the chip is operating at the highest voltage bin (but not necessarily the highest allowed) that holds the highest clockspeed.


So VOP comes from the highest voltage allowed by my curve, or the software at its max?
If I'm seeing VRel, VOP and Power, and the power target is at 120% but I'm at 0.925V @ 1938 MHz, what should I do?
If I just max power/temp and add +136 MHz, I get 2088 MHz and VRel!


----------



## 2ndLastJedi

Quote:


> Originally Posted by *Shawn Shutt jr*
> 
> Same card, different bio's, no OC. should i keep the card or return it and get a brand new one?
> 
> https://www.3dmark.com/fs/13878670
> 
> https://www.3dmark.com/fs/13878629
> 
> Just for fun, the last few years of upgrading compared.
> 
> https://www.3dmark.com/compare/fs/13878670/fs/13878629/fs/13238322/fs/11341857


Keep the ASUS one, lol


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *2ndLastJedi*
> 
> Keep the Asus on ,lol


Right, it's a decent BIOS for sure; I was even able to hit 21k a bit later. It looks like I can add 70 MHz core and 600 MHz memory for an OC without even adding more volts, no crashes or anything. I gotta buy a bigger PSU before I start adding volts because I'm pretty sure I'm capped with this 660 W.

https://www.3dmark.com/fs/13878810


----------



## 2ndLastJedi

How are my combined FPS higher than yours?
https://www.3dmark.com/fs/13219699


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *2ndLastJedi*
> 
> How are my combined FPS higher than yours ?
> https://www.3dmark.com/fs/13219699


Must be something about Intel that likes the Combined test, since I'm running Ryzen and it seems to do great with Physics.


----------



## d3v0

Just pulled the trigger on a Gigabyte Aorus GTX 1080 Ti 11G for $680. General thoughts in here on that particular model? I personally like how the VRMs are cooled by the heatsink, and it doesn't seem as badly wattage- or thermally-limited as reference cards or some other AIBs are.

Pairing that with my new Acer Predator XB271HU (1440p, 165 Hz, IPS) I snagged for $599.

I have an EVGA SuperNOVA G3 750W I purchased and haven't yet fired up. I bought it on ultra sale for something like $80 and have just held it for this moment: the moment I finally retire my ten-year-old PC Power & Cooling 610W Silencer.

I heard that the Aorus software is really bad; are we all still using MSI AB for overclocking? It's been ages since I did an OC on my GPU. I just used a modded/OC BIOS for my 970 and set it/forgot it.


----------



## SavantStrike

Quote:


> Originally Posted by *Shawn Shutt jr*
> 
> Right, its a decent bio's for sure, i was even able to hit 21k abit later. it looks like i can add 70mhz clock and 600mem clock for an OC without even adding more volts, no crashes or anything. i gotta buy a bigger PSU before i start adding volts because im pretty sure im capped with this 660watt
> 
> https://www.3dmark.com/fs/13878810


What's your CPU? You have to work really hard at eclipsing 300-330W with a 1080TI.


----------



## KedarWolf

Quote:


> Originally Posted by *d3v0*
> 
> Just pulled the trigger on a Gigabyte Aorus GTX 1080Ti 11G for $680. General thoughts in here on that particular model? I personally like how the vrms are cooled by the heatsink and it doesnt seem wattage or thermally limited as badly as reference cards or some other AIBs are.
> 
> Pairing that with my new Acer Predator XB271HU (1440p, 165hz, IPS) I snagged for $599.
> 
> I have an EVGA Supernova G3 750W I purchased and havent yet fired up. I bought it on ultra sale, for something like $80 and have just held it for this moment. The moment I finally retire my ten year old PCP&C 610w silencer.
> 
> I heard that Aorus software is really bad, are we all using MSI AB for overclocking still? its been ages since I did an OC on my GPU. I just used a modded/OC bios for my 970 and set it/forget it.


Good work on snagging the hardware!

Yes, Afterburner is the way to go.

The first page of this thread shows how to do a custom voltage curve.


----------



## Dasboogieman

Quote:


> Originally Posted by *2ndLastJedi*
> 
> So VOP comes from the highest voltage allowed by my curve ? or the software at its max ?
> If im seeing VRel , VOP and Power and Power target is at 120% but im at 0.925 @ 1938MHz what should i do ?
> If i just max power / temp and add +136MHz i get 2088MHz and VRel !


VOP is the highest allowed by the BIOS.


----------



## 8051

Quote:


> Originally Posted by *TheBoom*
> 
> What aftermarket fans may I ask?


A shrouded Noctua NF-A14 140x25mm, 3000 RPM, and a Mechatronics 127x38mm. These fans perfectly covered the entire surface area of the heatsink.


----------



## Krazee

Any news if any more FE will be released? They are sold out everywhere


----------



## KedarWolf

Quote:


> Originally Posted by *Krazee*
> 
> Any news if any more FE will be released? They are sold out everywhere


Sign up at nowinstock.net and set up custom notifications; I guarantee you'll find stock if you keep on top of your email. I get notifications all the time, and with FEs it doesn't matter what brand you buy, they'll all work in SLI.


----------



## Alexium

My FE card only seems stable at a 2000 MHz core clock at 1.025 V. Water cooled, temperature below 50°C for now. Is it a terrible piece of silicon, or is it about average?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Alexium*
> 
> My FE card only seems stable at 2000 MHz core clock at 1.025 V. Water cooled, temperature below 50C for now. Is it a terrible piece of silicon, or is it about average?


It's fine.


----------



## 8051

Quote:


> Originally Posted by *Alexium*
> 
> My FE card only seems stable at 2000 MHz core clock at 1.025 V. Water cooled, temperature below 50C for now. Is it a terrible piece of silicon, or is it about average?


Why don't you try more voltage?


----------



## BigBeard86

What do you guys think of the regular Strix and the Strix OC? I plan to watercool the card; in that case, would anyone still prefer the OC model, for any reason?


----------



## 2ndLastJedi

When I'm using Superposition, GPU-Z shows Idle even though power consumption is 115% and GPU load is 99%.
Why would this be?

Screenshot118.png 1092k .png file


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *d3v0*
> 
> Just pulled the trigger on a Gigabyte Aorus GTX 1080Ti 11G for $680. General thoughts in here on that particular model? I personally like how the vrms are cooled by the heatsink and it doesnt seem wattage or thermally limited as badly as reference cards or some other AIBs are.
> 
> Pairing that with my new Acer Predator XB271HU (1440p, 165hz, IPS) I snagged for $599.
> 
> I have an EVGA Supernova G3 750W I purchased and havent yet fired up. I bought it on ultra sale, for something like $80 and have just held it for this moment. The moment I finally retire my ten year old PCP&C 610w silencer.
> 
> I heard that Aorus software is really bad, are we all using MSI AB for overclocking still? its been ages since I did an OC on my GPU. I just used a modded/OC bios for my 970 and set it/forget it.


God damn, that is a good deal, man! I paid $701 for my EVGA GeForce GTX 1080 Ti and $750 for my ASUS PG279Q. Wish I could have gotten a deal like yours lol! I pretty much use Afterburner for everything.

Quote:


> Originally Posted by *SavantStrike*
> 
> What's your CPU? You have to work really hard at eclipsing 300-330W with a 1080TI.


Ryzen 1700 at 3.925 GHz @ 1.43V, 32GB of Corsair Dominator Platinum @ 3000 MHz at 1.35V, ROG Crosshair Hero X370 mobo, EVGA GeForce GTX 1080 Ti, then you've got the rest of the stuff: M.2, AIO, keyboard/mouse, 2 screens, external HD. I was playing last night and my PC just shut off on me; must have tripped a power breaker.


----------



## khemist

I'm using 350w at the wall with a 1080ti and overclocked 7700k.


----------



## rluker5

Quote:


> Originally Posted by *d3v0*
> 
> Just pulled the trigger on a Gigabyte Aorus GTX 1080Ti 11G for $680. General thoughts in here on that particular model? I personally like how the vrms are cooled by the heatsink and it doesnt seem wattage or thermally limited as badly as reference cards or some other AIBs are.
> 
> Pairing that with my new Acer Predator XB271HU (1440p, 165hz, IPS) I snagged for $599.
> 
> I have an EVGA Supernova G3 750W I purchased and havent yet fired up. I bought it on ultra sale, for something like $80 and have just held it for this moment. The moment I finally retire my ten year old PCP&C 610w silencer.
> 
> I heard that Aorus software is really bad, are we all using MSI AB for overclocking still? its been ages since I did an OC on my GPU. I just used a modded/OC bios for my 970 and set it/forget it.


The Aorus software can set your RGB color, but I installed it at the same time as Afterburner and let's just say I reinstalled Windows.

Use Aorus for color, then get rid of it.

Also, some guy here about a month ago, who also had an Aorus, had relatively poor performance with an F3 or F4 vBIOS, played around with some vBIOSes, and found the other Aorus one to be among the best performers even though the clocks were in the mid-2000s. Wish I could remember which one was better.

I had a similar experience with my Waterforce F2 vBIOS, where I can get about 2037 and roughly the same scores as 2140 on the XOC vBIOS, so I switched back.

So if you can keep your card in the low 60s and stay pretty close to the top graphics scores for similar CPUs with the voltage and power sliders up, you probably don't need to flash around your vBIOS.
And with your power limit being high, your biggest worry will probably be keeping the card thermally steady and quiet, and the voltage curves work great for that. KedarWolf made a video on how in the OP.
I generally run lowered volts even though I don't have to worry about power or temp limits. It just seems more enjoyable to have a quiet rig running well within its limits.


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *khemist*
> 
> I'm using 350w at the wall with a 1080ti and overclocked 7700k.


I think I need to just buy a wall meter for watts, so I know how much I'm pulling; no point in buying a $150 PSU if I don't need it, right? Thanks for the info.

This should work fine.

https://www.amazon.com/dp/B00009MDBU/_encoding=UTF8?coliid=I1CNKFN9Q4J56U&colid=2LP5CNCMS6VC0


----------



## tongerks

Hello guys, any input on the Barrow 1080 Ti GPU block versus the EK 1080 Ti FE GPU block?

Any pros and cons?


----------



## pez

Quote:


> Originally Posted by *Shawn Shutt jr*
> 
> i think i need to just buy a wall mount for watts, so i know how much im pulling, no point is buying a $150 PSU if i dont need it right? Thanks for the info
> 
> This should work fine.
> 
> https://www.amazon.com/dp/B00009MDBU/_encoding=UTF8?coliid=I1CNKFN9Q4J56U&colid=2LP5CNCMS6VC0


Yes, people tend to buy PSUs that are greater than what they need.

I'd have to confirm what I'm pulling from the wall, but OC'd on the TXP and 7700K, I think I was pulling closer to 400 W after I calculated efficiency. A lot of single-GPU systems with OCs on both the CPU and GPU generally don't need more than 550 W; 450 W or less if running a 1080 or lower.


----------



## Alexium

Quote:


> Originally Posted by *8051*
> 
> Why don't you try more voltage?


I will, but I will have to flash the ASUS Strix VBIOS to raise the power limit. For example, Witcher 3 hits the power limit at 1.042 V, but 3DMark Fire Strike Ultra hits it even at 1.02 V and drops down to at least 1.0 V.
I will probably have to go all the way to 1.06-1.09 V to get any kind of decent clocks from my card. Just trying to decide whether it's worth the trouble at all or this card is hopeless.


----------



## TheBoom

Quote:


> Originally Posted by *8051*
> 
> A shrouded noctua 140x25mm, NF A14, 3000 RPM and a mechatronics 127x38mm. These fans perfectly covered the entire surface area of the heatsink.


Ah, I thought you just replaced the fans without removing the cooler. Thanks anyway.


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *pez*
> 
> Yes, people tend to buy PSU's that are greater than what they need.
> 
> I'd have to confirm what I'm pulling from the wall, but OC'ed on the TXP and 7700K, I think I was pulling closer to 400w after I calculated efficiency. A lot of single GPU systems with OCs on both CPU and GPU generally don't need more than 550w. 450w or less if running a 1080 or lower.


I picked up a Seasonic 660 80+ Platinum like a year ago and didn't think I would ever need to upgrade unless I went SLI for the lulz.

I think 660 W is enough; I just gotta test it from the wall.


----------



## pez

Quote:


> Originally Posted by *Shawn Shutt jr*
> 
> i picked up a seasonic 660 80+platinum like a year ago and didnt think i would ever need to really upgrade unless i went SLI for the lulz..
> 
> i think 660watt is enough i just gotta test it from the wall


For a single GPU? What are your other specs? That PSU should be plenty with room to spare.


----------



## Alexium

I'm running a Corsair RM1000i PSU, it has a bunch of current meters and is able to calculate both wall power and output power. The highest I've seen so far with a minor overclock (+20% TDP, 2000-2025 MHz) is 440 W from the wall / 400 W actual power (about 90% PSU efficiency). This is with a stock AMD R7 1700 CPU. I expect to hit 500/450 wall/output power after overclocking the CPU.
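The arithmetic behind those numbers, as a quick sketch (the helper names are just for illustration):

```python
# PSU efficiency is DC output power divided by AC wall power.
def psu_efficiency(wall_watts: float, output_watts: float) -> float:
    return output_watts / wall_watts

# AC wall draw needed to deliver a given DC load at a given efficiency.
def wall_draw(output_watts: float, efficiency: float) -> float:
    return output_watts / efficiency

print(round(psu_efficiency(440, 400), 3))  # 0.909, i.e. about 90% as measured
print(round(wall_draw(450, 0.90)))         # the expected ~500 W wall figure
```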


----------



## KedarWolf

Quote:


> Originally Posted by *Alexium*
> 
> I'm running a Corsair RM1000i PSU, it has a bunch of current meters and is able to calculate both wall power and output power. The highest I've seen so far with a minor overclock (+20% TDP, 2000-2025 MHz) is 440 W from the wall / 400 W actual power (about 90% PSU efficiency). This is with a stock AMD R7 1700 CPU. I expect to hit 500/450 wall/output power after overclocking the CPU.


I'm coming into some money in a year or so.

Think this would run four Volta video cards in quad SLI, fully water cooled, and a delidded 7980XE at 5+ GHz?

http://www.legitreviews.com/super-flower-2000w-power-supply-introduced-designed-8pack_158135

It'll likely be the very last high-end PC I'll be able to build unless I hit the lotto or something.


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *pez*
> 
> For a single GPU? What are your other specs? That PSU should be plenty with room to spare.


Ryzen 1700 at 3.925 GHz @ 1.43V, 32GB of Corsair Dominator Platinum @ 3000 MHz at 1.35V, ROG Crosshair Hero X370 mobo, EVGA GeForce GTX 1080 Ti, then you've got the rest of the stuff: M.2, AIO, keyboard/mouse, 2 screens, external HD.

Quote:


> Originally Posted by *Alexium*
> 
> I'm running a Corsair RM1000i PSU, it has a bunch of current meters and is able to calculate both wall power and output power. The highest I've seen so far with a minor overclock (+20% TDP, 2000-2025 MHz) is 440 W from the wall / 400 W actual power (about 90% PSU efficiency). This is with a stock AMD R7 1700 CPU. I expect to hit 500/450 wall/output power after overclocking the CPU.


I'm most likely running around the same build as you, but with an overclock. I don't think I should be pulling 660 W at all; I only thought maybe I did because my PC just shut off, but it could be an unstable OC or memory. Never had issues with them before, though.


----------



## Alexium

Quote:


> Originally Posted by *Shawn Shutt jr*
> 
> i dont think i should be pulling 660watts at all


My point exactly. I can imagine 600 W if your EVGA has 360 W power limit versus my 300, and considering your seriously high CPU voltage. But no more than that.


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *Alexium*
> 
> My point exactly. I can imagine 600 W if your EVGA has 360 W power limit versus my 300, and considering your seriously high CPU voltage. But no more than that.


Agreed. Sadly, anything under 1.43 V and I can't boot at 3.9 GHz.


----------



## pez

Quote:


> Originally Posted by *Shawn Shutt jr*
> 
> Ryzen 1700 3.925ghz @1.43volt, 32GB of corsair dominator platinum @3000mhz with 1.35volt, ROG crosshair hero x370 mobo, EVGA geforce GTX 1080TI, then you got the rest stuff m.2, AIO, keyboard/mouse, 2 screens, external HD.
> im most likely running around the same build as you but with an overclock but i dont think i should be pulling 660watts at all, i only thought maybe i did because my PC just shut off but could be unstable OC or mem, never had issues with them before tho.


Yep, you'll be perfectly fine on that







.


----------



## SavantStrike

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm running into some money in a year or so.
> 
> Think this would run four Volta video cards in quad SLI fully water cooled and delidded 7980XE at 5+GHZ?
> 
> http://www.legitreviews.com/super-flower-2000w-power-supply-introduced-designed-8pack_158135
> 
> Likely be the very last high-end PC I'll be able to build unless I hit the lotto or something.


Yeah, that unit would handle that load. It's 230V only though, so if you're in North America you'll need to run a special circuit for it.

Also, 4-way SLI is a waste of money outside of a few benchmarks and specialized rendering workloads. It's not officially supported, so expect it to get worse over time unless DX12 EMA gets better.

You could buy half as many GPUs and a CPU that costs half as much and most likely get the same performance; then a few years later you can buy a new machine with the money you saved. Unless you have a need for something that high-end, I would recommend that course of action.


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *SavantStrike*
> 
> Yeah that unit would handle that load. That unit is 230V only though so if you're in north America then you'll need to run a special circuit for it.
> 
> Also, 4 way SLI is a waste of money outside of a few benchmarks and specialized rendering workloads. It's not officially supported so expect it to get worse over time unless DX12 EMA gets better.
> 
> You could buy half as many GPUs and a CPU that's half as much money and most likely get the same performance, then a few years later you can buy a new machine with the money you saved. Unless you have a need for something that high end I would recommend that course of action.


I think I would rather build 2 computers for the price of 4-way SLI lol. That way I could have an AMD and an Intel system.


----------



## siffonen

I own a Strix 1080 Ti and I am hitting the power limit, which reduces my OC clocks.
I flashed the XOC BIOS, but I noticed that I was hitting 1.125 V without touching the voltage slider. Is this a safe voltage for 24/7? I have ordered an EK waterblock, so cooling won't be an issue in a few days.
If that is too high, how can I lower it to a safe voltage?


----------



## RoBiK

Quote:


> Originally Posted by *siffonen*
> 
> I own a Strix 1080 ti and i am hitting power limit which reduces my oc clocks.
> I flashed the xoc bios, but i noticed that i was hitting 1.125v without touching the voltage slider. Is this safe voltage for 24/7?, i have ordered a ek waterblock so cooling wont be an issue in few days.
> If that is too high, how can i lower it to safe voltage?


See the first post in this thread to learn how to set up a custom voltage curve, with that you will have complete control over the voltages.


----------



## Eldred32

I own a Strix 1080 Ti too, and I was thinking about ordering an EK waterblock, but I saw that with our card specifically the warranty is voided if you unscrew the backplate (there is a warranty sticker on top of the screw that you have to tear in order to unscrew the backplate). Is that not the case? Is it worth it? One of my friends has liquid-cooled his Aorus 1080 Ti, and we don't have that huge of a difference in temps, like 15°C or something.


----------



## siffonen

Quote:


> Originally Posted by *RoBiK*
> 
> See the first post in this thread to learn how to set up a custom voltage curve, with that you will have complete control over the voltages.


Thanks for that, didn't know that's possible.


----------



## buellersdayoff

Quote:


> Originally Posted by *Eldred32*
> 
> I own a Strix 1080 ti too and i was thinking about ordering an EK waterblock but i saw that with our card specifically the warranty is getting void if you unscrew the backplate (there is a sticker that voids warranty on top of the screw that u have to tear in order to unscrew the backplate) , is that not the case? Is it worth it? One of my friends has LCed his Aorus 1080ti and we don't have that huge of a difference in temps like 15 C or smthing like that.


If you are just gaming, the extra cost is not worth it for the small gains imo, but if you are a serious bencher then, well, you'd be on LN2.


----------



## Alexium

To me, LCS is first and foremost about noise management, and then its nice side effect is being able to OC more than you could on air while still remaining silent.

Just noticed that Furmark 4K is causing my card to drop the voltage/clocks A LOT. I mean, it was expected, but I didn't expect THAT much of a difference. Witcher 3, one of the more demanding/optimized games that I've run yet, power-limits at 1.043 V; 3DMark Fire Strike Ultra at 1.0 or the high 0.9s; and Furmark at 0.875. Fun stuff.
If you're running an XOC VBIOS, I strongly suggest you don't run Furmark, even if your voltage doesn't seem that high.


----------



## Dasboogieman

Quote:


> Originally Posted by *Eldred32*
> 
> I own a Strix 1080 ti too and i was thinking about ordering an EK waterblock but i saw that with our card specifically the warranty is getting void if you unscrew the backplate (there is a sticker that voids warranty on top of the screw that u have to tear in order to unscrew the backplate) , is that not the case? Is it worth it? One of my friends has LCed his Aorus 1080ti and we don't have that huge of a difference in temps like 15 C or smthing like that.


15°C is a lot more than you think. Lower temperatures not only increase your clockspeed headroom, but can shave off about 10-15 W (plus another 10 W from not needing to power fans + LEDs) to prevent throttling. That watercooled Aorus card just gained 25 W of TDP headroom from watercooling alone.


----------



## KCDC

Quote:


> Originally Posted by *Alexium*
> 
> My FE card only seems stable at 2000 MHz core clock at 1.025 V. Water cooled, temperature below 50C for now. Is it a terrible piece of silicon, or is it about average?


My FE cards, as far as I can tell on the stock BIOS, need 1.093 V for anything near 2050-2088. I just use the slider at +150; the voltage curve editing doesn't help me. Not until I shunt mod.

The main culprit for me, other than PL, is TDP. Even though I'm under water, it's an OC'd CPU and two OC'd 1080 Tis.


----------



## Alexium

Quote:


> Originally Posted by *KCDC*
> 
> My FE cards, as far as I can tell, stock BIOS, need 1.093 for anything near 2050-2088.


Thanks, I feel a bit better now







That seems to be the case for me as well, can't get 2050 to hold at any kind of a lowered voltage.
Quote:


> Originally Posted by *KCDC*
> 
> Main culprit for me, other than PL, is TDP.


By TDP, do you mean temperature?


----------



## KCDC

So, I'm FINALLY going to have time to do a full tear-down this Saturday.

Purchased a meat smoker, so I'm smoking a 13 pound brisket all day, which will keep me home and away from friends and events. 9 hours to do a smoke, so 9 hours available for a complete clean.

Two reasons why:

My idles are too high, so my load temps get too high. With an ambient of 26-29°C, my cards currently idle around 35-45°C; same loop as the CPU, balanced power.

Something ain't right if you check the hardware in my sig; they should idle 10-17°C cooler. The pumps are working fine. Gonna use better pads all around, 8 W/mK; I couldn't bring myself to get the pricey ones, since two cards makes it expensive, plus they dry out and crumble over time. The EK pads are only 4 W/mK, so I am doubling the thermal conductivity.

The second reason is finally doing the shunt mod. I've been researching it, waiting for the right time so I can test and retest. That's the fun/scary one.

I will probably be a bit active on this thread as I work on it. Anyone have final tips on the shunt mod? I was gonna use liquid electrical tape and little detail paint brushes to build a perimeter around each shunt, then cap the LM with a piece of old thermal pad. The stuff takes time to dry. Who's used the liquid ET?


----------



## KCDC

Quote:


> Originally Posted by *Alexium*
> 
> Thanks, I feel a bit better now
> 
> 
> 
> 
> 
> 
> 
> That seems to be the case for me as well, can't get 2050 to hold at any kind of a lowered voltage.
> By TDP, do you mean temperature?


Yes, temp.

The cards go down a bin when they hit certain temps. It's really annoying when you're on water. If the BIOS wasn't so strict on TDP, we could possibly keep a solid clock.


----------



## Alexium

How many radiator sections do you have in your water cooling loop? Your temps are not that much higher than mine: I'm idling at 26-27 C, but that's with just one card in the loop - not even the CPU is liquid-cooled yet, and I have 3x120 + 2x140 rads (albeit the fans run at very low RPMs).

Why go through all the trouble with the shunt mod when you can just flash a BIOS with a higher power limit? There is EVGA FTW3 BIOS with 330W, there's the secondary FTW3 BIOS with 360 W limit, and then there's the ASUS XOC if you want to go full nuts.


----------



## RoBiK

Quote:


> Originally Posted by *KCDC*
> 
> If the bios wasn't so strict on TDP, we could possibly keep a solid clock.


There is always the XOC BIOS that you could use.


----------



## KCDC

Quote:


> Originally Posted by *Alexium*
> 
> How many radiator sections do you have in your water cooling loop? Your temps are not that much higher than mine: I'm idling at 26-27 C, but that's with just one card in the loop - not even the CPU is liquid-cooled yet, and I have 3x120 + 2x140 rads (albeit the fans run at very low RPMs).


I have two HWLabs GTR 420x60mm rads. I know this could be an ambient Southern Cali temp thing, but I know my system should idle cooler, it has in the past on the same loop. Also, kind of just using it as an excuse to be a recluse and mess with my rig for a whole weekend.


----------



## KCDC

Quote:


> Originally Posted by *RoBiK*
> 
> There is always the XOC BIOS that you could use.


Yes this is true, but I want to do the shunt mod, plus I run triple monitor and it kills the first DP.

I ran XOC for a bit and it worked great, just had an overwhelmingly loud amount of coil whine. I want to see if shunting is better than XOC.

An experiment, if you will.


----------



## Alexium

Quote:


> Originally Posted by *KCDC*
> 
> Yes this is true, but I want to do the shunt mod, plus I run triple monitor and it kills the first DP.


I have not tried it yet myself (not sure if it makes any sense with my crappy card), but they say the EVGA FTW3 VBIOS leaves all 3 DP ports functional.
Quote:


> Originally Posted by *KCDC*
> 
> I ran XOC for a bit and it worked great, just had an overwhelmingly loud amount of coil whine. I want to see if shunting is better than XOC.


Not trying to talk you out or anything, but 10 bucks says the coil whine is a function of power draw. XOC BIOS = No limits = higher power draw = louder whine. Shunt mod = higher power draw = louder whine.
Also, coil whine typically manifests while the card is rendering something at high FPS. Enabling Vsync to lock FPS at 60 usually kills the whine altogether, or at least drastically reduces its loudness.


----------



## KCDC

Quote:


> Originally Posted by *Alexium*
> 
> I have not tried it yet myself (not sure if it makes any sense with my crappy card), but they say the EVGA FTW3 VBIOS leaves all 3 DP ports functional.
> Not trying to talk you out or anything, but 10 bucks says the coil whine is a function of power draw. XOC BIOS = No limits = higher power draw = louder whine. Shunt mod = higher power draw = louder whine.
> Also, coil whine typically manifests while the card is rendering something at high FPS. Enabling Vsync to lock FPS at 60 usually kills the whine altogether, or at least drastically reduces its loudness.


The coil whine complaint is minimal.. Pretty sure it's always there, so it's a moot point; shouldn't have made it a thing.. From what I've read and gathered, doing the shunt is the best and most solid way to keep stable high clocks. Plus I get to mess with the cards. It's a hobby, after all.


----------



## Glottis

Quote:


> Originally Posted by *Eldred32*
> 
> I own a Strix 1080 Ti too and i was thinking about ordering an EK waterblock, but i saw that with our card specifically the warranty gets voided if you unscrew the backplate (there is a sticker on top of the screw that you have to tear in order to unscrew the backplate). Is that not the case? Is it worth it? One of my friends has LC'ed his Aorus 1080 Ti and we don't have that huge of a difference in temps, like 15 C or something like that.


Not worth the hassle, IMO. My Strix 1080 Ti is 50-60C while gaming, which is amazing for a 250W-TDP GPU. It only climbs above 65C when benching at 100% load for 30 minutes. Custom coolers nowadays are so efficient.
Also, you have to consider one very important thing: when not gaming, the Strix fans are off (0 dB). LC keeps making noise all the time.


----------



## Alexium

Quote:


> Originally Posted by *Glottis*
> 
> LC keeps making noise all the time.


That's one way of building an LCS. Another way is to control your fans dynamically based on CPU / GPU temperatures. And the third way is to buy a quiet pump, a bunch of good fans (I mean the best out there), and build a system that's silent all the time.


----------



## TheBoom

Just got back my Amp Extreme from Zotac. Slightly higher core clocks compared to the old one, running around 2062-2075 stable at 1440p and jumping between 2025-2050 at 4K (power throttling).

But I cannot for the life of me find the best memory clock to settle on. It seems to be changing its timings every time I drop a bin, and FPS jumps around even at the same bin whenever I change the memory clock and come back to it. My old card just preferred +400 and it would reproduce the same results every time.


----------



## KedarWolf

Quote:


> Originally Posted by *TheBoom*
> 
> Just got back my Amp Extreme from Zotac. Slightly higher core clocks compared to the old one, running around 2062-2075 stable at 1440p and jumping from 2025-2050 at 4k (power throttling).
> 
> But I cannot for the life of me find the best memory clock to settle on. It seems to be changing it's timings every time I drop a bin and fps jumps around even at the same bin whenever I change the memory clock and come back to it. My old card just preferred +400 and it would reproduce the same results every time.


Run Nvidia Profile Inspector as Admin, turn 'Force P2 State' off. You'll have to hit Apply twice before it actually changes it.

Then set your memory clock, run the vRamBandWidthTest (unzip the file first of course).

I know that when my bandwidth test jumps from 450-460 MB/sec to 480-490 MB/sec at close to the same clock speed (I move the clocks up and down a bin or two until I see the change), I'm at optimal memory speeds.
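The search described above can be sketched as a simple sweep: walk the memory offset up in bins and flag the first offset where the reported bandwidth jumps sharply. The sweep data below is canned example numbers loosely shaped like the post, and running the actual vRamBandWidthTest at each offset is left as a stand-in; none of this is the tool's real interface.

```python
# Sketch of the "watch for the bandwidth jump" search. The readings dict
# maps memory-clock offset (MHz) to a reported bandwidth figure (MB/s);
# the numbers are canned examples, not real vRamBandWidthTest output.

def find_jump(readings_mb_s, jump_threshold=20):
    """Return the first offset where bandwidth rises by > threshold MB/s."""
    offsets = sorted(readings_mb_s)
    for prev, cur in zip(offsets, offsets[1:]):
        if readings_mb_s[cur] - readings_mb_s[prev] > jump_threshold:
            return cur
    return None

# Example sweep, loosely shaped like the 450-460 -> 480-490 jump in the post:
sweep = {300: 452, 350: 458, 400: 486, 450: 489}
print(find_jump(sweep))  # 400
```

The idea being tested is that GDDR5X retrains to a better timing set at some offsets, so effective bandwidth, not the raw clock, is what you optimize.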


----------



## Alexium

Quote:


> Originally Posted by *KedarWolf*
> 
> Run Nvidia Profile Inspector as Admin, turn 'Force P2 State' off. You'll have to hit Apply twice before it actually changes it.
> 
> Then set your memory clock, run the vRamBandWidthTest (unzip the file first of course).


Thanks for the pro tip.
What's P2 state?


----------



## KedarWolf

Quote:


> Originally Posted by *Alexium*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Run Nvidia Profile Inspector as Admin, turn 'Force P2 State' off. You'll have to hit Apply twice before it actually changes it.
> 
> Then set your memory clock, run the vRamBandWidthTest (unzip the file first of course).
> 
> 
> 
> Thanks for the pro tip.
> What's P2 state?
Click to expand...

Forgot the attachments, here.

vRamBandWidthTest.zip 120k .zip file


nvidiaProfileInspector.zip 129k .zip file


It just forces max memory clocks on CUDA apps, and the RamTest is a CUDA app.


----------



## mouacyk

Just like to confirm Bullzoid's suspicion with Pascal voltage affecting performance...

I have my Seahawk EK X locked to 2100.5 MHz on the core and 12,420 MHz on the VRAM, then played with voltages:

Voltage - Firestrike Ultra Stress Test - Superposition 8K Run 1 - Superposition 8K Run 2
1.081v - Failed - N/A
1.100v - Passed - 4779
1.113v - Passed - 4789
1.125v - Passed - 4805 - 4804
1.131v - Passed - 4785

It seems my optimal voltage is 1.125v, which is confirmed by a Firestrike run netting a 32,344 graphics score. It even beats 2177MHz / 12000MHz at 1.2v, which netted something like 32,169 or so.
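A quick sanity check on the passing scores in the table above, sketched in Python (only data from the post, no assumptions):

```python
# Superposition 8K scores from the sweep above (first run per voltage),
# at a fixed 2100.5 MHz core / 12,420 MHz VRAM.
scores = {1.100: 4779, 1.113: 4789, 1.125: 4805, 1.131: 4785}

best_v = max(scores, key=scores.get)
spread_pct = (max(scores.values()) - min(scores.values())) / min(scores.values()) * 100

print(f"best voltage: {best_v} V")               # 1.125
print(f"min-to-max spread: {spread_pct:.2f}%")   # about 0.54%
```

The 1.125 V peak stands out, but the whole spread between the passing voltages is only about half a percent, which is worth keeping in mind for the discussion that follows.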


----------



## Alexium

Quote:


> Originally Posted by *mouacyk*
> 
> Just like to confirm Bullzoid's suspicion with Pascal voltage affecting performance...


0.5% difference between min and max result? I'd say the suspicion is busted. You get more variance between successive benchmark runs, all the results are within margin of error.


----------



## mouacyk

Quote:


> Originally Posted by *Alexium*
> 
> 0.5% difference between min and max result? I'd say the suspicion is busted. You get more variance between successive benchmark runs, all the results are within margin of error.


The voltage tolerance on Pascal is much tighter than it was on Maxwell. Compared to overclocking my 980 Ti, I noticed that Pascal exposes an unstable overclock much sooner.

I will take your point that more runs are needed for better validation. However, I should note that my OS is stripped down to diagnostic mode (no non-native OS services), which should have minimized any variance introduced by the CPU.


----------



## Alexium

Quote:


> Originally Posted by *mouacyk*
> 
> I will take your point that more runs are needed for better validation. However, I should note that my OS is stripped down to diagnostic mode (no non-native OS services) which should have minimized the possible variances by CPU.


Fair enough. My point is two-fold:
1) more data points are needed to reliably establish correlation between voltage and performance given such a small magnitude of variance;
2) the practical value of a 0.5% gain (or 0.25% on average) is not evident to me. I assume only hardcore / professional benchers should care enough.
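Point 1 can be made concrete with a back-of-envelope sample-size estimate: to resolve a 0.5% difference between two settings, the standard error of each mean (which shrinks as sigma divided by the square root of n) has to drop below the effect. The ~0.4% run-to-run standard deviation used below is an assumed figure for illustration, not measured data from this thread.

```python
# Back-of-envelope: how many runs per setting before a given percentage
# difference clears run-to-run noise? Assumes a two-sample comparison of
# means at ~95% confidence. The 0.4% run-to-run std dev is a guess.
import math

def runs_needed(effect_pct, run_stdev_pct, z=1.96):
    # Require z * sqrt(2) * sigma / sqrt(n) < effect  =>  solve for n.
    n = (z * math.sqrt(2) * run_stdev_pct / effect_pct) ** 2
    return math.ceil(n)

print(runs_needed(0.5, 0.4))   # ~5 runs per voltage setting
print(runs_needed(0.25, 0.4))  # ~20 runs to resolve the average gain
```

So a single run per voltage, as in the table earlier, is well short of what is needed to call a 0.25-0.5% delta real.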


----------



## TheBoom

Quote:


> Originally Posted by *KedarWolf*
> 
> Run Nvidia Profile Inspector as Admin, turn 'Force P2 State' off. You'll have to hit Apply twice before it actually changes it.
> 
> Then set your memory clock, run the vRamBandWidthTest (unzip the file first of course).
> 
> I know when my bandwidth test jumps from 450-460 MB/sec to 480-490MB/sec at close to the same clocks speed (I move them up and down a bin or two until I see the change) I'm at optimal memory speeds.


Thanks for this, didn't know there was an easier way to find the optimal memory overclock.


----------



## DStealth

Coffee Lake pushed my old 1080 Ti to a new level...
https://www.3dmark.com/3dm/22804284


----------



## tknight

Quote:


> Originally Posted by *mouacyk*
> 
> Just like to confirm Bullzoid's suspicion with Pascal voltage affecting performance...
> 
> I have my Seahawk EK X locked to 2100.5MHz on cores and 12,420MHz and VRAM, then played with voltages:
> 
> Voltage - Firestrike Ultra Stress Test - Superposition 8K Run 1 - Superposition 8K Run 2
> 1.081v - Failed - N/A
> 1.100v - Passed - 4779
> 1.113v - Passed - 4789
> 1.125v - Passed - 4805 - 4804
> 1.131v - Passed - 4785
> 
> It seems my optimal voltage is 1.125v, which is confirmed with a Firestrike run netting 32,344 graphics score. It even beats 2177MHz / 12000MHz at 1.2v, which netted someting like 32,169 or so.


Is the MSI 1080ti Seahawk EK X, the same as a putting the EK 1080 ti TF6 waterblock on a MSI 1080 ti Gaming X ? In terms of actual card(pcb) and clocks are they identical ?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *DStealth*
> 
> Coffee lake pushed my old 1080ti to a new level...
> https://www.3dmark.com/3dm/22804284


What was your GPU score before? Looks like the CPU pushed the total score up. I like the other two scores, nice for a 6-core.


----------



## DStealth

GPU was 200-300 points lower with [email protected] and [email protected], but yes, Physics and Combined (especially Combined) are better than what almost any other CPU paired with a similarly clocked 1080 Ti does...








Edit: Just checked the HOF and my result is number 30 worldwide for a single card... there are also two 8700Ks in front of me, beating the total scores of many 8-10+ core CPUs.


----------



## Alexium

Something weird is going on with my 1080 Ti FE as I am trying to find a stable overclock. Through trial and error, over the course of a week, I established that 1999 MHz is stable at 1.025 V, 2025 MHz is stable at 1.0425 V, and 2038 MHz is stable at around 1.063 V. I've used this curve for the last 2-3 days with no issues. I've been mostly playing Witcher 3 because it's just what I'm into these days, and it does seem to load the card harder than many other games. But today the very same Witcher 3 started crashing on me, quite often / soon after I start it up. I've checked my curve. Then I dropped one core clock step (12 MHz) on all voltage bins. Still crashing. Then I dropped again and again. Now it finally seems stable at 1974 MHz / 1.025 V. This may not be the highest clock for this voltage / the lowest voltage for this clock, as I have not fine-tuned it much, but you get the idea of how much I had to drop clocks that used to be stable yesterday.

Any idea what could be going on? Did anyone encounter such a phenomenon?

I'm fairly confident in my PSU as it's a brand new Corsair RM1000i, and the card is liquid-cooled by a full-cover waterblock. The card is bone stock; I did not flash a different VBIOS or anything, just some OC using Afterburner.
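The recovery procedure described above (drop every voltage bin by one 12 MHz step, retest, repeat) maps naturally onto an Afterburner-style voltage-to-frequency curve. A minimal sketch, using the three stable points quoted in the post and the ~12 MHz Pascal step size it mentions:

```python
# Afterburner-style V/F curve as a dict of voltage (V) -> clock (MHz).
# Points are the ones reported stable in the post above.
curve = {1.025: 1999, 1.0425: 2025, 1.063: 2038}

def drop_bins(vf_curve, steps=1, step_mhz=12):
    """Lower every voltage bin on the curve by `steps` clock steps."""
    return {v: mhz - steps * step_mhz for v, mhz in vf_curve.items()}

# Two rounds of instability => two steps down across the whole curve:
print(drop_bins(curve, steps=2))  # 1.025 V bin lands at 1975 MHz
```

Two uniform steps down puts the 1.025 V bin at 1975 MHz, essentially the 1974 MHz the card settled at, which is consistent with the whole curve having shifted rather than one bad point.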


----------



## KedarWolf

Quote:


> Originally Posted by *Alexium*
> 
> Something weird is going on with my 1080 Ti FE as I am trying to find stable overclock. Through trial and error, over the course of a week, I have established that 1999 MHz is stable at 1.025 V, 2025 MHz is stable at 1.0425 V, and 2038 MHz is stable at around 1.063. I've used this curve for the last 2-3 days with no issues. Been mostly playing Witcher 3 because it's just what I'm into these days, and it does seem to load the card better than many other games. But today the very same Witcher 3 game started crashing on me, and quite often / soon after I start it up. I've checked my curve. Then I've dropped one core clock step (12 MHz) on all voltage bins. Still crashing. Then I've dropped again and again. Now it finally seems stable at 1974 MHz / 1.025 V. This may not be the highest clock for this V / lowest V for this clock as I have not fine-tuned it too much, but you get the idea of how much I had to drop clocks that used to be stable yesterday.
> 
> Any idea what could be going on? Did anyone encounter such a phenomenon?
> 
> I'm fairly confident in my PSU as it's a brand new Corsair RM1000i, and the card is liquid-cooled by a full-cover waterblock. The card is bone stock, I did not flash a different VBIOS or anything. Just some OC using Afterburner.


Try DDU uninstalling your drivers in safe mode and doing a clean install of your drivers. A driver crash can leave your drivers unstable; this method may help.


----------



## mouacyk

Quote:


> Originally Posted by *tknight*
> 
> Is the MSI 1080ti Seahawk EK X, the same as a putting the EK 1080 ti TF6 waterblock on a MSI 1080 ti Gaming X ? In terms of actual card(pcb) and clocks are they identical ?


It is exactly that but you don't get the default cooler. You get warranty and $90 off compared to buying separately. Great for those with custom loops who plan to keep for a long time.


----------



## tknight

Quote:


> Originally Posted by *mouacyk*
> 
> It is exactly that but you don't get the default cooler. You get warranty and $90 off compared to buying separately. Great for those with custom loops who plan to keep for a long time.


Thanks for clarifying that, much appreciated.


----------



## SavantStrike

Quote:


> Originally Posted by *mouacyk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tknight*
> 
> Is the MSI 1080ti Seahawk EK X, the same as a putting the EK 1080 ti TF6 waterblock on a MSI 1080 ti Gaming X ? In terms of actual card(pcb) and clocks are they identical ?
> 
> 
> 
> It is exactly that but you don't get the default cooler. You get warranty and $90 off compared to buying separately. Great for those with custom loops who plan to keep for a long time.
Click to expand...

Also, it should be noted that MSI does the warranty-void-if-removed sticker crap. Word on the street is they sometimes let it slide, but if you crack one open it's a slight risk.

Sent from my ZTE Axon 7 Resurrection Remix.


----------



## mouacyk

Quote:


> Originally Posted by *SavantStrike*
> 
> Also it should be noted that MSI does the warranty void if removed sticker crap. Word on the street is they sometimes let it slide, but if you crack one open it's a slight risk.
> 
> Sent from my ZTE Axon 7 Resurrection Remix.


If you buy the Seahawk EK X, it has the block installed and the sticker is intact from factory.


----------



## CYBER-SNIPA

Quote:


> Originally Posted by *SavantStrike*
> 
> Also it should be noted that MSI does the warranty void if removed sticker crap. Word on the street is they sometimes let it slide, but if you crack one open it's a slight risk.
> 
> Sent from my ZTE Axon 7 Resurrection Remix.


MSI replaced my Titan X when it died, I had installed and removed an EK water block and back plate. The stickers were not intact but they replaced the card anyways, so I guess it's up to the individual warranty center and how they choose to honor the warranty or not? I guess if the card is in a non-modded condition or shows no signs of abuse they will usually replace faulty parts.


----------



## 2ndLastJedi

Last night I was playing The Division with my seemingly stable undervolt of [email protected] and I got a crash which shut off my PC. It restarted, but with only my 1080 Ti running with lights and fans; the mobo and case fans and lights were not running. I panicked and hard shut it off! I then restarted it and it just fired up as normal. Should I be worried? Today when I started my PC it froze once, but it seems to be running right again now.
Have you guys ever experienced anything like this, and how would you proceed now?


----------



## DStealth

Moving forward








https://www.3dmark.com/3dm/22827122


----------



## buellersdayoff

Quote:


> Originally Posted by *tknight*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mouacyk*
> 
> Just like to confirm Bullzoid's suspicion with Pascal voltage affecting performance...
> 
> I have my Seahawk EK X locked to 2100.5MHz on cores and 12,420MHz and VRAM, then played with voltages:
> 
> Voltage - Firestrike Ultra Stress Test - Superposition 8K Run 1 - Superposition 8K Run 2
> 1.081v - Failed - N/A
> 1.100v - Passed - 4779
> 1.113v - Passed - 4789
> 1.125v - Passed - 4805 - 4804
> 1.131v - Passed - 4785
> 
> It seems my optimal voltage is 1.125v, which is confirmed with a Firestrike run netting 32,344 graphics score. It even beats 2177MHz / 12000MHz at 1.2v, which netted someting like 32,169 or so.
> 
> 
> 
> Is the MSI 1080ti Seahawk EK X, the same as a putting the EK 1080 ti TF6 waterblock on a MSI 1080 ti Gaming X ? In terms of actual card(pcb) and clocks are they identical ?
Click to expand...

According to Gamers Nexus the MSI Armor is also the same PCB, possibly a cheaper way to do it?

Sent from my SGP512 using Tapatalk


----------



## Garrett1974NL

Quote:


> Originally Posted by *buellersdayoff*
> 
> According to gamers nexus the MSI armor is also the same pcb, possibly a cheaper way to do it?
> 
> Sent from my SGP512 using Tapatalk


I bought the OC Armor because the cooler was gonna be replaced by a waterblock anyway. I can confirm: no problems.


----------



## mouacyk

Quote:


> Originally Posted by *buellersdayoff*
> 
> According to gamers nexus the MSI armor is also the same pcb, possibly a cheaper way to do it?
> 
> Sent from my SGP512 using Tapatalk


The cheapest MSI Armor is $710; add a $140 EK water block = $850. The Seahawk EK can be bought for $800, with factory warranty and a pre-installed block.


----------



## KedarWolf

If anyone wants to test it, this Palit BIOS works with our FE's and has a 350W power limit and the power slider goes to 140.

https://www.techpowerup.com/vgabios/193137/193137

Unzip this and run this .bat file after you flash it.

powerlimit350.zip 0k .zip file


----------



## Streetdragon

Is the clock-for-clock performance of this BIOS the same as FE stock?


----------



## 2ndLastJedi

Quote:


> Originally Posted by *2ndLastJedi*
> 
> Last night i was playing The Division with my seemingly stable undervolt of [email protected] and i got a crash which shut off my PC and it restarted but with only my 1080 Ti running with lights and fans but mobo and case fans and lights were not running, i panicked and hard shut it off! I then restarted it and it just fires up as normal! Should i be worried? Today when i started my PC it froze once but seems to be running right again now!
> Have you Guys ever experienced anything like this and how would you proceed now?


So no one has experienced anything like this ?


----------



## Shawn Shutt jr

I've had some issues since I got my 1080 Ti where my system just shuts off. No idea what is forcing it; I just power it back on and go about my day till I find out why.


----------



## SavantStrike

Quote:


> Originally Posted by *Shawn Shutt jr*
> 
> iv had some issues since i got my 1080TI that my system just shuts off, no idea what is forcing it, i just power it back on and go about my day til i found out why


That sounds like a flaky power supply at first glance. What are the rest of your specs?


----------



## MaKeN

Quote:


> Originally Posted by *Shawn Shutt jr*
> 
> iv had some issues since i got my 1080TI that my system just shuts off, no idea what is forcing it, i just power it back on and go about my day til i found out why


There are ASUS Z270 Code mobos that shut down under load in games... if you own one...


----------



## TheBoom

Quote:


> Originally Posted by *Shawn Shutt jr*
> 
> iv had some issues since i got my 1080TI that my system just shuts off, no idea what is forcing it, i just power it back on and go about my day til i found out why


Most likely the power supply. But it's really hard to confirm, and it's one of the most annoying and troublesome PC issues you can run into.

I experienced something like this recently. Turns out it was a faulty 8-pin cable. Who woulda known. A spare 290X ran fine on the same cable; possibly a fail-safe that triggers on Nvidia cards.


----------



## KedarWolf

#1 Fire Strike 1080 Ti benchmark on overclock.net.



http://www.overclock.net/t/1406832/single-gpu-fire-strike-top-30/2020_20#post_26410156


----------



## KedarWolf

Quote:


> Originally Posted by *2ndLastJedi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *2ndLastJedi*
> 
> Last night i was playing The Division with my seemingly stable undervolt of [email protected] and i got a crash which shut off my PC and it restarted but with only my 1080 Ti running with lights and fans but mobo and case fans and lights qhere not running, i panicked and hard shut it off! I then restarted it and it just fires up as normal! Should i be worried? Today when i started my PC it froze once but seems to be running right again now!
> Have you Guys ever experienced anything like this and how would you proceed now?
> 
> 
> 
> So no one has experienced anything like this ?
Click to expand...

Try turning this or the equivalent off: Anti-Surge Support.


----------



## Mooncheese

Quote:


> Originally Posted by *KedarWolf*
> 
> If anyone wants to test it, this Palit BIOS works with our FE's and has a 350W power limit and the power slider goes to 140.
> 
> https://www.techpowerup.com/vgabios/193137/193137
> 
> Unzip this and run this .bat file after you flash it.
> 
> powerlimit350.zip 0k .zip file


Have you tried it and how does it compare to FE?

If it's FE with 350W I'm game.


----------



## mouacyk

Quote:


> Originally Posted by *KedarWolf*
> 
> #1 Fire Strike 1080 Ti benchmark on overclock.net.
> 
> 
> 
> http://www.overclock.net/t/1406832/single-gpu-fire-strike-top-30/2020_20#post_26410156


33,148 here. Am I #2? Can't push anymore...

https://www.3dmark.com/fs/13951080


----------



## KedarWolf

Quote:


> Originally Posted by *mouacyk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> #1 Fire Strike 1080 Ti benchmark on overclock.net.
> 
> 
> 
> http://www.overclock.net/t/1406832/single-gpu-fire-strike-top-30/2020_20#post_26410156
> 
> 
> 
> 33,148 here. Am I #2? Can't push anymore...
> 
> https://www.3dmark.com/fs/13951080
Click to expand...

Your CPU is holding you back overall, but that's a really nice graphics score.


----------



## DStealth

I have a couple of runs over 27k total, but sub-33k GPU. This is my best atm: https://www.3dmark.com/fs/13935312

How is this Time Spy?
https://www.3dmark.com/spy/2606888


Keep in mind it's stock Palit gamerock BIOS and cooler









Edit:
Wow, 2100/12600 exceeding 178 FPS pushed by the 8700K is huge


----------



## Gunslinger.

Quote:


> Originally Posted by *KedarWolf*
> 
> #1 Fire Strike 1080 Ti benchmark on overclock.net.
> 
> 
> 
> http://www.overclock.net/t/1406832/single-gpu-fire-strike-top-30/2020_20#post_26410156


Challenge accepted


----------



## Gunslinger.

https://www.3dmark.com/fs/12869460


----------



## mouacyk

Quote:


> Originally Posted by *Gunslinger.*
> 
> 
> 
> https://www.3dmark.com/fs/12869460


C'mon, the old LOD trick again? J/k, that 2200+ core is gonna be hard to top anyway. Congrats.


----------



## DStealth

These 8700k having such combined...
https://www.3dmark.com/3dm/22895490

Still stock BIOS and cooler...


----------



## feznz

Quote:


> Originally Posted by *KedarWolf*
> 
> #1 Fire Strike 1080 Ti benchmark on overclock.net.
> 
> 
> 
> http://www.overclock.net/t/1406832/single-gpu-fire-strike-top-30/2020_20#post_26410156


LOL, there you go for showing off. Who's gonna break the 30k mark then?????

On a side note, that 8700K is looking mighty fine for an upgrade from my current 3700K. Still undecided though, the 7900X is looking mighty fine too...
Kind of making my choice on the aesthetics of the motherboard, so I'll probably spend the extra and still go for the 7900X with the Asus Rampage Extreme XI, such a beautiful mobo.


----------



## KedarWolf

Quote:


> Originally Posted by *mouacyk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gunslinger.*
> 
> 
> 
> https://www.3dmark.com/fs/12869460
> 
> 
> 
> Com'on, the old LOD trick again? J/k, that 2200+ core is gonna be hard to top, anyway. Congrats.
Click to expand...

LOD trick? whazzat?

LOD trick Googled.


----------



## siffonen

Quote:


> Originally Posted by *2ndLastJedi*
> 
> So no one has experienced anything like this ?


If you have a modular PSU, you could try swapping your PCI-e cables to different connectors on your PSU.
I had a similar issue with my old PSU, and swapping those cables to different connectors helped, but eventually the issue started again, so I had to replace my PSU.


----------



## feznz

Quote:


> Originally Posted by *KedarWolf*
> 
> LOD trick? whazzat?
> 
> LOD trick Googled.


LOD: level of detail.
I accidentally found it after Google failed me.


----------



## KedarWolf

Quote:


> Originally Posted by *Gunslinger.*
> 
> 
> 
> https://www.3dmark.com/fs/12869460


Invalid score. LOD BIAS tweaks not allowed. Found that out when I tried that trick.


----------



## Gunslinger.

Quote:


> Originally Posted by *KedarWolf*
> 
> Invalid score. LOD BIAS tweaks not allowed. Found that out when I tried that trick.


Not allowed for HOF but allowed and valid everywhere else.


----------



## KedarWolf

Quote:


> Originally Posted by *Gunslinger.*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Invalid score. LOD BIAS tweaks not allowed. Found that out when I tried that trick.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not allowed for HOF but allowed and valid everywhere else.
Click to expand...

Not allowed on overclock.net Fire Strike benchmark thread as well.


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gunslinger.*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Invalid score. LOD BIAS tweaks not allowed. Found that out when I tried that trick.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not allowed for HOF but allowed and valid everywhere else.
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> Not allowed on overclock.net Fire Strike benchmark thread as well.
Click to expand...

and not on Hwbot.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gunslinger.*
> 
> Not allowed for HOF but allowed and valid everywhere else.


What is the point if it shows it? It's nothing special if anyone can do it.


----------



## chibi

Hey guys, I'm thinking of switching out my EVGA card for a Strix OC. Am I right to assume that the ASUS GTX 1080 Ti Strix OC is the best card to pair with the XOC bios? I just want to get rid of that pesky Power Limit without having to resort to shunt modding.

Thanks!


----------



## Gunslinger.

Quote:


> Originally Posted by *KedarWolf*
> 
> and not on Hwbot.


It IS allowed at HWBot, the only 3D bench it is NOT allowed for is Time Spy.

You have to know the rules of the game.









LOD Tweaking allowed in the following (HWBot):

3DMark '01
3DMark '03
3DMark '05
3DMark '06
3DMark Vantage
3DMark '11
3DMark Fire Strike
3DMark Fire Strike Extreme
Aquamark3
Unigine Heaven Extreme
Unigine Heaven Basic
Catzilla 720P
Catzilla 1440P

LOD Tweaking NOT allowed in the following (HWBot):

3DMark Time Spy
3DMark Time Spy Extreme (presumably)

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What is the point if it shows it. Its nothing special if anyone can do it.


You're right, we should all just click start benchmark and wait for the score to pop up.


----------



## Agent-A01

Quote:


> Originally Posted by *Gunslinger.*
> 
> It IS allowed at HWBot, the only 3D bench it is NOT allowed for is Time Spy.
> 
> You have to know the rules of the game.
> 
> 
> 
> 
> 
> 
> 
> 
> You're right, we should all just click start benchmark and wait for the score to pop up.


LOD invalidates the score; I don't care how you try to validate it as being OK.

No different than AMD users turning off/down tessellation; the score is still invalid.

Anytime I see those, I ignore them.


----------



## Gunslinger.

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What is the point if it shows it. Its nothing special if anyone can do it.


Quote:


> Originally Posted by *Agent-A01*
> 
> LOD invalidates score, don't care how you try to validate it as being OK.
> 
> No different than AMD users turning off/down tessellation, score is still invalid.
> 
> Anytime I see those, I ignore them.


If you're talking about Futuremark's HOF you are correct. Which is why I'll occasionally run some untweaked runs just for HOF purposes.

I'm talking about HWBot, which in reality is where 99% of "World Record Scores" are celebrated.


----------



## LunaP

OMG, the 388 drivers finally fixed my issues. OBS was causing FPS drops just by having it open, and I was getting lag/spikes in some games with nothing going on, where it'd say 53-56 FPS but act like it's 5.
SLI works great now with the accessory card too!


----------



## 8051

Quote:


> Originally Posted by *LunaP*
> 
> omg the 388 drivers finally fixed my issues, OBS was causing fps drops just by having it open, and was getting lag/spikes in some games with nothing going on where it'd say 53-56fps but act like its 5.
> SLI works great now w/ the accessory card too!


So you have SLI'd Titan X's? Or SLI'd 1080Ti's? Either way I imagine the heat output from your computer must be staggering.


----------



## Shawn Shutt jr

So I've had my 1080 Ti for about a week now. I got it on clearance from Micro Center (yeah I know, sue me); the slave BIOS was already flashed (didn't think much of it). After a few days my games started crashing over and over; Destiny 2 crashes after about 10 mins of playing. I thought maybe my PSU was bad because my PC would turn off randomly sometimes. I got the idea to switch to the master BIOS (the default) and I haven't had a crash in hours.

So, I paid $701 for my 1080 Ti. Should I just bite the bullet, return it, and grab a brand new 1080 Ti for $800? Money isn't really the problem, just the stress of having to return it. But something is wrong with the card either way.


----------



## Agent-A01

Quote:


> Originally Posted by *Gunslinger.*
> 
> If you're talking about Futuremark's HOF you are correct. Which is why I'll occasionally run some untweaked runs just for HOF purposes.
> 
> I'm talking about HWBot, which in reality is where 99% of "World Record Scores" are celebrated.


I'm talking about in general, regardless of hwbot or hof.
Quote:


> Originally Posted by *Shawn Shutt jr*
> 
> So I've had my 1080 Ti for about a week now. I got it on clearance from Micro Center (yeah, I know, sue me), and the slave BIOS was already flashed (didn't think much of it). After a few days my games started crashing over and over; Destiny 2 crashes after about 10 minutes of playing. I thought maybe my PSU was bad because my PC would turn off randomly sometimes. I got the idea to switch to the master BIOS (the default) and I haven't had a crash in hours.
> 
> So, I paid $701 for my 1080 Ti. Should I just bite the bullet, return it, and grab a brand new 1080 Ti for $800? Money isn't really the problem, just the stress of having to return it. But something is wrong with the card either way.


If it's not crashing in master bios then there's nothing wrong with it.

EVGA doesn't guarantee an OC will work(if slave bios is actually OCing)

I would just leave it be and set a curve OC with lower voltages.


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *Agent-A01*
> 
> I'm talking about in general, regardless of hwbot or hof.
> If it's not crashing in master bios then there's nothing wrong with it.
> 
> EVGA doesn't guarantee an OC will work(if slave bios is actually OCing)
> 
> I would just leave it be and set a curve OC with lower voltages.


Never mind, it just crashed on me. I thought for sure the slave BIOS was bad, but it's doing it in both BIOSes. Maybe I'll replace the power supply.

The game just keeps closing after a while, no errors, nothing.

Note: it does this in other games as well, not just D2.


----------



## navjack27

Does EVGA Precision XOC actually show that you have an EVGA card? GPU-Z? Because it could have been flashed with another card's BIOS... See if you can get an original BIOS for that exact card before doing anything.

Edit: are you using Windows 10-supplied drivers? Manually install the latest from Nvidia; I've had issues with Microsoft-supplied drivers.


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *navjack27*
> 
> Does EVGA Precision XOC actually show that you have an EVGA card? GPU-Z? Because it could have been flashed with another card's BIOS... See if you can get an original BIOS for that exact card before doing anything.
> 
> Edit: are you using Windows 10-supplied drivers? Manually install the latest from Nvidia; I've had issues with Microsoft-supplied drivers.


The master BIOS is the default one; the slave is a flashed Asus Strix BIOS. I got the card on clearance from Micro Center and someone had already flashed the BIOS. And I have the newest drivers: I ran DDU before Destiny came out and installed the 388.00 drivers.

But I still get crashes from D2 on both BIOSes. So it's gotta be the game, right? But I also get crashes from other games, Middle-earth and PUBG.


----------



## navjack27

OK, yeah, just return it.
Note: I never suggest DDU. I think it's trash and overkill, and it commonly breaks things.


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *navjack27*
> 
> OK, yeah, just return it.
> Note: I never suggest DDU. I think it's trash and overkill, and it commonly breaks things.


You think it's the card? I've also had a few issues with my computer just shutting off during gaming and restarting. Thought maybe the card was pulling too much power? I've only got a 660-watt.

Really? People swear by DDU so much lol


----------



## navjack27

My card pulls at max 358w more commonly in the 200s for games and not compute. Maybe if you have a really really bad PSU.... If you have another card, test it


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *navjack27*
> 
> My card pulls at max 358w more commonly in the 200s for games and not compute. Maybe if you have a really really bad PSU.... If you have another card, test it


It's a Seasonic 660-watt Platinum, maybe 7 months old. I've got a 970 in my gf's PC I could pop in and test.


----------



## navjack27

I'm pretty sure that card is the problem. But if it crashes with the 970 under gaming load... You'll have a fun day of troubleshooting


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *navjack27*
> 
> I'm pretty sure that card is the problem. But if it crashes with the 970 under gaming load... You'll have a fun day of troubleshooting


lol facts! I didn't start having issues till I got the 1080 Ti, and it was used. So let's pray it's the 1080 Ti, because I still have like 3 weeks to return it and can just pick up a brand new one.


----------



## mouacyk

Quote:


> Originally Posted by *Shawn Shutt jr*
> 
> It's a Seasonic 660-watt Platinum, maybe 7 months old. I've got a 970 in my gf's PC I could pop in and test.


When I first started using the XOC BIOS at 1.2v, hwinfo64 was showing power up to 450W when I tried ~2200MHz. I have that exact psu and it would randomly shutdown when benching with the overclock. I switched to a 850W ss gold that I had and the problems went away completely. This GPU sucks power when power target is unlimited...
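For a rough sense of why a 660W unit struggles here, a back-of-the-envelope headroom check helps. This is only a sketch with hypothetical CPU and "rest of system" figures; the 450W GPU reading is the hwinfo64 number above:

```python
# Rough PSU headroom sanity check (illustrative figures only).
# GPU power comes from a monitoring tool like hwinfo64; CPU and the
# rest of the system are rough estimates, not measured values.

def psu_headroom(psu_watts, gpu_watts, cpu_watts, rest_watts=75):
    """Return remaining headroom in watts (negative = overloaded)."""
    total = gpu_watts + cpu_watts + rest_watts
    return psu_watts - total

# A shunt-modded 1080 Ti at ~450W plus an overclocked quad-core
# leaves almost nothing on a 660W unit:
print(psu_headroom(660, 450, 120))  # → 15 (transient spikes easily exceed this)
print(psu_headroom(850, 450, 120))  # → 205
```

With only ~15W of steady-state margin, momentary power spikes trip the PSU's protection, which matches the random shutdowns described above.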


----------



## DStealth




----------



## Shawn Shutt jr

Quote:


> Originally Posted by *mouacyk*
> 
> When I first started using the XOC BIOS at 1.2v, hwinfo64 was showing power up to 450W when I tried ~2200MHz. I have that exact psu and it would randomly shutdown when benching with the overclock. I switched to a 850W ss gold that I had and the problems went away completely. This GPU sucks power when power target is unlimited...


Screenshots of the 2 different BIOSes. I'm running the 970 and I've had 2 crashes, but they aren't like the others, so I'm still not sure about the issue at hand. With the 1080 Ti the game would freeze, then close. With the 970 it just freezes and nothing happens after; I gotta Ctrl-Alt-Delete and end task. Could be because the 1080 Ti is fast enough to try to recover, but I'm not sure.

Note: the pictures are not under load.


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *DStealth*


also screw you for getting the 8700k







still waiting till i can pick one up lol


----------



## DStealth

Sure XOC is strong [email protected]
https://www.3dmark.com/fs/13973532


----------



## Mooncheese

Has anyone tried 388.00 and if so what is the verdict?

I'm still on 378.78 but am interested in updating, both because I'm using Cemu to run Breath of the Wild and a recent driver reduced RAM consumption, and because I'd like to rule out a corrupted driver as the source of the near-daily black screen that happens even at default clocks.


----------



## Gunslinger.

388.00

https://www.3dmark.com/fs/13980223


----------



## nrpeyton

Quote:


> Originally Posted by *Pyounpy-2*
> 
> Hi, I have used SLI of Zotac 1080ti amp extreme.
> Using bios mod (using XOC) and shunt mod for the power limit, I got the stable SLI system at 2227MHz and GPU voltage of 1.2V.
> 
> CPU:7700K @5.4GHz Vcore:1.408V, Mem: 4200MHz 15-17-17-35-1T
> GPU:[email protected], Video Mem:6003MHz, SLI
> Cooler: water cooling


Nice water-cooling setup m8.

I'm assuming an aquarium water chiller incorporated into your loop? For an *at load* temp of 13c on the GPU you must only have a water temp of around 3c? _(My chiller's lowest setting is 3c)._

What is the ambient temperature and relative humidity allowing you to run it so cold? (Unless I insulate everything, I generally can't go below 12c due to moisture forming.)

If I open all the windows I could probably go as low as 8-10c. But below 8 is impossible, even sitting inside with a jacket on. lol

In the past I have insulated everything, then re-wired the thermostat to allow it to go sub-zero. But it barely keeps up; lucky if it can maintain 0c. The lowest I ever got was -12c using sleep mode and only running the pump _(then I'd get maybe 2-3 mins before the heat load raised the coolant temp)._


----------



## Shawn Shutt jr

Just swapped my video card and got the EVGA GeForce GTX 1080 Ti SC2 Hybrid, a fresh-out-of-the-box card. Temps are a wet dream: 35c idle, 50c under load. It's great. Though since it's a fresh card the default BIOS is still clean, what are the recommended tweaks for a brand new card? Flash a new BIOS? Remove the power limit? Also, what are the risks?


----------



## nrpeyton

Quote:


> Originally Posted by *Shawn Shutt jr*
> 
> Just swapped my video card and got the EVGA GeForce GTX 1080 Ti SC2 Hybrid, a fresh-out-of-the-box card. Temps are a wet dream: 35c idle, 50c under load. It's great. Though since it's a fresh card the default BIOS is still clean, what are the recommended tweaks for a brand new card? Flash a new BIOS? Remove the power limit? Also, what are the risks?


First thing I'd do is download MSI Afterburner. Slide that power slider all the way up to maximum. This will help prevent power throttling. _(Max is usually approx. 120%)_

So far no risk.

You could also do a custom voltage curve _without_ actually touching the voltage slider.

Without touching the voltage slider, there is no risk.
Just means your curve will max out at the default max voltage. (Whatever frequency overclock you manage to achieve at default max voltage is "Silicon Lottery" dependent.)

Finally, I'd definitely overclock the memory. No voltage options here, so no risk. It's just free gains. (Do some benching and stability testing -- if you go too high, this could negatively affect benchmark scores & FPS). _Many people are easily enjoying a +500 from memory.
_

If you feel a bit more adventurous -- you could increase the voltage slider to its 100% maximum. IF cold enough - this may allow your card to remain stable at a core frequency not possible at a lower voltage.

The voltage slider is not unsafe. Nvidia still imposes a very conservative maximum. Only thing I'd be aware of -- is it is not impossible that this could shorten the _long-term_ life expectancy of your card. _(I.E. faster wear & tear / degradation)._

The Opening Page of this thread is also a wealth of information. I'd definitely have a good read through it, any questions about it are always 100% welcome here.
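To make the "curve OC without touching the voltage slider" idea above concrete, here's a toy model: every point on the voltage/frequency curve gets the same clock offset, but no points beyond the stock maximum voltage are added, so your top clock is still whatever your silicon manages at that voltage. The curve numbers are hypothetical, not from any specific card:

```python
# Toy model of an Afterburner-style V/F curve offset (hypothetical numbers).
# Keys are voltage points (V); values are boost clocks (MHz).
stock_curve = {0.800: 1670, 0.900: 1810, 1.000: 1935, 1.062: 2000}

def offset_curve(curve, mhz_offset, max_voltage=1.062):
    """Shift every point up by mhz_offset without adding voltage points
    beyond the stock maximum -- the top clock you actually sustain at
    max_voltage is still silicon-lottery dependent."""
    return {v: clk + mhz_offset for v, clk in curve.items() if v <= max_voltage}

oc = offset_curve(stock_curve, 150)
print(oc[1.062])  # → 2150, the clock requested at default max voltage
```

The point of the sketch: the offset changes the clocks requested, never the voltage ceiling, which is why this approach carries essentially no risk.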


----------



## RusMarksman

GIGABYTE *GTX 1080 Ti Founder Edition @2253/12312 MHz*, 1.2V, shunt mod, XOC-bios
i7 4790K @5000 MHz
Water cooling system
MS Windows 10 Home 64-bit
Drivers: 387.92
*FIRE STRIKE SCORE 23 294; Graphics Score 34 367
FIRE STRIKE EXTREME SCORE 14 902; Graphics Score 16 867
FIRE STRIKE ULTRA SCORE 8 139; Graphics Score 8 358

@2240/12213 MHz
TIME SPY SCORE 10 178; Graphics Score 11 895*



Spoiler: Unigine Heaven DX11 Benchmark v4.0


----------



## tomxlr8

Could someone please let me know if a custom bios is available for *MSI GeForce GTX 1080 Ti Seahawk EK X* ?

It is not in the best of zip file on main page but I was hoping by now there would be something.
I am finding this card caps out quickly because I can't up the power very far.

Alternatively, if there is any guide to get the best out of this card then please link. I followed the guide for optimal power curve already.


----------



## KedarWolf

Quote:


> Originally Posted by *tomxlr8*
> 
> Could someone please let me know if a custom bios is available for *MSI GeForce GTX 1080 Ti Seahawk EK X* ?
> 
> It is not in the best of zip file on main page but I was hoping by now there would be something.
> I am finding this card caps out quickly because I can't up the power very far.
> 
> Alternatively, if there is any guide to get the best out of this card then please link. I followed the guide for optimal power curve already.


All the files in the Collection will work on your card. Try the 350W one.

Or XOC, I think that has a full waterblock on it, right?

If you try the 350W one, remember to run the 350W .bat file as Admin after you flash it, reboot, and reinstall your drivers.


----------



## DStealth

Quote:


> Originally Posted by *RusMarksman*
> 
> *GTX 1080 Ti Founder Edition @2253/12312 MHz*, 1.2V, shunt mod, XOC-bios
> i7 4790K @5000 MHz
> Water cooling system
> MS Windows 10 Home 64-bit
> Drivers: 387.92


What cooling is on the card? 15-17° while loaded in [email protected] seems pretty decent.








All your scores are great anyway. Paired with [email protected] you'll most probably break the top 10 single-card in Fire Strike.


----------



## mouacyk

Quote:


> Originally Posted by *tomxlr8*
> 
> Could someone please let me know if a custom bios is available for *MSI GeForce GTX 1080 Ti Seahawk EK X* ?
> 
> It is not in the best of zip file on main page but I was hoping by now there would be something.
> I am finding this card caps out quickly because I can't up the power very far.
> 
> Alternatively, if there is any guide to get the best out of this card then please link. I followed the guide for optimal power curve already.


The XOC BIOS works great on this card. I have voltage curve profiles for [email protected], [email protected], and [email protected]


----------



## KedarWolf

Did some long overdue editing of the Overclock.net 1080Ti Owner's Survey spreadsheet.


----------



## RusMarksman

Quote:


> Originally Posted by *DStealth*
> 
> What cooling is on the card? 15-17° while loaded in [email protected] seems pretty decent.


The radiator (420x140) was placed on the windowsill outside the window. The outside temperature was 1 °C.


Spoiler: Pictures


----------



## RusMarksman

Quote:


> Originally Posted by *DStealth*
> 
> All your scores are great anyway. Paired with [email protected] you'll most probably break the top 10 single-card in Fire Strike.


Judging by *this table* I have the fifth result in the world in FIRE STRIKE ULTRA Graphics Score.
On the water I am the first in Graphics Score in all tests.


----------



## nrpeyton

Quote:


> Originally Posted by *RusMarksman*
> 
> *GTX 1080 Ti Founder Edition @2253/12312 MHz*, 1.2V, shunt mod, XOC-bios
> i7 4790K @5000 MHz
> Water cooling system
> MS Windows 10 Home 64-bit
> Drivers: 387.92
> *FIRE STRIKE SCORE 23 294; Graphics Score 34 367
> FIRE STRIKE EXTREME SCORE 14 902; Graphics Score 16 867
> FIRE STRIKE ULTRA SCORE 8 139; Graphics Score 8 358
> 
> @2240/12213 MHz
> TIME SPY SCORE 10 178; Graphics Score 11 895*
> 
> 
> 
> Spoiler: Unigine Heaven DX11 Benchmark v4.0


Do you score higher with the XOC BIOS than with your card's own BIOS? _(Reason I ask is I see you have completed the hardware shunt mod.)
_

Quote:


> Originally Posted by *KedarWolf*
> 
> Did some long overdue editing of the Overclock.net 1080Ti Owner's Survey spreadsheet.


Nice one m8


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RusMarksman*
> 
> *GTX 1080 Ti Founder Edition @2253/12312 MHz*, 1.2V, shunt mod, XOC-bios
> i7 4790K @5000 MHz
> Water cooling system
> MS Windows 10 Home 64-bit
> Drivers: 387.92
> *FIRE STRIKE SCORE 23 294; Graphics Score 34 367
> FIRE STRIKE EXTREME SCORE 14 902; Graphics Score 16 867
> FIRE STRIKE ULTRA SCORE 8 139; Graphics Score 8 358
> 
> @2240/12213 MHz
> TIME SPY SCORE 10 178; Graphics Score 11 895*
> 
> 
> 
> Spoiler: Unigine Heaven DX11 Benchmark v4.0
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Do you score higher with the XOC BIOS than with your card's own BIOS? _(Reason I ask is I see you have completed the hardware shunt mod.)
> _
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Did some long overdue editing of the Overclock.net 1080Ti Owner's Survey spreadsheet.
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> Nice one m8
Click to expand...

Thank you.


----------



## RusMarksman

Quote:


> Originally Posted by *nrpeyton*
> 
> Do you score higher with the XOC BIOS than with your card's own BIOS?


With XOC-bios I have better results since the frequency and voltage are higher. At the same frequency the results are the same.
That's what I managed to achieve with stock bios and shunt mod at a voltage of 1.093 https://www.3dmark.com/fs/13911412


----------



## MrTOOSHORT

Nice 1080ti scores @RusMarksman!


----------



## DStealth

Very nice actually... out of competition outside extreme cooling. That's why I asked about the cooling, but the answer was clear... near-zero radiator ambient says it all.


----------



## nrpeyton

Quote:


> Originally Posted by *DStealth*
> 
> Very nice actually... out of competition outside extreme cooling. That's why I asked about the cooling, but the answer was clear... near-zero radiator ambient says it all.


I'd love to have my radiator routed through to an industrial freezer.


----------



## TheRedViper

Quote:


> Originally Posted by *nrpeyton*
> 
> I'd love to have my radiator routed through to an industrial freezer.


Or just move to canada ?


----------



## CrazyElf

For the folks running SLI, Nvidia still has not provided a default Superposition Benchmark SLI file.









If you need to import a SLI Profile, use this:
Quote:


> Profile "Unigine: Superposition Benchmark"
> ShowOn All
> ProfileType Application
> Executable "launcher.exe" FindFile="superposition.exe"
> Executable "superposition.exe"
> Setting ID_0x00a06946 = 0x088000f5
> Setting ID_0x00a06946 = 0x0a0010f5 UserSpecified=true
> Setting ID_0x00db3859 = 0x00000012
> Setting ID_0x1033cec1 = 0x00000003
> Setting ID_0x1033cec2 = 0x00000002
> Setting ID_0x1033dcd2 = 0x00000004
> Setting ID_0x1033dcd3 = 0x00000004
> Setting ID_0x106d5cff = 0x00000000
> Setting ID_0x1095def8 = 0x02c00005
> Setting ID_0x10ecdb82 = 0x00000001
> Setting ID_0x10f9dc81 = 0x00000011
> Setting ID_0x10fd4c5f = 0x01f296c1
> Setting ID_0x2014cc70 = 0x00000001
> Setting ID_0x202dc432 = 0x00000012
> Setting ID_0x20441369 = 0x00000001
> Setting ID_0x20570fc6 = 0x00000001
> Setting ID_0x209746c1 = 0x04080001
> Setting ID_0x20992431 = 0x00000001
> Setting ID_0x209fd306 = 0x0fc00008
> Setting ID_0x20c1221e = 0x00000001
> EndProfile


Get the Nvidia Geforce 3D Profile Manager: http://nvidia.custhelp.com/app/answers/detail/a_id/2625



I have the misfortune of having one card that is decent while losing the silicon lottery on the other.







One GPU is undervolted to .962V, while the other one seems to need 100mV more just to sustain the same clocks.

On the upside, considering this is all on air cooling, temps seem to be fairly decent.

Quote:


> Originally Posted by *Blameless*
> 
> I don't have 13% to spare in terms of performance.
> 
> I still drop below 60 fps at times in _Elite: Dangerous_ with custom settings at 4k. If I could get another 5% in performance to push those minimums up to the point they never fell below my recording frame rate, I would, even if it cost me another 50w. However, I'm at the limits of my part in both stable temperatures and power and would have to put the card under water to improve things tangibly.


We are still not at the point where we can game reliably at 4K @ 60Hz on a single GPU, I'm afraid. I suspect we would need at least the 1180 Ti of Volta before we reach that point. The medium-sized Volta won't do - it will probably be like what the GTX 980 was to the 780 Ti: perhaps a 10% improvement on a smaller die.

SLI does seem to work. For the cost of a custom loop, it might be a serious option:

https://forums.frontier.co.uk/showthread.php/365017-Does-Elite-Dangerous-support-SLI
https://forums.frontier.co.uk/showthread.php/366924-Elite-dangerous-in-SLI
https://forums.frontier.co.uk/showthread.php/383788-SLI-Support
Hopefully the issues have been resolved with the newer drivers, though. You may want to make your own thread and ask about issues.

Quote:


> Originally Posted by *pez*
> 
> Just 5%? Sounds like the TXp is just what you need at the cost of 0w (just ignore the price tag
> 
> 
> 
> 
> 
> 
> 
> ).


Considering he already owns a 1080Ti, a second one would serve better.

For the price of a $1200 USD Titan, you are probably better off getting a second 1080 Ti. Yes, not all games support SLI, but those that do will far outperform a Titan Xp.

Quote:


> Originally Posted by *TheRedViper*
> 
> Or just move to canada ?


Already there. Not a solution IMO. We get huge temperature variations depending on where you are in Canada.


----------



## KedarWolf

I'm wondering where exactly one gets a water chiller for a custom loop, and what kind of power requirements it has?

I have a massive Core X9 case and if it's not too huge would likely fit inside.


----------



## CYBER-SNIPA

Hailea make the best water chillers, which are adaptable for PC cooling;

http://www.hailea.com/e-hailea/product1/HC-500A.htm

They are an external unit tho, not sure anyone makes a compact unit to fit inside a PC case?

Which country do you live in?


----------



## KedarWolf

Quote:


> Originally Posted by *CYBER-SNIPA*
> 
> Hailea make the best water chillers, which are adaptable for PC cooling;
> 
> http://www.hailea.com/e-hailea/product1/HC-500A.htm
> 
> Which country do you live in?


Canada, USA power requirements.

http://www.performance-pcs.com/water-chillers/hot-hailea-hc-300a-110v-1-4hp-395watt-cooling-capacity-waterchiller.html#Specifications

Maybe?


----------



## CYBER-SNIPA

Have a read of this review dude, it may give you some insight;

https://www.bit-tech.net/reviews/tech/cooling/hailea-hc-500a-water-chiller-review/1/


----------



## 8051

Quote:


> Originally Posted by *mouacyk*
> 
> The XOC BIOS works great on this card. I have voltage curve profiles for [email protected], [email protected], and [email protected]


Doesn't this exceed the maximum recommended voltages for the 1080Ti? I seem to remember an Nvidia rep. saying 1080Ti's would die quickly at too high voltages.


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> Canada, USA power requirements.
> 
> http://www.performance-pcs.com/water-chillers/hot-hailea-hc-300a-110v-1-4hp-395watt-cooling-capacity-waterchiller.html#Specifications
> 
> Maybe?


Here's how my setup looks (with Chiller):



It just replaces the radiator in your loop. Actually a lot easier than you'd think. Your existing pump should be fine too.

Mine is only the 1/2 HP unit. But still features 750w of cooling power.

I've even modified it (in the past) to go sub-zero. Which it did.. and I was able to get water temp down to about -12c (absolute lowest I ever got her). But at temps lower than your daily dew-point you need to insulate all your components.



Normally _(like today)_ I just run it at +12c to +13c. Saves worrying about condensation. Most regular water-cooling loops stabilise around 27c (that's assuming a 21c ambient temp & 2x more radiator space than you actually _need_).

It's a bit like getting 2x the benefit of a de-lid; that's the best way I can describe it (for the CPU). And for the GPU you're never exceeding 28c (even under heavy load at 1.093v / 2176 MHz) with the chiller set to 12c. Standard TIM used between die & block; using LM would knock another 5c off _(at least)._

I grabbed mine 'used' from Ebay for only 200 bux. (Hailea HC 500-A)

You just need two adapters that enable connection into a standard PC loop. About 5 bux each.

Mayhems also do a coolant that works below 0c. (down to -49c I believe).
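The "daily dew point" mentioned above is easy to estimate yourself if you want to know how low you can chill before insulating. This is a sketch using the standard Magnus approximation (the coefficients are the commonly used values over water; the room figures are just examples):

```python
import math

def dew_point_c(temp_c, rel_humidity_pct):
    """Magnus approximation for the dew point in degrees Celsius."""
    a, b = 17.62, 243.12  # standard Magnus coefficients (over water)
    gamma = math.log(rel_humidity_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# A 21c room at 50% RH: condensation risk starts a little above 10c coolant,
# which is why a 12-13c chiller setpoint avoids insulation headaches.
print(round(dew_point_c(21.0, 50.0), 1))  # → 10.2
```

Anything below that dew point and moisture starts forming on the blocks and fittings, hence the insulation work for sub-zero runs.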


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Canada, USA power requirements.
> 
> http://www.performance-pcs.com/water-chillers/hot-hailea-hc-300a-110v-1-4hp-395watt-cooling-capacity-waterchiller.html#Specifications
> 
> Maybe?
> 
> 
> 
> Here's how my setup looks (with Chiller):
> 
> 
> 
> It just replaces the radiator in your loop. Actually a lot easier than you'd think. Your existing pump should be fine too.
> 
> Mine is only the 1/2 HP unit. But still features 750w of cooling power.
> 
> I've even modified it (in the past) to go sub-zero. Which it did.. and I was able to get water temp down to about -12c (absolute lowest I ever got her). But at temps lower than your daily dew-point you need to insulate all your components.
> 
> 
> 
> Normally _(like today)_ I just run it at +12c - +13c. Saves worrying about condensation. Most regular water-cooling loops stabilise around 27c. (that's assuming a 21c ambient temp & 2x more radiator space than you actually _need_).
> 
> It's a bit like getting 2x the benefits from a de-lid. That's best way I can probably describe it. (for CPU). And for GPU you're never exceeding 28c GPU temp (even under heavy load at 1.093v / 2176 Mhz). With Chiller set to 12c. Standard TIM used between DIE & block. Using LM would knock another 5c off _(at least)._
> 
> I grabbed mine 'used' from Ebay for only 200 bux. (Hailea HC 500-A)
> 
> You just need two adapters that enable connection into a standard PC loop. About 5 bux each.
> 
> Mayhems also do a coolant that works below 0c. (down to -49c I believe).
Click to expand...

Is this in North America?

I ask because of the power requirements.









I found 110v North American units.


----------



## KedarWolf

Quote:


> Originally Posted by *8051*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mouacyk*
> 
> The XOC BIOS works great on this card. I have voltage curve profiles for [email protected], [email protected], and [email protected]
> 
> 
> 
> Doesn't this exceed the maximum recommended voltages for the 1080Ti? I seem to remember an Nvidia rep. saying 1080Ti's would die quickly at too high voltages.
Click to expand...

I'd only use the higher voltages for benching runs, not 24/7 use.

But the reps are going to say things like that, since they want to avoid warranty issues.

I've read 1080 Ti's have really good voltage controllers, and your card is not going to die from that.

However, if you're not on custom water, you definitely don't want to run those voltages on air.

Then you are risking killing your card from excess heat.


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> Is this in North America?
> 
> I ask because of the power requirements.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I found 110v North American units.


Nah U.K m8.

But I do believe they sell them in North America too. (110v units).

I just found a U.K. seller who ships to North America and offers the 110v option. Still awfully expensive though. I'd keep an eye out for second-hand units if I were you (e.g. eBay).

The one good thing though, is even though they're expensive.. they do last a while.. think of how many years your fridge or freezer goes without any maintenance at all -- these things are pretty similar.

And you don't need to upgrade anything when you upgrade your hardware... it just fits into your existing loop instead of your radiator. So it could still be cooling your stuff happily 6-7, or even 10 years, from now. If you decide to use quick connect/disconnects and swap it in/out with a radiator (I.E. only use for benching) it could still be going strong in 15 years.









I've had mine running 24/7 for the past 2 years now.. With no sign of any wear & tear, yet. _(And remember mine was already 'used') _


----------



## KCDC

Quote:


> Originally Posted by *nrpeyton*
> 
> Nah U.K m8.
> 
> But I do believe they sell them in North America too. (110v units).
> 
> I just found a U.K seller who ships to North America and offers the 110v option. Still awfully expensive though. I'd keep an eye out for 2nd hand units if I were you (i.e. Ebay).
> 
> The one good thing though, is even though they're expensive.. they do last a while.. think of how many years your fridge or freezer goes without any maintenance at all -- these things are pretty similar.
> 
> And you don't need to upgrade anything when you upgrade your hardware... it just fits into your existing loop instead of your radiator. So it could still be cooling your stuff happily 6-7, or even 10 years, from now. If you decide to use quick connect/disconnects and swap it in/out with a radiator (I.E. only use for benching) it could still be going strong in 15 years.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've had mine running 24/7 for the past 2 years now.. With no sign of any wear & tear, yet. _(And remember mine was already 'used') _


Sorry if this was brought up already, but what's the dB level?


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Is this in North America?
> 
> I ask because of the power requirements.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I found 110v North American units.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nah U.K m8.
> 
> But I do believe they sell them in North America too. (110v units).
> 
> I just found a U.K seller who ships to North America and offers the 110v option. Still awfully expensive though. I'd keep an eye out for 2nd hand units if I were you (i.e. Ebay).
> 
> The one good thing though, is even though they're expensive.. they do last a while.. think of how many years your fridge or freezer goes without any maintenance at all -- these things are pretty similar.
> 
> And you don't need to upgrade anything when you upgrade your hardware... it just fits into your existing loop instead of your radiator. So it could still be cooling your stuff happily 6-7, or even 10 years, from now. If you decide to use quick connect/disconnects and swap it in/out with a radiator (I.E. only use for benching) it could still be going strong in 15 years.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've had mine running 24/7 for the past 2 years now.. With no sign of any wear & tear, yet. _(And remember mine was already 'used') _
Click to expand...

Trying to rep your last few posts on my phone on my way to a late-night MRI; it's not working on mobile Chrome.

I'll do it when I get home.


----------



## Hydroplane

Just ordered a Gigabyte Aorus 1080 Ti Waterforce WB. Gonna have all sorts of blinky lights in my case


----------



## KedarWolf

Quote:


> Originally Posted by *Hydroplane*
> 
> Just ordered a Gigabyte Aorus 1080 Ti Waterforce WB. Gonna have all sorts of blinky lights in my case


Nice!!!


----------



## Hydroplane

Quote:


> Originally Posted by *KedarWolf*
> 
> Nice!!!


Thanks







I plan to go SLI. Newegg had a 1 per customer limit right now and that was enough money spent for one night anyway, lol. Plus I don't have a motherboard or complete loop to attach it to yet, so no rush.

I will probably do a single color for the lights, similar to below, and use it as the single source of light in my case:


----------



## dtfkev

Hello all,

I'm just getting to overclock my card, first time using the curve editor.

However, I'm running into an issue where my card downclocks from 2088 to 2075, or after restarting sets itself to 2100.

What causes this generally?


----------



## KedarWolf

Quote:


> Originally Posted by *dtfkev*
> 
> Hello all,
> 
> I'm just getting to overclock my card, first time using the curve editor.
> 
> However, I'm running into an issue where my card downclocks from 2088 to 2075.
> 
> What causes this generally?


Temps too high or power limiting. It's normal unless you flash the XOC BIOS; then it won't do either.
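For context on why it lands on 2075 specifically: Pascal boost clocks move in roughly 13 MHz steps, so every throttle event drops the card exactly one bin rather than to an arbitrary frequency. A small sketch (the ~13 MHz step is the commonly reported Pascal bin size):

```python
# Pascal boost clocks step in ~13 MHz bins, so power/thermal throttling
# always lands on the next bin down rather than an arbitrary frequency.
BIN_MHZ = 13

def drop_bins(clock_mhz, bins):
    """Clock after the card backs off by `bins` boost bins."""
    return clock_mhz - bins * BIN_MHZ

print(drop_bins(2088, 1))  # → 2075, the one-bin drop described above
```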


----------



## dtfkev

Quote:


> Originally Posted by *KedarWolf*
> 
> Temps too high or power limiting. It's normal unless you flash the XOC BIOS, then it won't do either


It's on a water block; it never peaks above 45°.

I'm not familiar with BIOS flashing.

I'm just not sure how to go about getting a stable overclock when it never holds a constant frequency?


----------



## KedarWolf

Quote:


> Originally Posted by *dtfkev*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Temps too high or power limiting. It's normal unless you flash the XOC BIOS, then it won't do either
> 
> 
> 
> It's on a water block never peaks above 45*.
> 
> I'm not familiar with bios flashing.
> 
> I'm just not sure how to go about getting a stable overclock without it ever holding a constant frequency?
Click to expand...

Flash XOC BIOS using the guide on first page or lower voltages and slightly lower clocks.


----------



## KedarWolf

false,

On my phone, not sure why this posted. :/


----------



## dtfkev

Quote:


> Originally Posted by *KedarWolf*
> 
> Flash XOC BIOS using the guide on first page or lower voltages and slightly lower clocks.


Does the graph replot itself after each restart? I've noticed that I'll set everything to a set frequency and every time I restart it will be set to something completely different.

Using Afterburner beta 19.


----------



## KedarWolf

Quote:


> Originally Posted by *dtfkev*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Flash XOC BIOS using the guide on first page or lower voltages and slightly lower clocks.
> 
> 
> 
> Does the graph replot itself after each restart? I've noticed that I'll set everything to a set frequency and every time I restart it will be set to something completely different.
> 
> Using after burner beta 19.
Click to expand...

You need to tweak it a few times between reboots to get it to stick and stay on your curve.


----------



## dtfkev

Quote:


> Originally Posted by *KedarWolf*
> 
> You need to tweak it a few times between reboots to get it to stick and stay on your curve.


So by using the curve I'll have to reinput my overclock after each restart?


----------



## KedarWolf

Quote:


> Originally Posted by *dtfkev*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> You need to tweak it a few times between reboots to get it to stick and stay on your curve.
> 
> 
> 
> So by using the curve I'll have to reinput my overclock after each restart?
Click to expand...

After you set it a few times it should stay set.


----------



## dtfkev

Quote:


> Originally Posted by *KedarWolf*
> 
> After you set it s few times should stay set


Wondering if something is wrong then.

I just restarted a dozen times. Manually set 2050 at 1031mv each time. After every restart it would be set at 2062, 2075, 2025, 2088, 2011. Seems completely random.


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *dtfkev*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> You need to tweak it a few times between reboots to get it to stick and stay on your curve.
> 
> 
> 
> So by using the curve I'll have to reinput my overclock after each restart?
> 
> Click to expand...
> 
> After you set it s few times should stay set
Click to expand...

Try saving the settings to a profile and locking the profile.


----------



## dtfkev

Quote:


> Originally Posted by *KedarWolf*
> 
> Try saving the settings to a profile.and locking the profile.


Yup I am. I'm going to try and reinstall.

It'll even change when I set something during heaven. Just seems to pick a random frequency above or below whatever I set lol


----------



## Blameless

Quote:


> Originally Posted by *navjack27*
> 
> I never suggest DDU, I think it's trash and overkill and commonly breaks things


Very frequently overkill, but almost never harmful outside of user error.
Quote:


> Originally Posted by *dtfkev*
> 
> Yup I am. I'm going to try and reinstall.
> 
> It'll even change when I set something during heaven. Just seems to pick a random frequency above or below whatever I set lol


There are temperature cutoffs with hysteresis. You should only need to set the curve once, but you'll want to set it at peak temperature to ensure clocks are never below that point.

Also 1.31v is very high, and quite possibly dangerous at 45C.


----------



## dtfkev

Quote:


> Originally Posted by *Blameless*
> 
> Very frequently overkill, but almost never harmful outside of user error.
> There are temperature cut offs hysteresis. You should only need to set the curve once, but you'll want to set it at peak temperature to ensure clocks are never below that point.
> 
> Also 1.31v is very high, and quite possibly dangerous at 45C.


1031mv typo.

So you're saying that the amount of volts applied is based on load & temperature?

In the games I play the GPU never sees higher than 40°C; the Heaven benchmark puts it at 45°C.


----------



## LunaP

Quote:


> Originally Posted by *8051*
> 
> So you have SLI'd Titan X's? Or SLI'd 1080Ti's? Either way I imagine the heat output from your computer must be staggering.


Staggering? its pretty low yeah, and yes dual 1080tis and 1 980ti for accessory monitors. Room has been slightly cooler though now that u mention it. So another perk I guess !


----------



## SmoothR

Hi Guys,

I have one question:
I own the Zotac 1080TI Amp! Extreme Core Edition.
My problem is that the card reaches the power limit too early. I increased the limit to 120% and the core voltage to 100% with MSI Afterburner.
When I play e.g. The Witcher 3 in 4K, RivaTuner Statistics tells me that the power limit is reached, but I only get 110-114% as a maximum.
My Zotac 980 always hit 120%. I own a Corsair HX1000i, so I don't think that is the problem.
Thank you very much for your reply!


----------



## Blameless

Quote:


> Originally Posted by *dtfkev*
> 
> So you're saying that the amount of volts applied is based on load & temperature?


Voltage is based on load and clocks requested, while clocks are based on temperature and voltage, adjusted by your offsets.

Run a worst case load, at worst case temperatures, and set your curves for that. At lower temps, clocks will be a step or two higher at the same voltage, but you should never see lower.


----------



## dtfkev

Quote:


> Originally Posted by *Blameless*
> 
> Voltage is based on load and clocks requested, while clocks are based on temperature and voltage, adjusted by your offsets.
> 
> Run a worst case load, at worst case temperatures, and set your curves for that. At lower temps, clocks will be a step or two higher at the same voltage, but you should never see lower.


Very weird. Definitely seeing the opposite.

Might go back to just the standard offset overclock.

I cannot, for the life of me, get this thing to hold a constant frequency.

I've tried 2100-1999 @ 1031, doesn't matter what I set it underclocks and the curve I set gets moved. Every time.


----------



## Blameless

Quote:


> Originally Posted by *dtfkev*
> 
> Very weird. Definitely seeing the opposite.
> 
> Might go back to just the standard offset overclock.
> 
> I cannot, for the life of me, get this thing to hold a constant frequency.
> 
> I've tried 2100-1999 @ 1031, doesn't matter what I set it underclocks and the curve I set gets moved. Every time.


What's your current curve look like?

What happens if you reduce pump speed or shut off fans, while looping something like Superposition, until you get temps to stabilize around 45-50C, then without closing the test, manually set the curve you need for the clocks you want, apply & save clocks to a profile, then load it at a later point?


----------



## dtfkev

Quote:


> Originally Posted by *Blameless*
> 
> What's your current curve look like?
> 
> What happens if you reduce pump speed or shut off fans, while looping something like Superposition, until you get temps to stabilize around 45-50C, then without closing the test, manually set the curve you need for the clocks you want, apply & save clocks to a profile, then load it at a later point?


2062 @ 1031

I know my card is stable at 2100 @ 1093 based on my previous offset overclock.

However, using the curve method I can't get a constant frequency.

If I set 1999 it downclocks
If I set 2025 it downclocks
If I set 2037 it downclocks
If I set 2055 it downclocks
If I set 2088 it downclocks
If I set 2100 it downclocks

Also after setting any of the above, if I restart my computer the graph will either jump up or down to the next level and I'll have to manually reset it each time I open benchmark.


----------



## ZealotKi11er

The way you OC is just like Blameless said. First thing you do is find your offset OC; for me it was +190 on my 1080 Ti FE card. Then you put a load on the card that gets you to max temp and set a 15-50MHz lower offset clock; I set +175MHz. This sets the curve higher for all voltage values. Now target the voltage you want to run your card at under 100% load, increase just that offset to +190, and set everything after it lower.


----------



## dtfkev

Why is my card down clocking? How do I monitor if I'm hitting power limit throttling?


----------



## ZealotKi11er

Quote:


> Originally Posted by *dtfkev*
> 
> Why is my card down clocking? How do I monitor if I'm hitting power limit throttling?


GPU-Z and MSI AB will tell you the power limit. For overclocking you need to max out the power limit. The card will throttle depending on the voltage you use. The idea is to find a voltage that the power limit can sustain. For me its 1v at 120% power.


----------



## SmoothR

Just want to say that I tried the Arctic Storm BIOS, so the card should now have a 435w limit. The MSI Afterburner slider is at 120%. But in Ashes of the Singularity I get the message from RivaTuner: power limit at 103-105%... I don't understand what the issue is. The card throttles down to 1987mhz at the absolute minimum; normally the card runs at 2050-2063mhz.
Can somebody please help me understand why I can't get near the power limit? 105% is ca. 380 watts...


----------



## dtfkev

Quote:


> Originally Posted by *ZealotKi11er*
> 
> GPU-Z and MSI AB will tell you the power limit. For overclocking you need to max out the power limit. The card will throttle depending on the voltage you use. The idea is to find a voltage that the power limit can sustain. For me its 1v at 120% power.


I guess I'm not fully understanding something about this process.

It doesn't matter what frequency or voltage I set, the power limit goes from 0 to 1.

Can someone help me understand how the power limit works, and what to do about it?

Edit:

For ****s and giggles. I reset the curve to defaults. I set 2113mhz @ 1093mv. Core voltage %, Power limit %, and temperature limit maxed. Once heaven starts, it will power limit down to 2100. But it passes heaven, superposition etc without issues. Is this not considered stable because it's power limiting?


----------



## 8051

Quote:


> Originally Posted by *LunaP*
> 
> Staggering? its pretty low yeah, and yes dual 1080tis and 1 980ti for accessory monitors. Room has been slightly cooler though now that u mention it. So another perk I guess !


They used to make space heaters that had 1000 watts of heat output.


----------



## nrpeyton

Quote:


> Originally Posted by *dtfkev*
> 
> I guess I'm not fully understanding something about this process.
> 
> It doesn't matter what frequency or voltage I set, the power limit goes from 0 to 1.
> 
> Can someone help me understand how the power limit works, and what to do about it?
> 
> Edit:
> 
> For ****s and giggles. I reset the curve to defaults. I set 2113mhz @ 1093mv. Core voltage %, Power limit %, and temperature limit maxed. Once heaven starts, it will power limit down to 2100. But it passes heaven, superposition etc without issues. Is this not considered stable because it's power limiting?


If you reach the power limit, the card will move down to the next lowest voltage bin, and the frequency you have set on _that_ voltage bin becomes your new frequency. If you're still hitting the power target, it will move down to an even lower voltage bin, and again the frequency that corresponds to that voltage (which you have already set) becomes your new frequency.

It will also move up and down slightly (in 13mhz jumps -- also known as bins) depending on temperature. *There are at least 3 or 4 points between 25c and 90c* at which the frequency will drop a bin (13 MHz) due to temp.

I think it will try to simply drop a frequency bin first... then if it's still hitting limiters it will _then_ drop to the next lowest voltage, and so on, until it is within both temp & power limits.

Obviously this all happens on a second-by-second basis. You really need a way of monitoring it *second-to-second* (i.e. onscreen *INSIDE* the benchmark or game in *REAL TIME*).

Download HWiNFO64 and RivaTuner Statistics Server. Use the "OSD - RTSS" tab in HWiNFO64 to pass the info you want to RivaTuner. (RivaTuner is a bit like Fraps.)
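For anyone who finds the bin-dropping easier to follow as pseudocode, here's a toy model of the behaviour described above. This is purely an illustration, not NVIDIA's actual GPU Boost 3.0 algorithm (which isn't public); the curve values and temperature thresholds are hypothetical.

```python
BIN_MHZ = 13  # clocks move in ~13 MHz steps ("bins")

# hypothetical curve: voltage point -> frequency set on that point
curve = {1.093: 2100, 1.081: 2088, 1.075: 2075}

def settle(over_power, temp_c):
    """Walk down voltage points while the power limit is exceeded,
    then shave one 13 MHz bin per temperature threshold crossed."""
    points = sorted(curve.items(), reverse=True)  # highest voltage first
    volts, freq = points[-1]                      # worst-case fallback
    for volts, freq in points:
        if not over_power(volts):
            break
    bins_dropped = sum(temp_c >= t for t in (30, 45, 60, 75))
    return volts, freq - BIN_MHZ * bins_dropped

# Power limit hit at the top voltage point, cool card: drop one point.
print(settle(lambda v: v > 1.085, temp_c=25))  # (1.081, 2088)
```

The takeaway: the clock you see is always the frequency *you* set on whatever voltage point the limiters land on, minus any temperature bins.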


----------



## dtfkev

Quote:


> Originally Posted by *nrpeyton*
> 
> If you reach power limit. Then it will move down to the next lowest voltage bin. (And the frequency you have set on _that_ voltage bin becomes your new frequency). If you're still hitting power target then it will move down to an even lower voltage bin. And again the frequency that correlates to that voltage (which you have already set) becomes your new frequency.
> 
> It will also move up and down slightly (only 13mhz jumps -- also known as bins) depending on temperature. There are at least 3/4 points between 25c and 90c in which the frequency will drop a bin (13 MHz).
> 
> I think it will try to simply drop a frequency bin first.... then if still hitting limiters then it will then hit the next lowest voltage... and so on, and so forth. Until it is both within temp & power limits. Obviously this all happens on a second-by-second basis. You really need a way of monitoring it second-by-second (I.E. onscreen INSIDE the benchmark or game in REAL TIME). Download hwinfo64 and Riva Tuner Statics Server. Use the "OSD - RTSS" tab in hwinfo64 to pass the info you want to Riva Tuner. (Riva Tuner is a bit like fraps).


Thank you.

So any curve I set where the power limit is hit would be considered unstable, correct?

As it is, I can't get anything over the stock 1987 frequency to not power limit.

Dyno tuning cars is SO much easier lol


----------



## nrpeyton

Quote:


> Originally Posted by *dtfkev*
> 
> Thank you.
> 
> So any curve I set where the power limit is hit would be considered unstable, correct?


No.

The whole idea of a "curve" is that you are specifying a *different* frequency to use at *every* *possible* voltage point.

If it always stayed on the same voltage and frequency then it wouldn't be a curve. It would just be a traditional, simple, overclock.

The whole idea of a curve is that GPU Boost 3.0 will constantly be changing the voltage to keep it within Power & Temp limits.
So you might have the following:
1.093v -- 2050 MHZ.
1.081v -- 2037 MHZ
1.075v -- 2024 MHZ

You will also find that due to the 13 MHZ bins (or steps) the curve may attempt to "SNAP" into place so that values you set *are rounded up/down towards the nearest FREQUENCY BIN.*

So for example if you select 2038 MHZ it will try to "snap" to the 2037 MHZ bin. Make sure you are hitting APPLY in the main window after EACH & EVERY frequency/voltage point. (This will help you see how the curve will end up). _As it will only be changing one at a time instead of the entire curve moving!
_

ALSO: *(after you get the hang of the above)*
It will always automatically select the *lowest voltage it needs to attain the highest frequency.* So for example, if 1.093v & 1.081v *BOTH* have the *SAME* frequency, it will select the lower voltage (1.081v), as it knows you have given permission for a lower voltage to be used to attain that frequency in your curve. You may also find that the curve attempts to "snap" into place for any values beyond 1.081v *IF* the frequency you set for 1.081v is higher than any frequency already used at any higher voltage point. (In other words, you will end up with a straight horizontal line all the way to the end of your curve.)


----------



## dtfkev

Quote:


> Originally Posted by *nrpeyton*
> 
> No.
> 
> The whole idea of a "curve" is that you are specifying a frequency to use at every possible voltage point.
> 
> If it always stayed on the same voltage and frequency then it wouldn't be a curve. It would just be a traditional, simple, overclock.
> 
> The whole idea of a curve is that GPU Boost 3.0 will constantly be changing the voltage to keep it within Power & Temp limits.
> So you might have the following:
> 1.093v -- 2050 MHZ.
> 1.081v -- 2037 MHZ
> 1.075v -- 2024 MHZ
> 
> You will also find that due to the 13 MHZ bins (or steps) the curve may attempt to "SNAP" into place so that values you set are rounded up/down towards the nearest FREQUENCY BIN. So for example if you select 2038 MHZ it will try to "snap" to the 2037 MHZ bin. Make sure you are hitting APPLY in the main window after EACH & EVERY frequency/voltage point. (This will help you see how the curve will end up). As it will only be changing one at a time instead of the entire curve moving.
> 
> It will always automatically select the lowest voltage it needs to attain the highest frequency. So for example if for:
> 1.093v & 1.081v you select the same frequency. Then it will select the lowest voltage (1.081v) as it knows you have given permission for a lower voltage to be used for that frequency in your curve. You may also find that the curve attempts to "snap" into place for any values beyond 1.081v IF the frequency you set for 1.081v is higher than any frequency already used at any higher voltage point. (In other words you will end up with a straight horizontal line all the way to the end of your curve).


Thank you.

I guess I'm just as lost as can be.

So it's fine if my card is hitting the power limit?

Literally ANY frequency above stock 1987 at ANY voltage I set, downclocks.

I don't understand how you're supposed to test for stability or even find the maximum potential if it changes the values you've manually input.

Example:
Using any voltage from .993 to 1.093 I can manually set values, but it downclocks no matter what.
2113 downclocks to 2100
2100 downclocks to 2088
2088 downclocks to 2075
2075 downclocks to 2062
2062 downclocks to 2050
2050 downclocks to 2037
2037 downclocks to 2025
2025 downclocks to 2013
2013 downclocks to 1999
1999 downclocks to stock.

I know my card is stable and will run 2100mhz at 1093mv. In order to obtain this with the curve I need to set it to 2113mhz so it throttles down to 2100. If I manually set it at 2100 it will downclock to the next bin right off the bat at 2088.

Is that the correct way of obtaining my maximum clock at my maximum voltage? By setting the curve higher than what I want?

All the videos, guides, screenshots I've seen show people maintaining whatever clock they input - I can't seem to understand why my card throttles down with anything above the stock frequency of 1987.

Even following the guide on the first page. Setting 1999mhz at 1031 will downclock to stock 1987mhz.

I don't want my frequency and voltage jumping around all over the place while I'm gaming; it's causing issues. That's why I'm trying to get it to run a set frequency at a set voltage and not move.


----------



## Blameless

Quote:


> Originally Posted by *dtfkev*
> 
> It doesn't matter what frequency or voltage I set, the power limit goes from 0 to 1.


You are either hitting the limiter (1) or not (0). If you are hitting the limiter, the card will downclock, no matter what your other settings are or what the temperature is.

You need to increase the power limit slider, use firmware with a higher power limit, or reduce power consumption by lowering voltage and/or clocks.
Quote:


> Originally Posted by *dtfkev*
> 
> So it's fine if my card his hitting the power limit?


Not if you want it to hold its clock speed.


----------



## dtfkev

Quote:


> Originally Posted by *Blameless*
> 
> You are either hitting the limiter (1) or not (0), if you are hitting the limiter, the card will downclock, no matter what your other settings are or what the temperature is.
> 
> You need to increase the power limit slider, use firmware with a higher power limit, or reduce power consumption by lowering voltage and/or clocks.
> Not if you want it to hold it's clock speed.


I'm hitting the limiter (1) with anything over the stock frequency of 1987mhz, from 0.993v to 1.093v.

Voltage, power limit & temp limit sliders are set to max.


----------



## nrpeyton

Quote:


> Originally Posted by *dtfkev*
> 
> Thank you.
> 
> I guess I'm just as lost as can be.
> 
> So it's fine if my card his hitting the power limit?
> 
> Literally ANY frequency above stock 1987 at ANY voltage I set, downclocks.
> 
> I don't understand how you're supposed to test for stability or even find the maximum potential if it changes the values you've manually input.
> 
> Example:
> Using any voltage from .993 to 1.093 I can manually set values, but it downclocks no matter what.
> 2113 downclocks to 2100
> 2100 downclocks to 2088
> 2088 downclocks to 2075
> 2075 downclocks to 2062
> 2062 downclocks to 2050
> 2050 downclocks to 2037
> 2037 downclocks to 2025
> 2025 downclocks to 2013
> 2013 downlocks to 1999
> 1999 downclocks to stock voltages.
> 
> I know my card is stable and will run 2100mhz at 1093mv. In order to obtain this with the curve I need to set it to 2113mhz so it throttles down to 2100. If I manually set it at 2100 it will downclock to the next bin right off the bat at 2088.
> 
> Is that the correct way of obtaining my maximum clock at my maximum voltage? By setting the curve higher than what I want?
> 
> All the videos, guides, screenshots I've seen show people maintaining whatever clock they input - I can't seem to understand why my card throttles down with anything above the stock frequency of 1987.
> 
> Even following the guide on the first page. Setting 1999mhz at 1031 will downclock to stock 1987mhz.


That is *normal* behaviour. You don't need to re-install.

If 2113 @ 1.093v is +156 _(12 x 13mhz bins)_, then yes, it may down-clock by 1 frequency BIN to 2100MHZ when you hit the various temp limiters. *But it is STILL +156. It's just +156 with a -13mhz offset applied as it hits the temp limiter.*

*How to test for stability?*
It's only moving by 1 bin; when stability testing you need to account for this. The chances are it will ALWAYS drop that bin as the temps increase, so it's still consistent.

Only way to minimise how much it jumps about (due to temp) is to get water cooling. It will still move slightly, but not as much.

If you also hit a power limit then the voltage will also drop too. (And your frequency will change to the corresponding frequency for THAT voltage point).


----------



## dtfkev

Quote:


> Originally Posted by *nrpeyton*
> 
> That is *normal* behaviour. You don't need to re-install.
> 
> If 2113 @ 1.093v is +156 _(12 x 13mhz bins)_ Then yes it may down-clock by 1 frequency BIN to 2100MHZ when you hit the various temp limiters. _*But it is STILL a +156. It's just a +156 with a -13mhz offset applied as it hits the temp limiter.
> *_
> 
> How to test for stability?
> It's only moving by 1 bin. When stability testing you need to account for this. The chances are it will ALWAYS drop that bin as the temps increase. So it's still consistent.
> 
> Only way to minimise how much it jumps about (due to temp) is get water cooling. It will still move slightly.. but not as much.
> 
> If you also hit a power limit then the voltage will also drop too. (And your frequency will change to the corresponding frequency for THAT voltage point).


I'm on a full loop. Temps during Heaven are a constant 44-45°C; gaming never breaks 40°C.

I'm hitting the limiter (1) with anything over the stock frequency of 1987mhz, from 0.993v to 1.093v.

Voltage, power limit & temp limit sliders are set to max.

That is normal?


----------



## nrpeyton

Quote:


> Originally Posted by *dtfkev*
> 
> I'm on a full loop. Temps during heaven are a constant 44-45*, gaming never breaks 40*.
> 
> I'm hitting the limiter (1) with anything over stock frequency of 1987mhz from .993mv to 1093mv
> 
> Voltage, power limit & temp limit sliders are set to max.
> 
> That is normal?


Yes, completely normal.

Okay, on water it will probably only drop by 1 bin, then (_*maybe*_ 2).

People on air move about even more; on air it will jump 13mhz two, three or even four times. You're lucky, as you may only need to deal with it jumping once or twice _(at most)_.

I'm not sure exactly where the temp bins are, but I think there is one at about 30c, another at about 45c, another at 60, another at 75 and another at 90, etc.


----------



## Blameless

Quote:


> Originally Posted by *dtfkev*
> 
> I'm hitting the limiter (1) with anything over stock frequency of 1987mhz from .993mv to 1093mv
> 
> Voltage, power limit & temp limit sliders are set to max.


That's unusual. Even reference cards with reference firmware shouldn't be throttling much at 0.993v without really pushing clocks.

What program are you using for testing (Unigine Heaven)? There will always be some extreme outliers that aren't worth accounting for when it comes to the power limit (at least if you're mostly concerned with gaming or benching), but you still want something on the demanding side to produce a fairly representative worst-case gaming load.

Unigine Heaven is demanding, but it normally should not be inducing power throttling at ~2GHz/0.993v with a maxed out power limit slider.

Also, what brand/model card is this and what firmware are you using?
Quote:


> Originally Posted by *nrpeyton*
> 
> Yes. completely normal..
> 
> Okay on water it will probably only drop by 1 bin, then. (_*Maybe*_ 2).
> 
> People on air move about even more..... people on air it will jump 13mhz, 2 or 3 times. You're lucky as you may only need to deal with it jumping once or twice_(at most)_.


He's calibrating the slope for the highest temps the card ever reaches, so it would never be below those clocks under load if it weren't for that power limit.


----------



## nrpeyton

The power limit for reference designs is completely and utterly pathetic. Nvidia were far too conservative here.

Even with power limiter at 120% -- most modern games will still hit the limit (and downclock) even at stock.

When you overclock you are going to hit that even more.

I dealt with mine by carrying out a physical "shunt mod" on my card's circuit board, to fool the card into thinking it is drawing less power than it really is.

I get about an extra 25% from the mod (over and above the extra 20% you get from the slider in MSI Afterburner).

The mod allows me to run most games without power throttling... however in some very intensive games and very intensive benchmarks, on SOME scenes, I do still sometimes power throttle a LITTLE _(although it is rare)._


----------



## dtfkev

Quote:


> Originally Posted by *nrpeyton*
> 
> Yes. completely normal..


Quote:


> Originally Posted by *Blameless*
> 
> That's unusual..


Well, uh lol

I'm using MSI Afterburner 4.4 beta 19.

Unigine Heaven

Asus Strix GTX 1080Ti OC

Stock firmware.
Quote:


> Originally Posted by *nrpeyton*
> 
> The power limit for reference designs is completley and utterly pathetic.. Nvidia were far too conservative here.
> 
> I dealt with mine by carrying out a physical "shunt mod" on my cards circuit board.


I'm not looking to flash a bios or mod the card. Just looking for a stable OC that doesn't jump around while gaming.


----------



## nrpeyton

Quote:


> Originally Posted by *dtfkev*
> 
> Well, uh lol
> 
> I'm using MSI Afterburner 4.4 beta 19.
> 
> Unigine Heaven
> 
> Asus Strix GTX 1080Ti OC
> 
> Stock firmware.
> I'm not looking to flash a bios or mod the card. Just looking for a stable OC that doesn't jump around while gaming.


If you have the STRIX then you could download and install the XOC BIOS (firmware). It's completely compatible with your card... and it REMOVES the power limit.
Completely!

*Be aware though!* That it also removes all temp limits! And if your pump or fans failed -- then your card "could" burn up before it throttled!!

The XOC firmware also allows extra voltage (on your curve), up to 1.2v (so 107 millivolts extra, above the usual 1.093v limit). Going over 1.093v usually isn't recommended: if you run high current and high voltage for prolonged periods of time your GPU may degrade faster.

The up-side is that the increased voltage may allow you to attain higher frequencies that weren't possible at 1.093v _(the default maximum when 100% voltage is applied on the slider in MSI AB)._


----------



## BigBeard86

Did anyone go from a 1080 to a 1080 Ti at 1440p 165hz? Do you feel it was worth it?


----------



## Blameless

Quote:


> Originally Posted by *nrpeyton*
> 
> The power limit for reference designs is completley and utterly pathetic.. Nvidia were far too conservative here.


Reference FE power limit is 250w, or 300w with +20% on the slider.

This is pretty low, but at ~45C, 0.993v and sub 2GHz, only an unusually leaky sample should be hitting 300w in Heaven.
Quote:


> Originally Posted by *dtfkev*
> 
> Well, uh lol
> 
> I'm using MSI Afterburner 4.4 beta 19.
> 
> Unigine Heaven
> 
> Asus Strix GTX 1080Ti OC
> 
> Stock firmware.
> I'm not looking to flash a bios or mod the card. Just looking for a stable OC that doesn't jump around while gaming.


I'm not sure what the default power limit of the Strix is.

Are you sure the slider is working? Is the actual power % reported by MSI AB hitting the ~120% range with the slider at 120%?
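For reference, the slider arithmetic quoted above (the percentage applies to the card's base power limit; 250w is the reference FE figure from this post):

```python
def slider_watts(base_w, slider_pct):
    """Board power target for a given Afterburner slider percentage."""
    return base_w * slider_pct / 100

print(slider_watts(250, 120))  # 300.0 -- the maxed-out FE limit
print(slider_watts(250, 103))  # 257.5 -- what a '103%' reading means
```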


----------



## nrpeyton

Quote:


> Originally Posted by *Blameless*
> 
> Reference FE power limit is 250w, or 300w with +20% on the slider.
> 
> This is pretty low, but at ~45C, 0.993v and sub 2GHz, only an unusually leaky sample should be hitting 300w in Heaven.
> I'm not sure what the default power limit of the Strix is.
> 
> Are you sure the slider is working? Is the actual power % reported by MSI AB hitting the ~120% range with the slider at 120%?


According to the GPU BIOS (firmware) database at techpowerup.com, the STRIX OC has the same power limits as a reference design: 250 watts, or 300 with the slider at 120% in MSI AB.

If I owned a STRIX -- I'd download and install the XOC firmware. _(XOC = e*X*treme *O*ver *C*lock)._


----------



## dtfkev

Quote:


> Originally Posted by *nrpeyton*
> 
> The power limit for reference designs is completley and utterly pathetic.. Nvidia were far too conservative here.
> 
> I dealt with mine by carrying out a physical "shunt mod" on my cards circuit board.


Quote:


> Originally Posted by *Blameless*
> 
> Reference FE power limit is 250w, or 300w with +20% on the slider.
> 
> This is pretty low, but at ~45C, 0.993v and sub 2GHz, only an unusually leaky sample should be hitting 300w in Heaven.
> I'm not sure what the default power limit of the Strix is.
> 
> Are you sure the slider is working? Is the actual power % reported by MSI AB hitting the ~120% range with the slider at 120%?


You might be onto something there.

Looks like power % is staying between 99-103%


----------



## Blameless

Quote:


> Originally Posted by *nrpeyton*
> 
> If I owned a STRIX -- I'd download and install the XOC firmware.


So would I, but dtfkev seems to be hesitant to flash firmware and there does seem to be another unresolved issue at work.
Quote:


> Originally Posted by *dtfkev*
> 
> Looks like power % is staying between 99-103%


Outside of a few apps specifically limited further, you should be coming within a few percent of the set power limit, if not actually reaching or exceeding it, before seeing any throttling. Doesn't look like the slider is being applied.

Is your "unlock voltage control" setting in MSI AB's options set to third party?


----------



## nrpeyton

Try this: _(if you own the STRIX OC)_

StrixOC330W.zip 0k .zip file
 _File courtesy of the OP on this thread by *KedarWolf*_
Extract and run the .bat file.

Sets power limit at 330w

If you hit "apply" in MSI Afterburner with the slider at anything other than maximum, you may need to re-run the .bat file to re-apply the setting.

This is also useful if there is a bug and MSI Afterburner isn't setting the correct, max power target!

You can open the .bat file with notepad and change the value to any other wattage you want. (But it won't work beyond the hard-coded maximum of your individual firmware ). So setting 700W isn't going to work lol.
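To illustrate the clamping behaviour described above, here's a minimal sketch (the firmware ceiling value is a made-up example; the real maximum depends on your card's BIOS):

```python
# Sketch: a requested power target gets clamped to the firmware's
# hard-coded maximum, which is why asking for 700 W does nothing extra.
# The cap below is a hypothetical example, not the Strix OC's real limit.
FIRMWARE_MAX_W = 366  # hypothetical hard-coded ceiling

def effective_power_limit(requested_w: int) -> int:
    """Return the power limit the card will actually apply."""
    return min(requested_w, FIRMWARE_MAX_W)

print(effective_power_limit(330))  # 330 -> applied as-is
print(effective_power_limit(700))  # clamped to 366
```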


----------



## NBrock

Did a quick run of SuperPosition 4K optimized. Unfortunately my cards don't want to do higher than 2063/6000, but it still scored pretty well. CPU is a 5775C @ 4.2 core and 3.8 cache, with RAM at 2133 C9.


----------



## dtfkev

Quote:


> Originally Posted by *Blameless*
> 
> So would I, but dtfkev seems to be hesitant to flash firmware and there does seem to be another unresolved issue at work.
> Outside of a few apps specifically limited further, you should be coming within a few percent of the set power limit, if not actually reaching or exceeding it, before seeing any throttling. Doesn't look like the slider is being applied.
> 
> Is your "unlock voltage control" setting in MSI AB's options set to third party?


Yes it is set on third party.

Quote:


> Originally Posted by *nrpeyton*
> 
> Try this: _(if you own the STRIX OC)_
> 
> StrixOC330W.zip 0k .zip file
> _File courtesy of the OP on this thread by KedarWolf_
> Extract and run the .bat file.
> 
> Sets power limit at 330w
> 
> If you hit "apply" on MSI Afterburner with the slider at anything other than maximum you may need to re-run the .bat file again. To re-apply the setting.
> 
> This is also useful if there is a bug and MSI Afterburner isn't setting the correct, max power target!
> 
> You can open the .bat file with notepad and change the value to any other wattage you want. (But it won't work beyond the hard-coded maximum of your individual firmware ). So setting 700W isn't going to work lol.


Thank you, I'll give it a look over.

Hoping the card doesn't have an issue now with the warranty being void because of water block.


----------



## nrpeyton

Quote:


> Originally Posted by *dtfkev*
> 
> Yes it is set on third party.
> Thank you, I'll give it a look over.
> 
> Hoping the card doesn't have an issue now with the warranty being void because of water block.


I think you're jumping the gun, lol.

Honestly -- I think once you get used to everything you'll be OKAY.

This new GPU BOOST 3.0 is *so* fundamentally different to the way overclocking/boosting worked before PASCAL.


----------



## dtfkev

Quote:


> Originally Posted by *nrpeyton*
> 
> I think you're jumping the gun, lol.
> 
> Honestly -- I think once you get used to everything you'll be OKAY.
> 
> This new GPU BOOST 3.0 is *so* fundamentally different to the way overclocking/boosting worked before PASCAL.


Power still staying at 100% after applying the file.

It said it switched from 330W to 330W, so assuming it was set correctly.


----------



## nrpeyton

Quote:


> Originally Posted by *dtfkev*
> 
> Power still staying at 100% after applying the file.
> 
> It said it switched from 330W to 330W, so assuming it was set correctly.


Have you downloaded hwinfo64 and Riva Tuner? Together they let you watch your power draw (in watts) in REAL-TIME. Second-by-second. In-game. Or in-benchmark.

Right now, looking at hwinfo64 in Windows, I can see my card is only consuming about 12 watts (as the card is just idling on the desktop).

If I fired up a game it would look like this:



Power draw in above game at time of screenshot = 321 watts


----------



## dtfkev

Quote:


> Originally Posted by *nrpeyton*
> 
> Have you downloaded hwinfo64 and Riva Tuner? It allows you to watch your power draw (in watts) in REAL-TIME. Second-by-second. In-game. Or in-benchmark.
> 
> Just now I know by looking at hwinfo64 (in Windows) my card is only consuming 12 watts (approx. as card is only idling in windows).


Looks like it's showing between 260-280w with heaven running.


----------



## nrpeyton

Quote:


> Originally Posted by *dtfkev*
> 
> Looks like it's showing between 260-280w with heaven running.


Why not try including frequency and voltage in the info sent to the on-screen display, and watch how the curve you're using affects how it boosts in-game? (If you also send GPU temp info, you may notice that the card down-clocks to a lower bin at the exact same temperature every time.)

Once you get used to that -- you'll be more comfortable.


----------



## dtfkev

Quote:


> Originally Posted by *nrpeyton*
> 
> Why don't you try including frequency and voltage in the info that it sends to the on-screen-display. And watch and see how the curve you're using is affecting how it boosts in-game? (You may also notice if you also send GPU Temp info too -- that the card will down-clock to a lower bin at the exact same temperature every time).
> 
> Once you get used to that -- you'll be more comfortable.


I have all that enabled. Looks like my card isn't getting the full 330W.


----------



## nrpeyton

Quote:


> Originally Posted by *dtfkev*
> 
> I have all that enabled. Looks like my card isn't getting the full 330W.


Try a more intensive application.

Try the Superposition benchmark at the highest or 2nd highest settings, and let us know. (I know for a fact Superposition is a high-current/wattage benchmark.)

Also, under the "Maximum" column in hwinfo64, the "Power" row gives you the MAX drawn by the card since starting hwinfo64. (Sometimes this is useful, as it updates faster than the eye can keep up.) If you close hwinfo64, everything is reset.
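That "Maximum" column behaviour is just a running maximum over the samples taken since the monitor started (and it resets when you close it). A toy model, with invented readings:

```python
# Toy model of hwinfo64's "Maximum" column: keep the highest power
# reading seen since the monitor started; closing the monitor resets it.
# The per-second sample values below are invented for illustration.
readings_w = [245, 312, 298, 320, 301]

running_max = 0
for w in readings_w:
    running_max = max(running_max, w)

print(running_max)  # 320
```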


----------



## dtfkev

Quote:


> Originally Posted by *nrpeyton*
> 
> Try a more intensive application..
> 
> Try the Superposition benchmark at highest or 2nd highest settings. And let us know. (I know for a fact Superposition definitely is a high-current / wattage benchmark).
> 
> Also, under the "Maximum" column in hwinfo64, the "Power" row should give you the MAX ever drawn from the card. (Sometimes this is useful as it updates faster than the eyes can keep up). If you close hwinfo64 everything is reset.


Max from superposition was 320W.

Suppose I'll have to look into flashing the bios now lol


----------



## nrpeyton

Quote:


> Originally Posted by *dtfkev*
> 
> Max from superposition was 320W


So it's looking more like there is nothing wrong with your card then?









Try a higher voltage and run again... you may notice it jumps up (sometimes even *slightly* beyond the maximum).


----------



## dtfkev

Quote:


> Originally Posted by *nrpeyton*
> 
> So it's looking more like there is nothing wrong with your card then?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Try a higher voltage. And run, again.. you may notice it jumps up (sometimes even *slightly* beyond the maximum).


that was at 1093mv.


----------



## nrpeyton

Quote:


> Originally Posted by *dtfkev*
> 
> that was at 1093mv.


Okay cool.

4k optimised test, aye?

In the 8k optimised test, power draw was *so high* that my card even downclocked all the way to *0.933v* for a few seconds (completely annihilating my overclock lol). I don't even have the frequency at 0.933v overclocked _(it's at default)._

Max power draw though was 426 watts. (Bear in mind, though: I have the *shunt* *mod*.)

In 4k optimised test, max was 377 watts.

.
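A note on why shunt-modded numbers need interpreting: the common mod solders a resistor of equal value in parallel with each current-sense shunt, so the controller under-reads the true current. A sketch, assuming the often-described setup of a 5 mOhm shunt with an equal 5 mOhm resistor on top (values are assumptions, not measured from any particular card):

```python
# With a parallel resistor across a current-sense shunt, the sensed
# resistance drops, so the controller under-reads real power draw.
# Values assume the commonly described mod: 5 mOhm shunt + 5 mOhm parallel.
shunt = 0.005       # original sense resistor (ohms)
parallel = 0.005    # resistor soldered on top (ohms)

sensed = (shunt * parallel) / (shunt + parallel)  # 2.5 mOhm
ratio = sensed / shunt                            # controller sees 0.5x

reported_w = 213
actual_w = reported_w / ratio
print(actual_w)  # 426.0 -- real draw is double what the card reports
```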


----------



## dtfkev

Quote:


> Originally Posted by *nrpeyton*
> 
> Okay cool.
> 
> 4k optimised test, aye?
> 
> In the 8k optimised test, power draw was *so high*-- that my card even downclocked all the way to *0.933v* for a few seconds. (completely annihilating my overclock lol). I don't even have the frequency at 0.933v overclocked. _(it's at default)._
> 
> Max Power draw though was: 426 watts. (bear in mind I have the shunt mod).
> 
> In 4k optimised test, max was 377 watts.
> .


Yup, 4K optimized.

I'm trying to follow the guide to flash the BIOS, but it's in no logical order lol


----------



## nrpeyton

Quote:


> Originally Posted by *dtfkev*
> 
> Yup, 4K optimized.
> 
> I'm trying to follow the guide to flash bios, but it's in no logical order lol


Okay well if you need a hand give us a wee shout here.

Either me, or someone else -- will be more than happy to help.

Just remember to disable the graphics display driver (in Device Manager) before you start the update. (It's safer this way.) When you disable the display driver you may find your desktop looks weird, as the resolution will drop very low.


----------



## dtfkev

Quote:


> Originally Posted by *nrpeyton*
> 
> Okay well if you need a hand give us a wee shout here.
> 
> Either me, or someone else -- will be more than happy to help.
> 
> Just remember to disable the graphics display driver (in device manager) before you start the update. (It's safer this way). When you disable the display driver you may find your desktop looks weird as resolution will go so low.


I can only see the option to disable the device, not just the driver within device manager

So if I'm following this correctly:

download and extract nvflash to c drive
download and extract bios collection to nvflash folder on c drive
run cmd in admin
turn protection off
backup current bios
flash new bios (assuming the strixocnopowerlimit bios?)
restart
install driver
restart


----------



## nrpeyton

Quote:


> Originally Posted by *dtfkev*
> 
> I can only see the option to disable the device, not just the driver within device manager
> 
> So if I'm following this correctly:
> 
> download and extract nvflash to c drive
> download and extract bios collection to nvflash folder on c drive
> run cmd in admin
> turn protection off
> backup current bios
> flash new bios (assuming the strixocnopowerlimit bios?)
> restart
> install driver
> restart


Yes. Make sure you get the correct version of nvflash.

Some people make the mistake of accidentally downloading the "certificates bypassed" version. Don't!
That version won't work with an official "signed" BIOS like the XOC.

(The XOC *has* the ASUS "stamp".)
They made it for their LN2 overclocking team. A guy on that team was nice enough to release the file to us.
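For reference, the checklist dtfkev posted maps onto an nvflash session roughly like this. Treat it as pseudocode: flag spellings vary between nvflash builds, and the folder and BIOS filenames here are hypothetical, so check your version's help output first.

```
cd C:\nvflash                 :: folder containing nvflash64 and the BIOS file
nvflash64 --protectoff        :: turn off the EEPROM write protection
nvflash64 --save backup.rom   :: back up your current BIOS first!
nvflash64 -6 strix_xoc.rom    :: flash the new BIOS (confirm when prompted)
:: reboot, reinstall/re-enable the display driver, reboot again
```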


----------



## dtfkev

Quote:


> Originally Posted by *nrpeyton*
> 
> Yes. Make sure you get the correct version of nvflash.
> 
> Some people make the mistake of accidentally downloading the "certificates bypassed" version. Don't!
> That version won't work with an official "signed" BIOS like the XOC.
> 
> (The XOC *has* the ASUS "stamp").
> They made it for their LN2 overclock team. A guy in that team was nice enough to release the file to us


I downloaded the version linked in the guide so hopefully I'm okay. Just reinstalled nvidia drivers. Gonna give this a try!


----------



## nrpeyton

Quote:


> Originally Posted by *dtfkev*
> 
> I downloaded the version linked in the guide so hopefully I'm okay. Just reinstalled nvidia drivers. Gonna give this a try!


Okay first thing you'll notice in MSI AB is the power slider becomes unavailable _(I think - from memory)._


----------



## dtfkev

Quote:


> Originally Posted by *nrpeyton*
> 
> Okay first thing you'll notice in MSI AB is the power slider becomes unavailable _(I think - from memory)._


Yup.

Any dangers running this bios?

Will it still stay under 1093mv?


----------



## nrpeyton

Quote:


> Originally Posted by *dtfkev*
> 
> Yup.
> 
> Any dangers running this bios?
> 
> Will it still stay under 1093mv?


If you want to stay under 1.093v then make sure that any voltages that are higher than 1.093v don't have a higher frequency than the frequency attached to 1.093v.
That way it will never try to reach a higher frequency by selecting a higher voltage.

Another way of looking at it is: You should just have a straight horizontal line from 1.093v all the way up to the end of the curve.

Like this:



Example:
If you have a higher frequency set for 1.15v (for example) then it will boost to that higher frequency at 1.15v.

Max for the XOC is 1.2v I believe.

It is safe as long as you keep voltage to a max of 1.093v. (And monitor your temps in case your pump/fans fail).

Small disclaimer may be that if you continually put high current through it 24/7 it would obviously still degrade faster than keeping to normal power limits.

Regular gaming a few hrs a day (and being sensible with temps) and you'll be fine.
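The "straight horizontal line from 1.093v" advice amounts to clamping every curve point above your chosen ceiling to the frequency at that ceiling. A sketch with invented curve values:

```python
# Flatten an MSI Afterburner-style V/F curve so no point above the chosen
# voltage ceiling offers a higher frequency than the ceiling point itself.
# The curve values below are invented for illustration.
curve = {0.993: 1987, 1.050: 2050, 1.093: 2100, 1.150: 2150, 1.200: 2164}
ceiling_v = 1.093

cap_mhz = curve[ceiling_v]
flattened = {v: (min(f, cap_mhz) if v > ceiling_v else f)
             for v, f in curve.items()}

print(flattened[1.200])  # 2100 -- boosting voltage now gains no frequency
```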


----------



## dtfkev

Awesome, thanks a ton!

Rock solid 2100 @ 1093. No downclocking, 45*







EDIT: I lied, it started downclocking while gaming









Next question,

I'm going to format Windows (only because of the number of BSODs I've had the past few weeks trying to dial in CPU & GPU clocks).

Assuming I'll just have to download drivers and Afterburner, reapply settings, and I'll be golden?


----------



## nrpeyton

Quote:


> Originally Posted by *dtfkev*
> 
> Awesome, thanks a ton!
> 
> Rock solid 2100 @ 1093. No downclocking, 45*


enjoy


----------



## nrpeyton

Quote:


> Originally Posted by *dtfkev*
> 
> Awesome, thanks a ton!
> 
> Rock solid 2100 @ 1093. No downclocking, 45*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Next question,
> 
> I'm going to format Windows (only because of the number of BSODs I've had the past few weeks trying to dial in CPU & GPU clocks)
> 
> Assuming I'll just have to download drivers and afterburner and just reapply settings and i'll be golden?


Aye, but you *won't* need to flash the BIOS of the GPU again (the firmware is stored on the card).


----------



## dtfkev

Quote:


> Originally Posted by *nrpeyton*
> 
> Aye, but you *won't* need to flash the BIOS of the GPU again (the firmware is stored on the card).


Excellent!

Looks like I lied though, it's downclocking after a bit of gaming









The power limit bar is disabled and it was at 1100mv when it first ran, so I'm pretty sure I flashed it right.

I thought this BIOS removed the power limiter? I'm still seeing a power limit flag of (1) in Afterburner.


----------



## KedarWolf

Quote:


> Originally Posted by *dtfkev*
> 
> Awesome, thanks a ton!
> 
> Rock solid 2100 @ 1093. No downclocking, 45*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Next question,
> 
> I'm going to format Windows (only because of the number of BSODs I've had the past few weeks trying to dial in CPU & GPU clocks)
> 
> Assuming I'll just have to download drivers and afterburner and just reapply settings and i'll be golden?


I PM'd you links to the official Windows 10 ISOs for a clean install of the latest version of Windows 10.

You might need them; the links include the SHA-1s as well, so no need to worry that they've been altered in any way.


----------



## nrpeyton

Quote:


> Originally Posted by *dtfkev*
> 
> Excellent!
> 
> Looks like I lied though, it's downclocking after a bit of gaming
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The power limit bar is disabled and it was at 1100mv when first ran so I'm pretty sure I flashed it right.
> 
> Thought this BIOS prevented power limiter? Still have a power limit of (1) in afterburner


I honestly thought the XOC removed "all" power/temp limits.

*Not sure -- can anyone with more experience with the XOC firmware/BIOS shed any light on this, please?*

(Was it just one bin it down-clocked? Was it at a specific temperature? _Does it always do it upon reaching *that* temperature?_)
...can you repeat the down-clock by doing something again?

Just a *theory:* _maybe the middle temperature limits (i.e. between 35c and 80c) are still in place, and it's only the extreme low and extreme high (i.e. 0c and 95c) that have been removed?_

And yes -- if it hadn't been flashed correctly it wouldn't be working at all.

So I'm guessing this must just be normal behaviour for the XOC.... there are honestly so many details and intricacies to GPU Boost 3.0 --

--even more when you flash firmware that removes limits that were never *meant* to be removed by Nvidia's design... maybe sometimes it just doesn't behave *exactly* as one would expect? (I'm sure there are ways around it and ways to make it "play".)


----------



## KedarWolf

If anyone is ordering Fujipoly thermal pads, they're cheaper on www.amazon.com than on, for example, www.amazon.ca. Google "Amazon Global" and order from there, or do the same for anything else you need from amazon.com if you are not in the USA.

Shipping is much cheaper; I saved $20 on shipping by doing it this way.









Here's a direct link to it.

https://www.amazon.com/s/ref=nb_sb_noss?url=search-alias%3Daps&field-keywords=global


----------



## KedarWolf

Do the thermal pads on the backplate of my EK water block serve any actual purpose, and is there any point in replacing them?









Edit: Yeah, they passively cool the back of the VRMs.









I'm sure I ordered enough 17.0 W/mK pads to replace them as well.









And work has powder-free latex gloves to use to apply them. Oil from your hands is a bad thing on pads.


----------



## Gunslinger.

Quote:


> Originally Posted by *nrpeyton*
> 
> I honestly thought the XOC removed "all" power/temp limits.
> 
> *Not sure -- can anyone with more experience with the XOC firmware/BIOS shed any light on this, please?*
> 
> (Was it just one bin it down-clocked)? Was it at a specific temperature? _(Does it always do it upon reaching *that* temperature)*?*_
> ...can you repeat the down-clock by doing something again?
> 
> Just a *theory:* _maybe the middle temperature limits (I.E. between 35c and 80c) are still in place.. and it's only the *e*xtreme *l*ow and *e*xtreme *h*igh_
> _(I.E. 0c and 95c that have been removed)_*?*
> 
> And yes -- if it hadn't been flashed correctly it wouldn't be working at all.
> 
> So I'm guessing this must just be normal behaviour for the XOC.... there are honestly so many details and intricacies to GPU Boost 3.0 --
> 
> --even more when you flash 'limit-removing' firmware that was never "meant" to be removed by nvidia design.... maybe sometimes it just doesn't behave *exactly* as one would expect? (I'm sure there are ways around it and ways to make it "play").


The XOC BIOS does *NOT* remove the power limit, sadly (speaking of the XOC Lightning BIOS).


----------



## RoBiK

Quote:


> Originally Posted by *Gunslinger.*
> 
> XOC bios does *NOT* remove the power limit sadly (speaking on the XOC Lightning bios)


That is not the XOC BIOS that people are discussing here. It is the Asus Strix XOC BIOS and it has no power limit.


----------



## jpm888

I can get a good deal on a Founders GTX 1080 Ti but will I regret the noise?

I'm already annoyed by something like an H100i, but I consider a Hyper 212 to be quiet.


----------



## KedarWolf

Founders cards can be loud, but you can undervolt them to .800v and they'll stay a lot cooler. With a custom fan curve, and a flashed BIOS with 0% fan speed at lower temps, noise would be pretty much nonexistent.

You'll lose about 10% of your performance, but really it only matters if you're gaming at 4k or benching.

I run 2066/6147 at 1.093v on the BIOS I'm running now, but 1786/5977 at .800v.

That's about 10 to 12% lower in Time Spy scores.
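As a quick sanity check on those numbers (clocks taken from the post above):

```python
# Relative core-clock drop between the 1.093v and .800v profiles quoted
# above; compare with the reported ~10-12% Time Spy score delta.
core_hi, core_lo = 2066, 1786
clock_drop_pct = (core_hi - core_lo) / core_hi * 100
print(round(clock_drop_pct, 1))  # 13.6 -- scores drop a bit less than clocks
```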


----------



## ZealotKi11er

Quote:


> Originally Posted by *jpm888*
> 
> I can get a good deal on a Founders GTX 1080 Ti but will I regret the noise?
> 
> I'm already annoyed by something like an H100i, but consider a Hyper 212 to be quiet


If you undervolt it should be fine.


----------



## jpm888

Thanks guys,

The price I'm being offered for it is around the same as a 1080 Gaming X or Strix.


----------



## FedericoUY

Hi, what are the normal volts for 2012 on the core?
What voltages are GTX 1080 Ti Strix users using to OC and overvolt these cards?
Thanks in advance


----------



## Agent-A01

Quote:


> Originally Posted by *FedericoUY*
> 
> Hi, what are the normal volts for 2012 on the core?
> What voltages are GTX 1080 Ti Strix users using to OC and overvolt these cards?
> Thanks in advance


It varies on a per-card basis; there is no "normal".

Voltage can be anywhere from 1v to ~1.075v.


----------



## dtfkev

Quote:


> Originally Posted by *nrpeyton*
> 
> I honestly thought the XOC removed "all" power/temp limits.
> 
> *Not sure -- can anyone with more experience with the XOC firmware/BIOS shed any light on this, please?*
> 
> (Was it just one bin it down-clocked)? Was it at a specific temperature? _(Does it always do it upon reaching *that* temperature)*?*_
> ...can you repeat the down-clock by doing something again?
> 
> Just a *theory:* _maybe the middle temperature limits (I.E. between 35c and 80c) are still in place.. and it's only the *e*xtreme *l*ow and *e*xtreme *h*igh_
> _(I.E. 0c and 95c that have been removed)_*?*
> 
> And yes -- if it hadn't been flashed correctly it wouldn't be working at all.
> 
> So I'm guessing this must just be normal behaviour for the XOC.... there are honestly so many details and intricacies to GPU Boost 3.0 --
> 
> --even more when you flash 'limit-removing' firmware that was never "meant" to be removed by nvidia design.... maybe sometimes it just doesn't behave *exactly* as one would expect? (I'm sure there are ways around it and ways to make it "play").


It's for sure still power limiting. It automatically goes down one bin. Temps don't move around much, pretty much staying put at 45* during testing.

So far it seems completely random.

Sucks because in order to run stable at 2100 core I have to set it at 2113 - which isn't stable for me.

No idea how to get it to hold a constant 2100 core.

Options are: set it to 2113 and have rare stability crashes when it doesn't down-clock to 2100 as soon as load hits,
or
set it to 2100 and have it down-clock to 2088, which is stable and not much different. But dangit I want my 2100!!!!!


----------



## MaKeN

Quote:


> Originally Posted by *dtfkev*
> 
> It's for sure still power limiting. Automatically goes down one bin. Temps don't move around much, pretty much stays put at 45* during testing.
> 
> So far seems completely random.
> 
> Sucks because in order to run stable at 2100 core i have to set it at 2113 - which isn't stable for me.
> 
> No idea how to get it to hold a constant 2100 core.
> 
> Options are set to 2113 and have rare stability crashes when it doesn't down clock to 2100 as soon as load hits
> or
> set to 2100 and have it down clock to 2088 which is stable and not much different. But dangit I want my 2100!!!!!


When I first got my card I was also very much after that 2100...
But it's only stable at 2075, and as I understand it that's maybe a 1 fps difference in games.

Now I'm on 1772 undervolted to 0.8v and more than happy with it. I lose about 10-12 fps, and win much more with it.


----------



## LunaP

Quote:


> Originally Posted by *8051*
> 
> They used to make space heaters that had a 1000 Watts of heat output.


Ah, my apologies, you're commenting in the wrong forum then. This was starting to sound like a troll, but I'm guessing you're just in the wrong section; space heaters don't produce cool air, sorry.
I've 0 idea what you're referring to anymore.


----------



## Agent-A01

Quote:


> Originally Posted by *dtfkev*
> 
> set to 2100 and have it down clock to 2088 which is stable and not much different. But dangit I want my 2100!!!!!


Who cares about 13MHz?

That would equate to a ~0.6% difference


----------



## dtfkev

Quote:


> Originally Posted by *MaKeN*
> 
> First when i got my card i was also so much after that 2100...
> But its only stable at 2075, as i understand its mb 1 fps difference in games.
> 
> Now im on 1772 undervolted to 0.8mv and more than happy with it. I lose about 10-12 fps , and win much more with it.


Quote:


> Originally Posted by *Agent-A01*
> 
> Who cares about 13MHz?
> 
> That would equate to a ~0.6% difference


Yeah I know! It's just a mindset. Goals when I built the rig were 5GHz on the CPU and 2.1GHz on the GPU.

I'll probably just run a quick bench at 5.3GHz and 2.1GHz, screenshot for victory, and go back to my undervolted 5GHz CPU / 2GHz GPU daily clocks.


----------



## mouacyk

The fact that Jensen demo'ed the 1080 at 2111MHz doesn't help our reality of not hitting that magical barrier.


----------



## KedarWolf

Quote:


> Originally Posted by *Shawn Shutt jr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Might help some, yes.
> 
> www.amazon.com best place to get the pads cheap.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And I find the best paste is Mastergel Maker Nano. Not Grizzly.
> 
> 
> 
> does the brand or size of the thermal pads matter?
> 
> like this?
> https://www.amazon.com/ARCTIC-Thermal-Pad-1-0-Conductivity/dp/B00UYTTDO6/ref=sr_1_1?ie=UTF8&qid=1508055612&sr=8-1&keywords=thermal+pads+gpu

Oh, and you'll want 1.5mm for the VRMs, 1.0mm for the rest.

The VRMs are the R22s.


----------



## 8051

What's the maximum voltage most 1080Ti's voltage controllers will allow?


----------



## sblantipodi

Is a 5930K @ 4.2GHz a bottleneck for two GTX 1080 Tis in SLI?
Will I lose some FPS due to my CPU?

I will play from 2K to 4K.


----------



## FedericoUY

Has a change of TIM and thermal pads on the VRMs been necessary for anyone using a 1080 Ti Strix?


----------



## KedarWolf

Quote:


> Originally Posted by *FedericoUY*
> 
> Has a change of TIM and thermal pads on the VRMs been necessary for anyone using a 1080 Ti Strix?


I'm putting 17.0 W/m-K thermal pads on everything. 1.5mm on the R22's and 1.0mm everywhere else.









I figured out this is where to put them, well, except not on the ones with the red lines, the EK block doesn't really cover those, I think it would hurt more than help.









Made a better picture. Red squares you use 1.5mm pads, green squares 1.0mm pads, the same size and shape as the squares.

The blue squares you DON'T do with EK blocks as the block doesn't cover those, I think it would hurt more than help.


----------



## KedarWolf

Quote:


> Originally Posted by *Shawn Shutt jr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Fujipoly or Alphacool 17W/mk are best, Sarcon are easier to apply and more pliable but I think they max out at 14W/mk, cheaper though.
> 
> Wear rubber gloves when applying them so the oil on your hands doesn't lower the thermal abilities. Non powdered rubber gloves.
> 
> To do just the R22 VRMs and memory you can get away with two of these, 1.5mm for VRM, 1.0mm for memory.
> 
> https://www.amazon.com/Fujipoly-mod-smart-Extreme-Thermal/dp/B00ZSJQUT8/ref=sr_1_8?s=electronics&ie=UTF8&qid=1508058049&sr=1-8&keywords=Fujipoly
> 
> https://www.amazon.com/Fujipoly-mod-smart-Extreme-Thermal/dp/B00ZSJQLME/ref=sr_1_3?s=electronics&ie=UTF8&qid=1508058049&sr=1-3&keywords=Fujipoly
> 
> You'd need maybe three more of the 1.5mm pads to do the rest of the stuff recommended to do thermal pads on. The memory (11 small rectangles on the left side around the GPU core, the VRMs (R22 chips) and the long area just directly left of the R22 VRMs I'd do 17W/mk, the rest could likely get away with cheaper 14W/mk or 11W/mk pads. If money is no object do 17W/mk for everything for best results.
> 
> You'll want to do the areas with pads like the picture below. The green rectangles are the size of the pads. You will need to do the 60x50mm pad into a couple of long rectangles to cover the longer areas as the pads are only 60mm (2.4 inches long).
> 
> 
> 
> The 100x15mm pads are less than 4" long anyways and may not even cover the long row of VRMs etc. so may as well just buy the 60x50 and cut it I think. You get more pad for less money that way.
> 
> Hope this helps.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Wow, Thanks for all the info! im going to look into this for sure! i really wanna get my temp's down to around 50s or 60s and this should help me get in the ballpark about it! silly we pay $700-$800 for a video card and have to replace parts but heck it happens lol.

I revised this after watching a video and seeing the EK block doesn't cover the blue squares.









I'm putting 17.0 W/m-K thermal pads on everything. 1.5mm on the R22's and 1.0mm everywhere else.









I figured out this is where to put them, well, except not on the ones with the red lines, the EK block doesn't really cover those, I think it would hurt more than help.









Made a better picture. Red squares you use 1.5mm pads, green squares 1.0mm pads, the same size and shape as the squares.

The blue squares you DON'T do with EK blocks as the block doesn't cover those, I think it would hurt more than help.









Edit: Sorry about the spam folks, the two posts I thought was in two different threads.


----------



## mouacyk

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm putting 17.0 W/m-K thermal pads on everything. *1.5mm on the R22's and 1.0mm everywhere else.
> 
> 
> 
> 
> 
> 
> 
> *


If anything, you might end up dampening or completely eliminating any coil whine (like in game menus where fps shoots upwards of 1000fps+). I think it's common knowledge now that inductors/chokes don't need active cooling. Someone correct me if I'm wrong.


----------



## KedarWolf

Quote:


> Originally Posted by *mouacyk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I'm putting 17.0 W/m-K thermal pads on everything. *1.5mm on the R22's and 1.0mm everywhere else.
> 
> 
> 
> 
> 
> 
> 
> *
> 
> 
> 
> If anything, you might end up dampening or completely eliminating any coil whine (like in game menus where fps shoots upwards of 1000fps+). I think it's common knowledge now that inductors/chokes don't need active cooling. Someone correct me if I'm wrong.

Yes, it's not necessary, just for coil whine and may help a bit...









Edit: I did read the thermal pads on the backplate passively cools the back of the VRM's so I'm replacing those pads as well.


----------



## nrpeyton

Quote:


> Originally Posted by *sblantipodi*
> 
> Is a 5930K @ 4.2GHz a bottleneck for two GTX 1080 Tis in SLI?
> Will I lose some FPS due to my CPU?
> 
> I will play from 2K to 4K.


Once overclocked -- your 5930k *only* averages about 7% slower than an i7 7700k at QUAD-core loads. (_Reason I mention the 7700k:
it has the FASTEST quad-core performance out of ANY chip. Games mostly benefit from quad-core performance above anything else._)

At multi-core loads _(anything using more than 4 cores)_ you actually *beat* an i7 7700k by about 27%.

So I'd say from 2k to 4k resolutions you definitely wouldn't be bottlenecked in SLI.

You're fine









_When I play at 4k my CPU (7700k) often only runs at around 25% total utilization. Individual threads don't even get fully utilized (no more than by spikes up to 40% at most)._


----------



## sblantipodi

Quote:


> Originally Posted by *nrpeyton*
> 
> Once overclocked -- your 5930k only averages about 7% slower than an i7 7700k at QUAD-core loads. (_Reason I mention the 7700k:
> it has the FASTEST quad-core performance out of ANY chip. Games mostly benefit from quad-core performance above anything else._)
> 
> At multi-core loads _(anything using more than 4 cores)_ you actually *beat* an i7 7700k by about 27%.
> 
> So I'd say from 2k to 4k resolutions you definitely wouldn't be bottlenecked in SLI.
> 
> You're fine
> 
> 
> 
> 
> 
> 
> 
> 
> 
> _When I play at 4k my CPU (7700k) often only runs at around 25% total utilization. Individual threads don't even get fully utilized (no more than by spikes up to 40% at most)._


Thank you for the complete answer.
Is the 7700K faster than the 8700K in quad-core loads?

OK, this means that my CPU should survive the next upgrade to Volta


----------



## jameschisholm

Has anyone in here gone from a 780 Ti to a 1080 ti ? If you have, can you share your experiences with game performance in mind on recent titles? (Shadow of War?)


----------



## ZealotKi11er

Quote:


> Originally Posted by *sblantipodi*
> 
> thank you for the complete answer.
> is the 7700K faster than 8700K in quad core load?
> 
> Ok this means that my CPU should survive the next upgrade to volta


No, because the 8700K is essentially a 7700K with more cores and slightly higher clocks.


----------



## nrpeyton

Quote:


> Originally Posted by *sblantipodi*
> 
> Thank you for the complete answer.
> Is the 7700K faster than the 8700K at quad-core loads?
> 
> OK, this means my CPU should survive the next upgrade to Volta.


No probs.

And indeed. I think you have plenty of years left on that chip -- easily ;-)

If you are in any doubt then download "HWINFO64" & "Riva Tuner Statistics Server".

Riva Tuner is like FRAPS -- except it gives 10x more information. Use the "RTSS" tab in HWINFO64 to choose what info you want to send to your on-screen display via Riva.

Set it up to show CPU utilization by thread (it's called "Max CPU Thread Usage"). It'll show the utilization % for the thread that is currently utilized the most. Total CPU utilization can also be sent, too.

As well as GPU total utilization. If you notice that GPU utilization is continuously below 90% while "Max CPU Thread Usage" is constantly pegged above 80/90%, then you know you've got a bottleneck on the CPU.

I think you'll actually be pleasantly surprised though ;-)

I have mine setup to show power usage and voltage as well as VRAM usage and RAM usage too.

Like below:

*full size* - right click & select 'open new tab'


_Not sure how accurate GPU utilization will be in SLI. But you could still watch it with only one GPU running... then that would give you some sort of estimation/idea what it will be like with two GPU's._
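The rule of thumb above (busiest CPU thread pegged above ~85% while GPU utilization sits below ~90%) can be sketched as a small script. This is only an illustration of that heuristic, not an official metric: the per-thread numbers come from the third-party `psutil` library, the GPU number from `nvidia-smi`, and the thresholds are the rough figures from this post.

```python
import shutil
import subprocess

def classify(max_thread_pct, gpu_pct, cpu_hot=85.0, gpu_busy=90.0):
    """Rough heuristic from the post: the busiest CPU thread pegged
    while the GPU sits below full utilization suggests a CPU bottleneck."""
    if max_thread_pct >= cpu_hot and gpu_pct < gpu_busy:
        return "CPU-bound"
    if gpu_pct >= gpu_busy:
        return "GPU-bound"
    return "inconclusive"

def sample():
    """One 1-second sample: busiest logical CPU thread and GPU 0 utilization."""
    import psutil  # third-party: pip install psutil
    max_thread = max(psutil.cpu_percent(interval=1.0, percpu=True))
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"], text=True)
    gpu = float(out.splitlines()[0])
    return max_thread, gpu

if __name__ == "__main__" and shutil.which("nvidia-smi"):
    t, g = sample()
    print(f"max thread {t:.0f}%, GPU {g:.0f}% -> {classify(t, g)}")
```

Run it during gameplay and log a few samples; a steady stream of "CPU-bound" results is the pattern described above.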


----------



## chibi

Hey guys, I'm thinking of switching out my EVGA card for a Strix OC. Am I right to assume that the ASUS GTX 1080 Ti Strix OC is the best card to pair with the XOC bios? I've read FE cards losing points in benchmarks with the XOC bios, this shouldn't be a problem correct? I just want to get rid of that pesky Power Limit without having to resort to shunt modding.

Also, would anyone happen to have a Strix + EK GPU block? If so, may I please ask for the measurement below? I know the card itself is 134mm in height. I need to know the extra port height because the case I'm eyeing only has 150mm clearance. Thank you


----------



## nrpeyton

Quote:


> Originally Posted by *chibi*
> 
> Hey guys, I'm thinking of switching out my EVGA card for a Strix OC. Am I right to assume that the ASUS GTX 1080 Ti Strix OC is the best card to pair with the XOC bios? I've read FE cards losing points in benchmarks with the XOC bios, this shouldn't be a problem correct? I just want to get rid of that pesky Power Limit without having to resort to shunt modding.
> 
> Also, would anyone happen to have a Strix + EK GPU block? If so, may I please ask for the measurement below? I know the card itself is 134mm in height. I need to know the extra port height because the case I'm eyeing only has 150mm clearance. Thank you


You're correct about it being the best card to pair with the XOC.

Sorry, not sure about measurements & clearances though, as I have an EVGA Founders Edition. Hopefully someone else can answer that one.


----------



## KedarWolf

Should be getting a chunk of cash, enough to buy a second Founders 1080 Ti and enough to throw an EK waterblock and backplate on it!!

I'll let you all know in the next week or so if I do.









Just set up www.nowinstock.net to alert me when there is stock.


----------



## nrpeyton

Lol lucky you.

Wish I could afford that.

If I had the money I wouldn't bother with another 1080 Ti....

But I would trade my existing one in for a Kingpin 1080 Ti. _(I can get playable FPS at 4k ultra in all games so far anyway -- the chances are the one I eventually can't, won't support SLI anyway. Plus the Kingpin has more "tinker" value lol.)_

I thought SLI was a dying breed?


----------



## RoBiK

Quote:


> Originally Posted by *chibi*
> 
> Also, would anyone happen to have a Strix + EK GPU block? If so, may I please ask for the measurement below? I know the card itself is 134mm in height. I need to know the extra port height because the case I'm eyeing only has 150mm clearance.


----------



## chibi

Quote:


> Originally Posted by *RoBiK*


Thanks RoBik!







Would you happen to know if these type of drawings are easily accessible for other EK products? I tried looking them up on google and ekwb, but my google-fu skills are not up to snuff!


----------



## nrpeyton

Rise of the Tomb Raider is a beautiful game, at 4k.

I get 45 - 70 FPS at Ultra, too. _(Not bad for a 1.0v under-volt at 1950Mhz & sporting a little +525 memory)._


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Rise of the Tomb Raider is a beautiful game, at 4k.
> 
> I get 45 - 70 FPS at Ultra, too. _(Not bad for a 1.0v under-volt at 1950Mhz & sporting a little +525 memory)._


Check out Hellblade: Senua's Sacrifice at 4k, but you may need 1080 Ti's in SLI and a hacked profile or G-Sync to really benefit; on my 4K G-Sync it's incredible.

Without SLI or G-Sync, low FPS may be an issue though.


----------



## RoBiK

Quote:


> Originally Posted by *chibi*
> 
> Thanks RoBik!
> 
> 
> 
> 
> 
> 
> 
> Would you happen to know if these type of drawings are easily accessible for other EK products? I tried looking them up on google and ekwb, but my google-fu skills are not up to snuff!


Yes, for quite a few EK products - blocks, radiators, fitting adapters etc.... Usually you find them as the last of the product pictures on the product page in the shop.
For example, you find that Strix drawing here: https://www.ekwb.com/shop/ek-fc-1080-gtx-ti-strix-nickel
There is a strip with product images; scroll that strip to the end and there is the picture.


----------



## chibi

Quote:


> Originally Posted by *RoBiK*
> 
> Yes, for quite a few EK products - blocks, radiators, fitting adapters etc.... Usually you find them as the last of the product pictures on the product page in the shop.
> For example, you find that Strix drawing here: https://www.ekwb.com/shop/ek-fc-1080-gtx-ti-strix-nickel
> There is a strip with product images; scroll that strip to the end and there is the picture.


Right on, I was looking in the wrong place all this time. I checked the install manual and the specs, but couldn't find anything there. Thank you once again


----------



## RaptormanUSMC

So, *** is up with SLI? Seems like SLI is broken in every single title now. Running my 1080 Ti's in SLI in almost every title results in micro stutters and only about 50% GPU usage. This has been the theme in Wildlands, R6 Siege, and Destiny 2. Switching to a single card shows 99% usage of the card. I've really noticed this since the Fall Creators Update, and have used drivers 388.00, 388.10, and 388.13.


----------



## KedarWolf

Quote:


> Originally Posted by *RaptormanUSMC*
> 
> So, *** is up with SLI? Seems like SLI is broken in every single title now. Running my 1080 Ti's in SLI in almost every title results in micro stutters and only about 50% GPU usage. This has been the theme in Wildlands, R6 Siege, and Destiny 2. Switching to a single card shows 99% usage of the card. I've really noticed this since the Fall Creators Update, and have used drivers 388.00, 388.10, and 388.13.


Do you have a G-Sync screen? I've read about others having issues with G-Sync enabled.


----------



## RaptormanUSMC

Quote:


> Originally Posted by *KedarWolf*
> 
> Do you have a G-Sync screen? I've read about others having issues with G-Sync enabled.


No, I don't have a G-Sync screen. I read that Power monitoring in overclocking apps has been causing stuttering, but it still doesn't explain the terribly low GPU usage when running SLI.

Ready to sell my second 1080Ti at this point. It's practically useless except for mining.


----------



## sblantipodi

Quote:


> Originally Posted by *nrpeyton*
> 
> No probs.
> 
> And indeed. I think you have plenty of years left on that chip -- easily ;-)
> 
> If you are in any doubt then download "HWINFO64" & "Riva Tuner Statistics Server".
> 
> Riva Tuner is like FRAPS -- except it gives 10x more information. Use the "RTSS" tab in HWINFO64 to choose what info you want to send to your on-screen display via Riva.
> 
> Set it up to show CPU utilization by thread (it's called "Max CPU Thread Usage"). It'll show the utilization % for the thread that is currently utilized the most. Total CPU utilization can also be sent, too.
> 
> As well as GPU total utilization. If you notice that GPU utilization is continuously below 90% while "Max CPU Thread Usage" is constantly pegged above 80/90%, then you know you've got a bottleneck on the CPU.
> 
> I think you'll actually be pleasantly surprised though ;-)
> 
> I have mine setup to show power usage and voltage as well as VRAM usage and RAM usage too.
> 
> Like below:
> 
> *full size* - right click & select 'open new tab'
> 
> 
> _Not sure how accurate GPU utilization will be in SLI. But you could still watch it with only one GPU running... then that would give you some sort of estimation/idea what it will be like with two GPU's._


Thanks for the answer, but I don't think this is the best way to check for a CPU bottleneck.
As far as I know, there are old CPUs with bad IPC that bottleneck GPUs without being maxed out in terms of CPU usage.


----------



## nrpeyton

Quote:


> Originally Posted by *sblantipodi*
> 
> Thanks for the answer, but I don't think this is the best way to check for a CPU bottleneck.
> As far as I know, there are old CPUs with bad IPC that bottleneck GPUs without being maxed out in terms of CPU usage.


Even when looking at individual thread usage, not total CPU utilization?

Is there another way?

Quote:


> Originally Posted by *KedarWolf*
> 
> Check out Hellblade: Senua's Sacrifice at 4k, but you may need 1080 Ti's in SLI and a hacked profile or G-Sync to really benefit; on my 4K G-Sync it's incredible.
> 
> Without SLI or G-Sync, low FPS may be an issue though.


Hellblade: Senua's Sacrifice -- interesting; I'll check it out. Thx


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sblantipodi*
> 
> Thanks for the answer, but I don't think this is the best way to check for a CPU bottleneck.
> As far as I know, there are old CPUs with bad IPC that bottleneck GPUs without being maxed out in terms of CPU usage.
> 
> 
> 
> That's why I look at individual thread usage, not total CPU utilization.
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Check out Hellblade: Senua's Sacrifice at 4k, but you may need 1080 Ti's in SLI and a hacked profile or G-Sync to really benefit; on my 4K G-Sync it's incredible.
> 
> Without SLI or G-Sync, low FPS may be an issue though.
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> Hellblade: Senua's Sacrifice -- interesting; I'll check it out. Thx
Click to expand...

If you SLI.

https://www.forum-3dcenter.org/vbulletin/showpost.php?p=11455083&postcount=2622

And I played both games; I loved Rise of the Tomb Raider, but the graphics on Hellblade are quite a bit better and it has the best story of any game I've ever played, so I enjoyed it much more.

Without SLI, at 4K with maxed-out graphics settings you'll only get about 45 FPS though.


----------



## 2ndLastJedi

Quote:


> Originally Posted by *KedarWolf*
> 
> Check out Hellblade: Senua's Sacrifice at 4k, but you may need 1080 Ti's in SLI and a hacked profile or G-Sync to really benefit; on my 4K G-Sync it's incredible.
> 
> Without SLI or G-Sync, low FPS may be an issue though.


What's the hacked profile? Is it useful for a single 1080 Ti?


----------



## KedarWolf

Quote:


> Originally Posted by *2ndLastJedi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Check out Hellblade: Senua's Sacrifice at 4k, but you may need 1080 Ti's in SLI and a hacked profile or G-Sync to really benefit; on my 4K G-Sync it's incredible.
> 
> Without SLI or G-Sync, low FPS may be an issue though.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What's the hacked profile? Is it useful for a single 1080 Ti?
Click to expand...

No, only SLI, see my other post.


----------



## RaptormanUSMC

So, power monitoring in MSI Afterburner was definitely causing stuttering in games. Turned off power monitoring and the stuttering is gone; usage and FPS in R6 Siege are back up to where they should be. Destiny 2 SLI GPU utilization still sucks and it runs better single-GPU (both GPUs only being used at 46%).

I did notice in R6 Siege that GPU 1 utilization was 53% while GPU 2 utilization was 98%.


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *dtfkev*
> 
> Awesome, thanks a ton!
> 
> Rock solid 2100 @ 1093. No downclocking, 45*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Next question,
> 
> I'm going to format windows (only becasue of the amount of BSOD i've had the past few weeks trying to dial in CPU & GPU clocks)
> 
> Assuming I'll just have to download drivers and afterburner and just reapply settings and i'll be golden?
> 
> 
> 
> I PM'd you links to the official Windows 10 ISO's for a clean install of the latest version of Windows 10.
> 
> You might need it, has the sha-1's as well so no need to worry they are altered in any way.
Click to expand...

Had some people PM me about this. I would NEVER promote pirated software; the torrents I made are just the official Windows ISOs formatted so they'll fit on a 4GB USB key. NOT cracked -- they can only be used for a clean install if you have a licence or have previously installed Windows 10 before.

The torrent site even has that in the description of the torrent I made.

I don't even download pirated software personally.

The only reason I PM it is that the torrent site I upload them to has a few NSFW (not porn) pictures and the like, as torrent sites tend to.









Anyhow, let's move on.


----------



## Glottis

Has there been any information that later shipments of 1080 Tis come with inferior quality memory? I bought my STRIX 1080 Ti a month ago and can barely do +300 on memory before it craps out. Everyone who got their cards on release seems to be easily hitting +500 or even +600. I wonder if the cryptocurrency craze played a part in newer 1080 Tis having worse quality components. This is all speculation on my part, but I think there might be some truth to it.


----------



## RaptormanUSMC

Fixed the Destiny 2 SLI problem. Even though I had switched back to Fullscreen vs Borderless Windowed the game was still running in borderless windowed. Had to "Reset to Defaults," in the game menu to correct this, and uninstall GeForce Experience. Now, I'm getting proper scaling in Destiny 2 with SLI.


----------



## KedarWolf

Quote:


> Originally Posted by *RaptormanUSMC*
> 
> Fixed the Destiny 2 SLI problem. Even though I had switched back to Fullscreen vs Borderless Windowed the game was still running in borderless windowed. Had to "Reset to Defaults," in the game menu to correct this, and uninstall GeForce Experience. Now, I'm getting proper scaling in Destiny 2 with SLI.


Good job!!!

I'm thinking about getting a second 1080 Ti FE and a waterblock but I've read with G-Sync GPU utilization can be really low on both cards.









Edit:

By Tuesday at the latest I'll have enough Fujipoly 17.0 W/mK Ultra Extreme XR-m thermal pads from www.amazon.com (the cheapest place to get them) to completely redo my waterblock and backplate.









Use Amazon Global if you're not in the USA; I saved $40 on shipping. I'm in Canada.


----------



## djriful

Sold my 1080 Ti...

/rip


----------



## ZealotKi11er

Quote:


> Originally Posted by *djriful*
> 
> Sold my 1080 Ti...
> 
> /rip


Reason?


----------



## RoBiK

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Reason?


Probably needs the money for the Titan Collector's Edition


----------



## ZealotKi11er

Quote:


> Originally Posted by *RoBiK*
> 
> probably needs the money for the titan collector's edition


Logically, one could buy a 1080 Ti, use it for 6 months to play all the demanding games, and sell it before the next card comes out, losing very little money.


----------



## djriful

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *djriful*
> 
> Sold my 1080 Ti...
> 
> /rip
> 
> 
> 
> Reason?
Click to expand...

/born

Zephyrus GTX 1080 Max-Q Design


----------



## sblantipodi

Quote:


> Originally Posted by *nrpeyton*
> 
> Even when looking at individual thread usage, not total CPU utilization?
> 
> Is there another way?
> 
> Hellblade: Senou's Sacrifice -- interesting; i'll check it out. thx


The best way is by comparing the same game with the same GPU using different CPUs.
I'm pretty sure that a CPU can be a bottleneck even if it isn't running at 100%.


----------



## mouacyk

Quote:


> Originally Posted by *sblantipodi*
> 
> The best way is by comparing the same game with the same GPU using different CPUs.
> I'm pretty sure that a CPU can be a bottleneck even if it isn't running at 100%.


In BF1, only one of my 8 threads has to hit around 90% usage and I start seeing GPU usage drop below 90%. My RAM is at 2666MHz with tightened timings, so that's unlikely to be the problem. Wonder if my 4300MHz cache is the culprit... haven't tried pushing it since I got my new GPU. Would be interesting to see.


----------



## ZealotKi11er

Quote:


> Originally Posted by *mouacyk*
> 
> In BF1, only one of my 8 threads has to hit around 90% usage and I start seeing GPU usage drop below 90%. My RAM is at 2666MHz with tightened timings, so that's unlikely to be the problem. Wonder if my 4300MHz cache is the culprit... haven't tried pushing it since I got my new GPU. Would be interesting to see.


If GPU usage drops, it's the CPU/memory that is holding you back.


----------



## 8051

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If GPU usage drops, it's the CPU/memory that is holding you back.


I sometimes get sub-30 FPS in GTA IV (all details maxed and draw distance tripled); when it does, my 1080 Ti is at 40% to 50% usage, but none of the CPU cores seem to be pinned at 100% usage, although one of them is at 90%.


----------



## SavantStrike

Quote:


> Originally Posted by *Glottis*
> 
> Has there been any information that later shipments of 1080 Tis come with inferior quality memory? I bought my STRIX 1080 Ti a month ago and can barely do +300 on memory before it craps out. Everyone who got their cards on release seems to be easily hitting +500 or even +600. I wonder if the cryptocurrency craze played a part in newer 1080 Tis having worse quality components. This is all speculation on my part, but I think there might be some truth to it.


That's wild speculation. I don't know why everyone looks at mining as the boogeyman. If there's any variance in clocks from early to later models, it's because Micron's GDDR5X process has matured and they are binning the chips for 12Gbps (and faster) GDDR5X, so early production samples that were not binned are no longer available. Prices have gone up across the board because of what has to be collusion among DRAM suppliers, so again not mining, and no OEM is going to shoot themselves in the foot by purposely building off-spec cards.

I've also seen early-production Tis that can barely budge on RAM clocks before getting unstable or having errors that cause lower scores. No conspiracy here; some cards are just better than others.


----------



## GreedyMuffin

I think my GPU is degrading.... Can fold 48 hours at 1950/+500 on mem at 0.900V. Temps are 30-33'C. Will crash in CS:GO rather quickly...?

Tried DDU and tried updating to the newest driver.


----------



## 8051

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I think my GPU is degrading.... Can fold 48 hours at 1950/+500 on mem at 0.900V. Temps are 30-33'C. Will crash in CS:GO rather quickly...?
> 
> Tried DDU and tried updating to the newest driver.


I've noticed stable overclocks can vary by game/benchmark.


----------



## SavantStrike

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I think my GPU is degrading.... Can fold 48 hours at 1950/+500 on mem at 0.900V. Temps are 30-33'C. Will crash in CS:GO rather quickly...?
> 
> Tried DDU and tried updating to the newest driver.


Have you tried feeding it a little more voltage? That's a pretty significant undervolt to be running at 1950mhz.


----------



## GreedyMuffin

Been running that since release without any issues. Before, it could do hours of BF1. Haven't tested that in a while.


----------



## tomxlr8

Quote:


> Originally Posted by *mouacyk*
> 
> The XOC BIOS works great on this card. I have voltage curve profiles for [email protected], [email protected], and [email protected]


Thanks for your advice here. If you've been through this, can you just confirm the process is:


nvflash: S1080TIXOCNoPowerLimit.rom
don't install a power limit BAT file
remove nvidia drivers in safemode with ddu
install nvidia drivers
in MSI set power curves to: [email protected], [email protected], and [email protected]

Would be great if you can please comment if i got it right or not.


----------



## nrpeyton

Quote:


> Originally Posted by *8051*
> 
> I sometimes get sub-30 FPS in GTA IV (all details maxed and draw distance tripled), when it does my 1080Ti is at 40% to 50% usage, but none of the CPU cores seem to be pinned at 100% usage, although one of them is at 90%.


1 thread at 85/90%+ is all it takes for a bottleneck.

Sounds like the game isn't optimised to take advantage of all other cores/threads simultaneously.

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I think my GPU is degrading.... Can fold 48 hours at 1950/+500 on mem at 0.900V. Temps are 30-33'C. Will crash in CS:GO rather quickly...?
> 
> Tried DDU and tried updating to the newest driver.


Were you able to play CS:GO without crashes at the same O/C when the card was new?

If it is really degrading -- this would be the first time I've heard any reports. _(Although I do admit I find it unlikely if you've been running with that under-volt -- even if card is on the go 24/7)._

Quote:


> Originally Posted by *Glottis*
> 
> Has there been any information that later shipments of 1080 Tis come with inferior quality memory? I bought my STRIX 1080 Ti a month ago and can barely do +300 on memory before it craps out. Everyone who got their cards on release seems to be easily hitting +500 or even +600. I wonder if the cryptocurrency craze played a part in newer 1080 Tis having worse quality components. This is all speculation on my part, but I think there might be some truth to it.


One thing I noticed when going to chilled water is that GDDR5X is very temperature sensitive.

I also remember on my last Pascal card (1080 non-Ti) that temps on the memory chips closest to the VRM side of the card were running 30c hotter than the memory chips on the I/O side. I was able to improve it quite significantly by trying different height thermal pads. Also, if you go overboard with pads and start padding areas between the VRM and memory that don't normally take pads, it helps absorb heat spilling over from the VRM into other PCB areas.

Quote:


> Originally Posted by *tomxlr8*
> 
> Thanks for your advice here. If you've been through this, can you just confirm the process is:
> 
> 
> nvflash: S1080TIXOCNoPowerLimit.rom
> don't install a power limit BAT file
> remove nvidia drivers in safemode with ddu
> install nvidia drivers
> in MSI set power curves to: [email protected], [email protected], and [email protected]
> 
> Would be great if you can please comment if i got it right or not.



Back up your current BIOS first (GPU-Z can do it, I think). But I'd do both: use nvflash and GPU-Z and keep two copies.
You need to disable the 1080 Ti's display adapter in Device Manager before flashing.

Also, mileage will differ card-to-card in terms of what more voltage does. Many cards don't see a lot of benefit until sub-25c load temps are reached.

I can get 2176MHz at 1.093v if I get her below 25c. _(Temperature is so much more important than voltage, on Pascal.)_


----------



## 8051

Quote:


> Originally Posted by *tomxlr8*
> 
> Thanks for your advice here. If you've been through this, can you just confirm the process is:
> 
> nvflash: S1080TIXOCNoPowerLimit.rom
> don't install a power limit BAT file
> remove nvidia drivers in safemode with ddu
> install nvidia drivers
> in MSI set power curves to: [email protected], [email protected], and [email protected]
> Would be great if you can please comment if i got it right or not.


Is 1.31V or 1.2V safe? I remember an Nvidia rep said that too much voltage kills Pascal quickly.
Quote:


> Originally Posted by *nrpeyton*
> 
> One thing I noticed when going to chilled water is that GDDR5X is very temperature sensitive.
> 
> I also remember on my last Pascal card (1080 non-Ti) that temps on the memory chips closest to the VRM side of the card were running 30c hotter than the memory chips on the I/O side. I was able to improve it quite significantly by trying different height thermal pads. Also, if you go overboard with pads and start padding areas between the VRM and memory that don't normally take pads, it helps absorb heat spilling over from the VRM into other PCB areas.


Would your idea of putting padding on areas not normally padded work with air-cooled heatsinks as well? What about areas where there's nothing but bare PCB?


----------



## KCDC

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm putting 17.0 W/m-K thermal pads on everything. 1.5mm on the R22's and 1.0mm everywhere else.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I figured out this is where to put them, well, except not on the ones with the red lines, the EK block doesn't really cover those, I think it would hurt more than help.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Made a better picture. Red squares you use 1.5mm pads, green squares 1.0mm pads, the same size and shape as the squares.
> 
> The blue squares you DON'T do with EK blocks as the block doesn't cover those, I think it would hurt more than help.


Kedar, I got 1.0 and 0.5mm pads to replace on both cards, since those are the thicknesses EK provided. You think 1.0 is OK where you put the 1.5 pads, with the EK block? And 0.5 instead of the 1.0? Again, with the EK blocks.
Quote:


> Originally Posted by *nrpeyton*
> 
> Rise of the Tomb Raider is a beautiful game, at 4k.
> 
> I get 45 - 70 FPS at Ultra, too. _(Not bad for a 1.0v under-volt at 1950Mhz & sporting a little +525 memory)._


That game is gorgeous. At 7680x1440 ultra everything it blows my mind. Great SLI scaling as well.
Quote:


> Originally Posted by *KedarWolf*
> 
> Check out Hellblade: Senou's Sacrifice at 4k, but you may need 1080 Ti's in SLI and a hacked profile or G-Sync to really benefit, on my 4K G-Sync it's incredible.
> 
> Low FPS may be an issue not SLI or no G-Sync though.


At 7680x1440 in SLI it plays at 60 FPS just fine, so 4K should be OK. No hack.
Quote:


> Originally Posted by *RaptormanUSMC*
> 
> So, power monitoring in MSI Afterburner was definitely causing stuttering in games. Turned off power monitoring and the stuttering is gone; usage and FPS in R6 Siege are back up to where they should be. Destiny 2 SLI GPU utilization still sucks and it runs better single-GPU (both GPUs only being used at 46%).
> 
> I did notice in R6 Siege that GPU 1 utilization was 53% while GPU 2 utilization was 98%.


When you say "power monitoring", do you mean ticking off "unlock voltage monitoring"? Curious.


----------



## KedarWolf

Quote:


> Originally Posted by *KCDC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I'm putting 17.0 W/m-K thermal pads on everything. 1.5mm on the R22's and 1.0mm everywhere else.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I figured out this is where to put them, well, except not on the ones with the red lines, the EK block doesn't really cover those, I think it would hurt more than help.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Made a better picture. Red squares you use 1.5mm pads, green squares 1.0mm pads, the same size and shape as the squares.
> 
> The blue squares you DON'T do with EK blocks as the block doesn't cover those, I think it would hurt more than help.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Kedar, I got 1.0 and .5mm pads to replace on both cards... Since that's what EK provided in thickness. You think 1.0 is ok where you put the 1.5 pads? With the EK block? and .5 instead of the 1.0? Again, with the EK blocks.
Click to expand...

I'm not an expert on this by any means but 1.0mm on the R22 VRM's and 0.5mm everywhere else should be fine.

I made this the other night. The green you want 0.5mm, the purple 1.0mm, and the red 0.5mm as well; those aren't absolutely necessary, but they'll help with coil whine and may help keep things a bit cooler.

All the same size and shape as the drawings. Cut them just a bit bigger than the actual RAM and VRM block etc.



Oh, DON'T do the small blue ones in the original drawing; I checked, and the heatsink on the EK block doesn't contact them, so it won't help and may actually cause them to run hotter.


----------



## KCDC

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm not an expert on this by any means but 1.0mm on the R22 VRM's and 0.5mm everywhere else should be fine.
> 
> I made this the other night. The green you want .5mm, the purple you want 1.0mm, the red you want .5mm as well but they aren't absolutely necessary, they'll help with coil whine and may help keep things a bit cooler.
> 
> All the same size and shape as the drawings. Cut them just a bit bigger than the actual RAM and VRM block etc.
> 
> 
> 
> Oh, DON'T do the small blue ones in the original drawing, I checked, the heatsink on the EK block doesn't contact them and it'll not help but may actually cause them to run hotter.


Thanks dude. I got the 8 W/mK Thermal Grizzly, still twice as good as the EK pads. Couldn't bring myself to get the Fuji 17 W/mK for two cards... it was a lot of cash. I definitely have a lot of coil whine. Hope this helps.


----------



## RaptormanUSMC

Quote:


> Originally Posted by *KCDC*
> 
> When you say "power monitoring", do you mean ticking off "unlock voltage monitoring"? Curious.


If you go to Settings, then Monitoring, in MSI Afterburner, there is an item in the list called "GPU Power"; uncheck it to stop the stuttering.


----------



## KedarWolf

Afterburner 4.4.0 NOT beta has been released.

Updating OP.

https://www.msi.com/page/afterburner


----------



## mouacyk

Quote:


> Originally Posted by *tomxlr8*
> 
> Thanks for your advice here. If you've been through this, can you just confirm the process is:
> 
> 
> nvflash: S1080TIXOCNoPowerLimit.rom
> don't install a power limit BAT file
> remove nvidia drivers in safemode with ddu
> install nvidia drivers
> in MSI set power curves to: [email protected], [email protected], and [email protected]
> 
> Would be great if you can please comment if i got it right or not.


1. Make sure the monitor is running at 60Hz, otherwise it could go blank
2. Back up the BIOS with GPU-Z, as already suggested
3. nvflash64 --protectoff
4. nvflash64 -6 NewBIOS.rom
5. nvflash64 --protecton
6. Reboot and set curves. Monitor temps.
* No need to DDU and re-install drivers if you already did that. Also no need to run the power limit BAT file, as you mentioned.
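For reference, the steps above can be strung together as a small shell wrapper. This is a sketch only, not a turnkey flasher: the BIOS filenames are placeholders, the `--save` backup step is my addition, and `DRY_RUN` defaults to just printing the commands so you can review them before actually flashing anything.

```shell
#!/bin/sh
# Sketch of the nvflash sequence above. BIOS filenames are placeholders.
# DRY_RUN=1 (the default) only prints the commands; set DRY_RUN=0 to flash.
DRY_RUN=${DRY_RUN:-1}

run() {
    if [ "$DRY_RUN" = "1" ]; then
        echo "would run: $*"
    else
        "$@"
    fi
}

run nvflash64 --save backup.rom   # keep a copy of the stock BIOS first
run nvflash64 --protectoff        # lift the EEPROM write protection
run nvflash64 -6 NewBIOS.rom      # -6 overrides the board/ID mismatch prompt
run nvflash64 --protecton         # restore protection; reboot afterwards
```

Review the printed commands, then re-run with `DRY_RUN=0` (and the display adapter disabled in Device Manager, per the earlier advice) to actually flash.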


----------



## KedarWolf

Quote:


> Originally Posted by *mouacyk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tomxlr8*
> 
> Thanks for your advice here. If you've been through this, can you just confirm the process is:
> 
> 
> nvflash: S1080TIXOCNoPowerLimit.rom
> don't install a power limit BAT file
> remove nvidia drivers in safemode with ddu
> install nvidia drivers
> in MSI set power curves to: [email protected], [email protected], and [email protected]
> 
> Would be great if you can please comment if i got it right or not.
> 
> 
> 
> 1. Make sure the monitor is running at 60Hz, otherwise it could go blank
> 2. Back up the BIOS with GPU-Z, as already suggested
> 3. nvflash64 --protectoff
> 4. nvflash64 -6 NewBIOS.rom
> 5. nvflash64 --protecton
> 6. Reboot and set curves. Monitor temps.
> * No need to DDU and re-install drivers if you already did that. Also no need to run the power limit BAT file, as you mentioned.
Click to expand...

Quite often if I don't DDU I get no Nvidia Control Panel, so I do it every time. It's just cleaner.


----------



## GreedyMuffin

Yep. I used this OC even when my GPU temps would hit 45-50°C (before I had my MO-RA3). Lower temps would only make the OC more stable, wouldn't they?

I dismantled my GPU blocks a few weeks ago, but the card only recently started to show problems.

Hopefully it will die sooner rather than later; I was thinking of switching to Volta anyway.


----------



## iBerggman

Does anyone know which PCB layout the white Gigabyte 1080 Ti Turbo has? (GV-N108TTURBO-11GD) The ports make me think FE, but the product page mentions improved power delivery and whatnot, so I assume it's a custom PCB like on the Asus Turbo?


----------



## RoBiK

Quote:


> Originally Posted by *iBerggman*
> 
> Does anyone one know what pcb layout the white Gigabyte 1080 Ti Turbo has? (GV-N108TTURBO-11GD) The ports makes me think FE but the product page mentions improved power delivery and what not so I assume it's custom pcb like on the Asus turbo?


It is definitely not the reference design PCB: the ports are not the same, and the HDMI port is in a different position.


----------



## iBerggman

Quote:


> Originally Posted by *RoBiK*
> 
> It is definitively not the reference design PCB, the ports are not the same, the HDMI port is in a different position.


Thanks, I didn't think to compare the port order. I guess I'll have to keep looking for a Founders Edition.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *iBerggman*
> 
> Thanks, didn't realize to compare the port order. I guess I'll have to keep looking for a founders edition.


If you're not sure whether it's reference, just go to EK's webshop, and under a 1080 Ti Founders Edition block, check the compatibility list to see if the card you're researching is there.





Also, a reference card will have "Nvidia" printed on the PCB by the PCI-E teeth; the Gigabyte card has "Gigabyte" written there.


----------



## Bartholdi

Subbing here so I can post stats. Card should arrive Monday.


----------



## siffonen

Just water-cooled my Strix with an EK-FC waterblock and temps are great; it barely hit 36°C after an hour of gaming at 100% load.


----------



## Mooncheese

Quote:


> Originally Posted by *nrpeyton*
> 
> Once overclocked -- your 5930k *only* averages about 7% slower than an i7 7700k at QUAD-core loads. (_Reason I mention the 7700k:
> It has the FASTEST quad-core performance out of ANY chip). Games mostly benefit from quad-core setups above anything else.
> _
> At multi-core loads _(anything more than 4 cores)_ you actually *better* an i7 7700k by about 27%.
> 
> So I'd say from 2k to 4k resolutions you definitely wouldn'*t* be bottlenecked in SLI.
> 
> You're fine
> 
> 
> 
> 
> 
> 
> 
> 
> 
> _When I play at 4k my CPU (7700k) often only runs at around 25% total utilization. Individual threads don't even get fully utilized (no more than by spikes up to 40% at most)._


7%? Try like 30%+?

http://www.tomshardware.com/answers/id-3546832/8700k-absolute-ice-lake.html#20338450

I mean, I have a 4930k, also on the 22nm node, and it's only marginally slower than a 5930k (16.8k Firestrike CPU score). Doing the math, dividing 16.8k by 6 and multiplying by 4 gives around 10.5-11k, whereas a 7700k is around 14-14.5k at the same frequency (4.5 GHz). A 7700k at 4.9 GHz does nearly 16k in the CPU test, so basically, a 7700k at 4.9 GHz gives you the speed of all 6 cores of a 4930k at 4.5 GHz.

With almost any (basically 90%+) title that is single-core dependent, say Planetside 2, or Zelda: BOTW on Cemu (where I see a stable 22-25 FPS at Dueling Peaks while a popular Cemu uploader shows, on video, 40 FPS in the same spot with his 7700k at 4.6 GHz and DDR4-3200), the difference mirrors the gap between the two Firestrike CPU scores: 14.5k vs 10.5k is around 35%. And it's like this for pretty much every title.

The misconception about the actual performance disparity in games exists because various review outlets tend to test games where the GPU is the bottleneck, e.g. "Rise of the Tomb Raider on Ultra" and "BF1 on Ultra" and so on. These outlets don't point to, say, Planetside 2 or Zelda: BOTW on Cemu. They then point and say: look, no difference between a 2600k and an 8700k! *****. Dude, going from 22 to 14nm is a big deal and not without a performance difference.

The above is even worse if you enjoy 3D Vision and the title in question induces the dreaded "3-Core Bug", such as GTA 5.

I too have been going around with the misconception that the CPU performance gains were only marginal and that I was "future proof" with my 4930k lol. Meanwhile, in large fights in Planetside 2 my frames drop to 45-50 FPS, whereas someone running a 6700 or 7700k at the same frequency and the same in-game settings is sitting at ~80 FPS. Here, have a look:






As for Zelda: BOTW, I basically have to put the game off until I upgrade, either to an 8700k once outlets stop price gouging, or wait for Ice Lake sometime late next year. But yeah, the difference between 22 and 14nm is MUCH more than the incremental 5-10% that you, and I, have been led to believe.

Sure, you're not going to see a difference running a corridor shooter with little-to-no CPU load, such as Call of Duty 9: Republican Space Rangers 2, but in anything with a lot of NPC activity, say Far Cry 4, or Fallout 4 in downtown Boston with its excessive draw calls, or massive fights in Planetside 2, or running the Cemu emulator, or trying to run games in 3D Vision with the stupid 3-core bug, you're much better off upgrading if you're still on 22nm.

In fact, to be brutally honest, upgrading from a 980 Ti to a 1080 Ti has been a wash: I'm seeing like 70% utilization, well under my refresh rate (i.e. 80 out of 144 FPS), in many, MANY games because of the CPU bottleneck. Some you wouldn't even think of, such as Dragon Age: Inquisition, where many areas do this, or The Witcher 3 at the dinner party in Skellige where possessed bears come in and kill everyone (in 3D Vision, but yeah).

I probably wouldn't be in this boat had I upgraded from the 4930k to a 7700k; I would probably have my GPU pegged at 100% in pretty much every title (except Zelda: BOTW, but even that would be better than what I'm seeing, as the aforementioned YouTuber is running a GTX 980 and seeing 40 FPS vs the 25 FPS I'm seeing with a 1080 Ti, and minimums are much more meaningful than maximums).

The only good thing about my situation is that if I wait for Ice Lake I will have a hell of an upgrade, akin to going from GTX 580 to GTX 1080 Ti.

Imagine an octo-core 7700k that can do over 5.0 GHz, with three times the transistor density at 10nm compared to 14nm. Yeah, that kind of upgrade.

9700k here I come, I'm just going to leap frog 14nm altogether. 22 to 10nm HELL YEAH.
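Mooncheese's back-of-envelope Fire Strike arithmetic above can be reproduced in a few lines. Note the naive score/6 × 4 estimate assumes perfectly linear core scaling, which the physics test doesn't quite have, hence the post rounding the estimate down to ~10.5-11k:

```python
# Naive core-scaling estimate from the post: take a 6-core Fire Strike
# physics score and assume 4 of those cores score 4/6 of it. Real
# scaling is sublinear, so treat the result as an upper bound.
def scale_cores(score: float, cores_from: int, cores_to: int) -> float:
    return score / cores_from * cores_to

est = scale_cores(16_800, 6, 4)      # 4930k's 6-core score, scaled to 4 cores
print(round(est))                    # -> 11200 (vs ~14-14.5k quoted for a 7700k)
print(round((14_500 - est) / est * 100))  # -> 29 (the gap, in percent)
```

The ~29-35% spread depends on which end of the quoted 7700k range you compare against, which is why the post's "around 35%" and this estimate don't match exactly.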


----------



## KedarWolf

Quote:


> Originally Posted by *iBerggman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RoBiK*
> 
> It is definitively not the reference design PCB, the ports are not the same, the HDMI port is in a different position.
> 
> 
> 
> Thanks, didn't realize to compare the port order. I guess I'll have to keep looking for a founders edition.

If you set up alerts on www.nowinstock.net you'll get emails when an FE is in stock, or even text messages.


----------



## ZealotKi11er

So cold right now in my room my card is hitting 36C under load.


----------



## Shawn Shutt jr

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So cold right now in my room my card is hitting 36C under load.


A buddy of mine did this with his PC a few weeks back: it was cold outside, so he opened his door, put his PC right outside, left it for an hour, then booted it and had those temps lol. Got a good laugh.


----------



## nrpeyton

Evening guys,

Anyone know if our 1080 Tis use Pixel Shader 5.0 or 5.1? _(I'm running Windows 10 with all the latest updates.)_

I'm finding conflicting information online.

Thanks!


----------



## ZealotKi11er

Quote:


> Originally Posted by *nrpeyton*
> 
> Evening guys,
> 
> Anyone know if our 1080Ti's use Pixel Shader V5.0 or V5.1? _(also running Windows v. 10 with all latest updates).
> 
> _I'm finding conflicting information online.
> 
> Thanks!


Why does it matter? Nvidia is #1 and #1 dictates all.


----------



## lilchronic

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So cold right now in my room my card is hitting 36C under load.


That's it ?

My max temp is 31c and min is 22c.


----------



## ZealotKi11er

Quote:


> Originally Posted by *lilchronic*
> 
> That's it ?
> 
> My max temp is 31c and min is 22c.


It's an AIO in an ITX case, with the CPU AIO in the front dumping warm air.


----------



## lilchronic

Quote:


> Originally Posted by *ZealotKi11er*
> 
> AIO on a ITX case with CPU AIO in the front dumping warm air.


You need some Fujipoly pads (17 W/mK) and some liquid metal on the GPU die.


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Evening guys,
> 
> Anyone know if our 1080Ti's use Pixel Shader V5.0 or V5.1? _(also running Windows v. 10 with all latest updates).
> 
> _I'm finding conflicting information online.
> 
> Thanks!


All the information I can find says it uses Pixel Shader 5.0.


----------



## KedarWolf

This BIOS has a 350W ceiling and a 140% max power-limit slider.

https://www.techpowerup.com/vgabios/193137/193137

My old best non-XOC BIOS was the Strix OC.

It would give me about 10950 in Time Spy at my current CPU and GPU OC. This Palit BIOS gives me 11111. I lose one bin on my 1080 Ti, but it power-limits much less than any other BIOS I've used.
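As a sanity check, a 350 W ceiling on a 140% slider implies a 250 W default target, which matches the reference 1080 Ti TDP. A minimal sketch of that arithmetic (numbers are from this post, not read out of the BIOS itself):

```python
# The power-limit slider is a percentage of the 100% (default) target,
# so the implied default target = ceiling / (max_percent / 100).
def default_power_target(max_watts: float, max_percent: float) -> float:
    return max_watts * 100.0 / max_percent

print(default_power_target(350, 140))   # -> 250.0 (watts)
```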


----------



## 2ndLastJedi

So with my stock Galax 1080 Ti EXOC, is there a BIOS I could use that would benefit this card, even with just its stock cooler?


----------



## KedarWolf

Quote:


> Originally Posted by *2ndLastJedi*
> 
> So with my Stock Galax 1080 Ti EXOC , is there a Bios that i could use that would benefit this card even just with its stock cooler ?


I'd give this Palit BIOS a try.

It has a 140% max power limit and a 350W ceiling.

https://www.techpowerup.com/vgabios/193137/193137

I just finished testing it against the Strix OC, EVGA FTW3 and the Arctic Storm BIOS and it beat all of them in benching.

The closest was that Arctic Storm which in Time Spy was almost 100 points lower than the Palit one.


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> I'd give this Palit BIOS a try.
> 
> It has a 140 Power Limit and is 350W.
> 
> https://www.techpowerup.com/vgabios/193137/193137
> 
> I just finished testing it against the Strix OC, EVGA FTW3 and the Arctic Storm BIOS and it beat all of them in benching.
> 
> The closest was that Arctic Storm which in Time Spy was almost 100 points lower than the Palit one.


Does the PALIT bios score higher than the stock bios on an FE?


----------



## nrpeyton

*% of different cards owned (worldwide)*





*% owned by generation (nvidia)*


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I'd give this Palit BIOS a try.
> 
> It has a 140 Power Limit and is 350W.
> 
> https://www.techpowerup.com/vgabios/193137/193137
> 
> I just finished testing it against the Strix OC, EVGA FTW3 and the Arctic Storm BIOS and it beat all of them in benching.
> 
> The closest was that Arctic Storm which in Time Spy was almost 100 points lower than the Palit one.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Does the PALIT bios score higher than the stock bios on an FE?

All those BIOSes I tested score higher than the stock FE BIOS, which is why I tested them.


----------



## nrpeyton

Quote:


> Originally Posted by *8051*
> 
> Is 1.31V or 1.2V safe? I remember an Nvidia rep. said that too much voltage kills Pascals quick.
> Would your idea of putting padding on areas not normally padded work on air-cooled heatsinks as well? What about on areas where there's nothing but bare PCB?


I'm not sure about air m8.

I was assuming water. My apologies.

Here's an example of what I did with my last Pascal card (it helped tremendously). By the time I finished I was able to manage +1000 on the memory _(the maximum in MSI Afterburner)_ without crashing (and with only a few artifacts); _+925 was totally stable:_

The pads circled in *RED* are EXTRA *(above and beyond what is required in the EK installation manual).*
I had to tinker a lot to get the heights perfect for everything (probing temps of different components as I went). You'll notice all the extra padding *is* between the VRM and the first row of memory chips.
_
You'd be amazed at how much heat the VRM outputs. It affects memory temps *more* than the adjacent core.
_

*[full size]* - right click & select 'open new tab'



Quote:


> Originally Posted by *KedarWolf*
> 
> All those BIOS'ss I tested score higher than the stock FE BIOS which is why I tested those ones.


Even on the same clock? _(So if I clocked both BIOS's at 1999MHZ the PALIT one would still score higher)?_

If that's true m8 I will be flashing it now lol.....?


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *8051*
> 
> Is 1.31V or 1.2V safe? I remember an Nvidia rep. said that too much voltage kills Pascals quick.
> Would your idea of putting padding on areas not normally padded work on air-cooled heatsinks as well? What about on areas where there's nothing but bare PCB?
> 
> 
> 
> I'm not sure about air m8.
> 
> I was assuming water. My apologies.
> 
> Here's an example of what I done with my last Pascal card (it helped tremendously). By the time I finished I was able to manage a +1000 (the maximum) on chilled water without crashing (and only a little artifacts).
> 
> ....2 secs finding the info - - sorry (back in a few minutes)
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> All those BIOS'ss I tested score higher than the stock FE BIOS which is why I tested those ones.
> 
> 
> Even on the same clock? _(So if I clocked both BIOS's at 1999MHZ the PALIT one would still score higher)?_
> 
> If that's true m8 I will be flashing it now lol.....?

My tests were at 1.093v, where power limiting really comes into play.

The Strix OC BIOS is best if you eliminate power limiting.

It consistently gets the best bench scores when there's no power limiting.


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> My tests were at 1.093v where power limiting really comes into play.
> 
> The Strix OC BIOS is best of you eliminate power limiting.
> 
> It constantly gets the best bench scores with no power limiting.


What about a card that is shunt modded like mine? (I never power-throttle in games, but I do in Superposition, though only _*slightly*_.)


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> My tests were at 1.093v where power limiting really comes into play.
> 
> The Strix OC BIOS is best of you eliminate power limiting.
> 
> It constantly gets the best bench scores with no power limiting.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What about a card that is shunt modded like mine? (i never power-throttle in games but I do in superposition but only _*slightly*_).

Probably be a choice between the 432W Arctic Storm BIOS or the Strix OC.

I haven't done the shunt mod as my card sits vertical so I haven't tested it.


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> Probably be a choice between the 432W Arctic Storm BIOS or the Strix OC.
> 
> I haven't done the shunt mod as my card sits vertical so I haven't tested it.


Power throttling isn't an issue for me, though (due to the shunt mod).

And clock for clock _(MHz to MHz)_ my stock BIOS scores higher than the XOC.

So to benefit from the XOC at all, I would need to pump 1.2v at it with an insane overclock (like 2200 MHz, to offset the clock-for-clock points loss). Even then I would probably only manage to "even" it out. _*(In other words, to get the same score as my stock BIOS I'd have to overclock a lot more.)*_

If I could find a BIOS that is better clock for clock and simply pair it with my already shunt-modded card, I theorize I might get a higher score?

So sorry, to summarise: are there any BIOS's compatible with my 1080 Ti FE that score better *clock-for-clock*?


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Probably be a choice between the 432W Arctic Storm BIOS or the Strix OC.
> 
> I haven't done the shunt mod as my card sits vertical so I haven't tested it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Power throttling isn't an issue for me though. (due to Shunt mod).
> 
> And MHz for MHz my stock BIOS scores higher than the XOC.
> 
> So to benefit from the XOC at all, I would need to pump 1.2v at it with an insane overclock (like 2200 MHZ to offset the Mhz-Mhz points loss). Even then i would probably only manage to "even" it out. (In other words to get the same score as my stock BIOS I'd have to overclock it a lot more).
> 
> If I could find a BIOS that is better clock for clock. And simply match that with my "already" shunt modded card. I theorise maybe getting a higher score?
> 
> So sorry, to summarise that: Are there any BIOS's compatible with my 1080 Ti FE that score better *clock-for-clock?*

I find in my testing the Strix OC BIOS performs better than any other I tried clock for clock.









Edit: So on your card run Time Spy on the FE BIOS, then at the same clocks run Time Spy after flashing the Strix OC BIOS.

https://www.techpowerup.com/vgabios/193817/asus-gtx1080ti-11264-170503

After flashing the Strix OC BIOS and doing a clean install of the Nvidia drivers, run this powerlimit.bat file as admin.

powerlimit330.zip 0k .zip file


You will find the Strix OC performs significantly better.


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> I find in my testing the Strix OC BIOS performs better than any other I tried clock for clock.


Okay I'll try that now then.

Do you have a link to the BIOS, please?

Then I'll report back with my findings.

Thx


----------



## RoBiK

remove please


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I find in my testing the Strix OC BIOS performs better than any other I tried clock for clock.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Okay I'll try that now then.
> 
> Do you have a link to the BIOS, please?
> 
> Then I'll report back with my findings.
> 
> Thx

See edit of previous post.


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> I find in my testing the Strix OC BIOS performs better than any other I tried clock for clock.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: So on your card run Time Spy on the FE BIOS, then at the same clocks run Time Spy after flashing the Strix OC BIOS.
> 
> https://www.techpowerup.com/vgabios/193817/asus-gtx1080ti-11264-170503
> 
> After flashing the Strix OC BIOS and a clean install of the Nvidia drivers run this powerlimt.bat file as admin.
> 
> powerlimit330.zip 0k .zip file
> 
> 
> You will find the Strix OC performs significantly better.


Thank you.

If it works, rep+ ;-) so fingers crossed lol


----------



## nrpeyton

*BIOS / FIRMWARE comparison*

All BIOS's tried on 1080Ti Founders Edition

stock FE *vs.* STRIX OC *vs.* XOC



Strange - because according to the research I just carried out, the XOC never worsened my score on a clock-for-clock basis. When I used the L key to lock the voltage in the Curve window, it then remained at that voltage & frequency for the duration of the run.
All the scores actually seem pretty similar; in fact the XOC even seemed to beat the other scores slightly _(although I guess that's within margin of error)_.

What do you guys make of this? (I always assumed the XOC caused a clock-for-clock reduction in score.) And for *that* reason alone I never bothered flashing it; instead I carried out a shunt mod. Now I'm wondering if that shunt mod was really necessary?!

*Important:*
It also begs the question: if my card was NOT shunt modded, would I then see a score reduction when using the XOC? (I recall watching a YouTube video once about Pascal purposely shutting down shaders when hitting power/voltage limits.) Maybe the shunt mod has removed those hardware-level limits, enabling me to get FULL use of the XOC.

I stress the point that -- this is the *FIRST* time I've tried the XOC *SINCE* doing the shunt mod.

In other words: _*BEFORE*_ the shunt mod -- the XOC caused a reduction in clock-for-clock scores.

*
Next Step*
Use the extra voltage the XOC allows and see if I can get an insane score. (Before, the increased MHz was offset by the lower clock-for-clock performance, aka IPC, that cross-flashing the XOC caused.) _(From my last paragraph above -- maybe that's no longer an issue?)_

.


----------



## RoBiK

Quote:


> Originally Posted by *nrpeyton*
> 
> *BIOS / FIRMWARE comparison*
> 
> All BIOS's tried on 1080Ti Founders Edition
> 
> stock FE *vs.* STRIX OC *vs.* XOC


and now we know







good work!


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> *BIOS / FIRMWARE comparison*
> 
> All BIOS's tried on 1080Ti Founders Edition
> 
> stock FE *vs.* STRIX OC *vs.* XOC
> 
> 
> 
> Strange - because according to the research I just carried out -- the XOC never worsened my score on a clock-to-block basis. When I used the L key to lock the voltage on the Curve Window then remained at that voltage & frequency for the duration of the run.
> All scores actually seem pretty similar -- in fact the XOC even seemed to beat other scores slightly _(although I guess that's within margin of error)_
> 
> What do you guys make of this? (I always assumed the XOC caused a reduction in score in terms of clock-to-clock). And for *that* reason alone I never bothered flashing it. Instead I carried out a shunt mod. Now I'm wondering if that shunt mod was really necessary?!
> 
> *Important:*
> It also begs the question -- if my card was NOT shunt modded, would I then see a score reduction when using the XOC? (I recall watching a you-tube video once about Pascal purposely shutting down shaders when hitting power/voltage limits). -- maybe the shunt mod has removed those hardware-level limits, enabling me to get FULL use of the XOC.
> 
> I stress the point that -- this is the *FIRST* time I've tried the XOC *SINCE* doing the shunt mod.
> 
> *
> Next Step*
> Use the extra voltage the XOC allows and see if I can get an insane score. (before the increased MHZ was offset by the lower clock-to-clock aka IPC the XOC caused when cross flashing it)... _(from my last paragraph above -- maybe thats no longer an issue)?_
> 
> .


Did you run the powerlimit.bat files as admin after flashing the BIOS and installing the drivers?

You need to do it after the drivers install, as installing them resets the power limit lower.

If you don't, it'll definitely lower your scores.

The BestBIOSCollection .zip file has all the powerlimit.bats.

BestBIOSCollectionPowerLimitBatFiles.zip 1091k .zip file


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> Did you run the powerlimit.bat files as admin after flashing the BIOS and installing the drivers?
> 
> You need to do it after the drivers install as it'll reset the power limit lower.
> 
> If you never it'll definitely lower your scores.
> 
> The BestBIOSCollection .zip file has all the powerlimit.bats.


The powerlimit.bat doesn't apply to the XOC.

Even if you just compare the clock-for-clock results of the stock FE BIOS vs the XOC _(and ignore the STRIX OC BIOS completely)_, the XOC _*still*_ seems to be scoring slightly higher _(or nearly the same)_ than the stock FE firmware when *both* are locked to 1999MHz @ 1.0v......

When I carried this research out *BEFORE* the shunt mod, the opposite was true: the XOC caused a reduction in IPC (clock-for-clock performance).

So I'm wondering if the shunt mod has removed hardware-level restrictions that were holding back the XOC before?
_(It *does seem plausible*, as nobody could ever understand WHY the XOC caused those reductions.)_

.


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Did you run the powerlimit.bat files as admin after flashing the BIOS and installing the drivers?
> 
> You need to do it after the drivers install as it'll reset the power limit lower.
> 
> If you never it'll definitely lower your scores.
> 
> The BestBIOSCollection .zip file has all the powerlimit.bats.
> 
> 
> 
> The power limit.bat doesn't apply to the XOC.
> 
> Even if you just compare the clock-clock results of the stock BIOS vs the XOC (and ignore the STRIX OC bios completely) the XOC still seems to be scoring slightly higher _(or nearly the same)_ as the stock FE firmware when locked to 1999Mhz @ 1.0v......

Yes, but it applies to the other two BIOS's.


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> Yes, but it applies to the other two BIOS's.


I know m8. I hear what you're saying. 100%

But do you think what I'm getting at is plausible? (My theory?) _(When you combine the XOC and shunt mod together, vs. the XOC without the shunt?)_
/\ /\ /\ see my post above


----------



## ZealotKi11er

The differences are too small. It's not even worth bothering.


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Yes, but it applies to the other two BIOS's.
> 
> 
> 
> I know m8. I hear what you're saying. 100%
> 
> But do you think it's plausable what I'm getting at? (My theory)? _(when you combine XOC and shunt mod together vs. XOC without shunt)?_
> /\ /\ /\ see my post above

It's possible, yes.

There are weird quirks with the XOC like if you don't have a gradual power curve it performs worse.









Edit: I tell you what, I don't have the shunt and will compare all three at 1999 with the same memory settings.


----------



## nrpeyton

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The differences are too small. Its not even worth bothering.


When the XOC first came out everyone thought it was great: finally we had a way to get past the 1.093v limit on Founders Edition cards _(as well as bypass the power limit)._

But when we researched it, everyone found it was NOT worth it.
Reason: using the XOC resulted in a LOWER score on an IPC (clock-for-clock) basis when compared to using your stock FE BIOS.

So most people opted to carry out a shunt mod to bypass the power limit while still keeping the higher IPC of their stock FE BIOS.

_However_, even with a well carried out and successful shunt mod, the benefit is still only about an extra 25% power.

IF my new research is correct, and the shunt mod *removes* the IPC loss of using the XOC (extreme O/C BIOS) on an FE, then *combining them both* would give an even higher power limit (removing the rare instances when shunt-modded cards _*still*_ power-throttle a little in high-current benches).

Moreover, shunt-modded cards could further enjoy *more volts*, but from now on *WITHOUT* the reduction in IPC that the XOC brings if your card isn't shunt modded.

*?*

I need to do more research -- but if I'm right -- this could be an unprecedented discovery for enthusiast 1080 Ti Founders Edition owners?



.
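For what it's worth, the "extra 25% power" figure lines up with simple shunt arithmetic: the card derives current from the voltage across a sense resistor, so lowering the effective shunt resistance makes it under-read power by the same ratio. A hedged sketch with illustrative, not measured, resistor values:

```python
# Why a shunt mod adds headroom: with a lower effective sense resistance
# the controller under-reads current, so the card draws more real power
# before it believes it has reached the BIOS limit. The milliohm values
# here are purely illustrative.
def real_power_at_limit(bios_limit_w: float, r_stock_mohm: float, r_mod_mohm: float) -> float:
    """Actual watts drawn when the card *thinks* it hit bios_limit_w."""
    return bios_limit_w * r_stock_mohm / r_mod_mohm

# Cutting the effective shunt value by 20% gives ~25% more real power:
print(real_power_at_limit(300, 5, 4))   # -> 375.0
```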


----------



## ZealotKi11er

I get all that, but you're still doing all this for clocks and voltages that are pointless for 24/7 use, and a very small increase in synthetic benchmarks.


----------



## nrpeyton

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I get all that but you still doing all this for clock and voltage which are pointless for 24/7 and very small increase in synthetic benchmarks.


Not necessarily.

I just passed a Superposition 4K test at 2202 MHz at 1.2v (max power draw was 472 watts, on a card that normally can't draw more than 300 watts without a shunt mod or BIOS change).


----------



## ZealotKi11er

Quote:


> Originally Posted by *nrpeyton*
> 
> Not necessarily.
> 
> I just passed a Superposition 4k test at 2202 MHZ at 1.2v (max power draw was 472 watts on a card that normally can't draw more than 300 watts without shunt mod or BIOS change).


10% more clock speed for 55% more power. Should have bought a Vega.
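The quip roughly checks out against the numbers in the thread, assuming a ~2000 MHz stock-BIOS ceiling and the ~300 W stock FE power cap mentioned above:

```python
# Efficiency check on the exchange above: clock gain vs power gain.
clock_gain = 2202 / 2000 - 1      # 2202 MHz vs an assumed ~2000 MHz ceiling
power_gain = 472 / 300 - 1        # 472 W vs the ~300 W stock FE cap
print(f"{clock_gain:.1%}")        # -> 10.1%
print(f"{power_gain:.1%}")        # -> 57.3% (the post rounds to 55%)
```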


----------



## nrpeyton

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 10% more clock speed for 55% more power. Should have bought a Vega.


I just hit the leaderboard for Superposition.

34th place (worldwide) for a quick and dirty run isn't bad.









A 1200-point increase vs the stock FE BIOS at 2000 MHz.

Another 438 points and I'll be in 10th place (worldwide) in Superposition. _(Time to tinker lol.)_ <-- that's without LN2 btw


----------



## KedarWolf

All at 1999 core/ 6156 memory, zero power limiting at .975v.

FE.



Palit.



FTW3.



Strix OC.



Arctic Storm.



XOC



#12 Superposition Leaderboards 4k Optimised

https://benchmark.unigine.com/leaderboards/superposition/1.0/4k-optimized/single-gpu/page-1

#9 8K Optimized

https://benchmark.unigine.com/leaderboards/superposition/1.0/8k-optimized/single-gpu/page-1

#2 720p Low

https://benchmark.unigine.com/leaderboards/superposition/1.0/720p-low/single-gpu/page-1

Also top ten in a few other categories.









Just water cooled.


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> All at 1999 core/ 6156 memory, zero power limiting at .975v.
> 
> FE.
> 
> 
> 
> Palit.
> 
> 
> 
> FTW3.
> 
> 
> 
> Strix OC.
> 
> 
> 
> Arctic Storm.
> 
> 
> 
> XOC
> 
> 
> 
> #12 Superposition Leaderboards 4k Optimised
> 
> https://benchmark.unigine.com/leaderboards/superposition/1.0/4k-optimized/single-gpu/page-1
> 
> #9 8K Optimized
> 
> https://benchmark.unigine.com/leaderboards/superposition/1.0/8k-optimized/single-gpu/page-1
> 
> #2 720p Low
> 
> https://benchmark.unigine.com/leaderboards/superposition/1.0/720p-low/single-gpu/page-1
> 
> Also top ten in a few other categories.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just water cooled.



That's with no modifications in the Nvidia control panel or NV Inspector, and also running only a 4-core processor with only dual-channel memory ;-)

Nice to see some competition ;-)


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> All at 1999 core/ 6156 memory, zero power limiting at .975v.
> 
> FE.
> 
> 
> 
> Palit.
> 
> 
> 
> FTW3.
> 
> 
> 
> Strix OC.
> 
> 
> 
> Arctic Storm.
> 
> 
> 
> XOC
> 
> 
> 
> #12 Superposition Leaderboards 4k Optimised
> 
> https://benchmark.unigine.com/leaderboards/superposition/1.0/4k-optimized/single-gpu/page-1
> 
> #9 8K Optimized
> 
> https://benchmark.unigine.com/leaderboards/superposition/1.0/8k-optimized/single-gpu/page-1
> 
> #2 720p Low
> 
> https://benchmark.unigine.com/leaderboards/superposition/1.0/720p-low/single-gpu/page-1
> 
> Also top ten in a few other categories.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just water cooled.
> 
> 
> 
> 
> Thats with no modifications in the Nvidia control panel or NV inspector & also running only 4 core processor with only dual channel memory ;-)
> 
> Nice to see some competition ;-)

10869 WITH Nvidia tweaks, no Inspector tweaks though.

https://benchmark.unigine.com/results/rid_977f56ec8793421a8bb548c51b2f7502


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> 10869 WITH Nvidia tweaks, no Inspector tweaks though.
> 
> https://benchmark.unigine.com/results/rid_977f56ec8793421a8bb548c51b2f7502


Is that with a +710 memory overclock? (If my calculations are correct?)

Max memory O/C I can get is +575 to +625, dependent on temperature.

I reckon quad-channel memory and a faster CPU would definitely let me hit an 11,000 score.

I just need to see if I can get that memory temperature a bit lower. And maybe find a way to over-volt it. GDDR5X can take up to 1.5v (the default being 1.34v, if memory serves me correctly). Not sure I'm ready to start soldering the PCB yet though lol.

*Is it the nvidia tweaks from the OP?*


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> 10869 WITH Nvidia tweaks, no Inspector tweaks though.
> 
> https://benchmark.unigine.com/results/rid_977f56ec8793421a8bb548c51b2f7502
> 
> 
> 
> Is that with a +710 memory overclock (if my calculations are correct)?
> 
> The max memory O/C I can get is +575 to +625, dependent on temperature.
> 
> I reckon quad-channel memory and a faster CPU would definitely let me hit an 11,000 score.
> 
> I just need to see if I can get that memory temperature a bit lower. And maybe find a way to over-volt it. GDDR5X can take up to 1.5v (the default being 1.34v, if memory serves me correctly). Not sure I'm ready to start soldering the PCB yet though lol.
> 
> *Is it the nvidia tweaks from the OP?*
Click to expand...

+699 memory.

Yes, Nvidia tweaks from OP.


----------



## LunaP

Hey guys, I'm noticing a random drop in performance at times when gaming. I haven't modded a BIOS yet, and I'm running 2x SC Blacks in SLI using waterblocks.

What BIOS/settings are good to start with in the event I'm throttling?


----------



## ZealotKi11er

Quote:


> Originally Posted by *LunaP*
> 
> Hey guys, I'm noticing a random drop in performance at times when gaming. I haven't modded a BIOS yet, and I'm running 2x SC Blacks in SLI using waterblocks.
> 
> What BIOS/settings are good to start with in the event I'm throttling?


Throttling will not cause random drops. At least, not the kind of throttling the BIOS affects.


----------



## schoolofmonkey

@Jpmboy

Did Jay really beat your score?


----------



## Mooncheese

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 10% more clock speed for 55% more power. Should have bought a Vega.


Lol! My thoughts exactly.


----------



## GRABibus

Hi KedarWolf,

in the BIOS collection, you could also add the EVGA FTW3 Master BIOS (330W):

https://www.techpowerup.com/vgabios/191990/evga-gtx1080ti-11264-170417

Formerly I was using the Slave BIOS (358W) from your collection with very good results on my Gigabyte Gaming OC 11G.

I have tried the Master BIOS and will definitely keep it. It gives the same results just by adding 20MHz on the core:



Fans are at "0" when idling, which is an advantage for noise versus the Slave BIOS.

I am gaming stable at the following base clocks: *Core=1670MHz/Memory=1501MHz*, which leads to *Core=2063MHz/Memory=6003MHz at Vddc between 1.062V and 1.081V*, depending on the game.
And for an air-cooled card like mine, it is better I think to have 330W than 358W.


----------



## KedarWolf

Quote:


> Originally Posted by *GRABibus*
> 
> Hi KedarWolf,
> 
> in the BIOS collection, you could also add the EVGA FTW3 Master BIOS (330W):
> 
> https://www.techpowerup.com/vgabios/191990/evga-gtx1080ti-11264-170417
> 
> Formerly I was using the Slave BIOS (358W) from your collection with very good results on my Gigabyte Gaming OC 11G.
> 
> I have tried the Master BIOS and will definitely keep it. It gives the same results just by adding 20MHz on the core:
> 
> 
> 
> Fans are at "0" when idling, which is an advantage for noise versus the Slave BIOS.
> 
> I am gaming stable at the following base clocks: *Core=1670MHz/Memory=1501MHz*, which leads to *Core=2063MHz/Memory=6003MHz at Vddc between 1.062V and 1.081V*, depending on the game.
> And for an air-cooled card like mine, it is better I think to have 330W than 358W.


Added FTW3 330W BIOS and powerlimit.bat file.
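For anyone wondering what a power-limit batch file typically does (I haven't opened the actual file from the collection, so this is only a guess at the usual approach): the driver-side limit can be raised with nvidia-smi's `-pl` switch, capped at whatever maximum the flashed BIOS allows.

```shell
:: Hypothetical sketch only -- NOT necessarily the actual powerlimit.bat.
:: nvidia-smi's -pl switch sets the board power limit in watts, capped by
:: the BIOS maximum; it must be run from an elevated (admin) prompt.
:: 330 matches the FTW3 Master BIOS limit discussed above.
nvidia-smi -i 0 -pl 330
```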


----------



## sportfan1969

This is what I was able to score with my FTW3 on the stock (Slave) BIOS: 127% on the power slider, 1.093v, +75 on the core, +1000 on the RAM, on my i7-6700K @4.6GHz with 16GB DDR4 @3000MHz...


----------



## DStealth

Good results... but as has been said a thousand times, "+75 on the core" means nothing on its own. Please write the exact core clock during the bench, not the offset. +1000 on the memory is very good, provided scores are still scaling at that point, of course.


----------



## KedarWolf

Quote:


> Originally Posted by *sportfan1969*
> 
> This is what i was able to score with my FTW3 on the stock (slave bios) 127% on the Power Slider 1.093v, +75 on the Core, +1000 on the RAM, on my i7-6700K @4.6 with 16GB DDR4 @3000 MHz...


You can hit CTRL+L in Afterburner on the highest point of your curve and it'll show your top core and memory speeds and lock your clocks there for benching.









Will still power limit though.


----------



## sportfan1969

KedarWolf,

I use Precision XOC, not Afterburner; I know that I would see better results with MSI AB... Below are my published score details from the Superposition leaderboards.
Would I benefit from the XOC BIOS flash at all?


----------



## sportfan1969

Quote:


> Originally Posted by *DStealth*
> 
> Good results...but 1000 times said this "+75 on the Core" means nothing please write the exact core clock during bench not the offset. +1000 memory is very good if scores are scaling till this point of course.


DStealth, I've uploaded screenshots of the score details from the Superposition leaderboards. As for the +1000, I saw no degradation (only improvement) over all the benchmark runs. The testing methodology I used was to OC the core one step at a time until a crash, then back off one step, and then OC the GDDR5X +25MHz at a time. There were no crashes while OC'ing the RAM and no significant loss of performance over the many runs (50+ runs on that day alone).
Is there anything you could suggest to improve the scores a bit more, outside of a waterblock or LN2? As I mentioned, I've already swapped the TIM with Conductonaut, and I'm running the Slave BIOS with fans at 100%, power limit 1.093v, power slider 127%, temp slider at 85C, KBoost on. This card is installed in an NZXT S340 with an i7-6700K @4.6GHz, 16GB Corsair LPX DDR4 @3000MHz, and a Corsair H115i (push/pull with 4 Corsair ML140 Pro LED White 140mm fans on the rad, an ML140 on top (exhaust) and an ML120 on the rear (exhaust)).
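The step-until-crash method described above can be sketched in a few lines. This is purely illustrative: `run_benchmark` is a hypothetical stand-in for an actual stability test, not a real tool or API.

```python
# Illustrative sketch of the step-until-crash method described above.
# `run_benchmark` is a hypothetical callback returning True if the run
# passed at the given core offset (MHz); it is not a real tool or API.

def find_max_offset(run_benchmark, start=0, step=13, limit=300):
    """Return the highest core offset (MHz) whose run still passes."""
    offset = start
    # Keep stepping up while the next step is in range and still stable.
    while offset + step <= limit and run_benchmark(offset + step):
        offset += step
    return offset  # one step below the first failing offset

# Example with a fake card that becomes unstable past +100 MHz:
print(find_max_offset(lambda mhz: mhz <= 100))  # 91
```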


----------



## GRABibus

Quote:


> Originally Posted by *KedarWolf*
> 
> Added FTW3 330W BIOS and powerlimit.bat file.












I could even play BF1 for 30 minutes today with +106MHz on core, which makes a 1675MHz base clock and 2076MHz core in-game.
No crash. To be confirmed with more gameplay.









With +111MHz on Core (1680MHz base clock), I crash (BF1, COD WWII).


----------



## SavantStrike

Quote:


> Originally Posted by *sportfan1969*
> 
> DStealth, Ive uploaded screenshots of the score details from the Superposition leaderboards; as for the +1000 I saw no degradation (only improvement) over all the benchmark runs. The testing methodology i used was to OC the core 1 step at a time until crash then back off 1 step, and then OC the GDDR5X + 25 MHz at a time. There were no crashes while OC'ing the RAM and no significant loss of performance over the many runs (50+ runs on that day alone)
> Is there anything you could suggest to improve the scores a bit more outside of a waterblock or LN2? As I mentioned I've already swapped the TIM with Conductonaut, running on Slave bios and fan's at 100% Power limit 1.093v, Power Slider 127%, Temp Slider at 85C, KBoost On. This card is installed in a NZXT s340 with an i7-6700K @4.6 GHz, 16 GB Corsair LPX DDR4 @3000 MHz, Corsair H115i (push/pull with 4 Corsair ML 140 Pro Led White 1400mm Fans on the Rad, a ML 140 on the top (exhaust) and an ML 120 on the rear (exhaust).


Quote:


> Originally Posted by *iBerggman*
> 
> Thanks, didn't realize to compare the port order. I guess I'll have to keep looking for a founders edition.


What are your GPU temps like? You mention you're considering the XOC BIOS, but unless you're under water it might get a bit, er, toasty. 400W-plus runs on air will easily have you breaching 85C in many if not all cases.


----------



## RoBiK

Quote:


> Originally Posted by *SavantStrike*
> 
> What are your GPU temps like? You mention you're considering the XOC BIOS, but unless you're under water it might get a bit, er, toasty. 400W-plus runs on air will easily have you breaching 85C in many if not all cases.


His screenshot is showing max of 48 degrees celsius... that is great for air. I think with XOC and custom VF curve he should be able to go above 2100MHz.


----------



## NBrock

Anyone else having 3DMark issues in SLI? It crashes for me in any test in SLI.

I have tried stock clocks on CPU and GPUs.
Also tried DDU for drivers
I tried the Steam version and well as the Stand Alone version. Clean installs of both.

Currently on 388.00. Before that I was on 385.69.

SuperPosition works in SLI.
Valley bench
Gaming works in SLI (BF1, Battle Front, CSGO, OverWatch)


----------



## KedarWolf

Quote:


> Originally Posted by *NBrock*
> 
> Anyone else having 3DMark issues in SLI? It crashes for me in any test in SLI.
> 
> I have tried stock clocks on CPU and GPUs.
> Also tried DDU for drivers
> I tried the Steam version and well as the Stand Alone version. Clean installs of both.
> 
> Currently on 388.00. Before that I was on 385.69.
> 
> SuperPosition works in SLI.
> Valley bench
> Gaming works in SLI (BF1, Battle Front, CSGO, OverWatch)


"Update - I read on futuremark forums if HPET is turned off in bios it will crash 3DMark . Had it turned off in bios , turned it back on . Rerun 3DMark with no issues." read in a forum.


----------



## NBrock

Quote:


> Originally Posted by *KedarWolf*
> 
> "Update - I read on futuremark forums if HPET is turned off in bios it will crash 3DMark . Had it turned off in bios , turned it back on . Rerun 3DMark with no issues." read in a forum.


Awesome find man! I will report back when I'm not on mobile.


----------



## pez

In an Owner's Club on OCN of all places, I'm seeing people getting criticized for pushing the limits of their GPUs for benchmark scores?

I didn't think my life would come full circle this early on.


----------



## Rakanoth

Quote:


> Originally Posted by *GRABibus*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I could even play BF1 for 30 minutes today with +106MHz on core, which makes a 1675MHz base clock and 2076MHz core in-game.
> No crash. To be confirmed with more gameplay.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> With +111MHz on Core (1680MHz base clock), I crash (BF1, COD WWII).


How many FPS gains did you get?


----------



## Mooncheese

Duplicate.


----------



## GRABibus

Quote:


> Originally Posted by *Rakanoth*
> 
> How many FPS gains did you get?


I didn't check, and I am sure it is negligible.
By the way, this OC crashed in COD WWII.


----------



## jameschisholm

Is the 1080 Ti worth £700?


----------



## MaKeN

Get it from USA cheaper... depending on shipping cost


----------



## Cookybiscuit

My Aorus Extreme is unstable at stock clocks, and will only do 1873MHz regardless of voltage. I tested it for 12 hours at 1.000V, 1873-1860MHz, and it's stable, but if I try to run it at 1911MHz or so, even at 1.093V, it crashes within minutes (usually seconds). It also will only take about +250MHz on the memory.

Do I have the world's absolute worst 1080 Ti, or is there something amiss here? I'm confused because I recently switched to a new motherboard and CPU, and beforehand I was able to get it above 2000MHz at 1.081V, enough to score 10118 points in 4K Superposition. Can a motherboard change have any influence on this sort of thing? It has always been unstable at stock clocks, though. I'll have to try a BIOS update to see if that fixes it; if not, I'll probably RMA it and get a Strix.


----------



## 8051

Quote:


> Originally Posted by *Cookybiscuit*
> 
> My Aorus Extreme is unstable at stock clocks, and will only do 1873MHz regardless of voltage. I tested it for 12 hours at 1.000V 1873-1860MHz and it's stable, but if I try to run it at 1911MHz or something even at 1.093V it crashes within minutes (usually seconds). It also will only take about +250MHz on the memory.
> 
> Do I have the worlds absolute worst 1080Ti or is there something amiss here? I'm confused because I've recently made a switch to a new motherboard and CPU, and beforehand I was able to get it above 2000MHz at 1.081V, enough to score 10118 points in 4K Superposition. Can a motherboard change have any influence on this sort of thing? It has always been unstable at stock clocks though, will have to try a BIOS update to see if that fixes it, if not I'll probably RMA it and get a Strix.


What kind of temps are you getting? I have a Gigabyte 1080 Ti Gaming OC, which is probably the same card with a different cooler attached. Did you crank the fan speed to the maximum? As others have said, Pascal likes to be cool.

My Gigabyte gets to 2088 reliably but takes 1.112V to do it.


----------



## Cookybiscuit

Quote:


> Originally Posted by *8051*
> 
> What kind of temps are you getting? I have a Gigabyte 1080Ti Gaming OC which is probably the same card w/a different cooler attached? Did you crank up the fan speed to the maximum? As others have said Pascals like to be cool.
> 
> My Gigabyte gets to 2088 reliably but takes 1.112V to do it.


The temperatures don't seem to make any difference whatsoever. I can leave it turned off for two hours, turn the system on, fire up a benchmarking program or a game, and it'll crash within 30 seconds despite barely being above 30-40C. Same story if I use my custom profile to heat it up to 80C+, then use Afterburner to reset to defaults.


----------



## Streetdragon

What PSU do you have? Maybe the power is not stable enough... but it is also possible that you got a "slower" chip. Everything above the rated boost clock is still an overclock and is not guaranteed.


----------



## Cookybiscuit

Quote:


> Originally Posted by *Streetdragon*
> 
> What PSU do you have? Maybe the power is not stable enough..... but the it is possible, that you got a "slower" chip. Everything over Boost still is OC and not guaranteed.


EVGA T2 1000W. It's the only PSU I have on hand at the moment, so I'm not able to swap it out and see if it's causing the "issue".

I guess I maybe just got a crappy GPU; a shame I can't look at the ASIC quality to know for sure. It looks like some people can do 2000MHz or more at 1.000V, which seemed high to me, but I guess it's still only 7% better than what I've got, and 7% is well within the margins of the silicon lottery. The crashing at stock clocks is still a problem, though; if a flash to the newest BIOS doesn't fix it, maybe I'll see what Gigabyte say.


----------



## Streetdragon

If it crashes at stock Speed: -> Send it back and get a new one!


----------



## Blameless

Make sure the screws holding the card together are snug and re-test. You should also be using an independent cable for each power connector on the card.

I would also carefully reevaluate the stability of the rest of the system. It's not at all unusual for new, powerful, hardware to shift bottlenecks and reveal issues with seemingly unrelated components.

Also, "ASIC quality" is just a relative leakage rating...it's not really a quality guideline.


----------



## KedarWolf

I now have enough 17.0 W/m-K thermal pads to completely cover everything on my EK waterblock.









1.5mm on the R22 VRMs, 1.0mm everywhere else.









I'm doing everything pictured except the pads marked blue; the waterblock doesn't actually contact them, and I think they would hurt more than help.


----------



## FedericoUY

Is anyone reaching 2000MHz @ 1000mV or less on stock cooling?


----------



## RoBiK

Quote:


> Originally Posted by *FedericoUY*
> 
> Is anyone reaching 2000mhz @ 1000mv or less on stock cooling?


ASUS ROG Strix 1080 Ti OC with the Strix XOC BIOS and abysmal case ventilation. Running a compute workload more or less 24/7.


----------



## FedericoUY

Quote:


> Originally Posted by *RoBiK*
> 
> ASUS ROG Strix 1080 Ti OC with the Strix XOC BIOS and abysmal case ventilation. Running a compute workload more or less 24/7.


I've got that same card with the same BIOS. I can reach 1012mV stable in BF1 (and everything else), and I can go to 1000mV as long as it's cold, but as soon as the card starts to warm up (hot days), that is no longer stable. Considering putting this card under water...


----------



## RoBiK

Quote:


> Originally Posted by *FedericoUY*
> 
> I've got that same card with same bios. I could reach 1012mv BF1 (and all) stable , but I can go to 1000mv as long as is cold, but as soon as the card starts to warm (hot days), that is no longer stable. Considering going on watercooling this card...


It is stable for my particular workload under my particular conditions. Your mileage may vary. I am getting 4 of these cards and putting them under water sometime in the next couple of weeks. As a rule of thumb, you get a bump of one frequency bin (12.5MHz) per 10 degrees Celsius that you manage to cut.


----------



## SavantStrike

Quote:


> Originally Posted by *Cookybiscuit*
> 
> My Aorus Extreme is unstable at stock clocks, and will only do 1873MHz regardless of voltage. I tested it for 12 hours at 1.000V 1873-1860MHz and it's stable, but if I try to run it at 1911MHz or something even at 1.093V it crashes within minutes (usually seconds). It also will only take about +250MHz on the memory.
> 
> Do I have the worlds absolute worst 1080Ti or is there something amiss here? I'm confused because I've recently made a switch to a new motherboard and CPU, and beforehand I was able to get it above 2000MHz at 1.081V, enough to score 10118 points in 4K Superposition. Can a motherboard change have any influence on this sort of thing? It has always been unstable at stock clocks though, will have to try a BIOS update to see if that fixes it, if not I'll probably RMA it and get a Strix.


It sounds like a bad clocker. Since it's unstable at stock speed, a replacement card is a reasonable request. Try getting another one, the Aorus cards are really nice.


----------



## nrpeyton

Quote:


> Originally Posted by *RoBiK*
> 
> It is stable for my particular workload at my particular conditions. Your mileage may vary. I am getting 4 of these cards and putting them under watercooling sometime in the next couple of weeks. As a rule of thumb you get a bump of one frequency bin (12.5 MHz) per 10 degrees celsius that you manage to cut.


That is very conservative.

I would rather estimate 2MHz for every 1 degree C.

So if you can only manage 10C then yes, maybe you'll only get one extra bin.

But as soon as you are able to cut 30C or more, the increase in 'frequency ceiling' becomes more apparent. --> _30C gets you approximately 5 bins (or 62MHz)._

I get about 100MHz for 50C.

On Maxwell we got a bit more.

With lithography getting ever smaller, I fear Nvidia's next release will squeeze the fun out of overclocking even more.
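The two rules of thumb in this exchange (one ~12.5MHz bin per 10C cut, versus ~2MHz per 1C) can be compared directly. Both are rough community estimates, not anything Nvidia specifies, and every chip behaves differently.

```python
# Comparing the two rules of thumb above. Both are rough community
# estimates of extra clock headroom per degree Celsius of cooling,
# not measured or guaranteed figures.

def headroom_per_bin(delta_c):
    """Conservative rule: one ~12.5 MHz bin per full 10 C cut."""
    return (delta_c // 10) * 12.5

def headroom_linear(delta_c):
    """Optimistic rule: ~2 MHz per 1 C cut."""
    return 2.0 * delta_c

for dt in (10, 30, 50):
    print(dt, headroom_per_bin(dt), headroom_linear(dt))
# A 30 C cut gives ~60 MHz by the linear rule (the "approximately 5
# bins" above); a 50 C cut gives ~100 MHz, the water-cooling figure.
```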


----------



## becks

Soon enough we will have a TI and a TI K.... if you know what I mean...
Next gen "bin"...


----------



## nrpeyton

Quote:


> Originally Posted by *becks*
> 
> Soon enough we will have a TI and a TI K.... if you know what I mean...
> Next gen "bin"...


aye,

Anyone used Liquid Metal _(i.e. Conductonaut)_ between GPU core and waterblock? How many extra degrees did you shave off compared to paste?

Made an interesting discovery:

VRAM overclocks a lot faster when I run with XOC firmware as opposed to my stock Founders Edition BIOS.

Stable VRAM Overclocks:

Stock FE BIOS = +525 --> +575 _(dependent on temps)_
XOC firmware = +705 _(still carrying out tests, but no errors so far unless I go to +710)_

The XOC BIOS must use looser VRAM timings. So if you flash it, be sure to re-find a new max VRAM overclock.


----------



## GRABibus

Quote:


> Originally Posted by *8051*
> 
> What kind of temps are you getting? I have a Gigabyte 1080Ti Gaming OC which is probably the same card w/a different cooler attached? Did you crank up the fan speed to the maximum? As others have said Pascals like to be cool.
> 
> My Gigabyte gets to 2088 reliably but takes 1.112V to do it.


So you mean you flashed your Gaming OC with the ASUS XOC BIOS? (U > 1.093v)
I have the same card, but flashing with this BIOS never worked for me.


----------



## nrpeyton

Quote:


> Originally Posted by *GRABibus*
> 
> So you mean you flashed your Gaming OC with ASUS XOC Bios ? (U > 1.093v)
> I have the same card, but flashing with this Bios never worked for me.


I have a 1080Ti Founders Edition.

I flashed my Founders Edition with the XOC BIOS. _(1.2v . no temp limits, no power limits)_.

I'm surprised you weren't able to successfully flash the XOC onto your card. Your card is perfect for the XOC _(provided precautions are taken, given the limiters being removed)_.

Guys - 'Forza Motorsport 7' for PC is a *100GB download*. Considering how well optimized a big title like Forza usually is, a download of that magnitude can only mean one thing: *graphically*, it's an *absolutely beautiful game.*

There's a FREE demo available (only 29GB) in the Windows 10 Microsoft Store, too. I'm downloading it now; can't wait to try it.

The full game also has cross-platform multiplayer enabled (i.e. Xbox vs. PC).









Has anyone else bought the game? There are different packages available ranging from £50 to £75 for digital download.

I'll report back about the DEMO soon.


----------



## 8051

Quote:


> Originally Posted by *GRABibus*
> 
> So you mean you flashed your Gaming OC with ASUS XOC Bios ? (U > 1.093v)
> I have the same card, but flashing with this Bios never worked for me.


Yes, I have the ASUS Strix OC BIOS flashed on my Gigabyte. It eliminated all my power throttling.


----------



## szeged

So I've been out of the GPU loop for a while, still rocking my original Titan X from launch day.

I'm looking to get a 1080 Ti. Which ones are considered the best for OCing on air and playing games at 4K?

My entire system has been neglected for a little over 2 years now and it's starting to show lol.


----------



## KedarWolf

Completely redid my thermals pads on my EK block and backplate with Fujipoly 17.0 W/mK Ultra Extreme XR-m thermal pads.

At 75% pump speed, 60% radiator and system fans, a single 360 rad in push/pull with my GPU block and a 5960X at 4.6GHz in the loop, I got no higher than 39C GPU temps, and that was in the Time Spy Extreme stress test.









Edit: This at 2088 core/6147 memory 1.100v, XOC BIOS.









GPU-ZSensorLog.txt 324k .txt file




Second Edit: Ran Time Spy Extreme stress test at .800v, 1771core/5977 memory.

Only passed at 99.5% but max temps were 32C.


----------



## nrpeyton

Not bad.

The XOC's extra voltage is giving me CRAZY speeeeds (2214MHz @ 1.20v). _I actually managed to complete the benchmark at 2265MHz, but the bench results made it look unstable._

*Forza Motorsport 7:* 4k, ultra

*/\* _actual game-play -- not a video_

At 4K with everything at Ultra, the game is *so* freaking _demanding_ that I'm still only averaging 40 FPS in the game's benchmark (obviously the in-game benchmark gives a worst-case stress on the GPU, but still). lol. _I'm also running my CPU at 5GHz and memory at 4133MHz._

'Max CPU thread utilization' is only about 45%, and 'total CPU utilization' on my 7700K is only 22%. So I'm puzzled how the GPU is only sitting at 92% utilization in my screenshot. _Poor coding/programming by the game devs?_


----------



## KedarWolf

I'm #2 in 8K Optimized, #3 in 4K Optimized, Superposition.









https://benchmark.unigine.com/leaderboards/superposition/1.0/8k-optimized/single-gpu/page-1

https://benchmark.unigine.com/leaderboards/superposition/1.0/4k-optimized/single-gpu/page-1

Only #13 in 1080P Extreme though.









https://benchmark.unigine.com/leaderboards/superposition/1.0/1080p-extreme/single-gpu/page-1


----------



## feznz

Just got a new PSU, an RM1000i. I have no idea why, with a 1500W pure-sine-wave UPS, this is still my 5th faulty PSU in 9 years. Not happy, but I got one with a 10-year warranty (I found the receipt for my Corsair 850W, which was still in warranty, but I had already thrown out the PSU).
My former 1500W Enermax, already replaced once under warranty, was just out of its 5-year warranty.
A 1000W unit is a bit of overkill, but I'm still thinking of SLI in the near future. The Link software is pretty cool, though I cannot understand how I can have 621W max in but only 576W max out.



Now I can get down to playing Assassin's Creed Origins, a beautiful game, without random shutdowns.


----------



## GRABibus

Quote:


> Originally Posted by *8051*
> 
> Yes I have the Asus Strix OC flashed on my Gigabyte. It eliminated all my power throttling.


Can you please post your stable OC data (how much on core? how much on memory?) and also your V/F curve?

I flashed yesterday and I have big crashes even with only +70MHz on core, especially in Time Spy and Unigine Heaven.

With the EVGA FTW3 Master BIOS I can add 101MHz on core without any issue, starting from a base clock of 1569MHz, the same as with the ASUS XOC...

I don't understand.


----------



## Gdourado

Hi,

Anyone here has a gainward Phoenix model?
This one:
http://www.gainward.com/main/vgapro.php?id=1006&lang=en



How is it?
Is the cooling solution a good one?
Good temps and low noise?
Does it clock well?

From what I can tell online, it is the same card as the Palit Jetstream. So if anyone has a palit, how do you like it?

Cheers!


----------



## ottoore

Quote:


> Originally Posted by *feznz*
> 
> I cannot understand how I can have 621w max watts in but have 576w max watts out.


PSU efficiency.

576/621 = 0.927

That's OK for ~50% load on a Gold unit (220V; I guess you're in Europe).
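The arithmetic behind that answer, for anyone puzzled by the same readings: a PSU always draws more at the wall than it delivers to the components, and the ratio is its efficiency at that load.

```python
# PSU efficiency from the two Corsair Link readings quoted above:
# input (wall) power is always higher than output (DC) power.

def efficiency(watts_out, watts_in):
    """Fraction of wall power actually delivered to the system."""
    return watts_out / watts_in

eff = efficiency(576, 621)
print(round(eff, 4))  # about 93%, plausible for an 80 Plus Gold unit near half load
```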


----------



## lilchronic

Quote:


> Originally Posted by *GRABibus*
> 
> Can you please post your stable OC data (how much on core? how much on memory?) and also your V/F curve?
> 
> I flashed yesterday and I have big crashes even with only +70MHz on core, especially in Time Spy and Unigine Heaven.
> 
> With the EVGA FTW3 Master BIOS I can add 101MHz on core without any issue, starting from a base clock of 1569MHz, the same as with the ASUS XOC...
> 
> I don't understand.


You need to know what core clock you are actually running; +70MHz on core is going to be different from card to card.

The XOC BIOS boosts to 2000MHz, so +70 should be 2070MHz, or 2063MHz since it goes by ~13MHz bins.

With the stock FE BIOS the boost clock is 1911MHz, so +70 would give you 1981MHz or 1976MHz.
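The point about offsets versus absolute clocks can be made concrete with a small model. This is a simplification: real Pascal bins alternate around 12-13MHz, so actual values can differ by a few MHz, and the 2063MHz grid anchor is just taken from the example above.

```python
# Simplified model of why "+70" means different things on different
# BIOSes: the requested clock (boost + offset) snaps down onto a ~13 MHz
# bin grid. Real Pascal bins alternate ~12/13 MHz, so treat results as
# approximate; 2063 MHz is used as the grid anchor from the post above.

BIN_MHZ = 13

def snap_to_bin(target_mhz, anchor_mhz=2063, bin_mhz=BIN_MHZ):
    """Snap a requested clock down onto the bin grid through anchor_mhz."""
    return anchor_mhz + ((target_mhz - anchor_mhz) // bin_mhz) * bin_mhz

print(snap_to_bin(2000 + 70))  # XOC BIOS, +70 offset -> 2063
print(snap_to_bin(1911 + 70))  # stock FE BIOS, +70 offset -> ~1972
```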


----------



## SavantStrike

Quote:


> Originally Posted by *KedarWolf*
> 
> Completely redid my thermals pads on my EK block and backplate with Fujipoly 17.0 W/mK Ultra Extreme XR-m thermal pads.
> 
> At 75% pump speed, 60% radiator fans and system fans, a single 360 rad, push/pull with my GPU block and a 5960X at 4.6GHZ in the loop I got no higher than 39C GPU temps and this for Time Spy Extreme stress test.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: This at 2088 core/6147 memory 1.100v, XOC BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GPU-ZSensorLog.txt 324k .txt file
> 
> 
> 
> 
> Second Edit: Ran Time Spy Extreme stress test at .800v, 1771core/5977 memory.
> 
> Only passed at 99.5% but max temps were 32C.


What were the old temps?


----------



## FedericoUY

Quote:


> Originally Posted by *nrpeyton*
> 
> Guys - 'Forza 7 Motorsport' for PC is a *100GB download*. Now also -- considering how well optimized a big title like Forza is... a download size of that magnitude can only mean one thing --> *graphically*, that it's an *absolutely beautiful game.*
> 
> There's a FREE demo available (only 29GB) on the Windows 10; Microsoft Store, too. I'm downloading it now -- can't wait to try it.
> 
> The full game is also cross-plaform multiplayer, enabled. (I.E. XBOX vs. PC).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Has anyone else bought the game? There are different packages available ranging from £50 to £75 for digital download.
> 
> I'll report back about the DEMO soon.


Did you download the demo of this game? Is it still available on the Microsoft Store?


----------



## 8051

Quote:


> Originally Posted by *GRABibus*
> 
> Can you please post your stable OC data (how much on core? how much on memory?) and also your V/F curve?
> 
> I flashed yesterday and I have big crashes even with only +70MHz on core, especially in Time Spy and Unigine Heaven.
> 
> With the EVGA FTW3 Master BIOS I can add 101MHz on core without any issue, starting from a base clock of 1569MHz, the same as with the ASUS XOC...
> 
> I don't understand.


I'm using Precision XOC. My curve tops out at 1112mV (x-axis) and +100MHz (y-axis). The last point on the curve (1125mV) I have set down to +25MHz. On the left-hand side I have a smooth curve leading up to the 1112mV point, with the point right before it (1100mV) set to +75MHz.

You should monitor your overclock to make sure you're getting the voltages you set/request.
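The curve shape described here (a smooth ramp up to a chosen peak voltage, with every point above it pulled down so the card never requests it) can be sketched as data. Purely illustrative: the voltage points and offsets mirror the numbers in the post, and nothing here touches a real Precision XOC or Afterburner API.

```python
# Illustrative model of the V/F curve shaping described above: offsets
# ramp smoothly up to a peak voltage point, and any point beyond the
# peak is set LOWER so the card settles at the peak voltage. Numbers
# mirror the post; this does not drive any real overclocking tool.

def shape_curve(points_mv, peak_mv=1112, peak_offset=100, above_offset=25):
    """Return {voltage_mV: offset_MHz} flattened above peak_mv."""
    lo = min(points_mv)
    curve = {}
    for mv in sorted(points_mv):
        if mv < peak_mv:
            # linear ramp from 0 at the lowest point to the peak offset
            curve[mv] = round(peak_offset * (mv - lo) / (peak_mv - lo))
        elif mv == peak_mv:
            curve[mv] = peak_offset
        else:
            curve[mv] = above_offset  # pulled down past the peak
    return curve

print(shape_curve([1000, 1050, 1100, 1112, 1125]))
```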


----------



## GRABibus

Quote:


> Originally Posted by *8051*
> 
> I'm using Precision XOC. My curve tops out at 1112mV (x-axis) and +100 MHz. (y-axis). The last position in the curve (1125mV) I have set down to +25 Mhz. On the left hand side i have a smooth curve leading up to the 1112mV point with the point right before 1112mV (1100mV) set to +75 Mhz.
> 
> You should monitor you overclock to make sure you're getting the voltages you set/request.


I did monitor everything.

With the EVGA BIOS, I get 2063MHz at 1.062V - 1.075V, and I am fully stable:
+101MHz on core
+497MHz on memory
Voltage slider at 100%
Power slider at 117% (maximum)

With the XOC BIOS, I get 2063MHz - 2076MHz at 1.112V - 1.125V, and Time Spy + Unigine Heaven crash:
+65MHz on core
+497MHz on memory
Voltage slider at 100%
Power slider => greyed out and disabled

A screenshot of your V/F curves would help


----------



## lilchronic

Quote:


> Originally Posted by *GRABibus*
> 
> I did monitor everything.
> 
> With the EVGA BIOS, I get 2063MHz at 1.062V - 1.075V, and I am fully stable:
> +101MHz on core
> +497MHz on memory
> Voltage slider at 100%
> Power slider at 117% (maximum)
> 
> With the XOC BIOS, I get 2063MHz - 2076MHz at 1.112V - 1.125V, and Time Spy + Unigine Heaven crash:
> +65MHz on core
> +497MHz on memory
> Voltage slider at 100%
> Power slider => greyed out and disabled
> 
> A screenshot of your V/F curves would help


With the stock BIOS at that voltage you are being power limited, so the card throttles and is not using the full power it could at those clocks. The XOC BIOS has no limits, so now your card can perform at its highest. That means your OC on the stock BIOS may not be stable with the XOC, as the card is drawing more power to maintain those clocks.
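A first-order way to see why the unlimited BIOS changes stability: dynamic power in CMOS scales roughly with voltage squared times frequency, so the same clocks held without throttling draw noticeably more. The baseline numbers below are taken from the posts above; the model is a textbook approximation, not measured 1080 Ti data.

```python
# First-order CMOS dynamic power model, P ~ V^2 * f, to illustrate why
# un-throttled clocks draw more power. Baseline values come from the
# posts above; this is an approximation, not measured card data.

def relative_power(v, f_mhz, v0=1.062, f0_mhz=2063):
    """Power relative to the 1.062 V / 2063 MHz EVGA-BIOS baseline."""
    return (v / v0) ** 2 * (f_mhz / f0_mhz)

# Same card on the XOC BIOS at 1.125 V / 2076 MHz, never throttled:
print(round(relative_power(1.125, 2076), 3))  # roughly 13% more power
```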


----------



## GRABibus

Quote:


> Originally Posted by *lilchronic*
> 
> With stock bios at that voltage your are being power limited so the card throttles and is not using the full power it can at those clocks. XOC bios has no limits so now your card can perform at its highest. That means your oc on stock bios may not be stable with xoc as it is drawing more power to maintain those clocks.


OK, thanks.
I am not on the stock BIOS; I use the EVGA FTW3 Master BIOS (330W).
But your explanation remains the same whatever my BIOS.

I will keep this EVGA FTW3 Master BIOS. It is the best I have tested for my card so far.


----------



## nrpeyton

Quote:


> Originally Posted by *FedericoUY*
> 
> Did you downloaded the demo of this game? is it still available on the microsoft store?


Yes m8.

(I posted a screenshot on the last page on this thread). And yes it's still available.

Quote:


> Originally Posted by *8051*
> 
> I'm using Precision XOC. My curve tops out at 1112mV (x-axis) and +100 MHz. (y-axis). The last position in the curve (1125mV) I have set down to +25 Mhz. On the left hand side i have a smooth curve leading up to the 1112mV point with the point right before 1112mV (1100mV) set to +75 Mhz.
> 
> You should monitor you overclock to make sure you're getting the voltages you set/request.


MSI Afterburner will allow you to set up to 1.2v using the XOC BIOS _(as unlike Precision X -- the curve goes all the way up to 1.2v)._

The only thing to be careful of is that Pascal doesn't generally like more voltage unless it's cold enough. If the card is too hot, there are actually occasions when I've seen a card more stable at a lower voltage when testing different voltages at the same frequency.


----------



## FedericoUY

Quote:


> Originally Posted by *nrpeyton*
> 
> Yes m8.
> 
> (I posted a screenshot on the last page on this thread). And yes it's still available.


Thanks, already downloading...


----------



## NeedMoreJuice

Something odd happened. I turned off my PC, switched off the power supply at its switch, removed the cables, and when I pulled the graphics card out of the PCIe slot (there seemed to be some power left) the ROG LED logo on the motherboard lit up again and then went off. Do I have to worry about the PCIe slot or the GPU?


----------



## aberrero

Hey guys, I'm looking to upgrade to a Ti from my 1080. Couple questions:

1. Any reason to think a newer, faster card is coming out soon at this price point (i.e. not a Titan)?

2. Any good deals on reference cards (I have a reference water block)?

3. What's the best reference card to get? Are there any with good BIOS compatibility/unlocked voltage/good bins, etc.?


----------



## nrpeyton

Quote:


> Originally Posted by *aberrero*
> 
> Hey guys, I'm looking to upgrade to a Ti from my 1080. Couple questions:
> 
> 1. Any reason to think a newer, faster card is coming out soon at this price point (i.e. not a Titan)?
> 
> 2. Any good deals on reference cards (I have a reference water block)?
> 
> 3. What's the best reference card to get? Are there any with good BIOS compatiblity/unlocked voltage/good bins, etc.?


1. I'd say no. There are no big announcements about Nvidia's next big architectural release yet,
and AMD isn't competitive at the 1080 Ti level.

2. eBay? (Or use the Google Shopping button and sort by lowest price) and look around. Prices will differ depending on where you live. (I'm in the U.K. and I think Ebuyer & Alza.co.uk are the two cheapest here.)

3. EVGA has the best support policy, and you are allowed to water-cool the card (and remove the heatsink/cooler to re-do the TIM) without voiding the warranty.

Other than that, I believe all reference models are the same. ASUS is the only company with a BIOS that has the power/temp limits removed and allows more voltage (up to 1.2v), but an ASUS Founders Edition is no more compatible with that BIOS than an EVGA Founders Edition; both FEs behave similarly when using it. If you want the card that pairs up with that BIOS best, the cheapest STRIX OC version would be the one _(but obviously that is not a reference design)._
The BIOS is called the "XOC" -- you've maybe seen it referred to on here already. It was designed by ASUS for their sponsored LN2 team, and a guy on that team was nice enough to give us the file. (I'm not sure ASUS ever intended the file to be released publicly, but we're lucky it was.)
I have the EVGA Founders Edition and I'm currently using that BIOS. It was designed for use on a STRIX OC, hence why that's the best card for it, but I still enjoy using it on my EVGA FE perfectly fine. Although I know that with a STRIX OC my score would probably only be _slightly_ higher in benches.


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *aberrero*
> 
> Hey guys, I'm looking to upgrade to a Ti from my 1080. Couple questions:
> 
> 1. Any reason to think a newer, faster card is coming out soon at this price point (i.e. not a Titan)?
> 
> 2. Any good deals on reference cards (I have a reference water block)?
> 
> 3. What's the best reference card to get? Are there any with good BIOS compatiblity/unlocked voltage/good bins, etc.?
> 
> 
> 
> 1. I'd say no. There are no big announcements about Nvidia's next big architectural release yet.
> AMD isn't competitive at 1080Ti level.
> 
> 2. Ebay? (Or use google shopping button then sort by lowest price) and look around. Prices will be different though depending where u live. (I'm in the U.K and I think Ebuyer & Alza.co.uk are the two cheapest here).
> 
> 3. EVGA has the best support policy and you are allowed to water-cool it (and remove heatsink/cooler to re-do TIM without voiding warranty).
> 
> Other than that I believe all reference models are the same. ASUS is the only company with a BIOS that has power/temp limits removed and allows more voltage (up to 1.2v) but an ASUS Founders Edition is no more compatible with that BIOS than an EVGA Founders Edition. Both FE's would behave similarly when using that BIOS. If you wanted the card that pairs up with that BIOS the best then the cheapest STRIX OC version would be best _(but obviously that is not a reference design)._

Even a regular Strix, not the OC, should be fine to save money. You're flashing the XOC (or alternately the Strix OC BIOS) anyway, and they're basically the same card.









Some say the OC might be binned a bit better, but the reality of the 1080 Ti is you can get an OC that clocks poorly or a regular card that OCs well; it's all luck.

It's basically the silicon lottery with our cards, luck of the draw more than model type.


----------



## feznz

Quote:


> Originally Posted by *ottoore*
> 
> Psu efficiency.
> 
> 576/621= 0.927
> 
> That' s ok for ~50% load on a gold unit ( 220v, i guess you' re in Europe ).


Thanks. I just woke up after a week averaging 4 hours of sleep a night; it makes perfect sense now.
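For anyone else sanity-checking the arithmetic in the quote above, here's a quick sketch. The 576W/621W figures are from the quote; the 92%-at-50%-load threshold is the 80 Plus Gold requirement for 230V units:

```python
# Sanity-checking the PSU efficiency math quoted above.
dc_output_w = 576    # what the components actually draw from the PSU
wall_draw_w = 621    # what the meter at the wall reads

efficiency = dc_output_w / wall_draw_w
print(round(efficiency, 3))  # 0.928, i.e. the ~0.927 from the quote

# 80 Plus Gold calls for >= 92% efficiency at 50% load on 230V mains,
# so this reading is right in spec for a gold unit.
assert efficiency >= 0.92
```

So the "missing" 45 watts is just heat lost in the PSU, exactly as ottoore said.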


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> Even a regular Strix not OC should be fine to save money. You're flashing the XOC or alternately the Strix OC BIOS anyways and they are basically the same card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Some say the OC might be binned a bit better, but the reality of the 1080 Ti is you can get an OC that clocks poorly or a regular card that OCs well; it's all luck.
> 
> It's basically the silicon lottery with our cards, luck of the draw more than model type.


True. I couldn't remember if the regular STRIX still used the "OC" suffix.

Thanks for clearing that up.









Also KedarWolf congrats on reaching 200 rep points and for very successfully starting & maintaining the thread ;-)


----------



## chibi

Hi guys, looking for help regarding waterblock dimensions. If one of you happens to have either an Aquacomputer or Heatkiller 1080 Ti block, may I please ask for the lengths below? Thank you ever so much


----------



## KCDC

Anyone here using the Phanteks blocks for these cards? They so fancy! But I have two cards and hear these blocks are pretty dang heavy. Was thinking of selling my EKs for a couple of these.


----------



## nrpeyton

Quote:


> Originally Posted by *KCDC*
> 
> Anyone here using the Phanteks blocks for these cards? They so fancy! But I have two cards and hear these blocks are pretty dang heavy. Was thinking of selling my EKs for a couple of these.


Do they perform _better_ with regard to GPU core temp and memory temps?


----------



## KCDC

Quote:


> Originally Posted by *nrpeyton*
> 
> Do they perform _better_ with regard to GPU core temp and memory temps?


I'd imagine all GPU waterblocks perform about the same; I just like how they look. I wanted them when they were first announced, but they took forever to release and I ended up getting the EKWBs. I only want them for the aesthetics, mainly.


----------



## nrpeyton

Quote:


> Originally Posted by *KCDC*
> 
> I'd imagine all GPU waterblocks perform about the same; I just like how they look. I wanted them when they were first announced, but they took forever to release and I ended up getting the EKWBs. I only want them for the aesthetics, mainly.


My main board looks like a disco machine with all the lights (RGB, lol). I remember when I first got it I could fall asleep in awe at the beauty of my new gaming machine, lol. So I completely understand. If it makes it look good & makes you feel good, go for it.


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Even a regular Strix not OC should be fine to save money. You're flashing the XOC or alternately the Strix OC BIOS anyways and they are basically the same card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Some say the OC might be binned a bit better but the reality of the 1080 Ti is you can get an OC that clocks poorly or get a regular card the OCs well, at's all luck.
> 
> It's basically the silicon lottery with our cards, luck of the draw more than model type.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> True. I couldn't remember if the regular STRIX still used the "OC" suffix.
> 
> Thanks for clearing that up.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also KedarWolf congrats on reaching 200 rep points and for very successfully starting & maintaining the thread ;-)

Thanks @nrpeyton I really enjoy helping with this thread and have learned a ton of great stuff from all you peeps!!


----------



## 8051

Quote:


> Originally Posted by *GRABibus*
> 
> I did monitor everything.
> 
> With EVGA Bios, I get 2063MHz at 1.062V - 1.075V, and I am full stable.
> +101MHz on Core
> +497MHz on Memory
> Slider voltage at 100%
> Slider power at 117% (Maximum)
> 
> With XOC Bios, I get 2063MHz - 2076MHz at 1.112V - 1.125V and Time Spy + Heaven Unigine crash
> +65MHz on Core
> +497MHz on Memory
> Slider voltage at 100%
> Slider power => greyed and disabled.
> 
> A screenshot of your V/F curves would help


----------



## 8051

Quote:


> Originally Posted by *aberrero*
> 
> Hey guys, I'm looking to upgrade to a Ti from my 1080. Couple questions:
> 
> 1. Any reason to think a newer, faster card is coming out soon at this price point (i.e. not a Titan)?
> 
> 2. Any good deals on reference cards (I have a reference water block)?
> 
> 3. What's the best reference card to get? Are there any with good BIOS compatiblity/unlocked voltage/good bins, etc.?


I agree with what others have said, it's not worth it. I upgraded from an overclocked 980Ti and I only get 40% to 60% more minimum FPS.


----------



## aberrero

Quote:


> Originally Posted by *8051*
> 
> I agree with what others have said, it's not worth it. I upgraded from an overclocked 980Ti and I only get 40% to 60% more minimum FPS.


I don't think anyone has said it isn't worth it? I game at 4k exclusively, and my 1080 is getting around 45fps in my games. Getting 30% more performance from a Ti would mean being able to hit 60fps and use vsync. I've also spent close to $4000 on my system as a whole, so spending $250 to get 30% more performance seems to make a lot of sense.


----------



## BigBeard86

I have an MSI Gaming X, watercooled and OC'd to 2139MHz.
Past this, minor intermittent artifacts appear while gaming.

Is there any BIOS I can use to increase the power limit and vcore limit for my card?


----------



## 8051

Quote:


> Originally Posted by *aberrero*
> 
> I don't think anyone has said it isn't worth it? I game at 4k exclusively, and my 1080 is getting around 45fps in my games. Getting 30% more performance from a Ti would mean being able to hit 60fps and use vsync. I've also spent close to $4000 on my system as a whole, so spending $250 to get 30% more performance seems to make a lot of sense.


I'll bet you would get more out of a 1080SLI setup than you would out of a single 1080Ti -- at least in games that support it.

If you overclock your 1080, the difference would be insignificant.


----------



## SavantStrike

Quote:


> Originally Posted by *BigBeard86*
> 
> I have an MSI gaming X, watercooled and oced to 2139mhz.
> Past this, minor intermittent artifacts arise while gaming
> 
> Is there any bios I can use to increase the power limit and vcore limit for my card?


The XOC BIOS will give you what you want, although it sounds like you've already got unusually high clocks.


----------



## l1CappYl1

Bought a Strix OC and it's running over 2100MHz with temps in the low to mid 50s. I'm using the included software and no hardware mods.

I'm curious about all the custom BIOSes and OC software mentioned here, and wonder if I should try to push it more, as I seem to still have plenty of thermal headroom.


----------



## BigBeard86

Quote:


> Originally Posted by *SavantStrike*
> 
> The XOC BIOS will give you what you want, although it sounds like you've already got unusually high clocks.


Can you guide me to where to get the BIOS, how to back up mine, and how to flash?

I attached an image... it would be very nice if I could hit 2.2GHz. Seems like I can, if I had some more voltage and power.



http://imgur.com/HEAMow9


----------



## aberrero

Quote:


> Originally Posted by *8051*
> 
> I'll bet you would get more out of a 1080SLI setup than you would out of a single 1080Ti -- at least in games that support it.
> 
> If you overclock your 1080, the difference would be insignificant.


That's true, but the 1080Ti is cheaper and hardly any games support SLI.


----------



## KedarWolf

Quote:


> Originally Posted by *aberrero*
> 
> Quote:
> 
> 
> 
> Originally Posted by *8051*
> 
> I'll bet you would get more out of a 1080SLI setup than you would out of a single 1080Ti -- at least in games that support it.
> 
> If you overclock your 1080, the difference would be insignificant.
> 
> 
> 
> That's true, but the 1080Ti is cheaper and hardly any games support SLI.

Yes, but in most games you can use Nvidia Inspector driver hacks to enable SLI.


----------



## ottoore

Quote:


> Originally Posted by *feznz*
> 
> Thanks I just woke up after a week of 4hrs average a night sleep it makes perfect sense now


----------



## Gdourado

Hello all.
I sold my 1080 yesterday to have funds to step up to a ti.
I got 525 euros for my 1080.
Now the best deal I have is to order a Gainward 1080 Ti Phoenix for 630 euros.
But I don't know much about the card.
From what I can tell, it is the same as a Palit Jetstream with a different plastic shroud on the cooler.
Does anyone have one?
It has a dual 100mm fan triple slot cooler and 12 phase custom pcb.
Would it be worth it?
An Aorus, Strix or MSI Gaming is almost 800 euros!
I know you usually get what you pay for but still...
Is the Gainward any good?
It's this card:
http://www.gainward.com/main/vgapro.php?id=1006&lang=en

Cheers and thanks


----------



## GRABibus

Quote:


> Originally Posted by *8051*
> 
> I'll bet you would get more out of a 1080SLI setup than you would out of a single 1080Ti -- at least in games that support it.
> 
> If you overclock your 1080, the difference would be insignificant.


It is not only a question of fps and core overclock.

Don't forget the huge VRAM consumption of some games...
In Titanfall 2, for example, in ultra texture mode, even my former 1080 SLI had stutters because that mode in TF2 requires more than 8GB of VRAM. And with a 1080 SLI you get the same amount of VRAM as a single 1080 (8GB), even if your bandwidth is higher.

With my single 1080 Ti (11GB VRAM), I don't stutter anymore in TF2 in Extra texture mode.


----------



## GRABibus

Quote:


> Originally Posted by *8051*


Aarrggh








I forgot what a headache the V/F curve is to manage with Precision X...
You don't have MSI Afterburner?

By the way, thank you for your answer.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gdourado*
> 
> Hello all.
> I sold my 1080 yesterday to have funds to step up to a ti.
> I got 525 euros for my 1080.
> Now the best deal I have is to order a Gainward 1080 Ti Phoenix for 630 euros.
> But I don't know much about the card.
> From what I can tell, it is the same as a Palit Jetstream with a different plastic shroud on the cooler.
> Does anyone have one?
> It has a dual 100mm fan triple slot cooler and 12 phase custom pcb.
> Would it be worth it?
> An Aorus, Strix or MSI Gaming is almost 800 euros!
> I know you usually get what you pay for but still...
> Is the Gainward any good?
> It's this card:
> http://www.gainward.com/main/vgapro.php?id=1006&lang=en
> 
> Cheers and thanks


At that difference in price, I would get the Gainward.


----------



## Hulio225

Guys, I have a problem: tonight I got a black screen...
My 1080 Ti isn't listed anymore in the BIOS or in Windows while using the iGPU of my CPU.
But if I *remove the PCIe power connectors from the 1080 Ti, it is listed again in BIOS and Windows*...

I am wondering if I have to buy a new 1080 Ti or maybe just a power supply...
I have a custom waterblock so I can't test the card in another PC, and I have no other power supply at the moment; maybe later today...

But what do you guys think, is it the GPU or the power supply? Did any of you have similar issues?

Edit: Quickly got hold of a PSU... It seems to be the GPU :´(


----------



## Rakanoth

Quote:


> Originally Posted by *Hulio225*
> 
> Guys, I have a problem: tonight I got a black screen...
> My 1080 Ti isn't listed anymore in the BIOS or in Windows while using the iGPU of my CPU.
> But if I *remove the PCIe power connectors from the 1080 Ti, it is listed again in BIOS and Windows*...
> 
> I am wondering if I have to buy a new 1080 Ti or maybe just a power supply...
> I have a custom waterblock so I can't test the card in another PC, and I have no other power supply at the moment; maybe later today...
> 
> But what do you guys think, is it the GPU or the power supply? Did any of you have similar issues?
> 
> Edit: Quickly got hold of a PSU... It seems to be the GPU :´(


I guess PSU. You must find another computer in which you can test your GPU.


----------



## nrpeyton

Anyone who is using the XOC, notice that their card doesn't idle properly?

Mine constantly consumes 70 watts at 0.813v @ 1569MHz (even after a fresh reboot or a cold boot, just browsing the internet etc.).


----------



## 8051

Quote:


> Originally Posted by *nrpeyton*
> 
> Anyone who is using the XOC, notice that their card doesn't idle properly?
> 
> Mine is constantly consuming 70 watts at 0.813v @ 1569MHZ. (Even after a fresh reboot or from shutdown; just browsing the internet etc).


Some browsers use GPU acceleration -- I know for a fact Firefox does.


----------



## lilchronic

Quote:


> Originally Posted by *nrpeyton*
> 
> Anyone who is using the XOC, notice that their card doesn't idle properly?
> 
> Mine is constantly consuming 70 watts at 0.813v @ 1569MHZ. (Even after a fresh reboot or from shutdown; just browsing the internet etc).


Yeah, I get that. I just hit the reset button in MSI Afterburner and it will downclock. It's pretty annoying though; since I leave my PC on 24/7, I've had it happen a few times overnight.


----------



## nrpeyton

Quote:


> Originally Posted by *8051*
> 
> Some browsers use GPU acceleration -- I know for a fact Firefox does.


Aye, but my browser isn't doing anything that needs a 1080 Ti clocked at 1569MHz lol.

I could still run full-blown games at 4k at that speed with probably only a 25% drop in framerate lol.

The reset button does nothing to make it idle properly.

Yeah, it sure is annoying, especially when the loop is set up with a noisy water chiller...

At idle loads (browsing the internet / watching videos / YouTube etc.) my system doesn't produce enough heat to warm my coolant past the thermostat setting on the chiller. (So the chiller pretty much remains inactive/switched off.)

But with the GPU not idling properly (constantly putting out 70 watts), it raises my water temp enough that the chiller switches on. I don't mind during gaming because I wear headphones, so I don't hear anything anyway, but it's awfully annoying when I'm not.

Needs fixed. I mean, I need to figure something out -- fast.









Saying that -- it's made me notice something nice. I switched my dew point meter on; the dew point is quite low tonight. I can go down as low as 7c water temp tonight with no condensation risk. Maybe I'll see if I can beat my best score.

Edit:
Wow, I was just pulling 463 watts at 1.2v on the FE with the GPU temp only 27c. Clocked at 2189MHz & +685 memory -- all fully stable. OMG; and they moaned because Vega is hungry. Nvidia sure has a bigger beak.

P.S.
Are any of you using dual-boot machines with the 2nd boot being a clean install of Windows, only for benching purposes? I think my old & bloated Windows 10 install could be costing me serious points.


----------



## Nervoize

Had a conversation with some LN2 overclockers today and told them about my 2164MHz at 1.2v. They were impressed and told me that temperature was my only limit: it could easily hit 2300-2500 at 1.2v with proper (LN2) cooling. Currently sitting around 40c under load, but the power draw is massive; the card pulls 520 watts through the PCIe slot and 2x 8-pin at 2164MHz core / 6003MHz memory. I don't know how to lower the temps any further. Running a 30mm-thick 360 rad and a 40mm-thick 280 rad at 15-18c ambient.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Nervoize*
> 
> Had a conversation with LN2 overclockers today and told them about my 2164MHz at 1.2v. They were impressed and told me that my temp was the only limit. It could easily hit 2300-2500 at 1.2 with proper cooling (LN2). Currently sitting around 40c under load, but the power output is massive. The card pulls 520 watts through PCIE and 2x8pin at 2164Mhz core 6003Mhz memory. Don't know how to lower the temps even more. Running a 30mm 360 rad and 40mm 280rad at ambient 15-18c.


Lower ambient.


----------



## nrpeyton

I get a little afraid when I see that power reading hitting the 450W+ mark at 1.2v _(according to Nvidia, anything over 250 watts voids the warranty)._

High current combined with high voltage causes electromigration & degradation.

I've noticed racing games (like Forza 7) don't seem to pull as much current _(power, measured in watts)_, even at 4k. I feel safer over-volting more in racers.

Graphically demanding FPS and RPG games, or benchmarks such as SuperPosition and Firestrike 4k Extreme, just aren't safe for continuous running at 1.2v due to the high current.

I've also noticed that games/benches are much more stable at higher frequencies & voltages if the power usage is low. (For example, I can pass the Forza 7 benchmark at 4k ultra @ 2265MHz, but at the same settings SuperPosition is stable at 2189 and only barely stable at 2202MHz, _which might crash 1 run in 3._)

Anyway, I've just got Titanfall 2 after someone mentioned a few pages back that it uses over 8GB of VRAM at 4k ultra. I'm very interested to see how nice the graphics look; I've never played a game that uses more than 8 before.

Installing now ;-)


----------



## 2ndLastJedi

Quote:


> Originally Posted by *nrpeyton*
> 
> I get a little afraid when I see that power reading hitting the 450w + + mark at 1.2v _(According to Nvidea anything over 250 watts voids warranty)._
> 
> High current combined with high voltage causes electro migration & degradation.
> 
> I've noticed racing games (like Forza 7) don't seem to pull as much current _(power, measured in watts)._, even at 4k. I feel safer over-volting more in racers.
> 
> Graphically demanding FPS and RPG games or benchmarks such as SuperPosition and Firestrike 4k Extreme just aren't safe for continous running at 1.2v due to high current.
> 
> Also noticed that games/benches are much more stable at higher frequencies & voltages if the power usage is low. (For example I can pass the Forza 7 benchmark at 4k ultra @ 2265MHZ) but in SuperPosition at the same settings it's stable at 2189 but only barely stable at 2202MHZ _(which might crash 1/3 runs)._
> 
> Anyway I've just got Titanfall 2 after someone mentioned a few pages back about it using over 8gb of VRAM at 4k ultra. I'm very interested to see how nice the graphics look. Never played a game that uses more than 8, before.
> 
> Installing now ;-)


I'm getting the full 11GB of VRAM used in Wolfenstein 2 with the 8k textures!


----------



## tozedup

Hello, I don't know if anyone could help...

I'm running an ASUS FE GTX 1080 Ti overclocked to 2114MHz on water, with the XOC custom BIOS.

When I push 2138MHz my 2 screens start turning on and off lol? But it runs 2138MHz fine?


----------



## KedarWolf

Quote:


> Originally Posted by *SavantStrike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Completely redid my thermals pads on my EK block and backplate with Fujipoly 17.0 W/mK Ultra Extreme XR-m thermal pads.
> 
> At 75% pump speed, 60% radiator fans and system fans, a single 360 rad, push/pull with my GPU block and a 5960X at 4.6GHZ in the loop I got no higher than 39C GPU temps and this for Time Spy Extreme stress test.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: This at 2088 core/6147 memory 1.100v, XOC BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GPU-ZSensorLog.txt 324k .txt file
> 
> 
> 
> 
> Second Edit: Ran Time Spy Extreme stress test at .800v, 1771core/5977 memory.
> 
> Only passed at 99.5% but max temps were 32C.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What were the old temps?

Around 48C.


----------



## tozedup

Nice, I'm at about 10% pump speed and 30% fan speed and getting results similar to yours.


----------



## KedarWolf

Quote:


> Originally Posted by *tozedup*
> 
> Nice I'm at about 10% pump speed and 30% fan speed and getting results similar to yours


How many rads and do you have a CPU in your loop?


----------



## tozedup

Quote:


> Originally Posted by *KedarWolf*
> 
> How many rads and do you have a CPU in your loop?


Oh, I've got 2 loops; the card is using a 65mm-thick 360 rad. Hey, is it worth watercooling RAM?


----------



## KedarWolf

Quote:


> Originally Posted by *tozedup*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> How many rads and do you have a CPU in your loop?
> 
> 
> 
> oh I got 2 loops card is using a 360 65mm rad. Hey is it worth watercooling ram?

I don't think so, no. If you check your RAM temps in HWiNFO, they're never hot.


----------



## nrpeyton

Quote:


> Originally Posted by *2ndLastJedi*
> 
> I'm getting the full 11GB of VRAM used in Wolfenstein 2 with the 8k textures!


Does sound *interesting*... but do you have an 8k monitor? (The way I see it, it's just a loss in framerate for extra resolution your monitor/TV can't display?)

I love finding games that make full use of more than 8 gigs of video RAM at 4k, as it usually means it's a beautiful game.

I'm seriously considering upgrading the TIM on my 1080 Ti to liquid metal (Conductonaut). Getting sick of seeing the GPU running approx. 15-17c above coolant temperature.

Anyone else who uses liquid metal on their GPU able to tell me the difference between coolant and GPU temp with the LM applied? Or am I going to have to be my own guinea pig on this one lol.


----------



## nrpeyton

Quote:


> Originally Posted by *tozedup*
> 
> oh I got 2 loops card is using a 360 65mm rad. Hey is it worth watercooling ram?


I agree with KedarWolf. (About the RAM).

The only *exception* to that rule is if you're running chilled water around your loop (i.e. your loop runs with a water chiller instead of a radiator). Then you'll be able to get those sticks running below ambient temps, which could help you eke out the next bin of stability from them, i.e. 4133MHz instead of 4000MHz at the same timings _(that's if you're lucky)._ A more probable outcome is that you'll actually just get the same speed, but longer (& further) coverage in MemTest before it finds an error _(when stress testing your memory)._

*VRAM Overclocking - Update:*
I've noticed Titanfall 2 is quite crash-resistant, despite its high power draw. I can get 2215MHz at 1.2v / +685 memory, stable. Draw is up to 510 watts. So that is very impressive.

It's also a *great memory tester.* It's the *only game* (to date) since upgrading to my 1080 Ti where I *actually see artefacts on screen* if I go too high on memory _(instead of crashing)._

When I further validate my findings in OCCT, the two seem to match up nicely (in terms of finding the max memory overclock). So that gives me plenty of confidence that my new +685 on this Founders Edition is perfectly stable. _That memory overclock just wasn't possible before upgrading to the XOC firmware on my FE._


----------



## 2ndLastJedi

Quote:


> Originally Posted by *nrpeyton*
> 
> Does sound *interesting*.. but do you have an 8k monitor? (way i see it is it's just a loss in framerate for extra resolution your monitor/tv can't display)?
> 
> I love finding games that make full use of more than 8 gigs of video ram at 4k. As it usally means it's a beautiful game.
> 
> I'm seriously conidering upgrading the TIM on my 1080Ti to Liquid Metal (conductonaut). Getting sick of seeing the GPU running approx. 15 - 17c above coolant temperature.
> 
> Anyone else who uses Liquid Metal on their GPU able to tell me the difference between coolant and GPU temp with the LM used? Or am I going to have to be my own guinnii pig on this one lol.


Nah mate, I have a 4k screen, but since Wolf 2 has it you may as well try it, hey!
Frame rate dropped by about half. I use vsync, especially in this game, because I had such a hard time with stutter and tearing, but I've now solved it, yay.
4k max in this is pretty awesome, and some textures could definitely use the 8k file, but even vanilla it's a looker. When playing at speed this is probably the best looking game I have! No shimmer or aliasing at all, the lighting is amazing, and it has a great story, writing and actors.
I find it amazing how a game that looks this good can run at 4k Ultra well above 60 when others look ordinary and struggle to hit 60!


----------



## nrpeyton

Quote:


> Originally Posted by *2ndLastJedi*
> 
> Nah Mate i have a 4k screen but well just because Wolf 2 has it you may as well try it hey !
> Frame Rate dropped by about half , i use vsync especially in this because i had such a hard time with stutter and tearing but have now solved it ,yay .
> 4k Max in this is pretty awesome but some textures could definitely use the 8k file but just vanilla it's a looker . When playing at speed this is probably the best looking game i have ! No shimmer or aliasing at all and the lighting is amazing and great story, writing and actors .
> I find it amazing how a game that looks this good can run at 4k Ultra well above 60 when others look ordinary and struggle to hit 60 !


I have Wolfenstein too, m8. It is a great game.

But in terms of graphics immersion there are so many other titles out there that look a lot nicer. (Wolf does look good at 4k, not denying that; just saying there are more luxurious titles out there in terms of raw graphical beauty.)

The new 'Rise of the Tomb Raider', for instance, is a beautiful game.


----------



## pfinch

hey guys,

do you also have the problem where the voltage points (on the curve, MSI Afterburner 4.4 final) keep jumping to different spots after restarting Afterburner and/or Win10?


----------



## nrpeyton

Quote:


> Originally Posted by *pfinch*
> 
> hey guys,
> 
> do you also have the problem where the voltage points (on the curve, MSI Afterburner 4.4 final) keep jumping to different spots after restarting Afterburner and/or Win10?


That happens on purpose. It's not a bug.

Pascal core overclocks work in incremental jumps of roughly 13MHz. So 1999MHz, 2012MHz, 2025MHz, etc.

You'll find the curve will "snap" into place, rounding up/down to the nearest pre-set bin.

Also, any time you set a lower frequency point _*after*_ already setting a higher frequency for a voltage before it, the line will "even out" horizontally.

That's because there's no point in having two voltages attached to the same frequency. (It automatically evens the line out horizontally, and in-game the driver will use the smaller of two voltages attached to the same frequency -- it always uses the smaller one. The curve "snaps" into place to reflect that.)

Try plotting the points on the curve one at a time, using the 'Apply' button in the main window after every voltage point, then remember to save the profile to one of slots 1-5 at the end. It might take you 5-10 mins to do a whole curve.
OR
Decide what voltage you mostly want to run at when gaming (say 1.075v). Hold down the Ctrl key and drag the frequency corresponding to 1.075v up to the frequency you want (for example 2051MHz), then hit Apply. After hitting Apply you might need to go back and adjust the very last (and only the last) voltage point at 1.075v, as the last one doesn't seem to "stick" properly when using this method. Use the up/down arrow keys on the keyboard to perfect it. In this instance you'd use the up arrow to move the frequency at 1.075v up to 2051MHz _(after the Ctrl-key exercise it won't be far off -- at most one bin below, usually matching the frequency of the next-lowest voltage bin -- so an easy 2-second fix)_.

Edit: There is actually a scenario where you might want the same frequency attached to more than one voltage point: if you don't want your card to go over a voltage you specify. In that case, set the highest frequency you want to run at, at the voltage you want it to run at, then the same frequency for every voltage point after it. In games the system will then pick the lowest voltage that frequency is attached to.
_(So for example: if you have the same frequency set for 1.075, 1.081 & 1.093v, the system will only use the lower 1.075v to reach that frequency.)_
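To make the "snapping" concrete, here's a tiny Python sketch of rounding a requested clock to the nearest bin. The 13MHz step and the 1999MHz reference are just the figures quoted above; the driver's own bin table is what actually decides.

```python
# Sketch of Pascal's clock "bins": requested clocks snap to the nearest
# ~13 MHz step. Step size and reference clock are the figures quoted above,
# not values read from the driver.

BIN_STEP_MHZ = 13.0
REFERENCE_MHZ = 1999.0  # a known bin-aligned clock

def snap_to_bin(requested_mhz: float) -> float:
    """Round a requested core clock to the nearest pre-set bin."""
    bins = round((requested_mhz - REFERENCE_MHZ) / BIN_STEP_MHZ)
    return REFERENCE_MHZ + bins * BIN_STEP_MHZ

print(snap_to_bin(2051.0))  # 2051.0 -- already on a bin (1999 + 4 * 13)
print(snap_to_bin(2045.0))  # 2051.0 -- rounds up to the nearest bin
```

This is also why two adjacent curve points can end up showing the same frequency: both requests rounded to the same bin.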


----------



## Gurkburk

So on Singles' Day (11/11/17) there was a sale where we could take roughly 120 USD off an item costing above 550 USD.

I was unsure whether this was actually gonna work on any type of expensive PC hardware. Tested the code on a 1080 Ti, and it worked. A discount like that is never seen here in Sweden; what we usually see on such high-tier hardware is at most a 5% sale.

I jumped on it and ordered an MSI 1080 Ti, which I got for ~840 USD; before, it was ~960 USD.

I've got a 1070 atm, with an Arctic Extreme cooler, which will sell for around 400.

Did I do a good pickup here? Or should one wait for Volta? (Which I've read might be delayed due to no competition from AMD.)


----------



## GreedyMuffin

Quote:


> Originally Posted by *Gurkburk*
> 
> So on Singles' Day (11/11/17) there was a sale where we could take roughly 120 USD off an item costing above 550 USD.
> 
> I was unsure whether this was actually gonna work on any type of expensive PC hardware. Tested the code on a 1080 Ti, and it worked. A discount like that is never seen here in Sweden; what we usually see on such high-tier hardware is at most a 5% sale.
> 
> I jumped on it and ordered an MSI 1080 Ti, which I got for ~840 USD; before, it was ~960 USD.
> 
> I've got a 1070 atm, with an Arctic Extreme cooler, which will sell for around 400.
> 
> Did I do a good pickup here? Or should one wait for Volta? (Which I've read might be delayed due to no competition from AMD.)


Komplett.se I guess? ^^

I'd keep the card. Sell the 1070.


----------



## Gurkburk

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Komplett.se I guess? ^^
> 
> I'd keep the card. Sell the 1070.


Haha yeah. You saw the deal too? That's insane for a hardware item like that here. You'd never see that, or anything close to it, on Black Friday.

But yeah, going to sell the 1070. That money pretty much pays for half a 1080 Ti.

Edit: Are there any firmware updates to go for with the MSI 1080 Ti when I get it?


----------



## nrpeyton

yeeeehaa!!

Worldwide international Superposition bench results:

My last run would have come _(if I had a premium account to submit the score)_ 6th place worldwide @ 4K Optimized!!

Still being beaten by our thread leader, KedarWolf, and also k|ngp|n, who both hold some of the top spots there. (I.e. K|ngp|n no. 1 at 1080p Extreme and KedarWolf no. 2 worldwide in 4K Optimized.)

I am quite happy with my 6th place. lol. Soon to be 5th place _worldwide_.


----------



## Gurkburk

Quote:


> Originally Posted by *nrpeyton*
> 
> yeeeehaa!!
> 
> Worldwide international Superposition bench results:
> 
> My last run would have come _(if I had a premium account to submit the score)_ 6th place worldwide @ 4K Optimized!!
> 
> Still being beaten by our thread leader, KedarWolf, and also k|ngp|n, who both hold some of the top spots there. (I.e. K|ngp|n no. 1 at 1080p Extreme and KedarWolf no. 2 worldwide in 4K Optimized.)
> 
> I am quite happy with my 6th place. lol. Soon to be 5th place.


I'll come at you when i get my msi 1080 ti & Arctic extreme cooler


----------



## nrpeyton

Quote:


> Originally Posted by *Gurkburk*
> 
> I'll come at you when i get my msi 1080 ti & Arctic extreme cooler


lol, bring it


----------



## 2ndLastJedi

Quote:


> Originally Posted by *nrpeyton*
> 
> I have Wolfenstein too m8.. it is a great game.
> 
> But in terms of graphics immersion there are so many other titles out there that look a lot nicer. (Wolf does look good at 4K though.. not denying that -- just saying there are more luxurious titles out there in terms of raw graphical beauty.)
> 
> The new 'Rise of the Tomb Raider', for instance, is a beautiful game.


When you say the "new 'Rise of the Tomb Raider'", do you mean the 2016 game? If so, I hear people say that it is a looker, but I have that one and can play it at near max @ 4K, and to me it just doesn't look as good lighting-wise as, say, Wolf 2 or Destiny 2. Maybe just a matter of opinion.
I'm a graphics whore, and if it looks good it will usually pull me in even if I don't like the story much (and Tomb Raider bores me), but... nothing








I'd like a list of all the BEST looking games !


----------



## KedarWolf

Quote:


> Originally Posted by *2ndLastJedi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nrpeyton*
> 
> I have Wolfenstein too m8.. it is a great game.
> 
> But in terms of graphics immersion there are so many other titles out there that look a lot nicer. (Wolf does look good at 4K though.. not denying that -- just saying there are more luxurious titles out there in terms of raw graphical beauty.)
> 
> The new 'Rise of the Tomb Raider', for instance, is a beautiful game.
> 
> 
> 
> When you say the "new 'Rise of the Tomb Raider'", do you mean the 2016 game? If so, I hear people say that it is a looker, but I have that one and can play it at near max @ 4K, and to me it just doesn't look as good lighting-wise as, say, Wolf 2 or Destiny 2. Maybe just a matter of opinion.
> I'm a graphics whore, and if it looks good it will usually pull me in even if I don't like the story much (and Tomb Raider bores me), but... nothing
> 
> 
> 
> 
> 
> 
> 
> 
> I'd like a list of all the BEST looking games !
Click to expand...

Best-looking game I've ever played, with an amazing story line, is Hellblade: Senua's Sacrifice.


----------



## 2ndLastJedi

Quote:


> Originally Posted by *KedarWolf*
> 
> Best-looking game I've ever played, with an amazing story line, is Hellblade: Senua's Sacrifice.


Okay that looks to be my next game!


----------



## SavantStrike

Quote:


> Originally Posted by *Gurkburk*
> 
> So on Singles' Day (11/11/17) there was a sale where we could take roughly 120 USD off an item costing above 550 USD.
> 
> I was unsure whether this was actually gonna work on any type of expensive PC hardware. Tested the code on a 1080 Ti, and it worked. A discount like that is never seen here in Sweden; what we usually see on such high-tier hardware is at most a 5% sale.
> 
> I jumped on it and ordered an MSI 1080 Ti, which I got for ~840 USD; before, it was ~960 USD.
> 
> I've got a 1070 atm, with an Arctic Extreme cooler, which will sell for around 400.
> 
> Did I do a good pickup here? Or should one wait for Volta? (Which I've read might be delayed due to no competition from AMD.)


Why didn't I think of that? All I bought was a pair of GPU blocks. Should've dreamt bigger.


----------



## BigBeard86

Is the component shown here to the right of the VRM (labeled 8-phase) supposed to be cooled? What are those? I'm asking because I'm running a Kraken bracket, and one of my copper heatsinks fell off of them -- the surface area is too small for strong adhesion.


----------



## MrTOOSHORT

@nrpeyton

Nice score there in 4k SP!


----------



## Streetdragon

Quote:


> Originally Posted by *nrpeyton*
> 
> I'm seriously considering upgrading the TIM on my 1080 Ti to liquid metal (Conductonaut). Getting sick of seeing the GPU running approx. 15-17 °C above coolant temperature.
> 
> Anyone else who uses liquid metal on their GPU able to tell me the difference between coolant and GPU temp with the LM applied? Or am I going to have to be my own guinea pig on this one, lol.


My GPU runs 3-5° over water temps with liquid metal. But I "only" push 0.997V with 2000 core, without vsync -- want as many frames as I can get.


----------



## Dasboogieman

Quote:


> Originally Posted by *Streetdragon*
> 
> My GPU runs 3-5° over water temps with liquid metal. But I "only" push 0.997V with 2000 core, without vsync -- want as many frames as I can get.


Mine is the same with LM except at 1.093V and 2100mhz.


----------



## DStealth

Tested my new CPU today: 11.5k GPU points with the stock cooler on my poor Palit GameRock.








https://www.3dmark.com/spy/2722579


----------



## KedarWolf

Quote:


> Originally Posted by *DStealth*
> 
> Tested my new CPU today 11.5k GPU points with the stock cooler on my poor Palit Gamerock
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.3dmark.com/spy/2722579


That's a really great graphics score!









This is my best.

https://www.3dmark.com/3dm/22687598


----------



## DStealth

Will beat it when I get home... can run 2114 core with +875/900 mem.
This morning this was the best result while testing the CPU... not really optimized.


----------



## Gurkburk

What do you guys use to stability-test your GPU clocks? I've heard FurMark is bad nowadays & can mess up your card. Running Unigine Heaven atm, but it doesn't seem to push the card & BARELY uses any memory. If it's not using the memory, does it even stress-test it?


----------



## FedericoUY

Quote:


> Originally Posted by *Gurkburk*
> 
> What do you guys use to stability-test your GPU clocks? I've heard FurMark is bad nowadays & can mess up your card. Running Unigine Heaven atm, but it doesn't seem to push the card & BARELY uses any memory. If it's not using the memory, does it even stress-test it?


I just use BF1; it's the best stress/stability test for me. I can pass all the other tests and still fail BF1. If you can play two 1000-ticket rounds, you're maybe stable or very near.


----------



## KedarWolf

Quote:


> Originally Posted by *Gurkburk*
> 
> What do you guys use to stability-test your GPU clocks? I've heard FurMark is bad nowadays & can mess up your card. Running Unigine Heaven atm, but it doesn't seem to push the card & BARELY uses any memory. If it's not using the memory, does it even stress-test it?


I run Time Spy Extreme or Firestrike Ultra stress tests a full loop.


----------



## Gurkburk

Quote:


> Originally Posted by *FedericoUY*
> 
> I just use BF1; it's the best stress/stability test for me. I can pass all the other tests and still fail BF1. If you can play two 1000-ticket rounds, you're maybe stable or very near.


Cba to install BF1 to do that xD


----------



## GRABibus

Quote:


> Originally Posted by *Gurkburk*
> 
> What do you guys use to stability-test your GPU clocks? I've heard FurMark is bad nowadays & can mess up your card. Running Unigine Heaven atm, but it doesn't seem to push the card & BARELY uses any memory. If it's not using the memory, does it even stress-test it?


From a benchmark-software point of view, Unigine Heaven is the test I can fail when all others pass (Firestrike, Time Spy).
And among my main games (BF1, DOOM, COD WWII, Overwatch, SWBF 2015, Quake Champions, Titanfall 2), BF1 is the one I can fail when all others pass.

So, if I can pass Unigine Heaven and BF1 for hours, I am very confident that my overclock is "stable" (even if "stable" means nothing for an overclock...).


----------



## feznz

Quote:


> Originally Posted by *Gurkburk*
> 
> What do you guys use to stability-test your GPU clocks? I've heard FurMark is bad nowadays & can mess up your card. Running Unigine Heaven atm, but it doesn't seem to push the card & BARELY uses any memory. If it's not using the memory, does it even stress-test it?


Depends on what you call stable, but Folding@home will tell the difference between a card that is stable for an hour or two and a card that is stable for a month or two.

In my personal experience it depends on the game: GTA5 was a killer for OCs, I always had to downclock for that game, yet other games I could play for hours on the same clocks.


----------



## 8051

Quote:


> Originally Posted by *feznz*
> 
> Depends on what you call stable, but Folding@home will tell the difference between a card that is stable for an hour or two and a card that is stable for a month or two.
> 
> In my personal experience it depends on the game: GTA5 was a killer for OCs, I always had to downclock for that game, yet other games I could play for hours on the same clocks.


It really does seem like stability varies by game/benchmark. With my 980 Ti, Dying Light used to be my go-to test because it would crash even when everything else worked, but with my 1080 Ti it's Unigine Heaven that is the most sensitive to overclocks.


----------



## nrpeyton

Quote:


> Originally Posted by *Streetdragon*
> 
> My GPU runs 3-5° over water temps with liquid metal. But I "only" push 0.997V with 2000 core, without vsync -- want as many frames as I can get.


Quote:


> Originally Posted by *Dasboogieman*
> 
> Mine is the same with LM except at 1.093V and 2100mhz.


Wow that's nice.... that's like another 10c (at least).

10c = 20mhz more stability.


----------



## SavantStrike

Quote:


> Originally Posted by *BigBeard86*
> 
> Is the thing labeled here on the right of the vrm (labeled 8 phase) supposed to be cooled? What are those? I am asking because I am running a kraken bracket, and one of my copper heatsinks fell off of those - surface area is too small to create strong adhesion.
> 
> 
> Spoiler: Warning: Spoiler!


That's an MSI Gaming X or an Armor? It would be helpful if you specified that next time; I had to figure out which card it was from memory/guesswork.

The original HSF didn't have a pad to cool those components, so you should probably be okay. From Guru3D: http://www.guru3d.com/articles_pages/msi_geforce_gtx_1080_ti_gaming_x_trio_review,4.html


----------



## aberrero

Quote:


> Originally Posted by *Gurkburk*
> 
> What do you guys use to stability-test your GPU clocks? I've heard FurMark is bad nowadays & can mess up your card. Running Unigine Heaven atm, but it doesn't seem to push the card & BARELY uses any memory. If it's not using the memory, does it even stress-test it?


The Superposition benchmark shows artifacts in the first two scenes for me when I push core frequency too high. I can reliably get artifacts in 30 seconds. Overwatch is great for overall stability.


----------



## 2ndLastJedi

So if, when I OC and run Superposition 4K, my PC restarts, is that due to memory being too high?
What are the symptoms of an unstable OC?
Is it artifacts for core and crashes for memory?


----------



## aberrero

Quote:


> Originally Posted by *2ndLastJedi*
> 
> So if when i OC and run SuperPosition 4k and my PC restarts is that due to memory being to high?
> What are the symptoms for unstable OC?
> Is it artifacts for core and crashes for memory?


It could be either one. You should only push one at a time. If it's restarting your computer, you're probably being very aggressive; a slightly-too-high setting will just make it freeze up or not load properly.


----------



## nrpeyton

Quote:


> Originally Posted by *2ndLastJedi*
> 
> So if when i OC and run SuperPosition 4k and my PC restarts is that due to memory being to high?
> What are the symptoms for unstable OC?
> Is it artifacts for core and crashes for memory?


Other way around: artifacts are more likely memory, and crashes the core.

If the system is restarting _(without blue-screening or freezing first)_, you could also estimate your total system power draw and compare that to the wattage rating of your PSU. Superposition 4K is a very high-current bench; I've seen my card alone peaking at 510 watts in Sup.

Little experiment I want to perform soon:

I want to measure the power draw on the 8-pin PCI-E power connector, then measure the power draw on the 6-pin, and see how they compare _(at load)_. It will be interesting to see how the card balances load between these two connectors in real time.
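As a rough illustration of that "estimate your draw vs. your PSU rating" step: the component wattages and the 80% derating factor below are just example assumptions, not measurements.

```python
# Back-of-envelope PSU headroom check. All numbers are illustrative.

def psu_headroom_watts(psu_rated_w: float, draws_w: list, derate: float = 0.8) -> float:
    """Usable headroom after summing estimated component draws.

    derate: fraction of the label rating you're willing to load continuously.
    A negative result suggests the PSU is undersized for these peaks.
    """
    return psu_rated_w * derate - sum(draws_w)

# Example: 750 W PSU, GPU peaking at 510 W (as seen in Superposition 4K),
# plus a rough 150 W for the CPU and 50 W for the rest of the system.
print(psu_headroom_watts(750, [510, 150, 50]))  # -110.0 -> cutting it too close
```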


----------



## 2ndLastJedi

It's weird, because I can usually get +138 on the core (2088 MHz), and I was setting the voltage curve to 2088; when I moved the core slider in AB just to see what the offset said, it was +217! No wonder it crashed!
But why would +138 make the core 2088, while setting the voltage curve to 2088 makes AB show +217?


----------



## nrpeyton

Quote:


> Originally Posted by *2ndLastJedi*
> 
> It's weird, because I can usually get +138 on the core (2088 MHz), and I was setting the voltage curve to 2088; when I moved the core slider in AB just to see what the offset said, it was +217! No wonder it crashed!
> But why would +138 make the core 2088, while setting the voltage curve to 2088 makes AB show +217?


If you use the slider in the main window first, that becomes the new baseline. Anything you then do in the curve is a further offset on top of that baseline.

So, for example, if in the main window you apply a traditional +50, the frequencies at every voltage point shift by +50. If you then raise the frequency for 1.093v in the curve by another +50, you'll be at a combined +100 MHz for 1.093v and still at +50 for all other voltage points.
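The baseline-plus-curve arithmetic can be sketched like this. Illustrative only -- this is not Afterburner's actual internals; the voltage point and offsets are the example values from above.

```python
# How a global core-offset slider combines with per-point curve offsets.
# Illustrative model only, not MSI Afterburner's real implementation.

def effective_offset_mhz(global_offset: int, curve_offsets: dict, voltage: float) -> int:
    """Total MHz offset at a voltage point = slider baseline + curve tweak."""
    return global_offset + curve_offsets.get(voltage, 0)

curve = {1.093: 50}  # extra +50 applied only at the 1.093 V point in the curve

print(effective_offset_mhz(50, curve, 1.093))  # 100 -> combined +100 MHz here
print(effective_offset_mhz(50, curve, 1.075))  # 50  -> still +50 everywhere else
```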

...continued from my last post:

There is a *common misconception* in the "GPU world" that the *6-pin PCI-E power connector* can only *safely deliver 75 watts* & the 8-pin only 150 watts.

This isn't exactly true. 75 W and 150 W are somewhat arbitrary numbers and are more driver-based than hardware-based. _(They just help the driver load-balance more safely, as the 8-pin has two extra ground wires -- so it can safely handle higher current.)_

Most high-quality power supplies use 16 AWG cables to the GPU. The Molex specification actually allows 13 amps per contact, but for circuit sizes of 6 and 8 respectively we derate that to 10 amps due to the thermal limitations of cable thickness and proximity. So we have 10 amps for circuit sizes of 6-8 _(6- & 8-pin)_.

The 6-pin has only 2 grounds. So 10 amps x two 12 V pins = 20 amps; then 20 amps x 12 V = *240 watts*.

The 8-pin has an extra ground, so it can make use of all three 12 V pins. So 10 amps x 3 = 30 amps; then 30 amps x 12 V = *360 watts*.

Smaller power supplies likely use 18 AWG cable, so only 8.5 amps per active 12 V pin (2 on the 6-pin and 3 on the 8-pin). A smaller power supply becomes:
204 watts (6-pin)
and
306 watts (8-pin).

Cheap power supplies may be stuck with thinner 20 AWG cable, so the safe limits for circuit sizes of 6-8 become even smaller -- this time only 7 amps per active 12 V pin.
So:
168 watts (6-pin)
and
252 watts (8-pin).

So there you have it -- even the cables/connectors of a cheap power supply are capable of wayyy more than the common misconception of only 75 watts and 150 watts.

(So the total max power of the 6- & 8-pin combined is not 225 W -- in fact, even a cheap power supply's cables should be capable of safely delivering 420 watts to the GPU. And remember, we aren't even counting the power that's also delivered via the PCI-E slot itself.)

_Disclaimer: if pushing high overclocks/voltages on an XOC BIOS, I recommend airflow directly over the cables and especially the PCI-E power "connectors" themselves. Connectors/plugs can heat up by an *extra* 30 °C at the full loads listed above.
Also, your PSU obviously has to actually be *capable* of delivering the total power on its 12 V rail._
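The connector arithmetic above can be checked in a few lines. The amps-per-gauge figures are the post's assumptions, not an official spec table.

```python
# Sanity-check the 6-pin / 8-pin wattage figures quoted above.
# Amp limits per cable gauge are the assumptions from the post.

RAIL_V = 12                         # volts on each PCI-E 12 V pin
ACTIVE_12V_PINS = {"6-pin": 2, "8-pin": 3}
AMPS_PER_PIN_BY_AWG = {16: 10.0, 18: 8.5, 20: 7.0}

def connector_watts(connector: str, awg: int) -> float:
    """Safe wattage for a connector given cable gauge: pins * amps * volts."""
    return ACTIVE_12V_PINS[connector] * AMPS_PER_PIN_BY_AWG[awg] * RAIL_V

for awg in (16, 18, 20):
    six = connector_watts("6-pin", awg)
    eight = connector_watts("8-pin", awg)
    print(f"{awg} AWG: 6-pin {six:.0f} W, 8-pin {eight:.0f} W, combined {six + eight:.0f} W")
```

Even the 20 AWG (cheap PSU) row reproduces the 168 W / 252 W / 420 W figures from the post.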



----------



## Leijido

This is off-topic, but do any 1080 Ti owners here have 2 GTX 1080 Tis in SLI and an i7-8700K? I really need to see 8700K SLI-supported gaming benchmarks at 4K resolution or above before I get one. Stock-frequency benchmarks are fine, as long as it's actual games and not Time Spy, 3DMark Firestrike, etc. Thanks so much, and sorry for the trouble.


----------



## nrpeyton

Quote:


> Originally Posted by *Leijido*
> 
> This is off topic but any 1080 ti owners here have 2 gtx 1080 TIs in sli and i7-8700k? I really need to see 8700k sli supported gaming benchmarks at 4k resolution or above before I get one. Stock frequency benchmarks are fine as long as it's for actual games and not Timespy 3d Mark Firestrike etc. Thanks so much and sorry for the trouble


As far as I'm led to believe, SLI is a dying breed. So many games don't support it.

There are hacks to force the 2nd GPU to work in SLI -- but then you end up with both GPUs only being utilized about 50%.


----------



## Leijido

@nrpeyton: Agreed, but I'm always curious about 8700K and 1080 Ti SLI performance in games that have good SLI support with nice scaling. Those games are rare, but they can be found.


----------



## nrpeyton

True, but in an ideal world one would search for a new game by genre/story _(whatever they're in the mood to play)_, not by looking through a list to see what's SLI-supported.

The same is true for Nvidia 3D Vision. Instead of picking my game and enjoying it in 3D, I have to look through a list of 3D-supported games and find the one that offends me the least. lol

Anyway, sorry for going off track -- hopefully someone here will be able to provide you the benchmarks (8700K & 1080 Ti SLI).

Here's one:





CPU Total Utilization: 50% ish
GPU 1 & 2 Utilization: 70% - 80% ish


----------



## Leijido

Yeah I saw that one. I need to see 4k resolution sli


----------



## feznz

Quote:


> Originally Posted by *Leijido*
> 
> This is off topic but any 1080 ti owners here have 2 gtx 1080 TIs in sli and i7-8700k? I really need to see 8700k sli supported gaming benchmarks at 4k resolution or above before I get one. Stock frequency benchmarks are fine as long as it's for actual games and not Timespy 3d Mark Firestrike etc. Thanks so much and sorry for the trouble


https://ageeky.com/top-10-most-cpu-intensive-games/

Personal opinion: any current quad-core will do for now. I played ROTTR, and if it's considered one of the top 10 most CPU-intensive games, then my 3770K ate it for breakfast -- it was a single GTX 1080 Ti that was holding back the FPS.

I have been tossing up a new CPU for years but am still waiting for a game to actually make the CPU "bottleneck" the GPU... still waiting. I couldn't comment on SLI, but I am 99% sure it would not make a huge difference.
So the 8700K is a nice choice for future-proofing if I were building a new system. Yes, 16 PCIe lanes is a "weak" point, but in real-life scenarios it would hardly make any difference -- you just have to look at
https://www.pugetsystems.com/labs/articles/Titan-X-Performance-PCI-E-3-0-x8-vs-x16-851/

And another good read about core count and SLI -- an old write-up but still relevant:
http://www.technologyx.com/featured/amd-vs-intel-our-8-core-cpu-gaming-performance-showdown/5/

Really, I want the 7900X -- it would be the cheapest Intel that could run x16/x16. Then there's Threadripper, where the max clock of 4 GHz will be the killer in the deal.

Really comes down to budget.

Actually, I need to update the most CPU-intensive game to Assassin's Creed Origins: averaging 80% CPU usage on my 3770K (4 cores/8 threads) with 8 GB of system memory, lol -- still only maxing 6.6 GB usage, so a spare 1.4 GB of system memory.


----------



## skupples

should I sell my original 1070 and return the one I just bought, in exchange for a 1080 Ti? (their SLI scaling just keeps getting worse and worse) I'm trying to push a 3440x1440 Predator, but 2x 1070 just isn't cutting it (and G-Sync is so far just a stuttery strobe... which makes no sense)

i'm so out of touch that I don't even know when the next series is coming. I'm definitely going to return the second card either way; the only thing up in the air is re-investing in the 10 series.

pretty sure I'm also going to return the Predator for a cheaper 4K with better image quality as well. The other main issue with the monitor is that it jacks up streaming to the 1080p screen plugged into my 2017 Shield.

but really, I just want some wisdom on whether it's even worth moving up to a 1080 Ti. I've been having trouble finding any info on the next refresh/revolution


----------



## SavantStrike

Quote:


> Originally Posted by *skupples*
> 
> should I sell my original 1070 and return the one I just bought. in exchange for a 1080Ti? (their SLi scaling just keeps getting worse and worse) I'm trying to push a 3440x1440 predator, but 2x 1070 just isn't cutting it (and gsync is so far just a stuttery strobe.. which makes no sense)
> 
> i'm so out of touch that I don't even know when the next series is coming. I'm definitely going to return the second card, either way, the only thing up in the air is re-investing in 10 series.
> 
> pretty sure I'm also going to return the predator for a cheaper 4K, with better image quality as well. The second main issue with the monitor is it jacks up the streaming to the 1080p screen jacked into my 2017 shield.\
> 
> but really, i just want some wisdom on if its even worth moving up to a 1080ti? I've been having issues finding any info on the next refresh / revolution


Volta in early Q2 2018, though there are rumors it might be Ampere in Q3 2018 and no gaming Volta (but those rumors are new and highly speculative). Expect the 1080 Ti to be top dog until March or April, and then it will probably be competitive with the 1170. Not a bad route to go, as then you don't have to wait 6 months.


----------



## Gurkburk

Is it possible to get past the power limit & core voltage caps for greater overclocks on the Ti?

I'm reaching +95 core & +440 mem (so far); if I go above +95 on the core, I hit the power limit of 117%.


----------



## mouacyk

Go to page 1 to learn how to flash a custom BIOS, compatible with your GPU. The ASUS XOC BIOS will allow up to 1.2v and bypass ALL power limits without ANY hardmodding. You are advised to have active and sufficient cooling for your VRM.


----------



## Hl86

World of Warcraft finds unstable overclocks fastest with my 1080 ti.
You need to crank that resolution scale up though.


----------



## Dasboogieman

Quote:


> Originally Posted by *Hl86*
> 
> World of Warcraft finds unstable overclocks fastest with my 1080 ti.
> You need to crank that resolution scale up though.


FFXIV as well at 4K.


----------



## Gurkburk

Quote:


> Originally Posted by *mouacyk*
> 
> Go to page 1 to learn how to flash a custom BIOS, compatible with your GPU. The ASUS XOC BIOS will allow up to 1.2v and bypass ALL power limits without ANY hardmodding. You are advised to have active and sufficient cooling for your VRM.


What would you say is enough cooling for the VRM?

On my 1070 there's an Accelero Xtreme cooler (air cooling). Would you say that's enough on the Ti as well?

On the back it puts a pretty massive backplate.


----------



## SavantStrike

Quote:


> Originally Posted by *Gurkburk*
> 
> What would you say is enough cooling on the vrm?
> 
> On my 1070, there's a accelero xtreme cooler(air cooling). Would you say this is enough on the ti as well?
> 
> On the back you put a pretty massive backplate.


On air you will need to pay extra attention to the core temp of your GPU. Try to keep voltage reasonable, or if you start feeding it a lot of voltage, find a way to keep temps under 60 °C. 70 °C+ temps at 1090-1100 mV will most likely degrade your chip over time, which is why stock Pascal throttles to avoid it.

VRMs should be fine with any heatsink that has a decent amount of air blowing over it. The Accelero is competitive with AIB 1080 Tis with large 2.5-slot air coolers.


----------



## Gurkburk

Quote:


> Originally Posted by *SavantStrike*
> 
> On air you will need to pay extra attention to the core temp of your GPU. Try to keep voltage reasonable or if you start feeding a lot of voltage find a way to keep temps under 60C. 70C+ temps with 1090mV or 1100mV will most likely degrade your chip over time, as stock Pascal throttles to avoid this.
> 
> VRMs should be fine with any heatsink that has a decent amount of air blowing over it. Accellero is competitive with AIB 1080 TIs with large 2.5 slot air coolers.


To be fair, I expect at least a 10 °C drop with the Accelero compared to the MSI cooler, plus it's a lot quieter.

That was my experience with the Gigabyte G1 cooler on my 1070, at least.


----------



## d3v0

After following the excellent guide in the OP, I am still struggling to understand the concepts of overclocking this beast. In the olden times, you would max out voltage either by installing a custom BIOS, by raising the power limit, by vmodding the card, etc.

From what I understand, with my Gigabyte Aorus, if I set the voltage to 100% and the power limit to 150%, I'm getting 1.093v? I notice my clocks jumping from 2050 down to 1889 all the time. I know why -- it's Boost 3.0 and how it keeps the card stable. What I want is stability at 2050, and to do that I need a constant 1.093v. How do I achieve this?

I did try the custom voltage curve as I mentioned, but that seems more tuned to just keeping your card efficient. I need to prevent downclocking. How have you guys solved the problem?


----------



## becks

Quote:


> Originally Posted by *d3v0*


You can't do that... your GPU, like every other Pascal, will forever have its "economy" presets enabled in its BIOS, so you can't cancel that and make the card stay at a fixed voltage/clock 24/7.

It will constantly go up and down as needed.

EDIT: you can only set the max OC -- the top value it boosts to -- but the rest is up to Pascal and GPU Boost 3.0.


----------



## SavantStrike

Quote:


> Originally Posted by *d3v0*
> 
> After following the excellent guide in the OP, I am still struggling to gain understanding of the concepts of overclocking this beast. In the olden times, you would max out voltage either by installing a custom bios or by setting the power limit, vmodding the card etc.
> 
> From what I understand, With my Gigabyte Aorus, if I set the voltage to 100%, and the power limit to 150%, I am getting 1.093v? I notice my clocks jumping from 2050 down to 1889 all the time. I know why, its because of Boost 3.0 and how it keeps it stable. What I want it stability at 2050, and to do that, I need constant 1.093v. How do I achieve this?
> 
> I did try the custom voltage curve as I mentioned, but that seems almost more tuned to just keeping your card efficient. I need to prevent downclocking. How have you guys solved the problem?


It sounds like your card is thermal throttling to protect itself. Throttling happens in 5 °C increments and starts at around 30-35 °C. More voltage makes throttling worse, as it drives power consumption up steeply.

You should find your max clocks before adding voltage. If you're not happy with your clocks after dialing them in, then you can apply some voltage to try to get more core speed. You may end up needing extra voltage on the chip if it's running hot, as the chips are not as efficient or stable at higher temps.


----------



## d3v0

Quote:


> Originally Posted by *SavantStrike*
> 
> It sounds like your card is thermal throttling to protect itself. Throttling happens in 5C increments and starts at 30-35C. More voltage makes throttling worse as it drives power consumption up exponentially.
> 
> You should find your max clocks before adding voltage. If you're not happy with your clocks after dialing them in, then you can apply some voltage to try and get more core speed. You may end up putting extra voltage on the chip if it's running hot as they are not as efficient or stable at higher temps.


It runs at about 68c fully loaded. I am not sure how to "add voltage" anyway









But yeah, I am going to just be setting power limits to the max and adding to the core and memory, seems sort of bland - unless someone has better ideas


----------



## ZealotKi11er

Quote:


> Originally Posted by *d3v0*
> 
> It runs at about 68c fully loaded. I am not sure how to "add voltage" anyway
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But yeah, I am going to just be setting power limits to the max and adding to the core and memory, seems sort of bland - unless someone has better ideas


Pascal overclocking is a balance of voltage, power and temperature.

Voltage will not help you much unless you are chasing the highest clocks. Doing +100 gives the card the ability to hit higher voltage bins.
Power is the biggest problem with Pascal. The higher the voltage, the higher the power draw, which will cause the core clock to drop to stay within the power limit.
As for temperature, the lower the temp, the higher the card can clock. For roughly every 10C drop the card gains one ~13MHz clock bin. For example, 2000MHz at 68C would be about 2013MHz at 58C, 2025MHz at 48C and 2038MHz at 38C.

Now if you want 2050 at 1.093v and you know the card hits the power limit, all you can do is get a custom BIOS that turns off the limit. I personally do not bother; I think 1.093v is stupidly high for the gains you get.
The best way to do it is to test games and see the max voltage you can use before the card hits the limit, then with that voltage find the highest clock you can get. With those settings in mind, you do a curve OC as described in the OP.
My current OC for 24/7 is 1860MHz @ 0.875v, and for more demanding games 2025MHz @ 1v.
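The temperature/clock trade-off described above can be sketched in a few lines. This is a rough rule-of-thumb model of GPU Boost behavior as reported in this thread (one ~13MHz bin per 10C), not official NVIDIA behavior, and the numbers are illustrative:

```python
# Rough model of the Pascal GPU Boost thermal behavior described above:
# the card sheds roughly one ~13 MHz clock bin for every extra 10 C.
# Rule-of-thumb estimate from forum observation, not an NVIDIA spec.

BIN_MHZ = 13  # approximate size of one GPU Boost clock bin

def estimated_clock(base_mhz, base_temp_c, temp_c):
    """Estimate the sustained core clock at temp_c, given a reference point."""
    bins_lost = (temp_c - base_temp_c) / 10  # ~1 bin per 10 C
    return base_mhz - bins_lost * BIN_MHZ

# Example: a card that holds 2000 MHz at 68 C
for t in (68, 58, 48, 38):
    print(f"{t} C -> ~{estimated_clock(2000, 68, t):.0f} MHz")
```

Real bins are quantized (the curve moves in discrete ~12-13MHz steps), so treat the output as an estimate of which bin you will land in, not an exact clock.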


----------



## SavantStrike

Quote:


> Originally Posted by *d3v0*
> 
> It runs at about 68c fully loaded. I am not sure how to "add voltage" anyway
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But yeah, I am going to just be setting power limits to the max and adding to the core and memory, seems sort of bland - unless someone has better ideas


The voltage curve in afterburner will let you set the voltage higher if you unlock voltage monitoring and voltage control under advanced options.

The Aorus cards allow for 375W power draw, so you could probably set some fairly high voltages and still be under the power limit, but if you drive everything to maximum expect the card to hit 75-80C. 1080 Tis really need to be under water for ultra heavy overclocks.

If you're hitting 2050MHz on air you're doing pretty well.


----------



## d3v0

Thank you for all the responses, fellas. I spent a lot more time than expected swapping out my decade-old PC Power and Cooling 610W Silencer for my new modular PSU and installing my GTX 1080 Ti, so I had a lot less time to test. I will spend most of my time leaving the voltage curve alone and seeing what she will do on the core clock. Afterwards, I can try my hand at the curve, but so far it seemed to just make the card unstable and cause it to crash.


----------



## Gurkburk

Holy... Crap... This was a tight fit. I thought the 780 and the 1070 were tight fits, but this Ti, it's HUGE.


----------



## nrpeyton

Quote:


> Originally Posted by *d3v0*
> 
> Thank you for all the responses fellas. I spent alot more time swapping out my decade old PC Power and Cooling 610W silencer to my new modular PSU and installing my GTX 1080Ti than I expected and had alot less time to test. I will spend the greater amount of time not messing with voltage curve and seeing what she will do on the core clock. Afterwards, I can try my hand at the curve, but so far it seemed to just sort of make it unstable and cause it to crash.


You can isolate individual voltages in the curve by selecting a plot point above a specific voltage, then pressing the L key on the keyboard. This forces the card to lock onto that voltage, and the card will attempt to stay on only that voltage. It makes it easier to test specific voltages along the curve for stability, one at a time.

You know it's worked when a yellow line appears in the curve window highlighting the voltage you've selected.

Quote:


> Originally Posted by *Gurkburk*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Holy... Crap... This was a tight fit. I thought the 780 and the 1070 was a tight fit, but this Ti, its HUGE.


Is that a 1080 Ti? It's the most unusual one I've ever seen.

But what I do like is the fins on the backplate. With decent case airflow, that should help nicely with VRM and memory temps.

I'd love to stick it with some probes and test with/without that backplate for comparison.


----------



## criminal

Quote:


> Originally Posted by *Gurkburk*
> 
> Holy... Crap... This was a tight fit. I thought the 780 and the 1070 was a tight fit, but this Ti, its HUGE.


----------



## Gurkburk

I saw other people (on YouTube) mounting this cooler with only a silver plate holding it on the back, so no big backplate adding cooling on the hotspots (VRMs), which is weird.

I did my best at placing this backplate; I don't think it's originally meant to fit(?). There are these clips you use on the 1070 at least, which won't work at all.

But so far, it's cooling well.


----------



## Gurkburk

Where would you guys say temperature should become the limiting factor when increasing clocks?

With +100 core clock & +450 memory (stock MSI BIOS etc.) I'm incredibly close to the power limit. (Not sure if I'm maxed on VRM yet.)


----------



## 8051

Quote:


> Originally Posted by *Gurkburk*
> 
> I saw other people (on youtube) mounting this fan with only a silverplate holding it on the back, so no big backplate adding cooling on the hotspots(vrm), which is weird.
> 
> I did my best at placing this backplate, i don't think it's originally meant to work(?). There are these clips you use on the 1070 at least, which wont work at all.
> 
> But So far, its cooling well.


For air cooling you have to go big or go home. I installed the Arctic Accelero Xtreme II+ on my 6970; it was a knuckle-biter installing it (I've cracked the core on my 9800 Pro installing an aftermarket heatsink).

I have an Arctic Xtreme III that I bought for my 1080 Ti, but I can't install it, not without violating the terms of the warranty.









What kind of temps are you getting w/the Arctic Accelero Xtreme IV on your 1080Ti?


----------



## Gurkburk

Quote:


> Originally Posted by *8051*
> 
> For air cooling you have to go big or go home. I installed the Arctic Accelero Xtreme II+ on my 6970, it was a knuckle biter installing it (I've cracked the core on my 9800Pro installing an aftermarket heatsink).
> 
> I have an Arctic Xtreme III that I bought for my 1080Ti, but I can't install it, not without violating the terms of the warranty
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What kind of temps are you getting w/the Arctic Accelero Xtreme IV on your 1080Ti?


Running the Firestrike stress test atm.

At 100% fan speed (which I always bump up before stress testing), it's sitting at 60C. So that's 3-4C less than stock, PLUS it's SO much quieter.


----------



## 8051

Quote:


> Originally Posted by *Gurkburk*
> 
> Running Firestrike stress test atm.
> 
> On 100% fanspeed (which i always bump up before stresstesting etc), sitting on 60*C. So that's 3-4*C less than stock PLUS it's SOOO much quieter.


What kind of 1080 TI do you have? I would've figured the Accelero Xtreme IV would've made more of a difference than 3-4°C.


----------



## Gurkburk

Quote:


> Originally Posted by *8051*
> 
> What kind of 1080 TI do you have? I would've figured the Accelero Xtreme IV would've made more of a difference than 3-4°C.


The MSI card. I was pretty impressed by that cooler. The only issue with it is that it sounds like a Boeing flying by...

I might have very good airflow in my case too, not sure.

I'm having trouble understanding the first post on how to flash the card.

I've got the MSI 1080 Ti. Within the "Best flash files" folder, is the "S1080TIXOCnopowerlimit" compatible with my MSI card?


----------



## Stiltz85

Got number 2. My new PC is almost complete; just waiting on the new case and power supply!


----------



## SavantStrike

Quote:


> Originally Posted by *Gurkburk*
> 
> Where would guys say that the temperature should be the ending factor with increasing the clocks?
> 
> With +100CC & 450 Memory (stock msi bios etc) I'm incredibly close to the power limit. (Not sure if im maxed on VRM yet)


Above 30C, Pascal starts to drop speeds for every 5C hotter it gets. This is especially visible above 45C. If you dropped 10C you'd probably get another 26MHz at the same power limit.


----------



## Gurkburk

What driver do you guys use? The latest one seems to be stealing away ~3000 3DMark points...

Edit: I had G-Sync on; that's why the score was low.









But my score is just ~30,600.

My card runs 1.093v & isn't close to the power limit when crashing @ +110 core clock... This seems very early. 2025MHz core. I'm gonna flash once I'm feeling ready...


----------



## 8051

If someone were to flash the ASUS Strix XOC super VBIOS to their card and it were to fry, I'm guessing this would invalidate the warranty for a non-ASUS card? Would they be able to check if the card is fried?


----------



## nrpeyton

Quote:


> Originally Posted by *Hl86*
> 
> World of Warcraft finds unstable overclocks fastest with my 1080 ti.
> You need to crank that resolution scale up though.


Wow, that's an old title.

Definitely a classic.

I remember playing that over 6+ years ago _(maybe even longer)_ on my old AMD 'Radeon HD 6950' graphics card and 2.4Ghz Dual Core CPU, and it ran beautifully.

By comparison, the 1080Ti is now over 550% more powerful. Just shows how technology has advanced.


----------



## becks

Quote:


> Originally Posted by *nrpeyton*
> 
> Wow, that's an old title.
> 
> Definitely a classic.
> 
> I remember playing that over 6+ years ago _(maybe even longer)_ on my old AMD 'Radeon HD 6950' graphics card and 2.4Ghz Dual Core CPU, and it ran beautifully.
> 
> By comparison, the 1080Ti is now over 550% more powerful. Just shows how technology has advanced.


It's not old at all! They just announced a new expansion going out soon... (been playing it since pre-TBC)


----------



## Gurkburk

So my GPU score of 30,600 in 3DMark seems low...

My friend, who has a Gigabyte Ti, gets ~32,000.

Will a 1440p screen affect the score? Should I reinstall Windows with the new card?

I have DDU'd & am using the latest driver, & have tested an older driver too; only a ~100 score difference.


----------



## Stiltz85

Quote:


> Originally Posted by *Gurkburk*
> 
> So my score of 30600 gpu score in 3dmark, that seems low..
> 
> My friend which has a gigabyte ti, gets 32000~.
> 
> Will a 1440p screen affect the score? Should i reinstall windows with the new card?
> 
> I have DDU'd & using the latest driver & have tested an older driver too, only a 100~ score difference.


What benchmark and settings? Also, air or water cooled?
I pulled off a 30,570 graphics score on a standard Firestrike run with my Poseidon while air cooled; it's my first time overclocking this second card, so I have not tried a higher clock yet. Judging by what my card got, I would assume you're in the ballpark for a decently overclocked air-cooled score. Watercooled, I am sure it can go to 32,000 and higher.

As for your native screen resolution, it should not affect the score at all. I'm pretty sure 3DMark just tests at its predefined resolution and scales to your display. Though I could be wrong.

And unless you benchmark for a living, I really don't see why one would reinstall Windows just to get a higher score.


----------



## Gurkburk

Quote:


> Originally Posted by *Stiltz85*
> 
> What benchmark and settings? Also air or water cooled?
> I pulled off 30570 graphics score on a standard Firestrike run with my Poseidon while air cooled, first time overclocking this second card so I have not tried a higher clock yet. Judging on what my card got I would assume you're in the ballpark for a decently overclocked air cooled score. Watercooled I am sure it can go to 32000 and higher.
> 
> As for your native screen resolution, it should not effect it at all. I'm pretty sure 3dMark just tests at it's predeceased resolution and just scaling to your resolution. Though I could be wrong.
> 
> And unless you benchmark for a living I really don't see why one would reinstall windows just to get a higher score.


My friend has a Gigabyte card with the stock Gigabyte (Windforce?) cooler & has not really maxed out his clocks; his heat gets up to ~70C.

A low 3DMark score could indicate bad performance in general too...

Edit: as for settings, it's normal Firestrike with no changes.

My MSI card has an Accelero Xtreme cooler. It doesn't get hotter than 60C.


----------



## Gurkburk

Quick question regarding BIOS flashing.

I've got the MSI 1080 Ti Gaming X. I am able to flash "S1080TIXOCNoPowerLimit.rom" onto that, right? I'm not really understanding the first post about it. There are only two power connectors.


----------



## KedarWolf

Quote:


> Originally Posted by *Gurkburk*
> 
> Quick question regarding Bios flashing.
> 
> I've got the MSI 1080 ti Gaming X. I am able to flash "S1080TIXOCNoPowerLimit.rom" onto that right? I'm not really understanding the first post about it. There's only two power connections.


Yes, you can flash the XOC BIOS.


----------



## SavantStrike

Quote:


> Originally Posted by *8051*
> 
> If someone were to flash the ASUS Strix XOC super VBIOS to their card and it were to fry, I'm guessing this would invalidate the warranty for a non-ASUS card? Would they be able to check if the card is fried?


I'm not singling you out here, but you're the second person I've heard use the word "warranty" and XOC BIOS in the same sentence in two days.

Anyone using the XOC BIOS should treat it like a hard volt mod. If a card running XOC gets fried, it's not the manufacturer's fault the user put double or triple the designed current through the VRMs.

XOC = no warranty. If the enthusiast community doesn't behave itself here and people start returning XOC'd and abused cards, there may never be a "leaked" XOC BIOS for an NVIDIA platform again.

Never volt mod something you can't afford to lose. Overclocking was always this way: you fry it, you own it.


----------



## Gurkburk

So I flashed the XOC BIOS & I'm hitting the same OC wall as before. When I cross +100 core clock, the 3DMark stress test just shuts down. +95 is fine.

3DMark doesn't even get the voltage to rise.

What's going on here? Same thing happened on stock.

Edit: The Firestrike stress test launches, but runs for 2 seconds and then shuts down. The driver doesn't crash.

Edit 2: I've downloaded HWiNFO64 & checked the GPU power. Is it possible my PSU isn't enough? I don't remember what PSU I have, to be honest; I'll have to check that.


----------



## 8051

Quote:


> Originally Posted by *SavantStrike*
> 
> I'm not singling you out here as you're the second person I've heard use the word "warranty" and XOC BIOS in the same sentence in two days.
> 
> Anyone using the XOC BIOS should treat it like a hard volt mod. If a card running XOC gets fried, it's not the manufacturers fault the user put double or triple the designed current through the VRMs.
> 
> XOC = no warranty. If the enthusiast community doesn't behave itself here and people start returning XOC'd and abused cards, there may never be a "leaked" XOC BIOS for an nvidia platform again.
> 
> Never volt mod something you can't afford to lose. Over clocking was always this way. You fry it you own it.


Then tell me what the maximum voltage is for 1080Ti's and I won't run it at that voltage -- because running it on the stock VBIOS is nothing but an exercise in power throttling.


----------



## Gurkburk

Quote:


> Originally Posted by *8051*
> 
> Then tell me what the maximum voltage is for 1080Ti's and I won't run it at that voltage -- because running it on the stock VBIOS is nothing but an exercise in power throttling.


I'm curious as well. I'm also wondering how to get to the higher voltages. With +100CV and 120PL/Unlocked, my MSI card crashes when it gets to 1.093v.


----------



## nrpeyton

OMG I'm playing the new *Call of Duty WWII.*

It is fully *maxing out* my 1080 Ti's *11GB of VRAM* (in pretty much the vast majority of scenes).

This is the 1st time I'm seeing a game max out VRAM on a 1080 Ti.

Seeing *VRAM usage* as high as *10,800 MB*.

The game is also very well optimized. A smooth 70 FPS with everything maxed out at 4K. GPU _*only*_ at 1999MHz @ 1.0v. Power consumption around 250-300+ watts.


----------



## nrpeyton

Quote:


> Originally Posted by *Gurkburk*
> 
> I'm curious as well. I'm also wondering how to get to the higher voltages. With +100CV and 120PL/Unlocked, my MSI card crashes when it gets to 1.093v.


At what frequency is it crashing? Have you tried slightly lowering the frequency that is attached to 1.093v on your curve?
If it is only crashing mid-way through a bench then you're probably very close to getting it stable. Try lowering it by one 13MHz step. If it's still not completely stable, try lowering it by another 13MHz.

Also what are your 'at load' temps like?

Quote:


> Originally Posted by *Gurkburk*
> 
> Soo i flashed the XoC bios & I'm hitting the same OC wall as before. When i cross + 100 Core clock, 3dmark stress test just shuts down. 95 is fine.
> 
> 3Dmark doesnt even get the volt to raise.
> 
> What's going on here? Same thing happened in stock.
> 
> Edit: Firestrike stresstest launches, but runs for 2 seconds then shuts down. Driver doesnt crash.
> 
> Edit2: I've downloaded HWINFO64 & checked the GPU power. Is it possible my PSU isnt enough? I don't remember what PSU i have to be honest, I'll have to check that.


Sounds like you're maybe just 1 or 2 frequency bins too high. If it were your PSU, the entire PC would shut down/restart. _(I'm not saying rule it out completely, but I'd definitely look at lowering the frequency just a tad first.)_ Or try to get the card 10-20C cooler.


----------



## chibi

Hey guys, after applying an overclock with Afterburner, I noticed that whenever I start my computer, there is a quick flash once it loads into windows. Kind of like an indication that it's applying the overclocked profile. I turned off the overclock and restarted and no flash.

Is this normal?


----------



## 8051

Quote:


> Originally Posted by *nrpeyton*
> 
> OMG I'm playing the new *Call of Duty WWII.*
> 
> It is fully *maxing out* my 1080Ti *11GB VRAM (*by standard in pretty much the vast majority of scenes).
> 
> This is the 1st time I'm seeing a game max out VRAM on a 1080Ti.
> 
> Seeing *VRAM usage* as *high as 10,800 MB
> 
> *The game is also very well optimized. A smooth 70 FPS with everything maxed out at 4k. GPU _*only*_ at 1999mhz @ 1.0v. Power consumption around 250-300+ watts.


So are you using DSR to get to 4K? Because your system description says you have a 1440p monitor.


----------



## sviru

Guys. I'm about to buy 1080ti. Are there any cards I should avoid? What should I buy? Thanks. Midrange normal card.


----------



## nrpeyton

Quote:


> Originally Posted by *8051*
> 
> So are you using DSR to get to 4K? Because your system description says you have a 1440p monitor.


The 4K TV should be in the description too. A Sony Bravia X8509C 4K HDR-compatible (55 inch).

That said, I'm having *difficulty getting HDR working properly.* I mean it *IS* working; however, on the Windows desktop *text appears* like it's got a *red tint or slight red-tinted border* around some of it in areas. (I.e. it's patchy, as it doesn't affect every bit of text.)

Under EXTERNAL INPUTS on the TV I have the HDMI signal set to "Enhanced".

Under Windows in the NVIDIA Control Panel I have it set to:

YCbCr 4:2:2
12 bpc colour depth

For "dynamic range" in the HDMI options on the TV I have it set to "full". However, in the NVIDIA Control Panel I'm only able to select "limited".

DxDiag *does* confirm that HDR is active on the TV (and working). I just can't figure out the red-tinted borders around some text.

Anyone have any experience with this? _(I'm worried this is going to translate into washed-out looking images in games, though I haven't had a chance to test this yet.)_


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> OMG I'm playing the new *Call of Duty WWII.*
> 
> It is fully *maxing out* my 1080Ti *11GB VRAM (*by standard in pretty much the vast majority of scenes).
> 
> This is the 1st time I'm seeing a game max out VRAM on a 1080Ti.
> 
> Seeing *VRAM usage* as *high as 10,800 MB
> 
> *The game is also very well optimized. A smooth 70 FPS with everything maxed out at 4k. GPU _*only*_ at 1999mhz @ 1.0v. Power consumption around 250-300+ watts.


I know in BO3 the game caches shaders etc., so it uses most of your VRAM, but you don't need that much video RAM to actually play the game.

It maxes out the video memory simply because it caches objects in it.


----------



## KedarWolf

Yes, it's normal.









But to share a tech victory with you all: I got my 128GB kit of RAM running at 3200MHz, which is pretty much unheard of for high-density RAM.









I run games and benches from a RAM disk, you see.


----------



## feznz

^^^^^^









Quote:


> Originally Posted by *nrpeyton*
> 
> The 4K TV should be in the description too. A Sony Bravia X8509C 4k HDR compatible (55 inch).
> 
> Saying that -- I'm having *difficulty getting HDR working properly.* I mean it *IS* working -- however on the windows desktop *text appears* like it's got a *red tint or slight red tinted border* around some of it in areas --. (I.E. it's patchy as it doesn't affect every bit of all text).
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Under EXTERNAL INPUTS on TV I have the HDMI signal set to: "Enhanced".
> 
> Under windows in Nvidia Control Panel I have it set to:
> 
> YCBRC - 4-2-2
> 12 bpc colour depth
> 
> For "dynamic range" on HDMI options on TV I have it set to "full". However in the nvidia control panel I'm only able to select "limited".
> 
> DIXDIAG *does* confirm the HDR is active on the TV (and working). Just can't figure out the red tinted borders around some text.
> 
> Anyone have any experience at this? _(I'm worried this is going to translate into washed out looking images in-games -- though haven't had a chance to test this yet)._


I only had a similar problem with a faulty HDMI cable.

About to roll back to the 388.13 drivers. Assassin's Creed has started jumping GPU clocks from idle to max load clock; this only happened since the 388.31 drivers. Has anyone else tried the 388.31 driver? I liked 388.13; it seems everyone was getting nice bench scores with that driver too.


----------



## demon09

Quote:


> Originally Posted by *feznz*
> 
> ^^^^^^
> 
> 
> 
> 
> 
> 
> 
> 
> 
> only had similar problem with a faulty HDMI cable
> 
> About to roll back to 388.13 drivers Assassins creed has started jumping GPU clocks from idle to max load clock only happened since 388.31 drivers, anyone tried the 388.31 driver I liked the 388.13 seems everyone was getting nice bench scores with that driver too.


Eh, 388.13 was stuttery for me; 388 was OK. I am actually on 385.69 at the moment, as it works the best for the games I am currently playing. But when I get more time I will compare it more closely with 388.


----------



## feznz

Quote:


> Originally Posted by *demon09*
> 
> eh 388.13 was stuttery for me 388 was ok I am actually on 385.69 at the moment as it works the best for the games I am currently playing . But when I get more time I will compare it closer with 388


Interesting. I'll give 388 a whirl; I still got the stuttering with 388.13. The weird thing is that after 5-10 mins of gameplay it goes away.

Actually it has been a few months since I played a game; I've been too busy fishing/working. Off topic, but a little teaser: looking next year for a yellowfin tuna, but for now I will settle for this little guy, also known as a bluenose.


Spoiler: Warning: Spoiler!


----------



## Rakanoth

That fish looks ridiculous.


----------



## demon09

Quote:


> Originally Posted by *feznz*
> 
> interesting I give the 388 a whirl still got the stuttering with 388.13 weird thing is it after 5-10 mins of gameplay it goes away
> 
> Actually it has been a few months since I played a game been too busy fishing/working off topic but a little teaser looking next year for a yellow fin tuna but for now I will settle for this little guy also known as a bluenose.
> 
> 
> Spoiler: Warning: Spoiler!


388.13 was where the odd stutter seemed to be the worst, and where people on the NVIDIA forums really seemed to complain. 385.69 seems to be the best for me currently. But my time has been limited too, so I haven't really had time to test much on 388.0 and 388.31. Part of the issue for you may be Windows 10 and HDR support. 385.69 is one of those "if it ain't broke don't fix it" drivers for me, so unless I see a boost in my games on a new driver I don't really care to move, as lots of the new ones have messed with G-Sync etc.


----------



## feznz

Quote:


> Originally Posted by *Rakanoth*
> 
> That fish look ridiculous.


Actually, wearing a fish-head trophy helmet in a dress while getting your butt signed off in the bar is ridiculous, but on a bucks night anything can happen.


Spoiler: Warning: Spoiler!


It wasn't till the end of the night that the publican clicked that it wasn't a plastic fish; it just looked real. But I promise it was smelling real by the end of the night. Anyway, sorry, let's get back on topic.


----------



## jprovido

SLI time!


----------



## demon09

Quote:


> Originally Posted by *jprovido*
> 
> 
> 
> SLI time!


:0 I have always strayed from SLI because of spotty support / bad SLI optimization, but man does it tempt me some days, especially on games it works well with.


----------



## 8051

Quote:


> Originally Posted by *feznz*
> 
> interesting I give the 388 a whirl still got the stuttering with 388.13 weird thing is it after 5-10 mins of gameplay it goes away
> 
> Actually it has been a few months since I played a game been too busy fishing/working off topic but a little teaser looking next year for a yellow fin tuna but for now I will settle for this little guy also known as a bluenose.
> 
> 
> Spoiler: Warning: Spoiler!


Nice catch. How much does that fish weigh? It looks like fish will be on the menu for a while.


----------



## 8051

Quote:


> Originally Posted by *jprovido*
> 
> 
> 
> SLI time!


What kind of PSU do you need to support 1080Ti in SLI? 1000 Watts? 1200 Watts?


----------



## Stiltz85

Quote:


> Originally Posted by *8051*
> 
> What kind of PSU do you need to support 1080Ti in SLI? 1000 Watts? 1200 Watts?


1000W should cover it, not sure about high overclocking, I'll let someone more knowledgeable tackle that.


----------



## Stiltz85

Quote:


> Originally Posted by *jprovido*
> 
> 
> 
> SLI time!


I'll take your AIO hybrids and raise you custom loop hybrids. lol


----------



## feznz

Quote:


> Originally Posted by *demon09*
> 
> :0 I have allways strayed from SLI for the fact of spotty support / bad SLI optimization but man does it tempt me some days specaily on games it works well with.


With the grunt of a 1080 Ti, who cares about the hit-or-miss scalability of SLI; just drop down to a single card if need be. I would love an SLI setup just to know you've got a bit more power on tap when the game is right.
TBH a decent 850W PSU could do SLI, but it would be cutting it fine; personally I would go 1000W minimum to allow for some degrading and headroom for OCing.

Quote:


> Originally Posted by *8051*
> 
> Nice catch. How much does that fish weigh? It looks like fish will be on the menu for a while.


25+ kg, but they always get bigger as you tell the story a few more times; that's why you always get photo proof.


----------



## SavantStrike

Quote:


> Originally Posted by *8051*
> 
> Then tell me what the maximum voltage is for 1080Ti's and I won't run it at that voltage -- because running it on the stock VBIOS is nothing but an exercise in power throttling.


I'll help you, but this is a long post.

You demonstrated my point exactly: if you're unsure what these parameters should be, a vmod or XOC BIOS is premature. Don't just take the word of some stranger on the internet when messing with 700+ dollar hardware.

Overclocking is the act of changing the design parameters of working silicon to something other than what the original engineers specified, because the end user controls one or more variables the original engineers couldn't. These are usually temperature and voltage, along with hand binning.

This is an old formula for OC power draw. It assumes 100 percent load. Things get harder when you factor in dynamic clock speeds.

W = Stock TDP * ( OC MHz / Stock MHz ) * ( OC Vcore / Stock Vcore )^2

The voltage increase is squared, so small changes in voltage have big impacts on power consumption. Pascal degrades with high voltage and temperature, so the stock BIOS has a power limit in place to force voltage (and therefore temperature) to stay reasonable.

Beyond any free performance from overclocking at stock voltage (which can be a lot, with GPU Boost already giving most users as much as 10 percent), you can usually take 5-10 percent more frequency at stock voltage as long as you keep the chip cool. The number is different by architecture, but they all throttle with higher temp. With Pascal it's every 5C over 30C.

Pascal has a very good voltage/temp/frequency curve with GPU Boost. The stock max of 1.093v is much higher than is attainable without pushing the card into thermal and power limits, so it's a good safe max voltage under water. If the card didn't have such a well-defined max, a good rule of thumb is 10 percent over stock, or higher if you have guidance from the manufacturer or multiple sources.
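The power-draw formula above is easy to play with in code. A minimal sketch; the 1911MHz/1.000v baseline and the 2050MHz/1.093v target are illustrative numbers, not measurements from any particular card:

```python
# The overclocked power-draw estimate from the post, in code form:
# W = Stock TDP * (OC MHz / Stock MHz) * (OC Vcore / Stock Vcore)^2
# Assumes 100% load and static clocks; a rule of thumb, not a measurement.

def oc_power_draw(stock_tdp_w, stock_mhz, oc_mhz, stock_vcore, oc_vcore):
    """Estimate power draw after an overclock; voltage enters squared."""
    return stock_tdp_w * (oc_mhz / stock_mhz) * (oc_vcore / stock_vcore) ** 2

# Example: a 250 W card pushed from 1911 MHz @ 1.000 V to 2050 MHz @ 1.093 V.
# The voltage term alone (1.093^2 ~ 1.19) adds roughly 19%.
watts = oc_power_draw(250, 1911, 2050, 1.000, 1.093)
print(f"Estimated draw: {watts:.0f} W")
```

The example lands around 320W, which is why a stock 250W power limit throttles long before 1.093v is sustainable.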
Quote:


> Originally Posted by *Gurkburk*
> 
> I'm curious as well. I'm also wondering how to get to the higher voltages. With +100CV and 120PL/Unlocked, my MSI card crashes when it gets to 1.093v.


Are you setting a voltage offset? If so, you might try backing it down a little. 120 percent power will throttle at 1.093v, probably to the point where it's slower than, say, 1.063v. The MSI cards have a 330W limit at 120 percent. 330W is about 2050-2113MHz at 1.060v from what I've seen on several Tis. That's under water, so on air it will be a bit lower.


----------



## Gdourado

It arrived!
I am now proudly part of the club.
I must say I am impressed with the card.
It is huge, but at the same time feels like a great quality piece!
Now to see what it can do on the system.

Cheers!


----------



## Gurkburk

Quote:


> Originally Posted by *nrpeyton*
> 
> At what frequency is it crashing at? Have you tried lowering the frequency that is attached to 1,093v on your curve slightly?
> If it is only crashing mid-way through a bench then you're probably very close to getting it stable.. try lowering it by one 13mhz step. If it's still not completely stable then try lowering it again by another 13mhz.
> 
> Also what are your 'at load' temps like?
> 
> Sounds like your maybe just 1 or 2 frequency bins too high. If it was your PSU the entire PC would shut down/restart. _(I'm not saying rule it out completely - but I'd definitely look at lowering the frequency just a little tad first)._ Or trying to get the card 10-20c cooler.


When I'm trying to get to 2050MHz, it wants to go to 1.093v & that's where it crashes the Firestrike stress test.

At all times it's running around ~60C with the Accelero Xtreme air cooler.

I'm far from the power limit as well, which is annoying...


----------



## jprovido

Quote:


> Originally Posted by *Stiltz85*
> 
> 1000W should cover it, not sure about high overclocking, I'll let someone more knowledgeable tackle that.


It doesn't consume that much power. My GTX 480 SLI (overclocked CPU and GPU) ran with a Corsair TX850 back in the day. People overestimate the power supplies they need. Now I'm using a Corsair AX760i; it runs comfortably with everything overclocked.


----------



## demon09

Quote:


> Originally Posted by *jprovido*
> 
> It doesn't consume that much power. my GTX 480 SLI (overclocked cpu and gpu) ran with a Corsair TX850 back in the day. people overestimate the power supplies they need. now I'm using a Corsair AX760i. runs comfortably with everything overclocked


Truth. I've seen my 7700K and 1080 Ti, both overclocked, spike to around 5xx W, but it mostly fluctuates in the 3xx-4xx W range when gaming. Yet people will insist you must have a 750 W PSU. A better PSU is nice to have, but people sometimes end up upgrading a PSU thinking that was their issue. I currently run an 850 W unit, but only because it cost barely more than the 650 W and had a longer warranty. NVIDIA does recommend 600 W, which isn't crazy, since you want some headroom as a PSU ages etc.


----------



## TheWizardMan

Quote:


> Originally Posted by *demon09*
> 
> Truth. I've seen my 7700K and 1080 Ti, both overclocked, spike to around 5xx W, but it mostly fluctuates in the 3xx-4xx W range when gaming. Yet people will insist you must have a 750 W PSU. A better PSU is nice to have, but people sometimes end up upgrading a PSU thinking that was their issue. I currently run an 850 W unit, but only because it cost barely more than the 650 W and had a longer warranty


OK, but the other side of this is that you don't want to underestimate either. I have a 6700K, two 1080 Tis in SLI, a full water-cooling loop, 11 fans and a bunch of lights. I thought I'd be fine with a 1000 W PSU, but I was not.


----------



## demon09

Quote:


> Originally Posted by *TheWizardMan*
> 
> OK, but the other side of this is that you don't want to underestimate either. I have a 6700K, two 1080 Tis in SLI, a full water-cooling loop, 11 fans and a bunch of lights. I thought I'd be fine with a 1000 W PSU, but I was not.


Did you use a Kill A Watt to measure? That's surprising it was over 1000 watts. I have 9 fans total in my build, 6 of them RGB, plus an SSD and an HDD. I don't have a pump or a bunch of LED strips (I do have one), but I guess your extras must add a lot more than I imagined if they're hitting 1000 watts.


----------



## jprovido

Quote:


> Originally Posted by *TheWizardMan*
> 
> OK, but the other side of this is that you don't want to underestimate either. I have a 6700K, two 1080 Tis in SLI, a full water-cooling loop, 11 fans and a bunch of lights. I thought I'd be fine with a 1000 W PSU, but I was not.


I highly doubt it consumed more than 1000 W. I've owned multi-GPU setups back in the day that consumed way more power than modern, power-efficient hardware. It must have been a faulty PSU. I had one of those Zalman kill-a-watt front-bay meter add-ons that were "in" back in the day, and my system barely consumed 1000 W from the wall, and that's not even accounting for PSU efficiency.

I'm running a [email protected] and 1080 Ti SLI, both overclocked, on a Corsair AX760i and it's not even sweating lol


----------



## aberrero

Quote:


> Originally Posted by *TheWizardMan*
> 
> OK, but the other side of this is that you don't want to underestimate either. I have a 6700K, two 1080 Tis in SLI, a full water-cooling loop, 11 fans and a bunch of lights. I thought I'd be fine with a 1000 W PSU, but I was not.


That shouldn't use a 1000W. Maybe if you are massively overvolting, but even then. The CPU shouldn't use more than 200W, the cards can use 300W each. Your fans, mobo, etc, shouldn't be using more than 50W. You should have at least 150W to spare. And any decent power supply should work at 10% above its rated load without causing instability.
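For anyone who wants to redo the arithmetic above with their own parts, a minimal sketch; the wattage figures are the rough estimates from the post, not measured values:

```python
# Rough SLI power-budget sketch using the (illustrative) component
# estimates from the post above; real draw varies with load and overclock.
BUDGET_W = {
    "cpu": 200,          # generous ceiling for an overclocked quad-core
    "gpu": 300 * 2,      # two 1080 Tis near their raised power limits
    "fans_mobo_misc": 50,
}

def headroom(psu_rated_w: int) -> int:
    """Watts left over on a PSU of the given rating."""
    return psu_rated_w - sum(BUDGET_W.values())

print(headroom(1000))  # 150 W to spare on a 1000 W unit
```

Swapping in your own estimates makes it easy to see whether a given PSU rating leaves sensible headroom.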


----------



## ZealotKi11er

Quote:


> Originally Posted by *jprovido*
> 
> I highly doubt it consumed more than 1000 W. I've owned multi-GPU setups back in the day that consumed way more power than modern, power-efficient hardware. It must have been a faulty PSU. I had one of those Zalman kill-a-watt front-bay meter add-ons that were "in" back in the day, and my system barely consumed 1000 W from the wall, and that's not even accounting for PSU efficiency.
> 
> I'm running a [email protected] and 1080 Ti SLI, both overclocked, on a Corsair AX760i and it's not even sweating lol


Well, a GTX 1080 Ti with a 20% power limit is 300 W. That's 600 W for two GPUs. You are cutting it close with 760 W.


----------



## NBrock

I had an EVGA SuperNOVA G2 750 W, and when I got my second card and ran SLI with my 4790K @ 4.7 it would shut off to protect the PSU. I ended up getting a 1300 W G2 to add some wiggle room for future upgrades.


----------



## BrainSplatter

800 W BeQuiet with SLI, no problems in 10 months:
[email protected]@1.35v, 2x 1080 Ti SLI (essentially no overvolt), 4x8GB RAM @ 1.4v, 4x SSD in RAID 0, 1x SSD, 1x NVMe SSD, 8 fans, 2 LED strips

But it seems some PSU protections kick in much earlier than others. I think the Seasonic Platinum 860 switches off earlier than the BeQuiet Straight Power 800 W.


----------



## TheWizardMan

Quote:


> Originally Posted by *aberrero*
> 
> That shouldn't use a 1000W. Maybe if you are massively overvolting, but even then. The CPU shouldn't use more than 200W, the cards can use 300W each. Your fans, mobo, etc, shouldn't be using more than 50W. You should have at least 150W to spare. And any decent power supply should work at 10% above its rated load without causing instability.


I was drawing over 1000 W; even the stupid calculators say it's over 1000 W. I literally added a second monitor that had an extendable USB on it and my system became unstable.

Edit: Also, both my 1080 Tis are overclocked to 2062 MHz and my 6700K is at 4.8 GHz at 1.43 V.


----------



## MightEMatt

Quote:


> Originally Posted by *aberrero*
> 
> That shouldn't use a 1000W. Maybe if you are massively overvolting, but even then. The CPU shouldn't use more than 200W, the cards can use 300W each. Your fans, mobo, etc, shouldn't be using more than 50W. You should have at least 150W to spare. And any decent power supply should work at 10% above its rated load without causing instability.


Under FurMark load I can pull 1450 W from the wall; you underestimate how much power 1080 Tis can use.


----------



## KCDC

With the build in my sig (4.5 GHz CPU OC and +150 core / +600 mem on 2x 1080 Ti SLI), I was seeing spikes around 960 W in Firestrike Ultra, and it would stay around 920 W. This was using the Corsair Link software reading directly from the PSU.


----------



## nrpeyton

Quote:


> Originally Posted by *feznz*
> 
> 
> I only had a similar problem with a faulty HDMI cable
> 
> I'm about to roll back to the 388.13 drivers. Assassin's Creed has started jumping GPU clocks from idle to the max load clock; it only started happening with the 388.31 drivers. Has anyone tried 388.31? I liked 388.13, and it seems everyone was getting nice bench scores with that driver too.


Today the text doesn't have a red tint, but it is still very difficult to read. It's patchy everywhere. Even text on this website is affected, including the text I'm writing now.

I tried switching off YCbCr 4:2:2 in the NVIDIA control panel and going back to normal RGB (and also switching off HDR in Windows display properties), but oddly it hasn't fixed the text this time. I'm going to try restarting the PC.


----------



## 8051

Quote:


> Originally Posted by *MightEMatt*
> 
> Under furmark load I can pull 1450w from the wall, you underestimate how much power 1080Tis can use.


Holy frijole! Did you have to put in a special circuit breaker or a dedicated circuit for your computer?


----------



## MightEMatt

Quote:


> Originally Posted by *8051*
> 
> Holy frijole! Did you have to put in a special circuit breaker or a dedicated circuit for your computer?


A single standard American circuit (120 V / 15 A) can do 1800 W total. I have my server on the same circuit though, so I was pushing it a little. My UPS was screaming under that load, overload alarms and all (it's only a 1500 VA UPS). Anyway, I'm just trying to make the point that 1080 Tis have pretty beefy power delivery and under the right load can pull a pile of power.
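The circuit and UPS arithmetic above can be sanity-checked in a couple of lines. The 0.9 power factor below is an assumption for illustration; check your own UPS's rated output watts:

```python
# North American branch circuit: volts x amps gives total capacity in watts.
def circuit_capacity_w(volts: float = 120.0, amps: float = 15.0) -> float:
    return volts * amps

# A UPS can deliver at most its VA rating times its power factor.
def ups_overloaded(load_w: float, ups_va: float,
                   power_factor: float = 1.0) -> bool:
    """True if the load exceeds what the UPS can actually deliver."""
    return load_w > ups_va * power_factor

print(circuit_capacity_w())                                 # 1800.0
print(ups_overloaded(1450, ups_va=1500, power_factor=0.9))  # True
```

So a 1450 W draw fits on the circuit but tips a 1500 VA UPS past its real output, which is consistent with the overload alarms described above.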


----------



## 8051

Quote:


> Originally Posted by *MightEMatt*
> 
> A single standard american circuit (120V/15A) can do 1800W total. I have my server on the same circuit though so I was pushing it a little bit. My UPS was screaming under that load though, overload alarms and all that (only a 1500VA UPS). Anyways, just trying to make the point that 1080Tis have pretty beefy power delivery and under the right load can pull a pile of power.


I was considering an SLI 1080 Ti setup later on, once prices for 1080 Tis drop, but not after reading the insane power draw of your setup. I have a space heater that draws as much power as your computer! I had to turn off the audible alarm on my Kensington MasterPiece Plus Power Center because it goes off constantly when gaming during the summer (it indicates when more than 12 A AC is being drawn). I cannot imagine pumping 1000 watts of heat into my bedroom during the summer.


----------



## MightEMatt

Quote:


> Originally Posted by *8051*
> 
> I was considering an SLI 1080Ti setup later on when prices for 1080Ti's would drop, but not after reading the insane power draw of your setup. I have a space heater that draws as much power as your computer! I had to turn off the audible alarm on my Kensington MasterPiece Plus Power Center because it tends to go off constantly when gaming during the summer (which indicates more than 12A AC is being drawn). I cannot imagine pumping out 1000 Watts of heat into my bedroom during the summer.


Well, you'd get power limited on the standard BIOS before you hit these draws. I'm power-unlocked, but I'm only at 2 GHz @ 1.1 V, so a pretty low overclock in the grand scheme of things. All stock, I can still exceed 1 kW in Time Spy graphics test 2.

But yeah, I get a lot of heat out of it. It's not a big deal in Canadian winters, but it's a toasty experience in the summer.

Edit: Also, these draws are under synthetic load. I absolutely do not recommend that anybody who cares about their hardware try to recreate them. As soon as I enable PostFX in FurMark my power consumption jumps over 500 W and my GPU temperatures spike by nearly 15 degrees, all nearly instantaneously. If you tried to run this for an extended time you'd absolutely risk blowing something up unless you ran some mad cooling system.


----------



## Gdourado

Hello,
I installed my Gainward yesterday and started playing with it.
Out of the box at stock, the card boosts to 1875 and never goes below 1850.
Playing with Afterburner at stock voltage, I managed to get the card to a locked 2000 MHz core.
On the stock fan curve it's extremely quiet and stays between 68 and 70 degrees.
Really impressed with the cooling solution here.


Now, to tune the voltage, I don't know why, but I need to use Gainward's own utility, as in Afterburner the voltage slider is grayed out no matter the options.

Is it worth trying to tune the voltages?
How much more can I gain?
Or is it good to stay at 2000 with these temps?
Also, is it worth overclocking the memory on a 1080 Ti?

Thanks.
Cheers


----------



## 8051

I've only ever seen a performance difference with GPU memory speed in one game benchmark: Resident Evil 5.


----------



## feznz

Quote:


> Originally Posted by *KCDC*
> 
> With the build in my sig 4.5 OC cpu and +150 core +600 mem on sli 1080ti x2, I was getting spikes around 960 watts when in Firestrike Ultra, and would stay around 920W. This was using corsair link software direct read from PSU.


I've got the Corsair RM1000i and saw around 600 W peak with a single 1080 Ti in Superposition, so your 960 W peaks are exactly what I would expect from a highly OC'd SLI system.

That said, I'm finding the Link software extremely buggy, even with the latest 4.9.2.27 release, and in the forums everyone is reporting very similar problems.


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> Today the text doesn't have a red tint, but it is still very difficult to read. It's patchy everywhere. Even text on this website is affected, including the text I'm writing now.
> 
> I tried switching off YCbCr 4:2:2 in the NVIDIA control panel and going back to normal RGB (and also switching off HDR in Windows display properties), but oddly it hasn't fixed the text this time. I'm going to try restarting the PC.


Not sure how to take that; I didn't mean to sound like the ISP technical adviser asking "have you tried restarting your modem". I really have no idea what could be causing the problem.
Maybe a screenshot might clarify what you mean by wrong-coloured text.


----------



## jprovido

Quote:


> Originally Posted by *8051*
> 
> I've only ever seen a performance difference with GPU memory speed in one game benchmark: Resident Evil 5.


Run the Heaven benchmark, go to Free Walk mode, stare at the dragon, and watch the FPS counter: you'll see FPS gains as you move the memory-speed slider. Basically every game gains FPS.


----------



## 8051

Quote:


> Originally Posted by *jprovido*
> 
> Run Heaven Benchmark. go to Free Walk mode. just stare into the Dragon and look at the fps counter. you'll see fps gains when you move that Memory speed slider. basically every game there's gains in fps


But does that translate into a higher benchmark score?


----------



## nrpeyton

Quote:


> Originally Posted by *feznz*
> 
> Not sure how to take that; I didn't mean to sound like the ISP technical adviser asking "have you tried restarting your modem". I really have no idea what could be causing the problem.
> Maybe a screenshot might clarify what you mean by wrong-coloured text.


*Text difficult to read right across windows - display problem?*

I have taken pictures of my screen using my phone.

The text colour is fine.. but it looks patchy.

Not sure what could be causing this. It is very difficult to read. I suspect a display problem of some sorts.

FULL SIZE - right click pic & select 'open in new tab'




Has anyone ever experienced this problem? I am pulling my hair out. _Could it be a faulty HDMI cable?_


----------



## Gdourado

Quote:


> Originally Posted by *jprovido*
> 
> Run Heaven Benchmark. go to Free Walk mode. just stare into the Dragon and look at the fps counter. you'll see fps gains when you move that Memory speed slider. basically every game there's gains in fps


I don't know about the Ti, but when I had the 1080, I would overclock the memory and gain performance in benchmarks up to a certain point; past that, even though the card was rock-solid stable, the scores would start to come down.

That's why I ask whether on the Ti it's worth overclocking the memory.

Cheers


----------



## KCDC

Quote:


> Originally Posted by *feznz*
> 
> I got the corsair link RM1000i I got around 600w with a single1080ti peak with Superposition so your 960w peaks is exactly what I would expect from a highly OC SLI system,
> 
> though I am finding the link software extremely bugged even with the latest 4..9.2.27 software but in the forums everyone is having very similar problems.


I should note I stopped using the Link software and dongle. The dongle would cause random restarts/shutdowns of my machine. Not sure if that's the dongle's fault or the mobo USB headers, but it had to go.


----------



## Rakanoth

Quote:


> Originally Posted by *nrpeyton*
> 
> *Text difficult to read right across windows - display problem?*
> 
> I have taken pictures of my screen using my phone.
> 
> The text colour is fine.. but it looks patchy.
> 
> Not sure what could be causing this. It is very difficult to read. I suspect a display problem of some sorts.
> 
> FULL SIZE - right click pic & select 'open in new tab'
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Has anyone ever experienced this problem? I am pulling my hair out. _Could it be a faulty HDMI cable?_


Your pictures aren't clear. Try playing with Windows' display and colour settings, and try different colour profiles. Run some games and see if your GPU is producing artifacts.
If you can find another monitor and GPU, try with them. Try another cable too. And please report back here.
Quote:


> Originally Posted by *MightEMatt*
> 
> Under furmark load I can pull 1450w from the wall, you underestimate how much power 1080Tis can use.


Prove it.


----------



## MightEMatt

Quote:


> Originally Posted by *Rakanoth*
> 
> Prove it.


I don't have a 1450 W screenshot, but this is a screen from an earlier run. These cards can pull a thousand watts each under LN2; I don't know why you're skeptical.


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> *Text difficult to read right across windows - display problem?*
> 
> I have taken pictures of my screen using my phone.
> 
> The text colour is fine.. but it looks patchy.
> 
> Not sure what could be causing this. It is very difficult to read. I suspect a display problem of some sorts.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> FULL SIZE - right click pic & select 'open in new tab'
> 
> 
> 
> 
> 
> 
> Has anyone ever experienced this problem? I am pulling my hair out. _Could it be a faulty HDMI cable?_


It's a long shot for HDMI, but even a magnetic field over the HDMI cable can cause display output problems.
Looking at those pics, corrupt display drivers or a corrupt OS would be my bet given the benching you do, or possibly even a faulty GPU.

Quote:


> Originally Posted by *KCDC*
> 
> I should note I stopped using the link software and dongle. The dongle would cause random restarts/shutdowns of my machine. Not sure if that's the dongles fault or the mobo USB headers, but it had to go.


Interesting......I will keep this in mind
Quote:


> Originally Posted by *MightEMatt*
> 
> I don't have a 1450W screenshot but this is a screen from an earlier run. These cards can pull a thousand watts each under LN2 I don't know why you're skeptical.


That could be a peak load but not a sustained peak load, i.e. it could be a 1/50 s transient caused by unplugging something on the UPS.

On my own system I have been playing Assassin's Creed Origins with stuttering problems. I know it is a driver issue, as 385.69 is 75% better than 388.13/388.31.


----------



## demon09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Well, a GTX 1080 Ti with a 20% power limit is 300 W. That's 600 W for two GPUs. You are cutting it close with 760 W.


I agree 760 W is cutting it close if he has everything overclocked for 1080 Ti SLI. I'd probably recommend an 850, especially since PSUs degrade over time.


----------



## MightEMatt

Quote:


> Originally Posted by *feznz*
> 
> that could be a peak load but not sustaining peak load i.e it could be a 1/50sec load caused by unplugging something on the UPS


1450 W is peak, but sustained was over 1300 W. I'd get the multimeter out, but then you'd accuse me of metering a different circuit, so it's cool. My point is, 1000 W is easily attainable if you're running them on an HEDT platform, and even a reference card has enough power delivery to draw over 500 W.


----------



## feznz

Quote:


> Originally Posted by *MightEMatt*
> 
> I don't have a 1450W screenshot but this is a screen from an earlier run. These cards can pull a thousand watts each under LN2 I don't know why you're skeptical.


Quote:


> Originally Posted by *MightEMatt*
> 
> 1450W is peak, but sustained was over 1300W. I'd get the multimeter out but then you'll accuse me of metering a different circuit so it's cool. My point is, 1000W is easily attainable if you're running them on a HEDT platform, and even a reference card has enough power delivery to draw over 500W.


I believe that is possible, BUT under some exotic cooling: LN², DICE, a Peltier, or AC plumbed into the loop.

Also, you are measuring from the UPS; with PSU efficiency at around 89%, 1400 W x 0.89 = 1246 W, minus anything else plugged into your UPS, so sustained draw would be closer to 1157 W.

I have a 1500 W UPS modified to take 3 car-size batteries to allow around 1 hour of runtime with everything plugged in (monitors, printer, modem, etc., for surge protection); the UPS is just not a good place to measure system draw.
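The wall-vs-DC conversion above as a one-line helper; the 89% efficiency is the estimate from the post, and a real unit's efficiency varies with load:

```python
# Convert a wall/UPS power reading to approximate DC-side draw.
# 0.89 efficiency is an illustrative figure, not a measured one.
def dc_load_w(wall_w: float, efficiency: float = 0.89) -> float:
    """Approximate power delivered to components from a wall reading."""
    return wall_w * efficiency

print(round(dc_load_w(1400)))  # 1246
```

Subtracting whatever else shares the UPS then gives the figure the components are actually pulling.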


----------



## MightEMatt

Quote:


> Originally Posted by *feznz*
> 
> I believe that is possible BUT under some exotic cooling LN², DICE, peltier, or AC plumbed into the loop.
> 
> also you are measuring from the UPS and with the PSU efficiency you would be around 89% so 1400w x 89% = 1246w - anything else plugged into your UPS so sustained draw would be closer to 1157w
> 
> I have a 1500w UPS modified to take 3 car size batteries to allow for around 1 hour runtime with everything plugged in monitors printer modem etc for surge protection just the UPS is not a good place to measure system draw.


You're also not accounting for VRM and MOSFET efficiency, resistive cable losses, chipset power, hard-drive power, USB devices, and so on. An HEDT system with SLI 1080 Tis is capable of pulling 1450 W from the wall in FurMark, as I stated in my first post.


----------



## nrpeyton

Quote:


> Originally Posted by *MightEMatt*
> 
> 1450W is peak, but sustained was over 1300W. I'd get the multimeter out but then you'll accuse me of metering a different circuit so it's cool. My point is, 1000W is easily attainable if you're running them on a HEDT platform, and even a reference card has enough power delivery to draw over 500W.


How much of that 1300 watts is actually the GPUs alone? HWiNFO64 gives real-time power draw in watts for all NVIDIA GPUs.

My Sony 4K TV is fully HDR compatible. I have an NVIDIA GeForce GTX 1080 Ti, which is also fully HDR compatible.

Normally when I output my display to my 4K HDR TV (single display, 3840x2160 at 60 Hz) it's a perfect picture: nice and sharp with vivid colour. However, today when I changed the TV's picture setting to HDR the image was very poor. It was nowhere near as sharp and the colours looked washed out.

I went into the Windows 10 display options and noticed an option for 'HDR and Advanced Colour' which was set to 'On'. I don't remember having seen this option before. Is this a new option included in the Creators Update?

It appears this HDR option does not work as it should. When set to 'On', the TV automatically changes its picture mode to HDR, as it should, but as I said, the picture quality goes bad. When I turn the option off, the perfect picture comes back.
(So in effect I'm _*running the PC in HDR mode,*_ _yet the TV is NOT set to HDR mode_; instead the picture mode on the TV is set to 'game'.)

Games are locked out of HDR when this option in Windows display properties is turned off, so I don't really know what's going on.

If anyone can shed some light on this I'd be very grateful.

Many thanks in advance.


----------



## Dasboogieman

Quote:


> Originally Posted by *nrpeyton*
> 
> How much of that 1300 watts is actually the GPU alone? HWINFO64 gives power draw in watts in real time for all nvidia GPU's.
> 
> My Sony 4k TV is 100% HDR compatible. I have a Nvidia GeForce GTX1080ti which is also 100% HDR compatible.
> 
> Normally when I output my display to my 4K HDR TV (single display 3840x2160p 60Hz) it's a perfect picture. Nice and sharp with vivid colour. However today when I changed the TV picture setting to HDR (on TV) the image was very poor. It was nowhere near as sharp and the colours look washed out.
> 
> I went into the Windows 10 display options and noticed that there is option for 'HDR and Advanced Colour' which was set to 'On'. I don't remember having seen this option before. Is this a new option which has been included in the new Creators update.
> 
> It appears as though this HDR option does not work as it should. When set to 'on' the TV automatically changes the picture mode to HDR as it should -- but as I said, the picture quality goes bad. When I turn this option off the perfect picture comes back to what it should be.
> (So in effect I'm _*running the PC in HDR mode,*_ _yet the TV is NOT set to HDR mode_; instead the picture mode on the TV is set to 'game'.)
> 
> Games are locked out of HDR when this option in Windows Display Property is turned off so I don't really know what's going on.
> 
> If anyone can shed some light on this I'd be very grateful.
> 
> Many thanks in advance.


Try and re-calibrate Windows ClearType.


----------



## KedarWolf

Quote:


> Originally Posted by *feznz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MightEMatt*
> 
> I don't have a 1450W screenshot but this is a screen from an earlier run. These cards can pull a thousand watts each under LN2 I don't know why you're skeptical.
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *MightEMatt*
> 
> 1450W is peak, but sustained was over 1300W. I'd get the multimeter out but then you'll accuse me of metering a different circuit so it's cool. My point is, 1000W is easily attainable if you're running them on a HEDT platform, and even a reference card has enough power delivery to draw over 500W.
> 
> 
> I believe that is possible BUT under some exotic cooling LN², DICE, peltier, or AC plumbed into the loop.
> 
> also you are measuring from the UPS and with the PSU efficiency you would be around 89% so 1400w x 89% = 1246w - anything else plugged into your UPS so sustained draw would be closer to 1157w
> 
> I have a 1500w UPS modified to take 3 car size batteries to allow for around 1 hour runtime with everything plugged in monitors printer modem etc for surge protection just the UPS is not a good place to measure system draw.

I've seen a single 1080 Ti push 500 W with the XOC BIOS at 1.2 V, so I don't think it's a stretch for an SLI system to pull 1350 W.


----------



## chibi

Gosh, I'm having terrible luck with GPUs. I bought an EVGA 1080 Ti Gaming yesterday, brought it home for the new 8700K build, and it had coil whine. Had a rough time with the vendor, but managed to get it replaced with a Strix 1080 Ti today. Brought it home and it also coil whines, like a squealing seal.









I'm not sure I can get another replacement, as Memory Express has a final-sale, no-return policy on higher-end GPUs...


----------



## demon09

Quote:


> Originally Posted by *demon09*
> 
> I agree 760 W is cutting it close if he has everything overclocked for 1080 Ti SLI. I'd probably recommend an 850, especially since PSUs degrade over time.


Quote:


> Originally Posted by *chibi*
> 
> Gosh I'm having terrible luck with GPU's. I bought an EVGA 1080 Ti Gaming yesterday and brought it home for the new 8700K build and it had coil whine. Had a rough time with the vendor, but managed to get it replaced with a Strix 1080 Ti today. Brought it home and it also coil whined. like a squealing seal
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm not sure I can get another replacement as Memoryexpress has a final sale no return policy on higher end GPU's...


I have had an FE and a Strix and both had coil whine at high FPS, but it was not super audible, especially over case-fan noise. It can whine pretty badly if the FPS goes really high, but on my 144 Hz monitor mine doesn't seem audible above the Strix fans at 50%, though I'm sure if I put my ear to the glass panel I would hear it.


----------



## Dasboogieman

Quote:


> Originally Posted by *chibi*
> 
> Gosh I'm having terrible luck with GPU's. I bought an EVGA 1080 Ti Gaming yesterday and brought it home for the new 8700K build and it had coil whine. Had a rough time with the vendor, but managed to get it replaced with a Strix 1080 Ti today. Brought it home and it also coil whined. like a squealing seal
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm not sure I can get another replacement as Memoryexpress has a final sale no return policy on higher end GPU's...


All cards whine; it's the luck of the draw whether you get one that whines at an audible frequency.


----------



## feznz

Quote:


> Originally Posted by *KedarWolf*
> 
> I've seen a single 1080 Ti push 500 W with the XOC BIOS at 1.2 V, so I don't think it's a stretch for an SLI system to pull 1350 W.


Yeah, possibly: GPUs x2 = 1000 W, with 350 W left over for the CPU/system. It's stretching it slightly, but TBH if I were competitively benching I would have a single 1600 W PSU or a twin-PSU setup.

Personally I believe in an overkill PSU: even knowing 600 W was a realistic total system draw with a decent gaming OC, I still got the RM1000i for a possible future SLI upgrade.
A gaming OC and a bench OC just have different PSU requirements; I generally speak in terms of 24/7 game-stable OC PSU requirements.


----------



## nrpeyton

Quote:


> Originally Posted by *Dasboogieman*
> 
> Try and re-calibrate Windows ClearType.


Wow, thanks, that helped a lot.

Any text that I have to type (or text-heavy websites like this one) is now fixed; however, many menus and other things throughout Windows are still affected.

This will 100% do as a temporary fix, so thanks.

I just can't help feeling I've cheated with a "workaround"...

I wish I could get to the root of the issue, though, as I worry I may be losing graphics quality in games etc. without knowing it.

Anyway, rep+


----------



## Rakanoth

Quote:


> Originally Posted by *nrpeyton*
> 
> Wow thanks.. that helped a lot.
> 
> Any text that I have to type (or text heavy websites like this one) and it's now fixed.. however many menus and stuff throughout windows are still affected.
> 
> This will do as a temproary fix 100%.. so thanks...
> 
> Just can't help feeling I've cheated with a "workaround "...
> 
> Wish I could get to the root of the issue. Though. As I worry I may be losing graphics quality in games etc without knowing it.
> 
> Anyway rep+


It is not a workaround, IMO. You attacked the heart of the problem.








If the ClearType calibration helped you fix the issue, then you fixed the root of the problem. No need to worry about "the core" of the issue; your problem is not a hardware problem.


----------



## kayway

Hey, I was hoping for advice on two things:
1 - My current overclock is only +170 core and +300 memory with voltage +100 and the power limit at 120. I run on water and I'm definitely not thermal throttling, but I notice I hit the power limit when I try to clock higher, which results in crashes with more core or artifacts with more memory. So should I do the BIOS mod? I've heard you can't monitor the voltage if you use it, which seems worrying.
2 - What's the best software for testing your OC? Currently I'm using Heaven and 3DMark (free version).
P.S. I'm using two-way SLI, if that makes a difference.


----------



## DStealth

Quote:


> Originally Posted by *kayway*
> 
> Hey was hoping for advice on two things:
> 1 - my current overclock is only +170 core and +300 memory with voltage + 100 and power limit 120.


That statement on its own actually means nothing: that offset on a top factory-OCed card means top HOF results, while on an FE it means not even average results.
Please post the exact values you see during benchmarks, not the offsets.


----------



## 8051

Quote:


> Originally Posted by *DStealth*
> 
> And this statement actually means nothing such offset on top factory OCed cards means top HOF results and on FE means not even average results.
> Please write the exact values you have during benchmarks not offset please.


I agree, posting the offsets is irrelevant if no one knows what the base/boost values are.


----------



## d3v0

Looks like I got her stable at ~2012 MHz and ~5750 memory clocks; thanks everyone for all of the input here. Also, I found out that Firestrike is a much better stability test than Superposition, and Superposition is in turn better than Heaven, lol. I could easily run Heaven on a loop at 2050, but I got crashes in Firestrike until I went down to 2012 MHz.


----------



## feznz

Quote:


> Originally Posted by *nrpeyton*
> 
> Wow thanks.. that helped a lot.
> 
> Any text that I have to type (or text heavy websites like this one) and it's now fixed.. *however many menus and stuff throughout windows are still affected*.
> 
> This will do as a temporary fix 100%.. so thanks...
> 
> Just can't help feeling I've cheated with a "workaround"...
> 
> Wish I could get to the root of the issue, though, as I worry I may be losing graphics quality in games etc. without knowing it.
> 
> Anyway rep+


Well, even I learned something there. TBH I have no idea what's causing the current problem,
but you know the old saying: if all else fails... there is nothing like a fresh install of Windows.


----------



## kayway

Quote:


> Originally Posted by *DStealth*
> 
> That statement on its own actually means nothing: the same offset on a top factory-OCed card means top HOF results, while on an FE it means not even average results.
> Please write the exact values you have during benchmarks, not the offsets.


Oh sorry about that.
The core clock is 2050MHz and memory 5805MHz.


----------



## shreduhsoreus

Corsair Hydro GFX (massive regret, wish I'd gotten the EVGA SC Hybrid). Voltage untouched, power limit maxed. Benchmarks have it running at 2050 or 2062, depending on the specific benchmark; in games it sometimes drops down to 2GHz. Memory hits 5950; I initially had it at 6000, but after a few weeks I started noticing stability issues in specific scenarios, and it's been flawless since dropping to 5950.

The reason I regret this specific card is that Corsair's software is terrible. Like an idiot, I bought the Commander Pro to control all my ML120 fans (and so I could tie the GPU rad fan to GPU core temp). To be safe, I have to manually set my fans to a "Game" profile I made with static fan speeds, because Corsair Link likes to freeze in the background periodically.


----------



## Agent-A01

Quote:


> Originally Posted by *shreduhsoreus*
> 
> Corsair Hydro GFX(massive regret, wish I would have got the EVGA SC Hybrid). Voltage untouched, power limit maxed. Benchmarks have it running at 2050 or 2062, depending on the specific benchmark, in games it sometimes drops down to 2GHz. Memory hits 5950, initially had it at 6000, but after a few weeks I started noticing issues with stability in specific scenarios, been flawless since dropping it to 5950.
> 
> The reason I regret this specific card is because Corsair software is terrible. Like an idiot, I bought the Commander Pro to control all my ML120 fans(and so I can tie the GPU rad fan to GPU core temp). To be safe, I have to manually set my fans to a "Game" profile I made with static fan speeds because Corsair Link likes to freeze in the background periodically.


Don't use the corsair software then.

You can easily set rad fan to mobo fan header or other means to control it.


----------



## nrpeyton

Quote:


> Originally Posted by *Rakanoth*
> 
> It is not a workaround, IMO. You attacked the heart of the problem.
> 
> If ClearType calibration helped you fix the issue, then you fixed the root of the problem. No need to worry about "the core" of the issue. Your problem is not a hardware problem.


I'm still getting a red 'tint' to text today when HDR is enabled in Windows display properties.

It is affecting black text on white background the most.

For example this text which I'm typing now is affected. BUT the white text on the grey banner on the bottom of this page looks absolutely fine.

Also sometimes after restarting the PC I've got to "wiggle" the HDMI cable a little to get the picture to show on the TV... I'm beginning to suspect maybe a faulty HDMI cable is causing it.

Although I was skeptical of this, because with HDMI I always believed it either worked perfectly or didn't work at all (due to it being a digital, not an analogue, signal).


----------



## shreduhsoreus

Quote:


> Originally Posted by *Agent-A01*
> 
> Don't use the corsair software then.
> 
> You can easily set rad fan to mobo fan header or other means to control it.


There is no other way to tie the fan speed to GPU core temperature. My motherboard isn't supported by SpeedFan (plus I wouldn't use that software if you paid me; I'm not stuck in 2001), and all of the other methods would be no less of a hassle than the way I do it now. I can't rely on CPU temp because mining, folding and gaming all have different typical CPU temps. This is the best way for me to do it. I should have just gone with EVGA.

I don't think I'm being unreasonable by expecting a big company like Corsair to get their software figured out, especially considering how many people have issues with it.
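For what it's worth, the usual way tools map GPU temperature to fan speed is a point-based curve with linear interpolation between points. A minimal Python sketch of just the curve math (the curve points are made-up examples, and the actual temperature read, e.g. via NVML, and PWM control are hardware-specific and omitted):

```python
# Map GPU temperature to fan duty cycle with linear interpolation
# between user-defined curve points. Hardware I/O (reading the GPU
# temperature and driving the fan header) is deliberately left out.
FAN_CURVE = [(30, 30), (50, 50), (70, 80), (80, 100)]  # (temp C, duty %)

def fan_duty(temp_c: float, curve=FAN_CURVE) -> float:
    if temp_c <= curve[0][0]:
        return curve[0][1]          # clamp below the first point
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # clamp above the last point
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # Linear interpolation between the surrounding points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(40))  # 40.0: halfway between the 30C and 50C points
```

A background loop calling this every second or two is essentially what Corsair Link does internally; the difference is that a ~20-line script has far less that can freeze.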


----------



## Dasboogieman

Quote:


> Originally Posted by *nrpeyton*
> 
> I'm still getting a red 'tint' to text today when HDR is enabled in Windows display properties.
> 
> It is affecting black text on white background the most.
> 
> For example this text which I'm typing now is affected. BUT the white text on the grey banner on the bottom of this page looks absolutely fine.
> 
> Also sometimes after restarting the PC I've got to "wiggle" the HDMI cable a little to get the picture to show on the TV... I'm beginning to suspect maybe a faulty HDMI cable is causing it.
> 
> Although I was skeptical of this because with HDMI I always believed it either worked perfectly 100% or didn't work at all (due to it being a digital -- not an analogue signal).


A faulty HDMI cable can cause image artefacts. While it's true that higher-quality cables don't improve image quality, a faulty cable can degrade it if the fault isn't severe enough to completely break the handshake.


----------



## Agent-A01

Quote:


> Originally Posted by *shreduhsoreus*
> 
> There is no other way to tie the fan speed to GPU core temperature. My motherboard isn't supported by Speed Fan(plus I wouldn't use that software if you paid me because I'm not stuck in 2001). And all of those other methods would be no less of a hassle than the way I do it now. I can't rely on CPU temp because mining, folding and gaming all have different typical CPU temps. This is the best way for me to do it. I should have just went with EVGA.
> 
> I don't think I'm being unreasonable by expecting a big company like Corsair to get their software figured out, especially considering how many people have issues with it.


Replace the rad fan with something much better, like a low-RPM EK Vardar fan or an ML120 fan.

Those will outperform the stock rad fan while being quieter.
Then you can just have it run at full speed 24/7.


----------



## shreduhsoreus

Quote:


> Originally Posted by *Agent-A01*
> 
> Replace the rad fan with something much better like a low RPM ek vardar fan or ML120 fan.
> 
> Those will outperform stock rad fan while being quieter.
> Then you can just have it run full speed 24/7


The stock fan IS an ML120; that's what separates it from the MSI Seahawk (that, and a Corsair logo instead of "Seahawk"), and it's the only reason I bought it, haha. They're definitely not quiet either; I have 6 of them. They're great at low RPMs, but once they kick up to full speed they're just as loud as most fans.

I just have to right-click the Corsair Link logo in my system tray, select the "Game" profile I created (GPU rad fan at 2000RPM, CPU rad fans at 1250) and I'm good to go. It's just annoying that I can't use my custom curves, because if Link decides to freeze in the background I end up seeing my liquid-cooled card hit 70 degrees (with the software thinking it's only ~40). Very hit or miss, with miss being more common. Here's hoping their upcoming "Sync" software that combines CUE and Link works better... but I'm not holding my breath.

I appreciate the advice, it's just a lost cause for me. The MSI/Corsair liquid cooled cards are just the worst of the bunch. At least the GPU itself is good lol.


----------



## SavantStrike

Quote:


> Originally Posted by *nrpeyton*
> 
> I'm still getting a red 'tint' to text today when HDR is enabled in Windows display properties.
> 
> It is affecting black text on white background the most.
> 
> For example this text which I'm typing now is affected. BUT the white text on the grey banner on the bottom of this page looks absolutely fine.
> 
> Also sometimes after restarting the PC I've got to "wiggle" the HDMI cable a little to get the picture to show on the TV... I'm beginning to suspect maybe a faulty HDMI cable is causing it.
> 
> Although I was skeptical of this because with HDMI I always believed it either worked perfectly 100% or didn't work at all (due to it being a digital -- not an analogue signal).


It is most likely a bad cable.

The "it works or it doesn't" concept is a good one, but it gets murky when you throw in error correction. HDMI has error correction, so a marginal signal can display with artifacts.

In practice this just means that any cable capable of meeting the appropriate HDMI spec will work fine. An older cable will work too, as long as it was built to exceed the previous spec. Out-of-spec or damaged cables will make you lose your mind.

About 2 months ago I had a cable that had endured one too many bends tint my whole screen red. I spent an hour tracking it down as I had just swapped the video card in the machine it was hooked up to.
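A rough way to picture why a marginal cable can tint the image rather than kill it: HDMI sends red, green and blue on separate TMDS lanes, so errors concentrated on one weak lane skew a single color channel. A toy Python sketch (this is not real TMDS encoding, which uses transition-minimised 10-bit symbols; it's just an illustration of per-channel corruption):

```python
# Toy illustration: HDMI carries R, G and B on separate TMDS lanes,
# so bit errors concentrated on one lane shift one colour channel
# instead of breaking the whole picture.
def corrupt_red_lane(pixel, bitmask):
    """Flip bits only in the red channel, as if one lane were marginal."""
    r, g, b = pixel
    return (r ^ bitmask, g, b)

white = (255, 255, 255)
print(corrupt_red_lane(white, 0b0100000))  # (223, 255, 255): colour shift
```

Which way the tint goes depends on which bits get hit, but the point stands: a partially failed digital link degrades one channel gradually instead of failing outright.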


----------



## siffonen

So what's the safe 24/7 voltage? I'm currently using 1.093V with the XOC BIOS & EK waterblock.
If I remember correctly, with the stock BIOS my card was running at 1.093V when voltage was set to +100.


----------



## feznz

Quote:


> Originally Posted by *siffonen*
> 
> So whats the safe 24/7 voltage? im using currently 1.093V with xoc bios & EK waterblock.
> If i remember correctly with stock bios my card was running at 1.093V when voltage was set to +100


I believe the general consensus is actually to undervolt at ~1V. I just use the stock 1.063V for 24/7 use, but if you feel the need to squeeze out every last MHz then don't worry: overvolting will just lead to automatic volt/MHz downclocking to maintain the power/temp target.


----------



## RoBiK

Quote:


> Originally Posted by *feznz*
> 
> I believe the general consensus is actually to undervolt at ~1V. I just use the stock 1.063V for 24/7 use, but if you feel the need to squeeze out every last MHz then don't worry: overvolting will just lead to automatic volt/MHz downclocking to maintain the power/temp target.


He said he is using the XOC BIOS... so no, it won't automatically downclock, as there is no power/temp target on the XOC BIOS.


----------



## SauronTheGreat

What advantage does the 1080 Ti have over the old Titan X Pascal (2016)? Would it be wise to switch from a 1080 Ti to the old Titan X Pascal? Is that Titan X better at overclocking? I am using a Zotac 1080 Ti AMP Extreme Core Edition and my chip is so bad at overclocking.


----------



## Agent-A01

Quote:


> Originally Posted by *SauronTheGreat*
> 
> what advantage does the 1080Ti have over the old Titan X Pascal 2016 ? would it be wise in switching to the old titan x pascal 2016 from a 1080ti ? is this titan x better in overclocking ? i am using a zotac 1080ti amp extreme core edition and my chip is soo bad at overclocking


No advantages besides better PCB components.

The reference Ti is much higher quality than the Titan X Pascal.

How terrible are we talking?
What clock speeds and voltage?


----------



## SauronTheGreat

Quote:


> Originally Posted by *Agent-A01*
> 
> No advantages besides better PCB components.
> 
> Reference Ti is much higher quality than Titan X Pascal
> 
> How terrible are we talking?
> What clock speeds and voltage


Well, my 1080 Ti basically cannot be overclocked at all; it's a bad chip... although it is decently factory overclocked out of the box, if you tweak it even slightly in any OC software it crashes.

this is the one

https://www.techpowerup.com/gpudb/b4589/zotac-gtx-1080-ti-amp-extreme-core-edition


----------



## Agent-A01

Quote:


> Originally Posted by *SauronTheGreat*
> 
> well my 1080ti basically cannot be overclocked at all, it is a bad chip ...... all though it is decently factory overclocked out of the box, but if you slightly tweak it in any OC software it crashes
> 
> this is the one
> 
> https://www.techpowerup.com/gpudb/b4589/zotac-gtx-1080-ti-amp-extreme-core-edition


That still doesn't tell us anything.

For all we know your card is 2100MHz out of the box.

Screenshot your default clock speeds under load.


----------



## siffonen

I was at [email protected], and just tested the Superposition 4K benchmark at [email protected] without any issues; raised my memory clocks to 6000MHz. Still need to test stability at that voltage.
Temps barely hit 33C.


----------



## ezveedub

Just got an MSI Seahawk 1080 Ti... seems good so far. Any shortcomings with this model?

Sent from my iPhone using Tapatalk Pro


----------



## bajer29

Planning on selling my 980s to upgrade to a 1080/1080 Ti. Anyone know of any Black Friday deals? Should I wait to see what Volta is all about before purchasing?

I just want 100FPS at 1440p. I don't game a ton, but I'm on at least 3 days a week for about 2 hours at a time currently (new to fatherhood). However, I don't want my 980s to lose their value before selling them, so I'm on the fence between selling now and waiting to buy, or waiting to sell right before buying. Advice?


----------



## SauronTheGreat

Quote:


> Originally Posted by *Agent-A01*
> 
> That still doesn't tell us anything.
> 
> For all we know you card is 2100mhz out of the box.
> 
> Screenshot default clock speeds under load.


Here is the screenshot you asked for...

The thing is, I don't have the Zotac 1080 Ti AMP Extreme version; I have the Zotac 1080 Ti AMP Extreme CORE version. These were chips intended for the AMP Extreme that didn't make the grade, so Zotac rebranded them as the AMP Extreme Core Edition. My AMP Extreme barely has any decent waterblock options (tbh I prefer EK) and it has no overclocking potential. With the Titan X Pascal (2016) I would be able to put a decent WB on it and probably OC it. Would I be doing the right thing?


----------



## Agent-A01

Quote:


> Originally Posted by *SauronTheGreat*
> 
> here is the screenshot you asked for ....
> 
> 
> 
> the thing is i do not have the zotac 1080Ti amp extreme version, i have the zotac 1080Ti amp extreme CORE version which were chips intended for the amp extremes but they did not reach up to mark, so they rebranded these chips and named them amp extreme core edition ... my amp extreme barely has any decent waterblock, tbt i prefer EK and it has no overclocking potential ... with this titan x pascal 2016 i will be able to put a decent WB on it and probably able to OC it ... would i be doing the right thing ?


Swapping for a Titan X Pascal would be a waste of time and money.

There's nothing wrong with your Ti; 2GHz at 1.06V is pretty average, nothing *bad* about it.

Even an above-average chip that does 2050 would only gain about 2%, so in short it's not worth the hassle.


----------



## SauronTheGreat

Quote:


> Originally Posted by *Agent-A01*
> 
> Swapping for a titan x pascal would be a waste of time and money.
> 
> There's nothing wrong with your Ti, 2ghz at 1.06 is pretty average, nothing *bad* about it.
> 
> Even an above average chip that does 2050 at would only gain about 2%, so in short it's not worth the hassle.


Suppose I do get the Titan X Pascal, would it be a downgrade?


----------



## Agent-A01

Quote:


> Originally Posted by *SauronTheGreat*
> 
> suppose i do get the TItan X pascal , would it be a downgrade ?


It's a downgrade in the sense that you get lower-quality components.

Performance will be the same.
The only difference is an additional 1GB of VRAM.


----------



## SauronTheGreat

Quote:


> Originally Posted by *Agent-A01*
> 
> It's a downgrade in the sense that you get lower quality components.
> 
> Performance will be the same.
> Only difference is an additonal 1gb of vram.


Reference GPUs have lower-quality components than third-party manufactured cards?


----------



## nrpeyton

Quote:


> Originally Posted by *SauronTheGreat*
> 
> Reference GPUs have lower-quality components than third-party manufactured cards?


That depends on the make/model.

For example, an EVGA Kingpin 1080 Ti is going to have much higher-quality parts _(or higher-spec parts)_ than a normal Nvidia Founders Edition.

In contrast, it's always possible that the lowest-spec non-reference card that Zotac or Palit make could actually have lower-spec parts than a Founders.

_/\ To be clear, I'm not saying Palit & Zotac *necessarily* use lower-grade parts on their PCBs; I'm just using them as an example to get the point across._

"Buildzoid" does a lot of good reviews of the VRM quality on cards... his videos are definitely worth a watch on YouTube.


----------



## SauronTheGreat

Quote:


> Originally Posted by *nrpeyton*
> 
> That depends on the make/model.
> 
> For example an EVGA Kingpin 1080Ti is going to have much higher quality parts _(or higher spec parts)_ than a normal Nvidia Founders Edition.
> 
> However in contrast it's always possible that the lowest spec non reference card that Zotac or Palit do could very well actually have lower spec parts than a Founders.
> 
> _/\ however another way of looking at it again -- is -- not saying PALIT & ZOTAC *necessarily* use lower grade parts on their PCB's.. but just using as an example to get the point across._
> 
> "Buildzoid" does a lot of good reviews on quality of VRM's on cards.... his videos are definitely worth a watch on you-tube.


I get your point; my GPU does indeed have premium components... I think I should stick with my 1080 Ti.


----------



## SavantStrike

Quote:


> Originally Posted by *bajer29*
> 
> Planning on selling my 980s to upgrade to a 1080/1080 Ti. Anyone know of any Black Friday deals? Should I wait to see what Volta is all about before purchasing?
> 
> I just want 100FPS at 1440p. I don't game a ton, but I'm on at least 3 days a week for about 2 hours at a time currently (new to fatherhood). However, I don't want my 980s to lose their value before selling them, so I'm on the fence between selling now and waiting to buy, or waiting to sell right before buying. Advice?


Given that the 980 is already beaten by the 1060, I don't think they can drop in value much more until Volta releases. Selling them now vs. later shouldn't have too much of an impact either way. You've got them under water if I remember correctly, so there are still people who will want them.

As far as Black Friday deals, GPUs have been climbing in price steadily for months now with no downward movement at all. I wouldn't expect more than 25-30 bucks off.

You could sell the 980s and trade up to a 1080 Ti, and then when Volta comes out in Q2 2018 (still about 6 months away) either keep the Ti or sell it and upgrade to an 1170/1180. That's if consumer Volta happens and the Ampere rumors aren't true.


----------



## bajer29

Quote:


> Originally Posted by *SavantStrike*
> 
> Given that the 980 is already beaten by the 1060, I don't think they can drop in value much more until Volta releases. Selling them now vs. later shouldn't have too much of an impact either way. You've got them under water if I remember correctly, so there are still people who will want them.
> 
> As far as Black Friday deals, GPUs have been climbing in price steadily for months now with no downward movement at all. I wouldn't expect more than 25-30 bucks off.
> 
> You could sell the 980s and trade up to a 1080 Ti, and then when Volta comes out in Q2 2018 (still about 6 months away) either keep the Ti or sell it and upgrade to an 1170/1180. That's if consumer Volta happens and the Ampere rumors aren't true.


Hmmm, thanks. The 980s are actually on stock air. But I'm still leaning toward biting the bullet and seeing if I can get a small discount on a 1080 Ti over the holiday, then selling the 980s so there's no PC downtime, hoping I get at least $400 back from the sale.


----------



## SavantStrike

Quote:


> Originally Posted by *bajer29*
> 
> Hmmm, thanks. The 980s are actually stock air. But I'm still leaning toward biting the bullet and seeing if I can get a small discount on a 1080 TI over the holiday, then sell the 980s so there is no PC downtime and hope that I get at least $400 back from the sale.


For stock air, there's no better time than the present to sell. With Volta, an 1150 Ti could theoretically compete with the 980, and will have at least 4GB of RAM if not 8.


----------



## 2ndLastJedi

Quote:


> Originally Posted by *bajer29*
> 
> Hmmm, thanks. The 980s are actually stock air. But I'm still leaning toward biting the bullet and seeing if I can get a small discount on a 1080 TI over the holiday, then sell the 980s so there is no PC downtime and hope that I get at least $400 back from the sale.


If you wait for Volta there will certainly be a whole heap of 1080 Tis going cheap just before release!


----------



## bajer29

Quote:


> Originally Posted by *2ndLastJedi*
> 
> If you wait for Volta there will certainly be a whole heap of 1080ti's going cheap just before release!


Very good point... But what's your definition of cheap? Lol


----------



## 2ndLastJedi

I'll let mine go for about $250-300 less than I paid :/


----------



## bajer29

Quote:


> Originally Posted by *2ndLastJedi*
> 
> I'll let mine go for about $250-300 less than i paid :/


Keep me in mind if/ when Volta meets your expectations


----------



## MrTOOSHORT

Quote:


> Originally Posted by *bajer29*
> 
> Keep me in mind if/ when Volta meets your expectations


Stay one gen behind and save, good shopping!


----------



## 2ndLastJedi

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Stay one gen behind and save, good shopping!


Is that what you call it when you go from a 1080 to a Titan Xp?


----------



## 8051

Does underclocking memory on 1080Ti's affect stability?


----------



## SavantStrike

Quote:


> Originally Posted by *8051*
> 
> Does underclocking memory on 1080Ti's affect stability?


Under... Underclocking?

Not sure why you'd do that, but it shouldn't hurt stability.


----------



## 8051

Quote:


> Originally Posted by *SavantStrike*
> 
> Under... Underclocking?
> 
> Not sure why you'd do that, but it shouldn't hurt stability.


With my old 980 Ti I only noticed one game that was affected by memory speed, and the card ran cooler when I underclocked the memory (ditto for my 1080 Ti).

What I've noticed is that when overclocking the core to 2100MHz, it's not stable if I underclock the memory (by 500MHz), yet this was never a problem with my 980 Ti. That doesn't make any sense unless it's some sort of memory-timings issue.


----------



## nrpeyton

Hmm, I can't understand why you'd ever want to underclock the memory. _(Exception: unless you're using a BIOS with faster memory at default which isn't quite stable, or you bought a card that isn't stable out of the box and you're troubleshooting the cause to report to technical support/warranty teams.)_

Memory timings for VRAM are set at boot, the same as system RAM. That's why people who are able to over-volt their memory need to reboot to get it stable after applying the extra voltage: if they went right ahead, over-volted the VRAM and applied the overclock, it might be unstable, but if that voltage is already applied when the card boots (on a soft restart of the PC), the memory is more likely to train properly. This is normal.
This applies to owners of Kingpin models and EVGA Classifieds too. Also, I think Gigabyte's own overclocking tool works on their top-tier card and allows a small memory over-volt.

GDDR5X also responds nicely to colder temps.


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> hmm can't understand why you'd ever want to underclock the memory.
> 
> Memory timings for VRAM are set at boot the same as RAM. That's why when people who are able to over-volt their memory need to do a reboot to get it stable after applying the extra voltage. If they just went right ahead and over-volted the RAM, and applied the overclock it may be unstable. However if that voltage is already applied when card boots (on a soft restart of PC) it is more likely to train properly.
> Owners of Kingpin models and EVGA Classified's and this applies to. Also I think gigabyte's own overclocking tool works on their top tier card and allows a small memory over-volt.
> 
> GDDR5X also responds nicely to colder temps.


Oh, you're from Scotland.

You know why they call a kilt a 'kilt'?

Because they kilt that last man that called it a dress.


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> Oh, you're from Scotland.
> 
> You know why they call a kilt a 'kilt'?
> 
> Because they kilt that last man that called it a dress.


lol

Completed COD WW2 tonight. <--- absolutely amazing game (story _*AND*_ graphics)

And last week I completed Titanfall 2.

The week before that I completed Rise of the Tomb Raider.

Need something new -- any ideas?


----------



## KCDC

My friends, I've finally finished my rebuild. Probably the most intensive one I've done yet. Aside from everything else I updated and fixed (it took two days and two nights), I swapped the thermal pads for the Thermal Grizzly 8W/mK versions and did the shunt mod on both cards. Also added pads to the other areas KedarWolf mentioned in the past. Used 1.0 and 0.5mm pads; seems to be doing well.

Since I updated so many things it's hard to say how much the pads are contributing, but now my temps never go above 50 in benchmarks on either card. The shunt mod lets me keep a constant 2050MHz with no bin drops! I think the pads are letting me take the RAM higher, since I've been able to offset it by another 40MHz with no artifacts. I'll post actual numbers when I do more testing; the benchmarking was very late last night and I didn't log anything. I'm quite pleased with the shunt mod and my updates all around! I'll post the full build in the watercooling thread.

EDIT: one thing I noticed, though, is that I can't get my cards to go past 2050 now. Before the shunt mod, I at least got spikes up to 2088. Any thoughts? Also, should I now keep my power limit at 100? 120 is unsafe with the shunt mod, correct? I can't remember.


----------



## KedarWolf

Quote:


> Originally Posted by *nrpeyton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> Oh, you're from Scotland.
> 
> You know why they call a kilt a 'kilt'?
> 
> Because they kilt that last man that called it a dress.
> 
> lol
> 
> Completed COD WW2 tonight. <--- absolutely amazing game (story _*AND*_ graphics)
> 
> And last week I completed Titanfall 2.
> 
> The week before that I completed Rise of the Tomb Raider.
> 
> Need something new -- any ideas?
Click to expand...

I think I mentioned Hellblade: Senua's Sacrifice, one of the very best games I've ever played.

Enjoyed it even more than Rise of the Tomb Raider, and I really liked ROTR.

Only $30 on Steam.


----------



## nrpeyton

Quote:


> Originally Posted by *KCDC*
> 
> My friends, I've finally finished my rebuild. Probably the most intensive one I've done yet. Aside from everything else I updated and fixed (took two days and two nights), I swapped the thermal pads with the thermal grizzly 8w/mk versions and did shunt mod on both cards. Also added pads to the other areas KedarWolf mentioned in the past. Used 1.0 and 0.5mm pads, seems to be doing well.
> 
> Since I updated so many things it's hard to say if the pads are doing this much, but now my temps never go above 50 on benchmarks for the cards. The shunt mod allowed me to keep a constant 2050 mhz with no bin drops! I think the pads are allowing me to take the ram higher since I have been able to offset them by another 40mhz with no artifacts. I'll post actual numbers when I do more testing, the benchmarking was very late last night and I didn't log anything. I'm quite pleased with the shunt mod and my updates all around! I'll post full build in the watercool thread.
> 
> EDIT: one thing I noticed, though, is I cant get my cards to go past 2050 now.. Before the shunt mod, I was able to at least get spikes up to 2088. Any thoughts? Also, should I now keep my power limit to 100? 120 is unsafe with the shunt mod correct? I can't remember.


If the card is drawing more power, it will be less stable at higher clocks. (Just like how your CPU probably can't pass Prime95 at 5.1GHz but can still boot into Windows and do other light stuff.)

I'd say if you're getting a consistent 2050 you're still faster than before (as you say, no bin drops).

I've also got my card shunt modded, and before I installed the XOC firmware I also had the power slider at 120%. If you did all three shunts, expect about an extra 25% power from the mod. I've seen spikes up to nearly 500 watts on rare occasions _(just from the card alone)_. _(But I also have the XOC firmware too.)_

I wouldn't say it's 'unsafe'.. however if you constantly run at 450+ watts at high temps 24/7, your card IS going to degrade faster due to electromigration. Turning everything up to FULL for the odd very demanding game isn't going to kill your card, though. It's just one of those chances you take: do you need it to last 5 years or 8 years?

Lastly, I agree 100% that your memory probably does indeed clock faster now. Adding those extra pads will have helped draw off the heat spillage from the VRM that affects the GDDR5X memory chips on the VRM side of the card; I've noticed they run up to 30C hotter than the GDDR5X chips on the I/O side. VRAM on these cards does respond nicely to lower temps.

If you want to be 100% sure of the stability of your memory overclock, maybe check out OCCT... it's great at detecting memory overclock errors. (See the recommendation in the OP.)
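For anyone curious about the arithmetic behind that ~25% figure: the card estimates current from the voltage drop across a small shunt resistor, and paralleling a second resistor over it makes the controller under-read, so the real draw at a given "reported" power scales up. A quick sketch with illustrative resistor values (not any specific card's):

```python
# Why a shunt mod raises the real power limit: the card senses current
# via the voltage drop across a small shunt resistor. Paralleling a
# second resistor lowers the effective resistance, so the controller
# under-reads current and permits more actual draw.
# Resistor values below are illustrative examples only.
def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

shunt = 0.005                  # 5 mOhm sense resistor (example)
mod = parallel(shunt, 0.020)   # pile a 20 mOhm resistor on top
scale = shunt / mod            # real current per unit of reported current

reported_limit_w = 250             # what the card thinks it allows
print(round(scale, 2))             # 1.25: card under-reads by 25%
print(reported_limit_w * scale)    # 312.5 W of actual draw at "250 W"
```

The same scale factor applies to every power reading the card makes, which is also why software power monitoring under-reports after the mod.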


----------



## nrpeyton

Quote:


> Originally Posted by *KedarWolf*
> 
> I think I mentioned Hellblade: Senou's Sacrifice, one of the very best games I've ever played.


Perfect - I'll have a look. Thanks


----------



## KCDC

Can confirm: Hellblade is amazing and needs to be experienced. Plus you're supporting an indie developer with no major corp behind it.
No easter eggs, no leveling, just an amazing story, fun puzzles and a look into psychosis. Use headphones for the best experience.

Graphics are top notch.


----------



## KCDC

Quote:


> Originally Posted by *nrpeyton*
> 
> If the card is drawing more power it will be less stable at higher clocks. (Just like how your CPU probably can't pass prime95 at 5.1ghz but can probably still boot into windows and do other light stuff).
> 
> I'd say if you're getting a consistent 2050 you're still faster than before. (as u say -- no bin drops).
> 
> I've also got my card shunt modded -- and before I installed the XOC firmware I also had power slider at 120%. If you did all three shunts expect about an extra 25% power from the mod. I've seen spikes up to nearly 500 watts on rare occasions _(just from card alone)_. _(But I also have the XOC firmware too).
> _
> I wouldn't say it's 'unsafe'.. however if you constantly run at 450+ watts at high temps 24/7 your card IS going to degrade faster due to micro-electro migration. However turning everything up to FULL for the odd very-demanding game isn't going to kill your card. It's just one of those chances you take -- _I.E. do you need it to last 5 years or 8 years?_
> 
> Lastly I agree 100% that your memory probably does indeed clock faster now. Adding those extra pads will have helped draw heat spillage from VRM that affects the GDDR5X memory chips on the VRM side of the card. I've noticed they run up to 30c hotter than the GDDR5X chips on the I/O side. VRAM on these cards does respond nicely to lower temps.
> 
> If you want to be 100% sure on the stability of your memory overclock; maybe check out OCCT... it's great for detecting memory overclock errors. (See recommendation on OP).


Hey, I appreciate this response, thanks!

I'm gonna push the RAM more and see what I can get. Can't push the clock past what I did before the shunt mod, but that wasn't the point of the mod.

I think for safety I'll keep the limit at 100 since I get the same clock regardless. I do a lot of GPU rendering, so I want to play it safe. Thanks again for the info!


----------



## nrpeyton

Quote:


> Originally Posted by *KCDC*
> 
> Hey, I appreciate this response, thanks!
> 
> I'm gonna push the ram more and see what I can get. Can't push the clock past what I did before shunt mod, but that wasn't the point of the mod.
> 
> I think for safety, ill keep limit at 100 since I get the same clock regardless. I do a lot of GPU rendering, so I want to play it safe. Thanks again for the info!


No probs









I find that interesting.. because a good, successful shunt mod will allow an extra 25-30% more power to be drawn. But the slider in MSI AB yields 20% without a shunt mod. Was it worth doing the shunt mod for only an extra 5-10%?

If you're interested in seeing what your card is actually drawing in real-time while you're gaming _(or benching)_ -- download hwinfo64 (if you don't have it already) and RivaTuner Statistics Server. hwinfo64 displays real-time power usage for the card, and you can send the info to RTSS via the OSD tab in hwinfo64. It's displayed on-screen a bit like an FPS counter. (Apologies if you knew that already or already have it -- just trying to suggest something that might help you monitor power, as you seem quite concerned about it.)


----------



## KCDC

Quote:


> Originally Posted by *nrpeyton*
> 
> No probs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I find that interesting.. because a good, successful shunt mod will allow an extra 25-30% more power to be drawn. But the slider in MSI AB yields 20% without a shunt mod. Was it worth doing the shunt mod for only an extra 5-10%?
> 
> If you're interested in seeing what your card is actually drawing in real-time while you're gaming _(or benching)_ -- download hwinfo64 (if you don't have it already) and RivaTuner Statistics Server. hwinfo64 displays real-time power usage for the card, and you can send the info to RTSS via the OSD tab in hwinfo64. It's displayed on-screen a bit like an FPS counter. (Apologies if you knew that already or already have it -- just trying to suggest something that might help you monitor power, as you seem quite concerned about it.)


Appreciated again. I am concerned about power draw and know of some "ways" but having it told to me is most welcome. Doing this now +rep!


----------



## KCDC

Before doing that, I pulled some numbers. Maybe I didn't put enough liquid metal on the shunts.

Power draw would shift from 65-78%, sometimes into the 80s, on both cards.

2050 clock
6179 memory

solid no crash

timespy.

Shouldn't I be seeing lower power consumption under load when the power limit is at 100?

temps remained around 45-50 on load


----------



## KCDC

looks like one of my cards has less of a proper shunt applied..

according to hwinfo, card 0 hits around 200w

card 1 will hit 230-240

this is with power limit at 100.


----------



## nrpeyton

Quote:


> Originally Posted by *KCDC*
> 
> Appreciated again. I am concerned about power draw and know of some "ways" but having it told to me is most welcome. Doing this now +rep!


No probs.

Here's how it looks for me with it all set up.



Quote:


> Originally Posted by *KCDC*
> 
> looks like one of my cards has less of a proper shunt applied..
> 
> according to hwinfo, card 0 hits around 200w
> 
> card 1 will hit 230-240
> 
> this is with power limit at 100.


If your shunt mod is successful, a reported 230-240w is probably actually 287-300 watts. (With the slider at 100%, max power draw as defined in the BIOS is 250 watts; with the slider at 120%, it's 300w.)
300w is also the hard-coded maximum for most cards (and definitely for Founders Editions).
With the slider at 120% *and* the shunt mod -- if it's reporting 300 watts, it's probably actually drawing approx. 375 watts.

See my edit below for explanation \/

*EDIT:* (important)
Also, remember the shunt mod is *"tricking"* the card into thinking you're drawing less power than you really are.

So less power will be reported to hwinfo64. (and to windows _*and more importantly to the nvidia display driver).*_

I used a hardware watt meter plugged in between wall socket and PC and made some rough estimates to validate my 25 - 30% theory.

You can edit individual values in hwinfo64 by applying an additional "formula" or "sum" to any displayed figure.

Because I know my shunt mod fools the card into thinking it's drawing 25% less power than it really is (allowing it to draw that much more over and above its BIOS cap of 300w), I right-click the GPU power row in hwinfo64, select "customise values", and under "multiply" add the value "1.25".

From tests I've done I reckon it's probably still accurate to within 5%. Which is pretty decent.

Remember the shunt mod won't change the hard-coded BIOS cap of 300w. You're only fooling the card into thinking it's drawing less than it really is, so it thinks it's under the cap.. allowing it to draw more. <-- that's key... then you correct what is displayed on-screen in hwinfo64 by applying the formula and multiplying the outputted value by an extra 25%. So 1.25.
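The correction above boils down to a couple of lines. (A hedged sketch only: the 1.25 factor is my own rough watt-meter estimate for my card, and the 250/300 W figures are the Founders Edition BIOS targets mentioned in this thread -- other cards and other mods will differ.)

```python
# Correcting shunt-modded power readings, assuming a ~25% under-report
# (the factor estimated above with a wall watt-meter; an assumption).

SHUNT_FACTOR = 1.25  # card under-reports by ~25% after the mod

def actual_power(reported_watts, factor=SHUNT_FACTOR):
    """Estimate the real draw from the value hwinfo64 displays."""
    return reported_watts * factor

# A reported 240 W (near the 120%-slider BIOS cap) is really about 300 W:
print(actual_power(240))  # 300.0
```

This is exactly what the hwinfo64 "multiply" field does, just spelled out.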



Quote:


> Originally Posted by *KedarWolf*
> 
> I think I mentioned Hellblade: Senua's Sacrifice, one of the very best games I've ever played.
> 
> Enjoyed it even better than Rise Of The Tomb Raider and I really liked ROTR.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Only $30 on Steam.


Wow, this game is very demanding. I've had to max my overclock out for this title (not had to do that for a game in ages).

At 2138MHz at 1.093v and +685 mem I'm barely hitting 50 FPS.

It's a pity there isn't a way to force a game to use more VRAM for increased performance.

I noticed COD WW2 was consistently using 10GB+ of VRAM, which I've never witnessed in a game before -- but considering how beautiful the game is, getting 60-100 FPS at 4k ultra was easy (even under-volted at 1.0v). I think it may have something to do with how much is buffered into fast GDDR5X VRAM for quick access to data by the GPU core. I wonder if this is a good example of how games can be optimized -- i.e. how good the developers are at mapping VRAM requirements for games and managing how that scales across different cards with different amounts of VRAM?

Anyway regardless, it's nice to see a game that stresses my card! Makes you feel like your overclock and the extra money invested in the biggest card was worth it


----------



## KCDC

Cool.. this helps me, thank you. I did a ton of reading beforehand, so I know what the shunt does, but this gives me more info. I just wanted consistent rates without down-bins. Seems it is doing that.

But now I am clamped to a solid 2050. Not a bad thing, just was hoping to push it more. Seems to be this card's limit? I need to adjust the cards separately.


----------



## 8051

Quote:


> Originally Posted by *nrpeyton*
> 
> hmm, can't understand why you'd ever want to underclock the memory. _(Exception: unless you're using a BIOS with faster memory at default which isn't quite stable, or you bought a card that isn't stable out of the box and you're troubleshooting the cause to report to technical support/warranty teams)._


Heat and power.

But if the GPU is running at 2100 MHz, what's the point in having the memory run at 3000 MHz aside from having lower latencies? If the memory controller is running at 3000 MHz, it's waiting on the GPU.


----------



## Agent-A01

Quote:


> Originally Posted by *8051*
> 
> Heat and power.
> 
> But if the GPU is running at 2100 Mhz. what's the point in having the memory run at 3000 Mhz. aside from having lower latencies? If the memory controller is running at 3000 MHz. it's waiting on the GPU.


That doesn't make any sense.

You're not saving heat or power from underclocking the memory.


----------



## 8051

Quote:


> Originally Posted by *Agent-A01*
> 
> That doesn't make any sense.
> 
> You're not saving heat or power from underclocking the memory.


But you're wrong. Dead wrong. Power usage increases w/increased switching frequency in transistors.


----------



## Agent-A01

Quote:


> Originally Posted by *8051*
> 
> But you're wrong. Dead wrong. Power usage increases w/increased switching frequency in transistors.


Nothing you will be able to measure.

So real world usage you will see zero gains.

Considering the 11 GDDR5x chips use about 30w at 1.5v, to see considerable or even 'decent' gains you would have to undervolt as well.

Even if you took off 1000mhz, you'll see about a 3w drop across 11 chips..

Waste of time.
Not to mention they are rated to hit 80c.
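The "~3w drop" estimate above can be sanity-checked in a couple of lines. (A sketch only: it assumes dynamic power scales linearly with clock at fixed voltage, and takes the 30 W total and 11 GHz effective data rate from this thread rather than any datasheet.)

```python
# Rough estimate of power saved by underclocking GDDR5X, assuming
# dynamic power scales linearly with clock at a fixed voltage (P ∝ f).

TOTAL_MEM_POWER_W = 30.0       # ~30 W across the 11 chips (figure from the post)
EFFECTIVE_CLOCK_MHZ = 11000.0  # ~11 Gbps effective data rate (assumption)

def power_saved_w(underclock_mhz):
    """Watts saved for a given effective-clock reduction, P ∝ f."""
    return TOTAL_MEM_POWER_W * underclock_mhz / EFFECTIVE_CLOCK_MHZ

print(round(power_saved_w(1000), 1))  # ~2.7 W, in line with the "about 3w" claim
```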


----------



## nrpeyton

On the back of my last post -- I'm not so sure about this game lol (Hellblade: Senua's Sacrifice).. too many annoying puzzles and having to look around to make _"ravens line up"_ etc and not enough fast-paced action...

I'll give it another hour or so but I'm not 100% sure anymore lol.

Reviews are good though.. and it's definitely unique.. just not sure I've got the patience, lol. Also it's an independent game.. only 20 developers, using the Unreal Engine.

Having just played through and completed 3 or 4 'AAA FPS' games in a row I do fancy a change though -- maybe a strategy game.

Anyone know of a decent modern game that's anything like the old classic 'Total Annihilation'? Or just a good modern strategy game in general?

Racing would do too, but I'm not willing to pay for Forza 7 due to reviews I've read about micro-transactions, so unless I can find a torrent somewhere to test it first, I'm missing out.

If anyone knows of a torrent link for it though plz PM me. If I like what they've done and the micro-transactions don't turn out to be too greedy, causing distrust in the reward/progression system, then I'll buy it. I bought and paid for Assetto Corsa *&* Project Cars and regretted *both* purchases.


----------



## 8051

Quote:


> Originally Posted by *Agent-A01*
> 
> Nothing you will be able to measure.
> 
> So real world usage you will see zero gains.
> 
> Considering the 11 GDDR5x chips use about 30w at 1.5v, to see considerable or even 'decent' gains you would have to undervolt as well.
> 
> Even if you took off 1000mhz, you'll see about a 3w drop across 11 chips..
> 
> Waste of time.
> Not to mention they are rated to hit 80c.


Um the memory controller they're connected to ALSO uses more power at higher frequency. Or did you think those memory chips operated all by themselves?


----------



## nrpeyton

Quote:


> Originally Posted by *8051*
> 
> Um the memory controller they're connected to ALSO uses more power at higher frequency. Or did you think those memory chips operated all by themselves?


GDDR5X memory chips themselves don't produce much heat. The main reason for any heat on them is spillage from the VRM or other components. _(When I say heat, I mean enough to affect performance/overclocking etc.)_

A heat-sink or waterblock is just too efficient to be overwhelmed by a GDDR5X memory chip enough to affect an overclock.

Also, the switching frequency of the memory VRM doesn't change depending on the overclock. It's 300 kHz as standard, whether you're under-clocked by -1000 or overclocked by +1000.

Raising the memory VRM switching frequency via software is only possible on EVGA Classy, EVGA Kingpin & also Galax HOF cards.

It might seem a good idea in theory, but it also introduces more "noise". Power draw from memory is so low anyway that the memory VRM doesn't need to deal with sudden high spikes in power requirement the same way the core does. So there's really not much benefit under normal air/water cooling. _(It could even be counter-productive to stability due to increased noise emissions.)_


----------



## nrpeyton

deleted


----------



## KCDC

Quote:


> Originally Posted by *nrpeyton*
> 
> On the back of my last post -- I'm not so sure about this game lol (Hellblade: Senua's Sacrifice).. too many annoying puzzles and having to look around to make _"ravens line up"_ etc and not enough fast-paced action...
> 
> I'll give it another hour or so but I'm not 100% sure anymore lol.
> 
> Reviews are good though.. and it's definitely unique.. just not sure I've got the patience, lol. Also it's an independent game.. only 20 developers, using the Unreal Engine.
> 
> Having just played through and completed 3 or 4 'AAA FPS' games in a row I do fancy a change though -- maybe a strategy game.
> 
> Anyone know of a decent modern game thats anything like the old classic 'Total Annihilation'? Or just a good modern strategy game in general?
> 
> Racing would do too. but I'm not willing to pay for Forza 7 due to reviews I've read about micro-transactions so unless I can find a torrent somewhere to test it first then I'm missing out.
> 
> If anyone knows of a torrent link for it though plz PM me. If I like what they've done and the micro-transactions don't turn out to be too greedy causing distrust the reward/progression-system then I'll buy it.


The latest XCOM 2 DLC is what I'm into lately... it's hard. One of the few where I consider changing the difficulty to easy..

http://store.steampowered.com/app/593380/XCOM_2_War_of_the_Chosen/

Also Total War: Warhammer is tons of fun, takes getting used to.

http://store.steampowered.com/app/364360/Total_War_WARHAMMER/


----------



## nrpeyton

Quote:


> Originally Posted by *KCDC*
> 
> The latest XCOM 2 DLC is what I'm into lately... it's hard. One of the few where I consider changing the difficulty to easy..
> 
> http://store.steampowered.com/app/593380/XCOM_2_War_of_the_Chosen/
> 
> Also Total War: Warhammer is tons of fun, takes getting used to.
> 
> http://store.steampowered.com/app/364360/Total_War_WARHAMMER/


i'll have a look. thanks

I loved the Total War series. Just not so sure about the more recent ones, as I fear they dumbed them down a bit.


----------



## KCDC

Quote:


> Originally Posted by *nrpeyton*
> 
> i'll have a look. thanks
> 
> I loved the Total War series. Just not so sure about the more recent ones, as I fear they dumbed them down a bit.


They tend to do that so more people purchase it, but that's where you change the difficulty!









plenty of good RPGs out there too..

original sin 2 is a fave currently

edited due to phone.


----------



## nrpeyton

Quote:


> Originally Posted by *KCDC*
> 
> They tend to do that so more people purchase it, but that's where you change the difficulty!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> plenty of good RPGs out there too..
> 
> original sin 2 is a fave currently
> 
> edited due to phone.


Aye, that's a good point actually -- I have the latest Dragon Age, and I loved the first one.. but for some reason I just can't seem to get into the latest one as much. (But I forgot I had it -- and to be fair I don't think I've given it a proper chance yet.)

Story just doesn't seem to engage me fast enough to keep me interested/hooked. Weird, because I was genuinely totally and utterly addicted to the last one while I was playing through the campaign.

Think I might revisit it actually.

Anyone who's not played Titanfall 2 or COD WW2 is really missing out.


----------



## KCDC

Quote:


> Originally Posted by *nrpeyton*
> 
> Aye, that's a good point actually -- I have the latest Dragon Age, and I loved the first one.. but for some reason I just can't seem to get into the latest one as much. (But I forgot I had it -- and to be fair I don't think I've given it a proper chance yet.)
> 
> Story just doesn't seem to engage me fast enough to keep me interested/hooked. Weird, because I was genuinely totally and utterly addicted to the last one while I was playing through the campaign.
> 
> Think I might revisit it actually.
> 
> Anyone who's not played Titanfall 2 or COD WW2 is really missing out.


Witcher 3 should be your next game then..

If not, xcom2


----------



## 8051

Quote:


> Originally Posted by *nrpeyton*
> 
> GDDR5X memory chips themselves don't produce much heat. The main reason for any heat on them is spillage from the VRM or other components. _(When I say heat, I mean enough to affect performance/overclocking etc.)_
> 
> A heat-sink or waterblock is just too efficient to be overwhelmed by a GDDR5X memory chip enough to affect an overclock.
> 
> Also, the switching frequency of the memory VRM doesn't change depending on the overclock. It's 300 kHz as standard, whether you're under-clocked by -1000 or overclocked by +1000.
> 
> Raising the memory VRM switching frequency via software is only possible on EVGA Classy, EVGA Kingpin & also Galax HOF cards.
> 
> It might seem a good idea in theory, but it also introduces more "noise". Power draw from memory is so low anyway that the memory VRM doesn't need to deal with sudden high spikes in power requirement the same way the core does. So there's really not much benefit under normal air/water cooling. _(It could even be counter-productive to stability due to increased noise emissions.)_


The memory controller is 384-bits wide and is synchronous with the memory, it uses more power the faster it runs.


----------



## nrpeyton

Quote:


> Originally Posted by *8051*
> 
> The memory controller is 384-bits wide and is synchronous with the memory, it uses more power the faster it runs.


I agree, but not enough to produce enough extra heat to affect overclocking performance or overwhelm modern heat-sink/waterblock technology.


----------



## nrpeyton

Quote:


> Originally Posted by *KCDC*
> 
> Witcher 3 should be your next game then..
> 
> If not, xcom2


Witcher 3 - already done - good game.

Downloading XCOM 2 and Need For Speed Payback now









Do I really need 'War of the Chosen' to get the best from XCOM 2?

Edit: Can't get Need For Speed Payback working, so it looks like it's XCOM 2 for now.....

Installing now.

Hopefully I can get into it.


----------



## Agent-A01

Quote:


> Originally Posted by *8051*
> 
> The memory controller is 384-bits wide and is synchronous with the memory, it uses more power the faster it runs.


You're arguing technicalities when we are talking about real world gains; there is none.


----------



## propeldragon

I'm thinking about doing a repad of fujipoly for my 1080 ti FE, how much will I need? Thanks


----------



## KCDC

Quote:


> Originally Posted by *nrpeyton*
> 
> Witcher 3 - already done - good game.
> 
> Downloading XCOM 2 and Need For Speed Payback now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Do I really need 'war of the chosen' to get the best from xcom 2?
> 
> Edit: Can't get Need For Speed Payback working, so it looks like it's XCOM 2 for now.....
> 
> Installing now.
> 
> Hopefully I can get into it.


XCOM 2.. save often. It's turn-based, so you'll wanna adjust your strategy many times. Also pay attention to your crew and ship.. the upgrades are essential. Play the original first. Get War of the Chosen after.


----------



## nrpeyton

Quote:


> Originally Posted by *KCDC*
> 
> XCOM 2.. save often. It's turn-based, so you'll wanna adjust your strategy many times. Also pay attention to your crew and ship.. the upgrades are essential. Play the original first. Get War of the Chosen after.


Okay, just begun the tutorial.

What's up with the graphics? It looks pretty "bland", yet with my 1080 Ti overclocked to 2189MHz at 1.2v and +685 memory I'm only hitting 37 FPS with drops down to 30.

(Everything maxed out at 4k -- but still, come on, lol). The foliage, for example, doesn't look anything like it does in COD WW2 or other AAA games.

Something doesn't seem right there - lol.


----------



## 8051

Quote:


> Originally Posted by *Agent-A01*
> 
> You're arguing technicalities when we are talking about real world gains; there is none.


Really? A 500 MHz difference in the speed of a 384-bit memory controller makes no difference in power consumption, eh?
I call shens.


----------



## KCDC

Quote:


> Originally Posted by *nrpeyton*
> 
> Okay just begun the tutorial.
> 
> What's up with the graphics? It looks pretty "bland", yet with my 1080 Ti overclocked to 2189MHz at 1.2v and +685 memory I'm only hitting 37 FPS with drops down to 30.
> 
> Something doesn't seem right there - lol.


Never had that issue... Always 60 fps at 1440p

Play with the settings.. Maybe AA is wrong?


----------



## nrpeyton

Quote:


> Originally Posted by *KCDC*
> 
> Never had that issue... Always 60 fps at 1440p
> 
> Play with the settings.. Maybe AA is wrong?


The graphics look a bit "bland", so I assumed I'd be able to have everything on max at 4k.

I'll play about with settings then.

Edit:
This game looks okay actually -- _seem to be getting into it a bit._


----------



## 8051

Quote:


> Originally Posted by *nrpeyton*
> 
> I agree, but not enough to produce enough extra heat to affect overclocking performance or overwhelm modern heat-sink/waterblock technology.


How much power does a 384-bit memory controller use? Do you know? I'll bet it's NOT insignificant.


----------



## Agent-A01

Quote:


> Originally Posted by *8051*
> 
> Really? A 500 MHz difference in the speed of a 384-bit memory controller makes no difference in power consumption, eh?
> I call shens.


You're dense as a rock.
I said there will be no real world gains. A few watts at best


----------



## nrpeyton

Any bitcoin miners on here?

What kind of daily earnings are you making off a single 1080Ti?

I've just set up a miner app in "demo" mode to see what my potential earnings would be.

$4.50 USD a day, or £3.35 GBP a day _(as I'm from the U.K.)._

Even with the card undervolted to 1.0v at 1999MHz and +685 memory, my card is still hitting massive power spikes of up to 370 watts _(card only)._









Been running for about 25 minutes and I think I'm on about £0.06 lol
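For anyone curious, the numbers above extrapolate roughly like this. (A sketch only: it assumes earnings scale linearly with time, which real mining payouts only approximate, and uses the £0.06-in-25-minutes sample quoted here.)

```python
# Extrapolate a short mining sample to a daily figure.
# Linear scaling over time is an assumption.

def daily_earnings(earned, minutes):
    """Scale an amount earned over `minutes` to a 24-hour figure."""
    return earned / minutes * 60 * 24

print(round(daily_earnings(0.06, 25), 2))  # ~3.46 GBP/day, close to the £3.35 estimate
```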


----------



## ZealotKi11er

Quote:


> Originally Posted by *nrpeyton*
> 
> Any bitcoin miners on here?
> 
> What kind of daily earnings are you making off a single 1080Ti?
> 
> I've just set up a miner app in "demo" mode to see what my potential earnings would be.
> 
> $4.50 USD a day, or £3.35 GBP a day _(as I'm from the U.K.)._
> 
> Even with the card undervolted to 1.0v at 1999MHz and +685 memory, my card is still hitting massive power spikes of up to 370 watts _(card only)._
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Been running for about 25 minutes and I think I'm on about £0.06 lol


Seems about right.


----------



## nrpeyton

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Seems about right.


Can I add my old ATI Radeon HD 6950 and Radeon HD 5770 into spare PCI-E slots?

Or does CrossFire/SLI need to be enabled for the system to use the extra processing power for mining?


----------



## nrpeyton

deleted


----------



## Nervoize

Any way to improve my current OC? 2176 seems stable for 10 sec and then crashes at 1.2v.
Currently at this stable OC


How to get over 1.2v?


----------



## Leopardi

Anyone know if the basic Zotac AMP! or Palit JetStream coolers are any good? Those are the only decent Black Friday deals here. The MSI ARMOR as well, but I guess it is ruled out; so much bad stuff about it being noisy.

The EVGA 1080 Ti SC is the next best but it's getting expensive. How is that one with the mutilated ICX cooler?


----------



## ZealotKi11er

Quote:


> Originally Posted by *nrpeyton*
> 
> Can I add my old ATI Radeon HD 6950 and Radeon HD 5770 into spare PCI-E slots?
> 
> Or does CrossFire/SLI need to be enabled for the system to use the extra processing power for mining?


Too old. You need GCN at very least.


----------



## feznz

Quote:


> Originally Posted by *Agent-A01*
> 
> You're arguing technicalities when we are talking about real world gains; there is none.


Quote:


> Originally Posted by *8051*
> 
> Really? A 500 Mhz. difference in the speed of a 384-bit memory controller makes no difference in power consumption eh?
> I call shens.


Can we agree you are both right -- coming from a guy who runs naked memory


----------



## 8051

Quote:


> Originally Posted by *Agent-A01*
> 
> You're dense as a rock.
> I said there will be no real world gains. A few watts at best


Let's see some proof.


----------



## Agent-A01

Quote:


> Originally Posted by *8051*
> 
> Let's see some proof.


I don't have to provide proof, you do.

You made the claims of the benefits.
Lol


----------



## nrpeyton

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Too old. You need GCN at very least.


GCN?


----------



## ZealotKi11er

Quote:


> Originally Posted by *nrpeyton*
> 
> GCN?


GCN = HD 7XXX +


----------



## becks

Gigabyte Aorus GeForce GTX 1080 Ti "Waterforce WB Xtreme Edition" 11264MB GDDR5X

2012-2000 core.
6066 memory - can go up to 6400+ but not sure it's worth it. Let me know if I am wrong here and I will push it up more.

1.012v

https://www.3dmark.com/3dm/23548527 - Graphics Score 10,819, CPU Score 6,127


----------



## Gurkburk

Quote:


> Originally Posted by *becks*
> 
> Gigabyte Aorus GeForce GTX 1080 Ti "Waterforce WB Xtreme Edition" 11264MB GDDR5X
> 
> 2012-2000 core.
> 6066 memory - can go up to 6400+ but not sure it's worth it. Let me know if I am wrong here and I will push it up more.
> 
> 1.012v
> 
> https://www.3dmark.com/3dm/23548527 - Graphics Score 10,819, CPU Score 6,127


Really? 1.012v when being pushed?


----------



## becks

Quote:


> Originally Posted by *Gurkburk*
> 
> Really? 1.012v when being pushed?


Yes, on water.

Temps 35-46C (max) in a Heaven loop, all ultra, 8x anti-aliasing.


----------



## Leopardi

Well, I just ordered the Palit 1080 Ti JetStream version. Seems it's between the MSI Gaming X and Gigabyte's Aorus in terms of noise, and offers similar cooling power. I didn't find any complaints about coil whine with it either.

Should be a nice performance jump from my 980 in any case


----------



## nrpeyton

Quote:


> Originally Posted by *Nervoize*
> 
> Any way to improve my current oc? 2176 seems stable for 10sec and then crashes at 1.2v.
> Currently this stable oc
> 
> 
> How to get over 1.2v?


Only way over 1.2v is a physical hardware mod.

Or just get it *colder*. <-- _most important_!

I can pass the Forza 7 benchmark at 2189MHz at 1.2v if the coolant temperature in my loop is 6-8c.

It'll also run some lighter benches at 2214.


----------



## Rakanoth

I find it really frustrating that I can't average over 165 fps with this card at 1440p in major titles: BF1, Witcher 3, GTA 5, Mankind Divided. I just can't max out everything; some tweaking of graphics settings is always required. Not always at the cost of beautiful graphics, but still, not being able to max out annoys me. Even when I do max out, I almost always get under 80-90 fps.


----------



## Leopardi

Quote:


> Originally Posted by *Rakanoth*
> 
> I find it really frustrating that I can't average over 165 fps with this card at 1440p in major titles: BF1, Witcher 3, GTA 5, Mankind Divided. I just can't max out everything; some tweaking of graphics settings is always required. Not always at the cost of beautiful graphics, but still, not being able to max out annoys me. Even when I do max out, I almost always get under 80-90 fps.


Are you bumping up supersampling so its not 1440p? Or is the Ryzen limiting it?


----------



## Rakanoth

Quote:


> Originally Posted by *Leopardi*
> 
> Are you bumping up supersampling so its not 1440p? Or is the Ryzen limiting it?


I am not sure if I understood your question correctly. I am turning up everything: supersampling, ambient occlusion, foliage visibility range, shadow quality, NVIDIA HairWorks etc.
Am I supposed to turn down supersampling when I am at 1440p?
Ryzen won't bottleneck at 1440p. It is the GPU that bottlenecks.


----------



## GreedyMuffin

"Ryzen won't bottleneck at 1440"

That's what I thought too... Glad I made the switch to Intel again.


----------



## ZealotKi11er

Quote:


> Originally Posted by *GreedyMuffin*
> 
> "Ryzen won't bottleneck at 1440"
> 
> That's what I thought too... Glad I made the switch to Intel again.


It depends. If Intel gets 120 fps and Ryzen gets 110 fps you can call it a "bottleneck", but really it's not that big of a deal. The 7820X has many problems similar to Ryzen's. It's basically a Ryzen with higher clock speeds.


----------



## Rakanoth

Quote:


> Originally Posted by *GreedyMuffin*
> 
> "Ryzen won't bottleneck at 1440"
> 
> That's what I thought too... Glad I made the switch to Intel again.


I read lots of reviews and benchmarks. There is only one game in which there is a significant difference between the R7 and Intel: Dota 2. In other games the performance difference is no more than 14 FPS.
I could just get an Intel 7700K or 8700K, but I use my machine as a workstation from time to time, so Ryzen was the best fit for me. Intel X299 could be nice too, but the power draw is too much. Also it would be more costly.

Of course when new Volta GPUs come, I might consider switching to Intel.


----------



## Leopardi

Quote:


> Originally Posted by *Rakanoth*
> 
> I am not sure if I understood your question correctly. I am turning up everything: supersampling, ambient occlusion, foliage visibility Range, shadow quality, NVIDIA HairWorks etc.
> Am I supposed to turn down supersampling when I am at 1440p?
> Ryzen won't bottleneck at 1440p. It is the GPU that bottlenecks.


Supersampling is literally turning up the resolution; you're supposed to leave it at 100%. A 120% supersampling setting on a 2560*1440 monitor means you're actually rendering at 3072*1728.
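The scaling works per axis, which is where those numbers come from. (A quick sketch; the 120% example matches the figures above, though some games apply their "resolution scale" to total pixel count instead.)

```python
# Supersampling percentage applies to each axis, not to the pixel count.

def supersampled(width, height, percent):
    """Rendered resolution for a given per-axis supersampling percentage."""
    scale = percent / 100
    return round(width * scale), round(height * scale)

print(supersampled(2560, 1440, 120))  # (3072, 1728)
```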


----------



## KedarWolf

Quote:


> Originally Posted by *Rakanoth*
> 
> Quote:
> 
> 
> 
> Originally Posted by *GreedyMuffin*
> 
> "Ryzen won't bottleneck at 1440"
> 
> That's what I thought too... Glad I made the switch to Intel again.
> 
> 
> 
> I read lots of reviews and benchmarks. There is only one game in which there is a significant difference between the R7 and Intel: Dota 2. In other games the performance difference is no more than 14 FPS.
> I could just get an Intel 7700K or 8700K, but I use my machine as a workstation from time to time, so Ryzen was the best fit for me. Intel X299 could be nice too, but the power draw is too much. Also it would be more costly.
> 
> Of course when new Volta GPUs come, I might consider switching to Intel.

I'm part of a large class action, hoping to get a good chunk of cash when it's done.

I'm thinking of building an 18 core, 36 thread i9 with four Volta cards in it for benching. I know I'll only be water cooling it all, but still.

Maybe with a 980 Pro SSD.

I'm thinking of having the four cards stand alone on my AX1500i and buying another for my motherboard and fans etc.

My Thermaltake Core X9 supports two PSUs.

I'm not sure an AX1500i can support four Voltas with an XOC-type BIOS, but under North American power requirements I don't think there is a better option.

Guess I could have three on it and one on my motherboard, system PSU.

It'll all depend on how much power Voltas pull I think.









Unless I hit the lotto or something this will probably be the last high-end PC I'll be able to build so going all out.


----------



## becks

Is it worth pushing the GPU memory for gaming and such? I struggle to find any good articles online.
At the moment I'm on a conservative 6066 mem...

Also, even if my card is set to 2025 by the curve and works at that frequency in benchmarks, I don't usually see it going over 1700 in games (with the RTSS OSD). Is there a reason for that?

EDIT: I keep my games with Vsync on and capped at 60 fps on a 1080p screen until I get my 3440 x 1440 ultra-wide QHD 100 Hz back


----------



## Dasboogieman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It depends. If Intel gets 120 fps and Ryzen gets 110 fps you can call it a "bottleneck", but really it's not that big of a deal. The 7820X has many problems similar to Ryzen's. It's basically a Ryzen with higher clock speeds.


Mm, I guess my idea of a bottleneck is very different. I consider a bottleneck to be a problem if something is limiting the performance of something else by at least 25%. Otherwise, it seems like par for the course.


----------



## KedarWolf

My card: after several reboots, my PC's BIOS shows no card detected.









It works in Windows and shows up in GPU-Z, but it annoys the heck out of me and I usually reseat it in the PCIe slot to fix it.


----------



## shreduhsoreus

Quote:


> Originally Posted by *Rakanoth*
> 
> I am not sure if I understood your question correctly. I am turning up everything: supersampling, ambient occlusion, foliage visibility Range, shadow quality, NVIDIA HairWorks etc.
> Am I supposed to turn down supersampling when I am at 1440p?
> Ryzen won't bottleneck at 1440p. It is the GPU that bottlenecks.


If you're using super sampling then it's no longer 1440p. It's rendering a larger resolution that is then scaled down to the resolution of your screen.

And yes, Ryzen can be a bottleneck. Ryzen can't pump out as many frames per second as its Intel counterparts, which creates a bottleneck when you're trying to hit high framerates. When you're targeting high framerates, the CPU is just as important as the GPU.
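To make the point concrete, here is a toy model of the idea (all numbers are illustrative, not measurements): the frame rate you actually see is capped by whichever component prepares frames more slowly.

```python
# Toy model: delivered fps is limited by the slower of the two caps.
def delivered_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    """Frames per second you actually see, to a first approximation."""
    return min(cpu_fps_cap, gpu_fps_cap)

# At 1440p with supersampling, the GPU is usually the limiter...
print(delivered_fps(cpu_fps_cap=140, gpu_fps_cap=90))   # 90 (GPU-bound)
# ...but when chasing a high refresh rate, the CPU cap starts to matter.
print(delivered_fps(cpu_fps_cap=110, gpu_fps_cap=170))  # 110 (CPU-bound)
```

A faster GPU only helps until the GPU cap rises above the CPU cap; past that point the CPU sets the ceiling.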


----------



## Rakanoth

Quote:


> Originally Posted by *shreduhsoreus*
> 
> And yes, Ryzen can be a bottleneck. Ryzen can't pump as many frames per second as their Intel counterparts, which creates a bottleneck when you're trying to hit high framerates. When you're targeting high framerates, CPU is just as important as GPU.


So 10 FPS is a bottleneck? :O


----------



## shreduhsoreus

Quote:


> Originally Posted by *Rakanoth*
> 
> So 10 FPS is bottleneck? :O


If the CPU is limiting your frame rate in any way, then yes, by definition, that is a bottleneck.


----------



## GreedyMuffin

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It depends. If Intel get 120 fps and Ryzen get 110 fps you can call it "bottleneck" but really its not that big of a deal. 7820X has many similar problems to what Ryzen has. Its basically a Ryzen with higher clock speeds.


Just going to leave this here..


Spoiler: Warning: Spoiler!



i7 7800X, MSI X299 Tomahawk, 2x8GB Corsair Vengeance 3200 MHz, 1080 Ti, AX1500i, Samsung 840 Pro, Windows 10

R7 1700, X370 Crosshair VI, 2x8GB Corsair Vengeance 3200 MHz, 1080 Ti, AX1500i, Samsung 840 Pro, Windows 10

i7 7800X: Min FPS 114 - Max FPS 173 - Avg FPS 141 - 64 players Conquest on Soissons - Ultra preset - 1440p

R7 1700: Min FPS 91 - Max FPS 174 - Avg FPS 115 - 64 players Conquest on Soissons - Ultra preset - 1440p

Ryzen at 3850/2933 on mem - GPU at 2050/+500

SKL-X at 4700/3000 and 3200 on mem - GPU at 2050/+500



I would not call (about) 25% higher min. and avg. fps little. And remember, this was with a 7800X running dual-channel memory, not my 4000 MHz CL17 setup in quad-channel.
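For what it's worth, a quick sanity check of the quoted numbers (values taken straight from the post above) puts the min-fps gain at about 25% and the average-fps gain closer to 23%:

```python
# Percentage gain of the 7800X over the R7 1700, from the posted BF1 runs.
def pct_gain(new: float, old: float) -> float:
    return (new - old) / old * 100

min_gain = pct_gain(114, 91)   # min fps: 114 vs 91
avg_gain = pct_gain(141, 115)  # avg fps: 141 vs 115
print(f"min: +{min_gain:.1f}%  avg: +{avg_gain:.1f}%")  # min: +25.3%  avg: +22.6%
```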


----------



## ZealotKi11er

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Just going to leave this here..
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I7 7800X, MSI X299 Tomhawk, 2x8GB Corsair Vengance 3200 mhz, 1080Ti, AX1500I, Samsung 840 Pro, Windows 10
> 
> R7 1700, X370 Crosshair 6, 2x8GB Corsair Vengance 3200 mhz, 1080Ti, AX1500I, Samsung 840 Pro, Windows 10
> 
> I7 7800X: Min FPS 114 - Max FPS 173 - Avg FPS 141 - 64 players Conquest on Soissons - Ultra preset - 1440P
> 
> R7 1700: MIN FPS 91 - Max FPS 174 - Avg FPS 115 - 64 players Conquest on Soissons - Ultra preset - 1440P
> 
> Ryzen at 3850/2933 on mem - GPU at 2050/+500
> 
> SK-X at 4700/3000 and 3200 on mem - GPU at 2050/+500
> 
> 
> 
> I would not call (about) 25% higher min. and avg. fps little. And remember, this was with a 7800X running dual-channel memory. Not my 4000 mhz CL17 setup in quad-channel.


There is clearly a difference there. The Intel CPU is also clocked 22% higher and has slightly higher IPC, hence the difference. Your BF1 numbers do seem low, though. My 3770K averages 140+ fps in almost every map.


----------



## Rakanoth

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Just going to leave this here..
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I7 7800X, MSI X299 Tomhawk, 2x8GB Corsair Vengance 3200 mhz, 1080Ti, AX1500I, Samsung 840 Pro, Windows 10
> 
> R7 1700, X370 Crosshair 6, 2x8GB Corsair Vengance 3200 mhz, 1080Ti, AX1500I, Samsung 840 Pro, Windows 10
> 
> I7 7800X: Min FPS 114 - Max FPS 173 - Avg FPS 141 - 64 players Conquest on Soissons - Ultra preset - 1440P
> 
> R7 1700: MIN FPS 91 - Max FPS 174 - Avg FPS 115 - 64 players Conquest on Soissons - Ultra preset - 1440P
> 
> Ryzen at 3850/2933 on mem - GPU at 2050/+500
> 
> SK-X at 4700/3000 and 3200 on mem - GPU at 2050/+500
> 
> 
> 
> I would not call (about) 25% higher min. and avg. fps little. And remember, this was with a 7800X running dual-channel memory. Not my 4000 mhz CL17 setup in quad-channel.


That is a big difference, and it's definitely worth switching to the 7820X.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Rakanoth*
> 
> That is a big difference and it's definitely worth switching to 7820x.


He can get 8700K and get even more.


----------



## SavantStrike

Quote:


> Originally Posted by *KedarWolf*
> 
> I'm part of a large class action, hoping to get a good chunk of cash when it's done.
> 
> I'm thinking of building an 18 core, 36 thread i9 with four Volta cards in it for benching. I know I'll only be water cooling it all, but still.
> 
> Maybe with a 980 Pro SSD.
> 
> I'm thinking of having the four cards stand alone on my AX1500i and buying another for my motherboard and fans etc.
> 
> My Thermaltake Core X9 supports two PSUs.
> 
> I'm not sure an AX1500i Can support four Voltas with an XOC type BIOS but under North American power requirements I don't think there is a better option.
> 
> Guess I could have three on it and one on my motherboard, system PSU.
> 
> It'll all depend on how much power Voltas pull I think.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Unless I hit the lotto or something this will probably be the last high-end PC I'll be able to build so going all out.


You could build three or four mid range machines 3-4 years apart for the same money.

That would give you something really fun to look forward to for over a decade if not a decade and a half.

You could go for top single card bench every three years, or top multi card bench one time and then never again. Your call but I just did the top multi thing and meh, it's fleeting. I only did it because I'm mining on that machine.


----------



## feznz

Quote:


> Originally Posted by *KedarWolf*
> 
> My card, after several reboots in my PC BIOS it shows as no card detected.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It works in Windows and shows up in GPU-Z but it annoys the heck out of me and I usually reset it in the PCI-E slot to fix it.


I had a similar problem with an Asus LGA 775 Striker Formula motherboard; I replaced the motherboard in the end.


----------



## GreedyMuffin

Quote:


> Originally Posted by *ZealotKi11er*
> 
> He can get 8700K and get even more.


Honestly, I wouldn't. The 7820X cost me 50 USD less than an 8700K at MSRP, so it was well worth it.


----------



## GreedyMuffin

Quote:


> Originally Posted by *Rakanoth*
> 
> That is a big difference and it's definitely worth switching to 7820x.


That was only with a 7800X (6 core) with some dual-channel RAM. So not even in Quad-channel.


----------



## becks

Come on, guys...

I posted 2 things regarding my GTX 1080 Ti and asked some questions on 2 different days here... and still no answer, like everyone is completely skipping my posts...

And here you are actively battling in the ultimate Red vs Blue team war...

You know what... GREAT!


----------



## GreedyMuffin

Quote:


> Originally Posted by *becks*
> 
> Common guys...
> 
> I post 2 things in regards to my GTX 1080 TI and ask some questions on 2 different days here....and still no answer like everyone is completely jumping my posts...
> 
> And here you are actively battling on the ultimate Red vs Blue team...
> 
> You know what...GREAT!


Honestly, there's no battle. If you're going for performance for the money, Nvidia/Intel is superior.









About your question:


Spoiler: Warning: Spoiler!



Is it worth pushing mem on GPU for gaming and such ??...struggle to find any good articles online.
At the moment on a conservative 6066 mem...

Also even if my card is at 2025 set by curve and works at that freq in benchmarks I don't usually see it going over 1700 in games (with RTS OSD ) is there a reason for it ?

EDIT: I keep my games with Vsync on and capped at 60 fps on a 1080p Screen til I get my 3440 x 1440 ultra-wide QHD 100 hz back
Edited by becks - Yesterday at 6:13 pm



I'd push the memory by at least around +400 MHz. From what I've read, a memory OC can be better than a slight core OC, especially in benchmarks.

Are you sure the RTSS OSD is reporting correctly? Seems weird, as the card shouldn't downclock at all.


----------



## shreduhsoreus

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Honestly I wouldn't. The 7820X costed me 50 USD less than a 8700K at MSRP. So It was well worth it.


You got a 7820X for under $400? How and where?


----------



## becks

Quote:


> Originally Posted by *GreedyMuffin*


Yes I am sure, and it's strange... with AB on the second screen, the AIDA OSD and GPU-Z on the other screen, and the game open, all 3 programs report the same.
If I open a bench on the second screen while the game is open, it reports the core at 1600-1700, and if I fire up the bench the core jumps to 2025... so there isn't anything obviously keeping it at that lower core clock while the game is on...

Thanks for the info on mem.. will tweak it some more later when I get home.


----------



## GreedyMuffin

Quote:


> Originally Posted by *shreduhsoreus*
> 
> You got a 7820X for under $400? How and where?


Picked it up second-hand from a guy on my local forum. He had a 7900X and a 7920X, so it seemed like he didn't use this one at all.

Since we have to pay 25% tax on everything, a direct price comparison would be unfair. MSRP in Norway on the 7820X is around 720 USD, and I bought mine for 430 USD.


----------



## 8051

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Honestly, there's no battle. If you for the money. Nvidia/Intel is superior.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> About your question:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Is it worth pushing mem on GPU for gaming and such ??...struggle to find any good articles online.
> At the moment on a conservative 6066 mem...
> 
> Also even if my card is at 2025 set by curve and works at that freq in benchmarks I don't usually see it going over 1700 in games (with RTS OSD ) is there a reason for it ?
> 
> EDIT: I keep my games with Vsync on and capped at 60 fps on a 1080p Screen til I get my 3440 x 1440 ultra-wide QHD 100 hz back
> Edited by becks - Yesterday at 6:13 pm
> 
> 
> 
> I'd push the memory to at least around 400mhz+. From what I've read, memory OC would be better than a slight core OC. Esp. in Benchmarks.
> 
> Are you sure that RTS OSD is reporting correctly? Seems weird as the shouldn't downclock at all.


I always heard just the opposite (that a core overclock is always better than a memory OC). I haven't come across any benchmarks where overclocking the memory made any difference.


----------



## nrpeyton

Quote:


> Originally Posted by *becks*
> 
> Yes I am sure, and its strange... with AB on second screen and AIDA OSD GPU-z on other screen and game open ...all 3 programs report the same...
> If I open a bench on second screen while the game is open it reports the core at 1600-1700 and if I fire the bench the core jumps to 2025.....so there isn't anything obviously keeping it at that lower core while the game is on...
> 
> Thanks for the info on Mem.. will tweak it some more later when I get home.


In some games I've found I get half my final OC from the GDDR5X memory and the other half from the core. For instance, if I have the memory at +685, I probably get roughly the same increase in FPS from that memory bump as I do from roughly +126 on the core. So 5% from memory and 5% from core, giving roughly 10% more FPS.

Memory overclocking is definitely well worth it in my eyes. It's also free: you probably won't be increasing memory voltage, so the longevity of the card isn't affected, and only a very small amount of extra power is used.

OCCT is great for checking your memory overclock. It will begin to throw up errors (it counts them) when you get too high. These errors will always start to show up before a complete driver crash.

Also, when errors start to show up, try doing a soft restart of your PC. VRAM timings may adjust during the restart and allow you to continue tuning at a memory frequency that previously gave errors.
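As a side note on the "5% + 5% ≈ 10%" arithmetic above: independent speedups compose multiplicatively rather than additively, though at these magnitudes the difference is negligible.

```python
# Composing nrpeyton's example gains: 5% from the core, 5% from the memory.
# Independent speedups multiply, so the total is slightly above 10%.
core_gain = 0.05
mem_gain = 0.05
combined = (1 + core_gain) * (1 + mem_gain) - 1
print(f"combined gain: {combined:.2%}")  # roughly 10%
```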


----------



## becks

Quote:


> Originally Posted by *nrpeyton*


Thanks for the tips..
At the moment passed 5 Heaven benches (consecutive runs) temp went up to 46.

2012 Core +638 Mem (6284) at 1.012v


----------



## 8051

Quote:


> Originally Posted by *nrpeyton*
> 
> In some games I've found I get half my final O/C from GDDR5X memory, and the other half from the core. So for instance if I have the memory at a +685 I probably get roughly the same increase in FPS with that memory bump as I do with roughly a +126 on the core. So 5% from memory and 5% from core. Giving roughly 10% more FPS.
> 
> Memory overclocking is definitely well worth it in my eyes. It's also free. As you probably won't be increasing memory voltage -- so longevity of card isn't affected. And there is only a very small amount of extra power used.
> 
> OCCT is great for checking your memory overclock. It will begin to throw up errors (it counts them) when you get too high. These errors will always start to show up before a complete driver crash.
> 
> Also when errors start to show up. Try doing a soft restart on your PC. Memory VRAM timings may adjust during restart and allow you to continue tuning at a memory frequency that precviously gave errors.


Do you have any proof of a memory overclock making any difference in an in-game benchmark?


----------



## MaKeN

It does affect fps. I tested it in Mass Effect: Andromeda; stock vs OC memory gets me around 12 fps, as I remember.
Same for benching.


----------



## Minusorange

Since switching to dual monitor setup and having primary monitor go from 1080p to 1440p my overclock is now unstable...

Is this normal for this kind of change ?


----------



## ZealotKi11er

A memory OC basically improves how much a core OC scales. They complement each other.


----------



## LunaP

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Memory OC basically improves how much Core OC scales. They compliment each other.


Back when I ran 3x Titans, I noticed a mem OC helped significantly bring up my lowest FPS to a steady number when running in Surround @ 1440p.

I wanted to ask if that still applies now. I have 2x 1080 Tis currently at stock and haven't loaded a custom BIOS yet. Are there any minor tweaks I should apply? I won't be benching, just gaming, mostly MMOs and an occasional Steam game (currently playing NieR @ max + FXAA/SMAA on via profile and loving the beauty this game has, minus the textures of some areas).


----------



## 8051

So is the decrease in memory latency what's making the difference? Because if the memory is running faster than the GPU, how could the additional bandwidth make a difference?


----------



## ZealotKi11er

Quote:


> Originally Posted by *8051*
> 
> So the decrease in memory latency what's making the diff.? Because if the memory is running faster than the GPU, how could the additional bandwidth make a diff.?


Wait, what? Memory clock and core clock have no relation to each other.


----------



## Agent-A01

Quote:


> Originally Posted by *8051*
> 
> So the decrease in memory latency what's making the diff.? Because if the memory is running faster than the GPU, how could the additional bandwidth make a diff.?


I think you should go back to the basics and learn more about that.


----------



## 8051

Quote:


> Originally Posted by *Agent-A01*
> 
> I think you should go back to the basics and learn more about that.


Why don't you explain it to me?
Quote:


> Originally Posted by *ZealotKi11er*
> 
> Wait what? Memory clock and Core clock have 0 relation to each other.


Except for the fact that all data going to/from the core has to go through something called a memory controller.
Quote:


> Originally Posted by *Agent-A01*
> 
> I think you should go back to the basics and learn more about that.


A memory controller's ability to provide data is determined by the memory frequency and timings. If the bandwidth determined by these variables is greater than the GPU core's ability to read data, how is it going to make a difference?


----------



## GreedyMuffin

Sold my X299 setup and I'm grabbing a 8700K.

Will be fun to test versus Ryzen and X299.


----------



## Dasboogieman

Quote:


> Originally Posted by *8051*
> 
> A memory controller's ability to provide data is determined by the memory frequency and timing, if the bandwidth determined by these variables is greater than the GPU core's ability to read data how is it going to make a difference?


You touched on the crux of the issue. Modern GPUs are NEVER so fast that greater bandwidth does nothing. In fact, almost every GPU in existence has been VRAM-bandwidth bottlenecked to a greater or lesser degree. The IMC is simply a relay: a device which schedules memory access operations and distributes data to the L2 cache of the GPU. Its operational speed has little relevance to the speed at which the VRAM operates.

A good example is raster operations (ROPs). These units within the GPU pretty much devour whatever memory bandwidth is available in order to do their duties.

Because almost every GPU in existence is VRAM-bandwidth bottlenecked to some degree, they all see lesser or greater gains from overclocking their VRAM.
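To put rough numbers on it: peak theoretical VRAM bandwidth is just bus width times effective data rate. A sketch using the 1080 Ti's published figures (352-bit bus, 11 Gbps GDDR5X); the +500 MHz line assumes Afterburner's memory readout is half the effective GDDR5X rate, which is my understanding but worth double-checking on your own card.

```python
# Peak theoretical VRAM bandwidth: (bus width in bits / 8) * effective Gbps.
def peak_bandwidth_gbs(bus_width_bits: int, effective_gbps: float) -> float:
    """Returns GB/s using decimal (marketing) gigabytes."""
    return bus_width_bits / 8 * effective_gbps

# GTX 1080 Ti stock: 352-bit bus, 11 Gbps effective GDDR5X.
print(peak_bandwidth_gbs(352, 11.0))  # 484.0 GB/s
# If a +500 MHz Afterburner offset adds ~1 Gbps effective:
print(peak_bandwidth_gbs(352, 12.0))  # 528.0 GB/s, about 9% more
```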


----------



## LunaP

Quote:


> Originally Posted by *8051*
> 
> Why don't you explain it to me?


I doubt they'd want to when you can't even operate a forum correctly (multi-quote/edit button). Anytime you respond on other threads it's usually 3-20 posts in a row, since you seem to flip out if someone doesn't respond to you instantly.


----------



## KedarWolf

Quote:


> Originally Posted by *8051*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nrpeyton*
> 
> In some games I've found I get half my final O/C from GDDR5X memory, and the other half from the core. So for instance if I have the memory at a +685 I probably get roughly the same increase in FPS with that memory bump as I do with roughly a +126 on the core. So 5% from memory and 5% from core. Giving roughly 10% more FPS.
> 
> Memory overclocking is definitely well worth it in my eyes. It's also free. As you probably won't be increasing memory voltage -- so longevity of card isn't affected. And there is only a very small amount of extra power used.
> 
> OCCT is great for checking your memory overclock. It will begin to throw up errors (it counts them) when you get too high. These errors will always start to show up before a complete driver crash.
> 
> Also when errors start to show up. Try doing a soft restart on your PC. Memory VRAM timings may adjust during restart and allow you to continue tuning at a memory frequency that precviously gave errors.
> 
> 
> 
> Do you have any proof of a memory overclock making any difference in an in-game benchmark?
Click to expand...

The higher I've had my memory clocked the higher I've scored in Superposition.

When I made my place very cold for benching and got my memory higher than I've ever had it was when I scored the highest.

And it gradually crept up as I gradually increased memory speed.

I don't get why someone would argue that increasing memory speed would have no effect on benchmarks or even gaming.


----------



## becks

Quote:


> Originally Posted by *KedarWolf*


I was the one with the original question...
I come from a GTX 260... and I've never OC'ed a GPU before, and my question seemed normal because the more I push mem, the more I have to lower the core to keep the same V bin...

Ignoring the "forever debate because of reasons", I filtered quite good info and I'm on the right track, but not really pushing the card to the limit, as I will receive my second screen back soon (PG348Q) and will see how it handles 1440p then.


----------



## khemist

Picked up a reference Zotac.


----------



## Dasboogieman

Quote:


> Originally Posted by *khemist*
> 
> Picked up a reference Zotac.


That actually looks classier than the FE or even Quadros.

Subdued, sleek, professional looking.


----------



## 8051

Quote:


> Originally Posted by *LunaP*
> 
> I doubt they'd want to when you can't even operate a forum correctly, multi/edit button but anytime you respond on other threads its usually 3-20 posts in a row since you seem to flip out if someone doesn't respond to you instantly.


I'm crying on the outside but I'm laughing on the inside.


----------



## bajer29

Quote:


> Originally Posted by *khemist*
> 
> 
> 
> 
> 
> Picked up a reference Zotac.


This has probably been discussed already, but I'd be curious to know how the reference blower cooler does vs the 2-fan and 3-fan non-reference models. I know the reference cards are $50-$100 cheaper than the OC'd and "gaming" models.

Specifically, the EVGA FTW3's cooler vs reference.


----------



## 8051

Quote:


> Originally Posted by *Dasboogieman*
> 
> You touched on the crux of the issue. Modern GPUs are NEVER so fast that the greater bandwidth does nothing. In fact, almost every GPU in existence has been VRAM Bandwidth bottlenecked to a greater or lesser degree. The IMC is simply a relay, it is a device which schedules memory access operations and distributes data to the L2 cache of a GPU. It's operational speed has little relevance to the speed which the VRAM operates.
> 
> A good example are Raster Operations. These units within the GPU pretty devour whatever memory bandwidth is available in order to do their duties.
> 
> Because almost every GPU in existence is VRAM Bandwidth bottlenecked to some degree, they all see lesser or greater gains from overclocking their VRAM.


What I don't understand, at least with the GDDR5 standard, is this: if the core clock is at 2100 MHz and the memory controller is running at 2500 MHz, which device is waiting on which? Is the latency of GDDR5 memory so bad that it reduces the bandwidth?

The theoretical bandwidth for a 384-bit monolithic GDDR5 memory controller running at 2500 MHz should be 2.5 GHz x 4 x 48 bytes = 480 GB/s. But the GPU core can only read data at an SDR of 2.1 GHz x 48 bytes = 100 GB/s? So how can the GPU core take advantage of this greater data rate? Is the GPU somehow reading the memory controller at a quad data rate as well?

An IMC "is simply a relay", huh? Relays are synchronous devices with free-running differential forwarded clocks and at least four control signals?
https://www.skhynix.com/eolproducts.view.do?pronm=GDDR5+SDRAM&srnm=H5GC%28Q%292H24BFR&rk=26&rc=graphics
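For reference, the arithmetic in the question above works out as stated (this sketch uses the hypothetical 384-bit GDDR5 card from the post, not a 1080 Ti):

```python
# GDDR5 transfers 4 bits per pin per command-clock cycle (quad data rate),
# so a 384-bit bus at a 2.5 GHz command clock gives:
bus_bytes = 384 // 8               # 48 bytes moved per transfer
mem_clock_ghz = 2.5
vram_bw_gbs = mem_clock_ghz * 4 * bus_bytes
print(vram_bw_gbs)                 # 480.0 GB/s

# The core never reads VRAM synchronously at its own clock; requests are
# buffered through the memory controller and L2 cache, which is why the
# 2100 MHz core clock and the 480 GB/s figure aren't directly comparable.
```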


----------



## LunaP

Quote:


> Originally Posted by *8051*
> 
> I'm crying on the outside but I'm laughing on the inside.


I'm laughing because I'm positive you don't even know what you just said, lmao. I'm not trying to rain on your parade; the mods have already expressed they don't care if you drama post anyway, and the edit/multi buttons are going to be removed anyway.


----------



## 8051

Quote:


> Originally Posted by *KedarWolf*
> 
> The higher I've had my memory clocked the higher I've scored in Superposition.
> 
> When I made my place very cold for benching and got my memory higher than I've ever had it was when I scored the highest.
> 
> And it gradually crept up as I gradually increased memory speed.
> 
> I don't get why someone would argue that increasing memory speed would have no effect on benchmarks or even gaming.


I wanted to see proof of such a difference in in-game benchmarks, not canned benchmark applications.


----------



## 8051

Quote:


> Originally Posted by *LunaP*
> 
> I'm laughing because I'm positive you don't even know what you just said lmao. I'm not trying to reign on your parade, the mod's have already expressed they don't care if you drama post anyways, the edit/multi buttons are gonna be removed anyways.


Why don't you answer my questions instead of spewing?


----------



## KedarWolf

Quote:


> Originally Posted by *8051*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> The higher I've had my memory clocked the higher I've scored in Superposition.
> 
> When I made my place very cold for benching and got my memory higher than I've ever had it was when I scored the highest.
> 
> And it gradually crept up as I gradually increased memory speed.
> 
> I don't get why someone would argue that increasing memory speed would have no effect on benchmarks or even gaming.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I wanted to see proof of such a difference in in-game benchmarks not canned benchmark applications.
Click to expand...

I'll do a test with the Metro: Last Light Redux and Rise of the Tomb Raider DX12 benches and get back to you when I get home from work.

But do us all a favour, both of you: please drop the drama and bickering. Keep it to facts and discussion, please.









Thank you.


----------



## LunaP

Quote:


> Originally Posted by *8051*
> 
> Why don't you answer my questions instead of spewing?


Why don't you start being courteous and not bash people when you're unsure? You're not asking questions, you're attacking people, stating what you believe to be facts instead of asking. Start by asking the question first. Try to be courteous and think of others instead of spamming; you can fit it all in one post instead of taking up an entire page. I'm not going to respond further anyway, already blocked; you're notorious for this in every thread you're in. Try being polite for once. Anyway, I'm done, and apologies, back to lurking. I haven't been able to get answers to anything here as the threads are pretty active, so I'll resume reading.


----------



## KedarWolf

Quote:


> Originally Posted by *LunaP*
> 
> Quote:
> 
> 
> 
> Originally Posted by *8051*
> 
> Why don't you answer my questions instead of spewing?
> 
> 
> 
> Why don't you start being courteous and not bash people when you're unsure. You're not asking questions you're attacking people stating what you believe to be facts vs asking since you're unsure, start by asking the question first. Try to be courteous and think of others first though vs spamming, you can fit it all in one post vs taking up an entire page, but I'm not gonna respond further anyways, already blocked, you're notorious for this in every thread you're in, try being polite for once. Anyways I'm done and apologies, back to lurking , haven't been able to get any answers to anything here as the threads are pretty active so I'll resume reading.
Click to expand...

See above post. Two above actually.

Thank you.


----------



## LunaP

Quote:


> Originally Posted by *KedarWolf*
> 
> See above post.
> 
> Thank you.


Read the end of my post thank you







you posted while I was in the middle of mine, can we please get back OT?


----------



## SavantStrike

Quote:


> Originally Posted by *bajer29*
> 
> This has probably been discussed already, but I'd be curious to know how the reference cooler with a blower does vs 2 fan and 3 fan non-reference models. I know that the reference cards are $50-$100 cheaper than the OC'd and "gaming" models.
> 
> Specifically EVGA FTW3's cooler vs reference.


The non-reference designs are a lot quieter and can pick up 50-100 MHz on the core.

The reference PCB itself is decent, so it's a very good choice for water cooling if you can find one.


----------



## KedarWolf

Quote:


> Originally Posted by *LunaP*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> See above post.
> 
> Thank you.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Read the end of my post thank you
> 
> 
> 
> 
> 
> 
> 
> you posted while I was in the middle of mine, can we please get back OT?
Click to expand...

Thanks!!

There will always be people that can be confrontational on public forums. We can take the bait and get caught up in it, or we can let it slide and move on.

I'm glad you're moving on.


----------



## bajer29

Quote:


> Originally Posted by *SavantStrike*
> 
> The non reference designs are a lot quieter and can pick up 50-100mhz on the core.
> 
> The reference PCB itself is decent, so it's a very good choice for water cooling if you can find one.


I'm talking strictly air cooling, but the core OC info is helpful, thanks.

The reason I'm asking is that I'd like my next GPU (1080 Ti) to last as long as my 980 SLI setup, which has lasted me 3 years with no issues. Obviously, lower temps = longer fan and GPU/mem lifespan.


----------



## KedarWolf

Quote:


> Originally Posted by *8051*
> 
> Quote:
> 
> 
> 
> Originally Posted by *GreedyMuffin*
> 
> Honestly, there's no battle. If you for the money. Nvidia/Intel is superior.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> About your question:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Is it worth pushing mem on GPU for gaming and such ??...struggle to find any good articles online.
> At the moment on a conservative 6066 mem...
> 
> Also even if my card is at 2025 set by curve and works at that freq in benchmarks I don't usually see it going over 1700 in games (with RTS OSD ) is there a reason for it ?
> 
> EDIT: I keep my games with Vsync on and capped at 60 fps on a 1080p Screen til I get my 3440 x 1440 ultra-wide QHD 100 hz back
> Edited by becks - Yesterday at 6:13 pm
> 
> 
> 
> I'd push the memory to at least around 400mhz+. From what I've read, memory OC would be better than a slight core OC. Esp. in Benchmarks.
> 
> Are you sure that RTS OSD is reporting correctly? Seems weird as the shouldn't downclock at all.
> 
> 
> 
> I always heard just the opposite (that core overclock is always better than memory OC). I haven't come across any benchmarks where overclocking the memory made any diff.
Click to expand...

A higher core is better for sure; if you have to lower the core to raise the memory, it depends on how much of a memory overclock you gain in doing so.

If the memory increase is marginal and you need to lower the core, then yes, you want to keep the higher core.

But to see how games scale with memory increases in actual games, I'm going to try the memory at +0, +300 and +600, then from there at +667, which I'm 24/7 stable at, and record the differences. I'll reboot between each bin.

The only variation is that I'm going to set the memory at the point giving maximum bandwidth in the memory bandwidth test, so it might vary a bin or two; different points can have much lower bandwidth for some reason, and this takes that variance out.

And I'm going to test with the Metro: Last Light Redux bench, the Rise of the Tomb Raider DX12 bench and maybe another in-game bench I may have.

I'm home in 90 minutes or so.
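A minimal sketch of how the sweep above could be tabulated; `scaling` is a hypothetical helper, and the fps numbers are made up purely to show the shape of the result, not predictions.

```python
# Tabulate fps at each memory offset relative to the +0 (stock) run.
# Keys are Afterburner memory offsets in MHz, values are bench avg fps.
def scaling(results):
    """Fractional fps gain at each offset relative to the +0 baseline."""
    stock = results[0]
    return {offset: fps / stock - 1 for offset, fps in results.items()}

# Made-up example numbers for the +0/+300/+600/+667 plan:
gains = scaling({0: 100.0, 300: 102.5, 600: 104.8, 667: 105.3})
for offset in sorted(gains):
    print(f"+{offset:>3} MHz: {gains[offset]:+.1%}")
```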


----------



## demon09

Quote:


> Originally Posted by *bajer29*
> 
> I'm talking strictly for air cooling, but the core OC info is helpful, thanks.
> 
> The reason I'm asking is, I'd like my next GPU (1080TI) to last as long as my 980 SLI setup which have lasted me 3 years with no issues. Obviously, lower temps=longer fan and CPU/ Mem lifespan.


I'd recommend the Asus Strix or MSI Trio for the best cooling-to-noise ratio. But the FTW3 is not bad if you are fine with a little more noise.


----------



## bajer29

Quote:


> Originally Posted by *demon09*
> 
> I'd recommend Asus strix or MSI trio for the best cooling to noise ratio. But the ftw3 is not bad if you are fine with a little more noise


Cool thanks for the suggestion.


----------



## demon09

Quote:


> Originally Posted by *KedarWolf*
> 
> Higher core is better for sure; if you have to lower core to raise memory, it depends how much of a memory overclock you gain in doing so.
> 
> If the memory increase is marginal and you need to lower core, then yes, you want to keep the higher core.
> 
> But to see how games scale with memory increases in actual games, I'm going to try the memory at +0, +300 and +600, then from there at +667, which I'm 24/7 stable at, and record the differences. I'll reboot between each bin.
> 
> The only variation is I'm going to set the memory at the point giving maximum bandwidth in the memory bandwidth test, so it might vary a bin or two; different points can have much lower bandwidth for some reason, and this will take that variance out.
> 
> And I'm going to test with the Metro Last Light Redux bench, the Rise of the Tomb Raider DX12 bench and maybe another in-game bench I may have.
> 
> I'm home in 90 minutes or so.


Hmm, let me know. I run my memory at 11,900 effective, or +450. Would be interesting to know if pushing it higher helps at all; I may do some testing myself when I get free time.
Quote:


> Originally Posted by *bajer29*
> 
> Cool thanks for the suggestion.


My Asus Strix (not the OC version) hits 2050-2037 on the core at 1.093V, but I usually run it at 1.025V and 1950MHz, as the games I play don't need 2050, so it saves temps by a little bit since some games force full boost when playing. It's not a silicon lottery winner, but it's not a dud, and for less than the OC version with the same cooler I will take it. Asus has questionable warranty support though, so I paid an extra $50 for a SquareTrade one, which ended up being the same price as a FTW3. But my card has been great, aside from a little coil whine when running 144Hz, and I can't hear it over my fans unless I put my ear to the glass.


----------



## 8051

Quote:


> Originally Posted by *LunaP*
> 
> Why don't you start being courteous and not bash people when you're unsure. You're not asking questions you're attacking people stating what you believe to be facts vs asking since you're unsure, start by asking the question first. Try to be courteous and think of others first though vs spamming, you can fit it all in one post vs taking up an entire page, but I'm not gonna respond further anyways, already blocked, you're notorious for this in every thread you're in, try being polite for once. Anyways I'm done and apologies, back to lurking , haven't been able to get any answers to anything here as the threads are pretty active so I'll resume reading.


How is answering a question with a question courteous?


----------



## Bartholdi

Can't wait for the results, kedar - I'm a facts guy too. I'll be dreaming of clocks and test runs.


----------



## Neo_Morpheus

Just got a MSI 1080ti Sea Hawk X. I have a couple of corsair hd120 fans I'm going to use on the corsair h55 cooler on the card. Should be better in temps, overclocking (fingers crossed) and noise. The leaf blowers will finally become a distant memory!


----------



## SavantStrike

Quote:


> Originally Posted by *bajer29*
> 
> I'm talking strictly for air cooling, but the core OC info is helpful, thanks.
> 
> The reason I'm asking is, I'd like my next GPU (1080TI) to last as long as my 980 SLI setup which have lasted me 3 years with no issues. Obviously, lower temps=longer fan and CPU/ Mem lifespan.


Temperature is key for Pascal. The EVGA warranty and price are big selling points, but if you want a more enjoyable experience, spring for the FTW3 version, which has a bigger heatsink than the SC2 and the Black.

The triple fan MSI and Asus offerings are also solid choices. I'll also go ahead and suggest the gigabyte aorus. It's a beast of a card with a very attractive price.


----------



## Scotty99

Hey guys, first post as owner of a Strix 1080 Ti OC. My card doesn't seem to downclock at all, with super-high idle temps (50C+). I know 165Hz monitors can cause this, but even after switching over to 60Hz my card will not move off 1569MHz. It is already set to "Optimal power" in the control panel, which is what most suggest changing to fix this.


----------



## KedarWolf

Not a huge difference but some gains.

Biggest jumps seem to be ROTR DX12.

Rise of The Tomb Raider +0 Memory 2012 core.



Rise of The Tomb Raider +301 Memory 2012 core.



Rise of The Tomb Raider +642 Memory 2012 core.



Ghost Recon: Wildlands +0 Memory 2012 core.



Ghost Recon: Wildlands +301 Memory 2012 core.



Ghost Recon: Wildlands +642 Memory 2012 core.



Metro Last Light Redux +0 Memory 2012 core.



Metro Last Light Redux +301 Memory 2012 core.



Metro Last Light Redux +642 Memory 2012 core.


----------



## Scotty99

Really not impressed with this 1080ti strix, might be going back for a hybrid.

How do reviewers always have lower load temps than when you get the cards home and put them in a system? Do they use a 60Hz monitor with vsync enabled, lol?

To give an idea, I'm playing World of Warcraft right now with the frame limiter off: 75C with 1900 RPM fans... that is atrocious. Every review I've read had this thing max at 65-68C. My case is one of the best-flowing on the market, a Meshify C with the foam removed and a fan blowing directly on the GPU.

Did I get a dud of a GPU, or do these actually run this hot?


----------



## feznz

Quote:


> Originally Posted by *Scotty99*
> 
> Hey guys first post as owner of strix 1080ti OC. My card is not seeming to downclock at all with super high idle temps (50c+). I know 165hz monitors can cause this, but even switching over to 60hz my card will not move off of 1569mhz, it is already set to "optimal" power in the control panel, which is what most suggest to change to fix this.


There is something not right there. My Strix will clock down to 139MHz with a [email protected], and to 608MHz with web browsing. Sounds like a possibly poorly fitted cooler, or a background process using resources.
I am on water, but my idle temps are about 7C higher than ambient.


----------



## KCDC

Did the shunt mod on both cards a while back. I think I found the reason why I can't go above 2050 at all:



It appears that GPU 1 is no longer power limited, but GPU 2 still is, even just sitting here at idle when the GPU spikes sometimes.

Is this a result of not enough CLU, or too much? Also, I would imagine GPU 2 means the GPU in the second PCIe slot, right? Hopefully? I ask because I'd rather not have to reapply on both.

EDIT: After doing some benchmarking on gpu 1 solo, looks like I can get it to 2075 stable, so looks like I need to reapply on gpu2. Sigh, and I just rebuilt the damn thing...


----------



## ZealotKi11er

Quote:


> Originally Posted by *Scotty99*
> 
> Really not impressed with this 1080ti strix, might be going back for a hybrid.
> 
> How do reviewers always have lower load temps than when you get them home and put em in a system, do they use a 60hz monitor with vsync enabled lol?
> 
> To give an idea, im playing world of warcraft right now with frame limiter off 75c with 1900 rpm fans.....that is atrocious. Every review ive read had this thing max 65-68c. My case is one of the best flowing on the market, a meshify c with the foam removed and a fan blowing directly on the GPU.
> 
> Did i get a dud of a GPU or do these actually run this hot?


How hot is it there? Also, different games push the card differently. I personally prefer hybrids. Mine runs 50C mining in an ITX case with intake affected by the 240 AIO cooler.


----------



## Scotty99

Quote:


> Originally Posted by *ZealotKi11er*
> 
> How hot is it there? Also different games push the card different. I personally prefer hybrids. Mine runs 50C mining in ITX case with intake effected by 240 AIO cooler.


Thermostat set to 72F (22C); the room gets no hotter than that.

Wish Asus made a hybrid card. This Strix does look amazing when synced with the RAM and motherboard lights.









I just don't think 75C with 1900 RPM fans is acceptable for the best air-cooled 1080 Ti on the market.

Edit:
https://overclock3d.net/reviews/gpu_displays/asus_gtx1080_ti_strix_oc_graphics_card_review/3

My card must be broken, or reviewers don't actually test games on 165Hz monitors with frame limiters off. The max he recorded in anything was 60C; my card idles at 49C. Sorry for the whining, I'm just disappointed lol.


----------



## DStealth

Quote:


> Originally Posted by *KedarWolf*
> 
> Not a huge difference but some gains.


Not a huge jump...
First game ~5%, second 3%, third 5.3%.
And that's actually not unimpressive once you count that the +642MHz OC from the factory 12000 is itself only a 5.35% increase.
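DStealth's point, that the fps gains track the memory-clock increase almost 1:1, is easy to sanity-check with quick arithmetic (treating the offset as MHz against a 12000MHz-effective baseline, as in the post):

```python
# Does the fps gain track the memory clock gain roughly 1:1?
factory_effective = 12000            # MHz effective (DStealth's baseline)
offset = 642                         # applied memory offset, MHz

clock_gain_pct = offset / factory_effective * 100
print(f"memory clock gain: {clock_gain_pct:.2f}%")   # 5.35%

# KedarWolf's three approximate benchmark gains
for fps_gain in (5.0, 3.0, 5.3):
    print(f"fps +{fps_gain}% -> {fps_gain / clock_gain_pct:.0%} of the clock gain")
```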


----------



## Scotty99

My card has to be 100% broken; after 15 mins of Overwatch, 80C with 2300 RPM fans.

Taking it out buying a hybrid.


----------



## aberrero

I upgraded from a 1080 that I still have. I figured I'd keep it around for mining. I'm curious though, can I run a Ti and 1080 in SLI as if they were two 1080s?


----------



## demon09

Quote:


> Originally Posted by *Scotty99*
> 
> Hey guys first post as owner of strix 1080ti OC. My card is not seeming to downclock at all with super high idle temps (50c+). I know 165hz monitors can cause this, but even switching over to 60hz my card will not move off of 1569mhz, it is already set to "optimal" power in the control panel, which is what most suggest to change to fix this.


Hmm, odd. I run my non-OC Strix with a 1440p 144Hz G-Sync monitor (I could set it to 165Hz), and my card idles down just like normal. Make sure Nvidia ShadowPlay is not running. It's hard to pinpoint without being there, but there are lots of troubleshooting steps out there.


----------



## demon09

Quote:


> Originally Posted by *aberrero*
> 
> I upgraded from a 1080 that I still have. I figured I'd keep it around for mining. I'm curious though, can I run a Ti and 1080 in SLI as if they were two 1080s?


Sadly no, SLI only works with identical GPUs. You could set the 1080 up mining in the same build, though.


----------



## demon09

Quote:


> Originally Posted by *Scotty99*
> 
> My card has to be 100% broken, after 15 mins of overwatch 80c 2300 rpm fans.
> 
> Taking it out buying a hybrid.


Sounds like bad paste. My fans stay quite low in Overwatch at a locked 144 fps at 2K, and even if I unlock the frame rate it's nowhere near 80C.


----------



## Scotty99

Strix 1080 Ti owners, what's the hottest you have ever seen your card? I'm going to do more tests tomorrow with more fans in the case, as I'd really like to keep it; I just want to know if my temps are out of line or *way* out of line. Again, after 15 mins of Overwatch in the training area, ultra preset, 1440p 165Hz with no fps cap: 80C, with fans hitting around 2300 RPM. From all the research I've done on this card, 80C shouldn't even be possible. HWMonitor reported around 275W draw during that Overwatch test, which seems normal, so I just don't know what to say.


----------



## demon09

Quote:


> Originally Posted by *Scotty99*
> 
> Strix 1080ti owners, whats the hottest you have ever seen your card? Im going to do more tests tomorrow with more fans in the case as id really like to keep it, i just want to know if my temps are out of line or *way* out of line. Again after 15 mins of overwatch in the training area, ultra preset 1440p 165hz with no fps cap, 80c and fans hit around 2300 rpm. From all the research ive done on this card 80c shouldnt even be possible, HWmonitor reported around 275w draw during that overwatch test which seems normal so i just dont know what to say.


Overclocked or stock, at 100% power draw? If your monitor has G-Sync, I'd recommend capping the fps slightly below 165, at 164 or 163; if it sits at 165-166 or goes past 165, it will turn off G-Sync.


----------



## Scotty99

Quote:


> Originally Posted by *demon09*
> 
> overclocked or stock 100% power draw?


I haven't touched the clocks at all; I think it was boosting to 1974.


----------



## demon09

Quote:


> Originally Posted by *Scotty99*
> 
> I havent touched the clocks at all, i think it was boosting to 1974.


So the power draw is still at 100% and not moved to 120%? Just trying to mimic your settings to make the comparison as close as possible. This is 5 mins in at 100% power limit, 2K ultra preset (which auto-sets 122% render scale), unlocked fps, and 2300 RPM fan speed to match yours. 10 mins in I hit 55-56C. Some games get it a good bit warmer if they run at a much lower fps, as shooting for high fps usually ends up CPU dependent.


----------



## Scotty99

Jesus, what is wrong with my card lol. Yes, I haven't installed any GPU software or anything, just popped it straight in, and it was boosting to 1974.

As for FPS limiters, I use fast sync combined with G-Sync; best of both worlds.

BTW, I never touched fan profiles; that is just what the card was adjusting to automatically. Worst part about this is I can't reapply thermal paste even if I wanted to, as it would void the warranty... why, Asus, why lol.


----------



## demon09

Quote:


> Originally Posted by *Scotty99*
> 
> Jesus what is wrong with my card lol. Yes i havent installed any GPU software or anything, just popped it straight in and was boosting to 1974.
> 
> As for FPS limiters i use fast sync combined with gsync, best of both worlds.
> 
> BTW i never touched fan profiles, that is just what the card was adjusting to automatically. Worst part about this is i cant reapply thermal paste if i wanted to, it would void the warranty....why asus why lol.


Ya, if you are not adjusting the fan speed manually, there is something wrong; it should not hit 2300 RPM on the stock curve, as that's 65% fan speed. I feel ya on Asus being stupid about reapplying thermal paste. Where did you buy it from? If you are hitting 80C at the stock 100% power limit, I'd be scared to see what would happen in a fully GPU-bound situation. I can hit upwards of 68-70C at 55% fan speed on my card when fully GPU bound with the 120% power limit, if I set a game to where it runs in the 60-75 fps range.


----------



## Sweetwater

Quote:


> Originally Posted by *Scotty99*
> 
> Strix 1080ti owners, whats the hottest you have ever seen your card? Im going to do more tests tomorrow with more fans in the case as id really like to keep it, i just want to know if my temps are out of line or *way* out of line. Again after 15 mins of overwatch in the training area, ultra preset 1440p 165hz with no fps cap, 80c and fans hit around 2300 rpm. From all the research ive done on this card 80c shouldnt even be possible, HWmonitor reported around 275w draw during that overwatch test which seems normal so i just dont know what to say.


My highest temperature while overclocked to 2114MHz core and +575MHz memory is 57C. Mind you, that's with 100% fans and about 25C ambient temperature.

80C is wacky; I've never seen even 65C with the default fan curve.


----------



## Scotty99

I'm really torn on what to do. The Strix fits my case and theme so well that changing the GPU throws off my entire build, but no way am I putting up with these temps. The glass side panel of my Meshify C is almost too hot to touch after playing a round of Overwatch; it's that hot.


----------



## demon09

Quote:


> Originally Posted by *Sweetwater*
> 
> My highest temperature while overclocked to 2114mhz and +575 mhz memory is 57c. Mind that's with 100% fans and about 25c ambient temperature.
> 
> 80c is wacky as I've never seen even 65c with default fan curve


Ya, the fact his default fan curve went to 65% fan speed raises alarms. I usually prefer a little noise over temps and live with 55-60% fan speed. Nice overclocker you've got there. Overwatch seems to not be a fan of overclocks, so my max is 2050 on the core; I could probably get 2080 in most games if I pushed the fan speeds up.


----------



## demon09

Quote:


> Originally Posted by *Scotty99*
> 
> Im really torn on what to do, the strix fits my case and theme so well changing GPU throws off my entire build, but no way am i putting up with these temps. The side of my meshify c glass window is almost too hot to touch after playing a round of overwatch, its that hot.


Can you get a replacement? That's what I would shoot for: either a replacement, or a card like the MSI Trio, or an AIO. I am not a fan of AIOs, but on GPUs they have a much bigger benefit. (Don't go to Asus for the replacement; since your card is pretty new, go to where you bought it from.)


----------



## Scotty99

Ya, I'm sure I could, but I just don't know if that's gonna fix it.









Browsing around (I found the official Strix thread), it appears I have the hottest-running 1080 Ti Strix in history; I haven't seen one person ever mention 80C. What was the hottest temp you saw in your Overwatch test?


----------



## Scotty99

One thing I need to check on: do you guys have the "OC" version? I understand it's just a factory overclock, but from a bit of searching, these might run hotter for whatever reason. This is the $800 Strix card.


----------



## demon09

Quote:


> Originally Posted by *Scotty99*
> 
> One thing i need to check on, do you guys have the "OC" version? I understand its just a factory overclock, but doing a bit of searching these might run hotter for whatever reason. This is the 800 dollar strix card.


Mine runs way cooler even with more of an overclock than your factory OC, with a higher power limit and overclocking on the memory. The other guys' cards might be OC versions, though. When I bought mine it was $750 for the non-OC and $999 for the OC, so the choice was easy lol.


----------



## KedarWolf

Quote:


> Originally Posted by *demon09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Scotty99*
> 
> One thing i need to check on, do you guys have the "OC" version? I understand its just a factory overclock, but doing a bit of searching these might run hotter for whatever reason. This is the 800 dollar strix card.
> 
> 
> 
> mine runs way cooler even with more of an overclock then your factory oc with a higher power limit and overclocking on the memory. the other guys might be an oc version though. when i bought mine it was 750 for non oc and 999$ for the oc so the choice was easy lol

You can flash the Strix OC BIOS and it's basically the same card.

It's not a good idea with high temps though as it has a higher power limit and likely would just make temps worse.


----------



## Sweetwater

Quote:


> Originally Posted by *Scotty99*
> 
> do you guys have the "OC" version? I understand its just a factory overclock, but doing a bit of searching these might run hotter for whatever reason. This is the 800 dollar strix card.


I have one of the first batches that Australia got and it's an OC version as that's the only option we had at launch.


----------



## RoBiK

Quote:


> Originally Posted by *Scotty99*
> 
> My case is one of the best flowing on the market, a meshify c with the foam removed and a fan blowing directly on the GPU.


Quote:


> Originally Posted by *Scotty99*
> 
> The side of my meshify c glass window is almost too hot to touch after playing a round of overwatch, its that hot.


Something doesn't add up... if you have such a great air flow, why is the glass so hot?


----------



## Dasboogieman

Quote:


> Originally Posted by *8051*
> 
> What I don't understand, at least with the GDDR5 standard, is If the core clock is at 2100 MHz. and the memory controller is running at 2500 Mhz. which device is waiting on which? Is the latency for GDDR5 memory that bad that it reduces the bandwidth?
> 
> The theoretical bandwidth for a 384-bit monolithic GDDR5 memory controller running at 2500 MHz. should be 2.5 Ghz. x 4 x 48 bytes = 480 GiB/sec.? But the GPU core can only read data at a SDR of 2.1 Ghz. x 48 bytes = 100 GiB/sec? So how can the GPU core take advantage of this greater data rate? Is the GPU somehow reading the memory controller at a quad data rate as well?
> 
> An IMC "is simply a relay", huh? Relays are synchronous devices with free running differential forwarded clocks and at least four control signals?
> https://www.skhynix.com/eolproducts.view.do?pronm=GDDR5+SDRAM&srnm=H5GC%28Q%292H24BFR&rk=26&rc=graphics


That is the thing, they wait on each other depending on the operation demanded at the time but it's usually the GPU waiting on the IMC. That is why the IMC plays such a big role in the effective round trip latency of the RAM access.

We can take the example of the CPU which usually runs the IMC at the same multiplier as the L3 cache (I think, if I read the Kaby Lake schematics properly).

Intel and Ryzen both have DDR4 IMCs, but the Intel one is noticeably lower latency than the Ryzen one. This is partly due to the way the IMCs are wired up to the cache + core (or Infinity Fabric in the case of Ryzen), but it's mostly to do with the design. I won't pretend to know the specifics, but you can see that Intel's superior design allows them to get something like 45ns latency while Ryzen struggles to go below 80ns. What this means is the core usually races ahead of the IMC, and the optimal way to utilise resources is to ensure the core has enough work to do during the 45ns while data is being fetched. However, if we were to increase the DDR4 bandwidth by increasing the clock, that clockspeed increase is usually independent of the clockspeed the IMC works at (or synchronous in the case of Ryzen, but I'm not 100% sure on this), and the net result is you get more data per delivery once the 45ns is done. Cores are usually designed to have a 90-95% effective hit rate for data to be within cache before going to the IMC. This is the efficient point, because it is hideously die-space wasteful and power expensive to double the cache just to inch closer to 99%. It also means the core still cares about the rate of data delivery to a greater or lesser degree, so you will still see gains with faster memory even if the cache is already 90-95% effective.

Extrapolating to the GPU: it doesn't care as much about latency as the CPU does, as most work is serially executed with few looping dependencies or branching. This means the bulk of the data can be pre-fetched ahead of time. However, the sheer rate at which the GPU inputs and outputs data is why memory bandwidth is so important; the internal bus is simply that fast (I think the L2 of a 1080 Ti works at 1200GB/s or something). Latency affects memory bandwidth because the IMC is literally idle during the time the data is being fetched from the GDDR5/5X/HBM array; every ns it spends idle is time it could've pre-fetched more data. This is why there is a difference between theoretical bandwidth and attainable bandwidth. The type of work the GPU does also has a massive impact on the attainable bandwidth. The ideal type of work for the GPU (barring just working off the L2) is data that only needs to be accessed very few times but where the page is left open for long periods of time, i.e. you incur the latency penalty a few times and spend the rest continuously streaming data. This is why certain applications measure higher bandwidth and others measure less.

Hypothetically, if we were to increase the data rate (aka the clockspeed of the VRAM) but leave the latency the same, you will get a performance boost because the GPU will receive more data within its burst window faster; it still suffers from the same latency penalty, but you shave time on the transport.

The reason you don't see such a massive impact with modern GPUs is the VRAM array uses an automated strap system for the VRAM. The latency goes up after a certain amount of clockspeed is applied, it's usually a net gain in speed but it's not a massive spike. This means there are several latency straps which are programmed in to the firmware (or BIOS in the case of AMD cards) which dynamically become applied according to the current operating clockspeed. That is why if you mapped the memory bandwidth increase per bin, you actually get an increase, a peak, then a dip before the next peak. The changing of strap is automated and you generally cannot control it (except for AMD cards lmao which got ludicrous performance gains from the VBIOS memory strap mods).

Finally, you also have the well-known GDDR5/GDDR5X auto error checking, which can play havoc with the effective bandwidth at high clocks, high temperatures or ultra-conservative straps.
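For reference, the theoretical peak numbers in this discussion fall straight out of bus width times effective data rate. A minimal calculation; the 352-bit/11Gbps figures are the 1080 Ti's stock spec, and the 384-bit/10Gbps case is the hypothetical from 8051's question above:

```python
# Theoretical peak VRAM bandwidth = (bus width in bytes) * effective data rate.
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

# GTX 1080 Ti: 352-bit bus, 11 Gbps effective GDDR5X
print(peak_bandwidth_gbs(352, 11.0))   # 484.0 GB/s, matching the spec sheet

# 8051's example: 384-bit bus, 2.5 GHz memory clock * 4 (QDR) = 10 Gbps
print(peak_bandwidth_gbs(384, 10.0))   # 480.0 GB/s
```

Attainable bandwidth sits below these figures for exactly the reasons above: IMC idle time, page-open patterns, and error-checking overhead.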


----------



## Scotty99

Quote:


> Originally Posted by *RoBiK*
> 
> Something doesn't add up... if you have such a great air flow, why is the glass so hot?


I don't get it either, man lol. I have fans coming tomorrow, but it's not going to make as much difference as I want it to:

They only saw a 3C drop in GPU temps going to two high-quality Noctua fans (CPU temps dropped a lot, though). I will say removing the dust filter moved my temps down 3-4C, but that isn't something I'd be willing to run 24/7.

At best, without a dust filter and with fans fitted in every slot, I'm still looking at ~70C load temps, which seems higher than most people I see posting around here. As I am typing this my GPU is at 45C; it is downclocking to 139MHz now, but the temps have not gone down at all.


----------



## KedarWolf

That is the thing, they wait on each other depending on the operation demanded at the time but it's usually the GPU waiting on the IMC. That is why the IMC plays such a big role in the effective round trip latency of the RAM access.


Spoiler: Warning: Spoiler!



We can take the example of the CPU which usually runs the IMC at the same multiplier as the L3 cache (I think, if I read the Kaby Lake schematics properly).

Intel and Ryzen both have DDR4 IMCs, but the Intel one is noticeably lower latency than the Ryzen one. This is partly due to the way the IMCs are wired up to the cache + core (or Infinity Fabric in the case of Ryzen), but it's mostly to do with the design. I won't pretend to know the specifics, but you can see that Intel's superior design allows them to get something like 45ns latency while Ryzen struggles to go below 80ns. What this means is the core usually races ahead of the IMC, and the optimal way to utilise resources is to ensure the core has enough work to do during the 45ns while data is being fetched. However, if we were to increase the DDR4 bandwidth by increasing the clock, that clockspeed increase is usually independent of the clockspeed the IMC works at (or synchronous in the case of Ryzen, but I'm not 100% sure on this), and the net result is you get more data per delivery once the 45ns is done. Cores are usually designed to have a 90-95% effective hit rate for data to be within cache before going to the IMC. This is the efficient point, because it is hideously die-space wasteful and power expensive to double the cache just to inch closer to 99%. It also means the core still cares about the rate of data delivery to a greater or lesser degree, so you will still see gains with faster memory even if the cache is already 90-95% effective.

Extrapolating to the GPU: it doesn't care as much about latency as the CPU does, as most work is serially executed with few looping dependencies or branching. This means the bulk of the data can be pre-fetched ahead of time. However, the sheer rate at which the GPU inputs and outputs data is why memory bandwidth is so important; the internal bus is simply that fast (I think the L2 of a 1080 Ti works at 1200GB/s or something). Latency affects memory bandwidth because the IMC is literally idle during the time the data is being fetched from the GDDR5/5X/HBM array; every ns it spends idle is time it could've pre-fetched more data. This is why there is a difference between theoretical bandwidth and attainable bandwidth. The type of work the GPU does also has a massive impact on the attainable bandwidth. The ideal type of work for the GPU (barring just working off the L2) is data that only needs to be accessed very few times but where the page is left open for long periods of time, i.e. you incur the latency penalty a few times and spend the rest continuously streaming data. This is why certain applications measure higher bandwidth and others measure less.

Hypothetically, if we were to increase the data rate (aka the clockspeed of the VRAM) but leave the latency the same, you will get a performance boost because the GPU will receive more data within its burst window faster; it still suffers from the same latency penalty, but you shave time on the transport.

The reason you don't see such a massive impact with modern GPUs is the VRAM array uses an automated strap system for the VRAM. The latency goes up after a certain amount of clockspeed is applied, it's usually a net gain in speed but it's not a massive spike. This means there are several latency straps which are programmed in to the firmware (or BIOS in the case of AMD cards) which dynamically become applied according to the current operating clockspeed. That is why if you mapped the memory bandwidth increase per bin, you actually get an increase, a peak, then a dip before the next peak. The changing of strap is automated and you generally cannot control it (except for AMD cards lmao which got ludicrous performance gains from the VBIOS memory strap mods).

Finally, you also have the well-known GDDR5/GDDR5X auto error checking, which can play havoc with the effective bandwidth at high clocks, high temperatures or ultra-conservative straps.



You can set your memory speed and run the attached program. If you get, say, 450-460GB/s at +642 memory, you can raise or lower it a bin; if you then get 475-480GB/s, you know you're at a bin using your maximum bandwidth.









vRamBandWidthTest.zip 119k .zip file
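KedarWolf's bin-hunting procedure boils down to: sweep memory offsets, record the measured bandwidth from the test tool at each one, and keep the offset sitting on a peak. A minimal sketch; the measurements here are made up, and in practice each value would come from a run of the attached tool at that offset:

```python
# Hypothetical sweep results: memory offset (MHz) -> measured bandwidth (GB/s).
# Note the dips between peaks KedarWolf describes (e.g. +656 below +642).
measured = {600: 452, 614: 455, 628: 478, 642: 480, 656: 461, 670: 483}

# Pick the offset at the measured-bandwidth peak (among offsets you know are stable).
best_offset = max(measured, key=measured.get)
print(best_offset, measured[best_offset])   # the bin actually delivering max bandwidth
```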


----------



## demon09

Quote:


> Originally Posted by *KedarWolf*
> 
> You can flash the Strix OC BIOS and it's basically the same card.
> 
> It's not a good idea with high temps though as it has a higher power limit and likely would just make temps worse.


Unless I am mistaken, the OC and non-OC Strix have the same max power limit of 120%. Or does the OC version have a higher base power limit?


----------



## KedarWolf

Quote:


> Originally Posted by *demon09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> You can flash the Strix OC BIOS and it's basically the same card.
> 
> It's not a good idea with high temps though as it has a higher power limit and likely would just make temps worse.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> unless I am mistaken the oc and non oc strix have the same max power limit of 120%. Unless does the oc version have a higher base power limit?

The OC is 330W, I think the regular is only 300W. Same slider at 120%, just a higher power draw.


----------



## KedarWolf

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *demon09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> You can flash the Strix OC BIOS and it's basically the same card.
> 
> It's not a good idea with high temps though as it has a higher power limit and likely would just make temps worse.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> unless I am mistaken the oc and non oc strix have the same max power limit of 120%. Unless does the oc version have a higher base power limit?
> 
> 
> The OC is 330W, I think the regular is only 300W. Same slider at 120%, just a higher power draw.

Yes, I checked: the regular Strix is only 300W, the OC 330W.

https://www.techpowerup.com/vgabios/?architecture=&manufacturer=ASUS&model=GTX+1080+Ti&interface=&memType=&memSize=&since=
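Since the slider is a percentage of each BIOS's base power target, the same 120% position means different watts on the two cards. A quick calculation using the 300W/330W figures above:

```python
# Power limit slider is relative to the base power target of each BIOS,
# so 120% on the Strix OC allows noticeably more draw than 120% on the regular Strix.
for name, base_w in [("Strix", 300), ("Strix OC", 330)]:
    print(f"{name}: base {base_w} W, at 120% -> {base_w * 1.20:.0f} W")
```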


----------



## ZealotKi11er

Can you please remove the side panel and test it then? I think the case just can't handle that much heat dump.


----------



## Agent-A01

Quote:


> Originally Posted by *Scotty99*
> 
> Jesus what is wrong with my card lol. Yes i havent installed any GPU software or anything, just popped it straight in and was boosting to 1974.
> 
> As for FPS limiters i use fast sync combined with gsync, best of both worlds.
> 
> BTW i never touched fan profiles, that is just what the card was adjusting to automatically. Worst part about this is i cant reapply thermal paste if i wanted to, it would void the warranty....why asus why lol.


Fast sync does not complement G-Sync.

Don't use them together.
Enable V-Sync and set an FPS limit 3 FPS below your refresh rate.

Also, it sounds like your card has a bad factory thermal paste application.


----------



## Scotty99

I've had no problem with fast sync and G-Sync together. I dislike frame limiters; I like to see how many FPS I'm getting from a card I paid nearly a grand for lol. My temps are slightly better today after I removed the DEMCiflex filter (great filter material, but might be a bit too restrictive). I'll update if I see any more temp drops when I add two extra intake fans and my 240 AIO in the roof.


----------



## Dasboogieman

Quote:


> Originally Posted by *Scotty99*
> 
> Ive had no problem with fast sync and sync together. I dislike frame limiters i like to see how many FPS im getting for a card i paid nearly a grand for lol. My temps are slightly better today after i removed the demciflex filter (great filter material, but might be a bit too restrictive). Ill update if i see any more temp drops when i add two extra intake fans and my 240 aio in roof.


I prefer smooth frame delivery and fluid visuals above all else. To each their own I guess.

G-Sync/FreeSync should be able to work well with Fast Sync. Fast Sync is essentially old-school triple buffering, so it's meant to complement V-Sync (adaptive or not).


----------



## Scotty99

I dunno, fast sync seems pretty good to me. I can't tell when it goes above 165 FPS; there's no tearing, nor do I see stutter. Pretty awesome implementation in tandem with G-Sync, IMO.


----------



## KedarWolf

Quote:


> Originally Posted by *Scotty99*
> 
> I dunno fast sync seems pretty good to me, i cant tell when it goes above 165 fps there is no tearing nor do i see stutter, pretty awesome implementation in tandem with gsync imo.


If your framerate goes above your refresh rate, G-Sync is disabled.

That's why you cap your framerate a few FPS below your refresh rate or use V-Sync.


----------



## nrpeyton

Need some help:

I've installed an extra OLD AMD GPU into a spare x8 slot on my mobo. My 1080 Ti remains in its original slot closest to the CPU (x16/x8).

When I've got the AMD GPU installed, the Nvidia one isn't recognised, even in the BIOS.

I want to mine on one and play on the other. I'm playing some old games just now (Mass Effect 1), so the old AMD GPU will keep up while I mine on the card I originally bought for gaming (just temporarily).

Any ideas why the Nvidia card isn't being detected at all? I can't even get boot-up info onscreen with the cable plugged into the 1080 Ti in my main x16 slot. Boot info strangely shows when I plug the monitor cable into this 7-year-old ancient AMD card.

Doesn't make sense.

I've also noticed that Intel HD Graphics 630 has begun showing up in GPU-Z etc. But I have reset the system to BIOS defaults, so that doesn't make sense either.

Edit:

As soon as I remove the old AMD card from the x8 slot, the system boots into my Nvidia card perfectly, as if it were never missing. (It was never even unplugged while I had the AMD card in there.)

What am I missing?


----------



## aberrero

I just installed my Ti yesterday and I've been having trouble with my system shutting off during games. It happened twice at stock and once when OC'd, and I'm not sure what's causing it. My system is watercooled and has never exceeded 54C on the GPU itself. Could it be poor contact with the VRMs on the GPU block? I have an EVGA card with their iCX thing, but I can't seem to see all the temperature sensors it supposedly has.

I've never had my system straight up turn off the way this has before. It just shuts off and won't turn on again until I physically power cycle the PSU.

It was running stable for months with my previous card, and it almost certainly isn't a PSU issue (I have an 80 Plus Platinum with plenty of capacity).


----------



## nrpeyton

Quote:


> Originally Posted by *aberrero*
> 
> I just installed my Ti yesterday and I've been having trouble with my system shutting off during games. It happened twice at stock and once when OC'd, and I'm not sure what's causing it. My system is watercooled and has never exceeded 54Con the GPU itself. Could it be poor contact with VRMs on the GPU block? I have an EVGA card with their IceQ thing, but I can't seem to see all the temperature sensors it supposedly has.
> 
> I've never had my system straight up turn off the way this has before. It just shuts off and wont turn on again until I physically power cycle the PSU.
> 
> It was running stable for months with my previous card, and it almost certainly isn't a PSU issue (I have an 80Plus Platinum with plenty of capacity).


Sometimes things just give in early. Or it could be a loose cable.

You could test your PSU with a multimeter just to make sure.

Is it all games or a specific game?


----------



## KedarWolf

Quote:


> Originally Posted by *aberrero*
> 
> I just installed my Ti yesterday and I've been having trouble with my system shutting off during games. It happened twice at stock and once when OC'd, and I'm not sure what's causing it. My system is watercooled and has never exceeded 54Con the GPU itself. Could it be poor contact with VRMs on the GPU block? I have an EVGA card with their IceQ thing, but I can't seem to see all the temperature sensors it supposedly has.
> 
> I've never had my system straight up turn off the way this has before. It just shuts off and wont turn on again until I physically power cycle the PSU.
> 
> It was running stable for months with my previous card, and it almost certainly isn't a PSU issue (I have an 80Plus Platinum with plenty of capacity).


If your motherboard BIOS has a surge protection option, turn it off.

It's a common issue with cards pulling a lot of power.


----------



## ZealotKi11er

Quote:


> Originally Posted by *nrpeyton*
> 
> Need some help:
> 
> I've installed an extra OLD amd GPU into a spare x8 slot on my mobo. My 1080ti remains in it's original slot closest to the CPU. (16/x8).
> 
> When I've got the AMD GPU installed the Nvidia one isn't recognised even in the BIOS.
> 
> I want to mine on one -- and play on the other. I'm playing some old games just now (mass effect 1) so the old AMD GPU will keep up while I mine on my card that I originally bought for gaming. (just temporarily).
> 
> Any ideas why the nvidia card isn't being detected at all? I can't even get boot up-info onscreen if I have cable plugged into the 1st original 1080Ti in my main x16 slot. Boot info strangley shows when I plug monitor cable into in this 7 year old anient AMD card.
> 
> Doesn't make sense.
> 
> I've also noticed that intel HD graphics 630 has begun showing up in GPU-Z etc. But I have reset system to BIOS defaults -- so doesn't make sense.
> 
> Edit:
> 
> As soon as I remove the old AMD card from x8 slot, the system boots into my nvidia card perfectly as if it were never missing. (it was never even unplugged when I had the AMD card in there).
> 
> What am I missing?


There might be a setting in the BIOS. I have zero experience with ASUS boards, sadly. Try turning off the IGP altogether. I game on my 1080 Ti while mining just fine. Not the best performance, but enough for HotS.


----------



## Agent-A01

Quote:


> Originally Posted by *Scotty99*
> 
> Ive had no problem with fast sync and sync together*. I dislike frame limiters i like to see how many FPS im getting for a card i paid nearly a grand for* lol. My temps are slightly better today after i removed the demciflex filter (great filter material, but might be a bit too restrictive). Ill update if i see any more temp drops when i add two extra intake fans and my 240 aio in roof.


That sounds like OCD

Anyways, you don't get any benefit from a G-Sync monitor above its refresh rate.

An FPS limit will give you lower temps and less input lag, plus the benefits of G-Sync.

Therefore it makes no sense to use your current settings.

And there is definitely stutter with fast sync, you just can't see it.
I can, but everyone is different.


----------



## nrpeyton

Quote:


> Originally Posted by *ZealotKi11er*
> 
> There might be setting in BIOS. I have 0 experience with ASUS MB sadly. Try to turn off IGP all together. I game with my 1080 Ti while mining just fine. Not the best performance but enough for HoTS.


Okay, this time I reset the BIOS first, removed the AMD card, powered off the system, then re-installed the older card. The older card is being detected this time.

Must be a certain order things have to go in.

Thanks


----------



## Mark II

Hi All,

I have just bought an AORUS 1080 Ti WF. The AORUS software tells me I have BIOS F2, and it also tells me there are no newer ones... but reading this forum I see there should be an F4 or even an F3P, so I'm confused.

Could you please tell me if there is a newer BIOS for it, whether it is worth upgrading, and if so, where to download the AORUS .exe to update to the latest one?

Thanks in advance, much appreciated.


----------



## 6u4rdi4n

Quote:


> Originally Posted by *aberrero*
> 
> I just installed my Ti yesterday and I've been having trouble with my system shutting off during games. It happened twice at stock and once when OC'd, and I'm not sure what's causing it. My system is watercooled and has never exceeded 54Con the GPU itself. Could it be poor contact with VRMs on the GPU block? I have an EVGA card with their IceQ thing, but I can't seem to see all the temperature sensors it supposedly has.
> 
> I've never had my system straight up turn off the way this has before. It just shuts off and wont turn on again until I physically power cycle the PSU.
> 
> It was running stable for months with my previous card, and it almost certainly isn't a PSU issue (I have an 80Plus Platinum with plenty of capacity).


The PC of a friend of mine was working just as it should, until suddenly it would shut itself off during games and other heavy tasks. It was obviously the PSU, as replacing the PSU with another unit made the PC healthy again.

My point is, you never know when a PSU will go bad. It could be a bad capacitor, or damage from the power grid fluctuating. What was your previous card? If you have the possibility to test with another PSU, I would do it. Then at least you will be sure.


----------



## Streetdragon

Just a random PSU question: is it possible to check, without tools, if my PSU's lifetime is... "ending"? And how?


----------



## becks

Quote:


> Originally Posted by *Streetdragon*
> 
> Just a random PSU question, is it possible/how can i check, without tools, if my PSU lifetime is........ "ending"?


Common sense...

Does it smell burnt?
Is it REALLY hot?
Does it make noises?


----------



## shreduhsoreus

Quote:


> Originally Posted by *Agent-A01*
> 
> Enable vsync and enabel fps limit 3fps below your refresh rate.


Doing both is pretty pointless, as you're using two methods to achieve the same goal. Using Fast Sync isn't that terrible an idea either, especially if you rarely go above your refresh rate, due to lower input lag than either of the previous methods. In fact, depending on what software method you use to limit frame rate, you could be adding more input lag than traditional V-Sync. Though to get the lowest input lag, you're better off just dealing with a little tearing every so often.






This video doesn't have FPS limiting tests, but if you'd like, I can dig up my sources again. RivaTuner can be one of the biggest culprits of added input lag.

EDIT: My memory is failing me. It's Nvidia Inspector that's bad with added input lag; RivaTuner only adds a frame's worth of delay.

https://www.blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/12/


----------



## Agent-A01

Quote:


> Originally Posted by *Streetdragon*
> 
> Just a random PSU question, is it possible/how can i check, without tools, if my PSU lifetime is........ "ending"?


Instability like random crashing, shutdowns under load, freezing, etc.

Quote:


> Originally Posted by *shreduhsoreus*
> 
> Doing both is pretty pointless as you're using 2 methods to achieve the same goal. Using Fastsync isn't that terrible of an idea either, especially if you rarely go above your refresh rate, due to lower input lag than using either of the previous methods. In fact, depending on what software method you use to limit frame rate, you could be adding more input lag than traditional v-sync. Though to get the lowest input lag you're better off just dealing with a little tearing every so often.
> 
> 
> 
> 
> 
> 
> This video doesn't have FPS limiting tests, but if you'd like, I can dig up my sources again. RivaTuner can be one of the biggest culprits of added input lag.
> 
> EDIT: my memory is failing me. It's Nvidia Inspector that's bad with added input lag. RivaTuner only adds a frame's worth of delay.
> 
> https://www.blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/12/


Doing both isn't pointless.

You have to enable V-Sync to get optimal usage out of G-Sync within its range.

It enables frametime compensation; if you turn it off, you will get tearing even in G-Sync's range.

Most people do not understand how G-Sync works.

You need to enable V-Sync AND an FPS limit (in-game is best if it's stable, like Overwatch's, or RTSS) set 3 below your refresh rate.

BTW, RTSS adds UP TO 1 frame of delay, not a flat static frame of delay.
FPS limiters can actually reduce input lag versus a maxed-out GPU with variable FPS slightly above the limit.

But that's unrelated to gsync.
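Since the "3 below refresh" cap keeps coming up, here's a tiny sketch of the arithmetic (the 3 FPS margin is the Blur Busters recommendation discussed in this thread; the frametime form is just the equivalent per-frame budget, since limiters like RTSS work in frametimes — the function name is my own):

```python
def gsync_cap(refresh_hz: int, margin: int = 3) -> tuple[int, float]:
    """FPS cap a few frames under the refresh rate, plus the
    equivalent frametime budget in milliseconds per frame."""
    cap = refresh_hz - margin
    frametime_ms = 1000.0 / cap
    return cap, frametime_ms

# Common G-Sync panel refresh rates from this discussion.
for hz in (60, 120, 144, 165):
    cap, ft = gsync_cap(hz)
    print(f"{hz}Hz -> cap at {cap} FPS ({ft:.2f} ms/frame)")
```

A margin of 1 (e.g. 59 on a 60Hz panel, as mentioned above) also keeps you inside the G-Sync range; 3 just leaves more headroom for frametime spikes.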


----------



## KedarWolf

Quote:


> Originally Posted by *Agent-A01*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Streetdragon*
> 
> Just a random PSU question, is it possible/how can i check, without tools, if my PSU lifetime is........ "ending"?
> 
> 
> 
> Instability like random crashing, shutdowns under-load, freezing, etc.
> 
> Quote:
> 
> 
> 
> Originally Posted by *shreduhsoreus*
> 
> Doing both is pretty pointless as you're using 2 methods to achieve the same goal. Using Fastsync isn't that terrible of an idea either, especially if you rarely go above your refresh rate, due to lower input lag than using either of the previous methods. In fact, depending on what software method you use to limit frame rate, you could be adding more input lag than traditional v-sync. Though to get the lowest input lag you're better off just dealing with a little tearing every so often.
> 
> 
> 
> 
> 
> 
> This video doesn't have FPS limiting tests, but if you'd like, I can dig up my sources again. RivaTuner can be one of the biggest culprits of added input lag.
> 
> EDIT: my memory is failing me. It's Nvidia Inspector that's bad with added input lag. RivaTuner only adds a frame's worth of delay.
> 
> https://www.blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/12/
> 
> Click to expand...
> 
> Doing both isn't pointless.
> 
> You have to enable Vsync to get optimal usage out of gsync within its range.
> 
> It enables frametime compensation; you turn it off you will get tearing even in gsync's range.
> 
> Most people do not understand how gsync works.
> 
> You need to enable VSync AND a FPS limit(ingame is best if it's stable, like overwatch OR RTSS) to 3 below refresh rate.
> 
> BTW RTSS adds UP to 1 frame of delay, not a flat static frame of delay.
> FPS limiters can actually reduce input lag over maxed out GPU with variable FPS that's slightly above the limit.
> 
> But that's unrelated to gsync.
Click to expand...

On my 4K G-Sync screen, if I have V-Sync off and limit FPS to 59, G-Sync stays enabled. I've checked with the G-Sync indicator.

There can be less input lag etc. with V-Sync disabled, G-Sync on, and RivaTuner or something limiting FPS.

If you limit the FPS below your refresh rate, there's no need for V-Sync.


----------



## Scotty99

Apparently stutter is dependent on your system.

Just testing things out:

AMD system: no stutter with an FPS cap; stutter with V-Sync + G-Sync.
Intel system: stutter with an FPS cap; no stutter with V-Sync + G-Sync.

Not placebo either; they are the exact opposite, and it's very noticeable. With fast sync I get no stutter on either system.


----------



## nyk20z3

Asus Strix 1080 Ti OC with a Phanteks Glacier Block, I normally use EK blocks but wanted to try something different. The quality is just as good as any other block and it retains the stock backplate. It also has 2 built in rgb led strips which is the icing on the cake.


----------



## shreduhsoreus

Quote:


> Originally Posted by *Agent-A01*
> 
> Doing both isn't pointless.
> 
> You have to enable Vsync to get optimal usage out of gsync within its range.
> 
> It enables frametime compensation; you turn it off you will get tearing even in gsync's range.


That's simply not true. I run 3 G-Sync monitors in surround. Not one bit of tearing and I have vsync off. Vsync with G-Sync is only useful if you frequently come close to/surpass your monitor's refresh rate, the reason you cap the framerate 3 FPS below your refresh rate is to keep your current framerate further from your refresh rate to avoid input lag caused by vsync.
Quote:


> Originally Posted by *Agent-A01*
> 
> Most people do not understand how gsync works.


Yourself included.


----------



## Agent-A01

Quote:


> Originally Posted by *KedarWolf*
> 
> On my 4K G-Sync screen if I have vsync off and limit FPS to 59 G-Sync stays enabled. I've checked when the
> G-Sync indicator.
> 
> Can be less input lag etc. with Vsync disabled, G-Sync on and Rivatuner or something limiting FPS.
> 
> If you limit the FPS below your refresh rate no need for Vsync.


You do need V-Sync enabled; it's not actually engaging V-Sync, it enables frametime compensation within G-Sync's range.
Therefore there is no added latency.

Of course, V-Sync does activate when FPS > refresh rate.
Quote:


> Originally Posted by *shreduhsoreus*
> 
> That's simply not true. I run 3 G-Sync monitors in surround. Not one bit of tearing and I have vsync off. Vsync with G-Sync is only useful if you frequently come close to your monitor's refresh rate, the reason you do cap the framerate 3 FPS below your refresh rate is to keep your current framerate further from your refresh rate to avoid input lag caused by vsync.


Yes it is true.

Without vsync on there is no frametime compensation. You will get 'micro' tearing, much less noticeable than regular tearing.

You may not be able to see it, but I can.
It's just a fact of how gsync works; micro tearing is going to happen without vsync on.

Many people in general do not understand G-Sync's behavior.

I suggest you all read G-Sync 101 at Blur Busters to better educate yourselves.


----------



## shreduhsoreus

Quote:


> Originally Posted by *Agent-A01*
> 
> I suggest you all to read Gsync 101 at blurbusters to better educate yourself.


I have. Several times. I suggest you do the same. V-Sync is not needed with G-Sync to eliminate screen tearing if your framerate is always within G-Sync's range. You're just plain wrong, and this isn't the first time you've replied to me in this thread with no clue what you're talking about.

I'll see myself out. I have games to play without screen tearing.


----------



## Agent-A01

Quote:


> Originally Posted by *shreduhsoreus*
> 
> I have. Several times. I suggest you do the same. Vsync is not needed with G-Sync to eliminate screen tearing if you're framerate is always within G-Sync's range. You're just plain wrong, and this isn't the first time you've replied to me in this thread with no clue what you're talking about.
> 
> I'll see myself out. I have games to play without screen tearing


Obviously you didn't read, you're clearly wrong.

Believe what you want, but don't spread misinformation around to those who don't understand it (like you) lol.

Here is their statement, you can't refute facts.
Quote:


> G-SYNC + V-SYNC "Off":
> The tearing inside the G-SYNC range with V-SYNC "Off" is caused by sudden frametime variances output by the system, which will vary in severity and frequency depending on both the efficiency of the given game engine, and the system's ability (or inability) to deliver consistent frametimes.
> 
> G-SYNC + V-SYNC "Off" disables the G-SYNC module's ability to compensate for sudden frametime variances, meaning, instead of aligning the next frame scan to the next scanout (the process that physically draws each frame, pixel by pixel, left to right, top to bottom on-screen), G-SYNC + V-SYNC "Off" will opt to start the next frame scan in the current scanout instead. This results in simultaneous delivery of more than one frame in a single scanout (tearing).
> 
> In the Upper FPS range, tearing will be limited to the bottom of the display. In the Lower FPS range (<36) where frametime spikes can occur (see What are Frametime Spikes?), full tearing will begin.
> 
> Without frametime compensation, G-SYNC functionality with V-SYNC "Off" is effectively "Adaptive G-SYNC," and should be avoided for a tear-free experience (see G-SYNC 101: Optimal Settings & Conclusion).
> 
> G-SYNC + V-SYNC "On":
> This is how G-SYNC was originally intended to function. Unlike G-SYNC + V-SYNC "Off," G-SYNC + V-SYNC "On" allows the G-SYNC module to compensate for sudden frametime variances by adhering to the scanout, which ensures the affected frame scan will complete in the current scanout before the next frame scan and scanout begin. This eliminates tearing within the G-SYNC range, in spite of the frametime variances encountered.
> 
> Frametime compensation with V-SYNC "On" is performed during the vertical blanking interval (the span between the previous and next frame scan), and, as such, does not delay single frame delivery within the G-SYNC range and is recommended for a tear-free experience (see G-SYNC 101: Optimal Settings & Conclusion).


You can't notice the tearing; that's okay. My eyes are better at picking up things like that.

But I suppose you'll keep backpedaling, in which case I suggest you email jorimt of BB and tell him his hard work and words mean **** according to you.


----------



## KedarWolf

Quote:


> Originally Posted by *Agent-A01*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> On my 4K G-Sync screen if I have vsync off and limit FPS to 59 G-Sync stays enabled. I've checked when the
> G-Sync indicator.
> 
> Can be less input lag etc. with Vsync disabled, G-Sync on and Rivatuner or something limiting FPS.
> 
> If you limit the FPS below your refresh rate no need for Vsync.
> 
> 
> 
> You do need Vsync enabled; its not enabling VSync, it enables Frame time compensation within gsyncs range.
> Therefore there is no added latency.
> 
> Of course Vsync does activate when fps > refresh rate
> Quote:
> 
> 
> 
> Originally Posted by *shreduhsoreus*
> 
> That's simply not true. I run 3 G-Sync monitors in surround. Not one bit of tearing and I have vsync off. Vsync with G-Sync is only useful if you frequently come close to your monitor's refresh rate, the reason you do cap the framerate 3 FPS below your refresh rate is to keep your current framerate further from your refresh rate to avoid input lag caused by vsync.
> 
> Click to expand...
> 
> Yes it is true.
> 
> Without vsync on there is no frametime compensation. You will get 'micro' tearing, much less noticeable than regular tearing.
> 
> You may not be able to see it, but I can.
> It's just a fact of how gsync works; micro tearing is going to happen without vsync on.
> 
> Many people in general do not understand Gsyncs behavior.
> 
> I suggest you all to read Gsync 101 at blurbusters to better educate yourself.
Click to expand...

It's pretty much been a given since G-Sync came out that V-Sync off with framerates capped below the refresh rate is the way to go in games.

It reduces input lag and you still get G-Sync.

I've never read anywhere that you HAVE to have V-Sync on with G-Sync, except from you.

I'm not being argumentative, just stating what has been stated many times before.


----------



## Agent-A01

Quote:


> Originally Posted by *KedarWolf*
> 
> It's been pretty much been a given since G-Sync came out that vsync off with framerates capped below the refresh rate is the way to go in games.
> 
> It reduces input lag and you still get G-Sync.
> 
> I've never read anything about you HAVE to have vsync on with G-sync except hearing it from you.
> 
> I'm not being argumentative, just stating what has been stated many times before.


Read above, people who have tested everything with gsync.


----------



## shreduhsoreus

You're referring to the upper and lower limits of G-Sync's range, which is an entirely different story, and V-Sync is needed for different reasons than you are claiming. I spent hours on end reading about G-Sync before deciding to invest $1200 in monitors.

You don't have some sort of superhuman eyes. I'm running SURROUND monitors, dude; you would have to literally be blind not to see tearing with this much screen width. Prior to disabling the Windows feature that turns off monitors after a set amount of time, I had G-Sync turn off without realizing it. I got massive tearing in game and was thinking, "Why the hell is G-Sync allowing tearing?" It's because it wasn't even enabled. Fixed that, and I haven't seen one bit of tearing. Not once. My framerate ranges between 55 and 120, far enough from the upper and lower limits of G-Sync's range, eliminating my need for V-Sync or an FPS limiter.

Pay attention to the specific lines I quote when I reply. I specifically quoted you saying "you will get tearing without V-Sync enabled with G-Sync". You made an objective statement; you said "will". No, that is not true, regardless of what you and your phony superhuman eyes think. What you should have said is "you may get tearing", because that would have been a true statement.

I don't know what your deal is that you have to act like such a **** over computers; I've seen you do it multiple times in this thread.


----------



## Scotty99

I suggest you take the tone down a notch, Agent; this board is for discussion.






Tom Petersen himself said V-Sync on with G-Sync is a "poor choice" and suggests using fast sync in combination with G-Sync, as they are complementary to each other.


----------



## shreduhsoreus

Quote:


> Originally Posted by *Agent-A01*
> 
> Obviously you didn't read, you're clearly wrong.
> 
> Believe what you want but don't spread misinformation around to those that don't understand it(like you) lol.
> 
> Here is their statement, you can't refute facts.
> You can't notice the tearing, that's okay
> 
> 
> 
> 
> 
> 
> 
> .
> 
> My eyes are better at picking up things like that.
> 
> But I suppose you'll keep backpedaling, in which case i suggest you email jorimt of BB and tell him his hard work and words mean **** according to you


When they say "frametime variances", what they mean are those random individual frames that land higher (or, at the bottom end of the spectrum, lower) than G-Sync's range. The "compensation" is for going out of G-Sync's range for a very short amount of time. When you're consistently around 100 FPS on a 165Hz display, no, you do not need V-Sync. Period.

I've literally read everything you've posted, many times. I've seen those blog posts, I've seen the charts. They have nothing to do with the middle of G-Sync's range. Which is where I'm at.


----------



## Agent-A01

Quote:


> Originally Posted by *shreduhsoreus*
> 
> You're referring to the upper and lower limits of G-Sync's range, which is an entirely different story, and V-Sync is needed for different reasons than you are claiming. I spent hours on end reading about G-Sync before deciding to invest $1200 in monitors.
> 
> You don't have some sort of super human eyes, I'm running SURROUND monitors dude, you would have to literally be blind to not see tearing with this much screen width. Prior to me disabling the Windows feature to turn off monitors after a set amount of time, I had G-Sync turn off without realizing it. I got massive tearing in game and was thinking "Why the hell is G-Sync allowing tearing?". It's because it wasn't even enabled. Fixed that and I haven't seen one bit of tearing. Not once. My framerate ranges between 55-120, far enough from the upper and lower limits of G-Sync's range, eliminating my need for vsync or an FPS limiter.
> 
> Pay attention to the specific lines I quote when I reply. I specifically quoted you saying "You will get tearing without vsync enabled with G-Sync". You made an objective statement, you said "will". No, that is not true. Regardless of what you and your phony super human eyes think. What you should have said is "You may get tearing", because that would have been a true statement.
> 
> I don't know what your deal is that you have to act like such a **** over computers, I've seen you do it multiple times in this thread.


Nobody is acting like a ****.

Maybe I should add more smiley faces for you









But again, you're wrong.

I'll quote the important bit again.
Quote:


> G-SYNC + V-SYNC "Off":
> The tearing *inside the G-SYNC range with V-SYNC "Off" is caused by sudden frametime variances output by the system*, which will vary in severity and frequency depending on both the efficiency of the given game engine, and the system's ability (or inability) to deliver consistent frametimes.
> 
> G-SYNC + V-SYNC "Off" disables the G-SYNC module's ability to compensate for sudden frametime variances, meaning, instead of aligning the next frame scan to the next scanout (the process that physically draws each frame, pixel by pixel, left to right, top to bottom on-screen), G-SYNC + V-SYNC "Off" will opt to start the next frame scan in the current scanout instead. This results in simultaneous delivery of more than one frame in a single scanout (tearing).


The bold is right there, *tearing within gsync's range*

I've given you the data that proves you are wrong.

Being in the lower or upper part of the range only changes where the tearing shows: at the bottom or the top, and everywhere in between.

Like I said before, it's not the normal tearing you see; it's "micro" tearing, which is much harder to notice, especially if it's happening at the bottom of the screen and your eyes are fixated on the middle.

Should I tag jorimt here?
I'm sure he'll be happy to tell you why you're wrong as he has tested this thoroughly with equipment.


----------



## Agent-A01

Quote:


> Originally Posted by *Scotty99*
> 
> I suggest you take the tone down a notch agent, this board is for discussion.
> 
> 
> 
> 
> 
> 
> Tom peterson himself said vsync on with gsync is a "poor choice" and suggests using fast sync in combination with gsync as they are complimentary to each other.


No tone can be read online lol.









Tom Petersen isn't correct; you do not want to enable fast sync with G-Sync.
Quote:


> *So, what about pairing Fast Sync with G-SYNC? Even Nvidia suggests it can be done, but doesn't go so far as to recommend it. But while it can be paired, it shouldn't be…
> 
> Say the system can maintain an average framerate just above the maximum refresh rate, and instead of an FPS limit being applied to avoid V-SYNC-level input lag, Fast Sync is enabled on top of G-SYNC. In this scenario, G-SYNC is disabled 99% of the time, and Fast Sync, with very few excess frames to work with, not only has more input lag than G-SYNC would at a lower framerate, but it can also introduce uneven frame pacing (due to dropped frames), causing recurring microstutter. Further, even if the framerate could be sustained 5x above the refresh rate, Fast Sync would (at best) only match G-SYNC latency levels, and the uneven frame pacing (while reduced) would still occur.
> 
> That's not to say there aren't any benefits to Fast Sync over V-SYNC on a standard display (60Hz at 300 FPS, for instance), but pairing Fast Sync with uncapped G-SYNC is effectively a waste of a G-SYNC monitor, and an appropriate FPS limit should always be opted for instead.
> 
> Which poses the next question: if uncapped G-SYNC shouldn't be used with Fast Sync, is there any benefit to using G-SYNC + Fast Sync + FPS limit over G-SYNC + V-SYNC (NVCP) + FPS limit?
> 
> Blur Buster's G-SYNC 101: Input Lag & Optimal Settings
> 
> The answer is no. In fact, unlike G-SYNC + V-SYNC, Fast Sync remains active near the maximum refresh rate, even inside the G-SYNC range, reserving more frames for itself the higher the native refresh rate is. At 60Hz, it limits the framerate to 59, at 100Hz: 97 FPS, 120Hz: 116 FPS, 144Hz: 138 FPS, 200Hz: 189 FPS, and 240Hz: 224 FPS. This effectively means with G-SYNC + Fast Sync, Fast Sync remains active until it is limited at or below the aforementioned framerates, otherwise, it introduces up to a frame of delay, and causes recurring microstutter. And while G-SYNC + Fast Sync does appear to behave identically to G-SYNC + V-SYNC inside the Minimum Refresh Range (<36 FPS), it's safe to say that, under regular usage, G-SYNC should not be paired with Fast Sync.*


So in short, fast sync will cause microstuttering.

Fast sync is perfect for standard displays where you can keep a stable fps of at least 2x your refresh rate, otherwise you will get microstuttering.

Of course not everyone is able to notice frame hitching but from what I've tested it's terrible with varying FPS.


----------



## Scotty99

Im just gonna say this, never once have i seen tom peterson recommend people use vsync with gsync, he always says set a fps limit or use fast sync.

Ah ok ill take the advice of an angry dude on a forum over the guy nvidia hires to explain this stuff to the public lol.


----------



## shreduhsoreus

Quote:


> Originally Posted by *Agent-A01*
> 
> Nobody is acting like a ****.


Yeah. Ok.

Show me visual evidence of a 165Hz display tearing with a consistent 100FPS instead of quoting the same few paragraphs over and over or other information that's out of the context that we were talking about in the first place.

Or not, I'd rather not continue to communicate with somebody that gets so unpleasant over ******* computers.


----------



## Agent-A01

Quote:


> Originally Posted by *Scotty99*
> 
> Im just gonna say this, never once have i seen tom peterson recommend people use vsync with gsync, he always says set a fps limit or use fast sync.
> 
> Ah ok ill take the advice of an angry dude on a forum over the guy nvidia hires to explain this stuff to the public lol.


Tom Petersen is a director of marketing; he doesn't understand the technicalities the way the engineers do.

It's not my advice, it came from here.

https://www.blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/8/

Quote:


> Originally Posted by *shreduhsoreus*
> 
> Yeah. Ok.
> 
> Show me visual evidence of a 165Hz display tearing with a consistent 100FPS instead of quoting the same few paragraphs over and over or other information that's out of the context that we were talking about in the first place.
> 
> Or not, I'd rather not continue to communicate with somebody that gets so unpleasant over ******* computers.


The only person who is acting unpleasant is you; you can't deal with being wrong.









Frankly, I'm not bothered either way, most people like to be hard headed when they think they are right.
Common occurrence in the tech world.

I mean you can disagree with jorimt all you want, but you're basically saying he is wrong.
Which is kind of rude considering he's proven it with high-speed cameras (a ton of work).

I'll message jorimt to join the discussion; otherwise you'll just believe what you want.
Saves everyone the trouble


----------



## Agent-A01

Quote:


> Originally Posted by *shreduhsoreus*
> 
> Yeah. Ok.
> 
> Show me visual evidence of a 165Hz display tearing with a consistent 100FPS instead of quoting the same few paragraphs over and over or other information that's out of the context that we were talking about in the first place.
> 
> Or not, I'd rather not continue to communicate with somebody that gets so unpleasant over ******* computers.


Out of context?

We are talking about your original "*Vsync is not needed with G-Sync to eliminate screen tearing if you're framerate is always within G-Sync's range"*

Which is wrong and what I've been quoting BB to prove you are wrong.

Now you are saying you want proof of tearing at a consistent 100fps?

Anyways, I have seen tearing in BO3 (in some games it's less noticeable than others) and I can see tearing with vsync off even where FPS is not close to the Gsync ceiling, aka around 120 fps.

Even if I gave visual proof you would say it was improperly configured or something.
Google Gsync tearing and you will see how many people who have noticed this tearing despite having a frame cap.

Anyways, here is a post as well as jorimt on the bottom of the thread.
I suggest you message him as thread is already derailed more than it needs to be.
https://forums.guru3d.com/threads/normal-g-sync-behavior.416759/#post-5471675


----------



## Scotty99

So in your head blur busters are more reputable than nvidia engineers? Clearly that is where tom gets his info to relay to the public, he also seems like a bright guy in the interviews ive seen him do with pcper.

Also yea you are the unpleasant chap in here, less barking orders and more discussion please.


----------



## Agent-A01

Quote:


> Originally Posted by *Scotty99*
> 
> So in your head blur busters are more reputable than nvidia engineers? Clearly that is where tom gets his info to relay to the public, he also seems like a bright guy in the interviews ive seen him do with pcper.
> 
> Also yea you are the unpleasant chap in here, less barking orders and more discussion please.


Blur busters are more reputable than tom P, yes.

Data and proof is there, click on the link.

Otherwise, believe what you want

Do you know how many times marketing has incorrectly stated things about things they never truly understood?
Quite common, AMD is a prime example.


----------



## Scotty99

Quote:


> Originally Posted by *Agent-A01*
> 
> Blur busters are more reputable than tom P, yes.
> 
> Data and proof is there, click on the link.
> 
> Otherwise, believe what you want
> 
> Do you know how many times marketing has incorrectly stated things about things they never truly understood?
> Quite common, AMD is a prime example.


Im just gonna go with tom on this one, he clearly states in the video i linked that vsync in combination with gsync is a "poor choice" and based on my testing that is mostly true. Never once have i witnessed tearing below 165hz and im super sensitive to stutter or tearing.

I would personally advise people to use a fps cap or enable fast sync, there is no scenario i can envision where someone would want to enable vsync with gsync.


----------



## nrpeyton

*Need help* from anyone familiar with ASUS mobos.

I have my 1080Ti in my main slot (closest to CPU).

I am trying to add a 2nd GPU (an old Radeon) for mining.

When I install the Radeon my 1080Ti is no longer recognised in BIOS or in Windows. _(But the Radeon becomes visible)._

If I remove the Radeon the 1080Ti immediately becomes visible again.

The Radeon is going into the other X8 slot on the MOBO.

Thanks,

Nick

P.S

Yesterday I got them both working *momentarily* together (both were showing) by re-installing/uninstalling, resetting to BIOS defaults & powering on/off. But when I repeated the same steps exactly today I can't get it to recognise both.

.


----------



## 6u4rdi4n

Quote:


> Originally Posted by *nrpeyton*
> 
> *Need help* from anyone familiar with ASUS mobos.
> 
> I have my 1080Ti in my main slot (closest to CPU).
> 
> I am trying to add a 2nd GPU (an old Radeon) for mining.
> 
> When I install the Radeon my 1080Ti is no longer recognised in BIOS or in Windows. _(But the Radeon becomes visible)._
> 
> If I remove the Radeon the 1080Ti immediately becomes visible again.
> 
> The Radeon is going into the other X8 slot on the MOBO.
> 
> Thanks,
> 
> Nick
> 
> P.S
> 
> Yesterday I got them both working *momemtarily* together (both were showing) by re-installing/uninstalling resetting to BIOS defaults & powering on/off. But when I repeated the same steps exactly today I can't get it to recognise both.
> 
> .


Sounds like something you should ask Asus about?


----------



## Agent-A01

Quote:


> Originally Posted by *Scotty99*
> 
> Im just gonna go with tom on this one, he clearly states in the video i linked that vsync in combination with gsync is a "poor choice" and based on my testing that is mostly true. Never once have i witnessed tearing below 165hz and im super sensitive to stutter or tearing.
> 
> I would personally advise people to use a fps cap or enable fast sync, there is no scenario i can envision where someone would want to enable vsync with gsync.


What testing did you do?

You want to enable an FPS cap below the Gsync ceiling; Vsync ON enables the frame time compensation (FTC) of the Gsync module.

Plainly, this means the Gsync module's FTC forces a single frame to be drawn in a single scanout of the monitor.

If you disable Vsync (FTC), the GPU will draw as many frames as possible, even within Gsync's range: once a frame has been completed, the next frame is started instantly instead of being forced to wait for, and be synced with, the monitor's scanout.

This results in more than one frame being delivered in a single scanout, AKA 2 frames are merged, which causes a tear.

BTW, I'll note that frame time compensation was originally always on when Gsync was released.

FTC = Vsync on with Gsync; at this time you never got tearing.

The engineers deemed that having FTC on in the drivers was the best case for optimal Gsync usage.
Obviously it is but i digress.

The issue with that was people complained in the forums that their FPS was not going over their refresh rate.
With those complaints, drivers had FTC disabled by default(Vsync off) so FPS would be uncapped.

This happened in the drivers about r340/350 or so.
Funny enough, this started a bunch of people saying that they noticed tearing despite being within Gsync's range.
You can find that by googling gsync tearing.

Like i said, this microtearing is not noticeable by everyone, even ones that think they would notice it.
It's very easy to miss; it's like tearing that lasts for a microsecond.

Very fast paced games and even different engines may make it more or less noticeable.
Quote:


> Originally Posted by *nrpeyton*
> 
> *Need help* from anyone familiar with ASUS mobos.
> 
> I have my 1080Ti in my main slot (closest to CPU).
> 
> I am trying to add a 2nd GPU (an old Radeon) for mining.
> 
> When I install the Radeon my 1080Ti is no longer recognised in BIOS or in Windows. _(But the Radeon becomes visible)._
> 
> If I remove the Radeon the 1080Ti immediately becomes visible again.
> 
> The Radeon is going into the other X8 slot on the MOBO.
> 
> Thanks,
> 
> Nick
> 
> P.S
> 
> Yesterday I got them both working *momemtarily* together (both were showing) by re-installing/uninstalling resetting to BIOS defaults & powering on/off. But when I repeated the same steps exactly today I can't get it to recognise both.
> 
> .


Are you using M.2?

It uses 4x CPU lanes (unless you set it to use the shared SATA ports), which forces your GPU in the 1st slot to drop down to 8x.

Add another GPU to a dedicated 8x slot and it may kick the 1080 Ti off at POST initialization.

Under the BIOS PCH settings (or thereabouts) you can sometimes set each individual slot to 8x/4x/1x etc.
You would want to set the Radeon to 4x.

Since 8x + 4x + 4x = 16 PCIe lanes.

Otherwise you'll have to force the M.2 to use SATA ports instead of CPU lanes
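To make the lane arithmetic concrete, here's a rough sketch in Python; the slot names and widths are hypothetical, assuming a 16-lane consumer CPU like the 7700K mentioned later:

```python
# Hypothetical PCIe lane budget check for a 16-lane consumer CPU.
# Device names and widths are illustrative, not read from any hardware.
CPU_LANES = 16

def fits(allocations):
    """Return True if the requested link widths fit within the CPU's lanes."""
    return sum(allocations.values()) <= CPU_LANES

# M.2 on CPU lanes + GPU at x16 + second GPU at x8: over budget.
over = {"m2_nvme": 4, "gpu_slot1": 16, "gpu_slot2": 8}
# Dropping slot 1 to x8 and the second card to x4 fits exactly.
ok = {"m2_nvme": 4, "gpu_slot1": 8, "gpu_slot2": 4}

print(fits(over))  # False
print(fits(ok))    # True
```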


----------



## Scotty99

Well no you can also use fast sync instead of a fps limiter, again you have tried to bark at people the past 3 pages.....there are options.

This is literally the first time ive ever heard you need vsync enabled to have gsync working, and its from some random on a forum with a peculiar posting style, surely you can understand why you are getting pushback on this one.

Until i see someone from nvidia state you need vsync enabled with gsync to not get tearing i will be leaving it off.


----------



## Agent-A01

Quote:


> Originally Posted by *Scotty99*
> 
> Well no you can also use fast sync instead of a fps limiter, again you have tried to bark at people the past 3 pages.....there are options.
> 
> This is literally the first time ive ever heard you need vsync enabled to have gsync working, and its from some random on a forum with a peculiar posting style, surely you can understand why you are getting pushback on this one.
> 
> Until i see someone from nvidia state you need vsync enabled with gsync to not get tearing i will be leaving it off.


Well you can believe what you want, but this information is on geforce.com

Don't agree with people who have tested it? Ask manuelG or someone like that.

I don't know why you said "it's some random on a forum" when clearly I have restated what someone who tested it has said.
You can also watch Battle(non)sense on youtube if you want easy-to-understand videos saying the exact same thing.

https://forums.geforce.com/default/topic/1002056/understanding-how-g-sync-and-v-sync-work-together/
Quote:


> The "best" configuration is g-sync + vsync on + frame limiter to 2 or more FPS below max refresh (I use 4FPS lower to be on the safe side; so for 144Hz, cap to 140FPS.) In-game limiter is best (most in-game limiters have zero input lag), RTSS is second best (1 frame of input lag at worst), nvidia's limiter (through Inspector) is the worst (two frames of input lag.)
> 
> Using vsync off with g-sync disables a quite important frame time variance compensation feature in the g-sync module, which is why it's not recommended to do it unless you really want an uncapped frame rate. Note that enabling vsync does NOT incur any input lag penalty with g-sync if your max refresh isn't reached. It just ensures frame time variance compensation is active so that you don't get tearing at the top of bottom of the screen when FPS is fluctuating too much. Without vsync enabled, g-sync doesn't try to hide the tear line.
> 
> I won't duplicate all the info here, but you can find the details, including input lag tests (with a 1000FPS camera) here:


Watch this video on tearing recorded with a high speed camera.






FPS Is below Gsync range

Notice the word "partial" tearing.
It's not as noticeable as normal tearing which is why so many people are argumentative.

"I don't see tearing so you are wrong" mentality.


----------



## Scotty99

Again thats a forum post by another random. I want to see a tweet from nvidia or tom peterson that explicitly states what those posts are saying because default nvidia settings are "application controlled" and many games have vsync disabled by default.


----------



## Agent-A01

Quote:


> Originally Posted by *Scotty99*
> 
> Again thats a forum post by another random. I want to see a tweet from nvidia or tom peterson that explicitly states what those posts are saying because default nvidia settings are "application controlled" and many games have vsync disabled by default.


What is a random to you?
Are Battle(non)sense and jorimt, who know more about this technology than most, randoms too?

Anyways, that youtube link above shows proof of what you are refuting.

You won't get an official statement from NV for various reasons; for one, many would consider it a flaw and say they wasted their money on Gsync if they have to enable vsync.
The issue here is that NV should have renamed vsync to something else, as the name causes confusion for so many.

But here is a thread about many people complaining about having tearing within gsyncs range.

https://forums.geforce.com/default/topic/965224/geforce-drivers/gsync-not-working-100-/9/

Jorimt and others have provided enough information to confirm that Vsync ON + FPS limit is the way to go; if nvidia did not agree, manuelG would have posted a direct response.



Beyond this, that's all you will get from me; the information is there and you are fully capable of doing research on your own.


----------



## Agent-A01

Last post from me regarding Gsync unless the discussion stops being pointless.
Pointless being "you and everyone else is wrong and I am right"

This has already been derailed enough


----------



## Scotty99

No, that is far from proof for anything. These are all random people, i have no idea what a jormit is.

This entire thing started when i posted that i use fast sync, to which you replied:
Quote:


> Fast sync does not compliment gsync.
> 
> Don't use it with it.
> Enable vsync and enabel fps limit 3fps below your refresh rate.


Not only is that 100% at odds with everything ive read about fast sync and gsync working in tandem (with a video from tom peterson backing me up), it is also the first time ive heard anyone say you need vsync enabled with gsync to remove tearing.

I do agree this conversation is now pointless, shall be my last response directed at you.


----------



## DarthBaggins

Just snagged a MSI 1080 Ti Aero OC from MicroCenter Black Friday weekend, gotta love their Clearance/Open-Box sales since the card was $600. Just need to snag a block for it, planning on the EK Nickel/Plexi or the Nickel/Acetal. Crappy thing is I'm now out of town for the week so no real chance to enjoy it - can't wait to see what this card can do underwater and OC'd. Debating on selling off my 1080 SC from Temper Tantrum but don't want to part with it (great clocking card)


----------



## nrpeyton

Quote:


> Originally Posted by *Agent-A01*
> 
> What testing did you do?
> 
> You want to enable a FPS cap below gsync ceiling; vsync ON enable frame time compensation of the Gsync module.
> 
> Plainly, this means that gsync module FTC forces a single frame to be drawn in a single scanout of the monitor.
> 
> If you disable VSync( FTC) the GPU will draw as many frames as possible, even within gsyncs range once a frame has been completed, the next frame is started instantly instead of being forced to wait/be synced with the monitor's scanout
> 
> This results in simultaneous delivery of more than one frame scan in a single scanout (tearing), AKA 2 frames are merged which causes a tear.
> 
> BTW I'll let you know that Originally Frame time compensation was always on when Gsync was released.
> 
> FTC = Vsync on with Gsync; at this time you never got tearing.
> 
> The engineers deemed that having FTC on in the drivers was the best case for optimal Gsync usage.
> Obviously it is but i digress.
> 
> The issue with that was people complained in the forums that their FPS was not going over their refresh rate.
> With those complaints, drivers had FTC disabled by default(Vsync off) so FPS would be uncapped.
> 
> This happened in the drivers about r340/350 or so.
> Funny enough, this started a bunch of people saying that they noticed tearing despite being within Gsync's range.
> You can find that by googling gsync tearing.
> 
> Like i said, this microtearing is not noticeable by everyone, even ones that think they would notice it.
> It's very easy to miss; it's like tearing that lasts for a microsecond.
> 
> Very fast paced games and even different engines may make it more or less noticeable.
> Are you using M.2?
> 
> It uses 4x CPU lanes(unless you set it to use shared sata ports) which forces your GPU in 1st slot to drop down to 8x.
> 
> Add another GPU to a dedicated 8x slot and it may force kick the 1080ti off of post intiliazation.
> 
> Under BIOS PCH settings (or around there) you can set each individual slot to be 8x/4x 1x etc sometimes.
> You would want to set the radeon to 4x.
> 
> Since 4x + 4x + 8x = 16 PCIe lanes.
> 
> Otherwise you'll have to force M.2 to use sata ports instead of CPU


Thanks so much for reply.

I'm not using M.2... in fact i don't have anything plugged into any of my PCI-E slots at all except GPU's. Which is why I am so puzzled.

*Edit:* Just got it working. By changing an option in BIOS called "PEG" to *on*.

However I can't seem to get any HDMI output from the Radeon card. Only DisplayPort seems to be working.

P.S.

I'm playing Mass Effect 1 on the old Radeon while I mine on the 1080Ti and CPU. The 7700k seems to be able to keep up with both.


----------



## Agent-A01

Quote:


> Originally Posted by *Scotty99*
> 
> No, that is far from proof for anything. These are all random people, i have no idea what a jormit is.
> 
> This entire thing started when i posted that i use fast sync, to which you replied:
> Not only is that 100% at odds with everything ive read about fast sync and gsync working in tandem (with a video from tom peterson backing me up), it is also the first time ive heard anyone say you need vsync enabled with gsync to remove tearing.
> 
> I do agree this conversation is now pointless, shall be my last response directed at you.


Jorimt is the person that thoroughly tested gsync's technology at blurbusters.com
He wrote the Gsync 101 article.

Fast sync causes mad stuttering, especially on variable fps even with gsync.
I bet many people wish they had your inability to notice those things; it's quite annoying.

Just to humor myself, I watched the fast sync overview with Tom P.

No where does he *recommend* to use Fast sync with Gsync.

I'll quote Tom P
*"this is a technology for games that run well in excess of the refresh rate, as the render rate declines sampling becomes much more problematic"*

AKA, as FPS gets closer to the refresh rate, stuttering becomes problematic.
Below the refresh rate, you still get stuttering.

From that statement alone you can see that it does not complement Gsync.

Again, video proof is here.





With high speed camera results.

Believe what you want though, no need to discuss further as you said.
Quote:


> Originally Posted by *nrpeyton*
> 
> Thanks so much for reply.
> 
> I'm not using M.2... in fact i don't have anything plugged into any of my PCI-E slots at all except GPU's. Which is why I am so puzzled.
> 
> *Edit:* Just got it working. By changing an option in BIOS called "PEG" to *on*.
> 
> However I can't seem to get any HMDI output from the Radeon card. Only displayport seems to be working.
> 
> P.S.
> 
> I'm playing Mass Effect 1 on the old Radeon while I mine on the 1080Ti and CPU. The 7700k seems to be able to keep up with both.


PEG is just short for PCIe Graphics. You would enable that when you have a system with an iGPU.
Unsure if yours is enabled or not; if not it should have auto'd to PEG anyways.

May be a driver issue but you may try updating the vbios on that card.


----------



## Streetdragon

Quote:


> Originally Posted by *becks*
> 
> Common sense...
> 
> Smells like burn ?
> REALY Hot ?
> Makes noises ?


Like on your profil pic?







If it's only showing problems like crashes/reboots... just pray to the Gods of the PSUs that it will live forever!
Quote:


> Originally Posted by *DarthBaggins*
> 
> Just snagged a MSI 1080 Ti Aero OC from MicroCenter Black Friday weekend, gotta love their Clearance/Open-Box sales since the card was $600. Just need to snag a block for it, planning on the EK Nickel/Plexi or the Nickel/Acetyl. Crappy thing is I'm now out of town for the week so no real chance to enjoy it - can't wait to see what this card can do underwater and OC'd. Debating on selling off my 1080 SC from Temper Tantrum but don't want to part with it (great clocking card)


Nice catch! And there its coming.... the thought... go sli.... even if you dont need it..... DO IT









And I'm very happy with gsync without any fps limiters or anything. If I use an fps limiter it crashes some games for me. And V-sync always limits my fps to 60 FPS on my 144Hz panel. Feels bad.

But gsync only works like a charm!


----------



## Leopardi

Quote:


> Originally Posted by *Agent-A01*
> 
> What is a random to you?
> Are battenonsense and jorimt who know technology more than most randoms too?
> 
> Anyways, that youtube link above shows proof of what you are refuting.
> 
> You won't get an official statement from NV for various reasons; one it would be considered a flaw by many saying they wasted their money on Gsync if they have to enable vsync.
> The issues here are is NV should have renamed vsync to something else as that causes confusion by so many.
> 
> But here is a thread about many people complaining about having tearing within gsyncs range.
> 
> https://forums.geforce.com/default/topic/965224/geforce-drivers/gsync-not-working-100-/9/
> 
> Jorimt and others have provide enough infomation to confirm that Vsync ON + FPS limit is the way to go; If nvidia did not agree, manuelG would have posted a direct response.
> 
> 
> 
> Beyond this is all you well get from me; the information is there and you are fully capable of doing research on your own.


Am I reading this graph correctly, that for the least input lag at 144Hz you need to limit fps to 126?


----------



## AngryLobster

Scotty99, I know you have a Meshify C and people seem to think it has good airflow (doesn't) but take your tempered glass side panel off and see if temps drop 5-7C like they did for me.

1080 Tis are not happy in low internal volume cases like that with perpendicular airflow. Also replacing the cases foam filter with a demciflex is like putting the Define C front panel on. Those filters are super restrictive and defeat the purpose of a Meshify C.


----------



## Agent-A01

Quote:


> Originally Posted by *Leopardi*
> 
> Am I reading this graph correct, that for the least input lag on 144Hz you need to limit fps to 126?


No, that graph has nothing to do with input lag.

Only the range of visible tearing that will occur for specific refresh rates under Gsync Vsync off scenario.

I.E. a 144hz monitor will have visible tearing above 125 FPS unless VSync is ON; 100hz above 93fps, etc.

You would limit FPS 3 below your refresh rate with Vsync ON for the lowest input lag.
Use in game FPS limiter where possible or RTSS.
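The "refresh minus 3" rule above is simple to sketch; note the 3 FPS margin is the rule of thumb used in this thread, not an official NVIDIA figure:

```python
# Recommended FPS limit for G-Sync + V-Sync ON: a few frames below the
# monitor's maximum refresh rate (3 FPS is the rule of thumb quoted here).
def gsync_fps_cap(refresh_hz, margin=3):
    return refresh_hz - margin

for hz in (60, 100, 120, 144, 165, 240):
    print(f"{hz}Hz -> cap at {gsync_fps_cap(hz)} FPS")
# e.g. a 144Hz panel gets capped at 141 FPS
```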


----------



## GRABibus

Quote:


> Originally Posted by *Agent-A01*
> 
> No, that graph has nothing to do with input lag.
> 
> Only the range of visible tearing that will occur for specific refresh rates under Gsync Vsync off scenario.
> 
> I.E. a 144hz monitor will have visible tearing above 125 FPS unless VSync is ON; 100hz above 93fps, etc.
> 
> You would limit FPS 3 below your refresh rate with Vsync ON for the lowest input lag.
> Use in game FPS limiter where possible or RTSS.


So, to be sure :
For example in Star Wars Battlefront 2 with my PG278Q (144Hz, Gsync enabled) :
I go in NVIDIA Control panel and select in 3D settings menu the "Star Wars Battlefront 2" profile (starwarsbattlefrontii.exe),
Then I set Vsync "on" for this profile.
I open RTSS and limit the fps to 141 (144 - 3) for starwarsbattlefrontii.exe.
Then I will have the lowest input lag ?


----------



## Agent-A01

Quote:


> Originally Posted by *GRABibus*
> 
> So, to be sure :
> For example in Star Wars Battlefront 2 with my PG278Q (144Hz, Gsync enabled) :
> I go in NVIDIA Control panel and select in 3D settings "Star Wars Battlefront 2" profile (starwarsbattlefrontii.exe), then I set Vsycnh "on".
> I open RTSS and limit the fps to 141 (144 - 3).
> Then I will have the lowest input lag ?


Correct

But for battlefronts case, it has a good ingame FPS limiter.

You would use it.
GameTime.MaxVariableFps 141 via user.cfg or add it to the launch options.


----------



## GRABibus

Quote:


> Originally Posted by *Agent-A01*
> 
> Correct
> 
> But for battlefronts case, it has a good ingame FPS limiter.
> 
> You would use it.
> GameTime.MaxVariableFps 141 via user.cfg or add it to the launch options.


I don't find any user.cfg in Batlefront 2 folder installation game.


----------



## mouacyk

Quote:


> Originally Posted by *GRABibus*
> 
> So, to be sure :
> For example in Star Wars Battlefront 2 with my PG278Q (144Hz, Gsync enabled) :
> I go in NVIDIA Control panel and select in 3D settings menu the "Star Wars Battlefront 2" profile (starwarsbattlefrontii.exe),
> Then I set Vsync "on" for this profile.
> I open RTSS and limit the fps to 141 (144 - 3) for starwarsbattlefrontii.exe.
> Then I will have the lowest input lag ?


I see that you have an ULMB capable monitor and components capable of driving 120fps+ on it. I'd recommend enabling ULMB and turning on Fast Sync. Here's what you get:
1. Motion Clarity at 120fps (try viewing this moving photo with and without ULMB: https://www.testufo.com/photo)
2. No tearing
3. Minimal input lag given 1 and 2

You may need to up brightness manually, if it's too dim for you. As for G-Sync for now (because it doesn't support light strobing), I'd only recommend it for <100fps gameplay.


----------



## Agent-A01

Quote:


> Originally Posted by *GRABibus*
> 
> I don't find any user.cfg in Batlefront 2 folder installation game.


You create a user.cfg or add it to launch options via origin.
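For reference, the whole user.cfg would be the single console-variable line quoted earlier (141 assuming a 144Hz panel; adjust for your refresh rate):

```
GameTime.MaxVariableFps 141
```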


----------



## Gurkburk

Re-seated my Accelero Xtreme on my MSI card & seem to have dropped from ~62°C in the Firestrike stress test to 56-57°C. I'll wait a bit to let it settle, but it's a pretty significant difference.

"Forgot" a "plate" with cooling pads from the MSI cooler which sat on the front of the card, cooling Vram(?) from the front. Placed the pads on the back of those on my big backplate & maybe that had a play in the temp drop.


----------



## becks

How do you guys make 470-480 on vRamBandWidthTest ....I can't go over 440.....no matter what mem OC I try...


----------



## KedarWolf

Quote:


> Originally Posted by *becks*
> 
> How do you guys make 470-480 on vRamBandWidthTest ....I can't go over 440.....no matter what mem OC I try...


You need to turn P2 State to off in Nvidia Inspector.

Do a custom search with my username and Nvidia Inspector and you'll find the correct version you need as an attachment.









Edit: And right click the .exe and Run As Admin.


----------



## becks

Quote:


> Originally Posted by *KedarWolf*


Thank you very much..

Found it.

Have a look and tell me what you think...:


Spoiler: GPU Z









Spoiler: P2 Off







I can push MEM some more... actually much more, over 860+, but the bandwidth doesn't go any higher and the temp load in the loop increases to the point where my fans are at 2600+ RPM, so I play it a little conservative.

Also...do I need to apply the Nvidia Inspector thing after each restart/power on ? or is it a one off...?


----------



## KedarWolf

Quote:


> Originally Posted by *becks*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> 
> 
> Thank you very much..
> 
> Found it.
> 
> Have a look and tell me what you think...:
> 
> 
> Spoiler: GPU Z
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: P2 Off
> 
> 
> 
> 
> 
> 
> 
> I can push MEM some more...actually much more, over 860+ but the bandwidth doesn't go any higher and the temp load in the loop increases to the point where my fans are at 2600++ So I play it a little conservative.
> 
> Also...do I need to apply the Nvidia Inspector thing after each restart/power on ? or is it a one off...?

That's really great memory speeds!!

And yes, the P2 State thing is one time, unless you reinstall your drivers, then you do it one time again.


----------



## KedarWolf

I tested the new EVGA Elite FTW BIOS with 12GBPS memory, the Palit 140 power limit BIOS and the XOC BIOS for how much wattage they pull.

NOTE!! If you flash the latest EVGA BIOS have your Afterburner at defaults, not your old overclock. Or have it not start at boot.

The memory is a much higher clock and you'll crash your PC when Afterburner starts.

Ctrl+L on the curve at your highest voltage point to lock it in Afterburner, then adjust your core and memory so you can see what it's actually running at.

I think I did +142 and it was at 6147 or something, not +642 for the same clocks.

Anyway, the XOC BIOS at 1.1v pulled 300-400W, averaging around 330-340W.

The Palit BIOS pulled 270-350W averaging around 300W.

The new EVGA BIOS pulled 280-360W averaging around 310W.

This was measured using nvidia-smi.

The XOC never went below 300W, peaked at 400W.

The other two would regularly go as low as 270-280W and peak at 340-360W.
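
A rough way to reproduce this kind of measurement: log power with nvidia-smi (e.g. `nvidia-smi --query-gpu=power.draw --format=csv,noheader -l 1 > power.log`) and summarize the samples afterwards. The query flags are standard nvidia-smi, but the exact log format ("305.41 W" per line) is an assumption here; the parser below is just a sketch.

```python
# Summarize power readings captured with:
#   nvidia-smi --query-gpu=power.draw --format=csv,noheader -l 1 > power.log
# Each line is expected to look like "305.41 W". Parsing only; no GPU needed.
def summarize_power(lines):
    watts = [float(line.split()[0]) for line in lines if line.strip()]
    return min(watts), sum(watts) / len(watts), max(watts)

# Hypothetical samples standing in for a real log:
sample = ["298.10 W", "341.55 W", "312.00 W"]
lo, avg, hi = summarize_power(sample)
print(f"min {lo:.1f}W  avg {avg:.1f}W  max {hi:.1f}W")
```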


----------



## Agent-A01

Quote:


> Originally Posted by *KedarWolf*
> 
> I tested the new EVGA Elite FTW BIOS with 12GBPS memory, the Palit 140 power limit BIOS and the XOC BIOS for how much wattage they pull.


Link to the BIOS?

I wonder if they changed the timings on the memory. Those who could only hit 11800 may be stable on that BIOS at 12000.


----------



## KedarWolf

Quote:


> Originally Posted by *Agent-A01*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I tested the new EVGA Elite FTW BIOS with 12GBPS memory, the Palit 140 power limit BIOS and the XOC BIOS for how much wattage they pull.
> 
> 
> 
> Link to bios?
> 
> I wonder if they changed the timings on the memory. Those who could only hit 11800 may be stable on that bios at 12000
Click to expand...

https://www.techpowerup.com/vgabios/195601/195601


----------



## Hl86

Is there a soft (flexible) HB SLI bridge somewhere on the internet?


----------



## KedarWolf

I'm pretty sure you can attach two single flexible regular bridges to both SLI connectors and it'll work the same as a high-bandwidth bridge.


----------



## EarlZ

I placed an order for the MSI Gaming X. How easy is it to get a 2GHz core? Reviews from different YT channels show it boosts to 19xx MHz out of the box.


----------



## 2ndLastJedi

Has anyone had issues running the VR bench in Superposition?
Here's a vid of my issue


----------



## KedarWolf

Quote:


> Originally Posted by *2ndLastJedi*
> 
> Has anyone had issues running the VR bench in Superposition ?
> Here's a vid of my issue


It does the exact same thing to me.









Edit: When I put in the key I bought, it works. Well, except I get an error on the Rift one.


----------



## 2ndLastJedi

Quote:


> Originally Posted by *KedarWolf*
> 
> It does the exact same thing to me.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: When I put in my key I bought it works, well, except I get an error on the Rift one.


Good to know it even happens to a pro








Others get it working on the free version, and it makes me not want to spend the $ if it's not working correctly.


----------



## Scotty99

Quote:


> Originally Posted by *AngryLobster*
> 
> Scotty99, I know you have a Meshify C and people seem to think it has good airflow (doesn't) but take your tempered glass side panel off and see if temps drop 5-7C like they did for me.
> 
> 1080 Tis are not happy in low internal volume cases like that with perpendicular airflow. Also replacing the cases foam filter with a demciflex is like putting the Define C front panel on. Those filters are super restrictive and defeat the purpose of a Meshify C.


I mean, it's a tiny case, so obviously GPU temps are going to be higher; I just didn't expect it to be 10-15C higher than everything I've read on the net about a Strix 1080 Ti. I've decided to limit my FPS to 158 instead of using fast sync because that's the only way I can stay away from 80C. 74C is the max temp I've seen in Overwatch over the past hour, which is disappointing but acceptable.

I'm debating contacting ASUS to ask if I can remove the cooler and reapply the paste; I don't want to void the warranty on an $800 graphics card lol.


----------



## 2ndLastJedi

Quote:


> Originally Posted by *Scotty99*
> 
> I mean, its a tiny case obviously GPU temps are going to be higher i just didnt expect it to be 10-15c higher than everything ive read on the net about a strix 1080ti. Ive decided to limit my FPS to 158 instead of fast sync because that is the only way i can stay away from 80c, 74 is the max temp ive seen on overwatch over the past hour which is disappointing but acceptable.
> 
> Im debating contacting asus to ask them if i can remove the cooler and reapply the paste, i dont want to void warranty on a 800 dollar graphics card lol.


I haven't been following your posts but have you tried undervolting?
I've a Galax 1080 Ti EXOC, a dual-slot card with a small cooler. The max I get at default is about 75C in ROTTR, but if I undervolt to 0.900V @ 1911MHz the highest it goes is 66C, and from default that's actually overclocked from around 186x MHz, so I'm not even losing performance. If I push an OC I can maintain 2088MHz but need to run the fan at 75% to stay at about 73C, so 66C at 1911 is preferable for me.


----------



## 2ndLastJedi

Forgot to mention I play @ 4K, as close to maxed as possible, to hold 60 with vsync. But I have a Rift arriving tomorrow so temps may change after that.


----------



## Scotty99

Quote:


> Originally Posted by *2ndLastJedi*
> 
> I haven't been following your posts but have you tried undervolting?
> Ive a Galax 1080Ti EXOC, a dual slot card with small cooler, max i get default is about 75 C in ROTTR but if i undervolt to 0.900 @ 1911MHz the highest it goes is 66 C and from default thats actually overclocked from around 186xMHz so not even loosing performance although if i push an OC i can maintain 2088MHz and need to run the fan at 75% to stay at about 73C but that 66C and 1911 is preferable for me.


No, but I shouldn't have to do that is the thing. I don't know what % my fans are at, but they max around 1700 RPM after I set my FPS cap to 158.

I just changed the Overwatch preset to medium (which honestly looks the same as ultra to me) and max GPU temps were 70C for a whole game, so that's better.

BTW my card boosts to 1974MHz out of the box; is that a fairly high clock rate for a stock 1080 Ti? I honestly don't look into that stuff.


----------



## demon09

Quote:


> Originally Posted by *Scotty99*
> 
> I mean, its a tiny case obviously GPU temps are going to be higher i just didnt expect it to be 10-15c higher than everything ive read on the net about a strix 1080ti. Ive decided to limit my FPS to 158 instead of fast sync because that is the only way i can stay away from 80c, 74 is the max temp ive seen on overwatch over the past hour which is disappointing but acceptable.
> 
> Im debating contacting asus to ask them if i can remove the cooler and reapply the paste, i dont want to void warranty on a 800 dollar graphics card lol.


I feel like if you turned a game's settings up to where you were closer to 60fps, you would hit the 84C thermal throttle even at 2300 RPM. It's doubtful ASUS will let you repaste though. Overwatch at 2K res and high fps loads my 1080 Ti much less than, say, running Shadow of War at 4K 60fps.


----------



## Scotty99

Ya I dunno, the max I've seen so far is 74C since I put the FPS limiter on; previously I was seeing 80C with fast sync. The core voltage HWiNFO reported was 1.063V, dunno if that's high or not.


----------



## 2ndLastJedi

Quote:


> Originally Posted by *Scotty99*
> 
> No but i shouldnt have to do that is the thing. I dont know what % my fans are at buy they max around 1700 after i set my FPS cap to 158.
> 
> I just changed overwatch preset to medium (which honestly looks same as ultra to me ) and max GPU temps were 70c for a whole game so thats better.
> 
> BTW my card boosts to 1974mhz out of the box, is that a fairly high clock rate for a stock 1080ti, i honestly dont look into that stuff.


Your card seems to boost pretty high at default compared to its rated boost of 1632, considering my EXOC's rated boost is 1645 and it hits 186x something (from memory). Is your 1974 with the power and temp sliders maxed or something?


----------



## Scotty99

Quote:


> Originally Posted by *2ndLastJedi*
> 
> Your card seems to boost pretty high default compared to its default boost of 1632 considering my EXOC boost is 1645 and hits 186x something from memory. Is your 1974 with power and temp slider maxed or something?


No, that's just putting the card in and turning the PC on. I haven't even installed the ASUS GPU utility, which is what's needed to hit its "OC" rated speeds.


----------



## 2ndLastJedi

Quote:


> Originally Posted by *Scotty99*
> 
> No thats just putting the card in and turning the PC on. I havent even installed asus GPU utility which is what is needed to hit its "OC" rated speeds.


Your card seems to be racing along and getting sweaty lol


----------



## Scotty99

Ya, that's the highest stock boost clock I've seen just browsing this thread a bit; this may be a 2100+MHz card, but dunno if I can even OC it lol.


----------



## 2ndLastJedi

Unless you're just benching, 1900-2100 doesn't make that much difference. As I said, I can run @ 2088 happily under 75C, but honestly 1900 is ample for 4K@60 maxed in a lot of games, slightly lowered for others, and it runs soo much cooler.


----------



## Scotty99

Ya, reading through reviews on Newegg, the first page has 3 people saying they're getting 75-80C under load, so my temps aren't out of line whatsoever. I think what got my hopes up was watching video reviewers who probably tested on open-air test benches and/or with vsync on or at 1080p. At 165Hz 1440p these cards get hot, no way around it.


----------



## 2ndLastJedi

But it does seem pretty hot considering the price; I have the cheapest 1080 Ti I could find and have never hit more than 75C with the default fan curve, and 73C with a custom curve and an OC.
Undervolting really is the way to go though!


----------



## Scotty99

Well, you use 60Hz 4K; 165Hz 1440p is a lot harder on a GPU from what I gather. Those 80C temps I mentioned only happen with no FPS limiter; capped at 158, the max I see is 74C with the default fan curve. That said, I was expecting 65C max load temps when I bought the card, but I guess most people don't push theirs as hard as I do.


----------



## Scotty99

When I compare it to my 1060 SSC 6GB it's actually impressive; it draws over twice the power and stays as cool or maybe even a bit cooler.


----------



## 2ndLastJedi

Yeah, I forget that I'm playing at 4K maxed; it just seems the norm now lol. I remember when I first got it I was just amazed that I could do 4K 60, not to mention having everything maxed. It's a good feeling just hitting the Ultra preset and forgetting about it.


----------



## demon09

Quote:


> Originally Posted by *Scotty99*
> 
> Well you use 60hz 4k, 165hz 1440p is a lot harder on a GPU from what i gather. Those 80c temps i mentioned only happen with no FPS limiter, capped at 158 max i see is 74c with default fan curve. That said i was expecting 65c max load temps when i bought the card, but i guess most people dont push theirs as hard as i do.


Quite the contrary: when you venture into higher refresh rates, your CPU ends up being the limit much more than the GPU. I get much lower temps at high-fps 2K than going for 4K 60-75. I agree yours sounds much too hot; run the Heaven benchmark or Superposition at 4K a couple of runs in a row and see what temps you get when your GPU can stay at 99% usage.


----------



## Scotty99

What I meant by that is 1440p 165Hz is more bandwidth than 4K 60Hz. I really can't comment on temps between the two; I'll never own a 4K monitor lol (well, maybe 10 years from now when GPUs can push 144Hz easily at that res).


----------



## demon09

Ya, 2K high refresh is more bandwidth, but 165Hz can be easier said than done to push in some games, and you end up not sitting at 99% utilization like you would at a higher res without a frame limiter. Give this benchmark a run or two in a row at a high res and see what your temps are: https://benchmark.unigine.com/superposition. My bet is either your fans go faster than 2300 RPM, which is fast for the stock curve, or you hit thermal throttle, or both, since you're already a good bit warmer than you should be for that card.


----------



## Scotty99

I'm actually not convinced I'm warmer than I should be though. I'm reading results around the net and 75C is not out of order for this particular model, the Strix 1080 Ti OC edition. I've been playing OW now for about 3 hours and the max reported temp is 70C at the 1440p medium preset (75C at ultra). I'm sure a thermal paste redo would give me 2-4C, but I don't think it's gonna give me 8-10C; this card, I think, runs 72C-ish when pushed on a 165Hz 2K monitor like mine.

I will give those benches a shot later tho and report back; I'm still on the stock CPU cooler, so I'm more worried about CPU throttling lol.


----------



## 6u4rdi4n

Guess my Asus Strix ain't too bad. 1974Mhz @ 1.08V out of the box. Highest temperature I saw it reach on air was 67C. Fans on auto, so it was silent. Actually the first "stock" GPU cooler I've experienced that didn't make me wanna go water cooling instantly.

But of course, I had to. Running on water now, I'm currently at 2100MHz @ 1.093V and +500MHz on the memory. I haven't seen it go past 38C. When gaming, I'm usually down in the 32-34C area. I'm not done with it yet, but I have been busy just enjoying it for gaming.

I imagine it will take some time to find the "sweetspot" for it. I have to go through the voltage range and see where the card thrives. Pascal is a funny chip for those who like to tweak stuff endlessly.

I've had two 1080s before I upgraded to this 1080 Ti. The first 1080 mysteriously died, but for normal use like games, it would do around 2088 @ 1.093V without hitting the power limit (FE PCB models), but the one I got as a replacement would be hitting the power limit at around 1.03V-1.04V in the same scenarios. The last one did clock a bit better though, as I was running it somewhere above 2100MHz @ 1.01V before I sold it and upgraded.

Will report back when I get around to tweak my 1080 Ti some more!


----------



## Scotty99

Quote:


> Originally Posted by *demon09*
> 
> ya 2k high res is more bandwidth but usually with higher frame rates 165hz can be easier said then done to push in some games and end up not sitting at 99% utilization like what would happen at a higher res without a frame limiter. give this bench mark a run or two in a row at a high res and see what your temps are https://benchmark.unigine.com/superposition my bet is either you fans go faster then 2300 rpm which is fast for stock curve or you hit thermal throttle or both. since you are allready a good bit warmer then you should be for that card


This is after one run on 1080p extreme:



http://imgur.com/6gvFxuP


I can't use the stress test portion because I'm on the free version, but if I let it run non-stop my guess is that it would settle at around 75C.

Too high? Honestly I have no idea; results are all over the place for this card.


----------



## demon09

Quote:


> Originally Posted by *Scotty99*
> 
> This is after one run on 1080p extreme:
> 
> 
> 
> http://imgur.com/6gvFxuP
> 
> 
> I cant use the stress test portion because free version, but if i let it run non stop my guess is that it would settle at around 75c.
> 
> Too high? Honestly i have no idea, results are all over the place for this card.


Run it at 4K; the 1080p benchmark still won't fully stress the GPU.


----------



## Scotty99

Sure gotta do my WoW dailies first tho









In WoW it never goes above ~67C; that's kinda what I was expecting from the card in any game when I purchased it.



http://imgur.com/gmUqp


2C higher at 4K. /shrug


----------



## EarlZ

Just installed my MSI 1080 Ti Gaming X. I'm getting the following clocks/temps; dunno if they're within expectations:

The card idles at 34-35C
It boosts to 1949 @ 1.062V
It drops to 1924 upon hitting 62C (temp limit set to 90C) @ 1.050V
It further drops to 1911 when hitting 69-70C @ 1.043V
I believe my ambient is around 28-30C, which is normal for my country

If I leave the fan speed on auto it will drop to around 1898 and get to 75-76C.
I'm using a custom fan curve and run @ 95%+ fan speed. Opening my side panel and pointing a house fan at it (which moves a decent amount of air) only drops my GPU around 3-5C, so I'd say my case airflow is okay.
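
The stepping described above matches how Pascal's GPU Boost sheds small clock bins as temperature crosses internal thresholds. Here's a toy model using only the numbers reported in this post; the thresholds are observations from this card, not an NVIDIA spec:

```python
# Toy model of temperature-based boost stepping, using the clocks/temps
# reported above. Thresholds are observed values, not official ones.
STEPS = [  # (temp threshold in C, boost clock in MHz)
    (0, 1949),
    (62, 1924),
    (69, 1911),
]

def expected_clock(temp_c):
    """Return the clock for the highest threshold this temperature has crossed."""
    clock = STEPS[0][1]
    for threshold, mhz in STEPS:
        if temp_c >= threshold:
            clock = mhz
    return clock

print(expected_clock(50))  # 1949 MHz, below the first drop
print(expected_clock(65))  # 1924 MHz
print(expected_clock(72))  # 1911 MHz
```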


----------



## demon09

Quote:


> Originally Posted by *Scotty99*
> 
> Sure gotta do my WoW dailies first tho
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In WoW it never goes above ~67c, thats kinda what i was expecting for the card for any game when i purchased it.
> 
> 
> 
> http://imgur.com/gmUqp
> 
> 
> 2c higher at 4k. /shrug


I guess you could turn Overwatch up to epic and 200% render scale and stay around the target range for a better test, since the Superposition test really isn't long enough.


----------



## swingarm

I want to get an AIO liquid cooling system for my EVGA 1080 Ti SC Black Edition card. I know about EVGA's own solution, but I want to know if someone else has something similar?


----------



## EarlZ

Is there a way to restore the voltage curve MSI AB initially detects? Pressing Ctrl+D lowers the actual curve. Maybe a file I can delete?


----------



## demon09

Quote:


> Originally Posted by *EarlZ*
> 
> Is there a way to restore the voltage curve MSI AB initially detects, pressing CTRL+D lowers the actual curve. maybe a file i can delete?


Ctrl+F lets you edit the voltage curve as a whole. If you want it to stick at a certain voltage, say 1.075V, make it a flat line after 1.075.
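
To make the flat-line trick concrete, here's a sketch of what it does to the voltage/frequency curve: every point at or above the chosen voltage gets clamped to the frequency at that voltage, so the card never requests more. The curve values below are made-up illustrative numbers, not real Afterburner data.

```python
def flatten_after(curve, v_lock):
    """Clamp a (voltage, MHz) curve so points at or above v_lock share one frequency."""
    locked = max(mhz for v, mhz in curve if v <= v_lock)
    return [(v, mhz if v < v_lock else locked) for v, mhz in curve]

# Hypothetical curve points (volts, MHz), sorted by voltage:
curve = [(1.000, 1898), (1.050, 1936), (1.075, 1962), (1.093, 2000)]
print(flatten_after(curve, 1.075))
# the 1.093 V point is now clamped to 1962 MHz
```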


----------



## EarlZ

Quote:


> Originally Posted by *demon09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *EarlZ*
> 
> Is there a way to restore the voltage curve MSI AB initially detects, pressing CTRL+D lowers the actual curve. maybe a file i can delete?
> 
> 
> 
> Ctrl+f let's you edit the voltage curve as a whole. If you want it to stick on a certain voltage say 1.075 make it a flat line after 1.075
Click to expand...

I just need it to be whatever it defaulted to before I pressed Ctrl+D, like pure stock.


----------



## demon09

Quote:


> Originally Posted by *EarlZ*
> 
> I just need it to be whatever it was defaulted to before I pressed CTRL+D like pure stock


Click the little reset button at the bottom, but Ctrl+D does the same thing.


----------



## EarlZ

Quote:


> Originally Posted by *demon09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *EarlZ*
> 
> I just need it to be whatever it was defaulted to before I pressed CTRL+D like pure stock
> 
> 
> 
> click the little reset button at the bottom but cntrl d does the same thing
Click to expand...

I'm sure I noticed a slightly different curve before I pressed Ctrl+D.


----------



## demon09

Quote:


> Originally Posted by *EarlZ*
> 
> Im sure I noticed a slightly different curve before I press CTRL+D


Worst case, delete the app and reboot.


----------



## KedarWolf

Quote:


> Originally Posted by *EarlZ*
> 
> Is there a way to restore the voltage curve MSI AB initially detects, pressing CTRL+D lowers the actual curve. maybe a file i can delete?


Try uninstalling Afterburner, completely deleting its program folder, then reinstalling it.


----------



## becks

Something strange is happening with my 1080 Ti... or v-sync... or games... idk

Sometimes (I always play in borderless windowed mode as I have 3 screens), if I alt-tab or click out of the main screen where I play and do some browser stuff, open a document or whatever, the game (most commonly WoW) drops to 30 fps (an in-game setting so it doesn't crank up the GPU while not in use), but when I click back into the game and resume, fps never goes back up to 100 and I'm left with a clunky game, and I have to alt-tab a million times or completely restart the game to get it working...

Has anyone had this?

Disabling v-sync doesn't help... tried it already.

Also I thought my OC wasn't stable, so I played with it, with no success... it does it even at stock.


----------



## EarlZ

Quote:


> Originally Posted by *KedarWolf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *EarlZ*
> 
> Is there a way to restore the voltage curve MSI AB initially detects, pressing CTRL+D lowers the actual curve. maybe a file i can delete?
> 
> 
> 
> Try uninstalling Afterburner, completely deleting the programs/afterburner folder, then reinstalling it.
Click to expand...

Still doesn't restore it; I guess it doesn't matter much in the long run. Is the default curve stored somewhere on the card?


----------



## EarlZ

Quote:


> Originally Posted by *becks*
> 
> Something strange is happening with my 1080 ti...or v-sync.... or games...idk
> 
> Sometimes (I always play in Window mode - Border-less as I have 3 screens ) If I alt tab or click out of the main screen where I play and do some browser staff or open or document or whatever...the game (Most common with Wow) drops to 30 fps (in-game setting so it does not crank up GPU while not using it...) but when I click back in Game and resume.....fps never go back up to 100 and I'm left with a clunky game and have to alt-tab for a million times or completely reset the game to get it working...
> 
> Anyone has/had this ?
> 
> Disabling v-sync doesn't help ..tried it already
> 
> Also I thought my OC isn't stable so played with it with no success... does it even at stock...


Try changing the power mode under NVIDIA CP from power saving to adaptive and see if that helps.


----------



## demon09

Quote:


> Originally Posted by *EarlZ*
> 
> Still doesnt restore it, I guess it doesnt matter much in the long run. Is the default curve read somewhere on the card?


It's built into the BIOS, and Afterburner doesn't touch that, so you must be noticing something else.


----------



## DStealth

Quote:


> Originally Posted by *Scotty99*
> 
> This is after one run on 1080p extreme:
> 
> 
> 
> http://imgur.com/6gvFxuP
> 
> 
> I cant use the stress test portion because free version, but if i let it run non stop my guess is that it would settle at around 75c.
> 
> Too high? Honestly i have no idea, results are all over the place for this card.


Quote:


> Originally Posted by *Scotty99*
> 
> Sure gotta do my WoW dailies first tho
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In WoW it never goes above ~67c, thats kinda what i was expecting for the card for any game when i purchased it.
> 
> 
> 
> http://imgur.com/gmUqp
> 
> 
> 2c higher at 4k. /shrug


Improve your scores, m8... this CPU with a well-OCed 1080 Ti at 2088-2100/12000+ should be aiming at 6600+ in 1080p Extreme and 11k in 4K Optimized...

Edit: 24/7 undervolted GPU @ 2088/12600


----------



## Scotty99

Everything is at stock, dude lol; that was just to show temps.


----------



## DStealth

And who cares at stock @overclock.net







Push this card; your CPU seems fine and won't hold it back


----------



## Scotty99

lol for sure; tomorrow is when this dumb Deepcool AIO is supposed to show up









Side note, this 1080 Ti stays super cool in less demanding games; I played WoW all day today, max temp 65C. That's on ultra with a 158 fps cap and G-Sync. My 1060 would get to ~72C, and it has a pretty beefy cooler for a 1060 (EVGA SSC two-fan card).


----------



## CptSpig

Quote:


> Originally Posted by *swingarm*
> 
> I want to get a AIO liquid cooling system for my EVGA 1080 Ti SC Black Edition card, I know about EVGA's own solution but I want to know if someone else has something similar to that?


EKWB has full-cover module water blocks, for better cooling than EVGA's solution.

https://www.ekwb.com/shop/aio/ek-mlc/gpu-module?p=2


----------



## demon09

Quote:


> Originally Posted by *Scotty99*
> 
> lol for sure tomorrow, thats when this dumb deepcool aio is supposed to show up
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Side note, this 1080ti stays super cool on less demanding games, played WoW all day today max temp 65c. Thats on ultra with 158 fps cap and gsync. My 1060 would get to ~72c and it has a pretty beefy cooler for a 1060, evga ssc 2 fan card.


what happens if you play overwatch at 200% resoultion scale ? Would be an easy way to do a longer stress test for temps


----------



## KedarWolf

Quote:


> Originally Posted by *swingarm*
> 
> I want to get a AIO liquid cooling system for my EVGA 1080 Ti SC Black Edition card, I know about EVGA's own solution but I want to know if someone else has something similar to that?


And you can get a 120mm to 360mm rad/pump combo, or two, to add to it.

And a CPU block.

Get a backplate as well; it's absolutely needed.

https://www.ekwb.com/shop/catalogsearch/result/?q=mlc

Buy some Fujipoly 17.0 W/mK Ultra Extreme XR-m thermal pads and put it all together for top-notch cooling.

https://www.amazon.com/s/ref=nb_sb_noss?url=search-alias%3Daps&field-keywords=Fujipoly+17.0+W%2FmK

The MLC line has quick disconnects, so it's very easy to add/remove cards, rad/pumps, etc.

The easiest way to get into custom cooling.









Edit: I have a picture of exactly where to put the thermal pads when using EK blocks on the GPU; if anyone needs it, let me know.

There are a few extra places that help that EK doesn't mention.


----------



## swingarm

Ha, I posted something here a day or two ago asking for an AIO water cooling alternative to EVGA's for my EVGA 1080 Ti SC Black Edition, and in a roundabout way you guys answered it.


----------



## CptSpig

Quote:


> Originally Posted by *swingarm*
> 
> Ha, I posted something here a day or two ago asking for an AIO water cooling alternative for my EVGA 1080 Ti SC Black Edition to the one from EVGA and in a round-a-bout way you guys answered it.


Glad we could help. We both use these products and are very happy customers. I cool a 7980XE at a 4.4 core / 3.0 cache OC with a Titan X, all on one Predator 360; temps are 27C idle and 40C max gaming.


----------



## CptSpig

Mistake


----------



## KedarWolf

Quote:


> Originally Posted by *CptSpig*
> 
> Quote:
> 
> 
> 
> Originally Posted by *swingarm*
> 
> Ha, I posted something here a day or two ago asking for an AIO water cooling alternative for my EVGA 1080 Ti SC Black Edition to the one from EVGA and in a round-a-bout way you guys answered it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Glad we could help. We both use these products and are very happy customers. I cool a 7980Xe at 4.4 core / 3.0 cashe OC with a Titan X all on one Predator 360. temps 27c idle and 40c max gaming
Click to expand...

Did you take the sponge out of your 360? It disintegrates and leaves flecks of sponge all through the loop.









I'm doing it this weekend; I'm getting the MLC 360 rad/pump Thursday.

I'll open and clean the GPU block, hose and QDCs, then put it back together. My GPU will be on the MLC.









Then, when I get my Cryofuel, I'll take the Predator completely apart, remove the sponge, and completely flush and clean it.

The Predator will be for the CPU only.


----------



## CptSpig

Quote:


> Originally Posted by *KedarWolf*
> 
> Did you take the sponge out of your 360? It disintegrates and leaves flecks of sponge all through it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm doing it this weekend, getting the MLC 360 rad/pump Thursday.
> 
> Opening and cleaning the GPU block, hose and QDCs putting it back together. MY GPU will be on the MLC.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Then when I get my Cryofuel, taking the Predator completely apart, removing the sponge and completely flushing and cleaning it.
> 
> The Predator will be for the CPU only.


I have not removed the sponge. No issues so far; it idles @ 27C and maxes at 40C gaming with the 7980XE 390 ci @ 4.4GHz and a Titan XP with a full-cover block (182/698 OC) on the loop. I have a bottle of Cryofuel concentrate if anyone needs some.


----------



## EarlZ

What's a safe set-and-forget memory OC for 1080 Tis? +400?


----------



## becks

Okay... my problem is still present.
Gaming at 97-100 fps... if I do something on my second screen, when I click back into the main screen and start playing I get anywhere from 30-60fps with a lot of tearing, and I have to alt-tab a million times till it gets back to 97-100...

Tried setting v-sync on/off... no change.
Tried changing it to max performance under NCP.
P2 is off through Nvidia Inspector.

Reinstalled the driver... (didn't use DDU but checked the "clean install" box).
Played with the OC... but it does it even at stock.
Reinstalled DirectX... .NET Framework... bla bla.

Will try DDU + an older driver today...
Any other suggestions, guys? It mostly happens while playing WoW...

Apparently I am not alone..and some other games are badly affected:

https://www.reddit.com/r/7cpuji/1080ti_alttab_stuttering_fix/

The problem is that not everyone has a GTX 1080 Ti, and especially not 2 or 3 screens with a 1080 Ti...
So we might get a fix, or not...


----------



## Bartholdi

Quote:


> Originally Posted by *becks*
> 
> Oky... my problem is still present.
> Gaming at 97-100 Fps...if I do something in my second screen when I click back in main screen and start playing I get anywhere from 30-60fps with lot of tearing and I have to alt-tab it for a million times till it gets back to 97-100....
> 
> Tried setting v-sync on/off... no change
> Tried changing it to max performance under NCP.
> P2 is off trough nvidiainspector
> 
> Reinstalled driver....(not used DDU but checked the "clean install" box)
> Played with Oc....but even at stock it does it..
> Reinstall DirectX...net framework...bla bla..
> 
> Will try DDU + an older driver today...
> Any other suggestions guys ? it mostly happens while playing Wow...
> 
> Apparently I am not alone..and some other games are badly affected:
> 
> __
> https://www.reddit.com/r/7cpuji/1080ti_alttab_stuttering_fix/
> 
> The problem is not everyone has a gtx 1080 ti and especially not 2 or 3 screens with a 1080 ti..
> So we might get a fix or not...


Was it a clean driver install?


----------



## Scotty99

I don't know if it was the removal of the filter, the addition of the two fans, or installing my AIO, but my Strix is now running at "normal" temps. I've been playing Destiny 2 for an hour now and max temps are 67C at 1440p high details, usually 100-140 fps in game (a 1080 Ti is surprisingly not enough for this game at ultra).


----------



## DarthBaggins

Quote:


> Originally Posted by *Scotty99*
> 
> I dont know if it was the removal of the filter, the addition of the two fans, or installing my AIO but my strix is now running at "normal" temps. Playing destiny 2 for an hour now and max temps are 67c, 1440p high details usually 100-140 fps in game. (a 1080ti is surprisingly not enough for this game at ultra).


You might want to check your settings then. I hit 100-144 with a GTX 1080 OC'd to 2113/4563 and maintain temps around 45C max (under a water block, of course). But I'll have to run it on my newly acquired Ti once I get back home from closing up the Gallery for the winter. Somehow I think you might not have something optimized, or a background process is hindering your card's output (or your card is running at stock clocks/settings).


----------



## Scotty99

Quote:


> Originally Posted by *DarthBaggins*
> 
> You might want to check your settings then, I hit 100-144 w/ a GTX 1080 OC'd to 2113/4563 and maintain temps around 45c max (under a water lock of course). But I'll have to run it on my newly acquired Ti once I get back home from closing up the Gallery for the winter. Somehow I think you might not have something optimized or a background process is hindering your card's output (or your card is running at stock clocks/settings)


Nah, it's an MMO; the fps is going to vary drastically depending on where you are, or even where you point the camera. If I'm just cruising around a planet I'm at 150 fps or more; public events go down to 80 or so (although that's mostly CPU in those areas).

Side note, this card is going back to Newegg, not for temps but for coil whine. I've never had a video card exhibit this, and for the first half hour I thought it was the AIO I installed today, but it's 100% coil whine. The pitch even changes depending on what I'm looking at in game; it's not super loud, but it's louder than my case fans.


----------



## Scotty99

BTW, with the highest settings D2 only averages 110 FPS with a 1080 Ti:





This is why I play on high


----------



## becks

Quote:


> Originally Posted by *Bartholdi*
> 
> Was it a clean driver install?


Yes, I did a clean install.

Tried playing with it some more yesterday... v-sync on/off,
P2 on/off...

When I do something outside the game screen... it drops the game to 30 fps, and if I monitor GPU 3D usage it drops to 0%-25%... when I click back in game it gets stuck there and never goes back up to 90-100% usage...

At the end I got it sorted somehow... it only happens in Window-mode or Border-less Window-mode ...in full-screen is working really great. Just a pity I have to minimize the game every time I want to do smt on the other screens...


----------



## feznz

Quote:


> Originally Posted by *becks*
> 
> Yes, did a clean install..
> 
> Tried playing with it some more yesterday... V-sync on/off,
> P2 on/off...
> 
> When I do something outside the game window, it drops the game to 30 fps, and if I monitor GPU 3D usage it drops to 0-25%... when I click back into the game it gets stuck there and never goes back up to 90-100% usage...
> 
> In the end I got it sorted, somehow... it only happens in windowed or borderless-windowed mode; in full-screen it works really well. Just a pity I have to minimize the game every time I want to do something on the other screens...


At least you found a workaround. I had a similar problem with my GTX 580s; I did fix it in the end....................... unfortunately the fix was a clean install of Windows. I'm sure it's something silly, but the time spent finding the problem vs. a clean install is a gamble.

Personally, I used to do a clean install every 12 months. Now that I've gotten better with OC (which can corrupt Windows) and resist the temptation of pirated programs, my last install lasted 5 years before I hit a problem.


----------



## becks

Quote:


> Originally Posted by *feznz*


A clean OS install might be what I do next... I'm getting too many random errors and it's triggering my OCD..

But I'm holding back on it at the moment, as I'm in the process of looking for a new case... something that can hold 2x 360 mm rads.


----------



## KedarWolf

Quote:


> Originally Posted by *becks*
> 
> Quote:
> 
> 
> 
> Originally Posted by *feznz*
> 
> 
> 
> A clean OS install might be what I do next... I'm getting too many random errors and it's triggering my OCD..
> 
> But I'm holding back on it at the moment, as I'm in the process of looking for a new case... something that can hold 2x 360 mm rads.
Click to expand...

Thermaltake Core X5 or Core X9.

Just know the motherboard sits horizontal, flat on a tray.


----------



## becks

Quote:


> Originally Posted by *KedarWolf*
> 
> Thermaltake Core X5 or Core X9.
> 
> Just know the motherboard sits horizontal, flat on a tray.


I was leaning more towards the 900 Pro... black/orange.









But it's just a thought... Think about all the headache and overhaul...

The M8I in that huge case will look like a nut in the wall...
So a new MB... maybe a new CPU...
New rads... new fans... new loops...

I'll try to part out my rig and see if it's worth it...
Maybe a side-grade rather than an upgrade...


----------



## becks

Quote:


> Originally Posted by *KedarWolf*


Ohh!!...feel so angry now...!!!









Ever since the first day I started Heaven, it ran at 25-35 fps in the first 3 scenes... I always assumed that was "how it works", and after those scenes it goes up to 99-100 fps and stays there... I JUST realised that's not normal!
But it's not a driver crash either... and I can't see anything in the Windows logs... so back to square one..


----------



## KCDC

I am having a frustrating issue where MSI AB won't allow me to adjust core offset per GPU.

I untick sync, then set my offsets accordingly; however, when I run a benchmark, both cards still run at the lowest setting instead of their individual settings as they should.

I haven't tried any other OC tuners yet... I'd like to still use AB, but if I can't tune my cards individually, then.. sigh..

On XOC Bios BTW, but this also happened on stock FE bios.


----------



## 2ndLastJedi

https://www.nvidia.com/en-us/titan/titan-v/
Place your orders here!


----------



## mfdoom7

Quote:


> Originally Posted by *2ndLastJedi*
> 
> https://www.nvidia.com/en-us/titan/titan-v/
> Place your orders here!


so the Titan V has 14.9 TFLOPS of FP32 compute power?
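For reference, that headline number is just arithmetic: FP32 throughput = CUDA cores × 2 FMA ops per clock × boost clock. A quick sketch (core counts and boost clocks taken from NVIDIA's published spec pages; treat them as nominal figures, real sustained clocks vary):

```python
# Rough FP32 throughput estimate: cores * 2 ops per clock (one FMA) * clock.
def fp32_tflops(cuda_cores: int, boost_mhz: float) -> float:
    # MHz * 1e6 gives Hz; dividing by 1e12 gives TFLOPS, hence / 1e6 overall.
    return cuda_cores * 2 * boost_mhz / 1e6

# Titan V: 5120 CUDA cores, 1455 MHz boost (per NVIDIA's product page)
titan_v = fp32_tflops(5120, 1455)
# 1080 Ti FE: 3584 CUDA cores, 1582 MHz boost
gtx_1080_ti = fp32_tflops(3584, 1582)

print(round(titan_v, 1))      # ~14.9 TFLOPS
print(round(gtx_1080_ti, 1))  # ~11.3 TFLOPS
```

The tensor-core TFLOPS figure NVIDIA quotes is a separate number and doesn't come from this formula.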


----------



## DarthBaggins

That price though


----------



## Streetdragon

But.... the shipping is free!!!!!


----------



## DStealth

How nice: in the EU the price is 3100 euro, or 3638.75 US dollars.
Sounds like a deal... this card will be fast for sure, but for plain gaming it's better to wait for the 2080 or 2080 Ti.








Very tempted by that free shipping, BTW


----------



## SavantStrike

Quote:


> Originally Posted by *2ndLastJedi*
> 
> https://www.nvidia.com/en-us/titan/titan-v/
> Place your orders here!


I get the feeling we're being milked here. The 1180 Ti is probably going to be $1,200, given how expensive the new Titan is.


----------



## RoBiK

Quote:


> Originally Posted by *DarthBaggins*
> 
> That price though


That depends entirely on the intended use. If your workload needs FP32 and cannot use FP16, or if you are not severely memory-bandwidth limited, then this would be a poor investment. On the other hand, if your workload can utilize those tensor cores and/or needs the HBM2 bandwidth, then the price is actually pretty great.


----------



## SavantStrike

Quote:


> Originally Posted by *RoBiK*
> 
> That depends entirely on the intended use. If your workload needs FP32 and cannot use FP16, or if you are not severely memory-bandwidth limited, then this would be a poor investment. On the other hand, if your workload can utilize those tensor cores and/or needs the HBM2 bandwidth, then the price is actually pretty great.


Which part of double the price of the predecessor is great? HBM isn't THAT expensive.


----------



## mouacyk

Was there some kind of missed grand announcement?


----------



## RoBiK

Quote:


> Originally Posted by *SavantStrike*
> 
> Which part of double the price of the predecessor is great? HBM isn't THAT expensive.


I did not state that double price vs predecessor is great... so that question doesn't make much sense to me.


----------



## CptSpig

Quote:


> Originally Posted by *mouacyk*
> 
> Was there some kind of missed grand announcement?


Yes, the TitanV is now selling on Nvidia.


----------



## RoBiK

Quote:


> Originally Posted by *mouacyk*
> 
> Was there some kind of missed grand announcement?


https://www.nvidia.com/en-us/titan/titan-v/


----------



## aberrero

Quote:


> Originally Posted by *EarlZ*
> 
> What is like the safe set and forget memory OC for 1080Ti's +400 ?


It can be +500. I'd set +530, stress test it, then back down to +500.


----------



## nrpeyton

Anyone know how to force a 1080 Ti into a *low-power P-state?*

(As we all know, cards don't always idle at proper low clocks/voltages...) If there were a *console command* I could use to *change the power state*, that would solve the issue. _(I could even keep it in a batch file for quickness.)_

Guys, *any ideas?*

_(Using the curve won't work because the lowest selectable voltage in Afterburner is 0.8 V (800 mV).)_

Thanks.
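For what it's worth, the usual trick on Pascal is NVIDIA Inspector's command-line P-state switch, which can live in a batch file exactly as described. A hedged sketch that only builds the command strings; the install path is hypothetical, and the `-forcepstate:<gpu>,<pstate>` syntax should be verified against your Inspector version before you rely on it:

```python
# Sketch: build batch-file lines that force GPU 0 into P8 (low power)
# and then hand control back to the driver. Assumes NVIDIA Inspector's
# -forcepstate:<gpuindex>,<pstate> command-line flag; on most versions
# pstate 16 restores automatic management -- check your copy.
INSPECTOR = r"C:\Tools\nvidiaInspector.exe"  # hypothetical install path

def forcepstate_cmd(gpu_index: int, pstate: int) -> str:
    # Quote the path so it survives spaces in a .bat file.
    return f'"{INSPECTOR}" -forcepstate:{gpu_index},{pstate}'

low_power = forcepstate_cmd(0, 8)   # P8: low idle clocks/voltage
restore = forcepstate_cmd(0, 16)    # back to automatic P-state handling

print(low_power)
print(restore)
```

Saving those two lines as `low.bat` and `auto.bat` gives exactly the one-click toggle asked about above.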


----------



## bajer29

What are you using to OC?


----------



## nrpeyton

MSI Afterburner


----------



## bajer29

Darn, I was going to say I know that K Boost on OCX will push the card to not down-clock... Is there a setting in Afterburner that does something similar?


----------



## nrpeyton

I am trying to do the opposite.

I.e. I want to be able to set the voltage to *0.4 V* _(for instance)_ *OR* force the card into a *low-power state* so it automatically selects that lower voltage.

The lowest voltage you can select through MSI Afterburner is 0.8 V. I need to go lower.

Is anyone aware of any other apps that let you go lower than 0.8 V?


----------



## bajer29

Right; if that were the case, I would tell you to turn off K-Boost.

What happens if you lower the power target (to, say, 60%)?


----------



## nrpeyton

Quote:


> Originally Posted by *bajer29*
> 
> Right, if that was he case I would tell you to try to turn off K-Boost.
> 
> What happens if you try to lower the power target (to specifically 60%)?


Nothing; the card isn't under load anyway.

For example:

When the card is *properly* idling, it sits at *0.65 V at 139 MHz*.

I want to be able to tell it to do that _at will_, i.e. on command.


----------



## KedarWolf

I'm starting a 12 Step support group for people addicted to ethical hacking.

Yes, it'll be called 'Anonymous Anonymous'.


----------



## GRABibus

I ran a nice experiment:
I took apart my Gigabyte GTX 1080 Ti Gaming OC 11G and mounted on it (with a lot of difficulty) the AIO cooler from the Gigabyte AORUS GTX 1080 Ti Xtreme Waterforce.

Results: maximum core temp = 40°C in benchmarks (with the fan at 100%) at 2114 MHz / 1.093 V

I didn't even get these results with the actual Gigabyte AORUS GTX 1080 Ti Xtreme Waterforce.

I did it because my Gigabyte GTX 1080 Ti Gaming OC 11G is a good clocker.
I flashed the card with the ASUS XOC BIOS... which I'm now sure was useless, as I can get the same results with the stock BIOS (to be confirmed).
Now I can sustain 2101 MHz in games with 40°C max on the core, which is nice (ambient temperature = 21°C).


----------



## hotrod717

With Pascal it's more about the chip sample and cooling. People get stuck on voltage. It's been well established at this point that voltage alone isn't going to work wonders. Getting a card under 20°C does more than voltage. Voltage won't help in most cases unless you are going subzero.


----------



## GRABibus

Quote:


> Originally Posted by *hotrod717*
> 
> With Pascal it's more about the chip sample and cooling. People get stuck on voltage. It's been well established at this point that voltage alone isn't going to work wonders. Getting a card under 20°C does more than voltage. Voltage won't help in most cases unless you are going subzero.


Yeah.
The only thing is that when I set the fan to 100%, for example, the core temperature stays below 40°C.
That's great, but the voltage on the card never exceeds 1.081 V (even with the ASUS XOC BIOS, and even with the voltage slider at 100% in MSI AB).
Then, since I set my core to 2101 MHz (+111 MHz offset in MSI AB), there can be some instability (voltage too low for 2101 MHz).

Now, if I set the fan to auto, the core temperature is higher, and I can see the voltage increase up to 1.112 V while the frequency stays at 2101 MHz, with a max core temp of 60°C.
At 1.112 V and 2101 MHz, it is more stable.

Is it normal that the voltage doesn't exceed 1.081 V with the core temp at 40°C?
Is it normal that the voltage increases up to 1.112 V with core temps at 60°C?

The higher the temperature, the higher the voltage?
I would have expected the opposite effect....
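A plausible explanation is GPU Boost 3.0's thermal binning: as the core warms past internal thresholds it sheds clock bins, and the driver can move to a higher voltage point on the V/F curve to hold a target clock, so a warmer card showing more voltage is expected rather than backwards. A toy model of the bin-dropping side (the threshold, step, and bin size below are illustrative assumptions, not NVIDIA's actual tables):

```python
# Toy model of Pascal GPU Boost thermal binning: above a temperature
# threshold, the effective boost clock drops one ~13 MHz bin roughly
# every few degrees. All constants are illustrative assumptions.
def boosted_clock(max_clock_mhz: int, temp_c: float,
                  threshold_c: float = 37.0, step_c: float = 5.0,
                  bin_mhz: int = 13) -> int:
    if temp_c <= threshold_c:
        return max_clock_mhz  # cool card holds its top bin
    bins_dropped = int((temp_c - threshold_c) // step_c) + 1
    return max_clock_mhz - bins_dropped * bin_mhz

print(boosted_clock(2114, 35))  # cool: full clock
print(boosted_clock(2114, 60))  # warm: several bins shed
```

Under this model, feeding slightly more voltage at 60°C to hold 2101 MHz is the driver compensating for bins it would otherwise drop, which matches the behavior described above.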


----------



## 12Cores

Picked up two straps with the latest BIOS update; went from a 2063 to a 2088 stable overclock. I don't mean to keep beating a dead horse, but it's clear that the 1.093 V cap is crippling these cards under water. I spent an hour in The Division's dark zone at 2088 MHz averaging about 35°C. Any sane person would conclude that with those temps and two 8-pin connectors, performance is being left on the table. I'm pretty confident this card would hit 2200 if I had access to more voltage; I'm not really sure why the AIBs weren't allowed to add more voltage to their cards.


----------



## EarlZ

Ugh... looks like I'm easily hitting a power limit on the MSI GTX 1080 Ti Gaming X. I only have +60 on core (2012-2025 MHz) @ 1.050 V and +500 on memory, and the slider maxes out at 117%. I haven't noticed a game hitting the power limit, but Time Spy Extreme and Fire Strike Extreme (both 1080p) can spike up to 120%.
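For context, the power-limit slider is just a percentage of the board's base power target, so those percentages translate to watts like this (250 W is the 1080 Ti FE reference figure; an AIB board like the Gaming X may use a different base in its BIOS, so treat that as an assumption):

```python
# Convert an Afterburner power-limit percentage into watts,
# given the board's base power target from its BIOS.
def limit_watts(base_tdp_w: float, percent: float) -> float:
    return base_tdp_w * percent / 100.0

print(limit_watts(250, 117))  # slider max on many 1080 Ti cards
print(limit_watts(250, 120))  # the 120% spikes logged above
```

So a 120% reading against a 250 W base means the card is momentarily pulling around 300 W, which two 8-pins plus the slot can supply but the BIOS will throttle against.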


----------



## MrWhispers

** deleted post **

reason: No help from the community


----------



## EarlZ

I just saw Buildzoid's video explaining the Pascal problem where his GPU still runs at a high clock speed at extremely low voltages but also tanks performance. That was on a 1070; does that 'issue' exist on the 1080 Ti?


----------



## hotrod717

Quote:


> Originally Posted by *12Cores*
> 
> Picked up two straps with the latest bios update, went from 2063 to 2088 stable overclock. I don't mean to keep beating a dead horse but it is clear that the 1.093v is crippling these cards under water, I spent an hour in the division dark zone at 2088mhz averaging about 35c. Any sane person would conclude that with those temps and two 8 pin connectors performance is being left on the table, pretty confident that this card would hit 2200 if I had access to more voltage, not really sure why the AIB's were not allowed to add more voltage to their cards.


I can tell you it wouldn't. The only thing that would is -90 to -120°C; I can speak from experience. Same clocks on my card: 2088-2126 without going subzero, 2224+ at -90°C. In fact these cards lose a bin when frozen, and top voltage goes from 1.092 V down to 1.05 V. This was on a vanilla card. Also, unless people are going gaga over the KPE, voltage doesn't help on air or water.


----------



## nrpeyton

Quote:


> Originally Posted by *hotrod717*
> 
> I can tell you it wouldn't. The only thing that would is -90 to -120°C; I can speak from experience. Same clocks on my card: 2088-2126 without going subzero, 2224+ at -90°C. In fact these cards lose a bin when frozen, and top voltage goes from 1.092 V down to 1.05 V. This was on a vanilla card. Also, unless people are going gaga over the KPE, voltage doesn't help on air or water.


I'm not sure I agree regarding the voltage. Pascal just doesn't like it; I don't see much of a clock-speed increase between 1.1 and 1.2 V.

Temperature seems to matter a lot, though: roughly 2 MHz for every 1°C.

I can get 2189 MHz at 1.1 V _(1.093 V)_ if I keep the *at load* temp below 20-25°C.



Anyone else lose money this week with Nicehash miner?

(I had my wallet with them as well).









My wallet got completely emptied, and the "24hr news statement" is still the only message the company/site has left us. It's been 3-4 days.


----------



## lilchronic

Quote:


> Originally Posted by *hotrod717*
> 
> I can tell you it wouldn't. The only thing that would is -90 to -120°C; I can speak from experience. Same clocks on my card: 2088-2126 without going subzero, 2224+ at -90°C. *In fact these cards lose a bin when frozen, and top voltage goes from 1.092 V down to 1.05 V. This was on a vanilla card. Also*, unless people are going gaga over the KPE, voltage doesn't help on air or water.


I've noticed that just from going water-cooled. On air I would see up to 1.08 V; water-cooled, with max temps of 30°C, the card never goes past 1.05 V. Now, with the Strix BIOS @ 1.15 V I can hit 2138 MHz, but at 1.05 V I can only hit 2063 MHz, so voltage does help a little bit.
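From those two data points, the voltage scaling on that particular sample works out to roughly 7.5 MHz per 10 mV; a quick check (two points from one card, so treat the slope as anecdotal, not a Pascal-wide rule):

```python
# Linear clock-vs-voltage slope from two (voltage, clock) observations.
def mhz_per_volt(v1: float, f1: float, v2: float, f2: float) -> float:
    return (f2 - f1) / (v2 - v1)

# lilchronic's figures: 2063 MHz @ 1.05 V, 2138 MHz @ 1.15 V
slope = mhz_per_volt(1.05, 2063, 1.15, 2138)
print(round(slope))  # MHz gained per extra volt on this sample
```

That is a small return next to the temperature effect discussed above, which is consistent with the "cooling beats voltage on Pascal" argument.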


----------



## madmeatballs

Quote:


> Originally Posted by *nrpeyton*
> 
> Anyone else lose money this week with Nicehash miner?


I lost about 120 USD in the hack. It was my first try; I was about to get a payout the next day. I sure hope they fix everything soon. Currently trying to mine some VTC using the OCM, lol. I should've known sooner that the 1080 Ti could make me some $$$.









Anyway, I hope they really come back soon.

My card was making 79 MH/s; what about yours?


----------



## sblantipodi

Quote:


> Originally Posted by *2ndLastJedi*
> 
> https://www.nvidia.com/en-us/titan/titan-v/
> Place your orders here!


LOL, limit 2 per customer.


----------



## alucardis666

Quote:


> Originally Posted by *madmeatballs*
> 
> I lost about 120USD from the hack. It was my first try, I was about to get a payout next day. I sure do hope they fix everything soon. Currently trying to mine some VTC using the OCM lol. I shouldve known sooner that the 1080ti could make me some $$$ earlier tho.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway, I hope they really come back soon.
> 
> My card was making 79MH/s what about your?


And I was upset over the $40 I lost.


----------



## 6u4rdi4n

As long as we're talking about air cooling or plain water cooling, less voltage can sometimes yield higher frequencies than more voltage.
Quote:


> Originally Posted by *alucardis666*
> 
> And I was upset over the $40 I lost.


That's fair. I'm "upset" with the ~$25 I lost


----------



## THEROTHERHAMKID

I'm in the group with a Palit JetStream 1080 Ti.


----------



## EarlZ

Can anyone confirm whether this is normal: there's a scene in Time Spy graphics test 2, where you see those futuristic rifles a few seconds in, that power-limits my card. MSI AB records it as 119%, and it throttles my core from 2012 MHz down to 1960+. Core voltage is 1.025 V with +500 MHz on memory; this is with the benchmark on max sliders (custom) at 4K resolution. Seems like the 1080 Ti is power starved?


----------



## StenioMoreira

Founders 1080 Ti VRAM overclocking is causing shutdowns; quick, hard, power-failure-style shutdowns. FYI, the card is shunt-modded.

I've got a Kraken G10 on it with a Corsair closed-loop cooler, little heatsinks on the RAM and VRM, plus the bracket's little fan. I even added fans and put a copper sheet on the back of the card, right behind the RAM chips, which gets hot to the touch.

How likely is it that the VRAM overclock is causing the shutdowns? It seems like the VRAM's VRM, or the VRAM itself, might be too hot, or the VRM is exceeding its amp limits; whatever a VRM is rated for, it delivers less and less as it heats up. I have cooling on it, but I just wanted to know how likely that is. GTX 1080 Ti Founders, 1.000 V on core for 1999 MHz in game, and insane VRAM clocks: +770-790 on memory, and yes, it scales performance. I haven't found the point where the gains stop; I hit the power limit before it. I get 11,000+ on the Time Spy graphics score......

I know, very weird. The other 1080 Ti Founders card I had couldn't even do +450 on VRAM without crashing; this one can only do +800 with fans on top of and below it at max rpm. Even then it sometimes shuts down automatically, so it's all power and heat related.

Yeah, it's some insane VRAM clocking. You've got to treat it like the GPU die itself.

BTW, I'm pretty sure I'm peaking around 400 W when I run 2088-2062 on core with +770ish on VRAM.


----------



## EarlZ

VRAM overclocking on the 1080 Ti is weird. I get the following results in Time Spy graphics tests 1 and 2; no crashing, no artifacts. Though I sense something else is wrong or missing... is it really common for +850 to be stable?

+500
68.15
61.67

+600
68.47
61.92

+700
68.46
62.09

+750
68.92
62.10

+800
68.92
61.75

+850
69.16
62.33

I did not try +900
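Plugging those numbers in, the scaling is real but tiny; a quick calculation of each offset's fps gain over the +500 baseline (using the GT1 figures listed above):

```python
# Percent fps gain of each memory offset over the +500 baseline,
# using the Time Spy graphics test 1 numbers from the post above.
baseline = 68.15
results = {500: 68.15, 600: 68.47, 700: 68.46,
           750: 68.92, 800: 68.92, 850: 69.16}

gains = {offset: round((fps / baseline - 1) * 100, 2)
         for offset, fps in results.items()}
print(gains)
```

The +850 run is only about 1.5% ahead of +500, well within run-to-run variance for 3DMark, which is why several posters here stick with a conservative offset.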


----------



## DStealth

Was hoping Volta would be a revolution...
https://www.3dmark.com/compare/fs/14415213/fs/14382680#
It's fast, but comparing my result to a 2 GHz/1000 OC'd Titan V on the same platform, in FS there's only a 10% difference on average for 4 times the price... let's hope the gaming model is better, performance- and OCing-wise.

In TS, on the other hand, a 2100 MHz/1050 Titan V is 40% better than my OC'd 1080 Ti.


----------



## StenioMoreira

@EarlZ yeah; as I suggested before, there's a limit to where you see performance, and beyond it is just extra power, heat, and even decreased performance. To me, what's surprising is the consistently linear performance increase no matter what clock I use on the VRAM. +800 usually means an auto-shutdown; if I had a real water block I could keep testing, but lol, it would go well beyond 400 W. http://www.3dmark.com/spy/2922992 is as far as I can go with my current cooling; the card hasn't shown any sign of hitting its VRAM scaling limit, the power gives out first.

@DStealth, yes, Volta is disappointing as of now, obviously on price and even on performance. It's only 15% faster than a GTX 1080 Ti in DX11; where it looks to be a game changer is the field NVIDIA previously struggled in, or stagnated in: the low-level API stuff, DX12/Vulkan, where AMD made huge leaps and NVIDIA often performed worse than in its own DX11. The Titan V is up to 35% faster than a GTX 1080 Ti on those low-level APIs. The first Pascal Titan came out early too, so this Titan V looks like the next generation's counterpart to Pascal's first Titan; the thing is, though, it's 2.5x the price of Pascal's first Titan.

This isn't surprising (the price, that is). It's a huge die, which cuts into yields; more can go wrong, and then you add HBM2... speaks for itself. I reckon there's a good chance the consumer cards will break pricing records as well; a 10% jump is pretty bad, lol. One thing looks clear though: they will overclock like the legendary Maxwell generation, where 20% overclocking was average if you were willing to mod.

Although we can only expect further lockdown of BIOS and soft mods; every new generation gets worse and worse for our tweaking freedom. So I'm sure it won't allow voltage modding through software hacks or BIOS edits like Maxwell did. But even without that, the overclocking reviews are looking insane.

I honestly won't be upgrading though; this card can do everything I need: high frames in competitive games, smooth frames in 4K games. I honestly don't need anything more, but who knows ^^


----------



## EarlZ

What's a good benchmark aside from Time Spy for validating a VRAM OC? One that's extremely repeatable and can show the slightest performance gain or loss?


----------



## Gunslinger.

Quote:


> Originally Posted by *DStealth*
> 
> Was hopping Volta to be revolution...
> https://www.3dmark.com/compare/fs/14415213/fs/14382680#
> It's fast but comparing my result to the 2Ghz/1000 OCed TitanV running the same platform in FS only 10% difference on average for 4 times the value ...let's hope the gaming model will be better performance and OCing wise
> 
> 
> 
> 
> 
> 
> 
> 
> In TS on the other hand 2100Mhz/1050 TitanV is 40% better than [email protected]


Maybe the Titan V is CPU bottlenecked in Fire Strike vs. Time Spy.


----------



## Gunslinger.

Quote:


> Originally Posted by *EarlZ*
> 
> Whats a good benchmark aside from time spy that is suitable to validate VRAM oc, one that should be extremely reliable and can show the slightest performance gain or loss?


Probably a benchmark that has a 4K option.


----------



## DarthBaggins

Superposition is good


----------



## EarlZ

Ill give that a try, standard or extreme preset?


----------



## DarthBaggins

I'd go Extreme


----------



## EarlZ

Ill give that a try, standard or extreme
Quote:


> Originally Posted by *DarthBaggins*
> 
> I'd go Extreme


It doesn't seem to be a very good test, as I'm hitting the 120% power limit a lot in 8K Optimized, even dropping clocks down to 1886 MHz (GPU temp not exceeding 68°C); with 4K Optimized it still hits the power limit, but a lot less. I guess I'll just stick with the +500 MHz OC and be happy with it?


----------



## Masterchief79

I got myself a 1080 Ti Lightning Z. Upgraded from a 980 Ti. It's just ridiculous.

It's running at 0.981 V as of right now. 0.96 V ran fine in BF1 as well, but I think I had an artifact in Project CARS 2 yesterday.
The max seems to be 2130 MHz @ 1.08 V, although that sometimes crashes in FSX. So the sweet spot is rather around 2 GHz.


----------



## Agent-A01

Quote:


> Originally Posted by *EarlZ*
> 
> Whats a good benchmark aside from time spy that is suitable to validate VRAM oc, one that should be extremely reliable and can show the slightest performance gain or loss?


There is no benchmark to do that.

To validate 100% stability you need a test that fills VRAM up completely.

Timespy or anything else like that only uses 2-4gb, not near enough.

Just because a few chips on the card will do +900 that does not mean all of them will do that.

That's the thing, so many people claim their card will do +1000 or something but I can guarantee it would crash in a game like COD WW2 or something like that which will use all VRAM.
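Agent-A01's point can at least be planned numerically: to stress every memory chip you want buffers covering most of the 11 GB while leaving headroom for the driver and OS. The sketch below only does the allocation arithmetic; actually touching the memory would need a CUDA-backed allocator (PyTorch, for example), which is outside this sketch:

```python
# Plan a near-full VRAM fill for memory-OC stability testing:
# how many fixed-size buffers cover most of the card while leaving
# headroom for the OS/driver/framebuffer.
def fill_plan(total_gib: float, headroom_gib: float, chunk_mib: int) -> int:
    usable_mib = (total_gib - headroom_gib) * 1024
    return int(usable_mib // chunk_mib)

chunks = fill_plan(11.0, 1.0, 256)  # 256 MiB buffers on an 11 GB card
print(chunks)  # buffers needed to touch ~10 GiB
```

Allocating that many buffers and running repeated copies/checksums through them exercises all the chips, which a 2-4 GB benchmark workload never does.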


----------



## mouacyk

Quote:


> Originally Posted by *Masterchief79*
> 
> 
> I got myself a 1080Ti Lightning Z. Upgraded from a 980Ti. It's just ridiculous
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's running at 0.981 V as of right now. 0.96 V ran fine in BF1 as well, but I think I had an artifact in Project CARS 2 yesterday.
> Max seems to be 2130MHz @ 1.08V although that seems to crash in FSX sometimes. So sweetspot rather around 2GHz.


I hope you're getting a proper WB for that LZ. It just doesn't feel right with a universal cooler and no active VRM/VRAM cooling when you're drawing power over 8+8+8 pins, supposedly with an unlocked TDP.


----------



## Silent Scone

Quote:


> Originally Posted by *Masterchief79*
> 
> 
> I got myself a 1080Ti Lightning Z. Upgraded from a 980Ti. It's just ridiculous
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's running at 0.981 V as of right now. 0.96 V ran fine in BF1 as well, but I think I had an artifact in Project CARS 2 yesterday.
> Max seems to be 2130MHz @ 1.08V although that seems to crash in FSX sometimes. So sweetspot rather around 2GHz.


My ROG Strix OC does 2125 @ 1.082v. You'll find a lot of these cards top out around the same.


----------



## Hydroplane

Merry Christmas everyone


----------



## KedarWolf

Quote:


> Originally Posted by *Hydroplane*
> 
> Merry Christmas everyone


I hope you enjoy the Winter Solstice.


----------



## EarlZ

Quote:


> Originally Posted by *Agent-A01*
> 
> Quote:
> 
> 
> 
> Originally Posted by *EarlZ*
> 
> Whats a good benchmark aside from time spy that is suitable to validate VRAM oc, one that should be extremely reliable and can show the slightest performance gain or loss?
> 
> 
> 
> There is no benchmark to do that.
> 
> To validate 100% stability you need a test that fills VRAM up completely.
> 
> Timespy or anything else like that only uses 2-4gb, not near enough.
> 
> Just because a few chips on the card will do +900 that does not mean all of them will do that.
> 
> That's the thing, so many people claim their card will do +1000 or something but I can guarantee it would crash in a game like COD WW2 or something like that which will use all VRAM.
Click to expand...

I don't have COD WW2, but does it use something like 6-8 GB even at 1080p?


----------



## Agent-A01

Quote:


> Originally Posted by *EarlZ*
> 
> I dont have COD WW2 but does it use something like 6-8GB even on 1080p ?


It does as you can downsample 4k if you wanted. It has in game options for that.


----------



## EarlZ

Quote:


> Originally Posted by *Agent-A01*
> 
> Quote:
> 
> 
> 
> Originally Posted by *EarlZ*
> 
> I dont have COD WW2 but does it use something like 6-8GB even on 1080p ?
> 
> 
> 
> It does as you can downsample 4k if you wanted. It has in game options for that.
Click to expand...

Does it also have an in-game benchmark so I can tell if I'm still gaining anything with my OC? And is the campaign at least good, before I buy it, lol!


----------



## hotrod717

Quote:


> Originally Posted by *nrpeyton*
> 
> I'm not sure I agree regarding the voltage. Pascal just doesn't like it. I don't see much of a clock-speed increase between 1.1 & 1.2v.
> 
> Temperature seems to matter a lot though. roughly 2Mhz for every 1 degree C.
> 
> I can get 2189Mhz at 1.1v _(1.093v)_ if I keep the *at load* temp below 20-25c
> 
> 
> 
> Anyone else lose money this week with Nicehash miner?
> 
> (I had my wallet with them as well).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My wallet got completely emptied. And the "24hr news statement" is still the only message left to us by the company/site. It's been 3/4 days.


So, actually, you do agree with what I'm saying.

Quote:


> Originally Posted by *lilchronic*
> 
> I've notice that just from going water cooled. On air i would see up to 1.08v and water cooled with max temps 30c the card never goes past 1.05v. Now with strix bios @ 1.15v i can hit 2138Mhz but at 1.05v i can only hit 2063Mhz. so voltage does help a little bit.


I'm definitely speaking in general terms. We can't see ASIC quality anymore, but I'm sure different cards exhibit somewhat different efficiencies. Remember, with the 780 Ti, 980, and 980 Ti it usually took at least 1.35-1.45 V to see a big jump in clocks. As the dies get smaller and more efficient, you can't do that without LN2 anymore, like you could get away with on a 780 Ti and a decent loop.


----------



## sblantipodi

Quote:


> Originally Posted by *Hydroplane*
> 
> Merry Christmas everyone


Hi, is 5960X a bottleneck for the two Ti?


----------



## DarthBaggins

Lol, hope you’re not serious.


----------



## Neo_Morpheus

Some of the games (like downscaling from 4k) are pushing the card so hard, I keep it at 110% power limit. Still alright, I can get an overclock in. It's going to run somewhere under 2000Mhz in games anyways.

Merry Christmas, Lightning!!


----------



## gta1989

What are you guys seeing for temps on hybrid cards vs full-block 1080 Ti cards? I have an MSI FE card with an EVGA hybrid kit and Arctic Silver 5 thermal paste that runs about 49°C, and an EVGA SC2 Hybrid that runs about 52°C (still on stock thermal paste). Wondering if I should sell them and get two full-block cards, or if it won't make much of a difference.


----------



## 6u4rdi4n

Quote:


> Originally Posted by *gta1989*
> 
> What are you guys seeing for temps on hybrid cards vs full block 1080ti cards. i have a msi FE card with a evga hybrid kit on it with arctic silver 5 thermal paste that runs about 49c and i have a evga sc2 hybrid that runs about 52c, stock thermal paste on that one still. wondering if i should sell them and get 2 full block cards or if it wont make much of a difference.


Well, my card usually runs at 30-33C with a custom loop. Does it make a huge difference? Probably (most definitely?) not.

If you would like a custom loop because of the custom loop, then I will say go for it.


----------



## gta1989

I haven't been able to test whether I can get any more core clock out of them with a 20°C drop, since I would now need another 360 rad and GPU blocks. I'm sure it still comes down to the GPU lottery anyway.


----------



## ExoticallyPure

*New thread to move to for everyone that can afford to have the best of the best:
*http://www.overclock.net/t/1644172/official-nvidia-titan-v-owners-club

All the rest with a _404,640 second-old card_ can remain _here_.


----------



## EarlZ

Quote:


> Originally Posted by *Neo_Morpheus*
> 
> Some of the games (like downscaling from 4k) are pushing the card so hard, I keep it at 110% power limit. Still alright, I can get an overclock in. It's going to run somewhere under 2000Mhz in games anyways.
> 
> merry christmas lighting!!


Thats my current problem with the MSI Gaming X, I am hitting the 120% on some games down scaled, 1080p benchmarks like TimeSpy graphics test 2 and firestrike are also hitting the 120% limit.


----------



## gta1989

Try flashing the XOC BIOS; it's working great for me, no limits anymore. Before, I could get 2075 MHz, but as soon as the card was pushed hard it would power-limit down to 2025 MHz. Now I hold a steady 2075 MHz that never fluctuates, with much better benchmark results. I use the curve in MSI Afterburner so I can also set my voltage at specific clocks.


----------



## Hydroplane

Quote:


> Originally Posted by *sblantipodi*
> 
> Hi, is 5960X a bottleneck for the two Ti?


Not sure, but a 7980XE is not


----------



## StenioMoreira

I could keep 2100MHz on my card if the temp never exceeded 33°C; I imagine a nice custom loop with a lot of liquid in it could achieve that without the fans needing to go crazy. Given that roughly every 4-5°C causes a drop in clocks, I could probably go further with even lower temps, but my loop isn't very OP. Still, my card is a much better core clocker than the big majority from what I've seen online. Like JayzTwoCents said, with 1080 Tis it's not about winning the lottery, it's about not losing it XD. As crazy as it sounds, a whole lot of 1080 Tis can't hold 2000MHz or even reach it.

I don't own a lot of games, and benchmarks are sexy but not realistic because they aren't very long. The game I've found that tests core stability more than any other is The Witcher 3, and for VRAM overclocks I'd say Assassin's Creed Unity. I don't know how much VRAM it uses up, and frankly I don't care to check; all I know is that AC Unity is very sensitive to a VRAM overclock. And again, The Witcher 3 is amazing for testing core stability: I can pass all benchmarks at 50MHz higher on the core but fail almost instantly in The Witcher 3. I also noticed The Witcher 3 draws a lot, a lot of power.

EDIT: Used the XOC BIOS for the first time and went up to 2167MHz with a ****load of voltage, and my score on every benchmark was worse than my original BIOS at 2088MHz. PROVES YET AGAIN what I previously thought: not all MHz are created equal lol. Worse scores for 100 more watts or so.







Not good, boyz.

And fans at 90% for the rads LOL. My 2088MHz + 770 VRAM best score was 11,075 I think; I'll have to load it, but I had it posted a page back.


----------



## l1CappYl1

Skimming through overclocking pages, it seems I got a very good GPU batch. Asus Strix OC, no BIOS flash, stock software and cooling. My GPU runs at 2109MHz (set to 1800) with +800 on the memory. Temps under 60°C with fans under 50%. Time Spy score 1115 (71.3/64.6 fps).


----------



## Abaidor

Quote:


> Originally Posted by *l1CappYl1*
> 
> Skimming through overclock pages and. It seems i got a very good GPU batch. Asus strix oc, no bios flash and packaged software and cooling. My GPU runs at 2109mhz (set to 1800) and running +800 on the memory. Temps under 60°c with fans under 50%. Timespy score 1115 (71.3/64.6 fps).


Could you post any info regarding the production lot/batch? I received the same card three days ago but can't test it until I have all the parts for my new build.


----------



## EarlZ

Quote:


> Originally Posted by *gta1989*
> 
> Try flashing xoc bios. It's working great for me no limits anymore. Before I could get 2075mhz but as soon as it pushed hard it would power limit to 2025mhz. Now I have a steady 2075mhz never fluctuates and much better results on benchmarks. I use the curve on Msi afterburn so I can set my voltage at certain clocks also


Where can I read more about this XOC BIOS? I only found it on the first page, with no further details.


----------



## l1CappYl1

Quote:


> Originally Posted by *Abaidor*
> 
> Could you post any info regarding the production lot / batch? I received the same card 3 days ago but can't test until I have all the parts for my new build.


Sure, I'll have a look when I get home.


----------



## StenioMoreira

Yeah, I think we ought to stick to our original BIOS, or at least a BIOS for our own models. Clearly the XOC clocks I'm getting are bad; they work in everything, but they score much worse. I'd call that not a real overclock. My 1080 Ti FE with the Asus XOC BIOS clocks 2150+ and scores like you-know-what, while on the original BIOS at a max of 2088 I get over 11,000 in Time Spy but have power-limit issues. The nice thing about the XOC BIOS is that there's no power-limit issue whatsoever. LUCKY ASUS STRIX OWNERS!!!!


----------



## l1CappYl1

Sure, I'll have a look when I get home later tonight.
Quote:


> Originally Posted by *StenioMoreira*
> 
> Yeah I think we ought to maybe, stick to our original Bios or at least our own models, clearly the XOC clocks im getting are bad, though they work in everything, they score much worse. Id call this not a real overclock. My 1080ti fe with Asus XOC bios clocks 2150+ and scores like you know what, while on original bios with max 2088 I get over 11000 + timespy but have power limit issues, Nice thing about XOC bios is, there is now power limit issue at all whats so ever. LUCKY ASUS STRIX OWNERS!!!!


No power limit? I could be wrong, but I believe they're capped in the BIOS at 330W. When I run nvidia-smi.exe, the power draw always reads x/330W.
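For anyone who wants to watch the same draw/cap pair, nvidia-smi can be queried for just those two fields; here is a minimal Python sketch (the query flags are standard nvidia-smi options, but the wrapper and sample numbers are my own):

```python
import subprocess

def parse_power(csv_line):
    """Parse one line of `nvidia-smi --query-gpu=power.draw,power.limit
    --format=csv,noheader,nounits` into a (draw_watts, limit_watts) pair."""
    draw, limit = (float(field.strip()) for field in csv_line.split(","))
    return draw, limit

def read_power(gpu_index=0):
    """Ask the driver for the live draw/cap pair (requires an NVIDIA GPU + driver)."""
    out = subprocess.check_output(
        ["nvidia-smi", "-i", str(gpu_index),
         "--query-gpu=power.draw,power.limit",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_power(out.strip())

# Example of the CSV shape nvidia-smi emits with those flags:
print(parse_power("287.44, 330.00"))  # (287.44, 330.0)
```

Logging this in a loop while benchmarking makes it easy to see whether the card is pinned against its 330W cap.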


----------



## pez

Quote:


> Originally Posted by *gta1989*
> 
> What are you guys seeing for temps on hybrid cards vs full block 1080ti cards. i have a msi FE card with a evga hybrid kit on it with arctic silver 5 thermal paste that runs about 49c and i have a evga sc2 hybrid that runs about 52c, stock thermal paste on that one still. wondering if i should sell them and get 2 full block cards or if it wont make much of a difference.


Running a Titan Xp, but similar TDP and heat output. Currently, with the fan spinning lower than stock, I hit 50-56°C depending on the game. This is with about a +150MHz GPU OC (it sits around 1974-2000MHz in games). I could run the fan faster and trade noise for a couple more bin steps, but it's not worth it to me.


----------



## Unnatural

Any advice/tips for running an FE (EVGA) in SLI with this model?


----------



## gta1989

I found this thread very helpful when overclocking two different cards: https://linustechtips.com/main/topic/764949-50mv-sli-voltage-difference-and-how-to-fix-it/ . If the cards have different base clocks, SLI runs them at the same speeds, but one of mine was pushing 50mV more voltage, which was just heating up the card.


----------



## StenioMoreira

Quote:


> Originally Posted by *l1CappYl1*
> 
> Sure. Ill have a look when i get home later tonight.
> No power limit ? I could be wrong but i believe they are capped in the bios at 330w. When i run nvidia-smi.exe, the power draw is always x/330w.


Yeah, no: XOC has none, at least not for me. I ran 1.200V on it just recently; no shutdowns, no power issues, nothing. It was drawing 500-plus-ish watts.


----------



## gta1989

Same; I also confirmed with a wall power meter that on my FE card and EVGA SC2 Hybrid there is no power limit. But be careful, as you can overpower hybrid cards pretty easily when pushing 330+ watts. I found that at 350+ watts, temps start to run away over time; the cooler can't keep up.


----------



## StenioMoreira

Quote:


> Originally Posted by *gta1989*
> 
> same i also confirmed with a wattage draw from the wall that for me on a FE card and evga sc2 hybrid that it does not have a power limit. but be careful as you can over power hybrid cards pretty easily when pushing 330+ watts. i found at 350+watts temps start to run away over time it cant keep up


Yeah, I've done it for 2-3 hours of testing and in games, and I've already noticed degradation on both the core and the max VRAM overclock; I can't do 2088MHz + 770 VRAM on the stock BIOS anymore.

EDIT: Maybe not. I've been having a registry virus that seems to run away from everywhere I find it. I honestly don't know what it is; it might be an app I paid for called reWASD, which remaps gamepad or console-controller keys. Anyhow, it causes frequent blue screens, and even while scanning for it I crashed again. So it might not be GPU degradation; it might be something in the background. Although 500 watts into the GPU die is also a bit much... I might have forced it.


----------



## gta1989

Quote:


> Originally Posted by *StenioMoreira*
> 
> Yeah I've done it a 2-3 hours for testing and in games, already have noticed degradation both on the core and Vram max overclock cant do 2088mhz + 770vram on stock bios anymore


770 on VRAM seems very high; I've never been able to get over 450 and I stick to 375 for daily use. I've noticed the 1080 Ti seems to take a little more time to show instability compared to older cards, which crash in a few seconds. I've also noticed that if I game for a long stretch my room heats up a little, and a few °C can make the overclock unstable, so I dial back at least one bin.


----------



## nrpeyton

Quote:


> Originally Posted by *gta1989*
> 
> 770vram seem very high I've never been able to get over 450 I stick to 375 for daily and I have notice the 1080ti seems to take a little more time to show its instability compared to older cards which crash in a few second. I have also noticed if I game for a significant amount of time my room will heat up a little and a few c can make it unstable so I dial back atleast one bin


After you've dialled in your +375 and saved the profile etc try rebooting your machine _(soft restart *only*)_. The GPU will re-initialize memory timings (the same way your regular RAM trains at start-up). When you get back into Windows you may find you can now go a bit higher with VRAM overclock.


----------



## EarlZ

Quote:


> Originally Posted by *gta1989*
> 
> Quote:
> 
> 
> 
> Originally Posted by *StenioMoreira*
> 
> Yeah I've done it a 2-3 hours for testing and in games, already have noticed degradation both on the core and Vram max overclock cant do 2088mhz + 770vram on stock bios anymore
> 
> 
> 
> 770vram seem very high I've never been able to get over 450 I stick to 375 for daily and I have notice the 1080ti seems to take a little more time to show its instability compared to older cards which crash in a few second. I have also noticed if I game for a significant amount of time my room will heat up a little and a few c can make it unstable so I dial back atleast one bin
Click to expand...

What game are you using to test for stability?


----------



## gta1989

Quote:


> Originally Posted by *EarlZ*
> 
> What game are you using to test for stability?


I play a lot of different games. Heroes of the Storm doesn't support SLI, so just one card runs. Destiny 2 does support SLI, and PvP there will usually reveal an unstable core clock pretty fast. I play on a 4K TV, so I don't know whether the resolution you play at makes a difference. I also use the max settings that can sustain a 60fps minimum; I hate laggy gaming and frame tearing, so I usually turn off V-Sync unless a game just plain doesn't play well at high fps.


----------



## gta1989

Quote:


> Originally Posted by *nrpeyton*
> 
> After you've dialled in your +375 and saved the profile etc try rebooting your machine _(soft restart *only*)_. The GPU will re-initialize memory timings (the same way your regular RAM trains at start-up). When you get back into Windows you may find you can now go a bit higher with VRAM overclock.


I will give this a try next time I play with some benchmarks. Do you restart more than once and keep going, or is it a one-time deal?


----------



## l1CappYl1

Quote:


> Originally Posted by *StenioMoreira*
> 
> Yeah no, XOC has none, at least not for me. I ran 1.200v on it just recently no shutdowns no power issues nothing. It was drawing 500+ish watt.


I'm aware that XOC removes the power limits, but you mentioned lucky Strix owners as if the cards ship uncapped.


----------



## StenioMoreira

Quote:


> Originally Posted by *gta1989*
> 
> I will give this a try next time I play with some bench marks. do you restart more then once and keep going or is it a one time deal?


Yeah, that's why if it doesn't go back to the way it was before the 500-watt torture, I'll be a bit bummed out. Still, the card is top 10% or better; I own 3 and have tested 8 cards, and this was the best of all. The core wasn't what stood out, more the extremely high VRAM clock that scales: 770 was the limit, beyond which there was no further gain. Now it's crashing at 770, let alone anything higher. That's the big evidence it was hurt by my experiments, since it never crashed on any VRAM overclock before; the performance gains used to stop at 770, but I could still clock further. Anyhow, I hope it magically goes back to being an 11,000 Time Spy graphics score card.

I've had my hands on 3 that couldn't put up 400 on the VRAM.
Quote:


> Originally Posted by *l1CappYl1*
> 
> Im aware that XOC removes the power limits, but you mentioned lucky strix owners like the cards ship uncapped.


No lol, I meant maybe their clocks are real with that BIOS; considering it's a BIOS made for their card, maybe it scales better. 2152MHz at 1.200V for me was less than 1% better performance than my original BIOS at 2088MHz + 770 VRAM.


----------



## EarlZ

My card can do +850MHz on memory, but I don't have a reliable way to determine whether I'm actually getting more performance, because I'm hitting the power limit on Time Spy graphics test 2 and the same with Fire Strike. Games don't crash or artifact, but maybe that's just the ECC on GDDR5X kicking in and slowing the effective clocks. Any suggestions?


----------



## StenioMoreira

Quote:


> Originally Posted by *EarlZ*
> 
> My card can do +850Mhz on memory but I dont have a reliable way to determine if I am getting more performance or not because I am hitting a power limit on time spy graphics test 2 and same with firestrike, games dont crash or artifact but maybe thats just the ECC on GDDR5X kicking in and slowing the actual clocks. Any suggestions ?


Of course there is: do 6+ runs at each memory state at around the same temps. It has to be hot, to simulate a real prolonged game session rather than a premature quick one. I've gathered results from Unigine Valley and Time Spy, and no, it doesn't have to be 4K or 8K etc.; there's always a performance difference. Also, sometimes the clocks don't really take effect unless you restart the software or even the PC; I'm not absolutely sure, but the first 2 runs were always unreliable. And yeah, for me personally, doing the above I see a consistent difference between clocks. All benchmarks have shown it, even at 1080p.
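That multi-run approach boils down to a few lines of bookkeeping; here is a minimal sketch, with made-up scores and the first two runs treated as warm-up:

```python
from statistics import mean, stdev

def summarize_runs(scores, warmup=2):
    """Drop the first `warmup` runs (they tend to be unreliable), then
    report the mean and spread so clock settings can be compared fairly."""
    usable = scores[warmup:]
    return {"mean": mean(usable), "spread": stdev(usable), "n": len(usable)}

# Hypothetical Time Spy graphics scores, six runs per memory offset:
results = {
    "+700": [11020, 11050, 11081, 11076, 11079, 11083],
    "+770": [11100, 11110, 11060, 10990, 11115, 10975],
}
for offset, scores in results.items():
    print(offset, summarize_runs(scores))
```

A tighter spread at the lower offset, as in the fake numbers above, is exactly the "least variance" signal discussed in this thread.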


----------



## EarlZ

Quote:


> Originally Posted by *StenioMoreira*
> 
> Quote:
> 
> 
> 
> Originally Posted by *EarlZ*
> 
> My card can do +850Mhz on memory but I dont have a reliable way to determine if I am getting more performance or not because I am hitting a power limit on time spy graphics test 2 and same with firestrike, games dont crash or artifact but maybe thats just the ECC on GDDR5X kicking in and slowing the actual clocks. Any suggestions ?
> 
> 
> 
> of course there is, run 6 + runs of each memory state around same temps, has to be hot, to simulate real prolonged game and not a premature quick session, i've gathered results from valleybenchmark and timespy, and no it doesnt have to be 4k + 8k ect,.... lol there is always a performance difference, also sometimes the clocks dont really take effect unless u do software restart or even pc... Not absolutely sure but the first 2 runs were always unreliable. And yeah, for me personally, i see consistent difference with clocks doing the above ^^. All benchmarks have shown it, even at 1980p
Click to expand...

What I'm trying to say is that I need a reliable way to establish whether I'm gaining or losing performance going from +500 to +850 on the memory. I intended to use Fire Strike and Time Spy, but the problem is that I hit the power limit even at 1080p. Once I can tell up to which memory OC I'm still gaining performance, that's when I'll test for stability.

EDIT:
I've been testing Superposition at 500-600-700-800MHz memory overclocks and the score increases a little for every 100MHz all the way up to 800MHz, so I'm thinking my memory is not throttling down whatsoever; I just need to find a stable clock for it.


----------



## StenioMoreira

Quote:


> Originally Posted by *EarlZ*
> 
> Ive been testing superposition from 500-600-700-800Mhz memory overclock and Ive been getting a score increase little by little for every 100Mhz all the way up to 800Mhz so Im thinking my memory is not throttling down what so ever I just need to find a stable clock for it.


Yeah, hmm, check out the GTX 1080 Ti BIOS-flash thread on this site, find the .bat file that comes in the zip folder of "best BIOSes", and use it to give yourself 50 watts of headroom to help your situation. And yeah, 3DMark is almost like a GPU power virus lol.

Next, I'd test runs at different clocks like you have been, recording not just the score but the minimums if possible. Discard any clock that doesn't show a performance increase, and keep lowering until the drop-off in performance is a clear outlier compared to the rest; I know pinpointing memory gains/losses is tough because of how the scores jump around. 770 for me is great on first boot, top score always, but doesn't hold once the core is realistically hot, so I backed off to 750, where there's the least variance in scores and FPS lows; my 750 scores were closer together run-to-run than the higher clocks.
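That "back off until the drop-off is a clear outlier" rule can be sketched directly; the offsets, mean scores, and 0.5% tolerance below are made-up illustrations, not measurements:

```python
def pick_stable_offset(sweep, tolerance=0.005):
    """Given {offset: mean_score} from repeated runs, walk down from the
    highest offset and return the first one whose score stays within
    `tolerance` of the best score, i.e. isn't an outlier drop-off."""
    best = max(sweep.values())
    for offset in sorted(sweep, reverse=True):
        if sweep[offset] >= best * (1 - tolerance):
            return offset
    return min(sweep)

# Hypothetical mean scores per VRAM offset:
sweep = {500: 10900, 600: 10960, 700: 11010, 770: 11075, 800: 10820}
print(pick_stable_offset(sweep))  # 770: the +800 run scores worse despite the higher clock
```

The +800 entry scoring below +770 is the classic sign of error correction eating the extra clock, which is why the function skips it.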

I discovered some good news regarding that XOC BIOS, guys.


----------



## gta1989

Quote:


> Originally Posted by *EarlZ*
> 
> What Im trying to say is that I need a reliable way to establish if I am gaining rt losing performance from overclocking my mems from a +500 to +850, I intended to use Firestrike and Timespy but the problem with that is hitting the power limit even on 1080P, once I am able to tell until which mem oc gives me performance thats when I will test for stability.
> 
> EDIT:
> Ive been testing superposition from 500-600-700-800Mhz memory overclock and Ive been getting a score increase little by little for every 100Mhz all the way up to 800Mhz so Im thinking my memory is not throttling down what so ever I just need to find a stable clock for it.


I have found that synthetic benchmarks show more improvement from memory overclocks above +400MHz than games or Heaven do. In Fire Strike I could get a boost going to +600, but in Heaven or games I would actually lose fps, and at +650 or more in games it was a drastic decrease. Maybe that cut-off depends on the card, but that's just what I've noticed with my 2 cards. SLI doesn't seem to make a difference; that +400 mark is the peak for me, then I'm over the hill.


----------



## EarlZ

Quote:


> Originally Posted by *StenioMoreira*
> 
> Quote:
> 
> 
> 
> Originally Posted by *EarlZ*
> 
> Ive been testing superposition from 500-600-700-800Mhz memory overclock and Ive been getting a score increase little by little for every 100Mhz all the way up to 800Mhz so Im thinking my memory is not throttling down what so ever I just need to find a stable clock for it.
> 
> 
> 
> Yeah, hmm check out the gtx1080ti bios flash thread on this site, find the batfile that comes with the zip folder for "best Bioses", use that bat file to give you 50 watt head room, to help ur situation, and yea 3dmark is almost like a GPU power virus lol.
> 
> next, I'd test runs at different clocks like you have been, and recording not just score but minimums if possible. Discard any clock that does not show performance increase, keep lowering it until there's a drop off in performance that's more of an outlier compared to the rest, like if it stands out because I know pinpointing memory gains/losses is tough due to how it jumps around. 770 for me is great on first boot, top score always, but doesn't work if core is realistically hot, so I backed off to 750, where there is least variable in scores and fps low's also, 750 scores for me were closest within each run than the higher ones.
> 
> I discovered some good news guys regarding that XOC bios
Click to expand...

It does not have the MSI Gaming X on its list. I'm able to run +800MHz in COD: WWII; I've played about 2 hours at max settings with render scale at 400%, which uses about 10.6GB of VRAM. No artifacts, and stable during that 2-hour window.

Quote:


> Originally Posted by *gta1989*
> 
> Quote:
> 
> 
> 
> Originally Posted by *EarlZ*
> 
> What Im trying to say is that I need a reliable way to establish if I am gaining rt losing performance from overclocking my mems from a +500 to +850, I intended to use Firestrike and Timespy but the problem with that is hitting the power limit even on 1080P, once I am able to tell until which mem oc gives me performance thats when I will test for stability.
> 
> EDIT:
> Ive been testing superposition from 500-600-700-800Mhz memory overclock and Ive been getting a score increase little by little for every 100Mhz all the way up to 800Mhz so Im thinking my memory is not throttling down what so ever I just need to find a stable clock for it.
> 
> 
> 
> i have found that synthetic benchmarks do show more improvements with memory above 400mhz oc then games or heaven. i found firestrike i could get a boost if i went to +600. but in heavon or games i would actually loose fps and at 650+ in games it was a drastic decrease in fps. maybe that point depends on the card but thats just what i have noticed with my 2 cards. sli doesnt seem to make a difference that +400 mark is it then im over the hill
Click to expand...

That's what I'm trying to do now: find out whether my +800MHz gives me any real performance increase in games. I'm currently playing COD: WWII, and I'll look for a scene with a steady FPS and try 100MHz increments to see whether performance drops or gains.


----------



## nrpeyton

Quote:


> Originally Posted by *EarlZ*
> 
> What Im trying to say is that I need a reliable way to establish if I am gaining rt losing performance from overclocking my mems from a +500 to +850, I intended to use Firestrike and Timespy but the problem with that is hitting the power limit even on 1080P, once I am able to tell until which mem oc gives me performance thats when I will test for stability.
> 
> EDIT:
> Ive been testing superposition from 500-600-700-800Mhz memory overclock and Ive been getting a score increase little by little for every 100Mhz all the way up to 800Mhz so Im thinking my memory is not throttling down what so ever I just need to find a stable clock for it.


Use 'OCCT' to find any instability in your VRAM overclock.

It scans for _(and displays)_ detected errors as it runs. Errors will always begin showing long before a driver crash.

If errors are detected then this would translate to dropped frames in-game or in-benchmark.

The way OCCT detects errors is quite elegant; it simply compares the currently drawn frame to the last. Any differences and an error is detected. Simple but *extremely* effective.
_
Note: Be careful not to move the OCCT app's 'window' about on the screen, as this can cause it to wrongly detect errors._
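As a rough illustration of that frame-compare idea (not OCCT's actual code; a toy byte buffer stands in for the rendered frame):

```python
def frame_errors(prev, cur):
    """OCCT-style check: with a static test scene, consecutive frames should
    be byte-identical, so any mismatching bytes count as errors."""
    return sum(a != b for a, b in zip(prev, cur))

frame = bytes(range(256)) * 64   # stand-in for a rendered frame buffer
good = bytearray(frame)          # stable card: next frame matches exactly
bad = bytearray(frame)
bad[1000] ^= 0x01                # simulate one flipped bit from an unstable VRAM OC

print(frame_errors(frame, good))  # 0
print(frame_errors(frame, bad))   # 1
```

This is also why moving the OCCT window mid-run triggers false positives: the frame legitimately changes, and the comparison can't tell that apart from a memory error.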

Quote:


> Originally Posted by *gta1989*
> 
> I will give this a try next time I play with some bench marks. do you restart more then once and keep going or is it a one time deal?


Keep going


----------



## EarlZ

Suggested duration for OCCT?

EDIT:

I did a quick 900-second test at 720MHz and no errors were found.


----------



## StenioMoreira

Quote:


> Originally Posted by *EarlZ*
> 
> It does not have the MSI Gaming X as part of its list. I am able to run 800Mhz on COD WW2 about 2hrs that I've played all max settings render scale is at 400% which uses about 10.6GB VRAM. No artifacts and stable during that 2hr time frame
> Thats what I am trying to do know, is to know if my +800Mhz is getting me any real performance increase in games. Im currently playing COD WW2 and I'll try to look for a scene with a steady FPS and try a 100Mhz increment see if there is a performance drop/gain


Like I've kind of mentioned: find your top-scoring VRAM clock, and test it with low core clocks so it's consistent, then back the clock off, because whatever it is won't hold up in game. In game the stress is so much higher, the card gets hot, and errors and performance won't show. So find the highest clock that gives you the best minimum FPS; the goal when overclocking VRAM is stability. Whether it helps in certain titles or not doesn't matter; what matters is that it isn't hurting stability and smoothness, and that it's actually working. As stated, keep downclocking until you find a clear loss of performance that deviates from the normal range of score variance. The name of the game is stability.


----------



## Neo_Morpheus

Lots of 3DMark tests sit somewhere around 1500-1600MHz. Play a stressful game and the heat build-up will start to show on the VRAM. I keep mine at +500; I could probably do another 50.


----------



## EarlZ

Quote:


> Originally Posted by *StenioMoreira*
> 
> Quote:
> 
> 
> 
> Originally Posted by *EarlZ*
> 
> It does not have the MSI Gaming X as part of its list. I am able to run 800Mhz on COD WW2 about 2hrs that I've played all max settings render scale is at 400% which uses about 10.6GB VRAM. No artifacts and stable during that 2hr time frame
> Thats what I am trying to do know, is to know if my +800Mhz is getting me any real performance increase in games. Im currently playing COD WW2 and I'll try to look for a scene with a steady FPS and try a 100Mhz increment see if there is a performance drop/gain
> 
> 
> 
> Like I've kinda mentioned, find your top scoring Vram clock, and test this with low core clocks please so its consistant, than back off the clock, because what ever it is, wont be good in game. In game the stress is so much, it will get hot, errors and performance wont show. So find the highest clock that gives u least Low Fps/minimum, goal is stability when overcloking Vram. Whether its helping in certain tittles or not, does not matter, what matters is that its not hurting stability and smoothness, and that it's actually working. As stated, keep down clocking till you find clear loss of performance that deviates from regular range of variable scores. Name of the game is stability
Click to expand...

What I'm now doing to test performance gain is look for a spot in COD: WWII with a stable fps and check performance from there. I gain something up to +720MHz; from 720 to 800 the FPS is the same, although Superposition does show a difference between 720 and 800.


----------



## DarthBaggins

Need to load WWII up on the test bench to finally game on my Ti (it has only run Folding@Home).


----------



## Streetdragon

Quote:


> Originally Posted by *nrpeyton*
> 
> Use 'OCCT' to find any instability in your VRAM overclock.
> 
> It scans for _(and displays)_ detected errors as it runs. Errors will always begin showing long before a driver crash.
> 
> If errors are detected then this would translate to dropped frames in-game or in-benchmark.
> 
> The way OCCT detects errors is quite elegant; it simply compares the currently drawn frame to the last. Any differences and an error is detected. Simple but *extremely* effective.
> _
> Note: Be careful not to move the OCCT app's 'window' about on the screen, as this can cause it to wrongly detect errors._
> 
> Keep going


Is the OCCT GPU test safe in any way? I thought it was one of those GPU killers because of the insane load.


----------



## becks

Quote:


> Originally Posted by *Streetdragon*
> 
> is the OCCT GPU test save in any way? i thought this is one of the GPU-Killers because of the insane load


As safe as Prime95 for CPU... and we still use the latter with 8th and 9th gens.
It's all about knowing what you're doing.

If you throw 1.45V+ at a CPU and test with RB or games you might be fine, but it might or might not burn the CPU with Prime.

On a GPU you're much more limited voltage-wise, so the only problem would be temps.
Keep an eye on them and you should be fine.


----------



## Streetdragon

Quote:


> Originally Posted by *becks*
> 
> As safe as Prime for CPU... and we still use the later with 8th and 9th gens..
> It's all about knowing what you do..
> 
> If you trow 1.45+ on CPU and test with RB or Games you might be fine but it might or it might not burn the CPU with Prime
> 
> On GPU you are much more limited V wise so the only problem would be temp.
> Keep an eye on it and you should be fine.


always learn something new


----------



## EarlZ

The voltage curve in MSI AB is annoying me. I set it for 2000MHz, and after a reboot it does one of two things: either bumps it up to 2012MHz or lowers it to 1999MHz (which results in 1987). 2012 is stable on my rig, but the OCD in me only wants to see 2000MHz.


----------



## Streetdragon

It's the Boost 3.0 crap: lower temp = higher bin.
Set the voltage curve while the card is under load and has reached its peak temps.

Then you'll have 2000MHz under load, and 2012 or so at lower temps.
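A toy model of that behaviour, purely to illustrate the bins (the one-bin-per-5°C figure is a community rule of thumb, not an NVIDIA spec):

```python
BIN_MHZ = 12.5  # Pascal moves in ~12.5/13 MHz steps

def effective_clock(set_clock, cur_temp, curve_temp, degrees_per_bin=5.0):
    """Rough sketch of GPU Boost 3.0: the card gains (or loses) one clock
    bin for every ~5 degrees C it runs below (or above) the temperature the
    curve was dialled in at."""
    bins = int((curve_temp - cur_temp) / degrees_per_bin)
    return set_clock + bins * BIN_MHZ

# Curve dialled in at 2000 MHz while hot (60 C):
print(effective_clock(2000, 60, 60))  # 2000.0 under load
print(effective_clock(2000, 52, 60))  # 2012.5 when the card runs cool
```

Which is why setting the curve at peak load temps, as suggested above, gives you the target clock in games and a bonus bin or two on a cold card.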


----------



## StenioMoreira

That's right, it actually changes under load; whatever the default is, it's lower when your card is hot. Try editing while a benchmark runs windowed in the background, or a game, for example. I first thought I might be going crazy seeing how things change while still in Windows LOL, let alone after a reboot.


----------



## EarlZ

No, I'm not talking about GPU throttling from GPU Boost 3.0; I'm talking about the actual curve. Sometimes it chooses 1999MHz, sometimes 2012MHz. I've installed the latest update and I'll do more testing.

I'm not even running any load on the GPU, just checking the curve after a reboot.


----------



## Streetdragon

1999 is normal, just a rounding error or something like that, but 2012 can't be. Are you sure you saved the settings and re-enabled them after the reboot? The little icon on the left in MSI AB.


----------



## EarlZ

Pretty sure I saved it at 2000MHz, and the GPU will clock at 2000MHz as well. After a reboot, one of two things happens: either it goes to 1999MHz, which in turn runs my GPU at 1987, or it chooses 2012MHz. I'll probably just have to live with 2012.

EDIT:

It doesn't have to be a reboot; fully closing the app and opening it again will trigger it.


----------



## gta1989

Quote:


> Originally Posted by *EarlZ*
> 
> The voltage curve on MSI AB is annoying me. I set it for 2000Mhz and after a reboot it does 2 things either bump it up to 2012Mhz or lower it down to 1999Mhz (which results to 1987) 2012 is stable on my rig but the OCD in me only wants to look at 2000Mhz


I've had the same thing. I finally gave up and left it; it seems to set it to the nearest bin of the MHz you set in AB.


----------



## KedarWolf

Anyone in Canada, check out this website:

https://www.softwarecity.ca/

I bought a Maximus X Formula motherboard from them, and here is my email to their customer service:

"I ordered a motherboard from you yesterday. I had the shipping address postal code wrong; your sales department called me within 30 minutes to correct this.

Also, you sent it by Purolator for less than $10.00 in shipping costs. I ordered it yesterday and it came today, before Christmas Day.

Also, you were the only retailer that had the item in stock. Not even Canada Computers locally or newegg.ca had it.

And your price was below anyone else that had the motherboard advertised.

Thank you!"

Highly recommend them.


----------



## EarlZ

Quote:


> Originally Posted by *gta1989*
> 
> Quote:
> 
> 
> 
> Originally Posted by *EarlZ*
> 
> The voltage curve on MSI AB is annoying me. I set it for 2000Mhz and after a reboot it does 2 things either bump it up to 2012Mhz or lower it down to 1999Mhz (which results to 1987) 2012 is stable on my rig but the OCD in me only wants to look at 2000Mhz
> 
> 
> 
> ive had the same thing. i finally gave up and left it. seems it sets it to the nearest bin of the mhz you have set in AB
Click to expand...

Yeah, it seems 2000MHz is something it doesn't like, but it has no issues with the 2012MHz bin.


----------



## StenioMoreira

This is how you tell how good your bin is: find the highest stable clock you can maintain in a game at only 0.800V. The best one I've had my hands on did 1759 over a long game session. Why 0.800V? Because that's the lowest point on the curve you're allowed to go, and anything above that comes down to your cooler and other factors, too many variables. At that voltage, even a nasty old blower cooler has a fair chance to show its silicon quality.


----------



## EarlZ

Quote:


> Originally Posted by *StenioMoreira*
> 
> This is how you tell how well your Bin is, find the highest stable clock you can maintain in a game with only 800v. The best one I had my hands on did 1759 under long game session. Why 800v? well because that's the lowest on the curve you're allowed to go, and anything after that will come down to your cooler and some other factors, too many variables. But, at that voltage even a nasty old blow style cooler has a fair chance to show it's silicon quality.


How is this supposed to help with MSI AB randomly choosing 1989 or 2012 on relaunch? Unless this reply wasn't for my issue.


----------



## Nuginu

Oops, entered the wrong clock values; hope they get adjusted. EVGA Precision doesn't seem to show accurate values.


----------



## StenioMoreira

Quote:


> Originally Posted by *EarlZ*
> 
> How is this supposed to help with MSI-AB randomly choosing 1989 or 2012 on relaunch ? Unless this reply is not for my issue.


Yeah, it wasn't a reply to you but rather a post for this thread, but here is a working solution for you. {SPOILER: Pascal overclocking blows}

Clocks move in steps of 12.5MHz at a time (call it 13MHz for simplicity), so keep that in mind: the card rounds up or down to the nearest step. For me, on the MSI curve, if I want 1999MHz I have to set the curve to 2000MHz and then apply, sometimes more than once, because it often lands a step above or below. But if it works, which it often does, it will show 1999MHz for me; it could be different for you.
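The rounding behavior he describes can be sketched in a few lines of Python. This is only an illustration: the 12.5MHz bin size comes from the post, and the -0.5MHz offset is an assumption chosen so the bins land on the 1987/1999/2012 values reported in this thread; the real bin table lives in the card's BIOS.

```python
import math

BIN_MHZ = 12.5     # Pascal clock-bin size, per the post above
OFFSET_MHZ = -0.5  # assumed offset so bins land on 1987.0 / 1999.5 / 2012.0

def nearest_bin(requested_mhz: float) -> float:
    """Round a requested core clock to the nearest clock bin."""
    steps = round((requested_mhz - OFFSET_MHZ) / BIN_MHZ)
    return OFFSET_MHZ + steps * BIN_MHZ

def displayed(mhz: float) -> int:
    """Monitoring tools truncate, so the 1999.5 bin reads as 1999."""
    return math.floor(mhz)

for req in (2000, 2012, 1987):
    b = nearest_bin(req)
    print(f"request {req} MHz -> bin {b} MHz (shows as {displayed(b)})")
```

So a 2000MHz request snaps to the 1999.5MHz bin (displayed as 1999), while 2012 is itself a bin, which matches the 1999-or-2012 behavior described above.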

Now, for the issue of your clocks/curve moving above what you'd like: create, save and apply your curve outside of any load, at idle temps. All other adjustments (fan speed, power target, temp limit, memory, etc.) also have to be done at idle temps. Test everything and create your profiles while you're at idle temps.

You cannot edit that profile in the middle of a load, or when your temps are significantly higher, or it will change the actual curve. In game, or under any load, you can only click a profile (1-5) and apply it; do not change any of the other options mentioned above while in game if you're depending on curves rather than offsets. Your curve will never move if you don't make changes to that profile under load/higher temps, and it will stay exactly as you created it.

For example, my curve for 1999MHz was created at 1.000V; it took me more than one try to get 1.000V to land on 1999MHz, LOL. But it was made at idle temps, and the same goes for my memory overclock: it too was applied at idle, as were all my other settings. Now I save that profile and never edit it again or change anything on it while in game. The only change you can make while in game is applying a different profile, and remember, all profiles have to be made at idle.

So: changes cannot be made under load/high temps. Follow that, and I guarantee you'll never see the curve change again.

As for creating a good working curve, you have to make one that follows the shape of the actual curve, step by step, adjusted at least 6 points back from your final point; the best curve will always adjust clocks based on temperature. Now, this is where it gets tricky again: you have to test curves to see which one actually works as intended. My best curve, 1.000V for 1999MHz, behaves like this:

1999MHz drops to 1987MHz, then drops again to 1974MHz at 64C. A really good working curve drops clocks twice before it crashes, though you don't want it dropping too fast; at 33-34C would be too early. But any clock drop under load is a good sign: it means that, rather than crashing, the card is adjusting and lowering itself. That is how curves should behave.

It took me a while to find the perfect curve for the sweet spot of 1999MHz up to 55C. Most of the time the clocks would never drop, but rather crash long before anything else.







So the curve points, for me, have to be changed: not just your final clock/volt point, but at least one point prior to it too. In fact, all of my final curves have been adjusted.

My best working curves have been adjusted 6 steps deep for that final result: one drop at 55C and another at 64C, for a final clock of 1974MHz in the worst case. If your curve crashes at one clock and never moves, it's bad; if it drops right away, it's also bad. All signs then point to needing more voltage for that clock on the curve.

Perhaps I can save you a lot of time and give you my golden 6 points: (1st step: 1999MHz / 1.000V), (2nd step: 1949MHz / 0.993V), (3rd step: 1898MHz / 0.981V), (4th step: 1860MHz / 0.975V), (5th step: 1860MHz / 0.962V), (6th step: 1860MHz / 0.950V).

The first challenge was getting the clock to actually drop with temps: my first curves dropped from 1999 to 1987 at 34C at the earliest, and most commonly the drop to the same clock came at 44C. I was also using a higher voltage, 1.012V, for most of the testing. It wasn't until I went 6 points deep and perfected it that it came to need only 1.000V.







The sad thing is that the most common outcome while finding these results was a crash, lol.

My end result with these six points works perfectly for my reference card at any voltage point up to 1.081V, meaning the spacing between my clocks and voltages has to be exactly the same as listed above, and then it works for other clock targets too.

2075MHz curve: (1st step: 2075MHz / 1.081V), (2nd step: 2025MHz / 1.075V), (3rd step: 1987MHz / 1.062V), (4th step: 1949MHz / 1.050V), (5th step: 1949MHz / 1.043V), (6th step: 1949MHz / 1.031V).

Steps between the 1st and 2nd points = 12.5MHz × 4. That's 4 steps between points 1 and 2, 3 steps between points 2 and 3, and so on. The only difference between this curve and my 1999MHz one is that there are only 3 steps between the 2nd and 3rd points instead of 4; every other point on the curve has the same spacing along the voltage points.
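As a quick sanity check of the spacing claim, here's a small, purely illustrative Python snippet that takes the 2075MHz curve points listed above and expresses each clock drop in 12.5MHz bins:

```python
# (clock MHz, voltage V) pairs transcribed from the 2075MHz curve above
curve = [(2075, 1.081), (2025, 1.075), (1987, 1.062),
         (1949, 1.050), (1949, 1.043), (1949, 1.031)]

BIN_MHZ = 12.5
for (mhz_a, _), (mhz_b, _) in zip(curve, curve[1:]):
    drop = mhz_a - mhz_b
    print(f"{mhz_a} -> {mhz_b}: {drop} MHz = {drop / BIN_MHZ:.2f} bins")
```

The 50MHz drop is exactly 4 bins, while the 38MHz drops come out at 3.04 bins rather than exactly 3, which matches the point that the displayed spacing won't always be an exact multiple of 12.5MHz.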

The spacing won't always be exactly the same, I don't think, because the core clock steps at 12.5MHz in the BIOS; I know this from tweaking Maxwell BIOSes, if I remember correctly. I got better results with a curve this deep, and I was able to move it to other clock ranges; the only change in spacing was that one time, using 3 steps instead of 4 because it wouldn't let me.

FYI, in case you'd like to copy my curve: remember that the silicon lottery (or lack thereof) still applies. My card does 1759MHz at 0.800V; take that into account when looking at these results.


----------



## nrpeyton

Any miners here?

I'm new to bitcoin and still trying to get my head around it.

I downloaded bitcoin core and installed + synchronised. And it's up and running. So I assume I now have a "wallet"? I also encrypted it.
*But:*
Question is: how is that wallet identified? How is it unique to me? If I re-installed Windows (for example), is there an address or a key or something linked to my wallet that I would just need to re-enter?

Before the big hack on NiceHash, I had a wallet address I created by following the instructions on their site, and I just assumed that address was the identifier of my own unique bitcoin wallet. But Bitcoin Core allows you to create multiple "receiving addresses" all linked to my wallet... so if that's the case, then what is the root identifier for my wallet? I mean, if I didn't create any receiving or sending addresses but still have Bitcoin Core up and running, then what or where is my wallet?

_P.S I was one of the victims of the attack on Nicehash miner... I logged in one day the other week and found my balance was at zero. So I'm starting again from scratch. But this time I am NOT having my Wallet with NiceHash miner -- I am making my own Wallet with Bitcoin Core then I'll use that Wallet with NiceHash -- that way if the service is ever hacked again in the future; I won't lose the entire lot (at most maybe just 2 days as it's 2 days between mining payouts)._


----------



## RoBiK

Quote:


> Originally Posted by *nrpeyton*
> 
> Any miners here?
> 
> I'm new to bitcoin and still trying to get my head around it.
> 
> I downloaded bitcoin core and installed + synchronised. And it's up and running. So I assume I now have a "wallet"? I also encrypted it.
> *But:*
> Question is; how is that wallet identified? How is it unique to me? If i re-installed Windows (for example) is there an address or a key or something that is linked to my wallet that I would just need to re-enter?
> 
> Before the big hack on NiceHash miner, I had a wallet address I created by following the instructions on their site. And I just assumed that address was the identifier to my own unique bitcoin Wallet -- but bitcoin core allows you to create multiple "receiving addresses" all linked to my wallet... so if thats the case then what is the root identifier for my wallet? I mean if I didn't create any receiving or sending addresses. But I still have Bitcoin Core up and running -- then what or where is my wallet?
> 
> _P.S I was one of the victims of the attack on Nicehash miner... I logged in one day the other week and found my balance was at zero. So I'm starting again from scratch. But this time I am NOT having my Wallet with NiceHash miner -- I am making my own Wallet with Bitcoin Core then I'll use that Wallet with NiceHash -- that way if the service is ever hacked again in the future; I won't lose the entire lot (at most maybe just 2 days as it's 2 days between mining payouts)._


A "wallet" is something that stores your private keys; those keys are what give you access to your addresses and their associated balances. "How a wallet is identified" depends on the type of wallet. The important thing is that you back up your private keys and the associated addresses.


----------



## GreedyMuffin

Is WinMiner recommended? My father and I are just getting started with all this.

Does somebody have a trustworthy guide, or would someone make a short one?

Much appreciated!!


----------



## Nervoize

I'm still looking for a different BIOS than the XOC one. It only allows me to hit 1.2V, but I have way more headroom on temperature: it now hits 40C at full load, 2164MHz at 1.2V, with the stock paste from EKWB.


----------



## DStealth

Just added the card to the loop; looks promising for a start.


Edit: Wow 33700 GPU score in Firestrike


----------



## apw63

Played around with OCing my new card (ASUS 1080 Ti Strix OC) this morning: +113 (2100) on the core clock, +975 (12960) on the memory. I looped Valley for 3 hrs and the card never went over 35C. I have overvolting unlocked in the Afterburner settings, but the volts never go over 1050mV; I thought this card would overvolt to 1093mV. Does anyone have any experience with this card and overvolting it? Forgot to say this card is under water: EK 1080 Ti water block and backplate.


----------



## d3v0

If I am able to hit 2000mhz stable for gaming constantly on my GTX 1080ti Aorus, what sort of performance upgrade might I get out of a waterblock?


----------



## Streetdragon

Quote:


> Originally Posted by *apw63*
> 
> Played around OCing my new card (ASUS 1080 Ti strix OC) this morning 113(2100) on the core clock 975(12960) on the memory. I looped valley for 3 hrs card never went over 35c. I have the overvolt unlocked in Afterburner setting, but the volts never go over 1050mV. I thought this card would overvolt to 1093mV. Does anyone have any experaince with this card and overvolting it?.
> 
> 
> Spoiler: Warning: Spoiler!


slide your core voltage slider to the right +100
Quote:


> Originally Posted by *d3v0*
> 
> If I am able to hit 2000mhz stable for gaming constantly on my GTX 1080ti Aorus, what sort of performance upgrade might I get out of a waterblock?


Maybe 2075 or so, but that's only 1 FPS or so.
You only get a lower temperature and a bit less noise.

But it looks nice


----------



## d3v0

It's $140







Maybe not cost effective!


----------



## SavantStrike

Quote:


> Originally Posted by *d3v0*
> 
> If I am able to hit 2000mhz stable for gaming constantly on my GTX 1080ti Aorus, what sort of performance upgrade might I get out of a waterblock?


3-5 percent. The quiet factor is a consideration, too.








What's your voltage and temp like on your Aorus? My wb aorus cards all do 2100 with full cover blocks. It's a temperature thing, with the best samples hitting it with ease at 40C or below.


----------



## d3v0

Quote:


> Originally Posted by *SavantStrike*
> 
> 3-5 percent. The quiet factor is a consideration too
> 
> 
> 
> 
> 
> 
> 
> .
> 
> What's your voltage and temp like on your Aorus? My wb aorus cards all do 2100 with full cover blocks. It's a temperature thing, with the best samples hitting it with ease at 40C or below.


My Aorus will do 2050+ at 1081mV, but sometimes locks up, other times just downclocks because of heat.


----------



## apw63

Quote:


> Originally Posted by *Streetdragon*
> 
> slide your core voltage slider to the right +100


Thank you, I'll give that a try.


----------



## SavantStrike

Quote:


> Originally Posted by *d3v0*
> 
> My Aorus will do 2050+ at 1081mV, but sometimes locks up, other times just downclocks because of heat.


You might get 2100 at 1050mV and stay chilly under a full-cover block then. Beyond that might also be possible, but probably at least 2100.


----------



## nrpeyton

Quote:


> Originally Posted by *apw63*
> 
> Played around OCing my new card (ASUS 1080 Ti strix OC) this morning 113(2100) on the core clock 975(12960) on the memory. I looped valley for 3 hrs card never went over 35c. I have the overvolt unlocked in Afterburner setting, but the volts never go over 1050mV. I thought this card would overvolt to 1093mV. Does anyone have any experaince with this card and overvolting it?. ADDED forgot to say this card is under water EK 1080 Ti water block and back plate.


Looking at your screenshot, you have to pull the voltage slider all the way up to 100% to allow the card to over-volt up to 1.093V _(the normal Nvidia-imposed maximum on Pascal)_.
Also, it can depend on your voltage/frequency curve: if your curve allows a frequency to be achieved at a lower voltage, then it will only use that lower voltage to reach that frequency.
Every voltage in the voltage table has a frequency attached to it. So if, for example, 1.050V and 1.093V are both set to 2113MHz, then the card will only use 1.050V to get to 2113MHz.
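That lookup rule can be illustrated with a toy table (hypothetical numbers; `voltage_for` is just a sketch of the rule, not anything Afterburner or the driver actually exposes): the card runs at the lowest voltage whose curve point reaches the target clock.

```python
# Toy voltage/frequency table: each voltage point maps to a max clock.
curve = {
    1.000: 2050,
    1.050: 2113,
    1.093: 2113,  # same clock as 1.050V, so 1.093V will never be used
}

def voltage_for(target_mhz: int) -> float:
    """Lowest voltage on the curve that can reach the target clock."""
    return min(v for v, mhz in curve.items() if mhz >= target_mhz)

print(voltage_for(2113))  # picks 1.050V, never 1.093V
print(voltage_for(2050))  # 1.000V is already enough
```

This is why flattening the top of the curve (same clock at several voltage points) effectively caps the voltage the card will ever request.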


----------



## DStealth

Water cooling helps...11607 GPU score in TimeSpy








https://www.3dmark.com/3dm/24135451


----------



## apw63

Quote:


> Originally Posted by *nrpeyton*
> 
> Looking at your screenshot; you have to pull the voltage slider all the way up to 100% to allow the card to over-volt up to 1.093v _(the normal Nvidia imposed maximum on Pascal)
> _
> Also it can depend on your voltage/frequency curve. If your curve allows a frequency to be achieved with a lower voltage then it will only use that lower voltage to achieve that frequency.
> Every voltage on the voltage table has a frequency attached to it. So if for example 1.050v and 1.093v are both set at 2113mhz then the card will only use 1.050v to get to 2113mhz.


I did try sliding the slider all the way to 100%; it made no difference. The max voltage only hit 1050mV, and most of the time it stayed at 1043mV. I was thinking it had something to do with the voltage/frequency curve. Could I make a user-defined voltage/frequency curve to get higher volts? I might get a few more points on the benchmark.

Thank you for the info

Added info.

I got it figured out. It's like you said: the voltage/frequency curve was holding it down. I made a custom curve and now I get 1093mV.


----------



## GRABibus

Quote:


> Originally Posted by *DStealth*
> 
> Water cooling helps...11607 GPU score in TimeSpy
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.3dmark.com/3dm/24135451


Great









Yes, it helps so much. Temperature is definitely the key on Pascal.

As my Gigabyte 1080 Ti Gaming OC 11G (http://www.gigabyte.fr/Graphics-Card/GV-N108TGAMING-OC-11G#kf) is a really good clocker, I was frustrated that it maxed out at 2063MHz stable due to its air cooling.

So I mounted the AIO cooler from the 1080 Ti AORUS Xtreme Waterforce on my Gigabyte GTX 1080 Ti Gaming OC 11G.

Now I am gaming stable at 2101MHz/6003MHz at 1.043V, and also at 2114MHz/6003MHz at 1.081V.
GPU temps max out at 44°C at 22°C ambient.


----------



## ThrashZone

Hi,
Yep core clock droop = add more voltage


----------



## Scotty99

Does anyone else's 1080 Ti Strix OC have coil whine? I just got my replacement card from Newegg and it has the exact same type of coil whine the first one did, but a bit quieter.

You can only really hear it with everything off in your room (fans, TV, etc.), but it's there and still annoying, lol.

Edit: the coil whine is just as bad, maybe worse, than the last card; I can easily hear it above the case fans.


----------



## Agent-A01

Quote:


> Originally Posted by *Scotty99*
> 
> Does anyone elses 1080 ti strix OC have coil whine? I just got my replacement card from newegg and it has the exact same type of coil whine the first one did, but a bit quieter.
> 
> You can only really hear it with everything off in your room (fans, tv etc) but its there and still annoying lol.
> 
> Edit, coil whine is just as bad maybe worse than last card, easily hear it above the case fans.


Coil whine is normal.


----------



## GOLDDUBBY

Quote:


> Originally Posted by *Scotty99*
> 
> Does anyone elses 1080 ti strix OC have coil whine? I just got my replacement card from newegg and it has the exact same type of coil whine the first one did, but a bit quieter.
> 
> You can only really hear it with everything off in your room (fans, tv etc) but its there and still annoying lol.
> 
> Edit, coil whine is just as bad maybe worse than last card, easily hear it above the case fans.


My experience is that changing the power supply is the fix for coil whine. If only quality power supplies weren't so damn expensive; then again, I guess it's the investment that will last you the longest when it comes to computers.


----------



## Scotty99

Quote:


> Originally Posted by *GOLDDUBBY*
> 
> My experience is that changing powersupply is the fix to coil whine. If only quality powersupply wasn't soo damn expensive, then again I guess it's an investment that will last you the longest when it comes to computers.


I have a supernova g3, my 1060 exhibits zero coil whine when put into this system.


----------



## SavantStrike

Quote:


> Originally Posted by *GOLDDUBBY*
> 
> My experience is that changing powersupply is the fix to coil whine. If only quality powersupply wasn't soo damn expensive, then again I guess it's an investment that will last you the longest when it comes to computers.


It will only fix it if the previous PSU was poor.


----------



## GOLDDUBBY

Quote:


> Originally Posted by *Scotty99*
> 
> I have a supernova g3, my 1060 exhibits zero coil whine when put into this system.


Coil whine is almost like tinnitus: sometimes, once you start to notice it, it's almost unbearable. Try switching the PCIe cables around on the power supply?


----------



## gta1989

Quote:


> Originally Posted by *Scotty99*
> 
> I have a supernova g3, my 1060 exhibits zero coil whine when put into this system.


My FE 1080 Ti had coil whine when I used my old Corsair 950W Bronze PSU. I also went for the EVGA G3 1000W and now everything is fine; my mobo would also make a little noise under RealBench stress before, and that's gone too.


----------



## nrpeyton

Quote:


> Originally Posted by *apw63*
> 
> I did try sliding the slider all the way to 100%. This made no difference. The max volts only hit 1050mV. Most of the time it stayed at 1043mV. I was thinking it has something to do with the voltage/frequency curve. Could I make a user define voltage/frequency curve to get higher volts? I might get a few more points on the benchmark?
> 
> Thank you for the info
> 
> Added info.
> 
> I got it figured out. It's like you said the voltage/frequency curve was holding it down. I did a custom curve now I get 1093mV


No probs & glad you got it figured out.

Have fun!


----------



## GreedyMuffin

How is 1950MHz at 0.900V?

Is that good? Under water.


----------



## pcgaming247

Strix 1080 Ti
keeps lag-spiking to 30-40 FPS, then crashes,
in both OC'd and normal modes.
What could be the problem?
Ryzen 1700


----------



## DStealth

Quote:


> Originally Posted by *GreedyMuffin*
> 
> How is 1950mhz at 0.900V..?
> 
> That's good? Under water.


Who cares








What's your maximum OC @ 1.1 and 1.2v ?


----------



## Mooncheese

Quote:


> Originally Posted by *pcgaming247*
> 
> strix 1080ti
> keeps lag spiking to 30-40 fps then crashes
> oced and normal modes too
> what could be the problem?
> ryzen 1700


Windows 10 Fall Creators Update?


----------



## Scotty99

Any suggestions on what I should do with this 1080 Ti Strix OC? The one I got back from Newegg has the same coil whine as the one I sent in. Do I keep doing that, or should I contact Asus about a replacement? It's the most I've ever spent on a video card and the first time I've ever had coil whine, just lol.


----------



## MrTOOSHORT

I never get annoying coil whine, very faint is about it.

But I do always have top tier motherboards and psus in my systems.

I have the Strix 1080ti, no whine.


----------



## Scotty99

What does the motherboard have to do with coil whine? I've been building PCs since 2001 and this is the first time I've ever experienced it. I always wondered what it sounded like... now I know.

There is no whine if I set my monitor to 60Hz (or at least it mostly goes away), but on this 165Hz monitor it's there and very apparent. The 1060 in the same system has zero coil whine.


----------



## MrTOOSHORT

Like with the PSU, the motherboard matters: the GPU also draws power from the motherboard's PCIe slot.


----------



## Scotty99

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Like from the psu, the gpu also takes power from the motherboard pciE slot.


I have a quality PSU and board; like I said, it's the card. I already put my 1060 in here, zero whine.

I'm kind of surprised more Strix 1080 Ti owners aren't complaining about this; my guess is most have it (at least on the OC edition). A Google search brings up a bunch of coil whine results for this card, as well as a lot of Newegg reviews. Maybe some people are buying $800 video cards and playing on 60Hz monitors, heh.


----------



## feznz

Quote:


> Originally Posted by *Scotty99*
> 
> I have a quality psu and board, like i said its the card i already put my 1060 in here, zero whine.
> 
> Im kind of surprised more strix 1080ti owners arent complaining about this, my guess is most have it (at least on the OC edition). A google search brings up a bunch of coil whine results for this card, as well as a lot of newegg reviews. Maybe some people are buying 800 dollar video cards and playing on 60hz monitors, heh.


Strix owner here, no coil whine.
I've never had a card with coil whine and I have yet to hear one; I've dealt with many friends' PCs, so probably near 100+ GPUs.
What is your PSU? It could be that a 1060 simply draws a lot less power, or you're incredibly unlucky, or they shipped the same card back (always put a secret mark on an RMA card).
Also, the squeaky hinge gets the oil: most people who are happy don't leave feedback, but an unhappy one will normally yell at the top of their lungs.

On another note, I finally did the mobo upgrade from the old 3770K. I decided to forget the e-peen and go for an 8600K, an Asus Apex and 3200MHz G.Skill RGB; really happy.








Not much gained in benching, but I finally got Assassin's Creed Origins to run buttery smooth.


----------



## Scotty99

Quote:


> Originally Posted by *feznz*
> 
> Strix owner here no coil whine
> never had 1 card with coil whine and I am yet to hear one I have delt with many friends PCs so probably near 100+ GPUs
> what is your PSU could be that a 1060 probably draws a lot less power or incredibly unlucky or they shipped the same card back always put a secret mark on an RMA card
> also the squeaky hinge gets the oil most people that are happy don't leave good feedback but an in-happy one will normally yell at the top of their lungs
> 
> on another note finally done the mobo upgrade from the old 3770k, I decided to forget the epeen and go for a 8600k, Asus Apex and 3200Mhz Gskill RGB really happy
> 
> 
> 
> 
> 
> 
> 
> 
> But also not gained not much in benching but Finally got Assassins Creed Origins to run buttery smooth


Like I said, this is the first video card I've owned in 15+ years that has coil whine, and now on back-to-back occasions. My PSU is in my sig, an EVGA 750 G3. I think it's a flaw with the card; I would bet money that if Newegg sent me a third Strix 1080 Ti OC it would also have it. I just don't know if I should do that or contact Asus directly.


----------



## Blameless

Anything that uses PWMs to regulate voltage, and has coil chokes that are less than fully encapsulated, is going to create vibrations that can potentially be heard.

I don't think I've had any video card powerful enough to require VRM cooling that _didn't_ have at least some degree of audible coil whine, utterly independent of any other component. Degree of whine is extremely variable though, from sample to sample, and load to load.


----------



## KCDC

Both of my FEs have coil whine, using EK waterblocks, corsair AX1500i. Can't really hear it unless I open my case and listen for it. It's more like a buzz than a whine. Using cablemod cables as well.


----------



## Gurkburk

When you flash a BIOS from another brand, will GPU-Z, HWMonitor and apps like these stop showing power use if, say, my MSI card has sensors for it but the ASUS BIOS doesn't support them?

My software has stopped showing the watt usage, but I'm unsure if it's because of the BIOS flash. Anyone got a clue?


----------



## ThrashZone

Quote:


> Originally Posted by *Scotty99*
> 
> Does anyone elses 1080 ti strix OC have coil whine? I just got my replacement card from newegg and it has the exact same type of coil whine the first one did, but a bit quieter.
> 
> You can only really hear it with everything off in your room (fans, tv etc) but its there and still annoying lol.
> 
> Edit, coil whine is just as bad maybe worse than last card, easily hear it above the case fans.


Hi,
What's your fan curve ?
I personally haven't noticed any :/


----------



## apw63

Quote:


> Originally Posted by *Scotty99*
> 
> Does anyone elses 1080 ti strix OC have coil whine? I just got my replacement card from newegg and it has the exact same type of coil whine the first one did, but a bit quieter.
> 
> You can only really hear it with everything off in your room (fans, tv etc) but its there and still annoying lol.
> 
> Edit, coil whine is just as bad maybe worse than last card, easily hear it above the case fans.


I don't get any coil whine out of my 1080 Ti strix. I have it OC to 2113/12992.


----------



## MNDan

Coil whine is always a GPU/PSU mismatch.


----------



## EarlZ

I also get a coil whine on my MSI GTX1080Ti but I can only hear it if I have my side panel off and my ears close to the GPU and have it render something like 500-1000fps


----------



## DStealth

11619 GPU score [email protected]/12100...very bad memory...

2177 does better score...even 2200 is worse


----------



## skupples

Anyone have an opinion to share on the MSI Duke 1080 Ti? I've got damn near $600 stuck in Amazon's gift card system, so I figured I'd just toy with some things, and it's one of the cheapest (and actually available).


----------



## ThrashZone

Quote:


> Originally Posted by *skupples*
> 
> anyone have an opinion to share on the MSI DUKE 1080ti? I got damn near $600 stuck in amazon's gift card system, so I figured I'd just toy with some things., and its one of the cheapest (and actually available.)


Hi,
I'd wait a little longer; once the consumer versions of the 1280s hit, a lot lower prices are likely. The Titan V release has the 1280 line looking good.

$600 US is hard to pass on, though


----------



## buellersdayoff

Quote:


> Originally Posted by *ThrashZone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *skupples*
> 
> anyone have an opinion to share on the MSI DUKE 1080ti? I got damn near $600 stuck in amazon's gift card system, so I figured I'd just toy with some things., and its one of the cheapest (and actually available.)
> 
> 
> 
> Hi,
> I'd wait a little longer once the consumer version of 1280's hit a lot lower prices are likely Titan v release has the 1280's line looking good
> 
> 
> 
> 
> 
> 
> 
> 
> 600.us is hard to pass on though
Click to expand...

Probably double the dollars for about 25% more performance... Nvidia monopoly. Oh, and another 9 months away.


----------



## SavantStrike

Quote:


> Originally Posted by *ThrashZone*
> 
> Hi,
> I'd wait a little longer once the consumer version of 1280's hit a lot lower prices are likely Titan v release has the 1280's line looking good
> 
> 
> 
> 
> 
> 
> 
> 
> 600.us is hard to pass on though


Judging from the Titan V, the 1280 might be 15 percent faster than the 1080 Ti. Hardly worth getting excited over.


----------



## EarlZ

It's safe to assume that the Titan V's gaming performance will be at the same level as the next Ti, right?


----------



## smokerings

not at all


----------



## DStealth

1080Ti in AIO loop







https://www.3dmark.com/3dm/24217333


So close to 11k Superposition 4k...


----------



## 6u4rdi4n

Quote:


> Originally Posted by *EarlZ*
> 
> Its safe to assume that Titan V's gmaing performance will be the same level as the next Ti, right?


Not at all. Don't forget, there are two Titan cards with Pascal chips. Maybe this time, we'll see three!


----------



## mouacyk

Quote:


> Originally Posted by *EarlZ*
> 
> Its safe to assume that Titan V's gmaing performance will be the same level as the next Ti, right?


Not at all. But at least as much. As they remove unneeded compute components, there's more thermal/power headroom or even additional FP32 units that can be added.


----------



## KCDC

Was it the same with the 980ti and the Maxwell Titan? Pretty sure the Maxwell Titan was still better than the 980ti by a decent margin, correct me if I'm wrong.

If correct, and seeing the progression with how they treated Pascal - 1080ti Vs Titan X, it may be the same for Volta's 1180Ti vs Titan V, no? It would seem strange if they continually flip-flopped with how the chips are ranked with each iteration, but I'm no business/marketing expert.

Either way, I'm definitely waiting a bit longer this time around before instantly jumping on the 1180Ti wagon.


----------



## Radox-0

Quote:


> Originally Posted by *KCDC*
> 
> Was it the same with the 980ti and the Maxwell Titan? Pretty sure the Maxwell Titan was still better than the 980ti by a decent margin, correct me if I'm wrong.
> 
> If correct, and seeing the progression with how they treated Pascal - 1080ti Vs Titan X, it may be the same for Volta's 1180Ti vs Titan V, no? It would seem strange if they continually flip-flopped with how the chips are ranked with each iteration, but I'm no business/marketing expert.
> 
> Either way, I'm definitely waiting a bit longer this time around before instantly jumping on the 1180Ti wagon.


Not by a decent margin; they were on par for the most part, or the Titan X (Maxwell) was ahead by a tiny bit when comparing stock blower coolers. Not too surprising given the core-count difference of 8.3%, which translates to a couple of percent at similar clocks.


----------



## Mooncheese

Quote:


> Originally Posted by *ThrashZone*
> 
> Hi,
> I'd wait a little longer once the consumer version of 1280's hit a lot lower prices are likely Titan v release has the 1280's line looking good
> 
> 
> 
> 
> 
> 
> 
> 
> 600.us is hard to pass on though


I typically recommend waiting, but the 1280 or 2080 will in all likelihood be no more than 15% faster than the 1080 Ti, as the performance gap between architectures has been going all the way back to Fermi. The new x80 card is always only slightly faster than the outgoing Ti card.

The question is, are new Nvidia features that will accompany the new architecture worth waiting for?

May is still a ways out, that's when many think we will get a mainstream Volta reveal, with production following immediately thereafter.

I would probably buy a 1080 Ti for $600 now. It can always be sold quickly because of the demand mining has put on the GPU market.

Quote:


> Originally Posted by *DStealth*
> 
> 1080Ti in AIO loop
> 
> 
> 
> 
> 
> 
> 
> )
> https://www.3dmark.com/3dm/24217333
> 
> 
> So close to 11k Superposition 4k...


THAT IS INSANE.

5.5 GHz at 1.440!

I can't even imagine what your 1080 Ti is at for that score, 2200 MHz?

My 8700k is getting away with 5.1 GHz (Zero AVX), Dynamic Voltage at 1.386V, which I'm happy with, but 5.5 GHz has to be the highest I've seen so far.

Here's my before and after, i7 4930k @ 4.5 GHz + 980 Ti @ 1500 MHz vs. i7 8700k @ 5.1 GHz + 1080 Ti @ 2050 MHz with 1.0v.

Yes, I'm getting away with 2050 MHz on the undervolt frequency curve and it's stable. It just mined all last night, about 9 hours total, at 2012+ MHz with 0.975v or so, 120 PT, at 38C.

https://www.3dmark.com/compare/fs/14520125/fs/11807761#

Here's the latest post-build vid, full loop this time around:


----------



## EarlZ

So I fired up Mass Effect Andromeda, all maxed out including 200% render scale @ 1080p, so that's 4K rendering. It's floating around the 110% power limit and will constantly hit 120%.
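For reference, those power-limit percentages are relative to the board's rated TDP. For a reference 1080 Ti that's 250 W, so the slider values translate to watts roughly like this (a trivial sketch assuming the 250 W reference TDP; partner cards differ):

```python
# Power-target percentage to watts, assuming the 250 W TDP of a
# reference GTX 1080 Ti (partner cards ship with different limits).
def pt_to_watts(percent, tdp_w=250.0):
    return tdp_w * percent / 100.0

print(pt_to_watts(110))  # → 275.0
print(pt_to_watts(120))  # → 300.0
```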


----------



## skupples

It was the Kepler Titan that rang all the bells and whistles.


----------



## DStealth

Quote:


> Originally Posted by *Mooncheese*
> 
> THAT IS INSANE.
> 
> 5.5 GHz at 1.440!
> 
> I can't even imagine what your 1080 Ti is at for that score, 2200 MHz?
> 
> My 8700k is getting away with 5.1 GHz (Zero AVX), Dynamic Voltage at 1.386V, which I'm happy with, but 5.5 GHz has to be the highest I've seen so far.


Not insane, but a good CPU... 1.44v is now the reading, as putting the 1080 Ti in the loop means it needs 1.43v set in BIOS.
Prior to that it could bench 5.5+ with 1.42v set.



With HT off, 5600+ is not a problem; also just discovered that 5300+ MHz cache is achievable too.


----------



## Deathscythes

Hello,

I have a 1080 Ti Strix and flashed the XOC BIOS on it. Despite that modification I cannot get past 1.093V.
I have tried using GPU Tweak, MSI AB and the XOC BIOS voltage tool. None of these seems to work :/

Any help on this matter is very welcome. Thank you very much =)


----------



## Gunslinger.

Quote:


> Originally Posted by *Deathscythes*
> 
> Hello,
> 
> I have a 1080 Ti Strix and flashed the XOC BIOS on it. Despite that modification I cannot get past 1.093V.
> I have tried using GPU Tweak, MSI AB and the XOC BIOS voltage tool. None of these seems to work :/
> 
> Any help on this matter is very welcome. Thank you very much =)


The only software that will work is the XOC voltage tool.

https://onedrive.live.com/?authkey=%21AEILpAc%5FGsTuUds&cid=FABF1EAAEEBFA9D9&id=FABF1EAAEEBFA9D9%2160630&parId=FABF1EAAEEBFA9D9%2127654&action=locate


----------



## RoBiK

Quote:


> Originally Posted by *Gunslinger.*
> 
> The only software that will work is the XOC voltage tool.
> 
> https://onedrive.live.com/?authkey=%21AEILpAc%5FGsTuUds&cid=FABF1EAAEEBFA9D9&id=FABF1EAAEEBFA9D9%2160630&parId=FABF1EAAEEBFA9D9%2127654&action=locate


That is incorrect. You only need to flash the XOC BIOS and you will be able to set the voltage/frequency curve up to 1.2 V in Afterburner.


----------



## GRABibus

Quote:


> Originally Posted by *Deathscythes*
> 
> Hello,
> 
> I have a 1080 Ti Strix and flashed the XOC BIOS on it. Despite that modification I cannot get past 1.093V.
> I have tried using GPU Tweak, MSI AB and the XOC BIOS voltage tool. None of these seems to work :/
> 
> Any help on this matter is very welcome. Thank you very much =)


Did you try to use the V/F curve to set a voltage higher than 1.093 V ?


----------



## Deathscythes

Quote:


> Originally Posted by *Gunslinger.*
> 
> The only software that will work is the XOC voltage tool.
> 
> https://onedrive.live.com/?authkey=%21AEILpAc%5FGsTuUds&cid=FABF1EAAEEBFA9D9&id=FABF1EAAEEBFA9D9%2160630&parId=FABF1EAAEEBFA9D9%2127654&action=locate


Hi,

I did mention the XOC Voltage tool and it didn't work :/
Is there anything specific to do for it to work? maybe softwares are in conflict?


----------



## Deathscythes

Quote:


> Originally Posted by *GRABibus*
> 
> Did you try to use the V/F curve to set a voltage higher than 1.093 V ?


I may be saying something silly, but I did try to use the V/F curve in MSI AB, and if I understand correctly it adjusts the frequency depending on the Vcore, not the other way around.
Am I missing something? Thanks for the help =)


----------



## GRABibus

Quote:


> Originally Posted by *Deathscythes*
> 
> I may be saying something silly, but I did try to use the V/F curve in MSI AB, and if I understand correctly it adjusts the frequency depending on the Vcore, not the other way around.
> Am I missing something? Thanks for the help =)


Yes, I am talking about the voltage/frequency curve you get by pressing Ctrl-F inside the MSI AB window.

Here is my curve for example :

http://www.casimages.com/img.php?i=17122912165117369815428489.png

Gigabyte GTX 1080 Ti Gaming OC 11G with ASUS XOC Bios

I added +121MHz on Core over the 1569MHz base clock.
I added 441MHz on Memory.

And then I tweaked the curve to get 2126MHz on Core at 1.125V (I moved the V/F points between 1.125V and 1.2V in order to make a flat curve at 2126MHz between those 2 voltages).

Are we talking about the same thing?
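The flattening step described above amounts to pinning every curve point at or above a chosen voltage to one target clock, so the card never requests more than that voltage. A minimal sketch (the point values here are illustrative, not read from a real card):

```python
# Sketch of the V/F-curve flattening done in MSI Afterburner: every point
# at or above the chosen voltage gets the same target frequency.
def flatten_curve(points, target_mhz, max_voltage_v):
    """points: list of (voltage_v, freq_mhz) tuples sorted by voltage."""
    flattened = []
    for voltage, freq in points:
        if voltage >= max_voltage_v:
            freq = target_mhz  # flat segment: same clock from here up
        flattened.append((voltage, freq))
    return flattened

if __name__ == "__main__":
    curve = [(1.000, 1988), (1.062, 2050), (1.125, 2126),
             (1.150, 2151), (1.200, 2202)]
    print(flatten_curve(curve, target_mhz=2126, max_voltage_v=1.125))
```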


----------



## Mooncheese

Quote:


> Originally Posted by *DStealth*
> 
> Not insane but good CPU...1.44v is now the reading as putting the 1080ti in the loop needs 1.43 in BIOS
> Prior this could bench 5.5+ with 1.42v set
> 
> 
> 
> HT off 5600+ are not a problem also just discovered 5300+Mhz Cache is achievable also


So what is the voltage limit of 8700k? Do you run yours at 1.44v all day or do you mostly benchmark? Where are you on the 3DMark leaderboard? 5.5 GHz is the highest I've seen so far, seriously impressive.


----------



## Deathscythes

Quote:


> Originally Posted by *GRABibus*
> 
> Yes, I am talking about the voltage/frequency curve you get by pressing Ctrl-F inside the MSI AB window.
> 
> Here is my curve for example :
> 
> http://www.casimages.com/img.php?i=17122912165117369815428489.png
> 
> Gigabyte GTX 1080 Ti Gaming OC 11G with ASUS XOC Bios
> 
> I added +121MHz on Core over the 1569MHz base clock.
> I added 441MHz on Memory.
> 
> And then I tweaked the curve to get 2126MHz on Core at 1.125V (I moved the V/F points between 1.125V and 1.2V in order to make a flat curve at 2126MHz between those 2 voltages).
> 
> Are we talking about the same thing?


Yes, we are talking about the same thing.








Actually, the only thing I had to do was simply apply the curve, it seems, without making any specific change. Many thanks man =)
I still don't get why the voltage tool does not work, however :/

What max voltage/temp do you recommend under water?


----------



## DStealth

Quote:


> Originally Posted by *Mooncheese*
> 
> So what is the voltage limit of 8700k? Do you run yours at 1.44v all day or do you mostly benchmark? Where are you on the 3DMark leaderboard? 5.5 GHz is the highest I've seen so far, seriously impressive.


In the datasheets Intel recommends not going over 1.52 for 14nm++. I run 1.43v now for 24/7 gaming and benching, with very good temps delidded. The 3DMark leaderboard (HOF) moves fast; with a single card on air I was able to place 22nd when the 8700k came out a couple of months ago, but now that many people are benching the 7980XE (and 10+ core CPUs) on exotic cooling, and the Titan V has come into play, last I checked I was in the top 40s for single-card scores, even on water and with a much better score.


----------



## pcgaming247

Quote:


> Originally Posted by *Mooncheese*
> 
> Windows 10 Fall Creators Update?


Yeah, I think so.


----------



## GreedyMuffin

I can do 5.3 GHz at 1.39V, but I don't feel like it is safe for 24/7 operation.


----------



## Mooncheese

Quote:


> Originally Posted by *DStealth*
> 
> In the datasheets Intel recommends not going over 1.52 for 14nm++. I run 1.43v now for 24/7 gaming and benching, with very good temps delidded. The 3DMark leaderboard (HOF) moves fast; with a single card on air I was able to place 22nd when the 8700k came out a couple of months ago, but now that many people are benching the 7980XE (and 10+ core CPUs) on exotic cooling, and the Titan V has come into play, last I checked I was in the top 40s for single-card scores, even on water and with a much better score.


1.52 is the max voltage for the 8700k?! This is the first I've heard of this figure, and I must say that's a lot higher than I'm used to; most are saying 1.4v is max. Would you mind pointing to exactly where Intel specifies 1.52v for the 8700k? Thanks.
Quote:


> Originally Posted by *pcgaming247*
> 
> Yeah, I think so.


Fall Creators Update has introduced MASSIVE problems into systems:

https://forums.geforce.com/default/topic/1004600/geforce-drivers/all-games-stuttering-with-fps-drops-since-windows-10-creators-update/159/

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I can do 5.3 GHz at 1.39V, but I don't feel like it is safe for 24/7 operation.


I'm running Dynamic Voltage, 5.1 GHz, -1 AVX, 1.386v, which I reduce to 3.6 GHz @ 1.0v by switching power plans via hotkey as my PC is on 24/7 mining and I only use it for gaming maybe 4 hours a day:






I highly recommend getting Dynamic/Adaptive voltage going if you're worried about pumping a lot of voltage through your chip 24/7.


----------



## Falkentyne

Quote:


> Originally Posted by *DStealth*
> 
> In the datasheets Intel recommends not going over 1.52 for 14nm++. I run 1.43v now for 24/7 gaming and benching, with very good temps delidded. The 3DMark leaderboard (HOF) moves fast; with a single card on air I was able to place 22nd when the 8700k came out a couple of months ago, but now that many people are benching the 7980XE (and 10+ core CPUs) on exotic cooling, and the Titan V has come into play, last I checked I was in the top 40s for single-card scores, even on water and with a much better score.


1.52v is NOT the max voltage. 1.52v VID is. There is a BIG BIG BIG difference between VID and voltage.


----------



## DStealth

Quote:


> Originally Posted by *Mooncheese*
> 
> 1.52 is the max voltage for the 8700k?! This is the first I've heard of this figure, and I must say that's a lot higher than I'm used to; most are saying 1.4v is max. Would you mind pointing to exactly where Intel specifies 1.52v for the 8700k? Thanks.


Here
https://www.intel.com/content/www/us/en/processors/core/8th-gen-processor-family-s-platform-datasheet-vol-1.html
Page 110 ; Table 7-2 at the very first row
Quote:


> Originally Posted by *Falkentyne*
> 
> 1.52v is NOT the max voltage. 1.52v VID is. There is a BIG BIG BIG difference between VID and voltage.


It would be even worse if it were VID, as that's what's needed to operate at stock speeds.









Everyone decides for themselves, but from my experience with the 8700k from launch date till now, across 4-5 CPUs, as long as temps are fine, 1.45 to 1.48 loaded is not an issue... dunno about a 2-3-5 year period... but for the moment zero degradation is observed.


----------



## Mooncheese

Quote:


> Originally Posted by *DStealth*
> 
> Here
> https://www.intel.com/content/www/us/en/processors/core/8th-gen-processor-family-s-platform-datasheet-vol-1.html
> Page 110 ; Table 7-2 at the very first row
> It would be even worse if it were VID, as that's what's needed to operate at stock speeds.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Everyone decides for themselves, but from my experience with the 8700k from launch date till now, across 4-5 CPUs, as long as temps are fine, 1.45 to 1.48 loaded is not an issue... dunno about a 2-3-5 year period... but for the moment zero degradation is observed.


Oh dude, if Intel is referring to VID, there is basically a 100mV difference between VID and Vcore, i.e. if my VID says 1.475v my Vcore is at 1.375v.

If Intel is saying a max VID of 1.52V, that would be about 1.42v Vcore.

I don't know that the silicon has changed fundamentally. 1.4V has always been the max. You won't kill the chip immediately, but what will happen is that eventually your OC will become unstable and require more voltage, so you'll up the voltage; then it will go unstable again relatively quickly, so you'll up the voltage once more, rinse and repeat until your chip requires 1.4V for 4.6 GHz.


----------



## DStealth

Don't wanna argue; use your CPU with sub-1.4 if you're afraid.








Here's a good explanation in this forum regarding VID - http://www.overclock.net/t/665362/vid-voltage-identification-explained/0_50

In the Intel document it is clearly stated for Coffee Lake, in this table and in the table below for the iGPU: max voltage 1.52.
I have a 2600k running 5GHz since launch date (7 years) @ 1.5v and no degradation is observed; just changing the paste from time to time helps stabilize temps and the OC... also pushed it very hard in 2011, see


And I also forgot to say: it's not the voltage that kills the CPU, the current does


----------



## Blameless

Quote:


> Originally Posted by *DStealth*
> 
> In the datasheets Intel recommends not going over 1.52 for 14nm++


Neither absolute maximum ratings, nor the upper ends of the VID range, are meant to imply a safe voltage for a given sample.

Stock VID of parts can vary quite widely because leakage current varies quite widely from sample to sample and core to core. Anything over stock is out of spec, and setting a VID near the top end of the range for the entire line is going to be extremely out of spec for the vast majority of parts.

Voltage and loads that don't cause appreciable degradation to one part could kill another in short order.
Quote:


> Originally Posted by *DStealth*
> 
> Here
> https://www.intel.com/content/www/us/en/processors/core/8th-gen-processor-family-s-platform-datasheet-vol-1.html
> Page 110 ; Table 7-2 at the very first row


Now read the notes right under that table.
Quote:


> Originally Posted by *Mooncheese*
> 
> Oh dude, if Intel is referring to VID, there is basically a 100mV difference between VID and Vcore, i.e. if my VID says 1.475v my Vcore is at 1.375v.
> 
> If Intel is saying a max VID of 1.52V, that would be about 1.42v Vcore.


The difference between VID and actual vcore is based on the loadline and the current draw of the part, but yes, under load Intel spec mandates significant vcore vdroop on parts that don't have FIVRs.
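That loadline relationship can be illustrated in a couple of lines: loaded Vcore sits below VID by I × R_loadline (vdroop). The 1.6 mΩ loadline and 100 A load below are illustrative numbers, not a spec for any particular board:

```python
# Loaded Vcore under a loadline: Vcore = VID - I * R_loadline.
# Numbers are illustrative only, not an Intel or board-vendor spec.
def loaded_vcore(vid_v, current_a, loadline_mohm):
    return vid_v - current_a * (loadline_mohm / 1000.0)

if __name__ == "__main__":
    # A 1.52 V VID under a 100 A load on a 1.6 mOhm loadline:
    print(round(loaded_vcore(1.52, 100, 1.6), 3))  # → 1.36
```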
Quote:


> Originally Posted by *DStealth*
> 
> And I also forgot to say: it's not the voltage that kills the CPU, the current does


Voltage, temperature, and current all influence the rate of electromigration, which is the predominant mode of failure, outside of mechanical failures due to thermal cycling.


----------



## GreedyMuffin

Yeah. I think I'll stick to 5 ghz at 1.230V for 24/7. Better be safe than sorry.


----------



## OrionBG

Hey guys,
I have an FE 1080 Ti and will be watercooling the beast. I want to do the shunt mod, but since there have been issues with liquid metal and solder in some cases, I was wondering if there are other non-destructive ways to do it. A pencil maybe? I have an 8B pencil, which is the softest I could source; I drew a thick line with it on a piece of paper and it is conductive... Or maybe some conductive paint marker?
Has anyone done it with something other than liquid metal?


----------



## DStealth

OK, pushed the card to 2215/12400. Almost 34k GPU score in Firestrike.
https://www.3dmark.com/fs/14562017

https://www.3dmark.com/spy/3037002


----------



## Mooncheese

Quote:


> Originally Posted by *Blameless*
> 
> Neither absolute maximum ratings, nor the upper ends of the VID range, are meant to imply a safe voltage for a given sample.
> 
> Stock VID of parts can vary quite widely because leakage current varies quite widely from sample to sample and core to core. Anything over stock is out of spec, and setting a VID near the top end of the range for the entire line is going to be extremely out of spec for the vast majority of parts.
> 
> Voltage and loads that don't cause appreciable degradation to one part could kill another in short order.
> Now read the notes right under that table.
> The difference between VID and actual vcore is based on the loadline and the current draw of the part, but yes, under load Intel spec mandates significant vcore vdroop on parts that don't have FIVRs.
> Voltage, temperature, and current all influence the rate of electromigration, which is the predominant mode of failure, outside of mechanical failures due to thermal cycling.



So what is the max voltage for 8700k? Or is it 1.52v?

Quote:


> Originally Posted by *OrionBG*
> 
> Hey guys,
> I have a FE 1080ti and will be watercooling the beast. I want to do the shunt mod but since there have been issues with liquid metal and solder in some cases, I was wandering if there are other non destructive ways to do it. A pencil maybe? I have one 8B pencil which is the softest I could source. I drew a thick line with it on a piece of paper and it is conductive... Or maybe some conductive paint marker?
> Anyone done it with something different than a liquid metal?


Are you actually in need of more performance? In one of the few games where I actually needed more performance, The Witcher 3 in 3D Vision at 2560x1440 (where 60 FPS in 3D is 120 FPS in 2D), I recouped a lot of performance simply by running a 1.0v undervolt, and now the wattage-starvation-induced frequency dips don't go below, say, 2000 MHz, whereas with default voltage they would dip all the way down to 1926 MHz.

Undervolting also meant my core was 4-5C cooler than at default voltage, which allowed for negating one 13 MHz bin drop by keeping said core under 50C (where a 13 MHz bin drop occurs).

But that was with the AIO; now, with a full water block, the clocks don't dip below 2050 MHz under 30C, and at 37C or so, where they stabilize, they don't dip under 2038 MHz. Yes, 2038 MHz with 1.0v and +400 memory. No need to worry about a botched shunt mod destroying your card.
Quote:


> Originally Posted by *DStealth*
> 
> OK, pushed the card to 2215/12400. Almost 34k GPU score in Firestrike.
> https://www.3dmark.com/fs/14562017
> 
> https://www.3dmark.com/spy/3037002


Very nice scores, how much power and voltage is this card requiring and what are you doing for cooling?


----------



## OrionBG

Quote:


> Originally Posted by *Mooncheese*
> 
> Are you actually in need of more performance? In one of the few games where I actually needed more performance, The Witcher 3 in 3D Vision at 2560x1440 (where 60 FPS in 3D is 120 FPS in 2D), I recouped a lot of performance simply by running a 1.0v undervolt, and now the wattage-starvation-induced frequency dips don't go below, say, 2000 MHz, whereas with default voltage they would dip all the way down to 1926 MHz.
> 
> Undervolting also meant my core was 4-5C cooler than at default voltage, which allowed for negating one 13 MHz bin drop by keeping said core under 50C (where a 13 MHz bin drop occurs).
> 
> But that was with the AIO; now, with a full water block, the clocks don't dip below 2050 MHz under 30C, and at 37C or so, where they stabilize, they don't dip under 2038 MHz. Yes, 2038 MHz with 1.0v and +400 memory. No need to worry about a botched shunt mod destroying your card.
> Very nice scores, how much power and voltage is this card requiring and what are you doing for cooling?


Let's face it, it is not a question of "need" but a question of "want"







(after all, I've just ordered a Threadripper 1950X from Amazon)
I just want to push this thing to its limits and then move those limits higher (with the shunt mod). I've worked in a PC repair shop for nearly 11 years and I've seen a lot of hardware... Supposedly I know how not to kill my GPU (but still, you never know)...
Just want to know if somebody has done it (successfully) with something other than liquid metal compounds.


----------



## Mooncheese

Quote:


> Originally Posted by *OrionBG*
> 
> Let's face it, it is not a question of "need" but a question of "want"
> 
> 
> 
> 
> 
> 
> 
> (after all, I've just ordered a Threadripper 1950X from Amazon)
> I just want to push this thing to its limits and then move those limits higher (with the shunt mod). I've worked in a PC repair shop for nearly 11 years and I've seen a lot of hardware... Supposedly I know how not to kill my GPU (but still, you never know)...
> Just want to know if somebody has done it (successfully) with something other than liquid metal compounds.


Good thing you bought it from Amazon, as they have a great return policy.

For actually wringing all of the performance out of your 1080 Ti in gaming you would be better off with an 8700k, gaming benchmarks start here:






For everything else, sure, the 1950X is better. But this is the 1080 Ti forum, and you're going to be CPU-bottlenecked everywhere with that CPU.


----------



## DStealth

Quote:


> Originally Posted by *Mooncheese*
> 
> Very nice scores, how much power and voltage is this card requiring and what are you doing for cooling?


Power is not measured... benches are with 1.2v on the ASUS XOC BIOS; cooling is the stock MSI Seahawk EK X, just running 3°C water in the loop.


----------



## Mooncheese

So seahawk X
Quote:


> Originally Posted by *DStealth*
> 
> Power is not measured... benches are with 1.2v on the ASUS XOC BIOS; cooling is the stock MSI Seahawk EK X, just running 3°C water in the loop.


So it's just a reference-PCB 1080 Ti with the XOC VBIOS?

My 1080 Ti FE is under water as well; not sure I'm ready to pump 1.2v through it, though. But yeah, 34k GPU Firestrike is impressive.

What kind of temps do you see under load?


----------



## mouacyk

Quote:


> Originally Posted by *Mooncheese*
> 
> So seahawk X
> So it's just a reference-PCB 1080 Ti with the XOC VBIOS?
> 
> My 1080 Ti FE is under water as well; not sure I'm ready to pump 1.2v through it, though. But yeah, 34k GPU Firestrike is impressive.
> 
> What kind of temps do you see under load?


The Seahawk EK X is not a reference PCB. It has 8+8-pin PCIe connectors vs. the reference 6+8.


----------



## OrionBG

Quote:


> Originally Posted by *Mooncheese*
> 
> Good thing you bought it from amazon as they have a great return policy.
> 
> For actually wringing all of the performance out of your 1080 Ti in gaming you would be better off with an 8700k, gaming benchmarks start here:
> 
> 
> 
> 
> 
> 
> For everything else, sure, the 1950X is better. But this is the 1080 Ti forum, and you're going to be CPU-bottlenecked everywhere with that CPU.


I'm done with Intel CPUs for the time being. Will be on team AMD for the foreseeable future (running a Ryzen7 1700 at 4GHz right now) but I wanted a high-end platform so no i7 for me... Also, I'll be watercooling everything so 4GHz and a little bit up is quite possible. Not to mention that running Threadripper's memory sub-system in NUMA mode instead of UMA (the default one) gives a nice boost in most of the games, bringing the performance on par.
But still, thanks for the info!

End of offtopic.

Back to my original question


----------



## Blameless

Quote:


> Originally Posted by *OrionBG*
> 
> Hey guys,
> I have an FE 1080 Ti and will be watercooling the beast. I want to do the shunt mod, but since there have been issues with liquid metal and solder in some cases, I was wondering if there are other non-destructive ways to do it. A pencil maybe? I have an 8B pencil, which is the softest I could source; I drew a thick line with it on a piece of paper and it is conductive... Or maybe some conductive paint marker?
> Has anyone done it with something other than liquid metal?


Soldering a resistor of correct resistance in parallel is probably the best bet.

Conductive paint will generally lower resistance too much, while pencil is fragile and thus temporary (even if you cover it, it will eventually degrade).
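The math behind the parallel-resistor suggestion is simple: the controller reads current across the shunt, so lowering the effective shunt resistance makes it under-read power by the ratio R_effective / R_shunt. A sketch, assuming the commonly cited 5 mΩ shunts on 1080 Ti boards (treat that value as an assumption, and verify on your own card):

```python
# Shunt-mod math: a resistor in parallel with the sense shunt lowers the
# effective resistance, so the controller under-reads current and power.
# The 5 mOhm shunt value is an assumption, not measured from a real board.
def parallel(r1, r2):
    return (r1 * r2) / (r1 + r2)

def reported_power(actual_w, r_shunt, r_added):
    r_eff = parallel(r_shunt, r_added)
    return actual_w * (r_eff / r_shunt)

if __name__ == "__main__":
    # Adding a 5 mOhm resistor across a 5 mOhm shunt halves the reading:
    print(round(reported_power(300.0, 0.005, 0.005), 1))  # → 150.0
```

Picking the added resistor's value therefore sets exactly how far the power limit is stretched, which is why a fixed resistor is more predictable than a blob of liquid metal of unknown resistance.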
Quote:


> Originally Posted by *Mooncheese*
> 
> So what is the max voltage for 8700k? Or is it 1.52v?


Officially? Whatever is stock for the specific sample in question, assuming none of the operating specs are violated in any way.

Unofficially, whatever you can get away with without damage or degradation, which will also vary widely from sample to sample, operating environment to environment, overclock to overclock, and intended use/lifetime.

Any fixed figure can only be a rough guideline, at best, which is why it's generally best to make conservative recommendations.


----------



## pcgaming247

Quote:


> Originally Posted by *Mooncheese*
> 
> Fall Creators Update has introduced MASSIVE problems into systems:


How can I be sure it's Windows?


----------



## DStealth

Quote:


> Originally Posted by *Mooncheese*
> 
> So seahawk X
> So it's just a reference-PCB 1080 Ti with the XOC VBIOS?
> 
> My 1080 Ti FE is under water as well; not sure I'm ready to pump 1.2v through it, though. But yeah, 34k GPU Firestrike is impressive.
> 
> What kind of temps do you see under load?


As already answered, the PCB is not reference; a couple of MSI models share the same layout and 2x 8-pin connectors.
Under load with 1.2v, temps are close to 20°C; with 1.1 or 1.15 they are closer to 10-15°C while maximum load is applied.
This is how I left the 24/7 OC: 1.15v, 2164/12140, and temps in FS Ultra are in the 14°C range during loops - https://www.3dmark.com/fs/14570357


----------



## nrpeyton

Quote:


> Originally Posted by *OrionBG*
> 
> Hey guys,
> I have an FE 1080 Ti and will be watercooling the beast. I want to do the shunt mod, but since there have been issues with liquid metal and solder in some cases, I was wondering if there are other non-destructive ways to do it. A pencil maybe? I have an 8B pencil, which is the softest I could source; I drew a thick line with it on a piece of paper and it is conductive... Or maybe some conductive paint marker?
> Has anyone done it with something other than liquid metal?


Someone on this forum did indeed do something similar with conductive paint.


----------



## mouacyk

Quote:


> Originally Posted by *OrionBG*
> 
> Hey guys,
> I have an FE 1080 Ti and will be watercooling the beast. I want to do the shunt mod, but since there have been issues with liquid metal and solder in some cases, I was wondering if there are other non-destructive ways to do it. A pencil maybe? I have an 8B pencil, which is the softest I could source; I drew a thick line with it on a piece of paper and it is conductive... Or maybe some conductive paint marker?
> Has anyone done it with something other than liquid metal?


Have you considered flashing the Asus XOC BIOS? It should be compatible with FE and will eliminate ALL power throttling.


----------



## IvantheDugtrio

Just picked up a FTW3 from B&H. It should arrive in a week or so. Also bought an EK block to go in my loop. Let's see how many watts I can push through this thing.


----------



## goodwidp

Was able to score a great deal on an EVGA SC Black Edition 1080 Ti: $690 from Amazon. It was listed as used (technically "like new"), but the only issue was the retail box having a few rips/creases. The card itself was untouched and still had all the OEM protective film on it. It runs amazingly so far, though I haven't tried overclocking it yet. This was an upgrade from a 970, and it's been incredible to see the performance difference between the two.


----------



## OrionBG

Quote:


> Originally Posted by *nrpeyton*
> 
> Someone on this forum did indeed do something similar with conductive paint.


Let's see if he will reveal himself








Quote:


> Originally Posted by *mouacyk*
> 
> Have you considered flashing the Asus XOC BIOS? It should be compatible with FE and will eliminate ALL power throttling.


I have considered it but to my knowledge, the XOC BIOS will not give me bigger power limits without the shunt mod... or am I wrong?


----------



## DStealth

You're wrong... it will.








I burned my first card while it was shunt-modded with liquid metal, and I definitely don't recommend doing it.
Just stick with the XOC BIOS; it's great for benchmarks, and for 24/7 gaming it just consumes a couple of watts more compared to regular ones.









Happy New Year 2018, folks!


----------



## Lefty23

Quote:


> Originally Posted by *OrionBG*
> 
> I have considered it but to my knowledge, the XOC BIOS will not give me bigger power limits without the shunt mod... or am I wrong?


I'm using the XOC BIOS on a non-shunt-modded EVGA FE (under a full waterblock) for benchmarking (I always seem to go back to the stock BIOS for 24/7 usage). I'm definitely getting higher power usage with it.
Just did a quick run of Time Spy test 2 @ 1.162v -- 2177 core -- 6280 mem. You can see usage over the stock 300W limit and peaks over 400W:


Spoiler: HWiNFO -- MSI AB screenshot -- right click -> open in new tab, then left click to see full size









Spoiler: nvidia-smi output (1 sec polling period)



Code:

C:\Windows\system32>"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -q -d power -l 1
        Power Draw                  : 61.73 W
        Power Draw                  : 61.74 W
        Power Draw                  : 170.15 W
        Power Draw                  : 323.10 W
        Power Draw                  : 367.06 W
        Power Draw                  : 372.62 W
        Power Draw                  : 375.32 W
        Power Draw                  : 370.21 W
        Power Draw                  : 361.04 W
        Power Draw                  : 390.56 W
        Power Draw                  : 349.99 W
        Power Draw                  : 354.15 W
        [... log trimmed: load readings ranged from ~314 W to a peak of 415.60 W, dropping to ~62 W at idle ...]
        Power Draw                  : 62.13 W
        Power Draw                  : 61.74 W





Happy new year to everyone!


----------



## OrionBG

Quote:


> Originally Posted by *Lefty23*
> 
> I'm using the XOC bios on a non shunt modded EVGA FE (under full waterblock) for benchmarking (always seem to go back to stock bios for 24/7 usage). I'm definitely getting higher power usage with it.
> Just did a quick run of Timespy test 2 @1.162v -- 2177 core -- 6280 mem. You can see usage over the stock 300W limit and peaks over 400W:
> Happy new year to everyone!


I've installed the XOC bios and now nvidia-smi reports no limits. Unfortunately, while running Superposition with Afterburner for monitoring, the moment the GPU goes to 100% utilization I get the Power Limit sign in the OSD and the frequency goes down one step. Running nvidia-smi, I saw that this happens at 300 W power usage. So I'm still limited somehow. Running the nvidia-smi -pl xxx command does not work with this XOC bios, so I don't know how to make the card consume more power... Temps were stable during those tests...
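As an aside, the point at which the limiter kicks in can be confirmed by polling board power from a script. This is only a sketch (the helper names are my own; it assumes an NVIDIA driver with `nvidia-smi` on the PATH):

```python
import subprocess

def read_power_w():
    """Read the current board power draw in watts via nvidia-smi's CSV query.
    Requires nvidia-smi to be installed and on PATH."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"], text=True)
    return float(out.strip().splitlines()[0])

def summarize(readings):
    """Min/max/average of a list of power readings in watts."""
    return {"min_w": min(readings), "max_w": max(readings),
            "avg_w": round(sum(readings) / len(readings), 2)}
```

For example, `summarize([read_power_w() for _ in range(60)])` during a benchmark run shows at a glance whether the card ever crosses 300 W.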


----------



## mouacyk

Quote:


> Originally Posted by *OrionBG*
> 
> I've installed the XOC bios and not nvidia-smi reports no limits. Unfortunately while running superposition with Afterburner for monitoring, the moment the GPU goes to 100% utilization I get the Power limit sign in the OSD and the frequency goes down one step. Running nvidia-smi I saw that this happens at 300W power usage. So I'm still limited somehow. running the nvidia-smi -pl xxx command does not work with this XOC bios so I don't know how to make the card to consume more power... Temp were stable during those tests...


Did you follow the instructions here properly? http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti

When I had it working properly on my MSI EK X for benchmarking, the Power Target setting in Afterburner becomes disabled. Make sure you have the right Asus XOC BIOS from that linked page which is meant for LN2. I believe there is another "XOC" BIOS that doesn't remove the power limits.

Another thing to note in the flashing is that "--protectoff" and "-6" parameters are important to a successful flash.
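Those two flags can be wrapped in a small helper so the sequence is explicit. A sketch only (the `flash_xoc_bios` name and `xoc.rom` path are placeholders of mine; run nvflash from an elevated prompt as the linked guide describes):

```python
import subprocess

def flash_xoc_bios(rom_path, dry_run=True):
    """Sketch of the flash sequence from the guide above: '--protectoff'
    first to disable write protection, then '-6' to force the flash
    despite the board ID mismatch. rom_path is a placeholder filename."""
    cmds = [["nvflash", "--protectoff"],
            ["nvflash", "-6", rom_path]]
    if not dry_run:
        for cmd in cmds:
            subprocess.run(cmd, check=True)
    return cmds
```

With `dry_run=True` it just returns the command list, which is handy for double-checking before committing to a flash.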


----------



## DStealth

I can run 2200+ at 1.2 V with 450 W+ consumption without mods. Reflash your card; something definitely is wrong.
Lock your frequency via the curve and see how it behaves.


----------



## Lefty23

Quote:


> Originally Posted by *OrionBG*
> 
> I've installed the XOC bios and not nvidia-smi reports no limits. Unfortunately while running superposition with Afterburner for monitoring, the moment the GPU goes to 100% utilization I get the Power limit sign in the OSD and the frequency goes down one step. Running nvidia-smi I saw that this happens at 300W power usage. So I'm still limited somehow. running the nvidia-smi -pl xxx command does not work with this XOC bios so I don't know how to make the card to consume more power... Temp were stable during those tests...


That is weird...
Have you uninstalled the drivers after the flash?
I do this using DDU in safe mode and then I install the drivers again.
See also the OP here:
http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti
I also clean install MSI AB (I do this ever since reusing profiles from a previous bios caused me problems), but I'm pretty sure this is not necessary.
From what you say (nvidia-smi reporting no power limit) it sounds like the flash was successful. My only guess is there might be something going on with the drivers.

edit: as mouacyk mentioned above, another indication of a successful flash is that the Power Limit & Temp Limit sliders in MSI AB should now be greyed out.


----------



## OrionBG

Quote:


> Originally Posted by *mouacyk*
> 
> Did you follow the instructions here properly? http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti
> 
> When I had it working properly on my MSI EK X for benchmarking, the Power Target setting in Afterburner becomes disabled. Make sure you have the right Asus XOC BIOS from that linked page which is meant for LN2. I believe there is another "XOC" BIOS that doesn't remove the power limits.
> 
> Another thing to note in the flashing is that "--protectoff" and "-6" parameters are important to a successful flash.


Quote:


> Originally Posted by *DStealth*
> 
> I can run 2200+ 1,2v with 450w + consumption w/o mods. Reflash your card something definitely is wrong.
> Lock your frequency via curve and see how it behaves.


Quote:


> Originally Posted by *Lefty23*
> 
> That is weird..
> Have you uninstalled the drivers after the flash?
> I do this using DDU in safe mode and then I install the drivers again.
> See also OP here:
> http://www.overclock.net/t/1627212/how-to-flash-a-different-bios-on-your-1080-ti
> I also clean install MSI AB (I do this ever since it has caused me problems reusing the profiles I had from a previous bios) but I'm pretty sure this not necessary.
> From what you say (nvidia-smi reporting no power limit) it sounds like the flash was successful. My only guess is there might be something going on with the drivers.


The flashing went OK. All done as the guide says. Right now my card is recognized as an ASUS card and, as I've mentioned, smi reports no limits; the stock card goes to 2000 MHz right away and stays there until it hits 300 W, then shows Power Limit and goes a step down...
One thing that I have not done completely is the driver uninstall... I was using a slightly older driver anyway, and after the flash I downloaded the newest one from nvidia and chose clean installation, which supposedly removes any old ones, but still... I'll try with DDU and will report back.


----------



## RoBiK

Quote:


> Originally Posted by *OrionBG*
> 
> The flashing went OK. All done as the guide says. Right now my card is being recognized as an ASUS card and as I've mentioned the smi reports no limits and the stock card goes to 2000MHz right away and stays there until it hits 300W and then Power Limit and goes a step down...
> One thing that I have not done completely is the driver uninstall... I was using a little older driver anyway and after the flash I downloaded the newest one from nvidia and chose clean installation that supposedly removed any old ones but still... I'll try with DDU and will report back


Are you sure the drop in frequency is because of power draw and not because of temperature?


----------



## DStealth

That's why I suggested locking the curve... each 10°C without a lock will result in one bin (13 MHz) lower.


----------



## OrionBG

Quote:


> Originally Posted by *RoBiK*
> 
> Are you sure the drop in frequency is because of power draw and not because of temperature?


Quote:


> Originally Posted by *DStealth*
> 
> That's why suggested locking the curve ...each 10* w/o lock will result to one bin 13mhz lower


There are no temp limits in this bios either. Theoretically the card will fry itself, but it will not throttle because of temperature. Also, the max temp I got during the short test bursts was about 60C (still on air, just for testing; not going to run it for more than a minute like that).
How should I set that curve? The only curve I've found is the one with the voltages and frequencies, but it looks like it's written in some ancient Lantean dialect (watching Stargate Atlantis right now)


----------



## DStealth

It has temp limits without locking the curve frequency/voltage. It doesn't matter whether you are at 60 or 40; just crossing the border from 39 to 40... 49 to 50... 59 to 60 will result in a frequency/voltage drop.

And it has nothing to do with PT limits... it's just Nvidia driver behavior, called GPU Boost 3.0.
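That rule of thumb can be written down as a tiny model. This is purely illustrative (the `boosted_clock` helper and its defaults are my own naming; the 13 MHz bin size and 10°C borders are as stated above, and real GPU Boost 3.0 behavior is more complex):

```python
def boosted_clock(unthrottled_mhz, temp_c, bin_mhz=13,
                  first_threshold_c=40, step_c=10):
    """Estimate the effective clock under GPU Boost 3.0 temperature binning:
    each 10 C border crossed (40, 50, 60 C, ...) costs one ~13 MHz bin.
    Illustrative model only, not a simulation of the actual driver."""
    if temp_c < first_threshold_c:
        return unthrottled_mhz
    bins_lost = (temp_c - first_threshold_c) // step_c + 1
    return unthrottled_mhz - bins_lost * bin_mhz
```

For example, at 35°C the card holds its full boost clock, while at 62°C it has crossed the 40, 50, and 60°C borders and sits three bins (39 MHz) lower.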


----------



## OrionBG

Quote:


> Originally Posted by *DStealth*
> 
> It has temp limits w/o locking the curve frequency/voltage. It's no matter if you have 60 or 40 just crossing the border 39 to 40...49 to 50...59 to 60 will result in frequency/voltage drop
> 
> 
> 
> 
> 
> 
> 
> 
> And has nothing related with PT limits...just Nvidia driver related


I guess I'll have to wait until I rebuild my system with everything water-cooled to get a stable temperature and then see how it will behave.


----------



## Blameless

Quote:


> Originally Posted by *DStealth*
> 
> It has temp limits w/o locking the curve frequency/voltage. It's no matter if you have 60 or 40 just crossing the border 39 to 40...49 to 50...59 to 60 will result in frequency/voltage drop
> 
> 
> 
> 
> 
> 
> 
> 
> And has nothing related with PT limits...just Nvidia driver related...called GPU boost 3.0


It's also worth pointing out that there is a hysteresis to that temp vs. clock slope, so reducing temperature won't restore the previous step until about half way to the next lower step.
Quote:


> Originally Posted by *OrionBG*
> 
> I guess I'll have to wait until I rebuild my system with everything water-cooled to get a stable temperature and then see how it will behave.


You can get a stable temperature with any cooling...worse cooling just means a higher temp.

In my case, I do all of my testing at 68C (I have a fan curve that keeps the card at that temperature regardless of load, for testing purposes), because that's the temperature I know I can maintain on my stock (though non-reference) air cooled AORUS at high load, in warm ambients, without being too obnoxiously loud (for me).

Sometimes this means I get another 12.5MHz bin over the clocks I've chosen, if my ambients are cooler than what I test at, but that's not such a bad thing.


----------



## OrionBG

I understand now. The only strange thing for me is that the card is not consuming more than 300 W... I'll do some more tests to get a better idea of what is going on, when, and why...


----------



## OrionBG

So... I tried one more thing. I overclocked my GPU and RAM as much as possible to get maximum power draw, and this time, under Superposition, the Power Limit indicator appeared almost immediately. The card again did not pull more than 300 W according to SMI...
I'll do a complete driver cleanup tomorrow and test again...


----------



## DStealth

Within the factory max voltage (1.093/1.081 V), 300 W gets you, depending on the chip, almost 2100 MHz. If you need higher consumption, just raise the voltage and lock it via the curve to, let's say, 1.12-1.15 V to see higher consumption.


----------



## OrionBG

Quote:


> Originally Posted by *DStealth*
> 
> Within factory voltage max 1.093/1.081 300w are depending on the chip almost 2100 if you need higher consumpltion just rise voltage and lock via curve to let say 1.12-1.15 too see better consumption


Thanks, will try that.


----------



## duganator

I tried to join the club yesterday, but microcenter had just sold out of their last 1080ti. From my research it seems like the strix is the best air cooled option, is there a consensus among owners as to which air cooled card is the best?


----------



## DStealth

Haven't benched FS Extreme in a long time. How is this score for a sub-2200 core, for comparison, if anyone is still running this benchmark?
https://www.3dmark.com/fs/14590079


----------



## barch88

I just picked that up. Now to learn how to OC my GPU. I just learned how to OC my cpu and got my i5 Haswell stable at 4.7 @1.25v


----------



## GOLDDUBBY

Quote:


> Originally Posted by *duganator*
> 
> I tried to join the club yesterday, but microcenter had just sold out of their last 1080ti. From my research it seems like the strix is the best air cooled option, is there a consensus among owners as to which air cooled card is the best?


The Gigabyte Aorus Extreme and EVGA Kingpin have binned chips and massive coolers. EVGA even guarantees a 2025 MHz core overclock on the Kingpin on air.
For casual gaming, however, it won't make much of a difference. If you intend to do LN2 overclocking, the Kingpin card is a given. As a rule of thumb I try to steer clear of MSI, as they have worthless customer service and generally cut corners on things such as RAM quality and other small components, even on their most expensive cards.

Personally I went with the GTX 1080 Ti Gaming Elite from EVGA because it has newer memory modules that run at 12 GHz stock (not just a bios feature). I believe this helps me stay above 60 fps at 4K. Reviews I've seen of other 1080 Tis have had lows dip below 60, which I'm trying to avoid. (Once you've experienced gaming at 144 Hz 1080p, it's difficult to accept low refresh rates; the interactive/reality feel is way different.)

Best advice to you: WAIT. Rumor has it Nvidia will be releasing a new graphics card this month.
If you can't wait, get a card from EVGA and register it on their website. That way you will qualify for the 90-day trade-up program they usually have. (Kingpin not included.)


----------



## DStealth

Quote:


> Originally Posted by *GOLDDUBBY*
> 
> Personally I went with the gtx1080 ti gaming elite from evga because it has a newer memory module that runs at 12GHz stock. (not just a bios feature)


Great info, and how far does it overclock?


----------



## GreedyMuffin

A new card THIS month?

I just bought a 1070 for mining, and my father just bought a 1080 Ti. But he can return it within the next 45 days, and I can within the next 30...


----------



## GOLDDUBBY

Quote:


> Originally Posted by *DStealth*
> 
> Great info, and how far it overclocks ?


Afaik there aren't many 1080 Tis hitting 12 GHz stable on the memory.
I haven't played around with it much yet; still waiting on a couple other components for the build. Dropped it in my son's computer at 2075 core and +250 on the mem for some mining with no issues.

I'll post some overclocking results once it's in the actual build.


----------



## DStealth

Great


----------



## MrTOOSHORT

The evga elite looks like a nice 1080ti. Thx for the info.


----------



## GOLDDUBBY

Quote:


> Originally Posted by *DStealth*
> 
> Great info, and how far it overclocks ?


Btw, my son's been away for a couple of days, so I turned the heater off and cracked a window in there. The card's holding steady sub-46°C thanks to the Swedish climate. How's that for an exotic cooling solution?


----------



## GreedyMuffin

Norway is also freezing cold at the moment. Good to have my MO-RA3 behind my monitor - which makes it stand right in front of the window. :-D


----------



## Dimelus

Hey guys... I just bought a 1080 Ti and I was wondering if I made the right choice.

Some reviews said that it was one of the best 1080 Tis available, with really high clocks. But I just read another review, and they said that this card has a problem with VRM cooling.

The card that I bought is the Zotac GTX 1080 Ti AMP! Extreme. I have 10 days to return it, and I could pick from 3 other options at the same price.

My other options are:

MSI GeForce GTX 1080 Ti Gaming X Trio
ASUS ROG Strix GeForce GTX 1080 Ti
MSI GTX 1080 Ti Gaming X

So what do you guys think? Should I keep the Zotac or change it for another one?


----------



## DStealth

Keep it. They're all the same in terms of OC; it comes down to the quality of the GPU (Nvidia-made) and the Vmem (Micron-produced). The difference in cooling between them all is slight; if you have no problem with a 5-10°C or 2-3 dB difference, there's zero point in exchanging between these options.


----------



## EarlZ

I think the Asus Strix has the best overall cooler, followed by the MSI Gaming X. The AMP Extreme is one of the worst in VRM cooling according to GN.


----------



## DStealth

From TPU: a 3 dB difference while the same temps are maintained... Guru3d has FLIR images; the VRMs are in the 60-70°C range, nothing to be worried about...


----------



## SirWaWa

do these cards really run that hot idle?
interested in the asus strix btw


----------



## EarlZ

Quote:


> Originally Posted by *SirWaWa*
> 
> do these cards really run that hot idle?
> interested in the asus strix btw


For idle, I think they run anywhere around 4-5°C above ambient.


----------



## EarlZ

Quote:


> Originally Posted by *DStealth*
> 
> From TPU 3db difference while the same temps are maintained...Guru3d has FLIR images VRMs are in 60-70* nothing to be worried about...


Gamers Nexus has discussed how bad the VRM (not GPU) cooling is on the AMP Extreme. (Video, time-stamped.)


----------



## MrTOOSHORT

Evga elite ftw3 for the faster memory chips. I might pick it up tomorrow.


----------



## lilchronic

I've been thinking about getting the Kingpin. Wish they had it with the faster memory chips.


----------



## MrTOOSHORT

Kingpin here locally for $1370 cad plus tax. Was $1300 yesterday.

Elite $1100


----------



## lilchronic

Yesterday they had it for $999; now it's $1,029 USD.


----------



## Nervoize

I had a K|NGP|N for 1100 EUR. It performed worse than my current FTW3, which cost me 899 EUR. I don't recommend the K|NGP|N, tbh.

Besides that, how do I give my GPU more than 1.2 V with the XOC bios? I saw some cards getting above 1.2 V, and the bios has no limits. People have their voltage number in MSI Afterburner, but all I see is 100%.


----------



## duganator

Finally joined the club yesterday and got the gaming x trio from msi. Holy crap this thing crushes my 1070 @ 1440p 144hz


----------



## D13mass

For me all 1080 Tis are the same. I bought a reference card, installed an EK waterblock, and set 0.95 V for 1960/12200 MHz; I'll probably use it in that mode for the next 1-2 years.


----------



## dazed1992

I'm wondering if any of you might know of my issue.

I just got a Gigabyte 1080 Ti Gaming (3-fan), and when the power % is at 100% I can overclock just fine up to around 1960/11200.

However, raising the power to 120%, even on STOCK settings, causes the card to crash within minutes. Is this an RMA issue, or is GPU Boost 3.0 just messing with me? I've never had an issue like this before, so I'm pretty much stumped.

ty


----------



## stangflyer

Quote:


> Originally Posted by *Dimelus*
> 
> hey guys... i just bought a 1080 Ti and I was wondering if I made the right choice,
> 
> Some reviews sad that it was one of the best 1080 TI available, with really high clock. But I just read another review and they sad that this card has a problem related to VRM Cooling.
> 
> The card that i bought is the Zotac GTX 1080 Ti AMP! Extreme. I have 10 days to return it and I could pick other 3 options at the same price.
> 
> My other options are:
> 
> MSI GeForce GTX 1080 Ti Gaming X Trio
> ASUS ROG Strix GeForce GTX 1080 Ti
> MSI GTX 1080 Ti GAMING X Review
> 
> So what do you guys think? Should i keep the zotac or change it for another one?


I have the Amp Extreme 1080ti. Card is running perfectly. I did see the below review.

https://www.gamersnexus.net/hwreviews/2981-zotac-1080ti-amp-extreme-review

I bet it is the one you saw. I did do the mod in that video at 11:30. The card was running perfectly before, but you cannot tell the temp of the VRAM, so I did not want to chance it. I took the rubber strip off the heatsink and put a 2 mm thermal pad on the black VRM heatsink. A 1.5 mm pad should work also. This way the heatsink is actively connected to and cooling the VRM.

I had the card at 2075 MHz with no voltage OC and all was good for the month BEFORE I did the thermal pad mod. My case does have excellent airflow; keep that in mind.

If you do not have the tools or confidence to do the mod, I would consider a return and get a Strix if you can. If you do not have thermal pads on hand, they will run you another 20-35 bucks, so keep that cost in mind. Also, some manufacturers may not warranty the card if taken apart.

Up to you- good luck


----------



## Dimelus

Quote:


> Originally Posted by *stangflyer*
> 
> I have the Amp Extreme 1080ti. Card is running perfectly. I did see the below review.
> 
> https://www.gamersnexus.net/hwreviews/2981-zotac-1080ti-amp-extreme-review
> 
> I bet it is the one you saw. I did do the mod in that video at 11:30. The card was running perfectly before but you can not tell the temp of the vram so I did not want to chance it. I took the rubber strip off of the heatsink and put a 2mm thermal pad on the black vrm heatsink. A 1.5 should work also. This way the heatsink is actively connected and cooling the vrm.
> 
> I had the card at 2075 mhz with no voltage oc and all was good for the month BEFORE I did the thermal pad mod. My case does have excellant airflow. Keep that in mind.
> 
> If you do not have the tools or confidence to do the mod I would consider a return and get a Strix if you can. If you do not have thermal pads on hand they will run you another 20-35 bucks so keep that cost in mind. Also, some manufacturers may not warranty the card if taken apart.
> 
> Up to you- good luck


Yep, it was that review.

I do have thermal pads (I just changed the ones on my old GPU).

I was thinking about doing the same thing when I receive the Zotac... My only fear is losing the warranty.


----------



## Renairy

Just picked up an ASUS Strix.
Anyone know what frequencies I should be achieving with no mods?
It doesn't even seem to hit 2000 MHz stable.

Not using the curve at this point.
Thanks!


----------



## Bartholdi

Quote:


> Originally Posted by *Renairy*
> 
> Just picked up an ASUS Strix.
> Anyone know what frequencies I should be achieving with no mods?
> Seems to not even hit 2000mhz stable.
> 
> Not using the curve at this point.
> Thanks!


Power limit set to 120%?


----------



## Falkentyne

Quote:


> Originally Posted by *Blameless*
> 
> Neither absolute maximum ratings, nor the upper ends of the VID range, are meant to imply a safe voltage for a given sample.
> 
> Stock VID of parts can vary quite widely because leakage current varies quite widely from sample to sample and core to core. Anything over stock is out of spec, and setting a VID near the top end of the range for the entire line is going to be extremely out of spec for the vast majority of parts.
> 
> Voltage and loads that don't cause appreciable degradation to one part could kill another in short order.
> Now read the notes right under that table.
> The difference between VID and actual vcore is based on the loadline and the current draw of the part, but yes, under load Intel spec mandates significant vcore vdroop on parts that don't have FIVRs.
> Voltage, temperature, and current all influence the rate of electromigration, which is the predominant mode of failure, outside of mechanical failures due to thermal cycling.


Looks like I missed a ton of posts apparently.
Your observations about VID, Vcore and vdroop are pretty accurate and I'm impressed. When I posted this on the Intel section, some toxic troll decided to flame and insult me and accused me in private message of not knowing what amps and volts and watts are (yeah, telling a 46 year old adult something like that will get you VERY far....one reason why i don't even have many friends anymore...too many people are simply not worth associating with because of their egos, but rant off..):

Vdroop still exists within the spec just like before FIVRs, which is why the "Loadline Calibration" (LLC) setting still exists. But now, with FIVRs, there is an extra setting that people have been overlooking, and which Asus, in their overclocking guide, tells you to set to the lowest NON-ZERO value that is not Auto (as 0 = Auto): IA AC DC Loadline.

This setting did not exist before FIVRs, and it operates on VID rather than VCORE (but of course I get flamed for saying this, even though it was Raja @ ASUS who TOLD me this. I guess random overclockers here know more than an Asus employee, huh?). They look like the same thing, but they're not. Basically, to quote Raja, this setting boosts the VID signal based on CPU load. The VID is the starting voltage target BEFORE vdroop is applied, so if, for example, the VID at full idle is 1.15v and the IA AC DC loadline resistance is set to 2.10 mOhms, at about 80 amps of CPU current the VID will rise to about 1.23v. THEN vdroop is applied afterwards, as VID will not show vdroop; only VCORE will.

This setting seems to be designed to work with ADAPTIVE Vcore, rather than manual vcore. And some systems may not report VID accurately either, especially if voltage is set to adaptive rather than manual. The Intel reference value for Z370 (8700K) is 2.10 mOhms (For Auto), and it seems for 7700K chipsets, this may be the same value as well. Setting this value to 1 (or 0.01) with adaptive vcore seems to make the VID reported more accurately.

Using static (manual) voltages causes the VID to be reported more accurately, but Asus recommends setting this (IA AC DC) to the lowest value (0.01 or 1, depending on mainboard), then setting your manual voltage and using traditional Loadline Calibration instead.

It seems that some boards will ignore the IA AC DC loadline setting when using manual voltages (VID may change apparently, but CPU vcore and power draw remains the same whether this is set to auto or 0.01), while others do not. The ones that do not ignore this setting, when using manual vcore, will report VERY high temps and power draw when using manual voltages with this setting at "Auto" (or 2.10 mOhms).

Anyway, regardless, the only way to know the TRUE "Default VID" (if you even care) for your processor, is to set IA AC DC loadline to 1 (lowest value that is not auto). If you are then using adaptive voltage, your cpu's pre-programmed default VID will be shown in something like HWinfo or Throttlestop, but many overclocks will not be stable at this setting. Using manual voltages will override the VID signal, and the VID should be relatively close to the manual voltage you set. Then you have to compensate for vdroop.

On mainboards without any sort of LLC control (e.g. most laptops), you have to resort to vid boosting with IA AC DC, instead of Loadline Calibration. But if your system does NOT have a vcore sensor, you will have no way of knowing what the live vcore actually is at load.
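The VID-boost arithmetic described above can be written out explicitly. A sketch only (the `vid_under_load` name is mine, and exactly how a given board applies the AC vs. DC loadline values is firmware-dependent):

```python
def vid_under_load(idle_vid_v, load_current_a, ac_loadline_mohm):
    """VID target after the IA AC loadline boost: V = VID + I * R_ac,
    with R_ac given in milliohms. Vdroop (governed by loadline
    calibration) is then applied on top of this to produce the actual
    vcore. Illustrative only; boards differ in how they apply this."""
    return idle_vid_v + load_current_a * (ac_loadline_mohm / 1000.0)
```

For example, 80 A across a 1.0 mOhm AC loadline adds 0.08 V to a 1.15 V idle VID; higher resistance settings boost the target proportionally more.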


----------



## Hulio225

Quote:


> Originally Posted by *dazed1992*
> 
> I'm wondering if any of you might know of my issue.
> 
> I just got a Gigabyte 1080 ti Gaming (3fan) and when the power % is at 100% i can over clock just fine up to around 1960 11200.
> 
> However raising the power to 120% even on STOCK settings causes the card to crash within minutes.Is this an RMA issue or is this GPU 3.0 just ******* me over? I've never had an issue like this before so im pretty much stumped.
> 
> ty


A higher power limit means higher temps, since the limit kicks in later or never, depending on the title you are playing or the program running.
The temps alone can therefore cause stability issues, but as you rightly assumed, GPU Boost 3.0 is doing its part too: it raises clocks higher because your power limit isn't being reached.
Remote diagnosis is hard since I can't see your monitoring software etc., but it's either both (temps and GPU Boost 3.0) or one of the two; definitely not an RMA issue.


----------



## Shweller

Sub'd just picked up a EVGA black sc 1080ti.


----------



## pick_o

I am hoping someone could help me get my card past 1.049 V so I can overclock higher. I keep reading about guys going to 1.093 V, but how?

Thank you


----------



## Hulio225

Quote:


> Originally Posted by *pick_o*
> 
> I am hoping someone could help me get my card past 1,049mv so i can overclock higher. I keep reading about guys going to 1,093mv but how?
> 
> thanks you


Literally the first page of this thread covers everything you need to know: http://www.overclock.net/t/1624521/official-nvidia-gtx-1080-ti-owners-club#post_25885897
See "Best Way To Do A Custom Voltage Curve. See short video for best explanation."


----------



## hotrod717

Here we go again. Take a look at Newegg and the 1080 Tis available. Miners are going to drive prices up, darn near already, to $1k for any 1080 Ti. Consider that a reference card cost me $750 on release day and is now going for $800-plus used.

I expect prices to go up again, not just on 1080 Tis but on any current GPU. It also screws the resale market on quality cards. If you are looking to buy a used card over the next year or two, the market is going to be flooded with mining cards, especially once consumer Volta hits. I remember this with 290X cards a few years ago. wt....

Wow, the only 1080 Ti available at Newegg or 6 other top hardware resellers is the 1080 Ti KPE at Newegg.
At $1,029 for the KPE, a Titan Xp empire starts to look attractive at $1,138. 1080 Ti, 1080, 1070 Ti, 1070, Vega 56, Vega 64: sold out everywhere. Crazy.


----------



## Shweller

Quote:


> Originally Posted by *hotrod717*
> 
> Here we go again. Take a look at Newegg and 1080ti's available. Miners are going to drive prices up, darn near already, to $1k for any 1080ti. Considering a reference card cost me $750 on release day and now going for $800 plus used.
> 
> 
> 
> 
> 
> 
> 
> 
> I'm looking to see the prices go up again. Not just on 1080ti' s, but any current gpu. It also screws the resale market on quality cards being resold. If you are looking to buy a used card over the next year or two, the market is going to be flooded with mining cards. Especially, once consumer volta hits. I remember this with 290x cards a few years ago. wt....
> 
> Wow only 1080ti available on Newegg or 6 other top hardware resellers is 1080tikpe at Newegg.
> At $1029 for kpe, a titan xp empire starts look attractive at $1138. 1080ti,, 1080, 1070ti, 1070, Vega 56, Vega 64 sold out everywhere. Crazy.


Yup


----------



## KedarWolf

Quote:


> Originally Posted by *hotrod717*
> 
> Here we go again. Take a look at Newegg and 1080ti's available. Miners are going to drive prices up, darn near already, to $1k for any 1080ti. Considering a reference card cost me $750 on release day and now going for $800 plus used.
> 
> 
> 
> 
> 
> 
> 
> 
> I'm looking to see the prices go up again. Not just on 1080ti' s, but any current gpu. It also screws the resale market on quality cards being resold. If you are looking to buy a used card over the next year or two, the market is going to be flooded with mining cards. Especially, once consumer volta hits. I remember this with 290x cards a few years ago. wt....
> 
> Wow only 1080ti available on Newegg or 6 other top hardware resellers is 1080tikpe at Newegg.
> At $1029 for kpe, a titan xp empire starts look attractive at $1138. 1080ti,, 1080, 1070ti, 1070, Vega 56, Vega 64 sold out everywhere. Crazy.


Sign up for alerts on www.nowinstock.net to get an email when a model is in stock. I get alerts for FEs all the time, but you need to act fast; they sell out right away.


----------



## KedarWolf

http://www.overclock.net/t/1510328/asus-x99-motherboard-series-official-support-thread-north-american-users-only/16020_20#post_26537055

http://www.overclock.net/t/1510328/asus-x99-motherboard-series-official-support-thread-north-american-users-only/16020_20#post_26537072

See these two posts if you want to check if the exploits are patched and the second to manually download the Windows updates to patch them.

Be sure to Rep +1 them.


----------



## MonarchX

Any EVGA 1080 Ti Kingpin owners who run the card on air? Any advice for the best OC on air? I asked on the EVGA forums, and so far the advice was:
- Use fully independent power cables (not the included 2-from-1 split cable)
- Use the orange OC BIOS
- Use Precision X with KBoost

Anything else?

Does MSI AB lack the Precision X features this card needs, like KBoost and the extra thermal sensor monitoring?


----------



## Gunslinger.

Quote:


> Originally Posted by *MonarchX*
> 
> Any EVGA 1080 Ti Kingpin owners who run the card on air? Any advice for the best OC on air? I asked on the EVGA forums, and so far the advice was:
> - Use fully independent power cables (not the included 2-from-1 split cable)
> - Use the orange OC BIOS
> - Use Precision X with KBoost
> 
> Anything else?
> 
> Does MSI AB lack the Precision X features this card needs, like KBoost and the extra thermal sensor monitoring?


That all looks spot on.

MSI AB does not have the features that Precision X offers.


----------



## cltexe

Hi fellow overclockers,

I tried to flash my KFA2 1080 EXOC with the KFA2 1080 HOF BIOS. The flash was successful, but after that the card stopped responding. I switched to my iGPU and booted into Windows 10. From there I tried to reflash my original EXOC BIOS, but nvflash says "No NVIDIA display adapters found," so I can't simply reflash. When I check in GPU-Z there is no BIOS version, etc.; the card is only identified as a GTX 1080. In Device Manager there is also a question mark on the GPU. Any directions to make it work again? Thanks for your precious help.

Though this is not 1080 Ti related, I believe we can come up with a solution.


----------



## DStealth

What do you see when nvflash --list is executed?
Try disabling the device in Device Manager and running CMD as admin.


----------



## cltexe

NVIDIA display adapters present in system:
No NVIDIA display adapters found.

This is exactly what I get. I can see the device in Device Manager, but it has error code 31.

I also checked that my card has 8 power phases, while this HOF model has something like 11. So definitely a mismatch. But the flash ran smoothly without even a single warning about any mismatch.

Is there any way I can hard-reset the BIOS? I removed the cooler and saw the DUAL BIOS marking, but I couldn't find any switches or jumpers; either I simply missed them or I don't know what they look like. Here are some shots:


----------



## Hulio225

Quote:


> Originally Posted by *cltexe*
> 
> NVIDIA display adapters present in system:
> No NVIDIA display adapters found.
> 
> This is what I get exactly. I can see device in the device manager but it has got error code 31.
> 
> I also checked that my card has 8 power phases, while this HOF model has something like 11. So definitely a mismatch. But the flash ran smoothly without even a single warning about any mismatch.


Try pulling the PCIe power cables off your GPU, booting again, and then flashing again; it worked for me once.


----------



## mAs81

Hey guys ,

I'm considering biting the bullet on a 1080 Ti, and the closest one to my budget is the
Palit GeForce GTX 1080 Ti 11GB JetStream for 865 euros.

I've watched some reviews that place it rather high on the 1080 Ti performance list.

Does anyone have this particular model? Any feedback will be highly appreciated.


----------



## Hulio225

Quote:


> Originally Posted by *mAs81*
> 
> Hey guys ,
> 
> I'm considering biting the bullet on a 1080ti , and the closest one to my budget is the
> Palit GeForce GTX 1080 Ti 11GB JetStream for 865euros..
> 
> *I've watched some reviews that place it rather high in the 1080ti performance list..*
> 
> Anyone have this particular model?All/any feedback will be highly appreciated


All those decent custom-board cards perform about the same; the differences are within the margin of error, to be honest.

If you want to save some money and are willing to order from a German dealer and pay with PayPal, you can save a significant amount.

https://www.mindfactory.de/Hardware/Grafikkarten+(VGA)/GeForce+GTX+fuer+Gaming/GTX+1080+Ti.html

I checked the delivery cost to Greece: it's 32.99 euros. https://www.mindfactory.de/info_center.php/icID/13/popup/true

But your JetStream card is only 785 EUR there, and there are a lot of other even cheaper or better cards available, etc.; check it out.









Edit:
Quote:


> Originally Posted by *cltexe*
> 
> NVIDIA display adapters present in system:
> No NVIDIA display adapters found.
> 
> This is what I get exactly. I can see device in the device manager but it has got error code 31.
> 
> I also checked that my card has 8 PWM and this HOF model got like 11 or something like that. So definitely a mismatch. But flashing run smoothly without even a single warning about mismatch of anythying.
> 
> Is there anyway I can hard reset the BIOS. *I removed the cooler and saw the DOUBLE BIOS thing. But I couldn't find any switches nor jumpers or I simply missed them or don't know that they look like.* Here are some shots:


If that card had dual BIOS, the jumper would be there:

But it's not soldered on, because that card has no dual BIOS; the board is just shared with other cards that do have it... And try the tip I gave you previously: take the PCIe power cables off, boot the system, and then try to flash. It worked for me when my card was totally bricked.


----------



## mAs81

Thank you for the links !

I'm aware of that shop, but the thing is that I'll be paying by credit card in interest-free installments, so I don't know if I can do that there.


----------



## MonarchX

Which card is better for air OC:
- KFA2
- HOF
- Kingpin

??? They all have awesome custom PCBs, don't they?


----------



## SavantStrike

Quote:


> Originally Posted by *hotrod717*
> 
> Here we go again. Take a look at Newegg and 1080ti's available. Miners are going to drive prices up, darn near already, to $1k for any 1080ti. Considering a reference card cost me $750 on release day and now going for $800 plus used.
> 
> 
> 
> 
> 
> 
> 
> 
> I'm looking to see the prices go up again. Not just on 1080ti' s, but any current gpu. It also screws the resale market on quality cards being resold. If you are looking to buy a used card over the next year or two, the market is going to be flooded with mining cards. Especially, once consumer volta hits. I remember this with 290x cards a few years ago. wt....
> 
> Wow only 1080ti available on Newegg or 6 other top hardware resellers is 1080tikpe at Newegg.
> At $1029 for kpe, a titan xp empire starts look attractive at $1138. 1080ti,, 1080, 1070ti, 1070, Vega 56, Vega 64 sold out everywhere. Crazy.


Every card with more than 3GB of RAM on the AMD side is gone. On the NVIDIA side, everything down to the 1050 Ti is gone or nearly gone.

It's not just mining, though; look at RAM prices and you'll see that price fixing due to artificial shortages has wreaked havoc.

If Volta releases in Q1 this year, low RAM availability plus pent-up demand will make it hard to get at MSRP. It will end up scalped just like Pascal was at release. The artificially high prices on AMD and NVIDIA cards might even mean Volta gets a higher MSRP than Pascal did. That's if consumer Volta is real and NVIDIA isn't making us wait for Ampere.

Prices have been trending upward ever since the red team stopped having a competitive product. Mining might get them some extra cash to come out with better products in the long run.

Used mining cards aren't the worst thing in the world either as long as they were taken care of.


----------



## porschedrifter

count me in the fam Msi 1080ti Duke Oc


----------



## nrpeyton

Quote:


> Originally Posted by *SavantStrike*
> 
> Every card with more than 3GB of RAM on the AMD side is gone. On the nvidia side, everything down to the 1050 TI is gone or nearly gone.
> 
> It's not just mining though -look at RAM prices and you'll see that price fixing due to artificial shortages has wreaked havoc.
> 
> If Volta releases q1 this year, low ram availability plus pent up demand will make it hard to get at MSRP. It will end up scalped just like pascal was at release. The artificially high msrp on AMD and nvidia cards might even mean volta has a higher msrp than pascal did. That's if consumer Volta is real and nvidia isn't making us wait for ampere.
> 
> Prices have been trending upward ever since the red team hasn't had a competitive product. Mining might get them some extra cash to come out with better products in the long run.
> 
> Used mining cards aren't the worst thing in the world either as long as they were taken care of.




I was doing a bit of mining for a while, right up until the NiceHash Miner 2 app was hacked... and I lost everything.

I always ran the card at low voltage while mining, usually about 0.975 V @ 2000 MHz. On water, temps were in the mid 30s.
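Since we're talking undervolting while mining: as a rough first-order rule, dynamic power scales with frequency times voltage squared, so dropping to 0.975 V at the same 2000 MHz saves a noticeable chunk of power and heat. A quick sketch of that estimate (the 1.062 V "stock" reference point is an assumption for illustration, not a measured value):

```python
# Rough first-order estimate of GPU dynamic power: P ~ f * V^2.
# Illustrative only; real cards also have static/leakage power,
# and the 1.062 V "stock" point below is an assumed reference.

def relative_power(freq_mhz, volts, ref_freq_mhz, ref_volts):
    """Power relative to a reference operating point, assuming P ~ f * V^2."""
    return (freq_mhz / ref_freq_mhz) * (volts / ref_volts) ** 2

# Undervolted mining profile (0.975 V @ 2000 MHz) vs assumed stock voltage
ratio = relative_power(2000, 0.975, 2000, 1.062)
print(f"~{(1 - ratio) * 100:.0f}% less dynamic power at 0.975 V")
```

By this estimate the undervolt alone trims dynamic power by roughly 16%, which lines up with why the card stays in the mid-30s on water.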


----------



## nrpeyton

Quote:


> Originally Posted by *Gunslinger.*
> 
> That all looks spot on.
> 
> MSI AB does not have the features that Precision X offers.


Has something changed recently with Precision X?

Last time I used it I found it extremely frustrating and very "limited" compared to MSI AB.
1) Their auto-overclock feature was still broken last time I tried it. It never "resumes" properly after a driver crash.
2) Extremely limited control over your voltage/frequency curve _compared_ to MSI AB.

If that's changed, I'd be interested and may even download it.

P.S. How does Precision X get you a higher overclock than MSI AB?

Thanks in advance..


----------



## porschedrifter

Quote:


> Originally Posted by *nrpeyton*
> 
> Has something changed recently with Precision X?
> 
> Last time I used it I found it extremely frustrating and very "limited" compared to MSI AB.
> 1) Their auto-overclock feature was still broken last time I tried it. It never "resumes" properly after a driver crash.
> 2) Extremely limited control over your voltage/frequency curve _compared_ to MSI AB.
> 
> If that's changed -- i'd be interested .. and may even download it....
> 
> P.S how does Precision X get you a higher overclock over MSI AB?
> 
> Thanks in advance..


Same here, Precision X is horrible compared to Afterburner. Afterburner really is the top-of-the-line third-party tool for GPUs. Unless they recently did a major overhaul of Precision X, but I doubt it.


----------



## Gunslinger.

Quote:


> Originally Posted by *nrpeyton*
> 
> Has something changed recently with Precision X?
> 
> Last time I used it I found it extremely frustrating and very "limited" compared to MSI AB.
> 1) Their auto-overclock feature was still broken last time I tried it. It never "resumes" properly after a driver crash.
> 2) Extremely limited control over your voltage/frequency curve _compared_ to MSI AB.
> 
> If that's changed -- i'd be interested .. and may even download it....
> 
> P.S how does Precision X get you a higher overclock over MSI AB?
> 
> Thanks in advance..


I never said that it gave me higher clocks, and I'm only referring to Precision X as it pertains to use with the KPE card.
What I said was that all of the features work very well with the KPE card: voltage, monitoring, temps, fan curve, etc.

I've no idea about any "auto overclocking" features; that's not something I'd be interested in anyway.


----------



## hotrod717

Quote:


> Originally Posted by *SavantStrike*
> 
> Every card with more than 3GB of RAM on the AMD side is gone. On the nvidia side, everything down to the 1050 TI is gone or nearly gone.
> 
> It's not just mining though -look at RAM prices and you'll see that price fixing due to artificial shortages has wreaked havoc.
> 
> If Volta releases q1 this year, low ram availability plus pent up demand will make it hard to get at MSRP. It will end up scalped just like pascal was at release. The artificially high msrp on AMD and nvidia cards might even mean volta has a higher msrp than pascal did. That's if consumer Volta is real and nvidia isn't making us wait for ampere.
> 
> Prices have been trending upward ever since the red team hasn't had a competitive product. Mining might get them some extra cash to come out with better products in the long run.
> 
> Used mining cards aren't the worst thing in the world either as long as they were taken care of.


Eeeh. I've gotten a few mining cards, and they aren't what I'd call great cards. Most have custom underclocked BIOSes and, considering their use, have been dragged through the mud, so to speak. Who games or overclocks 24/7/365? They see an extreme amount of use.
After seeing a used FE go for $950 on eBay yesterday, a brand-new KPE doesn't look so bad, even after the price bump to $1029.


----------



## porschedrifter

Quote:


> Originally Posted by *hotrod717*
> 
> Eeeh. I've gotten a few mining cards, and they aren't what I'd call great cards. Most have custom underclocked BIOSes and, considering their use, have been dragged through the mud, so to speak. Who games or overclocks 24/7/365? They see an extreme amount of use.
> After seeing a used FE go for $950 on eBay yesterday, a brand-new KPE doesn't look so bad, even after the price bump to $1029.


It's a crazy time to be in the market for a GPU. I'd be extremely wary of buying a used GPU right now with the mining abuse going on.
On the other side of things, I managed to pick up a Vega 56 new for $400 when they randomly went on sale two months ago, used it for 2 months, enjoyed it, then turned around and sold it for $890 on eBay a few days ago. I bought a new MSI 1080 Ti Duke OC with the profits, a day before they sold out and went up in price. Currently enjoying that card now.









The last few GPUs I've listed sold pretty quickly, probably due to the fact that I don't mine and list them as *never mined on*.


----------



## Kronos8

Quote:


> Originally Posted by *mAs81*
> 
> Hey guys ,
> 
> I'm considering biting the bullet on a 1080ti , and the closest one to my budget is the
> Palit GeForce GTX 1080 Ti 11GB JetStream for 865euros..
> 
> I've watched some reviews that place it rather high in the 1080ti performance list..
> 
> Anyone have this particular model?All/any feedback will be highly appreciated


I've been looking for a 1080 Ti for several months now, comparing prices.
865 euros is much too high, though not bad spread over 12 monthly payments.
Unfortunately, no Greek store has acceptable prices.
You must look abroad.
I was able to pull the trigger on one for 770 euros total from amazon.de.
Still waiting for it.
Apart from that, the card is 2.5 slots wide and has a dual-BIOS switch.
For thermals you have to check the reviews.
Palit and Gainward are the same company; the cards are very similar.
I think I saw a video where people replaced the card's fans with 120mm NoiseBlockers without modding, but it was in German, so......


----------



## KedarWolf

Quote:


> Originally Posted by *Kronos8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mAs81*
> 
> Hey guys ,
> 
> I'm considering biting the bullet on a 1080ti , and the closest one to my budget is the
> Palit GeForce GTX 1080 Ti 11GB JetStream for 865euros..
> 
> I've watched some reviews that place it rather high in the 1080ti performance list..
> 
> Anyone have this particular model?All/any feedback will be highly appreciated
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I 've been looking for a 1080ti several months now and compare prices.
> 865E are much too high. But not for 12 monthly payments.
> Unfortunately, no Greek store has acceptable prices.
> You must look abroad.
> I was able to pull the triger on one for 770E total from amazon.de
> Still expecting it.
> Apart from that, the card is 2,5 slot width and has a dual bios switch.
> For thermals you have to check reviews
> PALIT / GAINWARD are the same company. Cards are very similar.
> I think I saw a video where people changed the card's fans with 120mm NoiseBlockers without modding, but it was German language, so......
Click to expand...

Go to www.nowinstock.net I think there's an option for EU notifications.

I get notifications for 1080 Ti FE's for $699 USD all the time, but you need to buy right away as they sell out quickly.


----------



## mAs81

Quote:


> Originally Posted by *Kronos8*
> 
> Unfortunately, no Greek store has acceptable prices.You must look abroad.


Tell me about it... our hardware prices always suck.







Quote:


> Originally Posted by *Kronos8*
> 
> I think I saw a video where people changed the card's fans with 120mm NoiseBlockers without modding, but it was German language, so......


That is very interesting , tho I believe that the card is already pretty quiet , from what I've seen
Quote:


> Originally Posted by *KedarWolf*
> 
> Go to www.nowinstock.net I think there's an option for EU notifications.


Bookmarked









Since I'll be using my credit card for payment, I'm thinking about maybe going the extra mile and getting the MSI Gaming X, but I'm not so sure yet because it is 950 euros(!) after all...


----------



## Kronos8

Quote:


> Originally Posted by *KedarWolf*
> 
> Go to www.nowinstock.net I think there's an option for EU notifications.
> 
> I get notifications for 1080 Ti FE's for $699 USD all the time, but you need to buy right away as they sell out quickly.


Appreciate the link and the help.
But ...

699 USD x 0.85 EUR/USD = 594.15 EUR + 15 EUR customs fee = ~610 EUR + 24% VAT on the Greek assessed value of the card (650 EUR if lucky, 700+ if unlucky, set by a customs employee with no fixed rules) + more than 2-3 months waiting for customs clearance = forget it....................

3 months ago I sent an e-mail to MSI complaining about the Greek prices of their cards (for example, the MSI GTX 1080 Ti Armor 11G OC at 943 euros in only one store, as of a minute ago!).
(The rest of the Greek stores have the Armor out of stock, while it's available in the rest of Europe.)
I fully understand free-market rules, but 943 euros for a 1080 Ti Armor makes MSI look very bad.
The answer to my e-mail was about mining (give me a break), and after that, MSI 1080 Ti cards were not delivered to Greece through amazon.de for more than a month.
That matters because amazon.de was our only chance at acceptable prices. And they blocked it!
The local MSI distributor is really hurting MSI, actually helping other brands.
That is their problem.
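For what it's worth, the landed-cost arithmetic above can be sketched like this (the exchange rate, customs fee, and assessed values are the illustrative numbers from this post, not official figures):

```python
# Landed cost of a US-priced card imported to Greece, per the figures above.
# Assumptions: 0.85 EUR/USD, 15 EUR flat customs fee, 24% VAT charged on
# an "assessed value" picked by customs rather than the invoice price.

def landed_cost_eur(price_usd, usd_to_eur=0.85, customs_fee=15.0,
                    vat_rate=0.24, assessed_value_eur=650.0):
    """Converted price + flat customs fee + VAT on the assessed value."""
    return price_usd * usd_to_eur + customs_fee + vat_rate * assessed_value_eur

print(f"lucky:   {landed_cost_eur(699):.2f} EUR")
print(f"unlucky: {landed_cost_eur(699, assessed_value_eur=700.0):.2f} EUR")
```

So even a $699 FE lands at roughly 765-777 euros before the months of waiting, which is why "forget it" is about right.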


----------



## SavantStrike

Quote:


> Originally Posted by *hotrod717*
> 
> EEEh. I've gotten a few mining cards and they aren't what i'd call great cards. Most have custom underclocked bios' and considering their use have been been dragged through the mud so to speak. Who games or overclocks 24/7/365? They see an extreme amount of use.
> After seeing a used FE go for $950 on ebay yesterday, a brand new KPE doesnt look so bad, even after the price bump to $1029


Again though, not every miner beats up their cards. A card that's been mined on and not allowed to go over 60-65C is fine, as long as it has the stock BIOS.

Folders run cards flat out 24/7, yet they don't burn them out too quickly; the trick is taking care of your equipment.


----------



## feznz

Quote:


> Originally Posted by *SavantStrike*
> 
> Again though, not every miner beats up their cards. A card thats been mined on and not allowed to go over 60-65C is fine as long as it has the stock BIOS.
> 
> Folders run cards flat out 24x7, yet they don't burn them out too quickly - the trick is taking care of your equipment.


Yep, that is true, but you have a 50/50 chance of getting one owned by a little old lady or one that's been thrashed.
And if you get a mint one, chances are it has been binned and is a dud overclocker.


----------



## KCDC

Quote:


> Originally Posted by *feznz*
> 
> Yep that is true but you have a 50/50 chance of having one owned by a little old lady or thrashed
> and if you get a mint one also the chances are it has been binned and a dud OCer


Too bad we can't read their little black box and see its history. That would really screw more than half of the eBay sellers.


----------



## SavantStrike

Quote:


> Originally Posted by *feznz*
> 
> Yep that is true but you have a 50/50 chance of having one owned by a little old lady or thrashed
> and if you get a mint one also the chances are it has been binned and a dud OCer


I don't like buying any air-cooled card used. Miners aside, I've seen far too many air-cooled cards crammed into cases without enough airflow and then flogged with high voltage for a few more FPS.

Binning is a concern but only on a relatively new card.

Any used card is a compromise which is why they are (usually) cheaper.


----------



## HoneyBadger84

Got a brand-new EVGA FTW3 Hybrid coming from Amazon. They just updated my order to "arriving Friday," which, considering it said "in stock on the 18th, order now & we'll ship when it arrives," means I'm glad it's coming early. I got it for $27 under EVGA's site retail price, and it's direct from Amazon.com as the seller, not some 3rd party.

I'm excited to upgrade my aging R9 295X2. Next will be getting a nice fat 21:9 3440x1440 monitor to enjoy this 1080 Ti with ^_^















I'M SO EXCITED


----------



## MonarchX

Got my EVGA Kingpin 1080 Ti a few hours ago - very impressive card!


----------



## DStealth

Share OC details please


----------



## MonarchX

My Kingpin does not OC at all...lol...


----------



## DStealth

But that means sub-2050 MHz with the most expensive card... also, there's no way your memory wouldn't exceed the stock 11016 MHz speed.


----------



## gta1989

Are you hitting a power supply limit by chance with that 1080 Ti Kingpin? I tried using my old PSU when I upgraded to a 1080 Ti and the card didn't like it; I couldn't make it run much more than +35 on the core clock. I threw in a new EVGA G3 1000 W and now I can get 2050 out of it at 1.045 V.


----------



## GRABibus

Unfortunately, a high-priced card doesn't mean a silicon lottery win.

My first 1080 Ti was an MSI Sea Hawk, and that card couldn't take even 1 MHz more and stay stable.
My latest is a Gigabyte Gaming OC (improved air cooling), at a much cheaper price than the Sea Hawk and Kingpin, and this card can do 2063 MHz at 1.05 V.

I have removed the air cooler and mounted the AIO water cooler from the Gigabyte Aorus Xtreme Waterforce 11G, and I am gaming-stable and benchmark-stable at 2164 MHz at 1.131 V! (With the ASUS XOC BIOS.)


----------



## Silent Scone

Quote:


> Originally Posted by *GRABibus*
> 
> unfortunately, High price card does't mean Silicon lottery winning.
> 
> My first 1080TI was a MSI SeaHawk....and this card was unable to take 1MHz and to stay stable....
> My last one is a gigabyte Gaming OC (Improved air cooling), at a much cheaper price Than Seahawk and Kingpin and this card can do 2063MHz at 1,05V.
> 
> I havé unmounted the air cooler and mounted the AIO watercooling of the Gigabyte aorus xtreme waterforce 11G and I am gaming stable and benchmark stable at 2164MHz at 1.131v !! (With ASUS XOC Bios)


Depends on the OEM and model.

Strix models are binned by tier.


----------



## Domler

All right folks, time to swallow my pride and admit... I'm an idiot. I was sleep-deprived, rushing, and doing late-night building, which never works. My buddy wanted my other 1080 Ti, and I needed back the 1080 I'd let him borrow. I had a block on it and wanted to wait till we put his whole system under water, but he wanted it first. So I took the block off, and when I was putting the reference blower back on... I slipped.







So then I went to finish up and slipped again.







Tested the card: dead.







My buddy wants to see if we can fix it. He knows a guy willing to solder the pieces back on... except my wife threw out the baggie that had them in it, because she thought it was empty. Now I'm trying to see if any of you guys, who are much smarter than me, could help me identify the components so I can try to source them. Any help would be greatly appreciated.

And I know I'm gonna get flak for slipping,







because I was using the wrong tool, and some of it is deserved.


----------



## MonarchX

Bah, stupid Precision X was the problem. It locks up my PC every time I adjust voltage, power, or any clocks. I seriously dislike it. Not only did they steal a bunch of MSI AB code, but they also made it much more complicated.

I went back to MSI AB, and even with its limited features for the Kingpin I got 2050/12000 MHz going for Assassin's Creed Origins. I simply have not tested higher clocks, but will do so during the next week.


----------



## brian19876

Hi all. With the death of one of my 980 Tis I had in SLI, I sold my other 980 Ti and got a 1080 Ti EVGA Hybrid SC2.

I'm new to Pascal overclocking, but when I run the Heaven benchmark at around 2025 core I get 3-5 second pauses in the benchmark, and the GPU seems to be swinging in frequency from 2025 all the way down to 896. Does this mean I hit the overclock limit for this card?


----------



## Hulio225

Quote:


> Originally Posted by *Domler*


There are two capacitors missing. They should have the same capacitance as the one sitting directly above: desolder it, measure it, get another one or two, and solder them in.

A picture as reference with all capacitors in place:


Quote:


> Originally Posted by *Domler*


This is a resistor. I can't see perfectly what resistance it has, though... I read it as 202, which is a 2 kOhm SMD resistor.
I would try to fix this problem first, since it should matter more than the capacitors, which are mostly in parallel simply to add capacitance and smooth out voltage ripple...

A pic as reference:


Good luck, and you're welcome.
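On reading that "202" marking: standard 3-digit SMD resistor codes are two significant digits followed by a power-of-ten multiplier. A tiny sketch of the decoding (hypothetical helper for illustration; some parts instead use 4-digit or EIA-96 codes, so double-check against the actual package):

```python
# Decode a standard 3-digit SMD resistor code: "202" -> 20 * 10**2 = 2000 ohms.
# Hypothetical helper; 4-digit and EIA-96 markings follow different rules.

def decode_smd_3digit(code: str) -> int:
    """First two digits are significant, the third is the power-of-ten multiplier."""
    significant = int(code[:2])
    multiplier = int(code[2])
    return significant * 10 ** multiplier

print(decode_smd_3digit("202"))  # 2000 ohms, i.e. 2 kOhm
```

So "202" reads as 2 kOhm, matching the value above; "473" would be 47 kOhm, and so on.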


----------



## nrpeyton

Quote:


> Originally Posted by *brian19876*
> 
> hi all with the death of one of my 980tis i had in sli I sold my other 980ti and got a 1080ti evga hybrid sc2
> 
> im new to pascal overclocking but when i run heaven benchmark at around 2025 core i get 3-5 sec pauses in the benchmark and the gpu seems to be swinging in freq 2025 all the way down to 896 does this mean i hit the overclock limit for this card


Sounds like you may be right (about reaching the limit). But I'd definitely run a few other benchmarks and games first and see how you get on; one app may behave very differently from another.


----------



## nrpeyton

Quote:


> Originally Posted by *GRABibus*
> 
> unfortunately, High price card does't mean Silicon lottery winning.
> 
> My first 1080TI was a MSI SeaHawk....and this card was unable to take 1MHz and to stay stable....
> My last one is a gigabyte Gaming OC (Improved air cooling), at a much cheaper price Than Seahawk and Kingpin and this card can do 2063MHz at 1,05V.
> 
> I havé unmounted the air cooler and mounted the AIO watercooling of the Gigabyte aorus xtreme waterforce 11G and I am gaming stable and benchmark stable at 2164MHz at 1.131v !! (With ASUS XOC Bios)


Quote:


> Originally Posted by *Silent Scone*
> 
> Depends on the OE and model.
> 
> Strix models are binned by tier.


I agree. However, one thing to bear in mind: if you could actually remove the GPU chip from a card, try it on a basic card, and then try it in the Kingpin, the Kingpin would likely clock a little higher, for two reasons: *1)* smoother power delivery thanks to a bigger VRM _(less ripple)_, and *2)* a higher maximum power target (in watts). Moreover, in benchmarks and games that draw more current, the smoother power delivery becomes incrementally more helpful.


----------



## KedarWolf

I tell my friend, "Check out my new Z370 Maximus X Formula, the VRMs are water cooled!!"

Him, "Were they overheating before?"

Me,


----------



## nrpeyton

Awfully quiet on here the last few days.


----------



## mouacyk

shhhh


----------



## HoneyBadger84

The cat killed it.

My card got updated finally (shipping etc), now says it will be here Wednesday.














I'm still excited! And my current card is selling for beaucoup dinero on EBay (thanks Mining) so woo hoo!


----------



## RoBiK

Quote:


> Originally Posted by *mouacyk*
> 
> shhhh


wakey wakey


----------



## mouacyk

Quote:


> Originally Posted by *RoBiK*
> 
> wakey wakey


----------



## Scotty99

What's going on with video cards right now? (Especially the 80 series.)

There are almost none in stock, and the ones that are, are way overpriced.
https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007709%20601201888


----------



## feznz

Bitcoin mining would be my guess.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Scotty99*
> 
> Whats going on with video cards right now? (especially 80 series)
> 
> There is almost none in stock, and the ones that are are way overpriced.
> https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&IsNodeId=1&N=100007709%20601201888


Ordered mine when it was in stock from Amazon direct, $832 for the FTW3 Hybrid. It said "in stock January 18th, buy now and we'll ship when it arrives"; it now says it'll be here this coming Wednesday.

The whole upper end of NVIDIA cards, and the Vegas for AMD, is a crap shoot in terms of stock right now. Between mining and everything else, they stay in stock for less than 10 minutes normally.

I'm seriously considering canceling my order, selling my R9 295X2 first, and getting a Titan Xp instead. It depends on whether the 1080 Ti actually ships this week. Since NVIDIA sells the Titan Xp direct, at least I know it'll be shipped with relatively decent speed after purchase...

Like, look at this insanity, mate: http://www.nowinstock.net/computers/videocards/nvidia/gtx1080ti/

There's a whole lotta nothing. Lol


----------



## Scotty99

This is new though; what's driving it? Like, I just built a PC a month ago, and 1080s were 500 bucks and 80 Tis were 700, and every model was in stock and available. Did they all sell out for Christmas, lol?


----------



## kabu

explains a lot...


----------



## D13mass

Hi guys! I will try to describe my problem.

I've had a 1080 Ti FE card with an EK waterblock for a long time, and I flashed a Palit BIOS maybe half a year ago. Today, while playing Rise of the Tomb Raider, I realized I wasn't getting smooth FPS. I turned on MSI Afterburner monitoring and saw only 30-40 FPS... Then I spent many hours investigating and found the reason.

By default, my card with the Palit BIOS:

1962 MHz core and 1251 MHz memory == 115 FPS avg. in the "Rise of the Tomb Raider" benchmark, perfectly smooth gameplay.

When I run a tool like MSI Afterburner or EVGA Precision X and apply my long-known stable settings of +100 core and +600 memory, I get:

2050 MHz core and 1400 MHz memory == 60 FPS avg. in the benchmark, and a 30-40 FPS slideshow in gameplay.

Here is the most important part: even if I switch back to default settings or close the OC tool (MSI Afterburner or EVGA Precision X), I still get 60 FPS in the benchmark and 30-40 in real gameplay until I reboot my PC or run the OC tool again.

What should I do? Is it a problem with the OC or degradation of my card?

Of course I can forget about overclocking and use the card at its default (1962 MHz), but this is the strangest bug I've ever met.


----------



## GOLDDUBBY

Quote:


> Originally Posted by *MonarchX*
> 
> My Kingpin does not OC at all...lol...


It's *guaranteed* to overclock! EVGA promises a 2025 MHz minimum on air with that card. So either you're doing something wrong or the card is bad, and I highly doubt the latter on this type of card. Head on over to the EVGA forums and talk to them. My bet is they will sort everything out for you.
Quote:


> Originally Posted by *RoBiK*
> 
> wakey wakey


Nice mining stack.


----------



## Scotty99

Just put my 1080ti on craigslist for 1200, wonder if it will sell lol.

Or maybe i should look into mining?


----------



## kithylin

Quote:


> Originally Posted by *Scotty99*
> 
> This is new though, what's driving it? I just built a PC a month ago and 1080s were 500 bucks and 1080 Tis were 700, and every model was in stock and available. Did they all sell out for Christmas lol?


Coin miners bought all the top-tier video cards from the GTX 1070 and up. Same on the AMD side: all the Vega cards and most of the RX 480/580s are gone too. Coin miners are buying everything within seconds of it appearing for sale anywhere.

Coin miners are destroying the GPU industry as a whole. And just wait until the coin bubble bursts and they try to liquidate all their fried / burnt-out / mined-to-death cards that can't play games anymore in 6 months on the used market and then don't tell anyone they used em for mining.

My local Fry's Electronics has adopted a strict "no returns, no exceptions, no exclusions" on all video cards now due to coin miners that would take em home and mine em until 1 day before the end of the return period then return em broken and unusable and get their money back.

If you were wanting to build a new PC or upgrade to a new computer and are mad you can't buy a fast GPU for gaming: Be mad at and blame coin miners. They're the entire reason we're in this mess.

I'm so darn happy I scored my 1080 Ti for $685 shipped in summer 2017, they're going for $1200+ right now.


----------



## HoneyBadger84

Mining won't be going away anytime soon. When one currency bursts, they move on to the next one. There are going to be new cryptocurrencies on the regular, because ones like Bitcoin, Ethereum, etc. made people near-instant profit. Unfortunately, that's the world we live in.

The only solution is what a lot of makers are doing and imposing "limit 1-2" then after that you have to pay a penalty fee. Will the miners pay it? Probably, but at least it's a deterrent for some.

I got my new GTX 1080 Ti FTW3 HYBRID from Amazon for $832. It was on delayed shipping, but it should be here this week. Nothing like a nice quiet AIO-cooled card.

I'm glad Nvidia seems to be keeping the Titan Xp an in-house card. They're expensive enough without miners jacking up the price even further. Kinda wanna get one, but I think I'll stick with the 1080 Ti, if it ships by Tuesday.


----------



## pez

Guess I'm glad I got into the GPU game way early last gen. I mean, hybrid kits exist, and FE cards being generally the cheapest cards to get (ignoring the whole beginning of the 1070/1080 launch) make it a no-brainer to go reference: maximum compatibility, first to get waterblocks, and no binning premium for 'top-tier' models.


----------



## Domler

Thank you so much!!! Kinda what we figured, just wanted a second opinion. When he gets back from vacation, we will take a swing at it and see what happens.


----------



## feznz

Well, glad the mining craze hasn't hit NZ; maybe I can send some cards back







no shortage here


----------



## mAs81

Buying a 1080 Ti is getting harder and harder here with each passing day... All the ones I checked are out of stock, and their prices got a €40-60 boost all of a sudden..


----------



## GOLDDUBBY

"No shortage"... yeah, with those prices


----------



## HoneyBadger84

NZ$1499 is about US$1090... so they are also overpriced over there, by a fair bit.


----------



## feznz

That's the way it is with import charges etc. Even if I were to buy from Newegg or Amazon, the cards would be around the same price; the only real way to buy cheaper is to jump on a plane and have a holiday at the same time








but a check around shows there's plenty of stock with all our suppliers. Still tempted to go SLI


----------



## HoneyBadger84

Quote:


> Originally Posted by *feznz*
> 
> That's the way it is with import charges etc. Even if I were to buy from Newegg or Amazon, the cards would be around the same price; the only real way to buy cheaper is to jump on a plane and have a holiday at the same time
> 
> 
> 
> 
> 
> 
> 
> 
> but a check around shows there's plenty of stock with all our suppliers. Still tempted to go SLI


Do et! Lol


----------



## Rollergold

Good day Ladies and Gents:

I flashed my 1080 Ti FTW3 with the XOC BIOS (found it in the OP), and while I was doing my voltage/frequency graph and running some tests, it was going over 1.093v (see below). Is that normal with the XOC BIOS? I thought the card was for the most part limited to 1.093v.

If that's normal, is 1.175v a safe voltage? I'm aiming for around 2100 MHz on the core. I'm running EK's full-cover block on it btw, so cooling is no issue


----------



## kithylin

Quote:


> Originally Posted by *Rollergold*
> 
> Good day Ladies and Gents:
> 
> I flashed my 1080 Ti FTW3 with the XOC BIOS (found it in the OP), and while I was doing my voltage/frequency graph and running some tests, it was going over 1.093v (see below). Is that normal with the XOC BIOS? I thought the card was for the most part limited to 1.093v.
> 
> If that's normal, is 1.175v a safe voltage? I'm aiming for around 2100 MHz on the core. I'm running EK's full-cover block on it btw, so cooling is no issue


As long as you're water cooled, you're golden to go as high as MSI Afterburner will allow. 1.200v is the maximum safe voltage allowed by Nvidia, and in fact MSI Afterburner will only let you go up to 1.200v. There are ways to go above that, but it's not obvious and you would risk damage to the card: "hard mods" like shunt resistor mods and similar tricks that cause the card to draw excessive power.

Just sticking to software with the XOC BIOS though, anything up to (or below) 1.200v is 100% perfectly safe and won't fry anything.

Disclaimer: if you somehow manage to fry your card, please don't blame me. 1.200v -SHOULD- be safe, according to Nvidia.

Also, I've been running my 1080 Ti at 2138 MHz @ 1.200v daily for going on 5 months now with no issues.


----------



## Rollergold

Quote:


> Originally Posted by *kithylin*
> 
> As long as you're water cooled, you're golden to go as high as MSI Afterburner will allow. 1.200v is the maximum safe voltage allowed by Nvidia, and in fact MSI Afterburner will only let you go up to 1.200v. There are ways to go above that, but it's not obvious and you would risk damage to the card: "hard mods" like shunt resistor mods and similar tricks that cause the card to draw excessive power.
> 
> Just sticking to software with the XOC BIOS though, anything up to (or below) 1.200v is 100% perfectly safe and won't fry anything.
> 
> Disclaimer: if you somehow manage to fry your card, please don't blame me. 1.200v -SHOULD- be safe, according to Nvidia.
> 
> Also, I've been running my 1080 Ti at 2138 MHz @ 1.200v daily for going on 5 months now with no issues.


Ok thanks for the heads up.


----------



## Hulio225

Quote:


> Originally Posted by *kithylin*
> 
> There are ways to go above that but it's not obvious and you would risk damage to the card. "Hard Mods" *like "Shunt resistor mods" and stuff like that to cause the cards to draw excessive power* and such.


I don't want to start an argument with you, but the XOC BIOS does exactly that: it lifts the power limit from the card entirely, which can (and in certain applications will) cause excessive power draw. A classic shunt resistor mod with liquid metal simply tricks the card into allowing some extra power, which is still limited, mostly to around 400 watts (depending on how much liquid metal you applied; the amount is limited by surface tension), while with the XOC BIOS the card is able to draw as much power as it can get.

Just wanted to clarify that...
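The shunt trick can be sketched with a few lines of arithmetic. This is an illustrative sketch only, with invented shunt values (real boards vary), not firmware code:

```python
# Illustrative sketch only -- invented shunt values, not firmware code.
# The limiter infers current from the voltage drop across a shunt resistor
# of assumed resistance; lowering the real resistance (liquid metal in
# parallel) shrinks the drop, so the card under-reports its own draw.

def reported_power(actual_watts, r_assumed_mohm, r_real_mohm):
    """Power the limiter 'sees' when the real shunt resistance differs
    from the value the firmware assumes."""
    return actual_watts * (r_real_mohm / r_assumed_mohm)

# Example: a 5 mOhm shunt effectively halved by the mod.
print(reported_power(400.0, 5.0, 2.5))   # card sees 200.0 W for a true 400 W

# With a 300 W limit the modded card still throttles, just later: it cuts
# in once the TRUE draw exceeds 300 / (2.5 / 5.0) = 600 W. The XOC BIOS,
# by contrast, ignores the reading entirely -- no cap at all.
```

So a shunt mod rescales the limit by a fixed factor, while the XOC BIOS removes the comparison altogether.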


----------



## SavantStrike

Quote:


> Originally Posted by *kithylin*
> 
> Coin miners bought all the top-tier video cards from GTX 1070 and up. As well with AMD side, all the vega cards and most of the RX 480/580 are gone too. Coin miners are buying everything in seconds after they appear for sale anywhere.
> 
> Coin miners are destroying the GPU industry as a whole. And just wait until the coin bubble bursts and they try to liquidate all their fried / burnt-out / mined-to-death cards that can't play games anymore in 6 months on the used market and then don't tell anyone they used em for mining.
> 
> My local Fry's Electronics has adopted a strict "no returns, no exceptions, no exclusions" on all video cards now due to coin miners that would take em home and mine em until 1 day before the end of the return period then return em broken and unusable and get their money back.
> 
> If you were wanting to build a new PC or upgrade to a new computer and are mad you can't buy a fast GPU for gaming: Be mad at and blame coin miners. They're the entire reason we're in this mess.
> 
> I'm so darn happy I scored my 1080 Ti for $685 shipped in summer 2017, they're going for $1200+ right now.


Why doesn't anyone blame Samsung? DRAM and NAND prices have been steadily rising due to supposedly unforeseen advances in mobile handsets. There is an artificially limited supply of RAM to go into the cards, which means production can't be increased to meet demand because there's no RAM. Popular RAM kits also end up out of stock on a routine basis.

South Korean RAM manufacturers have a day of reckoning coming, just like Micron did in the '90s.


----------



## Streetdragon

Quote:


> Originally Posted by *Hulio225*
> 
> I don't want to start an argument with you, but the XOC BIOS does exactly that: it lifts the power limit from the card entirely, which can (and in certain applications will) cause excessive power draw. A classic shunt resistor mod with liquid metal simply tricks the card into allowing some extra power, which is still limited, mostly to around 400 watts (depending on how much liquid metal you applied; the amount is limited by surface tension), while with the XOC BIOS the card is able to draw as much power as it can get.
> 
> Just wanted to clarify that...


and all that to go from 2000 to 2150 MHz or so, which leads to 2-3 FPS more. yeah


----------



## feznz

Quote:


> Originally Posted by *Streetdragon*
> 
> and all that to go from 2000 to 2150 MHz or so, which leads to 2-3 FPS more. yeah


hey, let's be fair, 2-4 FPS








but if you're a hobbyist bencher, then 0.1 FPS counts


----------



## GRABibus

Quote:


> Originally Posted by *Streetdragon*
> 
> and all that to go from 2000 to 2150 MHz or so, which leads to 2-3 FPS more. yeah


And here this is www.overclock.net..The pursuit of performance...Not www.atstock.net


----------



## Hulio225

Quote:


> Originally Posted by *feznz*
> 
> hey, let's be fair, 2-4 FPS
> 
> 
> 
> 
> 
> 
> 
> 
> but if you're a hobbyist bencher, then 0.1 FPS counts


While true, I guess you remember when the XOC BIOS was released and we tested it thoroughly: we came to the conclusion that it performs ~5% worse than the FE BIOS if you can increase the power limit with other methods. I am one of those guys for whom 0.1 FPS counts.








Imho the XOC BIOS puts the normal guy's card (NOT LN2) under higher risk than it is worth. The entirely lifted power limit puts so much more strain on several components (not talking VRMs, which are mostly over-engineered anyway) but on some of the other smaller SMD components, the traces in general, etc.
The water-cooling argument doesn't completely hold here, because the block isn't in contact with every component and so on.
At a certain point in May 2017 I tested something with the XOC BIOS again and I was too lazy to flash my FE BIOS back, DDU, etc., and my card died in November 2017... I can't tell if it was a production flaw or some long-term effect of having the XOC BIOS and unlimited power draw capabilities.

What I am trying to say is: be careful with the XOC BIOS, and people should stop downplaying the risks of a BIOS which has every, and yes I mean every, safety precaution disabled.

I am aware that some people can't or don't want to do a shunt mod, for example, and go for the XOC BIOS or are interested in it, but please guys, stop downplaying the risks of using it. Those are significantly higher than most likely anything else you can do with your card...

Thanks for your attention


----------



## Rollergold

Quote:


> Originally Posted by *Hulio225*
> 
> While true, I guess you remember when the XOC BIOS was released and we tested it thoroughly: we came to the conclusion that it performs ~5% worse than the FE BIOS if you can increase the power limit with other methods. I am one of those guys for whom 0.1 FPS counts.
> 
> 
> 
> 
> 
> 
> 
> 
> Imho the XOC BIOS puts the normal guy's card (NOT LN2) under higher risk than it is worth. The entirely lifted power limit puts so much more strain on several components (not talking VRMs, which are mostly over-engineered anyway) but on some of the other smaller SMD components, the traces in general, etc.
> The water-cooling argument doesn't completely hold here, because the block isn't in contact with every component and so on.
> At a certain point in May 2017 I tested something with the XOC BIOS again and I was too lazy to flash my FE BIOS back, DDU, etc., and my card died in November 2017... I can't tell if it was a production flaw or some long-term effect of having the XOC BIOS and unlimited power draw capabilities.
> 
> What I am trying to say is: be careful with the XOC BIOS, and people should stop downplaying the risks of a BIOS which has every, and yes I mean every, safety precaution disabled.
> 
> I am aware that some people can't or don't want to do a shunt mod, for example, and go for the XOC BIOS or are interested in it, but please guys, stop downplaying the risks of using it. Those are significantly higher than most likely anything else you can do with your card...
> 
> Thanks for your attention


How many of these risks with the XOC BIOS do you think would apply to users with full-cover blocks on their Tis?

I would never even think about using the XOC BIOS without a full-cover block on it anyway, and my FTW3 Ti has even more overbuilt components than the FE card.


----------



## kithylin

I've been using the XOC BIOS on my card for 5 months now. Max power draw is 350-380 watts, which is just about +75W more than some out-of-the-box AIB partner cards do. And just flashing the BIOS itself and using MSI Afterburner doesn't cause any 1080 Ti to even get close to dangerous levels, as long as you stick to MSI Afterburner and its built-in limitations. The folks getting into "dangerous power draw" are using the XOC BIOS + shunt mod, for example.

And it triggers a driver reset at 65C, instead of Nvidia's default 85C. It won't even let you get as hot as the default Nvidia BIOS on most cards.

And it's so far the only BIOS we've found that can completely eliminate power throttling on all 1080 Tis, that I'm aware of. There may be other BIOSes that do it as well, but I haven't seen them discussed yet.

The benefits far outweigh any potential negative.

All of this is kind of off-topic for this thread though. We already have a dedicated XOC BIOS thread where we've been discussing it all in depth. Not a lot of point in re-hashing stuff we've already said here.


----------



## Hulio225

Quote:


> Originally Posted by *kithylin*
> 
> Max power draw is 350-380 watts, which is just about +75W more than some out-of-the-box AIB partner cards do.


Yep, you told me that in the BIOS thread, but you play at 1080p 60 or 80 Hz with V-Sync on... sure it will be lower than on a 240 Hz G-Sync screen...
Quote:


> Originally Posted by *kithylin*
> 
> And just flashing the BIOS itself and using MSI Afterburner *doesn't cause any 1080 Ti to even get close to dangerous levels*, as long as you stick to MSI Afterburner and its built-in limitations.


Stuff like that is simply not true. I have an engineering degree in microelectronics; I'm not telling you this to argue with you or to make you feel bad...
Being able to draw 500W under certain circumstances with a card specified for 300W is a hefty strain increase. You are downplaying that whole topic pretty heavily imho, especially using words like "doesn't even get close to dangerous levels"...
Quote:


> Originally Posted by *kithylin*
> 
> The folks getting into "dangerous power draw" are using the XOC BIOS + shunt mod, for example.


You still seem not to understand that the XOC BIOS alone lifts the power limit completely. XOC BIOS + shunt mod is no different from the XOC BIOS alone!! The shunt mod doesn't do anything anymore once you have the XOC BIOS.
So at this point you are admitting that it is possible to get "dangerous power draw", since there is zero difference between running the XOC BIOS with or without the shunt mod.

Don't get me wrong, I don't want to start an argument, but you are playing it down...
Quote:


> Originally Posted by *Rollergold*
> 
> How many of these risks with the XOC BIOS do you think would apply to users with full-cover blocks on their Tis?
> 
> I would never even think about using the XOC BIOS without a full-cover block on it anyway, and my FTW3 Ti has even more overbuilt components than the FE card.


My first card was permanently under a full-cover block from EKWB too; it never saw temps over 45°C.
I can't tell you exactly how much higher the risks are, since I'm not working at Nvidia and have no insight into RMA statistics and analysis. But certain games/programs pull so much more power that common sense alone should set off a light in your mind...
Like I said previously: guys, go use it and have fun with it, but be aware that it puts your card under significantly higher strain and it can cause damage. "Can", not "will".


----------



## Rollergold

Well, I will monitor the power draw from the card during my various games and programs, and if I see any draw approaching 500W from the card I'll back off. I plan on using the card long term anyway (not just dumping it when the next xx80 Ti card comes out).

I can afford to replace the card, but Memory Express has stopped selling GPUs outside of system builds for the time being (probably due to the massive GPU price jump).
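For that kind of monitoring, here's a minimal logging sketch, assuming an NVIDIA driver with `nvidia-smi` on the PATH (the query flags are standard `nvidia-smi` options; the sampling numbers are arbitrary):

```python
# Minimal board-power logger, assuming nvidia-smi is on the PATH.
import subprocess
import time

def parse_power(line):
    """nvidia-smi with 'nounits' prints a bare number like '287.43'."""
    return float(line.strip())

def peak_power(samples=60, interval_s=1.0):
    """Poll board power draw and return the highest reading seen."""
    readings = []
    for _ in range(samples):
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=power.draw",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True).stdout
        readings.append(parse_power(out.splitlines()[0]))
        time.sleep(interval_s)
    return max(readings)

# Usage (with a game or benchmark running in the background):
#   print(f"Peak draw: {peak_power():.1f} W")
```

Note this reads the card's own power sensor, i.e. the same reading the limiter uses, so a shunt-modded card will under-report here too.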


----------



## Hulio225

Quote:


> Originally Posted by *kithylin*
> 
> When I say 350-380 watts, that's with vsync off, everything off, and running benchmarks with the card @ 100% usage constantly. It never gets to or goes above 400 watts no matter what game, software, or benchmark I use with it. *Even Furmark doesn't cause it to get to 400 watts. It will never see 500W or even close to it.* Under normal situations we cannot damage the card, or hurt it, or even come close to that, even with the XOC BIOS on it. You know that as much as I do and yet you still repeatedly attack me about this, and *you're repeatedly calling me a liar to my face*, in public, now in multiple threads. I asked you in PM to stop; I've reported you to the mods to get you to stop, twice. And yet you still persist. I really wish I knew what the hell your problem is, and whatever it is you'd go harass someone else. It's getting to the point I can't post anything on these dumb forums without you showing up and attacking me again. *sigh*


No words needed:





edit:


----------



## feznz

@Hulio225 sorry to hear your card died, RIP. The BIOS chip is on the card itself, so did you try RMA? Or would that be too cheeky?

@kithylin I would like you to try mining or folding for a month, then tell me how "safe" the XOC BIOS is to use 24/7









Talking from experience: I fried 2 GTX 580s testing "stability" at the highest possible OC with folding, and I can assure you it will shorten the lifespan of a card significantly. The best part was I managed to get an RMA because there was no modification. Sadly they couldn't replace the GTX 580s, so I had to settle for GTX 770s as replacements


----------



## kithylin

Quote:


> Originally Posted by *feznz*
> 
> @Hulio225 sorry to hear your card died, RIP. The BIOS chip is on the card itself, so did you try RMA? Or would that be too cheeky?
> 
> @kithylin I would like you to try mining or folding for a month, then tell me how "safe" the XOC BIOS is to use 24/7
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Talking from experience: I fried 2 GTX 580s testing "stability" at the highest possible OC with folding, and I can assure you it will shorten the lifespan of a card significantly. The best part was I managed to get an RMA because there was no modification. Sadly they couldn't replace the GTX 580s, so I had to settle for GTX 770s as replacements


Openly admitting you intentionally fried two GPUs, and further admitting you sent in two totally dead cards to get "free replacements" from the manufacturer, is really not something one should be proud of... I wouldn't think.


----------



## feznz

Quote:


> Originally Posted by *kithylin*
> 
> Openly admitting you intentionally fried two gpu's and further admitting you sent in two totally dead cards to get "free replacements" from the manufacturer is really not something one should be proud of.... I wouldn't think.


Intentionally? No. Happy I got replacements for a defective product that was used in the way it was intended, to help cancer research? Yes.

If Nvidia wanted to, they could have locked the clocks and voltage, but they allow OC software, so I used the MSI cards with MSI Afterburner. What did I do wrong? I used a product as the manufacturer intended.
Stock BIOS, no modifications, so I was entitled to replacements. I might add that I kept the cards below 60°C and openly admitted using waterblocks.

I am stressing that gaming is one thing, but Folding@Home or mining will put maximum stress on a card, and nobody knows how long it will last: 1 month, 1 year, or the expected 10-year lifespan at FE stock clocks.
Just that last 5% of OC will add 60%+ more power draw.

ATM I run 1.063v @ 2050 MHz and that's enough for me. Yes, I have tried to get more out of my 1080 Ti, but the gains and heat at 1.092v are simply not worth it for gaming, nor is the 100-odd watts of extra power draw.

I would rather talk about the price gouging Nvidia does with their top-tier cards, like the profit margin on a Titan V. I don't feel bad; actually I feel happy I got my money's worth
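That "last bit of OC costs disproportionate power" point tracks with first-order dynamic-power scaling, roughly P ∝ f·V². A back-of-the-envelope sketch with the numbers from the post (leakage and memory power are ignored, so treat it as a ballpark, not a measurement):

```python
# Back-of-the-envelope only: first-order dynamic power scales roughly as
# P ~ f * V^2. Frequencies/voltages below are the ones quoted in the post.

def relative_power(f0_mhz, v0, f1_mhz, v1):
    """Ratio of dynamic power at (f1, v1) versus (f0, v0)."""
    return (f1_mhz / f0_mhz) * (v1 / v0) ** 2

gain = relative_power(2050, 1.063, 2150, 1.200)
# ~1.34x the dynamic power for a ~5% clock gain
```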


----------



## kithylin

Quote:


> Originally Posted by *feznz*
> 
> Intentionally? No. Happy I got replacements for a defective product that was used in the way it was intended, to help cancer research? Yes.
> 
> If Nvidia wanted to, they could have locked the clocks and voltage, but they allow OC software, so I used the MSI cards with MSI Afterburner. What did I do wrong? I used a product as the manufacturer intended.
> Stock BIOS, no modifications, so I was entitled to replacements. I might add that I kept the cards below 60°C and openly admitted using waterblocks.
> 
> I am stressing that gaming is one thing, but Folding@Home or mining will put maximum stress on a card, and nobody knows how long it will last: 1 month, 1 year, or the expected 10-year lifespan at FE stock clocks.
> Just that last 5% of OC will add 60%+ more power draw.
> 
> ATM I run 1.063v @ 2050 MHz and that's enough for me. Yes, I have tried to get more out of my 1080 Ti, but the gains and heat at 1.092v are simply not worth it for gaming, nor is the 100-odd watts of extra power draw.
> 
> I would rather talk about the price gouging Nvidia does with their top-tier cards, like the profit margin on a Titan V. I don't feel bad; actually I feel happy I got my money's worth


Right... next you're going to tell me that coin mining and folding are actually what the consumer tier of video cards are intended for. And when your normal GTX cards (or 1080 Ti's) fail from coin mining and folding, you're going to claim they're defective. Surely nvidia designed their GAMER series of video cards to be run at 100% load constantly 24-7 for months, or years. Surely nvidia's Quadro cards are pointless.


----------



## feznz




----------



## Hulio225

Quote:


> Originally Posted by *feznz*
> 
> @Hulio225 sorry to hear your card died, RIP. The BIOS chip is on the card itself, so did you try RMA? Or would that be too cheeky?
> 
> @kithylin I would like you to try mining or folding for a month, then tell me how "safe" the XOC BIOS is to use 24/7
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Talking from experience: I fried 2 GTX 580s testing "stability" at the highest possible OC with folding, and I can assure you it will shorten the lifespan of a card significantly. The best part was I managed to get an RMA because there was no modification. Sadly they couldn't replace the GTX 580s, so I had to settle for GTX 770s as replacements


I could RMA my card, luckily







It was totally dead with the PCIe power cables attached; without them it showed up in Device Manager and I could flash the original BIOS back.

@kithylin There you go again: you have another valid example where cards are pushed hard, like with Folding@Home and such. I have proven you wrong once again. I showed evidence that Furmark draws 500+ watts; in that video I saw it peaking at 522 watts, *you CLAIMED it will not go over 400 in Furmark*... and yet you still try to talk down the whole subject and advertise that BIOS like there is nothing to worry about in any case, which is simply not true and is the whole point I am making all the time.
Every time someone shows you examples, or mentions where it can get dangerous, you backtrack, ignore it, or define by yourself what is normal and what is not.

You asked what my problem with you is. This is my problem with you, exactly: your ignorant behavior, your ignorance, and the spreading of misinformation and talking down stuff which can be dangerous in quite some cases.

Reflect on your behavior, think about what you are saying and spreading here, come to good sense, and stop saying that I harass you. I am just clarifying the misinformation you spread in the threads where I am active.


----------



## Rollergold

How long does it take Folding@Home to put max stress on the GPU? I've run it for about an hour with it set to full "Folding Power" and never saw GPU usage past 85%, and only 475-500W of total system draw. I get a similar level of power draw in the games I play (KF2, Destiny 2, Titanfall 2, etc). I downloaded Furmark and even on stock MSI AB settings I went from about 200W of total system draw at idle to around 700W, so I think I'll err on the side of caution, go back to the stock BIOS, and see what I can get. Most likely better for long-term use anyway.


----------



## GreedyMuffin

Folding@Home won't use more than 85-90%

I've been folding for years. I just recently stopped in order to join the mining hype, and I'd rather just put the cards back to folding when the hype is dead.


----------



## DStealth

Quote:


> Originally Posted by *Hulio225*
> 
> @kithylin There you go again: you have another valid example where cards are pushed hard, like with Folding@Home and such. I have proven you wrong once again. I showed evidence that Furmark draws 500+ watts; in that video I saw it peaking at 522 watts, *you CLAIMED it will not go over 400 in Furmark*... and yet you still try to talk down the whole subject and advertise that BIOS like there is nothing to worry about in any case, which is simply not true and is the whole point I am making all the time.
> Every time someone shows you examples, or mentions where it can get dangerous, you backtrack, ignore it, or define by yourself what is normal and what is not.


I can confirm that 1.2v on the XOC BIOS can exceed the 450W mark even in 3DMark, monitored both in HWiNFO and via Nvidia's command-line info.
Just got my best FS result, with only 1.15v







*28343*


----------



## feznz

I haven't folded since the 580 RMA saga; that was putting 100% load on them. Maybe they have tuned it back a little to stop scaring people like me away








Those 580s were hot, great for wintertime; saved me from using a heater.

I looked at mining when the price was about 1000 USD a Bitcoin, and it barely covered the cost of electricity and initial hardware if you mined continuously for a year. I wonder if it would be worth it now?

I get about 500W total system draw with an 8600K @ 4.6 GHz (I lost the lottery; it can bench easily @ 5 GHz) and a 1080 Ti @ 2050 MHz, but I still couldn't help myself getting a big 1000W PSU

KedarWolf called me out. I also couldn't believe that a 1080 Ti could pull 500W+, but now I believe it if you are game enough, even on air.
It is surprising how exponentially the power draw goes up after 2000 MHz.


----------



## OrionBG

Quote:


> Originally Posted by *DStealth*
> 
> I can confirm that 1.2v on the XOC BIOS can exceed the 450W mark even in 3DMark, monitored both in HWiNFO and via Nvidia's command-line info.
> Just got my best FS result, with only 1.15v
> 
> 
> 
> 
> 
> 
> 
> )
> *28343*


By XOC, do you mean the BIOS, or do you have an XOC card? If it's the BIOS, then can you please tell me how you got the voltage to 1.2V? Thanks!


----------



## Hulio225

Quote:


> Originally Posted by *OrionBG*
> 
> By XOC, do you mean the BIOS, or do you have an XOC card? If it's the BIOS, then can you please tell me how you got the voltage to 1.2V? Thanks!


XOC is the BIOS; it stands for *X*treme *O*ver*c*locking.

You can reach 1.2V with the MSI Afterburner curve editor by pressing Ctrl+F and adjusting the frequencies tied to each voltage point.
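What that curve editing amounts to can be sketched in a few lines. The points below are invented for illustration, not Afterburner's actual tables: every point at or above your chosen voltage gets clamped to your target frequency, so the card stops boosting (and requesting voltage) past that point.

```python
# Invented V/F points -- illustrative only, not Afterburner internals.

def flatten_curve(curve, target_mv, target_mhz):
    """curve: list of (millivolts, MHz) points, ascending by voltage.
    Clamp every point at/above target_mv to target_mhz."""
    return [(mv, target_mhz if mv >= target_mv else mhz)
            for mv, mhz in curve]

stock = [(1000, 1900), (1093, 2000), (1150, 2060), (1200, 2100)]
flat = flatten_curve(stock, 1093, 2050)
# Points at/above 1093 mV now all sit at 2050 MHz; lower points untouched.
```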


----------



## OrionBG

Quote:


> Originally Posted by *Hulio225*
> 
> XOC is the BIOS; it stands for *X*treme *O*ver*c*locking.
> 
> You can reach 1.2V with the MSI Afterburner curve editor by pressing Ctrl+F and adjusting the frequencies tied to each voltage point.


Thanks. I flashed that BIOS a week ago but haven't played with it much, since I'll be building a new watercooled system soon and will test what the GPU can do then. Is there a good guide to that curve in Afterburner? Every time I look at it, it makes no sense to me...


----------



## Hulio225

Quote:


> Originally Posted by *OrionBG*
> 
> Thanks. I flashed that BIOS a week ago but haven't played with it much, since I'll be building a new watercooled system soon and will test what the GPU can do then. Is there a good guide to that curve in Afterburner? Every time I look at it, it makes no sense to me...


The first page of this thread should cover every question you could possibly have. KedarWolf did a fantastic job there.


----------



## Lefty23

Quote:


> Originally Posted by *OrionBG*
> 
> Is there a good guide to that curve in Afterburner? Every time I look at it, it makes no sense to me...


There is a YouTube video by @KedarWolf in the OP here that you might find useful


----------



## OrionBG

One more question... Does the XOC BIOS negate the need for a shunt mod, or is the shunt mod still required for maximum OC?


----------



## Hulio225

Quote:


> Originally Posted by *OrionBG*
> 
> One more question... Does the XOC BIOS negate the need for a shunt mod, or is the shunt mod still required for maximum OC?


It negates the need for a shunt mod, but a shunt-modded card with the original BIOS will perform ~5% better than a card with the XOC BIOS.

This is true for FE cards and some others, and probably for most cards except the one it was designed for (the Strix)... But feel free to do your own research and form your own opinion.

Edit: To clarify, the shunt mod doesn't do anything for overclocking per se. When the power limit is reached, the card downclocks until it is back under the limit, then the clocks climb again, fluctuating like this for as long as power draw stays above the limit. The shunt mod tricks the card's power-consumption measurement by a certain amount, which lets the card draw some additional power before it starts limiting. The XOC bios ignores this measurement completely, allowing the card to draw as much power as it wants.
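The limit-and-downclock loop described above can be sketched in a few lines. This is a toy model with made-up numbers, not NVIDIA's actual boost algorithm; `shunt_factor` is my own illustrative knob for how a shunt mod skews the measurement:

```python
# Toy model of Boost-style power limiting (illustrative numbers only,
# not NVIDIA's real algorithm).

def throttle_step(clock_mhz, true_power_w, limit_w, shunt_factor=1.0):
    """Return the next core clock given the card's *measured* power.

    shunt_factor < 1.0 models a shunt mod: the sense resistor reads low,
    so the card under-reports its real power draw.
    """
    measured_w = true_power_w * shunt_factor
    if measured_w > limit_w:
        return clock_mhz - 13  # drop one boost bin until back under limit
    return clock_mhz

# Stock: 300 W real draw vs a 250 W limit -> the card downclocks.
assert throttle_step(1900, 300, 250) == 1887
# Shunt mod: the same 300 W reads as ~225 W -> no throttle.
assert throttle_step(1900, 300, 250, shunt_factor=0.75) == 1900
# XOC-style: limit effectively removed -> never throttles, whatever it pulls.
assert throttle_step(1900, 500, float("inf")) == 1900
```

The last case is the whole point of the argument in this thread: with the limit gone, nothing in the loop ever pushes the clocks back down, no matter the draw.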


----------



## ColdDeckEd

Quote:


> Originally Posted by *kithylin*
> 
> Right... next you're going to tell me that coin mining and folding are actually what the consumer tier of video cards are intended for. And when your normal GTX cards (or 1080 Ti's) fail from coin mining and folding, you're going to claim they're defective. Surely nvidia designed their GAMER series of video cards to be run at 100% load constantly 24-7 for months, or years. Surely nvidia's Quadro cards are pointless.


Wait, the guy who rails against mining because it's using the card in a way that wasn't intended... is perfectly ok with flashing a custom bios that drastically increases voltage and power consumption far beyond stock levels, and gaming/benching with it?

Is this the most blatant example of pot calling the kettle black or what?


----------



## OrionBG

Quote:


> Originally Posted by *Hulio225*
> 
> Its negating the need for a shunt mod, but a shunt-modded card with original bios will perform ~5% better than a card with XOC bios.
> 
> This is true for FE cards and some other cards. Probably true for most cards but the card it was designed for (strix)... But feel free to make your own research and build your own opinion.
> 
> Edit: To clarify, shunt mod is not doing anything in regards of overclocking per se, if limit is reached it downclocks the card until out of limit returning the clocks and fluctuating like this as long power draw is above the limit. Shunt mod is tricking the cards power consumption calculation by a certain amount and it allows the card to draw some additional power before limiting. The XOC bios is ignoring this calculation completely, allowing the card to draw as much power as it wants.


Thanks!
I have an EVGA FE 1080 Ti and I flashed the XOC bios on it a few days ago. Once my new system is ready (when I have time to build it) I'll test the life out of it


----------



## HoneyBadger84

Welp, it's getting close to the expected ship date for my GTX 1080 Ti (when I bought it, it said it would ship by the 18th of January). If it does ship by that evening, I think I may cancel that order & get a Titan Xp instead... maybe. I could always resell the 1080 Ti whenever it gets here & get one of those instead.

Sounds like these things can be crazy on power draw when OCing. I remember how crazy the power draw got when I was overclocking 4 R9 290Xs going for my own personal records at the time. Fun stuff when you see well over 1700W at the wall on a 1600W PSU; you know you're getting close to its max power. lol
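Worth remembering that wall draw and the PSU's rated DC load aren't the same number. A quick sketch, assuming ~90% efficiency (an illustrative figure, not a measurement of any specific PSU):

```python
def dc_load_watts(wall_watts, efficiency=0.90):
    """Estimate DC-side load from measured wall draw.

    efficiency is an assumed value; real PSUs vary with load and model.
    """
    return wall_watts * efficiency

# 1700 W at the wall at an assumed 90% efficiency is roughly 1530 W of
# DC load; uncomfortably close to a 1600 W PSU's rating.
assert round(dc_load_watts(1700)) == 1530
```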


----------



## OrionBG

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Welp, it's getting close to the day of expected shipping for my GTX 1080 Ti (when I bought it said would ship by the 18th of January). If it does ship by that evening, I think I may cancel that order & get a Titan Xp instead... maybe. I could always resell the 1080 Ti whenever it gets here & get one of those instead.
> 
> Sounds like these things can be crazy on power draw when OCing. I remember how crazy the power draw got when I was overclocking 4 R9 290Xs going for my own personal records at the time. Fun stuff when you see well over 1700W at the wall on a 1600W PSU, you know you're getting close to it's max power. lol


If I'm not mistaken, the Titan Xp and the 1080 Ti share the same PCB, the only difference being that the Ti is missing one RAM chip and the power phases related to it... If that's the case, the Titan Xp should be even more power hungry!


----------



## Hulio225

Quote:


> Originally Posted by *OrionBG*
> 
> If I'm not mistaken, the Titan Xp and the 1080Ti share the same PCB with the only difference that the Ti is missing one RAM chip and some power phases related to it... If this is the case, then the Titan Xp should be even more power hungry!


The power hunger comes from the XOC bios; the 1080 Ti by itself is quite an efficient card. The problem is the exponentially higher power consumption the XOC bios can cause relative to its gains (to be taken with a grain of salt).


----------



## OrionBG

Quote:


> Originally Posted by *Hulio225*
> 
> The power hunger comes from the XOC bios; the 1080 Ti by itself is quite an efficient card. The problem is the exponentially higher power consumption the XOC bios can cause relative to its gains (to be taken with a grain of salt).


Then, theoretically speaking, if there were an XOC bios for the Titan Xp, the situation would be the same, right?
My earlier comment was aimed specifically at HoneyBadger84, who is considering the Titan Xp over the Ti because of the crazy power draw in certain situations.
But, as you mentioned, the crazy power draw only manifests because of the XOC bios, not under normal conditions with the stock bios.


----------



## Hulio225

Quote:


> Originally Posted by *OrionBG*
> 
> Then just theoretically speaking, if there was a XOC bios for the Titan Xp, the situation would have been the same, right?
> My earlier comment was specifically towards HoneyBadger84 as he/she is considering the Titan over the Ti because of the crazy power draw in certain situations.
> But, as you have mentioned, the crazy power draw situations are manifesting themselves because of the XOC bios and not under normal conditions with stock bios.


This is correct.


----------



## HoneyBadger84

Quote:


> Originally Posted by *OrionBG*
> 
> Then just theoretically speaking, if there was a XOC bios for the Titan Xp, the situation would have been the same, right?
> My earlier comment was specifically towards HoneyBadger84 as he/she is considering the Titan Xp over the Ti because of the crazy power draw in certain situations.
> But, as you have mentioned, the crazy power draw situations are manifesting themselves because of the XOC bios and not under normal conditions with stock bios.


Worth noting: I don't really care about power draw :-D The main reason I'm considering a Titan Xp is the extra performance, and I'm getting a crapton of money out of my R9 295x2 (it's on eBay & currently at $770; considering it's over 2 years old, that's LUL worthy). If I wanted to, I could just cancel the GTX 1080 Ti order, wait for the R9 295x2's payment to clear after the auction ends, and order a Titan Xp. I think I'm going to wait for the 1080 Ti though. I can always resell it & get a Titan Xp later. The 30-45% increase the GTX 1080 Ti FTW3 Hybrid will give over my R9 295x2 even at stock will be amazing in and of itself (I'm getting the 30-45% from benchmarks I've seen on Guru3d & the like).

The house I live in has solar power, so a little extra power draw won't mean much, and I don't really see the 1080 Ti or Titan Xp ever reaching the silly power draw numbers I've seen from this 295x2... or am I wrong about that assumption? I'll haveta look up my old power numbers with just one card in.


----------



## ColdDeckEd

Oh no, it will never reach 295x2 levels of absurdity unless you get a Kingpin. Crazy about the 295x2: I was set to buy a modified one locally (two separate AIOs) for $150 a few months ago...


----------



## HoneyBadger84

Quote:


> Originally Posted by *ColdDeckEd*
> 
> Oh no it will never ever reach 295x2 levels of absurdity unless you get a kingpin. Crazy about the 295x2. I was set to buy a modified one locally(two separate aios) for 150 afew months ago..


Yeah, there was one on eBay that sold about 4 days ago for $1200. Insanity. What I'm already getting makes me happy; if I get anywhere near that I'll be ecstatic, since I only paid $999 when it was brand new.


----------



## blixt

World of Warcraft, the new FurMark 2.0? lol

Just messing around with the XOC bios on my SC2 Hybrid, found a power hungry MSAA setting in wow "Color 4x / Depth 4x"
2101 @ 1.200v = 470-480w GPU power draw. System power draw (watt meter) 580-600w. Saw a peak at 500w with GPU temp at 48c.

Superposition 430max with the same OC.
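For anyone wanting to log numbers like these themselves, `nvidia-smi` can report board power draw. A small sketch; the query string is the standard `nvidia-smi` one, and the parsing assumes a single-GPU line shaped like `"478.23 W"`:

```python
import subprocess

def read_gpu_power(sample=None):
    """Return GPU board power draw in watts.

    Calls `nvidia-smi --query-gpu=power.draw` when no sample is given;
    `sample` lets you parse a previously captured line instead.
    """
    if sample is None:
        sample = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=power.draw",
             "--format=csv,noheader"],
            text=True)
    # Output looks like "478.23 W"; keep the numeric part.
    return float(sample.strip().split()[0])

assert read_gpu_power("478.23 W") == 478.23
```

Polling this in a loop while gaming is an easy way to catch the kind of per-setting spikes blixt describes, without needing a wall meter.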


----------



## kithylin

Quote:


> Originally Posted by *ColdDeckEd*
> 
> Wait, the guy that rallies against mining because its using the card in a way that wasn't intended... is perfectly ok with flashing a custom bios that drastically increases voltage and power consumption far beyond stock levels, and gaming/benching with it?
> 
> Is this the most blatant example of pot calling the kettle black or what?


Yes, because XOC is perfectly fine and harmless... for GAMING (and gaming-related benchmarks). Surely people have enough common sense not to flash a power-unlimited bios to their card and then go mining with it. Or if they don't know better, then they deserve to kill their cards.

The XOC unlimited bios is literally the same thing as the common practice around overclock.net of manually editing bioses to remove power limits, which has been done here with every GPU since the first cards whose bioses we figured out how to edit. That's the generally accepted way of removing power throttling on any GPU, AMD and NVIDIA. I have power-unlimited bioses on every Nvidia boost card I've ever owned and they all still work. But I don't mine with them; there's a big difference.

You never heard anyone flailing their arms going "WATCH OUT, IT CAUSES UNSAFE POWER DRAW" in the past when someone edited a bios and sent it back to be flashed on OCN. Everyone just edits the bios and sends it back like "Here you go fam," and we flash it and go. I don't understand why everyone has their panties in a twist over this one bios and 1080 Tis all of a sudden. It's as if, somehow, with this combination of bios and 1080 Ti, we're doing something we haven't done with every other card in the past.


----------



## ColdDeckEd

There is a big difference between Pascal and Maxwell when it comes to voltages. If you think pushing what would be high voltages on Maxwell (I've pushed Maxwell to 1.28 using bios mods on many cards, btw) on Pascal is totally safe, then I disagree. I'm not saying people shouldn't do it; I'm all for modding bioses. BUT Pascal is NOT Maxwell. If voltages up to 1.23 were safe on Pascal, NVIDIA would have allowed them; they allowed it on Maxwell, didn't they? The truth is 1.09v is the official limit for a reason; it's plenty for Pascal, and yes, going to 1.2 will kill the card faster unless you have the cooling for it. If you are going to push 1.2v, I'd say you are doing more "damage" to the card than mining ever could.


----------



## Hulio225

Quote:


> Originally Posted by *kithylin*
> 
> *Yes because XOC is perfectly fine and harmless.. for GAMING (And gaming related benchmarks).*


You are doing it again; this isn't true, period.
It's *NOT* harmless. There are games which will pull close to 500 watts, and that is not harmless. Stop saying that, simply stop!!!!!

Another example:
Quote:


> Originally Posted by *blixt*
> 
> World of Warcraft, the new FurMark 2.0? lol
> 
> Just messing around with the XOC bios on my SC2 Hybrid, found a power hungry MSAA setting in wow "Color 4x / Depth 4x"
> 2101 @ 1.200v = 470-480w GPU power draw. System power draw (watt meter) 580-600w. Saw a peak at 500w with GPU temp at 48c.
> 
> Superposition 430max with the same OC.


You can use it and you can have fun with it, but drawing close to twice the specified power (in certain games and applications) is not harmless or risk-free.
Nobody is saying to stop using the bios, or that the bios will definitely destroy cards or anything; it simply carries a significantly higher risk of damaging the card compared to the stock bios or even shunt mods. This is a fact, defined by the current going through the card!!

Why the **** are you trying to sell the XOC bios as if it were the same as the stock bios in terms of strain on the card???
It makes no sense...

People have to be aware of the increased strain (and its amount) on the card while using that bios, it's that simple. But you are playing it down like it would be nothing.... why?!
We have provided enough evidence to prove you wrong, multiple times, and you still claim it's harmless... It's like you're getting paid to advertise that bios, lol...


----------



## blixt

Quote:


> Originally Posted by *Hulio225*
> 
> You are doing it again; this isn't true, period.
> It's *NOT* harmless. There are games which will pull close to 500 watts, and that is not harmless. Stop saying that, simply stop!!!!!
> 
> Another example:
> You can use it and you can have fun with it, but drawing close to twice the specified power is not harmless or risk-free.
> Nobody is saying to stop using the bios, or that the bios will definitely destroy cards or anything; it simply carries a significantly higher risk of damaging the card compared to the stock bios or even shunt mods. This is a fact, defined by the current going through the card!!
> 
> Why the **** are you trying to sell the XOC bios as if it were the same as the stock bios in terms of strain on the card???
> It makes no sense...
> 
> People have to be aware of the increased strain on the card while using that bios, it's that simple. But you are playing it down like it would be nothing.... why?!
> We have provided enough evidence to prove you wrong, multiple times, and you still claim it's harmless... It's like you're getting paid to advertise that bios, lol...


Just a little disclaimer: when using SSAA 4x + CMAA instead, the power draw drops to 300-350w and frame times normalize.
But no doubt XOC is like an old V8: ****ty performance and gas mileage, but the extra displacement makes up for it


----------



## HoneyBadger84

Out of curiosity, what temps are considered safe for an air or AiO (like the one I'm getting, FTW3 Hybrid) GTX 1080 Ti? Or rather, where does thermal throttling occur (on the normal BIOS)? I may push the card a bit since I don't think I'll be keeping it for the long run. That's if it ever gets here.


----------



## Hulio225

Quote:


> Originally Posted by *blixt*
> 
> Just a little disclaimer: when using SSAA 4x + CMAA instead, the power draw drops to 300-350w and frame times normalize.
> But no doubt XOC is like an old V8: ****ty performance and gas mileage, but the extra displacement makes up for it


I know it isn't always like that, just under certain circumstances or in some games like Mass Effect Andromeda, which was pulling around 480 watts with the XOC bios. But he is claiming stuff like this:
Quote:


> Originally Posted by *kithylin*
> 
> When I say 350-380 watts, that's vsync off, everything off, and running benchmarks with the card @ 100% usage constantly. It never gets to or goes above 400 watts no matter what game, software, or benchmark I use with it. Even furmark doesn't even cause it to get to 400 watts. It will never see 500W or even close to it.


I showed him this (video embed, peaking at 522 watts btw) and he still wants to be right about everything he claims... You're giving another example, other people did the same, and he still claims the same stuff... What is wrong with that guy, I mean seriously...

And that was just one example; I could show several more where he does stuff like that... Maybe he's a troll and he got me triggered like hell xD


----------



## kithylin

Quote:


> Originally Posted by *Hulio225*
> 
> You are doing it again, this isn't true, period.
> Its *NOT* harmless, there are games which will pull close to 500watts, and this is not harmless, stop saying that, simply stop!!!!!


Since you're new here to the OCN forums (your join date is Jan 2017), you must not understand what this place is about. Here on OCN we've been flashing unlimited-power bioses on all of our Nvidia and AMD/ATI video cards since the first cards that started shipping with boost/power limits. Cross-flashing unlimited-power bioses is also a common practice around here. This is what we do on OCN: we remove power limits, overclock the cards and game on 'em. We've done it before, and we'll keep doing it with Volta (Nvidia), Navi (AMD), and every newer video card in the future.

No matter how much you huff, puff, scream, and flail your arms around, we're all going to keep doing it anyway. If you're scared of having unlimited-power bioses on your cards, then perhaps the OCN forums aren't the place for you. The rest of us will keep laughing at you and enjoying our unlocked cards. Also, the older pre-boost cards didn't have any power limits at all. They behaved exactly like 1080 Tis with XOC on them do today: no limits at all. Yet everyone still overclocked them, raised voltages, ran custom bioses and everything.


----------



## Hulio225

Quote:


> Originally Posted by *kithylin*
> 
> Since you're new here to the OCN forums (your join date is Jan 2017), you must not understand what this place is about. Here on OCN we've been flashing unlimited-power bioses on all of our Nvidia and AMD/ATI video cards since the first cards that started shipping with boost/power limits. Cross-flashing unlimited-power bioses is also a common practice around here. This is what we do on OCN: we remove power limits, overclock the cards and game on 'em. We've done it before, and we'll keep doing it with Volta (Nvidia), Navi (AMD), and every newer video card in the future. No matter how much you huff, puff, scream, and flail your arms around, we're all going to keep doing it anyway. If you're scared of having unlimited-power bioses on your cards, then perhaps the OCN forums aren't the place for you. The rest of us will keep laughing at you and enjoying our unlocked cards. Also, the older pre-boost cards didn't have any power limits at all. They behaved exactly like 1080 Tis with XOC on them do today: no limits at all. Yet everyone still overclocked them, raised voltages, ran custom bioses and everything.


*
You are the one who isn't understanding the whole POINT!!!!*

I am the guy who pursues every little fraction of a percent of performance no matter what! BUT I am aware of the risks and of what can happen to my card if I push 450 amperes through it. I have no fear in doing that, but I understand exactly what I am doing.

BUT there are enough other people who aren't aware of exactly what they are doing; they ask in this thread whether it's safe, or how high the risk of using that bios is.
YOU CLAIM IT IS NOTHING and PERFECTLY SAFE... WHILE THIS IS PLAIN AND SIMPLE NOT TRUE.

THAT IS MY WHOLE POINT, ALL THE TIME, ALL THE TIME. Get this into your head, dude!


----------



## kithylin

Quote:


> Originally Posted by *Hulio225*
> 
> *
> You are the one who isn't understanding the whole POINT!!!!*
> 
> I am the guy who pursues every little fraction of a percent of performance no matter what! BUT I am aware of the risks and of what can happen to my card if I push 450 amperes through it. I have no fear in doing that, but I understand exactly what I am doing.
> 
> BUT there are enough other people who aren't aware of exactly what they are doing; they ask in this thread whether it's safe, or how high the risk of using that bios is.
> YOU CLAIM IT IS NOTHING and PERFECTLY SAFE... WHILE THIS IS PLAIN AND SIMPLE NOT TRUE.
> 
> THAT IS MY WHOLE POINT, ALL THE TIME, ALL THE TIME. Get this into your head, dude!


There's always a risk of frying any video card when you flash any bios other than stock. Everyone knows and acknowledges that risk when they do it. It's in the hands of whoever owns the card: if they flash something onto it and break it, it's no one's fault but their own. Also, you're still failing to understand: nothing we do with any of our cards is proven safe around here; we all know it, and we do it anyway. It's like you think we haven't been doing this for years already.......


----------



## Hulio225

Quote:


> Originally Posted by *kithylin*
> 
> It's always a risk of frying any video card when you're flashing any bios on it other than stock. Everyone knows and acknowledges that risk when they do it. It's in the hands of whom ever owns the card that if they flash something onto it and break it, it's no one's fault but their own. Also you're still failing to understand: Nothing we do with any of our cards is proven safe around here, we all know it and we do it anyway. It's like you're thinking we haven't been doing this for years already or something.......


Quote:


> Originally Posted by *kithylin*
> 
> Just sticking to software though with the XOC bios, *anything up to (or below) 1.200v is 100% perfectly safe and won't fry anything*.


Quote:


> Originally Posted by *kithylin*
> 
> And just flashing the bios it's self and using msi afterburner *doesn't cause any 1080 Ti's to even get close to any dangerous levels*, as long as you stick to MSI Afterburner software and it's built in limitations.


Quote:


> Originally Posted by *kithylin*
> 
> Even furmark doesn't even cause it to get to 400 watts. It will never see 500W or even close to it. *Under normal situations we can not damage the card, or hurt it, or even come close to that even with XOC Bios on it.*




Dude, do you want more of your inconsistent arguments? You make me laugh every time you answer; thanks for that


----------



## Agent-A01

Quote:


> Originally Posted by *Hulio225*
> 
> 
> 
> Dude do you want more of your inconsistent arguments? You are making me laugh every time you answer, thanks for that


Enough of the bickering.

Your inconsequential arguments filling pages aren't helping anyone.

FYI, 400w is not dangerous at all when temperatures are kept relatively safe for the Ti.

The Ti has a very robust component build.

I had the OG Titan, which would hit 600w all day for a year, and its components were not nearly as strong.
Thing is, it was watercooled, which lowers the risks.

Plainly, making a blanket statement that over 400w is dangerous is incorrect; it's not as simple as you make it out to be.
Variables must be taken into account.

Yes, 400w could be dangerous with 30c ambient temps on a reference cooler.
That's not too common around here though.


----------



## HoneyBadger84

Since it was skimmed over by the argument going on (lol), figured I'd repeat my question:

Out of curiosity, what temps are considered safe for an air or AiO (like the one I'm getting, FTW3 Hybrid) GTX 1080 Ti? Or rather, where does thermal throttling occur (on the normal BIOS)? I may push the card a bit since I don't think I'll be keeping it for the long run. That's if it ever gets here.

Also, seriously, assuming that everyone here knows flashing a BIOS & doing what most of us do is dangerous, and accepts the risks, is a bad assumption. There are always new people coming around; telling everyone it's perfectly safe & that you can't possibly fry your card by pumping twice the original power limit through it is absolutely ridiculous, and shouldn't be the routine anywhere.


----------



## 6u4rdi4n

I thought the Pascal cards had hardware limitations built into them as well?


----------



## Mooncheese

Quote:


> Originally Posted by *Hulio225*
> 
> 
> 
> Dude do you want more of your inconsistent arguments? You are making me laugh every time you answer, thanks for that


I haven't been around here for a while. I've got my card stable at 2025 MHz with an undervolt to 1.0v, with load temps of ~35C under a full waterblock:




Anyhow, whoever is claiming that 1.2v+ and 400W+ is no problem for longevity is completely clueless. I actually had to post here to say this.

Sure buddy, keep running your card with the XOC vbios at 1.2v+ and 400W+! That's totally safe. No man, the 50k-hour MTBF for Pascal was totally not calculated taking voltage, wattage, temperature and time into account.

Because your card didn't immediately croak under 1.2v and 450W, it must be invincible! The MTBF totally wasn't just shrunk from 50k hours to 10k hours! No! You have a Super Card. Yeah, you know, like Superman?

Keep informing the rest of us how we are all incorrect, many of us sitting here HAVING LEARNED the hard way, with fried cards, that no, unfortunately you can't go FULL ****** with an unlocked vbios.

Thanks for the laugh!

PC Clown Race represent!


----------



## Mooncheese

What is also so totally asinine is that people will risk ruining a card that can literally be put up on eBay and sold for $1k used right now, for the sake of maybe 10% more performance.

50% more current (450W)
20% more voltage (1.2v)
10% more performance
200% less MTBF (or more)

If you're doing the above, you're either rolling in cash or an idiot. Pick one.

You're better off picking up another 1080 Ti and setting both cards to 100% PT; you'd have what, a 70% gain for the same TDP, and you wouldn't have to worry about destroying said hardware.

Where has the common sense gone?

Are we PC Master Race or PC Clown Race?
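The lifetime argument above can be made concrete with a standard back-of-the-envelope reliability model: an inverse power law in voltage multiplied by an Arrhenius term in temperature. Every constant here (the exponent, the activation energy, the reference points) is an illustrative assumption for the sketch, not an NVIDIA figure:

```python
import math

def lifetime_factor(v, v_ref=1.093, n=6.0, t_c=50.0, t_ref=40.0, ea_ev=0.7):
    """Rough relative-lifetime estimate versus stock conditions.

    Inverse power law in voltage (exponent n) times an Arrhenius term in
    core temperature (activation energy ea_ev, in eV). All constants are
    illustrative assumptions; the point is the trend, not the numbers.
    """
    k_b = 8.617e-5  # Boltzmann constant, eV/K
    volt_term = (v_ref / v) ** n
    temp_term = math.exp(
        (ea_ev / k_b) * (1.0 / (t_c + 273.15) - 1.0 / (t_ref + 273.15)))
    return volt_term * temp_term

# Stock voltage at the reference temperature is the baseline.
assert lifetime_factor(1.093, t_c=40.0) == 1.0
# 1.2 V at the same temperature already costs a large fraction of the
# estimated lifetime under these assumed constants.
assert lifetime_factor(1.2, t_c=40.0) < 0.7
```

Under these assumptions the voltage bump alone cuts the estimated lifetime by roughly 40%, before the extra heat is even counted, which is the trade the list above is complaining about.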


----------



## kithylin

Quote:


> Originally Posted by *Mooncheese*
> 
> What is also so totally asinine is that people will risk ruining a card that can literally be put up on ebay and sold for $1k used right now for the sake of maybe 10% more performance?
> 
> 50% more current (450W)
> 20% more voltage (1.2v)
> 10% more performance
> 200% less MTBF (or more)
> 
> If you're doing the above, you're either rolling in cash or an idiot. Pick one.
> 
> You're better off picking up another 1080 Ti and setting both cards to 100% PT; you'd have what, a 70% gain for the same TDP, and you wouldn't have to worry about destroying said hardware.
> 
> Where has the common sense gone?
> 
> Are we PC Master Race or PC Clown Race?


For SLI to work with positive scaling in the +80% or better range, the game developer has to support it and code it into their game, and Nvidia has to release SLI profiles for it via driver updates. If either of those doesn't happen, you only get to use a single card in your games. So not all games will support SLI; some flat out don't support it at all and never will. Even among the ones that do, some don't scale very well (poorly optimized). Some games support AMD CrossFireX but not SLI, and some support SLI but not CrossFireX. Multi-GPU has always been a big dice game in PC gaming.

The best way for computer gaming is a very powerful single card.

The main reason for overclocking: back when the 1080 Ti launched for $699, the common move was to grab one for $699, overclock it, and essentially get the same (or faster) performance as the more expensive $1200 Nvidia TITAN Xp for $500 less. At the time that was actually a good thing to do. Now that 1080 Tis are $1000 (or higher), the difference isn't as big.

I just looked, and today 1080 Tis go for $1075 - $1100 while the Nvidia TITAN Xp is $1400 - $1500, so it's roughly the same margin at today's prices: about $400. It depends on the card and what it can do, but my 1080 Ti, clocked where I have it, already exceeds the TITAN Xp by about +7%.


----------



## ColdDeckEd

Quote:


> Originally Posted by *kithylin*
> 
> The main reason for overclocking is back when the 1080 Ti launched for $699, the common thing was to get a 1080 Ti for $699, overclock it and essentially have the same (or faster) performance as the more expensive $1200 Nvidia TITAN Xp for -$500 less. Which at the time, was actually a good thing to do. However now that the 1080 Ti's are $1000 (or higher), it's not such a difference.
> 
> I just looked, and today we can get 1080 Ti's for $1075 - $1100, and the Nvidia TITAN Xp is $1400 - $1500. So it's almost the same price margin difference at today's prices, +/- $400 today. It depends on the card and what it can do but my 1080 Ti, clocked where I have it already exceeds the TITAN Xp by about +7%.


Strawman much? Nobody is talking about why we overclock. We all know there's free extra performance when we overclock; that's a big reason why we do it. No one is saying you won't get extra performance from XOC.

The point literally everyone else is making is that one should not use XOC as a daily bios, because it will shorten the lifespan of the card, unless you've got a full loop that can handle the extra heat it generates. I have a Seahawk, and even with the AIO I wouldn't feel comfortable using that bios for daily gaming. For a few benches to beat personal records, sure, but it's pretty much a waste to run it for gaming.

But you keep doing you. I just hope that if you plan on reselling that card, you'll tell prospective buyers you ran that bios.


----------



## kithylin

Quote:


> Originally Posted by *ColdDeckEd*
> 
> Strawman much? Nobody is talking about why we overclock. We all know theres free extra performance when we overclock, thats a big reason why we do it. No one is saying you wont get extra performance from xoc.
> 
> The point literally everyone else is making is that one should not use xoc as a daily bios because it will shorten the lifespan of the card. Unless youve have a full loop that can handle the extra heat generated by it. I have a seahawk and even with the aio I wouldnt feel comfortable using that bios for daily gaming. For afew benches to beat personal records sure, but its pretty much a waste to run it for gaming.
> 
> But you keep doing you. I just hope if you plan on reselling that card youll tell prospective buyers you ran that bios.


Please explain to me exactly how using the XOC bios on a 1080 Ti is any different from using the bios editor software to make a custom bios for any previous Nvidia boost card and setting the max power to something like 800W+, removing the power limit and power throttling. That's what almost everyone has been doing for the many years since Nvidia started selling boost cards. The way some folks are going on about it, surely there must be some "secret sauce" in the hex code of this bios that makes the card do something special that we haven't already been doing with Nvidia cards in the past.


----------



## ColdDeckEd

Quote:


> Originally Posted by *kithylin*
> 
> Please explain exactly to me how using the XOC bios on a 1080 Ti is any different than using the bios editor software and making a custom bios for any and all previous nvidia boost cards and setting the max power to something like 800W+ so we remove the power limit and remove power throttling. Which is generally what almost every one has been doing for the past many years since nvidia started selling boost cards. The way some folks are going on about it, surely there must be some "Secret sauce" in the hex code for this bios that is causing this card to do something special by using it that we haven't already been doing for nvidia cards in the past.


Pascal is 16nm and Maxwell is 28nm. Nvidia "allowed" bios modding on previous generations because they knew the silicon could handle 1.2+ volts. When they decided that Pascal couldn't handle the voltages Maxwell did, they locked down the bios to prevent mass failures. That's the difference.


----------



## kithylin

So then I guess the big question we should be asking: has anyone out there confirmed ACTUALLY killing, frying, or damaging their 1080 Ti Pascal card by using the XOC bios, or any other bios or cross-flash, combined with MSI Afterburner? Under "normal usage," that is: playing games, 3DMark, or other game-related benchmarks, and NOT mining with it?


----------



## feznz

This is entertaining


----------



## ColdDeckEd

Your hate for mining is funny considering your position on running a custom bios. Your argument against mining is that you aren't using the card as intended... yet flashing a custom bios and running the card at voltages it was never meant to run is using the card as intended?

That you are so willfully ignorant of the irony makes me think you are just trolling now.

Anyway, I'm done responding to you and messing up this thread; sorry for arguing. The one thing we all agree on is that it's up to the owner of the card to do their due diligence before doing any kind of mods. Time will tell who's right and who's wrong, I guess.


----------



## Hulio225

Quote:


> Originally Posted by *ColdDeckEd*
> 
> Pascal is 16nm and Maxwell is 28nm. Nvidia "allowed" bios modding with previous generations because they knew the silicon could handle 1.2+ volts. When they decided that pascal couldnt handle the voltages maxwell did they locked down the bios to prevent mass failures. Thats the difference.


In addition to that, Pascal is NVidia's first architecture with FinFET transistors; they simply don't have long-term experience with that design to know what is safe and what isn't.
Quote:


> Originally Posted by *Agent-A01*
> 
> Enough of the bickering.
> 
> Your inconsequential arguements filling pages isn't helping anyone.
> 
> FYI 400w is not dangerous at all when temperatures are relatively safe for the Ti.
> 
> Ti has a very robust component build.


Temperature alone doesn't determine whether the current flowing through components is safe; besides, not every trace and every small SMD component is cooled by the waterblock.
Quote:


> Originally Posted by *Agent-A01*
> 
> I had the OG titan that would hit 600w all day for a year and its' components were not nearly as strong.
> Thing is, it was watercooled which lowers risks.


But you are aware that it could die at any time (or that its MTBF is significantly smaller) because the strain on it was twice what it was designed for, right?!
Quote:


> Originally Posted by *Agent-A01*
> 
> Plainly, making blanket statement over 400w is dangerous is incorrect and not as simple as you make it out to be.
> Variables must be taken into account.
> 
> Yes 400w could be dangerous with 30c ambient temps on a reference cooler.
> That's not too common around here though.


No "blanket statement" has been made; I explained every statement. The whole point is not to downplay using that bios as if it were nothing, because using it or not makes a significant difference. People asking about that bios should be aware of what it does to their systems and cards...

I am done arguing with those particular guys here now too, but I will clarify things if other people (who don't know exactly what that bios does) ask about it.


----------



## kithylin

The bottom line is all of this theorizing that it's "dangerous" is pointless unless we actually have concrete data from other users stating "Yes, it killed my card(s)". Until then, it's all just a theory, I think it's not dangerous, y'all folks think it is, etc. So not much point in continuing any of the discussion until we get actual data from folks. I'll drop it, and stop saying it's safe, if y'all will drop it too saying it isn't, and let's just be civil and end this until multiple different individuals come in here saying their cards are dying and we actually have something to discuss on the subject.

Also the other bottom line: Everyone should know, flashing any bios to any card you own is your own risk as the owner. It doesn't matter what I, them, or anyone else says on the forums. If you do it, it's your issue and if something goes wrong it's you, and only you to blame. The only "totally safe" bios is the one that shipped on your card.

Slightly back on topic. For those of you wondering why current GPU prices are so high:


----------



## feznz

I just have to add the times a throttling bios has saved my components.

1st time: I didn't clip the stock Intel cooler on properly, installed Windows, tried some benchmarks, and wondered why I was hitting 101°C @ 1.6 GHz. There are many reports of those stupid push clips.

2nd time: I didn't remove the protective film from the bottom of a water block. I checked it but didn't notice the film; there is a printed label on them now, so I'm glad I wasn't the only one.

3rd time: I forgot to plug in my secondary 24V PSU dedicated to the water pump. I removed the secondary PSU after doing this twice; what's the point of running a 12V pump @ 60% duty cycle on 24V anyway?

Using the XOC bios is not really recommended for daily use, IMHO.
If you can hit 2000 MHz on the stock bios you are golden, even better if you can hit it @ 1V.

But I will leave you with this: it shows what you can do without a power/heat-limiting bios.
I would demonstrate it with my own GPU, but the principle is the same.


----------



## D13mass

Quote:


> Originally Posted by *D13mass*
> 
> Hi guys! I will try to describe my problem.
> 
> I have had a 1080 Ti FE card with an EK waterblock for a long time, and I flashed the Palit bios maybe half a year ago. Today, while playing "Rise of the Tomb Raider", I noticed the FPS wasn't smooth. I turned on MSI Afterburner monitoring and saw only 30-40 FPS... I then spent many hours investigating and found the reason...
> 
> By default, my card with the Palit bios:
> 
> 1962 MHz core and 1251 MHz memory == 115 FPS avg. in the "Rise of the Tomb Raider" benchmark; perfectly smooth gameplay.
> 
> When I run a tool like MSI Afterburner or EVGA Precision X and apply my long-known stable settings of +100 core and +600 memory, I get:
> 
> 2050 MHz core and 1400 MHz memory == 60 FPS avg. in the benchmark, and a 30-40 FPS slideshow in gameplay.
> 
> Here is the most important part: even if I switch back to default settings or close the OC tool (MSI Afterburner or EVGA Precision X), I still get 60 FPS in the benchmark and 30-40 in real gameplay until I reboot my PC or run the OC tool again.
> 
> What should I do? Is it an OC problem or degradation of my card?
> 
> Of course I could forget about overclocking the card and use it at its defaults (1962 MHz), but this is the first time I've met such an interesting bug.


I found the issue: I had over-overclocked my card. Pascal behaves strangely when your OC is unstable:







FPS gets lower and lower every minute... So I just flashed the stock bios for my FE card and undervolted it with the MSI AB curve; it now runs at 0.9 V, 1950/11400 MHz.


----------



## blixt

Quote:


> Originally Posted by *kithylin*
> 
> The bottom line is all of this theorizing that it's "dangerous" is pointless unless we actually have concrete data from other users stating "Yes, it killed my card(s)". Until then, it's all just a theory, I think it's not dangerous, y'all folks think it is, etc. So not much point in continuing any of the discussion until we get actual data from folks. I'll drop it, and stop saying it's safe, if y'all will drop it too saying it isn't, and let's just be civil and end this until multiple different individuals come in here saying their cards are dying and we actually have something to discuss on the subject.
> 
> Also the other bottom line: Everyone should know, flashing any bios to any card you own is your own risk as the owner. It doesn't matter what I, them, or anyone else says on the forums. If you do it, it's your issue and if something goes wrong it's you, and only you to blame. The only "totally safe" bios is the one that shipped on your card.
> 
> Slightly back on topic. For those of you wondering why current GPU prices are so high:


Lol 1030-1400$ for a 1080ti is the normal price in Denmark. Feeling ripped off is part of life in this "**** hole"


----------



## Streetdragon

Quote:


> Originally Posted by *Streetdragon*
> 
> and all that to go from 2000 to 2150Mhz or so wich leads to 2-3FPS more. yeah


Maybe I should have written something like "for me it's not worth it".

I didn't want to start a war here ^^

All in all I'm just a pus*sy, but it is my first 700€+ card and I saw my electricity bill xD. If you game 2 hours a day without vsync, with the GPU drawing 350 W instead of 250 W, that's more than 1€ extra on the bill.
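A back-of-the-envelope check of that figure. The numbers below are assumptions pulled from the thread: 100 W of extra draw, 2 hours a day, and roughly 0.31 €/kWh (the German rate mentioned later on).

```python
# Estimate the extra electricity cost of a higher GPU power draw.
# All inputs are assumptions for illustration, not measured values.

def extra_cost_eur(extra_watts, hours_per_day, days, eur_per_kwh):
    """Added cost of the extra GPU draw over a billing period."""
    extra_kwh = extra_watts / 1000 * hours_per_day * days
    return extra_kwh * eur_per_kwh

# 100 W extra, 2 h/day, for a 30-day month at 0.31 EUR/kWh:
print(round(extra_cost_eur(100, 2, 30, 0.31), 2))  # 1.86
```

So under those assumptions the "more than 1€ extra" works out to roughly 1.86 € per month.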


----------



## HoneyBadger84

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Out of curiosity, what temps are considered safe for an air or AiO (like the one I'm getting, FTW3 Hybrid) GTX 1080 Ti? Or rather, where does thermal throttling occur (on the normal BIOS)? I may push the card a bit since I don't think I'll be keeping it for the long run. That's if it ever gets here.


Anyone? :-D


----------



## Hulio225

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Anyone? :-D


Thermal throttling is a feature of GPU Boost 3.0.

Starting at the switch from 35°C to 36°C, the card lowers its clock by one 13 MHz step (12.5 MHz to be precise) for every temperature threshold it crosses.

Other than that: under air (FE cooler, stock fan curve) with the stock bios, cards easily reach 80+°C, if that helps.

So if you don't want GPU Boost 3.0 to throttle your clocks, you have to stay at or below 35°C.
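That rule can be sketched numerically. This is only a rough model: the 12.5 MHz step comes from the post above, but the 5°C spacing between thresholds is my assumption, since the post doesn't say how far apart the thresholds are.

```python
# Rough sketch of GPU Boost 3.0 thermal binning as described in the post.
# BIN_MHZ is from the post; the threshold spacing (step_c) is an assumption.
import math

BIN_MHZ = 12.5    # clock dropped per threshold crossed (per the post)
BASE_TEMP_C = 35  # last temperature with no thermal throttling

def boosted_clock(max_boost_mhz, temp_c, step_c=5):
    """Estimated boost clock after thermal bins are applied."""
    if temp_c <= BASE_TEMP_C:
        return max_boost_mhz
    bins_crossed = math.ceil((temp_c - BASE_TEMP_C) / step_c)
    return max_boost_mhz - bins_crossed * BIN_MHZ

print(boosted_clock(2000, 35))  # no throttle: 2000
print(boosted_clock(2000, 36))  # first bin:   1987.5
print(boosted_clock(2000, 80))  # 9 bins:      1887.5
```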


----------



## HoneyBadger84

Quote:


> Originally Posted by *Hulio225*
> 
> Thermal throttling is a feature of GPU Boost 3.0.
> 
> Starting at the switch from 35°C to 36°C, the card lowers its clock by one 13 MHz step (12.5 MHz to be precise) for every temperature threshold it crosses.
> 
> Other than that: under air (FE cooler, stock fan curve) with the stock bios, cards easily reach 80+°C, if that helps.
> 
> So if you don't want GPU Boost 3.0 to throttle your clocks, you have to stay at or below 35°C.


Oh, I was sort of asking when it starts severely downclocking because you've reached the "scary" temperature programmed into the BIOS, the way an R9 295X2 throttles if you somehow reach 74C (which I've personally never seen on mine, thanks to a push/pull fan configuration on the radiator). Just curious how much headroom I'll have.

From what I'm seeing, the thermal throttle temp on the GTX 1080 Ti is either 91C or 92C. I don't think I'll ever see that myself, since I keep my room fairly cool, my case has a crapton of airflow, and my card will have the AiO loop cooling the GPU/VRMs.

But moreover, what kind of temps do people see on a standard or aftermarket cooler on this card in regular usage? I know it can get up into the 80s (reviewers show as much on FE/blower-style cards), but is that actually routine, or not so much? I've seen reviews say ~84C is the highest they see on a blower card, whereas most aftermarket coolers stay below 70-75C, give or take a degree or two.

Trying to get a rough feel for how this eVGA FTW3 Hybrid is going to run, since there are very few reviews out there for it.


----------



## OrionBG

Quote:


> Originally Posted by *Streetdragon*
> 
> Maybe I should have written something like "for me it's not worth it".
> 
> I didn't want to start a war here ^^
> 
> All in all I'm just a pus*sy, but it is my first 700€+ card and I saw my electricity bill xD. If you game 2 hours a day without vsync, with the GPU drawing 350 W instead of 250 W, that's more than 1€ extra on the bill.


1€ per month or per day?


----------



## HoneyBadger84

Quote:


> Originally Posted by *OrionBG*
> 
> 1€ per month or per day?


lol that's probably per day. I know back when I was folding on 4-6 GTX 580s and then later running games/benchmarks etc on 4x R9 290Xs, my power bill went up by as much as $200 a month (before we got our solar), and power is dirt cheap where I live.

Now that we have solar, I might eventually look in to mining or something on a secondary system, if I can find the time to build one back up since I disassembled my other machines quite some time ago.


----------



## OrionBG

How expensive is your power, people??
Here in Bulgaria it is about 9 euro cents per kWh... I'd have to use more than 11 kWh for 1 euro...


----------



## Agent-A01

Quote:


> Originally Posted by *Hulio225*
> 
> In addition to that, Pascal is the first architecture of NVidia with FinFET transistors, they simply don't have long time experience with that design to know what is save and what not.


That's what everyone says with every release that has a die shrink or change in design.
"Don't overvolt sandybridge, it's not as robust as last gen 40nm"
"Don't overvolt ivybridge with the 3d transistor design, they are weak(untrue obviously as time has shown)"

Thing is, those assumptions aren't ever as simple as people make them out to be.
Even with die shrinks silicon in general has become more robust.

Haswell has no issues running 1.5v on water, where people say do not go over 1.3v.
Ivy has no issues running >1.4 either(had a 3770k running 5.1ghz with 1.45v for a couple of years with no degradation)
Quote:


> Originally Posted by *Hulio225*
> 
> Temperature alone don't determine if current flow through components is save or not, in addition not every trace and every small SMD component is cooled by the waterblock.


Temperature is the number one factor in any component's life.
High temps + high current seriously degrade lifespan.
Low temps + high current is much safer and doesn't always reduce lifespan by a meaningful amount.

BTW, SMD means surface-mounted device, so there's no need to say "SMD component" (device is a synonym for component).
Besides that, most SMDs on a GPU don't generate any substantial heat, so it's not necessary for them to be cooled.

Traces on GPUs are generally overspecced for their rated current; how many times have you heard about traces blowing?
Not very common.
Besides, there are many traces distributing the current.
Quote:


> Originally Posted by *Hulio225*
> 
> But you are aware that it could die any time soon(or MTBF is significantly smaler) because the strain on it was double as high as it was designed for, do you?!


Everyone knows running outside specifications reduces lifespan by some amount.
Everyone should assume some risk is possible.

Assuming a stock 1080 Ti with respectable temps in general use will last 10 years, an additional 100W might reduce its lifespan by a couple of years.
Inconsequential, considering that by then it will be a mere paperweight in value.
The actual core isn't what usually dies; it's generally other components, e.g. caps whose lifespan halves with every sustained 10C increase in temperature.

Anyway, everyone here acts like 400W is dangerous and the card could die at any moment within a year.
That's highly unlikely on a well-cooled card.

Like I said, 600W+ on the OG Titan for a year (and at least 3 years of use at 400W+) and it never had any issues.

980 Ti SLI for 2 years, both cores OC'd @ 1600MHz with voltages well above factory specification, drawing 800W+ between the two at full load.
No issues there.

Both of those cards have a weak component build compared to the Ti.

The risk for a Ti is inherently lower due to better build quality.

Obviously someone pulling 400W+ at 80-90C is asking for trouble, but a card with adequately cooled VRMs and a low core temp (<65C-ish) will generally be fine for long-term use.
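The capacitor rule of thumb mentioned above (lifetime roughly halving per sustained 10C rise) can be written as a quick Arrhenius-style estimate. The 5000 h @ 105C rating below is an illustrative made-up figure, not a value from any particular card's components.

```python
# Rule-of-thumb capacitor lifetime estimate: lifetime roughly doubles for
# every 10 C below the rated temperature (and halves for every 10 C above).
# rated_hours / rated_temp_c are hypothetical datasheet values.

def cap_lifetime_hours(rated_hours, rated_temp_c, actual_temp_c):
    """Estimated lifetime at actual_temp_c given a datasheet rating."""
    return rated_hours * 2 ** ((rated_temp_c - actual_temp_c) / 10)

# A hypothetical 5000 h @ 105 C part running at a steady 65 C:
print(cap_lifetime_hours(5000, 105, 65))  # 80000.0
```

Which is why a well-cooled card's support components can comfortably outlive the card's useful life, while the same parts run 20-30C hotter lose most of that margin.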


----------



## blixt

Quote:


> Originally Posted by *OrionBG*
> 
> How expensive is your power, people??
> Here in Bulgaria it is about 9 euro cents per kWh... I'd have to use more than 11 kWh for 1 euro...


30 euro cents / $0.36 per kWh in Denmark. The power itself accounts for 13%, state charges and taxes are a whopping 69%, and distribution is 18%.
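For the curious, splitting that rate into its components is straightforward arithmetic (shares taken from the post; minor rounding applies):

```python
# Break the quoted Danish electricity rate into its stated components.
price_eur_per_kwh = 0.30
shares = {"power itself": 0.13, "state charges and taxes": 0.69, "distribution": 0.18}

breakdown = {name: round(price_eur_per_kwh * share, 3) for name, share in shares.items()}
print(breakdown)
# {'power itself': 0.039, 'state charges and taxes': 0.207, 'distribution': 0.054}
```

So only about 4 cents of every kWh is the electricity itself; the other 26 cents are taxes and delivery.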


----------



## OrionBG

Quote:


> Originally Posted by *blixt*
> 
> 30 Euro cents / 0.36$ KWh in Denmark. The power itself accounts for 13%, state charges and taxes is a whopping 69%, distribution 18%.


Ouch... also yikes...


----------



## SavantStrike

Quote:


> Originally Posted by *blixt*
> 
> 30 Euro cents / 0.36$ KWh in Denmark. The power itself accounts for 13%, state charges and taxes is a whopping 69%, distribution 18%.


Something is rotten in Denmark!


----------



## blixt

Quote:


> Originally Posted by *OrionBG*
> 
> Ouch... also yikes...


Yea pretty bad.. 3 years ago we got a 9kW solar system and two 7kW heat pumps. That helped tremendously


----------



## blixt

Quote:


> Originally Posted by *SavantStrike*
> 
> Something is rotten in Denmark!


That's nothing, lol.

The registration fee on cars, that's rotten!
An example: $355,082 for a brand-new Mercedes C63 4.0 AMG. Then you've got a ~$550 tax every 6 months depending on gas mileage, and then insurance... That's rotten.


----------



## feznz

Quote:


> Originally Posted by *Agent-A01*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> That's what everyone says with every release that has a die shrink or change in design.
> "Don't overvolt sandybridge, it's not as robust as last gen 40nm"
> "Don't overvolt ivybridge with the 3d transistor design, they are weak(untrue obviously as time has shown)"
> 
> Thing is, those assumptions aren't ever as simple as people make them out to be.
> Even with die shrinks silicon in general has become more robust.
> 
> 
> Haswell has no issues running 1.5v on water, where people say do not go over 1.3v.
> Ivy has no issues running >1.4 either(had a 3770k running 5.1ghz with 1.45v for a couple of years with no degradation)
> Temperature is a number 1 factor in any component life.
> High temps + high current seriously degrades life span.
> Low temps + high current is much safer and doesn't always reduce lifespan by an meaningful amount.
> 
> BTW SMD is a surface mounted device, no need to say SMD Component(device is a synonym for component)
> Besides that, most SMDs on a GPU do not create any substantial heat, so it's not necessary for them to be cooled.
> 
> Traces on GPUs are generally overspecced for their rated current; how many times have you heard about traces blowing?
> Not very common.
> Besides, there are many traces that distribute current.
> *Everyone knows running out of specifications reduces lifespan by some amoun*t.
> Everyone should assume some risk is possible.
> 
> Assuming a stock 1080ti with respectable temps in general use will last 10 years, an additional 100w might reduce lifespan by a couple years.
> Inconsequential considering by then it will be a mere paperweight in value.
> The actual core isn't what usually dies, it's generally other components. Ie caps that have 1/2 reduced lifespan with every constant 10c increase in temps, etc.
> 
> Anyways, everyone here acts like 400w is dangerous and the card can die at any moment within a year.
> Highly unlikely it will happen on a well-cooled card.
> 
> Like i said 600w+ on OG titan for a year(at least 3 years of use with 400w+) and it never had any issues.
> 
> 980Ti sli for 2 years with both core OC @ 1600mhz with voltages well above factory specification, power draw was 800w+ between the two on full load.
> No issues there.
> 
> Both of those cards have weak component build compared to Ti.
> 
> The risk for a Ti is inherently lower due to better build quality.
> 
> *Obviously someone using 400w+ with 80-90c temps is asking for trouble, but a card with adequately cooled VRMs and low core temp(<65c ish) will generally be fine for long-term use*.


It depends on how you use it. In Japan and China they talk about the missing generation who don't leave their rooms and game 18 hours a day; rumour has it that some even hand a potty out to their parents so they physically never leave the bedroom.
Or the mining scenario: with coins worth as much as they are, even I might consider overclocking a mining GPU.

Yes, short bursts on a highly OC'd anything are going to be harmless, but let's think of every scenario.

Just my personal finding: a highly OC'd GPU running the equivalent of a 2100 MHz 1080 Ti 24/7 @ 100% load would most likely fail within 6 months.
But some still say that extra 100 MHz gives them the edge to be the online champion.


----------



## SavantStrike

Quote:


> Originally Posted by *feznz*
> 
> It depends on how you use it. In Japan and China they talk about the missing generation who don't leave their rooms and game 18 hours a day; rumour has it that some even hand a potty out to their parents so they physically never leave the bedroom.
> Or the mining scenario: with coins worth as much as they are, even I might consider overclocking a mining GPU.
> 
> Yes, short bursts on a highly OC'd anything are going to be harmless, but let's think of every scenario.
> 
> Just my personal finding: a highly OC'd GPU running the equivalent of a 2100 MHz 1080 Ti 24/7 @ 100% load would most likely fail within 6 months.
> But some still say that extra 100 MHz gives them the edge to be the online champion.


Do you have some empirical data to back up your findings, or is this just your instinct?

If an OC'd 1080 Ti will fail in 6 months, then I've only got 1 month of life left

4x Aorus 1080 Ti WaterForce WB cards running between 2050 and 2100 MHz, mining 24/7. They've never shown signs of degrading, and I had them pulling 330W apiece for a month while dialing them in (275W now).

Full cover blocks: my cards have run cooler than any air-cooled model. The GPU Boost profiles on these cards show huge gains at low temps, even on a Founders Edition. Voltages are lower and clocks are higher when things aren't at 80 degrees.


----------



## Agent-A01

Quote:


> Originally Posted by *feznz*
> 
> It depends on how you use it. In Japan and China they talk about the missing generation who don't leave their rooms and game 18 hours a day; rumour has it that some even hand a potty out to their parents so they physically never leave the bedroom.
> Or the mining scenario: with coins worth as much as they are, even I might consider overclocking a mining GPU.
> 
> Yes, short bursts on a highly OC'd anything are going to be harmless, but let's think of every scenario.
> 
> Just my personal finding: a highly OC'd GPU running the equivalent of a 2100 MHz 1080 Ti 24/7 @ 100% load would most likely fail within 6 months.
> But some still say that extra 100 MHz gives them the edge to be the online champion.


Nope, I am referring to general use.
I don't see how anyone on the planet can game 120 hours a week.

Those who do gaming sessions that long will probably die before their cards do.

Mining isn't so bad; a lot of algorithms don't actually fully stress the GPU.

As for your claim, it's pretty far-fetched.

I've been running my Ti at 2100MHz for 10-11 months now.
No issues.


----------



## looniam




----------



## HoneyBadger84

I just got this from Amazon:



Debating if I should continue waiting & hope I get one, or just cancel & order a Titan Xp from NVidia direct.

At that price, I think I'll just wait, they haven't charged my card yet so not missing any money in the mean time...


----------



## hotrod717

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I just got this from Amazon:
> 
> 
> 
> Debating if I should continue waiting & hope I get one, or just cancel & order a Titan Xp from NVidia direct.
> 
> At that price, I think I'll just wait, they haven't charged my card yet so not missing any money in the mean time...


Titan Xp certainly looks good at going price for a 1080ti now....
But, once consumer volta hits, all cards are going to drop in value. If you are looking to hold onto a card for a few years go for it. If you like to have the latest and greatest and sell once new gen is on the horizon, now is not the time to buy.


----------



## kithylin

Quote:


> Originally Posted by *blixt*
> 
> 30 Euro cents / 0.36$ KWh in Denmark. The power itself accounts for 13%, state charges and taxes is a whopping 69%, distribution 18%.


$0.085 / KWh here in north central Texas at the moment. Partially due to being on a power company that we put in my mother's name (she's 72) and they give her a senior citizen discount (we live together in the same house). Although without discounts most power in the area is at most $0.108 / KWh. Our bill is a bit high this month, having to run electric radiant oil heaters for warmth, but we're expecting our bill to max out about $150 this month. And that's with our 3 oil heaters, a little ceramic heater in my room.. me having 2 desktop computers and total 5 monitors, gaming on my 1080 Ti + x99 computer, lots of overclocking on x99 system, and a file server with 8 hard drives, 2 x i7 processors and 96 GB ram that runs 24-7-365.


----------



## feznz

Quote:


> Originally Posted by *SavantStrike*
> 
> Do you have some empirical data to back up your findings, or is this just your instinct?
> 
> If an Oced 1080 TI will fail in 6 months, then I've only got 1 month of life left
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 4x Aorus 1080 TI water force wb cards running between 2050 and 2100 mining 24x7. They've never shown signs of degrading and I had them pulling 330W a piece for a month while dialing them in (275W now).
> 
> Full cover - my cards have run cooler than an air cooled model (any air cooled model). The GPU boost profiles on these cards show huge gains at low temps, even on a founders edition. Voltages are lower and clocks are higher when things aren't at 80 degrees.


Quote:


> Originally Posted by *Agent-A01*
> 
> Nope, I am referring to general use.
> I don't see how anyone on the planet can game for 120 hours a week.
> 
> Those that do long gaming sessions like that will probably die.
> 
> Mining isn't so bad, a lot of algorithims don't actually fully stress the GPU.
> 
> As for your claim, it's a pretty far fetched claim.
> 
> I've been running my Ti for 10-11months now at 2100mhz.
> No issues.


"Totally safe on the XOC bios" is probably too broad a statement; really, nobody knows when a GPU will fail, otherwise we'd all sell a month before it happens, right? Or at least before it's out of warranty.

I would assume that using the stock bios rather than the XOC one, meaning NVidia's top controlled voltage of 1.092V plus boost technology, gives safety factors that greatly increase life expectancy.

https://www.pugetsystems.com/labs/articles/Video-Card-Failure-Rates-by-Generation-563/

I have not been able to find any stats on 10xx failure rates.

Actually, Ireland spends the most time on Steam games @ 32.2 hours per week. Lol, I was watching the news about pro Chinese gamers; I guess just because it's on TV doesn't mean it's totally correct.

http://www.news.com.au/lifestyle/real-life/true-stories/hikikomori-condition-of-being-confined-to-home-for-years/news-story/1aa3a1da48636b0636324988285097aa

http://steamspy.com/country/


----------



## DStealth

Exceeded a 34k GPU score in FireStrike... actually exceeded it by far







*34506*








DStealth - i7 8700k @ 5.62GHz - MSI EKX1080 Ti @ 2202/12580 XOC bios 1.175v - 28686
https://www.3dmark.com/3dm/24673879


----------



## Streetdragon

Quote:


> Originally Posted by *OrionBG*
> 
> 1€ per month or per day?


I lived alone (moved back to my parents because of stupid neighbours who didn't need sleep...)
Somehow I used as much electricity as 4 people xD
At 31.xx cents per kWh, I used around 1900 kWh in half a year. HALF!

Now I've backed down on every OC in my rig; I've even lowered the brightness of my screen.

Edit: the old big plasma TV used a lot of power too...


----------



## Abaidor

Quote:


> Originally Posted by *Streetdragon*
> 
> I lived alone (moved back to my parents because of stupid neighbours who didn't need sleep...)
> Somehow I used as much electricity as 4 people xD
> At 31.xx cents per kWh, I used around 1900 kWh in half a year. HALF!
> 
> Now I've backed down on every OC in my rig; I've even lowered the brightness of my screen.
> 
> Edit: the old big plasma TV used a lot of power too...


1900 kWh in half a year? We use up to 3000 kWh in just 2 months, lol! That's me, my two kids (6 & 10), and my wife....


----------



## OrionBG

I usually do 1000kWh to 1500kWh per month alone, depending on the season.


----------



## Streetdragon

Quote:


> Originally Posted by *Abaidor*
> 
> 1900KWh for half a year? We consume even up to 3000Kwh in 2 months lol! It's me my two kids (6 & 10) and my wife....


Quote:


> Originally Posted by *OrionBG*
> 
> I usually do 1000kWh to 1500kWh per month alone, depending on the season.


ok now i feel better


----------



## feznz

Quote:


> Originally Posted by *Streetdragon*
> 
> I lived alone (moved back to my parents because of stupid neighbours who didn't need sleep...)
> Somehow I used as much electricity as 4 people xD
> At 31.xx cents per kWh, I used around 1900 kWh in half a year. HALF!
> 
> Now I've backed down on every OC in my rig; I've even lowered the brightness of my screen.
> 
> Edit: the old big plasma TV used a lot of power too...


What's your power-saving secret? We, 2 adults and 2 children, used more than that in 34 days.

I just lost my 512GB games SSD, which puts the 2nd 1080 Ti on hold until I sort out another SSD. Not sure what to get: I'm thinking either 2TB and skip the 2nd 1080 Ti, or 1TB and a 1080 Ti. Decisions, decisions.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *DStealth*
> 
> Exceeded 34k GPU score in FireStrike...actually exceeded them by far
> 
> 
> 
> 
> 
> 
> 
> *34506*
> 
> 
> 
> 
> 
> 
> 
> 
> DStealth - i7 8700k @ 5.62GHz - MSI EKX1080 Ti @ 2202/12580 XOC bios 1.175v - 28686
> https://www.3dmark.com/3dm/24673879


Nice! Gives me incentive to bench and try and beat.


----------



## DStealth

No problem mate, that's why we're a community: to give each other incentive








Should be easy for you with TitanXp and weather in Canada


----------



## Lefty23

Quote:


> Originally Posted by *DStealth*
> 
> Exceeded 34k GPU score in FireStrike...actually exceeded them by far
> 
> 
> 
> 
> 
> 
> 
> *34506*
> 
> 
> 
> 
> 
> 
> 
> 
> DStealth - i7 8700k @ 5.62GHz - MSI EKX1080 Ti @ 2202/12580 XOC bios 1.175v - 28686
> https://www.3dmark.com/3dm/24673879


Finally







I've been expecting this for the past couple of weeks...

Excellent result, man!
Enjoy the top (8700k-1080ti) spot


----------



## DStealth

Thanks neighbor, your result is also remarkable with this combo.
Mine is actually 6th in the Futuremark HOF with a 1080 Ti, and the first with a non-7980XE CPU.
These Titan Vs have pushed the leaderboard very hard...


----------



## Lefty23

Oh, you're playing with the big boys
Yeah, just checked; I can't get close to that 7980XE/TitanV combo.
Unfortunately I can't find enough time anymore to truly tune the system and go higher. I usually just check against people with the same config, and I remembered your name from when you had a problem with your first 1080 Ti. Then, when I saw the Cinebench/SuperPi etc. results you've been posting, I was sure you were going to beat my scores (and I expect you'll do the same in the other graphics benches).


----------



## Pandora's Box

Quote:


> Originally Posted by *feznz*
> 
> What's your power-saving secret? We, 2 adults and 2 children, used more than that in 34 days.
> 
> I just lost my 512GB games SSD, which puts the 2nd 1080 Ti on hold until I sort out another SSD. Not sure what to get: I'm thinking either 2TB and skip the 2nd 1080 Ti, or 1TB and a 1080 Ti. Decisions, decisions.


Meanwhile in Pennsylvania:


----------



## feznz

Meanwhile, to keep this on topic: with our inflated hydroelectric power prices from foreign-owned power stations, plus the import tax on GPUs, I still doubt the profitability of mining with a 1080 Ti, even @ 9k a coin.

I just don't know why I fry so many parts, PSUs especially, and now an SSD. And a bitcoin wallet stored on offline hardware that fails... that would shatter anyone: years of mining and who knows how much value gone in seconds.


----------



## Mooncheese

Quote:


> Originally Posted by *kithylin*
> 
> For SLI to work and have positive scaling in the +80% or more range, it requires the game developer to support it and code it in to their game. And it requires nvidia to release SLI profiles for it via driver updates. If one of these two things doesn't happen, then you only get to use a single card in your games. So not all games will always support SLI. Some games flat out don't support it at all and never will. Even then the ones that do, sometimes don't scale very well (poorly optimized). Some games are AMD CrossFireX and don't support SLI, some games are SLI and don't support CrossFireX. Multi-GPU has always been a big dice game with PC gaming.
> 
> The best way for computer gaming is a very powerful single card.
> 
> The main reason for overclocking is back when the 1080 Ti launched for $699, the common thing was to get a 1080 Ti for $699, overclock it and essentially have the same (or faster) performance as the more expensive $1200 Nvidia TITAN Xp for -$500 less. Which at the time, was actually a good thing to do. However now that the 1080 Ti's are $1000 (or higher), it's not such a difference.
> 
> I just looked, and today we can get 1080 Ti's for $1075 - $1100, and the Nvidia TITAN Xp is $1400 - $1500. So it's almost the same price margin difference at today's prices, +/- $400 today. It depends on the card and what it can do but my 1080 Ti, clocked where I have it already exceeds the TITAN Xp by about +7%.


7% faster than a Titan Xp, you say? Care to share a benchmark? And we're talking Titan Xp out of the box, because a Titan Xp with a moderate overclock within the 300 W of its vbios will DESTROY ANY 1080 Ti @ 450 W.
Quote:


> Originally Posted by *kithylin*
> 
> Please explain exactly to me how using the XOC bios on a 1080 Ti is any different than using the bios editor software and making a custom bios for any and all previous nvidia boost cards and setting the max power to something like 800W+ so we remove the power limit and remove power throttling. Which is generally what almost every one has been doing for the past many years since nvidia started selling boost cards. The way some folks are going on about it, surely there must be some "Secret sauce" in the hex code for this bios that is causing this card to do something special by using it that we haven't already been doing for nvidia cards in the past.


The thing is, there used to be an argument for running a hot vbios on previous architectures: a 980 Ti doing 1250 MHz out of the box could do 1550 MHz depending on silicon lottery, a 25-30% gain. The additional performance that can be wrung out of a 1080 Ti is marginal to say the least, as they are clocked very aggressively from the factory. The risk, on Maxwell, Pascal, and earlier architectures alike, was always present: we were and ARE drastically shortening the life of the component when we override safeguards and pump 50% more TDP through it. And for all this talk about keeping temps under control, let's not forget that voltage and current degrade silicon and drive electromigration on their own; the temperature of the core or VRM/MOSFETs is a compounding factor, but current and voltage don't need heat to do damage, heat just puts the degradation on an exponential curve.

The argument, again, used to be that with big risk came big reward: 50% more TDP and an overvolt, and now your 980 Ti is doing 1550 MHz, a 30% overclock. Now the risk is unchanged but we're looking at a 10% overclock, from 2000 to 2150; maybe 20% of 1080 Ti's can do 2200 MHz?

Not prudent. It's best to err on the side of caution.

Go ahead, pump 450 W and 1.2 V through your card, but do the right thing and report back here if your card mysteriously fails in the next year or two. Again, the card may not die overnight, but if you think you're not EASILY halving its lifespan at 450 W and 1.2 V just because you have the temps under control, you're kidding yourself.
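The claim that current and heat each shorten life on their own, with heat acting exponentially, matches the standard electromigration lifetime model (Black's equation). A rough sketch using generic textbook constants (n = 2, Ea = 0.7 eV), not anything measured on a 1080 Ti:

```python
import math

# Black's equation for electromigration: MTTF ∝ J^(-n) * exp(Ea / (k*T)).
# The constants below are generic textbook values, NOT calibrated to any GPU.
K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

def relative_mttf(current_ratio, temp_c, n=2.0, ea_ev=0.7):
    """Expected lifetime relative to a baseline at 1.0x current and 40 C."""
    t = temp_c + 273.15
    t0 = 40.0 + 273.15
    # Acceleration factor vs the baseline: more current raises it as a power
    # law, higher temperature raises it exponentially.
    accel = (current_ratio ** n) * math.exp((ea_ev / K_BOLTZMANN_EV) * (1.0 / t0 - 1.0 / t))
    return 1.0 / accel

# 50% more current at an unchanged 40 C already roughly halves expected life:
print(round(relative_mttf(1.5, 40), 2))  # ~0.44
# Add a 75 C core on top and the model leaves only a few percent of baseline life:
print(round(relative_mttf(1.5, 75), 2))  # ~0.03
```

The exact numbers depend entirely on the fitted constants; the shape of the model, not the values, is what supports the argument above.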


----------



## Mooncheese

Quote:


> Originally Posted by *Agent-A01*
> 
> ...
> Assuming a stock 1080ti with respectable temps in general use will last 10 years, an additional 100w might reduce lifespan by a couple years.
> Inconsequential considering by then it will be a mere paperweight in value....


A 1080 Ti will not become a paperweight. I could put my 980 Ti from 2015 on eBay right now and it would sell at $600-700, considering it's as fast as a GTX 1070 (actually 5% or so faster with an overclock) and GTX 1070's are going for $1k.


----------



## KCDC

Quote:


> Originally Posted by *Mooncheese*
> 
> 1080 Ti will not become a paper weight. I could put my 980 Ti from 2015 on ebay right now and it would sell at $600-700 considering it's as fast as a GTX 1070 (5% or so faster with an overclock actually) and GTX 1070's are going for $1k.


Damn, I really should've waited on selling my 980ti's


----------



## kithylin

Quote:


> Originally Posted by *Mooncheese*
> 
> The argument, again, used to be that with big risk came big reward, i.e. 50% more TDP and an overvolt and now your 980 Ti is doing 1550 MHz, a 30% overclock but now the risk is unchanged and were looking at a 10% overclock, from 2000 to 2150, maybe 20% of 1080 Ti's can do 2200 MHz?


Except of course, this is overclock.net where generally everyone tries to wring every tiny last % out of all of their computer bits regardless. Doesn't matter if it's 5% or 30%, if there's more to be had we want it. That's generally the mindset and point of the forums here. I'm not saying (here in this message) if that's good or bad for the hardware, that's not really the issue. This is *overclock*.net and this place is for *overclocking*.


----------



## 8051

Quote:


> Originally Posted by *KCDC*
> 
> Damn, I really should've waited on selling my 980ti's


I wish I had, I only got $300 for my 980Ti.


----------



## mouacyk

Quote:


> Originally Posted by *8051*
> 
> I wish I had, I only got $300 for my 980Ti.


Have you also considered what happens if all those 1070's get dumped and you still have your 980 Ti?


----------



## Mooncheese

Quote:


> Originally Posted by *KCDC*
> 
> Damn, I really should've waited on selling my 980ti's


This 980 Ti SC ACX is currently at $400 with 11 bids and 4 days left to go, USED:

https://www.ebay.com/itm/EVGA-GeForce-GTX-980-Ti-6GB-SC-GAMING-ACX-2-0W-Backplate/273027224247?epid=220474536&hash=item3f91b0d2b7:g:KIkAAOSwz7NaXWYx

A 1080 Ti WILL NOT become a paperweight. BTC is headed to 30k and beyond. If you're an investor, invest in Nvidia and AMD; they are about to price all of the new cards up $100 because of supply and demand.
Quote:


> Originally Posted by *kithylin*
> 
> Except of course, this is overclock.net where generally everyone tries to wring every tiny last % out of all of their computer bits regardless. Doesn't matter if it's 5% or 30%, if there's more to be had we want it. That's generally the mindset and point of the forums here. I'm not saying (here in this message) if that's good or bad for the hardware, that's not really the issue. This is *overclock*.net and this place is for *overclocking*.


Yeah, sure, but what about going in the other direction? I'm at 2025 MHz 100% stable, +400 memory, with an undervolt to 1.0 V on the freq. curve, and my card does 31.4k GPU at 300 W with load temps that don't exceed 35-38 C. It's 100% safe, and I've in all likelihood EXTENDED the card's MTBF (which falls with voltage, current, temperature, and time). And hey, that thing about the environment? Yeah, 450 W for 34k GPU, are you daft?

There is a distinction between wringing more out of your hardware in an intelligent manner and going full ****** in the other direction.

450 W, 1.2 V, 75 C+: you're going to cut the life of the card in half at minimum. Sure you can cool the card, but let's remember, voltage and wattage don't need temperature to be harmful. All of this for 10% more performance? Absolutely not worth it.

I get the SLI argument, but if you have a 1080 Ti and aren't at 4k, you don't need that 10% extra performance from your suicide, do-or-die overclock (and 10% becomes even less at the lower framerates of 4k, i.e. 50 becomes 55). You can get far more by turning down a setting or two. Take Forza Horizon 3, a game I just started playing and am enjoying immensely. Initially I had everything maxed out on Ultra and was experimenting with both FXAA and MSAA enabled (x8). FPS was around 90 in the very beginning of the game (2560x1440). I knew exactly where gains could be had: I turned off FXAA and turned MSAA down from x8 to x4. I honestly could not discern a visual difference, and my FPS went from 90 to 120. Now, you could be an idiot, say "well, 90 FPS, I need to run 500 W through my card for this game!", run your XOC vbios at 1.2 V, and you might see another 10%, going from 90 to 100 FPS? I'm serious: if the rationale for the suicide, do-or-die, halve-my-MTBF overclock is that SLI is broken in many titles, you could get vastly more performance turning one or two settings down one level. This is utter insanity. Think about it. You're going to pump 450 W through your 1080 Ti FE with a vbios that wasn't actually designed for it, and you're seriously OK with that for such a measly gain?

I could see it if this insanity yielded a 25% gain, to which I would say, sure, that sounds like a somewhat compelling risk/reward. But 10%? No, that's just stupid. Yes, stupid even on overclock.net.


----------



## Mooncheese

What actually angers me even more, superimposing the used-market prices on this insane, unsafe overclocking, is thinking about all of these pricks who are going to run an unsafe overclock for a meager 10% gain (if they're lucky), run the card absolutely ragged, then flash back to the original vbios and throw it up on eBay as "gently used, never overclocked" after it's been absolutely thrashed, only for some kid looking to save because they can't afford to buy the hardware new, and the card will probably keel over and die within 6 months of getting it.

If you're that prick reading this: that's NOT cool.

Be sensible and considerate.

Please don't be an idiot prick.


----------



## OrionBG

Quote:


> Originally Posted by *mouacyk*
> 
> Have you also considered what happens if all those 1070's get dumped and you still have your 980 Ti?


I would like you to imagine something else: at some point in the future mining will crash and burn, it's inevitable... Then hundreds of thousands of second-hand GPUs will flood the market, and at that point I'm sure at least one or two GPU manufacturers will go out of business, as almost nobody will be buying new cards...

I'm not sure this will be a good thing, but I still hope there will be an end to this crypto-mining craze soon... The gamer and enthusiast markets are being hurt a lot right now...


----------



## porschedrifter

MSI 1080 Ti Duke owners: could someone send me a pic of the model number of the fan on the stock 3-fan cooler, please? One of mine has a bad bearing on my brand-new card and I'm not wasting time with an RMA.


----------



## kithylin

Quote:


> Originally Posted by *Mooncheese*
> 
> I get the SLI argument, but if you have a 1080 Ti and aren't at 4k, you don't need that 10% extra performance with your suicide do or die overclock (and 10% becomes even less with less framerate at 4k, i.e 50 becomes 55). You can get far more gains by turning down a setting or two. Take Forza Horizon 3, a game I just started playing and am enjoying immensely. Initially I had everything maxed out, on Ultra, and was experimenting with both FXAA and MSAA enabled (x8). FPS was around 90 in the very beginning of the game (2560x1440). I knew exactly where gains could be had and I turned off FXAA and turned MSAA down from x8 to x4. I honestly could not discern a visual difference and my FPS went from 90 to 120. Now, you could be an idiot, and say "well 90 FPS, I need to run 500W through my card for this game!" and you would run your XOC vbios at 1.2v and you might see another 10% going from 90-100 FPS? I'm serious, if the rationale for the suicide, do or die, halve my MTBF overclock is because SLI is broken in many titles you could get vastly more performance turning one or two settings down one level. This is utter insanity. Think about it. Youre going to pump 450W through your 1080 Ti FE with a vbios that wasn't actually designed for it and youre seriously ok with that for such as measly gain?
> 
> I could see if this insanity yielded a 25% gain, to which I would say, sure that sound like somewhat compelling risk reward. But 10%? No, that's just stupid, yes stupid even on overclock.net.


Yet another person jumping on the "if you don't play in 4K you wasted money on a 1080 Ti" bandwagon... yet I have multiple titles at 1080p where a single 1080 Ti, even overclocked, runs pegged at 99%/100% constantly and still cannot even maintain 60 FPS minimums. And I get that I have a "slightly older" system, 5820K @ 4.5 GHz, but tests have shown it's only -4% off an 8700K, so that's not it. So there are several games where even at 1080p a single 1080 Ti is -NOT ENOUGH- and could use an even faster single card.

It all depends on how you want to play games. Play 4K @ medium with some settings turned down? Or 1080p with every possible last setting, bell, and whistle cranked to the max and enjoy beautiful eye-candy prettiness? Personally I prefer the eye-candy route.


----------



## nrpeyton

The voltage on my 1080 Ti is only 0.931 V, but the card is still consuming up to a mega 350 watts, with an average of about 334 watts (taken over half an hour).
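For scale, a quick bit of arithmetic on those figures. Note this treats the whole board power as if it were delivered at the core voltage, which overstates the true core current (board power also feeds memory, fans, and VRM losses):

```python
# nrpeyton's figures: 334 W average board power at a 0.931 V core voltage.
# Upper-bound illustration only: board power also covers memory, fans and
# VRM losses, so the real core current is lower than this.
avg_watts = 334.0
core_volts = 0.931
implied_amps = avg_watts / core_volts  # P = V * I, solved for I
print(round(implied_amps))  # ~359 A, as an upper bound
```

Hundreds of amps at under a volt is why low-voltage, high-power operation still stresses the VRM even when the clocks look tame.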


----------



## Mooncheese

Quote:


> Originally Posted by *OrionBG*
> 
> I would like you to imagine something else: At some point in the future, mining will crash and burn, it's inevitable... Then, hundreds of thousands of second hand GPUs will flood the market and at that point I'm sure at least one or two GPU manufacturers will go out of business as almost nobody will be buying new cards...
> 
> I'm not sure this will be a good thing but still hope there will be an end soon to this crypto-mining craze soon... The Gamer and Enthusiast markets are being hurt a lot right now...


"Mining will crash and burn".

*****.

Cryptocurrencies are the future.

Quote:


> Originally Posted by *kithylin*
> 
> Yet another person jumping on the "if you don't play in 4K you wasted money on a 1080 Ti" bandwagon.. yet I have multiple titles I can play at 1080p where the single 1080 Ti, even overclocked, runs pegged at 99%/100% constantly and still can not even maintain 60 FPS minimums. And I get I have a "slightly older" system, 5820K @ 4.5 ghz, but tests have shown it's only -4% off a 8700K, so that's not it. So there are several games where even at 1080p, a single 1080 Ti is -NOT ENOUGH- and could use an even faster single card.
> 
> It all depends on how you want to play games. Play 4K @ medium with some settings turned down? Or 1080p with every possible last setting, bell, and whistle cranked to the max and enjoy beautiful eye candy prettiness. Personally I prefer the eye candy route.


5820k only 4% slower than 8700k? Yeah, OK, keep smoking crack.

4930k is about 5% slower than 5820k clock for clock.

I was on a 4930k @ 4.5 GHz from 2014, and it's now the equivalent of a space heater that pumps heat at my feet whilst mining Cryptonight with the 980 Ti, usually on Lyra2REv2 or Equihash.

8700k is EASILY 40% faster than 4930k on avg. Some games show a doubling of framerate.

Here's an example: playing Zelda: Breath of the Wild on CEMU before 11.3, FPS at the Dueling Peaks Stable was 23. On my 8700k machine this area is now 45 FPS.

GTA 5, in 3D Vision, overlooking Franklin's balcony with the 4930k @ 4.5 GHz: 30 FPS.

The above but with the 8700k @ 5.0 GHz (5.1, -1 AVX): 52 FPS.

And it's like this with about 50% of my games, where the 4930k is CPU bottlenecked.

GTA 5 in 2D: around 105 FPS driving around the CPU-bound area where the UFO diner is, with the 8700k @ 3.6 GHz (and with the 4930k @ 4.5 GHz). I alt+tab out of the game, switch to High Performance, alt+tab back in, and now it's 140 FPS at 5.0 GHz. This is way more than 10% (the 5% between 4930k and 5820k plus the 5% you claim separates the 5820k from the 8700k).

Here's an idea of the actual performance difference. When I was using Dynamic voltage, I had two separate power plans, forgot to switch to High Performance, and the 8700k still pulled 16k CPU in Firestrike @ 3.6 GHz. Meaning it is as fast as a 4930k @ 4.5 GHz with NO OVERCLOCK. At 5.0 GHz it does 22.5k CPU.

https://www.3dmark.com/compare/fs/14503788/fs/11808107

An 8700k @ 5.0 GHz, which 70% of 8700k's can do at or under 1.4 V, is about 30% faster than a 5820k at 4.5 GHz, a frequency most 5820k's top out at, easy. And there are a handful of 8700k's doing 5.2, 5.3+; DJ Stealth has his at 5.6 GHz. I mean, if I'm doing 22.5k CPU with mine at 5.0 GHz and a 5820k does 17.5k CPU, that's not 5%. That's closer to 30%.

"But that's the synthetic 3DMark Firestrike!" you will retort. But no, those are gains you can see in titles that are CPU-bound: take the 105 to 140 FPS difference above, that's nearly the 35% difference between my 4930k @ 4.5 GHz and my 8700k @ 5.0 GHz in Firestrike CPU, a relevant and useful simulation of a CPU-bottleneck condition.

If my 4930k, which is 5% slower than your 5820k, is at minimum 35% slower than my 8700k in a CPU-bound situation (the gains in CEMU and 3D Vision are much more than 35%, try closer to 70%: go ahead, 30 FPS vs 52 and 23 FPS vs 46, work that out), then your 5820k isn't 5% slower than my 8700k.

Sorry, the 8700k is WAY more than 5% faster than the 5820k.
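The percentage claims in this post are simple arithmetic and easy to check. A minimal sketch, plugging in the Firestrike physics scores and GTA 5 frame rates quoted above:

```python
def pct_faster(score_new, score_old):
    """How much faster score_new is than score_old, in percent."""
    return (score_new / score_old - 1.0) * 100.0

# Firestrike physics scores quoted above: 8700K @ 5.0 GHz vs 5820K @ 4.5 GHz
print(round(pct_faster(22500, 17500), 1))  # ~28.6
# GTA 5 frame rates quoted above: 140 FPS vs 105 FPS
print(round(pct_faster(140, 105), 1))  # ~33.3
```

Both deltas land near 30%, which is the figure the post argues for.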


----------



## kithylin

Quote:


> Originally Posted by *Mooncheese*
> 
> https://www.3dmark.com/compare/fs/14503788/fs/11808107


3DMark is a synthetic benchmark; you cannot use it to compare game performance across different platforms, and you know it.

Quote:


> Originally Posted by *Mooncheese*
> 
> Sorry, 8700k is WAY faster than 5% over 5820k.







Actual game performance was compared just recently. In some titles, when overclocked to 4.6 GHz, the 5820K is just 14% behind, in some 5% behind, vs an 8700K @ 4.8 GHz. And it's definitely nowhere near a 40% difference, at least not in gaming.

It's mainly in productivity and content-creation / number-crunching workloads where the 8700K sees the +30% to +40% gains, not gaming.

With my purchase a couple months ago I was presented with a choice, since my 3770K died. Option one: $750 for board + 2x8GB DDR4-2400 + 8700K, a platform with just 16 PCIe lanes, which would mean that if I wanted to use an M.2 NVMe PCIe SSD + RAID card it would drop my GPU all the way down to 4x PCIe speed, and the board would disable some of the onboard SATA ports in that configuration too. Option two: $430 for X99 + 5820K + 4x4GB DDR4-2400, a platform starting with 28 PCIe lanes and the ability to upgrade to 40 lanes later. Here on X99, even with just a 28-lane CPU, I can still run one M.2 NVMe PCIe SSD + RAID card with my 1080 Ti still at 16x PCIe 3.0, and I still get full use of all onboard SATA ports. The 8700K option would have been +$300 more for something that's expansion-limited with only +15% gains in gaming, and this is primarily a gaming system (although occasionally I do video editing for some friends, and music production, but not very often). So for me, my needs, and the configuration I want out of my computer, the 8700K would have cost more, given me less expansion, and not a significant enough gain to be worth the extra money.

EDIT: Besides all of that, maybe by fall 2018 or spring 2019 the 6950X 10-core chip might drop down to affordable levels (like $400); I'd snatch up one of those, overclock it to 4.8 GHz, and forget all about the 8700K.


----------



## Scotty99

It actually is that much faster; test a game like World of Warcraft and you will see incredible differences.

The cool thing is new games don't see much benefit, as they are mostly GPU-bound; MMOs are the exception, as most of the coding needs to be done on one core (or so I've heard from industry experts/coders).


----------



## kithylin

Quote:


> Originally Posted by *Scotty99*
> 
> It actually is that much faster, test a game like world of warcraft and you will see incredible differences.
> 
> Cool thing is new games dont see much benefit as they are mostly GPU bound, mmo's are the exception as most of the coding needs to be done on one core (or so ive heard from industry experts/coders).


I literally just posted a video example of 11 different current-day AAA titles being tested, did you not look at it?


----------



## 8051

Quote:


> Originally Posted by *OrionBG*
> 
> I would like you to imagine something else: At some point in the future, mining will crash and burn, it's inevitable... Then, hundreds of thousands of second hand GPUs will flood the market and at that point I'm sure at least one or two GPU manufacturers will go out of business as almost nobody will be buying new cards...
> 
> I'm not sure this will be a good thing but still hope there will be an end soon to this crypto-mining craze soon... The Gamer and Enthusiast markets are being hurt a lot right now...


But will crypto-mining ever crash and burn like you say? I think it would take a government banning crypto-mining or instituting some sort of taxation on it.


----------



## Scotty99

Quote:


> Originally Posted by *kithylin*
> 
> I literally just posted a video example of 11 different current-day AAA titles being tested, did you not look at it?


You didn't read my post.


----------



## OrionBG

Quote:


> Originally Posted by *8051*
> 
> But will crypto-mining ever crash and burn like you say? I think it would take a government banning crypto-mining or instigating some sort of taxation on it.


South Korea is doing something in that regard. Maybe soon other countries will follow. You can't have so much unregulated money coming out of thin air...
I don't know... I guess we will see.


----------



## kithylin

Quote:


> Originally Posted by *Scotty99*
> 
> You didnt read my post.


Yes... you talk of MMOs. I'm not entirely sure the demand on a CPU is any different in an open-world MMO vs, say, a large open-world game like ARK: Survival Evolved, or GTA V, or large multiplayer games like Ashes of the Benchmark. Maybe Warcraft is somehow different... not sure. Would need to find someone to test the 5820K vs newer chips in MMOs.


----------



## Scotty99

Quote:


> Originally Posted by *kithylin*
> 
> Yes.. you talk of mmo's. I'm not entirely sure the demand on a CPU is any different in an open-world MMO vs say a large open-world game like ark evolved, or GTA-V, or large multiplayer games like ashes of the benchmark. Maybe warcraft is some how different.. not sure. Would need to find someone to test 5820K vs newer chips in mmo's.


MMOs are literally the only reason you buy a fast CPU vs, well, AMD lol. Thing is, MMOs are still quite a large genre and should be tested by more outlets, especially with World of Warcraft Classic on the horizon; millions of players will be coming back to the game.

It's a funny deal honestly: the older the game, the faster the CPU you need, heh.


----------



## OrionBG

Quote:


> Originally Posted by *kithylin*
> 
> Yes.. you talk of mmo's. I'm not entirely sure the demand on a CPU is any different in an open-world MMO vs say a large open-world game like ark evolved, or GTA-V, or large multiplayer games like ashes of the benchmark. Maybe warcraft is some how different.. not sure. Would need to find someone to test 5820K vs newer chips in mmo's.


I literally did some tests with WoW one week ago.
In a spot (inside a city) where there are many players right now: 3.2 GHz - 60 fps, 4 GHz - 80 fps... (1080p, SSAA 4x + CMAA, everything on 10.)
This was on a Ryzen 1700. WoW mainly uses only one core and needs high clock rates to perform better; the 1080 Ti was at 40% usage during those tests.
Basically WoW cares only about clock speed.


----------



## GreedyMuffin

6950X at 4.8 GHz? Must be on LN2.


----------



## kithylin

Quote:


> Originally Posted by *GreedyMuffin*
> 
> 6950X at 4.8 GHz? Must be on LN2.


One can be optimistic







But more realistically, there are lots of results on HWBot from 6950X's on custom water at 4.5-4.7 GHz.


----------



## OrionBG

Quote:


> Originally Posted by *GreedyMuffin*
> 
> 6950X at 4.8 GHz? Must be on LN2.


Hey GreedyMuffin, one question.
I see in your signature a 1500W power supply. Why?


----------



## HoneyBadger84

Quote:


> Originally Posted by *OrionBG*
> 
> Hey GreedyMuffin, one question.
> I see in your signature a 1500W power supply. Why?


Because MOAR POWA!

Edit: anyone want a Lepa G 1600W on the cheap? #justkidding


----------



## Streetdragon

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Because MOAR POWA!
> 
> Edit: anyone want a Lepa G 1600W on the cheap? #justkidding


----------



## Mooncheese

Quote:


> Originally Posted by *kithylin*
> 
> 3dmark is a Synthetic benchmark, you can not use it to compare game performance of any different platforms and you know it.
> 
> 
> 
> 
> 
> Actual game performance compared just recently. In some titles, when overclocked to 4.6 ghz, the 5820K is just -14% less, some it's -5% less, vs 8700K @ 4.8 ghz. And it's definitely no where near -40% difference, at least not in gaming.
> 
> It's mainly in "productivity" and content-creating / number crunching where the 8700K sees the +30% to +40% gains, not gaming though.
> 
> With my purchase a couple months ago I was presented with a choice at the time, since my 3770K died: $750 for board + 2x8GB DDR4-2400 + 8700K, for a platform with just 16x PCIE lanes, which would mean if I wanted to use M.2-NVME-PCIE-SSD + raid card, it would drop my GPU all the way down to 4x PCIE speed, and the board would be disabling some of the onboard sata ports to do that configuration, too. Or $430 for x99 + 5820K + 4x4GB DDR4-2400, for a platform starting with 28 PCIE lanes and the ability to upgrade to 40-lanes later. And here on x99, even with just a 28-lane CPU, I can still run 1 x M.2-NVME-PCIE-SSD + raid card, and my 1080 Ti's still at 16x PCIE-3.0, and I still get full use of all onboard sata ports too. 8700K option would of been +$300 more for something that's expansion-limited and only +15% gains in gaming, and this is primarily a gaming system (Although occasionally I do video editing for some friends and music production, but those aren't very often for me). So for me and my needs and my configuration I want out of my computer, 8700K would of costed more and I would of had less expansion, and not a significant enough gain to be worth the extra money.
> 
> EDIT: Besides all of that, here maybe fall 2018 or spring 2019 the 6950X 10 core chip might drop down to affordable levels (like $400) and snatch up one of those and overclock it to 4.8 ghz and forget all about 8700K.


No.

1. This is not scientific in the slightest, because for one we don't know whether the performance figures for the overclocked 8700k are as high as they could be, BECAUSE THE GPU COULD BE PEGGED AT 100%. That data is just absent. Sure, I can take a 2600k, run it at 3.5 GHz, pick a game that isn't very demanding CPU-wise (none of the titles in this "comparison" were CPU-intensive; a racing game is essentially the CPU-load equivalent of a corridor shooter, you know, the console ports that can literally run on a 2.5 GHz quad-core) which does, say, 110 FPS at 90% GPU utilization, then "compare" that to an overclocked 8700k that is now obviously pegging the GPU at 99% utilization, point to the 10 FPS difference, and say "see, only 10% faster!"

2. 4.8 GHz is not indicative of the average 8700k overclock.

3. 4.6 GHz is a higher-than-average 5820k overclock.

An 8700k will do 4.8 GHz even if it is an absolute dog and you did NOT win the silicon lottery.

70% of 8700k's can do 5.0 GHz with 1.4v or less.

I'm stable at 5.0 GHz (5.1, -1 AVX) w/ 1.33-1.34 V, meaning I could probably secure 5.2 GHz -1 AVX, or 5.1 GHz no AVX, at or under 1.4 V, but I mine all day when I'm not actually gaming and don't want to run my chip there 24/7, day in, day out. But it has 5.1 GHz in it easy, possibly 5.2.



http://imgur.com/OY028


I've already posted my Firestrike CPU score at 5.1 GHz, -1 AVX (22.5k); put yours up and we will do the math to see the actual difference in a benchmark that actually simulates a CPU-bound title or a problematic scene in a title.

I stand firmly behind my assertion that the testing methodology used in the comparison is invalid because, truthfully, we don't know whether the overclocked 8700k's figures were influenced by a GPU bottleneck, and 4.8 GHz isn't a fair or realistic appraisal of an overclocked 8700k. Again, 70% of 8700k's can do 5.0 GHz at or under 1.4 V. 4.8 GHz is a figure 100% of 8700k's can hit, I suppose? Well, if that's the case, he should have run the 5820k at 4.3 GHz, because that's the frequency 100% of 5820k's will do.


----------



## kithylin

Quote:


> Originally Posted by *Mooncheese*
> 
> 
> 
> http://imgur.com/OY028
> 
> 
> I've already posted my Firestrike CPU score at 5.1 GHz, -1 AVX, (22.5k) put yours up, and we will do the math to see the actual difference in a benchmark that actually simulates a CPU bound title or problematic area scene of a title.


Here ya go: https://www.3dmark.com/fs/14036899 Check the user name, same as here on the forums, best I can wring out of this system.

I'm guessing you're "Vulcan-2" on 3dmark? If so here's a comparison against the 8700K score you posted earlier vs mine: https://www.3dmark.com/compare/fs/14036899/fs/14503788

Sure, the physics score is vastly different, but the actual graphics score is 3.4% different, in my favor actually.

It's like I said earlier: *IN GAMING* the 5820K is fine vs newer systems; it's in heavy CPU-crunching workloads where the 8700K really shines.


----------



## Mooncheese

Quote:


> Originally Posted by *kithylin*
> 
> Here ya go: https://www.3dmark.com/fs/14036899 Check the user name, same as here on the forums, best I can wring out of this system.
> 
> I'm guessing you're "Vulcan-2" on 3dmark? If so here's a comparison against the 8700K score you posted earlier vs mine: https://www.3dmark.com/compare/fs/14036899/fs/14503788
> 
> Sure physics score is vastly different, but actual graphics score is 3.4% different, in my favor actually.
> 
> It's like I actually said earlier, *IN GAMING*, the 5820K is fine vs newer systems, it's just in heavy cpu-crunching workloads where the 8700K really shines.


Yeah, as I suspected: the 5820k is actually 10% faster than the 4930k clock for clock, so the performance difference isn't as great. But if there is a CPU-bound area or title, or if the game you're playing is in 3D Vision and you hit the "3-core bug" (many if not all titles render on 3 cores when using 3D Vision, making CPU bottlenecks nightmarish and CPU-bound titles unplayable, i.e. GTA 5), or you're running Zelda: Breath of the Wild on CEMU, etc., you will absolutely feel it.

But take many other games. In Dragon Age: Inquisition, at some village in the Hinterlands with a griffon statue, I was at like 60 FPS with 70% GPU utilization; revisiting this demanding area with the 8700k it's now 90 FPS at 99% utilization. Or Geothermal Valley in Rise of the Tomb Raider in 3D Vision, or downtown Boston in Fallout 4, or even Far Cry 4, especially if you botch an outpost stealth attempt and an alarm is triggered and reinforcements are en route: check your FPS. These are all CPU-intensive scenarios and titles, by no means an exhaustive list (the central square in Novigrad in The Witcher 3), and I was seeing bottlenecks EVERYWHERE, and I got tired of them. It's as though the 4930k and the 980 Ti were a match made in heaven, but after upgrading to the 1080 Ti I was getting bottlenecks everywhere.

Your 1080 Ti is faster in Firestrike because you're at 2126 MHz. I'm at 2050 with an undervolt to 1.0 V, default FE vbios, 120% PT.

If you have Fallout 4, with an ENB or without, go walk around downtown Boston and have a look at your GPU utilization and FPS. Whatever that is, it would be 25% better with an 8700k @ 5.0 GHz.


----------



## Mooncheese

Quote:


> Originally Posted by *OrionBG*
> 
> South Korea are doing something in that regard. Maybe soon other countries will follow. You can't have so much unregulated funds coming out of thin air...
> I don't know... I guess we will see.


You mean how the "elites" / Big Banks print themselves fiat currency?


----------



## kithylin

Quote:


> Originally Posted by *Mooncheese*
> 
> Yeah as I suspect, 5820 is actually 10% faster than 4930k, clock for clock so the performance difference isn't as great but if there is a CPU bound area or title or if the game youre playing is in 3D Vision and you have the "3 Core bug' (the game is rendered on 3 cores with a lot of titles, if not all of them, when using 3D Vision, making CPU bottlenecks nightmarish and CPU bound titles unplayable, i.e. GTA 5) or say running Zelda: Breath of the Wild on CEMU etc. you will absolutely feel it.
> 
> But many other games, like Dragon Age: Inquisition, some village in the Hinterlands with a Griffon statue and I was at like 60 FPS there with 70% GPU utilization and revisiting this demanding area with the 8700k and now it's 90 FPS and 99% utilization or Geothermal Valley in Rise of the Tomb Raider in 3D Vision or say downtown Boston in Fallout 4 or say even Far Cry 4, especially if you botch an outpost stealth attempt and an alarm is triggered and reinforcements are en route, check your FPS. These area all CPU intensive scenarios and titles, by no means exhaustive (the central square in Novigrad in The Witcher 3) and I was seeing them EVERYWHERE, and I got tired of the CPU bottlenecks, it's as though the 4930k and the 980 Ti were a match made in heaven, but after upgrading to 1080 Ti I was getting them everywhere.
> 
> Your 1080 Ti is faster in Firestrike because youre at 2126 MHz. I'm at 2050 with an undervolt of 1.0v, default FE vbios, 120% PT.
> 
> If you have Fallout 4 with an ENB (or without) go walk around downtown Boston and have a look at your GPU utilization and FPS. Whatever that is, that would be 25% better with an 8700k @ 5.0 GHz.


I think what you're referring to is areas of "unoptimization", like downtown in Fallout 4. Yeah, I sometimes get drops to 30 FPS there, but while it's happening not a single one of my CPU cores/threads is at 100%, and the GPU is at something like 40% load. Certain areas of certain games are just highly unoptimized in general; sadly, not all games are perfect everywhere. A faster CPU will help, but at that point you're really just brute-forcing the game's flaws. For some games that's all we can do.
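That eyeball test can be written down as a quick heuristic. This is only a sketch; the 95% thresholds are arbitrary assumptions, not measured cutoffs, and real engines can be limited by a single thread even below 100% on one core:

```python
def classify_bottleneck(core_loads, gpu_load,
                        core_thresh=95.0, gpu_thresh=95.0):
    """Crude bottleneck guess from per-core CPU loads (%) and GPU load (%).

    Thresholds are illustrative assumptions, not measured cutoffs.
    """
    if gpu_load >= gpu_thresh:
        return "gpu-bound"
    if max(core_loads) >= core_thresh:
        return "cpu-bound (single thread)"
    # Neither maxed out: engine/draw-call limits, like downtown Boston
    return "engine-limited"
```

By this reading, "no core at 100% and the GPU at 40%" lands in the engine-limited bucket, which is exactly the "unoptimized area" case.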


----------



## DStealth

Yes, in GPU-limited situations the difference is not huge.
Here are two of my results, [email protected] vs [email protected], on the same card with a one-strap (13 MHz) difference:
https://www.3dmark.com/compare/fs/12830703/fs/13925758#
In the GPU tests the FPS difference is less than 2, but those are averages; remember, the lows and highs can differ much more, and the most important numbers are the 99th percentiles.


----------



## Scotty99

It's all about the game. In WoW my 8700K beats my Ryzen by 80-90%; in Overwatch, maybe 5 FPS at most lol.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Streetdragon*




Lol


----------



## EarlZ

-resolved-


----------



## Agent-A01

Quote:


> Originally Posted by *Mooncheese*
> 
> 1080 Ti will not become a paper weight. I could put my 980 Ti from 2015 on ebay right now and it would sell at $600-700 considering it's as fast as a GTX 1070 (5% or so faster with an overclock actually) and GTX 1070's are going for $1k.


Did you read what I said?

In 10 years it will definitely be a paperweight.


----------



## ColdDeckEd

Quote:


> Originally Posted by *Agent-A01*
> 
> Did you read what I said?
> 
> In 10 years it will definitely be a paperweight.


Ten years from now, if properly taken care of, these cards will still be able to play games of this era, and I doubt the next gen of consoles will be that much stronger, if at all, so I'm willing to bet it will still be a viable gaming card. Sure, there'll be $100 cards that stomp it by then; that doesn't mean it has to be a paperweight, it can still have a use.


----------



## 8051

Quote:


> Originally Posted by *OrionBG*
> 
> South Korea are doing something in that regard. Maybe soon other countries will follow. You can't have so much unregulated funds coming out of thin air...
> I don't know... I guess we will see.


I've always read that crypto-currency is the go-to choice of terrorist and criminal organisations everywhere.


----------



## hotrod717

Quote:


> Originally Posted by *8051*
> 
> I've always read crypto-currency is the go-to choice of terrorist and criminal organisations everywhere.


Also, what does it actually produce? What is the benefit of mining? Folding benefits science, but what does crypto do?
Looks like Newegg has the Aorus Extreme in stock for $1100. So sad.


----------



## mouacyk

Quote:


> Originally Posted by *hotrod717*
> 
> Also, what does it actually produce?? What is the benefit of mining? Folding benefits science, but what does crypto do?
> Looks like Newegg has Arorous Ex. In stock for $1100. So sad.


Not to mention the immense waste of electricity, since fiat operational costs aren't going away anytime soon.


----------



## GreedyMuffin

Quote:


> Originally Posted by *OrionBG*
> 
> Hey GreedyMuffin, one question.
> I see in your signature a 1500W power supply. Why?


Got that AX1500I for free, that's why!


----------



## SavantStrike

Quote:


> Originally Posted by *hotrod717*
> 
> Also, what does it actually produce?? What is the benefit of mining? Folding benefits science, but what does crypto do?
> Looks like Newegg has Arorous Ex. In stock for $1100. So sad.


Blockchain can do a lot of things; right now it's not being used in any meaningful way. Prices are high because of idiots cleaning out local and online stores and selling the things people want on eBay etc. I bet half of these sellers didn't know what a graphics card was until they heard someone wanted one.

Quote:


> Originally Posted by *8051*
> 
> I've always read crypto-currency is the go-to choice of terrorist and criminal organisations everywhere.


That's just something banks say while secretly preparing to use the tech themselves.


----------



## OrionBG

Quote:


> Originally Posted by *GreedyMuffin*
> 
> Got that AX1500I for free, that's why!


I see


----------



## Mooncheese

Quote:


> Originally Posted by *kithylin*
> 
> I think what you're referring to is areas of "unoptimization", like downtown in fallout 4. Yeah I sometimes get drops to 30 FPS there, but when it is happening, not a single one of my cpu cores / threads is at 100%, and the gpu is like 40% load. It's just certain areas of certain games are highly unoptimized in general. Sadly not all games are perfect everywhere. Faster CPU will help but you're just brute-forcing the game's flaws really at that point. Some games that's all we can do.


True, and the number of games that are unoptimized, or exhibit unoptimized areas, is much greater than we think. Take XCOM 2 in 3D Vision: just looking at the ship/map/base of operations I was at 45 FPS with 80% GPU utilization on the 4930K @ 4.5 GHz; now it's 55-60 FPS with 99% GPU utilization. The same goes for Middle Earth: Shadow of Mordor, also in 3D Vision, in any large orc stronghold with a lot of verticality going on. The game was doing 40 FPS in those areas, nearly unplayable, especially if I got mobbed by orcs. Now, I couldn't believe my eyes, it was holding 60 FPS (120 FPS in 2D) while I was literally surrounded by 50+ orcs. Absolutely breathtaking.

Speaking of Fallout 4, here's how sensitive this game is. But before I go on, two things you should do if you're playing it: one, turn Shadow Render Distance down to Medium, as this setting seems to really exacerbate the excessive draw calls in downtown Boston; and two, set the game to High priority, either with Task Manager or Process Hacker (my preference, because you can save the setting and it will auto-apply next time), by alt-tabbing out of the game, opening the application of choice, finding the .exe, right-clicking, and setting Priority.
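If you'd rather not click through Task Manager every session, the same idea can be sketched in Python. The install path here is a hypothetical placeholder, and `subprocess.HIGH_PRIORITY_CLASS` only exists on Windows (Python 3.7+), so the helper is a no-op elsewhere:

```python
import subprocess
import sys

# Hypothetical install path -- adjust to your own setup.
GAME_EXE = r"C:\Games\Fallout 4\Fallout4.exe"

def launch_high_priority(exe):
    """Start the game already in the High priority class (Windows only)."""
    if sys.platform != "win32":
        return None  # priority classes are a Windows concept
    return subprocess.Popen(
        [exe], creationflags=subprocess.HIGH_PRIORITY_CLASS)
```

This launches the process already at High priority instead of re-applying it after the fact, which is the same end state the Task Manager route gets you.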

Here's standing atop a building in downtown Boston with the 4930k @ 3.4 GHz, 59 FPS with 74% utilization:



Here's the same spot, only having alt+tabbed out of the game and switched power plans to High Performance and 4.5 GHz, 67 FPS with 91% utilization:



Now that's with the 980 Ti at 1500 MHz. When I upgraded to the 1080 Ti, standing in this very same spot the FPS would be the same, except GPU utilization would now be at, say, 40%. And that's already with Shadow Render Distance on Medium and the game set to High priority. Here are the exact settings, in fact:

God Rays Low
Shadow Distance Medium
Shadow Quality High
Everything Else Maxed
Vogue ENB + Reshade

More:



http://imgur.com/ZzhLV


If I were to revisit this area with the 8700K, I could realistically expect a 35% improvement here, taking my FPS from 67 right up to 100 or so. As I said, the 4930K and 980 Ti were a match made in heaven; after upgrading to the 1080 Ti I was seeing severe CPU bottlenecks EVERYWHERE, in games you wouldn't expect to find them, and I knew my next upgrade wasn't another 1080 Ti. It was time to upgrade the CPU.

With that in perspective, 5820k is 5-10% faster than 4930k, clock for clock.

Here's my 4930k Firestrike CPU:

https://www.3dmark.com/fs/11809878

And I've noticed that at default clocks the 5820K does about 1250 in Cinebench R15, whereas the 4930K does 1150. So it's closer to 9% there.

But yeah, no dude, the 5820K is NOT within spitting distance of the 8700K. With an average overclock on each, 4.5 GHz and 5.0 GHz respectively, you're looking at a 25% difference, and a 35% difference between the 4930K and the 8700K.

Sure, there are many titles, such as Overwatch, Rainbow Six Siege, and Project CARS 2, where you're not really going to see the difference between a 2600K, a 5820K, and an 8700K (really though, that comparison was ridiculous: how CPU-intensive is a 4v4 FPS on a small map, or a racing game, going to be? Come on now, next time use Planetside 2 with a big fight, say 48/48+, and THEN let's see the difference between the CPUs), but the number of CPU-bound titles is MUCH greater than you would expect, even more so if you're at all a fan of 3D Vision.

Quote:


> Originally Posted by *DStealth*
> 
> Yes in GPU limited situations the difference is not huge
> Here're two from my results with [email protected] vs [email protected] the same card and one strap 13mhz difference
> https://www.3dmark.com/compare/fs/12830703/fs/13925758#
> In GPU test FPS has lees than 2 difference, but this are the average remember low and high should be much different...and the most important are the 99th percentiles


Yeah but look at that Combined difference. 40%? That's where the real story is.

Quote:


> Originally Posted by *Agent-A01*
> 
> Did you read what I said?
> 
> In 10 years it will definitely be a paperweight.


Yes. In 10 years the 1080 Ti will absolutely be a paperweight. All we need to do is look 10 years into the rear-view mirror on this one. GTX 280? LOL.

https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units#GeForce_300_series

Quote:


> Originally Posted by *ColdDeckEd*
> 
> Ten years from now, if properly taken care of these cards will still be able to play games of this era and i doubt the next gen of consoles will be that much stronger if at all, so im willing to bet it will still be viable gaming card. Sure theyll be 100 dollar cards that stomp it by then, doesnt mean it has to be a paperweight though it can still have use.


No. It will be a paperweight, unless Moore's Law stops showing itself in GPU compute, and that isn't happening (GPUs are getting roughly 100% faster every generation).

Quote:


> Originally Posted by *hotrod717*
> 
> Also, what does it actually produce?? What is the benefit of mining? Folding benefits science, but what does crypto do?
> Looks like Newegg has Arorous Ex. In stock for $1100. So sad.


Folding benefits science? You mean like how cancer is profitable to the pharmaceutical industry? How natural, homeopathic treatment is vastly more effective than chemotherapy but isn't condoned because, hey, you can't make money off telling people to seek natural, inexpensive treatment? You can't have both a profit-oriented healthcare system that is predicated on the ill health of its, ahem, "customers" and a healthy populace. The two are incompatible. It's either record profits for the .01% or a healthy populace. A quick survey of the health in this country and it's immediately apparent which one "we've" chosen:

https://www.naturalnews.com/051482_cancer_industry_overdiagnosis_false_positives.html

https://articles.mercola.com/sites/articles/archive/2013/08/03/natural-cancer-treatment.aspx






Quote:


> Originally Posted by *mouacyk*
> 
> Not to mention the immense waste of electricity, since fiat operational costs aren't going away anytime soon.


Yeah because using coal and oil are the only ways to get electricity.


----------



## ColdDeckEd

I do think they'll run into a wall much like Intel CPUs have, but I'll be pleasantly surprised if they can keep the gen-to-gen improvements that high in the future. I guess 10 years is a long time. Would I still be using it in my main rig? Most likely no, but it'll always be useful until it stops working, that's all I'm saying.


----------



## Mooncheese

Quote:


> Originally Posted by *ColdDeckEd*
> 
> I do think there will be a wall they run into much like intel cpus but ill be pleasantly surprised if they can keep the gen to gen improvements that high in the future. I guess 10 years is a long time. Would I still be using it on my main rig? Most likely no, but it always be useful until its not working thats all Im saying.


The 1080 Ti will still be relevant for 3 years, 5 years tops. Just like the 780 Ti is still relevant today (Overwatch, 1080p, etc.).


----------



## 8051

Quote:


> Originally Posted by *Mooncheese*
> 
> Yeah because using coal and oil are the only ways to get electricity.


But your sarcasm didn't address the issue he raised, namely the "waste of electricity", which it is, because it's arguable whether mining produces anything worth making.


----------



## kithylin

Quote:


> Originally Posted by *Mooncheese*
> 
> Now that's with the 980 Ti at 1500 MHz. When I upgraded to 1080 Ti, if I were to stand in this very same spot the FPS would be the same except now GPU utilization would be say at 40%. And I already have Shadow Render Distance to Medium and have the game set to High Priority. Here's the exact settings in fact:
> 
> God Rays Low
> Shadow Distance Medium
> Shadow Quality High
> Everything Else Maxed
> Vogue ENB + Reshade


Also... if we have a 1080 Ti in a computer made within the past 3 years and have to turn -any- setting in -any game- even one notch below maximum/epic settings to run it @ 60 FPS, then it's just a s*** game and no amount of hardware we throw at it will make it better, at least not now. Maybe 5 years from now, when we have Intel 38000K 15-core chips @ 5.5 GHz stock and a GTX 9080 Ti, we'll be able to come back and brute-force it to 60 FPS. Some games are just bad in some places. 90% of the places in Fallout 4 I can run @ 60 FPS with my 1080 Ti. The main reason I wanted it is that I often like to build huge fortress bases in the game, and even at 1080p some of the bases I've built will load my 1080 Ti to 80-85% just to walk around in them @ 1080p / 60 FPS. There are many, many situations in some games where a 1080 Ti is barely enough (and sometimes not even enough) at 1080p. It might be my processor, but my 5820K was just a stop-gap until I can get my paws on a 6950X (or some sort of Broadwell chip) later anyway. It'll do for now, and later on it'll be excellent.


----------



## KCDC

Quote:


> Originally Posted by *mouacyk*
> 
> Have you also considered what happens if all those 1070's get dumped and you still have your 980 Ti?


Right, but I was talking about right now instead of last August. I got 650 for both; one was a coveted WaterForce card. Looks like now I could've gotten closer to 900 for both. Just saying that never happened before crypto became such a thing, where old tech actually appreciates after it depreciates.


----------



## mouacyk

Quote:


> Originally Posted by *8051*
> 
> But your sarcasm didn't address the issue he raised, namely the "waste of electricity", which it is, because it's arguable as to whether mining produces anything worth making.


Even solar panels don't appear out of thin air, unlike photons from the sun. Hidden from sight, hidden from mind...


----------



## 8051

Quote:


> Originally Posted by *kithylin*
> 
> Also... if we have a 1080 Ti in a computer made within the past 3 years and have to turn down -any- setting in -any game- even 1 notch below maxium/epic settings to run it @ 60 FPS, then it's just a s*** crap game and no amount of hardware we throw at it will make it better, at least not now. Maybe 5 years from now when we have Intel 38000K 15 core chips @ 5.5 ghz stock and a GTX 9080 Ti we'll be able to come back and brute-force it to 60 FPS. Some games are just bad in some places. 90% of the places in fallout 4 I can run @ 60 FPS with my 1080 Ti and.. main reason I wanted it is I often like to build huge fortress bases in the game and even at 1080p, some of the bases I build, have built, and play at will load up my 1080 Ti to 80% - 85% just to walk around in them @ 1080p / 60 FPS. There's many, many situations in some games where a 1080 Ti is barely enough (and some not even enough) at 1080p. It might be my processor, but my 5820K was just a stop-gap until I can get my paws on a 6950X (or some sort of broadwell chip) later anyway. It'll do for now and later on it'll be excellent.


Is a Broadwell-E so much better than a Haswell-E that it's a worthwhile expenditure? The Broadwell-Es are also LGA 2011-v3, right?


----------



## 8051

Quote:


> Originally Posted by *mouacyk*
> 
> Even solar panels don't appear out of thin air, unlike photons from the sun. Hidden from sight, hidden from mind...


How many home solar panel setups are capable of providing 300 Watts of power -- especially at night?


----------



## 8051

Quote:


> Originally Posted by *Mooncheese*
> 
> Speaking of Fallout 4, here's how sensitive this game is, but before I go on I want to point out two things you should do if youre playing this, one would be to turn Shadow Render Distance down to Medium, as this setting seems to really exacerbate the excessive draw calls that occur in downtown Boston, and two, you need to set the game to High Priority either with Task Manager or Process Hacker (my preference because you can save the setting and it will auto-apply next time) by alt+tabbing out of the game, opening application of choice, finding the .exe and right clicking, > Priority.
> 
> [


I tried this out (without setting Shadow Render Distance to Medium) by setting Fallout4.exe to Realtime priority. I did gain about 2 more FPS on my 5820K; I also noticed one core was pegged at 90% usage while my 1080 Ti wasn't even hitting 60% usage.
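For anyone wanting to log that GPU number over time rather than eyeballing an overlay, here's a sketch using nvidia-smi's CSV query mode. It assumes an NVIDIA driver is installed; the parser is split out so it can be sanity-checked without a GPU:

```python
import subprocess

def parse_utilization(csv_text):
    """Parse 'utilization.gpu' CSV output: one integer per GPU, one per line."""
    return [int(line) for line in csv_text.split() if line.strip()]

def gpu_utilization():
    """Current GPU load in percent, one entry per installed GPU."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True)
    return parse_utilization(out)
```

Calling `gpu_utilization()` in a loop with a sleep gives a rough utilization log you can line up against FPS drops to tell CPU-bound moments from GPU-bound ones.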


----------



## Mooncheese

Quote:


> Originally Posted by *kithylin*
> 
> Also... if we have a 1080 Ti in a computer made within the past 3 years and have to turn down -any- setting in -any game- even 1 notch below maxium/epic settings to run it @ 60 FPS, then it's just a s*** crap game and no amount of hardware we throw at it will make it better, at least not now. Maybe 5 years from now when we have Intel 38000K 15 core chips @ 5.5 ghz stock and a GTX 9080 Ti we'll be able to come back and brute-force it to 60 FPS. Some games are just bad in some places. 90% of the places in fallout 4 I can run @ 60 FPS with my 1080 Ti and.. main reason I wanted it is I often like to build huge fortress bases in the game and even at 1080p, some of the bases I build, have built, and play at will load up my 1080 Ti to 80% - 85% just to walk around in them @ 1080p / 60 FPS. There's many, many situations in some games where a 1080 Ti is barely enough (and some not even enough) at 1080p. It might be my processor, but my 5820K was just a stop-gap until I can get my paws on a 6950X (or some sort of broadwell chip) later anyway. It'll do for now and later on it'll be excellent.


Who cares if you can get 99% GPU utilization in Sanctuary or any other irrelevant locale? The main dish, the central nexus, the most interesting and exciting place in Fallout 4, is downtown Boston, and if your FPS plummets to 60 there, sure, that's still playable, but it looks much better at 100 FPS, with much higher minimums and no stutter/hitching.

And it's like this with any game. Sure, you can ghost-cap some base and have 100 FPS with 99% GPU utilization in Planetside 2, but that's not interesting or fun. Big fights are interesting and fun, 48+/48+ fights, and with anything less than a 6700K @ 4.7 GHz you're going to see horrid performance there.

Or Middle Earth: Shadow of Mordor. Sure, you can waltz around the outskirts of some orc stronghold, but that isn't any fun; fun is fighting 50+ orcs in the middle of their stronghold.

Far Cry 4: sure, you can sneak around and do things stealthily, but the game gets very interesting once reinforcements are called and you're up against more than a squad of goons.

GTA 5: sure, you can drive around like a tourist, but the game really gets exciting once you have 2, 3, or 4 stars and a plethora of cops and helicopters chasing you.

And it's like this with any game, right on down the line. All of the above induce SEVERE CPU-bottleneck scenarios. Your 5820K at 4.5 GHz will NOT be up to the task. I'm STILL seeing a CPU bottleneck with the 8700K in some of the aforementioned scenarios, particularly Far Cry 4 (my average did go from 80 to 110, though, and is 30 FPS higher in CPU-bound situations).

CPU absolutely matters.

Firestrike Combined is a scenario indicative of the performance one could expect in any of the aforementioned scenarios.

https://www.3dmark.com/compare/fs/12830703/fs/13925758#

Sorry, but 40 FPS isn't playable.


----------



## feznz

That's what I discovered: Assassin's Creed Origins stuttered to the point of being unplayable with a 3770K @ 5 GHz, even with the highest-utilization core at only 90%; switching to the 8600K @ 4.6 GHz, it's buttery smooth. I wouldn't recommend a 4-core to anyone, period.
We are hitting new heights in CPU demands, a bit like 10 years ago: "why do I need a quad core when my duo is doing fine?"

I'm just highly disappointed in Intel's marketing strategy, releasing the 270 chipset and then months later requiring the 370 to run Coffee Lake on the same physical CPU socket. But then, anything to empty your pockets of money, I guess.


----------



## Mooncheese

Oh, and looking at the 40-to-60 FPS difference in Firestrike Combined reminded me of something relevant I wanted to add to my last post: in large fights in Planetside 2 (48+/48+), my FPS went from 40 to 60, a 50% gain, going from the 4930K @ 4.5 GHz to the 8700K @ 5.0 GHz. Given the roughly 7% difference between the 4930K and 5820K, that would be 43 vs. 60 FPS, not 40 vs. 60. Still unplayable.
Quote:


> Originally Posted by *feznz*
> 
> Thats what I discovered Assassins Origins stuttered like to unplayable with a 3770k @ 5Ghz even with highest utilisation core @ 90%, switched to the 8600k @ 4.6Ghz buttery smooth I wouldn't recommend a 4core to anyone period.
> We are hitting new hieghts in CPU demands bit like 10 years ago, when do I need a quad core as my duo is doing fine
> 
> I just highly dissappointed in intel with marketing stratagy releasing a 270 Chipset then in months the 370 required to run the Coffee lake yet same physical CPU socket but then anything to empty your pockets of money I guess.


I agree, and the rationale for Z370, that Coffee Lake processors would need more robust power delivery, was thoroughly dismantled, because the Z270 boards were all over-engineered and could easily handle 180 W.

Occam's Razor says it's the motherboard industry in cahoots with Intel.


----------



## kithylin

Quote:


> Originally Posted by *Mooncheese*
> 
> Who cares if you can get 99% GPU utilization in The Sanctuary, or any other irrelevant locale, the main dish, the central nexus, the most interesting and exciting place in Fallout 4 is downtown Boston, and if your FPS plummets down to 60 here, sure, that's still playable, but it looks much better at 100 FPS, with much higher minimums and no stutter / hitching.
> 
> And it's like this with any game. Sure, you can ghost cap some base and have 100 FPS, 99% GPU utilization in Planetside 2, but that's not interesting or fun, big fights are interesting and fun, 48+/48+ fights, and with anything less than an 6700k @ 4.7 GHz youre going to see horrid performance here.


That's fine and great if you're made of money. The GPU is a big investment, and the 1080 Ti has kept scaling with every new processor released to date, all the way up to the 8700K. So most likely it will continue to show improved performance on each new platform; as of yet no computer exists that stops a 1080 Ti from scaling upwards.

So I invested in the GPU first and bought what's affordable now on the computing side. For my needs (outlined in a previous post), X99 / 5820K / Broadwell is the best cost vs. performance at the moment. Later, when the newer HEDT platforms come down in price, I'll switch up to X299 / i9 chips and move my 1080 Ti forward. Then likely whatever HEDT platform comes after that will scale even further.

So for now, I'm fine.


----------



## hotrod717

Just snagged a 1080 Ti FTW3 for MSRP ($899).
The KPE just jumped to $1229 on the Egg, while the Aorus went to $1029. Looks like they are adding stock. GL.


----------



## webhito

Quote:


> Originally Posted by *hotrod717*
> 
> Just snagged a 1080ti ftw3 for msrp ($899).
> KPE just jumped to $1229 on the egg.


Nicely done! I got my KPE for $1000 here in Mexico. It was cheaper than the Aorus non-Extreme edition...

Crazy prices right now.


----------



## hotrod717

Quote:


> Originally Posted by *webhito*
> 
> Nicely done! I got my KPE for $1000 here in Mexico. It was cheaper than the the Aorus non extreme edition...
> 
> Crazy prices right now.


Thanks. I'd been looking for a while and kept hesitating too long. I don't think many people are going to have luck finding a 1080 Ti at original retail for much longer. Not sure why some are still at original prices while they're really bumping others.


----------



## Mooncheese

Quote:


> Originally Posted by *kithylin*
> 
> That's fine and great if you're made of money. The GPU is a big investment and it's already been proven that the 1080 Ti cards scale positively faster with each new processor released to date, all the way up to 8700K. So most likely the 1080 Ti's will continue to show improved performance with each new respective computing platform. We as of yet do not have any computer that exists that stops scaling 1080 Ti performance upwards.
> 
> So I invested in the GPU first, and got what's affordable now computing wise. For my needs (outlined in a previous post) x99 / 5820K / broadwell is the best cost vs performance, at the moment. Later when the newer HEDT platforms come down in price I'll switch up to x299 / i9 chips and move my 1080 Ti forwards. Then likely what ever HEDT platform after that will also scale even further with 1080 Ti's.
> 
> So for now, I'm fine.


But it's not, actually. The 5820K is $400+ new and $250+ used right now. Why would anyone not pay the difference between a used 5820K and a new 8700K, $150 for a 25% performance difference? Sure, you might save $150, but I would gladly pay that for a NEW 8700K. Who knows what you're getting with used CPUs and GPUs. Maybe DStealth's 5820K that he fed 1.45 V to secure 4.6 GHz is on there, "gently used, never overclocked".

You don't save anything buying a new 5820K; in fact, they are all over $380!

https://www.ebay.com/sch/i.html?_from=R40&_trksid=p2380057.m570.l1313.TR0.TRC0.H0.Xforza+horizon+3+pc+.TRS0&_nkw=i7+5820k&_sacat=0

Oh and add Watch Dogs 2 and Battlefield 1 to the list of games where CPU matters much more than usual.

If you use Temporal Filtering and eliminate the GPU bottleneck that can be present, you will absolutely see a CPU bottleneck with a 5820K @ 4.5 GHz, because I still see one with the 8700K @ 5.0 GHz:

https://www.geforce.com/whats-new/guides/watch-dogs-2-graphics-and-performance-guide

Anyone reading this and playing Watch Dogs 2: if you have Temporal Filtering off, you're doing it wrong.

And I'm not made of money; I haven't upgraded the CPU side of things since 2014. I was able to swing Coffee Lake because of disability back pay from time spent doing the bidding of the military-industrial complex in Afghanistan.






I got a lump sum of around $1200 right around Black Friday, just after trying to haggle with someone on Craigslist over their used 6700K. I said screw it and went all-in on Coffee Lake, buying everything during Black Friday. I spent maybe $1700, and that's counting $700 worth of liquid cooling plus the motherboard, PSU, case, and memory. Add the cost of the 1080 Ti, and the computer in the post-build vid above is a $4k PC that I pieced together for $2500. It's 50% faster CPU-, GPU-, and memory-wise (3200 MHz DDR4 vs. 2133 MHz DDR3), yet cost a third less than my original X79 780 Ti SLI build from 2014 ($3750), and, dare I say, it's 50% more beautiful.

But yeah, I get the bit about not being able to upgrade as frequently as we would like, but for me that was a blessing in disguise, because were it not for financial constraints I probably would have upgraded to a 6700K, and maybe, but probably not, a 7700K along the way.

By the time you can afford to, you will be able to upgrade to whatever comes after Ice Lake, if not Ice Lake itself.

But if you're trying to comfort yourself by pretending that newer CPUs don't afford significant, meaningful gains in games vs. an older CPU, you're kidding yourself.

That 40% Firestrike Combined difference is about what you could expect playing Planetside 2 with a big fight going on, 48+/48+ (44 vs. 58 FPS). And those minimums under 60 FPS are a big deal: when the game gets down to around 40 FPS it feels like an absolute stutter-fest, like you don't even want to play it. For me, that's how Planetside 2 and the vast majority of my CPU-bound 3D Vision games were; 40 FPS sucks. That's just Firestrike Combined, though, a somewhat useful benchmark; I'm actually seeing 50% or more in games. 40 to 60 FPS in Planetside 2 is 50%. 23 to 46 FPS in Zelda: Breath of the Wild on CEMU is 100%. 30 to 52 FPS overlooking Franklin's balcony in GTA 5 3D Vision is over 50% (45 would be 50%). 105 to 140 FPS in GTA 5 2D is about 33%. 60 to 90 FPS in Dragon Age: Inquisition is 50%. Etc.
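All of those percentages are the same simple ratio; a quick sketch to check the FPS pairs quoted above:

```python
def pct_gain(before, after):
    """Percent FPS improvement going from `before` to `after`."""
    return (after - before) / before * 100.0

# FPS pairs quoted above: (old CPU, new CPU)
pairs = [(40, 60), (23, 46), (30, 52), (105, 140), (60, 90)]
for before, after in pairs:
    print(f"{before} -> {after} FPS: {pct_gain(before, after):.0f}% gain")
```

Worth remembering the asymmetry: a 50% gain from 40 FPS only corresponds to the new CPU losing 33% when going the other way, so "X% faster" and "X% slower" aren't interchangeable.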

I would say Combined is a decent average simulation of a CPU bottleneck, but the actual bottlenecks in games can be much higher: 50-70% between the 4930K @ 4.5 GHz and the 8700K @ 5.0 GHz (45-60% between the 5820K and the 8700K).

That YouTuber's "comparison" is a total joke. Really though, you're going to compare processors by taking a 5820K with a higher-than-average overclock of 4.6 GHz (maybe 33% of chips can do this) and running it against an 8700K at 4.8 GHz, which 100% of them can do (70% can do 5.0 GHz, and 42% can do 5.1 GHz; mine is in the latter group), and then you use a few racing games and Rainbow Six Siege, a 4v4 multiplayer game (yeah, that's a lot going on CPU-wise, NOT)? Dude. CPU comparison FAIL.


----------



## KedarWolf

I dunno why I'm no longer getting email notifications for any threads.

I checked Preferences, Immediate Notifications are enabled, nada.


----------



## kithylin

Quote:


> Originally Posted by *Mooncheese*
> 
> But it's not actually. 5820k is $400+ new and $250+ used right now? Why would anyone not pay the difference between used 5820k and new 8700k?


I already answered that question previously, and you even quoted my message and still you ask this. I guess you can't read so I'll point to it again: http://www.overclock.net/t/1624521/official-nvidia-gtx-1080-ti-owners-club/15880#post_26555815

I went with x99 instead of 8700K because I need the PCIE lanes for my usage out of a computer. It's not just about gaming. I'm not repeating all of that a second time, it's posted over there.

So I have two choices. One is new Intel x299: I need at minimum a 6-core i7 chip with 28 PCIe lanes, so that would be the i7-7800X, plus a motherboard that can do at least 2x16 + 1x8 (probably a 3-way or 4-way board), and the cheapest one with 8 RAM slots. I'm starting with 4 RAM sticks now (4x4GB) and want the ability to expand to 8 sticks later when DDR4 prices come down again. So the cheapest board that can do all that was chosen.

https://pcpartpicker.com/list/Hx9h6X - Current prices for that, $941.36 + I don't have a water block capable of mounting on LGA-2066, so I'd have to buy one of those somewhere (I can't even actually find one for sale right now), so let's ballpark that at $1056.36 right now.

Or x99, I already have a block for it.. I managed to get the entire motherboard + cpu + ram (4x4GB DDR3-2400) for $450 shipped off ebay about 2 months ago.

The 8700K's faster, sure, but it doesn't have enough PCIe lanes for me.

So now I've explained it, twice. Please stop and drop the cpu comparison stuff, it's way off topic here.

EDIT: 8700K system is here: https://pcpartpicker.com/list/2nd4Fd $703.98 base, cheapest 2x8GB ram (have to have at least 16GB), which is still +$250 over what I paid, and it doesn't have enough PCIE lanes and can't use it anyway.


----------



## Thoth420

Hey all I was wondering if I need to have XOC installed and running to control the RGB color of a 1080 Ti FTW3 DT. I just want them to be a static purple color 24/7 to match my other lights. No breathing or any fancy jazz.


----------



## feznz

Quote:


> Originally Posted by *KedarWolf*
> 
> I dunno why I'm no longer getting email notifications for any threads.
> 
> I checked Preferences, Immediate Notifications are enabled, nada.


LOL, that's probably a good thing


----------



## mAs81

Quote:


> Originally Posted by *KedarWolf*
> 
> I dunno why I'm no longer getting email notifications for any threads.
> 
> I checked Preferences, Immediate Notifications are enabled, nada.


Happened to me a couple of times too.. PM ENTERPRISE and he'll fix it for you..


----------



## KedarWolf

Quote:


> Originally Posted by *mAs81*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KedarWolf*
> 
> I dunno why I'm no longer getting email notifications for any threads.
> 
> I checked Preferences, Immediate Notifications are enabled, nada.
> 
> Happened to me couple of times too..PM ENTERPRISE and he'll fix it for you..

I figured out that all my subscriptions, under the Subscribed option at the bottom of each thread, had defaulted to Site Only, No Email Notifications.

I fixed it.


----------



## SavantStrike

Quote:


> Originally Posted by *hotrod717*
> 
> Thanks. Been looking for a while and keep hesitating too long. I don't think many people are going to have luck finding 1080ti at original retail for much longer. not sure why some are at original and they're really bumping others.


I think the bump happens when the item comes back in stock, and when a competitor bumps. The few affordable options left are remnants. If one comes in stock, buy immediately, as the bump is coming within hours.

Crazy times.


----------



## ThrashZone

Quote:


> Originally Posted by *Thoth420*
> 
> Hey all I was wondering if I need to have XOC installed and running to control the RGB color of a 1080 Ti FTW3 DT. I just want them to be a static purple color 24/7 to match my other lights. No breathing or any fancy jazz.


Hi,
Yes you do as far as I know
You also need it to control all 3 fan curves.
I like XOC and K Boost too


----------



## Scotty99

I don't know where else to post this, but (feel free to move if not OK here) is anyone willing to trade an MSI hydro 1080 Ti for my Asus Strix? I want to put a 360mm rad in the front of my case, and the reference design is the longest that will fit.

Alternatively I could simply buy a different case like the Corsair 460X, but I don't like it as much as the Meshify


What do you guys think? Either way I'm going with Corsair's new 360 AIO for my CPU; that thing seems to perform like a beast.


----------



## Agent-A01

Quote:


> Originally Posted by *Mooncheese*
> 
> But it's not actually. 5820k is $400+ new and $250+ used right now? Why would anyone not pay the difference between 5820k and 8700k? $100 for 25% performance difference?


You can get a new 6800K/6850K for about $300ish now.

Considering the 8700K has such a weak platform, it's a decent trade-off for a 10-15% performance loss.


----------



## rickyman0319

Is 83C too high for a 1080 Ti for mining?


----------



## Thoth420

Quote:


> Originally Posted by *ThrashZone*
> 
> Hi,
> Yes you do as far as I know
> You also need it to control all 3 fan curves.
> I like XOC and K Boost too


Thanks! I just heard a lot of horror stories from users. I will be using ASUS Aura for the rest of my system, but I prefer an EVGA card in case I need to RMA it for some reason.


----------



## 8051

Quote:


> Originally Posted by *Agent-A01*
> 
> You can get a new 6800/6850k for about 300ish now
> 
> Considering the 8700k has such a weak platform It's a decent trade off for 10-15% performance loss.


What does a Broadwell-E get you over a Haswell-E? 5% more performance? 10% more?


----------



## kithylin

Quote:


> Originally Posted by *8051*
> 
> What does a Broadwell-E get you over a Haswell-E? 5% more performance? 10% more?


+10%, 5960x vs 6950X: http://cpu.userbenchmark.com/Compare/Intel-Core-i7-5960X-vs-Intel-Core-i7-6950X/2580vs3604

+10%, 8700K vs 6950X: http://cpu.userbenchmark.com/Compare/Intel-Core-i7-8700K-vs-Intel-Core-i7-6950X/3937vs3604

This is a purely synthetic test though and has no real bearing on anything.


----------



## Mooncheese

Quote:


> Originally Posted by *kithylin*
> 
> I already answered that question previously, and you even quoted my message and still you ask this. I guess you can't read so I'll point to it again: http://www.overclock.net/t/1624521/official-nvidia-gtx-1080-ti-owners-club/15880#post_26555815
> 
> I went with x99 instead of 8700K because I need the PCIE lanes for my usage out of a computer. It's not just about gaming. I'm not repeating all of that a second time, it's posted over there.
> 
> So I have two choices, new Intel x299, and I need minimum 6 core i7 chip with 28 PCIE lanes, so that would be the I7-7800X, and a motherboard that can do at least 2x16 + 1x8, probably a 3-way or 4-way board, and cheapest one with 8 ram slots. Starting with 4 ram sticks now (4x4GB), and want the ability to expand to 8 sticks later when ddr4 prices come down again. So cheapest board that can do that was chosen.
> 
> https://pcpartpicker.com/list/Hx9h6X - Current prices for that, $941.36 + I don't have a water block capable of mounting on LGA-2066, so I'd have to buy one of those somewhere (I can't even actually find one for sale right now), so let's ballpark that at $1056.36 right now.
> 
> Or x99, I already have a block for it.. I managed to get the entire motherboard + cpu + ram (4x4GB DDR3-2400) for $450 shipped off ebay about 2 months ago.
> 
> 8700K's faster sure but it doesn't have enough PCIE lanes for me.
> 
> So now I've explained it, twice. Please stop and drop the cpu comparison stuff, it's way off topic here.
> 
> EDIT: 8700K system is here: https://pcpartpicker.com/list/2nd4Fd $703.98 base, cheapest 2x8GB ram (have to have at least 16GB), which is still +$250 over what I paid, and it doesn't have enough PCIE lanes and can't use it anyway.


Sorry, I didn't read your previous post. Yeah, $450 for the CPU, mobo, and RAM was the way to go; the 5820k is still a solid CPU. The 8700k only makes sense if you're coming from something older and don't need the PCIe lanes.

But, with that out of the way, you may have seen the comparisons between M.2 PCIe and conventional SSDs? In gaming, aside from marginally faster load times, they aren't worth the hassle. And a RAID card? For data-loss protection? You could just buy a cheap external hard drive and periodically create a system image. In fact, you probably could have sold the M.2 PCIe drive and RAID card for, I don't know, $100-200 and put that towards the difference between the 5820k build and a new 8700k build.

But whatever man, to each their own, you do whatever you want to do.

Edit:

Oh, and you still could have run one of the cards, ideally your M.2 PCIe drive if you wanted, as it's already been established that the difference between PCIe 3.0 x16 and x8 is less than 2%.


----------



## Thoth420

I think I am going to go with the Strix. Anyone have one just randomly die or do any funny business? Asus RMA is a nightmare, but I need to use Asus Aura Sync.


----------



## kithylin

Quote:


> Originally Posted by *Mooncheese*
> 
> Sorry I didn't read your previous post, yeah $450 for the CPU, mobo and RAM was the way to go, 5820k is still a solid CPU, 8700k only makes sense if youre coming from something older and don't need the PCI-E lanes.
> 
> But, with that out of the way, you may have seen the comparison's between M.2 PCI-E and conventional SSD and in gaming, aside from minutely faster load times, they aren't worth the hassle? And, RAID card? For data loss? You could just buy a cheap external hard-drive and periodically create a system image. In fact, you probably could have sold the M.2 PCI-E and RAID card for, I don't know, $100-200 and put that towards the difference between the 5820k build and a new 8700k build.
> 
> But whatever man, to each their own, you do whatever you want to do.


I occasionally do some work that needs about 10-12 TB of storage with 1200-1500 MB/s access performance, video editing 2-4 raw 4K streams at the same time. And I can't have any risk of losing my current work while working on it due to something like a drive failure, so it absolutely must have at minimum 2-drive-failure protection, which means a RAID array and a PCIe 3.0 x8, 12 Gb/s RAID card. External storage can't come close to that capacity and that read/write performance. And the M.2 PCIe drive is superfast 2000-2500 MB/s temporary scratch space for scrubbing.

I don't do it daily or actually that often, a couple times a month when a local friend needs production for his business, and I do it for him for some extra cash (good for about $500 - $900 extra per month, depending on what he needs at the time, it varies). But when I do need it, I need it. I had all of this before on a Z68 board that used PLX chips with a 3770K and it worked fine but the board + CPU decided to just suddenly die one morning when I turned it on. This x99 system is so much faster for this configuration, and the new X299 system is waaaaaaaaaay more expensive. I had everything else (all the other parts) I just needed board + cpu + ram.

I actually have the main OS and most of my games on a 2 x 2TB SATA raid-0 SSD array for gaming, on the onboard sata ports.

Some of us work real jobs.. this is actually most of my income right now. That and freelance online computer consulting for other folks. Sometimes custom builds for others; I flew to New York to do a custom water-loop build for someone in Jan 2017. Sometimes I do online technical support from home, when I can get it, for various companies, etc.
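(As a back-of-the-envelope on how an array like that gets sized: the sketch below assumes hypothetical 4 TB drives doing ~200 MB/s sequential each, RAID 6 reserving two drives' worth of parity, and large reads striping roughly linearly across members. These are illustrative figures, not kithylin's actual drive specs.)

```python
import math

def min_raid6_drives(capacity_tb, throughput_mbs, drive_tb, drive_mbs):
    # RAID 6 keeps two drives' worth of parity, so usable space is (n - 2) * drive size.
    n_for_capacity = math.ceil(capacity_tb / drive_tb) + 2
    # Large sequential reads stripe across all members, so throughput scales ~linearly.
    n_for_throughput = math.ceil(throughput_mbs / drive_mbs)
    return max(n_for_capacity, n_for_throughput, 4)  # RAID 6 minimum is 4 drives

min_raid6_drives(12, 1500, 4, 200)  # -> 8: throughput, not capacity, sets the drive count
```

On these assumptions it's the 1500 MB/s requirement, not the 12 TB, that dictates an 8-drive array, which is why a big 12 Gb/s RAID card makes sense over a couple of external disks.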


----------



## Mooncheese

Quote:


> Originally Posted by *kithylin*
> 
> I occasionally do some work stuff that needs about 10-12 TB of storage with 1200 - 1500 MB/s access performance, video editing 2-4 raw 4K streams at the same time. And I can't have any risk of losing my current work while working on it due to something like a drive failure, so it absolutely must have at minimum 2-drive-failure protection, which means raid array, and a PCIE-3.0-8x, 12 Gbps raid card. External storage can't come close to that capacity and that read/write performance. And M.2 PCIE for superfast 2000 - 2500 MB/s temporary scratch space for scrubbing.
> 
> I don't do it daily or actually that often, a couple times a month when a local friend needs production for his business, and I do it for him for some extra cash (good for about $500 - $900 extra per month, depending on what he needs at the time, it varies). But when I do need it, I need it. I had all of this before on a Z68 board that used PLX chips with a 3770K and it worked fine but the board + CPU decided to just suddenly die one morning when I turned it on. This x99 system is so much faster for this configuration, and the new X299 system is waaaaaaaaaay more expensive. I had everything else (all the other parts) I just needed board + cpu + ram.
> 
> I actually have the main OS and most of my games on a 2 x 2TB SATA raid-0 SSD array for gaming, on the onboard sata ports.
> 
> Some of us work real jobs.. this is actually most of my income right now. That and freelance online computer consulting for other folks. Sometimes custom builds for others, I flew to new york to do a custom water loop assembly build for someone in jan 2017. Sometimes I do online technical support from home when I can get it for various companies, etc.


Now that I have a better understanding of your needs, profession, etc., I have to agree that you made the right decision with the 5820k. It should last many years to come; by that time, hopefully AMD will put enough pressure on Intel to stop shipping ridiculously overpriced products (x299), and we will have a spiritual successor to the 5820k on some smaller node after Ice Lake.


----------



## kithylin

Quote:


> Originally Posted by *Mooncheese*
> 
> Now that I have a better understanding of your needs, profession etc. I have agree that you made the right decision with the 5820k. It should last many years to come, by that time hopefully AMD will put enough pressure on Intel to cease their ridiculous overpriced products (x299) and we will have a spiritual successor to the 5820k on some smaller node after Ice Lake.


Yeah.. I don't really call it a "profession", just something I do to help a friend grow his business interests every so often, and it just happens I also game on it when I don't have 'work' to do. I just need it all in one computer that can cover my needs for work and also be "competent enough" for gaming when I'm not working.

I don't know if I will be upgrading this thing to Broadwell-E or not; we'll have to see what the future holds. I'm silently holding my breath and hoping this x99 system holds up for at least 2-3 years, and by then maybe the price of either Threadripper or x299 will drop to manageable levels and I can upgrade. Who knows, maybe in 2 years X299 will still be a $2000+ option for the better 10+ core chips, and maybe the 6950X will be down at $300.

I managed to both have the spare cash and find one for sale to score my 1080 Ti in August 2017 for $685 shipped.. now they're almost $1600 and probably going up from here. I was actually wanting to upgrade to 32GB, 64GB or 128GB of RAM; I spoke with MSI and my x99 system supports up to 128GB with normal ECC unbuffered RAM (8 x 16GB), but.. sadly DDR4 prices flew through the roof before I was able to buy any more.. it just doesn't make sense now.

I just recently picked up a "backup" computer to use in case my x99 system dies on me.. gotta have a backup plan to keep doing work.

Scored a GA-Z77X-UD7 board for $200 used, a PLX-chip board with 4-way x8, so I could drop parts in and go in under 24 hours if I had to; still need to find a CPU for it later, though. Never know with hardware these days. Anything can decide to die on you any day without warning.


----------



## DStealth

Don't know what you guys are arguing about, but putting an M.2 drive on Z370 does not make you run the GPU at PCIe x8, as SATA and M.2 communicate over DMI, not PCIe lanes from the CPU.
Here's a diagram:

Actually the platform has 16 lanes from the CPU and 24 from the chipset, for a total of 40


----------



## kithylin

Quote:


> Originally Posted by *DStealth*
> 
> Don't know what you guys are arguing about, but putting m2 on z370 is not making you running x8 Pci-e as Sata and m2 are communicating with DMI not PCI-e lines from the CPU.
> Here's a diagram:
> Actually the platform has 16 lines from CPU and 24 from the chipset or a total number of 40


Pick a random Z370 motherboard and download and read the manual. On all of them, when you use the M.2 port in PCI-Express mode (you can toggle the ports between SATA and PCIe modes), it disables some things. What gets disabled depends on the motherboard and how the vendor implemented it: some boards disable part of the onboard SATA ports, others disable some of the PCIe slots. Then, if you use another x8 card in one of the PCIe slots along with the M.2 port in PCIe mode (which takes x4), it again depends on the motherboard: some Z370 boards will drop the GPU down to x8 and you're fine; some will drop the installed x8 card down to x4 and keep the GPU at x8. None of those situations is ideal, and anyone who needs all 3 devices (GPU + M.2 PCIe + an additional x8 card) typically needs them all operating at full speed all the time. Even if you were to use dual video cards on Z370 plus an M.2 PCIe drive, your GPUs would run at x8/x8 (not the end of the world), and then you'd lose some or all of your onboard SATA ports.

Unfortunately this is Intel's strategy and they designed their systems this way on purpose. If they gave you full speed for all connected devices on their "Desktop" class systems (Z370, coffee lake), then no one would ever buy the X99 / HEDT systems.
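One concrete way to see the chipset-side constraint being debated here: everything hanging off a Z370 chipset shares one DMI 3.0 link back to the CPU (electrically PCIe 3.0 x4, roughly 3.9 GB/s each way). A minimal sketch; the device figures are illustrative peak numbers, not measurements from any particular board:

```python
DMI3_GBPS = 3.94  # DMI 3.0 is electrically PCIe 3.0 x4, ~3.94 GB/s each way

def dmi_oversubscribed(devices_gbps):
    # True if the combined peak demand of chipset-attached devices
    # exceeds the single DMI link they all share.
    return sum(devices_gbps.values()) > DMI3_GBPS

# Illustrative peak figures (GB/s) for chipset-attached devices.
devices = {
    "NVMe M.2 (PCIe 3.0 x4)": 3.5,
    "2x SATA SSD RAID 0": 1.1,
    "Gigabit NIC": 0.125,
}
dmi_oversubscribed(devices)  # True: a fast NVMe drive alone nearly saturates DMI
```

So even on boards where the M.2 socket doesn't steal CPU lanes, a fast NVMe drive plus busy SATA drives can still contend for the same uplink, which is the practical side of both points above.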


----------



## feznz

I don't notice the difference with my Z370. I have 2 graphics cards running x8/x8, with a PCIe sound card, 1 M.2 SSD, 4 SATA drives and 2 sticks of RAM.
For home use this is fine


----------



## kithylin

Quote:


> Originally Posted by *feznz*
> 
> I don't notice the difference with my Z370 have X2 Graphics cards running x8/x8 with PCIE soundcard 1 M.2 SSD and 4 SATA and 2 sticks of ram
> For home use this is fine


Are you just gaming on your system though?


----------



## feznz

General home stuff, but everything works. Of course there are limitations, like running PCIe at x8 vs x16, but nothing warranting an x99 upgrade.
I didn't need 128GB of RAM. There are certain scenarios where you will, but I've always been told you'll know if you need it. It actually took 3 weeks before I noticed 10GB of RAM usage, which finally justified getting 16GB over 8GB


----------



## kithylin

Quote:


> Originally Posted by *feznz*
> 
> general home stuff but everything works of course there are limitations like running PCIE @ x8 vs x16 but nothing warranting a x99 upgrade
> I didn't need 128Mb ram were certain scenarios you will but I have always been told you will know if you need that actually took 3 weeks before I notice 10Gb of ram useage finally justified getting 16Gb vs 8Gb


Yeah, I was thinking of getting back into coding and programming, running 4-6 virtual machines at the same time as testing environments, which is why the RAM. No chance now with DDR4 prices so high; maybe later when RAM's affordable again.


----------



## rickyman0319

Is this correct?


----------



## Mooncheese

Quote:


> Originally Posted by *kithylin*
> 
> Yeah.. I don't really call it a "profession" just something I do to help a friend grow his business interests every so often, and just happens I also game on it when I don't have 'work' to do. I just need it all in one computer that can do my needs for work, and also be "competent enough" for gaming when not working.
> 
> I don't know if I will be upgrading this thing to broadwell-E or not, we'll have to see what the future holds. I'm silently holding my breath and hoping this x99 system holds up for me at least 2-3 years and by then maybe the price on either threadripper or x299 will drop to manageable levels and by then I can upgrade. Who knows, maybe in 2 years X299 will still be a $2000+ option for the better 10+ core chips and maybe 6950X will be down at $300.
> 
> I managed to both have the spare cash and find one for sale to score my 1080 Ti in august 2017 for $685 shipped.. now they're almost $1600 and probably going up from here. I was actually wanting to upgrade to 32GB, 64GB or 128GB ram, I spoke with MSI and my x99 system supports up to 128GB with normal ecc-unbuffered ram (8 x 16), but.. sadly the ddr4 prices flew through the roof before I was able to buy any more.. just doesn't make sense now.
> 
> I just recently picked up a "backup" computer to use in case my x99 system dies on me.. gotta have a backup plan to keep doing work.
> 
> Scored a GA-Z77X-UD7 board for $200 used, PLX chip board with 4-way-8x so I could drop-in stuff and go in < 24 hours if I had to, still need to find a cpu for it though later. Never know with hardware these days. Anything can decide to die on you any day without warning.


That's the best thing about x79, x99, and x299: access to quad-channel memory. I saw a 500-point jump in Firestrike CPU score going from dual channel to quad channel with my i7 4930k. How fast is your memory? I'm curious to see some memory benchmarks if we're running the same speed. I'm using 2x8GB at 3200 MHz, 16-18-18-38.
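A quick way to see why quad channel helps: peak DRAM bandwidth is just transfer rate times bus width times channel count. A back-of-the-envelope sketch (DDR4-3200, 64-bit channels; real sustained bandwidth is lower than these peaks):

```python
def mem_bandwidth_gbs(mt_per_s, channels, bus_width_bits=64):
    # Peak bandwidth = transfers/s x bytes per transfer x number of channels.
    return mt_per_s * (bus_width_bits // 8) * channels / 1000

mem_bandwidth_gbs(3200, 2)  # dual-channel DDR4-3200: 51.2 GB/s
mem_bandwidth_gbs(3200, 4)  # quad-channel DDR4-3200: 102.4 GB/s
```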

And redundancy is very important, I'm really happy that I built the new computer for this reason, if one of them goes down I can jump on the other one and troubleshoot or just use it for general purpose etc if the problem is a failed expensive component.

Quote:


> Originally Posted by *DStealth*
> 
> Don't know what you guys are arguing about, but putting m2 on z370 is not making you running x8 Pci-e as Sata and m2 are communicating with DMI not PCI-e lines from the CPU.
> Here's a diagram:
> 
> Actually the platform has 16 lines from CPU and 24 from the chipset or a total number of 40


Excellent information, I was unaware of this. I was under the impression that adding an M.2 drive would consume CPU PCIe lanes; actually it shares chipset lanes with the SATA bus, as you stated.

Quote:


> Originally Posted by *kithylin*
> 
> Pick a random Z370 motherboard and download and read the manual. All of them when you use the M.2 port in PCI-Express mode (you can toggle the ports between SATA and PCIE modes) it disables some things. What it disables depends on the motherboard and how they implemented their board. Some boards it disables partial onboard sata ports. Some boards it disables some of the PCIE ports. And then if you use another 8x card in one of the PCIE slots and the M.2 port in PCIE mode (takes 4x), it again, depends on the motherboard. Some Z370 boards will drop the gpu down to 8x and you're fine. Some of them will drop the installed 8x card down to 4x and keep the gpu at 8x. Any of those situations is not ideal, and if one needs all 3 devices (GPU + m.2-PCIE + additional 8x card) they typically need all devices to operate at full speed all the time. Even if you were to use dual video cards on Z370 + m.2 PCIE drive, your gpu's would run at 8x-8x (not the end of the world) and then you'd lose some or all of your onboard sata ports.
> 
> Unfortunately this is Intel's strategy and they designed their systems this way on purpose. If they gave you full speed for all connected devices on their "Desktop" class systems (Z370, coffee lake), then no one would ever buy the X99 / HEDT systems.


I'm curious to know which Z370 boards don't run the M.2 over the chipset lanes shared with the SATA bus. My Aorus Gaming 7, a $175 board (Black Friday price; normally it's $230-$250) that plants it firmly in the high-end budget category, does this:

In fact, thanks to this back and forth and input from Stealth I'm seriously considering picking up a good M.2 drive as having downloaded both Forza 7 and Forza Horizon 3 I'm down 150 GB of storage space.

Hopefully moving the OS over to it is relatively straightforward. Any idea as to whether or not this could be accomplished natively with Win 10?

Edit:

Looking up my memory via my Newegg order I realized that I couldn't have built my new computer at a better time.

I saved $50 on the mobo, $50 on the RAM ($175), and $40 on the PSU (Corsair RM1000i, $159), and picked up Win 10 Pro off Kinguin for $30, so that might count as $50-60? I then saved 15% on the entire order from performance-pcs.com for the EK parts, or about $100, and I got the case, a Thermaltake View 71, for $150 out the door, no tax and free shipping from B&H Photo. I probably saved $350-400 doing this during Black Friday.


----------



## Thoth420

These GPU prices and shortages are ridic. I curse AMD for failing to compete on the graphics front.


----------



## MightEMatt

Quote:


> Originally Posted by *Thoth420*
> 
> These GPU prices and shortages are ridic. I curse AMD for failing to compete on the graphics front.


Their failure to compete at a 1080Ti level isn't the cause of the current GPU market.


----------



## nrpeyton

Anyone aware of any "tweaks" or "optimisations" that can be made to maximise mining efficiency on a 1080Ti?

_(For those who don't like mining: don't worry, I bought my GPU as a gamer and for games. I'm only taking advantage of an extra opportunity; I'd never buy a GPU specifically just to mine.)_


----------



## Mooncheese

Quote:


> Originally Posted by *Thoth420*
> 
> These GPU prices and shortages are ridic. I curse AMD for failing to compete on the graphics front.


Quote:


> Originally Posted by *MightEMatt*
> 
> Their failure to compete at a 1080Ti level isn't the cause of the current GPU market.


I have a feeling that the current situation is not the result of demand, which is higher than usual, but of a lack of supply, because, at least on Nvidia's side of things, the factories have switched production over to Volta and there WILL be no more Pascal cards. Not sure what's going on on AMD's side.
Quote:


> Originally Posted by *nrpeyton*
> 
> Anyone aware of any "tweaks" or "optimisations" that can be made to maximise mining efficiency on a 1080Ti?
> 
> _(For those who don't like mining -- don't worry -- I bought my GPU as a gamer and for games... I'm only taking advantage of an extra opportunity).. I'd never buy a GPU specifically just to mine)._


I mine and game. I've found that my 1080 Ti likes 100% PT for whatever reason: it will do 74 MH/s on Lyra2REv2 at 1900 MHz at 0.950v and 100% PT. If I up the PT to 120%, this figure only goes up to 78 MH/s at 2000 MHz at 1.0v.

So we're looking at 20% more wattage drawn for another 5% clockspeed and maybe 5% more hashrate?

For memory overclock, with my FE anything over +400 yields diminishing returns due to error correction.

I'm under a full water block; I posted a vid a few pages back where I run Prime95 small FFT while NiceHash runs Cryptonight and Lyra2REv2, to show off the cooling and acoustics. 35-38C depending on whether it's a game or mining, etc. Right now it's at 37C doing 722 H/s on Equihash.
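Those power-target numbers translate directly into efficiency. A minimal sketch, assuming the FE's 250 W rated board power as the 100% PT baseline (actual draw at 0.950v will be lower, so treat these as rough figures):

```python
TDP_WATTS = 250  # GTX 1080 Ti FE rated board power; the power target scales from this

def mhs_per_watt(hashrate_mhs, power_target_pct):
    # Hashing efficiency at a given power target, in MH/s per watt.
    watts = TDP_WATTS * power_target_pct / 100
    return hashrate_mhs / watts

mhs_per_watt(74, 100)  # ~0.296 MH/s/W at stock power target
mhs_per_watt(78, 120)  # ~0.260 MH/s/W: ~5% more hashrate for 20% more power
```

On these assumptions the 100% PT setting is about 14% more efficient per watt, which matches the intuition in the post.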


I will turn this off in a few after dinner and play Forza Horizon 3 for a few hours and then it's back to mining.

If you're not mining with your components when you're not using them, you're not making an ideological statement, you're just stupid. Period.

Thinking that by not mining you're going to "kill" cryptocurrency mining only makes BTC and the various alt coins in question that much more valuable, and the uber-rich families running massive mining farms in China that much richer.


Talk about delusion with a capital D.

You want to stop mining? Well then YOU should be mining. The more supply there is, the less value the commodity in question, physical or, ahem, ethereal, has.

And with GPU prices this high, well, you might as well get a return on your investment..


The conversation we SHOULD be having, however, is about export constraints in the form of tariffs on countries where large-scale cryptocurrency mining is going on, i.e. China. If GPU manufacturers demanded this, THAT would constrain cryptocurrency mining far more than impassioned ethical pleas to the PC community about how some tween living in their parents' basement shouldn't use their GTX 1080 to make $3 a day when they aren't gaming. Talk about not addressing the proverbial elephant in the room. Watch that Motherboard mini-documentary on cryptocurrency mining farms in China. The tween with the GTX 1070 is NOT the problem. Again, back to basic economics: if you want to kill cryptocurrency mining, then based on the law of supply and demand YOU SHOULD BE MINING. The more ubiquitous the commodity, the less its value, and hence fewer massive cryptocurrency farms in China and elsewhere.

Look at the value of gold, it is NOT ubiquitous.

Ok, now look at the value of air. It IS ubiquitous.

Make Cryptocurrency ubiquitous.

Problem solved.

For the record I'm a diehard PC gamer. I don't have a mini farm and I'm using hardware that I already had on hand to mine with.

That's my $ .02

(I posted this comment to Jays2cent's recent video on the topic.)


----------



## Thoth420

I thought Volta was being skipped this upcoming gen for some form of Pascal refresh. There hasn't been any info on the 11xx series yet at all, has there?


----------



## Mooncheese

I apologize for the TLDR posts, anyhow, on the subject of M.2 drives, I found this article on PC Gamer and some very interesting comments in the comment section that I think are relevant:

http://www.pcgamer.com/intel-optane-ssd-900p-280gb-review/

I will probably pick up another conventional SSD for half the price. I've seen the boot time comparisons and the game load time comparisons between a fast M.2 (960 Evo) and a good conventional SSD (850 Evo) and was not impressed.

> *Bribase*: What I want to know is whether this confers any benefit to your average gamer whatsoever. Developers and content creators, sure. I've never once felt as though I needed a better SSD other than its capacity. I'm kind of okay with <10 second load times for most games, why would I need this?
> 
> *Tilia* (replying to Bribase): None whatsoever, the 850 EVO at the bottom of the chart will be as fast as everything else above in games and most programs. The only real benefit to faster M.2 drives etc. is when you copy files. I bought a 960 EVO and I was so disappointed with the results that I sent it back after a day and bought an 850 EVO of double the capacity instead. M.2 drives are so pointless.
> 
> *Crimfresh* (replying to Tilia): I disagree. I notice a difference between the drives. I have both and the 960 is literally twice as fast. It's not a huge difference and I can understand the preference for more space, but saying double the performance is useless is foolish.
> 
> *Tilia* (replying to Crimfresh): Where do you notice a difference? When I tried, it would only be faster whenever it came to copying files. A few programs benefited from it due to creating huge temp files, and then the process is significantly faster, but beyond that it amounted to no change over my old Crucial SSD. Games would load just as fast, same with programs. Windows loaded a little bit faster once we're past the hardware booting sequence. Given the premium you pay for it, this is totally not worth it IMO. I'd rather get double the capacity for the same price. What's foolish is to claim it's twice as fast on anything but paper and file transfers.
> 
> *Foxfire15* (replying to Bribase): Realistically? You don't. This was stated in the article for what it's worth. For the average person, a larger cheaper SSD will probably be the better option.
> 
> *danwat1234* (replying to Foxfire15): Diminishing returns.. a mid-range SATA3 SSD is plenty fast for any consumer use besides huge file transfers. NVMe vs Xpoint, you wouldn't see any benefit. Super high-end uses are different.


----------



## Mooncheese

Quote:


> Originally Posted by *Thoth420*
> 
> I thought Volta was being skipped this upcoming gen for some form of Pascal refresh. There hasn't been any info on the 11xx series yet at all has there?


Where did you hear that?

Dude Volta is already technically out (Titan V).

Edit:

And Samsung has switched its memory production over from GDDR5 to GDDR6; they aren't even making GDDR5 anymore. Which leads me to conclude, exactly as I stated earlier, that there will be no more Pascal cards, with Volta due out by May. Feb, March, April... Volta?

https://arstechnica.com/gadgets/2018/01/samsung-is-now-mass-manufacturing-gddr6-memory-for-your-next-gpu/

Usually the situation has been this: a new architecture rolls out, say Maxwell, and 3rd-party board manufacturers and retail outlets try to offload their surplus inventory. You could find 780 Ti's for $300 new on Amazon and Newegg when Maxwell released, and again 980 Ti's for $300 new when Pascal released. Not this time around. Everyone is SOLD OUT EARLY. And the manufacturers AREN'T MAKING ANY MORE BECAUSE THEY ALREADY RETOOLED THEIR ASSEMBLY LINES, HAVE DIFFERENT COMPONENTS ON HAND (i.e. GDDR6 memory), ETC. FOR VOLTA.

Double Edit:

I'm not sure if I said this here earlier but I had a new, sealed 1080 Ti FE in my ebay watch list that I was just watching out of curiosity and the auction ended yesterday with it at $1350!

That is twice MSRP!

You don't think Nvidia would like to sell more GPU's? THEY CAN'T! They switched over production to ramp up for expected demand for Volta! They aren't even making Pascal cards at the moment!

Think about it!

And I think this is why AMD is also sold out and highly priced at the moment; with Nvidia having ceased production of GPU's, AMD cannot keep up with demand on its own!

Back to supply and demand. You have X amount of something that people want, popularity of X goes up right at the time that the amount of X available is cut in half. Hence the GPU price situation in early 2018.


----------



## CoreyL4

hey guys,

Today I noticed my 1080 Ti running a little hotter than usual, like 5°C hotter. Today I also cleaned my case with a $40 air compressor and sprayed all over. Could that have done anything to make the card run hotter? I.e., knocked something loose on the card, etc.?

I may need to apply some new paste if it's starting to run hotter. Firestrike temps 2 months ago were in the low 60s, and now during Firestrike they're in the high 60s.

What paste and process do you guys recommend?


----------



## Thoth420

Quote:


> Originally Posted by *Mooncheese*
> 
> Where did you hear that?
> 
> Dude Volta is already technically out (Titan V).
> 
> Edit:
> 
> And Samsung has switched over its memory production from GDDR5 to GDDR6. They aren't even making GDDR5 anymore. Which leads me to conclude that, exactly what I stated earlier, there will be no more Pascal cards with Volta due out in May? Feb, March, April Volta?
> 
> https://arstechnica.com/gadgets/2018/01/samsung-is-now-mass-manufacturing-gddr6-memory-for-your-next-gpu/
> 
> Usually the situation has been thus, new architecture rolls out, say Maxwell, and 3rd party board manufacturers and retail outlets try to offload their surplus inventory, where you could find 780 Ti's for $300 new on amazon and newegg when Maxwell released, and then again, 980 Ti's for $300 new when Pascal released. Not this time around. Everyone is SOLD OUT EARLY. And the manufacturers AREN'T MAKING ANY MORE BECAUSE THEY ALREADY RETOOLED THEIR ASSEMBLY LINES ETC. FOR VOLTA.
> 
> Double Edit:
> 
> I'm not sure if I said this here earlier but I had a new, sealed 1080 Ti FE in my ebay watch list that I was just watching out of curiosity and the auction ended yesterday with it at $1350!
> 
> That is twice MSRP!
> 
> You don't think Nvidia would like to sell more GPU's? THEY CAN'T! They switched over production to ramp up for expected demand for Volta! They aren't even making Pascal cards at the moment!
> 
> Think about it!
> 
> And I think this is why AMD is also sold out and highly priced at the moment, with Nvidia having ceased production of GPU's AMD cannot keep up with demand on it's own!
> 
> Back to supply and demand. You have X amount of something that people want, popularity of X goes up right at the time that the amount of X available is cut in half. Hence the GPU price situation in early 2018.


I forget now just heard it in passing. I can't wait for May I need a build online in the next few weeks. Thanks for the info though.


----------



## Mooncheese

Quote:


> Originally Posted by *CoreyL4*
> 
> hey guys,
> 
> today i noticed my 1080ti running a little bit hotter than usual. like 5c hotter. today i cleaned my case with a $40 air compressor and sprayed all over. could that have done anything to make the card run hotter? ie. make something lose on the card etc?
> 
> i may need to apply some new paste if its starting to run hotter. firestrike score 2 months ago was in the low 60s for temps and now during firestrike it is in the high 60s.
> 
> what paste and process do you guys recommend?


Did you account for ambient? I.e., if your room is suddenly 5-7°F (about 3-4°C) warmer, that rise carries straight through to load temp, since with a fixed fan curve the card holds a roughly constant delta over ambient. That's usually the cause. I use Gelid GC Extreme. I used to use IC Diamond, but I didn't like the way it scratched my dies. Do not mess with liquid metal; if you get any on the lands around your die it's game over. Not worth the hassle IMHO.
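A back-of-the-envelope sketch of the ambient point, assuming a fixed fan curve so load temp tracks ambient roughly 1:1 in Celsius (temperature *differences* convert by 5/9, with no 32 offset):

```python
def f_delta_to_c(delta_f):
    """Convert a temperature DIFFERENCE in Fahrenheit to Celsius."""
    return delta_f * 5.0 / 9.0

def expected_load_temp(old_load_c, ambient_rise_f):
    """Rough new load temp if only ambient changed (fixed fan curve assumed)."""
    return old_load_c + f_delta_to_c(ambient_rise_f)

# A room 7F warmer is only ~3.9C warmer, so a card that used to sit at 61C
# landing around 65C is consistent with ambient alone:
print(round(expected_load_temp(61, 7), 1))  # 64.9
```

In other words, a "low 60s to high 60s" shift is within what a warmer room alone can explain, before suspecting paste or a knocked-loose cooler.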





Quote:


> Originally Posted by *Thoth420*
> 
> I forget now just heard it in passing. I can't wait for May I need a build online in the next few weeks. Thanks for the info though.


You're screwed, friend.

Why can't you wait?


----------



## GreedyMuffin

I've sadly sold my 1080 Ti and am moving to a 1070 Strix, which I already got.

Sold it for a high price due to the mining craze, so I earned some $$$.

The 1070 should keep me going until the new series is here.


----------



## Scotty99

Quote:


> Originally Posted by *GreedyMuffin*
> 
> I've sadly sold my 1080Ti and are moving to a 1070 Strix which I already got.
> 
> Sold it for a high price due to the mining craze, so I earned a some $$$.
> 
> The 1070 should keep me going until the new series is here.


I'm trying to do that as well haha. I have it on Craigslist and it's the cheapest 1080 Ti on there, but only one guy has contacted me; maybe I'll try eBay. Most of my games are CPU-bound, my 1060 is plenty, and I can hold out for a while for a 2070 or whatever next gen is called.

How much did you get out of yours? I'm asking 1050 for the Strix OC edition, which is an 800 dollar card..... surprised it hasn't sold yet tbh.


----------



## GreedyMuffin

Quote:


> Originally Posted by *Scotty99*
> 
> Im trying to do that as well haha. I have it on craigslist and its the cheapest 1080ti on there but only one guy has contacted me, maybe ill try ebay. Most of my games are cpu bound, my 1060 is plenty and i can hold out for a while for a 2070 or whatever next gen is called.
> 
> How much you get out of yours? Im asking 1050 for the strix OC edition, which is a 800 dollar card.....surprised it hasnt sold yet tbh.


I asked 1150 with the EKWB block. But bear in mind that we have a 25% tax on all of our electronics, so it is "only" a 900 USD card + WB.


----------



## Hulio225

Quote:


> Originally Posted by *Thoth420*
> 
> I thought Volta was being skipped this upcoming gen for some form of Pascal refresh. There hasn't been any info on the 11xx series yet at all has there?


Quote:


> Originally Posted by *Mooncheese*
> 
> Where did you hear that?
> 
> Dude Volta is already technically out (Titan V).
> 
> Edit:
> 
> And Samsung has switched over its memory production from GDDR5 to GDDR6. They aren't even making GDDR5 anymore. Which leads me to conclude that, exactly what I stated earlier, there will be no more Pascal cards with Volta due out in May? Feb, March, April Volta?
> 
> https://arstechnica.com/gadgets/2018/01/samsung-is-now-mass-manufacturing-gddr6-memory-for-your-next-gpu/
> 
> Usually the situation has been thus, new architecture rolls out, say Maxwell, and 3rd party board manufacturers and retail outlets try to offload their surplus inventory, where you could find 780 Ti's for $300 new on amazon and newegg when Maxwell released, and then again, 980 Ti's for $300 new when Pascal released. Not this time around. Everyone is SOLD OUT EARLY. And the manufacturers AREN'T MAKING ANY MORE BECAUSE THEY ALREADY RETOOLED THEIR ASSEMBLY LINES , HAVE DIFFERENT COMPONENTS ON HAND (i.e GDDR6 memory) ETC. FOR VOLTA.
> 
> Double Edit:
> 
> I'm not sure if I said this here earlier but I had a new, sealed 1080 Ti FE in my ebay watch list that I was just watching out of curiosity and the auction ended yesterday with it at $1350!
> 
> That is twice MSRP!
> 
> You don't think Nvidia would like to sell more GPU's? THEY CAN'T! They switched over production to ramp up for expected demand for Volta! They aren't even making Pascal cards at the moment!
> 
> Think about it!
> 
> And I think this is why AMD is also sold out and highly priced at the moment, with Nvidia having ceased production of GPU's AMD cannot keep up with demand on it's own!
> 
> Back to supply and demand. You have X amount of something that people want, popularity of X goes up right at the time that the amount of X available is cut in half. Hence the GPU price situation in early 2018.


You can read about that topic here for example:

https://wccftech.com/nvidia-next-generation-ampere-gpu-rumor/

TL;DR:
It seems like Volta isn't going to be released to consumers in the form of 11x0 GPUs, probably because the die is so big and the architecture carries tensor hardware for deep learning, AI, and so on.
Rumor has it that Ampere is the next consumer GPU because the Volta roadmap is already completed...

I guess that's what @Thoth420 is referring to.


----------



## Thoth420

Hulio225 said:


> Quote:Originally Posted by *Thoth420*
> 
> I thought Volta was being skipped this upcoming gen for some form of Pascal refresh. There hasn't been any info on the 11xx series yet at all has there?
> 
> Quote:Originally Posted by *Mooncheese*
> 
> Where did you hear that?
> 
> Dude Volta is already technically out (Titan V).
> 
> Edit:
> 
> And Samsung has switched over its memory production from GDDR5 to GDDR6. They aren't even making GDDR5 anymore. Which leads me to conclude that, exactly what I stated earlier, there will be no more Pascal cards with Volta due out in May? Feb, March, April Volta?
> 
> https://arstechnica.com/gadgets/201...manufacturing-gddr6-memory-for-your-next-gpu/
> 
> Usually the situation has been thus, new architecture rolls out, say Maxwell, and 3rd party board manufacturers and retail outlets try to offload their surplus inventory, where you could find 780 Ti's for $300 new on amazon and newegg when Maxwell released, and then again, 980 Ti's for $300 new when Pascal released. Not this time around. Everyone is SOLD OUT EARLY. And the manufacturers AREN'T MAKING ANY MORE BECAUSE THEY ALREADY RETOOLED THEIR ASSEMBLY LINES , HAVE DIFFERENT COMPONENTS ON HAND (i.e GDDR6 memory) ETC. FOR VOLTA.
> 
> Double Edit:
> 
> I'm not sure if I said this here earlier but I had a new, sealed 1080 Ti FE in my ebay watch list that I was just watching out of curiosity and the auction ended yesterday with it at $1350!
> 
> That is twice MSRP!
> 
> You don't think Nvidia would like to sell more GPU's? THEY CAN'T! They switched over production to ramp up for expected demand for Volta! They aren't even making Pascal cards at the moment!
> 
> Think about it!
> 
> And I think this is why AMD is also sold out and highly priced at the moment, with Nvidia having ceased production of GPU's AMD cannot keep up with demand on it's own!
> 
> Back to supply and demand. You have X amount of something that people want, popularity of X goes up right at the time that the amount of X available is cut in half. Hence the GPU price situation in early 2018.
> 
> 
> You can read about that topic here for example:
> 
> https://wccftech.com/nvidia-next-generation-ampere-gpu-rumor/
> 
> TL;DR:
> It seems like volta isn't going to be released to the consumer in the form of 11x0 GPUs. Probably because the die is so big and tensor stuff in the architecture for deep learning and AI stuff and so on...
> Rumor has it that Ampere is the next consumer GPU because the roadmap of volta is already completed...
> 
> I guess thats what @Thoth420 is referring to.


Yep that was it. It was when the Titan V came out.
Thanks for the link. I need a GPU to drive my Alienware Ultrawide panel asap. Bad timing.


----------



## KCDC

Thought I'd give the latest AISuite a go just for fun. Looks like it picks up my FE cards with the XOC BIOS and now has internal functions to OC them.

My guess is that it will interfere with AB... 

So I tried a firestrike SLI run with my latest stable OC on both cards and it crashed while both AB and AISuite were running.

Going to uninstall it and see if this is card degradation or AISuite messing with AB.


----------



## Silent Scone

KCDC said:


> Thought I'd give the latest AISuite a go just for fun. Looks like it picks up my FE cards with the XOC BIOS and now has internal functions to OC them.
> 
> My guess is that it will interfere with AB...
> 
> So I tried a firestrike SLI run with my latest stable OC on both cards and it crashed while both AB and AISuite were running.
> 
> Going to uninstall it and see if this is card degradation or AISuite messing with AB.


Worth noting that AI Suite currently has a number of issues since the latest barrage of security patches.


----------



## Captcha

Bought an Nvidia 1080ti FE today (europe). They do restock a few cards daily around 10am (CET/MEZ).


----------



## webhito

Captcha said:


> Bought an Nvidia 1080ti FE today (europe). They do restock a few cards daily around 10am (CET/MEZ).


Was the price as inflated as every gpu on the planet right now?


----------



## keikei

webhito said:


> Was the price as inflated as every gpu on the planet right now?


I think the only place where the card is actually selling @ MSRP is Nvidia's store, but it wasn't in stock.


----------



## webhito

keikei said:


> I think the only place where the card is actually selling @ msrp is on Nvidia's, but it wasnt in stock.


Yea, and they don't ship to anywhere but Canada and the US I believe.


----------



## KedarWolf

webhito said:


> Yea, and they don't ship to anywhere but Canada and the US I believe.


If anyone in USA/Canada DOES want a Nvidia 1080 Ti FE at MSRP go to www.nowinstock.net and sign up for alerts.

I get notifications they are in stock from the Nvidia store all the time in my email.

You need to buy it right away though, they sell out as quick as they come in stock.

I'm pretty sure you can sign up for alerts from the UK store as well.


----------



## CoreyL4

So ambient temp has remained the same, and my temps seem very high all of a sudden.

I would bench/play games around 59-62°C, and now when I run Heaven I am in the high 60s.

I have an old Firestrike run compared to one I did yesterday; the temps were around 61°C before and now hover at 64°C+.

Not sure what happened. Could a driver update cause something like this?
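If anyone wants to take the guesswork out of comparing runs, `nvidia-smi --query-gpu=timestamp,temperature.gpu --format=csv,noheader` can log temps during a bench. A minimal parser sketch for that CSV output; the sample readings below are made up, not real logs:

```python
def parse_temps(csv_lines):
    """Parse 'timestamp, temperature.gpu' rows as emitted by
    nvidia-smi --query-gpu=timestamp,temperature.gpu --format=csv,noheader."""
    return [int(line.split(",")[1].strip()) for line in csv_lines if line.strip()]

def avg(temps):
    return sum(temps) / len(temps)

# Hypothetical samples from two Fire Strike runs two months apart:
run_jan = ["2018/01/05 20:01:00, 60", "2018/01/05 20:02:00, 61", "2018/01/05 20:03:00, 62"]
run_mar = ["2018/03/05 20:01:00, 64", "2018/03/05 20:02:00, 65", "2018/03/05 20:03:00, 66"]
print(avg(parse_temps(run_mar)) - avg(parse_temps(run_jan)))  # 4.0
```

Logging both runs the same way rules out eyeballing the OSD at different moments; a consistent few-degree average delta with the same ambient is worth investigating, a noisy one usually isn't.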


----------



## ZealotKi11er

CoreyL4 said:


> So ambient temp has remained the same and my temps seem like they are very high all of a sudden.
> 
> would bench/play games around 59-62c and now when i run heaven i am in the high 60s.
> 
> 
> i have an old firestrike run compared to one i did yesterday and the temps were around 61c before and now hover 64+.
> 
> not sure what happened. could a driver update cause something like this?


I noticed that too. My card is running a bit hotter than normal.


----------



## KCDC

Silent Scone said:


> Worth noting that AI Suite currently has a number of issues since the latest barrage of security patches.



Noted. I deleted it anyway; I don't need it outside of on-the-fly fan control, which I can live without.


----------



## Silent Scone

KCDC said:


> Noted, I deleted it anyway, don't need it outside of on-the-fly fan control that I can live without.


Yeah, so much cleaner doing things in the UEFI. Although, I do have an Aquaero lol


----------



## Thoth420

KedarWolf said:


> If anyone in USA/Canada DOES want a Nvidia 1080 Ti FE at MSRP go to www.nowinstock.net and sign up for alerts.
> 
> I get notifications they are in stock from the Nvidia store all the time in my email.
> 
> You need to buy it right away though, they sell out as quick as they come in stock.
> 
> I'm pretty sure you can sign up for alerts from the UK store as well.


You can, and I already did: I set up alerts for all the Strix models as well as the EVGA models.


----------



## HoneyBadger84

I'm just patiently waiting for Amazon to get more EVGA GTX 1080 Ti FTW3 Hybrids in. I have an order placed @ $832 that they have to fill at that price unless I cancel it, so I'll let it sit there forever just so I can make a profit off the card while I order a Titan Xp with the funds from selling my R9 295x2 LUL


----------



## Thoth420

HoneyBadger84 said:


> I'm just patiently waiting for Amazon to get more eVGA GTX 1080 Ti FTW3 Hybrids in, I have an order placed @ $832 that they have to fill at that price unless I cancel it, I'll let it sit there forever just so I can make profit off the card while I order a Titan Xp with the funds from selling my R9 295x2 LUL


How do you do that? The auto order I meant.


----------



## HoneyBadger84

Thoth420 said:


> How do you do that? The auto order I meant.


I placed the order on January 7th, they had them listed as "Will ship on January 18th", and it's been delayed twice since then, but unless I cancel the order, whenever they finally do get one in stock, they have to sell it to me at that price. lol


----------



## ZealotKi11er

KedarWolf said:


> If anyone in USA/Canada DOES want a Nvidia 1080 Ti FE at MSRP go to www.nowinstock.net and sign up for alerts.
> 
> I get notifications they are in stock from the Nvidia store all the time in my email.
> 
> You need to buy it right away though, they sell out as quick as they come in stock.
> 
> I'm pretty sure you can sign up for alerts from the UK store as well.


From the looks of it, the 1070 Ti does come and go from stock. It was in stock Jan 24th at 9 am. I hope I can get the next-gen Nvidia GPU before they sell out.


----------



## HoneyBadger84

It's hilarious that Nvidia actually went to the lengths of putting out a bulletin for retailers to "only sell to gamers", as if a retailer can control what someone does with the cards. Even with purchase limits people can get around it; if a miner really wants to buy a bunch of cards, they can use different credit cards and addresses. It's impossible to completely block miners from buying cards.

I wish NVidia had an actual storefront, like brick and mortar, somewhere close I could get to a card at. I'd love to buy a Titan Xp in person, but sadly I live in the middle of bum**** Egypt New Mexico, so that's not gonna happen.


----------



## GOLDDUBBY

Silent Scone said:


> Worth noting that AI Suite currently has a number of issues since the latest barrage of security patches.


Check the installed updates list; I believe it's KB4056892. Pause Windows Update, then uninstall that filthy good-for-nothing patch.


----------



## hotrod717

Actually, the FE was in stock on the US site for a brief period today.

So... just received my first non-FE card, an EVGA FTW3. Nice, solid card, and other than needing a firmware update for the fans it's running well. I usually use AB, but was playing with PXOC. The card initially boosts to 1987 MHz with the appropriate Nvidia control settings, but drops to 1974 MHz after a few minutes. All readings look really good. Don't understand the clock drop; I tried several scenarios and it always happens. I didn't have this with the FE. Is this common for AIB cards? Energy-saving feature, part of the BIOS? Also didn't see any discernible performance difference between the OC and standard BIOSes. Just started playing, but the card seems like a winner so far, based on initial impressions and testing on air. Should wake up on a universal block and chilled water.


----------



## feznz

Have to say, even here in NZ the cards are starting to sell like hotcakes, but there's still plenty of stock if you browse a little; it's the top-rated retailers that are selling out.


----------



## Thoth420

Just spent $1,389 US on an EVGA FTW3 Gaming... I feel so dirty. I need a shower and scotch. RIP wallet


----------



## Mooncheese

Thoth420 said:


> Just spend 1389 US on an EVGA FTW3 Gaming....I feel so dirty. I need a shower and scotch. RIP wallet


Star Wars Titan Xp out of stock?

Edit: 

Nope. 

I mean if you're going to pay $1400, why not buy something 15% faster?

https://www.nvidia.com/en-us/titan/titan-xp/

Personally I wouldn't pay that much for a 1080 Ti unless I was mining with it, and even then ROI is going to take twice as long. You're probably better off waiting for Volta since, again, I think the reason for the supply shortage is that Nvidia is already making Volta and may have completely ceased production of Pascal.

And it's still Volta, due out in May; the Ampere information was due to a mistranslation by a foreign outlet, and this has already been clarified.


----------



## Thoth420

Mooncheese said:


> Star Wars Titan Xp out of stock?
> 
> Edit:
> 
> Nope.
> 
> I mean if youre going to pay $1400 why not buy something 15% faster?
> 
> https://www.nvidia.com/en-us/titan/titan-xp/


I am going with an all deep purple LED theme so I just couldn't do it. I was thinking about it, but the only one left was the Jedi green and I think that would blend terribly with the build. If they had the Empire red, I think the rest of the LEDs would have drowned it out some, but alas those were sold out. The noise from the blower cooler is also a concern for me.


----------



## Mooncheese

Thoth420 said:


> I am going with an all deep purple LED theme so I just couldn't do it. I was thinking about it but the only one was the Jedi Green and I think that would blend terrible with the build. If they had the Empire Red I think the rest of the LEDs would have drowned it out some but alas those were sold out. The noise from the blower cooler is also a concern for me.


Wait, you paid $1400 for a 15% slower card because of LED color? You do know you can turn the LEDs completely off, right? Personally speaking, that cooler would have been replaced with a water block. Is it too late to return the 1080 Ti?

Titan Xp is about 15% faster; that's why it's $1400. Also, I edited my previous comment.

Edit: 

Phanteks sells an RGB water block for 1080 Ti / Titan X / Titan Xp.


----------



## Thoth420

Mooncheese said:


> Wait, you paid $1400 for a 15% slower card because of LED color? You do know you can turn the LED's completely off right? Personally speaking, that cooler would have been replaced with a water block. Is it too late to return the 1080 Ti?
> 
> Titan Xp is about 15% faster. That's why it's $1400. Also, I edited my previous comment.
> 
> Edit:
> 
> Phanteks sells an RGB water block for 1080 Ti / Titan X / Titan Xp.


I'm not doing a loop this time around; it's too much hassle. The LED on that limited edition can be turned off? In the BIOS or via a physical switch? I am mostly avoiding it because I hate blower cards; the noise is atrocious.
I really am that OCD, so yes, I want the card to have purple LEDs enabled. I know it's crazy, but I know what I want and the miners will not stop me. I do appreciate the advice, however.


----------



## HoneyBadger84

Mooncheese said:


> Star Wars Titan Xp out of stock?
> 
> Edit:
> 
> Nope.
> 
> I mean if youre going to pay $1400 why not buy something 15% faster?
> 
> https://www.nvidia.com/en-us/titan/titan-xp/
> 
> Personally I wouldn't pay that much for a 1080 Ti unless I was mining with it, and even then, ROI is going to twice as long and youre probably better off waiting for Volta as again, I think the reason for the supply shortage is because Nvidia is already making Volta and may have completely ceased production of Pascal.
> 
> And it's still Volta, due out in May, the Ampere information was due to a mis-translation by a foreign outlet, this has already been clarified.


The Titan Xp is now out of stock. Rip I was about to order one this morning...


----------



## falcon26

I spoke with people from Amazon, Newegg, Fry's, Central Computer, and B&H Photo, and they all pretty much said don't expect stock or normal prices on any 1080 or 1080 Ti for several months. When does the 1080 Ti replacement come out? I may just wait for that. There is no way I am paying $1200-1400 for a 1080 Ti; that is insane...


----------



## KedarWolf

falcon26 said:


> I spoke with people from Amazon, Newegg, Frys, Central Computer and B&H photo they all pretty much said don't expect stock or normal prices on any 1080 or 1080 ti for several months. When does the 1080 ti replacement come out? I may just wait for that. There is no way I am paying $1200-1400 for a 1080 ti that is insane...


Okay,

I'm saying this again.

Go to www.nowinstock.net and sign up for alerts for the Nvidia Store 1080 Ti FE. You get an email when the FE is in stock for $699 USD; buy it IMMEDIATELY, they go out of stock right away.

Profit.
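If you'd rather roll your own alert than rely on a third-party service, the idea is just polling the product page for an out-of-stock marker. A rough sketch only; the URL and marker string below are placeholders, not the Nvidia store's real address or markup:

```python
import time
import urllib.request

PRODUCT_URL = "https://www.example.com/1080-ti-fe"  # placeholder, not the real store URL
OUT_OF_STOCK_MARKER = "Out of Stock"                 # placeholder marker text

def in_stock(html, marker=OUT_OF_STOCK_MARKER):
    """True when the out-of-stock marker is absent from the page body."""
    return marker.lower() not in html.lower()

def poll(fetch, interval_s=60, max_checks=3):
    """Call fetch() repeatedly until in_stock(...) or we give up."""
    for _ in range(max_checks):
        if in_stock(fetch()):
            return True
        time.sleep(interval_s)
    return False

# Live use would fetch the page each time, e.g.:
#   poll(lambda: urllib.request.urlopen(PRODUCT_URL).read().decode())
print(in_stock("<p>Out of Stock</p>"))  # False
```

Services like nowinstock do exactly this at scale, so for most people signing up for their alerts is the easier route; a personal poller just removes the middleman.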


----------



## keikei

falcon26 said:


> I spoke with people from Amazon, Newegg, Frys, Central Computer and B&H photo they all pretty much said don't expect stock or normal prices on any 1080 or 1080 ti for several months. When does the 1080 ti replacement come out? I may just wait for that. There is no way I am paying $1200-1400 for a 1080 ti that is insane...


What would you be willing to pay though? Fortunately/unfortunately its a seller's market. I've seen some used cards going for $800-1100. Bidding offers being the low end.


----------



## ThrashZone

Hi,
One can sign up for email alerts on EVGA's website too
http://www.evga.com/


----------



## Mooncheese

falcon26 said:


> I spoke with people from Amazon, Newegg, Frys, Central Computer and B&H photo they all pretty much said don't expect stock or normal prices on any 1080 or 1080 ti for several months. When does the 1080 ti replacement come out? I may just wait for that. There is no way I am paying $1200-1400 for a 1080 ti that is insane...


I wouldn't pay that much either. Who can't wait 3-4 months until the situation clears? The situation isn't just demand-side; it is mostly supply-side. When Volta is released there will be a glut of 1180 / 2080 cards on the market at MSRP, cards that will be 15% faster. Yeah, so basically wait 3-4 months for a $700, 200W Titan Xp. If you're buying a 1080 Ti this late in the game at double what it's worth, you're not thinking straight. Sorry.

As I said, I don't think they are making Pascal cards at the moment. It's basically February and Volta is due out in May? March, April, May? Yeah, they've already retooled their production lines for Volta. This situation is unprecedented because of the cryptocurrency mining craze. This is how it usually goes: Nvidia stops what it's doing to prepare for the new architecture. The retooling really is that involved; you can't make both at the same time. They have completely different components (GDDR6 memory, for example). They are ramping Volta earlier in an attempt to meet demand, based on what happened with Pascal.

Think about it. 

Otherwise there would be more cards on the market already. You don't think Nvidia wants to make money? There is demand for something, you're a manufacturer, you will do whatever it takes to meet that demand. There are no more cards on the market, and there will be no more because, as I stated, THEY ARE NOT MAKING PASCAL AT THE MOMENT.

Feb, March, April, Volta.

This isn't exactly a new trend though, this happened going from Maxwell to Pascal too: 

https://www.eteknix.com/nvidia-may-stopped-geforce-gtx-980ti-production/

"One of the inevitable signs of an imminent release of new products is when the old model starts becoming hard to find. A seamless transition to the new version is a mark of good logistics and something Nvidia is known for. In line with the expectations for Pascal, Nvidia has reportedly stopped shipping GTX 980Ti’s to their AiB partners, which indicates that Nvidia is winding down the supply chain for the high-end card.

A stop in GTX 980Ti production points to a Pascal chip to replace it coming soon down the line. Usually, shipments to stores are ahead by a month and production a month or so before that. If Nvidia stops supplies now, there will still be about 2-3months before supplies run low. This puts the timeframe smack dab during Computex where Pascal is expected to be launched. It seems like perfect timing for GTX 980Ti supply to dry up just as Pascal launches and becomes available."

What's different this time around is that, again, in all caps, THERE WILL BE NO SURPLUS OUTGOING ARCHITECTURE FOR OUTLETS AND 3RD PARTY BOARD MANUFACTURERS TO OFFLOAD ON SALE BECAUSE OF THE CRYPTOCURRENCY MINING BOOM PRECISELY COINCIDING WITH THE SWITCH OVER IN PRODUCTION FROM THE OLD TO THE NEW ARCHITECTURES. 

Again, THINK ABOUT IT. The above happened as well, they straight up STOPPED making Maxwell a few months before Pascal but you couldn't tell BECAUSE THERE WAS NO CRYPTOCURRENCY BOOM THEN. 

I can't believe I'm literally the only person in this entire thread saying this. 

Do we have collective amnesia and / or a loss of our critical faculties? 

It's basic common friggin sense AND IT'S HAPPENED BEFORE. 

"Uhhhh, I don't know why there are no more Nvidia GPU's for sale......"

With this basic understanding it's axiomatic that THE CURRENT INSANE PRICE TREND IS NOT SUSTAINABLE AND WILL BE ALLEVIATED WHEN A GLUT OF VOLTA HITS THE MARKET. 

Again. 

The situation IS NOT ONLY DEMAND SIDE. 

Demand is NOT that unprecedented. 

The situation IS MOSTLY SUPPLY SIDE. 

Again, in basic language even a toddler can understand: 

Johnny and his neighbor Annie sell apples. Johnny's farm is hit by blight, and he stops producing apples. Annie tries to meet demand but can't grow more overnight, so she prices her remaining inventory at whatever the market will bear. Johnny sees all the money Annie is making and wants to sell apples too, but can't, because he has no apples to sell. The demand for apples is so great that even at twice the normal price Annie sells out, and there is an apple shortage in the area lasting until next season, when Johnny's farm recovers.

JOHNNY IS NVIDIA AND ANNIE IS AMD. 

Do you understand basic economics now? 

NOW, WITH THIS BASIC UNDERSTANDING OF ECONOMICS, KNOWING THAT THE SITUATION IS TRANSIENT AND THAT THERE WILL BE A GLUT OF VOLTA CARDS ON THE MARKET AT MSRP IN 4 MONTHS, ARE YOU GOING TO GO OUT AND PURCHASE A 300W 1080 TI FOR $1400 THAT WILL BE AS FAST AS THE $600, 160W 1170/2070? IF YOU ANSWERED IN THE AFFIRMATIVE, I HAVE SOME BEACHFRONT PROPERTY IN MIAMI, FLORIDA I WOULD LIKE TO SELL YOU.
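The apples story can be made concrete with a toy unit-elastic demand model, purely illustrative (constant total spend, so price times quantity stays fixed): halve the supply and the clearing price doubles, which is roughly what completed auctions have been showing.

```python
def clearing_price(base_price, base_supply, new_supply):
    """Toy unit-elastic demand: price * quantity is constant,
    so halving supply doubles the clearing price."""
    return base_price * base_supply / new_supply

# A $700 MSRP card with supply cut roughly in half (arbitrary units of supply):
print(clearing_price(700, 100, 50))  # 1400.0
```

Real demand curves aren't unit-elastic, and mining demand shifts the curve as well as moving along it, but the direction of the effect is the same: a supply cut alone is enough to produce roughly doubled street prices.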


----------



## ThrashZone

Hi,
My local Micro Center has some 1080 Ti's in stock; they just changed the way they list the prices (pretty much doubled), but they do have price discounts for builders.
I'm set for now though, but all they had were crappy Asus cards :yuck:


----------



## keikei

Hell, at this rate I'm willing to put my Zotac blower up for sale. If miners want to pay these kinds of prices, who am I to stop them?


----------



## Mooncheese

Following up on my last post: if you have half a brain and can go 4 months without PC gaming, now is the time to sell your 1080 Ti for $1000.

Edit: 

I posted this right after keikei's post lol, well at least someone here doesn't have fried macaroni for brains.


----------



## Mooncheese

And yes, you can turn any LED off, be it a Star Wars Titan Xp or any illuminated GeForce reference card:

https://forums.evga.com/Official-Nvidia-LED-Visualizer-App-download-m2637454.aspx


----------



## fat4l

Go go go 

£999
https://www.nvidia.co.uk/geforce/pr...-galactic-empire-titan-xp-collectors-edition/

and the same card costs £1,149?? Doesn't make sense, does it 
https://www.nvidia.co.uk/titan/titan-xp/


----------



## fat4l

Btw, anyone with the MSI GTX 1080 Ti Sea Hawk EK X?
What do you think about the card? How does it run, OC, overall experience? Thanks


----------



## Mooncheese

fat4l said:


> Btw anyone with MSI GTX 1080Ti Sea HawK EK X ??
> What do you think about the card, how does it run, OC, experience ? Thanks


It looks like a solid card, but there was an issue with MSI forgetting to take the protective plastic film off of the thermal pads that interface the block with the VRM/MOSFETs, memory, etc. I'm not sure if that issue has been rectified. EK makes good stuff and MSI makes good stuff.


----------



## SavantStrike

fat4l said:


> Btw anyone with MSI GTX 1080Ti Sea HawK EK X ??
> What do you think about the card, how does it run, OC, experience ? Thanks


It's a gaming X with an EK block. If you can get one for a decent price then go for it, but before pricing went nuts the gigabyte waterforce wb was only $30 more with better VRMs and a much higher 375W TDP. The waterforce cards are over a grand now so ignore them.


----------



## Mooncheese

SavantStrike said:


> It's a gaming X with an EK block. If you can get one for a decent price then go for it, but before pricing went nuts the gigabyte waterforce wb was only $30 more with better VRMs and a much higher 375W TDP. The waterforce cards are over a grand now so ignore them.


TDP isn't that important when most 1080 Tis can't do more than 2050 MHz. My FE holds 2038 MHz at 300W and 1.0v just fine, but I believe I won the silicon lottery (keeping the core at 35C with a full water block definitely helps lol). Maybe 10% of 1080 Tis can do 2100 MHz, and Pascal isn't responsive to pumping voltage into it. This isn't Kepler. 

So 375W is more or less a marketing gimmick.


----------



## SavantStrike

Mooncheese said:


> TDP isn't that important with most 1080 Ti's unable to do more than 2050 MHz. My FE holds 2038 MHz with 300W at 1.0v just fine but I believe I won the Silicon Lottery (keeping the core at 35C with a full water block definitely helps lol). Maybe 10% of 1080 Ti's can do 2100 Mhz and Pascal isn't responsive to pumping voltage into it. This isn't Kepler.
> 
> So 375W is more or less a marketing gimmick.


I found that I could hit higher clocks when I wasn't limited to 300W. I guess YMMV.


----------



## kithylin

A couple of posts back some folks were claiming a new Nvidia driver was making their cards run hotter. Has anyone else seen this? It got buried by the new-card-availability discussion.


----------



## mouacyk

Mooncheese said:


> It looks like a solid card but there was an issue with MSI forgetting to take the protective plastic film off of the thermal tape that interfaces the block and the VRM / MOSFET and memory etc. I'm not sure if that issue has been rectified. EK makes good stuff and MSI makes good stuff.


You gotta be ****ting me... I bought this card to get warranty on a pre-installed waterblock. Haven't had issues with mine since buying it in Oct, but something like that does make me feel uneasy about not taking it apart and inspecting for myself.


----------



## Mooncheese

SavantStrike said:


> I found that I could hit higher clocks when I wasn't limited to 300W. I guess YMMV.


If you undervolt you can sustain higher clocks with less wattage; e.g., my FE holds 2038 MHz at 35-37C with only 1.0v (not 1.065v). Going in the other direction, or just leaving the voltage at stock (1.065-1.081v), the clocks dip all the way down to 1925 MHz. Temperature matters the most with the 1080 Ti, not voltage or wattage. 



kithylin said:


> A couple posts back some folks were claiming some new nvidia driver was making their cards run hotter. Has anyone else seen this? It sort of fell off to the new-card-availability discussion.


I don't update my driver unless there's a title I'm trying to play that will benefit from it. I'm usually around 4-6 months behind whatever the current driver is. Why not revert to an older driver? 



mouacyk said:


> You gotta be ****ting me... I bought this card to get warranty on a pre-installed waterblock. Haven't had issues with mine since buying it in Oct, but something like that does make me feel uneasy about not taking it apart and inspecting for myself.


http://www.overclock.net/forum/69-nvidia/1609397-msi-left-plastic-1080-seahawk-ek-x.html


----------



## Mooncheese

Try undervolting via the MSI AB frequency curve and report back. 

I too was under the impression that more voltage and more wattage was the way to go, but it was thanks to members in this thread that I experimented with undervolting the 1080 Ti, and I have to say it is the way to go. Expect a 4-5C cooler core temp and sustained clocks that you couldn't hold with more voltage and wattage.
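Rough math for the skeptics: to first order, dynamic power in CMOS scales with V² × f, so dropping from 1.065v to 1.0v at the same clock should cut dynamic power by roughly 12%. A sketch only; this rule of thumb ignores leakage and the card's own power management:

```python
# First-order CMOS dynamic-power model: P is proportional to V^2 * f.
# A sketch only -- ignores static leakage and boost behaviour.

def dynamic_power_ratio(v_new, v_old, f_new, f_old):
    """Ratio of new dynamic power to old under the V^2 * f model."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# 1.065 V stock vs 1.0 V undervolt at the same 2038 MHz clock:
ratio = dynamic_power_ratio(1.0, 1.065, 2038, 2038)
print(f"{ratio:.0%}")  # -> 88%
```

Less power burned at the same clock means a cooler core, which in turn means GPU Boost stops shaving bins off your clock.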

Edit: 

I'm not fond of the new format of this forum; it feels like a straight-up downgrade. For one, commenting here results in a duplicate-post error when I've only posted once.


----------



## falcon26

Yeah I think I'll just wait for the next gen of cards from Nvidia


----------



## ZealotKi11er

falcon26 said:


> Yeah I think I'll just wait for the next gen of cards from Nvidia


I am worried because there will be a lot of thirsty gamers and a lot of miners with money to buy the next gen stuff.


----------



## mAs81

ZealotKi11er said:


> I am worried because there will be a lot of thirsty gamers and a lot of miners with money to buy the next gen stuff.


It's going to be a race to the checkout for sure..

I think that I'll wait some more myself before getting a high end gpu..The prices are ridiculous here..

Both the 1080 and 1080Ti went up 100 euros at least in the past few weeks..


----------



## ThrashZone

Hi,
Yep, I bought a Titan Xp in advance of selling my 1080 Ti at an obscene price next week 
If it doesn't sell, no biggie, it will go in my X99 rig
Sold both my 980s, which paid for more than half ($800 US) of the new Titan Xp


----------



## hotrod717

So, having watched stock for the past two weeks straight: is it just me, or does Newegg really have stock and they're piecemealing cards out to charge a higher price? The same 6 or so cards come into stock every day and sell out within 10 minutes. I can't believe they're getting a shipment every single day from every manufacturer.


----------



## ThrashZone

Hi,
Newegg always posts what they have, if it's sold and shipped by them 

Amazon is the one that sells what they don't have and just keeps people waiting... until they do have the items 

Happened to me 3 times in the last 2 months, sold and shipped by Amazon too, and I thanked them by canceling = they have no clue what they have in stock


----------



## SavantStrike

ZealotKi11er said:


> I am worried because there will be a lot of thirsty gamers and a lot of miners with money to buy the next gen stuff.


Which is why I went with a Titan Xp. The 1180 probably won't beat it by more than 10-15 percent, and I have it now, simply by selling my 1080 Ti.


----------



## Mooncheese

ThrashZone said:


> Hi,
> Yep I bought a titan xp in advance of selling my 1080ti at an obscene price next week
> If it doesn't sell no biggie it will be in my x99 rig
> Sold both my 980's which paid for more than half 800.us of the new titan xp


Smart move. I honestly wish I'd done the same earlier, when you could still get a Titan Xp. 



ThrashZone said:


> Hi,
> Newegg always post what they have if sold and shipped by them
> 
> Amazon is the ones that sell what they don't have and just keep people waiting... until they do have the items
> 
> Happened to me 3 times in the last 2 months sold and shipped by amazon too and I thanked them by canceling = they have no clue what they have in stock


This is how Amazon.com is out-competing smaller distributors. Their logistics are SO fast that even if an item isn't in stock they can accept an order, wait 3-4 days in the hope of getting some from the manufacturer, and still be as fast as Newegg, NCIX (bankruptcy), and B&H Photo. Couple that with their no-questions-asked return policy, whereas Newegg is exchange-only within 30 days, and it's no wonder the others can't compete. All of that takes a lot of money, though, and it's somewhat true that Jeff Bezos is running nearly at the margins to pay for that competitive advantage (taking 100% of items back, without any question as to the nature of the refund and without imposing a fee). The logic is that one by one Amazon.com will completely replace all of the other outlets. It's inevitable: Amazon.com is the Walmart of the internet, yes, that Walmart that hollowed out towns and cities across the country (which were already beginning to rot because of the offshoring of industry), destroying small businesses. 



SavantStrike said:


> Which is why I went with a titan xP. The 1180 probably won't beat it by more than 10-15 percent, and I have it now simply by selling my 1080 TI.


Smart. Actually, the 9980 GTX Mark 7 (yes, this is supposedly the number they are going with, not 2080 or 1080) will probably only be as fast as the Titan Xp, going by previous architectural jumps (the GTX 980 was more or less equivalent to the OG Titan X).


----------



## ZealotKi11er

Mooncheese said:


> Smart move, I honestly wish I done similar earlier when you could get Titan Xp.
> 
> 
> 
> This is how Amazon.com is out competing smaller distributors. They're logistics are SO fast that even if an item isn't in stock they can accept an order and wait 3-4 days in the hope of getting some from the manufacturer and still be as fast as Newegg, NCIX (bankruptcy), and B&H Photo. Couple that to their no questions asked return policy, whereas Newegg has exchange only within 30 days, and it's no small wonder the others cant compete. All of that takes a lot of money though, and its somewhat true that Jeff Bezos is running nearly in the margins to pay for that competitive advantage (taking 100% of items back without any question as to the nature of the refund without imposing a fee etc) and the logic is that one by one Amazon.com will completely replace all of the other outlets (it's inevitable, Amazon.com is the Wall Mart of the internet, yes that Wal Mart that hallowed out towns and cities across the country }well they were already beginning to rot because of the offshoring of industry) destroying small businesses.
> 
> Smart, actually 9980 GTX Mark 7's (yes this is the number they are going with, not 2080 or 1080) will probably only be as fast as the Titan Xp going by previous architectural enhancements (GTX 980 more or less eqv. to OG Titan X).


Nvidia has to have some goodies over what would otherwise be Pascal-level cards, even without the raw performance. Also, Amazon took my order for 1070 Tis and canceled it 10 years later, saying they weren't getting more stock from the manufacturer.


----------



## pez

Newegg's return policy on mobos and GPUs is non-existent, to the point it's not even worth ordering that kind of stuff from them.


----------



## kithylin

Mooncheese said:


> I don't update my driver unless it's a title that I'm trying to play that will benefit from it. I'm usually around 4-6 months behind whatever the current driver is. Why not revert to an older driver?


I'm the same way; I don't update drivers very often, probably a few months behind as well. That's why I was asking: if the current driver today is causing slightly higher temps, I may hold off even longer and just update in another 1-3 months instead.


----------



## Scotty99

Would you guys take $1050 for a Strix 1080 Ti? Only one guy has replied on Craigslist and that's his offer; they are out of stock everywhere online. Kind of weird asking whether making 250 dollars is something I should do, but maybe I should hold out for more lol?


----------



## ZealotKi11er

Scotty99 said:


> Would you guys take 1050 for a strix 1080ti? Only one guy has replied on craigslist and thats his offer, they are out of stock everywhere online. Kind of weird asking if making 250 dollars is something i should do, but maybe i should hold out for more lol?


I would not bother at that price.


----------



## Scotty99

ZealotKi11er said:


> I would not bother that at price.


So you think I should wait it out for more? I just don't know how long this shortage is gonna last, and I'd rather not use eBay, for reasons.


----------



## keikei

An easy $250? Sure.


----------



## SavantStrike

Scotty99 said:


> So you think i should wait it out for more? I just dont know how long this shortage is gonna last, and id rather not use ebay for reasons.



I think zealot is saying don't buy it at that price.

As far as selling it at that price, remember that with a local sale you save the 10 percent eBay final-value fee plus 4 percent PayPal fees, so netting $1050 locally is like selling for about $1,220 on eBay, and you don't get a buyer returning a now-opened card in 29 days (or worse, filing a bogus PayPal claim 3 months from now).

I would sell at that price if you don't have any other buyers, or give it another day or two and hold out for another hundred bucks. I wouldn't sit on it for a week.
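For the exact break-even, divide the local price by (1 − fee rate) rather than just adding the fees on top. A quick sketch, assuming flat 10% + 4% fees (real fee schedules vary by category and are capped):

```python
# Break-even eBay listing price for a fee-free local sale.
# Assumes flat 10% eBay final-value + 4% PayPal fees; real schedules vary.

def ebay_equivalent_price(local_price, fee_rate=0.10 + 0.04):
    """Listing price that nets the same cash after fees as a local sale."""
    return local_price / (1 - fee_rate)

print(round(ebay_equivalent_price(1050), 2))  # -> 1220.93
```

In other words, an eBay listing has to clear about $1,220 before the buyer's-remorse risk even enters into it.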


----------



## nrpeyton

hey is it just me or has the layout of this whole website/forum changed?


----------



## keikei

nrpeyton said:


> hey is it just me or has the layout of this whole website/forum changed?


http://www.overclock.net/forum/3-ov...ion/1647313-new-platform-feedback-thread.html


----------



## Mooncheese

nrpeyton said:


> hey is it just me or has the layout of this whole website/forum changed?


Neo is beginning to see the Matrix.


----------



## nrpeyton

Guys,

Got a really weird problem going on with my 1080Ti. 

Wondering if anyone has experienced anything similar:

My memory overclock isn't working correctly. The best way I can explain it is that there is some 'bug' causing a default negative offset. For example, even with the memory at +1000 (the maximum in MSI Afterburner), it still only does 1500 MHz.

See the screenshot below. If I decrease the memory OC (from +1000 down to, say, zero), my memory is actually massively UNDERCLOCKED, even lower than default.

[screenshot]
I recently upgraded to latest nvidia drivers and I also have an old AMD Radeon 6950 GPU in the system.

Hope someone can help?

Thanks,

Nick


----------



## nrpeyton

And with memory set to zero in MSI Afterburner:


----------



## kithylin

nrpeyton said:


> And with memory set to zero in MSI Afterburner:


You need to understand how the software is reporting on multipliers.

The memory reference clock from Nvidia is 1251 MHz, for an effective speed of 10,008 MHz (1251 * 8). Now, MSI Afterburner works on a *4 multiplier, so the default in Afterburner shows as 5005 MHz (1251 * 4 = 5004/5005 MHz). GPU-Z, on the other hand, shows the actual base clock of 1251 MHz. So when you add +1000 MHz in MSI Afterburner, what you are seeing is correct and the OC is applying correctly: 5005 + 1000 = 6005 MHz on the *4 scale. In GPU-Z (base clock) that is +250 MHz (1000 / 4), i.e., 1501 MHz (1251 + 250). And in the end, adding +1000 MHz in Afterburner adds +2000 MHz to the effective rate: 1501 * 8 = 12,008, vs the 10,008 reference.

See nvidia reference clock here: https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units#GeForce_10_series
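The multiplier arithmetic can be sanity-checked in a few lines. A sketch, assuming the usual conventions (GPU-Z shows the base clock, Afterburner shows and offsets 4x the base, GDDR5X runs at 8x effective) and using the 1080 Ti's 1376 MHz reference base (the 1251 MHz figure is the GTX 1080's):

```python
# Worked version of the multiplier arithmetic, assuming: GPU-Z shows the
# base memory clock, MSI Afterburner shows/offsets 4x the base, and
# GDDR5X's effective rate is 8x the base.

BASE_MHZ = 1376  # GTX 1080 Ti reference base (1251 MHz is the GTX 1080's)

def memory_clocks(afterburner_offset_mhz, base_mhz=BASE_MHZ):
    """Return (gpu_z_mhz, afterburner_mhz, effective_mhz) for an AB offset."""
    base = base_mhz + afterburner_offset_mhz / 4  # AB offsets are on the 4x scale
    return base, base * 4, base * 8

print(memory_clocks(1000))  # -> (1626.0, 6504.0, 13008.0)
```

So a +1000 offset should read 1626 MHz in GPU-Z, which matches the screenshot, and stock (+0) works out to the 11,008 MHz effective rate on the box.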


----------



## nrpeyton

kithylin said:


> You need to understand how the software is reporting on multipliers.
> 
> The actual memory reference clock for the 1080 Ti from nvidia is actually is 1251 Mhz. The actual effective speed is 11,008 Mhz (1251 * 8). Now, MSI Afterburner works on *4 multiplier. So the default in MSI afterburner is going to show 5005 mhz because that is the nvidia default (1251 * 4 = 5004 / 5005 mhz). GPU-Z on the other hand shows the actual base clock of 1251 Mhz. So when you add +1000 mhz in MSI Afterburner, you are actually seeing correct and the OC is applying correctly. 5005 Mhz + 1000 = 6005 Mhz. Which if you look in gpu-z however (which shows the base clock of 1251 Mhz), you are getting +250 Mhz (6005 / 4), or 1501 (1251 + 250 = 1501). And in the end, you adding +1000 mhz to memory in MSI Afterburner is adding almost +2000 Mhz to the effective rate: 1501 * 8 = 12008, vs the nvidia reference 11,008
> 
> See nvidia reference clock here: https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units#GeForce_10_series


But on the last page (my 1st screenshot), with a +1000, the first screen of GPU-Z is showing 1626 but the 2nd screen (sensors) is showing 1500.

Normally, before this bug, a +670 would be 1544 MHz (or 12,352 MHz effective). But currently, with a +670 for example, the sensors tab (and also HWiNFO64) is only showing 1417 MHz (or 11,340 MHz effective).

Moreover, the fact that a +1000 is running perfectly fine with over 300 watts going through my GPU should be enough to know something isn't right lol; my silicon lottery luck isn't that good. My best memory overclock on this card with low temps (using a water chiller) has always been about +685.

At default it should be 1375 (or 11,000), not 1250 (or 10,000). It's almost like the memory is reporting as GDDR5 and not GDDR5X?? _(my AMD card has GDDR5 memory at a base clock of 1250)_


----------



## kithylin

nrpeyton said:


> But on the last page (my 1st screenshot) with a +1000, the first screen of GPU-Z is showing 1626 but the 2nd screen (sensors) is showing 1500).
> 
> Normally before this bug a +670 would be 1544mhz (or 12,352mhz). But currently with a +670 for example the sensors tab (and also HWINFO64) is only showing 1417mhz (or 11,340mhz).


My above post was using the 1080 numbers, not the 1080 Ti; the 1080 Ti defaults to 1376 * 8. But you get the idea: the different tools report on different multipliers, and the multipliers apply fine. My 1080 Ti only takes +700 MHz over stock (Afterburner), which results in 1551 in GPU-Z, 6210 in Afterburner, and 12,420 effective. Can't go a single +1 MHz higher, and that's a pretty nice overclock on RAM; I'm happy with it.

I don't know what you're calling a "bug"; it all seems normal in your posts to me.


----------



## nrpeyton

kithylin said:


> My above post was reading the 1080 numbers not 1080 Ti, 1080 Ti defaults to 1376 * 8, but you get the idea, the different softwares run on multipliers and the multipliers apply fine. My 1080 Ti only takes +700 mhz over stock (afterburner), which results in 1551 in GPU-Z, 6210 in Afterburner, and 12,420 effective. Can't go a single +1 mhz higher and that's a pretty nice overclock on ram and I'm happy with it.
> 
> I don't know what you're claiming for a "Bug", it all seems normal in your posts to me.



Okay, look closely at my screenshots please. On one tab in GPU-Z, when my memory is at a ZERO overclock (default), it is reporting 1375. But on the other screen of GPU-Z (the sensors screen), and also in HWiNFO64, it isn't showing 1375, it is showing 1250 MHz. No 1080 Ti runs 1250 MHz memory at default?

How is this normal lol?


----------



## kithylin

nrpeyton said:


> Okay look closely at my screenshots please. On one tab on GPU-Z when my memory is at a ZERO overclock (default), it is reporting 1375. But on the other screen of GPU-Z (sensor screen) and also in hwinfo64 it isn't showing as 1375 it is showing as 1250mhz. No 1080Ti runs at 1250MHZ memory at default.?
> 
> How is this normal lol?


HWiNFO is probably just misreporting. I stopped using it years ago when I observed gross misreadings on my older 775 systems when overclocked to extremes (Core 2 Duo @ 4600+ MHz): HWiNFO reported CPU temperatures +25C above CoreTemp and AIDA64. I'm guessing here, but I think GPU-Z and Afterburner are likely updated much more frequently, and as a result probably much more accurate for current-gen cards.


----------



## ZealotKi11er

Mining?


----------



## KedarWolf

nrpeyton said:


> Okay look closely at my screenshots please. On one tab on GPU-Z when my memory is at a ZERO overclock (default), it is reporting 1375. But on the other screen of GPU-Z (sensor screen) and also in hwinfo64 it isn't showing as 1375 it is showing as 1250mhz. No 1080Ti runs at 1250MHZ memory at default.?
> 
> How is this normal lol?


Yes, yours is messed up.

You try without the second card in?

And how do you insert a picture? I only have options for an attachment. 

Here's mine.

[screenshot]

Okay, now the 'Insert Image' option only has the option to use a URL.


----------



## kithylin

KedarWolf said:


> Yes, yours is messed up.
> 
> You try without the second card in?
> 
> And how do you insert a picture, I only have options for an attachment.
> 
> Here's mine.


http://www.overclock.net/forum/1516...e-between-basic-enhanced-wysiwyg-editors.html


----------



## 2ndLastJedi

nrpeyton said:


> Okay look closely at my screenshots please. On one tab on GPU-Z when my memory is at a ZERO overclock (default), it is reporting 1375. But on the other screen of GPU-Z (sensor screen) and also in hwinfo64 it isn't showing as 1375 it is showing as 1250mhz. No 1080Ti runs at 1250MHZ memory at default.?
> 
> How is this normal lol?


This is what you should be seeing @ +0. As KedarWolf said, remove the other card and try again.


----------



## KedarWolf

kithylin said:


> http://www.overclock.net/forum/1516...e-between-basic-enhanced-wysiwyg-editors.html


The 'Insert Image' option only has an option to use a URL, not embed a picture.


----------



## ThrashZone

nrpeyton said:


> Okay look closely at my screenshots please. On one tab on GPU-Z when my memory is at a ZERO overclock (default), it is reporting 1375. But on the other screen of GPU-Z (sensor screen) and also in hwinfo64 it isn't showing as 1375 it is showing as 1250mhz. No 1080Ti runs at 1250MHZ memory at default.?
> 
> How is this normal lol?


Hi,
Try 3DMark and juice the card; see what it reports the clocks at. 
Yesterday mine looked like it was stuck at 1500 for both core and mem :/
Yeah, I believe it's the newer drivers; I was on 385.69 with no problem hitting close to 2k, 
but with 388.16 I believe it hit a cap of 1500.
Only recovering a system image fixed the issue, but I was only going by what 3DMark was showing, not anything else.

By the way, that's a +1000 on memory you showed


----------



## ThrashZone

KedarWolf said:


> Yes, yours is messed up.
> 
> You try without the second card in?
> 
> And how do you insert a picture, I only have options for an attachment.
> 
> Here's mine.
> 
> Okay, now 'Insert Image' option only has the option to use a URL.


Hi,
It's weird, but use this: 
Right-click the thumbnail image to get Copy Image Location and paste it into your reply.
Exclude the gibberish on the end (=..... can't remember all of it) 


Code:


[img]paste-url[/img]


----------



## Mooncheese

pez said:


> Newegg's return policy on mobos and GPUs are non-existent to the point it's not even worth ordering that kinda stuff from them.


Exactly. Motherboards are the most problematic part of a new PC; I would venture to say that 25% or more have problems. Right now I've lost one of the USB ports on my Gigabyte Z370 Aorus Gaming 7, and it's only two months old and nothing should have taken the port out (a Logitech G900 Chaos Spectrum was occupying the port; it would charge but not remain connected to the computer). Not sure what I should do to troubleshoot this.


----------



## Kylar182

Have FE model. Is there a bios that unlocks Power/Voltage? (Water cooled cards)


----------



## Scotty99

Nvm wrong thread, lulz.


----------



## DarthBaggins

Kylar182 said:


> Have FE model. Is there a bios that unlocks Power/Voltage? (Water cooled cards)


Might want this myself, but I have an MSI Aero OC that uses the factory PCB, just tweaked by MSI. Maybe an OC BIOS from a factory-PCB-based card might allow the features you're looking for (never flashed a GPU personally)


----------



## feznz

nrpeyton said:


> Okay look closely at my screenshots please. On one tab on GPU-Z when my memory is at a ZERO overclock (default), it is reporting 1375. But on the other screen of GPU-Z (sensor screen) and also in hwinfo64 it isn't showing as 1375 it is showing as 1250mhz. No 1080Ti runs at 1250MHZ memory at default.?
> 
> How is this normal lol?



I suspect your highly OCed card is slowly corrupting Windows. I remember when you had that line fault. I have learnt over time that chasing the last 2-3% of performance will result in file corruption; it may only happen once a week or so, but over time you will accumulate more corruption until the inevitable clean install is the only way to fix it.
Pro OCers will have a cut-down OS on several SSDs; when going for world records they may corrupt several OSes, so they plug in another SSD to combat this.


----------



## Unnatural

Hi, I currently own 2 1080ti, an EVGA FE and a MSI Aero OC; since I abandoned my original plans to have 2 different pcs, I now want to sell one, and I'd need advice on which one to keep. Since, from what I understand, the MSI is basically just a FE with factory OC, and since it will be watercooled, I'd keep the EVGA for the better support. Am I missing some other aspect?


----------



## kithylin

Unnatural said:


> Hi, I currently own 2 1080ti, an EVGA FE and a MSI Aero OC; since I abandoned my original plans to have 2 different pcs, I now want to sell one, and I'd need advice on which one to keep. Since, from what I understand, the MSI is basically just a FE with factory OC, and since it will be watercooled, I'd keep the EVGA for the better support. Am I missing some other aspect?


The EVGA one being FE will have better waterblock compatibility as well. I don't know if the MSI Aero OC uses a custom PCB or not.


----------



## gta1989

kithylin said:


> The EVGA one being FE will have better waterblock compatibility as well. I don't know if the MSI Aero OC uses a custom PCB or not.


I would keep the EVGA FE as it is a reference PCB. I do know the MSI Aero OC is a custom PCB and I have not seen a waterblock for it. Also, I think EVGA's warranty is far better than MSI's.


----------



## Kronos8

Unnatural said:


> Hi, I currently own 2 1080ti, an EVGA FE and a MSI Aero OC; since I abandoned my original plans to have 2 different pcs, I now want to sell one, and I'd need advice on which one to keep. Since, from what I understand, the MSI is basically just a FE with factory OC, and since it will be watercooled, I'd keep the EVGA for the better support. Am I missing some other aspect?


Check here https://www.ekwb.com/configurator/ for compatible blocks per model


----------



## DarthBaggins

kithylin said:


> The EVGA one being FE will have better waterblock compatibility as well. I don't know if the MSI Aero OC uses a custom PCB or not.


It uses the Nvidia FE PCB, just mounted the waterblock on mine


----------



## Scotty99

Welp, goodbye Asus Strix 1080 Ti, hello $250 in my pocket lol (sold for $1,050). The guy actually drove 3.5 hours from another state to get it; wonder if he posts on this forum lol.


----------



## GOLDDUBBY

Scotty99 said:


> Welp goodbye asus strix 1080ti hello 250 dollars in my pocket lol (sold for 1050.00). Guy actually drove 3.5 hours from another state to get it, wonder if he posts on this forum lol.


I wouldn't have sold it. Could have just mined on it for 2 months and made that money on top.


----------



## Scotty99

GOLDDUBBY said:


> I wouldn't have sold it. Could have just mined on it for 2 months and made that money on top.


Ya, I could've, but then I woulda felt dirty...


----------



## xxicrimsonixx

I got 2x 1080ti a couple weeks ago.  I must say they are really powerful, I am able to max everything at 4k.


----------



## kithylin

GOLDDUBBY said:


> I wouldn't have sold it. Could have just mined on it for 2 months and made that money on top.


Yeah, except mining on a card (for any duration) and then selling it without disclosing to the new owner that it was used for mining is possibly the most dirty, despicable, underhanded thing anyone can do. Personally I never sell my Nvidia cards; I still have all my old ones here and they all still work.


----------



## xxicrimsonixx

kithylin said:


> Yeah except mining on a card (for any duration) and then trying to sell it and not disclaiming to the new owner during the sale that it was used for mining is possibly the most dirty, despicable, underhanded thing anyone can do. Personally I never sell my nvidia cards, I still have all my old ones here and they all still work.


Mining's largest impact is on the fans of the GPU cooler, which are mechanical devices with motors that can wear out. GPU performance doesn't degrade over time.


----------



## kithylin

xxicrimsonixx said:


> Mining has the largest impact on the fans for the cooler of the gpu, which are mechanical devices with motors that can wear out. GPU performance doesn't not degrade over time.


Under gaming, "workstation" programs, or "normal usage", no. However, running any gaming-class GPU 24/7/365 for coin mining non-stop for 2 or 3 years is going to have a serious impact on its life, and it most likely won't be very usable for gaming afterwards. Mostly it's the capacitors that wear out, being stressed at high temps constantly for a long period of time. It depends largely on the owner of the card and what they did with it. It's also sort of well known that video cards can run higher overclocks for coin mining than they can for 3D gaming, and some (not all, but some) GPU owners flash a high overclock on their cards, coin-mine with em for years, and then think they can just resell em at normal used prices afterwards, not tell anyone, and recover their money back as a business model.

In fact, coin mining is so destructive to the health of GPUs that people were buying em from Fry's Electronics, coin mining on em for even just 30 days, and then trying to return them as non-working on the 29th day, claiming they suddenly died. It was so bad that Fry's no longer allows anyone to return video cards for any reason.


----------



## xxicrimsonixx

kithylin said:


> Under gaming, or "work station" programs or "normal usage" no. However running any 'gaming' class gpu 24-7-365 for coin mining non-stop for 2 or 3 years is going to have a serious impact on it's life and most likely won't be very usable for gaming afterwards. Mostly it's the capacitors that wear out being stressed at high temps constantly for a long period of time. It depends largely on the owner of the card and what they did with it. It's also sort of well known that video cards can run higher overclocks for coin mining than they can for 3D gaming and some (not all.. but some) gpu owners flash a high overclock on their cards then coin-mine with em for years and then think they can just resell em for normal used prices afterwards, not tell anyone and recover their money back as a business model.


What if you leave it at a gaming-level OC and don't flash the BIOS? What happens if you keep temps at reasonable levels?


----------



## kithylin

xxicrimsonixx said:


> What if you leave it at gaming level OC, don't flash the BIOS. What happens if you keep temps at reasonable levels?


I don't know; I don't do mining. From everything I've seen of coin mining, it literally runs all GPUs at 80+ C constantly (and likely holds VRMs around 100+ C at the same time), to the point they're all thermal/power throttling, and they're expected to endure that for multiple years non-stop. I can't understand how anyone can punish a card like that and expect it to be used as a normal gaming GPU afterwards.

I can't find any examples at the moment, but I have seen multiple eBay listings of people selling video cards and disclaiming in the listing, "Artifacts when gaming, but works perfect for mining, no problem. XX hashes/second. Great mining card," and charging normal used prices for them. Usually it's a couple-of-years-old AMD card described like that, however.


----------



## ZealotKi11er

kithylin said:


> I don't know, I don't do mining. From everything I've seen of coin mining, it literally runs all gpu's @ 80+ C constantly (and likely hold VRM's around 100+ C at the same time) to the point they're all thermal/power throttling and expected to endure that for multiple years non-stop. I can't understand in my mind how anyone can punish a card like that and expect it to be used as a normal gaming gpu afterwards.
> 
> I can't find any examples at the moment but I have seen multiple listings on ebay of people selling video cards and disclaiming in the listing "Artifacts when gaming, but works perfect for mining no problem. XX hashes/second. Great mining card." and charging normal used prices for em. Usually it's couple years older AMD cards described like that however.



Mining load is usually less than gaming. Also, if a card artifacts in gaming it can't mine. It might seem like it's mining, but it will get a lot of errors and mine much less than it's reporting. 80C+ is way too much. Of the cards I mine with, only the reference 7970 hits 75C; all the other cards are under that. VRM temps are also under control, below 70C. This is because mining usually stresses either the memory or the GPU, not often both. If you do dual-mining then you're basically cooking the card. For example, a Fury X would hit ~51C core / 55C VRM mining one coin, but as soon as you hit it with dual coins the VRM hit 90C+ and the core 65C.


----------



## Unnatural

Kronos8 said:


> Check here https://www.ekwb.com/configurator/ for compatible blocks per model


Yes, I knew (and I also have waterblocks for both cards); I forgot to mention that. My question wasn't about the waterblock, but thanks anyway.


----------



## kithylin

ZealotKi11er said:


> Mining load is usually less then gaming. Also if a card artifacts in gaming it cant mine. It might seem its mining but it will get a lot of errors and mine much less than its reporting. 80C+ is way too much. With cards I mine only reference 7970 hits 75C. All other cards are under that. VRM also under control under 70C. This is because mining either uses memory or GPU and not many time both. If you do dual-mining then you basically cooking the card. For example Fury X would hit ~ 51C Core/ 55C VRM with mining 1 coin but as soon as you hit it with dual coins VRM hit 90C+ and core 65C.


Again, I don't do coin mining on my cards and I'm not exactly sure how it works today. But unless something has changed recently... I had a friend show me his mining setup in spring 2017, and all the video cards were at 100% load on the core constantly, the entire time, and never let off. Because of the excessive load they were all running at max temps and max fans constantly. I was therefore assuming that's how all coin mining works. My friend was mining Ethereum at the time. Not sure what you're doing differently from what he was doing; maybe people don't mine Ethereum anymore, damned if I know. And I'm pretty sure that when playing games with G-Sync / V-Sync on, the GPU is never constantly at 100% load the entire time with its fans maxed out, screaming and fighting to cool itself.


----------



## nrpeyton

Whenever I use my card to mine I run it on a water chiller, and I never exceed 0.9v (900mv), which my card can do 1999MHz at. Temps also never exceed 25C on the core (with a water temp of about 10-13C). And VRM temps (which I test using probes attached directly to the MOSFETs) never even reach 50C, usually hovering around 45C, which is quite healthy for components capable of running at 125C. That being said, while mining the card does indeed still pull a lot of wattage. Even at only 900mv (0.9v) it still pulls up to 325 watts, which is why I never EVER exceed 0.9v while mining.

My card will probably last much longer than one gaming every day at 1.093v @ 2100MHz and pulling roughly 250-300 watts.


----------



## kithylin

nrpeyton said:


> Whenever I use my card to mine I run it on a Water Chiller. And I never exceed 0.9v (900mv). Which my card can do 1999Mhz at. Temps also never exceed 25c on the core (with a water temp of about 10-13c). And VRM temps (which I test using probes attached directly to MOSFETS never even reach 50c. (Usually hovering about 45c). Which is quite healthy for components capable of running at 125c. That being said -- while mining the card does indeed still pull a lot of wattage. Even at only 900mv (0.9v) it still pulls up to 325 watts. Which is why I never EVER exceed 0.9v while mining.
> 
> My card is probably last much longer than someone gaming everyday at 1.093v @ 2100Mhz & pulling roughly 250-300 watts.


That's nice and all, but that's so far off from the typical use case of coin miners. Maybe you should take a moment to go into Google Images and search for "coin mining rig". Most of them are air cooled; almost no one uses a chiller per GPU for mining.

Anyway, that's pretty far off topic already.


----------



## nrpeyton

Having a really odd issue guys. (card is 1080Ti FE)

Need some help:

I can no longer go over 30HZ at 4k on my 4K Sony Bravia X85C 55" TV.


*What does work:*

TV HDMI signal set to "standard" _(in TV settings)_:
1080p 60hz
4k but only up to 30hz

TV HDMI Signal set to "advanced":
1080p 60hz only

For a start, the above seems back-to-front.

I should be able to set the HDMI signal format on the TV to "advanced", then in Windows set the TV's resolution to 3840 x 2160 @ 60Hz, and there should be no issues. But I'm having no luck:

I don't get a picture.


Here's what I've tried so far:
1) Replacing the HDMI cable with a brand new one (the new one is HDMI 2.0 certified and can do 4k at 60hz). (from cables.co.uk)
2) Ensuring TV's HDMI signal is set to "advanced" and not "standard" format.
3) Uninstalling ALL Nvidia, AMD & Intel GPU drivers using the "DDU" utility. (as posted on EVGA support forums). Rebooting, then re-installing *only* the Nvidia software/drivers.
4) Running a full in-depth virus scan using the latest edition of AVG & running a full Windows Defender scan.
5) Resetting TV back to factory defaults.
6) Checked that intel's iGPU is disabled in system BIOS. (and confirming this by checking in Windows that it's not displayed in GPU-Z -- which it isn't)!
7) Tried HDMI ports 1 & 2 on TV (there are 4). (despite the fact the TV settings give option for "standard" and "advanced" for ALL ports)
8) Tried setting the TV as the "primary" display device. & tried disconnecting my monitor.


Note: I do have an older AMD Radeon HD 6950 installed. But as mentioned above I uninstalled it. It was only re-installed automatically from Microsoft Update as part of boot process after I exited safe-mode. (I performed the clean uninstalls with DDU in safe-mode). I also currently have NO displays attached to the AMD card. I have no issues on my 2560 x 1440p monitor.


----------



## 2ndLastJedi

I have a KD55X8500B, and just using a DisplayPort-to-HDMI cable it's been nothing but plug and play. 
I'm only using DP-to-HDMI now due to the Rift, but I previously used a standard HDMI 2.0 cable and it was the same: plug and play 4K/60.


----------



## kithylin

nrpeyton said:


> Having a really odd issue guys. (card is 1080Ti FE)
> 
> Need some help:
> 
> I can no longer go over 30HZ at 4k on my 4K Sony Bravia X85C 55" TV.


I did some quick googling and found a couple of reviews for your TV that said there have been multiple firmware updates since its release, one of which addressed 4K @ 60Hz compatibility specifically.

I might suggest making sure your TV is currently up-to-date on all updates and try again.


----------



## nrpeyton

kithylin said:


> I did some quick googling and found a couple reviews for your TV that said there have been multiple firmware updates for your TV since it's release. One of which addressed 4K @ 60 hz compatibility, specifically.
> 
> I might suggest making sure your TV is currently up-to-date on all updates and try again.


I just checked on the TV's settings. It's on version 6.0.1 and is reporting as fully up to date. No new updates are currently available. :-(


----------



## DarthBaggins

nrpeyton said:


> I just checked on the TV's settings. It's on version 6.0.1 and is reporting as fully up to date. No new updates are currently available. :-(


Might be a corrupted update. I would try to get the update file from the manufacturer and flash it via a USB stick rather than over the web.


----------



## KCDC

DarthBaggins said:


> Might have a corrupted update, I would try to get the update file from the manufacturer and flash it via thumbstick rather than via web based.



I had to do this with my LG B7 to get the FW to properly update.


----------



## falcon26

Got a brand new Msi 1080 ti Gaming X off ebay for $1000. Which I guess is not too bad considering what others are selling for. I will now sell my 1080 for $500 to make up the cost a bit...


----------



## kithylin

falcon26 said:


> Got a brand new Msi 1080 ti Gaming X off ebay for $1000. Which I guess is not too bad considering what others are selling for. I will now sell my 1080 for $500 to make up the cost a bit...


Make sure you check prices... GTX 1080's are going for about $750 used today, not $500.


----------



## falcon26

Wait what? I thought they were around $500. Time to change my ad


----------



## xxicrimsonixx

falcon26 said:


> Wait what? I thought they were around $500. Time to change my ad


Lol, I was just about to say I would buy it from you for $500...


----------



## kithylin

falcon26 said:


> Wait what? I thought they were around $500. Time to change my ad


See here: https://www.ebay.com/sch/i.html?_fr...&LH_ItemCondition=3000&_trksid=p2045573.m1684


----------



## DerComissar

xxicrimsonixx said:


> Lol, I was just about to say I would buy it from you for $500...





kithylin said:


> Make sure you check prices... GTX 1080's are going for about $750 used today, not $500.


Yeah.

So a used 1080 going for $750 USD would be about $925 CAD.

Less shipping and any other fees, lol.


----------



## falcon26

Yeah, they sure are going for $650-750, and that's also without the box. I don't have the box for mine, so I will probably sell it for an even $600 then.


----------



## Johan45

nrpeyton said:


> Whenever I use my card to mine I run it on a Water Chiller. And I never exceed 0.9v (900mv). Which my card can do 1999Mhz at. Temps also never exceed 25c on the core (with a water temp of about 10-13c). And VRM temps (which I test using probes attached directly to MOSFETS never even reach 50c. (Usually hovering about 45c). Which is quite healthy for components capable of running at 125c. That being said -- while mining the card does indeed still pull a lot of wattage. Even at only 900mv (0.9v) it still pulls up to 325 watts. Which is why I never EVER exceed 0.9v while mining.
> 
> My card is probably last much longer than someone gaming everyday at 1.093v @ 2100Mhz & pulling roughly 250-300 watts.


I know this isn't the right quote, but if you're having trouble with the mem hitting the higher bin while mining, there are two things you can try:
Start a 3D app before you start the miner to boost it, which works sometimes.
Get Nvidia Inspector, which lets you change the lower P-states in Windows; that works most of the time for me, and it's how I got my 980 and 1070 Ti to stay where they should be.


----------



## nrpeyton

Tried that too. Memory is still stuck at a max of 1500MHz (and I need +1000 to get it to that). I'm running at 0.9v, 1999MHz, and +1000 mem (which gives me 1500MHz mem). I think it's due to some conflict with my AMD card, as when I leave memory at zero in MSI Afterburner it defaults to 1250 (the default memory clock of my AMD card). The default for a 1080Ti should be 1375MHz, I believe:
1375 x 2 = 2750
2750 x 4 = 11,000 MT/s effective (the "11 Gbps" rating)
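That multiply-out (x2 for double data rate, then x4 for GDDR5X's quad-pumped bus, x8 overall) can be sketched as a quick calculation; the helper name here is just illustrative:

```python
# Effective memory transfer rate from the base clock, mirroring the
# x2 (double data rate) and x4 (GDDR5X quad-pumped bus) steps above.
def effective_rate_mtps(base_clock_mhz: float) -> float:
    """Effective data rate in MT/s for a GDDR5X base clock in MHz."""
    return base_clock_mhz * 2 * 4

# Stock 1080 Ti memory: 1375 MHz base clock.
print(effective_rate_mtps(1375))  # 11000.0 MT/s, the "11 Gbps" spec
```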

It's a pity my TV won't give me the option to delete all updates and start again. It DOES give me the option to reset to factory default -- but that doesn't seem to remove any updates. (It does however delete apps etc that I've manually installed) etc.


----------



## Johan45

Did you try K-Boost through Precision X


----------



## ZealotKi11er

How bad is to buy 1080 Ti for 930 USD right now?


----------



## HoneyBadger84

ZealotKi11er said:


> How bad is to buy 1080 Ti for 930 USD right now?


They're going for $1k plus on EBay recently so that's not a horrible price, but I think you'd be better off waiting to get a top of the line cooler one for ~$875, or a Reference (for less than that) if you're going to liquid cool it.


----------



## SavantStrike

Given today's prices that's a reasonable number.


----------



## nrpeyton

Johan45 said:


> Did you try K-Boost through Precision X


I'll give that a try too. Thanks.


----------



## navit

ZealotKi11er said:


> How bad is to buy 1080 Ti for 930 USD right now?


EVGA puts some for sale on their website every night at the best prices you will see, from $729 and up. 
I got an FTW3 Hybrid for $850. To be honest that's not what I wanted, but I couldn't check out fast enough on the cheaper ones I was trying to get before they were gone. Can't really complain though; it's a great card that runs cool and quiet.


----------



## DerComissar

ZealotKi11er said:


> How bad is to buy 1080 Ti for 930 USD right now?


That's $1146.60 CAD atm, our dollar is a bit better to the USD now, but it still sucks having to pay that big conversion to CAD.

But looking at the current 1080Ti prices, I suppose that's a fairly good deal.


----------



## 8051

DerComissar said:


> That's $1146.60 CAD atm, our dollar is a bit better to the USD now, but it still sucks having to pay that big conversion to CAD.
> 
> But looking at the current 1080Ti prices, I suppose that's a fairly good deal.


The new normal is insanity as far as pricing for video cards.


----------



## nrpeyton

I feel so lucky.

I got my 1080Ti for *free* from EVGA as part of their "step-up" programme. I had a 1080 Classified and was allowed to step up to a 1080Ti FE (you only pay the difference), but when the 1080Ti launched it was only £700, which is what I paid for my 1080 Classified. Therefore I only paid for postage & packaging. 

I love EVGA. They really saved my skin this round, lol. No way I could have afforded that upgrade myself at today's prices.

I flashed the ASUS STRIX XOC firmware to my card and did the shunt mod. Slapped a water-block on it too. So my card is effectively _(for all intents and purposes)_ just as fast (if not faster) and more powerful than a Kingpin 1080Ti.


----------



## KCDC

nrpeyton said:


> I feel so lucky.
> 
> I got my 1080Ti for free from EVGA as part of their "step-up" programme. I had a 1080 Classified and was allowed to step up to a 1080Ti. (You only pay the difference). But when 1080Ti launched it was only £700. Which is what I paid for my 1080 Classified. Therefore I only paid for postage & packaging.
> 
> I love EVGA. They really saved my skin this round, lol. No way I could of afforded that upgrade myself at todays prices.


Lucky! Nice score! Especially now. Maybe it was a nod to gamers/non-miners as well.


----------



## HoneyBadger84

I'm sure EVGA will reconsider that program after how hard they're probably getting pummeled by miners with step-ups & RMAs; they'll probably shorten their RMA period & make step-ups only available for certain cards from now on.

I love EVGA, despite their "issues", and I hate to think they're being taken advantage of by the same people that did the stuff that made Fry's stop taking returns on video cards altogether (mining a card heavily for 28 days, then returning it for a refund).


----------



## nrpeyton

I agree.

However, not ALL miners are bad. What about people who bought their card to game (primarily) but later discovered mining? And now do a bit on the side when their machine would otherwise be sitting idle? 
(Also having no intention of returning the card or running it at a voltage high enough to cause any degradation over time, nor any intention of buying a second one.)

Many people may "mine" on the side, but their plans have otherwise not changed (in terms of how many cards they own or how frequently they would replace a card).

People who buy up 30 cards and set up "mining farms" are responsible for this, not the little guy who does a little bit on the side with the one card he would have had anyway.


----------



## HoneyBadger84

nrpeyton said:


> I agree.
> 
> However not ALL miners are bad. What about people who bought their card to game (primeraly) but also just do a bit of mining on the side when their machine would otherwise be sitting idling? (Also having no intension of returning a card or running it at a voltage high enough to cause any degradation over time).


Those people I have no problem with. It's the people abusing the system that suck. I don't even have a problem with someone that has a mining farm, as long as they're not abusing RMAs/returns when they know full well that mining cards 24/7 shortens their lifespan if they're OCed... I've been out of the game a while, so I don't know how damaging it is if you run them stock. I used to run Folding@home on a small farm of GTX 580s and R9 290Xs (not at the same time, the 290Xs came second) when I wasn't gaming on them, but I ran them at stock & had the blowers (they were stock blower cards) on 100% fan whenever they were folding, so they never really got HOT hot; they stayed under 70C.

Heck, I might even BE one of those people that mines when their system would otherwise be idle, depends on how difficult it is to setup etc & whether or not it's "worth it". Since we have/will have solar power soon, the power draw of one card part time won't be an issue.


----------



## ZealotKi11er

HoneyBadger84 said:


> Those people I have no problem with. It's the people abusing the system that suck. I don't even have a problem with someone that has a mining farm, as long as they're not abusing RMAs/Returns when they know full well mining cards 24/7 shortens their lifespan if they're OCed... I've been out of the game a while, so I don't know how damaging it is if you run them stock. I used to do [email protected] on a small farm of GTX 580s and R9 290Xs (not at the same time, the 290Xs came second) when I wasting gaming on them, but I ran them at stock & had the blowers (they were stock blower cards) on 100% fan whenever they were Folding, so they never really got HOT hot, they stayed under 70C.
> 
> Heck, I might even BE one of those people that mines when their system would otherwise be idle, depends on how difficult it is to setup etc & whether or not it's "worth it". Since we have/will have solar power soon, the power draw of one card part time won't be an issue.


I mine with my EVGA 1080 Ti, and I know for a fact that for me it's more worth it to keep the card safe than to have to RMA it; RMA time is wasted mining time. I run my card at 0.875v, which is far less than what the card uses stock, and it runs at 50C.


----------



## cekim

8051 said:


> The new normal is insanity as far as pricing for video cards.


On the bright side, the Titan Xp is now a "value" card - lol...


----------



## HoneyBadger84

cekim said:


> On the bright side, the Titan Xp is now a "value" card - lol...


And they're STILL in stock over 24hrs after coming back: https://www.nvidia.com/en-us/titan/titan-xp/ 

I will be amazed if we don't have a run on those, at least by gamers who see the obvious "1080s & 1080 Tis are selling for ridiculous prices on EBay, why not sell mine & get a Titan Xp practically free" path to an even better card. Especially if you're one of the lucky ones that does liquid cooling & already has a block that fits both the 1080/1080 Ti & the Titan Xp.


----------



## KCDC

HoneyBadger84 said:


> And they're STILL in stock over 24hrs after coming back: https://www.nvidia.com/en-us/titan/titan-xp/
> 
> I will be amazed if we don't have a run on those, at least by gamers who see the obvious "1080s & 1080 Tis are selling for ridiculous prices on EBay, why not sell mine & get a Titan Xp practically free" path to an even better card. Especially if you're one of the lucky ones that does liquid cooling & already has a block that fits both the 1080/1080 Ti & the Titan Xp.


I'm one of those and am seriously considering it.


----------



## nrpeyton

HoneyBadger84 said:


> Those people I have no problem with. It's the people abusing the system that suck. I don't even have a problem with someone that has a mining farm, as long as they're not abusing RMAs/Returns when they know full well mining cards 24/7 shortens their lifespan if they're OCed... I've been out of the game a while, so I don't know how damaging it is if you run them stock. I used to do [email protected] on a small farm of GTX 580s and R9 290Xs (not at the same time, the 290Xs came second) when I wasting gaming on them, but I ran them at stock & had the blowers (they were stock blower cards) on 100% fan whenever they were Folding, so they never really got HOT hot, they stayed under 70C.
> 
> Heck, I might even BE one of those people that mines when their system would otherwise be idle, depends on how difficult it is to setup etc & whether or not it's "worth it". Since we have/will have solar power soon, the power draw of one card part time won't be an issue.


Unless it's throttling due to temp/power, at *stock* the card would run at 1.062v and about 1800MHz with GPU Boost 3.0.

I run my card at 0.9v at 1999MHz. So while the card _is_ overclocked, the strain on it is actually LESS than it would be at stock. It will arguably last *longer* than a card left at stock with no overclocking/tweaks.

At stock the card would only be at 1747MHz at 0.9v. Moreover, memory overclocking (for most people) costs nothing: there are NO longevity/lifetime penalties for memory O/C'ing, simply because you aren't touching voltages.

I have a +252 O/C on the core and a +1000 O/C on my memory, and the card will still last longer than stock. lol 


P.S.
~Under-volting is *key.* Under-volting while simultaneously over-clocking ;-)
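As a back-of-the-envelope check on that claim: to first order, dynamic power in CMOS logic scales roughly as f x V^2, so the two operating points above can be compared like this (a rough sketch, not a measurement of this card):

```python
# First-order CMOS dynamic power estimate: P ~ f * V^2 (arbitrary units).
# This ignores static leakage and workload differences; it's only a
# rough way to compare the two operating points discussed above.
def relative_dynamic_power(freq_mhz: float, volts: float) -> float:
    return freq_mhz * volts ** 2

stock     = relative_dynamic_power(1800, 1.062)  # ~stock boost point
undervolt = relative_dynamic_power(1999, 0.900)  # undervolted overclock

print(f"undervolted point is ~{undervolt / stock:.0%} of stock dynamic power")
```

Despite the higher clock, the lower voltage wins: roughly 80% of stock dynamic power with these illustrative numbers.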


----------



## ZealotKi11er

Sapphire Fury X + Gigabyte RX 570 4GB for MSI 1080 Ti Duke? Good deal?


----------



## HoneyBadger84

KCDC said:


> I'm one of those and am seriously considering it.


DO ET! lol I still have a eVGA 1080 Ti FTW3 Hybrid on "order" whenever Amazon gets them back in stock, I'll be getting that just to resell it & make back almost all of the rest of what I spent on getting the Titan Xp (which was paid for 2/3s of the way by reselling my R9 295x2 for $1k on Ebay, thanks miners)


----------



## 8051

nrpeyton said:


> Unless throttling due to temp/power at *stock* the card would run at 1.062v and about 1800mhz with GPU boost 3.0.
> 
> I run my card at 0.9v at 1999Mhz. So while the card _is_ overclocked -- the strain on it is actually LESS than it would be at stock. It will arguably last *longer* than a card left to stock with no overclocking/tweaks.
> 
> At stock card would only be at 1747mhz at 0.9v. Moreover memory overclocking (for most people) costs nothing. There are NO longevity/lifetime penalties for memory O/C'ing simply because you aren't touching voltages.
> 
> P.S.
> ~Under-volting is *key.* Under-volting while simultaneously over-clocking ;-)


Memory doesn't work by itself; the memory controller on the GPU is actually quite complex, and it's being overclocked as well because it's synchronous.


----------



## OrionBG

nrpeyton said:


> I feel so lucky.
> 
> I got my 1080Ti for *free* from EVGA as part of their "step-up" programme. I had a 1080 Classified and was allowed to step up to a 1080Ti FE. (You only pay the difference). But when 1080Ti launched it was only £700. Which is what I paid for my 1080 Classified. Therefore I only paid for postage & packaging.
> 
> I love EVGA. They really saved my skin this round, lol. No way I could of afforded that upgrade myself at todays prices.
> 
> I flashed the ASUS STRIX XOC firmware to my card and done the shunt mod. Slapped a water-block on it too. So my card is also effectively _(for all intensive purposes)_ just as fast (if not even faster) and more powerful than a Kingpin 1080Ti.


How exactly did you do the shunt mod on your card and is there a difference if the card has the XOC BIOS already?


----------



## luckerr

Hello,

I did not want to ''spam'' this thread with my problem - so I created another one and post it here. If you don't mind.

http://www.overclock.net/forum/69-nvidia/1652105-asus-strix-gtx-1080ti-oc-lag-spikes-stuttering.html

If anyone of you has some experiance with stuttering / or does not have - please check my problem.

Thank you very much


----------



## KickAssCop

What are people experiences with AIO solutions. I have 2 EVGA SC2 Hybrids and 6 MSI Sea Hawks X on order. Anyone here deal with bad pumps and crap?


----------



## 2ndLastJedi

Burn him at the stake, fu#king GPU hoarder!


----------



## mouacyk

Welcome to overmine.net.


----------



## SavantStrike

KickAssCop said:


> What are people experiences with AIO solutions. I have 2 EVGA SC2 Hybrids and 6 MSI Sea Hawks X on order. Anyone here deal with bad pumps and crap?


The pumps in an AIO aren't going to be happy running at high temps 24x7 if you're using them for mining.


----------



## DarthBaggins

lmao. . 

If you need a home for any of those GPU's I'm sure I can make room on my test bench :thumb: 


But normally I haven't seen or heard many issues w/ the AiO solutions on those GPU's, but of course if there is an issue it would be covered under their warranties.


----------



## ZealotKi11er

SavantStrike said:


> The pumps in an AIO aren't going to be happy running at high temps 24x7 if you're using them for mining.


You're telling me the card running at 50C, with much cooler water, is going to be a problem?


----------



## 8051

mouacyk said:


> Welcome to overmine.net.


Yes, who needs gaming when you're mining? Maybe Nvidia and AMD should just appeal to their new consumer base, miners, and forget PC gaming altogether, because if GPU prices stay the way they are, I know I will.


----------



## SavantStrike

ZealotKi11er said:


> You telling me the card running 50C which is with much cooler water is going to be a problem?


If you load up a card on a clc, temps are higher than you think.

Undervolt the cards.


----------



## ZealotKi11er

SavantStrike said:


> If you load up a card on a clc, temps are higher than you think.
> 
> Undervolt the cards.



Been mining with my 1080 Ti since June. Card temp is 42-50C. Yes the card is heavily under-volted. 0.875v vs 0.972-1.062 which runs stock.


----------



## KCDC

HoneyBadger84 said:


> DO ET! lol I still have a eVGA 1080 Ti FTW3 Hybrid on "order" whenever Amazon gets them back in stock, I'll be getting that just to resell it & make back almost all of the rest of what I spent on getting the Titan Xp (which was paid for 2/3s of the way by reselling my R9 295x2 for $1k on Ebay, thanks miners)


Sigh, I want to, but I just finished my glass tube build! Only to tear it apart again... But, that's lazy talk for a serious free upgrade. We shall see what the weekend has in store for me.


----------



## kithylin

nrpeyton said:


> I flashed the ASUS STRIX XOC firmware to my card and done the shunt mod. Slapped a water-block on it too. So my card is also effectively _(for all intensive purposes)_ just as fast (if not even faster) and more powerful than a Kingpin 1080Ti.


Just so you know, this is completely pointless. The XOC BIOS does literally the exact same thing as the shunt mod, and without the mod you don't have dangerous liquid metal on your card that can drip onto components (at least, that's how Gamers Nexus did the shunt mod). The shunt mod is not necessary if you are using the XOC BIOS; they both do the same thing.



nrpeyton said:


> However not ALL miners are bad. What about people who bought their card to game (primeraly) but later also discovered mining later on? And now do a bit on the side when their machine would otherwise be sitting idling?





nrpeyton said:


> Many people may "mine" on the side but their plans have otherwise not changed. (In terms of how many cards they own or how frequently they would renew a card).


I sincerely hope you know better than to be running coin mining (even part time, even for a short while) on a 1080 Ti that's either shunt-modded or running the XOC BIOS. That runs an extremely high chance of frying your card. The XOC BIOS is for benchmarking or gaming (with G-Sync or V-Sync on) only.


----------



## Agent-A01

8051 said:


> Memory doesn't work by itself, the memory controller on the GPU is actually a thing of complexity and it's being overclocked as well because it's synchronous.


Memory controllers have logic to send/receive/refresh data to memory.

Sorry, but your assumption that raising the data rate of the memory also overclocks something else makes no sense logically. It's simply an untrue statement.


----------



## nrpeyton

OrionBG said:


> How exactly did you do the shunt mod on your card and is there a difference if the card has the XOC BIOS already?


I originally decided I liked the idea of keeping the stock FE BIOS, which is why I did the *shunt mod.*

Later on I decided I also wanted to take advantage of the extra voltage the XOC firmware allows (107 millivolts extra when pushed to the absolute limit), hence I installed the XOC firmware.

I never run the card at that voltage, apart from once or twice while I was "experimenting" with overclocking. I wanted to know what she was capable of, i.e. the performance I could get when pushing the envelope to the absolute limit of the GPU architecture and the FE PCB.

To carry out the mod safely, you apply liquid electrical tape all around the sides of all three shunt resistors (being careful not to get any on top of the resistor). 
Liquid electrical tape turns into what looks like normal black electrical insulation tape when it dries, and it's also easy to peel off.

Next, apply a tiny amount of liquid metal (e.g. Thermal Grizzly Conductonaut) to the top of all three resistors, then spread it so it covers the top of each one. You're effectively creating an extra electrical pathway across the resistor, which changes the measured resistance and fools the card into thinking it's drawing about 25% less power. You then effectively have an extra 25% before reaching the card's maximum power envelope, which stops it power throttling.

Be careful not to get any liquid metal around the edges of the resistor (where the base of the resistor joins the PCB), as liquid metal has been known to "eat" solder, and two guys reported resistors actually falling off.

Also, before applying the tiny blob of liquid metal, first apply an even smaller amount (barely a visible trace) and smear it over the top of the resistor. Liquid metal is attracted to liquid metal, so this helps with applying it (it stops it trying to move around, as sometimes it will want to roll around in a little ball instead of painting over the resistor).


P.S.
GPU Boost 3.0 works within Nvidia's limitations ('hard caps' & 'soft caps' that limit your card). This way I have attacked the power throttling problem from both angles: physically on the PCB, and in software via the firmware.
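
For anyone wondering where the ~25% figure comes from: the card senses current by measuring the voltage drop across those shunts, so a parallel liquid-metal path lowers the effective resistance and therefore the reported power. A rough sketch of the arithmetic (both resistance values are assumed for illustration, not measured from an FE board):

```python
# Why the shunt mod under-reports power: the liquid-metal smear acts
# as a second resistor in parallel with the sense shunt.
# Values below are illustrative assumptions, not measurements.

def parallel(r1, r2):
    """Effective resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

R_SHUNT = 0.005   # ohms, stock current-sense shunt (assumed)
R_LM = 0.015      # ohms, liquid-metal bypass path (assumed)

r_eff = parallel(R_SHUNT, R_LM)      # 0.00375 ohms
underreport = 1 - r_eff / R_SHUNT    # fraction by which power is under-read

print(f"effective shunt: {r_eff * 1000:.2f} mOhm")
print(f"card under-reads power by {underreport:.0%}")  # 25%
```

With these example values the card reads 25% less power than it really draws, which is exactly the extra headroom described above.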


----------



## nrpeyton

kithylin said:


> I sincerely hope you know better than to be running coin mining (even part time.. even for a short while) on a 1080 Ti with either shunt modded or the XOC bios on it. That'll run an extremely high chance of frying your card. XOC bios is for benchmarking or gaming (with gsync or vsync on) only.


I run at a +248 O/C on the core while simultaneously under-volting her to 0.9v. The result is the card runs at 1999mhz @ 0.9v. This will last longer than a card drawing 1.064v @ 1826mhz at stock.


P.S. 
-My 1080 Ti core temp barely ever exceeds 25c.
-VRM temps never exceed 45c.
-Memory 20-33c depending on location; the I/O-side memory chips run cooler than the VRM-side ones.
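
As a sanity check on the longevity claim, here is a first-order dynamic-power estimate (P roughly proportional to f x V²; leakage and board losses are ignored, so treat it as a ballpark only):

```python
# First-order estimate: dynamic power scales roughly with f * V^2.
# Leakage is ignored, so this is a ballpark, not a measurement.

def relative_power(freq_mhz, volts):
    """Unitless stand-in for dynamic power, P ~ f * V^2."""
    return freq_mhz * volts ** 2

stock = relative_power(1826, 1.064)   # stock boost point from the post
tuned = relative_power(1999, 0.900)   # undervolted +248 curve point

saving = 1 - tuned / stock            # roughly 0.22
print(f"~{saving:.0%} less dynamic power despite the higher clock")
```

Even with the higher clock, the 0.9v point comes out around a fifth lower on dynamic power than stock, consistent with the temperatures reported above.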



(sensor readout screenshot: current / min / max / avg columns)


----------



## GOLDDUBBY

nrpeyton said:


> I run at a +248 O/C on the core while simultaneously under-volting her to 0.9v. The result is card runs at 0.9v @ 1999mhz. This will last longer than a card drawing 1.064v & 1826mhz at stock.
> 
> 
> P.S.
> -My 1080Ti Core temp barely ever exceeds 25c.
> -VRM temps never exceed 45c.
> -Memory 20-33c depending on location. I/O side memory chips run cooler than VRM side.


How are you undervolting it, with the power limiter or voltage dial? Afterburner?


----------



## ZealotKi11er

nrpeyton said:


> I run at a +248 O/C on the core while simultaneously under-volting her to 0.9v. The result is card runs at 0.9v @ 1999mhz. This will last longer than a card drawing 1.064v & 1826mhz at stock.
> 
> 
> P.S.
> -My 1080Ti Core temp barely ever exceeds 25c.
> -VRM temps never exceed 45c.
> -Memory 20-33c depending on location. I/O side memory chips run cooler than VRM side.
> 
> 
> 
> ...............................................................current..........min..................max.......avg.


The power seems a bit high. Are you using a different BIOS? You should be at ~230-250W.


----------



## nrpeyton

GOLDDUBBY said:


> How are you undervolting it, with the power limiter or voltage dial? Afterburner?


MSI Afterburner:










All cards' maximum editable voltage is 1.093v, except cards flashed with the ASUS XOC firmware, where the max editable voltage is 1.2v.
However, it's better to under-volt (as seen in my curve), not over-volt.

The core is overclocked by +248 while under-volted to 0.9v.
The normal _(stock)_ speed for 0.9v is only 1751mhz.
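
A toy model of what the Afterburner curve editor is doing under the hood (the curve points below are made up for illustration; only the 0.9v/1751mhz stock point and the 1999mhz target come from my card):

```python
# Toy model of a GPU Boost V/F curve. Undervolting in the curve editor
# means pinning your target clock at one voltage and flattening every
# point above it, so the card never requests more than that voltage.
# Curve values are illustrative, not real Pascal data.

curve = {0.800: 1620, 0.850: 1700, 0.900: 1751, 0.950: 1823,
         1.000: 1885, 1.050: 1936, 1.093: 1974}

def undervolt(vf_curve, vmax, clock):
    """Run `clock` at `vmax`; cap all higher-voltage points to it."""
    return {v: (clock if v >= vmax else f) for v, f in vf_curve.items()}

tuned = undervolt(curve, 0.900, 1999)
print(tuned[0.900])   # 1999 -- the 0.9v point now carries the offset
print(tuned[1.093])   # 1999 -- higher points capped, so boost stops at 0.9v
```

Flattening the curve this way is why the card holds 1999mhz without ever climbing past 0.9v.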


----------



## 8051

Agent-A01 said:


> Memory controllers have logic to send/receive/refresh data to memory.
> 
> Sorry but your assumption that raising the data-rate of memory also overclocks something else makes no sense logically.
> Simply an untrue statement.


Memory controllers have logic to send/receive/refresh data to memory? That's all, eh? What about communicating with the GPU and the cache? What about queuing up requests to read/write data from/to multiple SMs? What about arbitrating requests from multiple SMs? You don't think the memory controller runs synchronously with the memory?

LOL, you have a LOT to learn.


----------



## nrpeyton

ZealotKi11er said:


> The power seem to be a bit high. Are you using a different BIOS? You should be ~ 230-250W.


Aye, without the under-volt (stock) my card runs at 1.062v (no throttling), so it will draw over 425 watts with the power limits removed by the shunt mod & XOC firmware (actual power draw in watts is entirely dependent on use: compute workloads, game/scenario/benchmark, 4k or 8k resolution, etc.).
With the voltage slider at 100% (1.093v to 1.2v) I've seen power draw hit 500 watts. 1.2v obviously only with XOC.

Note: I don't like to see it over 350 watts except momentarily.

P.S. power/wattage at the extreme upper end maybe only 1 or 2 experimental times : -)


----------



## Nvidia ATI

I am interested in purchasing the EVGA ftw3 hybrid. I like a very silent PC so I would like to be able to turn off or slow down the fan that cools the memory heatsink. Is this possible in EVGA precision? And for those who currently own this card, is this memory fan noisy? I am not concerned about the heatsink fan because it will be immediately replaced with a better fan.


----------



## SavantStrike

Nvidia ATI said:


> I am interested in purchasing the EVGA ftw3 hybrid. I like a very silent PC so I would like to be able to turn off or slow down the fan that cools the memory heatsink. Is this possible in EVGA precision? And for those who currently own this card, is this memory fan noisy? I am not concerned about the heatsink fan because it will be immediately replaced with a better fan.


It's quite possible.

I have an SC2 that's pretty quiet so I would expect the ftw3 to be even quieter.


----------



## nrpeyton

Nvidia ATI said:


> I am interested in purchasing the EVGA ftw3 hybrid. I like a very silent PC so I would like to be able to turn off or slow down the fan that cools the memory heatsink. Is this possible in EVGA precision? And for those who currently own this card, is this memory fan noisy? I am not concerned about the heatsink fan because it will be immediately replaced with a better fan.


I believe on that card you can use Precision X to separately control either the left 2 fans or the 1 right fan. So the left 2 fans must be in one group, and the right fan in a group of its own?


----------



## ada187

Hey guys, are there any heatsinks out there from various companies (EVGA, MSI, Gigabyte) that will fit the reference card?


----------



## OrionBG

ada187 said:


> hey guys, are there any heatsink out there from varies company (EVGA, MSI, GIGABYTE) that will fit the reference card?


You mean something like this: https://www.arctic.ac/eu_en/accelero-xtreme-iv.html


----------



## KedarWolf

OrionBG said:


> You mean something like this: https://www.arctic.ac/eu_en/accelero-xtreme-iv.html


Probably best to go with an EVGA Hybrid kit or, if money isn't an issue, an EKWB MLC AIO with a prefilled FE full waterblock.

https://www.ekwb.com/shop/catalogsearch/result/?q=mlc


----------



## HoneyBadger84

All the AIO systems they have for the Titan Xp are end of life. Rip.


----------



## kithylin

nrpeyton said:


> I originally decided I liked the idea of keeping the stock FE BIOS. Which is why I done the *shunt mod.*
> 
> Later on I decided I also wanted to take advantage of the extra voltage the XOC firmware allows (107 milli volts extra when pushed to absolute limits). Hence I installed XOC firmware.
> 
> I never run the card at that voltage apart from once or twice while I was "experimenting" with overclocking my card. I wanted to know what she was capable of. I.E. the performance I could get when pushing the envelope to the absolute limit of the GPU architecture and FE PCB.
> 
> To carry out the mod safely you apply liquid electrical tape all around the sides of all three shunt resistors. (Being careful not to get any on top of the resistor).
> Liquid Electrical Tape transforms into what looks like normal black electrical insulation tape when it dries. Also easily to peel off.
> 
> Next; apply a tiny amount of Liquid Metal (I.E. Thermal Grizzly Conductonaut) to the top of all 3 resistors. Then spread so it covers the top of the resistor. (You're effectively creating an extra electrical pathway across the resistor) which changes the recorded resistance and fools the card into thinking it's drawing about 25% less power. You now effectively have an extra 25% before reaching maximum power envelope for card. Which stops it power throttling.
> 
> Be careful not to get any liquid metal around the edges of the resistor (edges where base of resistor joins with PCB) as Liquid Metal has been known to "eat" solder and 2 guys reported resistors actually falling off.
> 
> Also before applying the tiny blob of liquid metal try to 1st apply an even smaller amount (barely a visible trace) then smear this over the top of the resistor. Liquid Metal is attracted to Liquid Metal so it helps with applying it. (Stops it trying to move around as sometimes it will want to move around in a "little ball" instead of painting over the resistor).
> 
> 
> P.S
> GPU Boost 3.0 works with Nvidia limitations ('hard caps' & 'soft caps' to limit your card). This way I have attacked the power throttling problem at both angles. Physically on the PCB and software via the firmware.


What I think you somehow are failing to understand is that the XOC bios does -EXACTLY- the same thing as the shunt mod: remove all power restrictions from the card. If you have the XOC bios on, there is physically no point in the shunt mod. And yes, all vendor companies know about the shunt mod, know what to look for, and will deny you any RMA requests if you have done this to your card. They will know, and your warranty will become useless.


----------



## nrpeyton

kithylin said:


> What I think you some how are failing to understand is the XOC bios literally does -EXACTLY- the same thing as the shunt mod: remove all power restrictions from the card. Literally if you have the XOC bios on, there is physically no point in the shunt mod if you're already using the XOC bios. And yes all vendor companies know about the shunt mod, and know what to look for, and will deny you any RMA requests if you have done this to your card, they will know, and your warranty will become useless.


XOC = extra voltage


I carried out the shunt mod because initially I wanted to keep the FE BIOS. I changed my mind, and naturally ended up with both the shunt mod & XOC.

Moreover, I can switch back and forward between the FE BIOS and XOC without having to worry about the power envelope.


I might see your point if I didn't already have the mod. But I do. Removing it would mean removing the water-block and everything else that entails.

*Important:* Some benchmarks actually prefer the FE BIOS (on a clock-for-clock basis). My "theory", and it's only a theory, is that the mod bypasses any "hard" limit related to power. There was a video by JayzTwoCents last year speculating about Pascal shutting down parts of the die to keep to Nvidia specs. Such behaviour would never be reported in Windows; the only way it would be apparent to the user is benchmark results on a clock-for-clock basis, which we _have_ found can differ a little from one 1080 Ti to another.


----------



## Nvidia ATI

SavantStrike said:


> It's quite possible.
> 
> I have an SC2 that's pretty quiet so I would expect the ftw3 to be even quieter.





nrpeyton said:


> I believe on that card you can use Precision X to separately control either the left 2 fans or the 1 right fan. So left 2 fans must be in one group, the right fan in group of it's own?


I am referring to the ftw3 hybrid which has the AIO cooler.


----------



## nrpeyton

Nvidia ATI said:


> I am referring to the ftw3 hybrid which has the AIO cooler.


Ahh right, I see.. the fan still on the card is then primarily for cooling the VRM? Which will in turn help memory temps too, as a lot of heat spills across from the VRM onto the memory. I believe the memory is cooled passively with a copper extension to the block?

If you have no control over that fan through MSI Afterburner / Precision X etc., you could always re-route its power cable to a fan controller or a mobo fan header for control (although you may need an adapter)?

I'd try both L & R fan control in Precision X first though. _<<-- where is that fan connected on the FTW3 PCB?_


----------



## 8051

nrpeyton said:


> XOC = extra voltage


The XOC allows extra voltage but more importantly it stops power throttling.


----------



## nrpeyton

8051 said:


> The XOC allows extra voltage but more importantly it stops power throttling.



aye :- )


----------



## Mrip541

If I buy the official 1080ti hybrid cooling kit from EVGA and install it on my EVGA 1080ti sc2, will I void the warranty?

Edit - I forgot that EVGA has a US customer service phone number staffed by actual humans! I called and verified that installing the hybrid kit does not void the warranty.


----------



## feznz

kithylin said:


> What I think you some how are failing to understand is the XOC bios literally does -EXACTLY- the same thing as the shunt mod: remove all power restrictions from the card. Literally if you have the XOC bios on, there is physically no point in the shunt mod if you're already using the XOC bios. And yes all vendor companies know about the shunt mod, and know what to look for, and will deny you any RMA requests if you have done this to your card, they will know, and your warranty will become useless.


You seem to have changed your tune. Did you do some research?


----------



## ir88ed

Not sure of the board's sentiment about crypto miners, so I will say that I built my rig for gaming/work and used the Ethereum I mined in the off hours to recoup the cost of the 1080 Tis.

Anyway, my question: when mining, my cards have no problem running memory at 12000MHz (no, I can't game like that), and memory speed is directly linked to Ethereum mining speed. The thing is, none of Afterburner/Precision X/Asus GPU Tweak allow me to go over 12K. Is there any tool that allows memory overclocks over 12K? I can't do a bios mod because the cards would never be stable for Windows at those speeds. Asus GPU Tweak shows that it goes to 13K, but only reports 12K in its monitor graph, and the hash rate does not increase. Is there anything out there more flexible? Google searches get me nothing.

Anyway, thanks, and please don't flame too badly.


----------



## OrionBG

nrpeyton said:


> I originally decided I liked the idea of keeping the stock FE BIOS. Which is why I done the *shunt mod.*
> 
> Later on I decided I also wanted to take advantage of the extra voltage the XOC firmware allows (107 milli volts extra when pushed to absolute limits). Hence I installed XOC firmware.
> 
> I never run the card at that voltage apart from once or twice while I was "experimenting" with overclocking my card. I wanted to know what she was capable of. I.E. the performance I could get when pushing the envelope to the absolute limit of the GPU architecture and FE PCB.
> 
> To carry out the mod safely you apply liquid electrical tape all around the sides of all three shunt resistors. (Being careful not to get any on top of the resistor).
> Liquid Electrical Tape transforms into what looks like normal black electrical insulation tape when it dries. Also easily to peel off.
> 
> Next; apply a tiny amount of Liquid Metal (I.E. Thermal Grizzly Conductonaut) to the top of all 3 resistors. Then spread so it covers the top of the resistor. (You're effectively creating an extra electrical pathway across the resistor) which changes the recorded resistance and fools the card into thinking it's drawing about 25% less power. You now effectively have an extra 25% before reaching maximum power envelope for card. Which stops it power throttling.
> 
> Be careful not to get any liquid metal around the edges of the resistor (edges where base of resistor joins with PCB) as Liquid Metal has been known to "eat" solder and 2 guys reported resistors actually falling off.
> 
> Also before applying the tiny blob of liquid metal try to 1st apply an even smaller amount (barely a visible trace) then smear this over the top of the resistor. Liquid Metal is attracted to Liquid Metal so it helps with applying it. (Stops it trying to move around as sometimes it will want to move around in a "little ball" instead of painting over the resistor).
> 
> 
> P.S
> GPU Boost 3.0 works with Nvidia limitations ('hard caps' & 'soft caps' to limit your card). This way I have attacked the power throttling problem at both angles. Physically on the PCB and software via the firmware.


Thank you nrpeyton for the explanation!



kithylin said:


> What I think you some how are failing to understand is the XOC bios literally does -EXACTLY- the same thing as the shunt mod: remove all power restrictions from the card. Literally if you have the XOC bios on, there is physically no point in the shunt mod if you're already using the XOC bios. And yes all vendor companies know about the shunt mod, and know what to look for, and will deny you any RMA requests if you have done this to your card, they will know, and your warranty will become useless.


Thanks kithylin for clarifying this!


----------



## mouacyk

Unfortunately XOC BIOS is not compatible with all models.


----------



## SavantStrike

ir88ed said:


> Not sure of board sentiment about crypto miners, so I will say that I built my rig for gaming/work and used the etherium I mined on the off hours to recoup the cost of the 1080ti's.
> 
> Anyway, my question. When mining my cards have no problem running memory at 12000MHz (no I can't game like that). Memory speed is directly linked to Etherium mining speed. The thing is, none of the afterburner/precisionX/AsusTweak allow me to go over 12K. Is there any tool that allows memory overclocks over 12K? I can't do a bios mod because the cards would never be stable for windows at those speeds. Asus tweak shows that it goes to 13K, but only reports 12K in its monitor graph, and hash rate does not increase. Is there anything out there more flexible? Google searches get me nothing.
> 
> Anyway, thanks and please don't flame too badly.


12K is the maximum you'll get. You could try flashing the 12Gbps EVGA FTW3 bios if it's compatible and overclock from there.

Are you certain you aren't getting memory errors? Ether is pretty forgiving - too forgiving. If you do some Zcash mining you'll probably find that you get a lower hash rate with memory clocked that high (ECC errors). Your mining memory OC should be stable for games/3D applications or you run the risk of invalid shares and beating up your card.


----------



## Hl86

I'm having an issue with Dota 2 on my 1080 Ti SLI setup.
After 30 minutes the frames drop from 130 to 40 fps.
I'm using Nvidia DSR.
The cards' frequency and power usage stay the same.


----------



## Hl86

Double post


----------



## Agent-A01

8051 said:


> Memory controller have logic to send/receive/refresh data to memory? That's all eh? What about communicating w/the GPU and the cache? What about queuing up requests to read/write data from/to multiple SM's? What about arbitrating for requests from multiple SM's? You don't think the memory controller runs synchronously w/the memory?
> 
> LOL, you have a LOT to learn.


Did I say that's the only thing it did?

Nope.
I listed the basic task of an IMC.

If you think memory controllers have a clock frequency that is tied to the memory data rate, you clearly do not understand the basics.

Clearly that's true based on your previous statements you've pulled out of your ass in past debates. 

You sure do love to make a lot of claims but you never have anything to prove your bogus theories.

Pro tip: we aren't in the early 90s, where external memory controllers were tied to the speed of the memory, such as 25 or 33mhz.

IMCs today run at a specific frequency provided externally by a PLL, depending on the strap.
Overclocking BCLK alone (while leaving the strap the same) on today's systems doesn't inherently increase the IMC frequency.

If you want to increase the IMC frequency, you have to increase the relevant memory straps.
Once you pick the highest available strap, it will be at its maximum frequency.

Instead of wearing that shovel down, provide white papers of Pascal GPUs to prove your statements.

Like that other claim you had, of lowering memory speeds for a significant power consumption reduction.


----------



## kithylin

feznz said:


> You seem to have changed your tune did you do some research


Nothing I said in that post to nrpeyton is contrary to anything I've ever said. I've said in the past that shunt mods void your warranty or render you unable to RMA; that's all I was reminding them of. I did go along with Hulio225 though, and I don't actively go around stating XOC is completely safe anymore.. I just give folks a 'use at your own risk' disclaimer now, if that's what you mean.

I still think bios mods (XOC or otherwise) are a much safer way of unlocking the 1080 Ti, run a much lower chance of frying your card (compared to shunt mods with liquid metal on your card..), and should be virtually undetectable by any support company for warranty purposes. Especially since the XOC bios does the exact same thing as the shunt mod, there's no point in any physical modification to the cards anymore. We don't know yet whether XOC damages any cards long term.. because no one has yet reported back saying it has. So at this point in time it should be quick and easy to just flash the stock bios back to the card and send it back for RMA if you had to, since, so far.. the XOC bios (and other bios mods) aren't frying cards, at least not yet.

I don't know about other cards, but for EVGA: the EVGA support rep I spoke to specifically told me we can do anything we want to the cards, mount any cooler we wish or flash any bios we wish, and the warranty will remain intact. The -ONLY- thing that would void EVGA's warranty would be any sort of physical modification to the card's PCB itself. Which would be shunt mods.


----------



## feznz

kithylin said:


> Nothing I said in that post to nrpeyton is contrary to anything I've ever said. I've said in the past shunt mods on cards voids your warranty or renders you unable to RMA, that's all I was reminding them of. I did go along with Hulio225 though and I don't actively go around stating XOC is completely safe anymore.. just give folks a disclaimer to 'use at your own risk' now, if that's what you mean. I still think the bios mods (XOC or otherwise) is a much safer way of unlocking the 1080 Ti card and runs a much lower chance of frying your card (compared to shunt mods with liquid metal on your card..) and should be virtually undetectable by any support company for warranty reasons. We don't know long term yet if XOC damages any cards.. because no one's yet reported back saying it has. So.. at this point in time it should be quick and easy to just flash your stock bios back to the card and send it back for RMA if you had to, since, so far.. XOC isn't frying cards, at least not yet. I don't know about other cards, but for EVGA: Specifically the EVGA support rep I spoke to told me we can do anything we want to the cards, mount any cooler we wish or flash any bios we wish and warranty will remain intact. The -ONLY- thing that would void EVGA's warranty would be any sort of physical modification to the card's PCB it's self. Which would be shunt mods.


I like your answer :thumb: but the truth is that LM TIM is used so it can be cleaned off for RMA (unless you soldered the resistors on). I have also read in this thread that a totally non-responding GPU was/could be blind-flashed from XOC back to the stock bios, if you read between the lines, because we don't publicly want to tell people what we do with our RMA cards, right?
Personally I just run the stock Strix bios that came with my card, but technically I voided the warranty when I took off the cooler to put on a water block, unless I used tweezers to undo the screw with the anti-tamper sticker on it.

Looking at RMA stats, it would seem that less than 2% of 10xx cards are RMAed.

Also, thermal shutdown saved another CPU for me after building a PC for my 6-year-old daughter last week. I did the gentle-persuasion close of the back of the case, only to have those stupid molex plug connectors lose connection to the water pump. She played Roblox for 2 hours before the CPU hit 100° and shut down, on stock clocks.



Hl86 said:


> I´m having a issue with dota 2 with my 1080 ti sli
> After 30 minutes the frames drops from 130 to 40 fps.
> Im using nvidia dsr.
> The cards frequency and power usage stays the same.


I think I had that before. It could be an ever-so-slightly unstable OC on the CPU. The easiest way to check is to look for WHEA errors in the administrative event log: correctable errors happen without you knowing about them, but uncorrectable WHEA errors result in a BSOD.


----------



## ir88ed

SavantStrike said:


> 12K is the maximum you'll get. You couod try flashing the 12gbps evga FTW3 bios if it's compatible and over clock from there.
> 
> Are you certain your aren't getting memory errors? Ether is pretty forgiving - too forgiving. If you do some zcash mining you'll probably find that you get lower hash rate with memory clocked that high (ecc errors). Your mining memory OC should be stable for games/3d applications or you run the risk of invalid shares and beating up your card.


My mining dashboard reports 1.8% bad shares total, but the overclock took me from 60MH/s to 75+MH/s for the two cards combined, so that seems reasonable-ish.
I will be content with the current OC, and maybe dial it back to see if I can spare my GPUs without losing hashes. 
Thanks.
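
Worth noting that invalid shares eat into that gain. Plugging the numbers from the post above into a quick effective-hashrate check:

```python
# Effective hashrate after rejected shares, using the figures quoted
# in the post above (60 -> 75 MH/s for two cards, 1.8% bad shares).

def effective_hashrate(raw_mhs, bad_share_frac):
    """Hashrate actually credited by the pool after invalid shares."""
    return raw_mhs * (1 - bad_share_frac)

stock = 60.0                              # MH/s, both cards, no memory OC
oc = effective_hashrate(75.0, 0.018)      # 73.65 MH/s effective
gain = oc / stock - 1

print(f"effective: {oc:.2f} MH/s, still {gain:.1%} over stock")
```

So even after the 1.8% rejects, the overclock nets roughly a 22-23% effective gain, which is why dialing it back only slightly is a reasonable plan.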


----------



## nrpeyton

kithylin said:


> Nothing I said in that post to nrpeyton is contrary to anything I've ever said. I've said in the past shunt mods on cards voids your warranty or renders you unable to RMA, that's all I was reminding them of. I did go along with Hulio225 though and I don't actively go around stating XOC is completely safe anymore.. just give folks a disclaimer to 'use at your own risk' now, if that's what you mean. I still think the bios mods (XOC or otherwise) is a much safer way of unlocking the 1080 Ti card and runs a much lower chance of frying your card (compared to shunt mods with liquid metal on your card..) and should be virtually undetectable by any support company for warranty reasons. Especially since the XOC bios does there exact same thing as shunt mods, there's no point in any physical modification to the cards anymore. We don't know long term yet if XOC damages any cards.. because no one's yet reported back saying it has. So.. at this point in time it should be quick and easy to just flash your stock bios back to the card and send it back for RMA if you had to, since, so far.. XOC bios (or any other bios mods) aren't frying cards, at least not yet. I don't know about other cards, but for EVGA: Specifically the EVGA support rep I spoke to told me we can do anything we want to the cards, mount any cooler we wish or flash any bios we wish and warranty will remain intact. The -ONLY- thing that would void EVGA's warranty would be any sort of physical modification to the card's PCB it's self. Which would be shunt mods.



That's not entirely true.. although your points do indeed hold some weight. I'm not disputing them or ruling them out completely.

What's important to remember is that XOC allows extra voltage (up to 1.2v), which is far beyond Nvidia's recommendation. Remember, even going up to 1.093v is technically a warranty breach with Nvidia.

The shunt mod, on the other hand, doesn't allow any extra voltage. It just allows an approx. 25% boost to the upper power envelope. So in that regard it may even be safer for many people as far as the longevity of the card goes (IF and ONLY IF the mod is done correctly, with all safety precautions taken).


----------



## nrpeyton

----------------------------
"Russian man convicted of murder after killing friend over AMD vs Nvidia argument:

Sentenced to nine and half years for *GPU-related stabbing.*

Last year a 37-year-old Russian man, Aleksander Trofimov, murdered his friend Evgeny Lylin during an argument over the two major graphics card makers, AMD and Nvidia. 
The pair reportedly used to work together in software development, and met up for drinks. While drunk, the two started disagreeing over whether AMD or Nvidia GPUs have better coolers, and things took a turn for the worse soon after. Trofimov stabbed Lylin to death, and he was arrested three days later. 

It's a horrific story, and sadly one that could have been avoided with a bit of research. Nvidia is clearly on top right now.

Trofimov was recently sentenced after confessing to the crime, and will serve nine and a half years in prison for the murder. He is an Nvidia fan however, so might be out a little faster".
----------------------------

P.S. I just pulled that from "PC Gamer" magazine. I wonder just how many people out there take it that seriously.


----------



## Thoth420

nrpeyton said:


> ----------------------------
> "Russian man convicted of murder after killing friend over AMD vs Nvidia argument:
> 
> Sentenced to nine and half years for *GPU-related stabbing.*
> 
> Last year a 37-year-old Russian man, Aleksander Trofimov, murdered his friend Evgeny Lylin during an argument over the two major graphics card makers, AMD and Nvidia.
> The pair reportedly used to work together in software development, and met up for drinks. While drunk, the two started disagreeing over whether AMD or Nvidia GPUs have better coolers, and things took a turn for the worse soon after. Trofimov stabbed Lylin to death, and he was arrested three days later.
> 
> It's a horrific story, and sadly one that could have been avoided with a bit of research. Nvidia is clearly on top right now.
> 
> Trofimov was recently sentenced after confessing to the crime, and will serve nine and a half years in prison for the murder. He is an Nvidia fan however, so might be out a little faster".
> ----------------------------
> 
> P.S. just pulled that from "PC Gamer" magazine. I wonder how just how many people out there who take it that seriously/


This made me laugh so hard 
In soviet Russia GPU stabs you.

In other news, I will be joining the club shortly, as tomorrow is build day... yes, I bought a legit Win10 USB install, as it was the same price as a legit key and I don't like getting them from off-brand sources. That's an 8700k, kinda hard to see. The plastic bags are CableMod stuff and other various cabling.


----------



## kithylin

Thoth420 said:


> This made me laugh so hard
> In soviet Russia GPU stabs you.
> 
> In other news I will be joining the club shortly as tomorrow is build day...yes I bought a legit Win10 USB install as it was the same as a legit key and I don't like getting them from offbrand sources. That's an 8700k kinda hard to see. The plastic bags are cablemods stuff and other various cabling.


Welcome to the modern virus you pay money for.

If you're new to windows 10, don't forget to download and run SpyBot Anti-Beacon: https://www.safer-networking.org/spybot-anti-beacon/

Stops the majority of microsoft's tools in the OS that they use to spy on your every action.


----------



## KedarWolf

kithylin said:


> Welcome to the modern virus you pay money for.
> 
> If you're new to windows 10, don't forget to download and run SpyBot Anti-Beacon: https://www.safer-networking.org/spybot-anti-beacon/
> 
> Stops the majority of microsoft's tools in the OS that they use to spy on your every action.


I prefer 'Disable Win Tracking'

https://github.com/10se1ucgo/DisableWinTracking/releases

You might need to install 

http://www.majorgeeks.com/files/details/visual_c_runtime_installer.htm

The easy way to get all your C++ dependencies. 

I run version 3.0.1 as well as the latest version, as this version uninstalls a ton of Windows bloatware.

https://github.com/10se1ucgo/DisableWinTracking/releases/tag/v3.0.1


----------



## Thoth420

Cheers! I'm not new to 10, but thanks for the resources; I have always just let it spy in the past. I use a Win7 slave for an HTPC media system. My main rig is just for gaming and content creation, but I'll still check them out.

Any suggestions on a fan curve for the FTW3 Gaming? I just set them all to a static 65% for cursory testing, which worked OK with Mankind Divided cranked to max @ 3440x1440 100hz (aside from MSAA, which was off), but I'd like to profile the fans to stay under throttle temps no matter what.

What is the throttle temp for these cards anyway? I came from the Fury X, so it has been a while since I had an Nvidia card.


----------



## AvengedRobix

I’m loving xoc bios on ftw3 elite


Sent from my iPhone using Tapatalk


----------



## nrpeyton

AvengedRobix said:


> I’m loving xoc bios on ftw3 elite
> 
> 
> Sent from my iPhone using Tapatalk




If you're using the XOC BIOS then shouldn't GPU-Z be reporting the correct voltage? You can't possibly be over 2.2GHz at 0.7-odd volts lol?


----------



## KedarWolf

Was getting a lot of lag in Diablo 3 with G-Sync enabled but figured out I need to lock my voltage curve to the highest point. Because G-Sync at 4K caps the frame rate at 60, I was getting really low clock speeds; the card doesn't max out.


----------



## AvengedRobix

nrpeyton said:


> If you're using the XOC BIOS then shouldn't GPU-Z be reporting the correct voltage? You can't possibly be over 2.2GHZ at .7 odd volts lol?



When I took that screenshot the GPU was idle.. [email protected],20


Sent from my iPhone using Tapatalk


----------



## nrpeyton

Anyone using Excavator_cuda_9.1 ?

My GPU's memory allocation is now at over 9GB with the NeoScrypt algorithm.

(Currently 9258 MB to be exact -- or 82.2% of total memory). Memory Controller Load is also averaging 65% with peaks up to 76%.

Is this normal?
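For what it's worth, the percentage is at least self-consistent with the 1080 Ti's 11 GB; a quick back-of-envelope check (assuming the nominal 11 GiB framebuffer):

```python
# Sanity check of the reported NeoScrypt allocation on an 11 GB 1080 Ti.
# 11 * 1024 MiB is the card's nominal memory size; 9258 MiB is the
# allocation GPU-Z reported above.
total_mib = 11 * 1024          # nominal VRAM on a 1080 Ti, in MiB
used_mib = 9258                # allocation reported by GPU-Z
pct = used_mib / total_mib * 100
print(f"{pct:.1f}% of VRAM allocated")  # -> 82.2%, matching the reading
```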


----------



## hotrod717

RobotDevil666 said:


> Quote:Originally Posted by *hebrewbacon*
> 
> Nice. I just ordered a Titan XP EK waterblock as well to integrate this beast into my loop. I will definitely miss the power of SLI in games where it scaled well but it will be nice not to worry about SLI issues and if the game supported it before buying it.
> 
> 
> Im in the same boat mate, gonna sell of my 1080's and get 1080Ti lack of SLI support and bugs are pissing me off, that said in games like Division or GTA V I will be worse off, oh well I'll probably buy a 2nd one later on anyway.
> Any word on when are they actually coming in stock ?


Depends what you are looking for. Subscribe to nowinstock.com. I've had a dozen or so opportunities to buy a 1080ti in the past week. It was better than the previous 2 weeks, so I believe the shortage is, hopefully, on the decline. Also, my local Microcenter has gotten multiple 1070ti, 1080, and 1080ti's in. Ebay prices have dropped. Hopefully the trend continues and we see $50 over original MSRP soon; it's about $100 right now. The 1080ti is pure epeen right now; a 1080 or 980ti makes the most sense. In fact, I see no difference in my k/d playing COD WW2 on a 1080ti or an oc'd 7950.


----------



## nrpeyton

Great News!

Latest edition of Precision X now allows *memory overclocking* up to a max of *+2000* _(instead of the old +1000, which MSI Afterburner is unfortunately still handicapped by)._



----------



## ada187

OrionBG said:


> You mean something like this: https://www.arctic.ac/eu_en/accelero-xtreme-iv.html





KedarWolf said:


> Probably best to go with an EVGA Hybrid kit or if money isn't an issue an EKWB MLC AIO with a prefilled FE full waterblock.
> 
> https://www.ekwb.com/shop/catalogsearch/result/?q=mlc




Like, if I buy a heatsink/cooler from a Gigabyte Xtreme and slap it on the reference board?


----------



## ada187

Also, do you guys' clocks fluctuate? I look at Afterburner and Precision X and my clock jumps around a lot, but I see some YouTube videos where the clock just stays stable the whole time. Cooling is not an issue.


----------



## ZealotKi11er

ada187 said:


> also does you guys clock fluctuate around? I look at afterburner and Precision X, my clock jump around alot, I see some youtube videoes where the clock just stay stable the whole time. Cooling is not an issue.


It's not a problem; it's power related. You simply have to increase the power limit. If it still happens, you have to find the highest voltage that does not trip the power limit and OC to that point. For example, my card can handle 1v @ 2025MHz in most games without hitting the power limit. By default, the card goes to 1.062v.
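A rough way to see why capping the voltage below the stock point helps: dynamic power scales roughly with V² × f, so the same clock at a lower voltage draws noticeably less. A toy model only (it ignores static/leakage power, and the reference numbers are just the ones from my card above):

```python
# Toy dynamic-power model: P is proportional to V^2 * f. Comparing the
# stock 1.062 V boost point against a manually capped 1.000 V point at
# the same 2025 MHz shows the headroom the lower voltage buys under a
# fixed power limit. First-order estimate only; real cards also have
# static/leakage power.
def relative_power(v, f, v_ref=1.062, f_ref=2025):
    return (v / v_ref) ** 2 * (f / f_ref)

saving = 1 - relative_power(1.000, 2025)
print(f"~{saving:.0%} less dynamic power at 1.000 V, same clock")  # -> ~11%
```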


----------



## OrionBG

I've done some quick tests with my now completed build and with the XOC bios and no other tweaks, the GPU is going to 2000MHz during Time Spy and staying there the whole time. Max. temp. 38C


----------



## kithylin

nrpeyton said:


> If you're using the XOC BIOS then shouldn't GPU-Z be reporting the correct voltage? You can't possibly be over 2.2GHZ at .7 odd volts lol?


That's obviously their idle voltage... see the voltage graph peaked during their benchmark and dropped to idle again... :doh:



KedarWolf said:


> Was getting a lot of lag in Diablo 3 with G-Sync enabled but figured out I need to lock my voltage curve to the highest point. Because G-Sync at 4K caps the frame rate at 60 I was getting really low clock speeds, it doesn't max out.


Also.. no offense intended but.. do you understand how G-Sync & Boost 3.0 work together? This is both of them working exactly how they're designed to. Yes, of course you get capped at 60 FPS; that's the maximum refresh rate of your monitor. And your clocks are low because with Nvidia Boost 3.0, your GPU won't hit max clocks unless the current workload (rendering your game) actually requires it. The card does its very best to clock as -LOW- as possible while maintaining 60 FPS. This is by design, to save power and heat. There's no real point in your card running @ max clocks the entire time; that's just wasteful and would make the card run hotter than necessary.
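The clock-down behaviour can be sketched as a toy model: with a frame cap fixing the work per second, boost only needs the lowest clock state whose throughput covers it. The clock states and the linear-scaling assumption below are made up for illustration; this is not NVIDIA's actual boost algorithm:

```python
# Illustrative model of boost settling on the lowest clock that still
# sustains a capped frame rate. Assumes throughput scales linearly with
# clock (a simplification); the MHz states are hypothetical.
def clock_needed(fps_uncapped_at_max, fps_target, max_clock, states):
    # fraction of peak throughput the cap actually requires, in MHz terms
    needed = fps_target / fps_uncapped_at_max * max_clock
    # pick the lowest available state that still meets the demand
    return min(s for s in states if s >= needed)

states = [1290, 1550, 1750, 1900, 2000]       # hypothetical MHz steps
print(clock_needed(120, 60, 2000, states))    # capped at 60 of 120 fps -> 1290
```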


----------



## SavantStrike

ada187 said:


> like if I buy a heatsink/cooler from Gigabyte extreeme and slap it on the reference board.


The GB heatsinks only fit Gigabyte cards, and they are not available for sale separately.

You're better off with a third-party cooler if you can't get an AIB card with a decent cooler.


----------



## ir88ed

KedarWolf said:


> Was getting a lot of lag in Diablo 3 with G-Sync enabled but figured out I need to lock my voltage curve to the highest point. Because G-Sync at 4K caps the frame rate at 60 I was getting really low clock speeds, it doesn't max out.


I have had good luck with setting a custom refresh rate over 60hz (Nvidia control panel --> change resolution --> customize --> create custom resolution). This allows the monitor to be overclocked a bit. Instead of dipping from 60 to 45 fps, you dip from 67 to the mid 50's. Actually, in most games I have played on my rig, the frame rate has stayed over 60 and bounced around much less than when I was running at 60hz. Obviously, this is dependent on what your monitor will overclock to.


----------



## nrpeyton

Still can't get 4k @ 60hz.

Max that seems to work is 4k @ 30hz.

Already replaced the HDMI cable with a brand new one.

What could I be missing?


----------



## ZealotKi11er

nrpeyton said:


> Still can't get 4k @ 60hz.
> 
> Max that seems to work is 4k @ 30hz.
> 
> Already replaced the HDMI cable with a brand new one.
> 
> What could I be missing?


You should not have to change anything on your side. For my LG TV I had to enable HDMI Ultra HD Deep Color.


----------



## OrionBG

nrpeyton said:


> Still can't get 4k @ 60hz.
> 
> Max that seems to work is 4k @ 30hz.
> 
> Already replaced the HDMI cable with a brand new one.
> 
> What could I be missing?


It might be a stupid question, but have you checked that the HDMI port on the TV is HDMI 2.0? Some TVs have both 1.4 and 2.0.


----------



## nrpeyton

OrionBG said:


> It might be a stupid question but have you checked that the HDMI port on the TV is HDMI 2.0? some TVs have both 1.4 and 2.0


Help - Can't get 4k @ 60hz 

--------
*What does work:*

TV HDMI signal set to "standard" (in TV's settings, not windows or any PC settings):

1080p 60hz
4k but only up to 30hz

TV HDMI Signal set to "enhanced": (in TV settings)....:

1080p 60hz only
-------------


For a start-- the above seems back-to-front.

I should be able to set the HDMI signal format on the TV to "enhanced", then in Windows set the resolution to 3840 x 2160 @ 60hz, and there should be no issues. But I'm having no luck.

I don't get a picture.


*Here's what I've tried so far:*
1) Replacing the HDMI cable with a brand new one (the new one is HDMI 2.0 certified and can do 4k at 60hz). (from cables.co.uk)
2) Ensuring the TV's HDMI signal is set to "enhanced" and not "standard" format. <-- with "enhanced" I can't even get 4k 30hz (the TV signal format needs to be on "standard" even just to get 30hz at 4k).
3) Uninstalling ALL Nvidia, AMD & Intel GPU drivers using the "DDU" utility. (as posted on EVGA support forums). Rebooting, then re-installing only the Nvidia software/drivers.
4) Running a full in-depth virus scan using the latest edition of AVG & running a full Windows Defender scan.
5) Resetting TV back to factory defaults. (this does not remove updates received by the TV - which is currently on version 6.0.1 and is reporting as fully up to date & no new updates are currently available).
6) Checked that intel's iGPU is disabled in system BIOS. (and confirming this by checking in Windows that it's not displayed in GPU-Z -- which it isn't)!
7) Tried HDMI ports 1 & 2 on the TV (there are 4). (Despite the fact the TV settings give the option for "standard" and "enhanced" for ALL ports, and also I used to get 4k 60hz on any port.)
8) Tried setting the TV as the "primary" display device. & tried disconnecting my monitor.


Note: I do have an older AMD Radeon HD 6950 installed. But as mentioned above I uninstalled it (not physically). It was only re-installed automatically from Microsoft Update as part of boot process after I exited safe-mode. (I performed the clean uninstalls with DDU in safe-mode). I also currently have NO displays attached to the AMD card. I have no issues on my 2560 x 1440p monitor.


Please help. I am at a total loss.

Thanks,

Nick Peyton


----------



## kithylin

nrpeyton said:


> Help - Can't get 4k @ 60hz
> 
> --------
> *What does work:*
> 
> TV HDMI signal set to "standard" (in TV's settings, not windows or any PC settings):
> 
> 1080p 60hz
> 4k but only up to 30hz
> 
> TV HDMI Signal set to "enhanced": (in TV settings)....:
> 
> 1080p 60hz only
> -------------
> 
> 
> For a start-- the above seems back-to-front.
> 
> I should be able to set the HDMI signal format on TV to "enhanced". Then in Windows set resolution to 3840 x 2160 @60hz and there should be no issues. But I'm having no luck
> 
> I don't get a picture.
> 
> 
> *Here's what I've tried so far:*
> 1) Replacing the HDMI cable with a brand new one (the new one is HDMI 2.0 certified and can do 4k at 60hz). (from cables.co.uk)
> 2) Ensuring TV's HDMI signal is set to "advanced" and not "standard" format. <-- with "enhanced" I can't even get 4k 30hz (TV signal format needs to be on 'standard' even just to get 30hz at 4k).
> 3) Uninstalling ALL Nvidia, AMD & Intel GPU drivers using the "DDU" utility. (as posted on EVGA support forums). Rebooting, then re-installing only the Nvidia software/drivers.
> 4) Running a full in-depth virus scan using the latest edition of AVG & running a full Windows Defender scan.
> 5) Resetting TV back to factory defaults. (this does not remove updates received by the TV - which is currently on version 6.0.1 and is reporting as fully up to date & no new updates are currently available).
> 6) Checked that intel's iGPU is disabled in system BIOS. (and confirming this by checking in Windows that it's not displayed in GPU-Z -- which it isn't)!
> 7) Tried HDMI ports 1 & 2 on TV (there are 4). (despite the fact the TV settings give option for "standard" and "enhanced" for ALL ports & also I used to get 4k 60hz on any port
> 8) Tried setting the TV as the "primary" display device. & tried disconnecting my monitor.
> 
> 
> Note: I do have an older AMD Radeon HD 6950 installed. But as mentioned above I uninstalled it (not physically). It was only re-installed automatically from Microsoft Update as part of boot process after I exited safe-mode. (I performed the clean uninstalls with DDU in safe-mode). I also currently have NO displays attached to the AMD card. I have no issues on my 2560 x 1440p monitor.
> 
> 
> Please help. I am at a total loss.
> 
> Thanks,
> 
> Nick Peyton


Are you using a Certified Premium High Speed HDMI cable, to make sure your cable supports the 18 Gbps connection needed for 4K @ 60 Hz?

See here: https://www.hdmi.org/manufacturer/premiumcable/Premium_HDMI_Cable_Certification_Program.aspx

You may be using some generic cable that may not be up to the task.

Also, I've never tried this myself but.. have you tried going into the monitor settings in Windows? Can you give a screenshot? Does it not even offer you the option to select 60 Hz for the TV? Does Windows only offer 30 Hz?
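The bandwidth requirement is easy to sanity-check: with the standard CTA-861 blanking (a 4400 x 2250 total raster), 4K @ 60 Hz 8-bit RGB needs a 594 MHz pixel clock, past HDMI 1.4's 340 MHz TMDS limit but within HDMI 2.0's 600 MHz:

```python
# Why 4K @ 60 Hz needs an 18 Gbps class (HDMI 2.0) link: the total
# raster including blanking is 4400 x 2250 pixels per CTA-861, and at
# 8-bit RGB each pixel carries 24 bits of payload.
h_total, v_total, refresh = 4400, 2250, 60
pixel_clock_mhz = h_total * v_total * refresh / 1e6
data_gbps = pixel_clock_mhz * 24 / 1000
print(f"pixel clock: {pixel_clock_mhz:.0f} MHz, payload: {data_gbps:.2f} Gbps")
# -> pixel clock: 594 MHz, payload: 14.26 Gbps
# over HDMI 1.4's 340 MHz pixel-clock limit, within HDMI 2.0's 600 MHz
```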


----------



## ZealotKi11er

Check this on your Nvidia Control Panel:

Also, from reading online, most people have problems with cables. And have you tried setting it to 4K/60Hz and then, once you get a black screen, unplugging the HDMI and plugging it back in?


----------



## buellersdayoff

nrpeyton said:


> Help - Can't get 4k @ 60hz
> 
> --------
> *What does work:*
> 
> TV HDMI signal set to "standard" (in TV's settings, not windows or any PC settings):
> 
> 1080p 60hz
> 4k but only up to 30hz
> 
> TV HDMI Signal set to "enhanced": (in TV settings)....:
> 
> 1080p 60hz only
> -------------
> 
> 
> For a start-- the above seems back-to-front.
> 
> I should be able to set the HDMI signal format on TV to "enhanced". Then in Windows set resolution to 3840 x 2160 @60hz and there should be no issues. But I'm having no luck
> 
> I don't get a picture.
> 
> 
> *Here's what I've tried so far:*
> 1) Replacing the HDMI cable with a brand new one (the new one is HDMI 2.0 certified and can do 4k at 60hz). (from cables.co.uk)
> 2) Ensuring TV's HDMI signal is set to "advanced" and not "standard" format. <-- with "enhanced" I can't even get 4k 30hz (TV signal format needs to be on 'standard' even just to get 30hz at 4k).
> 3) Uninstalling ALL Nvidia, AMD & Intel GPU drivers using the "DDU" utility. (as posted on EVGA support forums). Rebooting, then re-installing only the Nvidia software/drivers.
> 4) Running a full in-depth virus scan using the latest edition of AVG & running a full Windows Defender scan.
> 5) Resetting TV back to factory defaults. (this does not remove updates received by the TV - which is currently on version 6.0.1 and is reporting as fully up to date & no new updates are currently available).
> 6) Checked that intel's iGPU is disabled in system BIOS. (and confirming this by checking in Windows that it's not displayed in GPU-Z -- which it isn't)!
> 7) Tried HDMI ports 1 & 2 on TV (there are 4). (despite the fact the TV settings give option for "standard" and "enhanced" for ALL ports & also I used to get 4k 60hz on any port
> 8) Tried setting the TV as the "primary" display device. & tried disconnecting my monitor.
> 
> 
> Note: I do have an older AMD Radeon HD 6950 installed. But as mentioned above I uninstalled it (not physically). It was only re-installed automatically from Microsoft Update as part of boot process after I exited safe-mode. (I performed the clean uninstalls with DDU in safe-mode). I also currently have NO displays attached to the AMD card. I have no issues on my 2560 x 1440p monitor.
> 
> 
> Please help. I am at a total loss.
> 
> Thanks,
> 
> Nick Peyton


Had a similar issue with my Sony TV; might be a firmware bug with the TV? Try the other ports and check the manual for the correct 4k 60hz ports; some are 30hz only (on some models). Also could be a handshake issue: with the PC and TV on, set advanced, then fully reboot the TV, then try the setting in NVCP. I'm using HDMI 4; pretty sure I had issues with both 1 & 2.


----------



## Scotty99

Make sure you try all the HDMI ports; that is an old 4k TV and it's likely only some of the ports are HDMI 2.0 capable.


----------



## Mooncheese

nrpeyton said:


> Anyone using Excavator_cuda_9.1 ?
> 
> My GPU's memory allocation is now at over 9GB with NeoScrypt algorythm.
> 
> (Currently 9258 MB to be exact -- or 82.2% of total memory). Memory Controller Load is also averaging 65% with peaks up to 76%.
> 
> Is this normal?


That's normal, Neoscrypt is currently using 76% of my FE's memory. 

Using NiceHash. 

How are you mining? 

Crypto fell hard today, under $7k. I was making $10-11 a day with my primary 8700k and 1080 Ti PC a month ago (and another $5-7 with old 4930k + 980 Ti PC). I can't believe it's doing $4 a day now. 

I have to say, I'm GLAD this has happened as I'm not a miner, I'm just recouping the cost of hardware already purchased FOR GAMING. 

Posts like this disgust me: 

https://www.reddit.com/r/NiceHash/comments/7srrzf/hello_i_am_back_with_my_finished_8_1080_ti_strix/

I'm SophisticatedPeasant on there and I kinda went overboard with "feedback" against this arsehole but yeah, these people are really making PC Gaming non-viable in 2018.


----------



## feznz

kithylin said:


> That's obviously their idle voltage... see the voltage graph peaked during their benchmark and dropped to idle again... :doh:
> 
> 
> Also.. no offense intended but.. do you even understand how G-Sync & Boost 3.0 even work together? This is both working exactly how they're designed to work. Yes, of course you get capped at 60 FPS, that's the maximum refresh rate of your monitor. And your clocks are low because with Nvidia Boost 3.0, your gpu won't hit max clocks unless the current "work load" (rendering your game) actually requires it. The card actually does it's very best to clock as -LOW- as possible to maintain 60 FPS. This is by design to save power and heat. There's no physical point in your card running @ max clocks the entire time. That's just completely wasteful, and would make the card run hotter than necessary.












Not sure how your G-Sync works, but when I looked, my GPU clock stays the same and only the usage goes up and down; might be a newer version.
This is playing Fallout 4 with the monitoring graphs.

But got to say I am happy: had a bit of a play with my 8600k and managed to get 5GHz stable so far across all 6 cores.
My new CPU stability check is Assassins Creed Origins; it can bring an OC to a crash within minutes where Prime95 passes even after an hour of small FFTs on the same OC.


----------



## OrionBG

nrpeyton said:


> Help - Can't get 4k @ 60hz
> 
> --------
> *What does work:*
> 
> TV HDMI signal set to "standard" (in TV's settings, not windows or any PC settings):
> 
> 1080p 60hz
> 4k but only up to 30hz
> 
> TV HDMI Signal set to "enhanced": (in TV settings)....:
> 
> 1080p 60hz only
> -------------
> 
> 
> For a start-- the above seems back-to-front.
> 
> I should be able to set the HDMI signal format on TV to "enhanced". Then in Windows set resolution to 3840 x 2160 @60hz and there should be no issues. But I'm having no luck
> 
> I don't get a picture.
> 
> 
> *Here's what I've tried so far:*
> 1) Replacing the HDMI cable with a brand new one (the new one is HDMI 2.0 certified and can do 4k at 60hz). (from cables.co.uk)
> 2) Ensuring TV's HDMI signal is set to "advanced" and not "standard" format. <-- with "enhanced" I can't even get 4k 30hz (TV signal format needs to be on 'standard' even just to get 30hz at 4k).
> 3) Uninstalling ALL Nvidia, AMD & Intel GPU drivers using the "DDU" utility. (as posted on EVGA support forums). Rebooting, then re-installing only the Nvidia software/drivers.
> 4) Running a full in-depth virus scan using the latest edition of AVG & running a full Windows Defender scan.
> 5) Resetting TV back to factory defaults. (this does not remove updates received by the TV - which is currently on version 6.0.1 and is reporting as fully up to date & no new updates are currently available).
> 6) Checked that intel's iGPU is disabled in system BIOS. (and confirming this by checking in Windows that it's not displayed in GPU-Z -- which it isn't)!
> 7) Tried HDMI ports 1 & 2 on TV (there are 4). (despite the fact the TV settings give option for "standard" and "enhanced" for ALL ports & also I used to get 4k 60hz on any port
> 8) Tried setting the TV as the "primary" display device. & tried disconnecting my monitor.
> 
> 
> Note: I do have an older AMD Radeon HD 6950 installed. But as mentioned above I uninstalled it (not physically). It was only re-installed automatically from Microsoft Update as part of boot process after I exited safe-mode. (I performed the clean uninstalls with DDU in safe-mode). I also currently have NO displays attached to the AMD card. I have no issues on my 2560 x 1440p monitor.
> 
> 
> Please help. I am at a total loss.
> 
> Thanks,
> 
> Nick Peyton


This is what I've found in one article about this TV:

"A recent firmware update not only enabled 4:4:4 chroma reproduction at 3840×[email protected] resolution (once [HDMI Signal Format] was changed to “Extended Format“):

Extended HDMI signal format

…but also added HDR (high dynamic range) playback support – there’s now a [HDR Video] preset alongside the usual offerings of [Vivid], [Standard], [Cinema], etc. And unlike the HDR modes implemented by most TV manufacturers, Sony’s allows for the adjustments of many video parameters including [Brightness], [Contrast], [Gamma], [Black Level], and [Colour temperature]."

So check for any firmware updates and if not too much of a hassle, reset the TV to factory defaults.


----------



## kithylin

feznz said:


> Not sure how your G-Sync works but when I looked my GPU clock stays the same only the usage goes up and down might be a newer version.
> This is Playing Fallout4 monitoring graphs
> 
> But got to say I am Happy Had a bit of a play with my 8600k managed to get 5Ghz stable ...... so for across all 6 cores
> My new CPU stability program check is Assassins Creed Origins can bring a OC to a crash within Minutes where Prime95 even after an hour with small FFTs same OC


95% of 6-core and 8-core Coffee Lake chips do 5+ GHz; most do 5.1-5.2 or even 5.3. This isn't unusual or special anymore.


----------



## Scotty99

kithylin said:


> 95% of 6-core and 8 core coffee lake chips all do 5+ ghz, most do 5.1 - 5.2 or even 5.3, this isn't even unusual or special anymore.


Nah, it's just that most people on this forum don't actually stress test or check for WHEA errors.

Silicon Lottery buys thousands of CPUs, and only 72% of their 8700k's can do 5.0GHz with 1.4v:
https://siliconlottery.com/collections/coffeelake/products/8700k50g?variant=224965885964

If you went by this forum, every 8700k is stable at 1.35v lolol.


----------



## feznz

Scotty99 said:


> Nah, its just that most people on this forum dont actually stress test, or check for whea errors.
> 
> Silicon lottery buys thousands of CPU's, only 72% of their 8700k's can do 5.0ghz with 1.4v:
> https://siliconlottery.com/collections/coffeelake/products/8700k50g?variant=224965885964
> 
> If you went by this forum, every 8700k is stable at 1.35v lolol.


or AVX set to 5


----------



## Mooncheese

kithylin said:


> 95% of 6-core and 8 core coffee lake chips all do 5+ ghz, most do 5.1 - 5.2 or even 5.3, this isn't even unusual or special anymore.


72% will do 5.0 GHz at or under 1.4v, 43% will do 5.1 GHz, 16% can do 5.2 GHz according to silicon lottery. 

Going by the 8700k sub-forum, quite a few members actually cannot attain 5.0 GHz with 1.4v and many of those who thought they were stable at 5.0-5.1, 5.2 have since returned to report the contrary and that securing the given freq. will necessitate more than 1.4v. 

https://siliconlottery.com/collections/coffeelake


----------



## feznz

Oops, not to derail the thread; back to something GPU related (BTW, no WHEA errors in a day, so no gremlins changing settings or misreporting so far... we'll see after a week).

Anyway, finally have a great use for GeForce. Fallout 4 is on sale 50% off on Steam, and I promised myself to replay it with the high resolution textures at the right resolution (also, I am endeavouring to buy all games on Steam even if it costs more than standalone).
So I downloaded the game (85GB) and could not for the life of me set anything in the realms of fullscreen; it was all windowed mode even after editing the preferences.ini file. I almost installed Flawless Widescreen, but I remembered it didn't work back 3 years ago.
With all hopes dashed of playing this game in the correct format, I thought I'd give GeForce Experience a go. Last night none of the settings would download, so I gave up again and went for a 5GHz OC instead.
Anyway, after looking at this thread and knowing how G-Sync works, I thought I might give FO4 another go.
So I loaded GeForce Experience to see if any game settings had downloaded since last night, and booya, the FO4 settings were there. I applied them and everything in game was in perfect resolution.
Just went back to the loading screen, and no idea why GeForce Experience didn't set every setting to ultra, so I did that manually. I am one happy camper with GeForce Experience.


----------



## Mooncheese

feznz said:


> Opps not to debunk this thread back to something GPU related (BTW no whea errors in a day)  so no gremlins changing settings or misreporting so far........ see after a week.
> 
> Anyway finally have a great use for GeForce. Fallout 4 is on sale 50% off on steam I promised myself to replay with high resolution textures and at the right resolution (Also I am endeavouring to buy all games on steam even if it cost more than stand alone)
> so download the game 85Gb cannot for the life of me set anything in the relms of fullscreen and all in windowed mode even after editing the perferences.ini file almost went to install flawless wide screen but I remembered back 3 years ago it didn't work then.
> With all hopes dshed of playing this game in the correct format I thought maybe I will give GeExperience a go lastnight none of the settings would down load so gave up again and went for a 5Ghz OC instead.
> anyway after looking at this thread knowing how G-sync works I thought I might give FO4 another go.
> so loaded GeExperience to see if any games settings had downloaded since lastnight and booya FO4 settings were there so I applied them everything in game was in perfect resolution
> Just went back to the loading screen and no idea why with GeExperience it didn't set every setting to ultra so manually done this so I am one happy camper with GeExperience.


God Rays on anything higher than Medium will consume about 20 FPS at 2560x1440. 

Shadow Render Distance should not be set higher than low or medium, otherwise you will have severe draw-call-induced CPU bottlenecks in larger areas.

Playing in Survival mode is an absolute must. It introduces so much realism that the game is lacking, like the need to eat, drink and sleep + the enemies are more difficult + you get 2x XP for killing them. I wish I knew how much depth this added when I first started playing FO4. I didn't turn Survival on until I already had like 300 hours into the game and was a higher level character. 

Some of the mods are must have, one of my favorites is Pip Boy Shadows, extremely cool: 

https://www.nexusmods.com/fallout4/mods/281/


----------



## feznz

Mooncheese said:


> 72% will do 5.0 GHz at or under 1.4v, 43% will do 5.1 GHz, 16% can do 5.2 GHz according to silicon lottery.
> 
> Going by the 8700k sub-forum, quite a few members actually cannot attain 5.0 GHz with 1.4v and many of those who thought they were stable at 5.0-5.1, 5.2 have since returned to report the contrary and that securing the given freq. will necessitate more than 1.4v.
> 
> https://siliconlottery.com/collections/coffeelake


I wish there was a like or rep button; you were right about people reporting back later. TBH I am down to 4.8GHz.
Just getting back into Fallout 4, so I will keep the suggestions in mind. TBH it feels a bit backwards turning down the graphics quality, but I'll keep an eye on CPU usage. And I did notice the in-game mods there too.
I lost Dogmeat last time; I think I told him to stay and forgot where I left him.


----------



## kithylin

Mooncheese said:


> God Rays on anything higher than Medium will consume about 20 FPS at 2560x1440.
> 
> Shadow Render Distance should not be higher than low or medium otherwise you will have severe draw call induced CPU bottleneck problems in larger areas.
> 
> Playing in Survival mode is an absolute must. It introduces so much realism that the game is lacking, like the need to eat, drink and sleep + the enemies are more difficult + you get 2x XP for killing them. I wish I knew how much depth this added when I first started playing FO4. I didn't turn Survival on until I already had like 300 hours into the game and was a higher level character.
> 
> Some of the mods are must have, one of my favorites is Pip Boy Shadows, extremely cool:
> 
> https://www.nexusmods.com/fallout4/mods/281/


It must feel pretty terrible inside to pay all that money for the 2nd fastest gpu available and still have to turn down the graphics at all from ultra..


----------



## feznz

Could be the architecture of the time... I have turned every single setting to max, but I won't talk too fast this time; more testing required, also at 3840x1600. So far everything has been fine.
Money, they are still printing it... not sure about mining it though; after a disastrous start to the year, hopefully the GPU market will stabilise before Volta.
I'm not sure if I will make the move. TBH I only upgrade when I see the actual need, i.e. a game is unplayable @ high settings at native resolution.
But still, not till a more graphically challenging game comes out will I consider an upgrade.


----------



## kithylin

feznz said:


> could be the architecture of the time ... I have turned every single setting to max but I won't talk too fast this time more testing required also on 3840x1600p so far everything has been fine
> money they are still printing it.... not sure about mining it though after a disastrous start to the year hopefully the GPU market will stabilise before volta
> I not sure if I will make the move TBH I only upgrade when I see the actual need ie a game is unplayable @ high settings with native resolution
> But still till a more Graphics challenging game comes out before I will consider a upgrade


I upgrade on the 100%'s, pretty much: when a new GPU comes out that is literally a full +100% performance-wise over my current card.

About the 1080 Ti specifically though: even if sometimes I'm playing the same older games I've been playing for years and it doesn't really "load" the GPU very much, it will at least greatly reduce power usage playing such games at max settings. And it gives me the option: if I wanted to play a modern AAA game, I could do that and have a GPU capable of doing very well in that also. That's pretty much what I love about the 1080 Ti: it's not just great for modern games (which it is very good at), it's also extremely low power when playing older games, if we want to do that. Sometimes I play ARK: Survival Evolved, which will run my 1080 Ti @ 99% load constantly. Sometimes I decide I want to visit an older game and play Skyrim for a few hours, and running Skyrim @ 16-24 watts from the GPU at ultra is great. Sometimes I want to play PUBG, sometimes Overwatch, sometimes Fallout 4. It's the best all-around card I've had in the history of owning video cards: it does everything I could want for any game I decide to play, no matter how new or old the game is.


----------



## SavantStrike

kithylin said:


> It must feel pretty terrible inside to pay all that money for the 2nd fastest gpu available and still have to turn down the graphics at all from ultra..


Third fastest...

Hopefully SLI doesn't die with consumer Volta/Ampere.


----------



## Mooncheese

feznz said:


> I wish there was a like or rep button you were right on report back later with TBH I am down to 4.8Ghz.
> Just getting back into Fallout 4 so I will keep in mind the suggestions TBH it feels a bit backwards with the graphics quality but keep an eye on CPU usage. and I did notice the in game mods there too.
> I lost Dogmeat last time I think I told him to stay and forgot were I left him


I think Dogmeat returns to Sanctuary if you lose him.

Oh and you lose Fast Travel ability in Survival, which is actually kinda great because there are so many random encounters you miss fast travelling everywhere and it's so not immersive being able to do that. In survival, you will be forced to hump it, and say you run out of water or food along the way, well now you have to lurk into some disused military facility turned super mutant lair etc. It's WAY better on Survival. Like 10x better. It should've been the default mode.

Oh and you can only save when you sleep...which you need to do otherwise you lose perception etc.


----------



## mouacyk

The most inviting features of a declining franchise are revealed in an OCN 1080ti thread. I love it. Will actually buy it now. Loved metro in ranger hardcore mode. F1 and F2 are still the best.


----------



## steezebe

Besides the EVGA Kingpin hydro, are there any single slot 1080ti cards with a waterblock? All of the cards so far seem to be double wide. (As if you can find a bloody card at a reasonable price)...


----------



## SavantStrike

steezebe said:


> Besides the EVGA Kingpin hydro, are there any single slot 1080ti cards with a waterblock? All of the cards so far seem to be double wide. (As if you can find a bloody card at a reasonable price)...


Out of the box with a preinstalled block - I don't think there are.

If you are willing to install the block yourself any founders edition card can be converted to single slot, although those cards are practically unicorns as most of the reference PCB AIB cards added DVI ports.


----------



## Mooncheese

steezebe said:


> Besides the EVGA Kingpin hydro, are there any single slot 1080ti cards with a waterblock? All of the cards so far seem to be double wide. (As if you can find a bloody card at a reasonable price)...


MSI 1080 Ti Seahawk. 

Add your own waterblock to whatever card of your choosing?


----------



## Mooncheese

mouacyk said:


> The most inviting features of a declining franchise are revealed in an OCN 1080ti thread. I love it. Will actually buy it now. Loved metro in ranger hardcore mode. F1 and F2 are still the best.


Haven't played Metro in a while, surprised they haven't released a sequel since Last Light Redux in 2015.


----------



## steezebe

SavantStrike said:


> Out of the box with a preinstalled block - I don't think there are.
> 
> If you are willing to install the block yourself any founders edition card can be converted to single slot, although those cards are practically unicorns as most of the reference PCB AIB cards added DVI ports.


yeah that DVI port is quite bothersome. I wonder if the card will work with a simple desolder of the port... (?) 



Mooncheese said:


> MSI 1080 Ti Seahawk.
> 
> Add your own waterblock to whatever card of your choosing?


the seahawk is a double slot, unless I'm missing something?

Perhaps this whole search is in vain; even a used 1080ti is 50% over msrp on ebay... not the best time for a pc gamer, is it.


----------



## KedarWolf

steezebe said:


> yeah that DVI port is quite bothersome. I wonder if the card will work with a simple desolder of the port... (?)
> 
> 
> 
> the seahawk is a double slot, unless I'm missing something?
> 
> Perhaps this whole search is in vain; even a used 1080ti is 50% over msrp on ebay... not the best time for a pc gamer, is it.


Go to www.nowinstock.net and get alerts for the Nvidia 1080 Ti FE from the Nvidia Store. They still sell them often at $699 USD, but they go out of stock again quickly.


----------



## Thoth420

Some crude cursory images of my new setup...still have to set up the mic, headphones and various recording equipment. Loving the 1080 Ti so far though!


----------



## SavantStrike

steezebe said:


> yeah that DVI port is quite bothersome. I wonder if the card will work with a simple desolder of the port... (?)
> 
> 
> 
> the seahawk is a double slot, unless I'm missing something?
> 
> Perhaps this whole search is in vain; even a used 1080ti is 50% over msrp on ebay... not the best time for a pc gamer, is it.


A desolder will do the trick. FE cards come up if you watch now in stock.

You could also go titan.. More expensive though.


----------



## DarthBaggins

You can also look to certain manufacturers who stuck w/ the factory PCB (my MSI Aero OC was one) that way you can single slot the card with no issue (EK's block for the FE & Factory OEM PCB's comes with a single slot adapter).


----------



## steezebe

KedarWolf said:


> Go to www.nowinstock.net Get alerts for the Nvidia 1080 Ti FE from the Nvidia Store, they still sell them often but go out of stock again quick, for $699 USD.


Anymore, nowinstock has buy bots that make the data obsolete before I can even click. It's almost a sick tease really, hah.



SavantStrike said:


> A desolder will do the trick. FE cards come up if you watch now in stock.
> 
> You could also go titan.. More expensive though.





DarthBaggins said:


> You can also look to certain manufacturers who stuck w/ the factory PCB (my MSI Aero OC was one) that way you can single slot the card with no issue (EK's block for the FE & Factory OEM PCB's comes with a single slot adapter).



Thanks for the FE tip; that may be the way I go, hopefully before the next gen cards. Does the thread have a compilation of cards that are FE or factory PCB design? It might be useful...


----------



## kithylin

steezebe said:


> Thanks for the FE tip; that may be the way I go, hopefully before the next gen cards. Does the thread have a compilation of cards that are FE or factory PCB design? It might be useful...


Just use the EK Cooling Configurator on their website and select the model of card you want to use. If it has the FE PCB then they'll tell you to use the Titan/1080 Ti FE block. Easy peasy.


----------



## Mooncheese

steezebe said:


> yeah that DVI port is quite bothersome. I wonder if the card will work with a simple desolder of the port... (?)
> 
> 
> 
> the seahawk is a double slot, unless I'm missing something?
> 
> Perhaps this whole search is in vain; even a used 1080ti is 50% over msrp on ebay... not the best time for a pc gamer, is it.




It is double slot, I apologize. 

Yeah, really, at this point you're better off waiting for Volta; an 1170 ought to be as fast as our 1080 Ti but at 180W, or you could go with the 1180 which, if history is any indication, will probably be 15-20% faster at 220W.

I wouldn't buy a 1080 Ti now, especially not at anything over $700. 

If you've been following this thread you may remember I stated (about 5-10 pages back) that the GPU shortage isn't JUST demand driven; it's mostly supply driven, as Nvidia has in all likelihood retooled and resupplied their production facilities with components for Volta, is probably making Volta as I type this (it's due out in May), and is no longer making Pascal, period.

March, April, May? 


If you F5 vigilantly some time in May during Volta's launch you will be able to purchase an 1170 or 1180 at MSRP directly from Nvidia.


----------



## Mooncheese

Thoth420 said:


> Some crude cursory images of my new setup...still have to set up the mic, headphones and various recording equipment. Loving the 1080 Ti so far though!


Congrats on getting it up and running, love the purple!


----------



## nrpeyton

kithylin said:


> Are you using a Certified Premium High Speed HDMI cable to make sure your cable supports 18 Gbps connection needed for 4K @ 60 Hz?
> 
> See here: https://www.hdmi.org/manufacturer/premiumcable/Premium_HDMI_Cable_Certification_Program.aspx
> 
> You may be using some generic cable that may not be up to the task.
> 
> Also I've never tried this myself but.. have you tried going in to the monitor settings in windows? Can you give a screenshot? Does it physically not even offer you the option to select 60 Hz for the TV? Does windows only offer 30 Hz?


1) The cable is HDMI 2.0 certified, and lab tested to 17.8 Gbps (according to the website I bought it on and my order confirmation email).
Here's the cable here: http://www.cablesuk.co.uk/5m-HDMI-2.0-Cable_Smallest-Head-SUPREME-GOLD-'In-The-World'-p926.html
Cable Name: 

5m HDMI 2.0 Cable
Smallest Head SUPREME GOLD 'In The World'
HDMI 2.0 HD Resolution
Ideal For Gaming, HD and 3D Viewing
FULL HD 1080p, 1440p, 2160p, 4K Compatible
Lab tested to 17.8 Gigabit
3D Ready

2) *Yes *there is an option for 60hz in both Nvidia Control Panel & Windows Display.



buellersdayoff said:


> Had a similar issue with my Sony tv, might be a firmware bug with the tv? Try the other ports, check the manual for the correct 4k 60hz ports, some are 30hz only (on some models). Also could be a handshake issue, with pc and TV on, set advanced, then full reboot TV, then try the setting in nvcp. I'm using hdmi 4, pretty sure I had issues with both 1&2


According to this ALL HDMI ports are enabled:














OrionBG said:


> This is what I've found in one article about this TV:
> 
> "A recent firmware update not only enabled 4:4:4 chroma reproduction at 3840×2160@60Hz resolution (once [HDMI Signal Format] was changed to “Extended Format“):
> 
> Extended HDMI signal format
> 
> …but also added HDR (high dynamic range) playback support – there’s now a [HDR Video] preset alongside the usual offerings of [Vivid], [Standard], [Cinema], etc. And unlike the HDR modes implemented by most TV manufacturers, Sony’s allows for the adjustments of many video parameters including [Brightness], [Contrast], [Gamma], [Black Level], and [Colour temperature]."
> 
> So check for any firmware updates and if not too much of a hassle, reset the TV to factory defaults.


I have already tried resetting the TV back to factory defaults. :-(


----------



## ZealotKi11er

5m is a bit long. I got a 10 foot cable and that was giving me problems. You might want to try a 6 foot cable.


----------



## nrpeyton

ZealotKi11er said:


> 5M is a bit long. I got a 10 feet cable and that was giving me problems. You might want to try 6 a feet cable.


Even though it's tested to 18 Gbps and HDMI 2.0 and 4K certified?




--------




Update:

Okay I just received an email from Sony Support.

They referred me to a manual that pre-dates a firmware update that the TV had which fully enables 4k 60hz at HDR colours.

My TV does have the update, so the Sony support rep is actually wrong (he or she obviously never investigated it enough before emailing me a solution). If they'd researched it properly, they'd be aware Sony released the update, making the manual reference out of date.

ANYWAY: Regardless - I am going to try his suggestion anyway -- all this talk about cables got me thinking -- if I can reduce the load on the cable, maybe it MIGHT just work.

The manual states to receive 4k @ 60hz colour must be set to: YCbCr 4:2:0 / 8 bit only?

Now in the Nvidia control panel I don't see that option. Only options I see are, (see picture):












Above in control panel I only see:
RGB
YCbCr 4:2:2
YCbCr 4:4:4 


Where is YCbCr 4:2:0 *????*
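As a side note, a rough back-of-the-envelope makes it clear why that manual limits 4K @ 60Hz to YCbCr 4:2:0 / 8-bit: it's the only pixel format that fits in an HDMI 1.4-class link budget. Here is a minimal sketch in Python, assuming standard CTA-861 4K60 timing (4400×2250 total including blanking) and TMDS 8b/10b overhead; real HDMI packing for 4:2:2 differs slightly, so the numbers are approximate:

```python
# Back-of-the-envelope HDMI link-rate check for 3840x2160 @ 60 Hz.
# Assumes CTA-861 timing (4400x2250 total, including blanking) and
# TMDS 8b/10b encoding overhead; figures are approximate.

def tmds_rate_gbps(h_total, v_total, refresh_hz, bits_per_pixel):
    """Raw TMDS line rate in Gbit/s for a given timing and pixel format."""
    pixel_clock = h_total * v_total * refresh_hz   # pixels per second
    payload = pixel_clock * bits_per_pixel         # bits per second
    return payload * 10 / 8 / 1e9                  # add 8b/10b overhead

FORMATS = {
    "RGB / YCbCr 4:4:4, 8-bit": 24,   # bits per pixel
    "YCbCr 4:2:2, 8-bit":       16,
    "YCbCr 4:2:0, 8-bit":       12,
}

for name, bpp in FORMATS.items():
    rate = tmds_rate_gbps(4400, 2250, 60, bpp)
    fits_14 = rate <= 10.2   # HDMI 1.4 "High Speed" ceiling
    fits_20 = rate <= 18.0   # HDMI 2.0 ceiling
    print(f"{name}: {rate:.2f} Gbps  HDMI1.4={fits_14}  HDMI2.0={fits_20}")
```

At these assumed timings, 4:4:4 needs roughly 17.8 Gbps on the wire (hence HDMI 2.0 and Premium Certified cables), while 4:2:0 drops to about 8.9 Gbps, within reach of older cables.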


----------



## ZealotKi11er

nrpeyton said:


> Even though it's tested to 18 GB/s and HDMI 2.0 and 4k certified?


A lot of these cables are not used for PC applications. The PC does not do any processing if the signal is weak. I am having trouble with my 10 foot cable where some pixels flash bright white, which is caused by cable limitations. 5m is simply too long for PC use.


----------



## kithylin

nrpeyton said:


> Even though it's tested to 18 GB/s and HDMI 2.0 and 4k certified?


I linked you to the page and the information a few pages back and I'll re-link again and re-explain to you. Just because the place you bought your cable from "Says" it's "Tested to 18 GB/s" and "says" it's "4k certified" does not mean anything. If your cable does -NOT- have the Premium HDMI Certification logo, then it's not officially certified to work @ HDMI 2.0 @ the length you have.

I would suggest you read and understand the HDMI Premium Certification here: https://www.hdmi.org/manufacturer/premiumcable/faq.aspx This literally exists for exactly your problem. Many companies "Say" their cables "Are tested to XXX" but they actually are not. Only cables with the certification and the logo are. You probably bought a cable from some company that advertised it to be XX speed @ XX resolution but as you're finding out, it actually is not.

Also I would suggest you take a few minutes and read about this here: https://en.wikipedia.org/wiki/HDMI#Cables Yes, even for PC usage, signal quality degrades over distance, and it depends entirely on cable construction quality and the components used in said cable. And if the cable isn't Premium Certified, it probably will break down trying to push 4K @ 60 Hz at anything longer than 8ft. (Generally 8ft is the maximum distance HDMI 2.0 is supposed to work at, as a general rule of thumb.) It doesn't necessarily depend on the exact resolution; it depends on bandwidth. I'd have to look it up, but for example 4K @ 60 Hz has (roughly) about the same bandwidth requirements as 1440p @ 120 Hz, in terms of what you need from a cable.

I think you're probably either going to need to buy a shorter cable, or if you really need the length, shop around online for a Premium Certified 15ft cable. Do look for the logo and certification. And no the certifications can not be faked. They contain a QR code that you can use to look up to tie back to the HDMI registry and verify that the company's cable is actually Premium Certified.

Yes these cables do cost a little bit more money but if you want it to actually work then this is what you pay for and deal with. Otherwise you're going to actually have to buy (and return) multiple different cables from many different companies until you hopefully some day may get lucky and find one that works. It's up to you, and it's your money.

I needed a displayport cable to handle 1920x1080 @ 80 Hz @ 25ft. Sadly DisplayPort has no such certification program, and the monitor I have only does that with displayport, so I had to go through and buy and return about 18 different cables before I found one that finally worked.


----------



## nrpeyton

Okay I have sent the following email to the company where I bought the cable:

Hi Cables UK (Gill),

I have just tried another HDMI cable at 4k 60HZ and it works perfectly.

Therefore I know now that the cable is definitely the problem.

I have also done some further research on HDMI 2.0 cables (which is what my cable is advertised as on your website) and here is what it says:

“An HDMI Cable, that is tested against the version 1.4 High Speed testing specification (10.2Gbps), will support 4K content @ 30Hz . However, 4K support beyond 50Hz (typically at 60Hz in advanced 4K UHD TVs) can only be supported using an 18Gbps (17.8Gbps)
capable HDMI cable (or a v2.0 HDMI cable)”.

On your website for the cable that I purchased it is advertised as BOTH lab tested up to 18Gbps (17.8Gbps) AND HDMI 2.0 certified including ALL HDMI 2.0 features.


This leaves only three possibilities:

1)	The HDMI cable you sent me is faulty.
2)	The HDMI cable you sent me is not correctly advertised on your web site.
3)	You accidentally sent me the wrong cable.


Whichever of the above is the problem, this is not my fault.

Under the Consumer Protection Act I have the right to have this corrected.

Can you please send me a new cable which has all the features that were advertised for the cable that I purchased:
-All HDMI 2.0 features
-Lab tested to 18Gbps (17.8Gbps)
-4K

Also in the interest of making sure my new cable definitely works I also propose you send me a shorter cable. 3 metres. And something a lot thicker than the very thin cable you sent me last time.

I fear many of your cables are not in fact lab tested properly (as claimed) so I would like the best chance possible for my new cable to work without any further frustration/communication via email.

If you will not send me the new cable then can you please issue me an immediate refund, so that I might purchase something elsewhere.

Thank you.

Nicholas Peyton





P.S. Very angry and frustrated at the time I have wasted with Nvidia and Sony support representatives due to false/inaccurate advertising on your website.


The company & web page where I bought the cable: (I recommend to *AVOID *this company).

*FULL SIZE: *http://www.cablesuk.co.uk/5m-HDMI-2.0-Cable_Smallest-Head-SUPREME-GOLD-'In-The-World'-p926.html


----------



## KCDC

Sorry to hear the troubles, man. I've been through it plenty, with many kinds of cables over the years, from BNC-style connections up to DisplayPort. My latest OLED TV came with HDMI cables that were poor knockoffs advertised as high speed. After swapping them, the jittery gameplay went away instantly.
I would suggest returning them for your money and getting truly certified cables, as I doubt any of theirs will hold up.

True certified HDMI cables will have this label on them, I would suggest only getting these as you will be assured they will perform to the current standard:


----------



## DarthBaggins

The last HDMI cable I bought had that same logo, which is why I bought it (it was for my PS4 Pro). I tend to stray away from HDMI when it comes to PC and just stick with DisplayPort, which I wish was an option for TVs without having to go with a passive/active adapter.


----------



## 2ndLastJedi

DarthBaggins said:


> Last HDMI cable I bought had that same logo, which is why I bought it (was for my PS4 Pro), I tend to stray away from HDMI's when it comes to PC and just stick with Display Port - which I wish was a option for TV's w/ out having to go with a passive/active adapter.


I've been using HDMI with my PC for years now, and just recently got a Rift. I only have one HDMI out on my Galax 1080 Ti, so I got a DisplayPort to HDMI cable, but now upon boot I always have a black screen after the BIOS and need to power cycle my 4K Sony to get the picture working.
But getting 4K/60 hasn't been an issue for me with either HDMI or this new DP to HDMI cable, thankfully, and I have always just bought cheap cables lol. This DP to HDMI was $19 AUD: http://www.msy.com.au/display-cable/18858-cablelist-cl-dphdmi3m4k-3-meter-displayport-to-hdmi-copper-cable-4k-support.html


----------



## Thoth420

Mooncheese said:


> Congrats on getting it up and running, love the purple!


Thanks, I am in love with this system already, even just at stock. Can't wait to start streaming and content creating.
Everything runs butter smooth at max settings, less some of the more aggressive AA (for example 8x MSAA), which isn't necessary. I still have to test Rise of the Tomb Raider though.

Ultrawide is one of those things I’ll never be able to give up now.


----------



## Mooncheese

Thoth420 said:


> Thanks I am in love with this system already even just at stock. Can’t wait to start streaming and content creating.
> Everything runs butter smooth at max settings less some of the more agressive AA like for example 8x MSAA which isn’t necessary. I still have to test Rise of the Tomb Raider though.
> 
> Ultrawide is one of those things I’ll never be able to give up now.


That's how I feel about 3D Vision. It's a shame it doesn't have wider adoption; there are like no new 3D Vision monitors. I only play maybe 25% of my games in 3D Vision (because it's just so demanding at 2560x1440; 60 FPS 3D is 120 FPS 2D), but with games where it works properly, holy cow. If you see this, you won't want to play, say, The Witcher 3 or Rise of the Tomb Raider in 2D ever again.


----------



## Scotty99

Glad you got it working, but that's pretty crazy that cable didn't work. I kid you not, I could take any HDMI cable in my house from years ago and it would work at 4K 60Hz; you need a super, super cheap cable for 60Hz not to work, something so cheap I've never run across one.


----------



## KCDC

DarthBaggins said:


> Last HDMI cable I bought had that same logo, which is why I bought it (was for my PS4 Pro), I tend to stray away from HDMI's when it comes to PC and just stick with Display Port - which I wish was a option for TV's w/ out having to go with a passive/active adapter.


If they're using that logo and still selling an inferior product, then that's blatant false advertising and should be reported. The point of that QR code is to verify its integrity.

I use these for everything HDMI:

https://www.amazon.com/Tera-Grand-Certified-Aluminum-Playstation/dp/B01MFXEG1X/

fixed my fps issues.

I also agree that DP is the way to go, but it's the same issue with them as it is with HDMI, I've had crappy DP cables as well based on the same quality problem.


----------



## feznz

3D Vision is nice; I just found passive 3D TVs very easy on the eye compared to Nvidia's active 3D, where after about an hour I feel the eye strain, but it differs between people.

As for HDMI cables, I have always used the one supplied with the monitor/Blu-ray player and never actually had to buy one so far... but I will keep this in mind.


----------



## kithylin

Scotty99 said:


> Glad you got it working but thats pretty crazy that cable didnt work. I kid you not i could take any hdmi cable in my house from years ago and it would work at 60hz 4k, you need a super super cheap cable for 60hz not to work, something so cheap ive never ran across one.


Except these days I would say somewhere around 80% of generic hdmi cables sold are imported from sweat shops in china and don't actually run at their rated speeds. Why do you think the Certified Premium program was even created? It's so bad that they had to go -THAT FAR-. Most generic, longer-length hdmi cables out there today (longer than 10 foot / 3 meters) will struggle to do well with 1080p @ 60 hz even. Some of them won't even establish signal to connect at all for 4K/30 Hz, if it's like 15+ FT long or more.


----------



## 2ndLastJedi

I can vouch for this 3 meter DP to HDMI getting 4K /60 and their straight HDMI cables are the same and CHEAP .
http://www.msy.com.au/display-cable...playport-to-hdmi-copper-cable-4k-support.html


----------



## mouacyk

Has anyone had issues running any of the tests from the GPUTest v0.7 suite? My 980 Ti didn't have any problems, but the 1080 Ti will lock up the screen after a test starts, running at slideshow fps. Could it be incompatible OpenGL drivers?


----------



## 2ndLastJedi

Just put my 1080Ti on Gumtree for sale if anyone is interested. 
Only $1450  
May as well try hey, lol


----------



## KraxKill

Is anybody else mining with their TI? I'm mining ETH at 40.1mh/s while using only 200w at the wall from the entire rig. The Ti is assumed to be pulling around 170w. Fairly efficient if you ask me.
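For anyone comparing notes, the efficiency claim works out like this; a quick sketch using the numbers from this post (the 170W card draw is the poster's estimate, not a measured figure):

```python
# Hashrate-per-watt sanity check for the ETH numbers quoted above:
# 40.1 MH/s, ~170 W estimated for the card, ~200 W at the wall.

def efficiency_mh_per_watt(hashrate_mh, watts):
    """Mining efficiency in MH/s per watt."""
    return hashrate_mh / watts

card = efficiency_mh_per_watt(40.1, 170)   # card only (estimated draw)
rig = efficiency_mh_per_watt(40.1, 200)    # whole rig at the wall
daily_kwh = 200 * 24 / 1000                # rig energy use per day

print(f"card: {card:.3f} MH/W, rig: {rig:.3f} MH/W, {daily_kwh:.1f} kWh/day")
```

That is roughly 0.24 MH/s per watt at the card and about 4.8 kWh a day for the rig, which is indeed on the efficient side for a shunt-modded, watercooled Ti.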


----------



## ColdDeckEd

KraxKill said:


> Is anybody else mining with their TI? I'm mining ETH at 40.1mh/s while using only 200w at the wall from the entire rig. The Ti is assumed to be pulling around 170w. Fairly efficient if you ask me.


That's pretty good; what are your settings, if I may ask?

I find the ti is more profitable at zcash, getting 760ish sol/s.


----------



## KraxKill

1760MHz core and 6143MHz memory on a water-blocked card with a shunt mod. Temp is at 32C while mining.


----------



## KedarWolf

kithylin said:


> That's obviously their idle voltage... see the voltage graph peaked during their benchmark and dropped to idle again... :doh:
> 
> 
> Also.. no offense intended but.. do you even understand how G-Sync & Boost 3.0 even work together? This is both working exactly how they're designed to work. Yes, of course you get capped at 60 FPS, that's the maximum refresh rate of your monitor. And your clocks are low because with Nvidia Boost 3.0, your gpu won't hit max clocks unless the current "work load" (rendering your game) actually requires it. The card actually does it's very best to clock as -LOW- as possible to maintain 60 FPS. This is by design to save power and heat. There's no physical point in your card running @ max clocks the entire time. That's just completely wasteful, and would make the card run hotter than necessary.


I understand what you're saying, but maxing out the voltage point solved my constant stuttering in games. And I have my 1080 Ti on its own 360 rad with an EK block and backplate and 17 W/mK Fuji pads, and get about 36C when playing Diablo, so it's really not an issue.


----------



## KedarWolf

nrpeyton said:


> Still can't get 4k @ 60hz.
> 
> Max that seems to work is 4k @ 30hz.
> 
> Already replaced the HDMI cable with a brand new one.
> 
> What could I be missing?


I'm pretty sure HDMI is limited to 30hz at 4K and you need to use display port for 4K 60hz.


----------



## KCDC

KedarWolf said:


> I'm pretty sure HDMI is limited to 30hz at 4K and you need to use display port for 4K 60hz.



HDMI 2.0a/b should be able to display 4K@60.

Upcoming 2.1 is rated @120


----------



## SavantStrike

KedarWolf said:


> I'm pretty sure HDMI is limited to 30hz at 4K and you need to use display port for 4K 60hz.


HDMI 2.0 and above do 4k60
2.0a adds HDR
2.0B adds more HDR
2.1 adds 4k 120 and 8k 60, along with magical 10k 120


----------



## 2ndLastJedi

4k/60 here on both HDMI and DP to HDMI cables. 
Massive price drop on my 1080Ti, now only $1200aud!


----------



## feznz

^^ That's the way I understood it too. I can't understand what the actual physical difference is between 1.4 and 2.0 HDMI cables, though.
But that opens the door to repackaging 1.4 cables as 2.0, and hey, if only 1 in 10 people realise the difference, it's one way to get rid of old stock with minimal fuss. You still get 4K, but at 30Hz; most people will never know.


----------



## SavantStrike

feznz said:


> ^^ thats the way I under stood it too I cannot understand what the actual physical difference is between 1.4 and 2.0 HDMI cables though.
> But opens up the repackaged 1.4 version and hey if only 1/10 of people realise the difference then one way to get rid of old stock with minimal fuss. still get 4k but @ 30Hz most people will never know


I really dislike 30hz anything. I'm pretty sure that people would notice if watching fast paced content.

The cables themselves aren't necessarily any different, but the spec to which they are tested is higher. An old high-quality 1.4 cable can probably do 2.0b with no trouble. 2.1 might be reachable but it's probably pushing it.


----------



## kithylin

SavantStrike said:


> I really dislike 30hz anything. I'm pretty sure that people would notice if watching fast paced content.
> 
> The cables themselves aren't necessarily any different, but the spec to which they are tested is higher. An old high-quality 1.4 cable can probably do 2.0b with no trouble. 2.1 might be reachable but it's probably pushing it.


If all you're doing with an HDMI cable is watching Blu-ray / 4K movies @ 4K from a set-top box with a physical disc in it, then 30Hz is enough; movies are only displayed at 23-24 frames per second anyway. For PC gaming, though, yes, you want 60Hz minimum. You'd need at least a pair of 1080 Ti's to play any modern AAA title @ 4K/60 with all the settings, anti-aliasing, bells, whistles, and doohickeys on, every game setting cranked to max, if you want 60 FPS minimums everywhere.


----------



## 2ndLastJedi

kithylin said:


> If all you're doing with an hdmi cable is watching blu ray / 4K movies @ 4K, from a set-top box with a physical disc in it, then 30hz is enough. Movies are only displayed at 23-24 frames per second anyway. So for that, 30hz is more than enough. For PC gaming though yes you want 60 Hz Minimum. You'd need at least a pair of two 1080 Ti's to play any modern AAA title @ 4K/60 with all the settings, anti-aliasing, bells, whistles, and doohickeys on though, all the game settings cranked to max for every title if you want 60 FPS minimums everywhere.


Destiny 2, 4K/60 maxed; Wolfenstein 2 / PC2 the same. I know there are more too.


----------



## SavantStrike

kithylin said:


> If all you're doing with an hdmi cable is watching blu ray / 4K movies @ 4K, from a set-top box with a physical disc in it, then 30hz is enough. Movies are only displayed at 23-24 frames per second anyway. So for that, 30hz is more than enough. For PC gaming though yes you want 60 Hz Minimum. You'd need at least a pair of two 1080 Ti's to play any modern AAA title @ 4K/60 with all the settings, anti-aliasing, bells, whistles, and doohickeys on though, all the game settings cranked to max for every title if you want 60 FPS minimums everywhere.


Some movies are higher frame rate (there are some 48 fps titles). Broadcast television is at 60 too, so 30 really is kind of lame. It should never have been a standard, but TV vendors didn't want to wait for HDMI 2.0


----------



## sblantipodi

I hope that Nvidia fails soon.
There is really no reason why a 1080 Ti should cost 1100-1200€ in Europe.

I really hope they fail and get their ass kicked by AMD.


----------



## KickAssCop

I heard you like 1080 Ti.







Will do cable management later.


----------



## the_real_7

KickAssCop said:


> I heard you like 1080 Ti.
> 
> 
> 
> 
> 
> 
> 
> Will do cable management later.


If that was a gaming rig, I'm sure you would get the props you're looking for, but that is mostly frowned upon around here, since most people right now can't even get a gaming card to enjoy themselves.


----------



## Mooncheese

That Nvidia has halted production of certain, if not all Pascal GPU's is confirmed: 

http://www.guru3d.com/news-story/ru...-geforce-gtx-2070-and-2080-on-april-12th.html

"It's now said and written that Nvidia halted production of the GP102 (e.g. 1080 Ti / Titan X) and likely GP104 GPU (e.g. 1070/1080), all originally released back in 2016. So, the question then arises, what is happening with Volta GPUs? Where are they?"

As I stated, the current GPU scarcity, and the prices that reflect it, isn't only demand driven; it is supply driven, amplified by demand from miners.

DO NOT pay $1k+ for a 1080 Ti right now. This situation will not hold. You will be able to buy a GPU directly from Nvidia some time in May. As to the Ampere rumors, I don't know.


----------



## kithylin

sblantipodi said:


> I hope that nvidia will fail soon.
> There is really no reason why a 1080 Ti should cost 1100-1200€ in europe.
> 
> I really hope that they will fail and get kicked ass by amd.


Really? I guess you don't understand market economics then. It's not nvidia selling their gpu's for 1100-1200€, it's the retailers and current market demand. Nvidia is still releasing them at around 570€ or so, it's just that the current demand for their products is so very high that by the time it gets to stores, stores are all charging 1100-1200€ for the same 570€ item because that's what the current market is for them, due to extremely high demand for the video cards and low supply rate. It's called supply-and-demand. We're the same way over here in the USA, original product was $699, and that's what nvidia still sells em for, and then the retailers mark em up because that's what they're currently going for globally. There's one 3rd party vendor I know of (EVGA) that sells direct to the public on their website and they have "decent" prices of around $779 - $850 for 1080 Ti's. But most other 3rd party retail places like newegg and amazon still charge around $1200 - $1500 for em.

So.. don't blame nvidia, it's not their fault. Blame the market at the moment and retailers.

Speaking of that, EVGA currently has a Hydro Copper water-cooled 1080 Ti @ $929 in stock on their website: https://www.evga.com/products/product.aspx?pn=11G-P4-6599-KR

And there's a reference model / Blower type / FE card for $899.99 on newegg right now: https://www.newegg.com/Product/Product.aspx?Item=N82E16814125989


----------



## KCDC

kithylin said:


> If all you're doing with an hdmi cable is watching blu ray / 4K movies @ 4K, from a set-top box with a physical disc in it, then 30hz is enough. Movies are only displayed at 23-24 frames per second anyway. So for that, 30hz is more than enough. For PC gaming though yes you want 60 Hz Minimum. You'd need at least a pair of two 1080 Ti's to play any modern AAA title @ 4K/60 with all the settings, anti-aliasing, bells, whistles, and doohickeys on though, all the game settings cranked to max for every title if you want 60 FPS minimums everywhere.



7680x1440 definitely needs SLI 1080 Tis for 60fps on major titles. That's almost 3 million pixels more than 4K, though. I can keep a stable 60fps in ME:A if I'm at 2000 on the cards; that's ultra/max/extreme/whatever settings.

Vulkan seems to keep a solid 60fps in Doom at that res with no stutter or frame drops.
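For what it's worth, the pixel arithmetic behind that comparison can be checked in a couple of lines (resolutions as discussed above):

```python
# Compare total pixel counts: 4K UHD vs. a 7680x1440 triple-wide setup.
uhd_4k = 3840 * 2160        # 8,294,400 pixels
triple_wide = 7680 * 1440   # 11,059,200 pixels

diff = triple_wide - uhd_4k
print(f"4K UHD:     {uhd_4k:,}")
print(f"7680x1440:  {triple_wide:,}")
print(f"Difference: {diff:,}")  # 2,764,800 -> roughly 3 million more pixels
```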


----------



## 2ndLastJedi

Well, I upgraded from a PS4 to a 1080 Ti, kind of bicycle to Ferrari, hey. And I saved up, because I'm a spoiled brat who works hard and has obligations to fulfill before I can buy my toys.


----------



## KickAssCop

Some seriously salty people here. The world is not USA. If your market doesn’t have cards what do I care?


----------



## 2ndLastJedi

I'm just trying to figure out what's wrong with an individual buying 10 graphics cards to make money with.
Just because one person has the means and can get their hands on 10 1080 Tis, why can't they do whatever they want with them?
Some people buy expensive cars only to trash them on the track, all while I drive a 200,000 km 2007 car.
Why do people begrudge others making money?
I personally have no idea how to mine; I have a 1080 Ti and a job, but if I could get rich from doing something easier than cleaning cars, why wouldn't I take it?


----------



## feznz

Blanket statement, but only you yourself can judge what is greedy.
Back on topic though: I personally found the 1080 Ti enough for a 3K screen (3840x1600), lowest FPS 50ish in ACO with all settings on max. I wonder if SLI is really required for 4K; just reduce a few settings and it should be able to hold 60 FPS.


----------



## ZealotKi11er

feznz said:


> blanket statement but only you, yourself can judge what is greedy.
> Back on topic though I personally found the 1080Ti enough for a 3k screen(3840x1600) lowest FPS 50ish with ACO all settings on max, I wonder if SLI is really required for 4k just reduce a few settings and should be able to hold 60FPS


SLI is dead now. No point to even bother with it.


----------



## kithylin

KickAssCop said:


> Some seriously salty people here. The world is not USA. If your market doesn’t have cards what do I care?


And next you're going to tell us that you expect to re-sell your cards in 12-24 months as part of your business plan for coin mining to "use the cards for free" for a while. And, this next part is a wild guess, but, probably you likely wouldn't think there would be any reason to tell the next owner that you ever even used them for coin mining either. Are you also overclocking them with a custom bios while coin mining too? Are you planning to flash back to stock bios when you sell em and not tell anyone?

All of this I mentioned above is "standard practice" of all coin miners in the world regardless of where you live. Things like that above are why we gamers -HATE- coin miners.


----------



## KickAssCop

I am a gamer first and foremost.


----------



## 2ndLastJedi

How can this be? And if it is, why hasn't everyone jumped all over it?


----------



## SavantStrike

kithylin said:


> And next you're going to tell us that you expect to re-sell your cards in 12-24 months as part of your business plan for coin mining to "use the cards for free" for a while. And, this next part is a wild guess, but, probably you likely wouldn't think there would be any reason to tell the next owner that you ever even used them for coin mining either. Are you also overclocking them with a custom bios while coin mining too? Are you planning to flash back to stock bios when you sell em and not tell anyone?
> 
> All of this I mentioned above is "standard practice" of all coin miners in the world regardless of where you live. Things like that above are why we gamers -HATE- coin miners.


If there's a custom BIOS for Pascal, count me in. Pascal is painfully BIOS-locked.

As for the discussion about the discontinued 1080 Ti/Titan Xp above, my guess is that GDDR5X is no longer in volume production.


----------



## Nineball_Seraph

KickAssCop said:


> Some seriously salty people here. The world is not USA. If your market doesn’t have cards what do I care?


Well your post was clearly a 4chan-level troll post with the intent to rile people up on a touchy subject. Did you expect people not to respond towards you in a negative manner?

So yeah people are going to be salty about someone being toxic. No offense.


----------



## ZealotKi11er

kithylin said:


> And next you're going to tell us that you expect to re-sell your cards in 12-24 months as part of your business plan for coin mining to "use the cards for free" for a while. And, this next part is a wild guess, but, probably you likely wouldn't think there would be any reason to tell the next owner that you ever even used them for coin mining either. Are you also overclocking them with a custom bios while coin mining too? Are you planning to flash back to stock bios when you sell em and not tell anyone?
> 
> All of this I mentioned above is "standard practice" of all coin miners in the world regardless of where you live. Things like that above are why we gamers -HATE- coin miners.


I do not think that is how it works. You want to keep mining with GPUs way past their ROI. I am sure people will say their GPU was not used for mining when they sell it. Never believe anyone lol. Just buy a GPU that you know can survive even the worst conditions and has warranty left. I personally would not buy a GPU from a miner; it's a simple principle of mine. If people stopped buying mining cards, most miners would second-guess themselves before buying 6+ GPUs to begin with. Now let's say you mined on your 1080 Ti for 3 years and decide to sell it after that, for dirt cheap; I am fine with that. As bad as 2013/14 was, it did not last long. 2017/18 has gone on much longer in favor of the miner.


----------



## SavantStrike

ZealotKi11er said:


> I do not think that is how it works. You want to keep mining with GPUs way past their ROI. I am sure people will say their GPU was not used for mining when they sell it. Never believe anyone lol. Just buy a GPU that you know can survive even the worst conditions and has warranty left. I personally would not buy a GPU from a miner; it's a simple principle of mine. If people stopped buying mining cards, most miners would second-guess themselves before buying 6+ GPUs to begin with. Now let's say you mined on your 1080 Ti for 3 years and decide to sell it after that, for dirt cheap; I am fine with that. As bad as 2013/14 was, it did not last long. 2017/18 has gone on much longer in favor of the miner.


There's no second-guessing when an investment pays back in 6 months. The guy mining will buy the cards and throw them in the trash in a year if no one wants to buy them.

If people want to be angry at miners, they are free to do so. The reality is that the gaming market has changed, and the GPU market has changed too. Everything has changed, and it's only now that people are starting to realize it.


----------



## kithylin

2ndLastJedi said:


> How can this be? And if it is, why hasn't everyone jumped all over it?


Because while everyone knows Volta is coming (we already have Volta Titan V cards), nvidia has not actually confirmed that "GTX 2080 Ti" will be an actual physical product yet. It is all still conjecture, hearsay and guessing at this point. It's also not actually released as a retail card yet either.


----------



## feznz

Agreed not appreciating the salt rubbed into the wounds 

I could reach under my mattress and buy 10 1080 Tis, but I choose not to mine, simply because I have no practical way to spend the mining money; even Steam no longer accepts bitcoin payment.
There is an exchange that will do international transfers, but it adds 10% in processing fees, and the only local person in NZ I've seen accepting bitcoin was a working girl; I am happily married.

Also, on resale: I hardly ever sell second-hand electronics, because after 2 years it is practically worthless and I honestly cannot be bothered trying to get a good amount of money for it; I'd rather sell it for a bargain so the buyer won't complain even if there were a small problem.

Recently I tried to talk someone into a 1080 or 1070, even an RX 580. In the end they got the 1080 Ti, put the whole amount, about 1300 USD, on a credit card, and will be paying it off for the next 6 months on a minimum-wage job. I feel for him.

Ecash, DigiCash, bit gold and b-money were all around before 2000; all failed.


----------



## kithylin

feznz said:


> Agreed not appreciating the salt rubbed into the wounds
> 
> I could reach under my mattress and buy 10 1080 Tis, but I choose not to mine, simply because I have no practical way to spend the mining money; even Steam no longer accepts bitcoin payment.
> There is an exchange that will do international transfers, but it adds 10% in processing fees, and the only local person in NZ I've seen accepting bitcoin was a working girl; I am happily married.
> 
> Also, on resale: I hardly ever sell second-hand electronics, because after 2 years it is practically worthless and I honestly cannot be bothered trying to get a good amount of money for it; I'd rather sell it for a bargain so the buyer won't complain even if there were a small problem.
> 
> Recently I tried to talk someone into a 1080 or 1070, even an RX 580. In the end they got the 1080 Ti, put the whole amount, about 1300 USD, on a credit card, and will be paying it off for the next 6 months on a minimum-wage job. I feel for him.
> 
> Ecash, DigiCash, bit gold and b-money were all around before 2000; all failed.


I don't know where you live in NZ, but I did a quick Google and it turns out there are two bitcoin ATMs in NZ: https://coinatmradar.com/country/153/bitcoin-atm-new-zealand/ You can literally walk up to them and exchange bitcoins for cold hard cash, then spend it on anything you want, just like any other currency. Just because some company doesn't accept bitcoin directly doesn't mean you can't spend bitcoins on something; just exchange them first. I despise GPU mining as a source of obtaining bitcoins, but felt I should at least point out that bitcoin itself is an established currency just like USD or euros or yen or any other currency in the world.


----------



## HoneyBadger84

If you read the rumors, the scuttlebutt is that the GTX 2080 Ti will be Ampere, and that Volta is a deep-learning/Quadro-type part only that will never be made for the gaming segment, while Ampere will be the gaming/consumer-segment card.

That's all rumor and hearsay at this point, but I thought I'd share.

If the rumors are true, Pascal production has completely halted, so the cards on the market now are all there will be (the last ones may still be shipping to suppliers/OEMs, but no more GPU boards are being made fresh), and Ampere is already on assembly lines in anticipation of a March/April release.


----------



## KickAssCop

Quick question. One of my Sea Hawk's blower fans makes a squeaky sound. Can I replace it with the Corsair N970 kit blower fan? Remember the ones that came with the GTX 970? The size of the blower should be the same? I can't find any info on Google. LMK.


----------



## pez

KickAssCop said:


> Quick question. One of my sea hawks blower fans makes a squeaky sound. Can I replace it with the Corsair N970 kit blower fan? Remember the ones that came for GTX 970. Size of blower should be the same? I can't find any info on google. lmk.


IIRC, the 980 Ti and GTX 1080 kits are all the same (at least the EVGA ones). On top of that, the 1080 pump and bracket are the same as the 1080 Ti/Titan X(p) kit. The only difference in most of the SKUs is the shroud that's provided (which for Pascal is unnecessary).

TL;DR: if you can verify the mounting/pump is the same between the 970 and 980 Ti kits, and confirm that the 980 Ti and 1080 mounting/pump are the same, you should be fine.

In the end, however, putting the appropriate grease on the stock blower fan will probably be one of the easiest and cheapest solutions.


----------



## feznz

kithylin said:


> I don't know where you live in NZ but I did a quick google and turns out there are two bitcoin ATM's in NZ: https://coinatmradar.com/country/153/bitcoin-atm-new-zealand/ You can literally walk to them and exchange bitcoins for cold hard cash and then spend it on anything you want. Just like any other currency. Just because some company doesn't accept bitcoin directly doesn't mean you can't spend bitcoins on something, just exchange it first. I despise GPU mining as a source of obtaining bitcoins. But felt I should at least educate folks that bitcoin it's self is an established currency just like USD or Euros or Yen or any other currency in the world.


Halfway between the two, Christchurch :doh: Still a plane ride, plus a taxi that would cost more than the plane ride, an estimated $300 round trip. TBH I didn't know they existed.
https://www.stuff.co.nz/business/101038176/bank-pulls-support-for-cryptocurrency-platform-cryptopia
https://www.reddit.com/r/NZBitcoin/comments/7hw6sy/any_recommendations_for_a_cryptofriendly_nz_bank/

I have no confidence in it; a bank closing your account when bitcoin transactions show up scares me.

And the whole blockchain thing is great, it grows and grows, but the difficulty keeps rising, so more and more electricity and compute power is required to solve each block as time goes by. At some point, unless you have very cheap electricity and a super-efficient GPU, the electricity itself will cost more than the reward for solving a block.
Or people stop trading in bitcoin, which also means no blocks to solve.
So I think Nvidia has rethought the release of Volta to keep up with mining demand and profitability; look at what the press release was in August:
https://www.techspot.com/news/70584-nvidia-volta-gaming-gpus-not-foreseeable-future.html
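The electricity argument above can be sketched as a quick break-even calculation. All numbers here are hypothetical placeholders, not real mining figures:

```python
# Break-even electricity price for a single mining card.
# Both inputs are assumed, illustrative values.
power_draw_kw = 0.25   # assumed card draw under mining load, in kW
daily_revenue = 2.00   # assumed coin reward per card per day, in USD

kwh_per_day = power_draw_kw * 24
break_even_rate = daily_revenue / kwh_per_day  # USD per kWh

print(f"Energy used per day: {kwh_per_day:.1f} kWh")
print(f"Break-even rate:     ${break_even_rate:.3f}/kWh")
# Above that rate the card mines at a loss; as difficulty rises,
# daily_revenue falls and the break-even rate falls with it.
```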

The Bolivian peso was an established currency too: http://www.nytimes.com/1985/04/08/business/hyper-inflation-traumatizes-bolivia.html There are many folklore stories of what to do when metal coins are worth more as scrap metal.

Honestly I don't despise bitcoin or miners, I just don't support it. I honestly feel for fellow gamers. In our LAN gaming group, one has a GTX 1060, three have GTX 970s, one has a GTX 1080 and two of us have GTX 1080 Tis.
And four want to upgrade but can't really afford it. The guy with the GTX 1080 swapped his VA ultrawide 1440p for an ultrawide 1080p; it was cheaper to downgrade the monitor than upgrade the GPU, if that makes sense. But after seeing the VA side by side with a newer IPS, he bought another monitor within a week.
The two of us that have 1080 Tis have more money than brains.

I just wish I was this guy:
user "SmokeTooMuch" auctioned 10,000 BTC for $50 (cumulatively), but no buyer was found

AND I talk too much :thumb:


----------



## kithylin

feznz said:


> https://www.stuff.co.nz/business/101038176/bank-pulls-support-for-cryptocurrency-platform-cryptopia
> https://www.reddit.com/r/NZBitcoin/comments/7hw6sy/any_recommendations_for_a_cryptofriendly_nz_bank/
> 
> I have no confidence bank closes your bank account if transactions from bitcoin show up that scares me.


You use a separate device to access your cryptocurrency at the ATMs, and then at your bank you would just deposit the cash/paper money into your account. How would your bank ever possibly know that you got the money from bitcoin vs. any other source of income? That's what's so confusing.

I don't know anything about NZ; do they still use paper/cash money?


----------



## feznz

kithylin said:


> You use a seperate device to access your crypto currency at the ATM's, and then to go to your bank you would just deposit cash / paper money into your bank account. How would your bank ever possibly know that you got the money from bitcoin vs any other source of income? That's so confusing.
> 
> I don't know anything about NZ, do they still use paper / cash money?



Yes, there are workarounds, but they all eat into the profit margin, and the main reason cryptocurrency hasn't taken off in NZ is the cost of electricity.

But just in case you didn't know, for laughs:

NVIDIA (pronounced nah-vid-eya) has been around for 23 years, but you may not have heard of it.


edit:
I just realised why those BTC ATMs are strategically located in those centres: probably in sight of the two largest universities, full of fresh, naïve people.
I am just too old, and could be totally wrong, but every easy-money idea usually results in losing money or ending up in jail.
Besides, I am actually happy with my lifestyle. I have enough money and enough time, so there's a balance between work and play; this thread is a hobby to me. I turn a lot of cash jobs down, just too old to chase every single cent.


----------



## KickAssCop

pez said:


> IIRC, the 980Ti and GTX1080 kits are all the same (at least for the EVGA ones). On top of that, the 1080 pump and bracket is the same as the 1080Ti/Titan X(p) kit. The only difference in most of the SKUs are the shroud that's provided (which for Pascal is unnecessary).
> 
> TL;DR if you can link the mounting/pump being the same between the 970 and 980Ti kit and confirm that the 980Ti and 1080 mounting/pump are the same; you should be fine.
> 
> In the end, however, putting the appropriate grease on the stock blower fan will probably be one of the easiest and cheapest solutions to do.


Can you recommend some grease that I can use? I really don't know; I haven't tried this in the past.


----------



## SavantStrike

KickAssCop said:


> Can you recommend some grease that I can put. I really don't know or haven't tried in the past.


Silicone, or even better, standard household oil.

You could also just buy a 65mm blower fan. They're cheap and plentiful.


----------



## pez

KickAssCop said:


> Can you recommend some grease that I can put. I really don't know or haven't tried in the past.


I used Thermal Grizzly Kryonaut personally, but I haven't done any official testing to compare it to anything. It was recommended to me by a few members here.


----------



## webhito

pez said:


> I used Thermal Grizzly Kryonaut personally, but I haven't done any official testing to compare it to anything. It was recommended to me by a few members here.


He has a squeaky fan that needs lubricating, wrong kind of grease lol.


----------



## Streetdragon

WD40? It has to move -> WD40


----------



## KickAssCop

pez said:


> I used Thermal Grizzly Kryonaut personally, but I haven't done any official testing to compare it to anything. It was recommended to me by a few members here.





webhito said:


> He has a squeaky fan that needs lubricating, wrong kind of grease lol.


Yeah that was a head scratcher for me.


----------



## KedarWolf

Still getting alerts from the Nvidia store from www.nowinstock.net for 1080 Ti FE's for $699 USD.

You can also get alerts for the UK store for MSRP.


----------



## ZealotKi11er

KedarWolf said:


> Still getting alerts from the Nvidia store from www.nowinstock.net for 1080 Ti FE's for $699 USD.
> 
> You can also get alerts for the UK store for MSRP.


Yeah, if you want to get a 1080 Ti it's easy, but it's the FE, which is not the best version for temps.


----------



## KedarWolf

ZealotKi11er said:


> Yeah if you want to get a 1080 Ti is easy but its FE which is not the best version for temps.



Water cooling, bruh. 

Wait, did I just assume your gender?

Want to custom water cool a 1080 Ti?

https://www.ekwb.com/shop/catalogsearch/result/?q=MLC


----------



## SavantStrike

ZealotKi11er said:


> Yeah if you want to get a 1080 Ti is easy but its FE which is not the best version for temps.


It's the best version for water cooling though


----------



## 8051

HoneyBadger84 said:


> If you read the rumors, scuttlebut is the GTX 2080 Ti will be Ampere & that Volta is a deep-learning/Quadro type card only, that will never be made for the gaming segment, while Ampere will be the gaming/consumer segment card.
> 
> That's all rumor & hearsay at this point, but I thought I'd share.
> 
> If the rumors are true, Pascal production has completely halted, so what cards are on the market now is all there will be (the last ones may still be being shipped to suppliers/OEMs, but there are no more GPU boards being made fresh), and Ampere is already on assembly lines in anticipation of a release in March/April.


In the consumer market, I don't see why Nvidia and AMD wouldn't just continue to manufacture their current crop of GPUs forever; after all, they're selling everything they can make in practically every market segment (top tier, mid tier, etc.).


----------



## mouacyk

8051 said:


> In the consumer market I don't see why Nvidia and AMD just wouldn't continue to manufacture their current crop of GPU's forever -- after all they're selling everything they can manufacture in practically every market segment (top tier, mid tier etc.).


They don't manufacture their own GPUs. More or less, they buy resources along with processing time from a foundry for a batch of GPU dies at a time. This is all done months ahead of time, and the foundries then allocate their open slots to other clients like Apple and various cell-phone manufacturers. Nvidia and AMD know better than to keep flooding the market for an unreliable mining niche. The R9 290 fiasco taught AMD (and others) that, and they don't want to hurt future sales of next-gen hardware, which *is* worth buying foundry time for.


----------



## pez

webhito said:


> He has a squeaky fan that needs lubricating, wrong kind of grease lol.





KickAssCop said:


> Yeah that was a head scratcher for me.


Yeahhhh... I literally blanked on that one. I was still thinking we were on the specific topic of the AIO cooler part, not the fan.

Dielectric grease should be good. I think you can easily find the stuff at an auto-parts store. It's especially good for radiator fans (on cars), so it should do equally well for a small GPU blower fan. TBH, you shouldn't need much at all.


----------



## feznz

webhito said:


> He has a squeaky fan that needs lubricating, wrong kind of grease lol.


Might work  I have seen people use CRC to fix squeaky brakes on cars


----------



## kithylin

webhito said:


> He has a squeaky fan that needs lubricating, wrong kind of grease lol.


KY Jelly maybe? .......... :devil-smi


----------



## feznz

https://overclock3d.net/news/gpu_displays/nvidia_s_turing_gpu_may_be_a_dedicated_cryptocurrency_mining_card/1


----------



## mAs81

feznz said:


> https://overclock3d.net/news/gpu_displays/nvidia_s_turing_gpu_may_be_a_dedicated_cryptocurrency_mining_card/1


I literally almost spit out my coffee , lol :lachen:


----------



## khemist

Picked this up.


----------



## kithylin

khemist said:


> Picked this up.


Hope you didn't pay more than $1200 US for it, as that would be terribly overpriced.


----------



## khemist

$1196 converted from pounds, but I'm waiting on an RMA replacement from EVGA that I will be selling to make the money back.


----------



## feznz

:doh: Mind you, I paid $1184 USD + $6 USD shipping  but that was the first week of release, because I had waited so long that games were unplayable even at 720p, and that's what's expected here with import taxes etc. Prices have dropped since release, to around $1150 USD.

I just hope that if the rumours are correct, next gen brings a dedicated mining card and a dedicated gaming card, with Nvidia somehow, in hardware or software, locking the 20xx to gaming algorithms and Turing to mining. Then they could truly say "we at Nvidia put gamers, our core consumers, first."

That's business though; you've got to cater to your current strongest market.


----------



## khemist

It's crazy, I paid £623 for a 1080 Ti a few months ago and it's now over £900.


----------



## kithylin

I just looked and apparently prices on the 1080 Ti's on ebay are actually coming -DOWN-, just last month they were around $1500 at one point. Now I see several on ebay for $945 - $978 range. Maybe this is a sign of things to come.


----------



## KickAssCop

It is because in less than 2 months this card will be irrelevant for gamers and miners.


----------



## khemist




----------



## ZealotKi11er

KickAssCop said:


> It is because in less than 2 months this card will be irrelevant for gamers and miners.


The thing is, the 1080/1080 Ti were not popular because of their ETH performance. The 1080 Ti will still be very good for gaming until the next Ti.


----------



## 8051

I like those pictures of a $1000+ video card that should only be worth $700.


----------



## kithylin

KickAssCop said:


> It is because in less than 2 months this card will be irrelevant for gamers and miners.


Irrelevant for miners possibly but just because a card isn't the most current, latest thing doesn't mean it doesn't have good value for gamers. I just last week finished building a Z77 / 2500K system using my older GTX 780 Ti card for a WinXP gaming rig. Just because a card isn't the latest doesn't make it any slower for gamers. A GTX 780 Ti is just as fast today as it was when it was new for example.


----------



## KCDC

Kinda hoping this helps take the miners off gaming GPUs:

https://news.bitcoin.com/samsung-enters-bitcoin-mining-asic-manufacturing-business/


----------



## ZealotKi11er

KCDC said:


> Kinda hoping this helps take the miners off gaming GPUs:
> 
> https://news.bitcoin.com/samsung-enters-bitcoin-mining-asic-manufacturing-business/


ASICs kill mining because of the price and no resale value.


----------



## feznz

Still keeping an eye out for another 1080 Ti. I don't really need it, but I want it :sonic: I can't help looking daily.
Thought this might shed some light on a few 1080 Tis showing up on eBay.
I've even seen a couple of whole mining rigs for 5k turn up on Trade Me, our version of eBay. Still "guaranteed" $15 return a day + power + hardware = hardware paid off in 1 year, still need to recoup power cost = not worth it, and no more warranty period left on the GPUs.


https://overclock3d.net/news/gpu_di...arket_as_ethereum_s_price_crashes_below_150/1
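That back-of-the-envelope can be written out. The 5k rig price and the seller's "$15 a day" claim come from the listing mentioned above; the power figures are assumptions:

```python
# Payback sketch for a used mining rig (illustrative numbers).
rig_price = 5000.0          # asking price from the Trade Me listing
gross_daily = 15.0          # seller's "guaranteed" daily return, USD
rig_power_kw = 1.5          # assumed draw for a multi-GPU rig, kW
electricity_price = 0.30    # assumed USD per kWh

power_cost = rig_power_kw * 24 * electricity_price   # daily power bill
net_daily = gross_daily - power_cost
payback_days = rig_price / net_daily

print(f"Net daily return: ${net_daily:.2f}")
print(f"Payback period:   {payback_days:.0f} days")  # ~1190 days, over 3 years
# ...and that's before any difficulty increase, coin price drop,
# or the warranty on the GPUs running out.
```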


----------



## ZealotKi11er

feznz said:


> still keeping an eye out for another 1080Ti just I don't really need it but I want it :sonic: I can't help myself looking daily
> Thought this might give some light to a few 1080tis showing up on Ebay
> even seen a couple of whole mining rigs for 5k turn up on trademe our version of ebay still "guaranteed" $15 return a day + power + hardware = hardware paid off 1 year still need to recoup power cost = not worth it no more warrantee period on GPUs
> 
> 
> https://overclock3d.net/news/gpu_di...arket_as_ethereum_s_price_crashes_below_150/1


What happened there was the switch from AMD cards to Nvidia cards, which were better at ZEC mining.


----------



## feznz

Ah OK, I have no idea, but it seems everyone is chasing the coin sideways across the market. I really have to stop thinking about it and game on.
But it's such a hot topic that I can't help myself. Though I have made the commitment: I am not going to touch any virtual currency, not even mine with my 1080 Ti. I did toy with the idea, just to buy games on Steam, but then they stopped bitcoin payments; also the nearest bitcoin ATM is a day's return trip.
But I liken bitcoin to playing the slot machines, it's either way up or way down, so I can see how the excitement gets to people; gambling has that addictive nature to it.


----------



## jase78

KCDC said:


> Kinda hoping this helps take the miners off gaming GPUs:
> 
> https://news.bitcoin.com/samsung-enters-bitcoin-mining-asic-manufacturing-business/


As a matter of fact, that is the memory Nvidia is using on their new line releasing in a couple of months.


----------



## specialedge

Anybody running the Asus 1080 Ti Poseidon? Got one coming in the mail. I don't imagine the new product line will debut with a Poseidon 2080 Ti, so I figure 1050 on a 950 card... I could have done a lot worse... like the remaining stock of 1030s on the shelf at Fry's...


----------



## KCDC

jase78 said:


> as a matter of fact that is the memory nividia is using on their new line releasing in a couple months.



well shoot


----------



## KickAssCop

Picked up 1080 Ti FTW Hybrid for 899 on Amazon. :up:


----------



## Maximization

specialedge said:


> Anybody running asus 1080ti Poseidon? Got one coming in the mail. I don't imagine the new product line will debut with a Poseidon 2080ti, so I figure 1050 on a 950 card.... I could have done a lot worse.... like the remaining stock of 1030s on shelf at Fry's.....


got 2, use fans with water cooling for best performance....


----------



## raydub

*I'm Going To Do It! ...FLASH My EVGA 1080 Ti*

...thanks SavantStrike for the info. I'm going to use KedarWolf's video and links to flash my own EVGA 1080 Ti card. Any tips for flashing it for my 2012 Mac Pro 12-core? Also, I'm going to have to find a friend with a Windows-based PC; what version of Windows should I use? And after I flash it, does anyone know the best slot to install it in on a 2012 Mac Pro, given that I'm also using an Angelbird Wings PCIe SSD? Isn't there only one fast 16x slot on that machine?


----------



## MRCOCOset

*1080Ti FE + XOC Bios + ACCELERO HYBRID III-140 + 2x NOCTUA NF-A14*

Push - Pull Test:

Hello.
I replaced the original fan with 2x NOCTUA NF-A14

1080Ti FE + XOC Bios + ACCELERO HYBRID III-140 + 2x NOCTUA NF-A14


Picture: https://zapodaj.net/bf4c45d8b7459.jpg.html


----------



## SavantStrike

raydub said:


> ...thanks SavantStrike for the info, I'm Going to use KedarWolfs video and links to flash my own EVGA 1080 Ti card. Any tips for flashing it for my 2012 Mac Pro 12core? also I'm going to have to find a friend with a windows based PC, what version of windows should I use and after I flash it, anyone know the best slot to install it to for 2012 Mac Pro given that I'm also using an AngelBird Wings pcie SSD? Isn't there only one 16x fast slot on that machine?


Windows 7 or above should be fine.

Aren't the Mac Pro machines X79-based? I'd be surprised if they weren't. If so, you should have at least two full x16 slots.

The GPU usually goes in the first slot, although I have no idea whether Apple does things differently.


----------



## Arexniba

What's up OC. It's been a while since I've posted here (since 2011). 

I've gotten back into PC building (inadvertently). I wanted a piece of the action with mining cryptocurrency. Since the news broke out, I was always curious, but back when it was starting, the steps to mine were more complex. These days, they are definitely user friendly.

Anyways, I purchased a used Aorus 1080 TI a few weeks ago. I saw that the daily production was decent, so, I went back to the same guy and purchased another one.

I am not sure if any of you have gone through this dilemma, but the cards are so damn big that I had to take out the PSU to give them space. Then I had to move the water pump out of the way too. Lol, it's a pretty tangled mess in the pictures. Nice with one 1080 Ti, then ****ty with 2.










I'm thinking of replacing the fans with waterblocks (Swiftech KOMODO NV-LE GTX 1080 TI GPU Block), but after reading a lot of articles/forums, I'm noticing a lot of people saying it's not worth it (especially given the risk of damaging the cards). So now I'm on the fence about swapping my old HAF 932 case for a bigger one.

Any suggestions or recommendations would be greatly appreciated!


----------



## 8051

MRCOCOset said:


> Push - Pull Test:
> 
> Hello.
> I replaced the original fan with 2x NOCTUA NF-A14
> 
> 1080Ti FE + XOC Bios + ACCELERO HYBRID III-140 + 2x NOCTUA NF-A14


That was impressive, 500 Watts going through a 1080Ti! What were the ambient temps in the room for the Furmark tests? I couldn't read them because the video size was so small.

How hot were the 1080Ti VRMs running through that test?

How long are the hoses to the rad?


----------



## kithylin

raydub said:


> ...thanks SavantStrike for the info, I'm Going to use KedarWolfs video and links to flash my own EVGA 1080 Ti card. Any tips for flashing it for my 2012 Mac Pro 12core? also I'm going to have to find a friend with a windows based PC, what version of windows should I use and after I flash it, anyone know the best slot to install it to for 2012 Mac Pro given that I'm also using an AngelBird Wings pcie SSD? Isn't there only one 16x fast slot on that machine?


If you're still air-cooled on your 1080 Ti, -DO NOT- use the XOC unlimited bios. That's for water cooled or LN2 competitions only.


----------



## MRCOCOset

8051 said:


> That was impressive, 500 Watts going thru a 1080Ti! What were the ambient temps in the room for the Furmark tests? I couldn't read them because the video size was so small.
> 
> How hot were the 1080Ti VRM's running thru that test?
> 
> How long are the hoses to the rad?


What were the ambient temps in the room for the Furmark tests?: room temperature.
How hot were the 1080Ti VRMs running through that test?: there is a small heatsink on the VRM, as in the photo, and it stays cool. I did not check the temperature; I do not have a laser thermometer.
How long are the hoses to the rad?: 40 cm.
I couldn't read them because the video size was so small: it's recorded at 4K desktop resolution; play it on a 4K TV/monitor.


----------



## SavantStrike

kithylin said:


> If you're still air-cooled on your 1080 Ti, -DO NOT- use the XOC unlimited bios. That's for water cooled or LN2 competitions only.


In this case the flash is to change the normal BIOS to an Apple BIOS for use on a Mac Pro. I agree that the XOC BIOS should only be used for heavily cooled cards.


----------



## raydub

kithylin said:


> If you're still air-cooled on your 1080 Ti, -DO NOT- use the XOC unlimited bios. That's for water cooled or LN2 competitions only.


Thanks for the Info, it's much appreciated...


----------



## jura11

Arexniba said:


> What's up OC. It's been a while since I've posted here (since 2011).
> 
> I've gotten back into PC building (inadvertently). I wanted a piece of the action with mining cryptocurrency. Since the news broke out, I was always curious, but back when it was starting, the steps to mine were more complex. These days, they are definitely user friendly.
> 
> Anyways, I purchased a used Aorus 1080 TI a few weeks ago. I saw that the daily production was decent, so, I went back to the same guy and purchased another one.
> 
> I am not sure if any of you have gone through this dilemma, but the cards are so damm big that I had to take out the PSU to give it space. Then, I had to move the water pump out of the way to also give more space. Lol it's a pretty tangled mess from the pictures. Nice with one 1080TI, then ****ty with 2.
> 
> I'm thinking of replacing the fans with waterblocks (Swiftech KOMODO NV-LE GTX 1080 TI GPU Block), but after reading a lot of articles/forums, I'm noting a lot of people saying it's not worth it (especially the risk of damaging the cards). So, now I'm on the fence from swapping my old HAF 932 case to a bigger one.
> 
> Any suggestions or recommendations would be greatly appreciated!


Hi there

I recently built a PC with the AORUS GTX 1080 Ti and used the Phanteks Glacier waterblock on it. I must admit this block is one of the best blocks I've used: 16-20°C idle and 35-36°C under load, with a 360mm and a 240mm radiator. My friend's Aorus is not the best overclocker, though; his card does 2063MHz at max, but the temperatures are well worth it.

If I compare my EK blocks to the Phanteks, the Phanteks block is in a different league. Plus you can keep the stock AORUS backplate with the Glacier WB, and if you like RGB, this block has it. As for the risk of damaging the card, I'm not sure about that; I've done several builds, have 3 GPUs in my own PC, and had no issues.

For a case, get one with good airflow. Have a look at the Phanteks Enthoo Primo or EVGA DG-87; I have built a few PCs with these cases and been very happy with them.

Hope this helps

Thanks, Jura


----------



## KCDC

jura11 said:


> Hi there
> 
> I recently built PC with AORUS GTX1080Ti and I have used Phanteks Glacier WB for this card and must admit this block is one of the best blocks what I used,temperatures 16-20C and load 35-36C that's with 360mm radiator on top and 240mm radiator on top,just friend Aorus is not the best OC'er,his card do 2063MHz as max,but his temperatures are well worth it there
> 
> If I can compare my EK blocks and Phanteks,then I must say,Phanteks block is in different league,plus you can use stock AORUS backplate with Glacier WB and if you like RGB then this block do have RGB,risk of damaging the card,not sure there,I've done several builds and I have 3*GPUs in my PC and no issues there
> 
> Recommendation for case,get case which do have good airflow,have look on Phanteks Enthoo Primo or EVGA DG-87,I have built few PCs with these cases and been very happy with them there
> 
> Hope this helps
> 
> Thanks,Jura



I was interested in those phanteks 1080ti blocks, but they didn't release in time for my build. Heard they're pretty heavy. Are they a lot heavier than the EK blocks?


----------



## Arexniba

jura11 said:


> Hi there
> 
> I recently built PC with AORUS GTX1080Ti and I have used Phanteks Glacier WB for this card and must admit this block is one of the best blocks what I used,temperatures 16-20C and load 35-36C that's with 360mm radiator on top and 240mm radiator on top,just friend Aorus is not the best OC'er,his card do 2063MHz as max,but his temperatures are well worth it there
> 
> If I can compare my EK blocks and Phanteks,then I must say,Phanteks block is in different league,plus you can use stock AORUS backplate with Glacier WB and if you like RGB then this block do have RGB,risk of damaging the card,not sure there,I've done several builds and I have 3*GPUs in my PC and no issues there
> 
> Recommendation for case,get case which do have good airflow,have look on Phanteks Enthoo Primo or EVGA DG-87,I have built few PCs with these cases and been very happy with them there
> 
> Hope this helps
> 
> Thanks,Jura


Hey Jura, thanks for your feedback and recommendations here. The only reason I am more inclined toward the Swiftech product is because I talked to the RMA guy there, and he is willing to give me a 10% discount on the water blocks (almost $40 in savings). But $300 for waterblocks (not including compression fittings) doesn't seem worthwhile for mining. I'll be transparent: I don't game on my PC anymore. I commute 4hrs a day + work 10hrs and come home to my wife and kid. If I do have time, PUBG on Xbox is all I can do, lol.

Will look into the Phanteks, but the Swiftech one has pretty good reviews everywhere I've researched. Nonetheless, I love feedback from people that are actually using the products.

Cheers!
Alex


----------



## raydub

SavantStrike said:


> In this case the flash is to change the normal BIOS to an Apple BIOS for use on a Mac Pro. I agree that the XOC BIOS should only be used for heavily cooled cards.


Question: does anyone know if the Apple BIOS is one of those in the "best bios and power limit" download file? Or does anyone have a link to the correct BIOS I need for a Mac / GTX 1080 Ti?


----------



## jura11

KCDC said:


> I was interested in those phanteks 1080ti blocks, but they didn't release in time for my build. Heard they're pretty heavy. Are they a lot heavier than the EK blocks?



Hi there 

Wish I'd bought these blocks for myself. I have EK blocks in my build, and the first time I used a Phanteks block was for a friend's build; my EK blocks look pants against these Phanteks Glacier blocks.

Yes, they're heavy blocks, heavier than EK. I haven't tried the Heatkiller blocks, which I would love to.

But overall they're nice blocks and temperatures are pretty awesome 

Hope this helps 

Thanks, Jura


----------



## jura11

Arexniba said:


> Hey Jura, thanks for your feedback and recommendations here. The only reason I am more inclined for the Swiftech product is because I talked to the RMA guy there, and he is willing to give me a 10% discount on the water blocks (almost $40 in savings). But $300 for waterblocks (not including compression fittings) doesn't seem worth while for mining. I'll be transparent, I don't game on my PC anymore. I commute 4hrs a day + work 10hrs and come home to my wife and kid. If I do have time, PUBG on xbox is all I can do, lol.
> 
> Will look into the Phanteks, but the Swiftech one has pretty good reviews everywhere I've researched. Nonetheless, I love feedback from people that are actually using the products.
> 
> Cheers!
> Alex


Hi Alex 

It's really up to you which block you choose; these blocks usually perform within 1-5°C of each other.

I have used EK, and also tried Bykski and this Phanteks Glacier, and they all performed well without issue.

For compression fittings I can recommend Barrow fittings.

It's hard to say what I would do. I do mine with my GPUs when I'm not rendering, which is rare, and temperatures are 38-42°C on all 3 GPUs.

If you have a single GPU and you want lower temperatures, then I would go with water cooling.

Hope this helps 

Thanks, Jura


----------



## KickAssCop

Buy a hybrid. Forget blocking cards. IMO.


----------



## SavantStrike

raydub said:


> Question does anyone know if the Apple Bios is one of those in the "best bios and power limit download file? Or have a link to the correct bios I need for a Mac /gtx 1080ti


It most likely isn't. The Apple BIOS is going to be a stock FE BIOS with some changes that make it digitally signed to work on a Mac. The techpowerup site probably has what you need though.



Arexniba said:


> Hey Jura, thanks for your feedback and recommendations here. The only reason I am more inclined for the Swiftech product is because I talked to the RMA guy there, and he is willing to give me a 10% discount on the water blocks (almost $40 in savings). But $300 for waterblocks (not including compression fittings) doesn't seem worth while for mining. I'll be transparent, I don't game on my PC anymore. I commute 4hrs a day + work 10hrs and come home to my wife and kid. If I do have time, PUBG on xbox is all I can do, lol.
> 
> Will look into the Phanteks, but the Swiftech one has pretty good reviews everywhere I've researched. Nonetheless, I love feedback from people that are actually using the products.
> 
> Cheers!
> Alex


Unless you're hitting a temp wall, it's not going to be cost effective to water cool simply for mining. Water cooling a mining card just lets you run the card harder, which still pushes you outside the optimal hashrate-per-watt ratio. About the only benefit is reduced noise.

If price is a significant factor, you might want to look at Bykski blocks. Total cost is around $100 per card and you can use the stock backplate. They are pretty attractive and perform fairly well, with thicker construction than an EK block. Stepping up to Phanteks or Watercool products is a significant price bump for a 1-3C drop in temps and a different aesthetic.




KickAssCop said:


> Buy a hybrid. Forget blocking cards. IMO.


While better than an air cooled card, none of the hybrids I've ever used come close to a card with a full cover block. VRM temperatures are similar to an all air cooled card and die temps are 10-15C hotter than a full cover block.

The only benefit of a hybrid is that it's 5-15C cooler than an air cooled card and you don't have to build a full loop. The price is similar to a card with a full block (provided you already have a rad and a pump).


----------



## DarthBaggins

Personally I'd rather block my cards than trust an AiO solution, but again I'm not mining on my cards, just running Folding@Home in their down time (I know I could run for CureCoin etc).


----------



## mouacyk

KickAssCop said:


> Buy a hybrid. Forget blocking cards. IMO.


For how much these cards cost, I'd rather not have a silent failure/degradation point. EVGA does have VRM sensors on their newer models, but no one else does.


----------



## feznz

Well, full cover blocks are the only way to go on an unlimited budget, and they're aesthetically more appealing too. An AIO can be reused on many cards, but TBH I see the 1080Ti lasting a long time for me, as it exceeds my needs.

For me, I just got universal GPU water blocks and they have done 3 sets of GPUs.

Just redid my custom water cooling loop last night: cleaned out the blocks and used Mayhems Pastel White. I had temp issues with the GPU hitting 70°C; it's now hitting 35°C. Pretty happy. I even lost 7°C on the CPU, which I'm even happier about, and got it stable @ 5.2GHz with an AVX offset of 2.
Even managed to get another 50MHz OC on the GPU, so pretty happy all round.

So really it is a personal choice: what you are willing to spend, and what is aesthetically appealing to you for cooling your GPUs.


----------



## khemist

Is there an Nvidia rep on here that can help me out? I'm trying to get an RMA number to return an unused Titan Xp Star Wars edition card for a refund through Nvidia CS, but I've been waiting almost a week with no progress.

It seems stupidly hard to return this card and has put me off buying direct in the future.


----------



## kithylin

KickAssCop said:


> Buy a hybrid. Forget blocking cards. IMO.


This is such a stupid blanket statement.

Everyone knows full-cover blocks give the highest possible cooling performance, quite a bit better at cooling than hybrid cards.

Full-cover solutions are upgradable and hybrids are not (you can upgrade the custom liquid loop).

It depends on what folks need/want to do, but in the end, if one is after the absolute highest cooling performance, it's full cover blocks.

But then again, you already knew that KickAssCop before you put your post in.

Just like you posted your mining setup of cards just because you knew it would "Trigger" people and get them mad. I'm seriously starting to think that's the only reason you're posting in here lately: just to upset everyone and cause drama.


----------



## KCDC

Kinda why I rolled my eyes and read on. Hah.


----------



## SavantStrike

DarthBaggins said:


> Personally I'd rather block my cards than trust an AiO solution, but again I'm not mining on my cards just running Folding@Home in their down time (I know I could run for CureCoin etc).


I've had so much grief with aio solutions I wouldn't want to use one for more than two years.


----------



## mouacyk

My neutral response should also serve as a warning when these mined GPUs hit 2nd hand markets.


----------



## DarthBaggins

mouacyk said:


> My neutral response should also serve as a warning when these mined GPUs hit 2nd hand markets.


Hopefully by the time that happens, we all have moved over to the 20** series lol


----------



## kithylin

DarthBaggins said:


> Hopefully by the time that happens, we all have moved over to the 20** series lol


Again.. people here seem to think that just because a new GPU model comes out, people won't be buying 1080 Ti's anymore. Heck, some of my friends were just happy to upgrade to a used GTX 980 last week. Not everyone has the money to jump to the latest and greatest as soon as it comes out. A lot of folks will buy video cards used 2-5 years after they were released. People will continue buying used 1080 Ti's for a long time from now. That's why miners selling their 1080 Ti's without disclosing they were mined on is scary as f--- for potential buyers.


----------



## DarthBaggins

I always buy the extended warranty from MicroCenter when I buy my cards, so when the next latest and greatest comes out I can trade in.
But I do know your average gamer/buyer doesn't spend over $250-300 on a GPU.


----------



## SavantStrike

kithylin said:


> Again.. people here seem to think that just because a new gpu model comes out, people won't be buying 1080 Ti's anymore. Heck some of my friends were just happy to upgrade to a GTX 980 last week used. Not everyone has the money to jump to the latest and greatest as soon as it comes out. A lot of folks will actually usually buy video cards used 2-5 years after they were released. People will continue buying the 1080 Ti for a long time from now used. That's where miners selling their 1080 Ti's and not disclaiming they were mined on is scary as f--- for potential buyers.


I'd actually trust one that a seller stated had been mined on more than one that they didn't say.



DarthBaggins said:


> I always buy the extended warranty from MicroCenter when I buy my cards, so when the next latest and greatest comes out I can trade in.
> But I do know your average gamer, buyer doesn't spend over $250-300 for a GPU.


Doesn't something have to actually break on the card for the extended warranty to apply? I didn't know it allowed trade ins, if it does, then that's pretty cool.


----------



## DarthBaggins

Technically it doesn't allow a trade-in, but all you have to tell them is that it isn't meeting your needs, or you make up a reason, and you get the amount you initially paid for the GPU as credit toward a new card / other products.


----------



## ZealotKi11er

DarthBaggins said:


> Technically it doesn't allow a trade-in, but all you have to tell them is that it isn't not meeting your needs or you make up a reason and you get a card w/ the amount you initially paid for the GPU/ other products.


That seems very sketchy.


----------



## CDub07

That is my whole issue at the moment. When/if prices do come down and the market is loaded with 1080 Tis at reasonable prices, how bad is a GPU that was used for mining *IF* it was taken care of and not abused to hell and back? We run our CPUs wayyyyyyy beyond anything Intel or AMD would actually warrant and they last for multiple years on end. How different is mining from running your CPU 24/7 at load? I'm a newborn when it comes to how mining works, so please educate me on how the workloads are handled and wear the GPUs.


----------



## ZealotKi11er

CDub07 said:


> That is my whole issue at moment. When/if prices do come down and the market is loaded with 1080ti for reasonable prices, how bad is a GPU that was used for mining *IF* it was taken care of and not abused to hell and back? We run our CPUs wayyyyyyy beyond anything Intel or AMD would actually warrant and they last for multiple years on end. How much difference is mining from running your CPU 24/7 at load? Im a new born when it comes to how mining works so please educate on how the work loads are handled and wear the GPUs.


If you get a 1080 Ti with a good cooler, like the ASUS Strix or EVGA FTW3, or a Hybrid card, you would have nothing to worry about. The only problem I can see is someone running them at 2.1GHz with max volts 24/7. Also, you want to make sure the card still has warranty. Both my 1080 Ti Hybrids mine, and I run them at 0.875v @ 1860MHz, which is far lower than what these cards run at stock (1.062v @ 1962MHz). Temps under 50C and VRM under 70C.
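As a rough sanity check on those undervolt numbers: dynamic power scales roughly with V²·f, so the drop from 1.062 V @ 1962 MHz to 0.875 V @ 1860 MHz should cut core power by about a third. A back-of-the-envelope sketch (this ignores static leakage, memory, and board power, so it overstates the real savings somewhat):

```python
# Back-of-the-envelope: dynamic power scales ~ V^2 * f.
# Voltage/clock figures are the ones quoted in the post above;
# leakage, memory, and board power are ignored, so real savings are smaller.
stock_v, stock_mhz = 1.062, 1962   # stock voltage / clock
uv_v, uv_mhz = 0.875, 1860         # undervolted 24/7 mining setting

ratio = (uv_v / stock_v) ** 2 * (uv_mhz / stock_mhz)
print(f"Undervolted core power is roughly {ratio:.0%} of stock")  # roughly 64% of stock
```

That works out to roughly a one-third reduction in core power, which lines up with the low temps reported.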


----------



## KCDC

ZealotKi11er said:


> If you get a 1080 Ti a good cooler like ASUS Strix or EVGA FTW3 or Hybrid card you would have nothing to worry about. The only problem I can see is the person running them at 2.1GHz with max volts 24/7. Also you want to make sure the card still has warranty. Both my 1080 Ti Hybrids mine and run them at 0.875v @ 1860MHz which is far lower what these cards run at stock (1.062v @ 1962MHz). Temps under 50C and VRM under 70C.



Correct me if I'm wrong, but I don't think warranties transfer when you sell the card to another buyer. They would have to go through the original owner to RMA the card, as they need the original bill of sale matching the original owner's name and payment.


----------



## ZealotKi11er

KCDC said:


> Correct me if I'm wrong, but I don't think warranties transfer to the card when you sell it to another buyer. They would have to go through the original owner if they need to RMA the card as they need the original bill of sale that matches the original owner's name and payment


Yes, some require that; some don't. ASUS/Gigabyte/MSI go by the serial number, so you're good there. With EVGA you have to register the card, and you can transfer the warranty if you live in NA.


----------



## CDub07

ZealotKi11er said:


> If you get a 1080 Ti a good cooler like ASUS Strix or EVGA FTW3 or Hybrid card you would have nothing to worry about. The only problem I can see is the person running them at 2.1GHz with max volts 24/7. Also you want to make sure the card still has warranty. Both my 1080 Ti Hybrids mine and run them at 0.875v @ 1860MHz which is far lower what these cards run at stock (1.062v @ 1962MHz). Temps under 50C and VRM under 70C.


Yeah, it's the Asus ROG Strix 1080 Ti or bust. I thought Pascal had its voltage locked down so you could only change the power limit?


----------



## feznz

TBH, the one thing with mining is passing the hash check: most miners know a failed hash check = a failed block in the chain workload, so a waste of how many hours of mining? Or even days, if you don't monitor your mining farm.
Most importantly, the power usage: most miners will be looking to maximise successful hash checks with the least amount of power consumed, so they run stock or even undervolt.
So my theory would be to only buy a 1080Ti under 6 months old and you should be good to go.
I couldn't see a miner actually believing that a 15% OC while consuming 400+W is going to make money, and I see most are already talking about 10+ months for ROI at current prices.
A big dump of cards? I think most will keep on trucking on the belief that last year's peak is nothing like the peak yet to come.
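The ROI arithmetic behind numbers like "10+ months" is simple enough to sketch. All the inputs below are hypothetical placeholders (not anyone's actual figures), just to show how card price, daily payout, and electricity cost interact:

```python
# Hypothetical mining ROI sketch: months to pay off a card.
# All inputs are made-up placeholders for illustration only.
card_cost = 1000.0        # USD paid for the card
daily_revenue = 4.50      # USD/day from the pool at current difficulty/price
watts = 220.0             # wall power drawn by the card
kwh_price = 0.12          # USD per kWh

daily_power_cost = watts / 1000 * 24 * kwh_price   # kW * hours * price
daily_profit = daily_revenue - daily_power_cost
months_to_roi = card_cost / daily_profit / 30

print(f"Daily profit: ${daily_profit:.2f}; ROI in ~{months_to_roi:.1f} months")
```

At these placeholder numbers the card pays for itself in under a year, but even a modest drop in coin price or a rise in difficulty pushes the payoff out considerably, which is exactly why miners chase hashrate per watt rather than raw clocks.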


----------



## kithylin

CDub07 said:


> That is my whole issue at moment. When/if prices do come down and the market is loaded with 1080ti for reasonable prices, how bad is a GPU that was used for mining *IF* it was taken care of and not abused to hell and back? We run our CPUs wayyyyyyy beyond anything Intel or AMD would actually warrant and they last for multiple years on end. How much difference is mining from running your CPU 24/7 at load? Im a new born when it comes to how mining works so please educate on how the work loads are handled and wear the GPUs.


A lot of variance depends on many factors. Did the person doing the mining buy a 100% stock gpu (even if it's factory overclocked by EVGA, MSI, etc), stick it in their mining rig, load software and go mine on it 24-7 for 3 years? Probably actually fine. Or did the person doing the mining flash a custom bios (XOC perhaps, unlimited power), manually overclock it to maximum 1.200v and run it at 2200 mhz on a water block and mine on it 24-7-365 for 3 years, then flash it back to the original bios and then sell it and claim "Never mined on, only gaming." Then it may run a very real risk of being artifacting/bad for gaming by then. That's the problem with used market and trying to buy top-spec gpu's made within the past 5 years, there's no way to know.



KCDC said:


> Correct me if I'm wrong, but I don't think warranties transfer to the card when you sell it to another buyer. They would have to go through the original owner if they need to RMA the card as they need the original bill of sale that matches the original owner's name and payment


I don't know about other companies, but EVGA explicitly touts this as one of the biggest positives of their company. With EVGA, as long as you find a GPU with some of the original "3 year warranty from original sale date" left.. that is, buy a card within the first 3 years of it being sold retail, then you can register its serial number under your EVGA account and activate another 3 years of warranty on it. I did this for the GTX 770 in the other room: bought it used off eBay with 4 months of original warranty left, and it currently has full warranty coverage until September 2019. That's even though it would be 6 years after it was originally sold, and EVGA "officially" only offers warranty for cards for 3 years.


----------



## feznz

kithylin said:


> A lot of variance depends on many factors. Did the person doing the mining buy a 100% stock gpu (even if it's factory overclocked by EVGA, MSI, etc), stick it in their mining rig, load software and go mine on it 24-7 for 3 years? Probably actually fine. Or did the person doing the mining flash a custom bios (XOC perhaps, unlimited power), manually overclock it to maximum 1.200v and run it at 2200 mhz on a water block and mine on it 24-7-365 for 3 years, then flash it back to the original bios and then sell it and claim "Never mined on, only gaming." Then it may run a very real risk of being artifacting/bad for gaming by then. That's the problem with used market and trying to buy top-spec gpu's made within the past 5 years, there's no way to know.
> 
> 
> I don't know about other companies, but, EVGA explicitly has that as one of their biggest positive things about their company. With EVGA, as long as you find a GPU within the original "3 year warranty from original sale date" left.. that is, buy a card within the first 3 years of it being sold retail, then you can totally register it's serial number under your EVGA account and activate another 3 years of warranty on it. I did this for my GTX 770 in the other room, bought it used with 4 months of original warranty left used off ebay and it currently has full warranty coverage until september 2019 still active. Even though that would be 6 years after it was originally sold, and EVGA "officially" only offers warranty for cards for 3 years.


I have to say G-Sync sometimes works to lower the core clock as well as usage.  Played an old game the other day: it stayed @ 1600MHz at about 25% GPU usage, but it takes a pretty basic game to lower the core clock.

Yes, on the theoretical second-hand card: I am sure there are plenty of good ones, it's just that nobody really knows what people have done with their cards.


----------



## CDub07

It's what I'm still afraid of, then. You truly will never know. I just hope either the new cards come out at reasonable prices (with the supply to sustain that price), or the market tanks and used 1080 Tis are sooooooo readily available that I can get one at a price where I can afford to bite the bullet and take a dud if necessary.


----------



## FUZZFrrek

I have a DT version of the EVGA 1080 Ti FTW3 and it can clock its memory at 13,000 MHz without any artifacts in either PrecisionX or MSI Afterburner. My question is, is there a way to make it clock higher? Right now the offset only lets us add no more than 1000 MHz on the memory. I want to test the absolute limit of the card.


----------



## kithylin

CDub07 said:


> Its what Im still afraid of then.You truly will never know.I just hope either the new cards come out reasonable prices(with the supply to subtain that price) or the market tanks and a used 1080tis are sooooooo readily available I can get it at a price I can afford to bite the bullet on and take a dud if necessary.


Like I mentioned.. grab a used EVGA 1080 Ti within the next 2 years. At least if you're in the USA and it dies you're guaranteed to get some sort of working card back as a replacement for free (minus shipping it to them). Just remember to register the serial # on it with the EVGA website with your free EVGA account. And if they don't show the serial # in the listing, then they're probably fraudulent / stolen or something fishy. Then it doesn't matter if it was mined to death and dies on you 60 days after you bought it, they'll still replace it for you no matter where you bought it. I know I sort of "Ride with EVGA" a lot but I don't know of any other GPU vendor that will let you do this with used video cards, only EVGA.



FUZZFrrek said:


> I have a DT version of the EVGA 1080 ti FTW3 and it can clock its memory at 13k mhz without any artifacts on either PrecisionX of Msi Afterburner. My question is, is there a way to make it clock higher? Right now the offset only enables us to add no more than 1000 mhz on the memory. I want to test the absolute limit of the card.


If you're already getting 13,000 MHz on the memory of your 1080 Ti and it's actually stable (you can play games on it daily and nothing ever crashes), then you're already almost in the top 1% of 1080 Ti's and very lucky. I seriously doubt you'll get to clock it much further. But try other programs. I think EVGA PrecisionX should work on any card (even those not by EVGA) and may allow more than +1000 to memory; I'm not sure.
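For context on what a 13,000 MHz (effective) memory overclock buys: bandwidth is the per-pin data rate times the bus width. A quick sketch using the 1080 Ti's published 352-bit bus (this is just the arithmetic, not a measurement):

```python
# Memory bandwidth (GB/s) = effective data rate (Gbps/pin) * bus width (bits) / 8.
bus_width_bits = 352    # GTX 1080 Ti memory bus width
stock_gbps = 11.0       # stock GDDR5X effective rate (11,000 MHz effective)
oc_gbps = 13.0          # the 13,000 MHz overclock discussed above

stock_bw = stock_gbps * bus_width_bits / 8
oc_bw = oc_gbps * bus_width_bits / 8
print(f"Stock: {stock_bw:.0f} GB/s, overclocked: {oc_bw:.0f} GB/s")  # 484 vs 572
```

That's roughly an 18% bandwidth bump, which is why high memory clocks matter so much for memory-bound workloads.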


----------



## ZealotKi11er

The thing is, it will be just like the 290X/290 days. You could have bought an R9 290 reference card for $180 USD, or waited until September to buy a brand spanking new GTX 970 for $330. Most people would not risk it and just got the GTX 970, and Nvidia has crushed AMD ever since. The idea now is for Nvidia, or even AMD, to release cards so good that gamers would not be compelled to buy cheap used mining cards.


----------



## DarthBaggins

MSI is very good on their GPU warranties as well; they warrantied out a broken fan blade on a 960 I bought used a few years ago with no hassle.


----------



## CDub07

kithylin said:


> Like I mentioned.. grab a used EVGA 1080 Ti within the next 2 years. At least if you're in the USA and it dies you're guaranteed to get some sort of working card back as a replacement for free (minus shipping it to them). Just remember to register the serial # on it with the EVGA website with your free EVGA account. And if they don't show the serial # in the listing, then they're probably fraudulent / stolen or something fishy. Then it doesn't matter if it was mined to death and dies on you 60 days after you bought it, they'll still replace it for you no matter where you bought it. I know I sort of "Ride with EVGA" a lot but I don't know of any other GPU vendor that will let you do this with used video cards, only EVGA.
> 
> 
> If you're already getting 13,000 mhz of ram on your 1080 Ti and it's actually stable (play games on it daily forever and nothing ever crashes even once) then you're already almost in the top 1% of 1080 Ti's and very lucky. I seriously doubt you'll get to clock it much further. But try other programs. I think EVGA PrecisionX should work on any card (even those not by EVGA) and should allow more than +1000 to memory, not sure.


I get exactly what you're saying, but right now I'm trying to build a complete ROG-themed computer. So it will have to be an Asus ROG Strix.


----------



## 8051

kithylin said:


> A lot of variance depends on many factors. Did the person doing the mining buy a 100% stock gpu (even if it's factory overclocked by EVGA, MSI, etc), stick it in their mining rig, load software and go mine on it 24-7 for 3 years? Probably actually fine. Or did the person doing the mining flash a custom bios (XOC perhaps, unlimited power), manually overclock it to maximum 1.200v and run it at 2200 mhz on a water block and mine on it 24-7-365 for 3 years, then flash it back to the original bios and then sell it and claim "Never mined on, only gaming." Then it may run a very real risk of being artifacting/bad for gaming by then. That's the problem with used market and trying to buy top-spec gpu's made within the past 5 years, there's no way to know.
> 
> I don't know about other companies, but, EVGA explicitly has that as one of their biggest positive things about their company. With EVGA, as long as you find a GPU within the original "3 year warranty from original sale date" left.. that is, buy a card within the first 3 years of it being sold retail, then you can totally register it's serial number under your EVGA account and activate another 3 years of warranty on it. I did this for my GTX 770 in the other room, bought it used with 4 months of original warranty left used off ebay and it currently has full warranty coverage until september 2019 still active. Even though that would be 6 years after it was originally sold, and EVGA "officially" only offers warranty for cards for 3 years.


That's amazing. Considering mining has now come to dominate GPU sales, is EVGA still continuing that policy?


----------



## 8051

kithylin said:


> If you're already getting 13,000 mhz of ram on your 1080 Ti and it's actually stable (play games on it daily forever and nothing ever crashes even once) then you're already almost in the top 1% of 1080 Ti's and very lucky. I seriously doubt you'll get to clock it much further. But try other programs. I think EVGA PrecisionX should work on any card (even those not by EVGA) and should allow more than +1000 to memory, not sure.


EVGA Precision X works on non-EVGA cards; it works on my Gigabyte 1080 Ti Gaming OC flashed with the Strix OC VBIOS.


----------



## kithylin

8051 said:


> That's amazing. Considering mining has now come to dominate GPU sales, is EVGA still continuing that policy?


EVGA has let anyone register used cards bought off eBay under an EVGA account for as long as I can remember, at least since the 2000-2001 era through a few years ago when I last tried. I can't say whether they still do it today.. I bought my 1080 Ti new in August 2017, so I don't know about it currently, but I don't see why they would change the policy after honoring it for so long.


----------



## KCDC

kithylin said:


> I don't know about other companies, but, EVGA explicitly has that as one of their biggest positive things about their company. With EVGA, as long as you find a GPU within the original "3 year warranty from original sale date" left.. that is, buy a card within the first 3 years of it being sold retail, then you can totally register it's serial number under your EVGA account and activate another 3 years of warranty on it. I did this for my GTX 770 in the other room, bought it used with 4 months of original warranty left used off ebay and it currently has full warranty coverage until september 2019 still active. Even though that would be 6 years after it was originally sold, and EVGA "officially" only offers warranty for cards for 3 years.


EVGA is a unicorn then. Corsair definitely does not. I have a faulty 1200i just sitting in my office because I can't locate my original bill of sale and they won't RMA it for me without it regardless of what I do, and I refuse to pay them to fix it. They have the worst customer service I've ever dealt with. GTK about EVGA, though. Makes sense, falls in line with their watercooling policy.



ZealotKi11er said:


> Yes some require that. Some dont. ASUS/Giga/MSI work on Serial so you good there. Evga you have to register the card. You can transfer the warranty if you live in NA.


Also good to know, thanks! I wonder if it's the same for Nvidia directly. Both my cards I bought direct from Nvidia during the first presale.


----------



## ZealotKi11er

KCDC said:


> EVGA is a unicorn then. Corsair definitely does not. I have a faulty 1200i just sitting in my office because I can't locate my original bill of sale and they won't RMA it for me without it regardless of what I do, and I refuse to pay them to fix it. They have the worst customer service I've ever dealt with. GTK about EVGA, though. Makes sense, falls in line with their watercooling policy.
> 
> 
> 
> Also good to know, thanks! I wonder if it's the same for Nvidia directly. Both my cards I bought direct from Nvidia during the first presale.


From Nvidia it is most likely only original owner.


----------



## Arexniba

Has anyone done SLI with the Aorus 1080 TI cards?

I have a HAF 932 case, but these cards are behemoths. I had to remove the PSU just so they could fit in the case.

Any of you done mods or recommend a bigger/better case for a 1080 TI setup?

(No clue why the pictures rotated during upload)


----------



## kithylin

Arexniba said:


> Has anyone done SLI with the Aorus 1080 TI cards?
> 
> I have a HAF 932 case, but these cards are behemoth. I had to remove the PSU just so they could fit in the case.
> 
> Any of you done mods or recommend a bigger/better case for a 1080 TI setup?
> 
> (No clue why the pictures rotated during upload)


I would suggest this case here: https://www.newegg.com/Product/Product.aspx?Item=N82E16811352049

Fractal Define R5. Definitely enough room for your two GPUs plus the power supply. I like this one because it has two removable (washable!) dust filters: a large one in front that pops out, and a second one on the bottom that slides out the front easily (much like a clothes-dryer lint screen) without getting behind the computer or moving it. Only $120, too.

Review here: https://www.youtube.com/watch?v=SRZIdbdcIiU

I did a build with this case for someone in Jan 2017: an X99 system with a single hybrid 1080 Ti and an H90 for the CPU. See the final build here: https://www.youtube.com/watch?v=rIhanXE0oVI

Sorry, my video capture isn't the best; it was freezing cold in Washington DC that January and their house was old and drafty. It should give you an idea of what it looks like built, though. I'm also not a pro at cable management, but I did my best.


----------



## john1016

Yes, EVGA still has the same warranty. I recently bought a 980 Ti locally and it started to show lots of artifacts, so they let me RMA it and sent me a 1080 SC. While waiting on the RMA I bought a 980 FTW on eBay that did the same thing after a week, and now I'm waiting to see what I get back for it.

I think I got very unlucky with getting 2 used cards in a row that went bad like that, but I'm very happy I went with EVGA. Glad to now know that there are other companies that are also good with warranties.


----------



## john1016

The two cards in SLI look to be 2.5-slot cards; usually you would use regular 2-slot cards for SLI, which is why they don't fit alongside the PSU. You need a case with 10 PCIe slots.

The CM Stryker and TT Core X9/X5 are a couple of options.


----------



## 2ndLastJedi

feznz said:


> I have to say Gsync works sometimes lowering the core clock as well as usage  Played an old game the other day stayed @ 1600Mhz while about 25% GPU usage, but does take a pretty basic game to lower the core clock.
> 
> Yes the theoretical SH card I am sure there are a plenty of good ones just who really knows what people have done with their cards
> 
> https://www.youtube.com/watch?v=2GHR9zPzZps


FH3 has my 1080 Ti idling at 15xx MHz! Not sure why, even with 8x AA, everything on Ultra, and V-sync.


----------



## feznz

2ndLastJedi said:


> FH3 has my 1080Ti idling @15xxMHz ! not sure why even with 8xAA and Ultra everything with V-sync .


I tend not to worry about those things too much if G-Sync is enabled and keeping the FPS at 60 or 75 Hz, whatever it may be... as long as it drops back to 836 MHz on the desktop, it's all good.
But TBH I turn off G-Sync and use the in-game V-sync options; that way there is no chopping and changing. G-Sync is really more of a sales pitch between Nvidia and the monitor manufacturers.


----------



## KickAssCop

My reasons for not blocking cards.

1) Resale value decrease
2) Usually can't find a guy who buys the waterblock (still have my 980 Ti classified block lying around)
3) High maintenance of complete closed loop solution
4) 5-7 C temperature difference not worth my time

At least in my country.


----------



## 2ndLastJedi

feznz said:


> I tend not to worry about those things too much if Gsync is enabled and keeping the FPS 60 or 75Hz what ever it may be... as long as back on desktop back to 836Mhz its all good.
> but TBH I turn off Gsync and use the in game Vsync options that way there is no chopping and changing really more of a sales pitch between NVidia and Monitor manufactures


On the desktop mine idles at 139MHz from memory.


----------



## kithylin

KickAssCop said:


> My reasons for not blocking cards.
> 
> 3) High maintenance of complete closed loop solution


Yeah.. adding a little coolant to the loop once every 3-4 months and blowing dust out of the radiator with canned air once a month is super hard.

Do you even know how to maintain a computer?

That's literally all you have to do with a custom water loop in regular use. One of my computers here at home has had the same water setup and the same coolant in it for going on 4 years now with no issues. No corrosion, no fluid change necessary; it still cools as well today as it did when I built it 4 years ago. Just top it off and keep the radiators clear and it's great.


----------



## feznz

kithylin said:


> Yeah.. adding a little coolant to the loop once every 3-4 months and blowing dust out of the radiator with air can once a month is super hard.
> 
> Do you even know how to maintain a computer?
> 
> That's literally all you have to do with computer water system (custom loop) for regular usage. One of my computers here at home has had the same water setup, and same coolant in it for going on 4 years now with no issues. No corrosion, no fluid change necessary, still cools as well today as it does when I built it 4 years ago. Just top it off and keep the radiators clear and it's great.


What coolant do you use? My last loop had regular antifreeze in it, lasted 5 years, and had zero corrosion. If I'm lucky I might dust it out every 4-5 months.

But I decided to go all out and replace it with Mayhems Pastel White, which is guaranteed for 3 years. With Grizzly the temps went from 70°C to 35°C and I got another 50 MHz out of the GPU; not that I would ever notice the FPS increase, it's all in the mind 

Yes, I just checked: 139 MHz at idle, unless you've got a few things going on.


----------



## kithylin

feznz said:


> What coolant do you use? I was thinking my last loop I had regular antifreeze in it lasted 5years and zero corrosion.
> If I am lucky I might dust out every 4-5months
> 
> but decided to go all out and replace with mayhems white pastel which is guaranteed for 3years but the temps with grizzly 70°C to 35° got another 50Mhz out of the GPU not that I would ever notice the FPS increase its all in the mind
> 
> 
> yes I just checked 139Mhz for idle unless you got a few things going on


About 50-75 ml of straight (undiluted) automotive antifreeze per gallon of distilled water. Shaken thoroughly, put in the water loop, win. Super cheap and easy, great cooling performance, and no problems in years. I bought a big bottle of antifreeze... gods, 10 years ago? It's stored in the shed outside and I still have 90% of it. And a gallon of distilled water is $0.45 at the Kroger grocery store nearby (they give a discount on store-brand items, and they have a store-brand distilled water).

Just don't ever use the "50/50" pre-mixed stuff. 50% antifreeze is -way- too strong for computer water loops and will dissolve rubber seals over time. A tiny bit, just enough for a little corrosion protection, is plenty.
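For a sense of how dilute that mix actually is, here's a quick back-of-the-envelope check (a sketch; the dose range comes from the post above, and ~3,785 ml per US gallon is the standard conversion):

```python
GALLON_ML = 3785.0  # millilitres in one US gallon of distilled water

def antifreeze_fraction(antifreeze_ml: float, water_ml: float = GALLON_ML) -> float:
    """Concentration of antifreeze in the finished mix, as a fraction."""
    return antifreeze_ml / (antifreeze_ml + water_ml)

for dose in (50, 75):
    print(f"{dose} ml per gallon -> {antifreeze_fraction(dose):.1%} antifreeze")
# roughly 1.3% to 1.9% antifreeze, versus 50% for the pre-mixed stuff
```

So the recipe above lands around a 1-2% corrosion-inhibitor dose, nowhere near the 50/50 premix the post warns against.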


----------



## siffonen

KickAssCop said:


> My reasons for not blocking cards.
> 
> 1) Resale value decrease
> 2) Usually can't find a guy who buys the waterblock (still have my 980 Ti classified block lying around)
> 3) High maintenance of complete closed loop solution
> 4) 5-7 C temperature difference not worth my time
> 
> At least in my country.


3) High maintenance? You might only need a fluid change once a year; I didn't do anything to my loop in over a year.
4) That 5-7°C figure is low; my temps have usually dropped around 30°C with a block.


----------



## ZealotKi11er

KickAssCop said:


> My reasons for not blocking cards.
> 
> 1) Resale value decrease
> 2) Usually can't find a guy who buys the waterblock (still have my 980 Ti classified block lying around)
> 3) High maintenance of complete closed loop solution
> 4) 5-7 C temperature difference not worth my time
> 
> At least in my country.


These days, with AIOs, I do not bother with a full block. The main reason for me is that it is too much work if you want to take the card in and out of the system for testing and such. It's a very permanent solution.


----------



## nrpeyton

Evening,

If you *REALLY *want to squeeze every last drop of performance out of your 1080 Ti; simply flashing the XOC BIOS isn't enough.

There is circuitry inside the GPU core that dynamically alters the core clock and/or shuts down shaders to keep the card within its maximum power envelope. This happens at the hardware level.

That's why doing *BOTH *a shunt mod on PCB *AND *flashing the XOC BIOS is more likely to allow card to perform at its maximum potential.

See this video: _(the Pascal problem)_


----------



## kithylin

nrpeyton said:


> evening
> 
> If you *REALLY *want to squeeze every last drop of performance out of your 1080 Ti; simply flashing the XOC BIOS isn't enough.
> 
> There is circuity inside the GPU core that dynamically alters core clock and/or shuts down shaders to keep card within maximum power envelope. This is at the hardware level.
> 
> That's why doing *BOTH *a shunt mod on PCB *AND *flashing the XOC BIOS is more likely to allow card to perform at its maximum potential.
> 
> See this video:


Except the shunt mod is extremely dangerous. It involves physical changes to the PCB (the one video outlining how to do it has you painting a mask around the chips on the board and applying liquid metal), puts liquid metal on components drawing 500+ watts and running extremely hot (even water-cooled, the VRMs can still reach 70-80°C), voids your factory warranty, and runs a serious risk of killing the card.

The XOC BIOS, by contrast, is easily reversible (just flash the stock BIOS back) and involves no physical changes at all. The XOC BIOS may possibly carry some risk (we don't know for sure yet whether it damages cards over time), but the shunt mod definitely drives the cards -WAY- past their design parameters and right into dangerous territory.

I guess if you really have disposable income, where you can blow $1000 on a card, not care if physical PCB mods fry it in a month, and replace it out of warranty for another $1000 on a whim, go ahead. But I have a feeling most people aren't in that situation.


----------



## feznz

nrpeyton said:


> evening
> 
> If you *REALLY *want to squeeze every last drop of performance out of your 1080 Ti; simply flashing the XOC BIOS isn't enough.
> 
> There is circuity inside the GPU core that dynamically alters core clock and/or shuts down shaders to keep card within maximum power envelope. This is at the hardware level.
> 
> That's why doing *BOTH *a shunt mod on PCB *AND *flashing the XOC BIOS is more likely to allow card to perform at its maximum potential.[/I]


I get it for benching, so..... why haven't I seen any submissions here?

http://www.overclock.net/forum/21-b...-30-3dmark11-scores-single-dual-tri-quad.html
http://www.overclock.net/forum/21-b...7-top-30-unigine-superposition-benchmark.html
http://www.overclock.net/forum/21-b...06006-3dmark-time-spy-benchmark-top-30-a.html
http://www.overclock.net/forum/21-b...al-top-30-unigine-valley-benchmark-1-0-a.html


And really, it's your card; do with it what you want. Back in the day when I was young  I took a soldering iron to GTX 770 shunts, before Nvidia (in software or hardware?) blocked a direct short across the shunts.


----------



## ZealotKi11er

Does 1080 Ti memory differ from brand to brand and model to model? I know GDDR5X is made by Micron, but some cards could use more capable chips. For example, my EVGA 1080 Ti FE + Hybrid kit mines a bit slower than my EVGA 1080 Ti FTW3 Hybrid. One is a launch card and the other came 6 months later. I know EVGA sold some cards with 12 GHz memory.


----------



## SavantStrike

KickAssCop said:


> My reasons for not blocking cards.
> 
> 1) Resale value decrease
> 2) Usually can't find a guy who buys the waterblock (still have my 980 Ti classified block lying around)
> 3) High maintenance of complete closed loop solution
> 4) 5-7 C temperature difference not worth my time
> 
> At least in my country.


I'm having trouble seeing how the resale value would decrease. As long as the block is properly installed and the original air cooler comes with the card, resale value shouldn't be any lower.

Maintenance, I guess, if you're using colored coolant?


----------



## ZealotKi11er

SavantStrike said:


> I'm having trouble seeing how the resale value would decrease. As long as the block is properly installed and the original air cooler comes with the card, resale value shouldn't be any lower.
> 
> Maintenance I guess so if you're using colored coolant?


Most people do not want a blocked card, or they want to block but not an FE card. For example, a reference 290X plus block might sell for the same price as the card alone, or maybe $20 more. You've lost 90% of the block's value right there.


----------



## DarthBaggins

kithylin said:


> I would suggest this case here: https://www.newegg.com/Product/Product.aspx?Item=N82E16811352049
> 
> Fractal Define R5. Definitely enough room for your two GPU's + Power supply. I like this one because it has two removable (Washable!) dust filters, one large one in front that pops out, and a second one on the bottom you can pull out the front easily (very similar to a clothes dryer lint screen) without getting behind the computer or moving it. Only $120 too.
> 
> Review here: https://www.youtube.com/watch?v=SRZIdbdcIiU
> 
> I did a build for someone using this in Jan 2017. X99 system with single hybrid 1080 Ti, with H90 for CPU. See final build here: https://www.youtube.com/watch?v=rIhanXE0oVI Sorry, my video capturing isn't the best, it was freezing cold in Washington DC in Jan and their house was old and leaky. Give you an idea of what it looks like built though. I'm also not a pro at cable management but did my best.


Or the new Define R6


----------



## KickAssCop

SavantStrike said:


> I'm having trouble seeing how the resale value would decrease. As long as the block is properly installed and the original air cooler comes with the card, resale value shouldn't be any lower.
> 
> Maintenance I guess so if you're using colored coolant?


On most cards blocking voids warranty. Also at least buyers in my country don’t want a card that is not as shipped from factory.

I think maintenance is for coolant refills and also possible leaks. Cleaning is a pain as well. Compared to AIO which requires only a dust blower and you are done. Biggest pain is that I go through cards every year and can’t afford to spend 200+ for block and backplate that I know I won’t be able to sell.


----------



## KickAssCop

siffonen said:


> 3) High mainteance? Might only need a fluid change once a year, i didnt do anything to my loop in over a year
> 4) that difference is low, my temps has usually dropped around 30c with block


My Strix runs at 65°C max load on a warm summer day with 28°C ambient; normally 60°C.
My Seahawks run at 50-55°C max load these days, 25°C ambient.
My Hybrids from EVGA run at 45°C max load these days.

What you are saying is that your cards run from 15°C to 30°C using a water block on max load. Please tell me how you are doing this? Chilled water?


----------



## feznz

KickAssCop said:


> On most cards blocking voids warranty. Also at least buyers in my country don’t want a card that is not as shipped from factory.
> 
> I think maintenance is for coolant refills and also possible leaks. Cleaning is a pain as well. Compared to AIO which requires only a dust blower and you are done. Biggest pain is that I go through cards every year and can’t afford to spend 200+ for block and backplate that I know I won’t be able to sell.


Got to ask then: how many 1080 Tis have you had fail? 
Yes, in the mining business you have to cut costs wherever you can, and I understand the margins are slim: one water block was going to cost about 5 days of mining revenue, so across 9 GPUs a full custom loop was going to add a couple more months to ROI.
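The ROI arithmetic above can be written out as a quick sketch (the 5-days-per-block and 9-GPU figures are taken from the post, not from any real price data):

```python
def extra_roi_days(block_cost_in_mining_days: float, gpu_count: int) -> float:
    """Extra days of mining needed to pay off one water block per GPU."""
    return block_cost_in_mining_days * gpu_count

extra = extra_roi_days(5, 9)
print(f"{extra:.0f} extra days, about {extra / 30:.1f} months added to ROI")
# 45 extra days, about 1.5 months added to ROI
```

So "a couple of months" is in the right ballpark: nine blocks at five mining-days each pushes ROI out by roughly a month and a half.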


----------



## ZealotKi11er

After watching a bunch of YouTube videos of people building mining rigs with 1080 Tis, I would have to say: stay away from any air-cooled card. Most of them let the cards run at peak temperature, which is no good.


----------



## SavantStrike

ZealotKi11er said:


> Most people do not want to block or you have people that want to block but not the FE card. For example 290X Referee + Block you can sell maybe same price or $20 more. You lost 90% of the value right there.


Comparing AIB to reference, of course the reference model loses. The block itself is worth less than you paid for it, but it isn't like the card itself has depreciated.


----------



## kithylin

KickAssCop said:


> On most cards blocking voids warranty. Also at least buyers in my country don't want a card that is not as shipped from factory.
> 
> I think maintenance is for coolant refills and also possible leaks. Cleaning is a pain as well. Compared to AIO which requires only a dust blower and you are done. Biggest pain is that I go through cards every year and can't afford to spend 200+ for block and backplate that I know I won't be able to sell.


As stated before, once you build a loop, it needs no more maintenance than an air-cooled card: canned air for the radiators, and maybe a little coolant added 3-4 times per year. For you, blocking cards may not make much sense, especially with only 28°C ambient there. Some people, however, buy one high-end Nvidia card every 3-4 years, and for them blocking makes more sense. Most custom water loops aren't rebuilt every year; most custom-looped PCs are 3-5 year builds. Or take me in Texas, USA, where our summers typically run 43-49°C for 4-5 months of the year and we struggle to hold 28°C ambient indoors with good air conditioning; blocks make a lot more sense here for our GPUs.



KickAssCop said:


> My Strix runs at 65 C Max load on a warm summer day 28C ambient. Normally 60C.
> My Seahawks run at 50-55 C Max load these days 25C ambient.
> My Hybrids from EVGA run at 45 C Max load these days.
> 
> What you are saying is that your cards run from 15C to 30C using a water block on max load. Please tell me how you are doing this? Chilled water?


Obviously the only way to get a 30°C drop from a water block is by going from an air cooler to a block, which is obviously what siffonen was referring to.



ZealotKi11er said:


> After watching a bunch of Youtube video of people building mining rings with 1080 Ti I would have to say, stay away from any air cooled card. Most of them are letting them run at peak temp which is no good.


As long as it's a stock card and not flashed with some sort of custom BIOS, it doesn't matter. The cards throttle down to safe levels if they run too hot, and it won't hurt anything. The big problem is that some coin miners flash a BIOS that removes the throttle limits, mine on the cards, then flash the stock BIOS back and sell them as normal cards. That's the real issue.


----------



## ZealotKi11er

kithylin said:


> As long as it's a stock card and not flashed with some sort of custom bios, then it doesn't matter. The cards will throttle-down to safe levels if they run too hot and it won't hurt anything. The big problem with coin miners is some of em flash bios's in to the cards and remove the throttle limitations and then mine on em, then flash stock back and then sell em as a normal card. That's the real issue.


Most custom cards run well below the safety limit; 1080 Tis temperature-throttle when they hit 84°C. I have no idea what other parts like the memory and VRM are hitting. My 1080 Ti FTW3's VRM is superbly cooled but still runs around 60°C, and not every card has a huge VRM cooler. The reason these mining cards run so hot is that six of them are packed in one place. It's like taking consumer HDDs and using them in high-density applications. I did not even want to run my 1080 Ti FE at 84°C; I immediately lowered the voltage and boost so I don't hit 81°C. All I am saying is that cards left to throttle on temperature like that are bad news.


----------



## KickAssCop

kithylin said:


> As stated before, once you build a loop, no more maintenance other than an air cooled card is needed, air duster for radiators is all.. maybe add a little coolant 3-4 times per year. And for you blocking cards may not make much sense, especially with only 28c ambient there. Some people however get 1 high end nvidia card once every 3-4 years and for them, blocking would make more sense. Most custom water loop builds aren't changed every year, most custom water looped PC's are a 3-5 year build. Or like me in Texas, USA where our summers are typically in the 43C <-> 49C range for about 4-5 months out of the year, and we struggle to maintain 28c ambient indoors with good air conditioners, blocks make a lot more sense here for our gpu's.
> 
> Obviously the only way to get a -30c drop converting to a water block is when one is going from air cooled -> water block. Which obviously is what siffonen was referring to.
> 
> As long as it's a stock card and not flashed with some sort of custom bios, then it doesn't matter. The cards will throttle-down to safe levels if they run too hot and it won't hurt anything. The big problem with coin miners is some of em flash bios's in to the cards and remove the throttle limitations and then mine on em, then flash stock back and then sell em as a normal card. That's the real issue.


I have 55°C summers, with 23-25°C ambient indoors with A/C.
The Strix is at 65°C max load during summer. It is an air-cooled card.
I don't know a single person who bought a high-end card and kept it more than 2 years.

Especially guys with blocks usually end up changing cards more often, at least the guys I know on the forums I frequent. Keeping a graphics card 3-5 years is normally a budget-build thing, not high end, AFAIK or from what I've read on forums.


----------



## nrpeyton

Ambient here indoors is about 22°C. My card never exceeds 25°C ;-)


----------



## OccamRazor

Hey Nick, how's it going man? Long time... How's your Ti treating you?

Cheers

Occamrazor


----------



## siffonen

KickAssCop said:


> I have 55 C summers. 23-25 C ambient indoors w/ A/C.
> Strix is at 65 C max load during summer. It is an air cooled card.
> I don't know a single person that bought a high end card and kept it more than 2 years.
> 
> Especially guys with blocks usually end up changing more often. At least the guys I know on forums I frequent. 3-5 years for a graphics card is for budget builds normally not high end afaik or have read on forums.


I usually keep my card at least 3 years before upgrading, and I do have a custom loop.


----------



## KCDC

siffonen said:


> i usually keep my card atleast 3 years before upgrading, and i do have a custom loop


Same, I really don't see a reason to keep taking this dude's bait.


----------



## SavantStrike

ZealotKi11er said:


> Most custom cards run well below the safety limit. 1080 Ti temp throttle when they hit 84C. I have no idea what other parts like memory and VRM are hitting. My 1080 Ti FW3 VRM is superbly cooled but still runs around 60C. Not all card have a huge VRM cooler. The reason these cards runs so hot is because they are running 6 are one place. Its like taking consumer HDD and using them for high density applications. I did not even want to run my 1080 Ti FE at 84C. I immediately lowered volts and boost so i do not hit 81C. All I am saying is that cards left like that to throttle due to temps are bad news.


I agree. 

It amazes me how hot some people let their gear run, but those people are the ones who have cards die in 2-3 years.


----------



## erso44

SavantStrike said:


> I agree.
> 
> It amazes me how hot some people let their gear run, but those people are the ones who have cards die in 2-3 years.


My GPU reaches 85-87°C (MSI GTX 1080 Ti Armor OC, lol) under full load. I am not sure if I should go back to water cooling, because I don't want to void my warranty... especially after I paid €800. EKWB has released some sexy GPU water blocks  My previous setup was water-cooled, and temps while overclocking were in a safe range. I really don't know if it's worth it anymore.


----------



## feznz

erso44 said:


> My GPU reaches 85-87 (MSI GTX 1080 TI Armored OC lol) under fullload. I am not sure if I should go back to watercooling because I don't like to avoid my warranty...especially after I paid 800€. EKWB has released some sexy GPU waterblocks  My previous setup was watercooled and temps while overclocking was in a "safe range". I really don't know if it's worth anymore.


Wow, that is quite hot for a card. I believe the MSI cards still have a GPU VRM temp probe that can be monitored via MSI Afterburner, and they might still have memory temp probes as well; I'm not sure without looking.
But that heat will migrate all over the card. Caps are usually rated at something like 5,000 hours at 105°C; no one actually runs them at 105°C, and by the usual rule of thumb their life roughly doubles for every 10°C below that rating, but every extra degree of sustained heat eats directly into that headroom.
So realistically you get a reduced real-life expectancy. Personally I would say 60-70°C is the upper limit for a GPU, so unless you can control the heat a little better via reduced clocks, you will have no choice but to remove the cooler and replace the TIM at the very least.
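The capacitor-lifetime claim can be sanity-checked against the common 10°C rule of thumb (electrolytic cap life roughly doubles for every 10°C below the rated temperature; this is an approximation, not a datasheet guarantee, and the 5,000 h / 105°C rating is the figure from the post):

```python
def cap_life_hours(rated_hours: float, rated_temp_c: float, actual_temp_c: float) -> float:
    """Estimate electrolytic capacitor life via the 10-degree doubling rule."""
    return rated_hours * 2 ** ((rated_temp_c - actual_temp_c) / 10)

for temp in (95, 85, 75):
    print(f"{temp} C -> ~{cap_life_hours(5000, 105, temp):,.0f} hours")
# 95 C -> ~10,000 hours; 85 C -> ~20,000 hours; 75 C -> ~40,000 hours
```

So even a 5,000-hour/105°C cap held at 75°C is good for years of continuous use; the point stands that every 10°C hotter halves that estimate.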
If you plan on keeping this card long term, then water block it. I might add that the other day I was watching a YouTube video about getting into mining; this guy had 70 1080 Tis in 10 rigs and has had 4 cards fail so far, MSI cards: https://www.youtube.com/watch?v=nXHAjBMLRqY&t=665s

That was from November, but from what I have noticed, even though there has been a semi-crash in mining, it has put no one off; everyone just keeps on trucking in the belief that the price will some day...... be back to last year's crazy peak.

Edit: BTW, the more I look into mining, the more I think this has got to be the biggest fraud in the history of man. I could be wrong; only time will tell.
But if I were to seriously get into mining, I'd forget the GPUs and go straight for the bad boy, the Antminer S9, if I had the balls, because those have no resale value except for mining; that's all they do.


----------



## erso44

feznz said:


> Wow that is quite hot for a card I believe that the MSI cards still have GPU VRM temp probe that can be monitored via MSI AB might still have memory temp probes as well not sure without looking.
> But that heat will migrate all over the card now caps are usually rated at 5000hrs @ 105° BUT which is short no-one runs at 105°C but even at lets say 75°C the MTBF will be drastically reduced to lets sat 10 000hrs.
> So realisably you have a real life reduced life expectancy personally I would say 60°C-70°C is upper limit for a GPU so unless you can control the heat a little better via reduced clocks you will have no choice but to remove the cooler and replace the TIM at the very least.
> but if you plan on long term with this card then water block I might add the other day watching a Utube about getting into mining this gut had 70 1080Ti in 10 rigs he had 4 cards? fail so far MSI cards
> 
> 
> https://www.youtube.com/watch?v=nXHAjBMLRqY&t=665s
> 
> But what he was saying November but as what I have noticed that even though there has been a semi crash with mining it has put no-one off just keep on trucking in the believe that the price will some day...... be back to the crazy peak last year.
> 
> edit BTW the more I look into mining the more I am thinking this is got to be the biggest fraud of the history of man I could be wrong but only time will tell
> but if I were to seriously get into mining forget the GPUs go straight to the bad boy Antminer S9 if you had balls because they have no resell value except for mining thats all they do.


I just read that my fan should hit 4000 RPM at 100% fan speed. If this is correct, my card is not working properly :O


----------



## SavantStrike

erso44 said:


> feznz said:
> 
> 
> 
> Wow that is quite hot for a card I believe that the MSI cards still have GPU VRM temp probe that can be monitored via MSI AB might still have memory temp probes as well not sure without looking.
> But that heat will migrate all over the card now caps are usually rated at 5000hrs @ 105° BUT which is short no-one runs at 105°C but even at lets say 75°C the MTBF will be drastically reduced to lets sat 10 000hrs.
> So realisably you have a real life reduced life expectancy personally I would say 60°C-70°C is upper limit for a GPU so unless you can control the heat a little better via reduced clocks you will have no choice but to remove the cooler and replace the TIM at the very least.
> but if you plan on long term with this card then water block I might add the other day watching a Utube about getting into mining this gut had 70 1080Ti in 10 rigs he had 4 cards? fail so far MSI cards
> 
> 
> https://www.youtube.com/watch?v=nXHAjBMLRqY&t=665s
> 
> But that's what he was saying in November, and from what I have noticed, even though there has been a semi-crash in mining it has put no-one off; they just keep on trucking in the belief that the price will some day... be back to the crazy peak of last year.
> 
> edit: BTW, the more I look into mining, the more I think this has got to be the biggest fraud in the history of man. I could be wrong, but only time will tell.
> But if I were to seriously get into mining, I'd forget the GPUs and go straight for the bad boy Antminer S9, if you had the balls, because they have no resale value except for mining; that's all they do.
> 
> 
> 
> I just read my fan should hit 4000 rpm at 100% (fan load). IF this is correct my card is not working properly :O
Click to expand...

Do you have a Founders Edition or something else? 4K RPM is for blower cards.


----------



## kithylin

ZealotKi11er said:


> Most custom cards run well below the safety limit. 1080 Tis temp-throttle when they hit 84°C. I have no idea what other parts like the memory and VRM are hitting. My 1080 Ti FTW3's VRM is superbly cooled but still runs around 60°C. Not all cards have a huge VRM cooler. The reason these cards run so hot is because they are running six in one place. It's like taking consumer HDDs and using them for high-density applications. I did not even want to run my 1080 Ti FE at 84°C; I immediately lowered volts and boost so I do not hit 81°C. All I am saying is that cards left like that to throttle due to temps are bad news.





SavantStrike said:


> I agree.
> 
> It amazes me how hot some people let their gear run, but those people are the ones who have cards die in 2-3 years.





erso44 said:


> My GPU reaches 85-87°C (MSI GTX 1080 Ti Armor OC lol) under full load. I am not sure if I should go back to watercooling because I don't want to void my warranty... especially after I paid 800€. EKWB has released some sexy GPU waterblocks  My previous setup was watercooled and temps while overclocking were in a "safe range". I really don't know if it's worth it anymore.


Nothing adverse will happen. Linus Tech Tips (LTT) even had a video showing that his years-old GTX 480, after running many, many years at 85°C constantly, ran just as well in 2016 as it did when new.

See the video here: https://www.youtube.com/watch?v=44JqNJq-PC0

Running hot does not have the adverse effects on video cards over the years that you would think.


----------



## feznz

Yes, true for degrading, but jump to 6:30 where he talks about failure.
GPUs that fail do exactly what he said: one minute they are working fine, the next second they fail... exactly what happened to me with 2 MSI GTX 580s. No performance degradation, just poof, with a slight burning smell.
Caused by excessive heat, overclocking, and excessive voltage.

The same guy was talking about buying mining GPUs, and it's the same story: "I mined for XXX years or months, no problems." Yes, GPUs give no warning of complete failure, and generally it's not the die itself that fails; the VRM or capacitors are the main cause of failure, or the solder balls between the die and the PCB can fracture from excessive thermal cycling, or you get artifacts from failing memory.


----------



## ZealotKi11er

kithylin said:


> Nothing adverse will happen. Linus Tech Tips (LTT) even had a video showing that his years-old GTX 480, after running many, many years at 85°C constantly, ran just as well in 2016 as it did when new.
> 
> See the video here: https://www.youtube.com/watch?v=44JqNJq-PC0
> 
> Running hot does not have the adverse effects on video cards over the years that you would think.


Considering how many computers he has and how much time he spends gaming, I do not think his GTX 480 WC is a good example.


----------



## zGunBLADEz

nrpeyton said:


> evening
> 
> If you *REALLY *want to squeeze every last drop of performance out of your 1080 Ti; simply flashing the XOC BIOS isn't enough.
> 
> There is circuitry inside the GPU core that dynamically alters the core clock and/or shuts down shaders to keep the card within its maximum power envelope. This is at the hardware level.
> 
> That's why doing *BOTH *a shunt mod on PCB *AND *flashing the XOC BIOS is more likely to allow card to perform at its maximum potential.
> 
> See this video: https://www.youtube.com/watch?v=bflLDenKirQ _(the pascal problem)_


IDK about that; the XOC BIOS removes some features, but you lose performance, and squeezing those extra 50-100 MHz requires a lot of voltage, which is not worth it.

As long as you keep your card from power throttling and well cooled, you are golden.

My 1080 Ti does 2076 MHz / +550 at 0.98 V completely stable under a Koolance 220 block; temps using this block are like 30°C avg, pretty cool for a universal block.

I recently put her under a full block, along with my 8700K paired with a monoblock, and my temps went up like 5°C, but I have my VRMs, mem, and PCB in general cooled.
My water temp right now is about 2°C over the motherboard temp; the loop is optimal.

She does 2101 / +620 before error correction with 1.012 V. Is it worth it vs. the 0.98 V, you may ask? No, but the mem overclock speed, yes, that is worth it tho.
I got gains as if I had raised the core alone 200 MHz, just from those 70 MHz on the memory.

Pascal core overclocks are meaningless; you need the shader counts and the mem speed XD
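A rough way to sanity-check that voltage/clock tradeoff: to a first order, dynamic power scales with frequency times voltage squared, so a tiny clock bump at higher voltage costs disproportionately more power. Here's a quick back-of-the-envelope sketch using the two operating points quoted above (an illustrative model only, not a measurement; real cards also have static leakage):

```python
# First-order CMOS dynamic-power model: P is roughly proportional to f * V^2.
# Back-of-the-envelope only; real cards add static leakage and other effects.
def relative_power(f1_mhz: float, v1: float, f2_mhz: float, v2: float) -> float:
    """Power of operating point 2 relative to point 1, under P ~ f * V^2."""
    return (f2_mhz / f1_mhz) * (v2 / v1) ** 2

# The two points from the post: 2076 MHz @ 0.98 V vs 2101 MHz @ 1.012 V.
clock_gain = 2101 / 2076 - 1                              # ~1.2% more core clock
power_cost = relative_power(2076, 0.98, 2101, 1.012) - 1  # ~7.9% more power
print(f"{clock_gain:.1%} faster for {power_cost:.1%} more power")
```

Roughly 1% more clock for roughly 8% more power, which matches the "not worth it" verdict on the core bump alone.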


----------



## chris89

Can you edit the parameters of the bios on these cards, or is there no editor like on Vega?


----------



## Blameless

Operating temperature is one factor of many interrelated ones when it comes to card longevity/reliability.

You can run very high temps for a very long time if you aren't pulling much current, running high volts, or seeing frequent thermal cycles. Conversely, you may have to keep temperatures far lower to be anywhere near safe once you start increasing voltage or current draw and very few board components will handle frequent 80C+ temperature swings in the long term.



feznz said:


> But that heat will migrate all over the card. Now, caps are usually rated at 5000 hrs @ 105°C, which sounds short, BUT no-one actually runs at 105°C; even at, let's say, 75°C, the MTBF will still only be, let's say, 10,000 hrs.


The rule of thumb (related to Black's Equation and the like) is that every 10C reduction in temperatures very roughly doubles longevity, all other things being equal.

If something is rated for 5k hours at 105C, it's probably going to last a very long time at 75C...closer to 40k hours on average than 10k hours.

Of course, that's for steady state degradation like electromigration, oxidation, and capacitor wear, not thermomechanical failures. However, the latter are far less likely in dedicated mining setups because the number of thermal cycles they see is minimal.

Most of my mining hardware has proven to last longer than gaming hardware that has been put under intermittent high loads, and much much longer than hardware I've used for benching.
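That 10C-per-doubling rule of thumb is easy to sketch as a quick calculation (a rough illustration of the heuristic only, not a manufacturer derating formula):

```python
# Rule of thumb from above: every 10 C cooler roughly doubles component lifetime.
# Illustrative heuristic only; real derating comes from manufacturer datasheets.
def estimated_life_hours(rated_hours: float, rated_temp_c: float,
                         actual_temp_c: float) -> float:
    """Scale a rated lifetime by 2x for every 10 C below the rated temperature."""
    return rated_hours * 2 ** ((rated_temp_c - actual_temp_c) / 10)

# A capacitor rated 5,000 h @ 105 C, run at 75 C instead:
print(estimated_life_hours(5000, 105, 75))  # 40000.0 -> "closer to 40k hours"
```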



ZealotKi11er said:


> Considering how many computers he has and how much time he spends gaming I do not think his GTX 480 WC is a good example.


My worst clocking, and thus least OCed/tuned/benched/molested, GTX 480 is still running with its reference air cooler and has tens of thousands of hours of use over 85C.

My best GTX 480 lasted less than six months at ~55C...because I was ramming more than double peak stock current through the VRM when I had it overclocked by 35% underwater.

Context is everything. 90C is fine for reference clocks at typical loads, but 40C colder may be way too hot for heavier loads while running way beyond spec.


----------



## SavantStrike

kithylin said:


> Nothing adverse will happen. Linus Tech Tips (LTT) even had a video showing that his years-old GTX 480, after running many, many years at 85°C constantly, ran just as well in 2016 as it did when new.
> 
> See the video here: https://www.youtube.com/watch?v=44JqNJq-PC0
> 
> Running hot does not have the adverse effects on video cards over the years that you would think.


That card is still alive because it spent a fair amount of time under water (and Linus managed not to drop it). 

A significant number of Fermi cards died from heat related illness. It's generally NOT a good idea to run at 80C for long periods of time unless you want to be one of the people who wonder why their 2-3 year old GPUs keep dying.

Sent from my ZTE Axon 7 Resurrection Remix.


----------



## nrpeyton

OccamRazor said:


> Hey Nick, hows it going man? Long time... Hows your TI treating you?
> 
> Cheers
> 
> Occamrazor


Aye it's good mate. Thanks.

Card is doing fine. Still got it shunt modded and still got the XOC BIOS flashed. Also still hooked up to the water chiller (which I have left with the thermostat set at 12°C for the last few months now, so 24/7).

It's never given me any problems; temps never exceed 25°C even under an extremely heavy load. Most of the time temps don't exceed about 18-20°C.

I'm mostly playing Mass Effect just now. I also dabbled with mining for a month and made over $130, but I just need to find a way to cash it in. I've not actively mined at all this month, though; I got bored when I realised you have to jump through hoops to translate it into real money.

How are you doing?




--------------






zGunBLADEz said:


> IDK about that; the XOC BIOS removes some features, but you lose performance, and squeezing those extra 50-100 MHz requires a lot of voltage, which is not worth it.
> 
> As long as you keep your card from power throttling and well cooled, you are golden.
> 
> My 1080 Ti does 2076 MHz / +550 at 0.98 V completely stable under a Koolance 220 block; temps using this block are like 30°C avg, pretty cool for a universal block.
> 
> I recently put her under a full block, along with my 8700K paired with a monoblock, and my temps went up like 5°C, but I have my VRMs, mem, and PCB in general cooled.
> My water temp right now is about 2°C over the motherboard temp; the loop is optimal.
> 
> She does 2101 / +620 before error correction with 1.012 V. Is it worth it vs. the 0.98 V, you may ask? No, but the mem overclock speed, yes, that is worth it tho.
> I got gains as if I had raised the core alone 200 MHz, just from those 70 MHz on the memory.
> 
> Pascal core overclocks are meaningless; you need the shader counts and the mem speed XD


That's the thing, though. The card may report 2076 MHz through software, but in reality that clock may actually be something quite different. Moreover, there may be cores/shaders being dynamically shut down within your GPU and you'd never know, not without direct performance comparisons/benchmarks. Even then, it will only show the bizarre behaviour; it won't let you pinpoint a precise reason.


----------



## feznz

Blameless said:


> Operating temperature is one factor of many interrelated ones when it comes to card longevity/reliability.
> 
> You can run very high temps for a very long time if you aren't pulling much current, running high volts, or seeing frequent thermal cycles. Conversely, you may have to keep temperatures far lower to be anywhere near safe once you start increasing voltage or current draw and very few board components will handle frequent 80C+ temperature swings in the long term.
> 
> 
> 
> The rule of thumb (related to Black's Equation and the like) is that every 10C reduction in temperatures very roughly doubles longevity, all other things being equal.
> 
> If something is rated for 5k hours at 105C, it's probably going to last a very long time at 75C...*closer to 40k hours on average than 10k hours*.
> 
> Of course, that's for steady state degradation like electromigration, oxidation, and capacitor wear, not thermomechanical failures. However, the latter are far less likely in dedicated mining setups because the number of thermal cycles they see is minimal.
> 
> Most of my mining hardware has proven to last longer than gaming hardware that has been put under intermittent high loads, and much much longer than hardware I've used for benching.
> 
> 
> 
> My worst clocking, and thus least OCed/tuned/benched/molested, GTX 480 is still running with its reference air cooler and has tens of thousands of hours of use over 85°C.
> 
> My best GTX 480 lasted less than six months at ~55C...because I was ramming more than double peak stock current through the VRM when I had it overclocked by 35% underwater.
> 
> Context is everything. 90C is fine for reference clocks at typical loads, but 40C colder may be way too hot for heavier loads while running way beyond spec.


That was a random number pulled out of the air. IDK why caps are rated for 5000 hrs @ 105°C rather than xxxx hrs @ 50°C.
But we are on the same page


----------



## zGunBLADEz

Yes, I saw the same thing with the 1080, so I started using lower voltages instead of trying to raise them, and the architecture responded to that instead of the "mooooar volts" approach we were used to.


----------



## dentnu

Hi everyone, it's been a while since I checked this thread and I really do not wanna read all 411 pages. I have an MSI GTX 1080 Ti Gaming X and am looking for the best BIOS for this card. My card is using this BIOS: https://www.techpowerup.com/vgabios/190927/msi-gtx1080ti-11264-170320. Can someone please help me? Thanks


----------



## kithylin

chris89 said:


> Can you edit the parameters of the bios on these cards, or is there no editor like on Vega?


First off, you meant Maxwell. Maxwell has the last/latest full custom-BIOS editor software that works. Vega is the yet-unreleased Nvidia video cards that aren't even out yet, so there's no editor for Vega yet; the GPUs aren't even for sale retail. (There is a Titan-V Vega out, but no one's made custom BIOS software for it.) That aside, there is also no full editor program for GTX 1000 (Pascal) like we have for Maxwell. There is an editor program, but currently it only edits mobile video cards, and all user-created custom BIOSes made with it have to be flashed manually by disassembling the laptop and physically connecting a clip-on flashing tool to the card itself. We can't flash user-created BIOSes for mobile GPUs with software.

So no, there's not really an editor yet.


----------



## 2ndLastJedi

kithylin said:


> chris89 said:
> 
> 
> 
> Can you edit the parameters of the bios on these cards, or is there no editor like on Vega?
> 
> 
> 
> First off, you meant Maxwell. Maxwell has the last/latest full custom-BIOS editor software that works. Vega is the yet-unreleased Nvidia video cards that aren't even out yet, so there's no editor for Vega yet; the GPUs aren't even for sale retail. (There is a Titan-V Vega out, but no one's made custom BIOS software for it.) That aside, there is also no full editor program for GTX 1000 (Pascal) like we have for Maxwell. There is an editor program, but currently it only edits mobile video cards, and all user-created custom BIOSes made with it have to be flashed manually by disassembling the laptop and physically connecting a clip-on flashing tool to the card itself. We can't flash user-created BIOSes for mobile GPUs with software.
> 
> So no, there's not really an editor yet.
Click to expand...

Vega? 
He means Volta!


----------



## kithylin

2ndLastJedi said:


> Vega?
> He means Volta!


Yeah meant Volta.. my brain no work sometimes. I was hungry when typing that earlier.


----------



## 2ndLastJedi

kithylin said:


> 2ndLastJedi said:
> 
> 
> 
> Vega?
> He means Volta!
> 
> 
> 
> Yeah meant Volta.. my brain no work sometimes. I was hungry when typing that earlier.
Click to expand...

I implied male with he but, well... Who knows who anyone is, lol


----------



## feznz

2ndLastJedi said:


> I implied male with he but, well... Who knows who anyone is, lol


judging by avatar....... :thinking:


----------



## kithylin

dentnu said:


> Hi everyone its been a while since I checked this thread and really do not wanna read all 411 pages. I have a MSI GTX 1080 Ti Gaming X. I am looking for the best bios for this card. My card is using this bios https://www.techpowerup.com/vgabios/190927/msi-gtx1080ti-11264-170320. Can Someone please help me ? Thanks


Are you using full custom water loop for your video card? Or air cooled?


----------



## erso44

Well, I did an RMA and they said it will be back after one month  because of:
- coil whine
- 100% fan speed = 2300 RPM -> operating temperature between 85-87°C

Yesterday I was looking at an EKWB waterblock when suddenly the coil whine started. I'm lucky the coil whine didn't start after the waterblock arrived, or my warranty would have been voided. :/
Whatever, let's wait for MSI's action and see what they will do for their clients.

I'll keep you guys up to date.


----------



## feznz

erso44 said:


> Well, I did an RMA and they said it will be back after one month  because of:
> - coil whine
> - 100% fan speed = 2300 RPM -> operating temperature between 85-87°C
> 
> Yesterday I was looking at an EKWB waterblock when suddenly the coil whine started. I'm lucky the coil whine didn't start after the waterblock arrived, or my warranty would have been voided. :/
> Whatever, let's wait for MSI's action and see what they will do for their clients.
> 
> I'll keep you guys up to date.


Just for your info, running very hot seems to be normal for this card, judging by the general feedback on Newegg. Could be a good time to sell / demand a refund, as the cooler seems pretty substandard.


https://www.newegg.com/global/nz/Product/Product.aspx?Item=N82E16814137111


----------



## nrpeyton

Time for some overclocking.

(It's snowing & the dew point is below 1°C today).

P.S. how do I remove annoying thumbnail from my posts? _(Never had this problem before the overclock.net update)._


----------



## BulletSponge

Have any 1080 Ti owners here had problems with freezing in games? I replaced my 1070 with an EVGA 1080 Ti SC2 two days ago and am having issues with BF1. The game will launch and run fine for about 5 minutes before a brief bit of stuttering and then freezing. When I first installed the card it was freezing in BF1 and World of Warships on the loading screen, but after running DDU and reinstalling the drivers and Afterburner I am only having problems with BF1. Hoping I didn't get a dud card. Also, performance in the Valley and Heaven benchmarks seems ~15% below what I have been able to find online. At lunch now, so I can't upload any screenshots until this evening.


----------



## kithylin

erso44 said:


> Well, I did an RMA and they said it will be back after one month  because of:
> - coil whine
> - 100% fan speed = 2300 RPM -> operating temperature between 85-87°C
> 
> Yesterday I was looking at an EKWB waterblock when suddenly the coil whine started. I'm lucky the coil whine didn't start after the waterblock arrived, or my warranty would have been voided. :/
> Whatever, let's wait for MSI's action and see what they will do for their clients.
> 
> I'll keep you guys up to date.


Are you out-of-the-box stock and it's running 85°C somehow? Did you put the XOC BIOS on an air-cooled card? A normal GTX 1080 Ti shouldn't ever run that hot out of the box with just normal software overclocking. I seriously doubt it's any sort of defect of the card itself, either; running that hot sounds more like you're putting some sort of modified BIOS on it and trying to overclock it on air cooling, or something "not normal" is going on there.

The coil whine, I guess, is a returnable reason. Personally I game with headphones on, though, so I'd never hear it or even care if it did whine.


----------



## nrpeyton

kithylin said:


> Are you out-of-the-box stock and it's running 85°C somehow? Did you put the XOC BIOS on an air-cooled card? A normal GTX 1080 Ti shouldn't ever run that hot out of the box with just normal software overclocking. I seriously doubt it's any sort of defect of the card itself, either; running that hot sounds more like you're putting some sort of modified BIOS on it and trying to overclock it on air cooling, or something "not normal" is going on there.
> 
> The coil whine, I guess, is a returnable reason. Personally I game with headphones on, though, so I'd never hear it or even care if it did whine.


That's the 1st thing that came to mind for me too when he mentioned the fan speed. _(flashing of a non-blower card's BIOS)._


----------



## nrpeyton

Wow, just saw the news about the folders whose computing power was really being redirected into crypto mining without user consent/knowledge.


----------



## Thoth420

My FTW3 Gaming sometimes fails to downclock when idling. Anyone else have this problem?


----------



## kithylin

Thoth420 said:


> My FTW3 Gaming sometimes fails to downclock when idling. Anyone else have this problem?


My 1080 Ti does this if I have 3 x 1080p screens and have one or more set to anything above 60hz (like 75 Hz or 80 Hz), and only if I do 3-screen nvidia surround.

If I'm doing independent screens and all 3 are set to 1080p @ 60 Hz, then it downclocks when idle just fine.

This might have something to do with your issue.


----------



## DarthBaggins

nrpeyton said:


> wow just seen the news about the folders whose computing power was really being re-directed into crypto mining without user consent/knowledge.


Yeah, those would be folding for the Curecoin team. That's a lot of power, since folding runs similar workloads but isn't as intense on the hardware (that's been my experience so far; my hardware has had a good long life).


----------



## feznz

Thoth420 said:


> My FTW3 Gaming sometimes fails to downclock when idling. Anyone else have this problem?


Having more tabs open will also do this, as will the Windows power plan being set to Maximum Performance instead of Balanced.


----------



## Thoth420

kithylin said:


> My 1080 Ti does this if I have 3 x 1080p screens and have one or more set to anything above 60hz (like 75 Hz or 80 Hz), and only if I do 3-screen nvidia surround.
> 
> If I'm doing independant screens and all 3 are set to 1080p @ 60hz, then it down clocks when idle just fine.
> 
> This might have something to do with your issue.


I'm running a single ultrawide at 100 Hz (not an OC, native), 3440x1440. Perhaps it doesn't matter either way; it's just that my card will sit at like 40°C if it never downclocks when not in use. Was just curious about the cause.


----------



## kithylin

Thoth420 said:


> I'm running a single ultrawide at 100hz(not an OC, native) 3440 x 1440 perhaps it doesn't matter either way just my card will sit at like 40C if it never down clocks when not in use. Was just curious to the cause.


It might be the refresh rate. Go to Windows graphics settings and just try setting it to 60 Hz and see if it downclocks when idle. If it does, you found your answer. If it doesn't clock down at 60 Hz, then it's something else.


----------



## KCDC

Remember when we were talking about EVGA's amazing RMA policy with used cards... and then someone was like, I wonder if they'll keep it with all the miners...

nope

http://www.overclock.net/forum/225-hardware-news/1665961-evga-evga-does-away-guest-rma.html

sad day.


----------



## ZealotKi11er

KCDC said:


> Remember when we were talking about EVGAs amazing RMA policy with used cards.... and then someone was like, I wonder if they'll keep it with all the miners...
> 
> nope
> 
> http://www.overclock.net/forum/225-hardware-news/1665961-evga-evga-does-away-guest-rma.html
> 
> sad day.


I never knew they even had that. I always thought you had to register the card.


----------



## feznz

KCDC said:


> Remember when we were talking about EVGAs amazing RMA policy with used cards.... and then someone was like, I wonder if they'll keep it with all the miners...
> 
> nope
> 
> http://www.overclock.net/forum/225-hardware-news/1665961-evga-evga-does-away-guest-rma.html
> 
> sad day.


Not surprising...
Even so, an RMA process can be very painful, especially here in NZ: you send the card in for RMA, it has to go to the nearest service agent, usually by boat to China or wherever... 3 months later you get a new card. Never again... I found a new respect for treating cards nicely.
Never will I fold or mine with the cards in my current gaming rig; maybe with worthless throwaway cards in winter for space heating.


----------



## KCDC

feznz said:


> Not surprising...
> Even so, an RMA process can be very painful, especially here in NZ: you send the card in for RMA, it has to go to the nearest service agent, usually by boat to China or wherever... 3 months later you get a new card. Never again... I found a new respect for treating cards nicely.
> Never will I fold or mine with the cards in my current gaming rig; maybe with worthless throwaway cards in winter for space heating.



And I'm fairly sure any other companies that had similar return policies, if any, will do the same.


----------



## Thoth420

kithylin said:


> It might be the refresh rate. Goto windows graphics settings and just try setting it to 60 hz and see if it down clocks when idle. If it does.. you found your answer. If it doesn't clock down at 60hz, then it's something else.


Next time it happens I'll give that a try.


----------



## FUZZFrrek

Thoth420 said:


> My FTW3 Gaming sometimes fails to downclock when idling. Anyone else have this problem?


It depends on whether the card is needed for any graphical rendering of any sort (browsers, 3rd-party apps, etc.). The outcome differs depending on which power preset you choose in the Nvidia Control Panel. With the Power Management Mode set to ''Optimal Power'', your GPU clock will sit at its minimal power state IF nothing is open on a fresh reboot; even if a browser is opened, it will stay at the minimal GPU clock.

The other way is also true if you set the Power Management Mode to ''Prefer maximum performance''...
https://drive.google.com/file/d/1rz-v7ionNeaaLqbpp9NS2Do60BPRdN_X/view?usp=drivesdk





Wait a minute... While doing the steps to reproduce, I get this:
https://drive.google.com/file/d/1hfIqhxwa6XiwnhIyJVjmSlXrXpToQFnA/view?usp=drivesdk

It makes no sense...

Any Insight guys?


----------



## DarthBaggins

erso44 said:


> well, I did a RMA and they said it will be back after one month  because of:
> - coil whine
> - 100% fan speed = 2300 rpm -> operating temperature between 85-87°C
> 
> Yesterday I was looking for an EWKB waterblock when suddenly the coil whining started. I´m lucky that coil whining didn't start after the waterblock arrived and my warranty would be avoided. :/
> Whatever, let's wait for MSI's action and see what they will do for their clients.
> 
> I'll keep you guys up to date.


My Aero OC had light coil whine on the stock blower; now that it's under water there is zero coil whine. But I didn't have the other issues you did (glad I have the MC replacement plan too).


----------



## nrpeyton

*New Nvidia architecture (for gaming) rumors*

There is NO NEED for a new Nvidia gaming architecture.
Current Pascal cards are more than SUFFICIENT in the current climate. _(sufficient for 4K & sufficient for VR)._

Nvidia should be concentrating on manufacturing MORE Pascal cards. I doubt anyone would complain if we STUCK WITH Pascal for the next two years.

Nvidia is moving too fast, in my opinion.


----------



## Agent-A01

nrpeyton said:


> *New Nvidia architecture (for gaming) rumors*
> 
> There is NO NEED for a new Nvidia gaming architecture.
> Current Pascal cards are more than SUFFICIENT in the current climate. _(sufficient for 4K & sufficient for VR)._
> 
> Nvidia should be concentrating on manufacturing MORE Pascal cards. I doubt anyone would complain if we STUCK WITH Pascal for the next two years.
> 
> Nvidia is moving too fast, in my opinion.


Not really.

The 1080 Ti is too slow for 4K gaming at max details and struggles to keep minimum FPS high at 1440p 144 Hz in new titles.


----------



## KCDC

I've been having a hell of a time getting my cards to OC independently of each other through MSI AB 4.4.2, XOC BIOS, latest Nvidia drivers (BTW this goes for any BIOS, AB version, and driver version), dual FE cards. I think one of them can OC better than the other.

When I unlink the cards to set different OC settings, they lock at 1999 MHz regardless of my settings, whether I use the slider or a custom V/F curve. At first I thought it was just with the curve, but when I, say, set one card to top out at 2088 and the other at 2025, both cards still lock to one frequency, 1999.

Am I missing a setting? What's going on here? Any help would be appreciated!


----------



## feznz

We'll need more GPU power when 4K 120 Hz monitors come out.

https://www.geforce.com/whats-new/articles/bfgd-big-format-gaming-display


----------






## kithylin

nrpeyton said:


> *New Nvidia architecture (for gaming) rumors*
> 
> There is NO NEED for a new Nvidia gaming architecture.
> Current Pascal cards are more than SUFFICIENT in the current climate. _(sufficient for 4K & sufficient for VR)._
> 
> Nvidia should be concentrating on manufacturing MORE Pascal cards. I doubt anyone would complain if we STUCK WITH Pascal for the next two years.
> 
> Nvidia is moving too fast, in my opinion.





Agent-A01 said:


> Not really.
> 
> The 1080 Ti is too slow for 4K gaming at max details and struggles to keep minimum FPS high at 1440p 144 Hz in new titles.


Yep, this is the problem. A single GTX 1080 Ti, in any computer we can buy, no matter how fast or what CPU it has, is not fast enough for 4K, -IF- you expect to run 4K with all the settings, options, bells and whistles turned up to the maximum possible in every game and still maintain a minimum of 60 FPS (most 4K monitors are 60 Hz). No matter what you do to it, a single 1080 Ti cannot manage that currently; they're just not fast enough yet. The only way to maintain acceptable minimum frame rates on a 1080 Ti at 4K is to turn game settings down to medium/high and forget ultra in most games. Some games are optimized better and can run at max graphics on a 1080 Ti, but the majority cannot. Yes, it's a great card for 4K... if you're willing to sacrifice graphics slightly.



KCDC said:


> I've been having a hell of a time getting my cards to OC independent of each other through MSI AB 4.4.2, XOC bios, latest Nvidia drivers (btw this goes for any bios, AB version and driver version) dual FE cards, I think one of them can OC better than the other.
> 
> When I unlink the cards to set different OC settings, it will lock at 1999 mhz regardless of my settings, either using the slider or custom V curve. At first I thought it was just with the curve, but when I, say, set one card to top at 2088 and the other to top at 2025, both cards still lock to one freq, 1999.
> 
> Am I missing a setting? What's going on here? Any help would be appreciated!


The problem is you're not supposed to clock the cards independently; that's not how you overclock with multi-GPU. You're supposed to link both cards and OC them together as one unit. It may be possible, but it's not how it should be done, so I'm not surprised it doesn't work for you. Just link 'em together and OC 'em together and it should work fine.


----------



## DarthBaggins

I won't move to 4K till there's a card that can push 4K Ultra at a consistent 60+ FPS; Pascal isn't there in that aspect.


----------



## SavantStrike

DarthBaggins said:


> I won't move to 4k till there's a card that can push 4k Ultra over 60+ fps consistently, Pascal isn't there in that aspect


I'm not sure the next arch will be there either. It will be close, but there will still be frame drops below 60 FPS. I'm referring to the 2080 Ti too; the 2080 probably won't eclipse the 1080 Ti by more than 15-20 percent.


----------



## DarthBaggins

That's why I plan to stick with my XF270HU monitor-wise. Love how all these high-refresh-rate panels are being made for 4K with nothing that can really drive them to the 100Hz+ mark.


----------



## kithylin

DarthBaggins said:


> That's why I plan to stick with my XF270HU monitor wise, love how all these high refresh rate panels are being made for 4k with nothing to really drive them to the 100hz+ mark.


Sure there is. Just buy 2 x 1080 Ti's and use nvidia inspector to enable SLI in the 50% of games that don't support it.... :lachen:


----------



## DarthBaggins

lmao, if I could buy another Ti for $600 I would, but I don't see that happening any time soon lol. I could sell the 1080 SC I have in another build, and that would lessen the blow of the cost of another Ti.


----------



## @purple

This card is really amazing (I don't have one), but if I had the money I would go for it.
How much power does it need?


----------



## kithylin

kush113 said:


> This card is really amazing (i don't have one) but if I had money I would go for it.
> How much power it needs?


It depends. If you buy an aftermarket card with a decent factory overclock, probably 350 - 375 watts typical. If you buy a founder's edition / blower card with original stock nvidia specs, probably more like 275W - 300W.

It also depends entirely on the game you play.

If you intend to try and run games @ 4K on this card, even if you turn the quality down to medium, it will still run the 1080 Ti hot, and use max power while doing so.

If you play 1440p @ 60hz with gsync/vsync on it with games on ultra it may do better power wise, but still probably near the higher side of power usage.

If you manually overclock it with custom bios to the max it can possibly run stable with a water block and then run 4K games on that, power usage can and will get up to and beyond the 400W mark in some games.


----------



## SavantStrike

kithylin said:


> It depends. If you buy an aftermarket card with a decent factory overclock, probably 350 - 375 watts typical. If you buy a founder's edition / blower card with original stock nvidia specs, probably more like 275W - 300W.
> 
> It also depends entirely on the game you play.
> 
> If you intend to try and run games @ 4K on this card, even if you turn the quality down to medium, it will still run the 1080 Ti hot, and use max power while doing so.
> 
> If you play 1440p @ 60hz with gsync/vsync on it with games on ultra it may do better power wise, but still probably near the higher side of power usage.
> 
> If you manually overclock it with custom bios to the max it can possibly run stable with a water block and then run 4K games on that, power usage can and will get up to and beyond the 400W mark in some games.


The Aorus Extreme maxes out at 375W, with only a few cards from Zotac going any higher. 375W is definitely the upper ceiling for a 1080 Ti that hasn't been flashed or hard-modded.

I've never been able to get more than 340W out of my Aorus, but it's the WaterForce WB model. Air-cooled variants can probably hit 375W.


----------



## DarthBaggins

I need to check my Aero OC to see what it's getting now that I put it under a block and bumped the clocks up (2050/5635). Would be nice to know what kind of power it's pulling (have it temporarily in another build to test it out)


----------



## Trippen Out

I caved and gave in to the high prices. Picked up my first replacement in a long time. Should be a good upgrade considering the card I'm running now is a GTX 780 6GB.

GIGABYTE AORUS GeForce GTX 1080 Ti DirectX 12 GV-N108TAORUS-11GD 11GB 352-Bit GDDR5X PCI Express 3.0 x16 SLI Support ATX Video Card

Sadly, and hopefully I'm wrong, the only water block I could find for this was made by EK. I was really hoping for an XSPC one.


----------



## jura11

Trippen Out said:


> I caved and gave into the high prices. picked up my first replacement in a long time. Should be a good upgrade considering the card im running now is a 780gtx 6gb version.
> 
> GIGABYTE AORUS GeForce GTX 1080 Ti DirectX 12 GV-N108TAORUS-11GD 11GB 352-Bit GDDR5X PCI Express 3.0 x16 SLI Support ATX Video Card
> 
> Sadly and maybe, hopefully i'm wrong but the only water block I could find for this was made by EK. was really hoping for an XSPC one.


Hi there,

Personally I would go with the Phanteks Glacier WB; it performs like the EK block, and the build quality of the Phanteks Glacier is way better than EK's.

The other WB I've tried on this GPU is a Bykski, and I've been very happy with its performance as well.

With the Phanteks Glacier WB on an Aorus GTX 1080 Ti, temperatures are on par with my EK block: idle at 17-20°C, and load temperatures in Mass Effect Andromeda have been in the 33-36°C range.

Hope this helps.

Thanks, Jura


----------



## SavantStrike

jura11 said:


> Hi there
> 
> Personally I would go with Phanteks Glacier WB which is better quality and performs like EK, quality of Phanteks Glacier is way better than EK
> 
> Other WB which I tried has been Bykski for this GPU and been very happy with the performance as well
> 
> With Phanteks Glacier WB and Aorus GTX1080Ti temperatures are on par with my EK block, idle at 17-20°C and load temperatures in Mass Effect Andromeda has been in 33-36°C
> 
> Hope this helps
> 
> Thanks, Jura


The Glacier has similar aesthetics to the XSPC blocks, and it's much nicer than the EK block.

The Bykski blocks are nice too - that's what comes fitted from the factory when you buy the WaterForce WB edition.


----------



## DarthBaggins

I would go with the Bitspower option; so far it's one of the better-performing blocks (and I'm not a fan of Bitspower's products). If it had an OEM PCB I would opt for the HeatKiller block over anything else.


----------



## feznz

kush113 said:


> This card is really amazing (i don't have one) but if I had money I would go for it.
> How much power it needs?


As in PSU requirements?
Stock with a 4-core CPU you could possibly go as low as 450W, but realistically 550W.
Game-stable overclocked with a 6-core, then 650W.
For benching, the sky is the limit.

Of course this assumes a decent single-rail PSU. The other day I saw a guy with a 750W multi-rail PSU that couldn't do it, since it only had 28A allocated to PCIe.
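The rail math here is simple enough to sketch. A quick check of feznz's example (the 28 A figure is his; the ~400 W draw is the overclocked figure quoted elsewhere in this thread):

```python
def rail_watts(amps, volts=12.0):
    """Usable power on a 12 V rail given its amperage limit."""
    return amps * volts

# feznz's example: a multi-rail 750 W unit with only 28 A on the PCIe rail.
pcie_rail = rail_watts(28)
print(pcie_rail)          # 336.0 W available to the PCIe connectors

# A heavily overclocked 1080 Ti can pull ~400 W by itself, so the rail
# limit trips even though the PSU's total wattage looks sufficient.
print(pcie_rail < 400)    # True
```

Which is why total PSU wattage alone isn't the number to check on multi-rail units.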


----------



## minsekt

Hey guys, I've RMA'd a Strix 1080 Ti and now have a Zotac 1080 Ti AMP Extreme. The GPU boosts up to 2050MHz core, but it crashes after 2 seconds of running Fire Strike. Is this a driver problem? If I set the core to -27MHz it works fine; I can't go any higher, and it crashes immediately at -25MHz. Is there a fix for this, or is the card/vBIOS faulty? My dealer is still out of GPUs and I don't really want to RMA again, at least right now; the -27MHz fix is working fine, but... yeah, I paid almost $300 on top of the Strix because the price went up over the last 6 months. It should at least work as intended.

The way I see it, GPU Boost 3.0 is just overclocking too high. Is this common?


----------



## ZealotKi11er

minsekt said:


> hey guys, ive rma'd a strix 1080ti, now i have a zotac 1080ti amp extreme, the gpu boosts up to 2050mhz core, however it crashes after 2sec of running firestrike. is this a driver problem/related? if i set core -27mhz it will work fine, can't get higher, will crash immediately at -25mhz. is there a fix for this or is the card/vbios faulty? my dealer is still out of gpus, and i dont really want to rma again. at least right now, the -27mhz fix is working fine but.. yeah i almost payed $300 on top of the strix because the price got higher over the last 6months. it should at least work as intended.
> 
> how i see it, gpu boost 3 is just overclocking too high, is this common?


You should not have paid extra money. 2050 seems like too much for stock.


----------



## minsekt

ZealotKi11er said:


> You should not have paid extra money. 2050 seems like too much for stock.


Nah, I paid 860 about half a year ago for the Strix. I then RMA'd it at my local dealer; they gave me my money back because there was nothing in stock, and a week later the 1080 Ti AMP Extreme came in stock, so I bought it for around 1150 (Swiss prices...).
Well.

So, if 2050 seems too high, I guess it's OK to just underclock it to around 2000-2025? Feels kinda good to have that RMA joker for the next few years.


----------



## ZealotKi11er

minsekt said:


> nah, ive payed 860 like half a year ago for the strix. i then rma at my local dealer, they gave me the money back because there was nothing in stock, 1 week later the 1080ti amp extreme came in stock so i bought it for like 1150. (swiss prices...)
> well.
> 
> so, if 2050 seems to be too high, i guess its ok to just underclock it to like 2000-2025ish? feels kinda good to have that rma joker for the next few years


Yeah.


----------



## 8051

kithylin said:


> It depends. If you buy an aftermarket card with a decent factory overclock, probably 350 - 375 watts typical. If you buy a founder's edition / blower card with original stock nvidia specs, probably more like 275W - 300W.
> 
> It also depends entirely on the game you play.
> 
> If you intend to try and run games @ 4K on this card, even if you turn the quality down to medium, it will still run the 1080 Ti hot, and use max power while doing so.
> 
> If you play 1440p @ 60hz with gsync/vsync on it with games on ultra it may do better power wise, but still probably near the higher side of power usage.
> 
> If you manually overclock it with custom bios to the max it can possibly run stable with a water block and then run 4K games on that, power usage can and will get up to and beyond the 400W mark in some games.


Will that be a sustained 400W or will it just occasionally peak out at 400W?


----------



## mouacyk

117% TDP on a 250W stock BIOS allows up to 321W of power on my MSI card, and that has been plenty for everything (ROTR, BF1, Witcher 3, TF2, Crysis 3, UE4 demos) at 1440p @ 144Hz; it averages right around 300W (2,100MHz/12,600MHz) without throttling. ROTR throttles immediately at 4K using DSR, though.


----------



## kithylin

8051 said:


> Will that be a sustained 400W or will it just occasionally peak out at 400W?


Like I tried to say, it depends entirely on what you do with the card. If you were to load the Strix XOC unlimited-power BIOS, for example, overclock it to 2100MHz on water, and then play a modern AAA game @ 4K, it would probably sit over 400W constantly.

But if you just pop in an aftermarket factory-overclocked air-cooled card, don't touch anything, and game at 4K, it's more like the 350-375W range.


----------



## ThrashZone

Hi,
Yeah, they soak up power, and that plus a system OC... lol, I didn't feel all that good with an 850 P2 on my X99.


----------



## kithylin

ThrashZone said:


> Hi,
> Yeah they soak up power that plus a system oc lol I didn't feel all that good with an 850P2 on my x99


This post is sort of for kush113, but it may give others an idea of what to expect with this card.

Some people say I'm crazy for buying a 1080 Ti and not playing 4K on it. Personally, it's my money and I don't care what they say; I got it for what I want it for and I'm happy with my purchase. Before this I had an R9 290X 8GB that I'd manually overclocked with a custom unlimited BIOS and heavy volts, and it was pulling around 500-550 watts in games just for 1080p gaming. I couldn't get a waterblock for the custom-PCB Sapphire version (because I bought it used a few years after the blocks were discontinued), so it ran hot, like 78-83C, and that was with fans at 80-100%. Sapphire makes a great cooler and it's fairly quiet for what it is, but still a bit noisy maxed out. Even then it wasn't enough for 60 FPS minimums in most games with all the options cranked to max plus several mods; it was just hot, power hungry, noisy, and not a great gaming experience.

Compared to that, with a 1080 Ti I can run the same games at much higher in-game graphics settings, and even with the XOC unlimited BIOS and a manual overclock to 2139MHz, the GPU sits around 60-70% load and averages 275-300 watts in most of the games I play. Yeah, most of them are 1080p, but with lots of mods and maximum ultra settings. It isn't maxed that hard most of the time, and that's what I wanted; I didn't want a GPU that runs full tilt all the time at max power and max heat. The one game that will load it hard is ARK: Survival Evolved, which pegs it at 99% constantly, makes it draw around 400-425 watts, and still doesn't hold 60 FPS minimums unless I turn some things off or down, even at 1080p.

So for me it's perfect. I sucked up the new price and bought one new this time, so I could get a GPU before the blocks were discontinued, and got a full-cover EK block for it. I love my full-copper, clear-plexi-top EK block and temps are great: even with the overclocks, the GPU averages 35-45C in most games and can run up to around 58-61C in ARK (I need a bigger case and a second big radiator to go with it, for better temps). It's near silent with a big radiator and fans on low at 30%, and literally half the power of my previous card. Today's games may not load it hard (other than ARK) right now, but I'm sure in the next few years we'll have new games that stress it to the max like ARK does, even at 1080p.

The options with a single 1080 Ti are this: 4K @ medium, or 1080p @ maximum ultra. Or maybe 1440p @ high. I like eyecandy and pretty graphics so I went with the 1080p option, it's better on power and heat, for now.

Yes, just gaming at 1080p I could have gone with a 1060 or a 1070 and saved some money, but those cards would struggle with maxed 1080p ultra plus mods, and I'd be back in the "maxed-out GPU that's hot and noisy and hungry" category like before. That's what I was trying to get away from.


----------



## 2ndLastJedi

kithylin said:


> ThrashZone said:
> 
> 
> 
> Hi,
> Yeah they soak up power that plus a system oc lol I didn't feel all that good with an 850P2 on my x99 /forum/images/smilies/smile.gif
> 
> 
> 
> This post is sort of for kush113, but maybe give others an idea of what to expect with this card.
> 
> Some people say I'm crazy for buying a 1080 Ti and not playing 4K on it. Personally it's my money and I don't care what they say, I got it for what I want it for and I'm happy with my purchase. Before this I had a R9 290X 8GB card that I had manually overclocked with a custom unlimited bios and heavy volts and was pulling around 500 - 550 watts in games with that card, just for 1080p gaming, and I couldn't get a waterblock for the custom-PCB Sapphire version (because I bought it used a few years after the blocks were discontinued) and it ran hot like 78c - 83c and that was with fans at 80% - 100%. Sapphire makes a great cooler and it's fairly quiet for what it is, but still a bit noisy if you max it out. It still wasn't enough for 60 FPS minimums in most games with all the options cranked to max and several mods and it was just hot, power hungry, noisy, and not a great gaming experience.
> 
> Compared to now with a 1080 Ti I can do the same games and much higher in-game graphics settings, and even with the XOC unlimited bios on it and a manual overclock to 2139 Mhz, the gpu's like 60% - 70% loaded and averages around 275-300 watts, with most of the games I play commonly. Most of the games I play.. yeah it's 1080p, but with lots of mods and maximum ultra settings. Most of the time it isn't always maxed that hard in load, and that's what I wanted. I didn't want a GPU that runs full-tilt all the time @ max power & max heat. The one game I play that will load it hard is ARK Survival evolved that will peg it at 99% constantly and make it draw around 400-425 watts and still not 60 FPS minimums (in ark) unless I turn off/down some things, even at 1080p.
> 
> So for me it's perfect.. I sucked up the new price and bought one new this time so I could get a gpu before the blocks were discontinued and got a fullcover EK block for it. I love my full-copper with clear plexi top EK block and temps are great. Even with the overclocks, gpu averages 35c - 45c in most games, and can run up around 58c - 61c in ark evolved (I need a bigger case and second big radiator to go with it, for better temps). And near silent with a big radiator and fans on low @ 30%, and literally half the power vs my previous card. Today's games may not load it hard (other than ark) just right now, but I'm sure in the next few years we'll have new games that will stress it to the max like ark does, even at 1080p.
> 
> The options with a single 1080 Ti are this: 4K @ medium, or 1080p @ maximum ultra. Or maybe 1440p @ high. I like eyecandy and pretty graphics so I went with the 1080p option, it's better on power and heat, for now.
> 
> Yes just gaming on 1080p I could of gone with a 1060 or a 1070 and saved some money, but those cards would be struggling with maxed 1080p ultra + mods, and I'd be back in the "maxed out gpu that's hot and noisy and hungry" category like before. I was trying to get away from that.

The games I play in 4K ultra hit 60 pretty easily.
The thing the 1080 Ti can't do at ultra is VR in the sim racers I play. Dedicated VR games are fine, but Assetto Corsa/PC2, R3E and rF2 struggle with even medium settings in VR.


----------



## kithylin

2ndLastJedi said:


> The games I play in 4k ultra get 60 pretty easily.
> The thing the 1080ti can't do at ultra is VR in the Sim racers I play. Dedicated VR games its fine but Assetto Corsa /pC2, R3E and rF2 struggle with even medium in VR.


Are you playing in 4K with every possible graphics option turned up on maximum, including anti-aliasing? Everything on ultra? And are you able to sustain a minimum of 60 FPS at all times and never dip below even once? I seriously doubt you can claim this as "yes" on a single 1080 Ti unless you're playing some ancient game from 6+ years ago.


----------



## VeritronX

Hi guys, I have an evga 1080ti sc and I'm looking to compare running it with my 4Ghz R7 1700 to my new 5.2Ghz i5 8600K, feel free to suggest what tests you'd like to see in this thread if you are interested =)


----------



## 2ndLastJedi

kithylin said:


> 2ndLastJedi said:
> 
> 
> 
> The games I play in 4k ultra get 60 pretty easily.
> The thing the 1080ti can't do at ultra is VR in the Sim racers I play. Dedicated VR games its fine but Assetto Corsa /pC2, R3E and rF2 struggle with even medium in VR.
> 
> 
> 
> Are you playing in 4K with every possible graphics option turned up on maximum, including anti-aliasing? Everything on ultra? And are you able to sustain a minimum of 60 FPS at all times and never dip below even once?

4K all maxed including AA (I hate aliasing), and if it dips below 60 I can't see or feel it.
In some games I'm forcing higher than in-game AA.
Even games like ROTTR look better to me at slightly lower than max settings in 4K than maxed in 1080p.
Wolfenstein 2 probably hits 90fps with all in-game settings on Uber and 8x AA; there's AC, where I use Content Manager to push the experimental 8x AA; then PC2, again maxed with MSAA High. These games actually look pretty awful maxed in 1080p and only look acceptable in 4K ultra.
Destiny 2, again ultra; Forza 7 and Horizon 3 at 4K ultra with 8x AA.
I use v-sync, so I would feel dips if they were there, but maybe I'm just not sensitive to them.
Anyway, now I'm in VR, so the 1080 Ti doesn't get close to ultra in the games I care about (sim racing), and I'll need to wait until at least the GPU after next to get that.


----------



## kithylin

VeritronX said:


> Hi guys, I have an evga 1080ti sc and I'm looking to compare running it with my 4Ghz R7 1700 to my new 5.2Ghz i5 8600K, feel free to suggest what tests you'd like to see in this thread if you are interested =)


Fire Strike and Time Spy are the most common ones everyone seems to like to see these days. You can just link the relevant result pages and their comparison pages, which makes it super easy.


----------



## 86JR

I'm in the same boat. Upgrading from a 2500K at 4.8GHz and a GTX 690 with a standard OC to probably an 8700K at 4.8GHz and a 1080 Ti.

I only play Rocket League and Arma 3, and I only play at 1080p.

In Arma 3, even tri-SLI Titan Vs won't max it out with draw distance set to maximum at 1080p; FPS will drop down to single figures playing single player, the maps are that large and detailed.

On top of this, the 690 pulls something like 300W TDP on its own (two 680s on one card), and SLI is buggy in some games.

I did this with the 690 when it came out. Spend $5k on a PC and it lasts over 5 years, knowing that you can just buy a game and play it in all its glory without worrying your GPU can't keep up!


----------



## Vapochilled

Hey guys, so I went through all these posts, but some comments got me confused and worried.
I just got an MSI Gaming X 1080 Ti (750 euros) and I'm confused about the BIOS topics.
I previously had a 780 Ti with an unlocked BIOS for a higher power slider, and I know that was the main gain during benches, not the voltage itself.

However, I saw some people here saying that the XOC BIOS for the 1080 Ti doesn't bring more frames, just a higher core clock that doesn't show up in the benches. That doesn't make sense unless you're throttling.

So the main question is: do I stick with the MSI Gaming X BIOS at around 2040 core, or do I try the XOC BIOS with, say, 1.1V / 140% power slider and shoot for 2100MHz?
Can I flash the XOC BIOS on this card at all?


----------



## ZealotKi11er

Vapochilled said:


> Hey guys, so i went thru all these posts but some comments got me confused and worried.
> I just got an MSI Gaming X 1080 Ti (750euros) and i'm confused about the BIOS topics.
> I previously had a 780Ti with Bios Unlocked for higher Power Slide and i know that was the major draw during benchs. Not the voltage itself.
> 
> However, i saw some people here saying that the XOC bios for the 1080 ti is not bringing more frames. Just higher core clock..... that doesnt reflect in the benchs. That doesnt make sense unless you are throttling.
> 
> So the main question is: Do i stick with MSI Gaming X bios and around 2040Core, or do i try the XOC Bios with lets say: 1.1v / 140% power slide and try to go to 2100Mhz ?
> Can i flash the XOC bios at all on it?


The reason XOC isn't the best is the internal limits of Pascal. I've played around with the 1080 Ti a lot because I was frustrated with how Pascal overclocks, and you learn that past 2GHz it's just a waste of time to push further unless you want to benchmark. Even then, think about what 2.1GHz really is: a 100MHz OC over 2GHz, which is a 5% OC. That scales to 2-3% at best, which is nothing in reality.
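Spelling out that arithmetic (the 50% scaling factor below is just an assumption standing in for the 2-3% estimate above, not a measured number):

```python
base = 2000   # MHz, a clock most Pascal 1080 Tis can hold
oc   = 2100   # MHz, a hard-won XOC-BIOS result

clock_gain = (oc - base) / base * 100
print(f"{clock_gain:.0f}% higher core clock")

# Pascal gaming performance scales well under 1:1 with core clock,
# so a 5% clock bump lands around 2-3% more FPS at best.
fps_gain = clock_gain * 0.5   # assumed ~50% scaling
print(f"~{fps_gain:.1f}% more FPS, roughly")
```

A couple of percent for hundreds of extra watts is the trade being described.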


----------



## mouacyk

Vapochilled said:


> Hey guys, so i went thru all these posts but some comments got me confused and worried.
> I just got an MSI Gaming X 1080 Ti (750euros) and i'm confused about the BIOS topics.
> I previously had a 780Ti with Bios Unlocked for higher Power Slide and i know that was the major draw during benchs. Not the voltage itself.
> 
> However, i saw some people here saying that the XOC bios for the 1080 ti is not bringing more frames. Just higher core clock..... that doesnt reflect in the benchs. That doesnt make sense unless you are throttling.
> 
> So the main question is: Do i stick with MSI Gaming X bios and around 2040Core, or do i try the XOC Bios with lets say: 1.1v / 140% power slide and try to go to 2100Mhz ?
> Can i flash the XOC bios at all on it?


I have the MSI Gaming X with an EK block, and 1.1v is possible on stock BIOS with Afterburner. My 24/7 gaming profile is 1.094v 2100MHz/12,600MHz and it doesn't throttle when TDP is maxed at 321W (117%), unless I run greater than 1440p in some titles, like ROTR. I can't recommend XOC BIOS, unless your use-cases are encountering power-throttling, like benching and gaming at 4K.

Yes, you can flash XOC Bios on this card. I was able to bench 1.2v and 2200MHz for about 33,500 in FS.


----------



## cole2109

My Strix has a very average GPU (2025-2038MHz) but a crazy MEM clock, 800+MHz (artifacts at 830MHz).


----------



## nyk20z3

kithylin said:


> Are you playing in 4K with every possible graphics option turned up on maximum, including anti-aliasing? Everything on ultra? And are you able to sustain a minimum of 60 FPS at all times and never dip below even once? I seriously doubt you can claim this as "yes" on a single 1080 Ti unless you're playing some ancient game from 6+ years ago.


At 4K you don't need anti anything imo


----------



## 2ndLastJedi

nyk20z3 said:


> kithylin said:
> 
> 
> 
> Are you playing in 4K with every possible graphics option turned up on maximum, including anti-aliasing? Everything on ultra? And are you able to sustain a minimum of 60 FPS at all times and never dip below even once? I seriously doubt you can claim this as "yes" on a single 1080 Ti unless you're playing some ancient game from 6+ years ago.
> 
> 
> 
> At 4K you don't need anti anything imo

Kinda do; 4K on a 65" screen has lower PPI than 1080p on a 27".
You'd use AA on that 27", so why not on the 65"? Just because it's 4K means nothing; it's more size-dependent.
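For anyone who wants to check that claim, pixel density is just diagonal resolution over diagonal size:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

tv_4k    = ppi(3840, 2160, 65)   # 65" 4K TV
mon_1080 = ppi(1920, 1080, 27)   # 27" 1080p monitor

print(round(tv_4k, 1), round(mon_1080, 1))  # 67.8 81.6
# The 65" 4K TV really does have coarser pixels than the 27" 1080p
# monitor, so AA still earns its keep on the big screen.
print(tv_4k < mon_1080)                     # True
```

(Viewing distance matters too, of course, but the raw density comparison holds.)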


----------



## KCDC

2ndLastJedi said:


> Kinda do, 4k on a 65" screen has lower ppi than a 27" 1080p.
> You'd use AA on that 27" so why not on the 65"? Just because it's 4k means nothing, it's more size depending.



4K on a 65 still looks better with some AA, I think due to PPI, as you say. Perhaps not on a half-sized monitor.


----------



## 2ndLastJedi

Still playing around in Superposition 4K and still just under 10,000 with this latest driver.

Default gets me (1800-1850ish MHz) = 9026 @ 68 C (default fan)

UnderVolt 0.900 @ 1911 MHz = 9268 @ 54C (fan 100%)

+150 Core (2088MHz) = 9711 @ 63C (fan 100%)

+150 Core = 2088 MHz and only +490 Memory = 9948 @ 64C (fan 100%)

Sticking with the undervolt for VR so temps stay lower than 60C 

Galax GTX 1080 Ti EXOC, i5 6600K @ 4.7, 2x4GB 2400MHz OC'd to 2800MHz (one DIMM died, so I removed two and am waiting for $) lol. But 8GB seems to be working beautifully... shrugs


----------



## porschedrifter

*VRM and Mem temp?*

Hey dudes! So I'm about to start overclocking my 1080 Ti and artifact-scanning it, but I was wondering if there's any way to see the VRM and memory temps on this card, to make sure I'm not roasting it while testing? I feel uneasy not being able to see them. Is there a way?


----------



## ThrashZone

porschedrifter said:


> Hey dudes! So I'm about to start overclocking my 1080ti and artifact scanning it, but I was wondering if there's any way to see the VRM and Mem temps on this card to make sure I'm not roasting it while I'm testing? I feel uneasy not being able to see them. Is there a way?


Hi,
MSI Afterburner shows just about everything on the v3 skin.
EVGA's utility will show three current temps: GPU, power, and memory.
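For anyone scripting this instead of watching a GUI: the driver itself only exposes the core temperature and board power draw through `nvidia-smi` (no VRM or memory sensors on these cards, which is why vendor tools are needed for those). A minimal parsing sketch; the query fields are standard, but check `nvidia-smi --help-query-gpu` on your driver version:

```python
# Sketch: read core temp and board power from nvidia-smi's CSV output.
# VRM/memory temps are NOT available this way on a 1080 Ti.

QUERY = ("nvidia-smi --query-gpu=temperature.gpu,power.draw "
         "--format=csv,noheader,nounits")

def parse_smi_line(line):
    """Turn one CSV line like '61, 287.34' into (temp_c, watts)."""
    temp, power = (field.strip() for field in line.split(","))
    return int(temp), float(power)

# Live use (needs the NVIDIA driver installed):
#   import subprocess
#   out = subprocess.check_output(QUERY.split(), text=True)
#   for line in out.strip().splitlines():
#       print(parse_smi_line(line))

print(parse_smi_line("61, 287.34"))  # (61, 287.34)
```

Handy for logging power draw during an artifact-scanning run without alt-tabbing.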


----------



## ThrashZone

2ndLastJedi said:


> Still playing around in SuperPosition 4k and still just under 10,000  with this latest driver .
> 
> Default gets me (1800-1850ish MHz) = 9026 @ 68 C (default fan)
> 
> UnderVolt 0.900 @ 1911 MHz = 9268 @ 54C (fan 100%)
> 
> +150 Core (2088MHz) = 9711 @ 63C (fan 100%)
> 
> +150 Core = 2088 MHz and only +490 Memory = 9948 @ 64C (fan 100%)
> 
> Sticking with the undervolt for VR so temps stay lower than 60C
> 
> Galax GTX 1080 Ti EXOC , i5 6600k @ 4.7 , 2x4GB 2400MHz OC'd to 2800MHz (1 Dimm died so removed 2 and waiting for $) lol But 8GB seems to be working beautifully ...shrugs


Hi,
I'd lower the core and push the memory more 
Try +117 and +700


----------



## porschedrifter

ThrashZone said:


> Hi,
> msi afterburner shows just about everything on the v3 skin
> evga's utility will show 3 current temps gpu/ power/ memory


Hi! Yeah, I tried EVGA's utility after seeing that, and it actually doesn't show those temps for mine. I have an MSI Duke card, but still, Afterburner's v3 skin doesn't show the VRM or mem temps, and neither does HWiNFO. Are you not able to read these temps on the MSI cards?


----------



## Vapochilled

mouacyk said:


> I have the MSI Gaming X with an EK block, and 1.1v is possible on stock BIOS with Afterburner. My 24/7 gaming profile is 1.094v 2100MHz/12,600MHz and it doesn't throttle when TDP is maxed at 321W (117%), unless I run greater than 1440p in some titles, like ROTR. I can't recommend XOC BIOS, unless your use-cases are encountering power-throttling, like benching and gaming at 4K.
> 
> Yes, you can flash XOC Bios on this card. I was able to bench 1.2v and 2200MHz for about 33,500 in FS.




Hmm... I can't go over 2050, and in Time Spy when the power limit hits 115... 112... 117, the core drops to 1990.
Temps are 45-50 because I've been benching with 100% fan on the MSI Gaming X.

So... now I'm confused. Seems the power limit is the problem...


----------



## mouacyk

Vapochilled said:


> Hum...... I cant go over 2050... and in TimeSpy when the powerlimit hits 115... 112.. 117 the core drops to 1990
> Temps are 45... 50 because ive benching with 100% fan on the MSI Gaming X ....
> 
> So... now I'm confused .... Seems that powerlimit is a problem...


To clarify, I do get power-throttling too with almost every kind of benching (Fire Strike, Superposition), even at 1440p. Gaming is more forgiving and won't throttle until the resolution is higher than 1440p.


----------



## 2ndLastJedi

mouacyk said:


> Vapochilled said:
> 
> 
> 
> Hum...... I cant go over 2050... and in TimeSpy when the powerlimit hits 115... 112.. 117 the core drops to 1990
> Temps are 45... 50 because ive benching with 100% fan on the MSI Gaming X ....
> 
> So... now I'm confused .... Seems that powerlimit is a problem...
> 
> 
> 
> To clarify, I do get power-throttling too with almost every kind of benching (FireStrike SuperPosition) even at 1440p. Gaming is more forgiving, and will not throttle until resolution is higher than 1440p.

I've just learnt to live with it being power-limited; I undervolt it and limit it even more, lol. More stable clocks this way.


----------



## GAN77

Hi guys!
Who knows what is the difference between the revision of the Asus GTX 1080 Ti Strix graphics card?


----------



## Scotty99

Any Kraken G12 users in here? I got a smoking deal on a Corsair H50 and Kraken G12 in an eBay sale a couple of days ago, and I was wondering if anyone has used a GPU-fan-to-regular-PWM adapter so that the radiator fan can be seen/controlled by the GPU BIOS.

I think this is what I'm looking for:
https://www.amazon.com/Gelid-CA-PWM...id=1520793763&sr=8-2&keywords=gpu+pwm+adapter

My motherboard can control fans via GPU temperature, but I'd rather see my GPU fan speeds where they should be, in the GPU monitoring section of HWiNFO. Is that adapter I linked universal? The GPU it will be going on is an EVGA GTX 1060 SSC; I just wanted to try this thread first, as 1080 Ti owners are more likely to use such a setup, I'd imagine.


----------



## kithylin

GAN77 said:


> Hi guys!
> Who knows what is the difference between the revision of the Asus GTX 1080 Ti Strix graphics card?


It would help in the future if you would include a link to the page you are referring to so we can more easily see it. But I found it myself with some googling: https://www.ekwb.com/shop/ek-fc-1080-gtx-ti-strix-nickel If you would click on the "Compatibility List" link (orange icon, center, between manual and 'read before you buy'), you would see that it clearly says this block in question is -ONLY- for the "ASUS Poseidon GeForce GTX 1080 Ti / AKA ROG POSEIDON" and -NOT- the "normal" ASUS ROG Strix GeForce GTX 1080 Ti card. A little confusing but the Compatibility List link should clear that up for you.


----------



## DarthBaggins

Found these using the configurator: one for the Strix OC and one for the regular Strix Gaming.


----------



## domrockt

So I'm gonna chime in: I bought myself a Gainward Phoenix (Gaming) 1080 Ti just before Christmas for 640€!
I changed the thermal paste for some Phobya stuff I had around.
I flashed the Asus XOC BIOS, thanks to the MASSIVE air cooler which actively cools the VRMs. My card stays @ 60° max with 30-40% fan speed @ 2138.5/12600MHz.
All tested with the Unigine Superposition stress test.

Not even the backplate is getting hot. Changing the thermal paste is the best thing to do. I have no screenshot of the previous temps, but I reached about 1950-2050ish MHz @ 70-80° with 60-70% fan speed on the stock BIOS, ridiculous.

https://www.3dmark.com/3dm/25560537


----------



## ThrashZone

Hi,
Yeah evga used some pretty bad thermal paste too 
Mine flaked off :/
Put some Thermal Grizzly on and it stays at 57c. Swapped out the nasty thermal pads on the memory chips with EK water block pads (all I had laying around) and they stay pretty much at 50c.
The rest of the nasty thermal pads are getting swapped out later this week with some new ones. Probably going to try some NT-H1 thermal paste too, it's my favorite.


----------



## SavantStrike

domrockt said:


> Sooo iam gonna kick in i bought myselfe a Gainward Pheonix (Gaming) 1080ti just before Christmas for 640€!
> I changed the Thermalpaste for some Phobya Stuff i have around.
> I flashed the Asus XOC BIOS, thanks to the MASSIVE Air cooler which is activly cooling the VRMs. My Card stays @ 60° Max with 30-40% Fanspeed @ 2138.5/12600Mhz
> All tested with Unigin Superpos Stresstest.
> 
> Not even the Backplate is getting hot, change the Thermal Paste its the best ting to do. I have no screenshoot from previous Temps but i reached about 1950-2050 ish Mhz @ 70-80° with 60-70% fan speed @ stock Bios, ridiculous.


You should be aiming for 40C on the XOC BIOS. At 60, you'll be dumping a ton of extra power into the card to keep it at the same clocks. 60C at 400-500W is really punishing duty on the GPU and will shorten its life.


----------



## 8051

domrockt said:


> Sooo iam gonna kick in i bought myselfe a Gainward Pheonix (Gaming) 1080ti just before Christmas for 640€!
> I changed the Thermalpaste for some Phobya Stuff i have around.
> I flashed the Asus XOC BIOS, thanks to the MASSIVE Air cooler which is activly cooling the VRMs. My Card stays @ 60° Max with 30-40% Fanspeed @ 2138.5/12600Mhz
> All tested with Unigin Superpos Stresstest.
> 
> Not even the Backplate is getting hot, change the Thermal Paste its the best ting to do. I have no screenshoot from previous Temps but i reached about 1950-2050 ish Mhz @ 70-80° with 60-70% fan speed @ stock Bios, ridiculous.
> 
> https://www.3dmark.com/3dm/25560537


1175mV seems like a lot of voltage to put through an air-cooled 1080 Ti.


----------



## kithylin

domrockt said:


> Sooo iam gonna kick in i bought myselfe a Gainward Pheonix (Gaming) 1080ti just before Christmas for 640€!
> I changed the Thermalpaste for some Phobya Stuff i have around.
> I flashed the Asus XOC BIOS, thanks to the MASSIVE Air cooler which is activly cooling the VRMs. My Card stays @ 60° Max with 30-40% Fanspeed @ 2138.5/12600Mhz
> All tested with Unigin Superpos Stresstest.
> 
> Not even the Backplate is getting hot, change the Thermal Paste its the best ting to do. I have no screenshoot from previous Temps but i reached about 1950-2050 ish Mhz @ 70-80° with 60-70% fan speed @ stock Bios, ridiculous.
> 
> https://www.3dmark.com/3dm/25560537


I'm going to join the others and remind you that this is quite unhealthy and dangerous for an air-cooled card. Those clocks and volts should never be used with the GPU above 60c for any length of time. If anything, you should monitor the VRM section of your card if you have any way to do so: an IR thermometer aimed near the power connectors on the backplate at the very least, to see how hot it runs when you benchmark it. Also, benchmarks are not a good indicator of thermal performance, because they only load the card for a few minutes at a time and let it idle between tests. Try playing some sort of modern AAA game for several hours and then report back on your temps. Regardless, you should be very concerned about the long-term health of your card if you continue running it air-cooled like that. The XOC BIOS removes limitations and allows the card to pull more power than it normally would. It should only ever be used with a full custom water loop or a hybrid setup.

EDIT: Also, it seems you're missing half of your RAM sticks and running your X99 system in dual-channel mode (instead of quad-channel), at half its potential RAM performance. That's probably hurting your score and overall performance, at least in modern stuff. Probably not an issue if you're playing Diablo 2 on it, though.


----------



## SavantStrike

I didn't even catch that

1175 is sky high. That GPU should be COLD when being force-fed that kind of power. 
@domrockt I implore you to flash back the stock BIOS before you kill that card. You'll get 1-2 years out of it if you're lucky, running it like this. The XOC BIOS isn't a normal BIOS: all thermal protection is removed. You can't flash it onto an air-cooled card and run at 30 percent fan speed.


----------



## go4life

Got my first 1080 Ti earlier today, the MSI GTX 1080 Ti Gaming X. Very happy with it so far! Seems to be stable gaming and benching with +75 on the core and +750 on the memory, which equals 2012MHz core and 6264MHz (12528MHz effective) memory. Was hoping for more on the core clock, but I was pleasantly surprised by the memory OC. Temperatures were around 75c maximum after gaming for a longer while at a constant 98-99% GPU usage, in a fairly warm room with case fans on low speed.

Will test more later to see if it is truly stable; however, I did game for several hours with no issues and went through the Heaven and Superposition benchmarks, along with that boyband simulator of a Final Fantasy XV benchmark, without any artifacts or crashes. Heavy load, as I did it all in 4K, so all games and benchmarks were at a constant 98-99% GPU load.
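For anyone following the offset arithmetic in posts like this one, here is a quick sketch. The "stock" clocks below are simply inferred by subtracting the offsets from the quoted results, not taken from MSI's spec sheet:

```python
# Numbers from the post above; stock clocks are inferred, not verified.
CORE_OFFSET = 75   # MHz, set in Afterburner
MEM_OFFSET = 750   # MHz, set in Afterburner

observed_core = 2012   # MHz core after the offset
observed_mem = 6264    # MHz memory (real clock) after the offset

stock_core = observed_core - CORE_OFFSET   # implied stock boost clock
stock_mem = observed_mem - MEM_OFFSET      # implied stock memory clock

# GDDR5X is usually quoted at an "effective" rate of 2x the real
# memory clock that Afterburner displays.
effective_mem = observed_mem * 2

print(stock_core, stock_mem, effective_mem)  # 1937 5514 12528
```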


----------



## kithylin

go4life said:


> Got my first 1080 Ti earlier today, the MSI GTX 1080 Ti Gaming X. Very happy with it so far! Seems to be stable gaming and benching with +75 core and +750 on memory! Which equals to 2012mhz core and 6264mhz (12528mhz) memory. Was hoping for more on the core clock, however was pleasantly surprised by the memory OC. Temperatures was around 75c maximum after gaming for a longer while with 98-99% GPU usage constant in a fairly warm room and case fans on low speed.
> 
> Will test more later to see if it is truly stable, however did game for several hours with no issue and went through Heaven and Superstition benchmark, along with that boyband simulator Final Fantasy XV benchmark without any artifacts or crashes. Heavy load as I did all in 4K, so all games and benchmarks was constant 98-99% GPU load.


Are you able to monitor your clocks over time? Does it actually maintain 2012 MHz core when it heats up while playing games? My 1080 Ti (originally on the stock air cooler) would throttle all the way down to about 1825 MHz (even though I had its overclock set to 2050 MHz) when it got hot playing games when I first got it.


----------



## go4life

kithylin said:


> go4life said:
> 
> 
> 
> Got my first 1080 Ti earlier today, the MSI GTX 1080 Ti Gaming X. Very happy with it so far! Seems to be stable gaming and benching with +75 core and +750 on memory! Which equals to 2012mhz core and 6264mhz (12528mhz) memory. Was hoping for more on the core clock, however was pleasantly surprised by the memory OC. Temperatures was around 75c maximum after gaming for a longer while with 98-99% GPU usage constant in a fairly warm room and case fans on low speed.
> 
> Will test more later to see if it is truly stable, however did game for several hours with no issue and went through Heaven and Superstition benchmark, along with that boyband simulator Final Fantasy XV benchmark without any artifacts or crashes. Heavy load as I did all in 4K, so all games and benchmarks was constant 98-99% GPU load.
> 
> 
> 
> Are you able to monitor your clocks over time? Does it actually maintain 2012 Mhz core over time when it heats up and playing games? My 1080 Ti (originally stock on air cooler) would throttle down all the way to about 1825 Mhz (even though I had it's overclock set to 2050 mhz) when it got hot playing games when I first got it.


Yes, I am using the MSI Afterburner in-game overlay, so I could watch the clocks constantly! At its lowest it dropped to 2000MHz, so it goes between 2000-2012MHz.

That certainly was a drop in clock speed for you. However, with an aggressive fan profile I imagine even the reference cooler would be able to maintain a decent clock speed, at the cost of noise.


----------



## feznz

Let's see how the second-hand GPU market flows now... well, who knows, but I see the cryptocurrency market took a dump overnight.

https://bitcoincharts.com/markets/


----------



## 2ndLastJedi

I took a massive dump this morning,lol


----------



## Scotty99

2ndLastJedi said:


> I took a massive dump this morning,lol


Nice dude, I'm about to pinch one off too. I had the Arby's Miami Cuban for supper.


----------



## OrionBG

Hey guys, I finally got my replacement riser cable (first one was bad) and now my EVGA 1080ti FE flies again!
No more BSODs, driver crashes and "no GPU detected" BIOS errors! 
Currently at 2075MHz Core and 6100MHz RAM... Will play further...


----------



## KraxKill

Still don't understand why so many are pushing for the highest clocks. Been there, done that.

If your goal is the smoothest gameplay possible, the best thing you can do to gain that performance out of Pascal, is to 

1. Cool it well, 
2. Shunt mod the soft TDP limit
3. Under-volt it to gain additional TDP headroom.
4. Don't set the core and mem clock too high.

Chasing the max frequency value alone, versus the approach above, simply robs power the card needs while it tries to sustain a frequency, generates additional heat load, and introduces nasty frame judder.

These cards use all the TDP available to them; if you're trying to squeeze out the highest clock, you're just robbing the card elsewhere. The goal is a clock that's not too low, but no higher than you need.

I'd rather have 100fps of fluid frame delivery than 105fps with nasty frame judder.

I've been overclocking everything around me since the original 3dfx Voodoo cards and the 486. Maybe it's just me, but I feel it's no longer about getting 60fps; it's about getting 60 frames all evenly spaced and timed so that the action appears fluid.

No hate, just my opinion.
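The undervolting point above can be sanity-checked with the usual first-order CMOS approximation, where dynamic power scales roughly with frequency times voltage squared. A minimal sketch with made-up operating points (the clocks and voltages below are illustrative, not measurements from any specific card):

```python
# First-order approximation: dynamic power ~ frequency * voltage^2.
def relative_power(freq_mhz, volts, ref_freq_mhz, ref_volts):
    """Power at one operating point relative to a reference point."""
    return (freq_mhz / ref_freq_mhz) * (volts / ref_volts) ** 2

# Hypothetical points: ~2000 MHz @ 1.062 V vs undervolted 1950 MHz @ 0.950 V.
uv = relative_power(1950, 0.950, 2000, 1.062)
print(f"undervolted point draws ~{uv:.0%} of reference power")  # ~78%
```

Giving up ~2.5% of clock for ~20% of the power budget is the TDP headroom being described: the card stops bouncing off its power limit, so frame delivery evens out.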


----------



## feznz

@KraxKill I noticed the same: a game @ 2077MHz gets stutter; drop back to 2038MHz and it's smooth again.


----------



## mouacyk

Yeah, people should clarify if they are constrained by cooling and/or power with the clocks they're chasing. Many continue to chase Kepler and pre-Kepler clocks without fully understanding Boost 3.0.


----------



## OrionBG

KraxKill said:


> Still don't understand why so many are pulling at the highest clocks. Been there, done that.
> 
> If your goal is the smoothest gameplay possible, the best thing you can do to gain that performance out of Pascal, is to
> 
> 1. Cool it well,
> 2. Shunt mod the soft TDP limit
> 3. Under-volt it to gain additional TDP headroom.
> 4. Don't set the core and mem clock too high.
> 
> The max frequency value only vs the above approach simply robs available power needed by the card while it tries to sustain a frequency and generates additional heat load while introducing nasty frame judder.
> 
> These cards use all the TDP they have available to them, if you're trying to squeeze the highest clock, you're just robbing the card elsewhere. The goal is to not have the clock too low, but not higher than you need.
> 
> I'd rather have 100fps of fluid frame delivery vs 105fps with nasty frame judder.
> 
> I've been overclocking everything around me since the original 3dfx Voodoo cards and 486. Maybe it's just me, but I feel that its no longer about getting 60fps it's about getting 60 frames all evenly spaced and timed so that the action appears fluid.
> 
> NO hate, just m opinion


I'm using the XOC BIOS and I have no limits. The card is water-cooled and the max temp I got after almost an hour of testing was 55C. My GPU frequency never moves from its max position; everything is smooth as a baby's a**... I ran a lot of benchmarks and I have seen no performance degradation at all at 2075MHz.


----------



## mouacyk

OrionBG said:


> I'm using the XOC BIOS and I have no limits. The card is water-cooled and the max temp I got after almost an hour of testing was 55C. My GPU frequency does not move from its max position ever. everything is smooth as a baby's a**.... I did a lot of benchmarks and I have seen no performance degradation at all at 2075MHz.


55C is hot for custom water. How much rad do you have? Aside from that, the flat line is great.


----------



## OrionBG

mouacyk said:


> 55C is hot for custom water. How much rad do you have? Aside from that, the flat line is great.


Yeah, I know it is not quite a good temp. I'll be doing a rebuild soon, and hopefully 3x 360 EK rads will be able to keep up with an overclocked Threadripper 1950X and the 1080 Ti.
Right now I have one 360 and one 240 cross-flow radiator from Alphacool. The 55C was after a lot of testing; the usual temp during Superposition or Heaven, for instance, is around 45C.


----------



## kithylin

mouacyk said:


> 55C is hot for custom water. How much rad do you have? Aside from that, the flat line is great.


Running 2139 MHz max on my 1080 Ti at 1.200v (yes, my 1080 Ti is a crappy clocker and needs that for 2139 MHz), and it sometimes runs around 53c-55c on the custom water loop, but only in certain games like ARK: Survival Evolved that max it out @ 99% usage. In most games it runs around 35c-40c, sometimes 43c. A single 3x120mm radiator, but I also have a 5820K in the same loop, overclocked to 4.5 GHz.


----------



## mouacyk

I have 360+120 and pump at 4K rpm along with an 8700K at 5GHz in the same loop. With the MSI stock GPU TIM, I was getting up to 46C in BF1 after hours of play on the XOC BIOS at 1.131v 2100/12420. Ever since I swapped out the stock TIM with liquid metal, nothing (except possibly furmark) takes the temperature beyond 40C now. I am back on stock BIOS with 1.094v at 2100.5/12600MHz sustained with full GPU usage without throttling.

I used to have a 980 Ti with liquid metal on the die as well and it never surpassed 40C, so seeing 46C on the 1080 TI was a surprise. I initially attributed the higher temp to the higher transistor density of Pascal, until I came across some anecdotes of shoddy MSI assembly and pasting jobs. Sure enough, the gob of stock TIM was overflowing onto the PCB and VRAM pads were mis-aligned leaving parts of the chips uncovered. Fixing the pads gave me +200 on RAM and the liquid metal gave another 2x12.5MHz bins.

Perhaps it's just the 4K rpm pump speed giving me the generally lower temps, because people don't tend to run that high.


----------



## kithylin

I was experimenting with another radiator and did see about a 4-5c temp drop from adding a second 240mm rad to my 360mm rad. I'm probably going to wait until I can get some money together and build a better water-cooled machine for the 1080 Ti. I had my eye on something like the Thermaltake X9: pipe it through a pair of matched 360mm rads up top with a splitter on both sides. When I ran two radiators before, I had to add a second, older MCP355 to my big MCP655; even at full blast the single MCP655 wasn't enough, and barely moved enough coolant around to even bleed the air out. So I'd probably move up to a dual-pump or triple-pump setup too. I actually can't maintain my 1080 Ti @ 2139 @ 99% load indefinitely unless I increase cooling later; it runs about 45 minutes until the water in the loop heats up, then it approaches almost 60c and resets the driver. I think 1080 Tis just make a stupidly high amount of heat when overclocked.


----------



## mouacyk

It could be the primary radiator that gives me cooler temps than you guys -- my 360 is the Nemesis GTX and is extra thick with high static pressure Vardar fans on them.


----------



## ThrashZone

Hi,
I believe the GPU's memory gets the hottest, for me anyway: about 54-56c while the GPU shows 40c.
Still working on that with different thermal pads I've yet to install.

A little birdie said to separate the GPU and CPU into different loops :/
Sounds like a pain to me, but I was already going to add another pump, so possibly not too bad, aside from more tubing in a mid-tower case :/

May just swap the way it's piped: pump-res / rad / cpu / pump / rad / gpu sort of thing


----------



## trawetSluaP

I've always struggled to get my card to or above 2000MHz.

Out of the box it sits at 1974MHz.

Increasing the power target and maxing the voltage slider means it goes to 1987MHz.

If I try to add even one bin of 13MHz to the core clock, the driver crashes in most instances.
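For context on the "bin" mentioned above: Pascal's boost table moves in discrete steps of roughly 13 MHz, so offsets effectively snap to multiples of that step. A tiny illustration using the post's numbers (the exact bin size can vary slightly between cards, so treat this as a sketch):

```python
# Pascal boost clocks step in discrete "bins" of roughly 13 MHz.
BIN_MHZ = 13

def clock_after_bins(base_mhz, bins):
    """Clock reached after moving up a given number of boost bins."""
    return base_mhz + bins * BIN_MHZ

print(clock_after_bins(1974, 1))  # 1987 (the stable step above)
print(clock_after_bins(1974, 2))  # 2000 (the next bin, which crashes here)
```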


----------



## 8051

ThrashZone said:


> Hi,
> I believe the gpu's memory gets the hottest for me anyway about 54-56c while the gpu shows 40c
> Still working on that with different thermal pads I've yet to install yet.
> 
> Little birdie said to separate the gpu and cpu into different loops :/
> Sounds like a pain to me but I was already going to add another pump so possible not to bad besides more tubing in a mid tower case :/
> 
> May just swap the way it's piped pump-res/ rad/ cpu/ pump/ rad/ gpu sort of thing


Why couldn't you just get an extra-thick radiator like mouacyk?


----------



## kithylin

8051 said:


> Why couldn't you just get an extra-thick radiator like mouacyk?


Radiator thickness has very little to do with thermal capacity; modern half-thickness slim radiators have about the same thermal capacity as a thick one. It's just that with thick radiators the FPI (fins per inch) density is lower, so you don't need fans as strong to push air through them, while slim radiators have a much denser FPI and need stronger fans. Otherwise the two radiator types are about the same; it depends on the setup you want, and thin radiators suit smaller cases that don't have a lot of room. Generally the most efficient way to add thermal dissipation capacity to a loop (the change with the biggest impact on component temperatures) is simply to add more radiator, rather than changing the radiator type (thick vs thin). I'm referring to the wattage capacity of a radiator, by the way: thicker may be more, but not by a lot, maybe +100W to +500W over a thin one, and that's nothing compared to adding a second radiator (of any type) that could add far more capacity again.
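To put rough numbers on the capacity talk above, here is a back-of-the-envelope loop budget. The ~100W-per-120mm figure is a common community rule of thumb, not a spec; real dissipation varies wildly with fan speed, FPI, and the water-to-air delta, and the component wattages below are guesses for illustration:

```python
# Rule of thumb (illustrative only): ~100 W comfortably dissipated
# per 120 mm radiator section at moderate fan speed.
WATTS_PER_120MM = 100

def loop_capacity_watts(radiator_sections_120mm):
    """Total rough capacity for a list of radiators, each given as its
    number of 120 mm sections (a 360 = 3 sections, a 240 = 2)."""
    return sum(radiator_sections_120mm) * WATTS_PER_120MM

capacity = loop_capacity_watts([3, 2])   # a 360 plus a 240

# Hypothetical load: overclocked 1080 Ti (~300 W) + OC'd hexacore (~200 W)
load = 300 + 200
print(capacity, load, capacity >= load)  # 500 500 True -> no headroom
```

By this rough budget, a 360+240 loop sits right at its comfortable limit with an overclocked CPU and GPU, which matches the experience above of water temps creeping up over a long gaming session.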


----------



## 2ndLastJedi

trawetSluaP said:


> I've always struggled to get my card to or above 2000MHz.
> 
> Out of the box it sits at 1974MHz.
> 
> An increase in the power target and voltage slider to the max mean it goes to 1987MHz.
> 
> If I try to add even one bin of 13MHz to the core clock the driver crashes in most instances


Well that sucks.


----------



## feznz

trawetSluaP said:


> I've always struggled to get my card to or above 2000MHz.
> 
> Out of the box it sits at 1974MHz.
> 
> An increase in the power target and voltage slider to the max mean it goes to 1987MHz.
> 
> If I try to add even one bin of 13MHz to the core clock the driver crashes in most instances



After 2000MHz the performance gains diminish. Can you seriously distinguish between 60FPS and 63FPS? And that's probably being generous.

Also, on GPU temps: I had problems at 65°C, so I did a coolant change and repasted with the new Thermal Grizzly Kryonaut. That dropped temps to 35°C, great, and I ended up with an extra 50MHz OC.

After a few days I realised it made games stutter, so I dropped back to 2038MHz core and games are smooth again. It's just a number.
Same with your CPU: been feeding that some. Last night, still 5.4GHz, a 2% increase over 5.2GHz in benchmarks. I know I'll never see a difference in games, but I might have to go for 5.5GHz because I can :thumb:


----------



## KickAssCop

Got my quick setup done.


----------



## jura11

kithylin said:


> I was experimenting with another radiator and did have about -4c to -5c temp drops with adding a second 240mm rad to my 360mm rad, I'm probably going to just wait until I can get some money together and build a better water cooled machine for the 1080 Ti stuff. I had my eye on something like the thermaltake x9 and just pipe it through a pair of matched 360mm rads up top and a splitter on both sides. Using two radiators before though I did have to step up and add a second older MCP355 to my big MCP655, even at full blast the single MCP655 wasn't enough, barely moved coolant around enough to even bleed the air out. So probably move up to a dual-pump or triple-pump setup too. I actually can't maintain my 1080 Ti @ 2139 @ 99% load forever unless I increase cooling later, runs about 45 minutes until the water in the loop heats up then it approaches almost 60c and resets the driver. I think it's just the 1080 Ti's make a stupid high amount of heat when overclocked.


Hi there 

What is your current case, and what is your ambient temperature usually, or when you run these tests?

What radiators do you have?

What fans are you using?

At what RPM are you running them?

Do you have a water temperature sensor?

For comparison, I recently built a loop for a friend. In a single loop he has a 5930K OC'd to 4.6GHz and a Gigabyte GTX 1080 Ti Aorus Extreme with a Phanteks Glacier waterblock: an EK PE360 with Noctua NF-F12s on top and a 60mm-thick Mayhems Havoc 240mm with EK Vardar F3 1850RPM fans, in an Enthoo Primo case. Temperatures are 17-20°C at idle and 33-36°C max at load, with fans running at 900-950RPM; the pump is an XSPC D5 running at 1800RPM, I think.

And regarding the OC on his GTX 1080 Ti: we tried a few OC settings, and the max I saw was 2113MHz, which is stable in most games. The only game where it isn't is Mass Effect Andromeda.

Hope this helps

Thanks, Jura


----------



## kithylin

jura11 said:


> Hi there
> 
> What is yours current case and what ambient temperature is yours usually or when you do tests etc?
> 
> What are yours radiators?
> 
> What fans are you using?
> 
> At what RPM you are running?
> 
> Do you have any water temperature sensor?
> 
> Have look recently I built loop for my friend, he have in single loop 5930k OC to 4.6Ghz,Gigabyte GTX1080Ti Aorus Extreme with Phanteks Glacier WB,EK PE360 with Noctua NF-F12 on top, Mayhems Havoc 240mm 60mm with EK Vardar F3 1850RPM,case is Enthoo Primo and temperatures on idle are in 17-20°C and load in 33-36°C at load as max with fans running at 900-950RPM,pump he have XSPC D5 and pump is running at 1800RPM I think
> 
> And regarding the OC on his GTX1080Ti, we are tried few OC settings and max I saw few 2113MHz which is stable in most games, only one game where isn't is Mass Effect Andromeda
> 
> Hope this helps
> 
> Thanks, Jura


Using a 3x120mm Stealth GTX radiator, and I don't remember the exact fans. They're ADDA brand; I bought them off Mouser because they're industrial 120mm fans (double ball bearing) rated (according to the spec PDFs) higher in static pressure than any "made for computer water cooling" fans I could find at the time, Corsair and such included. I run them in push-pull on a fan controller and usually keep both sets (6 fans) around 40%. I'm also now running two pumps, an MCP-355 and an MCP-655; I was running the MCP-655 solo but didn't have enough flow to even bleed the loop successfully after filling. The loop has the GPU block and a single CPU block. I keep my ambient temps between 74F and 87F with a window air conditioner in the computer's room. In the summer it varies depending on whether the AC is cycling, so the room usually has a hot side and a cold side. We've had outside highs near 80F-83F the past few days, so I haven't had to use the AC in there yet; I'm mostly just running that room open-door until the summer heats up.

It may not be pretty, but it has decent internals. I've had this case since about 2008 and it's been through at least 5 computer upgrades now. I just love the airflow, and love it in general. I've had to hack it up and adapt it to water cooling because it wasn't designed for it. But whatever, it's my baby and I love her and don't care how she looks on the outside; it's what's inside that counts.

https://i.imgur.com/ww4DRO1.jpg


----------



## jura11

kithylin said:


> Using a 3x120mm Stealth GTX radiator, and I don't remember the exact fans. They're ADDA brand and I bought em off of Mouser because they're industrial 120mm fans (Double ball bearing) and rated (According to the spec pdf files) higher than any "made for computer water cooling" fans I could find at the time in terms of static pressure, compared to like corsair and such. I run em in push-pull and on a fan controller and usually run both sets (6 fans) around 40%. I'm also now running two pumps, MCP-355 & MCP-655, I was running the MCP-655 solo but had problems not having enough flow to even successfully bleed the loop after filling. I have the GPU and a single CPU block on the CPU. I keep my ambient temps between 74F <-> 87F with a window air conditioner in the computer's room. In the summer it varies depending on when the AC is on or not, it cycles. So it has a hot side and a cold side, usually. I tested in the We've had outside highs near 80F - 83F the past few days so I haven't had to use the AC in there yet.. mostly just running it open-door for now in that room until the summer heats up.
> 
> It may not be pretty but it has decent internals. I've had this case since like 2008 and it's been through.. at least 5 computer upgrades for me now. I just love the airflow and love it in general.. I've just had to hack it up and adapt it to water cooling because it wasn't designed for it. But whatever, it's my baby and I love her and don't care how she looks on the outside, it's what's inside that counts.
> 
> https://i.imgur.com/ww4DRO1.jpg


Hi there

It really doesn't matter how it looks. I've seen several loops that look awesome but perform below expectation; a friend built an awesome-looking loop, but 56°C on a GTX 1080 seems too high for my liking.

HWLabs GTX360s are nice radiators; personally I use GTS360 radiators in my loop.

Same as you, I have 2 pumps in my personal loop (an XSPC D5 and an EK DDC 3.2 PWM Elite edition), as I don't think a single pump could keep good flow in it: I'm running 3x GTS360 plus an EK XE360, and yesterday I added a MO-RA3 360mm to the loop.

I see you are running a Heatkiller IV Pro block, I assume, which is one of the best blocks on the market.

Those ambient temperatures are high; at 87F, which is around 30°C, I too would have high temperatures on the CPU or GPU. The only way to keep them low at that ambient is to get a chiller.

If you want to keep your case, I strongly recommend getting a MO-RA3 360 or 420mm.

I will try to post pictures of my old loop from when I ran the Enthoo Primo, which I really recommend as a case for water cooling; another option is the EVGA DG-87, etc.

Currently I have a Caselabs M8 with pedestal; although it has space for 4x 360mm rads, I added the MO-RA3 360mm because I want to run the fans slow or very slow, as I render every day.

Hope this helps

Thanks, Jura


----------



## kithylin

jura11 said:


> Hi there
> 
> Really doesn't matter how it looks there, saw several loops which do look awesome but they perform bellow expectation, friend build awesome loop which do look awesome but 56°C on GTX1080 GPU seems too high for my liking
> 
> HWLabs GTX360 are nice radiators and personally I use on my loop GTS360 radiators
> 
> Same as you I have in my personal loop 2 pumps(XSPC D5 and EK DDC 3.2 PWM Elite edition) as single pump don't think would be able keep good flow in my loop as I'm running 3*GTS360 with EK XE360 and yesterday I added to my loop MO-ra3 360mm
> 
> I see you are running Heatkiller IV Pro block I assume which is one of the best blocks on market
> 
> These ambient temperatures are high and at 87F which is around 30°C I too would have high temperatures on CPU or GPU, only way keep GPU or CPU at low temperatures is get chiller
> 
> If you want to keep yours case I strongly recommend get Mo-ra3 360 or 420mm..
> 
> I will try post the pictures of my old loop when I run Enthoo Primo which I really recommend as case for water cooling other case is EVGA DG-87 etc
> 
> Currently have Caselabs M8 with pedestal although they're are have 4*360mm rad space I added MO-ra3 360mm due I want to run fans slow or very slow as I do render every day etc
> 
> 
> Hope this helps
> 
> Thanks, Jura


Thanks for the input, and yes, it's the Heatkiller IV Pro block. The high ambient temps in the summer are just what it takes to get the A/C to cycle in there. It's an 8000-BTU unit for one little 10' x 11' room, but when it gets up around 120F / about 49C outside in the summer, we have to let the A/C cycle off some of the time or the power bill would be insane; if I set it to target a 77F high / 73F low, it runs forever. What I have works for now. I didn't get to add this in the post above, but I was testing for a little while, and I can run around 2139 MHz core on this 1080 Ti with a second 2x120mm Stealth GTX rad in the loop, holding fine at around 50c-53c, so that will help. I do plan to build a better case with two matched big radiators some day; it's just going to be into next year before that happens at this rate. I've been kinda strapped for cash since buying the 1080 Ti, had to put some things on credit cards, and I'm almost back to normal. Unlike some of you rich people with disposable income, it was a pretty big purchase for me; I try to "go big" and buy the top card once every other generation so a card lasts me 4-5 years. Back on the single rad as it is now, I have to run the card at around 2112 MHz max, and that's fine for a while. Funny how just a few MHz make such a big difference in heat and required voltage: I can run it around 1.150v @ 2112, but it wants 1.20v to be stable at 2139.

Every card's different. I've seen some people on the forums here with 1080 Tis that can do 2200 MHz @ 1.125v; sadly not mine. And I'm not going to try to RMA it just because it doesn't overclock well, only to have EVGA give me a refurbished unit instead. So I'll just let it cool its heels for 6-7 months and run it at full OC next year. At least I have a 1080 Ti now.


----------



## jura11

kithylin said:


> Thanks for the input, and yes, it's a Heatkiller IV Pro block. The high ambient temps in the summer are just about letting the a/c cycle in there. It's an 8,000 BTU unit for one little 10' x 11' room, but when it gets up around 120F / about 49C outside in the summer we have to let the a/c cycle off some time or the power bill would be insane; if I set it to target 77F high / 73F low it runs forever. What I have works for now. I didn't get to add it in the post above, but I was testing for a little while: I can run around 2139 MHz core on this 1080 Ti with a second 2x120mm Stealth GTX rad in the loop, and it holds fine at around 50C - 53C, so it will help. I do plan to build up a better case and two matched big radiators some day, but at this rate that'll be next year. I've been kinda strapped for cash since buying the 1080 Ti; I had to put some stuff on credit cards and I'm almost back to normal. Unlike some of you rich people with disposable income, it was a pretty big purchase for me. I try to "go big" and buy the top card once every other generation, and get a card to last me 4-5 years. Back on the single rad as it is now, I have to run the card at around 2112 MHz max, and that's fine for a while. Funny how just a few MHz make such a big difference in heat and required voltage: I can run around 1.150v @ 2112, but it wants 1.20v to be stable at 2139.
> 
> Every card's different. I've seen some people on the forums here with 1080 Ti's that can do 2200 MHz @ 1.125v; sadly not mine. And I'm not going to try and RMA it just because it doesn't overclock well and deal with EVGA giving me a refurbished unit instead. So I'll just let it cool its heels for 6-7 months and run it at full OC next year. At least I have a 1080 Ti now.


Hi there 

Those ambient temperatures are way too high; at 49°C I'm not sure I could live in such an environment. Sorry, I'm from a country where 30°C is too hot for me; personally I like 15-20°C as the max ambient.

And yes, I agree, running the A/C daily or 24/7 will cause high bills etc.

On my Ti I can run 2154 MHz max at 1.093v. I'm still on the stock BIOS for the time being, but I will be flashing a BIOS later this week and will do a few tests to see if it's worth it. For a better OC I would need to do the shunt mod, which I'm scared to do.

Your OC is still awesome, and those temperatures are good, or at least not bad, if we take your ambient temperatures into consideration.

Regarding the rich people comment: I built this PC over the last year and a half. The GPUs were bought through the company that employed me previously (I'm still paying them off to date). I worked in the rendering field at the time and still do, and we got good deals through the company on this.

Buying a new GPU at full price is something I couldn't afford at all; the same applies to many of the things I bought for my PC.

Hope this helps and good luck

Thanks, Jura


----------



## Kriant

Snatched 1080ti FTW3 Hybrid Cooler for $119.99 on Newegg.

I guess my EK block will go on eBay without ever being opened. I just can't bring myself to re-do the whole loop again and swap cards from the other rig to line up two FTW3's.


----------



## ITAngel

Finally got an EVGA GTX 1080 Ti KINGPIN edition graphics card. These GPUs are pretty heavy, and I think it's a great addition to my AMD Threadripper 1920X setup. =)


----------



## ThrashZone

Hi,
Wow @ITAngel is that a copper heat sink ?


----------



## SavantStrike

ThrashZone said:


> Hi,
> Wow @ITAngel is that a copper heat sink ?


Copper plated IIRC.


----------



## ITAngel

ThrashZone said:


> Hi,
> Wow @ITAngel is that a copper heat sink ?


Hi there yes, copper-plated heatsink with six heatpipes.


----------



## ThrashZone

Hi,
Nice. Plated, I suppose; I'm sure it might work better than plain aluminum.


----------



## ITAngel

ThrashZone said:


> Hi,
> Nice. Plated, I suppose; I'm sure it might work better than plain aluminum.


It might; I guess the idea is to provide extra cooling for the card. =) We'll see. So far I'm impressed with the cooling of the card in my case, so I can't complain.


----------



## ThrashZone

ITAngel said:


> It might; I guess the idea is to provide extra cooling for the card. =) We'll see. So far I'm impressed with the cooling of the card in my case, so I can't complain.


Hi,
Yeah, after decapitating my FTW3 and putting new paste on and new thermal pads on the memory, it's sticking to about 55C.
Before that it was choking at 65C+; EVGA used some crappy thermal paste that flaked off, plus greasy thermal pads.


----------



## SavantStrike

ThrashZone said:


> Hi,
> Nice. Plated, I suppose; I'm sure it might work better than plain aluminum.


Not significantly better, just looks cool 

That's a good enough reason in my book!


----------



## ThrashZone

Hi,
Yeah, all of this nickel-plated copper is really annoying


----------



## kithylin

ThrashZone said:


> Hi,
> Wow @ITAngel is that a copper heat sink ?


Yeah, the EVGA KINGPIN is a really, really nice card for tuning and overclocking and has a lot of sexy features: it comes with a perforated copper heatsink and copper heat pipes. I've heard it's quite a heavy air-cooled card, and they 'unofficially' claim all KINGPIN cards should do at least 2050 MHz out of the box with software overclocking, on air, stable with no down-throttling. They also have hookups for the very old EVGA EVBOT, plus a USB header to connect to the motherboard's USB header and do all the EVBOT functions via software. I really wish I'd bought one of those instead, but they were $1100 back when all the other 1080 Ti's were $689 and it was difficult to justify. That, and the only water blocks for them were from EVGA (no 3rd party vendors), and EVGA charged like another $215 for the block when other blocks were like $125. I mean, it would have been nice... I dunno if it would have been worth it. Lucido, aka "K|NGP|N" himself, literally worked directly with EVGA engineers, hand-designed the card's fully custom PCB, and did in-lab testing and LN2 overclocking with the engineers.

If you don't know about em, take a bit to see Buildzoid's PCB breakdown video: 




The Kingpin 1080 Ti cards are literally overkill for the 1080 Ti; buying one for "normal" gaming or even custom water is pointless and a waste of its potential. They're LN2 cards.


----------



## ThrashZone

Hi,
Yeah, EVGA has numerous versions of the Kingpin, priced pretty high for the highest clocks.
I almost think they have more fun with those than the Titans, when or if EVGA still offers Titans?


----------



## MRCOCOset

*LIQUID METAL on 1080TI FE*

Hello. I'm thinking about replacing the stock paste with liquid metal:

https://www.youtube.com/watch?v=1ITPIc9qG0I&t=312s


----------



## kithylin

ThrashZone said:


> Hi,
> Yeah, EVGA has numerous versions of the Kingpin, priced pretty high for the highest clocks.
> I almost think they have more fun with those than the Titans, when or if EVGA still offers Titans?


EVGA doesn't sell Titans. Only NVIDIA does, directly off the NVIDIA website; no 3rd party vendors are allowed to sell Titans this time around.


----------



## axm

I'm having an issue with my 1080 Ti. I've tried troubleshooting it myself and posting on reddit, but I've gotten nowhere. I noticed recently that I was getting lower fps than usual, and the drop happens suddenly: one game I have 300 fps, then the next I have 150. A restart fixes the issue, but after a little while I get low frames again. I was able to replicate it 100% by just waiting a few hours after a restart; my fps would be lower. Temps are the same, clocks are the same, but in GPU-Z the power consumption is a little lower. Here are some benchmarks to show how drastic the loss of performance is.

So far I've looked in the BIOS and all the power options are maxed, I've tried clean wiping Windows, and I've gone into NVIDIA Control Panel and set "Prefer maximum performance". Please help, I've been tearing my hair out over this.

You can see in GPU-Z exactly when my fps drops

I'm not sure if I can link my 3DMark scores, but before the lag I'm at ~4300 and after I'm at ~3300. My graphics score drops from ~5000 to ~3600.

edit: it happened again, and this time my PC crashed shortly after I started lagging. I attached a new screencap from GPU-Z taken before the crash.


----------



## kithylin

axm said:


> I'm having an issue with my 1080 Ti. I've tried troubleshooting it myself and posting on reddit, but I've gotten nowhere. I noticed recently that I was getting lower fps than usual, and the drop happens suddenly: one game I have 300 fps, then the next I have 150. A restart fixes the issue, but after a little while I get low frames again. I was able to replicate it 100% by just waiting a few hours after a restart; my fps would be lower. Temps are the same, clocks are the same, but in GPU-Z the power consumption is a little lower. Here are some benchmarks to show how drastic the loss of performance is.
> 
> So far I've looked in the BIOS and all the power options are maxed, I've tried clean wiping Windows, and I've gone into NVIDIA Control Panel and set "Prefer maximum performance". Please help, I've been tearing my hair out over this.
> 
> You can see in GPU-Z exactly when my fps drops
> 
> I'm not sure if I can link my 3DMark scores, but before the lag I'm at ~4300 and after I'm at ~3300. My graphics score drops from ~5000 to ~3600.
> 
> edit: it happened again, and this time my PC crashed shortly after I started lagging. I attached a new screencap from GPU-Z taken before the crash.


It's difficult to diagnose over the internet without being there to touch your computer, but the only indication of a problem I can see is that GPU-Z clearly says the limitation is VRel, which is voltage reliability. That generally means the power supply isn't able to sustain reliable voltage to the video card. You may have an issue with your power supply: the GPU could be unable to get enough power after running for a while and will automatically down-clock. Now, I don't know if this really is the issue, but one suggestion, depending on your power supply, is to run two power cables from the PSU to the video card instead of just one. Some power supplies have one cable with 8 wires that ends in a 6+2 connector, with a second 6+2 connector chained off the first. Instead of running the entire card on that, try pulling out another 8-wire cable for the second power connector so the GPU isn't pulling everything through just one. That may fix your issue.
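One way to pin down exactly when the slowdown starts is to log clocks and power draw in the background and flag the first sample where power falls well below the session's peak. The `nvidia-smi` query fields used in the comment below (`timestamp`, `clocks.gr`, `power.draw`) are standard, but the log lines in this sketch are made-up sample data, not a real capture:

```python
# Capture a log with something like:
#   nvidia-smi --query-gpu=timestamp,clocks.gr,power.draw \
#              --format=csv,noheader -l 5 > gpu_log.csv
# then scan it for the first sample whose power draw falls
# well below the running peak (the symptom described above).

def find_power_drop(lines, drop_ratio=0.7):
    """Return (timestamp, watts) of the first sample under drop_ratio * peak."""
    peak = 0.0
    for line in lines:
        ts, clock, power = [field.strip() for field in line.split(",")]
        watts = float(power.split()[0])   # "245.30 W" -> 245.30
        peak = max(peak, watts)
        if watts < drop_ratio * peak:
            return ts, watts
    return None

sample = [  # invented sample data for illustration
    "2017/10/12 20:00:05, 1911 MHz, 248.1 W",
    "2017/10/12 20:00:10, 1911 MHz, 251.4 W",
    "2017/10/12 20:00:15, 1898 MHz, 162.0 W",  # the sudden drop
]
print(find_power_drop(sample))
```

Matching the flagged timestamp against game launches, driver events, or PSU load at that moment narrows down whether it's a power-delivery problem or something software-side.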


----------



## 2ndLastJedi

kithylin said:


> axm said:
> 
> 
> 
> I'm having an issue with my 1080 Ti. I've tried troubleshooting it myself and posting on reddit, but I've gotten nowhere. I noticed recently that I was getting lower fps than usual, and the drop happens suddenly: one game I have 300 fps, then the next I have 150. A restart fixes the issue, but after a little while I get low frames again. I was able to replicate it 100% by just waiting a few hours after a restart; my fps would be lower. Temps are the same, clocks are the same, but in GPU-Z the power consumption is a little lower. Here are some benchmarks to show how drastic the loss of performance is.
> 
> So far I've looked in the BIOS and all the power options are maxed, I've tried clean wiping Windows, and I've gone into NVIDIA Control Panel and set "Prefer maximum performance". Please help, I've been tearing my hair out over this.
> 
> You can see in GPU-Z exactly when my fps drops
> 
> I'm not sure if I can link my 3DMark scores, but before the lag I'm at ~4300 and after I'm at ~3300. My graphics score drops from ~5000 to ~3600.
> 
> edit: it happened again, and this time my PC crashed shortly after I started lagging. I attached a new screencap from GPU-Z taken before the crash.
> 
> 
> 
> It's difficult to diagnose over the internet without being there to touch your computer, but the only indication of a problem I can see is that GPU-Z clearly says the limitation is VRel, which is voltage reliability. That generally means the power supply isn't able to sustain reliable voltage to the video card. You may have an issue with your power supply: the GPU could be unable to get enough power after running for a while and will automatically down-clock. Now, I don't know if this really is the issue, but one suggestion, depending on your power supply, is to run two power cables from the PSU to the video card instead of just one. Some power supplies have one cable with 8 wires that ends in a 6+2 connector, with a second 6+2 connector chained off the first. Instead of running the entire card on that, try pulling out another 8-wire cable for the second power connector so the GPU isn't pulling everything through just one. That may fix your issue.


VRel just means the core is unable to step up to the next voltage bin on the Boost curve because the core will become unstable, thus limiting maximum Boost clock. Generally it's nothing to be worried about, as all cards will get this perfcap when running games because the Boost curve is defined beyond the maximum voltage that is reasonable.
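The VRel behaviour described above can be pictured as a lookup into the card's voltage/frequency table: Boost runs the highest bin whose voltage the card is allowed to reach, and the bins defined above the cap are never used, which is why GPU-Z reports the VRel perfcap even in normal operation. A toy sketch (the voltage/clock numbers are invented for illustration, not dumped from a real BIOS):

```python
# Toy model of a Pascal Boost V/F curve: (voltage, MHz) bins in
# ascending order. The card runs the highest bin whose voltage
# fits under the effective cap; higher bins exist in the table
# but are unreachable, so the limiter reads "VRel".

VF_CURVE = [
    (1.031, 1949), (1.050, 1974), (1.062, 1987),
    (1.093, 2012), (1.125, 2050), (1.200, 2100),
]

def boosted_clock(voltage_cap: float) -> int:
    """Highest clock whose bin voltage fits under the cap."""
    usable = [mhz for volts, mhz in VF_CURVE if volts <= voltage_cap]
    return max(usable)

print(boosted_clock(1.093))  # bins above 1.093 V exist but are VRel-capped
```

So a card sitting at its voltage cap with clocks steady is behaving normally, which matches the point that VRel alone isn't a sign of trouble.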


----------



## bmgjet

Anyone had their card overclock a lot better in a different computer?
Just built myself a new 7900X rig and my 1080 Ti now does 2100 MHz at 1.093v effortlessly; it can even do some benchmark runs at 2125 MHz.
Previously 2062 MHz was the most I could get out of it at the same voltage.

Just the CPU and mobo changed.
Same watercooling loop, which is running about 3C hotter than on my 6900K setup.


----------



## pez

Probably better PCIe power delivery and/or better components on the new board overall.


----------



## SavantStrike

MRCOCOset said:


> LIQUID METAL on 1080TI FE:
> 
> Hello. I thought about replacing the molten metal paste:
> 
> https://www.youtube.com/watch?v=1ITPIc9qG0I&t=312s


Don't use liquid metal on any GPU heatsink unless you're sure of the composition. It corrodes aluminum like crazy.

If you are careful not to hit the memory/VRM spreader which is aluminum, you could probably get away with it on a FE 1080 TI, although how much you get versus a good aftermarket TIM is debatable.


----------



## Nineball_Seraph

SavantStrike said:


> Don't use liquid metal on any GPU heatsink unless you're sure of the composition. It corrodes aluminum like crazy.
> 
> If you are careful not to hit the memory/VRM spreader which is aluminum, you could probably get away with it on a FE 1080 TI, although how much you get versus a good aftermarket TIM is debatable.


I would like to second this.

I was running LM on my FE 1080 Ti with the hybrid kit, then removed the LM and used Thermal Grizzly, and did not notice a change in temps.


----------



## 2ndLastJedi

https://youtu.be/HPqi8RdApxc
Here's a vid of a Titan that was killed by liquid metal!


----------



## buellersdayoff

bmgjet said:


> Anyone had their card overclock a lot better in a different computer?
> Just built myself a new 7900X rig and my 1080 Ti now does 2100 MHz at 1.093v effortlessly; it can even do some benchmark runs at 2125 MHz.
> Previously 2062 MHz was the most I could get out of it at the same voltage.
> 
> Just the CPU and mobo changed.
> Same watercooling loop, which is running about 3C hotter than on my 6900K setup.


Clean install of windows and drivers maybe?


----------



## SirWaWa

is the default color on the asus strix 1080ti orange?
i know about asus aura, but does it change to any other color on its own? is it temperature based?


----------



## feznz

SirWaWa said:


> is the default color on the asus strix 1080ti orange?
> i know about asus aura, but does it change to any other color on its own? is it temperature based?


reddish, slightly orange, with a "breathing" effect that slowly cycles from bright to off


----------



## SirWaWa

feznz said:


> reddish, slightly orange, with a "breathing" effect that slowly cycles from bright to off


that's exactly what I see... so it won't do anything else "outta the box"?
can I install aura, change it, then uninstall it? or does aura have to be running all the time? this trick works with corsair link, but that's corsair stuff


----------



## KCDC

SirWaWa said:


> that's exactly what I see... so it won't do anything else "outta the box"?
> can I install aura, change it, then uninstall it? or does aura have to be running all the time? this trick works with corsair link, but that's corsair stuff



For other Aura things, mobo and leds, I've been able to uninstall and keep settings, but sometimes when I reinstall Aura, it will reset everything.


----------



## pez

Yep, I remember my STRIX Ti and my Z270i (and now my X370) board all keeping the lighting settings after uninstalling the software afterwards.


----------



## feznz

SirWaWa said:


> that's exactly what I see... so it won't do anything else "outta the box"?
> can I install aura, change it, then uninstall it? or does aura have to be running all the time? this trick works with corsair link, but that's corsair stuff


I can say I uninstalled Corsair Link because it used a consistent 2-3% CPU, and with Intel Turbo Boost Max 3.0 the CPU never really idled properly.
Also, with Aura I had light cycles on the motherboard, which also used quite a bit of CPU in the background, so I just left it as a solid colour with the RGB RAM at default (colour cycles).
I have no idea what would happen to CPU usage if you changed the lighting to what you want and then uninstalled the Aura software.


----------



## keikei

Hey guys,

quick question. How is SLI these days? Rumors of Volta have been silent and I would like some moar power.


----------



## OrionBG

feznz said:


> I can say I uninstalled Corsair Link because it used a consistent 2-3% CPU, and with Intel Turbo Boost Max 3.0 the CPU never really idled properly.
> Also, with Aura I had light cycles on the motherboard, which also used quite a bit of CPU in the background, so I just left it as a solid colour with the RGB RAM at default (colour cycles).
> I have no idea what would happen to CPU usage if you changed the lighting to what you want and then uninstalled the Aura software.


CPU usage doesn't have anything to do with the RGB effects themselves! All those programs just send a command that is written to the RGB controller, and there is no communication between the software and the controller until you want to change the effect. The CPU usage in Corsair Link actually comes from its many other functions: temperature monitoring, voltage monitoring, fan speed control and so on.
You can set an RGB effect and stop the Aura app / uninstall it. The effect will be saved until the RGB controller is reset, which may happen at any point due to a power surge, a BIOS flash, possibly even a BIOS reset, or just by some fluke of nature... Regarding Corsair Link, most of the fans do remember the effect. For instance, on my PC I have 6x HD fans and 3x LL fans, and although I have Corsair Link 4, ASRock's RGB software and G.Skill's TridentZ RGB software installed, I have disabled "run at startup" for all of them and none is started. I only start them when I want to change the effects. Even their services are not running.


----------



## feznz

OrionBG said:


> CPU usage doesn't have anything to do with the RGB effects themselves! All those programs just send a command that is written to the RGB controller, and there is no communication between the software and the controller until you want to change the effect. The CPU usage in Corsair Link actually comes from its many other functions: temperature monitoring, voltage monitoring, fan speed control and so on.
> You can set an RGB effect and stop the Aura app / uninstall it. The effect will be saved until the RGB controller is reset, which may happen at any point due to a power surge, a BIOS flash, possibly even a BIOS reset, or just by some fluke of nature... Regarding Corsair Link, most of the fans do remember the effect. For instance, on my PC I have 6x HD fans and 3x LL fans, and although I have Corsair Link 4, ASRock's RGB software and G.Skill's TridentZ RGB software installed, I have disabled "run at startup" for all of them and none is started. I only start them when I want to change the effects. Even their services are not running.


Good points. TBH I really liked the RGB for the first week or so, but decided just one solid colour was better; it would be nice if it were RGB and white.
I never really found a use for Corsair Link, as HWMonitor will read power draw, temps etc., and I didn't need the fan controller.
But why Link uses like 10x more CPU than MSI Afterburner doesn't make sense, and it feels like a duplicate monitoring tool.


----------



## SirWaWa

well, I guess I'm leaving it as is... too bad it had to default to orange... that Strix legacy


----------



## Mr Ripper

I want to get a 1080 Ti and water cool it with my EK universal block, which I had on my 7970 for many years and recently, as a stop gap, swapped onto an ASUS GTX 1060 6GB with heatsinks placed where needed.

I was wondering whether any particular 1080 Ti PCB design would be better suited? I'm thinking about the placement of additional heatsinks as well as PCB quality. I've read this about the MSI Armor (poor stock cooling but good PCB): https://www.gamersnexus.net/hwreviews/2927-msi-1080ti-armor-review-high-temps?showall=1

However, as I'm likely to buy secondhand, I'd rather not get one that I can't resell easily in the future if needed, since used prices seem closer together.


----------



## kithylin

Mr Ripper said:


> I want to get a 1080 Ti and water cool it with my EK universal block, which I had on my 7970 for many years and recently, as a stop gap, swapped onto an ASUS GTX 1060 6GB with heatsinks placed where needed.
> 
> I was wondering whether any particular 1080 Ti PCB design would be better suited? I'm thinking about the placement of additional heatsinks as well as PCB quality. I've read this about the MSI Armor (poor stock cooling but good PCB): https://www.gamersnexus.net/hwreviews/2927-msi-1080ti-armor-review-high-temps?showall=1
> 
> However, as I'm likely to buy secondhand, I'd rather not get one that I can't resell easily in the future if needed, since used prices seem closer together.


That will cause a modern card to melt; you need full water block contact on the VRMs for the 1080 and 1080 Ti, as they get very hot. You can't just stick heatsinks on like that. If you're going to be spending near $1000 on a card today, at least spend another $150 on a proper EK full-cover waterblock instead of some dinky 7-year-old universal thing.


----------



## 8051

kithylin said:


> That will cause a modern card to melt; you need full water block contact on the VRMs for the 1080 and 1080 Ti, as they get very hot. You can't just stick heatsinks on like that. If you're going to be spending near $1000 on a card today, at least spend another $150 on a proper EK full-cover waterblock instead of some dinky 7-year-old universal thing.


The gamersnexus testing w/the NZXT Kraken G12 cooler on the MSI Armor 1080Ti shows a significant reduction in ALL temps (VRM's, VRAM and GPU) over the stock air cooler and using the stock cooling plate.


----------



## kithylin

8051 said:


> The gamersnexus testing w/the NZXT Kraken G12 cooler on the MSI Armor 1080Ti shows a significant reduction in ALL temps (VRM's, VRAM and GPU) over the stock air cooler and using the stock cooling plate.


They aimed a fan at the card from the side of the motherboard in their testing; that's why. Unless I'm reading something wrong, it sounds like "Mr Ripper" above is describing a generic/universal water block with just heatsinks on the RAM and VRMs, with no fan and no additional cooling. That just won't work long-term with 1080 Ti's. Besides all that, gluing anything to the RAM or any other part of the PCB would void the warranty from any manufacturer due to modifications to the PCB. I mean, unless they're literally swimming in disposable income and don't care about throwing away almost $1000 on a card and voiding the warranty straight away... I guess more power to 'em, but it wouldn't be smart for most people.


----------



## Kronos8

Mr Ripper said:


> I want to get a 1080 Ti and water cool it with my EK universal block, which I had on my 7970 for many years and recently, as a stop gap, swapped onto an ASUS GTX 1060 6GB with heatsinks placed where needed.
> 
> I was wondering whether any particular 1080 Ti PCB design would be better suited? I'm thinking about the placement of additional heatsinks as well as PCB quality. I've read this about the MSI Armor (poor stock cooling but good PCB): https://www.gamersnexus.net/hwreviews/2927-msi-1080ti-armor-review-high-temps?showall=1
> 
> However, as I'm likely to buy secondhand, I'd rather not get one that I can't resell easily in the future if needed, since used prices seem closer together.


I got that exact card, before prices went high. So far I use it air cooled; I haven't had the time to fit my universal EK block to it, as I also have to take apart my loop for *maintenance*. After a lot of investigation, my plan is the following. Fit my EK universal block to the GPU. Between the PCB and the main heatsink, the Armor has a metal plate; I intend to keep it and add copper heatsinks where the RAM chips are. For the VRMs I have two options: either use a waterblock like the Koolance MVR-100 on top of the metal plate, if I can mount it, or use heatsinks and adapt the plastic shroud of the original fans to dissipate the heat from the heatsinks (very tricky, but plausible). In that case, my opinion is that a fan is mandatory. The Koolance MVR-100 option is more appealing; I have seen photos of that setup on OCN, but I can't remember in which thread. Generally speaking, cards that have a metal plate between the PCB and the main heatsink are easier to adapt. I will get pictures when I'm done.


----------



## Mr Ripper

Kronos8 said:


> I got that exact card, before prices went high. So far I use it air cooled; I haven't had the time to fit my universal EK block to it, as I also have to take apart my loop for *maintenance*. After a lot of investigation, my plan is the following. Fit my EK universal block to the GPU. Between the PCB and the main heatsink, the Armor has a metal plate; I intend to keep it and add copper heatsinks where the RAM chips are. For the VRMs I have two options: either use a waterblock like the Koolance MVR-100 on top of the metal plate, if I can mount it, or use heatsinks and adapt the plastic shroud of the original fans to dissipate the heat from the heatsinks (very tricky, but plausible). In that case, my opinion is that a fan is mandatory. The Koolance MVR-100 option is more appealing; I have seen photos of that setup on OCN, but I can't remember in which thread. Generally speaking, cards that have a metal plate between the PCB and the main heatsink are easier to adapt. I will get pictures when I'm done.


Thanks for the input. I've actually managed to get a Zotac 1080 (non-Ti) AMP at a good price because it has a faulty fan, so I will experiment on this first, as I still plan to get a Ti later. I'm not sure how different the 1080 is from the Ti, but I'll PM you if I manage something before you do (unless others don't mind me posting here).

Anyway, I've not broken anything in over 15 years, including electrical hard mods (with prior research), so I'm sure I won't do anything stupid. I've got an IR temp gun to monitor things from boot-up.


----------



## Kronos8

Mr Ripper said:


> Thanks for the input. I've actually managed to get a Zotac 1080 (non-Ti) AMP at a good price because it has a faulty fan, so I will experiment on this first, as I still plan to get a Ti later. I'm not sure how different the 1080 is from the Ti, but I'll PM you if I manage something before you do (unless others don't mind me posting here).
> 
> Anyway, I've not broken anything in over 15 years, including electrical hard mods (with prior research), so I'm sure I won't do anything stupid. I've got an IR temp gun to monitor things from boot-up.


It would be educational to exchange ideas. :thumb:


----------



## rakesh27

I own an EVGA 1080 Ti. Day one I swapped out the crap standard cooler and put on a Kraken G12; temps are not a problem and I've had the card for 6 months now.

Just follow the steps to install: use an AIO water cooler for the GPU, and for the memory, if you want, stick some small heatsinks on there, same for the VRMs.

The idea is that the fan blowing down from the Kraken will blow all over the card; in turn, since you have small heatsinks on the VRMs and memory, the airflow will cool them off while the card is running at idle and load.

On my EVGA 1080 Ti I can overclock the GPU to 2100; I forget the memory clock, but it's high.

The Kraken is good; I just moved it over from my Zotac 1080 to the EVGA 1080 Ti FE.

Just ensure you have good airflow in your case: air coming in from the front and, if you can, the bottom, with hot air expelled from the rear and top.

The idea is that everything stays constant, I mean temps, and hot air never sits around to heat up other parts.


----------



## kithylin

rakesh27 said:


> I own an EVGA 1080 Ti. Day one I swapped out the crap standard cooler and put on a Kraken G12; temps are not a problem and I've had the card for 6 months now.
> 
> Just follow the steps to install: use an AIO water cooler for the GPU, and for the memory, if you want, stick some small heatsinks on there, same for the VRMs.
> 
> The idea is that the fan blowing down from the Kraken will blow all over the card; in turn, since you have small heatsinks on the VRMs and memory, the airflow will cool them off while the card is running at idle and load.
> 
> On my EVGA 1080 Ti I can overclock the GPU to 2100; I forget the memory clock, but it's high.
> 
> The Kraken is good; I just moved it over from my Zotac 1080 to the EVGA 1080 Ti FE.
> 
> Just ensure you have good airflow in your case: air coming in from the front and, if you can, the bottom, with hot air expelled from the rear and top.
> 
> The idea is that everything stays constant, I mean temps, and hot air never sits around to heat up other parts.


When's the last time you checked the temperature of your VRM area under load with something like an IR thermometer? What temps is your core running at? VRMs usually run 25C to 30C hotter than the core; try to keep them under 100C if at all possible.

Other than that, I guess if it works for you, it works. Personally though, I prefer full-cover blocks on my GPUs. My case and radiator setup may look ghetto, but my GPU blocks usually look nice and sleek. I love the full-cover EK block and shiny EK backplate on my 1080 Ti, and I desperately want a full-cover EK block and backplate for the 780 Classified in my other project computer some day.


----------



## rakesh27

I get what you're all saying...

Think of it like this: if you're using the machine for web and Windows stuff, there's no need to overclock the beast.

When gaming you don't need to either... however you can; just test it with a good long gaming session.

At idle it's happy sitting there trundling along; under load it comes alive with a good overclock. You don't have to go all out.

Remember, this card is very good.

Like I said, it works. I did it this way on the 980 Ti, 1080 and 1080 Ti; touch wood, the cards are stable, no problems, and I run a G-Sync monitor with some games running at a virtual resolution of 4K.

The Kraken works. You can either connect the fan to a motherboard fan header, or get the adapter so you can connect it to the GPU fan plug and have more control at any temp.

I got my EVGA 1080 Ti FE not at day one, I think 2-3 months after release, and to be honest it hasn't missed a beat.

I'm still running a 6700K and I'm very happy with my Z170 mobo; it maxes all games out at 144 Hz.

Guys, remember you've come late to the party; when the card first came out and thereafter, everything was explored, what was possible and what was not.

No offense to anyone, but if you've recently bought a 1080 Ti then your cards should be more refined, unless you got a bad batch.

I've always used an AIO cooler on my CPU and GPU; I don't want the hassle of a full water cooling system. This way I clean the fins and fans when I want, and it runs like clockwork.

I also have a Corsair Obsidian 900D with plenty of airflow; it weighs like a small tank, maybe that's why it's OK for me.

Good luck


----------



## Kronos8

rakesh27 said:


> I get what you're all saying...
> 
> Like I said, it works. I did it this way on the 980 Ti, 1080 and 1080 Ti; touch wood, the cards are stable, no problems, and I run a G-Sync monitor with some games running at a virtual resolution of 4K.
> 
> The Kraken works. You can either connect the fan to a motherboard fan header, or get the adapter so you can connect it to the GPU fan plug and have more control at any temp.
> 
> Good luck


I'm sure it works. If you don't mind me asking: with the Kraken, there is no additional heatsink between the fan and the VRMs, is that right? I would definitely consider something like the Kraken, but I already own a universal GPU waterblock and a custom loop, so I have to go down that route.


----------



## Mr Ripper

*Universal water blocks*

Just to add RE: universal water blocks. I believe the 7970 has similar power draw to the 1080, and the 1080 Ti draws about 70 W more - please correct me if I'm wrong, as I'm struggling to find comparative figures.

I ran my 7970 with a universal water block and only passive cooling on the memory and VRMs for around 5-6 years (no fans). The VRMs in fact only had a tiny, narrow line of heatsinks stuck on, because there was so little space for them. I really can't see there being a problem doing the same with the 1080, and I reckon the 1080 Ti would be fine too with a bit more of a heatsink on the VRMs - on my Zotac 1080 AMP the stock VRM heatsink (separate from the main heatsink) is a fair bit bigger than what I had on my 7970. These Pascal GPUs have so many safety features anyway that I bet you'd find out via throttling before anything drastic could happen.

I was almost ready to try it out last night, but I'm lacking some adhesive thermal padding so I couldn't cover the memory and VRMs; I'm awaiting a delivery.


----------



## Kronos8

Mr Ripper said:


> Just to add RE: universal water blocks. I believe the 7970 has similar power draw to the 1080, and the 1080 Ti draws about 70 W more - please correct me if I'm wrong, as I'm struggling to find comparative figures.
> 
> I ran my 7970 with a universal water block and only passive cooling on the memory and VRMs for around 5-6 years (no fans). The VRMs in fact only had a tiny, narrow line of heatsinks stuck on, because there was so little space for them. I really can't see there being a problem doing the same with the 1080, and I reckon the 1080 Ti would be fine too with a bit more of a heatsink on the VRMs - on my Zotac 1080 AMP the stock VRM heatsink (separate from the main heatsink) is a fair bit bigger than what I had on my 7970. These Pascal GPUs have so many safety features anyway that I bet you'd find out via throttling before anything drastic could happen.
> 
> I was almost ready to try it out last night, but I'm lacking some adhesive thermal padding so I couldn't cover the memory and VRMs; I'm awaiting a delivery.


I've read several posts saying they only use passive cooling on the VRMs with no problem. I know it can work, and I'm not afraid to try it. The only parameter different from older cards is Boost 3.0, which actually overclocks the 1080 and 1080 Ti with the user doing nothing. That could potentially lead to higher VRM temps and/or unstable GPU clocks. Users who have tried it may laugh at what I'm writing, but caution is necessary and feedback is always welcome.
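NVIDIA hasn't published the exact Boost 3.0 algorithm, so purely as a toy illustration of the temperature-dependent behavior described here (every threshold and step size below is invented for the sketch, not NVIDIA's real boost table), the effect is roughly this:

```python
def toy_boost_clock(base_mhz, temp_c):
    """Toy model of temperature-based boost binning.
    NOT NVIDIA's real algorithm -- the thresholds and steps are
    invented purely to show why a cooler card holds higher clocks."""
    if temp_c < 50:
        return base_mhz + 150   # plenty of thermal headroom
    if temp_c < 65:
        return base_mhz + 100
    if temp_c < 80:
        return base_mhz + 50
    return base_mhz             # throttled back toward base clock

print(toy_boost_clock(1480, 45))  # 1630
print(toy_boost_clock(1480, 85))  # 1480
```

The point is only that clocks (and therefore VRM load) rise on their own when temperatures allow, which is why passive VRM cooling deserves some caution on Pascal.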


----------



## 8051

rakesh27 said:


> I get what you're all saying...
> 
> Think of it like this... if you're using the machine for web and Windows stuff, there's no need to overclock the beast...
> 
> When gaming you don't need to either.... however you can... just test and do a good long gaming session.
> 
> At idle it's happy sitting there trundling along; under load it comes alive with a good overclock, and you don't have to go all out...
> 
> Remember this card is very good.
> 
> Like I said, it works. I did it this way on the 980 Ti, 1080 and 1080 Ti; touch wood, the cards are stable... no problems, and I run a G-Sync monitor with some games running at a virtual resolution of 4K...
> 
> The Kracken works... you can either connect the fan to the motherboard fan header or get the adapter so you can connect it to the GPU fan plug, which gives you more control at any temp.
> 
> I got an EVGA 1080 Ti FE, not day one; I think I got it 2-3 months after release, and to be honest it hasn't missed a beat.
> 
> I'm still running a 6700K and I'm very happy with my Z170 mobo; it maxes all games out at 144 Hz...
> 
> Guys, remember you've come late to the party; when the card first came out and thereafter, everything was explored, what was possible and what was not...
> 
> No offense to anyone, but if you've recently bought a 1080 Ti then your card should be more refined, unless you got a bad batch....
> 
> I've always used an AIO cooler on my CPU and GPU.... I don't want the hassle of a full water cooling system... this way I clean the fins and fans whenever I want and it runs like clockwork.
> 
> However, I have a Corsair Obsidian 900D with plenty of airflow; it weighs like a small tank, maybe that's why it's OK for me.
> 
> Good luck


The way you've been able to re-use your NZXT Kraken (not Kracken, AFAICT) on your GPUs makes it seem like a good idea. What type of Kraken do you have? Did you have to buy a new GPU mounting kit for the 1080 series?


----------



## 8051

Kronos8 said:


> I've read several posts saying they only use passive cooling on the VRMs with no problem. I know it can work, and I'm not afraid to try it. The only parameter different from older cards is Boost 3.0, which actually overclocks the 1080 and 1080 Ti with the user doing nothing. That could potentially lead to higher VRM temps and/or unstable GPU clocks. Users who have tried it may laugh at what I'm writing, but caution is necessary and feedback is always welcome.


Considering how much 1080 Tis are selling for now, I wouldn't be willing to take the chance on running bare VRMs. I've already had to downclock the memory on my 1080 Ti to get stability when overclocking the core (I'm on air with aftermarket fans and the stock heatsink), and for all I know it could be because of insufficient cooling.


----------



## rakesh27

The NZXT Kracken G12 works like this...

They supply you with a mounting plate and fan... the mounting plate has 3 holes in each corner for the AIO cooler.. 

Since it's an NVIDIA board, I'm sure it will line up with one of the 3 in each corner... this holds the mounting plate with the AIO cooler to the board....

There's not that much difference between the 980 Ti, 1080 and 1080 Ti in the PCB mounting holes for an AIO cooler...

The fan for the VRMs and memory, which I think could be a 90mm or 120mm, mounts to the Kracken plate...

That's it, done.

Basically you mount the Kracken to the PCB, and any AIO cooler on the approved list will fit the mounting plate holes to make a tight bond with the GPU..

Go and check the Kracken website and see the instructions they've laid out...

I use a Corsair H100i for both GPU and CPU, and for the RAM I'm using a Corsair 3-fan mini blower....

I've always used push-pull on my AIO, one fan on the front and another fan on the back... however, you either have to connect the fans to Molex or a mobo fan header... or get a splitter and connect one set to the GPU fan header with an adapter, and for the CPU connect each fan to Molex... 

Just wire it nicely...

When I pull my GPU out, the whole lot comes out in one go: fan, AIO with fans...

The CPU is different; just disconnect the Molex and then pull the whole lot...

So easy....

I wouldn't mind water cooling, but to be honest it's too much headache setting up, and if something goes wrong you have to pull it all apart.

Now go, younglings, and find your way.


----------



## jura11

AIO on a GPU, no way I would do that again; last time an EVGA Hybrid failed on me and caused a lot of headache. And is it worth it? 55-60°C load temperatures on a friend's Ti, and the OC only slightly improved.

If you want a good and cheap cooler, the Raijintek Morpheus II is one of the best air coolers for a GPU on the market and performs very nicely on the Ti. I tried my Raijintek Morpheus II on a friend's EVGA GTX 1080 Ti SC Black ICX, and temperatures under load or in gaming were in the mid to high 40s, with the VRM at a nice 50-55°C during longer gaming sessions.

Hope this helps 

Thanks, Jura


----------



## ZealotKi11er

jura11 said:


> AIO on a GPU, no way I would do that again; last time an EVGA Hybrid failed on me and caused a lot of headache. And is it worth it? 55-60°C load temperatures on a friend's Ti, and the OC only slightly improved.
> 
> If you want a good and cheap cooler, the Raijintek Morpheus II is one of the best air coolers for a GPU on the market and performs very nicely on the Ti. I tried my Raijintek Morpheus II on a friend's EVGA GTX 1080 Ti SC Black ICX, and temperatures under load or in gaming were in the mid to high 40s, with the VRM at a nice 50-55°C during longer gaming sessions.
> 
> Hope this helps
> 
> Thanks, Jura


Been very happy with the EVGA Hybrid. Running it for 8 months now with a 1080 Ti. They are getting better and better with each version.


----------



## Mr Ripper

*EK Supreme universal waterblock*

This is my Zotac AMP 1080 (non-Ti), but it has the same PCB layout as the Ti version. I couldn't wait for the new thermal pads and new heatsinks, so I made do with what I have.

The memory VRM has no heatsink currently and the temps have been OK, but I haven't done anything too strenuous. Checking with my IR thermometer, temps have been fine, even on the top-right RAM chip where the heatsink fell off and which I've been monitoring bare. It's strange, because I used this water block on a 1060 for a little while and the RAM chips got really hot, and they didn't have any heatsinks to begin with. The GPU has been auto-boosting to 1949 MHz, which I guess is due to the low temperatures.


----------



## kithylin

Mr Ripper said:


> This is my Zotac AMP 1080 (non-Ti), but it has the same PCB layout as the Ti version. I couldn't wait for the new thermal pads and new heatsinks, so I made do with what I have.
> 
> The memory VRM has no heatsink currently and the temps have been OK, but I haven't done anything too strenuous. Checking with my IR thermometer, temps have been fine, even on the top-right RAM chip where the heatsink fell off and which I've been monitoring bare. It's strange, because I used this water block on a 1060 for a little while and the RAM chips got really hot, and they didn't have any heatsinks to begin with. The GPU has been auto-boosting to 1949 MHz, which I guess is due to the low temperatures.


I suppose it's down to each person's tastes and what they want to do with their cards. If you have a stock card and stock BIOS and just leave it alone to boost on its own, then a universal block or AIO would probably be fine, and even heatsinking the VRMs with no fan would probably be fine too. The GTX 1080 is rated for 180 watts and the 1080 Ti is rated for 250 watts. The HD 7970 is rated for 250 watts, so it actually matches the 1080 Ti rather than the 1080. The only concern I have is that posts like this may lead people to think a universal block or AIO on a card would be "enough" to be "water cooled" and then try to load one of the unlimited-power BIOSes on the card and overclock it. In that situation, we've already documented and observed 1080 Tis going to and beyond 400 watts in some game titles with some overclocks. That's the territory where a full-cover block that fully contacts the VRMs and all parts of the card is an absolute must.

For power:

7970: https://www.techpowerup.com/gpudb/296/radeon-hd-7970
1080: https://www.techpowerup.com/gpudb/2839/geforce-gtx-1080
1080 Ti: https://www.techpowerup.com/gpudb/2877/geforce-gtx-1080-ti
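As a quick sanity check on the comparative figures asked about earlier, the board-power ratings from those TechPowerUp pages line up like this (a trivial sketch; the wattages are just the TDP numbers quoted above):

```python
# TDP (board power) ratings quoted above, in watts
tdp = {"HD 7970": 250, "GTX 1080": 180, "GTX 1080 Ti": 250}

# How much more the 1080 Ti draws than the plain 1080 at stock ratings
delta = tdp["GTX 1080 Ti"] - tdp["GTX 1080"]
print(delta)  # 70 -- matching the "about 70 W more" estimate

# The 7970's rating actually matches the 1080 Ti, not the 1080
print(tdp["HD 7970"] == tdp["GTX 1080 Ti"])  # True
```

Stock ratings only, of course; with an unlimited-power BIOS the 400 W+ figures mentioned above blow well past these numbers.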


----------



## 8051

kithylin said:


> The only concern I have is that posts like this may lead people to think a universal block or AIO on a card would be "enough" to be "water cooled" and then try to load one of the unlimited-power BIOSes on the card and overclock it. In that situation, we've already documented and observed 1080 Tis going to and beyond 400 watts in some game titles with some overclocks. That's the territory where a full-cover block that fully contacts the VRMs and all parts of the card is an absolute must.


Is it really that dire that it requires a full water block? Jura reports he's had good luck w/high-end air GPU coolers like the Morpheus II.


----------



## kithylin

8051 said:


> Is it really that dire that it requires a full water block? Jura reports he's had good luck w/high-end air GPU coolers like the Morpheus II.


If someone were to use an unlocked BIOS and push overclocks with voltage up to 1.200v and north of the 400 W range, probably yes. While it may be possible with an AIO-and-air combination, personally I wouldn't risk air cooling the VRMs; 1080 Tis are too expensive. Not for daily gaming usage, that is. Temporarily on a test bench for 3DMark, it would probably be fine.


----------



## jura11

kithylin said:


> If someone were to use an unlocked BIOS and push overclocks with voltage up to 1.200v and north of the 400 W range, probably yes. While it may be possible with an AIO-and-air combination, personally I wouldn't risk air cooling the VRMs; 1080 Tis are too expensive. Not for daily gaming usage, that is. Temporarily on a test bench for 3DMark, it would probably be fine.


Hi there 

Do you really need to run the XOC BIOS on a Ti? I'm more than OK and happy running the stock BIOS. How much would I gain running the XOC BIOS? Probably a few percent, which isn't worth it.

I've run a few BIOSes and I keep returning to the stock BIOS, with which I've had a good experience.

A properly unlocked BIOS isn't likely unless we get a Pascal BIOS tweaker again, which I don't think we will ever see.

A friend has run the XOC BIOS on his Ti with a Morpheus II cooler for a few months without a single issue; his temperatures are still better than with the stock cooler by a good margin, and it's a lot quieter than the stock cooler.

I would happily run this cooler on my cards; if I could find a board that would take all three cards, I would be more than happy to go back to air, an NH-D15 on the CPU and Morpheus IIs on the GPUs.

Have a look: my friend had a loop which he built himself, and his GPU temperatures were in the mid 40s to low 50s, with CPU temperatures very similar to what he got on his AIO. A few months ago we reverted his PC back to air, a Thermalright Le Grand Macho RT for the CPU and a Raijintek Morpheus II for his Ti. He is now running his 5960X at 4.6 GHz with 1.28v, with temperatures around 5-8°C better than with the AIO, and GPU temperatures similar to what he had under water. If you asked him what he prefers, he would say air right now: less noise and fewer problems in the long run. He went through a small hell with the loop he built, which I redid a few times, but in the end he went back to air.




Hope this helps 

Thanks, Jura


----------



## feznz

I guess that's why you don't run the XOC BIOS; the standard Asus Strix BIOS has a power limit set at 120%, so about 300 W. Yawn, never has this card seen 400 W.
Still, running the memory bare must have limited the life expectancy of my GTX 770, which I've run that way since release back in Feb 2013, so.... you have been warned


----------



## RoBiK

*pushing XOC BIOS to the limit*

Pushing the XOC BIOS to the limit: values for a 10-minute window after more than an hour of ramping up the temperatures until they stabilized. Rock solid across 4 cards with [email protected] I could have pushed the cards even higher with a lighter workload, or if I had just picked the single most stable card instead of aiming for stability across all of them at the same frequencies. Each card was pulling in the vicinity of 500 W, with spikes well over 500 W (confirmed with measurements from the two Corsair AXi PSUs feeding the cards).
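For scale, a back-of-envelope check of the numbers in this run (the per-card 500 W figure comes from the post; the PSU rating is an assumed example, since the exact AXi models weren't stated):

```python
# Rough power budget for the quad-card XOC run described above.
cards = 4
watts_per_card = 500      # sustained per-card draw reported (spikes go higher)
psus = 2
psu_capacity = 1500       # assumed rating per PSU, e.g. an AX1500i, for illustration

total_gpu_watts = cards * watts_per_card     # GPU load alone
per_psu = total_gpu_watts / psus             # GPU load per PSU if split evenly
headroom = psu_capacity - per_psu            # left for CPU, spikes, margin

print(total_gpu_watts, per_psu, headroom)  # 2000 1000.0 500.0
```

With 2000 W of GPU draw alone, two large PSUs are not optional extravagance; a single unit would be at or past its limit before the CPU is even counted.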


----------



## MightEMatt

RoBiK said:


> Pushing the XOC BIOS to the limit: values for a 10-minute window after more than an hour of ramping up the temperatures until they stabilized. Rock solid across 4 cards with [email protected] I could have pushed the cards even higher with a lighter workload, or if I had just picked the single most stable card instead of aiming for stability across all of them at the same frequencies. Each card was pulling in the vicinity of 500 W, with spikes well over 500 W (confirmed with measurements from the two Corsair AXi PSUs feeding the cards).



Were you frequency-curve tuning? If so, which utility did you use? I find Afterburner gives me nothing but problems when trying to set curves on multiple cards: it either locks at 2 GHz for every card regardless of the curves, or it resets the curve of the card I'm not currently tuning. If you didn't, and instead used offset tuning, how did you get it to pass 1.1 V?


----------



## RoBiK

MightEMatt said:


> Were you frequency curve tuning? If so which utility did you use? I find afterburner gives me nothing but problems when trying to set curves on multiple cards. It either locks at 2GHz for every card regardless of the curves, or it resets the curve of the card I'm not currently tuning. If you didn't and instead used offset tuning, how did you get it to pass 1.1V?


Yes, I used the frequency/voltage curve, and yes, I used Afterburner... it can sometimes be confusing when using the curve and switching between cards - when I switch cards I make sure to close the curve window and reopen it before making any changes.


----------



## MightEMatt

RoBiK said:


> Yes, I used the frequency/voltage curve, and yes, I used Afterburner... it can sometimes be confusing when using the curve and switching between cards - when I switch cards I make sure to close the curve window and reopen it before making any changes.



Hmm, I'll have to try again sometime. I haven't bothered with the XOC BIOS in months since I couldn't get Afterburner to cooperate; maybe it had something to do with the older version. Awesome setup anyway, quad-GPU water setups are pretty savage, usefulness notwithstanding.


----------



## jura11

RoBiK said:


> Pushing the XOC BIOS to the limit: values for a 10-minute window after more than an hour of ramping up the temperatures until they stabilized. Rock solid across 4 cards with [email protected] I could have pushed the cards even higher with a lighter workload, or if I had just picked the single most stable card instead of aiming for stability across all of them at the same frequencies. Each card was pulling in the vicinity of 500 W, with spikes well over 500 W (confirmed with measurements from the two Corsair AXi PSUs feeding the cards).


My GTX 1080 Ti will do 2164 MHz at 1.093v, maybe a bit more. In rendering, 2164 MHz is easily possible, and in gaming as well; only one game isn't, and that's Mass Effect Andromeda, where I run 2152 MHz. But the difference between 2113 MHz and 2164 MHz is so minimal in rendering or gaming that it's not worth it to me.

As for my other cards, both GTX 1080s: the first will do 2164 MHz at 1.075v and the other will do 2100 MHz at 1.093v.

I've tried the XOC BIOS a few times on my Ti and I'm still not sure whether it's worth it. With XOC, the better route for me looks like the shunt mod, which I planned to do, but it seems I won't bother, since I don't game that much, and when I do game I'm still more than OK with my cards.

Hope this helps 

Thanks, Jura


----------



## 8051

RoBiK said:


> Pushing the XOC BIOS to the limit: values for a 10-minute window after more than an hour of ramping up the temperatures until they stabilized. Rock solid across 4 cards with [email protected] I could have pushed the cards even higher with a lighter workload, or if I had just picked the single most stable card instead of aiming for stability across all of them at the same frequencies. Each card was pulling in the vicinity of 500 W, with spikes well over 500 W (confirmed with measurements from the two Corsair AXi PSUs feeding the cards).


What did those 4 1080ti's and waterblocks cost? $4K? Or more?


----------



## RoBiK

8051 said:


> What did those 4 1080ti's and waterblocks cost? $4K? Or more?


Including VAT at today's EUR/USD exchange rate it would be almost exactly $4K.


----------



## kithylin

I just thought I would put out a warning and disclaimer to folks out there. If you are trying to buy a GTX 1080 Ti or 1080 today, it's probably best to avoid eBay right now if at all possible. There's currently an abundance of coin miners selling their used, mined-on GPUs listed as "New" on eBay, or "New other". I saw several listings yesterday of "New" 1080 Tis with visible dust in the fan blades. People are being careful how they open the packaging and with the sticker seals, putting cards back in the boxes to look like new and resealing them. Right now the safe bet is to buy directly from Amazon, newegg.com, or another reputable dealer. Or EVGA occasionally has some on their website to buy direct. Selling used as new on eBay is a violation of eBay seller terms and not allowed, but the problem is enforcement.

Also remember, if you do go through eBay and buy something that's listed as new but isn't when you open it: don't message the seller directly; open a return through the eBay resolution center instead. If you message a seller directly about returning something, they can deny it, refuse, try to get you to pay return shipping, try to saddle you with restocking fees, all sorts of nonsense. If you use the resolution center, you're guaranteed free return shipping, and the seller has to accept it after the 5th day even if they say no, argue, and deny it. It doesn't matter; eBay will force it on them if it goes to the 5th day and they haven't agreed by then, and through the resolution center they can't tack on restocking fees. You get the full value back. At least in the USA.


----------



## Thoth420

Hey all /wave

When I am gaming, my max temps are as follows on the 1080 Ti FTW3 Gaming (as shown by EVGA Precision X):

Core: 67C
Power: 72C
Memory: 69C

Is it normal for the GPU itself to be the lowest temperature on the list? Stock clocks at the moment by the way


----------



## kithylin

Thoth420 said:


> Hey all /wave
> 
> When I am gaming, my max temps are as follows on the 1080 Ti FTW3 Gaming (as shown by EVGA Precision X):
> 
> Core: 67C
> Power: 72C
> Memory: 69C
> 
> Is it normal for the GPU itself to be the lowest temperature on the list? Stock clocks at the moment by the way


Yes, this is quite normal and well within acceptable margins. The Power (AKA VRM) temperature usually runs +15 to +20°C hotter than the core in most video cards. There are some unusual exceptions where the Power/VRM will run cooler than the core, but that's not typical. It's also not unusual for most GPUs, especially 1080 Tis, to run up into the mid-70s and 80s °C depending on the game you're playing, resolution and other factors (how hard you're working it), with the VRM/Power around 90-100°C at the same time. That's usually how Founders Edition or reference cards behave, though; AIB / aftermarket board-partner cards like your EVGA FTW3 usually run a lot cooler in general.


----------



## ThrashZone

Thoth420 said:


> Hey all /wave
> 
> When I am gaming, my max temps are as follows on the 1080 Ti FTW3 Gaming (as shown by EVGA Precision X):
> 
> Core: 67C
> Power: 72C
> Memory: 69C
> 
> Is it normal for the GPU itself to be the lowest temperature on the list? Stock clocks at the moment by the way


Hi,
Yes, power is usually the highest temp.
Yours are on the high side; I would suggest a custom fan curve rather than the default.
Try to live with a little more noise and turn up the music.
Personally, I set all three fans to a curve from 30°C at 30% fan, in a straight line up to 60°C at 100%.
For benchmarking, just swing everything to 100% and call it a day, with a 55°C max temp at full load.
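That straight-line curve is easy to express; here's a minimal sketch of the mapping described above (30% fan at 30°C rising linearly to 100% at 60°C — the function name and endpoint parameters are just for illustration):

```python
def fan_percent(temp_c, low=(30, 30), high=(60, 100)):
    """Linear fan curve like the one described above:
    30% fan at 30 degrees C, rising in a straight line to 100% at 60."""
    (t_lo, f_lo), (t_hi, f_hi) = low, high
    if temp_c <= t_lo:
        return f_lo
    if temp_c >= t_hi:
        return f_hi
    # straight-line interpolation between the two curve points
    return f_lo + (f_hi - f_lo) * (temp_c - t_lo) / (t_hi - t_lo)

print(fan_percent(45))  # 65.0 -- halfway between the endpoints
```

Tools like Precision X and Afterburner interpolate between the curve points you drag in much the same way; this just shows the arithmetic behind one segment.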


----------



## kithylin

ThrashZone said:


> Hi,
> Yes, power is usually the highest temp.
> Yours are on the high side; I would suggest a custom fan curve rather than the default.
> Try to live with a little more noise and turn up the music.
> Personally, I set all three fans to a curve from 30°C at 30% fan, in a straight line up to 60°C at 100%.
> For benchmarking, just swing everything to 100% and call it a day, with a 55°C max temp at full load.


Personally, I wouldn't consider 72°C for power anywhere near the "high side" for an air-cooled card. AIB air-cooled 1080 Tis can easily handle up to 80-85°C on the core and VRM/power up to 100°C with no ill effects. I may be wrong, but I'm pretty sure that's "out of the box" behavior for Founders Edition / reference cards.


----------



## ThrashZone

Hi,
Oops, thought I was on the FTW3 thread.
You're right, a poor single-fan cooler will run hotter, but you should still use a custom fan profile.
Just pick a noise level you can live with.


----------



## 8051

kithylin said:


> I just thought I would put out a warning and disclaimer to folks out there. If you are trying to buy a GTX 1080 Ti or 1080 today, it's probably best to avoid eBay right now if at all possible. There's currently an abundance of coin miners selling their used, mined-on GPUs listed as "New" on eBay, or "New other". I saw several listings yesterday of "New" 1080 Tis with visible dust in the fan blades. People are being careful how they open the packaging and with the sticker seals, putting cards back in the boxes to look like new and resealing them. Right now the safe bet is to buy directly from Amazon, newegg.com, or another reputable dealer. Or EVGA occasionally has some on their website to buy direct. Selling used as new on eBay is a violation of eBay seller terms and not allowed, but the problem is enforcement.
> 
> Also remember, if you do go through eBay and buy something that's listed as new but isn't when you open it: don't message the seller directly; open a return through the eBay resolution center instead. If you message a seller directly about returning something, they can deny it, refuse, try to get you to pay return shipping, try to saddle you with restocking fees, all sorts of nonsense. If you use the resolution center, you're guaranteed free return shipping, and the seller has to accept it after the 5th day even if they say no, argue, and deny it. It doesn't matter; eBay will force it on them if it goes to the 5th day and they haven't agreed by then, and through the resolution center they can't tack on restocking fees. You get the full value back. At least in the USA.


Someone sold me a New-Other Samsung 850 Pro 256 GiB that turned out to have well over 500 GiB already written to the SSD. I contacted the seller directly and got an immediate discount.

Can't you contact the seller first, and then, if they don't resolve it to your satisfaction, dispute the transaction?


----------



## kithylin

8051 said:


> Someone sold me a New-Other Samsung 850 Pro 256 GiB that turned out to have well over 500 GiB already written to the SSD. I contacted the seller directly and got an immediate discount.
> 
> Can't you contact the seller first, and then, if they don't resolve it to your satisfaction, dispute the transaction?


Sure, you can. But by contacting them directly, they can say or do anything they want and try to charge you anything they wish. They can try to make you pay restocking fees. They can try to make you pay return shipping, etc. With the actual "File a return" system they can't do any of that. Sellers are actually not allowed to charge you return shipping or restocking fees per eBay. But many sellers try to say "Contact us first! Do not open a return case!" These sorts of eBay sellers should be avoided at all costs, because they're obviously trying to charge you unnecessary crap like this.

Also everyone should note that on ebay USA, even if the seller says "Returns not accepted" in the listing, per ebay rules they have to accept returns via the "Return an item" under purchase history, regardless of what they set in the listing. With the "Return an item" / return case, sellers can not refuse, can not deny and can not stop it. They must accept the returns no matter what is said in the listing. This applies to new, new-other, and used items.

Also you can completely ignore any and all "Seller's terms" in any listing as those do not apply on ebay and are not enforceable. A lot of folks don't know this about ebay.

The reason I say this is that the percentage of honest sellers on eBay who will actually own up to a return (and refund your full purchase price without making you pay return shipping) is extremely low today, to the point of being almost non-existent. We buyers absolutely must open a return case first if we actually want our money back for something.

Anyway.. it's still best to buy direct from reputable sources at the moment until things settle down later in the year for video cards.


----------



## Thoth420

ThrashZone said:


> Hi,
> Oops thought I was on the ftw3 thread
> You're right poor single fan cooler will run hotter but should still use a custom fan profile
> Just pick a noise level they can live with


It is; I have the FTW3, and the profile is set to aggressive in Precision.


----------



## 8051

kithylin said:


> Sure, you can. But by contacting them directly, they can say or do anything they want and try to charge you anything they wish. They can try to make you pay restocking fees. They can try to make you pay return shipping, etc. With the actual "File a return" system they can't do any of that. Sellers are actually not allowed to charge you return shipping or restocking fees per eBay. But many sellers try to say "Contact us first! Do not open a return case!" These sorts of eBay sellers should be avoided at all costs, because they're obviously trying to charge you unnecessary crap like this.
> 
> Also everyone should note that on ebay USA, even if the seller says "Returns not accepted" in the listing, per ebay rules they have to accept returns via the "Return an item" under purchase history, regardless of what they set in the listing. With the "Return an item" / return case, sellers can not refuse, can not deny and can not stop it. They must accept the returns no matter what is said in the listing. This applies to new, new-other, and used items.
> 
> Also you can completely ignore any and all "Seller's terms" in any listing as those do not apply on ebay and are not enforceable. A lot of folks don't know this about ebay.
> 
> The reason I say this is that the percentage of honest sellers on eBay who will actually own up to a return (and refund your full purchase price without making you pay return shipping) is extremely low today, to the point of being almost non-existent. We buyers absolutely must open a return case first if we actually want our money back for something.
> 
> Anyway.. it's still best to buy direct from reputable sources at the moment until things settle down later in the year for video cards.


Thanks for that info kithylin. I bought my 1080Ti right here on overclock.net and it worked out great.


----------



## sblantipodi

Is there someone with a 1080 Ti and a Haswell-E CPU?
Is there a big difference in 4K with a 1080 Ti running on a Haswell-E versus an 8700K?


----------



## toncij

sblantipodi said:


> Is there someone with a 1080 Ti and a Haswell-E CPU?
> Is there a big difference in 4K with a 1080 Ti running on a Haswell-E versus an 8700K?


I have one. It depends on the clocks. My 5960X runs at 4.5 and the 8700K runs at 5.2, but the difference is not as big as you'd expect. To be honest, I'd have to test. What titles do you need?


----------



## sblantipodi

toncij said:


> I have one. It depends on the clocks. My 5960X runs at 4.5 and the 8700K runs at 5.2, but the difference is not as big as you'd expect. To be honest, I'd have to test. What titles do you need?


AC Origins and some other CPU-demanding games.


----------



## KedarWolf

sblantipodi said:


> Is there someone with a 1080 Ti and a Haswell-E CPU?
> Is there a big difference in 4K with a 1080 Ti running on a Haswell-E versus an 8700K?


On my 5960X at 4.742 GHz with memory a bit over 3200, the best TimeSpy score I got was 11298.

On my 8700K at 5.2 GHz with 4200 MHz memory, the best score was around 11200.

The graphics score was higher on the 8700K; just the CPU score was lower, with fewer cores and only dual-channel memory instead of the quad channel on the 5960X.


----------



## cekim

sblantipodi said:


> Is there someone with a 1080 Ti and a Haswell-E CPU?
> Is there a big difference in 4K with a 1080 Ti running on a Haswell-E versus an 8700K?


First - 4K is GPU limited - so the difference in CPU is going to be less than it would be at lower resolutions. As a practical matter, no, there isn't a huge difference. Either the 8700K or a 6/8-core HW-E is going to do just fine.

Second, there are a lot of variables here... 

I've run the following combos:
1080Ti SLI

on:

2x2696 v3 (3.8GHz - ucode mod to get higher "all" and many-core turbos ~9-10 cores @3.8)
5960x (4.4 and 4.7GHz)
6950x (4.4GHz)
7980XE (4.5GHz)
6700K (4.6GHz)

(notice the OC's - that's critical)

Overall, in-game, there is not a huge difference among any of them. The characteristics of moving between them break down as:
- 2x Xeon had occasional stutter, likely due to QPI issues and the lower memory speed inherent to the platform
- 6700K had comparatively high micro-stutter/stutter in SLI, likely due to its lack of cores (4 vs 8/10/36), fewer memory channels (2 vs 4 vs 8), and finally PCIe lanes (the HEDT parts run SLI at 16x16, vs the 6700K's 8x8 or 16x8)

For the HW/BW/SKL progression:
- HW/BW it will depend wildly on your games - specifically its ability to use more cores or not.
- Clock rate and memory speed are king for max frame rates, and clock-for-clock, core-for-core, BW is better than both HW and SKL in games, though not by a huge margin.

A 4.8+GHz 8700K is a screamer though... Particularly for games, it's hard to go wrong there...

If the 8700K only had 4 cores, then I think it would be a no-brainer to go with an OC'd 5960X for overall performance (average/minimum frames, "smoothness" of play), PROVIDED you have OC'd your 5960X to 4.3+ GHz. Stock for stock, the 8700K is going to be better.

TL;DR - you aren't going to cripple your 1080 Ti with a HW-E chip for games. If benchmarking is your aim, you will see the difference; but for playing games, having more than 4 cores is critical, as are the highest clocks and memory speeds you can reasonably get. Particularly at 4K, the rest is in the noise.
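To put the GPU-limited point in concrete terms, here's a toy frame-time model; all the per-frame millisecond costs are made-up illustrative numbers, not benchmarks:

```python
# Toy model: when CPU and GPU work overlap, per-frame time is set by the
# slower stage, so frame rate goes GPU-bound once GPU work dominates.
# All numbers below are illustrative assumptions, not measurements.
def fps(cpu_ms, gpu_ms):
    """Frames per second, limited by whichever stage takes longer."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_fast, cpu_slow = 5.0, 7.0   # hypothetical per-frame CPU cost: 8700K vs HW-E
gpu_1080p, gpu_4k = 6.0, 20.0   # hypothetical 1080 Ti render cost per frame

# At 1080p the faster CPU shows up; at 4K both CPUs land on the same fps.
print(round(fps(cpu_fast, gpu_1080p)))  # 167
print(round(fps(cpu_slow, gpu_1080p)))  # 143
print(round(fps(cpu_fast, gpu_4k)))     # 50
print(round(fps(cpu_slow, gpu_4k)))     # 50
```

The model ignores pipelining and driver overhead, but it captures why the CPU gap shrinks as the GPU cost per frame (resolution) rises.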


----------



## toncij

If you're going SLI, I'd go with HEDT:


----------



## shadow85

C'mon, bring on the 1180; the 1080 Ti is still not a true 4K gaming card! In SLI it does mostly well, but it still gets stomped by a few games, Kingdom Come: Deliverance for example.


----------



## Blameless

sblantipodi said:


> is there a big difference in 4K with a 1080Ti running an Haswell-E over an 8700K?


Depends on game, but at 4k, I'm generally completely GPU limited with a 1080Ti at 2025MHz and a 4.3GHz 5820K.


----------



## amd955be5670

So Nvidia seems to be generously selling the 1080 Ti FE in my country for $854 (US converted), whereas most retailers are stocking AIBs, with some blower versions starting at ~$1125.

I'm still unsure whether to pull the trigger. I guess I'd just hate a throttled card.

My back story is, I went to great lengths tweaking my 970's BIOS so that even at 82-85C in demanding games (Witcher 3) it can maintain its boost clock of 1519MHz. Boy, that was satisfying.

So yeah, @internet, how do I proceed?


----------



## ThrashZone

amd955be5670 said:


> So Nvidia seems to be generously selling the 1080Ti FE in my country for 854$ (US converted) whereas most retailers are stocking AIBs and some blower versions starting at 1125$~.
> 
> I am still confused so as to pull the trigger or not. Guess I just would hate a throttled card.
> 
> My back story is, I went through great miles tweaking my 970's BIOS so that even at 82~85c in demanding games (witcher3) it can maintain its boost clocks of 1519mhz. Boy that was satisfying.
> 
> So yeah @*internet* how do I proceed.


Hi,
Seems like an "I'd buy" deal :thumb:
Although I would also buy a Hybrid kit too


----------



## chibi

Good deal if you can actually order one. Nvidia is showing all their cards out of stock at their store, lol.


----------



## kithylin

amd955be5670 said:


> So Nvidia seems to be generously selling the 1080Ti FE in my country for 854$ (US converted) whereas most retailers are stocking AIBs and some blower versions starting at 1125$~.
> 
> I am still confused so as to pull the trigger or not. Guess I just would hate a throttled card.
> 
> My back story is, I went through great miles tweaking my 970's BIOS so that even at 82~85c in demanding games (witcher3) it can maintain its boost clocks of 1519mhz. Boy that was satisfying.
> 
> So yeah @internet how do I proceed.


The new GTX 1180 is supposed to come out sometime this year, though it's unknown exactly when. I would probably wait.


----------



## 8051

cekim said:


> First, - 4K is GPU limited - so the difference in CPU is going to be less than it would at lower resolutions. As a practical matter, no, there isn't a huge difference. Either the 8700K or a 6/8 core HW-E is going to do just fine.
> 
> Second, there are a lot of variables here...
> 
> I've run the following combos:
> 1080Ti SLI
> 
> on:
> 
> 2x2696 v3 (3.8GHz - ucode mod to get higher "all" and many-core turbos ~9-10 cores @3.8)
> 5960x (4.4 and 4.7GHz)
> 6950x (4.4GHz)
> 7980XE (4.5GHz)
> 6700K (4.6GHz)
> 
> (notice the OC's - that's critical)
> 
> Overall, In game, there is not a huge difference among any of them - the characteristics of moving between them can be broken down as:
> - 2xXeon had occasional stutter likely due to QPI issues and lower memory speed inherent to the platform
> - 6700K had a comparatively high micro-stutter/stutter in SLI likely due to lack of cores (4 vs 8,10,36), few memory channels (2 vs 4 vs 8) and finally PCIe lanes (16x16 vs 8x8 or 16x8 of the 6700K)
> 
> For the HW/BW/SKL progression:
> - HW/BW it will depend wildly on your games - specifically its ability to use more cores or not.
> - Clock rate and memory speed are king for max frame-rates and clock-for-clock-core-for-core BW is better than both HW and SKL in games, but not by a huge margin.
> 
> A 4.8+GHz 8700k is a screamer though... Particularly for games, its hard to go wrong there...
> 
> If it only had 4 cores, then I think it would be a no-brainer to go with an OC'd 5960x for over-all performance (average/minimum frames - "smoothness" of play) PROVIDED you have OC'd your 5960x to 4.3+ GHz. Stock for stock, the 8700K is going to be better.
> 
> TL;DR - you aren't going to cripple your 1080ti with a HWE chip for games. If benchmarking is your aim, you will see it, but playing games, having more than 4 cores is criticial, the highest clocks and memory speeds you can get within reason is critical. Particularly at 4K, the rest is in the noise.


Which CPU has 8 memory channels? The bandwidth must be incredible.


----------



## RoBiK

8051 said:


> Which CPU has 8 memory channels? The bandwidth must be incredible.


AMD Epyc


----------



## sblantipodi

toncij said:


> If you're going SLI, I'd go with HEDT: https://www.youtube.com/watch?v=8KOTiee5RqA


I have a 40-lane 5930K and I would like to upgrade my GTX 980 Ti SLI to newer cards, probably the next 1180 in SLI.
Since I'm going to spend a lot on GPUs, I don't want to be CPU-limited, but I also don't want to waste money on things I don't need.

So, will my 40-lane 5930K limit an 1180 SLI setup compared to an 8700K or newer CPU?


----------



## kithylin

sblantipodi said:


> I have a 5930K 40 lanes CPU and I would like to upgrade my GTX980 Ti SLI to newer cards, probably the next 1180 in SLI.
> Since I am going to spend a lot on GPUs I don't want to be CPU limited but I even don't want to waste money for things I don't need.
> 
> So, is my 5930K 40 lanes CPU will limit a 1180 SLI when compared to a 8700K or newer CPU?


I'm assuming that if you're running a 980 Ti SLI set, you're trying to game in 4K. If that's the case, then it doesn't matter which CPU; that's a GPU-limited area. That said, do go with an HEDT platform: the 8700K only allows 8x/8x for SLI graphics and will limit it slightly.


----------



## sblantipodi

Is anyone getting BSODs with the latest drivers?
I'm getting this BSOD:
KMODE_EXCEPTION_NOT_HANDLED


----------



## mtbiker033

A friend of mine has an EVGA GTX 1080 Ti SC2 HYBRID (11G-P4-6598-KR) and is forced to use Precision XOC with it. He tells me that without Precision running, the GPU fan stops, won't do anything, and MSI AB won't work with it at all. Anyone run into this before?

Now I have an FE that I put a hybrid kit on and then flashed it to the FTW3 bios and MSI AB controls my gpu fan perfectly and I have my push/pull radiator fans going to a fan controller so I control those. I just coiled up the rad fan inside the gpu shroud and not using it. 

If he were to flash his bios to a different bios would he gain MSI AB control of the gpu fan on his card? He's doing what I'm doing now and has his rad fans to his fan controller too.


----------



## kithylin

mtbiker033 said:


> a friend of mine has a EVGA GTX 1080 Ti SC2 HYBRID 11G-P4-6598-KR and is forced to use precision XOC with it. he's telling me that without precision running the gpu fan stops, like won't do anything and MSI AB won't work with it at all. anyone run into this before?
> 
> Now I have an FE that I put a hybrid kit on and then flashed it to the FTW3 bios and MSI AB controls my gpu fan perfectly and I have my push/pull radiator fans going to a fan controller so I control those. I just coiled up the rad fan inside the gpu shroud and not using it.
> 
> If he were to flash his bios to a different bios would he gain MSI AB control of the gpu fan on his card? He's doing what I'm doing now and has his rad fans to his fan controller too.


You can't use PrecisionX and MSI Afterburner on the same computer at the same time. You have to uninstall one before using the other; that's probably why MSI AB doesn't work: he didn't uninstall PrecisionX first. Even if you close Precision XOC, merely having it installed will conflict with MSI AB and prevent it from working, even when it isn't running. This is an old, well-documented issue with all Nvidia cards going back many years and card families.


----------



## outofmyheadyo

How much are your 1080 Tis pulling during gaming? I see my 1080 Ti GameRock hit up to 331W with the stock BIOS and no overclocks, just +16% power limit.
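For anyone wanting to log their own draw, nvidia-smi can report board power; below is a minimal Python sketch of parsing that output. The `sample_output` string is made up for illustration; on a real system you'd capture the real output with `subprocess.check_output`.

```python
# Parse power readings as printed by:
#   nvidia-smi --query-gpu=power.draw --format=csv,noheader,nounits
# sample_output is a made-up stand-in for that output (one line per GPU);
# on a real system, capture it with subprocess.check_output().
sample_output = "287.45\n"
watts = [float(line) for line in sample_output.splitlines() if line.strip()]
print(watts)         # [287.45]

# Track the peak across repeated polls (simulated here with made-up samples).
samples = [251.3, 287.45, 331.0, 298.7]
print(max(samples))  # 331.0
```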


----------



## kithylin

outofmyheadyo said:


> How much are your 1080ti pulling during gaming? I see my 1080ti Gamerock upto 331w with the stock bios and no overclocks, just +16% power limit.


Some people have reported seeing in excess of 500W+ with XOC bios, overclocking and certain game titles running in 4K.


----------



## outofmyheadyo

Yes, what I'm interested in is the power draw of different cards on their stock BIOS.


----------



## kithylin

outofmyheadyo said:


> Yes, what I am interested is powerdraw with stock bios of different cards.


https://www.techpowerup.com/vgabios...X+1080+Ti&interface=&memType=&memSize=&since=

Hold Ctrl and left-click each one to open it in a new tab, then just go down the list.

There is a section under "Bios Internals" that looks like this:

Board power limit
Target: 250.0 W
Limit: 300.0 W

The "Limit: 300.0W" is how much power each different model of card will use even with the power slider pushed over as far as it will go.

Some cards are 300W, some are 360W, etc. The cards can pull up to and in excess of 500W in some situations, so if you're interested in stock power usage, then these bios's will tell you because having a default bios on any card will be limited entirely by which bios it shipped with. And here you can view bios's, and limits of nearly all models of card.
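As a rough sketch of how the slider and these BIOS values interact (my own reading of the relationship, using the 250 W target / 300 W limit figures above):

```python
# Sketch of the board power cap: the power slider scales the target power,
# and the BIOS "Limit" clamps the result. This models the relationship
# described above; exact board behavior may differ.
def effective_cap(target_w, slider_pct, limit_w):
    """Power cap in watts for a given power-slider percentage."""
    return min(target_w * (100 + slider_pct) / 100.0, limit_w)

# 250 W target / 300 W limit BIOS, as in the example above:
print(effective_cap(250.0, 0, 300.0))    # 250.0  (slider at default)
print(effective_cap(250.0, 16, 300.0))   # 290.0  (+16% power limit)
print(effective_cap(250.0, 50, 300.0))   # 300.0  (clamped at the BIOS limit)
```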


----------



## mtbiker033

kithylin said:


> You can't use precisionX and MSI Afterburner on the same computer at the same time. You have to uninstall one before using the other, that's probably why MSI AB doesn't work: he didn't uninstall precisionX first. Even if you close PrecisionX OC, just that it's installed in the system will conflict with MSI AB and prevent it from working, even if it's not running at the moment. This is an old, well documented issue with all nvidia cards going back many years and families.


Thank you for the reply! Yes, we're aware of that; he had removed MSI AB and only has Precision installed.


----------



## gavros777

Hello guys, is a 1080 Ti as good as 2 Titan X Maxwells OC'd at 1450 in SLI?
My cards produce too much heat and I'm thinking of moving to a single card.


----------



## SavantStrike

gavros777 said:


> Hello guys, is a 1080ti as good as 2 titan x maxwell OC at 1450 in sli?
> My cards produce too much heat and thinking to move to a single card.


It's not quite double the speed, but it's a better solution since some titles don't support SLI.


----------



## sblantipodi

toncij said:


> If you're going SLI, I'd go with HEDT: https://www.youtube.com/watch?v=8KOTiee5RqA





gavros777 said:


> Hello guys, is a 1080ti as good as 2 titan x maxwell OC at 1450 in sli?
> My cards produce too much heat and thinking to move to a single card.


A pointless sidegrade, imho.


----------



## SavantStrike

sblantipodi said:


> unuseful sidegrade imho


We would need to know what games are being played. If there's a lot of VR, I'd rather have a single 1080 than SLI TXMs.


----------



## 8051

sblantipodi said:


> unuseful sidegrade imho


This.


----------



## mouacyk

gavros777 said:


> Hello guys, is a 1080ti as good as 2 titan x maxwell OC at 1450 in sli?
> My cards produce too much heat and thinking to move to a single card.


You're going to get way more consistent frame times in everything, even if max fps is lower. You'd also halve the power consumption and heat output, as you noted. For general gaming, it's definitely worth moving to a single GPU. Most multi-GPU setups nowadays are for very specific needs: select SLI-supported games, ultra resolutions on multiple monitors, rendering, benchmarks, and the like.


----------



## 8051

mouacyk said:


> You're going to get way more consistent frame times in everything, even if max fps is lower. Also, you halved the power consumption and heat output, as you noted. For general gaming, definitely worth moving to single GPU. Most multi-GPU setups now-a-day are used for very specific needs -- select SLI-supported games, ultra resolutions on multiple monitors, rendering, benchmarks, and the like.


In BF1, if the SLI Titan X maxwells are on a 2x16 lane PCIe 3.0 setup, the 1080Ti will be noticeably slower in minima and maxima, but this game's performance is an outlier.


----------



## gavros777

Thanks for the info guys. I noticed the following in Dying Light: only during cutscenes, the temperature of the first card jumps to 92C, then drops back down to 60C.
It scared me a lot, as I have a 280mm AIO cooling solution on my first card and was worried it had died on me.
Another issue I have is that I use a custom BIOS and am stuck with a very old driver, as the new ones remove the overclocking on my cards.
I will try to wait, per your advice, as long as I don't get forced to update my drivers.

By the way, why is Nvidia messing with overclocking on older cards? Are they trying to force us to upgrade?


----------



## MightEMatt

gavros777 said:


> Thanks for the info guys, i noticed in dying light the following only during cutscenes the temperature of the first card jumps to 92c then drops back down to 60c.
> It scared me a lot as i have a 280mm aio cooling solution on my first card and was worrying if it died on me.
> Another issue i have with my cards is that i use a custom bios and i'm stuck with a very old driver as the new ones remove the overclocking on my cards.
> I will try to wait thanks to your advises as long i dont get forced to update my drivers.
> 
> By the way why nvidia is messing with the overclocking on older cards, are they trying to force us to upgrade?



Everything you describe sounds like bugs related to using custom BIOS. I have 500 series cards that still overclock on the newest drivers, and the temperature spikes sound like the result of running really aggressive loads with no power limit. That or a bad cooler mount.


----------



## trickeh2k

A bit late to the party, but I finally have my new system up and running  https://www.techpowerup.com/gpuz/c9x86 Now, are there any performance gains from custom vBIOSes, or are the results more like +1-3%?

EDIT: quick 'n dirty bench i did last night. https://www.3dmark.com/3dm/26421339


----------



## kithylin

gavros777 said:


> Thanks for the info guys, i noticed in dying light the following only during cutscenes the temperature of the first card jumps to 92c then drops back down to 60c.
> It scared me a lot as i have a 280mm aio cooling solution on my first card and was worrying if it died on me.
> Another issue i have with my cards is that i use a custom bios and i'm stuck with a very old driver as the new ones remove the overclocking on my cards.
> I will try to wait thanks to your advises as long i dont get forced to update my drivers.
> 
> By the way why nvidia is messing with the overclocking on older cards, are they trying to force us to upgrade?


Sounds like you're playing with G-Sync/V-Sync disabled. Turn that back on and you won't have spikes like that anymore. Or, if you really want V-Sync off, at least use a frame limiter via Nvidia Inspector.

Also, with Nvidia, the BIOS is set in the card at the hardware level; Nvidia cannot detect, change, or do anything software-side in relation to it. If your card is screwing up and crashing after you change software, it's not "Nvidia screwing with us": you have an unstable overclock, so add more voltage or reduce the overclock. Or, if it's been running like that for many years, it has probably degraded and doesn't work as fast as it did years ago.


----------



## 8051

gavros777 said:


> Thanks for the info guys, i noticed in dying light the following only during cutscenes the temperature of the first card jumps to 92c then drops back down to 60c.
> It scared me a lot as i have a 280mm aio cooling solution on my first card and was worrying if it died on me.
> Another issue i have with my cards is that i use a custom bios and i'm stuck with a very old driver as the new ones remove the overclocking on my cards.
> I will try to wait thanks to your advises as long i dont get forced to update my drivers.
> 
> By the way why nvidia is messing with the overclocking on older cards, are they trying to force us to upgrade?


Dying Light, after I jacked up the view distance, is the game I have that puts the most load on my 1080Ti. I use it to test my overclocks, along with Unigine's Heaven.


----------



## gavros777

kithylin said:


> Sounds like you're playing with gsync/v-sync disabled. Turn that back on and you won't have spikes like that anymore. Or if you really want vsync off, at least use a frame limiter via nvidia inspector.
> 
> Also with nvidia, the bios is set in the card at the hardware level. Nvidia can not detect nor change nor do anything software side in relation to that. If your card is screwing up and crashing after you change software it's not "nvidia screwing with us" you have an unstable overclock, add more voltage or reduce the overclock. Or if it's been running like that for many years it's probably degraded and doesn't work as fast as it did years ago.


Thanks for the V-Sync info!
About Nvidia messing with the Titan X Maxwell's custom BIOSes, check the Titan X thread. All the custom BIOSes posted there are incompatible with Nvidia's new drivers, and everyone using them is affected. I'm a big Nvidia fan and always troll AMD fanboys at wccftech, lol, but I can't help feeling kind of weird about Nvidia on this issue.


----------



## KedarWolf

trickeh2k said:


> A bit late to the part but finally have my new system up and running  https://www.techpowerup.com/gpuz/c9x86 Now, is there any performance gains from any custom vbioses or are the results like +1-3%?
> 
> EDIT: quick 'n dirty bench i did last night. https://www.3dmark.com/3dm/26421339


On XOC BIOS. Not much more.  

https://www.3dmark.com/3dm/26444640

15,297

And that's a benching run; I run much slower 24/7. I'll post what I get at my normal clocks. brb.

https://www.3dmark.com/3dm/26444738 Not that much less at normal clock speeds. Strange. 

14,918


----------



## kithylin

gavros777 said:


> Thanks for the vsync info!
> About nvidia messing with titan x maxwell's custom bios check the titan x thread. All custom bios posted there are incompatible with nvidia's new drivers and everyone who is using them is affected. I'm a big nvidia fan and always troll amd fanboys at wccftech lol but cant help but feel kinda weird with nvidia on this issue.


I didn't know that, I've never heard of such a thing. Must be a titan thing, or maybe something to do with windows 10. I wonder if they work fine in win7.


----------



## Hl86

gavros777 said:


> Hello guys, is a 1080ti as good as 2 titan x maxwell OC at 1450 in sli?
> My cards produce too much heat and thinking to move to a single card.


SLI is dead; Far Cry 5's poor multi-GPU performance hammered the last nail into the coffin for me.


----------



## KCDC

Anyone get any strange screen glitches while on the XOC BIOS? Happened to me on all three monitors the other night, strange icons and such all over the screens, I should've taken a pic.

I was only browsing the web, nothing taxing.


----------



## urbanman2004

Where's the 1070 Owner's club


----------



## kithylin

urbanman2004 said:


> Where's the 1070 Owner's club


The built-in search on these forums doesn't find it. But if you use Google and restrict the results to this website only, it's the first result. Here's how to do that: https://www.google.com/search?q=107...rome..69i57.4052j0j4&sourceid=chrome&ie=UTF-8


----------



## Gurkburk

So I repadded my MSI 1080 Ti Gaming X, and while at it I put liquid metal on it. I got a good temp drop from it. But I've received backlash from some people saying it won't be good in the long run.

What do you guys think? Should I remove it?


----------



## gavros777

Hl86 said:


> Sli is dead, far cry 5 poor multi gpu performance hammered the last nail in the coffin for me.


Why did they kill SLI?
It saved my ass back in 2015 when I went 4K.
I was thinking of going 5K in the near future, but since they killed SLI it doesn't seem like a good idea now.


----------



## Kronos8

urbanman2004 said:


> Where's the 1070 Owner's club


http://www.overclock.net/forum/69-nvidia/1601546-official-nvidia-gtx-1070-owner-s-club-964.html


----------



## Unnatural

Hi, I'm trying to get back in the loop after months away (moving to a new house kept me really busy).
I have an FE that I'm going to watercool and flash with the XOC BIOS; back then, I also ordered SMD resistors and conductive epoxy to do the "resistor mod," just in case.
In a daily overclock scenario, does the latter offer any effective advantage over the BIOS alone, or can I just skip it and save myself the trouble? Thanks!


----------



## SavantStrike

Gurkburk said:


> So I repadded my msi 1080 ti gaming x, while at it I put liquid metal on it. I got a good temp drop from it. But I've received backlash from some people regarding it, that it won't be good in the long run.
> 
> What do you guys think? Should I remove it?


Liquid metal is gallium-based, and gallium reacts aggressively with aluminum and destroys it. The contact plate is nickel-plated, but it has two aluminum tabs that connect to the aluminum fins on the heatsink.

If the liquid metal moves or runs at all and hits those tabs, your heatsink is ruined. If it moves at all and hits one of the many surface mount components next to the GPU die, the card shorts and gets ruined.

It's not worth the risk of screwing around with liquid metal on an air-cooled GPU for a couple of degrees' drop (at best), when that drop isn't enough to even affect boost clocks. If you search, you'll find people who were very proud of a 2-3C drop only to have a dead GPU a year or two later. Liquid metal is only suitable if you seal the surface-mount components first, and should probably just be reserved for CPUs.


----------



## kithylin

Gurkburk said:


> So I repadded my msi 1080 ti gaming x, while at it I put liquid metal on it. I got a good temp drop from it. But I've received backlash from some people regarding it, that it won't be good in the long run.
> 
> What do you guys think? Should I remove it?


Here's a better example to get the point across. Liquid metal coolant contains gallium. People say "gallium reacts with aluminum"; to give you a visual representation of that, check out this two-minute video of someone putting gallium on a sheet of aluminum and see the results:




If the liquid metal you put on your GPU touches any aluminum part of the heatsink, or any aluminum at all, including the tabs around the GPU core on the card itself, this will be the result. In the video that is only a week of exposure. Gallium is extremely destructive to aluminum.

Even if it never touches aluminum, gallium also slowly soaks into copper. Many people wrongly assume that on copper the liquid metal "dries out" over time; in reality it is being absorbed by the copper. This is a much slower process than the reaction with aluminum, taking many months to over a year, but eventually the copper absorbs all of the gallium, leaving no thermal interface material at all between your GPU's heatsink and the core. Gallium will not turn copper brittle and make it "break off like crackers" the way it does with aluminum, but it does permanently reduce the copper's thermal transfer efficiency: after a year on liquid metal, even if you wipe it all off both heatsink and core, the heatsink will never run as cool as it did before the exposure. This is a chemical reaction between the two metals, not a fault of either product. Many people who have run liquid metal on copper heatsinks for a year or more report that "the temperatures are fine," even though they may be running 2-4C hotter than before the application; and when they clean it off, there is a dull grey residue on the copper that cannot be removed at all.

The LEAST reactive surface for gallium is nickel-plated copper. Nickel still reacts with gallium, but it takes much, much longer to show any effect; we're talking 3-5 years or more. This is why it's mostly safer to use on CPUs: the IHS (the metal lid over the CPU die) on Intel chips is nickel-plated copper and reacts the slowest.

In general it's a very bad idea to use on video cards, unless you are going to mount a nickel-plated-copper water block on one. And no, no one makes nickel-plated-copper air coolers for video cards; all air-cooled cards use either copper or aluminum.


----------



## AlbertoM

The heatsink (vapor chamber) of Founders Edition is nickel-plated copper.


----------



## kithylin

AlbertoM said:


> The heatsink (vapor chamber) of Founders Edition is nickel-plated copper.


I just searched Google and can confirm it is. But be that as it may, most AIB/aftermarket cards use bare copper.


----------



## pvt.joker

Just picked up my 1080 Ti FTW3 Hybrid two weeks ago; I love the Hybrid for keeping things even quieter.


----------



## 8051

pvt.joker said:


> Just picked up my 1080 TI FTW3 hybrid 2 weeks ago, love the hybrid to keep things even quieter.


The VRM on your model isn't watercooled, though. I had thought all AIB CLC/AIO cards had watercooled VRMs? If the VRMs aren't watercooled, I might as well go with a roll-your-own solution with the NZXT adapter plate.


----------



## ZealotKi11er

8051 said:


> The VRM for your model isn't watercooled though. I had thought all AIB CLC AIO cards had watercooled VRM's? If the VRM's aren't watercooled I might as well go with a roll-your-own solution w/the NZXT adapter plate.


The cooler for the VRM is huge and has a fan.


----------



## Gurkburk

Anyone with an MSI 1080 Ti Gaming X here? How's your overclocking?


----------



## ThrashZone

pvt.joker said:


> Just picked up my 1080 TI FTW3 hybrid 2 weeks ago, love the hybrid to keep things even quieter.


Hi,
Good choice, just don't scratch it :thumbsup:
http://www.overclock.net/forum/69-n...ting-rma-sli-pci-e-finger-connector-wear.html


----------



## Nenkitsune

Just got myself an EVGA 1080 Ti SC2 ICX card. Man, does this thing run cool.

I ended up doing a lot of upgrading to my PC this month. Went from a [email protected] to a [email protected] (and had to replace my board because I broke a pin... RIP Z170 board).

Then I went from a 970 at 1500MHz to this 1080 Ti.

I don't think it's quite sunk in yet just how bonkers a PC I have now, lol. Pretty much everything runs maxed out at 1440p, and VR games run with tons of supersampling now.


----------



## Garrett1974NL

Gurkburk said:


> Anyone with a Msi 1080 ti gaming x here? Hows your overclocking?


I have the MSI GTX 1080 Ti Armor OC, which has the same PCB... does that also count?
(OK, OK, it doesn't have the LED headers... big deal  )
I can run 2050 fully stable; whenever I get to 2076 it appears stable but will freeze within 15 minutes.
But it's just the silicon lottery anyway: some FE cards hit 2101 stable, some don't even hit 2050 stable.
Just the luck of the draw.


----------



## ZealotKi11er

I am bored of my 1080 Ti. Need something new and fresh.


----------



## cyronn

I've just got a Super JetStream 1080 Ti to replace my 1070 JetStream. Haven't had much chance to game on the new card yet, though.


----------



## pvt.joker

8051 said:


> The VRM for your model isn't watercooled though. I had thought all AIB CLC AIO cards had watercooled VRM's? If the VRM's aren't watercooled I might as well go with a roll-your-own solution w/the NZXT adapter plate.


Yeah, I think the fan on the VRMs should be OK. I'm not looking to do a ton of overclocking, just wanted something quiet. 



ThrashZone said:


> Hi,
> Good choice just don't scratch it :thumbsups
> http://www.overclock.net/forum/69-n...ting-rma-sli-pci-e-finger-connector-wear.html


I'd read that, and I haven't had to send anything back to EVGA since my old 470s. Hopefully I'll keep that streak going.


----------



## 2ndLastJedi

Nenkitsune said:


> Just got myself an EVGA 1080ti sc2 icx card. Man does this thing run cool.
> 
> I ended up doing a lot of upgrading this month to my pc. Went from a [email protected] to a [email protected] (and had to replace my board cause I broke a pin...rip z170 board)
> 
> Then I went from a 970 at 1500mhz to this 1080ti.
> 
> I don't think it's quite sank in yet just how bonkers of a PC I have now lol. Pretty much everything runs maxed out at 1440p and VR games run with tons of super sampling now.


I know the feeling; I went from a PS4 to a 6600K to a 7700K with a 1080 Ti, lol.
Loving 4K and VR with this setup!


----------



## slidero

I'll be upgrading from a 980 Ti to a 1080 Ti in a couple of days, whenever I receive it, and I'm pumped. I have a ROG Swift PG279Q, and the old 980 Ti is starting to struggle to keep up these days at high-fps 1440p.

Have a question though - do I need to run DDU before I put in the new card?


----------



## kithylin

slidero said:


> I'll be upgrading from a 980ti to a 1080ti in a couple days whenever I receive it and I'm pumped. I have a rog swift pg279q and the old 980ti is starting is struggling to keep up these days at high fps 1440p.
> 
> Have a question though - do I need to run DDU before I put in the new card?


Since you are going from current-driver-supported nvidia -> current-driver-supported nvidia, no. The drivers for your 980 Ti should also support the 1080 Ti. Should be just pop it in, auto-detect, reboot, done.

DDU is mainly either for nvidia -> AMD, or AMD -> nvidia, or very-old-nvidia -> current nvidia.


----------



## slidero

Thank you. I was surprised when I looked up the performance of the 980 ti vs 1080 ti, in some games at 1440p the fps nearly doubles.


----------



## menzy

Hello,
I have an Aorus 1080 ti wb edition. It's been running fine with ridiculously low temps. 

However my loop is contaminated and I've ripped everything apart to clean... Except for the graphic card as I'm not sure if this will:
1- void the warranty (card is 6 months old)?
2- I'm assuming I'll have to reapply thermal paste (can I use Coollaboratory Liquid Metal Pro?)
3- do I need new thermal pads or can I reuse the same ones?

Any other info is appreciated! 
Thanks


----------



## buellersdayoff

menzy said:


> Hello,
> I have an Aorus 1080 ti wb edition. It's been running fine with ridiculously low temps.
> 
> However my loop is contaminated and I've ripped everything apart to clean... Except for the graphic card as I'm not sure if this will:
> 1- void warranty {card is 6 months old}
> 2- I'm assuming I'll have to reapply thermal paste (can I use coollaboratory liquid metal pro)
> 3- do I need to have new thermal pads or could I reuse same ones?
> 
> Any other info is appreciated!
> Thanks


1- It may, depending on the country you live in; if there's no sticker on the screws, just do it.
2- I wouldn't bother with liquid metal; the risk is too high for the small gains you'd achieve. If you've got some spare paste lying around, just use that. If you want to make the most of it, get something like Thermal Grizzly Kryonaut.
3- You can reuse the original pads; try not to damage them. Again, if you want, you can buy some higher-quality pads for small gains.


----------



## VeritronX

You should be able to take the cover off and clean the block without removing it from the card. Either way there's probably a warranty void sticker involved. You could just use a cleaning product like system reboot to clean everything without pulling it all apart.


----------



## SavantStrike

menzy said:


> Hello,
> I have an Aorus 1080 ti wb edition. It's been running fine with ridiculously low temps.
> 
> However my loop is contaminated and I've ripped everything apart to clean... Except for the graphic card as I'm not sure if this will:
> 1- void warranty {card is 6 months old}
> 2- I'm assuming I'll have to reapply thermal paste (can I use coollaboratory liquid metal pro)
> 3- do I need to have new thermal pads or could I reuse same ones?
> 
> Any other info is appreciated!
> Thanks


Try running cleaner through it first. If that fails then carefully take the block off the card and determine if you can get the block apart from only the screws on the back (you probably can't). If you can't, then the aorus logo needs to come off the front. There are no warranty stickers on your particular card, so if you go slow you should be okay.

There have been countless warnings against liquid metal on GPUs in this thread - don't do it.

You can reuse thermal pads.

Unless it's really bad, I recommend you just use cleaner and call it a day.


----------



## 8051

slidero said:


> Thank you. I was surprised when I looked up the performance of the 980 ti vs 1080 ti, in some games at 1440p the fps nearly doubles.


Really? I didn't notice anything like that when I upgraded from my overclocked 980 Ti (1450 MHz) to my overclocked 1080 Ti (2000 MHz stable, until I get better cooling).

I don't think I got a 100% increase in FPS in anything except really old games -- and some saw almost no gains in FPS (cough, GTAIV, cough).


----------



## menzy

Thanks dude. Pulled it all apart and everything is back to normal. Without the stickers on the front, the naked cover looks best.



buellersdayoff said:


> menzy said:
> 
> 
> 
> Hello,
> I have an Aorus 1080 ti wb edition. It's been running fine with ridiculously low temps.
> 
> However my loop is contaminated and I've ripped everything apart to clean... Except for the graphic card as I'm not sure if this will:
> 1- void warranty {card is 6 months old}
> 2- I'm assuming I'll have to reapply thermal paste (can I use coollaboratory liquid metal pro)
> 3- do I need to have new thermal pads or could I reuse same ones?
> 
> Any other info is appreciated!
> Thanks
> 
> 
> 
> 1 it may depending on the country you live, if there is no sticker on the screws just do it
> 2 wouldn't bother with liquid metal, the risk is too high for what small gains you would achieve, if you've got some spare paste lying around just use that. If you want to make the most of it get something like thermal grizzly kryonaut.
> 3 you can reuse the original pads, try not to damage them, again if you want to you can buy some higher quality pads for small gains.
Click to expand...


----------



## GraphicsWhore

8051 said:


> Really? I didn't notice anything like that when I upgraded from my overclocked 980Ti (1450 MHz.) to my overclocked 1080ti 2000 MHz. stable (until I get better cooling).
> 
> I don't think I got a 100% increase in FPS in anything except really old games -- and some saw almost no gains in FPS (cough, GTAIV, cough).


Almost no gains between 980Ti to 1080Ti? That's certainly not typical. I think there's something else going on there.


----------



## dVeLoPe

I am currently mining on some GTX 1080 Tis at around 60% TDP. Is there any way to undervolt the cards even further through BIOS editing? (Interested for all 10-series cards as well.)


----------



## slidero

GraphicsWhore said:


> Almost no gains between 980Ti to 1080Ti? That's certainly not typical. I think there's something else going on there.


The only scenario I can think of where that would be true would be older titles, or MMOs, where the CPU is making a massive number of draw calls while the GPU sits idle.


----------



## kithylin

8051 said:


> Really? I didn't notice anything like that when I upgraded from my overclocked 980Ti (1450 MHz.) to my overclocked 1080ti 2000 MHz. stable (until I get better cooling).
> 
> I don't think I got a 100% increase in FPS in anything except really old games -- and some saw almost no gains in FPS (cough, GTAIV, cough).


http://gpu.userbenchmark.com/Compare/Nvidia-GTX-980-Ti-vs-Nvidia-GTX-1080-Ti/3439vs3918

The 1080 Ti is +59% faster. If you aren't seeing any increase at all there's something weird going on.

What computer are you using? What is your processor? What games are you trying to play, and at what resolution?


----------



## 8051

kithylin said:


> http://gpu.userbenchmark.com/Compare/Nvidia-GTX-980-Ti-vs-Nvidia-GTX-1080-Ti/3439vs3918
> 
> The 1080 Ti is +59% faster. If you aren't seeing any increase at all there's something weird going on.
> 
> What computer are you using? What is your processor? What games are you trying to play, and at what resolution?


59%? That's about the largest gain I saw in any of the modern DX11 titles I play. In GTAIV there is almost zero difference between the 980Ti and 1080Ti.


----------



## MightEMatt

8051 said:


> 59% that's about the largest gain I saw in any of the modern DX11 titles I play. In GTAIV there is almost zero difference between the 980Ti and 1080Ti.



That's the fault of the game, not the card. Bump the resolution enough and you'll probably find a difference between the cards pretty quick.


----------



## kithylin

8051 said:


> 59% that's about the largest gain I saw in any of the modern DX11 titles I play. In GTAIV there is almost zero difference between the 980Ti and 1080Ti.


Well if you're seeing roughly +59% performance gain then that's what you should be getting, everything is working as expected and normal.

And the problem is you're playing GTA 4. GTA-4 is ancient and never ran well on any card and won't run any better on any newer card. It was a terrible port to begin with and has a permanent 125ms latency built in to the game engine when running on PC. Most people wouldn't even bother trying to run old games on a modern video card like that. Most people buy a 1080 Ti for modern-day DX-11 AAA titles.

Nvidia has never had any card show a +100% increase from one generation to the next. In the past it has been typical to see +30%; this +59% from Maxwell -> Pascal was unusual.

When we went from GTX 780 Ti -> GTX 980 Ti it was only a +36% increase, just so you know.

To get 100% gains over a 980 Ti you would have had to wait and go for a GTX 1180 Ti instead. +100% gains are had by waiting and going +2 generations above what you have, not +1 generation.

Like for example, GTX 1080 Ti is +116% over a 780 Ti.
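For anyone checking the math: those generational numbers compound multiplicatively, not additively. A quick sketch using the percentages quoted above (treat them as rough userbenchmark averages, not exact figures):

```python
# Generational gains multiply: +36% then +59% is not +95%, it's about +116%.
gains = [
    ("GTX 780 Ti -> GTX 980 Ti", 0.36),
    ("GTX 980 Ti -> GTX 1080 Ti", 0.59),
]

total = 1.0
for step, g in gains:
    total *= 1.0 + g
    print(f"{step}: +{g:.0%}")

print(f"GTX 780 Ti -> GTX 1080 Ti (compound): +{total - 1:.0%}")  # +116%
```

Which is exactly why skipping a generation is where the big jumps come from.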


----------



## Blameless

dVeLoPe said:


> I am currently mining on some GTX 1080Ti at around 60% TDP is their anyway to undervolt the cards even further thru bios editing? (interested for all 10 series cards aswell)


You need a hardware flash programmer to flash custom firmware, so unless you have a signed firmware image that has lower voltage offsets, you are out of luck with a standard flash via software.
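Worth noting: if the goal is just lower power draw for mining rather than a true voltage-offset edit, the stock driver already lets you cap board power in software with nvidia-smi's `-pl` flag (needs admin rights, and each card enforces a minimum limit). A minimal sketch; the 250 W reference TDP constant and the helper function below are illustrative, not from any tool:

```python
# Sketch: translate a TDP fraction into an nvidia-smi power-limit command.
# The driver then walks down the stock voltage/frequency curve to stay
# under the cap -- an undervolt in effect, without touching the firmware.
REFERENCE_TDP_W = 250  # GTX 1080 Ti reference board power

def power_limit_cmd(gpu_index: int, tdp_fraction: float) -> str:
    """Build the command to cap one GPU at a fraction of reference TDP."""
    watts = round(REFERENCE_TDP_W * tdp_fraction)
    return f"nvidia-smi -i {gpu_index} -pl {watts}"

# The 60% TDP mining setup described above:
print(power_limit_cmd(0, 0.60))  # nvidia-smi -i 0 -pl 150
```

This won't go below the card's enforced floor the way a modified BIOS could, but it survives reboots with persistence mode and carries none of the flashing risk.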



kithylin said:


> Nvidia has never had any card show a +100% increase from 1 generation to the next


There were many scenarios where the 6800 Ultra was double the performance of the FX5950 and the 8800 GTX had a similar, perhaps even greater, overall advantage over the 7900.


----------



## kithylin

Blameless said:


> There were many scenarios where the 6800 Ultra was double the performance of the FX5950 and the 8800 GTX had a similar, perhaps even greater, overall advantage over the 7900.


I didn't think to mention those because we're discussing the GTX series... the 1080 Ti and 980 Ti and possibly the one before that, not as far back as those old things. But you are right, I should not have said "never".


----------



## HoneyBadger84

I'm baaaaaaaack!

Got meh an eVGA GTX 1080 Ti FTW3 Hybrid, it is quite stupid how cool this card runs thus far! Not to mention that coolness allows it to stay steady at a boost clock of 1987MHz on stock settings. Can't wait to get some testing with higher clocks done, but for now, I gotta sleep.

I find GPU1 Temp is uh... interesting. My ambients are cool, and the side fans I have blow a steady stream of cool air in to the case, but uh... 21C is below ambient for my room lol. The other 2 GPU sensors seem more accurate, and are also in the mid 20s, that's just nuts. I have the Power & Memory monitors set to the warmest of the temps out of all the options.


----------



## VeritronX

The biggest Nvidia jump I think was from Fermi to Kepler.. Userbench shows +133% going from a GTX 580 to a GTX 780ti, which were the biggest versions of both. Interestingly going from a 6970 to a 290X is +153%.


----------



## kithylin

VeritronX said:


> The biggest Nvidia jump I think was from Fermi to Kepler.. Userbench shows +133% going from a GTX 580 to a GTX 780ti, which were the biggest versions of both. Interestingly going from a 6970 to a 290X is +153%.


GTX 580 -> GTX 680, only +48%
GTX 680 -> GTX 780, only +36%

You can't really compare Ti products to non-Ti products, even if they are the top-end of different families.

But anyway, this is really off topic for this thread at this point.


----------



## Kryptein

Managed to get over 6500 today, gigabyte 1080 ti gaming oc


----------



## HoneyBadger84

What's the run of the mill in terms of clocks on stock everything for these things? Not trying to hit anything epic, just a generally stable OC for some extra performance in games that actually need it (I.e. GTA V etc) @ 3440x1440.

I'm assuming +100 core +500 mem is pretty much easy peasy?


----------



## kithylin

HoneyBadger84 said:


> What's the run of the mill in terms of clocks on stock everything for these things? Not trying to hit anything epic, just a generally stable OC for some extra performance in games that actually need it (I.e. GTA V etc) @ 3440x1440.
> 
> I'm assuming +100 core +500 mem is pretty much easy peasy?


That's not how you overclock GPUs with Pascal and Boost 3.0, as +100 core will be wildly different for each individual card. For one person that might result in 1700 MHz; for someone else that same +100 might result in 1900 MHz, even with the exact same card.

If you just want a "generally stable OC", what you should really do is grab a copy of MSI Afterburner, push the temp target and power target sliders to 100%, set a manual fan curve that reaches about 65% fan at around 70c, and just let it go. With Boost 3.0 the card monitors how hot it is, how much power it has available, and what its core clock is, and it will self-overclock as high as it safely can. Doing it this way should get you the maximum safe overclock on air and stay stable. You don't even need to touch the core sliders; it should max itself out automatically. Even a stock Nvidia 1080 Ti should yield roughly 1800-1850 MHz that way, and with a big AIB 3-fan cooler, auto-boost would probably take it to the 1900-1950 MHz range on its own.
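The fan-curve part of that advice is just linear interpolation between breakpoints. A minimal sketch; the breakpoints here (30% at 40C, 65% at 70C) are illustrative values in the spirit of the "65% fan at about 70c" suggestion, not settings exported from Afterburner:

```python
# Ramp fan duty linearly between two (temp C, duty %) breakpoints,
# clamping to the end values outside that range.
LOW = (40.0, 30.0)   # at or below 40C: 30% fan
HIGH = (70.0, 65.0)  # at or above 70C: 65% fan

def fan_duty(temp_c: float) -> float:
    (t0, d0), (t1, d1) = LOW, HIGH
    if temp_c <= t0:
        return d0
    if temp_c >= t1:
        return d1
    # Linear interpolation between the breakpoints.
    return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(70.0))  # 65.0
print(fan_duty(55.0))  # midpoint of the ramp: 47.5
```

The flatter the curve below the target temperature, the quieter the card sits at idle while still ramping hard before boost bins start dropping.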


----------



## KedarWolf

Kryptein said:


> Managed to get over 6500 today, gigabyte 1080 ti gaming oc



My best run no LOD BIOS tweaks. My RAM clocks insanely high though when I bench with my water pump at 100%.


----------



## Nervoize

Is there any tool that can exceed the 1.2v cap besides the ASUS tool? The ASUS tool is not working on my FTW3, and since I have a big gap between my current max temp and the card's max safe temp, I want to push my card even further.
Any tips for overclocking my memory beyond +400MHz? It crashes above 400, but maybe a fix is available.


----------



## kithylin

Nervoize said:


> Is there any tool that can exceed the 1.2v cap besides the ASUS tool? The ASUS tool is not working on my FTW3, and since i have a big gap between my current max temp and the cards max safe temp i want to push my card even further.
> Any tips for me to overclock my memory beyond 400mhz+? It crashes above 400, but maybe some fix is available.


1.200v is the "hard cap" by nvidia. Nothing can exceed this (software or bios) unless you physically modify your card by soldering things onto it.


----------



## 8051

kithylin said:


> GTX 580 -> GTX 680, only +48%
> GTX 680 -> GTX 780, only +36%
> 
> You can't really compare Ti products to non-Ti products, even if they are the top-end of different families.
> 
> But anyway, this is really off topic for this thread at this point.


Even though the 680 and 780Ti are both Kepler, wasn't it more than a year between their release dates?


----------



## kithylin

8051 said:


> Even though the 680 and 780Ti are both keplers wasn't it more than a year between release dates for each?


If you want to discuss nvidia performance you should really go start a new thread. This is supposed to be about discussing GTX 1080 Ti's in here.


----------



## Kryptein

KedarWolf said:


> My best run no LOD BIOS tweaks. My RAM clocks insanely high though when I bench with my water pump at 100%.


Nice! Same here, VRAM clocks all the way to 6495mhz, but the core seems to be temperature sensitive. As soon as it reaches 45c it will throttle down from 2075-2100 to 2025-2050, at 49c it hovers around 2000-2025, and then it crashes at 50c.

This is set with +170 core, +1000 mem, everything else maxed. If I drop core to +160 it will pass at 52c.

Coming into winter, so I will test again when ambient is single digits to confirm. Cannot fit a radiator in this case.


----------



## Kaltenbrunner

How much better in games was this than a 980ti @1440p with new drivers in 2018? (I already sold my 980ti and wish I never had.)

What about a regular 1080 ?

I'd rather wait for 1180ti, but prices are so crazy, there's no good solution, except save money and wait...the most boring solution.


----------



## HoneyBadger84

Kaltenbrunner said:


> How much better in games was this than a 980ti @1440p with new drivers in 2018 ? (I already sold my 980ti and wish I never)
> 
> What about a regular 1080 ?
> 
> I'd rather wait for 1180ti, but prices are so crazy, there's no good solution, except save money and wait...the most boring solution.


They're actually coming down in price again. Somewhat. A 1080 Ti will outperform a 980 Ti by up to 2:1 in the same system, depending on the game.


----------



## ZealotKi11er

HoneyBadger84 said:


> They're actually coming down in price again. Somewhat. A 1080 Ti will outperform a 980 Ti by up to 2:1 in the same system, depending on the game.


50-60% faster. Not 2x faster.


----------



## HoneyBadger84

ZealotKi11er said:


> 50-60% faster. Not 2x faster.


I did say up to, and depending on the system & application being USED. 67% gains in GTA V @1440p is a decent example of a "high end title".


----------



## kithylin

HoneyBadger84 said:


> I did say up to, and depending on the system & application being USED. 67% gains in GTA V @1440p is a decent example of a "high end title".


There are also so many wildly different variables in these sorts of tests that we don't know about. Was the 980 Ti a reference card and the 1080 Ti a big 3-fan AIB card? Were both reference cards? Was one overclocked and the other stock? Is this average FPS? Minimums? 0.1% lows?

If you look around the internet long enough you can find one test from one review that proves your point in any argument. That does -NOT- mean the 1080 Ti is +67% faster in all games, across the board. That's just one outlier, and it may not even be reproducible by other reviewers running the same test at the same resolution and settings.

The best way to gauge a video card's performance is a site like userbenchmark.com, where they take tens of thousands of samples of both cards across different systems and speeds, pool all the data, and show a median average across all results. There, overclocked vs. stock doesn't matter; it's an overall average, so those things have no bearing on the final percentage.

There is no test, review, or example anywhere on the internet showing a 1080 Ti at exactly +100% over a GTX 980 Ti, overclocked or not, with both cards running the same test, in the same computer, with the same driver, at the same resolution. The card is not that much faster.


----------



## 8051

HoneyBadger84 said:


> They're actually coming down in price again. Somewhat. A 1080 Ti will outperform a 980 Ti by up to 2:1 in the same system, depending on the game.


Maybe up to 100% in average FPS, but not in minimums.


----------



## HoneyBadger84

kithylin said:


> There's also so many wildly different variables in these sorts of tests we don't know about. Was the 980 Ti a reference card and the 1080 Ti a big 3-fan AIB card? Were both reference cards? Was one overclocked and the other stock? Is this average FPS? Minimums? 0.1% lows? If you look around the internet long enough you can find 1 single test from 1 single review that will prove your point in any argument. That does -NOT- mean that the 1080 Ti is +67% faster in all games -across the board-. That's just one outlier and may not even be reproducible by other reviewers doing the same test at the same resolution and settings. The best way to determine a video card's performance is using a site like userbenchmark.com where they take tens of thousands of samples of both cards, different systems and speeds, pools all the data and shows a median average % across all results. On userbenchmark, overclock vs stock doesn't matter, it's an overall average so those sorts of things won't have any bearing on the final resulting %. There is no test, review, or example anywhere on the entire internet showing any 1080 Ti being exactly +100% over any GTX 980 Ti, overclocked or not, where both cards are running the same test, in the same computer, with the same driver, at the same resolution. The card is not that much faster.


You just love arguing, lol. You can easily find out what cards those numbers came from by looking at the source site, Guru3d.com. Maybe my "up to 2x" claim was a bit much, but at least I'm willing to admit it. It is, at a bare minimum, a lot faster for one generation. We haven't seen this kind of improvement in a long time.

You also don't bother to read previous posts half the time. Earlier you posted some crazy thing about "as long as it stays around 70C you're good" after I plainly stated a few posts earlier that I have an FTW3 Hybrid card, aka it will never see 70C. I'm not going to address this further, as I've seen you continually lash out at others in this thread in the past.



8051 said:


> Maybe up to 100% in average FPS, but not in minimums.


Definitely not. Although it would be interesting to see minimum and 0.1% stats for both cards, the gap is probably in the same 30-70% range most game scaling is.


----------



## kithylin

HoneyBadger84 said:


> You just love arguing.


It's not arguing for the sake of arguing; it's just false information. They're not +100% faster and they're not +67% faster, and that's provable. They're +50-55%, no more.

My only issue is people disseminating false information to others who may be looking at upgrading from a (single) 980 Ti to a (single) 1080 Ti. They come read posts in this thread, expect one thing, and see something else entirely once the card is installed. People spending $800 or more expecting +67% or +100% performance and never actually getting it.

I didn't want us in this thread to mislead anyone. I'm just trying to ensure others reading this thread get the correct information from the correct sources so they know what to expect with their new purchase, that's all. I'm not trying to "Lash out" at anyone or even argue. Just making sure folks get the facts.

EDIT: Edited my post to make it clear I'm referring to single-card upgrading folks and not discussing SLI at all in this post.


----------



## zipeldiablo

HoneyBadger84 said:


> They're actually coming down in price again. Somewhat. A 1080 Ti will outperform a 980 Ti by up to 2:1 in the same system, depending on the game.


Depending on the game, an overclocked 1080ti cannot even outperform a pair of 980ti HOFs in SLI (a good example would be Rise of the Tomb Raider).
So how do you expect a 1080ti to have twice the performance of a 980ti?

And i am talking about personal experience here.


----------



## stangflyer

Off topic, but continuing this conversation: when I say a 1080ti is twice as fast as a 980ti, I mean compared to two 980ti's in SLI, so around 55-70% over a single card depending on SLI scaling. That is what I mean by twice as fast. I had one 980ti, then went SLI, and now have a 1080ti.
In games that scale well, the 980ti SLI is faster; with bad scaling, the 1080ti is faster. When scaling is around 65% I think it's even.


----------



## MightEMatt

stangflyer said:


> Off topic but continuing this conversation. When I say a 1080ti is twice as fast as a 980ti I am meaning two 980ti's in sli. So around 55-70% depending on sli scaling. That is what I mean by twice as fast. I had one 980ti then went sli and now have a 1080ti.
> In games that have great scaling the 980ti is faster and in bad scaling the 1080ti is faster. When the scaling is around 65% I think it is even.



This is the same experience I had going from 980Ti SLI to a 1080Ti. The 1080Ti overall gave me the better experience. The 980Tis certainly benchmarked higher in ideal scenarios but the vast majority of games I play saw the 1080Ti get at least 90% of the performance of the 980Tis.


----------



## ZealotKi11er

MightEMatt said:


> This is the same experience I had going from 980Ti SLI to a 1080Ti. The 1080Ti overall gave me the better experience. The 980Tis certainly benchmarked higher in ideal scenarios but the vast majority of games I play saw the 1080Ti get at least 90% of the performance of the 980Tis.


For sure. Even with 20-30% less perf the SLI setup sucks these days.


----------



## zipeldiablo

MightEMatt said:


> This is the same experience I had going from 980Ti SLI to a 1080Ti. The 1080Ti overall gave me the better experience. The 980Tis certainly benchmarked higher in ideal scenarios but the vast majority of games I play saw the 1080Ti get at least 90% of the performance of the 980Tis.


Well, maybe now, but when the 1080ti first came out that wasn't my experience.
In some games I lost up to 15 FPS by switching to the 1080ti...

I never had any issue with sli on the games i used to play back then so...


----------



## trickeh2k

KedarWolf said:


> On XOC BIOS. Not much more.
> 
> https://www.3dmark.com/3dm/26444640
> 
> 15,297
> 
> And that's a benching run. I run 24/7 much slower. I'd post what I get at my normal clocks. brb.
> 
> https://www.3dmark.com/3dm/26444738 Not that much less at normal clock speeds. Strange.
> 
> 14,918


Seems like I didn't get a good chip either. 1974Mhz is the highest it will go on stock settings, even with increased power and voltage.


----------



## KedarWolf

trickeh2k said:


> seems like i didn't get a good chip either. 1974Mhz is the highest it will go on stock settings, even with increased power and voltage.


I get 2113 or 2127 core when the ambient temp is very low, and 6237 memory, in Afterburner with 1.2v and the XOC BIOS for benching.
Not really great, but acceptable overclocks for benching.

I run 2037 core and 6177 memory as my 24/7 gaming clocks.


----------



## 2ndLastJedi

KedarWolf said:


> trickeh2k said:
> 
> 
> 
> Seems like I didn't get a good chip either. 1974Mhz is the highest it will go on stock settings, even with increased power and voltage.
> 
> 
> 
> I get 2113 or 2127 core when the ambient temp is very low and 6237 memory in Afterburner with 1.2v and the XOC BIOS for benching.
> Not really great but acceptable overclocks when benching.
> 
> I run 2037 core and 6177 memory as my 24/7 gaming clocks.
Click to expand...

@KedarWolf can you post your normal Firestrike scores so I can compare how my Galax EXOC stacks up?


----------



## gammagoat

Hoping one of you guys can help me find a universal water block for my EVGA 1080 Ti FE. I know about EKWB; looking for some other options.

Shopping on Amazon I found this, Alphacool NexXxoS GPX Solo Waterblock, but I haven't been able to find any compatibility info. Anybody have any experience with this block?


----------



## kithylin

gammagoat said:


> Hoping that one of you guys can help me find a universal water block for my EVGA 1080 ti fe. I know about EKWB looking for some other options.
> 
> Shopping on Amazon I found this, Alphacool NexXxoS GPX Solo Waterblock, but I haven't been able to find any compatibility info. Anybody have any experience with this block?


Do you mean this? https://www.amazon.com/Alphacool-NexXxoS-Solo-Waterblock-Black/dp/B00ZXXNMBI

That's a core-only block. You need to cover the VRMs, RAM, and the core together with a full-cover block. On an expensive card like this it would be a very bad idea to run the RAM and/or VRMs bare with no cooling; it quite seriously may lead to card damage or failure.

You can look here, these should all be compatible with reference-design PCB.

http://www.performance-pcs.com/wate...ck-gpu-type--nvidia/vga-series--gtx-1080-ti/?


----------



## hotrod717

gammagoat said:


> Hoping that one of you guys can help me find a universal water block for my EVGA 1080 ti fe. I know about EKWB looking for some other options.
> 
> Shopping on Amazon I found this, Alphacool NexXxoS GPX Solo Waterblock, but I haven't been able to find any compatibility info. Anybody have any experience with this block?


Here is what I use. Price is right and have had no issues. Works great!

http://www.performance-pcs.com/alphacool-hf-14-ati-nvidia-smart-motion-universal-nickel-edition.html


----------



## kithylin

hotrod717 said:


> Here is what I use. Price is right and have had no issues. Works great!
> 
> http://www.performance-pcs.com/alphacool-hf-14-ati-nvidia-smart-motion-universal-nickel-edition.html


How are you cooling the ram and vrm's on your 1080 Ti then with this?


----------



## gammagoat

Thanks guys. RAM and MOSFETs are already sinked, and unfortunately I used thermal epoxy to attach them. I am using an Arctic Accelero III right now.

So a universal would be best; I think I might tear a component off if I try to remove the sinks.

hotrod717, could you give me some temps and what your setup is like?


----------



## hotrod717

kithylin said:


> hotrod717 said:
> 
> 
> 
> Here is what I use. Price is right and have had no issues. Works great!
> 
> http://www.performance-pcs.com/alphacool-hf-14-ati-nvidia-smart-motion-universal-nickel-edition.html
> 
> 
> 
> How are you cooling the ram and vrm's on your 1080 Ti then with this?
Click to expand...

A fan is all you need. I stopped using full-cover blocks about 3-4 years ago. I didn't like getting a release card and then waiting.


gammagoat said:


> Thanks guy's ram and mosfets are already sinked and unfortunately I used thermal epoxy to attach them. I am using an artic 3 accelero right now.
> 
> So a universal would be best, think I might tear a component off if I try to remove the sinks.
> 
> hotrod717, could you give me some temps and what your setup is like?


When I switched from full cover, I found a universal is actually better on core temps than a full cover in most situations.
Right now I do not have a GPU in the loop. I haven't gotten around to putting it in a GTX 580 Matrix I've been testing. As I just switched to Sky-X, I have been concentrating on that. I run separate loops for CPU and GPU. When I'm not using the GPU loop, I unplug the pump and push the block and tubing under my horizontal mobo tray, out of the way.


----------



## kithylin

hotrod717 said:


> A fan is all you need. I stopped using full-cover blocks about 3-4 years ago. I didn't like getting a release card and then waiting.
> When I switched from full cover, I found a universal is actually better on core temps than a full cover in most situations.
> Right now I do not have a GPU in the loop. I haven't gotten around to putting it in a GTX 580 Matrix I've been testing. As I just switched to Sky-X, I have been concentrating on that. I run separate loops for CPU and GPU. When I'm not using the GPU loop, I unplug the pump and
> push the block and tubing under my horizontal mobo tray, out of the way.


So no XOC BIOS and no big overclocks for you then. Also no big custom BIOSes with heavy volts for anyone else using those. Fans aren't enough for the VRMs on these cards if you try pushing 400-500W through them with certain titles and max OCing.


----------



## 8051

gammagoat said:


> Thanks guy's ram and mosfets are already sinked and unfortunately I used thermal epoxy to attach them. I am using an artic 3 accelero right now.
> 
> So a universal would be best, think I might tear a component off if I try to remove the sinks.
> 
> hotrod717, could you give me some temps and what your setup is like?


So I'm guessing the Arctic Accelero Xtreme III didn't work out for you on your 1080Ti? Did you try using different fans? The fans that come stock on the Xtreme III only run at 2000 RPM and don't produce much static pressure considering the negligible current draw.


----------



## 8051

kithylin said:


> So no XOC bios and no big overclocks for you then. Also no big custom bios's with heavy volts for anyone else using those then. Fan's aren't enough for the VRM's on these cards if you try pushing 400~500W through the VRM's with certain titles and max OC'ing.


I've seen a video reviewer on YouTube whose SOP is to transplant his NZXT AIO CLC from the last-gen top tier to the latest top tier -- he never bothers to sink his VRMs or VRAM, he just relies on the fan on the NZXT unit.


----------



## KraxKill

The idea that a universal block is somehow better than a full cover is simply not true. The larger surface area, metal mass and liquid contact area, plus the fact that the entire card PCB can dissipate heat into the loop, make the full cover unit more efficient at cooling the card. As for cooling the core specifically, the same factors still apply and result in better cooling, provided the extracted heat load can be dissipated. About the only time excluding the VRM and memory from the loop would cool the core better is if the radiator in use were too small to dissipate the heat load generated, and even then the full cover block is in no way to blame.


----------



## kithylin

8051 said:


> I've read a video reviewer on youtube whose SOP is to transplant his NZXT AIO-CLC from last-gen top tier to the latest top tier -- he never bothers to even sink his VRM's or VRAM he just relies on the fan on the NZXT unit.


There was that issue a while back with the GTX 1070's from EVGA where, because EVGA did not put thermal pads between the VRMs and the heatsink, multiple users reported their video cards physically catching fire. And the GTX 1080 Ti pushes more power through the VRMs than those cards did at stock, and it can get excessive with overclocking.

The bottom line is that neither you nor anyone else should ever advocate running bare VRMs on these cards with no cooling whatsoever, even at stock speeds. Heatsink and fan cooling is a must for a stock 1080 Ti's VRMs. And if one is overclocking with an unlimited bios (like XOC), pushing say 1.200v and 2100~2200 MHz core, and playing at 4K or high refresh rates where the card runs @ 100% utilization most of the time, then VRM cooling is critical, and the load would likely exceed air cooling's capabilities if you're pushing 400W~500W through it.

The thing is, most people aren't really overclocking their 1080 Ti's to the max like some of us are. And folks need to understand that if they do decide to overclock these cards "that far", they need to take VRM cooling seriously.

This is not something to be taken lightly or to kid around with, especially given how expensive these cards are.

Some user could read your posts, grab a 1080 Ti, put a core-only water cooler on it, run the card bare otherwise, and go try to overclock it thinking "I'm water cooled and I'm fine!", then have their card physically catch fire later.


----------



## gammagoat

8051 said:


> So I'm guessing the Arctic Accelero Xtreme III didn't work out for you on your 1080Ti? Did you try using different fans? The fans that come stock on the Xtreme III only run at 2000 RPM and don't produce much static pressure considering the negligible current draw.


To tell the truth the cooler works pretty well, usually 60 - 62c when running Einstein@Home and about the same when playing AC Origins.

Just want to add GPU to my loop, quiet things down some more.

The Accelero has been great other than fan noise and I'd like to get temps down a little more, of course I'll still have to have a couple of fans to cool ram and whatever but they will be 140's not the three 92's.


----------



## hotrod717

kithylin said:


> So no XOC bios and no big overclocks for you then. Also no big custom bioses with heavy volts for anyone else using those. Fans aren't enough for the VRMs on these cards if you try pushing 400~500W through them with certain titles and max OCing.


Lol. I wish i could tell you to click on the robot. Temperature and finesse, not voltage or custom bios'. If you scroll way back to release in this thread, you may find some valuable info, as i had a 1080ti on day 1. Silly rabbit. 
Are you talking about hardcore benching or just gaming? Or just trolling an argument, as i see you do.


----------



## KraxKill

I still can't understand why people push such brute force through these cards to gain little if any practical utility. I think the efficiency sweet-spot for the 1080Ti is 2GHz at as low a voltage as you can run without crashing. Bump the mem as far as it can handle and done. Anything else is just useless power drain and heat for ever-diminishing returns. Unless you're trying to win a benchmark contest, it's entirely useless to OC a Ti beyond that point. The best mod you can do is the shunt and cooling. The XOC bios has been entirely useless to me once you take into account the performance gain while looking at your kilowatt meter, and also the silicon abuse.

Check my posts on undervolting some many pages back. In other words, I completely agree.

Say you had a 100m² apartment you had paid $100K for. You're essentially saying that you will pay an extra $100K (in power) to gain an additional 5m². NO DOUBT some people will, but essentially that is the utility you gain by pushing all that extra wattage through that beautiful piece of silicon.


----------



## ZealotKi11er

KraxKill said:


> I still can't understand why people push such brute force through these cards to gain little if any practical utility. I think the efficiency sweet-spot for the 1080Ti is 2ghz at as low a voltage that you can run without crashing. Bump the mem as far as it can handle and done. Anything else is just useless power drain and heat to gain ever diminishing returns. Unless you're trying to win a benchmark contest, it's entirely useless to OC a Ti beyond that point. The best mod you can do, is the shunt and cooling. The XOC bios has been entirely useless to me when you take into account performance gain while looking at your kilowatt meter while also taking into account the silicon abuse.
> 
> Check my posts on undervolting some many pages back. In other words, I completely agree.


Pretty much. 90% of the time I am at 1860MHz @ 0.875v. Also can do 2025 @ 1v.


----------



## KraxKill

..


----------



## 8051

kithylin said:


> There was that issue a while back from the GTX 1070's from EVGA where just because EVGA did not put thermal pads to make the vrm's contact the heatsink, multiple users were reporting their video cards physically catching fire. And the GTX 1080 Ti pushes more power than those cards through the vrm's, at stock, and can get excessive with overclocking.
> 
> The bottom line is you, nor anyone else should never, ever advocate running bare VRM's on these cards with no cooling what so ever, even at stock speeds. Heatsink and fan cooling is a must for stock 1080 Ti's vrms. And if one is overclocking with unlimited bios (Like XOC) and pushing like say for example 1.200v and 2100~2200 mhz core speed and playing either 4K or higher refresh rate gaming where you're going to be running the card @ 100% utilization most of the time, then vrm cooling is both critical, and likely would exceed air cooling's capabilities if you're pushing 400W~500W through it.
> 
> The thing is, most people aren't really overclocking their 1080 Ti's to the max like some of us are. And folks need to understand that if they do decide to overclock these cards "that far", that they need to take serious concern to vrm cooling.
> 
> This is not something to be taken lightly or kid around with, especially because of how expensive these cards are.
> 
> Some user could read your posts and grab a 1080 Ti and put a core-only water cooler on it, run it bare-card otherwise and go try to overclock it thinking "I'm water cooled and I'm fine!" then their card physically catch fire later.


I'd never run bare VRM's -- not after the ones on my Gigabyte LGA775 board fried.


----------



## feznz

hotrod717 said:


> Lol. I wish i could tell you to click on the robot. Temperature and finesse, not voltage or custom bios'. If you scroll way back to release in this thread, you may find some valuable info, as i had a 1080ti on day 1. Silly rabbit.
> Are you talking about hardcore benching or just gaming? Or just trolling an argument, as i see you do.



This came up quite a few times in this thread. I am also a universal block user, and as you said, unless you are going mental using the XOC bios you can only pull around 330W depending on your card's stock bios, so I agree.
I would say that a full cover block is more aesthetically appealing, but I can add that I have had my universals on 3 different cards; I run a tight budget when water cooling. 


and also a pic of my cards: got the GTX770 on release date and also my Asus Strix 1080Ti on release date, zero issues.

and it would be mental to run bare vrms. I guess this illustrates that a small heat sink should suffice, or maybe this person is doing it wrong too 
https://www.eteknix.com/gpu-frequency-world-record-set-gtx-1060/


----------



## Abaidor

KraxKill said:


> I still can't understand why people push such brute force through these cards to gain little if any practical utility. I think the efficiency sweet-spot for the 1080Ti is 2ghz at as low a voltage that you can run without crashing. Bump the mem as far as it can handle and done. Anything else is just useless power drain and heat to gain ever diminishing returns. Unless you're trying to win a benchmark contest, it's entirely useless to OC a Ti beyond that point. The best mod you can do, is the shunt and cooling. The XOC bios has been entirely useless to me when you take into account performance gain while looking at your kilowatt meter while also taking into account the silicon abuse.
> 
> Check my posts on undervolting some many pages back. In other words, I completely agree.
> 
> Say you had a 100m² apartment you had paid $100K for. You're essentially saying that you will pay an extra $100K (in power) to gain an additional 5m². NO DOUBT some people will, but essentially that is the utility you gain by pushing all that extra wattage through that beautiful piece of silicon.


I got the Asus 1080Ti OC with my new X299 build and although I got a Phanteks waterblock for it I just recently installed it after about 2 months sitting in its box. 

The card would boost to 1974 with the stock cooler. With the waterblock I can run the memory past 6000MHz and the GPU at 2065, and it can also complete Superposition at 2100 with temps never exceeding 40C. I only played with the Asus GPU Tweak II.

Now I am not even loading the profile when windows starts, since I am not gaming (might do so) and I honestly don't see the reason to. I am also wondering whether there was a point in buying the waterblock, but it does look nice and my PC is dead silent 99% of the time.


----------



## kithylin

KraxKill said:


> I still can't understand why people push such brute force through these cards to gain little if any practical utility. I think the efficiency sweet-spot for the 1080Ti is 2ghz at as low a voltage that you can run without crashing. Bump the mem as far as it can handle and done. Anything else is just useless power drain and heat to gain ever diminishing returns. Unless you're trying to win a benchmark contest, it's entirely useless to OC a Ti beyond that point. The best mod you can do, is the shunt and cooling. The XOC bios has been entirely useless to me when you take into account performance gain while looking at your kilowatt meter while also taking into account the silicon abuse.
> 
> Check my posts on undervolting some many pages back. In other words, I completely agree.
> 
> Say you had a 100m² apartment you had paid $100K for. You're essentially saying that you will pay an extra $100K (in power) to gain an additional 5m². NO DOUBT some people will, but essentially that is the utility you gain by pushing all that extra wattage through that beautiful piece of silicon.


Even with the XOC bios and my 1080 Ti running 2138 MHz, even in games like Ark: Survival Evolved that will run it maxed out @ 99% usage, I still only see 325~360 watts through nvidia-smi. It's not terrible, and I do get roughly an extra +10% of performance vs the stock bios. It's more about buying a card, using it for 2~4 years, and wanting to get every last % of performance out of what we paid for. That's kind of supposed to be the entire point of these OCN forums; I think people lost sight of that over time. Everyone here used to be about overclocking everything as far as it could go while still remaining stable, and then running with it daily. That used to be the spirit of this site... seems not so much anymore in 2018. Now those of us that overclock things to the limit are ridiculed publicly for even doing it. :sad-smile Just like the post above is stating. It's like you're saying we're bad people for wanting to get everything we can out of the hardware we paid for.


----------



## 8051

kithylin said:


> Even with XOC bios and my 1080 Ti running 2138 mhz, even in games like ark evolved that will run it maxed out @ 99% usage, I still only see 325~360 watts through nvidia-smi, it's not terrible, and I do get an extra roughly +10% of performance vs stock bios. It's more about buying a card and using it for 2~4 years and wanting to get every last % of performance out of what we paid for. That's kind of supposed to be the entire point of this entire OCN forums, I think people lost sight of that over time. Everyone here used to be about overclocking everything as far as it can go while still remaining stable, and then running with it daily. That used to be the spirit of this site here... seems not so much anymore in 2018. Now those of us that overclock things to the limit are ridiculed publically for even doing it. :sad-smile Just like this post above is stating. It's like you're saying we're bad people for wanting to get everything we can out of the hardware we paid for.


I was reluctant to do anything to my 1080Ti only because at one point it would've cost me $1K to replace it, although it looks like a used one can be had for ~$700 now.


----------



## Abaidor

kithylin said:


> Even with XOC bios and my 1080 Ti running 2138 mhz, even in games like ark evolved that will run it maxed out @ 99% usage, I still only see 325~360 watts through nvidia-smi, it's not terrible, and I do get an extra roughly +10% of performance vs stock bios. It's more about buying a card and using it for 2~4 years and wanting to get every last % of performance out of what we paid for. That's kind of supposed to be the entire point of this entire OCN forums, I think people lost sight of that over time. Everyone here used to be about overclocking everything as far as it can go while still remaining stable, and then running with it daily. That used to be the spirit of this site here... seems not so much anymore in 2018. Now those of us that overclock things to the limit are ridiculed publically for even doing it. :sad-smile Just like this post above is stating. It's like you're saying we're bad people for wanting to get everything we can out of the hardware we paid for.


Just to add to my previous post - I am all for overclocking when I need the extra performance, but I also like quiet systems. So my i9-7940X is using "Dynamic OC" per core, up to 4.7GHz (3 cores now) and 4.4 for the rest. I don't need the extra right now, so I might as well enjoy some extra time under warranty. My 1080Ti is watercooled and has an OC profile ready, and when I see the need I can load it.

For day-to-day use however I enjoy a speedy system with a full custom loop and a 9X140 radiator with ML140 Pro fans running so low I can't even hear them. If and when I want to game everything can be boosted to max out performance.


----------



## feznz

kithylin said:


> Even with XOC bios and my 1080 Ti running 2138 mhz, even in games like ark evolved that will run it maxed out @ 99% usage, I still only see 325~360 watts through nvidia-smi, it's not terrible, and I do get an extra roughly +10% of performance vs stock bios. It's more about buying a card and using it for 2~4 years and wanting to get every last % of performance out of what we paid for. That's kind of supposed to be the entire point of this entire OCN forums, I think people lost sight of that over time. Everyone here used to be about overclocking everything as far as it can go while still remaining stable, and then running with it daily. That used to be the spirit of this site here... seems not so much anymore in 2018. Now those of us that overclock things to the limit are ridiculed publically for even doing it. :sad-smile Just like this post above is stating. It's like you're saying we're bad people for wanting to get everything we can out of the hardware we paid for.



There is a benching OC and there is a sensible daily OC.
I guess I look at your OC and think that it is an insane, kill-the-card-early OC. 


Where do you get 10% from? Even 2063MHz to 2138MHz is a 3.6% clock increase, which may not equate to linear performance gains. 
Guess what I am saying is that I am wondering if getting a water block was worth it to OC to 2063MHz when 2000MHz was achievable with the stock cooler/bios, which equates to a 3.15% increase.

2000MHz to 2138MHz is a 6.9% increase. 

https://www.marshu.com/articles/calculate-percentage-increase-decrease-percent-calculator.php
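For anyone who would rather skip the online calculator, the arithmetic behind those figures is just (new − old) / old. A minimal Python sketch (the function name is mine):

```python
def percent_increase(old_mhz: float, new_mhz: float) -> float:
    """Relative clock increase in percent: (new - old) / old * 100."""
    return (new_mhz - old_mhz) / old_mhz * 100

# The clock steps discussed above:
print(round(percent_increase(2063, 2138), 1))   # 3.6
print(round(percent_increase(2000, 2063), 2))   # 3.15
print(round(percent_increase(2000, 2138), 1))   # 6.9
```

Note the clock increase is only an upper bound on the FPS gain; games rarely scale linearly with core clock.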


----------



## kithylin

feznz said:


> There is benching OC and there is sensible daily OC
> I guess I look at your OC and think that is an insane want to kill the card early OC
> 
> where do you get 10% from even 2063Mhz to 2138Mhz is an 3.6% clock increase which may not equate to linear performance gains
> Guess I am saying that I am wondering if getting a water block was worth it to OC to 2063Mhz when 2000Mhz was achievable with the stock cooler/bios which equates to 3.15% increase
> 
> 2000Mhz to 2138Mhz is 6.9% increase
> 
> https://www.marshu.com/articles/calculate-percentage-increase-decrease-percent-calculator.php


You haven't read any of my posts in the past.. even though you've been around when I posted them. On the stock air cooler with the stock bios, this card was throttling down to 1780 <-> 1825 Mhz even with fans max, that's the most it would ever do. It would start out like 1950 mhz then when gaming for over an hour slow down to about 1780-1825 around in there. So going water and getting it to run 2138 mhz is quite a big boost, that's +20.11% more for me. And just water cooling on stock bios only let it go to 1925~1950 Mhz, it wouldn't do any more either.


----------



## feznz

kithylin said:


> You haven't read any of my posts in the past.. even though you've been around when I posted them. On the stock air cooler with the stock bios, this card was throttling down to 1780 <-> 1825 Mhz even with fans max, that's the most it would ever do. It would start out like 1950 mhz then when gaming for over an hour slow down to about 1780-1825 around in there. So going water and getting it to run 2138 mhz is quite a big boost, that's +20.11% more for me. And just water cooling on stock bios only let it go to 1925~1950 Mhz, it wouldn't do any more either.


I guess you are looking at 41°C on Friday in Dallas ... that would explain a lot. I can only dream of hitting the low 30s °C on the hottest mid summer afternoon; currently I am sitting here at an 11°C high today. 
But I have read some of your posts; I just don't agree with a lot of them, especially pushing the card to the brink of its limit for daily use. But it is your card, you can do with it what you please.


----------



## EarlZ

What's the best way to easily determine if the memory OC I'm running is still giving me a boost or just hitting diminishing returns?


----------



## kithylin

feznz said:


> I guess you at looking at 41°C on Friday in Dallas ... that would explain a lot I can only dream of hitting low 30s°C on the hottest mid summer afternoon currently I am sitting here at an 11°C high today
> but I have read some of your posts just I don't agree with a lot of them especially pushing the card to the brink of the limit for daily use but it is your card you can do with what you please.


I've always run every card I've ever owned as far as it will possibly go stable, fed em whatever voltage is necessary to sustain it, and ran with it daily. I still have almost all of my old cards. It made me painfully sad to sell my tweaked 290X when I bought my 1080 Ti.. I'll still probably buy another one some day. I still have the GTX 470 pair I bought new back when they first came out, EVGA GTX 470 Hydro Copper FTW; I've been running those @ max volts @ 1.212v since the first month I owned em, and they still worked flawlessly when I tried them a few months ago here in 2018.

And yes, the heat here is part of why I went water cooling. My 1080 Ti with EVGA's stock cooler was running mid-70's C and sometimes low 80's C, even with fans @ 100%, even back when it was winter here in Jan 2018. Now, entering the middle of summer, it's overclocked as far as I can push it and only averages low 50's C in demanding games, and mid-40's C in most games I play in the heat of summer. With the overclock it's about +140% over the 290X I had, and I'm loving it and I'm happy with it.

Nothing I've done to it has violated my warranty status either, not the bios and not the water cooling, so there's really nothing negative I can see. I'm wholly expecting this card to last as long as any other card I've overclocked in the past. All my other video cards I've always run with completely power-unlimited custom bioses and increased voltage, and I've never had a card die on me once yet.

I also recently, finally, sold my Nvidia GeForce 2 Ultra with a custom bios and overclock. Had that card for so many years, sad to see it go.. I held on to it so long, though, that I got pretty close to the original new price from a collector in 2017.:thumb:

I learned most of my knowledge about modifying bioses and reflashing nvidia cards here on OCN over the years, and I always tweak and tune everything to the limit with every computer I own. Run the CPU as fast as it can go stable, OC the ram as far as it will go stable, and OC all the video cards as far as possible. Never had a single failure related to OC in all these years yet.


----------



## Gurkburk

Can't wait til i get my watercooling equipment. Going to be putting my Msi 1080ti gaming x on water to start off. Then later on put the CPU on water as well. (Blocks are expensive as hell here, hence only starting with one part)


----------



## Nervoize

Any way to increase memory speed? Mine crashes above a +450MHz offset (5950MHz effective).


----------



## EarlZ

Nervoize said:


> Any way to increase memory speed? Mine crashes above 450Mhz (5950)


What are you using to test for stability ?


----------



## KedarWolf

There is a standalone GPU Memory Bandwidth Tester I added as an attachment here a long time ago. 

I tried searching my posts here and can't find it, don't remember what it's called.

Can someone re-add it as an attachment if they know what I'm talking about?


----------



## mouacyk

KedarWolf said:


> There is a standalone GPU Memory Bandwidth Tester I added as an attachment here a long time ago.
> 
> I tried searching my posts here and can't find it, don't remember what it's called.
> 
> Can someone re-add it as an attachment if they know what I'm talking about?


vBandwidthTest for measuring GPU bandwidth: can't upload the recent version I got (old versions at https://devtalk.nvidia.com/default/...-benchmark-is-somewhat-suspicious-/?offset=17)
MemTestG80 for stability testing: https://simtk.org/projects/memtest


----------



## KedarWolf

mouacyk said:


> vBandwidthTest for measuring GPU bandwidth: can't upload the recent version I got (old versions at https://devtalk.nvidia.com/default/...-benchmark-is-somewhat-suspicious-/?offset=17)
> MemTestG80 for stability testing: https://simtk.org/projects/memtest


I found it, figured out how to see all the attachments I've posted on the forum.


----------



## KedarWolf

KedarWolf said:


> I found it, figured out how to see all the attachments I've posted on the forum.


Attachments not working. Upload failed. 

Meant to edit, not quote myself. :/


----------



## EarlZ

To clarify, I need both tools? One is to measure whether I am still getting more memory bandwidth with the increased OC, and the other tool is for stability?


----------



## mouacyk

EarlZ said:


> To clarify I need both too tools, one is to measure if I am still getting more memory bandwidth with the increased OC and the other tool is for stability ?


Yes. The stability tool is to make absolutely sure GDDR5X error correction doesn't kick in, which does affect performance.


----------



## EarlZ

mouacyk said:


> Yes. The stability tool is to absolutely make sure GDDR5X error correction doesn't kick in, which does affect performance.


So this means I can simply use just that one tool? What are the correct command line flags for it? I tried memtestg80 11264 50 and I get 4294961546 errors and the test completes in 2 seconds.

EDIT:
Oh wait, can this app not address more than 3.2GB of VRAM? I am getting errors setting it to 3.3GB.

I am getting these scores on Superposition with stock clocks and just sliding the max power/temp setting:
1080P Extreme: 5820
4K Optimized: 9465

Are these scores normal or on the low side?


----------



## mouacyk

EarlZ said:


> So this means I can simply just use that one tool ? Whats the correct command line flags for this? I tried memtestg80 11264 50 and I get 4294961546 errors and the test completes in 2seconds
> 
> EDIT:
> Oh wait can this app not address more than 3.2GB of VRAM ? as I am getting errors setting it to 3.3GB
> 
> I am getting these scores on Superposition with stock clocks and just sliding the max power/temp setting
> 1080P extreme : 5820
> 4k optimized : 9465
> 
> Are these scores normal or on the low side?


What I normally do is run vBandwidthTest first to make sure I get a good memory OC strap. A bad strap (unoptimized timings) can drop bandwidth from 500GB/s to 466GB/s. Then run memtestg80 to verify stability. My configuration is optimized to test a little over 10.25GB and takes around 5 minutes. While it runs, I can see GPU usage at 100% and memory usage nearly up there as well. You can try to dig through the readme, or I can post the script later. It's about running the optimal number of instances and memory allocation per instance.
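For what it's worth, the multi-instance setup described above can be scripted. A rough Python sketch, under the assumption (from the MemtestG80 readme) that the binary takes the amount of memory in MiB and an iteration count as positional arguments; the instance count and per-instance size mirror the configuration mouacyk describes, and `split_vram` is just a helper of mine for sizing instances on other cards:

```python
import subprocess

INSTANCES = 4       # parallel copies of the tester
ITERATIONS = 5      # test passes per instance
MIB_EACH = 2534     # 4 x 2534 MiB is a little over 10 GiB of an 11 GiB card

def split_vram(total_mib: int, instances: int, headroom_mib: int = 1128) -> int:
    """MiB each instance should test, leaving headroom for the driver/desktop."""
    return (total_mib - headroom_mib) // instances

def run_all(mib_each: int = MIB_EACH) -> bool:
    """Launch all instances at once; True only if every one exits cleanly."""
    procs = [
        subprocess.Popen(["memtestG80", str(mib_each), str(ITERATIONS)])
        for _ in range(INSTANCES)
    ]
    return all(p.wait() == 0 for p in procs)

# e.g. run_all(split_vram(11264, INSTANCES)) after applying a memory offset;
# a nonzero exit from any instance means the OC (or its strap) is suspect.
```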


----------



## EarlZ

mouacyk said:


> What I normally do is run vBandwidthTest first to make sure I get a good memory OC strap. A bad strap (unoptimized timings) can drop bandwidth from 500GB/s to 466GB/s. Then run memtestg80 to verify stability. My configuration is optimized to test a little over 10.25GB and takes around 5 minutes. While it runs, I can see GPU usage at 100% and memory usage nearly up there as well. You can try to dig through the readme, or I can post the script later. It's about running the optimal number of instances and memory allocation per instance.


If you can share the correct syntax on the flags that would be great

I ran the vBandwidthTest and I seem to be getting 368-376GB/s; is this normal for non-overclocked?

https://imgur.com/a/Gq0Kyn1

I can't seem to attach the image to the post or use the codes.


----------



## mouacyk

EarlZ said:


> If you can share the correct syntax on the flags that would be great
> 
> I ran the vBandwidthtest and I seem to be getting 368-376GB/s, is this normal for non overclocked?
> 
> https://imgur.com/a/Gq0Kyn1
> 
> I cant seem to attach the image on the post or use the codes.


I think that bandwidth looks OK for stock memory. My script settings are:
instances = 4
iterations = 5
memory = 2534


----------



## EarlZ

mouacyk said:


> I think that bandwidth looks OK for stock memory. My script settings are:
> instances = 4
> iterations = 5
> memory = 2534


Thanks, I'll check the readme later for how to use the scripting, as I ran it from a command prompt with the flags. Would you be kind enough to run it at stock mem speed and see if we get the same result? I tried +500 and it only went to 380GB/s, which is quite far from your 500GB/s.


----------



## mouacyk

EarlZ said:


> Thanks, ill check the readme later on how to use the scripting as I ran it with command prompt and used the flags, would you be kind enough to run it on stock mem speed and see if we are getting the same result? I tried +500 and it only went to 380Gb/s whih is quite far from your 500Gb/s


My RAM and CPU overclocks might be skewing the results as well.


----------



## Gypsycurse

*TDP confirmation*

Hi all, 

I wanted to pose a question as after looking through the thread I couldn't see the answer (sorry in advance if I missed it).

I have flashed an ASUS Strix 1080 Ti with the OC version. I cleared out and reset the drivers, and the stock and boost figures changed. 
So far I haven't used the powerlimit.zip; I wanted to ask if this is necessary, or only required if you hit an issue?

My experience so far has been, rather obviously, that as the stock clocks increase the benches are higher; but when running with the same overclock I used previously it appears to be a little faster too, with higher Timespy results and slightly higher sustained boost speeds (maybe placebo). 

What do you use to verify the watts consumed? Do you need to use the .bat file to ensure the TDP is set correctly?

For additional information, the card is water cooled and presently hitting 2050 core / 12400 memory.

Thanks for any advice.

John


----------



## mouacyk

EarlZ said:


> Thanks, ill check the readme later on how to use the scripting as I ran it with command prompt and used the flags, would you be kind enough to run it on stock mem speed and see if we are getting the same result? I tried +500 and it only went to 380Gb/s whih is quite far from your 500Gb/s


Stock memory results posted:


Spoiler


----------



## kithylin

Gypsycurse said:


> Hi all,
> 
> I wanted to pose a question as after looking through the thread I couldn't see the answer (sorry in advance if I missed it).
> 
> I have flashed a ASUS Strix 1080 TI, with the OC version. I cleared out and reset the drivers and the stock and boost figures changed.
> So far I haven't used the powerlimit.zip I wanted to ask if this was necessary or only required if you hit an issue?
> 
> My experience so far has been rather obviously as the stock is increased the benches are higher, but also when running with the same over clock I used previously it appears to be a little faster too, higher timespy results and slightly higher sustained boost speed (maybe placebo)
> 
> What do you use to verify the watts consumed, do you need to use the .bat file to ensure the TDP is set correctly?
> 
> For additional information, the card is water cooled and presently hitting 2050 core / 12400 memory
> 
> Thanks for any advice.
> 
> John


The power wattage is set in the bios.. I'd have to go look up what it should be.. guess I might do that for you... okay here you go: https://www.techpowerup.com/vgabios...X+1080+Ti&interface=&memType=&memSize=&since=

According to this, I looked at all of the models for "STRIX OC" and the limit for all of these is 330 watts, so you want to edit the batch file to reference 330W, and then run it once. 

I'm assuming since you said "Asus OC Bios" that this is what you are referring to.. did you mean the Asus Strix XOC Unlimited bios? Continuing this post based on the Strix OC bios then...

That batch file only tunes your existing windows installation. Once the bios is flashed, if you later change hard drives or for any other reason completely re-install windows (with a fresh format), the card will come up at maximum power wattage automatically, so you will not need to re-run the batch file for future installs. It only needs to be run once, on your current windows install, after the drivers are loaded.

As to if it is working, nvidia provides a power monitoring program with the current drivers for pascal cards called Nvidia SMI.
The default location is: C:\Program Files\NVIDIA Corporation\NVSMI The program is "nvidia-smi.exe"
And you can create a shortcut to it and drop it on desktop or where ever on your system and I usually change the "Target" field for the shortcut to look like this: "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -l

Then it will run in a command prompt window, auto-update every few seconds and show you how much power your card is pulling. Open it up in a second screen/monitor and watch it while you run firestrike or superposition and see if it's actually pulling the 330 watts you programmed in with the batch file.
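If you'd rather have a machine-readable number than watch the console output, nvidia-smi can also emit just the power draw through its query interface (`--query-gpu=power.draw --format=csv,noheader` are standard nvidia-smi flags). A minimal Python sketch; the helper names are my own:

```python
import subprocess

def parse_power_draw(smi_output: str) -> float:
    """Parse a power.draw value such as '287.53 W' into a float (watts)."""
    return float(smi_output.strip().split()[0])

def read_power_draw() -> float:
    """Ask nvidia-smi for the current board power draw in watts."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader"],
        text=True,
    )
    return parse_power_draw(out)
```

Call `read_power_draw()` in a loop while a benchmark runs and log the peak; it should approach the wattage you programmed into the batch file if the limit took effect.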


----------



## Gypsycurse

Hi Kithylin, 

Thanks, you hit the nail on the head. I purposefully avoided the XOC version based on your own personal comments that it realistically needs a full-cover water block. As I am only using an AIO with an NZXT G12 adapter, I thought it was wise to err on the side of caution. That said, the higher TDP of the EVGA cards is a tempting idea to play with.

Thanks again for your help, and your sage words in the forum.

John


----------



## EarlZ

mouacyk said:


> Stock memory results posted:
> 
> 
> Spoiler




That tool doesn't look like the 2 tools linked in the previous post; where can I get it so at least I can run the same test?
What I was able to download were memtestG80 and testcudamemorybandwidth.


----------



## kithylin

Gypsycurse said:


> Hi Kithylin,
> 
> Thanks, you hit the nail on the head. I purposefully avoided the XOC version based on your own personal comments that it realistically needs a full-cover water block. As I am only using an AIO with an NZXT G12 adapter, I thought it was wise to err on the side of caution. That said, the higher TDP of the EVGA cards is a tempting idea to play with.
> 
> Thanks again for your help, and your sage words in the forum.
> 
> John


If you want to stay on "some sort of stock bios with a limit" then the actual highest power limit bios I have found (and I've looked at every 1080 Ti bios in the list on techpowerup) is from Zotac.

See over here: https://www.techpowerup.com/vgabios...&version=&interface=&memType=&memSize=&since=

The "Amp! Extreme" models allow up to 384 watts maximum.


----------



## KedarWolf

kithylin said:


> If you want to stay on "some sort of stock bios with a limit" then the actual highest power limit bios I have found (and I've looked at every 1080 Ti bios in the list on techpowerup) is from Zotac.
> 
> See over here: https://www.techpowerup.com/vgabios...&version=&interface=&memType=&memSize=&since=
> 
> The "Amp! Extreme" models allow up to 384 watts maximum.


Amp Extreme BIOSes will not work properly on most 1080 Ti's. 

The highest non-XOC BIOS is the 432W Zotac BIOS, which will work on FEs and most non-Amp cards.

https://www.techpowerup.com/vgabios/195428/195428


----------



## kithylin

KedarWolf said:


> Amp Extreme BIOS's will not work properly on most 1080 Ti's.
> 
> Highest non-XOC BIOS is the 432W Zotac BIOS that'll work with FE's and most non-Amp cards.
> 
> https://www.techpowerup.com/vgabios/195428/195428


I didn't know that; I won't recommend it anymore, then.


----------



## st0neh

kithylin said:


> There was that issue a while back with the GTX 1070's from EVGA, where, because EVGA did not put in thermal pads to make the VRMs contact the heatsink, multiple users were reporting their video cards physically catching fire. And the GTX 1080 Ti pushes more power through the VRMs than those cards at stock, and it can get excessive with overclocking.


That turned out to be an issue with faulty components in the VRM.


----------



## Gurkburk

Can someone explain how this works?

My 1080 Ti is an MSI Gaming X. I've had the stock cooler since I bought it. This Thursday I bought watercooling and put the card under water.

It's incredibly warm in Sweden now, around 30°C outside, and the room where my computer is sits around 27°C.

So far I've seen a temp high of ~52°C on water, at a 2050 MHz core clock and 6156 MHz memory clock (+110 / +650).

Before, on the air cooler, the TDP was floating around 117%, which is the max on this card. Now the highest I've seen is 105%. Does the lower heat also drop the TDP?

Edit: Doing the 4K stress test in 3DMark now, seeing as high as 120% TDP. But I usually stress with regular Furmark. Could've sworn I saw it float around max TDP before, too...
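For reference, the TDP percentage shown by monitoring tools is relative to the card's default board power limit, so each percentage maps to a fixed wattage. A quick sketch, assuming a 250 W default limit (typical for a 1080 Ti, but check your own card's BIOS value):

```python
def tdp_percent_to_watts(percent: float, base_watts: float = 250.0) -> float:
    """Convert a reported TDP percentage to watts, relative to the
    card's default board power limit (assumed to be 250 W here)."""
    return base_watts * percent / 100.0

# On a 250 W card: 117% is 292.5 W, 105% is 262.5 W.
print(tdp_percent_to_watts(117.0))  # 292.5
print(tdp_percent_to_watts(105.0))  # 262.5
```

And yes, a genuinely lower power draw at the same clocks on water is plausible: cooler silicon leaks less current, so the card can hold the same boost bin using fewer watts.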


----------



## kithylin

st0neh said:


> That turned out to be an issue with faulty components in the VRM.


Then why did EVGA send out thermal pads to everyone to resolve the issue? And even if you sent one back to be "repaired" for the issue, all they did was install thermal pads and send it back.


----------



## Blameless

kithylin said:


> Then why did EVGA send out thermal pads to everyone to resolve the issue?


I'm not very familiar with the exact circumstances behind that event, but a cooler-running VRM will absolutely last longer, even if its components were improperly selected, or defective, in the first place. It's also a lot cheaper to send out thermal pads than to replace video cards or waterblocks.


----------



## mouacyk

EarlZ said:


> That tool doesn't look like the 2 tools linked in the previous post; where can I get it so at least I can run the same test?
> What I was able to download were memtestG80 and testcudamemorybandwidth.


Uploaded to: https://nofile.io/f/p15Suy0bu2E/vRamBandWidthTest.zip


----------



## MightEMatt

kithylin said:


> Then why did EVGA send out thermal pads to everyone to resolve the issue? And even if you sent one back to be "repaired" for the issue, all they did was install thermal pads and send it back.



A lot cheaper to cool the components down into spec than to rebuild thousands of cards with new VRMs.


----------



## EarlZ

mouacyk said:


> Uploaded to: https://nofile.io/f/p15Suy0bu2E/vRamBandWidthTest.zip


Thanks. I am getting around 350-364 GB/s; I think something is wrong with my 1080 Ti. I only get the 400-433 GB/s result if I do a +850 memory OC, so I'm losing about 67 GB/s of memory bandwidth... not sure how to explain this for warranty :/


----------



## mouacyk

EarlZ said:


> Thanks, I am getting around 359-366 GB/s, which is about 67 GB/s lower than yours. Could it be possible that something is wrong with my 1080 Ti?


That looks like a bad strap. Try incrementing/decrementing the memory OC until the next strap kicks in, then retry the bandwidth test. Repeat until it goes up.


----------



## EarlZ

mouacyk said:


> That looks like a bad strap. Try incrementing/decrementing the memory oc until the next strap kicks in, then retry bandwidth. Repeat until it goes up.


That's stock speed, no overclock yet... which worries me.
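As a sanity check on these numbers: peak theoretical bandwidth is just the bus width times the effective per-pin data rate. For a stock 1080 Ti (352-bit bus, 11 Gbps GDDR5X) that works out to 484 GB/s, so a measured ~430 GB/s at stock is in the expected range, while ~360 GB/s points at the memory clocking down during the test rather than bad VRAM. A quick sketch:

```python
def theoretical_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s:
    (bus width in bits / 8 bits per byte) * per-pin data rate in Gbit/s."""
    return bus_width_bits / 8 * data_rate_gbps

# GTX 1080 Ti: 352-bit bus, 11 Gbps GDDR5X
print(theoretical_bandwidth_gbs(352, 11.0))  # 484.0
```

Real-world copy tests land roughly 10% under the theoretical peak, which is what the ~430 GB/s results here show.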


----------



## Pyounpy-2

*Low temp. safe mode?*

Hi, does anyone know about the low-temperature safe mode that appears below -5 degrees C,
which is explained in this post:
http://www.overclock.net/forum/69-n...ia-titan-xp-owners-club-266.html#post27463218

That low-temp safe mode appears on the 1080 Ti, too.
The safe mode kicks in at -5 degrees C and goes away at +2 or +3 degrees C.
Then, in real use, the GPU frequency jumps when it goes away.
An example:

https://youtu.be/VMhd-tzkIhI

In this video at 3 min 7 sec, the GPU frequency (shown by the OSD, on the second line) jumped from 2252 MHz to 2300 MHz.
This jump sometimes brings instability, and I cannot find a way to avoid it.
Does anyone have a good idea? Because of this I have difficulty running the 1080 Ti above 2.3 GHz.
Additionally, it is even more serious for the Titan Xp, because there is no modded BIOS for it, so I cannot apply extra voltage to the Titan Xp for stability.
Thank you.


----------



## kithylin

Pyounpy-2 said:


> Hi, does anyone know about the low-temperature safe mode that appears below -5 degrees C,
> which is explained in this post:
> http://www.overclock.net/forum/69-n...ia-titan-xp-owners-club-266.html#post27463218
> 
> That low-temp safe mode appears on the 1080 Ti, too.
> The safe mode kicks in at -5 degrees C and goes away at +2 or +3 degrees C.
> Then, in real use, the GPU frequency jumps when it goes away.
> An example:
> https://youtu.be/VMhd-tzkIhI
> In this video at 3 min 7 sec, the GPU frequency (shown by the OSD, on the second line) jumped from 2252 MHz to 2300 MHz.
> This jump sometimes brings instability, and I cannot find a way to avoid it.
> Does anyone have a good idea? Because of this I have difficulty running the 1080 Ti above 2.3 GHz.
> Additionally, it is even more serious for the Titan Xp, because there is no modded BIOS for it, so I cannot apply extra voltage to the Titan Xp for stability.
> Thank you.


Even super-cooled to sub-ambient, most 1080 Ti's cannot exceed 2200-2250 MHz without hard volt mods to the card (soldering things to the card and feeding it more than 1.200 V).


----------



## KedarWolf

mouacyk said:


> Uploaded to: https://nofile.io/f/p15Suy0bu2E/vRamBandWidthTest.zip


An updated version of the VRAM bandwidth test can be found here.

https://www.computerbase.de/forum/threads/nais-benchmarkthread.1440618/

I uploaded it to www.virustotal.com and it's fine.


----------



## EarlZ

KedarWolf said:


> Updated latest version of vram bandwidth test found here.
> 
> https://www.computerbase.de/forum/threads/nais-benchmarkthread.1440618/
> 
> I uploaded it to www.virustotal.com and it's fine.


This version tells me it's around 393-397 GB/s (stock), still very far from the results you are getting at stock.


----------



## mouacyk

EarlZ said:


> This version tells me it's around 393-397 GB/s (stock), still very far from the results you are getting at stock.


Did you observe any throttling, especially thermal? My results of 430 GB/s at stock are with a full water block, keeping my temperature under 40°C.


----------



## KedarWolf

EarlZ said:


> This version tells me it's around 393-397 GB/s (stock), still very far from the results you are getting at stock.


https://drive.google.com/open?id=1vv_gbzUgEeaQfrNUcPiH7VzuazmgT9Z7

Download Nvidia Profile Inspector. Sorry, adding attachments isn't working, so I just made a Google Drive link.

Unzip it, right-click it, Run As Admin.

Go to Common settings and set 'CUDA - Force P2 State' to Off, and then you'll get the correct bandwidth.

If you hit L on the top point after Ctrl+F in Afterburner, even on stock settings you'll see the memory clocks go much lower than stock when running the bench, because P2 states are enabled and the driver lowers the memory clocks while the bench is running.

Forcing the P2 state off makes sure they don't drop at all. 

Edit: P2 States on.



Spoiler



Benchmarking DRAM
0 MiByte to 128 MiByte: 369.53 GByte/s Read, 391.35 GByte/s Write
128 MiByte to 256 MiByte: 366.51 GByte/s Read, 374.49 GByte/s Write
256 MiByte to 384 MiByte: 365.07 GByte/s Read, 390.79 GByte/s Write
384 MiByte to 512 MiByte: 366.94 GByte/s Read, 380.57 GByte/s Write
512 MiByte to 640 MiByte: 363.84 GByte/s Read, 390.16 GByte/s Write
640 MiByte to 768 MiByte: 369.32 GByte/s Read, 374.28 GByte/s Write
768 MiByte to 896 MiByte: 367.34 GByte/s Read, 365.97 GByte/s Write
896 MiByte to 1024 MiByte: 369.19 GByte/s Read, 377.90 GByte/s Write
1024 MiByte to 1152 MiByte: 368.16 GByte/s Read, 358.17 GByte/s Write
1152 MiByte to 1280 MiByte: 366.02 GByte/s Read, 391.57 GByte/s Write
1280 MiByte to 1408 MiByte: 364.19 GByte/s Read, 374.68 GByte/s Write
1408 MiByte to 1536 MiByte: 362.18 GByte/s Read, 391.23 GByte/s Write
1536 MiByte to 1664 MiByte: 363.58 GByte/s Read, 380.58 GByte/s Write
1664 MiByte to 1792 MiByte: 360.95 GByte/s Read, 389.98 GByte/s Write
1792 MiByte to 1920 MiByte: 363.96 GByte/s Read, 383.14 GByte/s Write
1920 MiByte to 2048 MiByte: 363.75 GByte/s Read, 374.28 GByte/s Write
2048 MiByte to 2176 MiByte: 366.19 GByte/s Read, 375.56 GByte/s Write
2176 MiByte to 2304 MiByte: 364.75 GByte/s Read, 361.48 GByte/s Write
2304 MiByte to 2432 MiByte: 369.28 GByte/s Read, 378.90 GByte/s Write
2432 MiByte to 2560 MiByte: 368.34 GByte/s Read, 359.74 GByte/s Write
2560 MiByte to 2688 MiByte: 369.07 GByte/s Read, 391.56 GByte/s Write
2688 MiByte to 2816 MiByte: 366.59 GByte/s Read, 377.48 GByte/s Write
2816 MiByte to 2944 MiByte: 364.35 GByte/s Read, 390.68 GByte/s Write
2944 MiByte to 3072 MiByte: 367.02 GByte/s Read, 381.64 GByte/s Write
3072 MiByte to 3200 MiByte: 364.75 GByte/s Read, 386.30 GByte/s Write
3200 MiByte to 3328 MiByte: 369.37 GByte/s Read, 375.10 GByte/s Write
3328 MiByte to 3456 MiByte: 367.84 GByte/s Read, 363.79 GByte/s Write
3456 MiByte to 3584 MiByte: 369.29 GByte/s Read, 378.70 GByte/s Write
3584 MiByte to 3712 MiByte: 368.18 GByte/s Read, 357.22 GByte/s Write
3712 MiByte to 3840 MiByte: 369.22 GByte/s Read, 391.61 GByte/s Write
3840 MiByte to 3968 MiByte: 366.52 GByte/s Read, 374.89 GByte/s Write
3968 MiByte to 4096 MiByte: 364.90 GByte/s Read, 390.91 GByte/s Write
4096 MiByte to 4224 MiByte: 366.89 GByte/s Read, 381.23 GByte/s Write
4224 MiByte to 4352 MiByte: 363.74 GByte/s Read, 389.63 GByte/s Write
4352 MiByte to 4480 MiByte: 369.43 GByte/s Read, 374.06 GByte/s Write
4480 MiByte to 4608 MiByte: 367.25 GByte/s Read, 366.33 GByte/s Write
4608 MiByte to 4736 MiByte: 369.22 GByte/s Read, 376.93 GByte/s Write
4736 MiByte to 4864 MiByte: 368.26 GByte/s Read, 358.21 GByte/s Write
4864 MiByte to 4992 MiByte: 369.18 GByte/s Read, 391.56 GByte/s Write
4992 MiByte to 5120 MiByte: 367.40 GByte/s Read, 375.01 GByte/s Write
5120 MiByte to 5248 MiByte: 365.29 GByte/s Read, 391.32 GByte/s Write
5248 MiByte to 5376 MiByte: 366.74 GByte/s Read, 381.19 GByte/s Write
5376 MiByte to 5504 MiByte: 363.85 GByte/s Read, 390.41 GByte/s Write
5504 MiByte to 5632 MiByte: 369.30 GByte/s Read, 376.21 GByte/s Write
5632 MiByte to 5760 MiByte: 367.34 GByte/s Read, 366.72 GByte/s Write
5760 MiByte to 5888 MiByte: 369.42 GByte/s Read, 376.21 GByte/s Write
5888 MiByte to 6016 MiByte: 367.94 GByte/s Read, 360.68 GByte/s Write
6016 MiByte to 6144 MiByte: 369.29 GByte/s Read, 378.99 GByte/s Write
6144 MiByte to 6272 MiByte: 368.28 GByte/s Read, 363.44 GByte/s Write
6272 MiByte to 6400 MiByte: 368.44 GByte/s Read, 391.56 GByte/s Write
6400 MiByte to 6528 MiByte: 366.70 GByte/s Read, 378.71 GByte/s Write
6528 MiByte to 6656 MiByte: 364.35 GByte/s Read, 390.67 GByte/s Write
6656 MiByte to 6784 MiByte: 367.00 GByte/s Read, 382.89 GByte/s Write
6784 MiByte to 6912 MiByte: 365.82 GByte/s Read, 380.47 GByte/s Write
6912 MiByte to 7040 MiByte: 369.32 GByte/s Read, 375.03 GByte/s Write
7040 MiByte to 7168 MiByte: 367.64 GByte/s Read, 363.79 GByte/s Write
7168 MiByte to 7296 MiByte: 369.32 GByte/s Read, 378.49 GByte/s Write
7296 MiByte to 7424 MiByte: 368.18 GByte/s Read, 356.85 GByte/s Write
7424 MiByte to 7552 MiByte: 369.16 GByte/s Read, 391.49 GByte/s Write
7552 MiByte to 7680 MiByte: 366.62 GByte/s Read, 375.78 GByte/s Write
7680 MiByte to 7808 MiByte: 364.80 GByte/s Read, 390.76 GByte/s Write
7808 MiByte to 7936 MiByte: 366.94 GByte/s Read, 381.97 GByte/s Write
7936 MiByte to 8064 MiByte: 363.99 GByte/s Read, 389.28 GByte/s Write
8064 MiByte to 8192 MiByte: 366.19 GByte/s Read, 374.81 GByte/s Write
8192 MiByte to 8320 MiByte: 364.19 GByte/s Read, 364.86 GByte/s Write
8320 MiByte to 8448 MiByte: 366.12 GByte/s Read, 376.54 GByte/s Write
8448 MiByte to 8576 MiByte: 365.20 GByte/s Read, 357.73 GByte/s Write
8576 MiByte to 8704 MiByte: 366.08 GByte/s Read, 391.21 GByte/s Write
8704 MiByte to 8832 MiByte: 363.98 GByte/s Read, 374.69 GByte/s Write
8832 MiByte to 8960 MiByte: 362.08 GByte/s Read, 391.24 GByte/s Write
8960 MiByte to 9088 MiByte: 363.54 GByte/s Read, 381.80 GByte/s Write
9088 MiByte to 9216 MiByte: 360.68 GByte/s Read, 390.21 GByte/s Write



P2 States Off



Spoiler



Benchmarking DRAM
0 MiByte to 128 MiByte: 435.09 GByte/s Read, 447.34 GByte/s Write
128 MiByte to 256 MiByte: 433.59 GByte/s Read, 444.59 GByte/s Write
256 MiByte to 384 MiByte: 435.89 GByte/s Read, 450.52 GByte/s Write
384 MiByte to 512 MiByte: 433.86 GByte/s Read, 448.69 GByte/s Write
512 MiByte to 640 MiByte: 433.39 GByte/s Read, 461.52 GByte/s Write
640 MiByte to 768 MiByte: 434.29 GByte/s Read, 449.18 GByte/s Write
768 MiByte to 896 MiByte: 433.82 GByte/s Read, 445.52 GByte/s Write
896 MiByte to 1024 MiByte: 435.03 GByte/s Read, 458.29 GByte/s Write
1024 MiByte to 1152 MiByte: 434.30 GByte/s Read, 448.83 GByte/s Write
1152 MiByte to 1280 MiByte: 433.01 GByte/s Read, 452.22 GByte/s Write
1280 MiByte to 1408 MiByte: 436.14 GByte/s Read, 446.28 GByte/s Write
1408 MiByte to 1536 MiByte: 433.84 GByte/s Read, 447.42 GByte/s Write
1536 MiByte to 1664 MiByte: 433.67 GByte/s Read, 465.37 GByte/s Write
1664 MiByte to 1792 MiByte: 433.73 GByte/s Read, 445.66 GByte/s Write
1792 MiByte to 1920 MiByte: 434.01 GByte/s Read, 443.86 GByte/s Write
1920 MiByte to 2048 MiByte: 435.60 GByte/s Read, 453.99 GByte/s Write
2048 MiByte to 2176 MiByte: 434.00 GByte/s Read, 448.42 GByte/s Write
2176 MiByte to 2304 MiByte: 433.27 GByte/s Read, 457.79 GByte/s Write
2304 MiByte to 2432 MiByte: 434.73 GByte/s Read, 449.15 GByte/s Write
2432 MiByte to 2560 MiByte: 432.67 GByte/s Read, 444.57 GByte/s Write
2560 MiByte to 2688 MiByte: 433.44 GByte/s Read, 466.92 GByte/s Write
2688 MiByte to 2816 MiByte: 431.83 GByte/s Read, 447.80 GByte/s Write
2816 MiByte to 2944 MiByte: 432.15 GByte/s Read, 444.16 GByte/s Write
2944 MiByte to 3072 MiByte: 434.44 GByte/s Read, 450.39 GByte/s Write
3072 MiByte to 3200 MiByte: 432.37 GByte/s Read, 448.72 GByte/s Write
3200 MiByte to 3328 MiByte: 433.15 GByte/s Read, 462.61 GByte/s Write
3328 MiByte to 3456 MiByte: 434.30 GByte/s Read, 449.18 GByte/s Write
3456 MiByte to 3584 MiByte: 433.97 GByte/s Read, 444.27 GByte/s Write
3584 MiByte to 3712 MiByte: 434.95 GByte/s Read, 457.91 GByte/s Write
3712 MiByte to 3840 MiByte: 434.30 GByte/s Read, 449.34 GByte/s Write
3840 MiByte to 3968 MiByte: 433.58 GByte/s Read, 447.44 GByte/s Write
3968 MiByte to 4096 MiByte: 435.99 GByte/s Read, 447.80 GByte/s Write
4096 MiByte to 4224 MiByte: 433.87 GByte/s Read, 446.88 GByte/s Write
4224 MiByte to 4352 MiByte: 433.44 GByte/s Read, 463.64 GByte/s Write
4352 MiByte to 4480 MiByte: 433.94 GByte/s Read, 447.04 GByte/s Write
4480 MiByte to 4608 MiByte: 434.16 GByte/s Read, 444.86 GByte/s Write
4608 MiByte to 4736 MiByte: 435.17 GByte/s Read, 457.65 GByte/s Write
4736 MiByte to 4864 MiByte: 434.11 GByte/s Read, 450.65 GByte/s Write
4864 MiByte to 4992 MiByte: 433.13 GByte/s Read, 455.59 GByte/s Write
4992 MiByte to 5120 MiByte: 436.18 GByte/s Read, 445.37 GByte/s Write
5120 MiByte to 5248 MiByte: 434.01 GByte/s Read, 446.73 GByte/s Write
5248 MiByte to 5376 MiByte: 433.56 GByte/s Read, 464.79 GByte/s Write
5376 MiByte to 5504 MiByte: 433.71 GByte/s Read, 445.36 GByte/s Write
5504 MiByte to 5632 MiByte: 433.87 GByte/s Read, 445.81 GByte/s Write
5632 MiByte to 5760 MiByte: 435.55 GByte/s Read, 454.58 GByte/s Write
5760 MiByte to 5888 MiByte: 434.16 GByte/s Read, 448.57 GByte/s Write
5888 MiByte to 6016 MiByte: 433.44 GByte/s Read, 455.27 GByte/s Write
6016 MiByte to 6144 MiByte: 434.73 GByte/s Read, 451.00 GByte/s Write
6144 MiByte to 6272 MiByte: 434.01 GByte/s Read, 447.31 GByte/s Write
6272 MiByte to 6400 MiByte: 433.56 GByte/s Read, 465.79 GByte/s Write
6400 MiByte to 6528 MiByte: 433.65 GByte/s Read, 445.91 GByte/s Write
6528 MiByte to 6656 MiByte: 433.58 GByte/s Read, 444.61 GByte/s Write
6656 MiByte to 6784 MiByte: 435.60 GByte/s Read, 453.85 GByte/s Write
6784 MiByte to 6912 MiByte: 434.01 GByte/s Read, 448.95 GByte/s Write
6912 MiByte to 7040 MiByte: 433.30 GByte/s Read, 457.17 GByte/s Write
7040 MiByte to 7168 MiByte: 434.69 GByte/s Read, 449.47 GByte/s Write
7168 MiByte to 7296 MiByte: 434.27 GByte/s Read, 444.25 GByte/s Write
7296 MiByte to 7424 MiByte: 434.85 GByte/s Read, 460.47 GByte/s Write
7424 MiByte to 7552 MiByte: 434.43 GByte/s Read, 448.40 GByte/s Write
7552 MiByte to 7680 MiByte: 433.58 GByte/s Read, 444.83 GByte/s Write
7680 MiByte to 7808 MiByte: 436.03 GByte/s Read, 448.84 GByte/s Write
7808 MiByte to 7936 MiByte: 433.97 GByte/s Read, 446.43 GByte/s Write
7936 MiByte to 8064 MiByte: 433.54 GByte/s Read, 462.61 GByte/s Write
8064 MiByte to 8192 MiByte: 434.26 GByte/s Read, 448.11 GByte/s Write
8192 MiByte to 8320 MiByte: 434.00 GByte/s Read, 445.01 GByte/s Write
8320 MiByte to 8448 MiByte: 433.87 GByte/s Read, 456.33 GByte/s Write
8448 MiByte to 8576 MiByte: 432.79 GByte/s Read, 448.70 GByte/s Write
8576 MiByte to 8704 MiByte: 433.37 GByte/s Read, 455.11 GByte/s Write
8704 MiByte to 8832 MiByte: 434.88 GByte/s Read, 451.30 GByte/s Write
8832 MiByte to 8960 MiByte: 432.58 GByte/s Read, 447.65 GByte/s Write
8960 MiByte to 9088 MiByte: 432.01 GByte/s Read, 465.45 GByte/s Write
9088 MiByte to 9216 MiByte: 432.15 GByte/s Read, 445.67 GByte/s Write
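If you want to confirm whether the card is sitting in P2 during a CUDA load without watching Afterburner, nvidia-smi can report the current performance state directly (`pstate` is a standard `--query-gpu` field; the helper names below are my own). A minimal Python sketch:

```python
import subprocess

def parse_pstate(smi_output: str) -> str:
    """Normalize nvidia-smi's pstate output, e.g. ' P2\n' -> 'P2'."""
    return smi_output.strip()

def current_pstate() -> str:
    """Query the GPU's current performance state.

    P0 is full 3D clocks; P2 is the reduced-memory-clock state the
    driver uses for CUDA compute workloads."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=pstate", "--format=csv,noheader"],
        text=True,
    )
    return parse_pstate(out)
```

Run it while the bandwidth test is looping: seeing P2 at stock settings means the memory clock is being pulled down, which matches the lower ~365 GB/s numbers above.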


----------



## kithylin

KedarWolf said:


> https://drive.google.com/open?id=1vv_gbzUgEeaQfrNUcPiH7VzuazmgT9Z7
> 
> Download the Nvidia Profile Inspector. Sorry, adding attachments not working so just made a Google Drive link.
> 
> Unzip, right click on it, Run As Admin.
> 
> Go to Common settings and 'CUDA - Force P2 State' as Off and then you'll get the correct bandwidth.
> 
> If you hit L at the top point after CTRL F in Afterburner even on stock settings you'll see the memory clocks go much lower than stock when running the bench because P2 States are enabled and it lowers memory clocks while the bench is running.
> 
> This makes sure they don't drop at all.
> 
> Edit: P2 States on.


Very weird... the version you posted there doesn't even open in Windows 7 x64, it just crashes. And the older versions of the program that do open in Windows 7 don't have the "CUDA - Force P2 State" setting under General like you described. And the bandwidth-testing program drops my 1080 Ti's clocks down to less than stock when I run it.


----------



## KedarWolf

kithylin said:


> Very weird... the version you posted there doesn't even open in Windows 7 x64, it just crashes. And the older versions of the program that do open in Windows 7 don't have the "CUDA - Force P2 State" setting under General like you described. And the bandwidth-testing program drops my 1080 Ti's clocks down to less than stock when I run it.


Try this version, right click, Run As Admin.

https://drive.google.com/open?id=127pBsFhuh2CDZjVBDbC9GVtEXMvmDcmu

Or click on More and try one of the older builds here.

https://ci.appveyor.com/project/Orbmu2k/nvidiaprofileinspector/history

Edit: If the latest version from my Google Drive doesn't work, Build 2.1.3.5 is the oldest build with P2 States.

https://ci.appveyor.com/project/Orbmu2k/nvidiaprofileinspector/build/2.1.3.5/artifacts


----------



## kithylin

KedarWolf said:


> Try this version, right click, Run As Admin.
> 
> https://drive.google.com/open?id=127pBsFhuh2CDZjVBDbC9GVtEXMvmDcmu


Worked, thank you.


----------



## gammagoat

KedarWolf said:


> https://drive.google.com/open?id=1vv_gbzUgEeaQfrNUcPiH7VzuazmgT9Z7
> 
> Download the Nvidia Profile Inspector. Sorry, adding attachments not working so just made a Google Drive link.
> 
> Unzip, right click on it, Run As Admin.
> 
> Go to Common settings and 'CUDA - Force P2 State' as Off and then you'll get the correct bandwidth.
> 
> If you hit L at the top point after CTRL F in Afterburner even on stock settings you'll see the memory clocks go much lower than stock when running the bench because P2 States are enabled and it lowers memory clocks while the bench is running.
> 
> This makes sure they don't drop at all.
> 
> Edit: P2 States on.
> 
> 
> 
> [P2-on and P2-off benchmark spoilers snipped; the full numbers are in KedarWolf's post above]



I have a big ole Prime Ribeye with your name on it, thanks bro!

I'll make sure to rep you when it comes back online.

Went from the 420s to the 470s, pretty near the peak theoretical bandwidth the test says is possible.
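For context, the stock peak pencils out like this; a quick sketch from the card's published memory specs (11 Gbps effective GDDR5X on a 352-bit bus), so an overclocked card will test higher:

```python
# Theoretical peak memory bandwidth of a stock GTX 1080 Ti.
effective_gbps = 11.0   # GDDR5X effective data rate per pin at stock
bus_width_bits = 352    # 1080 Ti memory bus width
peak_gbyte_s = effective_gbps * bus_width_bits / 8  # bits -> bytes
print(f"{peak_gbyte_s:.0f} GByte/s peak at stock")
```

So readings in the 430-470 range are right up against what the memory can theoretically move.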


----------



## Jflisk

Looks like as of tomorrow I will be the proud owner of an EVGA GeForce GTX 1080 Ti FTW3 HYBRID GAMING. I will be switching from the red side to the green side, and I imagine these outperform in every aspect, gaming and VR. Thanks


----------



## kithylin

Jflisk said:


> Looks like as of tomorrow I will be the proud owner of an EVGA GeForce GTX 1080 Ti FTW3 HYBRID GAMING. I will be switching from the red side to the green side, and I imagine these outperform in every aspect, gaming and VR. Thanks


Congrats! Hope you enjoy your new beast. Welcome to the green side. :devil-smi

What did you have before?


----------



## mouacyk

Jflisk said:


> Looks like as of tomorrow I will be the proud owner of an EVGA GeForce GTX 1080 Ti FTW3 HYBRID GAMING. I will be switching from the red side to the green side, and I imagine these outperform in every aspect, gaming and VR. Thanks


Go where the grass is green. No idea how long it will stay that way though, given Intel's budget and poaching of GPU talent.


----------



## EarlZ

mouacyk said:


> Did you observe any throttling, especially thermal? My results for 430GB/s at stock are with a full water block, keeping my temperature under 40C.


No thermal drops at all; also under 40C while running the test.


----------



## EarlZ

KedarWolf said:


> https://drive.google.com/open?id=1vv_gbzUgEeaQfrNUcPiH7VzuazmgT9Z7
> 
> Download the Nvidia Profile Inspector. Sorry, adding attachments not working so just made a Google Drive link.
> 
> Unzip, right click on it, Run As Admin.
> 
> Go to Common settings and set 'CUDA - Force P2 State' to Off, and then you'll get the correct bandwidth.
> 
> If you open the curve editor with Ctrl+F in Afterburner and hit L to lock the top point, you'll see the memory clocks go much lower than stock while the bench runs, even on stock settings, because P2 states are enabled and the driver lowers memory clocks under CUDA load.
> 
> This makes sure they don't drop at all.
> 
> Edit: P2 States on.
> 
> 
> 
> Spoiler
> 
> 
> 
> Benchmarking DRAM
> 0 MiByte to 128 MiByte: 369.53 GByte/s Read, 391.35 GByte/s Write
> 128 MiByte to 256 MiByte: 366.51 GByte/s Read, 374.49 GByte/s Write
> 256 MiByte to 384 MiByte: 365.07 GByte/s Read, 390.79 GByte/s Write
> 384 MiByte to 512 MiByte: 366.94 GByte/s Read, 380.57 GByte/s Write
> 512 MiByte to 640 MiByte: 363.84 GByte/s Read, 390.16 GByte/s Write
> 640 MiByte to 768 MiByte: 369.32 GByte/s Read, 374.28 GByte/s Write
> 768 MiByte to 896 MiByte: 367.34 GByte/s Read, 365.97 GByte/s Write
> 896 MiByte to 1024 MiByte: 369.19 GByte/s Read, 377.90 GByte/s Write
> 1024 MiByte to 1152 MiByte: 368.16 GByte/s Read, 358.17 GByte/s Write
> 1152 MiByte to 1280 MiByte: 366.02 GByte/s Read, 391.57 GByte/s Write
> 1280 MiByte to 1408 MiByte: 364.19 GByte/s Read, 374.68 GByte/s Write
> 1408 MiByte to 1536 MiByte: 362.18 GByte/s Read, 391.23 GByte/s Write
> 1536 MiByte to 1664 MiByte: 363.58 GByte/s Read, 380.58 GByte/s Write
> 1664 MiByte to 1792 MiByte: 360.95 GByte/s Read, 389.98 GByte/s Write
> 1792 MiByte to 1920 MiByte: 363.96 GByte/s Read, 383.14 GByte/s Write
> 1920 MiByte to 2048 MiByte: 363.75 GByte/s Read, 374.28 GByte/s Write
> 2048 MiByte to 2176 MiByte: 366.19 GByte/s Read, 375.56 GByte/s Write
> 2176 MiByte to 2304 MiByte: 364.75 GByte/s Read, 361.48 GByte/s Write
> 2304 MiByte to 2432 MiByte: 369.28 GByte/s Read, 378.90 GByte/s Write
> 2432 MiByte to 2560 MiByte: 368.34 GByte/s Read, 359.74 GByte/s Write
> 2560 MiByte to 2688 MiByte: 369.07 GByte/s Read, 391.56 GByte/s Write
> 2688 MiByte to 2816 MiByte: 366.59 GByte/s Read, 377.48 GByte/s Write
> 2816 MiByte to 2944 MiByte: 364.35 GByte/s Read, 390.68 GByte/s Write
> 2944 MiByte to 3072 MiByte: 367.02 GByte/s Read, 381.64 GByte/s Write
> 3072 MiByte to 3200 MiByte: 364.75 GByte/s Read, 386.30 GByte/s Write
> 3200 MiByte to 3328 MiByte: 369.37 GByte/s Read, 375.10 GByte/s Write
> 3328 MiByte to 3456 MiByte: 367.84 GByte/s Read, 363.79 GByte/s Write
> 3456 MiByte to 3584 MiByte: 369.29 GByte/s Read, 378.70 GByte/s Write
> 3584 MiByte to 3712 MiByte: 368.18 GByte/s Read, 357.22 GByte/s Write
> 3712 MiByte to 3840 MiByte: 369.22 GByte/s Read, 391.61 GByte/s Write
> 3840 MiByte to 3968 MiByte: 366.52 GByte/s Read, 374.89 GByte/s Write
> 3968 MiByte to 4096 MiByte: 364.90 GByte/s Read, 390.91 GByte/s Write
> 4096 MiByte to 4224 MiByte: 366.89 GByte/s Read, 381.23 GByte/s Write
> 4224 MiByte to 4352 MiByte: 363.74 GByte/s Read, 389.63 GByte/s Write
> 4352 MiByte to 4480 MiByte: 369.43 GByte/s Read, 374.06 GByte/s Write
> 4480 MiByte to 4608 MiByte: 367.25 GByte/s Read, 366.33 GByte/s Write
> 4608 MiByte to 4736 MiByte: 369.22 GByte/s Read, 376.93 GByte/s Write
> 4736 MiByte to 4864 MiByte: 368.26 GByte/s Read, 358.21 GByte/s Write
> 4864 MiByte to 4992 MiByte: 369.18 GByte/s Read, 391.56 GByte/s Write
> 4992 MiByte to 5120 MiByte: 367.40 GByte/s Read, 375.01 GByte/s Write
> 5120 MiByte to 5248 MiByte: 365.29 GByte/s Read, 391.32 GByte/s Write
> 5248 MiByte to 5376 MiByte: 366.74 GByte/s Read, 381.19 GByte/s Write
> 5376 MiByte to 5504 MiByte: 363.85 GByte/s Read, 390.41 GByte/s Write
> 5504 MiByte to 5632 MiByte: 369.30 GByte/s Read, 376.21 GByte/s Write
> 5632 MiByte to 5760 MiByte: 367.34 GByte/s Read, 366.72 GByte/s Write
> 5760 MiByte to 5888 MiByte: 369.42 GByte/s Read, 376.21 GByte/s Write
> 5888 MiByte to 6016 MiByte: 367.94 GByte/s Read, 360.68 GByte/s Write
> 6016 MiByte to 6144 MiByte: 369.29 GByte/s Read, 378.99 GByte/s Write
> 6144 MiByte to 6272 MiByte: 368.28 GByte/s Read, 363.44 GByte/s Write
> 6272 MiByte to 6400 MiByte: 368.44 GByte/s Read, 391.56 GByte/s Write
> 6400 MiByte to 6528 MiByte: 366.70 GByte/s Read, 378.71 GByte/s Write
> 6528 MiByte to 6656 MiByte: 364.35 GByte/s Read, 390.67 GByte/s Write
> 6656 MiByte to 6784 MiByte: 367.00 GByte/s Read, 382.89 GByte/s Write
> 6784 MiByte to 6912 MiByte: 365.82 GByte/s Read, 380.47 GByte/s Write
> 6912 MiByte to 7040 MiByte: 369.32 GByte/s Read, 375.03 GByte/s Write
> 7040 MiByte to 7168 MiByte: 367.64 GByte/s Read, 363.79 GByte/s Write
> 7168 MiByte to 7296 MiByte: 369.32 GByte/s Read, 378.49 GByte/s Write
> 7296 MiByte to 7424 MiByte: 368.18 GByte/s Read, 356.85 GByte/s Write
> 7424 MiByte to 7552 MiByte: 369.16 GByte/s Read, 391.49 GByte/s Write
> 7552 MiByte to 7680 MiByte: 366.62 GByte/s Read, 375.78 GByte/s Write
> 7680 MiByte to 7808 MiByte: 364.80 GByte/s Read, 390.76 GByte/s Write
> 7808 MiByte to 7936 MiByte: 366.94 GByte/s Read, 381.97 GByte/s Write
> 7936 MiByte to 8064 MiByte: 363.99 GByte/s Read, 389.28 GByte/s Write
> 8064 MiByte to 8192 MiByte: 366.19 GByte/s Read, 374.81 GByte/s Write
> 8192 MiByte to 8320 MiByte: 364.19 GByte/s Read, 364.86 GByte/s Write
> 8320 MiByte to 8448 MiByte: 366.12 GByte/s Read, 376.54 GByte/s Write
> 8448 MiByte to 8576 MiByte: 365.20 GByte/s Read, 357.73 GByte/s Write
> 8576 MiByte to 8704 MiByte: 366.08 GByte/s Read, 391.21 GByte/s Write
> 8704 MiByte to 8832 MiByte: 363.98 GByte/s Read, 374.69 GByte/s Write
> 8832 MiByte to 8960 MiByte: 362.08 GByte/s Read, 391.24 GByte/s Write
> 8960 MiByte to 9088 MiByte: 363.54 GByte/s Read, 381.80 GByte/s Write
> 9088 MiByte to 9216 MiByte: 360.68 GByte/s Read, 390.21 GByte/s Write
> 
> 
> 
> P2 States Off
> 
> 
> 
> Spoiler
> 
> 
> 
> Benchmarking DRAM
> 0 MiByte to 128 MiByte: 435.09 GByte/s Read, 447.34 GByte/s Write
> 128 MiByte to 256 MiByte: 433.59 GByte/s Read, 444.59 GByte/s Write
> 256 MiByte to 384 MiByte: 435.89 GByte/s Read, 450.52 GByte/s Write
> 384 MiByte to 512 MiByte: 433.86 GByte/s Read, 448.69 GByte/s Write
> 512 MiByte to 640 MiByte: 433.39 GByte/s Read, 461.52 GByte/s Write
> 640 MiByte to 768 MiByte: 434.29 GByte/s Read, 449.18 GByte/s Write
> 768 MiByte to 896 MiByte: 433.82 GByte/s Read, 445.52 GByte/s Write
> 896 MiByte to 1024 MiByte: 435.03 GByte/s Read, 458.29 GByte/s Write
> 1024 MiByte to 1152 MiByte: 434.30 GByte/s Read, 448.83 GByte/s Write
> 1152 MiByte to 1280 MiByte: 433.01 GByte/s Read, 452.22 GByte/s Write
> 1280 MiByte to 1408 MiByte: 436.14 GByte/s Read, 446.28 GByte/s Write
> 1408 MiByte to 1536 MiByte: 433.84 GByte/s Read, 447.42 GByte/s Write
> 1536 MiByte to 1664 MiByte: 433.67 GByte/s Read, 465.37 GByte/s Write
> 1664 MiByte to 1792 MiByte: 433.73 GByte/s Read, 445.66 GByte/s Write
> 1792 MiByte to 1920 MiByte: 434.01 GByte/s Read, 443.86 GByte/s Write
> 1920 MiByte to 2048 MiByte: 435.60 GByte/s Read, 453.99 GByte/s Write
> 2048 MiByte to 2176 MiByte: 434.00 GByte/s Read, 448.42 GByte/s Write
> 2176 MiByte to 2304 MiByte: 433.27 GByte/s Read, 457.79 GByte/s Write
> 2304 MiByte to 2432 MiByte: 434.73 GByte/s Read, 449.15 GByte/s Write
> 2432 MiByte to 2560 MiByte: 432.67 GByte/s Read, 444.57 GByte/s Write
> 2560 MiByte to 2688 MiByte: 433.44 GByte/s Read, 466.92 GByte/s Write
> 2688 MiByte to 2816 MiByte: 431.83 GByte/s Read, 447.80 GByte/s Write
> 2816 MiByte to 2944 MiByte: 432.15 GByte/s Read, 444.16 GByte/s Write
> 2944 MiByte to 3072 MiByte: 434.44 GByte/s Read, 450.39 GByte/s Write
> 3072 MiByte to 3200 MiByte: 432.37 GByte/s Read, 448.72 GByte/s Write
> 3200 MiByte to 3328 MiByte: 433.15 GByte/s Read, 462.61 GByte/s Write
> 3328 MiByte to 3456 MiByte: 434.30 GByte/s Read, 449.18 GByte/s Write
> 3456 MiByte to 3584 MiByte: 433.97 GByte/s Read, 444.27 GByte/s Write
> 3584 MiByte to 3712 MiByte: 434.95 GByte/s Read, 457.91 GByte/s Write
> 3712 MiByte to 3840 MiByte: 434.30 GByte/s Read, 449.34 GByte/s Write
> 3840 MiByte to 3968 MiByte: 433.58 GByte/s Read, 447.44 GByte/s Write
> 3968 MiByte to 4096 MiByte: 435.99 GByte/s Read, 447.80 GByte/s Write
> 4096 MiByte to 4224 MiByte: 433.87 GByte/s Read, 446.88 GByte/s Write
> 4224 MiByte to 4352 MiByte: 433.44 GByte/s Read, 463.64 GByte/s Write
> 4352 MiByte to 4480 MiByte: 433.94 GByte/s Read, 447.04 GByte/s Write
> 4480 MiByte to 4608 MiByte: 434.16 GByte/s Read, 444.86 GByte/s Write
> 4608 MiByte to 4736 MiByte: 435.17 GByte/s Read, 457.65 GByte/s Write
> 4736 MiByte to 4864 MiByte: 434.11 GByte/s Read, 450.65 GByte/s Write
> 4864 MiByte to 4992 MiByte: 433.13 GByte/s Read, 455.59 GByte/s Write
> 4992 MiByte to 5120 MiByte: 436.18 GByte/s Read, 445.37 GByte/s Write
> 5120 MiByte to 5248 MiByte: 434.01 GByte/s Read, 446.73 GByte/s Write
> 5248 MiByte to 5376 MiByte: 433.56 GByte/s Read, 464.79 GByte/s Write
> 5376 MiByte to 5504 MiByte: 433.71 GByte/s Read, 445.36 GByte/s Write
> 5504 MiByte to 5632 MiByte: 433.87 GByte/s Read, 445.81 GByte/s Write
> 5632 MiByte to 5760 MiByte: 435.55 GByte/s Read, 454.58 GByte/s Write
> 5760 MiByte to 5888 MiByte: 434.16 GByte/s Read, 448.57 GByte/s Write
> 5888 MiByte to 6016 MiByte: 433.44 GByte/s Read, 455.27 GByte/s Write
> 6016 MiByte to 6144 MiByte: 434.73 GByte/s Read, 451.00 GByte/s Write
> 6144 MiByte to 6272 MiByte: 434.01 GByte/s Read, 447.31 GByte/s Write
> 6272 MiByte to 6400 MiByte: 433.56 GByte/s Read, 465.79 GByte/s Write
> 6400 MiByte to 6528 MiByte: 433.65 GByte/s Read, 445.91 GByte/s Write
> 6528 MiByte to 6656 MiByte: 433.58 GByte/s Read, 444.61 GByte/s Write
> 6656 MiByte to 6784 MiByte: 435.60 GByte/s Read, 453.85 GByte/s Write
> 6784 MiByte to 6912 MiByte: 434.01 GByte/s Read, 448.95 GByte/s Write
> 6912 MiByte to 7040 MiByte: 433.30 GByte/s Read, 457.17 GByte/s Write
> 7040 MiByte to 7168 MiByte: 434.69 GByte/s Read, 449.47 GByte/s Write
> 7168 MiByte to 7296 MiByte: 434.27 GByte/s Read, 444.25 GByte/s Write
> 7296 MiByte to 7424 MiByte: 434.85 GByte/s Read, 460.47 GByte/s Write
> 7424 MiByte to 7552 MiByte: 434.43 GByte/s Read, 448.40 GByte/s Write
> 7552 MiByte to 7680 MiByte: 433.58 GByte/s Read, 444.83 GByte/s Write
> 7680 MiByte to 7808 MiByte: 436.03 GByte/s Read, 448.84 GByte/s Write
> 7808 MiByte to 7936 MiByte: 433.97 GByte/s Read, 446.43 GByte/s Write
> 7936 MiByte to 8064 MiByte: 433.54 GByte/s Read, 462.61 GByte/s Write
> 8064 MiByte to 8192 MiByte: 434.26 GByte/s Read, 448.11 GByte/s Write
> 8192 MiByte to 8320 MiByte: 434.00 GByte/s Read, 445.01 GByte/s Write
> 8320 MiByte to 8448 MiByte: 433.87 GByte/s Read, 456.33 GByte/s Write
> 8448 MiByte to 8576 MiByte: 432.79 GByte/s Read, 448.70 GByte/s Write
> 8576 MiByte to 8704 MiByte: 433.37 GByte/s Read, 455.11 GByte/s Write
> 8704 MiByte to 8832 MiByte: 434.88 GByte/s Read, 451.30 GByte/s Write
> 8832 MiByte to 8960 MiByte: 432.58 GByte/s Read, 447.65 GByte/s Write
> 8960 MiByte to 9088 MiByte: 432.01 GByte/s Read, 465.45 GByte/s Write
> 9088 MiByte to 9216 MiByte: 432.15 GByte/s Read, 445.67 GByte/s Write


Thanks, I'll try it later, but I'd suppose in gaming the bandwidth is in full swing? I did, however, try running the render test in GPU-Z to take the GPU out of a power-saving state, as that was what I initially thought was causing the lower bandwidth. I'll report back my findings in a bit.

EDIT:

Tried it now, and I am relieved to see that nothing is wrong with my card, though my result is 10-20 GB/s slower than yours at stock. If I leave the state set to "Off", will that have any long-term negative effects, like my memory failing to downclock?



Spoiler



DRAM-Bandwidth of Chunk no. 0 (0 MiByte to 128 MiByte):434.30 GByte/s
DRAM-Bandwidth of Chunk no. 1 (128 MiByte to 256 MiByte):431.33 GByte/s
DRAM-Bandwidth of Chunk no. 2 (256 MiByte to 384 MiByte):431.96 GByte/s
DRAM-Bandwidth of Chunk no. 3 (384 MiByte to 512 MiByte):422.00 GByte/s
DRAM-Bandwidth of Chunk no. 4 (512 MiByte to 640 MiByte):425.60 GByte/s
DRAM-Bandwidth of Chunk no. 5 (640 MiByte to 768 MiByte):435.31 GByte/s
DRAM-Bandwidth of Chunk no. 6 (768 MiByte to 896 MiByte):428.15 GByte/s
DRAM-Bandwidth of Chunk no. 7 (896 MiByte to 1024 MiByte):432.67 GByte/s
DRAM-Bandwidth of Chunk no. 8 (1024 MiByte to 1152 MiByte):423.40 GByte/s
DRAM-Bandwidth of Chunk no. 9 (1152 MiByte to 1280 MiByte):431.26 GByte/s
DRAM-Bandwidth of Chunk no. 10 (1280 MiByte to 1408 MiByte):430.97 GByte/s
DRAM-Bandwidth of Chunk no. 11 (1408 MiByte to 1536 MiByte):430.12 GByte/s
DRAM-Bandwidth of Chunk no. 12 (1536 MiByte to 1664 MiByte):436.13 GByte/s
DRAM-Bandwidth of Chunk no. 13 (1664 MiByte to 1792 MiByte):436.13 GByte/s
DRAM-Bandwidth of Chunk no. 14 (1792 MiByte to 1920 MiByte):432.08 GByte/s
DRAM-Bandwidth of Chunk no. 15 (1920 MiByte to 2048 MiByte):425.56 GByte/s
DRAM-Bandwidth of Chunk no. 16 (2048 MiByte to 2176 MiByte):432.87 GByte/s
DRAM-Bandwidth of Chunk no. 17 (2176 MiByte to 2304 MiByte):433.73 GByte/s
DRAM-Bandwidth of Chunk no. 18 (2304 MiByte to 2432 MiByte):429.79 GByte/s
DRAM-Bandwidth of Chunk no. 19 (2432 MiByte to 2560 MiByte):431.85 GByte/s
DRAM-Bandwidth of Chunk no. 20 (2560 MiByte to 2688 MiByte):424.87 GByte/s
DRAM-Bandwidth of Chunk no. 21 (2688 MiByte to 2816 MiByte):431.68 GByte/s
DRAM-Bandwidth of Chunk no. 22 (2816 MiByte to 2944 MiByte):425.70 GByte/s
DRAM-Bandwidth of Chunk no. 23 (2944 MiByte to 3072 MiByte):402.43 GByte/s
DRAM-Bandwidth of Chunk no. 24 (3072 MiByte to 3200 MiByte):437.25 GByte/s
DRAM-Bandwidth of Chunk no. 25 (3200 MiByte to 3328 MiByte):421.04 GByte/s
DRAM-Bandwidth of Chunk no. 26 (3328 MiByte to 3456 MiByte):421.72 GByte/s
DRAM-Bandwidth of Chunk no. 27 (3456 MiByte to 3584 MiByte):437.29 GByte/s
DRAM-Bandwidth of Chunk no. 28 (3584 MiByte to 3712 MiByte):437.44 GByte/s
DRAM-Bandwidth of Chunk no. 29 (3712 MiByte to 3840 MiByte):421.84 GByte/s
DRAM-Bandwidth of Chunk no. 30 (3840 MiByte to 3968 MiByte):426.48 GByte/s
DRAM-Bandwidth of Chunk no. 31 (3968 MiByte to 4096 MiByte):420.80 GByte/s
DRAM-Bandwidth of Chunk no. 32 (4096 MiByte to 4224 MiByte):421.68 GByte/s
DRAM-Bandwidth of Chunk no. 33 (4224 MiByte to 4352 MiByte):427.46 GByte/s
DRAM-Bandwidth of Chunk no. 34 (4352 MiByte to 4480 MiByte):418.70 GByte/s
DRAM-Bandwidth of Chunk no. 35 (4480 MiByte to 4608 MiByte):419.41 GByte/s
DRAM-Bandwidth of Chunk no. 36 (4608 MiByte to 4736 MiByte):427.66 GByte/s
DRAM-Bandwidth of Chunk no. 37 (4736 MiByte to 4864 MiByte):418.62 GByte/s
DRAM-Bandwidth of Chunk no. 38 (4864 MiByte to 4992 MiByte):427.55 GByte/s
DRAM-Bandwidth of Chunk no. 39 (4992 MiByte to 5120 MiByte):432.63 GByte/s
DRAM-Bandwidth of Chunk no. 40 (5120 MiByte to 5248 MiByte):418.58 GByte/s
DRAM-Bandwidth of Chunk no. 41 (5248 MiByte to 5376 MiByte):414.55 GByte/s
DRAM-Bandwidth of Chunk no. 42 (5376 MiByte to 5504 MiByte):418.78 GByte/s
DRAM-Bandwidth of Chunk no. 43 (5504 MiByte to 5632 MiByte):419.36 GByte/s
DRAM-Bandwidth of Chunk no. 44 (5632 MiByte to 5760 MiByte):429.42 GByte/s
DRAM-Bandwidth of Chunk no. 45 (5760 MiByte to 5888 MiByte):434.78 GByte/s
DRAM-Bandwidth of Chunk no. 46 (5888 MiByte to 6016 MiByte):426.30 GByte/s
DRAM-Bandwidth of Chunk no. 47 (6016 MiByte to 6144 MiByte):416.59 GByte/s
DRAM-Bandwidth of Chunk no. 48 (6144 MiByte to 6272 MiByte):434.73 GByte/s
DRAM-Bandwidth of Chunk no. 49 (6272 MiByte to 6400 MiByte):413.33 GByte/s
DRAM-Bandwidth of Chunk no. 50 (6400 MiByte to 6528 MiByte):416.81 GByte/s
DRAM-Bandwidth of Chunk no. 51 (6528 MiByte to 6656 MiByte):424.64 GByte/s
DRAM-Bandwidth of Chunk no. 52 (6656 MiByte to 6784 MiByte):426.85 GByte/s
DRAM-Bandwidth of Chunk no. 53 (6784 MiByte to 6912 MiByte):421.00 GByte/s
DRAM-Bandwidth of Chunk no. 54 (6912 MiByte to 7040 MiByte):418.48 GByte/s
DRAM-Bandwidth of Chunk no. 55 (7040 MiByte to 7168 MiByte):424.82 GByte/s
DRAM-Bandwidth of Chunk no. 56 (7168 MiByte to 7296 MiByte):431.73 GByte/s
DRAM-Bandwidth of Chunk no. 57 (7296 MiByte to 7424 MiByte):434.68 GByte/s
DRAM-Bandwidth of Chunk no. 58 (7424 MiByte to 7552 MiByte):418.43 GByte/s
DRAM-Bandwidth of Chunk no. 59 (7552 MiByte to 7680 MiByte):419.73 GByte/s
DRAM-Bandwidth of Chunk no. 60 (7680 MiByte to 7808 MiByte):427.53 GByte/s
DRAM-Bandwidth of Chunk no. 61 (7808 MiByte to 7936 MiByte):420.93 GByte/s
DRAM-Bandwidth of Chunk no. 62 (7936 MiByte to 8064 MiByte):421.63 GByte/s
DRAM-Bandwidth of Chunk no. 63 (8064 MiByte to 8192 MiByte):427.39 GByte/s
DRAM-Bandwidth of Chunk no. 64 (8192 MiByte to 8320 MiByte):421.00 GByte/s
DRAM-Bandwidth of Chunk no. 65 (8320 MiByte to 8448 MiByte):428.24 GByte/s
DRAM-Bandwidth of Chunk no. 66 (8448 MiByte to 8576 MiByte):429.36 GByte/s
DRAM-Bandwidth of Chunk no. 67 (8576 MiByte to 8704 MiByte):421.35 GByte/s
DRAM-Bandwidth of Chunk no. 68 (8704 MiByte to 8832 MiByte):426.50 GByte/s
DRAM-Bandwidth of Chunk no. 69 (8832 MiByte to 8960 MiByte):429.09 GByte/s
DRAM-Bandwidth of Chunk no. 70 (8960 MiByte to 9088 MiByte):419.52 GByte/s
DRAM-Bandwidth of Chunk no. 71 (9088 MiByte to 9216 MiByte):424.41 GByte/s


----------



## Jflisk

kithylin said:


> Congrats! Hope you enjoy your new beast. Welcome to the green side. :devil-smi
> 
> What did you have before?


<------ AMD Fury X CrossFire; the picture in my avatar is the machine as it is now.


----------



## 8051

Jflisk said:


> <------ AMD Fury X CrossFire; the picture in my avatar is the machine as it is now.


A single 1080 Ti can't match the performance of a Fury X CrossFire setup in games that support it, can it?


----------



## gavros777

Nvidia released this firmware update for my Titan X Maxwell card to support DP 1.3/1.4:
http://www.nvidia.com/object/nv-uefi-update-x64.html

I am using a custom BIOS. Will this firmware update affect my custom BIOS if I apply it to my card?


----------



## EarlZ

How reliable is memtestg80 for testing a memory overclock? I sometimes see it spit out errors even at stock speeds :/

I always get the random-blocks error:



Spoiler



Test iteration 60 (GPU 0, 3072 MiB): 0 errors so far
Moving Inversions (ones and zeros): 0 errors (31 ms)
Memtest86 Walking 8-bit: 0 errors (329 ms)
True Walking zeros (8-bit): 0 errors (156 ms)
True Walking ones (8-bit): 0 errors (156 ms)
Moving Inversions (random): 0 errors (47 ms)
Memtest86 Walking zeros (32-bit): 0 errors (641 ms)
Memtest86 Walking ones (32-bit): 0 errors (640 ms)
Random blocks: 5943 errors (109 ms) 
Memtest86 Modulo-20: 0 errors (2000 ms)
Logic (one iteration): 0 errors (16 ms)
Logic (4 iterations): 0 errors (31 ms)
Logic (shared memory, one iteration): 0 errors (16 ms)
Logic (shared-memory, 4 iterations): 0 errors (31 ms)

Test iteration 61 (GPU 0, 3072 MiB): 5943 errors so far
Moving Inversions (ones and zeros): 0 errors (31 ms)
Memtest86 Walking 8-bit: 0 errors (328 ms)
True Walking zeros (8-bit): 0 errors (157 ms)
True Walking ones (8-bit): 0 errors (156 ms)
Moving Inversions (random): 0 errors (47 ms)
Memtest86 Walking zeros (32-bit): 0 errors (640 ms)
Memtest86 Walking ones (32-bit): 0 errors (641 ms)
Random blocks: 0 errors (78 ms)
Memtest86 Modulo-20: 0 errors (2000 ms)
Logic (one iteration): 0 errors (16 ms)
Logic (4 iterations): 0 errors (31 ms)
Logic (shared memory, one iteration): 0 errors (16 ms)
Logic (shared-memory, 4 iterations): 0 errors (31 ms)


----------



## kithylin

8051 said:


> A single 1080 Ti can't match the performance of a Fury X CrossFire setup in games that support it, can it?


http://gpu.userbenchmark.com/Compare/AMD-R9-Fury-X-vs-Nvidia-GTX-1080-Ti/3498vs3918

A single 1080 Ti at stock is +88% faster than a single Fury X at stock, and at *best* games only scale +80% with a second card, whether CrossFireX or SLI, so essentially yes, a single 1080 Ti can match dual-CrossFireX and still come out ahead.
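The arithmetic behind that comparison can be sketched like this; the +88% and +80% figures are taken from the post above (UserBenchmark averages, not measurements of my own):

```python
# Back-of-the-envelope check of the 1080 Ti vs. dual Fury X comparison.
# The 88% uplift and 80% best-case scaling figures come from the post above.
single_fury_x = 1.00                   # baseline: one Fury X at stock
gtx_1080_ti   = 1.88                   # +88% over a single Fury X
dual_fury_x   = single_fury_x + 0.80   # best-case CrossFireX scaling of +80%

lead_in_fury_units = gtx_1080_ti - dual_fury_x        # lead in single-Fury-X units
lead_relative      = gtx_1080_ti / dual_fury_x - 1.0  # lead relative to the dual-card setup

print(f"Lead: {lead_in_fury_units:.2f}x of a single card, {lead_relative:.1%} over dual cards")
```

So the single card edges out best-case CrossFireX by about 8 points of single-Fury-X performance, which works out to roughly 4-5% over the dual-card setup.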


----------



## 8051

kithylin said:


> http://gpu.userbenchmark.com/Compare/AMD-R9-Fury-X-vs-Nvidia-GTX-1080-Ti/3498vs3918
> 
> A single 1080 Ti at stock is +88% faster than a single Fury X at stock, and at *best* games only scale +80% with a second card, whether CrossFireX or SLI, so essentially yes, a single 1080 Ti can match dual-CrossFireX and still come out ahead.


For some reason I thought the Fury X was faster than a 980 Ti, and SLI 980 Tis seem to be on par with a 1080 Ti.


----------



## kithylin

8051 said:


> For some reason I thought the Fury X was faster than a 980 Ti, and SLI 980 Tis seem to be on par with a 1080 Ti.


Do remember that you don't get straight +100% scaling with SLI or CrossFireX. Most titles scale only +40% to +50%; some may do up to +80%. That variable performance is why I personally abandoned SLI years ago.


----------



## KedarWolf

EarlZ said:


> Thanks, I'll try it later, but I'd suppose in gaming the bandwidth is in full swing? I did, however, try running the render test in GPU-Z to take the GPU out of a power-saving state, as that was what I initially thought was causing the lower bandwidth. I'll report back my findings in a bit.
> 
> EDIT:
> 
> Tried it now, and I am relieved to see that nothing is wrong with my card, though my result is 10-20 GB/s slower than yours at stock. If I leave the state set to "Off", will that have any long-term negative effects, like my memory failing to downclock?
> 
> 
> 
> Spoiler
> 
> 
> 
> DRAM-Bandwidth of Chunk no. 0 (0 MiByte to 128 MiByte):434.30 GByte/s
> DRAM-Bandwidth of Chunk no. 1 (128 MiByte to 256 MiByte):431.33 GByte/s
> DRAM-Bandwidth of Chunk no. 2 (256 MiByte to 384 MiByte):431.96 GByte/s
> DRAM-Bandwidth of Chunk no. 3 (384 MiByte to 512 MiByte):422.00 GByte/s
> DRAM-Bandwidth of Chunk no. 4 (512 MiByte to 640 MiByte):425.60 GByte/s
> DRAM-Bandwidth of Chunk no. 5 (640 MiByte to 768 MiByte):435.31 GByte/s
> DRAM-Bandwidth of Chunk no. 6 (768 MiByte to 896 MiByte):428.15 GByte/s
> DRAM-Bandwidth of Chunk no. 7 (896 MiByte to 1024 MiByte):432.67 GByte/s
> DRAM-Bandwidth of Chunk no. 8 (1024 MiByte to 1152 MiByte):423.40 GByte/s
> DRAM-Bandwidth of Chunk no. 9 (1152 MiByte to 1280 MiByte):431.26 GByte/s
> DRAM-Bandwidth of Chunk no. 10 (1280 MiByte to 1408 MiByte):430.97 GByte/s
> DRAM-Bandwidth of Chunk no. 11 (1408 MiByte to 1536 MiByte):430.12 GByte/s
> DRAM-Bandwidth of Chunk no. 12 (1536 MiByte to 1664 MiByte):436.13 GByte/s
> DRAM-Bandwidth of Chunk no. 13 (1664 MiByte to 1792 MiByte):436.13 GByte/s
> DRAM-Bandwidth of Chunk no. 14 (1792 MiByte to 1920 MiByte):432.08 GByte/s
> DRAM-Bandwidth of Chunk no. 15 (1920 MiByte to 2048 MiByte):425.56 GByte/s
> DRAM-Bandwidth of Chunk no. 16 (2048 MiByte to 2176 MiByte):432.87 GByte/s
> DRAM-Bandwidth of Chunk no. 17 (2176 MiByte to 2304 MiByte):433.73 GByte/s
> DRAM-Bandwidth of Chunk no. 18 (2304 MiByte to 2432 MiByte):429.79 GByte/s
> DRAM-Bandwidth of Chunk no. 19 (2432 MiByte to 2560 MiByte):431.85 GByte/s
> DRAM-Bandwidth of Chunk no. 20 (2560 MiByte to 2688 MiByte):424.87 GByte/s
> DRAM-Bandwidth of Chunk no. 21 (2688 MiByte to 2816 MiByte):431.68 GByte/s
> DRAM-Bandwidth of Chunk no. 22 (2816 MiByte to 2944 MiByte):425.70 GByte/s
> DRAM-Bandwidth of Chunk no. 23 (2944 MiByte to 3072 MiByte):402.43 GByte/s
> DRAM-Bandwidth of Chunk no. 24 (3072 MiByte to 3200 MiByte):437.25 GByte/s
> DRAM-Bandwidth of Chunk no. 25 (3200 MiByte to 3328 MiByte):421.04 GByte/s
> DRAM-Bandwidth of Chunk no. 26 (3328 MiByte to 3456 MiByte):421.72 GByte/s
> DRAM-Bandwidth of Chunk no. 27 (3456 MiByte to 3584 MiByte):437.29 GByte/s
> DRAM-Bandwidth of Chunk no. 28 (3584 MiByte to 3712 MiByte):437.44 GByte/s
> DRAM-Bandwidth of Chunk no. 29 (3712 MiByte to 3840 MiByte):421.84 GByte/s
> DRAM-Bandwidth of Chunk no. 30 (3840 MiByte to 3968 MiByte):426.48 GByte/s
> DRAM-Bandwidth of Chunk no. 31 (3968 MiByte to 4096 MiByte):420.80 GByte/s
> DRAM-Bandwidth of Chunk no. 32 (4096 MiByte to 4224 MiByte):421.68 GByte/s
> DRAM-Bandwidth of Chunk no. 33 (4224 MiByte to 4352 MiByte):427.46 GByte/s
> DRAM-Bandwidth of Chunk no. 34 (4352 MiByte to 4480 MiByte):418.70 GByte/s
> DRAM-Bandwidth of Chunk no. 35 (4480 MiByte to 4608 MiByte):419.41 GByte/s
> DRAM-Bandwidth of Chunk no. 36 (4608 MiByte to 4736 MiByte):427.66 GByte/s
> DRAM-Bandwidth of Chunk no. 37 (4736 MiByte to 4864 MiByte):418.62 GByte/s
> DRAM-Bandwidth of Chunk no. 38 (4864 MiByte to 4992 MiByte):427.55 GByte/s
> DRAM-Bandwidth of Chunk no. 39 (4992 MiByte to 5120 MiByte):432.63 GByte/s
> DRAM-Bandwidth of Chunk no. 40 (5120 MiByte to 5248 MiByte):418.58 GByte/s
> DRAM-Bandwidth of Chunk no. 41 (5248 MiByte to 5376 MiByte):414.55 GByte/s
> DRAM-Bandwidth of Chunk no. 42 (5376 MiByte to 5504 MiByte):418.78 GByte/s
> DRAM-Bandwidth of Chunk no. 43 (5504 MiByte to 5632 MiByte):419.36 GByte/s
> DRAM-Bandwidth of Chunk no. 44 (5632 MiByte to 5760 MiByte):429.42 GByte/s
> DRAM-Bandwidth of Chunk no. 45 (5760 MiByte to 5888 MiByte):434.78 GByte/s
> DRAM-Bandwidth of Chunk no. 46 (5888 MiByte to 6016 MiByte):426.30 GByte/s
> DRAM-Bandwidth of Chunk no. 47 (6016 MiByte to 6144 MiByte):416.59 GByte/s
> DRAM-Bandwidth of Chunk no. 48 (6144 MiByte to 6272 MiByte):434.73 GByte/s
> DRAM-Bandwidth of Chunk no. 49 (6272 MiByte to 6400 MiByte):413.33 GByte/s
> DRAM-Bandwidth of Chunk no. 50 (6400 MiByte to 6528 MiByte):416.81 GByte/s
> DRAM-Bandwidth of Chunk no. 51 (6528 MiByte to 6656 MiByte):424.64 GByte/s
> DRAM-Bandwidth of Chunk no. 52 (6656 MiByte to 6784 MiByte):426.85 GByte/s
> DRAM-Bandwidth of Chunk no. 53 (6784 MiByte to 6912 MiByte):421.00 GByte/s
> DRAM-Bandwidth of Chunk no. 54 (6912 MiByte to 7040 MiByte):418.48 GByte/s
> DRAM-Bandwidth of Chunk no. 55 (7040 MiByte to 7168 MiByte):424.82 GByte/s
> DRAM-Bandwidth of Chunk no. 56 (7168 MiByte to 7296 MiByte):431.73 GByte/s
> DRAM-Bandwidth of Chunk no. 57 (7296 MiByte to 7424 MiByte):434.68 GByte/s
> DRAM-Bandwidth of Chunk no. 58 (7424 MiByte to 7552 MiByte):418.43 GByte/s
> DRAM-Bandwidth of Chunk no. 59 (7552 MiByte to 7680 MiByte):419.73 GByte/s
> DRAM-Bandwidth of Chunk no. 60 (7680 MiByte to 7808 MiByte):427.53 GByte/s
> DRAM-Bandwidth of Chunk no. 61 (7808 MiByte to 7936 MiByte):420.93 GByte/s
> DRAM-Bandwidth of Chunk no. 62 (7936 MiByte to 8064 MiByte):421.63 GByte/s
> DRAM-Bandwidth of Chunk no. 63 (8064 MiByte to 8192 MiByte):427.39 GByte/s
> DRAM-Bandwidth of Chunk no. 64 (8192 MiByte to 8320 MiByte):421.00 GByte/s
> DRAM-Bandwidth of Chunk no. 65 (8320 MiByte to 8448 MiByte):428.24 GByte/s
> DRAM-Bandwidth of Chunk no. 66 (8448 MiByte to 8576 MiByte):429.36 GByte/s
> DRAM-Bandwidth of Chunk no. 67 (8576 MiByte to 8704 MiByte):421.35 GByte/s
> DRAM-Bandwidth of Chunk no. 68 (8704 MiByte to 8832 MiByte):426.50 GByte/s
> DRAM-Bandwidth of Chunk no. 69 (8832 MiByte to 8960 MiByte):429.09 GByte/s
> DRAM-Bandwidth of Chunk no. 70 (8960 MiByte to 9088 MiByte):419.52 GByte/s
> DRAM-Bandwidth of Chunk no. 71 (9088 MiByte to 9216 MiByte):424.41 GByte/s


It'll be fine. The memory will still downclock at idle when you're not running a game; it just won't drop into the lower P2 state when a game or app requests it.

Yes, I just tested it: with P2 states off, core and memory idle at 607/405.


----------



## Jflisk

8051 said:


> A single 1080 Ti can't match the performance of a Fury X CrossFire setup in games that support it, can it?





kithylin said:


> http://gpu.userbenchmark.com/Compare/AMD-R9-Fury-X-vs-Nvidia-GTX-1080-Ti/3498vs3918
> 
> A single 1080 Ti at stock is +88% faster than a single Fury X at stock, and at *best* games only scale +80% with a second card, whether CrossFireX or SLI, so essentially yes, a single 1080 Ti can match dual-CrossFireX and still come out ahead.





8051 said:


> For some reason I thought the Fury X was faster than a 980Ti and SLI 980Ti seems to be on par w/1080Ti.



kithylin is correct: the 1080 Ti is slightly faster (the actual number is closer to 10%, but close enough). It is also a single card, as opposed to two cards taking up space and power, with CrossFire profiles that don't exist or that cause stuttering. I got my card last night and got it installed, then took some games for a spin (Titanfall 2, COD Black Ops 3). This card is nice; I have not tried the Rift with the new card yet, but I'll probably get into that tonight. I had to go through the routine yesterday: replaced one bad fan, reinstalled a fan I had removed for the Fury Xs, and cleaned out the radiators and filters. Figured while it was open I might as well get it done.


----------



## pfinch

So, anyone got experience with the Nvidia firmware DP update? Does it ruin XOC?


----------



## Jflisk

Not sure if we still need to be added to groups, but below is my proof of ownership. Please add me to the group; thank you in advance.

https://www.techpowerup.com/gpuz/details/z3rhb


----------



## 8051

kithylin said:


> Do remember that you don't get straight +100% scaling with SLI or CrossFireX. Most titles scale only +40% to +50%, some may do up to +80%. It's this variable performance as to why personally I abandoned SLI years ago.


Kithylin does SLI ever do anything for minimum FPS?


----------



## kithylin

8051 said:


> Kithylin does SLI ever do anything for minimum FPS?


Of course it does, it scales everything, minimum and maximum.

It's just dependant on a lot of factors.

Does the game even support SLI at all?
Did Nvidia release an optimized SLI profile for the game?
Does the game even scale at all with a second card?

Some games even if you have SLI support by the developer and an nvidia profile, sometimes the best you'll get is like lop-sided performance, first card 99% utilization second card 40%, and only see like +25% gains.
Some titles everything works great and you can get up to the full +80% from a second card.

Also depending on the computer, how many PCIE Lanes it has and how they're allotted to different onboard components... to power two cards in SLI you may possibly have to give up onboard sata ports entirely, or give up your m.2 ports, or both. 

Even if everything scales properly and you get the full +80%, then there's the whole double power usage, and double heat which makes your a/c system run more because your room's hotter, which is even more power. 

In general SLI is for rich people that can throw away money at the power bill and don't care about it, or possibly someone in a rental situation where you don't pay for power. I've seen some apartments that are "All bills paid" including power.
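The power-bill point above can be put in rough numbers. This is a back-of-envelope sketch; the 250 W extra draw, 4 hours/day of gaming, and $0.13/kWh rate are my illustrative assumptions, not figures from the thread:

```python
# Back-of-envelope cost of running a second card, under stated assumptions:
# an extra ~250 W under load, 4 hours of gaming per day, $0.13/kWh.
# All three numbers are illustrative assumptions, not from the thread.
def yearly_cost(extra_watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    """Annual electricity cost of the extra draw, in dollars."""
    kwh_per_year = extra_watts / 1000.0 * hours_per_day * 365.0
    return kwh_per_year * usd_per_kwh

print(round(yearly_cost(250, 4, 0.13), 2))  # ~47.45 per year
```

Roughly $50/year before counting the extra air-conditioning load, which is the hidden second bill kithylin mentions.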


----------



## brenopapito

pfinch said:


> So anyone got experience regarding the nvidia firmware DP update? Does it ruin XOC ?


I just updated my Zotac 1080 Ti FE with XOC bios and the DP 1 is now working!


----------



## KedarWolf

pfinch said:


> So anyone got experience regarding the nvidia firmware DP update? Does it ruin XOC ?


Never affected my XOC BIOS, can still do 2125 at 1.2v.


----------



## kithylin

brenopapito said:


> I just updated my Zotac 1080 Ti FE with XOC bios and the DP 1 is now working!


Using my EVGA Black Edition GTX 1080 Ti, I've never had any issues. All of my onboard ports have all always worked even with XOC bios here. So I see no reason to update anything. "If it works, don't fix it."


----------



## ttnuagmada

mouacyk said:


> That looks like a bad strap. Try incrementing/decrementing the memory oc until the next strap kicks in, then retry bandwidth. Repeat until it goes up.


I've noticed that newer Afterburner versions take the guesswork out of catching a bad mem OC. For example, I've had my mem OC at +597 for several months, which put me at 6210 (I landed there after watching framerates in Heaven to make sure I was getting the best bin). Now it will stay at 6210 until I drop it all the way down to +579, at which point it drops to 6180. It will no longer let me select a speed between 6180 and 6210. Both speeds seem to be good straps too, as my bandwidth only goes down about 4 GB/s (I'm getting 492 GB/s on reads and 537 GB/s on writes at 6210).


----------



## kithylin

ttnuagmada said:


> Ive noticed that it looks like with newer Afterburner versions, it takes the guesswork out of getting a bad mem OC. for example; I've had my mem OC at +597 for several months, which put me at 6210 (landed there after watching framerates in Heaven to make sure i was getting the best bin), but now, it will stay at 6210 until i drop it all the way down to +579, and then it will drop to 6180. It will no longer let me select a speed between 6180 or 6210. Both speeds seem to be good straps too, as my bandwidth only goes down like 4 Gbps (getting 492 gb/s read - 537 gb/s on the writes at 6210)


Personally I never bothered with bandwidth tests, straps, binning, or any of that nonsense. I just set the RAM as high as it will go in Afterburner without crashing. For me that turns out to be +700 MHz, which ends up being 6210 MHz. I get higher scores in 3DMark than with the RAM at +0 (no overclock), so obviously it's faster than stock. Games don't crash, nothing crashes, there are no artifacts, and 3DMark shows a documented increase in performance. That's all I care about. Go play games, be happy, never think too much about it.

I think some of y'all over-analyze things to death around here sometimes.


----------



## brenopapito

kithylin said:


> Using my EVGA Black Edition GTX 1080 Ti, I've never had any issues. All of my onboard ports have all always worked even with XOC bios here. So I see no reason to update anything. "If it works, don't fix it."


It's an "FE bug," as KedarWolf describes here: http://www.overclock.net/forum/69-nvidia/1627212-how-flash-different-bios-your-1080-ti.html

_Note: the top DisplayPort will be disabled if you flash any BIOS on the FE other than one of the stock BIOSes._


----------



## LancerVI

I just picked up an MSI GAMING X GeForce GTX 1080 Ti 11GB GDDR5X from Amazon Warehouse / fulfilled by Amazon for $565.00. I couldn't believe it when I saw it. It's used, but verified to work, in its original box with accessories, and covered completely by Amazon's return policy. I still can't believe it. It also came with free 2-day shipping for Prime customers.

Anyone have any experience, good or bad, with Amazon Warehouse? I tried to buy 4 of the six they had, but by the time I checked out, only one was available.

Thoughts? Either on this specific card or Amazon Warehouse?


----------



## kithylin

LancerVI said:


> I just picked up an MSI GAMING X GeForce GTX 1080 Ti 11GB GDRR5X from Amazon Warehouse / fullfilled by Amazon for $565.00 I couldn't believe it when I saw it. It's used, but verified to work and in it's original box and accessories and covered completely by amazon's return policy. I still can't believe it. Also came with Free 2-day ship for Prime customers.
> 
> Any one have any experience, good or bad with Amazon Warehouse? I tried to buy 4 of the six they have, but by the time I checked out, only one was available.
> 
> Thoughts? Either on this specific card or Amazon Warehouse?


I would wonder whether MSI will honor their warranty period for an item purchased like this.
I'd suggest you get on the phone with a CSR at MSI and verify that; otherwise you might be left holding the bag after the Amazon return period is up.


----------



## LancerVI

kithylin said:


> I would wonder if MSI will honor their warranty period for an item purchased like this.
> I would probably suggest you get on the phone and call a CSR Rep @ MSI and verify that. Else you might be left holding the bag after the amazon return period is up.


I've thought about that. That's probably a good idea. I'm definitely going to torture test it when I get it to make sure it's tip top. It's really an unbelievable price. I had to read it top to bottom several times before I pulled the trigger. Amazon said they'll guarantee it.


----------



## LancerVI

Based on the website, MSI goes by serial number and manufacture date, so I would say I'm probably good to go!!! 

https://us.msi.com/page/warranty


----------



## kithylin

LancerVI said:


> Based on the website, MSI goes by serial number and manufacture date, so I would say I'm probably good to go!!!
> 
> https://us.msi.com/page/warranty


If it was me and my money I would be calling to hear it direct from someone at MSI myself, just to sleep better at night. That's just me though.


----------



## EarlZ

ttnuagmada said:


> Ive noticed that it looks like with newer Afterburner versions, it takes the guesswork out of getting a bad mem OC. for example; I've had my mem OC at +597 for several months, which put me at 6210 (landed there after watching framerates in Heaven to make sure i was getting the best bin), but now, it will stay at 6210 until i drop it all the way down to +579, and then it will drop to 6180. It will no longer let me select a speed between 6180 or 6210. Both speeds seem to be good straps too, as my bandwidth only goes down like 4 Gbps (getting 492 gb/s read - 537 gb/s on the writes at 6210)


6210 MHz is +720 if I'm not mistaken; with +597 I'm getting 6102. I still don't have an effective way to check whether my memory OC is stable; the memtestG80 posted here seems unreliable and will throw errors even at stock speeds (tested on 3 cards).

+1000 gives me 503-515 GB/s; I don't know if it's stable though.
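The offset-vs-effective-clock confusion here comes down to base clocks: Afterburner adds its offset to whatever base the card's BIOS ships with, and factory-OC cards ship different bases. A minimal sketch, assuming the reference 1080 Ti base of 5505 MHz as reported by Afterburner (an assumption on my part):

```python
# Afterburner memory offsets are added to the card's base clock, and factory
# BIOSes ship different bases, so the same offset lands on different speeds.
# Assumption: reference 1080 Ti base of 5505 MHz as shown by Afterburner.
BASE_MHZ = 5505

def effective_mem_clock(offset_mhz: int, base_mhz: int = BASE_MHZ) -> int:
    """Effective memory clock shown by monitoring tools for a given offset."""
    return base_mhz + offset_mhz

print(effective_mem_clock(597))  # 6102, matching the +597 -> 6102 reading above
print(effective_mem_clock(720))  # 6225 with this base; a card showing 6210
                                 # at a similar offset implies a different base
```

That would explain why +597 lands on 6102 for one card and 6210 for another.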


----------



## kithylin

EarlZ said:


> 6210Mhz is +720 if I am not mistaken, with +597 I am getting 6102. I still dont have an effective way to check if my memory OC is stable, the memtestg80 posted here seems unreliable and will throw an error even if you are on stock speeds ( tested on 3 cards )
> 
> +1000 gives me 503-515GB/s, I dont know if its stable though


Basically just do what I've been doing (and tell my friends to do) for stability: go use it. I just play games and run benchmarks a few times looking for anomalies (after you've seen Firestrike run about 200 times, you know what's different and what's normal), and play some of my favorite older games that I've spent thousands of hours in, so I know what the graphics should look like and would recognize anything out of place.

So far I'm 10 months on my RAM overclock since buying my 1080 Ti, with no crashes and no anomalies, nothing wrong at all. If there were anything wrong with my overclock I would have seen it long ago.


----------



## EarlZ

kithylin said:


> Basically just do what I've been doing.. and tell some of my other friends to do as to stability: Go use it. I just go play games, and run benchmarks a few times and look for anomalies (after you've seen firestrike run about 200 times you know what's different and what's normal), play some of my favorite older games I've had for years and spent thousands of hours in.. and I know what the graphics should look like and know what something out of place would be.. etc.
> 
> So far I'm 10 months on my ram overclock since buying my 1080 Ti and no crashes no anomalies yet, nothing wrong at all. If there was anything wrong with my overclock I would of seen it a long time ago by now.


I was led to believe that there won't be any graphical anomalies on GDDR5X, due to the ECC and automatic throttle/downclock. I haven't seen any either over the last 20 minutes of FF15 with +1000 on mem.


----------



## hwoverclkd

LancerVI said:


> I just picked up an MSI GAMING X GeForce GTX 1080 Ti 11GB GDRR5X from Amazon Warehouse / fullfilled by Amazon for $565.00 I couldn't believe it when I saw it. It's used, but verified to work and in it's original box and accessories and covered completely by amazon's return policy. I still can't believe it. Also came with Free 2-day ship for Prime customers.
> 
> Any one have any experience, good or bad with Amazon Warehouse? I tried to buy 4 of the six they have, but buy the time I checked out, only one was available.
> 
> Thoughts? Either on this specific card or Amazon Warehouse?


I'd say that's a very good deal. I bought a returned 980 Ti Classified from Amazon over 2 years ago, with everything in it and without a scratch, for a little over $400. It looked really new and the box was in almost pristine condition (just a crease on the side, nothing major, and it's just the box). I didn't think twice because 1) Amazon's standard return policy applies, 2) EVGA will still cover it, and 3) I can file a claim with Amex if anything goes wrong with options 1 and 2. It's still working perfectly today, no issues at all.

I also picked up a 1080 Ti a few days ago for a little under the original MSRP, but it's bundled with a lot of things that I'd (technically) have to pay regular price for. lol


----------



## kithylin

EarlZ said:


> I was lead to believe that there wont be any graphical anomalities on GDDR5X due to the ECC and automatic throttle/downclock. I havent seen any as well for the last 20mins on FF15 with +1000 on mem


I definitely saw tons of terrible graphical artifacts all over the place on my 1080 Ti with the RAM set to +800 in Afterburner; everything looked like garbage and nothing was discernible as actual graphics, at least in games. "RAM test" software showed fine, but games looked gods-awful. I backed it down to +700 and mine works perfectly now, so it totally is a thing. And I'm pretty sure normal 1080 Tis do -NOT- come with ECC RAM. Historically that's been one of the big selling points of the Quadro series, which I know for sure has ECC RAM, and I think the Titans may have it, but I may be wrong. I've never heard of any "normal" GTX gamer card from Nvidia ever having ECC RAM.

There may be automatic throttling/downclocking if you're on a normal BIOS. I'm on the XOC BIOS though, so it let me push it right over the cliff and off into the ether, and I had to back the OC down.


----------



## ttnuagmada

kithylin said:


> Personally I never bothered with bandwidth tests, or straps or binning or any of that nonsense. I just set the ram as high as it will go in afterburner and not crash. For me it turns out that's +700 mhz which ends up being 6210 Mhz. I get higher scores in 3dmark than with the ram speed set at +0 (no overclock), so obviously it's faster than stock. Games don't crash, nothing crashes, there's no artifacts, 3dmark shows documented increase in performance, everything seems fine. That's all I care about. Go play games, happy, never think too much about it.
> 
> I think some of y'all over-analyze things to death around here sometimes.


No, the bandwidth test is just the easiest, latest way to measure what we're talking about. There are steps within the memory speeds where the timings are loose for that speed, and a slightly lower speed will tighten the timings and actually increase your memory performance. You can see it by watching FRAPS in camera mode in Heaven or something similar: the FPS drop as you slightly raise the memory speed, until it hits the next strap and the FPS jump back up. So while you're getting better performance than stock by overclocking the RAM that much, you may not be getting optimal performance for that particular clock-speed range.

There's nothing wrong with cranking it up and making sure it's stable, but with these cards, that doesn't mean it's fully dialed in.
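The strap-hunting procedure described above can be sketched as a sweep: step the offset, record FPS, and flag the points where FPS jumps back up (the next strap kicking in). `measure_fps` here is a stand-in for manually reading FPS off Heaven/FRAPS at each step, not a real API:

```python
# Sweep sketch of the strap-hunting described above: raise the memory offset
# in steps and watch benchmark FPS; a dip followed by a jump marks the point
# where the next (tighter) strap kicks in. measure_fps is a placeholder for
# reading FPS in Heaven/FRAPS by hand at each step, not a real API.
def find_strap_boundaries(offsets, measure_fps, jump=1.02):
    """Return offsets where FPS jumps more than `jump`x over the previous step."""
    boundaries, prev_fps = [], None
    for off in offsets:
        fps = measure_fps(off)
        if prev_fps is not None and fps > prev_fps * jump:
            boundaries.append(off)
        prev_fps = fps
    return boundaries

# Toy readings standing in for real measurements:
fake = {0: 100, 25: 101, 50: 97, 75: 96, 100: 104, 125: 105}
print(find_strap_boundaries(sorted(fake), fake.get))  # [100]
```

In the toy data, FPS sags through +50/+75 and jumps at +100, so +100 would be the start of the next (good) strap.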


----------



## ttnuagmada

EarlZ said:


> 6210Mhz is +720 if I am not mistaken, with +597 I am getting 6102. I still dont have an effective way to check if my memory OC is stable, the memtestg80 posted here seems unreliable and will throw an error even if you are on stock speeds ( tested on 3 cards )
> 
> +1000 gives me 503-515GB/s, I dont know if its stable though


It's +597 on mine (Aorus Xtreme). Actually, I can set it as low as +580 now before it drops to 6180.


----------



## AlbertoM

I use Furmark to set my RAM overclock.

I increase it to the point where FPS goes up, not down. On my 1080, that's about +595 MHz.

Above +600 MHz, the FPS count drops by about 10 immediately, even though there's no crash or artifacts.


----------



## LancerVI

acupalypse said:


> I'd say that's a very good deal. I bought a returned 980 Ti Classified from amazon over 2 years ago, with everything in it and it didn't have any scratch, for a little over $400. So basically it looks really new and the box was in in almost pristine condition - there's just crease on the side but nothing major, and it's just the box. I didn't think twice because 1) amazon std return policy applies 2) evga will still cover it and 3) i can file a claim to amex if anything goes wrong with 1 and 2 options. It's still perfectly working today, no issues or anything.
> 
> I also picked up 1080Ti a few days ago for a little lower than the original MSRP price but it's bundled with a lot of things that i need to pay regular price for (technically). lol


That's awesome. Thanks for the info. I just couldn't pass it up.


----------



## kithylin

ttnuagmada said:


> No, the bandwidth test is just the eastiest, latest way to measure what we're talking about. There are steps within the memory speeds where the timings are loose for that speed, and a slightly lower speed will tighten up the timings, and actually increase your memory performance. You could see it by watching fraps while being in camera mode in Heaven or something similar. You could see the fps drop by slightly raising the memory speed, until it hits the next strap, and the fps jump up again. While you're getting better performance than stock by overclocking ram by that much, you may not actually be getting optimal performance for that particular clock speed range.
> 
> There's nothing wrong with cranking it up and making sure its stable, but with these cards, that doesn't mean it's fully dialed in.


If I'm already getting, say, +15 FPS minimums over stock (no overclock) with it at +700, then I'm not going to concern myself with +1 or +3 FPS more, even if I could get a lower strap. It just seems like a waste of time for minimal gains at that point.



AlbertoM said:


> I use Furmark to set my ram overclock.
> 
> I increase it to the point that FPS go up and not down. On my 1080, it's about +595 MHz.
> 
> Above +600 MHz, the FPS count drop like 10 points immediately, even though there is no crash or artifacts.


Current-day Nvidia drivers will detect Furmark and prevent cards from running at full speed while it's active, intentionally downclocking them or otherwise limiting their performance. Nvidia has been doing this via drivers since around the GTX 700 series. It's not a very good way to test cards anymore.


----------



## KedarWolf

kithylin said:


> If I'm already getting say +15 FPS minimums over stock (no overclock) with it at +700, then I'm not going to concern myself with +1 or +3 FPS more at that point even if I could get a lower strap. Just seems like a waste of time to me for minimal gains at that point.
> 
> 
> Current day nvidia drivers will detect furmark and prevent the cards from running full speed while it's active, and intentionally either down-clock the cards or otherwise limit their performance while it's running. Nvidia has been doing this via drivers since about the GTX-700 series came out. It's not a very good way to test cards in general anymore today.


If you turn P2 States off like I showed earlier it won't downclock running Furmark.


----------



## kithylin

KedarWolf said:


> If you turn P2 States off like I showed earlier it won't downclock running Furmark.


The original reason Nvidia did that is because Furmark was legitimately killing video cards in the past. I wouldn't circumvent that if it were me and my card, and I'll never touch Furmark on my expensive video cards myself; it's not worth the risk.


----------



## ttnuagmada

kithylin said:


> If I'm already getting say +15 FPS minimums over stock (no overclock) with it at +700, then I'm not going to concern myself with +1 or +3 FPS more at that point even if I could get a lower strap. Just seems like a waste of time to me for minimal gains at that point.



I'm just explaining that there is a point to it and that we aren't "over-analyzing" anything. I haven't seen anyone in this thread try to push anything on you, so I'm not sure why you feel the need to defend not dialing your mem OC all the way in. You're free to do whatever you'd like with your GPU, and if you don't feel like dialing it all the way in, that's your prerogative.


----------



## LancerVI

LancerVI said:


> I just picked up an MSI GAMING X GeForce GTX 1080 Ti 11GB GDRR5X from Amazon Warehouse / fullfilled by Amazon for $565.00 I couldn't believe it when I saw it. It's used, but verified to work and in it's original box and accessories and covered completely by amazon's return policy. I still can't believe it. Also came with Free 2-day ship for Prime customers.
> 
> Any one have any experience, good or bad with Amazon Warehouse? I tried to buy 4 of the six they have, but by the time I checked out, only one was available.
> 
> Thoughts? Either on this specific card or Amazon Warehouse?





acupalypse said:


> I'd say that's a very good deal. I bought a returned 980 Ti Classified from amazon over 2 years ago, with everything in it and it didn't have any scratch, for a little over $400. So basically it looks really new and the box was in in almost pristine condition - there's just crease on the side but nothing major, and it's just the box. I didn't think twice because 1) amazon std return policy applies 2) evga will still cover it and 3) i can file a claim to amex if anything goes wrong with 1 and 2 options. It's still perfectly working today, no issues or anything.
> 
> I also picked up 1080Ti a few days ago for a little lower than the original MSRP price but it's bundled with a lot of things that i need to pay regular price for (technically). lol



Well, it looks like I lucked out. This MSI Gaming X GTX 1080 Ti seems to be tip-top. I've been running it through the wringer all day, finishing with an almost 2-hour run in Furmark, with temps topping out at 59C. Rock solid, no faults, all for $565.

Any suggestions on what else I should check?


----------



## kithylin

LancerVI said:


> Well, it looks like I lucked out. This MSI Gaming X GTX 1080 ti seems to be tip top. Been running it through the ringer all day, finishing up with an almost 2 hour run in furmark with temps topping out at 59c. Rock solid, with no faults all for $565.
> 
> Any suggestions on what else I should check???


Try ARK: Survival Evolved at max (Epic) settings @ 1080p with motion blur on. That will utterly punish any card and run any 1080 Ti at 100% constantly; it's almost as brutal as most 3DMark benchmarks.

Also, as already stated in this thread, Nvidia drivers detect Furmark and downclock your card while it runs (unless you manually circumvent it). So Furmark is not actually fully loading your card; it's running it at about half speed. Try 3DMark Firestrike and Superposition.


----------



## LancerVI

kithylin said:


> Try ark survival evolved @ max settings in epic @ 1080p, with motion blur on. That will utterly punish any card and run any 1080 Ti @ 100% constantly, it's almost as brutal as most 3dmark benchmarks.
> 
> Also as already stated in this thread (Unless you manually circumvent it), nvidia drivers detect and down-clock your card when it detects furmark running. So furmark is not actually loading your card at all, it's actually running it at about half speed. Try 3dmark firestrike and superposition.


That's great info; I didn't know that. Luckily for me, I was running Firestrike and Time Spy all day as well.

Just downloaded superposition, so I'll give that a try.


----------



## 8051

LancerVI said:


> Well, it looks like I lucked out. This MSI Gaming X GTX 1080 ti seems to be tip top. Been running it through the ringer all day, finishing up with an almost 2 hour run in furmark with temps topping out at 59c. Rock solid, with no faults all for $565.
> 
> Any suggestions on what else I should check???


Where did you get such a great price? Private party?

Try out Deus Ex: Mankind Divided with AA on at 4K; that'll put the 1080 Ti through the wringer.


----------



## 2ndLastJedi

8051 said:


> LancerVI said:
> 
> 
> 
> Well, it looks like I lucked out. This MSI Gaming X GTX 1080 ti seems to be tip top. Been running it through the ringer all day, finishing up with an almost 2 hour run in furmark with temps topping out at 59c. Rock solid, with no faults all for $565.
> 
> Any suggestions on what else I should check???
> 
> 
> 
> Where did you get such a great price? Private party?
> 
> Try out Deus Ex: Mankind Divided w/AA on at 4K that'll put the 1080Ti thru the ringer.

Deus Ex sucks as a game but will certainly load a 1080 Ti, and it still only gives 40 fps in 4K! P.O.S.


----------



## kithylin

2ndLastJedi said:


> Deus Ex sucks as a game but will certainly load a 1080Ti and still only give 40 fps in 4k ! P.O.S .


This about describes ARK as well. Even with a 1080 Ti, ARK: Survival Evolved at max settings @ 1080p will run around 50 FPS once you have a base built and some dinos tamed.

Games today are just garbage.


----------



## feznz

8051 said:


> Where did you get such a great price? Private party?
> 
> Try out Deus Ex: Mankind Divided w/AA on at 4K that'll put the 1080Ti thru the ringer.



I got Deus Ex: Mankind Divided but just couldn't get into it. I didn't find it as demanding as Assassin's Creed Origins, which I've clocked 150+ hours on; I had to get all the expansion packs and thoroughly enjoyed the game as a whole.


----------



## LancerVI

8051 said:


> Where did you get such a great price? Private party?
> 
> Try out Deus Ex: Mankind Divided w/AA on at 4K that'll put the 1080Ti thru the ringer.


I got it from Amazon. Amazon Warehouse, to be exact. It's items that have been returned to Amazon and are sold, apparently, at a deep discount. I was a bit wary at first, but it was covered by Amazon's usual return policy, so I said "what the hell" and bought it. $565.

Since the price was so low, I went ahead and bought the EKWB block for this card. All told, $715 for an MSI Gaming X GTX 1080 Ti and the EK block for it. I'm going to hold off putting on the block just to be absolutely sure, but I beat that card up yesterday gaming and benching. I had Deus Ex installed but forgot to run it. I did, however, play some Kingdom Come: Deliverance; that definitely punishes a GPU. NICE upgrade from my 980 Ti.


----------



## mouacyk

LancerVI said:


> I got it from Amazon. Amazon Warehouse to be exact. It's items that have been returned to Amazon and are sold, apparently, at a deep discount. I was a bit wary at first, but it was covered by Amazons' usual return policy so I said "what the hell." and bought it. $565.
> 
> Since the price was so low, I went ahead and bought the EKWB block for this card. So all told, $715 for a MSI Gaming X GTX 1080ti and the EKWB for this card. Going to hold off putting on the block just to be absolutely sure; but I beat that card up yesterday gaming and benching. I had Deus Ex installed, forgot to run it. I did, however play some Kingdom Comeeliverance; that definitely punishes a gpu. NICE upgrade from my 980 ti.


Kind of curious on what stable top clocks and voltage were you able to get on this sample?


----------



## LancerVI

mouacyk said:


> Kind of curious on what stable top clocks and voltage were you able to get on this sample?


You know, I really didn't pay attention to that. I'll see when I get home tonight and test. What should I be looking for? What's considered good clocks for the 1080 Ti? Off the top of my head, I remember seeing it at 19XX while benching yesterday.


----------



## mouacyk

LancerVI said:


> You know, I really didn't pay attention to that. I'll try to see when I get home tonight and test. What should I be looking for. What's considered good clocks for the 1080ti? Just off the top of my head, I remember seeing it at 19XX while benching yesterday.


Since you have a full coverage block, here are some interesting voltage points to check:

1.1v - 2100MHz (voltage possible on stock BIOS, but needs to be unlocked in Afterburner). Firestrike ~ 22K
1.2v - 2200MHz (will need XOC BIOS) Firestrike ~ 23K


----------



## kithylin

mouacyk said:


> Since you have a full coverage block, here are some interesting voltage points to check:
> 
> 1.1v - 2100MHz (voltage possible on stock BIOS, but needs to be unlocked in Afterburner). Firestrike ~ 22K
> 1.2v - 2200MHz (will need XOC BIOS) Firestrike ~ 23K


^^^^ This depends on the card; not all cards can do this. My 1080 Ti, even with a full-cover waterblock and XOC, needs 1.15v for 2100 and 1.2v for 2138. Not all 1080 Tis will do 2200 even with XOC and 1.200v; in fact most won't.


----------



## LancerVI

mouacyk said:


> Since you have a full coverage block, here are some interesting voltage points to check:
> 
> 1.1v - 2100MHz (voltage possible on stock BIOS, but needs to be unlocked in Afterburner). Firestrike ~ 22K
> 1.2v - 2200MHz (will need XOC BIOS) Firestrike ~ 23K





kithylin said:


> ^^^^ This depends on the card, not all cards can do this. My 1080 Ti even with full-cover waterblock and XOC needs 1.15v for 2100 and 1.2v for 2138, not all 1080 Ti's will do 2200, even with XOC and 1.200v, in fact most won't.


I'll give it a try. I haven't installed the block yet, as I wanted to make sure this thing works fine at stock, which it appears to. Once I get the block on, I'll definitely see what's up. According to what I've found online, the MSI Gaming X is pretty much already OC'd to its max, but I'll definitely try for more. 900mm of rad and only a CPU block in my current build, so I have plenty of headroom. Trying to decide whether to go separate loops or run it in series (Pump > Rad > CPU > Pump > Rad > GPU), or separate them out.


----------



## KY_BULLET

Has anyone flashed the XOC onto an MSI 1080 Ti Armor OC? I've looked through here but haven't found anyone who tried it.

I'm using a Thermaltake 240mm Riing AIO (push/pull config). I can run 2100 MHz at 1.093v and stay under 50C. Just wondering if I can get more out of it.

I've tried 2113 MHz and it will hold for about 30 seconds, then downclock to 2100 MHz in Sky Diver, Superposition, pretty much any benchmark except Firestrike, Time Spy, etc. If I try anything above 2088 MHz in Firestrike and the like, my driver crashes out; I'm guessing because it's power limited?


----------



## mouacyk

KY_BULLET said:


> Anyone flash the XOC on an MSI 1080 ti Armor OC? I've looked through here but, havent found anyone that tried it.
> 
> I'm using a Thermaltake 240mm Riing AIO (Push Pull config). I can run 2100mhz at 1.093v and be under 50c. Just wondering if I can get more out of it.
> 
> I've tried 2113mhz and it will hold it for about 30 seconds then downclock to 2100mhz on skydiver, superposition, pretty much any benchmark except firestrike, timespy, etc etc...If I try anything above 2088mhz on firestrike etc.., my driver crashes out I'm guessing because of power limited?


1. Congratulations on 2100MHz at 1.1v! If that's game stable, then it's top of the line stuff.
2. Since you're using an AIO cooler (as opposed to full block), be extra careful that the VRM's are properly cooled when testing with the XOC BIOS. This can pull 400W+ and burn an inadequately cooled VRM.
3. If you get a driver reset crash, the clock isn't stable; not because it's power limited. Eliminating the power limit is not exactly known to fix crashes; it just allows you to sustain a higher stable clock.


----------



## kithylin

mouacyk said:


> 1. Congratulations on 2100MHz at 1.1v! If that's game stable, then it's top of the line stuff.
> 2. Since you're using an AIO cooler (as opposed to full block), be extra careful that the VRM's are properly cooled when testing with the XOC BIOS. This can pull 400W+ and burn an inadequately cooled VRM.
> 3. If you get a driver reset crash, the clock isn't stable; not because it's power limited. Eliminating the power limit is not exactly known to fix crashes; it just allows you to sustain a higher stable clock.


I've found personally that on the XOC BIOS my 2139 clock is perfectly stable IF I can keep the GPU at or below 55C. It only crashes the driver if the GPU hits 60C or more, so you may want to add a disclaimer that keeping the card cold has a big effect on overclocks. I know that's hot for water; it's partially my fault, as I'm only using a single 3x120 radiator, and I think it's a bit dirty too (planning to clean it later this week). Ultimately I'd like to set up a big multi-radiator loop just to keep my 1080 Ti under 50C even at 1.200v and 2138 MHz, hopefully more like 45C or less under load. Something like three 420mm rads in a big TT Core X9 case. That's for later though, maybe next year. I also have my CPU in the loop, a 5820K @ 4.5 GHz. A week or so ago I tested adding a little 2x120mm rad into the loop alongside the 3x120mm; temps didn't decrease much, but I could run the 2139 @ 1.2v clock all the time with no crashing, even at 65C. So more radiators do have an impact, and it seems Pascal is -REALLY- sensitive to temperature.


----------



## KY_BULLET

mouacyk said:


> KY_BULLET said:
> 
> 
> 
> Anyone flash the XOC on an MSI 1080 ti Armor OC? I've looked through here but, havent found anyone that tried it.
> 
> I'm using a Thermaltake 240mm Riing AIO (Push Pull config). I can run 2100mhz at 1.093v and be under 50c. Just wondering if I can get more out of it.
> 
> I've tried 2113mhz and it will hold it for about 30 seconds then downclock to 2100mhz on skydiver, superposition, pretty much any benchmark except firestrike, timespy, etc etc...If I try anything above 2088mhz on firestrike etc.., my driver crashes out I'm guessing because of power limited?
> 
> 
> 
> 1. Congratulations on 2100MHz at 1.1v! If that's game stable, then it's top of the line stuff.
> 2. Since you're using an AIO cooler (as opposed to full block), be extra careful that the VRM's are properly cooled when testing with the XOC BIOS. This can pull 400W+ and burn an inadequately cooled VRM.
> 3. If you get a driver reset crash, the clock isn't stable; not because it's power limited. Eliminating the power limit is not exactly known to fix crashes; it just allows you to sustain a higher stable clock.
Click to expand...

No, not game stable; quick benchmark stable though. I really need to go full water block. If I were under water, I think I could hold that clock at that voltage.


----------



## EarlZ

Does the XOC bios work on the MSI Gaming X ?


----------



## KedarWolf

EarlZ said:


> Does the XOC bios work on the MSI Gaming X ?


Yes, it does. :h34r-smi


----------



## EDK-TheONE

EarlZ said:


> Does the XOC bios work on the MSI Gaming X ?


Yes, it works well, but note that the fan RPM ramps up a lot: around 60% it will be 2500 RPM!


----------



## kithylin

EDK-TheONE said:


> Yes it works well, but notice rpm of fans goes up too much around 60% will be 2500rpm!


XOC is not for air cooled cards. Do not use it on air cooled cards ever. Water cooled only, it can kill air cooled cards.


----------



## 8051

kithylin said:


> XOC is not for air cooled cards. Do not use it on air cooled cards ever. Water cooled only, it can kill air cooled cards.


I've been using the XOC VBIOS on my air cooled Gigabyte Gaming OC 1080Ti for months now, but I replaced the fans with more powerful shrouded versions (incl. one 6000 RPM 92x38mm vaneaxial fan blowing right over the VRM's) and it's barely enough. Dying Light is the only game where I have to deliberately set a lower core clock. 

I'm hoping an Arctic Cooling Accelero Xtreme III with more powerful fans will do better.


----------



## kithylin

8051 said:


> I've been using the XOC VBIOS on my air cooled Gigabyte Gaming OC 1080Ti for months now, but I replaced the fans with more powerful shrouded versions (incl. one 6000 RPM 92x38mm vaneaxial fan blowing right over the VRM's) and it's barely enough. Dying Light is the only game where I have to deliberately set a lower core clock.
> 
> I'm hoping an Arctic Cooling Accelero Xtreme III with more powerful fans will do better.


While it may have worked for you, and you're apparently okay with gambling with the health of your $800 video card like that, it's still something we're going to actively discourage for other users. Air cooled cards, no matter how good the cooler, just don't have the cooling capacity for an unlocked 1080 Ti like you get with XOC. And even if it "works" for you, you can't really overclock on it with air cooling anyway, so there's little point in even using it.


----------



## gammagoat

kithylin said:


> While it may of worked for you, and you're apparently okay with gambling with the health of your $800 video card like that, it's still something we're going to try and actively discourage to other users. Air cooled cards, no matter how good the cooler, just don't have the cooling capacity for an unlocked 1080 Ti like what you get with XOC. Even then if it "works" for you, you can't overclock on it with air cooling anyway, so there's literally no point in even using it.


I don't understand, is the XOC bios all or nothing?

Does it just run up the voltage/amperage to max? 

Couldn't one use this bios with the understanding while your air cooling that you might not be able to use it to its full potential?


----------



## kithylin

gammagoat said:


> I don't understand, is the XOC bios all or nothing?
> 
> Does it just run up the voltage/amperage to max?
> 
> Couldn't one use this bios with the understanding while your air cooling that you might not be able to use it to its full potential?


The XOC bios was originally intended for LN2 overclocking. As such, most (almost all) of the restrictions built in by nvidia are removed when using it. This means quite literally -NO LIMIT AT ALL- on power or amperage draw. If you use XOC on air cooling, it is entirely possible for the card to suddenly pull an excessive amount of power and sustain physical damage without warning if you open a benchmark or a game that pushes it to peak power. This could happen in seconds, before the onboard boost throttling mechanism has time to react, because air cooling can't handle the rapid heat fluctuations from these cards once they're unlocked, overclocked and overvolted with XOC.

The only even moderately safe way to use the XOC bios is with a fullcover water block in a custom loop, mostly because with custom water cooling it takes quite a lot of time for the card to heat up the liquid in the loop, which at least gives the card time to trigger some of the onboard failsafes like throttling. One of those few AIOs that are also fullcover with a built-in radiator+pump would work too. Some people with XOC and certain games and certain overclocks have reported their 1080 Ti's exceeding 500+ watts at times. It's terribly dangerous, unsafe and in general a very bad idea, and no one should ever do it with air cooling. Unless you're filthy rich with disposable income and don't care if you destroy an $800 - $1000 video card for the lulz. I imagine most folks here are not in that situation though.

It does also allow higher voltage in afterburner for overclocking, yes, but that's not all it does. There are supposedly hardware protections built in to prevent the card from frying, but they're one-shot, like fuses that melt. Once you trigger those, the card's dead. And most manufacturers won't RMA your card if you do that to it.

There's a whole separate thread about it and everyone has already discussed this and many other things over hundreds of pages by now.

XOC Bios thread: http://www.overclock.net/forum/69-nvidia/1627212-how-flash-different-bios-your-1080-ti.html


----------



## gammagoat

kithylin said:


> The XOC bios is originally intended for LN2 overclocking. As such most (almost all) of the built in restrictions by nvidia are removed when using it. This means quite literally -NO LIMIT AT ALL- for power or amperage draw. If you use XOC on air cooling, then it is entirely possible for the card to run over and suddenly pull an excessive amount of power and sustain physical damage without warning if you were to open a benchmark or some game that caused it to hit peak power. This could happen in seconds before the onboard boost throttling mechanism has time to react due to how air cooling can't handle the rapid heat fluxuations from these cards if they were unlocked with XOC and overclocked and over volted, on air cooling. The only even moderately safe way to use XOC bios is with a fullcover water block in a custom loop. Mostly due to how with custom water cooling it takes quite a lot of time for any changes in the card to heat up the liquid in the loop and as such it would at least give the card time to trigger some of the onboard failsafes like throttling and such. Or one of those few AIO's that are also fullcover with a built in radiator+pump. Some people with XOC and certain games and certain overclocks have reported their 1080 Ti's exceeding 500+ watts at times. It's terribly dangerous, unsafe and in general a very bad idea and no one should ever do it with air cooling. Unless you're filthy rich with disposable income and don't care if you destroy an $800 - $1000 video card for the lulz. I imagine most folks here are not in that situation though.
> 
> It does also also allow higher voltage in afterburner for overclocking, yes, but that's not entirely all it does. There are supposedly hardware protections built in to place to prevent the card from frying, but they're once or nothing, like fuses that will melt and such. Once you trigger those card's dead. And most manufacturers won't RMA your card if you do that to it.
> 
> There's a whole separate thread about it and everyone has already discussed this and many other things over hundreds of pages by now.
> 
> XOC Bios thread: http://www.overclock.net/forum/69-nvidia/1627212-how-flash-different-bios-your-1080-ti.html



Thanks for your detailed answer. 

If I understand correctly then this Bios could/will exceed limits that I would place with say Afterburner?


----------



## kithylin

gammagoat said:


> Thanks for your detailed answer.
> 
> If I understand correctly then this Bios could/will exceed limits that I would place with say Afterburner?


If you use the XOC bios on a 1080 Ti, it no longer has any limitations at all on its power, regardless of whatever you set in afterburner. Even if you only set it to, say, 1.200v in afterburner, that's still enough for the card to pull down 500 or more watts in certain situations in certain titles. Not to say they always pull that much; mine averages around 250~350 watts typical when gaming even with XOC on it. But it's possible that it could. I've seen mine get up to 430 watts before with Project C.A.R.S.
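
For a rough feel of where numbers like that come from, dynamic power in CMOS scales roughly with clock times voltage squared. The sketch below uses an assumed 250W / 1900MHz / 1.050v baseline (made up for illustration, not a measurement from this thread) and ignores leakage and the transient spikes discussed above, so it will underestimate worst-case draw:

```python
# Rough CMOS scaling estimate: dynamic power ~ f * V^2.
# Baseline numbers are assumptions for illustration, not measured data.

def scaled_power(p0_w, f0_mhz, v0, f_mhz, v):
    """Estimate power at a new clock/voltage from a measured baseline."""
    return p0_w * (f_mhz / f0_mhz) * (v / v0) ** 2

# Assume a hypothetical 1080 Ti measured at 250W, 1900MHz, 1.050v at stock.
estimate = scaled_power(250.0, 1900.0, 1.050, 2100.0, 1.200)
print(f"estimated draw at 2100MHz / 1.200v: {estimate:.0f} W")
```

Even this conservative model lands well above 350W for a 2100MHz / 1.200v overclock, which is consistent with the sustained draws people report here.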


----------



## 8051

kithylin said:


> While it may of worked for you, and you're apparently okay with gambling with the health of your $800 video card like that, it's still something we're going to try and actively discourage to other users. Air cooled cards, no matter how good the cooler, just don't have the cooling capacity for an unlocked 1080 Ti like what you get with XOC. Even then if it "works" for you, you can't overclock on it with air cooling anyway, so there's literally no point in even using it.


But you're wrong, because with the stock VBIOS I hit power limits long before I hit my temp limits, and the constant throttling of the stock VBIOS was also very annoying.

Under H2O your VRM's and VRAM's are commonly connected to the cold plate through thermal pads, but with my Accelero Xtreme III it'll be thermal epoxy, which has much better thermal conductivity.


----------



## kithylin

8051 said:


> But you're wrong because with the stock VBIOS I hit power limits long before I hit my temp limits and the constant throttling of the stock VBIOS was also very annoying.
> 
> Under H2O your VRM's and VRAM's are commonly connected to the cold plate through thermal pads, but with my Accelero Xtreme III it'll be thermal epoxy which has much better thermal conductivity.


Except with water cooling we're trying to heat up the entire mass of the liquid (which is cooled with a radiator). And you're just heating up a bit of copper with some fans on it. There's a pretty big difference there. It takes an awful lot longer to heat up a water loop than it does any air cooler. That's generally why most folks use it for video cards and other components.


----------



## Rangerscott

Just installed the latest 398.11 drivers and they completely screwed up my display settings. They took away my HDR, 4:4:4 chroma, and so on. Now I can't even turn on the "enhanced HDMI 2.0" setting on my TV or I get a black screen. Guess I'll roll back to 391.25, which was working.


----------



## 8051

kithylin said:


> Except with water cooling we're trying to heat up the entire mass of the liquid (which is cooled with a radiator). And you're just heating up a bit of copper with some fans on it. There's a pretty big difference there. It takes an awful lot longer to heat up a water loop than it does any air cooler. That's generally why most folks use it for video cards and other components.


And you're heating up frag tape, which can hardly conduct heat as well as epoxy or that "bit of copper".


----------



## krizby

8051 said:


> And you're heating up frag tape, which can hardly conduct heat as well as epoxy or that "bit of copper".


Thermal conductivity is one thing, but it's also about the temperature of the copper block. A full cover waterblock usually sits at around water temperature (which is around 39C maximum in my case, max GPU temp 42C with Grizzly Conductonaut), while the copper on your Accelero heats up to around 55C, conservatively speaking. Even with low performance thermal pads you can be sure the VRAM and VRM will never reach 55C with a full cover waterblock (unless you cheap out on radiators or fans and let the water temp reach 55C, of course). I could put toothpaste on the GPU and it would still be colder than the Accelero cooler with Grizzly Conductonaut.


----------



## 8051

krizby said:


> Thermal conductivity is one thing, it's also about the temperature of the copper block. Full cover waterblock is usually at around water temperature (which is around 39C maximum in my case, max GPU temp 42C with Grizzly Conductonaut) while the copper on your Accelero heats up to around 55C conservatively speaking. Even with low performance thermal pads you can be sure the VRAM and VRM will never reach 55C with a full cover waterblock (unless you cheap out on radiators or fans and allow water temp to reach 55C of course). I can put tooth paste on the GPU and it would still be colder than the Accelero cooler with Grizzly Conductonaut.


Fact is, frag tape is TERRIBLE relative to thermal epoxy:

from: https://www.hardocp.com/article/1999/09/08/fragtape_vs_thermal_paste/1
(ASTM C-177) 0.43 W/m-K (0.25 BTU/Ft. Hr. °F)

from: https://www.masterbond.com/properties/thermally-conductive-epoxy-adhesives

The chart below illustrates thermal conductivity values that can be achieved for select grades of systems with different fillers:

| System Type | Product | Filler | Thermal Conductivity |
| --- | --- | --- | --- |
| One part epoxy | Supreme 12AOHT-LO | Aluminum Oxide | 1.30-1.44 W/(m•K) |
| Two part epoxy | EP30TC | Aluminum Nitride | 2.60-2.88 W/(m•K) |
| One part epoxy | EP3HTS-LO | Silver | 2.45-2.60 W/(m•K) |
| Two part epoxy | EP75-1 | Graphite | 1.87-2.02 W/(m•K) |

Your ability to remove heat from a system is limited by your weakest link, frag tape. Furthermore, Cu and Al both have higher thermal conductivity values than H2O.
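
To put rough numbers on the weakest-link argument, the conductivity figures above can be turned into a temperature penalty with the flat-layer formula R = t / (k * A). This is only a sketch; the 0.5mm bond line, 1cm2 contact patch, and 10W heat load are assumed round numbers, not measurements from any card:

```python
# Conductive thermal resistance of a thin flat interface layer: R = t / (k * A).
# Thickness, area, and the 10W load are illustrative assumptions.

def interface_resistance(thickness_m, k_w_per_mk, area_m2):
    """Thermal resistance (K/W) of a flat interface layer."""
    return thickness_m / (k_w_per_mk * area_m2)

AREA = 10e-3 * 10e-3    # assume a 10mm x 10mm contact patch (1 cm^2)
THICKNESS = 0.5e-3      # assume a 0.5mm bond line for both materials

r_tape = interface_resistance(THICKNESS, 0.43, AREA)   # frag tape, 0.43 W/m-K
r_epoxy = interface_resistance(THICKNESS, 2.60, AREA)  # filled epoxy, 2.60 W/m-K

for name, r in (("frag tape", r_tape), ("epoxy", r_epoxy)):
    # temperature rise across just the interface layer at an assumed 10W load
    print(f"{name}: {r:.2f} K/W, {10 * r:.0f} K rise at 10 W")
```

With these assumed dimensions the tape costs roughly six times the temperature drop of the epoxy for the same heat flow, which is the point being argued.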


----------



## kithylin

8051 said:


> And you're heating up frag tape, which can hardly conduct heat as well as epoxy or that "bit of copper".


Please don't start arguments in this thread and attempt to derail it off topic.
This isn't even worth discussing.
Pads or paste, whichever: custom water cooling handles heat better than any air cooler.
This has been established fact since the early 2000's.

This is not the thread to discuss this sort of thing. That's basic gpu cooling mechanics that should go in its own thread somewhere else, as it has nothing specific to do with this particular video card; it's just a general gpu cooling topic.


----------



## 8051

kithylin said:


> Please don't start arguments in this thread and attempt to de-rail it off topic.
> This isn't even worth discussing.
> Pads or paste, which ever, custom water cooling is able to handle heat better than any air cooler.
> This is established fact everyone knows since the early 2000's.
> 
> This is not the thread to discuss this sort of thing, that is basic gpu cooling mechanics that should be in it's own thread somewhere else, as it has nothing specific to concern with this specific video card in this specific thread, it's just a general gpu cooling thing.


Nice, so you can't refute my points WRT cooling VRM's/VRAM's using frag tape, so now I'm supposed to leave, eh?


----------



## kithylin

8051 said:


> Nice so you can't refute my points WRT cooling VRM's/VRAM's using frag tape so now I'm supposed to leave eh?


No, no one said to leave, but we're here to discuss GTX 1080 Ti's, not general gpu cooling issues. What you're talking about applies to all gpus, not just 1080 Ti's. You're welcome to start your own topic about it, link to it in here, and we can continue discussing it there. It's just that you're getting pretty far off topic here.


----------



## KedarWolf

8051 said:


> Nice so you can't refute my points WRT cooling VRM's/VRAM's using frag tape so now I'm supposed to leave eh?


If you think thermal epoxy with copper heatsinks is going to perform better than a full cover waterblock, you're very misinformed. It's not about the conductivity of the thermal material but about the difference in cooling ability between water blocks and a copper heatsink. Anyhow, enough on that; if we can't agree we need to move on.


----------



## krizby

8051 said:


> Nice so you can't refute my points WRT cooling VRM's/VRAM's using frag tape so now I'm supposed to leave eh?


Probably some confusion here: there is no frag tape on a full cover waterblock. We put thermal pads on the VRM and VRAM, and those are rated at 5 W/mK; KedarWolf here uses something like 18 W/mK thermal pads on his.

Now onto my story. Last week I bought a wired headset (I was using a wireless one before) and noticed a slight crackling sound when playing games; both the onboard audio and the Creative AE5 sound card produced the same noise. It only happened in game, so I figured it had something to do with the 1080 Ti. It turned out I was using one PCIE cable with split 8+6 ends; using two separate cables fixed the problem. I had the Titan X Maxwell before with an unlocked TDP bios (power draw reached like 350-400W) and one cable was fine (tested against two cables). Googling a bit, I found other people had the same problem when putting a 1080 Ti into their system and couldn't find the cause (LOL). So my suggestion is to always use two PCIE cables for your 1080 Ti, even though it probably looks a bit unsightly. My PSU is a Corsair RM1000, and this unit is not known for good ripple suppression lol.
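
The two-cable suggestion is easy to sanity-check with some quick 12V-rail arithmetic. A minimal sketch; the 450W spike figure and the assumption of an even load split between connectors are illustrations, not measurements:

```python
# Back-of-the-envelope: 12V current per PCIe cable when the card spikes,
# with one daisy-chained cable vs. two separate cables.
# The 450W spike and even load sharing are assumptions for illustration.

V = 12.0          # PCIe auxiliary power is delivered on the 12V rail
SLOT_W = 75.0     # up to 75W can come from the PCIe slot itself

def cable_current(total_card_watts, n_cables):
    """Amps each cable carries, assuming connectors share the load evenly."""
    cable_watts = max(total_card_watts - SLOT_W, 0.0)
    return cable_watts / V / n_cables

spike = 450.0  # assumed transient draw with an unlocked BIOS
print(f"one cable: {cable_current(spike, 1):.1f} A")
print(f"two cables: {cable_current(spike, 2):.1f} A each")
```

Halving the current per cable also halves the resistive voltage drop along each run, which is a plausible reason the crackling went away with two cables.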


----------



## ViBEyt

Hello everyone. I recently flashed the XOC bios on my Armor GTX 1080 Ti (Gaming X PCB) + Morpheus II cooler. I've looked around but I can't really find an exact answer on what's a safe voltage for these cards. Currently I'm using 1.100v; the core speed is 2063 when I'm above 51C, or 2076-2088 when the GPU is under 50C. So what voltage can I consider safe? Thanks, and sorry if this question has been answered before.


----------



## mouacyk

The one temperature you're not seeing, and the more important one, is the VRMs. It's generally not advised to run more than 1.1v sustained unless you have active cooling on the VRMs, such as via a full-coverage water block. At this voltage with this BIOS, the card can draw 400-500W and burn out the VRMs if not cooled properly. Questions like these make you wish for VRM temperature sensors like AMD and EVGA have on their newer cards.

NVidia themselves have publicly stated that prolonged use at 1.1v will shorten the GPU's life to 1 year. Check the PCPer podcast.


----------



## 8051

krizby said:


> Probably some confusion here, there is no frag tape on the full cover waterblock, we put thermal pads on VRM, VRAM and those are rated at 5W/mk, KedarWolf here uses like 18W/mk thermal pads on his .


Duh, "frag tape" is a nickname for thermal tape.


----------



## 8051

KedarWolf said:


> If you think thermal epoxy with copper heatsinks is going to perform better that a full cover waterblock you're very misinformed, it's not about the conductivity of the thermal material but the difference in the cooling ability over water blocks and a copper heatsink. Anyhow, enough on that, if we can't agree we need to move on.


"It's not about the conductivity of the thermal material." Yeah, it's not about physics.


----------



## kithylin

ViBEyt said:


> Hello everyone, recently I flashed XOC bios on my Armor GTX 1080 Ti (Gaming X PCB) + Morpheus II cooler and I looked around but I cant really find exact answer to whats a safe voltage for these cards currently Im using 1.100v the core speed is 2063 when Im above 51C or 2076-2088 when the GPU is under 50C. So what voltage can I consider safe ? Thanks and sorry if this question has been answered before


Supposedly up to 1.200v is safe for 24-7 usage on custom water, as that's the maximum we can get with any bios so far and the nvidia "hard limit" programmed into the cards. About 1.15v is the max for air cooled cards. Also, by the way: if you're hitting 80c or above during benchmarks like firestrike / superposition, you're running too hot and should reduce clocks and volts.


----------



## MightEMatt

8051 said:


> "It's not about the conductivity of the thermal material." Yeah, it's not about physics.



I believe his point was that at the high end, you're not limited by the thermal interface but by the cooling capacity of the system. It doesn't matter that you can transfer heat marginally faster if the cooling mechanism is insufficient to dissipate that heat.


Edit: As an example, say a copper heatsink can remove 5W of heat and has the best thermal interface, while a waterblock can remove 10W of heat but has a slightly worse interface. For the first 5W, the copper sink _may_ perform better. Beyond that, however, the copper heatsink keeps rising in temperature, because heat transfer between the heatsink and the air is quite slow, while the temperature difference between heatsink and chip stays small thanks to the good interface. Compare that to the waterblock: although its temperatures may be slightly worse initially, it can remove significantly more heat from the cold plate, so the cold plate stays much cooler than the chip, which is the key to the whole process. Conductive heat transfer scales with the temperature difference across the interface, so as that difference grows, the transfer rate rises and the negligible difference in TIM quality is washed out. The best thermal interface on the best cooling solution would obviously perform slightly better, but only by a couple of degrees at most.

Edit 2: This isn't to say TIM doesn't matter (delidded processors and all that). It all comes down to sizing the cooling system for the component. A 200W overclocked 8700K is completely incomparable to a 5W power delivery component in regards to heatsink size and thermal interface K value.
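
This two-resistance picture can be sketched numerically. All four resistance values below are invented purely to show the shape of the effect (the interface barely matters when sink-to-ambient resistance dominates); they are not data for any real cooler:

```python
# Minimal series-resistance model: chip temperature with an interface
# resistance and a sink-to-ambient resistance in series.
# Every number here is an illustrative assumption.

def chip_temp(ambient_c, power_w, r_interface, r_sink_to_ambient):
    """Steady-state chip temperature for two thermal resistances in series."""
    return ambient_c + power_w * (r_interface + r_sink_to_ambient)

POWER = 250.0   # assumed GPU heat load, watts
AMBIENT = 25.0  # assumed room temperature, Celsius

# Air cooler: decent interface, but high sink-to-air resistance.
air = chip_temp(AMBIENT, POWER, r_interface=0.02, r_sink_to_ambient=0.20)
# Water loop: slightly worse interface (pads), far lower loop resistance.
water = chip_temp(AMBIENT, POWER, r_interface=0.03, r_sink_to_ambient=0.05)

print(f"air: {air:.0f} C, water: {water:.0f} C")
```

Even though the water loop is given the worse interface here, its much lower sink-to-ambient resistance dominates the total, which is exactly the argument above.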


----------



## krizby

8051 said:


> Duh, "frag tape" is a nickname for thermal tape.


Lol, how misinformed are you? Thermal pads are not thermal tape. (Duh, they both have the word "thermal", so they must be the same, right? Yeah, that's absolutely wrong, pal.)

Just google the Fujipoly Ultra Extreme thermal pad and you'll see they are rated at 17 W/mK; the stock pads that come with a waterblock are usually 5 W/mK, still better than the thermal epoxies you quoted. Just lay the discussion to rest already.

Anyway, the cooling results of the Accelero III are good, and the included vram and vrm heatsinks are fine at maximum fan rpm, so I guess there is no harm using the XOC bios on your Gigabyte 1080 Ti Gaming. I had been using the Titan X Maxwell with the ACX 2 cooler and an unlocked TDP bios that pulled 350-400W at load, and it's still kicking after 3 years. Just keep vcore at or below 1.05v and you are fine. Also use 2 separate power cables, as I found out using 1 has some bad side effects.


----------



## ViBEyt

So I should be fine up to 1.162v, correct?


----------



## ViBEyt

kithylin said:


> Supposedly up to 1.200v is safe for 24-7 usage, as this is the maximum we can get with any bios so far and the nvidia "hard limit" programmed into the cards, if using custom water. About 1.15v is about the max for air cooled cards. Which also by the way, if you're hitting 80c or above during benchmarks like firestrike / superposition, you're too hot and reduce clocks and volts.


Im not even hitting 60C atm at 1.162v.


----------



## KedarWolf

ViBEyt said:


> Im not even hitting 60C atm at 1.162v.



I believe even if your temps are decent, high voltages can degrade components quicker than normal voltages. You may be able to run that voltage, but it's debatable how wise it is in the long run. No big deal, I'm sure, if you're going to upgrade to a Volta card eventually, but myself, I only run higher voltages when benching; for 24/7 use I run at 1.1v or under with the XOC BIOS. :h34r-smi


----------



## ViBEyt

KedarWolf said:


> I believe even if your temps are decent, high voltages can degrade components quicker than normal voltages. You may be able to run that voltage but debatable how wise it is in the long run. No big deal I'm sure if you're going to upgrade to a Volta card eventually, but myself, I only run higher voltages when benching, for 24/7 use I run at 1.1v or under with XOC BIOS. :h34r-smi


Yeah, I'm at 1.100v atm and 2063 on the core, which throttles down to 2050 if the temp is above 50C. What frequencies can you reach on your card/cards at 1.100v?


----------



## ViBEyt

Welp, the XOC bios started making my PC restart just now for some reason; it was either because the GPU was hitting 330W or the OC was unstable. I have an 8700K at 5.1GHz and 1.4v, and its power usage was normal. My PSU is a Seasonic Focus Plus 650W. Anyway, I returned to my stock bios, and now that I've learned how to work with the curve, I've managed to get it stable at 2025/6003 at 1.075v.


----------



## ZealotKi11er

Has anyone had problems where the screen goes black for a split second from time to time when using HDMI 2.0? I do not have this problem with a Vega 64.


----------



## mouacyk

ViBEyt said:


> Welp the XOC bios started making my PC restart just now for some reason, it was either because the GPU was hitting 330W or the OC was unstable. I have 8700k at 5.1Ghz and 1.4v the power usage on it was normal. My PSU is Seasonic Focus Plus 650W. Anyways I returned to my stock bios and now that I've learned how to worke with the curve managed to get it stable at 2025/6003 at 1.075v.


I had a Seasonic XP2 660W Platinum and was getting restarts as well with CPU overclocked (~200W) and GPU overclocked using XOC BIOS (~500W). When I swapped over my X-850W into the system, the restarts stopped. Of course, I was load testing with Furmark and prime95.


----------



## ViBEyt

mouacyk said:


> I had a Seasonic XP2 660W Platinum and was getting restarts as well with CPU overclocked (~200W) and GPU overclocked using XOC BIOS (~500W). When I swapped over my X-850W into the system, the restarts stopped. Of course, I was load testing with Furmark and prime95.


It was restarting when I was playing Rise of the Tomb Raider; the GPU was around 300-330W and the CPU was not even hitting 100W, if HWInfo is correct. Before that I ran Superposition and other stress tests for 30 minutes to an hour with no issues, but Rise of the Tomb Raider made the thing restart twice. After I switched back to my stock bios everything was fine. Good thing the XOC bios experience taught me how to work with the voltage curve; it was super confusing in the beginning.


----------



## kithylin

KedarWolf said:


> I believe even if your temps are decent, high voltages can degrade components quicker than normal voltages. You may be able to run that voltage but debatable how wise it is in the long run. No big deal I'm sure if you're going to upgrade to a Volta card eventually, but myself, I only run higher voltages when benching, for 24/7 use I run at 1.1v or under with XOC BIOS. :h34r-smi


I have a pair of hydro copper EVGA GTX 470's that I've run @ 1.212v with custom bios since buying em new in summer 2010. I have a single GTX 680 MSI Lightning with custom water block and the extra power daughter board, been running that one @ 1.250v for years since buying it new. I have a EVGA GTX 780 Classified with a custom bios I'm running at 1.26v since 2014, air cooled on factory air cooler from EVGA too (I've never been able to source a block for it, sadly). And I tuned a friend of mine's GTX 980 with a custom water block and custom bios up to 1.23v and all of em are fine and nothing has gone bad or happened or degraded yet. It's still only several months into my first pascal card, from august 2017 -> today, but so far it's fine. I really doubt 1.200v is going to harm an nvidia card.


----------



## merlin__36

Has anyone else noticed 1080 Ti cards in SLI not running at max usage, usually 40%-60% on each while gaming, even though it doesn't consistently maintain the 160 FPS cap I set?


----------



## kithylin

merlin__36 said:


> Anyone else notices 1080 ti cards in SLI not working to max usage usually 40%-60% on ea while gaming. even though it does not maintain the FPS cap i set for 160 Consistently.


Try a different game. SLI scaling is different for each game; it also depends on your processor and the system you're using. You're not going to get 100% usage on both cards and high FPS in every title. SLI scaling is a mixed bag: some games only scale +30%, some games may scale +80%, and some games may perform worse in SLI than they do with a single card. It varies greatly from game to game.

Are you trying to play 1080p @ 160hz/FPS? Or 4K? Or 2K? Pretty sure we don't have 4K screens that can do 160hz yet, so probably something less than 4K.


----------



## merlin__36

kithylin said:


> Try a different game. SLI Scaling is different for each game. It also depends on your processor and the system you're using. You're not going to get 100% usage on both cards and high FPS in every title. SLI scaling is a mixed bag. Some games only scale +30%, some games may scale +80%, some games may perform worse in SLI Than they do with a single card. It varies greatly from game to game.
> 
> Are you trying to play 1080p @ 160hz/FPS ? Or 4K? Or 2K? Pretty sure we don't have 4K Screens that can do 160hz yet so probably something less than 4K.


2560 x 1080 @ 160hz is my monitor's max and I am trying to supply 160fps to match with Gsync.

I haven't tried any games other than Call of Duty WWII so far, but I can download Far Cry 5 tonight and test it. This is a brand new pc build.


----------



## kithylin

merlin__36 said:


> 2560 x 1080 @ 160hz is my monitor's max and I am trying to supply 160fps to match with Gsync.
> 
> I haven't tried any games other than Call Of Duty WWII so far but I can download Farcry 5 tonight and test it. this is a brand new pc build.


You should get familiar with a 3rd party tool called Nvidia Inspector, you'll be using it a lot to fine-tune nvidia's (usually really crappy) default SLI profiles.

Nvidia Inspector is here, free: http://www.guru3d.com/files-details/nvidia-inspector-download.html

I think it's actually split off into Nvidia Profile Inspector nowadays.

A guide on how to use it is here: https://forums.guru3d.com/threads/nvidia-inspector-introduction-and-guide.403676/

And the entire reason I'm suggesting this to you: I did some searching, and a user has posted the related bits to change in nvidia inspector to edit the profile for COD WWII, with a measured increase in performance over the default nvidia SLI profile of as much as +25% to +30% in SLI. That post is over here: https://steamcommunity.com/app/476600/discussions/0/1479857071259011262/ All from just spending a few minutes copy-pasting a few blocks of numbers into the correct boxes and clicking an apply button. And the beauty of editing the profiles this way is that once it's done, you just open the game as normal and it's always applied: do it once and it loads forever. You don't need nvidia inspector, or any other 3rd party program, running for it to take effect.

Seems like this would probably help you get more out of your system, at least in this one title.


----------



## KedarWolf

kithylin said:


> I have a pair of hydro copper EVGA GTX 470's that I've run @ 1.212v with custom bios since buying em new in summer 2010. I have a single GTX 680 MSI Lightning with custom water block and the extra power daughter board, been running that one @ 1.250v for years since buying it new. I have a EVGA GTX 780 Classified with a custom bios I'm running at 1.26v since 2014, air cooled on factory air cooler from EVGA too (I've never been able to source a block for it, sadly). And I tuned a friend of mine's GTX 980 with a custom water block and custom bios up to 1.23v and all of em are fine and nothing has gone bad or happened or degraded yet. It's still only several months into my first pascal card, from august 2017 -> today, but so far it's fine. I really doubt 1.200v is going to harm an nvidia card.


XOC BIOS at 1.2v has no power limit and can pull 500W+. It's been shown it does, and I've even seen it peaking higher in my tests. Even under water, I'm not sure that can really be good for the card in the long term.


----------



## kithylin

KedarWolf said:


> XOC BIOS at 1.2v has no power limit and can pull 500W+. It's been shown that does. I've even seen it peaking higher in my tests. Even under water, I'm not sure that can be really good for the card in the long term.


Yep, I know, and I've warned people about that high power draw too. So far no one has reported a dead card from it, at least in these threads on the OCN forums. It may be happening out there, but we don't have any evidence of it yet. So far it seems fine. I've heard of lots of folks in here running 1.200v on their 1080 Tis, some even reporting 2200~2250 MHz core speed and being fine.


----------



## 8051

MightEMatt said:


> I believe his point was that at the high end, you're not limited by the thermal interface, but the cooling capacity of the system. It doesn't matter that you can transfer heat marginally faster if the cooling mechanism is insufficient to dissipate the heat.
> 
> Edit: As an example say that a copper heatsink can remove 5W of heat and has the best thermal interface, and the waterblock can remove 10W of heat but has a slightly worse thermal interface. For the first 5W, the copper sink _may_ perform better. However beyond that the copper heatsink will begin rising in temperature exponentially as the transfer of heat between the heatsink and the air is quite low. The temperature difference between the heatsink and chip will be minimal because of the thermal interface. This compared to the waterblock, which although may have slightly worse temperatures initially is able to remove significantly more heat from the heatsink. This results in the heatsink being much cooler than the chip, which is the key to the whole process. Thermal interface material is rated by thermal capacity, per area degree. Every degree difference between the materials will exponentially increase the heat transfer rate, thereby minimizing the negligible difference in TIM quality. The best thermal interface on the best cooling solution would obviously perform slightly better, but only by a couple of degrees at most.
> 
> Edit 2: This isn't to say TIM doesn't matter, delidded processors and all that. It all comes down to sizing the cooling system for the component. A 200W overclocked 8700k is completely incomparable to a 5W power delivery component in regards to heatsink size and thermal interface K value.


Let's look at some math (http://www.physicsclassroom.com/class/thermalP/Lesson-1/Rates-of-Heat-Transfer):

Rate = k•A•(T1 - T2)/d

where k = thermal conductivity

Ergo, the rate at which you can remove heat (in watts) from any system through a solid is directly proportional to the thermal conductivity of the material. IOW, your thermal tape is the absolute limit on how fast you can remove heat energy from the system. Although it's plausible that the heat energy of the VRMs and VRAM may be sunk through the GPU as well, it may also be possible that copper heatsinks with thermal epoxy and airflow are superior to water cooling through thermal tape.

If you notice, thermal epoxy is orders of magnitude better than thermal tape.
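To put that formula to work, here's a quick sketch of Fourier's law for a flat TIM layer. Every number below (conductivities, contact area, temperature drop, layer thicknesses) is an assumed, illustrative value, not a measurement from any actual card:

```python
# Illustrative comparison of conduction rate through a TIM layer for
# thermal tape vs. thermal epoxy. All inputs are assumed values.
def conduction_rate(k, area_m2, delta_t, thickness_m):
    """Fourier's law for a flat layer: Rate = k * A * (T1 - T2) / d, in watts."""
    return k * area_m2 * delta_t / thickness_m

AREA = 1e-4      # assumed 10 mm x 10 mm VRM contact patch
DELTA_T = 20.0   # assumed 20 C drop across the interface

tape = conduction_rate(k=0.6, area_m2=AREA, delta_t=DELTA_T, thickness_m=0.25e-3)
epoxy = conduction_rate(k=7.0, area_m2=AREA, delta_t=DELTA_T, thickness_m=0.05e-3)

print(f"tape:  {tape:.1f} W")
print(f"epoxy: {epoxy:.1f} W")
```

With these made-up inputs the epoxy layer conducts heat roughly 60 times faster, which is all "orders of magnitude" means here; the k value AND the bond-line thickness both matter.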


----------



## kithylin

8051 said:


> Let's look at some math (http://www.physicsclassroom.com/class/thermalP/Lesson-1/Rates-of-Heat-Transfer):
> 
> Rate = k•A•(T1 - T2)/d
> 
> where k = thermal conductivity
> 
> Ergo, the rate at which you can remove heat (in Watts) from any system thru a solid is directly proportional to the thermal conductivity of the material. IOW, your thermal tape is the absolute limit at which you can remove heat energy from the system. Although it's plausible that the heat energy of the VRM's and VRAM's may be sunk thru the GPU as well, it may well also be possible that copper heatsinks and thermal epoxy and air flow are superior to water cooling thru thermal tape.
> 
> If you noticed thermal epoxy is orders of magnitude better than thermal tape.


I seriously think you are the only person on the entire internet who somehow got this notion into your mind that an air-cooled video card is somehow going to run cooler than one in a custom water loop with a full-cover water block. Just because.... of some tape vs. a thermal pad? I am sure there are some engineers out there who would love to know what this magical tape is that can somehow defeat the laws of thermodynamics.


----------



## feznz

KedarWolf said:


> I believe even if your temps are decent, high voltages can degrade components quicker than normal voltages. You may be able to run that voltage but debatable how wise it is in the long run. No big deal I'm sure if you're going to upgrade to a Volta card eventually, but myself, I only run higher voltages when benching, for 24/7 use I run at 1.1v or under with XOC BIOS. :h34r-smi


I would agree that voltage, regardless of temperature, will degrade a chip significantly faster, from personal experience.
But I just don't run the XOC BIOS daily, simply because it removes all thermal protections.


----------



## MightEMatt

8051 said:


> Let's look at some math (http://www.physicsclassroom.com/class/thermalP/Lesson-1/Rates-of-Heat-Transfer):
> 
> Rate = k•A•(T1 - T2)/d
> 
> where k = thermal conductivity
> 
> Ergo, the rate at which you can remove heat (in Watts) from any system thru a solid is directly proportional to the thermal conductivity of the material. IOW, your thermal tape is the absolute limit at which you can remove heat energy from the system. Although it's plausible that the heat energy of the VRM's and VRAM's may be sunk thru the GPU as well, it may well also be possible that copper heatsinks and thermal epoxy and air flow are superior to water cooling thru thermal tape.
> 
> If you noticed thermal epoxy is orders of magnitude better than thermal tape.



To reiterate my point using the formula you kindly provided, ΔT is equally important. Doubling the ΔT has the same impact as doubling the K value of the thermal interface. This isn't the whole story however, as there is also a second transfer of heat occurring, either from the heatsink into the air or from the waterblock into the water. 

In the air-cooled solution, the transfer of heat from a heatsink to the surrounding air is relatively slow, therefore your ΔT for the initial equation will continually decrease until the ΔT in the second equation becomes sufficient to match the heat transfer of the first equation. 

In the case of the waterblock, you can feasibly keep the waterblock within 10°C of the initial conditions (ambient temperature) thanks to the incredibly high rate of transfer between the waterblock and the water. This reduces the ΔT in the second equation, but because of the immense capacity and efficiency of radiators, this is dealt with through brute force. However this massively increases the ΔT of the first equation relative to the conditions in the first scenario, which results in a much faster transfer of heat in the first equation.

You don't need math to tell you any of this, we wouldn't be spending the time and money watercooling if it was a universally inferior cooling method.


Edit: Just to clarify where the difference is being made, the thermal conductivity of water is nearly 25 times greater than the thermal conductivity of air. The goal is to remove heat from the component, which has a fixed, incredibly small surface area. A heatsink or radiator can have as much surface area as you want, but how efficiently you move the heat into the cooling system is the name of the game. Your thought process isn't wrong, you always want the best thermal interface on the component, but you have to consider how the secondary thermal system affects the primary system. By the time a system reaches equilibrium, the ΔT of the water system could be many-fold greater than the air system.

Edit 2: This is the same reason we don't all use liquid metal on GPUs even though it has a much greater K value. On air you sometimes see 5 or even 10 degrees improvement from the thermal interface, but nowhere near the temperatures you get on water. This is because the system is fundamentally limited by the heat you can remove from the cooling medium.
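The two-stage argument above can be sketched as a toy series-resistance model. Every resistance and wattage below is an assumption chosen purely for illustration, not a measured figure for any real sink or block:

```python
# Toy two-stage thermal model: at equilibrium the chip temperature is
# ambient plus power times the SUM of the two resistances, so a worse
# interface can still win overall if the second stage is much better.
# All numbers are assumed for illustration.
def chip_temp(t_ambient, power_w, r_interface, r_to_medium):
    """Equilibrium chip temperature in C for two thermal resistances in series (K/W)."""
    return t_ambient + power_w * (r_interface + r_to_medium)

P = 30.0  # assumed watts dissipated by the VRM stage

air   = chip_temp(25.0, P, r_interface=0.10, r_to_medium=1.00)  # epoxy + small air sink
water = chip_temp(25.0, P, r_interface=0.30, r_to_medium=0.10)  # pad + full-cover block

print(f"air-cooled:   {air:.0f} C")
print(f"water-cooled: {water:.0f} C")
```

Even with a three-times-worse interface resistance, the water path lands far cooler in this sketch, because the sink-to-medium stage dominates the total.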


----------



## 8051

kithylin said:


> I seriously think you are the only person in the entire internet that some how got this notion into your mind that an air cooled video card is some how going to run cooler than one in a custom water loop with a full cover water block. Just because.... of some tape vs a thermal pad? I am sure there are some engineers out there that would love to know what this magical tape is that some how can defeat the laws of thermodynamics.


Do you understand the concept of orders of magnitude? I'll tell you what, I'll give you a dollar for a hundred dollars -- maybe then it'll sink in.


----------



## 8051

krizby said:


> Lol how misinformed are you ? Thermal pads are not thermal tape (Duh, they all have the word "thermal", they should be the same right ? yeah that's absolutely wrong pal)
> 
> Just google Fujipoly ultra extreme thermal pad and you see they are rated at 17W/mk, the stock pads that come with waterblock are usually 5W/mk, still better than those thermal epoxy you quoted. Just lay the discussion to rest already.
> 
> Anyways the cooling results of the Accelero III are good, the included vram and vrm heatsinks are fine at maximum fan rpm so I guess there is no harm using the XOC bios on your Gigabyte 1080ti Gaming. I had been using the titan X maxwell with ACX 2 cooler, unlock tdp bios that pull 350-400w at load and it's still kicking after 3 years. Just keep vcore at or below 1.05v and you are fine. Also use 2 separate power cables as I found out using 1 has some bad side effects.


Sorry, I was dead wrong, I didn't know about this amazing silicone gap filling product.

There seem to be TWO specs for thermal conductance quoted for Fujipoly Ultra Extreme, 17 W/mK and 11 W/mK:
https://www.fujipoly.com/usa/assets/files/2017_data_sheets/SARCON XR-m ES_DS4060 ver.1.pdf

The thermal conductance of this product increases w/increasing pressure and decreases w/decreasing pressure. There's no thermal epoxy that can even come close (unless you count the foil solder products as epoxy).

It even beats out Shin-Etsu's TIM, so why not trim their 0.3mm product and use it instead of TIM?

I'd imagine an application of epoxy is much thinner than anything you can get w/a gap filling pad though (maybe a hundred times thinner). An application of thermal epoxy is probably only micrometers in thickness and thickness is also a factor in determining the thermal conductance of a system.

The thermal conductivity of liquid H2O is the pits in comparison to Cu or Al.


----------



## 8051

MightEMatt said:


> To reiterate my point using the formula you kindly provided, ΔT is equally important. Doubling the ΔT has the same impact as doubling the K value of the thermal interface. This isn't the whole story however as there is also a second transfer of heat occuring either from the heatsink into the air, or from the waterblock into the water.
> 
> In the air cooled solution, the transfer of heat from a heatsink to the surrounding air is relatively slow, therefore your ΔT for the initial equation will continually decrease until the ΔT in the second equation becomes sufficient to match the the heat transfer of the first equation.
> 
> In the case of the waterblock, you can feasibly keep the waterblock within 10°C of the initial conditions (ambient temperature) thanks to the incredibly high rate of transfer between the waterblock and the water. This reduces the ΔT in the second equation, but because of the immense capacity and efficiency of radiators, this is dealt with through brute force. However this massively increases the ΔT of the first equation relative to the conditions in the first scenario, which results in a much faster transfer of heat in the first equation.
> 
> You don't need math to tell you any of this, we wouldn't be spending the time and money watercooling if it was a universally inferior cooling method.
> 
> 
> Edit: Just to clarify where the difference is being made, the thermal conductivity of water is nearly 25 times greater than the thermal conductivity of air. The goal is to remove heat from the component, which has a fixed, incredibly small surface area. A heatsink or radiator can have as much surface area as you want, but how efficiently you move the heat into the cooling system is the name of the game. Your thought process isn't wrong, you always want the best thermal interface on the component, but you have to consider how the secondary thermal system affects the primary system. By the time a system reaches equilibrium, the ΔT of the water system could be many-fold greater than the air system.
> 
> Edit 2: This is the same reason we don't all use liquid metal on GPUs even though it has a much greater K value. On air you sometimes see 5 or even 10 degrees improvement from the thermal interface, but nowhere near the temperatures you get on water. This is because the system is fundamentally limited by the heat you can remove from the cooling medium.


I was initially restricting this to the discussion of VRM's/VRAM's being cooled by air-cooled thermal epoxy attached Cu heatsinks vs. VRM's/VRAM's being attached to a waterblock via thermal tape. I'm already wrong, I didn't know about the unbelievable thermal conductive properties of fujipoly's gap filling product, which is superior to any thermal epoxy manufactured that I could find.


----------



## feznz

8051 said:


> I was initially restricting this to the discussion of VRM's/VRAM's being cooled by air-cooled thermal epoxy attached Cu heatsinks vs. VRM's/VRAM's being attached to a waterblock via thermal tape. I'm already wrong, I didn't know about the unbelievable thermal conductive properties of fujipoly's gap filling product, which is superior to any thermal epoxy manufactured that I could find.


Yes, when measured in watts per meter-kelvin (W/(m⋅K)), but thermal pads are 1mm thick versus epoxy being 0.02mm thick at a guess, so without an actual test bench to prove which is more efficient we are guessing. I know there are a lot of equations to calculate which would actually transfer the most heat.

TBH we are talking a few degrees here and there, but overall maybe another 200MHz on memory. Who knows, it's not worth my time to find the last 1-2% of potential OC headroom
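The thickness-vs-conductivity trade-off is easy to put in numbers. Per unit area, a flat layer's thermal resistance is thickness divided by conductivity, so a thin low-k epoxy bond line can still out-conduct a much thicker high-k pad. The thicknesses and k values below follow the guesses in this thread; they are not bench measurements:

```python
# Area-specific thermal resistance of a flat layer: R'' = thickness / k.
# Lower is better. Inputs are the thread's guessed values, not measurements.
def layer_resistance(thickness_m, k):
    """Resistance per unit area in m^2*K/W."""
    return thickness_m / k

pad   = layer_resistance(1.0e-3, 17.0)   # ~1 mm Fujipoly-class pad at 17 W/mK
epoxy = layer_resistance(0.02e-3, 7.0)   # ~0.02 mm epoxy bond line at an assumed 7 W/mK

print(f"pad:   {pad:.2e} m^2*K/W")
print(f"epoxy: {epoxy:.2e} m^2*K/W")
```

Under these assumptions the thin epoxy layer has roughly 20x less resistance than the thick pad, which is exactly why thickness has to enter the comparison alongside W/mK.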


----------



## krizby

8051 said:


> Sorry, I was dead wrong, I didn't know about this amazing silicone gap filling product.
> 
> There seem to be TWO specs for thermal conductance that are quoted for Fujipoly ultra extreme, the 17W/mk and 11W/mk:
> https://www.fujipoly.com/usa/assets/files/2017_data_sheets/SARCON XR-m ES_DS4060 ver.1.pdf
> 
> The thermal conductance of this product increases w/increasing pressure and decreases w/decreasing pressure. There's no thermal epoxy that can even come close (unless you count the foil solder products as epoxy).
> 
> It even beats out Shin-Etsu's TIM, so why not trim their 0.3mm product and use it instead of TIM?
> 
> I'd imagine an application of epoxy is much thinner than anything you can get w/a gap filling pad though (maybe a hundred times thinner). An application of thermal epoxy is probably only micrometers in thickness and thickness is also a factor in determining the thermal conductance of a system.
> 
> The thermal conductivity of liquid H2O is the pits in comparison to Cu or Al.


Since there are no thermal sensors on 1080 Ti VRAM and VRM except on the EVGA one, let's take AMD R9 290 results for VRM cooling.

r9 290 with accelero 3
https://www.tomshardware.com/reviews/r9-290-accelero-xtreme-290,3671-4.html 

r9 290 with waterblock + fujipoly pads
https://www.overclock.net/forum/72-ati-cooling/1455768-290x-vrm1-high-temps-ek-waterblock-3.html

VRM1 (core VRM) reaches 70°C playing games in the Tom's Hardware article, while forum users reported VRM temps in the low 50s or even 40s running Furmark when Fujipoly pads are used with a waterblock. I'm sure you can draw the conclusion from here.


----------



## Gdourado

Hello, how are you?

I got a 1080 Ti with a blower cooler.
I got the blower because the card went into a Silverstone ML08 case. 
The case has no case fans, but the GPU is in a separate compartment. 
After creating a custom fan profile, the card stays at around 80 to 82 degrees and the clock is stable at around 1850 give or take.
That is with stock power limit and no overclock. 

I am happy with the performance given the constraints of the case and cooler solution. 

But I am worried about the temperature. 
The gpu core like I said stays at around 80 to 82 degrees while gaming.
But the card itself gets hot.
So hot that the backplate and shroud can almost burn my hand to the touch. 

I am guessing that this is because there are no case fans blowing air over the card itself and no exhaust fans on the case. 

Is this a problem?
Can it affect the reliability of the card?
I really like the ml08 formfactor and the cpu temps are pretty good. 
But should I just get a case with active air intake and exhaust?

Or is this just fine despite all the heat being produced?

Thanks
Cheers


----------



## kithylin

Gdourado said:


> Hello, how are you?
> 
> I got a 1080ti with a blower cooler.
> I got the blower because the card went onto a Silverstone ml08 case.
> The case has no case fans, but the gpu is on a separate compartment.
> After creating a custom fan profile, the card stays at around 80 to 82 degrees and the clock is stable at around 1850 give it take.
> That is with stock power limit and no overclock.
> 
> I am happy with the performance given the constraints of the case and cooler solution.
> 
> But I am worried about the temperature.
> The gpu core like I said stays at around 80 to 82 degrees while gaming.
> But the card itself gets hot.
> So hot that the backplate and shroud can almost burn my hand to the touch.
> 
> I am guessing that this is because there are no case fans blowing air over the card itself and no exhaust fans on the case.
> 
> Is this a problem?
> Can it affect the reliability of the card?
> I really like the ml08 formfactor and the cpu temps are pretty good.
> But should I just get a case with active air intake and exhaust?
> 
> Or is this just fine despite all the heat being produced?
> 
> Thanks
> Cheers


The cards are rated for up to 95C max, so mid-80s C for a blower-style cooler on these cards sounds fine. I doubt it will affect it much long-term, but I would suggest you make sure your warranty is intact and register it with the support department of whichever company you bought it from, just in case.

I just looked at your computer case on Newegg; it looks like it has slots in the side panel cover for the video card to get fresh air. With Nvidia Boost 3.0 in a default air cooler / stock configuration it should have built-in protections. If it gets hot enough that the card thinks it will sustain damage, it will down-clock really far, possibly down to 1600 MHz core or below if it really gets hot. And there are still protections beyond that: if it really gets up to 95C or beyond, the card will shut off and jerk the fan to 100% (drop monitor output and black screen) to save itself. So I probably wouldn't worry. Yes it's hot, but it's most likely not going to damage/kill the card unless you're maxing out @ 95C constantly.

Play around with the fan profile, and while I know it would be loud, try putting the fans manually to 100% when you get up to the mid-80s C and see if that drops it further; maybe you could try to be more aggressive on the fan speed.


----------



## feznz

With thermal cycling, the greater the temperature swing between cold and hot, the more it can cause the solder balls between PCB and chip to fracture. Fine in the short term...... long term......


----------



## kithylin

feznz said:


> thermal cycling the greater the temp between cold and hot can cause the solder balling to fracture between pcb and chip though fine in short term...... long term......


Most people don't own their video cards longer than 2~4 years, and most will not own a card long enough for that to ever be noticeable or even affect anything, even if it did. And while there are some types of solder that melt as low as 90C, I seriously doubt Nvidia would be stupid enough to use that type of solder in a video card designed to go up to 95C. There are other solders, some melting around 183C or even 188C. I haven't heard of anyone yet causing a GTX 1080 Ti to "desolder" itself, no matter how hot they run it. I wouldn't ever concern myself with such a non-issue.

As far as I'm aware we haven't had any heat-related fracturing issues with video cards since the GeForce 7000 series and "BumpGate". Even that was just a flaw in the manufacturing process, and they just don't manufacture cards the way they used to back then. I may be wrong here, but I'm pretty sure all of that was resolved years ago and shouldn't affect modern video cards.


----------



## feznz

Considering the 1080 Ti has been out for 13 months, I don't think anyone is qualified to say if thermal cycling is good or bad in the long run; only time will tell..... but I personally wouldn't take my chances. 
I would personally aim for less than 70°C at full load, but that's my opinion


----------



## 8051

krizby said:


> Since there are no thermal sensor on 1080ti vram and vrm except for the EVGA one, let's take AMD R9 290 results for VRM cooling.
> 
> r9 290 with accelero 3
> https://www.tomshardware.com/reviews/r9-290-accelero-xtreme-290,3671-4.html
> 
> r9 290 with waterblock + fujipoly pads
> https://www.overclock.net/forum/72-ati-cooling/1455768-290x-vrm1-high-temps-ek-waterblock-3.html
> 
> VRM1 (core vrm) reach 70C playing games in tomshardware article while forum users reported VRM temp in the low 50s or even 40s when Fujipoly pads are used with waterblock running furmark. I'm sure you can draw the conclusion from here.


There are high-performance thermal epoxies though, and I don't think Arctic Cooling is providing those w/their AC Xtreme III -- just like AIBs don't use Fujipoly pads w/their shipped products.


----------



## KedarWolf

8051 said:


> There are high performance thermal epoxies though and I don't think Arctic Cooling is providing those w/their AC Xtreme III -- just like AIB's don't use Fujipoly pads w/their shipped products.


You're not going to convince this person water is better than air. 99% of us here know it is but he's sure he's right and the rest of us are wrong. 

Just let them live in fantasy land and we can move on.


----------



## 8051

KedarWolf said:


> You're not going to convince this person water is better than air. 99% of us here know it is but he's sure he's right and the rest of us are wrong.
> 
> Just let them live in fantasy land and we can move on.


The highest performing piston-engined fighter the USN ever had was air-cooled.


----------



## feznz

8051 said:


> The highest performing piston-engined fighter the USN ever had was air-cooled.


If only I could have that kind of airflow through my PC case.... actually it would probably blow the RAM out of its slot, so probably not a good idea


----------



## krizby

8051 said:


> The highest performing piston-engined fighter the USN ever had was air-cooled.


Piston engines are cooled by engine oil, which circulates to the heat exchanger/radiator to be cooled off and then back to the engine; sounds a lot like a water-cooling system, doesn't it? You don't cool an engine with air directly, dummy  unless you want your engine to crack from thermal shock.


----------



## MightEMatt

8051 said:


> The highest performing piston-engined fighter the USN ever had was air-cooled.



In a plane, weight is a consideration, and air will win out every time in that regard. That aside, cooling only matters so long as you have _enough_. Our cooling needs aren't as great as an engine's, but the system needs to be very efficiently designed given the small surface area we need to draw heat from. We also don't want it to _sound_ like a plane. We could all run reference coolers and crank the fan speed way up, but that's loud and inefficient.


----------



## siffonen

I thought we were here to discuss the 1080 Ti, not airplanes


----------



## krizby

Gdourado said:


> Hello, how are you?
> 
> I got a 1080ti with a blower cooler.
> I got the blower because the card went onto a Silverstone ml08 case.
> The case has no case fans, but the gpu is on a separate compartment.
> After creating a custom fan profile, the card stays at around 80 to 82 degrees and the clock is stable at around 1850 give it take.
> That is with stock power limit and no overclock.
> 
> I am happy with the performance given the constraints of the case and cooler solution.
> 
> But I am worried about the temperature.
> The gpu core like I said stays at around 80 to 82 degrees while gaming.
> But the card itself gets hot.
> So hot that the backplate and shroud can almost burn my hand to the touch.
> 
> I am guessing that this is because there are no case fans blowing air over the card itself and no exhaust fans on the case.
> 
> Is this a problem?
> Can it affect the reliability of the card?
> I really like the ml08 formfactor and the cpu temps are pretty good.
> But should I just get a case with active air intake and exhaust?
> 
> Or is this just fine despite all the heat being produced?
> 
> Thanks
> Cheers


My advice is to use MSI Afterburner to raise your core clock and memory to the highest stable offsets (usually core +130MHz, memory +250MHz), then decrease the power limit to 90%. This way your card will actually perform slightly better compared to stock while running slightly cooler (or quieter, if you prefer). You can even lower the power limit to 80% and the card still delivers about 95% of stock performance. This is because the power/performance curve is roughly parabolic: at the top of the curve you gain very little performance for a tremendous power increase. Also, if you can keep the card cooler, efficiency goes up something like 1.5% per 10C decrease.
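The diminishing-returns shape can be sketched with a toy dynamic-power model. The clock/voltage pairs below are invented to look Pascal-like, not measured from any card; performance is taken as roughly proportional to clock and dynamic power as roughly proportional to f × V²:

```python
# Toy model of why the top of the V/F curve is expensive: performance
# scales ~ clock, dynamic power scales ~ f * V^2. All pairs are invented.
curve = [  # (core MHz, volts) - illustrative only
    (1700, 0.850),
    (1850, 0.950),
    (1950, 1.031),
    (2000, 1.093),
]

base_mhz, base_v = curve[-1]          # treat the top bin as "100%"
base_power = base_mhz * base_v ** 2

for mhz, v in curve:
    perf = mhz / base_mhz             # performance ~ clock
    power = (mhz * v ** 2) / base_power
    print(f"{mhz} MHz: {perf:6.1%} perf at {power:6.1%} power")
```

With these made-up bins, dropping from the top of the curve to the 1850 MHz point costs under 10% performance but saves roughly 30% power, which is the same shape as the "80% power limit, ~95% performance" observation above.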


----------



## kithylin

siffonen said:


> I thought that here we discuss about 1080ti, not airplanes


Yep, all this discussion over air vs water and now.. PLANES?!?!?!?! in a video card thread?? come on folks, back on topic here. Don't let the trolls derail the thread.


----------



## JohnnyFlash

krizby said:


> My advice is to use MSI afterburner to increase your core clock and memory to the highest stable (usually core clock +130mhz memory +250mhz) then decease power limits to 90%, this way your card will actually perform slightly better compare to stock while operating slightly cooler (or less noise if you prefer). You can even lower the power limit to 80% and the card still perform 95% of stock performance. To better explain this is because the power/performance curve is rather parabola, at the top of the curve you gain very little performance for tremendous power increase. Also if you can keep the card cooler the efficiency goes up like 1.5% per 10C decrease.


To do what you want here, you need to adjust the voltage curve, not the power limit setting. All the power limit does is throttle based on power consumption; it doesn't increase efficiency.

My performance profile runs 1898MHz at 0.975v instead of the stock 1.11v; this setting runs much cooler and draws 35 watts less at the wall. Day to day I run 1506MHz at 0.82v, which barely heats up at all.


----------



## krizby

JohnnyFlash said:


> To do what you want here, you need to adjust the voltage curve, not the power limit setting. All power limit does is throttle based on power consumption, it doesn't increase efficiency.
> 
> My performance profile runs 1898MHz at 0.975v instead of the stock setting of 1.11v; this setting runs much cooler and draws 35 watts less at the wall. Day to day I run 1506MHz at 0.82v, which barely heats at all.


Setting a voltage curve or undervolting your card is a waste of time on stock cooling, since every game loads the GPU differently: in one game you are inhibiting performance, and other games will still throttle clocks when the 100% power limit is reached, even at 0.975v. Pascal's clock/voltage curve is designed to work with the power limit in mind; just set the power consumption you want and the clock/voltage will adapt to it. Pretty intelligent design, isn't it?

Pascal efficiency goes up if you can keep the GPU cool: a 13C drop in temp allows you to reach a higher clock/voltage bin on the same power limit setting, plus a +12MHz clock offset. You miss out on all this when you deliberately undervolt your GPU. Seriously, Pascal isn't Vega, where you can gain performance by undervolting; on Pascal you just lose performance.


----------



## JohnnyFlash

krizby said:


> Setting voltage curve or undervolt your card is a waste of time on stock cooling since every game loads the GPU differently, on one game you are inhibiting performance and other game would also throttle clocks when 100% power limit is reached even at 0.975v anyways. Pascal's clock/voltage curve is designed to work with power limit in mind, just set the power consumption you want and the clock/voltage will adapt to it, pretty intelligent design isn't it.
> 
> Pascal efficiency goes up if you can keep the GPU cool, a 13C drop in temp allow you to reach a higher clock/voltage bin on the same power limit setting and a +12mhz clock offset. You miss out on all this when you deliberately undervolting your GPU. Seriously Pascal isn't Vega where you can gain performance when undervolting, on Pascal you just lose performance when undervolting.


Efficiency doesn't change, only the clock speed the chip allows you to hit within each temperature window. If you lock the fan at 100% and test power draw at the wall, there is a massive difference in draw and temp between stock and lower voltage, for all games and renders. If you're hitting a temp wall with standard cooling, adjusting the voltage curve will get you better results because the chip will throttle less when it hits the temp limit.

Your statement of "if you can keep the chip cool" isn't helpful for a user that has issues with temperature in the first place. If we were talking about water cooling, that would be a different conversation. If a user with temp issues can find a sweet spot in the voltage curve that allows for a slight loss in performance, then that is the best avenue to take.

As for different games loading it differently: That's true of every generation.


----------



## kithylin

JohnnyFlash said:


> Efficiency doesn't change, only the clock speed the chip allows you to hit within each temperature window. If you lock the fan at 100% and test power draw at the wall, there is a massive difference in draw and temp between stock and lower voltage, for all games and renders. If you're hitting a temp wall with standard cooling, adjusting the voltage curve will get you better results because the chip will throttle less if it's hitting the temp block.
> 
> Your statement of "if you can keep the chip cool" isn't helpful for a user that has issues with temperature in the first place. If we were talking about water cooling, that would be a different conversation. If a user with temp issues can find a sweet spot in the voltage curve that allows for a slight loss in performance, then that is the best avenue to take.
> 
> As for different games loading it differently: That's true of every generation.


I don't see the concern here. A stock Pascal 1080 Ti on stock air cooling with the stock BIOS will never exceed 400 watts, even in benchmarks or the worst games; at most, a stock card in a stock setup will average around 350-370 watts, even gaming at 4K at max settings in most games. The performance you get from that power usage is so high that I don't understand why anyone would concern themselves with it. It's good enough as it is and nowhere near high enough to complain about. Consider that there's no other (non-Titan) GPU we can buy that gives the 1080 Ti's performance for its power usage.


----------



## JohnnyFlash

kithylin said:


> I don't see the concern here. A stock pascal 1080 Ti on stock air cooling with stock bios will never exceed 400 watts even in benchmarks or the worst games, at most a stock card in stock setup will average around max 350-370 watts, even if you're gaming @ 4K at max settings in most games. The performance you get from that power usage is so very high that I don't understand why anyone would concern yourself with it. It's good enough as it is and no where near high enough to complain at all. Consider that there's no other (non-titan) gpu that we can buy that gives the 1080 Ti's performance for it's power usage.


Another poster was worried about their card temps with the stock cooler. Lowering power draw is the easiest solution if that user doesn't want to upgrade cooling.


----------



## krizby

JohnnyFlash said:


> Efficiency doesn't change, only the clock speed the chip allows you to hit within each temperature window. If you lock the fan at 100% and test power draw at the wall, there is a massive difference in draw and temp between stock and lower voltage, for all games and renders. If you're hitting a temp wall with standard cooling, adjusting the voltage curve will get you better results because the chip will throttle less if it's hitting the temp block.
> 
> Your statement of "if you can keep the chip cool" isn't helpful for a user that has issues with temperature in the first place. If we were talking about water cooling, that would be a different conversation. If a user with temp issues can find a sweet spot in the voltage curve that allows for a slight loss in performance, then that is the best avenue to take.
> 
> As for different games loading it differently: That's true of every generation.


Just try my method for like 5 minutes and you will see why undervolting is fruitless, really. For undervolting to actually reduce power draw and temps, you either have to use a different overclocking profile for each game or undervolt aggressively. Try a graphically demanding game or the Superposition benchmark at 4K and you will see that at 0.975v the 100% power draw is reached and the temp is the same as stock (monitoring power on the Afterburner OSD is a great help). Lower voltage doesn't lower the temp; lower power consumption does.

Now, setting a lower power limit, you will definitely see a consistent drop in temperature across every game you play (since you only allow the card to draw 225w instead of 250w at a 90% power limit). I benched Time Spy and only saw a drop of 2% in graphics score when setting the power limit to 90%. Trading an indiscernible performance loss for a discernible drop in fan noise, especially on the stock FE fan, is nice. Sure, the clock fluctuates, but that is how Pascal's clocking mechanism is designed anyway.
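The percent-to-watts arithmetic here is simple; a small sketch, assuming the 250 W reference TDP krizby describes (board-partner cards use a different base TDP):

```python
# Power-limit slider to watts, assuming a 250 W reference TDP
# (so each 10% step on the slider is 25 W).

REFERENCE_TDP_W = 250

def power_limit_watts(percent):
    """Convert an Afterburner-style power-limit percentage to watts."""
    return REFERENCE_TDP_W * percent / 100

print(power_limit_watts(90))   # 225.0, krizby's 90% example
print(power_limit_watts(150))  # 375.0, the Aorus-style maximum slider
```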


----------



## JohnnyFlash

krizby said:


> Just try my method for like 5 min and you will see why undervolting is fruitless, really. For undervolting to actually reduce power draw and temp you either have to use a different overclocking profile for each game or undervolt agressively. Try some graphical demanding game or Superposition benchmark at 4K and you will see that at 0.975v the 100% power draw is reached and temp is same as stock (monitor power on afterburner OSD is a great help). Lower voltage doesn't lower the temp, lower power consumption does.
> 
> Now setting a lower power limit you will definitely see a constant drop in temperature across every games you play (since you only allow the card to draw 225w instead of 250w at 90% power limit). I benched Timespy and only see a drop of 2% in graphic score when setting power limit to 90%. Trading a non discernible performance loss for a discernible lower fan noise especially on the stock FE fan is nice. Sure the clock fluctuate but that is how Pascal clocking mechanism is designed anyway.


This is just outright incorrect, and could confuse new owners looking to learn. 

Lower voltage under the same workload will always result in lower power usage and heat output, period. Setting the power limit lower throttles the clockspeed to reduce power draw; this most definitely reduces performance.

I ran 4 passes of the Unigine Heaven benchmark at 4K with graphics set to extreme. You will notice that the undervolted profile maintained a clock speed *50MHz higher*, while measuring *11% less power limit* and running *6 degrees cooler* than stock settings.

Results: http://i64.tinypic.com/33e259d.jpg (Attachments not working for me)


----------



## buellersdayoff

JohnnyFlash said:


> This is just outright incorrect, and could confuse new owners looking to learn.
> 
> Lower voltage under the same workload will always result in lower power usage and heat output, period. Setting the power limit lower throttles the clockspeed to reduce power draw; this most definitely reduces performance.
> 
> I ran 4 passes of the Haven benchmark at 4k and set graphics to extreme. You will notice that the undervolted profile maintained clockspeed *50MHz higher*, while measuring *11% less power limit* and running *6 degrees cooler* than running stock settings.
> 
> Results: http://i64.tinypic.com/33e259d.jpg (Attachments not working for me)


He did mention a core overclock with the power limit lower...just saying


----------



## buellersdayoff

krizby said:


> My advice is to use MSI afterburner to increase your core clock and memory to the highest stable (usually core clock +130mhz memory +250mhz) then decease power limits to 90%, this way your card will actually perform slightly better compare to stock while operating slightly cooler (or less noise if you prefer). You can even lower the power limit to 80% and the card still perform 95% of stock performance. To better explain this is because the power/performance curve is rather parabola, at the top of the curve you gain very little performance for tremendous power increase. Also if you can keep the card cooler the efficiency goes up like 1.5% per 10C decrease.


In this post


----------



## krizby

JohnnyFlash said:


> This is just outright incorrect, and could confuse new owners looking to learn.
> 
> Lower voltage under the same workload will always result in lower power usage and heat output, period. Setting the power limit lower throttles the clockspeed to reduce power draw; this most definitely reduces performance.
> 
> I ran 4 passes of the Haven benchmark at 4k and set graphics to extreme. You will notice that the undervolted profile maintained clockspeed *50MHz higher*, while measuring *11% less power limit* and running *6 degrees cooler* than running stock settings.
> 
> Results: http://i64.tinypic.com/33e259d.jpg (Attachments not working for me)


Lol, you increased the core clock in the undervolt profile while leaving stock at 0; that's why your undervolt profile is 50mhz higher. Something tells me you are pretty new to this. Try increasing the core clock (+101mhz) so that the curve is 1900mhz @ 0.975v, the same as your undervolt profile without the level-off after 0.975v, then lower the power limit to 90% and test against the power limit at 100%: you will see very little performance drop-off while using 25w less and staying cooler. This is what Nvidia is doing with their Max-Q laptop design:

https://www.notebookcheck.net/fileadmin/Notebooks/News/_nc3/Max_Q_4.png 

If you like super tight frametimes then sure undervolt helps, otherwise you are just sacrificing performance.


----------



## JohnnyFlash

krizby said:


> Lol you increased the core clock in the undervolt profile while leaving stock at 0 that's why your undervolt profile is 50mhz higher, something tells me you are pretty new to this. Try increase core clock (+101mhz) so that the curve is 1900mhz @ 0.975v, the same as your undervolt profile without the level off after 0.975v, lower the power limit to 90% and test against power limit at 100% you will see very little performance drop off while using 25w less and remain cooler. This is what Nvidia is doing with their Max Q laptop design:
> 
> https://www.notebookcheck.net/fileadmin/Notebooks/News/_nc3/Max_Q_4.png
> 
> If you like super tight frametimes then sure undervolt helps, otherwise you are just sacrificing performance.


I didn't increase the core clock at all. If you look at the voltage curve there are steps left and it ends at the same speed (1900). The card will not use those steps at that temp under full load at stock voltage.

The power limit restriction moves the card down the curve, which sets both the speed and the voltage lower. The card will always try to use the highest position on the curve that it can under the power and hidden temp settings. You are correct that setting it to a 90% power limit will reduce temps, because it reduces both the clock rate and the voltage.

Please feel free to post your own results here to prove otherwise.
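GPU Boost walking down the curve until the draw fits the budget can be sketched roughly like this; the curve points and the power model (P = k·V²·f) are illustrative, not dumped from a real card:

```python
# Sketch of GPU Boost selecting the highest voltage/frequency bin whose
# estimated draw fits the power budget. Curve points are made up.

CURVE = [  # (voltage, clock MHz), highest bin first
    (1.062, 1911), (1.031, 1886), (1.000, 1848), (0.975, 1822), (0.950, 1784),
]

def boost_bin(power_budget_w, k=0.1175):
    """Return the highest (voltage, clock) bin fitting the power budget."""
    for volts, clock in CURVE:
        if k * volts ** 2 * clock <= power_budget_w:
            return volts, clock
    return CURVE[-1]  # floor of the curve

print(boost_bin(250))  # full budget
print(boost_bin(225))  # a 90% power limit drops the card one bin
```

Lowering the power limit pushes the selection down the same curve, which is why clock and voltage fall together rather than independently.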


----------



## rluker5

krizby said:


> Lol you increased the core clock in the undervolt profile while leaving stock at 0 that's why your undervolt profile is 50mhz higher, something tells me you are pretty new to this. Try increase core clock (+101mhz) so that the curve is 1900mhz @ 0.975v, the same as your undervolt profile without the level off after 0.975v, lower the power limit to 90% and test against power limit at 100% you will see very little performance drop off while using 25w less and remain cooler. This is what Nvidia is doing with their Max Q laptop design:
> 
> https://www.notebookcheck.net/fileadmin/Notebooks/News/_nc3/Max_Q_4.png
> 
> If you like super tight frametimes then sure undervolt helps, otherwise you are just sacrificing performance.



Your Ti is under water. Other people's will throttle clocks more, as a result of a boost algorithm that plays fast and loose with volts until it hits the power limit.

Also, power limiting throttles and results in variable clocks depending on the load, and it is more stuttery than volt limiting, which just reduces all power draws evenly and can reduce temps with even clocks. Power limiting drops performance specifically in heavy-load situations, where you want perf the most, and wastes power in low-load situations.

But power limiting is easier to do. You don't need to figure out what clocks are good with the volts you want to set.


----------



## krizby

JohnnyFlash said:


> I didn't increase the core clock at all. If you look at the voltage curve there are steps left and it ends at the same speed (1900). The card will not use those steps at that temp under full load at stock voltage.
> 
> The power limit restriction moves the card down the curve, which set both the speed and voltage lower. The card will always try to use the highest position on the curve that it can under the power and hidden temp settings. You are correct that setting it to 90% power limit will reduce temps, because it reduces both the clock rate and the voltage.
> 
> Please feel free to post your own results here to prove otherwise.


If you hover your mouse over the point 0.975v/1900mhz it will say +101, and that is the overclock you applied at that specific voltage point.

These are my result

Full stock with no overclock 
https://ibb.co/ivNMjy

+162mhz core/+250mhz mem with power limit at 100%
https://ibb.co/ijuOxJ

+162mhz core/+250mhz mem with power limit at 90%
https://ibb.co/kHToWd

a drop of only 2.5% in graphics score with the power limit at 90% compared to 100%. If this were on the stock FE cooler the gap would be even smaller, since at 90% the card is less thermally constrained.




rluker5 said:


> Your ti is under water. Other people's will throttle clocks more as a result of a boost that is being fast and loose with volts until it hits the power limit.
> 
> Also power limiting throttles and results in variable clocks depending on the load and is more stuttery than volt limiting which just reduces all power draws evenly and can reduce temps with even clocks. Power limiting drops performance specifically in heavy load situations where you want perf the most and wastes power in low load situations.
> 
> But power limiting is easier to do. You don't need to figure out what clocks are good with the volts you want to set.


As your card heats up the clock drops anyway; even under water, a 1080ti's clock still drops. This is how Pascal is designed. Go read any 1080ti FE review out there and you'll see that even though the clock fluctuates, the frametimes are all consistent anyway (1% low and 0.1% low fps). If you are OCD enough to want your clock to remain flat, be my guest. I'm just suggesting a 1-minute overclocking method here that works in every game, rather than spending an hour setting up an undervolt overclocking profile that works in 1 game.


----------



## rluker5

krizby said:


> As your card is heating up the clock would drop anyways, even under water 1080ti clock still drops, this is how Pascal is designed. Go read every 1080ti FE reviews out there and you see even though the clock fluctuates, the frametimes are all consistent anyways (1% low and 0.1% low fps). If you are OCD enough to want your clock to remain flat be my guest . I'm just suggesting a 1 min overclocking method here that works in every game rather than spending an hour setting an undervolt overclocking profile that works in 1 game.


I've tried both, and power throttling gives worse frametimes than picking optimal power vs. maximum performance.
Power throttling can only be as efficient as voltage-curve control in ideal situations where you are getting practical use out of 100% GPU usage, like in benchmarks. It isn't like this in games, unless you are into 300fps walls and menus.

It is basically the opposite of Radeon Chill. They should call it Nvidia Cook: your gpu cranks all the way up when you don't care and throttles when you do. Volt control gives you better performance when it counts in uncapped scenarios, and even more so when you cap your framerate.

And it is only about a single voltage notch of difference across the entire span of games. I use a universal volt-control setting for both my desktop and my 1050ti laptop. Timespy is a good enough volt-per-clock test bench too, imo.


----------



## krizby

rluker5 said:


> I've tried both and power throttling gives worse frametimes than selecting optimal power vs maximum performance.
> Power throttling can only be as efficient as voltage curve control in ideal situations where you are getting practical use out of 100% gpu use like in benchmarks. It isn't like this in games, unless you are into 300fps walls and menus.
> 
> It is basically the opposite of radeon chill. They should call it Nvidia cook - where your gpu cranks all the way up when you don't care and throttles when you do. Volt control gives you better performance when it counts in uncapped scenarios and even moreso when you cap your framerate.
> 
> And it is only about a single voltage notch difference between the entire span of games. I use a universal volt control setting for both my desktop and 1050ti laptop. Timespy is a good enough volt per clock test bench too imo.


To each their own, but you are more likely to notice hiccups caused by:
1. Game engine
2. DRM
3. Windows
4. Cpu bottleneck
5. Ram bottleneck
6. Network lag
7. Exceeding monitor refresh cap

than your gpu dropping 100mhz and costing you 2 fps and 4ms of frametime, lol.

Now, to determine which moments in a game you care or don't care about: let's take PUBG as an example. In a heavy gunfight, where undervolting gives you +2 fps, or when you are tracking a running enemy and GPU Boost gives you +5 fps, I would say tracking the enemy and lining up the shot is more important.


----------



## Fifth Horseman

Just got my GTX 1080 Ti Aorus Extreme in a flash sale for $679.99 and an EK waterblock for $59.99. It's the first Nvidia card I have ever owned, mostly due to shady business practices, but I figured at that price I couldn't pass it up. I'm unfamiliar with overclocking Nvidia cards, but under water it appears to be hitting close to 2GHz on its own.

I wanted to post some pics with the block, but that function seems to not be working; ever since this site changed over to this new thing, everything is borked.


----------



## Fifth Horseman

Just now getting into the overclocking, I notice, according to Afterburner and EVGA, that I have a 150% power target option. I am not used to having more than 20%. How do these cards do at 150%?


----------



## kithylin

Fifth Horseman said:


> Just got my GTX 1080 Ti Aorus EXtreme at a flash sale for $679.99 and a ek waterblock for $59.99. First Nvidia card i have ever owned mostly due to shady business practices, but I figured for the price i couldn't pass it up. Unfamiliar with overclocking Nvidia cards but it appears under water it seems to be hitting close to 2ghz on its own.
> 
> I wanted to post some picks with the block but the function seems to not be working, ever since this site changed over to this new thing everything is borked.


Yeah! Congrats! And the way Nvidia's GPU Boost 3.0 works, it will notice you have better cooling and self-overclock on its own. What I would suggest for you (the really easy way) is grab a copy of MSI Afterburner, push the temp target and power target to the max, don't touch anything else, and just go with it. If you're already getting 2GHz, that's right about the maximum as-is; the sliders might get you to 2050 or 2075, and you should be happy enough with that.

You might get to 2100-2150 if you got into flashing a different BIOS and manual voltage tuning and all that jazz, but really, if you're already at 2GHz stable under custom water and get like 2025-2050 with the two sliders, then IMHO it's not worth screwing with anything more. At that point it's a big risk for a marginal gain. I would suggest just do the sliders, leave it, and go enjoy it. :thumb:


----------



## rluker5

Fifth Horseman said:


> Just now getting into the overclocking i notice according to afterburner and evga, that i have a 150% power target option. I am not used to having more than 20%. how do these cards do at 150%?


That's the available power limit, and raising it that high just makes that amount available before the card wants to throttle; it doesn't make the card use any more. Also, raising the volts in AB doesn't do anything for me.
A lot of people here volt limit to avoid a 120% or so power limit. You shouldn't hit 150% with your BIOS voltage limit.

I have very similar cards but AIO, and I still volt limit for low usage just to keep the fans idle and the room cool. Also, I tried the XOC vBIOS, got an extra 100mhz for an extra 125mv, so I went back, but that's just my preference.

And Pascal has a really abrupt OC limit per volt. Memory has decent variability, and the OP has a good way to zero in on its high end.

Sounds pretty boring, eh? Yours will beat the published game benches on a 24/7 basis though. So congrats and enjoy.


----------



## krizby

Fifth Horseman said:


> Just now getting into the overclocking i notice according to afterburner and evga, that i have a 150% power target option. I am not used to having more than 20%. how do these cards do at 150%?


The Aorus Extreme edition 1080ti allows for a 375w maximum power draw (each 10% is 25w); the FE edition allows only 300w, but since the FE's VRM is more efficient there is very little difference in overclocking capability between the two.

Now to get you going with overclocking:

1. First, max out the core voltage (this only affects how quickly GPU Boost reacts to available power and ramps up the clock; however, when you modify the voltage curve it allows 1.093v to be used instead of the default 1.062v)
2. Max out the power limit and temp limit
3. Increase the core clock to the max stable
4. Increase the memory clock to the max stable
5. Now follow the guide on the first page to access the voltage curve (the wifi-like symbol left of the core clock). You will see that the curve flatlines after 1.062v; increase the clock by 12mhz at 1.075v, keep it flat through 1.081v, increase the clock by 13mhz at 1.093v, then adjust the curve so that all the clocks after 1.093v are flat. Save the profile, then hit apply.
6. Play games
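The curve edit above can be sketched as a small data transformation: apply per-point clock offsets, then flatten every bin above 1.093 V to that bin's clock. The voltage points and clocks below are illustrative, not read from a real card:

```python
# Sketch of the voltage/frequency curve edit: per-point offsets, then
# flatten everything above the chosen voltage. All values are made up.

def edit_curve(curve, offsets, flatten_above=1.093):
    """Apply MHz offsets per voltage point, cap all bins above flatten_above."""
    edited = {v: mhz + offsets.get(v, 0) for v, mhz in curve.items()}
    cap = edited[flatten_above]
    return {v: min(mhz, cap) if v > flatten_above else mhz
            for v, mhz in edited.items()}

curve = {1.062: 1949, 1.075: 1949, 1.081: 1961, 1.093: 1961, 1.100: 1974}
offsets = {1.075: 12, 1.093: 13}   # the +12 MHz and +13 MHz bumps above

print(edit_curve(curve, offsets))
```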


----------



## Fifth Horseman

All good info, thanks. Hopefully I can play with it soon; within the last hour my rig airlocked on the watercooling side. I must have gotten a large bubble in the system when I swapped the new gpu in >.<. First time ever I had a 90 degree temp alarm go off.


----------



## ViBEyt

Is it safe to be above 1.100v for 24/7 usage (Gaming), like 1.150 or 1.162 max ?


----------



## kithylin

ViBEyt said:


> Is it safe to be above 1.100v for 24/7 usage (Gaming), like 1.150 or 1.162 max ?


Unfortunately, no one knows 100% for sure if it really is safe long-term or not. The cards just haven't been out long enough. All I can say is no one yet (at least here on the OCN forums) has reported killing their 1080 Ti from voltage all the way up to 1.200v. And personally, I've been running my own 1080 Ti @ 1.200v daily since August 2017 with no ill effects whatsoever.

I would say, however, that anything above 1.00v should be custom-water-block territory if you're pushing an overclock with it. I couldn't go above 1.087v with the factory big 2-fan air cooler on my stock card; it was hitting 80c and throttling. That's why I went water on mine.

Your Mileage May Vary.


----------



## kithylin

Just got this in email today: https://www.evga.com/products/featured-bundles.aspx

$749.98 - GTX 1080 Ti (iCX Gaming with 2-fan cooler) with EVGA SuperNova 80+ Gold 850W PSU, free shipping & free Crew2 game included + bonus 2-year warranty on the power supply.
$759.98 - GTX 1080 Ti with hybrid cooler, free shipping & free Crew2 game included.
$769.98 - GTX 1080 Ti FTW3 DT Gaming with 3-fan cooler and EVGA SuperNova 80+ Gold 850W PSU, free shipping & free Crew2 game included + bonus 2-year warranty on the power supply.

And last, what I think is the best offer of the lot:
$789.98 - GTX 1080 Ti FTW3 DT Gaming with 3-fan cooler, with included EVGA Hydro Copper full-cover water block, free shipping & free Crew2 game.

All are still in stock and ready to buy and ship at the time of writing this message.

EDIT: Typos. FTW2 does not exist.


----------



## krizby

I imagine everyone in this thread must have a 1080ti by now, but that was good info; vendors are probably clearing stock in anticipation of new GPUs.


----------



## Fifth Horseman

OK, so here is a question for you gurus: if my math is correct and the 150% power target works as intended, does that in fact increase my max TDP to 375 watts? If it does, is there any point in even trying the Strix XOC BIOS? And if it is worth it, how dangerous is it for the VRMs to approach that level of power?


----------



## kithylin

Fifth Horseman said:


> ok so here is a question for you gurus, if my math is correct if the 150% power target works as intended does that infact increase my max tdp to 375 Watts. if it does is there any point to even try and use the Strix XOC bios? if it is worth it how dangerous is it for the vrms to approach that level of power?


Here's the thing.. the XOC bios doesn't technically make the card magically draw more power just because it's on there vs your stock bios. What makes it draw more power is when you yourself start raising volts and/or clocks further, after you load the XOC bios. If you load the XOC bios and then set the clocks and volts to what you were on stock, your card would use roughly the same power.


----------



## Fifth Horseman

kithylin said:


> Here's the thing.. the XOC bios doesn't technically make the card magically draw more power just because it's on there vs your stock bios. What makes it draw more power is when you yourself start raising volts and/or clocks further, after you load the XOC bios. If you load the XOC bios and then set the clocks and volts to what you were on stock, your card would use roughly the same power.


Obviously, but from what I read it removes the power target limit completely. My question is more about what the safe cutoff point is for the power target, specifically when the VRMs will start to have problems. Is it worth going that route, or is my 375w TDP already too much?


----------



## rluker5

Fifth Horseman said:


> Obviously but from what i read it removes the power target limit completely, my question is more towards what is safe cut off point for the power target, specifically when will the vrms start to have problems is it worth going that route or is my 375 tdp already to much.


I think you have the same pcb as me (I have the Waterforce AIO version) and the XOC worked fine. I think I lost use of one of my video outputs, but otherwise the card was ok. I had the rgb turned off, and I might have had the fan + AIO pump hooked up to a molex adapter in a futile attempt to get more stability out of the card. Can't remember, it was almost a year ago. The Aorus pcbs have like 5 fan/rgb headers on the board, so vBIOSes for cards with fewer headers probably don't tell them all how to work.

I noticed I was able to clock the vram higher, and I was able to pass Timespy at 2150 with the XOC and 1.2v vs 2050 at the stock vBIOS max volts. But the score wasn't 100mhz better, just a little bit better. And my AIO couldn't keep up. The XOC gave me higher clocks but seemingly worse per-clock scores than the stock vBIOS for the Aorus. And it certainly wasn't an everyday vBIOS if it made my card run hotter for the same performance.

I also tried some of the other vBIOSes and they worked, but they performed at the same level at best. My tries led me to believe that the stock vBIOS is just more compatible with the Aorus cards. Maybe the Aorus vBIOS doesn't work as well for other cards either. I haven't seen many posts where people liked it better than the others they were trying, even though it does have the high power limit.

But I did get a touch higher score and noticeably better clocks with the XOC, and the short time I used it didn't seem to hurt anything.

Maybe if you want the maximum possible OC for a short time it would be worth it for you? Depends on how much effort you want to put into it.
I just prefer full performance with full efficiency over a slight boost that may beat up the card or give it weird issues down the road.

Oh, and the Aorus comes with a 4-year warranty on the stock vBIOS that has a 375w limit. And the OC software that came with the card lets you hit that too, so Gigabyte seems to think 375w is fine for that card. But the stock cards also have a plate on the VRMs, so there is that.


----------



## Gurkburk

Does MSI 1080ti gaming X work with the XOC bios? I'm on watercooling now, thought i'd give it a real shot.


----------



## KedarWolf

Gurkburk said:


> Does MSI 1080ti gaming X work with the XOC bios? I'm on watercooling now, thought i'd give it a real shot.



Yes, it does.


----------



## Gurkburk

KedarWolf said:


> Yes, it does.


Great! Just flashed it. Is the Power Limit supposed to be untouchable in msi afterburner? Means there's no limit?


----------



## KedarWolf

Gurkburk said:


> Great! Just flashed it. Is the Power Limit supposed to be untouchable in msi afterburner? Means there's no limit?







Yes, it is, should be greyed out.


----------



## GDS

Hello all,

I am coming back to ask whether this OC is acceptable.
I am not a gamer; I use the card for rendering 360° videos and for photogrammetry, so it will work at this frequency for an average of 10 hours.

So I am under water (EKWB rig),
1080Ti Aorus Waterforce waterblock.

With OCCT (more than 10 mins):
GPU: 2152MHz
GDDR: 6475MHz
Voltage: 1142mV
Temperature: 34°C during the test
Kombustor cannot stand it and crashes when I start it.


Can I try more mV without any problems, or is it much better to stay at 1142 or less?
(The longevity of the card beyond 2 or 3 years is not my concern.)

Thanks all


----------



## kithylin

Fifth Horseman said:


> Obviously but from what i read it removes the power target limit completely, my question is more towards what is safe cut off point for the power target, specifically when will the vrms start to have problems is it worth going that route or is my 375 tdp already to much.


That's a big "unknown" right now, but I can give you the best answer we have at the moment. No one really knows what the actual safe limit is for power draw, or for the VRMs under that draw, on the 1080 Ti in general. Some aftermarket cards (the K|NGP|N Edition is a good example) have a stupidly over-built VRM section that can easily handle 600+ watts without any risk of damage. But for the stock / reference version of the 1080 Ti from Nvidia, no one knows; Nvidia is not telling us. Many users here on OCN have been using the XOC bios, and so far no one has written back in this thread or the other bios-flashing thread reporting that their 1080 Ti died from it. Now, that doesn't mean it hasn't happened to someone out there; they just may not be registered on OCN and care to comment about their experience. But so far we haven't seen anything negative reported. And it was KedarWolf who reported some games pulling up to, and a little above, 500 watts across the VRMs, and I haven't heard from KedarWolf of any issues with that card since. Personally, my 1080 Ti has been running the XOC bios every day since the 2nd month I've owned it, and I have no issues. However, the typical power draw I see in most of my gaming is 285~340 watts, according to the nvidia-smi app, even with the XOC bios on, pushing 1.200v and a 2138 mhz max boost clock.

So in short: no one knows what's safe and unsafe. But so far nothing negative has been reported about the XOC bios, and (so far) it hasn't killed anyone's 1080 Ti that we know of, no matter how much power we're pulling across the VRMs while using it.
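A minimal sketch of reading that draw programmatically: parsing the output of `nvidia-smi --query-gpu=power.draw --format=csv,noheader`. The sample string below is made up; on a real system the text would come from running nvidia-smi via subprocess:

```python
# Parse nvidia-smi power-draw output, one line per GPU like "287.44 W".
# The sample string is illustrative, not a real capture.

def parse_power_draw(smi_output):
    """Return power-draw values in watts from nvidia-smi CSV output."""
    return [float(line.strip().split()[0])
            for line in smi_output.strip().splitlines()]

sample = "287.44 W\n"
print(parse_power_draw(sample))  # [287.44]
```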


----------



## Fifth Horseman

GDS said:


> Hello All
> 
> I am coming back to you to know if this OC is pretty acceptable?
> i am not a gamer, just for rendering 360° videos and also photogrammetry, so it will work at this frequency ofr an average of 10 hours
> 
> So i am under water (EKWB rig)
> 1080Ti Aorus waterforce waterbloc
> 
> with OCCT (more than 10 mins):
> GPU: 2152MHz
> GDDR: 6475MHz
> voltage: 1142mV
> temperature: 34°C during the test
> Kombustor cannot stand it and crashes when i start it
> 
> 
> can i try more mV without any problems or much much better to stay at 1142 or less?
> (the question of longenvity of the card for 2 or 3 years it is not my interest)
> 
> thx all


It's probably crashing from your memory overclock; that's almost 13GHz effective. From what I've been reading, most are lucky to get +400-500 on the memory, and you're rocking almost double that.
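The "effective" figure comes from doubling the reported clock; a trivial sketch, assuming the monitoring tool reports the double-data-rate clock (as Afterburner does for GDDR5X):

```python
# Reported memory clock (MHz) to effective data rate (GHz), assuming the
# tool reports the DDR clock, so the effective rate is 2x.

def effective_mem_ghz(reported_mhz, multiplier=2):
    """Convert a reported memory clock to its effective data rate in GHz."""
    return reported_mhz * multiplier / 1000

print(effective_mem_ghz(6475))  # GDS's overclock: almost 13 GHz effective
print(effective_mem_ghz(5505))  # the 1080 Ti's stock 11 Gbps GDDR5X
```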


----------



## schoolofmonkey

Anyone got any thoughts on the MSI GTX1080Ti Duke OC?


----------



## kithylin

schoolofmonkey said:


> Anyone got any thoughts on the MSI GTX1080Ti Duke OC?


It's a 1080 Ti. If you're into air-cooled cards, they're all practically the same, almost all hitting nearly the same overclocks within +/- 50 MHz. It's only under water cooling that the differences matter. Plus, I hate to say this, but it's kinda pointless to spend $700 on a card right now when the GTX 1180 is supposed to come out in about 30~50 days and is supposed to debut at $700.


----------



## BLUuuE

Anyone have an issue where performance seems to fall off a cliff when overclocking?
My STRIX 1080 Ti at 2050MHz 1.100v is fully stable with the XOC BIOS and is cooled with a G12 + H55.
However, if I try 2100MHz 1.200v, the drivers don't crash, but my performance is almost halved.
VRM temperatures aren't an issue, as I've got a fan blowing directly over the aluminium heatsinks on the midplate covering the VRMs. According to HWiNFO, the GPU Ambient Temperature sensor, which I think is the VRM temperature, only hits the mid 60s.

A few other people also have this issue with their 1080 Tis on some Discord servers I'm in.

Any ideas?


----------



## ZealotKi11er

BLUuuE said:


> Anyone have an issue where performance seems to fall off a cliff when overclocking?
> My STRIX 1080 Ti at 2050MHz 1.100v is fully stable with the XOC BIOS and is cooled with a G12 + H55.
> However, if I try 2100MHz 1.200v, the drivers don't crash, but my performance is almost halved.
> VRM temperatures aren't an issue, as I've got a fan blowing directly over the aluminium heatsinks on the midplate which are over the VRMs. According the HWINFO, the GPU Ambient Temperature sensor which I think is the VRM temperature, only hits mid 60s.
> 
> A few other people also have this issue with their 1080 Tis on some Discord servers I'm in.
> 
> Any ideas?


At 1.2v the GPU is dying on the inside.


----------



## BLUuuE

ZealotKi11er said:


> At 1.2v the GPU is dying in the inside.


I know 1.2v is kinda iffy, but I've only run it for benching.

It doesn't only happen at 1.2v. The people I talked to on Discord were using the stock BIOS, and they still had the same issue: if they overclocked a bit too much, performance went down the drain.

Unigine Heaven Extreme Preset


(Spoiler: screenshot of the Heaven run not archived)
I usually get 140+ FPS here, but I'm currently at a measly 30 FPS. Power consumption is also way down and so are temps.


----------



## kithylin

BLUuuE said:


> Anyone have an issue where performance seems to fall off a cliff when overclocking?
> My STRIX 1080 Ti at 2050MHz 1.100v is fully stable with the XOC BIOS and is cooled with a G12 + H55.
> However, if I try 2100MHz 1.200v, the drivers don't crash, but my performance is almost halved.
> VRM temperatures aren't an issue, as I've got a fan blowing directly over the aluminium heatsinks on the midplate which are over the VRMs. According the HWINFO, the GPU Ambient Temperature sensor which I think is the VRM temperature, only hits mid 60s.
> 
> A few other people also have this issue with their 1080 Tis on some Discord servers I'm in.
> 
> Any ideas?


Use a second monitor to watch your clocks while running; it sounds like your card's throttling somewhere. And an AIO + fans on the VRMs most likely isn't enough for XOC + 1.200v; that's full-custom-water-only territory. Most likely your VRMs are exceeding 100°C, and the card is detecting the overheating, cutting power, and running in limp-home mode.



ZealotKi11er said:


> At 1.2v the GPU is dying in the inside.


Nope, not even remotely true. My 1080 Ti runs full performance at 1.200v @ 2138 MHz with XOC, has no performance degradation or falloff, and isn't "dying inside"; in fact it's perfectly fine. That's with a full-cover EK block and a custom loop, though. There are other users in this forum running 1.200v daily on XOC with no issues as well; 1.200v is the Nvidia-imposed limit in the card. Now, exceeding 1.200v with a potentiometer and a soldered-on volt mod, -THAT- would kill them. But so far no one has reported 1.200v doing anything negative at all to their 1080 Tis other than making them run a little hot.


----------



## BLUuuE

kithylin said:


> Use a second monitor and monitor your clocks while running, it sounds like your card's throttling somewhere. And an AIO + fans for vrm's most likely isn't enough for XOC + 1.200v, that's full-custom-water-only territory. Most likely your VRM's are exceeding 100c+ and the card's detecting overheating and shutting off power due to that and running in limp-home mode.


See my post above. 

I was only running 1.100v and the issue still occurred.


----------



## buellersdayoff

BLUuuE said:


> See my post above.
> 
> I was only running 1.100v and the issue still occurred.


In gamersnexus.net reviews, the 1080 Ti's 1% lows dropped off in tests while the average went up, when overclocked toward the upper limit of each card.


----------



## 2ndLastJedi

kithylin said:


> Fifth Horseman said:
> 
> 
> 
> Obviously but from what i read it removes the power target limit completely, my question is more towards what is safe cut off point for the power target, specifically when will the vrms start to have problems is it worth going that route or is my 375 tdp already to much.
> 
> 
> 
> That's a big "Unknown" right now in general. I can give you the best answer we have at the moment though. In general, no one really knows what the actual "Safe Limit" is for the power draw or vrms (with power draw) that the 1080 Ti can do in general. Some aftermarket cards (K|NGP|N Edition is a good example) have a stupidly over-built VRM section that can easily handle up to and above 600+ watts without any risk of damage. But the stock / reference version of the 1080 Ti from Nvidia, no one knows. Nvidia is not telling us. Many users have been using the XOC bios on OCN here and so far, no one has written back in this thread or the other bios flashing thread reporting that their 1080 Ti's have died from using the XOC bios. Now, that doesn't mean that there aren't some people out there that it's happened to. They just may not be registered on OCN and care to comment on the forums about their experience. But so far we haven't seen anything negative reported about it. And it was KedarWolf that reported some games causing their 1080 Ti to suck up to and a little above 500 watts across the VRM's and I haven't heard from KedarWolf yet that they've had any issues with their card. Personally my 1080 Ti has been running with the XOC bios every day since the 2nd month I've owned it, and I have no issues. However, the typical power draw -I- see in most of my gaming is 285~340 watts with my 1080 Ti, according to nvidia-smi app, even with the XOC bios on and pushing 1.200v and 2138 mhz max boost clock.
> 
> So in short: No one knows what's safe and unsafe. But so far nothing negative is reported about the XOC bios, and (so far) it hasn't killed anyone's 1080 Ti (that we know of), no matter how much power we're pulling across the vrm's while using it.

How accurate is HWiNFO64 for reading GPU power usage?
If I run at 0.900 V at 1911 MHz it shows 130 W, and fully OC'd with 1.062 V at 2088 core and 5960 memory, only 192 W! Power and temp sliders at max on my Galax 1080 Ti EXOC.
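For what it's worth, those two readings are at least self-consistent. As a rough sanity check (this ignores static leakage, so it's only a crude model), dynamic power scales approximately with frequency times voltage squared, and scaling the 130 W reading at 0.900 V / 1911 MHz up to 1.062 V / 2088 MHz lands right near the reported 192 W:

```python
def scale_power(p0_w, v0, f0_mhz, v1, f1_mhz):
    """Rough dynamic-power model: P ~ f * V^2 (ignores static leakage)."""
    return p0_w * (v1 / v0) ** 2 * (f1_mhz / f0_mhz)

# Scale the 130 W @ 0.900 V / 1911 MHz reading to 1.062 V / 2088 MHz.
predicted = scale_power(130, 0.900, 1911, 1.062, 2088)
print(f"predicted ~{predicted:.0f} W")  # ~198 W, close to the 192 W reading
```

So HWiNFO's numbers at least track each other the way physics says they should, which is some evidence the sensor readings aren't garbage.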


----------



## kithylin

BLUuuE said:


> See my post above.
> 
> I was only running 1.100v and the issue still occurred.


I think what might be happening is that during overclocking you get a "Driver has failed and been reset" crash. With Nvidia, once this has happened your card runs in fail-safe mode at significantly reduced performance until you reboot the computer. Sounds like that's likely what it is: if you ever get a driver reset, you have to reboot before you get full performance back.


----------



## krizby

kithylin said:


> That's a big "Unknown" right now in general. I can give you the best answer we have at the moment though. In general, no one really knows what the actual "Safe Limit" is for the power draw or vrms (with power draw) that the 1080 Ti can do in general. Some aftermarket cards (K|NGP|N Edition is a good example) have a stupidly over-built VRM section that can easily handle up to and above 600+ watts without any risk of damage. But the stock / reference version of the 1080 Ti from Nvidia, no one knows. Nvidia is not telling us. Many users have been using the XOC bios on OCN here and so far, no one has written back in this thread or the other bios flashing thread reporting that their 1080 Ti's have died from using the XOC bios. Now, that doesn't mean that there aren't some people out there that it's happened to. They just may not be registered on OCN and care to comment on the forums about their experience. But so far we haven't seen anything negative reported about it. And it was KedarWolf that reported some games causing their 1080 Ti to suck up to and a little above 500 watts across the VRM's and I haven't heard from KedarWolf yet that they've had any issues with their card. Personally my 1080 Ti has been running with the XOC bios every day since the 2nd month I've owned it, and I have no issues. However, the typical power draw -I- see in most of my gaming is 285~340 watts with my 1080 Ti, according to nvidia-smi app, even with the XOC bios on and pushing 1.200v and 2138 mhz max boost clock.
> 
> So in short: No one knows what's safe and unsafe. But so far nothing negative is reported about the XOC bios, and (so far) it hasn't killed anyone's 1080 Ti (that we know of), no matter how much power we're pulling across the vrm's while using it.


There's a YouTube channel that specializes in analyzing VRM capability: Actually Hardcore Overclocking. He also does part-time work on the Gamers Nexus channel, and he analyzed the 1080 Ti FE along with many other 1080 Tis.






Apparently the 1080 Ti FE VRM can easily handle 400 A (at 1.2 V that's 480 W of power draw) while producing 40 W of heat, and can even go up to 800 A, but then produces 156 W of heat by itself (LN2 territory, or you've got those 17 W/mK Fujipoly pads on the VRM lol). So you can go crazy on the power draw with the card under water (any 1080 Ti out there), however:
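Those figures are just Ohm's-law arithmetic, for anyone who wants to plug in their own numbers (the efficiency line is my own derivation from the quoted heat loss, not from the video):

```python
def vrm_output_power(current_a: float, vout_v: float) -> float:
    # Power delivered to the GPU core by the VRM: P = I * V.
    return current_a * vout_v

def vrm_efficiency(output_w: float, heat_w: float) -> float:
    # Input power is what's delivered plus what the VRM burns as heat.
    return output_w / (output_w + heat_w)

out_w = vrm_output_power(400, 1.2)   # 480 W delivered at 400 A / 1.2 V
eff = vrm_efficiency(out_w, 40)      # ~92% efficient given 40 W of VRM heat
print(out_w, f"{eff:.1%}")
```

At 800 A the same math gives 960 W out with 156 W of heat, i.e. roughly 86% efficient, which is why that regime is LN2 territory.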






TL;DR: there seems to be a hardware limiter on the die, or something funky going on, that lowers performance if you increase the vcore too much; past that point, a lower core clock actually produces a higher score.


----------



## kithylin

krizby said:


> TL;DR: there is a hardware limiter on die or something funky going on that would lower the performance if you increased the vcore too much, IE: When you increased vcore too much, a lower core clock would produce higher score.


I've read about this, and it's related to clocks; it's not what you think. There's something in Pascal that starts throttling itself back above a threshold, 40°C I think. To maintain full clocks at higher voltages and still get full performance, you need to keep the card at or under 50°C during full load. I've seen this too: my custom-water-cooled 1080 Ti drops off in performance if it hits 60°C when I'm trying to run around 1.200v and 2150 MHz, for example. I ended up settling on 2138 MHz, as that's where it runs stable with no performance drop. But now that it's summer here in Texas, I'm running a bit on the too-hot side and I've had to drop down to 2088 MHz for the season. Maybe next year I can afford a proper case like a TT Core X9, where I can fit around 3x 420mm rads, run it full speed, and handle summer better.


----------



## krizby

Yeah, my 1080 Ti starts thermal throttling at 35°C (not exactly throttling; it increases vcore by 1 bin). I think this is when thermal throttling starts. When the GPU reaches a throttle point, the card will either:
- increase vcore by 1 bin to maintain clock, or
- reduce clock by 1 bin if already at max vcore.
The temperature steps at which thermal throttling happens are below:

(screenshot of the throttle-point table not archived)

That means when your card goes above 60°C, the clock drops by 5 bins (from 2150MHz down to 2088MHz, roughly 62MHz).
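That stepping behavior can be modeled as a simple lookup: every threshold at or below the current temperature costs one clock bin. This is only a sketch; Pascal bins are roughly 12.5 MHz, and the threshold values below are placeholders since the actual table was in the missing screenshot:

```python
BIN_MHZ = 12.5  # approximate Pascal clock-bin size

# Hypothetical throttle points in °C; the real table came from the screenshot.
THROTTLE_POINTS = [35, 42, 48, 54, 60]

def throttled_clock(base_mhz: float, temp_c: float) -> float:
    """Drop one clock bin for every throttle point at or below the current temp."""
    bins_dropped = sum(1 for t in THROTTLE_POINTS if temp_c >= t)
    return base_mhz - bins_dropped * BIN_MHZ

print(throttled_clock(2150, 30))  # below all points: full 2150 MHz
print(throttled_clock(2150, 61))  # above 60°C: 5 bins down, ~2088 MHz
```

Five bins at 12.5 MHz each is 62.5 MHz, which matches the observed 2150 to 2088 drop within a bin's rounding.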


----------



## kithylin

krizby said:


> Yeah my 1080ti starts thermal throtting at 35C (not exactly throttle but it increase vcore by 1 bin), I think this is when thermal throttling start, when gpu reach the thermal throttling points the card will either:
> _Increase vcore by 1 bin to maintain clock
> _Reduce clock by 1 bin if at max vcore
> Temperature steps at which thermal throttling happen is below
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That means when your card reaches above 60C the clock reduces by 5 bin (from 2150mhz down to 2088mhz - 63mhz)


Fascinating. The thing is (I think; I may be wrong here), that was tested with a stock/reference card. For some reason I'm finding that if you're using XOC and pushing 1.200v and higher clocks, when you reach 60~65°C it completely dumps the card and either crashes the driver or causes the massive performance falloff seen in the photos there. Probably because it's an LN2-designed bios that's meant to run sub-zero at all times and was never supposed to see anything above 0°C, so it doesn't surprise me that XOC acts weird at or above 60°C.


----------



## Fifth Horseman

I haven't noticed any drop-off yet, but I'm still in the early overclocking and stress-test process. With the memory tweaking I've done, I'm pulling close to 320 watts and sitting at around 43°C under full load, courtesy of dual 560mm rads. I'm very excited, though; early testing has shown I've gotten a very, very good GPU.


----------



## kithylin

Fifth Horseman said:


> I havent noticed any drop off yet, but im still in the early overclocking and stress test process however with the memory tweaking ive done im pulling close to 320 wats power and sitting at around 43*c under full load courtesy of dual 560mm rads. I am very excited though early testing has shown i have gotten a very very good gpu.


Yeah.. I'm currently on a single 360mm rad with my i7-5820K @ 4.5 GHz piped through it along with the GPU. I planned to expand my cooling, but I need a bigger case, and other things took financial priority: I bought a second used vehicle and am doing the maintenance the previous owner neglected, standard used-car stuff. I still have to buy brake pads and another alternator for it soon, and still need to sort out the storage situation in my second 'budget beast' computer.. it'll happen some time. I'm planning to sit on my 1080 Ti at least until the GTX 1280s come out, if not the 1380s, so I have plenty of time to get the cooling sorted out later.

Although, after having to replace the radiator in our recently purchased truck, I did realize that an automotive radiator at $120 would be cheaper than buying a new case plus 3x 420mm computer radiators. The thought has occurred to me to just buy a truck radiator, mount a box fan to it, and figure out how to pipe my computer's cooling loop through it :>


----------



## Fifth Horseman

kithylin said:


> Yeah.. I'm currently on a single 360mm rad and have my i7-5820K @ 4.5 ghz piped through it too with the gpu. I planned to expand my cooling but need a bigger case and other things with finances took priority, bought a second vehicle used and doing maintenance on it the previous owner neglected, standard used car stuff. Still have to buy brake pads and another alternator for it soon. And still need to sort out the storage situation in my second 'budget beast' computer.. will happen some time. I'm planning to sit on my 1080 Ti at least until the GTX 1280's come out, if not the 1380's possibly so I have plenty of time to get the cooling sorted out later.
> 
> Although I did realize recently after having to replace the radiator in our recently purchased truck that an automotive radiator for $120 would be cheaper than buying a new case + 3x420mm computer radiators. The thought has occurred to me to just buy a truck radiator and mount a box fan to it, and figure out how to pipe my computer's cooling system through it:>


That's an interesting option, and epoxying G1/4 fittings to the inlet and outlet would probably be easy. I fear the aluminum would become too much of a headache, though; the chemicals you'd have to add to keep it from corroding would probably destroy hard and soft tubing, and probably ruin D5 pumps too. If you could find a car radiator that isn't aluminum, that would probably be best.


----------



## Streetdragon

@kithylin
Little question: how big is your temp difference between GPU and water? I have the EK block too, and with 1V 2000MHz the GPU runs 8-9° over water. I think that is way too high. I use liquid metal on the GPU.

Maybe I have to clean my loop and reseat the block...

Distilled water and a silver coil are OK, or?


OK, forget that with the silver^^ maybe I'll use some red coolant, or blue....


----------



## kithylin

Streetdragon said:


> @kithylin
> Little question: How high is your temp differentz between GPU and water? I have the EK block to and with 1V 2000Mhz the GPU runs 8-9° over water. I think that is way to high. i uses liquit metal on the gpu.
> 
> Mybe i have to clean my loop and reseat the block...
> 
> Disstield water and a silver coil are ok or?
> 
> 
> ok forget that with the silver^^ maybe i use some red coolant or blue....


I've been running 98% distilled water + 2% automotive antifreeze in my computer loop for going on 7 years now and have yet to have a single issue; most of my water-cooled computers run multiple years without draining the loop. The longest was 5 years for my X58 system, and when I drained it, everything came out clean; a simple flush with straight distilled water and it was spotless again. Super easy and super cheap: about 4 tablespoons of antifreeze to a gallon of water for me. Chemically, there's nothing man-made that transfers heat better than water itself, so in reality, for the best cooling performance you want as much actual water, and as little of anything else, in the coolant as possible.
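For anyone mixing the same way, the tablespoons-per-gallon figure converts to a percentage easily (1 US gallon = 256 US tablespoons, so 4 tbsp works out closer to 1.5% than 2%, still the same ballpark):

```python
TBSP_PER_US_GALLON = 256  # 1 US gallon = 256 US tablespoons

def antifreeze_percent(tbsp: float, gallons: float = 1.0) -> float:
    """Antifreeze as a volume percentage of the total mix."""
    total_tbsp = gallons * TBSP_PER_US_GALLON
    return 100.0 * tbsp / (total_tbsp + tbsp)

print(f"{antifreeze_percent(4):.1f}%")  # about 1.5% of the final mix
```

At these small concentrations the exact percentage barely matters for heat transfer; the antifreeze is there for corrosion and growth inhibition, not cooling.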

I have no way to measure the water temperature while running, so I have no idea; it also depends on what I'm doing with the card. I run two settings in Afterburner currently: one for "light games" that won't max out the card very often, where I can run max boost (games averaging 50%-70% utilization or below), and then if I want to play ARK: Survival Evolved or some other really demanding game, I drop the core clock down to 2088 MHz instead, given how ARK runs the 1080 Ti at 99% usage constantly even at 1080p.

Some information I can share with others: apparently a single 360mm rad isn't enough for a 1080 Ti pushing 1.200v on the XOC bios @ 2138 MHz (with an i7-5820K @ 1.400v @ 4.5 GHz in the loop too) in the summer here. It was perfectly fine through the winter with no issues, though; I just need more radiator for max clocks in the summer.


----------



## krizby

Streetdragon said:


> @kithylin
> Little question: How high is your temp differentz between GPU and water? I have the EK block to and with 1V 2000Mhz the GPU runs 8-9° over water. I think that is way to high. i uses liquit metal on the gpu.
> 
> Mybe i have to clean my loop and reseat the block...
> 
> Disstield water and a silver coil are ok or?
> 
> 
> ok forget that with the silver^^ maybe i use some red coolant or blue....


With Kryonaut the delta between my water temp and GPU temp is 6°C; with Conductonaut the delta is 3°C. I'm using a Heatkiller IV block. For water temp I measure with an IR thermometer pointed at the reservoir.
I have a loop with a 240x60mm rad (EK CoolStream XE) dedicated to the 1080 Ti in a push-pull fan config (4 Noctua NF-F12s); the delta between room temp and GPU is only 10°C lol, probably no problem running the XOC bios @ 1.2v.


----------



## Fifth Horseman

So I thought I'd post back my results from my exhaustive overclock testing and benching. On my GTX 1080 Ti Aorus Extreme I was able to achieve 2,088MHz on the core and 6,200MHz on the memory @ 1.094v, pulling 320 watts under full load.

So definitely not the best, but I'd say a good chip, much better than most. Dual 560mm rads keep it at 49°C under OC full load; I haven't noticed any throttling.

Edit: I wanted to post some screenshots and benchmark scores, but the image upload problem is still alive and well. How can a modern website have these issues, much less a technology-based one?


----------



## Streetdragon

krizby said:


> With Kryonaut the delta temp between my water temp and gpu temp is 6C, with Conductonaut delta temp is 3C. I'm using heatkiller IV block. For water temp i measure with an IR thernometer pointing at the resevoir.
> I have a loop with 240x60mm (EK Coolstream XE) rad dedicated to the 1080ti with push pull fan config (4 noctua NF-F12), delta temp between room temp and gpu is only 10C lol, probably no problem running XOC bios @ 1.2v


I think I have to reseat my GPU block... the temp difference between water and GPU was 10° yesterday -.-
Maybe I'll just reuse the water in my loop (some clear premix; it looks clear now too.....), clean the CPU block, and get better contact on the GPU block.
BUT the delta water to air is only 3° xD Overkill loop, yeah :thumb:


----------



## krizby

Fifth Horseman said:


> So I thought i would post back my results from my exhaustive overclock testing and benching. On my GTX 1080 TI Aorus Extreme I was able to achieve 2,088mhz on core clock and 6,200mhz on memory @1.094v pullling 320watts under full load.
> 
> So definitely not the best, but i would say a good chip much better than most. dual 560mm rads keeping it at 49*C under oc full load, haven't noticed any throttling.
> 
> Edit: I wanted to post some screenshots and benchmarks scores, but the image upload problem is still alive and well, how can a modern website have these issues, much less a technology based one?


My 1080 Ti tops out at 2088MHz @ 1.093v also; memory only goes to 5750MHz. Can you post a Superposition bench at 4K Optimized? Most 1080 Ti owners here score around 10k, with XOC around 10500-10800.

Wow, with so much rad space how come your GPU is at 49°C? I only have a 240x60mm rad with 4 fans and the GPU is only at 36°C max (300W max power draw under load); room temp is 26°C. Granted, I applied Conductonaut on the GPU, but even with Kryonaut the max GPU temp is 40°C. My pump is a lowly D5 DDC pump rated at 2.5 GPM.




Streetdragon said:


> i thin i have to reseat my GPU block... temp ifferntz water and gpu was 10° yesterday-.-
> Maybe i just reuse the water in my loop(some clear premix, looks clear now too.....), clean cpu block and get a better conatct for the gpu block
> BUT the delta water o air is only 3° xD Overkill lopp yeah :thumb:


Water conducts heat better at higher temperatures, so you could be seeing some diminishing returns there.


----------



## Streetdragon

krizby said:


> Water conduct heat better at higher temperature so you could be having some diminishing returns there


So you wanna say my coolant is too cool to cool my GPU to a cooler temperature? :specool: ...... Mind -> Blown

Can't have enough cool!

--I know, a bit offtopic^^--
I think when I get some holidays I'll do a complete cleaning of my loop: replace some ZMT with rigid tubing and swap the crappy Corsair fans for Scythe SM1225GF12SH-P Grand Flex PWM... God, I love this fan.
Replace my old pump too; better safe than sorry....
I use a res-pump combo. Can I set two in series, so if one dies I have a second one? Would the water level in the two reservoirs stay the same? I haven't seen a build with two res-pump combos.


----------



## Fifth Horseman

krizby said:


> My 1080ti top at 2088mhz @ 1.093v also, memory only goes 5750mhz, can you post some superposition bench at 4k optimized ? Most 1080ti owners here got score around 10k, with XOC scores around 10500-10800.
> 
> wow with so much rad space how come your gpu is at 49C ? I only have a 240x60mm rad with 4 fans and gpu is only at 36C max (300W power draw max under load), room temp is 26C. Granted i applied Conductonaut on the GPU but even with Kryonaut max GPU temp is 40C. My pump is a lowly D5 DDC pump rated at 2.5 GPM.
> 
> 
> 
> 
> Water conduct heat better at higher temperature so you could be having some diminishing returns there



Well, you have to remember that my ambient conditions are probably not the same as yours: we're in 95-degree summer heat here and I can't afford to keep my house under 76°F. Also, while I have a dual-loop system, the loops are linked together, and I have a heavily overclocked 8700K running high volts under a full-cover waterblock. Not to mention my GPU stability test consisted of running Valley for about 2.5 hours, where it stabilized at 49°C. I also keep my fans under 500 RPM for a quiet PC, and I don't turn them up for benchmarks because that's not realistic.

So to make a long story short, my conditions are probably very different from yours.


----------



## Fifth Horseman

I ran Superposition 4K Optimized and got a score of 10384, which seems pretty good, at 45°C drawing 340 watts. I would love to post the screenshot, but I'm pretty sure this is the only website on the entire internet that you can't upload to.

Edit: a 2nd run gave 10473 points, so there must be some run-to-run variance in that benchmark.


----------



## kithylin

Fifth Horseman said:


> I ran superposition 4k optimized and received a 10384 score which seems to be pretty good @45*c drawing 340 watts of power. I would love to post the screenshot but i am pretty sure this is the only website in all of internetz thats you cant upload too.
> 
> Edit: 2nd run gave 10473 points so must be some anomalies with that benchmark


Benchmark programs like these (3DMark, Superposition, etc.) all have a run-to-run variance of about +/- 1~3% on the same hardware; it's generally attributed to margin of error. Your 10473 score was just 0.86% faster than your 10384 score, so this is normal.
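The spread between those two runs is easy to check yourself; the relative difference comes out well inside that 1~3% window:

```python
def percent_diff(a: float, b: float) -> float:
    """Relative difference of score b versus baseline score a, in percent."""
    return (b - a) / a * 100.0

delta = percent_diff(10384, 10473)
print(f"{delta:+.2f}%")  # +0.86%, well within normal benchmark variance
```

A difference only matters once it clearly exceeds that noise floor, which is why people average three or more runs when comparing overclocks.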


----------



## Fifth Horseman

So this is kind of funny. I've only been running benchmarks and stuff since I got my new Ti, and I just started gaming today and I'm noticing low frames and low GPU usage. What gives? I'm getting like 50-60 fps at like 30% GPU usage; ironically, in actual real gaming I'm getting worse frames than the R9 280x I replaced.


----------



## krizby

Fifth Horseman said:


> So this is kind of funny, I have been only running benchmarks and stuff since i have gotten my new ti, and i just started gaming today and im noticing low frames and gpu usage what gives? getting like 50-60 fps and like 30% gpu usuage ironically i am getting worse frames than the R9 280x i replaced in actual real gaming.


Well, use DDU to completely wipe the old AMD drivers, reinstall Afterburner, and check whether you have any frame cap running (RivaTuner Statistics Server or in-game). Do some Cinebench to see if your CPU is running up to spec, and use the AIDA64 memory and cache benchmark to check as well. The Aorus Z370 board is pretty funky with BIOSes; you don't know if the main or backup BIOS is running, and I had to update the BIOS a couple of times for my overclocks to stick.


----------



## jura11

Fifth Horseman said:


> So I thought i would post back my results from my exhaustive overclock testing and benching. On my GTX 1080 TI Aorus Extreme I was able to achieve 2,088mhz on core clock and 6,200mhz on memory @1.094v pullling 320watts under full load.
> 
> So definitely not the best, but i would say a good chip much better than most. dual 560mm rads keeping it at 49*C under oc full load, haven't noticed any throttling.
> 
> Edit: I wanted to post some screenshots and benchmarks scores, but the image upload problem is still alive and well, how can a modern website have these issues, much less a technology based one?


Hi there 

Temperatures are a bit high for such a loop; I assume your ambient temperatures are high.

What are your water and ambient temperatures?

Regarding the Gigabyte Aorus 1080 Ti Extreme: I built a loop for my friend where we run the same card. His loop runs an EK PE360 360mm radiator on top and a Mayhems Havoc 240mm, 60mm-thick radiator on the bottom, with a 5930K at a 4.6GHz OC.

The best OC I could achieve on the Aorus Extreme 1080 Ti is 2088MHz as well, +450MHz on the VRAM, with temperatures ranging from 33-36°C and idle temperatures of 17-20°C.

The waterblock we used is a Phanteks Glacier with Phanteks TIM, and the CPU block is an EK Supremacy EVO.

Results:

FireStrike 23032, FireStrike Extreme 14210

Hope this helps 

Thanks, Jura


----------



## jura11

Streetdragon said:


> @kithylin
> Little question: How high is your temp differentz between GPU and water? I have the EK block to and with 1V 2000Mhz the GPU runs 8-9° over water. I think that is way to high. i uses liquit metal on the gpu.
> 
> Mybe i have to clean my loop and reseat the block...
> 
> Disstield water and a silver coil are ok or?
> 
> 
> ok forget that with the silver^^ maybe i use some red coolant or blue....



Hi there 

In my case, with 4x 360mm radiators plus a MO-RA3 360, my GPU vs. water temperature is in a similar range, 8-9°C, but the water delta T (ambient vs. water temperature) is 2.2-3°C while gaming in Assassin's Creed Origins or Unity with a 2088MHz OC, and GPU temperatures max out in the 35-38°C range.

The waterblock is the same as yours, an EK block, with Noctua NT-H1 TIM.

The coolant I'm using is Mayhems X1.

Usually my idle GPU temperatures are within 0.5-1°C of the water temperature; idle water delta T is 0.5-1°C.

I wouldn't worry about the water-to-GPU load delta.

Hope this helps 

Thanks, Jura


----------



## Fifth Horseman

jura11 said:


> Hi there
> 
> Temperatures are bit high for such loop,I assume yours ambient temperature are high
> 
> What are yours water and ambient temperatures?
> 
> Regarding the Gigabyte Aorus 1080ti Extreme,have built for my friend loop, where we are run same card,his loop run EK PE360 360mm radiator on top and Mayhems Havoc 240mm 60mm thick radiator on bottom with 5930k with 4.6Ghz OC
> 
> Best OC on Aorus Extreme 1080Ti I could achieve is 2088MHz as well,+450MHz on VRAM with temperatures ranging from 33-36°C with idle temperatures in 17-20°C
> 
> Waterblock we are used Phanteks Glacier with Phanteks TIM and CPU block has been EK Supermacy EVO
> 
> Results
> 
> FireStrike 23032, FireStrike Extreme 14210
> 
> Hope this helps
> 
> Thanks, Jura


Today it was 97°F, and it's about 78°F in my computer room right now, so yes, I'm sweating my ass off, and it's not helping my temps at all. Getting back to the above, though: I have my 1080 Ti directly after my monoblock, and for the record, while I do have 5GHz on my 8700K, I got a big fat turd from Intel; it requires 1.393 volts to hold that overclock, which generates a lot of heat. So while I may have gotten a golden GPU from Nvidia, I probably got the worst 8700K one could get.


----------



## Fifth Horseman

krizby said:


> Well use DDU to completely wipe old AMD drivers, reinstall afterburner, check if you have any frame cap running like rivatuner statistic server or in game, do some cinebench to see if your CPU is running up to spec, use AIDA64 memory and cache benchmark to check also. The Aorus Z370 board is pretty funky with bioses, you don't know if the main or back up bios is running, I had to update bios couple of times in order for my overclocks to stick.


Not using Afterburner; it's not even overclocked during regular gaming. Not using RivaTuner or any frame cap. The CPU is running great as usual; it just scored a nice 1642 in Cinebench. The overclock and GPU run great when I'm stress testing Valley, Heaven, and Superposition. But for instance, I just booted up vanilla World of Warcraft and was getting an erratic 80 fps with 8% GPU usage.

I remember a lot of people complaining about a similar issue with Kepler cards; hell, EVGA came up with a K-Boost thing to get around it. A lot of the games I play are old, so maybe Nvidia drivers are just crap for anything not brand new.

Edit: just for funnies, I changed my game resolution to 720p and dropped to 40-70 fps with 3% GPU usage. Clearly there's some kind of power-saving bullcrap going on here; I'm just not sure where.


----------



## kithylin

jura11 said:


> Hi there
> 
> In my case with 4*360mm radiators plus MO-ra3 360mm my GPU vs water delta is in similar range 8-9°C but water delta T(ambient vs water temperature) is in 2.2-3°C during the gaming in Assassin's Creed Origins or Unity with 2088MHz OC and GPU temperatures are in range 35-38°C as max
> 
> Waterblock is same as yours EK block with Noctua NT-H1 TIM
> 
> Coolant I'm using Mayhems X1
> 
> Usually my idle GPU temperatures are within 0.5-1°C from water temperature, idle water delta T is in 0.5-1°C
> 
> I wouldn't worry about the water and GPU load delta
> 
> Hope this helps
> 
> Thanks, Jura


Never mind delta T and all that. What are your actual max GPU temps? You're seeing an actual 38°C max? I'm trying to understand what you're writing: what's your average ambient temperature in the room? And are you running the XOC BIOS at 1.200V with max overclocks, or just the stock BIOS at 2088MHz?


----------



## jura11

kithylin said:


> Nevermind Delta or any of that stuff.. what is your actual gpu max temps? You're seeing actual 38c max temps? I'm trying to understand what you're writing, what's your average actual ambient temps in the room? And are you trying XOC bios @ 1.200v and max overclocks? Or just stock bios and 2088 mhz?


Hi there,

Max GPU temperatures are, as I said, in the 35-38°C range; it really depends on ambient temperature.

Average ambient is hard to say: 26-28°C during the day and 19-21°C at night, no A/C.

No XOC BIOS, just the stock FE BIOS. In Assassin's Creed Origins I can run 2164MHz, but in Assassin's Creed Unity 2088-2113MHz is my max OC. In other games, like FC5, 2113MHz is pretty much the max I can achieve: 2088MHz at 1.075V, 2113MHz at the same voltage, and 2164MHz at 1.093V.

I will edit my post so it's easier to understand.

Hope this helps,

Thanks, Jura


----------



## jura11

Here is a screenshot of Aquasuite after I played AC:U, 2088MHz on the 1080 Ti at 1.075V on the stock FE BIOS. Just ignore the interior/inside-case temperatures: I'm using a Phobya temperature sensor there, which isn't the best and seems to misread by about 1-1.5°C when I compare it against the Aquacomputer temperature probes.

The ambient temperature sensor is in front of the case and doesn't touch any metal; the inside-case sensor is close to the reservoir. I have a spare normal AQ sensor which I will swap in.

Hope this helps,

Thanks, Jura


----------



## kithylin

jura11 said:


> Hi there
> 
> Max GPU temperatures are in range as I said 35-38°C, really depends on ambient temperatures
> 
> Average ambient hard to say, during the day 26-28°C and during the night 19-21°C, no A/C
> 
> No XOC BIOS, stock FE BIOS, in Assassin's Creed Origins I can run 2164MHz but in Assassin's Creed Unity 2088-2113MHz are my max OC, in any other games like is FC5 2113MHz is pretty much max what I can achieve, 2088MHz at 1.075v and 2113MHz at same voltage and 2164MHz with 1.093v
> 
> 
> I will edit my post which should be easier to understand
> 
> 
> Hope this helps
> 
> Thanks, Jura


Actually it does, thank you for that. I have an air conditioner here in the computer room where my tower is, but it cycles: I have it come on at 29°C and switch off at 23°C, so there's a "hot side" and a "cold side" during the day to keep A/C power use down. Our outside temps the past few days have been hovering around 39-41°C, so it would be much hotter if not for the A/C pushing it down occasionally. I may just drop back my overclock, leave the XOC BIOS on, and stay at 2088MHz with lower voltage forever; not sure yet. I was curious what upgrading the cooling from a single 360mm rad to 4x radiators might be like, and I'm still not sure whether that would be enough to keep my card under 40°C at all times like I want.
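For what it's worth, the loop temps being traded in this thread can be sanity-checked with simple addition. A minimal back-of-envelope sketch (the helper and values are illustrative, based on the deltas jura11 reported, not measurements):

```python
# Steady-state GPU core temperature is roughly: ambient temperature
# + water-to-ambient delta (set by radiator area and fan speed)
# + GPU-to-water delta (set by the block and TIM).

def estimated_gpu_temp(ambient_c, water_delta_c, block_delta_c):
    """Rough steady-state GPU core temperature in °C (illustrative only)."""
    return ambient_c + water_delta_c + block_delta_c

# jura11 reported ~2-3°C water-to-ambient and ~8-9°C GPU-to-water:
print(estimated_gpu_temp(28.0, 3.0, 9.0))  # hot-day worst case -> 40.0
print(estimated_gpu_temp(23.0, 3.0, 9.0))  # A/C "cold side" -> 35.0
```

By this estimate, keeping the card under 40°C at 28°C ambient needs the combined deltas under ~12°C, which is why adding radiator area (shrinking the water-to-ambient term) is the usual lever.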


----------



## Fifth Horseman

Just thought I would post an update on my situation. If I run a demanding game like Rise of the Tomb Raider in the background alongside my older games, I get much better fps, 100+. It seems like the issue is the game not telling the GPU to do more work; I have to force it to work in odd ways.


----------



## krizby

Fifth Horseman said:


> Just thought I would post an update with my situation. if i run a demanding game like Rise of the tomb raider in the background with my older games i get much better fps 100+. seems like the issue is the game not telling the gpu to do more work i have to force it to work in odd ways.


Open the NVIDIA Control Panel --> Manage 3D Settings --> Power management mode and choose "Prefer maximum performance". This way your card maintains its 3D clocks all the time and uses a little more power when idle.
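As a side note, one way to confirm whether the card is actually holding its full performance state is `nvidia-smi`'s query mode. A small sketch; the check below runs on a canned sample line, since the real output depends on your hardware:

```shell
# Live query (requires an NVIDIA driver installed):
#   nvidia-smi --query-gpu=pstate,clocks.gr --format=csv,noheader
# Sample of the CSV line it emits, used here so the check can be demonstrated:
sample='P0, 1987 MHz'

# First field is the performance state: P0 = full 3D clocks, P8 = idle
pstate=$(printf '%s' "$sample" | awk -F', ' '{print $1}')
if [ "$pstate" = "P0" ]; then
  echo "card is holding its full 3D performance state"
else
  echo "card has dropped to power-saving state $pstate"
fi
```

If the card reports P5/P8 while a game is running, it is downclocking regardless of what the control panel setting says.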



jura11 said:


> Hi there
> 
> Max GPU temperatures are in range as I said 35-38°C, really depends on ambient temperatures
> 
> Average ambient hard to say, during the day 26-28°C and during the night 19-21°C, no A/C
> 
> No XOC BIOS, stock FE BIOS, in Assassin's Creed Origins I can run 2164MHz but in Assassin's Creed Unity 2088-2113MHz are my max OC, in any other games like is FC5 2113MHz is pretty much max what I can achieve, 2088MHz at 1.075v and 2113MHz at same voltage and 2164MHz with 1.093v
> 
> 
> I will edit my post which should be easier to understand
> 
> 
> Hope this helps
> 
> Thanks, Jura


I highly doubt you can run 2113MHz @ 1.075V and then jump to 2164MHz @ 1.093V and be stable across many games; if your card is stable at 2113MHz/1.075V, then it's maybe stable at 2138MHz/1.093V at most. One reason you are stable in AC:O is that, because of power limits, your card never actually boosts to the highest clock/vcore bin and just hovers around 2113MHz/1.075V. In older games your card will actually reach the highest clock/vcore bin, and that's why it crashes.

The easiest way to find a stable overclock is to first set the core clock offset in Afterburner and stress test a bunch, THEN modify the voltage/frequency curve by raising the clock by 12-13MHz at the 1.075V, 1.081V, and 1.093V voltage points.
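To make the bin arithmetic concrete, here is a small sketch of the raise-then-back-off process described above. The helper functions and curve values are hypothetical, assuming Pascal's roughly 13MHz clock-bin granularity:

```python
BIN_MHZ = 13  # approximate Pascal boost clock-bin step (assumption)

# Hypothetical stable points found by stress testing: voltage (V) -> clock (MHz)
curve = {1.075: 2088, 1.081: 2100, 1.093: 2113}

def raise_one_bin(curve):
    """Raise every voltage point on the curve by one ~13 MHz bin."""
    return {v: mhz + BIN_MHZ for v, mhz in curve.items()}

def back_off(curve, voltage):
    """After a crash at `voltage`, drop that point back one bin."""
    new = dict(curve)
    new[voltage] -= BIN_MHZ
    return new

candidate = raise_one_bin(curve)     # {1.075: 2101, 1.081: 2113, 1.093: 2126}
stable = back_off(candidate, 1.093)  # crashed at 1.093 V, so back to 2113 MHz
print(candidate, stable)
```

This mirrors jura11's "lower the OC by one bin, rinse and repeat" approach: step each tested voltage point up one bin, stress test, and back off only the points that crash.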


----------



## Fifth Horseman

krizby said:


> Open nvidia control panel --> manage 3d settings --> power management mode and choose prefer maximum performance, this way your card maintain it's 3d clocks all the time and use a little more power when idle



Yeah, already tried that; it doesn't do anything.


----------



## kithylin

Fifth Horseman said:


> Yea already tried that it does not do anything.


Exactly what "older games" are you trying? It's perfectly normal for the newer GTX 900 and GTX 1000 series not to be very fast in very old games. If we're talking old DirectX 6, 7, 8, and some DirectX 9 games, then that's perfectly normal behaviour: the newer cards actually run quite a bit slower in those older games than, say, a GTX 780 would in the same titles. This is a known behaviour with NVIDIA video cards. The newer GPUs are optimized via the drivers for DirectX 10, 11, and 12 titles, with almost no optimization at all for older games. Most people don't buy a 1080 Ti for playing old games; most people buy a 1080 Ti to run today's latest and greatest AAA titles.


----------



## jura11

krizby said:


> Open nvidia control panel --> manage 3d settings --> power management mode and choose prefer maximum performance, this way your card maintain it's 3d clocks all the time and use a little more power when idle
> 
> 
> 
> I highly doubt you can run 2113mhz @ 1.075v and then jump to 2164mhz @ 1.093v and be stable across many games, if your card is stable at 2113mhz/1.075v then maybe stable at 2138mhz/1.093v maximum. One reason you are stable is AC:O is because of power limits your card never actually boost to the highest clock/vcore and just hover around 2113mhz/1.075v. In older games your card will actually reach the highest clock/vcore bin and that why it crashes.
> 
> The easiest way to find the stable overclock is first set the core clock offset in afterburner, stress test a bunch THEN modify the voltage/freq curve by raising the clock by 12-13mhz at 1.075v, 1.081v and 1.093v voltage points.


As I said, in AC:O I'm running 2164MHz; the same applies to many other games, and even Unigine Superposition, Fire Strike, etc.

I tried AC:U again last night just to test my OC, and it seems I can do 2164MHz there as well. I checked Event Viewer for what caused the previous errors, and it seems it wasn't my OC after all: that game has never been compatible with Win10, and I would get errors after 5-45 minutes and the game would crash to desktop. After disabling the tablet service I can run my OC on a daily basis without any issues.

FC5 seems happy with 2113MHz or 2128MHz at the same 1.075V; I didn't try 1.093V and a higher OC, but it doesn't matter as I finished the game a few weeks ago.

I've owned the Ti long enough to know how to OC Pascal GPUs. I have 2x GTX 1080 and a GTX 1080 Ti, all of them overclocked and all used for rendering. Depending on ambient temperature, the Ti runs from 2113-2164MHz; the EVGA GTX 1080 runs at 2100MHz max, and anything above that crashes renders or stress tests; the Manli GTX 1080 is the best OC'er, 2164MHz stable in games, rendering, stress testing, etc. I've also tried a few other Tis: an MSI Gaming GTX 1080 Ti, with which I could do 2128MHz max; an Aorus, which would do 2088MHz; an FTW3, which was good too but topped out at 2100MHz. I've also owned a few GTX 1070s, which under water would usually do above 2100MHz.

I only use curves for OC; offsets don't work for me.

The easiest way for me to find a stable OC: try a curve, and if it crashes, lower the OC by one bin, rinse and repeat. I use my GPUs for rendering and simulation in a few applications; gaming for me is just extra when I have spare time.

Regarding power or voltage limits: in AC:U I only see them when I open the map. In the game itself or in the menus there are no power limits, and clocks are stable and sit at 2164MHz.

Hope this helps,

Thanks, Jura


----------



## jura11

Fifth Horseman said:


> Today it was 97*f and it is about 78*f in my computer room right now,so yes im sweating my ass off. so its not helping my temps at all, getting back to above though i have my 1080ti directly after my monoblock, and for the record while i have 5ghz on my 8700k I got a big fat turd from intel it requires 1.393 volts to hold that overclock which generates alot of heat, so while i may have gotten a golden gpu from nvidia i got probably the worse 8700k one could get.


Hi there,

Those ambient temperatures are not high. At what speed are you running your fans, and what waterblock do you have?

I have only tried the Phanteks Glacier and Bykski waterblocks for the Aorus; the Bykski was within 2-3°C of the Glacier in my tests.

Did you try a normal block on your CPU rather than a monoblock? I've tried a few monoblocks, and all of the ones I tried just raised CPU temperatures and therefore put a lot more heat into the loop. Of course VRM temperatures are lower, but I'm not sure it's worth it; I'd rather use universal Heatkiller VRM blocks if I need them.

Try lowering the CPU OC and test again to see if temperatures improve; also raise fan and pump speeds.

Do you have a water temperature sensor?

Hope this helps,

Thanks, Jura


----------



## kithylin

jura11 said:


> As I said in AC:O I'm running 2164MHz, same applies for many other games or even Unigine Superposition,FireStrike etc
> 
> I tried last night again AC:U just to test my OC and seems I can do there as well 2164MHz OC, checked Event Viewer what caused previous errors and seems is not after all my OC, because that game never been compatible with Win10 and I got in this game 5-45 mins errors and game would crash to desktop, disabled tablet service and can run my OC on daily basis without any issues
> 
> FC:5 seems is happy with 2113MHz or 2128MHz with same voltage 1.075v, didn't tried 1.093v and higher OC but doesn't matter as I finished the game few weeks ago
> 
> I own Ti for long enough to know how to OC the Pascal GPUs, have 2*GTX1080 and GTX1080Ti, all of them are OC,all of them are used for rendering and depending on ambient temperature Ti running from 2113-2164MHz,EVGA GTX1080 running as max 2100MHz and anything above that would crash render or stress test, Manli GTX1080 is best OC'er 2164MHz stable like in games or rendering, stress testing etc, then have tried few other Ti like MSI Gaming GTX1080Ti with which I could do as max 2128MHz and Aorus which would do 2088MHz, FTW3 has been good too but 2100MHz has been max,then have owned few GTX1070's which usually under water would do above 2100MHz
> 
> I use only use curves for OC,offset doesn't work for me
> 
> Easiest way for me find stable OC, try curves and if its crash lower OC by one bin, rinse and repeat,I use my GPUs for rendering or simulation in few SW,gaming is for me just extra when I have spare time
> 
> Regarding the power or voltage limits, in AC:U I can see this only when I open map, in game itself or in menu no power limits etc and clocks are stable and sits at 2164MHz
> 
> Hope this helps
> 
> Thanks, Jura


My 1080 Ti won't even do 2000MHz core stable @ 1.087V; if I try, it just crashes the driver instantly. Same with 2050, 2088, and 2100MHz, even on water cooling. 1.087V is the highest the EVGA factory BIOS would give me, and with that (even on custom water) it was limiting me to 1950MHz and that's all. The only way I was able to get any faster was the XOC BIOS, slowly feeding it a little more voltage and trying a little higher each time. I found the only way to get at or above 2000MHz core was 1.15V for 2088MHz, and 2138MHz needs the full 1.200V.

I did make a mistake when buying mine. I was in a hurry to get one before the mining rush came in, and I THOUGHT I was buying an EVGA ICX one with the 9 sensors, 8+8 power, and all that. What I actually ended up with was the EVGA Black Edition: the one with the "ICX Cooler", not the ICX sensors. It wasn't the seller's fault; it was partially EVGA's fault. Back then they didn't clearly advertise the cards as "ICX Cooler", they just said "Black Edition ICX", and I didn't realize there was a difference. Anyway, I know Buildzoid and everyone says the 6+8 power connectors on the 1080 Ti should be enough, but I still can't help thinking the people getting 2200+ MHz on their 1080 Tis were possibly using the 8+8 ones. I'll never know now.

Anyway, maybe I have a bad sample in the silicon lottery, but based on my card and my experience with it so far, I too find it difficult to believe your 1080 Ti is doing 2100MHz with just 1.075V; that just sounds impossible to me.


----------



## Fifth Horseman

kithylin said:


> Exactly what "Older games" are you trying? Because it's perfectly normal for the new GTX 900 and GTX 1000 series to not be very fast in very old games. If we're talking old DirectX 6,7,8, and some DirectX-9 games, then that's perfectly normal behaviour and the newer cards actually do run quite a bit slower in these older games than say, if you were to use a GTX-780 in the same games. This is sort of a known behaviour with nvidia video cards. The newer GPU's are optimized via the drivers for DirectX 10, 11, and 12 titles and almost no optimization at all for older games. Most people would not buy a 1080 Ti for playing old games. Most people buy a 1080 Ti to run today's latest and greatest AAA titles.


Well, for starters, World of Warcraft is one of them, and no, it's not because it's an older game: I know 3 people with 1080s that get 140+ fps at 4K, while I'm lucky to get 50-60 at any resolution.


----------



## Fifth Horseman

jura11 said:


> Hi there
> 
> These ambient temperatures are not high, at what speed fans are you running and what waterblock do you have?
> 
> Have tried only Phanteks Glacier and Bykski waterblocks for Aorus,Bykski has been within 2-3°C from Glacier WB in my tests
> 
> Did you tried normal block on yours CPU rather than monoblock, have tried few of them and all of them which I tried just raised like CPU temperatures and therefore put lot more heat to loop, off course VRM temperatures are lower but if its worth it not sure, rather I use universal Heatkiller VRM blocks on VRM if I need to use
> 
> Try lowering CPU OC and test again if temperatures improve, raise fan and pump speeds
> 
> Do you have water temperature sensor?
> 
> Hope this helps
> 
> Thanks, Jura


I wasn't complaining about my temps; other people were. I'm not interested in changing anything with my loop; I like it just the way it is.


----------



## jura11

kithylin said:


> My 1080 Ti won't even do 2000 mhz core stable @ 1.087v, if I try it just crashes driver instantly. Same with 2050, 2088, and 2100 mhz, even on water cooling. 1.087v is the highest that the EVGA factory bios would give me. And with that (even on custom water) It was limiting me to 1950 mhz and that's all. The only way I was able to get any faster was XOC bios and slowly feeding it a little more voltage and trying a little higher and a little higher and found out the only way to get it at or above 2000 mhz core was 1.15v for 2088 mhz, and 2138 needs the full 1.200v.
> 
> I did make a mistake when buying mine.. I was in sort of a hurry to try and get one before the mining rush came in, and I -THOUGHT- I was buying an EVGA ICX one with the 9 sensors and 8+8 power and all that. But what I actually ended up with was the EVGA Black Edition.. the one with "ICX Cooler", not the ICX Sensors.. It wasn't the seller's fault, it was partially EVGA's fault back then they didn't clearly advertise the cards as "ICX Cooler", they just said "Black Edition ICX" and I didn't realize there was a difference. Anyway.. I know Buildzoid and everyone says the 6+8 power connectors on 1080 Ti should be enough, I still can't help but think that the people getting 2200+ mhz on their 1080 Ti's were using the 8+8 power ones possibly.... I'll never know now.
> 
> Anyway, maybe I have a bad sample in the silicon lotto, but based on my card and my experience with it so far, I too find it difficult to believe your 1080 Ti is doing 2100 mhz with just 1.075v, that just sounds impossible to me.


Hi there,

From what you're saying, it looks like you lost the silicon lottery.

I have used two EVGA GTX 1080 Ti Black Editions. One I used with a Raijintek Morpheus II cooler; it would do 2100MHz at 1.075V with temperatures in the 42-45°C range. The second would do 2113MHz at 1.093V under water. A friend used that card with an EVGA Hybrid AIO, and his max OC was in the 2088MHz range in colder weather; in the current hot weather there's no way he can hold the same frequency.

Wow, those voltages are way too high for my liking.

I have never run the XOC BIOS on my cards to date. A friend tried it on his card with a Raijintek Morpheus II cooler, and he too reverted back to the stock BIOS; he was able to run 2164MHz, but the power draw was way too high for his liking.

I bought my Ti when prices were much better, and I bought the Founders Edition because waterblocks were already available for those cards at the time.

I will post my screenshot from HWiNFO and you will see what I'm running: clocks, temperatures, voltage, etc.

I have no need to lie or tell you untruths.

Hope this helps,

Thanks, Jura


----------



## jura11

kithylin said:


> My 1080 Ti won't even do 2000 mhz core stable @ 1.087v, if I try it just crashes driver instantly. Same with 2050, 2088, and 2100 mhz, even on water cooling. 1.087v is the highest that the EVGA factory bios would give me. And with that (even on custom water) It was limiting me to 1950 mhz and that's all. The only way I was able to get any faster was XOC bios and slowly feeding it a little more voltage and trying a little higher and a little higher and found out the only way to get it at or above 2000 mhz core was 1.15v for 2088 mhz, and 2138 needs the full 1.200v.
> 
> I did make a mistake when buying mine.. I was in sort of a hurry to try and get one before the mining rush came in, and I -THOUGHT- I was buying an EVGA ICX one with the 9 sensors and 8+8 power and all that. But what I actually ended up with was the EVGA Black Edition.. the one with "ICX Cooler", not the ICX Sensors.. It wasn't the seller's fault, it was partially EVGA's fault back then they didn't clearly advertise the cards as "ICX Cooler", they just said "Black Edition ICX" and I didn't realize there was a difference. Anyway.. I know Buildzoid and everyone says the 6+8 power connectors on 1080 Ti should be enough, I still can't help but think that the people getting 2200+ mhz on their 1080 Ti's were using the 8+8 power ones possibly.... I'll never know now.
> 
> Anyway, maybe I have a bad sample in the silicon lotto, but based on my card and my experience with it so far, I too find it difficult to believe your 1080 Ti is doing 2100 mhz with just 1.075v, that just sounds impossible to me.


Hi there,

Here are screenshots of HWiNFO and Aquasuite; hope this helps.

Thanks, Jura


----------



## kithylin

jura11 said:


> I don't need to lie or telling you not truth
> 
> Hope this helps
> 
> Thanks, Jura


My apologies; I didn't mean to sound like I was accusing you of lying.

After reading your post, I'm just unlucky then. Kind of sad to know, but oh well. At least we do have the XOC BIOS and I'm not stuck at 1950MHz core forever; that would be even sadder.


----------



## Fifth Horseman

OK, so update: I erased my drive and reinstalled Windows to see if that would help, and it didn't. Interestingly enough, I tried something: I edited my config to force the game to run in DX11 instead of DX9, and boom, 250 fps. It appears all the DX9 games I have are suffering from this issue. Could I have missing or damaged DX9 files, and if so, how do I update or fix them? As far as I know it's built into the operating system. Also, an interesting tidbit: vanilla WoW from 2005 was not written for DX11 (hell, DX11 didn't even exist), so how it was able to run in DX11 is beyond me.

Edit: ignore the above, I was mistaken. It wasn't forcing DX11 that fixed the issue. I have been able to replicate this multiple times: I had Blizzard's website open in the background while getting help on the forums, and as stupid as this sounds, if I have the Blizzard website open behind my game I get 250+ fps; as soon as I exit the website my fps tanks. I swear it sounds like I have an evil spirit in my card; I'm about to get me a priest and some holy water.


----------



## krizby

jura11 said:


> Hi there
> 
> Here are screenshots of the HWiNFO and Aquasuite,hope this helps
> 
> Hope this helps
> 
> Thanks,Jura


It's kind of hard to believe your card is stable when it's pulling only 37W of power and already at 36°C, no?


----------



## jura11

krizby said:


> Kinda hard to believe your card is stable when pulling only 37W of power and already 36C ?


Hard to believe? I'm still on HWiNFO 5.52; I switched back to it because it's stable for me. The newer beta does show GPU power correctly, but it causes a few issues on my board.

Please check my screenshot again: the GPU power minimum is 3.666W, which must be wrong, and the maximum is 37W, which is surely wrong as well at 2113MHz, while Total GPU Power (% of TDP) sits at 120.2%, GPU load at 100%, etc.

Hope this helps,

Thanks, Jura


----------



## kithylin

jura11 said:


> Hard to believe,I'm still on HWiNFO 5.52 switched back to this as is stable for me,newer beta or newer version does show GPU power correctly but is causing few issues on my board
> 
> And please check my screenshot again and check GPU power minimum which is 3.666W which must be wrong too and maximum is 37W which surely is wrong as well there at 2113MHz and Total GPU Power(% of TDP) sits at 120.2%,GPU Load at 100% etc
> 
> 
> Hope this helps
> 
> Thanks,Jura


Just use nvidia-smi; it's completely accurate and tells you exactly what your card is drawing.
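For reference, a quick sketch of pulling board power with `nvidia-smi`'s CSV query mode; the parsing below runs on a canned sample line, since the real numbers depend on your hardware:

```shell
# Live query (requires an NVIDIA driver installed):
#   nvidia-smi --query-gpu=name,power.draw,power.limit --format=csv,noheader
# Sample of the CSV line it emits, used here so the parsing can be shown:
sample='GeForce GTX 1080 Ti, 37.66 W, 300.00 W'

# Split on ", " and strip the " W" unit to get bare numbers
draw=$(printf '%s' "$sample" | awk -F', ' '{print $2}' | sed 's/ W//')
limit=$(printf '%s' "$sample" | awk -F', ' '{print $3}' | sed 's/ W//')
echo "board draw: ${draw} W of ${limit} W"
```

A load reading anywhere near the 37W HWiNFO showed would stand out immediately against the card's power limit in this output.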


----------



## jura11

kithylin said:


> Just use Nvidia-SMI, it's completely accurate and tells you exactly what your card is drawing.


This should help; as you can see, I have 3 GPUs in my PC.

Hope this helps,

Thanks, Jura


----------



## Streetdragon

jura11 said:


> Hi there
> 
> In my case with 4*360mm radiators plus MO-ra3 360mm my GPU vs water temperature is in similar range 8-9°C but water delta T(ambient vs water temperature) is in 2.2-3°C during the gaming in Assassin's Creed Origins or Unity with 2088MHz OC and GPU temperatures are in range 35-38°C as max
> 
> Waterblock is same as yours EK block with Noctua NT-H1 TIM
> 
> Coolant I'm using Mayhems X1
> 
> Usually my idle GPU temperatures are within 0.5-1°C from water temperature, idle water delta T is in 0.5-1°C
> 
> I wouldn't worry about the water and GPU load delta
> 
> Hope this helps
> 
> Thanks, Jura


Thanks for the info! Every summer I start to panic a bit over my hardware, but your temps are more or less the same as mine, so I think it's OK. Hate summer...


----------



## junglechocolate

ASUS GeForce GTX 1080 Ti Turbo Edition

Are single-fan blower cards like this any good, especially when it comes to heat? I can get one cheap, but I'm wondering if it's worth it or if I should wait for a multi-fan one.


----------



## buellersdayoff

junglechocolate said:


> ASUS GeForce GTX 1080 Ti Turbo Edition
> 
> Are Single blower cards like these any good. especially when it comes to heat? I can get one for cheap but I am wondering if its worth it or wait for multiple fan one?


Out of the box it will run at 80-85°C at maybe 1600-1700MHz, compared to a decent dual/triple-fan cooler at lower temps and probably 1900+MHz. To extract a little more performance you'd have to crank the fan up to jet-engine sound levels, or put a water cooler on it for best performance. Dual- and triple-fan cards are usually pretty quiet up to around ~80% fan speed; after that they're headed for the airport too. I'd recommend the big-cooler cards over the blower style unless you don't mind a louder card with lower performance.


----------



## kithylin

buellersdayoff said:


> Out of the box will run at 80-85c maybe 1600-1700mhz, compared to a decent dual/triple fan cooler at lower temps and probably 1900+mhz. To extract a little more performance you'd have to crank up the fan to jet engine like sound levels or put a water cooler on it for best performance. Dual or triple fan cards are usually pretty quiet up to around ~80% fan speed after that they're headed for airport also. I'd recommend the big cooler cards over the blower style unless you don't mind a louder card with lower performance.


While all of this is true, it also completely depends on your case design. If you have any sort of non-blower card in your computer, you absolutely must have very good airflow to get the heat out of the case, or you'll end up with the same low performance as a blower cooler, even with a big 3-fan cooler on the card. 1080 Tis make a lot of heat. It also depends on the card, or your luck: I had a big twin-fan EVGA GTX 1080 Ti SC Black Edition, and with the stock air cooler, out of the box, it was doing around 75-80°C and throttling down to 1850MHz max.


----------



## junglechocolate

buellersdayoff said:


> Out of the box will run at 80-85c maybe 1600-1700mhz, compared to a decent dual/triple fan cooler at lower temps and probably 1900+mhz. To extract a little more performance you'd have to crank up the fan to jet engine like sound levels or put a water cooler on it for best performance. Dual or triple fan cards are usually pretty quiet up to around ~80% fan speed after that they're headed for airport also. I'd recommend the big cooler cards over the blower style unless you don't mind a louder card with lower performance.





kithylin said:


> While all of this is true, it also completely depends on your computer's case design. If you have any sort of non-blower card in your computer you absolutely must have very good airflow to get it out of the case or you'll end up with the same low performance as a blower cooler, even with a big 3-fan cooler on it. 1080 Ti's make a lot of heat. And also it depends on the card.. or your luck. I had a big twin-fan EVGA GTX 1080 Ti SC Black Edition card and stock with the stock air cooler, out-of-the-box, it was doing around 75c - 80c and maxed out throttled down @ 1850 mhz max.


The price is $525 for the Asus card I mentioned. It also seems a waterblock wouldn't be possible?
My case is this: https://www.newegg.com/Product/Product.aspx?Item=N82E16811124159
Right now I have a 290 Vapor-X edition and it runs really hot: I average 87-88°C, even though Vapor-X cards are supposed to have among the best cooling. I don't think I have good airflow with how packed my computer is. So would it be better to get this performance-limited card? Keep in mind I have a 55" 4K 3D OLED TV and I want to play some games in 3D at 4K.


----------



## buellersdayoff

junglechocolate said:


> The price is 525 for the Asus card I mentioned. Also it seems waterblock wouldn't be possible?
> My case is this: https://www.newegg.com/Product/Product.aspx?Item=N82E16811124159
> Right now I have a 290 Vapor edition and it runs actually hot. I avg 87-88 degs. Vapor Cards are supposed to have the best cooling. I don't think I have good airflow with how packed my computer is. So would it be better to get this limited in performance card? Keep in mind I have a 55" 4K 3d OLED TV and I want to play some games in 3D at 4k


If it's a good deal and your case has poor airflow, then it might be worth it; it will still be a good bump in performance over the 290, even if it is limited compared to other 1080 Tis.


----------



## junglechocolate

buellersdayoff said:


> If it's a good deal and your case has poor air flow then it might be worth it, it'll still be a good bump in performance over the 290 even if it is limited in performance compared to other 1080ti's.


Yeah, I have to think about it. Apparently the card is still under warranty. I just played The Witcher 3 for about 20 minutes on my Vapor-X and it was hitting 90°C. This is what the inside of my case looks like (the case was closed in this situation; when I open it, the card runs cooler). I do despise how loud blowers can get, but I'm thinking about what my case will limit. I also need to think about whether to add a fan on top, and whether it should vent air out or push air in.


I also plan on putting in my backup GTX 470 as a dedicated PhysX card. Can this still be done? I've heard blowers are best for setups like this?


----------



## ZealotKi11er

junglechocolate said:


> Yeah I have to think about it. Apparently the card is still under warranty. I just played the Witcher 3 for about 20 mins on my Vapor X and it was hitting 90 degs. This is what the inside of my case looks like:. The case was closed in this situation. When I open it, the card runs cooler. I do despise how loud blowers can get. But I am thiking about what my case will limit. I also need to think about if I need to include a fan on top to help vent air out? Or to push air in.
> 
> 
> I also plan on putting in a GTX 470 (my backup card) to use for Nvidia works. Can this still be done? Ive heard blowers are best for setups like this?


You do not need to run a second card. That used to be done for old games that had PhysX; now the GPU is fast enough on its own, especially a 1080 Ti. Take the GTX 470 and bin it.


----------



## kithylin

junglechocolate said:


> The price is 525 for the Asus card I mentioned. Also it seems waterblock wouldn't be possible?
> My case is this: https://www.newegg.com/Product/Product.aspx?Item=N82E16811124159
> Right now I have a 290 Vapor edition and it runs actually hot. I avg 87-88 degs. Vapor Cards are supposed to have the best cooling. I don't think I have good airflow with how packed my computer is. So would it be better to get this limited in performance card? Keep in mind I have a 55" 4K 3d OLED TV and I want to play some games in 3D at 4k


Just so you know, even the highest-clocking 1080 Ti struggles to run games at 4K with max settings and will typically average 30-50 FPS in most modern AAA titles at 4K with everything on ultra. Linus did a video on it recently that I can try to find for you; even the Titan V (which is ~26% faster than the 1080 Ti) still struggles to maintain 60 FPS minimums at all times in most popular modern games. You'll have to turn some settings down and run with a mix of medium/high/ultra to find something that works for you.


----------



## junglechocolate

kithylin said:


> Just so you know, even the highest-clocking 1080 Ti struggles to run games at 4K with max settings and will typically average 30-50 FPS in most modern AAA titles at 4K with everything on ultra. Linus did a video on it recently that I can try to find for you; even the Titan V (which is ~26% faster than the 1080 Ti) still struggles to maintain 60 FPS minimums at all times in most popular modern games. You'll have to turn some settings down and run with a mix of medium/high/ultra to find something that works for you.





ZealotKi11er said:


> You do not need to run a second card. That used to be done for old games that had hardware PhysX; now the GPU is fast enough on its own, especially a 1080 Ti. Take the GTX 470 and bin it.


Yeah, so question: should I buy a 1080 Ti SC now, or wait until the 1180 comes out to get a better 1080 Ti version for the same price ($520 USD for the 1080 Ti SC)? The 290 is so painful to play on right now, averaging 25 FPS. Dialing settings down to a manageable image quality doesn't do much unless I downgrade graphics significantly.


----------



## rluker5

junglechocolate said:


> Yeah, I have to think about it. Apparently the card is still under warranty. I just played The Witcher 3 for about 20 minutes on my Vapor-X and it was hitting 90°C. This is what the inside of my case looks like (the case was closed in this situation; when I open it, the card runs cooler). I do despise how loud blowers can get, but I am thinking about what my case will limit. I also need to think about whether I need a fan on top to help vent air out, or to push air in.
> 
> 
> I also plan on putting in a GTX 470 (my backup card) to use for Nvidia PhysX. Can this still be done? I've heard blowers are best for setups like this?


Maybe put a fan blowing in on the bottom of your case by the power supply; air will find its way out.
Also, as a dedicated PhysX card user, I agree with ZealotKi11er. A 1080 Ti doesn't need a PhysX card for any games that use dedicated PhysX. The 1080 Ti can handle those games just fine, and a dedicated card that adds noticeable heat or load on your power supply will only detract from the performance the 1080 Ti would provide without it.

I bought my 23W, PCIe x1 GT 730 PhysX card to help SLI 780 Tis run The Witcher 3 at 4K; they needed all the help they could get. But they removed dedicated PhysX support from that game halfway through the updates and it uses the CPU now. I think there might be some way to hack it back in, but there's no reason to if you have a 1080 Ti. The Metro and Borderlands games and some similarly easy-to-render titles use it, and you can watch your PhysX card's usage in Afterburner, but that's the extent of the benefit for a 1080 Ti.

I wish games could use it more, and if Metro Exodus uses it you can toss it in, but I would leave mine out if I were you.
I just keep mine in for when I do a BIOS reset (if I get an itch for messing with RAM timings or BCLK or microcodes), since my Aorus cards only work with UEFI and my Z97 BIOS defaults to legacy. So it's really more a convenience for fixing overclocking mistakes than a performance enhancer.

Also, $520 for a 1080 Ti SC is like $420 for a 980 Ti before the 1080 came out. It's a good deal unless you are looking for easy 4K60 with all of the games coming out over the next year. If you just want 2-3x the FPS over the 290 (when you don't have a CPU bottleneck), you should get it.

Edit: I don't know how the 1080 Ti would do 4K and 3D at the same time. Do TVs do that yet? Mine only does 1080p 3D or 4K.
HDMI 2.1 in the next cards should handle it, but if TVs aren't there yet, don't get your hopes up.


----------



## kithylin

junglechocolate said:


> Yeah, so question: should I buy a 1080 Ti SC now, or wait until the 1180 comes out to get a better 1080 Ti version for the same price ($520 USD for the 1080 Ti SC)? The 290 is so painful to play on right now, averaging 25 FPS. Dialing settings down to a manageable image quality doesn't do much unless I downgrade graphics significantly.


Well, the 1180 is supposed to come out some time between now and the end of the year (most likely before Christmas), is supposed to be at least 40% faster than the 1080 Ti (some sites say 50%), and is supposed to cost about the same $700 that a 1080 Ti costs new today, so I can't say I would recommend anyone buy a 1080 Ti today, or this year.


----------



## KedarWolf

kithylin said:


> Well, the 1180 is supposed to come out some time between now and the end of the year (most likely before Christmas), is supposed to be at least 40% faster than the 1080 Ti (some sites say 50%), and is supposed to cost about the same $700 that a 1080 Ti costs new today, so I can't say I would recommend anyone buy a 1080 Ti today, or this year.


I've seen reports saying it's more like 20-25% better than a 1080 Ti. I really don't think it'll be 40%, even when the 1180 Ti comes out.


----------



## junglechocolate

rluker5 said:


> Maybe put a fan blowing in on the bottom of your case by the power supply; air will find its way out.
> Also, as a dedicated PhysX card user, I agree with ZealotKi11er. A 1080 Ti doesn't need a PhysX card for any games that use dedicated PhysX. The 1080 Ti can handle those games just fine, and a dedicated card that adds noticeable heat or load on your power supply will only detract from the performance the 1080 Ti would provide without it.
> 
> I bought my 23W, PCIe x1 GT 730 PhysX card to help SLI 780 Tis run The Witcher 3 at 4K; they needed all the help they could get. But they removed dedicated PhysX support from that game halfway through the updates and it uses the CPU now. I think there might be some way to hack it back in, but there's no reason to if you have a 1080 Ti. The Metro and Borderlands games and some similarly easy-to-render titles use it, and you can watch your PhysX card's usage in Afterburner, but that's the extent of the benefit for a 1080 Ti.
> 
> I wish games could use it more, and if Metro Exodus uses it you can toss it in, but I would leave mine out if I were you.
> I just keep mine in for when I do a BIOS reset (if I get an itch for messing with RAM timings or BCLK or microcodes), since my Aorus cards only work with UEFI and my Z97 BIOS defaults to legacy. So it's really more a convenience for fixing overclocking mistakes than a performance enhancer.
> 
> Also, $520 for a 1080 Ti SC is like $420 for a 980 Ti before the 1080 came out. It's a good deal unless you are looking for easy 4K60 with all of the games coming out over the next year. If you just want 2-3x the FPS over the 290 (when you don't have a CPU bottleneck), you should get it.
> 
> Edit: I don't know how the 1080 Ti would do 4K and 3D at the same time. Do TVs do that yet? Mine only does 1080p 3D or 4K.
> HDMI 2.1 in the next cards should handle it, but if TVs aren't there yet, don't get your hopes up.


I am not sure, actually. I just thought it was a Blu-ray limitation. I know at 4K resolution, all I need to do is toggle my 3D to watch things at the 4K system resolution in VLC. The movies I have are all 1080p 3D, but yes, the hope was to play 4K 3D, and when not in 3D, to get around 45-50 FPS with high-to-ultra settings. Right now, medium-to-high settings get me 28 FPS at 4K with my 290 and it's painful. I'd have to go all low to see if I can hit 35 FPS.



kithylin said:


> Well, the 1180 is supposed to come out some time between now and the end of the year (most likely before Christmas), is supposed to be at least 40% faster than the 1080 Ti (some sites say 50%), and is supposed to cost about the same $700 that a 1080 Ti costs new today, so I can't say I would recommend anyone buy a 1080 Ti today, or this year.


I ended up canceling the order. I wanted it very badly, but I need to do some planning and prep for my move from Texas to California, and the 1080 Ti wouldn't get much use before then. No need to buy now, especially when buying it would distract me from my goals.


----------



## loader963

junglechocolate said:


> The price is $525 for the Asus card I mentioned. Also, it seems a waterblock wouldn't be possible?
> My case is this: https://www.newegg.com/Product/Product.aspx?Item=N82E16811124159
> Right now I have a 290 Vapor edition and it actually runs hot; I average 87-88°C. Vapor-X cards are supposed to have the best cooling. I don't think I have good airflow with how packed my computer is. So would it be better to get this card that's limited in performance? Keep in mind I have a 55" 4K 3D OLED TV and I want to play some games in 3D at 4K.


Bitspower makes them for the Asus Turbo series and they are rarer than hen's teeth. I ended up having to order straight through Bitspower's website and get it shipped from Taiwan.


----------



## ZealotKi11er

kithylin said:


> Well, the 1180 is supposed to come out some time between now and the end of the year (most likely before Christmas), is supposed to be at least 40% faster than the 1080 Ti (some sites say 50%), and is supposed to cost about the same $700 that a 1080 Ti costs new today, so I can't say I would recommend anyone buy a 1080 Ti today, or this year.


Yeah... no. +50% faster would be the 1180 Ti. Also, if this is 12nm, it's even less; 7nm is a different story. It needs to be a new architecture to be fast on 12nm. The Titan V is only 30% faster than the 1080 Ti, with 5120 CUDA cores and HBM2.


----------



## junglechocolate

loader963 said:


> Bitspower makes them for the Asus Turbo series and they are rarer than hen's teeth. I ended up having to order straight through Bitspower's website and get it shipped from Taiwan.


Part of the reason I didn't end up going with it. You just seem limited in your options for this card. I'm waiting until September.


----------



## kithylin

ZealotKi11er said:


> Yeah... no. +50% faster would be the 1180 Ti. Also, if this is 12nm, it's even less; 7nm is a different story. It needs to be a new architecture to be fast on 12nm. The Titan V is only 30% faster than the 1080 Ti, with 5120 CUDA cores and HBM2.


Sadly with these things it's all speculation at this point. But, can I come back and quote you in 6~8 months if it turns out you were wrong?


----------



## VeritronX

I'd guess it's 40-50% faster than a reference 1080 and 10-15% faster than a reference 1080 Ti. It will probably come down to effective memory bandwidth and ROP throughput being the bottleneck, like it normally does.


----------



## ZealotKi11er

VeritronX said:


> I'd guess it's 40-50% faster than a reference 1080 and 10-15% faster than a reference 1080 Ti. It will probably come down to effective memory bandwidth and ROP throughput being the bottleneck, like it normally does.


Pretty much. The 980 was only about 5% faster than the 780 Ti, but over time, as Kepler was phased out, the 980 pulled ahead. The 1080 was 10-15% faster than the 980 Ti, but since Pascal is just Maxwell v3, that's not going to change; the 1080 also had 16nm to work with. Now the 1180 has to be either 7nm or a big enough architecture change. If it's both, it will be 20%+ faster. If it's 12nm and very similar to Volta/Pascal, then we will be lucky to see 10%+.


----------



## gammagoat

Looking for suggestions on which BIOS I should use for my FE. I understand that XOC is out for me, as I am only using a water block, not a full-cover block.

I was thinking the FTW3-358, hoping that it will stop the power limiting I am getting on my OC of 2126MHz.


----------



## Thoth420

Anyone else with a 1080 Ti getting crashes in Mankind Divided? It's the only game out of hundreds tested that crashes, so I suspect it's a game setting causing it rather than the card.


----------



## mattliston

Whenever a specific game crashes with no real indicator, I go a little out in left field: I open MSI Afterburner and make sure my core clock curve is flat at my target core clock. Meaning, if I currently have it set to 1999MHz at 1.000V, I make sure nothing to the right of that point on the curve is actually above 1999MHz, so that an easy part in a game, or a menu, doesn't let the card try to boost itself higher.


This has helped me solve some really strange issues in the past with my GTX 1070 last year. I haven't really had a chance to push my current 1080 Ti too hard yet.
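For anyone who'd rather reason about it than eyeball the curve editor, the flattening described above can be sketched in a few lines. This is purely conceptual Python, not anything that talks to Afterburner; the curve points and names below are made up for the example.

```python
# Illustrative sketch of flattening a voltage/frequency curve: every point at
# or beyond the chosen voltage is capped at the target clock, so the card can
# never boost past it. Conceptual only -- this does not talk to MSI
# Afterburner; the hypothetical curve points below are invented.

def flatten_curve(points, target_voltage_mv, target_clock_mhz):
    """Cap the clock of every (voltage, clock) point at/above target_voltage_mv."""
    flattened = []
    for voltage_mv, clock_mhz in points:
        if voltage_mv >= target_voltage_mv:
            clock_mhz = min(clock_mhz, target_clock_mhz)
        flattened.append((voltage_mv, clock_mhz))
    return flattened

# Hypothetical stock boost curve: (millivolts, MHz)
curve = [(900, 1899), (950, 1949), (1000, 1999), (1050, 2037), (1093, 2063)]

# Nothing at or to the right of 1000 mV may exceed 1999 MHz:
print(flatten_curve(curve, 1000, 1999))
```

In Afterburner you'd do the same thing by hand: drag every point to the right of your chosen one down to (or below) the target clock, so menus and light scenes can't push the card into an unstable boost bin.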


----------



## Thoth420

mattliston said:


> Whenever a specific game crashes with no real indicator, I go a little out in left field: I open MSI Afterburner and make sure my core clock curve is flat at my target core clock. Meaning, if I currently have it set to 1999MHz at 1.000V, I make sure nothing to the right of that point on the curve is actually above 1999MHz, so that an easy part in a game, or a menu, doesn't let the card try to boost itself higher.
> 
> 
> This has helped me solve some really strange issues in the past with my GTX 1070 last year. I haven't really had a chance to push my current 1080 Ti too hard yet.


I have it at stock factory OC settings and it still occurs.


----------



## mattliston

Do you have MSI Afterburner?


Could try maxing out the power limit and temp limit bars, and enabling the default fan curve via Settings > Fan.


I'd also try opening the Nvidia Control Panel and setting the power management mode to "Prefer maximum performance".


----------



## kithylin

mattliston said:


> Do you have MSI Afterburner?
> 
> 
> Could try maxing out the power limit and temp limit bars, and enabling the default fan curve via Settings > Fan.
> 
> 
> I'd also try opening the Nvidia Control Panel and setting the power management mode to "Prefer maximum performance".


Usually this will result in Nvidia cards running at max boost all the time in most titles, so be warned: it may fix your issues, but it may also lead to unnecessary power draw and heat. "Prefer maximum performance" basically disables Boost 3.0.


----------



## mattliston

It does not disable GPU Boost.


My old 1070 sat at 1561MHz with this setting enabled. By itself, untouched, it would boost to over 1900MHz as normal before any OC programs touched it.

Same with my 1080 Ti: it sits under 1700MHz untouched and boosts to 1961MHz during games when not touched by OC programs (Afterburner). It uses under 100 watts at minimum 3D clocks, as the core voltage is far below 0.8V at that speed.

Having "Prefer maximum performance" set merely puts the card at its minimum 3D clock state, as far as I can tell. Just open Afterburner and look at the clock curve graph to see what that would be.


----------



## Thoth420

mattliston said:


> Do you have MSI Afterburner?
> 
> 
> Could try maxing out the power limit and temp limit bars, and enabling the default fan curve via Settings > Fan.
> 
> 
> I'd also try opening the Nvidia Control Panel and setting the power management mode to "Prefer maximum performance".


I use Precision (to control my LEDs, though I do like AB more), but I can try that in there. I already have all my games set to max performance in their respective profiles, except WoW, because it just heats the core up more for no benefit.

The card is an FTW3 Gaming, so it has a small factory OC out of the box. If the other stuff doesn't help, I will also try underclocking it to the reference base clock and see if that may be the issue.


----------



## deanstead2k15

Is it just me, or is the 1080 Ti a bit of a letdown? It's like Nvidia optimized it only for 4K.

I had a 980 Ti and could happily sit in games at a solid 120 FPS on my 1440p 165Hz monitor, so I thought I'd grab the 1080 Ti and go for 165 FPS. Nope. At 4K it crushes my 980 Ti: I can get 60 FPS in most games, and games like Far Cry 5 with the fake HDR mods look amazing in 4K, but 60 FPS is just no good for online games. At anything below 4K it's worse than my 980 Ti: I now get frame jumps from 80 to 140 and stutter all over the place, and I can't get a solid frame rate in any game at 1440p even if I turn the graphics down to low. All the card does is stay in the same FPS bracket while throttling down to about 60% GPU usage, sitting at 65°C and 80% of the 120% power limit. Why doesn't the card use its full potential to achieve more FPS? Am I missing something?

Also, for some reason, whenever I use an overclock to play a game and then click the reset button in MSI Afterburner when I'm done, the memory clock drops back to default but the GPU clock sits at 1480MHz at idle, at 54°C, instead of the 139MHz it should. The only solution I have found is to reinstall the driver every time; then it drops back to default. I'm really struggling to get along with this card. I've got a full-cover water block on the way, hoping that will help, but I'm doubtful.


----------



## MightEMatt

deanstead2k15 said:


> Is it just me, or is the 1080 Ti a bit of a letdown? It's like Nvidia optimized it only for 4K.
> 
> I had a 980 Ti and could happily sit in games at a solid 120 FPS on my 1440p 165Hz monitor, so I thought I'd grab the 1080 Ti and go for 165 FPS. Nope. At 4K it crushes my 980 Ti: I can get 60 FPS in most games, and games like Far Cry 5 with the fake HDR mods look amazing in 4K, but 60 FPS is just no good for online games. At anything below 4K it's worse than my 980 Ti: I now get frame jumps from 80 to 140 and stutter all over the place, and I can't get a solid frame rate in any game at 1440p even if I turn the graphics down to low. All the card does is stay in the same FPS bracket while throttling down to about 60% GPU usage, sitting at 65°C and 80% of the 120% power limit. Why doesn't the card use its full potential to achieve more FPS? Am I missing something? Also, for some reason, whenever I use an overclock to play a game and then click the reset button in MSI Afterburner when I'm done, the memory clock drops back to default but the GPU clock sits at 1480MHz at idle, at 54°C, instead of the 139MHz it should. The only solution I have found is to reinstall the driver every time; then it drops back to default. I'm really struggling to get along with this card. I've got a full-cover water block on the way, hoping that will help, but I'm doubtful.



The first problem sounds like a hard CPU bottleneck, a power delivery issue, or a Windows/driver problem. The overclocking thing sounds like PEBKAC. Why do you reset the card every time you're done playing a game? It automatically pulls back down to idle/2D clocks on the desktop. Some drivers can't idle below 3D clocks when running at 165Hz; are you on the latest driver? Does restarting not fix the idle clock?


Edit: Also Chrome has a tendency to hook 3D clocks on occasion.


----------



## mouacyk

deanstead2k15 said:


> Is it just me, or is the 1080 Ti a bit of a letdown? It's like Nvidia optimized it only for 4K.
> 
> I had a 980 Ti and could happily sit in games at a solid 120 FPS on my 1440p 165Hz monitor, so I thought I'd grab the 1080 Ti and go for 165 FPS. Nope. At 4K it crushes my 980 Ti: I can get 60 FPS in most games, and games like Far Cry 5 with the fake HDR mods look amazing in 4K, but 60 FPS is just no good for online games. At anything below 4K it's worse than my 980 Ti: I now get frame jumps from 80 to 140 and stutter all over the place, and I can't get a solid frame rate in any game at 1440p even if I turn the graphics down to low. All the card does is stay in the same FPS bracket while throttling down to about 60% GPU usage, sitting at 65°C and 80% of the 120% power limit. Why doesn't the card use its full potential to achieve more FPS? Am I missing something? Also, for some reason, whenever I use an overclock to play a game and then click the reset button in MSI Afterburner when I'm done, the memory clock drops back to default but the GPU clock sits at 1480MHz at idle, at 54°C, instead of the 139MHz it should. The only solution I have found is to reinstall the driver every time; then it drops back to default. I'm really struggling to get along with this card. I've got a full-cover water block on the way, hoping that will help, but I'm doubtful.


I would recommend creating a new thread and link the thread here. Please post your system specs and people will be happy to help you track down the problem. What you're experiencing is not typical of the 1080 ti.


----------



## mattliston

A more powerful graphics card will always increase CPU load at lower resolutions. The GPU chews through the graphics data at a faster rate and expects the CPU to have more work ready for it, causing extra CPU stress.



Sounds like the CPU is getting hammered.


----------



## deanstead2k15

MightEMatt said:


> First problem sounds like a hard CPU bottleneck, power delivery issue or Windows/driver problem. The overclocking thing sounds like pebkac. Why do you reset the card every time you're done playing a game? It automatically pulls back down to idle or 2D clocks when in the desktop. Some drivers can't idle under 3D clocks when running at 165Hz, latest driver? Does restarting not fix the idle clock?
> 
> 
> Edit: Also Chrome has a tendency to hook 3D clocks on occasion.



Thank you for your reply. My CPU is a 4770K; could that be the issue? I have a Gigabyte Z370 Ultra Gaming board and RAM sitting and waiting, and I'm going to pick up the 8700K on payday. I don't typically reset the overclock after gaming; I was trying to get it to go back to idling at 139MHz. A reboot does not fix the issue. It's possible I have something that uses 3D running in the background, but if that were the case, wouldn't it throttle even when it's not overclocked? If I reinstall the Nvidia driver, it goes back to a 139MHz idle instead of 1480MHz. My 980 Ti had no such issue. I might try a slightly older Nvidia driver to see if that fixes the issue, and will try running the monitor at 60Hz, but I doubt that's it, as my 4K monitor is 60Hz and it still does it when I'm using just that monitor. Thank you again for the reply; it's been driving me mad. I paid a lot for this card and I just can't get it running how I want. I will also try uninstalling Chrome to see if that helps.


----------



## jura11

Just finished a build for a friend of mine with an MSI Gaming X 1080 Ti. Best OC was 2050MHz with +350MHz on the VRAM. I expected the temperatures to be a bit better, but I've seen a max of 42-45°C.

His build consists of an i7-8700K at 5.1GHz, non-delidded, with max package temperatures in Realbench of 72-75°C at 1.35V. That's with an EK Supremacy block, an Asus Maximus X Formula, and fans spinning at 1000RPM max.

My friend is happy with the build, although I'm not sure about his GPU temperatures, which are a bit high. I checked twice that the Phanteks Glacier block is seated correctly; water temperature never goes beyond 28°C at an ambient of 23-24°C.

Hope this helps.

Thanks, Jura


----------



## VeritronX

A 4770K running at its 3.7-3.9GHz stock turbo range will probably bottleneck the 1080 Ti, especially with slower RAM.

If you have a Z87 or Z97 motherboard you could try overclocking it, but without a delid most would struggle to get over 4.2GHz. With a delid and great cooling you might get anywhere up to about 5GHz. Fast RAM also helps a lot when you want high FPS, e.g. 2400 C10.


----------



## kithylin

jura11 said:


> Just finished a build for a friend of mine with an MSI Gaming X 1080 Ti. Best OC was 2050MHz with +350MHz on the VRAM. I expected the temperatures to be a bit better, but I've seen a max of 42-45°C.
> 
> His build consists of an i7-8700K at 5.1GHz, non-delidded, with max package temperatures in Realbench of 72-75°C at 1.35V. That's with an EK Supremacy block, an Asus Maximus X Formula, and fans spinning at 1000RPM max.
> 
> My friend is happy with the build, although I'm not sure about his GPU temperatures, which are a bit high. I checked twice that the Phanteks Glacier block is seated correctly; water temperature never goes beyond 28°C at an ambient of 23-24°C.
> 
> Hope this helps.
> 
> Thanks, Jura


Except the RTX 2080 is coming for pre-order literally in 1-10 days or so.... :/ kinda sad for anyone to be spending that kinda money on 1080 Ti today.


----------



## RoBiK

kithylin said:


> Except the RTX 2080 is coming for pre-order literally in 1-10 days or so.... :/ kinda sad for anyone to be spending that kinda money on 1080 Ti today.


What "kinda money" are you talking about? I can't find a single mention of how much he paid for any of it in his post. For all we know, he could have paid nothing for it.


----------



## mattliston

If everyone waited until the newest thing came out, no one would ever build a computer.


----------



## kithylin

mattliston said:


> If everyone waited until the newest thing came out, no one would ever build a computer.


I have no idea what the person in the other post paid, but anyone who pays $700 for last-generation hardware literally a few days before the new stuff arrives at the same price... is kinda not smart.


----------



## FarmerJo

Does anyone know how to see power usage when using the XOC BIOS? I just flashed my card to it (under water, don't worry) and can see VRM temps, but I'd sometimes like to limit power usage, and in MSI Afterburner and Precision it's not letting me.


----------



## Blameless

kithylin said:


> I have no idea what the person in the other post paid, but anyone who pays $700 for last-generation hardware literally a few days before the new stuff arrives at the same price... is kinda not smart.


Even at ~$700, a 1080 Ti has a very good chance of being a better value than an RTX part in non-RTX games.


----------



## FastEddieNYC

Blameless said:


> Even at ~$700, a 1080 Ti has a very good chance of being a better value than an RTX part in non-RTX games.


I agree the Ti is still a good deal, especially considering the prices of the new cards. It was no accident that Nvidia left out performance comparisons between the 10 and 20 series in DX11 and DX12 games. Ray tracing is niche and, according to some published reports, tanks performance.


----------



## KedarWolf

FastEddieNYC said:


> I agree the Ti is still a good deal, especially considering the prices of the new cards. It was no accident that Nvidia left out performance comparisons between the 10 and 20 series in DX11 and DX12 games. Ray tracing is niche and, according to some published reports, tanks performance.


Anyone else think the Founders Edition 2080s and 2080 Tis are ugly cards?


----------



## ZealotKi11er

KedarWolf said:


> Anyone else think the Founders Edition 2080s and 2080 Tis are ugly cards?


They look good to me. The middle space looks weird. They could use RGB though.


----------



## shilka

My best friend offered me a pretty good price for my EVGA GTX 1080 FTW2, so I am seriously thinking about buying a GTX 1080 Ti, as it's more than fast enough for what I play at 1440p.
It would be way cheaper than getting an RTX 2080 or even an RTX 2070, so good idea or not?

The GTX 1080 Ti is 6,500 kr ($1,000) here in Denmark, but selling the GTX 1080 would drop that down to 2,400 kr ($375), and that's about the same price as a brand-new 6GB GTX 1060.
The pre-orders for the RTX 2080 are 6,400 to 8,200 kr, and the RTX 2080 Ti is 10,000-11,000 kr.

I don't give a damn about ray tracing, and I don't even buy games until they are 2-3 years old, so upcoming games won't be bought until much later anyway.
Ace Combat 7: Skies Unknown will be the exception to that.

I personally think GeForce RTX is the biggest ripoff of all time, and nothing so far has convinced me that the RTX cards are worth buying at those prices.
I am looking at the Asus ROG Strix model, as I am not buying another card from EVGA due to how much I hate EVGA Precision X.
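For what it's worth, the upgrade math above checks out like this (an illustrative calculation only; the kr/USD rate used here is simply the one implied by the post's own 6500 kr ≈ $1000 figure, not an official exchange rate):

```python
# Quick sanity check of the upgrade math. The exchange rate is the one
# implied by the post's own "6500 kr ~= $1000" figure, not a real quote.

ti_price_kr = 6500        # GTX 1080 Ti price in Denmark
net_cost_kr = 2400        # out-of-pocket after selling the GTX 1080
sale_price_kr = ti_price_kr - net_cost_kr   # implied GTX 1080 sale price
kr_per_usd = ti_price_kr / 1000             # implied exchange rate

print(f"GTX 1080 sells for {sale_price_kr} kr")
print(f"net upgrade cost ~= ${net_cost_kr / kr_per_usd:.0f}")
```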


----------



## Blze001

shilka said:


> My best friend offered me a pretty good price for my EVGA GTX 1080 FTW2, so I am seriously thinking about buying a GTX 1080 Ti, as it's more than fast enough for what I play at 1440p.
> It would be way cheaper than getting an RTX 2080 or even an RTX 2070, so good idea or not?
> 
> The GTX 1080 Ti is 6,500 kr ($1,000) here in Denmark, but selling the GTX 1080 would drop that down to 2,400 kr ($375), and that's about the same price as a brand-new 6GB GTX 1060.
> The pre-orders for the RTX 2080 are 6,400 to 8,200 kr, and the RTX 2080 Ti is 10,000-11,000 kr.
> 
> I don't give a damn about ray tracing, and I don't even buy games until they are 2-3 years old, so upcoming games won't be bought until much later anyway.
> Ace Combat 7: Skies Unknown will be the exception to that.
> 
> I personally think GeForce RTX is the biggest ripoff of all time, and nothing so far has convinced me that the RTX cards are worth buying at those prices.


Ray tracing is the future, but honestly it's going to be a while before it's worth buying a card specifically for it. Until they release benchmarks, we won't know for sure how the RTX cards stack up against the current generation, and the fact that they seemingly went out of their way not to talk about that during the keynote raises a red flag for me.

If you aren't going to be buying games that come out in the next few years and are purely focused on your current library, the 1080 Ti is definitely the more price-conscious choice, I'd say.


----------



## shilka

Blze001 said:


> Ray tracing is the future, but honestly it's going to be a while before it's worth buying a card specifically for it. Until they release benchmarks, we won't know for sure how the RTX cards stack up against the current generation, and the fact that they seemingly went out of their way not to talk about that during the keynote raises a red flag for me.
> 
> If you aren't going to be buying games that come out in the next few years and are purely focused on your current library, the 1080 Ti is definitely the more price-conscious choice, I'd say.


Deus Ex: Mankind Divided, Ashes of the Singularity: Escalation, and Ghost Recon Wildlands are the three games that won't run at more than 40-50 FPS with my GTX 1080 FTW2.
Stepping up to the Ti would push those numbers past 60 FPS.

The two games I spend the most time playing, due to mods, are Star Wars: Empire at War from 2006 and Sins of a Solar Empire: Rebellion from 2012.
Not really what I would call demanding games.

The only new game I want to buy at launch is Ace Combat 7: Skies Unknown, which comes out in February 2019.
Maybe I will buy Far Cry 5 if it's on a big enough sale; otherwise I might buy Project Cars 2.


----------



## s1rrah

KedarWolf said:


> Anyone else think the Founders Edition 2080s and 2080 Tis are ugly cards?


Personally, I really dig the minimalist, "monolithic" look of the card... but I'm into primitive shapes and lots of negative space, as a long-time visual designer, so I might be biased. LOL...

Joel


----------



## buellersdayoff

s1rrah said:


> Personally, I really dig the minimalist, "monolithic" look of the card ... but I'm into primitive shapes and lots of negative space as a long time visual designer and so I might be biased. LOL ...
> 
> Joel


MSI's version of the stock cooler (dual fan) looks better, IMO.


----------



## paskowitz

Anyone know, first hand, if the PNY Blower Edition uses the reference PCB?


----------



## kithylin

shilka said:


> My best friend offered me a pretty good price for my EVGA GTX 1080 FTW2, so I am seriously thinking about buying a GTX 1080 Ti, as it's more than fast enough for what I play at 1440p.
> It would be way cheaper than getting an RTX 2080 or even an RTX 2070, so good idea or not?
> 
> The GTX 1080 Ti is 6,500 kr ($1,000) here in Denmark, but selling the GTX 1080 would drop that down to 2,400 kr ($375), and that's about the same price as a brand-new 6GB GTX 1060.
> The pre-orders for the RTX 2080 are 6,400 to 8,200 kr, and the RTX 2080 Ti is 10,000-11,000 kr.
> 
> I don't give a damn about ray tracing, and I don't even buy games until they are 2-3 years old, so upcoming games won't be bought until much later anyway.
> Ace Combat 7: Skies Unknown will be the exception to that.
> 
> I personally think GeForce RTX is the biggest ripoff of all time, and nothing so far has convinced me that the RTX cards are worth buying at those prices.
> I am looking at the Asus ROG Strix model, as I am not buying another card from EVGA due to how much I hate EVGA Precision X.


I would suggest you wait until 9/20/2018, when the NDA lifts and we get to see benchmarks of the new cards vs. the 1080, before deciding whether it's worthwhile. It may not be a good idea to change cards just yet.


----------



## ClashOfClans

Anybody running a Ryzen 1800X / 1700X / 1700 with a 1080 Ti? I was wondering how the performance is and whether there's a CPU bottleneck.


----------



## jura11

paskowitz said:


> Anyone know, first hand, if the PNY Blower Edition uses the reference PCB?


Hi there,

It does look like this card is based on the reference PCB:

https://www.ekwb.com/configurator/waterblock/3831109831922#DB_inline?height=260&width=530&inline_id=comp_table


Hope this helps 

Thanks, Jura


----------



## jura11

kithylin said:


> Except the RTX 2080 is coming for pre-order literally in 1-10 days or so.... :/ kinda sad for anyone to be spending that kinda money on 1080 Ti today.


Hi there 

Sorry for the late reply. He got this card a few months back and paid, I think, around £700 for the Ti; the block he got for £60.

If he had bought the card literally a few days before the RTX launch, then for sure I would probably have recommended he return it and wait for RTX.

He is happy with the build. Temperatures are my concern, because I expected high 30s to low 40s, and that's with 2×360mm radiators plus a 240mm radiator at the front.

Hope this helps 

Thanks, Jura


----------



## paskowitz

jura11 said:


> Hi there
> 
> Does looks like this card is based on reference PCB
> 
> https://www.ekwb.com/configurator/w...ine?height=260&width=530&inline_id=comp_table
> 
> 
> Hope this helps
> 
> Thanks, Jura



Thanks, Jura!


----------



## Blze001

I think it's kinda funny that I'm gonna be joining the 1080ti club this weekend... just before everyone jumps ship to the 2080ti. XD


----------



## jura11

Blze001 said:


> I think it's kinda funny that I'm gonna be joining the 1080ti club this weekend... just before everyone jumps ship to the 2080ti. XD


Hi there 

The GTX 1080 Ti is not a bad card, depending on what games you play and at what resolution. I still think this card will be competitive against the RTX 2080 Ti; I'm expecting around a 15-20% difference between the 2080 Ti and the 1080 Ti.

I'm keeping my 1080 Ti and will probably upgrade next year when NVIDIA moves to 7nm or AMD releases Navi.

Hope this helps 

Thanks, Jura


----------



## pez

ClashOfClans said:


> Anybody running Ryzen 1800x / 1700x / 1700 with a 1080Ti? Was wondering how the performance was and if there was a cpu bottleneck.


Specs in my sig, but 1700 + Titan X and no issues here. I've seen some decreased GPU usage in certain games, but it was the same stuff I saw with my 7700K at 4.7GHz. The only real bottlenecks you'll usually see are due to the particular game/engine or to running at a low resolution. 1440p+ is the sweet spot for eliminating *most* CPU bottlenecks.


----------



## kithylin

Blze001 said:


> I think it's kinda funny that I'm gonna be joining the 1080ti club this weekend... just before everyone jumps ship to the 2080ti. XD


It's actually a pretty good time to pick up a 1080 Ti. I saw one last night for $475 after shipping on ebay used.

EDIT: Link for others. There's a bunch of em right around the $500 mark. https://www.ebay.com/sch/i.html?_fr...sc=0&LH_ItemCondition=3000&rt=nc&LH_PrefLoc=1


----------



## gecko991

Got a brand new Asus 1080ti, keeping 
it.


----------



## BLUuuE

Anyone with a Strix 1080 Ti tried running memory voltage offset at +100mV (1.45v) for daily use?

Sadly, my 1080Ti can't do 12GHz on the memory, but increasing voltage does seem to help a bit.

These are my results running Unigine Heaven Extreme preset:


Code:


freq    : min  avg   max   score comments
12000MHz: 36.4 190.2 368.4 4790  crashed at end
11800MHz: 35.7 191.9 364.5 4834

+100mV memory offset
12000MHz: 36.2 191.3 377.1 4819
12200MHz: 40.0 196.7 383.3 4954 minor artifacting

The stock memory voltage is 1.35v, but I'm not sure if it's safe to run 1.45v, maybe even 1.50v daily.

Strix 1080 Ti XOC Tools
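In case anyone wants to sanity-check the scaling, here's a quick Python sketch (numbers taken from the Heaven table above) comparing score gain vs. clock gain against the 11800 MHz stock-voltage run. Just a rough way of eyeballing whether a memory bump is actually paying off, nothing more:

```python
# Rough check: does the Heaven score scale with memory clock?
# Scores are from the table above; 11800 MHz stock voltage is the baseline.
runs = [
    (12000, 4790, "stock voltage, crashed at end"),
    (12000, 4819, "+100mV offset"),
    (12200, 4954, "+100mV offset, minor artifacting"),
]

base_freq, base_score = 11800, 4834
for freq, score, note in runs:
    d_clock = (freq - base_freq) / base_freq * 100
    d_score = (score - base_score) / base_score * 100
    print(f"{freq} MHz: {d_clock:+.1f}% clock -> {d_score:+.1f}% score ({note})")
```

Interesting that 12000 MHz at +100mV still scores slightly below the 11800 MHz stock run; 12200 only buys about 2.5% score for 3.4% more clock.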


----------



## doom26464

I just scooped up a used 1080 Ti off eBay for $500 shipped. It's a Zotac AMP! Extreme Core Edition.

Seeing lots on eBay going for $475-500 if you're willing to play the bidding game.

Should be about a 50-60% jump over my 980 Ti, right??? Also looking forward to having 11GB of VRAM.

I wanted to jump on the RTX train, but prices are too insane and the used market for the 1080 Ti is very prime right now.


----------



## kithylin

doom26464 said:


> I just scooped up a used 1080ti off ebay for 500USD shipped. Its a zotac AMP! Extreme core edition.
> 
> 
> Seeing lots on ebay going for 475-500USD if your willing to play the bidding game.
> 
> Should be about a 50-60% jump over my 980ti right??? Also looking forward to having 11gb of vram.
> 
> 
> I wanted to jump on the RTX train but prices are too insane and used market for 1080ti is very prime right now.


Yep, +60% average: http://gpu.userbenchmark.com/Compare/Nvidia-GTX-1080-Ti-vs-Nvidia-GTX-980-Ti/3918vs3439


----------



## Streetdragon

BLUuuE said:


> Anyone with a Strix 1080 Ti tried running memory voltage offset at +100mV (1.45v) for daily use?
> 
> Sadly, my 1080Ti can't do 12GHz on the memory, but increasing voltage does seem to help a bit.
> 
> These are my results running Unigine Heaven Extreme preset:
> 
> 
> Code:
> 
> 
> freq    : min  avg   max   score comments
> 12000MHz: 36.4 190.2 368.4 4790  crashed at end
> 11800MHz: 35.7 191.9 364.5 4834
> 
> +100mV memory offset
> 12000MHz: 36.2 191.3 377.1 4819
> 12200MHz: 40.0 196.7 383.3 4954 minor artifacting
> 
> The stock memory voltage is 1.35v, but I'm not sure if it's safe to run 1.45v, maybe even 1.50v daily.
> 
> Strix 1080 Ti XOC Tools


As far as I know you can't change the DRAM voltage ^^

BTW, I cleaned and reinstalled my waterblock. While I was at it, I did a shunt mod on the two shunts for the PSU connection, not the one for the PCIe slot.
Does it look good? I applied clear nail polish around AND on the sides of each shunt, leaving only the top polish-free (of course). How much extra headroom should that give?
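For anyone wondering what a shunt mod buys you on paper, here's a rough sketch. It assumes 5 mOhm stock shunts and an equal-value resistance layered on top in parallel; both values are assumptions, not measurements of this card, and a liquid-metal mod lowers the resistance by some unknown (usually larger) amount:

```python
# Ballpark math for a shunt mod (assumed values, not measured).
# The card computes power as (voltage drop across shunt) / R_shunt, so
# lowering the effective shunt resistance makes it under-report real draw.
r_stock = 0.005    # assumed 5 milliohm stock shunt
r_added = 0.005    # assumed equal-value resistance added in parallel

r_eff = (r_stock * r_added) / (r_stock + r_added)  # parallel combination
sensed_fraction = r_eff / r_stock                  # sensed power / real power

print(f"effective shunt: {r_eff * 1000:.2f} mOhm")
print(f"card senses {sensed_fraction:.0%} of real draw "
      f"-> ~{1 / sensed_fraction:.1f}x power-limit headroom")
```

With equal values the sensed resistance halves, so the card thinks it's drawing half of what it really is, i.e. roughly double the effective power limit before throttling kicks in.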


----------



## Blameless

BLUuuE said:


> The stock memory voltage is 1.35v, but I'm not sure if it's safe to run 1.45v, maybe even 1.50v daily.


I'd probably be OK with 1.45v, but iffy about 1.5v unless the memory can be kept fairly cool. Do you still see clock scaling at 1.5v?



Streetdragon said:


> as far as i know you cant change the DRAM voltage^^


A few custom models can, including the Strix.


----------



## Phantomas 007

After seeing the pre-order prices of the new RTX series, I'm thinking of getting an EVGA 1080 Ti FTW3 for 680 euros. Do you think the 1080 Ti will drop in price, so that it's worth waiting?


----------



## shalafi

ClashOfClans said:


> Anybody running Ryzen 1800x / 1700x / 1700 with a 1080Ti? Was wondering how the performance was and if there was a cpu bottleneck.


Depends. If the game is at least partly multi-threaded / modern, you will only be a few negligible FPS below the Intel chips. If the game relies heavily on a single thread and/or uses an ancient or poorly optimised engine (Euro Truck Simulator 2, The Elder Scrolls Online), you'll have a hard time feeding the card frames to render.
Direct comparison with a friend of mine in ESO: standing in the same place in a crowded town, same game options, 2560x1440 for me vs. 3440x1440 for him. My 2700X at ~4.25GHz vs. his delidded 8700K at 5GHz (on all cores); my Gigabyte 1080 Ti FE with the EVGA Hybrid cooler at 2012MHz, his MSI 1080 Ti Gaming X at whatever the stock boost is (I'd say 1970MHz-ish). While I was getting around 30 FPS, he was at 60.


----------



## shalafi

double post


----------



## mouacyk

Streetdragon said:


> as far as i know you cant change the DRAM voltage^^
> 
> 
> BTW i cleaned and reinstalled my Waterblock. While doing that i have done a shuntmod on the two for the PSU connetion. Not for the PCI slot.
> Does it look good? Applied nail polish clear around AND on the sides of the shunt. Only the top polish free(ofcourse). How much extra headroom should that give?


You must NOT have heard the stories where the liquid-metal TIM ate away the solder. I just hope whatever you're using to do the shunt mod doesn't also do that.


----------



## Streetdragon

mouacyk said:


> You must NOT have heard of the stories where the metal TIM ate away the solder. I just hope what ever you're using to do the shunt mod doesn't also do that.


Yeah, I heard about it. Because of that I "sealed" the shunts in nail polish all around and only used the TIM that was left on the Q-tip, so it's only a thin layer and can't run away or spread. It would have to eat through the shunt itself to reach the solder ^^

And the TIM won't "eat" the solder; it just lowers its melting point. That WOULD take some months to happen, and even then the shunt holds its place because of the huge amount of polish I used xD

And if the card dies after a year... I hope AMD has some nice new 7nm GPUs. Otherwise I'll just buy a used/new 1080 Ti :thumb:


----------



## kithylin

Streetdragon said:


> as far as i know you cant change the DRAM voltage^^
> 
> 
> BTW i cleaned and reinstalled my Waterblock. While doing that i have done a shuntmod on the two for the PSU connetion. Not for the PCI slot.
> Does it look good? Applied nail polish clear around AND on the sides of the shunt. Only the top polish free(ofcourse). How much extra headroom should that give?


Just so you're aware, you completely voided the factory warranty on your card; you no longer have one.


----------



## Streetdragon

Yeah, I know. I already voided it by mounting the waterblock, so it's OK xD

A little test for now: 2000MHz at 1V on the GPU, 5°C over water temperature, power at 90-100%. All in Superposition.

I'm a little happy gamer now ^^


----------



## kithylin

Streetdragon said:


> Yeah i know. i already voided it by using the waterblock. so its ok xD
> 
> a little test for now: 2000Mhz 1V GPU 5° Over water. Power was at 90-100% All in Superposition
> 
> Im a little happy gamer now^^


Depending on the manufacturer, waterblocks may not always void the warranty. I know for a fact that MSI, Gigabyte and EVGA do not void their warranties for using water blocks, as long as you don't modify the PCB (shunt mods) and you re-install the factory cooler for an RMA.


----------



## serave

Can someone talk me out of buying a 1080 Ti secondhand for around $570? I'm about to pull the trigger, since I managed to sell my Vega 56 for $480 about 2-3 weeks ago.

I don't think the new RTX cards will be that much faster than the 1080 Ti (talking about the 2080 here).


----------



## Streetdragon

kithylin said:


> Depending on the manufacturer, waterblocks may not always void warranties. I know for a fact that MSI, Gigabyte and EVGA do not void their warranty with using water blocks, as long as you don't modify the PCB (Shunt mods), and re-install the factory cooler for RMA.


I have an FE from Palit. It was the cheapest xD With that card, even looking at it voids the warranty.

BTW, a bit of testing... feels so good:

2100MHz at 1.063V

The screenshot is from the end of Superposition 4K Optimized.


----------



## wrc05

Hi,

I installed the EVGA Hybrid cooler 400-HY-5598-B1 on my Zotac FE 1080 Ti, but the two places circled in red in the pic below aren't covered by the EVGA heatsink. Should I be worried?

I also attached a pic from the EVGA manual.


----------



## Streetdragon

I would say it's OK if the cooler was made for your card model. If you are still worried, you could buy small heatsinks and put them on those places just to be extra safe.
That's what I would do ^^


----------



## wrc05

Thanks, I have Enzotech MOS-C10-LE heatsinks; I'll use them.

That Hybrid cooler is for all FE models. I wish they had extended the heatsink to cover those two places too.


----------



## Blze001

FWIW the EK block I put on my FE 1080ti this weekend didn't have thermal pads on those either. The block covers them, but I don't know if they're touching it or not.

Looks like I got a really solid card. Always a risk when you play the eBay game, but this one came in the nVidia FE box with all of the red caps, manuals, and even the DVI adapter. Literally the only thing "bad" about it was some dust, I was impressed. Working great so far, didn't really have time to run many benchmarks aside from a quick TimeSpy run, but I was pretty happy with the numbers.

Now I should probably upgrade my 1080p monitor, huh? 

Also, I never really knew that the 1080 Ti was so different from the 1070 in terms of PCB layout and hardware. For some reason I always thought the empty spots on the 1070 were filled in on the higher-spec cards. I guess that's true of the 1080, but the 1080 Ti is actually a Titan board?

Naughty GPU nudes below:


Spoiler


----------



## wrc05

That's what confuses me: the stock plate has white thermal fleece pads there. I watched the Gamers Nexus teardown video (at the 1:01 mark) and checked the EKWB full-block manual, and neither covers those spots. Maybe the gurus here can say what those chips are.

I could place an Enzotech MOS-C10-LE on the bottom chip, but on top the EVGA cover mount was in the way.



Code:


https://youtu.be/eIri88Du2AQ?t=60


----------



## Kronos8

@wrc05
That is the memory VRM area (at the top). Normally you don't expect high temps there.
But to play it safe, try to read the temp in the area and then decide the best solution.
You can install a smaller heatsink, or try to have a fan "hitting" the area, or both.


----------



## KaRLiToS

I have a couple of GTX 1080 Tis, and one is running at 73°C 24/7. Is that bad for the card?


----------



## pez

KaRLiToS said:


> I have a couple of GTX 1080 Tis, and one is running at 73°C 24/7. Is that bad for the card?


Well within spec. Extended load and temps on any card will most likely contribute to a shorter life, but who knows how much shorter?


----------



## kithylin

KaRLiToS said:


> I have a couple of GTX 1080 Tis, and one is running at 73°C 24/7. Is that bad for the card?


The cards are spec'd up to around 90°C max, so this is well within design parameters and shouldn't meaningfully shorten the card's life. It will cause it to reduce clocks and throttle a little, so it won't run as fast as it's capable of, but that has nothing to do with longevity.


----------



## wrc05

@Kronos8, OK, I'll try that.

Not to drag this out, but I wish other Hybrid users would chime in with what they did for their FE cards.


----------



## Streetdragon

wrc05 said:


> @ Kronos8, ok I'll try that.
> 
> Not to drag this, wish other Hybrid users pitched in what they did for their FE cards.


If you are worried you can add heatsinks anytime you want :thumb:


----------



## wrc05

Ok I will try to install small heatsink on top chip, thanks again ^^


----------



## pez

wrc05 said:


> @ Kronos8, ok I'll try that.
> 
> Not to drag this, wish other Hybrid users pitched in what they did for their FE cards.


I seem to have missed this. I kept the FE cooler and removed the top piece, which allows fitment of the AIO. I'm going to look for my pics of it now. If the FE cooler covers it by default, then it's covered, I believe.

EDIT:

Turns out I didn't get any pictures of the process itself. This is the best pic I have, and it doesn't really help you. However, the point still stands: if the FE cooler covers it by default, it's covered in the way I use mine.

http://i.imgur.com/3dgcbpi.jpg


----------



## wrc05

^^ Yes, because that is the 1080/1070 FE Hybrid cooler; I'm sure everyone mounted just the water block, and the stock bottom plate has the thermal fleece strips circled in red in my previous posts.

I had the same cooler, but it died out of warranty, so I bought the new 1080 Ti Hybrid cooler. This one cools the memory and VRM separately, which requires the stock bottom plate to be removed.
Hence my doubt about how or why EVGA missed providing cooling for those spots. I'll install small heatsinks. Please excuse my poor English.


----------



## deanstead2k15

deanstead2k15 said:


> First problem sounds like a hard CPU bottleneck, power delivery issue or Windows/driver problem. The overclocking thing sounds like pebkac. Why do you reset the card every time you're done playing a game? It automatically pulls back down to idle or 2D clocks when in the desktop. Some drivers can't idle under 3D clocks when running at 165Hz, latest driver? Does restarting not fix the idle clock?
> 
> 
> Edit: Also Chrome has a tendency to hook 3D clocks on occasion.
> 
> Thank you for your reply. My cpu is a 4770k could that be the issue. I have a gigabyte ultra gaming z370 board and ram sat waiting and I'm going to pick up the 8770k on payday. I don't typically reset the over clock after gaming I was trying to get it to go back to idling at 139mhz. A reboot does not fix the issue but its possible I have something that uses 3d running in the background but if that is the case wouldnt it throttle even when it's not over clocked. If I reinstall the Nvidia driver it goes back to 139mhz idle instead of 1480mhz. My 980ti had no such issue. I might try a slightly older Nvidia driver see if that's fixes the issue and will try run monitor at 60hz, but doubt that's it as my 4k monitor is 60hz and it still does it if I'm using just that monitor. Thank you again for the reply it's been driving me mad paid a lot for card and I just can't getting running how I want it to. Will also try uninstall chrome see if that helps


Just wanted to thank you again for your help. I rebuilt my rig from scratch with the Gigabyte Ultra Gaming board and an 8600K under a water block running at 4.9GHz, put an EVGA copper water block on the 1080 Ti, and fitted a custom water-cooling loop with two 240mm rads. I'm now sitting solid at 165 FPS at 1440p in CoD WWII with the GPU clocked to 2062MHz, and temps never touch 50 degrees with my fans and pump on low settings. The weird idle issue was still there, which was driving me mad, until last night when I installed the new 399 driver; I'm now back to idling at 139MHz with the overclock on, so all is right with the world. If anyone has any pointers on how to bend rigid tubing without punching a hole in the wall, it would be greatly appreciated, lmao. Attached a pic of my new rig, just 'cos I'm proud of it, apart from my crappy pipe bends.


----------



## Kronos8

wrc05 said:


> Hence my doubt how or why EVGA missed those details of not providing cooling


Don't be surprised. Although the card costs 700-800 euros, the details that get missed are impressive, and not just by EVGA but by every vendor.

And since I successfully modded my water-cooled 1080 Ti with a universal water block, I will post a picture, just to give some ideas to people who are willing to try something similar and know what they are doing. The photo is from before installation of the EK universal waterblock. The idea came to me when I saw a similar picture from an OCN user more than two years ago.


----------



## mouacyk

That's awesome -- someone managed to do universal VGA cooling correctly, by also cooling the VRMs.


----------



## kithylin

Kronos8 said:


> Don't be surprised. Although the card costs 700-800Euro, the details that are missed are impressive. Not by just EVGA but from any vendor.
> 
> And since I modded successfully my water cooled 1080 Ti with universal water block, I will post a picture, just to give some ideas to people who are willing to try something similar and know what they are doing. Photo is before installation of EK universal waterblock. The idea came to me when I saw a similar picture from an OCN user more than 2 years ago.


I guess I'll never understand why anyone would do this to an expensive video card. If you're going to all the trouble of putting heatsinks and two blocks on there (one VRM, one main block), why not just buy a proper full-cover block and do it correctly? The worst part is that you're completely killing any resale value of the card. Gluing heatsinks to the RAM chips also voids the factory warranty, and in general it looks like a terrible idea to me. No one would buy a random card that's all janky and Frankenstein'd together like that :/ And no, full-cover blocks don't usually void the factory warranty with most companies.


----------



## MrTOOSHORT

Kronos8 said:


> Don't be surprised. Although the card costs 700-800Euro, the details that are missed are impressive. Not by just EVGA but from any vendor.
> 
> And since I modded successfully my water cooled 1080 Ti with universal water block, I will post a picture, just to give some ideas to people who are willing to try something similar and know what they are doing. Photo is before installation of EK universal waterblock. The idea came to me when I saw a similar picture from an OCN user more than 2 years ago.


Looks pretty good, I like the VRM block.:thumb:


----------



## ZealotKi11er

wrc05 said:


> ^^ yes, because that is 1080/ 1070 FE Hybrid cooler, am sure everyone mounted just the water block and stock bottom plate has Thermal Fleece strips circled in red in my previous posts.
> 
> Even I had the same, it died and is out of warranty so I bought new 1080 Ti Hybrid cooler, this has memory and VRM cooling seperately which requires the stock bottom plate to be removed.
> Hence my doubt how or why EVGA missed those details of not providing cooling, I'll install small heatsinks, please excuse my poor english.


You do not need to cool those capacitors. They are only cooled because the stock plate runs hot and traps heat there.


----------



## Kronos8

kithylin said:


> I guess I'll never understand in my mind why anyone would do this to an expensive video card... if you're going to all the trouble to put heatsinks and all that junk on there and two blocks.. one vrm one main block, why not just go ahead and buy a proper full cover block to do it correctly? The worst part is you're completely killing any possible resale value of the card by doing this. As well as gluing heatsinks to the ram sticks violates the factory warranty and.. it's just in general looks like a terrible idea to me. No one would buy a random card that's all sorts of janky frankenstein'd together like that :/ And no, full cover blocks don't usually violate the factory warranty from most companies.


As I mentioned in my post, I will post a picture just to give some ideas to people who are willing to try something similar and know what they are doing. For the rest, I don't care.....


----------



## wrc05

@ZealotKi11er thanks for the info
@Kronos8 looks good and all doubts cleared


----------



## ClashOfClans

Blze001 said:


> FWIW the EK block I put on my FE 1080ti this weekend didn't have thermal pads on those either. The block covers them, but I don't know if they're touching it or not.
> 
> Looks like I got a really solid card. Always a risk when you play the eBay game, but this one came in the nVidia FE box with all of the red caps, manuals, and even the DVI adapter. Literally the only thing "bad" about it was some dust, I was impressed. Working great so far, didn't really have time to run many benchmarks aside from a quick TimeSpy run, but I was pretty happy with the numbers.
> 
> Now I should probably upgrade my 1080p monitor, huh?
> 
> Also I never really knew that the 1080ti was so much different from the 1070 in terms of PCB layout and hardware. For some reason I always thought the empty spots on the 1070 were filled for the higher-spec cards. But I guess that's true with the 1080 and the 1080ti is actually a Titan board?
> 
> Naughty GPU nudes below:
> 
> 
> Spoiler


I too recently got a 1080 Ti on eBay, like you. I'm hopefully putting it in my loop today, but I'm not sure how to tell whether it's a good card or not: what tests to run, etc., and what temps to expect under water. Comparing it to review-site benchmarks from the release date doesn't make much sense, since drivers should have matured a lot and performance should have increased since launch.


----------



## SauronTheGreat

Guys, I have a small question... will the RTX features be enabled on our 1080 Tis?


----------



## mouacyk

Our 1080 Tis do not have RT cores.

Turing has RT (ray-tracing) and Tensor (AI, for filling in the gaps) cores on top of the traditional shader cores.


----------



## serave

Has anybody tried slapping one of these onto their card?

http://www.idcooling.com/Product/detail/id/134/name/FROSTFLOW 120VGA

Looks very appealing to me as it costs only $60. I have the Zotac AMP version, which I'm pretty sure is based on the reference design.


----------



## Spiriva

SauronTheGreat said:


> guys i have a small question... would RTX function be enabled on our 1080Tis ?


No, RTX doesn't work on the 1080 Ti.


----------



## kithylin

serave said:


> anybody tried slapping one of these onto their card?
> 
> http://www.idcooling.com/Product/detail/id/134/name/FROSTFLOW 120VGA
> 
> Looks very appealing to me as it costs only $60, i have the Zotac AMP version which i'm pretty sure is based on the reference design.


Just so you know, the Zotac AMP! GTX 1080 Ti uses a fully custom PCB and is not compatible with coolers for the NVIDIA reference design.


----------



## deanstead2k15

*cooler fitment*



serave said:


> anybody tried slapping one of these onto their card?
> 
> http://www.idcooling.com/Product/detail/id/134/name/FROSTFLOW 120VGA
> 
> Looks very appealing to me as it costs only $60, i have the Zotac AMP version which i'm pretty sure is based on the reference design.


I'm pretty sure these just fit directly over the GPU chip, so custom PCB or reference shouldn't make any difference. I don't think a custom PCB changes the dimensions of the GPU mounting, as the GPU chip itself is the same on all the cards. For example, the Kraken G12, which is very similar, says "The G12 supports both reference and non-reference design AMD and NVIDIA GPUs" and does so with only a single set of brackets.


----------



## Streetdragon

And you need a heatsink for the VRM, and I would say for the RAM too. It only touches the GPU, and the fan cools everything else, more or less.

Edit: just a brain fart: it should be possible to buy an expandable AIO and a GPU waterblock. Just plumb the GPU block into the loop and let the CPU block (which usually houses the pump) hang loose. Or, better, the Swiftech AIO where the pump is at the radiator: swap the CPU block for a GPU block. That should be the cheapest GPU water loop, no?
Still way over $60 though...


----------



## serave

kithylin said:


> Just so you know the Zotac AMP! GTX 1080 Ti is a fully custom PCB and is not compatible with nvidia reference design.


Sad, I didn't know that before; the TechPowerUp GPU DB said it's indeed a reference design, but w/e.



deanstead2k15 said:


> Im pretty sure these just fit directly over the gpu chip so custom pcb or reference shouldnt make any difference. i dont think the custom psb changes the dimensions of the gpu fitting as the gpu chip itself is the same on all the cards. for example the kraken which is very similar says "The G12 supports both reference and non-reference design AMD and NVIDIA GPUs" and does this with only a single set of brackets


I think I'll just take the plunge after reading this; hopefully it works xD



Streetdragon said:


> And you need a heatsink for the vrm and i would ay for the ram too.
> It only touch the GPU and the fan cools everything else. more or less.
> 
> 
> Edit: just a brain fart: It should be possible to buy an extendable AIO and a waterblock for the GPU. Just build the GPU block in it and let the CPU-block(what has mostly the pump in it) hang around. Or better the swiftech AIO where the pump is at the radiator. Swap the CPU block for a GPU block. Should be the cheapest GPU waterloop or?
> Still way over 60$...


Nah, anything from Swiftech or EKWB, or really anything from the EU/West, gets its price marked up by $30-50 here, so sadly that's not an option.


----------



## kithylin

serave said:


> sad, didnt know that before, techpowerup gpu db said it's indeed a reference design but w/e


NVIDIA reference PCB for the GTX 1080 Ti:

Zotac AMP! Extreme GTX 1080 Ti:
https://www.guru3d.com/articles-pages/zotac-geforce-gtx-1080-ti-amp-extreme-review,5.html

There's a PCB shot about halfway down that page. Does it match your card, taller up top?


----------



## serave

kithylin said:


> Nvidia Reference PCB for GTX 1080 Ti:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Zotac AMP! Extreme GTX 1080 Ti:
> https://www.guru3d.com/articles-pages/zotac-geforce-gtx-1080-ti-amp-extreme-review,5.html
> 
> PCB shot here about half way down the page. Does this match what your card is, taller up top?


Oh, so that's what you meant before. I have the dual-fan version, which is why I said mine is a reference design XD

Anyway, I'll update if I ever actually get into it.


----------



## doom26464

At what point does a memory overclock usually stop helping with these cards?

I'm just starting to tweak my recent 1080 Ti, but I already have it at 6102MHz on the memory (I guess that makes it effectively 12204MHz).

No sign of artifacts or crashing yet. It's hard to tell whether it does much in benchmarks, especially at 1440p.
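On the "effectively doubled" number: here's a quick sketch of what the reported clock means for bandwidth, assuming the 1080 Ti's 352-bit bus and the usual convention that monitoring tools report half the effective GDDR5X data rate. The stock comparison figure assumes the reference 11 Gbps memory spec:

```python
# Memory bandwidth from the reported clock (GTX 1080 Ti: 352-bit bus).
# Tools like Afterburner report half the effective data rate, so the
# doubled figure is the one that the bandwidth math uses.
bus_width_bits = 352
reported_mhz = 6102                      # the overclock quoted above
stock_effective_mhz = 11000              # reference 11 Gbps, for comparison

effective_mhz = reported_mhz * 2         # 12204 MHz effective
bw = effective_mhz * 1e6 * bus_width_bits / 8 / 1e9             # GB/s
bw_stock = stock_effective_mhz * 1e6 * bus_width_bits / 8 / 1e9  # GB/s

print(f"{effective_mhz} MHz effective -> {bw:.0f} GB/s "
      f"({bw / bw_stock - 1:+.1%} over stock's {bw_stock:.0f} GB/s)")
```

So that overclock is roughly an 11% bandwidth bump over stock, which is about the most you can expect to see in memory-bound scenes.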


----------



## pez

The Strix OC Ti I had was the best memory OC'er across all of my Pascal-based cards; I think it OC'ed by around 500-550MHz. BF1 was the fussiest title with a VRAM OC, and I used it as a gauge of stability and performance. Essentially, noticeable gains stopped after about 100-150MHz. However, that could have been because the memory truly wasn't stable at +500... though other games saw reasonable differences that you could attribute to legitimate gains.

It's one of those things that you should test across a few games you play the most and see how it fares and adjust accordingly.


----------



## dave g

Are there any full-cover waterblocks that fit the Zotac 1080 Ti AMP Extreme, guys?

I've got one coming tomorrow and would like to water-cool it.

Cheers.


----------



## Doubletap1911

This is a silent, air-cooled GTX 1080 Ti.

Not just "quiet": it's inaudible.

I was using the Arctic Accelero Xtreme III, but when its fans started wearing out (after less than a year) and getting loud, I replaced them with a pair of the new Noctua Sterrox fans running at a steady 1400rpm.

The card is clocked up to 1900MHz and never goes above the low 60s.

The downside is it basically takes 3.5 slots.

System:
Asus Strix Z270G mATX motherboard
7700K @ 4.8GHz (1.34v)
16GB @ 3733MHz
GTX 1080 Ti @ 1.9GHz
Seasonic 600W Titanium fanless PSU
Thermaltake V21 case
Noctua 200mm PWM
Noctua NH-D15S
Noctua 140mm PWM (x2)
Noctua NF-A12x25 (x2)
Arctic Accelero Xtreme III


----------



## Abaidor

pez said:


> The Strix OC Ti I had was the best memory OC'er I had across all of my Pascal-based cards. I think it OC'ed to around 500-550MHz. BF1 was the fussiest title for me with the VRAM OC and I used it as a gauge of stability and performance with it. Essentially noticeable gains stopped after about 100-150MHz. However, that could have been that the memory truly wasn't stable at +500...though other games saw reasonable differences that you could attribute to legitimate gains.
> 
> It's one of those things that you should test across a few games you play the most and see how it fares and adjust accordingly.


Same card, same results here. Strix OC Ti with a Phanteks GPU waterblock. I have successfully tested the card up to 6200MHz but have not run extensive game benchmarks, just synthetics. Anyway, I run it at 6000MHz when I overclock, alongside a nice 2065MHz core.


----------



## Kokin

I have an EVGA 1080Ti FTW3 with the EK waterblock and see similar results for synthetics between a pure core OC and core/mem OC.

Pure Core OC, Core: 2087-2100MHz, Mem: 5500MHz (11000MHz effectively)
Core/Mem OC, Core: 2065MHz, Mem: 6000MHz (12000MHz effectively)

1.092V and +127% power target using Precision XOC as well as the "beta" BIOS released by k|ngp|n to surpass the lower power target limitation set by nVidia.

I definitely have room to go higher for both voltage and power, but I'm basically at the limit of Pascal arch unless I want to do a shunt mod.
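As a sanity check on those "effective" figures: the GPU-Z-style reported GDDR5X clock doubles to the effective data rate, and bandwidth follows from the 1080 Ti's 352-bit bus. A quick sketch (the function name is mine):

```python
def gddr5x_bandwidth_gbs(reported_mhz, bus_bits=352):
    """Reported GDDR5X clock doubles to the effective data rate
    (5500MHz -> 11Gbps); bandwidth = rate(Gbps) * bus_width / 8.
    352 bits is the 1080 Ti's memory bus."""
    effective_gbps = reported_mhz * 2 / 1000
    return effective_gbps * bus_bits / 8

print(gddr5x_bandwidth_gbs(5500))  # 484.0 GB/s at stock
print(gddr5x_bandwidth_gbs(6000))  # 528.0 GB/s at the 6000MHz OC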


----------



## Sweetwater

I've had my Strix OC since release and it is still going strong with the overclock @ 2114MHz core and 6075MHz RAM (not water-cooled). I was thinking I'd upgrade to the 2080 Ti, but as I live in Australia I'd be looking at crazy pricing nearing 2k.

I think I'll hang on to this card for a while yet, until I upgrade my core PC, as I'm still running a 4690k. We'll see how it goes running Cyberpunk 2077 🙂


----------



## ThrashZone

Hi,
Anyone that can go higher than 2050 is doing pretty good.
Mine gets wonky at 2050, depending on which monitoring utility one uses, that is.
MSI Afterburner for me, but dang, 4.5 seems weird :/

EVGA XOC has always been weird so I switched.


----------



## ClashOfClans

Doubletap1911 said:


> This is a silent, air-cooled GTX1080Ti
> 
> Not just "quiet", it's inaudible.
> 
> I'm using the Arctic AX3 but when the fans started wearing out (after less than a year) and sounding loud, I replaced them with a pair of the new Noctua "sterrox" fans running at a steady 1400rpm.
> 
> Card is clocked up to 1900Mhz and never goes above the low 60s.
> 
> Downside is it basically takes 3.5 slots
> 
> System:
> Asus Strix Z270G mATX Motherboard
> 7700k @ 4.8Ghz (1.34v)
> 16GB @ 3733Mhz
> GTX1080Ti @ 1.9Ghz
> Seasonic 600W Titanium Fanless PSU
> Thermaltake V21 case
> Noctua 200mm PWM
> Noctua DH-15s
> Noctua 140mm PWM (x2)
> Noctua NFA-12x25 (x2)
> Arctic Accelero Xtreme III


That's really good considering mine is watercooled and is mid 40s to 50C at load. You are very close to watercooling temps with that.


----------



## addicTix

I got a really bad Aorus Xtreme 1080 Ti.. bought it used from eBay with over 3 years of warranty left and the invoice, but the card is hardly stable.
When I just increase the power limit to 150% and voltage to 100% without changing anything about core clock or mem clock, the card crashes.
When I leave voltage at 0% and just set the power limit to 150%, it's barely stable.. Wildlands still crashes, which didn't happen with my 1070.
As you can tell already, it's not able to achieve 2000MHz @ 1.000V or anything near that.

God damn.. I really hope the retailer where the first buyer bought the card will exchange it.
The first buyer bought it on October 30th, 2017, so it's not even a year old.
Also, the private seller on eBay didn't mention anything about instability.
Plus, the 4-year warranty is only available when the card got registered within the first month of purchase, so I hope he did that; otherwise I'm trying to refund the purchase on eBay and send the card back to him.


----------



## kithylin

addicTix said:


> I got a really bad Aorus Xtreme 1080 Ti.. bought it used from ebay with over 3 years warranty and invoice, but the card is hardly stable.
> When I just increase the powerlimit to 150% and voltage to 100% without changing anything about core clock or mem clock, the card is crashing.
> When I set voltage to 0% just power limit to 150%, it's barely stable.. wildlands still crashing, which didn't happen with my 1070.
> As you can think already, it's not able to achieve 2000 MHz @1.000v or anything near that.
> 
> God damn.. I really hope the retailer where the first buyer bought his card will exchange it.
> First buyer bought it on october 30th, 2017, so it's not even a year old.
> Also the private seller on ebay didn't mention anything about instability.
> Plus, the 4 years warranty are only available when the card got registered witihin the first month of purchase, so I hope he did that otherwise I'm trying to refund the purchase on ebay and send the card back to him.


Are you sure your power supply is good enough to give it stable power? You want at least a tier-1 name-brand Gold or better 600W unit for a 1080 Ti.


----------



## addicTix

I have a Dark Power Pro 11 650W


----------



## 8051

serave said:


> anybody tried slapping one of these onto their card?
> 
> http://www.idcooling.com/Product/detail/id/134/name/FROSTFLOW 120VGA
> 
> Looks very appealing to me as it costs only $60, i have the Zotac AMP version which i'm pretty sure is based on the reference design.


It only dissipates up to 220W of power. I don't think that's enough for a 1080Ti if you do any kind of overclocking.
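A rough way to see why 220W is tight: dynamic power scales roughly with V²·f. This is a back-of-envelope sketch, not a measurement; the 250W / 1.05V / 1800MHz stock reference numbers are approximations I'm assuming, and real cards add static leakage on top.

```python
def scaled_power_w(p_stock_w, v_stock, v_oc, f_stock_mhz, f_oc_mhz):
    """Crude dynamic-power estimate: P scales ~ V^2 * f.
    Rule of thumb only; ignores leakage and board power limits."""
    return p_stock_w * (v_oc / v_stock) ** 2 * (f_oc_mhz / f_stock_mhz)

# Assumed reference 1080 Ti: ~250W board power around 1.05V / ~1800MHz.
# A 1.093V / 2000MHz overclock lands well past a 220W cooler's rating:
print(round(scaled_power_w(250, 1.05, 1.093, 1800, 2000)))  # ~301
```

So even a modest overclock plausibly pushes the card 30%+ past what that cooler is rated to move.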


----------



## mattliston

Should do just fine. The EVGA Hybrid coolers are older-generation 120mm AIO units, and they work just fine and dandy.


----------



## kithylin

addicTix said:


> I have a Dark Power Pro 11 650W


I found a review (if you're curious it's here; it's in Dutch and needs a translator, Chrome does it automatically: https://tweakers.net/productreview/119910/be-quiet!-dark-power-pro-11-650w.html )

That shouldn't be a problem then; it's probably something else......

So I would say something is really wrong with that card. It's probably been coin-mined and overclocked and just destroyed by now. It shouldn't be crashing like that; sounds like it's toast.

I would suggest you go after them with eBay on this. But a suggestion from me, with lots of eBay experience: do -NOT- go to the seller directly via eBay messaging. Instead, go into your eBay purchase history and open a return case straight away. You can still message each other via the return-case system (don't use direct messages; use the return system so everything is logged), and that way the seller can't say no and can't refuse the return.

If it gets to the end of the 5 days and they haven't already agreed to a return, you can escalate it and it will automatically be judged in your favor. Since you said the seller did not say anything about instability, it sounds like you legitimately have an "item not as described" case, which falls under the eBay buyer protection system. You should have up to 45 days after the listing ended to file a claim like this.

And do not accept "restocking fees", and do not accept the seller trying to get you to pay return shipping. Under buyer protection, if you wait until the 5th day and escalate, eBay will give you a free return shipping label and bill it to the seller for you, and once the tracking # shows delivered they'll credit the full value of the original purchase price back to your PayPal automatically.


----------



## verick

I just bought a water-cooled 1080 Ti on eBay from a reputable seller; however, I am having an odd issue. The card will not boost and is stuck at base clocks of 1480MHz, and the voltage is stuck at 0.79V. GPU-Z states the perfcap reason is idle. I have tried synthetic and real workloads and they all have the same issue. I have uninstalled the drivers with DDU and reinstalled several times. Same issue on my main Ryzen system and an old Intel system. Has anyone else experienced this?


----------



## ClashOfClans

verick said:


> I just bought a water cooled 1080ti on ebay from a reputable seller, however, I am having an odd issue. The card will not boost and is stuck at base clocks of 1480mhz and the voltage is stuck at .79. GPUZ states the perfcap reason is idle. I have tried synthetic and real workloads and they all have the same issue. I have uninstalled the drivers with DDU and reinstalled several times. Same issue on my main Ryzen system and and old intel system. Has anyone else experienced this?


I experience that in adaptive mode, and get even lower clocks on older games. Check Nvidia CP and see if your power management mode is on adaptive. If it's adaptive it will use lower clocks on games that are not as demanding. Maybe try Prefer Maximum Performance mode instead and see what clocks you get. I'm sure you already tried this but just a shot. I was getting weird stuff as well when I switched graphics cards. Performance didn't seem up to par, so I reinstalled the OS, latest drivers, and was much better in games. Did you run DDU with the safe mode option?


----------



## verick

ClashOfClans said:


> I experience that in adaptive mode, and get even lower clocks on older games. Check Nvidia CP and see if your power management mode is on adaptive. If it's adaptive it will use lower clocks on games that are not as demanding. Maybe try Prefer Maximum Performance mode instead and see what clocks you get. I'm sure you already tried this but just a shot. I was getting weird stuff as well when I switched graphics cards. Performance didn't seem up to par, so I reinstalled the OS, latest drivers, and was much better in games. Did you run DDU with the safe mode option?


I tried prefer maximum performance. I ran DDU in safe mode. I also tried on a fresh OS install. Another odd thing is if I try to change the power limit in Afterburner or EVGA precision the program freezes up.


----------



## Streetdragon

Never use two OC tools at the same time.
Only Afterburner, or the one from EVGA, etc.


----------



## verick

I didn't use them at the same time. I wanted to see if the behavior was the same so I uninstalled afterburner and tried the EVGA one.


----------



## addicTix

kithylin said:


> I found a review (if you're curious it's here.. dutch need a translator, chrome does it automatically, https://tweakers.net/productreview/119910/be-quiet!-dark-power-pro-11-650w.html )
> 
> That shouldn't be a problem then, it's probably something else......
> 
> So then I would say something is really wrong with that card. It's probably been coin mined and overclocked and just destroyed by now. It shouldn't be crashing like that, sounds like it's toasty.
> 
> I would suggest you go after them with ebay on this. But a suggestion from me, with lots of ebay experience: Do -NOT- go to the seller directly via ebay messaging. Instead, go in to ebay purchase history, and open a return case straight away. You can still communicate back and forth messaging each other via the return case system (don't use direct messages, use the return system to send messages so it's logged), and that way the seller can't say no and can't refuse a return if you use the ebay return system. If it gets to the end of the 5 days and they haven't already agreed to a return, you can escalate it and it will automatically judge in your favor. Since you said seller did not say anything about instability, then sounds like you legitimately have a "Item not as described" case and it would fall under ebay buyer protection system. You should have up to 45 days after the listing ended to file a claim like this. And do not accept "restocking fees" and do not accept the seller trying to get you to pay return shipping. Under buyer protection if you wait til the 5'th day and escalate, ebay will give you a free return shipping label and bill it to the seller for you, and once the tracking # shows delivered they'll credit the full value of the original purchase price back to your paypal automatically.


Is there any way to check if the card has been used for mining, like an indicator?
I mean, that could definitely be the case; the card was bought on October 30th, 2017..
Otherwise, the seller actually told me that the card still has over 3 years of warranty. I contacted the shop where he originally bought it and asked if they could exchange it due to instability (no answer yet), but if they would do that.. that'd be great.
If they say no, I will do it just like you described. I have already asked the seller over eBay private message if he could give me his account data for the shop where he bought the card, so I'd have no trouble exchanging the card, but he hasn't answered that question yet either.. so I will see.
Will I still have the "right" to open a return case after 1-2 weeks? I really want to wait for an answer from the shop; maybe I'll give them a call tomorrow too.


----------



## kithylin

addicTix said:


> Is there any way to check if the card has been used for mining, like an indicator?
> I mean, that could definitely be the case, card has been bought on october 30th 2017..
> Otherwise, the seller actually told me that the card has still over 3 years of warranty, I contacted the shop where he bought it first and asked if they could exchange it due to instability (no answer yet), but if they would do that.. that'd be great.
> If they say no, I will do it just like you described it. But I already have contacted the seller over ebay private message if he could give me his account data for the shop where he bought the card, so I have no trouble in exchanging the card or anything, but he also didn't answer on that question yet.. so I will see.
> Will I still have the "right" to open a return case after 1-2 weeks? I really want to wait for an answer from the shop, maybe I'll give them a call tomorrow too


Sadly, there's no way to know if a GPU has been used for coin mining, or overclocked. Anyone could realistically flash a different BIOS on a card, manually overclock it, run coin mining on it for months, then flash it back, sell it, and claim "Used, works great." It's especially common on high-end cards at the top of the product stack like 1080 Tis.

I don't know what else could be causing the crashing; that's very unusual for a 1080 Ti. Have you tried completely uninstalling Afterburner, EVGA Precision X, and all other overclocking tools (manually delete their folders in Program Files too), then running the card stock to see how it does? Does it still crash 100% stock?

It shouldn't cause crashing, but the only other thing I can think of: look at your power supply. If it's feeding the card from a single cable that splits into two PCIe connectors at the end (some power supplies do that), try instead running two separate power cables, one per connector, direct to the power supply. That -shouldn't- be an issue, but it might be.


----------



## addicTix

Well I just followed your suggestion about returning the item on ebay directly, I opened a return case and I hope it goes as well as you described it.
I literally paid 670€, which is way too much anyway (I knew that before, but because I sold my old GPU and didn't want to wait too long for a nice offer, I just got this one. Big mistake on my side), and for that amount of money I want a perfectly working 1080 Ti.
I hope, when I get the money back, the EVGA offer is still available, they're selling the 1080 Ti SC2 + Power Connector + PSU for 750€. Not a bad deal if I sell the PSU


----------



## kithylin

addicTix said:


> Well I just followed your suggestion about returning the item on ebay directly, I opened a return case and I hope it goes as well as you described it.
> I literally paid 670€, which is way too much anyway (I knew that before, but because I sold my old GPU and I didn't wanted to wait too long for a nice offer, I just got this one. Big mistake on my side), and for that amount of money I want a perfectly working 1080 Ti.
> I hope, when I get the money back, the EVGA offer is still available, they're selling the 1080 Ti SC2 + Power Connector + PSU for 750€. Not a bad deal if I sell the PSU


Ehhh.. big miss on my part. All of those things I mentioned above about how eBay works were based on eBay's policy for USA buyers. I didn't catch/realize you were in Germany (at least that's what overclock.net is saying for you). I don't know how Germany's buyer protection laws work or whether it's the same over there. Hope it works out swell for you.


----------



## addicTix

No problem man 
Afaik german laws in that regard are even a little tighter.
I think it shouldn't be any different from your experience actually.
But thank you for mentioning that! I hope the best.


----------



## blackhole2013

I just got a Zotac 1080 Ti AMP for 500 dollars and I was wondering if it's normal that I can't overclock the core at all without crashing, but just by putting the power limit to 120 and voltage to 100 in Afterburner my card boosts to 1987MHz by itself, and in games it hovers at 1900 to 1950.. it's like it's overclocking itself to the max...


----------



## Streetdragon

blackhole2013 said:


> I Just got a zotac 1080 ti amp for 500 dollars and I was wondering if its normal that I cant overclock the core at all or it will crash but just by putting the power limit to 120 and voltage to 100 on afterburner my card boosts to 1987 mhz by its self and playing games it hovers 1900 to 1950 .. its like its overclocking itself to the max ...


Welcome to Boost 3.0^^

You could try to set a manual voltage/clock curve in MSI Afterburner, or just let the card do the work^^
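The "flatten the curve" trick people use for Pascal can be described numerically: pick one voltage point, raise it to the target clock, and clamp everything above it so the card never requests more voltage for boost. A sketch of the idea only; Afterburner does this internally, and the point values here are invented.

```python
def flatten_curve(points, lock_mv, lock_mhz):
    """Afterburner-style curve flatten: set the point at lock_mv (and
    every higher-voltage point) to lock_mhz; lower-voltage points are
    left alone, clamped only if they somehow exceed the target.
    points: list of (millivolts, mhz) tuples, ascending by voltage."""
    out = []
    for mv, mhz in points:
        if mv >= lock_mv:
            out.append((mv, lock_mhz))
        else:
            out.append((mv, min(mhz, lock_mhz)))
    return out

stock = [(900, 1911), (1000, 1999), (1075, 2025), (1093, 2050)]
print(flatten_curve(stock, 1075, 2025))
# every point at/above 1075mV now sits at 2025MHz
```

The practical upside is the card holds one known-stable clock at one known voltage instead of bouncing between boost bins.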


----------



## cg4200

Hey, just a thought: if you cannot overclock at all, 1950 seems low for a Zotac Ti AMP.

Out of the 4 Tis I have had, only one did not OC over 2000.

I would check temps; maybe they never changed the TIM.


----------



## kithylin

cg4200 said:


> hey just a thought if you can not overclock at all 1950 seems low for zotac ti amp.
> 
> out of 4 ti's I have had only one did not oc over 2000.
> 
> I would check temp maybe they never changed the tim.


My EVGA 1080 Ti SC Black Edition card wouldn't do anything over 1830-1850MHz stock with the stock air cooler no matter what I did to it.. it was running hot, in the 80s C, and throttling. I was under the impression that's typical for most air-cooled 1080 Tis, and that it's only the big exotic ones with 3 fans that could do 1950-1980MHz stable, and no 1080 Tis can do 2GHz on air, supposedly.

I may be wrong though? That's just what I gathered from researching benchmarks online from around when they were new, plus my one experience with my one card, so maybe that's not typical.


----------



## SauronTheGreat

Guys, I have a simple question. We have all heard over the years that when Nvidia releases a new GPU generation (in our case Turing), they create new drivers which tank the performance of old-gen cards so people upgrade. How true is this claim? I ask because I bought the new Shadow of the Tomb Raider and want to play it with the latest 399.24 drivers, which were released for the RTX cards...


----------



## criminal

SauronTheGreat said:


> guys i have a simple question, we all have heard over the years that nvidia when it releases the new gpu gen in our case Turing, they create new drivers which tank the performance of old gen cards...so people upgrade ... how to true is this statement ? because i bought the new Shadow of the tomb raider game and want to play it with latest drivers 399.24 which are released for the RTX cards...


I don't believe that such a thing has ever been proven.


----------



## ThrashZone

blackhole2013 said:


> I Just got a zotac 1080 ti amp for 500 dollars and I was wondering if its normal that I cant overclock the core at all or it will crash but just by putting the power limit to 120 and voltage to 100 on afterburner my card boosts to 1987 mhz by its self and playing games it hovers 1900 to 1950 .. its like its overclocking itself to the max ...


Hi,
Zotac AMP cards are already about maxed out.
Possibly an AMP Extreme would do more, but their silicon lottery seems more miss than hit.
I know a few guys that have them:
only one ruled the day, two others were duds, and the best one actually died.


----------



## GraphicsWhore

SauronTheGreat said:


> guys i have a simple question, we all have heard over the years that nvidia when it releases the new gpu gen in our case Turing, they create new drivers which tank the performance of old gen cards


We have?


----------



## kithylin

GraphicsWhore said:


> We have?


Yes, it's been speculated many, many times over the years, and Linus even did a video on it at one point, testing something like 100 different drivers over time, and pretty much proved it was false. There were certain drivers that would tank performance versus the drivers that came before, but when that happened it affected all cards supported by that driver, not any specific generation or family; the bad drivers were generally bad for all cards at the time. And then later they would release better drivers that were faster (for all supported cards, new and old), faster than the bad drivers and faster than the older ones, and then they would continue to improve over time.

It's AMD that was actually caught degrading GPU performance, with the HD 4000 series when the HD 5000 and HD 6000 came out. They tanked performance for older cards by as much as 60% with newer drivers, and it was documented. But once they were caught, AMD never did it again, and now they improve the performance of all cards with newer drivers.


----------



## cg4200

So, for the overclocking on the Zotac AMP: I did not see whether you turned the voltage all the way up.
Definitely look up how best to OC Pascal. Tip: Afterburner works best for me; Ctrl+F opens your own curve. I would start with 2025 at 1.075V, flatlined, with 120 power and no extra voltage. See if you pass, then try 1.050V; it takes a while to fine-tune. And if the card is used, it may still have the original TIM, yuck.

I always thought I was unlucky with my 1080 Tis: Gigabyte FE 2050, EVGA FE 2025, EVGA SC2 2100 (died), EVGA SC RMA 1950, which just died 2 weeks ago. EVGA is great though; I will have my new RMA on 9/20, hope it's good.
Just bought a used Titan Xp Star Wars edition on eBay, could not pass it up at 770.00; just got it, still testing. Prices really dropped on 1080 Tis; I was thinking about selling my replacement 1080 Ti. Still not sure how the 2080 will do on average across all games?
Crazy that Nvidia is hawking over all the reviewers (suggesting what games and settings to use); hope some real leaks come out.


----------



## Sweetwater

kithylin said:


> It's only the big exotic ones with 3 fans that could do 1950-1980 mhz stable and no 1080 Ti's can do 2ghz on air, supposedly.


Lots can do over 2000 on air it's just random, I was lucky to score a 2114mhz on air card and my mate got a 2088mhz air card. Not that fps change much from 1950 to 2000+ anyway lol. However 1800 odd mhz is really unlucky I don't see many report that low of clocks 😞
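The point about fps barely changing checks out arithmetically: the core clock ratio is a hard ceiling on core-bound fps gains, and real games rarely reach even that. A back-of-envelope sketch (the function name is mine):

```python
def max_fps_gain_pct(old_mhz, new_mhz):
    """Best-case fps gain from a core clock bump, assuming the game is
    100% core-bound (it never is, so real gains are smaller still)."""
    return (new_mhz / old_mhz - 1) * 100

# Going from a "mediocre" 1950MHz card to a golden 2114MHz sample:
print(round(max_fps_gain_pct(1950, 2114), 1))  # ~8.4% ceiling
```

So the difference between an average and a lottery-winning chip is a single-digit-percent ceiling, which is why it barely shows in games.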


----------



## kithylin

Sweetwater said:


> Lots can do over 2000 on air it's just random, I was lucky to score a 2114mhz on air card and my mate got a 2088mhz air card. Not that fps change much from 1950 to 2000+ anyway lol. However 1800 odd mhz is really unlucky I don't see many report that low of clocks 😞


Yep.. sucks to be me.. I had to go all the way to a custom water loop, the XOC unlimited BIOS, and 1.200V to get this thing to do 2100MHz. But it will do it.. now I just have to struggle to keep it cool in the summer. Have plans to build a much larger 4-radiator water loop later.


----------



## ZealotKi11er

I am benchmarking SoTR with my 1080 Ti and I found something strange. First one is with latest driver and second one is with older driver. What happened to the CPU?


----------



## kithylin

ZealotKi11er said:


> I am benchmarking SoTR with my 1080 Ti and I found something strange. First one is with latest driver and second one is with older driver. What happened to the CPU?



I'm not sure what you're going on about? If you're referring to CPU render results.. it's -17 FPS with the newer driver? Is that the issue? But your gpu results appear to be fine either way.


----------



## ZealotKi11er

kithylin said:


> I'm not sure what you're going on about? If you're referring to CPU render results.. it's -17 FPS with the newer driver? Is that the issue? But your gpu results appear to be fine either way.


Yeah, that is the issue. I test at 4K, and that is always GPU-bound. I am just trying to figure out what Nvidia did with the driver to "enhance" performance.


----------



## rluker5

ZealotKi11er said:


> I am benchmarking SoTR with my 1080 Ti and I found something strange. First one is with latest driver and second one is with older driver. What happened to the CPU?


Does Afterburner agree with the new driver's CPU-use results (at least with your CPU showing heavy use where it is taking a while on the graph)? Because the new-driver one looks totally fake. A CPU's performance won't mirror a GPU's.

It looks like they just added some of the GPU's time to the CPU for PR purposes.


----------



## jura11

kithylin said:


> My EVGA 1080 Ti SC Black Edition card wouldn't do anything over 1830 - 1850 mhz stock with the stock air cooler no matter what I did to it.. was running hot like 80's C and throttling. I was under the impression that's typical for most air cooled 1080 Ti's, and it's only the big exotic ones with 3 fans that could do 1950-1980 mhz stable and no 1080 Ti's can do 2ghz on air, supposedly.
> 
> I may be wrong though? That's just what I gathered from research on benchmarks online from around when they were new though and my 1 experience with my 1 card, so maybe I'm wrong and that's not typical.



Hi there 

Looks like you lost the silicon lottery there. I have tried a few air-cooled Tis and most of them have been able to do above 2.0GHz on air; under water I have mixed experience as well.

Recently I put my old Morpheus II cooler on an EVGA GTX 1080 Ti SC2 and a Black Edition. I wanted to see what temperatures I could achieve, as a friend's EVGA Hybrid AIO on an SC2 had failed. Max OC I could achieve on the SC2 was 2088MHz, and on the Black Edition I couldn't pass 2050MHz; temperatures were high 40s to mid 50s on these cards...

Now for customers' water-cooled GPUs: Gigabyte GTX 1080 Ti Aorus Extreme, max OC 2088MHz; MSI Gaming X 1080 Ti, that card wouldn't OC as much, max OC 2050MHz.

Hope this helps 

Thanks, Jura


----------



## ZealotKi11er

rluker5 said:


> Does Afterburner agree with the new driver results with the cpu use (At least with your cpu having heavy use where it is taking a while on the graph)? Because the new driver one looks totally fake. A cpu's performance won't mirror a gpu's.
> 
> It looks like they just added some of the gpu's time to the cpu for pr purposes.


Going to test it with lower settings where I am CPU bound and see.


----------



## kithylin

jura11 said:


> Hi there
> 
> Looks like you are lost silicone lottery there, I have tried few air cooled Ti and most of them has been able to do above 2.0GHz on air and under water I have mixed experience as well
> 
> Recently I put my old Morpheus II cooler on EVGA GTX1080Ti SC2 and Black Edition,I wanted to see what temperatures I can achieve as on friend SC2 EVGA Hybrid AIO failed, max OC on SC2 I could achieve has been 2088MHz and on Black Edition I couldn't pass 2050MHz,temperatures I could achieve high 40's to mid 50's on these cards...
> 
> Now customers water cooled GPUs,Gigabyte GTX1080Ti Aorus Extreme max OC 2088MHz, MSI Gaming X 1080Ti that card wouldn't OC as much, max OC 2050MHz
> 
> Hope this helps
> 
> Thanks, Jura


Yeah, I posted about that already. I was able to get it stable at 2136MHz, but only after I converted it to a full-cover EK block and a custom water loop, flashed the XOC unlimited-power BIOS, and ran it at 1.200V. Definitely lost the silicon lotto. It works though, and it will do 2136MHz, which is what I'm happy about. And even with all of this it still only maxes out at 350 watts, if I can ever find a game to load it at 100%. I'm not complaining much. It could be nicer, do the 2.1GHz with less voltage, and run cooler, but it does do it, so that's fine.

I just found out that running it on a single 3x120mm rad means some games can take it up to 62-63C even with water cooling, and that's right about where the driver crashes. I tried adding another radiator to the loop and it runs 58-60C and never crashes @ 2136MHz, so it's sensitive to temps. I just have to build a bigger loop for it with more radiators later, and clock it down to 2088MHz for now.


----------



## addicTix

kithylin said:


> Ehhh.. big thing on my part. All of those things I mentioned on how ebay works above was based on ebay's policy for USA buyers. I didn't catch/realize you were in germany (At least that's what overclock.net is saying for you). I don't know how germany's buyer protection laws work and if it works the same over there. Hope it works out swell for you.


Actually, I have another question. I opened the return case on September 12th, and on September 22nd I can contact eBay to help me with the return if the seller hasn't reacted by then, or if we haven't found a solution.
As of today, the seller hasn't answered.
If he still hasn't answered on September 22nd, what happens when I contact eBay? Do I get my money back immediately, or what should I expect?


----------



## mattliston

Only eBay has that answer directly. Most likely it is settled on a case-by-case basis, rather than by an overall policy.


Plead your case correctly and politely, and you should see a good result


----------



## porschedrifter

For all of you people claiming over 2000 on core: how many of you have artifact-scanned with OCCT or FurMark ROG Edition x64? I'm willing to bet it's not technically stable and a scanner will detect artifacts very quickly; at least for me it wasn't.
I can get my MSI 1080 Ti Duke OC up to 2000+ core and it'll game, but if I run an artifact scanner, it definitely doesn't pass.


With voltage at 100, the highest stable OC I can get on core while passing artifact scanners is +45 core and +640 mem.


----------



## 2ndLastJedi

jura11 said:


> kithylin said:
> 
> 
> 
> My EVGA 1080 Ti SC Black Edition card wouldn't do anything over 1830 - 1850 mhz stock with the stock air cooler no matter what I did to it.. was running hot like 80's C and throttling. I was under the impression that's typical for most air cooled 1080 Ti's, and it's only the big exotic ones with 3 fans that could do 1950-1980 mhz stable and no 1080 Ti's can do 2ghz on air, supposedly.
> 
> I may be wrong though? That's just what I gathered from research on benchmarks online from around when they were new though and my 1 experience with my 1 card, so maybe I'm wrong and that's not typical.
> 
> 
> 
> 
> Hi there
> 
> Looks like you are lost silicone lottery there, I have tried few air cooled Ti and most of them has been able to do above 2.0GHz on air and under water I have mixed experience as well
> 
> Recently I put my old Morpheus II cooler on EVGA GTX1080Ti SC2 and Black Edition,I wanted to see what temperatures I can achieve as on friend SC2 EVGA Hybrid AIO failed, max OC on SC2 I could achieve has been 2088MHz and on Black Edition I couldn't pass 2050MHz,temperatures I could achieve high 40's to mid 50's on these cards...
> 
> Now customers water cooled GPUs,Gigabyte GTX1080Ti Aorus Extreme max OC 2088MHz, MSI Gaming X 1080Ti that card wouldn't OC as much, max OC 2050MHz
> 
> Hope this helps
> 
> Thanks, Jura

I have a Galax GTX 1080 Ti EXOC, a dual-slot twin-fan cooler, that'll do 2088/6008 at stock settings but hits 74C with a custom fan curve in 3DMark, so I run it at 0.900V @ 1911MHz and hit 64C with the same custom fan curve.


----------



## Zfast4y0u

porschedrifter said:


> For all of you claiming over 2000 on the core, how many of you have artifact-scanned with OCCT or FurMark ROG Edition x64? I'm willing to bet it's not technically stable; at least for me, the scanner detected artifacts very quickly.
> I can get my MSI 1080 Ti Duke OC up to 2000+ core and it'll game, but it definitely doesn't pass an artifact scan.
> 
> 
> With voltage at 100%, the highest stable OC I can get on the core while passing artifact scanners is +45 core and +640 mem.


What does the core have to do with artifacts? If you push your memory clocks too high you start seeing artifacts; unstable core clocks give you spikes, freezes, and, in the worst case when you're really unstable, crashes. I just fired up that power virus: I have no artifacts, but I also couldn't maintain 2140 MHz because of the power limit I'm hitting (over 120%), which drops my clocks to around ~2090 MHz. If you're stable in all the games you play, that's all you need to call it stable. I couldn't care less about some synthetic torture-test benchmark as the valid sign of "stability" while my VRMs are burning under unrealistic loads.

Also, you're having problems going over 2000 MHz because you use that slider (in MSI Afterburner, which I assume), which shoots you past stable clocks and causes instant crashes most of the time. I can't imagine any 1080 Ti not hitting over 2000 MHz if it's cooled properly and everything.


----------



## kithylin

addicTix said:


> Actually I have another question. I opened the return case on September 12th, and on September 22nd I can contact eBay to help me with the return if the seller hasn't reacted by then or if we haven't found a solution.
> As of today, the seller hasn't answered.
> If he still hasn't answered on September 22nd, what happens when I contact eBay? Do I get my money back immediately, or what should I expect?


In the USA it works like this: you open a return case on an item. The seller does not respond at all: no messages, no offers, complete silence. On the fifth day you 'contact eBay to help', and within 1-60 minutes the system automatically judges in favor of the buyer and authorizes a return. The eBay return system gives us a pre-printed USPS return label. Print it out, affix it to the box, put the item in the box, drop it in the post, and wait. When the item's tracking shows "DELIVERED", the returns system automatically credits your PayPal account the full original purchase price, including any original shipping, and closes the return case. Usually this happens within the first hour of it showing delivered, but it may sometimes take up to another 24 hours. Once you reach the fifth day, ask eBay to step in, and get the shipping label, the seller cannot stop the process: he can't deny the delivery, can't stop eBay, and can't do anything. Once you have the return shipping label you're guaranteed to get your money back automatically once it's delivered.

Again, I don't know how it works for you there but that's how it works here.


----------



## Jokesterwild

porschedrifter said:


> For all of you claiming over 2000 on the core, how many of you have artifact-scanned with OCCT or FurMark ROG Edition x64? I'm willing to bet it's not technically stable; at least for me, the scanner detected artifacts very quickly.
> I can get my MSI 1080 Ti Duke OC up to 2000+ core and it'll game, but it definitely doesn't pass an artifact scan.
> 
> 
> With voltage at 100%, the highest stable OC I can get on the core while passing artifact scanners is +45 core and +640 mem.



You are getting artifacts from your memory overclock. The core clock doesn't create artifacts; when unstable it just crashes the display driver, etc. My Gigabyte card runs between 2000-2050 just by turning the max power limit up to 130. Some chips are simply better, and I'd presume the high-end cards from each board partner are binned to some extent. Unfortunately your Duke is not MSI's top-tier model, so it's a real flip of the coin whether you get one that clocks higher.


----------



## chakku

Make sure you check performance numbers before updating drivers now. With Turing's disappointing performance it wouldn't be surprising to see NVIDIA quietly slowing down Pascal and older generations to make it look better.

(Likely margin of error, only time will tell)


----------



## porschedrifter

Zfast4y0u said:


> What does the core have to do with artifacts? If you push your memory clocks too high you start seeing artifacts; unstable core clocks give you spikes, freezes, and, in the worst case when you're really unstable, crashes. I just fired up that power virus: I have no artifacts, but I also couldn't maintain 2140 MHz because of the power limit I'm hitting (over 120%), which drops my clocks to around ~2090 MHz. If you're stable in all the games you play, that's all you need to call it stable. I couldn't care less about some synthetic torture-test benchmark as the valid sign of "stability" while my VRMs are burning under unrealistic loads.
> 
> Also, you're having problems going over 2000 MHz because you use that slider (in MSI Afterburner, which I assume), which shoots you past stable clocks and causes instant crashes most of the time. I can't imagine any 1080 Ti not hitting over 2000 MHz if it's cooled properly and everything.





Jokesterwild said:


> You are getting artifacts from your memory overclock. Core clock doesn't create artifacts. Core clock when unstable will just crash display driver etc. My gigabyte card runs between 2000-2050 by just turning the max power limit to 130. Some chips are just better. I would presume the high end cards from each board builder are binned to a certain extent. Unfortunately your duke is not MSI top tier model. So it's a real flip of the coin if you get one that clocks higher.



Not getting artifacts. I was specifically referring to the artifact scanner. Artifact scanners only work for core overclocking, not mem, so my memory is stable. The scanner supposedly lets you find the most accurate stable core OC without crashing the system, because it picks up errors well before a crash happens. That's the beauty of it. :thumb:


So you may be gaming stable, but if you are getting artifacts in the scanner, you really aren't 100% stable.


----------



## mattliston

Are we really claiming NVIDIA driver slowdowns based on LESS than ONE FPS of difference?


Really?


----------



## mouacyk

Duh... you hide it within the margin of error over many driver releases.
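A toy calculation shows why that would work: per-release regressions that each sit inside run-to-run benchmark noise still compound across many releases. All numbers below are made up purely for illustration:

```python
# Toy illustration: a slowdown hidden inside benchmark noise on each
# individual driver release still compounds over many releases.

def compounded_fps(start_fps, per_release_loss, releases):
    """FPS after `releases` drivers, each costing `per_release_loss` fraction."""
    fps = start_fps
    for _ in range(releases):
        fps *= (1.0 - per_release_loss)
    return fps

fps = compounded_fps(100.0, 0.005, 12)  # 0.5% per release, 12 releases
print(round(fps, 1))  # ~94.2 fps
```

Each 0.5% step is invisible in a single review's margin of error; the cumulative ~6% after a year of releases is not.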


----------



## Zfast4y0u

porschedrifter said:


> Not getting artifacts. I was specifically referring to the artifact scanner. The artifact scanners only work for core overclocking, not mem. So my memory is stable, but if you use artifact scanners it allows you to supposedly get the most accurate stable OC without crashing the system. It picks up errors way before that happens. That's the beauty of it. :thumb:
> 
> 
> So you may be gaming stable, but if you are getting artifacts on the scanner, you really aren't 100% stable.



Again, you're pushing your story that artifact => core instability...


ASUS's version of FurMark, with its so-called artifact scanner feature, scans for artifacts when you push your memory too high, to find the limit before you even notice it yourself. For the test's sake, if you don't trust me: leave the memory clocks at default and overclock the core only. You might get crashes and whatnot, but zero artifacts detected. I just put my card on default clocks, overclocked the memory to +850 MHz, and only then did I start getting "artifacts" detected in that program. I could probably go even higher before I actually start seeing artifacts myself.

By the way, have fun pulling this much out of your card in any game you throw at it:

http://prntscr.com/kwge3m

FurMark....


It's a cute-looking program and all, but let's be real here, nobody uses FurMark as a stability test and its validation.



@ others,


Is +850 on memory good or bad? I actually never bothered overclocking it, but I think this isn't bad, no? Let me know your results.


----------



## 2ndLastJedi

mouacyk said:


> Duh... you hide it within the margin of error over many driver releases.


That's a worry! I'm still on 390.65.


----------



## porschedrifter

Zfast4y0u said:


> Again, you're pushing your story that artifact => core instability...
> 
> 
> ASUS's version of FurMark, with its so-called artifact scanner feature, scans for artifacts when you push your memory too high, to find the limit before you even notice it yourself. For the test's sake, if you don't trust me: leave the memory clocks at default and overclock the core only. You might get crashes and whatnot, but zero artifacts detected. I just put my card on default clocks, overclocked the memory to +850 MHz, and only then did I start getting "artifacts" detected in that program. I could probably go even higher before I actually start seeing artifacts myself.
> 
> By the way, have fun pulling this much out of your card in any game you throw at it:
> 
> http://prntscr.com/kwge3m
> 
> FurMark....
> 
> 
> It's a cute-looking program and all, but let's be real here, nobody uses FurMark as a stability test and its validation.
> 
> 
> 
> @ others,
> 
> 
> Is +850 on memory good or bad? I actually never bothered overclocking it, but I think this isn't bad, no? Let me know your results.


Here's where you're wrong: it's not the same as memory artifacts. It actually is *not* useful for memory overclocking at all. Core clocks and memory clocks are symbiotic; they balance each other, and changing one may mean the other no longer runs at the same clock as before. Again I repeat, you do not use this tool to check memory stability. If you clock your memory to an insane amount like that, of course it's going to show errors elsewhere, lmao. This isn't new tech; artifact scanning has been around since ATITool.


It's actually a very effective way to clock the core and make sure it's rock-solid stable, and it takes very little time to turn up an artifact. You can also play around with the rendering resolution to reduce load, and it's still effective at detecting core errors or artifacts.


But I repeat: artifact scanners are not a memory overclocking tool, only core. Especially with our GDDR5X memory it's not an effective way to test memory stability; error correction kicks in and slows things down before anything crashes, and it's not optimal to wait until you see artifacts in-game!


My card games just fine at 2000 core, but it's not stable in the artifact scanner.


No one is knocking you for being game stable, relax. But if you are getting artifacts in the scanner then you're not 100% stable. It's ok, the world will not end.


----------



## mattliston

I'm wary of artifact scanner apps, because in the past I've seen them sometimes claim my default clocks were spitting errors.


I suggest using artifact apps as a secondary test rather than a direct affirmative or negative.


----------



## porschedrifter

mattliston said:


> I'm wary of artifact scanner apps, because in the past I've seen them sometimes claim my default clocks were spitting errors.
> 
> 
> I suggest using artifact apps as a secondary test rather than a direct affirmative or negative.



Yeah, I've seen that as well. The new ones like ROG FurMark are pretty good, though that one seems less sensitive than OCCT's latest version.
On my end both work pretty well.


----------



## bmgjet

Don't use artifact scanners on any card from Maxwell onwards, since they have error correction built in: you will start losing performance long before there are any artifacts.

The best way is to get something like Heaven or Valley, pause it on a high-load scene, and start feeding some MHz into the core and memory.
Take one step backwards once you see the fps start to dip.
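That recipe is essentially a one-dimensional search. Here's a minimal sketch of it in Python, under the assumption that you can script an average-fps measurement at each offset; the `measure_fps` callback is a stand-in for a paused Heaven/Valley scene plus an fps readout, and the 13 MHz step matches Pascal's clock granularity:

```python
# Sketch of the "raise clocks until fps dips, then step back" method.
# `measure_fps(offset)` is a hypothetical callback you supply: it must return
# the average fps of a paused high-load scene at the given clock offset.

def find_max_offset(measure_fps, step=13, start=0, limit=300):
    """Raise the clock offset until fps stops improving, then step back once.

    On Maxwell/Pascal, error correction makes fps dip before artifacts
    appear, so the first dip marks the wall.
    """
    best_offset, best_fps = start, measure_fps(start)
    offset = start + step
    while offset <= limit:
        fps = measure_fps(offset)
        if fps <= best_fps:          # fps dipped: error correction kicked in
            return best_offset       # take one step backwards
        best_offset, best_fps = offset, fps
        offset += step
    return best_offset
```

With a fake card whose fps improves up to +130 and then tanks, the search returns 130; in practice you would run each measurement for long enough to get past run-to-run noise.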


----------



## 2ndLastJedi

bmgjet said:


> Don't use artifact scanners on any card from Maxwell onwards, since they have error correction built in: you will start losing performance long before there are any artifacts.
> 
> The best way is to get something like Heaven or Valley, pause it on a high-load scene, and start feeding some MHz into the core and memory.
> Take one step backwards once you see the fps start to dip.


That is really interesting! Have you made a video of this method of overclocking?
What steps in MHz do you take?
Does it crash like this, or just lose frames?


----------



## Streetdragon

bmgjet said:


> Don't use artifact scanners on any card from Maxwell onwards, since they have error correction built in: you will start losing performance long before there are any artifacts.
> 
> The best way is to get something like Heaven or Valley, pause it on a high-load scene, and start feeding some MHz into the core and memory.
> Take one step backwards once you see the fps start to dip.


That's how I did my overclock:
push the core as high as it gets; if the core goes unstable, the driver will crash, then back off 13 MHz.

With the RAM it's like you wrote: the fps goes up with the clock until it reaches its max stable speed, after which the fps drops with even higher memory clocks.

Just start Heaven or similar in windowed mode, enter the "free" walk, don't move the benchmark, and just play with the clocks.


----------



## porschedrifter

bmgjet said:


> Don't use artifact scanners on any card from Maxwell onwards, since they have error correction built in: you will start losing performance long before there are any artifacts.
> 
> The best way is to get something like Heaven or Valley, pause it on a high-load scene, and start feeding some MHz into the core and memory.
> Take one step backwards once you see the fps start to dip.



Only the memory has error correction; you can still use them to find max core clocks just fine.


----------



## shilka

Thinking about buying a GTX 1080 Ti, and the MSI Gaming X Trio is the cheapest card where I live that does not have a blower-style cooler.
Anyone have any experience with this card, and what do you think? I've never owned a card from MSI before.

How is the MSI software you need to control the lights and make your own fan profiles?
I have a GTX 1080 FTW2 from EVGA and their software is god-awful.


----------



## anticommon

Hi everyone, I have a 1080 Ti SC Hydro Copper (bought from a user here for a great price!) and have a few mods I am considering. I would appreciate any help answering these questions, as my google-fu is apparently inadequate lately... but I digress.

The mods I want to do are as follows:
1. RGB 'mod': essentially I want to unplug the white LED cables and run a small RGB strip between the motherboard and GPU to shine through the acrylic. I know it is mostly covered, but in theory it should work, right? My biggest concern is the GTX logo on the side, which might not be the same piece of acrylic and might end up either not lighting up or staying white. If anyone knows of a workaround I'm all ears!
2. I am thinking of replacing the stock thermal compound with Conductonaut. I have liquid electrical tape; should I use it over the components around the die? Will it thermally insulate them and increase temps? Would pads be better for thermals/protection?
3. I am looking at Fujipoly pads, which are sold as 60x50x1mm sheets (3000mm²). Will one sheet be enough to replace all the existing thermal pads? Is 1mm thickness appropriate? Does it matter much if I get the 11W/mK or the 17W/mK pads? (One is $15 per 50x60x1mm sheet, the other $27.) I was thinking of placing some Kryonaut between components and pad for better contact.
4. Should I place some thermal pads between the backplate and PCB?

The reason I'm looking at pad/TIM replacement is that my GPU sits at around 46°C when I should have headroom to take it lower at my current 2050/6000 OC, what with having two 360mm radiators and all. The cost is mostly my time and the thermal pads, as I already have everything else.


Edit: I am also considering a soldered shunt mod for an increased power limit but am unsure what method is best... If anyone has experience doing so on this card, please let me know. Thanks!
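Question 3 is mostly area bookkeeping. A quick sanity check, where the component footprints are hypothetical placeholders rather than measurements of the Hydro Copper's actual pads (measure your own card before ordering):

```python
# Quick area check for cutting thermal pads from a 60 x 50 mm sheet.
# The component footprints below are made-up placeholders, not Hydro Copper
# measurements; substitute the sizes of your card's actual pads.

SHEET_MM2 = 60 * 50  # one Fujipoly sheet = 3000 mm^2

pads_needed = {
    "vram (11 chips, ~14x12 mm each)": 11 * 14 * 12,
    "vrm row (~60x12 mm strip)": 60 * 12,
}

total = sum(pads_needed.values())
print(total, "mm^2 needed vs", SHEET_MM2, "mm^2 per sheet")
print("sheets:", -(-total // SHEET_MM2))  # ceiling division
```

With these placeholder sizes one sheet covers everything with margin; thickness still has to match the originals, since area math says nothing about mounting pressure.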


----------



## kithylin

anticommon said:


> Hi everyone, I have a 1080 Ti SC Hydro Copper (bought from a user here for a great price!) and have a few mods I am considering. I would appreciate any help answering these questions, as my google-fu is apparently inadequate lately... but I digress.
> 
> The mods I want to do are as follows:
> 1. RGB 'mod': essentially I want to unplug the white LED cables and run a small RGB strip between the motherboard and GPU to shine through the acrylic. I know it is mostly covered, but in theory it should work, right? My biggest concern is the GTX logo on the side, which might not be the same piece of acrylic and might end up either not lighting up or staying white. If anyone knows of a workaround I'm all ears!
> 2. I am thinking of replacing the stock thermal compound with Conductonaut. I have liquid electrical tape; should I use it over the components around the die? Will it thermally insulate them and increase temps? Would pads be better for thermals/protection?
> 3. I am looking at Fujipoly pads, which are sold as 60x50x1mm sheets (3000mm²). Will one sheet be enough to replace all the existing thermal pads? Is 1mm thickness appropriate? Does it matter much if I get the 11W/mK or the 17W/mK pads? (One is $15 per 50x60x1mm sheet, the other $27.) I was thinking of placing some Kryonaut between components and pad for better contact.
> 4. Should I place some thermal pads between the backplate and PCB?
> 
> The reason I'm looking at pad/TIM replacement is that my GPU sits at around 46°C when I should have headroom to take it lower at my current 2050/6000 OC, what with having two 360mm radiators and all. The cost is mostly my time and the thermal pads, as I already have everything else.
> 
> 
> Edit: I am also considering a soldered shunt mod for an increased power limit but am unsure what method is best... If anyone has experience doing so on this card, please let me know. Thanks!


You should be able to activate the full factory replacement warranty on your card (as long as the previous owner hasn't modified it) via the EVGA website, even though you bought it used / second hand. Any of your mods will completely violate that and render it void, so if it were me I would advise not doing anything with it until the warranty period runs out. 46°C is fantastic for an OC'd 1080 Ti. Mine runs around 61-63°C @ 2100 core / 6210 RAM, but I only have a single 360mm rad and my overclocked 5820K is in the same loop. I'm planning a much larger loop with 4 radiators for it later next year though.


----------



## anticommon

kithylin said:


> You should be able to activate the full factory replacement warranty on your card (as long as the previous owner hasn't modified it) via the EVGA website, even though you bought it used / second hand. Any of your mods will completely violate that and render it void, so if it were me I would advise not doing anything with it until the warranty period runs out. 46°C is fantastic for an OC'd 1080 Ti. Mine runs around 61-63°C @ 2100 core / 6210 RAM, but I only have a single 360mm rad and my overclocked 5820K is in the same loop. I'm planning a much larger loop with 4 radiators for it later next year though.


I've never really been one for warranties, and apart from the shunt mod, how would they even know I replaced the pads if I keep the originals for warranty servicing?


----------



## addicTix

Does anyone know how loud the fan of the Aorus Xtreme Waterforce is?
I can get one, and yeah I know most people don't like AiO watercooling because it's not "real" watercooling also it's louder etc., but just compared to my current Xtreme with 3 fans - how loud is the 120mm fan of the AiO?
I didn't read much about it, only like 1-2 threads where people complain that the fan is always running @33% or something like that, no matter if the card is idling or if it's under load.
My Xtreme Edition runs the fans with ~30% while idling (I don't use fan stop), and while gaming it goes up to 75% and I don't find it annoying at all.
To be honest, my pc isn't really silent anyway. It's definitely not loud, well maybe for people with almost completely silent pc's, but I don't mind it.
I remember my GTX 680 ref design, that thing was freaking loud while gaming.


----------



## SharpShoot3r07

I either just got scammed or found the deal of all deals. Just bought EVGA 1080Ti off ebay for $105. The item was described as brand new and ebay should refund me if I do get scammed. The seller had legit feedback/ratings so I don't know what to think.


----------



## JedixJarf

SharpShoot3r07 said:


> I either just got scammed or found the deal of all deals. Just bought EVGA 1080Ti off ebay for $105. The item was described as brand new and ebay should refund me if I do get scammed. The seller had legit feedback/ratings so I don't know what to think.


Probably a hacked acct.


----------



## killuchen

Was it worth adding $100 on top of my 1080 FTW for a 1080 Ti FTW? Both are EVGA cards. I game at 1440p.


----------



## GreedyMuffin

killuchen said:


> Was it worth adding $100 on top of my 1080 FTW for a 1080 Ti FTW? Both are EVGA cards. I game at 1440p.


I'd do it in a heartbeat personally.


----------



## killuchen

GreedyMuffin said:


> I'd do it in a heartbeat personally.



haha I figured. I jumped on the deal yesterday.


----------



## King4x4

I may be joining you guys soonish :>


----------



## ZealotKi11er

SharpShoot3r07 said:


> I either just got scammed or found the deal of all deals. Just bought EVGA 1080Ti off ebay for $105. The item was described as brand new and ebay should refund me if I do get scammed. The seller had legit feedback/ratings so I don't know what to think.


I do not even bother with stuff like that. It's never going to happen.


----------



## avp2007

SharpShoot3r07 said:


> I either just got scammed or found the deal of all deals. Just bought EVGA 1080Ti off ebay for $105. The item was described as brand new and ebay should refund me if I do get scammed. The seller had legit feedback/ratings so I don't know what to think.


I've always wondered who actually hits the buy button on these; like the other member said, probably a hacked account.


----------



## 2002sn95gt

It's been a while since I've been on the forum,

but now I'm back into water cooling with a used Gigabyte Aorus WaterForce 1080 Ti.
[Imgur](https://i.imgur.com/qm5KJ8f.jpg)

I decided to get a 1080 Ti because my thinking is the performance will be about the same as a 2080 Founders card, and ray tracing at over 60 fps most likely isn't there yet; I don't want to pay for two 2080s or two 2080 Tis in SLI right now.
I think when Cyberpunk 2077 releases next year I could run a second 1080 Ti in SLI if I need to, or if NVIDIA ever releases the big-format 65-inch 4K monitor with 120 Hz refresh and HDR.

So far I'm surprised how much cooler the temps are for Pascal vs Maxwell.
I used to have two Maxwell Titans in SLI with EK water blocks; I flashed the BIOS to 1.3 V and temps ran about 45°C at full load (The Witcher 3 ran 55 to 60 fps at 4K ultra). I sold my cards before the crypto boom and ran a single 1070 for over a year.
I was pretty disappointed with the locked BIOS on Pascal cards; I gave up looking for a successful custom BIOS back when I had an MSI 1070 that could do 2100 core on the stock BIOS at 1.093 V.

So far, with the 1080 Ti Gigabyte WaterForce full-cover block and my big overkill radiator, I'm getting 24-25°C idle and 30-33°C full-load temps at 2063 core and 1.043 V. I'm sure I need to do some more looking, but is there a proper modded BIOS for my card? I remember something about the Asus ROG Strix card having a better stock BIOS with more voltage, but if it's a different, non-reference PCB I won't be able to flash it directly.


Current specs for my PC:
Intel X99 with a 5820K overclocked to 4.4 GHz
GIGABYTE AORUS Xtreme WaterForce GTX 1080 Ti
Alphacool full-copper radiator, 560mm x 45mm (a bit overkill, but I got it when I had the two Titans)
Water pump is a cheap eBay SC600

[Imgur](https://i.imgur.com/PwuDNkT.jpg)

Edit: the BIOS version currently on my card is 86.02.39.00.DC and the memory type is GDDR5X (Micron).
3DMark Time Spy score at 1440p: https://www.3dmark.com/3dm/28825843


----------



## Kronos8

2002sn95gt said:


> It's been a while since I've been on the forum,
> 
> but now I'm back into water cooling with a used Gigabyte Aorus WaterForce 1080 Ti.
> [Imgur](https://i.imgur.com/qm5KJ8f.jpg)
> 
> I decided to get a 1080 Ti because my thinking is the performance will be about the same as a 2080 Founders card, and ray tracing at over 60 fps most likely isn't there yet; I don't want to pay for two 2080s or two 2080 Tis in SLI right now.
> I think when Cyberpunk 2077 releases next year I could run a second 1080 Ti in SLI if I need to, or if NVIDIA ever releases the big-format 65-inch 4K monitor with 120 Hz refresh and HDR.
> 
> So far I'm surprised how much cooler the temps are for Pascal vs Maxwell.
> I used to have two Maxwell Titans in SLI with EK water blocks; I flashed the BIOS to 1.3 V and temps ran about 45°C at full load (The Witcher 3 ran 55 to 60 fps at 4K ultra). I sold my cards before the crypto boom and ran a single 1070 for over a year.
> I was pretty disappointed with the locked BIOS on Pascal cards; I gave up looking for a successful custom BIOS back when I had an MSI 1070 that could do 2100 core on the stock BIOS at 1.093 V.
> 
> So far, with the 1080 Ti Gigabyte WaterForce full-cover block and my big overkill radiator, I'm getting 24-25°C idle and 30-33°C full-load temps at 2063 core and 1.043 V. I'm sure I need to do some more looking, but is there a proper modded BIOS for my card? I remember something about the Asus ROG Strix card having a better stock BIOS with more voltage, but if it's a different, non-reference PCB I won't be able to flash it directly.
> 
> 
> Current specs for my PC:
> Intel X99 with a 5820K overclocked to 4.4 GHz
> GIGABYTE AORUS Xtreme WaterForce GTX 1080 Ti
> Alphacool full-copper radiator, 560mm x 45mm (a bit overkill, but I got it when I had the two Titans)
> Water pump is a cheap eBay SC600
> 
> [Imgur](https://i.imgur.com/PwuDNkT.jpg)
> 
> Edit: the BIOS version currently on my card is 86.02.39.00.DC and the memory type is GDDR5X (Micron).
> 3DMark Time Spy score at 1440p: https://www.3dmark.com/3dm/28825843


Have you tried the first page of this thread? KedarWolf has done a great job...


----------



## KedarWolf

2002sn95gt said:


> It's been a while since I've been on the forum,
> 
> but now I'm back into water cooling with a used Gigabyte Aorus WaterForce 1080 Ti.
> [Imgur](https://i.imgur.com/qm5KJ8f.jpg)
> 
> I decided to get a 1080 Ti because my thinking is the performance will be about the same as a 2080 Founders card, and ray tracing at over 60 fps most likely isn't there yet; I don't want to pay for two 2080s or two 2080 Tis in SLI right now.
> I think when Cyberpunk 2077 releases next year I could run a second 1080 Ti in SLI if I need to, or if NVIDIA ever releases the big-format 65-inch 4K monitor with 120 Hz refresh and HDR.
> 
> So far I'm surprised how much cooler the temps are for Pascal vs Maxwell.
> I used to have two Maxwell Titans in SLI with EK water blocks; I flashed the BIOS to 1.3 V and temps ran about 45°C at full load (The Witcher 3 ran 55 to 60 fps at 4K ultra). I sold my cards before the crypto boom and ran a single 1070 for over a year.
> I was pretty disappointed with the locked BIOS on Pascal cards; I gave up looking for a successful custom BIOS back when I had an MSI 1070 that could do 2100 core on the stock BIOS at 1.093 V.
> 
> So far, with the 1080 Ti Gigabyte WaterForce full-cover block and my big overkill radiator, I'm getting 24-25°C idle and 30-33°C full-load temps at 2063 core and 1.043 V. I'm sure I need to do some more looking, but is there a proper modded BIOS for my card? I remember something about the Asus ROG Strix card having a better stock BIOS with more voltage, but if it's a different, non-reference PCB I won't be able to flash it directly.
> 
> 
> Current specs for my PC:
> Intel X99 with a 5820K overclocked to 4.4 GHz
> GIGABYTE AORUS Xtreme WaterForce GTX 1080 Ti
> Alphacool full-copper radiator, 560mm x 45mm (a bit overkill, but I got it when I had the two Titans)
> Water pump is a cheap eBay SC600
> 
> [Imgur](https://i.imgur.com/PwuDNkT.jpg)
> 
> Edit: the BIOS version currently on my card is 86.02.39.00.DC and the memory type is GDDR5X (Micron).
> 3DMark Time Spy score at 1440p: https://www.3dmark.com/3dm/28825843


I'm not sure other BIOSes will work right on an Aorus card; it has a different power delivery system than regular 1080 Tis, so I don't want to suggest you try something different.
I'm not sure a BIOS I'd recommend, like the Palit 350W, will run on your card.
I know if you flash the Aorus BIOS on a regular card, the card will still work; the clocks are just messed up and don't work right.

If you do flash the Palit BIOS or something else, I'd want a second video card you can pop in to flash your Aorus back if it black-screens or something.


----------



## anticommon

Will an FTW3 bios work on an SC card? I have the regular hydrocopper which I believe uses the standard 1080ti SC pcb.


----------



## 2muchRGB

*Joined the Club*

See below


----------



## 2muchRGB

*Joined the Club*

Hey gang,
The lady at the store told me it was time to upgrade my rig. She said I should see a significant boost in graphics performance when I play Solitaire. Is this true?


----------



## KedarWolf

2muchRGB said:


> Hey gang,
> The lady at the store told me it was time to upgrade my rig. She said I should see a significant boost in graphics performance when I play Solitaire. Is this true?


Better than that, you'll actually be able to play Minesweeper on a 640x480 monitor!!


----------



## GraphicsWhore

2muchRGB said:


> Hey gang,
> The lady at the store told me it was time to upgrade my rig. She said I should see a significant boost in graphics performance when I play Solitaire. Is this true?


Yeah if you want those last-gen graphics. A 2080Ti ray traces all the cards. That’s where it’s really at.


----------



## 2ndLastJedi

Better late than never 😛


----------



## 2002sn95gt

I was able to get 1.093 V on the Gigabyte 1080 Ti Aorus WB with a custom clock/voltage curve in MSI Afterburner using Ctrl+F. Not sure why the voltage slider didn't push the voltage past 1.043 on its own with a clock offset, or on its own with GPU Boost. I did manage to get past 2063 with the voltage above 1.043; so far I got up to 2150 or so without crashing, but I still need to do some benchmarks and tuning to find a stable daily gaming clock speed.


----------



## kithylin

2002sn95gt said:


> I was able to get 1.093 V on the Gigabyte 1080 Ti Aorus WB with a custom clock/voltage curve in MSI Afterburner using Ctrl+F. Not sure why the voltage slider didn't push the voltage past 1.043 on its own with a clock offset, or on its own with GPU Boost. I did manage to get past 2063 with the voltage above 1.043; so far I got up to 2150 or so without crashing, but I still need to do some benchmarks and tuning to find a stable daily gaming clock speed.


2100-2175 is the maximum for 97% of 1080 Tis, just so you know. You're not going to get 2200 or more unless you get into full custom water, cross-flash the BIOS, and run higher volts. And even then only about 3% of 1080 Tis will do 2200 MHz or higher anyway.


----------



## KedarWolf

I downloaded the FTW3 BIOS with the King|Pin fix, ran Time Spy, 10598

Then I ran the 140 Power Limit Palit BIOS at my same 24/7 clocks, 10678.

Palit is still king for everyday 24/7 gaming clocks.


----------



## rossctr

Hey guys, wondering if you could point me in the right direction for a BIOS with a higher power limit for an MSI 1080 Ti FE? Also, is it easy to recover from a bad flash if things go wrong?

Thanks


----------



## nolive721

Stepping up from the 1080 owners club lol

Got a nice deal on eBay for a used MSI 1080 Ti Seahawk X, $550 delivered to me in Japan yesterday (you have no idea how insane 1080 Ti prices are over here, so I count myself lucky with that one...).

Coming from a good-overclocking EVGA 1080 FTW Hybrid, the MSI gave me 1974 MHz core and of course 11 GHz memory out of the box, at 1.05 V, during Heaven benchmarks.

Started to OC the beast late last night and could achieve a stable 2088 MHz core and 12 GHz memory at 1.093 V (with a custom voltage curve in MSI AB), with temps around 54-55°C (bear in mind it's still 21-22°C at night in Japan).

Looking at the card's behavior in this benchmark, I am actually power throttling, since it hits 120% very often during the 2nd half of the benchmark.

I had flashed a custom BIOS on my EVGA to extend the power limit to 130%, as well as the memory clock, so I was wondering if you guys have any advice on extending the PL beyond 120%? Just for the fun of it, I would like to break the 2100 MHz core barrier.

thanks so much

Olivier


----------



## nolive721

Is it correct to assume that a BIOS flash that works on the NVIDIA FE to extend the PL would also work on my card, since it shares the same PCB?


----------



## toncij

nvm


----------



## Echoa

Alright! Gonna be joining the 1080 Ti club! Managed to snag a PNY XLR8 1080 Ti for $500 so I'm pretty happy


----------



## nolive721

KedarWolf said:


> I downloaded the FTW3 BIOS with the King|Pin fix, ran Time Spy, 10598
> 
> Then I ran the 140 Power Limit Palit BIOS at my same 24/7 clocks, 10678.
> 
> Palit still king for every day 24/7 gaming clocks.


Hello, just a sanity check: although it shows in the details that the PL is pushed to +40%, that is the one you are referring to, right? Safe and worthwhile to flash on an FE like yours then?

https://www.techpowerup.com/vgabios/197550/197550


----------



## Streetdragon

Echoa said:


> Alright! Gonna be joining the 1080ti club! Managed to snag a pny xlr8 1080ti for 500$ so im pretty happy


$500 for a 1080 Ti, that's really nice ^^ price/performance is awesome!


----------



## Echoa

Streetdragon said:


> 500$ for a 1080TI thats really nice^^ price/performence is awesome!


Yeah, it looks like he was probably trying to mine with them for a few months, considering he has sold about 12-13 of them (different versions, and good feedback).

I don't care though as long as it works; I'll check temps to make sure it doesn't need new paste or pads. Don't plan on doing any overclocking for a while, as this'll be my card for several years. Hoping to get about 5 years out of it like I did my HD 6970.

If it was an AMD card I'd probably have passed on a suspected mining card, but considering NVIDIA and their BIOS limitations, plus the likelihood it was undervolted when mining, IMO it's fine.


----------



## kithylin

Echoa said:


> Yeah, it looks like he was probably trying to mine with them for a few months, considering he has sold about 12-13 of them (different versions, and good feedback).
> 
> I don't care though as long as it works; I'll check temps to make sure it doesn't need new paste or pads. Don't plan on doing any overclocking for a while, as this'll be my card for several years. Hoping to get about 5 years out of it like I did my HD 6970.
> 
> If it was an AMD card I'd probably have passed on a suspected mining card, but considering NVIDIA and their BIOS limitations, plus the likelihood it was undervolted when mining, IMO it's fine.


Honestly, with Nvidia the main concern with buying a used mining card is that the fans will probably wear out if it's an air-cooled card. But if you're intending to slip it under a full-cover water block, then it doesn't really matter whether they mined on it.


----------



## Echoa

kithylin said:


> Honestly with nvidia the main concern with buying a used mining card is the fans will probably wear out if it's an air cooled card. But if you're intending to slip it under a full cover water block then it doesn't matter if they mined it or not with nvidia really.


I'm not worried about the fans; worst case scenario I'll throw on an Arctic cooler if the fans give out. I have a Define R5 with the non-windowed side panel, so it's not like looks matter.


----------



## The Pook

If anyone's curious - the issues with 3DMark that I was asking about turned out to be an issue with MSI Afterburner's overlay. I thought disabling the overlay ruled it out as the cause, but it turns out you have to end the service and can't just turn it off. The more you know




Echoa said:


> Alright! Gonna be joining the 1080ti club! Managed to snag a pny xlr8 1080ti for 500$ so im pretty happy





nolive721 said:


> stepping up from the 1080 owner club lol)
> 
> got a nice deal on ebay for a used MSI 1080Ti Seahawk X for 550USD delivered to me in Japan yesterday(you have no idea how insane are 1080ti prices over here so i called myself lucky with that one....)





2muchRGB said:


> See below



I think a bunch of us took advantage of the 15% off everything coupon on eBay last week. I was going to buy a 1070 Ti but saw that coupon and grabbed a 1080 Ti instead


----------



## Echoa

The Pook said:


> If anyone's curious - the issues with 3DMark that I was asking about turns out was an issue with MSI Afterburner's overlay. I thought disabling the overlay ruled out the overlay causing any issues, but turns out you have to end the service and can't just turn it off. The more you know
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think a bunch of us took advantage of the 15% off everything coupon on eBay last week. I was going to buy a 1070 Ti but saw that coupon and grabbed a 1080 Ti instead


Unfortunately I missed that one (was broke at the time), but I did watch eBay like a hawk this last week. I passed on a couple of cheaper ones with broken fans, but right as I left work last night I found this guy's listing for the PNY XLR8 and jumped on it. Would've liked to pay slightly less, but oh well.

Was gonna grab a 1080, but with this one being only slightly more than I wanted to spend, I had to.


----------



## mouacyk

Best of luck with these cheap 2nd-hand mined 1080 Ti cards. Personally, I would be worried about the life of the VRMs if the cards were air-cooled only, because those could have coasted anywhere from 90°C to 110°C 24/7 for several months straight. They are rated for 120°C to 125°C, but those seem to be safe instantaneous temperature ratings, not sustained ones.


----------



## Echoa

mouacyk said:


> Best of luck to to these cheap 2nd-hand mined 1080 Ti cards. Personally, I would be worried about the life of the VRMs if the cards were air-cooled only, because those could have coasted anywhere from 90C to 110C 24/7 for several months straight. They are rated for 120C to 125C, but these seem to be safe instantaneous temperature points not sustained.


My whole computer minus my main SSD and PSU is cheap 2nd-hand at this point lol (I think I bought my 4770K from you). Luck has been good so far, and I'm confident in 2nd-hand as long as it seems to be in good shape. I'll worry if something happens; till then there's no reason to, IMO.


----------



## Echoa

Anyone know a way I can find out the thermal pad thicknesses for the PNY XLR8? I'd prefer not to have to get a bunch of sizes if I don't have to.

Does someone have that model and happen to know?


----------



## nolive721

The Pook said:


> If anyone's curious - the issues with 3DMark that I was asking about turns out was an issue with MSI Afterburner's overlay. I thought disabling the overlay ruled out the overlay causing any issues, but turns out you have to end the service and can't just turn it off. The more you know
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think a bunch of us took advantage of the 15% off everything coupon on eBay last week. I was going to buy a 1070 Ti but saw that coupon and grabbed a 1080 Ti instead


I bought mine from a seller in Israel 3 weeks ago, so I had to pay $50 for shipping, but the card itself was only $499. A Hybrid 1080 Ti, that is; I couldn't believe it, so you can guess I didn't hesitate a second.
Didn't take advantage of the eBay offer, but honestly I can't complain. The card is a good OCer so far; I just need to work out the rad fan speed control and config, since the push-only setup gives me slightly higher temps than I'd like (low 50s stock / high 50s with an extreme OC).
Might have to repaste the chip later down the road to help here as well. Do people have a thermal paste they recommend for GPUs? I am using Thermal Grizzly Kryonaut on my CPU AIO.


----------



## The Pook

Echoa said:


> Anyone know a way i can find out the thermal pad thicknesses for the PNY XLR8? I'd prefer not to have to geta bunch of sizes of i don't have to.
> 
> Does some have that model and happen to know?


EVGA uses 1 mm for the heatsink side and 1.5 mm for the backplate pads for 1060s and up. Can't help ya with PNY - but if pads are too thick you can cut them thinner with a sharp enough blade.



nolive721 said:


> I bought mine from a seller in Israel 3 weeks ago so I had to pay 50$ for shipping but the card itself was only 499$.An hybrid 1080Ti that is, I couldn't believe it so I didn't hesitate a second you can guess.
> Didn't take advantage of the Ebay offer but honestly,I cant complain. The card is a good OCer so far, I just need to work out the Rad fan speed control and config since the Push only gives me little too high temps for my taste (low 50s stock/ high 50s with extreme OC)
> Might have to repaste the chip later down the road to help here as well, if people have a thermal paste they recommend for GPU? I am using Grizzly kryonaut on my CPU AIO


I put liquid metal on my 1060 and it dropped temps a ton, but if you're willing to use a "regular" paste, MX-4 is cheap and works pretty well. I was an AS5 fanboy for ages but switched to MX-4 because the tubes are twice the size and cheaper, and it performs a smidge better.

Really though, for the most part, paste is paste. You're not gonna see >10°C differences between 'em.


----------



## Echoa

The Pook said:


> EVGA uses 1 mm for the heatsink side and 1.5 mm for the backplate pads for 1060s and up. Can't help ya with PNY - but if pads are too thick you can cut them thinner with a sharp enough blade.
> 
> 
> 
> I put liquid metal on my 1060 and it dropped temps a ton, but if you're just willing to use a "regular" paste MX-4 is cheap and works pretty well. I was an AS5 fanboy for ages but switched to MX-4 because the tubes are twice the size and cheaper and it performs a smidge better.
> 
> Really though for the most part, paste is paste. You're not gonna see >10c differences between em.


I found a guy on Reddit who redid his XLR8 pads. He used 1 mm, so I grabbed some Arctic 6 W/mK ones for the VRMs and generic 3.2 W/mK for everything else. I have some spare 1.5 mm if I need it.


----------



## Groove2013

Hi guys.

Have bought a Strix 1070 Ti Advanced and would like to overclock it. Have researched a lot, and it appears that Fire Strike is the way to go to test OC stability.
Will have to buy 3DMark on Steam for 28€ to be able to tweak different settings, like making it run in a loop, and to have access to deeper/finer settings.

I have a 1440p monitor.

I would be grateful if someone here could tell me how long one should run Fire Strike in a loop, and at which settings, to consider the OC stable.


----------



## mouacyk

Groove2013 said:


> Hi guys.
> 
> Have bought a Strix 1070 Ti Advanced and would like to overclock it. Have researched a lot and it appears that Firestike is the way to go to test OC stability.
> Will have to buy 3DMark on Steam for 28€ to be able to tweak different settings, like make it run in a loop and have access to deeper/finer settings.
> 
> I have a 1440p monitor.
> 
> I would be grateful if someone here could tell me for how long should one run Firestrike in a loop and at which settings to consider the OC as stable.


The Stress Test is usually sufficient; I think it runs around 20 loops. That's the same number of loops I run in the Metro benchmark. Then, if a game crashes later, just bump the clock down one or two notches and you should be set.
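The bump-down approach above can be sketched as a simple loop. Purely illustrative: `run_stress_test` is a hypothetical stand-in for a Fire Strike loop, and the ~13 MHz step matches Pascal's usual boost-bin granularity.

```python
# Illustrative only: find the highest stable core offset by stepping down
# one boost bin (~13 MHz on Pascal) after each failed stress run.
# run_stress_test is a hypothetical stand-in for a Fire Strike loop.

BIN_MHZ = 13  # Pascal adjusts clocks in roughly 13 MHz bins

def find_stable_offset(start_offset, run_stress_test, bins_down_on_fail=1):
    """Lower the core offset one bin at a time until the stress test passes."""
    offset = start_offset
    while offset > 0 and not run_stress_test(offset):
        offset -= bins_down_on_fail * BIN_MHZ
    return max(offset, 0)

# Example: pretend anything above +75 MHz crashes the benchmark.
stable = find_stable_offset(100, lambda off: off <= 75)  # settles at +74
```

In practice you do this by hand in Afterburner, of course; the point is just that each failure costs you one bin, which is why people speak in "notches" rather than single megahertz.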


----------



## Groove2013

Also, why does Afterburner say that I'm running into the voltage limit and lower the frequency?
I set it to 100%, which corresponds to 1.093 V. Should that be sufficient?
The power limit is also flashing sometimes, despite it never going higher than 116-118% of the 120%.


----------



## kithylin

Groove2013 said:


> Also why does Afterburner say that I run in voltage limit and lowers the frequency?
> I set it to 100%, which corresponds to 1.093 V. Should be sufficient?
> Power limit is also flashing sometimes despite the fact it never being higher than 116-118% of 120%.


The actual "Power limit" and "voltage limit" monitors don't do anything. They'll be at max even at stock even if you don't ever touch anything. Just remove em from monitoring. They're useless.


----------



## Groove2013

So what is making the frequency go down then?
The vBIOS of my 1070 Ti Strix is limited to 217 W. I can flash the one from MSI with a 240 W limit, but I don't know if it will work. Mine has 1x 8-pin, the MSI one 1x 8-pin + 1x 6-pin.


----------



## 8051

Groove2013 said:


> So what is making the frequency going down then?


Voltage and power perfcaps will definitely cause throttling. Flashing the Asus Strix OC VBIOS has allowed me to get around voltage perfcaps on my MSI 1080Ti Duke and my old Gigabyte 1080Ti Gaming OC (RIP). I think keeping temps down is the only way to evade power perfcaps.


----------



## Groove2013

Temps are <70°C.


----------



## KedarWolf

Groove2013 said:


> Also why does Afterburner say that I run in voltage limit and lowers the frequency?
> I set it to 100%, which corresponds to 1.093 V. Should be sufficient?
> Power limit is also flashing sometimes despite the fact it never being higher than 116-118% of 120%.


In my experience the Power Limit kicks in around 117% on a 120% power limit BIOS, and other factors, like temperature, affect your highest stable voltage.

Try the 140 Power Limit Palit BIOS. Best BIOS I found for 24/7 every day overclocks.


----------



## Groove2013

Is there a way to add more voltage, or is 1.093 sufficient once the power limit problem is solved?
Temps aren't a problem.


----------



## KedarWolf

Groove2013 said:


> Is there a way to add more voltage or is 1.093 sufficient once power limit problem solved?
> Temps aren't a problem.


The Palit BIOS also raises the voltage to 350W, just remember after flashing any BIOS run the correct powerlimit.bat file found in the BestBIOSCollection.zip to set the voltage. :cheers:

The other option IF YOU ARE USING A FULL COVER WATERBLOCK is the no power limit XOC BIOS.


----------



## The Pook

It looks like my 1080 Ti is a terrible OCer ... or at least compared to my 1060.

With the voltage and power sliders maxed, it's stable at +50 on the core, and it'll finish benchmarks at +75 with a few artifacts. +100 just fails a few seconds into a 3DMark run. My 1060 was good for +190 for 2200.



mouacyk said:


> Best of luck to to these cheap 2nd-hand mined 1080 Ti cards. Personally, I would be worried about the life of the VRMs if the cards were air-cooled only, because those could have coasted anywhere from 90C to 110C 24/7 for several months straight. They are rated for 120C to 125C, but these seem to be safe instantaneous temperature points not sustained.


If your post was directed at me ... I bought a new card

Though, I don't know of any VRM that won't throttle by 110°C. Even so, Asus rates their VRMs at 5,000 hours @ 105°C and 500,000 hours at 65°C, and those numbers are laughably low-balled.


----------



## Groove2013

Sorry man, but I have a 1070 Ti with an official 217 W limit and 1x 8-pin, not a 1080 Ti.


KedarWolf said:


> raises the voltage to 350W


Voltage is voltage and wattage is wattage, no? I don't understand.
I understand that higher voltage increases power draw.

If I flash an official vBIOS from another vendor, do I have to reinstall the drivers?


----------



## Groove2013

Have flashed from the original 217 W Asus Strix vBIOS to a 252 W Zotac AMP! vBIOS.
Sure, my Strix has only 1x 8-pin + the PCI-E slot (150+75 = 225 W) vs. the Zotac AMP! with 2x 8-pin + the PCI-E slot (150+150+75 = 375 W).
But I'm sure that 25 W more won't be a problem at all.

In 2013 my Titan had a vBIOS set for 600 W while having only 1x 6-pin + 1x 8-pin + the PCI-E slot (75+150+75 = 300 W), and it drew ~375 W for 3 years with no problems. Still working in my buddy's PC.


----------



## KedarWolf

Groove2013 said:


> Sorry man, but I have a 1070 Ti with official 217 W limit with 1x8 pin, not a 1080 Ti.
> 
> Voltage is voltage and wattage is wattage or not? I don't understand.
> I understand that higher voltage increases power draw.
> 
> If i flash to an official vBIOS from another vendor, do I have to reinstall the drivers?


Yes, it's wattage, but I believe it raises the max power draw as well; otherwise there would be no difference between a 300 W BIOS and this 350 W one.

Edit: Technically you don't need to reinstall the drivers; they'll load again 30 seconds after rebooting. But I do recommend a DDU clean driver uninstall and a reinstall after rebooting.


----------



## GreedyMuffin

KedarWolf said:


> I downloaded the FTW3 BIOS with the King|Pin fix, ran Time Spy, 10598
> 
> Then I ran the 140 Power Limit Palit BIOS at my same 24/7 clocks, 10678.
> 
> Palit still king for every day 24/7 gaming clocks.


Would you mind sharing the ROM you have? Would it work fine on an FE?


----------



## vasyltheonly

Hey everyone, just wanted to clarify something about the Asus Strix no-power-limit BIOS. Flashing it to my Zotac AMP card seems to make it perform worse. Reading through the earlier comments, people attribute this to the memory timings and such. Is that true? With the Arctic Storm BIOS I can reach a maximum of 2113 MHz, and with the Asus it can reach close to 2200 MHz without crashing, but the performance is always lower. So any idea what is truly going on?


----------



## kithylin

Groove2013 said:


> Temps are <70°C.


All Nvidia Pascal-series cards start reducing clocks at 60°C, just so you know.



vasyltheonly said:


> Hey everyone, just wanted to clarify something about the Asus Strix No power limit bios. Uploading it to my Zotac Amp card seems to cause it to perform worse off. Reading through the earlier comments, people point it to being as an issue to the timings of the memory and such. Is that true? With the Artic Storm bios I can reach a maximum of 2113mhz and with the ASUS it can reach close to 2200mhz without crashing but the performance is always lower. So any idea what is truly going on?


It may have slightly lower performance vs. other BIOSes, but it depends on what you're doing. For me, with my 1080 Ti, the XOC BIOS was the only way I was able to get +300 MHz on the core and +400 MHz on the memory. So yes, it may run a little slower, but that can be completely offset by the higher clocks you may reach.


----------



## 2ndLastJedi

Groove2013 said:


> Sorry man, but I have a 1070 Ti with official 217 W limit with 1x8 pin, not a 1080 Ti.
> 
> 
> KedarWolf said:
> 
> 
> 
> raises the voltage to 350W
> 
> 
> 
> Voltage is voltage and wattage is wattage or not? I don't understand.
> I understand that higher voltage increases power draw.
> 
> If i flash to an official vBIOS from another vendor, do I have to reinstall the drivers?

I have more success undervolting than overvolting!
Have a go at an undervolted overclock!
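For what it's worth, what an Afterburner undervolt effectively does is flatten the voltage/frequency curve at a chosen point, so the card never requests more voltage than that. A rough sketch of the idea; the sample curve values here are invented for illustration, not taken from any real card:

```python
# Sketch of a "flattened" undervolt curve: every point above the chosen
# voltage is clamped to the frequency at that voltage, so the card stops
# requesting more volts. The sample curve below is invented, not measured.

def flatten_curve(curve, max_mv):
    """curve: list of (millivolts, mhz) pairs sorted by voltage."""
    cap_mhz = max(mhz for mv, mhz in curve if mv <= max_mv)
    return [(mv, mhz) if mv <= max_mv else (mv, cap_mhz) for mv, mhz in curve]

sample = [(900, 1850), (975, 1999), (1043, 2063), (1093, 2100)]
undervolted = flatten_curve(sample, 975)
# points above 975 mV now sit at the 1999 MHz cap
```

This is why an undervolted card can hold its clock steadier than a stock one: it never wanders into the voltage/power territory that triggers throttling in the first place.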


----------



## nolive721

Talking about undervolting:

Managed to get my Seahawk stable at 1999 MHz core and 12,300 MHz memory at 0.975 V.

any good?


----------



## CENS

Yes, not every card can hit 2000 MHz even with 1 V, so hitting 2000 MHz consistently at 0.975 V is great, especially taking your memory OC into account.



On the other side of the spectrum there are also exceptionally great cards. At 1.012 V this Kingpin card managed 2138 MHz in 3DMark. On air... cool air, though.


----------



## KedarWolf

GreedyMuffin said:


> Would you mind sharing the rom you have? Would it work fine on a FE?


It's in the BestBIOSCollection.zip in the OP on the first page, including the Palit350W.bat file to run after flashing and rebooting to set the power limit.


----------



## KedarWolf

I tested every BIOS that can be used on a 1080 Ti FE (or most other cards).

This is the 24/7 overclock that I run daily on my machine for gaming.










I left the Nvidia Control Panel at defaults except for setting Maximum Performance, Shader Cache off and V-sync off.

For each flash, I ran the right powerlimit.bat file, then applied the Nvidia and Afterburner settings after rebooting.

Here are the results. Seems Palit BIOS is king on my 24/7 every day settings. :drum:

Arctic Storm BIOS.










EVGA FTW BIOS.










Nvidia FE BIOS.










Palit BIOS.










StrixOC BIOS.










XOC No Power Limit BIOS


----------



## zGunBLADEz

My 24/7 vs. my max overclock isn't too far off.
Can do 2088/650 or 700 with a mere 0.98 V on this 1080 Ti.


----------



## CENS

Also very impressive


----------



## ThrashZone

Hi,
The EVGA FTW3 slave vBIOS with the King|Pin fix is best for me anyway; 10480 in Time Spy, best is 10498.
May not work with an 8+6-pin power connector though.
https://www.techpowerup.com/vgabios/191833/evga-gtx1080ti-11264-170418


----------



## Groove2013

So it's all about the temperature and not the voltage to keep the boost constant?
Even 50+°C is too much to keep the boost constant?


----------



## Streetdragon

If I set my card to 2050 MHz and it's under 23-24°C, it boosts to 2063.

At 43°C or so the GPU should start to drop a bin, no? Don't know. My card never goes over 32°C xD
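The bin behavior being discussed can be modeled roughly: GPU Boost 3.0 on Pascal sheds one ~13 MHz bin as certain temperature thresholds are crossed. The thresholds below are assumptions for illustration only; they vary per card and BIOS:

```python
# Rough model of Pascal thermal bin dropping: one ~13 MHz bin is shed at
# each temperature threshold crossed. Thresholds vary per card and BIOS;
# the values here are assumed for illustration, not from any spec sheet.

BIN_MHZ = 13
THRESHOLDS_C = [38, 46, 54, 60]  # assumed thresholds

def boost_clock(base_boost_mhz, temp_c):
    """Boost clock after dropping one bin per threshold crossed."""
    bins_dropped = sum(1 for t in THRESHOLDS_C if temp_c >= t)
    return base_boost_mhz - bins_dropped * BIN_MHZ

# A card boosting to 2063 MHz when cool sits lower once it warms up:
cool = boost_clock(2063, 30)  # no thresholds crossed -> 2063
warm = boost_clock(2063, 55)  # three thresholds crossed -> 2024
```

Which is why a card that never leaves the low 30s (like a water-cooled one) simply never sees those drops.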


----------



## nolive721

Snipes9 said:


> Yes not every card can hit 2000Mhz with 1v even. So hitting 2000Mhz consistently with 0.975v is great. Especially also taking your memory OC into account.
> 
> 
> 
> On the other side of the spectrum there are also exceptionally great cards. At 1.012v this kingpin card managed 2138Mhz in 3Dmark. On Air... cool air though.
> 
> https://www.youtube.com/watch?v=vft-J2ABDd8


thanks

And is it normal that MSI AB reports throttling when doing extreme undervolting, or is there something I should look at to improve?


----------



## Streetdragon

GPU-Z shows it too.
If your GPU clock is stable/not dropping, then you are not being limited, so I would say it's fine ^^


----------



## Streetdragon

KedarWolf said:


> I tested every BIOS that can be used on a 1080 Ti FE (or most other cards).
> 
> This is the 24/7 overclock I used that I run daily on my machine for gaming.


My result from playing a bit:
https://www.3dmark.com/spy/4634174

Made the same changes in the driver as KedarWolf.

2100/6003 with 1.075 V on the stock BIOS and a shunt mod; I think that's not so bad ^^ I'll now do a run at the same speeds.

Edit:
2037/6003 (higher RAM clocks won't give more performance...)
https://www.3dmark.com/spy/4634337


----------



## ThrashZone

Streetdragon said:


> My result from playing a bit:
> https://www.3dmark.com/spy/4634174
> 
> Made the same changes in the driver like KedarWolf.
> 
> 2100/6003 with 1.075V Stock Bios and shunt mod, Ithinks thats not so bad^^ I do now a run with the same speeds
> 
> Edit:
> 2037/6003(Higher Ram wont give more Power...)
> https://www.3dmark.com/spy/4634337


Hi,
Looks like your CPU score is not doing very well.
Liked the first score better.


----------



## KedarWolf

Streetdragon said:


> My result from playing a bit:
> https://www.3dmark.com/spy/4634174
> 
> Made the same changes in the driver like KedarWolf.
> 
> 2100/6003 with 1.075V Stock Bios and shunt mod, Ithinks thats not so bad^^ I do now a run with the same speeds
> 
> Edit:
> 2037/6003(Higher Ram wont give more Power...)
> https://www.3dmark.com/spy/4634337


For 3DMark try these settings for benching.


----------



## ThrashZone

Hi,
The new version of 3DMark will flag those settings as not valid, I imagine.


----------



## kithylin

nolive721 said:


> thanks
> 
> and its obviously normal MSI AB report throttling when doing extreme undervolting or is there something I should look at to improve?


Those 3 monitors are always going to show "1" no matter what you're doing. Stock card, overclocked, undervolted, they're always "1". Just remove them from MSI Afterburner; I don't know why they're there. They're completely pointless.


----------



## nolive721

kithylin said:


> Those 3 monitors are always going to show "1" no matter what you're doing. Stock card, overclock, undervolting, they're always "1" no matter what. Just remove them from msi afterburner. I don't know why they're there. They're completely pointless and do nothing.


Thank you, but I have a different experience: when the OC is well balanced between voltage and clocks, none of these indicators show throttling.

In my case with undervolting, I would imagine that even if my OC is actually stable, MSI AB and/or the card's telemetry considers there aren't enough volts and then triggers the power and thermal flags, even if they are not beyond their theoretical limits.

My 2p of course; maybe people who know these Pascal cards better can shed some light here?


----------



## nolive721

KedarWolf said:


> I tested every BIOS that can be used on a 1080 Ti FE (or most other cards).
> 
> This is the 24/7 overclock I used that I run daily on my machine for gaming.
> 
> 
> Here are the results. Seems Palit BIOS is king on my 24/7 every day settings. :drum:



I was achieving 2088 MHz core, 12,300 MHz VRAM with the stock BIOS and a custom voltage curve in MSI AB, but was power throttling as soon as I tried higher clocks on the core.

Flashed the Palit BIOS included in the original post last night on my Seahawk X to see if it helped, but even though power now shows going beyond the 120% you get on the stock BIOS, monitoring was still showing power throttling, and the card would crash while trying 2100 MHz core.

Will post screenshots tonight after some more testing, but any advice is much appreciated.


----------



## ThrashZone

Hi,
Yeah, can't get past the Nvidia driver stuff; it remains even with a vBIOS change.


----------



## Streetdragon

KedarWolf said:


> For 3DMark try these settings for benching.


Yeah, tonight I will try that out.
Just wanted to compare a bit with your clocks and scores :thumb:

And my CPU, don't know why it's so low... maybe I'll close some background programs.


----------



## Streetdragon

https://www.3dmark.com/spy/4650248
GPU 2100/6210 (VRAM +700)

CPU 5930K clocked to 4800/4500

Just a little question: would 1.075 V be safe if I keep the core at a max of 31°C at 2100/6210 MHz?


----------



## kithylin

nolive721 said:


> was achieving 2088Mhz core, 12,300Mhz Vram with the stock BIOS and a custom Volt curve in MSI AB but was power throttling as soon as I was trying higher clocks on the core
> 
> flash the palit BIOS included in the original post last night on my Seahawk X to see if it helped but even if power shows going beyond 120% you get in the stock BIOS, monitoring was showing still power throttling and the card would crash while trying 2100Mhz core.
> 
> Will post screenshot tonight after some more testing but any advice is much appreciated.


You need to understand that the "power percentage" on its own is meaningless and is a different value for every BIOS. Different BIOSes have a different base power limit for the card while also having a different +% headroom. Some BIOSes, for example, may be 150 W + 30% (195 W actual max), while others may be 275 W + 15% (~316 W actual max). So don't look at just the power %, because you have to take the base power limit into account before applying the +% to it. For most BIOSes, the TechPowerUp BIOS database shows you both the base wattage and the +% boost.
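The base-plus-percentage arithmetic above works out like this (the wattage pairs are just the hypothetical examples from the post, not values from any real BIOS):

```python
# The power slider is relative to the BIOS's base power target, so the same
# +% means very different watts on different BIOSes. The numbers below are
# the hypothetical examples from the post above, not real BIOS values.

def actual_max_watts(base_watts, max_slider_percent):
    """Convert a base power target and a max slider % into actual watts."""
    return base_watts * max_slider_percent / 100

a = actual_max_watts(150, 130)  # 150 W BIOS at +30% -> 195 W
b = actual_max_watts(275, 115)  # 275 W BIOS at +15% -> ~316 W
```

So a card showing "120%" on one BIOS can be drawing fewer watts than a card showing "105%" on another; only the computed wattage is comparable across BIOSes.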


----------



## Echoa

Alright, finally a proud owner of a 1080 Ti.

The card arrived really quickly; they actually tried to deliver it Sunday, but I was at work, so I had to get it this morning. It was in fantastic condition, as was the box; if not for him telling me, you wouldn't know it had ever been mined on, it was so pristine. The PNY cooler is a little meh, but with a custom fan curve it keeps it under 70°C, which is my goal.

Tomorrow I'll replace the pads and paste and see how things improve (if at all).


----------



## KCDC

Hey all, been a while and tried scrolling through the past month, so I'm just gonna ask:


Is there still no way to have all three Display Ports active while on another BIOS with FE cards? Namely the XOC or Palit BIOS that KedarWolf mentioned?


Still running the same rig in sig.


----------



## H3avyM3tal

I managed to trade my 1080 for a 1080 Ti for $100 - that's a good deal here!

What should I do? Overvolt, undervolt? Normal OC?

It's a Galax EXOC.


----------



## kithylin

H3avyM3tal said:


> I managed to trade my 1080 for a 1080ti for 100$ - thats a god deal here!
> 
> What should I do? Overvolt, undervolt? Normal oc?
> 
> Its a galax exoc.


Set the temperature and power % sliders in Afterburner to max, set the fans to 65%, apply, and play games; the card will automagically self-overclock as far as it's able to with Nvidia Boost 3.0. There's not really much to it anymore with Nvidia.


----------



## LeoYunta

Hi guys,

I have an EVGA 1080 Ti FTW3, running under water.

I never had to overclock the card because I never needed to. I'm playing in 4K, and COD 4 is making my card suffer a little (first time I've dropped under 60 fps). This game literally takes 10 GB of memory out of my 11 GB, which is insane!
So I'm actually thinking of overclocking it.

It's been a long time since I overclocked my GPU, so what would be the best way to start? Just unlock the voltage in Afterburner and OC? I don't really wanna go crazy with flashing BIOSes and everything unless it's really worth it. In fact, I just wanna be able to play without any drops at max settings.

Like I said I'm under water, so temperature is not a problem (50°C max while playing).

For info I have a 4770K @4,5 and [email protected],4Ghz with Windows 10

Cheers!


----------



## GraphicsWhore

LeoYunta said:


> Hi guys,
> 
> I have an Evga 1080 Ti FTW3, running underwater.
> 
> I never had to overclock that card cause I never needed too, I'm playing in 4K, and COD 4 making my card suffering a little (First time I have drop under 60 fps) So I'm actually thinking to overlock it.
> 
> Been a long time since I overlocked my GPU, so what would be the best to start with ? Just afterburner unlock voltage and OC ? I don't really wanna go crazy with flashing BIOS and everything unless it's really worth it. In fact, I just wanna be able to play without any drop with max setting.
> 
> Like I said I'm underwater, so temperature is not a problem (50C max while playing)
> 
> For info I have a 4770K @4,5 and [email protected],4Ghz with Windows 10
> 
> Cheers !


Don’t expect the OC to do much for frame rates.

Best use of Afterburner would be to set your own voltage curve. I would try just modifying the stock to hit maybe 2080 @ 1093mV then just add an OC to memory. If it works raise the curve at 1093mV to higher clock.

I don’t know much about that stock FTW3 though. I’m coming from the perspective of using the XOC BIOS.

On that note I know you say you don’t want to get into flashing but it literally takes a minute and since you’re on water and assuming you have a good PSU you’ll probably get best results with that BIOS.


----------



## LeoYunta

GraphicsWhore said:


> Don’t expect the OC to do much for frame rates.
> 
> Best use of Afterburner would be to set your own voltage curve. I would try just modifying the stock to hit maybe 2080 @ 1093mV then just add an OC to memory. If it works raise the curve at 1093mV to higher clock.
> 
> I don’t know much about that stock FTW3 though. I’m coming from the perspective of using the XOC BIOS.
> 
> On that note I know you say you don’t want to get into flashing but it literally takes a minute and since you’re on water and assuming you have a good PSU you’ll probably get best results with that BIOS.


Thank you for your answer. Well, I can flash it if there's really something to gain. Apart from my 1080 Ti, my hardware is starting to get old, but it still works like a charm! I use a Corsair AX860 80 Plus Platinum.

Well, I just want to keep the thing locked at 60 fps, so I don't really need a big boost. I'm just really picky.


----------



## GraphicsWhore

LeoYunta said:


> GraphicsWhore said:
> 
> 
> 
> Don’t expect the OC to do much for frame rates.
> 
> Best use of Afterburner would be to set your own voltage curve. I would try just modifying the stock to hit maybe 2080 @ 1093mV then just add an OC to memory. If it works raise the curve at 1093mV to higher clock.
> 
> I don’t know much about that stock FTW3 though. I’m coming from the perspective of using the XOC BIOS.
> 
> On that note I know you say you don’t want to get into flashing but it literally takes a minute and since you’re on water and assuming you have a good PSU you’ll probably get best results with that BIOS.
> 
> 
> 
> Thank you for your answer, well, I can flash it if I can really gain out of this, except my 1080 Ti, my hardware is starting to getting old, but still works like a charm ! I use a CORSAIR AX860 80 PLUS PLATINUM.
> 
> Well, I just want to stuck the thing at 60 fps, so I don't really need a big boost. I'm just really picky.
Click to expand...

I would find your max OC with your current BIOS and compare it to the XOC. However, all said and done, you're looking at a minimal increase in min/max FPS; possibly not even noticeable, especially as you're already on water and presumably getting good boost numbers.


----------



## Outcasst

Delete


----------



## kithylin

LeoYunta said:


> Hi guys,
> 
> I have an Evga 1080 Ti FTW3, running underwater.
> 
> I never had to overclock that card cause I never needed to, I'm playing in 4K, and COD 4 is making my card suffer a little (first time I have drops under 60 fps). This game is literally taking 10GB of VRAM out of my 11GB, which is insane!
> So I'm actually thinking of overclocking it.
> 
> Been a long time since I overclocked my GPU, so what would be the best way to start? Just unlock voltage in Afterburner and OC? I don't really want to go crazy with flashing the BIOS and everything unless it's really worth it. In fact, I just want to be able to play at max settings without any drops.
> 
> Like I said I'm underwater, so temperature is not a problem (50C max while playing)
> 
> For info I have a 4770K @4,5 and [email protected],4Ghz with Windows 10
> 
> Cheers !


Quite literally, 98% of 1080 Ti's reach about the same max overclock (2000-2075 MHz). And since you're already under water, with how NVIDIA Boost 3.0 works on Pascal, your card is most likely already hitting 2000-2025 MHz on its own without you doing anything. When Boost 3.0 detects the card is running cooler, it automatically self-overclocks further. All you really need to do is open Afterburner, push the temperature target and power target sliders to maximum (all the way to the right), click apply, and go play. That gives Boost 3.0 more headroom to work with, and it will self-boost as far as it's able. This keeps your card 100% stable at all times (Boost 3.0 will never push your card to clocks that could be unstable) and always within safe limits, with essentially zero risk of damage.

With manual overclocking you're only going to gain 1-3 FPS above what Boost 3.0 can do, and you'll introduce the possibility of instability and (depending on how far you push voltage and clocks) of damaging your card over time. So I would suggest you just raise the limits, click go, let Boost 3.0 do its thing, and enjoy the card. It's really not worth tweaking much on a 1080 Ti; they're fast cards already and the gains are marginal at best.
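The "cooler card boosts higher" behavior described above can be pictured as stepping down in small clock bins as temperature thresholds are crossed. The ~13 MHz bin size is commonly reported for Pascal; the thresholds below are invented for illustration, not NVIDIA's actual tables:

```python
# Toy model of Pascal-style thermal stepping: the boost clock drops by
# one bin (~13 MHz is the commonly reported Pascal step) each time a
# temperature threshold is crossed. The thresholds here are invented
# for illustration only; real Boost 3.0 tables are not public.

BIN_MHZ = 13
THRESHOLDS_C = [37, 45, 52, 58, 63]  # hypothetical step-down points

def boosted_clock(max_boost_mhz, temp_c):
    """Clock after dropping one bin per threshold reached at temp_c."""
    bins_dropped = sum(1 for t in THRESHOLDS_C if temp_c >= t)
    return max_boost_mhz - bins_dropped * BIN_MHZ

print(boosted_clock(2025, 35))  # cool, watercooled card: 0 bins -> 2025
print(boosted_clock(2025, 60))  # warm air-cooled card: 4 bins -> 1973
```

This is why a blocked card sitting at 33-50°C tends to hold its top bin constantly, while an air-cooled card bleeds a few bins under load.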


----------



## mattliston

Actually, it's DEFINITELY worth tuning the card.


Something as easy as getting it to run certain clocks at lower voltages means less power consumption AND generally being able to reach and hold higher clocks.


Out of the box, my card (Gigabyte Aorus non-Xtreme) only boosted to 1961 MHz, even with fan speeds set above 50% rather than the factory curve, which wanted only 10-20% under 50°C.


I was able to tune quite a large chunk of voltage out, and it still only self-clocked to 1961 MHz.


The curve feature of MSI Afterburner was very handy in getting a 2050 MHz clock out of much less voltage than the default 1961 MHz needed.


----------



## GraphicsWhore

Outcasst said:


> Hello.
> 
> I had an accident tonight with my 1080 Ti.
> 
> I was removing my custom cooler, which has separate adhesive memory heatsinks. When removing one of these, it ripped a piece of the GDDR5X chip off with it. The GPU still works (without drivers installed) but shows severe artifacting.
> 
> Do you guys think it would be possible to get this repaired somewhere?


Maybe?

Just throwing it out there because I remember it from a recent thread: a user here, "rob w", bricked his Titan during a shunt mod, sent it off, and got it fixed. The company is Retronix Ltd (retronix.com) in Scotland.


----------



## kithylin

mattliston said:


> Actually, its DEFINITELY worth tuning the card.
> 
> 
> Something as easy as getting it to run certain clocks at lower voltages means less power consumption AND generally able to maintain and hold higher clocks.
> 
> 
> 
> 
> Out of the box, my card (giga aurus non-xtreme) only boosted to 1961mhz. Even when *fan speeds *were set above 50% rather than the factory curve that wanted only 10-20% at under 50*C
> 
> 
> 
> 
> I was able to tune quite a large chunk of voltage out, and it still only self-clocked to 1961mhz.
> 
> The curve clocking feature of MSI AF was very handy in getting a 2050mhz clock out of much less voltage than the default setting to 1961mhz


You're using an air-cooled card. 1080 Ti's under a custom water loop will all maintain maximum boost speed at all times simply because they run cooler (as long as they're kept below 60°C, which is easy with a single 240mm or larger radiator). Pascal cards throttle from heat before they throttle from power limits, so a 1080 Ti under water does NOT perform the same as an air-cooled card.


----------



## Abaidor

kithylin said:


> You're using an air cooled card. 1080 Ti's under custom water loop (all of them) will all maintain the maximum boost speed at all times just because they're running cooler (as long as they're kept below 60c at all times, which is super easy with a custom water loop with at least a single 240mm rad or bigger). Pascal cards throttle from heat before they throttle from power limits. 1080 Ti's under custom water loop do -NOT- perform the same as when running as an air cooled card.


True. I use a Phanteks block on my Asus Strix OC; it's not the top performer (that doesn't matter though) but it is of excellent quality, and the card never exceeds 40°C (big custom loop with an external rad). While testing without any overclock, the GPU clock stays fixed at 1999 MHz the whole time.

When overclocking with Asus GPU Tweak I can manage 6200 MHz on memory and 2075 MHz on the GPU with temps staying the same. I run it stock since I don't have a 4K monitor and, as a matter of fact, don't game much. In any case, a GPU waterblock is totally worth it if only for how silent my PC is.


----------



## Streetdragon

Outcasst said:


> Hello.
> 
> I had an accident tonight with my 1080 Ti.
> 
> I was removing my custom cooler, which has separate adhesive memory heatsinks. When removing one of these, it ripped a piece of the GDDR5X chip off with it. The GPU still works (without drivers installed) but shows severe artifacting.
> 
> Do you guys think it would be possible to get this repaired somewhere?


You can't repair a dead VRAM chip, so it's only a matter of time until the chip fully dies and takes the card with it.

Maybe with BIOS modding it would be possible to "disable" that chip. But for now... R.I.P.


----------



## Zfast4y0u

Streetdragon said:


> You cant repair a dead VRAM chip. So its only a matter of time till the chip fully dies and with it the card.
> 
> Maybe with biosmodding it would be possible to "disable" that chip. But for now.... R.I.P...


The chip is soldered to the PCB; what stops us from removing it and putting a new one in its place?


----------



## ThrashZone

LeoYunta said:


> Thank you for your answer, well, I can flash it if I can really gain out of this, except my 1080 Ti, my hardware is starting to getting old, but still works like a charm ! I use a CORSAIR AX860 80 PLUS PLATINUM.
> 
> Well, I just want to stuck the thing at 60 fps, so I don't really need a big boost. I'm just really picky.


Hi,
Assuming you're using the Master vBIOS position, which is near the power plugs, to the left.

A really easy OC on the FTW3 is +65 core clock and +600 memory clock; the memory can likely go further, +700-800 is nothing for this card.
It's best to use EVGA Precision, as it's the only utility that can control all three fans on the FTW3.
Set a custom fan curve with all three fans on the same curve: 30°C at 30% fan speed, then a straight line up to 100% at 60°C; relax that to 100% at 70°C if it's too noisy.

The power limit shouldn't need adjustment at all unless you go past +65 core clock.
Core voltage at +115% should be fine if stutters occur at 100%.

I had to hunt it down, but here's the FTW3 owners' thread:
https://www.overclock.net/forum/69-nvidia/1631796-evga-ftw3-gtx-1080-ti-owner-s-thread.html
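The fan curve suggested above (30% at 30°C rising in a straight line to 100% at 60°C) is just linear interpolation between two points, with the ends pinned. A sketch, using the numbers from the post; the function itself is generic:

```python
# Sketch of the fan curve described above: 30% at 30C, a straight line
# up to 100% at 60C, then pinned at 100%. The endpoint values come
# from the post; anything else here is generic interpolation.

def fan_percent(temp_c, t_low=30, p_low=30, t_high=60, p_high=100):
    """Linear fan curve between (t_low, p_low) and (t_high, p_high)."""
    if temp_c <= t_low:
        return p_low           # floor: never slower than p_low
    if temp_c >= t_high:
        return p_high          # ceiling: pinned at full speed
    frac = (temp_c - t_low) / (t_high - t_low)
    return p_low + frac * (p_high - p_low)

print(fan_percent(45))  # halfway between 30C and 60C -> 65.0
```

Moving the 100% point from 60°C out to 70°C, as suggested for noise, just means calling it with `t_high=70`, which flattens the slope.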


----------



## mouacyk

Zfast4y0u said:


> chip is soldered to PCB, what stops us from removing it and putting new on its place?


Try asking this guy: https://www.youtube.com/user/rossmanngroup/search?query=vram


----------



## spyshagg

Hello guys

Sold my Vega64 for 450 and bought a Gigabyte Aorus 1080ti for 450  (14 years with AMD!)












In Black Ops 4 it boosts up to ~1960 MHz without touching anything. It seems to hang if I try anything above ~2030 MHz.


How high (24/7) could this card go with the tips in the OP? Does anyone here have this card?


Cheers


----------



## kithylin

spyshagg said:


> Hello guys
> 
> Sold my Vega64 for 450 and bought a Gigabyte Aorus 1080ti for 450  (14 years with AMD!)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In Blackops4 it boosts up to ~1960mhz without touching anything. It seems to hang if I try anything above ~2030mhz
> 
> 
> How high (24/7) could this card go with the tips in the OP ? Anyone here has this card?
> 
> 
> Cheers


All air-cooled 1080 Ti's can only do about 2000-2075 MHz core speed, no matter who makes them or what cooler they have. Some tips though: with the MSI Afterburner window selected, press Ctrl+F to access the voltage curve screen, move voltages as much as it lets you, and you might get 2050 or 2075 MHz stable. Use GPU-Z to monitor what voltages it's actually running at. Push the fans higher with a custom fan curve; Pascal throttles its clocks from temperature more than from power limits, so the cooler you keep it, the faster it will go. That's the most you can hope for with an air-cooled card. Look more towards a RAM OC instead, since you're already about maxed out on core speed. All 1080 Ti's can do at least +400 MHz on RAM, some can do +500; most air-cooled cards won't do +600, but some might. +700 MHz RAM on a 1080 Ti is usually water-cooling territory, but maybe you're lucky.


----------



## 8051

kithylin said:


> All air cooled 1080 Ti's can only do about 2000~2075 mhz core speed, no matter who makes it or what cooler it has. Some tips for you though: In MSI Afterburner window selected, press Control-F and you can access the voltage curve control screen and move voltages as much as it lets you and you might get 2050 mhz stable or 2075. Use GPU-Z monitoring to monitor what voltages it's actually running at. Push fans to higher and use a custom fan curve. Pascal throttles it's clocks from temperature more than power limits. The cooler you keep it the faster it will go. That's the most you can hope for with an air cooled card. Look more towards ram OC instead of core speed, you're already about maxed out on core speed. All 1080 Ti's can do at least +400 Mhz on ram, some can do +500, most air cooled cards won't do +600 on ram, but some might. +700 mhz ram for 1080 Ti is usually water cooling territory, but maybe you're lucky.


Kithylin, I installed nearly 100x100mm of 4mm thick thermal tape between the backplate and PCB of my MSI 1080 Ti Duke and replaced the fan pack with a shrouded Noctua AF14 iPPC-3000 and a shrouded SanAce 127x38mm fan on the stock heatsink, and I can maintain 2152 MHz @ 1.181V. Only on the hottest days of summer does it throttle down to the next step, and only when it exceeds ~63°C.


----------



## spyshagg

kithylin said:


> All air cooled 1080 Ti's can only do about 2000~2075 mhz core speed, no matter who makes it or what cooler it has. Some tips for you though: In MSI Afterburner window selected, press Control-F and you can access the voltage curve control screen and move voltages as much as it lets you and you might get 2050 mhz stable or 2075. Use GPU-Z monitoring to monitor what voltages it's actually running at. Push fans to higher and use a custom fan curve. Pascal throttles it's clocks from temperature more than power limits. The cooler you keep it the faster it will go. That's the most you can hope for with an air cooled card. Look more towards ram OC instead of core speed, you're already about maxed out on core speed. All 1080 Ti's can do at least +400 Mhz on ram, some can do +500, most air cooled cards won't do +600 on ram, but some might. +700 mhz ram for 1080 Ti is usually water cooling territory, but maybe you're lucky.


Awesome! thank you



8051 said:


> Kithylin, I installed nearly 100x100mm of 4mm thick thermal tape between the backplate and PCB of my MSI 1080Ti Duke and replaced the fan pack w/a shrouded Noctua AF14 iPPC3000 and a shrouded SanAce 127x38mm fan on the stock heatsink and can maintain 2152 Mhz @ 1.181V. Only on the hottest days of the summer does it throttle down to the next step and only when it exceeds ~63°C.


Mate, how did you get >1.09x volts? On AMD cards it was super easy, even on Vega.


----------



## mouacyk

spyshagg said:


> Mate, how did you get >1.09x volts? In AMD cards it was super easy, even on vega


The XOC BIOS allows up to 1.2v and unlimited power.


----------



## 8051

mouacyk said:


> The XOC BIOS allows up to 1.2v and unlimited power.


Yep, the Asus Strix XOC VBIOS was key to getting the volts I needed to push to 2152 MHz, and additional cooling was required to maintain it.


----------



## Streetdragon

A little stupid question:
1.075V with 2100/6210 MHz is TimeSpy stable.
In TimeSpy Extreme my card downclocks to 2088-2075 MHz. The GPU reaches 32°C max and power consumption reads 180W max (shunt mod).
So no power or thermal throttling.

1.025V at 2050/6210 MHz is totally fine in TimeSpy Extreme, no downclocking or anything.
(Yeah, I know the difference is more or less nothing in performance, but 2100 sounds so much better ^^ First-world nerd problems.)


----------



## spyshagg

mouacyk said:


> The XOC BIOS allows up to 1.2v and unlimited power.





8051 said:


> Yep the Asus Strix XOC VBIOS was key to getting the volts I needed to push to 2152 Mhz. and additional cooling was required to maintain it.


Is it compatible with the Gigabyte Aorus in this picture?
My default bios already enables 150% power limit

https://i.imgur.com/TdYWEnG.png


----------



## kithylin

8051 said:


> Kithylin, I installed nearly 100x100mm of 4mm thick thermal tape between the backplate and PCB of my MSI 1080Ti Duke and replaced the fan pack w/a shrouded Noctua AF14 iPPC3000 and a shrouded SanAce 127x38mm fan on the stock heatsink and can maintain 2152 Mhz @ 1.181V. Only on the hottest days of the summer does it throttle down to the next step and only when it exceeds ~63°C.


You're in the lucky 1% of people that can do that on air; it's not common for most air-cooled cards. Nice card, though.


----------



## GraphicsWhore

After about a year and a half it's time for my blocked EVGA FE to find a new home. My first water-cooled card and I remember putting the block on like it was yesterday. Ran like a beast on the XOC BIOS, both OC'ed to hell and under-volted at 993mV.

I'm going to miss it. And you guys!


----------



## HyperC

spyshagg said:


> mouacyk said:
> 
> 
> 
> The XOC BIOS allows up to 1.2v and unlimited power.
> 
> 
> 
> 
> 
> 
> 8051 said:
> 
> 
> 
> Yep the Asus Strix XOC VBIOS was key to getting the volts I needed to push to 2152 Mhz. and additional cooling was required to maintain it.
> 
> Click to expand...
> 
> Is it compatible with the Gigabyte Aorus in this picture?
> My default bios already enables 150% power limit
> 
> https://i.imgur.com/TdYWEnG.png
Click to expand...

Are you guys overclocking the core speed with the curve? Can you run Superposition 4K so I can see if it's even worth going over 2088? Luckily my RAM does +910.


----------



## fat4l

Hi guys.

I see there's a lot of experienced people here.

Can anyone say for sure what the real difference is between the Strix 1080 Ti and the Strix 1080 Ti OC?

Yes, I know there's a difference in MHz, but does the OC version really have a better/binned core?
I need to watercool it, and I would use the XOC BIOS with no TDP limits. I'm going to bin 8 cards and choose one... the thing is, these are the non-OC model, and I'm not sure if I should expect lower overclocks than with the OC model.

Thanks


----------



## kithylin

fat4l said:


> Hi guys.
> 
> I see theres a lot of experienced people here.
> 
> Can anyone tell for sure what is the real difference in Strix 1080Ti vs Strix 1080Ti OC?
> 
> yes I know theres a difference in MHz but is really the OC version better/binned core ?
> I need to watercool it, I would use XOC bios with no TDP limits. Im going to bin 8 cards and choose one ..... the question is these are NON OC model and I'm not sure if I should expect lower overclocks than with the OC model.
> 
> Thanks


If you're going water cooling, then there shouldn't be any difference between the models. From what I've heard, the only manufacturer binning cores was EVGA with the K|NGP|N model. All other 1080 Ti's use the exact same cores, and the only differences are the air cooler and the power section. Even so, the Founders Edition / reference PCB has enough of a power section to hit 2100 MHz with a custom water block on a very high percentage of 1080 Ti's.

And what do you mean you're planning to bin 8 cards? Return them to the local retailer if they don't hit 2100 MHz? Good luck with that. Most retail stores in the USA won't accept returns on a video card unless it's physically defective and doesn't work at all. It might be different if you're in the EU. Also be warned: Fry's Electronics here in the USA doesn't accept any returns on video cards for any reason, working or not.


----------



## fat4l

kithylin said:


> If you're going water cooling, then you shouldn't have any difference in either model. From what I heard, the only manufacturer that was binning cores for their cards was EVGA with the K|NGP|N model. For everyone else, all other 1080 Ti's use the exact same cores and the only difference is the air cooler or the power section. Of which, the founder's edition / reference PCB has enough power section to hit 2100 mhz with a custom water block on a very high percentage of 1080 Ti's.
> 
> And what do you mean you're planning to bin 8 cards? Return to the local retailer if it doesn't hit 2100 mhz? Good luck with that. Most retail stores in the USA won't accept returns on a video card unless it's physically defective and doesn't work at all. It might be different if you're in the EU. Also be warned: Fry's Electronics here in the USA doesn't accept any returns on video cards for any reason, working or not.


Thanks for the input 

No, my friend has 8 cards which he used for mining, for 2 months only, so no harm done. I'm seeing him this week; I'll bin the cards and hopefully there will be a good one. All Asus Strix 1080 Ti, non-OC.

Then I wonder: if they are not binned, why does Asus bother to label them OC and non-OC if they are the same? Why not sell them all as OC?

Is there anyone with a Strix 1080 Ti non-OC model that got very high, over 2100 MHz?


----------



## mattliston

"binning" is not a singular thing.
Its not done purely for frequency marks.


power, voltage, temperature, power efficiency, memory stability.... Lots to choose from when trying to get similar cards under the same "trim level"


----------



## Zfast4y0u

Experts on Reddit claim the Aorus 1080 Ti Xtreme cards are binned, and I'm under the same impression for the Strix OC models. Is there any official statement on this from the board partners?


----------



## 8051

fat4l said:


> Thanks for the input
> 
> No, my friend has 8 cards which he used to mine with, for 2 months only  So no harm done. I'm seeing him this week, ill bin the cards and hopefully there will be a good one. All Asus Strix 1080Ti, non OC.
> 
> Then I wonder if they are not binned, then why asus bothers to label them as OC, nonOC, if they are the same ? Why not sell them all as OC?
> 
> Is there anyone with STRIX 1080Ti nonOC model that got very high, over 2100MHz ?


I managed to get my MSI 1080 Ti Duke to 2151 MHz. It takes 1.181V, 100x100mm of 4mm thick thermal tape on the backplate, and shrouded 140x25mm and 127x38mm fans on the heatsink to get there.


----------



## mouacyk

spyshagg said:


> Is it compatible with the Gigabyte Aorus in this picture?
> My default bios already enables 150% power limit
> 
> https://i.imgur.com/TdYWEnG.png


It looks like you can, but may lose functionality on one of your HDMI ports:
https://www.overclock.net/forum/69-...ferent-bios-your-1080-ti-33.html#post26113293


----------



## spyshagg

mouacyk said:


> It looks like you can, but may lose functionality on one of your HDMI ports:
> https://www.overclock.net/forum/69-...ferent-bios-your-1080-ti-33.html#post26113293


Thanks man!!


----------



## Streetdragon

I installed the extra PCIe Molex power connector on my motherboard (Rampage V Extreme) and did the shunt mod on the shunt for the PCIe slot.
With 1.093V at 2100/6210 I still downclock 1-3 bins. Power shows only 170W max, core temp 33°C max.

So is everything OK, or is there still a hidden "power limit"? 2050/6210 at 1.025V is fine and runs like a champ through TimeSpy Extreme.
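Worth remembering with a shunt mod: the sensor still assumes the stock shunt resistance, so once a second resistor sits in parallel the reported wattage reads low. A sketch of the arithmetic; the 5 mΩ stock and added values are assumptions for illustration, not measured from any particular card:

```python
# Shunt-mod arithmetic: soldering a resistor in parallel with a sense
# shunt lowers the real resistance, but the controller still assumes
# the stock value, so reported power is scaled down by R_eff / R_stock.
# The 5 mOhm figures are illustrative assumptions, not from a real card.

def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def actual_power(reported_w, r_stock=0.005, r_added=0.005):
    """Estimate true power from the under-reported sensor value."""
    r_eff = parallel(r_stock, r_added)
    return reported_w * (r_stock / r_eff)

print(actual_power(170))  # equal shunts halve the reading -> 340.0 W
```

So a "170W max" readout on an equal-value shunt mod is consistent with the card really pulling on the order of 340W, which would explain why there is no power throttling to be seen.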


----------



## D13mass

Hi, everybody!
I would like to ask about fan speed on the 1080 Ti FE. I have moved back from water cooling to air, and now I'm trying to find the best balance between noise and temps on my video card.

What I have done: changed the curve and overclocked a little bit, 0.95V at 1850 MHz / 11800 MHz. I am pretty happy with the performance, but I am not satisfied with the noise.

I set up the fan this way.

And I usually see 70 degrees as the max temp while playing AC Origins (at 1440p resolution, for example).

----------



## spyshagg

Precision X is not showing me the arrows to choose the voltage curve. Anyone know why?

Also, the card is not idling; the clocks are always those.

Edit: they changed it from previous versions. I have to click on OC Scan.


----------



## kithylin

spyshagg said:


> Precision X is not showing me the arrows to choose voltage curve. Anyone know why?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also the card is not idling. The clocks are always those.


My card (1080 Ti) idles at those exact same clocks if my center monitor is set to a manual 80 Hz overclock in display settings. But if I drop it down to 60 Hz (to match the 60 Hz left and right monitors), it falls back to idle clocks. If you're in a similar situation, that may be the cause for you.


----------



## spyshagg

The voltage curve on Precision X is really buggy. You select the curve you want, click apply, and the graph applies points you didn't choose.


----------



## kithylin

spyshagg said:


> the voltage curve on precision X is really buggy. You select the curve you want, click apply and the graph applies squares you didn't chose.


Use MSI Afterburner; it's a lot better. Make sure you completely uninstall all traces of Precision X first. You'll get system crashes if you have both installed on your computer, even if you only actually run one at a time.


----------



## BLUuuE

kithylin said:


> My card idles at the exact same clocks (1080 Ti) if I have my center monitor at 80hz manual monitor overclock set in display settings. But if I manually drop it down to 60hz (to match the 60hz left and 60hz right) monitors, then it drops down to idle clocks. If you're in a similar situation, that may be the cause of it for you.


You can use Nvidia Inspector and enable Multi Display Power Saver. That allowed my core and memory clocks to drop to 608/810MHz.


----------



## kithylin

BLUuuE said:


> You can use Nvidia Inspector and enable Multi Display Power Saver. That allowed my core and memory clocks to drop to 608/810MHz.


That's still not the actual idle clocks. When I set my center screen to 60 Hz like the other two, it drops all the way down to 139 MHz core and 405 MHz memory.


----------



## ThrashZone

D13mass said:


> HI, everybody!
> I would like to ask you about Fan speed on 1080Ti FE, I have moved back from watercooling to air and now I am trying to find best balance between noise and temps on my videocard.
> 
> What I have done: change curve and overclock little bit - 0.95V 1850 MHz/11800 MHz I am pretty happy with performance, but I am not satisfied with noise.
> 
> I setup fan in this way
> 
> And usually, I have 70 Degrees as max temp during playing in AC Origin (1440p resolution for example).


Hi,
Yeah, it does that if you double-click on a spot.
The only thing to do is double-click again on the top-speed dot.
Yes, buggy/annoying, and a few other descriptions I can't really print.


----------



## KedarWolf

If your 1080 Ti has a DVI connector like the Gainward card does, this Gainward BIOS should keep it enabled, and it has the same 350W / 140% power limit as the Palit BIOS.

It clocks about the same in TimeSpy as the Palit BIOS, 10503 vs 10507 on the Palit, which is within the margin of error.


----------



## The Pook

D13mass said:


> HI, everybody!
> I would like to ask you about Fan speed on 1080Ti FE, I have moved back from watercooling to air and now I am trying to find best balance between noise and temps on my videocard.
> 
> What I have done: change curve and overclock little bit - 0.95V 1850 MHz/11800 MHz I am pretty happy with performance, but I am not satisfied with noise.
> 
> I setup fan in this way
> 
> And usually, I have 70 Degrees as max temp during playing in AC Origin (1440p resolution for example).



This is the fan profile I used for my 1060 and kept for my 1080 Ti. 










Recently I've been setting the fan speed to 80% before I load up a game, since it's quieter than my 1060 at 60% and it keeps the card under 60°C.


----------



## fat4l

So guys, I just went through 8 Asus Strix 1080 Ti cards, non-OC.
I was testing max OC in 3DMark Extreme to find the maximum possible boost/overclock.
Temps were around 40-45°C.
TDP and temp limits and fan speed were all maxed in Afterburner.
Core voltage was stock, around 1.062 max.

4 cards could pass with a boost of 2063 MHz.
2 cards could pass with a boost of 2075 MHz.
1 card could pass with a boost of 2088 MHz.
1 card could pass with a boost of 2126 MHz.

So I picked the best one, the 2126 MHz card, and I will be putting it under water with the XOC BIOS.


----------



## mattliston

I would set all the cards back to 2063 MHz and see what kind of memory overclock you can get.


Keep in mind, just because a card can run 12 GHz memory stable doesn't mean it's running at peak performance; error correction exists and is VERY tolerant on the 10-series GDDR5/X cards.


My personal 1080 Ti likes running around a +375 MHz offset for best performance, but it can do over a +500 MHz offset quite easily for the big 12 GHz.
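The point above, that GDDR5X error correction can make a high offset "stable" but secretly slower, suggests binning memory offsets by measured benchmark score rather than by the biggest number that doesn't crash. A tiny sketch of that selection step; the offset/score pairs are invented for illustration:

```python
# Sketch of picking a memory offset by measured score instead of by
# maximum stability: GDDR5X error correction silently retries at high
# offsets, so the fastest setting may not be the highest stable one.
# The offset -> score pairs below are invented for illustration.

def best_offset(results):
    """results: dict mapping mem offset (MHz) -> benchmark score."""
    return max(results, key=results.get)

scores = {300: 10410, 375: 10507, 450: 10480, 500: 10390}
print(best_offset(scores))  # highest score wins -> 375
```

In practice that means running the same benchmark at each offset step and keeping the one with the best score, even if the card would happily "run" hundreds of MHz higher.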


----------



## fat4l

I didn't really have time for that, as I wasn't testing at my own place, and to be honest I'm aiming for the max core OC so that the water cooling makes sense and is worth it.


----------



## mattxx88

mouacyk said:


> The XOC BIOS allows up to 1.2v and unlimited power.


Are you sure about this? I can see the unlimited power on my EVGA SC2, but my voltage is capped at 1.1V.


----------



## BLUuuE

mattxx88 said:


> Are you sure ab this? cause i can see the unlimited power on my evga sc2, but my voltage is capped at 1.1v


Yeah, you gotta set it with the voltage frequency curve.


----------



## mouacyk

BLUuuE said:


> Yeah, you gotta set it with the voltage frequency curve.


*thumb*


----------



## mattxx88

Thanks, I'll give it a try now!


----------



## 6u4rdi4n

I'll start by saying that I know I might be in the wrong thread, but these cards are really similar and there's a lot more activity here than in the 1080 thread 


So I made a boo-boo with my GTX 1080... I was installing a water block on it and somehow managed to damage an inductor. The card still seems to work as it should, but it now makes a horrible coil whine. I've done a decent amount of soldering work, so I was thinking about trying to replace the damaged inductor. The problem is that I have no idea about the specifications. I believe it's a 1R0 (only "R0" is visible, and the knocked-off pieces are gone), and as precisely as I could measure, it's 8x8x4mm. The 1080 in question is an Inno3D GTX 1080 X2, also known as the HerculeZ Twin X2. From what I could gather, it's also revision 2 (8+6-pin power instead of only 8-pin, some additional caps, and, from what I can tell, the inductor I damaged).

Hoping there's someone here who can help me find a replacement in the component jungle out there 

Here's a picture of the damage:


----------



## kithylin

mattxx88 said:


> thanks, now i give it a try!


Left-click on the MSI Afterburner window so it's in focus and then press control-F to bring up the curve screen.



6u4rdi4n said:


> I'll start by saying that I know I might be in the wrong thread, but these cards are really similar and there's a lot more activity here than in the 1080 thread
> 
> 
> So I made a boo-boo with my GTX 1080... I was installing a water block on it and somehow I managed to do some damage to an inductor. The card still seems to be working as it should, but it now makes a horrible coil whine. I've done a decent amount soldering work, so I was thinking about trying to replace the damaged inductor. Problem is that I have no idea about the specifications. I believe it's a 1R0 (only R0 visible and the knocked off pieces are gone) and as precise as I could, I measured it to being 8x8x4mm. The 1080 in question is an Inno3D GTX 1080 X2, also known as HerculeZ Twin X2. From what I could gather, it's also revision 2. (8+6pin power instead of only 8 pin, some additional caps and from what I can tell, the inductor I damaged)
> 
> Hoping there's someone here who can help me finding a replacement in the component jungle out there
> 
> Here's a picture of the damage:


Can you give us the exact model # of the card? I have some ideas for you.


----------



## 6u4rdi4n

kithylin said:


> Can you give us the exact model # of the card? I have some ideas for you.


I'll attach a picture of the sticker on the backplate. Any of these help?


----------



## kithylin

6u4rdi4n said:


> I'll attach a picture of the sticker on the backplate. Any of these help?


That's what I needed. Now, I've looked it up and there are two versions of your card.

The N1080-1SDN-P6DN V1.0 uses NVIDIA Reference GTX 1080 PCB. A PCB shot of that is here: https://www.ekwb.com/custom-loop-configurator/upload/pictures/GTX-1080_reference_93350.jpg

The N1080-1SDN-P6DN V2.0 is a custom PCB design by Inno3D, not-reference. A PCB shot of that one is here: https://www.ekwb.com/custom-loop-co...-8GB-GDDR5X-(N1080-1SDN-P6DN)-rev-2.0-PCB.jpg

Hopefully one of these will help you with the information you need.

EDIT: Neither of these seems to have that component by the fan header like in your picture though, so.. I have no idea what to think about that. Perhaps EKWB has the wrong PCB shots for V2.0, but I highly doubt that; they're generally spot-on for that sort of thing.

EDIT #2: I looked a little further and here's a PCB shot of the Inno3D GTX 1070 Ti, which looks much closer in that corner to your original photo. I think you actually have a 1070 Ti and not a 1080. Maybe this will help: https://www.ekwb.com/custom-loop-co...,-DVI,-HDMI,-3x-DP-(C107T3-1SDN-P5DN)-pcb.jpg


----------



## Zfast4y0u

Not sure why he didn't reassemble the housing from the debris and see what's written on top of it; shouldn't be hard. Probably too late for that now -.-


----------



## 6u4rdi4n

kithylin said:


> That's what I needed. Now, I've looked it up and there are two versions of your card.
> 
> The N1080-1SDN-P6DN V1.0 uses NVIDIA Reference GTX 1080 PCB. A PCB shot of that is here: https://www.ekwb.com/custom-loop-configurator/upload/pictures/GTX-1080_reference_93350.jpg
> 
> The N1080-1SDN-P6DN V2.0 is a custom PCB design by Inno3D, not-reference. A PCB shot of that one is here: https://www.ekwb.com/custom-loop-co...-8GB-GDDR5X-(N1080-1SDN-P6DN)-rev-2.0-PCB.jpg
> 
> Hopefully one of these will help you with the information you need.
> 
> EDIT: Neither of these seem to have that component by the fan header like in your picture though, so.. I have no idea what to think on that. Perhaps EKWeb has the wrong PCB shots for V2.0, but I highly doubt that, they generally are spot-on for that sort of thing.
> 
> EDIT #2: I looked a little further and here's a PCB shot of the Inno3D GTX 1070 Ti, which looks much closer in that corner to your original photo. I think you actually have a 1070 Ti and not a 1080. Maybe this will help: https://www.ekwb.com/custom-loop-co...,-DVI,-HDMI,-3x-DP-(C107T3-1SDN-P5DN)-pcb.jpg


That PCB picture of the V2 on EKWB's site is wrong; that's an iChill X3 card. And if you look at the 1070 Ti, it has a lot fewer caps than my card. My card has the same caps as the "v2" from EKWB's site, but mine also has an additional cap by the power connectors and one more towards the back near the middle. Not to forget that my card has 8+6 pin power.

But size wise and stuff, that inductor on the 1070 Ti looks mighty similar. 

And I can confirm that it's 1R0, or 1µH. Found a picture from before I broke it.



Zfast4y0u said:


> not sure why he didnt recreate the housing from debries and see whats written on top of it, shouldnt be hard. probably too late for that now -.-


That wouldn't be of much help, to be honest. The only thing written on it is its inductance. There are at least two more specifications that need to be considered.


----------



## ThrashZone

Hi,
Well, it was said that if it has 8+6 power plugs it's a FE (FE meaning Founders Edition) card.
A picture of the card's entire board might have been nice too.
Just add some hot glue over it to seal it.


----------



## 6u4rdi4n

ThrashZone said:


> Hi,
> Well it was said the if it wasn't a FE card, FE meaning founders edition yes 8-6 power plugs is a FE card.
> Also an entire picture of the cards board might of been nice too.
> Just add some hot glue over it to seal it.


1080 Ti FE has 8+6 pin, but the 1080 FE only has a single 8 pin.

Already tried hot glue. Even tried nail polish to get in between the windings, but it still whines like a b.

Need to drain my loop again to get the card out and disassemble it again. I'll see if I can get around to it.


----------



## ThrashZone

Hi,
Wow, even my EVGA 980 Hybrid had 8+6 pin power, and it was a FE.
Can't imagine a 1080 with just a single 8 pin.


----------



## 6u4rdi4n

I thought the 980 reference had a 6+6 pin config?


----------



## Nervoize

I turned my 1080Ti into a heater for the upcoming winter


----------



## Zfast4y0u

Nervoize said:


> I turned my 1080Ti into a heater for the upcoming winter


I guess that's with the XOC BIOS? Reminds me of Vega 56/64, lol. With a custom BIOS they pull 600 W (and still perform like junk).


----------



## kithylin

Nervoize said:


> I turned my 1080Ti into a heater for the upcoming winter


Whatever in the world have you run to cause this to happen? I've never seen my 1080 Ti go above 384 watts here. I also use XOC Bios and 1.200v too.


----------



## Nervoize

Zfast4y0u said:


> i guess thats with XOC bios? reminds me on vega 56/64 lol. with costume bios they pull 600w ( and still perform like junk ).





kithylin said:


> Whatever in the world have you run to cause this to happen? I've never seen my 1080 Ti go above 384 watts here. I also use XOC Bios and 1.200v too.


Yep, XOC bios. I was just running the EVGA Precision X1 test from their OC scanner.
I set my core to 2000 mhz and voltage to 1.000v right now because there is not really any big difference besides it consumes only up to 290 watts now.


----------



## Zfast4y0u

Nervoize said:


> Yep, XOC bios. I was just running the EVGA Precision X1 test from their OC scanner.
> I set my core to 2000 mhz and voltage to 1.000v right now because there is not really any big difference besides it consumes only up to 290 watts now.


Can you lock its boost to 1700 MHz and see what the max power is then? Just for testing.


----------



## Outcasst

Anyone have a FE that doesn't have awful coil whine? I'm on my 3rd RMA and I'm starting to believe it might be a PSU issue. I've tried it on an EVGA 650 P2 and also an EVGA 430 W1, but they both have whine.

I was planning to put the card under water but I'm not sure there's much point if it screams as loud as a fan would.


----------



## Nervoize

Zfast4y0u said:


> can u lock its boost to 1700mhz and see whats max power then? just for testing.


Approx. 200 watts.


----------



## DixonCider

This looks like the right place to ask this, let me know if I'm wrong.

A little background: I've got two PNY 1080 TI FE that I've had since the day they were released. I built an epic water cooled rig at the time, and have spent the last (almost two years I guess?) without any issues, until I became the victim of the ASUS x99 problem where my Rampage V Edition 10 decided to fry itself and my i7 6950x. Both of my 1080 TIs were still good, so I've RMA'd the MB and the CPU. New motherboard took ASUS a month to get to me, and Intel shipped my new CPU the day they received the bad one, I'll be receiving that tomorrow. In the meantime, I've been using my HTPC that I built from the 8086K that I won in the Intel giveaway. 

I was tired of my GPUs hitting the power limit all the time while maxing out in the high 30s temp-wise (I have two EK 480x60 rads and two EK 240x28 rads), so I decided to do the "extra resistor" shunt mod on my GPUs. I soldered an extra 5 mΩ resistor on top of each of the two 5 mΩ shunt resistors closest to the 8 and 6 pin power connectors, and my GPU still worked... so that felt good. However, when running Time Spy and using Afterburner to OSD the power limit / power % / GPU core freq, I still see the card hitting 120% power and being power limited when the chip is hitting 2012 MHz. What gives?

I've looked around online and haven't found too much info on exactly how to solder the extra 5 mΩ resistors. There are a few articles describing it, but my confusion is exactly where to put the solder. There are two pads on either end of the original resistor. Is it a problem if the solder directly bridges those two pads together (the small one and the big one)? It doesn't seem like it should be an issue, since the end of the resistor should be electrically conductive between them anyway...

Have I done something wrong? Do I need to mod the last resistor for the PCI-E power? Are the numbers in Afterburner supposed to show the "fake reduced" power that I was expecting? Is it possible that because I'm running on the FE air cooler (for now) that not having the epic cooling is limiting me? One would think that my power usage would show lower, and I would be temp limited...

I haven't read everything in this thread, so I'm not sure if others have successfully completed this mod. I'll keep reading now, but please let me know if you have any experience with this and what I may be doing wrong.

Thanks!
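For what it's worth, the arithmetic behind the stacked-shunt mod can be sketched like this. This assumes the stock current-sense shunts really are 5 mΩ and that the controller infers rail power from the voltage drop across them; treat it as an illustration, not a statement about this specific PCB:

```python
# Sketch of the stacked-shunt power-mod arithmetic.
# Assumption: stock shunts are 5 mOhm and the controller reads
# rail power from the sense voltage V = I * R across them.

def parallel(r1_ohm, r2_ohm):
    """Equivalent resistance of two resistors in parallel."""
    return (r1_ohm * r2_ohm) / (r1_ohm + r2_ohm)

stock = 0.005                      # 5 mOhm original shunt
modded = parallel(stock, 0.005)    # extra 5 mOhm stacked on top -> 2.5 mOhm

# The sense voltage (and thus the reported power) scales with the shunt
# value, so the controller now sees only half the real draw on that rail.
scale = modded / stock             # 0.5

print(f"effective shunt: {modded * 1000:.2f} mOhm")
print(f"reported power is {scale:.0%} of actual on the modded rails")
```

If the mod took, a rail that actually draws 200 W should report only ~100 W, so still seeing 120% on the OSD suggests either a joint isn't making contact or an unmodded shunt (e.g. the PCIe-slot rail) is the one capping.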


----------



## kithylin

Nervoize said:


> Yep, XOC bios. I was just running the EVGA Precision X1 test from their OC scanner.
> I set my core to 2000 mhz and voltage to 1.000v right now because there is not really any big difference besides it consumes only up to 290 watts now.


There's your problem.. that test is terrible and very similar to Furmark. It serves no functional purpose other than to cause the maximum power draw possible on your card, and nothing else. I would suggest not running it with XOC on.. as you saw, excessive power draw for no reason. If I were you I would be seriously concerned about melting the card's power section with 600 watts of draw. That test with XOC could very realistically kill your card in seconds. There was a documented history of the original Furmark killing both AMD and Nvidia cards for a while, and Nvidia and AMD both started throttling cards via drivers when they detected people running Furmark because of it. I'm quite surprised they even allowed the cards to hit full power under the EVGA test.

Under "normal operating conditions", that is, gaming or Firestrike or any of the 3DMark tests, my 1080 Ti @ 2100 MHz @ 1.200v with XOC never touches or exceeds 400 watts. At most it's only ever 350-384 watts. The cards should never be drawing anywhere near the kind of power you saw, for any reason.


----------



## Nervoize

kithylin said:


> There's your problem.. that test is terrible and very similar to furmark. It serves no functional purpose other than to cause the maximum power draw possible on your card and nothing else. I would suggest not running it with XOC on.. as you saw, excessive power draw for no reason. If I were you I would be utterly frightened and seriously concerned about melting the card's power section with 600 watts power draw. That test with XOC could very realistically kill your card in seconds. There was a documented history of the original Furmark killing cards both AMD and Nvidia for a while and Nvidia and AMD both started throttling cards via drivers when they detected people running Furmark because of it. I'm a quite surprised they allowed the cards to hit full power under the EVGA test even.
> 
> Under "Normal operating conditions", that is, either gaming or firestrike or any of the 3dmark tests, my 1080 Ti @ 2100 mhz @ 1.200v with XOC never even touches or exceeds 400 watts. At most it's only ever 350~384 watts max. The cards should never be drawing any where near that kind of power you saw for any reason.


I know, I just did it to measure the max power draw, and it was only for about a minute. I currently have my GPU running at 2000 MHz at 1.000v for 24/7 use; this draws about 250-300 watts max. Benching I mostly hit 2151-2176 MHz at 1.200v, depending on the overclock and benchmark.


----------



## Zfast4y0u

Nervoize said:


> Approx. 200 watts.


Ya, it's good, maybe even the sweet spot; with tweaked voltage it's under 200 W.


----------



## Nervoize

Zfast4y0u said:


> ya its good, maybe even sweet spot, with tweaked voltage its under 200w.


Why would you run the card at that speed if it automatically boosts up to a certain speed (1800-2000) without tweaking?


----------



## Zfast4y0u

Nervoize said:


> Why would you run the card on that speed if it automatically boosts up to a certain speed (1800-2000) without tweaking?





You don't have to, it's just a comparison. Anyway, I think the performance loss is about 10 fps between 1700 MHz and 2000 MHz in games, but the difference in temps and power draw is nice (I wanted to see the difference between the XOC BIOS and the stock BIOS, that's why I asked you to do the little test  ). For example, I tested games at 1700 MHz @ 0.800 V, and even that voltage is a bit high for those clocks, but you can't drop it below 0.800 V on the MSI Afterburner curve. The max power draw I saw was 140 W, which is really damn nice considering you sacrifice 10 fps for it.
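As a rough sanity check, dynamic power in CMOS scales roughly with frequency times voltage squared, so you can estimate the undervolting savings from the numbers already posted in this thread (290 W at 2000 MHz / 1.000 V). This is a first-order model that ignores leakage and VRAM power, so treat the result as a ballpark only:

```python
# First-order CMOS dynamic-power estimate: P ~ f * V^2.
# Baseline numbers come from the posts above; leakage and memory
# power are ignored, so this only gives a ballpark figure.

def scale_power(p_watts, f_old_mhz, v_old, f_new_mhz, v_new):
    return p_watts * (f_new_mhz / f_old_mhz) * (v_new / v_old) ** 2

est = scale_power(290, 2000, 1.000, 1700, 0.800)
print(f"estimated draw at 1700 MHz / 0.800 V: {est:.0f} W")  # ~158 W
```

That lands in the same ballpark as the ~140 W observed above; the real card doing a bit better than the model is plausible given the simplifications.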


----------



## 6u4rdi4n

Never found out the exact specs for the broken inductor on my Inno3D GTX 1080 Twin X2, but I believe I found one that's as close as I can get: https://www.digikey.com/product-detail/en/bourns-inc/SRN8040-1R0Y/SRN8040-1R0YCT-ND/2756172 It will probably arrive on Tuesday or Wednesday along with new tips for my soldering iron. Guess we'll find out if it works soon enough, lol.


----------



## chibi

6u4rdi4n said:


> Never found out the exact specs for the broken inductor on my Inno3D GTX 1080 Twin X2, but I believe I found one that's as close as I can get: https://www.digikey.com/product-detail/en/bourns-inc/SRN8040-1R0Y/SRN8040-1R0YCT-ND/2756172 Will probably arrive on tuesday or wednesday along with new soldering tips for my soldering iron. Guess we'll find out if it works soon enough, lol.



Have you thought about applying some hot glue to the inductor? I personally would try that first and see if the coil whine subsides before busting out the soldering iron. Good luck either way!


----------



## amd955be5670

So, a lot of stuff happened. Initially the FE was 57k INR in India, now marked up to 64.8k. The 20-series prices also went up, so I went with the flow and got a 1080 Ti MSI Gaming X for 66k after discounts instead. It is due to arrive tomorrow.

I recently went from a 24" 1080P to 32" 1440P and it was frustrating playing with the same skyrim mods at 23fps 

I had a good time bios modding my 970, locking it to 1519mhz at full load. I imagine this is no longer possible with 10 series?

Also anything extra I should know, whilst eagerly awaiting the deliveryman?


----------



## mouacyk

amd955be5670 said:


> So, a lot of stuff happened. Initially the FE was 57k INR in India, now upmarked to 64.8k. The 20 series prices also went up, so I went with the flow and got a 1080Ti MSI Gaming X for 66k after discounts instead. It is due to arrvie tomorrow.
> 
> I recently went from a 24" 1080P to 32" 1440P and it was frustrating playing with the same skyrim mods at 23fps
> 
> I had a good time bios modding my 970, locking it to 1519mhz at full load. I imagine this is no longer possible with 10 series?
> 
> Also anything extra I should know, whilst eagerly awaiting the deliveryman?


You cannot flash modified BIOSes on 10-series GPUs (only stock BIOSes from other AIB cards). However, some AIB BIOSes have really high power targets (350 W+) that will eliminate most power throttling. Then there is the ultimate XOC BIOS, which eliminates ALL power throttling and is compatible with the MSI Gaming X (I have one under a water block).


----------



## kithylin

amd955be5670 said:


> So, a lot of stuff happened. Initially the FE was 57k INR in India, now upmarked to 64.8k. The 20 series prices also went up, so I went with the flow and got a 1080Ti MSI Gaming X for 66k after discounts instead. It is due to arrvie tomorrow.
> 
> I recently went from a 24" 1080P to 32" 1440P and it was frustrating playing with the same skyrim mods at 23fps
> 
> I had a good time bios modding my 970, locking it to 1519mhz at full load. I imagine this is no longer possible with 10 series?
> 
> Also anything extra I should know, whilst eagerly awaiting the deliveryman?


Correct. No modifying the BIOS / editing it / making our own for the 10-series or the 20-series yet. And I hope you got one with a big aftermarket cooler. The cooler you can run the 10-series, the faster they are. With the new Boost 3.0 the cards actually self-overclock on their own if kept cool enough, even if you never touch anything. It's not uncommon to see most GTX 1080 / GTX 1080 Ti's running at least +150 to +200 MHz core speed above the maximum boost speed listed on the box, all on their own. What I would suggest, however, is to grab Afterburner and not manually overclock anything. Just push the power % and temp target sliders to max, consider setting the fans to about 60-65% manually, and let it do its thing. That will give Boost 3.0 more headroom to work with. Most 1080s and 1080 Tis with the sliders pushed and a little more fan % should easily hit 1800 MHz and might boost to 1850-1900 MHz on their own.


----------



## 8051

mouacyk said:


> Cannot flash modified BIOS (only custom AIB versions) on 10 series GPUs. However, there exists some AIB ones with really high power targets (350W+) that will eliminate most power throttling. Then there is the ultimate XOC BIOS that will eliminate ALL power throttling, which is compatible with the MSI Gaming X (I have one under a water block).


I flashed the Asus Strix XOC VBIOS to my MSI Duke 1080 Ti, and before I replaced the stock fan pack with shrouded 140x25mm and 127x38mm fans, I still saw PWR perfcaps being thrown at temps above ~63°C, which resulted in throttling.


----------



## mouacyk

8051 said:


> I flashed the Asus Strix XOC VBIOS to my MSI Duke 1080Ti and before I replaced the stock fan pack w/shrouded 140x25mm and 127x38mm fans still saw PWR perfcaps being thrown at temps > ~63°C which resulted in throttling.


Interesting... it's possible that the power limit lifting is still temperature-limited on the XOC BIOS. My card never surpasses 40C, so I haven't experienced it yet.


----------



## 6u4rdi4n

chibi said:


> Have you thought about applying some hot glue to the inductor? I personally would try that first and see if the coil whine subsides before busting out the soldering iron. Good luck either way!


Tried that. Even tried nail polish to really get in there between the windings


----------



## kithylin

8051 said:


> I flashed the Asus Strix XOC VBIOS to my MSI Duke 1080Ti and before I replaced the stock fan pack w/shrouded 140x25mm and 127x38mm fans still saw PWR perfcaps being thrown at temps > ~63°C which resulted in throttling.


Yes, because the XOC bios is an LN2 (Liquid Nitrogen) bios and is intended only for water cooled cards and LN2 competitions. It's not supposed to ever be used on air cooled cards. Or if you do it's supposed to be < 60c at all times.


----------



## KedarWolf

My GPU was bugged in my earlier Time Spy tests. It was only running at x4; now it's at x16 after I cleaned the connector contacts with 99% rubbing alcohol.

With only options in Nvidia Control Panel changed being Maximum Performance, Prerendered Frames at 1, Shader Cache off, Anisotropic Sample Optimization On, and V-Sync off with the Palit BIOS I'm now getting this.

https://www.3dmark.com/3dm/29901822


----------



## 8051

kithylin said:


> Yes, because the XOC bios is an LN2 (Liquid Nitrogen) bios and is intended only for water cooled cards and LN2 competitions. It's not supposed to ever be used on air cooled cards. Or if you do it's supposed to be < 60c at all times.


But isn't the Asus Strix XOC 1080Ti an air-cooled card?

I'm not sure it was the GPU core temp that caused the PWR perfcaps; I don't hit them anymore, even at 64°C GPU core temps. I have nearly 100x100 mm of 4 mm thick thermal tape connecting my backplate to the back of the GPU and a powerful 127x38mm shrouded fan blowing on the VRM circuitry. I'm thinking the cooling of the power circuitry is what made the difference, because I could never hit 2139 MHz previously.


----------



## RoBiK

8051 said:


> But isn't the Asus Strix XOC 1080Ti an air-cooled card?


There is no such card. There is an Asus Strix OC 1080 Ti card, and then there is the XOC BIOS - two completely different things.


----------



## kithylin

RoBiK said:


> There is no such card. There is a Asus Strix OC 1080Ti card, and then there is the XOC BIOS - two completely different things.


And originally the XOC BIOS was something Asus Strix OC 1080 Ti owners could request from Asus; it was given out on-request-only and intended for use in LN2 / liquid nitrogen competitions. It's not intended for Asus Strix OC 1080 Ti owners to use even on their card's stock air cooling. We just found out we could cross-flash it to other cards. But it doesn't do well above 60°C, and most 1080 Ti users can't keep it below 60°C with air coolers.


----------



## jdcchora1533

Hello guys,

Currently I'm writing a guide for overclocking a Gigabyte GTX 1080 Ti Gaming OC 11GB.

I've already modded the whole card and will post it soon.

To finish my guide, I want to know: what's the best BIOS for it?

I've got the GPU watercooled and the VRAM fitted with heatsinks.

Thanks in advance


----------



## KedarWolf

KedarWolf said:


> MY GPU was bugged in my earlier Time Spy tests. It was only running at 4X, now it's at 16X, I cleaned the connector contacts with 99% rubbing alcohol.
> 
> With only options in Nvidia Control Panel changed being Maximum Performance, Prerendered Frames at 1, Shader Cache off, Anisotropic Sample Optimization On, and V-Sync off with the Palit BIOS I'm now getting this.
> 
> 8700k, Z370 Maximus X Formula motherboard.
> 
> https://www.3dmark.com/3dm/29901822


9900k, Z370 Maximus X Formula motherboard, slightly better graphics score, a huge jump in CPU score. 

https://www.3dmark.com/3dm/29916352


----------



## 8051

RoBiK said:


> There is no such card. There is a Asus Strix OC 1080Ti card, and then there is the XOC BIOS - two completely different things.


Where do you get the XOC VBIOS then? Techpowerup only has the Strix OC VBIOS (330 Watt maximum).


----------



## kithylin

8051 said:


> Where do you get the XOC VBIOS then? Techpowerup only has the Strix OC VBIOS (330 Watt maximum).


You have to email or call Asus support, ask for it directly, and I think sign a disclaimer and agree to void your factory warranty before they give it to you.


----------



## Zfast4y0u

kithylin said:


> You have to email or call Asus support, ask for it directly, and I think sign a disclaimer and agree to void your factory warranty before they give it to you.


I thought the XOC BIOS was out in the wild by now. I mean, what prevents us from sharing it among ourselves? How would they know who shared it?


----------



## BLUuuE

8051 said:


> Where do you get the XOC VBIOS then? Techpowerup only has the Strix OC VBIOS (330 Watt maximum).


Download the BIOS collection in the OP or go to der8auer's guide.


----------



## kithylin

Zfast4y0u said:


> i thought XOC is in open wild by now, i mean what prevents us sharing it among us, how would they know who shared it?


It is out here in the wild. I thought they were asking how we originally acquired it. There's also a separate thread dedicated to it. https://www.overclock.net/forum/69-nvidia/1627212-how-flash-different-bios-your-1080-ti.html


----------



## Lefty23

This is where it first became available as far as I know:
https://community.hwbot.org/topic/168085-ultimate-asus-1080ti-rog-strix-acln2-tips/

Unfortunately, after the migration of the hwbot forums the pics got lost.
Luckily der8auer's guide posted above has all the pics though


----------



## Vlada011

New owner of GTX1080Ti.


----------



## feznz

KedarWolf said:


> 9900k, Z370 Maximus X Formula motherboard, slightly better graphics score, a huge jump in CPU score.
> 
> https://www.3dmark.com/3dm/29916352
> 
> 
> 
> Spoiler



A bit better than my best run. I only ever managed to crack the 10K mark once, and that was running an 8600K @ 5.5 GHz. Also, my GPU is only running @ x8.


https://www.3dmark.com/3dm/25758649


----------



## Zfast4y0u

Vlada011 said:


> New owner of GTX1080Ti.


Nice. Test its temps under water (with fans on as well); I'm interested to see how it performs vs EK blocks.


----------



## kithylin

Zfast4y0u said:


> nice, test its temps under water ( with fans on also ), im intrested to see how to performs vs ek blocks


----------



## jdcchora1533

8051 said:


> I've been using the XOC VBIOS on my air cooled Gigabyte Gaming OC 1080Ti for months now, but I replaced the fans with more powerful shrouded versions (incl. one 6000 RPM 92x38mm vaneaxial fan blowing right over the VRM's) and it's barely enough. Dying Light is the only game where I have to deliberately set a lower core clock.
> 
> I'm hoping an Arctic Cooling Accelero Xtreme III with more powerful fans will do better.





Hello, what clocks are you using?


Did you flash the XOC BIOS without any problems?


Thanks


----------



## redeka

Hey guys. My apologies for my lack of contribution to this community. With that being said, I was hoping to step into your circle and ask a couple of questions. I recently found someone trying to get rid of two "as is" cards. One is an R90X that lights up and spools up when started, but won't be recognized by the gentleman's system. The 1080 Ti is said to have a burnt-out (or several burnt-out) R22 components. The guy _says_ he is not very technologically inclined and that's why he'd rather get rid of them for ~$200 USD.

https://www.overclock.net/forum/69-nvidia/1624521-official-nvidia-gtx-1080-ti-owner-s-club-1020.html

There is mention of these R22 components in the link above, but I am looking to find out A. how much they cost, and B. how hard it is to solder all the components off and on. I've included some hi-res photos for reference.

Thanks!


----------



## Zfast4y0u

That 1080 Ti is in terrible condition, all the RAM modules melted and not only them. I wouldn't take it even if he was giving it to me for free. Holy crap, how some people treat their hardware. If you want to buy a 1080 Ti, better get yourself a used one in WORKING condition. Who knows what else died too, maybe the core itself.


----------



## kithylin

redeka said:


> Hey guys. My apologies for my lack of contribution to this community. With that being said, though, I was hoping to put my foot into your guys' circle as ask a couple of questions. I recently found someone trying to get rid of 2 "as is" cards. One is a R90X that lights up and spools up when started, but won't be recognized by the gentlemen's system. The 1080 TI is said to have a burnt out (or several burnt out) r22 components. The guy _says_ he is not very technologically inclinded and thats why he rather get rid of them for ~$200 USD.
> 
> There is mention of these r22 components in the link above, but I am looking to find out A. How much they are. And B. How hard it is to solder all the components off and on. I've included some hi res photos for reference.
> 
> <snip>
> 
> Thanks!


And people over on the steam forums still try to say I'm crazy when I tell them coin mining kills video cards. That poor 1080 Ti's been burnt crispy. It's well beyond hope of repair. It should just be tossed to the recycler/trash.


----------



## mattliston

I don't see any melted RAM modules. Looks like greasiness from the thermal pads or fingers.


Looks like a fixable card if the core is still okay. Not worth over $0 in my book though, just my opinion. It would be a fun experiment to revive it if you get it for free.


----------



## bmgjet

So it's coming into summer where I live and I want to keep my winter overclock of 2100-2088 MHz, which gives 30-38°C core temps.
Last summer core temps were in the 40-50°C range, since ambient temperature gets up to 32°C here, whereas in winter it's 20°C.

My thought is to set up a small chiller before the card's block. Has anyone done anything like this? The plan isn't to go sub-zero or even below the dew point, but just to keep what I can clock to during winter.

Here's a quick sketch of it:


----------



## Streetdragon

Looks good so far. But if you get the water temp below ambient, you can get problems with condensation, so be aware of that.

Can you post how you built the controller? I want something like that too, just to support my water loop like this:

Only rad cooling until the delta-T between water and ambient reaches 2°, then start the Peltier to hold that or lower the delta-T to 0-0.5°, and then stop the Peltier.
Would be nice^^
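The start/stop logic described above is just a hysteresis loop. A minimal sketch with the thresholds from the post; how you actually read the temperatures and drive the relay is hardware-specific and left out:

```python
# Minimal hysteresis controller for the Peltier/chiller idea above.
# Start assisting when water is 2.0 C above ambient, stop once the
# delta is back down to 0.5 C.

START_DELTA_C = 2.0
STOP_DELTA_C = 0.5

def peltier_should_run(water_c, ambient_c, currently_running):
    delta = water_c - ambient_c
    if not currently_running and delta >= START_DELTA_C:
        return True   # water has drifted too far above ambient
    if currently_running and delta <= STOP_DELTA_C:
        return False  # target reached, let the rads take over again
    return currently_running  # inside the hysteresis band: no change
```

The dead band between 0.5° and 2.0° is what keeps the relay from chattering on and off.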


----------



## bmgjet

Just an eBay USB relay.
https://www.ebay.com/itm/1-2-4-8-Ch...cc07ee962:m:mlg_RwObhC-7ozedu3cOnAQ:rk:1:pf:0

It has source code provided, so I'll be incorporating that into my overclocking software, which already overclocks based on core temp.
Yup, I'm aware of condensation. I was considering polling the online weather and humidity readings for my area in case the cooling is stronger than I expect.
Usually it's around 30-60% humidity here, so I have a little bit of headroom to the dew point.
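Since the whole plan hinges on staying above the dew point, it can be estimated from temperature and relative humidity with the Magnus approximation. The weather-polling part is left out; this is just the math:

```python
import math

# Dew point via the Magnus approximation -- accurate to roughly
# +/-0.35 C over ordinary room conditions (Sonntag 1990 coefficients).
A = 17.62
B = 243.12  # degrees C

def dew_point_c(temp_c, rel_humidity_pct):
    gamma = math.log(rel_humidity_pct / 100.0) + (A * temp_c) / (B + temp_c)
    return (B * gamma) / (A - gamma)

# Example: a 32 C summer day at 50% RH
td = dew_point_c(32.0, 50.0)
print(f"dew point: {td:.1f} C")  # ~20 C -- keep coolant above this plus a margin
```

A sensible controller would refuse to run the chiller whenever the target coolant temperature is within a degree or two of this value.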


----------



## Cryptedvick

New owner of a used MSI Gaming X model, and so far I am very impressed with the way it runs.
It boosts into the 1960s and can be OC'd stable to 2063 MHz, which is plenty for my needs. As far as temps go, I haven't seen it break 62°C under load in any game with the custom fan profile I made. The fans are really impressive too: very quiet, even at high RPMs, compared to the 980 Ti Strix I previously used.
All in all, a happy camper with the card, and at 2560x1080 75 Hz it should last me plenty of time from now on. RTX has no added value over this card for the price they're asking. 1575 USD for a new one? No thanks! Got this one used for 550 USD, which is a very decent price where I live.


----------



## amd955be5670

Hey guys.. my MSI 1080 Ti Gaming X finally arrived. And there is a special place in hell for me. When scratching my head today about why my results were lower than others' in various benches, I realized I was actually running it in x4 3.0 mode..

Which makes me facedesk even more, because the 970 was also running in x4 mode... all this while.

My mobo has 4 full-length x16 slots. I moved the GPU down one because of my NH-U14S (thankfully recently), but I'll have to move it down more, because apparently the third from the top runs at x8. I guess it's a good time to move to the 8700K; that way I get an x16 slot for my GPU free from the CPU heatsink.

So far the card warms up to 75°C-ish. I made a conservative fan curve, so the max now is 71°C. The clocks at default stay at 1911, though this is after the 50°C+ temp throttling; the actual default is 1962 MHz. I tried +100 and +75, which makes the 'max' boost table 2025/2037 MHz, at which my card just crashes (even though after temps it would most likely normalize at 1987-2000 MHz fine). Haven't touched volts or temp target yet. I clocked +497 on memory and haven't seen artifacts. For the short term, I'm seeing +38/+497 stable.


----------



## JMCB

Looks like I'm part of the SLI subgroup now...


----------



## GraphicsWhore

jdcchora1533 said:


> Hello guys,
> 
> Currently Im doing a guide for overclock a gigabyte gtx 1080 ti gaming oc 11gb.
> 
> I already mod all the card and will post it soon.
> 
> To finish my guide I want to know whats the best bios for it?
> 
> I got the gpu watercooled and the vram with heatsinks.
> 
> Thanks in advance /forum/images/smilies/smile.gif


XOC BIOS


----------



## 8051

GraphicsWhore said:


> XOC BIOS


Another up-vote for the Asus Strix XOC VBIOS. It enables you to raise the voltage to 1.2 V, and my overclock requires very nearly that.


----------



## kithylin

8051 said:


> Another up-vote for the Asus Strix XOC VBIOS. It enables you to raise the voltage to 1.2V and my overclock requires very near that.


I'll just take a moment to remind everyone: XOC bios is for full custom water loops with full-cover blocks only. It's been documented now that it can and will fry / kill air cooled cards, no matter how good the cooler.


----------



## bmgjet

kithylin said:


> I'll just take a moment to remind everyone: XOC bios is for full custom water loops with full-cover blocks only. It's been documented now that it can and will fry / kill air cooled cards, no matter how good the cooler.


And to add to that. It boosts voltage up as temp goes up unlike other cards which will throttle voltage down.

So at 40C youll be around 1.093v
60c it will be 1.150v
80C it will be at 1.2v
So thermals will get out of control if you have poor air cooling or a water cooling failure.

And it has looser vram timings so expect to see your bench mark scores drop if you keep the same memory clock as your stock bios.
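For anyone curious what that voltage scaling looks like across the whole range, here is a minimal sketch. Only the three (temperature, voltage) points come from the post above; linear interpolation between them is an assumption for illustration, not documented XOC BIOS behaviour.

```python
# Toy model of the XOC BIOS voltage-vs-temperature behaviour described above.
# Only the three (temp C, volts) points come from the post; the linear
# interpolation between them is an assumption for illustration.
def xoc_voltage(temp_c):
    """Estimated core voltage (V) at a given GPU core temperature (C)."""
    points = [(40, 1.093), (60, 1.150), (80, 1.200)]
    if temp_c <= points[0][0]:
        return points[0][1]
    if temp_c >= points[-1][0]:
        return points[-1][1]
    # Walk the segments and interpolate inside the one containing temp_c.
    for (t0, v0), (t1, v1) in zip(points, points[1:]):
        if temp_c <= t1:
            return v0 + (v1 - v0) * (temp_c - t0) / (t1 - t0)
```

Under this model a card sitting at 70C would already be around 1.175 V, which is why a pump failure pushes an XOC-flashed card toward its 1.2 V ceiling so quickly.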


----------



## 8051

kithylin said:


> I'll just take a moment to remind everyone: XOC bios is for full custom water loops with full-cover blocks only. It's been documented now that it can and will fry / kill air cooled cards, no matter how good the cooler.


I'm no longer sure if I have the plain-jane Asus Strix OC VBIOS or the Asus Strix XOC VBIOS flashed on my MSI 1080Ti Duke. I downloaded the file off of techpowerup's VBIOS archive and they only have the Asus Strix and Asus Strix OC VBIOS there -- there's no Asus Strix XOC VBIOS listed. The big difference I noticed w/the Asus Strix OC VBIOS was that I could set higher voltages without getting voltage perfcaps and less power throttling. The Asus Strix OC VBIOS boosted all by itself to 2000 MHz. and 1.093V -- even if I don't want it to and afterburner couldn't stop it.

My 1080Ti Gigabyte Gaming OC fried w/the Asus Strix OC VBIOS loaded, even though the GPU core temps were < 60°C. I was pushing for 2139 Mhz. @ 2181mV core volts. I believe a component in the VRM circuitry failed.


----------



## BLUuuE

8051 said:


> 2139 Mhz. @ 2181mV core volts


Wouldn't be surprising if it caught fire with almost 2.2v


----------



## Vlada011

kithylin said:


> https://www.youtube.com/watch?v=oD2BIiNENLg


I'm very sad that Watercool doesn't make a GPU block for the ASUS GTX1080Ti Strix; only Bitspower and EK do.
I don't have a Strix, but the Poseidon has the same PCB and fits the same waterblocks.

I see a significant number of used GTX1080Ti owners like me on different forums, because the cards can be found for $500-600 in great condition with 1-2 years of warranty left.
But I hear that GTX1080Ti stocks are not that big. Never before has the older generation kept selling so well after a new series, thanks to the price and the small performance difference: the GTX1080Ti Strix, FTW3 and similar cards beat the RTX2080 FE in scores, and the FE isn't even a reference RTX2080, it's factory overclocked.
So if someone is interested in a used GTX1080Ti and finds one they like at a nice price, it's probably good to buy immediately.
Because if the GTX1080Ti is gone from the market, prices of used cards could go higher, same as before with mining cards.


----------



## lightsout

Hey guys, anyone running reference cards in here? I am about to score an EVGA blower card here locally, an 11G-P4-5390-KR.

It looks to be the same PCB as the Founders. I am a little concerned about the cooler, but the price is too good.

My question is: besides full water cooling, is anyone doing custom cooling on reference cards? I am thinking about getting one of those NZXT Kraken brackets and a cheap AIO for "the mod". My case is an NZXT H200, so it's tight. I was also looking around for EVGA hybrid kits but they seem to be all dried up.

Do these cards usually run crazy hot? (I am not familiar with how hot the 1080ti gets, being the big die.)


----------



## max883

Got the ASUS GTX 1080 Ti Strix OC to fit with 120mm fans in the case. Used Kryonaut, and temps never go over 55c with the fans at 900 RPM. Super low noise.


----------



## lightsout

max883 said:


> Got the ASUS GTX 1080 Ti Strix OC to fit with 120mm fans in the case. Used Kryonaut, and temps never go over 55c with the fans at 900 RPM. Super low noise.


55c. Any idea how much just a re-TIM job with that stuff helps?


----------



## Thoth420

I have a few games that get pretty random stutter, judder, etc. with G-Sync on and off, as well as native V-Sync on and off, and settings be damned. I find that if I downclock my FTW3 by -88 (the normal base clock for a reference 1080ti), these issues seem to subside. It isn't temps for sure, or throttling, and I made sure it wasn't the CPU, which is an 8700k. Any ideas why this card needs its factory OC dropped to get good perf? Do some games just not like OCs at all, or is it maybe something with Boost 3.0?

The only game I have actual crashes with is Deus Ex: Mankind Divided, out of over 100 titles. I can't seem to rectify that no matter what tweaks I try.


----------



## bmgjet

That stutter is because it's on the verge of being stable; it's the NVIDIA driver resetting. Go any further and you'll get driver-hung or device-removed crashes.
I'd be RMAing that card if it was me.


----------



## GraphicsWhore

Thoth420 said:


> I have a few games that get pretty random stutter, judder etc. with g sync on and off as well as native v sync on and off and settings be damned. I find that if I downclock my FTW3 by -88(normal base clock for a reference 1080ti) these issues seem to subside. It isn't temps for sure or throttling and I made sure it wasn't the CPU which is an 8700k. Any ideas why this card needs its factory OC dropped to get good perf? Do some games just not like OCs at all or is it maybe something with boost 3.0?
> 
> The only game that I have actual crashes with is just Deus Ex: Mankind Divided out of over a 100 titles. I can't seem to rectify that no matter what tweaks I try.


If you don’t downclock do you eventually get artifacts and/or crashing?

Are these games using DX12 by chance?

And if you turn on an on screen display monitor while playing the games, are the clocks or temps doing anything weird when the stutter occurs?

Have you set any settings in the nvidia control panel that override the game/application?


----------



## Thoth420

bmgjet said:


> That stutter is because its on the verge of being stable, its the nvidia driver resetting you go any further and youll get driver hung or device removed crashes.
> Id be RMA that card if it was me.


I can OC it past the factory OC and it never crashes or driver hangs unless I go way too far, of course. I am considering an RMA just because one of the fan bearings is rubbing now and occasionally makes noise as it goes up the curve. I just double checked, and the only time I have had actual driver restarts was due to a bad driver install a month or so back, but I wiped that with DDU and am using a different version now.



GraphicsWhore said:


> If you don’t downclock do you eventually get artifacts and/or crashing?
> 
> Are these games using DX12 by chance?
> 
> And if you turn on an on screen display monitor while playing the games, are the clocks or temps doing anything weird when the stutter occurs?
> 
> Have you set any settings in the nvidia control panel that override the game/application?


No (aside from DX:MD, which does it no matter what), and no, I never use DX12. I am not seeing anything strange in logs. I don't run an OSD, but when it happens I alt-tab out to check the graphs. I get the issue whether I have NVCP set to use 3D settings globally or I have a profile and tweak it, etc. Is it possibly just the games having a bit of hitching at certain points while loading assets? It's far from something that happens constantly, just a hitch or stutter here or there. I should note I am using an ultrawide monitor as well.


----------



## GraphicsWhore

Thoth420 said:


> bmgjet said:
> 
> 
> 
> That stutter is because its on the verge of being stable, its the nvidia driver resetting you go any further and youll get driver hung or device removed crashes.
> Id be RMA that card if it was me.
> 
> 
> 
> I can OC it past the factory OC and it never crashes or driver hangs unless I go way too far ofc. I am considering an RMA just because one of the fan bearings is rubbing now and occasionally makes noise as it goes up the curve. I just double checked and the only time I have had actual driver restarts it was due to an bad driver install a month or so back but I wiped that with DDU and am using a different version now.
> 
> 
> 
> GraphicsWhore said:
> 
> 
> 
> If you don’t downclock do you eventually get artifacts and/or crashing?
> 
> Are these games using DX12 by chance?
> 
> And if you turn on an on screen display monitor while playing the games, are the clocks or temps doing anything weird when the stutter occurs?
> 
> Have you set any settings in the nvidia control panel that override the game/application?
> 
> 
> No(aside DX: MD which does it no matter what) and No, I never use DX12. I am not seeing anything strange in logs. I don't run OSD but when it happens I alt tab out to check the graphs. I get the issue if I have NVCP set to use 3D settings globally or if I have a profile and tweak etc. Is it possibly just the games and they have a bit of hitching at certain points loading assets? It's far from something that happens constantly just a hitch or stutter here or there. I should note I am using an ultrawide monitor as well.

Ultrawide shouldn’t matter but lower the resolution to like 1080p while keeping the overclocks to test.

I can’t see signatures as I’m on mobile but what’s the rest of your setup? Only other possibility of something wrong with the card is a CPU bottleneck I guess


----------



## lightsout

I just got a new EVGA reference style blower card. Not the FE cooler but EVGA's version. Sweet card but she runs hot. Even with fans at 80% in COD BO4 @ 4k I can't keep it under 80c. Throttles quite a bit because of it. I'll have to do something probably down the road to manage temps better.


----------



## bmgjet

Had a day and a night now playing with my peltier chiller setup.
Going to do instructions and share my control software once I've gotten everything dialed in and any bugs worked out.
But for now, here is a demo video.





But short story, Hottest part of the day with air temp around 27c, gpu runs at 45c.
With 1 chiller on runs at 25C.
With 2 chillers on runs at 20C.

During night when air temp gets down.
2 Chillers on 10C.
Idle temp 3C

And a drawing of my layout.
Cost less than $100 all up.


----------



## 8051

lightsout said:


> 55c. Any idea how much just a re-TIM job with that stuff helps?


Just a me-too on this question. I'd like to know as well.


----------



## 8051

Thoth420 said:


> I have a few games that get pretty random stutter, judder etc. with g sync on and off as well as native v sync on and off and settings be damned. I find that if I downclock my FTW3 by -88(normal base clock for a reference 1080ti) these issues seem to subside. It isn't temps for sure or throttling and I made sure it wasn't the CPU which is an 8700k. Any ideas why this card needs its factory OC dropped to get good perf? Do some games just not like OCs at all or is it maybe something with boost 3.0?
> 
> The only game that I have actual crashes with is just Deus Ex: Mankind Divided out of over a 100 titles. I can't seem to rectify that no matter what tweaks I try.


Strange, DX:MD is the one title I can really crank up the overclock for on my MSI 1080Ti Duke, but I've modded the stock air cooling considerably.

Instead of reducing the core overclock you might try reducing the memory frequency. I have one game, Dying Light, that isn't stable at my high core clocks unless I reduce the memory frequency by 364 MHz to 500 MHz, depending on ambient temps.


----------



## HeroofTime

Currently I'm on a GIGABYTE 1080 Ti Gaming OC. I'm playing with the idea of purchasing a Morpheus II GPU cooler and slapping two Phanteks 120mm fans on it which are designed for a CPU cooler. I find that I run into the 300W power ceiling frequently. I will *not* be considering the XOC BIOS. Not exactly because I'm not on any water cooling method, but because I don't think the power delivery on the Gaming OC is adequate if I want to keep it for long. Water cooling would definitely help me sleep at night though. What VBIOS would you think is best? I think the highest power limit VBIOS I will flash on this card is one from GIGABYTE with the 375W power limit. To be honest, I think even that's too much for this card. What do you guys think?

Using the simple overclock method (no boost curve adjustments), I'm able to achieve 2012MHz on core and 5900MHz on memory with a 120% power limit. If I push my core clock one bin higher, I crash immediately in a game called _Obduction_; otherwise, one bin higher shows no issues in any other game that I play or benchmark that I boot. With _Obduction_ completely maxed out graphically, it took most of the work out of finding the maximum stable core clock for my card. Finding the maximum memory clock was more tedious, but OCCT's 3D GPU test seems to be the quickest way to find it; my maximum stable memory clock ended up being 5900MHz. I use NVIDIA Inspector to overclock my GPU. My temperatures max out at 75°C. I do have MSI Afterburner v4.5.0 installed, but I ran into some weird issues with it in regards to the boost curve that I won't go into at this time.

PS: I'm eyeballing the Palit GameRock Premium VBIOS (350W), the ASUS STRIX OC VBIOS (330W), and the GIGABYTE AORUS VBIOS (375W) after looking all of them over again.
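The "one bin higher until it crashes" search described above can be written down as a simple loop. Here `run_stress_test` is a hypothetical placeholder for whatever stability check you use (a quick _Obduction_ session, OCCT's 3D test, etc.), and the 13 MHz step is the usual Pascal boost-bin granularity, an assumption rather than something from the post.

```python
# Sketch of a one-bin-at-a-time core offset search, as described above.
# run_stress_test(offset_mhz) is a placeholder: it should apply the offset,
# run your stability workload, and return True if no crash/artifacts occurred.
BIN_MHZ = 13  # typical Pascal boost-bin step (assumed)

def find_max_stable_offset(run_stress_test, start_mhz=0, limit_mhz=200):
    """Return the highest tested core offset (MHz) that passed the stress test.

    Returns start_mhz if even the first step fails.
    """
    stable = start_mhz
    offset = start_mhz
    while offset <= limit_mhz:
        if not run_stress_test(offset):  # crash or artifacts: stop here
            break
        stable = offset
        offset += BIN_MHZ
    return stable
```

The same loop works for the memory offset; just swap in a memory-heavy workload and a larger step.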


----------



## amd955be5670

Many driver resets and game crashes later, it seems my OC with this Gaming X model is +39/+499. Memory-wise I could go higher, but on core anything more, say the next boost step (+52), crashes, sometimes immediately, sometimes after a while. The table max is 1987~2000 but I usually get ~1962mhz, and 1949mhz after it crosses 70c (to a max of 72c).

In the short duration I was able to run the card at +52 (periodically bench stable), got a timespy score of 10.1k

Backplate I've noticed gets pretty hot. Maybe I should stick some small copper heatsinks to the backside? Though my 970 backplate also did that.


----------



## specialedge

bmgjet said:


> Had a day and a night now playing with my peltier chiller setup.
> Going to do instructions and share my control software once iv gotten everything dialed in and any bugs worked out.
> But for now here is a demo video.
> https://www.youtube.com/watch?v=bT-3jFnnRuI
> 
> But short story, Hottest part of the day with air temp around 27c, gpu runs at 45c.
> With 1 chiller on runs at 25C.
> With 2 chillers on runs at 20C.
> 
> During night when air temp gets down.
> 2 Chillers on 10C.
> Idle temp 3C
> 
> And a drawing of my layout.
> Cost less then $100 all up.


This was an awesome watch. Thank you for sharing this. Have you taken any videos or photos of the interior of the case? Peltier systems are so sweet

Sent from my SM-N950U using Tapatalk


----------



## bmgjet

specialedge said:


> This was an awesome watch. Thank you for sharing this. Have you taken any videos or photos of the interior of the case? Peltier systems are so sweet
> 
> Sent from my SM-N950U using Tapatalk


I'll do pictures and a full write-up once I've got it looking tidy over next weekend. I want to fold up a bit of alloy to make a shroud that covers everything, and sleeve the wiring, since it looks ugly at the moment and I know how judgemental some guys on this forum are.


----------



## kithylin

amd955be5670 said:


> Many driver resets and game crashes later, it seems my oc with this gaming x model is +39/+499. Memory wise I could go higher, but on core anything more, say the next boost step (+52) crashes, sometimes immediately, sometimes after a while. The table max is 1987~2000 but I usually get ~1962mhz. 1949mhz after it crosses 70c (to a max of 72c).
> 
> In the short duration I was able to run the card at +52 (periodically bench stable), got a timespy score of 10.1k
> 
> Backplate I've noticed gets pretty hot. Maybe I should stick some small copper heatsinks to the backside? Though my 970 backplate also did that.


Please post the actual core frequencies you're running at. Telling us the offsets you're using means literally nothing: every 1080 Ti will end up at a different actual frequency with the same offset. Also, the ambient temperature in your house/room will change the resulting frequencies as well. You may as well not tell us the core offset at all; it tells us nothing.

However, all 1080 Ti's do use the exact same memory frequency, so the offset is fine for memory. Just not for core.


----------



## lightsout

Is there a table that shows at what temps the core downclocks? Curious what I need to keep the card under for the clock to not downclock at all.

This is on a reference card with EVGA blower cooler.


----------



## bmgjet

lightsout said:


> Is there a table that shows at what temps the core downclocks? Curious what I need to keep the card under for the clock to not downclock at all.
> 
> This is on a reference card with EVGA blower cooler.


I've only been able to test down to 15c reliably on my EVGA card, and it drops a boost bin every 5c. I'd imagine the table probably starts somewhere around 0c.

Power limit will drop clocks just as much, but lower temps mean less power for the same clocks and voltage.
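The "one bin per 5c" behaviour can be sketched as a toy model. The 13 MHz bin size is the usual Pascal step and the 0c starting point is the guess from the post above; neither is an official NVIDIA figure, and the power limit can clamp clocks further than this predicts.

```python
# Toy model of GPU Boost 3.0 thermal down-binning as described above:
# one boost bin lost per 5 C rise in core temperature.
BIN_MHZ = 13   # assumed Pascal boost-bin step
STEP_C = 5     # degrees C per bin dropped (from the post)
START_C = 0    # temperature where down-binning is assumed to begin

def boosted_clock(max_clock_mhz, temp_c):
    """Estimated core clock (MHz) after thermal down-binning only."""
    bins_lost = max(0, int((temp_c - START_C) // STEP_C))
    return max_clock_mhz - bins_lost * BIN_MHZ
```

For example, a card whose cold-start table tops out at 2000 MHz would sit around `boosted_clock(2000, 25)` = 1935 MHz at 25c under this model, five bins below its maximum.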


----------



## hotrod717

amd955be5670 said:


> Many driver resets and game crashes later, it seems my oc with this gaming x model is +39/+499. Memory wise I could go higher, but on core anything more, say the next boost step (+52) crashes, sometimes immediately, sometimes after a while. The table max is 1987~2000 but I usually get ~1962mhz. 1949mhz after it crosses 70c (to a max of 72c).
> 
> In the short duration I was able to run the card at +52 (periodically bench stable), got a timespy score of 10.1k
> 
> Backplate I've noticed gets pretty hot. Maybe I should stick some small copper heatsinks to the backside? Though my 970 backplate also did that.


You might want to try changing out or reapplying the thermal paste. Are you the original owner? How long have you had the card?



kithylin said:


> Please post your actual core frequencies you're running at. Telling us the offsets you're using literally means nothing. Every single different 1080 Ti will have a different resulting actual frequency using the same offset. Also: Your ambient tempature in your house/room will make the frequencies different as well. May as well not even tell us the offset for your core frequency at all, it does nothing.
> 
> However, all 1080 Ti's do use the exact same memory frequency, so the offset is fine for memory. Just not for core.


Seriously?
I guess this part doesn't give it - "The table max is 1987~2000 but I usually get ~1962mhz. 1949mhz after it crosses 70c (to a max of 72c)."

Every time I stop in this thread, I see you pulling this crap. Apparently you don't know how to converse with people appropriately.


----------



## lightsout

bmgjet said:


> iv only been able to test down to 15c reliably on my evga card and it drops a boost bin every 5c. I'd imagine it probably starts the table some where like 0c.
> 
> Power limit will drop clocks just as much. But lower temps mean less power for same clocks and voltage


What, lol? So basically it's impossible for it not to drop. Oh well, I guess in the 50c range it should still hold a good clock. At 75-85c I'm around 1800-1850 mhz I think. I know the card has more in it. Going to slap a hybrid kit on it.


----------



## kithylin

hotrod717 said:


> You might want to try changing out or reapplying thermal paste. Are you the original owner. How long have you had the card?
> 
> Seriously?
> I guess this part doesn't give it -" The table max is 1987~2000 but I usually get ~1962mhz. 1949mhz after it crosses 70c (to a max of 72c)."
> 
> Everytime i stop in this thread, i see you pulling this crap. Appearently you dont know how to converse with people appropriately.


I didn't see clock speeds in your post before, guess I was drunk. Sorry.


----------



## bmgjet

lightsout said:


> What lol? So its impossible for it not to drop basically. Oh well I guess in the 50c range it should still hold a good clock. At 75-85c I'm around 1800-1850 mhz I think. I know the card has more in it. Going to slap a hybrid kit on it.


That's just the way Boost 3.0 works: it will boost the card as far as temps/power limit/voltage allow.
You can manually set your curve in Afterburner or PX and it will hold that curve for whatever temperature the card was at when you set it.
So you'd want to have the card under load with temps settled already. But at that sort of temp you're just wasting your time.


----------



## amd955be5670

kithylin said:


> Please post your actual core frequencies you're running at. Telling us the offsets you're using literally means nothing. Every single different 1080 Ti will have a different resulting actual frequency using the same offset. Also: Your ambient tempature in your house/room will make the frequencies different as well. May as well not even tell us the offset for your core frequency at all, it does nothing.
> 
> However, all 1080 Ti's do use the exact same memory frequency, so the offset is fine for memory. Just not for core.


My card (the Gaming X) starts off at the max of the boost table (2000mhz at +39mhz), and after reaching 65c~69c drops to 1974mhz. Generally if I'm playing for a long time it drops further, to 1962mhz, but I haven't seen lower than 1962 yet as long as I stay under 70c.

Without the offset, I start at 1949mhz and drop to 1911mhz.



hotrod717 said:


> You might want to try changing out or reapplying thermal paste. Are you the original owner. How long have you had the card?


There is this stupid 'warranty void if removed' sticker on the screw that I'm not comfortable messing with; not sure how that would affect RMAs in India. I am the original owner. I got this brand new instead of an RTX 2080, which was priced a whole 5~10k INR ($70-140) higher. Unless I got a ninja refurb, I have no reason to doubt this GPU is 'new'. It does look new, too. If anything, it has an MFG date of August (or Sept, will edit once I see the box back at home), and an import date a month later.


----------



## 8051

amd955be5670 said:


> Many driver resets and game crashes later, it seems my oc with this gaming x model is +39/+499. Memory wise I could go higher, but on core anything more, say the next boost step (+52) crashes, sometimes immediately, sometimes after a while. The table max is 1987~2000 but I usually get ~1962mhz. 1949mhz after it crosses 70c (to a max of 72c).
> 
> In the short duration I was able to run the card at +52 (periodically bench stable), got a timespy score of 10.1k
> 
> Backplate I've noticed gets pretty hot. Maybe I should stick some small copper heatsinks to the backside? Though my 970 backplate also did that.


It would be interesting to see what difference the small copper heatsinks and a thin fan would make.


----------



## Noufel

Got a refund for my 2080ti and reinstalled my 1080ti Xtreme; I wasn't happy with the performance I paid for.
Hope that 8nm and 7nm will bring us better perf/$.


----------



## lightsout

bmgjet said:


> Thats just the way boost 3.0 works. Will boost the card as far as temps/powerlimit/voltage allows.
> You can manually set your curve in afterburner or PX and it will hold that curve for what ever tempture the card was at when you set it.
> So youd want to have the card under load and temps settled already. But at that sort of temp your just wasting your time.


Can you explain that more? Are you talking about a fan curve? Or boost curve?


----------



## Astral85

Noufel said:


> got a refund for my 2080ti, and reinstalled my 1080ti Xtreme wasn't happy with the performance i paid for
> hope that 8nm and 7nm will bring us better perf/$


Interesting. I was going to ask what 1080 Ti owners thought about upgrading to the 2080 Ti. So performance that bad? Most benchmarks show a 15-30 fps gain with the 2080 Ti. That should be very noticeable but I'm still not certain it's worth the cost...


----------



## GraphicsWhore

Astral85 said:


> Interesting. I was going to ask what 1080 Ti owners thought about upgrading to the 2080 Ti. So performance that bad? Most benchmarks show a 15-30 fps gain with the 2080 Ti. That should be very noticeable but I'm still not certain it's worth the cost...


The performance isn't bad - people are bad at setting expectations. For most it's not going to be worth going from 1080Ti to 2080Ti going strictly by game performance but the question of value is going to depend on many factors.

I went from a water-cooled 1080Ti to a water-cooled 2080Ti. Do I notice the additional performance in most of my current games? Not without putting up an FPS counter overlay, especially because my 1080Ti ran the XOC BIOS with a big overclock.

The two exceptions however happen to be two of my most-played games: Elite Dangerous VR and Project Cars 2 VR, where I can now set all VR settings to max and get butter smooth frame-rates in my VIVE Pro, something the 1080Ti couldn't do.

Also, I sold my 1080Ti for $625, recouping half of the money I spent on the 2080Ti.

So it just depends.


----------



## KedarWolf

I FINALLY beat my 5960x 1080 Ti Time Spy bench with my 9900k. By beat I mean top the graphics score.

Get much better CPU Physics scores as well.

My 5960X score. 11511 graphics score.

https://www.3dmark.com/spy/2545183










My 9900k score. 11542 graphics score 

https://www.3dmark.com/spy/5039067










Now, to post in the Time Spy thread.


----------



## lightsout

KedarWolf said:


> I FINALLY beat my 5960x 1080 Ti Time Spy bench with my 9900k. By beat I mean top the graphics score.
> 
> Get much better CPU Physics scores as well.
> 
> My 5960X score. 11511 graphics score.
> 
> https://www.3dmark.com/spy/2545183
> 
> 
> 
> My 9900k score. 11542 graphics score
> 
> https://www.3dmark.com/spy/5039067
> 
> 
> 
> Now, to post in the Time Spy thread. :h34r-smi


Looks good, crazy ram speeds.


----------



## rakesh27

*I will miss this gen*

I own a 1080ti with a 7700k, and I play on a 2160p G-Sync monitor from Asus at 144hz...

I play games with DSR at 4k resolution, which the 1080ti can easily handle... OK, it's not as crisp as native 4k, but it's good enough...


I think the 980ti to 1080ti was more of a jump than 1080ti to 2080ti... not to mention the price hike for the new technology....

I will see what the 2nd or 3rd generation of cards brings, as I know by then the hardware associated with these cards will fully exploit those generations.

Right now it's too early; it's the same with VR... once both technologies are refined, there will be an awesome show playing games, along with developers making big strides.


----------



## Hequaqua

Hey all....having a small issue with just one game: Doom.

Here is a short video showing what I'm talking about:






You can see the checkerboarding going on.

I've done a clean install of 2-3 different drivers after using DDU to clean it up. I'm not having any kind of issues with any other games. 

Any help would be great!

System Specs are in my sig.


----------



## HeroofTime

Hey everyone,

Just a quick question. Do the blower style cards from the different vendors (everyone but NVIDIA's FE) lose output in certain ports if flashed with a different VBIOS? Thanks.

PS: Of course, the FE does lose a DisplayPort port, but I was curious about the other blower style 1080 Ti models from the different vendors out there.


----------



## evmota21

Hey guys! I was wondering if you could help me, I am desperate.

So I had to pack my GPU (MSI AERO GTX 1080 Ti 11GB) so I could put it in another PC. My GPU is 3 months old, and the packaging is the same it came in from Amazon. When I unpacked my GPU and put it in this setup (i7 4930K, Kingston 64GB RAM, ASUS Rampage IV), the GPU wasn't giving me any video from the HDMI port. I have tried putting it in different PCs with no luck. The LEDs and fan turn on, but Windows won't detect the video card.


One thing to note is that I upgraded the NVIDIA drivers before I took it out of my original PC (i7 8700k, Z370 ASUS ITX, 16GB RAM). Any ideas? MSI won't respond because they don't offer global warranty. I bought it in Mexico.

Card has never been overclocked, actually only undervolted and downclocked. vBIOS hasn't been touched.


----------



## The Pook

Uninstall drivers with DDU and try again. Things should _work_ even without a driver, but it's worth a try.


----------



## GraphicsWhore

evmota21 said:


> Hey guys! I was wondering if you could help me, I am desperate.
> 
> So I had to pack my GPU (MSI AERO GTX 1080 Ti 11GB) so I could put it on another PC. My GPU is 3 months old, and the packaging is the same as it came from Amazon. When I unpacked my GPU, and put it on this setup (I7 4930K, KINGSTON 64GB RAM, ASUS RAMPAGE IV) the GPU wasn't giving me any video, from the HDMI port. I have tried putting it in different PCs with no luck. The LEDs and fan turn on but Windows won't detect the videocard.
> 
> 
> One thing to note is that I upgraded the NVIDIA drivers before I took it off my original PC. (i7 8700k, Z370 ASUS ITX, 16GB RAM). Any ideas? MSI won't respond because they don't offer global warranty. I bought it in Mexico.
> 
> Card has never been overclocked, actually only undervolted and downclocked. vBIOS hasn't been touched.


Does it display from any of the ports? Or is only HDMI affected?


----------



## kithylin

HeroofTime said:


> Hey everyone,
> 
> Just a quick question. Do the blower style cards from the different vendors (everyone but NVIDIA's FE) lose output in certain ports if flashed with a different VBIOS? Thanks.
> 
> PS: Of course, the FE does lose a DisplayPort port, but I was curious about the other blower style 1080 Ti models from the different vendors out there.


Yes, all 1080 Ti's do this. If the card the VBIOS came from has a different port configuration than your card, you will lose the display output ports that differ.


----------



## evmota21

GraphicsWhore said:


> Does it display from any of the ports? Or is only HDMI affected?


All ports; the card won't get recognized by the BIOS. I only packed the card in the antistatic bag with the plastic caps on, unpacked it, put it in, and it stopped working, what the ****.


----------



## evmota21

The Pook said:


> Uinstall drivers with DDU and try again. Things should _work_ even without a driver but it's worth a try.


Already tried that. I should've mentioned the card wasn't getting recognized by the motherboard (BIOS).


----------



## lightsout

I was able to sell off my blower card on ebay for a good price and scored a FTW3, should be here next week. What are some good clocks to shoot for? 2000-2100 range?


----------



## The Pook

General consensus when I asked was anything over 2000 is good. Mine refuses to do anything over ~1976 but it'll do it at 1.0v


----------



## lightsout

The Pook said:


> General consensus when I asked was anything over 2000 is good. Mine refuses to do anything over ~1976 but it'll do it at 1.0v


Yeah thats basically the number I have in mind for all 10xx cards. If it hits 2ghz I'm happy. But it is what it is in the end if it doesn't.


----------



## kithylin

evmota21 said:


> Already tried that. I should've mentioned the card wasn't getting recognized by the motherboard (BIOS).


Did you plug the monitor into the motherboard's onboard video ports and see if it's running video through the onboard video instead? And if that works, then try disabling the onboard video in bios and it should switch over to the GPU.


----------



## ExoticallyPure

Sshhhh!! Don't tell anyone over in the RTX 2080 Owner's Thread that we secretly have the fastest consumer graphics card because the 1080 Ti FTW3 is almost 10% faster than the RTX 2080 (and RTX 2080 Ti at ~$1300 is not a consumer product).


----------



## bmgjet

lightsout said:


> I was able to sell off my blower card on ebay for a good price and scored a FTW3, should be here next week. What are some good clocks to shoot for? 2000-2100 range?


The cooler it's running, the higher it will clock.
My EVGA Black card wouldn't do 2GHz on the stock air cooler. I've got it at 2126MHz now with low-20s C temps.


----------



## lightsout

bmgjet said:


> The cooler its running the higher it will clock.
> My EVGA black card wouldnt do 2ghz on stock air cooler. Iv got it at 2126mhz now with low 20C temps.


Yeah, these cards act weird when they get warm. I get it at high temps, but I don't really love how boost works these days.


----------



## fat4l

Hi guys.
Has anyone tried the XOC voltage tool on a 1080 Ti Strix and played with memory OC?
We can clearly add more voltage to our GDDR5X, but I'm not sure how much is still considered safe when watercooled. Default is 1.35v.


----------



## Streetdragon

I would say 1.45v would be OK under water for the RAM. The question is: is the memory voltage controller cooled too? I think the EKWB waterblock misses the controller.


----------



## amd955be5670

So... is this normal?

I am having better overclocking success with the Z370/8700K combo. My previous Z87/4670K had a strange issue where it nerfed the power to the 1080 Ti after a crash, but I don't have that problem on Z370. It outright black-screens and resets clocks instead, like a GPU crash should.


----------



## kithylin

amd955be5670 said:


> So... is this normal?
> 
> I am having better overclocking success with the Z370/8700k combo. My previous Z87/4670k had a strange issue where it nerfed the power to the 1080Ti upon a crash, but I don't have that problem on Z370. It instead outright blackscreens and resets clocks instead, like a gpu crash should..


If anything, it would be your power supply affecting that. All motherboards are supposed to deliver the same 75 watts from the PCI-Express slot, per the PCI-Express specification.


----------



## Abaidor

I have the Asus Strix 1080Ti OC Edition watercooled with a Phanteks waterblock. I have only tested overclocking with ASUS GPU Tweak II, and the card will happily do 2063MHz on the GPU core & 6003MHz on memory. While doing so the temperature never exceeds 38C with my radiator fans set @ 450 RPM.

Is there a point in trying other utilities to achieve a higher overclock? Is a modded BIOS worth it? If so, what tools do you suggest and where could I download them?

I decided to keep my 1080Ti and skip the 2080Ti since my games backlog includes somewhat older games, and by the time I catch up the 2080Ti will be "old" news... So I thought of taking advantage of what I have and maximizing my overclock. After all, that was the reason I got a waterblock for it in the first place.


----------



## mouacyk

Abaidor said:


> I have the Asus Strix 1080Ti OC Edition watercooled with a Phanteks Waterblock. I have only tested overclocking with ASUS GPU Tweak II and the card will happily do 2063MHz GPU core & 6003MHz on memory. While doing so the temperature never exceeds 38C with my fans (on Radiator) set @ 450 RPM.
> 
> Is there a point in trying other utilities to achieve a higher overclock? Is a modded BIOS worth it? If so what tools do you suggest and where could I download them.
> 
> I decided to keep my 1080Ti and skip the 2080Ti since my games backlog includes a bit older games and by the time I catch up the 208Ti will be "old" news....So I thought of taking advantage of what I have and maximize my overclocking. After all that was the reason I got a waterblock for it in the first place.


There are no flashable modded BIOSes. Since you're on water, try the XOC one from hwbot. It allows up to 1.2v and removes all power throttling.

I'm at a modest 2100 / 16200 at 1.1v for everyday use. 1.2v only gets me another 64MHz, so I use that only for benchmarks.


----------



## lightsout

Abaidor said:


> I have the Asus Strix 1080Ti OC Edition watercooled with a Phanteks Waterblock. I have only tested overclocking with ASUS GPU Tweak II and the card will happily do 2063MHz GPU core & 6003MHz on memory. While doing so the temperature never exceeds 38C with my fans (on Radiator) set @ 450 RPM.
> 
> Is there a point in trying other utilities to achieve a higher overclock? Is a modded BIOS worth it? If so what tools do you suggest and where could I download them.
> 
> I decided to keep my 1080Ti and skip the 2080Ti since my games backlog includes a bit older games and by the time I catch up the 208Ti will be "old" news....So I thought of taking advantage of what I have and maximize my overclocking. After all that was the reason I got a waterblock for it in the first place.


Seems like you're sitting pretty; that core temp is insane. As stated above you can do a BIOS flash, but you'll probably be lucky to get another 100MHz. I personally would just let it be unless you want to do some benching or something.


----------



## kithylin

lightsout said:


> Seems like you're sitting pretty; that core temp is insane. As stated above you can do a BIOS flash, but you'll probably be lucky to get another 100MHz. I personally would just let it be unless you want to do some benching or something.


Really it just depends on the system and game and how you play it.
I went through a system change recently and I'll explain some of my experiences to folks here. I went from a I7-5820K @ 4.4 ghz (Backwards? Forwards?) which had the cpu and gpu both looped into the same single 360mm radiator to a I5-3570K system that runs stable @ 5.2 ghz 24-7 for gaming. For the 3570K system I built a 4-radiator (360+240+240+120) loop. I now have more stuff added to the loop (CPU + Vrm/Mosfet block + 1080 Ti + Ram block) and my 1080 Ti overall seems to run a lot cooler. Sometimes I play games like Borderlands 2, Borderlands:TPS, for example and in these games with vsync on @ 1080p-80Hz my 1080 Ti typically runs max peak 29c-30c after hours of gaming. Although with vsync on it rarely ever boosts to 2100 mhz (even though I have afterburner set so it's capable of doing it if it needs to). It actually runs around 600 mhz core speed most of the time, sometimes 1500 depending on the area, in Borderlands games. I stopped playing ARK: Survival Evolved all together (no friends to play with.. bored, oh well), and mostly playing some older games lately.

And it's really weird. In my 5820K system, playing Fallout 4 (modified.. build restrictions removed, several large mega-settlements) my 1080 Ti would be boosting up pretty high to 2100 mhz most of the time, and ran pretty hot, mid-50's C and struggling to maintain even 60 FPS, often running in the low 50's and mid-40's. But here on this 3570K system, I can play fallout 4 and my 1080 Ti rarely hits max boost anymore. Often the GPU Sits around 1500~1700 mhz instead and runs cooler, around 30~35c, and is a perfectly smooth 60 FPS capped (Bethesda and their terrible engine requiring us to keep it @ 60 FPS for physics and scripts to not bug out).

Seems in some games (like the Borderlands series) I could probably push a 1440p/144Hz screen now with my "new" 3570K setup, but I'll wait until after I switch this system to a 3770K later and try to push that to 5.1+GHz as well. And yes, I've looked into it: there aren't many IPC gains between Intel's 3000-series and 8000-series chips once you clock them the same. Maybe I'd be 4-5 FPS behind, an 8700K @ 5.0GHz vs a 3000-series @ 5.2GHz, but that's okay for me. The older system was a lot cheaper: I was looking at $1500+ for a good/exotic board, CPU, and fast DDR4 for a newer system, and this older one was only $350 buying used. So it works fine for now. And it's quite a bit faster (in the games I like to play with tons of mods) than my 5820K system, +30% or so in Fallout 4. With Intel it seems to be more about clock speed than generational differences; anything Intel-side that can reliably hold 5.0-5.2GHz stable seems to be about the same or really close.

I'll sit on it for a few years and watch and wait to see what Intel and AMD do with their new systems, then wait 1~2 years after the new stuff comes out and maybe buy something newer when the "newness" wears off price wise.


----------



## lightsout

Curious to hear from any FTW3 owners. I just got one and it's in a small case (h200).

The gap between the card and the PSU shroud is very small, so I know that is restricting airflow. Even with the fans at 80% it can get to 75-80C.

Curious what users with this card and a bigger case are getting temp wise. These temps are while playing bfv at 1440p.


----------



## Abaidor

mouacyk said:


> There are no flashable modded BIOSes. Since you're on water, try the XOC one from hwbot. It allows up to 1.2v and will remove all power throttling.
> 
> I'm at a modest 2100 / 16200 at 1.1v for everyday. 1.2v only gets me another 64mhz so i use that only for benchmarks.


Then it's not worth going through the hassle of flashing and reinstalling drivers; most importantly, the FPS gains in games would be negligible.



lightsout said:


> Seems like you're sitting pretty; that core temp is insane. As stated above you can do a BIOS flash, but you'll probably be lucky to get another 100MHz. I personally would just let it be unless you want to do some benching or something.


These temps are from the FurMark GPU torture test & Superposition... bear in mind I use a big EXTERNAL radiator (much more efficient) against the coldest wall of the room. It is a MO-RA3 with 9x140mm fans. There are even better-performing waterblocks (by 2-3 degrees), but I prefer the Phanteks quality, and Watercool (which I would also buy from) did not make one for my card.


Does flashing the XOC BIOS cause the loss of any DP/HDMI ports? I use 2 displays (and might need two more), so that would be a no-go...


----------



## lightsout

Abaidor said:


> Then it is not worth it to through the hassle of flashing and reinstalling drivers and most importantly the FPS gains in games will be negligible.
> 
> 
> 
> This temps are in Furemark GPU torture test & Superposition....bear in mind I use a big EXTERNAL radiator (much more efficient) against the coldest wall of the room. It is a MO-RA3 with 9X140 Fans. There are even better performing wateblocks (by 2-3 degrees) but I prefer the Phanteks quality over other and Watercool (which would also buy) did not make on for my card.
> 
> 
> Does flashing the XOC BIOS cause any DP/HDMI ports loss? I use 2 displays (and might need two more) so that is a no go...


Someone else can answer the BIOS flashing question, but that makes sense with that huge rad. Not really practical but pretty awesome. Well, it may be for you, but I have a one-year-old who has to inspect everything lol.


----------



## Abaidor

lightsout said:


> Someone else can answer the BIOS flashing question but that makes sense with that huge rad. Not really practical but pretty awesome. Well it may be for you but I have a one year old that has to inspect everything lol.



The RAD sits on a shelf behind and above the height of the computer, so it is not within reach of kids, although mine are older (8 & 11)... it's just two tubes entering the PC case from the back, and it also allows for much better airflow within the case + more room + more silence... but I understand it is not for everyone. I was thinking of adding a second one, but this thing never gets stressed (the fans remain low even though they're tied to water temp), so maybe I could go almost passive... we'll see, not a priority now.


----------



## lightsout

Abaidor said:


> The RAD sits on a shelf behind and over the height of the computer so it is not within the reach of kids although mine are older (8 & 11)...it's just two tubes entering the PC case from the back and it also allows for so much better airflow within the case + more room + more silence....but I understand it is not for everyone. I was thinking of adding a second one but this thing does not stress ever (fans remain at low even though tied to water temp) so maybe I could go almost passive...we'll see not a priority now.


If you get a chance to take a pic post it for us.


----------



## KedarWolf

I made a .cmd file you can add to Task Scheduler in Windows to run at login. It opens Afterburner with a bit of a delay to let Windows load, then closes it automatically.

This sets your clock and memory speeds but doesn't keep Afterburner running in the background, so its monitoring doesn't slow down your PC or cause BSODs when you have incompatible programs running.

Uncheck 'Start with Windows' in the Afterburner settings so the script launches it rather than Afterburner launching itself, and uncheck 'Start minimized' so you can see it run for the few seconds it does and know it's working.

Open the code in a text editor, save it as Afterburner.cmd ('All Files', not .txt), and add it to the Windows Task Scheduler to run at login.



Code:



@echo off
TIMEOUT /T 5 /NOBREAK >nul
CD /D "C:\Program Files (x86)\MSI Afterburner"
Start MSIAfterburner.exe
TIMEOUT /T 7 /NOBREAK >nul
Taskkill /IM MSIAfterburner.exe /F
Exit

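If you'd rather register the task from an elevated command prompt than click through the Task Scheduler UI, something like this should work (the task name and script path here are just placeholders; point /TR at wherever you saved the .cmd):

```shell
:: Register the script to run at logon with the highest privileges
:: (Afterburner needs admin rights to apply clocks); /F overwrites
:: any existing task with the same name.
schtasks /Create /TN "ApplyGpuOC" /TR "C:\Scripts\Afterburner.cmd" /SC ONLOGON /RL HIGHEST /F
```

You can verify it afterwards with `schtasks /Query /TN "ApplyGpuOC"`.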


----------



## 8051

kithylin said:


> And it's really weird. In my 5820K system, playing Fallout 4 (modified.. build restrictions removed, several large mega-settlements) my 1080 Ti would be boosting up pretty high to 2100 mhz most of the time, and ran pretty hot, mid-50's C and struggling to maintain even 60 FPS, often running in the low 50's and mid-40's. But here on this 3570K system, I can play fallout 4 and my 1080 Ti rarely hits max boost anymore. Often the GPU Sits around 1500~1700 mhz instead and runs cooler, around 30~35c, and is a perfectly smooth 60 FPS capped (Bethesda and their terrible engine requiring us to keep it @ 60 FPS for physics and scripts to not bug out).
> 
> Seems in some games, (like the borderlands series) I could probably push a 1440p/144 screen now with my "new" 3570K setup, but I'll wait until after I switch this system to a 3770K later and try to push that to 5.1+ ghz as well. And yes I've looked into it, there's not much IPC gains if you compare 3000 series and 8000 series Intel chips, once you clock em to the same clocks. Maybe I might be -4 to -5 FPS behind, 8700K @ 5.0 ghz vs 3000 series @ 5.2 ghz, but that's okay for me. Older system was a lot cheaper. I was looking at $1500+ for good/exotic board, cpu, and fast ddr4 for newer system and this older one was only $350 for me buying used. So it works fine, for now. And it's quite a good bit faster (in the games I like to play with tons of mods) vs my 5820K system, +30% or so in Fallout 4. With Intel it seems to be more about clock speed than generational differences. Anything Intel-side that can reliably hold 5.0~5.2 ghz stable seems to be about the same or really close.


Interesting. I've noticed my 1080Ti runs at low GPU usage in Fallout 4 (<55%) with ~35 FPS, while only one core of my 5820K (4.45GHz) sits at 90-98% usage.

What memory speed are you running your DDR3 at? Fallout 4 has been conclusively shown to respond to faster memory speeds, at least to a degree.

Would a 3770K be able to clock as high as your 3570K?


----------



## kithylin

8051 said:


> Interesting. I've noticed my 1080Ti is running low GPU usage in Fallout 4 (< 55% GPU usage) and the FPS is ~35FPS while only one core of my 5820k (4.45 Ghz.) is at from 90% to 98% usage.
> 
> What kind of memory speed are you running your DDR3 at? Fallout 4 has been conclusively proven to respond to faster memory speeds -- at least to a degree.
> 
> Would a 3770k be able to clock as high as your 3570k?


I honestly don't know about the 3770K yet... they're still $175, and I don't want to risk that yet on a chip I'm going to clock high and potentially damage. My 3570K was only $53 shipped and came delidded. The motherboard I'm using is the GA-Z77X-UP7, the crazy 32-phase-power monster of a board, and so far it has taken a random used 2500K off eBay to 5.0GHz stable, and now a random used 3570K to 5.2GHz stable. I'm guessing if I can cool it, it would probably take a 3770K to 5.0GHz stable too. I'm running a Corsair Dominator RAM kit with a Corsair water cooler, at 4x4GB DDR3-2133 @ 9-11-10-27-2T, stock via XMP (no manual RAM tuning) @ 1.50v RAM voltage. EDIT: And the 5820K system was running quad-channel DDR4 @ 2666 13-13-13-30-1T.


----------



## Hequaqua

A little info would be helpful.

I've run Firestrike over 1500 times, and just picked up this EVGA GTX1080Ti Black Edition. I never noticed this with my GTX1080, so I'm not sure if it's the card, the test, or me just never noticing. When it gets to the Combined Test (Test 4), my GPU load is never over 65%; it's normally around 55-60%.

Is that normal?

Note: When I run Extreme/Ultra, the combined test does load the GPU to 99%.

Thanks


----------



## Abaidor

lightsout said:


> If you get a chance to take a pic post it for us.


My build is not complete right now since I had to put the PC into production some months ago, so I went with soft tubes.
I have ordered, and have here, all the parts to do the following:

-Switch to hard tubes (have all the new Bitspower fittings + tubes)
-Delid the CPU + direct-die frame + LM
-Replace the monoblock with an EK Velocity + Watercool VRM waterblock

I have also designed several plexi pieces for the interior (to be laser cut) and got some extra Phanteks Halos + LED strips.

For the radiator I have some flexible metal tube to replace the black plastic cable wrap (+ a custom flange), and I will also replace the tubes with EPDM + stainless steel anti-kink springs. I will also add a second valve on the top port so that I can detach the RAD easily. (I don't want to use quick disconnects because of their restriction and failure rate, so I went with shut-off valves instead.)

*EDIT* I am also going to order custom 15AWG cables from Pexon (excellent custom sleeving) but have not decided on the colors yet.

Finally, I am planning a total revamp of the cables that sit outside the case and am checking some parts for that on Amazon.

The plan is to proceed with the upgrade during the Christmas holidays when I will have time, since right now I am working on several deadlines and can't turn the PC off for the 1-2 days it will take me to do what I want.

Once I finish I will take proper pictures of the whole setup.


----------



## kithylin

Hequaqua said:


> A little info would be helpful.
> 
> I've ran Firestrike over 1500 times, and just picked up this EVGA GTX1080Ti Black Edition. I've never noticed this with my GTX1080, so not sure of it's the card, the test, or me just not ever noticing. When it gets to the Combined Test(Test 4), my GPU load is never over 65%, it normally is around 55-60% load.
> 
> Is that normal?
> 
> Note: When I run the Extreme/Ultra, it does load the GPU in the combined test to 99%.
> 
> Thanks


This is completely normal. The combined test is not a full video-card test; it loads the CPU and the video card together. The combined test is not supposed to load the GPU to 100%; that's not its function.


----------



## Hequaqua

kithylin said:


> This is completely normal. The combined test is not a full video-card test; it loads the CPU and the video card together. The combined test is not supposed to load the GPU to 100%; that's not its function.


As I said... I really haven't ever noticed it, not on any of the cards I've had. I just figured the 1080p combined test wasn't really asking this GPU to do much. I might have to throw in the 1080 and 1060 I have just for fun to see... lol

Like I said, on the other two tests (Extreme and Ultra) it does load up the GPU.

Guess you do learn something every day, if ya just pay attention.

Thanks! :thumb:


----------



## The Pook

Anyone ever taken real measurements of power usage? I finally got a Kill-A-Watt, and even with an i7 6700 @ 1.4v with C1E/SpeedStep disabled, paired with a 1080 Ti with the voltage/power sliders maxed, I only hit ~170w in Cinebench, peaked at 430w during 3DMark Firestrike, and mostly stayed ~375w.

Was figuring _at least_ 500w.


----------



## specialedge

The Pook said:


> Anyone ever take real measurements of power usage? I finally got a Kill-A-Watt and even with a i7 6700 @ 1.4v with C1E/SpeedStep disabled paired with a 1080 Ti with voltage/power sliders maxed I only hit ~170w in Cinebench, peaked at 430w during 3DMark Firestrike, and mostly stayed ~375w.
> 
> Was figuring _at least _500w.


Dual 1080ti SLI with a Ryzen 7 2700X on PBO4: draw peaked at 901w according to my UPS during Time Spy at 1080p, actually 50w higher than on Extreme.

Sent from my SM-N950U using Tapatalk


----------



## kithylin

The Pook said:


> Anyone ever take real measurements of power usage? I finally got a Kill-A-Watt and even with a i7 6700 @ 1.4v with C1E/SpeedStep disabled paired with a 1080 Ti with voltage/power sliders maxed I only hit ~170w in Cinebench, peaked at 430w during 3DMark Firestrike, and mostly stayed ~375w.
> 
> Was figuring _at least _500w.


Create a new Windows shortcut and paste this in as the target exactly: "C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" -l

Run it and it will show you exactly how much power your 1080 Ti is drawing. It's a utility Nvidia provides with the latest drivers.

Also, 3DMark Firestrike normal will not load the 1080 Ti to its fullest. Run Firestrike Ultra or Extreme for that.
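If you just want the power numbers without the full status screen, nvidia-smi also has a query mode. A sketch (the flags exist in recent driver releases; the path assumes the default install location):

```shell
:: Print GPU name, current draw, and power limit as CSV,
:: refreshing every second (-l 1); drop -l 1 for a one-shot reading.
"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe" --query-gpu=name,power.draw,power.limit --format=csv -l 1
```

Handy to leave running in a window while benchmarking so you can catch the peak.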


----------



## bmgjet

Hequaqua said:


> A little info would be helpful.
> 
> I've ran Firestrike over 1500 times, and just picked up this EVGA GTX1080Ti Black Edition. I've never noticed this with my GTX1080, so not sure of it's the card, the test, or me just not ever noticing. When it gets to the Combined Test(Test 4), my GPU load is never over 65%, it normally is around 55-60% load.
> 
> Is that normal?
> 
> Note: When I run the Extreme/Ultra, it does load the GPU in the combined test to 99%.
> 
> Thanks


Normal, since it's trying to load the GPU and CPU together.
Mine runs 60% CPU and 80% GPU in normal FS combined.
Extreme, though, will load it up to 90% CPU and 95% GPU.




The Pook said:


> Anyone ever take real measurements of power usage? I finally got a Kill-A-Watt and even with a i7 6700 @ 1.4v with C1E/SpeedStep disabled paired with a 1080 Ti with voltage/power sliders maxed I only hit ~170w in Cinebench, peaked at 430w during 3DMark Firestrike, and mostly stayed ~375w.
> 
> Was figuring _at least _500w.


Yeah.
The Kill-A-Watt with my rig shows just over 600W with Cinebench alone (HWiNFO CPU wattage shows around 400W).
Time Spy Extreme pulls 1029W at most (350W CPU / 450W GPU).

That's with 2x PG27UQ, 1x QX2710LED,
7900X @ 5GHz @ 1.36V
1080ti @ 2126MHz @ 1.15V
96W chiller
4x 4TB HDD
2x NVMe
2x SSD


----------



## Thoth420

I think my factory OC is finally starting to become unstable: 1080Ti FTW3 Gaming, driver crashes and recoveries, game crashes, etc. I turned debug mode on in the NVCP (which drops the factory OC to Nvidia reference clocks) to test for a week before I bother EVGA.
Anyone have experience RMAing for this reason on a card that has been owned for a year or longer?


----------



## kithylin

Thoth420 said:


> I think my factory OC is finally starting to be unstable. 1080Ti FTW3 Gaming. Driver crashes and recovery. Game crashes etc. I tossed debug mode on in the NVCP (which is dropping the factory OC to Nvidia reference) to test for a week before I bother EVGA.
> Anyone have experience RMAing for this reason on a card that has been owned for a year or longer?


You can try but don't mention you overclocked it when talking to the CSR or in email...


----------



## Thoth420

kithylin said:


> You can try but don't mention you overclocked it when talking to the CSR or in email...


I never modified it outside of the factory OC as the performance in the games I play is fine so no worries there.


----------



## adversary

These days I have a chance to get a few 1080 Tis at a really good price in my country, and I would grab the chance and finally upgrade (using an MSI 1050 Ti at the moment). The 20xx series is expensive for me and I do not need its features; a good 1080 Ti would be perfect for now.

But I need to decide between the few cards that are available at a good price...

So, there is the Asus Strix OC, Gigabyte Aorus, Palit GameRock, and EVGA SC2 ICX.

The EVGA card is cheaper than the others, but it is also only dual-fan (the Palit, as I read, is actually 4-fan), so I assume it would have inferior cooling compared to the other cards mentioned.

Also, about the power limit: as far as I can find online, the Palit card has a 350W limit (when you allow 116%, which is the max). Not sure about the Gigabyte and Asus. Also, if these cards benefit from flashing a different BIOS, I would of course take that into account and be interested in doing it. But as I have no experience with 1080 Ti cards, all advice and info is welcome here.


----------



## The Pook

adversary said:


> These days I have chance to get few 1080 TIs, at really good price in my country. And I would grab chance and finally upgrade (using MSI 1050 TI at moment). 20xx is expensive for me and I do not need these features. Good choice of 1080 TI would be perfect for now.
> 
> But need to decide between there few cards that are available for good price..
> 
> So, there is Asus Strix OC, Gigabyte Aorus, Palit Gamerock, and Evga SC2 ICX.
> 
> Evga card is cheaper than others, but it is also only dual fan (Palit is, as I read, actually 4 fan), so I assume it would have inferior cooling compared to other mentioned cards.
> 
> Also about power limit.. as I search on internet, Palit card have 350W limit (when you allow 116%, which is max). Not sure about Gigabyte and Asus. Also, if these cards benefit from flashing different BIOS, I would of course take that into account and be interested in doing that. But as I do not have experience with 1080 TI cards, all advices and info is welcome here



Don't base cooling performance on the number of fans; base it on the heatsink. Three-fan cards are often louder anyway because they use smaller fans at higher RPM to match the airflow of larger two-fan designs. And in usable cooling area, the two-fan designs can actually have more, since each fan's hub is a dead spot and the shroud/frame between fans takes up space where a fan isn't.

I'd go for the Strix or the SC2 if it were my money.


----------



## specialedge

I have a 1070 SC2 and it's a pretty nice design. It's not as much metal as the Strix, but it stayed pretty cool in a roomy case and was very easy to keep cool while mining.

My friend has the MSI Duke 1080ti, and it has a massive cooler along the same lines as the Strix. If size isn't a concern, take the Strix.

Sent from my SM-N950U using Tapatalk


----------



## HeroofTime

kithylin said:


> Yes. All 1080 Ti's do this. If the card of which the vbios you took from has a different port configuration than your card then you will lose display output ports for what's different.


Thanks for the information. Just making sure: if I were to buy any blower model (FE or not) and could find a better card's VBIOS with the same port configuration on the back, I wouldn't end up losing any ports? Thanks for the help.


----------



## kithylin

HeroofTime said:


> Thanks for the information. Just making sure, if I were to specifically buy any blower model (FE or not), and I could find a better card's VBIOS that had the same port configuration on the back, I won't end up losing any ports on the back? Thanks for the help.


I think that's how it works, but I'm not 100% sure. I would suggest looking online, on Newegg or eBay for example, for the card whose vbios you found (the one you want to flash onto your card), and comparing its rear port layout to yours to see how similar it is. There's no real way to know which ports you may (or may not) lose until you flash it, boot up, and see. But keep your original factory BIOS somewhere too; you can extract it with nvflash or GPU-Z.
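As a sketch of that backup-first workflow (assuming nvflash is unpacked in the current directory and run from an elevated prompt; exact flags can differ between nvflash builds, so check its --help first, and the filenames here are just placeholders):

```shell
:: Dump the card's current vbios to a file BEFORE touching anything.
nvflash --save stock_backup.rom

:: Flash the downloaded vbios. The -6 switch overrides the PCI subsystem
:: ID mismatch warning you get when the BIOS comes from a different
:: board partner; only use it when you've compared the cards carefully.
nvflash -6 modded_bios.rom
```

If the new BIOS misbehaves, flashing `stock_backup.rom` back the same way restores the factory state.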


----------



## Gripen90

After my RTX 2080Ti died, I decided to go back to a GTX 1080Ti. Instead of the Zotac GTX 1080Ti F.E. which I had before the 2080Ti, I am now trying out the Gigabyte GTX 1080Ti Gaming OC.


----------



## dVeLoPe

What would you say is a good price for a slightly used (purchased in May this year) EVGA SC2 Gaming iCX?


----------



## lightsout

dVeLoPe said:


> What would you say is a good price for a slightly used (purchased in May this year) EVGA SC2 Gaming iCX?


To buy or sell? 

Great price $500 and under
Common Price $600 and under (these are outside ebay)

If selling on Ebay you can probably get close to $700


----------



## Enterprise24

Wanna push higher, but my card doesn't scale beyond 1.162V even at 33C full load.

https://www.3dmark.com/3dm/30614184


----------



## mattliston

Have you done any capacitor mods?


Buildzoid, I think, showed a few 10-series cards that benefited from them. Not sure if it was any sort of large difference.


I think it was Buildzoid. I'm recalling a YouTube video from approximately a year ago, which is the last time I saw it.


----------



## EarlZ

I seem to be getting around 81-85C at 100% fan speed on my MSI Gaming X, and taking the side panel off only lowers the temps by 3-5 degrees. This is at 1866MHz with no overclock; my ambient is around 27-29C depending on the time of day. Seems a little too hot. Does this mean my GPU needs a fresh application of TIM?


----------



## lightsout

EarlZ said:


> I seem to be getting around 81-85c with 100% fanspeed on my MSI Gaming X, with the side panel off I only lowers the temps by 3-5 degrees. This is at 1866Mhz and no overclock my ambient is around 27-29 depending on the time of day. Seems a little too hot, does this mean my GPU needs a fresh application of TIM ?


What case? Got a picture? Something sounds off, but I have a FTW3 that also struggles with temps (more like 75-80C) because the PSU shroud is right up against the GPU, smothering it.

Are you sure the fans are running? Do you use a custom fan curve?


----------



## kithylin

lightsout said:


> What case? Got a picture? Something sounds off but I have a FTW3 that also struggles with temps (more like 75-80c) because the PSU shroud is right up against the GPU smothering it.
> 
> Are you sure the fans are running? Do you use a custom fan curve?


My EVGA GTX 1080 Ti SC Black Edition (2 big fans) was doing the exact same thing when I first bought it, on the stock cooler, even in an Antec Twelve Hundred chassis with all the case fans at max speed, no other cards, no obstructions, and the video card's fans set to 100% in Afterburner: 80-85C and throttling down to 1850-1875MHz. I was assuming most 1080 Tis run hot like this on their original stock air coolers, except the rare ones like the MSI Lightning and the EVGA Kingpin cards.


----------



## EarlZ

lightsout said:


> What case? Got a picture? Something sounds off but I have a FTW3 that also struggles with temps (more like 75-80c) because the PSU shroud is right up against the GPU smothering it.
> 
> Are you sure the fans are running? Do you use a custom fan curve?


Meshify C; I am at work so I'll post photos later when I get home. The fans are spinning; 100% fan speed is just to eliminate the automatic factor. Intake fans are 2x 140mm be quiet! Silent Wings 3 and 1x 120mm.

I am also using a Z390 Master motherboard, and only the bottom PCIe slot is populated, with a Sound Blaster Z. Opening the side panel only gives a 3-5 degree drop, which tells me the airflow is decent. If I manually clock it higher with +75 on voltage and a +65 core offset, it stays at 1974-2000.


----------



## lightsout

kithylin said:


> My EVGA GTX 1080 Ti SC Black Edition (2 big fans) was doing the exact same thing, even in an Antec Twelve Hundred chassis with all the case fans at max speed, no other cards, no obstructions, and the video card's fans set to 100% in Afterburner: 80~85c and throttling down to 1850~1875 MHz when I first bought it, on the stock cooler. I was assuming that most 1080 Tis run hot like this on their original stock air coolers, except rare ones like the MSI Lightning and EVGA Kingpin cards.


I don't think so; look at some reviews for your card.


EarlZ said:


> Meshify C, I am at work so I'll post photos later when I get home. Fans are spinning. 100% fan speed just to eliminate the automatic factor. Intake fans are 2x 140mm be quiet silent wings 3 and 1x 120mm.
> 
> I am also using a Z390 Master motherboard and only the bottom PCIe slot is populated, with a Sound Blaster Z. Opening the side panel only yields a 3-5 degree drop, which tells me the airflow is decent. If I manually clock it higher with +75 on voltage and a +65 core offset, it stays at 1974-2000 MHz.


Yeah, if you look at what your card will do in an open-air setting, it's way better than that. Seems like your case has decent airflow; maybe the single 120 exhaust is choked out by the double 140s. That case is known to have good airflow though, I believe. Something seems amiss.

But neither of your cards is actually throttling; that's just what GPU Boost does now. Annoying, I know: you have to have very low temps for the clocks to stay stable. Heat throttling would be when it drops below the base clock.
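That distinction (Boost bins trimmed vs. true thermal throttling) is easy to check by logging temperature and SM clock together, e.g. `nvidia-smi --query-gpu=temperature.gpu,clocks.sm --format=csv,noheader -l 1 > gpu.log`, then seeing whether the clock ever falls below the base clock. A rough parser sketch; the 1480 MHz base clock is the 1080 Ti Founders Edition reference value, so adjust for partner cards:

```python
# Summarize an nvidia-smi log of "temperature.gpu, clocks.sm" CSV rows.
BASE_CLOCK_MHZ = 1480  # 1080 Ti FE base clock; partner cards vary

def summarize(log_text):
    rows = []
    for line in log_text.strip().splitlines():
        temp_s, clock_s = line.split(",")
        rows.append((int(temp_s.strip()),
                     int(clock_s.replace("MHz", "").strip())))
    min_clock = min(c for _, c in rows)
    return {
        "max_temp_c": max(t for t, _ in rows),
        "min_clock_mhz": min_clock,
        # bins shaved above base clock = normal GPU Boost behavior;
        # only a clock below base counts as thermal throttling
        "thermal_throttle": min_clock < BASE_CLOCK_MHZ,
    }
```

Feed it a few minutes of log captured while gaming; if `thermal_throttle` stays False, the clock dips are just Boost bins doing their thing.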


----------



## kithylin

lightsout said:


> I don't think so; look at some reviews for your card.


I was also (At the time) trying to overclock it as far as it would go on the stock cooler. Including increasing voltage to 1.087v with EVGA PrecisionX and trying for 1900 mhz, and I think I had the ram at +400 mhz too, and it just wouldn't go past 1875 mhz max and it was super hot. All of those reviews are with the cards just "plunked" in a system and run at stock clocks at stock fan curves. Most reviews don't try and overclock air cooled cards to their limits.


----------



## EarlZ

lightsout said:


> I don't think so; look at some reviews for your card.
> 
> 
> Yeah, if you look at what your card will do in an open-air setting, it's way better than that. Seems like your case has decent airflow; maybe the single 120 exhaust is choked out by the double 140s. That case is known to have good airflow though, I believe. Something seems amiss.
> 
> But neither of your cards is actually throttling; that's just what GPU Boost does now. Annoying, I know: you have to have very low temps for the clocks to stay stable. Heat throttling would be when it drops below the base clock.


I can definitely feel a good amount of warm air coming out of the PCIe vents, which is a good sign for me. I am thinking they have a sub-22c ambient on those charts to only hit 72c.

I agree that my card is not throttling below stock speeds, but I would want to see better temperatures if possible, which has led me to believe a re-paste might be required, although at the expense of warranty, as the official MSI distributor in my region does not allow TIM replacement by end users.


----------



## lightsout

kithylin said:


> I was also (At the time) trying to overclock it as far as it would go on the stock cooler. Including increasing voltage to 1.087v with EVGA PrecisionX and trying for 1900 mhz, and I think I had the ram at +400 mhz too, and it just wouldn't go past 1875 mhz max and it was super hot. All of those reviews are with the cards just "plunked" in a system and run at stock clocks at stock fan curves. Most reviews don't try and overclock air cooled cards to their limits.


That may be true, plenty of them do overclock though, the TPU review didn't share the temp when they OC'd. Temp still may be a little high but these cards do seem to run pretty warm.


----------



## Glottis

Recently upgraded to 4K and surprised how well my 1080Ti still holds up. I can play most games at max or near max settings at 4K 60FPS. I really wish Nvidia supported Freesync/VRR, it would be perfect when playing some of these badly optimized games in 4K. Hopefully my GPU lasts a good while until this GPU market insanity ends.


----------



## 2ndLastJedi

My Galax 1080 Ti EXOC (small dual-slot cooler) runs at 74c on the stock fan profile at anywhere from 1938 MHz down to 18-something, but since I have it undervolted to 1911 MHz at 0.900 V it runs at a max of 64c with a custom profile that never exceeds 68%, and the clock speed never moves from 1911 either, unlike stock where it jumps around like crazy. If I overclock, it will do 2088 MHz/12008 MHz at 82c using the same profile.
I would suggest learning to undervolt to get lower temps and a completely stable clock speed!
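The temperature drop from an undervolt follows from rough CMOS dynamic-power scaling, P ∝ f·V². A quick sketch comparing a hypothetical ~1.062 V stock point against the 0.900 V undervolt above (the stock voltage here is an assumption for illustration; read your own card's value from a monitoring tool):

```python
def dynamic_power_ratio(f_stock, v_stock, f_uv, v_uv):
    """Relative dynamic power, assuming P scales as f * V^2."""
    return (f_uv * v_uv ** 2) / (f_stock * v_stock ** 2)

# assumed stock point: ~1.062 V @ 1938 MHz; undervolt: 0.900 V @ 1911 MHz
ratio = dynamic_power_ratio(1938, 1.062, 1911, 0.900)
print(f"undervolted draw is roughly {ratio:.0%} of stock dynamic power")
```

Cutting roughly 30% of the heat output while giving up barely 1% of the clock is why the card above can hold a flat 1911 MHz at much lower temps.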


----------



## kithylin

2ndLastJedi said:


> My Galax 1080 Ti EXOC (small dual-slot cooler) runs at 74c on the stock fan profile at anywhere from 1938 MHz down to 18-something, but since I have it undervolted to 1911 MHz at 0.900 V it runs at a max of 64c with a custom profile that never exceeds 68%, and the clock speed never moves from 1911 either, unlike stock where it jumps around like crazy. If I overclock, it will do 2088 MHz/12008 MHz at 82c using the same profile.
> I would suggest learning to undervolt to get lower temps and a completely stable clock speed!


That doesn't always work for everyone, and every card is different. I tried under-volting, I tried over-volting; nothing worked, and nothing would let my 1080 Ti even get to 1900 MHz stable. It would either run too hot or hit the power limit (one of the two) and always throttle back down to 1850~1875 MHz; there was literally -NOTHING- that would let it overclock any further on air cooling. It just wouldn't go. I got extremely unlucky in the silicon lottery with this card, sadly. :thumbsdow

Eventually I had to put it under a custom water loop, load the XOC BIOS, and over-volt up to 1.200v, and finally I can use 1900, 2000, and even up to 2100 MHz with my card. And even then the card was running around 60c under a small custom water loop and still reducing clocks: I would set it for 2136 MHz in Afterburner, but then I'd load a game and it would drop down to 2126 and sometimes 2112 under gaming load when it ran up to 60c. :thumbsdow

That was with just a single 360mm radiator and an i7-5820K @ 4.4 GHz looped in with the card, though. I've since moved to a 4-radiator custom loop with 15 fans, and it finally holds 2126 MHz where I set it and stays around 45~47c (max observed) when gaming, depending on the game. In most games it actually runs around 32~35c now. :specool:


----------



## lightsout

kithylin said:


> That doesn't always work for everyone, and every card is different. I tried under-volting, I tried over-volting; nothing worked, and nothing would let my 1080 Ti even get to 1900 MHz stable. It would either run too hot or hit the power limit (one of the two) and always throttle back down to 1850~1875 MHz; there was literally -NOTHING- that would let it overclock any further on air cooling. It just wouldn't go. I got extremely unlucky in the silicon lottery with this card, sadly. :thumbsdow
> 
> Eventually I had to put it under a custom water loop, load the XOC BIOS, and over-volt up to 1.200v, and finally I can use 1900, 2000, and even up to 2100 MHz with my card. And even then the card was running around 60c under a small custom water loop and still reducing clocks: I would set it for 2136 MHz in Afterburner, but then I'd load a game and it would drop down to 2126 and sometimes 2112 under gaming load when it ran up to 60c. :thumbsdow
> 
> That was with just a single 360mm radiator and an i7-5820K @ 4.4 GHz looped in with the card, though. I've since moved to a 4-radiator custom loop with 15 fans, and it finally holds 2126 MHz where I set it and stays around 45~47c (max observed) when gaming, depending on the game. In most games it actually runs around 32~35c now. :specool:


Yeah I read something like it starts dropping boost clock bins around 50c or something crazy. I'd rather have the old days of GPU overclocking. But these do get you close with no work.


----------



## kithylin

lightsout said:


> Yeah I read something like it starts dropping boost clock bins around 50c or something crazy. I'd rather have the old days of GPU overclocking. But these do get you close with no work.


It's lower than that. I don't know exactly where the threshold is, but even keeping my 1080 Ti in the 40~45c range I still see certain games making it drop clocks from my set 2126 down to 2112, and I even saw it drop to 2108 MHz yesterday playing Ark: Survival Evolved, even though it was only doing 47~48c. The only way I can get my 1080 Ti to actually maintain the 2126 MHz I set is to play older games that don't load it as hard, where it can stay in the 30~34c range; even 35c can sometimes cause clocks to drop a little. It's very annoying. And if I set the overclock in Afterburner higher, say 2156 MHz, so that when it drops bins it's actually running at 2126, it crashes the driver. So I have to set it at 2126 and just hope it runs somewhere in the 2100~2126 range. I too miss the old days of overclocking, when you could load an unlimited-power BIOS and the card would run at the clocks you set.

As long as my card is running at least 2100 MHz under 100% load I'm happy, though. My loop currently isn't as big as I'd like: I ended up with 4 radiators, yes, but my setup is 360+240+240+120. Some day I may go with a different case and try for 4x 480mm, and maybe then I could finally keep my 1080 Ti under 40c at max overclock.


----------



## lightsout

kithylin said:


> It's lower than that. I don't know exactly where the threshold is, but even keeping my 1080 Ti in the 40~45c range I still see certain games making it drop clocks from my set 2126 down to 2112, and I even saw it drop to 2108 MHz yesterday playing Ark: Survival Evolved, even though it was only doing 47~48c. The only way I can get my 1080 Ti to actually maintain the 2126 MHz I set is to play older games that don't load it as hard, where it can stay in the 30~34c range; even 35c can sometimes cause clocks to drop a little. It's very annoying. And if I set the overclock in Afterburner higher, say 2156 MHz, so that when it drops bins it's actually running at 2126, it crashes the driver. So I have to set it at 2126 and just hope it runs somewhere in the 2100~2126 range. I too miss the old days of overclocking, when you could load an unlimited-power BIOS and the card would run at the clocks you set.
> 
> As long as my card is running at least 2100 MHz under 100% load I'm happy, though. My loop currently isn't as big as I'd like: I ended up with 4 radiators, yes, but my setup is 360+240+240+120. Some day I may go with a different case and try for 4x 480mm, and maybe then I could finally keep my 1080 Ti under 40c at max overclock.


That's crazy. I notice the same behavior: since I know what my card can do, setting it a bit above that to try to hit my desired clock will sometimes cause lockups. I have noticed maxing the voltage in Precision keeps the clocks a lot more stable. Not perfect, but much better than leaving it stock. I don't feel like it changes temps much either.


----------



## kithylin

lightsout said:


> That's crazy. I notice the same behavior: since I know what my card can do, setting it a bit above that to try to hit my desired clock will sometimes cause lockups. I have noticed maxing the voltage in Precision keeps the clocks a lot more stable. Not perfect, but much better than leaving it stock. I don't feel like it changes temps much either.


Yeah... I'm already at 1.200v, and we can't go any higher currently with 1080 Tis. Still sort of sad, but it is what it is. Even at 2100 MHz these cards are beastly and only about 10-15% behind a stock FE 2080 Ti, so this is good enough for me for now. I'm not upgrading again until they release a card that's at least a full +100% (in its stock form, FE, before overclocking) over my max-clocked 1080 Ti, so this will last me quite a while.


----------



## lightsout

kithylin said:


> Yeah... I'm already at 1.200v, and we can't go any higher currently with 1080 Tis. Still sort of sad, but it is what it is. Even at 2100 MHz these cards are beastly and only about 10-15% behind a stock FE 2080 Ti, so this is good enough for me for now. I'm not upgrading again until they release a card that's at least a full +100% (in its stock form, FE, before overclocking) over my max-clocked 1080 Ti, so this will last me quite a while.


Yeah, that may be a long wait. I remember watching der8auer OC these; it seems like they don't really scale with voltage anyway. If I remember correctly, temperature was the #1 thing that helped clocks.

You're right though, these cards are beastly. I still find myself wishing they had more power at 1440p; I want to be able to run max settings and hit 144fps.


----------



## dVeLoPe

https://www.youtube.com/watch?v=ZAIwtAUpHyA

is this video accurate?


----------



## lightsout

dVeLoPe said:


> https://www.youtube.com/watch?v=ZAIwtAUpHyA
> 
> is this video accurate?


It appears that SLI is not doing much at all in the BFV and COD benchmarks, as I get similar FPS with a single 1080ti.


----------



## Astral85

Has anyone ever experienced freeze/stalls in games with their 1080 Ti? In a few games I'm getting intermittent hard freezes where the GPU usage can drop as low as 0%. The freeze lasts 1-3 seconds before recovering to normal. It's not just the GPU usage that drops but the GPU power and FB% (memory controller) at the same time also. I have exhausted troubleshooting and at this point am going with a faulty graphics card...


----------



## pewpewlazer

lightsout said:


> It appears that SLI is not doing much at all in the BFV and COD benchmarks, as I get similar FPS with a single 1080ti.


How similar is "similar"? I haven't played CoD 2018, but BF5 is obviously very CPU bottlenecked. I find it hard to believe your 1% lows are "similar" to 100 FPS with an AMD CPU. And by hard to believe I mean impossible.


----------



## Zfast4y0u

BFV has good SLI support: 90%+ scaling with the BF1 profile applied to it via Nvidia Inspector.


----------



## 8051

lightsout said:


> Yeah, that may be a long wait. I remember watching der8auer OC these; it seems like they don't really scale with voltage anyway. If I remember correctly, temperature was the #1 thing that helped clocks.
> 
> You're right though, these cards are beastly. I still find myself wishing they had more power at 1440p; I want to be able to run max settings and hit 144fps.


My 1080 Ti with the Asus XOC VBIOS flashed requires 1.181V to get 2152 MHz, and that's on air. I don't get any throttling either, but if the temps rise north of ~63°C I get a driver reset.


----------



## kithylin

Zfast4y0u said:


> BFV has good SLI support: 90%+ scaling with the BF1 profile applied to it via Nvidia Inspector.


You're assuming people know about nvidia inspector. Quite a high percentage of computer owners don't know, or don't know how to mess with profiles (I do and do it all the time). Most people just plug in two gpu's and the connector/cable and go start the game. And "by default" there is currently no proper-scaling profile for BF-V (at least not yet), so a lot of review sites are reporting poor SLI scaling for BF-V currently. Most hardware review sites also won't screw around with profiles/inspector either because they assume that "most users" won't either. So even if it would fix it they won't even consider trying it for their reviews.


----------



## Zfast4y0u

kithylin said:


> You're assuming people know about nvidia inspector. Quite a high percentage of computer owners don't know, or don't know how to mess with profiles (I do and do it all the time). Most people just plug in two gpu's and the connector/cable and go start the game. And "by default" there is currently no proper-scaling profile for BF-V (at least not yet), so a lot of review sites are reporting poor SLI scaling for BF-V currently. Most hardware review sites also won't screw around with profiles/inspector either because they assume that "most users" won't either. So even if it would fix it they won't even consider trying it for their reviews.


I don't assume; the majority are just straight-up like that. That's why I clearly said how you can "fix" it with Nvidia Inspector, if you care to mess with it. Anyhow, the BF1 profile applied to BFV has its issues as well, at least for me: massive distortions on weapons when moving, something like motion blur on steroids, but the scaling is great. As for the guy who made that video, I don't like how he makes videos at all: we don't see how the cards behave, just graphs (don't like those either), so I wouldn't trust such a comparison video anyway. Btw, what's the point of comparing SLI configurations to each other in a game that doesn't work properly in SLI out of the box?


----------



## cg4200

dVeLoPe said:


> https://www.youtube.com/watch?v=ZAIwtAUpHyA
> 
> is this video accurate?


I would say it's not a good video if that's right: the 1080 Ti is running at 2037, look at the other two.


----------



## AlbertoM

I had SLI with a GTX 295 back in 2012... Never again...

Only a few games work with it... You spend nights searching forums for profiles... Newer games never have support immediately; you have to wait months for patches, drivers, planets to align.

When it worked it was amazing, but I can count the games I played in SLI on the fingers of one hand.

It's advertising bull****, a killer of PSUs, and a marketing-driven waste of energy.


----------



## kithylin

AlbertoM said:


> I had SLI with a GTX 295 back in 2012... Never again...
> 
> Only a few games work with it... You spend nights searching forums for profiles... Newer games never have support immediately; you have to wait months for patches, drivers, planets to align.
> 
> When it worked it was amazing, but I can count the games I played in SLI on the fingers of one hand.
> 
> It's advertising bull****, a killer of PSUs, and a marketing-driven waste of energy.


At least 80% of games scale well today with just AFR2. And it's not advertising BS: with SLI you get access to 64xSSAA for older games and such, while we're limited to 32xSSAA on a single card, so there is a benefit to it. And if you set it up to scale correctly, SLI is pretty amazing these days, especially with how powerful our modern cards are. Currently, two high-end cards in SLI (like 1080 Tis) are the only way to play 4K above 60 FPS minimums (at all times, even 0.1% lows) in AAA games @ max ultra settings. So if people want to game in 4K, they have no choice but to use SLI currently.


----------



## lightsout

pewpewlazer said:


> How similar is "similar"? I haven't played CoD 2018, but BF5 is obviously very CPU bottlenecked. I find it hard to believe your 1% lows are "similar" to 100 FPS with an AMD CPU. And by hard to believe I mean impossible.


I don't think I mentioned 1% lows. I definitely keep over 100fps, I am not measuring 1% lows though, could check it out later. I do need to turn down settings to get closer to 144fps. But with dual cards I would be hoping for much better performance than that video.


----------



## The Pook

pewpewlazer said:


> BF5 is obviously very CPU bottlenecked. I find it hard to believe your 1% lows are "similar" to 100 FPS with an AMD CPU. And by hard to believe I mean impossible.



At 1080p, sure - just like any other game. But at 1440p ... not really. There's no difference between a 2080 (~1080 Ti) on a Ryzen 7 and a 9900K. 

(benchmark graph not preserved)
That being said ... My 9900K is on the way :wheee:


----------



## kithylin

The Pook said:


> At 1080p, sure - just like any other game. But at 1440p ... not really. There's no difference between a 2080 (~1080 Ti) on a Ryzen 7 and a 9900K.
> 
> That being said ... My 9900K is on the way :wheee:


Curious how there's no actual 1080 Ti, or 1080 in that graph, but they do have the 1070.


----------



## The Pook

kithylin said:


> Curious how there's no actual 1080 Ti, or 1080 in that graph, but they do have the 1070.



Since the 2080 and 1080 Ti are so close I'd imagine it's just a time saving measure. It's hard to accurately bench multiplayer games, fewer cards = less time.


----------



## kithylin

The Pook said:


> Since the 2080 and 1080 Ti are so close I'd imagine it's just a time saving measure. It's hard to accurately bench multiplayer games, fewer cards = less time.


I haven't played it yet, but (supposedly?) there is a static/scripted single-player aspect to the game as well, which (and I'm guessing here) is probably what most reviewers are benchmarking with.


----------



## The Pook

kithylin said:


> I haven't played it yet, but (supposedly?) there is a static/scripted single-player aspect to the game as well, which (and I'm guessing here) is probably what most reviewers are benchmarking with.



Single player benchmarks are useless, just look for 64/32 player Conquest benchmarks, those are online. If you go by single player benchmarks then dual cores with HT are managing 60+ FPS just fine while barely being able to maintain 40FPS average in multiplayer with frequent dips and frame time spikes.


----------



## kithylin

The Pook said:


> Single player benchmarks are useless, just look for 64/32 player Conquest benchmarks, those are online. If you go by single player benchmarks then dual cores with HT are managing 60+ FPS just fine while barely being able to maintain 40FPS average in multiplayer with frequent dips and frame time spikes.


Unfortunately you are right there. But most reviewer websites almost always (90% of the time) review single-player modes for all games, even Battlefield games. It's sadly just the nature of the industry. There may be a few small-time reviewers that test multiplayer mode, but that's kind of rare.


----------



## lightsout

The Pook said:


> Single player benchmarks are useless, just look for 64/32 player Conquest benchmarks, those are online. If you go by single player benchmarks then dual cores with HT are managing 60+ FPS just fine while barely being able to maintain 40FPS average in multiplayer with frequent dips and frame time spikes.


Yeah, true, you have to find the ones that test multiplayer; it can be deceiving if you don't. 9900K, huh? Nice. What are you using to cool it, that Corsair AIO? How is that thing working out?


----------



## The Pook

lightsout said:


> Yeah, true, you have to find the ones that test multiplayer; it can be deceiving if you don't. 9900K, huh? Nice. What are you using to cool it, that Corsair AIO? How is that thing working out?



Pretty well despite being in a super restrictive case with virtually no air flow  Temps are marginally better (~2c) than the Le Grand Macho RT and it's 100% silent which is nice since my old H100 was pretty loud. Have a feeling it'll struggle with the 9900K if I want to OC but we'll see. Even at stock it'll run circles around my 6700 @ 4.6. 

I was planning on waiting for the 7nm Ryzen chips but last night I got a System Service Exception BSOD and it refused to POST after. Got it to boot this morning with a 4GB stick of Corsair RAM @ 2400 but it refused to boot again after I decided to try to get it to boot with 8GB. Took about 2 hours to get it to boot again and I'm just hoping it lasts until the new parts come in


----------



## lightsout

The Pook said:


> Pretty well despite being in a super restrictive case with virtually no air flow  Temps are marginally better (~2c) than the Le Grand Macho RT and it's 100% silent which is nice since my old H100 was pretty loud. Have a feeling it'll struggle with the 9900K if I want to OC but we'll see. Even at stock it'll run circles around my 6700 @ 4.6.
> 
> I was planning on waiting for the 7nm Ryzen chips but last night I got a System Service Exception BSOD and it refused to POST after. Got it to boot this morning with a 4GB stick of Corsair RAM @ 2400 but it refused to boot again after I decided to try to get it to boot with 8GB. Took about 2 hours to get it to boot again and I'm just hoping it lasts until the new parts come in


Oh man that sucks. Yeah sounds like the time was now. Are you still using a P400?


----------



## The Pook

lightsout said:


> Oh man that sucks. Yeah sounds like the time was now. Are you still using a P400?



Yep, still the terrible P400S. I had a PC-O11 sitting in my cart for about a week but I decided to hold off with the new credit card bill I'll get next month


----------



## skupples

tfw you buy a 2080, then realize the previous Ti is as fast as, or faster than it, 90% of the time.

guess I'll just go from 1070 to 2080 Ti so I can actually see a good difference.

no point in upgrading my GPU if the new one can still barely handle 4K, at a way higher price than a used 1080 Ti


----------



## Hydroplane

I set my pair to +46 core clock in afterburner, for 2012 MHz total, seems stable. Afterburner doesn't seem to "apply" voltage or power limit changes so I will work on that next. +69 at stock voltage was a no go


----------



## skupples

=\ and it seems eBay is well aware of this, thus selling 1080 Tis at near 2080 Ti prices, used or new.

i can't blame them. we did the same thing with the original Titan. Sold my three off 7-9 years later.


----------



## The Pook

I'm sure when the RTX drivers mature the 2080 will slightly outperform the 1080 Ti consistently. The drivers for the 1080 Ti have been worked on for 2 years at this point.

But yeah, you missed the boat on cheap 1080 Tis. The day the RTX announcement stream went live on Twitch you could get 1080 Tis for $450 - $500 used. I grabbed mine for ~$590 new on eBay through Newegg the day after with the 15% off everything coupon that went out at the same time


----------



## Hydroplane

I bumped my memory up to 6264 MHz (+650). Tried for 6350 but started to get artifacting. Not sure if 2012/6264 is good at stock voltage


----------



## skupples

The Pook said:


> I'm sure when the RTX drivers mature the 2080 will slightly outperform the 1080 Ti consistently. The drivers for the 1080 Ti have been worked on for 2 years at this point.
> 
> But yeah, you missed the boat on cheap 1080 Tis. The day the RTX announcement stream went live on Twitch you could get 1080 Tis for $450 - $500 used. I grabbed mine for ~$590 new on eBay through Newegg the day after with the 15% off everything coupon that went out at the same time


I've got a few bids going around that point. I'm aiming at something pre-blocked for $600 or less; I'll know in the morning if I'm getting it.

If not, I'll buy a second 1070 for $200 & call it a day for another year. (Yes, I know 10-series SLI scaling is garbage.)

The margin after improved drivers will not be great enough, especially with all the featured tech being too new to appear in anything major until the next tick of the clock, so back to Amazon you go, crappy Zotac 2080.

I'm just not seeing the performance all the videos show. My 9700K is on air atm, but still comfortably pushing 4.8 thanks to it being 50F outside.  it's a good ol' fashioned "on top o' da mobo" test build.

It really seemed like a viable 4K card pre-purchase, but I'm just not seeing it yet.


----------



## The Pook

Hydroplane said:


> I bumped my memory up to 6264 MHz (+650). Tried for 6350 but started to get artifacting. Not sure if 2012/6264 is good at stock voltage



My card is only good for 1974/6500. General consensus is anything over 2000 is good for the core. 



skupples said:


> I've got a few bids going around that point. I'm aiming at something pre-blocked for $600 or less; I'll know in the morning if I'm getting it.
> 
> If not, I'll buy a second 1070 for $200 & call it a day for another year. (Yes, I know 10-series SLI scaling is garbage.)
> 
> The margin after improved drivers will not be great enough, especially with all the featured tech being too new to appear in anything major until the next tick of the clock, so back to Amazon you go, crappy Zotac 2080.
> 
> I'm just not seeing the performance all the videos show. My 9700K is on air atm, but still comfortably pushing 4.8 thanks to it being 50F outside.  it's a good ol' fashioned "on top o' da mobo" test build.
> 
> It really seemed like a viable 4K card pre-purchase, but I'm just not seeing it yet.



2070 is definitely not a 4K card, the 1080 Ti/2080 are barely 4K cards 

Scaling on the 1000 series isn't any better or any worse than any other generation - it's just that few games scale properly and almost none perform well without frametime issues.


----------



## skupples

The Pook said:


> My card is only good for 1974/6500. General consensus is anything over 2000 is good for the core.
> 
> 
> 
> 
> 2070 is definitely not a 4K card, the 1080 Ti/2080 are barely 4K cards
> 
> Scaling on the 1000 series isn't any better or any worse than any other generation - it's just that few games scale properly and almost none perform well without frametime issues.


yeah, I didn't mean 2070, I meant sticking with my 1070, on a giant 4K screen, @ 1080p for another year

that's definitely my general opinion. Even the 2080 Ti doesn't seem to be doing all too well, until you see the NVLink numbers.

the other issue is this Zotac blower card is 100% locked down, & the blower is slow & quiet (what? a slow quiet hairdryer?!!?!?!) thus the card is running @ max temps the minute you load anything up = total garbage.

i spent 3K on GPUs once, when the original Titan came out, and the first two cards were definitely worth it (scaled well, and more supported titles back then I guess), but spending 3K on 2x 2080 Ti + blocks, which comes to more like 3.5K or higher, just doesn't seem all that worthwhile for me. I shoulda gone from Titan to 1080 Ti, but instead went to 1070, which has left me stuck in the mediocre-upgrade spot.

Thus the wait goes on, and maybe for the first time in a decade Red team will get my money with their next release.


----------



## The Pook

skupples said:


> yeah, I didn't mean 2070, I meant sticking with my 1070, on a giant 4K screen, @ 1080p for another year
> 
> that's definitely my general opinion. Even the 2080 Ti doesn't seem to be doing all too well, until you see the NVLink numbers.
> 
> the other issue is this Zotac blower card is 100% locked down, & the blower is slow & quiet (what? a slow quiet hairdryer?!!?!?!) thus the card is running @ max temps the minute you load anything up = total garbage.
> 
> i spent 3K on GPUs once, when the original Titan came out, and the first two cards were definitely worth it (scaled well, and more supported titles back then I guess), but spending 3K on 2x 2080 Ti + blocks, which comes to more like 3.5K or higher, just doesn't seem all that worthwhile for me. I shoulda gone from Titan to 1080 Ti, but instead went to 1070, which has left me stuck in the mediocre-upgrade spot.
> 
> Thus the wait goes on, and maybe for the first time in a decade Red team will get my money with their next release.



You can't set a custom fan curve in Afterburner or Precision X? 

My card is quiet under 50% and basically inaudible at 40% so I force it to never drop below 40%. I idle in the low/mid 20s and never really see higher than 60-65c.


----------



## skupples

kithylin said:


> At least 80% of games scale well today with just AFR2. And it's not advertising BS: with SLI you get access to 64xSSAA for older games and such, while we're limited to 32xSSAA on a single card, so there is a benefit to it. And if you set it up to scale correctly, SLI is pretty amazing these days, especially with how powerful our modern cards are. Currently, two high-end cards in SLI (like 1080 Tis) are the only way to play 4K above 60 FPS minimums (at all times, even 0.1% lows) in AAA games @ max ultra settings. So if people want to game in 4K, they have no choice but to use SLI currently.


yeah, i'm actually not all that impressed at this point, the gains keep getting smaller & smaller. Maybe the next one in... 2020?!?!?! will show some promise, but 20x0 series is a total bust unless you're coming up from something @ 9 series or lower. I'll definitely keep watching all the high end cards on ebay though. I love seeing the price fluctuations after major press releases, events, etc. 

for now - just got a 1070 Ti for $225 (with free shipping) on ebay, & will put my old 1070 in the old build.


----------



## AlbertoM

kithylin said:


> At least 80% of games scale well today with just AFR2. And it's not advertising BS. With nvidia with SLI you get access to 64xSSAA for older games or such, and we're limited to 32xSSAA for single-card. So there is a benefit to it. And if you set it up to scale correctly. SLI is pretty amazing these days, especially with our modern cards that are so powerful today. Currently Two high end cards for SLI (like 1080 Ti) are the only way to play 4K above 60 FPS minimums (at all times, even 0.1% lows) in AAA Games @ max ultra settings. So if people want to game in 4K, they have no choice but to use SLI currently.



AFR2 does nothing for SLI. It's a setting that uses both cards to get the same or even fewer FPS than one, and it introduces stutters, glitches, and lag. If the game is not developed for SLI, it's useless to try anything on it. And most games are not SLI compatible.

Thus the name ALTERNATE FRAME RENDERING. One frame on one card, another on the other card, ALTERNATED. They are not rendered at the same time, so you don't gain any performance, you only waste energy using 2 cards to do a job that one will do better.

It's like a game developed to use only one core when you have 2. It is useless.
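The round-robin assignment being described can be sketched in a few lines (a toy illustration of the scheme, not how the driver actually schedules work):

```python
def afr_schedule(num_frames, num_gpus=2):
    """Alternate Frame Rendering: frame i is rendered by GPU i % num_gpus."""
    return [i % num_gpus for i in range(num_frames)]

# Two-way AFR over 6 frames: the GPUs strictly alternate.
# Throughput can rise because two frames are in flight at once,
# but each individual frame still takes one full GPU's render time,
# so per-frame latency (input lag) does not drop.
print(afr_schedule(6))  # [0, 1, 0, 1, 0, 1]
```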


----------



## skupples

AlbertoM said:


> AFR2 does nothing for SLI. It's a setting that uses both cards to get the same or even fewer FPS than one, and it introduces stutters, glitches, and lag. If the game is not developed for SLI, it's useless to try anything on it. And most games are not SLI compatible.
> 
> It's like a game developed to use only one core when you have 2. It is useless.


this is the main issue i've seen as a long-time Surround + SLI user. First, games stopped programming for it; second, NVIDIA stopped updating for it; third, they stopped including it entirely in anything that isn't top tier. (same for surround support, minus the minor mixed-monitor update)

I'm really hoping to see something decent come out of NVLink. Early benches with the 2080 Ti are promising; however, NV's pricing schemes would never allow you to actually benefit financially from modern SLI (NVLink).


----------



## kithylin

AlbertoM said:


> AFR2 does nothing for SLI. It's a setting that uses both cards to get the same or even fewer FPS than one, and it introduces stutters, glitches, and lag. If the game is not developed for SLI, it's useless to try anything on it. And most games are not SLI compatible.
> 
> Thus the name ALTERNATE FRAME RENDERING. One frame on one card, another on the other card, ALTERNATED. They are not rendered at the same time, so you don't gain any performance, you only waste energy using 2 cards to do a job that one will do better.
> 
> It's like a game developed to use only one core when you have 2. It is useless.


The last time I used SLI was with a pair of GTX 770's, and AFR2 / AFR totally worked, at least sometimes. I would take certain games that literally wouldn't even wake the second card out of idle mode, use Nvidia Inspector to force AFR2, and suddenly the second card was in use, I'd see a +50% to +70% performance boost, and SLI started scaling. I saw this with my own eyes in games where the default Nvidia profile set the game to "AFR" mode and the second card wasn't even used. Most of the time it worked fine, and I had a lot of games go from < 50 FPS to > 60 after flipping that.

It just sounds to me like you haven't personally owned an SLI setup, or if you have, you haven't tried tweaking settings with Nvidia Inspector. I've quite often had success getting games that "don't support SLI" according to online reviews/benchmark sites to not only use SLI but show positive scaling, just by flipping a few flags with Inspector. SLI owners have to invest a little time to find what works. Fortunately there are no 3rd-party programs running while we game: with Nvidia Inspector, once you figure it out, it's saved in the profile and loads automatically every time you start the game.

Other things too: there's an SLI-AA setting that lets both cards work on anti-aliasing. The default Nvidia control panel won't even expose that setting to users; it's only visible with Nvidia Inspector. A lot of games that showed marginal gains like +20% or +30% on benchmark/review sites turned out that way because only the primary card was working on AA. Often I could force-enable SLI-AA and suddenly see much better scaling and performance. Nvidia is terrible at SLI profiles; "out of the box" most of them don't work or aren't any good at all. It's actually rather rare to see any of Nvidia's profiles work correctly with SLI out of the box.

This is all kind of off topic though.. sorry, guess we should stop discussing this here.


----------



## skupples

some of us got tired of tweaking nvinspector per game, generation after generation; it's why I no longer game in surround or use SLI, for now at least. Like I said, I look forward to it being worth a damn once again. NVLink hopefully means the driver packs will come with SLI profiles once again; this is also much easier for them to manage now that most of us have submitted to modern NVUpdate (Experience).

the community has grown thanks to NV's lack of attention, so nowadays you can find pre-configured bits & profiles, but it's still a pita.

woot, just dropped an offer on a Ti w/ EK block n back plate starting @ $500. I'll do it for 6 all day long, as that's half of what the 2080 + block & plate cost.


----------



## AlbertoM

kithylin said:


> The last time I used SLI was with a pair of GTX 770's, and AFR2 / AFR totally worked, at least sometimes. I would take certain games that literally wouldn't even wake the second card out of idle mode, use Nvidia Inspector to force AFR2, and suddenly the second card was in use, I'd see a +50% to +70% performance boost, and SLI started scaling. I saw this with my own eyes in games where the default Nvidia profile set the game to "AFR" mode and the second card wasn't even used. Most of the time it worked fine, and I had a lot of games go from < 50 FPS to > 60 after flipping that.
> 
> It just sounds to me like you haven't personally owned an SLI setup, or if you have, you haven't tried tweaking settings with Nvidia Inspector. I've quite often had success getting games that "don't support SLI" according to online reviews/benchmark sites to not only use SLI but show positive scaling, just by flipping a few flags with Inspector. SLI owners have to invest a little time to find what works. Fortunately there are no 3rd-party programs running while we game: with Nvidia Inspector, once you figure it out, it's saved in the profile and loads automatically every time you start the game.
> 
> Other things too: there's an SLI-AA setting that lets both cards work on anti-aliasing. The default Nvidia control panel won't even expose that setting to users; it's only visible with Nvidia Inspector. A lot of games that showed marginal gains like +20% or +30% on benchmark/review sites turned out that way because only the primary card was working on AA. Often I could force-enable SLI-AA and suddenly see much better scaling and performance. Nvidia is terrible at SLI profiles; "out of the box" most of them don't work or aren't any good at all. It's actually rather rare to see any of Nvidia's profiles work correctly with SLI out of the box.
> 
> This is all kind of off topic though.. sorry, guess we should stop discussing this here.


I did all that and saw it with my very own eyes with my GTX 295. If the game does not support SLI, you're better off disabling it and using only one card. And at the time I had Nvidia 3D glasses and a 120Hz 3D monitor, so I could really use a performance boost.

When a game supports SLI, you get almost double the FPS of a single card. Almost, because the process itself demands a lot of work, so you lose a few FPS; it's like 60 FPS on a single card vs 110 FPS in SLI.
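Those example numbers work out to a concrete scaling figure (just a back-of-the-envelope check on the 60 vs 110 FPS claim):

```python
def sli_scaling_efficiency(single_fps, sli_fps, num_gpus=2):
    """Fraction of the theoretical ideal (num_gpus * single_fps) achieved."""
    return sli_fps / (num_gpus * single_fps)

# 60 FPS on one card vs 110 FPS in two-way SLI:
print(f"{sli_scaling_efficiency(60, 110):.0%}")  # 92%
```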

Show me a video/review/picture/case of any performance boost of any AFR mode versus a single card.

al·ter·nate
verb
verb: alternate; 3rd person present: alternates; past tense: alternated; past participle: alternated; gerund or present participle: alternating
/ˈôltərˌnāt/

1.
occur in turn repeatedly.
"the governorship alternated between the Republican and Democratic parties"
synonyms:	be interspersed, occur in turn, rotate, follow one another; More

Logically it doesn't make sense, and in practice too. ALTERNATE. You just alternate the rendering with the other card, hence the stutters, glitches, and lag, because they have to sync back up again. It is actually worse, not to mention the energy/power cost.

And that SLI-AA mode: I tried it in EVERY game I played that did not support SLI. IT NEVER WORKED. No matter how you tuned the profile. Another marketing bull****.

You have all the statements of users here. I don't need to convince you. The guy above just said it.

If their marketing is so good that they can fool you, more power to SLI. Literally, you are going to need a lot of power lol


----------



## skupples

you're talking almost 10 generations of GPU ago...

sli is dead, its being reborn as NvLink, and only for top end models. 

still, Sli was pretty damn effective for me back on 4xx, 5xx, 6xx, and OG Titan when running 5760x1080p. It sucked on my 1070 though.

two things changed.

games and nvidia both stopped giving a damn, even though it helps NV sell duplicates of top shelf products. Clearly there's an inherent flaw in the original SLI technology, thus it's DEAD with the modern generation and instead replaced with damn near a full PCI-E 16X slot.


three things actually, api, developers, and nvidia.


----------



## kithylin

AlbertoM said:


> I did all that and saw it with my very own eyes with my GTX 295. If the game does not support SLI, you're better off disabling it and using only one card. And at the time I had Nvidia 3D glasses and a 120Hz 3D monitor, so I could really use a performance boost.


Quite an awful lot changed in the hardware and drivers between 200 series and 700 series. You can't really compare the two.



AlbertoM said:


> Show me a video/review/picture/case of any performance boost of any AFR mode versus single card.


Unfortunately, no one anywhere ever tests editing the nvidia profiles. Because like I stated previously, at least 90% or more of computer gamers just stick in cards, load drivers, and go. And if things don't work right, they just assume "Oh this game doesn't support SLI." So most review/testing sites won't bother testing messing with profiles because no one else does, there's no point. 



AlbertoM said:


> If their marketing is so good that they can fool you, more power to SLI. Literally you are going to need a lot of power lol


Nothing fooled anyone. I owned that setup (dual GTX 770's) for about 3 years, used it daily, and saw for myself that everything I stated above is true.

Anyway. We're -WAY- off topic of this thread. Can we please drop this and stick to discussing 1080 Ti's in here?


----------



## KedarWolf

AlbertoM said:


> I did all that and saw it with my very own eyes with my GTX 295. If the game does not support SLI, you're better off disabling it and using only one card. And at the time I had Nvidia 3D glasses and a 120Hz 3D monitor, so I could really use a performance boost.
> 
> When a game supports SLI, you get almost double the FPS of a single card. Almost, because the process itself demands a lot of work, so you lose a few FPS; it's like 60 FPS on a single card vs 110 FPS in SLI.
> 
> Show me a video/review/picture/case of any performance boost of any AFR mode versus a single card.
> 
> al·ter·nate
> verb
> verb: alternate; 3rd person present: alternates; past tense: alternated; past participle: alternated; gerund or present participle: alternating
> /ˈôltərˌnāt/
> 
> 1.
> occur in turn repeatedly.
> "the governorship alternated between the Republican and Democratic parties"
> synonyms:	be interspersed, occur in turn, rotate, follow one another; More
> 
> 
> 
> Logically it doesn't make sense, and in practice too. ALTERNATE. You just alternate the rendering with the other card, hence the stutters, glitches, and lag, because they have to sync back up again. It is actually worse, not to mention the energy/power cost.
> 
> And that SLI-AA mode: I tried it in EVERY game I played that did not support SLI. IT NEVER WORKED. No matter how you tuned the profile. Another marketing bull****.
> 
> You have all the statements of users here. I don't need to convince you. The guy above just said it.
> 
> If their marketing is so good that they can fool you, more power to SLI. Literally, you are going to need a lot of power lol


In many non-SLI games you can enable SLI with Nvidia Profile Inspector tweaks. :h34r-smi


----------



## Hydroplane

What's the best way to overvolt these? MSI Afterburner only lets me get from 1.06v to 1.09v. I'm not even at 40C so plenty of headroom to go


----------



## bmgjet

Hydroplane said:


> What's the best way to overvolt these? MSI Afterburner only lets me get from 1.06v to 1.09v. I'm not even at 40C so plenty of headroom to go


An XOC BIOS allows more, but clock for clock it's slower and you lose all the card's safety functions.
A hardware mod isn't too hard on this card.


----------



## AlbertoM

KedarWolf said:


> In many non-SLI games you can enable SLI with Nvidia Profile Inspector tweaks. :h34r-smi


Yeah sure, I know that. For some, you can use another game's real SLI profile to try to fool the hardware. Some work with games on a similar engine, and you get little side effects, some artifacts, etc., nothing to die for.

But MOSTLY it's that crappy AFR thing that does nothing but share GPU usage across the two GPUs and introduce a lot more crappy stutter and ****.

I understand people and NVIDIA trying to make this work. But the current tech will never go on. It's the same thing as with CPUs. When we had 2 CPUs on one motherboard, it was the most impractical and least useful stuff ever. When they started putting 2 or more logical CPUs in one physical die, then it started getting somewhere. There will be a time when we have multiple GPUs in one die, and games developed for that like we have for CPUs now, but until then I will not, and a lot of people who tried SLI will not, try that stuff anymore.

Of course, if you have an unlimited budget and want to build a poser machine, go crazy, make 4-way SLI with a 2500W PSU.

But for the everyday enthusiast gamer who has to pay the electric bill for the stuff in their PC, no, no, no.... No no no no no....

The only exception would be if you like a game that you will play a lot and it supports SLI, and you know you will keep playing it in the future; then go for SLI for sure. Maybe you get lucky and find another five games in your hardware's lifespan that support SLI and that you like, and you will enjoy it when playing those exact games. In all other cases it is useless.


----------



## skupples

AlbertoM said:


> Yeah sure, I know that. For some, you can use another game's real SLI profile to try to fool the hardware. Some work with games on a similar engine, and you get little side effects, some artifacts, etc., nothing to die for.
> 
> But MOSTLY it's that crappy AFR thing that does nothing but share GPU usage across the two GPUs and introduce a lot more crappy stutter and ****.
> 
> I understand people and NVIDIA trying to make this work. But the current tech will never go on. It's the same thing as with CPUs. When we had 2 CPUs on one motherboard, it was the most impractical and least useful stuff ever. When they started putting 2 or more logical CPUs in one physical die, then it started getting somewhere. There will be a time when we have multiple GPUs in one die, and games developed for that like we have for CPUs now, but until then I will not, and a lot of people who tried SLI will not, try that stuff anymore.


i know plenty of us are looking forward to NVLink, idk though. I only spend most of my time in surround/WSGF/SLI forums.

SLI is dead, y'all are beating a dead horse. It sucked when it came out, n continued to suck; however, it definitely had a prime, pre-DX11 through DX11's EOL.


----------



## skupples

i know double posting is a sin, but to switch the topic back to 1080 ti...

they're only $100 a piece if you're in Aussieland; bill paid, US $193.78 for 2x

a few others popped up this morning ~$100 Buy It Now, in the US, but they're gone in a flash.

so - let's recap, assuming these show up -

returned the 2080, insurance, block, n plate for $1,100 - grabbed a new Magni/Modi & 2x 1080 for less than half that...

what am I missing? Is there some modern ebay scam? It's customer centric, I get my money back either way, unless that's changed too XD

No, I'm not taking bets on the likelihood of them being flashed dinosaurs.


----------



## The Pook

I'll eat my shoes if they're legit. If they scam 20 people and they can only deposit 1-3 to their bank, withdraw the money, and close their bank account before PayPal/eBay catches on it's still a successful scam.


----------



## skupples

yeah, we guessed compromised account in OMPT, which it ended up being.

at this point they've literally been inundated with good accounts turned compromised, selling all types of computer stuff for ~600-700 CNY.

ebay has already severed the transaction.


----------



## 8051

skupples said:


> yeah, we guessed compromised account in OMPT, which it ended up being.
> 
> at this point they've literally been inundated with good accounts turned compromised, selling all types of computer stuff for ~600-700 CNY.
> 
> ebay has already severed the transaction.


OMPT?


----------



## feznz

one million post thread


----------



## rluker5

AlbertoM said:


> Yeah sure, i know that. Some you can use another games real SLI profiles to try fooling the hardware. Some work with similar engine games, and you have little side effects, some artifacts, etc, nothing to die for.
> 
> But MOSTLY are that crappy AFR thing that does nothing but sharing GPU usage on the two GPUs and introducing a lot of more crappy stutter and ****.
> 
> I understand people and nVIDIA to try to make this work. But the current tech will never go on. Its the same thing with CPUS. When we had 2 cpus in one motherboard, it was the most unpratical and not useful stuff ever. When they started putting 2 or more logical CPUS in one physical die, then it started getting somewhere. There will be that time when we will have multiple GPUS in one die, and games developed for that like we have for CPUS now, but until there, i will not, and a lot of people that tried SLI will not try that stuff anymore.
> 
> Of course, if you have unlimited budget and want to build a poser machine, go crazy, make 4 SLI and 2500W PSU.
> 
> But for the everyday enthusiast gamer that has to pay the electrical bill and the stuff it puts on their PC, no, no, no.... No no no no no....
> 
> The only exception would be if you like a game and you will play that a lot and it supports SLI, and you know you will be playing that a lot for future, then go for SLI for sure. Maybe you get lucky and find another five games on your hardware lifespan that gets to support SLI and that you like, and you will enjoy if you are playing those exact games. In all other cases it is useless.


You do know that SLI users can go single-GPU in NVCP, right?

The last time it was worth it for me to do that was when I played Wolfenstein: The New Order, and I've played dozens since. Right now the artifact I have to deal with is flickering in Shadow of the Tomb Raider, and I go to graphics settings and turn tessellation off then on to fix it. Generally just when I start, but sometimes after a long cutscene too. I play vsync 4k60, though, and maybe few others do. But SLI 1080 Ti is as smooth at it as a single card at 1440p. You need SLI 1080 Ti, SLI 2080, or a single 2080 Ti for very high settings @ 4k. You can choose, or go without. It is WAY better than dropping into the 30s FPS or playing 1440p on your 4k. Hate all you want, but most games support it just fine, and for the ones that don't, you settle for single-GPU perf, so you lose no performance, just money.

If the price is more than it is worth to you, that is a valid opinion. They are only video games. But it does generally work; it just isn't cost efficient.


----------



## AlbertoM

rluker5 said:


> You do know that SLI users can go single-GPU in NVCP, right?
> 
> The last time it was worth it for me to do that was when I played Wolfenstein: The New Order, and I've played dozens since. Right now the artifact I have to deal with is flickering in Shadow of the Tomb Raider, and I go to graphics settings and turn tessellation off then on to fix it. Generally just when I start, but sometimes after a long cutscene too. I play vsync 4k60, though, and maybe few others do. But SLI 1080 Ti is as smooth at it as a single card at 1440p. You need SLI 1080 Ti, SLI 2080, or a single 2080 Ti for very high settings @ 4k. You can choose, or go without. It is WAY better than dropping into the 30s FPS or playing 1440p on your 4k. Hate all you want, but most games support it just fine, and for the ones that don't, you settle for single-GPU perf, so you lose no performance, just money.
> 
> If the price is more than it is worth to you, that is a valid opinion. They are only video games. But it does generally work; it just isn't cost efficient.


Yeah man, I know. My GTX 295 SLI card mostly had SLI disabled in the control panel during its lifetime.

Well, I had it; I know that it's not worth it in any way. It makes more sense to buy a better card as soon as it's released, if you have the money for that.

I think that's why I only tried the GTX 295: it was the one and only SLI card with 2 GPUs in one package. The price against the second-tier GTX 280 (remembering the 295 consisted of two 280s together) was not exactly double, so IF SLI worked fine, that would be the bargain of a lifetime.

This GPU configuration worked out so poorly that NVIDIA has never tried to sell an SLI card like that again to this day, and probably never will.


----------



## 8051

AlbertoM said:


> Yeah man, I know. My GTX 295 SLI card mostly had SLI disabled in the control panel during its lifetime.
> 
> Well, I had it; I know that it's not worth it in any way. It makes more sense to buy a better card as soon as it's released, if you have the money for that.
> 
> I think that's why I only tried the GTX 295: it was the one and only SLI card with 2 GPUs in one package. The price against the second-tier GTX 280 (remembering the 295 consisted of two 280s together) was not exactly double, so IF SLI worked fine, that would be the bargain of a lifetime.
> 
> This GPU configuration worked out so poorly that NVIDIA has never tried to sell an SLI card like that again to this day, and probably never will.


What about the TitanZ? Wasn't it a dual-GPU card?


----------



## AlbertoM

8051 said:


> What about the TitanZ? Wasn't it a dual-GPU card?



Wow, I didn't know about that one... Guess I know why I hadn't heard about it lol

Well, at least in the main consumer line it has not been tried again. And probably not even in the Titan line. It was a huge fail.

If SLI were a promising architecture, we would have been seeing more and more cards coming in this dual-GPU config, easing our builds, making them more efficient, etc...

Why not? Because it just doesn't work and it's not worth it. lol


----------



## lightsout

AlbertoM said:


> Yeah man, I know. My GTX 295 SLI card mostly had SLI disabled in the control panel during its lifetime.
> 
> Well, I had it; I know that it's not worth it in any way. It makes more sense to buy a better card as soon as it's released, if you have the money for that.
> 
> I think that's why I only tried the GTX 295: it was the one and only SLI card with 2 GPUs in one package. The price against the second-tier GTX 280 (remembering the 295 consisted of two 280s together) was not exactly double, so IF SLI worked fine, that would be the bargain of a lifetime.
> 
> This GPU configuration worked out so poorly that NVIDIA has never tried to sell an SLI card like that again to this day, and probably never will.


What? What about the GTX 590 and GTX 690?


----------



## stangflyer

lightsout said:


> What? What about the GTX 590 and GTX 690?


I had a 7950 GX2 - that was a dual-GPU card also. Then I got an 8800 GTX.


----------



## AlbertoM

Yeah, all those were huge hits; I hear about a lot of people who bought those cards, as I didn't even know they existed in the first place.

So let's wait until the next one comes out, and we will all live in SLI wonderland.

oops, N O T

I guess I'm among the few idiots that fell for that marketing bull****.

The percentage of people who tried SLI and kept upgrading that way must be like 0.00001%.

Tell me if I'm wrong, please.

The main point is: only when the gaming industry ALL moves to multi-GPU development can this go forward. Just like it was with multi-CPU (cores).

If we are staying in this mess where a few do it and a lot don't... no. It will never be a market hit.

And we are talking about a technology that was released in 2004. So something must be terribly wrong with it.

Quoting Wikipedia, look what beautiful tech they made:

Split Frame Rendering (SFR)

This analyzes the rendered image in order to split the workload equally between the two GPUs. To do this, the frame is split horizontally in varying ratios depending on geometry. For example, in a scene where the top half of the frame is mostly empty sky, the dividing line will lower, balancing geometry workload between the two GPUs. This method does not scale geometry or work as well as AFR, however.

Alternate Frame Rendering (AFR)

Each GPU renders entire frames in sequence. For example, in a Two-Way setup, one GPU renders the odd frames, the other the even frames, one after the other. Finished outputs are sent to the master for display. Ideally, this would result in the rendering time being cut by the number of GPUs available. In their advertising, Nvidia claims up to 1.9x the performance of one card with the Two-Way setup. While AFR may produce higher overall framerates than SFR, it also exhibits the temporal artifact known as Micro stuttering, which may affect frame rate perception. It is noteworthy that while the frequency at which frames arrive may be doubled, the time to produce the frame is not reduced - which means that AFR is not a viable method of reducing input lag.

SLI Antialiasing

This is a standalone rendering mode that offers up to double the antialiasing performance by splitting the antialiasing workload between the two graphics cards, offering superior image quality. One GPU performs an antialiasing pattern which is slightly offset to the usual pattern (for example, slightly up and to the right), and the second GPU uses a pattern offset by an equal amount in the opposite direction (down and to the left). Compositing both the results gives higher image quality than is normally possible. This mode is not intended for higher frame rates, and can actually lower performance, but is instead intended for games which are not GPU-bound, offering a clearer image in place of better performance. When enabled, SLI Antialiasing offers advanced antialiasing options: SLI 8X, SLI 16X, and SLI 32x (for Quad SLI systems only).[9]

Yes, you all read that.

https://en.wikipedia.org/wiki/Scalable_Link_Interface
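The micro-stutter that the AFR paragraph above mentions is easiest to picture as frame pacing: frames arrive twice as often on average, but when the two GPUs drift out of phase the gaps between frames become uneven. A toy illustration with made-up timings (all the numbers here are assumptions for the sake of the example):

```python
def frame_intervals(present_times):
    """Gaps between consecutive frame presentations, in ms."""
    return [b - a for a, b in zip(present_times, present_times[1:])]

# Ideal two-way AFR: each GPU needs 33.3 ms/frame, offset by half a frame,
# so frames arrive every ~16.7 ms -- double the single-GPU rate.
ideal = [i * 16.7 for i in range(6)]

# Micro-stutter: the GPUs drift out of phase, so short and long gaps
# alternate even though the *average* frame rate is unchanged.
stutter = [0.0, 5.0, 33.3, 38.3, 66.7, 71.7]

print(frame_intervals(ideal))    # even ~16.7 ms gaps
print(frame_intervals(stutter))  # alternating short/long gaps
```

The counter displays the same average FPS in both cases, which is why micro-stutter shows up in frame-time plots rather than in FPS benchmarks.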


----------



## rluker5

AlbertoM said:


> Yeah, all those were huge hits; I hear about a lot of people who bought those cards, as I didn't even know they existed in the first place.
> 
> So let's wait until the next one comes out, and we will all live in SLI wonderland.
> 
> oops, N O T
> 
> I guess I'm among the few idiots that fell for that marketing bull****.
> 
> The percentage of people who tried SLI and kept upgrading that way must be like 0.00001%.
> 
> Tell me if I'm wrong, please.
> 
> The main point is: only when the gaming industry ALL moves to multi-GPU development can this go forward. Just like it was with multi-CPU (cores).
> 
> If we are staying in this mess where a few do it and a lot don't... no. It will never be a market hit.
> 
> And we are talking about a technology that was released in 2004. So something must be terribly wrong with it.
> 
> Quoting Wikipedia, look what beautiful tech they made:
> 
> Split Frame Rendering (SFR)
> 
> This analyzes the rendered image in order to split the workload equally between the two GPUs. To do this, the frame is split horizontally in varying ratios depending on geometry. For example, in a scene where the top half of the frame is mostly empty sky, the dividing line will lower, balancing geometry workload between the two GPUs. This method does not scale geometry or work as well as AFR, however.
> 
> Alternate Frame Rendering (AFR)
> 
> Each GPU renders entire frames in sequence. For example, in a Two-Way setup, one GPU renders the odd frames, the other the even frames, one after the other. Finished outputs are sent to the master for display. Ideally, this would result in the rendering time being cut by the number of GPUs available. In their advertising, Nvidia claims up to 1.9x the performance of one card with the Two-Way setup. While AFR may produce higher overall framerates than SFR, it also exhibits the temporal artifact known as Micro stuttering, which may affect frame rate perception. It is noteworthy that while the frequency at which frames arrive may be doubled, the time to produce the frame is not reduced - which means that AFR is not a viable method of reducing input lag.
> 
> SLI Antialiasing
> 
> This is a standalone rendering mode that offers up to double the antialiasing performance by splitting the antialiasing workload between the two graphics cards, offering superior image quality. One GPU performs an antialiasing pattern which is slightly offset to the usual pattern (for example, slightly up and to the right), and the second GPU uses a pattern offset by an equal amount in the opposite direction (down and to the left). Compositing both the results gives higher image quality than is normally possible. This mode is not intended for higher frame rates, and can actually lower performance, but is instead intended for games which are not GPU-bound, offering a clearer image in place of better performance. When enabled, SLI Antialiasing offers advanced antialiasing options: SLI 8X, SLI 16X, and SLI 32x (for Quad SLI systems only).[9]
> 
> Yes, you all read that.
> 
> https://en.wikipedia.org/wiki/Scalable_Link_Interface


I started with CFX 7970s in 2013; the stuttering was bad, so I went over to a single 780 Ti. Then in Jan 2014 I picked up a second 780 Ti. That enabled me to play 4k60 when I picked up the 4k TV in my sig in Oct 2014, and it was good enough to handle games that way until 2016 or so, then started to struggle. In July 2017 I picked up SLI 1080 Tis, which breeze through 4k60, and recently I picked up 2 old miner Fury Nitros for my living room HTPC for $133 apiece including shipping. The 7970s were bad; everything else was totally worth it to me. Combined they are way faster than a single card in most games, including nearly all that I play that need more power than a single GPU for the resolution, framerate, and settings I want. (The Furys in a few % fewer of the games, but they are still decent at it.) But they aren't cost efficient and are more work to set up: you want to keep them running at the same speed and not thermal throttling, sometimes put in a profile, and not use a particular graphics setting or two.


----------



## lightsout

rluker5 said:


> I started with CFX 7970s in 2013; the stuttering was bad, so I went over to a single 780 Ti. Then in Jan 2014 I picked up a second 780 Ti. That let me play 4K60 when I picked up the 4K TV in my sig in Oct 2014, and it was good enough to handle games that way until 2016 or so, then it started to struggle. In July 2017 I picked up SLI 1080 Tis, which breeze through 4K60, and recently I picked up two old miner Fury Nitros for my living room HTPC for $133 apiece including shipping. The 7970s were bad; everything else was totally worth it to me. Combined they are way faster than a single card in most games, including nearly all that I play that need more power than a single GPU for the resolution, framerate and settings I want. (The Furys handle a few percent fewer of those games, but they are still decent at it.) But they aren't cost efficient and are more work to set up. You want to keep them running at the same speed, not let them thermal throttle, sometimes put in a profile, and skip a particular graphics setting or two.


Xfire Furys for an Htpc! I love it.


----------



## rluker5

lightsout said:


> Xfire Furys for an Htpc! I love it.


Yes, it's packed, but not packed enough. 
The fans on the Fury Nitros shut off when not in use, and so do all of my case fans (Asus mobo AI Suite with case fans set to a temp probe stuck in a GPU), leaving just a small CPU fan slowly spinning when I do something like play music over the speakers. The projector fan is way too loud for this, so I have been looking for a solution where the projector could be off. What I currently have going is close to right, but not good enough: (see tiny monitor in pic)

Couple of nights ago I found this beauty: https://www.aliexpress.com/item/Fre...070.html?spm=a2g0s.9042311.0.0.1ad04c4dv2xhKQ 
My front cover is going to go under the knife as soon as it shows up  

And this picture of that pc's guts is just to bother AlbertoM ;p 

But I really do like my 1080ti sli. I heard Afterburner has some new stuff out for them too.


----------



## 8051

rluker5 said:


> I started with CFX 7970s in 2013; the stuttering was bad, so I went over to a single 780 Ti. Then in Jan 2014 I picked up a second 780 Ti. That let me play 4K60 when I picked up the 4K TV in my sig in Oct 2014, and it was good enough to handle games that way until 2016 or so, then it started to struggle. In July 2017 I picked up SLI 1080 Tis, which breeze through 4K60, and recently I picked up two old miner Fury Nitros for my living room HTPC for $133 apiece including shipping. The 7970s were bad; everything else was totally worth it to me. Combined they are way faster than a single card in most games, including nearly all that I play that need more power than a single GPU for the resolution, framerate and settings I want. (The Furys handle a few percent fewer of those games, but they are still decent at it.) But they aren't cost efficient and are more work to set up. You want to keep them running at the same speed, not let them thermal throttle, sometimes put in a profile, and skip a particular graphics setting or two.


Doesn't the lack of 2 PCIe 3.0 16x slots affect your 1080Ti 5775c SLI setup?


----------



## lightsout

rluker5 said:


> Yes, it's packed, but not packed enough.
> The fans on the Fury Nitros shut off when not in use, and so do all of my case fans (Asus mobo AI Suite with case fans set to a temp probe stuck in a GPU), leaving just a small CPU fan slowly spinning when I do something like play music over the speakers. The projector fan is way too loud for this, so I have been looking for a solution where the projector could be off. What I currently have going is close to right, but not good enough: (see tiny monitor in pic)
> 
> Couple of nights ago I found this beauty: https://www.aliexpress.com/item/Fre...070.html?spm=a2g0s.9042311.0.0.1ad04c4dv2xhKQ
> My front cover is going to go under the knife as soon as it shows up
> 
> And this picture of that pc's guts is just to bother AlbertoM ;p
> 
> But I really do like my 1080ti sli. I heard Afterburner has some new stuff out for them too.


Wow that's tight, I didn't know they sold air-cooled versions of those cards. I notice a lot of people are doing the small screen thing. Looks like a cool mod.


----------



## skupples

now we're in business. Snagged one thru facebook marketplace for $450.    I knew I'd find one on the low low if I hunted long enough. The ones on reddit & H were going way too fast. 


arse marked it up to $500, but i found out my buddy is at his house in orlando, so time for a road trip.


----------



## rluker5

8051 said:


> Doesn't the lack of 2 PCIe 3.0 16x slots affect your 1080Ti 5775c SLI setup?


I really don't know anymore. It doesn't seem like it from benchmarks and the bus usage in AB, but it appears my previous testing was flawed. I tested PLX 16/16 vs PLX 16/8 and saw a difference on a Z97 Classified. I tried the same on a ROG Maximus VII Hero and the difference was gone between its 8/8 and the PLX 16/16. I know Shadow of the Tomb Raider with TAA overloads the bus on 8/8. If I rearranged my rig I could try the same setting on PLX 16/16, but I thought TAA isn't supposed to SLI anyway.
It doesn't seem like a bottleneck where you get hanging or stutters, but I don't have a good way to prove that.


----------



## mouacyk

Anyone know why the performance scaling sucks when using custom resolutions in The Witcher 3? I expected a 50% performance increment with each height halving. All settings on Ultra, medium HairWorks, extra LOD mod. SweetFX enabled/disabled affects only about 2 fps at each resolution.



Code:


-----------------------------------------------------------------------
| width | height | fps | perf    | pixels  | Efps | Eperf   | eff     |
| ------------------------------------------------------------------- |
| 3840  | 2160   | 41  | 100.00% | 100.00% | 41   | 100.00% | 100.00% |
| 3840  | 1440   | 43  | 104.88% | 66.67%  | 61.5 | 150.00% | 69.92%  |
| 3840  | 1080   | 48  | 117.07% | 50.00%  | 82   | 200.00% | 58.54%  |
-----------------------------------------------------------------------
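The "Efps"/"eff" columns follow directly from pixel counts. Here's a quick sketch of that arithmetic (plain Python, no game required), using the 3840x2160 run as the baseline:

```python
# Expected FPS assumes performance scales inversely with pixel count
# relative to the 3840x2160 baseline; efficiency is the measured
# speedup divided by the ideal speedup.
def scaling_rows(baseline_fps, measurements):
    base_pixels = 3840 * 2160
    rows = []
    for w, h, fps in measurements:
        expected_fps = baseline_fps * base_pixels / (w * h)
        perf = fps / baseline_fps            # measured speedup
        ideal = expected_fps / baseline_fps  # ideal speedup
        rows.append((w, h, fps, round(expected_fps, 1),
                     round(perf / ideal * 100, 2)))
    return rows

rows = scaling_rows(41, [(3840, 2160, 41), (3840, 1440, 43), (3840, 1080, 48)])
```

Efficiency dropping as the height shrinks is consistent with a bottleneck that doesn't scale with pixel count (geometry, CPU, or other fixed per-frame work).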


----------



## mattliston

It doesn't need to.


If you are using the exact same game and graphics settings, and only the viewing resolution changes, it won't be as big of an impact as you think.


Compare 4K to 1080p. That would be a massive difference in pixel count, yet you certainly won't double your performance by dialing down to 1080p.


Witcher 3 is still a game that uses hardware quite a bit, even at small resolution settings.


EDIT: I'm actually curious what the 1080p FPS would be. I would guess it will be right around 60 fps, possibly towards 70 fps. Which kinda messes with what I stated right above.


----------



## mouacyk

mattliston said:


> It doesnt need to.
> 
> 
> If you are utilizing the exact same game and graphics settings, and only the viewing resolution changes, it wont be as big of an impact as you think.
> 
> 
> Compare 4k to 1080p. That would be a massive difference in pixel count, yet you certainly wont double your performance by dialing down to 1080P.
> 
> 
> Witcher 3 is still a game that uses hardware quite a bit, even at small resolution settings.
> 
> 
> EDIT Im actually curious what 1080p FPS would be. I would guess it will be right around 60fps, possibly towards 70fps. Which kinda messes with what I stated right above.


I'll do you one better. I also run natively at 2560x1440, and it runs between 75 fps and 90 fps. I have a feeling the custom resolution still processes those invisible pixel rows. Now I'm curious whether using the CRU tool will make a difference.


----------



## nyk20z3

Went from a 1080 Strix OC to a 1080 Ti, and I had to add to my Lightning series collection, so I picked up a Lightning Z off eBay -

I am patiently waiting for that 2080 Ti Lightning release


----------



## fat4l

So guys have you tried this new Afterburner beta 10 with automatic scanner ? 

For my card it works a bit weird. I'm not sure if it's because of the XOC Strix BIOS, but the max voltage it tests is 1.043 V, and that's even with the voltage slider maxed out. This is way too low. I wish there were a way to make it go up to, say, 1.2 V if we want. It should at least go up to 1.093 V, not 1.043 V...

What is your experience ?


----------



## Maximization

Snagged an EVGA 1080 Ti FTW3 Hybrid for a good price; anything unique about this card? Upgrading my son's 950. I have 2 Poseidons and I assume they perform the same way.


----------



## Knoxx29

EVGA 1080 ti FTW3


----------



## rluker5

fat4l said:


> So guys have you tried this new Afterburner beta 10 with automatic scanner ?
> 
> For my card it works a bit weird. Not sure its because of the XOC strix bios but max value of voltage it is testing is 1.043v and this is even with voltage slider maxed out. This is way too low. I wish there was a way to make it go up to lets say 1.2v if we want. It should at least go up to 1.093v and not 1.043v....
> 
> What is your experience ?


It was testing mine at stock voltage, 1.012 V, but I still like it. It will be a great shortcut for setting the best clocks at reduced volts, if the values it gives you are stable. For the higher volts, I guess you will have to set that the old-fashioned way if you have a different value that works better for you. And I don't know if it will make much of a difference compared to regular full-speed boost; I haven't tried that yet.

I also like how the min voltage goes down to 700 mV now. That will be perfect for when my daughter uses my 1050 Ti laptop, if the clocks don't drop off too badly.

Nice piece of software


----------



## nmkr

nyk20z3 said:


> Went from a 1080 Strix OC to a 1080 Ti, and I had to add to my Lightning series collection, so I picked up a Lightning Z off eBay -
> 
> I am patiently waiting for that 2080 Ti Lightning release


Congratulations  Don't miss the owners' thread in the forum 

Can you please report your BIOS version numbers? Normal and LN2, thanks


----------



## KedarWolf

fat4l said:


> So guys have you tried this new Afterburner beta 10 with automatic scanner ?
> 
> For my card it works a bit weird. Not sure its because of the XOC strix bios but max value of voltage it is testing is 1.043v and this is even with voltage slider maxed out. This is way too low. I wish there was a way to make it go up to lets say 1.2v if we want. It should at least go up to 1.093v and not 1.043v....
> 
> What is your experience ?


OC Scanner isn't working right for me on the XOC BIOS. It keeps increasing clock speeds; the highest I've gotten manually overclocking is 2127, but it'll go to 2169 and higher and the PC will freeze.


----------



## fat4l

KedarWolf said:


> OC Scanner not working right for me on XOC BIOS. It keeps increasing clock speeds, highest I've gotten is 2127 manually overclocking, it'll go 2169 and higher and PC will freeze.


Whaat  
What voltage is it using in "point 4"? For me, with the voltage slider maxed to 100, it tests 1.043 V.

Did you try any other BIOSes to confirm it's the XOC BIOS at fault?


----------



## PsYcHo29388

I tried OC Scanner as soon as I put in my 1080 Ti, which I got as a replacement for my 1080 (praise be EVGA!). I've also set +300 on the memory, a 127% power limit, and a 90C temp limit, and so far The Division holds pretty steady at 1949 MHz with no voltage increase.

If anyone has any input for me to get higher than that I'd appreciate it, but otherwise I think I'm just gonna stay put.


----------



## mattliston

curve clocking is almost always the best way. You can also easily lock in certain frequencies, so it doesn't self-clock too high when temps are good.

Be wary of memory overclocking. My 1080 Ti peaks at around +225 MHz in most games, yet is perfectly stable with a +550 or 600 MHz memory offset.


You reach a point where error correction sets in and starts to degrade performance.


People have suggested using the Valley benchmark: find a very high-FPS zone, then slowly crank the memory until FPS no longer climbs, dial it back a few MHz, and call it a day.


Make sure to let the bench run in place for at least an hour to verify the memory stays stable at higher operating temps.
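The search loop described above can be sketched roughly as follows. `run_valley_bench` is a hypothetical stand-in for "run Unigine Valley in your high-FPS zone at the given memory offset and report average FPS" (substitute your own benchmarking hook), and the step size and margin are arbitrary assumptions:

```python
# Memory-offset search: raise the offset until FPS stops climbing
# (error correction eating the gains), then back off one step.
def find_mem_offset(run_valley_bench, step=25, max_offset=1000, margin=0.5):
    best_offset, best_fps = 0, run_valley_bench(0)
    offset = step
    while offset <= max_offset:
        fps = run_valley_bench(offset)
        if fps <= best_fps + margin:  # no meaningful gain: past the knee
            break
        best_offset, best_fps = offset, fps
        offset += step
    return max(best_offset - step, 0)  # "dial it back a few MHz"
```

The hour-long soak at the chosen offset still matters; a quick sweep like this only finds the knee, not long-run thermal stability.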


----------



## skupples

mattliston said:


> curve clocking is almost always the best way. Can also easily lock in certain frequencies, so it doesnt self-clock too high when temps are good.
> 
> Be wary of memory overclocking. My 1080 TI peaks at around +225mhz in most games, yet is perfectly stable with +550 or 600mhz mem offset.
> 
> 
> You reach a point where error correction sets in, and starts to degrade performance.
> 
> 
> People have voiced using the Valley benchmark, finding a very high FPS zone, and then simply slowly cranking the memory until FPS no longer climbs, dial it back a few MHz, and call it a day.
> 
> 
> Make sure to allow the bench to run in place for at least an hour to verify memory stays stable at higher operating temps.


I pretty much always start @ +100 core +500 memory on anything from nvidia


----------



## lightsout

skupples said:


> I pretty much always start @ +100 core +500 memory on anything from nvidia


So do I, just dial those in and see what happens and go from there. I like to do them separately though.


----------



## rluker5

mattliston said:


> curve clocking is almost always the best way. Can also easily lock in certain frequencies, so it doesnt self-clock too high when temps are good.
> 
> Be wary of memory overclocking. My 1080 TI peaks at around +225mhz in most games, yet is perfectly stable with +550 or 600mhz mem offset.
> 
> 
> You reach a point where error correction sets in, and starts to degrade performance.
> 
> 
> People have voiced using the Valley benchmark, finding a very high FPS zone, and then simply slowly cranking the memory until FPS no longer climbs, dial it back a few MHz, and call it a day.
> 
> 
> Make sure to allow the bench to run in place for at least an hour to verify memory stays stable at higher operating temps.


+225 would be nice... My Aorus cards stock at around 2000, so I don't think I will see that  The slow one does 2037 well and the fast one does 2060-something. I like to state actual clocks, since the factory OC varies a lot more than the final clocks on these cards. I don't think OC Scanner will make my cards' top speed any faster, though. But it will give me a better voltage curve to use how I see fit.


----------



## Hydroplane

mattliston said:


> curve clocking is almost always the best way. Can also easily lock in certain frequencies, so it doesnt self-clock too high when temps are good.
> 
> Be wary of memory overclocking. My 1080 TI peaks at around +225mhz in most games, yet is perfectly stable with +550 or 600mhz mem offset.
> 
> 
> You reach a point where error correction sets in, and starts to degrade performance.
> 
> 
> People have voiced using the Valley benchmark, finding a very high FPS zone, and then simply slowly cranking the memory until FPS no longer climbs, dial it back a few MHz, and call it a day.
> 
> 
> Make sure to allow the bench to run in place for at least an hour to verify memory stays stable at higher operating temps.


I have noticed this phenomenon as well. It is kind of odd: my 2nd Aorus I could push from +700 all the way to +1000, but it would lose a few FPS along the way. My 1st one would artifact beyond about +750.


----------



## Knoxx29

My 24/7 settings


----------



## porschedrifter

I have a 1080 Ti that I purchased under a year ago, and I was wondering, with the 1080 Ti supply dwindling: if I were to RMA my card now, or 5 years from now, how soon will they be sending a 2080 or 2080 Ti as a replacement?

Anyone done a 1080 Ti RMA lately? If so, what did you get back?


----------



## mattliston

Doing an RMA simply to see if you can get an upgrade is pretty darn lame.


Dont be a dirt bag.


----------



## porschedrifter

mattliston said:


> Doing an RMA simply to see if you can get an upgrade is pretty darn lame.
> 
> 
> Dont be a dirt bag.


I actually have a legitimate reason. However, I also wanted to discuss it, but thanks for your interest.


----------



## bmgjet

Be 1-2 years I'd say. At the moment you're more likely to get a 1080 Ti back instead of a 2080 lol


----------



## mattliston

porschedrifter said:


> I actually have a legitimate reason. However, I also wanted to discuss it, but thanks for your interest.



Your wording of your prior post made it VERY easy to assume nothing was wrong and you simply wanted to test the system.


----------



## skupples

mattliston said:


> Doing an RMA simply to see if you can get an upgrade is pretty darn lame.
> 
> 
> Dont be a dirt bag.


you know, they say this is one of the reasons for NV GPU price inflation. 
You think EVGA's US-based customer service dept is cheap? O.O




bmgjet said:


> Be 1 - 2 years id say, At the moment your more likely to get a 1080ti back instead of a 2080 lol




cards have to be way old to get an upgrade card. I RMA'd a 3-year-old 480, and got back?! a 480.

RMA'd my 580s when I got the first Titans, got back?! 580s.

you basically need to be in an extended warranty period, now that warranties are no longer 5 years.

at least, this has always been my experience. we've all seen people get lucky, but very rarely is it ever a single generation later.  everyone started getting 670s for 580s, around the 4-5 year mark, if I remember correctly.


----------



## porschedrifter

skupples said:


> mattliston said:
> 
> 
> 
> Doing an RMA simply to see if you can get an upgrade is pretty darn lame.
> 
> 
> Dont be a dirt bag.
> 
> 
> 
> you know, they say this is one of the reasons for NV gpu price inflation.
> You think EVGA's US based customer service dept is cheep? O.O
> 
> 
> 
> 
> bmgjet said:
> 
> 
> 
> Be 1 - 2 years id say, At the moment your more likely to get a 1080ti back instead of a 2080 lol
> 
> 
> 
> 
> cards have to be way old to get an ugprade card. I RMA'd a 3 year old 480, got back?! a 480.
> 
> RMA'd my 580s when i got the first titans, got back?! 580s.
> 
> you basically need to be in an extended warranty period, now that warranties are no longer 5 years.
> 
> at least, this has always been my experience. we've all seen people get lucky, but very rarely is it ever a single generation later. everyone started getting 670s for 580s, around the 4-5 year mark, if I remember correctly.

That's what I figured as well, but with the supply shortage of 1080 Tis from mining and the card being discontinued, things may be different this time around... There's already a report on the Nvidia forum that someone RMA'd a 1080 Ti last month and got a 2080 in return!


----------



## skupples

Let’s hope all that talked about driver maturation actually happens. I’d only take a 2080 as an upgrade. I’ve got 2 years left on my warranty. 😉


----------



## lightsout

porschedrifter said:


> That's what I figured as well, but, with the supply shortage of 1080ti's from mining and being discontinued, things may be different this time around... There's already a report on the Nvidia forum that someone rma'd a 1080ti last month and got a 2080 in return!


Not much of an upgrade there.


----------



## Streetdragon

Just a little question: can a VRAM overclock under a waterblock degrade the RAM? I mean, the voltage is fixed, BUT more work = more heat, even with fixed voltage.

Played Shadow of the Tomb Raider with 2050/[email protected] at 1440p max settings for some hours and everything was fine so far^^


----------



## Hydroplane

Streetdragon said:


> Just a little question: Can ram overcloc under a waterblock degrade the ram? i mean the voltage is fixed. BUT more work=more heat even with fixed voltage
> 
> Played shadow of the tomb raider with 2050/[email protected] 1440P max settings for some hours and everything was fine so far^^


Highly unlikely at stock voltages or even a slight voltage bump. Especially with water cooling. At least, it's never happened in my experience.


----------



## skupples

Streetdragon said:


> Just a little question: Can ram overcloc under a waterblock degrade the ram? i mean the voltage is fixed. BUT more work=more heat even with fixed voltage
> 
> Played shadow of the tomb raider with 2050/[email protected] 1440P max settings for some hours and everything was fine so far^^


i murdered 3x OG titans with extreme voltage thru the chip & +750 on the memory for 3+ years, no signs of decay.

yes though, in theory, something will happen, eventually. What fails first though? Probably something else.


----------



## Knoxx29

skupples said:


> i murdered 3x OG titans with extreme voltage thru the chip & +750 on the memory for 3+ years,


How much Voltage?


----------



## skupples

Knoxx29 said:


> How much Voltage?


i'd have to go check; I wanna say 1.4 V, 24/7. We had full control of the voltage controller via software, and we ran them like dogs for years and years over in the Titan club. NV literally showed up to tell us to stop, but it was too late, they couldn't keep us out. XD 

it's all very well documented in the OCN Titan Club.


----------



## Knoxx29

I had my 1080 Ti with the voltage increased 50%, but I went back to 10%. Even though it is water-chilled, 10% is what I need to keep it at 2050 MHz without throttling.


----------



## skupples

=\ the $500 1080 ti I got is from PNY. I've had their cards before, they aren't terrible, but its definitely 100% bone stock PCB . Either way, I need to get a block on it ASAP. Just gotta find one.


----------



## kithylin

skupples said:


> =\ the $500 1080 ti I got is from PNY. I've had their cards before, they aren't terrible, but its definitely 100% bone stock PCB . Either way, I need to get a block on it ASAP. Just gotta find one.


Performance PCs has a few. I would suggest the sexy EK copper ones with clear plexi. I love mine. 









Mine when I had it apart recently for the big expansion of my loop. I also have the full metal backplate from EK that matches, and it connects to the back of the PCB with thermal pads and is functional.


----------



## skupples

I always go EK, but I can't put a new block on a used card, that's just silly. Both have to be used.  

I'm more of an acetal guy, & I really miss all copper blocks, luckily I still have one, but I'm testing the new series with the jet plate insert innovation.

Pads on the back - 100% required, even without memory


----------



## kithylin

skupples said:


> I always go EK, but I can't put a new block on a used card, that's just silly. Both have to be used.
> 
> I'm more of an acetal guy, & I really miss all copper blocks, luckily I still have one, but I'm testing the new series with the jet plate insert innovation.
> 
> Pads on the back - 100% required, even without memory


I had a jet plate in my EK block, and I found after I removed it that I had quite a bit better flow rate through the entire loop with it out, while temps were exactly the same with and without it. Personally, I'm saying it's useless.


----------



## skupples

kithylin said:


> I had a Jet plate in my EK block and I found later after I removed it that I had quite a bit better flow rate through the entire loop with it out and temps are exactly the same with and without it. Personally I'm saying it's useless.


the new ones use this big black molded plastic block, and a plate. 

I tested the classic "just plate" (CSQ/Clean CSQ) between the 1150 and 2011 plates back in the day, and you're correct, the differences were marginal. I never tried it without one, I don't think...


----------



## Knoxx29

kithylin said:


> I would suggest the sexy EK copper ones with clear plexi. I love mine.


I got mine Nickel, the only time i used copper was when i had 2 x 770 SLI


----------



## Streetdragon

Then I will run the VRAM at 6210. Feels good man^^

And blocks... next time I would buy a copper block without nickel. I like copper^^


----------



## skupples

nickel looks good, but it had a rough start. I hope it's gotten better.


----------



## kithylin

Personally I use all copper in my blocks where I can, or at least nickel-plated copper, and since it's all similar metals in the loop I don't need any additives; I just run straight distilled water for the best cooling performance. It's been smooth for me. The photo I showed you earlier was after the first 14 months running my 1080 Ti in a loop with 100% distilled water and no additives. Perfectly clean except a little discoloration in the bottom corners from the o-ring.


----------



## nolive721

Just posted in the Vega thread, so please take no offence, gents.

Seeking advice here.

I'm running triple 1080p screens with an OC'd 1080 Ti, but the center one is FreeSync. I have always been considering going back to Vega after a bad experience with a Nitro 64, which was a poor OCer and ran really hot despite undervolting and all that.

Looking at the new Adrenalin drivers and maybe some further development on voltage management, I could grab a 64 LC edition for not-silly money and make my rig full AMD.


I know it's an Nvidia thread so I might be getting biased answers lol, but has anybody made the move from a 64 LC to a 1080 Ti with limited to no regret, to maybe bring me back to reason...

thanks so much


----------



## bmgjet

Does FreeSync support more than one screen being plugged in at a time now?
A V64 is going to be a downgrade in performance any way you look at it. But if you want an all-AMD rig, that's your only real choice for a few months.
Personally I'd wait and see what the new AMD cards coming out have to offer.


----------



## skupples

I wouldn't be surprised if AMD's MM beats out Surround. Especially with multiple cards. 

Your best bet is WSGF.com for solid info instead of opinions 

I eventually retired my NV Surround setup due to technical issues always getting in the way.

AMD has historically spent more time supporting multi-GPU + multi-monitor setups than Nvidia, but I've never taken the plunge to test it all myself, so all I can do is recommend the powerhouse on the topic: Widescreen Gaming Forum.


----------



## nolive721

thanks and sorry I should have explained more.

Regarding FreeSync: no, only one monitor at a time is possible AFAIK, so my plan would be to play the latest and future highly demanding AAA games on a single monitor with FreeSync on for a butter-smooth experience, and go back to triple screens with less graphically intense titles.

Or move to a 4K FreeSync monitor in the future.


----------



## skupples

I'm having a hell of a time finding a reason to not side step my 34 inch 10 bit 4K for a large high hz 1440p display. 

idk about freesync & MM, but I thought I remembered reading they resolved G-sync & surround, right? via all three being on DP.


----------



## KedarWolf

When you run the 358WPowerLimit.bat file to see your FTW3 power limit, it'll go back down to just over 356 W on a reboot by default.

Unzip the attached .cmd file and add it in Task Scheduler to run at login with the highest privileges; then it'll always be set to 358 W after you reboot.

I posted this before as well, but unzip the second .cmd file and have Afterburner set with your 24/7 overclock settings, with Start With Windows unchecked in the Afterburner options.

It'll start Afterburner right after the power limit .cmd file, set your overclock, then close Afterburner. This helps improve your framerates, since the monitoring isn't running, and stops Afterburner from conflicting with other programs you may have running or want to run; it's known to have issues that way.

Have it run at login with the highest privileges as well.

*Edit: Have the power limit task run with a 30-second delay in Task Scheduler, as having two command prompts open at once will not work and you need to let the Nvidia drivers finish loading.
Second Edit: Fixed the script code.*
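For anyone who'd rather script this than click through Task Scheduler, the same setup can be sketched with `schtasks.exe` (here driven from Python; the task name and .cmd path below are placeholders for wherever you unzipped the files, and the 30-second delay matches the note above):

```python
# Build and run a schtasks.exe command for an at-logon task that runs
# with highest privileges after a delay (delay format is mmmm:ss).
import subprocess

def logon_task_args(name, cmd_path, delay="0000:30"):
    return ["schtasks", "/Create", "/F",
            "/TN", name,        # task name (placeholder)
            "/TR", cmd_path,    # path to your unzipped .cmd file
            "/SC", "ONLOGON",   # trigger: at user logon
            "/RL", "HIGHEST",   # run with highest privileges
            "/DELAY", delay]    # /DELAY is valid for ONLOGON tasks

def register_logon_task(name, cmd_path, delay="0000:30"):
    subprocess.run(logon_task_args(name, cmd_path, delay), check=True)
```

Repeat for the Afterburner .cmd with a second task name, or just import the attached files through the Task Scheduler GUI as described.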


----------



## GRABibus

KedarWolf said:


> When you run the 358WPowerLimit.bat file to see your FTW3 power limit it'll go back down to just over 356W on a reboot by default.
> 
> Unzip the attached .cmd file and add it in the Task Scheduler to run at login with the highest privileges, then it'll always be set to 358w after you reboot.
> 
> I posted this before as well but unzip the second .cmd file and have Afterburner set with your 24/7 overclock settings but in the Afterburner options uncheck Start With Windows.
> 
> It'll start Afterburner right after the power limit .cmd file, set your overclock, then close Afterburner. this helps improve your framerates as the monitoring isn't running and stops Afterburner from conflicting with other programs you may have running or want to run, it's known to have issues that way.
> 
> Have it run at login with highest privileges as well.
> 
> *Edit: Have the Afterburner task run with a 30-second delay in Task Scheduler as having two command prompts open at once will not work. *



I really don't get what you mean.


----------



## eternal7trance

I have a Zotac 1080 Ti AMP Extreme Core that I recently bought, and I've noticed I crash in a few games, usually only ones that make my GPU sit at 99% usage. When I looked up the issue, it said this could be power issues or a bad overclock. I have not overclocked the card at all, but I do notice in the FireStorm app that the boost clock goes up to 1960, while the site lists the boost clock as 1721. Is this worth RMAing, or is there a more permanent way to lower the clock without having to use the app every time?

Also, for reference, my PSU is a brand new EVGA 750W G3.


----------



## The Pook

eternal7trance said:


> I have a Zotac 1080ti Amp Extreme Core that I recently bought and I noticed I crash on a few games, usually only ones that make my gpu sit at 99% usage. When I looked up the issue it said this could be power issues or a bad overclock. I have not overclocked the card at all but I do notice on the firestorm app the boost clock goes up to 1960 and on the site it lists the boost clock at 1721. Is this worth RMAing or is there a more permanent way I could lower the clock without having to use the app everytime?
> 
> Also for reference my psu is a brand new EVGA 750w G3.



That's the way GPU Boost works: the cooler your card runs, the higher it'll boost. Advertised specs are kind of useless. 

If you took the card out of the box, put it in your machine, and it's crashing... then RMA the card, it's faulty. 

Though just because your PSU is new doesn't mean it isn't faulty; it's likely not, though.


----------



## skupples

i'd flat out return it if you can; Zotac is hot garbage in the summer sun. 

my 2080 was crashing out of the box as well, until I ramped the fans to 100%. Here Amazon, you can have this back.


----------



## mattliston

eternal7trance said:


> I have a Zotac 1080ti Amp Extreme Core that I recently bought and I noticed I crash on a few games, usually only ones that make my gpu sit at 99% usage. When I looked up the issue it said this could be power issues or a bad overclock. I have not overclocked the card at all but I do notice on the firestorm app the boost clock goes up to 1960 and on the site it lists the boost clock at 1721. Is this worth RMAing or is there a more permanent way I could lower the clock without having to use the app everytime?
> 
> Also for reference my psu is a brand new EVGA 750w G3.





use HWiNFO64 to read system sensors.


How do the 12 V, 5 V, and 3.3 V rails look? You will want to see minimums of approximately 11.75, 4.8, and 3.2 (my opinion).



How does the GPU core voltage look?


If you use MSI Afterburner, you can try increasing the power limit on the card and see what it does. It should force/convince the card to self-boost a little more, and if the problem gets worse, you can certainly claim the card is a bad egg. If it solves it, then perhaps it's a power input issue from the PCIe slot or PSU cables.




If your motherboard is a cheaper variant, this can rarely happen, but the motherboard will struggle to keep the PCIe slot powered.
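If you export the sensor log, the rail floors above are easy to check mechanically. A tiny sketch (the rail names and minimums are just the opinion-values from this post, not a spec):

```python
# Check logged rail minimums against the suggested floors.
RAIL_FLOORS = {"12V": 11.75, "5V": 4.8, "3.3V": 3.2}

def rails_ok(readings):
    """readings maps rail name -> lowest voltage seen in the log."""
    return {rail: readings.get(rail, 0.0) >= floor
            for rail, floor in RAIL_FLOORS.items()}
```

A rail dipping below its floor under load is the pattern to look for; as noted below, software sensor readings are rough, so treat a failure here as a reason to break out a multimeter, not a verdict.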


----------



## The Pook

Measuring power supply rails with software is a joke. If you want to measure it, get a multimeter. Even then, rail stability isn't indicative of a functional/quality unit. You can have rails within spec and it still be a garbage unit.


----------



## kithylin

eternal7trance said:


> I have a Zotac 1080ti Amp Extreme Core that I recently bought and I noticed I crash on a few games, usually only ones that make my gpu sit at 99% usage. When I looked up the issue it said this could be power issues or a bad overclock. I have not overclocked the card at all but I do notice on the firestorm app the boost clock goes up to 1960 and on the site it lists the boost clock at 1721. Is this worth RMAing or is there a more permanent way I could lower the clock without having to use the app everytime?
> 
> Also for reference my psu is a brand new EVGA 750w G3.


Nvidia cards are -SUPPOSED- to boost over spec like that, like "The Pook" said already. But I wanted to add that most people -want- this. The cards have built-in safety features: they'll reduce their clocks if they run too hot, ramp the fan up if needed, and shut off before sustaining damage if they get really hot. There are protections built in.

I would guess that it's probably software related and nothing to do with the hardware.

Did you remember to completely uninstall your old drivers including running DDU in safe mode before installing your new card?

Also have you tried running with Vsync enabled? That should prevent the card from sitting at 99% usage and super-heating for no reason. Try that and just see if it does anything for you.


----------



## Robostyle

del


----------



## eternal7trance

Thanks for the replies guys. It seems like after checking everything, changing the nvidia power settings to prefer maximum performance instead of optimal power made it stop crashing.


----------



## kithylin

eternal7trance said:


> Thanks for the replies guys. It seems like after checking everything, changing the nvidia power settings to prefer maximum performance instead of optimal power made it stop crashing.


That usually prevents the card from down-boosting though and will (usually) run it at max boost speed at all times. Which usually makes the cards run hot and with loud fans. Generally not a good idea unless you're benchmarking for high scores.


----------



## bmgjet

kithylin said:


> That usually prevents the card from down-boosting though and will (usually) run it at max boost speed at all times. Which usually makes the cards run hot and with loud fans. Generally not a good idea unless you're benchmarking for high scores.


It will still downclock and everything as normal. I run mine on the maximum performance setting since it stops the clocks jumping around on the desktop; I have 2x 4K and 1x 1440p screens. You'd move the mouse and it would jump up to 1596 MHz, stop moving the mouse and it drops back to 1200-something MHz, leave it for 10 seconds and it drops to 492 MHz. Each time the clock speed changes, the 1440p screen gets a tear in it for a split second.
In max performance it goes from 492 MHz to 1596 MHz and stays there until the computer has been idle for 10 or so seconds.

So if that stopped it crashing, there must be a P-state that doesn't get enough voltage, i.e. the chip is lower quality than what the BIOS was made for.
I'd RMA the card, since if it's not 100% stable at stock settings for the BIOS on it, it should have gone into a lower-classed card.


----------



## AlbertoM

eternal7trance said:


> I have a Zotac 1080ti Amp Extreme Core that I recently bought and I noticed I crash on a few games, usually only ones that make my gpu sit at 99% usage. When I looked up the issue it said this could be power issues or a bad overclock. I have not overclocked the card at all but I do notice on the firestorm app the boost clock goes up to 1960 and on the site it lists the boost clock at 1721. Is this worth RMAing or is there a more permanent way I could lower the clock without having to use the app everytime?
> 
> Also for reference my psu is a brand new EVGA 750w G3.


Try using a different PCI-E cable from the PSU.


----------



## fat4l

So I was playing with my card a bit and the best I was able to get is 

Graphics scores :
FS X: 16685 https://www.3dmark.com/fs/17881294
FS U: 8232 https://www.3dmark.com/3dm/32413740?
TS: 11572 https://www.3dmark.com/3dm/32414052?

Is that any good ? 4790K 5.2G, 2666MHz CL10 DDR3, Asus Strix 1080Ti, 2202MHz @1.2v, +800MHz mems, XOC bios.


----------



## kithylin

The new Nvidia drivers to support FreeSync on Nvidia are live. https://www.reddit.com/r/nvidia/comments/ag666r/41771_game_ready_driver_now_live/


----------



## fat4l

@KedarWolf https://www.3dmark.com/compare/spy/5813264/spy/5039067#
Wow, that 9900K CPU score smashes my old 4790K  Double the cores should be ~100% more, but it's about 135% more for you, so the IPC gain is visible there. Impressive stuff


----------



## 2ndLastJedi

fat4l said:


> @KedarWolf https://www.3dmark.com/compare/spy/5813264/spy/5039067#
> Wow that cpu score of 9900k smashes my old 4790k  double the cores should be 100% more, but its like 135% more for you, so...the IPC gain is visible there. Impressive stuff


That was a nice way to say "look, i beat you" lol
Nice scores though!


----------



## r31ncarnat3d

Late to the party but finally upgraded my rig after almost seven years! It's primarily used for work (Pentesting things), but I couldn't resist sticking a GTX 1080 Ti in there. It helps that I got it for <$500 off of CL.


----------



## kithylin

r31ncarnat3d said:


> Late to the party but finally upgraded my rig after almost seven years! It's primarily used for work (Pentesting things), but I couldn't resist sticking a GTX 1080 Ti in there. It helps that I got it for <$500 off of CL.


They're awesome cards. There's nothing faster you can buy in the sub-$1000 range at the moment, either.


----------



## r31ncarnat3d

kithylin said:


> They're awesome cards. There's currently nothing faster you can buy in the < $1000 range at the moment too.


Exactly why I went for one! I looked at the RTX 2080 for a while but from what I can find they had almost identical performance. I didn't want to spend ~$200 for ray tracing with an RTX 2080 and I definitely didn't want to double my GPU budget, so the GTX 1080 Ti was a no-brainer.

Next upgrade ETA: 9 years, 11 months, and 30 days!

Seriously though, tech is easy to keep up with these days. New releases aren't quickly outpacing current hardware like they were in the 2000s, and now that I'm in an actual career I don't have time to game as much anymore, so I can't justify upgrading so quickly.


----------



## nyk20z3

Anyone have experience with Barrow GPU blocks? Unfortunately this was the only block available for my 1080 Ti Lightning. There was limited info, so I don't even know if it comes with a backplate, etc.


----------



## Abaidor

fat4l said:


> So I was playing with my card a bit and the best I was able to get is
> 
> Graphics scores :
> FS X: 16685 https://www.3dmark.com/fs/17881294
> FS U: 8232 https://www.3dmark.com/3dm/32413740?
> TS: 11572 https://www.3dmark.com/3dm/32414052?
> 
> Is that any good ? 4790K 5.2G, 2666MHz CL10 DDR3, Asus Strix 1080Ti, 2202MHz @1.2v, +800MHz mems, XOC bios.



Hey great scores there. I have the Asus Strix 1080Ti and it currently clocks @ 2085Mhz + 1050Mhz memory on stock bios while temps are great (38C max on water). I feel like the card can go higher but it is power limited. If I use the XOC BIOS do I lose any Display Ports or any ports for that matter? 

Could you share which BIOS you are using (download link) and how I could go about it? How do I update the BIOS, and what do I do after that? Do you simply flash the new BIOS and the power limit is unlocked? I mean, do you still use GPU Tweak or Afterburner to overclock, and can you then increase the power limit?

I upgraded to a 4K monitor and I feel like any extra performance is needed now although I have a huge backlog of older titles I want to play first so it could last me for a year or two before I need something faster.


----------



## kithylin

Abaidor said:


> Hey great scores there. I have the Asus Strix 1080Ti and it currently clocks @ 2085Mhz + 1050Mhz memory on stock bios while temps are great (38C max on water). I feel like the card can go higher but it is power limited. If I use the XOC BIOS do I lose any Display Ports or any ports for that matter?
> 
> Could you share which BIOS are you using (download link) and how could I go about it. How to update BIOS and what to do after that? Do you simply flash the new BIOS and then the power limit is unlocked? I mean do you still use GPU Tweak or Afterburner to oveclock and you can increase the power limit?
> 
> I upgraded to a 4K monitor and I feel like any extra performance is needed now although I have a huge backlog of older titles I want to play first so it could last me for a year or two before I need something faster.


There's a dedicated thread for that over here: https://www.overclock.net/forum/69-nvidia/1627212-how-flash-different-bios-your-1080-ti.html Instructions about how to do it and information are in the first post.

Also, someone can correct me if I'm wrong, but I -THINK- the Asus XOC BIOS was literally designed for the STRIX OC 1080 Ti card originally. So you probably won't lose anything. You might want to ask in that thread to see whether you'll lose any ports or not.


----------



## RoBiK

kithylin said:


> Also someone can correct me if I'm wrong. But I -THINK- the Asus XOC Bios is literally designed for the STRIX OC 1080 Ti card originally.


that is correct


----------



## Abaidor

kithylin said:


> There's a dedicated thread for that over here: https://www.overclock.net/forum/69-nvidia/1627212-how-flash-different-bios-your-1080-ti.html Instructions about how to do it and information are in the first post.
> 
> Also someone can correct me if I'm wrong. But I -THINK- the Asus XOC Bios is literally designed for the STRIX OC 1080 Ti card originally. So you probably you won't lose anything. You might want to ask in that thread and see if you'll lose any ports or not.



Thanks heading over there...


----------



## lklem

nyk20z3 said:


> Any one have experience with Barrow GPU blocks ?, Unfortunately this was the only block available for my 1080 Ti Lighting. There was limited info so i don't even know if it comes with a back plate etc


I'm using a Barrow GPU block, but the model for the MSI GTX 1080 Ti Gaming X. My problem with the Barrow block was a minor gap between the VRAMs and the block even with the supplied thermal pads fitted. If I'm not mistaken the supplied pads were 1.0 mm, so I bought some 1.5 mm pads as replacements to resolve the problem. Overall the block is OK: the highest temp I got was 50 degrees while gaming, cooled by an XSPC RX480 V3 radiator and 4 Gentle Typhoon GT fans running at 1500 rpm in push, with ambient around 32 degrees. I'd say about similar performance to EKWB (I was using an Asus GTX 980 Ti Strix with an EKWB GPU block previously).


----------



## Knoxx29

I have a silly question just for curiosity.

As we know, running a GPU at high temperatures (70c+) isn't a good idea, but is it OK to run the GPU all the time at 10c?


----------



## Zfast4y0u

Knoxx29 said:


> I have a silly question just for curiosity.
> 
> As we know running a GPU at high temperatures (70c+) it's not a good idea but is it ok to run the GPU all the time at 10c?


electronics don't like being somewhere hot, dusty, generally filthy, or humid.

a cold core is a happy core.


----------



## Knoxx29

Zfast4y0u said:


> electronics dosent like hot, dusty and general filthy area to be in, and humid.
> 
> cold core is happy core.


I asked because i have been running my 1080 ti Ftw3 at 10c for the last 2 weeks .


----------



## skupples

Knoxx29 said:


> I asked because i have been running my 1080 ti Ftw3 at 10c for the last 2 weeks .


its just about the only enjoyable thing you can get from winter.  cold & dry - happy.


----------



## Knoxx29

skupples said:


> its just about the only enjoyable thing you can get from winter.  cold & dry - happy.


Not just in winter. In the area where I live the humidity isn't high and the dew point is always low, and that helps drop my GPU temps to at least 15c/20c.


----------



## skupples

Knoxx29 said:


> Not just in winter because in the area where i live the humidity it's not much and the dew point is always low and that helps to drop my GPU temps to at least 15c/20c


beautiful.

My office is 90f 9 months a year. Even when the central air, & stand alone office AC are blowing.

my ambient atm is 24c, which is pretty cool for us.


----------



## Knoxx29

skupples said:


> beautiful.
> 
> My office is 90f 9 months a year. Even when the central air, & stand alone office AC are blowing.
> 
> my ambient atm is 24c, which is pretty cool for us.


Well looks like my GPU will have a frozen life


----------



## skupples

I'd definitely put the effort into building an enclosed case with external intake, so it's always breathing cold air. Maybe even have it then vent into the room to help warm it. 

you know, like the data centers do that're in the right climates.


----------



## Knoxx29

skupples said:


> I'd definitely put the effort into building an enclosed case with external intake, so it's always breathing cold air. Maybe even have it then vent into the room to help warm it.
> 
> you know, like the data centers do that're in the right climates.


That would be an interesting Project.

Aren't you that guy that almost burned the house down?


----------



## skupples

Knoxx29 said:


> That would be an interesting Project.
> 
> Aren't you that guy that almost burned the house down?


...

i don't believe so.

when was this?


----------



## Knoxx29

skupples said:


> ...
> 
> i don't believe so.
> 
> when was this?


There is a thread somewhere in the forum when i find it i will post the link

If i am not mistaken it should be in the Chilled Water Cooling vs 3.0 Build Log thread.


----------



## kithylin

Knoxx29 said:


> I have a silly question just for curiosity.
> 
> As we know running a GPU at high temperatures (70c+) it's not a good idea but is it ok to run the GPU all the time at 10c?


Electronics love cold. It's the heat that can damage them. The colder the better usually for video cards. As long as you're not below ambient while gaming on it. Is your room really only 10c most of the time? And that 10c you have, is that under full load while gaming?


----------



## Knoxx29

kithylin said:


> Electronics love cold. It's the heat that can damage them. The colder the better usually for video cards. As long as you're not below ambient while gaming on it. Is your room really only 10c most of the time? And that 10c you have, is that under full load while gaming?


Yes, the card is 10c under full load while gaming, but not because my room is 10c. The card is water-chilled and I always run it below ambient because, as I said in one of my previous posts, my dew point is very low.


----------



## Knoxx29

skupples said:


> ...
> 
> i don't believe so.
> 
> when was this?



Found it 

https://www.overclock.net/forum/27789842-post374.html

I dont know why i thought it was you


----------



## Zfast4y0u

Knoxx29 said:


> I asked because i have been running my 1080 ti Ftw3 at 10c for the last 2 weeks .


be careful if it's sub-ambient temp; u can get moisture on ur gpu, and if it causes a short somewhere, u will have a bad day.


----------



## Knoxx29

Zfast4y0u said:


> be careful if its sub ambient temp, u can get moist on ur gpu, and if it cause shorts somewhere, u will have bad day.


It's sub ambient temp but that's not a problem as long as i keep the temps above the dew point.
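For anyone else flirting with sub-ambient cooling, the dew point being discussed here can be estimated from room temperature and relative humidity with the well-known Magnus approximation. A minimal sketch (the function name is mine, not from any tool):

```python
import math

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point in Celsius via the Magnus formula.

    Uses the common Magnus coefficients, valid roughly for
    ambient temperatures between -45 C and 60 C.
    """
    a, b = 17.62, 243.12
    gamma = math.log(rel_humidity_pct / 100.0) + (a * temp_c) / (b + temp_c)
    return (b * gamma) / (a - gamma)

# Example: a 24 C room at 40% RH has a dew point near 9.6 C, so a 10 C
# loop is cutting it close there; at 20% RH it drops to around 0 C.
```

Keeping the chip (and block, and fittings) above the number this returns, with some margin, is what avoids condensation.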


----------



## nrpeyton

hey all -- long time no see...

Anyone tried the new OC Scanner built into the newest beta release of MSI Afterburner?


It seems like a bit of a power virus to me. Even at only 1.0 to 1.05v it was hitting 500+ watts. I'd connected temperature probes to my card's VRM over a year ago and had literally forgotten about them until yesterday, when the VRM over-heating alarm started to sound. I thought "what the hell is that noise", so I had a closer look, and as I suspected, the temperature of my VRM (under water) was higher than I've ever seen it before.

I felt compelled to turn off the test.

Now, I know I have liquid metal on the shunt resistors AND I'm using the STRIX BIOS, so my card effectively has no power limit. But it has NEVER, not until now, drawn so much power from the wall (in watts) as long as I kept the voltage in check. I'd expect to need at least 1.2v to consume 500 watts.


Whatever algorithms they're using for testing in this new OC Scanner, it is more dangerous as a power virus than FURMARK.

Any thoughts ?


Nick


----------



## Knoxx29

Hi Nick long time no see, where in the heaven have you been?


----------



## nrpeyton

aww nice to see someone familiar so quick -- I was worried nobody would remember me or that everyone would have disappeared as it'd been so very long.

Just other things I suppose m8, nothing to do with computing really. I'd kind of exhausted all my ideas for experiments and stuff, except a few riskier ventures I'd still love to try. But when the price of a 1080 Ti doubled (compared to the £700 I paid) I thought nahhh: £700 alone would absolutely kill me, so imagine if I lost that and actually had to fork out over £1200 for a new card. I'd probably hate myself, and something like that would have spelled the end of my career of experimenting & modding hardware, which in itself would be a loss too. So no, it just became too expensive.

Doing stuff on older hardware can be fun, but it's still not the same, as the forums for said hardware are usually dead, so you feel like you're just playing around rather than potentially setting precedents.

We need to see some common sense come back to the pricing arena for our hobby. Because currently it seems like a hobby reserved only for the small 5% with money to "spare".


How have you been knoxx??

And then, going back to my original post above regarding the new *OC Scanner*: how are people finding it? I just removed the STRIX BIOS, reverting to standard FE firmware _(note: the card still has a shunt mod)_.
Surprisingly, OC Scanner tested 2138mhz @ 1.038v, *which is odd,* because power throttling occurred earlier in the test _(stage 1/4 at 0.993v, I think)_.

Maybe there's more to this OC Scanner I'm not yet understanding. I.e. yes, it tests each curve point for all voltages & frequencies, but perhaps also with more than one demand on power/current: (1) light workload, (2) medium workload, (3) heavy workload, (4) extreme workload, with 1, 2, 3 & 4 all perhaps tested separately at each voltage point? That could amount to roughly 1) 150 watts, 2) 250 watts, 3) 350 watts, 4) 450 watts. Obviously Nvidia doesn't expect anyone to have the XOC BIOS or a shunt mod, so the true numbers are greater!

Alternatively, maybe all of that is utter rubbish, and the tester simply knows how to adjust current demand to stay within TDP, which allows scanning at higher frequencies where power throttling would normally be an issue.


----------



## skupples

not sure how it works, I just know it makes my card scream (coil whine) which usually means the power's being tapped hard. Looking forward to getting this under water, it does 2ghz on air in most everything.


----------



## nrpeyton

What numbers is everyone getting using MSI Afterburner's new *OC Scanner*?

I get 2113 - and it seems to work as this is indeed perfectly stable for me.


----------



## kithylin

nrpeyton said:


> What numbers is everyone getting using MSI Afterburner's new *OC Scanner*?
> 
> I get 2113 - and it seems to work as this is indeed perfectly stable for me.


I won't ever use it after your comment above. 500W is terrifyingly scary for a 1080 Ti and I won't subject my card to that. Thanks for sharing the warning.


----------



## Zfast4y0u

kithylin said:


> I won't ever use it after your comment above. 500W is terrifyingly scary for a 1080 Ti and I won't subject my card to that. Thanks for sharing the warning.


well, if u didn't mod ur card I guess it's safe, because of the power-throttle wall. im curious to see my voltage levels for each step in the curve when that new MSI Afterburner version comes out of beta. i just wish we could configure our voltages under 0.800v too... 1080 Tis can give so much performance at lower clocks, and the power draw is a joke.


----------



## kithylin

Zfast4y0u said:


> well if u didnt mod ur card, i guess its safe, since of power throttle wall. im curious to see my voltage lvls for each step in curve when that new msi afterburner version comes out of beta, i just wish we could configure our voltages under 0.800mV too... 1080ti's can give so much performance at lower clocks and power draw is a joke.


I'm running the XOC BIOS on my 1080 Ti and I've once seen it get up to 640 watts; after about 30 seconds I freaked out and shut off the game I was testing with. I normally run it at 2126 core / 12420 memory @ 1.200v and it's perfectly stable in all games. Max temps are 53c under demanding games like ARK Evolved; typical temps are around 35c max when playing games like American Truck Sim and Fallout 4. Large custom water loop for the win. :thumb:


----------



## nrpeyton

kithylin said:


> I'm running XOC bios on my 1080 Ti and I've seen it get up to 640 watts briefly for about 30 seconds once then freaked out and shut the game off I was testing with. I normally run it at 2126 core / 12420 memory @ 1.200v and it's perfectly stable in all games. Max temps 53c under demanding games like ARK Evolved, typical temps are around max 35c when playing games like american truck sim and fallout 4. Large custom water loop for the win. :thumb:


I've re-flashed my FE BIOS so I can feel safer experimenting with the new O/C scanner. 600w +.. jesus.. did you hear a noise suddenly given off by the cards VRM? (a bit like coil whine)?


----------



## kithylin

nrpeyton said:


> I've re-flashed my FE BIOS so I can feel safer experimenting with the new O/C scanner. 600w +.. jesus.. did you hear a noise suddenly given off by the cards VRM? (a bit like coil whine)?


Nope. My computer with the 1080 Ti is physically in another room, on the other side of a wall from my bedroom, so I never hear it, coil whine or anything else. I couldn't care less if the coil whine is screaming loud; I'll never hear it.


----------



## fredocini

Hey guys!

Got myself an EVGA 1080 Ti FTW. Wanting to re-TIM it with Kryonaut. Anyone have experience with this? I've heard from some that it makes a dramatic difference; others say they've seen no improvement at all.

For reference, my case is a Fractal Design Meshify C. The card will be using the stock cooler. My main goal of course is to run my OC - just don't want to listen to 100% fan speed all the time.

OC:
1974 mhz Core
12600 mhz Memory
Temp: 78C - 79C w/ Fire Strike Extreme Stress Test


----------



## skupples

fredocini said:


> Hey guys!
> 
> Got myself a EVGA 1080 Ti FTW. Wanting to re-TIM the paste with Kryonaut. Anyone have some experience with this? I've heard for some that it makes a dramatic difference. Others say they've seen no improvement at all.
> 
> For reference, my case is a Fractal Design Meshify C. The card will be using the stock cooler. My main goal of course is to run my OC - just don't want to listen to 100% fan speed all the time.
> 
> OC:
> 1974 mhz Core
> 12600 mhz Memory
> Temp: 78C - 79C w/ Fire Strike Extreme Stress Test


it depends on your card & how poorly it was TIMed in the first place, & how much the TIM has degraded over its lifespan. 

I'd be willing to bet a teardown on a 1080 Ti would be beneficial at this point, unless you get it and its temps are already well within spec (whatever that may be for your card, idk). 

some folks like to also upgrade/replace the thermal pads. That takes a bit more precision though, as you gotta know what thickness you need.


----------



## fredocini

skupples said:


> it depends on your card & how poorly it was TIMMED in the first place, & how bad the TIM has degraded over its lifespan.
> 
> I'd be willing to bet a tear down on a 1008 TI would be beneficial at this point, unless you get it and its temps are already well within spec (Whatever that may be for your card, idk)
> 
> some folks like to also upgrade/replace the thermal pads. That takes a bit more precision though, as you gotta know what thickness you need.



Thanks. Well, I got this card as an RMA from EVGA (First one I purchased had a fan noise issue). I've checked temps for VRAM and they're really good. Seems that the GPU die is the main thing that seems to be getting warm. I guess a $20 trial doesn't break the bank.


----------



## 2ndLastJedi

So can we use MSI AB to scan our 1080ti's for an OC? I thought it was only RTX cards. My Galax EXOC is on stock bios.


----------



## Glottis

skupples said:


> it depends on your card & how poorly it was TIMMED in the first place, & how bad the TIM has degraded over its lifespan.
> 
> I'd be willing to bet a tear down on a 1008 TI would be beneficial at this point, unless you get it and its temps are already well within spec (Whatever that may be for your card, idk)
> 
> some folks like to also upgrade/replace the thermal pads. That takes a bit more precision though, as you gotta know what thickness you need.


GPU manufacturers use long-life TIM. Even if someone got their 1080 Ti on day 1, that's not even 2 years, so the stock TIM will still be in perfectly fine condition.

My 1080Ti STRIX has same temps as the day I bought it almost 2 years ago.


----------



## nrpeyton

kithylin said:


> Nope. My computer with the 1080 Ti is physically in another room on the other side of a wall from my bedroom so I never hear it or the coil whine or anything. I could care less if the coil whine is screaming loud, I'll never hear it.


The only reason I mentioned it is that I'm lucky enough not to have coil whine myself, except on the few rare occasions when my card has drawn in excess of 500w+. Then I do hear a weird buzzing/hissing noise (more buzzing than hissing), and on hearing it I suddenly felt an overwhelming feeling of dread as I realised the VRM was probably at the limit of its spec. Which is why I immediately shut it down.


----------



## kithylin

nrpeyton said:


> Only reason I mentioned it is because I'm lucky enough not to have coil whine myself; except on the few rare occasions when my card has drawn in excess of 500w + I do hear a weird buzzing/hissing noise (more buzzing than hissing) and on hearing it I suddenly felt an overwhelming feeling of dread as I realised the VRM was probably at the limit of it's spec. Which is why I immediately shut it down.


I wanted to see just the stupidly extreme worst-case scenario once, so I subjected my 1080 Ti to Project C.A.R.S. (the first one) with Vsync off in the settings, rendering 4K on my 1080p screen via Nvidia DSR, and drove a car around until I saw a big power spike in nvidia-smi. I then switched the camera down onto the front bumper and drove the car into a corner to look out into the far distance with almost nothing rendering, and saw the card running at around 580-625 watts in nvidia-smi. I kinda freaked out, only subjected it to that for about 10-15 seconds, then killed the game with alt-f4.
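For anyone wanting to watch these spikes without staring at a terminal, nvidia-smi's query mode is easy to poll from a script. A minimal sketch (the function names are mine; the parsing helper is split out so it can be exercised without a GPU):

```python
import subprocess

def parse_power_csv(text: str) -> float:
    """Parse output of `nvidia-smi --query-gpu=power.draw
    --format=csv,noheader,nounits`: one number (watts) per line,
    one line per GPU. Returns the first GPU's reading."""
    return float(text.strip().splitlines()[0])

def read_power_draw(gpu_index: int = 0) -> float:
    """Return the current board power draw in watts for one GPU."""
    out = subprocess.check_output(
        ["nvidia-smi", "-i", str(gpu_index),
         "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_power_csv(out)
```

Calling `read_power_draw()` in a loop (say once a second) and logging the maximum gives a rough peak figure, though nvidia-smi reports an averaged value and can miss millisecond-scale transients.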


----------



## nrpeyton

kithylin said:


> I wanted to see just the stupidly extreme worst case scenario once and I subjected my 1080 Ti to playing Project C.A.R.S. (the first one), with Vsync off in settings, and running it 4K on 1080p via nvidia DSR, and drove a car around until I saw a big power spike on nvidia-SMI. I then switched the camera to be down on the front bumper and drove the car into a corner to look out in to the far distance with nothing rendering and saw the card running around 580-625 watts in nvidia-SMI. I kinda freaked out and only subjected it to that for about 10-15 seconds then killed the game with alt-f4.



Yup I hear ya, we've both had that feeling before then, lol. It's like Curiosity vs Fear.


The "dew point" _(the lowest temperature I can go before condensation begins)_ is exceptionally low today. Normally sitting at around 12c -- today only 3.5c. So I've just fired up the Chiller -- I am going to run the O/C scanner again (first with FE BIOS) and see if I can beat yesterdays 2113MHZ.


----------



## Dwofzz

Count me in, This puppy screams like a pig at 1.25v +


----------



## kithylin

Dwofzz said:


> Count me in, This puppy screams like a pig at 1.25v +


Careful.. better not kill that thing with overclocking too far. EVGA doesn't have replacements for 1080 Ti KINGPIN cards anymore. They'll just send you an FE card from warranty. Also how do you get these cards above 1.200v? EVBOT?


----------



## 8051

kithylin said:


> Nope. My computer with the 1080 Ti is physically in another room on the other side of a wall from my bedroom so I never hear it or the coil whine or anything. I could care less if the coil whine is screaming loud, I'll never hear it.


What a great idea. How did you manage the cabling for your keyboard, monitor and mouse? A powered KVM? USB?


----------



## kithylin

8051 said:


> What a great idea. How did you manage the cabling for your keyboard, monitor and mouse? A powered KVM? USB?


We own the house here (well mortgage) so I just run wires into the closet and cut holes in the bottom back corner of the closet and the computers is in the other room on the other side. 20ft usb cable, single usb cable goes to a pair of powered hubs on my desk, 20ft hdmi, displayport, dvi cables.. spdif to DAC for audio, etc. And a wireless power+reset switch I can operate in this room via remote control. Computer noise and heat in the other room. Nice and quiet and no fan noise, no coil whine and no heat in here in the summer. I also have a home NAS / file server in the other room and dedicated window a/c unit in there in the summer, and my own window a/c in here in the summer in my room. I like this setup because I found if I had my monitors + both computers in here all in one room originally, I could never get the a/c to cycle off in the summer. With this setup both a/c units cycle on-off in the summer now and it's less power usage even with two units, because they get to cycle off sometimes instead of one big one running constantly almost 24 hours a day in the summer. And I can let the computer room cycle between 75F <-> 84F in the summer, and keep my room a nice cool 70 <-> 75F in here in my room.

The only big problem with this setup is monitors. Right now I'm only using a 1080p screen @ 75 Hz that overclocks to 80 Hz. Which is okay on a 20ft displayport cable. But if I ever want to go to say, a 144hz 1440p screen later, I would have to re-arrange all the furniture in my room and move my whole desk + computer setup on to the other wall so I could use a 8-10ft displayport cable instead. Displayport can't push that kinda bandwidth at this distance reliably.

Speaking of distance and cables: the problem I had at first in getting this setup to work is that, pushing computer data even at these distances (just 20-25 ft), I really started to see the difference in cable quality. I literally had to buy and return at least 50 displayport cables before I found one that could push that resolution and refresh rate over 20ft without blanking out on me; so many displayport cables I literally lost count. And I had to try about 10-15 different USB cables, and finally pay up for a Monster-brand USB cable, before I had one that would do it; the others would drop signal @ 20ft. And even the one that maintains signal can't maintain power: nothing can connect to the end of the cable with just the USB cable alone, so I had to connect it to powered USB hubs at this end on the desk to power the "local" devices here. I use a USB keyboard and microphone. I also at first ran normal copper headphone and microphone cables to the tower along with the others, but I was getting bad signal noise from those two cables being next to the power and other cables in the run. I had to switch to SPDIF/TOSLINK/optical for headphones and use a USB microphone instead, and that worked.

So it took some trial and error to figure it out and I have it all working now.


----------



## Dwofzz

kithylin said:


> Careful.. better not kill that thing with overclocking too far. EVGA doesn't have replacements for 1080 Ti KINGPIN cards anymore. They'll just send you an FE card from warranty. Also how do you get these cards above 1.200v? EVBOT?


Benched up to 1.3v with chilled water (~8c). It is running in my daily now at ~1.14v, which should be plenty safe  
Yes, EVBOT or the Classified tool; that's what I'm using!


----------



## BulletSponge

Can cards develop coil whine after a year or so of use or is it something you have out of the box? I bought my 1080ti last March and last week it started whining like crazy. I have never been able to hear it before (probably due to flight deck induced hearing loss) but this is insane.


----------



## kithylin

BulletSponge said:


> Can cards develop coil whine after a year or so of use or is it something you have out of the box? I bought my 1080ti last March and last week it started whining like crazy. I have never been able to hear it before (probably due to flight deck induced hearing loss) but this is insane.


All high-end video cards have some form of coil whine, even from the factory; every model does. It depends on what you're doing with it, what game you're playing, and how you're running the card as to whether it's discernible to human ears. Try playing with vsync on; it should help if you're playing with it off. Usually coil whine shows up when you're letting the card run "untethered".


----------



## BulletSponge

kithylin said:


> All high-end video cards have some form of coil whine, even from the factory; every model does. It depends on what you're doing with it, what game you're playing, and how you're running the card as to whether it's discernible to human ears. Try playing with vsync on; it should help if you're playing with it off. Usually coil whine shows up when you're letting the card run "untethered".



It whines whether I'm gaming or not; sitting here with Chrome open, it is loud enough that my dogs won't even come in the room. Guess I need to run a few benches and see if it changes/worsens.


----------



## Zfast4y0u

Dwofzz said:


> Count me in, This puppy screams like a pig at 1.25v +


Coil whine? It's a nice-looking card. Curious, how much did it cost brand new (before the 2000 series was out)?
@BulletSponge

You get more coil whine if you overclock your card a lot and don't limit FPS to 60 (assuming you use a 60Hz screen). Try lowering boost clocks; it will help with the coil whine. I have two 1080 Tis and one of them makes more coil whine than the other at the same clock speeds. Under 2000MHz they still have a bit, but nothing that would be annoying; fan noise is louder than the coil whine in that case.

As far as I know, coil whine doesn't come or go with time.


----------



## Dwofzz

MSRP was $1250 when it launched and I picked this up for 950, and it is really mint! I had a Founders 1080 Ti before this one, but that one died after severe modding and extended sessions with LN2. Although the Founders clocked way higher on air/water without any extra voltage: I think it ran Firestrike at 2126MHz and was all-game stable at 2115MHz. The Kingpin does 2088MHz at 1.14v (both 25mV switches on), but that's about it.


----------



## nrpeyton

Dwofzz said:


> MSRP was 1250$ when it launched and I picked this up for 950 and it is really mint! I had a founders 1080 Ti before this one but that one died after severe modding and extended sessions with LN2.. Altho the founders clocked way higher on air/water without any extra voltage.. Think it ran firestrike at 2126MHz and all game stable at 2115MHz, The kingpin does 2088MHz at 1.14v (both 25mV switches on) but thats about it.


I noticed your previous post about chilled Water (8c).

What is your chiller's performance like at lower temperatures (sub-8c)?

At full load 7700k + 1080Ti _(400w full load)_ my Hailea HC-500 won't go below 12-14c.

I've read stuff (and seen a YouTube video) about people modifying old parts off air-con systems and getting to -25c easily. Even in the video by Linus, his setup uses an open-top Walmart cool-box, for god's sake!

Now compare that to my big 1/2 horsepower Hailea HC-500 (stands about 1m tall). Just looking at it, it's massively sturdy compared to those setups. It's also dedicated. And it's fairly loud.

I've looked at the back of huge 5-metre-long open-door freezer cabinets in supermarkets, and the compressors in those aren't any bigger (or quieter) than what I've got here in my Hailea.

I mean, let's put it into perspective -- out of ALL the aquarium water chillers that Hailea do, it's almost their 2nd biggest model.

I'm also hearing a lot of stories from people with "older" units (some less powerful) that seem to perform much better at lower temperatures. My Hailea performs great at higher temperatures (it seems to easily deliver the 800 watts of cooling power it claims).

But below 13-14c it's pointless.

Why the lack of cooling? How do Linus's makeshift cool-box setup and the other setups all seem to out-perform it at low temps?

Is it due to the gas (refrigerant)?

Modern chiller models like mine use R134a (more environmentally friendly), but its boiling point is only -26c, compared to older models running on R22 (which boils at about -41c). Is that the reason? If so, is there a work-around? Is there a way I could modify my chiller to perform better?

My unit is worth over £400 (that's $600+ US).

I want to get lower on my 1080Ti.



P.S.

Here's my video of my chiller when I modified it to bypass the thermostat for sub-zero 1080Ti runs (I had to run with ZERO load on my 1080Ti and *only* got to about -8c idling). _(Ignore the 1st 10 seconds showing the CPU on dry ice.)_:








-------------






Dwofzz said:


> Count me in, This puppy screams like a pig at 1.25v +


wow, nice -- she is beautiful ;-)

I used to have an 'EVGA Classified 1080' -- which is the closest I'll probably get to owning one of those!


----------



## Dwofzz

nrpeyton said:


> I noticed your previous post about chilled Water (8c)


It was a 960 rad outside in the snow (-17c) with 8 Delta fans, 19/13mm hose, and 2x D5s. Gotta use the winter for something good, right? 

I was bored so I came up with this solution. Didn't want to go LN2 since I was going to use it in my daily, and vaseline tends to make things messy..


----------



## scaramonga

Any recommendation for a good waterblock for my AORUS 1080 Ti (non-extreme)?

I've heard the EK ones are no good VRM-wise, although does Rev 2.0 fix this?

Can't get hold of the Phanteks Glacier GTX 1080 TI Gigabyte AORUS here in UK, as I was gonna grab that.

They are all a bit thin on the ground now.


----------



## Zfast4y0u

Dwofzz said:


> MSRP was 1250$ when it launched and I picked this up for 950 and it is really mint! I had a founders 1080 Ti before this one but that one died after severe modding and extended sessions with LN2.. Altho the founders clocked way higher on air/water without any extra voltage.. Think it ran firestrike at 2126MHz and all game stable at 2115MHz, The kingpin does 2088MHz at 1.14v (both 25mV switches on) but thats about it.


Eh, for that kind of price you would think they actually put binned GPUs on that PCB. -.-


----------



## Dwofzz

Zfast4y0u said:


> Dwofzz said:
> 
> 
> 
> MSRP was 1250$ when it launched and I picked this up for 950 and it is really mint! I had a founders 1080 Ti before this one but that one died after severe modding and extended sessions with LN2.. Altho the founders clocked way higher on air/water without any extra voltage.. Think it ran firestrike at 2126MHz and all game stable at 2115MHz, The kingpin does 2088MHz at 1.14v (both 25mV switches on) but thats about it.
> 
> 
> 
> eh, for that kind of price, man would think they actually put binned gpu's on that PCB. -.-
Click to expand...

Yeah I know.. But thanks to Nvidia and their bull**** all the highest-binned GPUs go to their Founders cards so they can justify the price and get as much money out of them as possible.
That's why the Kingpin card only has a "2025MHz guarantee".


----------



## GunnzAkimbo

I aimed at getting a 2070, then a 2080, and now I have a crack at getting an MSI 1080 Ti Gaming X Trio card in the same city on eBay.
It will cost less than a 2080, plus a 10% discount voucher on eBay.

So, $891 AUD for the card vs $1151 AUD for the best-value MSI 2080 card (Duke... borrrring) with the eBay discount.
Only playing games at 4K, or in a 1440p desktop window (heavy titles), on a 55" screen.

What do you think? I think it's good with the discount.


----------



## skupples

yes, do it.


I went from a 1070 to a 2080 and totally hated it, so I went to a 1080 Ti instead for $500. I didn't go for a fancy one though, since RMA numbers are low and I plan to be riding Titans again soon.


----------



## GunnzAkimbo

skupples said:


> yes, do it.
> 
> 
> i went from 1070 to 2080 n totally hated it, so went to 1080ti instead for $500, i didn't go for a fancy one though cuz RMA numbers are low, & I plan to be riding titans again soon.
> 
> .


Planning to.


----------



## nrpeyton

The 2080 seems completely and utterly pointless to me.

It performs exactly the same as a 1080Ti but for a lot more money. You're basically just paying 200-250 more than you need to. Only people who are unaware and haven't done their research will be duped.

Nvidia is actually relying on people's naivety to sell the 2080.


----------



## kithylin

nrpeyton said:


> The 2080 seems completely and utterly pointless to me.
> 
> It performs exactly the same as a 1080Ti but for a lot more money. You're basically just paying 200-250 more than you need to. Only people who are unaware and haven't done their research will be duped.
> 
> Nvidia is actually relying on people's naivety to sell the 2080.


You get RTX cores and AI Tensor cores with the 2080. That's worth $250, apparently, according to Nvidia.


----------



## Jmatt110

GunnzAkimbo said:


> *aimed at getting a 2070, then a 2080 and now i have a crack at getting a MSI 1080 ti Gaming X Trio card in the same city on Ebay.
> It will cost less than a 2080, + a 10% discount voucher on ebay
> 
> So, $891.AUD for the card VS $1151.AUD for the best value MSI 2080 card (duke...borrrring) with ebay discount.
> Only playing games at 4K or 1440p desktop window (heavy titles) on a 55" screen.
> 
> What ya think? I think it's good with the discount.


I just bought an AORUS GTX 1080 Ti Waterforce Xtreme Edition for $800AUD on OCAU; eBay prices are pretty inflated for 1080 Ti's here.


----------



## nrpeyton

kithylin said:


> You get RTX cores and AI Tensor cores with the 2080. That's worth $250, apparently, according to Nvidia.


Aye, absolutely right, it probably is -- but they do make a big deal about it being for "gamers."

Aren't Tensor cores something to do with "machine intelligence" or AI?


----------



## kithylin

nrpeyton said:


> Aye absolutely right perhaps -- it probably is -- but they do make a big deal about being for "gamers."
> 
> Isn't tenser cores something to do with "machine intelligence" or AI?


Yep! It is exactly that. Tensor cores are AI / machine-intelligence acceleration cores; that's literally their entire function. Supposedly what Nvidia is doing with DLSS is this: they claim they take a certain game, run it through their supercomputer at Nvidia Labs, and let a machine-learning algorithm learn the optimal image quality for that game and create a profile for it. Then they release the profile with the drivers. In a game patched to support DLSS, that driver profile tells the card what to do, and it uses the card's Tensor/AI cores so folks can run the game at a slightly lower resolution and upscale, with supposedly comparable (they claim superior?) image quality vs running at native resolution with AA enabled, but with as much as +30% (or more) performance. Supposedly, if folks enable DLSS while also running RTX / ray tracing, they can run RTX with little to no perceivable FPS hit from RTX.


----------



## chispy

kithylin said:


> I'm running XOC bios on my 1080 Ti and I've seen it get up to 640 watts briefly for about 30 seconds once then freaked out and shut the game off I was testing with. I normally run it at 2126 core / 12420 memory @ 1.200v and it's perfectly stable in all games. Max temps 53c under demanding games like ARK Evolved, typical temps are around max 35c when playing games like american truck sim and fallout 4. Large custom water loop for the win. :thumb:


Hello @kithylin, would you tell me how I can apply 1.20v vcore on my 1080Ti FE with an EK full block (H2O)? I'm already using the XOC BIOS but I can never get above 1.093v  I'd appreciate any help or guidance on how to get more volts than 1.093v. Thanks in advance.


Kind Regards: Angelo


----------



## mouacyk

chispy said:


> Hello @kithylin , would you tell me how can i apply 1.20v for v.core on my 1080Ti FE on EK Full block H2O , i'm already using xoc bios but i can never get above 1.093v  , appreciate any help or guidance how to get more volts than 1.093v. Thanks in advanced.
> 
> 
> Kind Regards: Angelo


Control + F in MSI Afterburner. Drag the point at 1.2v up so it has the highest clock on the graph, like 2100+, and click Apply (the check mark) in Afterburner.


----------



## mattliston

You can also hit "L" while on the curve clocking screen and lock it to that voltage.


Beware: it runs like that even at the desktop.


----------



## chispy

mattliston said:


> Can also hit "L" while on curve clocking screen and lock it to that voltage.
> 
> 
> Beware, it runs like that even at desktop





mouacyk said:


> Control + F in MSI Afterburner. Drag point at 1.2v up so it has the highest clock on the graph, like 2100+ and click Apply (Check mark) in Afterburner.


Thank you guys, I did not know that the voltage would apply all the way to 1.2v. Thanks for the help, appreciate it.

Kind Regards: Chispy


----------



## kithylin

chispy said:


> Hello @kithylin , would you tell me how can i apply 1.20v for v.core on my 1080Ti FE on EK Full block H2O , i'm already using xoc bios but i can never get above 1.093v  , appreciate any help or guidance how to get more volts than 1.093v. Thanks in advanced.
> 
> 
> Kind Regards: Angelo


Open MSI Afterburner, left-click once on the main afterburner window to make sure it's in focus. Press control + F for the curve screen. Use the curve.


----------



## Knoxx29

kithylin said:


> I'm running XOC bios on my 1080 Ti


Is that Bios for any 1080 Ti?


----------



## chispy

kithylin said:


> Open MSI Afterburner, left-click once on the main afterburner window to make sure it's in focus. Press control + F for the curve screen. Use the curve.


Thank you for the help , appreciate it.

Kind Regards: Chispy


----------



## mouacyk

Knoxx29 said:


> Is that Bios for any 1080 Ti?


For the most part yes, but you may get some disabled outputs if they are different from the original card it was intended for (it was built for either the KPE or the Strix; does someone remember which?).


----------



## GunnzAkimbo

Jmatt110 said:


> I just bought an AORUS GTX 1080 Ti Waterforce Xtreme Edition for $800AUD on OCAU; eBay prices are pretty inflated for 1080 Ti's here.


Missed out on the card I was aiming to get; it sold in 1 day....

Got another picked out. Just using eBay because it has given me a 10% discount on the purchase, so...


----------



## Knoxx29

@kithylin where have you gone?


----------



## nolive721

Now that I have upgraded my triple-screen setup to 75Hz 1080p panels altogether, my OCed FTW3 Hybrid is struggling a bit to keep up a butter-smooth 75fps experience with some AAA titles or simracing games like Assetto Corsa or PCARS2, especially in rain conditions.
Since the card has dual BIOS, I am thinking of flashing the XOC BIOS to unleash more core frequency with higher voltage than 1.093V, but I have also noticed the card clocks down a bit from 42°C already, and this temp is reached relatively quickly in the scenarios I am mentioning.

Long story short, does it really make sense to try this super-duper BIOS?


----------



## fredocini

So, an update on re-TIMing the FTW3...

Used Kryonaut; my friend had some left over and let me use it. Upon removing the heatsink, there was literally no spread on the stock TIM. I spread the Kryonaut myself.

Temps decreased DRAMATICALLY!

After running the Fire Strike stress test, temps NEVER went past 70C. This is WITH the tempered glass panel on AND the fans at a maximum of 70%. I can't believe I was running my GPU at 81C. This stuff really works!


----------



## kithylin

Knoxx29 said:


> @kithylin where are you gone?


Sorry. I missed your post before. Life's been busy for me.



Knoxx29 said:


> Is that Bios for any 1080 Ti?


Yes, XOC is for any 1080 Ti, and it completely removes the power limit altogether. However, it does come with a few catches: you may lose display output ports if your card has a different display-output configuration compared to the Asus Strix 1080 Ti. And it's only for fully custom water-looped 1080 Tis (NOT FOR AIR-COOLED CARDS).


----------



## Knoxx29

kithylin said:


> I'm right here.. whatcha want? I just lurk a lot. Did I miss a post to respond to somewhere? Life's been busy lately for me.


https://www.overclock.net/forum/69-...x-1080-ti-owner-s-club-1767.html#post27841050

Where can I get it?


----------



## kithylin

Knoxx29 said:


> https://www.overclock.net/forum/69-...x-1080-ti-owner-s-club-1767.html#post27841050
> 
> Where can I get it?


I edited my post.. perhaps refresh the page and read the catches first. If you're on an air-cooled card, it's not for you anyway.

EDIT: If you are on a custom water cooled card then this is the thread about XOC. Do read the first post on the first page: https://www.overclock.net/forum/69-nvidia/1627212-how-flash-different-bios-your-1080-ti-196.html


----------






## Knoxx29

kithylin said:


> I edited my post.. perhaps refresh the page and read the catches first. If you're on an air cooled card it's not for you to use anyway.
> 
> EDIT: If you are on a custom water cooled card then this is the thread about XOC. Do read the first post on the first page: https://www.overclock.net/forum/69-nvidia/1627212-how-flash-different-bios-your-1080-ti-196.html


What a pity that i dont have a Watercooled Card but instead a Waterchilled Card


----------



## kithylin

Knoxx29 said:


> What a pity that i dont have a Watercooled Card but instead a Waterchilled Card


Even better!


----------



## Knoxx29

kithylin said:


> Even better!


Thanks for the info, but I'll pass; too many things to do, and I don't understand the whole process.


----------



## kithylin

Knoxx29 said:


> Thanks for the info but i pass, too many things to do and i dont understand the whole process.


Download a BIOS file, use nvflash to save your current BIOS, flash the new one with nvflash, shut the computer off, cold boot, and enjoy?

It's no different than flashing any other Nvidia card. There should be a link to nvflash and the BIOS file in the first post there.

If it's not there I can fetch you direct links to it.
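For reference, the sequence described above looks roughly like this on the command line (a sketch only: `backup.rom` and `xoc.rom` are placeholder filenames, and flag support can vary between nvflash versions):

```shell
# Save the card's current factory BIOS first, so it can be restored later
nvflash --save backup.rom

# Some boards need the write protection disabled before a cross-vendor flash
nvflash --protectoff

# Flash the new image; -6 acknowledges the PCI subsystem ID mismatch prompt
nvflash -6 xoc.rom

# Then shut the machine down completely and cold boot before testing
```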


----------



## Knoxx29

kithylin said:


> Download a bios file, flash it with nvflash, shut computer off cold, do cold boot, boot up, enjoy?


What is all this?

Here:






kithylin said:


> It's no different than flashing any other nvidia card.


I have never done it before


----------



## kithylin

Knoxx29 said:


> what is it all this?
> 
> Here:<snip>
> 
> I have never done it before


Sorry.. I didn't mean to insult your intelligence in any way. I was just assuming (wrongly, apparently) that if you had gone so far as to build a chilled-water setup for your video card, you were probably into flashing cards and such.. that's more of an unusually extreme user scenario thing.

The walkthrough video seems pretty straightforward to me, other than they forgot the step where you do nvflash --save file.rom to extract your factory BIOS, which you can use to restore it later if you want.

Which part do you not understand?

Also, booting into safe mode and uninstalling drivers with DDU is a pretty standard process.


----------



## Knoxx29

kithylin said:


> Sorry.. I didn't mean to insult your intelligence in any way. I was just assuming (wrongly apparently) that if you had went so far as to build a chilled water setup for your video card, that you probably were in to flashing cards and such.. that's more of an unusually extreme user scenario thing.
> 
> The walkthrough video seems pretty straight forward to me, other than they forgot the step where you do nvflash --save file.rom and extract your factory bios that you can use to restore it later if you want.
> 
> Which part do you not understand?
> 
> Also booting in to safe mode and uninstalling drivers with DDU is a pretty standard process.




No offense taken.

I have been building PCs and overclocking for almost 25 years, but I never tried flashing a card because I have always been more focused on CPU and RAM overclocking. However, I watched the video again and I got it, except the part you mentioned:
(they forgot the step where you do nvflash --save file.rom and extract your factory BIOS that you can use to restore it later if you want.)

Don't worry about DDU, I have used it many times 

Please send me the link for the Bios.

Thanks in advance.


----------



## kithylin

Knoxx29 said:


> No offense taken.
> 
> I have been building PCs and Overclocking for almost 25 years but i never tried flashing a card because I have always been more focused on CPU and Ram Overclock, however i watched the video again and i got it except the part you mentioned:
> ( they forgot the step where you do nvflash --save file.rom and extract your factory bios that you can use to restore it later if you want.)
> 
> Don't worry about DDU, i have used it many times
> 
> Please send me the link for the Bios.
> 
> Thanks in advance.


Here's a direct link to the XOC Bios file: https://onedrive.live.com/?authkey=...4&parId=FABF1EAAEEBFA9D9!148675&action=locate

I can see why you had trouble finding it; the original link went to some forums that don't exist anymore.


----------



## VeritronX

Just caught up on this thread; thinking about putting my water block on my 1080 Ti SC card when I get back from my holiday. 

I remember seeing Linus use an active fiber Thunderbolt cable and a breakout box to send DisplayPort and USB etc. from his PC closet to his desk in his own home, and he even had G-Sync working over it.

edit: did some googling on that Thunderbolt passthrough setup.. you need a Thunderbolt card in the PC, which you connect your video card's DisplayPort to; the cable used was a 30ft one from Corning, and the Thunderbolt breakout box is made by Elgato.


----------



## 2ndLastJedi

Thanks to MSI Afterburner's OC Scanner I'm now part of the 10,000 Superposition club! 10,049, when the best I was able to manage with my own OC was 9,996!
Even though my OC was +136 and the scanner did +137, instead of just moving the slider it seems to apply it across the whole curve. I added my memory OC (+440) and Superposition pops out 10,049  
Here is the scanner in action in case anyone is interested.


Even my Firestrike and Timespy scores are higher!!
I've had this Galax 1080Ti EXOC since June 2017 and never gone this high in benchmarks! I always run an undervolt for my daily but had this OC just for benching. https://www.3dmark.com/fs/18218051


----------



## kithylin

2ndLastJedi said:


> Thanks to MSI Afterburner OC scanner im now part of the 10,000 Superposition club !10,049 when the best i was able to manage on my own OC was 9996 !
> Even though my OC was +136 the scanner did +137 but instead of just doing the slider it seems to do it across the whole curve . I added my memory OC (+440) and superposition pops out 10,049
> Here is the scanner in action in case anyone is interested .
> Even my Firestrike and Timespy scores are higher !!
> I've had this Galax 1080Ti EXOC since june 2017 and never gone this high in benchmarks ! I always run a undervolt [email protected] for my daily but had thi OC just for benching.https://www.3dmark.com/fs/18218051


That's not even a 1080 Ti in the thumbnail, and clickbait thumbnails are terrible.. I would have thought about clicking, but not after seeing that.


----------



## Zfast4y0u

2ndLastJedi said:


> Thanks to MSI Afterburner OC scanner im now part of the 10,000 Superposition club !10,049 when the best i was able to manage on my own OC was 9996 !
> Even though my OC was +136 the scanner did +137 but instead of just doing the slider it seems to do it across the whole curve . I added my memory OC (+440) and superposition pops out 10,049
> Here is the scanner in action in case anyone is interested .
> https://www.youtube.com/watch?v=VirDsCifjBQ
> Even my Firestrike and Timespy scores are higher !!
> I've had this Galax 1080Ti EXOC since june 2017 and never gone this high in benchmarks ! I always run a undervolt [email protected] for my daily but had thi OC just for benching.https://www.3dmark.com/fs/18218051


Wait, with that new version of MSI Afterburner, we can set our voltages in the curve all the way down to 0.700v? That's great news for me ^^


----------



## Knoxx29

kithylin said:


> Here's a direct link to the XOC Bios file: https://onedrive.live.com/?authkey=...4&parId=FABF1EAAEEBFA9D9!148675&action=locate
> 
> I can see why you would have trouble finding it, the original link went to some forums that don't exist anymore.


Sorry I didn't answer last night, but I went to bed; I had to get up at 3:00am 

I have downloaded the BIOS and the nvflash software. I just need one more favor from you: I am still missing the info on how to save my current BIOS using nvflash.

Thanks in advance.

Albert.


----------



## 2ndLastJedi

@kithylin Yeah, but I liked the image so I went with it  no apologies 
Edit:
Or maybe I should remove the clickbait??


----------



## MacG32

Does anyone have an original BIOS from a Gigabyte GTX 1080 Ti Founders Edition that they could post, and not the one from https://www.techpowerup.com/vgabios/191327/gigabyte-gtx1080ti-11264-170118 ? That one is Nvidia-branded and not from Gigabyte. Thank you in advance!


----------



## Knoxx29

@kithylin



The voltage is unlocked.

But when running Heaven it crashes almost at the end of the test; either I am doing something wrong or the card won't go above that. At least at 2088MHz it doesn't crash.

Do you mind to send me a pic of your Afterburner conf and the curve you are using?


----------



## kithylin

Knoxx29 said:


> @kithylin
> 
> 
> 
> The voltage is unlocked.
> 
> But when running Heaven crash almost at the end of the test, or i am doing something wrong or the card wont go above that, at least at 2088MHz don't crash.
> 
> Do you mind to send me a pic of your Afterburner conf and the curve you are using?


Verify in the monitoring tab of GPU-Z that your card is actually running the 1.200v that you set it to in PrecisionX, I'll get you a screenshot here in a little bit.

EDIT: My curve:









The XOC BIOS defaults to 2000MHz max boost, so +126 on XOC = 2126.
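The offset arithmetic above can be sketched like this (a toy illustration only, not an Afterburner API; the 2000 MHz default comes from the post):

```python
# Assumption from the post: the XOC BIOS's default max boost is 2000 MHz,
# and an Afterburner core offset simply adds on top of it.
XOC_DEFAULT_MAX_BOOST_MHZ = 2000

def effective_clock_mhz(offset_mhz: int) -> int:
    """Core clock after applying an Afterburner core offset on the XOC BIOS."""
    return XOC_DEFAULT_MAX_BOOST_MHZ + offset_mhz

print(effective_clock_mhz(126))  # 2126
```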


----------



## Knoxx29

kithylin said:


> Verify in the monitoring tab of GPU-Z that your card is actually running the 1.200v that you set it to in PrecisionX, I'll get you a screenshot here in a little bit.
> 
> EDIT: My curve:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> XOC Bios defaults to 2000 Mhz max boost, so +126 on XOC = 2126.


Thanks for your help.

I will try when I get home after work. In the AB curve I just have to move the last point of it, right? And in AB the only settings I need to touch are the curve and memory?


----------



## kithylin

Knoxx29 said:


> Thanks for your help.
> 
> I will try when i get home after work, in the AB curve i just have to move the last point of it, right?


Pretty much. Also, what temps are you getting? 1080 Tis will still throttle and reduce clocks at core temps as low as 35c, in my experience.


----------



## Knoxx29

kithylin said:


> Pretty much. Also what temps are you getting? 1080 Ti's will still throttle and reduce clocks as low as 35c core temps in my experience.


Idle temps are between 27c and 29c, and when running Heaven the max is 50c, but that's running the water chiller in watercooling mode. If I turn on the chiller, I won't see temps above 30c, or a little less, when the card runs at full load.

I edited my previous post.


----------



## KedarWolf

MacG32 said:


> Does anyone have an original BIOS from a Gigabyte GTX 1080 Ti Founders Edition that they could post and not the one from https://www.techpowerup.com/vgabios/191327/gigabyte-gtx1080ti-11264-170118 . It's nVidia branded and not from Gigabyte. Thank you in advance!


All Founders BIOSes are Nvidia-branded; that's just the Gigabyte version of it.

If you flash it and it says 'Gigabyte' in GPU-Z, then it's the Gigabyte Founders BIOS.

Edit: Never mind. I flashed it, and GPU-Z says Nvidia, not Gigabyte.


----------



## Knoxx29

kithylin said:


> Pretty much. Also what temps are you getting? 1080 Ti's will still throttle and reduce clocks as low as 35c core temps in my experience.


 Thanks a lot @*kithylin
*


----------



## GunnzAkimbo

How well does an FTW3 run for gaming at 4K? (I disable any AA at that res.)


----------



## KedarWolf

GunnzAkimbo said:


> How good does an FTW3 run for gaming at 4k (i disable any AA at that res).


No AA in the Nvidia panel, maxed-out details in game; in most games you'll do 60FPS with V-Sync on.

On my G-Sync 4K 60Hz screen, in Shadow of the Tomb Raider and Hellblade: Senua's Sacrifice with AA maxed out in game and graphics settings maxed out (except SOTTR, where I have shadows on low), Nvidia control panel at default settings, I get 40-60FPS with G-Sync enabled. If I disabled AA I'd probably get 60FPS all the time. This is with the FTW3 Kingpin-fix slave BIOS and a good overclock with Afterburner.

AA at 4K only gives you very slightly better visuals, so you can disable it, but I don't, as with G-Sync the games look flawless.


----------



## scaramonga

I just picked up an FTW3. Recommended BIOS? XOC? It's on water.


----------



## nexus35

Not necessarily. The XOC BIOS increases the voltage, but it's not necessarily efficient.
If it's for benching, why not; then again, studying your card is important.
I run better at a lower voltage than 1.093 for 3/4 of the software.


----------



## Streetdragon

Metro Exodus killed all my overclocks on my 1080 Ti (shunt modded).

I had to change the voltage from 0.993 to 1.025 for 2000/6003. Crazy!
Max temp was 33° on the core and around 38° on the backplate.

I'm still scared to go full tilt, 1.093V and 2100MHz.. don't wanna kill the card.... but others are running that speed and voltage all the time without problems..... need to grow some balls


----------



## kithylin

Streetdragon said:


> Metro exodus killed all my overclocks on my 1080TI(shunt modded)
> 
> had to change the voltage from 0.993 to 1.025 for 2000/6003. Crazy!
> Max Temp was 33°on core and around 38° on the backplate.
> 
> im still scared to go full tillt 1.093V and 2100Mhz.. dont wanna kill the card.... but other are running that speed and voltage all the time without problems..... need to grow some balls


I've been running my 1080 Ti @ 1.200v @ 2126, water cooled, daily for going on 17 months now non-stop. No issues whatsoever yet. I just played Astroneer for hours tonight on it, in fact; still no issues.


----------



## Zfast4y0u

@Knoxx29

Can you tell me how you managed to get power temp and memory temp readings in Precision XOC with your Asus 1080Ti?

I thought the sensors were an MSI feature built into their GPUs.


----------



## Knoxx29

Zfast4y0u said:


> @Knoxx29
> 
> Can you tell me how you managed to get power temp and memory temp readings in Precision XOC with your Asus 1080 Ti??
> 
> I thought the sensors were an MSI feature built into their GPUs.


I own an Evga not Asus.


----------



## Streetdragon

kithylin said:


> I've been running my 1080 Ti @ 1.200v @ 2126, water cooled, daily for 17 months now non-stop. No issues whatsoever yet. In fact I just played Astroneer for hours tonight on it, still no issues.


Is this game pushing your gpu as hard as exodus?

because even my timespy extrem clock was unstable with metro @1440p and high settings xD


----------



## skupples

Streetdragon said:


> Metro Exodus killed all my overclocks on my 1080 Ti (shunt modded).
> 
> Had to raise the voltage from 0.993 to 1.025 for 2000/6003. Crazy!
> Max temp was 33° on the core and around 38° on the backplate.
> 
> I'm still scared to go full tilt at 1.093 V and 2100 MHz... don't wanna kill the card... but others are running that speed and voltage all the time without problems... need to grow some balls.


Indeed you do. It'll be more than fine.


----------



## Zfast4y0u

Knoxx29 said:


> I own an Evga not Asus.


ah, I should have figured it out by now, you flashed XOC, that's why it shows Asus 

you troll.


----------



## kithylin

Zfast4y0u said:


> Can you tell me how you managed to get power temp and memory temp readings in Precision XOC with your Asus 1080 Ti??
> 
> I thought the sensors were an MSI feature built into their GPUs.


That's an exclusive EVGA thing called "ICX": extra sensors built into the cards. Other cards don't have the extra sensors.

As they posted above, they own an EVGA card, but I wanted to add that bit about ICX.


----------



## 8051

kithylin said:


> I've been running my 1080 Ti @ 1.200v @ 2126, water cooled, daily for 17 months now non-stop. No issues whatsoever yet. In fact I just played Astroneer for hours tonight on it, still no issues.


What about the third party data mining aspects of Astroneer? Doesn't that bother you?


----------



## skupples

8051 said:


> What about the third party data mining aspects of Astroneer? Doesn't that bother you?


How is that different from any other game's data collection methods? Anti-cheats are among the most invasive pieces of software on your computer.


----------



## kithylin

8051 said:


> What about the third party data mining aspects of Astroneer? Doesn't that bother you?


That's a lot of confusion that's been running around the internet and it's explained in detail here: https://steamcommunity.com/games/361420/announcements/detail/1756870940673197438

That was a broad and generalized EULA for all of the company's games in the past. The EULA for Astroneer has been updated and clarified. Read the post to understand further. Astroneer does -NOT- have any data mining nor analytics running in it. It never had any in the first place. People were just being silly about the game's previous EULA.


----------



## skupples

kithylin said:


> That's a lot of confusion that's been running around the internet and it's explained in detail here: https://steamcommunity.com/games/361420/announcements/detail/1756870940673197438
> 
> That was a broad and generalized EULA for all of the company's games in the past. The EULA for Astroneer has been updated and clarified. Read the post to understand further. Astroneer does -NOT- have any data mining nor analytics running in it. It never had any in the first place. People were just being silly about the game's previous EULA.


Thanks for that.


----------



## Streetdragon

kithylin said:


> I've been running my 1080 Ti @ 1.200v @ 2126, water cooled, daily for 17 months now non-stop. No issues whatsoever yet. In fact I just played Astroneer for hours tonight on it, still no issues.


1.093 V gets me to 2100 MHz on the core, but I only tested 20 mins of gameplay.
2° more on the core, and the water gets a bit hotter too at the same fan speed^^

Must test more. Maybe I can get more out of the VRAM, hmmm...


----------



## kithylin

Streetdragon said:


> 1.093 V gets me to 2100 MHz on the core, but I only tested 20 mins of gameplay.
> 2° more on the core, and the water gets a bit hotter too at the same fan speed^^
> 
> Must test more. Maybe I can get more out of the VRAM, hmmm...


It depends on your card (silicon lottery). Some people on OCN have had their 2080 Ti's @ 2200 MHz with only 1.125v, and some can do 2150 MHz all day long at 1.087v. Sadly my card needs a full 1.200v just to sustain 2126 MHz core. I forgot to add in my above post that I'm running +700 memory too. Some people can get +1000 or +1200 memory to work. I've tried everything and can't get mine past +700 memory no matter what. It's just the luck of the draw.


----------



## Trickster29

Hi, I have 2 1080 Tis (FTW3 hybrids)

Recently I've been having very rare crashes (maybe once or twice a week, normally during really long gaming sessions).

The core temp is normally not over 60C. There's no artifacting, and after the most recent crash (yesterday) I was able to resume gaming just fine once the driver recovered. I know my fair share of things, but I am wondering if anyone else has had a similar crashing pattern.

It being so rare and only during long sessions makes me wonder if the VRMs are getting close to too hot during gaming, causing a crash.

Any other thoughts?


----------



## skupples

I'd start with a DDU flush (display driver uninstaller) then a clean re-install. 

follow DDU directions, or don't follow them at your own peril.


----------



## Trickster29

skupples said:


> I'd start with a DDU flush (display driver uninstaller) then a clean re-install.
> 
> follow DDU directions, or don't follow them at your own peril.


I'll give it a shot, I haven't tried clean installing.

Anyone else feel free to offer any other suggestions.


----------



## skupples

Trickster29 said:


> I'll give it a shot, I haven't tried clean installing.
> 
> Anyone else feel free to offer any other suggestions.


Full removal AND clean install, not just the "clean install" button.

DDU has you boot into Safe Mode, do the uninstall, restart into regular Windows, then install the new driver. 

If it's not that, and the card isn't artifacting, it's most likely not the GPU but something else. Possibly memory, or a dying drive? Unlikely it's the CPU, as there's no BSOD. Or something else overheating? PSU? Not sure, we'll know more after DDU.


----------



## Trickster29

skupples said:


> Trickster29 said:
> 
> 
> 
> I'll give it a shot, I haven't tried clean installing.
> 
> Anyone else feel free to offer any other suggestions.
> 
> 
> 
> Full removal AND clean install, not just the "clean install" button.
> 
> DDU has you boot into Safe Mode, do the uninstall, restart into regular Windows, then install the new driver.
> 
> If it's not that, and the card isn't artifacting, it's most likely not the GPU but something else. Possibly memory, or a dying drive? Unlikely it's the CPU, as there's no BSOD. Or something else overheating? PSU? Not sure, we'll know more after DDU.

Yes, I'm aware of the DDU method where you typically do it in Safe Mode with the DDU application that you download. Thank you for making sure I knew, I was just typing lazily lol

Edit:

The hard drive light was solid during the last crash, so maybe the drive? Also Shadowplay decided it was okay to use all of my RAM one time (memory leak, I guess), so it could totally be a driver issue.


----------



## kithylin

Trickster29 said:


> Hi, I have 2 1080 Tis (FTW3 hybrids)
> 
> Recently I've been having very rare crashes. (maybe once or twice a week normally during really long gaming sessions)
> 
> The core temp is normally not over 60C. There's no artifacting. And the most recent crash (yesterday) it was able to resume gaming just fine after the driver recovered. I know my fair share of things but I am wondering if anyone else has had a similar crashing pattern.
> 
> It being so rare and only during long sessions makes me wonder if the VRMs are getting close to too hot during gaming, causing a crash.
> 
> Any other thoughts?


It's probably Nvidia's latest drivers. I've heard a whole host of BS about them from different people. A friend of mine I built a computer for in spring 2017 has a GTX 1080, and he's reported various crashes and a few random game crashes recently. This is a system he has been running for over a year that literally had -ZERO ISSUES- with any game, and then just recently with new drivers, *bam*, crashing frequently. Someone else I know on this forum from the 775 thread bought a 1080 recently, and with the latest driver they couldn't get it to enter idle clocks; the GPU would be stuck at 1500 MHz core speed even idle at the desktop. And I have a friend with a 2080 Ti who's had ARK: Survival Evolved freezing on them suddenly. In all of these instances I had these friends roll back to driver version 417.71, all of their issues magically disappeared, and they're back to perfect stability again. So I would suggest you do the same. At least try it.


----------



## skupples

Trickster29 said:


> Yes, I'm aware of the DDU method where you typically do it in Safe Mode with the DDU application that you download. Thank you for making sure I knew, I was just typing lazily lol
> 
> Edit:
> 
> The hard drive light was solid during the last crash, so maybe the drive? Also Shadowplay decided it was okay to use all of my RAM one time (memory leak, I guess), so it could totally be a driver issue.


No worries, I'm T3 help desk, so I'm overly used to over-explaining things. 

And it looks like you'll want to install one driver previous, as the good sir stated above.


----------



## Trickster29

kithylin said:


> Trickster29 said:
> 
> 
> 
> Hi, I have 2 1080 Tis (FTW3 hybrids)
> 
> Recently I've been having very rare crashes. (maybe once or twice a week normally during really long gaming sessions)
> 
> The core temp is normally not over 60C. There's no artifacting. And the most recent crash (yesterday) it was able to resume gaming just fine after the driver recovered. I know my fair share of things but I am wondering if anyone else has had a similar crashing pattern.
> 
> It being so rare and only during long term makes me wonder if it's the VRMs getting close to too hot during gaming and it causes a crash.
> 
> Any other thoughts?
> 
> 
> 
> It's probably Nvidia's latest drivers. I've heard a whole host of BS about them from different people. A friend of mine I built a computer for in spring 2017 has a GTX 1080, and he's reported various crashes and a few random game crashes recently. This is a system he has been running for over a year that literally had -ZERO ISSUES- with any game, and then just recently with new drivers, *bam*, crashing frequently. Someone else I know on this forum from the 775 thread bought a 1080 recently, and with the latest driver they couldn't get it to enter idle clocks; the GPU would be stuck at 1500 MHz core speed even idle at the desktop. And I have a friend with a 2080 Ti who's had ARK: Survival Evolved freezing on them suddenly. In all of these instances I had these friends roll back to driver version 417.71, all of their issues magically disappeared, and they're back to perfect stability again. So I would suggest you do the same. At least try it.

I'll try that during the reinstall. I'm not sure which version I'm running atm, though it is pretty recent, so it could definitely be the cause.


----------



## Chobbit

Hey, I'm new to the 1080 Ti after I couldn't turn down the offer of a Strix 1080 Ti with a Strix water block (which I had to fit myself, as it had never been fitted) for £350. I sold my two Strix 980's with water blocks for £400, so it's actually an upgrade and a profit!

It's all in now and I've got it stable in all games, benching at 2088/5670, and in stress testing it doesn't go over 40°C. 

I know the temp is more than good enough and there's plenty of headroom there, but is this overclock actually any good for water?

Where should I be looking to get the most out of this card? And is there any BIOS I should be looking at that works well with the Strix PCB? The OP of this thread mainly covers BIOSes for the FE, and other versions can be bricked with the wrong BIOS.

Cheers for any help 🙂


----------



## Chobbit

I'm starting to read backwards through this thread, and people seem to be able to get very high memory overclocks (someone mentioned anywhere between 700-1200 MHz).

I can find that the base core clock for a 1080 Ti is 1481 MHz, so I have an OC of 607 MHz (a 40% increase), but what is the base memory clock of a 1080 Ti? So I can figure out how much I've added with my current 5670 MHz OC.
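(For the arithmetic: the percentages work out like this. A toy snippet, not from any tool; the ~5505 MHz displayed stock memory clock for a reference 1080 Ti is my assumption, so treat the memory figure as illustrative.)

```python
def oc_percent(stock_mhz, oc_mhz):
    """Percentage increase of an overclocked speed over the stock clock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

print(round(oc_percent(1481, 2088), 1))  # core: 41.0
print(round(oc_percent(5505, 5670), 1))  # memory (displayed clocks): 3.0
```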


----------



## skupples

Chobbit said:


> I'm starting to read backwards through this thread and people seem to be able to get very high Memory Overclocks (someone mentioned anywhere between 700-1200mhz).
> 
> I can find that the base Core clock for a 1080ti is 1481mhz so I have a OC of 607 mhz (40% increase) but what is the base memory clock of a 1080ti? So I can figure out how much I've added with my current 5670 mhz OC.


Remember, the clock you set is multiplied by 2x, so your 607 is a 1214 MHz increase.

In general, as a rule of thumb, I always start at +500 on any NV card since Kepler, which bumps the speed by a gig.

ALSO NOTE - just because the card can support a clock doesn't mean it's the best clock to run. Memory OC typically has diminishing returns, then actually hurts performance once you start causing too many errors from a high OC. So bench thoroughly before just setting and forgetting because of the high number.
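The ×2 arithmetic above can be sketched like this for memory offsets (a toy snippet, not from any real tool; the ~5505 MHz stock displayed clock for a reference 1080 Ti is my assumption):

```python
# Afterburner-style memory offsets on GDDR5X: the displayed clock is half
# the effective data rate, so a +500 offset adds ~1000 MHz effective.
def effective_memory_clock(displayed_mhz, offset_mhz=0):
    """Return the effective (data-rate) clock in MHz."""
    return (displayed_mhz + offset_mhz) * 2

print(effective_memory_clock(5505))       # stock: 11010
print(effective_memory_clock(5505, 500))  # +500 offset: 12010, i.e. "a gig" faster
```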


----------



## Chobbit

skupples said:


> Chobbit said:
> 
> 
> 
> I'm starting to read backwards through this thread and people seem to be able to get very high Memory Overclocks (someone mentioned anywhere between 700-1200mhz).
> 
> I can find that the base Core clock for a 1080ti is 1481mhz so I have a OC of 607 mhz (40% increase) but what is the base memory clock of a 1080ti? So I can figure out how much I've added with my current 5670 mhz OC.
> 
> 
> 
> Remember, the clock you set is multiplied by 2x, so your 607 is a 1214 MHz increase.
> 
> In general, as a rule of thumb, I always start at +500 on any NV card since Kepler, which bumps the speed by a gig.
> 
> ALSO NOTE - just because the card can support a clock doesn't mean it's the best clock to run. Memory OC typically has diminishing returns, then actually hurts performance once you start causing too many errors from a high OC. So bench thoroughly before just setting and forgetting because of the high number.

Thanks, I didn't realise the core clock was doubled.

I'm doing this fully for VR, as that's mostly all I play, so I couldn't use SLI with my 980's, which is why the 1080 Ti is a massive improvement.

I just wondered if there were any more improvements I could get out of the card to be able to add some more supersampling? Or should I just be happy with my current OC?


----------



## junglechocolate

So I moved from an R9 290 to a 1080 Ti Black Edition. I've noticed that Afterburner won't show my clock speeds, and I don't know how to overclock the card. Do I need to reinstall Afterburner and enable some setting in the control panel to be able to overclock?


----------



## 8051

kithylin said:


> It's probably Nvidia's latest drivers. I've heard a whole host of BS about them from different people. A friend of mine I built a computer for in spring 2017 has a GTX 1080, and he's reported various crashes and a few random game crashes recently. This is a system he has been running for over a year that literally had -ZERO ISSUES- with any game, and then just recently with new drivers, *bam*, crashing frequently. Someone else I know on this forum from the 775 thread bought a 1080 recently, and with the latest driver they couldn't get it to enter idle clocks; the GPU would be stuck at 1500 MHz core speed even idle at the desktop. And I have a friend with a 2080 Ti who's had ARK: Survival Evolved freezing on them suddenly. In all of these instances I had these friends roll back to driver version 417.71, all of their issues magically disappeared, and they're back to perfect stability again. So I would suggest you do the same. At least try it.


Thanks for that info kithylin and your info on astroneer.


----------



## kithylin

Chobbit said:


> Thanks I didn't realise the core clock was doubled.
> 
> I'm doing this fully for VR as thats all I mostly play so couldn't use the sli with my 980's which is why the 1080ti is a massive boost improvement.
> 
> I just wondered if there was anymore improvements I could get out of the card to be able to add some more supersampling? Or should I just be happy with my current OC?


https://www.overclock.net/forum/69-nvidia/1627212-how-flash-different-bios-your-1080-ti.html

Since you're on a full-cover custom water loop, check out the XOC BIOS over in this thread. It was literally made by Asus for the STRIX OC 1080 Ti. We've been using it on all our other cards, but it is an original Asus BIOS, and it's designed for your card. It removes the power limit and gives you access to 1.200v with MSI Afterburner. By default all 1080 Ti's are hard voltage locked to around 1.087 - 1.10v.


----------



## kithylin

I'm making another post because I just found out about this via a friend tonight. Nvidia posted a hotfix for people having random crashing problems with current-gen Nvidia cards on Windows 7 x64. In case anyone (other than me) is still on Windows 7 using a 1000-series Nvidia GPU, I'd suggest grabbing this as a short-term fix until they roll out the next stable driver.

https://forums.geforce.com/default/...force-hotfix-driver-418-99-released-2-17-19-/


----------



## lightsout

Fair price for my 1080ti FTW3? It's coming back from RMA and I am going to offload it to free up some cash.

I see an SC just sold for $550 in the marketplace. I am thinking $575 is fair for a top tier 1080ti with 2 years warranty left.


----------



## Jedi Mind Trick

lightsout said:


> Fair price for my 1080ti FTW3? It's coming back from RMA and I am going to offload it to free up some cash.
> 
> I see an SC just sold for $550 in the marketplace. I am thinking $575 is fair for a top tier 1080ti with 2 years warranty left.


You might be able to get ~$600 over at reddit (r/hardwareswap) if you have all the retail packaging. They seem pretty hungry for 1080tis [especially of the EVGA FTW3 variant]. I sold an ex-mining 1080Ti Trio without any packaging for $520 and probably could have gotten more if I tried (I have never had so many messages about a sale-post before).


----------



## lightsout

Jedi Mind Trick said:


> You might be able to get ~$600 over at reddit (r/hardwareswap) if you have all the retail packaging. They seem pretty hungry for 1080tis [especially of the EVGA FTW3 variant]. I sold an ex-mining 1080Ti Trio without any packaging for $520 and probably could have gotten more if I tried (I have never had so many messages about a sale-post before).


Thanks. I am leery about selling over there, it just seems like scams are more common. Have you had good results?


----------



## skupples

Chobbit said:


> skupples said:
> 
> 
> 
> 
> 
> Chobbit said:
> 
> 
> 
> I'm starting to read backwards through this thread and people seem to be able to get very high Memory Overclocks (someone mentioned anywhere between 700-1200mhz).
> 
> I can find that the base Core clock for a 1080ti is 1481mhz so I have a OC of 607 mhz (40% increase) but what is the base memory clock of a 1080ti? So I can figure out how much I've added with my current 5670 mhz OC.
> 
> 
> 
> remember, the clock you set is multiplied by 2x. So your 607 is 1214mhz increase.
> 
> in general, & by rule of thumb, I always start @ +500 on any NV card since Keplar, which bumps the speed by a gig.
> 
> ALSO NOTE - just because it can support that clock doesn't mean its the best clock to run. Mem OC typically has diminishing returns, & then affects peformance after you start causing too many errors from high OC. So bench thoroughly before jut setting & forgetting because of the high number.
> 
> 
> Thanks I didn't realise the core clock was doubled.
> 
> I'm doing this fully for VR as thats all I mostly play so couldn't use the sli with my 980's which is why the 1080ti is a massive boost improvement.
> 
> I just wondered if there was anymore improvements I could get out of the card to be able to add some more supersampling? Or should I just be happy with my current OC?

Only the mem clock.

Seems you're on par. I'd be happy. More clock isn't gonna help SS by much if at all, but you can always push it further. Just remember to bench out the increase.


----------



## Chobbit

kithylin said:


> https://www.overclock.net/forum/69-nvidia/1627212-how-flash-different-bios-your-1080-ti.html
> 
> Since you're full-cover custom water looped, check out the XOC bios over in this thread. It's literally designed from Asus for the STRIX OC 1080 Ti. We've been using it on all our other cards, but it is an original asus bios, and it's designed for your card. It removes the power limit and gives you access to 1.200v voltage with msi afterburner. By default all 1080 Ti's are hard voltage locked to around 1.087 - 1.10v


Thanks, I'll take a look at that BIOS. Is Afterburner the preferred overclocking tool for the 1080 Ti? I used it with my 980's (but that was over 3 years ago, when I first got them and found my permanent OC); I was just using the tool that came with the Strix.

I definitely feel my stable memory OC is a little weak currently, being only 300 MHz over stock (a 6% increase), when memory usually OCs much better than the core, and my core is over 600 MHz up (a 40% increase). Hopefully the extra voltage will improve the mem OC.


----------



## kithylin

Chobbit said:


> Thanks, I'll take a look at that BIOS. Is Afterburner the preferred overclocking tool for the 1080 Ti? I used it with my 980's (but that was over 3 years ago, when I first got them and found my permanent OC); I was just using the tool that came with the Strix.
> 
> I definitely feel my stable memory OC is a little weak currently, being only 300 MHz over stock (a 6% increase), when memory usually OCs much better than the core, and my core is over 600 MHz up (a 40% increase). Hopefully the extra voltage will improve the mem OC.


Even under a custom water loop most 1080 Ti's only average between 2050 - 2100 MHz. Some may get a few % over that, like 2126 - 2150 possibly. The rare ultra-lucky few who win the silicon lottery have had their 1080 Ti's at 2200 - 2250 MHz on custom water. Do remember it's entirely related to your temperature. Pascal is completely changed and nothing like Maxwell and older cards: on Pascal, Nvidia implements thermal throttling starting as low as 40c, so 1080 Ti's will reduce their max boost speed with temperature from 40c up. I forget the exact delta curve, and someone here can correct me if they know, but it's something like -50 MHz for every +10c in temps, or -75 MHz per 5c, I can't remember. I'll share my experience with you, I suppose.
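For what it's worth, the stepped throttling described above can be modelled roughly like this (a toy sketch only; the 40°C threshold comes from the post, while the 13 MHz step and 5°C bin width are my guesses, since the exact curve isn't remembered):

```python
# Toy model of Pascal GPU Boost thermal throttling: above a threshold
# temperature, the boost clock drops by one fixed step per temperature bin.
def throttled_boost(max_boost_mhz, temp_c, threshold_c=40.0,
                    step_mhz=13, bin_c=5.0):
    """Return the boost clock (MHz) after temperature-based throttling."""
    if temp_c <= threshold_c:
        return max_boost_mhz
    bins = int((temp_c - threshold_c) // bin_c) + 1
    return max_boost_mhz - bins * step_mhz

print(throttled_boost(2126, 35))  # below threshold: 2126
print(throttled_boost(2126, 43))  # one bin over: 2113
print(throttled_boost(2126, 62))  # several bins over
```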

I have an EVGA GTX 1080 Ti SC Black Edition (which is just a fancy name for a reference model on the reference PCB with EVGA's twin-fan air cooler). When I first bought it I used it on air for about 2-3 months while my finances settled. On air, stock, on the stock BIOS, I was limited to 1.087v core voltage, and with that I could only get the card up to a 2000 MHz max boost. But even if I set it at 2000 MHz, it would only run it for about the first 5 minutes. Once the card got hot (it ran around 80-84c stock-overclocked) it would throttle the clocks down; 1825 MHz was the maximum stable speed I could get when gaming, and I could only set the RAM to +300 MHz on stock. Then I bought an EK copper plexi water block and connected it to my custom water loop. At the time that was my X99 system with an i7-5820K, which was also looped in, and I was trying to run both the GPU and the CPU on a single 3x120 radiator. Just on the stock BIOS, water cooled, the card would finally run at 1950 MHz max boost and stay there. I thought my temps were great, like 40-45c max gaming, and everything was awesome, so I figured I'd load this Asus Strix XOC BIOS and really see what my card could do. It went fine, no issues, the card just ran hot-ish. With XOC I found out I wasn't lucky in the silicon lottery: my 1080 Ti required the full 1.200v just to remain stable at 2126 MHz max boost with +700 memory, and that's the highest my card will ever do. Which it would do in games, depending on the game. Some games don't load the card very much, and I was able to maintain 2126 MHz core at only around 40-45c and it would be fine. But if I tried to play a game like ARK: Survival Evolved (extremely demanding on the video card) it would max out my 1080 Ti even @ 1080p and push it up to 60-63c even custom water looped, and then the card would drop clocks from 2126 down to 2008 MHz in game. 
This was kind of undesirable, so I later switched systems and now run a large 4-radiator loop (360+240+240+120) with 2 pumps in a different system. I went "backwards": I'm now on an i5-3570K in a special 32-phase-power Z77 motherboard that runs the chip stable @ 5.1 GHz (my 5820K would only do 4.3 GHz), which is better in games even with fewer cores and threads. With the CPU looped in, a motherboard VRM-MOSFET block, water cooled RAM, and this big loop, even after adding more stuff to the loop I can now keep my 1080 Ti under 50c even in ARK, typically around 40-43c. It does still thermally throttle at those temps, from 2126 down to 2103, but that's fine now, and ARK is the only game that does this. In all the other games I play my 1080 Ti typically runs in the 28-35c range, and at those temps it holds the 2126 max boost all the time and never throttles.

Maybe my story of getting my 1080 Ti to where it is today will give you some more insight in to how the cards work.


----------



## Chobbit

So my mate who I got my 1080 Ti from replaced it with a 2080 Ti FE, and I've just found out it completely died, with no OC, only just over 2 weeks after he got it. 

I've just looked into it and didn't realise this was actually quite a common thing 😐 What's going on with the RTX cards? I'm glad I got the 1080 Ti over the 2080 now.


----------



## Chobbit

kithylin said:


> Maybe my story of getting my 1080 Ti to where it is today will give you some more insight in to how the cards work.


Thanks for the background info. Funnily enough, ARK was one of the games I tested to find my stable overclock, and it was one of the only games that would crash the drivers at a 2100 core OC, so I settled on 2088. It was also the game that pushed the hottest temp I've seen, 40c (2x Alphacool 560 radiators, CPU & GPU only, in a single loop).

With your information, and the fact that I'm using it for gaming and not benching scores, honestly it seems pointless trying to push further, as I'm already at the thermal limit before throttling and losing performance anyway. I think I just need to be happy with what I have 🙂


----------



## kithylin

Chobbit said:


> Thanks for the background info. Funnily enough, ARK was one of the games I tested to find my stable overclock, and it was one of the only games that would crash the drivers at a 2100 core OC, so I settled on 2088. It was also the game that pushed the hottest temp I've seen, 40c (2x Alphacool 560 radiators, CPU & GPU only, in a single loop).
> 
> With your information, and the fact that I'm using it for gaming and not benching scores, honestly it seems pointless trying to push further, as I'm already at the thermal limit before throttling and losing performance anyway. I think I just need to be happy with what I have 🙂


Yeah.. I'm one of the Windows 7 holdouts, and I don't care about DirectX 12 games or performance, and I don't care about RTX or DLSS (I'd never use DLSS even if I had it). My 1080 Ti @ 2126, now that I've got it to stop thermal throttling with a huge loop, is only about 12% slower than my good friend's stock Founders Edition 2080 Ti (even after he tuned it with Afterburner, undervolted it and got a RAM overclock in), and the 2080 Ti is $1300, so there's nothing I could possibly upgrade to single-card even if I wanted to. I'm going to need my 1080 Ti to last me at least until the Nvidia 3000 series, or the 4000 series, so I'm trying to squeeze every last MHz out of it until then. I don't even know what the future holds. I'm considering buying another EK full-cover water block for the 1080 Ti this fall and sitting on it to wait. It may be that Nvidia drops Win7 support for the 3000 or 4000 series of cards and I'll have to buy a second 1080 Ti instead.


----------



## Jedi Mind Trick

lightsout said:


> Thanks. I am leery about selling over there, just seems to be scams more often, have you had good results?


It isn't the worst! More seriously, it really doesn't seem that bad. I just assume that most people there know less about tech / are younger, as I get some weird/basic questions, but I have yet to have a bad experience. (I have really only had one bad experience selling online [eBay], and it was completely my [well, Comcast's] fault, for selling a phone that should have been unlocked but was still tied to a ghost Xfinity Mobile line.)

I probably have ~10-15 items sold via reddit and have purchased about the same amount of things. I just try to make sure any/everything is listed in the post and on the paypal invoice. Definitely recommend sending an invoice instead of a 'payment request' but it really shouldn't make too much of a difference.

You can always list it here for $600 and if no one makes an offer that you want, list it there. You have a great card, get the price you want for it!


----------



## lightsout

Jedi Mind Trick said:


> It isn't the worst! More seriously, it really doesn't seem that bad. I just assume that most people there know less about tech / are younger, as I get some weird/basic questions, but I have yet to have a bad experience. (I have really only had one bad experience selling online [eBay], and it was completely my [well, Comcast's] fault, for selling a phone that should have been unlocked but was still tied to a ghost Xfinity Mobile line.)
> 
> I probably have ~10-15 items sold via reddit and have purchased about the same amount of things. I just try to make sure any/everything is listed in the post and on the paypal invoice. Definitely recommend sending an invoice instead of a 'payment request' but it really shouldn't make too much of a difference.
> 
> You can always list it here for $600 and if no one makes an offer that you want, list it there. You have a great card, get the price you want for it!


I just sold a 2200G over there, like just now; it got the most response, compared to none here and on [H]. So it's definitely getting in front of more eyeballs.

You're right, I would like to get what I feel it's worth, and what I paid. I will just make sure the person has plenty of feedback.


----------



## kithylin

*DISCLAIMER* I have no affiliation with the seller.

I occasionally browse through the 'hydro copper' listings on eBay hoping to find some older GPUs cheap that come water cooled, and I happened upon a listing for a pair of used Hydro Copper water blocks for the EVGA GTX 1080 Ti KINGPIN Edition. I'm sharing the link here because there may be some KINGPIN 1080 Ti owners here who missed out on getting the Hydro Copper blocks and would like some.

Here you go: https://www.ebay.com/itm/254130053687


----------



## Jedi Mind Trick

lightsout said:


> Jedi Mind Trick said:
> 
> 
> 
> It isn't the worst! More seriously, it really doesn't seem that bad, I just assume that most people there know less about tech/are younger as I get some weird/basic questions, but I have yet to have a bad experience (I have really only had one bad experience selling online [ebay] and it was completely my [well, comcast's] fault for selling a phone that should have been unlocked, but was still tied to a ghost Xfinity Mobile line]).
> 
> I probably have ~10-15 items sold via reddit and have purchased about the same amount of things. I just try to make sure any/everything is listed in the post and on the paypal invoice. Definitely recommend sending an invoice instead of a 'payment request' but it really shouldn't make too much of a difference.
> 
> You can always list it here for $600 and if no one makes an offer that you want, list it there. You have a great card, get the price you want for it!
> 
> 
> 
> I just sold a 2200g over there, like just now, it got the most response compared to here and [H] which was none. So definitely getting in front of more eye balls.
> 
> You are right I would like to get what I feel its worth, and what I paid, *I will just make sure the person has plenty of feedback.*
Click to expand...

That is what I do: anyone that has <10 feedback and wants a 'big ticket' item must have a good bit of karma or a lot of non-spammy posts (shows they have at least some commitment to reddit). But yeah, hardwareswap has 10s (100s?) of thousands of people. OCN seems like a fraction of that these days 😞

Not that it was a good thing, but the classifieds here are going to get worse in the next few months (the 35-rep requirement is being reinstated at the end of the month). Like half of the posts I've been interested in were from new members.


----------



## lightsout

Jedi Mind Trick said:


> That is what I do, anyone that has <10 feedback and wants a ‘big ticket’ item, must have a good bit of karma or a lot of non-spammy posts (shows they have at least some commitment to reddit). But yea, hardwareswap has 10s (100s?) of thousands of people. OCN seems like a fraction of that these days 😞
> 
> Not that it was a good thing, but the classifieds here are going to get worse in the next few months (35 rep requirement being reinstated at the end of the month). Like half of the posts I've been interested in were from new members.


Yeah that's true. But I'm glad. Back in the day it was a privilege to use the marketplace here, and you had a pretty good idea that people were legit. Lately it's been a lot of new accounts. But you're right, that will slow things down even more. OCN has always had pickier buyers for me when it comes to price; people here have a better eye for current pricing.


----------



## 8051

kithylin said:


> I have a EVGA GTX 1080 Ti SC Black Edition card (which is just a fancy name for a reference model on reference PCB with EVGA's special twin-fan air cooler). When I first bought it I used it on air for about 2-3 months for my finances to settle after initially buying the card. On air, stock on stock bios I was limited to 1.087v core voltage and with that I was only able to get the card up to 2000 mhz Max boost. But.. even if I set it at 2000 mhz, and it did run it, it would only do so for about the first 5 minutes. Once the card got hot (and it ran around 80-84c stock overclocked) it would throttle the clocks down for me all the way down to 1825 Mhz was the maximum stable I could get when gaming. And I could only set the ram to +300 Mhz stock. Then I bought a EK copper plexi water block and connected it up to my custom water loop. Which at the time, back then, was my x99 system with a I7-5820K which was also looped in and I was trying to run both the gpu and the cpu on a single 3x120 radiator. Just on the stock bios being water cooled, the card would finally run at 1950 mhz max boost and stay there. I thought my temps were great, like 40-45c max gaming, thought everything was awesome. Thought I'll load this Asus Strix XOC bios and really see what my card can do. It went fine, no issues, just the card ran hot'ish. With XOC I found out I wasn't lucky in silicon lotto and my 1080 Ti required the full 1.200v just to remain stable at 2126 Mhz max boost, with +700 memory and that was the highest my card will ever do. Which it would do in games, depending on the game. Some games don't load the card very much and I was able to maintain 2126 Mhz core speed and only run around 40-45c and it would be fine. But if I tried to play a game like ARK: Survival Evolved (extremely demanding on the video card) it would max out my 1080 Ti even @ 1080p and make my card run up to 60-63c even custom water looped. And then the card would drop clocks from 2126 -> down to game at 2008 Mhz. 
This was kind of undesirable so I later switched systems and now I run a large 4-radiator loop (360+240+240+120), 2 pumps and a different system. I went "Backwards" and now I'm switched back to a I5-3570K in a special 32-phase-power Z77 motherboard that runs this chip stable @ 5.1 ghz (my 5820K would only do 4.3 ghz) which is better in games even with less cores and threads. So now I have the CPU looped in with motherboard vrm-mosfet block + water cooled ram + this big loop and even with adding more stuff to the loop I now can get my 1080 Ti to hold < 50c stable even in ARK, and typically it runs around 40-43c in ARK. And it does still thermally throttle at these temps down from 2126 -> 2103, but that's fine now. And ARK is the only game that does this. All other games I play now my 1080 Ti typically runs in the 28-35c range and at these temps it holds the 2126 max boost all the time and never throttles.
> 
> Maybe my story of getting my 1080 Ti to where it is today will give you some more insight in to how the cards work.


I have the Asus Strix XOC VBIOS on my air-cooled MSI 1080Ti Duke, which is stock except for some more powerful, shrouded fans and most of the backplate covered in thermal tape. I can run 2150 core clocks @ 1181mV until the temps climb above 63°C, then it throws power and voltage perfcaps and throttles.


----------



## Warboss Choppa

I have a Zotac 1080 Ti Amp Extreme and I've managed to clock it on the stock cooler to 2088MHz with the voltage slider set halfway and +25 on the core; memory is at 6003MHz. I'm so pissed no one made a block for the card, so I'm planning a future experiment of removing the shroud around the fans and adding thermal pads on the back of the card to touch the backplate, as well as using Thermal Grizzly LM. I max out between 50-55C after a bit of gaming. I know with a block I could push this thing further; it's infuriating to know that without modding the **** out of it to get a block on, I'll never see the card's full potential.


----------



## Thoth420

I got my RMA card back from EVGA. 1080Ti FTW3. Is there any way to tell if it's refurbished, and perhaps the reason it was originally returned? It looks completely unused. I bought a 2080Ti in the meantime, so I plan on selling this and want to know what I have.


----------



## wonderin17

Hi,

so recently I've decided to overclock my Rog Strix 1080 Ti OC a bit (been using it for more than a year with no OC at all).

But every time I push the core voltage slider past 10%, it doesn't pass 3DMark FSE. I guess I can get a bit more from my 1080 Ti OC with the right adjustments. The temperatures are fine.

I've been struggling to overclock my GPU, but nothing over +60 (2012MHz) on the core is 100% stable. I can push it further (2038 or at least 2025MHz) and it will hold through about 3/4 of the Fire Strike Extreme loop, but it won't pass by any means. It can pass some other 3DMark tests, just not FS Extreme.

I've tried the OC scan and it just gives me strange results, different every time, but mostly with clocks lower than 2000.

It's been itching my brain for a while, and I can't stop trying.

I've noticed that the GPU is mostly at 98-99% load during stress tests (and gaming), OCed or not. However, every time it reaches 100% load when OCed, the driver crashes within a couple of seconds.

Any ideas where I should dig? So frustrated!


----------



## skupples

anyone have an extra block laying around? I found the one I was looking for, but it was attached to a 1080ti, so I had to take em both. Maybe I'll just get a brand new one from EK at this point.


----------



## kithylin

8051 said:


> I have the Asus Strix XOC VBIOS on my air-cooled MSI 1080Ti Duke, which is stock except for some more powerful, shrouded fans and most of the backplate covered in thermal tape. I can run 2150 core clocks @ 1181mV until the temps climb above 63° then it throws power and voltage perfcaps and throttles.


The XOC bios does -NOT- have power limits. And you should not be using it on an air cooled card. That's extremely dangerous for the card's health. It's for full custom water cooling and liquid nitrogen only.


----------



## 8051

kithylin said:


> The XOC bios does -NOT- have power limits. And you should not be using it on an air cooled card. That's extremely dangerous for the card's health. It's for full custom water cooling and liquid nitrogen only.


I suppose the power and voltage perfcaps could have something to do w/the custom PCB of the MSI 1080Ti Duke rather than Asus Strix XOC VBIOS, but without editing the VBIOS itself how would anyone know? I'm going to try to get more cool air onto the card via a more powerful case floor fan. Chances are, I won't be using this card 18 months from now anyway. I'm going to be trying to get my fried Gigabyte 1080Ti Gaming OC repaired as well.


----------



## lightsout

Thoth420 said:


> I got my RMA card back from EVGA. 1080Ti FTW3. Is there any way to tell if it's refurbished and perhaps the reason it was originally returned? It looks completely unused. I bought a 2080Ti in the mean time so I plan on selling this and want to know what I have.


Not sure how you could know unless there was some repair done. Don't think there are many new 1080 Tis around. But they clean them up well; the card should be good to go.


----------



## skupples

99% of RMA return cards are refurbs that go through reconditioning, especially this late into the life cycle.

One of my best Titan Keplers was an RMA


----------



## Thoth420

lightsout said:


> Not sure how you could know unless there was some repair done. Don't think there are many new 1080ti's around. But they clean them up well card should be good to go.





skupples said:


> 99% of RMA return cards are refurbs that go thru reconditioning. Specially this late into the life cycle.
> 
> One of my best Titan Keplars was an RMA


Thanks fellas so I guess I will price it as used. Cheers


----------



## EdgeCrusher86

*[TOOL] Custom DSR Tool by Orbmu2k (NVIDIA Inspector) - up to 12.00x DSR works fine:*
https://www.forum-3dcenter.org/vbulletin/showthread.php?p=11935309#post11935309

A nice new toy for enthusiasts.



I made some screenshots: 

*UWQHD (3440x1440 native) - 12.00x DSR (11919x4989) - PCars 2 - SMAA Ultra + max. Details:*

https://www24.zippyshare.com/v/DQk7Q2rm/file.html
https://www24.zippyshare.com/v/tmja2s7q/file.html
https://www24.zippyshare.com/v/LiRbV8re/file.html



*UWQHD (3440x1440 native) - 9.00x DSR (10320x4320) - max. Details each:*

*AC : Origins:*
https://www117.zippyshare.com/v/7jpY7Pbb/file.html

*BF V* (that ugly lowres palm though):
https://www118.zippyshare.com/v/rwa0EaWg/file.html

*TW3*:
https://www118.zippyshare.com/v/WcjXUhVi/file.html


X-times more GPU horsepower and 24GB of VRAM would be nice to have.

*Additionally there is also a new RTSS based G-SYNC indicator by Orbmu2k:*
https://www.forum-3dcenter.org/vbulletin/showthread.php?t=593443


----------



## jura11

Warboss Choppa said:


> I have a Zotac 1080 Ti Amp Extreme and I've managed to clock it on the stock cooler to 2088Mhz with the voltage slider set half way and +25 on the core, memory is at 6003Mhz. I'm so pissed no one made a block for the card and I'm trying a future experiment of removing the shroud around the fans and adding thermal pads on the back of the card to touch the back plate. As well as using Thermal Grizzly LM. I max temps out between 50-55C after a bit of gaming. I know with a block I could push this thing further, it's infuriating to know without modding the **** out of to get a block on it I'll never see the cards full potential.


Hi there 

I'm pretty sure Barrow or Bykski make a waterblock for the Zotac GTX 1080 Ti AMP Extreme

You can buy them over on Aliexpress


Hope this helps 

Thanks, Jura


----------



## texas_nightowl

Hi group. I'm about to be a member of the club. Just paid for a used EVGA FTW3 1080 ti (11G-P4-6696-KR). I am keeping my sandy bridge system limping along a little bit longer. At least until Ryzen 3000 later this year. But I needed a good card to go with my new monitor (34" 3440x1440) and did not want to pay for a 2080. That said, I do need a new PSU for this 1080ti. Have you as a group found that a 650w 80+ gold psu is plenty for this card?


----------



## MrTOOSHORT

texas_nightowl said:


> Hi group. I'm about to be a member of the club. Just paid for a used EVGA FTW3 1080 ti (11G-P4-6696-KR). I am keeping my sandy bridge system limping along a little bit longer. At least until Ryzen 3000 later this year. But I needed a good card to go with my new monitor (34" 3440x1440) and did not want to pay for a 2080. That said, I do need a new PSU for this 1080ti. Have you as a group found that a 650w 80+ gold psu is plenty for this card?


2500k system with a 1080ti, a good 650w is more than enough no doubt. What psu is it?


----------



## texas_nightowl

MrTOOSHORT said:


> 2500k system with a 1080ti, a good 650w is more than enough no doubt. What psu is it?


The 650? I don't know yet...haven't chosen! My short list is: Corsair RMx (2018), EVGA SuperNOVA G3, SeaSonic Focus Plus. All 3 are gold. Any reason you know of to choose one over the other? 

FWIW, I plan to use the PSU to upgrade to Ryzen 3000 later this year, but need a PSU immediately for the 1080ti. My existing psu is the Corsair 520HX and is over 7 yrs old.


----------



## MrTOOSHORT

texas_nightowl said:


> The 650? I don't know yet...haven't chosen! My short list is: Corsair RMx (2018), EVGA SuperNOVA G3, SeaSonic Focus Plus. All 3 are gold. Any reason you know of to choose one over the other?
> 
> FWIW, I plan to use the PSU to upgrade to Ryzen 3000 later this year, but need a PSU immediately for the 1080ti. My existing psu is the Corsair 520HX and is over 7 yrs old.


Any of those power supplies would be fine. They're all single-rail designs.


----------



## skupples

^^^ 

the amount of power needed gets grossly overstated on a daily basis; not so much around here anymore, but definitely ALL over reddit's PC-building subforums, etc.

I was supposedly gonna have issues running 3x Titans (just the three Titans) from a 1500W. Yeaaaah, no. Not even when running them at 1.4v.
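For anyone who wants to sanity-check the arithmetic behind that advice, here's a rough sketch in Python. The wattage figures are ballpark TDPs I'm assuming for the build being discussed (one 1080 Ti plus a 2500K), not measured draw:

```python
# Back-of-envelope PSU headroom check for a 2500K + single 1080 Ti build.
# Figures below are assumed ballpark TDPs, not measured system draw.

LOADS_W = {
    "GTX 1080 Ti (stock power limit)": 250,
    "i5-2500K (overclocked, generous estimate)": 130,
    "motherboard/RAM/drives/fans": 75,
}

def headroom(psu_watts, loads=LOADS_W, margin=0.20):
    """Return spare watts after total load plus a transient-spike margin."""
    total = sum(loads.values())
    return psu_watts - total * (1 + margin)

if __name__ == "__main__":
    print(headroom(650))  # -> 104.0
```

With those numbers a 650W unit still has roughly 100W spare even after a 20% transient margin, which lines up with the advice above.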


----------



## texas_nightowl

Thanks! I'll get one of them ordered to be ready for my 1080ti!


----------



## Warboss Choppa

jura11 said:


> Hi there
> 
> I'm pretty sure Barrow or Bykski do makes waterblock for Zotac GTX1080Ti AMP Extreme
> 
> You can buy them over on Aliexpress
> 
> 
> Hope this helps
> 
> Thanks, Jura


It doesn't fit right, I've talked to a few people who have bought it.


----------



## Hequaqua

Does anyone know the thickness of the thermal pads on the EVGA GTX 1080 Ti Black Edition?

Thought about a teardown to replace the pads and paste, and give it a good cleaning (it's not really that dirty, I'm just a bit OCD)


----------



## skupples

https://forums.evga.com/tm.aspx?m=2586299

pretty sure that covers stock pad sizes for multiple cards.

(remember, rep is back and now in the bottom left  )


----------



## cdnGhost

Need some help, I have an asus strix 1080ti oc. I have misplaced a large portion of the backplate screws and 1 of the I/O bracket screws that attach to the rigid frame that the thermal pads rest against.

Any chance anyone knows the size of the smaller screws that are around the gpu chip? And by the I/O bracket? As well as the longer ones that screw into the backplate?

Know anywhere I can buy them and or know anyone who may part with them?

Thanks.

If anyone needs photos or more info I can update later.


----------



## Zfast4y0u

So, today I decided to test the MSI Afterburner beta with the OC scanner. Installed it, moved the sliders to the right and fired up an OC scan for both cards. At the end it tells me the limiting factor is power, of course... and the max overclock I have is [email protected] 1.050mV for both cards. I was monitoring the cards' behavior during the OC scan in HWiNFO and I had a peak of [email protected], but I guess that didn't make it into the curve because of potential power limits at those clock speeds? I know for a fact that I can push a 2150MHz OC without hitting power limits in games. The OC scanner seems to be conservative about max OC.

The lower frequencies are a bit overvolted too, but not all of them; for example [email protected] 0.775mV is 100% stable, but in the curve that frequency sits at 0.800mV.

What are your experiences with this tool?


----------



## wonderin17

I guess my request was ignored since I'm the [lottery] loser


----------



## skupples

huh?


----------



## Streetdragon

wonderin17 said:


> i guess my request is ignored as I'm the [lottery] loser


First switch to MSI Afterburner and try the curve editor (ctrl+f) with 1V and 2000MHz, without any memory overclock, and test that. If it's not stable, try a higher voltage @2000MHz until it's stable and isn't power limiting.

Should wörk
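The walk-the-voltage-up method above is basically a linear search; here's a toy sketch of it. The `is_stable` callback stands in for a manual stress-test run (a 3DMark loop or a game session); Afterburner has no scripting API in play here, this is just the search logic:

```python
# Sketch of the manual search: lock the clock at a target MHz, then step the
# voltage up in Afterburner-sized increments until a stress test passes.
# is_stable(clock_mhz, mv) is a stand-in for a manual 3DMark/game run.

VOLTAGE_STEPS_MV = range(1000, 1200, 25)  # 1.000 V up toward the usual cap

def find_stable_voltage(clock_mhz, is_stable, steps_mv=VOLTAGE_STEPS_MV):
    """Return the lowest tested voltage (mV) at which clock_mhz passes, or None."""
    for mv in steps_mv:
        if is_stable(clock_mhz, mv):
            return mv
    return None  # clock is not stable even at the top of the range

if __name__ == "__main__":
    # Pretend this particular card needs 1050 mV to hold 2000 MHz.
    fake_test = lambda clk, mv: mv >= 1050
    print(find_stable_voltage(2000, fake_test))  # -> 1050
```

Once you have that (clock, voltage) point, you flatten the curve past it so the card never requests more voltage than it needs.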


----------



## Trickster29

Trickster29 said:


> I'll try that during reinstall. I'm not sure which version I'm running atm though. It is pretty recent though. So it could definitely be the cause.





skupples said:


> no worries, i'm T3 help desk, so I'm overly used to over-explaining things.
> 
> and it looks like you'll wanna install one driver previous, as good sir stated above.





kithylin said:


> It's probably nvidia's latest drivers. I've heard a whole host of BS from them from different people. A friend of mine I built a computer for in spring 2017 has a GTX 1080 and he's reported various crashes and a few random game crashes recently. And this is a system he has been running for over a year that literally had -ZERO ISSUES- with any game for a year. And then just recently with new drivers, *bam* crashing frequently. Someone else I know on this forums from the 775 thread bought a 1080 recently and with latest driver they couldn't get their 1080 to enter idle clocks. The GPU would be stuck at 1500 mhz core speed always even idle at desktop. And I have a friend with a 2080 Ti that's had ark: survival evolved freezing on them suddenly. In all of these instances I have had all of these friends roll back to driver version 417.71 and all of their issues magically disappeared and they're back to perfect stability again. So I would suggest you do the same. At least try it.


Okay, after DDU and installing 417.71 it appeared fine for about 8 days. I got another driver crash today though, one it was able to recover from so I could finish my match; ironically, right when the match sent me back to the main menu, the application (Overwatch) lost the GPU. A weekly crash doesn't sound like a hardware issue to me. What I may have noticed is that right when it crashed, YouTube was swapping videos on my second display (can't be too sure if it was swapping or just reloading because of the crash).

Does anyone have any other suggestions?


----------



## kithylin

Trickster29 said:


> Okay after DDU and installing 417.71 it appeared fine for about 8 days. i got another driver crash today though that it was able to recover from to let me finish my match, ironically right when my match sent me back to the main menu the application (Overwatch) lost the GPU. a weekly crash doesn't sound like a hardware issue to me. what i may have noticed is right when it crashed youtube was swapping videos on my second display (cant be too sure if it was swapping or just reloading cause of crash)
> 
> Does anyone have any other suggestions?


What windows version are you using? There's a hotfix for crashing issues from nvidia out called 418.99, but it's only for Win7 x64 and windows 8.1 x64. Here's a link: https://nvidia.custhelp.com/app/answers/detail/a_id/4775

A friend of mine I built a computer for has a GTX 1080 and win7x64 and he was having crashes with 417.71 still and this fixed it for him. At least I haven't heard any words from him in about a week now (no news is good news right?).

Unfortunately there's no hotfix for windows 10 users currently.

And there's a new driver, posted just a few days ago, that may resolve this for you. Info on that is here: https://forums.geforce.com/default/...18-99-released-2-17-19-/post/5993926/#5993926

Personally, I'm staying on 417.71 because none of the fixes or improvements in the new drivers are for any games I care about playing, and I'm not having any issues at the moment with any games.

An old saying: don't fix it if it's not broken. Maybe the new driver will help you with your issues though, or the hotfix, one of the two.


----------



## wonderin17

Streetdragon said:


> wonderin17 said:
> 
> 
> 
> i guess my request is ignored as I'm the [lottery] loser
> 
> 
> 
> first switch to MSI afterburner and try the clock slider (ctrl+f) with 1V and 2000Mhz without any memory overclock and test that. If its not stable, try a higher voltage @2000Mhz till it gets stable and wont powerlimiting.
> 
> Should wörk
Click to expand...

thanks! 

Can you share some insight on whether there is a major difference between different brands' software? You advise using Afterburner instead of the Asus software; I always thought the "core" of all these tools works the same. Will definitely try Afterburner.

It's stable at 2025MHz with a custom fan curve already, and if I turn the fans manually as high as 75% I can reach ~2070MHz stable, but it blows too loud for daily use 🙂


----------



## Trickster29

kithylin said:


> What windows version are you using? There's a hotfix for crashing issues from nvidia out called 418.99, but it's only for Win7 x64 and windows 8.1 x64. Here's a link: https://nvidia.custhelp.com/app/answers/detail/a_id/4775
> 
> A friend of mine I built a computer for has a GTX 1080 and win7x64 and he was having crashes with 417.71 still and this fixed it for him. At least I haven't heard any words from him in about a week now (no news is good news right?).
> 
> Unfortunately there's no hotfix for windows 10 users currently.
> 
> And there's a new driver posted that may resolve this for you just a few days ago. Info on that is here: https://forums.geforce.com/default/...18-99-released-2-17-19-/post/5993926/#5993926
> 
> Personally, me myself, I'm staying on 417.71 because none of the fixes or improvements in the new drivers are for any games I care about playing and I'm not having any issues at the moment at all with any games.
> 
> An old saying.. don't fix it if it's not broken. Maybe the new driver will help you however with your issues. Or the hotfix, one of the two.


I'll take a look in a bit. It did it twice in one day, and both times it looked like Firefox was changing videos in the playlist. I disabled hardware acceleration in Firefox and it stopped for the day. I'm gonna see how that goes first. I'll let you know if it makes a difference; if not, I'll try the driver updates you're recommending.

Edit: forgot to include my Windows version. It's Windows 10 Pro Insider Build 18317


----------



## skupples

can I get a 100% on something?

does a standard ref EK 1080 block fit a standard ref 1080ti card? (i know for example ref 780, titan, 780ti, etc all used the same block after a revision or two)

https://www.amazon.com/EKWB-EK-FC-G...ti+waterblock&qid=1551676858&s=gateway&sr=8-4

https://www.amazon.com/GeForce-Copp...ti+waterblock&qid=1551677104&s=gateway&sr=8-8

^^ better example - shows all ref 10x0 are compatible.

gut says yes, reviews point to yes, that still being the case.


----------



## Streetdragon

wonderin17 said:


> thanks!
> 
> can you share some insights if there is major difference between different brand's software? you advise to use afterburner instead of asus software, i always thought the "core" of all those softwares works the same. will definitely try afterburner.
> 
> it's stable at 2025Mhz with custom fan curve already, and if i turn fans manually as high as 75%, i can reach ~2070mhz stable, but it blows too loud for daily use 🙂


I meant the voltage curve^^
https://forums.evga.com/Guide-How-t...-overclock-with-msi-afterburner-m2820280.aspx

After step 2


----------



## kithylin

skupples said:


> can I get a 100% on something?
> 
> does a standard ref EK 1080 block fit a standard ref 1080ti card? (i know for example ref 780, titan, 780ti, etc all used the same block after a revision or two)
> 
> https://www.amazon.com/EKWB-EK-FC-G...ti+waterblock&qid=1551676858&s=gateway&sr=8-4
> 
> https://www.amazon.com/GeForce-Copp...ti+waterblock&qid=1551677104&s=gateway&sr=8-8
> 
> ^^ better example - shows all ref 10x0 are compatible.
> 
> gut says yes, reviews point to yes, that still being the case.


https://www.ekwb.com/configurator/

Put the video card you want to use in there and select it; the resulting page will tell you whether the card is a reference PCB that uses the reference-PCB waterblock, or a custom PCB that requires a special one. This is why this part of their website exists.


----------



## wonderin17

Streetdragon said:


> wonderin17 said:
> 
> 
> 
> thanks!
> 
> can you share some insights if there is major difference between different brand's software? you advise to use afterburner instead of asus software, i always thought the "core" of all those softwares works the same. will definitely try afterburner.
> 
> it's stable at 2025Mhz with custom fan curve already, and if i turn fans manually as high as 75%, i can reach ~2070mhz stable, but it blows too loud for daily use 🙂
> 
> 
> 
> I meant the voltage curve^^
> https://forums.evga.com/Guide-How-t...-overclock-with-msi-afterburner-m2820280.aspx
> 
> After step 2
Click to expand...

what benefits will it bring if I force it to work at 2000MHz@1V?


----------



## Streetdragon

2000MHz at 1V was just an example^^

You said you have problems with stability when overclocking. With the curve you can play around and easily find the best clock/voltage


----------



## skupples

kithylin said:


> https://www.ekwb.com/configurator/
> 
> Put your video card you want to use in here and select it and this will tell you on the resulting page if the card is reference PCB and uses the Reference-PCB-Waterblock, or if it's custom PCB and requires a special one. This is why this part of their website exists.


thanks, I always forget about their little tool.

The 1080 block actually fits the 1060/70/80 & Ti, all reference. I assume because it drastically reduces tooling cost.

Just needed to confirm they're still doing it the way they've been doing it, which is yes.


----------



## D13mass

Hi, guys! Has anybody here tried installing a cooler such as the RAIJINTEK MORPHEUS II on a 1080 Ti? I'm planning to install 2x ARCTIC P12 PWM PST + RAIJINTEK MORPHEUS II.

Is it a good solution? Any objections?

I have an FE card with the reference design and it is very loud (previously I had a custom water loop, but decided to switch to air). I wanted to change to something new from the RTX series, but honestly I'm not playing that often now, so I decided to stay on the old 1080 Ti.


----------



## skupples

The only objections you'll get are in regards to cooling the VRMs & memory, which you do with micro heatsinks when using a solution that replaces the stock cooling plate for those components.


You could also possibly mod the stock cooler? Much easier to just get the micro heatsinks though.


----------



## D13mass

skupples said:


> The only objections you'll get are in regards to cooling the VRMs & Memory, which you do with microsyncs, when using a solution that replaces the stock cooling plate for said components.
> 
> 
> you could also possibly mod the stock cooler? Much easier to just get the micro heatsyncs though.


>>cooling the VRMs & Memory
I have checked, it includes some heatsinks and other details https://youtu.be/rNXMG-LeiTg?t=360


----------



## skupples

enjoy!


----------



## D13mass

skupples said:


> enjoy!


But I didn't ask for your opinion, I asked whether anybody in this thread actually uses such a cooler or not.
Rephrase: I want to hear from an owner of a custom air cooler on a 1080 Ti.


----------



## skupples

Fair enough. 

I only responded to the objections as it reminded me enough of the Accelero II (which I have experience with) to view them as essentially one and the same.

Yes, you may have better luck on specifics if you seek out one of the custom air-cooler threads, here or elsewhere. I don't think that's a common product in the US & Canada, and most users around here are American or Canadian, so they may give you the same type of feedback based off the Accelero II.


----------



## 8051

skupples said:


> The only objections you'll get are in regards to cooling the VRMs & Memory, which you do with microsyncs, when using a solution that replaces the stock cooling plate for said components.
> 
> you could also possibly mod the stock cooler? Much easier to just get the micro heatsyncs though.


Wouldn't it be possible to use the stock heat spreader plates that come w/the card?


----------



## skupples

8051 said:


> Wouldn't it be possible to use the stock heat spreader plates that come w/the card?


That's what I was thinking too, but I've not done any stock cooler / 3rd party air cooler modding in quite some time.

I'm sure it would work, with some modification, & I'd still glue the sinks onto the plate too


----------



## kithylin

D13mass said:


> But I didn't ask your opinion, I asked does anybody in this thread uses such radiator or not?
> Rephrase - I wanna ask owner of custom air cooler on 1080ti.


There's really not much point in using that sort of thing with modern video cards. Several of these large three-fan cards with big heatsinks are easily as good as, or possibly better than, that thing. It's more for very old cards that only had a single-fan stock setup, or for blower-style cards with the "squirrel cage" fan.

A quick comparison, for example (cooler photos were attached in the original post):

RAIJINTEK MORPHEUS II:

The stock heatsink setup on an EVGA GTX 1080 Ti FTW3:

Or the heatsink on the Asus ROG Strix OC GTX 1080 Ti:

What these cards come with is already good enough, and you're not going to gain much, if anything, by using that thing. The RAIJINTEK MORPHEUS II also makes the card 3 to 3.5 slots wide, which will prevent you from using SLI with it too.


----------



## khemist

I've got a Morpheus II coming today, although I've got a Heatkiller block on mine; just fancied testing out the temps/noise of good air cooling.


----------



## Abaidor

I keep coming back to this thread with an itch to try the XOC BIOS, and I keep leaving unpersuaded that I'll gain much, if anything at all. My Asus Strix 1080Ti OC is doing 2075MHz core & 12000 memory with Asus GPU Tweak and the stock BIOS, and my temps never exceed 36C on water.

From 2075MHz it only makes sense if I can hit something like 2200 or more to feel the difference, but it doesn't look like that happens often. Any card owners running at 2200MHz with the XOC BIOS here?


----------



## kithylin

Abaidor said:


> I keep coming back to this thread with an itch to try the XOC BIOS and I keep leaving without being persuaded whether I will gain much if anything at all. My Asus Strix 1080Ti OC is doing 2075MHz & 12000 Memory with Asus GPU Tweak and stock BIOS and my temps never exceed 36C on water.
> 
> From 2075MHz it only makes sense to hit something like 2200 or more to feel the difference but it does not look like it is something that happens often..Any card owners running at 2200MHz with XOC BIOS here?


I've only seen maybe 2 posts in the actual XOC BIOS thread (which is probably where you should be posting your question) of people successfully running 2200 MHz on a 1080 Ti. Those were likely very special golden-sample cards. The majority of people, even with the XOC BIOS, typically only hit 2100-2150 MHz. It depends on the silicon lottery too; some cards need more voltage than others to clock higher. XOC allows up to 1.200V, whereas most stock BIOSes will limit you to around 1.087V max, usually 1.050V.


----------



## Abaidor

kithylin said:


> I've only seen I think 2 posts in the actual XOC bios thread.. (which you should probably be posting your question in) of people successfully running 2200 mhz on a 1080 Ti. Those were likely very special golden sample card. The majority of people even with the XOC bios typically only hit 2100-2150 Mhz. It depends on the silicon lotto too. Some cards need more voltage than others to clock higher.. XOC allows up to 1.200v where as most stock bios's will limit you to around 1.087v max, usually 1.050v max.


Well, it's not worth the hassle if I'm only going to gain 25MHz in the worst-case scenario. I'm just trying to extend its life until the next gen hits the market; a 3080 Ti or something...


----------



## ThrashZone

Abaidor said:


> I keep coming back to this thread with an itch to try the XOC BIOS and I keep leaving without being persuaded whether I will gain much if anything at all. My Asus Strix 1080Ti OC is doing 2075MHz & 12000 Memory with Asus GPU Tweak and stock BIOS and my temps never exceed 36C on water.
> 
> From 2075MHz it only makes sense to hit something like 2200 or more to feel the difference but it does not look like it is something that happens often..Any card owners running at 2200MHz with XOC BIOS here?


Hi,
You should try the EVGA FTW3 slave vBIOS.
Even KW likes it better


----------



## skupples

2150? I’ve been doing that on air... 0.0


Wonder what this block will do headroom wise then


----------



## khemist

I'll post my temps when fitted.


----------



## skupples

Someone should just remove the sink outta the stock cooler, attach it, & compare. I didn't realize how similar they really were until dude pointed it out.


----------



## D13mass

kithylin said:


> There's really not much point at all in using that sort of thing with our modern video cards. Several of these large three-fan video cards with large heatsinks are easily as good as or possibly better than that thing. That's more for very old cards that only had a single-fan stock setup, or for blower-style cards that have the "squirrel cage" fan.
> 
> A quick comparison for example.
> 
> RAIJINTEK MORPHEUS II:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The heatsink setup stock on a EVGA GTX 1080 Ti FTW3:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Or the heatsink on the Asus ROG Strix OC GTX 1080 Ti:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What these cards come with are already good enough and you're not going to gain much if anything by using that thing. And the RAIJINTEK MORPHEUS II makes the cards be 3 - 3.5 slots wide, which will prevent you from using SLI with that thing too.


Yes, you are right, BUT I have a Founders Edition card: http://www.pny.eu/data/products/article-large/902-20170305123616.png 
I bought it when I had a water loop; now I'm on air only.
For me, it is the only good solution. 
3 years ago I had a GTX 980 Ti and installed a Kraken G10 + Corsair H55, but I did not like that solution; it looked huge.



khemist said:


> I've got a Morpheus II coming today although i've got a heatkiller block on mine, just fancied testing out the temps/noise of good air cooling.


Please write a few sentences here once you've tested it. It would be nice if you added some photos and the downsides from your perspective.


----------



## kithylin

skupples said:


> 2150? I’ve been doing that on air... 0.0
> 
> 
> Wonder what this block will do headroom wise then


The question, though, is can you -sustain- 2150 MHz on air at all times? Or does it start out at 2150 and clock down to, say, 2050 or 2000 or 1980 MHz after you play for a while and the card gets hot?

I would find it very difficult to believe a claim that a card can start at, maintain, and never clock down from 2150 MHz on air cooling, given that we know Nvidia drops clocks on Boost 3.0 cards once they get hot.
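One way to settle the sustained-clock question is to log what the driver reports over a gaming session and compare the early and late samples. Here is a minimal sketch using `nvidia-smi`'s query interface; the field list, polling interval, and helper names are my choices (it assumes `nvidia-smi` is on the PATH):

```python
import csv
import io
import subprocess
import time

QUERY = "clocks.gr,temperature.gpu,power.draw"

def parse_sample(line):
    """Parse one CSV line from nvidia-smi into (core_mhz, temp_c, watts)."""
    core, temp, watts = next(csv.reader(io.StringIO(line)))
    return int(core), int(temp), float(watts)

def read_sample():
    """Poll nvidia-smi once for core clock, GPU temp, and power draw."""
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={QUERY}",
         "--format=csv,noheader,nounits"], text=True)
    return parse_sample(out)

def watch(seconds=600, interval=5):
    """Collect samples for `seconds`; if the minimum core clock is far
    below the first sample, the card is throttling under sustained load."""
    samples = []
    for _ in range(seconds // interval):
        samples.append(read_sample())
        time.sleep(interval)
    return samples
```

Run `watch()` while a game or benchmark loops; a card that "does 2150 on air" should show 2150 in the last samples, not just the first.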


----------



## skupples

No, I highly doubt it maintains it, especially since I push 4K whenever possible. 

I'm curious to see what it can maintain once its all blocked up though.


----------



## Zfast4y0u

skupples said:


> no, I highly doubt it maintains it. Specially since I push 4K whenever possible.
> 
> I'm curious to see what it can maintain once its all blocked up though.


What's the lowest voltage you can set to maintain 2000MHz stable? If you can hit 2150MHz on air, that thing could probably do another 100MHz on the default BIOS with the voltage offset maxed, aka the ~1.09V max voltage allowed.


----------



## skupples

I really haven't played with it much yet. Waiting till I install the block.

I've just done a few scanner sweeps and synthetics. Besides that, I've not spent any time paying attention; I just apply my +200/+600 profile & go. 

that'll all change once I get #2 in & my loop all shored up.

I'm trying to do all this while living with my parents for the first time in a decade, while waiting on my house for the next three months.


----------



## 8051

kithylin said:


> The question though is can you -sustain- 2150 mhz on air, at all times? Or does it start out at 2150 and clock down to say 2050 or 2000 or 1980 Mhz after you play for a while and the card gets hot?
> 
> I would find it very difficult to believe if you claim your card can start, maintain and never clock down from 2150 mhz on air cooling. Given how we know Nvidia drops clocks after the cards get hot with Boost 3.0 cards.


Mine throttles down due to power and voltage perfcaps when the GPU exceeds 63°C. It's on air, with the Asus Strix XOC VBIOS too. I'm going to try installing a high-power fan in the floor of the case and see if that helps.


----------



## 8051

kithylin said:


> There's really not much point at all in using that sort of thing with our modern video cards. Several of these large three-fan video cards with large heatsinks are easily as good as or possibly better than that thing. That's more for very old cards that only had a single-fan stock setup, or for blower-style cards that have the "squirrel cage" fan.
> 
> A quick comparison for example.
> 
> The heatsink setup stock on a EVGA GTX 1080 Ti FTW3:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Or the heatsink on the Asus ROG Strix OC GTX 1080 Ti:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What these cards come with are already good enough and you're not going to gain much if anything by using that thing. And the RAIJINTEK MORPHEUS II makes the cards be 3 - 3.5 slots wide, which will prevent you from using SLI with that thing too.


But the small, thin fans that come w/the stock heatsinks don't provide high static pressure or CFM relative to what you can install on the Raijintek Morpheus II.

My MSI 1080Ti Duke has a fairly large heatsink but only 4 heatpipes, and either 10mm or 15mm thick 85mm diameter fans that spin at ~3000RPM. Replacing these fans w/a shrouded Noctua iPPC-3000 NF-A14 and a shrouded San Ace 127x38mm 109P1312S1031 fan dropped the GPU temps from 75°C to the low 60s @ 80°F ambient temps.


----------



## kithylin

8051 said:


> But the small, thin fans that come w/the stock heatsinks don't provide high static pressure or CFM relative to what you can install on the Raijintek Morpheus II.
> 
> My MSI 1080Ti Duke has a fairly large heatsink but only 4 heatpipes and either 10mm or 15mm thick 85mm diameter fans that spin at ~3000RPM. Replacing these fans w/shrouded a noctua iPPC3000 NFA14 and a shrouded SanAce 127x38mm 109P1312S1031 fan dropped the GPU temps from 75°C to the low 60's @ 80°F ambient temps.


See my cons in the post above. Anything that causes a video card to be more than 2.5 slots wide negates any possible benefit you might have from extra fans.


----------



## doom3crazy

Hi friends. I know I'm late to the game, but I just picked up a 1080 Ti. I know Pascal is pretty locked down. Anyone have any tips/tricks on how to squeeze as much performance as possible out of these cards?


----------



## mouacyk

doom3crazy said:


> Hi friends. I know I am late to the game but I just picked up a 1080 ti. I know pascal is pretty locked down. Anyone have any tips/tricks on how to squeeze as much performance as possible out of these cards?


XOC Bios if you can cool it under 50C.


----------



## The Pook

kithylin said:


> See my cons in the post above. Anything that causes a video card to be more than 2.5 slots wide negates any possible benefit you might have from extra fans.



If you're in an ITX case or going SLI, yeah, but the vast majority of people aren't. 

The Morpheus II is most definitely worthwhile on modern cards too. 






If you can get 50c load temps on a stock cooler then you live in an igloo


----------



## doom3crazy

mouacyk said:


> XOC Bios if you can cool it under 50C.


Okay, that's good to know, thank you. I was actually thinking about getting an AIO + a Kraken G12 bracket. Think that would keep it under 50C? I've never done water cooling with a GPU before, only a CPU.


----------



## kithylin

doom3crazy said:


> Okay thats good to know thank you. I was actually thinking about getting an AIO + a kraken g12 bracket. Think that would keep it under 50c? I've never done water cooling with a gpu before, only cpu.


I have a big 4-radiator custom water loop to keep my 1080 Ti < 50C, but I do have a CPU and motherboard looped into it too.


----------



## KedarWolf

doom3crazy said:


> Okay thats good to know thank you. I was actually thinking about getting an AIO + a kraken g12 bracket. Think that would keep it under 50c? I've never done water cooling with a gpu before, only cpu.



An AIO on the XOC BIOS is never a good idea, as the memory and VRMs are still air cooled. A full waterblock and backplate are highly recommended.


----------



## kithylin

KedarWolf said:


> AIO on the XOC BIOS never a good idea as the memory and VRM's are still air cooled. Full waterblock and backplate highly recommended.


I'll second this. The XOC BIOS on a 1080 Ti should be full-cover custom water loop only, and I would say at least a 240mm radiator just for the GPU as a bare minimum. At one point in the past I tried to run my 1080 Ti with the XOC BIOS on a single 140mm radiator, and it ran as hot as or hotter than air cooling, around 70-75C in Firestrike.


----------



## 8051

The Pook said:


> If you're in an ITX case or going SLI, yeah, but the vast majority of people aren't.
> 
> The Morpheus II is most definitely worthwhile on modern cards too.
> 
> If you can get 50c load temps on a stock cooler then you live in an igloo


The Morpheus II is impressive and that graph only includes 2000 RPM fans too. I'd be running 3000+ RPM fans if I had one.


----------



## doom3crazy

KedarWolf said:


> doom3crazy said:
> 
> 
> 
> Okay thats good to know thank you. I was actually thinking about getting an AIO + a kraken g12 bracket. Think that would keep it under 50c? I've never done water cooling with a gpu before, only cpu.
> 
> 
> 
> 
> AIO on the XOC BIOS never a good idea as the memory and VRM's are still air cooled. Full waterblock and backplate highly recommended.




kithylin said:


> KedarWolf said:
> 
> 
> 
> AIO on the XOC BIOS never a good idea as the memory and VRM's are still air cooled. Full waterblock and backplate highly recommended.
> 
> 
> 
> I'll second this. XOC bios on a 1080 Ti should be full-cover custom water loop only and probably I would say at least a 240mm radiator just for for gpu bare minimum. At one point in the the past I had tried to run my 1080 Ti with XOC Bios on a single 140mm radiator and it ran as hot as or hotter than air cooling, around 70-75c in firestrike.

Thank you for saving me the headache and whatnot. Perhaps for now I'll just get an AIO to keep the GPU cool and keep boost clocks up as high as possible. 

Before this I was running an MSI 980 Ti with a custom BIOS at 1526MHz on the core. It was a beast, but I know Pascal is locked down. Does the XOC BIOS give you voltage and power control? Or just one of those?


----------



## RoBiK

doom3crazy said:


> Does the xoc bios give you voltage and power control ? Or just one of those ?


XOC BIOS has no power limit so there is nothing to control in terms of power. Voltage can be set up to 1.2V.


----------



## VeritronX

How does the new afterburner tuning work with the xoc bios on a card with a waterblock?


----------



## 8051

RoBiK said:


> XOC BIOS has no power limit so there is nothing to control in terms of power. Voltage can be set up to 1.2V.


I don't think this is true, but if you're on liquid cooling you'll never see it. I have the Asus XOC VBIOS flashed to my MSI 1080Ti Duke, which is on air. If the temps exceed 63°C I get power and voltage perfcaps, and the card throttles even with the XOC VBIOS.


----------



## kithylin

8051 said:


> I don't think this is true, but if you're on liquid cooling you'll never see it. I have the Asus XOC VBIOS burned to my MSI 1080Ti Duke which is on air. If the temps exceed 63°C I get power and voltage perfcaps and the card throttles even with the XOC VBIOS.


It is true, and it's a fact. The Asus Strix XOC BIOS has no power limit. If you use it, the power limit slider in MSI Afterburner will be grayed out because there is no power limit at all. I've had my 1080 Ti pull up to 630 watts at one point on the XOC BIOS doing something stupid. I only did it for 15 seconds and then stopped, because I don't want to fry my card. Normal BIOSes for the 1080 Ti will limit you to 350-375ish watts.



VeritronX said:


> How does the new afterburner tuning work with the xoc bios on a card with a waterblock?


I've read reports from other users that the new OC scanner in the new Afterburner will push 1080 Tis with the XOC BIOS past 500 watts of power draw, so I would recommend avoiding that with it.


----------



## doom3crazy

kithylin said:


> I'll second this. XOC bios on a 1080 Ti should be full-cover custom water loop only and probably I would say at least a 240mm radiator just for for gpu bare minimum. At one point in the the past I had tried to run my 1080 Ti with XOC Bios on a single 140mm radiator and it ran as hot as or hotter than air cooling, around 70-75c in firestrike.


So if I were to get a 240mm rad (something like a Corsair H105) and a Kraken G12 bracket, and I put a better aftermarket fan on the G12 to cool the VRM and memory, etc., do you think that would work, or would it still get too hot?


----------



## 8051

kithylin said:


> It is true and it's fact. The Asus Strix XOC Bios has no power limit. If you use it your msi afterburner will be grayed out on the power limit slider because there is no power limit with it at all. I've had my 1080 Ti pull up to 630 watts at one point on the XOC bios doing something stupid. I only did it for 15 seconds then stopped because I don't want to fry my card. Normal bios's for the 1080 Ti will limit you to 350~375'ish watts.


So those power and voltage perfcaps and throttling I see in MSI Afterburner are just an optical illusion? Why would I see a power perfcap if the VBIOS isn't power limited?


----------



## kithylin

8051 said:


> So those power and voltage perfcaps and throttling I see in MSI Afterburner are just an optical illusion? Why would I see a power perfcap if the VBIOS isn't power limited?


If you mean those readings in MSI Afterburner that are either "0" or "1" and randomly switch to "1", then yes, those don't mean anything. I have them disabled in my MSI Afterburner. Those things even switch to "1" when the card is sitting idle at the desktop sometimes. I don't know what they mean, but they're completely pointless. That's not any sort of power cap whatsoever. And the card is not throttling if you have the XOC BIOS on it; it's literally not possible. The XOC BIOS was originally designed by Asus for their Strix OC GTX 1080 Ti to be used in liquid nitrogen (LN2) competitions. No power limits whatsoever. You could theoretically pull 600+ watts over it (if you monitor it with nvidia-smi), and I have done so with my 1080 Ti and the XOC BIOS before. Whereas all other BIOSes limit to around 325-375 watts.
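For what it's worth, the driver itself reports throttle reasons independently of Afterburner's limit flags: `nvidia-smi -q -d PERFORMANCE` prints a "Clocks Throttle Reasons" section with entries such as "SW Power Cap" marked Active or Not Active, which is a neutral way to check whether a power cap is actually kicking in. A small parser sketch (the helper name and structure are mine, not an official API):

```python
import subprocess

def throttle_reasons(report=None):
    """Return {reason_name: is_active} parsed from the
    'Clocks Throttle Reasons' section of `nvidia-smi -q -d PERFORMANCE`."""
    if report is None:
        report = subprocess.check_output(
            ["nvidia-smi", "-q", "-d", "PERFORMANCE"], text=True)
    reasons = {}
    in_section = False
    for line in report.splitlines():
        if "Clocks Throttle Reasons" in line:
            in_section = True
            continue
        if in_section and ":" in line:
            name, _, state = line.partition(":")
            state = state.strip()
            # Only rows whose value is Active / Not Active belong to
            # the throttle-reason table; anything else ends the section.
            if state in ("Active", "Not Active"):
                reasons[name.strip()] = (state == "Active")
            else:
                in_section = False
    return reasons
```

If `throttle_reasons()["SW Power Cap"]` comes back True under load, the driver is applying a power cap regardless of what Afterburner's indicators show.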


----------



## VeritronX

It could just be that the method for unlocking the power involved changing the tables in the BIOS such that the unlock only applies at temps below 63C. That would be a good thing, because you really shouldn't be using that BIOS without a cooling system capable of keeping the card below 60C; pulling 400W+ at higher temps is really bad for the GPU.


----------



## 8051

kithylin said:


> If you mean those readings in MSI Afterburner that are either "0 or 1" and randomly switch to "1" then yes, those don't mean anything. I have them disabled in my MSI Afterburner. Those things even switch to "1" when it's sitting idle at the desktop sometimes. I don't know what they mean but they're completely pointless. That's not any sort of power cap what so ever. And the card is not throttling if you have XOC bios on it. It's literally not possible. XOC Bios is designed from Asus for their STRIX OC GTX 1080 Ti to be used for liquid nitrogen (LN2) competitions originally. No power limits what so ever. You could theoretically pull 600+ watts over it (if you monitor it with nvidia-SMI) and I have done so with my 1080 Ti and the XOC bios before. Where as all other bios's limit to around 325~375 watts.


The next time I get throttling + power perfcaps + voltage perfcaps I'm going to get a screen cap of it because what you're writing and what I'm seeing are two different things.

How do you know for a fact the Asus 1080Ti XOC VBIOS doesn't have power throttling unless you've examined it?


----------



## kithylin

8051 said:


> How do you know for a fact the Asus 1080Ti XOC VBIOS doesn't have power throttling unless you've examined it?


I literally just explained it in the post you just quoted... you can't go over the limit, which I think is 375 watts with any other normal BIOS, and I think that's the Zotac AMP! one. Most normal BIOSes with a power limit will cap you around 315-335 watts, from what I remember reading in TechPowerUp's GPU BIOS database. No matter what you do, you can't even hit 400 watts with normal BIOSes; they won't let you. And yet I've gone up to and past 600+ watts with the XOC BIOS before. So if it does have a limit, it's well beyond what a 1080 Ti could possibly ever pull in power.



8051 said:


> The next time I get throttling + power perfcaps + voltage perfcaps I'm going to get a screen cap of it because what you're writing and what I'm seeing are two different things.


It might be different MSI Afterburner versions. I'm using version 4.3.0, and in this version of Afterburner there are 4 options: Temp limit, Power limit, Voltage limit, and No load limit. Those are what I'm referring to, and they are either "0" or "1". They don't really correlate to anything that I can figure out; they just flip between 0 and 1 completely at random and don't mean anything. I intentionally stay on this older version because I had problems with newer versions of Afterburner not applying the curve correctly for Pascal cards; I went back to this version and it works correctly, so I'm never updating it. The newer versions of Afterburner are designed mostly for RTX 2000 series cards.


----------



## 8051

kithylin said:


> I literally just explained it in my post that you just quoted...... you can't go over I think the limit is 375 watts with any other normal bios and I think that's the Zotac AMP! one. Most normal bios's with a power limit will limit you to around 315 - 335 watts from what I remember reading on techpowerup's GPU Bios database. No matter what you do, you can't even hit 400 watts with normal bios's, they won't let you. And yet I've gone up to and past 600+ watts with the XOC bios before. So if it does have a limit.. it's well beyond what a 1080 Ti could possibly ever pull in power.
> 
> 
> It might be different MSI Afterburner Versions. I'm using version 4.3.0 and in this version of afterburner there are 4 options: Temp limit, Power limit, Voltage limit, and No load limit. Those are what I'm referring to. And those are either "0" or "1". And these don't really correlate to anything that I can ever figure out. They just arbitrarily flip 0 or 1 completely at random and don't mean anything. I intentionally stay on this older version because I had problems with newer versions of afterburner not applying the curve correctly for pascal cards and then I went back to this version and it works correctly so I'm never updating it. The newer versions of afterburner are designed mostly for RTX 2000 series cards.


I'm using MSI Afterburner 4.5.0.12819. When I use GPUz to look up my card it comes back to a: ASUS ROG STRIX GTX 1080 Ti GAMING OC

https://www.techpowerup.com/gpu-specs/asus-rog-strix-gtx-1080-ti-gaming-oc.b4306

So maybe I'm not using the XOC VBIOS at all? I downloaded and flashed the VBIOS from techpowerup.com for the Asus Strix OC.


----------



## 8051

VeritronX said:


> It could just be that the method for unlocking the power involved changing the tables in the bios such that it applies to temps below 63C.. which is a good thing because you really shouldn't be using that bios without a cooling system capable of keeping it below 60C, pulling 400W+ at higher temps is really bad for the gpu.


I'm not really worried about the GPU going above 60°C anymore, I'm more worried about what's going on w/the VRM circuitry -- in particular the MOSFET's.


----------



## kithylin

8051 said:


> I'm using MSI Afterburner 4.5.0.12819. When I use GPUz to look up my card it comes back to a: ASUS ROG STRIX GTX 1080 Ti GAMING OC
> 
> https://www.techpowerup.com/gpu-specs/asus-rog-strix-gtx-1080-ti-gaming-oc.b4306
> 
> So maybe I'm not using the XOC VBIOS at all? I downloaded and flashed the VBIOS from techpowerup.com for the Asus Strix OC.


Probably not then, no. There are two different BIOSes: the Asus Strix OC one that came on the Asus Strix OC GTX 1080 Ti, and the special "XOC" one we refer to. Originally you had to ask an Asus CSR agent for it and sign a waiver that basically nullified your factory warranty, then receive it by email. Fortunately, the community has been sharing it in a thread dedicated to it. That's over here: https://www.overclock.net/forum/69-nvidia/1627212-how-flash-different-bios-your-1080-ti.html

If the VRM/MOSFETs are air cooled (even if the core is water cooled), then generally you can assume the VRMs are probably in the +20C to +30C range above whatever temp the core is running at under load. So say you run Firestrike and the GPU core is at 50C; the VRMs are probably running 70C - 80C. That's why I strongly recommend that folks using the actual XOC BIOS use a full-cover waterblock that brings the VRMs into the water loop.
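The +20C to +30C rule of thumb above can be written as a tiny helper for sanity-checking sensor readouts (the deltas are this post's estimate, not measured specs, and the 125C ceiling mentioned elsewhere in the thread is a typical component rating, not a guarantee):

```python
def vrm_estimate(core_c, delta_lo=20, delta_hi=30):
    """Rule-of-thumb band for air-cooled VRM temp (degC) given core temp."""
    return core_c + delta_lo, core_c + delta_hi

def vrm_headroom(core_c, rated_c=125):
    """Worst-case headroom to a typical 125C VRM rating, per the rule."""
    _, hi = vrm_estimate(core_c)
    return rated_c - hi
```

For example, a 50C core suggests VRMs around 70-80C, leaving roughly 45C of headroom to a 125C rating in the worst case.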


----------



## doom3crazy

kithylin said:


> 8051 said:
> 
> 
> 
> I'm using MSI Afterburner 4.5.0.12819. When I use GPUz to look up my card it comes back to a: ASUS ROG STRIX GTX 1080 Ti GAMING OC
> 
> https://www.techpowerup.com/gpu-specs/asus-rog-strix-gtx-1080-ti-gaming-oc.b4306
> 
> So maybe I'm not using the XOC VBIOS at all? I downloaded and flashed the VBIOS from techpowerup.com for the Asus Strix OC.
> 
> 
> 
> Probably not then, no. There's two different Bios's. There's the ASUS Strix OC one that came on the Asus Strix OC GTX 1080 Ti, and then there's the special "XOC" one we refer to. Which originally someone had to ask an Asus CSR agent about it, sign some waiver that basically meant they agreed to nullify their factory warranty and then get it in email. Fortunately though the community has been sharing it in a special thread designed for it. That's over here: https://www.overclock.net/forum/69-nvidia/1627212-how-flash-different-bios-your-1080-ti.html
> 
> If the VRM/Mosfets are air cooled (even if the core is water cooled), then generally you can assume the vrm's are probably in the +20c to +30c range above what ever temp the core is running at if you're under load. So say you run firestrike and the gpu core is at 50c, the vrm's are are probably running 70c - 80c. That's why I recommend strongly folks using the actual XOC bios use a full-cover waterblock that loops in the vrm's to the water loop.

If someone were to use an AIO and the XOC BIOS, what if you put a nice high-RPM aftermarket fan on the bracket to keep the VRM and memory cool? Is that a viable option, or in reality still not enough?


----------



## Streetdragon

If you install heatsinks over all the chips/VRMs that get hot... maybe.
As long as you cover all the chips and the back of the card it should be fine, but it's still risky.

A full block would be easier.


----------



## kithylin

doom3crazy said:


> If someone was to use an aio and the xoc , what if you put a nice high rpm aftermarket fan on the bracket to keep the vrm and memory cool? Is that a viable option or in reality still not enough ?


It's still super risky. It's your card and your risk, but if you want to risk potentially killing a $700 video card you can't buy new anymore... I guess go ahead. I wouldn't even dare bother if it were me. Not to mention that sticking all those heatsinks over it will most likely void your warranty in the process, and then you couldn't even claim an RMA if you bricked it.

Generally the only even remotely safe way to do it is a full-cover custom water block or nothing for the XOC unlimited BIOS. And even then it's still risky if you don't build a big enough loop. I had my 1080 Ti on the XOC unlimited BIOS and it used to run in the mid-60s C with a single 360mm radiator (but I had an i7-5820K looped in with it too), and I always felt it was still scary and dangerous even then. I ended up building a large computer with 4 radiators, and now it's fine and I can finally keep it < 50C.


----------



## doom3crazy

kithylin said:


> It's still super risky. It's your card and your risk but if you want to risk potentially killing a $700 video card you can't buy new anymore... I guess go ahead. I wouldn't even dare bother if it was me. Not to mention that sticking all those heatsinks all over it will most likely void your warranty in the process and then you couldn't even claim an RMA either if you bricked it.
> 
> Generally the only even remotely safe way to do it is full-cover custom water block or nothing for XOC Unlimited bios. And even then it's still risky if you don't build a big enough loop. I had my 1080 Ti on the XOC-Unlimited bios and it used to run mid-60's C with a single 360mm radiator (but I had a i7-5820K looped in with it too) and I always felt it was still scary and dangerous even then. I ended up building a large computer with 4 radiators and now it's fine and I can keep it < 50c finally.


This information has been very helpful, thank you. I think I will just stick to the stock BIOS and do the AIO thing. 

I am still a bit of a noob overall, and if it's okay I would like to ask some advice on where to set up my AIO. Attached is a picture of my case (be quiet! Silent Base 600) with potential spots for an AIO, which I have labeled 1, 2, 3, 4. Right away, spot #2 is out of the question because that's where (with the help of a friend) I put the AIO for the CPU. Next there's #3, which has two 140mm fans, but I think they're built into the case and I don't have access to that area. I noticed that be quiet!'s website on the Silent Base 600 page states "Radiators can be installed at the top, in the HDD section and at the rear of the case." Assuming by HDD section they mean #4, that leaves me with area #4 as an intake or #1 as exhaust (both 120mm fan/rad size).

Do you have any 120mm AIO recommendations I might want to check out? Also, if you were in my shoes, between areas #1 & #4, which spot would you choose and why? Big thanks in advance!


----------



## schoolofmonkey

I'm starting to wonder what's wrong with my ROG GTX1080ti Strix; the VRM temps are stupidly high, though the GPU temps are good.

Plenty of airflow in my Air 540 (bottom fan mod done), so I don't get it.


----------



## kithylin

schoolofmonkey said:


> I'm starting to wonder what's wrong with my ROG GTX1080ti Strix, the VRM temps a stupidly high, GPU temps are good though.
> 
> Plenty of airflow in my Air 540 (bottom fan mod made), so I don't get it.


That looks normal to me for an air-cooled video card. Usually on air cooling your VRM temps run up to +30C above your core temps, and that's showing +28C, so I don't see a problem. VRM components are typically rated for 125C, so it's nowhere near dangerous levels.


----------



## Zfast4y0u

How hot do blower cards get, then, if the Strix's VRM cooling is at 90C...


----------



## skupples

most modern cards have thermal scans on youtube


----------



## madmax911

Just joined here. My Inno3D 1080 Ti is hitting 2100 MHz on the core, running at 50C with a custom watercooling setup. I'm quite satisfied with that, but I have no OC on the RAM. I wonder if I'd get any noticeable performance boost from it.


----------



## kithylin

madmax911 said:


> Just entered here. My inno3D 1080 TI is hitting 2100 MHz on the core running at 50c using a custom watercooling setup. Im quite satisfied with that. But i have no OC on the RAM. I wonder if i get any noticable performance boost from it.


Most definitely, yes. RAM helps on these cards a good bit. +400 to +500 on the RAM should be achievable with almost all 1080 Tis, even on their stock BIOS, if you're under a custom water loop with a full-cover block on the GPU.
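For a rough sense of what a memory offset buys, peak bandwidth scales linearly with the effective data rate. A back-of-the-envelope sketch for the 1080 Ti's 352-bit bus; the mapping of a +500 Afterburner offset to about +1000 MT/s effective on GDDR5X is my assumption (Afterburner displays half the double-data-rate figure), not something stated in the thread:

```python
BUS_WIDTH_BITS = 352  # GTX 1080 Ti memory bus width

def bandwidth_gbs(effective_mtps):
    """Peak memory bandwidth in GB/s for a given effective data rate (MT/s)."""
    return effective_mtps * BUS_WIDTH_BITS / 8 / 1000

stock = bandwidth_gbs(11_010)            # ~484 GB/s at the stock ~11 Gbps
oc = bandwidth_gbs(11_010 + 2 * 500)     # +500 in Afterburner ~ +1000 MT/s
gain = oc / stock - 1                    # fractional bandwidth gain, ~9%
```

So a +500 offset is worth roughly 9% more raw memory bandwidth; whether that shows up in frame rates depends on how bandwidth-bound the game is.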


----------



## skupples

Most will even do that on air! 

Ever since Kepler, I start @ +100 core, +500 mem.


----------



## scaramonga

My FTW3 sits quite nicely @ 90 on core & 500 on mem - 24 idle / 48 load, could probably push more, but I'm more than happy with that. Full cover EK block BTW


----------



## Masterchief79

I hit 2152MHz GPU and +850 mem in Fire Strike Extreme recently. That's with my 1080Ti Lightning Z and a GPU monoblock installed. The memory wasn't at its limit yet, and the GPU seemed happy at that clock. Well, happy at 1.093V, that is. I wouldn't necessarily run that 24/7. 
https://www.3dmark.com/fs/18761479


----------



## GRABibus

For memory, I can get 11800MHz maximum (+397MHz in MSI AB); otherwise I start to see artifacts in the Heaven benchmark, for example.
This is not so high compared to what I see all around.

Fortunately, my GPU core is a really good clocker, and that's the key, much more than memory, to a performance increase in games and benchmarks.


----------



## 8051

kithylin said:


> Most definitely yes. Ram helps on these cards a good bit. +400 to +500 on ram should be achievable with almost all 1080 Ti's even on their stock bios if you're under a custom water loop with a full-cover block on the gpu.


VRAM overclocks don't make a difference in all games. The only title I have on which it makes a noticeable difference is Deus Ex: Mankind Divided. With Dying Light I have to drastically underclock the VRAM on my 1080Ti or the game crashes w/an overclocked core.


----------



## m3ta1head

D13mass said:


> Hi, guys! Has anybody here tried to install a cooling system such as a RAIJINTEK MORPHEUS II on a 1080ti? I'm planning to install ARCTIC P12 PWM PST * 2 + RAIJINTEK MORPHEUS II.
> 
> Is it good solution? Any objections?
> 
> I have FE card with reference design and it is very loud (previously I had custom water loop, but decided to switch to air). Wanted to change for something new from RTX series, but honestly, I am playing now not so often, so decided to stay on old 1080ti.


I mounted a Prolimatech MK-26 on my Zotac Amp Extreme, using Enzotech ramsinks on the vram components and keeping the stock VRM heatsinks and backplate. At 2ghz/.975v full load temps are 44-46C. VRMs get plenty of airflow and stay nice and cool, too.

https://imgur.com/a/i8gWUjk


----------



## kithylin

m3ta1head said:


> I mounted a Prolimatech MK-26 on my Zotac Amp Extreme, using Enzotech ramsinks on the vram components and keeping the stock VRM heatsinks and backplate. At 2ghz/.975v full load temps are 44-46C. VRMs get plenty of airflow and stay nice and cool, too.


Wow.. just turn it into a 4-slot card then. I assume you're never considering SLI at all?

EDIT: Also, you pretty much voided your factory warranty with those RAM heatsinks, and you can't convert the card to a full cover water loop any more now either.


----------



## Zfast4y0u

@m3ta1head

card looks like a panzer now  u could have slammed 2 fans on top too, what are ur vrm temps??


----------



## 8051

kithylin said:


> Wow.. just turn it in to a 4 slot card then. I assume you're never considering SLI at all?


That looks like more than 4 slots to me.


----------



## mouacyk

Still pretty cool though, to get mid 40's for air-cooled temps. That's custom water-cooling territory.


----------



## schoolofmonkey

kithylin said:


> That looks normal to me for an air cooled video card. Usually on air cooling your VRM temps run +30c above your core temps. And that's showing +28c so I don't see a problem. VRM components are typically designed for 125c so it's nowhere near dangerous levels.


I'm still concerned about my VRM temps, they are hitting 96c.

This is completely stock, Stock fan profile, no oc, they never were this high before.


----------



## mouacyk

schoolofmonkey said:


> I'm still concerned about my VRM temps, they are hitting 96c.
> 
> This is completely stock, Stock fan profile, no oc, they never were this high before.


If you're up to it, apply new thermal pads (recommend fujipoly) or try RMA if still under warranty. It's likely your RMA will be denied because these are designed to handle up to 120C.


----------



## schoolofmonkey

mouacyk said:


> If you're up to it, apply new thermal pads (recommend fujipoly) or try RMA if still under warranty. It's likely your RMA will be denied because these are designed to handle up to 120C.


I still have warranty until July this year.
I tested something just before, I sat a 140mm Noctua fan on the back of the card blowing up, temps dropped BIG TIME.

This is with a 100mhz OC, 120% power.

Just waiting on the next model RTX cards to come out, they aren't worth it now.


----------



## kithylin

schoolofmonkey said:


> I'm still concerned about my VRM temps, they are hitting 96c.
> 
> This is completely stock, Stock fan profile, no oc, they never were this high before.


That's just how your card is designed. Sadly it seems whoever made the cooler for your card came up with a rather poor VRM cooling design, but those are still typical operating temps for an air cooled 1080 Ti. It's not uncommon at all to see VRM temps in the 100-115c range for stock air cooled 1080 Ti's. Like I said, roughly +30c over your core temps. So 70c core temps = 100c VRM temps = normal operating parameters. You can contact support for whoever made your card if it's within warranty and see what they say. But they will most likely just reply and tell you it's normal for your card and deny an RMA request.
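That +30c rule of thumb is easy to jot down as a quick estimator (illustrative only; the 30c delta is a forum rule of thumb for air-cooled cards, not a measured spec):

```python
AIR_COOLED_VRM_DELTA_C = 30  # forum rule of thumb, not a measured spec

def estimate_vrm_temp(core_temp_c: float) -> float:
    """Estimate VRM temp on an air-cooled 1080 Ti from core temp."""
    return core_temp_c + AIR_COOLED_VRM_DELTA_C

def vrm_temp_is_safe(vrm_temp_c: float, limit_c: float = 125.0) -> bool:
    """VRM components are typically rated around 125c."""
    return vrm_temp_c < limit_c

print(estimate_vrm_temp(70))  # 100, matching the 70c core / 100c VRM example
print(vrm_temp_is_safe(96))   # True: 96c is hot but within rating
```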


----------



## schoolofmonkey

kithylin said:


> That's just how your card is designed. Sadly it seems whoever made the cooler for your card came up with a rather poor VRM cooling design, but those are still typical operating temps for an air cooled 1080 Ti. It's not uncommon at all to see VRM temps in the 100-115c range for stock air cooled 1080 Ti's. Like I said, roughly +30c over your core temps. So 70c core temps = 100c VRM temps = normal operating parameters. You can contact support for whoever made your card if it's within warranty and see what they say. But they will most likely just reply and tell you it's normal for your card and deny an RMA request.


It's the ROG Strix version.
First time I've actually had cooling issues with them, I knew the VRM's ran a little warmer than some of the other brands thanks to the Guru3D review, they had 82c when overclocked.

The little fan trick (up above) lowered the temps well.
Hitting GPU 71c VRM 89c with a good overclock on a stock fan curve, custom curve lowers it a lot more..

Oh it's summer here down under BTW...


----------



## Streetdragon

schoolofmonkey said:


> It's the ROG Strix version.
> First time I've actually had cooling issues with them, I knew the VRM's ran a little warmer than some of the other brands thanks to the Guru3D review, they had 82c when overclocked.
> 
> The little fan trick (up above) lowered the temps well.
> Hitting GPU 71c VRM 89c with a good overclock on a stock fan curve, custom curve lowers it a lot more..
> 
> Oh it's summer here down under BTW...


turn the fan on the backside around, so it blows on the backplate^^
Looks like it sucks air from the plate. + Add a bit of space between plate and fan. That should improve the performance even more


----------



## schoolofmonkey

Streetdragon said:


> turn the fan on the backside around, so it blows on the backplate^^
> Looks like it sucks air from the plate. + Add a bit of space between plate and fan. That should improve the performance even more


My thought was that because there are holes cut in the backplate over the VRMs, with no thermal pad connecting them to the metal backplate, sucking the hot air out from under the backplate would be better.

But now thinking about it blowing the cool air that's coming from the front intakes through the back of the card might work better, I'll try it out soon.


----------



## Zfast4y0u

@schoolofmonkey 

gigabyte aorus and msi gaming x have better temps than the strix in the vrm area, aorus being best of the three and strix the worst. your temps are within card spec; if u take a look at the review from kitguru, his vrm temps were 85c, on an open test bench i believe, so yours being 5-6c hotter is fine i guess. however yes, it could be better. if you are worried about the lifespan of ur card, take a look at this video where it's explained in detail when you should worry about vrm temps.

btw the strix backplate has no purpose in cooling whatsoever, it's just for the looks; that's why they cut holes for those chips that get hot, to get some breathing area. i know my ek backplate on the strix gets somewhere between warm and hot to the touch when the card is pushed hard. even on a water block vrm's can get ''hot'' if you overclock to the limit and put the card under heavy load; i know mine can get up to 48-49c in such scenarios. on the profile i run for daily use, vrms max out at around 41-44c


----------



## 8051

mouacyk said:


> Still pretty cool though, to get mid 40's for air-cooled temps. That's custom water-cooling territory.


True enough. I don't think he's doing any overclocking though.


----------



## 8051

Zfast4y0u said:


> @schoolofmonkey
> 
> gigabyte aorus and msi gaming x have better temps than the strix in the vrm area, aorus being best of the three and strix the worst. your temps are within card spec; if u take a look at the review from kitguru, his vrm temps were 85c, on an open test bench i believe, so yours being 5-6c hotter is fine i guess. however yes, it could be better. if you are worried about the lifespan of ur card, take a look at this video where it's explained in detail when you should worry about vrm temps.
> 
> btw the strix backplate has no purpose in cooling whatsoever, it's just for the looks; that's why they cut holes for those chips that get hot, to get some breathing area. i know my ek backplate on the strix gets somewhere between warm and hot to the touch when the card is pushed hard. even on a water block vrm's can get ''hot'' if you overclock to the limit and put the card under heavy load; i know mine can get up to 48-49c in such scenarios. on the profile i run for daily use, vrms max out at around 41-44c
> 
> https://www.youtube.com/watch?v=X9JxeIqVkuQ


Interesting, thanks. I'm surprised the Asus Strix doesn't have a functional backplate, considering it's supposed to be a high-end part.


----------



## Zfast4y0u

@8051 it makes zero contact with anything, it's just for stupid RGB. kinda sad but oh well, that's why this guy had better temps in the vrm area when he put a fan there to remove the hot air under the backplate. that by itself lowered temps by 10c in his case xd i bet he would have even better temps without a backplate at all.

on the EK backplate u have lots of spots for thermal pads dedicated to making contact with the PCB to cool things over. i just wish they'd start making fins on them too, for better heat removal, not just a flat piece of material.


----------



## kithylin

Zfast4y0u said:


> @8051 it makes zero contact with anything, it's just for stupid RGB. kinda sad but oh well, that's why this guy had better temps in the vrm area when he put a fan there to remove the hot air under the backplate. that by itself lowered temps by 10c in his case xd i bet he would have even better temps without a backplate at all.
> 
> on the EK backplate u have lots of spots for thermal pads dedicated to making contact with the PCB to cool things over. i just wish they'd start making fins on them too, for better heat removal, not just a flat piece of material.


On my EVGA 1080 Ti, at least, the stock EVGA backplate made thermal contact with the back of the PCB in multiple places with thermal pads. Nasty pads that left an oily residue all over the PCB that took a few hours to clean off, but they did contact it. But the plate was very thin, like 2~3 mm thick or less. I was happy with the EK full backplate I got.. I'm not sure how thick it is, but it seemed to be about 10mm or so. Definitely about 5x thicker than the thin thing EVGA had on there originally. And it does contact the PCB on the back with multiple thermal pads. I've touched the backplate under load and it does seem to distribute the heat a lot. Looks nice and protects the back of the card too.


----------



## schoolofmonkey

Ok so I tried the fan blowing onto the card, didn't make any difference to the actual VRM temps, but it did raise the GPU temps.

I think the bottom intake and fan on the card are keeping the hot air in a pocket around the card itself, so the hot air is exhausting out the side of the card then the top fan (on the backplate) is blowing that same hot air back into the card.

So I'm getting better results using it to draw hot air out from under the backplate.

I am going to try the smaller 90mm Fractal fan I have here to blow onto the VRMs, because it doesn't hang over the side of the card like the 140mm Noctua one does.

Update:
The 90mm fan brought the VRM's down 7c, but the GPU went up 4c, still nothing to worry about, worth the trade off.
This is with a small 100Mhz OC on the GPU and VRAM, but my GPU won't OC any more than 100Mhz anyway.


----------



## scaramonga

How do you get VRM temp to show?


----------



## schoolofmonkey

scaramonga said:


> How do you get VRM temp to show?


I've got a ROG Strix, HWinfo just showed the VRM temps automatically, I didn't have to do anything.


----------



## 8051

kithylin said:


> On my EVGA 1080 Ti, at least, the stock EVGA backplate made thermal contact with the back of the PCB in multiple places with thermal pads. Nasty pads that left an oily residue all over the PCB that took a few hours to clean off, but they did contact it. But the plate was very thin, like 2~3 mm thick or less. I was happy with the EK full backplate I got.. I'm not sure how thick it is, but it seemed to be about 10mm or so. Definitely about 5x thicker than the thin thing EVGA had on there originally. And it does contact the PCB on the back with multiple thermal pads. I've touched the backplate under load and it does seem to distribute the heat a lot. Looks nice and protects the back of the card too.


The backplate on my MSI 1080Ti Duke had no thermal pads, just an insulating plastic sheet covering it that was a royal PITA to remove, and the adhesive was even worse. I had to use carb cleaner to get the adhesive off and even that wasn't easy.


----------



## skupples

😞
Pic added of issue in bios. 
Had a slight leak the other day while installing a second card. 

Got it all cleaned up, and now the card that didn’t even get wet seems to be causing extreme post times. 

I can post through the card that did get wet, but not the one that didn’t. 

Go figure.

Running a DDU. I really hope I didn’t biff something. 

Windows reads both cards but shows error on the one that I can’t post thru.

It’s code 43 - no steps to resolve have worked. 

Maybe it’s shorting from messed up block and plate install?

Update - 

Maybe that slot isn’t a spec supported by modern nv cards? Checking... the bios does recommend using only slot one and two. 

Damn, I guess I gotta take the block off, n test it in slot two, then get another block that actually syncs up with the RGB series.


----------



## kithylin

scaramonga said:


> How do you get VRM temp to show?


Not all 1080 Ti's have that extra sensor. The reference PCB cards do not. My EVGA GTX 1080 Ti SC Black Edition (Reference PCB) does not have a VRM temperature sensor either. Some cards do, some cards don't. Depends on the card. Usually non-reference PCB versions will probably have one.


----------



## scaramonga

kithylin said:


> Not all 1080 Ti's have that extra sensor. The reference PCB cards do not. My EVGA GTX 1080 Ti SC Black Edition (Reference PCB) does not have a VRM temperature sensor either. Some cards do, some cards don't. Depends on the card. Usually non-reference PCB versions will probably have one.



Interesting. I wonder if its part of the ICX sensor list shown here?


----------



## skupples

resolution - 

the power prong solder points were longer than they should be, causing contact with the backplate, thus putting the card into "protect" mode, if you will.

Lined them w/ .5mm thermal pad, & the system boots just fine now. 

that was an obscure one.

also - forgot about SLI requiring 8x/8x and how cheap mobos don't supply more than two 8x-capable slots.


----------



## kithylin

scaramonga said:


> Interesting. I wonder if its part of the ICX sensor list shown here?


Possibly. It's hard to say what those sensors are. Maybe GPU-Z will tell you better in its sensors tab. I wanted an ICX-sensor'd 1080 Ti.. I thought I was buying one, but I accidentally bought one with an ICX cooler and not ICX sensors.. tricksy EVGA putting ICX in the description of the card yet not giving us ICX sensors.. still kinda mad I got screwed on that originally.


----------



## scaramonga

kithylin said:


> Maybe GPU-Z will tell you better in its sensors tab.


Not really, lol. Interesting that there is no 'left fan' sensor, although I'm on water anyway, so it doesn't matter


----------



## GRABibus

Hi,
I have reworked my OC on my Gigabyte Gaming OC 11G with ASUS XOC Bios (AIO WC)
Currently, my stable OC in benchmarks + games is :
+111MHz in MSI AB on core => 1680MHz base clock
+397MHz memory => 11800MHz
Fixed V/F point at 2114MHz/1.125V and flat V/F curve at 2114MHz from 1.125V until 1.25V

As soon as I tried 2126MHz, even with 1.181V, i crashed in COD WWII for example.

Now, by adding 101MHz on core instead of 111MHz, I could pass several sessions of Unigine Heaven and Time SPy + 1 hour COD WWII with :
+101MHz in MSI AB on core => 1670MHz base clock
+397MHz memory => 11800MHz
Fixed V/F point at 2190MHz/1.162V and flat V/F curve at 2190MHz from 1.162V until 1.25V.

See attached my curve.

I'm crossing my fingers that it will pass at least a few more hours of COD WWII, which seems to be my most demanding GPU game. Also BF1, BFV, Overwatch, COD BO3, DOOM and COD BO4, which are the games I currently play the most.
If it is confirmed, then after my golden i7-5930K, I could say that my 1080 Ti is not too far from a golden sample also
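For anyone new to the curve editor, the "fixed V/F point + flat curve" trick can be modeled roughly like this (a toy sketch; only the 2190MHz/1.162V point comes from the post, the other curve values are made up for illustration):

```python
# Toy model of flattening an Afterburner V/F curve: every point at or
# above the chosen voltage is clamped to the fixed frequency, so the
# card never requests more than that clock regardless of voltage.
def flatten_vf_curve(curve, fixed_voltage, fixed_freq):
    """curve: list of (voltage, freq_mhz) points, sorted by voltage."""
    return [(v, fixed_freq if v >= fixed_voltage else min(f, fixed_freq))
            for v, f in curve]

curve = [(1.050, 2050), (1.125, 2139), (1.162, 2190),
         (1.200, 2215), (1.250, 2240)]
flat = flatten_vf_curve(curve, 1.162, 2190)
print(flat[-1])  # (1.25, 2190): even at max voltage the clock stays put
```

Points below the fixed voltage keep their own (lower) clocks, which is exactly what the flat-from-1.162V-to-1.25V description above amounts to.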


----------



## Zfast4y0u

GRABibus said:


> Hi,
> I have reworked my OC on my Gigabyte Gaming OC 11G with ASUS XOC Bios (AIO WC)
> Currently, my stable OC in benchmarks + games is :
> +111MHz in MSI AB on core => 1680MHz base clock
> +397MHz memory => 11800MHz
> Fixed V/F point at 2114MHz/1.125V and flat V/F curve at 2114MHz from 1.125V until 1.25V
> 
> As soon as I tried 2126MHz, even with 1.181V, i crashed in COD WWII for example.
> 
> Now, by adding 101MHz on core instead of 111MHz, I could pass several sessions of Unigine Heaven and Time SPy + 1 hour COD WWII with :
> +101MHz in MSI AB on core => 1670MHz base clock
> +397MHz memory => 11800MHz
> Fixed V/F point at 2190MHz/1.162V and flat V/F curve at 2190MHz from 1.162V until 1.25V.
> 
> See attached my curve.
> 
> I cross the fingers that it will pass at least more hours of COD WWII , which seems to be my most demanding GPU game. Also in BF1, BFV, Overwatch, COD BO3, DOOM, COD BO4 which are the games I currently play the most.
> If it is confirmed, then after my golden i7-5930K, I could say that my 1080 Ti is not too far from a golden sample also



enjoy ur card 


this is my profile for when SLI refuses to give me the double performance I'm used to -.-


----------



## skupples

don't y'all love how most games that support sli don't even really need it to run max settings smoothly on a single one of your 1080ti's?! 

I'm a bit annoyed with myself. I should'a stuck with my "not until DX12 n NVlink is mature" mindset back when I pawned off my GK110s.


----------



## kithylin

skupples said:


> don't y'all love how most games that support sli don't even really need it to run max settings smoothly on a single one of your 1080ti's?!
> 
> I'm a bit annoyed with myself. I should'a stuck with my "not until DX12 n NVlink is mature" mindset back when I pawned off my GK110s.


I'm on the opposite side. I have a few games I play that completely max out my 1080 Ti and run < 60 FPS even at 1080p.


----------



## skupples

kithylin said:


> I'm on the opposite side. I have a few games I play that completely max out my 1080 Ti and run < 60 FPS even at 1080p.


I'm curious, it's late & I'm tired.

What titles are these? 

What's more annoying is once you get SLI working in something, it provides so much extra power you want a 144hz 4k w/ gsync, but nooooo gsync & sli don't play well together.

example - stepped down to DX11, pushing 4K ultra settings in DEMK @ 100FPS + solid.

I truly hope NVLink + DX12 maturation brings dual card back into reality. I would gladly buy 2x flagships all day, i normally do anyways then pawn off one once I realize Sli is still garbage.


----------



## kithylin

skupples said:


> I'm curious, it's late & I'm tired.
> 
> What titles are these?
> 
> What's more annoying is once you get SLI working in something, it provides so much extra power you want a 144hz 4k w/ gsync, but nooooo gsync & sli don't play well together.
> 
> example - stepped down to DX11, pushing 4K ultra settings in DEMK @ 100FPS + solid.
> 
> I truly hope NVLink + DX12 maturation brings dual card back into reality. I would gladly buy 2x flagships all day, i normally do anyways then pawn off one once I realize Sli is still garbage.


American Truck Simulator (Using so many mods that the mods folder is a little more than double the actual game's folder) - max settings in-game @ 1080p = my 1080 Ti runs @ max boost & 100% utilization and 50-55 FPS in some towns.

Fallout 4 - I manually used console commands to completely remove the build limit restrictions at settlements, and I've built huge sprawling settlements; some locations in-game are terribly unoptimized and will also max out my 1080 Ti @ max boost @ 100% utilization and run in the 40's and mid-50's FPS.

ARK Survival Evolved - Max settings in-game @ 1080p (except motion blur.. screw motion blur) = 1080 Ti maxed out again. It does "good" in this game, mostly 60-65, but sometimes drops into the 50's in certain areas.

Those are the only ones I have issues with at the moment. It's probably also due to my current build.. I'm running it on a 3570K OC'd @ 5.0 Ghz with a set of 32GB DDR3 @ 2500 @ 12-12-12-20-1T. I know the CPU is holding it back some; I plan to get a 3770K for it this fall and crank it to 5.0~5.5 Ghz too for a while. Ultimately I'm waiting on the Ryzen 4000 series (the last for AM4) to come out, then I'll make the switch to a modern DDR4 system. I don't have a lot of money lately personally, so it wouldn't make a lot of sense to invest nearly $1500 in an Intel system + fast DDR4 + motherboard + 9900K + cpu monoblock when I'll just be replacing it with an AMD system in 2 years. So I'm going to sit back on my older Intel system a little bit longer and wait it out, and when I move up to the newer Ryzen system that will probably be my last computer for 7~10 years or so. I've been researching, and the games I listed above that I currently have performance issues with do scale +80% to +90% with a second 1080 Ti. And they're most of the games I care about at the moment, and any other games based on the same engine (UE4) will scale well too. So I might end up actually investing in a second 1080 Ti in fall 2020 while I wait on nvidia to sort out their video cards later.


----------



## ttnuagmada

kithylin said:


> I plan to get a 3770K for it this fall and crank it to 5.0~5.5 Ghz too for a while.


Not to kill your dreams, but a 5ghz 3770K is a golden chip. You'll be lucky to get one that does 4.8. I have one that will do [email protected] 1.416v, but even that is very rare from everything I've read. My other one will only do [email protected] These are both delidded and water cooled too.

Honestly, a 3570K that does 5ghz is rare enough on its own. I completely understand trying to push the IB platform as far as it can go (im in the same boat), but unless you have a lead on a 5ghz 3770K, you're probably not going to come across one without binning CPUs.

Another thing i figured out recently too that I didn't really understand fully, apparently, having 4 dual rank dimms will hurt your fps minimums on the non-HEDT chips. I ran 32 gigs of DDR3 2400 10-12-12-31 for several years, and ended up pulling 2 of them out after reading some about memory ranks. The few benchmarks i ran showed an improvement in minimums, and I can now break 99% framerate stability on the 3Dmark stress test after not being able to do it before.


----------



## 8051

GRABibus said:


> Hi,
> I have reworked my OC on my Gigabyte Gaming OC 11G with ASUS XOC Bios (AIO WC)
> Currently, my stable OC in benchmarks + games is :
> +111MHz in MSI AB on core => 1680MHz base clock
> +397MHz memory => 11800MHz
> Fixed V/F point at 2114MHz/1.125V and flat V/F curve at 2114MHz from 1.125V until 1.25V
> 
> As soon as I tried 2126MHz, even with 1.181V, i crashed in COD WWII for example.
> 
> Now, by adding 101MHz on core instead of 111MHz, I could pass several sessions of Unigine Heaven and Time SPy + 1 hour COD WWII with :
> +101MHz in MSI AB on core => 1670MHz base clock
> +397MHz memory => 11800MHz
> Fixed V/F point at 2190MHz/1.162V and flat V/F curve at 2190MHz from 1.162V until 1.25V.
> 
> See attached my curve.
> 
> I cross the fingers that it will pass at least more hours of COD WWII , which seems to be my most demanding GPU game. Also in BF1, BFV, Overwatch, COD BO3, DOOM, COD BO4 which are the games I currently play the most.
> If it is confirmed, then after my golden i7-5930K, I could say that my 1080 Ti is not too far from a golden sample also


Grabibus how are you getting above 1.2V Vcore on your 1080Ti?


----------



## kithylin

ttnuagmada said:


> Not to kill your dreams, but a 5ghz 3770K is a golden chip. You'll be lucky to get one that does 4.8. I have one that will do [email protected] 1.416v, but even that is very rare from everything I've read. My other one will only do [email protected] These are both delidded and water cooled too.
> 
> Honestly, a 3570K that does 5ghz is rare enough on its own. I completely understand trying to push the IB platform as far as it can go (im in the same boat), but unless you have a lead on a 5ghz 3770K, you're probably not going to come across one without binning CPUs.
> 
> Another thing i figured out recently too that I didn't really understand fully, apparently, having 4 dual rank dimms will hurt your fps minimums on the non-HEDT chips. I ran 32 gigs of DDR3 2400 10-12-12-31 for several years, and ended up pulling 2 of them out after reading some about memory ranks. The few benchmarks i ran showed an improvement in minimums, and I can now break 99% framerate stability on the 3Dmark stress test after not being able to do it before.


Sorry to inform you but with the right motherboard none of that is "rare", and none of these are "Golden Chips". I have the 32-phase GA-Z77X-UP7 board. So far I've tried three different 2500K's, and this is my 3rd 3570K, they all run at least 5.0 ghz in this board without even trying. Just set the volts and the multiplier and they do it stable. All are random chips bought completely at random used off ebay. The 2500K's all topped out around 5.2~5.3 Ghz stable. The first 2 3570K's did 5.4 ghz and this one I'm on now did 5.2 ghz. Seems I get about 6-7 months out of em at above 5.0 Ghz. When they degrade below 5.0 ghz I re-sell em as stock and go buy another. I've been waiting for the 3770K's to drop in price around $100-$120 each and start going through those instead. I'm rather confident I can get at least 5.0 out of any random 3770K in this board. It's the "above 5ghz" that will be the issue.


----------



## ttnuagmada

kithylin said:


> Sorry to inform you but with the right motherboard none of that is "rare", and none of these are "Golden Chips". I have the 32-phase GA-Z77X-UP7 board. So far I've tried three different 2500K's, and this is my 3rd 3570K, they all run at least 5.0 ghz in this board without even trying. Just set the volts and the multiplier and they do it stable. All are random chips bought completely at random used off ebay. The 2500K's all topped out around 5.2~5.3 Ghz stable. The first 2 3570K's did 5.5 ghz and this one I'm on now did 5.2 ghz. Seems I get about 6-7 months out of em at above 5.0 Ghz. When they degrade below 5.0 ghz I re-sell em as stock and go buy another. I've been waiting for the 3770K's to drop in price around $100-$120 each and start going through those instead. I'm rather confident I can get at least 5.0 out of any random 3770K in this board. It's the "above 5ghz" that will be the issue.



You're being overly optimistic, methinks. I mean, I could maybe see it if you have no issues running 1.5v+ with temps in the 80's (if you're degrading that fast, then it sounds like that's probably what you're doing), and even then you'll not be doing it with any random 3770K. I'm running an MVF so it's not like my mobo is holding me back, but there's no way in hell my 4.7 3770K is hitting 5ghz even north of 1.5v.


----------



## kithylin

ttnuagmada said:


> You're being overly optimistic me thinks. I mean I could maybe see it if you have no issues running 1.5v+ with temps in the 80's (if you're degrading that fast, then it sounds like that's probably what you're doing), and even then you'll not be doing it with any random 3770K. I'm running an MVF so it's not like my mobo holding me back, but there's no way in hell my 4.7 3770K is hitting 5ghz even north of 1.5v.


I usually run em bare die with liquid metal around 1.60 - 1.75v, it depends on the chip and I usually run em in the 95~103c range daily even with a big custom water loop. The chips never actually die, they just degrade. I usually sell em off when they get down to around 4.7 - 4.8 ghz. They're still fine at stock speeds or a small overclock for someone else, just not suitable for my needs anymore. I'm too addicted to the high clocks from ivy bridge. I'm pretty sure I have a "golden motherboard" way more than "golden chips".


----------



## ttnuagmada

kithylin said:


> I usually run em bare die with liquid metal around 1.60 - 1.75v, it depends on the chip and I usually run em in the 95~103c range daily even with a big custom water loop. The chips never actually die, they just degrade. I usually sell em off when they get down to around 4.7 - 4.8 ghz. They're still fine at stock speeds or a small overclock for someone else, just not suitable for my needs anymore. I'm too addicted to the high clocks from ivy bridge. I'm pretty sure I have a "golden motherboard" way more than "golden chips".


lol. ok that makes a lot more sense.


----------



## kithylin

ttnuagmada said:


> lol. ok that makes a lot more sense.


We're both kind of off topic for this thread. But you're welcome to DM me sometime and we could chat on steam or telegram if you would like.

Back to actually on topic: After about 17 months with the XOC bios on my 1080 Ti, running it at 1.200v & 2126 Mhz core speed, I finally had it happen: my card degraded recently too, it seems. A few days ago I suddenly got the "space invaders looking thing" across the screen and the video driver reset itself on me. And all I was doing was watching youtube while playing Astroneer. The card wasn't even loaded hard at the time or anything. More like 1500 mhz core and 60% utilization. So I guess XOC is kind of bad after all. So instead I've re-flashed the stock bios from EVGA back onto the card. I actually had not tried the stock bios on my big custom water loop before. I went XOC and custom water at the same time and never looked back. It turns out I can get my card to run stable at 2012 Mhz and the same +700 ram that I got out of XOC. Except where XOC would throttle my clock speed down from 2126 -> 2108 when I went above 45c, and 2050 @ 53c, the stock EVGA bios on my card stays at 2012 Mhz all the way up to 55c. Mainly I'm worried I might have to RMA my card; I have 588 days left in warranty. So I wanted to put it back to the stock bios while it's still able to be flashed back, in case it dies totally at some point.

I just thought I would share my $0.02 with everyone about XOC. Even with a big custom water loop, it seems it's not worth it long-term. And at 2012 Mhz on the stock bios vs 2126 Mhz with XOC, I'm only seeing a -1.8% performance loss according to Firestrike. Good enough for me for now.
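Quick sanity check on that -1.8% number (the clock values come from the post; the Firestrike scores below are made up purely to illustrate the math):

```python
def pct_delta(new, old):
    """Percentage change going from old to new."""
    return (new - old) / old * 100

# Clock deficit of the stock bios vs XOC:
print(round(pct_delta(2012, 2126), 1))      # -5.4 (% core clock)

# Hypothetical Firestrike graphics scores showing a ~-1.8% result:
print(round(pct_delta(28_000, 28_520), 1))  # -1.8
```

A ~5.4% clock cut costing only ~1.8% of score is the usual story: Firestrike isn't purely core-bound, so clock losses don't translate one-to-one into score losses.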


----------



## Streetdragon

Yeahhh^^ I think Nvidia had a reason to lock the voltage^^


But 2000Mhz all the time is still not bad, cooler and uses way less power^^

But I would say it's the memory and not the core, because I only ever had artefacts/errors with too-high memory.
With an unstable core clock I only got "black" crashes


----------



## skupples

some of those titles actually have GOOD SLI support too. Fallout, etc.

Had both pegged @ 99% last night max settings, DE:MK, 4K, well over 60, if not 100 but I've never been able to adjust to no-vsync. thus why I'm gonna take a step down in resolution, & a step up in frames.

too bad the game just isn't grabbing me.


----------



## Knoxx29

kithylin said:


> So I guess XOC is kind of bad after all.



I ran it just for 3 days. I don't know why, but I wasn't happy running my card at 1.20V






ttnuagmada said:


> Not to kill your dreams, but a 5ghz 3770K is a golden chip. You'll be lucky to get one that does 4.8. I have one that will do [email protected] 1.416v, but even that is very rare from everything I've read. My other one will only do [email protected] These are both delidded and water cooled too.





kithylin said:


> Sorry to inform you but with the right motherboard none of that is "rare", and none of these are "Golden Chips". I have the 32-phase GA-Z77X-UP7 board. So far I've tried three different 2500K's, and this is my 3rd 3570K, they all run at least 5.0 ghz in this board without even trying. Just set the volts and the multiplier and they do it stable. All are random chips bought completely at random used off ebay. The 2500K's all topped out around 5.2~5.3 Ghz stable. The first 2 3570K's did 5.4 ghz and this one I'm on now did 5.2 ghz. Seems I get about 6-7 months out of em at above 5.0 Ghz. When they degrade below 5.0 ghz I re-sell em as stock and go buy another. I've been waiting for the 3770K's to drop in price around $100-$120 each and start going through those instead. I'm rather confident I can get at least 5.0 out of any random 3770K in this board. It's the "above 5ghz" that will be the issue.





ttnuagmada said:


> You're being overly optimistic me thinks. I mean I could maybe see it if you have no issues running 1.5v+ with temps in the 80's (if you're degrading that fast, then it sounds like that's probably what you're doing), and even then you'll not be doing it with any random 3770K. I'm running an MVF so it's not like my mobo holding me back, but there's no way in hell my 4.7 3770K is hitting 5ghz even north of 1.5v.





kithylin said:


> I usually run em bare die with liquid metal around 1.60 - 1.75v, it depends on the chip and I usually run em in the 95~103c range daily even with a big custom water loop. The chips never actually die, they just degrade. I usually sell em off when they get down to around 4.7 - 4.8 ghz. They're still fine at stock speeds or a small overclock for someone else, just not suitable for my needs anymore. I'm too addicted to the high clocks from ivy bridge. I'm pretty sure I have a "golden motherboard" way more than "golden chips".



My 3770K back in 2014 could run at 1.38V, but I always add a little more voltage when I overclock.


----------



## supermiguel

I have 2 AORUS GeForce GTX 1080 Ti Waterforce WB Xtreme Edition cards. Is it worth flashing the BIOS to one that is not from them?


----------



## 8051

kithylin said:


> I usually run em bare die with liquid metal around 1.60 - 1.75v, it depends on the chip and I usually run em in the 95~103c range daily even with a big custom water loop. The chips never actually die, they just degrade. I usually sell em off when they get down to around 4.7 - 4.8 ghz. They're still fine at stock speeds or a small overclock for someone else, just not suitable for my needs anymore. I'm too addicted to the high clocks from ivy bridge. I'm pretty sure I have a "golden motherboard" way more than "golden chips".


What an interesting tactic for overclocking. Can you remove the degraded CPUs without breaking your loop apart? Your strategy will make me much more leery about buying used CPUs.


----------



## 8051

skupples said:


> some of those titles actually have GOOd sli support too. Fall Out, etc.
> 
> Had both pegged @ 99% last night max settings, DE:MK, 4K, well over 60, if not 100 but I've never been able to adjust to no-vsync. thus why I'm gonna take a step down in resolution, & a step up in frames.
> 
> too bad the game just isn't grabbing me.


Is that Deus Ex: Mankind Divided you're referring to?


----------



## skupples

Yessir. You just gotta disable DX12 & it has gorgeous SLI support. Exemplary of what should be industry standard since we give NV more money than anyone else.


----------



## 8051

skupples said:


> Yessir. You just gotta disable DX12 & it has gorgeous SLI support. Exemplary of what should be industry standard since we give NV more money than anyone else.


That's excellent. I remember seeing some pitiful SLI scores for 980 Tis in DX: Mankind Divided. So your minimum FPS never falls below 60 in the benchmark?

I thought DX: Human Revolution had more interesting locations than DX: Mankind Divided, and a better story.


----------



## skupples

I didn't run the bench, just played the game itself. V-sync off it's a good 80-120 FPS; V-sync on, it does a decent job of almost never dropping below 60. I choose to run w/ vsync on most of the time; the stutter and variance introduced by running above 60 on a 60hz panel is quite displeasing.

(something most "sli" supporting titles could learn from)

I'm either gonna sell this extra 1080 Ti back to the community for a decent #, or delve into a second 400-hour Witcher playthrough, this time fully modded w/ the top 10 list.


----------



## Zfast4y0u

kithylin said:


> We're both kind of off topic for this thread. But you're welcome to DM me sometime and we could chat on steam or telegram if you would like.
> 
> Back to actually on topic: After about 17 months with the XOC bios on my 1080 Ti and running it at 1.200v & 2126 Mhz core speed I finally had it happen: My card degraded recently too it seems. I suddenly a few days ago got the "space invaders looking thing" across the screen and the video driver reset it's self on me. And all I was doing was watching youtube while playing Astroneer. The card wasn't even loaded hard at the time or anything. More like 1500 mhz core and 60% utilization. So I guess XOC is kind of bad after all. So instead I've re-flashed my stock bios from EVGA back onto the card. I actually have not tried the stock bios on my big custom water loop before. I went XOC and custom water at the same time and never looked back. It turns out I can get my card to run stable at 2012 Mhz and the same +700 ram that I got out of XOC. Except where XOC would throttle down my clock speed from 2126 -> 2108 when I went to 45c, and 2050 @ 53c, the stock EVGA bios on my card stays at 2012 Mhz all the way up to 55c. Mainly I'm worried I might have to RMA my card, I have 588 days left in warranty. So I wanted to put it back to stock bios while it's still able to be flashed back in case it dies totally at some point.
> 
> I just thought I would share my $0.02 with everyone about XOC. Even with a big custom water loop it seems it's not worth it long-term. And 2012 Mhz stock bios vs 2126 Mhz with XOC and I'm only seeing -1.8% performance loss being on stock bios according to firestrike. Good enough for me for now.


Are you sure the issues you have with the card now are core related and not memory? Try returning the memory to stock speeds and keep your OC on the core at the same speed. I'm not a fan of running a 1080 Ti anywhere close to 1.2v for gaming purposes, but that doesn't have to mean your card started behaving like this because of your ''high'' voltage overclock. As long as you have it covered under warranty you're fine, I guess.

btw, what did that Nvidia engineer say about how long your card lasts if you crank up the voltage? Just curious if your card fits his predictions.
@skupples

So Deus Ex: Mankind Divided has 100% scaling on DX11?? That's great, because I just finished downloading it; I've wanted to try Deus Ex for some time now. How is SLI support in the older parts of the franchise?


----------



## supermiguel

skupples said:


> I didn't run the bench, just played the game itself. V-sync off its a good 80-120FPS, Vsync on, it does a decent job not dropping below 60 almost ever. I choose to run w/ vsync on most of the time, the stutter and variance introduced from running above 60 on a 60hz panel is quite displeasing.
> 
> (something most "sli" supporting titles could learn from)
> 
> I'm either gonna sell this extra 1080 ti for a decent # back to the community, or delve into a second 400 hour witcher player through, this time fully modded w/ the top 10 list.


Extra? You have 3?


----------



## skupples

No, only 2. The # of titles with support that actually need the power is so limited that it's a bit of an exercise in futility. I feel I'd get more out of one card & a 1440p G-Sync panel.


----------



## kithylin

8051 said:


> What an interesting tactic for overclocking. Can you remove the degraded CPU's without breaking your loop apart? Your strategy will make me much more leery about buying used CPU's.


There's nothing to be leery about. Even these CPUs, when I'm done with them, will still run 4.5-4.7 GHz overclocked for someone else, or at stock speed, for years to come. There's nothing at all wrong with what I'm doing; in fact I got the idea from der8auer, who sells his CPUs back to the public after running them at 6 GHz under LN2 all the time. And no, I have to drain the loop and partially disassemble it: take the CPU block off along with its tubing. Mounting a water block directly on a bare die is a delicate process, and it won't work if the tubing is connected to the nearby water system and putting sideways pressure on the block during mounting. But once it's mounted and bolted down I can push/move the tubing around and it won't hurt anything.



Zfast4y0u said:


> Are you sure the issues you have with the card now are core related and not memory? try returning memory to stock speeds and keep ur oc on core at same speed. im not a fan running 1080ti anywhere close to 1.2v for gaming purposes, but it dosent have to mean ur card start behaving like that cause of ur ''high'' voltage overclock. as far you have it warranty covered you are fine i guess.
> 
> btw what did that nvidia enginier said if u crank up voltage, after how long ur card dies? just curious if ur card fits in his predictions.
> 
> @skupples
> 
> so deus ex mankind devided has 100% scaling on dx 11?? thats great cause i just finished downloading it, wanted to try deus ex for some time now. how are sli support on older parts of franchise?


After my crash the other day, and seeing that weird space-invaders-looking thing across the screen (google "2080 Ti crashing screen artifacts" to see an example of what I'm referring to), I could not get any games to run at any overclock with the XOC bios anymore. All games and benchmarks would crash to desktop, dump to desktop with no error, or reset the driver; nothing would run with any overclock at all with the XOC bios on the card. The only way I was able to get back to gaming was to flash the stock bios back onto the card. And once it's back on the stock bios, I can run +700 RAM again, just like I did on the XOC bios, and it games fine with no issues.


----------



## Knoxx29

kithylin said:


> There's nothing at all wrong with what I'm doing. In fact I got the idea from debau8er that sells his cpu's back to the public after running them at 6 Ghz under Ln2 all the time.



In fact that is why I don't buy CPUs from Caseking.de, or any used CPU. :thumb:


----------



## skupples

Zfast4y0u said:


> Are you sure the issues you have with the card now are core related and not memory? try returning memory to stock speeds and keep ur oc on core at same speed. im not a fan running 1080ti anywhere close to 1.2v for gaming purposes, but it dosent have to mean ur card start behaving like that cause of ur ''high'' voltage overclock. as far you have it warranty covered you are fine i guess.
> 
> btw what did that nvidia enginier said if u crank up voltage, after how long ur card dies? just curious if ur card fits in his predictions.
> 
> @skupples
> 
> so deus ex mankind devided has 100% scaling on dx 11?? thats great cause i just finished downloading it, wanted to try deus ex for some time now. how are sli support on older parts of franchise?


I missed this, my apologies.

I assume full warranty loss once I slap a waterblock onto something.

Also, I've had less than a couple hours of test time, but so far I've got +130 core, +500 memory, with zero increased voltage, etc. It's a pretty common "stock" OC to apply to NV cards.

I may be confused here; the only OC I've really mentioned was running my GK110s @ 1400 @ 1.4v for ages, and the next owner still uses all three to this day, at not-as-high clocks. I'm not sure what any NV dev said, except for when they showed up in the Kepler Titan club to issue damage control over us gaining control of the voltage controller via software. Yet again, another thing we accepted full warranty-loss responsibility for, along with changing BIOS, etc. Yes, high OCs age your cards faster. To the point where it'll ever really affect you, or the second owner? Unlikely, unless you run them through the mud with 99% mining time for a year.

As to the scaling, it's the best out-of-the-box SLI experience I've tested so far. I know which old games support it; I just don't know which modern ones do.


Any time you see your cards @ ~50/50, SLI isn't really working and you should shut it off. (Or you've capped to 60 FPS at a low resolution and they're expending zero effort.)

Remember, toggle DX11 in Mankind Divided.


----------



## skupples

By the way, is anyone part of an active SLI thread somewhere?


100% looking forward to this NG+ Witcher 3 Enhanced Edition mod + texture overhaul mod, & some other QOL stuff you don't wanna deal with in a replay, like all map locations revealed, etc.

much excite.


----------



## Zfast4y0u

skupples said:


> I missed this, my apologies.
> 
> I assume full warranty loss once I slap a waterblock onto something.
> 
> Also, I've had less than a couple hours of test time, but so far I've got +130 core, +500 memory, with zero increased voltage, etc. It's a pretty common "stock" OC to apply to NV cards.
> 
> I may be confused here, the only OC I've really mentioned was running my GK110s @ 1400 @ 1.4 for ages, the next owner of them still uses all three to this day, at not as high clocks. I'm not sure of what any NV dev said, except for when they showed up in Kepler Titan club to issue damage control over us gaining control to the voltage controller via software.  Yet again, another thing we accepted full warranty loss responsibility on, along with changing BIOS, etc. Yes, high OCs age your cards faster. To the point where it'll ever really effect you, or the second owner? Unlikely unless you run them thru the mud with 99% mining time for a year.
> 
> as to the scaling, it's the best out of the box SLI experience I've tested so far. I know what old games support it, I just don't know what modern ones do.
> 
> 
> Any time you see your cards @ ~50/50, sli isn't really working, n you should shut it off. (OR you've capped to 60FPS at a low resolution and they're using zero effort)
> 
> remember, toggle DX11 in Mankind Divided.




I think you got confused a bit by my post; everything above ''@skupples'' I was writing to kithylin about the issue with his card after running the XOC bios for 17 months. What was related to you was SLI scaling in Deus Ex: Mankind Divided.

I just did a quick test on Mankind Divided; scaling is ****, just 50% on DirectX 11, and on 12 there is no scaling at all, but both cards are pegged at 100%, go figure ^^.

From what I can tell after seeing the game for 5 minutes, the graphics aren't anything special, but the performance is even worse. Oh well ^^


----------



## kithylin

Zfast4y0u said:


> so deus ex mankind devided has 100% scaling on dx 11?? thats great cause i just finished downloading it, wanted to try deus ex for some time now. how are sli support on older parts of franchise?


Just so you're aware: no game, title, GPU combination or system has ever, in the history of computers, gained a full +100% from a second video card. Not SLI. Not CrossFire. Not CrossFireX. There is always some sort of overhead that costs 10% to 20%. Most games at best typically gain +80% with a second card, even when properly optimized. Some titles in the past have shown up to +90% scaling, but that is an outlier/exception and should not be expected from SLI. Some titles may only gain +40% from a second card. It really depends on the title.

However, at the moment the only options are an RTX 2080 Ti for +40% at $1300 USD, or a second 1080 Ti for (possibly up to) +80% at around $500-$600 USD.

So if the games you are interested in playing benefit from SLI and scale at or above +60%, it makes sense right now.

Or if you're like me and insist on staying on Windows 7 forever (you can call me crazy, I don't care, I won't ever switch): at some point Nvidia will drop Windows 7 from their drivers, and I'll have to double up on cards, because it will be the last configuration I'll probably ever invest in. I don't know yet what that will be, possibly the GTX/RTX 3000 or 4000 series. We'll see with time.
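To put rough numbers on that, here's a quick Python sketch. The FPS figures are made up purely for illustration; the prices and scaling percentages are the ones from my post above, not measurements.

```python
def sli_scaling_pct(fps_single, fps_dual):
    """Percent gain from adding the second card."""
    return (fps_dual / fps_single - 1.0) * 100

# Hypothetical numbers: one card at 60 FPS, two cards at 105 FPS
# works out to +75% scaling -- about what a well-optimized SLI
# title delivers, and short of the impossible +100%.
print(sli_scaling_pct(60, 105))

# Cost per percent of extra performance, using the figures above:
rtx_2080ti_cost_per_pct = 1300 / 40    # ~$32.5 per +1%
second_1080ti_cost_per_pct = 550 / 80  # ~$6.9 per +1%, IF the game scales to +80%
```

The catch, of course, is that the second 1080 Ti only hits that value if the specific game actually scales well.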


----------



## Zfast4y0u

kithylin said:


> Just so you're aware: No game, title, GPU combination or system has ever in the history of computers gained +100% from a second video card. Not SLI. Not CrossFire. Not CrossFireX. There is always some sort of overhead that results in -10% to -20% loss. Most games/titles at best typically gain +80% with a second card even when properly optimized.


Dying Light, Rise of the Tomb Raider, Far Cry 4, Shadow of Mordor, to name just a few. If you're going to count every FPS to support your claim, then the games I listed are 2 FPS away from that magic 100% scaling.

idk why you think max scaling in SLI can be 80%; you just need a well-optimized game for it.


----------



## kithylin

Zfast4y0u said:


> dying light, rise of tomb raider, far cry 4, shadow of mordor. to name you a few just. if you gonna count every fps to support your claim, then this 3 games i listed are 2 fps away from that magic 100% scaling.
> 
> idk why you think max scaling in sli can be 80%. you just need good optimized game for it.


I'm just going by history, documented benchmark reports from game review sites, etc. We've never seen +100%. Like I said above, due to driver overhead it's actually impossible; there will always be some slight losses somewhere.


----------



## Zfast4y0u

kithylin said:


> I'm just going by history, documented benchmark reports from game review sites, etc. We've never seen +100%. Like I said above, due to driver overhead it's actually impossible. There will always be some slight losses somewhere.


I wouldn't have anything against a 10% loss, but in every game ^^

A 50% gain is too little for me, tbh. Nothing we can do.

Not sure which direction SLI will go in the future; Nvidia is slowly moving away from gaming. We should keep that in mind too.


----------



## kithylin

Zfast4y0u said:


> i wouldnt have anything against 10% loss, but in every game ^^
> 
> 50% gain is too little for me tbh. nothing we can do.
> 
> not sure in what direction will sli go in future, nvidia is moving away from gaming slowly. we should have that in mind too.


Yeah, like I also said (edited in) in my post above: some games may reach +90% or maybe even +95% in rare instances in a select few titles (literally fewer than 5 titles I know of can do +90% scaling with SLI). The typical "normal" to expect is between +60% and +80%. It's not a perfect system, and usually manual tuning of SLI profiles is required to get the most out of it. SLI isn't something where you can just add a second card, load drivers, open a game and expect it to work correctly. Unfortunately Nvidia is moving away from dual-GPU setups; they usually don't even offer properly optimized SLI profiles for most titles, so we have to fix them manually. Personally I think Nvidia is intentionally trying to discourage people from using dual-GPU/SLI altogether and plans to drop it completely some day. That's just my thoughts though.


----------



## skupples

Zfast4y0u said:


> i think you got confused a bit by my post, everything above ''@skupples'', i was writing to kithylin and his issue with his card after running XOC bios for 18 months. what was related to you was SLI scaling in deus ex mankind devided.
> 
> i just did quick test on mankind devided, scaling is ****, just 50% on direct x 11, on 12, there is no scaling at all, but both cards pegged at 100%, go figure ^^.
> 
> how much i can tell from seeing game for 5min, graphics are not anything special, but performance are even worse. oh well ^^




That's backwards from what I experienced, but I loaded up the stock Inspector profile first.

SLI in DX12: 2nd card @ 99. DX11 in SLI: both cards @ 90+ with buffering off, ~70% with triple buffering on.

There are lots of titles that support SLI, they're like 1 in 15! However, anything older than a year or two doesn't need two 1080 Tis to max out @ 4K 60fps, like Dying Light, etc. Dying Light 2 will likely have broken SLI too, if the current trend means anything.

Nvidia has historically, since Kepler, been moving away from SLI. However, NVLink says otherwise, and it's barely a baby now. Give it a gen or two, plus DX12 maturation, and 2x flagships will be beast mode like the DX9 days again, & I'll return to NVSurround gaming + G-Sync cuz fantasies!

People like to claim some sorta SLI conspiracy cuz of forced new hardware acquisition. I don't buy it. I just think SLI is an ancient & heavily flawed tech that DX11 never really did well with anyway. It only makes sense it had to die for the new SLI 2.0 to be born.

Anyone looking to be a competent SLI user needs to invest in understanding how NVInspector works, & use it to the best of your patience and ability.


----------



## 8051

kithylin said:


> After my crash the other day and seeing that weird space-invaders-looking thing over the screen (google for 2080 Ti crashing screen artifacts and you can see an example of what I'm referring to) I could not get any games to run at any overclock with XOC bios anymore. All games and benchmarks would crash to desktop, or dump to desktop with no error, or reset the driver, nothing would run with any overclock at all. With the XOC bios on the card. The only way I was able to get back to gaming was to flash the stock bios back on the card again. And once it's back on the stock bios, I can run +700 ram again just like I did on XOC bios and it games fine with no issues anymore.


Did you update your 1080Ti drivers recently?


----------



## kithylin

8051 said:


> Did you update your 1080Ti drivers recently?


I had not, actually. At the time of the crash I was using the same older 417.71 driver I had been using since Jan 2019 with no issues. I did update the driver (to version 419.35), but only after the crash, in an attempt to diagnose the issue, thinking perhaps my driver install had somehow gotten corrupted by a couple of recent bluescreens from my current CPU degrading from 5.1 GHz to 5.0 GHz. But no, that didn't solve it. I had to go back to the stock EVGA bios to get back to gaming again.


----------



## Zfast4y0u

kithylin said:


> I had not actually. At the time of the crash I was using the same older 417.71 driver that I have been using since Jan 2019, which had no issues. I did update the driver (up to version 419.35) but only after the crash and in an attempt to diagnose the issue, thinking perhaps my install of the driver some how got corrupted with a couple of recent bluescreens from my current CPU degrading from 5.1 Ghz -> 5.0 Ghz. But no that didn't solve it. I had to go back to the stock EVGA bios to get back to gaming again.


So what voltage are you at now? Does your 2000 MHz clock speed (GPU) now need more voltage than before?


----------



## kithylin

Zfast4y0u said:


> so at what voltage are you now? does ur 2000mhz clock speed ( gpu ) now need more voltage then before?


The stock EVGA bios won't let me change the voltage at all from whatever EVGA has it set to. I honestly didn't even look, because I can't change it, so I didn't bother to read it. I just loaded Fire Strike: it's 1.0620v for 2012 MHz.


----------



## Zfast4y0u

kithylin said:


> The stock EVGA bios won't let me change the voltage at all from what EVGA has it set to. I honestly didn't even look because I can't change it so I didn't bother to read it. I just went and loaded firestrike and it's 1.0620v for 2012 Mhz.


O_O


You can't open the voltage/freq curve in MSI Afterburner at all??? That's plain stupid, if you ask me. ***


----------



## kithylin

Zfast4y0u said:


> O_O
> 
> 
> you cant open voltage/freq curve in msi afterburner at all??? thats plain stupid if u ask me. ***


I can open it, and I can click apply, but the voltage never changes. This is pretty standard; almost all 1080 Tis do this on stock BIOSes. That's literally why the XOC bios exists. But I can't use it anymore.


----------



## GRABibus

8051 said:


> Grabibus how are you getting above 1.2V Vcore on your 1080Ti?


Hi,
In fact I can't.
I was just talking about the maximum value shown as the voltage in the V/F diagram of MSI AB v4.6.0.


----------



## Zfast4y0u

kithylin said:


> I can open it and I can click apply but the voltage never changes. This is pretty standard. Most all 1080 Ti's do this on stock bios's. That's literally why XOC bios exists. But I can't use it anymore.


Did you try tweaking the highest clock speed in the curve, or only clock speeds in second or third place? If you don't edit the highest clock speed in the curve and tweak its voltage, it won't have any effect, because the card will always go for the top clock speed in the curve first.

For example, say you have 2050 MHz @ 1.062V as the highest clock speed in your voltage/frequency curve, and your desired clock speed is not at the top of the curve: the second-highest point, 2000 MHz @ 1.050V, is what you want to run the card at. The card will naturally ignore the second one and boost to the top clock speed in the curve, in this case 2050 MHz @ 1.062V. What you want to do is flatten the curve, so that 2000 MHz starts at 1.050V and stays flat all the way to 1.25V, with no clock speed higher than 2000 MHz anywhere in the curve. At least that's how I tweak the Asus 1080 Ti OC bios. If this doesn't work with your bios, idk; this is literally how we undervolt these cards, it must work for everyone, dang it.


This is a 1080 undervolt example, but the process is the same.
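The flattening logic can be sketched in plain Python. To be clear, this is not a real Afterburner API, just an illustration of the idea, and the curve points below are hypothetical.

```python
# Sketch of flattening a voltage/frequency curve: pick a target clock
# and the voltage it should start at, then clamp every point at or
# above that voltage to the target so the card never boosts past it.
def flatten_vf_curve(curve, target_v, target_mhz):
    """curve: list of (voltage, mhz) points, lowest voltage first.
    Returns a new curve where everything from target_v upward runs at
    exactly target_mhz, and no point anywhere exceeds target_mhz."""
    flattened = []
    for volts, mhz in curve:
        if volts >= target_v:
            flattened.append((volts, target_mhz))  # the flat section
        else:
            flattened.append((volts, min(mhz, target_mhz)))
    return flattened

# Hypothetical stock-style curve: higher voltage -> higher boost clock.
stock = [(0.900, 1823), (1.000, 1936), (1.050, 2000), (1.062, 2050)]
undervolted = flatten_vf_curve(stock, target_v=1.050, target_mhz=2000)
# Now 2000 MHz is the highest clock on the curve, starting at 1.050 V,
# so the card has no higher point to boost to.
```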


----------



## kithylin

Zfast4y0u said:


> did you try to tweak highest clock speed in curve or some clock speeds that are on second,third place? if u dont edit the highest clock speed in curve and tweak its voltage, it wont have effect, cause card will always go for top clock speed in the curve first.
> for example u have 2050mhz 1.062v as highest clock speed in ur voltage/freq curve, and ur desired clock speed is not in the top of the curve, for example u have second highest clock on curve 2000mhz @ 1.050mV that u wanna run ur card on, and highest one is 2050mhz @ 1.062mV, card will naturally ignore second one and boost to the top clock speed in curve, in this case 2050mhz @ 1.062mV, what u wanna do is, make curve flat with clocks and voltage, so that ur 2000mhz starts at 1.050mV and its flat till 1.25v and there is no higher clock speed then 2000mhz in the curve. at least thats how i tweak with asus 1080ti oc bios. if this dosent work with ur bios, idk, this is literally how we undervolt the cards, it must work for everyone dang it.


I know how to use the MSI Afterburner curve; I used it to get 2126 MHz @ 1.200v, so that's not the issue. I'm doing it correctly, it just doesn't change. Like I tried to tell you, most stock BIOSes for reference-PCB 1080 Tis will not let users increase the voltage at all. This is standard for almost all reference-PCB 1080 Tis; it's Nvidia intentionally locking down the voltage. This is what Nvidia does with their video cards. No one with any 2080 Ti can increase the voltage at all, as far as I'm aware, for example. The only 1080 Tis that let us increase the voltage are typically the custom-PCB cards, and even then they're usually limited to around 1.087v-1.10v. You can change the voltage down to undervolt, but you cannot change it up to increase it. The XOC unlimited bios is the only bios that allows anyone with any 1080 Ti to raise the voltage above 1.10v. I'm pretty sure of that.


----------



## Zfast4y0u

kithylin said:


> I know how to use the MSI Afterburner curve. I used it to get 2126 Mhz @ 1.200v, that's not the issue. I'm doing it correctly. It just doesn't change. Like I tried to tell you, most stock bios's for reference-PCB board 1080 Ti's will not let users increase the voltage at all. This is standard for almost all reference-PCB 1080 Ti's. It's Nvidia that was intentionally trying to lock down the voltage. This is what Nvidia does with their video cards. No one with any 2080 Ti can increase the voltage at all as far as I'm aware for example. The only ones that let us increase the voltage for 1080 Ti's are the custom-PCB cards typically. And even then they're usually limited to around 1.087v - 1.10v You can change the voltage -DOWN- and under-volt it but you can not change it -UP- to increase it. The XOC unlimited bios is the only bios that allows anyone with any 1080 Ti to raise voltage above 1.10v. I'm pretty sure of that.


I was thinking about undervolting, nvm, we misunderstood each other. What I asked in my first question was: do you now see degradation at lower clock speeds too, or is only the highest overclock you used with XOC affected?

I don't have experience with chips that have degraded, so I'm curious how it behaves once you get to that point. Can you just no longer run the highest overclock at the same voltage as before? Or are lower overclocks at other voltages also affected, with each clock speed now needing more voltage than before? Or is it only the highest clock speed you could achieve?


----------



## kithylin

Zfast4y0u said:


> i was thinking about undervolting, nvm  we didnt understand each other good. what i asked you in first question was , do u see degradation now too on lower clock speeds or only highest overclock is impacted that you used with XOC.
> 
> i do not have experience with chips that get degraded, so im curious how it behaves, once you get to that point, you just cant run highest overclock as before on same voltage, or for example lower overclocks with other voltages are also impacted and each clock speed now needs more voltage then before, or only highest clock speed u could have achieved?


My card is 100% perfectly normal now, as if nothing happened, @ 2012 MHz core, stock voltage and +700 memory. It just won't run 2126 @ 1.200v anymore like it previously could with the XOC bios. And I cannot clock higher than 2012 MHz with the stock bios; anything higher crashes the driver.


----------



## jura11

kithylin said:


> I know how to use the MSI Afterburner curve. I used it to get 2126 Mhz @ 1.200v, that's not the issue. I'm doing it correctly. It just doesn't change. Like I tried to tell you, most stock bios's for reference-PCB board 1080 Ti's will not let users increase the voltage at all. This is standard for almost all reference-PCB 1080 Ti's. It's Nvidia that was intentionally trying to lock down the voltage. This is what Nvidia does with their video cards. No one with any 2080 Ti can increase the voltage at all as far as I'm aware for example. The only ones that let us increase the voltage for 1080 Ti's are the custom-PCB cards typically. And even then they're usually limited to around 1.087v - 1.10v You can change the voltage -DOWN- and under-volt it but you can not change it -UP- to increase it. The XOC unlimited bios is the only bios that allows anyone with any 1080 Ti to raise voltage above 1.10v. I'm pretty sure of that.


Hi there 

I would rather say, on stock BIOS we are limited by 1.093v voltage, only XOC BIOS allows us to to run higher voltage than 1.093v, this is probably biggest difference 


I'm still running stock EVGA BIOS on my GTX1080Ti with 2113MHz and 1.07v and no issues, with 1.093v I have run 2126MHz and 2164MHz which are stable but temperatures cannot be higher than 33-34°C if they're higher profile/clocks 2164MHz are unstable for longer periods of time and therefore I run 2126MHz profile,I think there is still bit more in clocks, maybe something around 213x-214xMHz didn't tried push clocks bit higher

You have been unlucky with yours Ti that's for sure, I told you that last time and still think yours EVGA is dud like my Zotac RTX 2080Ti right now which wouldn't OC beyond 2115-2130MHz and still that's with 1.093v and many on RTX 2080Ti thread would OC well above 2160-2175MHz and in better case 220xMHz is possible 

Most of the GTX 1080 Tis I've owned or tried will OC beyond 2100MHz under water; the Aorus Extreme GTX 1080 Ti wouldn't OC beyond 2088MHz, and neither would the MSI GTX 1080 Ti Gaming X.

Hope this helps and good luck there 

Thanks, Jura


----------



## jura11

kithylin said:


> My card is 100% perfectly normal as if nothing happened now @ 2012 Mhz core, stock voltage and +700 Memory. It just won't run the 2126 @ 1.200v anymore that I previously could run with the XOC bios. And I can not clock higher than 2012 Mhz with the stock bios. Anything higher crashes the driver.


Did you try running 2126 with the XOC BIOS at 0MHz on the VRAM, or downclocking the VRAM by -200MHz? Micron memory isn't very consistent; this has been confirmed by many overclockers. I borrowed an RTX 2080 Ti from a friend which would start showing space invaders at anything above +200MHz, then they started showing up even at stock VRAM speeds, and the only cure was to downclock the VRAM by -300MHz or more.

Hope this helps 

Thanks, Jura
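Not forum code, just an illustration: the trial-and-error jura11 describes (drop the offset until the artifacts stop) amounts to a bisection over the offset range, assuming stability is roughly monotonic in the offset. The `is_stable` callback here is hypothetical; in practice it would be a benchmark loop you watch for artifacts or crashes.

```python
def max_stable_offset(is_stable, lo=-300, hi=700, step=25):
    """Bisect the VRAM offset range (MHz) for the highest value that
    still passes the caller-supplied stability test."""
    if not is_stable(lo):
        return None  # even the floor is unstable: likely a hardware fault
    while hi - lo > step:
        mid = (lo + hi) // 2
        if is_stable(mid):
            lo = mid   # mid passed: search the upper half
        else:
            hi = mid   # mid failed: search the lower half
    return lo

# Toy stand-in for a real artifact/crash test: pretend the card
# happens to be stable up to +450 MHz.
print(max_stable_offset(lambda off: off <= 450))  # → 450
```

Each pass halves the search range, so even a wide -300..+700 window converges in a handful of benchmark runs instead of dozens of single-step retests.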


----------



## skupples

that reminds me... these cards have different bios, i should probably sync em & test before tossing out one of the cards.


----------



## kithylin

jura11 said:


> Did you tried run 2126 with XOC BIOS and 0MHz on VRAM or downclocking VRAM by - 200MHz, Micron memories aren't very consistent this has been confirmed by many OC'ers and have borrowed RTX 2080Ti from friend which would start showing space invaders at anything above +200MHz and then started to showing up again at stock VRAM speeds etc and only cure has been to downclock VRAM by - 300MHz and more
> 
> Hope this helps
> 
> Thanks, Jura


I don't want to decrease performance below stock with this card, ever. Right now it runs 2012 core / +700 RAM and so far it's stable. At worst I'll only ever drop it down to stock RAM speed (+0 MHz). If some day it won't run or crashes at stock speeds, I'll just RMA it back to EVGA before I'd ever consider running the RAM 300 MHz below stock. It's still covered under factory warranty for another 580 days. Thanks for the suggestion, but what I have right now works and I'm happy with it as-is; it's only about 1.8% off the overall performance I had with the XOC BIOS, and I'm fine with that. And yes, I know it's a dud, but I don't have the money to sell it, buy another one, and hope I get lucky all over again. I'd rather just deal with what I have and enjoy it as-is.


----------



## KenjiS

I think the 1080 Ti is one of my favorite purchases ever...

Given the last time i had a GPU i liked this much it was my GeForce2 GTS...


----------



## 8051

KenjiS said:


> I think the 1080 Ti is one of my favorite purchases ever...
> 
> Given the last time i had a GPU i liked this much it was my GeForce2 GTS...


Talk about a trip down memory lane. Sometimes I wonder if AGP with sideband addressing had less latency than PCIe -- after all, AGP was a direct, point-to-point connection between the chipset and the video card, with no intervening serial communications protocol or endecs.


----------



## mattxx88

Hi guys

can someone share EVGA 11G-P4-6390-KR (1080 Ti Founders edition) bios?


----------



## MrTOOSHORT

mattxx88 said:


> Hi guys
> 
> can someone share EVGA 11G-P4-6390-KR (1080 Ti Founders edition) bios?




*https://www.techpowerup.com/vgabios/201837/201837*


----------



## mattxx88

MrTOOSHORT said:


> *https://www.techpowerup.com/vgabios/201837/201837*


Does it matter that it's unverified? I've got to check my VRAM; I can't remember if it's Samsung or Micron.


----------



## 8051

jura11 said:


> Did you tried run 2126 with XOC BIOS and 0MHz on VRAM or downclocking VRAM by - 200MHz, Micron memories aren't very consistent this has been confirmed by many OC'ers and have borrowed RTX 2080Ti from friend which would start showing space invaders at anything above +200MHz and then started to showing up again at stock VRAM speeds etc and only cure has been to downclock VRAM by - 300MHz and more
> 
> Hope this helps
> 
> Thanks, Jura


Jura, I've had this experience. Dying Light is the one game where I can't overclock the core and memory simultaneously; instead I have to downclock the VRAM by some 253 MHz to get stability. It's interesting to note that increasing memory frequency notably increases heat output.


----------



## kithylin

8051 said:


> It's interesting to note that increasing memory frequency notably increases heat output.


If you're using an air cooled card, perhaps. I'm on a large custom water loop and I get the exact same GPU temperatures with the 1080 Ti's RAM at +0 and at +700 MHz. Not even 1°C hotter.


----------



## Streetdragon

So i got my second 1080 Ti!
First time I've heard a blower style cooler. Wow that is annoying^^

Clocked them for now at 1950 @ 0.950V and VRAM +200.
Watercooled: 27°
Aircooled: 68°.
Already ordered a second block from EK. Will set them parallel and turn up the pump speed. Now the D5 gets some work to do hehe.

Little question: the second card's core load was at 99% most of the time, while my main card (water) varies from 80-95% usage. Clocks, voltage etc. were the same. No power limit. Is that normal? First time SLI^^
I use a hard bridge that is meant for 3-way, as you can see in the pics. I hope that is normal.

Happy little camper for 570€ warehouse deal^^


----------



## Arexniba

Hey 1080 TI owners. I wanted to ask you all for recommendations/suggestions on a good case for a 1080 TI SLI. 

Currently, I have the old school Cooler Master HAF 932. It's worked great, but for the past year of owning the 2 cards, it's pretty annoying that they barely fit inside the case. The bottom 1080 TI card literally touches the top of the PSU.

Thanks!


----------



## kicurry

Can I offer a great 1080 Ti for sale here?


----------



## kithylin

Arexniba said:


> Hey 1080 TI owners. I wanted to ask you all for recommendations/suggestions on a good case for a 1080 TI SLI.
> 
> Currently, I have the old school Cooler Master HAF 932. It's worked great, but for the past year of owning the 2 cards, it's pretty annoying that they barely fit inside the case. The bottom 1080 TI card literally touches the top of the PSU.
> 
> Thanks!


Here's my current build with a water cooled 1080 Ti in a Corsair Graphite 780T:


https://www.youtube.com/watch?v=yWrYGUoIdco


Note: I removed the drive cages and their bottom mount entirely. I'm using a 4-in-1 (4x 2.5" drives in 1x 5.25" bay) Icy Dock hot swap bay in one of the 5.25" bays, and I use the 2.5" mounts in the middle-back of the case. I don't claim to be good at making videos, so sorry if the quality sucks; it's just me and my DSLR. You could easily fit two or three 1080 Ti's in this case in a similar configuration to mine and still have tons of room for radiators. I'm using 4 of them: 360mm up top, 240mm front, 240mm bottom-front underneath the pumps, and a 120mm in the back. This is a hybrid build that dual-boots Windows XP & Win7 x64, so it currently has a GTX 780 in it for the XP side. Yes I'm crazy. No don't ask about it. No I don't want to discuss it.



kicurry said:


> Can I offer a great 1080 Ti for sale here?


Not in this thread, no. You can go to the OCN marketplace and try there though. Here's a link for you: https://www.overclock.net/forum/321-overclock-marketplace/


----------



## Streetdragon

So you have one water cooled and one air cooled card like me now. I call it hybrid cooling xD


----------



## terrorindeed

Hey everyone, so all of a sudden I’m having an issue with my AIO Aorus Extreme Waterforce 1080ti. 

I bought it in October 2017, out of the box stock clocks sustained at 2025mhz and temperatures were at 46C. What a great card, right ? After some testing I settled with 2063mhz on the core and no memory increase. 

Fast forward to the other day: the card began crashing in Star Wars Battlefront II and Call of Duty WWII, so I removed the OC and began testing stock clocks again.

The card now only boosts to 1898-1924mhz and the temperatures are 58C. 

What the heck ?


----------



## kithylin

terrorindeed said:


> Hey everyone, so all of a sudden I’m having an issue with my AIO Aorus Extreme Waterforce 1080ti.
> 
> I bought it in October 2017, out of the box stock clocks sustained at 2025mhz and temperatures were at 46C. What a great card, right ? After some testing I settled with 2063mhz on the core and no memory increase.
> 
> Fast forward to the other day, the card began crashing in Star Wars Battlefront II and Call of Dury WW II, so I removed the OC and began testing stock clocks again.
> 
> The card now only boosts to 1898-1924mhz and the temperatures are 58C.
> 
> What the heck ?


When's the last time you cleaned the dust and dirt out of the radiator? And check the fan on it and see if it's still spinning.


----------



## RoBiK

terrorindeed said:


> Hey everyone, so all of a sudden I’m having an issue with my AIO Aorus Extreme Waterforce 1080ti.
> 
> I bought it in October 2017, out of the box stock clocks sustained at 2025mhz and temperatures were at 46C. What a great card, right ? After some testing I settled with 2063mhz on the core and no memory increase.
> 
> Fast forward to the other day, the card began crashing in Star Wars Battlefront II and Call of Dury WW II, so I removed the OC and began testing stock clocks again.
> 
> The card now only boosts to 1898-1924mhz and the temperatures are 58C.
> 
> What the heck ?


Maybe the pump or the radiator fan died.


----------



## terrorindeed

kithylin said:


> terrorindeed said:
> 
> 
> 
> Hey everyone, so all of a sudden I’m having an issue with my AIO Aorus Extreme Waterforce 1080ti.
> 
> I bought it in October 2017, out of the box stock clocks sustained at 2025mhz and temperatures were at 46C. What a great card, right ? After some testing I settled with 2063mhz on the core and no memory increase.
> 
> Fast forward to the other day, the card began crashing in Star Wars Battlefront II and Call of Dury WW II, so I removed the OC and began testing stock clocks again.
> 
> The card now only boosts to 1898-1924mhz and the temperatures are 58C.
> 
> What the heck ?
> 
> 
> 
> When's the last time you cleaned the dust and dirt out of the radiator? And check the fan on it and see if it's still spinning.




RoBiK said:


> terrorindeed said:
> 
> 
> 
> Hey everyone, so all of a sudden I’m having an issue with my AIO Aorus Extreme Waterforce 1080ti.
> 
> I bought it in October 2017, out of the box stock clocks sustained at 2025mhz and temperatures were at 46C. What a great card, right ? After some testing I settled with 2063mhz on the core and no memory increase.
> 
> Fast forward to the other day, the card began crashing in Star Wars Battlefront II and Call of Dury WW II, so I removed the OC and began testing stock clocks again.
> 
> The card now only boosts to 1898-1924mhz and the temperatures are 58C.
> 
> What the heck ?
> 
> 
> 
> Maybe the pump or the radiator fan died.

I literally just took it apart and cleaned the rad a few weeks ago. Just checked it and it's mint. The fan is spinning as normally as can be. I added another 120mm fan today to see if that would help, but sadly the temps aren't improving.

Is there any way to check the pump speed? The fan is spinning just fine. I'm thinking it's the pump on the way out, slowly dying.
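As far as I know the Waterforce pump RPM isn't exposed to standard tools (Gigabyte's own utility reads it), but you can often infer a dead pump from how fast the core temperature ramps once load starts; temperature samples could come from something like `nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader -l 1`. A toy sketch, with a made-up threshold:

```python
def pump_suspect(temps_c, interval_s=1.0, ramp_limit=2.0):
    """Flag a possible dead/dying AIO pump from GPU temperature samples
    taken at a fixed interval after load starts. With coolant circulating,
    the radiator's thermal mass keeps the early ramp shallow; with a dead
    pump the core climbs several degrees per second.
    ramp_limit (deg C per second) is a hypothetical threshold."""
    rates = [(b - a) / interval_s for a, b in zip(temps_c, temps_c[1:])]
    return max(rates, default=0.0) > ramp_limit

# Healthy loop: slow creep.  Dead pump: fast spike.
print(pump_suspect([30, 31, 32, 33, 34]))  # False
print(pump_suspect([30, 36, 43, 51, 60]))  # True
```

A slow pump (rather than a dead one) would sit between these extremes, which matches the "boosts lower, runs 58C" symptom: higher steady-state temperature rather than an instant spike.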


----------



## Streetdragon

terrorindeed said:


> I literally just took it apart and cleaned the rad a few week ago. Just checked it and it’s mint. The fan is spinning as normal as can be. I added another 120mm fan today to see if that would help, but sadly , the temps aren’t improving.
> 
> Is there anyway to check the pump speed ? The fan is spinning just fine. I’m thinking it’s the pump on the way out, slowly dying.


The pump is either working or dead; it can't be in between.
I would say the block is gunked up or something.


----------



## skupples

hey now, the first H100s died a slow death; you'd tend to have the different speed zones stop functioning one by one


----------



## Arexniba

kithylin said:


> Here's my current build with a water cooled 1080 Ti in a Corsair Graphite 780T:
> https://www.youtube.com/watch?v=yWrYGUoIdco
> 
> Note: I removed the drives cages and their bottom mount entirely. I'm using a 4x1 (4x2.5" drives in 1x5.25" bay) icy dock hot swap bay in one of the 5.25" bays, and I use the 2.5" mounts in the middle-back of the case. I don't claim to be good at making videos so sorry if the quality sucks. It's just me and my DSLR. You could easily fit two or three 1080 Ti's in this case in a similar configuration to mine and still have tons of room for radiators. I'm using 4 of em: 360mm up top, 240mm front, 240m bottom front underneath the pumps, and a 120mm in the back. This is a hybrid build that dual-boots WindowsXP & Win7x64 so it currently has a GTX-780 in it for XP Side. Yes I'm crazy. No don't ask about it. No I don't want to discuss it.


Hey Kithylin, thanks for the response. However, maybe I should have been clearer. I have the two GTX 1080 TI cards that are air cooled. They're bulkier than your water cooled versions. I would go water, but I am happy the way it is and don't feel like spending more money. 

If you, or any other OCers here can point me to a good case that would fit air cooled SLI 1080 Ti cards, that'd be great!

Cheers!


----------



## skupples

Anything made to fit full-size ATX boards should work. Count the number of expansion slots your current case has, then compare to the cases you're looking at. That'll give you a good sense of how much more space you'll have in the area you care about.


----------



## kithylin

Arexniba said:


> Hey Kithylin, thanks for the response. However, maybe I should have been clearer. I have the two GTX 1080 TI cards that are air cooled. They're bulkier than your water cooled versions. I would go water, but I am happy the way it is and don't feel like spending more money.
> 
> If you, or any other OCers here can point me to a good case that would fit air cooled SLI 1080 Ti cards, that'd be great!
> 
> Cheers!


I tried to show you how much room there is in this case. I'm sure you could easily fit air cooled 1080 Ti's in here if you took out some or all of the drive cages. I just went and measured and with the drive cages out the 780T has 18.5 inches (47 cm) interior space from the I/O bracket to the inside of the front of the case. That should be more than enough for any video card ever made, air or water.


----------



## ezveedub

Arexniba said:


> Hey 1080 TI owners. I wanted to ask you all for recommendations/suggestions on a good case for a 1080 TI SLI.
> 
> Currently, I have the old school Cooler Master HAF 932. It's worked great, but for the past year of owning the 2 cards, it's pretty annoying that they barely fit inside the case. The bottom 1080 TI card literally touches the top of the PSU.
> 
> Thanks!


Move the PSU to the top, then you got more room for the bottom GPU, lol


----------



## Streetdragon

Soo I'm in 1080 Ti SLI, watercooled.

Thought SLI would work as well as CrossFire... but nope. A bit disappointed. Metro Exodus works only with hex editing, and Anthem not at all.

Some benchmarks don't use SLI, and DX12 has no SLI support at all. Feels really bad, BUT it looks cool.

Maybe more support comes in the future.


----------



## kithylin

Streetdragon said:


> Soo im in 1080TI Sli watercooled.
> 
> Thought sli would work as good as cossfire.. but nope. A bit disapointed. Metro exodus work only with hex editing and anthem not at all.
> 
> Some benchmarks dont use sli and dx12 has no sli support at all. Feels really bad, BUT it looks cool.
> 
> Maybe there comes more support in the future


DirectX 12 can use both video cards, but through a different method, "Explicit Multi-GPU". It doesn't use SLI or CrossFire, but multi-GPU is possible with DX12. The problem is that it's entirely up to the developer of each game to code multi-GPU support into their title when they add DX12 support; some will bother, some won't.

EDIT: Just do some research and make sure the games you enjoy playing support SLI, or can easily be made to support it, before buying. For me, all the games I enjoy playing support it with little to no work, so I'm (potentially) looking forward to it.


----------



## Pinnacle Fit

Streetdragon said:


> Soo im in 1080TI Sli watercooled.
> 
> 
> 
> Thought sli would work as good as cossfire.. but nope. A bit disapointed. Metro exodus work only with hex editing and anthem not at all.
> 
> 
> 
> Some benchmarks dont use sli and dx12 has no sli support at all. Feels really bad, BUT it looks cool.
> 
> 
> 
> Maybe there comes more support in the future




Now you know how I felt with my sli 980tis


Sent from my iPhone using Tapatalk


----------



## skupples

Streetdragon said:


> Soo im in 1080TI Sli watercooled.
> 
> Thought sli would work as good as cossfire.. but nope. A bit disapointed. Metro exodus work only with hex editing and anthem not at all.
> 
> Some benchmarks dont use sli and dx12 has no sli support at all. Feels really bad, BUT it looks cool.
> 
> Maybe there comes more support in the future


You coulda asked first. Someone points this out almost daily.
It’s essentially a dead technology at the moment, until dx12 and nvlink mature. 

I’ve been saying this since GK110... and yet I still get the second used card cuz when it does work, it’s glorious

For now though, I’d take G-Sync over SLI 99% of the time, since it works 100% of the time vs. SLI’s 10%; the sad part is that G-Sync in unison with SLI is pretty broken, from what I understand.


----------



## Streetdragon

Yeah^^ mostly useless, BUT it looks nice and was fun to build xD
And both do 2100/6095Mhz ^^


----------



## kithylin

skupples said:


> You coulda asked first. Someone points this out almost daily.
> It’s essentially a dead technology at the moment, until dx12 and nvlink mature.
> 
> I’ve been saying this since GK110... and yet I still get the second used card cuz when it does work, it’s glorious
> 
> For now though, I’d take GSync over sli 99% of the time since it works 100% of the time Vs. SLI’s 10% of the time, sad part is GSync in unison with SLI is pretty broken from what I understand.


I still don't understand why people like you are intentionally trying to downplay SLI and try to tell everyone "It's useless" and "It doesn't work don't bother". SLI is a fantastic technology and the last time I used it a few years ago (GTX 770's) I had no issues getting SLI to work in every game I wanted it to work in, or rather the games I cared to play. Sometimes it may take a little manual tweaking of profiles but that's a one-time-and-done thing and then it works forever and the custom tweaked profiles can be backed up and restored between driver updates. It's not really a big deal. I usually saw between +60% to +80% scaling from a second card in all the titles I normally want to play and it was glorious. It's totally worth it and I'm excited to go back to SLI again with 1080 Ti's later.

Folks just need to do their due diligence and research whether the games they want to play support SLI, and if they don't support it out of the box, how much of a pain it is to make them work. It's not something you can just buy and expect everything to work; some games require a little one-time work. By the way, AMD CrossFire doesn't work for everything out of the box either, and there's no way to tune it when it doesn't, whereas with Nvidia at least we can tune games ourselves, which is fantastic and awesome. If you buy it without researching first and then complain that the games you want to play don't use SLI, the problem is with you and not the underlying technology.

Sorry for the rant... it just feels a little sad inside seeing people say "SLI DIDN'T WORK IN MY GAMES, IT'S TERRIBLE, DON'T BUY INTO IT". :thumbsdown:


----------



## skupples

Kepler was the last time SLI was worth a damn, that's why we downplay it. We actually agree on this point, as my GK110 Titans were the last time I enjoyed SLI too! The only difference is I've kept using it since & have further witnessed its decline. For the moment it's really a crapshoot, & folks that wanna play with it should realize that going into it.

Your last experience with it was its dying breath.

Support level is maybe 1 in 10, and many games that DO Support it don't actually need it. (see official support list) 

sure if you love hacking n slashing ini files, hex editing, nv inspectoring, then SLI is for you.

***SLI is still great for some specific communities, like the racer kids and fly boys. 

you seem to keep missing my saying this part.

it isn't "my games" there's a pretty well tracked list of games with intentional support, & modded support. Most titles with modded support experience issues.

even nvidia thinks sli is dead... see removal of it on lower end cards, and NVLink 

I'm not a scientist but I'm pretty sure it qualifies as ancient & flawed. Its golden hour was DX9. 

Hopes - NVLink + DX12 = epic sli like the good ol' days.


----------



## kithylin

skupples said:


> keplar was the last time SLI was worth a damn, that's why we downplay it. We actually agree on this point, as my GK110 Titans were the last time I enjoyed SLI too! The only difference is I've kept using it since & have further witnessed its decline. For the moment, it's really a crap shoot, & folks that wanna play with it should realize that going into it.
> 
> your last experience with it, was its dying breath.
> 
> Support level is maybe 1 in 10, and many games that DO Support it don't actually need it. (see official support list)
> 
> sure if you love hacking n slashing ini files, hex editing, nv inspectoring, then SLI is for you.
> 
> ***SLI is still great for some specific communities, like the racer kids and fly boys.
> 
> you seem to keep missing my saying this part.
> 
> it isn't "my games" there's a pretty well tracked list of games with intentional support, & modded support. Most titles with modded support experience issues.
> 
> even nvidia thinks sli is dead... see removal of it on lower end cards, and NVLink
> 
> I'm not a scientist but I'm pretty sure it qualifies as ancient & flawed. Its golden hour was DX9.
> 
> Hopes - NVLink + DX12 = epic sli like the good ol' days.


Nvidia Inspector is trivial... like I said, do it once and it's good forever; you can even carry the changes over to new driver updates and back them up. A little effort once, then never touch it again.
A short list of a few of the games I'm concerned with:
ARK: Survival Evolved: +90% with 1080 Ti's.
Borderlands 2: +80% with two 1080 Ti's (Yes it's necessary with the new 4K texture update, loads my 1080 Ti hard in some areas).
Fallout 4: +85% with 1080 Ti's.
Monster Hunter World: +70% with 1080 Ti's.
F1 2018: +70% with 1080 Ti's.
I don't see anything negative at all, or any issues, problems, or any reason to think it's either dead or not useful. Seems great to me. A second 1080 Ti is faster in these titles than switching to a 2080 Ti, and cheaper. For me it will be the best option at the moment and I'm both looking forward to it and excited to get the second 1080 Ti later. I'm aiming for 100~120 FPS minimums in all titles above and some others at max settings.
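For what it's worth, the arithmetic behind "a second 1080 Ti is faster in these titles than a 2080 Ti" is just baseline × (1 + scaling). A toy sketch using the scaling figures from the post; the single-card minimum FPS values are made up for illustration:

```python
def sli_fps(single_fps, scaling):
    """Projected two-card FPS from a single-card baseline and SLI scaling
    (e.g. 0.80 for '+80%'). Raw FPS only: this ignores frame pacing."""
    return single_fps * (1 + scaling)

# Scaling from the post; single-card baselines are hypothetical.
titles = {
    "ARK: Survival Evolved": (55, 0.90),
    "Borderlands 2":         (70, 0.80),
    "Fallout 4":             (65, 0.85),
    "Monster Hunter World":  (60, 0.70),
    "F1 2018":               (75, 0.70),
}
for name, (fps, sc) in titles.items():
    print(f"{name}: {sli_fps(fps, sc):.0f} FPS projected")
```

Note the caveat in the comment: projected FPS says nothing about frame-time variance, which is the crux of the disagreement in this thread.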


----------



## Streetdragon

kithylin said:


> Nvidia inspector is trivial.. like I said, do it once and it's good forever and can even move it to new driver updates and back up the changes. A little effort once for never touching it again.
> A short list of a few of the games I'm concerned with:
> ARK: Survival Evolved: +90% with 1080 Ti's.
> Borderlands 2: +80% with two 1080 Ti's (Yes it's necessary with the new 4K texture update, loads my 1080 Ti hard in some areas).
> Fallout 4: +85% with 1080 Ti's.
> Monster Hunter World: +70% with 1080 Ti's.
> F1 2018: +70% with 1080 Ti's.
> I don't see anything negative at all, or any issues, problems, or any reason to think it's either dead or not useful. Seems great to me. A second 1080 Ti is faster in these titles than switching to a 2080 Ti, and cheaper. For me it will be the best option at the moment and I'm both looking forward to it and excited to get the second 1080 Ti later. I'm aiming for 100~120 FPS minimums in all titles above and some others at max settings.


more FPS and it looks nice in the case :thumb:


----------



## feznz

I run a single 1080 Ti; for the first time in 10 years I felt it wasn't necessary to run SLI. But I keep my old 770 in there for no real reason other than to keep the 1080 Ti company; it would be all lonely in there by itself.
Besides, the motherboard looks a bit empty with a single card in there.


If I had the spare cash lying around I would go SLI, but I just bought a house instead.


----------



## skupples

Yeaaah... that’s the issue. Some of those titles do fly, n some of them just bench out better but have horrid stutter issues, etc. 

MH:W is one of “my games” and I play at 1440p max rather than 4K cuz SLI support is terrible, thus the game runs much smoother on a single card (this is a common trend if you look at any number besides FPS when reading SLI benchmarks; it’s almost disingenuous of you to leave out frame times, etc. when discussing SLI numbers.)

You also just listed like 5 years worth of sli release titles. (excluding BL2)

My point boils down to this since you’re hell bent on arguing for a technology you haven’t used in 4 generations...

If you’re buying new - get the best damn card you can, n maybe add a second one later..

If you’re buying used, get the best damn card you can, then add a second one later..

Hey look at that!!

The days of running 2x 770 cuz it’s faster than 1x titan are long gone my friend. They’ll return soon enough though. 

Oh btw - the above 2x 70s vs 1x Titan? Another area where it would be disingenuous not to mention that the Titan still wins at higher resolutions. 😉

You seem to not want to accept any advice from anyone who’s been dealing with the technology since the beginning. If you want better FPS in Monster Hunter World, you won’t be doing it with two video cards, I can promise you that, unless you have a very low sensitivity bar for stutter, pacing issues, and extreme frame time variance.

I wish you the best of luck with your second card, but you’ll hear my voice at some point while busting your knuckles to get something that “should” support SLI to actually work well with it on. Yuuuuuuge difference.

Also - most inspector profiles don’t do a whole lot, as NV is pretty damn good at baking in the correct bits after 1-2 driver updates so 99% of the time the bits given online are the bits already baked into the default profile for the title.

I await your 4K SLI MH:W preferred-resolution results, or how you got it to actually run smoothly on the second card without extreme frame variance and hitching every 30 seconds.


FO4 is a great SLI title, one of the few to come out in the last 5 years, along with Deus Ex. Just make sure you turn off DX12. Same for Metro. Good luck with that: they broke SLI in the Redux versions and in Exodus. However, the results after editing are OK... still not as great as the original 2033, which is infuriating.

Just remember - 2 cards @ 50-56% scaling isn’t functional SLI. It’s two cards doing the work of one, with the inherent latency of SLI baked in.


----------



## kithylin

skupples said:


> Yeaaah... that’s the issue. Some of those titles do fly, n some of them just bench out better but have horrid stutter issues, etc.
> 
> MH:W is one of “my games” and I play in 1440p max over 4K cuz sli support is terrible, thus the game runs much smoother on a single card (this is a common trend if you look at any number besides FPS when reading benchmarks for SLI. It’s almost disingenuous of you to leave out frame times, etc when discussing SLI #s.)
> 
> You also just listed like 5 years worth of sli release titles. (excluding BL2)
> 
> My point boils down to this since you’re hell bent on arguing for a technology you haven’t used in 4 generations...
> 
> If youre buying new - get the best damn card you can, n maybe add a second one later..
> 
> If you’re buying used, get the best damn card you can, then add a second one later..
> 
> Hey look at that!!
> 
> The days of running 2x 770 cuz it’s faster than 1x titan are long gone my friend. They’ll return soon enough though.
> 
> Oh btw - the above 2x 70s vs 1x titan? Another area where it would be disingenuous to not mention the fact that the Titan still wins and higher resolutions. 😉
> 
> You seem to not want to except any advice from anyone who’s been dealing with the technology since the beginning. If you want better FPS a monster Hunter world you won’t be doing it with two video cards I can promise you that, unless you have a very low sensitivity bar for stuttering pacing issues and extreme frame time variance.
> 
> I wish you the best of luck with your second card but you’ll hear my voice at some point while busting your nuckles to get something that “should” support SLI to actually work well with it on. Yuuuuuuge difference.
> 
> Also - most inspector profiles don’t do a whole lot, as NV is pretty damn good at baking in the correct bits after 1-2 driver updates so 99% of the time the bits given online are the bits already baked into the default profile for the title.
> 
> I await your 4K SLI MH:W prefer resolution results, or how you got it to actually run smoothly on the second card without extreme frame variance and hitching every 30 seconds.
> 
> 
> FO4 is a great sli title. One of the few to come out in the last 5 years. Along with Deus ex. Just make sure you turn off dx12. Same for metro. Good luck with that. They broke sli in the redux, and exodus. However, the results after editing are OK... still not as great as 2033 OG, which is infuriating.
> 
> Just remember - 2 cards @ 50-56% isn’t functional SLI. It’s two cards doing the work of one, With the inherent latency of sli baked in.


The problem is that in all the titles I listed, a single 1080 Ti isn't fast enough to get the 100~120 FPS minimums I want, and an RTX 2080 Ti isn't fast enough either. I need more than 2080 Ti performance, and right now there's no way to get it other than SLI. Literally no other way; there isn't a fast enough single card that money can buy. You can argue all you want, but until the RTX/GTX 3000 and 4000 series come out there's no other option, and even then a single card may still not deliver the performance I need. And for your information, I have researched everything. I've watched YouTube videos for these and other titles I want to play, and I've even talked directly to people on Steam who use SLI 1080 Ti's in these titles. Even in MH:W there's zero stutter with a fast enough CPU (a 9900K with a manual all-core overclock to 5.2 GHz) with SLI 1080 Ti's @ 1080p and 1440p. The folks I talked to have nothing but good things to say: high FPS, no perceivable stutter, everything butter-smooth. From everything I've researched, the negative things you're mentioning don't show up in actual usage in actual games. I think you're just making this up to make SLI sound bad, though I still don't understand why. I don't and never would use 4K, however... maybe it does stutter at 4K.


----------



## skupples

As stated - this is in 4K, however results are similar but less pronounced in 1440p, specially when the frames really start picking up. Clearly though, we've figure it out, it's just my weak 5.3 9700k & 3200mhz memory  

We're also in two different boats. you want high FPS in 1080p, which i'm guessing wouldn't even require a second card based on 1440p ultra running 70-90 most of the time. Really I believe the issue is the same as its always been. Two cards doesn't do much for ya at higher resolutions if you're already nearing your memory bandwidth cap. 


I really do hope you get the epic experience you expect, I do. I'm not trying to talk anyone out of anything (no matter how many times you annoyingly say it, it won't come true), I'm pretty sure that's just your tribal nature kicking in & improperly deciphering these figures on the screen. 


People just gotta remember - one beefier card always wins against two mid range cards (specially when the beefier card has higher memory bus!!!!) UNLESS!?!?! anyone remember?! you're gaming in resolutions low enough to not max out the memory bandwidth, as memory bandwidth doesn't scale as most expect it to, in SLI. 

It's a funny time right now though; the 20 series sucks & many of us have experienced that first hand, thus the only other thing to do is slap on an extra 1080 Ti for the one outta ten times it's gonna work well. 

As I stated at the beginning of all of this, like 2 weeks ago - I'm not trading in my 2nd card until I see how it fares on my 3440x1440p Alienware next month. That however won't change the fact that 1 in 10 games support SLI, and that it's a dead technology being replaced by NVLink. The more DX12 titles we see, the fewer SLI titles we'll see, until they get that part all ironed out, which may take a couple more years. We currently get maybe 1 outta 20 with native support, and can maybe get 1 in 5 to work with tweaked support.
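Since the "memory doesn't scale as most expect" point above trips a lot of people up: under driver SLI with alternate frame rendering, each card keeps its own full copy of the game's resources, so VRAM is mirrored rather than pooled. A toy sketch (numbers and function name are illustrative, not from any tool):

```python
# Toy illustration of SLI AFR memory behavior: each GPU holds a full
# copy of the working set, so effective VRAM is the per-card amount,
# not the sum across cards.

def effective_vram_gb(cards_gb, mode="afr"):
    """Usable VRAM for a multi-GPU setup. Only the AFR case is modeled."""
    if mode == "afr":
        # Resources are mirrored on every card, so the smallest card
        # bounds the usable pool.
        return min(cards_gb)
    raise ValueError("only AFR is modeled in this sketch")

two_1080ti = [11.0, 11.0]             # 2x 1080 Ti, 11 GB each
print(effective_vram_gb(two_1080ti))  # 11.0 -- not 22 GB
```

So doubling cards doubles shading throughput (at best) but does nothing for the per-frame memory footprint, which is why high resolutions that are already near the VRAM/bandwidth cap see little benefit.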


----------



## kithylin

skupples said:


> I really do hope you get the epic experience you expect, I do. I'm not trying to talk anyone out of anything (no matter how many times you annoyingly say it, it won't come true), I'm pretty sure that's just your tribal nature kicking in & improperly deciphering these figures on the screen.
> 
> 
> People just gotta remember - one beefier card always wins against two mid range cards (specially when the beefier card has higher memory bus!!!!) UNLESS!?!?! anyone remember?! you're gaming in resolutions low enough to not max out the memory bandwidth, as memory bandwidth doesn't scale as most expect it to, in SLI.
> 
> it's a funny time right now though, 20 series sucks & many of us are experienced that first hand, thus the only other thing to do is slap on an extra 1080ti for the one outta ten times its gonna work well.
> 
> as I stated in the beginning of all of this like 2 weeks ago - I'm not trading in my 2nd card until I see how it fares on my 3440x1440p alienware next month. That however won't change the fact 1 in 10 games support SLI, and that its a dead technology being replaced by NVLink. The more DX12 titles we see, the less SLI titles we'll see, until they get that part all ironed out, which may take a couple more years & we currently get maybe 1 outta 20 with native support, n can maybe get 1 in 5 to work with tweaked support.


The person I spoke with on Steam claims 130-150 FPS minimums in every title I mentioned above with SLI'd 1080 Ti's, and reports zero stuttering, micro-stuttering or any other issues. So it definitely will do everything I want. However, one other variable I wanted to add: both me and this other person use Windows 7 64-bit. I'm guessing you're using Windows 10? Perhaps that may have something to do with the performance in general; I've heard Windows 10 tries to force some stupid "game mode" on users or something. DirectX 12 doesn't support driver-level SLI (multi-GPU is left to each game via explicit multi-adapter), but Vulkan has everything DX12 does graphics-wise and supports multi-GPU as well. DX12 games are proprietary, OS-dependent (requires Windows 10 and the Microsoft Store) and run in Microsoft's little walled garden. I'm very hopeful that more games will embrace Vulkan, both for SLI support and multi-OS support.


----------



## skupples

130-150 in... 1440p? Sure, completely reasonable, and 200-240 all day in 1080p. 

As to the OS, that's an interesting variable. I may have to spin up a dual boot & test. I remember what you're talking about, and thought it was resolved years ago. 

Like I said though, I'm incredibly sensitive. My friends can play titles with v-sync off & claim to see nothing but "slight tearing"; I turn v-sync off, and my head explodes.


----------



## Streetdragon

MHW at 4K or 1440p (preferred) in SLI is working great for me, and I'm really sensitive to stutter/flicker.
And I don't have the fastest CPU either: 5930K @ 4.5GHz, 32GB 3200MHz with tight timings.

Tested MHW on PS4 and PC at the same time (sister played on PS4).
PC was way smoother.


----------



## skupples

can you please post a screenshot of your GPU usage in 4K w/ sli on? 

I'd truly appreciate it, & look REP is back ! 

I've been running single-GPU 1440p maxed with resolution preferred, or 4K low on one card, due to hitching issues in 4K on ultra. (They look about the same; it's hard to choose between the two besides 1440p running slightly faster.)

Let's not get it twisted. I've been SLI gaming for 10 years & that's not gonna change. I just throw in the towel much sooner than I used to when it comes to rigging functionality. Before all the minutia mashing that went on, my main point was that getting SLI to work is a pain in the ass & usually comes with a catch. Very few engines truly do NOT support the tech; the rest are just victims of penny pinching or lazy devs. I 100% agree SLI support should be mandatory in NV-branded games, and I've believed this since NV purchased the technology. I believe it even more now that AMD has cornered the console market. NV needs a way to shine apart from consoles. It's almost like they've forgotten there are crazy folks out there like me who'll buy 2-4 flagships, along with 3x 8K screens, then strap it all to a sim pit for the most epic in-home gaming experience possible. 

....so for now, I dream on of DX12 + NVLink maturation... like we've been doing since the first Titan. DX11 is a dinosaur. It needs to sink into the tar pits faster.


----------



## The Pook

skupples said:


> Like I said though, I'm incredibly sensitive. My friends can play titles with v-sync off & state to not see anything but "slight tearing" I turn v-sync off, n my head explodes.



If I go out of my way and look for tearing I can see it but otherwise I don't notice it. 

Depends a lot on the game though.


----------



## skupples

The Pook said:


> If I go out of my way and look for tearing I can see it but otherwise I don't notice it.
> 
> Depends a lot on the game though.


this is true, some games handle it better than others.

Most recent game I can think of where I can play with v-sync off with no issues is PUBG; I ONLY see a latency difference in that title. The most recent MUST-have-v-sync-on titles are the CryEngine-branched ones: Far Cry, Metro, etc.


edit - I did a sanity check & yeah, I get full second-GPU usage & almost no 1st-GPU usage. With Inspector I get that classic 50/50 split. (MHW)

Good news is the first of my 2TB NVMe cards is on the way, so I think I'll play with Win7 on that until all 4x 2TB drives show up, which I'm slowly ordering over the next couple checks. That + the addition of the Dell AW 3440x1440p 120Hz monitor, and I'm DONE until the 4 series. (They're going in the ASUS 4x NVMe 16x card.)

I've had some weird issues lately, like MH:W only working after a restart, & Windows not wanting to boot from anything but 2 specific ports on my bottom card = I think my old SATA SSD RAID is dying. MH:W even continues to require the weird restart after a full removal & re-install. 

weirdly, all SSDs report 100% health after all these years..


----------



## Streetdragon

skupples said:


> can you please post a screenshot of your GPU usage in 4K w/ sli on?
> 
> I'd truly appreciate it, & look REP is back !
> 
> I've been running single GPU 1440p maxed, res pref, or 4K low via one card due to hitching issues in 4K on ultra. (they look about the same its hard to choose between the two besides 1440p running slightly faster)
> 
> Let's not get it twisted. I've been SLI gaming for 10 years & that's not gonna change. I just throw in the towel much sooner than I used to when it comes to rigging functionality. Before all the minutia mashing that went on, my main point is that getting SLI to work is a pain in the ass & usually comes with a catch. Very few engines truly do NOT support the tech, the rest are just victims of penny pinching or lazy devs. I 100% agree SLI should be mandatory on NV branded games, and I've believed this since NV purchased the technology. I believe it even more now that AMD has cornered the console market. NV needs a way to shine apart from consoles. It's almost like they've forgotten there's crazy folks out there like me that'll buy 2-4 flag ships, along with 3x 8k screens then strap it to a sim pit for the most epic in-home gaming experience possible.
> 
> ....so for now, I dream on to Dx12 + NVlink maturation... Like we've been doing since the first Titan. DX11 is a dinosaur. It needs to sink into the tar pits faster.


So I used the settings from here:
https://steamcommunity.com/app/582010/discussions/3/1745594817442794507/
(used DSR for 4K, because I only have a 1440p panel)

Runs lag-free. No stutter, but 1440p gives better fps, so no 4K for me xD


----------



## skupples

Thank you sir. And that's on twin 1080 Ti's in Win10... with what driver, & are you forcing it via Inspector? Everything else looks identical: OC, volts, CPU. 

I think I'm also gonna push these over to a new BIOS. I've never NEEDED to do that for any specific reason other than a more consistent OC though.

Slightly off topic... I've been trying to slay Kirin for the second time but keep eating feces in these PUGs... so if anyone wants to help tonight  I went to the extent of getting full Xeno + lightning runes, so she only really tickles me now, but folks who show up naked get nuked.


----------



## Streetdragon

skupples said:


> thank you sir, and that's on twin 1080ti's in win10... with what driver, & are you forcing Inspector? everything else looks identical. OC, volts, cpu.
> 
> I think I'm also gonna push these over two a new bios. I've never NEEDED to do that for any specific reason other than more consistent OC though.
> 
> slightly off topic... I've been trying to slay kirin for the second time but keep eating feces in these pugs... So if anyone wants to help tonight  I went to the extent of getting full xeno + lightning runes, so she only really tickles me now, but folks who show up naked get nuked.


Driver is 425.31 (wanted to test raytracing, nice stuff^^) and I changed the profile with Inspector like in the thread I posted.

I can post the settings I used in Inspector and in-game if you want.

OT: I don't really play MHW anymore. Buying it for the PC was "just for fun" after I played it to death on the PS4, because I hate console controllers and aiming with them. Worst gameplay^^


----------



## skupples

The only weapon I like controller for is glaive, and that's cuz you can just hit auto-target & fly around for 30 seconds at a time, avoiding 90% of damage on 90% of mobs. Kirin & Kush are my only trouble mobs w/ glaive - so now I'm gonna level some sorta ranged... probably bow. This is actually my first Monster Hunter title due to my near life long PC usage over consoles. Me & consoles existed in an era where they were more likely to end up in pawnshops than entertainment cabinets. 

I'm not too curious about in-game settings; I've got a pretty good handle on those... 

I'll look back to your post, I must have missed it. 

One 1080 Ti can only do 4K + 4K textures on low.


Edit : 

I noticed something today - I'm not on 1809, or even 1803... I wonder why I've broken off of the update pathing. 

anywhoo, maybe this will fix some of this extra SLI weirdness I'm getting. I'd love to be able to actually force decent support again 

So things are running much better on 1809, but that SLI profile gives me extreme artifacting  
It worked for like 5 minutes, then things went belly up, so top of the list for tomorrow is a driver flush.

I hopped into Deus Ex for a sanity check; both cards are working fine while pegged @ 99.99999% usage in 4K + MSAA with no graphical errors.


----------



## VeritronX

The reason I dropped SLI was the same reason I don't like "120Hz" etc. TVs: yeah, you get more frames that way, but there is always more input lag than running everything 1:1.

With SLI you're actually rendering the extra frame instead of guessing, but the driver has to buffer frames, then spends some time routing them, and the cards spend some time transferring the finished frame.

Just the fact that two frames are being rendered at the same time means the input latency is doubled to start with, even in a perfect scenario.
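The "latency is doubled" reasoning above can be put into rough numbers. A minimal model, assuming ideal alternate-frame-rendering scaling and ignoring driver frame buffering (the function and figures are illustrative, not measurements):

```python
# Sketch of why AFR (alternate frame rendering) raises input latency
# relative to a single card delivering the same frame rate.

def afr_stats(single_gpu_frame_ms, n_gpus=1):
    """Ideal AFR: each GPU still needs the full frame time to render its
    frame, but the GPUs start frames staggered, so completed frames are
    *presented* every single_gpu_frame_ms / n_gpus."""
    present_interval_ms = single_gpu_frame_ms / n_gpus  # throughput scales
    render_latency_ms = single_gpu_frame_ms             # latency does not
    return {
        "fps": 1000.0 / present_interval_ms,
        "render_latency_ms": render_latency_ms,
    }

single = afr_stats(20.0, n_gpus=1)  # one card: 50 fps, 20 ms per frame
sli = afr_stats(20.0, n_gpus=2)     # two cards: 100 fps, still 20 ms

print(single)  # {'fps': 50.0, 'render_latency_ms': 20.0}
print(sli)     # {'fps': 100.0, 'render_latency_ms': 20.0}
```

A single card rendering natively at 100 fps would sit around 10 ms per frame, so at equal frame rates the AFR rig carries roughly twice the render latency, before any driver buffering or inter-card transfer time is counted.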


----------



## skupples

I got Win10 all updated, applied the recommended profile from earlier in the thread, and it turns into artifact & glitch hell (I'm on the ray tracing driver as well).

GPU usage is flawless though 

It was weird: I loaded in at 1440p and it was working fine, then I assigned 4K, allowed the game to do its restart thing, then boom, borked. 

edit - this is quite depressing. Flushed drivers, installed the 430. driver, it worked GLORIOUSLY just long enough to get logged in, then artifact hell. Tested in Fallout 4 for a hardware sanity check, all seems well. Benchmarked the hell outta them, all seems well.


----------



## Blze001

Silly question, but what drivers is everyone using? Or has Nvidia not quite reached their "the old gen starts falling off" updates for the 10 series yet?


----------



## skupples

I'm on the one from the 23rd; I however keep all the previous drivers I've installed, just in case. Supposedly that doesn't matter! Once the changes are made in-card there's no going back... never seen proof of this conspiracy though.


----------



## ThrashZone

Hi,
DDU should get rid of any driver sludge.
397.64 seems fine still.
Otherwise, performance-wise, check the game options and the driver control panel to disable anything RTX-related that might have been activated.


----------



## skupples

ThrashZone said:


> Hi,
> DDU should get rid of any driver sludge
> 397.64 seems fine still
> Otherwise performance wise refer to game options and driver cp option to disable anything rtx related that might of been activated.


Yeah, that crossed my mind this morning while driving into the office. I need to check it tonight, after I get my first two 2TB NVMe drives set up. I'm migrating my Win10 over to them and deploying Win7 on my SATA RAID.

Every other known SLI-functional game/bench I have is working fine, so I'm staying away from the idea of a busted card.


----------



## StAndrew

I have a Gigabyte Aorus Xtreme waterblock and the core runs solid at 2100. One MHz over and the card crashes. I'm toying with the shunt mod, and I'm hoping there are other mods you can help me with to kick the core up a few more MHz. I feel it has plenty more headroom.


----------



## Streetdragon

StAndrew said:


> I have a Gigabyte Aorus Xtreme waterblock and the core runs solid at 2100. One MHz over and the card crashes. I'm toying with the shunt mod, and I'm hoping there are other mods you can help me with to kick the core up a few more MHz. I feel it has plenty more headroom.


The shunt mod won't help there; it only helps when you are at the power limit.
Want more MHz? Then get the core colder! Colder and more voltage.
Did I say colder?


----------



## StAndrew

Streetdragon said:


> The shunt mod won't help there; it only helps when you are at the power limit.
> Want more MHz? Then get the core colder! Colder and more voltage.
> Did I say colder?


Load doesn't go much over 35ish. Depending on my room's ambient temp, the most I've seen (after a few hours with the door closed) was in the lower 40s.


----------



## skupples

Most folks find their wall at 2085-2101, it seems. 

Wonder if a BIOS would help... I'm not up on what the limitations are these days. It used to just be not enough volts.


----------



## ThrashZone

StAndrew said:


> I have a Gigabyte Aorus Xtreme waterblock and the core runs solid at 2100. One MHz over and the card crashes. I'm toying with the shunt mod, and I'm hoping there are other mods you can help me with to kick the core up a few more MHz. I feel it has plenty more headroom.


Hi,
Might look into this 
https://www.overclock.net/forum/69-...erent-bios-your-1080-ti-205.html#post27928240


----------



## StAndrew

ThrashZone said:


> Hi,
> Might look into this
> https://www.overclock.net/forum/69-...erent-bios-your-1080-ti-205.html#post27928240


Much appreciated, I'll look into that. I'm not a huge fan of BIOS-flashing cards after the Windows install. I'm looking over results and I'm a bit disappointed: my GTX 780 Ti's ran up to 1200 core, only limited by my power supply (I had three 780s and the amp draw at the wall was getting a bit high). Running without SLI, I could get up to 1300 core...


----------



## StAndrew

So I got my water loop back together last night and did some testing. Lowering the voltage allowed 2126, which was solid for a while until the room heated up a bit and temps started going north of 35°. No hard crash, but frames started to stutter and artifact. 

I'm considering a shunt mod, but at the same time, given the "gains", I don't think it's worth it...


----------



## skupples

Shunting seems silly, honestly.

Much rather find a way to just get control over the voltage controller.


----------



## mons7er

skupples said:


> As stated - this is in 4K, however results are similar but less pronounced in 1440p, specially when the frames really start picking up. Clearly though, we've figure it out, it's just my weak 5.3 9700k & 3200mhz memory
> 
> We're also in two different boats. you want high FPS in 1080p, which i'm guessing wouldn't even require a second card based on 1440p ultra running 70-90 most of the time. Really I believe the issue is the same as its always been. Two cards doesn't do much for ya at higher resolutions if you're already nearing your memory bandwidth cap.
> 
> 
> I really do hope you get the epic experience you expect, I do. I'm not trying to talk anyone out of anything (no matter how many times you annoyingly say it, it won't come true), I'm pretty sure that's just your tribal nature kicking in & improperly deciphering these figures on the screen.
> 
> 
> People just gotta remember - one beefier card always wins against two mid range cards (specially when the beefier card has higher memory bus!!!!) UNLESS!?!?! anyone remember?! you're gaming in resolutions low enough to not max out the memory bandwidth, as memory bandwidth doesn't scale as most expect it to, in SLI.
> 
> it's a funny time right now though, 20 series sucks & many of us are experienced that first hand, thus the only other thing to do is slap on an extra 1080ti for the one outta ten times its gonna work well.
> 
> as I stated in the beginning of all of this like 2 weeks ago - I'm not trading in my 2nd card until I see how it fares on my 3440x1440p alienware next month. That however won't change the fact 1 in 10 games support SLI, and that its a dead technology being replaced by NVLink. The more DX12 titles we see, the less SLI titles we'll see, until they get that part all ironed out, which may take a couple more years & we currently get maybe 1 outta 20 with native support, n can maybe get 1 in 5 to work with tweaked support.



I only have anecdotal evidence, but when I was trying SLI on the GTX 900 series cards, no matter what my frames were, the experience felt terrible, and I opted to play all games at lower frame rates with a single card.


----------



## skupples

mons7er said:


> I only have anecdotal evidence, but when I was trying SLI on the GTX 900 Series Cards, no matter what my frames were, the experience felt terrible and I opted to play all games at lower frame-rates with a single card.


Yep, that's a pretty common response from folks... there's an inherent latency. The better solution these days would be a top-tier card & a frame-pacing monitor. 

Fallout 4 & Deus Ex were the last native-SLI titles I know of that run beautifully in SLI. Pretty much everything else requires tweaking and ends up having a tell or two (unless you're on Win7, apparently  )

- Turns out my weird data corruption issue has spread; more titles won't load, etc., so I'm rebuilding my RAID this weekend after replacing my ancient SATA cables. 

Hopefully that'll fix the weird launch issues & the non-functioning SLI I've been experiencing in titles where it should work but doesn't.

Also - I totally forgot about CPU lanes, so my 8TB of NVMe on the ASUS expansion card has gone into storage for the time being, until I upgrade back to the higher-lane-count CPUs... or pawn off my 2nd Ti & get a G-Sync monitor.


----------



## OrionBG

Hey guys,

I came upon an offer of some ASUS 1080 Ti FE cards recently. The cards were purchased in the middle of 2017 and used for mining (obviously) for a year. After that they were removed and have been staying in their boxes ever since. I've read a lot about mining cards and what condition they may or may not be in, but the price is very good and I have always wanted to do a dual-GPU setup (I already have one 1080 Ti), so... Is it worth it to go SLI with 2x 1080 Ti's in this day and age? I don't game a lot (almost not at all), but I like to do benchmarks and play with settings, hardware, tinkering and so on. My monitor is a 34" UWQHD 3440x1440, so I'm pushing about half the pixels of 4K...
Basically I can get 4 of those 1080 Ti's for the price of a single brand-new RTX 2080 Ti, and to be fair I don't like all that RTX snake oil...


----------



## KedarWolf

OrionBG said:


> Hey guys,
> 
> I came upon an offer of some ASUS 1080TI FE cards recently. Those cards have been purchased in the middle of 2017 and used for mining (obviously) for an year. After that they have been removed have been staying in their boxes ever since. I've read a lot about mining cards and what condition they may or may not be in, but the price is very good and I have always wanted to do a dual GPU setup ( I already have one 1080TI) so... Is it worth it to go SLI with 2x1080TI's in this day and age? I don't game a lot (almost not at all) but I like to do benchmarks and play with settings, hardware, tinker and so on. My monitor is a 34" UWQHD 3440x1440 so I'm pushing about half the pixels for 4K...
> Basically I can get 4 of those 1080TI's for the price of a single brand new RTX 2080TI and to be fair I don't like all that RTX snake oil...



SLI has latency issues compared to single cards, so some people complain of stuttering and such. It would be good for benchmarking, though, but not so much for gaming.


----------



## kithylin

KedarWolf said:


> SLI has latency issues compared to single cards, so some people complain of stuttering and such. It would be good for benchmarking, though, but not so much for gaming.


I think it depends on whether your CPU is fast enough to keep up. I met someone recently in Steam chat and talked to them directly: with a 9900K system and DDR4 @ 4000, they're reporting zero latency/stuttering issues with two 1080 Ti's driving [email protected] But the same user also told me that trying the same setup with a 1st-gen Ryzen had bad latency/stuttering. So... it depends on your system. 

Also, with any setup, either Nvidia SLI or AMD CrossFire, you need to be prepared to fiddle with profiles to get your games working 100%. With AMD it requires a 3rd-party program running in the background to make it work (RadeonPro); with Nvidia you can edit the driver profiles directly with no 3rd-party background programs. So... pros and cons of either setup.


----------



## skupples

9900k with DDR3? weird.

Yes, the CPU helps, but adding a second card 100% adds latency, even if your homie with the best SLI experience on earth is clearly the norm


----------



## kithylin

skupples said:


> 9900k with DDR3? weird.
> 
> yes CPU helps, but adding a second card 100% adds latency, even if your homie with the best sli experience on earth is clearly the norm


I meant DDR4, of course. It's called a typo. Edited and fixed now. And I'm sorry if I pick on you a lot, but it just seems, for some reason I can't understand, that you're having a rather bad/sour SLI experience and then preaching it in the forums as if that's normal for everyone. It's not; that's just your experience. And it makes me a little sad inside that you're trying to dissuade everyone else from SLI just because you had a bad experience. For most people (with fast enough computers) it's smooth and glorious with minimal issues. And for those of us chasing 100 FPS minimums @ 1080p or 1440p, it's literally the only option unless we can somehow plant a money tree and afford a pair of 2080 Ti's. According to the guy I chatted with on Steam, in most titles that support SLI, his SLI 1080 Ti system is significantly faster than his girlfriend's 9900K + single 2080 Ti system, by as much as +50%.


----------



## Streetdragon

Has anyone here soldered 5mOhm shunts on top of the 5mOhm shunts on their 1080 Ti FE to halve the power readings?
I'm thinking about doing it for the 6+8-pin shunts, but I'm afraid the card could jump into safe mode because of the lower reading. Or is that OK?
For the Titan X, 5mOhm is OK.....
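For anyone weighing the mod asked about above, the arithmetic is just two equal resistors in parallel: stacking a 5mOhm shunt on a 5mOhm shunt halves the sensed resistance, so the card reports roughly half the real power draw. A quick sketch (illustrative only; whether the card objects to the lower reading is the separate question the post raises):

```python
# Shunt-mod arithmetic: stacking a shunt of equal value on top of the
# existing one puts the two resistors in parallel, halving the
# resistance the power-monitoring circuit senses.

def parallel(r1_mohm, r2_mohm):
    """Equivalent resistance of two resistors in parallel (same units)."""
    return (r1_mohm * r2_mohm) / (r1_mohm + r2_mohm)

stock = 5.0                    # stock 5 mOhm sense shunt
modded = parallel(stock, 5.0)  # 5 mOhm stacked on top -> 2.5 mOhm

# The monitor infers current from the voltage drop across the shunt but
# still assumes the stock resistance, so the reported power scales by
# R_modded / R_stock:
reported_fraction = modded / stock

print(modded)             # 2.5
print(reported_fraction)  # 0.5 -> card reports half the real draw
```

Note this only scales the reading; as the earlier reply in this thread says, it buys nothing unless the card is actually hitting its power limit.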


----------



## skupples

kithylin said:


> I meant ddr4 of course. It's called a typo. Edited and fixed now. And I'm sorry if I pick on you a lot but it just seems for some reason (that I can't understand) you're having a rather bad / sour SLI experience and then you're preaching it in the forums as if that's normal for everyone. It's not and that's just your experience. And it kind of makes me sad inside a little that you're trying to dissuade everyone else from SLI just because you had a bad experience. For most people (with fast enough computers) it's smooth and glorious with minimal issues. And for those of us right now chasing 100 FPS minimums @ 1080p or 1440p, it's literally the only option unless we can some how plant a money tree and can afford a pair of 2080 Ti's. According to the guy I chatted with in steam, in most titles that support SLI, his SLI-1080-Ti system is significantly faster than his girlfriend's 9900K + single 2080 Ti system by as much as +50%.


lol, I'm not worried about spirited back and forth, please, no apologies. It's cute that you don't see how we're basically identical in this... You want to fanboy it, even though you haven't used it in forever, and I want to point out the flaws, since I've been using it... forever... including right this moment, in Rage 2. 

Did you get that second card yet, or are you still speaking on someone else's behalf? He's right: when it works, it works beautifully.  Good news is, it seems the dry spell may be over; the last few titles I've purchased had pretty decent scaling out of the box with very little latency increase. Before 2019 though, well, we've seen the supported-title lists in the official and community support clubs. The last 2-3 years have been abysmal, though predicted as far back as Fermi. Luckily DX11 grows deader by the day. 

You seem to ignore basic functionality & design aspects of a computer: two of something = more latency than one of something; however, when properly accounted for, it's undetectable. 

You're quite the parrot; we've both agreed it's the only way to get high frames at high settings above 1080p, and that's kinda always been the purpose, thus why I've been using it for over a decade. It can still be quite the pain in the ass, and will simply never work properly in many titles, even with all the bits and AFR triggers in the world. I'm actually eyeing 2x 2080 Ti for sale on FB Marketplace in my area, may have to swoop. He wants $1800 COD.


----------



## Shawnb99

I'm sure people on Steam never lie or exaggerate anything. No one ever lies on the internet. 

According to one guy I talked to on Steam, he owns a shark with laser beams, though sadly his camera was broken when I asked for a pic, so I can only take him at his word.
He seems truthful...


----------



## kithylin

skupples said:


> did you get that second card yet, or still speaking on someone else's behalf?


Not yet, but I've been speaking to people about it, and my research still leads me to believe it's a viable option. For better or worse, I refuse to use Windows 10, and I'm not made of money, nor do I have a money tree in the back yard. So before making my next GPU purchase I'm forced to sit around and wait to see what happens with Nvidia. I'm silently hoping we'll have a single-GPU solution from Nvidia with the 3000 series that is at least +60% over a single overclocked 1080 Ti, supports Windows 7 with drivers, and costs < $800 new.

I will be buying another EK water block for 1080 Ti's this fall though. The water blocks are drying up and starting to get scarce, but used 1080 Ti's will be there for years to come. If Nvidia drops Win7 support with the release of the 3000 series, I may be forced to buy another 1080 Ti with no other option, and I'll have to cope with SLI and all its pros and cons. I don't know what the future holds yet, but the 2080 Ti may actually be the last Windows-7-compatible GPU from Nvidia, ever. Somewhere around what we have now, or what comes out in the next 1-2 years, will be the end of Win7 drivers... so I'm kind of waiting to see what it is.

In the short term, while waiting on Nvidia, I plan to spend money on a platform upgrade instead: probably pick up a cheap 3770K later this year for my current computer, delid it and run it bare-chip in the water loop and try for a 5.5-ish GHz daily driver (my board can do it), wait to see what AMD does, then consider $1200 or so on a whole new X570 + 12-core 5GHz Ryzen 2 chip + 32GB DDR4-4000 in 2020, and look for GPU upgrades the year after that.

I've spoken to at least 10 different people using 1080 Ti SLI with modern, fast Intel computers, and you're the only person with so much negative to say about it. Literally everyone else I talk to says it's great, and they usually see +60% to +80% in most games they want to play. I have heard negative things about SLI when the GPUs aren't clock-matched, though: one person reported crashing, blue screens and other problems with a 100-300 MHz difference between the two cards, which went away when they clock-matched them with MSI Afterburner. So at least that's still a thing with Nvidia.


----------



## amd955be5670

MSI 1080 Ti Gaming X owner here.

The other day I decided to do a Kryonaut repaste. Temps used to reach near 80°C while gaming. In my first "attempt" I basically cleaned everything, repasted, and removed the LED lights (annoying to always switch them off in Windows). To my surprise, after putting everything back together the card just didn't POST. After a moment of despair at ruining something so expensive, I decided to reattach the LEDs to prepare for an RMA. After that, just for the heck of it, I started it up and it was bloody working!

Which means I cannot use a waterblock, or remove the shroud for Noctua fans. If I ever have an LED fail, the card would just be a brick. Is there any way to fool the LED connectors into thinking they're connected? Has anyone even faced this?

I tried googling; it seems nobody has ever had this issue. Really lost here.


----------



## Streetdragon

Didn't MSI have a problem with disconnected fans?
Check if there is a new BIOS for your card.


----------



## ThrashZone

Hi,
Yeah, that was a 2080 Ti; the dude was doing both a water block and flashing different vBIOSes, and he borked his GPU. 

Story was the fans have to be connected, so yeah, a water block is out. 

I ruled out a 2080 Ti Sea Hawk because of MSI's vBIOS stupidity.


----------



## skupples

you can just spoof the plug on the 2080ti...


----------



## amd955be5670

Streetdragon said:


> Didn't MSI have a problem with disconnected fans?
> Check if there is a new BIOS for your card.


I'm currently on 86.02.40.00.1A, which seems to be the latest vBIOS. In this scenario, if I flashed to a different BIOS, like the Palit one suggested a few pages back in this thread, would I still see this issue?




skupples said:


> you can just spoof the plug on the 2080ti...


How? Which connectors have to be jumped?


----------



## MegatronicRus

Hello. Is the XOC bios from the first post still the best for a watercooled 1080 Ti FE, or are there better options?


----------



## kithylin

MegatronicRus said:


> Hello. Is the XOC bios from the first post still the best for a watercooled 1080 Ti FE, or are there better options?


Apparently yes.. as long as you have a new card and only intend to use it for 12-14 months.


----------



## MegatronicRus

kithylin said:


> Apparently yes.. as long as you have a new card and only intend to use it for 12-14 months.


Why is that? Could this bios cause any problems? What should I flash for long-term use, then?


----------



## kithylin

MegatronicRus said:


> Why is that? Could this bios cause any problems? What should I flash for long-term use, then?


My 1080 Ti suffered degradation after running the XOC bios daily @ 1.200v core voltage for 17 months. I didn't fry it totally, but I did do some damage. Even after I removed the software overclock (completely stopped loading MSI Afterburner at startup, so it never applied any kind of overclock), the card would still instantly crash to desktop in any game I tried, under any load, when running on the XOC bios. I had to flash back to the stock bios and reduce the overclock to get it to run now.


----------



## Cryptedvick

kithylin said:


> My 1080 Ti suffered degradation after running the XOC bios daily @ 1.200v core voltage for 17 months. I didn't fry it totally, but I did do some damage. Even after I removed the software overclock (completely stopped loading MSI Afterburner at startup, so it never applied any kind of overclock), the card would still instantly crash to desktop in any game I tried, under any load, when running on the XOC bios. I had to flash back to the stock bios and reduce the overclock to get it to run now.


I'm curious, what kind of stable overclock can you maintain now with the stock bios?


----------



## Streetdragon

XOC is fine, as long as you limit the voltage to 1.093V and just use the "no power limit" feature.

Like kithylin said, 1.2V is not healthy in the long run


----------



## kithylin

Cryptedvick said:


> I'm curious, what kind of stable overclock can you maintain now with the stock bios?


With XOC I was able to run it at 2126 Mhz @ 1.200v (and yes my card needed that voltage to do it... it wouldn't run 2100+ Mhz at all without it) and +700 memory. Now on stock bios it runs 2012 - 2025 Mhz and I can still keep the +700 memory. So not terrible. So far it's been okay for a few months like this.


----------



## skupples

oh, apparently it requires an oscillator and some soldering? (did a quick google-fu)

https://arstechnica.com/civis/viewtopic.php?f=7&t=566930

I guess it would depend on the fan type; the above is for PWM, I assume...

however, I'd assume most cards use PWM fans these days.


----------



## Cryptedvick

kithylin said:


> With XOC I was able to run it at 2126 Mhz @ 1.200v (and yes my card needed that voltage to do it... it wouldn't run 2100+ Mhz at all without it) and +700 memory. Now on stock bios it runs 2012 - 2088 Mhz and I can still keep the +700 memory. So not terrible. So far it's been okay for a few months like this.


Well thats pretty good actually. Well within the average for most 1080Ti's. Max I can do on my MSI is 2063Mhz at 1.093v.


----------



## 8051

Cryptedvick said:


> Well thats pretty good actually. Well within the average for most 1080Ti's. Max I can do on my MSI is 2063Mhz at 1.093v.


I have the MSI 1080Ti Duke, I can squeeze 2152 out of it, but it takes 1.2V to do it and I have to reduce memory speed by 253Mhz.


----------



## kithylin

Cryptedvick said:


> Well thats pretty good actually. Well within the average for most 1080Ti's. Max I can do on my MSI is 2063Mhz at 1.093v.


What I have found personally is that with these cards we actually do not need the XOC bios to avoid throttling. I'm able to maintain 2012-2025 MHz on this card and never drop below it on the stock bios, just because I'm able to keep the card under 45c at all times (even when ambient temps in the computer room where the physical tower is sometimes reach 32c), thanks to a large custom water loop with 4 radiators and 15 fans. If we can keep the cards cool enough, they don't throttle at all. It's the temperatures that throttle 1080 Ti's badly.

I vividly remember when I first bought this card on its stock air cooler (EVGA GTX 1080 Ti SC Black Edition): I was trying to run it at max overclock and it would start out around 1950 MHz core speed, then as I played it would heat up terribly into the mid-70s C core temps (even with the fans forced to 100% constantly) and eventually throttle itself down to the low 1800-1820 MHz range after reaching around 80c. Today, with this big loop keeping it under 45c, it stays at 2012-2025 MHz all the time and never goes below that. Well, in benchmarks.. in actual games I play with Vsync on for 80 FPS / 80hz currently, so it often drops to around 1200-1300 MHz depending on how hard the game I'm playing loads it. But if I play a game like ARK: Survival Evolved, it can run at 2025 all day for hours now @ 40-43c with no throttling.

I don't think XOC is actually that beneficial. We can't safely use XOC on an air cooled card anyway, and if you're on custom water already, the cards shouldn't throttle themselves if your loop is big enough to keep them cool, even without XOC.

EDIT: Fixed some MHz typos.. I must have been drunk when I first wrote this post. I have no idea where I got 2088 from on the previous page and in this post. My card maxes out at 2025 MHz now on the stock bios.


----------



## ZloY SloN

Hey, guys! Here is a photo of the Gigabyte 1080 Ti Gaming OC cooling system:



Is it possible to identify thermal pads thickness by this photo?


----------



## Streetdragon

Looks like 0.5 mm thick pads, but I'm not sure


----------



## ZloY SloN

0.5mm, but they were shrunk under pressure. I think I should go with 1mm pads. Thanks.


----------



## Bonhart89

Hello everyone,

I know this is a 1080 Ti Owner's club, but it popped up in a google search when I looked for Total GPU power (normalized) (% of TDP).
I have an Asus Strix 1080 (no OC, everything at factory settings, never tried OCing it) and my normalized TDP jumps to 200-400% when I am in game. Is that normal?

I hope you can open the picture.
https://imgur.com/a/fmpVYcM

Thanks for the help!


----------



## skupples

most air coolers use 1 mm pads or larger, while blocks use 1 mm and smaller. LOL, some of these nasty blowers use 3 mm pads.


----------



## Norlig

Just reinstalled Windows.

What was the softmod needed for 1.093v?

I tried the curve settings in the first post, but it seems I top out at 1.062v now.

Don't remember what was needed to get it to ignore whatever limits it could ignore...


----------



## Zfast4y0u

Norlig said:


> Just reinstalled Windows.
> 
> What was the softmod needed for 1.093v?
> 
> I tried the curve settings in the first post, but it seems I top out at 1.062v now.
> 
> Don't remember what was needed to get it to ignore whatever limits it could ignore...


If using MSI Afterburner, move the voltage slider all the way to the right; then in the curve editor you can set your voltage/frequency curve up to 1.093V instead of the default 1.062V you get with the voltage slider all the way to the left.


btw, just installed Shadow of the Tomb Raider; I was waiting for them to release the last patch and all the tombs before I dive into it. Must say, the multi-GPU support (yes, multi-GPU, not SLI or Crossfire) is like nothing I have seen before, scaling +90%.

And then you get these big publishers that roll out AAA titles every year and can't be bothered with it, but this little studio pulled it off. Hats off to them. DirectX 12 doing its magic when properly implemented. This should be the future of multi-GPU systems, not a freakin' $80 NVLink.


----------



## KCDC

Howdy ladies and gents. Anyone know a quick link to reinstalling the Founder Edition coolers? Also, anyone remember the factory pad thicknesses? Thank you kindly!


EDIT: Using a vid in reverse for install, but still want to know the original pad thicknesses if anyone may know.


EDIT 02: for the small weird white thermal "pads" that came on the card in some areas, would a .5 pad be ok for those? I'm prepping these cards for sale to a friend.


----------



## Knoxx29

kithylin said:


> now today with this big loop and I can keep it < 45c it stays at 2012-2025 mhz all the time and never goes below that. Well in benchmarks.. in actual games I play with Vsync on for 80 FPS / 80hz currently, so it often switches down around 1200-1300 mhz depending on how hard the game I'm playing loads it. But if I play a game like ARK: Survival Evolved, it can run at 2025 all day for hours now @ 40-43c with no throttling



My card used to behave like yours: 2026-2012 MHz, most of the time 2026 and rarely dropping to 2012. I say "used to" because something weird that I can't explain happened last week when I did a fresh Windows installation. Since then the card won't do 2026-2012 MHz but instead 2038 MHz, and it won't go below that. I played hours and hours of many different games and ran 3 or 4 different benchmarks, and the card stayed at 2038 MHz the whole time. Something I find funny is that your cooling system with 4 radiators and 15 fans has way more cooling capacity than mine, which has just a single 360 rad, and my GPU's temperature is 47-50c.


----------



## kithylin

Knoxx29 said:


> My card used to behave like yours: 2026-2012 MHz, most of the time 2026 and rarely dropping to 2012. I say "used to" because something weird that I can't explain happened last week when I did a fresh Windows installation. Since then the card won't do 2026-2012 MHz but instead 2038 MHz, and it won't go below that. I played hours and hours of many different games and ran 3 or 4 different benchmarks, and the card stayed at 2038 MHz the whole time. Something I find funny is that your cooling system with 4 radiators and 15 fans has way more cooling capacity than mine, which has just a single 360 rad, and my GPU's temperature is 47-50c.


That's because I have a lot more in my loop than just the video card. I have a big Gigabyte GA-Z77X-UP7 motherboard with its 32-phase VRM looped in with a VRM block, a set of Corsair Vengeance DDR3-2800 (currently running at 2500) looped in, and an i5-3570K @ 5.0 GHz daily driver looped in too. It may be somewhat overkill right now, but I do have plans to add to it later. I want to loop in a second 1080 Ti. And I have a line on a friend in Florida with a 3770K that runs as a daily driver clocked @ 5.5 GHz (delidded, bare chip on block, no IHS) that I want to put into this system and into the loop later too.


----------



## Knoxx29

kithylin said:


> That's because I have a lot more in my loop than just the video card. I have a big Gigabyte GA-Z77X-UP7 motherboard with its 32-phase VRM looped in with a VRM block, a set of Corsair Vengeance DDR3-2800 (currently running at 2500) looped in, and an i5-3570K @ 5.0 GHz daily driver looped in too. It may be somewhat overkill right now, but I do have plans to add to it later. I want to loop in a second 1080 Ti. And I have a line on a friend in Florida with a 3770K that runs as a daily driver clocked @ 5.5 GHz (delidded, bare chip on block, no IHS) that I want to put into this system and into the loop later too.


I have the GPU and the CPU at 5.3GHz 1.36V. As I can see, we have the same plan of adding a second 1080 Ti. Anyway, what do you think about what happened to my GPU after the Windows reinstall? Because for me it's kinda weird.

You want a 3770K and I want to get a 9900KS


----------



## kithylin

Knoxx29 said:


> I have the GPU and the CPU at 5.3GHz 1.36V. As I can see, we have the same plan of adding a second 1080 Ti. Anyway, what do you think about what happened to my GPU after the Windows reinstall? Because for me it's kinda weird.
> 
> You want a 3770K and I want to get a 9900KS


What I really want is the (Refinement of Zen2) threadripper 4000 series equivalent of the 3950X, 16 core system. And a full 60-lane PCIE-4.0 motherboard with a big 12+ phase VRM that supports a monoblock, or at least a VRM block. But such a thing doesn't exist yet (It will soon though) and probably another 1-2 years for that. And probably wait 12 months after it's released for the "new price" to wear down a bit and do a new build. So likely another 3 years. In the short term planning to go 3770K @ 5.5 Ghz 24-7-daily until then. And my 3570K runs @ 1.635v daily. Older chips take lots more volts than people think. My friend with the OC'd 3770K has been running his at I believe 1.725v for about 4 years daily and still works. I played with him in Borderlands 2 last night and he gamed on it for hours with me. He's planning to go Ryzen Zen2/Zen2+ soon.


----------



## Knoxx29

kithylin said:


> What I really want is the (Refinement of Zen2) threadripper 4000 series equivalent of the 3950X, 16 core system. And a full 60-lane PCIE-4.0 motherboard with a big 12+ phase VRM that supports a monoblock, or at least a VRM block. But such a thing doesn't exist yet (It will soon though) and probably another 1-2 years for that. And probably wait 12 months after it's released for the "new price" to wear down a bit and do a new build. So likely another 3 years. In the short term planning to go 3770K @ 5.5 Ghz 24-7-daily until then. And my 3570K runs @ 1.635v daily. Older chips take lots more volts than people think. My friend with the OC'd 3770K has been running his at I believe 1.725v for about 4 years daily and still works. I played with him in Borderlands 2 last night and he gamed on it for hours with me. He's planning to go Ryzen Zen2/Zen2+ soon.



This morning I found this : https://www.techpowerup.com/forums/...-4-on-its-amd-400-series-motherboards.257515/


----------



## kithylin

Knoxx29 said:


> This morning I found this : https://www.techpowerup.com/forums/...-4-on-its-amd-400-series-motherboards.257515/


That's neat and all.. but I'm really looking for a lot more PCIE lanes than that. X470 and X570 boards are only 16-lanes for expansion cards, 16/0 or 8/8, I'm looking for a lot more.

Anyway we're both going off topic here.. This is supposed to be about 1080 Ti's. I still love mine.


----------



## philhalo66

Guess you can count me in this club now, EVGA sent me a 1080 Ti as a replacement for my dead 1080.


----------



## tartmann

Congrats. Nice upgrade for ya.


----------



## skupples

kithylin said:


> That's neat and all..* but I'm really looking for a lot more PCIE lanes than that. *X470 and X570 boards are only 16-lanes for expansion cards, 16/0 or 8/8, I'm looking for a lot more.
> 
> Anyway we're both going off topic here.. This is supposed to be about 1080 Ti's. I still love mine.


this.

whoever can sell me similar to 9700k w. 40+ lanes wins my next $1,000... i9 series... what a joke. INTEL, I9 WAS A JOKE. YOU WEREN'T SUPPOSED TO MAKE IT REAL! 

OT -

1080ti ftw! 

took the time to prove both of mine are still functioning normally enough; I just damaged my board while trying to sync two different-gen EK blocks on an Asus Z390-A.

PCI-e comms issues are what appear to be causing my weirdness, outside of the normal weirdness with 2x GPU.


----------



## kithylin

skupples said:


> PCI-e comms issues are what appear to be causing my weirdness, outside of the normal weirdness with 2x GPU.


This is part of why I want more lanes. I have heard of 16x-PCIE platforms having all sorts of weirdness with dual GPUs. I'm crazy, I suppose, but I'm going to stay on Windows 7 until my dying breath. I've found information and walkthroughs on youtube on how to get AMD Threadripper working 100% (including chipset drivers loaded) with Windows 7, so I have hope there yet.

Because of this, some day Nvidia will stop supporting Windows 7 via drivers for newer cards. When that happens I will be forced to go dual-GPU with the top of the stack of whatever card that generation is, so I'm planning ahead for that day. When it is time, I would only consider it on my current board (a PLX-bridged Z77 board) or a HEDT board from Intel/AMD.

The other reason I want more lanes, like skupples highlighted above, is that I want to get into video editing as a part-time job for a local friend's small-time company. I want at least one PCIE-NVME drive on PCIE-4.0, and for scrubbing timelines on multiple 4K streams at once I'd like as many PCIE-4.0 NVME drives striped together in RAID-0 as the platform will allow. I also happen to play a lot of games in my spare time, so I want a system that can be good at both.


----------



## mouacyk

philhalo66 said:


> Guess you can count me in this club now, EVGA sent me a 1080 Ti as a replacement for my dead 1080.


Wow. Congrats for +50%.


----------



## philhalo66

mouacyk said:


> Wow. Congrats for +50%.


It was an SC ICX too, so yeah, I won the lottery on this one. Should be here Friday.


----------



## KenjiS

https://www.3dmark.com/3dm/38304677?

Just spent some more time tweaking a bit. The new case has significantly better temps (I'm at 65 or so under load, stock profile). I can't get more than +75 core and +50 memory on my Strix 1080 Ti... Is 2025 core / 2778 RAM good or not?

Good news is it holds that speed a lot better due to better cooling


----------



## kithylin

KenjiS said:


> https://www.3dmark.com/3dm/38304677?
> 
> Just spent some more time tweaking a bit. The new case has significantly better temps (I'm at 65 or so under load, stock profile). I can't get more than +75 core and +50 memory on my Strix 1080 Ti... Is 2025 core / 2778 RAM good or not?
> 
> Good news is it holds that speed a lot better due to better cooling


It all seems to be silicon lotto, and aftermarket / custom-PCB cards don't seem to be any better or different than reference-board cards in terms of normal overclocking (air and water cooling). Custom-PCB cards only seem to affect how far we can go on liquid nitrogen.

Some people have shown 2100 - 2150 Mhz core speed stable on reference PCB cards with good air cooling, and my card (EVGA GTX 1080 Ti SC Black Edition) is based on a reference board and only does 2026 Mhz, even with a large custom water loop and keeping the card < 40c even at full load. Some people have reported 2100 - 2175 Mhz with your strix card, or Custom-PCB cards.. and some people like you seem to only get 2025 Mhz out of yours. So it's all just a gamble.

What I have found with pascal cards though (1000 series from Nvidia) is that temps have the biggest impact on performance, way more than anything else, even more so than the design of the PCB. Aftermarket cards with big air coolers that can hold the temps down will overclock better (regardless of how the card's board is designed). I am confused a little on your memory speed though.. where are you getting that 2778 Mhz ram speed from? GPU-Z? What does afterburner say for your ram speed? The ram speed on these cards is reported differently in different programs and works on multipliers. My 1080 Ti runs with +700 Memory which results in 6210 Mhz for memory in afterburner, which is actually 12420 Mhz GDDR5x speed.
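To make the multiplier math above concrete, here's a quick back-of-the-envelope sketch (Python). The 5505 MHz stock Afterburner reading is my assumption for a reference 11 Gbps 1080 Ti, so treat the exact numbers as illustrative, not gospel:

```python
# Sketch of how 1080 Ti memory clocks map between tools.
# Assumption: a stock 1080 Ti reads ~5505 MHz in Afterburner (the DDR
# command rate for 11 Gbps GDDR5X); the "effective" rate is 2x that.
AFTERBURNER_STOCK_MHZ = 5505  # assumed stock reading; varies by card and tool

def memory_clocks(offset_mhz):
    """Return (afterburner_mhz, effective_gddr5x_mhz) for a given offset."""
    afterburner = AFTERBURNER_STOCK_MHZ + offset_mhz
    effective = afterburner * 2  # double data rate on top of what AB shows
    return afterburner, effective

print(memory_clocks(700))  # (6205, 12410), close to the 6210/12420 quoted above
```

So a "+700" in Afterburner and a "12400-ish effective" are the same overclock, just reported at different multipliers.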


----------



## skupples

The multiplier is 2778 = base x 2? So ~+400 OC? Can't remember the mem base clock atm.


----------



## KenjiS

kithylin said:


> It all seems to be silicon lotto, and aftermarket / custom-PCB cards don't seem to be any better or different than reference-board cards in terms of normal overclocking (air and water cooling). Custom-PCB cards only seem to affect how far we can go on liquid nitrogen.
> 
> Some people have shown 2100 - 2150 Mhz core speed stable on reference PCB cards with good air cooling, and my card (EVGA GTX 1080 Ti SC Black Edition) is based on a reference board and only does 2026 Mhz, even with a large custom water loop and keeping the card < 40c even at full load. Some people have reported 2100 - 2175 Mhz with your strix card, or Custom-PCB cards.. and some people like you seem to only get 2025 Mhz out of yours. So it's all just a gamble.
> 
> What I have found with pascal cards though (1000 series from Nvidia) is that temps have the biggest impact on performance, way more than anything else, even more so than the design of the PCB. Aftermarket cards with big air coolers that can hold the temps down will overclock better (regardless of how the card's board is designed). I am confused a little on your memory speed though.. where are you getting that 2778 Mhz ram speed from? GPU-Z? What does afterburner say for your ram speed? The ram speed on these cards is reported differently in different programs and works on multipliers. My 1080 Ti runs with +700 Memory which results in 6210 Mhz for memory in afterburner, which is actually 12420 Mhz GDDR5x speed.


Multiplying the 1389 that 3dmark was reporting by two, my bad

My memory offset was +50 or so on the pre-oc Strix

And yeah, I think I just didn't luck out on the silicon lottery / mine just had no extra to give, really. Oh well, it's still plenty fast


----------



## 2ndLastJedi

Link not working


----------



## kithylin

KenjiS said:


> Multiplying the 1389 that 3dmark was reporting by two, my bad
> 
> My memory offset was +50 or so on the pre-oc Strix
> 
> And yeah, i think i just didnt luck out on the silicon lottery slash mine just had no extra to give really. Oh well, its still plenty fast


Try and crank the ram a lot more. My 1080 Ti takes +700 ram and 2025 Mhz core clock on stock bios & stock voltage on a reference board.


----------



## 2ndLastJedi

KenjiS said:


> https://www.3dmark.com/3dm/38304677?
> 
> Just spent some more time tweaking a bit. New case has significantly better temps (I'm at 65 or so under load, stock profile) I cant get more than +75 core and +50 memory on my Strix 1080 Ti... Is 2025 Core/2778 Ram good or not.
> 
> Good news is it holds that speed a lot better due to better cooling


https://www.3dmark.com/compare/spy/8048753/spy/6128431
I have the dual slot Galax 1080Ti EXOC at +136 core and +400 memory keeping temps below 65c.
Quoted your post and score for comparison. Check out the poor little 7700K, lol


----------



## kithylin

2ndLastJedi said:


> https://www.3dmark.com/compare/spy/8048753/spy/6128431
> I have the dual slot Galax 1080Ti EXOC at +136 core and +400 memory keeping temps below 65c.
> Quoted you post and score for comparison . Check the poor little 7700k , lol


It's interesting to see the graphics score is mostly unchanged, just +2% GPU performance with the ryzen system. I wonder sometimes if upgrading to new stuff is even worth it.


----------



## ThrashZone

Hi,
As said, on this platform you get better results from GPU memory clocks.
Reduce the core clock from +130 and increase memory.
Core clock steps come in +13 MHz intervals.
If +130 lets you increase memory, reduce the core a bit more to +117 and push memory further until you get closer to +700... memory is where all the gravy is at.

You should also be using curves for core clocks to reduce any droop.
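The "+13 intervals" point above can be sketched quickly. This just rounds a requested core offset down to the nearest 13 MHz step; the step size is an assumption taken from the advice above, not something read out of the driver:

```python
# Pascal boost clocks move in roughly 13 MHz steps, so an arbitrary core
# offset effectively gets rounded down to the nearest usable bin.
STEP_MHZ = 13  # approximate bin size, per the advice above

def snap_offset(offset_mhz):
    """Round a requested core offset down to the nearest 13 MHz bin."""
    return (offset_mhz // STEP_MHZ) * STEP_MHZ

print(snap_offset(130))  # 130 (already a multiple of 13)
print(snap_offset(120))  # 117, the next bin below, matching the +117 suggestion
```

In other words, anything between +117 and +129 buys you nothing over +117, so you may as well spend that headroom on memory instead.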


----------



## 2ndLastJedi

ThrashZone said:


> Hi,
> As said, on this platform you get better results from GPU memory clocks.
> Reduce the core clock from +130 and increase memory.
> Core clock steps come in +13 MHz intervals.
> If +130 lets you increase memory, reduce the core a bit more to +117 and push memory further until you get closer to +700... memory is where all the gravy is at.
> 
> You should also be using curves for core clocks to reduce any droop.


It is with the curve; actually MSI AB's OC scanner was used here, lol, but it gave the same +136 I could get manually. I'm not going to push this card any more because I want it to last another 2 years, as it seems there's little reason to upgrade any time soon unless I want to spend $1800 AUD. I've had it since June 2017, and at least one more generation is my new goal.


----------



## Streetdragon

For me and both of my 1080 Ti FE cards, +590 on memory is the sweet spot. 1 MHz higher and performance drops. I think the timings get worse somewhere around there.


----------



## Cucumber

I tried the XOC bios last night, and after a bit of experimenting I found that I got worse performance in games with 100 more MHz on the core. Disappointing, to say the least. I ended up at 2075 core and 6177 mem using Afterburner. I didn't bother testing different memory speeds, as I reached the goal I had, which was rock-solid 60fps at 1440p in all areas of MHW with settings as close to maxed out as possible. From what I understand, you can sometimes gain performance with lower memory clocks? Is there maybe a post I have overlooked which explains all the little tricks and tips for overclocking the 1080 Ti / Pascal?


----------



## kithylin

Cucumber said:


> I tried the XOC bios last night, and after a bit of experimenting I found that I got worse performance in games with 100 more MHz on the core. Disappointing, to say the least. I ended up at 2075 core and 6177 mem using Afterburner. I didn't bother testing different memory speeds, as I reached the goal I had, which was rock-solid 60fps at 1440p in all areas of MHW with settings as close to maxed out as possible. From what I understand, you can sometimes gain performance with lower memory clocks? Is there maybe a post I have overlooked which explains all the little tricks and tips for overclocking the 1080 Ti / Pascal?


XOC is mainly for liquid nitrogen / overclocking competitions. Yes you get lower performance per clock with XOC bios but that is usually made up by being able to go +200 to +300 or more Mhz over what you could on any other bios, and the higher voltage and no power limit. It's not really for air cooled stuff at all and not really great for water either.


----------



## eeroo94

Any idea why my Fire Strike graphics score has dropped to 28k? I used to get around 31k before. All other benchmarks seem fine though. GPU usage seems to sit at only 95-96% constantly during Fire Strike.


----------



## UltraMega

eeroo94 said:


> Any idea why my Fire Strike graphics score has dropped to 28k? I used to get around 31k before. All other benchmarks seem fine though. GPU usage seems to sit at only 95-96% constantly during Fire Strike.


could be the power setting in the nvidia control panel. What CPU do you have?


----------



## eeroo94

UltraMega said:


> could be the power setting in the nvidia control panel. What CPU do you have?


9900k, I have all the power settings on max performance.

Here is my result from Fire Strike, I should be getting upwards of 31k graphics score with these clocks.

https://www.3dmark.com/3dm/38738397

For comparison my Time Spy score seems perfectly in line with others.

https://www.3dmark.com/3dm/38738254


----------



## philhalo66

What would be an average OC on these cards? I can't seem to get more than 60 MHz even with more voltage added.


----------



## kithylin

eeroo94 said:


> 9900k, I have all the power settings on max performance.
> 
> Here is my result from Fire Strike, I should be getting upwards of 31k graphics score with these clocks.
> 
> https://www.3dmark.com/3dm/38738397
> 
> For comparison my Time Spy score seems perfectly in line with others.
> 
> https://www.3dmark.com/3dm/38738254


Maybe your card needs repasting, and it's overheating and throttling itself? The 1080 Ti's are 2 years old now. That's about long enough to need the TIM replaced in most video cards.



philhalo66 said:


> What would be an average OC on these cards? I can't seem to get more than 60 MHz even with more voltage added.


You need to tell us the actual core MHz your card reaches when you play games, -NOT- the offset from Afterburner. And I'm referring to what the card sits at after gaming for a few minutes, not the initial peak clocks when you first start the game. After you play a while and the card heats up, it will drop its clocks to an average core MHz. By the way: +60 MHz means absolutely nothing on its own; the resulting core MHz from +60 will be completely different for every single 1080 Ti owner. Remember these cards use Nvidia Boost 3.0; they are already self-overclocking before you even touch Afterburner. 1800-1900 MHz core clock is pretty standard for an air-cooled 1080 Ti.
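A rough illustration of why an offset alone says nothing: the resulting clock depends on each card's own boost baseline. The baselines and the 13 MHz bin size below are hypothetical, just to show two cards landing on different clocks from the same +60:

```python
# Sketch: same offset, different cards, different resulting clocks.
STEP_MHZ = 13  # approximate Pascal boost bin size (assumption)

def estimated_clock(boost_baseline_mhz, offset_mhz):
    """Estimate the sustained clock as baseline + offset, snapped down to a bin."""
    return ((boost_baseline_mhz + offset_mhz) // STEP_MHZ) * STEP_MHZ

# Two hypothetical cards, same +60 offset, different results:
print(estimated_clock(1800, 60))  # 1859
print(estimated_clock(1936, 60))  # 1989
```

So "+60" on a card boosting to 1800 and "+60" on one boosting to 1936 are completely different overclocks, which is why the sustained in-game MHz is the number worth reporting.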


----------



## eeroo94

kithylin said:


> Maybe your card needs repasting, and it's overheating and throttling itself? The 1080 Ti's are 2 years old now. That's about long enough to need the TIM replaced in most video cards.


Nah, my temps are fine, below 50c. Could someone run Fire Strike and post a screenshot of this graph for comparison? It seems my GPU usage during the 2nd test never goes above 95% and can drop as low as 93%.


----------



## philhalo66

kithylin said:


> Maybe your card needs repasting, and it's overheating and throttling itself? The 1080 Ti's are 2 years old now. That's about long enough to need the TIM replaced in most video cards.
> 
> 
> You need to tell us the actual core MHz your card reaches when you play games, -NOT- the offset from Afterburner. And I'm referring to what the card sits at after gaming for a few minutes, not the initial peak clocks when you first start the game. After you play a while and the card heats up, it will drop its clocks to an average core MHz. By the way: +60 MHz means absolutely nothing on its own; the resulting core MHz from +60 will be completely different for every single 1080 Ti owner. Remember these cards use Nvidia Boost 3.0; they are already self-overclocking before you even touch Afterburner. 1800-1900 MHz core clock is pretty standard for an air-cooled 1080 Ti.


Sorry, I'm not used to talking about offsets and whatnot. With the OC it stays at 2000 MHz pretty much solid, with short dips to 1987 MHz. I'm not sure why it jumps like that; my 1080 never did that, it stayed at a solid 2088, never moving unless it got really hot.


----------



## yoadknux

eeroo94 said:


> Nah, my temps are fine, below 50c. Could someone run Fire Strike and post a screenshot of this graph for comparison? It seems my GPU usage during the 2nd test never goes above 95% and can drop as low as 93%.


I think it's normal, this is what I get:








https://www.3dmark.com/3dm/38751699?


----------



## eeroo94

yoadknux said:


> I think it's normal, this is what I get:
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.3dmark.com/3dm/38751699?


Yeah, that score is what I used to get now I'm stuck in 28-29k range. Some weird software/driver issue specific to Fire Strike since all my other benchmarks are normal.


----------



## kithylin

philhalo66 said:


> Sorry, I'm not used to talking about offsets and whatnot. With the OC it stays at 2000 MHz pretty much solid, with short dips to 1987 MHz. I'm not sure why it jumps like that; my 1080 never did that, it stayed at a solid 2088, never moving unless it got really hot.


That's probably because the 1080 Ti is a bigger card that runs hotter and uses a lot more power than the 1080. GTX 1080's normally run under 200 watts even overclocked, while most 1080 Ti's run around 250-375 watts, and 400+ watts with the XOC bios under water/chiller.


----------



## skupples

philhalo66 said:


> What would be an average OC on these cards? I can't seem to get more than 60 MHz even with more voltage added.


I usually slap on +200, +500 & fly.

this equates to just under 2100mhz..

i'm on water though

my two ref cards are running the bios from my top EVGA branded card.

on a side note - 

the only weird thing still going on with my system is a random partial screen blink at 2D clocks. It looks like a menu in the background comes forward for 1 ms, then vanishes. Rarely the same shape, or in the same spot.


----------



## Streetdragon

Have that too, maybe once a week. I always wondered why something blinks up for a moment.


----------



## Zfast4y0u

It's a common ''issue''. Some say it can be the DisplayPort cable, but I doubt it. It's not G-Sync; I have that off. It's beyond me what's causing it.


----------



## KedarWolf

This is with the latest driver using Ultra latency mode.


----------



## kithylin

KedarWolf said:


> This is with the latest driver using Ultra latency mode.


Could I get a link to the 3dmark result URL for that please?


----------



## KedarWolf

kithylin said:


> Could I get a link to the 3dmark result URL for that please?


https://www.3dmark.com/3dm/38879170


----------



## kithylin

KedarWolf said:


> https://www.3dmark.com/3dm/38879170


Thanks. I was curious about your OS and RAM speed; it's easier to just click that and read it all.


----------



## chibi

Does anyone know the stock thermal pad size for the air cooler?


----------



## kithylin

chibi said:


> Does anyone know the stock thermal pad size for the air cooler?


For the Nvidia Founder's Edition or.. which card exactly?


----------



## chibi

kithylin said:


> For the Nvidia Founder's Edition or.. which card exactly?



1080 TI / Titan Xp - NVIDIA FE please, thank you!


----------



## Lester Gan

Hi guys! I'm new here, can anyone help me?

I got a Zotac GTX 1080 Ti 11GB AMP Edition. My MSI Afterburner OC setup is:
120% power limit
+50 core clock MHz
+335 memory clock MHz
Fan speed 70%

I tried putting the core clock to +60, +70, +90, +100, +120 but the PC freezes after 3-4 minutes, sometimes seconds, when I use Kombustor at 90-93% GPU usage the whole time.
Same scenario at memory clock +350-500; it freezes.

Temperature stays at 45+, 75-83% heavy load when using Kombustor in a 15-minute run.

Hope this info helps you find the fix for this


----------



## kithylin

Lester Gan said:


> Hi guys! I'm new here, can anyone help me?
> 
> I got a Zotac gtx 1080ti 11gb amp edition , my msi burner oc setup is
> 120% Powerlimit
> +50 Core Clock Mhz
> +335 Memory Clock Mhz
> Fan Speed 70%
> 
> I tried putting core clock to 60,70,90,100,120 but PC freezes after 3/4 minutes sometimes seconds when i use kombustor 90-93% GPU usage all the time.
> same scenario at memory clock 350-500 it freezes
> 
> Temperature stays at 45+ , 75-83% heavy load when using kombustor in 15minutes run time.
> 
> Hope this info helps you to get the fix for this


Instead of telling us the offsets, could you please tell us the actual core MHz the card is running at? With how Nvidia boost works we have no idea what your card is doing when you tell us +60 or +100 MHz; that could have your card running at 2100 MHz or 1800 MHz depending on your setup. We need to know the actual MHz speed.

That aside, do watch the temperatures of your card. If you're hitting 85c and the clock speed is being reduced, then you're already at the limit of the card's cooling and probably aren't going to be able to overclock anything at all.
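Since boost offsets don't tell the whole story, a quick way to see what the card is actually doing is to poll nvidia-smi while the load runs. A minimal sketch (the query fields are standard nvidia-smi options; the `read_gpu_status` helper and the sample line are just for illustration):

```python
import subprocess

QUERY = "clocks.sm,temperature.gpu,power.draw"

def read_gpu_status(sample=None):
    """Return (core_mhz, temp_c, watts) from nvidia-smi CSV output.

    Pass `sample` to parse a captured line instead of calling
    nvidia-smi (handy on a machine without an NVIDIA GPU).
    """
    if sample is None:
        sample = subprocess.check_output(
            ["nvidia-smi", f"--query-gpu={QUERY}",
             "--format=csv,noheader,nounits"],
            text=True,
        ).strip()
    core, temp, power = (field.strip() for field in sample.split(","))
    return int(core), int(temp), float(power)

# Parsing a captured line, e.g. taken while Kombustor is running:
print(read_gpu_status(sample="1987, 83, 263.42"))  # (1987, 83, 263.42)
```

Run it in a loop every second or two during a stress test and you'll see the real clock the card settles at, instead of guessing from the offset.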


----------



## Lester Gan

kithylin said:


> Instead of telling us the offsets, could you please tell us the actual core Mhz the card is running at? With how nvidia boost works we have no idea what your card is doing when you tell us +60 or +100 Mhz. That could have your card running at 2100 mhz or 1800 mhz depending on your setup. Instead we need to know the actual Mhz speed.
> 
> That aside, do look at the tempatures of your card. If you're hitting 85c and the clock speed is being reduced then you're already at the limit of the card's cooling and probably are not going to be able to overclock anything at all.


Max core clock is 1933-1980mhz @ 93-97% GPU load
Max memory clock is 5833 mhz

I only reached 83 degrees once I turned off the aircon in my room.
With the aircon on, I'm getting 70ish+ degrees @ max GPU load.


----------



## kevindd992002

It's been years since I overclocked my system (GTX 670 era) and I now have the time to dive into this again, but I don't know where to start. Is there some sort of guide to overclocking the EVGA GTX 1080 Ti with an ASUS Maximus X Code board? I have a custom watercooling loop with this system.


----------



## kithylin

Lester Gan said:


> Max core clock is 1933-1980mhz @ 93-97% GPU load
> Max memory clock is 5833 mhz
> 
> I only reached 83 degrees since I turned off my aircon in my room.
> with Aircon on, I'm getting 70ish+ degrees @ max GPU load.


That sounds to be about the typical maximum stable overclock range for an air-cooled 1080 Ti, from my personal experience. These cards throttle based on temperature more than anything else. If you can get the core temps down 15 degrees somehow, that may let you overclock more.



kevindd992002 said:


> It's been years since I overclocked my system (GTX 670 era) and I now have the time to dive into this again but don't know where to start. Is there some sort of a guide in overclocking the EVGA GTX 1080Ti with an ASUS Maximus X Code board? I have a custom watercooling loop with this system.


Is your 1080 Ti in the loop?


----------



## kevindd992002

kithylin said:


> That sounds to be about the typical maximum stable overclock range for a air cooled 1080 Ti from my personal experience. These cards are based on temperature more than anything else. If you can get the core temps down -15 degrees some how that may let you overclock more.
> 
> 
> 
> Is your 1080 Ti in the loop?


Yes it is. For reference, you can check my system's pics in my sig. But yeah, it's basically a CPU and GPU loop.

Sent from my SM-G950F using Tapatalk


----------



## kithylin

kevindd992002 said:


> Yes it is. For reference, you can check my system's pics in my sig. But yeah, it'a basically a CPU and a GPU loop.
> 
> Sent from my SM-G950F using Tapatalk


Not a whole lot has changed with overclocking GPUs, it's the same routine: increase it a little bit in Afterburner, overclock a little more, rinse and repeat until it crashes, then back it down about 3 ticks and enjoy. The only thing that's really changed since the 600 series is we can't write in custom clocks with a manually modified bios anymore; it's Afterburner only now.

And unlike the older cards, the new cards boost based on temperature. So you want to remember to keep vsync/gsync on, or (if you don't like those) at least use a frame rate limiter (Nvidia Profile Inspector, search it in Google) to limit FPS to your monitor's maximum refresh rate. That should keep your temps in check. With the older 600 series cards you could set whatever clocks you wanted and just run it, and the cards didn't care how hot they ran. Now with the 1000 series the cards reduce clocks by about -15 MHz for every +10c on the core temps, so the cooler you can get and keep the card, the better you'll do.

I have a large 4-radiator custom loop and my 1080 Ti maxes out at 48c - 50c core temps, and I only have 2012 - 2025 MHz stable on mine; your mileage may vary. With your setup (I saw your system pictures) you can probably expect about the same clocks mine runs at if you can keep the temps around 50c on the GPU. And don't try anything crazy with it like 4K (at all) or 1440p @ 100+ Hz/FPS and you'll be fine. A single 1080 Ti isn't going to be that great at either of those tasks, and that would just heat it up unusually high trying.
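That rule of thumb (roughly -15 MHz for every +10c on the core) can be written as a toy model. The numbers here are just the rough figures from this post, not NVIDIA's actual GPU Boost algorithm, and the 40c reference point is an assumption:

```python
def boosted_clock(base_boost_mhz, temp_c, ref_temp_c=40,
                  step_mhz=15, step_c=10):
    """Approximate the temperature down-binning on Pascal.

    Assumes the card holds base_boost_mhz at ref_temp_c or below,
    and sheds ~step_mhz for each full step_c degrees above that
    (a forum rule of thumb, not the real boost table).
    """
    bins_over = max(0, (temp_c - ref_temp_c) // step_c)
    return base_boost_mhz - bins_over * step_mhz

# A card holding 2050 MHz at 40c would land near:
for t in (40, 50, 60, 70, 80):
    print(t, boosted_clock(2050, t))
# 40 -> 2050, 50 -> 2035, 60 -> 2020, 70 -> 2005, 80 -> 1990
```

Which is why shaving 15-20c off the core with water tends to buy back a couple of boost bins.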


----------



## Lester Gan

kithylin said:


> That sounds to be about the typical maximum stable overclock range for a air cooled 1080 Ti from my personal experience. These cards are based on temperature more than anything else. If you can get the core temps down -15 degrees some how that may let you overclock more.


I'm on an NZXT liquid cooling system. Any thoughts on getting this to a memory clock of +500 and +100MHz on the core?


----------



## kevindd992002

kithylin said:


> Not a whole lot has changed with overclocking gpu's, it's the same thing: Increase it a little bit in afterburner, overclock a little more, wash and repeat until it crashes then back it down about -3 ticks and enjoy. The only thing that's really changed since the 600 series is we can't write in the custom clocks with a manually modified custom bios anymore. It's afterburner only now. And unlike the older cards the new cards overclock based on temperatures. So you want to remember to keep vsync/gsync on or (if you don't like those) at least use a frame rate limiter (nvidia profile inspector.. search it in google) to limit FPS to your monitor's maximum refresh rate. This should try and keep your temps in check. With the older 600 series video cards you could set what ever clocks you want and just run it and the cards didn't care how hot they run. Now with the 1000 series the cards reduce clocks by about -15 Mhz every +10c on the core temps. So the cooler you can get and keep the card the better you'll do.
> 
> I have a large 4-radiator custom loop and my 1080 Ti maxes out at 48c - 50c core temps and I only have 2012 - 2025 Mhz stable on mine, your milage may vary. With your setup (I saw your system pictures) you can probably expect about the same clocks mine runs at if you can keep the temps around 50c on the gpu. And don't try anything crazy with it like 4K (at all) or 1440p @ 100+ Hz/FPS and you'll be fine with it. A single 1080 Ti isn't going to be that great at either of those tasks and that would just heat it up unusually high trying to do it.


I see. And I'm assuming there's no way to change the power limit of 1000 series GPUs either, right?

I can say my loop has better-than-average-quality components, but the limiting factor is my ambient temp (33C or so) since I live in a tropical country. So I'm sure you'll have better OC results than me, but we never know. From what I remember, these 1080 Tis have only a little wiggle room for overclocking. I thought 1080 Tis were good for 1440p high FPS? High-FPS gaming (first-person shooters) is my target, as I use a 144Hz G-Sync monitor.


----------



## kithylin

kevindd992002 said:


> I see. And I'm assuming no way to change the power limit of 1000 series GPU's too, right?


Some people have reported success cross-flashing their 1080 Ti's to a different vendor's bios that may have a higher power limit. But it's not guaranteed to work and you may lose some functionality (some display output ports may stop working).

Here's a link to the thread about cross-flashing 1080 Ti's: https://www.overclock.net/forum/69-nvidia/1624521-official-nvidia-gtx-1080-ti-owner-s-club.html

There is an XOC bios for the 1080 Ti that has no power limit, but it's sort of dangerous (for example, it will hard-reset the driver on you and require a full reboot of the PC if your GPU reaches 65c core temps) and has slightly slower performance than all the stock bioses. It's mainly for liquid nitrogen competitions.



kevindd992002 said:


> I can say my loop has more-than-average-quality components but the limiting factor is my ambient temp (33C or so) as I live in a tropical country. So I'm sure that you'll have better OC results than me, but we never know. From what I remember, these 1080Ti's have only a little wiggle room for overclocking. I thought 1080Ti's are good for 1440p high FPS? High FPS gaming (first-person shooters) is my target as I use a 144Hz GSync monitor.


It depends on your processor. If you want really high FPS at 1440p on a single 1080 Ti you'll probably need something like a 9900K @ 5.0 - 5.2 GHz with fast ram (around 3600 MHz DDR4 @ CL15).
Look up some 1440p gaming benchmarks for the 1080 Ti online. High-FPS 1440p on a 1080 Ti can be done if you're willing to sacrifice something, like AA turned down a few notches or shadows off.

I live in a hot climate too, north central Texas. Outside ambient temps usually run 34c - 37c for daily highs, and we're currently only cooling down to around 25c - 28c for our overnight lows. I have an 8000 BTU window A/C in the room where the tower with my 1080 Ti is, and I let it cycle the ambient room temperature between 29c on the hot side when the A/C comes on and 23c on the cold side when it cycles off. The only real difference is I have a bigger loop than you: 4 radiators (360+240+240+120), all of them push+pull with static pressure fans, and 2 pumps (MCP-655 + MCP-355). But I have other stuff in the loop too: ram (4x8GB / 32GB DDR3-2800 @ 1.5v), processor (3570K @ 5.0 GHz @ 1.632v daily-driver OC, delidded, no IHS, bare chip on block), and the big 32-phase VRM section on this board is looped in as well.

It depends on what game I play. When I was playing Control on PC my 1080 Ti was running @ 98% - 100% usage constantly and it would average between 43c - 50c core temps depending on which side of the cycle the window A/C was at. Some games don't max out the GPU: when I play World of Warcraft Classic from time to time the 1080 Ti runs around 20% and clocks down to around 1200 MHz core, while ARK: Survival Evolved will have the card clock up to around 2012 MHz but sometimes down-clock a bit. My monitor is currently 1080p @ 75Hz and I overclocked it to 80Hz, so my target is 1080p/80 at the moment. I want to get a second 1080 Ti and aim for 1080p @ 144+ Hz, maybe 180-200 Hz.

From personal experience and the history of this thread, a setup like yours with a CLC'd 1080 Ti should average around my overclock, probably about 2012 - 2025 MHz core speed under load. And mine runs +700 memory. I didn't do anything special with overclocking.

Remember to play with the curve in afterburner instead of just the offset (for core speed). Open afterburner and press Control-F and access the curve.

I would suggest, however, putting your core speed at +0, stock, and trying to overclock the ram first. Once you find maximum ram stability, -THEN- try to OC the core. That's what a lot of us have done with our 1080 Ti's and it seems to work best.
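The usual routine (raise the offset a small step at a time until the stress test fails, then back off a few ticks) can be sketched as a loop. Here `is_stable` is a stand-in for whatever stress run you use (Kombustor, a benchmark pass, a game session), and the 13 MHz tick size is the usual Pascal boost-bin granularity; the whole helper is illustrative, not a real tool:

```python
def find_stable_offset(is_stable, start=0, tick=13,
                       max_offset=260, backoff_ticks=3):
    """Step the offset up one tick at a time until the stress test
    fails, then back off a few ticks, per the procedure above.

    is_stable(offset) -> bool stands in for a real stress run.
    """
    offset = start
    while offset + tick <= max_offset and is_stable(offset + tick):
        offset += tick
    return max(start, offset - backoff_ticks * tick)

# With a fake card that crashes above a +100 MHz offset:
print(find_stable_offset(lambda off: off <= 100))  # 52
```

Do one pass for memory with the core at +0, lock that in, then a second pass for the core, matching the ram-first order suggested above.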


----------



## kevindd992002

kithylin said:


> Some people have reported success cross-flashing their 1080 Ti's to a different bios of another vendor that may have a higher power limit and getting it to work. But it's not guaranteed to work and you may possibly lose some functionality )(some display output ports may stop working).
> 
> Here's a link to the thread about cross-flashing 1080 Ti's: https://www.overclock.net/forum/69-nvidia/1624521-official-nvidia-gtx-1080-ti-owner-s-club.html
> 
> There is an XOC bios for 1080 Ti that has no power limit.. but it's sort of dangerous (it will hard reset the driver on you and require a full reboot of the PC if your gpu reaches 65c core temps for example) and has slightly slower performance vs all other stock bios's. It's mainly for liquid nitrogen competitions.
> 
> 
> It depends on your processor. If you want really high FPS for 1440p in a single 1080 Ti you'll probably need something like a 9900K @ 5.0 - 5.2 Ghz with fast ram (around 3600 Mhz DDR4 @ CL15).
> Look up some reviews online for 1440p gaming benchmarks for the 1080 Ti. high FPS 1440p on a 1080 Ti can be done if you want to sacrifice something like AA turned down a few notches or shadows off.
> 
> I live in a hot climate too, north central Texas. Outside ambient temps are usually range in the 34c - 37c range daily highs, and we're currently only cooling down to around 25c - 28c overnight for our overnight lows. I have a 8000 BTU window A/C in the room where the computer tower with my 1080 Ti is and I let it cycle the ambient room temperature in there between 29c on the hot side when A/C comes on and 23c on the cold side when the a/c cycles off. The only real difference is I have a bigger loop than you, 4 radiators (360+240+240+120) and all of them are push+pull with static pressure fans, and 2 pumps (MCP-655 + MCP-355). But I have other stuff in the loop too. Ram (4x8GB / 32GB DDR3-2800 @ 1.5v), processor (3570K @ 5.0 Ghz @ 1.632v daily-driver OC, Delidded, no IHS, bare chip on block), and big 32-phase VRM section on this board also looped into it too.
> 
> It depends on what game I play. When I was playing Control on PC I was seeing my 1080 Ti running @ 98% - 100% usage constantly and it would average between 43c - 50c core temps depending on which side of the cycle the window a/c was currently at. Some games don't max out the gpu. Like when I play world of warcraft classic from time to time the 1080 Ti runs around like 20% and clocks down to around 1200 Mhz core. Or ARK: Survival Evolved will have the card clock up around 2012 Mhz from time to time but sometimes it will down-clock a bit. My monitor is currently 1080p @ 75hz and I overclocked it to 80hz so my target is 1080p/80 at the moment. I want to get a second 1080 Ti and aim for 1080p @ 144+ Hz, maybe 180-200 Hz.
> 
> From what I've seen with personal experience and the history of this thread, a setup like yours with a CLC 1080 Ti you should average around my overclock probably, about 2012 - 2025 Mhz core speed under load. And mine runs +700 memory. I didn't do anything special with overclocking.
> 
> Remember to play with the curve in afterburner instead of just the offset (for core speed). Open afterburner and press Control-F and access the curve.
> 
> I would suggest however to put your core speed at +0, stock, and try to overclock the ram first. Once you get the maximum ram stability, -THEN- try to OC the core. That's what a lot of us have done with our 1080 Ti's and it seems to work the best.


Ok, that makes sense. Is there a simple guide on how to use that Afterburner curve effectively? I remember that option from when I was still using a 670, but all I did was play with the offsets.


----------



## JollyGreenJoint

Hey'o  Have a few questions. How do I turn off the freaking LEDs!? I have a 1080 Ti Kingpin, using Precision X1 and EVGA's sync something-or-other, but there's still an annoying white LED strip at the front of the card that won't turn off  My other question: I see "kingpin fix bios" in the first post footer. What's it fixing? And where in the 1800-some pages is it located?  thx in advance )


----------



## kithylin

JollyGreenJoint said:


> Hey'o  Have a few questions. How do i turn off trhe freaking led's !? Have a 1080ti kingpin, using prescision x1 and evga sync something or other but theres still an annoying white led strip at the front of the card that wont turn off  My other question is i see in the first post footer "kingping fix bios" whats it fixing ? and where in the 1800 some pages is it located ?  thx in advance )


I may be wrong here but I'm pretty sure you can't turn it off. Most people buy those cards specifically because they want that lighting effect.


----------



## philhalo66

JollyGreenJoint said:


> Hey'o  Have a few questions. How do i turn off trhe freaking led's !? Have a 1080ti kingpin, using prescision x1 and evga sync something or other but theres still an annoying white led strip at the front of the card that wont turn off  My other question is i see in the first post footer "kingping fix bios" whats it fixing ? and where in the 1800 some pages is it located ?  thx in advance )


I'm fairly sure the only way to disable the LED nonsense is to physically unplug the cable for the LEDs. I did the same on my old 1080; I only left the EVGA logo lit up.


----------



## JollyGreenJoint

I remember it was a thing with the old green logos until someone finally came out with a .exe to shut it off; I was hoping for the same. Hate to leave the plug exposed in the air stream, oh well. LOL, definitely did not buy it for the LEDs  I thought most people bought this version for liquid nitrogen. I personally bought it for the cooler, and it was the same price as an SC Black B-stock from EVGA, and it came with all the original packaging (B-stock is just a brown box)  Really wanted to get a Titan X / Xp but could only find blowers, no ACX :/


----------



## kithylin

JollyGreenJoint said:


> i remember it was a thing with the old green logos, until someone finally came out with a .exe to shut it off was hoping for the same, hate to leave the plug exposed in the air stream, oh well. LOL def. did not buy for the leds  I thought most people bought this version for liquid nitrogen. I personally because of the cooler, and it was the same price as a sc black bstock from evga and it came with all the orginal packaging bstock is just a brown box  Really wanted to get a titan x / xp but couldnt find acx only blowers :/


When I went water cooled on my 1080 Ti and re-assembled it, I intentionally didn't re-connect the LED plug and just tucked it up inside the block housing. Worked for me.

Random: here's a picture of my 1080 Ti from December 2018, its first removal from the loop for cleaning after it had been installed for 16 months non-stop. I also removed that little metal "jet plate" in the middle of the block and it improved my overall loop flow rate a good bit, which is a big deal for me and my complicated loop; I've already had to go up to two pumps just to get enough flow rate for the loop to bleed itself. Also note: I don't use any dyes and just run straight distilled water in my loop with no additives. I use all similar metals (nickel, bronze or copper) and so I don't need any additives. 16 months of daily gaming on it and it's still 95% spotless, with some yellowing down in the corner from the o-ring.


----------



## skupples

"self bleed"? 

I'd still give it a shake every once in awhile during the first week. My MCP35x2 are currently only feeding two blocks n 2 480s, n I still gotta give the case a shake @ like 1.5-2.0 GPM.


----------



## kithylin

skupples said:


> "self bleed"?
> 
> I'd still give it a shake every once in awhile during the first week. My MCP35x2 are currently only feeding two blocks n 2 480s, n I still gotta give the case a shake @ 1.5-2GPM.


I went over the loop with a bright flashlight and looked into all the tubes and fittings, and there are no air bubbles anywhere in the loop. If you design a loop correctly it should automatically bleed all the air out on its own; that's how water loops are supposed to work. I'm not shaking and rocking my computer when it weighs 84 lbs / 38 kg.


----------



## skupples

ah, of course. we're all just doing it wrong. got it.  

i was referring to the little bits here and there that like to cling inside radiators, though it's likely more prevalent with thick boys. I suppose they do eventually work themselves out, especially with proper coolant; that's where those random bloops come from after the first day or two. 

what, you don't enjoy shaking a giant case?! It's so much fun to hear it creak and squeak 

i just rock my STH10 gently; it helps that it's in a room with carpet.


----------



## kithylin

skupples said:


> ah, of course. we're all just doing it wrong. got it.


I never meant to say anyone is doing it wrong. I merely meant that if a loop is set up correctly with a powerful enough pump, it should be able to bleed itself naturally over time; we shouldn't have to do anything other than look and possibly tap-tap-tap tubes to dislodge bubbles. When I first built my current loop with a single MCP-655 (which was upgraded by Performance-PCs and bored out internally to increase flow rates) I had to do the rocking back and forth and moving it at all angles and tapping tubing and all of that nonsense. Then I added a second pump, an MCP-355 with an upgraded acrylic top, to the loop beside the MCP-655, and now when I fill this loop I just go use it and all of the bubbles work out naturally over time without my having to do anything. I went through and checked all the tubes and fittings: nothing stuck anywhere. I did rock it around once after running it for 2 months and no bubbles moved, so it bled itself without being touched. So now I'm assuming that with a strong enough flow rate, this is how loops should work.



skupples said:


> i was referring to the little bits here and there that like to cling inside radiators, though its likely more prevalent with thick boys. I suppose they do eventually work themselves out, specially with proper coolant. that's where those random bloops come from after the first day or two.


As mentioned above, with my new pump setup I don't ever have that issue anymore. 



skupples said:


> what' you don't enjoy shaking a giant case?! It's so much fun to hear it creak and squeak
> 
> i just rock my sth10 gently, it helps that its in a room with carpet.


All hardwood floors in this house. I could use a rug but it's not the same. Also I guess I'm a weakling, but rocking my big heavy case around makes my arms sore. If I can just put a bigger pump setup in it, never have to screw with that, and just go game instead, that's a lot better for me.


----------



## JollyGreenJoint

That's cool, I'm glad it stayed spotless for that long. I'm just not a water cooling person; I prefer air, even if it's tedious to clean all the dust out every once in a while. I do fantasize about using one of the huge grow filters on an inline fan to supply my case with air and eliminate some case fans... hopefully I wouldn't have to dust EVER!  Anyhow, looks like I'll have to pull the LED cables. My next dilemma: the front of my case has to be extended 3 or more inches to make this 3-fan setup fit better :/ Right now it fits, but with the connectors being on the end of the card, it's basically smashed up against the front case fan


----------



## 8051

kithylin said:


> I went over the loop with a bright flash light and looked into all the tubes and fittings and there's no air bubbles anywhere in the loop. If you design a loop correctly it should automatically self-bleed all the air out normally. That's how water loops are supposed to work. I'm not shaking and rocking my computer when it weighs 84 lbs / 38 kg.


I thought my case was heavy at half that weight. How much weight does the watercooling (and water) equipment add?


----------



## skupples

kithylin said:


> I never meant to say anyone is doing it wrong. I merely meant to state that if a loop is set up correctly with a powerful enough pump it should be able to bleed it's self naturally over time. We shouldn't have to do anything to it other than look and possibly tap tap tap tubes to dislodge bubbles. When I first built my current loop with a single MCP-655 (which was upgraded by performance PCS and bored out internally to increase flow rates) I had to do the rocking back and forth and moving it at all angles and tapping tubing and all of that nonsense. Then I added a second pump, a MCP-355 with upgraded acrylic top on it to the loop beside the MCP-655 and now when I fill this loop I just go use it and all of the bubbles all work out naturally over time without having to do anything. I went through and checked all the tubes and fittings, nothing stuck anywhere. I did rock it around once after running it for 2 months and no bubbles moved around so it bled it's self without having to touch it. So now I'm assuming that with strong enough flow rate, this is how loops should work.
> 
> 
> As mentioned above with my new pump setup I don't ever have that issue anymore.
> 
> 
> All hardwood floors in this house. I could use a rug but it's not the same. Also I guess I'm a weakling but rocking my big heavy case around makes my arms sore. If I can just put a bigger pump setup in it and never have to screw with that and just go game instead, lots better for me.


What pump did you go with? Dual D5s, or something more along the lines of AquaComputer's big-boy pumps? Hopefully I just won a bid on a dual D5 kit to replace my ANCIENT MCP35x2. I don't remember there being many dual D5 options back in the day; in fact, those of us getting 35x2s got flack from folks like B-Neg. typical. 



8051 said:


> I thought my case was heavy at half that weight. How much weight does the watercooling (and water) equipment add?


that's a good question.. it gets crazy, man. Moving these things is a major pain in the ass. I forgot to get it in my U-Haul last time, so the STH10 ended up in the trunk of my Civic. When all was said and done it looked like big momma was hiding in the back of my car.  

it adds up quick. BP fittings are heavy as hell, especially the 1/2 rotary soft tube compression. Fat-boy radiators are heavy as hell. Dozens of fans, heavy as hell. None of it's light XD 
idk about hard tube fittings, but soft tube 1/2 3/4 fittings are heavy as hell. The tiny little Rubbermaid thing I use to store my old fittings weighs like 10lb.

in fact, at least with giant cases, I wouldn't go hard line if you change parts often, or see yourself moving in the near future. Small hard-line cases can be easily packed like precious goods, but systems of this magnitude would need to be packaged like a piano for proper transport (wrapped in moving blankets, with a wood frame built around it, then boxed in with plywood).


----------



## kithylin

skupples said:


> What pump did you go with? Dual D5s, or something more along the lines of AquaComputer's big boy pumps? Hopefully I just won a bid on a dual d5 kit to replace my ANCIENT MCP35x2. I don't remember there being many dual d5 options back in the day, in fact those of us getting 35x2s got flack from folks like B-Neg. typical.


I literally said it in the post you quoted: I use a "big boy" main pump, a Swiftech MCP-655 that's been bored out internally by Performance-PCs staff to increase flow rates. And yes, I have the dial cranked to "5" at all times. My secondary booster pump is an older Swiftech MCP-355 with an upgraded performance acrylic top a friend sent me for free.

Here's a little video overview of my system from the last rebuild when I made the big 4-rad loop back in december 2018:





Don't expect youtube video quality. I'm a noob at video and used my 2013 phone for this.

This is my big beast with the 1080 Ti and the 32-phase-VRM motherboard and everything. This is not your fancy-schmancy gold chrome bling-bling type of water cooling build; this is a "maximum overclocking performance and nothing else matters" type of build. It runs a 3570K @ 5.0 GHz daily-driver currently, and a 32GB set of Corsair Vengeance DDR3-2800-CL12 ram @ 2500 MHz currently. If I can ever get this credit card paid off I want to put a second CLC'd 1080 Ti and a 3770K in it, and (in this special board) most likely try to run 5.5 - 5.7 GHz daily-driver with that, and try to get the ram up to 3000 MHz CL14. Also, if you wonder why I have an old Creative X-Fi card and a GTX 780 in it: this system dual-boots XP-32 along with Win7-x64 and doubles as my retro gaming rig too.


----------



## jura11

8051 said:


> I thought my case was heavy at half that weight. How much weight does the watercooling (and water) equipment add?


Hi there 

How much water cooling adds to the overall weight depends on the radiators you're using, and the waterblock weight too (some waterblocks are really heavy, like 1250g). Usually my loop in the Phanteks Enthoo Primo would take up to 1-1.25L of coolant; that's with a 360mm 60mm-thick radiator on top, a 240mm 60mm-thick radiator on the bottom, 3 GPUs, a CPU block and a 150ml reservoir, and in this case I have 8*3TB HDDs plus 2*SSDs mounted.

That case has been very heavy (35-40kg). My current one is a Caselabs M8 with pedestal, and there I have 4*360mm radiators plus a MO-RA3 360mm, 10 SATA HDDs plus 3 SSDs, and I'm running a 4-GPU setup; picking up this case is simply not possible. Yes, you can try, but you're risking a broken back.
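To put rough numbers on it, water is about 1kg per litre and the dry parts add up fast. A quick tally in Python; the per-part weights below are ballpark assumptions (the 1250g block figure is from my post, the rest are guesses), not measurements:

```python
# Ballpark dry weights in grams (assumptions, not measurements)
PART_WEIGHTS = {
    "rad_360_60mm": 1800,
    "rad_240_60mm": 1300,
    "gpu_block": 1250,        # the "really heavy" block example above
    "cpu_block": 300,
    "fan_120mm": 180,
    "pump_d5_with_top": 600,
}

def loop_weight_kg(parts, coolant_litres):
    """Total added weight: component dry weights plus coolant
    (water is ~1 kg per litre)."""
    dry_g = sum(PART_WEIGHTS[name] * count for name, count in parts.items())
    return dry_g / 1000 + coolant_litres * 1.0

# Roughly the Enthoo Primo loop described above (1x360 + 1x240 rads,
# 3 GPU blocks, CPU block, 5 fans, one pump, 1.25 L of coolant):
parts = {"rad_360_60mm": 1, "rad_240_60mm": 1, "gpu_block": 3,
         "cpu_block": 1, "fan_120mm": 5, "pump_d5_with_top": 1}
print(round(loop_weight_kg(parts, 1.25), 1))  # prints 9.9
```

So the loop alone can add close to 10kg before you count the HDDs, PSU and the case itself, which is how these builds end up at 35-40kg.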

Hope this helps 

Thanks, Jura


----------



## jura11

skupples said:


> What pump did you go with? Dual D5s, or something more along the lines of AquaComputer's big boy pumps? Hopefully I just won a bid on a dual d5 kit to replace my ANCIENT MCP35x2. I don't remember there being many dual d5 options back in the day, in fact those of us getting 35x2s got flack from folks like B-Neg. typical.
> 
> 
> 
> that's a good question.. it gets crazy, man. Moving these things is a major pain in the ass. I forgot to get it in my U-Haul last time, so the STH10 ended up in the trunk of my Civic. When all was said & done it looked like big momma was hiding in the back of my car.
> 
> it adds up quick. BP fittings are heavy as hell, especially the 1/2 rotary soft tube compression. Fat-boy radiators are heavy as hell. Dozens of fans, heavy as hell. None of it's light XD
> idk about hard tube fittings, but soft tube 1/2 3/4 fittings are heavy as hell. The tiny little Rubbermaid thing I use to store my old fittings weighs like 10lb.
> 
> in fact, at least with giant cases, I wouldn't go hard line if you change parts often, or see yourself moving in the near future. Small hard line cases can be easily packed like precious goods, but systems of this magnitude would need to be packaged like a piano for proper transport. (wrapped in moving blankets, with a wood frame built around it, then boxed in with plywood)


Hi there 

Personally I'm running a 3-pump setup (XSPC D5 Vario with dual Laing DDC 18W) for the 4*360mm radiators plus MO-RA3 360mm, and with that setup I'm currently getting 135-140 LPH; that's with a 4*GPU setup and an Aquacomputer Kryos NEXT Vision CPU waterblock 

The MCP50X is a pump I considered too, but I couldn't find one over here in the UK, and with these dual pumps my flow would be better

Regarding the D5 pump, I tried running dual D5 pumps and with that setup my flow wasn't better than with the current one, and my DDC pumps are mostly quiet at full speed, which is how I run all my pumps 

If your loop is larger then running multiple pumps would help 

The STH10 or TH10 is a case I wanted to get as well but couldn't find used; my current Caselabs M8 with pedestal is a great case too, but the Phanteks Enthoo Primo is still my favourite case and can be found used for a reasonable price 

I usually clean my whole loop every few weeks, and taking the case apart is a pain; with all the HDDs etc. in it I just can't pick it up 

Hope this helps 

Thanks, Jura
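Since flow rates get quoted in LPH here and in GPM elsewhere in the thread, a quick unit conversion helps compare the figures (this is plain arithmetic, assuming US gallons):

```python
# Convert loop flow between litres/hour and US gallons/minute, to compare
# the LPH figures quoted above against the common ~1 GPM rule of thumb.

L_PER_US_GAL = 3.785411784  # litres per US gallon

def lph_to_gpm(lph):
    return lph / L_PER_US_GAL / 60.0

def gpm_to_lph(gpm):
    return gpm * L_PER_US_GAL * 60.0

print(round(lph_to_gpm(140), 2))  # a 140 LPH loop is ~0.62 GPM
print(round(gpm_to_lph(1.0)))     # 1 GPM is ~227 LPH
```

So the 135-140 LPH quoted above is a bit over 0.6 GPM, comfortably in the range where most blocks perform close to their best.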


----------



## neurotix

*My too-long comments and 'contribution' -_-*

forget it


----------



## Streetdragon

A little bit of 1080TI love
https://www.3dmark.com/spy/8708488

Best score so far. With 1.091V 2100Mhz stable(Still!) nice nice^^

Argh only place 4 with my rig^^


----------



## skupples

kithylin said:


> I literally said it in the post you quoted. I use a "big boy" main pump, MCP-655 from Swiftech that's been bored out internally by Performance-PCS staff to increase flow rates. And yes I have the dial cranked to "5" at all times. And my secondary booster pump is an older swiftech MCP355 with an upgraded performance acrylic top a friend sent me free.
> 
> Here's a little video overview of my system from the last rebuild when I made the big 4-rad loop back in december 2018:
> https://www.youtube.com/watch?v=yWrYGUoIdco
> 
> Don't expect youtube video quality. I'm a noob at video and used my 2013 phone for this.
> 
> This is my big beast with the 1080 Ti and the 32-phase-VRM motherboard and everything. This is not your fancy-schmancy gold chrome bling-bling type of water cooling build. This is a "maximum overclocking performance and nothing else matters" type of build. It currently runs a 3570K @ 5.0 GHz as a daily driver, and a 32GB set of Corsair Vengeance DDR3-2800-CL12 RAM @ 2500 MHz. If I can ever manage to get this credit card paid off I want to put a second CLC 1080 Ti and a 3770K in it and (in this special board) most likely try to run 5.5 - 5.7 GHz daily-driver with that. And try to get the RAM up to 3000 MHz CL14, probably. Also, if you wonder why I have an old Creative X-Fi card and a GTX 780 in it: this system also dual-boots XP-32 along with Win7-x64 and doubles as my retro gaming rig too.


thx, i'm usually on here when I should be asleep instead. cool to see PPC's modding D5/DDCs n such now (i'd be curious to know if this increases cavitation though). I figured you were going to say you have one of Aquacomputer's massive sump-pump devices. 

Hopefully I win this used dual d5 kit, it'll give me a perfect reason to split my loop. Then both can run @ idle & still achieve 1gpm.


Jura,

how're your pumps set up? 2x in a top and a 3rd elsewhere? If so, have you ever tried testing without the 3rd pump? Or maybe you're using XSPC's scary triple top with success.. unlike many before you  we have similar systems (minus a fast DDC) & I've always been able to achieve ~120 LPH, even when I had CPU, 2x GPU, south bridge, and mosfets blocked (x79). The other thing I've noticed is that DDCs HATE having 90s within the first few inches from the pump. I'd assume D5s exhibit similar issues.


----------



## jura11

skupples said:


> thx, i'm usually on here when I should be asleep instead. cool to see PPC's modding D5/DDCs n such now(i'd be curious to know if this increases cavitation though). I figured you were going to say you have one of Aquacomputer's massive sump pump devices.
> 
> Hopefully I win this used dual d5 kit, it'll give me a perfect reason to split my loop. Then both can run @ idle & still achieve 1gpm.
> 
> 
> Jura,
> 
> how're your pumps setup? 2x in a top and a 3rd else where? If so, have you ever tried testing without the 3rd pump? or maybe you're using XSPC's scary triple top with success.. unlike many before you  we have similar systems (minus a fast DDC) & I've always been able to achieve ~120 LPH, even when I had CPU, 2x GPU, south bridge, and mosfets blocked (x79) The other thing I've noticed is that DDC's HATE having 90s within the first few inches from the pump. I'd assume D5s exhibit similar issues.


Hi there 

Yes, my 2*DDC pumps are in the Aquacomputer DDC dual pump top; it's a very nicely made pump top and flow seems better than with normal pump tops. I've tried a few pump tops like the EK, Barrow or Bykski ones 

Previously I ran an EK 3.2 PWM Elite edition with a Barrow DDC 18W pump and my flow was similar or exactly the same; like you I used 90° fittings for that, and yes, I can confirm 90° fittings can restrict flow if you use them on the inlet or outlet of a pump

The XSPC D5 Vario pump is mounted with a Monsoon MMRS reservoir and the dual DDC pumps are in the pedestal, with no noise issues. I previously tried running only the XSPC D5 Vario with a single EK DDC 3.2 PWM Elite edition, with no issues, but after adding the MO-RA3 360mm radiator and a 4th GPU my flow dropped and I needed to get another pump 

In the Enthoo Primo, where I ran an EK XE360 radiator on top and a Mayhems Havoc 240mm 60mm-thick radiator on the bottom with a 3*GPU setup plus CPU, I used the EK 3.2 PWM Elite edition and my flow was in the 240-265 LPH range 

Hope this helps

Thanks, Jura


----------



## KedarWolf

https://www.evga.com/products/product.aspx?pn=11G-P4-6393-RX










EVGA 1080 Ti SC Black Edition for SLI.

It's a reference design card so I can throw an EKWB waterblock and backplate on it!!


----------



## kithylin

KedarWolf said:


> https://www.evga.com/products/product.aspx?pn=11G-P4-6393-RX
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EVGA 1080 Ti SC Black Edition for SLI.
> 
> It's a reference design card so I can throw an EKWB waterblock and backplate on it!!


That's the card I have, exactly. 

Here it is with the EK block on it:


----------



## KedarWolf

kithylin said:


> That's the card I have, exactly.
> 
> Here it is with the EK block on it:
> 
> I'm getting the acetal/nickel one, and with the Phoenix MLC QDC GPU module it's really cheap: waterblock and backplate, 121.98 USD on sale.


----------



## kithylin

KedarWolf said:


> kithylin said:
> 
> 
> 
> That's the card I have, exactly.
> 
> Here it is with the EK block on it:
> 
> I'm getting the acetal/nickel one, and with the Phoenix MLC QDC GPU module it's really cheap: waterblock and backplate, 121.98 USD on sale.
> 
> 
> 
> About 6 months ago I disassembled the EK block on mine. I took the plexi cover off and removed the metal "jet plate" in the center over the core. Temperatures of the GPU during gaming were completely unaffected; they didn't increase or decrease. But it did improve the flow rates in my loop a good bit. I don't know why they put that in there, but it adds a decent bit of restriction to the loop and doesn't seem to do anything functional.
Click to expand...


----------



## KedarWolf

kithylin said:


> KedarWolf said:
> 
> 
> 
> About 6 months ago I disassembled the EK block on mine. I took the plexi cover off and removed the metal "jet plate" in the center over the core. Temperatures of the GPU during gaming were completely unaffected; they didn't increase or decrease. But it did improve the flow rates in my loop a good bit. I don't know why they put that in there, but it adds a decent bit of restriction to the loop and doesn't seem to do anything functional.
> 
> 
> 
> I might do that on both my blocks, I have four quick disconnects for my two video cards and they already decrease the flow some.
> 
> But for the ease of assembling/disassembling my PC, it's worth it.
Click to expand...


----------



## skupples

i love my QDCs now that I've got it sorted. I placed an order for the largest series, & only focused on the correct mating... Turns out they're fudging huge, they barely fit in an STH10, for reference.

I'll have to completely redesign next time around to make it look proper.


----------



## 8051

Streetdragon said:


> A little bit of 1080TI love
> https://www.3dmark.com/spy/8708488
> 
> Best score so far. With 1.091V 2100Mhz stable(Still!) nice nice^^
> 
> Argh only place 4 with my rig^^


You have an excellent 1080Ti if you can get to 2100Mhz at such a low voltage. It's far better than mine.


----------



## kithylin

8051 said:


> You have an excellent 1080Ti if you can get to 2100Mhz at such a low voltage. It's far better than mine.


My 1080 Ti needed 1.200v to do 2100 MHz, for example.
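As a rough illustration of why the voltage needed for a given clock matters so much: dynamic power in CMOS scales roughly with f·V², so the same 2100 MHz costs noticeably more power (and heat) at 1.200 V than at 1.091 V. This is a back-of-the-envelope model only; it ignores leakage, VRM losses and board power limits:

```python
# Rough dynamic-power comparison for the same clock at two voltages,
# using the simplification P_dynamic ~ C * f * V^2 (leakage ignored).

def relative_power(f_mhz, volts, f_ref_mhz, v_ref):
    """Power relative to a reference operating point, assuming P ~ f * V^2."""
    return (f_mhz / f_ref_mhz) * (volts / v_ref) ** 2

# 2100 MHz @ 1.200 V vs 2100 MHz @ 1.091 V (the two cards in this thread)
print(round(relative_power(2100, 1.200, 2100, 1.091), 2))  # -> 1.21
```

So a card needing 1.200 V for 2100 MHz draws on the order of 20% more power than one doing it at 1.091 V, which is a big part of why the lower-voltage chip is the better bin.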


----------



## VeritronX

KedarWolf said:


> kithylin said:
> 
> 
> 
> That's the card I have, exactly.
> 
> Here it is with the EK block on it:
> 
> I'm getting the acetal/nickel one, and with the Phoenix MLC QDC GPU module it's really cheap: waterblock and backplate, 121.98 USD on sale.
> 
> 
> 
> Keep in mind you need EK's geforce fe block not the 1080ti block because you need the cutout for the dvi ports that aren't installed on the reference model.
Click to expand...


----------



## KedarWolf

VeritronX said:


> KedarWolf said:
> 
> 
> 
> Keep in mind you need EK's geforce fe block not the 1080ti block because you need the cutout for the dvi ports that aren't installed on the reference model.
> 
> 
> 
> Yes, I already checked the compatibility list and know I need the FE block. TY though; it would help someone who didn't know.
> 
> And it's the FE block Phoenix module that's on sale too!!
Click to expand...


----------



## VeritronX

I ran across that a few years ago, I have the same card and I bought the nickel plexi fe block and nickel backplate for it.. they're still sitting in their boxes because the card runs so well as it was assembled with just a slight undervolt and a custom fan curve.

It sits at 1936-1950mhz core 993mv and +500 mem and almost never hits 70C or gets loud enough to be heard over 1500rpm 120mm vardars.


----------



## 8051

kithylin said:


> My 1080 Ti needed 1.200v to do 2100 Mhz for example.


I'm slightly luckier than you; I can get 2115 at 1.131V. I have to say I really admire your approach to building a rig -- massively overclocked last-gen parts.


----------



## kithylin

8051 said:


> I'm slightly luckier than you I can get 2115 at 1.131V. I have to say I really admire your approach in building a rig -- massively overclocked last gen. parts.


I'm surprised anyone even found my thread on that.  Yes, I'm running old parts OC'd out the butt as a daily driver. I have full intentions of setting up a 3770K and trying to clock it to 5.5 GHz minimum for daily-driver use, bare chip delidded direct to the waterblock with the big loop, while I wait around for Ryzen 4000 series Threadripper to release in a few years. I have a set of RAM I'm looking to pair with it to stretch its legs later too: Corsair Vengeance 32GB (4x8GB) DDR2-2800 (XMP), and I'm hopeful to get it to 3000 - 3200 MHz. I currently have it mated to an EK water-cooled RAM block that was originally mated to an old set of Corsair Dominator GT. I have the Vengeance set chemically bonded to the heatsinks for the water kit via resin, so it's not just thermal pads.


----------



## 8051

kithylin said:


> I'm surprised anyone even found my thread on that.  Yes I'm running old parts OC'd out the butt daily-driver. I have full intentions of setting up a 3770K and try to clock it to 5.5 Ghz minimum for daily-driver, bare chip delidded direct to waterblock with big loop while I wait around for Ryzen 4000 series threadripper to release in a few years. I have a set of ram I'm looking to pair with it to stretch it's legs later too. Corsair Vengeance 32GB (4x8GB) DDR2-2800 (XMP), and I'm hopeful to try to get it to 3000 - 3200 Mhz. I currently have it mated to an EK water cooled ram block that originally was mated to an old set of Corsair Dominator GT. I have the Vengeance set chemically bonded to the heatsinks for the water kit via resin so it's not just thermal pads.


Do you have a thread where I can follow the progress on this build, kithylin? DDR3 is what you mean, right? DDR3 to 3000MHz or 3200MHz! Would that be faster than DDR4 at those speeds? I only ask because I thought DDR3 had lower latencies than DDR4.


----------



## kithylin

8051 said:


> Do you have a thread where I can follow the progress on this build kithylin? DDR3 is what you mean right? DDR3 to 3000Mhz. or 3200Mhz.! Would that be faster than DDR4 at those speeds? I only ask because I thought DDR3 had lower latencies than DDR4.


I meant DDR3, yes; typo. Supposedly this kit I have (from some results I've found on hwbot.org with the same kit) should be capable of 3200 MHz @ 14-14-14. I don't know, but I don't think there is a lot of DDR4 that can do that. I had a thread.. but no one noticed it. I'll PM you about it. We're veering off topic here.
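On the DDR3 vs DDR4 latency question above: what matters for comparing kits at different speeds is absolute first-word latency in nanoseconds, which falls out of the CAS latency and transfer rate directly (the I/O clock is half the transfer rate, so one CAS cycle lasts 2000/MT_s ns). The DDR4 example kits below are illustrative, not from this thread:

```python
# Absolute CAS latency in nanoseconds from data rate (MT/s) and CAS cycles.
# The memory I/O clock is half the transfer rate, so one cycle = 2000/MT_s ns.

def cas_ns(mt_s, cl):
    return 2000.0 * cl / mt_s

print(round(cas_ns(3200, 14), 2))  # DDR3-3200 CL14 -> 8.75 ns
print(round(cas_ns(3200, 16), 2))  # a typical DDR4-3200 CL16 kit -> 10.0 ns
print(round(cas_ns(4133, 17), 2))  # DDR4-4133 CL17 -> ~8.23 ns
```

So DDR3-3200 CL14 really would beat common DDR4-3200 kits on absolute latency, and only fast, tight DDR4 (4000+ at low CL) pulls ahead.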


----------



## neurotix

*Hey all*

Hey all, 












I recently got 8th place in the Heaven top 30 thread (at 274 fps), with my CPU @ 4475MHz on the first Core Complex, memory at 3800MHz/1900 fclk and my 1080tis @ 2025/5899 with Kboost on in Precision XOC, stock voltage, 100% fan and side panel/fan filters off. I'm trying to beat that 279 fps and get it as high as possible to beat some people at Heaven

My cards are 6696 models and my BIOS is 86.02.39.01.90 (from around March 2018), with no power limit issues (as I can see 127% power usage on both cards in say, looped Fire Strike Ultra combined test). Both are on the slave BIOS. These cards have always overclocked like crap on the core and I don't think it's heat related, if I try and raise both to 2037MHz and max out the voltage, most benches crash to black screen before they can even heat up, and I have to hard reset my rig (hold the power button).

In this situation, should I try to flash the EVGA K|NGP|IN bios and possibly shunt mod? Or since the pair is crashing instantly on raising the core frequency (at like 40c top/35c bottom) within a second or two, will this not make a difference because it's not power related? Should I try flashing the BIOS from this thread?

Should I also regress drivers to 390.xx if I can on Win10 Pro 1903? (These were the best performing drivers before RTX dropped)

I want to redo every bench I did on HWBOT with my 1080tis, multi-gpu and single, now that I get a huge boost to overall score because my physics score is like 35k with this new CPU + memory, but want to get my core OC higher if possible

Any advice appreciated. Thanks


----------



## KedarWolf

neurotix said:


> Hey all,
> 
> 
> View attachment 302212
> 
> 
> 
> 
> I recently got 8th place in the Heaven top 30 thread (at 274 fps), with my CPU @ 4475MHz on the first Core Complex, memory at 3800MHz/1900 fclk and my 1080tis @ 2025/5899 with Kboost on in Precision XOC, stock voltage, 100% fan and side panel/fan filters off. I'm trying to beat that 279 fps and get it as high as possible to beat some people at Heaven
> 
> My cards are 6696 models and my BIOS is 86.02.39.01.90 (from around March 2018), with no power limit issues (as I can see 127% power usage on both cards in say, looped Fire Strike Ultra combined test). Both are on the slave BIOS. These cards have always overclocked like crap on the core and I don't think it's heat related, if I try and raise both to 2037MHz and max out the voltage, most benches crash to black screen before they can even heat up, and I have to hard reset my rig (hold the power button).
> 
> In this situation, should I try to flash the EVGA K|NGP|IN bios and possibly shunt mod? Or since the pair is crashing instantly on raising the core frequency (at like 40c top/35c bottom) within a second or two, will this not make a difference because it's not power related? Should I try flashing the BIOS from this thread?
> 
> Should I also regress drivers to 390.xx if I can on Win10 Pro 1903? (These were the best performing drivers before RTX dropped)
> 
> I want to redo every bench I did on HWBOT with my 1080tis, multi-gpu and single, now that I get a huge boost to overall score because my physics score is like 35k with this new CPU + memory, but want to get my core OC higher if possible
> 
> Any advice appreciated. Thanks


Not sure you want to run the XOC BIOS without full waterblocks and backplates. You'll see better clocks but will thermal throttle for sure, and you are risking damaging your cards.

Edit: You might want to try the FTW3 Kingpin Fix BIOS in the best BIOS collection zip file in the OP though, might do better.


----------



## neurotix

KedarWolf said:


> Not sure you want to run the XOC BIOS without full waterblocks and backplates. You'll see better clocks but will thermal throttle for sure, and you are risking damaging your cards.
> 
> Edit: You might want to try the FTW3 Kingpin Fix BIOS in the best BIOS collection zip file in the OP though, might do better.



Thanks, rep+

Yeah, I have the Kingpin Fix bios; that's what I meant when I said Kingpin bios. But on the EVGA forums it seemed like people were mostly using it on my cards when they had power limit issues they couldn't fix, which I've never had (stuck at 117% even on the slave bios)

Regarding the XOC bios, point noted. I have actually been considering selling some retro games I don't play and other stuff to put together a full loop. However, since my cards are custom it seems the only blocks available are EK nickel full cover and they are still selling for $160 apiece. I don't want to spend money watercooling cards that will probably be replaced in 2-3 years, and are almost 3 years old anyway.

But yeah I'm already hitting 70c or so on the top card and 55c on the bottom with both at 127% power limit/90c temp limit/no additional voltage- 2025/5899MHz in Valley, Heaven, Superposition, Fire Strike Ultra, etc. benches. With all of my case fans on full blast, both card fans at 100% and my front and top fan filters off and side panel off.

So you're right it is probably dangerous and will add too much heat under benching loads, especially if I am able to run higher frequency on the core, which will add heat as well. Even with my new CPU/board/RAM setup I've tested my rig with a Kill-a-watt meter and I'm only pulling 850w at the wall max (In Fire Strike Ultra combined test looped), my power supply does it no problem- but heat is the issue. Especially since my cooling is less than adequate for the CPU as well, which runs around 80C under full load

Alright then. I won't attempt to flash them or run them at higher core, (since it crashes anyway), and will continue to work on my CPU and memory OC (especially timings) since RAM makes a huge difference on Ryzen 3000, and see if I can get some more fps in Heaven that way. Thanks


----------



## kithylin

neurotix said:


> Thanks, rep+
> 
> Yeah I have the Kingpin Fix bios, that's what I meant when I said Kingpin bios. But on the EVGA forums it seemed like people were mostly using it on my cards when it had power limit issues they couldn't fix, that I've never had (stuck at 117% even on the slave bios)
> 
> Regarding XOC bios, point noted, I have actually been considering selling some retro games I don't play and other stuff to put together a full loop. However, since my cards are custom it seems the only blocks available are EK nickel full cover and they are still selling for $160 a piece. I dont want to spend money watercooling cards that will probably be replaced in 2-3 years, and are almost 3 years old anyway.
> 
> But yeah I'm already hitting 70c or so on the top card and 55c on the bottom with both at 127% power limit/90c temp limit/no additional voltage- 2025/5899MHz in Valley, Heaven, Superposition, Fire Strike Ultra, etc. benches. With all of my case fans on full blast, both card fans at 100% and my front and top fan filters off and side panel off.
> 
> So you're right it is probably dangerous and will add too much heat under benching loads, especially if I am able to run higher frequency on the core, which will add heat as well. Even with my new CPU/board/RAM setup I've tested my rig with a Kill-a-watt meter and I'm only pulling 850w at the wall max (In Fire Strike Ultra combined test looped), my power supply does it no problem- but heat is the issue. Especially since my cooling is less than adequate for the CPU as well, which runs around 80C under full load
> 
> Alright then. I won't attempt to flash them or run them at higher core, (since it crashes anyway), and will continue to work on my CPU and memory OC (especially timings) since RAM makes a huge difference on Ryzen 3000, and see if I can get some more fps in Heaven that way. Thanks


I just wanted to add a comment here for you and for anyone that finds your post in this thread in the future: yes, as said above, the XOC bios for 1080 Tis is -ONLY- designed for temporary use in competitive overclocking with liquid nitrogen. It has no power limit, but at the same time it will hard-reset the nvidia driver if your card ever reaches or exceeds 65c core temps. It's designed to be run at -50c or colder at all times. Also, you can take it from someone that ran their 1080 Ti @ 1.200v for 17 months: it will degrade your card. I can't run XOC at those volts any more. I had to revert to the stock bios and deal with a power limit now.


----------



## KedarWolf

neurotix said:


> Hey all,
> 
> 
> View attachment 302212
> 
> 
> 
> 
> I recently got 8th place in the Heaven top 30 thread (at 274 fps), with my CPU @ 4475MHz on the first Core Complex, memory at 3800MHz/1900 fclk and my 1080tis @ 2025/5899 with Kboost on in Precision XOC, stock voltage, 100% fan and side panel/fan filters off. I'm trying to beat that 279 fps and get it as high as possible to beat some people at Heaven
> 
> My cards are 6696 models and my BIOS is 86.02.39.01.90 (from around March 2018), with no power limit issues (as I can see 127% power usage on both cards in say, looped Fire Strike Ultra combined test). Both are on the slave BIOS. These cards have always overclocked like crap on the core and I don't think it's heat related, if I try and raise both to 2037MHz and max out the voltage, most benches crash to black screen before they can even heat up, and I have to hard reset my rig (hold the power button).
> 
> In this situation, should I try to flash the EVGA K|NGP|IN bios and possibly shunt mod? Or since the pair is crashing instantly on raising the core frequency (at like 40c top/35c bottom) within a second or two, will this not make a difference because it's not power related? Should I try flashing the BIOS from this thread?
> 
> Should I also regress drivers to 390.xx if I can on Win10 Pro 1903? (These were the best performing drivers before RTX dropped)
> 
> I want to redo every bench I did on HWBOT with my 1080tis, multi-gpu and single, now that I get a huge boost to overall score because my physics score is like 35k with this new CPU + memory, but want to get my core OC higher if possible
> 
> Any advice appreciated. Thanks


My 9900k holding up well with AMD.


----------



## neurotix

KedarWolf said:


> My 9900k holding up well with AMD.



Wanna do Cinebench r20 or 3dmark Fire Strike? 

lol jk

I'm not surprised at this given that Heaven at 1080p depends more on cpu. I am just really thrilled with my upgrade given that I upgraded from a 4790k from SiliconLottery delidded and binned to 4.8GHz. I honestly didn't expect to be anywhere near competitive with a 9900k at 1080p benches or games- as in certain titles @ 1080p the 9900k gets +70-80 fps against my chip. I game at 5760x1080 still, and at that res, I am GPU bound and the 9900k and R9 3900x are about even.

What I'm saying is that I gained gaming performance in both either way, the 4790k was long in the tooth, and I fully expected it to run the same or slightly better at Surround res and honestly expected it to be worse 1080p and especially benches like Heaven, and thought it would be much worse than a 9900k. It's not.

For that run, my chip was at 4475/4450/4200/4200MHz, memory was 3800 c16, and the cards were barely at 2012MHz/5940MHz... what was your 9900k and GPUs at? Sig clocks?

Anyway... also thanks to the other person who explained the XOC bios and 65C thermal shutoff, glad I came here and asked first or they'd basically be unusable because of that. Glad I left them alone


----------



## KedarWolf

neurotix said:


> Wanna do Cinebench r20 or 3dmark Fire Strike?
> 
> lol jk
> 
> 
> 
> Spoiler
> 
> 
> 
> I'm not surprised at this given that Heaven at 1080p depends more on cpu. I am just really thrilled with my upgrade given that I upgraded from a 4790k from SiliconLottery delidded and binned to 4.8GHz. I honestly didn't expect to be anywhere near competitive with a 9900k at 1080p benches or games- as in certain titles @ 1080p the 9900k gets +70-80 fps against my chip. I game at 5760x1080 still, and at that res, I am GPU bound and the 9900k and R9 3900x are about even.
> 
> What I'm saying is that I gained gaming performance in both either way, the 4790k was long in the tooth, and I fully expected it to run the same or slightly better at Surround res and honestly expected it to be worse 1080p and especially benches like Heaven, and thought it would be much worse than a 9900k. It's not.
> 
> 
> 
> For that run, my chip was at 4475/4450/4200/4200MHz, memory was 3800 c16, and the cards were barely at 2012MHz/5940MHz... what was your 9900k and GPUs at? Sig clocks?
> 
> Anyway... also thanks to the other person who explained the XOC bios and 65C thermal shutoff, glad I came here and asked first or they'd basically be unusable because of that. Glad I left them alone


CPU was at 5GHz, memory at 17-17-17-38 2T 4133MHz, GPUs at what it says in Heaven. 

This Cinebench run was at my 24/7 overclock settings.

https://www.overclock.net/forum/6-i...0-vrm-discussion-thread-413.html#post28046906

https://www.3dmark.com/3dm/40489500

36873 Fire strike.


----------



## KedarWolf

neurotix said:


> Hey all,
> 
> 
> View attachment 302212
> 
> 
> 
> 
> I recently got 8th place in the Heaven top 30 thread (at 274 fps), with my CPU @ 4475MHz on the first Core Complex, memory at 3800MHz/1900 fclk and my 1080tis @ 2025/5899 with Kboost on in Precision XOC, stock voltage, 100% fan and side panel/fan filters off. I'm trying to beat that 279 fps and get it as high as possible to beat some people at Heaven
> 
> My cards are 6696 models and my BIOS is 86.02.39.01.90 (from around March 2018), with no power limit issues (as I can see 127% power usage on both cards in say, looped Fire Strike Ultra combined test). Both are on the slave BIOS. These cards have always overclocked like crap on the core and I don't think it's heat related, if I try and raise both to 2037MHz and max out the voltage, most benches crash to black screen before they can even heat up, and I have to hard reset my rig (hold the power button).
> 
> In this situation, should I try to flash the EVGA K|NGP|IN bios and possibly shunt mod? Or since the pair is crashing instantly on raising the core frequency (at like 40c top/35c bottom) within a second or two, will this not make a difference because it's not power related? Should I try flashing the BIOS from this thread?
> 
> Should I also regress drivers to 390.xx if I can on Win10 Pro 1903? (These were the best performing drivers before RTX dropped)
> 
> I want to redo every bench I did on HWBOT with my 1080tis, multi-gpu and single, now that I get a huge boost to overall score because my physics score is like 35k with this new CPU + memory, but want to get my core OC higher if possible
> 
> Any advice appreciated. Thanks


Best run. Try overclocking your memory and use custom curves in Afterburner (Ctrl+F); max out your GPU clocks and memory at 1.093v.

KedarWolf - i9 9900k/5000MHz - 2x GTX 1080 Ti 2037/6237MHz - 294.1- 7407


----------



## neurotix

KedarWolf said:


> Best run. Try overclocking your memory and use custom curves in Afterburner (Ctrl+F); max out your GPU clocks and memory at 1.093v.
> 
> KedarWolf - i9 9900k/5000MHz - 2x GTX 1080 Ti 2037/6237MHz - 294.1- 7407




I don't have Fire Strike on hand; can't find the screenshot. I'll rerun it tomorrow. It was just a hair over 36k, something like 36081. Cinebench was 7700+, I'll post the screenshot later.

That's a great Heaven score. Submit it if you didn't. You are also within 10 fps of the best 2080 Ti SLI submission.

My cards' core clocks won't do much more than that, even with added voltage, which adds heat. The memory on either doesn't go much higher than +425, and going higher reduces fps. Also, I do all benches with both cards' fans at 100%. The top one will pass 70c in Heaven, the bottom will hang around 55c. Unless I open a window; it's already near freezing this time of year where I live.
The heat is OK though, because in games it is rare for either card to exceed 60c even with a fan curve; I use vsync and frame rate limiting.

If I could run all cores of my chip at 5GHz I'd be much closer to your scores, but I knew that going in. I'd be interested to see FS Ultra as that is far less CPU dependent. Mine was a little more than 14k.


----------



## ThrashZone

Hi,
Yep nice clock on a very picky heaven benchmark KW :thumb:


----------



## KedarWolf

neurotix said:


> I don't have Fire Strike on hand; can't find the screenshot. I'll rerun it tomorrow. It was just a hair over 36k, something like 36081. Cinebench was 7700+, I'll post the screenshot later.
> 
> Thats a great Heaven score. Submit it if you didn't. You are also within 10 fps of the best 2080ti SLI submission.
> 
> My cards core clocks won't do much more than that, even with added voltage, which adds heat. The memory on either doesn't go much higher than +425 and going higher reduces fps. Also, I do all benches with both cards fans at 100%. The top one will pass 70c in Heaven, bottom will hang around 55c. Unless I open a window, its already near freezing this time of year where I live.
> The heat is ok though because in games it is rare for either card to exceed 60c even with a fan curve, I use vsync and frame rate limiting.
> 
> If I could run all cores of my chip at 5ghz, I'd be much closer, to your scores, but I knew that going in. I'd be interested to see FS Ultra as that is far less cpu dependent. Mine was a little more than 14k.


Had to run it again and disable my iGPU, it's against the rules.

KedarWolf - i9 9900k/5100MHz - 2x GTX 1080 Ti 2037/6210MHz - 297.5- 7494 - Nvidia Studio drivers.


----------



## KedarWolf

neurotix said:


> I don't have Fire Strike on hand; can't find the screenshot. I'll rerun it tomorrow. It was just a hair over 36k, something like 36081. Cinebench was 7700+, I'll post the screenshot later.
> 
> Thats a great Heaven score. Submit it if you didn't. You are also within 10 fps of the best 2080ti SLI submission.
> 
> My cards core clocks won't do much more than that, even with added voltage, which adds heat. The memory on either doesn't go much higher than +425 and going higher reduces fps. Also, I do all benches with both cards fans at 100%. The top one will pass 70c in Heaven, bottom will hang around 55c. Unless I open a window, its already near freezing this time of year where I live.
> The heat is ok though because in games it is rare for either card to exceed 60c even with a fan curve, I use vsync and frame rate limiting.
> 
> If I could run all cores of my chip at 5ghz, I'd be much closer, to your scores, but I knew that going in. I'd be interested to see FS Ultra as that is far less cpu dependent. Mine was a little more than 14k.


FireStrike 37236

https://www.3dmark.com/fs/20798900

FireStrike Ultra 14555

https://www.3dmark.com/fs/20798884


----------



## ThrashZone

Hi,
Now you're just showing off


----------



## KedarWolf

Benchmarking trick.

Install this.

https://github.com/abbodi1406/vcredist/releases/tag/0.26.0

Run the Nvidia Profile Inspector I attached here as a .zip, as Admin.

Set this to Off, then hit Apply Changes TWICE!

Set your RAM speed in Afterburner.

Run the runCustom.bat included in the second .zip, with the .exe etc. left in the same folder.

Raise your video card RAM speed one notch at a time and rerun the bandwidth tester. You'll find that on some memory settings your RAM will measure, say, 420 to 430 GByte/sec, but a notch or two higher or lower it'll be at 470-490 GByte/sec. Run your video card benchmarks when it reads the higher bandwidth and your benchmark will do better, even if your actual memory clocks are one or two notches lower.
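The sweep boils down to "measure every notch, keep the one that reads highest". A minimal Python sketch of that logic, where `measure_bandwidth` is a hypothetical stand-in for reading the bandwidth tester's result at a given offset (the numbers below are made up to mirror the 420-430 vs 470-490 GByte/sec pattern):

```python
# Sketch of the memory-offset sweep described above. `measure_bandwidth`
# is a hypothetical callable returning GByte/sec for a given offset.
def best_memory_offset(offsets, measure_bandwidth):
    """Return the (offset, bandwidth) pair with the highest measured bandwidth."""
    results = {off: measure_bandwidth(off) for off in offsets}
    best = max(results, key=results.get)
    return best, results[best]

# Illustrative readings: note the dips at some notches.
fake_readings = {400: 425.0, 425: 478.0, 450: 431.0, 475: 488.0, 500: 436.0}
offset, bw = best_memory_offset(fake_readings.keys(), fake_readings.get)
print(offset, bw)  # the notch with the highest measured bandwidth wins
```

Benchmark at whichever notch wins, even if it's a step below your maximum stable clock.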


----------



## neurotix

KedarWolf said:


> Benchmarking trick.
> 
> Install this.
> 
> https://github.com/abbodi1406/vcredist/releases/tag/0.26.0
> 
> Run Nvidia Profile Inspector as Admin I added here as a .zip.
> 
> Put this to Off, hit Apply Changes TWICE!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Set your RAM speed in Afterburner.
> 
> Run the runCustom.bat included in the second .zip file folder with the .exe etc. left in the folder.
> 
> Raise your video card RAM speed one notch at a time, rerun the bandwidth tester. You'll find on some memory settings you RAM will measure say 420 to 430 Gbyte/sec but a notch or two higher or lower it'll be at 470-490 Gbyte/sec. Run your video card benchmarks when it reads the higher bandwidth, your benchmark will do better, even if your actual memory clocks are one or two notches lower.



Will definitely be trying this and something else I have up my sleeve for my Ryzen setup.

Rep+ thanks, will be seeing if this makes a difference or not.


----------



## skupples

I ended up having $250 of free money on amazon, used it towards a 5700xt. 

I'll miss these 1080tis, but considering I half killed both, good riddance. I ALMOST went with a $300 used 1080ti instead, but meh. Seems Pascal just wasn't for me, and/or these cards are too fragile for used action.

5700xt til 4080ti, or bust.


----------



## DxCK

Is the XOC BIOS going to brick my card? ZOTAC GTX 1080 Ti AMP! Extreme, with two eight-pin power connectors.


----------



## kithylin

DxCK said:


> Does XOC BIOS going to brick my card? ZOTAC GTX1080Ti AMP! EXTREME, with two eight pin power connector.


As always, flashing any BIOS other than the one that shipped on your card carries an inherent risk of permanent damage. If you don't like that risk, or can't easily afford to replace your card, I would suggest not bothering with BIOS flashing in general; the gains are marginal at best. I don't know Zotac's RMA policy, but some companies will void your factory warranty if you damage the card during a BIOS flash and send it back for RMA with another vendor's BIOS on it. They can and will find out about that.


----------



## JollyGreenJoint

Damn skupples, why u gotta jinx meh! Having an issue: can't play Overwatch on anything but low settings, otherwise it freaks out and either gives me a black screen with only the outlined silhouettes of the players, or I get a "rendering device lost" error. Have used this card on two identical setups, same issue; just noted the temp was 47c last time it did it. New/used 1080ti Kingpin :| Haven't tried switching to the OC bios.. or repasting ....

Did some more testing in Heaven; it eventually freaks out on everything but the lowest setting. Temps are stable around 47/50c :/

Awaiting RMA approval....


----------



## mtbiker033

y'all, my 1080ti Founders (got one after the first reviews), flashed to a FTW3 BIOS with a hybrid kit added after about 2 months (availability), is still running like a champ. With a curve it goes to 2101mhz and rarely breaks 60C. This was the best gpu purchase I ever made!

congrats to anyone else still following this thread and still rocking a beast 1080ti! :cheers: :happyholidays:


----------



## kithylin

mtbiker033 said:


> y'all my 1080ti founders (got one after the first reviews) flashed to a FTW3 and I added a hybrid kit too after about 2 months (availability) is still running like a champ. with curve it goes to 2101mhz and rarely breaks 60C. this was the best gpu purchase I ever made!
> 
> congrats to anyone else still following this thread and still rocking a beast 1080ti! :cheers: :happyholidays:


Yep, still rocking my water cooled 1080 Ti. I went to a different loop design and added a "Thicc Boy" old-school dual-core Hardware Labs radiator to my system, which improved my system's cooling capacity a good bit. Now I'm rocking this configuration for rads: 480 + 360 + 240 + 120. I also dropped my old Z77X system and moved to a modern Ryzen system, and now my 1080 Ti runs max 30-35c under load in benchmarks or gaming. My old configuration was 360 + 240 + 240 + 120, and with that I was seeing around 45-48c temps on the video card. With the reduction in load temps of about -10 to -12c, I'm now able to overclock my 1080 Ti from 2012 Mhz core stable before (at 45-48c) -> 2063 stable now (30-35c). I tried XOC bios again, and nope.. my card won't run any clocks on XOC, even stock speed; it's permanently degraded. Works great up to 2175 Mhz in benchmarks with 3dmark but instantly crashes to desktop if I load any actual game. Works perfectly fine back on the stock EVGA bios though. So.. so much for that.

I'm wanting to expand my 120mm radiator spot to a dual-core 140 and bump the loop up to 480 + 360 + 280 + 240, and that will probably be my final configuration for this system / the most radiator I can pack into this Corsair 780T case. I'm hopeful I might finally be able to get my card down to < 30c load temps with more radiators.

This time of year we're around 15c - 20c ambient temps here. My coolant temp runs around 22c idle 27c load if I game long enough.

One of the benefits of a big loop and 15 fans on the radiators: I get peak cooling efficiency with the fans set to 20% speed. Nice and quiet.
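For a rough sanity check on that upgrade, you can just total up the nominal radiator sizes of the old vs new loop. This assumes, crudely, that capacity scales with total area; real loops also depend on thickness, fin density, and airflow, so treat it as a ballpark only:

```python
# Crude comparison of total radiator "size units" (sum of nominal
# fan-mount millimetres) for the old and new loop configurations.
old = [360, 240, 240, 120]   # previous configuration
new = [480, 360, 240, 120]   # current configuration
growth = (sum(new) - sum(old)) / sum(old)
print(sum(old), sum(new), f"{growth:.0%}")  # 960 1200 25%
```

Roughly 25% more radiator, which lines up directionally with the ~10-12c drop reported above (along with the platform change).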


----------



## 2ndLastJedi

Still rocking my stock Galax GTX 1080 ti EXOC undervolted to 0.900V @ 1911MHz and lovin it. Had it since June 2017 😉


----------



## offthewall

I'm rocking too! I love my new upgrade from Titan OG SLI to a single EVGA 1080ti Black Gaming with triple surround 5760x1080 monitors! Overclocked 2100/6000Mhz @ 1093mV, 24/7 game stable in Division 2, BF4, Metro Exodus and FarCry5. 3DMark Time Spy 9800, Fire Strike Extreme 14059, with a 4930k CPU @ 4.7Ghz and 16GB Corsair Vengeance XMP on an Asus P9X79 MB. Water cooled with EK blocks, GPU max 46C and CPU max 52C. One 360 rad and one 240 rad.


----------



## l Nuke l

So I figured out XJ2 is GPU vcore; checked with a DMM, it reads consistent with software. XJ4 reads 1.3783v, no idea what that voltage is for, and XJ6 reads 1.0301v, also no clue what that is. The bottom 3 holes aren't showing any voltage. Does anyone know what these are?
Asus 1080ti Strix


----------



## MrTOOSHORT

xj4 should be vram voltage, xj6 should be pll voltage. 

*https://community.hwbot.org/topic/157768-asus-gtx-1080-strix-oc-ln2-prep-bios-experiences/*


----------



## l Nuke l

MrTOOSHORT said:


> xj4 should be vram voltage, xj6 should be pll voltage.
> 
> *https://community.hwbot.org/topic/157768-asus-gtx-1080-strix-oc-ln2-prep-bios-experiences/*


 u da man thanks bro.


----------



## Dante007

Could anyone attach here the 1080ti XOC BIOS, aka the Kingpin fix, for the EVGA FTW3 Elite 12GB? Also, has anyone tested it with the Aorus Xtreme 1080ti?


----------



## kithylin

Dante007 said:


> Could any one attach here 1080ti XOC Bios aka Kingpin fix for EVGA FTW3 Elite 12GB also anyone test it with Aorus Xtreme 1080ti ?


The XOC bios is on the first page of this thread. Do note: you should only use it on water cooled cards, and -NOT- air cooled cards. Also, it's about -5% slower vs all other BIOSes. So far, according to the testing in this thread, the XOC bios should work on all 1080 Ti's. You may lose some functionality though; certain video ports may not work. Look at the Asus Strix 1080 Ti and compare its video ports to what your card has. Anything that doesn't match what the Asus card has could be lost (DVI, HDMI, DisplayPort, etc.). Also, your mileage may vary; if it bricks your card it's not my fault or anyone else's. I haven't seen reports of the XOC bios killing any cards in this thread yet, but if you run 1.200v long-term it will degrade your card even if it doesn't kill it instantly. The XOC bios should probably not be used as a daily driver for long-term usage; it's more of a short-term thing for competitive overclocking.


----------



## Dante007

kithylin said:


> XOC bios is on the first page of this thread. Do note: You should only use it on water cooled cards, and -NOT- air cooled cards. Also it's about -5% slower vs all other bios's. So far according to the testing in this thread the XOC bios should work on all 1080 Ti's. You may lose some functionality though, certain video ports may not work. Look for the Asus Strix 1080 Ti and compare it's video ports to what your card has. Anything that doesn't match what the asus card has could be lost (DVI, HDMI, Displayport, etc.) Also your mileage may vary. If it bricks your card it's not my fault or anyone elses. I haven't seen reports of the XOC bios killing any cards in this thread yet. Though if you run at 1.200v voltage long-term it will degrade your card though even if it doesn't kill it instantly. XOC bios should probably not be used as a daily driver bios for long-term usage. It's more of short-term for competitive overclocking.


Thanks for your reply. I got my Aorus GTX 1080ti Xtreme a year ago and watercooled it with an Alphacool Eiswolf GPX-Pro M21 that came with a 120 rad, then replaced that with a 360 copper rad from Alphacool, with Noctua NT-H2 and a Noctua NF12S fan direct on the card plus 3 fans on the rad (idle 15-16 and load 34). I overclock using MSI Afterburner on the F3 300/375W bios to 2100Mhz core and 1654 memory; it can do up to 1676 memory but the max in Afterburner is +1000. I tested the XOC bios from Asus and reached 2164/1676 stable, but with the same performance as the Aorus F3 bios (best one so far). I tested the Zotac 432W bios and it gives lower performance, but the EVGA Elite 12GB one was good apart from the lower TDP cap. So I found the XOC for EVGA, aka the "Kingpin fix", in the "best bios collection", and it seems the power cap works there, but the links only provide the bios for 3 EVGA cards (Elite, FTW3, Hybrid) as .exe files and I can't flash it to my card (even after flashing the EVGA bios and trying again). The EVGA bios works with my card fine, as I use DisplayPort and HDMI. Can anyone upload the bios, and has anyone tested the KPE bios on an Aorus card?


----------



## yoadknux

Dante007 said:


> Thanks for your reply I get Aorus GTX 1080ti Xtreme a year ago and watercooled it with Alphacool Eiswolf GPX-Pro M21 that came with 120 rad and replace it with 360 copper rad from alphacool with Noctua NT-H2 and direct Noctua NF12S fan on the card and 3 fans on rad "idle 15-16 and load 34" overclock using MSI Afterburner and bios F3 300/375W to 2100Mhz core and 1654Memory and can do to 1676 memory but max number in Afterburner is 1000+, i test XOC bios from asus and reach 2164/1676 stable but with same performance as Aorus F3 bios "best one so far" i test zotac 432w and gives lower performance but i test EVGA elite 12GB and was good with lower TDP cap, so i find XOC for EVGA aka "Kingpin FIX" and i use it from "best bios Collection" and it seems power cap works and only links provide the bios for 3 cards "Elite FTW3 Hyprid" are for EVGA cards as exe and can't flash it to my card "even flash evga bios and try again" EVGA works with my card good as i use Display port and HDMI if anyone can upload bios and anyone test KPE bios on aorus card ?


what in the world did u just write brother?


----------



## neurotix

Dante007 said:


> Thanks for your reply I get Aorus GTX 1080ti Xtreme a year ago and watercooled it with Alphacool Eiswolf GPX-Pro M21 that came with 120 rad and replace it with 360 copper rad from alphacool with Noctua NT-H2 and direct Noctua NF12S fan on the card and 3 fans on rad "idle 15-16 and load 34" overclock using MSI Afterburner and bios F3 300/375W to 2100Mhz core and 1654Memory and can do to 1676 memory but max number in Afterburner is 1000+, i test XOC bios from asus and reach 2164/1676 stable but with same performance as Aorus F3 bios "best one so far" i test zotac 432w and gives lower performance but i test EVGA elite 12GB and was good with lower TDP cap, so i find XOC for EVGA aka "Kingpin FIX" and i use it from "best bios Collection" and it seems power cap works and only links provide the bios for 3 cards "Elite FTW3 Hyprid" are for EVGA cards as exe and can't flash it to my card "even flash evga bios and try again" EVGA works with my card good as i use Display port and HDMI if anyone can upload bios and anyone test KPE bios on aorus card ?


I'm not 100% sure on this and someone might correct me... @KedarWolf probably..

I don't think the Kingpin bios will work on your Aorus card, as the EVGA FTW cards all have a non-reference, custom design with different power delivery, VRMs, sensors etc., and probably a different bios chip too.

I might be wrong, but if you can't flash it, that is probably why.


----------



## Streetdragon

Just wanna say that my shunt mod is working great after over 1.5 years or so! Still rocking
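For anyone new to the thread, here's the idea behind a shunt mod in a few lines of Python. The card measures current via the voltage drop across a tiny sense resistor; soldering a second resistor in parallel lowers the effective resistance, so the same real current produces a smaller drop and the card under-reports power, effectively raising the power limit. The 5 mΩ value below is a commonly cited stock figure, used here as an illustrative assumption:

```python
# Why a shunt mod raises the effective power limit: stacking a resistor
# in parallel with the current-sense shunt makes the card under-read power.
def parallel(r1, r2):
    """Effective resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

stock = 0.005                       # 5 mOhm stock shunt (assumed, illustrative)
modded = parallel(stock, 0.005)     # stack another 5 mOhm on top
reported_fraction = modded / stock  # fraction of real power the card "sees"
print(modded, reported_fraction)    # halved sense voltage -> half the reported power
```

So with a stacked equal-value shunt, a card really pulling 400 W reports roughly 200 W to its own limiter, which is why VRM cooling matters so much with this mod.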


----------



## yoadknux

neurotix said:


> I'm not 100% sure on this and someone might correct me... @KedarWolf probably..
> 
> I don't think the Kingpin Bios will work on your Aorus card as the EVGA FTW cards all have a non-reference, custom design with different power delivery, VRMs, sensors etc and probably a different bios chip too.
> 
> I might be wrong but if you cant flash it, that is probably why


"FTW3 Kingpin Fix" bios works on my Aorus.


----------



## Dante007

yoadknux said:


> "FTW3 Kingpin Fix" bios works on my Aorus.


No power limit? If so, could you DM it to me?


----------



## kithylin

Dante007 said:


> No power limit ? if so could you dm it to me


I would be interested in this too if someone could confirm no power limit.


----------



## The Pook

you guys see/use the new driver (441.87) that was released today?

https://www.nvidia.com/en-us/geforce/news/nvidia-geforce-ces-2020-game-ready-driver/

in-game FPS caps now at the driver level instead of using some bastardized version of RVT 

:wheee:


----------



## yoadknux

Dante007 said:


> No power limit ? if so could you dm it to me





kithylin said:


> I would be interested in this too if someone could confirm no power limit.


It's available in the "flashing" thread, in "BestBiosCollection". It's not new, KedarWolf has recommended this bios for a while.
For all purposes of standard overclocking, it acts as a "no power limit" bios, meaning that when you open the voltage curve you can jump all the way to 1.093V and put your clocks there with no risk of downclocking due to the power limit. However, it does not unlock voltages over 1.093V, and therefore it will not improve your maximum overclock at all.
If I compare the stable OC of my stock bios vs the stable OC of the FTW3 bios, the difference between them is 1%-2% in scores. That's nothing; it cannot be noticed. What can be noticed is the loss of some video output ports.

I know there is a 200-page thread about bios flashing the 1080ti, and I went through most of it. My conclusion is, it's a waste of time; stick with your bios.
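A crude model shows why the power cap forces downclocking in the first place: GPU dynamic power scales roughly with V² × f, so holding a high voltage point pushes board power past the cap fast. This is a first-order approximation; all the numbers below are illustrative, not measured from any particular card:

```python
# First-order model of board power vs voltage/frequency: P ~ V^2 * f.
# Reference point and cap below are illustrative assumptions.
def est_power(p_ref, v_ref, f_ref, v, f):
    """Scale a reference board power to a new voltage/frequency point."""
    return p_ref * (v / v_ref) ** 2 * (f / f_ref)

# A hypothetical ~250 W card pushed to 1.093 V / 2050 MHz:
p = est_power(250.0, 1.000, 1900.0, 1.093, 2050.0)
print(round(p, 1))   # estimated draw in watts
print(p > 300.0)     # exceeds a 300 W stock cap -> the limiter would downclock
```

With the cap effectively removed, the card just sits at the requested curve point instead of bouncing down bins, which is the whole practical benefit described above.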


----------



## kithylin

The Pook said:


> you guys see/use the new driver (441.87) that released today?
> 
> https://www.nvidia.com/en-us/geforce/news/nvidia-geforce-ces-2020-game-ready-driver/
> 
> in-game FPS caps now at the driver level instead of using some bastardized version of RVT
> 
> :wheee:


That has been included in the Nvidia drivers since driver version 195.62 for Vista & Win7, released in fall 2009. 10 years ago. In the past we had to use Nvidia Inspector to access per-game FPS caps because the option wasn't "exposed" to the user in the control panel, but it's been there just about forever. It's -NOT- a new thing at all.


----------



## The Pook

I'm just a simple man but if you need extra software to do something in addition to the driver then it's not included in the driver. 

crazy idea, I know


----------



## kithylin

The Pook said:


> I'm just a simple man but if you need extra software to do something in addition to the driver then it's not included in the driver.
> 
> crazy idea, I know


That doesn't mean it's not in there. There are tons of features included in the drivers that Nvidia doesn't give us access to via the control panel, but that are in the drivers and do function. The FPS limiter is just one of them; this is literally why Nvidia Profile Inspector exists: so we can tune those settings. Nvidia lately is just digging out these old features and presenting them to users as something new. It looks good in the press, but it's all BS, just a marketing ploy. Even the "Low Latency Mode" they put in the drivers a few versions back was just the old "Pre-Rendered Frames" setting that has been in the drivers since forever; they renamed it "Low Latency Mode".
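Wherever the limiter lives (driver, RTSS, or in-game), the core mechanism is the same: frame pacing. A minimal Python sketch of the idea, with a no-op standing in for the actual frame work:

```python
import time

# Minimal sketch of what an FPS limiter does: after each frame's work,
# sleep out the remainder of the frame budget so frames start on a fixed cadence.
def run_capped(frames, fps_cap, render=lambda: None):
    budget = 1.0 / fps_cap
    start = time.perf_counter()
    next_deadline = start
    for _ in range(frames):
        render()                      # stand-in for actual frame rendering
        next_deadline += budget
        delay = next_deadline - time.perf_counter()
        if delay > 0:
            time.sleep(delay)
    return time.perf_counter() - start  # total elapsed for `frames` frames

elapsed = run_capped(30, 120)         # 30 frames at a 120 FPS cap
print(round(elapsed, 2))              # ~0.25 s (30 / 120), plus timer slack
```

A driver-level cap does this below the game, which is why it works even in titles whose in-game limiter only offers multiples of 30.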


----------



## Dante007

yoadknux said:


> It's available in the "flashing" thread, in "BestBiosCollection". It's not new, KedarWolf has recommended this bios for a while.
> For all purposes of standard overclocking, it acts as a "no power limit" bios, meaning that when you open the voltage curve, you can jump all the way to 1.093V and put the clocks there with no risk of downclocking due to power limit. However, it does not unlock voltages over 1.093V and therefore it will not improve your maximum overclock at all.
> If I compare the stable OC of my stock bios vs stable OC of FTW3 bios, the difference between them is 1%-2% in scores. That's nothing, cannot be noticed. What can be noticed is loss of some video output ports.
> 
> I know there is a 200 page about bios flashing the 1080ti, and I went through most of it. My conclusion is, it's a waste of time, stick with your bios.


My problem here is that when the power limit kicks in, the memory doesn't get enough power and starts going unstable all over the place, which doesn't happen with the Asus XOC using 1.093 voltage (same overclock).
I don't care about more voltage; it hurts the overclock and degrades the GPU. I tested the EVGA FTW3 Kingpin fix and the power limit still kicks in, so an FTW3 12GB Elite bios with the Kingpin fix (no power limit) would be amazing for me.


----------



## JollyGreenJoint

evga approved my rma on my 1080ti kingpin; they are sending me a "RTX 2080 FTW3 ULTRA, OVERCLOCKED" (08G-P4-2287-KR) as a replacement. Thoughts? Personally feeling kinda **** on; i'd rather have a working 1080ti kingpin in return. I also use an overclocked DVI-D monitor and there are no DVI-D ports on the 2080 :| Anyone know of an adapter that supports 96hz at 1440p?
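For reference, a quick back-of-envelope check on why 1440p @ 96Hz over DVI-D is awkward for adapters. The blanking figures below approximate CVT reduced blanking, and 330 MHz is the dual-link DVI spec pixel-clock limit (2 links × 165 MHz); the overclockable Korean 1440p panels ran past that spec, which is exactly why generic adapters usually can't keep up:

```python
# Rough pixel-clock check for 2560x1440 @ 96 Hz against the dual-link DVI
# spec limit. Blanking values are CVT-reduced-blanking-ish assumptions.
def pixel_clock_mhz(h_active, v_active, refresh, h_blank=160, v_blank=41):
    return (h_active + h_blank) * (v_active + v_blank) * refresh / 1e6

clk = pixel_clock_mhz(2560, 1440, 96)
print(round(clk, 1))   # required pixel clock in MHz
print(clk > 330.0)     # True -> beyond the 330 MHz dual-link DVI spec limit
```

So any DisplayPort/HDMI-to-DVI adapter would need to actively convert well above the DVI spec, and most are only rated for 1440p60.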


----------



## yoadknux

JollyGreenJoint said:


> evga approved my rma on my 1080ti kingpin, they are sending me a "RTX 2080 FTW3 ULTRA, OVERCLOCKED" (08G-P4-2287-KR) as a replacement. Thoughts ? Personally feeling kinda **** on, i'd rather have a working 1080ti kingpin in return. I also use an overclocked dvi-d monitor and there no dvi-d ports on the 2080 :| Anyone know of an adapter that supports 96hz on 1440p ?


Well, the 2080 is slightly stronger than the 1080ti for gaming, the FTW3 is a top version, and it can run ray tracing on the select few games that have it...
But at the same time, you bought the most expensive version of the 1080ti, which was originally priced about $200 more and is considered a "special edition". The Kingpin probably would have sold higher on the 2nd hand market too.
It's a mixed bag. You won't lose anything gaming wise, but nobody buys the Kingpin version for gaming. Usually when it comes down to GPU warranties you get the same value or better; you were given the same value or lower.


----------



## kithylin

JollyGreenJoint said:


> evga approved my rma on my 1080ti kingpin, they are sending me a "RTX 2080 FTW3 ULTRA, OVERCLOCKED" (08G-P4-2287-KR) as a replacement. Thoughts ? Personally feeling kinda **** on, i'd rather have a working 1080ti kingpin in return. I also use an overclocked dvi-d monitor and there no dvi-d ports on the 2080 :| Anyone know of an adapter that supports 96hz on 1440p ?


The Kingpin 1080 Ti cards were literally hand-selected and hand-binned by Vince "K|NGP|N" himself. They were not really mass produced, per se; they were, but the cores were binned by hand first in a lab. They're not your normal 1080 Ti cores, so EVGA can't just slap any random 1080 Ti core on one; it wouldn't be the same. They only had a very limited run, I think 100 or 200 of them ever, and once they sold out they can't "make" more to replace them with for RMA. Pretty much it's been "known" in the overclocking community here that the 1080 Ti Kingpin cards are a buy-once-keep-once thing, and you always get some generic 2080 / 1080 Ti card as a replacement if you RMA one. The 2080 is actually a hair faster than the 1080 Ti, by like +8%, and you get RTX cores with it. The only thing I'd really be mad about is the reduction from 12GB VRAM (1080 Ti) -> 8GB VRAM (RTX 2080). That would really suck; I've already seen several games run over 8GB VRAM on my 1080 Ti, just at 1080p.


----------



## Shawnb99

kithylin said:


> The Kingpin 1080 Ti cards were literally hand-selected and hand-binned by by Vince "K|NGP|N" himself. They were not really mass produced per say. They were but the cores were binned by hand first in a lab. They're not your normal 1080 Ti cores. As such they can't just slap any random 1080 Ti core on it, it wouldn't be the same. They only had a very limited run of I think maximum 100 or 200 of them ever and once they sold out they can't "Make" more to replace them with RMA. Pretty much it's been "Known" in the overclocking community here that the 1080 Ti Kingpin cards are a buy-once-keep-once thing and you always get some generic 2080 / 1080 Ti card as a replacement if you RMA it. The 2080 is actually a hair faster than the 1080 Ti, by like +8% and you get RTX cores with it. The only really bad thing I'd be mad about is the reduction from 12GB Vram (1080 Ti) -> 8GB Vram (RTX 2080). That would really suck. I've already seen several games run over 8GB Vram on my 1080 Ti, just at 1080p.




Is that the same with the 2080 Ti Kingpins?
I know the waterblock isn't being made anymore, hence the demand for Optimus to make one; I would assume the card situation is the same.


----------



## kithylin

Shawnb99 said:


> Is that same with the 2080TI kingpins?
> I know the waterblock isn’t being made anymore hence the demand for Optimus to make one, would assume the card is the same.


That I have no idea about. I just know about the 1080 Ti one from watching a YouTube video where Vince was talking about how he helped EVGA engineers hand-design the 1080 Ti Kingpin cards, and he said in the video that they would only produce "about 100 or so" of them.

You may want to ask over in the 2080 Ti thread. Here's a link for you: https://www.overclock.net/forum/69-nvidia/1706276-official-nvidia-rtx-2080-ti-owner-s-club.html


----------



## Zfast4y0u

SLI+Ray Tracing ON ( ultra )xDD

https://imgur.com/a/rdGZbpY

works fine for me, probably more fps than a 2080ti xD first 2 pics are with maxed out AA as well, SMAA x4; last pic is AA OFF, everything else maxed out. this game really pushed the 8700k almost to the edge, with no AA the cpu is at 70%+ load.


----------



## kithylin

Zfast4y0u said:


> SLI+Ray Tracing ON ( ultra )xDD
> 
> https://imgur.com/a/rdGZbpY
> 
> works fine for me, probably more fps then 2080ti xD first 2 pics are with maxed out AA as well, SMAAx4, last pic is AA OFF, everything else maxed out. this game really pushed 8700k almost to the edge, with no AA cpu on +70% load.


It looks like you're an all-stock setup there? CPU seems to be almost stock, the 2080 Ti's seem to be mostly stock. It'd probably help if you got to overclocking that stuff. This is OCN forums. 

Still, that looks great and giving me hope of trying dual 1080 Ti's later.


----------



## Zfast4y0u

kithylin said:


> It looks like you're an all-stock setup there? CPU seems to be almost stock, the 2080 Ti's seem to be mostly stock. It'd probably help if you got to overclocking that stuff. This is OCN forums.
> 
> Still, that looks great and giving me hope of trying dual 1080 Ti's later.


no no, you are misreading, it's 1080ti's. i delidded this 8700k but when mounting it back i didn't glue the IHS back to the PCB, so the paste under it didn't spread well while applying the mounting mechanism on my mobo; on high loads i have as much as a 13c diff between some cores, so i'd rather leave it at 4.4 ghz for now. im about to do maintenance, and while doing it im gonna glue the IHS to the PCB this time around and put enough paste ( last time i ran out of paste too -.- )

im not really a fan of overclocking much, i prefer lower voltages and temps tbh. i dont even run these cards at 2000mhz, but [email protected] 0,775mV. in sli it doesn't really matter much, when it works im above 100fps anyway.


----------



## kithylin

Zfast4y0u said:


> no no, you are miss reading, its 1080ti's. i delided this 8700k but when mounting it back i didnt glue IHS back to PCB so the paste under it didnt spread well while applying mounting mechanism on my mobo, on high loads i have as much as 13c diff between some cores, so i rather leave it at 4,4 ghz for now. im about to do maintenance and while doing it, gonna glue this time around IHS with PCB and put enough paste ( last time i run out of paste too -.- )
> 
> im not really fan of overclocking much, i prefer more lower voltages and temps tbh. i dont even run this cards at 2000mhz, but [email protected] 0,775mV. in sli dosent really matter much, when it works im above 100fps anyway.


Ah, I suppose if that's your goal, that's your goal.. If I ever went dual 1080 Ti's I would aim for at least 144 FPS minimums (and 0.1% lows) at all times in all games, which would require heavy overclocking of both cards and the entire system. That's me though.


----------



## mtbiker033

the new driver's FPS limiter seems to work really well

I have a couple of games that do not like the msi ab overlay, so I have to make a global profile to turn rtss/OSD off (for some reason idk, they won't run otherwise), and the ingame limiter in my game only went in multiples of 30 and I want to use 141.

seems to be working great! 

I really hope the next generation of nv is as good an experience as this card has been


----------



## 8051

Zfast4y0u said:


> SLI+Ray Tracing ON ( ultra )xDD
> 
> https://imgur.com/a/rdGZbpY
> 
> works fine for me, probably more fps then 2080ti xD first 2 pics are with maxed out AA as well, SMAAx4, last pic is AA OFF, everything else maxed out. this game really pushed 8700k almost to the edge, with no AA cpu on +70% load.


How are you using ray tracing on a 1080Ti? Or is ray tracing not the same thing as RTX?


----------



## kithylin

8051 said:


> How are you using ray tracing on a 1080Ti? Or is ray tracing not the same thing as RTX?


Nvidia was "caught" on their claim that ray tracing required the new 2000 series of video cards, and later enabled it via drivers on older cards, I think back to the GTX 600 series. I know it's at least enabled on the 1000 series for sure though.


----------



## D-EJ915

8051 said:


> How are you using ray tracing on a 1080Ti? Or is ray tracing not the same thing as RTX?


RTX is just Nvidia GPUs with "RT" cores designed to do ray tracing and nothing more.

https://www.nvidia.com/en-us/geforce/news/geforce-gtx-dxr-ray-tracing-available-now/


----------



## Zfast4y0u

8051 said:


> How are you using ray tracing on a 1080Ti? Or is ray tracing not the same thing as RTX?


its enabled via drivers. later in game i found out ultra RT settings aint really doing well! but on medium it is very playable; there are some dips to 40fps when you enter a really dark room from outside, for example, but outside those dips, medium or off, there is no big performance impact, and the difference between medium and ultra is not noticeable.

anyway, this is not real ray tracing imho, this is some hybrid form that does ray tracing here but not there, cause hardware is not capable of real ray tracing yet, so i dont really care about ray tracing in the years to come. once we get games like the star wars demo i will say HELL YEA, till then, meh, dont care much. i turn ray tracing off and on, ppl, there is so little difference its not worth losing those constant 140fps i have in sottr with ray tracing off ( no dips to 40fps here and there like i explained )
@kithylin

i think ray tracing is enabled only on the 10 series, and in the 10 series only for the 1080ti and 1080.


----------



## Imprezzion

I'm grabbing a 1080 Ti second hand and ditching my RTX 2080 while prices are still high lol. I tried a 5700 XT, realized in 5 minutes of gaming why I never buy AMD, and sent it back lol.

1080 Ti's can be had very cheap secondhand, and I will not run stock cooling. So far I found one with an Accelero Xtreme IV and backplate already mounted. I will see if I can get that one for a good price and run it, or get an EK Phoenix block for it. I have a Phoenix on my CPU already so..

I'm curious whether an Inno3D Twin x2 VRM / VRAM plate under an Accelero IV can keep VRM temps low enough to run a shunt mod on it lol. It's a reference PCB and I don't know how beefy the VRM is on reference boards. With the EK Phoenix block it wouldn't be a problem, but those are an extra like €75 and I'd need a second rad. No way my single 280 can cool the GPU and the 9900K @ 5.1 hehe.


----------



## kithylin

Imprezzion said:


> I'm grabbing a 1080 Ti second hand and ditching my RTX 2080 now that prices are still high lol. I tried a 5700 XT, realized in 5 minutes of gaming why I never buy AMD, and send it back lol.
> 
> 1080 Ti's can be had very cheap secondhand and I will not run stock cooling. So far I found one with a Accelero Xtreme IV and backplate already mounted to it. I will see if I can buy that one for a good offer and run that or get a EK Phoenix block for it. I have a Phoenix on my CPU already so..
> 
> I'm curious whether a Inno3D Twin x2 VRM / VRAM plate under a Accelero IV can keep VRM temps low enough to run a shunt mod on it lol. It's a reference PCB and I don't know how beefy the VRM is in reference. With the EK Phoenix block it wouldn't be a problem but those are an extra like, €75 and I need a second rad. No way my single 280 can cool the GPU and the 9900K @ 5.1 hehe.


My 1080 Ti (EVGA GTX 1080 Ti SC Black Edition) is reference PCB and with XOC Bios I had it sucking down 650 watts at one point in a certain game in a certain situation and it survived just fine. Still works.


----------



## Imprezzion

kithylin said:


> My 1080 Ti (EVGA GTX 1080 Ti SC Black Edition) is reference PCB and with XOC Bios I had it sucking down 650 watts at one point in a certain game in a certain situation and it survived just fine. Still works.


650w.. wow haha. I'm not going that high. Just whatever it needs to sustain 2100Mhz+ without throttling IF the core is even capable of 2100+. I had a few 1080's and some did 2150+MHz and some bad ones barely hit 2000 so.. lottery..


----------



## kithylin

Imprezzion said:


> 650w.. wow haha. I'm not going that high. Just whatever it needs to sustain 2100Mhz+ without throttling IF the core is even capable of 2100+. I had a few 1080's and some did 2150+MHz and some bad ones barely hit 2000 so.. lottery..


As long as you're using a normal locked bios with the normal NVIDIA limit of 1.093v, you most likely won't ever see above 400 watts with a 1080 Ti, even with a shunt mod, even at 2100 MHz. I was pushing my card at 1.200v for 2126 MHz.

Remember, temperatures are your friend with 1080 Ti's. No matter what, you're probably going to need to keep your card's core under 40°C during load / gaming to get 2100 MHz stable, unless you have a super special golden sample. Or a Kingpin.


----------



## Imprezzion

The Accelero can maintain around 50-55°C, but going lower will require a waterblock. I will get a Phoenix-compatible block so I can use my 280 rad for it for a while until I order a separate 240 Phoenix rad locally. EK does have the blocks in stock in the clearance sale for like, €75, but not the rads. I can get one locally tho.


----------



## kithylin

Imprezzion said:


> The Accelero can maintain around 50-55c but lower will require a waterblock. I will get a Phoenix compatible block so I can use my 280 rad for it for a while until I order a separate 240 Phoenix rad locally. EK does have the blocks in stock in the clearance sale for like, €75 but not the rads. I can get one locally tho.


Some people in this thread have reported they were lucky enough to get 1080 Ti's to hold sustained clocks under gaming load of 2100-2150 MHz @ 50-55°C, but it's not common. At those temps you're probably going to start out around 2100 but throttle down to around 2063, 2050, or 2026 when gaming as the card heats up. Even with a shunt mod these cards reduce clocks when they heat up. It's just how they're designed.
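That step-down behavior is Pascal's GPU Boost dropping clocks in fixed ~13 MHz bins as the core warms up. A toy model of it (the bin size is real for Pascal; the temperature thresholds here are illustrative guesses, since NVIDIA doesn't document the actual curve):

```python
# Rough sketch of how Pascal's GPU Boost sheds clocks in fixed steps
# ("bins") as temperature rises. The ~13 MHz bin size matches Pascal;
# the "one bin per 5C above 40C" thresholds are illustrative guesses,
# not NVIDIA's actual (undocumented) boost curve.

BIN_MHZ = 13  # approximate Pascal boost-bin granularity

def throttled_clock(start_mhz: int, temp_c: float) -> int:
    """Estimate sustained clock: drop one bin per ~5C above ~40C."""
    if temp_c <= 40:
        return start_mhz
    bins_dropped = int((temp_c - 40) // 5) + 1
    return start_mhz - bins_dropped * BIN_MHZ

for t in (35, 45, 55, 65):
    print(f"{t}C -> ~{throttled_clock(2100, t)} MHz")
```

With these made-up thresholds, a card starting at 2100 MHz lands in the high-2040s by the mid-50s °C, which is roughly the pattern people report.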


----------



## Imprezzion

kithylin said:


> Some people in this thread have reported they were lucky enough to get 1080 Ti's to hold sustained clocks under gaming load of 2100 - 2150 Mhz @ 50-55c but it's not common. At those temps you're probably going to start out around 2100 but throttle down to around 2063 - 2050 or 2026 when gaming and the card heats up. Even with a shunt mod these cards reduce clocks when they heat up. It's just how they're designed.


That's what makes my 2080 a bad clocker as well. It can sustain 2100 @ 1.093v, but to hold that it has to start out at 2150, which is so far from stable that I'm forced to run 2100 max, which drops to 2055 when warmed up.

I'll see if the seller wants to sell that Inno3D reference for like, €400 and get a Phoenix block and 240 rad+pump. That should do for sub 50c.


----------



## kithylin

Imprezzion said:


> That's what makes my 2080 a bad clocker as well. It can sustain 2100 @ 1.093 but starts out at 2150 when I want that and that is so far from stable I'm forced to run 2100 max which is 2055 when warmed up.
> 
> I'll see if the seller wants to sell that Inno3D reference for like, €400 and get a Phoenix block and 240 rad+pump. That should do for sub 50c.


Just for comparison so you know: I have a large water loop with 4 radiators. Originally I had the loop configured with 960mm of rad space total (360 + 240 + 240 + 120). With that configuration and a pretty crazy system in the loop (i5-3570K @ 5.2 GHz, water cooled DDR3 @ 1.8v @ 2500 MHz, the 32-phase VRM on the Z77X board I had, and my 1080 Ti), I was averaging around 45-48°C, sometimes up to 52°C on the video card, and I could only get my 1080 Ti stable @ 2012 MHz core with a locked / power limited bios @ 1.093v GPU voltage.

Now I've changed the loop. I switched it to just a single Ryzen 5 2600 @ 4.0+4.2 GHz and the video card. I upgraded one of the 240's to a double-core 240 from Hardware Labs (an 80mm thick 480mm rad that only occupies 240mm of length), so now the loop has 1200mm of rad space (480 + 360 + 240 + 120) with only the 1080 Ti and the 6-core Ryzen in it. I'm able to sustain max temps of 36°C on the 1080 Ti under gaming load worst case, typically 30-34°C depending on ambient (it's winter here, Texas, USA), and now I can get my 1080 Ti stable at 2075 MHz core, still at 1.093v.

Because of my low temps my clocks almost never fluctuate when it's 100% loaded. I game with vsync on, though, so if it drops to a lower FPS then my GPU's clocks drop too; that's expected. I was getting clock reduction due to power limits in games and benchmarks running 4K tests on the stock bios, so I switched to the FTW3 bios and now I'm not hitting power limits @ 2075 MHz anymore.


----------



## dangerSK

kithylin said:


> Some people in this thread have reported they were lucky enough to get 1080 Ti's to hold sustained clocks under gaming load of 2100 - 2150 Mhz @ 50-55c but it's not common. At those temps you're probably going to start out around 2100 but throttle down to around 2063 - 2050 or 2026 when gaming and the card heats up. Even with a shunt mod these cards reduce clocks when they heat up. It's just how they're designed.


My 1080 Ti with a waterblock (39°C under load after an hour) can barely do 2088 MHz  So yeah, I would like a core that can sustain 2100 MHz+. And no, I'm not bound by the power limit.


----------



## Imprezzion

I have had like, 4 regular 1080's and the worst one also barely did 2000 stable, but the best card I had, an ASUS Poseidon.. oops, that was a 980 Ti.. an Inno3D X4, did 2130 just fine.. lottery is a big thing with these cards.

I'll settle for whatever it will do tho, as it's just a temporary card until RTX 3xxx launches. All I play now is BDO, Borderlands 3 and a bit of BFV now and then. It'll do fine on a 1080 Ti for now.


----------



## lilchronic

How much should i be selling a 1080ti with waterblock for ?
https://www.overclock.net/forum/14779-video/1741632-fs-evga-gtx-1080ti-ekfc-titan-x-waterblock.html


----------



## jura11

dangerSK said:


> My 1080ti with waterblock (39C in load after hour) can barely go 2088mhz  So yea i would like core that can sustain 2100mhz+. And no im not bound by power limit.


Hi there 

I can say for myself my EVGA GTX 1080 Ti has been a great OC'er, still running strong at 2113-2139 MHz at 1.07v on the stock BIOS, and temperatures stay under 40°C at those clocks in any games or rendering

Got two GTX 1080's, a Manli GTX 1080 which is running 2164 MHz and an EVGA GTX 1080 with a 2113 MHz OC. Tested them in SLI a few weeks ago as well with no issues; they're rock solid at 2113 MHz in most benchmarks and games

But I agree with the silicon lottery thing. I built a few PCs with the GTX 1080 or GTX 1080 Ti and some GTX 1080 Ti's would do 2088 MHz max, like a Gigabyte GTX 1080 Ti Aorus Extreme; an MSI GTX 1080 Ti Armor OC would do 2075 MHz max; then I tried a few Black Editions of the GTX 1080 Ti which wouldn't pass 2075 MHz in the best case or would run below 2050 MHz and less, then the SC2

Hope this helps 

Thanks, Jura


----------



## Imprezzion

Well, I got a great deal on an MSI Armor 1080 Ti, which uses the Gaming X PCB but has a terrible GTX 1070-class cooler strapped to it. The good thing is, EK has blocks (even Phoenix) for the 1080 Ti TF6 a.k.a. Gaming X, so I can just waterblock this Armor as it has the exact same VRM and such as a Gaming X.

I still wonder whether I should just Accelero Xtreme IV it, use a Kraken G12 with a cheap AIO, or go full EK Phoenix fullcover block with a 240 rad on it.. The EK combo is expensive as hell but it would be fun tho..


----------



## The Pook

Imprezzion said:


> Well, i got a great deal on a MSI Armor 1080 Ti which uses the Gaming X PCB but has a terrible GTX1070 cooler strapped to it. The good thing is, EK has blocks (even Phoenix) for the 1080Ti TF6 a.k.a. Gaming X so i can just waterblock this Armor as it has the exact same VRM and such as a Gaming X.
> 
> I still wonder whether i should just Accelero Xtreme IV it, use a Kraken G12 with a cheap AIO or just go full EK Phoenix fullcover block with a 240 rad on it.. The EK combo is expensive as hell but it would be fun tho..



I've been happy with my Raijintek Morpheus II  

the MSI Gaming heatsink isn't terrible, but I still dropped ~20°C at quieter fan speeds/less noise and ~25°C at similar fan speeds/similar noise levels.

The G12 isn't a bad route if you already have a spare AIO laying around that'd work with the kit; otherwise it's pretty expensive for (IMO) something less elegant. The Accelero Xtreme IV is nice and cheap; the Raijintek Morpheus II is more expensive and doesn't come with fans. 

If you've got money for a loop and want to do a loop then do that, but it wouldn't win any bang-for-buck cooling awards.


----------



## Imprezzion

Well, I don't have a fitting AIO, so that kinda rules out the G12 then. It's also kinda ugly, yeah.


I have had pretty great experiences with Accelero Xtreme coolers; I had loads of them on many cards. However, the baseplate for the VRAM and VRM is pretty bad on the Armor and I really dislike those stick-on heatsinks..

I think I'm just going to EK Phoenix this thing. If it clocks well, that is.


----------



## dangerSK

jura11 said:


> Hi there
> 
> I can say for myself my EVGA GTX1080Ti has been great OC'er, still running strong with 2113-2139MHz at 1.07v at stock BIOS and temperatures are still under 40°C with such clocks at any games or rendering
> 
> Got two GTX1080's, Manli GTX1080 which is running 2164MHz and EVGA GTX1080 with 2113MHz OC and tested them as well with SLI few weeks ago and no issues, they're rock solid at 2113MHz in most of the benchmarks or games and no issues
> 
> But agree with silicone lottery thing, I built few PC with GTX1080 or GTX1080Ti and some GTX1080Ti would do 2088MHz as max like Gigabyte GTX1080Ti Aorus Extreme, MSI GTX1080Ti Armor OC would do 2075MHz as max, then I tried few Black Edition of GTX1080Ti which wouldn't pass 2075MHz in best case or would run at bellow 2050MHz and less, then SC2
> 
> Hope this helps
> 
> Thanks, Jura


Yeah, good to know. The thing is I have a 1080 Ti OC LAB, but I'm not saying it's a bad sample yet as I've not run that card on LN2. It might be a bad ambient sample but a good scaler on subzero


----------



## The Pook

Imprezzion said:


> Well, I don't have a fitting AIO so that kinda rules out the G12 then. It's also kinda ugly yeah.
> 
> 
> I have had pretty great experiences with Accelere Xtreme coolers, I had loads of them on many cards however, the baseplate for VRAM and VRM is pretty bad on the Armor and I really dislike those stick on heatsinks..
> 
> I think I'm just going to EK Phoenix this thing. If it clocks well that is.



does your card not have a "front plate"? 

the non-X Gaming card has the plate over the VRAM but not over the VRM, so I had to add VRM heatsinks but not for the VRAM.


----------



## Imprezzion

It does, but it only half covers the VRAM (4 chips aren't covered at all) and it isn't very thick.. 

I found a GTX 460 in my closet that I was going to use for the previous bench thing here, but it died during delidding. It happens to have an Accelero Xtreme III on it. I'll bolt that on the card when I get it in 2 days or so.


----------



## jura11

dangerSK said:


> Yea good to know, the thing is i have 1080 Ti OC LAB but Im not saying its bad sample yet as Ive not run that card on LN2, might be a bad ambient sample but good scaler on subzero


Hi there 

I would have thought the Galax GTX 1080 Ti OC LAB would be a bit better under water, but yes, I agree it may do better under LN2. I can't comment on LN2 too much because I've never tried or run my GPUs under LN2 or sub-zero

My current Asus RTX 2080 Ti Strix will do 2160-2175 MHz (+100 MHz or +115 MHz on core) under water with the Asus Matrix BIOS

Hope this helps 

Thanks, Jura


----------



## Imprezzion

Just got my 1080 Ti Armor. 
The stock cooler is just as bad as the reviews say lol.

Luckily I had the foresight to order an Accelero IV for it. 

Put the Accelero on with the backplate and thermal pads, and left the front plate on with a bunch of extra heatsinks on the VRAM side (as it doesn't fully cover it) and a whole bunch of extra heatsinks on the VRM side just.. because I can.. and I have a whole roll of Sekisui thermal tape.. 

It runs at 53-54°C now with 1.093v and maxed power limits. Amazing. (This is with the Accelero at 100%, but even that is inaudible at 2000 RPM.)

The card does clock pretty well. I finished a whole run of the Maliwan Takedown in Borderlands 3 at 1080p 125% scaling and Badass graphics minus motion blur at 2101-2114 MHz core, but it crashed soon after. 
I think this card would do 2114 MHz pretty easily if it had a bit more voltage. Temperature matters massively with this card. With the stock cooler at 80°C, 2076 MHz crashed within 5 seconds.. 

I'll play around with the clocks and voltages for a bit lol. Maybe with 1.062-1.050v it might actually do better, as that keeps it way cooler, around 48-51°C, with 10% less power draw. 
I'll try 2088-2076 on stock volts. That should be a nice balance between clocks, power draw and temps. 

After that, memory time 

EDIT:

Oh yeah, she is a happy little card at 1.062v.. Much better! See screenshot. Well, time to push the memory a bit.

EDIT2: Well, that was fun. I turned down the core to 2012 MHz to make sure it wasn't unstable, smashed the memory slider as high as it'll go, +1000 @ 6500 MHz, and it takes it just fine.. FPS went up massively as well lol.. As did power draw.. holy hell. Upping the memory by +1000 made it draw almost 12% more power.. not at the 117% limit yet at stock voltage, but at 1.093v it is hitting it. 

I'm not seeing any benefit in clocks between as low as 1.043v (curve locked) and 1.093v, maybe 20 MHz or so, but it gets way hotter and power throttles. Plus, 2050 core with 6500 memory gives way more FPS gains than say, 2076 MHz with 5700 memory.

I'll let it run for a few hours AFKing a game on 2050/6500 and see if it can do that without crashes.
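For anyone wondering why the memory OC helps so much: the 1080 Ti's 352-bit GDDR5X bus means every Afterburner MHz is worth a lot of bandwidth. Quick math (the bus width is the card's real spec; the doubling factor is standard GDDR5X data-rate arithmetic):

```python
# Peak memory bandwidth for the OC above. The 1080 Ti has a 352-bit
# GDDR5X bus, and GDDR5X is double data rate, so the "6500 MHz" shown
# in Afterburner corresponds to 13 Gbps effective per pin.

def bandwidth_gbs(bus_bits: int, clock_mhz: float) -> float:
    """Peak bandwidth in GB/s: pins * effective Gbps per pin / 8 bits."""
    effective_gbps_per_pin = clock_mhz * 2 / 1000  # double data rate
    return bus_bits * effective_gbps_per_pin / 8

stock = bandwidth_gbs(352, 5500)  # stock 11 Gbps -> ~484 GB/s
oced  = bandwidth_gbs(352, 6500)  # OC'd 13 Gbps -> ~572 GB/s
print(f"stock {stock:.0f} GB/s -> OC {oced:.0f} GB/s (+{oced / stock - 1:.0%})")
```

That's roughly an 18% bandwidth jump, which lines up with why a modest core clock plus maxed memory beats a higher core with stock memory.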


----------



## Zfast4y0u

Imprezzion said:


> Just got my 1080 Ti Armor.
> The stock cooler is just as bad as the reviews say lol.
> 
> Luckily i had the foresight to order a Accelero IV for it.
> 
> Put the Accelero on with the backplate, thermal pads, and left the front plate on with a bunch of extra heatsinks on the VRAM side as it doesn't fully cover it and just a whole bunch of extra heatsinks on the VRM side just.. because i can.. and have a whole roll of Sekisui thermal tape..
> 
> It runs at 53-54c now with 1.093v and maxed power limits. Amazing. (This is with the Accelero at 100% but still, that is inaudible at 2000RPM).
> 
> The card does clock pretty well. I finished a whole run of Maliwan Takedown on Borderlands 3 with 1080p 125% scaling and Badass graphics - motion blur on 2101-2114Mhz core but it crashed soon after.
> I think this card will do 2114Mhz pretty easily if it had a bit more voltage to do it. Temperature matters massively with this card. With the stock cooler at 80c 2076Mhz immediatly crashed within 5 seconds..
> 
> I'll play around with the clocks and voltages for a bit lol. Maybe with 1.062-1.050v it might actually do better as it keeps it way cooler around 48-51c and with 10% less power draw.
> I'll try 2088-2076 on stock volts. That should be a nice balance between clocks and power draw and temps.
> 
> After that, memory time
> 
> EDIT:
> 
> Oh yeah, she is a happy little card at 1.062v.. Much better! See screenshot. Well, time to push the memory a bit.
> 
> EDIT2: Well, that was fun , i turned down the core to 2012Mhz to make sure it wasn't unstable, smashed the memory slider as high as it'll go, +1000 @ 6500Mhz, well, it takes it just fine.. FPS went up massively as well lol.. As did power draw.. holy hell. Upping memory to +1000 made it draw almost 12% more power.. not at the limit of 117% yet at stock voltage but at 1.093v it is hitting it.
> 
> I'm not seeing any benefit in clocks between as low a 1.043v (curve locked) and 1.093v, maybe 20Mhz orso but it gets way hotter and power throttles. Plus, 2050 core with 6500 memory gives way more FPS gains than say, 2076Mhz with 5700 memory.
> 
> I'll let it run for a few hours AFKing a game on 2050/6500 and see if it can do that without crashes.


From what I heard, the Armor has a good PCB and just a ****ty cooler slapped on top of it. I assume the point of that card was either to water cool it or to use it for LN2 benching, probably the first.


----------



## Zfast4y0u

kithylin said:


> My 1080 Ti (EVGA GTX 1080 Ti SC Black Edition) is reference PCB and with XOC Bios I had it sucking down 650 watts at one point in a certain game in a certain situation and it survived just fine. Still works.


you went a lil bit out of spec there buddy, glad to hear ur 8-pins are still alive xD




Imprezzion said:


> I'm grabbing a 1080 Ti second hand and ditching my RTX 2080 now that prices are still high lol. I tried a 5700 XT, realized in 5 minutes of gaming why I never buy AMD, and send it back lol.
> 
> 1080 Ti's can be had very cheap secondhand and I will not run stock cooling. So far I found one with a Accelero Xtreme IV and backplate already mounted to it. I will see if I can buy that one for a good offer and run that or get a EK Phoenix block for it. I have a Phoenix on my CPU already so..
> 
> I'm curious whether a Inno3D Twin x2 VRM / VRAM plate under a Accelero IV can keep VRM temps low enough to run a shunt mod on it lol. It's a reference PCB and I don't know how beefy the VRM is in reference. With the EK Phoenix block it wouldn't be a problem but those are an extra like, €75 and I need a second rad. No way my single 280 can cool the GPU and the 9900K @ 5.1 hehe.



You ain't the first one that ditched a 5700 XT; people are switching from it to the 2070 Super. From what I can tell, the drivers are still a **** show and a big drawback.

btw, I run 1080 Ti SLI and an 8700K atm on a single 360mm x 60mm rad only xddd. If I can do it, you can too! Got a second rad in the loop, but no fans on it, so it's kinda useless *working to fix this mess I created*


----------



## Imprezzion

Zfast4y0u said:


> how much i heard armor has good PCB and just ****y cooler slapped on top of it, i assume point of that card was to either water cool it, or use for ln2 benching, probably the first.


Well, the Armor just uses an identical MSI Gaming X PCB minus the LED controller. 
Funny thing is, someone wants to trade a 1080 Ti Gaming X plus a good amount of cash for my 2080 as well. Might just do it so I have 2 identical PCBs, keep the better one of the 2 with the good baseplate and even an OEM MSI backplate, and sell the other one again.


----------



## kithylin

Zfast4y0u said:


> you went a lil bit out of spec there buddy, glad to hear ur 8-pins are still alive xD


Other people have gone way above what I did with LN2 and the XOC bios on these cards. I was only pushing 2126 MHz on the core. The legendary K|NGP|N has several scores logged on hwbot for a single 1080 Ti @ 2500, 2600, and 2700 MHz core speed. I'm sure those went well beyond 650 watts for a single card. Also, I remember Buildzoid telling us repeatedly in his teardown videos that the 8-pin connector for graphics cards alone is realistically capable of 350-400 watts, and the 6-pin around 200, without issues.
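Putting numbers on that: a rough comparison of the PCIe CEM spec ratings against the "realistic" figures quoted above, for a typical dual-8-pin 1080 Ti (the realistic values are Buildzoid's estimates, not a spec):

```python
# Power budget for a graphics card: PCIe CEM spec ratings vs. the
# "realistic" connector capacities quoted above. SPEC values come from
# the PCIe CEM standard; REALISTIC uses Buildzoid's estimates
# (8-pin taken as the 375 W midpoint of his 350-400 W range).

SPEC      = {"slot": 75, "6-pin": 75,  "8-pin": 150}
REALISTIC = {"slot": 75, "6-pin": 200, "8-pin": 375}

def budget(connectors, table):
    """Sum the rated capacity of each power input in watts."""
    return sum(table[c] for c in connectors)

card = ["slot", "8-pin", "8-pin"]  # typical 1080 Ti power layout
print("spec:     ", budget(card, SPEC), "W")
print("realistic:", budget(card, REALISTIC), "W")
```

By spec that's 375 W; by the realistic estimates it's 825 W, which is why a 650 W spike didn't cook the connectors.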


----------



## Zfast4y0u

kithylin said:


> you went a lil bit out of spec there buddy, glad to hear ur 8-pins are still alive xD
> 
> Other people have gone way above what I did with LN2 and XOC bios on these cards. I was only pushing 2126 Mhz on the core. The legendary K|NGP|N has several scores logged on hwbot for a single 1080 Ti @ 2500, 2600, and 2700 mhz core speed. I'm sure those went well beyond 650 watts for a single card. Also I remember Buildzoid telling us repeatedly in his teardown videos that the 8 pin connector for graphics cards alone is realistically capable of 350 - 400 watts, and the 6-pin around 200 without issues.


It's all about heat buildup; short LN2 benches couldn't do much harm, I guess. jonnyGURU, for example, was talking in one video about how some mining cards had their 8-pins melt from constant full load as heat rose over time. But who knows how long those cards were screaming for mercy; it's not your typical usage scenario, lol.


----------



## kithylin

Zfast4y0u said:


> its all about heat build up, not long ln2 benches couldnt harm i guess, johnny gury for example was talking in one video how some mining cards had their 8 pins melted cause of constant full load and over time heat rise so much, it melted em. but who knows how long those cards were screaming for mercy, its not ur typical usage scenario, lol.


Everyone knows coin mining kills video cards in 2 years or less anyway; that's pretty much common knowledge by now. But yeah, in my experience it was just brief: a few moments looking out over a wall at the horizon in Project CARS (the first one) with vsync off, curious like "Hey, how much power does it pull if I do this?".. watching a monitoring program on the other screen going "Oh, that's a lot!" and alt-F4'ing the game instantly. It survived those few seconds just fine.


----------



## Imprezzion

Well, I settled on a definitive OC for my Armor with an Accelero IV on it.

I'm running 2063-2050 core depending on temps and power draw with a maxed power limit, at stock voltage of 1.062-1.050v. 
Memory is as high as Afterburner allows, +1000, a.k.a. 6500 MHz.

Power sits in the high-90% range usually and can peak as high as 110%, but it has 117% available, so no throttling there.
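For reference, those power-limit percentages translate to watts like this, assuming a 250 W base power limit for this BIOS (an assumption; check your own card's limits in GPU-Z or similar):

```python
# Converting Afterburner's power-limit percentages into watts.
# BASE_TDP_W = 250 is an assumed base power limit for this BIOS,
# not a verified spec; substitute your own card's value.

BASE_TDP_W = 250  # assumed base power limit in watts

def pct_to_watts(pct: float) -> float:
    """Afterburner power-limit percent -> absolute board power in watts."""
    return BASE_TDP_W * pct / 100

for pct in (90, 100, 110, 117):
    print(f"{pct:>3}% -> {pct_to_watts(pct):.1f} W")
```

So the 117% ceiling on a 250 W base works out to roughly 292 W of headroom.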

Temps sit in the low-to-mid 50°C range under any load with the Accelero at 100%. It's still seriously quiet at that level. I could let it run at like 40% and barely hit 60°C, but that causes it to drop another clock bin under load. It won't run 2076/2088 long enough on 1.062v to be stable in lighter loads that way. 

It will actually do 2100 MHz just fine on 1.093v, but then the power draw gets too high and it power limits without flashing a different BIOS, and I want to keep the BIOS stock on this one for now.

I might be getting another MSI Gaming X soon, as someone wants to trade his for my RTX 2080 plus cash, and I kinda wanna bin that one to see if it clocks any better, but I doubt it. This is already very impressive lol.


----------



## TwilightRavens

So I’m probably going to be snagging a 1080 Ti here in a month or so to throw in my new Zen 2 build, and I wanted to ask: which air-cooled partner card has the best thermals? I’ve read it was the Strix, but are there any others anyone can chime in on? I have an EVGA SC 1080 and the thermals are pretty good, but I also know the 1080 Ti runs quite a bit hotter. Or would it be better just to grab another 1080 and SLI it? I know SLI is pretty much dead now, but hell, if it’s a viable option I’d consider it.


----------



## Imprezzion

The Inno3D X3 and X4 (if you can live with the noise of the 4th small fan) are pretty much the best coolers around temp-wise.

The MSI Gaming X Trio is also pretty amazing, as is the Strix, but the Strix cannot be easily overvolted or modded as it uses proprietary controllers.

Otherwise, just get any old cheap reference-based card and slap an Accelero IV on it like I did. Nothing will beat that in terms of air temps and noise.


----------



## TwilightRavens

Imprezzion said:


> Inno3D X3 and X4 (if you can live with the noise of the 4th small fan) are pretty much the best coolers around temp wise.
> 
> 
> 
> The MSI Gaming X Trio is also pretty amazing as is the Strix but the Strix cannot be easily overvolted or modded as it uses proprietary controllers.
> 
> 
> 
> Otherwise, just get any old cheap reference based card and slap a Accelero IV on it like I did. Nothing will beat that in terms of air temps and noise.




I might try one of the EVGA Hybrid models, but I’d really like to swap the 120mm AIO for a 240mm AIO, and I haven’t read of many people doing that. I saw one post where all a guy had to do was swap the coldplate from the GPU unit onto the new 240 AIO. I guess I could go the Accelero route or the Kraken G12 route, but meh, I'm pretty sure my 3800X will dump enough heat into the case as is, and I'm not sure having the hybrid dump more would be smart. I’ll definitely see if I can find the Inno3D card when I go looking for one.

And yeah, I know the Accelero works wonders, but it's not exactly the prettiest solution out there, so I’d like to avoid it if possible. I won’t rule the option out completely though.


----------



## Imprezzion

Oh yeah, it ain't pretty, and I put it on a non-reference Armor, which is the much wider Gaming X PCB, so I literally had to zip-tie the backplate on. But it's just a temporary card till RTX 3xxx hits and I can revert it to stock with no permanent mods done to it (yet..).

If ya want a 240mm solution with the ease of an AIO, look at EK's website for the clearance sale on GTX 1080 Ti "Phoenix" blocks with the pre-filled QDC fittings and get a 240mm Phoenix rad combo with it.

It isn't cheap, but it looks great and performance is almost the same as a full custom loop, as the block itself is identical to a regular full-cover block.


----------



## TwilightRavens

Imprezzion said:


> Oh yeah it ain't pretty and I put it on a non-reference Armor which is the much wider Gaming X PCB so I literally had to zip-tie the backplate on but it's just a temporary card till RTX3xxx hits and I can revert it to stock with no permanent mods done to it (yet..).
> 
> 
> 
> If ya want a 240mm solution with the ease of a AIO look at EK's website for a clearance sale on GTX 1080 Ti "Phoenix" blocks with the pre-filles QDC fittings and get a 240mm Phoenix rad combo with it.
> 
> 
> 
> Isn't cheap but looks great and performance is almost the same as a full cover custom loop as the block itself is identical to a regular full cover block.




Hmmm I’ll definitely look into that, if it can be used as like a gpu only loop for now, but in the future I may put the whole system in a loop.


----------



## Imprezzion

TwilightRavens said:


> Hmmm I’ll definitely look into that, if it can be used as like a gpu only loop for now, but in the future I may put the whole system in a loop.


Future will be difficult. The Phoenix line is EOL, which is why it's on a super cheap clearance sale now at EK and local retailers that still have stock. I'm in the EU and the blocks are about €76 for a reference card, and they have some AIB blocks as well for a few € more. The rads are sold out in the 240 version tho. Might need to source that from a retailer or online shop.

I fancy a bit of YOLO. Time to play with the XOC BIOS at 1.2v.. hehe..

If for some reason it goes pop, well I haven't sold my 2080 yet so...

EDIT: Oh god.. this thing LOVES voltage lol. Here's half an hour of Borderlands 3 on the 1.200v XOC BIOS with 2152 MHz core. Had the memory on a pretty basic OC, nowhere near what it can do (~6500 MHz and maybe even more).
I'm kinda scared I will absolutely blow the VRM to the moon and beyond with this load tho. It is hitting around 350w average load, peaking at 380w ish. 

Would it survive that? lol.

EDIT2: It needs every bit of 1.200v for 2152 MHz. Dropping a couple of voltage bins to 1.180 ish crashes almost instantly. The card hit almost 70°C core in Division 2, so I can only imagine the VRM temps at a constant 370w load @ 1.200v.. 

I'm going a little easier on the card, as it is well known that GTX 10xx hates high temps when overclocking, so I'm testing 2126 MHz @ 1.162v now. It ran for a good while on 1.143v but crashed Borderlands 3 after about 10 minutes, so I upped it a bit; let's let it run AFK for a while. It's already looking way better lol. 64°C so far and temps equalized nicely. It's sitting around 340-355w now according to HWiNFO.


----------



## kithylin

Imprezzion said:


> EDIT: Oh god.. this thing LOVES voltage lol. Here's half an hour of Borderlands 3 on 1.200v XOC BIOS with 2152Mhz core. Had memory on a pretty basic OC, nowhere near what it can do (~6500Mhz and maybe even more).
> I'm kinda scared i will absolutely blow the VRM to the moon and beyond with this load tho. It is hitting around 350w average load peaking at 380w ish.


Do remember that while it won't kill it instantly, running it overvolted like that will seriously degrade the life of the card. I speak from experience with my 1080 Ti.


----------



## Imprezzion

kithylin said:


> Do remember that while it won't kill it instantly, that will seriously degrade the life of the card running it over volted like that. I speak from experience with my 1080 Ti.


How high did you have yours? And at what temps? 

It's just a temp card until RTX 3xxx launches, it was pretty cheap, and it's been quite a while since I actually killed any hardware with overclocking. I used to do custom BIOS and voltage / power hardmods all the time up through GTX 7xx and never really broke anything, except a 780 whose VRAM died somehow and only shows green and purple lines now.

And if it does degrade badly or die, well, another one for the wall decorations I guess, and I'll just get another cheap secondhand one. The Accelero IV will fit on pretty much anything anyway.


----------



## kithylin

Imprezzion said:


> How high did you have yours? And at what temps?


1.200v @ 2126 MHz core @ 50-60°C. The XOC bios will also reset the video driver @ 63-65°C.

That was on an older setup, back when I only had a single 360mm radiator. But after running it like that for about 14-16 months, my card will no longer run at all with the XOC bios on it. Anything 3D-related: games, benchmarks, none of it. Everything dumps straight to the desktop in an instant crash with the XOC bios. The only way I can still game on it is by using a normal "locked" bios; originally stock after XOC, then I switched to the FTW3 bios and it's fine so far.


----------



## ThrashZone

kithylin said:


> 1.200v @ 2126 Mhz core speed @ 50-60c temps. XOC bios will reset the video driver @ 63-65c too.
> 
> That was an older setup for me now back when I only had a single 360mm radiator. But after running it about 14-16 months like that my card will no longer run at all with the XOC bios on it. Anything "3D Related": Games, Benchmarks, none of it. Everything dumps it straight to desktop instant crash with XOC bios. The only way I can still game on it is by using a normal "locked" bios. Originally stock after XOC then *I switched it to the FTW3 bios and it's fine now so far*.


Hi,
FTW3 vbios is a very good choice


----------



## 8051

I have a few games that have no problems w/ my MSI 1080 Ti Duke at 1.2v and 2152 MHz, but I put my powerful, shrouded fans on the stock heatsink. As long as the GPU stays below about 63°C it doesn't seem problematic. At most I only spend three hours a day gaming though.

Thermally coupling the backplate w/ the PCB seems to make a difference too.


----------



## kithylin

8051 said:


> I have a few games that have no problems w/my MSI 1080Ti Duke at 1.2V and 2152Mhz. but I put my powerful, shrouded fans on the stock heatsink. As long as the GPU stays below about 63°C it doesn't seem problematic. At most I only spend a max of three hours a day gaming though.
> 
> Thermally coupling the backplate w/the PCB seems to make a difference too.


Do remember that on typical air-cooled cards your VRMs will usually run 30°C+ hotter than your core temps, possibly even higher with the XOC bios. You should not be using it on air-cooled cards. Other users in the past have reported sudden death of their 1080 Ti's from using it on an air-cooled card. The XOC bios should be used exclusively under full-cover custom water or better.


----------



## Imprezzion

8051 said:


> I have a few games that have no problems with my MSI 1080Ti Duke at 1.2V and 2152MHz, but I put my powerful shrouded fans on the stock heatsink. As long as the GPU stays below about 63°C it doesn't seem problematic. At most I spend three hours a day gaming, though.
> 
> Thermally coupling the backplate w/the PCB seems to make a difference too.


I have the massive Accelero IV backplate and thermal pads on it. Heck I might even zip tie a low rpm 120mm sucking air through it and blowing it towards my top outtake fans on it for better dissipation of the heat.

I got it perfectly stable now on XOC BIOS at 2126Mhz 1.181v 6500Mhz memory. Runs at 60-65c but I purposely let it get hotter to see if it would crash, nope. Zero instability even at 70-72c. My card does have the full MSI Gaming X / Duke PCB and VRM so it's not reference and can handle a lot more abuse as that VRM is quite a bit beefier.

So far I tested Borderlands 3, Modern Warfare, Black Desert and The Division for a good 5-6 hours with zero crashes or artifacts.

I'm good for now, we shall see how long it lasts! It's OCN after all and keeping things safe is no fun hehe. I got my B-Die RAM pushing 1.55v as well as it needs it for 4200CL16 so yeah.. YOLO.


----------



## 8051

kithylin said:


> Do remember that on typical air-cooled cards your VRMs will usually run 30°C+ hotter than your core temps, possibly even higher with the XOC bios. You should not be using it on air-cooled cards. Other users in the past have reported sudden death of their 1080 Ti's from using it on an air-cooled card. The XOC bios should be used exclusively under full-cover custom water or better.


The MSI 1080Ti Duke has the same VRM config as the MSI 1080Ti Gaming X: 8 phases. That should keep the VRMs well under the 125°C max temperature, even when overclocked.


----------



## kithylin

8051 said:


> The MSI 1080Ti Duke has the same VRM config as the MSI 1080Ti Gaming X: 8 phases. This should keep the VRM's well under the 125°C max temperature even when overclocked.


Under normal operation with its original bios, yes. But the XOC bios literally has no power limit, and it's possible to push 2x to 3x the design power draw through the VRMs on our cards versus the original bios limits. It's your card and you can do what you want with it, but I'll still remind everyone that the 1080 Ti XOC bios should never be used on any air-cooled card for any reason, no matter what air cooler or fan configuration you have. It's too dangerous for air-cooled cards.
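A rough back-of-envelope for why unlimited power scares people: spread across an 8-phase VRM, per-phase current climbs fast. This is my own sketch with assumed values (core-rail share and voltage are guesses, not measurements from the thread):

```python
# Per-phase current estimate for an 8-phase VRM at a given board power.
# Assumptions (mine, hypothetical): ~90% of board power goes through the
# core rail, at roughly 1.2 V under the XOC bios.
def amps_per_phase(board_watts, phases=8, core_share=0.9, vcore=1.2):
    return board_watts * core_share / vcore / phases

print(round(amps_per_phase(330), 1))  # stock-ish power limit -> 30.9 A/phase
print(round(amps_per_phase(600), 1))  # the kind of draw reported under XOC -> 56.2 A/phase
```

Nearly doubling the current per phase also roughly quadruples resistive losses in each power stage, which is why the "only under water" advice keeps coming up.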


----------



## Imprezzion

I removed the Accelero's fan module, zip-tied two 140mm MF140s onto it, soldered the PWM and RPM wires to an old reset-switch cable, and stuck that on the card's fan controller so the card can control the PWM and read the RPM. The fans' 4-pin connectors, carrying only +12V and ground, went to my motherboard's fan headers in DC control mode at 100%.

Works like a charm. I can control the fans with Afterburner now, and the Accelero has RGB hehe. Also dropped temps in general by about 5°C.

I slightly lowered voltage and clocks just to be sure to 2101Mhz with 1.143v which runs fine. Power draw sits around 340-360w now. 

This is where it will stay for as long as it takes for RTX3xxx to release or until the card dies.


----------



## skupples

ha! epic idea.


----------



## 8051

kithylin said:


> Under normal operation with its original bios, yes. But the XOC bios literally has no power limit, and it's possible to push 2x to 3x the design power draw through the VRMs on our cards versus the original bios limits. It's your card and you can do what you want with it, but I'll still remind everyone that the 1080 Ti XOC bios should never be used on any air-cooled card for any reason, no matter what air cooler or fan configuration you have. It's too dangerous for air-cooled cards.


Three times the power draw? That seems excessive. There's no way my 1080Ti Duke is sucking 900+ watts (the power limit for the stock VBIOS is 330 watts).

I have a screaming 4000 RPM Delta FFB1212EH vaneaxial fan blowing on the part of my card that has the VRMs; that has to count for something.


----------



## kithylin

8051 said:


> Three times the power draw? That seems excessive. There's no way my 1080Ti Duke is sucking 900+ Watts (the power limit for the stock VBIOS is 330 Watts).
> 
> I have a screaming delta 4000 RPM FFB1212EH vaneaxial blowing on the part of my card that has the VRM's that has to count for something.


I briefly saw 600 watts (then got scared, chickened out, and promptly shut the game down) with my 1080 Ti @ 2126 core @ 1.200v on the XOC bios. And I see K|NGP|N on hwbot has a few single 1080 Ti's @ 2600 and 2700MHz. I don't know what power he's pulling with those cards, but I'm sure it's well north of 600 watts.


----------



## Imprezzion

Are you sure it actually drew 600W, or was that a readout fault?
I tried pretty much every game and benchmark I have installed at 1.200v, and the highest I've ever seen it go is 405W, which isn't anything immediately dangerous as it's only about 70W more than "stock".

It did drop the driver once at 1.143v 2101MHz after 2 hours of gaming, so up to 1.150v we go hehe.


----------



## dangerSK

Imprezzion said:


> Are you sure it actually drew 600W, or was that a readout fault?
> I tried pretty much every game and benchmark I have installed at 1.200v, and the highest I've ever seen it go is 405W, which isn't anything immediately dangerous as it's only about 70W more than "stock".
> 
> It did drop the driver once at 1.143v 2101MHz after 2 hours of gaming, so up to 1.150v we go hehe.


Very unlikely. I benched my 1080 Ti OC Lab on the Galax LN2 bios at 1.2 Vcore (true voltage), and I wasn't anywhere near 600W in 3DMark, be it Time Spy or others.


----------



## kithylin

For high power usage on a 1080 Ti, try "Project C.A.R.S." (the first one, not the second) with nvidia DSR rendering at 4K on a 1080p screen (I don't have a 4K screen), max all the graphics settings, turn vsync off, and go around a turn on a race track, but drive the car into a wall and sit there "looking off into the distance" in a corner. It was reading 620-630 watts in all three: GPU-Z, hwinfo64, and the nvidia power reading program; I forget the name of that one at the moment, it's in the nvidia folder.

3DMark is a rather poor representation of power usage for 1080 Ti cards. It pulls much less power than actual game engines running on the card.
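For anyone wanting to log this without a GUI tool, here's a small helper of my own (not a utility mentioned in the thread) that polls `nvidia-smi`. The query fields used are real nvidia-smi fields; the wrapper around them is an assumption about how you'd want to use it:

```python
# Log GPU power draw / temp / clock by polling nvidia-smi once a second
# and tracking the peak power seen. Sketch only; needs an NVIDIA GPU
# and driver installed to actually run watch_peak().
import subprocess
import time

def parse_sample(csv_line):
    """Parse one nvidia-smi CSV line like '345.67 W, 61, 2126 MHz'."""
    power, temp, clock = [f.strip() for f in csv_line.split(",")]
    return float(power.split()[0]), int(temp), int(clock.split()[0])

def read_sample():
    """Query the first GPU for power draw, core temp, and graphics clock."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=power.draw,temperature.gpu,clocks.gr",
         "--format=csv,noheader"],
        text=True)
    return parse_sample(out.strip().splitlines()[0])

def watch_peak(seconds=60):
    """Poll once a second and report the highest power draw seen."""
    peak = 0.0
    for _ in range(seconds):
        watts, temp, mhz = read_sample()
        peak = max(peak, watts)
        print(f"{watts:7.1f} W  {temp} C  {mhz} MHz  (peak {peak:.1f} W)")
        time.sleep(1)
    return peak
```

Run `watch_peak()` while sitting in a heavy scene; GPU-Z and HWiNFO64 read the same driver counter, so the numbers should roughly agree.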


----------



## Imprezzion

I ran BDO Remastered and 4k YouTube and also played a bit of Borderlands 3 on 4K sampling and Modern Warfare for a bit. Highest I've seen it hit in HWiNFO64 is 380w on 2101Mhz 6500Mhz memory 1.150v. Should be fine for now. Temps around the mid 60's for the core.

I should maybe turn on framerate limiter on 120/240 FPS as I use a 120Hz screen.


----------



## monza1412

Imprezzion said:


> I ran BDO Remastered and 4k YouTube and also played a bit of Borderlands 3 on 4K sampling and Modern Warfare for a bit. Highest I've seen it hit in HWiNFO64 is 380w on 2101Mhz 6500Mhz memory 1.150v. Should be fine for now. Temps around the mid 60's for the core.
> 
> I should maybe turn on framerate limiter on 120/240 FPS as I use a 120Hz screen.


Are you still on the stock bios?


----------



## kithylin

monza1412 said:


> Are you still on the stock bios?


Since they stated they are using 1.152v, they must be on the XOC bios. All other BIOSes for the 1080 Ti limit us to 1.093v.


----------



## Imprezzion

Yeah XOC BIOS on a MSI Armor (Gaming X PCB) with a Accelero IV and 2 140mm fans on that. I'll post a pic later


----------



## Zfast4y0u

Imprezzion said:


> Yeah XOC BIOS on a MSI Armor (Gaming X PCB) with a Accelero IV and 2 140mm fans on that. I'll post a pic later


what are ur vrm temps at 380w and that voltage?


----------



## Imprezzion

Zfast4y0u said:


> what are ur vrm temps at 380w and that voltage?


No clue, there's no temp sensor on the VRM on this PCB. 

I have to say, I can still touch the PCB / heatsink / backplate after 4 hours of full load so it isn't getting super hot at all. 

Pics I promised:


----------



## smiteom

Imprezzion said:


> I ran BDO Remastered and 4k YouTube and also played a bit of Borderlands 3 on 4K sampling and Modern Warfare for a bit. Highest I've seen it hit in HWiNFO64 is 380w on 2101Mhz 6500Mhz memory 1.150v. Should be fine for now. Temps around the mid 60's for the core.
> 
> I should maybe turn on framerate limiter on 120/240 FPS as I use a 120Hz screen.


6500mhz! do you have copper heatsinks on the memory?


----------



## Imprezzion

smiteom said:


> 6500mhz! do you have copper heatsinks on the memory?


Nah, just the stock midplate and 4 aluminum heatsinks from Arctic on the 4 RAM chips the stock plate doesn't cover. 

The memory on this thing is a monster. It will actually bench at 6700 but won't game for very long. Scores and FPS do scale up to 6600MHz, so it's not error-correcting too much either.

Like, standing still looking at the floor in Borderlands 3: at 5500MHz it's 148 FPS, at 6000 it's 151, and at 6500 it's 154, and it doesn't fluctuate in a way that would indicate corrections. So it works lol.
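That scaling check is a handy test in itself: GDDR5X error correction retries bad transfers, so when it kicks in, more memory clock stops buying more FPS. A quick hypothetical helper of mine (not from the thread) to mechanize it:

```python
# Detect whether FPS is still scaling with memory clock. If a clock step
# gains no FPS (or loses some), error correction is likely eating the
# added bandwidth and the overclock is past its useful point.
def scaling_ok(samples, min_gain_per_mhz=0.0):
    """samples: list of (mem_mhz, fps) pairs. True if every step up in
    clock still gains more FPS than the (default: zero) threshold."""
    pairs = sorted(samples)
    return all(
        (f2 - f1) > min_gain_per_mhz * (m2 - m1)
        for (m1, f1), (m2, f2) in zip(pairs, pairs[1:])
    )

# Imprezzion's Borderlands 3 numbers: still gaining at every step.
print(scaling_ok([(5500, 148), (6000, 151), (6500, 154)]))  # True
```

A flat or negative step between two clocks would return `False`, i.e. back the memory off.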


----------



## TwilightRavens

So would a EVGA Supernova 750W pq psu be enough for 2 x 1080 ti SLI with a 3800X? Or would that be pushing it?


----------



## mouacyk

That's really pushing it. I had a 660W Platinum Seasonic that was not sufficient for a single 1080 Ti, TDP unlocked, plus an 8700K at 5GHz.

Under a normal gaming load, a 1080 Ti already pushes near 300W.


----------



## Imprezzion

Running a 1080 Ti on the XOC BIOS @ 2063MHz 1.062v pulling 300-340W depending on the game, plus a 9900K @ 5.1GHz core at 1.248v (VR VOUT, not CPU-Z) and 4.8 cache, some B-Die at 1.55v, 12 140mm RGB fans, water cooling and such, on a Seasonic Focus+ Gold 750W, and it handles it just fine.

I would've really thought that 660W Platinum would easily handle the load from your system lol.

But nah, a 750W isn't going to handle that comfortably. Not with two 1080 Ti's. Maybe it will do it if you keep both power limits at 100% with no TDP unlock, but I wouldn't want to run it tbh.


----------



## skupples

TwilightRavens said:


> So would a EVGA Supernova 750W pq psu be enough for 2 x 1080 ti SLI with a 3800X? Or would that be pushing it?


I'd shoot for 1kW, so you have some headroom.

Figure 250-300W per GPU; not sure about AMD's power usage these days. I'd assume it can get pretty high on the big-boy series.

I always assume 300W per item when PSU shopping, but I overclock everything.
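That rule of thumb is easy to write down. A quick calculator of my own (the wattage defaults are assumptions, not figures from the thread): budget ~300 W per overclocked GPU or CPU, plus ~100 W for drives, fans, and the rest:

```python
# Minimum PSU estimate using the "300 W per overclocked item" rule of thumb.
def min_psu_watts(gpus, cpus=1, per_item_watts=300, misc_watts=100):
    return (gpus + cpus) * per_item_watts + misc_watts

# 2x 1080 Ti SLI + one overclocked CPU: the estimate lands right at the
# 1 kW suggested above, well past a 750 W unit.
print(min_psu_watts(gpus=2))  # 1000
```

By this estimate the 750W unit in question would be about 250W short for an overclocked SLI setup, which matches the "really pushing it" verdicts above.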


----------



## Zfast4y0u

TwilightRavens said:


> So would a EVGA Supernova 750W pq psu be enough for 2 x 1080 ti SLI with a 3800X? Or would that be pushing it?


If you overclock both cards and whatnot, it won't be. If you undervolt them, you can get away with the two cards pulling just 300W at the top, so it would be plenty for you; however, the cards would have to run at around 1700MHz and really low voltage, less than 0.800V.

If you have the PSU already, you can get away with it and sacrifice like 20 fps.


----------



## TwilightRavens

skupples said:


> I'd shoot fr 1KW, so you have some head room.
> 
> 
> 
> you figure 250-300w per GPU, not sure about AMD's power usage these days. I'd assume it can get pretty high on the big boy series.
> 
> 
> 
> I always assume 300W per item when PSU shopping, but I overclock everything.


-Edit: (Decided to get the 3900X as it got a price slash)


Yeah, as far as the 3900X goes I'd be relying solely on PBO for overclocking, RAM will probably be run to the bleeding edge (3733MHz with as tight timings as possible), and I might just undervolt the cards for now. I do already have the 750W, but I may upgrade it and put the 750W in my aging Z97 system, since it seems like the 850W I have in it is dying.


----------



## hotrod717

Back in the club. I remember launch day on these, standing in an MC for a couple hours because they got the release time wrong. lol.
Just picked up a super clean Poseidon, probably the best 1080 Ti I've had since that release card. Really surprised how well the cooler works with watercooling on this. Not that it clocks better than any card, just the best of the few I've owned. Disappointed they didn't put a connector or pads for overvolting.


----------



## TwilightRavens

I ended up getting an MSI GTX 1080 Ti Armor OC. I've read mixed reviews, but I should be able to improve on it with some Fujipoly 17W/mK pads and some Kryonaut; I'll have to see how it works out of the box first.


----------



## Imprezzion

TwilightRavens said:


> I ended up getting a MSI GTX 1080 Ti Armor OC, i’ve read mixed reviews but I should be able to optimize upon it with some Fujipoly 17W/mK pads and some Cryonaut, but will have to see if it just works out of the box first.


It's the Gaming X PCB, so the card itself is great and can handle quite a bit of abuse even with the XOC BIOS; the Armor cooler, however, is pathetic. I can run mine on the stock BIOS at 1.081v 2063MHz with Prolimatech PK-3 or Kryonaut at about 70-75°C with the fans at full blast 100%.

I used it with an Accelero IV as well, with the baseplate still installed of course: XOC BIOS, 1.150v 2101MHz barely touched 60°C.

However, as someone needed a short 1080 Ti for a ITX build I traded the Armor for his Lightning X.

Getting the Lightning tomorrow and I'm very curious to see what that beast can do.


----------



## TwilightRavens

Imprezzion said:


> It's a Gaming X PCB so the card itself is great, can handle quite some abuse even with XOC BIOS, however, the Armor cooler is pathetic. I can run mine at stock BIOS on 1.081v 2063Mhz with Prolimatech PK-3 or Kryonaut at about 70-75c with fans on full blast 100%.
> 
> I used it with a Accelero IV as well, with the baseplate still installed of course, XOC BIOS, 1.150v 2101Mhz barely touched 60c.
> 
> However, as someone needed a short 1080 Ti for a ITX build I traded the Armor for his Lightning X.
> 
> Getting the Lightning tomorrow and I'm very curious to see what that beast can do.


Yeah, I know the cooler is GTX 1070 level, but I'm just going to optimize it the best I can for now. I've got some Fujipoly 17W/mK 1mm thermal pads coming as replacements and some Thermal Grizzly Kryonaut, so I'll maximize heat dissipation as much as I can, and that should drop temps enough to work with. Worst case, I'll get a Kraken G12 and a 240mm AIO for it and go from there.


----------



## Imprezzion

I just traded mine for a Lightning X. Bad choice.. that Lightning is a terrible bin. It can't even hold 2076MHz at 1.093v. It looks pretty and it's super cool and quiet, but this bin is worse than my old Armor lol.

The memory is bad as well: my Armor had no issues at all running +1000 (6500MHz), and this one, even with the memory at +50mV, still can't do +800, let alone +1000...

Oh well, it's just a temp card till RTX 3xxx drops.

Another weird thing: I always ran the XOC BIOS on the Armor and it worked perfectly. On the Lightning it does run and allows insane clock speeds (I even benched at 2202MHz 1.200v), but performance is way, way down. It also doesn't heat up at all. It just doesn't play nice with the XOC BIOS I have from TPU.

Are there different versions of the XOC BIOS out there, or any other BIOS that allows 1.200v curve OCs?


----------



## mouacyk

The 1080 Ti Lightning was MSI's red herring. Everyone here at the time knew MSI was just milking the name on the Pascal gen.


----------



## TwilightRavens

mouacyk said:


> 1080 ti lightning were MSI's red herring. everyone one here at the time knew MSI was just milking the name on Pascal gen




Yeah, Lightning cards haven't been superior since the 780 Lightning, and even then there were only ever like five 780 Ti Lightnings because Nvidia was strict about voltage limits. Not sure about the 980 Ti Lightning, but I've always wanted one to play with. The 1080 Ti Lightning is just an overpriced MSI card with really nothing special about it.


----------



## Imprezzion

Well, the cooler, backplate and LEDs are quite cool, and the card's extra monitoring, like VRM and PCB temps, is amazing. They should've just binned the chips better. I feel like they didn't put any effort into binning the GPUs or the memory chips at all.

The card has potential, a lot of potential. IF it had binned chips and 1.200v unlocked on the LN2 BIOS, it would've been one of the best cards available. It still doesn't justify the price, but this one came with the box, warranty, all accessories, and was like €400 in a trade deal.

EDIT: This card really hates temperature, way more than my Armor. It will actually run 2076MHz fine at 1.093v as long as I keep it under 54°C.

I ran it with ~40% fan speed, which sees it sit at about 66-68°C, and it crashes all over the place within minutes of running a game, even at 2050MHz; with ~70% fan speed and 49-51°C temps it runs fine for hours in the same games at 2063/2076MHz.

VRAM can do 6250 (+750), but it does require the Lightning RAM overvolt at +25mV (1.400v).


----------



## LSantos

*1080ti Seahawk X flashed for STRIXOC*

I'm new on the forum. I registered because I saw such great topics and posts about overclocking and flashing. When I bought my 1080 Ti Seahawk, I hated the fact that we couldn't set the fan to 0%... the minimum was 29%. Back then I searched for a way to do it with no luck, but today I remembered to get back to it... and DID IT. :specool: I've now managed to flash the STRIXOC330W.rom BIOS from this thread's OP.

Problem is, I ran Heaven before the flash with no OC whatsoever and got 121 average FPS... and I guess around a 3000 score (can't remember the score exactly). Now I get around 90 FPS average and a 2100 score...

I noticed in Afterburner before the flash, while doing the bench (ULTRA at 2160), that the max GPU clock was around 1939... now it's 1999, yet the results are poorer. I didn't tweak anything, but I expected the stock Strix OC BIOS to be better than the Seahawk X one, no?

Also, I don't know if this is normal because I didn't check before, but the "Video Engine Load" and "Bus Interface Load" sensors don't read any values. Is that normal? See the attached screenshots (note they're at IDLE, that's why the fan is at 0%).

GPU-Z card info http://gpuz.techpowerup.com/20/03/22/883.png
Sensors http://gpuz.techpowerup.com/20/03/22/qhm.png

I could always go back to the original BIOS, as I managed to back it up, or flash another one that allows the fans to turn off, maybe the XOC one.

And before you ask, I've followed every step except for DDU... it didn't seem necessary, as installing the new (same 442.74) drivers removed the old ones first, and everything after was and is stable.

Some help appreciated, thanks.


----------



## TwilightRavens

LSantos said:


> I'm new on the forum. Registered cause I saw such great topics and posts about overclocking and flashing. When I bought my 1080ti seahawk I hated the fact that we couldn't put fan % to 0...minimum was 29%. Back then, I searched for a way to do it with no luck, but today I remembered to get back to it...and DID IT. :specool: Now I've finally managed to flash STRIXOC330W.rom BIOS from this thread OP.
> 
> Problem is I ran Heaven before flash...with no OC whatsoever and got 121 average FPS...and i guess around 3000 score (can't remember the score right). Now I get around 90FPS average and 2100 score….
> 
> I noticed before flash on afterburner while doing the bench (ULTRA at 2160) that the max gpu clock was around 1939...and now is 1999 but results are poorer. I didn't tweak anything but I expected the Stock Strix OC BIOS to be better than the Seahawk X, no?
> 
> Also don't know if this is normal cause I didn't check before but "Video Engine Load" and "Bus interface load" sensors don't read any values. Is this normal? See attached screenshots (note they're on IDLE, thats why fan is at 0%)
> 
> GPU-Z card info http://gpuz.techpowerup.com/20/03/22/883.png
> Sensors http://gpuz.techpowerup.com/20/03/22/qhm.png
> 
> I could always go back to original BIOS as i managed to back it up or FLASH another one that has the same feature of allowing fans to turn off, maybe the XOC one.
> 
> And before you asked I've followed every step except for DDU...didn't seem necessary as the installing new (same 442.74) drivers they removed the old ones first and everything after was and is stable.
> 
> Some help appreciated, thanks.



What’s your memory clock before and after flash?


----------



## LSantos

It stayed the same before and after the flash, 1376MHz, as shown in the GPU-Z screenshot above.

I did a little overclocking on the CPU and GPU and dialed up the power limit, and even then overall performance only rose 5 FPS... I get a score of around 90 FPS at 2160x1440, ULTRA, fullscreen in the Heaven bench.

I have a 28-inch G-Sync monitor, and that could affect framerate by locking it to the refresh rate, but I disabled that in settings.


----------



## TwilightRavens

So my MSI GTX 1080 Ti Armor OC finally got here, and after an alcohol bath, replacing the thermal pads with Fujipoly 17W/mK pads, and replacing the thermal paste with Kryonaut, the thermals aren't bad at all, even with a modest overclock.


----------



## TwilightRavens

Okay, after a couple of days testing the card (MSI GTX 1080 Ti Armor OC), the cooler really isn't that bad in all honesty. I think a lot of the thermal issues come from the poor paste application from the factory (the die was like half covered) and the blue play dough MSI calls thermal pads. I changed both, like I stated in the previous post, along with a ramped-up fan curve, and it really has no issue maintaining a 2GHz boost clock (mostly bouncing between 2025 and 2037MHz) while not going over 63°C. I could play with the voltage curve and probably get more out of it, but in all honesty this is great so far.

One thing to note on the thermal pad application: unlike stock, where MSI (like all other AIBs) puts one whole pad covering a section of VRAM, I cut mine into squares to fit each individual GDDR5X chip. Did it help? I honestly don't know, but I had to do it that way to cover as many of the components on the card as possible before I ran out of pads. The heatsink is pretty pathetic in terms of thickness, but the oversized fans do make up for it, and they aren't inherently loud.

My VRAM does seem to cap out at around +600MHz without touching voltage. I can probably do 625-650, but I'm pretty sure I saw it artifact at times in some games. The core clock does seem to have a bit left in it, though. Draws about 360-380W under full load.


----------



## skupples

Redoing the factory job is always a good idea, if you know what you're doing. Especially when you've got Fuji in stock.


----------



## TwilightRavens

skupples said:


> redoing the factory job is always a good idea, if you know what you're doing. specially when you've got fuji in stock



Yeah, I'm honestly considering getting another pack of it to finish the rest of the spots on the card I wasn't able to replace, and maybe do the 1070 I got from my wife after I upgraded her to my 1080, though the 1070 really doesn't get all that hot even under full load.

But yeah, I highly recommend this to anyone else who wants to drop their temps by a few degrees; it helps even on cards with inadequate cooling solutions. Fujipoly isn't cheap, but it is really good and worth every penny.


----------



## TwilightRavens

After some fine tweaking that's what I can manage, 2088MHz on air mostly sustained. I'm fairly happy with that.


----------



## TwilightRavens

So, question: does anyone know if the Kraken G12 or Arctic Accelero will for sure fit the MSI Armor OC card? I was playing Kerbal Space Program last night and my card got up to 81°C under load. Boost was down to like 1943MHz at 1.000v.


----------



## blrjxzdsb

Does anyone know the thickness of the thermal pads on the Zotac 1080 Ti AMP Extreme?


----------



## blrjxzdsb

I have a Zotac 1080 Ti AMP Extreme but without any thermal pads on it. Looking for the right size.


----------



## TwilightRavens

blrjxzdsb said:


> I have Zotac 1080 Ti AMP Extreme but without any thermal pads on it. Looking for the right size.



Probably 1mm if I had to guess; I replaced the ones on my Armor OC with 1mm pads and that worked. That seems to be the most common thickness on GPUs nowadays.


----------



## TwilightRavens

Did the swap; didn't realize the Armor OC cooler was THAT pathetic. Like, good god, no wonder it was hitting 81°C at full tilt. Card is a thicc azz boi now lol


----------



## yoadknux


TwilightRavens said:


> Did the swap, didn't realize the Armor OC cooler was THAT pathetic, like good god no wonder it was hitting 81C full tilt. Card is a thicc azz boi now lol


For the 1080 Ti, the MSI Armor is identical to the Gaming X in terms of board design; the only differences between them are factory clocks and the cooler. The Gaming X has the thicker Twin Frozr cooler, whereas the Armor has, well, a GTX 1070 cooler. It doesn't cool better than the FE.
What are your new temps under heavy, sustained load?


----------



## TwilightRavens

yoadknux said:


> For the 1080ti, MSI Armor is identical to Gaming X in terms of board design, the only differences between them are factory clocks and cooler. Gaming X has a thicker twin frozr cooler, whereas the Armor is, well, a gtx 1070 cooler. It doesn't cool better than the FE.
> What are your new temps under heavy, sustained load?



Under a quiet fan curve it peaks at 62°C, full blast 55°C, versus 81°C quiet and 72°C full blast on the Armor.


----------



## rajkosto

What bios can be flashed to a Gigabyte 1080 Ti Gaming OC (Black) while retaining use of the DVI, HDMI, and 2x DP ports, keeping fan control, and using all 8 vcore and 2 vmem phases properly? I just want slightly more power, like 330W or 350W.

EDIT: Apparently most of them, but the Gigabyte having 8+6 pin makes the power readings in sensors go wonky, showing only half the % and usage watts.


----------



## TwilightRavens

So either I got a really good bin on my 1080 Ti or I'm just not pushing it hard enough, but with the new cooler (Accelero III) I've managed 2.1GHz sustained for about 2 hours in some games without crashing, and this is on air at 51°C. I'll try running Heaven at 4K for a few hours tonight to see if that can push it hard enough to crash or step down a voltage state. Maybe that's somewhat common for an air-cooled card, but I've heard that's usually water-cooling territory.


----------



## yoadknux

TwilightRavens said:


> So either I got a really good bin on my 1080 ti or I’m just not pushing it hard enough, but with the new cooler (Accelero iii) i’ve managed 2.1GHz sustained for about 2 hours without crashing in some games, and this is on air at 51C. I’ll try running Heaven 4k for a few hours tonight to see if that can push it hard enough to crash or step down a voltage state. Unless that’s somewhat common for an air cooled card, but I’ve heard that’s usually water cooling territory.


At what voltage?
Not trying to bum you out, but everyone at some point brags about a 2.1GHz or even higher OC, and it's almost never truly stable, regardless of cooling.
The disparity arises from many different factors. In short: some people don't stress the card hard enough -> no crash, while others set the card to an absurd overclock at high voltage and the card just downclocks itself -> the card doesn't actually run at the set speed.

For me, there is one single stress test (not benchmark) that can say whether an overclock is stable: FireStrike Ultra. Run the stress test loop (again, not the benchmark), keep MSI Afterburner in the background, and have it monitor clock + voltage + power limit. After the run, check the following:
1) Did the card crash? If it did - NOT STABLE
2) Did the card power limit often? If it did - NOT STABLE

The first criterion is the most important one when people run games; there's a very large difference between Witcher at 1080p and RDR2 at 4K in terms of stability, and you don't want it to crash in any situation. That's what FireStrike Ultra checks.
The second criterion is not important for daily use, but it matters for overclock integrity.
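Those two pass/fail criteria are easy to mechanize from a monitoring log. A hypothetical checker of my own (Afterburner itself exposes no such API here; this just assumes you've exported per-second samples):

```python
# Judge an overclock by the two criteria above: any crash fails outright,
# and hitting the power limiter on more than a small fraction of samples
# fails the "power limited often" check.
def oc_stable(samples, crashed, power_limit_ratio=0.05):
    """samples: list of dicts with a 'power_limited' bool per sample."""
    if crashed:                         # criterion 1: any crash -> NOT STABLE
        return False
    limited = sum(s["power_limited"] for s in samples)
    return limited <= power_limit_ratio * len(samples)  # criterion 2

# A 100-sample run with no crash, limited on only 5% of samples: passes.
run = [{"power_limited": False}] * 95 + [{"power_limited": True}] * 5
print(oc_stable(run, crashed=False))  # True
```

The 5% threshold is my own arbitrary cutoff for "often"; tighten it if you care about overclock integrity for benchmarking rather than daily use.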


----------



## TwilightRavens

yoadknux said:


> At what voltage?
> Not trying to bum you out, but everyone at some point brags about a 2.1GHz or even higher OC, and it's almost never truly stable, regardless of cooling.
> The disparity arises from many different factors. In short: some people don't stress the card hard enough -> no crash, while others set the card to an absurd overclock at high voltage and the card just downclocks itself -> the card doesn't actually run at the set speed.
> 
> For me, there is one single stress test (not benchmark) that can say whether an overclock is stable: FireStrike Ultra. Run the stress test loop (Again, not benchmark), keep MSI Afterburner in the background, and have it monitor clock + voltage + power limit. After the run check the following:
> 1) Did the card crash? If it did - NOT STABLE
> 2) Did the card power limit often? If it did - NOT STABLE
> 
> The first criterion is the most important one when people run games, there's a very large difference between Witcher at 1080p to RDR2 at 4k in terms of stability, and you don't want it to crash in any situation. That's what FireStrike Ultra checks.
> The second criterion is not important for daily use, but important for overclock integrity.



1.093v. It sustains it in RDR2 at 1440p; in the 4 hours I tested last night, the temp never went over about 53-54°C. I also tried 2113MHz and that did cause a crash pretty quickly. Same goes for anything above +600 on the memory; even something like +601 would cause a crash. But I've thrown pretty much everything at 2.1GHz and so far it's fine with no crashes or clock drops, though I also always keep my room relatively cool.

Also tested other games, same thing: it just stays at a full 2.1GHz, no drops or crashes. I also ran Heaven, Valley and Superposition for about 2 hours each at Extreme quality and 2560x1440 for an extra layer of verification, and it ran through with no issues. I'll try FireStrike Ultra later on and see what happens, because I'm like you: I don't think it's 100% stable, but it also hasn't crashed on anything I've thrown at it yet.

Oh, and lastly, the card did not hit a power limit or show a PerfCap reason.

Edit: So I ran FireStrike Ultra and it does hit a slight power limit, but it is indeed stable at 2.1GHz when it does clock that high. The extra load let me add a few more MHz down the curve to compensate for the drops, but like I said, it's stable in that it hasn't crashed yet.


----------



## rajkosto

You got pretty lucky then; I can only do 1999MHz at 1.092v and under 60°C. Or you got lucky and I got unlucky.


----------



## yoadknux

TwilightRavens said:


> 1.093v sustains it in RDR2 at 1440p; in the 4 hours I tested last night the temp never went over about 53-54°C. I also tried 2113MHz and that did cause a crash pretty quickly. Same goes for anything above +600 on the memory; even +601 would cause a crash. But I’ve thrown pretty much everything at 2.1GHz and so far it’s fine with no crashes or clock drops, though I also always keep my room relatively cool.
> 
> Also tested other games and it’s the same thing: it just stays at a full 2.1GHz with no drops or crashes. I also ran Heaven, Valley and Superposition for about 2 hours each at Extreme quality and 2560x1440 for an extra layer of verification, and it ran through all of that with no issues. I’ll try Fire Strike Ultra later on and see what happens, because I’m like you: I don’t think it’s 100% stable, but it also hasn’t crashed on anything I’ve thrown at it yet.
> 
> Oh, and lastly, the card did not hit a power limit or report any PerfCap reason.
> 
> Edit: So I ran Fire Strike Ultra and it does hit a slight power limit, but it is indeed stable at 2.1GHz when it does clock that high. The extra load let me add a few more MHz further down the curve to compensate for the drops, but like I said, it’s stable in the sense that it hasn’t crashed yet.


OK, nice.
There's no need to stress it all day. If your overclock fits your needs, then it's good.
I converted my Aorus into a hybrid card with an NZXT X41; temps rarely go above 45°C, but my typical "100% stable" overclocks are 2088/6000. 2100 was stable-ish: it could work fine for days and then randomly crash Overwatch or something. As for memory, 6100-6200 was stable but resulted in artifacts in some games.


----------



## TwilightRavens

yoadknux said:


> OK, nice.
> There's no need to stress it all day. If your overclock fits your needs, then it's good.
> I converted my Aorus into a hybrid card with an NZXT X41; temps rarely go above 45°C, but my typical "100% stable" overclocks are 2088/6000. 2100 was stable-ish: it could work fine for days and then randomly crash Overwatch or something. As for memory, 6100-6200 was stable but resulted in artifacts in some games.



Yeah, mine artifacts on the memory if I go anything over 6104, so it’s got somewhat dud memory lol. It’s weird though: when it had the stock Armor cooler on, 1.093v was only stable at 2075MHz and it would crash relatively quickly (within 5 minutes), so it’s kinda neat how much just cooler temps stabilize it.


----------



## 8051

TwilightRavens said:


> Yeah, mine artifacts on the memory if I go anything over 6104, so it’s got somewhat dud memory lol. It’s weird though: when it had the stock Armor cooler on, 1.093v was only stable at 2075MHz and it would crash relatively quickly (within 5 minutes), so it’s kinda neat how much just cooler temps stabilize it.


That is good. It would be interesting, though I guess dangerous, to see what the Asus XOC Strix VBIOS would do w/your card at higher voltages. 1.093V is the upper VBIOS-allowed voltage limit for almost all 1080Ti's -- except the Asus XOC Strix 1080Ti.

For the record, my MSI Duke 1080Ti w/the Asus XOC Strix VBIOS does throttle when GPU temps rise above 64°C or so, even when run without any overclock. I can get 2139MHz out of my 1080Ti, but it takes 1.181V to get there on my modded MSI Duke.

Do you have any pics of your 1080Ti w/the Accelero III installed? Did you use the stock Accelero III fan pack?


----------



## TwilightRavens

8051 said:


> That is good. It would be interesting, though I guess dangerous, to see what the Asus XOC Strix VBIOS would do w/your card at higher voltages. 1.093V is the upper VBIOS-allowed voltage limit for almost all 1080Ti's -- except the Asus XOC Strix 1080Ti.
> 
> For the record, my MSI Duke 1080Ti w/the Asus XOC Strix VBIOS does throttle when GPU temps rise above 64°C or so, even when run without any overclock. I can get 2139MHz out of my 1080Ti, but it takes 1.181V to get there on my modded MSI Duke.
> 
> Do you have any pics of your 1080Ti w/the Accelero III installed? Did you use the stock Accelero III fan pack?



Yeah, I’m not flashing the XOC bios because I’ve heard it degrades chips fairly rapidly.

Yes, look a few posts back and you can see the pics I’ve posted.


----------



## 8051

TwilightRavens said:


> Yeah I’m not flashing the XOC bios because I’ve heard that one degrades chips somewhat rapidly.
> 
> Yes, look a few posts back and you can see the pics i’ve posted.


I've had the Asus Strix XOC VBIOS on my MSI Duke 1080Ti for 18 months now, and so far no degradation of its overclocking ability.


----------



## nolive721

FTW3 Hybrid here. I can also sustain a 2088MHz core if the card stays under 45°C. I'm using the voltage curve in AB, so when I push the card in benchmarking or heavy 4K gaming it can go up to 55°C with the rad fan at 70%, but then the core drops to 2062MHz.
Memory OC is very good for me: +625MHz stable with no artifacts.


----------



## kithylin

8051 said:


> That is good. It would be interesting, though I guess dangerous, to see what the Asus XOC Strix VBIOS would do w/your card at higher voltages. 1.093V is the upper VBIOS-allowed voltage limit for almost all 1080Ti's -- except the Asus XOC Strix 1080Ti.
> 
> For the record, my MSI Duke 1080Ti w/the Asus XOC Strix VBIOS does throttle when GPU temps rise above 64°C or so, even when run without any overclock. I can get 2139MHz out of my 1080Ti, but it takes 1.181V to get there on my modded MSI Duke.
> 
> Do you have any pics of your 1080Ti w/the Accelero III installed? Did you use the stock Accelero III fan pack?


It was confirmed by a user in the bios flashing thread that using the XOC bios on an air-cooled GTX 1080 Ti can cause "sudden death" of the card without any warning and with no prior sign of degradation or any other indication that something is wrong. Just suddenly *POOF*, card dead. The XOC bios should never be used on any air-cooled GTX 1080 Ti for any reason; it is far too dangerous. Full-cover custom water block (not an AIO + fan either) or liquid nitrogen only.



8051 said:


> I've had the Asus Strix XOC VBIOS on my MSI Duke 1080Ti for 18 months now, so far no degradation of its overclocking ability.


You're lucky, and that is not a typical experience. Just because it worked well for you does -NOT- mean it's safe for everyone to use. Please do not mislead people by claiming the XOC bios is safe to use long-term on any GTX 1080 Ti, as that is false information. Here is the link where it definitely degraded my 1080 Ti, and bmgjet experienced it too: https://www.overclock.net/forum/69-...erent-bios-your-1080-ti-232.html#post28416840 The XOC bios on 1080 Ti's should only ever be used short-term, for benchmarks only. Also, further discussion of alternative bioses is pretty much off-topic for this thread, seeing as we already have a dedicated thread for that here on OCN. Here's a link if you'd like to discuss it over there: https://www.overclock.net/forum/69-nvidia/1627212-how-flash-different-bios-your-1080-ti.html

Also, since we're putting out disclaimers here: you should be aware that the XOC bios runs 3% to 5% slower clock-for-clock than any non-XOC bios. Your card's performance at 2139MHz under XOC is equal to a normal 1080 Ti running 2035-2050MHz on any normal locked bios. If you're only getting 2130-ish MHz out of it with XOC, you're running your card hotter under extra voltage for no reason at all. The XOC bios isn't beneficial unless you can maintain at least 2200MHz or higher, which would also require keeping the card at or below 20°C at all times.
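For a rough sense of that trade-off, the clock-for-clock penalty works out like this (a quick sketch; the 3-5% figure is the claim above, not an independently measured number):

```python
def xoc_equivalent_clock(xoc_clock_mhz: float, penalty_pct: float) -> float:
    """Core clock a normal (non-XOC) bios would need in order to match
    an XOC bios running at xoc_clock_mhz, given a per-clock penalty in %."""
    return xoc_clock_mhz * (1 - penalty_pct / 100)

# 2139 MHz on XOC lands at roughly 2032-2075 MHz of "normal bios" performance:
low = xoc_equivalent_clock(2139, 5)   # ~2032 MHz
high = xoc_equivalent_clock(2139, 3)  # ~2075 MHz
```

So unless the XOC clock is well past what the normal bios can reach, the penalty eats the gain.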


----------



## 8051

All I wanted, kithylin, was the ability to exceed 1.093V on my 1080Ti's vcore. I don't go up to 1.2V, but 1.181V is probably almost as bad. I only run my 1080Ti overclocked for maybe 8 hours a week (that's all the time I have for gaming).


----------



## kithylin

8051 said:


> All I wanted, kithylin, was the ability to exceed 1.093V on my 1080Ti's vcore. I don't go up to 1.2V, but 1.181V is probably almost as bad. I only run my 1080Ti overclocked for maybe 8 hours a week (that's all the time I have for gaming).


But it's completely unnecessary and gains you nothing. On XOC, at your overclock, you'd get exactly the same performance as running your card at 2050MHz on a normal bios @ 1.093v, since the XOC bios has a performance penalty per clock. You're just overheating and overvolting your card for no actual gains.


----------



## 8051

kithylin said:


> But it's completely unnecessary and gains you nothing. On XOC, at your overclock, you'd get exactly the same performance as running your card at 2050MHz on a normal bios @ 1.093v, since the XOC bios has a performance penalty per clock. You're just overheating and overvolting your card for no actual gains.


Is the stock 1080Ti Asus Strix Gaming OC VBIOS the problematic one, or is there a special VBIOS for the Strix Gaming OC card that's the problem? I don't think I have the LN Strix OC VBIOS loaded, because I don't even know where to get that VBIOS.


----------



## kithylin

8051 said:


> Is the stock 1080Ti Asus Strix Gaming OC VBIOS the one that is problematic? Or is there a special VBIOS for the Strix Gaming OC card that is the problematic one? I don't think I have the LN Strix OC VBIOS loaded because I don't know where to even get the VBIOS.


If you are able to go above 1.093v then you have the liquid nitrogen (XOC) bios loaded. That's the only one I'm aware of that allows it, at least among those anyone has discovered so far.


----------



## Grey Beard

Just bought a few of these, EVGA 1080Ti FTW3. Looking forward to doing a few new builds on water with a multi-GPU set-up. I've read that this is a great card. More to come, as the down time has allowed me to plan these builds.


Sent from my iPad using Tapatalk


----------



## alxns

I've got a non-working ASUS Strix 1080 ti and I'm trying to diagnose the problem. However, I don't have a working spare, so I was wondering if someone could do some resistance measurements for me on that card? Powered off, obviously. Mine is a rev 1.01 PCB, which all cards with serial numbers starting with H5YVC* probably are.

Here is where I took my measurements (high res imgur link)


Contrary to Buildzoid's PCB breakdown on that card I located the 5V VRM under the main GPU VRMs, and 1.8V and PEX VRMs on the left above the PCIe slot.


My problem is that I have low resistance on those last two: 2 ohms for 1.8V and 4 ohms for PEX. 1.8V should probably be a few hundred ohms and PEX around 60 ohms, but that's just guessing from information I gathered on other cards. I'm thinking that MAYBE the card has low resistance on one of those lines when powered off or in protection mode, but I don't even know if that's possible. From what I gathered, this card has a single 1.8V line for the GDDR5X and components.

Vmem resistance shows 70 ohms, 5V has 0.7 kiloohms, and 3.3V has 1.8 kiloohms, which looks fine to me. The GPU's nominal resistance is too low for accurate measurement with common equipment. All 12V and 3.3V rails are present.

The card showed signs of corrosion so I'm hoping it's "just" that causing a short. I did clean the card, albeit not enough perhaps. I would have to look closer. There is no burn or obvious damage.

If you're willing to help but don't really want to disassemble your card, ASUS conveniently placed probing points on the back of the card labelled XJ2 for Vcore, XJ4 for Vmem, and XJ6 for PEX.
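For anyone probing their own card along these lines, the comparison above can be sketched as a small table-driven check. Note that all the "expected" values here are the rough guesses from this post gathered from other cards, not official figures:

```python
# Quick sanity check of the rail resistances reported above. The
# expected-minimum values are guesses from other cards -- not ASUS specs.
EXPECTED_MIN_OHMS = {
    "1.8V": 200.0,   # "a few hundred ohms" (guess)
    "PEX":  60.0,    # guess from other cards
    "Vmem": 50.0,
    "5V":   500.0,
    "3.3V": 1000.0,
}

measured_ohms = {
    "1.8V": 2.0,     # far below expectation -> likely short
    "PEX":  4.0,     # far below expectation -> likely short
    "Vmem": 70.0,
    "5V":   700.0,
    "3.3V": 1800.0,
}

# Any rail reading well below its expected floor is a short candidate.
suspect_rails = [rail for rail, ohms in measured_ohms.items()
                 if ohms < EXPECTED_MIN_OHMS[rail]]
print(suspect_rails)  # ['1.8V', 'PEX']
```

That flags exactly the two rails described in the post, which is why the 1.8V and PEX circuits are the ones worth isolating first.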


----------



## TwilightRavens

alxns said:


> I've got a non-working ASUS Strix 1080 ti and I'm trying to diagnose the problem. However I don't have a working spare so I was wondering if someone could do some resistance measurements for me on that card? Powered off, obviously. Mine is a rev 1.01 PCB, which all serial numbers starting with H5YVC* probably are.
> 
> Here is where I took my measurements (high res imgur link)
> 
> 
> Contrary to Buildzoid's PCB breakdown on that card I located the 5V VRM under the main GPU VRMs, and 1.8V and PEX VRMs on the left above the PCIe slot.
> 
> My problem is that I have low resistance on those last two: 2 ohms for 1.8V and 4 ohms for PEX. 1.8V should probably be a few hundred ohms and PEX around 60 ohms, but that's just guessing from information I gathered on other cards. I'm thinking that MAYBE the card has low resistance on one of those lines when powered off or in protection mode, but I don't even know if that's possible. From what I gathered, this card has a single 1.8V line for the GDDR5X and components.
> 
> Vmem resistance shows 70 ohms, 5V has 0.7 kiloohms, and 3.3V has 1.8 kiloohms, which looks fine to me. The GPU's nominal resistance is too low for accurate measurement with common equipment. All 12V and 3.3V rails are present.
> 
> The card showed signs of corrosion so I'm hoping it's "just" that causing a short. I did clean the card, albeit not enough perhaps. I would have to look closer. There is no burn or obvious damage.
> 
> If you're willing to help but don't really want to disassemble your card, ASUS conveniently placed probing points on the back of the card labelled XJ2 for Vcore, XJ4 for Vmem, and XJ6 for PEX.



You might try the oven trick if you're feeling adventurous, or use a heat gun; some solder joints might just need to be reflowed if they're causing shorts in the VRMs or the resistors.


----------



## philhalo66

What's up with the massive price spike for 1080 Ti's? I've been looking for a second one for my SC and prices have gone up over 100 bucks in just a few months. In early March my specific card was going for around $400; now it's $500+ for the same card.


----------



## kithylin

philhalo66 said:


> What's up with the massive price spike for 1080 Ti's? I've been looking for a second one for my SC and prices have gone up over 100 bucks in just a few months. In early March my specific card was going for around $400; now it's $500+ for the same card.


It's because they're still very good cards at rasterization / basic game rendering (stuff that's not using the AI cores in the newer cards). There are only six video cards faster than a 1080 Ti that anyone can buy right now (Nvidia Titan X, Nvidia Titan Xp, Titan V, RTX 2080 Super, RTX 2080 Ti, Nvidia Titan RTX). The cheapest of those is the 2080 Super @ $700, and all the others are over $1000 right now. On the AMD side there is nothing faster than a GTX 1080 Ti that anyone can buy at the moment; even the Radeon VII only matches the 1080 Ti 1:1. The 1080 Ti is still one of the fastest cards under $700. When AMD releases its new "Nvidia Killer" (their words, the internal code name for their new card, not something I created) and the new RTX 3000 series comes out, they may have a card faster than the 1080 Ti for $500, possibly. Then we may see 1080 Ti prices dropping back down to cheap levels. But they may not release a faster card for $500 or less, and we may see 1080 Ti prices staying as-is or even increasing for quite a while yet.


----------



## philhalo66

kithylin said:


> It's because they're still very good cards at rasterization / basic game rendering (stuff that's not using the AI cores in the newer cards). There are only six video cards faster than a 1080 Ti that anyone can buy right now (Nvidia Titan X, Nvidia Titan Xp, Titan V, RTX 2080 Super, RTX 2080 Ti, Nvidia Titan RTX). The cheapest of those is the 2080 Super @ $700, and all the others are over $1000 right now. On the AMD side there is nothing faster than a GTX 1080 Ti that anyone can buy at the moment; even the Radeon VII only matches the 1080 Ti 1:1. The 1080 Ti is still one of the fastest cards under $700. When AMD releases its new "Nvidia Killer" (their words, the internal code name for their new card, not something I created) and the new RTX 3000 series comes out, they may have a card faster than the 1080 Ti for $500, possibly. Then we may see 1080 Ti prices dropping back down to cheap levels. But they may not release a faster card for $500 or less, and we may see 1080 Ti prices staying as-is or even increasing for quite a while yet.


Wow, maybe I'll just sell mine then, since I got a 2080 Ti anyway.


----------



## alxns

TwilightRavens said:


> You might try the oven trick if you're feeling adventurous, or use a heat gun; some solder joints might just need to be reflowed if they're causing shorts in the VRMs or the resistors.


Nah, that's a ghetto method, and I've got time. Gotta find the shorts the proper way, so I'll try isolating the VRM circuitry (VRM side and chip side) by removing inductors and work from there. Reference points would have helped, but I found a whole list on xdevs.com.


----------



## Grey Beard

philhalo66 said:


> What's up with the massive price spike for 1080 Ti's? I've been looking for a second one for my SC and prices have gone up over 100 bucks in just a few months. In early March my specific card was going for around $400; now it's $500+ for the same card.



Given the current market, even a used 1080Ti still gets a premium. Not only is the performance still very good, but as prices escalate for next-generation cards, they continue to be an attractive alternative. When the new 30XX series comes out in the next few months, its prices may impact demand, which could keep 1080Ti prices higher despite the age of the technology. Given the current economic state, this market will continue to be robust.


Sent from my iPad using Tapatalk


----------



## mouacyk

TwilightRavens said:


> You might try the oven trick if you are feeling adventurous or using a heatgun, solder points might just need to be reflowed and are causing shorts in the VRMs or the resistors.


All the oven reflow results I've seen were short-lived, lasting up to only a few hours, including on my GTX 570. While an oven reflow can temporarily fix broken joints, it is more likely to weaken others and cause wider damage.

On my sample, consecutive reflows progressively shortened the lifespan; the third reflow killed it entirely.


----------



## d3v0

Grey Beard said:


> Given the current market, even a used 1080Ti still gets a premium. Not only is the performance still very good, but as prices escalate for next-generation cards, they continue to be an attractive alternative. When the new 30XX series comes out in the next few months, its prices may impact demand, which could keep 1080Ti prices higher despite the age of the technology. Given the current economic state, this market will continue to be robust.
> 
> 
> Sent from my iPad using Tapatalk


Sold my GTX 1080Ti for $450 shipped this week, basically instantly, on Hardforum. The same card has been sitting at $475 shipped from another seller for a couple of weeks, so that's about where the market is right now.

I replaced it with an RTX 2070 XC Ultra for $380, which gets basically the same performance, but instead of using 340W it uses 220W, so my office isn't so sweaty.

The time to sell and replace 1080Ti's is now, while the market is high and before new hardware hits and crushes its price vs. performance vs. power consumption.


----------



## mouacyk

d3v0 said:


> Sold my GTX 1080Ti for $450 shipped this week, basically instantly, on Hardforum. The same card has been sitting at $475 shipped from another seller for a couple of weeks, so that's about where the market is right now.
> 
> I replaced it with an RTX 2070 XC Ultra for $380, which gets basically the same performance, but instead of using 340W it uses 220W, so my office isn't so sweaty.
> 
> The time to sell and replace 1080Ti's is now, while the market is high and before new hardware hits and crushes its price vs. performance vs. power consumption.


Problem is, the resale value of that 2070 is going to be worse. The returns get progressively worse.


----------



## dVeLoPe

I have SLI EVGA GTX 1080 Ti SC2 iCX cards (11G-P4-6593-KR) that I'm considering parting ways with. They still have another year or so of warranty, if anyone's interested.


----------



## philhalo66

dVeLoPe said:


> I have SLi EVGA GTX 1080 Ti SC2 iCX - 11G-P4-6593-KR that im considering parting ways with still has another year or so warranty on them if anyones interested


This isn't really the place to make a listing, my guy; you should head over to the OCN marketplace and make a listing there.


----------



## TwilightRavens

I guess now would be the ideal time to sell 1080 Ti's, before the 3000 series launches. I've a feeling they'll drop to what 980 Ti's are going for, since the market will be flooded with them.


----------



## kithylin

dVeLoPe said:


> I have SLi EVGA GTX 1080 Ti SC2 iCX - 11G-P4-6593-KR that im considering parting ways with still has another year or so warranty on them if anyones interested


This is the wrong place for that sort of thing. Might I suggest making a thread in the marketplace instead?


----------



## TwilightRavens

So I was artifacting in some games and crashing in a few others, and I thought the 2.1GHz was causing it, but nope: my VRAM is just a dud and will only do +500MHz instead of the +600MHz I initially thought. That's fine though; 2100MHz/12004MHz is more than fine with me in the end. Since adjusting that, it's been rock solid.


----------



## mxthunder

A little late to the game I suppose, but I bought a 1080Ti FE on eBay yesterday for $500. I immediately bought a Phanteks block for it and plan on replacing my heavily OC'd / custom-bios SLI 780Ti's.
Hope it wasn't a bad purchase.
Going to do a windfall upgrade of my kids' PCs with the 780Ti's, and pass my son's 980 down to my daughter.


----------



## TwilightRavens

mxthunder said:


> A little late to the game I suppose, but I bought a 1080Ti FE on eBay yesterday for $500. I immediately bought a Phanteks block for it and plan on replacing my heavily OC'd / custom-bios SLI 780Ti's.
> Hope it wasn't a bad purchase.
> Going to do a windfall upgrade of my kids' PCs with the 780Ti's, and pass my son's 980 down to my daughter.



That’ll be a hell of a jump, and you’re gonna notice it; you won’t regret it. Tbh the 2080 Ti is a waste of money at its price. Yeah, it’s got ray tracing, but to me Turing feels more like a beta-test product than an actual release, considering the performance penalty you get just for turning RTX on.


----------



## nolive721

mxthunder said:


> A little late to the game I suppose, but I bought a 1080Ti FE on eBay yesterday for $500. I immediately bought a Phanteks block for it and plan on replacing my heavily OC'd / custom-bios SLI 780Ti's.
> Hope it wasn't a bad purchase.
> Going to do a windfall upgrade of my kids' PCs with the 780Ti's, and pass my son's 980 down to my daughter.


Man, they're still selling for that sort of money??? Anyone want my FTW3 Hybrid for 650 bucks? lol

More seriously, enjoy this beast, you won't regret it.


----------



## kithylin

nolive721 said:


> Man, they're still selling for that sort of money??? Anyone want my FTW3 Hybrid for 650 bucks? lol
> 
> More seriously, enjoy this beast, you won't regret it.


That's pretty overpriced. I've seen some used on eBay in the $425-$450 range.


----------



## TwilightRavens

kithylin said:


> That's pretty overpriced. I've seen some used on eBay in the $425-$450 range.


Agreed


----------



## D-EJ915

The OEM ones go for a lot less than Founders Editions whenever I've looked.


----------



## nolive721

D-EJ915 said:


> The oem ones go for a lot less than founders editions whenever I've looked.


That's what I would have thought, indeed. OK, I'll keep my hybrid 1080Ti then lol; zero need to change, looking at how it still performs these days vs. the newer cards.


----------



## TwilightRavens

nolive721 said:


> what i would have thought indeed. Ok i keep my hybrid 1080ti then lol zero need to change looking at how it still performs these days vs the newer cards


Yeah, a 2080 Ti isn't a huge jump, just ~20%, or 15% if you have a heavily overclocked 1080 Ti, though in DX12/Vulkan it's a hair different. A 3080 Ti would probably be a nice jump in itself.


----------



## cstkl1

Does anybody have an MSI AB setup that shows CPU clocks, CPU package temp, etc.?

If yes, please share the two MSIAfterburner.cfg files with me: the one in the main folder and the one in the profiles folder.

tyvm


----------



## kithylin

cstkl1 said:


> Does anybody have an MSI AB setup that shows CPU clocks, CPU package temp, etc.?
> 
> If yes, please share the two MSIAfterburner.cfg files with me: the one in the main folder and the one in the profiles folder.
> 
> tyvm


You don't need that. The latest version of the program lets you load modules in the monitoring configuration that pull data from other apps like AIDA64 and HWiNFO.

This is an old screenshot of mine from some time in spring 2019, but you can see my MSI AB setup here:

CPU temperature, VRM temperature, CPU current, CPU watts, and CPU MHz are all pulled into MSI AB from AIDA64; MSI AB doesn't provide that data natively.


----------



## mxthunder

Thanks guys. Yeah, it kinda kills me to spend $500 plus another $100 for a water block on a 3-year-old card, but as you said, the 2000 series isn't really worth it, and I'm not going to play the waiting game for the 3000 series stuff either.
The 780Ti's honestly still have more than enough horsepower; it's more of a VRAM limitation that's really holding me back at this point.
That's another reason I went with the 1080Ti over the 2080: I don't want to get shortchanged on VRAM again. Nvidia has always been stingy about VRAM and it pisses me off.


----------



## TwilightRavens

mxthunder said:


> Nvidia has always been stingy about amount of VRAM and it pisses me off.


*Angry GTX 1060 3GB noises*


----------



## mxthunder

Anyone ever see anomalies with the temperature sensors on these cards? My 1080Ti is reporting temperatures lower than ambient after putting the water block on. Even under full load the card only reads 31-33°C, and it reports idle temps of 15-18°C, which is below the ambient temperature of the room.


----------



## mouacyk

Nope


----------



## mxthunder

Got everything tidied up and I'm learning to OC the new card. Is there a good BIOS to flash onto a watercooled FE card to override the 120% power limit? I'm getting conflicting info reading through the forums: some say there's no Pascal BIOS editor like there was for Kepler, Maxwell, etc., but I see people in this thread flashing BIOSes.


----------



## skupples

There are BIOSes available. I think what changed is that the newer GPUs don't use the classic hex-editable BIOS format any more (they're on UEFI now), so you can't just throw one into a BIOS editor and tweak it to your liking.


----------



## kithylin

mxthunder said:


> Got everything tidied up and I'm learning to OC the new card. Is there a good BIOS to flash onto a watercooled FE card to override the 120% power limit? I'm getting conflicting info reading through the forums: some say there's no Pascal BIOS editor like there was for Kepler, Maxwell, etc., but I see people in this thread flashing BIOSes.


Nvidia has encrypted the bios with a new type of encryption, and even though the 1080 Ti's have been out since 2017, no one has yet cracked it. There is an editor, and people can edit the bioses, but the problem is flashing them back to the card: the cards expect the bios to be digitally signed and certified or they won't accept the flash. Since no one has figured out how to sign a modified bios, we can't flash custom bioses back to the cards yet. Or at the very least, if anyone does have a way, it hasn't been made public.

What people have been doing here is taking an existing, already-signed bios from an aftermarket / AIB card from another vendor that has a higher power limit, and flashing that to their cards.

It's not guaranteed to work, and you may lose the use of some of the DisplayPort/HDMI/DVI ports, depending on the differences between the AIB card and the one you own. There is one special bios, from the Asus Strix model of the 1080 Ti, that has no power limit, but it's extremely dangerous and has been documented to kill cards when run at higher voltages for very long. It's only for temporary use during benchmarks and must then be removed and switched back to a normal bios.

Also, you need to understand that the "percentage" you see in MSI Afterburner can be confusing.

I'll try to explain: you could have a bios with a 285-watt base power limit that allows +25% in MSI Afterburner, for a total resulting power limit of 356 watts at the maximum slider setting (285 + 25% = 356.25). At the same time, another vendor's bios could start out with a 330-watt base limit but only allow +15%, and it would still deliver more power (380 watts max) than the first one, despite the lower percentage (330 + 15% = 379.50, i.e. 380 watts).
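That arithmetic can be sketched in a couple of lines. Note that the 285W/+25% and 330W/+15% figures are the hypothetical examples from above, not any particular card's bios:

```python
def max_board_power(base_watts: float, slider_headroom_pct: float) -> float:
    """Total power limit with the Afterburner slider maxed out.

    base_watts is the bios's 100% power limit; slider_headroom_pct is
    the extra percentage the bios exposes (e.g. 25 for a +25% slider).
    """
    return base_watts * (1 + slider_headroom_pct / 100)

# A smaller slider on a higher base limit can still mean more total power:
bios_a = max_board_power(285, 25)  # 356.25 W
bios_b = max_board_power(330, 15)  # ~379.5 W, despite the smaller slider
```

So when comparing bioses, compare the resulting wattage, not the slider percentage.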


----------



## mxthunder

Thanks for the summary. Very helpful!
I will play around and see how far I can push it with the stock bios before I attempt any flashing.


----------



## kithylin

mxthunder said:


> Thanks for the summary. Very helpful!
> I will play around and see how far I can push it with the stock bios before I attempt any flashing.


If you decide you do want to try bios flashing things then there is a separate thread dedicated to that subject (specifically for the 1080 Ti cards) with lots of information over here: https://www.overclock.net/forum/69-nvidia/1627212-how-flash-different-bios-your-1080-ti.html


----------



## Martin778

I have a 1080Ti FE that I'm struggling with... the card will sometimes shut the PC down (no crash, just a hard shutdown) when I run Fire Strike (normal) with the core OC'd; OC'ing the memory is not a problem. Not OC'd, it will pass any 3DMark stress test.

- Stock clocks, max PL: runs everything fine
- Heaven w. core and mem OC: runs fine
- OC'd Fire Strike: hard shutdown, but sometimes it would pass
- OC'd Time Spy Extreme stress test: passes fine with 98.4% (?!?)

When it shuts down, it's always in the 2nd GPU test of FS, but sometimes it passes with no issues... I really don't get it.

What I've already tried:
- Replaced motherboards (both almost brand new and pretty high-end boards)
- Stress tested the CPU and RAM (both 2 weeks old), they work fine
- Replaced the 550W Gold PSU with a 1200W Platinum unit, no difference
- Reinstalled Windows and the latest Nvidia drivers
- Card is repadded and repasted, nothing weird to see in GPU-Z
- Checked for artifacts, there are absolutely none
- XMP enabled / disabled

Can't put my finger on why this happens.


----------



## kithylin

Martin778 said:


> - Reinstalled Windows and the latest Nvidia drivers


That could be your problem. I keep trying newer drivers and I keep having "quite a lot" of instability and crashing in benchmarks and some games with my overclocked 1080 Ti. I usually end up reverting to the last driver that was 100% stable with my card. All they've added in the newer drivers is optimization for specific current AAA games, and since I have no interest in playing any of those anyway, it doesn't even affect me (I'm not missing anything).

Perhaps just try this once and see if this works for you: https://www.nvidia.com/Download/driverResults.aspx/155056/en-us


----------



## KedarWolf

Martin778 said:


> I have a 1080Ti FE that I'm struggling with... the card will sometimes shut the PC down (no crash, just a hard shutdown) when I run Fire Strike (normal) with the core OC'd; OC'ing the memory is not a problem. Not OC'd, it will pass any 3DMark stress test.
> 
> - Stock clocks, max PL: runs everything fine
> - Heaven w. core and mem OC: runs fine
> - OC'd Fire Strike: hard shutdown, but sometimes it would pass
> - OC'd Time Spy Extreme stress test: passes fine with 98.4% (?!?)
> 
> When it shuts down, it's always in the 2nd GPU test of FS, but sometimes it passes with no issues... I really don't get it.
> 
> What I've already tried:
> - Replaced motherboards (both almost brand new and pretty high-end boards)
> - Stress tested the CPU and RAM (both 2 weeks old), they work fine
> - Replaced the 550W Gold PSU with a 1200W Platinum unit, no difference
> - Reinstalled Windows and the latest Nvidia drivers
> - Card is repadded and repasted, nothing weird to see in GPU-Z
> - Checked for artifacts, there are absolutely none
> - XMP enabled / disabled
> 
> Can't put my finger on why this happens.


Also, uninstall your existing drivers with DDU. First, go into the settings when you start DDU and enable Safe Mode and "Do Not Install Drivers Automatically"; then restart DDU, boot into Safe Mode, uninstall the drivers, let it reboot, and install the new drivers.

https://www.guru3d.com/files-details/display-driver-uninstaller-download.html


----------



## TwilightRavens

Martin778 said:


> I have a 1080Ti FE that I struggle with...the card will sometimes shut down (no crash, just a hard shutdown) the PC when I run Fire Strike (normal) with the core OC'ed, OC'ing the memory is not a problem. Not-OC'ed it will pass any 3Dmark stress test.
> 
> - Stock clocks, max PL, runs everything fine
> - Heaven w. Core and MEM OC, runs fine
> - OC Fire Strike, hard shutdown but sometimes it would pass
> - OC Time Spy Extreme Stress Test, passes fine with 98.4% (?!?)
> 
> If it shuts down, its always at the 2nd GPU test of FS but sometimes it would pass with no issues...I really don't get it.
> 
> What I've already tried:
> - Replaced motherboards (both almost brand new though and pretty high end boards)
> - Stress tested the CPU and RAM (both 2 weeks old), work fine
> - Replaced the 550W Gold PSU with 1200W Platinum unit, no difference
> - Reinstalled windows and the latest Nvidia drivers
> - Card is repadded and repasted, nothing weird to see in GPU-Z
> - Checked for artifacts, there are absolutely none
> - XMP enabled / disabled
> 
> Can't put my hand on why this happens, I've had



What are the temps on the card? If it's running hot, 3DMark will sometimes crash out; if that's the issue, you might want to take the cooler off, clean as much dust out as you can, and repaste it.


----------



## Martin778

It's been repasted, and it will run Heaven/Valley/Furmark without issues; it can't be the temps if it always craps out around the same moment.
I have another rig I might test it in, so I'll have to give it a go. The others are too old to push the card to its maximum in the standard FS tests.


----------



## Streetdragon

try downclocking the mem and leave the core clock stock


----------



## Martin778

Still, that doesn't really tell me why this happens... I just swapped the card into my TRX40 rig and tried OC'ing it to the max at 100% fan speed = works PERFECTLY fine. 
What I did not do is reinstall the drivers after swapping the 2080 Ti for the 1080 Ti in that rig; Windows did it itself and came up with the same driver version as on the other setup. The TRX40 rig has a 1300W Seasonic PSU, all single-rail designs (I've had issues with OCP on multi-rail ones in the past).
Why it would shut down on Z490, no clue really; I don't remember it shutting down on X570 either.

Could it be that Z490 needs both 2x EPS 8-pin connected? I only have a single one in use atm.

Update: I reinstalled the same drivers on the TRX40 rig and OC'ed the card - no shutdowns whatsoever.


----------



## kithylin

Martin778 said:


> Could it be Z490 needs both 2x EPS 8-Pin connected? I only have a single one in use atm.


That would do it... that's kind of important. Other than that, maybe the power supply in the Z490 system is overheating and shutting off from heat. That's a possibility too.


----------



## Martin778

That could mean they're feeding the PCI-E slot power from the EPS... but then again it's so weird - it will run Furmark and Prime95, which are way more power hungry than FS.

I'm thinking it could be Afterburner itself doing something. I swapped in a 5700 XT, reinstalled the drivers, started Afterburner just to set a 100% fan curve, and in Heaven the screen went black after 2 minutes.


----------



## yoadknux

Martin778 said:


> That could mean they're feeding the PCI-E slot power from the EPS...but that again it's so weird - it will run furmark and Prime95, these are way more power hungry than FS.
> 
> I'm thinking it could be Afterburner itself doing something, I swapped in a 5700XT, reinstalled the drivers, started Afterburner just to set 100% fan curve and in Heaven the screen went black after 2 minutes.


Nah, it went black because that's a 5700xt, don't use it as a reference to anything


----------



## Martin778

Sold it yesterday, got fed up with the drivers indeed.


----------



## StAndrew

Need some advice. Gigabyte Aorus 1080 Ti with a full-cover block. Furmark sees max core temps at 35°C and my max stable core is 2088. 2106 is stable for gaming, and 2116+ has been stable for benching. However, for some reason, I can't get the voltage higher than 1.031v. The BIOS max wattage should be 375W. 

BIOS version is "F60" (86.02.39.00.DC in GPU-Z)

Would the XOC bios help me get more voltage? Is it worth it? Thanks.


----------



## mouacyk

StAndrew said:


> Need some advice. Gigabyte Aorus 1080 TI with full cover block. Furmark sees max core temps at 35* and max stable core is 2088. However 2106 is stable for gaming and 2116+ has been stable for benching. However, for some reason, I can't get max voltage greater than 1.031. Bios max wattage should be 375W.
> 
> Bios version is "F60" (86.02.39.00.DC on GPUz)
> 
> Would the XOC bios help me get more voltage? Is it worth it? Thanks.


Yes, up to 1.2v. Worth it for benching -- I can bench at 2177MHz and 2190MHz.


----------



## StAndrew

mouacyk said:


> Yes, up to 1.2v. Worth it for benching -- I can bench at 2177MHz and 2190MHz.


Awesome, thanks! I'm guessing no shunt mod? Do I need any specific overclocking software / version, or should it all work? I've been messing around with MSI Afterburner (my favorite so far), EVGA Precision, and Gigabyte's (barf).


----------



## kithylin

StAndrew said:


> Awesome, thanks! I guessing no shunt mod? Do I need any specific overclock software / version or should all work? I've been messing around with MSI Afterburner (my favorite so far), Evga precision, and Gigabye (Barf).


Do remember the disclaimer though: Using the XOC bios with your GPU @ 1.20v *WILL DEGRADE THE CARD*! You must remember to only use this temporarily for benchmarking for a short time and go back to a normal bios (NOT THE XOC BIOS) for normal gaming and day-to-day usage. Do not use the XOC bios on your card long-term for day-to-day usage.


----------



## StAndrew

kithylin said:


> Do remember the disclaimer though: Using the XOC bios with your GPU @ 1.20v *WILL DEGRADE THE CARD*! You must remember to only use this temporarily for benchmarking for a short time and go back to a normal bios (NOT THE XOC BIOS) for normal gaming and day-to-day usage. Do not use the XOC bios on your card long-term for day-to-day usage.


Is the voltage constant / adjustable?


----------



## kithylin

StAndrew said:


> Is the voltage constant / adjustable?


The voltage is adjustable with the XOC BIOS from stock volts up to 1.200v. But if you're not running it at 1.200v for benchmarks, there's no point in using the XOC BIOS. GPU performance with the XOC BIOS is about 5% slower at the same clock speeds vs. normal manufacturer BIOSes, so you'll have to overclock at least 5% higher than what you could get with the normal BIOS to see any gains at all, which pretty much requires 1.10v - 1.200v to make it useful. And at 1.200v you degrade your card, so... it's interesting for benchmarks but not that useful. I used it on my 1080 Ti originally. I bought the card in December 2017 and threw XOC on it right away (custom water-cooled card, originally on a big 3-radiator loop, now a 4-rad loop), and after about 11 months of daily usage with XOC @ 1.200v my card is degraded and won't run any game or benchmark at all with XOC on it (everything instantly crashes to desktop with the XOC BIOS). It still works normally, so far, with a "normal" BIOS on it though.
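For a feel of what that penalty means in clocks, here's a quick back-of-the-envelope sketch. The 5% figure is the poster's estimate, not a measured spec, and the 2088MHz input is just an example number from earlier in the thread:

```python
# Rough break-even estimate for the XOC BIOS, assuming it costs ~5%
# performance at equal clocks (the figure quoted above, not a verified fact).
def xoc_break_even_clock(normal_bios_clock_mhz: float, penalty: float = 0.05) -> float:
    """Clock the XOC BIOS must sustain to match the normal BIOS."""
    return normal_bios_clock_mhz / (1.0 - penalty)

# Example: a card that benches at 2088 MHz on its stock BIOS
print(round(xoc_break_even_clock(2088)))  # ~2198 MHz needed just to break even
```

So a card that tops out around 2088MHz on a stock BIOS would need roughly 2198MHz on XOC just to match it, which is why the extra voltage is basically mandatory.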


----------



## StAndrew

kithylin said:


> The voltage is adjustable with the XOC bios from stock volts up to 1.200v. But if you're not running it at 1.200v for benchmarks then there's no point in using the XOC bios. The GPU performance with the XOC bios is -5% slower at the same clock speeds vs normal manufacturer bios's. So you'll have to overclock it to at least +5% over what you could get with the normal bios to even see any gains with the XOC bios. Which pretty much requires at least 1.10v - 1.200v to make it useful. And at 1.200v you degrade your card so.. it's interesting for benchmarks but not that useful. I used it on my 1080 Ti originally. I bought the card in december 2017 and threw XOC on it right away (custom water looped card with big 3-radiator loop originally, now 4-rad loop) and after about 11 months of daily usage with XOC @ 1.200v now my card is degraded and won't run any game or benchmark at all with XOC on it (everything instant crashes to desktop with XOC bios). It still works normally, so far, with a "normal" bios on it though.


I don't necessarily want to run 1.2v, but I would like to run at least a bit higher. Nvidia states the max voltage is 1.093v, but I'm limited, oddly, to 1.031v.


----------



## kithylin

StAndrew said:


> I don't necessarily want to run 1.2 but I would like to at least run a bit higher. Nvidia states the max voltage is 1.093 but I'm limited, oddly, to 1.031.


That's because (supposedly) 1.093v is the highest these cards can go without degradation. Also, note that earlier you said you were trying Furmark. Modern Nvidia drivers detect Furmark as a "power virus" and won't let the card run at full power draw / max boost; they stopped allowing that years ago. Instead, try something from the 3DMark suite like Fire Strike or Time Spy to stress test the card.


----------



## mouacyk

If your GPU and VRM are water cooled, there's not much harm in playing around with the voltage up to 1.2v (which is the hard limit). I've run my card at 1.2v for many months at 2165MHz for gaming with no degradation. But because I've tired of dumping out 350W+ of power, I've backed off to a more moderate 1.1v for 2114MHz.

Re: Furmark, no they haven't. I can still pull 500W+ at 1.2v when I want to stress GPU temps on my water loop.
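The savings from that kind of voltage drop can be ballparked with the usual CMOS dynamic-power approximation (P roughly proportional to f·V²). The numbers below just plug in the clocks, voltage, and wattage mentioned above, and the approximation ignores leakage, so treat it as a rough sketch rather than a measurement:

```python
# Rough dynamic-power scaling: P is approximately proportional to f * V^2
# for CMOS logic. Ignores static/leakage power, so it's only a ballpark.
def scaled_power(p_watts: float, f0: float, v0: float, f1: float, v1: float) -> float:
    """Estimate power at (f1 MHz, v1 volts) given p_watts measured at (f0, v0)."""
    return p_watts * (f1 / f0) * (v1 / v0) ** 2

# Dropping from 2165 MHz @ 1.2v (~350 W) to 2114 MHz @ 1.1v:
est = scaled_power(350, 2165, 1.2, 2114, 1.1)
print(round(est))  # roughly 287 W by this approximation
```

By that estimate the 0.1v drop buys back around 60W for only ~50MHz of clock, which lines up with why the small voltage reduction is worth it for daily use.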


----------



## kithylin

mouacyk said:


> If your GPU and VRM is water cooled, there's not much harm in playing around with the voltage up to 1.2v (which is the hard limit.) I've run my card at 1.2v for many months at 2165MHz for gaming with no degradation. But because I've tired of dumping out 350W+ of power, I've backed off to a more moderate 1.1v for 2114MHz.
> 
> Re: furmark, no they haven't. I can still pull 500W+ at 1.2v when I want to stress GPU temps on my water loop.


When I last tried Furmark on my 1080 Ti last year it wouldn't do that. It'd just sit there at about 1500MHz and never boost up. I haven't seen Furmark let any card boost up to max in any system in about 5 years.

Additionally, it took about 11-13 months for my 1080 Ti to degrade. But trust me, it does and will happen over time. So I guess it depends on how long people want to keep their cards. If you're planning to replace the 1080 Ti in a year, then sure, go ahead and slam it at 1.200v at high clocks and run with it. If people buy a card used, though, there's no way to know how it was used before. It could have been run in coin mining for 1-2 years, and then we try the XOC BIOS today and it degrades in 3 months, who knows. My card was bought new, so I know its history.


----------



## StAndrew

kithylin said:


> That's because (supposedly) 1.093v is the highest these cards can go without degradation. Additionally: Do note earlier you said you were trying furmark. Modern Nvidia drivers will detect furmark as a "Power Virus" and it won't allow the cards to go at full power draw / max boost. They stopped allowing that years ago. Instead you should try some of the 3dmark suite like firestrike or Time Spy to stress test the card.


Well, technically, any amount of voltage will "degrade" the chip. I've heard the story before and ran 9800GTX+'s overvolted at 900+ core for years. I ran them to the max "until they fail" and they never degraded or failed. I eventually got 780 Ti's and ran those overvolted too for about 2.5 years. I don't think the degradation is quite as bad as people think; heat is generally the killer of silicon. Right now I don't think I've ever come close to 40°C, and I only get up to 35°C-ish when my room starts heating up. 

Anyways, great point on Furmark. I use the Heaven benchmark for core and MSI Kombustor (Furmark) for memory stability. I've never seen the voltage get over 1.031.


----------



## kithylin

StAndrew said:


> Well, technically, any amount of voltage will "degrade" the chip. I've hear the story before and ran 9800GTX+ overvolted and 900+ core for years. I ran them to the max "until they fail" and they never degraded or failed. I eventually got 780 Ti's and ran those also overvolted for about 2.5 years. I don't think the "degredation" is quite as bad as people think. Heat is generally the killer of silicon. Right now I don't think I've ever come even close to 40*C and only get up to 35*ish when my room starts heating up.
> 
> Anyways, great point on furmark. I use the Heaven benchmark for core and MSI Kombustor (furmark) for memory stability. Never seen voltage get over 1.031.


It's completely different on Pascal. The newer cards on smaller process nodes are far more sensitive to voltage than the older cards were. All I know is it happened to me, so I try to forewarn everyone else. I had to switch from XOC back to a normal BIOS to actually use the card. With XOC I can't do anything with my card now - no games, no benchmarks, nothing works with XOC anymore. Now I just run it with the EVGA 1080 Ti FTW3 BIOS and cope with 1.093v, and it's good enough for me. At least I can still use my card.


----------



## MarmotaOta

Hi everybody. I have used my card at stock since launch. I started playing with Afterburner and landed on a 120% power limit, +175MHz core offset, and +500MHz VRAM offset. The card is running pretty fast and I was worried about lifespan. By now we have some information on degradation - should I be worried about a 120% power limit while gaming? Some games have the GPU at 100% utilization, and the power limit seems to sit at 110-120%. My card is under an EVGA 120mm hybrid kit; temperatures max at 62-64°C, but are mostly 55-60°C.


----------



## Streetdragon

No problem, you are running at stock voltage, so you are safe.
Everything <= 1.093v is NVIDIA stock, so it's fine to use the power slider.
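For a feel of what the slider means in watts, here's a small sketch. The 250W base is an assumption (the reference/FE board power); partner cards ship with different base limits, as discussed later in the thread:

```python
# Convert a power-limit slider percentage to an absolute board-power cap.
# Assumes a 250 W reference/FE base TDP; partner cards use different bases.
def power_cap_watts(slider_percent: float, base_tdp_watts: float = 250.0) -> float:
    return base_tdp_watts * slider_percent / 100.0

print(power_cap_watts(120))  # 300.0 W with the assumed 250 W base
```

So a 120% slider on a reference-TDP card is a ~300W cap, which matches the sub-65°C temps reported above being well within what a hybrid cooler handles.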


----------



## MarmotaOta

Thank you, I'll just let it start up with the OC then, and enjoy the card at its full power


----------



## mouacyk

Yeah, crank this card or not and it's still very relevant for 1440p and some 4K. The lack of competition is so hilarious that Nvidia is trying to kill this card off with all sorts of tricks: RTX, DLSS, concurrent INT+FP32, and now concurrent INT/FP32+FP32. These were unneeded innovations, because AMD has nothing. And none of these innovations come free to developers.


----------



## TwilightRavens

Really, even 1.093v is perfectly safe, but all Pascal, Volta, and Turing (and probably Ampere too, if I had to guess) cards are temperature sensitive, so the cooler you can run it, the higher clocks you can get; undervolting to an extent (i.e. 1.05v or less) usually helps with that endeavor. Do not use the XOC BIOS unless you want your GPU to degrade rapidly.



mouacyk said:


> Yeah crank this card or not and it's still very relevant for 1440p and some 4K. The lack of competition is so hilarious, that NVidia is trying to kill this card off with all sorts of tricks: rtx, dlss, concurrent int+fp32, and now concurrent int/fp32+fp32. These were unneeded innovations, because AMD has nothing. And none of these innovations come free to developers.


Pretty much. The 1080 Ti was so good even Nvidia is still having a hard time killing it, unless they do what they did to the 780 Ti, but even then a 1080 Ti can still mostly "brute force" things if really needed.


----------



## nolive721

So much agreed. Nothing is putting me off driving my 4K TV with my 1080 Ti hybrid, with temps in the low 50s and decent core and memory clocks.
Not even all this ray tracing marketing would make me get rid of the beast.


----------



## tiefox

Does anyone know the specs (sizes) of the thermal pads for a 1080 Ti FE? I will be selling my SLI setup to upgrade to a 3090 and need to put the stock blower cooler back on, and I'd like to make sure the thermal pads are replaced with the correct ones.


----------



## acrvr

StAndrew said:


> I don't necessarily want to run 1.2 but I would like to at least run a bit higher. Nvidia states the max voltage is 1.093 but I'm limited, oddly, to 1.031.


Why would you want to run higher volts?

In my experience, as the temp on the core goes down, the card lowers the voltage automatically while maintaining the same clock speed. I'm using an AIO to cool my EVGA 1080 Ti, and at night when ambients are really cool, the voltage goes down to 1.05 at 2100MHz core.


----------



## kithylin

acrvr said:


> Why would you want to run higher volts?
> 
> In my experience as the temp on the core goes down, the card will lower the voltage automatically while maintaining the same clock speed. I'm using an AIO to cool my EVGA 1080ti and at night when ambient are really cool, voltage would go down to 1.05 at 2100Mhz core.


We can use MSI Afterburner to force a certain voltage at certain clocks instead of letting it fluctuate automatically like that.


----------



## TwilightRavens

acrvr said:


> Why would you want to run higher volts?
> 
> In my experience as the temp on the core goes down, the card will lower the voltage automatically while maintaining the same clock speed. I'm using an AIO to cool my EVGA 1080ti and at night when ambient are really cool, voltage would go down to 1.05 at 2100Mhz core.


There’s no temperature difference for me between 1.05v and 1.093v, but at 1.05v I can run 2012MHz stable, and at 1.093v I can run 2113MHz if I can keep the card below 40°C with the VRAM at stock. Since I’m on air, 40°C isn’t a realistic temp for me to hit, so I sit in the middle at 2075MHz / +620MHz VRAM at 51-57°C on an Arctic Accelero III. Meanwhile, like I said, at stock voltage I can only hit 2025MHz without becoming VRel (voltage) limited, and at 1.093v I become power limited because of the 335W limit my Armor OC card has.
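Those two stable points give a rough clock-per-voltage sensitivity for this particular card. A sketch only - the inputs are just this poster's numbers, and every chip bins differently:

```python
# Rough clock-vs-voltage sensitivity from two stable operating points.
# These are one card's numbers; every chip bins differently.
def mhz_per_volt(clk_lo: float, v_lo: float, clk_hi: float, v_hi: float) -> float:
    return (clk_hi - clk_lo) / (v_hi - v_lo)

sens = mhz_per_volt(2012, 1.050, 2113, 1.093)
print(round(sens))  # ~2349 MHz per extra volt, i.e. roughly 23 MHz per 10 mV
```

Put another way: each extra 10mV is only worth around two clock bins on this sample, which is why chasing voltage past 1.093v has such poor returns.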


----------



## acrvr

TwilightRavens said:


> There’s no temperature difference for me with 1.05v and 1.093v, but at 1.05v i can run 2012MHz stable, at 1.093v I can run 2113MHz if i can keep the card below 40C and the VRAM at stock. Since I’m on air 40C isn’t a realistic temp for me to hit so I sit in the middle with 2075MHz/+620MHz vram at 51-57C on an Arctic Accelero iii, meanwhile like I said at stock I can only hit 2025MHz without becoming vrel limited which is voltage, at 1.093v i become pwr limited because of the 335W limit my Armor OC card has.


I hope you can get the higher clock speed. With my card, higher voltage doesn't always guarantee higher clocks. I set a moderate manual OC at 2060, but when the temp is cool it boosts up to 2100 by itself, sometimes even while lowering the voltage. The key seems to be pushing for colder temps.



kithylin said:


> We can use MSI Afterburner and force it to a certain voltage at a certain clocks instead of letting it fluxuate automatically like that.


I know it's supposed to work like that, but in my experience the card would still break out of the voltage locked in MSI AB.


----------



## StAndrew

acrvr said:


> Why would you want to run higher volts?
> 
> In my experience as the temp on the core goes down, the card will lower the voltage automatically while maintaining the same clock speed. I'm using an AIO to cool my EVGA 1080ti and at night when ambient are really cool, voltage would go down to 1.05 at 2100Mhz core.


I want more wattage, tbh. I think I'm limited to 1.031v because of a watt limit, but I'm not sure.

TBH, I understand the gains will be small, but it's an old enough card that I'm contemplating an upgrade in the next year or two. Around this point I've historically broken out the soldering gun, started hard modding, and run the cards into the dirt. My 9800 GTX+'s ran for years at 900 core. My GTX 780 Ti's got 1200+ core, but running all three crushed my PSU, so I figured a new card was in order.


----------



## TwilightRavens

StAndrew said:


> I want more wattage tbh. I think that im limited to 1.031v because a watt limit but not sure.
> 
> TBH, I understand gains will be small but it's an old enough card that im contemplating an upgrade in the next year or two. About this time I've historically broken out the soldering gun and started hard modding and would run the cards into the dirt. My 9800 gtx+'s ran for years at 900 core. My GTX 780 TI's got 1200+ core but running all three crushed my PS so I figured a new card was in order.


Yeah, 1080 Ti's aren't voltage limited; it's usually the power limit that holds them back. Mine being capped at 330W is what keeps me from higher clocks, but keep in mind each card has a different power limit: my MSI Armor OC and the Gaming X model are both capped at 330W, while something like the Lightning Z is 350W, the EVGA FTW3 w/ iCX is 358W, and the Kingpin model is 390W. After that it comes down to silicon quality. For example, even at 1080p in some games my card will bounce off the power limit, forcing clocks to drop down to the next bin.
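Those per-model caps can be lined up against the 250W reference TDP to see how much headroom each vendor BIOS allows. The wattages are the figures quoted above, so treat them as this poster's numbers rather than verified specs:

```python
# Power-limit headroom per card model, relative to the 250 W reference TDP.
# Wattages are the figures quoted in the post above, not verified specs.
REFERENCE_TDP = 250

power_limits = {
    "MSI Armor OC / Gaming X": 330,
    "MSI Lightning Z": 350,
    "EVGA FTW3 iCX": 358,
    "EVGA Kingpin": 390,
}

for model, cap in power_limits.items():
    headroom = 100 * (cap - REFERENCE_TDP) / REFERENCE_TDP
    print(f"{model}: {cap} W cap, +{headroom:.0f}% over reference")
```

Even the lowest of those caps is ~32% over reference, so cross-flashing mostly matters when a card's stock BIOS sits near the 250-300W floor, like the SC2 discussed below.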


----------



## acrvr

TwilightRavens said:


> Yeah 1080 ti’s aren’t voltage limited, it usually power limits that hold them back, mine being capped at 330W is what keeps me from higher clocks, but keep in mind each card has a different power limit, my MSI Armor OC and the Gaming X model are both capped at 330W. However something like the Lightning Z is 350W, and the EVGA FTW3 w/ iCX is 358W, and the kingpin model is 390W, but after that it comes down to silicon quality. But for example even at 1080p in some games my card will bounce off the power limit, forcing clocks to drop down to the next bin.


Just curious, would it be worth trying to flash the EVGA FTW3 BIOS onto my EVGA SC2 card? I notice my power limit is maxed out at 300W flat.


----------



## KedarWolf

acrvr said:


> Just curious, would it be worth trying to flash EVGA FTW3 bios into my EVGA SC2 card? I notice my power limit is maxxed at 300W flat.


Yes, but use the FTW3KingpinFix BIOS found in BestBIOSCollection.zip in the OP. It's the FTW3 BIOS, but with a power-limit fix built in.


----------



## ThrashZone

KedarWolf said:


> Yes, but use the FTW3KingpinFix BIOS found in the BestBIOSCollection.zip in OP. It's the FTW3 BIOS, but with a power limit fix built-in.


Hi,
Some silly people still think xoc vbios is better lol 








See also: "Shunt/Bios modded 1080ti FTW 3 Hybrid" on www.overclock.net - a thread about pushing a 1080 Ti further now that the 3080s are out.


----------



## acrvr

KedarWolf said:


> Yes, but use the FTW3KingpinFix BIOS found in the BestBIOSCollection.zip in OP. It's the FTW3 BIOS, but with a power limit fix built-in.


Thanks. I flashed it to FTW3KingpinFix.rom.

Max power draw doesn't mind hanging around the 300W mark for longer periods now. However, it made me realize that the 6+8-pin connectors plus PCIe slot power are maxed out at 300W anyway. Guess that's why it's made for the FTW3.


----------



## ThrashZone

acrvr said:


> Thanks. I flashed it to this:
> FTW3KingpinFix.rom
> 
> Max power draw doesn't mind hanging around the 300W mark for longer periods of time now. However it made me realize that the 6+8pin + PCIe power lines are maxed out at 300W anyway. Guess that's why it's made for the FTW3.


Hi,
The 1080 Ti FTW3 is a dual 8-pin card.


----------



## acrvr

I have the SC2 version unfortunately


----------



## ThrashZone

acrvr said:


> I have the SC2 version unfortunately


Hi,
The FE has a different power delivery design.


----------



## StAndrew

TwilightRavens said:


> Yeah 1080 ti’s aren’t voltage limited, it usually power limits that hold them back, mine being capped at 330W is what keeps me from higher clocks, but keep in mind each card has a different power limit, my MSI Armor OC and the Gaming X model are both capped at 330W. However something like the Lightning Z is 350W, and the EVGA FTW3 w/ iCX is 358W, and the kingpin model is 390W, but after that it comes down to silicon quality. But for example even at 1080p in some games my card will bounce off the power limit, forcing clocks to drop down to the next bin.


I think my silicon is good. I get 2106 gaming and 2100+ benching at 1.031v. It's a Gigabyte Aorus Extreme with a full-cover block and liquid metal on the die, and temps are kept sub-40°C even on the worst day. I'll give the BIOS a shot and see what I gain.


----------



## V5-aps

I'm running my Aorus Extreme at the same speeds and voltage under water too 👍
What speed are you running the memory at?


----------



## acrvr

KedarWolf said:


> Yes, but use the FTW3KingpinFix BIOS found in the BestBIOSCollection.zip in OP. It's the FTW3 BIOS, but with a power limit fix built-in.


Hi Kedar,
I flashed the BIOS to FTW3KingpinFix, but I have reverted back to the stock EVGA 1080 Ti SC2 BIOS. I was having random lockups across different games with FTW3KingpinFix even though all settings were the same as on the stock BIOS.


----------



## kithylin

ThrashZone said:


> Hi,
> 1080ti ftw3 is a dual 8 pin.


That doesn't matter. I'm using the FTW3 BIOS on my EVGA 1080 Ti SC Black Edition, which uses a reference PCB with 6+8 power connectors. It works perfectly fine with zero issues, and I get the higher power limit and everything.


----------



## Pinnacle Fit

kithylin said:


> That doesn't matter. I'm using the FTW3 bios on my EVGA 1080 Ti SC Black Edition, which uses a reference PCB with 6+8 power connectors. It works perfectly fine 100% with zero issues and I get the higher power limit and everything.


Is the SC 1080 Ti the same PCB as reference? I think they take the reference waterblock at least? 

I ask because I have a reference 1080 Ti from MSI and I'm getting an FTW3 tomorrow, so I wanted to flash the MSI one to the FTW3 BIOS. 

Will this brick the BIOS? 


Sent from my iPhone using Tapatalk


----------



## KedarWolf

Pinnacle Fit said:


> Is the SC 1080ti the same PCB as reference? I think they take the reference waterblock at least?
> 
> I ask because I have a reference 1080ti from MSI and I’m getting a FTW3 tomorrow so I wanted to flash the MSI one to the FTW3.
> 
> Will this brick the bios?
> 
> 
> Sent from my iPhone using Tapatalk


If you buy from EK, you need a different block than the FE block. The SC has a DVI port and the block design is a bit different. Check the EK compatibility list.

Edit: You can flash the FTW3 BIOS on an MSI card.


----------



## Pinnacle Fit

KedarWolf said:


> If you buy from EK you need a different block than the FE block. SC has a DVI port and the block design is a bit different. Check the EK compatibility list.
> 
> Edit: You can flash the FTW3 BIOS on an MSI card.


Thanks... now I have to ask if I'm really going to be able to push the headroom on a reference card that's already been flashed with a Strix BIOS and is pushing 2025MHz at 1.125v... I'm not sure it's worth it. 

I also noticed you're running two 1080 Ti's in SLI. How has that been working out gaming-wise? Is there any recourse if games don't natively support SLI?


----------



## kithylin

Pinnacle Fit said:


> Is the SC 1080ti the same PCB as reference? I think they take the reference waterblock at least?
> 
> I ask because I have a reference 1080ti from MSI and I’m getting a FTW3 tomorrow so I wanted to flash the MSI one to the FTW3.
> 
> Will this brick the bios?
> 
> 
> Sent from my iPhone using Tapatalk


I refuse to comment on your question. I will not be held responsible if you brick your card or its BIOS by flashing anything other than its original BIOS onto it. Any flashing is at your own risk.

The EVGA GTX 1080 Ti SC Black Edition uses the Nvidia reference/FE PCB, and a reference FE block works with it according to EK's cooling compatibility matrix. Refer to it here: Liquid cooling compatibility list | EKWB. It's up to you to check compatibility yourself before you buy anything. No one but you will be responsible if you buy an incompatible block.


----------



## Pinnacle Fit

-


----------



## TwilightRavens

Pinnacle Fit said:


> and i also noticed you're running two 1080ti's in SLI...How has that been working out gaming wise. Is there a recourse if games dont natively support SLI?


I too would like to know, as I plan to buy a second one soonish.


----------



## KedarWolf

TwilightRavens said:


> I too would like to know as i plan to buy a second soon ish.


SLI helps quite a bit in games like Hellblade: Senua's Sacrifice! And if you Google it, there is a website with SLI profiles for games even if they don't officially support it. However, Nvidia is officially ending SLI driver support for new games in 2021. Still, you can mod drivers for SLI support, and DirectX 12 multi-GPU in games like Shadow of the Tomb Raider might still be a thing.


----------



## TwilightRavens

Just flashed my Armor OC 1080 Ti to the FTW3 BIOS and holy crap was I power limited. Before, on the stock vBIOS, I was only benchmark stable at 2100/+500; on this one I can do 2113/+615 and pass Fire Strike. I highly doubt it's everyday stable, but I'll find that out down the line.



http://www.3dmark.com/fs/23677799



I don't have a link to the old result, but I have pics of both.

Original vBIOS (MSI Armor OC 1080 Ti, 330W power limit)

New vBIOS (EVGA FTW3 1080 Ti, 358W power limit)



Also, I tried the Lightning Z BIOS, but with that one the driver would install and then I couldn't use any video outputs on the card without a black screen while the driver was enabled, so it was effectively useless.


----------



## MarmotaOta

So 1.093v is a perfectly safe voltage then? I started using the voltage slider and managed to get another few MHz out of my FE card.


----------



## kithylin

MarmotaOta said:


> 1.093 is perfectly safe voltage then? I started using the voltage slider and managed to get another few mhz out of my FE card.


As long as you are _NOT_ using the XOC BIOS (you're just using some normal BIOS, even a cross-flashed one), then those are the limits set by Nvidia. Any voltage allowed by those BIOSes is harmless; a normal BIOS will stop you from setting any voltage that is dangerous.


----------



## StAndrew

So I flashed the XOC BIOS and lost function of one of my DisplayPorts (I thought I bricked my card), clocks dropped to 130 core (very choppy desktop), I had no control over power in any software, and the voltage was still stuck at 1031mV. Flashed back to the original BIOS; I guess I'll stick with that.


----------



## AnneFlankinbot

I'm not sure if this is the place for this, but I need help. I was trying to use all 3 DP ports along with the HDMI port for the first time since owning the card (I've had it for 3 years now). Originally I was running one DP and HDMI for my second monitor. I went for a full gaming renovation and added a wall-mounted TV and a Valve Index headset.

Where the issue comes in is the 3rd DP port on the card itself (the DP port isolated by the HDMI port). I was hoping it was going to be as simple as plug and play, but this 3rd DP port does not want to register any DP cable. I purchased a second DP cable so I could move the HDMI over to the newly mounted TV. After countless failures to draw a signal from that isolated DP port, I'm out of ideas. I unplugged the tower from its power source and tried booting with my main monitor in the isolated DP port, with no other video cables present, to no effect. Moving from the unresponsive port to either of the two next to each other immediately pulls the signal through to the monitor.

The dream is the TV through HDMI, the monitors below running through DP cables, and the Valve Index on the third DP port. Sounds like a fun setup, right? Please tell me I've overlooked some small detail.


----------



## MrTOOSHORT

AnneFlankinbot said:


> I'm not sure if this is the place for this, but I need help. I was trying to utilize all 3 DP ports along with the HDMI port for the first time since owning the card. (I've had it for 3 years now). Originally I was running one DP and an HDMI for my second monitor. I went for a full gaming renovation and added a wall-mounted TV and a valve index headset. Where the issue comes in, is the 3rd DP port on the card itself (the DP port isolated by the HDMI port). I was hoping it was going to be as simple as plug and play, but it seems this 3rd DP port does not want to register any DP cable. I purchased a second DP cable to move the HDMI off into the newly mounted TV. After countless failures to draw a signal from that isolated DP port, I'm out of ideas. I unplugged the tower from its power source and tried booting with my main monitor in the Isolated DP port, no other video cables present, to no effect. Moving from the unresponsive port to either of the two next to each other immediately pulls the signal through to the monitor. The dream is the TV through HDMI with the monitors below running through DP cables and a valve index to the third DP port. Sounds like a fun setup right? Please tell me I've overlooked some small detail.


Did you flash another bios to the card before all this? Some bioses deactivate DPs on certain cards.


----------



## AnneFlankinbot

I did not. This is also the first time I'll be using the DP port since owning it so it may very well have been bust all along. I do overclock through MSI afterburner, but nothing so serious as to damage the card.


----------



## KedarWolf

AnneFlankinbot said:


> I'm not sure if this is the place for this, but I need help. I was trying to utilize all 3 DP ports along with the HDMI port for the first time since owning the card. (I've had it for 3 years now). Originally I was running one DP and an HDMI for my second monitor. I went for a full gaming renovation and added a wall-mounted TV and a valve index headset. Where the issue comes in, is the 3rd DP port on the card itself (the DP port isolated by the HDMI port). I was hoping it was going to be as simple as plug and play, but it seems this 3rd DP port does not want to register any DP cable. I purchased a second DP cable to move the HDMI off into the newly mounted TV. After countless failures to draw a signal from that isolated DP port, I'm out of ideas. I unplugged the tower from its power source and tried booting with my main monitor in the Isolated DP port, no other video cables present, to no effect. Moving from the unresponsive port to either of the two next to each other immediately pulls the signal through to the monitor. The dream is the TV through HDMI with the monitors below running through DP cables and a valve index to the third DP port. Sounds like a fun setup right? Please tell me I've overlooked some small detail.


If you flashed a BIOS like the FTW3 BIOS, you lose a DP port. With the Palit 350W power limit BIOS in the best BIOS collection ZIP, I think you can use all three DP ports, and it's nearly as good as the FTW3 BIOS. If not, go back to the original BIOS for the card.


----------



## TK421

KedarWolf said:


> SLI helps in games like Hellblade:Senua's Sacrifice quite a bit!! And if you Google it there is a website for SLI profiles for games even if they don't support it. However, Nvidia is officially ending SLI support for games in 2021. Still, you can mod drivers for SLI support and MultiGPU like Shadow Of The Tomb Raider might be a thing for DirectX12.



I have a weird issue where, between restarts, an odd VRAM offset gets better scores.

I.e.: +50 will have better scores than +100, and sometimes the other way around.

Any idea what's causing this bug?


1080Ti DUKE model with the Strix XOC bios; I have DDU'd the driver.


----------



## mouacyk

TK421 said:


> I have weird issue where between restarts, odd VRAM offset gets better scores
> 
> Ie: +50 will have better scores than +100, and sometimes the other way around
> 
> Any idea what's causing this bug?
> 
> 
> 1080Ti DUKE model with strix XOC bios, have DDU'd driver


You need to test performance with vRamBandWidthTest. Different mem offsets can force different straps.


----------



## KedarWolf

TK421 said:


> I have weird issue where between restarts, odd VRAM offset gets better scores
> 
> Ie: +50 will have better scores than +100, and sometimes the other way around
> 
> Any idea what's causing this bug?
> 
> 
> 1080Ti DUKE model with strix XOC bios, have DDU'd driver


Here, use DRAM-Bandwidth, not Cache-Bandwidth. Lower or raise your DRAM clocks by one or two points and test.

vRamBandWidthTest-guru3d.zip (drive.google.com)


----------



## TK421

mouacyk said:


> You need to test performance with vRamBandWidthTest. Different mem offsets can force different straps.





KedarWolf said:


> Here, use DRAM-Bandwidth, not Cache-Bandwidth. Lower or raise your DRAM clocks by one or two points and test.
> 
> vRamBandWidthTest-guru3d.zip (drive.google.com)


I restarted the PC and now +100 is doing better than +50 


I'll try that bandwidth test


----------



## TK421

@KedarWolf any idea if I should change any of the parameters of vrambandwidthtest?


----------



## TwilightRavens

mouacyk said:


> You need to test performance with vRamBandWidthTest. Different mem offsets can force different straps.


Where is the next memory strap after, like, +600MHz? I can do 615, but anything higher artifacts in Valley. I tried 700, 800 and 900, but all were pretty much the same thing. Out of curiosity, I figured it might have a chance of being stable right where the timings loosen on the next strap up, if I could find at what MHz that kicks in.


----------



## KedarWolf

TK421 said:


> @KedarWolf any idea if I should change any of the parameters of vrambandwidthtest?


No need to, no.


----------



## alv-OC

Hi guys!

I have a Gigabyte 1080Ti Extreme 11GB with an EK waterblock. I recently did the power shunt mod and flashed the BIOS; after bricking it 2 times I'm currently running the Strix OC BIOS and it's stable at 2100MHz on the core and 6132MHz on the mems. Voltage maxes at 1.094v and temps never go over 50ºC. I feel like I would be able to push it further if I could raise the voltage (just a bit, I don't intend to run it at 1.200v), but I don't really know which BIOS could work or how I can mod the BIOS... any help please?

btw, the XOC BIOS caused the first bricking, so it's not an option...

Thanks in advance


----------



## TK421

mouacyk said:


> You need to test performance with vRamBandWidthTest. Different mem offsets can force different straps.





KedarWolf said:


> Here, use DRAM-Bandwidth, not Cache-Bandwidth. Lower or raise your DRAM clocks by one or two points and test.
> 
> vRamBandWidthTest-guru3d.zip (drive.google.com)



I still have the issue where different offsets can result in different performance during benching


I can't really detect the issue with vrambandwidth test because the numbers are all over the place


Any thoughts on what's causing the issue? Can't wrap my head around this problem since it's really odd.





I'm using the strix 1080ti xoc bios on a msi gaming x pcb.


I also tried to use this tool from the hwbot website to increase the voltage for mem/gpu/PLL, but it says it can't read the card.






ULTIMATE ASUS 1080TI ROG STRIX AC/LN2 TIPS (community.hwbot.org): a guide for the Asus Strix Gaming 1080 Ti (also applies to the POSEIDON card, same PCB).


----------



## neoroy

Just bought a used card; 2050/1540 seems OK at stock volts. This card is still good in gaming performance, and no RT is okay for me.


----------



## kithylin

neoroy said:


> Just bought a used card, 2050/1540 seems OK, at stock volt. This card still good in gaming performance, no RT is okay for me
> View attachment 2463697


The 1080 Ti actually does support RTX Technology and ray tracing. It was added in an nvidia driver a long time ago. It's just extremely slow at it.


----------



## acrvr

neoroy said:


> Just bought a used card, 2050/1540 seems OK, at stock volt. This card still good in gaming performance, no RT is okay for me
> View attachment 2463697


The 1080 Ti is still very good. I play Warzone at 1440p and still get 140-150fps with it.

Are you from Indonesia?


----------



## StAndrew

So I couldn't get the XOC bios to work on my card for some reason. I was having a hard time understanding why it wouldn't run any higher than 1.031v and 2088 core, so I started playing around with the voltage/frequency curve.

If you are overclocking this card with the slider, YOU ARE DOING IT WRONG. Using the curve you can tweak core speed at each voltage level. It's time-consuming, but with the stock bios and no shunt mods I got my card running at 2139 @ 1.091v. The stock Gigabyte Aorus Extreme WB bios does have a 375W power limit, but I would love to see how much higher I could get this with a shunt mod.
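The curve trick described above can be sketched in a few lines. This is purely an illustration of the logic, not any real Afterburner API, and the stock curve points below are made-up numbers: you raise the point at your chosen voltage to the target clock and flatten everything above it, so the card never boosts past that voltage.

```python
# Illustrative sketch of voltage/frequency curve tuning (no real GPU API here):
# raise the chosen voltage point to the target clock and flatten every point
# above it, which is the usual Afterburner curve-editor technique.

def flatten_curve(curve, v_cap_mv, target_mhz):
    """curve: list of (voltage_mV, clock_MHz) points, ascending by voltage.

    Returns a new curve where every point at or above v_cap_mv runs at
    target_mhz, so the GPU never requests more than v_cap_mv.
    """
    return [(v, target_mhz if v >= v_cap_mv else min(f, target_mhz))
            for v, f in curve]

# Hypothetical stock points for a 1080 Ti (illustration only)
stock = [(1000, 1987), (1050, 2025), (1093, 2050), (1200, 2100)]
print(flatten_curve(stock, 1093, 2139))
# every point from 1093 mV up now targets 2139 MHz
```

In Afterburner itself this is roughly: Ctrl+F to open the curve editor, drag the point at your chosen voltage up to the target clock, apply, and the points to the right get flattened.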


----------



## kithylin

StAndrew said:


> So I couldn't get the XOC bios to work on my card for some reason. I was having a hard time understanding why it wouldn't run any higher than 1.031v and 2088 core, so I started playing around with the voltage/frequency curve.
> 
> If you are overclocking this card with the slider, YOU ARE DOING IT WRONG. Using the curve you can tweak core speed at each voltage level. It's time-consuming, but with the stock bios and no shunt mods I got my card running at 2139 @ 1.091v. The stock Gigabyte Aorus Extreme WB bios does have a 375W power limit, but I would love to see how much higher I could get this with a shunt mod.


We've been telling everyone about the curve in Afterburner for about the past 4 years, since the first few posts in this thread. I generally thought most folks knew about it by now. All nvidia cards since 2016 use the curve for overclocking in MSI-AB: 10-series, 20-series, and 30-series.


----------



## StAndrew

kithylin said:


> We've been telling everyone about the curve in Afterburner for about the past 4 years, since the first few posts in this thread. I generally thought most folks knew about it by now. All nvidia cards since 2016 use the curve for overclocking in MSI-AB: 10-series, 20-series, and 30-series.


I'm kind of out of the loop.


----------



## mouacyk

StAndrew said:


> I'm kind of out of the loop.


Welcome back. To where the future is horrible. Everything is power-locked and clocks fluctuate at will.


----------



## StAndrew

mouacyk said:


> Welcome back. To where the future is horrible. Everything is power-locked and clocks fluctuate at will.


I went from the amazing 9800 GTX+ to the OK GTX 285, to the awesome GTX 780 Ti, to the GTX 1080 Ti. Quite a few generational leaps, but TBH I'm starting to really like the voltage/frequency curve tuning.

I really want to try the shunt mod. I keep seeing people solder an extra resistor on top of the current shunt resistors. Anyone know what size resistors these are and where to find them?

Can I just jump it with a copper wire, or would that be too much?


----------



## TwilightRavens

StAndrew said:


> I went from the amazing 9800 GTX+ to the ok GTX 285, to the awesome GTX 780 TI to the GTX 1080 TI. Quite a few generation leaps but TBH I'm starting to really like the voltage frequency curve tuning.
> 
> I really want to try the shunt mod. I keep seeing people solder the extra resistor on the current shunt resistors. Anyone know what size resistors these are and where to find them?
> 
> Can I just jump it with a copper wire or would that be too much?


Well what you wanna do is lower the shunt's effective resistance (that's what adding a resistor in parallel does), but not by too much, as it'll lock the card to 2D clocks (like 300MHz max) if the sensed power drops too far. I'd start with a dab of liquid metal first to see roughly where you stand; that way, if you do overshoot, you can just clean it up and try again.
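For anyone weighing copper wire vs. a resistor vs. liquid metal: the card computes current from the voltage drop across the shunt, so the underlying math is just parallel resistance, and the card under-reads power by exactly the ratio of stock to effective resistance. A quick sketch (the 5 mOhm stock value is an assumption; check your own PCB):

```python
# Shunt mod arithmetic: the controller infers current from V_drop / R_shunt,
# so a parallel resistance makes it under-read power by R_stock / R_effective.

def parallel(r1: float, r2: float) -> float:
    """Effective resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

R_STOCK = 0.005   # 5 mOhm stock shunt (assumed; varies by board)
R_ADDED = 0.005   # soldering an equal resistor on top halves the resistance

r_eff = parallel(R_STOCK, R_ADDED)
scale = R_STOCK / r_eff   # factor by which the card under-reads power

print(f"effective shunt: {r_eff * 1000:.2f} mOhm")   # 2.50 mOhm
print(f"under-read factor: {scale:.1f}x")            # 2.0x: a 250 W cap now allows ~500 W
```

This is also why a plain copper jumper is "too much": it drives the effective resistance toward zero, the sensed power reads near zero, and the card can treat that as a sensor fault and drop to 2D clocks.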


----------



## mouacyk

TwilightRavens said:


> Well what you wanna do is lower the shunt's effective resistance (that's what adding a resistor in parallel does), but not by too much, as it'll lock the card to 2D clocks (like 300MHz max) if the sensed power drops too far. I'd start with a dab of liquid metal first to see roughly where you stand; that way, if you do overshoot, you can just clean it up and try again.


There are also stories of liquid metal eating solder joints. Not sure if this was happening with all liquid metal or just some.



StAndrew said:


> I went from the amazing 9800 GTX+ to the ok GTX 285, to the awesome GTX 780 TI to the GTX 1080 TI. Quite a few generation leaps but TBH I'm starting to really like the voltage frequency curve tuning.
> 
> I really want to try the shunt mod. I keep seeing people solder the extra resistor on the current shunt resistors. Anyone know what size resistors these are and where to find them?
> 
> Can I just jump it with a copper wire or would that be too much?


I would check first whether you can flash the XOC bios. That bios unlocks the power limit completely (no throttling) and allows the voltage slider to go up to 1.2v.


----------



## StAndrew

mouacyk said:


> There are also stories of liquid metal eating the solder joints. Not sure if this was happenining with all liquid metal or just some.
> 
> 
> I would check first if you can flash the XOC bios. This bios unlocks the power completely (no throttling) and allows voltage slider to go up to 1.2v.


For some reason the XOC bios didn't work for my card. It killed one of my DP ports, the clocks kept falling to 300, and Windows ran like a slide show.

I have a DVI port, which I'm guessing is rare, and that probably explains the DP not working, but I couldn't get the clocks working for whatever reason.


----------



## TwilightRavens

mouacyk said:


> There are also stories of liquid metal eating the solder joints. Not sure if this was happenining with all liquid metal or just some.
> 
> 
> I would check first if you can flash the XOC bios. This bios unlocks the power completely (no throttling) and allows voltage slider to go up to 1.2v.


I didn’t mean as a permanent thing, mainly as a see how much resistance you need


----------



## StAndrew

So I was looking more into the XOC bios, and it appears what I was experiencing was normal. I guess MSI Afterburner doesn't like the XOC bios, and this card does lose one DP (I only have one monitor, so that's fine with me).

I reflashed to the XOC bios and I'm now running 2214 core @ 1.2V, with a bit more fine tuning still needed. 2226 did crash, so not much more headroom left. But damn does it heat the room up.


----------



## Streetdragon

Losing one DP port is normal when you flash a different bios onto the FE.

Just be aware that 1.2V can and WILL degrade your GPU over time.


----------



## StAndrew

Streetdragon said:


> Losing one DP port is normal when you flash a different bios onto the FE.
> 
> Just be aware that 1.2V can and WILL degrade your GPU over time.


It's not an FE, but yeah, no big deal.

1.091V also can and will degrade your GPU.


----------



## mouacyk

StAndrew said:


> So I was looking more into the XOC bios and it appears what I was experiencing was normal. I guess MSI afterburner doesn't like the XOC bios and this card does lose one DP (I only have one monitor so that's fine with me).
> 
> I reflashed to the XOC bios and I'm now running 2214 core @ 1.2V with a bit more fine tuning still needed. 2226 did crash so not much more head room left. But damn does it heat the room up.


I'm glad that worked out, so you don't need physical modding to test out the GPU's full potential. The loss of an output port is expected, but how did you get around getting stuck with the 2D clocks? Was that maybe a bad flash?


----------



## kithylin

StAndrew said:


> So I was looking more into the XOC bios and it appears what I was experiencing was normal. I guess MSI afterburner doesn't like the XOC bios and this card does lose one DP (I only have one monitor so that's fine with me).
> 
> I reflashed to the XOC bios and I'm now running 2214 core @ 1.2V with a bit more fine tuning still needed. 2226 did crash so not much more head room left. But damn does it heat the room up.


You need to be warned: Running your card at 1.20v daily for gaming _WILL_ degrade your card in less than a year and you won't be able to run those clocks any more. 1.100v or anything higher is only supposed to be used temporarily for benchmarks and then the card needs to be reset to a normal limited bios again. The XOC bios was originally only released for Asus STRIX cards to be used at overclocking competitions. It's _NOT_ a daily-driver bios for daily gaming use.


----------



## StAndrew

mouacyk said:


> I'm glad that worked out, so you don't need physical modding to test out the GPU's full potential. The loss of an output port is expected, but how did you get around getting stuck with the 2D clocks? Was that may a bad flash?


May have been. MSI AB still shows the 300 2D clocks, but that is expected from what I read. The last flash did have some stutter issues, so probably a bad flash.


----------



## StAndrew

kithylin said:


> You need to be warned: Running your card at 1.20v daily for gaming _WILL_ degrade your card in less than a year and you won't be able to run those clocks any more. 1.100v or anything higher is only supposed to be used temporarily for benchmarks and then the card needs to be reset to a normal limited bios again. The XOC bios was originally only released for Asus STRIX cards to be used at overclocking competitions. It's _NOT_ a daily-driver bios for daily gaming use.


It's easy to set a freq/volt curve for a lower voltage. I have two profiles, one running at 1.093V & 2126 core (which is my highest overclock pre-XOC bios). But TBH, I've been hearing how voltage kills cards since 2007 and have yet to see it. I've overvolted the 9800GTX+, the GTX 285, and the GTX 780 Ti with no issues, especially the 780s; I ran those cards for years. The 780s went about 18 months at 1.2V+ core (IIRC) with no "degradation". I'm reasonably certain it's heat that kills chips, and my current loop is so overkill I didn't see more than a 2° jump in temps between 1.2 and 1.093; temps stay well below 40°.


----------



## kithylin

StAndrew said:


> Its easy to set a freq/volt curve for a lower voltage. I have two profiles with one running at 1.093 & 2126 core (which is my highest overclock pre XOC bios). But tbh, I've been hearing how voltage kills cards since 2007 but have yet to see it. I've overvolted the 9800GTX+, the GTX285, and the GTX 780 TI with no issues. Except for the 780's; I ran those cards for years. The 780's went about 18 months at 1.2V+ core (IIRC) with no "degradation". I'm reasonably certain its heat that kills chips and my current loop is so overkill I didnt see more than 2* jump in temps from 1.2 and 1.093; temps stay well below 40*


I'm warning you because it happened to me; this is from personal experience. I ran my 1080 Ti with the XOC bios for 14 months @ 1.200v daily for gaming, and now if I load the XOC bios back on the card I can't do anything with it at any clock speed or any voltage. No games, no benchmarks; everything instantly crashes to desktop. My card lost the silicon lottery and requires 1.200v just for 2126MHz. The only way to keep using the card was to flash it back to a normal limited bios; it works normally there, but I only get 2000MHz now.

So yes, it totally is a real thing, it happened to me, and it will happen to you if you use XOC with higher voltage long-term. Other people in this very thread have reported that the XOC bios completely fried their card: dead, wouldn't POST anymore. That's possible too. If you value your card at all, don't use the XOC bios other than _VERY_ short term for benchmarks.

EDIT: Don't forget also that the XOC bios runs your card ~7% slower clock-for-clock vs any other limited bios. You'll have to gain at least +7% on the clocks just to match lower clock speeds on a limited bios.
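Taking that 7% figure at face value (it is this poster's claim, not a measured constant), converting an XOC clock into its limited-bios equivalent is simple arithmetic:

```python
# Sketch of the claimed ~7% clock-for-clock XOC penalty (the 7% figure is
# this thread's claim, not a verified constant).

XOC_PENALTY = 0.07

def limited_equivalent(xoc_clock_mhz: float) -> float:
    """Clock a limited bios would need to match an XOC clock's throughput."""
    return xoc_clock_mhz * (1 - XOC_PENALTY)

print(round(limited_equivalent(2214)))  # 2059: 2214 MHz on XOC ~ 2059 MHz on a limited bios
```

So under this claim, 2214MHz on XOC only buys you what roughly 2059MHz buys you on a limited bios.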


----------



## ThrashZone

Hi,
The XOC vbios sux; use the FTW3 instead.


----------



## mouacyk

ThrashZone said:


> Hi,
> XOC vbios sux use ftw3 instead.


What's the power limit on FTW3 bios?


----------



## kithylin

mouacyk said:


> What's the power limit on FTW3 bios?


358W with +20% OC, so 429W max.


----------



## ThrashZone

mouacyk said:


> What's the power limit on FTW3 bios?


Hi,
Read the OP here.
Use KW's power limit bat file and see for yourself; the FTW3 vbios is in the OP too, along with the other collection.


----------



## mouacyk

ThrashZone said:


> Hi,
> Read the OP here.
> Use KW's power limit bat file and see for yourself; the FTW3 vbios is in the OP too, along with the other collection.


@kithylin Thanks both of you. I did skim and search the op post for "ftw3" but didn't find any direct results, then searched the thread and found KedarWolf's post claiming better performance but with no power limit numbers.

Is this the one? EVGA GTX 1080 Ti VBIOS


----------



## KedarWolf

mouacyk said:


> @kithylin Thanks both of you. I did skim and search the op post for "ftw3" but didn't find any direct results, then searched the thread and found KedarWolf's post claiming better performance but with no power limit numbers.
> 
> Is this the one? EVGA GTX 1080 Ti VBIOS


You want the FTW3KingpinFix bios found in the BestBiosCollection.zip download on the first page of the thread.

Here: https://www.overclock.net/attachments/bestbioscollectionandpowerlimitbatfiles-zip.250418/


----------



## StAndrew

kithylin said:


> I'm warning you because it happened to me and from my personal experience. I ran my 1080 Ti with the XOC bios for 14 months @ 1.200v daily every day for gaming and now if I load the XOC bios back on my card again I can't do anything with it at any clock speed at any voltage. No games. No benchmarks. Everything instantly crashes to desktop. My card sucked on the silicon lotto and requires 1.200v just for 2126 Mhz. Now the only way to continue using my card was to flash it back to a normal limited bios and it works normally. I only get 2000 Mhz now. But yes it totally is a real thing and it happened to me and it will happen to you if you use XOC with higher voltage long-term. Other people in the forums here have reported in the past in this very thread that using XOC bios completely fried their card. Dead. Didn't POST anymore. That's possible too. If you value your card at all, don't use the XOC bios other than _VERY_ short term for benchmarks.
> 
> EDIT: Don't forget also that the XOC bios runs your card -7% slower clock-for-clock vs any other limited bios. You'll have to gain at least +7% Mhz OC just to match even to lower clock speeds on a limited bios.


That doesn't sound like a silicon issue to me... I've found that with Windows, too many bios flashes can break the install. I had a ton of issues with my 780s and eventually had to reinstall Windows. I thought I had blown my cards, but with a fresh install everything worked great.


----------



## StAndrew

ThrashZone said:


> Hi,
> XOC vbios sux use ftw3 instead.


Why is it better? I don't care too much about an obscure benchmark power limit bug.


----------



## kithylin

StAndrew said:


> That doesn't sound like a silicon issue to me... Ive found that with Windows, too many bios flashes can break windows. I had a ton of issues with my 780's and eventually had to reinstall windows. I thought I had blown my cards but with a new fresh install, everything worked great.


If this is your first "newer video card" since the GTX 600, 700 or 900 series, then you have some learning to do. The Pascal cards are nothing like any of the previous cards from Nvidia; the entire architecture is completely different. Starting with the 10-series and Pascal, these are the first cards from Nvidia that are genuinely sensitive to voltage. They can and will take damage if you run them overvolted long-term. 1.093v is the maximum safe voltage for 1080 Tis that won't cause degradation or damage. You cannot use your experience overclocking previous Nvidia cards to predict Pascal, because Pascal doesn't behave like those cards. You do what you want at your own risk, of course, but I'm trying to warn you and help you avoid a damaged card.


StAndrew said:


> Why is it better? I dont care too much about an obscure benchmark power limit bug.


FTW3 is the highest power limit bios out of all of the "limited" bioses available. It doesn't run any risk of damaging your card like XOC does, and it runs the card at full performance for the clock speed you can hold. I'll explain in more detail what I tried to explain earlier: the XOC bios is ~7% slower clock-for-clock vs the limited bioses. That means (for example) that if you can get 2000MHz stable with the FTW3 bios (or any other limited bios) on your 1080 Ti, then running the same card at 2140MHz (2000 + 7%, just as an example) on the XOC bios gives you exactly the same performance in games and benchmarks as running the card on a limited bios at 2000MHz. In your case, 2214MHz on XOC @ 1.200v gives the same performance as maintaining 2059MHz on a limited bios @ 1.093v (which is where the FTW3 bios caps out). You said you can maintain 2126MHz at 1.093v; if that's true, then 2126MHz @ 1.093v on the FTW3 bios will be faster for you than 2214MHz on the XOC bios.

The XOC bios is slower, runs a risk of damaging your card, and in general is just terrible for daily-driver usage. It's only useful for people taking cards to liquid nitrogen for OC competitions trying to hit world records: situations where they can push the cards to 2300MHz and higher, benchmark for a few hours at a time, then power down and store the card. It was never intended to be used full time for long periods.


----------



## mouacyk

KedarWolf said:


> You want the FTW3KingpinFix bios found in the BestBiosCollection.zip download on the first page of the thread.
> 
> Here: https://www.overclock.net/attachments/bestbioscollectionandpowerlimitbatfiles-zip.250418/


Thanks I will try it out. Gotta do some baseline benchmarks with XOC first, since it's already on.

@kithylin That 7% has sounded like magic ever since I first read it claimed in this thread years ago. One bios cannot treat the same clock differently than another bios, unless the XOC is forcing some invisible padding.


----------



## mouacyk

StAndrew said:


> Why is it better? I dont care too much about an obscure benchmark power limit bug.


It's BS, as I suspected: a magical 7% no one can really explain. It did nothing for me, besides preventing me from surpassing 330W and 1.081v. See brand new [email protected] benches below for Shadow of the Tomb Raider, on the latest drivers, installed after DDU.

In my averaged SOTTR benchmarks, XOC is 99.97% of FTW3. That's not anywhere near sux'ing.

XOC BIOS:


Code:


+-----------------------------------------------------------------------------+
| NVIDIA-SMI 457.09       Driver Version: 457.09       CUDA Version: 11.1     |
|-------------------------------+----------------------+----------------------+
| GPU  Name            TCC/WDDM | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  GeForce GTX 108... WDDM  | 00000000:01:00.0  On |                  N/A |
|  0%   23C    P8    14W /  N/A |    721MiB / 11264MiB |      4%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
















FTW3 BIOS:



Code:


+-----------------------------------------------------------------------------+
| NVIDIA-SMI 457.09       Driver Version: 457.09       CUDA Version: 11.1     |
|-------------------------------+----------------------+----------------------+
| GPU  Name            TCC/WDDM | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  GeForce GTX 108... WDDM  | 00000000:01:00.0  On |                  N/A |
|  0%   25C    P8    15W / 358W |    693MiB / 11264MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+


----------



## kithylin

mouacyk said:


> In my averaged SOTTR benchmarks, XOC is 99.97% of FTW3. That's not anywhere near sux'ing.


Personally, in the games I commonly play I saw a much bigger discrepancy in performance between the bios versions. But I play games like ARK: Survival Evolved, Satisfactory, etc.; I don't play any of the mainstream AAA games, and I generally avoid them on purpose. Also, the 7% was documented in 3DMark in the past. Perhaps it was something to do with video drivers that has since been fixed; that original 7% claim was around 3 years ago now. Personally I saw exactly the same performance in my games at 2126MHz on XOC @ 1.200v vs 2000MHz @ 1.093v on FTW3.

7% or not, it still doesn't change the fact that the XOC bios is extremely dangerous to leave on cards. I wish someone would edit the post wherever the XOC bios is linked in this thread and put a BIG RED DISCLAIMER around the download link.

I'm still having occasional driver resets even on the FTW3 bios lately, and I'm going back to my original stock bios pretty soon. I think my poor card is dying; even with all overclocking disabled in MSI-AB it's causing driver resets now. I bought it brand new in October 2017, used XOC on it daily for the first 14 months, then switched back to FTW3 after it degraded. And now... I don't know anymore. I hope it doesn't die on me. I had hoped to use this card for a few more years, but I'm questioning whether it will survive until 2021 at this rate. The driver resets are getting more frequent. I reverted back to my stock bios from EVGA tonight and I hope the driver resets go away. I wish I had never used XOC on my card. It's barely made it 3 years and I think it's already dying.


----------



## StAndrew

kithylin said:


> Personally in all of the games I commonly played I saw a much bigger discrepancy in performance between the bios versions myself. But I play games like ARK: Evolved, Satisfactory, etc. I don't play any of the mainstream AAA games. I usually generally avoid them all on purpose. Also the 7% was documented in 3dmark in the past. Perhaps it was something to do with video drivers that has been fixed since then. That original 7% claim was around 3 years ago now. Personally I saw exactly the same performance in my games with 2126 Mhz on XOC @ 1.200v VS 2000 Mhz @ 1.093v in FTW3. The 7% performance or not it still doesn't change the fact that the XOC bios is extremely dangerous to have on cards. I wish someone would edit the post where ever the XOC bios is posted in this thread and put a BIG RED DISCLAIMER around the download link. I'm still having occasional driver resets sometimes even on the FTW3 bios from time to time lately and I'm going back to my original stock bios for my card pretty soon. I think my poor card is dying. Even with all overclocking disabled in MSI-AB it's causing driver resets now. I bought it brand new in october 2017. I used XOC on it daily for the first 14 months then switched back to FTW3 after it degraded. And now.. I don't know anymore. I hope it doesn't die on me. I had hoped to use this card for a few more years but I'm questioning if it will survive until 2021 at this rate. The driver resets are getting a bit more common and frequent. I reverted it back to my stock bios from EVGA tonight and I hope the driver resets go away.  I wish I had never used XOC on my card. It's barely even made it 3 years and it's already dying on me I think.


Have you checked your GPU thermal paste? I'm running liquid metal and it dried out after about 2 years. Die temps were reported between 38 and 40 °C (higher than typical) and I started having random crashes even at stock clocks. The GPU can overheat in limited areas, especially given the die size, even if it's reporting good temps.

I understand the issue with voltage, but I've found it's typically heat that kills silicon. If you can control the heat you should be OK.

I hope. 

We'll see


----------



## kithylin

StAndrew said:


> Have you checked your GPU thermal paste? I'm running liquid metal and it dried out after about 2 years. Die temps were reported between 38 and 40 °C (higher than typical) and I started having random crashes even at stock clocks. The GPU can overheat in limited areas, especially given the die size, even if it's reporting good temps.
> 
> I understand the issue with voltage, but I've found it's typically heat that kills silicon. If you can control the heat you should be OK.
> 
> I hope.
> 
> We'll see


I've been on a custom water loop with 4 radiators, 2 pumps and 15 fans since day 45. I bought the card brand new, and the hottest it has ever run was during the first 45 days, while I waited for the back-ordered water block I wanted to arrive; it ran around 80 °C on the stock air cooler out of the box. As soon as the water block arrived I installed it, put the card in the loop, and it has stayed below 40 °C for the rest of its life these past 3 years. Typically it only ever maxes out at 33-34 °C gaming now. Temperatures are not the issue for me; if it is degrading, it's something else. It's not the paste either: I've re-pasted it around New Year's every year for the past 3 years. And just for your information: liquid metal does not "dry out". It's absorbed by the heatsink and/or water block, permanently altering the metal composition of said block/heatsink.


----------



## StAndrew

kithylin said:


> I've been on a custom water loop with 4 radiators, 2 pumps and 15 fans since day 45. I bought the card brand new, and the hottest it has ever run was during the first 45 days, while I waited for the back-ordered water block I wanted to arrive; it ran around 80 °C on the stock air cooler out of the box. As soon as the water block arrived I installed it, put the card in the loop, and it has stayed below 40 °C for the rest of its life these past 3 years. Typically it only ever maxes out at 33-34 °C gaming now. Temperatures are not the issue for me; if it is degrading, it's something else. It's not the paste either: I've re-pasted it around New Year's every year for the past 3 years. And just for your information: liquid metal does not "dry out". It's absorbed by the heatsink and/or water block, permanently altering the metal composition of said block/heatsink.


Mine was crashing at 38 °C. Dried out, absorbed: potato, potahto. It's a rather large die, and if the thermal compound is compromised over just a small section of the chip, that section can overheat without the overall GPU temperature sensor picking it up; it may only report the average across the chip. I'm just offering ideas, but you seem to be a masochist bent on having a blown card.


----------



## kithylin

StAndrew said:


> Mine was crashing at 38 °C. Dried out, absorbed: potato, potahto. It's a rather large die, and if the thermal compound is compromised over just a small section of the chip, that section can overheat without the overall GPU temperature sensor picking it up; it may only report the average across the chip. I'm just offering ideas, but you seem to be a masochist bent on having a blown card.


I'm no masochist... that's kind of an insult, in general. I know how to apply thermal paste. I've been doing it in computers since the Pentium MMX days. I've put thermal paste on well over a thousand video cards by now: buying used ones and re-pasting them for friends to upgrade their computers, for my own other machines, etc. The paste is not compromised in any way. The issues are not paste and not temperatures. Thank you for your concern though. And I'm continuing to use my card as it is because I can't afford anything else at the moment. So I just put it back on the stock BIOS, disabled any overclocks in MSI Afterburner, and leave it alone to boost to whatever it'll boost to. Which seems to be 1936 MHz stock at under 35 °C so far. Which is fine. Crossing fingers it continues working.


----------



## V3teran

Hello everyone,

I've been out of the loop for a few years. Can somebody please let me know if I can flash the 1080 Ti MSI Sea Hawk (H2O) to a better custom BIOS?
What's best to use in this thread? I'm only stable in 3DMark Fire Strike at 2000 MHz on the core and no higher on the 1080 Ti.
People are getting much higher than this on RAM and vcore. Worth me flashing?

Any volt mods about, like there were with Kepler?

Is JPMboy still on these forums nowadays?

Any help much appreciated!


----------



## mouacyk

V3teran said:


> Hello everyone,
> 
> I've been out of the loop for a few years. Can somebody please let me know if I can flash the 1080 Ti MSI Sea Hawk (H2O) to a better custom BIOS?
> What's best to use in this thread? I'm only stable in 3DMark Fire Strike at 2000 MHz on the core and no higher on the 1080 Ti.
> People are getting much higher than this on RAM and vcore. Worth me flashing?
> 
> Any volt mods about, like there were with Kepler?
> 
> Is JPMboy still on these forums nowadays?
> 
> Any help much appreciated!


Use this database to compare to other models that have the same display outputs for the best compatibility:








TechPowerUp (www.techpowerup.com): an extensive repository of graphics card BIOS image files, with submissions categorized by GPU vendor, type, and board partner variant.





Of course following the general directions in post #1 should work for the main BIOSes people are testing.


----------



## kithylin

V3teran said:


> Hello everyone,
> 
> I've been out of the loop for a few years. Can somebody please let me know if I can flash the 1080 Ti MSI Sea Hawk (H2O) to a better custom BIOS?
> What's best to use in this thread? I'm only stable in 3DMark Fire Strike at 2000 MHz on the core and no higher on the 1080 Ti.
> People are getting much higher than this on RAM and vcore. Worth me flashing?
> 
> Any volt mods about, like there were with Kepler?
> 
> Is JPMboy still on these forums nowadays?
> 
> Any help much appreciated!


There are no custom BIOSes any more. We cannot modify the BIOS of any Nvidia 1000-series card and flash it back to the card, and the same goes for the 2000 and 3000 series. Those days are history and will never return. There are no volt mods either. There is a special BIOS that allows a higher voltage, but it's dangerous, can damage cards, and shouldn't be used unless you're running LN2 for world records. The only thing we can do (that is relatively safe, but still flash at your own risk) is cross-flash a BIOS from another brand's card that has a higher power limit.


----------



## kithylin

Over the past few days I've drained part of the loop, removed the video card, cleaned it, re-pasted it again, checked that the thermal pads are all good, re-assembled it, re-filled and bled the loop, and it's up and running again. And then tonight I got another driver reset. I have all overclocking disabled via MSI Afterburner and it's now running the original factory BIOS that came from EVGA. It's still running at 33-34 °C when gaming in this loop. It sucks, but the only thing it could possibly be is that using XOC for 1.5 years degraded it. I guess I'll start under-clocking it just to run without crashing and see about saving up for another video card.  I wish the RTX 3070s weren't unobtainium at the moment.


----------



## Streetdragon

Feels bad man. 

F


----------



## neoroy

kithylin said:


> The 1080 Ti actually does support RTX Technology and ray tracing. It was added in an nvidia driver a long time ago. It's just extremely slow at it.


Yup, I know it  I just don't like RTX for now because it makes the GPU load much heavier, like on all the RTX 2000 cards; it costs 30-40% FPS with RTX enabled, right? I hope the RTX 3000/4000 series will be much better.



acrvr said:


> 1080ti is still very good. I play warzone 1440p and still getting 140-150fps with it.
> 
> Are you from Indonesia?


Nice  hehe, yes bro ^_^ I'm from Indonesia, same as you I guess.


----------



## kithylin

neoroy said:


> Yup, I know it  I just don't like RTX for now because it makes the GPU load much heavier, like on all the RTX 2000 cards; it costs 30-40% FPS with RTX enabled, right? I hope the RTX 3000/4000 series will be much better.


That's why you enable DLSS at the same time: then the RTX option has little to no impact on FPS, and sometimes RTX + DLSS is even faster than running without RTX, depending on the game. Sadly the 1080 Ti does not support DLSS, it seems.


----------



## kithylin

Okay, I wanted to write a new comment here, and a public apology. First off: apparently I was an idiot and I shouldn't have been saying the things I've been saying. XOC doesn't degrade 1080 Ti's after all. It turns out the real problem is not degradation; it's that the XOC BIOS isn't compatible with every driver version from Nvidia. I only recently learned this after owning this card for so long and trying different BIOSes and Nvidia driver versions. So I'll put this in short-story form and tell everyone my experience and what happened.

Originally I bought my 1080 Ti in December 2017. I don't think most of you remember now, because 1080 Ti's have been plentiful and common for so long, but right around when the cards came out they were an incredibly powerful video card (at the time). Nothing from AMD even came close, and ASICs for coin mining hadn't taken off yet, so 1080 Ti's were the fastest option for both coin miners and gamers, and they were in super high demand for the first 10-12 months. I was waiting for a lull in the coin-miner demand to get myself one when they weren't super expensive. So I waited and bought mine in December 2017, almost a year after they came out. The first 2 months were air cooled, because water blocks were on backorder everywhere. I finally got a water block in month 3 and put the card in the loop. I was on the stock EVGA BIOS for, I don't remember, 2-3 months. Eventually I wanted more, found the XOC BIOS, and loaded it on the card. Back then, around summer 2018, I was using the XOC BIOS, everything was fine, and the highest clocks I could get out of my card were 2126 core / 6210 memory (MSI Afterburner values). At the time I was also naive about Nvidia drivers: I was loading every new driver as it came out like a blind sheeple, thinking the newest was the best. Everything was fine for a long time. I gamed every day for hours and everything was stable. About 15 months went by like this. Around October 2018 I woke up one day, turned on the computer like any other day, tried to load a game, and it instantly dumped me back to the desktop; the game crashed the second it tried to initialize the video display. Every game I could find did this. My first thought was "OH NO, I KILLED MY VIDEO CARD!", so the very first thing I did was revert to the stock EVGA BIOS.
After switching back to the EVGA BIOS, the card worked completely normally again. So in my mind, running too high a voltage or too high a clock had probably degraded the card, since I couldn't do anything with the XOC BIOS loaded. I honestly thought that was what had happened all this time. I was slowly expanding the custom water loop in my system, and each time I added more cooling I would try the XOC BIOS again, and it was always the same story: load any game, instant crash to desktop right away. Sad face; I thought the card was still degraded. Somewhere around Jan-Feb 2019, the latest Nvidia driver at the time was causing general instability in games even with the EVGA BIOS on my card. Games would crash. Driver resets. I tried everything: safe mode with DDU, clean Windows installs, resetting the system BIOS to defaults so there was no overclocking anywhere in the computer. Nothing solved it. Most of the time, on a clean Windows install, the Nvidia driver wouldn't even install the control panel at all. The only way I could keep using my computer and gaming on it was to go back to an old driver from December 2018, so I stuck with 441.41 forever.

Every time Nvidia released a new driver I would grab a spare SSD, put it in the computer, do a clean Windows 10 install and a clean driver install, test it, and see the instability and issues again. Pull the SSD, go back to 441.41 on my main NVMe SSD. It got to the point where I just ignored all Nvidia driver updates and didn't even bother. Then recently I started getting driver resets on 441.41 too, and I thought my card was seriously degrading even further; I was super depressed, thinking my card was dead and dying. I disabled overclocking on the card. I even tried under-clocking it by as much as -300 MHz and still got driver resets. Then, on a whim, I figured "what the heck, let's try it" and grabbed the most recent Nvidia driver on my main system, upgrading from 441.41 to 457.30 with the clean-install option during setup. Suddenly, for the first time since spring 2019, everything was stable again. No more driver resets for a week, and I can even overclock the card again. I decided "what the heck, let's just try it" once more and loaded the XOC BIOS, and suddenly, by some miracle, I can do everything with my card with XOC loaded like I could in 2018. In fact I can get a higher overclock on these new drivers than I could before. Previously my max (even with XOC and the custom water loop) was 2126 core / 6210 mem. Now, with driver 457.30, I can do 2152 core / 6300 memory and everything is (so far) perfectly stable.

So I guess in the end the XOC BIOS does work, but it comes with a catch: it's potentially not compatible with some Nvidia driver versions. I only just recently figured this out. Another note about my experience: I do not play many mainstream AAA games. Most of what I play are older versions of current games (for mod compatibility) with tons and tons of mods loaded, or small indie games that Nvidia doesn't even bother to create driver profiles for, or early-access games. And now that I see this working with this new high overclock on XOC with these new drivers, I'm seriously considering getting a second 1080 Ti. Two of them @ 2152/6300 would be fun, and all of the games I care to play have native SLI support. I just thought I would do a little write-up and share my findings in case someone has had similar experiences. Also, I wanted to apologize to everyone for claiming XOC degrades cards. Apparently it doesn't; the drivers just didn't work with it, which makes it look like it does. My card is still fine and I can still game on it with XOC after all these years; I just didn't know about the driver interaction with XOC. If I could go back and delete all of my comments about degradation with XOC, I would. But people have quoted me, so I'd just look like an idiot. I won't say it to anyone ever again.


----------



## 8051

Interesting observations, Kithylin.
I just changed out the stock thermal paste on my MSI Duke 1080 Ti for some Phanteks PH-NDC TIM. This, coupled with the XOC Strix VBIOS, the more powerful shrouded fans I've installed on the stock heatsink (127x38 mm: a Delta FFB1312EHE at 179.75-200 CFM and a Nidec B35502-35 at 150-160 CFM) and a backplate thermally coupled with the PCB over most of its surface, lets me squeeze 2164 MHz out of it at 1.2 V. So far this air-cooled setup has kept the temps below 54 °C at 70 °F ambient.
BTW, for me the Strix VBIOS (or maybe Nvidia's drivers?) throttles my 1080 Ti when it exceeds about 64 °C. It doesn't seem to reduce the voltage, just the GPU clock speed.
Overclocking both the memory and core on my 1080 Ti simultaneously seems impossible.


----------



## kithylin

8051 said:


> Interesting observations, Kithylin.
> I just changed out the stock thermal paste on my MSI Duke 1080 Ti for some Phanteks PH-NDC TIM. This, coupled with the XOC Strix VBIOS, the more powerful shrouded fans I've installed on the stock heatsink (127x38 mm: a Delta FFB1312EHE at 179.75-200 CFM and a Nidec B35502-35 at 150-160 CFM) and a backplate thermally coupled with the PCB over most of its surface, lets me squeeze 2164 MHz out of it at 1.2 V. So far this air-cooled setup has kept the temps below 54 °C at 70 °F ambient.
> BTW, for me the Strix VBIOS (or maybe Nvidia's drivers?) throttles my 1080 Ti when it exceeds about 64 °C. It doesn't seem to reduce the voltage, just the GPU clock speed.
> Overclocking both the memory and core on my 1080 Ti simultaneously seems impossible.


That's the XOC BIOS. It doesn't work correctly above 64 °C. That's why everyone here tries so hard to stress that no one should ever use it on air-cooled cards for any reason. It's meant for fully custom water-looped cards with full-cover blocks (not hybrid water coolers, not AIOs) in large loops where the cards (and VRMs) can be kept under 50 °C at all times, even when overclocked. The other reason is that the XOC BIOS completely removes the card's safety mechanisms. If you were able to push the core or the VRMs past 125 °C, the VRMs could melt, be damaged, or catch fire suddenly and without any warning. It's extremely unsafe to have that BIOS on an air-cooled card.


----------



## 8051

Kithylin, GamersNexus' Buildzoid says that VRM MOSFETs are good into the 125 °C range. Buildzoid did an in-depth tear-down of an MSI 1080 Ti Gaming X, which uses the same PCB as my MSI 1080 Ti Duke, and said it has a very robust VRM circuit as far as cooling goes.

I don't have any issues going above 64 °C; the card just throttles, and that never happens anymore. It even boosts back to the higher overclocked core clock if the GPU temperature comes back down.

How do you know you're keeping your VRMs below 50 °C?

At this point, my 1080 Ti only has to last at most another 6 months or so.


----------



## kithylin

8051 said:


> Kithylin, GamersNexus' Buildzoid says that VRM MOSFETs are good into the 125 °C range. Buildzoid did an in-depth tear-down of an MSI 1080 Ti Gaming X, which uses the same PCB as my MSI 1080 Ti Duke, and said it has a very robust VRM circuit as far as cooling goes.
> 
> I don't have any issues going above 64 °C; the card just throttles, and that never happens anymore. It even boosts back to the higher overclocked core clock if the GPU temperature comes back down.
> 
> How do you know you're keeping your VRMs below 50 °C?


It's generally assumed that the VRMs run 10-15 °C above core temps, and my 1080 Ti (even overclocked with XOC @ 1.200 V) maxes out at 31-33 °C when gaming, even under 100% load. Typical gaming temps are usually in the 28-30 °C range, so my VRM temps should be in the 40-45 °C range at most. I have a very large custom water loop with 4 radiators (480 + 360 + 240 + 240) all in one case, fitted with 15 fans in push-pull (one rad can't mount a fan in one of its fan spots, but that's fine), all running on a fan controller that adjusts fan speed automatically based on the water temperature in the tubes.
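
That rule of thumb is simple enough to jot down as a sketch (the +10 to +15 °C offset is just the forum estimate above, not a measured spec, and the function name is my own):

```python
def estimated_vrm_temp_c(core_temp_c):
    """Rule-of-thumb VRM temperature range: assumed 10-15 °C above core temp."""
    return (core_temp_c + 10, core_temp_c + 15)

# A 30 °C core under load suggests VRMs somewhere around 40-45 °C.
low, high = estimated_vrm_temp_c(30)
print(low, high)  # → 40 45
```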



8051 said:


> At this point, my 1080ti only has to last at the most another 6 months or so.


Good luck with that. If you're looking to buy an RTX 30-series card, expect to wait until at least summer 2021, if not fall 2021, for general availability at most computer outlets. Nvidia has abnormally low supply this time, and consumer demand for the new cards is higher than it has ever been for any previous GPU launch.


----------



## 8051

AMD's Big Navi is an option too, I'm hoping six months from now there will be more Big Navi cards available as well.


----------



## Hemin

Hello,

I recently purchased a second-hand MSI 1080 Ti Armor with an Arctic Accelero cooler. The stock boost clock for this card is 1645 MHz, but the strange thing is that in MSI Afterburner, with everything at defaults, the GPU boosts up to 1911 MHz. Is this normal?

Sorry if this is a repeated question, but I can't find any answer.

Thanks


----------



## Dude970

Yes, that is normal


----------



## Hemin

Dude970 said:


> Yes, that is normal


And what is the reason it boosts higher than the stock boost clock?


----------



## kithylin

Hemin said:


> And what is the reason it boosts higher than the stock boost clock?


Because that's how Nvidia GPU Boost works. If the card detects that it has thermal headroom (it compares where it currently is, temperature-wise, against the maximum it is allowed to reach) and also has power headroom (it compares how much power it is currently pulling against the maximum it's allowed to pull per its BIOS), it will self-overclock as far as it can until it hits some limit. If it hits a power limit, it reduces clocks; if it hits a thermal limit, it reduces clocks. All Nvidia cards from the 1000, 2000 and 3000 series behave like this. Additionally, it is probably winter for you right now (maybe?), and if the air in your room is really cold the card will self-overclock higher. The colder you can get the card, the higher it will boost. This is all completely normal; the card is operating as it is designed to operate.
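
To make the headroom logic above concrete, here is a minimal sketch in Python. This is my own illustration: the real GPU Boost algorithm and its voltage/frequency tables are not public, so the roughly 13 MHz step size and all limit values here are assumptions.

```python
# Simplified sketch of GPU Boost-style behavior: raise clocks while both
# thermal and power headroom remain, back off one step when a limit is hit.
# Step size and limits are illustrative, not Nvidia's actual values.

BOOST_STEP_MHZ = 13   # Pascal boost bins are roughly 13 MHz (approximate)

def next_clock(clock_mhz, temp_c, temp_limit_c, power_w, power_limit_w,
               base_mhz=1481, max_boost_mhz=2000):
    """Return the core clock for the next sampling interval."""
    if temp_c >= temp_limit_c or power_w >= power_limit_w:
        # A limit was hit: reduce clocks one step (never below base clock).
        return max(base_mhz, clock_mhz - BOOST_STEP_MHZ)
    # Headroom on both temperature and power: self-overclock one step.
    return min(max_boost_mhz, clock_mhz + BOOST_STEP_MHZ)

# A cool card with power headroom climbs...
print(next_clock(1911, 55, 84, 200, 250))   # → 1924
# ...while hitting the power limit pulls it back down.
print(next_clock(1924, 55, 84, 250, 250))   # → 1911
```

The same loop explains the winter effect the post describes: lower ambient temperature keeps `temp_c` further from `temp_limit_c`, so the card takes more upward steps before throttling.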


----------



## Hemin

kithylin said:


> Because that's how Nvidia GPU Boost works. If the card detects that it has thermal headroom (it compares where it currently is, temperature-wise, against the maximum it is allowed to reach) and also has power headroom (it compares how much power it is currently pulling against the maximum it's allowed to pull per its BIOS), it will self-overclock as far as it can until it hits some limit. If it hits a power limit, it reduces clocks; if it hits a thermal limit, it reduces clocks. All Nvidia cards from the 1000, 2000 and 3000 series behave like this. Additionally, it is probably winter for you right now (maybe?), and if the air in your room is really cold the card will self-overclock higher. The colder you can get the card, the higher it will boost. This is all completely normal; the card is operating as it is designed to operate.


Thank you very much for your answer! It's a very interesting thing that I didn't see in any review or article.
And yes, it is winter here, and the Arctic Accelero Xtreme IV helps reduce the temps.

Regards


----------



## kithylin

Hemin said:


> Thank you very much for your answer! It's a very interesting thing that I didn't see in any review or article.
> And yes, it is winter here, and the Arctic Accelero Xtreme IV helps reduce the temps.
> 
> Regards


They don't mention it in today's articles any more because it's simply how modern video cards work now, for both AMD and Nvidia. There were a lot of reviews that talked about this new boost system back in 2016, when the Nvidia 1000-series cards first launched, though. If you want to find articles that discussed it, you'll have to (somehow) dig up old pieces posted around when the 1080 Ti originally launched.


----------



## 8051

An article benchmarking the 1080 Ti against more modern GPUs in 2020:
How Does the GTX 1080 Ti Stack Up in 2020?


----------



## 8051

Since I swapped out the lousy MSI TIM on my MSI 1080 Ti Duke, I've seen some interesting results with the Asus Strix XOC VBIOS and cooler GPU temps. The card used to boost to 2000 MHz in 3D clocks (i.e. without any user intervention) but at voltages above 1.1 V; now that it stays cooler, it boosts to 2000 MHz at 1.093 V, the maximum voltage allowed by Nvidia. I don't buy into the idea that the Asus Strix XOC VBIOS has no limits either: it thermally throttles above 64 °C.


----------



## kithylin

8051 said:


> Since I swapped out the lousy MSI TIM on my MSI 1080 Ti Duke, I've seen some interesting results with the Asus Strix XOC VBIOS and cooler GPU temps. The card used to boost to 2000 MHz in 3D clocks (i.e. without any user intervention) but at voltages above 1.1 V; now that it stays cooler, it boosts to 2000 MHz at 1.093 V, the maximum voltage allowed by Nvidia. I don't buy into the idea that the Asus Strix XOC VBIOS has no limits either: it thermally throttles above 64 °C.


That's because it was originally meant to be used under liquid nitrogen. That's what the XOC BIOS is designed for.

You should not be using that BIOS on any air-cooled card for any length of time, no matter what temps it's running at. It's too dangerous.

If you're going to use the XOC BIOS on a 1080 Ti, you absolutely must have it in a custom water loop with a full-cover block, where you can keep it under 50 °C at all times if not colder (not on an AIO), for the safety of the card.

I have a 1080 Ti with XOC on it that I managed to get working again, and in my custom water loop the card stays at 34 °C max core temps even playing the new Cyberpunk 2077, where it's at 100% load constantly. In most of the other obscure games I play, the card usually sits in the 28-30 °C range.


----------



## 8051

Those are amazing temps, kithylin. I'm still going to squeeze as much performance as I can out of this 1080 Ti, because sometime in 2021 it'll be gone.


----------



## kithylin

8051 said:


> Those are amazing temps, kithylin. I'm still going to squeeze as much performance as I can out of this 1080 Ti, because sometime in 2021 it'll be gone.


It's just from a large water loop: 1320 mm of total radiator space (480 + 360 + 240 + 240), two big pumps running at 100%, and 15 fans on an automatic fan controller.


----------



## 8051

So that's how you get those low temps. How much water does it take to fill your entire loop?


----------



## kithylin

8051 said:


> So that's how you get those low temps. How much water does it take to fill your entire loop?


I haven't yet had a reason to do a full drain and refill since I built this loop in summer 2018. I've only partially drained it to change radiators, so honestly I have no idea.


----------



## MHorse10

Anybody here experienced with the ZOTAC AMP! GTX 1080 non Ti? Just bought one...


----------



## kithylin

MHorse10 said:


> Anybody here experienced with the ZOTAC AMP! GTX 1080 non Ti? Just bought one...


What issues are you having with it? All the aftermarket cards are pretty much the same; a few watts of difference in the BIOS power limit is all that sets them apart.


----------



## MHorse10

kithylin said:


> What issues are you having with it? All the aftermarket cards are pretty much the same; a few watts of difference in the BIOS power limit is all that sets them apart.


No issues; it hasn't even arrived yet (it's due Tuesday). However, I would like to know what a reasonable overclock is and whether there are any driver problems to look out for.


----------



## KedarWolf

I sold one of my 1080 Ti's, the FE, with an EKWB waterblock and backplate on it with Fujipoly 17.0 W/mK pads, for $500 Canadian.

I figured: sell it now, get decent cash for it, keep the other one for now, then get my FTW3 Ultra 3090 with my tax refund in March.

I probably could have gotten more for it, but I wanted a quick sale.


----------



## Jhawk

kithylin said:


> What issues are you having with it? All the aftermarket cards are pretty much the same; a few watts of difference in the BIOS power limit is all that sets them apart.


First, I'd like to thank you for all the very useful posts of yours I have recently read; you are very knowledgeable! Second, on this topic: I have two 1080 Ti's (EVGA), one SC and one FTW3. They are both Hydro Copper (HC) and I run them in my custom loop with plenty of headroom on everything. I flashed the XOC BIOS, as my super-heavy-OC friends suggested, because my temps are so low, and I have been very impressed. I accidentally flashed both cards with XOC, though; I only meant to do the slave FTW3. No biggie: both cards work fine on XOC, and GPU temps have never exceeded 44 °C on the FTW3 or 38 °C on the SC. I apologize if this has been answered, as I have looked, but is it not possible to add a power limit to the XOC BIOS, one higher than stock but lower than none, for safety purposes?

Also, to note, I flashed both cards back to stock with no problem on the first try, and have now reflashed back to XOC, as I'm going to be doing some experimental near-0 °C continuous open-loop cooling. Would you recommend I use another BIOS on these cards? The 6+8-pin and 8+8-pin connections seemed to cause no issues in VBIOS flashing or performance, as I have wrecked some serious scores, but the PCBs on the SC (same as the FE, I believe) and the FTW3 are clearly different sizes, and the FTW3, even on liquid, runs noticeably warmer at the chip (I have no clue what the other temps are on the SC, as it does not have ICX sensors). I've nailed 2172 MHz stable in Afterburner, showing 39 °C after an hour, with both cards totaling 1282 watts of power (~100 watts from the rest of the PC).

Oh yeah, and last but not least, third: this is how small a world it is... I'm almost positive I sold you a Classified GTX 780 that I refurbished a few years back on eBay. You gave me good feedback and even messaged me telling me the card was doing extremely well! I'm pretty sure that was you. Anyway, thanks in advance if you can help me, and again, sorry if this is a repeat; I couldn't find an answer I trusted. Thanks for everything!
-ready_to_go2k16 / JHawk


----------



## kithylin

Jhawk said:


> First, I'd like to thank you for all the very useful posts of yours I have recently read; you are very knowledgeable! Second, on this topic: I have two 1080 Ti's (EVGA), one SC and one FTW3. They are both Hydro Copper (HC) and I run them in my custom loop with plenty of headroom on everything. I flashed the XOC BIOS, as my super-heavy-OC friends suggested, because my temps are so low, and I have been very impressed. I accidentally flashed both cards with XOC, though; I only meant to do the slave FTW3. No biggie: both cards work fine on XOC, and GPU temps have never exceeded 44 °C on the FTW3 or 38 °C on the SC. I apologize if this has been answered, as I have looked, but is it not possible to add a power limit to the XOC BIOS, one higher than stock but lower than none, for safety purposes?


No, it is not possible to add a power limit to the XOC BIOS; it has none and one cannot be added. The XOC BIOS was originally intended for Asus Strix GTX 1080 Ti owners using liquid nitrogen in competitive overclocking competitions. It's technically not designed even for custom water cooling, but if we can keep it under 64 °C it will "work". If you are on a custom water loop with a full-cover block that covers the core + RAM + VRMs, it should probably be safe enough to use daily, as long as you remember to do things like use vsync or gsync to keep the frame rate under control and don't run games at 2000 FPS with vsync off or something silly.



Jhawk said:


> Also, to note, I flashed both cards back to stock with no problem on the first try, and have now reflashed back to XOC, as I'm going to be doing some experimental near-0 °C continuous open-loop cooling. Would you recommend I use another BIOS on these cards? The 6+8-pin and 8+8-pin connections seemed to cause no issues in VBIOS flashing or performance, as I have wrecked some serious scores, but the PCBs on the SC (same as the FE, I believe) and the FTW3 are clearly different sizes, and the FTW3, even on liquid, runs noticeably warmer at the chip (I have no clue what the other temps are on the SC, as it does not have ICX sensors). I've nailed 2172 MHz stable in Afterburner, showing 39 °C after an hour, with both cards totaling 1282 watts of power (~100 watts from the rest of the PC).


Personally I also own the "EVGA GTX 1080 Ti SC Black Edition" card, which uses the reference 8+6 power design, and as a stupid test I once went into Project CARS 1 (the original) with vsync off on max graphics, drove the car into a corner to look into the distance, and nvidia-smi reported my card pulling around 650 watts. I let it do this for about 5 minutes and then decided to quit for fear of melting the card. Granted, typically my card pulls around 350~380 watts with the XOC BIOS on it, at 2126 MHz core / 6210 MHz memory (12420 MHz effective memory speed), and it doesn't pull 650 watts all the time. But I did that once, I still use the card well over a year later, and it has no issues; it didn't damage it. It was under a full-cover water loop at the time, so that may be why it survived. But it didn't seem to hurt it (Disclaimer: do not hold me responsible if you pull 600W from your card and it melts on you, I may have just been lucky).
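Since "effective" memory speeds come up constantly in this thread, here's a quick sketch of the arithmetic. The 2x factor is the convention used in posts here (and by tools like Afterburner/GPU-Z) to relate the reported GDDR5X clock to the marketed data rate; treat it as that convention, not an official spec:

```python
# Convert the memory clock reported by monitoring tools into the
# "effective" (marketed) data rate. For the GDDR5X figures quoted in
# this thread, the effective rate is twice the reported clock.
def effective_rate_mhz(reported_mhz: float, factor: int = 2) -> float:
    return reported_mhz * factor

# Stock 1080 Ti: tools report 5505 MHz -> 11010 ("11 Gbps" memory)
print(effective_rate_mhz(5505))  # 11010
# The overclock quoted above: 6210 MHz -> 12420 effective
print(effective_rate_mhz(6210))  # 12420
```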



Jhawk said:


> Oh yeah, and last but not least, 3) This is how small of a world it is... I'm almost positive I sold you a Classified GTX 780 that I refurbished a few years back on eBay. You gave me good feedback and even messaged me telling me that the card was doing extremely well! If that was you... I'm pretty sure it was. Anyways, thanks in advance if you can help me; again, sorry if it's a repeat, I couldn't find the right answer that I trusted. Thanks for everything!
> 
> -ready_to_go2k16 / JHawk


HAHA! Small world indeed. That probably was you. I am the same Kithylin, yes. The 780 is still running fine. I found a water block for it after several years of hunting; it's stupidly overclocked now and serves as my XP retro rig.

Do you use Telegram? Discord? Steam? We could chat sometime if you would like. I could share some photos and stuff about the 780 and its current home if you want. I don't have time right now to reply to your other questions above; I'll try to answer those later today after sleep. You could try to send me a direct message here, but the new forums are all broken and stupid and I don't know if I'll be able to read or respond to it at all, or even figure out where the DMs / PMs even are. I also would be very interested to chat with someone about modern games on 1080 Ti SLI, as that's a configuration I'm thinking of building some day soon.

EDIT: I checked my eBay history and yep, that was you: ready_to_go2k16. The 780 was sold to me on March 23, 2018.
EDIT #2: Now, after some sleep and most of the day's insanity settling down, I have had time to write a reply to the other parts of your message and edited it in here for you.


----------



## Kokin

MHorse10 said:


> No issues, not even arrived yet, it is due Tuesday, however i would like to be aware of what a reasonable overclock is and if there are any driver problems to look out for.


Last few drivers seem to break 1080 Tis/10-series GPUs unless you turn off GSync, including the hotfix driver released last week (460.97). I get black/brown/white screens, and GSync brings my 144Hz monitor down to 24Hz, same with my wife's 120Hz monitor and 1070. Memory clocks also seem to be stuck at 100MHz (400 effective) unless rolling back to an older driver. Currently on 457.30 and don't experience the low memory clocks or black screens, but the GSync issue still remains. Older drivers from August and earlier are pretty stable; anything after that, YMMV.

I don't use VR, but there seem to be issues across the generations for it.

Nvidia has really dropped the ball ever since the 3000 series launched. Went from rock solid for years to absolutely unusable drivers several releases in a row now. It's weird people bag on AMD for bad drivers, but Nvidia gets a free pass. Lots of 10 series owners complaining here, but also some for 20/30 series owners: https://www.nvidia.com/en-us/geforc...cing-geforce-hotfix-driver-46097-released-12/

I have a 1080Ti FTW3 on an EK waterblock and XOC vBIOS (power limit up to 128% and higher 330W limit), it normally boosts to 1976-2012mhz core, but could reach 2100mhz when overclocked. On air, it's reasonable to hit 1800-1900mhz depending on temps, voltage and power limits.


----------



## kithylin

Kokin said:


> Last few drivers seem to break 1080Tis/10 series GPUs unless you turn off GSync, including the hotfix driver released last week (460.97). I get black/brown/white screens and GSync brings my 144hz monitor down to 24hz, same with my wife's 120hz monitor and 1070. Memory clocks also seem to be stuck at 100MHz (400 effective), unless rolling back to an older driver. Currently on 457.30 and don't experience the low memory clocks nor black screens, but Gsync issue still remains. Older drivers from August and earlier are pretty stable, anything after that YMMV.
> 
> I don't use VR, but there seems to be issues across the generations for it.
> 
> Nvidia has really dropped the ball ever since the 3000 series launched. Went from rock solid for years to absolutely unusable drivers several releases in a row now. It's weird people bag on AMD for bad drivers, but Nvidia gets a free pass. Lots of 10 series owners complaining here, but also some for 20/30 series owners: https://www.nvidia.com/en-us/geforc...cing-geforce-hotfix-driver-46097-released-12/
> 
> I have a 1080Ti FTW3 on an EK waterblock and XOC vBIOS (power limit up to 128% and higher 330W limit), it normally boosts to 1976-2012mhz core, but could reach 2100mhz when overclocked. On air, it's reasonable to hit 1800-1900mhz depending on temps, voltage and power limits.


I'm using driver version 460.79 here (the driver that was game-ready for Cyberpunk 2077) and my 1080 Ti is rock solid stable @ 2126 MHz core / +700 memory (6210 MHz / 12420 MHz effective) with zero issues on the XOC BIOS. But then again my card is in a large custom water loop and always runs < 40c when gaming, even maxed out at 99% for hours on end in CP2077. The temperatures might have something to do with it, I wouldn't know. What are your 1080 Ti's usual temps when gaming? Perhaps consider reverting back to this driver? It works perfectly for me and I'm not changing any time soon.


----------



## KedarWolf

I'm going to try these thermal pads with my SC Black 1080 Ti and EKWB waterblock and backplate. They are super cheap, like $13 Canadian dollars on Amazon.ca.

12.8 W/mK, over double the 6.0 W/mK of the pads that came with the block.

Money is kinda tight and can't afford the Fujipoly 17.0 W/mk pads. They are over $20 USD and with the currency exchange, a bit too expensive. 

Edit: AND two-day shipping.

EXTREME ODYSSEY 85x45x1.0mm – Thermalright (thermalright.com)

EXTREME ODYSSEY 85x45x0.5mm – Thermalright (thermalright.com)

And if you wait a month for shipping, around $20 USD here for the 120mm x 120mm pads:

Thermalright Odyssey Thermal Pad, non-conductive, 12.8 W/mK, 120x120mm, 0.5mm/1.0mm/1.5mm/2.0mm/2.5mm/3.0mm (www.aliexpress.com)


----------



## Kokin

kithylin said:


> I'm using driver version 460.79 here (The driver that was game-ready for CyberPunk 2077) and my 1080 Ti is rock solid stable @ 2126 Mhz core / +700 memory (6210 Mhz / 12420 Mhz effective) with zero issues with the XOC bios. But then again my card is in a large custom water loop and always runs < 40c when gaming even maxed out at 99% for hours on end in CP-2077. The temperatures might have something to do with it.. I wouldn't know. What is your 1080 Ti's usual temps when gaming? Perhaps consider reverting back to this driver? It works perfect for me and I'm not changing any time soon.


Drivers from Nov-Dec seem to have issues with my 1080Ti and my wife's 1070 unless we turn off GSync, no fixes for the black screens or super low memory clocks unless reverting back to 457.30 from early November. It runs at 45C with my waterblock, part of a 280+240 rad custom loop with fans just spinning at 1000rpm at load. Temps, power limit and voltage are no problem, and any drivers from September and earlier are rock solid and have been since 2017.

Even tried setting BIOS to run PCIE Gen3, but that didn't make a difference.


----------



## KedarWolf

I replaced my thermal pads on the RAM and VRMs on my 1080 Ti and reapplied the thermal paste, now I'm getting temps while gaming in the 45C range with XOC BIOS at 2138 core, 6220 memory at 1.2v. This with an EKWB Waterblock and backplate on a 280 rad with two quick disconnects on it, no CPU or anything on the loop. 

I haven't tried to go higher than 2138 core, but it is Time Spy and Cyberpunk stable.


----------



## KedarWolf

Highest Time Spy score for a 3950X and a 1080 Ti, with a far higher graphics score than anyone else in the top 100.



https://www.3dmark.com/search#advanced?test=spy%20P&cpuId=2546&gpuId=1127&gpuCount=1&deviceType=ALL&memoryChannels=0&country=&scoreType=overallScore&hofMode=false&showInvalidResults=false&freeParams=&minGpuCoreClock=&maxGpuCoreClock=&minGpuMemClock=&maxGpuMemClock=&minCpuClock=&maxCpuClock=


----------



## utparatrooper

I have a reference 1080 Ti and experienced the same blank (white, sometimes black) screens as well. I thought my card was starting to die, but as a test I changed from a DP connection to HDMI, just to see if there was an issue with the DP ports on my 1080 Ti. Switching to HDMI fixed it, but obviously I was only running at 60Hz on my PG348Q. Finally, I decided to switch back to my DP connection to see if it was just a problem with the overclock on my monitor (overclocked to 100Hz). Turns out the overclock on my monitor was unstable running Cyberpunk (haven't tested other games yet). Currently, I am locked at 60fps on medium-ish settings at 3440x1440 with G-Sync using my DP connection.

I know your issues may have a different cause, but this was how I resolved my blank screens. Only once did I notice, while playing Cyberpunk, that the OSD (MSI AB) for the memory clock on my 1080 Ti showed a zero value, but I haven't experienced it again since.


----------



## KedarWolf

I'm running Cyberpunk at 3840x1080 144Hz HDR FreeSync ultrawide at high settings with HDR on, except for the shadow settings on low, with a GitHub FPS tweaker (not the malware EZ Optimizer).

I get over 60 FPS all the time. I'm not really concerned about shadows in games and I find it's the best compromise for decent FPS with them low.

But 3440x1440 is 4953600 pixels and 3840x1080 is only 4147200 pixels. Still, at 144Hz with HDR and FreeSync on, I'm pretty happy.
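The pixel-count comparison above is easy to sanity-check with a throwaway snippet:

```python
# Compare total pixel counts of the two ultrawide resolutions mentioned above.
def pixels(width: int, height: int) -> int:
    return width * height

uwqhd = pixels(3440, 1440)     # 3440x1440
wide_fhd = pixels(3840, 1080)  # 3840x1080

print(uwqhd)                       # 4953600
print(wide_fhd)                    # 4147200
print(f"{uwqhd / wide_fhd:.2f}x")  # 1.19x more pixels at 3440x1440
```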


----------



## Kokin

I just stuck with the older 457.30 driver as that was stable with GSync turned off. My wife and I actually got lucky as a local store had stock of 3070s, an Asus dual fan for her and EVGA FTW3 for me. Hoping to do a step up to the 3080 Ti FTW3 if that comes out in the next few months.

The 3070 is a great pairing for her 3440x1440 120Hz monitor, but is lacking a bit of oomph for my 3840x1600 144hz monitor (6,144,000 pixels!). Both run without issues on the latest hotfix driver 460.97.



KedarWolf said:


> I replaced my thermal pads on the RAM and VRMs on my 1080 Ti and reapplied the thermal paste, now I'm getting temps while gaming in the 45C range with XOC BIOS at 2138 core, 6220 memory at 1.2v. This with an EKWB Waterblock and backplate on a 280 rad with two quick disconnects on it, no CPU or anything on the loop.
> 
> I haven't tried to go higher than 2138 core but is Time Spy and Cyberpunk stable.


That's very nice! My 1080Ti maxed out at 2100 core and 6000 memory at 1.093V using Precision X1 software. It jumped from 45-50C depending on the game or benchmark.


----------



## Zfast4y0u

Kokin said:


> Last few drivers seem to break 1080Tis/10 series GPUs unless you turn off GSync, including the hotfix driver released last week (460.97). I get black/brown/white screens and GSync brings my 144hz monitor down to 24hz, same with my wife's 120hz monitor and 1070. Memory clocks also seem to be stuck at 100MHz (400 effective), unless rolling back to an older driver. Currently on 457.30 and don't experience the low memory clocks nor black screens, but Gsync issue still remains. Older drivers from August and earlier are pretty stable, anything after that YMMV.
> 
> I don't use VR, but there seems to be issues across the generations for it.
> 
> Nvidia has really dropped the ball ever since the 3000 series launched. Went from rock solid for years to absolutely unusable drivers several releases in a row now. It's weird people bag on AMD for bad drivers, but Nvidia gets a free pass. Lots of 10 series owners complaining here, but also some for 20/30 series owners: https://www.nvidia.com/en-us/geforc...cing-geforce-hotfix-driver-46097-released-12/
> 
> I have a 1080Ti FTW3 on an EK waterblock and XOC vBIOS (power limit up to 128% and higher 330W limit), it normally boosts to 1976-2012mhz core, but could reach 2100mhz when overclocked. On air, it's reasonable to hit 1800-1900mhz depending on temps, voltage and power limits.


I had mega issues with the latest driver I downloaded, the hotfix one. My screen was acting weird all the time (Asus 278 QR, 27'' 1440p 144Hz): some areas of it would flash another color for a split second while doing nothing on the desktop. Older driver versions had this issue too, but not to this extent. I'm now on 457.30 and it's perfect, zero issues other than YouTube video freezing when I move to another tab; when I come back to it, the picture is frozen for about half a second, then it unfreezes. Can't be bothered contacting support on that one.


----------



## KedarWolf

Nvidia's GeForce 461.09 driver has a lot of fixes, especially for GTX 1080 Ti owners: "Bug fixes, bug fixes and more bug fixes" (www.overclock3d.net)

Latest driver fixes 1080 Ti problems.


----------



## utparatrooper

KedarWolf said:


> Nvidia's GeForce 461.09 driver has a lot of fixes, especially for GTX 1080 Ti owners: "Bug fixes, bug fixes and more bug fixes" (www.overclock3d.net)
> 
> Latest driver fixes 1080 Ti problems.


Awesome. Thanks for letting us know.


----------



## 8051

Gamersnexus thinks that overclocking margins are all down to the silicon lottery. I've managed to get my 1080 Ti, on the stock heatsink using the ASUS Strix XOC vBIOS but with better TIM and much more powerful fans fitted, to 2152/5670 @ 1.2V. I can get 2164 stable, but only if I lower the memory clock by 500 MHz below stock. For some reason I have to keep the core temps below 64°C or I get throttling, even though no perfcaps are thrown in either Afterburner or GPU-Z.
According to HWBot, the highest frequency anyone has squeezed out of a 1080 Ti under H2O is 2190 MHz.


----------



## kithylin

8051 said:


> Gamersnexus thinks that overclocking margins are all down to the silicon lottery. I've managed to get my 1080ti, on the stock heatsink using the ASUS Strix XOC VBIOS but with better TIM and much more powerful fans fitted, to 2152/5670 @ 1.2V. I can get 2164 stable but only if I lower the memory clock by 500Mhz. below stock. For some reason I have to keep the core temps below 64°C or I get throttling -- even though no perfcaps are thrown in either afterburner or GPUz.
> According to HWbot the highest frequency anyone has squeezed out of a 1080ti under H2O is 2190Mhz.


Just so you are aware: all 1080 Tis start throttling at 24c, and they reduce clocks every 10c as you go up. This is hard-coded behaviour by Nvidia and it cannot be bypassed.
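For anyone curious what that bin behaviour looks like numerically, here is a toy model. The ~13 MHz bin size is the commonly cited Pascal clock-step granularity, and the 24c starting point / 10c steps are taken from the claim above; real cards vary per BIOS and sample, so treat every number here as illustrative only:

```python
# Toy model of GPU Boost temperature bins (assumption: ~13 MHz per bin,
# one bin dropped per temperature step; real breakpoints and bin counts
# vary per card and BIOS, this only illustrates the idea).
BIN_MHZ = 13

def boost_clock(max_boost_mhz: int, temp_c: int,
                first_step_c: int = 24, step_c: int = 10) -> int:
    """Clock after dropping one bin per step_c degrees above first_step_c."""
    if temp_c < first_step_c:
        return max_boost_mhz
    bins_dropped = 1 + (temp_c - first_step_c) // step_c
    return max_boost_mhz - bins_dropped * BIN_MHZ

print(boost_clock(2050, 20))  # 2050 (below the first step, no throttle)
print(boost_clock(2050, 30))  # 2037 (one bin dropped)
print(boost_clock(2050, 55))  # 1998 (four bins dropped)
```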


----------



## 8051

kithylin said:


> Just so you are aware: All 1080 Ti's start throttling at 24c and they reduce clocks every 10c as you go up. This is hard-coded behaviour by Nvidia and it can not be bypassed.


This seems counter-intuitive, at least with what I'm seeing with the Asus Strix XOC vBIOS. I'm getting 2152 MHz, unless Afterburner, GPU-Z and HWiNFO64 are lying to me, but I don't go above 57°C anymore. I don't see any throttling until I exceed 63°C to 64°C, and no perfcaps are thrown even then. Maybe the Asus Strix XOC vBIOS extends those limits.

What you're telling me isn't what I'm seeing.


----------



## omekone

Wow, that's a lot of posts.

Currently running a 1080 Ti FTW3 with the stock BIOS. She is under a custom loop (EK FC block/backplate) along with the CPU and 2x 360 rads.

Decided to wait on an upgrade and see if I can push this system another year or two. Would you advise the XOC BIOS, or is something else better atm? With the stock BIOS I can only hit 2060ish before she artifacts.


----------



## kithylin

8051 said:


> This seems counter-intuitive at least with what I'm seeing withe Asus Strix XOC VBIOS. I'm getting 2152Mhz. unless afterburner, GPUz and HWiNFO64 are lying to me, but I don't go above 57°C anymore. I don't see any throttling until I exceed 63°C to 64°C and no perfcaps are thrown even then. Maybe the Asus Strix XOC VBIOS extends those limits.
> 
> What you're telling me isn't what I'm seeing.


It's probably "all other BIOSes"; the XOC BIOS is "special" and doesn't behave like any other BIOS.


----------



## mouacyk

kithylin said:


> It's probably "All other bios's", the XOC bios is "Special" and doesn't behave like any other BIOS.


That is likely. The XOC bios has many protections and safeguards disabled, where the non-XOC bioses still default to throttling on *multiple *factors, and not just a single one.


----------



## kithylin

mouacyk said:


> That is likely. The XOC bios has many protections and safeguards disabled, where the non-XOC bioses still default to throttling on *multiple *factors, and not just a single one.


The XOC BIOS definitely throttles at 64-65c though. But no one that uses this BIOS should ever be in a situation where their card gets that hot anyway; they should be on a custom water loop that can maintain < 50c at all times before using the XOC BIOS in the first place. I did think that keeping my card colder at < 40c would help me overclock further on the XOC BIOS, but that doesn't seem to be true. So I suppose the XOC BIOS isn't following the typical "Nvidia starts throttling at 24c and reduces clocks as you go up" trend like normal BIOSes do.


----------



## Kokin

omekone said:


> Wow thats alot of posts.
> 
> Currently running a 1080ti FTW 3 with stock bios. She is under a custom loop (ek fc block/back plate) along with the cpu and 2x 360 rads.
> 
> Decided to wait on an upgrade and see if I can push this system another year or two. Would you advise the XOC bios or is something else better atm? with stock bios I can only hit 2060ish before she artifacts.


The XOC BIOS for the 1080 Ti FTW3 will increase your max power limit from 118% to 128%. I would see peaks of 330W in gaming and 380W in synthetics. Mine got up to 2100 core and 6000 memory using Precision XOC (1.093V max); it probably could have gone higher with MSI Afterburner but I didn't bother.

I ran the XOC BIOS at the stock 1.06V most of the time, since the core would boost to 1974-2000 by just increasing the power limit to 128%. Stock ran around 45C with the EK block, 280+240 rads and a 3900X. Typical GPU power was 250-280W in gaming.


----------



## BluePaint

I put my 1080 Ti FE under water after 3 years and used the current cold weather to test it.
TS 8C avg temps, 2143 MHz avg with the FTW3 kingpin fix BIOS, 11413 GPU pts with [email protected] and 3200 RAM.
3dmark.com/spy/18362207
Didn't expect water to be that effective.

Update:
11571 GPU TS with XOC BIOS @ 2240 MHz @ 8C with [email protected]
3dmark.com/spy/18362207

For fun:
1st place PR (Port Royal) 1080 Ti with 12.5 fps, 2745 pts
3dmark.com/pr/882451
Port Royal Leaderboard 1080 Ti


----------



## Kashtan

What is the best version of Pascal: Kingpin, HOF, Kudan, or the EVGA FTW3 Elite (12 Gbit/s memory)? The Kudan is very rare, but I hope the Kingpin and HOF can be compared?
I tried the following, using the most popular benchmark for video cards: Time Spy.

Best 1080 Ti result on LN2 for the Kingpin: 14301

I scored 15 004 in Time Spy (Intel Core i9-7980XE Processor, NVIDIA GeForce GTX 1080 Ti x 1, 32458 MB, 64-bit Windows 10) - www.3dmark.com

For the HOF: 13882

I scored 14 700 in Time Spy (Intel Core i9-7980XE Processor, NVIDIA GeForce GTX 1080 Ti x 1, 1220 MB, 64-bit Windows 10) - www.3dmark.com

On water, the Kingpin's best: 12418

I scored 13 106 in Time Spy (Intel Core i9-7960X Processor, NVIDIA GeForce GTX 1080 Ti x 1, 32768 MB, 64-bit Windows 10) - www.3dmark.com

HOF: 12181

I scored 11 514 in Time Spy (Intel Core i7-8700K Processor, NVIDIA GeForce GTX 1080 Ti x 1, 16384 MB, 64-bit Windows 10) - www.3dmark.com

Is this correct?


----------



## kithylin

Kashtan said:


> What is the best version of Pascal: Kingpin, HOF, Kudan, or the EVGA FTW3 Elite (12 Gbit/s memory)? The Kudan is very rare, but I hope the Kingpin and HOF can be compared?
> 
> Best 1080 Ti result on LN2 for the Kingpin: 14301; for the HOF: 13882. On water, the Kingpin's best: 12418; HOF: 12181.
> 
> Is this correct?


None of those cards is meaningfully different from the others; the extra / custom power sections only matter if you're doing competitive LN2 overclocking. Reference-board 1080 Tis can hit the same overclocks as custom-design 1080 Tis if you put them under a full-cover water block and keep them < 40c in a water loop. I've managed to pull 600W through my reference-board 1080 Ti before, so the difference in the cards' power delivery designs doesn't matter either.


----------



## KedarWolf

BluePaint said:


> I put my 1080ti fe under water after 3 years and used the current cold weather to test it
> TS 8C avg temps 2143 Mhz avg with ftw3 kingpin fix bios, 11413 gpu pts with [email protected] and 3200 ram
> 3dmark.com/spy/18362207
> Didnt expect water to be that effective.
> 
> Update:
> 11571 GPU TS with XOC BIOS @ 2240 Mhz @ 8C with [email protected]
> 3dmark.com/spy/18362207
> 
> For fun:
> 1st place PR 1080Ti with 12.5 fps  2745 pts
> 3dmark.com/pr/882451
> Port Royale Leaderboard 1080 Ti


Here is my best result, 11546 graphics score.

I thought I saved a screenshot, but when I checked it was an earlier result.

I scored 12 121 in Time Spy (AMD Ryzen 9 5950X, NVIDIA GeForce GTX 1080 Ti x 1, 32768 MB, 64-bit Windows 10) - www.3dmark.com


----------



## BluePaint

Great GPU score with those temps!

The weakness of my 1080 Ti is the VRAM. I also wonder whether an older driver would perform somewhat better with Pascal cards. In Port Royal there was a 10% uplift with newer drivers, but I didn't check the older benchmarks.

I also have a second 1080 Ti FE and a block (I forgot to test VRAM speed before putting the other one under water). Not sure whether to put it under water for some SLI fun or to sell it for 500 bucks on eBay. With the current price trends (a month ago it was worth 400) it will soon be worth more than what I paid for it in 2017, lol


----------



## KedarWolf

I'm pre-ordering an ASUS Strix OC 3090 in March.


----------



## bigblueshock

Figured this would be the best place to ask this. If mining talk is strictly prohibited in this thread, I will delete this post.

Anyone running NiceHash and mainly using the DaggerHashimoto algorithm? Or do any mining at all? If so, what are you using for the sweet spot for undervolting / core clock / power limit %? I'm trying to stay around 170W TDP, but I'd like to hear what others are achieving with 1080 Tis.

Edit: I use my 1080 Ti mainly for gaming, but with the current mining craze and with it already in hand, why not use it for other purposes while not gaming.


----------



## utparatrooper

bigblueshock said:


> Figured this would be the best place to ask this. If mining talk is strictly prohibited in this thread, I will delete this post.
> 
> Anyone running NiceHash and mainly using Daggerhashimoto algorithm? Or do any mining at all? If so, what are you using for the sweet spot for undervolting / core clock / Power limit %? I guess I'm trying to stay around 170W TDP. But I'd like to hear what other's are achieving with 1080 Ti's
> 
> Edit: I use my 1080 Ti for mainly gaming, but with the current mining craze and with it already in hand, why not use it for other purposes while not gaming.


I'm running my 1080 Ti in a standalone system only for mining (I've had my card since it was first released). I now have a 3080 for gaming and mining when I'm not gaming. 

For the 1080 Ti, it's an FE, and over the years I put water blocks on that thing twice, switched back to the original cooler, changed to an AIO, and am now back to the original cooler with the original thermal pads. Accordingly, I don't overclock the memory. I do underclock the core at -400 (as low as it goes in MSI AB) and set the power limit to 70%. At most I get around 32.xx MH/s.

Completely different story with my 3080 though. Average 92.xx MH/S. Both mining on Nicehash (Daggerhashimoto).
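The power-limit slider is just a percentage of the card's default board power, so converting between the "70%" setting above and the ~170 W target mentioned earlier is simple arithmetic. The 250 W figure below is the 1080 Ti Founders Edition default board power; substitute your own card's figure:

```python
# Convert a power-limit percentage to watts and back. The 250 W default
# board power is the 1080 Ti Founders Edition figure; other models differ.
DEFAULT_BOARD_POWER_W = 250

def limit_to_watts(limit_pct: float,
                   board_power_w: float = DEFAULT_BOARD_POWER_W) -> float:
    return board_power_w * limit_pct / 100

def watts_to_limit(target_w: float,
                   board_power_w: float = DEFAULT_BOARD_POWER_W) -> float:
    return target_w / board_power_w * 100

print(limit_to_watts(70))   # 175.0 W, the 70% setting above
print(watts_to_limit(170))  # 68.0%, for a ~170 W mining target
```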


----------



## bigblueshock

utparatrooper said:


> I'm running my 1080 Ti in a standalone system only for mining (I've had my card since it was first released). I now have a 3080 for gaming and mining when I'm not gaming.
> 
> For the 1080 Ti, it's an FE and over the years I put water blocks on that thing twice, switched back to the original cooler and changed to AIO and now back to the original cooler with the original thermal pads. I accordingly don't overclock the memory. I do underclock the core at -400 (as low as it goes in MSI AB) and set the power limit to 70%. At most I get around 32.xx MH/s.
> 
> Completely different story with my 3080 though. Average 92.xx MH/S. Both mining on Nicehash (Daggerhashimoto).


Thanks for the info! Yeah, the 3080 is a beast, I wish I had one. I would be doing the same as you: 3080 for gaming (mining at idle) and keep the 1080 Ti for pure mining.

My only gripe about the 3080 is the amount of VRAM the card comes with, if I ever decided on going to a 4K display. It's almost as if Nvidia knew a crypto-mining phase was coming and knew to slap only 10GB on these cards for that reason. The 3080 should have been 12GB or 16GB IMO.

One of the reasons I went with the MSI 1080 Ti Duke was that the backplate also covers the memory chips and VRM, even with the stock heatsink removed. It was nice and easy to slap an AIO on it and leave everything else alone.


----------



## nexxusty

bigblueshock said:


> Thanks for the info! Yeah 3080 is a beast, I wish I had one. I would be doing same as you, 3080 for gaming (mining at idle) and keep the 1080 Ti for pure mining.
> 
> My only gripe about the 3080 is amount of VRAM the card comes with if I ever decided on going 4k display. It's almost as nVidia knew a crypto mining phase was coming and knew to slap only 10GB on these cards for that reason. 3080 Should have been at 12GB or 16GB IMO.
> 
> One of the reasons I went with MSI 1080 Ti Duke was the backplate block also covers the Memory chips and VRM, even with removal of stock heatsink. Was nice and easy to slap an AIO on it and leave everything else alone.


Sold my 3080 because it only had 10GB of VRAM. Wasn't upset about it; I sold it for $1450 Canadian after paying $1200 for it.

Still, it really pissed me off that the card only had 10GB. Absolutely ridiculous for a $1200 GPU. Not even slightly missing it. I bought one of the top 3 air-cooled 1080 Ti models anyway, the Aorus Xtreme.

Absolute beast of a card. About to repaste and repad the VRMs.


----------



## kithylin

Nvidia released a new driver today, version 461.72, which has support for the RTX 3060 series. But the driver release notes also say:



> Implicit SLI Disabled on NVIDIA *Ampere GPUs*
> 
> Implicit SLI, where the driver makes assumptions with application profiles to achieve GPU scaling, is disabled on NVIDIA Ampere GPUs. Explicit SLI is still supported, where the application knows the SLI state and uses extensions (such as DirectX 12 linked nodes, Vulkan device groups, or OpenGL multicast extensions) to issue commands to each device in the SLI group. See the NVIDIA Knowledge Base Article 5082 (NVIDIA SLI Support Transitioning to Native Game Integrations | NVIDIA) for more information.


I know Nvidia told us previously that they would not publish new SLI profiles after Jan 2021, but I had not heard that they were actively disabling SLI altogether with new drivers, so this was a surprise to me. The reason I'm writing this here: the message says it's only disabling SLI support on Ampere GPUs. It says nothing about any of the other video cards still supported by this driver (the driver still supports all the way back to the GTX 600 series). Could someone with an SLI GTX 1080 Ti setup try the new driver and confirm back to us, yes or no, whether SLI still works with the new driver on Pascal cards?


----------



## KedarWolf

nexxusty said:


> Sold my 3080 because it only had 10gb VRAM. Wasn't upset about it, I sold it for $1450 Canadian after paying $1200 for it.
> 
> Still, really pissed me off that the card only had 10GB. Absolutely ridiculous for a $1200 GPU. Not even slightly missing it. I bought one of the top 3 1080Ti's air cooled models anyway. Aorus Xtreme.
> 
> Absolute beast of a card. About to repaste and repad the VRM's.


My second 1080 Ti, the SC Black, is a beast. It benches at 2026 core, +1013 memory (I'm at work, so I don't have the actual memory clocks) on the XOC BIOS.

Got my highest-ever Time Spy graphics score on a 1080 Ti, 11546, on water, alone on a 280 rad, using an EKWB Phoenix AIO; I bought the waterblock prefilled from EKWB.

The core is about average, but the memory is incredible on this card.

I run it at +993 24/7 with no crashes or artefacts.

I'm selling it locally with the EKWB waterblock and backplate on it, super cheap too, to sell it fast, like $500 CAD, maybe a bit more now that prices are going up.

When I get my tax refund in 7-10 days, I'm pre-ordering an ASUS Strix OC RTX 3090 and throwing a water block and stuff on it as well.

EKWB is supposed to start coming out with active backplates for 3090s soon. 3090 RAM is dual-sided, with RAM modules under the backplate. They can get hot, so an active backplate would be perfect for it.


----------



## KedarWolf

At my settings I run 24/7 on my SC Black 1080 Ti with monitor HDR on. 



----------



## jura11

From what I remember, I could run my EVGA GTX 1080 Ti Founders Edition at 2164MHz with 875MHz on the VRAM in any benchmark, still on the stock BIOS. Later on I flashed the FTW3 BIOS, which I think has been the best on my FE: no issues and temperatures never broke 38°C.

For normal gaming I used the same OC profile; only for rendering I couldn't use it. In rendering I used a 2113MHz OC profile with 700MHz on the VRAM.

Hope this helps 

Thanks, Jura


----------



## mouacyk

@KedarWolf At 1.2v you should be able to get 2,150+ on the core. Mine was gaming stable at 2165 and some benches can run at 2200. Memory was at 12,400.


----------



## KedarWolf

mouacyk said:


> @KedarWolf At 1.2v you should be able to get 2,150+ on the core. Mine was gaming stable at 2165 and some benches can run at 2200. Memory was at 12,400.


Time Spy crashes above 2126. That's what I run for benches.


----------



## KedarWolf

Pro-tip for anyone mining. Most of you probably know, but some don't.

Download NVIDIA Profile Inspector (DmW fork): Release v3.5.0.0 · DeadManWalkingTO/NVidiaProfileInspectorDmW (github.com).





You might have to run it as Admin.

Set the P2 State as below, or your memory speeds will be severely gimped while mining. Also, some benchmarks etc. lower RAM speeds as well, and this fixes that.


----------



## 8051

@KedarWolf
Your memory overclock is fantastic. My 1080ti would lock up if I tried that. Is your 1080ti under water?

Are there any Asus Strix XOC equivalent VBIOS for ampere that can be cross-flashed?


----------



## KedarWolf

Check the BestBiosCollection.zip in the first post. You want either the XOC BIOS, or the FTW Kingpin one if you're not under water.

Yes, I have a waterblock and backplate on it. I'm selling it this weekend though, upgraded to an ASUS ROG Strix OC RTX 3090


----------



## KedarWolf

8051 said:


> @KedarWolf
> Your memory overclock is fantastic. My 1080ti would lock up if I tried that. Is your 1080ti under water?
> 
> Are there any Asus Strix XOC equivalent VBIOS for ampere that can be cross-flashed?


Oh wait, which video card are you asking about?


----------



## demonius34

Hi everyone, and sorry for my English. I have a problem with my GeForce 1080 Ti Zotac AMP Extreme. I would like to create my own custom fan curves for my cards, but it refuses to react to Afterburner. Also, when I start a game the fans speed up to 100% for a second and then return to how they were before. Could you help me please?


----------



## demonius34

Anyone ?


----------



## BluePaint

I had a zotac 980ti with bugged fan control. Flashing a different bios helped


----------



## Chufye

Hi guys. I have a EVGA 1080Ti Founders Edition card. It is air cooled using an EVGA SC Black shroud cooler.
What is the best bios to use for this?
I only intend to game really. Wanting to get the best performance out of this. 
Thanks for the help everyone.


----------



## mattliston

The best BIOS is the stock one. Unless you're overclocking with LN2 or similar, there are essentially zero benefits on the user side.

Many manufacturers offer updated BIOS files to fix issues, add headroom, or even just gain some stability.


All it takes is ONE voltage controller being different, and you can brick or destroy your card.


Not worth the risk in times when every video card has skyrocketed in price and is hard to come by.


----------



## zeze1980

Hello guys,
I want to raise the power limit by changing the BIOS.
My video card is an MSI GeForce GTX 1080 Ti GAMING X 11G.
Or if there is another way to raise the power limit, that would be most welcome.


----------



## KedarWolf

Chufye said:


> Hi guys. I have a EVGA 1080Ti Founders Edition card. It is air cooled using an EVGA SC Black shroud cooler.
> What is the best bios to use for this?
> I only intend to game really. Wanting to get the best performance out of this.
> Thanks for the help everyone.


The Kingpin Fix FTW3 BIOS in the BestBiosCollection.zip in the OP is really good for the Founders cards.

I tested them all, in gaming and benchmarks it performed the best.

Probably the same for @zeze1980


----------



## Chufye

KedarWolf said:


> The Kingpin Fix FTW3 BIOS in the BestBiosCollection.zip in the OP is really good for the Founders cards.
> 
> I tested them all, in gaming and benchmarks it performed the best.
> 
> Probably the same for @zeze1980


Thanks a million buddy..


----------



## zeze1980

I have finished reading it. It is clear, simple, and contains all the attachments.

But before I start, there is something on my mind. I have read a lot, and people say you should be careful about compatibility between GPUs; for example: the number of fans, the power connectors, the display ports, the number of power inputs, the type of video RAM, the video RAM manufacturer, etc.

Is the MSI GeForce GTX 1080 Ti GAMING X compatible with the BIOS in the article?
I am afraid that I will do it and the result will be disappointing.

Here are some details about my GPU:








MSI GTX 1080 Ti vBIOS (www.techpowerup.com): 11 GB GDDR5X, 1544 MHz GPU, 1376 MHz memory





Here is a picture of the internal configuration: https://www.gamersnexus.net/media/k2/items/cache/62bc30990b2d77eab0f5b6b38df34e15_XL.jpg

@KedarWolf
@alucardis666
@Tikerz
@KingEngineRevUp


----------



## kiriakos

zeze1980 said:


> Is MSI GeForce GTX 1080 Ti GAMING X compatible with the BIOS in the article?
> I am afraid that I will do it and the result will be disappointing


I am afraid that MSI will not lift a finger to recover the card if the vBIOS flash fails.
Stay safe, do not play with it.


----------



## zeze1980

kiriakos said:


> I am afraid that MSI will not lift a finger to recover the card if the vBIOS flash fails.
> Stay safe, do not play with it.



You are right.
I do not want to take the risk, so I wanted to be sure of what I would be doing.

And it looks like I'm back at the starting point.
Or is there no way at all to raise the power limit?


----------



## kithylin

Just a note for everyone: at the moment I own two 1080 Ti's (although I'll be selling one soon). One is an EVGA GTX 1080 Ti SC Black Edition; the other is an NVIDIA Founders Edition. On the EVGA card I have flashed at least 25-30 different vBIOSes from all sorts of different cards and vendors, and it always ran fine with no functional issues, other than sometimes one of the display I/O ports (DVI, DisplayPort, HDMI) not working. But it never actually killed the card. On the Founders Edition card I have so far flashed the EVGA FTW3 BIOS and later the XOC BIOS, and it's also been perfectly fine. I found out SLI really is dead like everyone says, so my little experiment is over and I'm back to a single card.

Additional information: even with the XOC BIOS running 1.200v through the card, my EVGA card listed above maxed out at 2126 MHz stable on the core and +700 memory. More recently I bought this Founders Edition card off eBay to test out SLI (only to find out, like everyone said, that SLI is dead; nothing scales at all in any game). But it turned out the Founders Edition 1080 Ti's really are "binned" like the rumors claimed. This Founders Edition card runs at 2177 MHz stable on the core and +725 memory.

So I'm probably going to keep the Founders Edition one that clocks higher, sell my EVGA 1080 Ti, and sit on the proceeds until Feb-March 2022 to see what new GPUs come out, then try to grab something for $1200 and sell my Founders Edition 1080 Ti. Maybe. Or I might keep it forever since it clocks so high.


----------



## zeze1980

kithylin said:


> Just a note for everyone: At the moment I own two 1080 Ti's (although I'll be selling one soon). One is a EVGA GTX 1080 Ti SC Black Edition. The other is an Nvidia Founder's Edition. The EVGA card I have flashed at least 25-30 different vbios's from all sorts of different cards and vendors on it and it always worked fine and ran fine with no functional issues other than sometimes one of the display I/O ports might work or not work (DVI, DisplayPort, HDMI, those ports). But it never actually killed the card. The founders edition card I have so far flashed the EVGA FTW3 bios on it and then later the XOC bios on it and it's also been perfectly fine. I found out SLI actually is dead like everyone says and so my little experiment is over and I'm back to single card.
> 
> Additional information: Even with the XOC bios on it and running 1.200v through the card(s), my EVGA card listed above maxed out at 2126 Mhz stable on the core and +700 memory. But then more recently I bought this founder's edition card off ebay to test out SLI (only to find out like everyone said, yep SLI dead.. nothing scales at all in any game). But it turned out that the founders edition 1080 Ti's really are "Binned" like the rumors claimed they were. This founder's edition card runs at 2177 Mhz stable on the core and +725 memory stable.
> 
> So I'm probably going to keep the founders edition one that clocks higher, sell my EVGA 1080 Ti and sit on the proceeds until Feb~March 2022 and see what new GPU's come out and try to grab something for $1200 and sell my founders edition 1080 Ti then. Maybe. Or I might keep it forever since it clocks so high.




Thank you very much for sharing your experience.

One last question remains: suppose I flash the BIOS and the card does not work. Is there a way to solve the problem?


----------



## Blameless

kithylin said:


> But it turned out that the founders edition 1080 Ti's really are "Binned" like the rumors claimed they were.


That's impossible to ascertain from one sample. I've seen 1080 Ti's of the same SKU with maximum stable core OCs more than 100MHz apart at the same voltage. Spread on the memory is frequently even wider.



zeze1980 said:


> If we assume the following: I flashed the BIOS and the card did not work. Is there a way to solve the problem?


If the card has a BIOS switch, boot from the working BIOS, flip the switch, and flash the correct firmware to the non-working ROM.

If the card doesn't have a BIOS switch boot with another card while the non-functional card is installed and flash the firmware on the bad card.


----------



## KedarWolf

zeze1980 said:


> Thank you very much for sharing your experience
> 
> One last question remains. If we assume the following: I flashed the BIOS and the card did not work. Is there a way to solve the problem?


You can plug in a second video card to get a display, then flash the first; or connect an HDMI cable to the motherboard's onboard display output and flash the bricked card.

That's if your CPU has onboard graphics.

See the how to flash a 1080 Ti thread for instructions if you use a second video card.

I recently did that with a second video card when I was flashing the BIOS on my RTX 3090 and the PC BSOD'd during the flash and bricked my card. 🥶


----------



## kithylin

Blameless said:


> That's impossible to ascertain from one sample. I've seen 1080 Ti's of the same SKU with maximum stable core OCs more than 100MHz apart at the same voltage. Spread on the memory is frequently even wider.


A friend of mine has a Founders Edition RTX 2080 Ti that does 2186 stable on water too. I know this is just two samples, but supposedly Founders Edition cards were binned for the best silicon; at least that was the rumor back when the 1080 Ti's were still sold new.


----------



## KedarWolf

kithylin said:


> A friend of mine has a Founder's Edition RTX 2080 Ti that does 2186 stable on water too. I know this is just 2 samples but supposedly founders edition cards were supposed to be binned for the best silicon. At least that was the rumor back when the 1080 Ti's were still sold new.


My Founder's was average but my SC Black was exceptional, especially on the memory.

I got a #1 Time Spy with it for a 9900k with a 1080 Ti combo.

You can search by CPU and card, or at least you could back then.

Edit: Actually, it was my 3950x I had.









OFFICIAL 3950x Overclocking thread.... (www.overclock.net)





Second edit: Still #1 with that CPU and card combo.

Third edit: SC Black uses a reference PCB though, so it's all down to the silicon lottery.


----------



## kithylin

KedarWolf said:


> My Founder's was average but my SC Black was exceptional, especially on the memory.
> 
> I got a #1 Time Spy with it for a 9900k with a 1080 Ti combo.
> 
> You can search by CPU and card, or at least you could back then.
> 
> Edit: Actually, it was my 3950x I had.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> OFFICIAL 3950x Overclocking thread.... (www.overclock.net)
> 
> 
> 
> 
> 
> Second edit: Still #1 with that CPU and card combo.
> 
> Third edit: SC Black uses a reference PCB though, so it's all down to the silicon lottery.


And my SC Black Edition card is a potato; 2126 core with the XOC BIOS, water cooled at 1.200v, is pathetic. This Founders Edition card I picked up can do 2126 MHz at stock volts.



kiriakos said:


> @*kithylin *
> Will fix your card for free and pay shipping back cost too.
> It never did that before, but you never know.


I'm not sure what you're trying to write here? I'm not repairing anyone's video card. I would never do that for anyone unless I knew them very well as a close friend.


----------



## KedarWolf

kithylin said:


> And my SC Black edition card is a potato. 2126 core with XOC bios, water cooled and 1.200v is pathetic. This Founder's Edition card I picked up can do 2126 Mhz with stock volts.
> 
> 
> I'm not sure what you're trying to write here? I'm not repairing anyone's video card. I would never do that for anyone unless I knew them very well as a close friend.


I'm pretty sure they were being snide about you suggesting they go ahead and flash the BIOS.

Edit: Both my Founders and Black would only do 2126 at 1.2v under water, but the Black benched something like 500 more on the memory, which is crazy.


----------



## zeze1980

Hello guys, I flashed the BIOS from an EVGA GTX 1080 Ti SC Black Edition onto my MSI GeForce GTX 1080 Ti GAMING X card and it worked fine, but there is a problem: the power limit is 200 watts, and when I increase it to 120% it goes to 240 watts.
Note: the EVGA GTX 1080 Ti SC Black Edition BIOS should be 300 watts after the increase to 120%.
Note: in the past, on the original BIOS, I could reach up to 320 watts.
Note: having flashed the BIOS, I want to exceed the original power limit (320 watts) and voltage limit (1093 millivolts), reaching higher values such as 400 watts and 1200 millivolts, for example.

@kithylin
@KedarWolf
@alucardis666
@Tikerz
@KingEngineRevUp


----------



## kithylin

zeze1980 said:


> Hello guys, I flashed the BIOS (EVGA GTX 1080 Ti SC Black Edition) on my ( MSI GeForce GTX 1080 Ti GAMING X ) card and it worked fine, but there is a problem, the power limit is ( 200 watts ) and when I increase it to 120% it goes to (240 watts) )
> Note: The BIOS (EVGA GTX 1080 Ti SC Black Edition) is 300 watts after the increase (120%).
> Note: In the past, I could reach on the original BIOS up to (320 watts).
> Note: I flashed the BIOS. I want to exceed the original power limit (320 watts) and the voltage limit (1093 millivolts) to higher values such as: (400 Watts) and (1200 millivolts) as an example
> 
> @kithylin
> @KedarWolf
> @alucardis666
> @Tikerz
> @KingEngineRevUp


There's a powerlimit.bat file attached to the first post in this thread. Did you download it and run it?


----------



## zeze1980

I did what you told me, but it never worked.
Then I flashed the BIOS with no power limit.
That worked without problems, and I reached 330+ watts and the voltage reached 1150+.
But the problem is that I was unable to reach 2000 MHz.
Note: on the original BIOS, I reached 2000 MHz without problems.
Note: the temperature, both before and now, did not exceed 70°C and was not lower than 65°C.
Note: in the past the memory clock was +650, but now it is +0; even so, I could not reach 2000 MHz, let alone exceed it.

@kithylin


----------



## kithylin

zeze1980 said:


> I did what you told me, but it never worked
> Then I flashed the BIOS with no power limit
> This worked without problems, and I reached (330 + watts) and the voltage reached (1150 +)
> But the problem is that I was unable to reach (2000 megahertz)
> Note: On the original BIOS, I was up to ( 2000 megahertz ) without problems
> Note: The temperature in the past and now did not exceed (70°C) and was not lower than 65°C.
> Note: In the past, Momery Clock was (+650), but now it is (0+), however, I could not reach (2000 Megahertz), let alone exceed it.
> 
> @kithylin


Do be warned: you should not be using the BIOS with no power limit on any air-cooled card, as you run a very high risk of causing permanent damage to your 1080 Ti. That BIOS is only for people with a full-cover water block in a custom water loop, never for air-cooled cards. Also, the power limit does not determine your maximum overclock on the 1080 Ti series; temperature does. The power limit only causes power-limit throttling once it is triggered. If you are trying to overclock your video card further (past 2000 MHz), you will have to drop the temperature of the card below 50°C (likely to 40°C and below) under load. The NVIDIA 1000 series (and all newer NVIDIA cards) reduce their clocks depending on how hot they are; the hotter the card, the slower its core clock will run. No matter which BIOS you run, you will never see 2100 MHz or above on air cooling.


----------



## Lobstar

Thanks to Black Friday, my 1080 Ti FE gets its first waterblock. It will be recommissioned in my 3950x/128GB RAM/88TB Unraid/Plex server for encoding and daily compute duties. I figured after nearly 5 years it could use a little refresh before being put back to use.


----------



## yzonker

Lobstar said:


> Thanks to Black Friday my 1080ti FE gets it's first waterblock. This will be recommissioned in my 3950x/128GB RAM/88TB Unraid/Plex server for encoding and daily compute duties. I figured after near 5 years it could use a little refresh before being put back to use.


Where did you buy the block from? 

I was thinking about asking in this thread as I haven't seen any in stock.


----------



## kithylin

yzonker said:


> Where did you buy the block from?
> 
> I was thinking about asking in this thread as I haven't seen any in stock.


PPCS has a lot of 1080 Ti blocks in stock still: Search results for: 'GTX 1080 Ti'

That specific one they showed above in the message you quoted is here: XSPC Blade - GTX 1080 / 1080 Ti It's even on sale for $75.


----------



## Lobstar

yzonker said:


> Where did you buy the block from?
> 
> I was thinking about asking in this thread as I haven't seen any in stock.


Performance PCs has this one for like $75. They have the Heatkiller too, last I checked, but I wasn't looking to spend that much on a block for my server. Edit: just saw kith said the same thing lol


----------



## alienalvan

*Unable to synchronize SLI overclock settings*
Hi guys,
I just got my hands on two Asus Strix GTX 1080 Ti's running in SLI, and I'm trying to synchronize both GPUs' overclock settings (using the curve and fixing the voltage).

But unfortunately, the 2nd card often sits at a higher clock, and whenever I run any test (3DMark, Heaven benchmark) it drops to the lower clock (that of the 1st card); when idle, the 2nd card goes back to the higher clock speed.

I've tried flashing both cards with the exact same BIOS version (downloaded from the BIOS pack in the GTX 1080 Ti Club post), but no luck getting an exact match of the voltage/frequency curve. Am I doing it wrong, or is it supposed to behave this way due to silicon quality? Thanks


----------



## kithylin

alienalvan said:


> *Unable to synchronize SLI Overclock settings*
> Hi guys,
> I just get my hand on 2 Asus Strix GTX1080Ti running in SLi & I'm trying to synchronize both GPU overclock settings (Using curve & fixing voltage).
> 
> But unfortunately, the 2nd card often gets a higher clock & whenever I run any test (3DMark, Heaven Benchmark) it will go to the lower clock (Which is the 1st card) & when it idles the 2nd card goes back to a higher clock speed.
> 
> I've tried to flash both card BIOS with the exact same version of BIOS (Downloaded the BIOS pack from GTX1080Ti Club post) no luck to getting an exact match of voltage & frequency curve, izzit I'm doing it wrong or it is supposed to behave in such a way due to silicon quality? Thanks


Are your cards air cooled or water cooled? And exactly which bios are you using on the two cards?


----------



## alienalvan

kithylin said:


> Are your cards air cooled or water cooled? And exactly which bios are you using on the two cards?



It is water + air-cooled at the same time 😁
Main GPU max temp is 51°C, the 2nd GPU around 43°C; both are flashed with the BIOS version below.


----------



## kithylin

alienalvan said:


> It is water + air-cooled at the same time 😁
> Main GPU max temp at 51c, 2nd GPU around 43c,


That's your problem then. Starting with the Pascal video cards (1000 series) Nvidia varies the clock speeds of the video cards based on the temperature of the GPU core regardless of whatever you set in MSI Afterburner. According to the information you have provided us you seem to have one video card running at 2075 Mhz and the other at 2037 Mhz. If that's true then this is a normal expected behaviour when you have a +/- 8c difference in temperatures. There is a bios that can kind of bypass this behaviour and let us run cards synchronized but you probably shouldn't be using it with your setup (AIO + fan cooled VRM's) because the VRM's can get too hot with that bios and you would run risk of damaging your cards. It's typically only advisable for people with full cover custom water loop water blocks on their cards that water cool both the VRM's and the core together.

Also I will comment that I recently tried 1080 Ti SLI myself and for 1080p and 1440p it's useless and doesn't scale at all in any games. You probably won't even notice a single FPS gain in anything outside of 3DMark unless you're running a 4K monitor.


----------



## alienalvan

kithylin said:


> That's your problem then. Starting with the Pascal video cards (1000 series) Nvidia varies the clock speeds of the video cards based on the temperature of the GPU core regardless of whatever you set in MSI Afterburner. According to the information you have provided us you seem to have one video card running at 2075 Mhz and the other at 2037 Mhz. If that's true then this is a normal expected behaviour when you have a +/- 8c difference in temperatures. There is a bios that can kind of bypass this behaviour and let us run cards synchronized but you probably shouldn't be using it with your setup (AIO + fan cooled VRM's) because the VRM's can get too hot with that bios and you would run risk of damaging your cards. It's typically only advisable for people with full cover custom water loop water blocks on their cards that water cool both the VRM's and the core together.
> 
> Also I will comment that I recently tried 1080 Ti SLI myself and for 1080p and 1440p it's useless and doesn't scale at all in any games. You probably won't even notice a single FPS gain in anything outside of 3DMark unless you're running a 4K monitor.


I see, it makes sense now: since the primary GPU is the hottest and has the lowest boost clock, the 2nd GPU synchronizes down to the lower clock when both GPUs are in SLI.
Thanks for the tips~ 😊


----------



## kithylin

alienalvan said:


> I see, it makes sense now since the primary GPU is the hottest & has the lowest clock boost therefore then 2nd GPU will synchronize toward the lowest clock when both GPUs is SLi'ing~
> Thanks for the tips~ 😊


Yeah, the hotter the cards get, the more they reduce clocks. I can't remember the exact scale (someone correct me if I'm wrong?) but it's something around -15 MHz per +5°C, starting from around 24°C and going upwards. This is why I (and a lot of other folks) go for large custom water loops even for a single 1080 Ti: we can reach higher clock speeds just by keeping the temps low, even on the same BIOS, depending on the silicon lotto.

For example: my first 1080 Ti, bought brand new in fall 2017, would only do 2126 MHz core and +700 memory, even though I had it in a custom water loop, kept it under 40°C at all times, and ran the "unlimited" BIOS. I recently bought a second 1080 Ti off eBay to try out SLI, and that one will run a 2155 MHz core clock for gaming and 2177 for benchmarks (also only stable with memory at +700), with temps under 40°C. So it depends on a lot of things: temperature is a factor for clock speeds, as are the power limit of the BIOS and the silicon lottery.
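The rule of thumb above can be sketched as a tiny model. The numbers (-15 MHz per +5°C starting near 24°C) are the rough recollection from this post, not an NVIDIA spec, and the real bin size and cold boost clock vary per card; treat this purely as an illustration of how temperature bins eat into boost clock.

```python
def estimated_core_mhz(cold_boost_mhz, temp_c, start_c=24, bin_c=5, drop_mhz=15):
    """Rough GPU Boost temperature model: drop `drop_mhz` for every full
    `bin_c` degrees above `start_c`. Defaults follow the forum's rough
    recollection, not official figures."""
    if temp_c <= start_c:
        return cold_boost_mhz
    bins = (temp_c - start_c) // bin_c
    return cold_boost_mhz - bins * drop_mhz

# e.g. a card that boosts to 2177 MHz when kept under 24C would, at 50C,
# sit roughly (50 - 24) // 5 = 5 bins lower, i.e. around 2102 MHz
```

Under this model the jump from 70°C to 50°C is worth about four bins, roughly 60 MHz, which is in the same ballpark as the 50 MHz swings reported elsewhere in the thread.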


----------



## Lobstar

Just got my waterblocked FE up and running.

The better result is 0.981v @ 1911 w/ +400 mem; the other is just the power limit in AB increased to 120. The BIOS is whichever was stock from the very first batch. Result

I haven't tweaked my ram or CPU so the memory is pretty slow (happens with 128gb ... ).

Edit: Rig tax.


----------



## KedarWolf

Lobstar said:


> Just got my waterblocked FE up and running.
> 
> The better result is .981 @ 1911 w/ +400 mem and the other is just increased power limit in AB to 120. The bios is whichever was stock from the very first batch. Result
> 
> I haven't tweaked my ram or CPU so the memory is pretty slow (happens with 128gb ... ).
> 
> Edit: Rig tax.


I just bought an Optimus Water Cooling Strix OC RTX 3090 block, a best-in-class block. But they take a few months to actually ship; they are basically custom made on demand.

I had the top 3DMark score last year for a 5950x and a 1080 Ti, but I don't recall if it was when I SLI'd or a single card.

Edit: It was single card.









3DMark.com search (www.3dmark.com)


----------



## Lobstar

KedarWolf said:


> I just bought an Optimus Water Cooling Strix OC RTX 3090 block, best in class block. But they take a few months to actually ship. They are basically custom made on-demand.
> 
> I had the top 3DMark score last year for a 5950x and a 1080 Ti, but I don't recall if it was when I SLI'd or a single card.
> 
> Edit: It was single card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 3DMark.com search (www.3dmark.com)


Sheesh! That's a nice 1080 Ti score! I think either my Dark Hero or 5950x died today; it's throwing scep ca auth errors and it spat out a random 00 code. Sooo, Windows gets installed on the server for a bit lol.


----------



## zeze1980

Welcome all.
In the power-limit adjustment file (the .bat), I found something strange:
the path in the BAT command is no longer correct; it changed with the new NVIDIA updates.
C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe (before the update; where it is after the update, I don't know).
Because of this I can't reach the maximum power in the BIOS (330 W); I'm stuck at less than 300 watts, which caused a 50 MHz drop in core clock speed.
How do I fix it?


----------



## kithylin

zeze1980 said:


> welcome all
> in the power limit adjustment file ( BAT )
> I found something strange
> The file extension in the BAT command was not correct. It has changed with the new updates for Nvidia
> C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe (before the update. after the update I don't know)
> Because of this I can't reach the maximum power in the BIOS (330), I'm stuck at less than 300 watts, this caused a drop in the core clock speed of 50 MHz
> How to fix it ?


That's probably because Nvidia removed the nvidia-smi.exe tool from their drivers quite a long time ago, around spring 2021; it doesn't exist there anymore. Me personally, on my second computer (where my 1080 Ti lives now), I just use driver version 472.12 and never update it. Nvidia hasn't included any fixes or performance enhancements for 10-series cards in newer drivers for a long time either; the cards are still supported, but the new drivers haven't done anything for us 1080 Ti owners in over a year. I don't see any reason to ever update the drivers on my 1080 Ti system, and this older driver version includes nvidia-smi.exe.


----------



## Maximization

@*alienalvan*


I would use Asus overclocking programs to make sure Asus products sync.


----------



## KedarWolf

zeze1980 said:


> welcome all
> in the power limit adjustment file ( BAT )
> I found something strange
> The file extension in the BAT command was not correct. It has changed with the new updates for Nvidia
> C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe (before the update. after the update I don't know)
> Because of this I can't reach the maximum power in the BIOS (330), I'm stuck at less than 300 watts, this caused a drop in the core clock speed of 50 MHz
> How to fix it ?


In the powerlimit.bat file, change the path from C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe to C:\Windows\System32\nvidia-smi.exe,

and set the wattage to the max power limit of the BIOS you flashed. I highly recommend the FTW3KingpinFix BIOS; if I recall right, that one has a 330 W power limit.

overclock.net won't let me paste the full .bat file code. 

Edit: Or just have Afterburner start at boot with the power limit and voltage maxed out; that maxes out the power limit.
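What the .bat edit amounts to can be sketched in Python for illustration: probe both known nvidia-smi.exe locations and build the `nvidia-smi -i <gpu> -pl <watts>` command. The two paths and the 330 W figure come from the posts above; the function names are made up for this sketch, and actually applying the power limit still requires running as admin on a real system.

```python
from pathlib import Path

# Both locations mentioned above: newer drivers put nvidia-smi.exe in
# System32, older ones used the NVSMI folder under Program Files.
NVSMI_CANDIDATES = [
    Path(r"C:\Windows\System32\nvidia-smi.exe"),
    Path(r"C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe"),
]

def find_nvidia_smi(candidates=NVSMI_CANDIDATES):
    """Return the first existing nvidia-smi.exe, or None if neither is there."""
    for p in candidates:
        if p.exists():
            return p
    return None

def power_limit_cmd(watts, gpu_index=0, candidates=NVSMI_CANDIDATES):
    """Build the argv for `nvidia-smi -i <gpu> -pl <watts>` (run as admin)."""
    exe = find_nvidia_smi(candidates)
    if exe is None:
        raise FileNotFoundError("nvidia-smi.exe not found in the expected locations")
    return [str(exe), "-i", str(gpu_index), "-pl", str(watts)]
```

The resulting argv (e.g. `[..., "-i", "0", "-pl", "330"]`) is exactly what the .bat runs; checking both paths is what makes it survive the driver-layout change discussed above.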


----------



## mxthunder

Hey guys - had a 1080ti for the last 2 years or so but never really looked into overclocking it much. I have it set to +150 on the core resulting in about a 2050MHz clock at all times under load (card is WC) and +200 on the memory. Is there generally more headroom on the memory? I never played with the voltage slider, would moving it around net me any more core speed? I actually just bought the better version of the Titan Xp to replace it with as I want to put the 1080ti in my bedroom PC to upgrade my 1080.


----------



## rexbinary

mxthunder said:


> Hey guys - had a 1080ti for the last 2 years or so but never really looked into overclocking it much. I have it set to +150 on the core resulting in about a 2050MHz clock at all times under load (card is WC) and +200 on the memory. Is there generally more headroom on the memory? I never played with the voltage slider, would moving it around net me any more core speed? I actually just bought the better version of the Titan Xp to replace it with as I want to put the 1080ti in my bedroom PC to upgrade my 1080.


+150 core and +200 mem is good. You might do better on the core, but maybe not by a lot. Memory is very sensitive and can cause weird stability issues, so bump it sparingly.

Use MSI Afterburner regardless of your card's brand. Slide the power limit and temp limit to max. Set the fans to max, or at least use the default fan curve in Afterburner. Bump the core a bit at a time while running the Heaven 4.0 benchmark until it fails; once it fails, back it down a bit and test stability. Use other benchmarks or real games to further test stability; Battlefield 4 is a good one. Personally I left my memory at +200 and never tried to increase it further, but again, bump it a bit at a time while testing with Heaven, etc., just like the core.
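The "bump it a bit at a time until it fails" routine above can be sketched like this; `is_stable` stands in for you actually running Heaven or a game at a given offset and reporting pass/fail, and the step size and limit are arbitrary examples for the sketch, not recommendations.

```python
def find_stable_offset(is_stable, start=0, step=25, limit=300):
    """Raise the core offset in `step` MHz increments while each new
    offset passes `is_stable`; return the last offset that passed."""
    offset = start
    while offset + step <= limit and is_stable(offset + step):
        offset += step
    return offset

# e.g. if the card turns out to be stable up to +150 MHz:
# find_stable_offset(lambda mhz: mhz <= 150) -> 150
```

In practice you would still back off a notch or two from the returned value and re-test in longer sessions, since a run of Heaven passing once is weaker evidence than hours of gaming.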


----------



## zeze1980

Hello everyone,

1080 Ti graphics cards: if the temperature is at 70°C and I reduce it to 50°C, what is the expected core speed?


Note: the speed is now at 2000 MHz.


----------



## kithylin

zeze1980 said:


> Hello everyone
> 
> 1080 ti graphics cards
> If the temperature is at 70 and i reduce it to 50 What is the expected core speed?
> 
> 
> Note : that the speed is now at 2000 mhz


No one can tell you that because it's different for every single 1080 Ti. Silicon lottery with the core and all is different for each card. For example: I had a GTX 1080 Ti from EVGA, the EVGA GTX 1080 Ti SC Black Edition that I put under a water block and I had it running around 35~38c when gaming and it would only ever overclock to 2126 Mhz on the core. But then a few years later I bought a NVIDIA Founders Edition GTX 1080 Ti and put the same water block on it and ran it at the same 35~38 C and it does 2177 Mhz stable on the core. I got lucky there and I sold my EVGA version.

So there's really no way for anyone to predict what your card will overclock to. You'll just have to cool it, try it, and see what happens. You will probably also be limited by the card's power limit at some point, so that will affect things too.


----------



## zeze1980

kithylin said:


> No one can tell you that because it's different for every single 1080 Ti. Silicon lottery with the core and all is different for each card


But is there a *graph* for that? I mean temperature versus frequency:
how much frequency do I gain for each degree Celsius lower?
For example: the temperature is now *70*; if it drops to *50*, how much frequency gain is *expected*?


----------



## kithylin

zeze1980 said:


> But is there a *graph* for that? I mean temperature and frequency
> or how much frequency do I get for each lower Celsius temperature?
> for example: the temperature is now *70*, if the temperature drops to *50*, how much frequency gain is *expected*


No. There is no graph, and there is no way to tell; it is literally unique to every video card. Some clock higher, some do not. That is why it is called "the silicon *lottery*": maybe you'll get lucky and get a card that overclocks high, or maybe you'll get a dud. No one can tell you until you try it. Generally, reducing the core temperature may let you raise the clock speed, or your card may not gain any clocks even with a colder core; it's effectively random. There is a way to "delete" the power limit and run these cards with no power limit, but that's only for people with a full-cover water block and a cooling system good enough to keep the card cold while gaming. I'm guessing you're on some sort of air cooling or a hybrid AIO if you're talking about the core running at 50 C.
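For what it's worth, Pascal's GPU Boost does step the clock down in roughly 13 MHz bins as the core warms up; what no graph can tell you is where those steps land on *your* card, which is the point above. A toy Python model, where the bin spacing and starting temperature are pure assumptions chosen only to illustrate the shape of the behavior:

```python
# Rough illustration of GPU Boost thermal binning on Pascal. The ~13 MHz
# bin size is the commonly observed step; the 5 C-per-bin spacing and the
# 38 C onset below are ASSUMPTIONS -- real thresholds vary per card.
BIN_MHZ = 13           # one boost bin
DEG_PER_BIN = 5        # assumed spacing between thermal steps
THROTTLE_START_C = 38  # assumed temp where bins start dropping

def estimated_clock(max_boost_mhz, temp_c):
    """Estimated sustained clock at a given core temperature."""
    if temp_c <= THROTTLE_START_C:
        return max_boost_mhz
    bins_lost = (temp_c - THROTTLE_START_C) // DEG_PER_BIN
    return max_boost_mhz - bins_lost * BIN_MHZ

# With these made-up numbers, dropping from 70 C to 50 C recovers 4 bins:
print(estimated_clock(2088, 50) - estimated_clock(2088, 70))  # -> 52
```

So a 70 C to 50 C drop might be worth a few bins (tens of MHz), not hundreds, before you even hit the silicon-lottery question of whether the core is stable there.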


----------



## zeze1980

I currently use air cooling, and with water cooling or a hybrid I *expect* the temperature to drop from the current 70 C to *50 C or lower* with the change of cooling system.
So I'm asking about the performance gain: if there is *100 MHz or more* on the *core clock*, that might motivate me to change the cooling system.

Note that I have the card now:

MSI GTX 1080 Ti GAMING X
2000 MHz core clock
6200 MHz memory clock
65-70 C under load









MSI GTX 1080 Ti GAMING X Specs (TechPowerUp): NVIDIA GP102, 1657 MHz, 3584 cores, 224 TMUs, 88 ROPs, 11264 MB GDDR5X, 1376 MHz, 352-bit (www.techpowerup.com)

MSI GTX 1080 Ti VBIOS (TechPowerUp): 11 GB GDDR5X, 1544 MHz GPU, 1376 MHz memory (www.techpowerup.com)

https://storage-asset.msi.com/datasheet/vga/global/GeForce-GTX-1080-Ti-GAMING-X-11G.pdf


----------



## kithylin

zeze1980 said:


> So I'm asking about the performance gain, if there is *100MHz or **more *on *core clock* , that might motivate me to change the cooling system
> 
> Note that I have the card now:
> 
> MSI GTX 1080 Ti GAMING X
> 2000 core clock
> 6200 memory clock
> 65-70 temperature loads


I know this isn't what you want to hear, but there really isn't any way to tell. You might see 2100 MHz stable at 50 C, or you might only get +20 MHz. It depends on a lot of different factors and isn't really predictable.


----------



## zeze1980

kithylin said:


> It depends on a lot of different factors and isn't really predictable.











GamersNexus: MSI GTX 1080 Ti Gaming X PCB & VRM Quality Analysis ("In our analysis of the PCB, we go over VRM design, overclocking potential, and power mods.") (www.gamersnexus.net)


----------



## kithylin

zeze1980 said:


> MSI GTX 1080 Ti Gaming X PCB & VRM Quality Analysis
> 
> 
> In our analysis of the PCB, we go over VRM design, overclocking potential, and power mods.
> 
> 
> 
> 
> www.gamersnexus.net


Okay? And what are you trying to say by posting this? I have monitored every post in this thread since the day it started and have read almost all of it. Most of that PCB design doesn't mean jack squat for the resulting overclock and doesn't make the card any more likely to overclock better than any other card. It still comes down to luck of the draw and random silicon. The only cards with cherry-picked, guaranteed-good-overclocking cores were the 1080 Ti KINGPIN cards.


----------



## zeze1980

Thank you very much, you have given me a lot, now I know how far I can go with my core speed, thanks again


----------



## EternalStudent07

I can't find a way to search only this thread, so apologies as I assume this has been answered.

I have 2 Founder's Edition cards direct from NVIDIA, and it looks like the hardware ID isn't listed in the MSIAfterburner.oem2 file. Also, the *voltage slider doesn't seem to do anything*, or at least not reliably. Automatic overclock runs fail as unstable or bluescreen, and GPU-Z typically lists Vrel as the perfcap reason, though sometimes Power (already set to 120%), depending on how I stress things (Kombustor vs. OCCT vs. Heaven vs. ...).

Am I correct that I must add the matching hardware ID value at the bottom and select "3rd party" for voltage control method? That was mentioned in a thread from 2019. The 3rd party setting feels wrong as this is a reference board, but...this isn't the first time I've had to work around software oddities.

I've selected a range of entries for the voltage selection method setting (reference, standard, extended, 3rd party), but haven't moved the slider after each of them. I kept assuming the program would tell me if something worked or didn't. And most help information made it sound like reference boards should be supported under any of those settings.

Another question is *how to know if the fan controller accepts custom firmware curve values* (to make the fan speed changes survive not having AB running). I mean other than testing it with a benchmark and GPU-Z's monitor, or some other program.

(update) Maybe the voltage issue is more about my default voltage usually being 1.042 V, even with the voltage slider at 0. After following the suggestions above, I could set the voltage slider to 100 and it would run the card up to 1.093 V as expected (hitting NVIDIA's limit).

There is a lot here that confused me: the % on the voltage label "Core (%)", the posts about 1.0 V being the stock voltage when I don't see that on an NVIDIA FE, and the current value not seeming to go up when I move the slider (guessing it was heat-throttled before I changed the setting in the app).


----------



## kithylin

EternalStudent07 said:


> Another question is *how to know if the fan controller accepts custom firmware curve values* (to make the fan speed changes survive not having AB running). I mean other than testing it with a benchmark and GPU-Z's monitor, or some other program.


I don't know about the rest of your questions, but I can answer this one: no. We cannot put custom firmware on the 10-series NVIDIA cards. The BIOS/firmware for these cards cannot be edited to create custom firmware; it's not possible. All these years later, no one has ever cracked the encryption on the BIOSes and released a custom firmware editor like there was for previous generations of cards.

Also, I experimented with a pair of 1080 Ti's in SLI around spring 2022 and determined it's completely pointless, at least on my AMD X570 platform. It might work better on one of the HEDT platforms that can run both cards at x16/x16, but even then, virtually no games released in the past 5 years support SLI, and NVIDIA's latest drivers have dropped support for it (they aren't creating new SLI profiles at all). It would have been smart to sell your second 1080 Ti earlier this year when they were going for $700; as of right now, I'd at least suggest removing one from your system and going single-card. In my X570 system I saw literally zero FPS gains with SLI on versus off, even in older SLI-supported titles and with older drivers, at least at 1080p. SLI doesn't do anything anymore unless you're trying to play at 4K: I used NVIDIA DSR to run 4K on my 1080p monitor and saw gains of about +30% with the second card, and even that isn't worth the extra power usage, heat, and complexity. It's hardly even worth selling the second card anymore, as used 1080 Ti's are down to $250 now.


----------



## EternalStudent07

kithylin said:


> No. We can not put a custom firmware on the 10 series Nvidia cards.


Sorry for the confusion. Inside MSI AB there is a setting to attempt to apply the fan curve on the card's firmware. No flashing process involved. It would just use the custom curve during boot and when the app wasn't running. It doesn't say which cards support that or not, just that it often limits you to 3 points. I clicked it and didn't notice anything change, but figured I might have tested wrong or gotten the process b0rked up.

And yeah, I'm surprised and sad to hear SLI is going the way of the dodo. I thought PCPartPicker had made a mistake when they didn't list it on a high-end card (3080 Ti, 3090, or 3090 Ti, I forget which). The card even had the typical connector for it, but they said the feature often doesn't work anyway.

I'm especially surprised since they have NVLINK available. I guess that's just for the crazy expensive stuff now (Quadro and Tesla).


----------



## kithylin

EternalStudent07 said:


> Sorry for the confusion. Inside MSI AB there is a setting to attempt to apply the fan curve on the card's firmware. No flashing process involved. It would just use the custom curve during boot and when the app wasn't running. It doesn't say which cards support that or not, just that it often limits you to 3 points. I clicked it and didn't notice anything change, but figured I might have tested wrong or gotten the process b0rked up.


This also is not possible. As I said above, starting with the 10-series (the 1080 Ti or any 10xx card from NVIDIA), the BIOSes are encrypted and cannot be edited or modified in any way by any software (not MSI AB or anything else). The only option we have is cross-flashing, which lets us flash a different vendor's BIOS onto our cards. There is a "fully unlocked" BIOS available for the 1080 Ti that allows voltage up to 1.200 V with no power limit (and it's compatible with all 1080 Ti's, FE or AIB), but that's not for air-cooled or even AIO-cooled cards. Full-cover water block territory only, and only for people with a loop big enough to keep the card under 60 C at all times, even at full load.


----------



## KingCry

So I'm currently fighting with a Zotac AMP Extreme 1080 Ti. It's a second-hand card I was planning to use just until the new cards release. It's been a headache: the card is extremely unstable at factory clocks out of the box, sometimes needing -70 or more on the core. I'm not entirely sure what I'm fighting. I notice the power draw is fairly low according to software monitoring (230-250 W max) compared to my still-working GTX 1080 Strix (280-300 W). I wonder if I'm running into an issue with the BIOS, or if it's because I'm still running a daisy-chain-style PCIe power cable.

Recent 3DMark Fire Strike scores, 1080 vs 1080 Ti:
1080 Ti AMP Extreme: 28,590 in Fire Strike
1080 Strix: 22,497 in Fire Strike


----------



## kithylin

KingCry said:


> (280-300w). I wonder if I'm running to an issue with the bios or if it's cause I'm still running a daisy chain style PCI-E power.


That could be it. I had some issues with my 1080 Ti for a while and the issues went away when I changed it from a single cable with 2 x 6+2 plugs on the end to two physical 6+2 cables from the power supply. Try that.
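The arithmetic behind that advice, as a back-of-the-envelope sketch (the per-connector limits are the PCIe spec figures; the card wattages are approximations): with a daisy-chained cable, both 8-pin plugs hang off one set of wires, so nearly the whole AIB-card load minus the slot's share travels down a cable sized for a single connector.

```python
# Back-of-the-envelope PCIe power budget. Spec limits: 75 W from the
# slot, 150 W per 8-pin connector. A stock 1080 Ti is capped at 250 W
# board power; AIB cards like the AMP Extreme can pull roughly 300 W.
SLOT_W, EIGHT_PIN_W = 75, 150

def connector_budget(eight_pins):
    """Total spec power available to the card."""
    return SLOT_W + eight_pins * EIGHT_PIN_W

card_peak_w = 300                    # assumed AIB peak draw
cable_load_w = card_peak_w - SLOT_W  # what the PCIe cable(s) must carry

print(connector_budget(2))  # -> 375, comfortably above 300 W
print(cable_load_w)         # -> 225, all down ONE cable if daisy-chained
```

On two separate cables that 225 W splits roughly in half per cable; daisy-chained, one cable carries all of it, which is where the voltage droop and instability can come from.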


----------



## KingCry

kithylin said:


> That could be it. I had some issues with my 1080 Ti for a while and the issues went away when I changed it from a single cable with 2 x 6+2 plugs on the end to two physical 6+2 cables from the power supply. Try that.


I tried it; the card was stable enough to make it through Fire Strike, but it still crashes in games after 15 minutes.


----------



## kithylin

KingCry said:


> I tried it, card was stable enough to make it through firestrike. But still crashes in games after 15mins.


It could be anything then. Maybe the card you bought second hand was used for coin mining and it's burnt up now and won't work normally.


----------



## KingCry

kithylin said:


> It could be anything then. Maybe the card you bought second hand was used for coin mining and it's burnt up now and won't work normally.


It came from a high-school friend who I know didn't mine with it.


----------



## EternalStudent07

Isn't there a way to see whether the card is throttling because of temperature or power? I'd go looking for that value while you stress the card. I only saw it once, and I have a bunch of different programs, so I can't immediately point at one (GPU-Z or MSI Afterburner are my guesses). MSI Afterburner is a handy program either way; my fan curve wasn't as aggressive as I'd like. And you can raise a few limits to get some "free" performance once you get the card running stably.
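For the throttle-reason question specifically, nvidia-smi can report this from the command line, no GUI needed. The query fields below are real NVML fields, but the sample output string here is fabricated so the sketch is runnable; run the command yourself for live values:

```python
# nvidia-smi can report throttle reasons directly, e.g.:
#   nvidia-smi --query-gpu=clocks_throttle_reasons.sw_power_cap,\
#     clocks_throttle_reasons.hw_thermal_slowdown --format=csv
# The sample text below is MADE UP to stand in for real command output.
sample = (
    "clocks_throttle_reasons.sw_power_cap [bool], "
    "clocks_throttle_reasons.hw_thermal_slowdown [bool]\n"
    "Active, Not Active"
)

def throttle_flags(csv_text):
    """Parse nvidia-smi CSV output into {field: is_active}."""
    header, values = (line.split(", ") for line in csv_text.splitlines())
    return {h.split(" [")[0]: v == "Active" for h, v in zip(header, values)}

flags = throttle_flags(sample)
print(flags["clocks_throttle_reasons.sw_power_cap"])  # -> True
```

In this fabricated example the card would be power-capped rather than thermally throttled, which would point at the power limit or cabling rather than cooling.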

Have you cleaned it well? It might be overheating too fast if airflow is blocked somewhere. I've got the FE, so there was a plastic cover I could remove to clean out the dust. I assume aftermarket cards don't have that, and you'd just need to liberally spray canned air from all angles until nothing more comes off. Maybe verify the fans spin easily (push them to see if they're gritty or stuck) and that they run well when powered up (look with a flashlight, or trust a software fan-speed reading).

Apparently thermal paste can dry up or even get pumped out. I just watched a video about this and about how they improve thermal solutions between product versions; Gamers Nexus, I think. They chatted with Michael Gutenberg or something.

NVIDIA said their dies are convex but flatten out when they heat up to operating temperature, and that can cause a pumping motion over a few hundred sessions, depending on the viscosity of the TIM (thermal interface material). The fix is to clean off and replace the paste. I think I read that GPUs need a lot more paste than the "grain of rice" many people expect for CPUs. I know there are disassembly videos for a variety of cards, but you might have to just wing it.

Which isn't impossible if you're slow and methodical (make a map of where the screws came from), but especially on older cards the thermal pads might tear when you remove the heatsink. I've read that torn pads can still work, especially since this just needs to carry you for a few months. Or you can buy replacements, but they come in different thicknesses you'd have to match: too thin and the pad won't touch (no heat conduction), too thick and the heatsink won't sit level (so other pads stop touching and stop conducting).

I'd hope that since the power supply was fine with a 1080, a 1080 Ti wouldn't be much worse (it doesn't need that much more power). Which supply do you have, out of curiosity?


----------



## kithylin

I forgot about that. When was that used 1080 Ti last re-pasted? If your high-school friend never did it and you haven't either, it's way past time. It should be done at least once every 2 years, and the 1080 Ti came out 5 years ago at this point; it could have 5-year-old thermal paste in it.


----------



## Maco88

Hey folks, 

I have a small issue with a Zotac GTX 1080 Ti AMP Edition (the 2-fan version), where the Freeze fan-stop feature does not work as intended, i.e. when the card is idle / sitting below 50 C, which I think is the threshold where the fans kick in. Many newer cards have this feature, but for some reason the fans on this card spin from the moment the PC starts.

The fans seem to respond correctly to the fan curve (the hotter the card gets, the faster they ramp up), but they never come to a full stop, even when temps are well below 40 C.

Incidentally, both MSI Afterburner and GPU-Z show the fans at 0% when temps are below, say, 45 C, but the fans are still spinning, at what looks like a low RPM.

I have also tried changing the "Power Management" setting from Optimal to Adaptive in the NVIDIA Control Panel, which did nothing. I have the latest NVIDIA drivers installed.

Any ideas would be much appreciated.


----------

